Software RTC

I have a mega88 running at full speed on a 20MHz crystal.

I need a 1-second timer, so I'm using Timer0 with a 1024 prescaler to hack together a somewhat-accurate 1-second tick.

My logic is as follows: 20MHz / 1024 ≈ 19531 timer ticks per second.

19531 / 256 ≈ 76 overflows per second (Timer0 overflows every 256 ticks).

So 76 Timer0 overflows is ~1 second, right? For some reason it seems more like 2 seconds. What did I miss here? I can post code tomorrow, but I feel like it's more a case of me missing something in my thinking.

When in doubt, use AVR Studio's simulator to check exact timing.

I'm just talking pseudocode here. If I count only half of my calculated overflows, 38, it correctly comes out to ~1 second. I feel like I'm just missing something or doing the math wrong. Any ideas?
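Roughly what I'm doing, in sketch form (variable names are just placeholders):

#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint8_t overflows = 0;
volatile uint8_t seconds = 0;

/* Timer0 overflow fires every 256 timer ticks */
ISR(TIMER0_OVF_vect)
{
    if (++overflows >= 76) {  /* 20MHz / 1024 / 256 ≈ 76.3 overflows/sec */
        overflows = 0;
        seconds++;            /* roughly one second elapsed */
    }
}

int main(void)
{
    TCCR0B = (1 << CS02) | (1 << CS00);  /* Timer0, clk/1024 prescaler */
    TIMSK0 = (1 << TOIE0);               /* enable overflow interrupt */
    sei();
    for (;;)
        ;                                /* 'seconds' ticks ~once a second */
}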

Use the PC program called AVRcalc to compute the timer preset values (and UART speed settings). It's on avrfreaks.net.

Also, you can use an 8-bit timer and save chip space. Just interrupt, say, 100 times a second and have the ISR count interrupts; something like the sketch below.
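Just a sketch of that idea. Note that 100/sec doesn't divide evenly out of 20MHz with an 8-bit timer, but 625/sec does (20MHz / 256 / 125 = 625 exactly), so the second comes out crystal-accurate:

#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint16_t ticks = 0;
volatile uint8_t seconds = 0;

ISR(TIMER0_COMPA_vect)
{
    if (++ticks >= 625) {  /* 625 compare matches = exactly 1 second */
        ticks = 0;
        seconds++;
    }
}

int main(void)
{
    TCCR0A = (1 << WGM01);   /* CTC mode: clear timer on compare match */
    TCCR0B = (1 << CS02);    /* clk/256 prescaler */
    OCR0A  = 124;            /* counts 0..124 = 125 ticks per interrupt */
    TIMSK0 = (1 << OCIE0A);  /* enable compare match A interrupt */
    sei();
    for (;;)
        ;
}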

With standard crystals, the RTC will be off by several seconds after a day or so.
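(For example, a typical ±50ppm crystal can drift 50e-6 × 86400 ≈ 4.3 seconds per day; cheaper parts are worse.)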

That program's neat, but it only seems to help with 16-bit timers.

I'm currently using Timer0 and counting overflow interrupts. According to my math in the first post, 76 overflows should be a second, but like I said, it's more like 2. And I can't figure out why.

Which AVRcalc do you mean? I found two; the second seems better.

http://www.avrfreaks.net/index.php?func … ks%20Tools

http://svn.b9.com/

How are you observing the timer rate? Some of the timer output modes would effectively halve the output frequency (e.g. toggle mode, phase-correct PWM mode).
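For instance (hypothetical sketch, assuming the "1 second" event drives a pin): toggling a pin once per computed second means the pin only completes a full on/off cycle every two events, so on an LED or a scope it looks like a 2-second period.

#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint8_t overflows = 0;

/* Hypothetical: same overflow counting as before, but the "1 second"
   event toggles a pin instead of incrementing a counter. */
ISR(TIMER0_OVF_vect)
{
    if (++overflows >= 76) {
        overflows = 0;
        PORTB ^= (1 << PB0);  /* toggles once per ~1s, but a full
                                 on/off cycle on the pin takes ~2s */
    }
}

int main(void)
{
    DDRB  |= (1 << PB0);                 /* pin as output */
    TCCR0B = (1 << CS02) | (1 << CS00);  /* clk/1024 prescaler */
    TIMSK0 = (1 << TOIE0);               /* enable overflow interrupt */
    sei();
    for (;;)
        ;
}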

Also, it's possible to prescale the entire system clock with CLKPS3:0, but presumably you'd know if you were doing that.
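To rule that out, you can force divide-by-1 at startup; per the datasheet, the two CLKPR writes must happen within four clock cycles of each other:

#include <avr/io.h>

/* Force the system clock prescaler to divide-by-1, ruling out a stray
   CLKPS setting. (The CKDIV8 fuse would make things 8x slow, not 2x.) */
static void clock_prescale_off(void)
{
    CLKPR = (1 << CLKPCE);  /* unlock prescaler change */
    CLKPR = 0;              /* CLKPS3:0 = 0000 -> divide by 1 */
}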

I'm having other issues now that I need to fix before I can figure this out, unfortunately. I'm going to open another thread to keep things organized.