I need a 1 sec timer, so I'm using Timer0 with a 1024 prescaler to hack together a rough 1 second timer.
My logic is as follows: 20 MHz / 1024 ≈ 19531 ticks/sec.
19531 / 256 ≈ 76 (an 8-bit timer overflows every 256 counts).
So 76 T0 overflows is ~1 second, right? For some reason it seems more like 2 seconds. What did I miss here? I can post code tomorrow, but I feel like it's more that I'm missing something in my thinking.
I'm just talking pseudocode here. If I count only half of my calculated overflows, 38, it correctly comes out to ~1 second. I feel like I'm missing something or doing the math wrong. Any ideas?
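Roughly, this is what I'm doing (just a sketch since I don't have my real code handy; register names are from an ATmega48/88/168-style Timer0 and may not match my actual part):

```c
#include <avr/io.h>
#include <avr/interrupt.h>

/* 20 MHz / 1024 = ~19531 ticks/s; an 8-bit timer overflows every
 * 256 ticks, so ~19531 / 256 = ~76 overflows per second. */

volatile uint8_t overflows = 0;

ISR(TIMER0_OVF_vect)
{
    if (++overflows >= 76) {
        overflows = 0;
        PORTB ^= (1 << PB0);   /* toggle an LED once per ~1 s */
    }
}

int main(void)
{
    DDRB  |= (1 << PB0);                  /* LED pin as output */
    TCCR0B = (1 << CS02) | (1 << CS00);   /* clk/1024 prescaler */
    TIMSK0 = (1 << TOIE0);                /* enable overflow interrupt */
    sei();
    for (;;)
        ;
}
```

One thing worth noting with this kind of test: the LED is only toggled once per ~1 s, so a full on/off blink cycle takes ~2 s, which can look like the timer is running at half speed.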
That program's neat, but it only seems to help with 16-bit timers.
I'm currently using Timer0 and counting overflow interrupts. According to my math in the first post, 76 overflows should be a second, but like I said, it's more like 2, and I can't figure out why.
How are you observing the timer rate? Some of the timer output modes effectively halve the output frequency: in toggle mode, each compare match flips the pin, so one full output period spans two intervals, and phase-correct PWM mode counts up and then back down, taking roughly twice as long per cycle.
Also, it's possible the entire system clock is being prescaled via CLKPS3:0, but presumably you'd know if you were doing that.
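If you want to rule that out, something like this reads the prescaler field back (a sketch, assuming your part has the CLKPR register; older AVRs may not):

```c
#include <avr/io.h>

/* Read back the system clock prescaler bits (CLKPS3:0).
 * 0 means clk/1 (undivided), 1 means clk/2, 2 means clk/4, etc.
 * On many ATmegas, the CKDIV8 fuse sets this field to clk/8 at reset. */
static uint8_t clock_prescaler_bits(void)
{
    return CLKPR & 0x0F;
}
```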