[solved] Calibrate timer/track time with better accuracy
Posted: Sat Jan 16, 2021 2:39 am
Started reading CMOS in order to provide timestamps and noticed that I'm not tracking time very accurately (3~5 seconds slower per hour). I currently re-read CMOS on each hour mark, so the effect is quite visible: the clock jumps forward 3~5 seconds at the end of each hour.
I'm using the LAPIC timer; it has the ARAT bit set, and after numerous calibration attempts I'm pretty sure its native rate is 1 GHz, strange as that may be (it matches neither the CPU's clock rate nor the QPI bus speed).
The way I do calibration (a rough sketch in code follows the list):
1. Set up the PIT to run at 1 kHz (using 1193 as the divider) and the LAPIC timer at a rate low enough (e.g. divide by 32, initial count 10000) that I can comfortably service interrupts from both; each interrupt handler increments its own counter.
2. Wait for 1 second, i.e. until the PIT counter reaches 1000.
3. CLI, then use the ratio between the two counters to determine the LAPIC timer's rate.
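Roughly what this looks like in code, as a sketch: the LAPIC register offsets and the PIT command bytes are from the Intel SDM / 8254 datasheet, but the MMIO base, the vector number (32), and the pit_ticks/lapic_ticks counters (bumped by the two interrupt handlers) are placeholders for whatever the kernel actually uses.

#include <stdint.h>

#define LAPIC_BASE       0xFEE00000UL  /* assuming default, identity-mapped */
#define LAPIC_LVT_TIMER  0x320
#define LAPIC_TIMER_ICR  0x380         /* initial count register  */
#define LAPIC_TIMER_DCR  0x3E0         /* divide configuration    */

static volatile uint64_t pit_ticks;    /* incremented by the PIT ISR   */
static volatile uint64_t lapic_ticks;  /* incremented by the LAPIC ISR */

static inline void lapic_write(uint32_t reg, uint32_t val)
{
    *(volatile uint32_t *)(LAPIC_BASE + reg) = val;
}

static inline void outb(uint16_t port, uint8_t val)
{
    __asm__ volatile("outb %0, %1" : : "a"(val), "Nd"(port));
}

uint64_t calibrate_lapic(void)
{
    /* step 1a: PIT channel 0, rate generator (mode 2), divider 1193
     * -> 1193182 / 1193 = ~1000.15 Hz, treated as 1 kHz here */
    outb(0x43, 0x34);
    outb(0x40, 1193 & 0xFF);
    outb(0x40, 1193 >> 8);

    /* step 1b: LAPIC timer periodic, divide by 32, initial count 10000 */
    lapic_write(LAPIC_TIMER_DCR, 0x8);            /* divide by 32        */
    lapic_write(LAPIC_LVT_TIMER, (1 << 17) | 32); /* periodic, vector 32 */
    lapic_write(LAPIC_TIMER_ICR, 10000);

    pit_ticks = lapic_ticks = 0;
    __asm__ volatile("sti");

    /* step 2: wait ~1 s of PIT time */
    while (pit_ticks < 1000)
        __asm__ volatile("hlt");

    /* step 3: stop counting, derive the native rate from the ratio */
    __asm__ volatile("cli");
    return lapic_ticks * 32 * 10000 * 1000 / pit_ticks;
}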
But this calibration seems too fragile:
Although the native rate is 1,000,000,000, each LAPIC interrupt stands for 32 × 10000 = 320,000 native ticks, so getting one fewer interrupt during calibration knocks the result down to 999,680,000 and one more pushes it up to 1,000,320,000, and I can't run the interrupts much faster than this to shrink the effect of one interrupt more or less.
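In numbers (lapic_irqs here is just a placeholder for whatever the LAPIC handler counted in the 1-second window):

/* each LAPIC interrupt stands for divider * initial_count native ticks,
 * so the measured rate is quantized to steps of that size */
uint64_t ticks_per_irq = 32 * 10000;         /* = 320,000           */
uint64_t rate = lapic_irqs * ticks_per_irq;  /* over the 1 s window */
/* 3125 irqs -> 1,000,000,000
 * 3124 irqs ->   999,680,000  (one interrupt short)
 * 3126 irqs -> 1,000,320,000  (one interrupt extra) */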
It is also difficult to wait for longer times: the hardware counts wrap quickly if I simply read them later (at these settings the PIT count reloads about every millisecond and the LAPIC current count every 10000 divided ticks), and during boot there are many stretches (probably more going forward) where interrupts are disabled, so leaving the two timer interrupts running and hoping for the best doesn't seem like a good idea either.
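For reference, here's what "simply read them later" runs into; the latch command and counter width are per the 8254 datasheet, the rest is a sketch. With reload 0 (= 65536) the channel counts down through the full 16-bit range, so deltas work modulo 2^16, but only if something reads it more often than the ~55 ms wrap period, which is exactly what a long CLI'd stretch can't guarantee:

#include <stdint.h>

static inline void outb(uint16_t port, uint8_t val)
{
    __asm__ volatile("outb %0, %1" : : "a"(val), "Nd"(port));
}

static inline uint8_t inb(uint16_t port)
{
    uint8_t v;
    __asm__ volatile("inb %1, %0" : "=a"(v) : "Nd"(port));
    return v;
}

/* PIT channel 0, rate generator, reload 0 = full 65536 range */
void pit_init_freerun(void)
{
    outb(0x43, 0x34);
    outb(0x40, 0);
    outb(0x40, 0);
}

static uint16_t pit_read(void)
{
    outb(0x43, 0x00);                /* latch channel 0's count */
    uint8_t lo = inb(0x40);
    uint8_t hi = inb(0x40);
    return (uint16_t)((hi << 8) | lo);
}

static uint16_t last_count;
static uint64_t total_ticks;         /* at 1193182 Hz           */

/* must run at least every ~55 ms or a whole wrap goes missing */
void pit_poll(void)
{
    uint16_t now = pit_read();       /* counts DOWN toward 0    */
    total_ticks += (uint16_t)(last_count - now); /* wrap-safe   */
    last_count = now;
}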
The issue might also be:
Maybe I'm not tracking time correctly. I count ticks in the normal timer handler (which also drives many other events, like context switches or printing a clock in a corner of the screen at a lower frequency), and if this handler misses any ticks (for example due to CLI elsewhere) there's no good way to tell that ticks were missed.
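The tick bookkeeping, roughly (all the names here — schedule, draw_clock, lapic_eoi, the constants — stand in for my actual code, and the 100 Hz tick is just an example):

#include <stdint.h>

#define MS_PER_TICK      10    /* e.g. a 100 Hz tick        */
#define TICKS_PER_SCHED  1
#define TICKS_PER_SEC    100

extern void schedule(void);
extern void draw_clock(void);
extern void lapic_eoi(void);

static volatile uint64_t timer_ticks;  /* one per LAPIC timer IRQ */

void timer_handler(void)
{
    timer_ticks++;                     /* the only source of time */

    if (timer_ticks % TICKS_PER_SCHED == 0)
        schedule();                    /* context switch          */
    if (timer_ticks % TICKS_PER_SEC == 0)
        draw_clock();                  /* clock in screen corner  */

    lapic_eoi();
}

/* wall time is derived purely from the counter, so any tick lost
 * while interrupts were off silently slows the clock, undetected */
uint64_t uptime_ms(void)
{
    return timer_ticks * MS_PER_TICK;
}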
Any suggestions to improve the calibration/tracking?