In the wiki article
http://wiki.osdev.org/Programmable_Interval_Timer
the wide availability of cheap 14.31818 MHz crystals makes perfect sense. Divided by 12, it produces the 1.193182 MHz PIT clock. The BIOS usually programmed the PIT to divide by 0x10000, producing the familiar 18.2065 Hz rate (54.93 ms interval) of INT 8.
The interesting coincidence is that 0x10000 ticks of interrupt 8 (counted by the BIOS and used by the DOS clock) add up to 3599.6 seconds, which is very close to one hour. I noticed this while taking apart an old DOS TSR corner-of-the-screen clock program and was surprised by the simplicity of the conversion from the tick count at memory location 0040:006C.
The perfect crystal for a one-hour count at 40:006E would be 14,316,557 Hz, so I do not think the NTSC-M clock was picked this way.
Does anyone have insight into whether this was considered at PC/XT design time?
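The arithmetic above can be checked with a quick Python sketch (the crystal frequency and BIOS divisor are the standard PC/XT values quoted in the post; variable names are mine):

```python
# Quick check of the numbers in the post. The crystal frequency and the
# BIOS divisor are the standard PC/XT values; variable names are mine.
crystal_hz = 14_318_180          # 14.31818 MHz colorburst-derived crystal
pit_hz = crystal_hz / 12         # PIT input clock: ~1.193182 MHz
divisor = 0x10000                # BIOS reload value (65536)

tick_hz = pit_hz / divisor       # INT 8 rate: ~18.2065 Hz
tick_ms = 1000 / tick_hz         # ~54.93 ms between interrupts

seconds_per_wrap = 0x10000 / tick_hz   # 0x10000 INT 8 ticks in seconds

# Crystal that would make 0x10000 ticks come out to exactly one hour.
perfect_hz = 12 * divisor * 0x10000 / 3600

print(f"PIT clock {pit_hz:.0f} Hz, tick {tick_hz:.4f} Hz ({tick_ms:.2f} ms)")
print(f"0x10000 ticks = {seconds_per_wrap:.2f} s, "
      f"perfect crystal {int(perfect_hz)} Hz")
```

Running it reproduces the 18.2065 Hz tick, the ~3599.6 s near-hour, and the 14,316,557 Hz "perfect" crystal.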
Choice of clock frequency 1.193182
- piranha
- Member
- Posts: 1391
- Joined: Thu Dec 21, 2006 7:42 pm
- Location: Unknown. Momentum is pretty certain, however.
- Contact:
Re: Choice of clock frequency 1.193182
Drugs.
Edit: Or they decided to troll everyone.
-JL
SeaOS: Adding VT-x, networking, and ARM support
dbittman on IRC, @danielbittman on twitter
https://dbittman.github.io
Re: Choice of clock frequency 1.193182
I am not quite sure if I can follow your reasoning correctly.
But the PC clock, as you said, was using readily available TV circuitry.
NTSC refresh rates were originally chosen to exactly match the 60 Hz of the US power grid, which helped early kinescope implementations.
Later (but still before the PC), the NTSC refresh rate was shifted slightly downwards, to 59.94 Hz, to avoid certain audio / video issues.
I would guess that this (or something like this) eventually led to the "odd" clock in TV circuitry that the original PC then copied.
Source: Wikipedia.
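The arithmetic behind that downward shift can be sketched as well (these are the standard NTSC relations, not something from the thread; variable names are mine):

```python
# Standard NTSC arithmetic: the 4.5 MHz sound intercarrier divided by 286
# gives the line rate, the color subcarrier is 227.5 cycles per line, and
# the PC crystal is 4x the subcarrier.
line_rate = 4_500_000 / 286       # ~15734.27 Hz horizontal line rate
field_rate = line_rate * 2 / 525  # ~59.94 Hz, shifted down from 60 Hz
subcarrier = line_rate * 455 / 2  # ~3.579545 MHz color subcarrier
crystal = 4 * subcarrier          # ~14.31818 MHz -- the PC's crystal

print(f"field {field_rate:.4f} Hz, subcarrier {subcarrier:.0f} Hz, "
      f"crystal {crystal:.0f} Hz")
```

So the "odd" 14.31818 MHz value falls directly out of the 59.94 Hz compromise.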
But the PC clock, as you said, was using readily available TV circuitry.
NTSC refresh rates were originally chosen to exactly match the 60 Hz of the US power grid, which helped early kinescope implementations.
Later (but still before the PC), the NTSC refresh rate was shifted slightly downwards, to 59.94 Hz, to avoid certain audio / video issues.
I would guess that this (or something like this) eventually led to the "odd" clock in TV circuitry that the original PC then copied.
Source: Wikipedia.
Every good solution is obvious once you've found it.