Why does VGA need an IRQ?
I am wondering why VGA needs an IRQ. Where is it used by the OS?
Keep coding...
...the sky is the limit
AsteriOS project: http://www.mindfields.gr/main/index.php ... &Itemid=27
The VGA IRQ is used for vsync.
Some background:
On old graphics cards you can only access the video memory between screen refreshes. If you write something to the video memory while the screen is being refreshed at that same time, the processor (your code) will wait until the graphics card has finished the refresh. To avoid losing time this way, you wait for the end of a refresh (the IRQ) and then write the new image to the video memory.
Modern cards don't have this limitation, but I think it's still the same in VGA mode.
Today you can enable vsync in games to avoid two images sharing the screen. Without it, the image changes in the middle of a screen refresh, because redrawing the video memory is faster than the refresh itself.
It looks something like this (images 1 and 2):
111111111111111111
111111111111111111
111111111122222222
222222222222222222
222222222222222222
222222222222222222
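For completeness: the classic way to synchronize with the refresh on plain VGA, without any IRQ, is to poll bit 3 of Input Status Register #1 (port 0x3DA), which is set while the vertical retrace is active. Here is a sketch of the wait logic, with the port read abstracted behind a function pointer so the loop structure can be shown (and tested) without real hardware:

```c
#include <stdint.h>

#define VRETRACE 0x08  /* bit 3 of Input Status #1 (port 0x3DA) */

/* read_status stands in for inb(0x3DA); on real hardware you would
   use an actual port read instead of a callback. */
void wait_for_vsync(uint8_t (*read_status)(void))
{
    /* If we're already inside a retrace, let it finish first, so we
       catch the *start* of the next one and get the full blank period. */
    while (read_status() & VRETRACE)
        ;
    /* Now wait for the next retrace to begin. */
    while (!(read_status() & VRETRACE))
        ;
}
```

On real hardware the two loops become `while (inb(0x3DA) & 8);` followed by `while (!(inb(0x3DA) & 8));`.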
Hi,
For modern video cards, most (all?) of them use bus mastering/DMA to transfer graphics data (and commands for the graphics accelerator) into the video card. In this case the IRQ is used for "transfer complete" and/or for "command queue empty".
Cheers,
Brendan
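To make that concrete, here is a minimal sketch of the driver side of such a "transfer complete" IRQ. All names here are invented for illustration; a real driver would also acknowledge the interrupt by writing the card's interrupt-status register and would actually program the DMA engine:

```c
#include <stdbool.h>

/* Hypothetical driver skeleton; not a real card's API. */
static volatile bool transfer_done = true;   /* card idle at startup */

/* Called from the interrupt handler when the card raises
   "transfer complete". A real handler would also ack the IRQ. */
void gfx_irq_transfer_complete(void)
{
    transfer_done = true;
}

/* Queue a DMA transfer of `len` bytes of graphics data to the card.
   Returns false if a previous transfer is still in flight. */
bool gfx_start_transfer(const void *src, unsigned len)
{
    if (!transfer_done)
        return false;          /* DMA engine busy */
    transfer_done = false;
    /* real driver: program the card's DMA engine with src/len here */
    (void)src; (void)len;
    return true;
}

bool gfx_idle(void) { return transfer_done; }
```

The point is simply that the IRQ lets the OS sleep (or do other work) instead of polling a busy bit until the card finishes.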
AJ wrote: OT: So, out of interest, can the (now seemingly redundant?) VGA IRQ be used as an additional timer, then - if you know you have a 60Hz refresh rate and weren't using the IRQ for anything else, say?
Maybe, but can you think of something a "refresh rate dependent" timer can do that can't be done better by something designed to be used as a timer (PIT, RTC, HPET, local APIC timer)?
Cheers,
Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
So in modern OSes it's actually not needed/used.
That's why some BIOSes have an option to assign (or not) an IRQ to VGA.
BTW, how do you change the refresh rate in PMode?
Keep coding...
...the sky is the limit
AsteriOS project: http://www.mindfields.gr/main/index.php ... &Itemid=27
Jef wrote: So actually in modern OSs not needed/used.
Brendan wrote: For modern video cards, most (all?) of them use bus mastering/DMA to transfer graphics data (and commands for the graphics accelerator) into the video card. In this case the IRQ is used for "transfer complete" and/or for "command queue empty".
And for vsync, even if you don't use it all the time.
But you don't need the IRQ in a small hobby OS.
Jef wrote: btw, how you change the refresh rate in PMode?
Changing the refresh rate is part of the GFX driver. I don't know how VGA handles it, but you can change it if you use VESA. At least if the card supports the latest VESA standard, which is not the case on some (or all?) ATI chips.
Or you write drivers directly for the GFX Chip if you can get your hands on the docs.
- Combuster
Osbios wrote: Changing the refresh rate is part of the GFX driver. I don't know how VGA handles it but you can change it if you use VESA.
jal wrote: VGA has two refresh rates, 60Hz for 480-line modes and 70Hz for 400-line modes.
I'd rather say, two clock sources: one is 25 MHz and the other 28 MHz. Divide them by the number of pixels in a frame (including the blanking sections) and you get the refresh rate. That means you can change the clock and get 640x480 at 67 Hz. You can also change the resolution and get frequencies from 40 to 80 Hz for different resolutions. The key to success, however, is that the H and V frequencies stay within the tolerance limits of the monitor. For old fixed-frequency VGA monitors, that is indeed 60 or 70 Hz.
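That arithmetic is easy to check. Standard 640x480 uses a 25.175 MHz dot clock with 800 total pixel clocks per scanline and 525 total lines (blanking included); the 70 Hz 400-line modes use the 28.322 MHz clock with 900x449 totals. A quick sketch of the formula:

```c
/* Refresh rate = dot clock / (total pixel clocks per frame, blanking included). */
double refresh_hz(double dot_clock_hz, int htotal_pixels, int vtotal_lines)
{
    return dot_clock_hz / ((double)htotal_pixels * vtotal_lines);
}
```

Plugging in the standard timings: 640x480 gives 25175000 / (800 * 525) = ~59.94 Hz, the 400-line text timings give 28322000 / (900 * 449) = ~70.09 Hz, and driving the 640x480 totals from the 28 MHz clock gives roughly the 67 Hz mentioned above.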