Source code for VESA in Protected Mode
Steve the Pirate wrote: Is there no way to do this in protected mode? Like with VBE 3?

Yes, with VBE 3 you can do it in pmode. The basic idea is this: you copy all the functions to a location in memory that is read/write, then you set the pmode bit, switch to 16-bit pmode, and call the functions through the pmode interface. There are specs for the interface online, just google them. Just keep in mind, you need to run in 16-bit pmode to use this, and it will only work on VBE 3-capable cards.
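For what it's worth, here is a minimal sketch in C of the first step (the PMInfoBlock layout is from the VBE 3.0 spec; treat the helper itself as an untested assumption). It scans a RAM copy of the 32 KB video BIOS for the "PMID" signature and validates its checksum:

Code:
#include <stdint.h>
#include <stddef.h>

/* PMInfoBlock layout from the VBE 3.0 spec (packed, 20 bytes). */
struct PMInfoBlock {
    char     Signature[4];   /* "PMID" */
    uint16_t EntryPoint;     /* offset of the pmode entry point in the BIOS image */
    uint16_t PMInitialize;   /* offset of the one-time pmode init entry */
    uint16_t BIOSDataSel;    /* selector for a BIOS data area (filled in by you) */
    uint16_t A0000Sel;       /* selector for 0xA0000 */
    uint16_t B0000Sel;       /* selector for 0xB0000 */
    uint16_t B8000Sel;       /* selector for 0xB8000 */
    uint16_t CodeSegSel;     /* selector for the writable BIOS copy, as data */
    uint8_t  InProtectMode;  /* set nonzero once running in pmode */
    uint8_t  Checksum;       /* all 20 bytes of the block must sum to 0 */
} __attribute__((packed));

/* Scan a RAM copy of the 32 KB video BIOS for a valid PMID block. */
struct PMInfoBlock *find_pmid(uint8_t *bios_image, size_t len)
{
    for (size_t i = 0; i + sizeof(struct PMInfoBlock) <= len; i++) {
        if (bios_image[i]   != 'P' || bios_image[i+1] != 'M' ||
            bios_image[i+2] != 'I' || bios_image[i+3] != 'D')
            continue;
        uint8_t sum = 0;
        for (size_t j = 0; j < sizeof(struct PMInfoBlock); j++)
            sum += bios_image[i + j];
        if (sum == 0)   /* checksum over the whole block must be zero */
            return (struct PMInfoBlock *)(bios_image + i);
    }
    return NULL;        /* no VBE 3 pmode interface on this card */
}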
Ready4Dis wrote: Yes, with VBE 3 you can do it in pmode. The basic idea is this: you copy all the functions to a location in memory that is read/write, then you set the pmode bit, switch to 16-bit pmode, and call the functions through the pmode interface. [...]

I need source code for the VBE 3 PM interface.
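I don't know of complete public source to point at, but the setup the spec describes looks roughly like this sketch, building on find_pmid above (gdt_set_entry and far_call16 are hypothetical helpers standing in for your own GDT and 16-bit far-call code):

Code:
#include <stdint.h>

/* Hypothetical kernel helpers: install a GDT entry and return its
 * selector; far-call a 16-bit code segment at the given offset. */
extern uint16_t gdt_set_entry(uint32_t base, uint32_t limit, int is_code16);
extern void far_call16(uint16_t sel, uint16_t off);

/* After copying the 32 KB BIOS ROM into writable RAM and locating the
 * PMID block, fill in the selectors the BIOS expects (per the VBE 3.0
 * spec) and run the one-time pmode initialization entry. The
 * bios_data_area must be 0x600 bytes of zeroed RAM. */
void vbe3_pm_init(uint8_t *bios_copy, struct PMInfoBlock *pmid,
                  uint8_t *bios_data_area)
{
    pmid->BIOSDataSel = gdt_set_entry((uint32_t)bios_data_area, 0x600, 0);
    pmid->A0000Sel    = gdt_set_entry(0xA0000, 0x10000, 0);
    pmid->B0000Sel    = gdt_set_entry(0xB0000, 0x10000, 0);
    pmid->B8000Sel    = gdt_set_entry(0xB8000, 0x8000,  0);
    pmid->CodeSegSel  = gdt_set_entry((uint32_t)bios_copy, 0x10000, 0);
    pmid->InProtectMode = 1;

    /* 16-bit code selector covering the writable BIOS copy; call the
     * PM initialization entry once before any other VBE function. */
    uint16_t code16 = gdt_set_entry((uint32_t)bios_copy, 0x10000, 1);
    far_call16(code16, pmid->PMInitialize);
}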
VBE Core 3.0 is old

Brynet-Inc wrote: Problem is many emulators don't include support for VBE 3. You have a card that has the new extensions available?

New? From http://www.vesa.org/public/VBE/vbecore3.pdf, on the first page:

VESA BIOS EXTENSION (VBE)
Core Functions
Standard
Version: 3.0
Date: September 16, 1998

It is old. Strange that so few implement it. The VBE Core 2.0 standard is from November 1994.
I'm thinking about writing the protected mode entry for the VGABIOS used in Bochs and Qemu, but currently I have problems compiling it.
One should IMHO create drivers as modules, so that e.g. the video driver can easily be exchanged with another one for a specific card. But when OSdeving in Bochs and Qemu it would be nice to have a plain, simple protected mode VESA driver.
Yes, drivers should definitely be modules, but it's not too hard to implement a real-mode drop-back function in the kernel to support VBE 2.0, or to implement it in a 'generic' video driver. It's a pain, and there should be a simpler way by now (how many OSes are still real mode? why wouldn't they have switched sooner?), but it's what we have to work with, and it works. Virtual 8086 is fine, but I don't really care for it for my OS, so I have a 'simple' function to drop to real mode and come back to pmode again. I keep a certain amount of memory free in the real-mode address space, mapped into my kernel, for the calls. It's hackish, but it works. I haven't tested much with VBE 3.0: since I hear few cards support it, and all VBE 3.0 cards support the real-mode interface anyway, I see little reason to implement an extra driver just for it. It's not like the RM interface is really that slow (how fast do you have to switch graphics modes, and how often?).
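For illustration, here's what such a drop-back wrapper can look like from the C side (a sketch only; realmode_int is a hypothetical assembly stub that drops to real mode, loads the register image, fires the interrupt, saves the results, and re-enters pmode):

Code:
#include <stdint.h>

/* Register image passed to and from the real-mode stub. */
struct rm_regs {
    uint16_t ax, bx, cx, dx, si, di, es;
};

/* Hypothetical assembly stub living in identity-mapped low memory. */
extern void realmode_int(uint8_t intno, struct rm_regs *regs);

/* VBE function 02h: set video mode. Bit 14 of BX requests a linear
 * framebuffer. VBE returns AX = 0x004F on success. */
int vbe_set_mode(uint16_t mode)
{
    struct rm_regs r = { 0 };
    r.ax = 0x4F02;             /* VBE set mode */
    r.bx = mode | 0x4000;      /* request linear framebuffer */
    realmode_int(0x10, &r);
    return r.ax == 0x004F;
}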
JJeronimo wrote: Can you explain why people don't set the VESA video mode directly, instead of using the BIOS interface?
JJ

In a word, the BIOS interface provides us with a uniform abstraction which we can use to access the video hardware without knowing everything implemented on the lower layers, so we don't have to care too much about facing different hardware implementations.
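As a concrete example of that uniform abstraction: VBE function 00h returns the same controller info block on every vendor's card. A sketch of it (layout from the VBE spec, with the VBE 2.0+ OEM fields folded into Reserved for brevity; it reuses the hypothetical rm_regs/realmode_int from the earlier sketch, and low_buf must live in identity-mapped memory below 1 MB):

Code:
#include <stdint.h>

/* VbeInfoBlock returned by VBE function 00h (512 bytes total). */
struct VbeInfoBlock {
    char     VbeSignature[4];   /* "VESA" on return; preset to "VBE2"  */
                                /* on entry to request the full block  */
    uint16_t VbeVersion;        /* BCD, e.g. 0x0300 for VBE 3.0 */
    uint32_t OemStringPtr;      /* real-mode far pointer (seg:off) */
    uint32_t Capabilities;
    uint32_t VideoModePtr;      /* far pointer to mode list, 0xFFFF-terminated */
    uint16_t TotalMemory;       /* in 64 KB blocks */
    uint8_t  Reserved[492];     /* OEM fields etc., simplified here */
} __attribute__((packed));

/* VBE function 00h: get controller info into a low-memory buffer. */
int vbe_get_info(struct VbeInfoBlock *low_buf)
{
    struct rm_regs r = { 0 };
    low_buf->VbeSignature[0] = 'V'; low_buf->VbeSignature[1] = 'B';
    low_buf->VbeSignature[2] = 'E'; low_buf->VbeSignature[3] = '2';
    r.ax = 0x4F00;
    r.es = (uint16_t)((uintptr_t)low_buf >> 4);   /* segment */
    r.di = (uint16_t)((uintptr_t)low_buf & 0xF);  /* offset  */
    realmode_int(0x10, &r);
    return r.ax == 0x004F;
}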
m wrote: In a word, the BIOS interface provides us with a uniform abstraction which we can use to access the video hardware without knowing everything implemented on the lower layers, so we don't have to care too much about facing different hardware implementations.

Well... VBE is a BIOS extension standard...
VESA, which is the same as Super VGA (correct me if I'm saying garbage!), is a hardware standard, so shouldn't all the card manufacturers follow the standard and thus have compatible hardware interfaces, just as with VGA?
JJ
JJeronimo wrote: Can you explain why people don't set the VESA video mode directly, instead of using the BIOS interface?
JJ

People do... Hobby OSes don't do it because that would mean having a better device driver interface and writing at least one full driver for their own graphics card. Every card does it differently, so the simplest way is to use the BIOS calls or VBE, which you can depend upon until you are able to drive the card directly.
JJeronimo wrote: VESA, which is the same as Super VGA (correct me if I'm saying garbage!), is a hardware standard, so shouldn't all the card manufacturers follow the standard and thus have compatible hardware interfaces, just as with VGA?

VESA is the company that releases the standards. SVGA refers to virtually any card that can do high-resolution modes and is VGA compatible. SVGA cards themselves are not compatible with each other; they generally only adhere to the VBE standard.
Try googling for "VGADOC" for information about several (S)VGA cards, and see for yourself how different they are.
Combuster wrote: VESA is the company that releases the standards. SVGA refers to virtually any card that can do high-resolution modes and is VGA compatible.

Ok... You've just cleared up one of my greatest doubts concerning graphics standards... really, I didn't understand what referred to what!
So, my NVIDIA RIVA TNT is supposedly an SVGA card, because it provides VBE-compatible BIOS routines to access its high-resolution modes, but its hardware interface is secret...
And... VBE doesn't provide support for 3D acceleration, for example... because of that, if one wants to do 3D-accelerated operations on the card, (s)he needs to drive it directly, right?

Combuster wrote: SVGA cards themselves are not compatible with each other. They generally only adhere to the VBE standard.

But is there any standard for SVGA hardware interfaces that should be followed but is not?

Combuster wrote: Try googling for "VGADOC" for information about several (S)VGA cards, and see for yourself how different they are.

Ok... Thanks...
JJ
The only standard interface is int 10h, and it can only be used in 16-bit real mode. VBE 3 extends the interface to 16-bit pmode, but you still must check for VBE 3 support and, if it's not found, fall back to real mode anyway, so there's not really a point in using the protected mode interface that I know of.
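That support check is just a version test on the controller info block, e.g. (a sketch reusing the hypothetical helpers from earlier in the thread):

Code:
#include <stdint.h>
#include <stddef.h>

/* Decide which VBE call path to use. VbeVersion is BCD, so 0x0300
 * means VBE 3.0; anything older has no pmode interface at all. */
enum vbe_path { VBE_PATH_REALMODE, VBE_PATH_PMODE16 };

enum vbe_path choose_vbe_path(struct VbeInfoBlock *low_buf,
                              uint8_t *bios_copy, size_t len)
{
    if (!vbe_get_info(low_buf))
        return VBE_PATH_REALMODE;            /* no VBE at all */
    if (low_buf->VbeVersion >= 0x0300 && find_pmid(bios_copy, len))
        return VBE_PATH_PMODE16;             /* VBE 3 pmode interface */
    return VBE_PATH_REALMODE;                /* VBE 1.x/2.0: drop back */
}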