VESA VBE Problems
Hi guys. I have a problem finding a VBE mode for 800x600 with 24-bit color depth, and it seems that something is wrong with my code. Newest code: pastebin.
Top of the file is filled with Mode Info Block and Controller Info Block.
Scroll down a bit and you will find InitVBE, which seems to be problematic.
The mode won't set correctly on Bochs or on VMware.
Any help?
Current problem: Now it passes through 30 modes, and it seems that VMware can't find an 800x600 video mode with either 24-bit or 32-bit color depth. The data may be corrupt.
Re: VESA VBE Problems
Make sure that you are calling those functions from real mode. Why do you want to use a 24 bit mode anyways? Are you doing all the necessary mode scanning?
OS: Basic OS
About: 32 Bit Monolithic Kernel Written in C++ and Assembly, Custom FAT 32 Bootloader
Re: VESA VBE Problems
I can't see a direct error in your code, but you should debug first of all to see exactly WHICH call fails.
Re: VESA VBE Problems
I noticed those two lines in a number of places throughout the code. However, AL = 4Fh and AH = 0 means that it is supported and the interrupt call was a success. In other words, if "ax = 0x4F" then the interrupt call was a success and not an error.
Code: Select all
cmp ax, 0x4F
je .damn
OS Development Series | Wiki | os | ncc
char c[2]={"\x90\xC3"};int main(){void(*f)()=(void(__cdecl*)(void))(void*)&c;f();}
Re: VESA VBE Problems
neon wrote:I noticed those two lines in a number of places throughout the code. However, AL = 4Fh and AH = 0 means that it is supported and the interrupt call was a success. In other words, if "ax = 0x4F" then the interrupt call was a success and not an error.Code: Select all
cmp ax, 0x4F
je .damn
Haha, you're right, I didn't even check that part...
I shouldn't assume stuff sometimes, huh.
Re: VESA VBE Problems
Thanks both of you guys. Now it gets stuck in this part:
Anywhooo?
Code: Select all
cmp dx, 0xFFFF            ; 0xFFFF marks the end of the mode list
je .damn2
mov ax, 0x4f01            ; VBE "Get Mode Information"
mov cx, dx                ; mode number to query
mov di, ModeBlock         ; ES:DI -> mode info buffer
int 0x10
cmp ax, 0x4F              ; AX = 0x004F means the call succeeded
jne .damn
mov ax, 800
cmp [ModeBlock.xres], ax  ; not 800 pixels wide? try the next mode
jne .loop
mov ax, 600
cmp [ModeBlock.yres], ax  ; not 600 pixels high? try the next mode
jne .loop
mov ax, 24
cmp [ModeBlock.bpp], ax   ; not 24 bits per pixel? try the next mode
jne .loop
mov eax, ModeBlock.pointer
push eax
mov ax, 0x4F02            ; VBE "Set Video Mode"
mov bx, dx                ; mode number that matched
or bx, 0x4000             ; bit 14 = use the linear framebuffer
int 0x10
cmp ax, 0x4f
jne .damn
Re: VESA VBE Problems
Hi,
Lukand wrote:Thanks both of you guys. Now it gets stuck in this part:
The nice thing about emulators like Bochs is that you can:
- Enable the "magic breakpoint" feature (when configuring and compiling Bochs); then insert an "xchg bx,bx" instruction anywhere you like in your source code to make Bochs' debugger stop at that point (see the small example after this list)
- Use Bochs' debugger to single-step from one instruction to the next
- Examine the state of the CPU (registers, etc), the contents of memory, the state of various devices (PIT, PIC, etc)
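For example, assuming the magic breakpoint is also enabled in your bochsrc (with a line like "magic_break: enabled=1"), dropping this just before the suspect call makes the debugger stop there so you can single-step through it:
Code: Select all
xchg bx, bx        ; Bochs "magic breakpoint" - effectively a no-op anywhere else
mov ax, 0x4f01     ; ...then single-step through the call you suspect
mov cx, dx
mov di, ModeBlock
int 0x10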
Note that by pre-setting the signature field to "VBE2" you're telling the "Get Controller Information" function that you've got a 512 byte buffer for its information; and VBE can store the list of video mode numbers in the "reserved" part of your 512 byte area. Your "InfoBlock" structure is not 512 bytes; so VBE just corrupts whatever is after that, and whatever is after that corrupts your "InfoBlock" structure. More specifically, each time you use the "Get Mode Information" function it trashes the "reserved" part of that 512 byte "InfoBlock" area that probably contains the list of video mode numbers, so (apart from the first time) you're probably asking VBE for mode information for dodgy/corrupted video mode numbers.
Note that the "ModeBlock" needs to be 256 bytes too. I don't know what is after that in memory (or what else you're corrupting).
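As a minimal sketch of what properly sized buffers could look like (NASM syntax; the label and field names here are only illustrative, not taken from your pastebin):
Code: Select all
InfoBlock:
    .signature  db "VBE2"                 ; pre-set so "Get Controller Information" returns VBE 2.0+ data
    times 512 - ($ - InfoBlock) db 0      ; pad the whole structure to 512 bytes; the mode list may live in this reserved area

ModeBlock:
    times 256 db 0                        ; "Get Mode Information" expects a full 256-byte buffer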
Apart from that; "cli" then "hlt" breaks if you or firmware get an NMI or SMI (it "unhalts" the CPU, so CPU begins executing whatever is after the "hlt") and it'd be better to do a ".die: hlt" and "jmp .die" with interrupts left enabled, so that on real computers "control+alt+delete" and other IRQs (like the one for turning off floppy motor) still work.
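In other words, something along these lines (just a sketch of the idea):
Code: Select all
sti             ; leave IRQs enabled so ctrl+alt+delete, the floppy motor timeout, etc. still work
.die:
    hlt         ; sleep until the next interrupt
    jmp .die    ; ...then go straight back to sleep instead of running off the end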
Also, most real video cards (not emulators) typically only support 24-bpp (old video cards) or 32-bpp (most/newer video cards); which means that when your code actually works it probably won't help much on real hardware anyway. My advice is to generate all your graphics in a "standard for you" format (e.g. 32-bpp) in a buffer in RAM, and then have multiple different functions to convert that into whatever the video mode happens to want (either during or before blitting it to display memory). This makes it much much easier to support many different pixel formats (and can have significant other advantages later on when your code gets more advanced).
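For a 24-bpp mode, such a conversion routine might look roughly like this (only a sketch: it assumes 32-bit protected mode, the usual BGR byte order, and illustrative register usage - ESI points at the 32-bpp buffer, EDI at display memory, ECX holds the pixel count):
Code: Select all
convert_32bpp_to_24bpp:
.pixel:
    mov eax, [esi]        ; load one 0x00RRGGBB pixel from the back buffer
    mov [edi], ax         ; write the blue and green bytes
    shr eax, 16
    mov [edi + 2], al     ; write the red byte
    add esi, 4            ; source advances 4 bytes per pixel
    add edi, 3            ; destination advances only 3
    dec ecx
    jnz .pixel
    ret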
Cheers,
Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
- BrightLight
Re: VESA VBE Problems
Lukand wrote:Code: Select all
mov ax, 24
cmp [ModeBlock.bpp], ax
jne .loop
The color depth is a byte value, not a word.
Lukand wrote:Code: Select all
mov eax, ModeBlock.pointer
This should be mov eax, [ModeBlock.pointer].
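Putting both of those fixes together, that part of the loop would look roughly like this (same labels as the posted code):
Code: Select all
cmp byte [ModeBlock.bpp], 24     ; bpp is a single byte, so compare only one byte
jne .loop
mov eax, [ModeBlock.pointer]     ; load the framebuffer address stored in the block,
push eax                         ; not the offset of the field itself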
You know your OS is advanced when you stop using the Intel programming guide as a reference.
Re: VESA VBE Problems
@Brendan thanks for the suggestions.
@omarrx24: Whoops. Thank you a lot too.
Re: VESA VBE Problems
It seems that VMware can't find a mode for 800x600 with either 24-bit or 32-bit color depth. It finds 30 modes, none of which is 800x600 with 24-bit color depth.
@Brendan : Thank you a lot! That 512-byte and 256-byte thing fixed it a bit.
Why won't the Bochs debugger appear on Win32 (in bochsdbg.exe)? I want to check all the modes that it found, because the data may be corrupt again.
- BrightLight
Re: VESA VBE Problems
I can confirm VMware supports 800x600x32. Perhaps you are doing something wrong then, but your post is not very informative...
You know your OS is advanced when you stop using the Intel programming guide as a reference.