GRUB2 VBE Set the highest and greatest possible video mode?
Posted: Sun Nov 01, 2015 12:37 am
by CrafterMan24
Hello, OSDev forum.
I have a correctly configured multiboot C OS.
GRUB2 sets the VBE mode on boot, I get the LFB address from the multiboot information structure, and I set the pixels.
I tell GRUB "just set this VBE mode" with this code in my bootstrap assembly:
MBALIGN  equ 1<<0                 ; align loaded modules on page boundaries
MEMINFO  equ 1<<1                 ; provide a memory map
VIDINFO  equ 1<<2                 ; request video mode information
FLAGS    equ MBALIGN | MEMINFO | VIDINFO
MAGIC    equ 0x1BADB002           ; multiboot 1 magic number
CHECKSUM equ -(MAGIC + FLAGS)     ; magic + flags + checksum must sum to zero
section .multiboot
align 4
dd MAGIC
dd FLAGS
dd CHECKSUM
dd 0, 0, 0, 0, 0                  ; header_addr .. entry_addr (unused, flags bit 16 not set)
dd 0                              ; mode_type: 0 = linear graphics mode
dd 1024, 768, 32                  ; requested width, height, depth
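For reference, the kernel side of this (reading the LFB address from the multiboot information structure GRUB passes in EBX and plotting a pixel) looks roughly like the sketch below; the struct only covers the framebuffer fields, and its name is illustrative rather than my exact code:

#include <stdint.h>

/* Partial view of the Multiboot 1 information structure: only the
   framebuffer fields are named, everything else is padded out.
   They are valid only when flags bit 12 is set. Offsets follow the
   Multiboot 1 specification. */
struct multiboot_info_fb {
    uint32_t flags;              /* offset 0 */
    uint8_t  skipped[84];        /* offsets 4..87: fields not needed here */
    uint64_t framebuffer_addr;   /* offset 88: physical LFB address */
    uint32_t framebuffer_pitch;  /* offset 96: bytes per scanline */
    uint32_t framebuffer_width;  /* offset 100 */
    uint32_t framebuffer_height; /* offset 104 */
    uint8_t  framebuffer_bpp;    /* offset 108 */
    uint8_t  framebuffer_type;   /* offset 109: 1 = direct RGB */
} __attribute__((packed));

static void put_pixel(const struct multiboot_info_fb *mbi,
                      uint32_t x, uint32_t y, uint32_t colour)
{
    if (!(mbi->flags & (1 << 12)))   /* GRUB didn't provide framebuffer info */
        return;

    uint8_t *fb = (uint8_t *)(uintptr_t)mbi->framebuffer_addr;
    uint32_t *pixel = (uint32_t *)(fb + y * mbi->framebuffer_pitch
                                      + x * (mbi->framebuffer_bpp / 8));
    *pixel = colour;                 /* assumes the requested 32 bpp mode */
}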
Yes, it works great, but this resolution may not be supported on some monitors, and it can be much smaller than the monitor's best/highest resolution...
How can I make it automatically pick the greatest and highest resolution and depth mode?
Thanks
Oh, and I forgot, my GRUB config is:
menuentry "TestOS" {
multiboot /boot/OS.bin
}
And I'm using GRUB2, not a patched legacy GRUB.
@catnikita255, do you have any knowledge about this?
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 3:32 am
by Combuster
Don't use GRUB's video setting code, but drop to real mode and call VBE (or alternatively, use GOP on EFI machines, though that usually gives you far less options). Also mind that some monitors don't actually like the largest resolution the video card can do.
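To make "call VBE" concrete: the core of it is three INT 10h services. A minimal sketch in C, assuming you have some way to run a real-mode INT 10h with chosen register values (the regs16_t layout and realmode_int() helper below are placeholders for whatever mechanism you use, and 0x8000:0x0000 is an arbitrary scratch buffer below 1 MiB):

#include <stdint.h>

typedef struct {
    uint16_t di, si, bp, sp, bx, dx, cx, ax;
    uint16_t gs, fs, es, ds, eflags;
} __attribute__((packed)) regs16_t;

/* placeholder: executes INT <vector> in real mode (or v8086) with these registers */
void realmode_int(uint8_t vector, regs16_t *regs);

#define VBE_SEG 0x8000   /* 0x8000:0x0000 = linear 0x80000 */
#define VBE_OFF 0x0000

/* VBE function 00h: get controller info ("VESA"/"VBE2" block, incl. the mode list) */
static int vbe_get_controller_info(void)
{
    regs16_t r = {0};
    r.ax = 0x4F00;
    r.es = VBE_SEG;
    r.di = VBE_OFF;
    realmode_int(0x10, &r);
    return r.ax == 0x004F;          /* AL=4Fh supported, AH=00h success */
}

/* VBE function 01h: get mode info (resolution, bpp, LFB address) for one mode */
static int vbe_get_mode_info(uint16_t mode)
{
    regs16_t r = {0};
    r.ax = 0x4F01;
    r.cx = mode;
    r.es = VBE_SEG;
    r.di = VBE_OFF;
    realmode_int(0x10, &r);
    return r.ax == 0x004F;
}

/* VBE function 02h: set the mode, with bit 14 asking for the linear framebuffer */
static int vbe_set_mode(uint16_t mode)
{
    regs16_t r = {0};
    r.ax = 0x4F02;
    r.bx = mode | 0x4000;
    realmode_int(0x10, &r);
    return r.ax == 0x004F;
}

The mode list inside the controller info block is what you would then walk, querying each mode with function 01h, to find the best one your OS and the monitor both support.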
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 3:37 am
by CrafterMan24
Combuster wrote:Don't use GRUB's video setting code, but drop to real mode and call VBE (or alternatively, use GOP on EFI machines, though that usually gives you far less options). Also mind that some monitors don't actually like the largest resolution the video card can do.
Previously I was dropping to real mode and calling VBE, but I don't think that approach is clean, and it doesn't run in VMware...
It also really breaks paging.
Thanks for your answer. Is there a way to auto-detect this in GRUB?
Maybe by modifying the GRUB source code?
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 3:39 am
by osdever
CrafterMan24 wrote:
@catnikita255
Do not use my nick!
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 3:40 am
by Combuster
CrafterMan24 wrote:Previously I was dropping to real mode and calling VBE, but I don't think that approach is clean, and it doesn't run in VMware...
It also really breaks paging.
So you have a bug. Do you want to talk about it?
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 3:41 am
by CrafterMan24
catnikita255 wrote:CrafterMan24 wrote:
@catnikita255
Do not use my nick!
Lol, I was just mentioning you because you had a topic about GRUB VBE.
I'm not using your nick; like on XenForo, I was trying to mention you in the thread, but it looks like that feature doesn't work in phpBB.
I'm not sure why you got so angry...
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 3:41 am
by CrafterMan24
Combuster wrote:CrafterMan24 wrote:Previously I was dropping to real mode and calling VBE, but I don't think that approach is clean, and it doesn't run in VMware...
It also really breaks paging.
So you have a bug. Do you want to talk about it?
Yes, what could cause it to not run in VMware?
I'm rewriting my VBE code for dropping to real mode now.
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 4:06 am
by Combuster
My crystal ball doesn't seem to be working today.
How about you tell us what you're really doing to get there?
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 5:21 am
by CrafterMan24
Combuster wrote:My crystal ball doesn't seem to be working today.
How about you tell us what you're really doing to get there?
My problem is:
I downloaded and used Napalm's real mode interrupt driver (it switches back to real mode, does the interrupt, and re-enters protected mode, like you said).
First I tried VGA mode 12h.
It worked in QEMU, Bochs, VMware, VirtualBox, and on real hardware.
Then I tried VBE 1024x768x24 and...
It worked in QEMU, Bochs, VirtualBox, and on real hardware...
But it did not work in VMware...
In VMware, the display does not resize and stays in normal text mode...
I can't debug in VMware because I don't know anything about VMware debugging; I only have a serial debugger, and it works on QEMU.
#######################
#######################
Edit: I finally debugged in VMware, and this is the result:
61440x2161x8, LFB address: 0xf0000871 (this is the result I got from VMware)
800x600x15, LFB address: 0xfd000000 (this is the result I got from the other, correctly running machines)
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 7:47 am
by max
Combuster wrote:Don't use GRUB's video setting code, but drop to real mode and call VBE (or alternatively, use GOP on EFI machines, though that usually gives you far less options). Also mind that some monitors don't actually like the largest resolution the video card can do.
* Don't use GRUB's video setting code, but write a virtual 8086 monitor and call VBE.
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 10:56 am
by Octocontrabass
max wrote:Combuster wrote:Don't use GRUB's video setting code, but drop to real mode and call VBE (or alternatively, use GOP on EFI machines, though that usually gives you far less options). Also mind that some monitors don't actually like the largest resolution the video card can do.
* Don't use GRUB's video setting code, but write a virtual 8086 monitor and call VBE.
Don't use GRUB's video setting code, but write an x86 emulator and call VBE.
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 11:03 am
by CrafterMan24
Octocontrabass wrote:max wrote:Combuster wrote:Don't use GRUB's video setting code, but drop to real mode and call VBE (or alternatively, use GOP on EFI machines, though that usually gives you far less options). Also mind that some monitors don't actually like the largest resolution the video card can do.
* Don't use GRUB's video setting code, but write a virtual 8086 monitor and call VBE.
Don't use GRUB's video setting code, but write an x86 emulator and call VBE.
lol, both VBE with an x86 emulator and dropping to real mode fail in VMware...
Not sure what the bug is; the debugger output is in my last post...
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 11:48 am
by Brendan
Hi,
CrafterMan24 wrote:How can I make it automatically pick the greatest and highest resolution and depth mode?
My approach was/is:
- Get the monitor's EDID. Note that in some cases you can't get the monitor's EDID from the monitor and need/want some way for the user to provide it via a file. For the worst case (no EDID at all), assume the monitor can only handle the "VESA safe mode timings" (which is mostly one 640*480 @ 60 Hz video timing that all monitors are supposed to support).
- Get a list of video modes from VBE and VGA. Filter out anything your OS doesn't support. Note that for VBE some modes allow horizontal and/or vertical pixel doubling and can be used to create additional video modes (e.g. for a 1920*1600 video mode you might be able to create additional 1920*800, 960*1600 and 960*800 video modes). Also (for VBE 3.0) you might be able to use "CRTC info" to create even more modes (e.g. 800*600 @ 60 Hz, 800*600 @ 80 Hz, 800*600 @ 120 Hz, ...); and this should be done using feedback from the monitor's EDID (keeping within frequencies that both the video card and monitor support; and trying to match specific timings that the monitor says it supports).
- For each video mode, calculate:
- a score representing how likely it is that the video card supports it properly. Various shenanigans (using "CRTC info", using palette, etc) increase the risk that the video card won't handle it properly.
- a score representing how likely it is that the monitor supports it properly. This is complicated, because (without using "CRTC info" to force a specific video mode timing) it's impossible to know what timing any VBE mode actually uses. For an 800*600 video mode, if the monitor happens to support 3 different 800*600 timings then it's more likely that monitor will support the video card's 800*600 mode, and if the monitor only supports one 800*600 timing then it's less likely the monitor will like the video card's 800*600 mode. Also note that there is a set of "sort of standard" modes (indicated by EDID's "established timings") that are more likely to be what the video card provides, and if the monitor supports (e.g.) the "sort of standard 800*600" mode then that's more important than whether it supports other 800*600 modes.
- a "reliability rating" for the video mode; which would be calculated by combining the "how likely video card supports it properly" score and the "how likely monitor supports it properly" score.
- an "OS rating" saying how much the OS likes the video mode. This takes into account things like available RAM and CPU speeds (e.g. you really want to avoid 4096*2160 when you've only got 16 MiB of RAM and the CPU is an old 166 MHz Pentium). Note that you should only care about software rendering for this rating - if there's a native video driver capable of hardware accelerated rendering the native video driver can change video mode to suit itself (and should be able to do a far better job of choosing a video mode than boot code using VBE ever will).
- a "monitor rating" saying how much the monitor likes the video mode. Note that most monitors have a "preferred video mode" (representing their LCD panel's native resolution) plus other preferences; and if (e.g.) the monitor's preferred video mode is 1440*1200 then 1024*768 might get a low "monitor rating" (because 1024 isn't a multiple of 1440 and 768 isn't a multiple of 1200) and 800*600 might get a better "monitor rating" (because 600 is exactly half of 1200 and the results won't look as blurred when the monitor scales it). Also note that most monitors have a "bits per primary colour" - if one monitor supports 8 bits per primary colour it might give 16-bpp video modes a lower score than 32-bpp video modes, and if another monitor supports 6 bits per primary colour it might give 16-bpp video modes almost the same score as 32-bpp video modes.
- a "user rating" saying what the user wants. This may or may not be derived from actual end user settings. In the past I've always had a set of "preferred horizontal resolution, vertical resolution, colour depth and refresh rate" boot variables that the user can set to whatever they like; but most users just want "maximum everything" and it goes against my "just works, without end user hassles" ideals. For the next version of my OS it's likely the OS will use heuristics to generate the "user rating" itself without any end user settings.
- a total score for the video mode, which combines the reliability rating, the OS's rating, the monitor's rating and the user's rating (e.g. maybe "score = reliability * (OS_preference + user_preference + monitor_preference);" - see the sketch after this list).
- Choose the video mode with the highest total score.
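A minimal sketch of how that final combination and selection might look in C; the struct fields, the 0..100 scale, and the function names are purely illustrative (they are not from any existing OS), and all the real work is in computing the individual ratings as described above:

#include <stdint.h>
#include <stddef.h>

struct mode_candidate {
    uint16_t width, height, bpp;
    uint32_t reliability;     /* how likely the card + monitor handle it (0..100) */
    uint32_t os_rating;       /* how much the OS likes it (0..100) */
    uint32_t monitor_rating;  /* how much the monitor likes it (0..100) */
    uint32_t user_rating;     /* how well it matches user preferences (0..100) */
};

static uint32_t mode_score(const struct mode_candidate *m)
{
    /* e.g. score = reliability * (OS + user + monitor preference) */
    return m->reliability * (m->os_rating + m->user_rating + m->monitor_rating);
}

static const struct mode_candidate *pick_best_mode(const struct mode_candidate *modes,
                                                   size_t count)
{
    const struct mode_candidate *best = NULL;
    uint32_t best_score = 0;

    for (size_t i = 0; i < count; i++) {
        uint32_t score = mode_score(&modes[i]);
        if (best == NULL || score > best_score) {
            best = &modes[i];
            best_score = score;
        }
    }
    return best;   /* NULL if the list is empty */
}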
Of course how you design the calculations is something that's going to take research, trial and error, and tweaking. For my first attempt the reliability ratings had a strong effect (I really wanted to avoid choosing something that didn't work) and anything that looked like it might be "640*480 @ 60 Hz VESA safe mode timing" was given a reliability rating boost (in addition to almost always getting good reliability ratings without the boost and getting good scores from the OS rating); and the end result was that it was very hard to convince the OS to choose anything other than 640*480 @ 60 Hz video modes.
Cheers,
Brendan
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 11:58 am
by Brendan
Hi,
CrafterMan24 wrote:lol, both VBE with an x86 emulator and dropping to real mode fail in VMware...
Not sure what the bug is; the debugger output is in my last post...
CrafterMan24 wrote:61440x2161x8, LFB address: 0xf0000871 (this is the result I got from VMware)
If you convert that to hex you get: 0xF000 * 0x0871 * 0x0008 bpp at 0xF0000871.
My guess is that the interrupt service routine that the BIOS uses for "invalid interrupt" is at 0xF000:0x0871; and you've used a NULL pointer or something (instead of a valid pointer to VBE mode information) to get dodgy values from the real mode IVT.
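A short illustration of the kind of mistake Brendan is hinting at: the far pointers VBE hands back (e.g. VideoModePtr in the controller info block) are real-mode segment:offset pairs, so they have to be converted before being dereferenced from protected mode; treating them as linear addresses (or dereferencing a pointer that was never filled in) reads whatever happens to sit in low memory, such as IVT/BIOS data. A hedged sketch, with the buffer location and names assumed:

#include <stdint.h>

/* Convert a real-mode far pointer (low word = offset, high word = segment)
   into a linear address: linear = segment * 16 + offset. */
static inline void *real_ptr(uint32_t segoff)
{
    uint32_t seg = segoff >> 16;
    uint32_t off = segoff & 0xFFFF;
    return (void *)((seg << 4) + off);   /* 0xF000:0x0871 -> 0x000F0871, not 0xF0000871 */
}

/* Example: walk the mode list the controller info block points to.
   vbe_info is wherever your function-00h buffer lives; offset 14 holds VideoModePtr. */
static void walk_mode_list(const uint8_t *vbe_info)
{
    uint32_t video_mode_ptr = *(const uint32_t *)(vbe_info + 14);
    const uint16_t *modes = (const uint16_t *)real_ptr(video_mode_ptr);

    for (int i = 0; modes[i] != 0xFFFF; i++) {
        /* call VBE function 01h for modes[i] and inspect/score the result */
    }
}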
Of course without seeing any code it's hard to guess what the problem/s might be...
Cheers,
Brendan
Re: GRUB2 VBE Set the highest and greatest possible video mo
Posted: Sun Nov 01, 2015 12:27 pm
by CrafterMan24
Brendan wrote:Hi,
CrafterMan24 wrote:lol, both VBE with an x86 emulator and dropping to real mode fail in VMware...
Not sure what the bug is; the debugger output is in my last post...
CrafterMan24 wrote:61440x2161x8, LFB address: 0xf0000871 (this is the result I got from VMware)
If you convert that to hex you get: 0xF000 * 0x0871 * 0x0008 bpp at 0xF0000871.
My guess is that the interrupt service routine that the BIOS uses for "invalid interrupt" is at 0xF000:0x0871; and you've used a NULL pointer or something (instead of a valid pointer to VBE mode information) to get dodgy values from the real mode IVT.
Of course without seeing any code it's hard to guess what the problem/s might be...
Cheers,
Brendan
Oh, so it is a problem in my code, because I was copying the empty VBE info and mode info into an empty buffer, changing values in that buffer, and copying them back into the VBE info and mode info again...
I tried without the buffers and did the operations on the mode info and VBE info directly, but I got a bunch of "uint32_t vs modeInfoBlock" errors...
I'm sure the x86 emulator works, because I tried other interrupts and they worked perfectly. So yes, I have to fix my code...
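One way around those "uint32_t vs modeInfoBlock" type clashes is to copy the 256-byte block the BIOS filled into the low-memory buffer into a properly laid-out struct once, and work only on that copy afterwards. A sketch, naming just the fields this thread uses and padding out the rest (assuming the kernel provides memcpy; offsets follow the VBE 2.0/3.0 spec):

#include <stdint.h>
#include <string.h>   /* or your kernel's own memcpy */

/* Partial VBE ModeInfoBlock layout. */
struct vbe_mode_info {
    uint16_t mode_attributes;     /* offset 0: bit 7 = linear framebuffer available */
    uint8_t  skipped0[14];        /* offsets 2..15 */
    uint16_t bytes_per_scanline;  /* offset 16 */
    uint16_t x_resolution;        /* offset 18 */
    uint16_t y_resolution;        /* offset 20 */
    uint8_t  skipped1[3];         /* offsets 22..24 */
    uint8_t  bits_per_pixel;      /* offset 25 */
    uint8_t  skipped2[14];        /* offsets 26..39 */
    uint32_t phys_base_ptr;       /* offset 40: physical LFB address */
    uint8_t  skipped3[212];       /* remainder of the 256-byte block */
} __attribute__((packed));

static struct vbe_mode_info mode_info;

/* Copy the block out of the real-mode buffer right after the 4F01h call,
   before anything else can overwrite that low-memory area. */
static void capture_mode_info(const void *low_mem_buffer)
{
    memcpy(&mode_info, low_mem_buffer, sizeof mode_info);
}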