GRUB2 VBE Set the highest and greatest possible video mode?
- CrafterMan24
- Member
- Posts: 28
- Joined: Sun Nov 01, 2015 12:19 am
GRUB2 VBE Set the highest and greatest possible video mode?
Hello, OSDev forum.
I have a correctly configured multiboot C OS.
GRUB2 sets the VBE mode on boot; I get the LFB address from the multiboot header and set the pixels.
I tell GRUB "just set this VBE mode" with this code in my bootstrap assembly:
MBALIGN  equ 1<<0              ; align loaded modules on page boundaries
MEMINFO  equ 1<<1              ; provide a memory map
VIDINFO  equ 1<<2              ; provide video mode information
FLAGS    equ MBALIGN | MEMINFO | VIDINFO
MAGIC    equ 0x1BADB002        ; multiboot header magic number
CHECKSUM equ -(MAGIC + FLAGS)  ; magic + flags + checksum must equal 0

section .multiboot
align 4
dd MAGIC
dd FLAGS
dd CHECKSUM
dd 0, 0, 0, 0, 0               ; header_addr etc., unused (flag bit 16 not set)
dd 0                           ; mode_type: 0 = linear graphics mode
dd 1024, 768, 32               ; requested width, height, depth
Yes, it works great, but this resolution may not be supported on some monitors, and it can be much smaller than the monitor's best resolution...
How can I automatically set the highest resolution and depth the monitor supports?
Thanks
Oh, and I forgot: my GRUB config is:
menuentry "TestOS" {
multiboot /boot/OS.bin
}
And I'm using GRUB 2, not patched legacy GRUB.
@catnikita255, do you have any knowledge about this?
Last edited by CrafterMan24 on Sun Nov 01, 2015 3:42 am, edited 1 time in total.
- Combuster
- Member
- Posts: 9301
- Joined: Wed Oct 18, 2006 3:45 am
- Libera.chat IRC: [com]buster
- Location: On the balcony, where I can actually keep 1½m distance
- Contact:
Re: GRUB2 VBE Set the highest and greatest possible video mo
Don't use GRUB's video setting code, but drop to real mode and call VBE (or alternatively, use GOP on EFI machines, though that usually gives you far less options). Also mind that some monitors don't actually like the largest resolution the video card can do.
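For reference, a minimal sketch in C of the VBE calls being described here. The real_mode_int helper, the regs16_t layout, and the low-memory buffer address are assumptions (you have to supply the real-mode interrupt mechanism yourself); the INT 0x10 functions and structure offsets follow the VBE specification:

#include <stdint.h>

/* Hypothetical helper: drops to real mode, executes INT intnum with the
   given register values, writes the results back, and returns to
   protected mode. */
typedef struct {
    uint16_t di, si, bp, sp, bx, dx, cx, ax;
    uint16_t gs, fs, es, ds, eflags;
} regs16_t;
extern void real_mode_int(uint8_t intnum, regs16_t *regs);

#define VBE_BUF 0x7000  /* assumed free, identity-mapped area below 1 MiB */

void pick_vbe_mode(void)
{
    regs16_t r = {0};

    /* INT 0x10, AX=0x4F00: get controller info into ES:DI. */
    r.ax = 0x4F00;
    r.es = VBE_BUF >> 4;
    r.di = VBE_BUF & 0xF;
    real_mode_int(0x10, &r);
    if (r.ax != 0x004F)
        return;  /* VBE not supported */

    /* VideoModePtr is a real-mode far pointer at offset 14 of the info
       block: offset word first, then segment word. */
    uint8_t *info = (uint8_t *)VBE_BUF;
    uint16_t off = *(uint16_t *)(info + 14);
    uint16_t seg = *(uint16_t *)(info + 16);
    uint16_t *modes = (uint16_t *)((uint32_t)seg * 16 + off);

    for (; *modes != 0xFFFF; modes++) {
        /* INT 0x10, AX=0x4F01, CX=*modes: mode info into ES:DI; check
           the width/height/bpp fields and remember the best candidate. */
    }

    /* INT 0x10, AX=0x4F02, BX = chosen mode | 0x4000 (bit 14 = use the
       linear framebuffer) then sets the mode. */
}

Note that every buffer the BIOS writes to has to live below 1 MiB, which is one of the classic ways this goes wrong.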
- CrafterMan24
- Member
- Posts: 28
- Joined: Sun Nov 01, 2015 12:19 am
Re: GRUB2 VBE Set the highest and greatest possible video mo
Combuster wrote:Don't use GRUB's video setting code, but drop to real mode and call VBE (or alternatively, use GOP on EFI machines, though that usually gives you far less options). Also mind that some monitors don't actually like the largest resolution the video card can do.
Normally I was dropping to real mode and calling VBE, but I think it is not clean, and it doesn't run in VMware...
Also, it really breaks paging.
Thanks for your answer. Is there a way to auto-detect this in GRUB?
Maybe by modifying the GRUB source code?
Re: GRUB2 VBE Set the highest and greatest possible video mo
CrafterMan24 wrote:@catnikita255
Do not use my nick!
Developing U365.
Source:
only testing: http://gitlab.com/bps-projs/U365/tree/testing
OSDev newbies can copy any code from my repositories, just leave a notice that this code was written by U365 development team, not by you.
- Combuster
- Member
- Posts: 9301
- Joined: Wed Oct 18, 2006 3:45 am
- Libera.chat IRC: [com]buster
- Location: On the balcony, where I can actually keep 1½m distance
- Contact:
Re: GRUB2 VBE Set the highest and greatest possible video mo
CrafterMan24 wrote:Normally I was dropping to real mode and calling VBE, but I think it is not clean, and it doesn't run in VMware... Also, it really breaks paging.
So you have a bug. Do you want to talk about it?
- CrafterMan24
- Member
- Posts: 28
- Joined: Sun Nov 01, 2015 12:19 am
Re: GRUB2 VBE Set the highest and greatest possible video mo
catnikita255 wrote:Do not use my nick!
Lol, I just called you because you had a topic about GRUB VBE.
I'm not using your nick; like in XenForo, I'm mentioning you in the thread, but it looks like this function doesn't work in phpBB.
I'm not sure why you got so angry...
Last edited by CrafterMan24 on Sun Nov 01, 2015 3:47 am, edited 3 times in total.
- CrafterMan24
- Member
- Posts: 28
- Joined: Sun Nov 01, 2015 12:19 am
Re: GRUB2 VBE Set the highest and greatest possible video mo
Combuster wrote:So you have a bug. Do you want to talk about it?
Yes. What could cause it not to run in VMware?
I'm rewriting my VBE code to drop to real mode now.
- Combuster
- Member
- Posts: 9301
- Joined: Wed Oct 18, 2006 3:45 am
- Libera.chat IRC: [com]buster
- Location: On the balcony, where I can actually keep 1½m distance
- Contact:
Re: GRUB2 VBE Set the highest and greatest possible video mo
My crystal ball doesn't seem to be working today.
How about you tell us what you're really doing to get there?
- CrafterMan24
- Member
- Posts: 28
- Joined: Sun Nov 01, 2015 12:19 am
Re: GRUB2 VBE Set the highest and greatest possible video mo
Combuster wrote:My crystal ball doesn't seem to be working today.
How about you tell us what you're really doing to get there?
My problem is:
I downloaded and used Napalm's real mode interrupt driver (it switches back to real mode, does the interrupt, and enters protected mode again, like you said).
First I tried VGA mode 12h.
It worked in QEMU, Bochs, VMware, VirtualBox, and real hardware.
Then I tried VBE 1024x768x24 and...
It worked in QEMU, Bochs, VirtualBox, and real hardware...
But it didn't work in VMware...
In VMware, the display doesn't resize and stays in normal text mode...
I couldn't debug in VMware because I don't know about VMware debugging; I only have a serial debugger, and it works on QEMU.
#######################
Edit: I finally debugged it in VMware, and this is the result:
61440x2161x8, LFB address: 0xf0000871 (this is the result I got from VMware)
800x600x15, LFB address: 0xfd000000 (this is the result I got from the other, correctly running machines)
- max
- Member
- Posts: 616
- Joined: Mon Mar 05, 2012 11:23 am
- Libera.chat IRC: maxdev
- Location: Germany
- Contact:
Re: GRUB2 VBE Set the highest and greatest possible video mo
Combuster wrote:Don't use GRUB's video setting code, but drop to real mode and call VBE (or alternatively, use GOP on EFI machines, though that usually gives you far less options). Also mind that some monitors don't actually like the largest resolution the video card can do.
* Don't use GRUB's video setting code, but write a virtual 8086 monitor and call VBE.
- Octocontrabass
- Member
- Posts: 5588
- Joined: Mon Mar 25, 2013 7:01 pm
Re: GRUB2 VBE Set the highest and greatest possible video mo
max wrote:* Don't use GRUB's video setting code, but write a virtual 8086 monitor and call VBE.
Don't use GRUB's video setting code, but write an x86 emulator and call VBE.
- CrafterMan24
- Member
- Posts: 28
- Joined: Sun Nov 01, 2015 12:19 am
Re: GRUB2 VBE Set the highest and greatest possible video mo
Octocontrabass wrote:Don't use GRUB's video setting code, but write an x86 emulator and call VBE.
lol, both VBE with an x86 emulator and dropping to real mode fail in VMware...
Not sure what the bug is; the debugger output is in my last post...
Re: GRUB2 VBE Set the highest and greatest possible video mo
Hi,
CrafterMan24 wrote:How can I automatically set the highest resolution and depth the monitor supports?
My approach was/is:
- Get the monitor's EDID. Note that in some cases you can't get the monitor's EDID from the monitor and need/want some way for the user to provide it via a file. For the worst case (no EDID at all), assume the monitor can only handle the "VESA safe mode timings" (which is mostly one 640*480 @ 60 Hz video timing that all monitors are supposed to support).
- Get a list of video modes from VBE and VGA. Filter out anything your OS doesn't support. Note that for VBE some modes allow horizontal and/or vertical pixel doubling and can be used to create additional video modes (e.g. for a 1920*1600 video mode you might be able to create additional 1920*800, 960*1600 and 960*800 video modes). Also (for VBE 3.0) you might be able to use "CRTC info" to create even more modes (e.g. 800*600 @ 60 Hz, 800*600 @ 80 Hz, 800*600 @ 120 Hz, ...); and this should be done using feedback from the monitor's EDID (keeping within frequencies that both the video card and monitor support; and trying to match specific timings that the monitor says it supports).
- For each video mode, calculate:
- a score representing how likely it is that the video card supports it properly. Various shenanigans (using "CRTC info", using palette, etc) increase the risk that the video card won't handle it properly.
- a score representing how likely it is that the monitor supports it properly. This is complicated, because (without using "CRTC info" to force a specific video mode timing) it's impossible to know what timing any VBE mode actually uses. For an 800*600 video mode, if the monitor happens to support 3 different 800*600 timings then it's more likely that the monitor will support the video card's 800*600 mode, and if the monitor only supports one 800*600 timing then it's less likely the monitor will like the video card's 800*600 mode. Also note that there are a set of "sort of standard" modes (indicated by EDID's "established timings") that are more likely to be what the video card provides, and if the monitor supports (e.g.) the "sort of standard 800*600" mode then that's more important than whether it supports other 800*600 modes.
- a "reliability rating" for the video mode; which would be calculated by combining the "how likely video card supports it properly" score and the "how likely monitor supports it properly" score.
- an "OS rating" saying how much the OS likes the video mode. This takes into account things like available RAM and CPU speeds (e.g. you really want to avoid 4096*2160 when you've only got 16 MiB of RAM and the CPU is an old 166 MHz Pentium). Note that you should only care about software rendering for this rating - if there's a native video driver capable of hardware accelerated rendering the native video driver can change video mode to suit itself (and should be able to do a far better job of choosing a video mode than boot code using VBE ever will).
- a "monitor rating" saying how much the monitor likes the video mode. Note that most monitors have a "preferred video mode" (representing their LCD panel's native resolution) plus other preferences; and if (e.g.) the monitor's preferred video mode is 1440*1200 then 1024*768 might get a low "monitor rating" (because 1024 isn't a multiple of 1440 and 768 isn't a multiple of 1200) and 800*600 might get a better "monitor rating" (because 600 is exactly half of 1200 and the results won't look as blurred when the monitor scales it). Also note that most monitors have a "bits per primary colour" - if one monitor supports 8 bits per primary colour it might give 16-bpp video modes a lower score than 32-bpp video modes, and if another monitor supports 6 bits per primary colour it might give 16-bpp video modes almost the same score as 32-bpp video modes.
- a "user rating" saying what the user wants. This may or may not be derived from actual end user settings. In the past I've always had a set of "preferred horizontal resolution, vertical resolution, colour depth and refresh rate" boot variables that the user can set to whatever they like; but most users just want "maximum everything" and it goes against my "just works, without end user hassles" ideals. For the next version of my OS it's likely the OS will use heuristics to generate the "user rating" itself without any end user settings.
- a total score for the video mode, which combines the reliability rating, the OS's rating, the monitor's rating and the user's rating (e.g. maybe "score = reliability * ( OS_preference + user_preference + monitor_preference );"); see the sketch after this list.
- Choose the video mode with the highest total score.
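For illustration, a minimal sketch of that last step in C. The mode_candidate_t fields, their normalization, and how each rating gets produced are assumptions; only the combining formula comes from the list above:

#include <stddef.h>
#include <stdint.h>

/* Hypothetical per-mode ratings, each assumed normalized to 0.0..1.0. */
typedef struct {
    uint16_t mode_number;      /* VBE mode number */
    float reliability;         /* "card likes it" score * "monitor likes it" score */
    float os_preference;
    float monitor_preference;
    float user_preference;
} mode_candidate_t;

/* score = reliability * (OS_preference + user_preference + monitor_preference) */
static float mode_score(const mode_candidate_t *m)
{
    return m->reliability *
           (m->os_preference + m->user_preference + m->monitor_preference);
}

/* Choose the video mode with the highest total score. */
const mode_candidate_t *best_mode(const mode_candidate_t *list, size_t n)
{
    const mode_candidate_t *best = NULL;
    float best_score = -1.0f;
    for (size_t i = 0; i < n; i++) {
        float s = mode_score(&list[i]);
        if (s > best_score) {
            best_score = s;
            best = &list[i];
        }
    }
    return best;
}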
Cheers,
Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
Re: GRUB2 VBE Set the highest and greatest possible video mo
Hi,
CrafterMan24 wrote:lol, both VBE with an x86 emulator and dropping to real mode fail in VMware...
Not sure what the bug is; the debugger output is in my last post...
CrafterMan24 wrote:61440x2161x8, LFB address: 0xf0000871 (this is the result I got from VMware)
If you convert that to hex you get: 0xF000 * 0x0871 * 0x0008 bpp at 0xF0000871.
My guess is that the interrupt service routine that the BIOS uses for "invalid interrupt" is at 0xF000:0x0871; and you've used a NULL pointer or something (instead of a valid pointer to VBE mode information) to get dodgy values from the real mode IVT.
Of course without seeing any code it's hard to guess what the problem/s might be...
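As an aside, a minimal sketch of the far-pointer handling that is one classic way to end up with numbers like this; the function name is made up, but the segment*16+offset arithmetic is how real-mode far pointers work:

#include <stdint.h>

/* VBE structures hand back real-mode far pointers (offset in the low
   word, segment in the high word). Treating the raw 32-bit value as a
   flat pointer reads the wrong memory entirely - 0xF000:0x0871 is
   linear 0xF0871, not 0xF0000871 - which produces garbage like the
   values quoted above. Convert explicitly: */
static inline void *farptr_to_linear(uint32_t farptr)
{
    uint32_t seg = farptr >> 16;
    uint32_t off = farptr & 0xFFFF;
    return (void *)(seg * 16 + off);
}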
Cheers,
Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
- CrafterMan24
- Member
- Posts: 28
- Joined: Sun Nov 01, 2015 12:19 am
Re: GRUB2 VBE Set the highest and greatest possible video mo
Brendan wrote:My guess is that the interrupt service routine that the BIOS uses for "invalid interrupt" is at 0xF000:0x0871; and you've used a NULL pointer or something (instead of a valid pointer to VBE mode information) to get dodgy values from the real mode IVT.
Oh, so it is a problem in my code, because I was copying the empty VBE info and mode info to an empty buffer, changing values in that buffer, and copying them back to the VBE info and mode info again...
I tried without buffers and did the operations on the mode info and VBE info directly, but I got a bunch of "uint32_t vs modeInfoBlock" errors...
I'm sure the x86 emulator works, because I tried other interrupts and they worked perfectly. So yes, I must fix my code...
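For anyone hitting the same "uint32_t vs modeInfoBlock" errors: a minimal sketch of a packed ModeInfoBlock with just the fields used here. The field offsets follow the VBE specification; the struct and field names are made up, and the reserved blobs stand in for fields omitted from this sketch:

#include <stdint.h>

/* Partial VBE ModeInfoBlock. Byte packing is required so the offsets
   match what the BIOS writes (PhysBasePtr must land at offset 40). */
typedef struct __attribute__((packed)) {
    uint16_t attributes;     /* offset 0: mode attributes */
    uint8_t  reserved0[16];  /* offsets 2-17: window and pitch fields */
    uint16_t width;          /* offset 18: X resolution */
    uint16_t height;         /* offset 20: Y resolution */
    uint8_t  reserved1[3];   /* offsets 22-24 */
    uint8_t  bpp;            /* offset 25: bits per pixel */
    uint8_t  reserved2[14];  /* offsets 26-39 */
    uint32_t framebuffer;    /* offset 40: physical LFB base address */
} vbe_mode_info_t;

With this, casting the buffer the BIOS filled in to vbe_mode_info_t * and reading ->framebuffer gives the LFB address directly, with no intermediate copies to get wrong.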