Hi,
mallard wrote:Brendan wrote:I've never seen laptops in a corporate environment running an OS without native video drivers.
As I've said multiple times already, obviously native drivers are the goal. However, when they are not available you have to do something sensible.
Where "something sensible" is using VBE, VGA or UGA/GOP during boot to setup a video mode; and doesn't include wasting time with virtual80x86 mode and/or an emulator (for long mode) or doing things like diddling with chipset to try force the video card's ROM to work on UEFI, when you could've used that time to implement (basic/unaccelerated) native video drivers for one or more video cards instead.
mallard wrote:Brendan wrote:Ideally, when users connect a monitor, you want the video driver detect that and automatically get the monitor's EDID, and automatically set a video mode (hopefully the monitor's preferred/native resolution).
"Blindly" relying on EDID is often a bad idea. Sometimes the user may have a different preference (for whatever reason) or it might be simply non-available (e.g. a KVM switch is in the way - modern Linux distributions are awful in this case). User preference should take precedence over auto-detection.
The goal is to make the user's life easier (and not waste their time by forcing them to cope with a bunch of hassles when it could've been avoided). Part of this is "Plug & Play" - auto-detecting and auto-configuring hardware so the OS just works. There are cases where it's not possible and end-user hassle can't be avoided (and you do need usable work-arounds), but that's far less common and shouldn't be used as an excuse to avoid auto-detection and auto-configuration that does benefit most users most of the time.
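For what it's worth, once you've fetched the 128-byte EDID block (via DDC, the VBE/DDC functions during boot, or the firmware), pulling the monitor's preferred resolution out of it is trivial. A rough sketch (assuming EDID 1.3 or later, where the first detailed timing descriptor is the preferred mode; the function name is just for illustration):

[code]
/* Sketch: extract the preferred/native resolution from a 128-byte EDID block.
   Returns 1 on success, 0 if the EDID is missing/garbled (fall back to a
   default mode or the user's preference in that case). */
#include <stdint.h>

int edid_preferred_mode(const uint8_t edid[128], unsigned *width, unsigned *height)
{
    static const uint8_t header[8] = {0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00};
    const uint8_t *dtd = &edid[54];          /* first detailed timing descriptor */

    for (int i = 0; i < 8; i++) {
        if (edid[i] != header[i]) {
            return 0;                        /* bad header - EDID missing/garbled */
        }
    }
    if (dtd[0] == 0 && dtd[1] == 0) {
        return 0;                            /* pixel clock of 0: not a timing descriptor */
    }
    *width  = dtd[2] | ((unsigned)(dtd[4] & 0xF0) << 4);
    *height = dtd[5] | ((unsigned)(dtd[7] & 0xF0) << 4);
    return 1;
}
[/code]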
mallard wrote:However, giving the bootloader access to user preferences isn't all that straightforward. Either you have to update the bootloader's configuration every time the user changes something, or you have to extend the bootloader to read the OS's configuration. The latter means that the bootloader needs to be able to read your filesystem, parse your configuration files, etc.; and either way, you're introducing "tight coupling" between the bootloader and OS; something that the multiboot standard was designed to avoid.
It's not that hard, and it's very likely that you're going to need some way to pass settings to the boot loader for other reasons anyway.
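As a rough sketch of one (hypothetical) approach - the OS writes a small "key=value" settings file to the boot partition whenever the user changes a preference, and the boot loader parses it before choosing a video mode (the format, field names and function name below are purely illustrative):

[code]
/* Sketch: parse a tiny "key=value" boot settings file.
   A width/height of 0 means "no preference, use EDID/native mode". */
#include <stdlib.h>
#include <string.h>

struct boot_settings {
    unsigned video_width;
    unsigned video_height;
};

void parse_boot_settings(char *text, struct boot_settings *s)
{
    s->video_width = 0;
    s->video_height = 0;

    for (char *line = strtok(text, "\r\n"); line != NULL; line = strtok(NULL, "\r\n")) {
        char *eq = strchr(line, '=');
        if (eq == NULL) {
            continue;                       /* ignore malformed lines */
        }
        *eq = '\0';
        if (strcmp(line, "video_width") == 0) {
            s->video_width = (unsigned)strtoul(eq + 1, NULL, 10);
        } else if (strcmp(line, "video_height") == 0) {
            s->video_height = (unsigned)strtoul(eq + 1, NULL, 10);
        }
    }
}
[/code]

An empty or missing file just means "no preference" (fall back to EDID/native mode), so the boot loader never needs to read the OS's own configuration files and there's no tight coupling.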
mallard wrote:Brendan wrote:If VBE doesn't exist, it's impossible to use it (and "hard" is easy compared to "impossible").
Except that it does exist and can (probably) be relied on to exist for approximately another decade. (Outside of a few specialised devices/tablets and Apple systems, I don't see the BIOS CSM going anywhere quite yet.) Note that even for OSs that support (U)EFI boot, many drivers still require the CSM to be loaded (e.g. AMD graphics drivers on Linux).
It might take up to 10 years for BIOS to go away completely; but it might also take 10 years for your OS to become "feature complete"; which means that it's likely that BIOS will be gone before your OS is ready for actual use.
Also note that the main reason BIOS CSM still exists now is companies using Windows XP. Windows XP support ended in April.
mallard wrote:The hardest part about having native drivers isn't the time and effort required to develop them; it's the fact that you need the hardware to do so. So, unless you've got a big pile of money/hardware, it's largely impossible for a hobbyist.
You don't need to cover all hardware yourself. The idea is to make the OS "impressive enough" that other people are willing to help. This may mean family and friends that are willing to let you use their computer to test your drivers; and may mean other programmers see your OS and are willing to write drivers for you.
You also don't need to do this immediately. You might not be able to afford a video card now; but that doesn't mean you won't be able to afford a video card in the next 10 years.
Also note that with a single "ATI AtomBIOS" driver and perhaps 3 Intel video drivers, you've got about 90% of computers covered; and that setting up a video mode during boot is still "good enough" when there's no native driver.
mallard wrote:Brendan wrote:The big name OSs started using VBE about 20 years ago, when OSs like Win95 had to be able to run DOS applications (for marketing/backward compatibility reasons) and a big fat emulation layer had to be built into the OS anyway. Now, if they actually do still use VBE after boot it's extremely likely that the only reason they're doing it is because they're too lazy to rip out the old code they already had.
That's not right at all. Windows 9x never had any official VBE support. Their "fallback" was standard VGA. It was the 2000/XP era when Microsoft added appropriate drivers to Windows. Even Windows 10 has full support. Mac OS X has full VBE support (and can boot from BIOS), despite never officially supporting any hardware where it's needed. VBE support is poorest, but still available, on Linux (although Linux can also use a framebuffer set up by a bootloader).
Windows 9x had a big fat emulation layer for running DOS programs; including emulating hardware like the PIC, PIT, RTC, PS/2, etc; and including emulating the IVT, BDA, etc. It needed this big fat emulation layer because DOS programs were allowed to mess with hardware directly. If you've already got a big fat emulation layer like this, the amount of extra work involved in allowing the OS to use it for video mode switching is almost nothing.
If Microsoft didn't need the big fat emulation layer for DOS programs, would they have bothered supporting "driver-less video mode switching after boot"? My guess is that they probably still would have, because they've got thousands of developers; and it made more sense before UEFI existed, before multi-monitor became common, before people started expecting high resolutions, and before LCD monitors (which can only display one native resolution, and do low-quality rescaling internally when they don't get it) became the norm. I assume that you don't plan to support legacy/DOS programs, that you don't have thousands of developers, and that it's not 1990 anymore.
Also note that on UEFI; when there's no native driver Windows does not support video mode switching after boot.
OS X does not have VBE support ("boot camp" might have, but that's not OS X). They also have the least reason to bother - their (80x86) systems are UEFI anyway, and they only need to support the small number of video cards that they've chosen to provide (and don't need to support "white box" hardware like Windows or Linux does).
Linux is a random group of people doing anything they want. It's never an example of good engineering.
mallard wrote:Brendan wrote:If you use VBE (or "VGA BIOS functions", or UGA/GOP for UEFI) during boot you don't have to worry about a VGA driver (or low resolution/colour depth video modes) and you're still able to display a UI for anything. If the video card is so messed up that you can't use VBE (or "VGA BIOS functions" or UGA/GOP) during boot, then treat the computer as "headless" (until/unless there's a native video driver); or (if you don't support headless systems) display a "video card is unusable" error message (in text mode, or using "console output" on UEFI) and refuse to boot.
There's no reason why you can't have multiple "fallback" options and there's no particular disadvantage to having a standard VGA driver in addition to other options. At the very least, part of it is going to be needed to display that text-mode error message...
You're right - there's no reason you can't waste a lot of time on irrelevant stuff that users don't care about (instead of spending that time doing more important things that users do care about).
Cheers,
Brendan