Hi,
rdos wrote:Brendan wrote:In addition; both major 80x86 CPU manufacturers have been supplying chips with built-in GPUs for a while now (and this will get even more common - as computers get smaller and the built-in GPUs get faster the extra space/cost/hassle of a separate video card gets harder and harder to justify). In these cases there may be no separate video card and no "video card ROM". Instead the firmware for the CPU's built-in video is typically built into the system's firmware (not into the video card's pretend PCI device). If the system's firmware doesn't include support for legacy BIOS at all (which is extremely likely for small "tablet" devices and possibly also Apple computers now) then there's no reason for the manufacturer to bother including the unnecessary VBE code (for the CPU's video) in its system firmware.
The video card, along with network chips (primarily WiFi chips, because most wired chips today are RTL-compatible), is an exception in that it has not yet been standardized, and thus vendors still write operating system drivers themselves. I anticipate this will change in the future: we will see some kind of standard implementation of a graphics system and a standard WiFi chip. Until then, it is just foolish to spend time on writing native video drivers or WiFi drivers. Of course, if you have a specific project that can afford such massive work, it might be worthwhile; otherwise it will not be.
The only hardware standard we've ever seen for video was a de-facto VGA "standard", which was back when "PC compatible" meant "compatible with IBM's PC hardware". Because VGA wasn't designed as an official standard it didn't cover future extensions, and because video card manufacturers need a reason for people to buy their products they all had to add extensions that weren't covered by the "VGA non-standard".
For 80x86 video, the only thing I can see changing in the next 20 years is NVidia being unable to compete with cheaper/smaller "CPU+GPU" solutions and being pushed out of the market.
rdos wrote:Brendan wrote:In addition to that addition; VBE tends to rely on obsolete VGA compatibility which requires special hardware support to sustain (legacy VGA port forwarding, the legacy "VGA display window", etc). For "pure UEFI" there's no need for chipsets to include this special support, and (in the long term) it'd be extremely stupid to assume the chipset includes it.
VBE was an attempt to standardize graphics, which unfortunately is no longer so attractive. But until a new standardized hardware specification for graphics arrives (along with machines on the market that use it), VBE is the best we have.
VBE was an attempt to standardize a very small part of graphics (setting video modes and nothing else). VBE-AF attempted to standardize another part of graphics (2D/3D acceleration) but it failed; and (ignoring software interfaces intended for applications, like OpenGL and OpenCL) there's been no attempt to standardize the remainder (video memory management, GPGPU, etc). Basically; the only thing we've ever had is "something that provides bare minimum functionality during boot" where native video drivers are necessary to get more than the bare minimum.
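To illustrate just how small that part is: the whole VBE interface is driven through real-mode INT 10h calls (function 4F00h fills a 512-byte "VbeInfoBlock", 4F02h sets a mode). A minimal sketch of validating the returned block, assuming only the documented layout ("VESA" signature at offset 0, BCD version word at offset 4); the buffer in the usage example is synthetic, not read from real firmware:

```c
#include <stdint.h>
#include <string.h>

/* Parse the start of a VbeInfoBlock as returned by INT 10h, AX=4F00h.
 * Layout (VBE 3.0 spec): bytes 0-3 = "VESA", bytes 4-5 = BCD version word.
 * Returns the version (e.g. 0x0300 for VBE 3.0), or 0 on a bad signature. */
static uint16_t vbe_version(const uint8_t *info_block)
{
    if (memcmp(info_block, "VESA", 4) != 0)
        return 0;                      /* not a valid VbeInfoBlock */
    /* explicit little-endian 16-bit read, independent of host alignment */
    return (uint16_t)(info_block[4] | (info_block[5] << 8));
}
```

Everything past this signature check (the mode list, the framebuffer address per mode) is still just "set a mode during boot"; none of it touches acceleration or memory management.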
rdos wrote:UEFI has severely crippled VBE, so it is most sensible to just kick out UEFI and "reinstall" VBE.
UEFI discarded all software interfaces intended for BIOS (including VBE) and replaced them all with newer interfaces (including GOP). What is sensible is to use the correct interface (GOP) instead of attempting to reanimate the corpse of a deceased software interface that was intended for a completely different environment (VBE).
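GOP ends up even simpler for the OS than VBE: the firmware hands over a linear framebuffer plus an EFI_GRAPHICS_OUTPUT_MODE_INFORMATION structure, and the OS keeps drawing into that framebuffer after ExitBootServices(). The one detail people get wrong is that the stride is PixelsPerScanLine, which may exceed HorizontalResolution. A hedged sketch of the addressing; the struct below is a local stand-in containing only the fields the example needs (the real EFI structure has more members), and it assumes one of GOP's 32-bit pixel formats:

```c
#include <stdint.h>
#include <stddef.h>

/* Local stand-in for the fields of EFI_GRAPHICS_OUTPUT_MODE_INFORMATION
 * that pixel addressing needs; the real UEFI structure has more members. */
struct gop_mode_info {
    uint32_t horizontal_resolution;
    uint32_t vertical_resolution;
    uint32_t pixels_per_scan_line;   /* stride in pixels; may exceed width */
};

/* Byte offset of pixel (x, y) in the GOP framebuffer, assuming a 32-bit
 * pixel format (PixelRedGreenBlueReserved8BitPerColor or its BGR twin). */
static size_t gop_pixel_offset(const struct gop_mode_info *mi,
                               uint32_t x, uint32_t y)
{
    return ((size_t)y * mi->pixels_per_scan_line + x) * 4;
}
```

No real-mode call, no legacy VGA registers, no 0xC0000 mapping: the OS only needs the framebuffer base address and this structure, both handed over before boot services end.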
rdos wrote:Brendan wrote:To make this idiotic hack work reliably you'd need a whitelist/blacklist and (eventually) a whole bunch of "motherboard drivers" (to reconfigure chipset specific things like memory controllers, etc); plus a fall-back for the cases where firmware and/or hardware doesn't provide the necessary (now obsolete) requirements. If you do have a fall-back, then you can skip the rest and just use the fall-back on everything.
Not so. It works with no hacks at all on 4 different machines from different vendors (Intel, AMD, Sony, Samsung). While I have not booted any of them through UEFI yet (at least 3 have UEFI boot support), it is not likely that UEFI has not mapped the video BIOS at 0xC0000, which would be the only reason it wouldn't work under UEFI.
Um, what? You failed to test if VBE works when UEFI is used on 4 machines (out of the several million different machines that exist now, and several billion that will exist in future); and from this failure to test anything on an insignificant number of computers you conclude that VBE works now and will continue to work forever?
What you're saying is that VBE works now if you boot using BIOS and don't use UEFI at all. Wow - I'm sure nobody would've expected that.
rdos wrote:it is not likely that UEFI has not mapped the video BIOS at 0xC0000, which would be the only reason it wouldn't work under UEFI.
You're a complete moron. I fail to understand how someone could remain so ignorant and wrong.
If the video card has a ROM at all (it may not); then that ROM can contain several different "ROM images" (e.g. one for VBE and one for UEFI) and UEFI firmware may use the "UEFI ROM image" and not use VBE at all. Even if UEFI firmware does happen to use a "VBE ROM image" then it may map the "VBE ROM image" at 0xC0000 and then unmap it when you exit boot services.
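The multi-image layout is defined by the PCI Firmware Specification: each image begins with the 0x55 0xAA ROM signature, the word at offset 0x18 points at a "PCIR" data structure, and that structure records the image length (PCIR+0x10, in 512-byte units), the code type (PCIR+0x14: 0x00 = legacy x86/PC-AT, 0x03 = EFI), and a last-image flag (PCIR+0x15, bit 7). A sketch of walking such a ROM in a memory buffer, with error handling pared down:

```c
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>

/* Walk the images in a PCI expansion ROM buffer, printing each code type.
 * Per the PCI Firmware Spec: an image starts with 55 AA; offset 0x18 holds
 * the pointer to the "PCIR" structure; PCIR+0x10 = image length in 512-byte
 * units, PCIR+0x14 = code type, PCIR+0x15 bit 7 = last-image flag.
 * Returns the number of images found. */
static int walk_option_rom(const uint8_t *rom, size_t size)
{
    size_t off = 0;
    int count = 0;
    while (off + 0x1A <= size && rom[off] == 0x55 && rom[off + 1] == 0xAA) {
        uint16_t pcir = (uint16_t)(rom[off + 0x18] | (rom[off + 0x19] << 8));
        const uint8_t *p = rom + off + pcir;
        if (off + pcir + 0x16 > size ||
            p[0] != 'P' || p[1] != 'C' || p[2] != 'I' || p[3] != 'R')
            break;                          /* malformed PCIR structure */
        uint16_t len512 = (uint16_t)(p[0x10] | (p[0x11] << 8));
        printf("image %d: code type 0x%02x\n", count, p[0x14]);
        count++;
        if (p[0x15] & 0x80)                 /* last image in this ROM */
            break;
        off += (size_t)len512 * 512;
    }
    return count;
}
```

A UEFI firmware walking this list will happily load a "code type 0x03" image and never expose the legacy image anywhere, let alone at 0xC0000.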
Eventually everything will change to "pure UEFI" (rather than the current "BIOS+UEFI hybrid" systems we're seeing now), and then no new systems will need any "VBE ROM image" at all, and the hardware that obsolete VBE ROMs rely on may be completely removed from chipsets (in addition to the removal of A20 gate, PIC chips, PIT chip, PS2 controller, etc).
rdos wrote:Brendan wrote:RDOS is sort of "special"; because his support for video was designed years ago (when it was harder to predict the future) and now he's stuck with very badly designed video support (which makes him desperate to cling on to dying technology). He ignores the fact that nobody else here is cursed with the same problem; and (while continually failing to realise it) serves as a warning against bad design decisions.
I think you have failed to notice that the graphics API in RDOS is a standalone module. As such it can be replaced with another module (like a native graphics driver). But in the typical Windowing contexts you advocate, which by the way aren't able to create new consoles, or change resolution, the graphics API is so tightly connected to everything in the OS that changing this interface breaks everything. Not so in RDOS, which is why it is perfectly possible to write native video-drivers. It's just not practical with the large number of vendors that haven't made their interfaces public.
Um...
You can't change a "graphics API module" and expect all software (GUI, applications) that was designed for the old graphics API to suddenly rewrite itself to be compatible with a different graphics API. For every (sane) OS you can create as many new consoles as you like, whenever you like (but it's impossible for software to make new video/monitor hardware appear out of thin air). For the system I advocate, the only thing that needs to know/care about video modes is the video driver, and the graphics API that applications use is a high-level thing. It's impossible for applications to use "nothing" to describe what they want the user to see, and stupid to complain that applications and GUI both know about the graphics API that allows applications to talk to the GUI (or allows the GUI to talk to the video driver).
For video mode switching after boot; who actually cares? The fact is that modern monitors have a "native resolution", so you use that resolution (or as close to it as you can get) and never change video modes unless you change monitors. If you need to reboot to change video modes (because you don't have any native video driver); then nobody cares because nobody replaces their monitor that often anyway. The only exception to this is modern 3D games (which require 3D acceleration and therefore require a native video driver anyway, where the native video driver that is required anyway would be capable of switching video modes without a reboot).
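A native video driver (or the firmware) typically learns that native resolution from the monitor's EDID: the first 18-byte detailed timing descriptor, at offset 54 of the base EDID block, normally carries the preferred mode. A sketch of decoding the active pixel counts from such a descriptor (byte layout per the VESA EDID 1.4 spec; the bytes in the usage note below are synthetic):

```c
#include <stdint.h>

/* Decode horizontal/vertical active pixels from an 18-byte EDID detailed
 * timing descriptor (EDID 1.4: byte 2 = h-active low 8 bits, byte 4 high
 * nibble = h-active upper 4 bits; bytes 5 and 7 likewise for v-active). */
static void edid_native_mode(const uint8_t d[18],
                             unsigned *width, unsigned *height)
{
    *width  = d[2] | ((unsigned)(d[4] & 0xF0) << 4);
    *height = d[5] | ((unsigned)(d[7] & 0xF0) << 4);
}
```

For a 1920x1080 panel, for example, bytes 2/4 would carry 0x80/0x7_ and bytes 5/7 would carry 0x38/0x4_. Pick that mode once at driver start and "mode switching" stops being a feature anyone needs.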
The real answer here is that only people who have bad graphics APIs (with no resolution independence) need to care about video mode switching; because in this case either all applications have to support all video modes (which is insane) or you have to switch video modes whenever you switch between applications (which is insane, and means you can't have applications running in their own windows regardless of how much you want it and how hard you try to pretend that you don't want it).
Cheers,
Brendan