Hi,
Tyler wrote: Would it be possible and/or safe to use a Generic "return to VGA mode" function after a Graphics card has been set up by its driver? I am hoping for more than just a "depends on the Graphics Card" if at all possible. I have begun attempts on an old Graphics Card with only some breakage, but would rather not kill any Cards or Monitors trying.
Combuster is right - AFAIK almost all video cards have a VGA mode and an extended mode, and you can't predict what will happen if VGA code is used while the card is in extended mode unless you know about the specific card itself.
I'm curious about what you mean by "breakage" though. In general, nothing any software does should damage any hardware; however, there are a few exceptions to this that I know of. The first is repeatedly placing stresses on disk drive mechanics (e.g. sending the floppy heads to track 99). The second is turning off CPU, GPU, drive bay, motherboard and/or case fans, which might cause things to be damaged by overheating (but CPUs and possibly other devices have over-temperature shutdown to prevent this).
Lastly, there are old "VGA only" monitors that can't handle non-VGA signals and do break. A few years ago I sold a perfectly good Pentium system (old, but working and tested) with one of these old monitors (which was carefully set up to boot Windows 95 in 800*600 mode) in a garage sale. After about 2 weeks the new owners started a game that tried to switch to 1024*768 video mode, which killed the monitor (they came back and I sold them a newer replacement monitor). This is the reason that most OSs don't use higher video modes unless the monitor supports DDC (where the monitor tells the video card/driver what it's capable of). It's also the reason why you don't buy computers at garage sales unless you know what you're doing...
Anyway, these old VGA-only monitors can die at higher resolutions *and* if they get messed-up signals from the video card. If the video driver enables "extended mode", then generic "return to VGA mode" code that expects the video card to be in "VGA mode" can mess things up and cause one of these old monitors to die. Newer monitors aren't a problem (you get messed-up video but don't damage anything).
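As an aside, checking what the monitor says it can handle doesn't need much code once you've got the EDID bytes over DDC. Here's a rough sketch in C - it assumes you've already read the 128-byte EDID block from the monitor (the DDC/I2C transfer itself is the card-specific part), and the function name is just made up for illustration:

Code:
#include <stdint.h>

/* Sketch: validate a 128-byte EDID 1.x block and pull out the preferred
   (native) resolution from the first detailed timing descriptor. */

static const uint8_t edid_header[8] = {0x00,0xFF,0xFF,0xFF,0xFF,0xFF,0xFF,0x00};

int edid_get_preferred_mode(const uint8_t *edid, int *width, int *height)
{
    uint8_t sum = 0;
    int i;

    for (i = 0; i < 8; i++)           /* fixed 8-byte header */
        if (edid[i] != edid_header[i]) return -1;

    for (i = 0; i < 128; i++)         /* all 128 bytes must sum to 0 mod 256 */
        sum += edid[i];
    if (sum != 0) return -1;

    /* The first detailed timing descriptor (offset 54) is normally the
       preferred/native mode; a pixel clock of zero means it isn't a timing. */
    const uint8_t *dtd = edid + 54;
    if (dtd[0] == 0 && dtd[1] == 0) return -1;

    *width  = dtd[2] | ((dtd[4] & 0xF0) << 4);   /* horizontal active pixels */
    *height = dtd[5] | ((dtd[7] & 0xF0) << 4);   /* vertical active lines */
    return 0;
}

If the EDID can't be read or doesn't pass these checks, falling back to plain VGA modes is the safe option, for exactly the reason above.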
Tyler wrote: Also, if that is a matter of Chipset Specific, does anyone know, because I am too lazy to check the code, how Linux switches from X11 to another console if X11 encapsulates the Driver?
I would assume that X switches back to "VGA mode" when it quits.
I wouldn't assume that running the video card ROM's initialization code will always work on all video cards. I wouldn't be surprised if the ROM initialization code in some video cards assumes that the card is in its "power-on" default state, and doesn't fully configure everything (it's not the sort of thing that all video card manufacturers would test for or worry about).
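For anyone who does want to experiment with it: the legacy video ROM normally lives at physical address 0x000C0000 and starts with a 0x55, 0xAA signature, with the init code's entry point at offset 3. A rough sketch of finding it (assuming that area is identity mapped; actually calling the entry point needs real mode, vm86 or an x86 emulator, which I've left out):

Code:
#include <stdint.h>

#define VIDEO_ROM_PHYS 0x000C0000UL   /* legacy video BIOS location */

/* Returns the physical address of the ROM's init entry point, or 0 if
   no option ROM signature is found (sketch - assumes identity mapping). */
uint32_t find_video_rom_entry(void)
{
    const volatile uint8_t *rom = (const volatile uint8_t *)VIDEO_ROM_PHYS;

    if (rom[0] != 0x55 || rom[1] != 0xAA)     /* option ROM signature */
        return 0;

    /* rom[2] = ROM size in 512-byte blocks; offset 3 is the init entry
       point (for PCI ROMs the firmware traditionally calls it with
       AH = bus and AL = device/function). Calling it again from a
       protected mode OS needs real mode, vm86 or an x86 emulator. */
    return (uint32_t)VIDEO_ROM_PHYS + 3;
}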
Lastly, the VGA has a standard reset flag that might (but might not - I haven't tested it on any video card) reset the card to its default/VGA state, even when the card was operating in "extended mode". It might be possible to have a blacklist/whitelist of video cards that "VGA reset" and/or "ROM initialization" does/doesn't work for, but then you're still out of options if the video card is on the blacklist.
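To be clear about which flag I mean - it's the reset register in the VGA sequencer (index 0 via ports 0x3C4/0x3C5). Pulsing it looks something like this sketch; whether it actually drags a card out of its extended mode is exactly the part I haven't tested:

Code:
#include <stdint.h>

/* Port output helper (GCC-style x86 inline assembly). */
static inline void outb(uint16_t port, uint8_t val)
{
    __asm__ __volatile__("outb %0, %1" : : "a"(val), "Nd"(port));
}

#define SEQ_INDEX 0x3C4   /* VGA sequencer index port */
#define SEQ_DATA  0x3C5   /* VGA sequencer data port */

/* Pulse the sequencer reset register (index 0): bit 0 = asynchronous
   reset, bit 1 = synchronous reset, 0 = asserted, so 0x03 = run. */
void vga_sequencer_reset(void)
{
    outb(SEQ_INDEX, 0x00);
    outb(SEQ_DATA, 0x01);     /* assert synchronous reset */
    outb(SEQ_INDEX, 0x00);
    outb(SEQ_DATA, 0x03);     /* release - back to normal operation */
}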
If you're thinking about fault tolerance (e.g. some sort of software that restores a generic video mode if the video driver crashes), then you're mostly out of luck. For fault tolerance I'd make sure your video drivers disable output when changing video modes (so if they crash during a video mode switch there's no output to the monitor to worry about - the user just gets a black screen). I'd also ensure that the video drivers fully initialize the video card when they start (and don't assume anything is in any specific state, although this is easier to say than to do). That would allow your OS to replace a crashed video driver with another video driver, and possibly to restart the same/crashed video driver. You might also be able to automatically reboot the computer if the video driver crashes.
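For the "disable output while switching modes" part, plain VGA at least has a "screen off" bit in the sequencer's clocking mode register (index 1, bit 5). Extended modes may need a card-specific method, but a VGA-level sketch (with made-up function names) would be:

Code:
#include <stdint.h>

static inline void outb(uint16_t port, uint8_t val)
{
    __asm__ __volatile__("outb %0, %1" : : "a"(val), "Nd"(port));
}
static inline uint8_t inb(uint16_t port)
{
    uint8_t val;
    __asm__ __volatile__("inb %1, %0" : "=a"(val) : "Nd"(port));
    return val;
}

#define SEQ_INDEX 0x3C4
#define SEQ_DATA  0x3C5

/* Blank/unblank the display using the "screen off" bit (bit 5) of the
   sequencer's clocking mode register (index 1). */
static void vga_set_screen_off(int off)
{
    outb(SEQ_INDEX, 0x01);
    uint8_t clocking = inb(SEQ_DATA);
    if (off) clocking |= 0x20; else clocking &= (uint8_t)~0x20;
    outb(SEQ_INDEX, 0x01);
    outb(SEQ_DATA, clocking);
}

/* Hypothetical wrapper - blank first, so a crash or failure mid-switch
   leaves the user with a black screen rather than garbage signals. */
int switch_mode_safely(int (*do_mode_switch)(void))
{
    vga_set_screen_off(1);
    int status = do_mode_switch();
    if (status == 0)
        vga_set_screen_off(0);   /* only unblank if the switch worked */
    return status;
}

The point of only unblanking on success is that a crashed or failed switch leaves a black screen instead of a monitor being fed signals it might not handle.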
Finally, don't forget that modern video cards/drivers may use bus mastering to transfer data between the computer's RAM and the video card's memory. It'd be possible for a video driver to set up a transfer and then crash, leaving DMA/bus mastering transfers running while no sane video driver controls them. For automatic fault recovery, any replacement video driver that's started would want to make sure any ongoing bus master transfers are stopped as soon as possible. You might also want the OS's memory manager to keep track of which pages are used for DMA/bus mastering, so that these pages are placed into a "cool down" list and not re-used if the driver crashes - the last thing you'd want is for the video driver to crash and for the RAM to be freed then re-allocated by other software while a DMA transfer is writing to it (so that whoever allocates the RAM gets their data trashed).
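As a sketch of the "stop bus mastering as soon as possible" part: a generic recovery path can at least clear the Bus Master Enable bit (bit 2 of the PCI command register) for the dead card, which stops it initiating new bus master cycles even when nothing sane is driving it (anything already in flight is device-specific). This uses the standard 0xCF8/0xCFC config mechanism; the function names are made up:

Code:
#include <stdint.h>

static inline void outl(uint16_t port, uint32_t val)
{
    __asm__ __volatile__("outl %0, %1" : : "a"(val), "Nd"(port));
}
static inline uint32_t inl(uint16_t port)
{
    uint32_t val;
    __asm__ __volatile__("inl %1, %0" : "=a"(val) : "Nd"(port));
    return val;
}

#define PCI_CONFIG_ADDRESS 0xCF8
#define PCI_CONFIG_DATA    0xCFC

static uint32_t pci_read32(uint8_t bus, uint8_t dev, uint8_t func, uint8_t offset)
{
    uint32_t address = 0x80000000u | ((uint32_t)bus << 16) |
                       ((uint32_t)dev << 11) | ((uint32_t)func << 8) |
                       (offset & 0xFC);
    outl(PCI_CONFIG_ADDRESS, address);
    return inl(PCI_CONFIG_DATA);
}

static void pci_write32(uint8_t bus, uint8_t dev, uint8_t func, uint8_t offset, uint32_t value)
{
    uint32_t address = 0x80000000u | ((uint32_t)bus << 16) |
                       ((uint32_t)dev << 11) | ((uint32_t)func << 8) |
                       (offset & 0xFC);
    outl(PCI_CONFIG_ADDRESS, address);
    outl(PCI_CONFIG_DATA, value);
}

/* Stop a dead card from starting new DMA: clear Bus Master Enable
   (bit 2 of the command register at config offset 0x04). */
void pci_disable_bus_master(uint8_t bus, uint8_t dev, uint8_t func)
{
    /* Keep the status half (upper 16 bits) zero so its write-1-to-clear
       bits aren't accidentally cleared by the write back. */
    uint32_t cmd = pci_read32(bus, dev, func, 0x04) & 0x0000FFFFu;
    cmd &= ~(uint32_t)(1u << 2);
    pci_write32(bus, dev, func, 0x04, cmd);
}

The "cool down" list itself doesn't need to be fancy - tag the pages with the time the driver died and only return them to the free page pool once enough time has passed that any in-flight transfer must have finished (or once the device has been reset).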
Cheers,
Brendan