Hi,
jal wrote:Are 0x00-0x1F valid Unicode code points? Never knew that...
They are valid code points, but they're also control characters (newline, tab, etc.) just like ASCII, which means you don't need font data for them. There's a similar range of control characters (code points 0x0080 to 0x009F), a few orphans (0x007F/delete and 0x00AD/soft hyphen), and some characters that look the same (e.g. space and "no-break space"). I ended up with a total of 189 individual characters in my font data.
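In other words, the test for "does this code point need a glyph" boils down to something like this (a Python sketch for illustration, not my actual code):

```python
def is_displayable(cp: int) -> bool:
    """Return True if a Unicode code point needs glyph data.

    C0 controls (0x00-0x1F), C1 controls (0x80-0x9F), delete (0x7F)
    and soft hyphen (0xAD) don't need font data.
    """
    if 0x00 <= cp <= 0x1F or 0x80 <= cp <= 0x9F:
        return False
    if cp in (0x7F, 0xAD):
        return False
    return True
```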
Here's the font data at 2 different sizes (top left is 6 * 8 pixels per character; the squares with diagonal lines correspond to "undisplayable" control characters):
jal wrote:It's completely resolution independent.
That's pretty cool, although you still may want some options for visually impaired (or the opposite) people.
It's modular. The module that displays the boot log using the video card could be replaced by a module that does anything else, including sending the boot log to a braille terminal, or doing morse code with the PC speaker, or...
The boot code itself allows several of these "display modules" at the same time (e.g. you could use four display modules and send the boot log to the video card, serial port, network card and sound card all at the same time).
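The idea is just a small interface that every backend implements, with the boot code fanning each log message out to all attached modules. A rough Python sketch (class and method names invented for illustration):

```python
class DisplayModule:
    """Hypothetical interface: each boot-log backend implements write()."""
    def write(self, text: str) -> None:
        raise NotImplementedError

class RecordingModule(DisplayModule):
    """Stand-in for a real backend (video, serial, network, ...)."""
    def __init__(self):
        self.sent = []
    def write(self, text: str) -> None:
        self.sent.append(text)

class BootLog:
    """Fan each boot-log message out to every attached display module."""
    def __init__(self):
        self.modules = []
    def attach(self, module: DisplayModule) -> None:
        self.modules.append(module)
    def log(self, text: str) -> None:
        for m in self.modules:
            m.write(text)
```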
jal wrote:Here's a screen shot
Anti-aliased and all. Impressive.
Thanks. Was wondering if anyone would notice the "anti-aliased-ness", although you haven't seen 6*8 characters on 8-bpp yet (it's ugly, and barely readable).
AJ wrote:Nice one - I know how much work you put in to the "behind the scenes reliability", too. Good to see how far it has come on. Out of interest, is the "% reliable" rating a measure of your confidence of whether a video mode is likely to work on the current adapter / monitor? If so, how do you arrive at that figure?
The full mess goes a little like this...
For each video mode listed by VBE, I try to create additional "derived" video modes. If the video card supports it, my code can use double-scanned modes and CRTC timing information, so that (for example) for one "1024 * 768, 32-bpp" video mode reported by VBE I might end up with:
- 1024 * 768, 32-bpp, unknown refresh rate (the original mode)
- 1024 * 768, 32-bpp @ 60.004 Hz refresh rate
- 1024 * 768, 32-bpp @ 70.069 Hz refresh rate
- 1024 * 768, 32-bpp @ 75.029 Hz refresh rate
- 1024 * 768, 32-bpp @ 84.997 Hz refresh rate
- 1024 * 384, 32-bpp @ 60.004 Hz refresh rate
- 1024 * 384, 32-bpp @ 70.069 Hz refresh rate
- 1024 * 384, 32-bpp @ 75.029 Hz refresh rate
- 1024 * 384, 32-bpp @ 84.997 Hz refresh rate
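The derivation itself is simple: each supported refresh rate gives one timed variant, and double scanning halves the vertical resolution for a second set of variants. A quick Python sketch of the idea:

```python
def derive_modes(width, height, bpp, refresh_rates):
    """From one VBE-reported mode, derive variants with known refresh
    rates, plus double-scanned variants at half the vertical resolution.

    Each mode is a (width, height, bpp, refresh_hz) tuple; the original
    mode's refresh rate is unknown, so it's recorded as None.
    """
    modes = [(width, height, bpp, None)]            # the original mode
    for hz in refresh_rates:
        modes.append((width, height, bpp, hz))       # timed variant
    for hz in refresh_rates:
        modes.append((width, height // 2, bpp, hz))  # double-scanned variant
    return modes
```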
Then I filter out every video mode that my OS doesn't support: anything with a horizontal resolution less than 320 pixels or a vertical resolution less than 200 pixels, anything that needs bank switching to access all of display memory, and anything that uses an unsupported pixel format (e.g. BGR or CMY instead of RGB). There are also horizontal resolution restrictions for different pixel formats (e.g. the horizontal resolution must be a multiple of 8 for all 8-bpp modes, a multiple of 4 for all 15-bpp and 16-bpp modes, etc.).
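As a sketch, the filter is just a predicate like this (the width multiple for 32-bpp is a guess for illustration; my real code has a rule per pixel format):

```python
# Horizontal resolution must be a multiple of this, per bits-per-pixel.
# The 8/15/16-bpp values come from the rules above; 32-bpp is assumed.
WIDTH_MULTIPLE = {8: 8, 15: 4, 16: 4, 32: 1}

def mode_supported(width, height, bpp, pixel_format, needs_banking):
    """Return True if the OS can use this video mode."""
    if width < 320 or height < 200:
        return False            # resolution too low
    if needs_banking:
        return False            # bank switching not supported
    if pixel_format != "RGB":
        return False            # e.g. BGR or CMY
    if width % WIDTH_MULTIPLE.get(bpp, 1) != 0:
        return False            # width restriction for this pixel format
    return True
```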
Then I use the EDID information from the monitor to estimate the probability that the monitor will be able to handle each video mode. This is complicated, because it's impossible to determine the video card's timing (without forcing the video card to use your own CRTC timings), and therefore impossible to determine whether the horizontal and vertical frequencies are within the monitor's limits. For example, a video card might report an 800 * 600 video mode, and the monitor might support 800 * 600 at 60 Hz but not 800 * 600 at 75 Hz, and you won't know what timing the video card uses by default.
If my code can use CRTC info to force the video card to use specific timings, then it'll give the video mode either a very high or a very low reliability rating, based on what EDID reported. If my code can't use CRTC info, then it can't be sure about the video card's timing. In that case it calculates the horizontal and vertical timings the video card might use if the refresh rate is 60 Hz, checks whether those are within the monitor's limits, and uses that as the basis of the reliability rating; then it finds out how many standard timings the monitor supports at the same resolution and adjusts the rating accordingly. For example (for an "800 * 600" video mode with unknown timing), if my code determines that the video mode's timing (at 60 Hz) would be supported by the monitor, it might give the mode a base rating of 66%; then, if the monitor says it supports "800 * 600 at 60 Hz" and "800 * 600 at 70 Hz" but doesn't support "800 * 600 at 80 Hz", my code would scale this rating by 2/3 and come up with a final rating of 44%.
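The arithmetic for that example looks like this (the "very low" value for a failed base check is just a placeholder, not what my real code uses):

```python
def reliability(timing_ok_at_60hz, timings_supported, timings_total):
    """Estimate a reliability rating (percent) for a mode with unknown timing.

    If the calculated 60 Hz timing is within the monitor's limits, start
    from a base rating of 66%, then scale by the fraction of the monitor's
    standard timings supported at this resolution.
    """
    if not timing_ok_at_60hz:
        return 5.0  # very low rating; exact value is a placeholder
    rating = 66.0
    if timings_total:
        rating *= timings_supported / timings_total
    return rating
```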
Of course if my code can't get EDID from the monitor, then it will assume the monitor is "standard VGA" and calculate reliability ratings based on that (and most high resolution video modes will get very low ratings). In addition, my code makes some further assumptions about video mode timings. For example, for standard VGA modes (e.g. "640 * 480 at 4-bpp", "320 * 200 at 8-bpp", etc.) it assumes the video card probably uses standard VGA timings, and those modes get higher reliability ratings because of it.
From here, my code uses the user's preferences and the reliability rating to select a video mode from the list. Eventually this selection will also take into account the monitor's preferences (as some monitors have "preferred" video modes that give better results - especially LCD monitors), but I haven't done this yet (in most cases the monitor says it prefers something like "1680 * 1050" and the video card won't support it anyway).
This entire system seems to work, sort of. However, it does mean that in a large number of cases (mostly video cards that don't support CRTC info, which is unfortunately most of them) the user doesn't get the video mode they want. For example, the user might set their preferences to "800 * 600 at 32-bpp @ 70 Hz" and get "640 * 480 at 4-bpp @ 59.950 Hz", because the reliability rating for this (standard VGA) video mode is high enough to override the user's preferences. It also causes problems on every emulator I've tested, because none of them return EDID information for the monitor - my code assumes a standard VGA monitor in this case, so most video modes end up with very low reliability ratings, which makes it almost impossible for the user to get anything other than standard VGA video modes.
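That override effect can be sketched like this (the preference bonus is an invented weight for illustration, not what my code actually uses):

```python
def choose_mode(rated_modes, preferred):
    """Pick a mode from a list of (mode, reliability 0-100) pairs.

    A mode matching the user's preference gets a bonus, but a much more
    reliable fallback (e.g. a standard VGA mode) can still win.
    The 50-point bonus is an assumption, not my real code's weighting.
    """
    def score(entry):
        mode, rel = entry
        return rel + (50 if mode == preferred else 0)
    return max(rated_modes, key=score)[0]
```

With a low-rated preferred mode and a high-rated VGA fallback, the fallback wins, which is exactly the behaviour described above.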
Note: I hacked my Bochs BIOS so that it will return EDID information for a standard 15 inch monitor.
I guess what I'm saying is that the code still needs some fine tuning, but it will never be perfect. Initially I wanted a fully automated boot that always gives reliable results (with no need for the user to intervene at all). In the end I had to implement an "interactive boot menu" to cope with the uncertainty, because it's impossible to guarantee reliable results using VBE regardless of how hard you try.
Cheers,
Brendan