
How exactly do I generate the bitmask for font rendering?

Posted: Tue May 24, 2022 10:30 pm
by ThatCodingGuy89
This post concerns this page on the wiki: https://wiki.osdev.org/VGA_Fonts#Displaying_a_character

Specifically, the optimized version of the character-displaying routine. There is no explanation of how exactly this bitmask is generated (or hardcoded).

I don't know if I'm an idiot and have missed some computer-graphics meaning of "bitmask" that implies a particular structure, or if the page is just unclear on how to do things.

Re: How exactly do I generate the bitmask for font rendering

Posted: Wed May 25, 2022 2:40 am
by iansjack
I'm not quite sure what your problem is. The section "Decoding of bitmap fonts" shows you the structure of the character bitmap.

Edit: Ah - I think you're referring to the mask_table array. Sorry, I've no idea where that came from.

Re: How exactly do I generate the bitmask for font rendering

Posted: Wed May 25, 2022 3:01 am
by klange
While the math seems to have been left as an exercise for the reader, the mask table is basically the same data as the bitmap - just with whole bytes filled in instead of individual bits. Or rather, with whole pixels filled in - whatever that pixel size may be.

A brief explanation of the idea is that you have your bitmap representation of a glyph, made up of a byte per row, with bits representing columns, and you take this bitmap and expand it out so that each bit is now 8 bits. Note that for an 8-bit bitmap row, this means a mask row is 8 bytes - the example code seems to assume 32-bit values, so you need two of them, but I'm lazy and will use 64-bit values. So a row of 01101100b becomes 0x00FFFF00_FFFF0000. Except... we're probably on a little-endian machine, so this is actually backwards. It should be byte reversed to 0x0000FFFF_00FFFF00. We do this for every row in our bitmap and then the masking can be used to write multiple pixels at a time, which is probably faster than individual calls to a set-pixel function.
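A quick sketch of that expansion in C, assuming an 8bpp framebuffer, a little-endian machine, and two 32-bit mask words per row (the function name is made up, not from the wiki):

```c
#include <stdint.h>

/* Expand one 8-bit bitmap row (MSB = leftmost pixel) into two 32-bit
 * mask words for an 8bpp little-endian framebuffer. out[0] covers
 * pixels 0-3 (lowest addresses), out[1] covers pixels 4-7. Each set
 * bit becomes a 0xFF byte, each clear bit a 0x00 byte. */
static void expand_row(uint8_t row, uint32_t out[2])
{
    for (int w = 0; w < 2; w++) {
        uint32_t m = 0;
        for (int p = 0; p < 4; p++) {
            /* pixel w*4+p comes from bit 7-(w*4+p); on little-endian,
             * pixel p of a word lands in byte p (the low byte first) */
            if (row & (0x80u >> (w * 4 + p)))
                m |= 0xFFu << (8 * p);
        }
        out[w] = m;
    }
}
```

For the 01101100b row above this yields 0x00FFFF00 (left half) and 0x0000FFFF (right half), i.e. exactly the byte-reversed 64-bit value when read low word first.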

Re: How exactly do I generate the bitmask for font rendering

Posted: Wed May 25, 2022 3:56 am
by ThatCodingGuy89
klange wrote:While the math seems to have been left as an exercise for the reader, the mask table is basically the same data as the bitmap - just with whole bytes filled in instead of individual bits. Or rather, with whole pixels filled in - whatever that pixel size may be.

A brief explanation of the idea is that you have your bitmap representation of a glyph, made up of a byte per row, with bits representing columns, and you take this bitmap and expand it out so that each bit is now 8 bits. Note that for an 8-bit bitmap row, this means a mask row is 8 bytes - the example code seems to assume 32-bit values, so you need two of them, but I'm lazy and will use 64-bit values. So a row of 01101100b becomes 0x00FFFF00_FFFF0000. Except... we're probably on a little-endian machine, so this is actually backwards. It should be byte reversed to 0x0000FFFF_00FFFF00. We do this for every row in our bitmap and then the masking can be used to write multiple pixels at a time, which is probably faster than individual calls to a set-pixel function.
Ah, so 0b00100110 becomes 0x00FFFF00, 0x00FF0000. Just checking if I understood your explanation correctly.
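And if I have that right, the point of the masks is that a row can then be painted in two word-sized writes, something like this (my own hypothetical sketch, again assuming 8bpp and little-endian - `blit_row` is my name, not the wiki's):

```c
#include <stdint.h>
#include <string.h>

/* Paint one 8-pixel glyph row into an 8bpp framebuffer using two
 * precomputed 32-bit mask words: mask0 covers the 4 leftmost pixels
 * (lowest addresses), mask1 the 4 rightmost. Where a mask byte is
 * 0xFF the foreground colour is written, elsewhere the background. */
static void blit_row(uint8_t *dst, uint32_t mask0, uint32_t mask1,
                     uint8_t fg, uint8_t bg)
{
    uint32_t f = fg * 0x01010101u;  /* replicate colour into all 4 bytes */
    uint32_t b = bg * 0x01010101u;
    uint32_t w0 = (f & mask0) | (b & ~mask0);
    uint32_t w1 = (f & mask1) | (b & ~mask1);
    memcpy(dst,     &w0, 4);        /* memcpy sidesteps alignment issues */
    memcpy(dst + 4, &w1, 4);
}
```

So the whole row costs two 32-bit stores instead of eight per-pixel calls.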