Framebuffer: Draw a PSF character
Posted: Thu Mar 11, 2021 6:12 pm
Ok, after all the good advice I got from my previous questions about the framebuffer (thanks to whoever replied), I'm nearly there with drawing a character.
At least I hope so. I'm following the tutorial on the wiki: https://wiki.osdev.org/PC_Screen_Font. So far I have:
- Converted a PSF into a binary object file and linked it into the kernel (see the objcopy sketch after this list)
- Got access to the PSF data from the kernel
- Implemented the function to draw the character on the screen
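In case it matters, this is more or less how the font gets embedded (a sketch of the objcopy step; the exact -O/-B flags depend on the target, I'm assuming a 32-bit ELF kernel here):

Code: Select all
objcopy -I binary -O elf32-i386 -B i386 fonts/default.psf fonts/default_psf.o

objcopy derives the _binary_fonts_default_psf_start / _end / _size symbols from the input path, which is where the extern declarations in the code below come from.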
Here is a screenshot: on the left side is what my kernel is displaying, on the right side the character I'm trying to display.
The function is more or less the same as the one on the wiki page (I spent a lot of time understanding what it does, it was not a mere copy and paste). The main difference is that I used the pitch from GRUB instead of the scanline (which, from what I understood, should be the same thing).
Code: Select all
#include <stdint.h>

/* Symbols created by objcopy when the PSF file is linked in */
extern char _binary_fonts_default_psf_size;
extern char _binary_fonts_default_psf_start;
extern char _binary_fonts_default_psf_end;

extern uint32_t FRAMEBUFFER_PITCH;
extern void *FRAMEBUFFER_MEM;

void _fb_putchar(unsigned short int symbol, int cx, int cy, uint32_t fg, uint32_t bg){
    _printStr("Inside fb_putchar");
    char *framebuffer = (char *) FRAMEBUFFER_MEM;
    PSF_font *default_font = (PSF_font *) &_binary_fonts_default_psf_start;
    uint32_t pitch = FRAMEBUFFER_PITCH;

    /* Pointer to the first byte of the requested glyph */
    unsigned char *glyph = (unsigned char *) &_binary_fonts_default_psf_start +
        default_font->headersize +
        (symbol > 0 && symbol < default_font->numglyph ? symbol : 0) * default_font->bytesperglyph;

    /* Number of bytes that encode one row of the glyph */
    int bytesperline = (default_font->width + 7) / 8;

    /* Byte offset of the character cell's top-left pixel in the framebuffer */
    int offset = (cy * default_font->height * pitch) +
                 (cx * (default_font->width + 1) * 4);

    int x, y, line, mask;
    for (y = 0; y < default_font->height; y++) {
        line = offset;
        mask = 1 << (default_font->width - 1);
        for (x = 0; x < default_font->width; x++) {
            /* 32 bpp: write one pixel, foreground if the bit is set */
            *((uint32_t *) (framebuffer + line)) = ((int) *glyph) & mask ? fg : bg;
            mask >>= 1;
            line += 4;
        }
        glyph += bytesperline;
        offset += pitch;
    }
}
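The PSF_font struct isn't shown above; I'm assuming it is the PSF version 2 header from the wiki page, i.e. something like:

Code: Select all
#include <stdint.h>

typedef struct {
    uint32_t magic;          /* 0x864ab572 for PSF2, matches the dump below */
    uint32_t version;
    uint32_t headersize;     /* offset of the first glyph in the file */
    uint32_t flags;          /* bit 0 set if a unicode table is present */
    uint32_t numglyph;
    uint32_t bytesperglyph;
    uint32_t height;         /* in pixels */
    uint32_t width;          /* in pixels */
} PSF_font;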
The font I used is Lat2-VGA28x16.psf (28x16).
This is the information that I read from the PSF header (all values are in hex):
Code: Select all
Magic: 864AB572
Number of glyphs: 100
Header size: 20
Bytes per glyphs: 38
Flags: 1
Version: 0
Width: 10
Height: 1C
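In decimal, if I'm reading it right, that is 256 glyphs, a 32-byte header, 56 bytes per glyph, and glyphs that are 16 pixels wide and 28 pixels tall, which is at least self-consistent: bytesperline = (16 + 7) / 8 = 2, and 2 bytes per row * 28 rows = 56 bytes per glyph.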
And the framebuffer information from the Multiboot header is:

Code: Select all
Found Multiboot framebuffer: 8
---framebuffer-type: 1
---framebuffer-width: 400
---framebuffer-height: 300
---framebuffer-address: FD000000
---framebuffer-bpp: 20
---framebuffer-pitch: 1000
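Just to spell out how I'm treating the pitch (0x1000 = 4096 = 1024 pixels * 4 bytes, so I take it to be the number of bytes per scanline), here is a minimal sketch of the addressing I'm assuming; put_pixel is a made-up helper, not something in my kernel:

Code: Select all
#include <stdint.h>

/* Made-up helper to illustrate the addressing: 32 bpp => 4 bytes per pixel,
   pitch = bytes from the start of one scanline to the next. */
static void put_pixel(char *framebuffer, uint32_t pitch, int x, int y, uint32_t color)
{
    uint32_t offset = y * pitch + x * 4;           /* byte offset of the pixel */
    *(uint32_t *)(framebuffer + offset) = color;   /* write one 32-bit pixel */
}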
I tried to investigate but haven't figured out what the real issue is or what I'm doing wrong. I'm wondering if that code only works with some specific font sizes, or if I'm missing something else entirely. Can someone help me?
Btw, I'm not sure whether this is a typo in the tutorial code or not, but this line:
Code: Select all
extern char *fb;
......
*((uint32_t*)(&fb + line)) = ((int)*glyph) & (mask) ? fg : bg;
doesn't display anything if I use it as it is. It has to be:

Code: Select all
*((uint32_t*)(fb + line)) = ((int)*glyph) & (mask) ? fg : bg;
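My understanding, for what it's worth, is that this only makes sense if fb is a plain pointer variable holding the framebuffer base address: fb + line is then an address line bytes into the framebuffer, while &fb + line points past the pointer variable itself. A tiny sketch of what I mean (example() is just a made-up function):

Code: Select all
#include <stdint.h>

char *fb;   /* assumed to be set at boot to the framebuffer base address */

void example(int line, uint32_t color)
{
    *(uint32_t *)(fb + line) = color;   /* writes a pixel inside the framebuffer */
    /* &fb is the address of the pointer variable itself, so
       (uint32_t *)(&fb + line) would point at unrelated memory, not the screen */
}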