How to use graphic adapter's bit blitting?

Binarus
Posts: 12
Joined: Sun Feb 16, 2014 12:48 pm

How to use graphic adapter's bit blitting?

Post by Binarus »

Dear experts,

I have the following problem:

I am developing a real-time application which runs under DOS (actually, DOS is only used for booting and starting the application; the application then takes complete control over the hardware, thus becoming a minimal "O/S" itself, which is the reason why I dare to ask a question here which is probably quite dumb). The application writes many status messages to the screen; until now, this has been realized by entering an 80x50 text mode and writing the output directly into the screen buffer (in this case, B800:XXXX). This is fast enough and has worked quite well for some time.
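In essence, that output path is nothing more than far writes of character/attribute pairs into the text buffer; a minimal sketch, assuming a 16-bit DOS compiler (MK_FP and far pointers as in Borland or Open Watcom C):

#include <dos.h>   /* MK_FP; on Open Watcom it lives in <i86.h> */

/* Write a string straight into the 80x50 colour text buffer at
   B800:0000. Each cell is two bytes: character, then attribute. */
static void puts_direct(int row, int col, const char *s, unsigned char attr)
{
    unsigned char far *cell =
        (unsigned char far *)MK_FP(0xB800, (row * 80 + col) * 2);
    while (*s) {
        *cell++ = (unsigned char)*s++;  /* character                        */
        *cell++ = attr;                 /* attribute, e.g. 0x0F = white on black */
    }
}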

But now, I need some more space for the output. It would be sufficient (actually, ideal) to have 90*25 in text mode. Unfortunately, the graphics hardware (Intel 855) definitely does not support any text mode with more than 80 characters horizontally, at least on the embedded PC in question. I have found out about that by using the VESA bios functions (I first got a list of supported modes by INT 10, 4F00 and then checked every mode from the list via INT 10, 4F01).
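For anyone repeating this check, a minimal sketch of that enumeration, assuming a 16-bit DOS compiler (int86x/segread/MK_FP from <dos.h>) and byte-packed structs (the default on 16-bit compilers); the block layouts are abbreviated from the VBE specification:

#include <dos.h>
#include <string.h>

struct VbeInfoBlock {
    char           sig[4];        /* "VESA" on return                   */
    unsigned short version;
    unsigned long  oem_string;    /* real-mode far pointer              */
    unsigned long  caps;
    unsigned long  mode_list;     /* real-mode far pointer to mode list */
    unsigned short total_memory;  /* in 64 KiB units                    */
    unsigned char  reserved[494];
};

void list_vbe_modes(void)
{
    static struct VbeInfoBlock vib;
    static unsigned char mib[256];    /* raw ModeInfoBlock buffer       */
    unsigned short far *mode;
    union REGS r;
    struct SREGS sr;

    segread(&sr);
    memcpy(vib.sig, "VBE2", 4);       /* request VBE 2.0+ information   */
    r.x.ax = 0x4F00;
    sr.es  = FP_SEG((void far *)&vib);
    r.x.di = FP_OFF((void far *)&vib);
    int86x(0x10, &r, &r, &sr);
    if (r.x.ax != 0x004F)
        return;                       /* no VBE present                 */

    mode = (unsigned short far *)MK_FP((unsigned)(vib.mode_list >> 16),
                                       (unsigned)(vib.mode_list & 0xFFFF));
    for (; *mode != 0xFFFF; mode++) {
        r.x.ax = 0x4F01;
        r.x.cx = *mode;
        sr.es  = FP_SEG((void far *)mib);
        r.x.di = FP_OFF((void far *)mib);
        int86x(0x10, &r, &r, &sr);
        if (r.x.ax == 0x004F) {
            unsigned xres = *(unsigned short *)(mib + 0x12);
            unsigned yres = *(unsigned short *)(mib + 0x14);
            unsigned bpp  = mib[0x19];
            /* ... compare xres/yres/bpp against the wanted mode ... */
        }
    }
}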

Thus, I probably have to use a graphics mode for the output. But using the VESA BIOS calls, outputting a character in graphics mode took about 170 us. This is too slow by orders of magnitude, since I have to output one complete line (let's say 90 characters) per main loop cycle, which runs every ms. I am quite sure that copying a character's pixels to the appropriate screen buffer locations would still be far too slow even when using my own assembly code.

The only solution which I am currently aware of would be to use the graphics adapter's bit blitting capabilities, and this is exactly my question: in the documentation for the Intel 855, I did not find any hint as to how I could program the 855 to do such operations. Is there any reference which describes how that controller can be programmed, which commands it supports exactly, and to which registers (or memory locations) I have to write the commands and their parameters? Or do I have to study the DirectX documentation in the hope of getting an idea?

If anybody knows of a way to put the 855 into a 90x50 (or even higher) text mode, this would be even preferable, though.

Thank you very much in advance,

Binarus
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance

Re: How to use graphic adapter's bit blitting?

Post by Combuster »

But using the VESA BIOS calls, outputting a character
Of course that's slow. With VBE enabled you can still plot individual pixels directly into video memory like you do with characters in text mode. You just have to do a bit more work of your own.

Also, you can just set mode 12h instead (no VBE needed) for 640x480 mode (= 80x60 character cells with an 8x8 font). With only 8 bytes to write per individual character if you know how to do it efficiently, that's the next fastest thing to text mode.
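One way to hit that figure is VGA write mode 3, in which the byte the CPU writes is ANDed with the Bit Mask register to form a mask while the Set/Reset register supplies the colour; eight reads plus eight writes per glyph, with no per-row OUTs. A sketch, assuming an 8x8 font (one byte per glyph row), a 16-bit DOS compiler, and the Data Rotate register still at its post-mode-set value of 0:

#include <conio.h>   /* outp; outportb on some compilers */
#include <dos.h>     /* MK_FP                            */

#define GC_INDEX 0x3CE          /* VGA Graphics Controller index port */

static void gc_write(unsigned char idx, unsigned char val)
{
    outp(GC_INDEX, idx);
    outp(GC_INDEX + 1, val);
}

/* Draw an 8x8 glyph at character cell (col, row) in mode 12h
   (640x480x16, pitch 80 bytes). */
void draw_glyph(int col, int row, const unsigned char *glyph,
                unsigned char colour)
{
    unsigned char far *fb =
        (unsigned char far *)MK_FP(0xA000, (unsigned)row * 640 + col);
    volatile unsigned char latch;
    int y;

    gc_write(0x00, colour);     /* Set/Reset = foreground colour     */
    gc_write(0x05, 0x03);       /* Graphics Mode: write mode 3       */
    gc_write(0x08, 0xFF);       /* Bit Mask: taken from CPU data     */

    for (y = 0; y < 8; y++, fb += 80) {
        latch = *fb;            /* load latches (keeps background)   */
        *fb = glyph[y];         /* 1-bits -> colour, 0-bits -> latch */
    }
    gc_write(0x05, 0x00);       /* back to write mode 0              */
}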
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]
Pancakes
Member
Posts: 75
Joined: Mon Mar 19, 2012 1:52 pm

Re: How to use graphic adapter's bit blitting?

Post by Pancakes »

I do believe that the VESA interface may provide accelerated functions which could provide you with a bit blit function (VBE/AF). Maybe someone will come along who knows how to check whether VBE/AF exists and how to call it.

I wanted to show you how you can figure out how to talk directly to the hardware, if you want to.

I am assuming you are still in 16-bit mode; if so, this reply may not be able to help you unless you devise a way to access the MMIO (memory-mapped I/O) regions used by the device, its I/O ports, and its interrupts. You can likely avoid interrupts by polling (or they may simply not be needed), and the I/O ports may not need to be used either.

http://www.cs.fsu.edu/~baker/devices/lxr/http/source/linux/drivers/video/intelfb/intelfbhw.c

A starting point is the driver from Linux, and you will notice it has a procedure called 'intelfbhw_do_bitblt' which shows how to tell the graphics card to copy a region.

You're going to have to initialize it, and it looks like 'intelfbhw_program_mode' might be what you need to follow along with to initialize the card.

Also, you are going to need to access the PCI bus to figure out where the MMIO area is, unless you can find out where it defaults to; the default could change, but it should be enough just to get it up and running.
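A sketch of that PCI step using configuration mechanism #1 (ports 0xCF8/0xCFC). The device location (bus 0, device 2, function 0) and the BAR layout (aperture in BAR0 at offset 0x10, MMIO registers in BAR1 at offset 0x14) are what the intelfb driver uses for the 830/845/852/855 family; verify them against the datasheet. outpd/inpd are Open Watcom's 32-bit port I/O routines, so substitute your compiler's equivalent:

#include <conio.h>   /* outpd/inpd (Open Watcom) */

#define PCI_ADDR 0xCF8
#define PCI_DATA 0xCFC

/* Configuration mechanism #1: read a 32-bit config register. */
static unsigned long pci_read32(int bus, int dev, int fn, int reg)
{
    outpd(PCI_ADDR, 0x80000000UL
                  | ((unsigned long)bus << 16)
                  | ((unsigned long)dev << 11)
                  | ((unsigned long)fn  << 8)
                  | (reg & 0xFC));
    return inpd(PCI_DATA);
}

/* Fetch the i855 MMIO base: bus 0, device 2, function 0, BAR1. */
unsigned long find_i855_mmio(void)
{
    unsigned long id = pci_read32(0, 2, 0, 0x00);
    if ((id & 0xFFFFUL) != 0x8086)          /* not an Intel device here */
        return 0;
    return pci_read32(0, 2, 0, 0x14) & 0xFFFFFFF0UL;  /* strip BAR flag bits */
}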

I just wanted to add this in case someone out there needs a point to get started from if they are already in protected mode and do not want to use the VESA functions.
Binarus
Posts: 12
Joined: Sun Feb 16, 2014 12:48 pm

Re: How to use graphic adapter's bit blitting?

Post by Binarus »

Combuster wrote:With VBE enabled you can still plot individual pixels directly into video memory like you do with characters in text mode.
Yes, but the i855 does not have VESA modes with 1 bit per pixel. Assuming I get a font with 8x8 pixels per character, this means that, for every character, I would have to do the following:

1) Get the top "line" of the character (fits in one byte since the character is 8x8).
2) For every set bit in this line, set a byte in the appropriate screen buffer location.
3) Repeat these steps for the remaining seven lines of the character.

I think I could get character processing times of 20us or so (if I am lucky), but I suspect I would never be able to "write" a complete line in about 50us (which is my goal). Nevertheless, I'll try tomorrow.
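Steps 1 to 3 as code, a sketch assuming an 8 bpp VBE mode; 'dst' (the address of the cell's top-left pixel) and 'pitch' (bytes per scan line) are placeholders for whatever the mode setup yields:

/* Expand one 8x8 glyph (1 bpp, one byte per row) into an 8 bpp
   frame buffer, exactly as described above. */
void draw_char_8bpp(unsigned char far *dst, unsigned pitch,
                    const unsigned char *glyph,
                    unsigned char fg, unsigned char bg)
{
    int x, y;
    for (y = 0; y < 8; y++, dst += pitch) {   /* 3) all eight lines      */
        unsigned char row = glyph[y];         /* 1) one glyph line       */
        for (x = 0; x < 8; x++)               /* 2) one byte per bit     */
            dst[x] = (row & (0x80 >> x)) ? fg : bg;
    }
}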
Combuster wrote:Also, you can just set mode 12h instead (no VBE needed) for 640x480 mode (= 80x60 character cells with an 8x8 font). With only 8 bytes to write per individual character if you know how to do it efficiently, that's the next fastest thing to text mode.
This would be the ideal solution if the problem were the number of vertical lines; it would make expanding the 1-bit-per-pixel data to a higher color depth unnecessary. But the problem actually is the width of the line (I need at least 90 characters per line).

Nevertheless, thanks a lot!
Binarus
Posts: 12
Joined: Sun Feb 16, 2014 12:48 pm

Re: How to use graphic adapter's bit blitting?

Post by Binarus »

Pancakes wrote:I do believe that the VESA interface may provide accelerated functions which could provide you with a bit blit function.
Yes, I think I have seen this in Ralf Brown's Interrupt List.
Pancakes wrote:I wanted to show you how you can figure out how to talk directly to the hardware, if you want to.
That's exactly what I'm after :D
Pancakes wrote: I am assuming you are still in 16-bit mode; if so, this reply may not be able to help you ...
Yes, the application runs in real mode, but by applying some tricks, it has access to the complete 32-bit address space. I am already driving custom PCI hardware that way.
Pancakes wrote:http://www.cs.fsu.edu/~baker/devices/lxr/http/source/linux/drivers/video/intelfb/intelfbhw.c
A starting point is the driver from Linux, and you will notice it has a procedure called 'intelfbhw_do_bitblt' which shows how to tell the graphics card to copy a region.
You're going to have to initialize it, and it looks like 'intelfbhw_program_mode' might be what you need to follow along with to initialize the card.
Also, you are going to need to access the PCI bus to figure out where the MMIO area is, unless you can find out where it defaults to; the default could change, but it should be enough just to get it up and running.
Just a big thanks for that excellent answer! I was already afraid that I would have to read the Linux sources. A few hours ago, I studied some code from X.org which deals with that chip, but it was too complex to understand in a short time. I suppose the same applies to the driver which is part of the Linux source code. I'll look into the Linux driver over the next few days.
I am still wondering whether it's really necessary to study other people's source code to get information which is usually part of a manual ...
Pancakes
Member
Posts: 75
Joined: Mon Mar 19, 2012 1:52 pm

Re: How to use graphic adapter's bit blitting?

Post by Pancakes »

http://www.intel.com/content/dam/doc/datasheet/855gm-gme-chipset-graphics-memory-controller-hub-datasheet.pdf

That is the only datasheet I can find, and it appears to have all the registers laid out. I tried to scroll through it but could not find any real information on how to initialize the graphics chip. It lists all the electrical pins too, and it has no table of contents, so it is hard to jump around in.

Personally, I would likely end up mainly using the Linux driver source and referring back to the datasheet when I needed more information on something. The Linux source looks really scary, but it is actually not that bad once you start to understand all the macros they are using.

Mostly, all the driver source code is doing is INREG(regnum) and OUTREG(regnum, value). You could even cheat a little by installing Linux on your target machine, recompiling the initialization function so that it outputs some debugging information about its arguments, and then using those arguments yourself.
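For reference, those macros boil down to 32-bit loads and stores at the MMIO base plus a register offset; a rough equivalent, assuming the region can be addressed flat (from real mode that needs the address-space tricks mentioned earlier in the thread):

/* Rough equivalent of the intelfb INREG/OUTREG macros. 'mmio_base'
   would be filled in from the PCI BAR discussed above. */
static volatile unsigned long *mmio_base;

static unsigned long INREG(unsigned long reg)
{
    return mmio_base[reg / 4];
}

static void OUTREG(unsigned long reg, unsigned long val)
{
    mmio_base[reg / 4] = val;
}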

But maybe someone else will read this who has some actual working code that initializes the card, which could save you a lot of time.
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: How to use graphic adapter's bit blitting?

Post by Brendan »

Hi,
Binarus wrote:But now, I need some more space for the output. It would be sufficient (actually, ideal) to have 90*25 in text mode. Unfortunately, the graphics hardware (Intel 855) definitely does not support any text mode with more than 80 characters horizontally, at least on the embedded PC in question. I have found out about that by using the VESA bios functions (I first got a list of supported modes by INT 10, 4F00 and then checked every mode from the list via INT 10, 4F01).
The video hardware itself works on clock dividers and is probably capable of about a billion different video modes. The monitor is probably only capable of understanding about 1 thousand of the billion video modes. VBE is crap and might only offer 50 video modes. Basically; it's extremely likely that the video card and the monitor are both capable of handling a 90*25 text mode. The problem is figuring out how to set up the video mode without using VBE, which makes this a massive pain (overcomplicated, not portable, etc).
Binarus wrote:Thus, I probably have to use a graphics mode for the output. But using the VESA BIOS calls, outputting a character in graphics mode took about 170 us. This is too slow by orders of magnitude, since I have to output one complete line (let's say 90 characters) per main loop cycle, which runs every ms. I am quite sure that copying a character's pixels to the appropriate screen buffer locations would still be far too slow even when using my own assembly code.
BIOS functions are never acceptable for video output. It's very likely that BIOS is 10 times slower than simple/bad/unoptimised code you could write yourself; and very likely that BIOS is 100 times slower than optimised code you could write yourself. If BIOS does one character in 170 us, then you can probably do 100 characters in 170 us (or 500 characters in 1 ms) without too much trouble.

Of course the monitor is probably only capable of displaying 60 frames per second, so it's very unlikely that "one line every 1 ms" is a sane requirement. If the same line on the screen is being repeatedly overwritten, then you probably only need to display 1 line every 16 ms. If a different line on the screen is written each time, then maybe the real requirement is "scroll the screen once and display 16 lines, every 16 ms".
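The decoupling that follows from this is the usual shadow-buffer pattern: the fast loop only writes into RAM and marks lines dirty, and a slower path copies dirty lines to the screen. A sketch; draw_text_line() is a placeholder for whatever renderer is actually used:

#define COLS 90
#define ROWS 50

static char shadow[ROWS][COLS];
static unsigned char dirty[ROWS];

void draw_text_line(int row, const char *text);  /* placeholder renderer */

void log_line(int row, const char *text)         /* called every 1 ms    */
{
    int i;
    for (i = 0; i < COLS && text[i]; i++)
        shadow[row][i] = text[i];
    dirty[row] = 1;
}

void flush_to_screen(void)                       /* called every ~16 ms  */
{
    int row;
    for (row = 0; row < ROWS; row++)
        if (dirty[row]) {
            draw_text_line(row, shadow[row]);
            dirty[row] = 0;
        }
}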


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
Binarus
Posts: 12
Joined: Sun Feb 16, 2014 12:48 pm

Re: How to use graphic adapter's bit blitting?

Post by Binarus »

Pancakes wrote:http://www.intel.com/content/dam/doc/datasheet/855gm-gme-chipset-graphics-memory-controller-hub-datasheet.pdf
Well, I already had that datasheet. It does not contain a single word about the blitter's structure or how to operate the blitter. It does not even mention the registers or memory locations which must be accessed to operate the blitter (my impression is that the PCI configuration registers which are listed in the document for various parts of the chip do not have anything to do with the actual 2D or 3D engine of the chip).

Nevertheless, thanks again!
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance

Re: How to use graphic adapter's bit blitting?

Post by Combuster »

Some afterthoughts:
Well, I already had that datasheet. It does not contain a single word about the blitter's structure or how to operate the blitter.
Which doesn't surprise me. There is more extensive documentation, but it's pretty badly organised, and it probably doesn't work for the 855 if you don't know the differences between chipset models. It's probably faster to steal the code from X.org and figure out how it works, especially for 2D. At any rate, this stuff is HARD and I wouldn't recommend it given your current line of thought.
VBE/AF
That standard was dead on arrival because of DirectX. Don't expect to find anything meaningful with that.
VESA modes with 1 bits per pixel
The only reason they don't exist is that nobody wants them, and much hardware, including the VGA, is at least 4-bit colour under the hood. Even regular text mode as you know it has 16 colours, and I would think you tried using it already, no? :wink:
I need at least 90 characters per line
1: "need"? You do? I don't believe you.
2: You can do 720x480 with a VGA driver on any compatible VGA hardware (including fixed-frequency VGA monitors, including alphanumeric mode, and I never heard of an i855 being incompatible). Pulling it off, however, is your homework, if only to teach you how graphics cards work (and make more sense out of more modern hardware).
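To make the homework concrete without doing it all, a fragment; only the two values that follow directly from "90 columns" are real, and the horizontal total, blanking and retrace registers (CRTC indexes 0 and 2 to 5) still need timings rescaled by roughly 9/8 from the 80-column mode and verified against the monitor:

#include <conio.h>   /* outp/inp; outportb/inportb on some compilers */

static void crtc_write(unsigned char idx, unsigned char val)
{
    outp(0x3D4, idx);
    outp(0x3D5, val);
}

void start_90_columns(void)
{
    /* 28.322 MHz dot clock: Misc Output (read 0x3CC, write 0x3C2),
       clock select bits 3:2 = 01b. */
    outp(0x3C2, (inp(0x3CC) & ~0x0C) | 0x04);

    /* 8-dot character clock: Sequencer Clocking Mode, bit 0 = 1. */
    outp(0x3C4, 0x01);
    outp(0x3C5, inp(0x3C5) | 0x01);

    /* Unlock CRTC registers 0-7: index 0x11, clear bit 7. */
    outp(0x3D4, 0x11);
    outp(0x3D5, inp(0x3D5) & 0x7F);

    crtc_write(0x01, 89);   /* Horizontal Display End = columns - 1 */
    crtc_write(0x13, 45);   /* Offset = columns / 2 (word mode)     */

    /* ... rescaled horizontal total/blank/retrace values go here ... */
}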
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]
freecrac
Member
Posts: 69
Joined: Thu Sep 20, 2012 5:11 am
Location: Hamburg, Germany

Re: How to use graphic adapter's bit blitting?

Post by freecrac »

Hello.
Brendan wrote:The video hardware itself works on clock dividers and is probably capable of about a billion different video modes. The monitor is probably only capable of understanding about 1 thousand of the billion video modes. VBE is crap and might only offer 50 video modes.
But before VBE, every manufacturer built it a different way.
Starting with VBE 2, it is up to the manufacturers of the display devices to define new VBE mode numbers in their VBE BIOS implementation.
vbe3.pdf
Starting with VBE version 2.0, VESA will no longer define new VESA mode numbers and it will no longer be mandatory to support these old mode numbers.
But starting with VBE 3, we can use video modes with our own CRTC parameter tables to get refresh-rate-controlled modes (in units of 0.01 Hz), and there is a function to get a normalized pixel clock for our parameter table. Additionally, we can use VBE hardware triple buffering with the linear frame buffer to get flicker-free movement of screen-wide content without tearing, or we can use video modes for stereoscopic shutter glasses, both in combination and together with a refresh-rate-controlled mode.
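The triple-buffering part rests on VBE function 4F07h (Set Display Start); a sketch of flipping to an off-screen page with the "during vertical retrace" variant, assuming a 16-bit DOS compiler (int86 from <dos.h>) and pages stacked vertically in the linear frame buffer:

#include <dos.h>

/* Point scan-out at page 'page' of a mode that is 'yres' lines tall.
   BL = 80h asks for the change during vertical retrace (no tearing). */
int flip_to_page(unsigned page, unsigned yres)
{
    union REGS r;
    r.x.ax = 0x4F07;
    r.x.bx = 0x0080;          /* BH = 0, BL = 80h                   */
    r.x.cx = 0;               /* first displayed pixel in scan line */
    r.x.dx = page * yres;     /* first displayed scan line          */
    int86(0x10, &r, &r);
    return r.x.ax == 0x004F;  /* VBE success code                   */
}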
BIOS functions are never acceptable for video output.
Yes, it is only useful for setting a video mode, but not for outputting our content to the display.

Dirk
Brendan
Member
Posts: 8561
Joined: Sat Jan 15, 2005 12:00 am
Location: At his keyboard!

Re: How to use graphic adapter's bit blitting?

Post by Brendan »

Hi,
freecrac wrote:
Brendan wrote:The video hardware itself works on clock dividers and is probably capable of about a billion different video modes. The monitor is probably only capable of understanding about 1 thousand of the billion video modes. VBE is crap and might only offer 50 video modes.
But before VBE, every manufacturer built it a different way.
Starting with VBE 2, it is up to the manufacturers of the display devices to define new VBE mode numbers in their VBE BIOS implementation.
Yes, the problem is that manufacturers don't. However, I'm not sure it's fair to blame manufacturers entirely - due to legacy idiocy (which UEFI fixes) their "video card ROM" is mostly limited to 64 KiB, which makes it a little hard to have a list of 1 billion video modes.

A smarter way would be to have standard parameterised formulas. For example, software could have just told VBE the desired horizontal resolution, vertical resolution, colour depth/pixel format and refresh rate, and let VBE dynamically generate the details for the closest supported video mode using the parameterised formulas (and/or static lookup tables). Sadly, even though VESA has already created 2 sets of "standard parameterised formulas" (GTF and CVT), VBE can't use them.
freecrac wrote:
Starting with VBE version 2.0, VESA will no longer define new VESA mode numbers and it will no longer be mandatory to support these old mode numbers.
But starting with VBE 3, we can use video modes with our own CRTC parameter tables to get refresh-rate-controlled modes (in units of 0.01 Hz), and there is a function to get a normalized pixel clock for our parameter table. Additionally, we can use VBE hardware triple buffering with the linear frame buffer to get flicker-free movement of screen-wide content without tearing, or we can use video modes for stereoscopic shutter glasses, both in combination and together with a refresh-rate-controlled mode.
The CRTC stuff in VBE 3 only allows you to change timing (e.g. screen refresh rate, margins, etc). You can't change resolution or colour depth. Note: This isn't 100% correct - e.g. you can enable "pixel doubling" to convert 1024*768 into 512*768, but can't ask for 600*768.


Cheers,

Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
linguofreak
Member
Posts: 510
Joined: Wed Mar 09, 2011 3:55 am

Re: How to use graphic adapter's bit blitting?

Post by linguofreak »

freecrac wrote:Yes, only for to become a videomode it is usefull, but not for outputing our content to the display.
"Become" means "werden", not "bekommen". "Bekommen" can be translated as "get", "obtain", "receive", etc, and in this particular context (putting hardware into a particular mode) you could also use "enter" or "set up".

Suggested phrasing: "It is only useful for setting up video modes, not for..."
freecrac
Member
Posts: 69
Joined: Thu Sep 20, 2012 5:11 am
Location: Hamburg, Germany

Re: How to use graphic adapter's bit blitting?

Post by freecrac »

Brendan wrote:Hi,
freecrac wrote:
Brendan wrote:The video hardware itself works on clock dividers and is probably capable of about a billion different video modes. The monitor is probably only capable of understanding about 1 thousand of the billion video modes. VBE is crap and might only offer 50 video modes.
But before VBE, every manufacturer built it a different way.
Starting with VBE 2, it is up to the manufacturers of the display devices to define new VBE mode numbers in their VBE BIOS implementation.
Yes, the problem is that manufacturers don't. However, I'm not sure it's fair to blame manufacturers entirely - due to legacy idiocy (which UEFI fixes) their "video card ROM" is mostly limited to 64 KiB, which makes it a little hard to have a list of 1 billion video modes.

A smarter way would be to have standard parameterised formulas. For an example, software could have just told VBE the desired horizontal resolution, vertical resolution, colour depth/pixel format and refresh rate, and let VBE dynamically generate the details for the closest supported video mode using the parameterised formulas (and/or static lookup tables). Sadly, even though VESA have already created 2 sets of "standard parameterised formulas" (GTF and CVT), VBE can't use them.
But we can use the GTF to calculate the VBE CRTC parameters for the video modes supported by the VBE BIOS. An example is "GTFCALC.C" from RayeR's "vesatest.zip":
http://rayer.g6.cz/programm/programe.htm
freecrac wrote:
Starting with VBE version 2.0, VESA will no longer define new VESA mode numbers and it will no longer be mandatory to support these old mode numbers.
But starting with VBE 3, we can use video modes with our own CRTC parameter tables to get refresh-rate-controlled modes (in units of 0.01 Hz), and there is a function to get a normalized pixel clock for our parameter table. Additionally, we can use VBE hardware triple buffering with the linear frame buffer to get flicker-free movement of screen-wide content without tearing, or we can use video modes for stereoscopic shutter glasses, both in combination and together with a refresh-rate-controlled mode.
The CRTC stuff in VBE 3 only allows you to change timing (e.g. screen refresh rate, margins, etc). You can't change resolution or colour depth. Note: This isn't 100% correct - e.g. you can enable "pixel doubling" to convert 1024*768 into 512*768, but can't ask for 600*768.


Cheers,

Brendan
Yes, you are right. We can only use the resolutions that we find in the VBE BIOS.

Sorry for my bad English.
@linguofreak: Thank you for the translation.

Dirk
Binarus
Posts: 12
Joined: Sun Feb 16, 2014 12:48 pm

Re: How to use graphic adapter's bit blitting?

Post by Binarus »

Brendan wrote:BIOS functions are never acceptable for video output. It's very likely that BIOS is 10 times slower than simple/bad/unoptimised code you could write yourself; and very likely that BIOS is 100 times slower than optimised code you could write yourself. If BIOS does one character in 170 us, then you can probably do 100 characters in 170 us (or 500 characters in 1 ms) without too much trouble.
Your estimate was quite good. Indeed, I have now written some assembly code which outputs a char in about 1.7us in graphics mode; this code is optimized to some extent, but optimization is not finished yet. I suppose I can get to 1us per char, but that is probably the limit.
Brendan wrote:Of course the monitor is probably only capable of displaying 60 frames per second, so it's very unlikely that "one line every 1 ms" is a sane requirement.
The underlying problem is not that easy: the real-time application drives a medical device which makes movements; the underlying axis transformations are extremely complex. For debugging purposes, notably during development while not everything works as intended, all sorts of interpolator and position data must be printed onto the screen, of course in a form which is readable and structured (i.e. tables). The interpolator and safety main loop are driven every ms; thus, if I output one line per ms and I have 80 lines, the screen is completely refreshed roughly every 0.1 s, which is something I can live with. Outputting a line (80 chars) *in text mode* costs about 50us, including converting some double values to ASCII, so I can do it once per loop (using normal C code).

As you can see, doing less than one line per ms decreases the refresh rate proportionally, and I can't accept refresh rates of less than 0.2 s (for the whole screen). That's the reason why I can't switch to graphics mode and just output 1 char per loop. Thus, I will work on further optimizing my assembly code for graphics mode, will choose a mode where about 90 characters (that's the table width I need) fit in one line, and will try my best to get a line printed within 100us; that's the maximum CPU time I can spare for it.
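For anyone optimizing something similar: the usual trick for getting below one microsecond per character is a 256-entry table that maps a glyph byte to its eight expanded pixels, so the per-row work shrinks to two 32-bit stores instead of eight bit tests. A sketch for an 8 bpp mode; 'pitch_dwords' is the scan line pitch in dwords:

/* Built once per colour pair: lut[v] holds the 8 pixels (as two
   dwords) that glyph byte 'v' expands to. */
static unsigned long lut[256][2];

void build_lut(unsigned char fg, unsigned char bg)
{
    unsigned v, x;
    for (v = 0; v < 256; v++) {
        unsigned char *p = (unsigned char *)lut[v];
        for (x = 0; x < 8; x++)
            p[x] = (v & (0x80 >> x)) ? fg : bg;
    }
}

void draw_char_fast(unsigned long far *dst, unsigned pitch_dwords,
                    const unsigned char *glyph)
{
    int y;
    for (y = 0; y < 8; y++, dst += pitch_dwords) {
        dst[0] = lut[glyph[y]][0];   /* left 4 pixels  */
        dst[1] = lut[glyph[y]][1];   /* right 4 pixels */
    }
}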

Thank you very much,

Binarus
Last edited by Binarus on Fri Feb 21, 2014 3:00 am, edited 1 time in total.
Combuster
Member
Posts: 9301
Joined: Wed Oct 18, 2006 3:45 am
Libera.chat IRC: [com]buster
Location: On the balcony, where I can actually keep 1½m distance

Re: How to use graphic adapter's bit blitting?

Post by Combuster »

notably during development while not everything works as intended
Which suggests two things:
1) The output is not needed in production - because if you're not there to watch the screen nobody gets to interpret stuff anyway.
2) In debugging conditions, there's often no particular need to reach production speed.

Also,
In graphics mode you have 640 time coordinates and 480 y coordinates. Quit the text and plot simple line graphs instead, and you have nearly a second of history to watch (640 columns at one sample per millisecond is 0.64 s).
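In mode 12h that is one masked byte write per sample; a sketch, reusing the Graphics Controller's set/reset and bit mask (sample values are assumed to already fit the 0..479 range):

#include <conio.h>
#include <dos.h>

static void putpixel12(int x, int y, unsigned char colour)
{
    unsigned char far *p =
        (unsigned char far *)MK_FP(0xA000, (unsigned)y * 80 + (x >> 3));
    volatile unsigned char latch;

    outp(0x3CE, 0x00); outp(0x3CF, colour);          /* set/reset colour   */
    outp(0x3CE, 0x01); outp(0x3CF, 0x0F);            /* enable set/reset   */
    outp(0x3CE, 0x08); outp(0x3CF, 0x80 >> (x & 7)); /* mask one pixel     */
    latch = *p;                                      /* load latches       */
    *p = 0xFF;                                       /* write through mask */
    outp(0x3CE, 0x08); outp(0x3CF, 0xFF);            /* restore bit mask   */
}

/* One x column per 1 ms tick: 640 columns = 0.64 s of history. */
void plot_samples(int x, const int *values, int channels)
{
    int c;
    for (c = 0; c < channels; c++)
        putpixel12(x, 479 - values[c], (unsigned char)(c + 1));
}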
"Certainly avoid yourself. He is a newbie and might not realize it. You'll hate his code deeply a few years down the road." - Sortie
[ My OS ] [ VDisk/SFS ]