Anyone ever done "hello world" through a modern GPU?
- LieutenantHacker
- Member
- Posts: 69
- Joined: Sat May 04, 2013 2:24 pm
- Location: Canada
Anyone ever done "hello world" through a modern GPU?
So when most people explain how to output "hello world" on, say, a bare x86 machine, they'll tell you to write to MMIO addresses as follows:
http://wiki.osdev.org/Printing_To_Screen#basics
But there are a few misconceptions about that, namely:
You're not really accessing the GPU. Even if you set up every "VGA register" manually and wrote values to each memory-mapped area, this does not directly initialize the GPU, write to its registers, run shader assembly, etc.
Some Radeon cards have a lot of open documentation, while NVIDIA is harder; but people have done homebrew stuff on the PS2, and Sony never released documentation for that.
I attempted this with SASS on NVIDIA, but spent hours and hours trying everything imaginable with x86 assembly (trying to write shader code to the GPU's MMIO addresses, trying to find out how the lowest-level device drivers do it, etc.), and got nothing done.
CUDA wouldn't count here. What I'm asking is: has anyone gotten a hello world by directly setting registers in assembly and writing to the GPU like a driver would, to get something on the screen? Many drivers have been reverse engineered, and the GPU's MMIO ranges can be found easily on many modern OSes, so someone doing this shouldn't be too far out of this world.
It doesn't even HAVE to be a Radeon/NVIDIA GPU; any modern GPU used in computers will do:
1. Intel's Graphics Media Accelerator series.
2. Intel's HD series.
3. Adreno series.
4. Mali series.
Has anyone ever attempted and succeeded with anything? Not for anything practical, but just as a challenge/learning experience.
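For reference, the text-mode trick the wiki link above describes boils down to very little code. This is a minimal sketch: VGA text mode is an array of 16-bit cells (low byte = ASCII, high byte = attribute) that real hardware maps at physical 0xB8000; here a local array stands in for that buffer so the encoding can be shown without booting bare metal. The names are invented for illustration.

```c
#include <stdint.h>

/* Stand-in for the VGA text buffer. On real hardware you would instead
 * write through a pointer like (volatile uint16_t *)0xB8000. */
#define COLS 80
#define ROWS 25
static uint16_t vga_buf[ROWS * COLS];

/* Write a string as text-mode cells: attribute byte in the high 8 bits,
 * character byte in the low 8 bits. */
static void vga_puts(const char *s, int row, int col, uint8_t attr)
{
    for (int i = 0; s[i]; i++)
        vga_buf[row * COLS + col + i] = (uint16_t)attr << 8 | (uint8_t)s[i];
}
```

On bare metal, `vga_puts("Hello world", 0, 0, 0x07)` against the real 0xB8000 buffer puts light-grey-on-black text in the top-left corner. Which is precisely the OP's point: none of this touches the GPU proper.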
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
- Combuster
- Member
- Posts: 9301
- Joined: Wed Oct 18, 2006 3:45 am
- Libera.chat IRC: [com]buster
- Location: On the balcony, where I can actually keep 1½m distance
- Contact:
Re: Anyone ever done "hello world" through a modern GPU?
I can bootstrap a V2x00 GPU and run my own code on it. I didn't try making it do "hello world", though, and it wouldn't be considered a modern GPU, but it certainly was a fun exercise.
Re: Anyone ever done "hello world" through a modern GPU?
Well, I have written a partial Radeon driver that sets up full HD, and compiled and ran a hello world program while in that mode, with the kernel doing text rendering to the framebuffer. I guess that counts.
Re: Anyone ever done "hello world" through a modern GPU?
LieutenantHacker wrote:You're not really accessing the GPU. Even if you set up every "VGA register" manually and wrote the values to each memory-mapped area, this does not directly initialize the GPU or write to its registers directly, shader assembly, etc.

If you really want to write "hello world", then the GPU is absolutely useless. The GPU just does some arithmetic; the actual letters on the screen are memory bytes at some address. So to write "hello" you need to know how to write to the screen memory. But maybe the GPU instruction set includes something like a mov instruction, which moves data from a GPU register to some video memory location. Then, if you prefer not to write to the video memory through the video card registers, you have to figure out how to feed the GPU with instructions like mov. It seems there should be some way to do this, but I have never tested it. For beginners there are many sites with OpenGL, OpenCL, GLSL, CUDA and other things where examples can be found, but that is all high level; there are no registers and no GPU instructions there.
- LieutenantHacker
- Member
- Posts: 69
- Joined: Sat May 04, 2013 2:24 pm
- Location: Canada
Re: Anyone ever done "hello world" through a modern GPU?
Not sure what you mean by "screen memory" ... the GPU, obviously, has to get data into some form of "pixels" or the like before it reaches the screen; this is "screen memory." Maybe you mean the GPU is useless for stuff like shading, 3-D geometry, etc., because we're talking about just "hello world". I believe all modern operating systems, even older Windows 95/98 with just 2-D GUI rendering, use a device driver to access the GPU's memory-mapped I/O to update the screen data as it is processed.
I guess it comes down to a more specific question:
What exactly does writing to "the screen" mean?
Getting anything to show on the screen? INT 10 from legacy BIOS can do that, but that hardly counts.
Writing "directly" to video memory, as the link I showed above illustrates, is also limited, simply because you are stuck within the confines of a resolution that only GPU-specific initialization can exceed. Again, you don't need to "initialize" the GPU to do that.
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
- LieutenantHacker
- Member
- Posts: 69
- Joined: Sat May 04, 2013 2:24 pm
- Location: Canada
Re: Anyone ever done "hello world" through a modern GPU?
sortie wrote:Well, I have written a partial Radeon driver that sets up full hd and compiled and run a hello world program while in that mode, where the kernel does text rendering to the framebuffer. I guess that counts.

The goal of writing a device driver is to create a low-level program that can access specific hardware; as middleware in most cases, that is quite vague. What's also vague is that people have slightly different ideas of what "screen memory", "framebuffer", and "writing directly to video memory" actually mean.
I would not consider what you're describing to count because the goal is not simply getting hello world to the screen and that's it.
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
Re: Anyone ever done "hello world" through a modern GPU?
LieutenantHacker wrote:because the goal is not simply getting hello world to the screen and that's it.

Then you should first describe your goal clearly; only once your goal is understood is it possible to answer your questions.
- Owen
- Member
- Posts: 1700
- Joined: Fri Jun 13, 2008 3:21 pm
- Location: Cambridge, United Kingdom
- Contact:
Re: Anyone ever done "hello world" through a modern GPU?
LieutenantHacker wrote:
sortie wrote:Well, I have written a partial Radeon driver that sets up full hd and compiled and run a hello world program while in that mode, where the kernel does text rendering to the framebuffer. I guess that counts.
The goal of writing a device driver is to create a low-level program that can access specific hardware; as middleware in most cases, that is quite vague. What's also vague is that people have slightly different ideas of what "screen memory", "framebuffer", and "writing directly to video memory" actually mean.
I would not consider what you're describing to count because the goal is not simply getting hello world to the screen and that's it.
You don't need to do anything with the shading units to get things on screen. Don't confuse the GPU and the framebuffer. They are often completely separate, and made by different companies.
- Bender
- Member
- Posts: 449
- Joined: Wed Aug 21, 2013 3:53 am
- Libera.chat IRC: bender|
- Location: Asia, Singapore
Re: Anyone ever done "hello world" through a modern GPU?
idk if I am correct, but since a GPU has its own instruction set, wouldn't it need its own assembler/compiler?
Well, it's useless to do a hello world with a GPU; afaik the GPU just does stuff faster than the onboard CPU (drawing images faster, etc.). Basically, to write a hello world all you need is the memory address of the framebuffer and some info about the screen (bpp, max x, max y). I guess VBE was made to make that information easily available for hobby osdevers.
I have little knowledge about this subject, so forgive the ignorance: is the actual setting of the screen resolution done by the GPU or.... something else? Do you have to query the GPU for the screen resolution info, or something else?
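The "all you need" in Bender's description really is just the framebuffer address, the pitch (bytes per scanline, which VBE reports alongside bpp and the resolution), and the pixel format. A minimal putpixel sketch, with an in-memory array standing in for the mapped linear framebuffer and all names invented for illustration:

```c
#include <stdint.h>

/* Pretend 32 bpp linear framebuffer; on real hardware `fb` would point at
 * the physical framebuffer address reported by VBE mode info. */
#define FB_W 320
#define FB_H 200
static uint32_t fb[FB_W * FB_H];
static const int pitch = FB_W * 4;   /* bytes per scanline */

static void putpixel(int x, int y, uint32_t color)
{
    /* byte offset = y * pitch + x * (bpp / 8); done in 32-bit pixels here */
    fb[y * (pitch / 4) + x] = color;
}
```

Drawing "hello world" is then only a matter of plotting pixels from a font bitmap with `putpixel`, and, as discussed above, none of it requires the GPU.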
"In a time of universal deceit - telling the truth is a revolutionary act." -- George Orwell
(R3X Runtime VM)(CHIP8 Interpreter OS)
- Combuster
- Member
- Posts: 9301
- Joined: Wed Oct 18, 2006 3:45 am
- Libera.chat IRC: [com]buster
- Location: On the balcony, where I can actually keep 1½m distance
- Contact:
Re: Anyone ever done "hello world" through a modern GPU?
Bender wrote:the actual setting of the screen resolution is done by the GPU or.... something else? You have to query the gpu about the screen resolution info or something else?
Owen wrote:Don't confuse the GPU and the framebuffer. They are often completely separate, and by different companies.

Typically, there are a few distinct components built around a chunk of memory:
- The CRTC reads memory and produces VGA signals.
- The bus controller performs reads and writes between the host and video card.
- The GPU only reads, modifies, and writes the onboard memory.
So to get video output, you only need to have an operational CRTC. To change the output you only need to agree on using a common piece of memory and how it's interpreted - the framebuffer. The GPU is needed in none of this.
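That three-way split can be sketched as a toy software model. To be clear, this is only a conceptual illustration under invented names; real register layouts and bus protocols look nothing like this. It just shows that scan-out (CRTC), host access (bus controller), and drawing (GPU) are independent consumers and producers of the same chunk of memory:

```c
#include <stdint.h>
#include <string.h>
#include <stddef.h>

/* The shared chunk of memory all three components are built around. */
#define W 8
#define H 2
static uint8_t vram[W * H];

/* Bus controller: the host reads/writes VRAM through it. */
static void bus_write(size_t off, const uint8_t *src, size_t n)
{
    memcpy(vram + off, src, n);
}

/* GPU: only reads, modifies, and writes the onboard memory
 * (here: invert a span of bytes). */
static void gpu_invert(size_t off, size_t n)
{
    for (size_t i = 0; i < n; i++)
        vram[off + i] ^= 0xFF;
}

/* CRTC: reads memory line by line and produces the output "signal". */
static void crtc_scanout(uint8_t *signal)
{
    for (int y = 0; y < H; y++)
        memcpy(signal + y * W, vram + y * W, W);
}
```

Note that `crtc_scanout` works whether or not `gpu_invert` ever runs, which is the point: video output needs only an operational CRTC and an agreed framebuffer layout.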
Re: Anyone ever done "hello world" through a modern GPU?
It seems the original intent of the topic starter was driven by a lack of general understanding of the path from a program in the PC's memory to the coloured dots on the screen, and "hello world" was supposed to show this path in detail. I can't say that I have such an understanding myself. Generally it looks like the short description from Combuster:

Combuster wrote:Typically, there are a few distinct components built around a chunk of memory:
- The CRTC reads memory and produces VGA signals.
- The bus controller performs reads and writes between the host and video card.
- The GPU only reads, modifies, and writes the onboard memory.

But with modern graphics cards the path is much more complex. The complexity is a consequence of the graphics card's requirements, the most important of which is speed. To achieve maximal speed the GPU was introduced, and with the GPU comes a lot of supporting hardware. As it seems to me, the details go like this:
Initially some region of conventional memory contains GPU instructions and another region contains images in some special format. Next, for performance reasons, the images are copied into the video card's memory and the GPU instructions are fed to the GPU, whether by another memory-to-memory copy or by some sequence of I/O register operations. The GPU then executes the provided instructions and produces a modified version of the provided images. Next the video card has to know which region of its memory should be displayed on the screen, and that region is translated into signals on the display socket, which can usually be found on the edge of the video card.
And even the above description is too short to give a clue about everything involved in drawing images. First the GPU instruction set has to be figured out; then comes the format of the images the GPU expects (which depends on the GPU program we are going to use); the next issue is the I/O or memory-mapped registers intended for video card management (copying memory regions, starting the GPU, switching the display to some mode, and so on).
Where are all the bits of information described above? Maybe some "hello world" example can show exactly the steps required and the formats used. And while, of course, the best way to get such information is to read the vendor's video card specifications, the availability of such specifications is the problem. But maybe somewhere on the net there is a resource with plenty of specifications and other video-card-related material? I haven't dug deep into the issue and will appreciate any additional information.
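The "feed the GPU with instructions" step described above is usually a ring buffer plus a doorbell register. The following is only a toy model under invented names; real command streams (AMD's PM4 packets, for instance) are far more involved, but the shape is the same: the host queues commands in shared memory, notifies the card, and the GPU drains the queue and modifies video memory.

```c
#include <stdint.h>
#include <stddef.h>

#define RING_SIZE 16
#define OP_FILL 1                 /* fill a span of VRAM with a byte value */

struct cmd { uint8_t op; uint16_t dst; uint16_t len; uint8_t value; };

static struct cmd ring[RING_SIZE];
static size_t head, tail;         /* host writes at head, GPU reads at tail */
static uint8_t vram[256];         /* the card's onboard memory */

/* Host side: queue a command. On real hardware, advancing `head` would be
 * followed by a write to a doorbell register to wake the GPU. */
static void submit(struct cmd c)
{
    ring[head % RING_SIZE] = c;
    head++;
}

/* "GPU" side: drain the ring, executing each command against VRAM. */
static void gpu_process(void)
{
    while (tail < head) {
        struct cmd c = ring[tail % RING_SIZE];
        if (c.op == OP_FILL)
            for (uint16_t i = 0; i < c.len; i++)
                vram[c.dst + i] = c.value;
        tail++;
    }
}
```

Figuring out the real equivalents of `struct cmd`, the ring location, and the doorbell register is exactly the part that requires vendor documentation or reverse-engineered driver sources.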
- LieutenantHacker
- Member
- Posts: 69
- Joined: Sat May 04, 2013 2:24 pm
- Location: Canada
Re: Anyone ever done "hello world" through a modern GPU?
I'll admit that I was generally confused over the path of drawing anything to the screen.
However, my question still rests with the GPU, not the CRTC or the like. I still hold my question as:
Anyone ever done "hello world" through a modern GPU?
And what that means is: by using the GPU directly, get hello world on the screen; no CRTC or other video display controllers on the board. This is what I originally meant, but I was not totally confident in my understanding of the output differences in hardware.
However, as stated before, I think that old Windows operating systems (95/98/2000) used GPU functions for basic 2-D drawing (desktop, windows, etc.), and not just a CRTC (at resolutions under 800x600 on some of those settings, some CRTC-era cards could do basic drawing that would suffice, making the GPU unnecessary). They most definitely used the GPU for any program that depended on DX or OpenGL.
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
-
- Member
- Posts: 5513
- Joined: Mon Mar 25, 2013 7:01 pm
Re: Anyone ever done "hello world" through a modern GPU?
LieutenantHacker wrote:no CRTC, or other video display controllers onboard a PCB

If you do not correctly set up those "other video display controllers", you won't see anything on the screen. The GPU does not generate video signals.
- LieutenantHacker
- Member
- Posts: 69
- Joined: Sat May 04, 2013 2:24 pm
- Location: Canada
Re: Anyone ever done "hello world" through a modern GPU?
You sure about that? This Wikipedia article says that a GPU goes a step further than a VDP:
http://en.wikipedia.org/wiki/Video_display_controller
Any reliable sources that can claim that a GPU can not drive a video signal without a VDP/CRTC?
Also, if output is possible without a GPU, and the GPU is generally unrelated to it (in terms of the output signal), why do I hear people arguing here that a GPU is needed to get specific resolutions? If the GPU does not drive a video signal, how can it extend the native resolution of a CRTC/VDP?
The desire to hack, with the ethics to code.
I'm gonna build an 8-bit computer soon, with this as reference: http://www.instructables.com/id/How-to- ... -Computer/
- Combuster
- Member
- Posts: 9301
- Joined: Wed Oct 18, 2006 3:45 am
- Libera.chat IRC: [com]buster
- Location: On the balcony, where I can actually keep 1½m distance
- Contact:
Re: Anyone ever done "hello world" through a modern GPU?
So much for reading the relevant wikipedia page:
Wikipedia wrote:A graphics processing unit (GPU), also occasionally called visual processing unit (VPU), is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display.