Some confusion about VGA and display drivers
Posted: Sun Aug 06, 2017 3:15 pm
Hi there,
It's been quite a while since I posted here! A few things about VGA have me a bit confused, and I thought this would be the best place to ask. In a general sense, I do understand how the VGA hardware works: how the analog signals are generated from video memory and so on. With older motherboards, I understand the whole process: the CPU would set a video mode and write to video memory over some bus protocol, and the VGA chip (in cooperation with the video RAM and the RAMDAC) would generate the output signal, the pixel clock, and the vertical and horizontal retrace signals.
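Just to show the part I think I do understand: here is roughly how I picture the "CPU writes to video memory" step, as a minimal freestanding C sketch. It assumes the standard 80x25 colour text mode with its buffer at physical address 0xB8000, and that this address is accessible as written (identity-mapped or similar); the function name is just mine.

Code:

#include <stdint.h>

/* Legacy VGA text buffer: each cell is one character byte plus one
 * attribute byte; 0xB8000 is the standard physical address for the
 * 80x25 colour text mode. */
static volatile uint16_t *const vga_text = (volatile uint16_t *)0xB8000;

void vga_put_string(const char *s)
{
    for (unsigned i = 0; s[i] != '\0'; i++)
        /* 0x07 = light grey on black attribute */
        vga_text[i] = (uint16_t)((uint8_t)s[i] | (0x07 << 8));
}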
However, I don't understand the process on slightly more modern motherboards (I say slightly because, of course, most modern motherboards have HDMI or at least DVI). I have an old motherboard here that has an integrated GPU. From what I can find online, the integrated GPU usually just uses part of the system RAM as video RAM. Also, I don't think there are RAMDACs on the motherboard, but I'm not sure. Do CPUs with an integrated GPU just output the (analog) VGA signals directly? That seems hard to believe when I look at the motherboard, considering how far the CPU is from the VGA connector.
Now suppose my CPU didn't have an integrated GPU and I were using a discrete GPU instead. If I don't have the proper drivers, the OS obviously has to fall back to some commonly supported protocol or interface (right?). I'd like to know more about this, but I can't find any information (probably because I don't know the proper terms to search for).
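To illustrate the kind of "commonly supported" interface I have in mind: my assumption is that any card still emulates the standard VGA modes, so even without a vendor driver one could draw into the classic mode 13h framebuffer at 0xA0000 once the BIOS (or bootloader) has switched to that mode. A minimal sketch, assuming the mode has already been set and the address is accessible as written:

Code:

#include <stdint.h>

/* Classic VGA mode 13h: 320x200 pixels, 256 colours, with a linear
 * framebuffer at physical address 0xA0000.  Assumes the BIOS or
 * bootloader already switched to that mode. */
static volatile uint8_t *const mode13h_fb = (volatile uint8_t *)0xA0000;

void put_pixel(unsigned x, unsigned y, uint8_t colour_index)
{
    if (x < 320 && y < 200)
        mode13h_fb[y * 320u + x] = colour_index;
}

Is that register/memory-level compatibility roughly what the fallback amounts to, or is there a newer standard interface I should be reading about?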