Brendan wrote:
VGA hardware only really had one of these pieces (the "scans a pixel buffer and converts the pixels into VGA/HDMI/DVI signals" piece).
To clarify this statement, permit me to give a bit of a history lesson.
What Brendan is alluding to here is that what we usually call 'VGA' today is a rather different thing from the original, even discounting the advances in technology. The original
Video Graphics Array - not, as is usually said nowadays, 'Adapter' - was an ASIC directly located on the motherboards of the
IBM PS/2 Models 50, 60, and 80, integrated into the system itself rather than sitting on the
Micro Channel Architecture bus that those machines used for MCA peripheral cards. IBM deliberately designed it this way, because the whole purpose of the PS/2 line was to shoot the PC clone market in the head - though it ended up hitting IBM's foot instead.
They made this kind of mistake more than once with the PC market, starting with when they created it in the first place. IBM had always expected to use the PC to rein in the small-computer market, reducing personal computers to little more than glorified smart terminals. That didn't fly in 1981, and it didn't fly seven years later, either.
IBM assumed that the proprietary array could not be cloned, and that the few PC manufacturers who did license the technology would be unable to compete under its ruinous licensing terms. It was also thought that it would be impossible to push 640x480 16-color (let alone 320x200 256-color) video through an adapter on the 16-bit AT bus, so they discounted the possibility of someone making a work-alike video system that wasn't integrated.
As I said, they made this sort of mistake a lot. Now, to be fair, they had a point: the original AT-bus VGA-compatible cards stank on ice, and VGA-compatible monitors were far too expensive for most PC users. But the main effect this actually had was to give a new lease on life to CGA, MDA, and HGC, and to the XT-class PCs in general. More than anything else, IBM misjudged the carrying capacity of the market - most people were perfectly happy with a mono monitor and a Turbo 8088, and the high-performance folks who wanted 80286s and 80386s usually needed an AT-bus (now re-christened
Industry Standard Architecture, in contrast to its proposed 32-bit successor,
EISA) system anyway, to run peripherals for which no MCA adapters existed.
By the time VGA-compatible monitors and adapters were coming down in price circa 1992, both MCA and EISA were dead in the water (though ISA outlasted them both, and was often combined with other, later buses in the early and mid-1990s), and the resurgent market for
'Super VGA' turned to a new, special-purpose interface,
VESA Local Bus, for the video cards. This gave way to
PCI a year or so later, then to
AGP in 1996, then most recently to
PCI Express around 2004.
The point is, you can't really talk about 'VGA' at the hardware level, only at the level of hardware
emulation of the original VGA - compatibility that is usually maintained at the BIOS level, but not always at the register level. The actual hardware used by different adapters was, and remains, radically different from what IBM shipped in 1987.