Re: I want to know stories about 80286
Posted: Tue May 02, 2017 7:19 am
tom9876543 wrote: Intel was way behind thanks to its obsession with the failing IAPX432.

In a sense, yes, but you have to remember that Intel wasn't aiming for - and didn't even want - the home computer market. They were expecting to have to compete with the LispMs, the Xerox Star, and the MicroVAX for the high-end workstation market - and all of them had extremely heavyweight CPU designs, with the MicroVAX being the least elaborate of the lot (which is saying a lot, considering that the VAX ISA was one of the most complex of any production minicomputer). They - and nearly everyone else - thought that both tagged architectures and capability addressing would be absolutely essential technologies for workstations by 1985, and the idea that something like the 68K would be powerful enough for a workstation seemed ludicrous.
Then things started to change drastically. The 432 ran into problems with its complex memory subsystem - the assumption that hardware-based capabilities would somehow magically become more efficient in a silicon implementation than they had been in TTL proved to be wishful thinking - and the complexity of the overall design was proving to be too much. The design took several times more transistors than expected, which meant that at then-current transistor densities, fitting the whole thing onto one chip made the physical size of the die impractically large, causing the failure rate to skyrocket.
Meanwhile, LMI and Symbolics were both so enamored of their own design skills that they kept refining their designs rather than developing a practical production line (a problem shared by several companies at the time, with Foonly being another stark example). They were essentially selling wire-wrapped prototypes built in TTL logic as finished products, and made few moves either to formalize the designs for an assembly line or to re-implement them as single-chip CPUs (which probably wouldn't have worked, for the same reasons the 432 didn't), though TI tried the latter later on.
The Xerox Star (and the Xerox Dorado, their attempt to enter the LispM market) was also an expensive TTL system that was never really re-designed for mass production. In any case, upper management was never really enthusiastic about the PARC ideas - they correctly saw them as a threat to the company's core business, and assumed that if they buried them, no one else would rediscover the ideas, not realizing how much press coverage the work had already gotten. Soon afterwards, Xerox made a deal with Apple to let them use a version of the PARC GUI, with much the same assumption Intel had made - that home computers were a dead end anyway. They figured it would shut those damn hippies in Palo Alto up by giving them a practical lesson in what would and wouldn't work, because the Lisa was obviously underpowered as a workstation and you'd have to be crazy to think people really wanted this 'Macintosh' thing in their homes...
Then, just as it looked like all these things might work out after all, a bombshell hit: the results of the Berkeley RISC and Stanford MIPS projects were published. No one saw this coming, and there was a huge fight in both academia and the industry over what this really meant.
And in the background, where no one was watching, companies like SUN (originally a specialty company formed to support the Stanford University Network for the CS department) and Apollo started releasing workstations in the $10K to $30K range - well below the price of the LispMs and the Star - that were just barely powerful enough to run a version of Unix. They lacked all the fancy design and optimizations of the existing workstations, but because they were built around single-chip CPUs rather than TTL logic, the difference in performance wasn't as huge as conventional wisdom suggested, and the gap was closing rapidly. Suddenly, everyone who had been buying the high-end workstations decided that half a loaf today was better than a whole loaf tomorrow, and flocked to these machines, which could do most of what they wanted for a lower price.
Intel was caught at a crossroads in the design of computers, and bet on the wrong path - or at least, a path that wasn't feasible yet. It is possible that capability addressing and tagged memory could be made to work now, given Moore's Law and other improvements in process technology, but there hasn't been much interest in them until recently.