Hi,
tom9876543 wrote:That was not Intel's fault - since the beginning the first 32 interrupts were reserved by Intel for exceptions.
Yes you are correct, my bad. IBM is supposed to be the pinnacle of computing excellence but they stuffed that up.
IBM did a lot of good stuff, and a lot of average stuff, and a lot of other stuff. The original PC fits in the "other" category - something IBM slapped together quickly by recycling parts from other systems and gluing them together, just to get a product to market.
tom9876543 wrote:Intel make a CPU and had no say in how that CPU is used. A20 wasn't Intel's problem
I would disagree with you there. I found the Intel 8086 Users Manual on the web. It clearly says the following:
- offsets wrap around
- the memory address space is limited to 1 megabyte
It does not clearly say what happens when the calculated physical address needs 21 bits, but based on the above, you would assume it wraps around.
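To make that concrete (a minimal sketch of the arithmetic; the function name and constants are mine, not from the manual): the 8086 forms a physical address as segment * 16 + offset, which can produce a 21-bit result even though the chip only has 20 address lines, so the top bit is simply lost and the access wraps back to the bottom of memory.

Code:

#include <stdio.h>
#include <stdint.h>

/* Physical address exactly as the 8086 computes it: segment * 16 + offset.
   The sum can reach 0x10FFEF, which needs 21 bits. */
static uint32_t phys(uint16_t seg, uint16_t off)
{
    return ((uint32_t)seg << 4) + off;
}

int main(void)
{
    uint32_t addr = phys(0xFFFF, 0x0010);          /* = 0x100000 */

    /* 8086: only 20 address lines, so bit 20 is dropped and the
       access wraps around to 0x00000. */
    printf("8086 address:  0x%05X\n", addr & 0xFFFFF);

    /* 80286 and later (with the A20 line enabled): no wrap, the
       full address 0x100000 goes out on the bus. */
    printf("80286 address: 0x%06X\n", addr);
    return 0;
}

On the 8086 that wrap came for free and some software relied on it; on the 80286 the same segment:offset suddenly reached new memory, which is what the A20 gate was bolted on to hide.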
The manual documents what does happen for that CPU. In 1978 (when that manual was probably written) Intel didn't have a working time machine, and therefore wasn't able to find out (until it was too late) that people would be too stupid to realise a CPU with more memory might not wrap. They probably didn't even know whether there was going to be an 80186 back then anyway.
tom9876543 wrote:So you're suggesting that back in 1985, Intel should've used some sort of time travel to see what people would/wouldn't be doing 15 years later? Hindsight is easy. Foresight isn't.
I am suggesting Intel should have had the following philosophy when creating the 386:
Build a 32 bit CPU that is 100% compatible with the 8086 but make the design as elegant and clean as possible. Get rid of 286 compatibility as its "protection" is primitive and convoluted.
They did get rid of most 286 compatibility. They left just enough in so that applications (but not system software) would still work. In hindsight, this was probably a bit too risky, and maybe Intel would be closer to having 100% market share now if they hadn't screwed over the people who had developed software for 80286 protected mode 20 years earlier. Heck, maybe it took ages for Microsoft to develop a 32-bit OS because Microsoft were worried Intel would break backward compatibility again and leave them with an expensive OS that would run on the 80386 and nothing else.
For the 80386 they made segmentation very robust, so that you could have several pieces of code (at different privilege levels) all relying on each other without going through any intermediate/kernel code. For example, a process (at CPL=3) could use a task gate to cause an immediate task switch to another process (also at CPL=3), and the new process could use a call gate to access a driver (at CPL=2), without any need to switch to/from CPL=0 and without any security problems. Intel couldn't have known that these remarkably powerful features wouldn't be used very much when they first designed them.
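To give a rough idea of what one of those mechanisms looks like (a sketch only; the struct, field and function names are mine, but the bit layout follows the 80386's 32-bit call gate descriptor format): the OS installs a call gate in the GDT, and the gate's DPL controls which privilege levels are allowed to call through it.

Code:

#include <stdint.h>

/* Sketch of a 32-bit call gate descriptor (80386 layout, my names). */
struct call_gate {
    uint16_t offset_low;   /* entry point offset, bits 0-15           */
    uint16_t selector;     /* code segment selector the gate targets  */
    uint8_t  param_count;  /* bits 0-4: dwords copied between stacks  */
    uint8_t  type_attr;    /* P, DPL, type = 0xC (32-bit call gate)   */
    uint16_t offset_high;  /* entry point offset, bits 16-31          */
} __attribute__((packed));

static struct call_gate make_call_gate(uint16_t code_sel, uint32_t entry,
                                       uint8_t dpl, uint8_t params)
{
    struct call_gate g;
    g.offset_low  = (uint16_t)(entry & 0xFFFF);
    g.selector    = code_sel;
    g.param_count = params & 0x1F;
    g.type_attr   = (uint8_t)(0x80 | ((dpl & 3) << 5) | 0x0C); /* present, DPL, gate type */
    g.offset_high = (uint16_t)(entry >> 16);
    return g;
}

With a gate like that in the GDT (DPL=3, selector pointing at the driver's CPL=2 code segment), the application just does a far call through the gate's selector and lands at the entry point the OS chose, with no CPL=0 code involved at all.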
If you want to blame someone for most of these things, then you should blame yourself. You should've built a time machine and travelled back in time (to both 1978 and to 1985) and warned Intel (and IBM?) about the future. You didn't; therefore all of us programmers, and IBM and Intel and lots of other companies should all sue you for failing to invent time travel.
Cheers,
Brendan