Any reason to do 16-bits?

Discussions on more advanced topics such as monolithic vs micro-kernels, transactional memory models, and paging vs segmentation should go here. Use this forum to expand and improve the wiki!
User avatar
SpyderTL
Member
Member
Posts: 1074
Joined: Sun Sep 19, 2010 10:05 pm

Re: Any reason to do 16-bits?

Post by SpyderTL »

I've done both 16-bit and 32-bit OS proof-of-concepts, and I would say that you can get away with 16 bits as long as you aren't planning on using graphics modes beyond mode 13h (320x200 with 256 colors), digital audio, digital video, or anything else that needs more than 1 MB of system memory at a time.

So, basically you can do text mode in 16-bit, but you will need to go to 32-bit to do anything else substantial.
My first stage boot loader (512 bytes) loads a 2nd stage (up to 64K) in, obviously, real mode. The second stage switches to PM, which brought up an issue: do I make a 3rd stage boot loader (how many freakin bootloaders do I want?) or do I mix 16 and 32 bit code (sounds messy).
I'm also mixing 16-bit and 32-bit code in one of my boot loader stages, which I'm not thrilled with, but it doesn't bother me enough to actually change it. I have slightly different boot loaders for floppy disk, hard disk, CD-ROM, USB and Network boot. Some of them are 32-bit only, and some of them are 2-stage 16-bit/32-bit, and some are 3-stage 16-bit/mixed/32-bit designs. I occasionally modify them, as needed, but I'm sure that they'll probably all be replaced by GRUB one day...
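To make the "1 MB" ceiling mentioned above concrete, here is a small sketch (the function name is just illustrative) of the real-mode segment:offset arithmetic that imposes it, and of why mode 13h in particular still fits:

```python
# Sketch of 8086 real-mode addressing: physical = segment * 16 + offset,
# which yields a 20-bit (roughly 1 MB) physical address space.

def real_mode_address(segment: int, offset: int) -> int:
    """Compute the physical address for a 16-bit segment:offset pair."""
    assert 0 <= segment <= 0xFFFF and 0 <= offset <= 0xFFFF
    return (segment << 4) + offset

# The highest address reachable in real mode:
top = real_mode_address(0xFFFF, 0xFFFF)
print(hex(top))  # 0x10ffef -- 1 MB plus ~64 KB (the HMA, if the A20 line is enabled)

# Mode 13h is 320x200 at 1 byte per pixel, so the whole framebuffer
# fits inside a single 64 KB segment -- one reason it is comfortable
# to use without leaving real mode:
framebuffer_bytes = 320 * 200
print(framebuffer_bytes, framebuffer_bytes <= 0x10000)  # 64000 True
```

Higher-resolution or deeper-color modes blow past the 64 KB segment size (and often past 1 MB total), which is where bank switching or a jump to protected mode becomes unavoidable.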
Project: OZone
Source: GitHub
Current Task: LIB/OBJ file support
"The more they overthink the plumbing, the easier it is to stop up the drain." - Montgomery Scott
Razumir
Posts: 1
Joined: Tue May 17, 2016 12:17 pm

Re: Any reason to do 16-bits?

Post by Razumir »

Schol-R-LEA wrote:
FelixBoop wrote: Given all this, do you think it would be worth developing a modern, 16 bit OS? Obviously, it wouldn't be a super computer, but I think that a system that could use modern file formats, and maybe even do basic networking, could be quite useful, not to mention cheap.
..., especially if the target is an x86 PC system. Simply put, the original x86 architecture and instruction set stinks on ice, and while a lot of the flaws were corrected (or at least compensated for) by the 32-bit extensions, it is still a truly wretched design that even Intel wants to see the back of - it only persists because no one wants to have to re-write every single piece of Windows software people want to keep using.
I think that's a bad excuse for Intel/MS. The old x86 "genes" are like a "magic spell" that marks their rule; they stick to keeping them in their "cells". It works, it's popular, there are manuals to explain the tricks, so "who cares" that it's ugly?

Apple emulated the 680x0 when it was needed, back in the mid-'90s when switching to PPC. In another era they switched to Intel.

There was also the Crusoe CPU. https://en.wikipedia.org/wiki/Transmeta

There was also the virtual machine (managed environment) trend with Java and CLR/.NET, running since the '90s and early 2000s.

If Intel/MS/AMD and the other PC CPU producers (when they still existed in the '90s) had wanted to, they could have made a joint effort to build emulation software and standards for a transition period, and created a better design without the flaws; there could have been dynamic software translation on the fly, or somebody could have dared to turn away from the established standards. Instead they competed without taking such risks, and the companies died one after another.

Also, not all of the software would have had to be rewritten; maybe only the kernels, with the rest just recompiled.

Besides, a lot of code is, or has to be, rewritten or recompiled anyway, because new CPUs demand it for higher performance, along with newer compilers, new GUI requirements, new DirectX standards, new hardware (like the GPGPU/OpenCL trend), multicore CPUs, and then hardware virtualization, etc.

As for compatibility: it is (or was) a trademark of Microsoft's, but how many people are still using unique Windows 95 or Windows 98 software on a modern PC for productivity tasks, without emulation? It makes little sense, and such an OS wouldn't cope with modern hardware anyway.
User avatar
Schol-R-LEA
Member
Member
Posts: 1925
Joined: Fri Oct 27, 2006 9:42 am
Location: Athens, GA, USA

Re: Any reason to do 16-bits?

Post by Schol-R-LEA »

@Razumir Three things. First, you might want to check the date of the last post in a thread before responding to it. Second, I quite agree that it is a poor excuse, as does everyone else including MS and Intel, but the problem is that attempts to pry the desktop away from x86 by switching to other (real or virtual) environments - the IBM PPC Windows systems of the mid-1990s, Itanium, the Java JVM, the .Net Framework, and more - have all been resounding failures in one way or another, at least in regard to that particular goal. And third, it isn't the operating system that presents the problem, but rather the 3rd-party drivers and applications, especially the keystone ones, many if not most of which fiddle with the system on a lower level than they are supposed to, depending on undocumented features, existing bugs, and specific hardware implementations in order to work.

Worse, some of the workhorse applications, and many, many obscure drivers, no longer have the original source code on hand to recompile, so they would have to remain emulated until someone can afford to do a fresh re-write - something that was problematic enough eighteen years ago when the Y2K scare was in full swing, and isn't much better today.

Microsoft and Intel have enough headaches keeping x86 systems compatible with themselves; trying to emulate all the crap that various popular but non-compliant programs rely on would be (to borrow a simile from years gone by) the equivalent of repairing an airliner in mid-flight after a collision. No one in the industry wants the status quo, but until it reaches a crisis point, it is unlikely to change.

What kind of crisis, you say? I don't know, but it would basically be one large enough to make continuing with any of the existing software impossible. If you've ever read Ringworld, consider the reason Louis gave Teela for wanting to go on the expedition: humanity was going to need the Puppeteers' more advanced hyperdrive, because even with a thousand years' advance warning, there was no way people were going to get moving to escape the galactic core explosion until the incoming shockwave was right on top of them. That's the sort of inertia we are seeing in this matter today: everyone knows that it is all going to fall apart someday soon, and most expected it to happen several years ago, but no one wants to risk losing their position and status until they've already lost it.

Hell, Intel has been looking for a replacement for the x86 since before it even hit the market: the 16-bit 8086 was an interim design that they figured was expendable, and the 8088 was a chopped-down version of that. They were as caught off-guard by its success as anyone else; their plan was that the real money would be in the iAPX 432, which never even made it to production in any significant numbers due to performance problems. To the corporate managers, x86 is a cash cow, but the engineers see it for the gilded cage it really is.
Rev. First Speaker Schol-R-LEA;2 LCF ELF JAM POEE KoR KCO PPWMTF
Ordo OS Project
Lisp programmers tend to seem very odd to outsiders, just like anyone else who has had a religious experience they can't quite explain to others.
Post Reply