Why 1 byte = 8 bits?
Posted: Mon Oct 24, 2005 12:33 am
by NOTNULL
Hi all,
I was just wondering why bytes have been designed to hold 8 bits - or why 8 bits comprise one byte. Is there any particular reason for choosing the number 8? Why not some other number?
-NOTNULL
Re:Why 1 byte = 8 bits?
Posted: Mon Oct 24, 2005 12:53 am
by Solar
It's just what became the most common. Many early architectures had byte sizes <> 8 bits.
Re:Why 1 byte = 8 bits?
Posted: Mon Oct 24, 2005 12:54 am
by Candy
NOTNULL wrote:
I was just wondering why bytes have been designed to hold 8 bits - or why 8 bits comprise one byte. Is there any particular reason for choosing the number 8? Why not some other number?
Well... because a bit offset into a byte then exactly fills a whole number of bits (no wastage), and a full alphabet fits in it without cramming and without leaving a load of wasted space.
The options for the first category were 4, 8 and 16; for the second, anywhere between 6 and 10. The only match is 8.
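For the first point, a minimal sketch in C (the helper name test_bit and the buffer are mine, purely for illustration): with 8 bits per byte, splitting a global bit index into a byte index and a bit offset is one shift and one mask, and the 3-bit offset field has no unused values.
Code:
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* With 8 = 2^3 bits per byte, a bit index splits cleanly:
   the low 3 bits are the offset, the rest the byte index.
   All 8 values 0..7 of the 3-bit offset are used - no waste. */
static int test_bit(const uint8_t *buf, size_t n)
{
    size_t byte  = n >> 3;   /* n / 8: which byte      */
    unsigned off = n & 7;    /* n % 8: which bit, 0..7 */
    return (buf[byte] >> off) & 1;
}

int main(void)
{
    uint8_t buf[2] = { 0x81, 0x01 };   /* bits 0, 7 and 8 set */
    printf("%d %d %d %d\n",
           test_bit(buf, 0), test_bit(buf, 1),
           test_bit(buf, 7), test_bit(buf, 8));   /* prints: 1 0 1 1 */
    return 0;
}
Compare with a 6-bit byte: the offset 0..5 still needs 3 bits but leaves the encodings 6 and 7 unused, and the byte index needs a real division instead of a shift.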
On a sidenote, it wasn't always like this. There were machines that worked with 36-bit words, which packed four characters of 9 bits each. And there were numerous other machines that did something else entirely. People have just agreed on the 8-bit byte because it works nicely.
Re:Why 1 byte = 8 bits?
Posted: Mon Oct 24, 2005 1:42 am
by Solar
Referring to the Jargon File, emphasis is mine:
byte: /bi:t/, n.
[techspeak] A unit of memory or data equal to the amount used to represent one character; on modern architectures this is invariably 8 bits. Some older architectures used byte for quantities of 6, 7, or (especially) 9 bits, and the PDP-10 supported bytes that were actually bitfields of 1 to 36 bits! These usages are now obsolete, killed off by universal adoption of power-of-2 word sizes.
Historical note: The term was coined by Werner Buchholz in 1956 during the early design phase for the IBM Stretch computer; originally it was described as 1 to 6 bits (typical I/O equipment of the period used 6-bit chunks of information). The move to an 8-bit byte happened in late 1956, and this size was later adopted and promulgated as a standard by the System/360. The word was coined by mutating the word 'bite' so it would not be accidentally misspelled as bit. See also nybble.
Re:Why 1 byte = 8 bits?
Posted: Sun Oct 30, 2005 10:30 pm
by NotTheCHEAT
Why are there 5,280 feet in a mile? Why seven days in a week? On the PC, a byte just happens to be 8 bits. There have been 7-bit bytes, 10-bit bytes, and other things. But on the ubiquitous PC it happens to be 8 bits, and the PC sets the standard, so a byte is known to be 8 bits, even though there are other kinds of bytes.
Re:Why 1 byte = 8 bits?
Posted: Mon Oct 31, 2005 2:07 am
by Candy
NotTheCHEAT wrote:
Why are there 5,280 feet in a mile? Why seven days in a week?
I've always wondered about those two. First, why define such weird conversion factors when you could just say a thousand XYZ or such? Second, an 8-day week would be better - it allows for a longer weekend.
Monday mornings... :-\
Re:Why 1 byte = 8 bits?
Posted: Mon Oct 31, 2005 2:57 am
by Solar
Candy wrote:
First, why define such weird conversion factors when you could just say a thousand XYZ or such?
Legacy. In some cultures the numerical base was not 10. And there were happy early days when measurement did not have to be precise, so people measured in hands, feet, spans, strides and whatever.
Second, an 8-day week would be better - it allows for a longer weekend.
Four 7-day weeks roughly equal one full cycle of the moon's phases, which was the basis for most early calendars.
Re:Why 1 byte = 8 bits?
Posted: Mon Oct 31, 2005 3:30 am
by distantvoices
Now, I'm not gonna cite a certain book for this would in all earnest open a full can o' worms. *rofl*
well - for the bit & byte stuff? isn't that because it's easier to construct adders whose number of inputs is a power of 2? (say a byte is 2^3)
Re:Why 1 byte = 8 bits?
Posted: Mon Oct 31, 2005 3:36 am
by Solar
NotTheCHEAT wrote:
On the PC, a byte just happens to be 8 bits. There have been 7-bit bytes, 10-bit bytes, and other things. But on the ubiquitous PC it happens to be 8 bits, and the PC sets the standard, so a byte is known to be 8 bits, even though there are other kinds of bytes.
As much as it has become a reflex, the PC is not to be blamed for everything. The System/360 not only predated the PC, but even the 8088 and - lo and behold! - even the founding of Intel.
Re:Why 1 byte = 8 bits?
Posted: Mon Oct 31, 2005 4:06 am
by Candy
beyond infinity wrote:
well - for the bit & byte stuff? isn't that because it's easier to construct adders whose number of inputs is a power of 2? (say a byte is 2^3)
Well... no. A multi-bit adder is a chain of single-bit full adders (ripple carry), potentially optimized by chopping it in half and computing the upper half both with and without a carry-in, then later checking which one is correct (carry select). That works with any length; the optimized version with any length that is a multiple (not a power) of two. Nine or ten bits would work nicely.
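Here's that chain simulated bit by bit in C, to show that nothing about addition itself cares about powers of two (the function add_n and its width parameter are mine, purely for illustration):
Code:
#include <stdint.h>
#include <stdio.h>

/* Ripple-carry adder: a chain of single-bit full adders.
   The loop runs for ANY width - 8, 9, 10, 36, ... - so adder
   construction alone doesn't force a power-of-2 byte size. */
static uint64_t add_n(uint64_t a, uint64_t b, unsigned width)
{
    uint64_t sum = 0;
    unsigned carry = 0;
    for (unsigned i = 0; i < width; i++) {
        unsigned ai = (a >> i) & 1, bi = (b >> i) & 1;
        sum |= (uint64_t)(ai ^ bi ^ carry) << i;   /* sum bit   */
        carry = (ai & bi) | (carry & (ai ^ bi));   /* carry out */
    }
    return sum;   /* result truncated to 'width' bits */
}

int main(void)
{
    printf("%llu\n", (unsigned long long)add_n(200, 300, 9));   /* 500 */
    printf("%llu\n", (unsigned long long)add_n(511, 1, 9));     /* 0: 9-bit wraparound */
    return 0;
}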
Width does matter for some other instructions, say bit count. If you make an instruction of that, 16 bits is the same speed as 9 bits, since both need 4 levels of adding; with 8 bits you only need 3. Yet this doesn't explain why it isn't 16 or 32 bits, which would nowadays - given Unicode and 32/64-bit processors - be more practical.
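For the bit-count point, a sketch of the adder-tree trick in C (again just an illustration, not any particular instruction set): each line is one level of adding, so counting 8 bits takes 3 levels, while 16 bits - or anything from 9 up - takes 4.
Code:
#include <stdint.h>
#include <stdio.h>

/* Bit count as a tree of additions, the way hardware would do it:
   each level adds neighbouring fields in parallel.
   ceil(log2(width)) levels: 3 for 8 bits, 4 for 9..16 bits. */
static unsigned popcount8(uint8_t x)
{
    x = (x & 0x55) + ((x >> 1) & 0x55);   /* level 1: 1-bit pairs  */
    x = (x & 0x33) + ((x >> 2) & 0x33);   /* level 2: 2-bit fields */
    x = (x & 0x0F) + ((x >> 4) & 0x0F);   /* level 3: whole byte   */
    return x;
}

static unsigned popcount16(uint16_t x)
{
    x = (x & 0x5555) + ((x >> 1) & 0x5555);   /* level 1 */
    x = (x & 0x3333) + ((x >> 2) & 0x3333);   /* level 2 */
    x = (x & 0x0F0F) + ((x >> 4) & 0x0F0F);   /* level 3 */
    x = (x & 0x00FF) + ((x >> 8) & 0x00FF);   /* level 4 */
    return x;
}

int main(void)
{
    printf("%u %u\n", popcount8(0xB7), popcount16(0xB7B7));   /* 6 12 */
    return 0;
}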
*off to developing hardware in VHDL again* -> yep, I'm enjoying myself at my internship
.. now if that buffer of 256 bytes would just render as memory bits instead of 2800 logic elements...
Re:Why 1 byte = 8 bits?
Posted: Fri Nov 04, 2005 11:56 am
by iammisc
About the mile:
A mile is the distance a Roman soldier covered in 1000 steps.
Re:Why 1 byte = 8 bits?
Posted: Fri Nov 04, 2005 3:17 pm
by Eero Ränik
If I'm not mistaken, a mile is around 1.6 km. That'd mean a Roman soldier covered 1.6 meters with one step, which seems unlikely. However, you are right - you just should've mentioned that you meant 1000 double steps.
Re:Why 1 byte = 8 bits?
Posted: Fri Nov 04, 2005 3:42 pm
by CESS.tk
Eero Ränik wrote:
If I'm not mistaken, a mile is around 1.6 km. That'd mean a Roman soldier covered 1.6 meters with one step, which seems unlikely. However, you are right - you just should've mentioned that you meant 1000 double steps.
Note that one double step is exactly one loop in an AGI view.
Re:Why 1 byte = 8 bits?
Posted: Fri Nov 04, 2005 4:14 pm
by Eero Ränik
So a mile is a lot more than AGI can handle in one view.
Re:Why 1 byte = 8 bits?
Posted: Sun Nov 06, 2005 5:48 am
by Solar
And a sea mile is the distance a Roman soldier could cover with 1000 strokes?