Why 1 byte = 8 bits?

All off topic discussions go here. Everything from the funny thing your cat did to your favorite tv shows. Non-programming computer questions are ok too.

Re:Why 1 byte = 8 bits?

Post by rwfromxenon »

1000 DOUBLE strokes ;D

Post by Candy »

A thousand? That wouldn't fit the imperial system; a thousand is a consistent multiple of ten.

A mile is 8 furlongs.

A furlong is 10 chains.

A chain is 4 rods.

A rod is 198 inches.

An inch is 2.54 cm.

That makes the mile equal to 63360 inches, or 1609.344 meters, which it is.
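Candy's chain can be sanity-checked in a few lines of Python (my own sketch, not part of the thread):

```python
# Sanity-check the imperial chain: mile -> furlongs -> chains -> rods -> inches -> metres.
FURLONGS_PER_MILE = 8
CHAINS_PER_FURLONG = 10
RODS_PER_CHAIN = 4
INCHES_PER_ROD = 198
CM_PER_INCH = 2.54

inches_per_mile = FURLONGS_PER_MILE * CHAINS_PER_FURLONG * RODS_PER_CHAIN * INCHES_PER_ROD
metres_per_mile = inches_per_mile * CM_PER_INCH / 100

print(inches_per_mile)              # 63360
print(round(metres_per_mile, 3))    # 1609.344
```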

Consider a league (3 miles), which is what a soldier would walk in an hour as more likely. It's pretty unlikely that they stated that you must make 10 paces each 6 seconds.

Post by Solar »

Candy wrote: Consider a league (3 miles), which is what a soldier would walk in an hour as more likely.
Fits nicely with the 6 km / hour expected marching speed when I was with the "greens"...
Every good solution is obvious once you've found it.

Post by Candy »

Solar wrote:
Candy wrote: Consider a league (3 miles), which is what a soldier would walk in an hour as more likely.
Fits nicely with the 6 km / hour expected marching speed when I was with the "greens"...
Well... no. 6 km/hour is about 1.25 leagues an hour. A league is only 4828 meters, remember? Although I don't expect you to keep that speed up for a full day...
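For the curious, the arithmetic behind Candy's figure (a sketch of mine; 6 km/h against a 3-statute-mile league):

```python
# 6 km/h marching speed expressed in leagues per hour (1 league = 3 statute miles).
METRES_PER_MILE = 1609.344
metres_per_league = 3 * METRES_PER_MILE
leagues_per_hour = 6000 / metres_per_league

print(round(metres_per_league, 3))   # 4828.032
print(round(leagues_per_hour, 2))    # 1.24
```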

Post by Solar »

Roughly speaking, man. And our legs today are a lot longer than those of the average Roman of the period. ;D

Post by Candy »

Solar wrote: Roughly speaking, man. And our legs today are a lot longer than those of the average, period Roman. ;D
Hm... also, it just occurred to me that you might have meant military service as opposed to the Boy Scouts... *oops*

Post by Solar »

Never been in the Boy Scouts, but it was a kindergarten all right... ;D

Post by NotTheCHEAT »

And interestingly, the standard width of a railroad track is the standard width of ancient Roman roads, which were designed to be twice the width of a horse's behind :P

Rome standardized a lot of things. And I guess that's how the modern byte came to be 8 bits.

By the way, I would blame the 8-bitedness of the byte on the PC. System/360 may have used 8-bit bytes, but that's not what caused everyone else to use 8-bit bytes. Everyone else uses 8-bit bytes because the ubiquitous PC uses 8-bit bytes. Perhaps the PC used 8-bit bytes because System/360 did, but that does not directly affect the fact that today, most computer systems use 8-bit bytes.

Pythagoras wasn't the first to notice that a[sup]2[/sup] + b[sup]2[/sup] = c[sup]2[/sup], but he's the one who told the world, and he's the one who made it famous. He may have copied it from the Egyptians, but the Egyptians aren't the ones who made it world famous. And System/360 isn't the system that made 8-bit bytes famous.

Post by Solar »

NotTheCHEAT wrote: And interestingly, the standard width of a railroad track is the standard width of ancient Roman roads, which were designed to be twice the width of a horse's behind :P
Which of the three different railroad widths in use in the European Union alone do you mean?

Never, ever assume the "standard" you know is the same for everyone... especially not in a thread about standards. ;)
NotTheCHEAT wrote: Rome standardized a lot of things. And I guess that's how the modern byte came to be 8 bits.
Because of the Romans? :-D
NotTheCHEAT wrote: By the way, I would blame the 8-bitedness of the byte on the PC. System/360 may have used 8-bit bytes, but that's not what caused everyone else to use 8-bit bytes. Everyone else uses 8-bit bytes because the ubiquitous PC uses 8-bit bytes. Perhaps the PC used 8-bit bytes because System/360 did, but that does not directly affect the fact that today, most computer systems use 8-bit bytes.
*sigh*

It won't do to blame everything on the IBM PC. There was computer history before that machine saw the light of day.

Jargon File, entry "byte", emphasis mine:
Historical note: The term was coined by Werner Buchholz in 1956 during the early design phase for the IBM Stretch computer; originally it was described as 1 to 6 bits (typical I/O equipment of the period used 6-bit chunks of information). The move to an 8-bit byte happened in late 1956, and this size was later adopted and promulgated as a standard by the System/360.
The IBM PC was released in 1981. Popular CPUs of the day were the 6502 (released 1975), the Z80 (released 1976), and the 8086 (1978 (!)). All using 8-bit bytes, all widely used before the first sketch of the IBM PC was put to the drawing board.

Not trying to sound offensive, but which non-8-bit-byte CPU was so comparable in popularity to, say, the Apple II or the Altair 8800 that it required the IBM PC to make 8 bits the "standard"?
NotTheCHEAT wrote: And System/360 isn't the system that made 8-bit bytes famous.
Sure as hell it wasn't the IBM PC either.

Post by Candy »

Solar wrote:
NotTheCHEAT wrote: And interestingly, the standard width of a railroad track is the standard width of ancient Roman roads, which were designed to be twice the width of a horse's behind :P
Which of the three different railroad widths in use in the European Union alone do you mean?
Well... the definition could still be right. Italian horses have a behind of 71.75 cm and Spanish horses one of slightly over 80 cm; that's quite possible, isn't it?
NotTheCHEAT wrote: Rome standardized a lot of things. And I guess that's how the modern byte came to be 8 bits.
Solar wrote: Because of the Romans? :-D
The Romans didn't do anything with bits. A Roman bit had only one value, since they hadn't invented the 0 yet. You thus couldn't really build a computer in Roman culture, since there'd be no way to successfully indicate termination of a C program.

Post by Kemp »

Hmmm... design a computer that can only have 1s everywhere. Every voltage level and magnetic bit must correspond to the digit 1.

I suppose you could just have X volts (X = 5, 3.3, or whatever voltage you're using) and no voltage (i.e., tristate devices without the 0 state). Then you could send data by means of the frequency of the X-volt pulses. But then you hit another problem: a device either puts X volts on the bus or disconnects itself from the bus, so how can it receive the data? The X-volt "pulses" won't actually be going anywhere ;D

Yay, my first pointless hypothetical scenario for today. I've decided that the Romans couldn't build computers due to their lack of a 0. Of course, they could make electricity flow without specifically saying "this point is at 0 volts", so maybe this was even more pointless than I made it sound...
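Kemp's frequency-of-pulses idea can actually be simulated. Here's a toy Python sketch (the `encode`/`decode` names and the unary-burst scheme are my own invention, not Kemp's exact proposal): every pulse is a "1", each value is sent as a burst of pulses, and the receiver recovers values purely from the timestamps of the pulses it sees.

```python
# Ones-only bus: information lives in the spacing between pulses, not in a 0 level.
# Each symbol is a burst of (value + 1) pulses; bursts are separated by a long gap.

def encode(symbols, short=1, long=5):
    """Return pulse timestamps: short gaps inside a burst, long gaps between bursts."""
    t, times = 0, []
    for value in symbols:
        for _ in range(value + 1):   # unary: even value 0 needs one pulse
            times.append(t)
            t += short
        t += long - short            # stretch the last gap into a burst separator
    return times

def decode(times, threshold=3):
    """Recover symbols by counting pulses between long gaps."""
    symbols, count = [], 1           # the first pulse opens a burst
    for prev, cur in zip(times, times[1:]):
        if cur - prev > threshold:   # long gap: the burst has ended
            symbols.append(count - 1)
            count = 1
        else:
            count += 1
    symbols.append(count - 1)        # close the final burst
    return symbols

print(decode(encode([2, 0, 3])))     # [2, 0, 3]
```

Of course, the long gap between bursts is doing all the work of a 0 here, which is rather Kemp's point: you can't escape needing "nothing" somewhere.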