Page 2 of 2
Posted: Wed Aug 29, 2007 9:49 pm
by Tyler
jerryleecooper wrote:For those who think 1 MB is 1000 kB: first, 1 byte is not ten bits, it's eight. When you come to me with your computers computing on bytes that are ten bits in size, and kilobytes that have 1000 of these bytes, then we'll talk about gibi kibi gigli.
A byte is not 8 bits.
Posted: Wed Aug 29, 2007 9:54 pm
by Alboin
Tyler wrote:jerryleecooper wrote:For those who think 1 MB is 1000 kB: first, 1 byte is not ten bits, it's eight. When you come to me with your computers computing on bytes that are ten bits in size, and kilobytes that have 1000 of these bytes, then we'll talk about gibi kibi gigli.
A byte is not 8 bits.
Yes, a byte can be any number of bits. An octet is 8 bits.
Posted: Thu Aug 30, 2007 1:34 am
by inflater
Gb, GB, GiB... Why don't we just stick to the standard - EB, PB, TB, GB, MB, kB, B - when we want to specify an exact size, for example of a file or medium, and Eb, Pb, Gb, Mb, kb, b - as units for transfer rates suffixed with "ps" or "/s" - and leave the MiB things alone? It just sounds silly (is it even standardised?).
I've never heard my friends say "this game has 126 mebibytes" or anything of that sort.
Regards
inflater
Posted: Thu Aug 30, 2007 5:14 am
by Khumba
I think it's a bad idea to be using kB/GB/etc. period on computers, because:
- It's not the standard on which computers run. As far as I know, it's only hard drive manufacturers that use the prefixes correctly, though I'd prefer that they use GiB. (Some software, mostly open source from what I've seen, will also use MiB/etc. correctly.)
- The confusion makes it vague. If 1 kB were 1024 B, then 1 kB / 1 ks != 1 B/s, which is plain confusing, so at least follow the standard.
- Kib is closer to KiB than to KB anyway: off by a clean factor of 8 rather than not being proportional at all. I also think transfer rates should be reported in KiB/s, not Kib/s; I don't feel like doing the extra divide by eight. But perhaps KiB et al. should be renamed KiO/GiO/etc. (kibioctet, gibioctet)?
While not quite as bad as jnc100's example, I've seen a data structures textbook that used "1000 Kb" to mean 1000 bytes and "1000Kb" to mean 1024 bytes.
I find kibi/mebi/etc. hard to say quickly; maybe whoever decided on the binary prefixes should have chosen something nicer, like kini/meni, though "ni" wouldn't be short for anything. I'm sure if I started talking about "mibs" people would have no clue, but I use MiB while chatting and it doesn't cause confusion.
Posted: Thu Aug 30, 2007 5:37 am
by inflater
Well, I really prefer the good ol' 30-year-old standard.
Of course, people's opinions vary.
Brynet-Inc wrote:I don't know about the rest of you.. but I dislike the way "kibi/mebi/gibi/tebi" sound!
My words!
Regards
inflater
Posted: Thu Aug 30, 2007 6:45 am
by Brynet-Inc
Tyler wrote:A byte is not 8 bits.
Since when? last time I checked it was...
- at least on systems that matter.
Posted: Thu Aug 30, 2007 10:48 am
by Smilediver
Brynet-Inc wrote:Tyler wrote:A byte is not 8 bits.
Since when? last time I checked it was...
- at least on systems that matter.
There are (were?) architectures where a byte has 16 bits. It's just like int - you can't be sure about its size.
Posted: Thu Aug 30, 2007 11:54 am
by Brynet-Inc
Trinka wrote:There are (were?) architectures where a byte has 16 bits. It's just like int - you can't be sure about its size.
Unfortunately, I've never encountered one yet.. do you know any such systems by name?
Posted: Thu Aug 30, 2007 1:02 pm
by Smilediver
Brynet-Inc wrote:Trinka wrote:There are (were?) architectures where a byte has 16 bits. It's just like int - you can't be sure about its size.
Unfortunately, I've never encountered one yet.. do you know any such systems by name?
Me neither... I just knew this from somewhere. A quick google found this:
byte: Traditionally, a byte is a sequence of 8 adjacent bits operated upon as a unit. However, the TMS320C2x/C2xx/C5x byte is 16 bits.
By ANSI C definition, the sizeof operator yields the number of bytes required to store an object. ANSI further stipulates that when sizeof is applied to char, the result is 1. Since the TMS320C2x/C2xx/C5x char is 16 bits (to make it separately addressable), a byte is also 16 bits. This can yield unexpected results; for example, sizeof(int) == 1 (not 2). TMS320C2x/C2xx/C5x bytes and words are equivalent (16 bits).
A byte is 32 bits for the TMS320C3x/C4x. On a parallel processor and the 'C6x, where the smallest addressable unit is 8 bits in length, the C definition corresponds to the traditional notion of an 8-bit byte.
Posted: Thu Aug 30, 2007 5:52 pm
by AndrewAPrice
Posted: Thu Aug 30, 2007 7:22 pm
by jerryleecooper
Note that since 1998 the IEC, and later the IEEE, have standardized binary prefixes to avoid consumer confusion between decimal and binary multiples:
I don't care about what they did, I will still call it MB, GB, KB, etc in my own OS.
Posted: Fri Aug 31, 2007 8:44 am
by JamesM
Tyler: In the olden days, some machines had 7-bit bytes, others 6. IIRC the PDP series had nonstandard byte sizes, but I can't remember exactly. I say nonstandard, but there wasn't really any standard at all until the IBM System/360.
Posted: Fri Aug 31, 2007 6:43 pm
by Tyler
Brynet-Inc wrote:Tyler wrote:A byte is not 8 bits.
Since when? last time I checked it was...
- at least on systems that matter.
It would be more appropriate to ask: since when have bytes been assumed to be 8 bits? Though it may be the current standard, I see no reason why it couldn't become 16 in 20 years, when Unicode is more than universal and we have the memory to waste.
It also depends on what you mean by byte. The data may be 8 bits, but even within the last decade (perhaps only because I tinker with far older computers) there were forms of RAM with parity bits that would be referred to as part of the byte in a non-software sense.
Another classic example is networking protocols, which aim for such universal compatibility that they refer to "bytes" as octets in order to pin down the size for existing architectures that do not conform.
A byte is the smallest addressable location in a processor's address space.
As an interesting side note, which most of you will already know: technically, "word" refers to the natural register size of a processor, not to the 2-byte chunks of data we usually associate it with.