Herman Miller wrote:
> [log in to unmask] wrote:
> 
>> I'm not saying it's logical, or that it's halfway usable.  (I use them
>> like you said.)  But Wikipedia (http://en.wikipedia.org/wiki/MiB) says
>> otherwise.
> 
> 
> I should have said that in all the years I've been programming, I've
> _never_ heard anyone use "kilobyte" to mean 1000 bytes. Never. Maybe
> left-handed Lithuanians use it that way; I don't know. But the phony
> "confusion" issue is never a problem with "kilobyte" in practice (and
> never would have been an issue with "megabyte" if not for those *!$(#
> hard disk manufacturers).
> 
> Wikipedia is a great resource, but Wikipedia authors don't own the
> language. Vague claims about using "kilobyte" to mean 1000 bytes really
> ought to have some documentation behind them if they want to be believed
> (an example of actual usage from a published source). I'll continue to
> use "kilobyte" and "megabyte" to indicate 1,024 and 1,048,576 bytes
> respectively, which until very recently was the only correct usage, and
> ignore the pedantic prescriptivists. In some cases, language change
> really can be detrimental.

Yes, such as this one. "Byte" mightn't be an SI unit, but "kilo-" and 
"mega-" and so forth are common prefixes, and everyone knows they mean 
1000 and 1000*1000. The matter is confused not by "*!$(#" hard drive 
manufacturers (and I think that language is inappropriate here) who 
simply use the prefixes in their commonly accepted ways, but by computer 
scientists & programmers who took the prefixes and used them to mean
something they'd never meant before.

Powers of two are important in a binary system---1 048 576 is a much 
more useful and natural number in computing than is 1 000 000. If we 
want a word that refers to 1 048 576 bytes, the reasonable thing to do
is to coin a new one and avoid any possible confusion. It doesn't hurt
anyone to speak of kibibytes or mebibytes, and doing so is at least
unambiguous.
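
To put rough numbers on that, here's a quick Python sketch of my own
(nothing official about it) comparing the decimal and binary readings
of the prefixes:

    # Decimal (SI) prefixes against the binary (IEC) ones -- the point
    # being that 2**20 is the "round" number in a binary system, not 10**6.
    SI  = {"kB": 10**3,  "MB": 10**6,  "GB": 10**9}
    IEC = {"KiB": 2**10, "MiB": 2**20, "GiB": 2**30}

    print(SI["MB"])    # 1000000
    print(IEC["MiB"])  # 1048576 -- a power of two, natural in a binary system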

(BTW: When is "very recently"? It's been over six years since the IEC 
created the binary prefixes, and I doubt that the confusion started on 
that day; at any rate, six years is a long time, especially in 
computers. Wikipedia notes (albeit without sourcing) that using kilobyte 
and megabyte to mean 1000 bytes and 1 000 000 bytes "has a long 
engineering tradition, predating consumer complaints about the apparent 
discrepancy, which began to surface in the mid-1990s". That's an
ambiguous sentence: it could mean either that the complaints started
in the mid-1990s or that the tradition did. The next sentences clarify
that it's the former: "The decimal-based capacity in hard disk drives
follows the method used for serially accessed storage media which
predate direct
access storage media like hard disk drives. Paper punch cards could only 
be used in a serial fashion, like the magnetic tapes that followed. When 
a stream of data is stored, it's more logical to indicate how many 
thousands, millions or billions of bytes have been stored versus how 
many multiples of 1024, 1 048 576 or 1 073 741 824 bytes have been. When 
the first hard disk drives were being developed, the decimal measurement 
was only natural since the hard disk drive served essentially the same 
function as punch cards and tapes". It would seem therefore that it's
certainly *not* a recent phenomenon, and I do not understand how 
correctness can be defined only by *one* usage, when there are *two*
uses, both quite old, and one with etymology on its side.)
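
Incidentally, the consumer complaints mentioned above are easy to
reconstruct with a little arithmetic (the "500 GB" drive here is a
hypothetical example of mine, not a figure from the article):

    # A drive sold in decimal bytes looks "short" when the operating
    # system reports its capacity in binary gigabytes.
    advertised_bytes = 500 * 10**9          # hypothetical "500 GB" drive
    reported_gib = advertised_bytes / 2**30
    print(round(reported_gib, 1))           # about 465.7, hence the complaints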

--
Tristan.