What is the origin of K = 1024?
Rise to the top 3% as a developer or hire one of them at Toptal: https://topt.al/25cXVn
--------------------------------------------------
Music by Eric Matyas
https://www.soundimage.org
Track title: Ancient Construction
--
Chapters
00:00 What Is The Origin Of K = 1024?
00:42 Accepted Answer Score 63
01:02 Answer 2 Score 27
02:01 Answer 3 Score 8
02:30 Answer 4 Score 3
03:02 Answer 5 Score 1
03:41 Thank you
--
Full question
https://superuser.com/questions/287375/w...
--
Content licensed under CC BY-SA
https://meta.stackexchange.com/help/lice...
--
Tags
#storage #history
#avk47
ACCEPTED ANSWER
Score 63
It goes back quite some time, and is detailed here. It looks like you can blame IBM, if anybody.
Having thought about it some more, I would blame the Americans as a whole, for their blatant disregard for the Système international d'unités :P
ANSWER 2
Score 27
All computing was low-level in the beginning, and in low-level programming the number "1000" is practically useless. Programmers needed prefixes for larger amounts, so they reused the SI ones. Everyone in the field knew what they meant; there was no confusion. It served well for thirty years or more.
It is not that they were Americans who needed to break SI at all costs. :-)
No programmer I know says kibibyte. They say kilobyte, and they mean 1024 bytes. Algorithms are full of powers of 2 (a small sketch follows below). Even today, "1000" is a fairly useless number among programmers.
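A minimal sketch of why powers of two fall out naturally in low-level code: a power-of-two size can be built with a shift and rounded to with a single mask, while 1000 has no such structure. The KB and MB macro names here are illustrative, not from any standard header.

```c
#include <stdio.h>

/* Powers of two are a single shift away; 1000 has no such form. */
#define KB (1UL << 10)   /* 1024    */
#define MB (1UL << 20)   /* 1048576 */

int main(void) {
    unsigned long addr = 123456;

    /* Round an address down to the start of its 1 KB block:
       mask off the low 10 bits -- one AND instruction. */
    unsigned long block_start = addr & ~(KB - 1);

    /* The same rounding with 1000 needs a divide and a multiply. */
    unsigned long dec_start = (addr / 1000) * 1000;

    printf("%lu -> %lu (binary), %lu (decimal)\n",
           addr, block_start, dec_start);
    return 0;
}
```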
Saying kibi and mebi just sounds too funny and distracts from the subject. We happily give those terms away to the telecommunications and disk storage sectors :-). I will, however, write kibibyte in user interfaces where non-programmers may read it.
ANSWER 3
Score 8
It is correct and makes sense for technical people to use 1024 = 1K in many cases.
For end users it is normally better to say 1000 = 1k, because everybody is used to the base-10 number system.
The problem is deciding where to draw the line. Sometimes marketing or advertising people do not quite succeed in "translating" technical data and language into terms end users understand.
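Where to draw that line is exactly what a size formatter has to decide. A hedged sketch of showing both conventions side by side; the function name format_size is made up for illustration:

```c
#include <stdio.h>

/* Render a byte count both ways: decimal (SI) for end users,
   binary (IEC) for technical readers. */
static void format_size(unsigned long long bytes) {
    printf("%llu bytes = %.2f GB (SI) = %.2f GiB (IEC)\n",
           bytes,
           bytes / 1e9,                    /* 10^9 bytes per GB  */
           bytes / (double)(1ULL << 30));  /* 2^30 bytes per GiB */
}

int main(void) {
    /* A drive sold as "500 GB": exactly 500 decimal gigabytes,
       but only about 465.66 binary gibibytes. */
    format_size(500000000000ULL);
    return 0;
}
```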
ANSWER 4
Score 3
Blame semiconductor manufacturers (they provide us with binary hardware only).[1]
Better yet: blame logic itself (binary logic is just the most elementary logic).
Better yet: who shall we blame for the wretched decimal system?
It has far more flaws than the binary system. It was based *cough* on the average number of fingers in the human species *cough*.
Oooo...
[1] I want my quantum three-qubit computer!!! Now!
ANSWER 5
Score 1
1024 is not to be blamed; it is a very good thing indeed, as it is the reason digital computers can be as fast and as efficient as they are today. Because a computer uses only two values (0 and 1), it takes the hardship and complexity (and inaccuracy) of analog systems out of the equation.
It would be more complicated if we said a kilobyte is 1000 bytes, because 2 to what power gives 1000? Even 1 kilobyte would be inexact, since it would involve a fractional power or an approximation (see the sketch below).
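A small sketch of that point: a power of two has exactly one bit set, so n & (n - 1) is zero for 1024 but not for 1000, and log2(1000) is not an integer. Compile with -lm for the math library.

```c
#include <stdio.h>
#include <math.h>

/* A power of two has exactly one bit set, so n & (n-1) == 0. */
static int is_power_of_two(unsigned n) {
    return n != 0 && (n & (n - 1)) == 0;
}

int main(void) {
    printf("1024: power of two? %d, log2 = %.4f\n",
           is_power_of_two(1024), log2(1024.0));  /* 1, 10.0000 */
    printf("1000: power of two? %d, log2 = %.4f\n",
           is_power_of_two(1000), log2(1000.0));  /* 0,  9.9658 */
    return 0;
}
```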
But I largely blame marketing for selling an "8 gigabyte*" drive and adding this in the small print:
* 1 gigabyte is 1,000,000,000 bytes.
It is a shame, really. It is the same with connection speed: your ISP will say 1.5 Mbps instead of telling you that is roughly 190 kilobytes per second. It is just very misleading.
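The line-speed conversion behind that number, as a minimal sketch (the helper name mbps_to_kBps is made up): divide bits per second by 8 to get bytes per second.

```c
#include <stdio.h>

/* Convert an advertised line rate in megabits per second to
   kilobytes per second (8 bits per byte, SI prefixes). */
static double mbps_to_kBps(double mbps) {
    return mbps * 1000.0 / 8.0;   /* 1 Mbps = 1,000,000 bits/s */
}

int main(void) {
    /* "1.5 Mbps" is 187.5 kB/s before protocol overhead. */
    printf("1.5 Mbps = %.1f kB/s\n", mbps_to_kBps(1.5));
    return 0;
}
```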