If you're not aware, it was called MB because of JEDEC, back before the IEC units were invented. The IEC units were introduced to remove the double meaning of the JEDEC units (decimal and binary). IEC units carry only the binary meaning, which is why they're superior. If you're converting 1000 kB into 1 MB, use MB; if you're converting 1024 KiB into 1 MiB, you should be using MiB. It's all about getting the point across, and JEDEC units aren't good at it.
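To make the double meaning concrete, here's a minimal sketch in Python (the byte count is just an arbitrary example):

```python
size = 5_000_000_000  # an arbitrary raw byte count

# SI / decimal meaning: 1 MB = 10**6 bytes
print(size / 10**6, "MB")   # 5000.0 MB

# IEC / binary meaning: 1 MiB = 2**20 bytes
print(size / 2**20, "MiB")  # ~4768.37 MiB

# JEDEC "MB" means the binary value but wears the decimal label,
# which is exactly the ambiguity the IEC prefixes remove.
```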
I'm failing to understand why we would need decimal units at all. What's the point of them? And why do the original units have to change name to something as ridiculous as "gibibyte" while the unnecessary decimal units get the binary units' old names?
You poor innocent soul... I can try to explain why decimal is even mentioned, but it would probably take a lot of time, and I'm not sure I'd be able to clear things up.
I can at least say this: a 2 TB HDD is indeed 2*10^12 B, but suddenly shindow$ File Explorer will show you that the drive is in fact only 1.82 TB. But WHY, everyone asks, feeling scammed. Because the HDD spec uses decimal units (SI; MB) while Window$ uses binary units (JEDEC; MB), i.e., it's really 1.82 TiB (IEC; MiB). And macOS also uses JEDEC units, AFAIK.
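The arithmetic behind that 1.82, as a quick sketch (assuming the advertised 2 TB capacity):

```python
advertised = 2 * 10**12  # 2 TB as the HDD spec means it (SI, powers of 1000)

# What File Explorer effectively does: divide by a binary unit
# (1 TiB = 1024**4 bytes) while still printing the label "TB".
print(f"{advertised / 1024**4:.2f} TB")  # 1.82 TB
```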
More and more FOSS software uses IEC units, and KDE Plasma is a good example: its file manager, package manager, etc. all use IEC units. Simply put, JEDEC added the binary meaning to the decimal units; MB originally carried only the decimal meaning (until JEDEC shit out their standard). And the only reason "gibibyte" sounds ridiculous is that we all grew up with the JEDEC interpretation of the SI units. It will take many generations for everyone to adopt the xxbibyte words into daily conversation. I'm doing my part. It's just the legacy that we have to deal with.
All international bodies (BIPM, NIST, EU) agree that the SI prefixes "refer strictly to powers of 10" and that the binary definitions "should not be used" for them.
https://en.wikipedia.org/wiki/Binary_prefix#IEC_1999_Standard
https://en.wikipedia.org/wiki/Binary_prefix#Other_standards_bodies_and_organizations
https://en.wikipedia.org/wiki/JEDEC_memory_standards#JEDEC_Standard_100B.01
Well, thank you for taking the time to write this detailed explanation!
Windows and MacOS use the abbreviation "MB" referring to the binary units, correct? How come these big OS's use a different unit than the one these large international bodies recognize?
On a side note, I've always found it weird that HDDs and SSDs are/were sold as 128 GB, 256 GB, 512 GB, etc. when those numbers refer to decimal units.
Yez. I'm only sure about the first one; I didn't test myself whether macOS uses powers of 2 or 10 under the hood (of MB). You can open the properties of something big, try converting the raw number of bytes with `/1024^n` and `/1000^n`, and compare the end results (see the sketch below).

Legacy, legacy everywhere (IMO). And of course they don't want to confuse their precious users who don't know any better. It would also break some scripts that rely on that specific output. The GNU C library also uses JEDEC units by default, hence flatpak and other software do too.
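A minimal sketch of that test (the file path is a placeholder; point it at anything big and compare the output against the properties dialog):

```python
import os

raw = os.path.getsize("big_file.bin")  # placeholder path

for n, (si, iec) in enumerate([("kB", "KiB"), ("MB", "MiB"),
                               ("GB", "GiB"), ("TB", "TiB")], start=1):
    print(f"{raw / 1000**n:12.2f} {si}    {raw / 1024**n:12.2f} {iec}")

# Whichever column matches what the OS displays tells you whether
# it divides by powers of 1000 (SI) or powers of 1024 (JEDEC/IEC).
```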
It is weird for everyone, because we mostly count in powers of 2 when it comes to digital sizes. I didn't investigate why they use powers of 10, but I've seen that some other hardware also uses decimal units (I think at least RAM, while JEDEC units are used, intentionally or not, for CPU cache memory). I had a link where the RAM thing was lightly addressed, but I couldn't find it.
P.S. it's "OSes" and "macOS" BTW.