this post was submitted on 27 Dec 2023
148 points (70.0% liked)
Technology
I was confused when I just read the headline. It should be "Why I (that would be you, not me) think a kilobyte should be 1000 instead of 1024". Unpopular opinion would be a better sub for it.
You should read the blog post. It's not a matter of opinion.
It totally is a matter of opinion. These are arbitrary rules, made up by us. We can make up whatever rules we want to.
I agree that it's weird that only in CS kilo means 1024. It would be logical to change that, to keep consistency across different fields of science. But that does not make it any less a matter of opinion.
You can't store data in base 10, nor address memory or storage in base 10, on present computers. It's a bit more than a matter of opinion that computers are base 2.
Yes, computers are base 2, but we can still make up whatever rules we want about them. We could even make up rules that say we are to consider everything a computer does to be in base 10, but that it may only use the lowest 2 values of any given digit. It would be a total mess and make no sense whatsoever, but we could define those rules.
Just because you wrote about a topic doesn't mean you're suddenly the authority figure lol.
I know there is no opinion involved, as 1024 is what the standard is now. I'm not reading that any more than I'd read someone explaining how a red light really means go.
1024 is not the standard. The standard term for 1024 is "kibi" or "Ki" and the standard term for 1000 is "kilo" and has been since the year 1795.
There was a convention to use kilo for 1024 in the early days of computing, since the "kibi" term didn't exist until 1998 (and took a while to become commonly used) — but that convention was always recognised as an incorrect use of the term. People just didn't care much, especially since kilobytes were commonly rounded anyway. A 30,424-byte file is 29.7109375 kibibytes or 30.424 kilobytes... both will likely be rounded to 30 either way, so who cares if it's slightly wrong? Just use bytes if you need to know the exact size.
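The rounding point above is easy to check with a quick sketch (Python; the helper names are mine, not from any standard library):

```python
def to_kibibytes(n_bytes: int) -> float:
    """Binary prefix: 1 KiB = 1024 bytes."""
    return n_bytes / 1024

def to_kilobytes(n_bytes: int) -> float:
    """SI prefix: 1 kB = 1000 bytes."""
    return n_bytes / 1000

size = 30424  # the file size from the example above
print(to_kibibytes(size))  # 29.7109375
print(to_kilobytes(size))  # 30.424
# Rounded to whole units, the two conventions agree at this scale:
print(round(to_kibibytes(size)), round(to_kilobytes(size)))  # 30 30
```

At kilobyte scale the two conventions differ by only ~2.4%, which is why the sloppiness rarely mattered in practice.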
Also - hard drives, floppy disks, etc. have always referred to their size in base 1000 numbers, so if you were working with 30KB in the early days of computers it was very rarely RAM. A PDP-11 computer, for example, might have only had 8192 bytes of RAM (that's 8 kibibytes).
There are some places where the convention is still used, and it gets more misleading as the numbers grow. For example, 128 gigs equals 128,000,000,000 bytes (if using the correct 1000-based unit) or 137,438,953,472 bytes (if kilo/mega/giga = 1024).
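The gap at giga scale can be verified the same way. A minimal sketch of the arithmetic above:

```python
# The mismatch between SI (1000-based) and binary (1024-based) prefixes
# compounds with each step up: ~2.4% at kilo, ~7.4% at giga.
GIGA = 1000 ** 3  # SI gigabyte
GIBI = 1024 ** 3  # binary gibibyte

print(128 * GIGA)            # 128000000000
print(128 * GIBI)            # 137438953472
print(128 * (GIBI - GIGA))   # 9438953472 bytes of ambiguity
```

That's over 9 GB of difference depending on which convention a vendor uses, which is exactly why the distinction stops being pedantry at these sizes.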
The "wrong" convention is commonly still used for RAM chips. So a 128GB RAM chip is significantly larger than a 128GB SSD.
I've never met anyone who actually uses the new prefixes for 1024 and the old prefixes to mean 1000.
That is not true. For a long time everything (computer related) was in the base 2 variants. Then the HD manufacturers changed so their drives would appear larger than they actually were (according to everyone's notions of what KB/MB/GB meant). It was a marketing shrinkflation stunt.