this post was submitted on 21 Feb 2024
313 points (94.8% liked)
Technology
There is nothing to keep you from using factors of 1024 (except the slightly ludicrous prefixes "kibi" and "mebi"), but outside of low-level stuff like disk sectors or BIOS code, where you might want to use bit logic instead of division, it's rather rare. I too started in the time when a division op was more costly than bit-level logic. A rough sketch of what I mean is below.
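For anyone who hasn't written this kind of code, here's a minimal C sketch of the bit-logic idiom (the constants, variable names, and 512-byte sector size are my own illustration, not from any particular codebase). Since 1024 and 512 are powers of two, division and modulo become shifts and masks:

```c
#include <stdio.h>
#include <stdint.h>

int main(void) {
    uint64_t bytes = 5242880;        /* 5 * 1024 * 1024, an example value */

    /* Division by powers of 1024 expressed as right shifts: */
    uint64_t kib = bytes >> 10;      /* same as bytes / 1024    */
    uint64_t mib = bytes >> 20;      /* same as bytes / 1048576 */

    /* Sector math with shifts and masks (assuming 512-byte sectors): */
    uint64_t sector = bytes >> 9;    /* same as bytes / 512 */
    uint64_t offset = bytes & 0x1FF; /* same as bytes % 512 */

    printf("%llu B = %llu KiB = %llu MiB (sector %llu, offset %llu)\n",
           (unsigned long long)bytes, (unsigned long long)kib,
           (unsigned long long)mib, (unsigned long long)sector,
           (unsigned long long)offset);
    return 0;
}
```

On old CPUs without fast dividers, the shift versions were meaningfully cheaper; today a compiler will emit the same shift for a constant power-of-two divisor anyway.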
I'd argue that any user-facing application is better off with base 1000, except where convention dictates otherwise. The majority of users don't know, don't care, and don't need to care what bits or bytes are. It's programmers who like the beauty of the bit logic, not users. @mb_@lemm.ee
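To make the user-facing case concrete, here's a minimal sketch of a decimal (base-1000) size formatter, which is what that argument amounts to in code. The function name and unit strings are illustrative, not from any real library:

```c
#include <stdio.h>

/* Format a byte count with decimal SI prefixes (factor 1000). */
static void format_size(unsigned long long bytes, char *out, size_t n) {
    const char *units[] = { "B", "kB", "MB", "GB", "TB" };
    double v = (double)bytes;
    int i = 0;
    while (v >= 1000.0 && i < 4) {
        v /= 1000.0;
        i++;
    }
    snprintf(out, n, "%.1f %s", v, units[i]);
}

int main(void) {
    char buf[32];
    format_size(1536000ULL, buf, sizeof buf);
    printf("%s\n", buf);   /* prints "1.5 MB" */
    return 0;
}
```

Swapping the 1000s for 1024s (and "kB" for "KiB", etc.) gives the binary variant; the whole debate is which of the two a user should see.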
I agree with what you said, and it's IMO why the discussion of a factor of 1000 vs. 1024 will always rage on. I'm a developer and do embedded stuff in my free time. Everything around me is factor 1024 because of it, and I hate the factor 1000. But from a generic user's standpoint, I agree it's a lot more user-friendly, as they are used to the metric system's factors of 10.
It is user-friendly, and technically incorrect, since nothing ever lines up with reality when you use 1000, because the underlying system is base 2.
Or you get the weird nonsense all around, like "my computer has 18.8 GB of memory"…
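That mismatch is easy to reproduce: the same byte count comes out as two different numbers depending on whether the formatter divides by 1000 or 1024. A minimal C sketch, using a hypothetical "20 GB" drive as the example (the helper names are mine):

```c
#include <stdio.h>

static double as_gb(unsigned long long bytes)  { return bytes / 1e9; }
static double as_gib(unsigned long long bytes) { return bytes / 1073741824.0; } /* 1024^3 */

int main(void) {
    /* A drive marketed as "20 GB" (decimal gigabytes): */
    unsigned long long drive = 20000000000ULL;
    printf("marketed: %.1f GB, binary view: %.1f GiB\n",
           as_gb(drive), as_gib(drive));   /* 20.0 GB vs 18.6 GiB */
    return 0;
}
```

When a tool divides by 1024 but still prints "GB", you get exactly those odd 18.x figures for round decimal sizes.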