The absolute most important thing is the ability for end users to replace their batteries and displays. Storage expansion is somewhat moot by now thanks to cloud and NAS storage options coupled with 5G speeds.
Personally, I think there is absolutely no reason why, in something like the iMacs, the HD and RAM shouldn’t be user replaceable and upgradable.
They always used to be, until Jony Ive got his thinness bug.
If Apple went for it and introduced a new aesthetic where small visible screws became a symbol of care for the environment, they could probably push the industry in that direction.
Storage and RAM not being user upgradable is a sustainability nightmare.
Not having internal slots for storage and relying on USB or NAS instead is not an appropriate alternative for professionals, regardless of what Apple’s leadership claims professionals want.
We’ll never know, but RAM being part of the SoC package is probably contributing substantially to their performance advantage over the competition. The only real way to know would probably require being an engineer at Apple. I’d wager $3.50 that they’d take a substantial performance hit from switching to DIMMs, and that terrifies them, since it would further push everyone toward x86 workstations.
Perhaps. But they started removing upgradable RAM in the Intel era. It’s not a new thing that came with the M1.
Simple: user-replaceable RAM is too slow. Apple’s M-series SoCs combine the CPU and GPU, and both share the same memory. This has massive performance advantages, especially for GPU-compute tasks. Performance of GPU code depends heavily on memory bandwidth, and you cannot get high-bandwidth memory on a user-replaceable module; the memory chips have to sit physically close to the processor. This is also why there are no user-replaceable RAM modules on GPUs.
With GPU compute becoming more and more important, I expect the PC world to get rid of user-replaceable RAM and GPUs as well in the future.
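To make the bandwidth point concrete, here’s a rough CPU-side sketch of a STREAM-style triad benchmark (my own illustration, not anything from Apple; the array size and the GB/s arithmetic are just assumptions). The loop does almost no arithmetic per byte it moves, so its speed is set almost entirely by how fast memory can feed the cores, which is exactly the property that favours wide, on-package memory over socketed DIMMs for GPU-compute-style workloads.

```c
/* Rough sketch of a STREAM-style "triad" kernel: a[i] = b[i] + s * c[i].
 * Illustrative only (array size, timing method, and GB/s math are my
 * assumptions); it is not Apple's benchmark. The loop does ~2 flops per
 * 24 bytes moved, so throughput is limited by memory bandwidth, not CPU. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (32u * 1024 * 1024)   /* 32M doubles per array, ~256 MB each */

int main(void) {
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    double *c = malloc(N * sizeof *c);
    if (!a || !b || !c) return 1;

    for (size_t i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    const double s = 3.0;
    for (size_t i = 0; i < N; i++)
        a[i] = b[i] + s * c[i];
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double secs  = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    double bytes = 3.0 * N * sizeof(double);   /* read b, read c, write a */
    /* print a[0] so the compiler can't optimize the triad loop away */
    printf("triad: %.4f s, ~%.1f GB/s (a[0] = %.1f)\n",
           secs, bytes / secs / 1e9, a[0]);

    free(a); free(b); free(c);
    return 0;
}
```

Compile with something like `cc -O2 triad.c` and run it on different machines: the higher the sustained GB/s the memory system can deliver, the faster this kind of code goes, which is the whole argument for soldered, on-package memory.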
That doesn't really explain why they removed the ability in the Intel Macs. But that's very informative, thank you.