cross-posted from: https://feddit.org/post/9780943

cross-posted from: https://europe.pub/post/53784

[–] the_riviera_kid@lemmy.world 0 points 6 days ago* (last edited 6 days ago) (3 children)

It's not, though; ARM themselves admit it: https://www.arm.com/glossary/risc

"With RISC, a central processing unit (CPU) implements the processor design principle of simplified instructions that can do less but can execute more rapidly."

None of this is to say RISC, or by extension ARM, is bad, just that given where everything currently stands it's not a good choice for everyday computing. By design it's as lightweight and simple as possible, so it can perform its specific function faster and more efficiently, with less overhead than a more general-purpose processor (rough sketch below).
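To make the "do less per instruction, but do it faster" point concrete, here's a hedged sketch: one C statement and roughly what a compiler turns it into on each architecture. The assembly in the comments is illustrative, not taken from the linked glossary, and actual codegen varies with compiler and flags.

```c
/* One C statement that reads, modifies, and writes memory. */
void add_to(int *p, int v) {
    *p += v;
}

/*
 * x86-64 (CISC-style): a single instruction can operate on memory directly,
 * roughly:
 *     add dword ptr [rdi], esi
 *
 * AArch64 (RISC): arithmetic only works on registers, so the same statement
 * becomes an explicit load / add / store sequence, roughly:
 *     ldr w8, [x0]
 *     add w8, w8, w1
 *     str w8, [x0]
 *
 * Each ARM instruction does less on its own, but is simpler to decode and
 * pipeline; that's the trade-off the ARM glossary quote above describes.
 */
```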

GeeksforGeeks has a good write-up on it.

https://www.geeksforgeeks.org/computer-organization-risc-and-cisc/

[–] lka1988@lemmy.dbzer0.com 2 points 5 days ago

Just to add: a rather large reason the technology we have today even exists is thanks in no small part to the x86 architecture and its immense backwards compatibility.

[–] oo1@lemmings.world 1 points 4 days ago

Laptops run off batteries a lot of the time, so trading some outright performance (and the full instruction set) for battery life will be attractive to many laptop users who work on the go.

I'm no Apple fanatic, and I'd never get one, but I do see the appeal of those Apple laptops.

I'm sure x86 could get closer on the performance-to-battery tradeoff if the vendors wanted to, but I bet they'd be looking to price up to the Apple level for that.

[–] brucethemoose@lemmy.world 4 points 6 days ago* (last edited 6 days ago)

I mean, that doesn’t mean much.

The Fujitsu A64FX had full 512-bit SVE, with 2x 512-bit units per core and HBM memory, which is about as "CISC" as it gets. IIRC it was the "widest" CPU of its time, the one that could get the most done per clock, and the US Department of Energy seemed to love them.

And then you have tiny cores, like Intel's in-order ones, that are way "thinner" than ARM designs.

The reality is that decoding doesn't take up much die space these days, and everything gets decoded into micro-ops anyway. The ISA has an effect, but efficiency and appropriateness for different platforms come down to design and business decisions more than the ISA.
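As a toy illustration of the "everything gets decoded into micro-ops" point: the sketch below cracks one CISC-style read-modify-write "macro-instruction" into three RISC-like micro-ops. All names here (uop, decode_add_mem_reg) are made up for the example, and real front ends are vastly more complex; this only shows the idea that both x86 and ARM decoders end up feeding a similar micro-op stream to the back end.

```c
#include <stdio.h>

/* Toy micro-op model; purely illustrative, not how any real decoder works. */
typedef enum { UOP_LOAD, UOP_ALU_ADD, UOP_STORE } uop_kind;

typedef struct {
    uop_kind kind;
    int      dst;  /* destination register number or memory address (toy) */
    int      src;  /* source register number or memory address (toy) */
} uop;

/* Crack a hypothetical "add [addr], reg" macro-instruction into micro-ops. */
static int decode_add_mem_reg(int addr, int reg, uop out[3]) {
    out[0] = (uop){ UOP_LOAD,    100, addr }; /* tmp reg 100 <- mem[addr]   */
    out[1] = (uop){ UOP_ALU_ADD, 100, reg  }; /* tmp reg 100 <- tmp + reg   */
    out[2] = (uop){ UOP_STORE,   addr, 100 }; /* mem[addr]   <- tmp reg 100 */
    return 3;
}

int main(void) {
    uop uops[3];
    int n = decode_add_mem_reg(0x40, 7, uops);
    for (int i = 0; i < n; i++)
        printf("uop %d: kind=%d dst=%d src=%d\n",
               i, uops[i].kind, uops[i].dst, uops[i].src);
    return 0;
}
```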