this post was submitted on 02 Aug 2025
138 points (92.1% liked)


Despite the rapid pace of GPU evolution and the hype around AI hardware, Linus Torvalds — the father of Linux — is still using a 2017-era AMD Radeon RX 580 as his main desktop GPU here in 2025. The Polaris-based graphics card may be almost a decade old, but it has aged remarkably well in Linux circles thanks to robust and mature open-source driver support. Torvalds' continued use of the RX 580, therefore, isn't just boomer nostalgia. It's a statement of practicality, long-term support, and his disdain for unnecessary complexity.

Spotted by Phoronix, this revelation came during a bug report about AMD's Display Stream Compression (DSC) code causing black-screen issues in Linux 6.17. Torvalds bisected the regression himself and eventually reverted the offending patch to keep kernel development moving. Ironically, DSC is what allows his Radeon RX 580 to comfortably drive his modern 5K ASUS ProArt monitor, a testament to how far open-source drivers have come.
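For context on what that bisect step involves: bisecting a regression is essentially a binary search over the commit range between a known-good and a known-bad kernel, rebuilding and testing at each midpoint until the first bad commit is isolated (this is what `git bisect` automates). The sketch below illustrates the idea in Python; the commit list and the `exhibits_black_screen` check are hypothetical placeholders, not the actual kernel history or the test Torvalds ran.

```python
# Rough sketch of the idea behind bisecting a regression: a binary search over
# an ordered commit history between a known-good and a known-bad build.
# The commit list and the test function below are hypothetical placeholders.

def bisect_first_bad(commits, is_bad):
    """Return the first commit in `commits` (ordered oldest to newest) for
    which is_bad(commit) is True.
    Assumes commits[0] is good and commits[-1] is bad."""
    lo, hi = 0, len(commits) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(commits[mid]):
            hi = mid        # regression was introduced at mid or earlier
        else:
            lo = mid + 1    # regression was introduced after mid
    return commits[lo]

if __name__ == "__main__":
    # Toy history where the (hypothetical) black-screen regression lands at "e".
    history = ["a", "b", "c", "d", "e", "f", "g", "h"]
    exhibits_black_screen = lambda commit: commit >= "e"
    print(bisect_first_bad(history, exhibits_black_screen))  # -> e
```

In the real workflow each test step means rebuilding and booting the kernel at that commit, which is why tracking down and reverting the culprit yourself is no small effort.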

“... same old boring Radeon RX 580,” Torvalds wrote in an email to the Linux Kernel Mailing List (LKML), reverting the patch for now so development can continue uninterrupted. That one line from the man himself speaks volumes about his preference for stability over novelty.

all 47 comments
[–] olosta@lemmy.world 60 points 3 weeks ago (1 children)

The reasons to upgrade from this GPU are to run AAA games from the last three years, do AI/ML or creative work (3D, video...), or to save a few watt-hours. If you don't care about any of that, upgrading is just wasteful.

[–] LodeMike 34 points 3 weeks ago

I remember watching an LTT video about Torvalds' setup and they mentioned he said something like "I don't game, this [the 580] is overkill"

And it probably still is overkill.

[–] BananaTrifleViolin@lemmy.world 46 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

This is tech writers thinking everyone lives like them. An 8-year-old graphics card is fine if you're not doing high-end gaming or video editing. That card will still run a 4K desktop, and probably multi-screen 4K desktops, without any issue.

For most users, graphics cards have long been at a level where they don't need upgrading. A mid-range graphics card from even 10 years ago is more than powerful enough to watch video or use desktop programs, and is even fine for a wide range of games.

It's only if you want high-end 3D gaming that upgrading is needed, and arguably even that has passed the point of diminishing returns in the last 5 years for the majority of users and titles.

I do game a fair bit and my RTX 3070, which is 5 years old, really doesn't need upgrading. Admittedly it was higher end when it launched, but it still plays a game like Cyberpunk 2077 at high settings. It's arguable whether most users would even notice the difference from the "ultra" settings in most games, let alone actually need them. New cards are certainly very powerful, but the fidelity jump for the price and power just isn't there in the way it would have been when upgrading a card even 10 years ago.

[–] iAmTheTot@sh.itjust.works 2 points 3 weeks ago (1 children)

I do game a fair bit and my RTX 3070, which is 5 years old, really doesn't need upgrading.

Resolution and frame rate? Because at 4k mine was struuuuuggling, I had to get more VRAM.

The 30 series really suffers from its lack of VRAM. Amazing GPUs, just not enough VRAM to keep them fed.

[–] TheV2@programming.dev 0 points 3 weeks ago

I played Cyberpunk 2077 at the lowest settings with a GTX 1060. If realistic graphics can be a feature of a game, not caring about realistic graphics is a feature of mine.

[–] iAmTheTot@sh.itjust.works 27 points 3 weeks ago

I'm going to hazard a guess that he's not doing a lot of high resolution, high refresh gaming on it.

[–] socialsecurity@piefed.social 18 points 3 weeks ago (1 children)

A lot of millennial gamers are going this route since endless upgrading does not yield much improvement and of course, fuck nvidia.

Hardware should be used until it either does not do the job required or it breaks outright.

[–] douglasg14b@programming.dev 5 points 3 weeks ago

This is how I've always used hardware. Y'all out here buying up new parts each year they release?!?

It's like iPhone crowd energy, but for PC parts I suppose.

[–] owiseedoubleyou@lemmy.ml 18 points 3 weeks ago (2 children)

Damn an RX 580 is now considered "outdated"?

[–] JATtho@lemmy.world 15 points 3 weeks ago (1 children)

It has been a GOAT-tier supported GPU on Linux. Still plays Doom (2016)-era games at nearly max settings. Since last year I have started calling it a potato that never rots.

[–] guynamedzero@lemmy.dbzer0.com 3 points 3 weeks ago

Ohhhhhh I might have to steal that phrase

[–] RisingSwell@lemmy.dbzer0.com 7 points 3 weeks ago (2 children)

Is that really surprising? It's old as shit as far as computer hardware goes.

[–] CallMeAnAI@lemmy.world 2 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

No, it's ml jackasses being pedantic and starting shit as usual.

[–] Sibbo@sopuli.xyz 15 points 3 weeks ago
[–] CallMeAnAI@lemmy.world 13 points 3 weeks ago

Linus doesn't game?!!!??!?!!!!? Holy fuck, let's get channel 12 up in here to figure out what's going on in Linus' house.

[–] mindbleach@sh.itjust.works 11 points 3 weeks ago (2 children)

I have the same GPU and Blender still doesn't give a fuck.

[–] Lembot_0004@discuss.online 6 points 3 weeks ago

You: Please, draw a triangle.
Blender: I don't give a fuck what you want -- I won't draw anything with this GPU!

(Safety :) here before someone starts seriously explaining that the 580 is totally enough for Blender)

[–] ExtremeDullard@lemmy.sdf.org 5 points 3 weeks ago* (last edited 3 weeks ago)

That's pretty much my experience with Blender: the Blender release cycle seems to be hell-bent on shutting out everybody who doesn't have the latest GPU-du-jour. i.e. if you don't have infinite resources to throw at the latest compute-cum-space-heater device, you're permanently stuck with a late version 2 or 3.

[–] who@feddit.org 8 points 3 weeks ago* (last edited 3 weeks ago) (2 children)

isn’t just boomer nostalgia.

Of course it isn't, Mr. Nasir. Linus is not a boomer.

[–] ExtremeDullard@lemmy.sdf.org 6 points 3 weeks ago* (last edited 3 weeks ago)

"Boomer" has become a agist term used indiscriminately by younger generations to refer to people they perceive as old. My generation said "Pop" or "Grandpa",

Just like "Hacker" used to be something to be proud of and now means anyone with or without skills up to no good with computers, and just like "Beg the question" has nothing to do with supplication, this is simply the English language shifting in real time right in front of your eyes.

[–] AllNewTypeFace@leminal.space 0 points 3 weeks ago

“boomer” now just means “no-longer-young person”, and we’re a decade or two away from millennials being boomers.

[–] vext01@lemmy.sdf.org 7 points 3 weeks ago

Me using onboard graphics....

[–] matelt@feddit.uk 6 points 3 weeks ago

Nice, I don't feel so bad about my RX 560 now

[–] Rentlar@lemmy.ca 5 points 3 weeks ago

That's pretty RAD-eon.

[–] Sidyctism2@discuss.tchncs.de 3 points 3 weeks ago
[–] Blackmist@feddit.uk 3 points 3 weeks ago

Good for him. My missus had one and it died pretty quickly due to RAM failure.

One of many reasons I'd never buy a laptop with soldered RAM.

[–] thatradomguy@lemmy.world 2 points 3 weeks ago

Linus ditching Apple? Now that's a cold day in hell.