submitted on 17 Nov 2023 by alessandro@lemmy.ca to c/pcgaming@lemmy.ca
[-] Norgur@kbin.social 116 points 7 months ago

Thing is: there is always the "next better thing" around the corner. That's what progress is about. The only thing you can do is choose the best available option for you when you need new hardware and be done with it until you need another upgrade.

[-] Sigmatics@lemmy.ca 84 points 7 months ago

Exactly. The best time to buy a graphics card is never

[-] wrath_of_grunge@kbin.social 19 points 7 months ago

Really, my rule of thumb has always been to upgrade when it's a significant upgrade.

For a long time I didn't really upgrade until it was a 4x increase over my old card. Certain exceptions were occasionally made. Nowadays I'm a bit more opportunistic in my upgrades, but I still seek out 'meaningful' upgrades: a decent jump over the old, typically a 50% improvement in performance, or upgrades I can get for really cheap.

[-] schmidtster@lemmy.world 12 points 7 months ago* (last edited 7 months ago)

4x…? Even in older cards that’s more than a decade between cards.

A 4080 is only 2.5x as powerful as a 1080 Ti, and those are 5 years apart.

[-] Sigmatics@lemmy.ca 11 points 7 months ago* (last edited 7 months ago)

What's wrong with upgrading once every 5-10 years? Not everyone plays the latest games on 4k Ultra

Admittedly 4x is a bit steep, more like 3-4x

[-] jmcs@discuss.tchncs.de 12 points 7 months ago* (last edited 7 months ago)

It depends on what you need. I think usually you can get the best bang for buck by buying the now previous generation when the new one is released.

[-] miketunes@lemmy.world 5 points 7 months ago

Yup, just picked up a whole PC with an RTX 3090 for $800.

[-] massive_bereavement@kbin.social 9 points 7 months ago

Graphics card. Not even once.

[-] AeroLemming@lemm.ee 24 points 7 months ago* (last edited 7 months ago)

You have a magical button. If you press it now, you will get $100 and it will disappear. Every year you don't press it, the amount of money you will get if you do press it goes up by 20%. When should you press the button? At any given point in time, waiting just one more year adds an entire 20% to your eventual prize, so it never makes sense to press it, but you have to eventually or you get nothing.

Same thing with graphics cards.

[-] Bizarroland@kbin.social 9 points 7 months ago

Is it compound or straight percentage?

Cuz if it's just straight percentage then it's $20 a year, whereas if it is compound then it's a 2X multiplier every three and a half years roughly.

[-] AeroLemming@lemm.ee 4 points 7 months ago

Compound, which more closely models the actual rate at which computing power has grown over the years.
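For the curious, here's a minimal sketch of the two growth models, assuming the 20% compounds annually; the dollar figures are just the thought experiment's numbers, not GPU prices:

```python
# Toy model of the "magical button": a $100 prize growing 20% per year.
# Straight percentage adds a flat $20 each year; compound growth
# multiplies the whole prize by 1.2 each year, so it doubles roughly
# every ln(2)/ln(1.2) ≈ 3.8 years.
import math

def prize(years: int, compound: bool = True) -> float:
    base, rate = 100.0, 0.20
    if compound:
        return base * (1 + rate) ** years
    return base * (1 + rate * years)

for y in (0, 5, 10, 20):
    print(f"year {y:2d}: compound ${prize(y):8.0f}   straight ${prize(y, compound=False):6.0f}")

print(f"compound doubling time: {math.log(2) / math.log(1.2):.1f} years")
```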

[-] SkyeStarfall@lemmy.blahaj.zone 6 points 7 months ago

Once you need it, or, alternatively, once you have enough to live comfortably for the rest of your life. It's exponential growth, you only get one chance, just gotta decide what your goal with the money actually is.

[-] Sigmatics@lemmy.ca 4 points 7 months ago

Press it before you retire

Same with graphics cards

[-] hydroel@lemmy.world 11 points 7 months ago

Yeah, it's always that: "I want to buy the new shiny thing! But it's expensive, so I'll wait a while for the price to come down." You wait a while, the price comes down, you buy the new shiny thing, and then the newest shiny thing comes out.

[-] Norgur@kbin.social 4 points 7 months ago

Yep. There will always be "just wait N months and there will be the bestest thing that beats the old bestest thing". You are guaranteed to get buyer's remorse when shopping for hardware. Just buy what best suits your needs and budget at the time you decide is the best time for you (or when your old component bites the dust), and then stop looking at developments on those components for at least a year. Just ignore any deals, new releases, whatever, and be happy with the component you bought.

[-] nik282000@lemmy.ca 6 points 7 months ago

I bought a 1080 for my last PC build, downloaded the driver installer, and ran the setup. There were ads in the setup for the 20 series, which had launched the day before. FML

[-] Norgur@kbin.social 9 points 7 months ago

Yep. I bought a 4080 just a few weeks ago. Now there are ads for the refresh all over... Thing is: your card didn't get any worse. You thought the card was a good value proposition for you when you bought it, and it hasn't lost any of that.

[-] Outtatime@sh.itjust.works 62 points 7 months ago

I'm so sick of Nvidia's bullshit. My next system will be AMD just out of spite. That goes for processors as well.

[-] CaptainEffort@sh.itjust.works 18 points 7 months ago

That’s exactly why I’ve been using AMD for the past 2 years. Fuck Nvidia

[-] kureta@lemmy.ml 17 points 7 months ago

only thing keeping me is CUDA and there's no replacement for it. I know AMD has I-forgot-what-it's-called but it is not a realistic option for many machine learning tasks.

[-] dojan@lemmy.world 14 points 7 months ago

I went with an AM5 and an Intel Arc GPU. Quite satisfied, the GPU is doing great and didn’t cost an arm and a leg.

[-] Nanomerce@lemmy.world 5 points 7 months ago

How is the stability in modern games? I know the drivers are way better now but more samples is always great.

[-] dojan@lemmy.world 6 points 7 months ago

Like, new releases? I don’t really play many new games.

Had Baldur’s Gate III crash once, and that’s the newest title I’ve played.

Other than that I play Final Fantasy XIV, Guild Wars 2, The Sims and Elden Ring, never had any issues.

[-] Vinny_93@lemmy.world 5 points 7 months ago

Considering the price of a 4070 vs the 7800XT, the 4070 makes a lot more sense where I live.

But yes, the way AMD keeps their software open to use (FSR, FreeSync) and puts DisplayPort 2.1 on their cards creates a lot of goodwill for me.

[-] Cagi@lemmy.ca 4 points 7 months ago* (last edited 7 months ago)

The only thing giving me pause about ATI cards is that their ray tracing is allegedly visibly worse. They say next gen will be much better, but we shall see. I love my current non-ray-tracing card, an RX 590, but she's getting a bit long in the tooth for some games.

[-] limitedduck@awful.systems 17 points 7 months ago

ATI

"Now that's a name I've not heard in a long time"

[-] Cagi@lemmy.ca 21 points 7 months ago

Not since, oh before most of Lemmy was born. I'm old enough to remember when Nvidia were the anti-monopoly good guys fighting the evil Voodoo stranglehold on the industry. You either die a hero or you live long enough to see yourself become the villain.

[-] PenguinTD@lemmy.ca 4 points 7 months ago

Yeah, that's pretty much why I stopped buying Nvidia after the GTX 1080. CUDA was bad in terms of their practices, but not that impactful, since OpenCL etc. could still be tuned to work with similar performance; it's just that software developers and researchers love free support/R&D/money to advance their goals. They're willing to be the minions, and I can't ask them not to take the free money. But RTX and then tensor cores are where I draw the line, since those patents and implementations do real harm to the computer graphics and AI research space, and I guess it's already a bit too late. We're seeing the results, and Nvidia is making bank on that advantage. They're essentially running the Intel playbook, just slightly differently: instead of buying up the OEM vendors, they "invest" in software developers and researchers so they use Nvidia's closed tech. Now everyone pays the premium to buy RTX/AI chips from Nvidia, and the capital boom from AI will make the gap hard for AMD to close. After all, R&D requires lots of money.

I have to admit I still tend to call them that, too. Oldtimers, I guess.

The first GPU I remember being excited to pop into my computer and run was a Matrox G400 Max. Damn I'm old.

[-] Cagi@lemmy.ca 7 points 7 months ago

I would have been so jealous. Being able to click "3d acceleration" felt so good when I finally upgraded. But I was 12, so my dad was in charge of pc parts. Luckily he was kind of techy, so we got there. Being able to run Jedi Knight: Dark Forces II with max settings is a day I'll never forget for some reason, lol.

[-] sederx@programming.dev 19 points 7 months ago

I saw a 4080 on Amazon for $1200, shit's crazy

[-] GarytheSnail@programming.dev 18 points 7 months ago

All three cards are rumored to come with the same memory configuration as their base models...

Sigh.

[-] Fungah@lemmy.world 8 points 7 months ago

Give us more fucking vram you dicks.

[-] Schmuppes@lemmy.world 17 points 7 months ago

Major refresh means what nowadays? 7 instead of 4 percent gains compared to the previous generation?

[-] NOT_RICK@lemmy.world 8 points 7 months ago

The article speculates a 5% gain for the 4080 Super but a 22% gain for the 4070 Super, which makes sense because the base 4070 was really disappointing compared to the 3070.

[-] massive_bereavement@kbin.social 5 points 7 months ago* (last edited 7 months ago)

For anything ML-related, having the additional memory is worth the investment, as it allows for larger models.

That said, at these prices it raises the question of whether it's more sensible to just throw money at GCP or AWS for their GPU node time.
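A rough way to frame that buy-vs-rent question is a simple break-even count of GPU-hours; every number below is a placeholder assumption, not a quote from any vendor:

```python
# Back-of-envelope: buy a high-VRAM card outright vs. rent cloud GPU time.
# All figures are assumed placeholders; substitute real prices before deciding.
CARD_PRICE = 1200.0      # assumed up-front cost of the card, USD
CLOUD_RATE = 1.50        # assumed cloud GPU cost per hour, USD
HOURS_PER_WEEK = 10.0    # assumed hours of training/experimentation per week

break_even_hours = CARD_PRICE / CLOUD_RATE
break_even_weeks = break_even_hours / HOURS_PER_WEEK

print(f"break-even after {break_even_hours:.0f} GPU-hours "
      f"(~{break_even_weeks:.0f} weeks at {HOURS_PER_WEEK:.0f} h/week)")
```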

[-] gnuplusmatt@reddthat.com 15 points 7 months ago* (last edited 7 months ago)

As a Linux gamer, this really wasn't on the cards anyway

[-] BCsven@lemmy.ca 4 points 7 months ago

AMD is the better choice, but my Nvidia card works great with Linux too. I'm on openSUSE, and Nvidia hosts their own openSUSE drivers, so it works from the get-go once you add the Nvidia repo.

[-] RizzRustbolt@lemmy.world 12 points 7 months ago

freezes

stands there with my credit card in my hand while the cashier stares at me awkwardly

[-] joneskind@beehaw.org 8 points 7 months ago

It really is a risky bet to make.

I doubt a full-price RTX 4080 SUPER upgrade will be worth it over a discounted regular RTX 4080.

SUPER upgrades have never crossed +10%.

I’d rather wait for the Ti version
