This post was submitted on 21 Jan 2024
323 points (97.1% liked)


Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

[–] Rakonat@lemmy.world 95 points 10 months ago (2 children)

Nvidia is overpricing their cards and limiting stock, acting like there's still a GPU shortage from all the crypto bros sucking everything up.

Right now, their competitors are beating them at hundreds of dollars below Nvidia's MSRP like for like, with the only true advantages Nvidia has being ray tracing and arguably VR.

It's possible we're approaching another shortage with the AI bubble, though for the moment that seems to be pretty far off.

TL;DR: Nvidia is trying to sell a card at twice its value because of greed.

[–] Evilcoleslaw@lemmy.world 35 points 10 months ago* (last edited 10 months ago) (3 children)

They're beating AMD at ray tracing, upsampling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership with Twitch and OBS, which will bring higher-quality transcoding and easier setup only for Nvidia for now, and soon AV1 encoding, also Nvidia-only (at first, anyway).

The raw performance is mostly there for AMD, with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit and using the software ecosystem to entrench themselves despite the insane pricing.

[–] mihies@kbin.social 6 points 10 months ago (1 children)

And they beat AMD in efficiency! I'm (not) surprised that people ignore this important aspect, which matters for noise, heat, and power usage.

[–] MonkderZweite@feddit.ch 20 points 10 months ago (2 children)

Tom's Hardware did a test; the RX 6800 is the leader there. The next card, the RTX 3070, is 4.3% worse. Are their newer cards more efficient than AMD's newer cards?

[–] pycorax@lemmy.world 5 points 10 months ago (1 children)

They seem to be, but honestly, this generation hasn't been very impressive for either team green or team red. I got a 6950 XT last year, and seeing all these new releases has only proven that I made a good investment.

[–] Daveyborn@lemmy.world 2 points 10 months ago

Nothing compelling enough for me to hop off of a Titan Xp yet. (Bought a Titan because it was cheaper than a 1070 at the time, thanks to scalpers.)

[–] Crashumbc@lemmy.world 2 points 10 months ago

For the 30 series, maybe.

On 40-series power usage, Nvidia destroys AMD.

The 4070 uses WAY less than a 3070... it's 200W (220W for the Super), which is barely more than my 1070's 170W.

[–] umbrella@lemmy.ml 4 points 10 months ago

Streaming performance is really good on AMD cards, IME. Upscaling is honestly close and getting closer.

I don't think better RT performance is worth the big premium or the annoyances Nvidia cards bring. Doubly so on Linux.

[–] altima_neo@lemmy.zip 0 points 10 months ago (1 children)

And AI. They're beating the pants off AMD at AI.

[–] Evilcoleslaw@lemmy.world 2 points 10 months ago* (last edited 10 months ago)

True enough. I was thinking more of the gaming use case. But even beyond AI, for general compute workloads they're beating the pants off AMD with CUDA as well.

[–] genie@lemmy.world 16 points 10 months ago

Couldn't agree more! Abstracting to the general economic case: those hundreds of dollars are a double-digit percentage of the overall cost! A double-digit % cost increase for a single-digit % performance gain doesn't quite add up, @nvidia :)
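
As a toy illustration (the prices and performance numbers below are invented, not from any real benchmark), the perf-per-dollar math looks something like this:

```cpp
#include <iostream>

// Purely illustrative numbers -- hypothetical cards, not real benchmarks.
int main() {
    double price_a = 500.0, perf_a = 100.0;  // baseline card
    double price_b = 800.0, perf_b = 108.0;  // +60% price for +8% performance

    // Normalized performance per $100 spent
    std::cout << "A: " << perf_a / price_a * 100.0 << " perf per $100\n";  // 20.0
    std::cout << "B: " << perf_b / price_b * 100.0 << " perf per $100\n";  // 13.5
}
```

The pricier card tops the benchmark chart but loses badly on value.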

Especially with Google going with TPUs for their AI monstrosities, it makes less and less sense at large scale for consumers to pay the Nvidia tax just for CUDA compatibility, particularly with the entrance of things like SYCL that help programmers avoid vendor lock-in.
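
For the curious, here's a minimal SYCL sketch (a hypothetical vector add, not anything from the thread) showing the appeal: the same standard C++ source can target Nvidia, AMD, or Intel devices depending on the backend, with no CUDA-specific code:

```cpp
#include <sycl/sycl.hpp>
#include <iostream>

// Hypothetical vector add: one SYCL source, any supported device.
int main() {
    sycl::queue q;  // default selector picks whatever device is available
    constexpr size_t n = 1024;

    // Unified shared memory, accessible from both host and device
    float *a = sycl::malloc_shared<float>(n, q);
    float *b = sycl::malloc_shared<float>(n, q);
    for (size_t i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Offload the addition to the selected device
    q.parallel_for(sycl::range<1>{n}, [=](sycl::id<1> i) {
        a[i] += b[i];
    }).wait();

    std::cout << "a[0] = " << a[0] << " on "
              << q.get_device().get_info<sycl::info::device::name>() << "\n";
    sycl::free(a, q);
    sycl::free(b, q);
}
```

Toolchains like DPC++ or AdaptiveCpp can compile the same file for different GPU backends, which is the escape hatch from vendor lock-in.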