this post was submitted on 07 Jan 2025
168 points (96.2% liked)

Hardware


The first salvo of RTX 50-series GPUs will arrive in January, with pricing starting at $549 for the RTX 5070 and topping out at an eye-watering $1,999 for the flagship RTX 5090. In between those are the $749 RTX 5070 Ti and the $999 RTX 5080. Laptop variants of the desktop GPUs will follow in March, with pricing there starting at $1,299 for 5070-equipped PCs.

top 50 comments
[–] CrowAirbrush@lemmy.world 5 points 1 day ago (2 children)

Yeah sure, the 5090 will be $2k the same way the 3080 went for $800... I watched them peak at $3,500 (seriously, I screenshotted it, but the screenshot got lost as I gave up the salt).

The 4090 is sitting at 2,400 ($2,500) right now over here; I can 100% assure you the 5090 will cost more than that when it gets here.

[–] Psythik@lemmy.world 1 points 17 hours ago

Where is "over here"? I want to sell my 4090 to your people and turn a nice profit.

[–] BeardedGingerWonder@feddit.uk 1 points 1 day ago (1 children)

That 3.5k was in the middle of a shortage, at the height of the Ethereum mining boom, though.

[–] CrowAirbrush@lemmy.world 1 points 17 hours ago

I know it's an extreme, but current prices are still extreme.

[–] Anticorp@lemmy.world 3 points 1 day ago (3 children)

What's with this new trend of CEOs wearing leather jackets, as if they're cool people? Put your fucking suits back on, assholes.

[–] didnt1able@sh.itjust.works 3 points 1 day ago

I mean, Jensen was always pretty casual comparatively speaking.

[–] Grandwolf319@sh.itjust.works 21 points 1 day ago (2 children)

From Google:

The RTX 4090 was released as the first model of the series on October 12, 2022, launched for $1,599 US, and the 16GB RTX 4080 was released on November 16, 2022 for $1,199 US.

So they dropped the 80 series in price by $200 while increasing the 5090 by $400.

Pretty smart honestly. Those who have to have the best are willing to spend more and I’m happy the 80 series is more affordable.

[–] UnderpantsWeevil@lemmy.world 13 points 1 day ago (1 children)

I’m happy the 80 series is more affordable

I'd hardly call $1200 affordable.

[–] Grandwolf319@sh.itjust.works 6 points 1 day ago

It’s $999 now, which is more affordable than $1,200.

[–] TBi@lemmy.world 2 points 1 day ago

The 5080 is the same price as the 4080 Super, probably because the 4080 wasn’t selling.

[–] ramble81@lemm.ee 31 points 2 days ago (3 children)

Just using this thread as a reminder that the new Intel Arc B580 is showing RTX 4060 performance for only $250.

[–] SuperSpruce@lemmy.zip 49 points 2 days ago* (last edited 2 days ago) (4 children)

The prices are high, but what's really shocking are the power consumption figures: the 5090 is 575W(!!), while the 5080 is 360W, the 5070 Ti is 300W, and the 5070 is 250W.

If you are getting one of these, factor in the cost of a better PSU and your electric bill too. We're getting closer and closer to the limit of power from a US electrical socket.
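
For a rough sense of the electric-bill side, a quick sketch (Python; the hours per day and $/kWh rate are illustrative assumptions, not figures from the article):

```python
# Rough monthly electricity cost of a 5090 running at full TGP.
# All inputs besides the 575W TGP are assumptions for illustration.
tgp_kw = 0.575           # 5090 TGP in kW
hours_per_day = 4        # assumed daily gaming time
rate_usd_per_kwh = 0.15  # assumed electricity rate

monthly_kwh = tgp_kw * hours_per_day * 30
print(f"~{monthly_kwh:.0f} kWh/month, ~${monthly_kwh * rate_usd_per_kwh:.2f}/month")
# ~69 kWh/month, ~$10.35/month
```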

[–] lukewarm_ozone 4 points 1 day ago (2 children)

It's clear what must be done - all US household sockets must be changed to 220V. Sure, it'll be a notable expense, but it's for the health of the gaming industry.

[–] SuperSpruce@lemmy.zip 2 points 18 hours ago

It'll buy us about 8 more years. At this rate, the TGP is increasing at about 10% per year:

3090: Late 2020, 350W
4090: Late 2022, 450W
5090: Early 2025, 575W

Therefore, around 2037, a single 90-tier GPU will pop a 110V breaker, and by 2045, it will pop a 220V breaker too.

/s
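
For what it's worth, the joke's numbers roughly check out. A back-of-envelope sketch (Python; the steady ~10%/yr growth and the 15A breaker assumption come from the comment above, not from any spec):

```python
import math

# Flagship TGP data points quoted above
tgp = {2020: 350, 2022: 450, 2025: 575}  # 3090, 4090, 5090

# Implied compound annual growth between the 3090 and the 5090
growth = (tgp[2025] / tgp[2020]) ** (1 / (2025 - 2020)) - 1
print(f"observed growth: {growth:.1%}/yr")  # ~10.4%/yr

def year_gpu_pops(volts, amps=15, start_year=2025, start_w=575, rate=0.10):
    """First year a flagship GPU alone would exceed a breaker's nameplate
    wattage, assuming steady exponential TGP growth."""
    years = math.log(volts * amps / start_w) / math.log(1 + rate)
    return start_year + math.ceil(years)

print(year_gpu_pops(110))  # 110V x 15A -> 2037
print(year_gpu_pops(220))  # 220V x 15A -> 2044
```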

[–] turmacar@lemmy.world 4 points 1 day ago

Don't be silly.

Just move your PC to your laundry room and plug it into the 240V dryer outlet.

[–] pishadoot@sh.itjust.works 10 points 1 day ago* (last edited 1 day ago) (3 children)

A 1000W PSU pulls at most ~8.3A on a 120V circuit.

Residential circuits in the USA are 15-20A; very rarely are they 10A, though I've seen some super old ones and split 20A breakers in the wild.

A single duplex outlet must be rated to the same amperage as the breaker in order to be code, so with a 5090 PC you're at around half the capacity of what you'd normally find, worst case. Nice big monitors take about an amp each, and other peripherals are negligible.

You could easily pop a breaker if you've got a bunch of other stuff on the same circuit, but that's true for anything.

I think the power draw on a 5090 is crazy, crazy high don't get me wrong, but let's be reasonable here - electricity costs yes, but we're not getting close to the limits of a circuit/receptacle (yet).
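
To put numbers on that headroom, a minimal sketch (Python; the 1000W PSU and 15A breaker are the figures from the comment above):

```python
def nameplate_utilization(load_w, breaker_a, volts=120.0):
    """Raw current draw as a fraction of a breaker's nameplate rating."""
    amps = load_w / volts
    return amps, amps / breaker_a

amps, frac = nameplate_utilization(1000, 15)
print(f"{amps:.1f}A -> {frac:.0%} of a 15A breaker")  # 8.3A -> 56%
```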

[–] Resonosity@lemmy.dbzer0.com 5 points 1 day ago (1 children)

Actually, the National Electrical Code (NEC) limits loads on 15A receptacles to 12A, and on 20A receptacles to 16A, IIRC, because breakers are sized at 125% of the load (conversely, 1/1.25 = 80%, so loads should stay at or below 80% of the breaker rating).

So with a 15A outlet and a 1000W load at a minimum 95% power factor, you're drawing 8.8A, which is ~73% of the outlet's capacity (8.8/12). For a 20A outlet, 8.8A is ~55% of capacity (8.8/16).

Nonetheless, you're totally right. We're not approaching the limit of the technology, unlike electric car chargers.
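
To make those percentages concrete, a minimal sketch of the continuous-load math described above (Python; the 1000W load and 0.95 power factor are the assumptions from this thread):

```python
def continuous_utilization(load_w, breaker_a, volts=120.0, power_factor=0.95):
    """Current draw as a fraction of the NEC continuous-load limit,
    i.e. 80% of the breaker's nameplate rating."""
    amps = load_w / (volts * power_factor)
    return amps, amps / (breaker_a * 0.80)

amps, frac15 = continuous_utilization(1000, 15)
_, frac20 = continuous_utilization(1000, 20)
print(f"{amps:.1f}A -> {frac15:.0%} of a 15A circuit, {frac20:.0%} of a 20A circuit")
# 8.8A -> 73% of a 15A circuit, 55% of a 20A circuit
```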

[–] pishadoot@sh.itjust.works 1 points 6 hours ago (1 children)

The NEC limits CONTINUOUS loads to 80%, not intermittent loads. Continuous loads are things like heaters, AC units, etc. Things plugged into the wall are generally not considered continuous loads, so your breakers in a residential home are usually not derated, and receptacles never are from what I've seen. (Although it could be argued that a gaming computer would be a continuous load, as it runs 3+ hours for many people, but there's still no electrician that would treat it that way, probably ever, unless it was some kind of commercial space that rented gaming seats or something. Either way it would be planned in advance)

The rule that you're describing is for the initial planning of the circuit. It's for the rating of your wires and overcurrent protections, which is done at the time of installation, based on the expected continuous and intermittent loads. For residential planning nobody treats a standard branch circuit for wall receptacles as somewhere you'd derate, so your 15A circuit is a 15A circuit, you don't need to do any more math on it and derate it further.

[–] Resonosity@lemmy.dbzer0.com 1 points 6 hours ago

You could make the argument that people with 5090s do run their PCs longer than 3 hours, since those folks are more prone to long bouts of gaming to feel like they're getting a return on their expensive investment. And as PCs grow more and more power-hungry, people will increasingly need to consider whether the circuit they're plugging into can take the load they're giving it.

Doesn't hurt to plan for the future regarding building wiring, since most tech folk do so regarding their PC builds.

But, upon further inspection... I may be inclined to agree with you. See this thread from licensed and qualified professionals in the space.

It seems that homeowners are given a special class of immunity when it comes to the hazards created by their use of electricity. Whether or not that immunity should be granted, given that improper use of electrical equipment in a home can lead to fires and cause undue harm to the community at large, is, I think, up for debate.

[–] SuperSpruce@lemmy.zip 1 points 1 day ago (4 children)

That's just the GPU, assuming efficient everything else. Now if we add it up, a 575W GPU + 350W CPU + 75W of RGB fans + 200W of monitors + a 20% buffer puts us at 1440W, or 12A. Now we're close to popping a breaker.

This makes me curious: what's the cheapest way to get a breaker that can handle more power? Every option I can think of would cost as much as several 5090s.
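
For reference, that budget adds up as described (a quick sketch; the per-part wattages are the comment's rough guesses, not measured figures):

```python
# Hypothetical worst-case build from the comment above
parts_w = {"GPU (5090)": 575, "CPU": 350, "RGB fans": 75, "monitors": 200}

total_w = sum(parts_w.values()) * 1.20  # +20% buffer -> 1440W
print(f"{total_w:.0f}W -> {total_w / 120:.1f}A at 120V")  # 1440W -> 12.0A
```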

[–] pishadoot@sh.itjust.works 1 points 6 hours ago (1 children)

A 350W CPU?? Even a 14900K is only 250W; most are 120-180W.

75W of fans????

I'm sure you could find parts with that much draw, but that is not normal.

[–] MirthfulAlembic@lemmy.world 2 points 1 day ago (1 children)

How many RGB fans does this theoretical build have to use 75W alone?

[–] SuperSpruce@lemmy.zip 1 points 18 hours ago

How else are you gonna cool 925W in a PC form factor? Ever seen fans for server racks?

[–] ICastFist@programming.dev 2 points 1 day ago (1 children)

Anyone getting a 5090 is most definitely not someone who worries about the electric bill

I know plenty of people who'd get a 5090 and worry about the electric bill.

[–] this_1_is_mine@lemmy.world 14 points 2 days ago

Going to need to run a separate PSU on a different branch circuit at this rate.

[–] samus12345@lemm.ee 14 points 2 days ago (5 children)

And this is BEFORE the tariffs!

[–] Dasus@lemmy.world 10 points 2 days ago (3 children)

But you see, because of the tariffs, American gamers will just default to American GPUs, duh.

[–] GoldenDeLorean@lemmy.world 59 points 2 days ago (3 children)

I just...I just don't need fps and resolution that much. Godspeed to those that feel they do need it.

[–] Jessica@discuss.tchncs.de 5 points 1 day ago* (last edited 1 day ago)

VR enthusiasts can put it to use. The higher-end headsets have resolutions of over 5000 x 5000 pixels per eye.

You are basically rendering the entire game twice, once for each eye, and at that resolution each eye gets roughly twelve times as many pixels as a typical 1080p frame.
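
The pixel arithmetic, for anyone checking (a quick sketch; per-eye resolution as quoted above):

```python
vr_eye_px = 5000 * 5000   # high-end headset, pixels per eye
fhd_px = 1920 * 1080      # a standard 1080p frame

print(f"{vr_eye_px / fhd_px:.0f}x per eye")        # ~12x the pixels of 1080p
print(f"{2 * vr_eye_px / fhd_px:.0f}x both eyes")  # ~24x in total
```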

[–] DarkCloud@lemmy.world 52 points 2 days ago (1 children)

No one should; video graphics haven't progressed that far. Only the lack of optimisation has.

[–] timewarp@lemmy.world 28 points 2 days ago (10 children)

You're missing a major audience willing to pay $2k for these cards, people wanting to run large AI language models locally.

[–] Sixtyforce@sh.itjust.works 10 points 2 days ago (2 children)

I'm staying on 1440p deliberately. My 3080 is still perfectly fine for a few more years, at least through the current console gen.

[–] halcyoncmdr@lemmy.world 32 points 2 days ago

Welp, looks like I'll start looking at AMD and Intel instead. Nvidia is pricing itself at a premium that's impossible to justify against its competitors.

There will be people who buy it: professionals who can actually use the hardware and can justify it via things like business tax benefits, and those with enough money to waste that it doesn't matter.

For everyone else, competitors are going to be much better options, especially with Intel's very fast progression into the dedicated card game with Arc and its generational improvements.

[–] theunknownmuncher@lemmy.world 33 points 2 days ago* (last edited 2 days ago)

575W TDP is an absolute fucking joke.
