this post was submitted on 06 Nov 2024
59 points (98.4% liked)

top 15 comments
[–] RightHandOfIkaros@lemmy.world 22 points 2 weeks ago

I hope they keep making GPUs. They were making really good progress and had pretty good performance per dollar lately.

[–] Didros@beehaw.org 8 points 2 weeks ago

It would absolutely break my heart for the big, big corporation to make a dumb decision and lose more market share! Clowns.

[–] brucethemoose@lemmy.world 4 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

If they wanna abandon discrete GPUs... OK.

But they need graphics. They should make M Pro/Max-ish integrated GPUs like AMD is already planning to do, with wide buses, instead of topping out at bottom-end configs.

They could turn around and sell them as GPU-accelerated servers too, like the market is begging for right now.

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 6 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Intel's CEO says 'large' integrated GPUs are the way forward.

You didn't even have to click on the article; it was in the preview text. And that's exactly what Intel has been doing with their Core Ultra 100 and 200 series CPUs (that's what they're called, right?). The Arc 140V in Lunar Lake, while not cleanly beating AMD's Radeon 890M, is putting up a pretty good fight. And that's in the severely power-limited Lunar Lake CPUs, with Arc's horribly unoptimized silicon and drivers. https://www.youtube.com/watch?v=eg74aUQGdSg

If Intel can figure out how to slim down the Battlemage silicon and make it more efficient (space- and power-wise), then they could be actual competition for AMD.

[–] brucethemoose@lemmy.world 2 points 2 weeks ago* (last edited 2 weeks ago)

I wouldn't call that "large."

Strix Halo (256-bit LPDDR5X, 40 AMD CUs) is where I'd start calling integrated graphics "large." Intel is going to remain a laughingstock in the gaming world without bigger designs than their little 128-bit IGPs.
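
Rough math on why bus width matters, as a sketch; the LPDDR5X-8533 data rate is an assumption for illustration, not something from the thread:

```python
# Peak memory bandwidth from bus width: bytes per transfer * transfers per second.
# LPDDR5X-8533 (8533 MT/s) is an assumed data rate, purely illustrative.
def bandwidth_gbs(bus_bits: int, mts: int = 8533) -> float:
    return bus_bits / 8 * mts / 1000  # GB/s

print(f"128-bit IGP: ~{bandwidth_gbs(128):.0f} GB/s")          # ~137 GB/s
print(f"256-bit Strix Halo-class: ~{bandwidth_gbs(256):.0f} GB/s")  # ~273 GB/s
```

Bandwidth scales linearly with bus width, which is why a 256-bit part sits in a different class from the usual 128-bit IGPs.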

[–] superkret@feddit.org 3 points 2 weeks ago (1 children)

Honestly, graphics have been good enough for a long time now.
I'm currently re-playing Skyrim on an ultra-light convertible laptop attached to an external monitor.
And it looks beautiful.
After several days, I noticed it was still on power-saving + silent cooling mode.

[–] woodgen@lemm.ee 4 points 2 weeks ago (1 children)

Right, now we just need 64GB of VRAM to run our own LLM.
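
(Rough sizing sketch; the parameter counts and quantization levels below are illustrative assumptions:)

```python
# Approximate weight footprint of a local LLM: params * bits per weight / 8.
# Ignores KV cache and runtime overhead, so real usage runs somewhat higher.
def weights_gb(params_billions: float, bits_per_weight: float) -> float:
    return params_billions * bits_per_weight / 8

for params, bits in [(8, 4), (70, 4), (70, 8)]:
    print(f"{params}B @ {bits}-bit: ~{weights_gb(params, bits):.0f} GB")
# 8B @ 4-bit: ~4 GB; 70B @ 4-bit: ~35 GB; 70B @ 8-bit: ~70 GB
```

So 64GB is roughly what a 4-bit 70B-class model plus context leaves room for.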

[–] superkret@feddit.org -4 points 2 weeks ago (2 children)

No, we don't. Billions of people manage to live without LLMs.

[–] ILikeBoobies@lemmy.ca 4 points 2 weeks ago

I think you missed the point about where the discrete graphics card market is going.

[–] desktop_user@lemmy.blahaj.zone 3 points 2 weeks ago

And I would rather have my LLMs and eat my privacy too.

[–] CalcProgrammer1 3 points 2 weeks ago

I'm pretty happy with my Arc A770. It's in my secondary build because it can't do 4K 144Hz, but for the price it has been a great 1440p card and has solid Linux support. I would rather buy Intel than NVIDIA when it comes to a gaming GPU because of NVIDIA's poor Linux support.

[–] Nomecks@lemmy.ca 2 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Intel sees the AI market as the way forward. NVIDIA's AI business eclipses its graphics business by an order of magnitude now, and Intel wants in. They know they rule the integrated graphics market, and they can leverage that position to drive growth with things like edge processing for Copilot.

[–] brucethemoose@lemmy.world 4 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

The localllama crowd is supremely unimpressed with Intel, not just because of software issues but because they just don't have beefy enough designs, the way Apple does (and AMD soon will). Even the latest chips are simply not fast enough for a "smart" model, and the A770 doesn't have enough VRAM to be worth the trouble.

They made some good contributions to runtimes, but seeing how they fired a bunch of engineers, I'm not sure that will continue.

[–] Nomecks@lemmy.ca 1 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

People running LLMs aren't the target. People who use things like ChatGPT and Copilot on low-power PCs, who may benefit from edge inference acceleration, are. Every major LLM vendor dreams of offloading compute onto end users. It saves them tons of money.

[–] brucethemoose@lemmy.world 3 points 2 weeks ago* (last edited 2 weeks ago)

One can't offload "usable" LLMs without tons of memory bandwidth and plenty of RAM. It's just not physically possible.

You can run small models like Phi pretty quickly, but I don't think people will be satisfied with that for Copilot, even as basic autocomplete.

About 2x faster than Intel's current IGPs is the threshold where the offloading can happen, IMO. And that's exactly what AMD/Apple are producing.
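
Rough math behind that threshold, as a sketch; the bandwidth and model-size figures are assumptions for illustration:

```python
# LLM decode is mostly memory-bound: each generated token reads roughly all
# the weights once, so tokens/sec is ceilinged near bandwidth / model size.
def tok_per_sec_ceiling(bandwidth_gbs: float, model_gb: float) -> float:
    return bandwidth_gbs / model_gb

model_gb = 18  # assumed: a ~32B model at 4-bit, plus overhead
for name, bw in [("current 128-bit IGP", 137),
                 ("Strix Halo-class (256-bit)", 273),
                 ("Apple M-series Max-class", 546)]:
    print(f"{name} (~{bw} GB/s): ~{tok_per_sec_ceiling(bw, model_gb):.0f} tok/s ceiling")
```

At roughly 2x current IGP bandwidth you cross from single digits into usable interactive speeds, which is where that threshold comes from.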