this post was submitted on 04 Aug 2023
89 points (94.9% liked)

PC Gaming

[–] Syldon@feddit.uk 2 points 1 year ago (2 children)

How big do they think the AI market is going to be? It is not going to compete with consumer demand for very long. Chips last for years, so once an AI chip is purchased it will stay in use until the next generation of GPUs arrives. So yes, the initial purchases may be predominantly expensive AI chips that will make AMD and Nvidia a boatload of cash. But that is a finite market, and TSMC makes a lot of chips. Chips for industry have always been at the forefront of the cash cow that is GPU and CPU sales. Intel is also making an entrance at the low end.

I think I will be waiting a while. I have little interest in being gouged for no other reason than greed.

[–] altima_neo@lemmy.zip 2 points 1 year ago

Yeah, they're nuts if they think consumer-grade graphics cards are of any use to anyone seriously dealing with AI.

The biggest thing holding back cards right now, even the 4090, is VRAM. AI needs a ton of it, especially if you're trying to train a model.

More than likely, there will be more demand for those 48 GB+ enterprise cards.

[–] keldzh@lemmy.ca 1 points 1 year ago

But AI is becoming a consumer product too. It's no longer just bots to help tech support, but products used directly by people (search in Bing with the help of ChatGPT, GitHub Copilot to help developers write code, and so on). With increased usage it'll need more GPUs to compute more answers. And I'm not sure which market is bigger.