this post was submitted on 28 Jul 2024
143 points (97.4% liked)

Technology

[–] YurkshireLad@lemmy.ca 66 points 1 month ago (2 children)

350,000 servers? Jesus, what a waste of resources.

[–] yogthos@lemmy.ml 53 points 1 month ago (5 children)

just capitalist markets allocating resources efficiently where they're needed

[–] AlexWIWA@lemmy.ml 19 points 1 month ago

Sounds like we're going to get some killer deals on used hardware in a year or so

[–] queermunist@lemmy.ml 56 points 1 month ago (1 children)

Totally not a bubble though.

[–] MajorHavoc@programming.dev 24 points 1 month ago* (last edited 1 month ago)

Yeah. It's a legitimate business, where the funders at the top of the pyramid are paid by those that join at the bottom!

[–] riskable@programming.dev 33 points 1 month ago (1 children)

Now's the time to start saving for a discount GPU in approximately 12 months.

[–] FaceDeer@fedia.io 17 points 1 month ago (2 children)

They don't use GPUs, they use more specialized devices like the H100.

[–] tyler@programming.dev 8 points 1 month ago (1 children)

Everyone who doesn't have access to those is using GPUs, though.

[–] FaceDeer@fedia.io 8 points 1 month ago (3 children)

We are talking specifically about OpenAI, though.

[–] porous_grey_matter@lemmy.ml 7 points 1 month ago (1 children)

People who previously were at the high end of GPU can now afford used H100s -> they sell their GPUs -> we can maybe afford them

[–] Aabbcc@lemm.ee 3 points 1 month ago

Can I use an H100 to run Helldivers 2?

[–] Ephera@lemmy.ml 20 points 1 month ago (1 children)

I do expect them to receive more funding, but I also expect that to come tied to price increases. And I feel like that could be the breaking point.

In my team, we're doing lots of GenAI use-cases and far too often, it's a matter of slapping a chatbot interface onto a normal SQL database query, just so we can tell our customers and their bosses that we did something with GenAI, because that's what they're receiving funding for. Apart from these user interfaces, we're hardly solving problems with GenAI.

If the operation costs go up and management starts asking what the pricing for a non-GenAI solution would be like, I expect the answer to be rather devastating for most use-cases.

Like, there's maybe still a decent niche in that developing a chatbot interface is likely cheaper than a traditional interface, so maybe new projects might start out with a chatbot interface and later get a regular GUI to reduce operation costs. And of course, there is the niche of actual language processing, for which LLMs are genuinely a good tool. But yeah, going to be interesting how many real-world use-cases remain once the hype dies down.
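The pattern described above, a chatbot front-end bolted onto an ordinary database query, can be sketched roughly like this. The `llm_to_sql` function is a hypothetical stand-in for whatever GenAI API translates the user's question into SQL; everything downstream of it is plain, pre-GenAI database code:

```python
import sqlite3

def llm_to_sql(question: str) -> str:
    # Hypothetical stand-in for a GenAI call that turns a natural-language
    # question into SQL. In the use-cases described above, the "AI" part
    # is little more than this translation step.
    templates = {
        "how many orders do we have?": "SELECT COUNT(*) FROM orders",
    }
    return templates[question.lower()]

def chatbot_query(conn: sqlite3.Connection, question: str):
    sql = llm_to_sql(question)           # the only GenAI-touched step
    return conn.execute(sql).fetchall()  # a completely ordinary query

# Demo with an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO orders (id) VALUES (?)", [(1,), (2,), (3,)])
print(chatbot_query(conn, "How many orders do we have?"))  # [(3,)]
```

If the GenAI translation layer is the only thing the operation cost buys, a fixed form or GUI over the same query does the same job without the per-request model bill.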

[–] yogthos@lemmy.ml 5 points 1 month ago

It's also worth noting that smaller models work fine for these types of use cases, so it might just make sense to run a local model at that point.

[–] Travelator@thelemmy.club 16 points 1 month ago (1 children)

Good. It's fake crap tech that no one needs.

[–] curiousaur@reddthat.com 9 points 1 month ago (5 children)

It's actually really awesome and truly helps with my work.

[–] PeepinGoodArgs@reddthat.com 13 points 1 month ago

I will be in a perfect position to snatch a discount H100 in 12 months

[–] PanArab@lemmy.ml 13 points 1 month ago

I hope so! I am so sick and tired of AI this and AI that at work.

[–] flambonkscious@sh.itjust.works 13 points 1 month ago (3 children)

The start(-up?)[sic] generates up to $2 billion annually from ChatGPT and an additional $ 1 billion from LLM access fees, translating to an approximate total revenue of between $3.5 billion and $4.5 billion annually.

I hope their reporting is better than their math...

[–] Hector_McG@programming.dev 10 points 1 month ago

Probably used ChatGPT….

[–] twei@discuss.tchncs.de 8 points 1 month ago (1 children)

Maybe they also added 500M for stuff like Dall-E?

[–] delirious_owl@discuss.online 12 points 1 month ago

Bubble. Meet pop.

[–] geneva_convenience@lemmy.ml 7 points 1 month ago

AI stands for Artificial Income.

[–] Aurenkin@sh.itjust.works 6 points 1 month ago

Last time a batch of these popped up, they were saying OpenAI would be bankrupt in 2024, so I guess they've moved it to 2025 now. I wonder if we'll see similar articles again next year.

[–] coffee_with_cream@sh.itjust.works 6 points 1 month ago (1 children)

For anyone doing a serious project, it's much more cost effective to rent a node and run your own models on it. You can spin them up and down as needed, cache often-used queries, etc.

[–] yogthos@lemmy.ml 6 points 1 month ago

For sure, and in a lot of use cases you don't even need a really big model. There are a few niche scenarios where you require a large context that's not practical to run on your own infrastructure, but in most cases I agree.
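The "cache often-used queries" suggestion above can be sketched in a few lines. This is a minimal in-process cache, assuming a self-hosted setup; `run_model` is a hypothetical stand-in for a request to your own rented node:

```python
from functools import lru_cache

calls = 0  # counts how many requests actually reach the model

def run_model(prompt: str) -> str:
    # Hypothetical stand-in for a call to your own inference node,
    # e.g. an HTTP request to a locally hosted model server.
    global calls
    calls += 1
    return f"response to: {prompt}"

@lru_cache(maxsize=1024)
def cached_completion(prompt: str) -> str:
    # Identical prompts are served from the in-process cache
    # instead of hitting the GPU again.
    return run_model(prompt)

cached_completion("summarise the report")
cached_completion("summarise the report")  # cache hit, no second model call
print(calls)  # 1
```

For exact-repeat queries this alone cuts GPU time; a real deployment would likely use a shared cache (e.g. Redis) so the savings survive node spin-downs.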

[–] arran4@aussie.zone 3 points 1 month ago (1 children)

This sounds like FUD to me. If it were true, they'd be acquired pretty quickly.

[–] jackyalcine@lemmy.ml 6 points 1 month ago (2 children)

They're wholly owned by Microsoft, so it'd probably be mothballed at worst.

[–] arran4@aussie.zone 4 points 1 month ago (2 children)

I need some evidence of that for another conversation. Where did you find it?
