this post was submitted on 19 Jul 2024
328 points (95.3% liked)

Lemmy Shitpost

[–] MotoAsh@lemmy.world 8 points 4 months ago (6 children)

Why use commercial graphics cards to run such a narrow, AI-specific workload? There are dedicated cards built to accelerate machine learning that are highly potent with far less power draw than 3090s.

[–] ShadowRam@fedia.io 30 points 4 months ago (1 children)

Well yeah, but 10x the price....

[–] MotoAsh@lemmy.world 1 points 4 months ago* (last edited 4 months ago) (2 children)

Not if it's for inference only. What do you think the "AI accelerators" they're putting in phones now are? Do you think they'd be as expensive or power hungry as an entire 3090 for performance if they were putting them in small devices?

[–] ShadowRam@fedia.io 8 points 4 months ago (1 children)

Ok,

Show me a PCI-E board that can do inference calculations as fast as a 3090 but is less expensive than a 3090.

[–] RandomlyRight@sh.itjust.works 5 points 4 months ago

I'd be interested (and surprised) too

[–] RandomlyRight@sh.itjust.works 1 points 4 months ago* (last edited 4 months ago) (1 children)

Yeah show me a phone with 48GB RAM. It’s a big factor to consider. Actually, some people are recommending a Mac Studio cause you can get it with 128GB RAM and more and it’s shared with the AI/GPU accelerator. Very energy efficient, but sucks as soon as you want to do literally anything other than inference
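The 48 GB figure follows from simple arithmetic: weights take (parameter count × bits per weight ÷ 8) bytes, plus headroom for the KV cache and runtime. A minimal sketch of that estimate, with illustrative numbers (the overhead factor is an assumption, not a measured value):

```python
# Back-of-the-envelope VRAM estimate for LLM inference:
# parameters x bytes per weight, plus a cushion for KV cache and runtime.

def vram_estimate_gb(params_billions: float, bits_per_weight: int,
                     overhead_factor: float = 1.2) -> float:
    """Rough GB of memory needed to hold the weights for inference."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead_factor / 1e9

# A 70B model at 16-bit dwarfs a 24 GB 3090...
fp16 = vram_estimate_gb(70, 16)   # ~168 GB
# ...while 4-bit quantization lands it near the 48 GB mentioned above.
q4 = vram_estimate_gb(70, 4)      # ~42 GB

print(f"70B @ fp16: ~{fp16:.0f} GB, @ 4-bit: ~{q4:.0f} GB")
```

This is why unified-memory machines like the Mac Studio are attractive for inference: the estimate only cares about how much memory the accelerator can address, not whether it is a discrete card.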

[–] Fuzzypyro@lemmy.world 1 points 3 months ago (1 children)

I wouldn’t say it particularly sucks. It could be used as a powerhouse hosting server. Docker makes it very easy to do no matter the OS nowadays. Really though, I’d say its competition is more along the lines of Ampere systems in terms of power to performance. It even beats Ampere's 128-core ARM CPU on power-to-performance ratio, which is extremely impressive in the server/enterprise world. Not to say you’re gonna see them in data centers, because price to performance is a thing as well. I just feel like it fits right into the niche it was designed for.

[–] RandomlyRight@sh.itjust.works 1 points 3 months ago

How could you solve the problem of storage expansion? I assume there exists some kind of Thunderbolt JBOD enclosure or similar.

[–] mergingapples@lemmy.world 18 points 4 months ago

Because those specific cards are fuckloads more expensive.

[–] d00ery@lemmy.world 6 points 4 months ago

What are you recommending? I'd be interested in something that's similar in price to a 3090.

[–] Diabolo96@lemmy.dbzer0.com 5 points 4 months ago (1 children)

It's for inference, not training.

[–] MotoAsh@lemmy.world 2 points 4 months ago (1 children)

Even better, because those are cheap as hell compared to 3090s.

[–] Diabolo96@lemmy.dbzer0.com 1 points 4 months ago

But can they run Crysis ?

[–] VeganCheesecake@lemmy.blahaj.zone 4 points 4 months ago* (last edited 4 months ago)

Would you link one? Because the only things I know of are the small coral accelerators that aren't really comparable, and specialised data centre stuff you need to request quotes for to even get a price, from companies that probably aren't much interested in selling one direct to customer.
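The gap the comment describes can be put in rough numbers. Using commonly quoted spec-sheet figures (Coral Edge TPU: ~4 INT8 TOPS at ~2 W; RTX 3090: ~284 INT8 tensor TOPS at 350 W), treated here as assumptions rather than measurements, the edge accelerator wins on efficiency but loses badly on raw throughput:

```python
# Ballpark comparison of a small edge accelerator vs an RTX 3090,
# using commonly quoted spec-sheet figures (assumptions, not benchmarks).

coral_tops, coral_watts = 4, 2          # Google Coral Edge TPU (quoted spec)
rtx3090_tops, rtx3090_watts = 284, 350  # RTX 3090 INT8 tensor throughput (quoted spec)

throughput_gap = rtx3090_tops / coral_tops  # ~71x raw throughput
coral_eff = coral_tops / coral_watts        # 2.0 TOPS/W
gpu_eff = rtx3090_tops / rtx3090_watts      # ~0.81 TOPS/W

print(f"3090 is ~{throughput_gap:.0f}x faster; "
      f"Coral does {coral_eff:.1f} vs {gpu_eff:.2f} TOPS/W")
```

Raw TOPS aside, the bigger disqualifier is memory: a Coral has only a few MB of on-chip storage and no VRAM to hold a multi-GB model, which is why it "isn't really comparable" for LLM inference.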

[–] GBU_28@lemm.ee 3 points 4 months ago

Huh?

Stuff like llama.cpp really wants a GPU; a 3090 is a great place to start.