this post was submitted on 22 Nov 2023
Homelab

I'm just getting into playing with Ollama and want to build some self-hosted AI applications. I don't need heavy-duty cards because they probably won't ever be under much load, so I'm mostly looking for power efficiency and a decent price.

Any suggestions for cards I should look at? So far I've been browsing eBay, and I was looking at the Tesla M40 24GB (GDDR5). They're reasonably priced, but I'm wondering if anyone has any specific recommendations.
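For context on what "self-hosted AI applications" looks like here, a minimal sketch of talking to a local Ollama server's REST API. This assumes Ollama's documented `/api/generate` endpoint on its default port 11434; the model name `llama2` is just an example stand-in for whatever model you pull.

```python
import json

# Default endpoint for a locally running Ollama instance (assumption:
# stock install, default port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """Build the JSON body Ollama's /api/generate endpoint expects.

    stream=False asks for a single JSON response instead of a
    streamed sequence of chunks.
    """
    payload = {"model": model, "prompt": prompt, "stream": False}
    return json.dumps(payload).encode("utf-8")

body = build_request("llama2", "Summarize why used datacenter GPUs are popular for homelabs.")

# To actually send it (requires a running Ollama instance with the
# model already pulled):
# import urllib.request
# req = urllib.request.Request(
#     OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
# )
# print(urllib.request.urlopen(req).read().decode())
```

Even an older 24GB card like the M40 or P40 has enough VRAM to hold mid-sized quantized models, which is why they keep coming up in threads like this.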

top 3 comments
[–] dxx255@alien.top 1 points 1 year ago

I installed a Tesla P40 in an R720xd. You just have to be careful to select the correct power cable for it.

[–] PermanentLiminality@alien.top 1 points 1 year ago (1 children)

Consider getting a P40 instead. It's a newer-generation chip than the M40 and should be supported for longer. It's worth the extra cost.

Make sure to source the needed power cables.

[–] Jeremiah_K@alien.top 1 points 1 year ago

Good to know, I'll probably go for a P40 instead.