I’m just getting into playing with Ollama and want to build some self-hosted AI applications. I don’t need heavy-duty cards, since they’ll rarely be under much load, so I’m mostly looking for a balance of power efficiency and decent price.

Any suggestions for cards I should look at? So far I’ve been browsing eBay and was looking at Tesla M40 24GB GDDR5 cards. They’re reasonably priced, but I’m wondering if anyone has any specific recommendations.
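
For context, the workloads I have in mind are light: mostly one-off calls against Ollama’s HTTP API, which listens on localhost:11434 by default. Roughly this kind of Python sketch (the model name is a placeholder for whatever I end up pulling):

```python
import json
import urllib.request

def ask(prompt: str, model: str = "llama3") -> str:
    # Ollama serves a REST API on localhost:11434 by default.
    # "llama3" is a placeholder; substitute whatever model you pulled.
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # one JSON response instead of a token stream
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

print(ask("Why is the sky blue?"))
```

So the card would sit idle most of the time, which is why power efficiency matters more to me than raw speed.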

1 point

Consider getting a P40 instead. It’s a generation newer than the M40 (Pascal vs. Maxwell), so it should stay supported by drivers and CUDA for longer. It’s worth the extra cost.

Make sure to source the needed power cables. These Tesla cards take an 8-pin EPS (CPU-style) connector rather than a standard PCIe plug, so a regular GPU cable won’t fit.
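
Since you mentioned power efficiency: it’s worth checking what the card actually draws once it’s in. A rough sketch using the nvidia-ml-py bindings (pynvml); GPU index 0 is an assumption, adjust for your setup:

```python
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumes the Tesla is GPU 0

name = pynvml.nvmlDeviceGetName(handle)
if isinstance(name, bytes):  # older bindings return bytes
    name = name.decode()

# NVML reports power in milliwatts
draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000
limit_w = pynvml.nvmlDeviceGetPowerManagementLimit(handle) / 1000

print(f"{name}: drawing {draw_w:.1f} W of a {limit_w:.0f} W limit")
pynvml.nvmlShutdown()
```

If draw under load is a concern, you can also cap it with nvidia-smi -pl <watts> (needs root, and only within the range the card supports) to trade a little peak performance for less heat.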

1 point

Good to know, I’ll probably go for a P40 instead.


Homelab

!homelab@selfhosted.forum


Rules

  • Be Civil.
  • Post about your homelab, discussion of your homelab, questions you may have, or general discussion about transitioning your skills from the homelab to the workplace.
  • No memes or potato images.
  • We love detailed homelab builds, especially network diagrams!
  • Report any posts that you feel should be brought to our attention.
  • Please no shitposting or blogspam.
  • No Referral Linking.
  • Keep piracy discussion out of this community.

Community stats

  • 9 monthly active users
  • 1.4K posts
  • 6K comments