
Hey, me too.

And I do have a couple of different LLMs installed on my rig. But having that resource run locally is years and years away from being remotely performant.

On the bright side, there are many open-source LLMs, and it seems like there are more every day.


People Twitter

!whitepeopletwitter@sh.itjust.works
