8 points

Anyone interested in a local LLM should check out Llamafile from Mozilla.

5 points

So what exactly is this? Open-source ChatGPT alternatives have existed before and alongside ChatGPT the entire time, in the form of downloading oobabooga or another interface and grabbing an open-source model from Hugging Face. They aren’t competitive because users don’t have terabytes of VRAM or AI accelerators.

4 points

Edit: spelling. The Facebook LLM is pretty decent and was trained on a huge number of tokens. You can install it locally and feed your own data into the model so it becomes tailor made.

6 points

I think you mean tailor. As in, clothes fitted to you.

1 point

Exactly, my auto carrot likes Taylor.

-1 points

It’s basically a UI for downloading and running models. You don’t need terabytes of VRAM to run most models though. A decent GPU and 16 gigs of RAM or so works fine.

1 point

What are the hardware requirements?

-1 points

Depends on the size of the model you want to run. Generally, having a decent GPU and at least 16 gigs of RAM is helpful.

18 points

“100% Open Source”

[links to two proprietary services]

Why are so many projects like this?

2 points

I imagine it’s because a lot of people don’t have the hardware that can run models locally. I do wish they didn’t bake those in though.

13 points

Other offline tools I’ve found:

GPT4All

RWKV-Runner


Any feelings on what you like best / works best?

2 points

They all work well enough on my weak machine with an RX580.

Buuuuuuuuuut, RWKV has some kind of optimization going that makes it two or three times faster at generating output. The problem is that you have to be more aware of the order of your input. It has a hard time going back to a previous sentence, for example.

So you’d want to say things like “In the next sentence, identify the subject.” and not “Identify the subject in the previous text.”
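As an illustrative sketch (the passage and wording here are hypothetical, not from any model’s docs), the ordering difference looks like this: because RNN-style models such as RWKV consume tokens strictly left to right, the instruction should come before the text it applies to.

```python
# Hypothetical example of prompt ordering for an RNN-style model (e.g. RWKV),
# which processes input strictly left to right with no lookback.

passage = "The quick brown fox jumps over the lazy dog."

# Forward-referencing prompt: the instruction arrives before the text,
# so the model knows what to look for while reading the passage.
forward_prompt = f"In the next sentence, identify the subject.\n{passage}"

# Backward-referencing prompt: the instruction arrives after the text,
# forcing the model to recall already-consumed input from its state.
backward_prompt = f"{passage}\nIdentify the subject in the previous text."
```

The only difference is where the instruction sits relative to the passage, but for forward-only architectures that placement reportedly changes output quality noticeably.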

