3 points

No, lol. Well, I'm not 100% familiar with the Pi's newest offerings, but I have my doubts about their PCIe capabilities. Direct quote:

The tool can run on low-cost graphics processing units (GPUs) and needs roughly 8GB of RAM to process requests — versus larger models, which need high-end industrial GPUs.

It makes your question seem a bit silly, trying to imagine hooking up my GPU, which is probably physically bigger than a Pi, to a Pi.

I've been running all the image generation models on a 2060 Super (8GB VRAM) up to this point, including SD-XL, the model they "distilled" theirs from… Not really sure what exactly they think they're differentiating themselves from, reading the article…
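
For anyone wondering how SD-XL fits in 8GB in the first place, here's a minimal sketch of that kind of setup with Hugging Face diffusers: fp16 weights plus CPU offload and tiled VAE decoding. The model ID is just the standard SD-XL base release, and the prompt is obviously made up; nothing here comes from the article itself.

```python
# Minimal sketch: SD-XL base on an ~8GB consumer GPU using Hugging Face diffusers.
# Assumes diffusers, transformers, accelerate, and a CUDA build of torch are installed.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,   # half-precision weights roughly halve VRAM use
    variant="fp16",
    use_safetensors=True,
)
pipe.enable_model_cpu_offload()  # keeps only the active submodule on the GPU
pipe.enable_vae_tiling()         # decodes the image in tiles to cap VRAM spikes

image = pipe(
    "a raspberry pi wired up to a comically oversized GPU, photorealistic",
    num_inference_steps=30,
).images[0]
image.save("out.png")
```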

3 points

It makes your question seem a bit silly, trying to imagine hooking up my GPU, which is probably physically bigger than a Pi, to a Pi.

Jeff Geerling has entered the chat

2 points

Here is an alternative Piped link(s):

Jeff Geerling has entered the chat

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

3 points

There are three models, and the smallest one is 700M parameters.
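
Rough back-of-envelope on why 700M parameters counts as "small" (this assumes plain fp16 weights; the article doesn't say how the checkpoint is actually stored, and it only covers the weights, not activations or the text encoder/VAE):

```python
# Back-of-envelope: weight memory for a 700M-parameter model at common precisions.
params = 700e6
for name, bytes_per_param in [("fp32", 4), ("fp16", 2), ("int8", 1)]:
    print(f"{name}: {params * bytes_per_param / 2**30:.2f} GiB")
# fp16 comes out to roughly 1.3 GiB of weights, which is why it fits on low-cost GPUs.
```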
