Since the shutdown of SD on Colab, is there any option for running SD without disposable income?

I know about StableHorde, but it doesn’t seem to really… well, work. Not for people without GPUs to earn Kudos with, at least. It always gives a 5+ minute queue and then errors out before the time runs out.

EDIT: It took me a while to set up, but as it turns out, my best option is in fact my 10-year-old computer with a 2GB AMD card. Using the DirectML fork of the WebUI with --lowvram runs pretty damn well for me. It’s not as fast as Colab was, but it’s not slow by any means. I guess the best advice in the end is: even if you’re on a shitbox, try it, your shitbox might surprise you. Take note, though, that running on 2GB VRAM doesn’t work for everyone; only the luckiest of broke mfs can do that, it seems.
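For anyone trying the same thing, launch flags go in webui-user.bat. This is just a sketch assuming the lshqqytiger DirectML fork on Windows; --lowvram is the flag I was talking about, while --opt-sub-quad-attention and --no-half are extras people often suggest for low-VRAM AMD cards, so treat those two as assumptions to experiment with, not requirements.

    rem webui-user.bat -- launch settings sketch for the DirectML fork
    rem --lowvram keeps most of the model in system RAM and streams pieces to the 2GB card
    rem --opt-sub-quad-attention and --no-half are assumed extras that often help low-VRAM AMD cards
    set COMMANDLINE_ARGS=--lowvram --opt-sub-quad-attention --no-half

    call webui.bat

If it still runs out of memory, keeping generations at 512x512 or smaller is usually the next thing to try.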

3 points

Have you tried running it on your PC? ComfyUI seems to run it on a potato.
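Rough idea of what that looks like, as a sketch (assumes git and a recent Python are already installed; --lowvram and --cpu are the switches ComfyUI provides for weak hardware, pick whichever fits):

    # minimal ComfyUI setup sketch (assumes git and Python 3.10+)
    git clone https://github.com/comfyanonymous/ComfyUI
    cd ComfyUI
    pip install -r requirements.txt

    # put a Stable Diffusion checkpoint into models/checkpoints/ first, then:
    python main.py --lowvram     # squeeze onto a small GPU
    # python main.py --cpu       # or run everything on the CPU (slow, but it works)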

1 point

I think Automatic1111 runs on most CPUs, but you need a lot of RAM.
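If anyone wants to try the CPU route, the webui has launch flags for it. Here is a sketch of webui-user.bat for CPU-only mode; the exact combination is my assumption, but --use-cpu all and --skip-torch-cuda-test are the important ones, and expect several minutes per image plus roughly 16GB of RAM to be comfortable.

    rem webui-user.bat -- CPU-only launch sketch (no GPU needed)
    rem --use-cpu all           run every stage on the CPU
    rem --skip-torch-cuda-test  don't abort when no CUDA device is found
    rem --no-half               CPUs generally want full-precision weights
    set COMMANDLINE_ARGS=--use-cpu all --skip-torch-cuda-test --no-half

    call webui.bat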

1 point

I didn’t have much hope of being able to run it locally, but I’ve been getting it set up on and off all day regardless. Well, as it turns out, I’m lucky enough to be able to generate with DirectML and 2GB of VRAM in a not-terrible amount of time.

To test, I generated a couple of simple prompts. This one is from ‘a poster for Sex the movie’. It’s not NSFW, but I’d say it’s suggestive… in a weird, warped way. I didn’t use any negative prompts for the test.

4 points

Since the shutdown of SD on Colab

What happened? I’m out of the loop

5 points

Colab no longer allows Stable Diffusion on its free tier. You can still buy compute time for SD, though.

5 points

There are a few.

Draw Things: https://drawthings.ai

DiffusionBee: https://diffusionbee.com

Both are apps available on macOS. Not sure about other platforms.

Other than that, a more complex approach is installing Automatic1111, which is a web interface that runs locally (rough commands below): https://www.youtube.com/watch?v=kqXpAKVQDNU

Just had a little look at ^ this guy’s YouTube channel and he has guides for installing stuff on Windows too, if that’s needed for ya.
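If you’d rather skim commands than watch a video, the install boils down to roughly this. A sketch for Linux/macOS, assuming git and Python 3.10 are already installed; on Windows the last step is webui-user.bat instead.

    # rough Automatic1111 install sketch (assumes git and Python 3.10)
    git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
    cd stable-diffusion-webui

    # drop a Stable Diffusion checkpoint (.safetensors or .ckpt) into models/Stable-diffusion/

    # first run sets up a venv, installs dependencies, then serves the UI at http://127.0.0.1:7860
    ./webui.sh

After the first run, later launches are much faster since everything is already installed.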

Deleted by creator
1 point

I struggled to replace a dying HDD this month; there’s no way I can afford an SD-capable GPU.

3 points

You don’t need a GPU. It’ll just be a bit slower.

1 point

I never knew it was possible to run it on a CPU. Well, thanks for the idea, but that doesn’t seem usable. SD outputs are terrible 9 times out of 10, and at 10+ minutes per generation…

7 points

…a lot slower, to be honest
