What are the hardware requirements to run SDXL?

In particular, how much VRAM is required?

This is assuming A1111 and not using --lowvram or --medvram.

6 points

I don’t have A1111, but in ComfyUI, using a shared workflow that does base and then refiner, SDXL 0.9 was using 12GB of VRAM and 22GB of RAM on Ubuntu for me, doing images around 1024x1024. GPU: AMD RX 6800.

5 points

Also using Comfy. I’ve been able to get away with 6GB of VRAM doing 1024x1024; it took a bit longer, but I’ve done a couple of 1024x2048s and they’re coming out good :3

4 points

Also have a 6800 XT, with 32GB of RAM. SDXL 1.0 runs with A1111, but I can only generate images using --medvram. This is on Windows, admittedly.

1 point

Yeah, on Windows you don’t have ROCm, so it sucks.

5 points

3070 with 8GB VRAM, 16GB RAM.

In Comfy: about 16–17 sec to do everything at 1024x1024 with 20 steps plus 5 refiner steps.

A1111: about the same, using a batch of 4 and then batch img2img for refining. Just more clicks without extensions etc., since I’m getting the same it/s between Comfy and A1111 with --medvram.

3 points

I am using a 3070 8GB FTW and 32GB of DDR5… I have yet to get SDXL to generate without an error. I assumed it was my hardware.

1 point

Nah, auto1111 seriously struggles with SDXL for me, but Comfy manages to do it without issue.

1 point

I tried ComfyUI last night, but unless I am just missing something, I couldn’t get past the workflow screen. Thanks for the tip, I will keep tinkering with things.

I tried InvokeAI, but was having the same problem. I got the safetensors directly from Stability AI’s Hugging Face page, so I have to be doing something wrong… 3 different UIs and I couldn’t get any to work. I am losing my technical aptitude :)

2 points

And I’m trying to make SDXL work on my 1660 Ti laptop lol. ComfyUI runs it at about 1:30 min per pic; A1111 can’t even load the VAE. However, yesterday I saw an update on the Hugging Face page for SD saying they changed to the 0.9 VAE for SDXL 1.0 — seems like there was an issue with their provided 1.0 VAE.

2 points

It’s hard to give precise figures, because there are always tricks to getting away with a little more or less, but from my (admittedly limited) testing SDXL is significantly more demanding, and 10+GB of VRAM is probably going to be the minimum to run it. I don’t remember exactly what I was doing, but I run an RTX A4500 card, and I managed to max out its 20GB of VRAM with just one SDXL process, whereas I can normally run a LoRA training and 512x768-size generations at the same time.
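That 10+GB floor lines up with a back-of-the-envelope estimate of just holding the SDXL weights in fp16. This is a rough sketch, not a measurement — the parameter counts below are approximate public figures for the SDXL base model, and real usage adds activations, attention buffers, and the refiner on top:

```python
# Back-of-the-envelope VRAM estimate for holding SDXL weights in fp16.
# Parameter counts are approximate public figures and may be slightly off.
BYTES_PER_PARAM_FP16 = 2

components = {
    "unet": 2.6e9,           # SDXL base UNet, ~2.6B params (approx.)
    "text_encoders": 0.8e9,  # CLIP ViT-L + OpenCLIP ViT-bigG combined (approx.)
    "vae": 0.08e9,           # autoencoder (approx.)
}

def weights_gib(params, bytes_per_param=BYTES_PER_PARAM_FP16):
    """GiB needed just to hold the weights, before any activations or caches."""
    return params * bytes_per_param / 2**30

total = sum(weights_gib(p) for p in components.values())
print(f"fp16 weights alone: ~{total:.1f} GiB")  # roughly 6.5 GiB
# Activation memory, batch size, and the refiner model add several GiB more,
# which is why ~10GB+ without offloading flags is a plausible practical floor.
```

This also shows why flags like --medvram help: keeping only one component resident at a time trades speed for a smaller peak footprint.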

2 points

I have a 6600 XT (so 8GB of VRAM) and had no luck with A1111 or Vladmandic’s port; they would crash. ComfyUI worked with no fiddling.


Stable Diffusion

!stable_diffusion@lemmy.dbzer0.com
