Hardware Specs for Stable Diffusion SDXL
The recommended GPU for Stable Diffusion SDXL is the NVIDIA RTX 3090 or better. Its 24GB of VRAM is far more than SDXL strictly needs, so you can generate at full resolution without memory-saving tricks.
The RTX 3090 also has a large number of CUDA cores, the parallel processing units that do the actual work of running the model.
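Before worrying about upgrades, it is worth checking what PyTorch actually sees on your machine. The following is a minimal sketch, assuming PyTorch is installed with CUDA support; it simply reports the detected GPU and its VRAM.

```python
# Minimal check of the GPU visible to PyTorch and how much VRAM it reports.
# Assumes a PyTorch build with CUDA support.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    vram_gb = props.total_memory / (1024 ** 3)
    print(f"GPU: {props.name}, VRAM: {vram_gb:.1f} GB")
else:
    print("No CUDA-capable GPU detected; SDXL on CPU will be extremely slow.")
```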
If you have a lower-end GPU, you may still be able to run SDXL, but generation will be slower and you may need memory-saving settings to fit the model (a sketch follows the list below). Other GPUs that can handle SDXL include:
NVIDIA RTX 3080
NVIDIA RTX 3070 Ti
NVIDIA RTX 3060 Ti
NVIDIA RTX 2080 Ti
NVIDIA RTX 2070 Super
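On cards with less VRAM, a common workaround is to load SDXL in half precision and offload parts of the model to the CPU. The sketch below uses the Hugging Face diffusers library; the model ID, prompt, and settings are illustrative, and this is only one of several ways to run SDXL.

```python
# Sketch: loading SDXL with diffusers in fp16 with CPU offload to reduce VRAM use.
# Requires the diffusers, transformers, and accelerate packages.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
    variant="fp16",
    use_safetensors=True,
)
pipe.enable_model_cpu_offload()  # trades some speed for a lower VRAM footprint

image = pipe("a photo of an astronaut riding a horse",
             num_inference_steps=30).images[0]
image.save("sdxl_test.png")
```

With CPU offload enabled, generation is slower than keeping the whole model on the GPU, which matches the trade-off described above for lower-end cards.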
If you are on a budget, you can also rent a GPU from a cloud provider such as Google Cloud or AWS; both offer GPU instances powerful enough to run SDXL.
In addition to a GPU, you will also need a computer with at least 16GB of RAM and a reasonably fast CPU. The CPU matters far less than the GPU, but a faster one still helps with model loading and overall responsiveness.
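If you want to verify the rest of the system as well, here is a small sketch that prints system RAM and CPU core count; it assumes the psutil package is installed (pip install psutil).

```python
# Rough check of system RAM and CPU core count.
import os
import psutil

ram_gb = psutil.virtual_memory().total / (1024 ** 3)
print(f"System RAM: {ram_gb:.1f} GB (16 GB or more recommended)")
print(f"CPU cores: {os.cpu_count()}")
```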
Community members on Reddit have confirmed that Stable Diffusion XL can work with AMD graphics cards (RX 6700, RX 6800, etc.) using the Automatic1111 web UI with DirectML, but it runs very slowly and causes heavy SSD writes.
Using SDXL on Google Colab or services like ClipDrop is much more efficient.
Stable Diffusion runs best on a dedicated graphics card.
As for which GPU to use, we suggest the NVIDIA RTX 4080 (16GB) or RTX 4090 (24GB) for the best results.
The 4080 also beats the 3090 Ti by 55% with xformers and 18% without. The 4070 Ti, interestingly, was 22% slower than the 3090 Ti without xformers but 20% faster with xformers.
The RTX 4090 is 72% faster than the 3090 Ti without xformers, and a whopping 134% faster with xformers.
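The gains quoted above come from xformers' memory-efficient attention kernels. If you are using the diffusers pipeline sketched earlier (rather than a web UI), enabling it is a single call; this assumes the xformers package is installed and `pipe` is the SDXL pipeline loaded above.

```python
# Enable xformers memory-efficient attention on an existing diffusers pipeline.
pipe.enable_xformers_memory_efficient_attention()

# Generation now uses less VRAM and is typically faster on supported GPUs.
image = pipe("a cinematic photo of a mountain lake at sunrise").images[0]
```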