Can this be run with an AMD GPU? #29

by Vehrn - opened

I see CUDA is the default in the setup scripts. When I run the setup on my AMD 6800 GPU I get the following error: `AssertionError: Torch not compiled with CUDA enabled`

Thanks in advance.

How to run CompVis/stable-diffusion on AMD Linux

# AMD Driver installation: https://docs.amd.com/bundle/ROCm-Installation-Guide-v5.1/page/How_to_Install_ROCm.html
# command would be something like this after installing amdgpu-install
# sudo amdgpu-install --rocmrelease=5.2.3 --usecase=dkms,graphics,rocm,lrt,hip,hiplibsdk
# if it's already installed, run the rocm-smi command; it will list the available GPUs

cd stable-diffusion/
conda env create -f environment.yaml
conda activate ldm
conda remove cudatoolkit -y
pip3 uninstall torch torchvision -y 
# Install PyTorch ROCm
pip3 install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/rocm5.1.1
pip3 install transformers==4.19.2 scann kornia==0.6.4 torchmetrics==0.6.0
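After the install, a quick way to confirm that the ROCm wheel is the one actually in use (a minimal sketch; on ROCm builds the AMD GPU is still exposed through the `torch.cuda` API, and `torch.version.hip` is set instead of `torch.version.cuda`):

```python
import torch

def rocm_status():
    # On a ROCm wheel, torch.version.hip is a version string and
    # torch.cuda.is_available() reports the AMD GPU; on a CUDA-only
    # or CPU-only wheel, torch.version.hip is None.
    hip = getattr(torch.version, "hip", None)
    return {
        "hip_version": hip,
        "gpu_visible": torch.cuda.is_available(),
    }

print(rocm_status())
```

If `gpu_visible` is False here, txt2img will fall back to the same "Torch not compiled with CUDA enabled" failure, so it is worth checking before running the scripts.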

# Place the model as model.ckpt in the models/ldm/stable-diffusion-v1/ folder
python scripts/txt2img.py --prompt "a photograph of an astronaut riding a horse" --plms

How to run huggingface/diffusers on AMD Linux

git clone https://github.com/huggingface/diffusers.git
cd diffusers/
pip3 install -e .
pip3 uninstall torch torchvision -y
pip3 install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/rocm5.1.1

Run the code without autocast (see https://github.com/huggingface/diffusers/tree/main/src/diffusers/pipelines#text-to-image-generation-with-stable-diffusion):

# make sure you're logged in with `huggingface-cli login`
from diffusers import StableDiffusionPipeline, LMSDiscreteScheduler

pipe = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", use_auth_token=True)
pipe = pipe.to("cuda")  # ROCm builds of PyTorch still use the "cuda" device name

prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt)["sample"][0] 
image.save("astronaut_rides_horse.png")
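On the VRAM question below: two memory savers available in diffusers are half-precision weights and attention slicing. A hedged sketch (the `revision="fp16"` argument matches the diffusers API of this era and may differ in newer releases; the actual pipeline load requires a GPU, the model download, and `huggingface-cli login`, so it is shown commented out):

```python
import torch

def low_vram_settings():
    # fp16 weights roughly halve the memory the model itself needs,
    # which helps ~8 GB cards fit the 512x512 pipeline.
    return {"revision": "fp16", "torch_dtype": torch.float16}

# Usage sketch:
# from diffusers import StableDiffusionPipeline
# pipe = StableDiffusionPipeline.from_pretrained(
#     "CompVis/stable-diffusion-v1-4", use_auth_token=True, **low_vram_settings()
# )
# pipe = pipe.to("cuda")
# pipe.enable_attention_slicing()  # compute attention in chunks: slower, lower peak VRAM
# image = pipe("a photo of an astronaut riding a horse on mars")["sample"][0]
```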

If you are on Windows, try this ONNX DirectML approach: https://gist.github.com/harishanand95/75f4515e6187a6aa3261af6ac6f61269

May I ask how much VRAM your device has? I am running into memory trouble on my 3070 with 8 GB of VRAM.