How can I use it on multiple GPUs?

#7 · opened by aeminkocal

I ran the original Llama 2 13B on a 3090 + 4090 system with torchrun using the two-node option. How can I run this model in a similar fashion? Also, will there be a 13B version?
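
(For reference, here is a minimal sketch of one way to spread the model across both cards without torchrun, assuming the checkpoint loads through the transformers library: accelerate's `device_map="auto"` shards the layers across all visible GPUs in a single process. The model ID below is a placeholder, not the actual repo name.)

```python
# Minimal sketch: shard a causal LM across multiple GPUs with accelerate's
# automatic device map. Not an official run script for this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "org/model-name"  # placeholder: replace with this repo's model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)

# device_map="auto" splits the layers across all visible GPUs
# (e.g. the 3090 and the 4090), so no torchrun launch is needed.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "Hello, how are you?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```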
