Tags: Text Generation · Transformers · PyTorch · mpt · Composer · MosaicML · llm-foundry · conversational · custom_code · text-generation-inference

How to load the model with multiple GPUs

#9
by Sven00 - opened

I have not found any guidance on how to load the model and run inference on multiple GPUs. The instructions provided by MosaicML cover only a single GPU. Thank you.
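A common way to spread a Transformers checkpoint across several GPUs is Accelerate's `device_map="auto"`, which shards the model's layers over all visible devices at load time. The sketch below is one possible approach, not an official MosaicML recipe; the checkpoint name `mosaicml/mpt-7b` and the generation settings are assumptions you should adapt to your setup.

```python
# Hedged sketch: multi-GPU inference via device_map="auto" (requires
# `pip install transformers accelerate` and at least one CUDA device).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "mosaicml/mpt-7b"  # assumed checkpoint; substitute your own

def load_sharded_model(name: str = MODEL_NAME):
    """Load the model with its layers distributed across all visible GPUs."""
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(
        name,
        torch_dtype=torch.bfloat16,  # halve memory per GPU vs. float32
        trust_remote_code=True,      # MPT ships custom modeling code
        device_map="auto",           # let Accelerate shard layers over GPUs
    )
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_sharded_model()
    # Inputs go to the device holding the embedding layer; Accelerate
    # moves activations between shards automatically during generate().
    inputs = tokenizer("Hello, my name is", return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

You can inspect the resulting placement with `model.hf_device_map` to confirm how layers were split; for finer control, pass an explicit dictionary (e.g. mapping layer names to device indices) instead of `"auto"`.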
