Can someone write a tutorial for RunPod installation after getting the Block GPU template?

#5
by ilostmyleftshoe25 - opened

Can I get the right settings for the model downloader?
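
If you just need the GGML file on the pod, one option is to pull it directly with the `huggingface_hub` library rather than the web UI's downloader. A minimal sketch is below; the repo ID, file name, and target directory are placeholders, not the actual values for this model, so substitute the GGML repo and quantisation file you want.

```python
# Minimal sketch: download a single GGML file from the Hugging Face Hub.
# repo_id, filename, and local_dir are placeholders -- replace them with
# the actual GGML repo, quantisation file, and path on your pod.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="your-namespace/your-mpt-ggml-repo",  # placeholder repo ID
    filename="model.ggmlv3.q4_0.bin",             # placeholder GGML file name
    local_dir="/workspace/models",                # assumed persistent volume path on RunPod
)
print(f"Downloaded to {local_path}")
```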

My RunPod template uses text-generation-webui, which does not currently support non-Llama GGMLs.

I will look into making another template with a different UI that does support MPT GGMLs and other non-Llama GGMLs.
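
In the meantime, one library that can load MPT GGML files outside text-generation-webui is `ctransformers`, which has an MPT backend. A minimal sketch is below, assuming the GGML file has already been downloaded locally; the file path is a placeholder.

```python
# Minimal sketch: load and run an MPT GGML file with ctransformers.
# The model path is a placeholder -- point it at your downloaded GGML file.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "/workspace/models/model.ggmlv3.q4_0.bin",  # placeholder path to the GGML file
    model_type="mpt",                           # select the MPT backend
)

print(llm("Write a short haiku about GPUs:", max_new_tokens=64))
```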
