
ValueError: MPTForCausalLM does not support `device_map='auto'` yet.

#38 opened by AayushShah

I used this code:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "mosaicml/mpt-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    low_cpu_mem_usage=True,
    trust_remote_code=True,     # MPT ships custom modeling code
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",          # this argument triggers the error below
)

But it gives this error:

ValueError: MPTForCausalLM does not support `device_map='auto'` yet.
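
Until device_map='auto' is supported, the usual workaround is to keep the whole model on a single device instead of letting accelerate shard it. Below is a minimal sketch, assuming one GPU with enough memory for MPT-7B in bfloat16 (roughly 13 GB of weights), following the loading pattern from the MPT model card; init_device is an MPT-specific config field, not a generic transformers argument:

import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-7b-instruct"

# Initialize the MPT weights directly on one GPU rather than sharding
# them across devices with device_map='auto'.
config = AutoConfig.from_pretrained(model_name, trust_remote_code=True)
config.init_device = "cuda:0"   # MPT-specific: build the model on this device

model = AutoModelForCausalLM.from_pretrained(
    model_name,
    config=config,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

Note that load_in_8bit relies on accelerate's device_map machinery under the hood, so 8-bit loading likely hits the same limitation until device_map support lands for MPTForCausalLM.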
abhi-mosaic (MosaicML) changed discussion status to closed
