Problem loading model

#28
by jdc4429 - opened

I know my path is correct; in fact, I've tried changing it multiple times, but I keep getting this error:

INFO:Loading ~/oobabooga_linux/text-generation-webui/models/mpt-7b-storywriter...
ERROR:The path to the model does not exist. Exiting.

This is the proper path! Even if I try loading the model from the web interface, nothing happens; it just doesn't load. If I start it manually from server.py, it does the same thing.

Command: python server.py --model '/home/jeff/oobabooga_linux/text-generation-webui/models/mpt-7b-storywriter' --listen --trust-remote-code --bf16

Tried with and without quotes, tried starting with ~/, tried the full path. @%@%$#$#@%$!
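
Side note for anyone hitting the same wall: text-generation-webui normally resolves --model against its own models/ directory, so it may want just the folder name rather than an absolute path. That's a guess from typical webui behavior, not confirmed for this exact version:

Command: python server.py --model mpt-7b-storywriter --listen --trust-remote-code --bf16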

If I try running it from the oobabooga launcher, I get the following error:

ValueError: MPTForCausalLM does not support device_map='auto' yet.

Yet I see no device_map option specified anywhere?!
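
For context, transformers raises this when a model class doesn't declare support for device_map='auto'; the webui likely passes that argument internally, which would explain why no option shows up in the command. A minimal sketch of loading the checkpoint directly on one GPU, with no device_map at all (assuming enough VRAM for bf16 weights):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

path = '/home/jeff/oobabooga_linux/text-generation-webui/models/mpt-7b-storywriter'
tokenizer = AutoTokenizer.from_pretrained('EleutherAI/gpt-neox-20b')  # MPT uses the GPT-NeoX tokenizer
model = AutoModelForCausalLM.from_pretrained(
    path,
    torch_dtype=torch.bfloat16,  # mirrors the --bf16 flag
    trust_remote_code=True,      # mirrors --trust-remote-code
)
model.to('cuda')  # whole model on one device instead of device_map='auto'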

I wish the people who made these things would give better instructions that work.

I'm not sure if this is why it's not loading; it's only a warning:

INFO:Loading mpt-7b-storywriter...
/home/jeff/.cache/huggingface/modules/transformers_modules/mpt-7b-storywriter/attention.py:157: UserWarning: Using attn_impl: torch. If your model does not use alibi or prefix_lm we recommend using attn_impl: flash otherwise we recommend using attn_impl: triton.
warnings.warn('Using attn_impl: torch. If your model does not use alibi or ' + 'prefix_lm we recommend using attn_impl: flash otherwise ' + 'we recommend using attn_impl: triton.')
INFO:Loading mpt-7b-storywriter...
ERROR:No model is loaded! Select one in the Model tab.
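
That UserWarning is just about which attention kernel is used, so it's probably unrelated to the loading failure. Per the warning text, the implementation can be switched through the model config before loading; a sketch, assuming the triton (or flash-attn) kernels are actually installed:

from transformers import AutoConfig, AutoModelForCausalLM

path = '/home/jeff/oobabooga_linux/text-generation-webui/models/mpt-7b-storywriter'
config = AutoConfig.from_pretrained(path, trust_remote_code=True)
config.attn_config['attn_impl'] = 'triton'  # or 'flash'; 'torch' is the slow default
model = AutoModelForCausalLM.from_pretrained(path, config=config, trust_remote_code=True)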

Mosaic ML, Inc. org

Support for device_map: auto was just added in this PR: https://huggingface.co/mosaicml/mpt-7b-storywriter/discussions/33
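
Once the updated model code is pulled (a fresh download or git pull of the repo), loading with device_map='auto' should go through; a sketch, assuming accelerate is installed:

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    'mosaicml/mpt-7b-storywriter',
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map='auto',  # accepted after the linked PR
)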

abhi-mosaic changed discussion status to closed
