Was anyone successful in using this model with FastChat?

#1
by Digg - opened

I am trying to get this model to work with FastChat by running the following command in the terminal:
python3 -m fastchat.serve.cli --model-path TheBloke/wizard-vicuna-13B-SuperHOT-8K-GPTQ --device mps

However, I am getting this error:
TheBloke/wizard-vicuna-13B-SuperHOT-8K-GPTQ does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.

Is there a way to resolve this issue?
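For what it's worth, that error comes from the loader not finding any of the standard (unquantized) checkpoint filenames it looks for, while GPTQ repos typically ship only quantized `.safetensors` weights. Below is a minimal sketch of that kind of filename check; the helper name, the example file list, and the GPTQ filename are illustrative assumptions, not the actual Transformers or FastChat source:

```python
# Illustrative sketch (NOT the real Transformers/FastChat code) of the
# filename check behind the "does not appear to have a file named ..." error.
# A GPTQ repo usually contains only quantized .safetensors weights, so none
# of the standard checkpoint names below are present and loading fails.

STANDARD_WEIGHT_FILES = (
    "pytorch_model.bin",   # PyTorch
    "tf_model.h5",         # TensorFlow
    "model.ckpt",          # TF1-style checkpoint
    "flax_model.msgpack",  # Flax
)

def has_standard_checkpoint(repo_files):
    """Return True if any standard (unquantized) weight file is present."""
    return any(name in repo_files for name in STANDARD_WEIGHT_FILES)

# Hypothetical file listing resembling a TheBloke GPTQ repo:
gptq_repo = [
    "config.json",
    "tokenizer.model",
    "quantize_config.json",
    "gptq_model-4bit-128g.safetensors",  # quantized weights only
]

print(has_standard_checkpoint(gptq_repo))  # → False: would trigger the error above
```

So the repo itself is fine; the loader being used simply does not recognize quantized GPTQ weight files.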

python3 -m fastchat.serve.cli --model-path lmsys/vicuna-7b-v1.3

I see you have a '1' before python; just wondering if that's the problem?

Thanks for your response. I edited my original message to remove the '1'; I mistyped the command in the post while adding the backtick (`) character. In the terminal, I typed the command correctly.

