ValueError when using vLLM

#5
by aliaminibagh - opened

When I run this command:
vllm serve "CohereForAI/c4ai-command-r7b-12-2024"

I get this error:
ValueError: The checkpoint you are trying to load has model type cohere2 but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

Note: Uninstalling and reinstalling Transformers does not fix the error!

Cohere For AI org

Yes, the changes needed to run this model are only available if you install Transformers from source. If you are using pip, run pip install 'git+https://github.com/huggingface/transformers.git' and you should be good to go!
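To check whether your installed Transformers build is new enough before retrying vLLM, you can compare its version against the first release that shipped cohere2 support. A minimal sketch, assuming 4.48.0 was the first stable release to include the cohere2 architecture (older or source builds may differ; the helper names here are made up for illustration):

```python
from importlib import metadata

# Assumed: the first stable Transformers release including cohere2.
MIN_COHERE2_VERSION = (4, 48, 0)

def version_tuple(version: str) -> tuple:
    """Parse the leading numeric parts of a version string, e.g. '4.48.0' -> (4, 48, 0)."""
    return tuple(int(p) for p in version.split(".")[:3] if p.isdigit())

def transformers_supports_cohere2() -> bool:
    """Return True if the installed Transformers version should recognize cohere2."""
    try:
        installed = metadata.version("transformers")
    except metadata.PackageNotFoundError:
        return False  # Transformers is not installed at all
    return version_tuple(installed) >= MIN_COHERE2_VERSION
```

If the check returns False, installing from source as described above (or upgrading to a release that includes cohere2) should resolve the ValueError.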
