
Cannot run model with vLLM library - missing config.json file

#22
by JBod - opened

Hey, I was trying to deploy the model in a notebook. After many struggles I finally reached a point where I need help.

OSError: /root/mistral_models/Pixtral does not appear to have a file named config.json. Checkout 'https://huggingface.co//root/mistral_models/Pixtral/tree/None' for available files.

It seems that the config.json is missing. Can I reuse one from another model?

BR
Jacek

I have the same error

config.json is the config file for Hugging Face-formatted models, where you get a series of sharded model safetensors files plus the config.json file.

Mistral initially releases their models in the Mistral format, where you get a single consolidated.safetensors and a params.json.
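
Roughly, the two layouts look like this (exact shard counts and tokenizer files vary by model; for Pixtral the Mistral-format tokenizer is tekken.json):

```
Hugging Face format:                   Mistral format:
  config.json                            params.json
  model-00001-of-00002.safetensors       consolidated.safetensors
  model-00002-of-00002.safetensors       tekken.json
  tokenizer.json, tokenizer_config.json
```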

vLLM is supposed to support this format (https://github.com/vllm-project/vllm/pull/8168), and from what I can tell it should just work. This was released in v0.6.1, but there have been fixes since, so you'd probably be best going with v0.6.2.
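
For instance, pointing vLLM straight at the hub repo should auto-detect the Mistral format. A minimal sketch, assuming vLLM >= 0.6.2 is installed:

```python
from vllm import LLM

# Pixtral ships only Mistral-format files (consolidated.safetensors,
# params.json, tekken.json), so the mistral tokenizer mode is needed.
llm = LLM(model="mistralai/Pixtral-12B-2409", tokenizer_mode="mistral")
```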

If that doesn't work, you could also try passing these arguments to force it into that format (see the sketch after this list):

  • --load-format mistral
  • --config-format mistral
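
Since you're in a notebook, the same options can, as far as I can tell, be passed as keyword arguments to the LLM constructor instead of as CLI flags. A minimal sketch, assuming vLLM >= 0.6.2 and reusing the local path from the error message above (adjust it to wherever you downloaded the Mistral-format files):

```python
from vllm import LLM, SamplingParams

llm = LLM(
    model="/root/mistral_models/Pixtral",
    tokenizer_mode="mistral",  # tokenizer is tekken.json, not a HF tokenizer
    load_format="mistral",     # weights: consolidated.safetensors
    config_format="mistral",   # config: params.json instead of config.json
)

# Quick text-only smoke test to confirm the model loads and generates.
outputs = llm.generate("Hello!", SamplingParams(max_tokens=32))
print(outputs[0].outputs[0].text)
```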
