Converting back to Mistral/vLLM format
#4 · opened by RonanMcGovern
Many thanks for making this version.
Is there a way to convert back to the Mistral/vLLM format?
The reason I ask is that I have a fine-tuned Transformers model that I want to serve with vLLM. Thanks.
Any update on this? We're very interested too.
Hi, you would basically need to write the reverse of https://github.com/huggingface/transformers/blob/main/src/transformers/models/pixtral/convert_pixtral_weights_to_hf.py.
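In outline, such a reverse script just inverts the key-rename map from that file and merges the HF shards back into a single `consolidated.safetensors`. Here is a minimal sketch; the rename rules are illustrative assumptions inferred from the forward script, so verify each one (and the rotary q/k permutation) against `convert_pixtral_weights_to_hf.py` before relying on it:

```python
# Minimal sketch of the reverse conversion. Key names below are
# illustrative assumptions, not the authoritative mapping.
import re
from pathlib import Path

from safetensors.torch import load_file, save_file

# Illustrative HF -> Mistral key rename rules (regex pattern, replacement).
HF_TO_MISTRAL = [
    (r"^language_model\.model\.layers\.(\d+)\.self_attn\.q_proj", r"layers.\1.attention.wq"),
    (r"^language_model\.model\.layers\.(\d+)\.self_attn\.k_proj", r"layers.\1.attention.wk"),
    (r"^language_model\.model\.layers\.(\d+)\.self_attn\.v_proj", r"layers.\1.attention.wv"),
    (r"^language_model\.model\.layers\.(\d+)\.self_attn\.o_proj", r"layers.\1.attention.wo"),
    # ... MLP, norm, embedding, and vision-tower rules go here.
]

def rename(key: str) -> str:
    """Map a single HF weight name back to its Mistral-format name."""
    for pattern, repl in HF_TO_MISTRAL:
        new_key, n = re.subn(pattern, repl, key)
        if n:
            return new_key
    return key

def convert(hf_dir: str, out_dir: str) -> None:
    # Merge all HF safetensors shards into one state dict.
    state_dict = {}
    for shard in sorted(Path(hf_dir).glob("*.safetensors")):
        state_dict.update(load_file(shard))
    mistral_sd = {rename(k): v for k, v in state_dict.items()}
    # NOTE: the forward script permutes q/k weights for rotary embeddings;
    # the inverse permutation has to be applied here as well.
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    save_file(mistral_sd, Path(out_dir) / "consolidated.safetensors")

convert("./pixtral-hf", "./pixtral-mistral")  # hypothetical paths
```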
I wrote this script, which seems to work: https://github.com/spring-anth/transform_pixtral/blob/main/convert_hf_transformers_pixtral_model_to_vllm_compatible_version.py. I can now host my fine-tuned model with vLLM. If you find any mistakes or have improvement suggestions, please let me know :)
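For completeness, once the converted `consolidated.safetensors` sits next to the `params.json` and `tekken.json` from the base model, it should load with vLLM's Mistral-format options. A short usage sketch (the directory name is just the hypothetical output of a conversion script):

```python
# Loading a converted checkpoint with vLLM's Mistral-format options;
# "./pixtral-mistral" is a hypothetical conversion output directory.
from vllm import LLM, SamplingParams

llm = LLM(
    model="./pixtral-mistral",
    tokenizer_mode="mistral",  # use the tekken.json tokenizer
    config_format="mistral",   # read params.json instead of config.json
    load_format="mistral",     # load consolidated.safetensors
)

outputs = llm.chat(
    [{"role": "user", "content": "Say hello."}],
    sampling_params=SamplingParams(max_tokens=32),
)
print(outputs[0].outputs[0].text)
```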