vLLM deployment fails

#4
by cheneychenglong - opened

vLLM deployment fails.

Thanks for your feedback. We will provide vLLM inference code soon.

Deployment fails for me too (+1) — it cannot successfully read the contents of config.json:


Skywork org

We've confirmed that two modifications to config.json are needed to support vLLM deployment:

  • Add the architectures field: "architectures": ["InternVLChatModel"]
  • Change the model_type field: "model_type": "internvl_chat"
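
Concretely, the relevant portion of config.json would look like this after both edits (all other fields in the file are unchanged and omitted here for brevity):

```json
{
  "architectures": ["InternVLChatModel"],
  "model_type": "internvl_chat"
}
```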

We'll update our documentation with these necessary steps. Thanks again for your contribution!
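
If you'd rather apply the two fixes programmatically than hand-edit the file, a small helper like the following works; the field values are the ones confirmed in this thread, while the helper function itself and the path argument are ours, not part of any official tooling:

```python
import json


def patch_config(path: str) -> None:
    """Add the two fields vLLM needs to the model's config.json.

    A convenience sketch based on the fix above: sets the
    `architectures` and `model_type` fields and leaves every
    other field in the config untouched.
    """
    with open(path, encoding="utf-8") as f:
        cfg = json.load(f)

    # The two modifications confirmed in this thread.
    cfg["architectures"] = ["InternVLChatModel"]
    cfg["model_type"] = "internvl_chat"

    with open(path, "w", encoding="utf-8") as f:
        json.dump(cfg, f, indent=2, ensure_ascii=False)
```

Run it once against the config.json in your local model directory before starting the vLLM server.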
