vLLM deployment failed
#4 opened by cheneychenglong
Thank you for the feedback; we will release vLLM inference code soon.
We have confirmed that two modifications to config.json are needed to support vLLM deployment:
- Add the architectures field: "architectures": ["InternVLChatModel"]
- Modify the model_type field: "model_type": "internvl_chat"
We will update the documentation with these required steps. Thanks again for your contribution!
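The two config.json changes above can be applied programmatically. Below is a minimal sketch; the `patch_config` helper and the sample dict are illustrative (a real InternVL config.json contains many more keys), and the field values are exactly the ones confirmed in this thread.

```python
import json

def patch_config(config: dict) -> dict:
    """Apply the two config.json changes needed for vLLM deployment."""
    # Add (or overwrite) the architectures field.
    config["architectures"] = ["InternVLChatModel"]
    # Change model_type to the value vLLM expects.
    config["model_type"] = "internvl_chat"
    return config

# Demonstration on a minimal stand-in config; in practice, load your
# model directory's config.json, patch it, and write it back.
sample = {"model_type": "internvl", "hidden_size": 4096}
patched = patch_config(sample)
print(json.dumps(patched, indent=2, ensure_ascii=False))
```

To patch a real checkout, read the file with `json.load`, pass the dict through `patch_config`, and rewrite it with `json.dump(..., indent=2)` so the rest of the config is preserved.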