Spaces: Running on A10G
Phi-3.5-MoE-instruct #117
by goodasdgood - opened
When will Phi-3.5-MoE-instruct be supported?
What about Qwen/Qwen2-VL?
Error converting to fp16:
INFO:hf-to-gguf:Loading model: llava-v1.5-7b
ERROR:hf-to-gguf:Model LlavaLlamaForCausalLM is not supported
How do I convert Phi-3.5-MoE-instruct to GGUF?
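For reference, conversion is usually done with llama.cpp's convert_hf_to_gguf.py, and it only works once that script recognizes the model's architecture (PhiMoEForCausalLM in this case). A rough sketch, assuming a local llama.cpp checkout that is recent enough to support the architecture, plus enough disk space for the full weights:

```shell
# Download the model weights locally (assumes huggingface-cli is installed)
huggingface-cli download microsoft/Phi-3.5-MoE-instruct --local-dir Phi-3.5-MoE-instruct

# Convert the HF checkpoint to a GGUF file at f16 precision
# (fails with "Model ... is not supported" if the llama.cpp checkout is too old)
python llama.cpp/convert_hf_to_gguf.py Phi-3.5-MoE-instruct \
    --outfile phi-3.5-moe-instruct-f16.gguf --outtype f16
```

If the converter reports the architecture as unsupported, updating llama.cpp (or waiting for support to land upstream) is the usual fix; the hosted conversion Space can only handle architectures its bundled llama.cpp version knows about.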
mlx-community/Dracarys2-72B-Instruct-4bit: Error converting
rhymes-ai/Aria: Error converting