The Mixtral-only config keys were mixed in, so I removed them, because llama.cpp's convert script failed with them present:
"num_local_experts": 8,
"num_experts_per_tok": 2,

Owner

@mmnga Done! Thank you for spotting that!

Vezora changed pull request status to closed
