Incorrect intermediate_size

#2

This fixes the model for llama.cpp at least; it is untested on transformers.

The field is correct; the conversion script needs to handle `moe_intermediate_size` and `shared_expert_intermediate_size`, see ggerganov/llama.cpp#7816 (comment).
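As a minimal sketch of what the conversion script would need to read: MoE configs can carry `moe_intermediate_size` (routed experts) and `shared_expert_intermediate_size` (shared expert) in addition to, or instead of, a plain `intermediate_size`. The helper name `get_intermediate_sizes` below is hypothetical; only the three field names come from the discussion above.

```python
import json

def get_intermediate_sizes(config_path):
    """Read the FFN-size fields a MoE-aware conversion script must handle.

    Hypothetical helper: the three config keys are the ones discussed
    above; absent keys are returned as None rather than raising.
    """
    with open(config_path) as f:
        config = json.load(f)
    return {
        # Dense FFN size; may be absent or misleading on MoE models.
        "intermediate_size": config.get("intermediate_size"),
        # Per-expert FFN size for the routed experts.
        "moe_intermediate_size": config.get("moe_intermediate_size"),
        # FFN size of the always-active shared expert.
        "shared_expert_intermediate_size": config.get("shared_expert_intermediate_size"),
    }
```

A converter that only reads `intermediate_size` would size the expert tensors wrongly on such models, which is the failure mode this PR works around.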

CISCai changed pull request status to closed