# mixtral / mergekit_moe_config.yml
base_model: unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit  # shared (non-expert) weights come from this model
gate_mode: random      # initialize the router gates randomly instead of from prompt hidden states
architecture: mixtral  # emit a Mixtral-style sparse MoE checkpoint
dtype: bfloat16        # dtype of the output tensors
experts:               # each entry becomes one expert; both reuse the same source model here
  - source_model: unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit
  - source_model: unsloth/Mistral-Nemo-Instruct-2407-bnb-4bit
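
# A typical way to build the merged model from this config (a sketch: it
# assumes the mergekit package is installed, and the output path below is
# illustrative, not part of the original config):
#
#   mergekit-moe mergekit_moe_config.yml ./merged-moe-model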