MoEstral-2x7B / mergekit_moe_config.yml
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: cheap_embed
dtype: float16
experts:
  - source_model: mistralai/Mistral-7B-Instruct-v0.2
    positive_prompts: ["science, logic, math"]
  - source_model: teknium/OpenHermes-2.5-Mistral-7B
    positive_prompts: ["reasoning, numbers"]