MultiverseBuddy-15B-MoE / mergekit_moe_config.yml
base_model: allknowingroger/MultiverseEx26-7B-slerp
experts:
  - source_model: allknowingroger/MultiverseEx26-7B-slerp
    positive_prompts: ["what"]
  - source_model: OpenBuddy/openbuddy-mistral2-7b-v20.2-32k
    positive_prompts: ["think"]