# Grafted-Llama2-2x70B / mergekit_moe_config.yml
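#
# mergekit MoE merge: the models listed under `experts` are combined into a
# single 2-expert mixture-of-experts checkpoint. Per mergekit's MoE docs,
# base_model supplies the shared (non-expert) weights, gate_mode: hidden
# initializes each router gate from hidden-state representations of that
# expert's positive_prompts, and dtype sets the precision of the saved merge.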
base_model: /workspace/WG
gate_mode: hidden
dtype: float16
experts:
  - source_model: /workspace/AN
    positive_prompts: ["merged roleplay. erp"]
  - source_model: /workspace/WG
    positive_prompts: ["roleplay, code"]