Lily-MoE-2x7b / mergekit_moe_config.yml
base_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
gate_mode: hidden # one of "hidden", "cheap_embed", or "random"
dtype: bfloat16 # output dtype (float32, float16, or bfloat16)
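# Note: in a mergekit MoE merge, the base model supplies the shared weights
# (embeddings, attention, norms), while each entry under `experts` below
# contributes the MLP weights for one expert of the resulting Mixtral-style model.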
experts:
  - source_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo-laser
    positive_prompts:
      - "chat"
      - "assistant"
      - "tell me"
      - "explain"
      - "code"
      - "programming"
  - source_model: LunaticPython161/CyberWitch-7B
    positive_prompts:
      - "solve"
      - "count"
      - "math"
      - "mathematics"
      - "algorithm"
      - "cypher"
      - "cybersecurity"
      - "penetration testing"
      - "red team"
      - "blue team"
      - "hacking"