---
license: apache-2.0
tags:
- moe
- merge
- mergekit
- lazymergekit
- cognitivecomputations/dolphin-2_6-phi-2
- lxuechen/phi-2-dpo
---
# phixtral-2x2.8
phixtral-2x2.8 is a Mixture of Experts (MoE) made with the following models using a custom version of mergekit:

* [cognitivecomputations/dolphin-2_6-phi-2](https://huggingface.co/cognitivecomputations/dolphin-2_6-phi-2)
* [lxuechen/phi-2-dpo](https://huggingface.co/lxuechen/phi-2-dpo)
## 🧩 Configuration
```yaml
base_model: cognitivecomputations/dolphin-2_6-phi-2
gate_mode: cheap_embed
experts:
  - source_model: cognitivecomputations/dolphin-2_6-phi-2
    positive_prompts: [""]
  - source_model: lxuechen/phi-2-dpo
    positive_prompts: [""]
```
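As a rough sketch of how this configuration could be applied, the snippet below writes the YAML above to disk and calls mergekit's `mergekit-moe` entry point on it. Since this merge was made with a custom version of mergekit, the exact command, flags, and output layout are assumptions based on the stock tool.

```python
# Rough sketch: applying the MoE configuration above with mergekit.
# The merge in this card used a *custom* mergekit build, so the exact
# entry point and arguments here are assumptions based on stock mergekit.
import subprocess

CONFIG = """\
base_model: cognitivecomputations/dolphin-2_6-phi-2
gate_mode: cheap_embed
experts:
  - source_model: cognitivecomputations/dolphin-2_6-phi-2
    positive_prompts: [""]
  - source_model: lxuechen/phi-2-dpo
    positive_prompts: [""]
"""

with open("config.yaml", "w") as f:
    f.write(CONFIG)

# mergekit-moe <config> <output_dir>
subprocess.run(["mergekit-moe", "config.yaml", "phixtral-2x2.8"], check=True)
```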
## 💻 Usage
This architecture is not compatible with the transformers library out of the box. I'm hacking together a custom implementation to run it. Contact me if you're interested!
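For reference, here is a purely hypothetical sketch of what loading could look like once custom modeling code is published for the repo. It assumes the standard `trust_remote_code` path and uses a placeholder repo id; it will not work with vanilla transformers today.

```python
# Hypothetical sketch only: phixtral-2x2.8 does NOT load with vanilla transformers yet.
# Assumes a future custom modeling file exposed via trust_remote_code,
# and a placeholder Hub repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "phixtral-2x2.8"  # replace with the actual Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,  # would pull in the custom MoE modeling code
    torch_dtype="auto",
    device_map="auto",
)

inputs = tokenizer("Explain what a Mixture of Experts is.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```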