---
license: cc-by-nc-4.0
tags:
- moe
---
# Fine Tuned Mixtral MOE 2x7B
A mixture-of-experts (MoE) model built from the following models with mergekit, then fine-tuned with DPO (Direct Preference Optimization). Illustrative sketches of both steps follow the list below.
* [mistralai/Mistral-7B-Instruct-v0.2](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2)
* [NurtureAI/neural-chat-7b-v3-16k](https://huggingface.co/NurtureAI/neural-chat-7b-v3-16k)
* [jondurbin/bagel-dpo-7b-v0.1](https://huggingface.co/jondurbin/bagel-dpo-7b-v0.1)
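
The card does not publish the exact merge recipe, but a mergekit MoE merge of this shape could look like the sketch below. It assumes Mistral-7B-Instruct-v0.2 serves as the base model with the other two models as experts, and the `positive_prompts` used to initialize the router are illustrative placeholders, not the authors' actual prompts.

```python
# Minimal sketch of a mergekit MoE merge for the models above. The
# base/expert split, gate_mode, and positive_prompts are assumptions,
# not the published recipe.
import subprocess

MERGE_CONFIG = """\
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden  # initialize the router from hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: NurtureAI/neural-chat-7b-v3-16k
    positive_prompts:
      - "summarize this conversation"  # placeholder routing prompt
  - source_model: jondurbin/bagel-dpo-7b-v0.1
    positive_prompts:
      - "reason step by step to solve this problem"  # placeholder routing prompt
"""

with open("moe_config.yaml", "w", encoding="utf-8") as f:
    f.write(MERGE_CONFIG)

# mergekit ships a dedicated `mergekit-moe` entry point for MoE merges.
subprocess.run(["mergekit-moe", "moe_config.yaml", "./mixtral-2x7b-merged"], check=True)
```

With one shared base and two experts, the output is a Mixtral-architecture checkpoint, consistent with the "2x7B" in the model name.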
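
The DPO stage is likewise unspecified; one common way to run it is trl's `DPOTrainer`. The sketch below assumes the trl ~0.7-era API and a preference dataset with `prompt`/`chosen`/`rejected` columns; the dataset id and hyperparameters are placeholders, not the authors' actual setup.

```python
# Minimal DPO fine-tuning sketch over the merged checkpoint, assuming
# trl's DPOTrainer (API as of trl ~0.7). Dataset and hyperparameters
# are illustrative placeholders.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

model_path = "./mixtral-2x7b-merged"  # output of the merge step above
model = AutoModelForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Hypothetical dataset id; any set with "prompt", "chosen", "rejected" columns works.
dataset = load_dataset("your-org/your-dpo-pairs", split="train")

args = TrainingArguments(
    output_dir="./mixtral-2x7b-dpo",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    learning_rate=5e-6,
    num_train_epochs=1,
    bf16=True,
    remove_unused_columns=False,  # required by trl's DPO data collator
)

trainer = DPOTrainer(
    model,
    ref_model=None,   # trl clones the policy as the frozen reference model
    args=args,
    beta=0.1,         # strength of the implicit KL penalty toward the reference
    train_dataset=dataset,
    tokenizer=tokenizer,
)
trainer.train()
trainer.save_model("./mixtral-2x7b-dpo")
```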