---
license: apache-2.0
---
MoE model built with:
1. https://github.com/cg123/mergekit/tree/mixtral
2. Mistral models: recent merges and fine-tunes.
3. Expert prompts heavily inspired by https://huggingface.co/Kquant03/Eukaryote-8x7B-bf16
For details, check the model files; the config YAML I used to create this model is included there.
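For reference, a mergekit MoE config on the `mixtral` branch generally follows the shape below. This is an illustrative sketch only, not the actual config shipped with this model (see the model files for that); the model names and prompts here are placeholders.

```yaml
# Hypothetical mergekit-moe config sketch -- not the config used for this model.
base_model: mistralai/Mistral-7B-Instruct-v0.2   # placeholder base model
gate_mode: hidden        # route via hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: example/mistral-7b-code-finetune   # placeholder expert
    positive_prompts:
      - "Write a Python function"
      - "Debug this code"
  - source_model: example/mistral-7b-chat-merge      # placeholder expert
    positive_prompts:
      - "Tell me a story"
      - "Answer this question"
```

Each expert is a Mistral-architecture model, and the positive prompts steer the router toward that expert for matching inputs.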
Come back later for more details.