---
license: apache-2.0
language:
- en
tags:
- merge
---
MoE model built with:
- https://github.com/cg123/mergekit/tree/mixtral
- Mistral models from recent merges and fine-tunes.
- Expert prompts heavily inspired by https://huggingface.co/Kquant03/Eukaryote-8x7B-bf16

For details, check the model files; the config YAML I used to create this model is included in the repo.
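For reference, configs on the mergekit `mixtral` branch typically define a base model, a gating mode, and a list of experts with routing prompts. A minimal sketch of that shape is below; the model names and prompts here are illustrative placeholders, not the actual ones used for this merge (those are in the config YAML in this repo):

```yaml
# Hypothetical mergekit MoE config sketch -- not the config used for this model.
base_model: mistralai/Mistral-7B-v0.1
gate_mode: hidden          # route tokens by hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: mistralai/Mistral-7B-Instruct-v0.2   # placeholder expert
    positive_prompts:
      - "Write a clear, helpful answer to this question"
  - source_model: teknium/OpenHermes-2.5-Mistral-7B    # placeholder expert
    positive_prompts:
      - "Solve this reasoning problem step by step"
```

Each expert's `positive_prompts` steer the router toward that expert for matching inputs, which is where the Eukaryote-inspired prompts come in.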
Come back later for more details.