---
base_model:
- openbmb/Eurux-8x22b-nca
- alpindale/WizardLM-2-8x22B
- fireworks-ai/mixtral-8x22b-instruct-oh
- migtissera/Tess-2.0-Mixtral-8x22B
library_name: transformers
tags:
- mergekit
- merge
---
# WizardLM-2-8x22B-BigMerge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Model Stock](https://arxiv.org/abs/2403.19522) merge method, with [alpindale/WizardLM-2-8x22B](https://huggingface.co/alpindale/WizardLM-2-8x22B) as the base.
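
Model Stock interpolates each layer between the base weights and the average of the fine-tuned weights, with a ratio derived from the angle between the fine-tuned models' task vectors. The snippet below is a rough sketch of that idea, assuming the interpolation formula from the paper; it is not mergekit's actual implementation, and `model_stock_merge` is a hypothetical helper.

```python
# Illustrative sketch of Model Stock interpolation for a single weight
# tensor, assuming the formula from arXiv:2403.19522. This is NOT
# mergekit's implementation; model_stock_merge is a hypothetical helper.
import torch
import torch.nn.functional as F

def model_stock_merge(base: torch.Tensor, finetuned: list[torch.Tensor]) -> torch.Tensor:
    n = len(finetuned)
    # Task vectors: each fine-tuned tensor's offset from the base weights.
    deltas = [(w - base).flatten() for w in finetuned]
    # Estimate cos(theta) as the average pairwise cosine similarity.
    cos_vals = [
        F.cosine_similarity(deltas[i], deltas[j], dim=0)
        for i in range(n) for j in range(i + 1, n)
    ]
    cos_theta = torch.stack(cos_vals).mean()
    # Interpolation ratio from the paper: t = N*cos / (1 + (N-1)*cos).
    t = n * cos_theta / (1 + (n - 1) * cos_theta)
    # Move from the base toward the average of the fine-tuned weights.
    w_avg = torch.stack(finetuned).mean(dim=0)
    return t * w_avg + (1 - t) * base
```
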
### Models Merged

The following models were included in the merge:

* [openbmb/Eurux-8x22b-nca](https://huggingface.co/openbmb/Eurux-8x22b-nca)
* [fireworks-ai/mixtral-8x22b-instruct-oh](https://huggingface.co/fireworks-ai/mixtral-8x22b-instruct-oh)
* [migtissera/Tess-2.0-Mixtral-8x22B](https://huggingface.co/migtissera/Tess-2.0-Mixtral-8x22B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: alpindale/WizardLM-2-8x22B
  - model: openbmb/Eurux-8x22b-nca
  - model: migtissera/Tess-2.0-Mixtral-8x22B
  - model: fireworks-ai/mixtral-8x22b-instruct-oh
base_model: alpindale/WizardLM-2-8x22B
merge_method: model_stock
dtype: bfloat16
```
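
To reproduce the merge, this configuration can be saved as `config.yaml` and passed to mergekit's `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yaml ./merged`). The resulting checkpoint loads like any other `transformers` causal LM; below is a minimal usage sketch, where the repo id is a placeholder for wherever the merged weights are published.

```python
# Minimal usage sketch; the repo id is a placeholder, not a confirmed
# published location for this merge.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WizardLM-2-8x22B-BigMerge"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",           # shard across available GPUs; 8x22B is large
)

inputs = tokenizer("Write a haiku about model merging.", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```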