---
base_model:
- MidoriUnko/Behemoth-v2.2-Magnum-v4-123B
- MarsupialAI/Monstral-123B-v2
library_name: transformers
tags:
- mergekit
- merge
---
# merged_model

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the passthrough merge method, which stacks the selected layer slices from each source model verbatim rather than averaging their weights (see the sketch below).
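
As a rough illustration only (not mergekit's actual implementation), passthrough can be pictured as concatenating the chosen decoder-layer ranges into one deeper stack. The sketch below counts the layers produced by the two slices in the configuration further down, assuming mergekit's half-open `layer_range` convention and 88 decoder layers per 123B source model.

```python
# Illustrative sketch only; mergekit performs the real merge on model weights.
# Assumes layer_range is a half-open interval [start, end) and that each
# 123B source model has 88 decoder layers.
slices = [
    ("MarsupialAI/Monstral-123B-v2", range(0, 61)),              # layers 0-60
    ("MidoriUnko/Behemoth-v2.2-Magnum-v4-123B", range(27, 88)),  # layers 27-87
]

merged_stack = [(model, layer) for model, layers in slices for layer in layers]
print(len(merged_stack))  # 122 layers in the merged model, up from 88 per source
```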
### Models Merged

The following models were included in the merge:

* [MidoriUnko/Behemoth-v2.2-Magnum-v4-123B](https://huggingface.co/MidoriUnko/Behemoth-v2.2-Magnum-v4-123B)
* [MarsupialAI/Monstral-123B-v2](https://huggingface.co/MarsupialAI/Monstral-123B-v2)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
- sources:
  - model: MarsupialAI/Monstral-123B-v2
    layer_range: [0, 61]
- sources:
  - model: MidoriUnko/Behemoth-v2.2-Magnum-v4-123B
    layer_range: [27, 88]
merge_method: passthrough
dtype: float16
name: monstral
```
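
To reproduce the merge, save the configuration above to a file (e.g. `config.yaml`) and run mergekit's CLI: `mergekit-yaml config.yaml ./merged_model`. Once the merged weights exist locally (or are pushed to a Hugging Face repo), they can be loaded with `transformers`; the sketch below uses a placeholder path, since this card does not name a published repo id.

```python
# Minimal loading sketch. "./merged_model" is a placeholder for the local
# output directory (or Hugging Face repo id) that holds the merged weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged_model"  # hypothetical path; point this at the real weights

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # matches the dtype used for the merge
    device_map="auto",          # shard the very large model across available GPUs
)

inputs = tokenizer("Hello,", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```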