---
license: cc-by-nc-4.0
base_model:
- saishf/Ortho-SOVL-8B-L3
- saishf/SOVLish-Maid-L3-8B
- saishf/Merge-Mayhem-L3-V2.1
- saishf/Merge-Mayhem-L3-V2
library_name: transformers
tags:
- mergekit
- merge

---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
This model is a merge of all of my SOVL models, in the hope of creating the most unhinged and wild model possible. But in Mixtral fashion!

It may be insane, it may be incoherent. I can't load it :3

### Merge Method

This model was merged using the [Mixture of Experts](https://arxiv.org/abs/2401.04088) method.

### Models Merged

The following models were included in the merge:
* [saishf/Ortho-SOVL-8B-L3](https://huggingface.co/saishf/Ortho-SOVL-8B-L3)
* [saishf/SOVLish-Maid-L3-8B](https://huggingface.co/saishf/SOVLish-Maid-L3-8B)
* [saishf/Merge-Mayhem-L3-V2.1](https://huggingface.co/saishf/Merge-Mayhem-L3-V2.1)
* [saishf/Merge-Mayhem-L3-V2](https://huggingface.co/saishf/Merge-Mayhem-L3-V2)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: saishf/Ortho-SOVL-8B-L3
gate_mode: random
dtype: bfloat16
experts:
  - source_model: saishf/Ortho-SOVL-8B-L3
  - source_model: saishf/SOVLish-Maid-L3-8B
  - source_model: saishf/Merge-Mayhem-L3-V2.1
  - source_model: saishf/Merge-Mayhem-L3-V2
```
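Note that `gate_mode: random` initializes the MoE router weights randomly rather than deriving them from prompt hidden states, so which expert handles which token is essentially arbitrary until further training. As a rough sketch, a config like the one above can be turned into a merged model with mergekit's MoE tool (the output directory name here is illustrative, not from the original card):

```shell
# Install mergekit, then run its MoE merge tool on the YAML config above.
# This downloads all four source models, so expect substantial disk usage.
pip install mergekit
mergekit-moe config.yaml ./sovl-moe-output
```

The resulting directory can then be loaded with `transformers` like any other local checkpoint, assuming enough memory for a 4-expert 8B-based MoE.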