---
license: cc-by-nc-4.0
library_name: transformers
tags:
- mergekit
- merge

---
![OrcaHermes](https://huggingface.co/cookinai/OrcaHermes-Mistral-70B-miqu/resolve/main/converted_image.png)

# OrcaHermes-Mistral-70B

This model was created by SLERP-merging two Miqu models, each fine-tuned on a high-performing dataset.


Just an experiment; I haven't seen many Miqu SLERPs yet.
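
Under the hood, a SLERP merge interpolates each pair of corresponding weight tensors along the arc between them rather than along a straight line, which tends to preserve the "direction" of both parents better than plain averaging. Below is a minimal sketch of the per-tensor operation, treating each tensor as a flat vector; this is illustrative only, not mergekit's actual implementation, and the threshold/epsilon handling is my assumption:

```python
import torch

def slerp(t, v0, v1, dot_threshold=0.9995, eps=1e-8):
    """Spherically interpolate between two weight tensors, treated as flat vectors."""
    a, b = v0.flatten().float(), v1.flatten().float()
    a_unit = a / (a.norm() + eps)
    b_unit = b / (b.norm() + eps)
    dot = torch.clamp(a_unit @ b_unit, -1.0, 1.0)
    if dot.abs() > dot_threshold:
        # Nearly colinear vectors: plain linear interpolation is numerically safer
        merged = (1.0 - t) * a + t * b
    else:
        omega = torch.arccos(dot)  # angle between the two weight directions
        s0 = torch.sin((1.0 - t) * omega) / torch.sin(omega)
        s1 = torch.sin(t * omega) / torch.sin(omega)
        merged = s0 * a + s1 * b
    return merged.reshape(v0.shape).to(v0.dtype)

# t = 0 returns the first tensor, t = 1 the second, 0.5 an "angular midpoint"
w_merged = slerp(0.5, torch.randn(1024, 1024), torch.randn(1024, 1024))
```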

### Models Merged

The following models were included in the merge:

[alicecomfy/miqu-openhermes-full](https://huggingface.co/alicecomfy/miqu-openhermes-full)
- Base Miqu trained on [OpenHermes 2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5)

[ShinojiResearch/Senku-70B-Full](https://huggingface.co/ShinojiResearch/Senku-70B-Full)
- Base Miqu trained on [SlimOrca](https://huggingface.co/datasets/Open-Orca/SlimOrca)


### Configuration

The following YAML configuration was used to produce this model:

```yaml
slices:
  - sources:
      - model: local/path/to/Senku-70B-Full
        layer_range: [0, 80]
      - model: local/path/to/miqu-openhermes-full
        layer_range: [0, 80]
merge_method: slerp
base_model: local/path/to/Senku-70B-Full
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
dtype: float16

```
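
The `t` values above are gradients: mergekit spreads each list across the 80 merged layers, so self-attention and MLP tensors get different blend ratios at different depths (with 0.5 as the fallback for all other tensors). The sketch below shows roughly how a five-point gradient could map to per-layer values, assuming linear interpolation between evenly spaced anchor points; this is an approximation for illustration, not mergekit's own code:

```python
import numpy as np

def layer_ts(gradient, num_layers=80):
    """Spread a gradient list (e.g. [0, 0.5, 0.3, 0.7, 1]) across the layer range."""
    anchors = np.linspace(0, num_layers - 1, num=len(gradient))
    return np.interp(np.arange(num_layers), anchors, gradient)

self_attn_t = layer_ts([0, 0.5, 0.3, 0.7, 1])  # attention tensors blend from one parent toward the other with depth
mlp_t = layer_ts([1, 0.5, 0.7, 0.3, 0])        # MLP tensors run the opposite direction

print(self_attn_t[[0, 20, 40, 60, 79]])  # roughly [0, 0.5, 0.3, 0.7, 1] near the anchor layers
```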