---
license: cc-by-nc-4.0
---

Replaced Zephyr with Airoboros 2.2 in the mix.

## Description

This repo contains fp16 files of Mistral-11B-AirOmniMix.

## Models used
- [Mistral-7B-OpenOrca](https://huggingface.co/Open-Orca/Mistral-7B-OpenOrca)
- [Mistral-7B-v0.1-Open-Platypus](https://huggingface.co/akjindal53244/Mistral-7B-v0.1-Open-Platypus)
- [CollectiveCognition-v1.1-Mistral-7B](https://huggingface.co/teknium/CollectiveCognition-v1.1-Mistral-7B)
- [airoboros-mistral2.2-7b](https://huggingface.co/teknium/airoboros-mistral2.2-7b)

## Prompt template

After further testing, the best prompt format is this one, since Zephyr is no longer part of the merge:

```
USER: <prompt>
ASSISTANT:
```

But this one works too:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:

```

Alternatively, the prompt format of any of the four source models should also work.
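
As a reference, here is a minimal sketch of how the USER/ASSISTANT format above can be used with the Transformers library. The local path and the generation settings are placeholders, not part of this repo; point `model_path` at wherever the fp16 files live.

```python
# Minimal sketch: load the fp16 weights and prompt with the USER/ASSISTANT format.
# "./Mistral-11B-AirOmniMix" is a placeholder path; adjust it to your local copy.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./Mistral-11B-AirOmniMix"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.float16, device_map="auto"
)

prompt = "USER: Explain what a frankenmerge is in one paragraph.\nASSISTANT:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```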

## The secret sauce

Mistral-11B-OpenOrcaPlatypus:
```
slices:
  - sources:
    - model: Open-Orca/Mistral-7B-OpenOrca
      layer_range: [0, 24]
  - sources:
    - model: akjindal53244/Mistral-7B-v0.1-Open-Platypus
      layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```
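
To make the layer arithmetic explicit (it applies to both this merge and the CC-Airo one below): each parent is a 32-layer Mistral model, and the passthrough method stacks layers 0-23 of the first parent on top of layers 8-31 of the second, so layers 8-23 exist twice and the result has 48 layers, which is where the ~11B parameter count comes from. The snippet below only reproduces that counting; it is not mergekit internals.

```python
# Layer-count arithmetic for the passthrough merges (illustration only, not mergekit code).
first_slice = list(range(0, 24))   # layers taken from the first parent
second_slice = list(range(8, 32))  # layers taken from the second parent

stacked = first_slice + second_slice                       # layer order in the merged model
duplicated = sorted(set(first_slice) & set(second_slice))  # layers present in both slices

print(len(stacked))                   # 48 transformer layers in the 11B frankenmerge
print(duplicated[0], duplicated[-1])  # layers 8..23 are the duplicated block
```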

Mistral-11B-CC-Airo:
```
slices:
  - sources:
    - model: "/content/drive/MyDrive/CC-v1.1-7B-bf16"
      layer_range: [0, 24]
  - sources:
    - model: "/content/drive/MyDrive/Mistral-7B-Airoboros-2.2-bf16"
      layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```

Mistral-11B-AirOmniMix:
```
slices:
  - sources:
      - model: Mistral-11B-OpenOrcaPlatypus
        layer_range: [0, 48]
      - model: Mistral-11B-CC-Airo
        layer_range: [0, 48]
merge_method: slerp
base_model: Mistral-11B-OpenOrcaPlatypus
parameters:
  t:
    - filter: lm_head 
      value: [0.75]
    - filter: embed_tokens
      value: [0.75]
    - filter: self_attn
      value: [0.75, 0.25]
    - filter: mlp
      value:  [0.25, 0.75]
    - filter: layernorm
      value: [0.5, 0.5]
    - filter: modelnorm
      value: [0.75]
    - value: 0.5 # fallback for rest of tensors
dtype: bfloat16
```
I used [mergekit](https://github.com/cg123/mergekit) for all the merging described here.
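
For intuition, the SLERP step above roughly amounts to spherically interpolating each pair of matching tensors with the per-filter `t` values from the config (e.g. 0.75 for `lm_head`/`embed_tokens`, 0.5 as the fallback). The sketch below is a conceptual stand-in, not mergekit's actual implementation, which also handles the per-layer `t` schedules like `[0.75, 0.25]`.

```python
# Conceptual SLERP between two weight tensors (illustration, not mergekit's code).
import torch

def slerp(a: torch.Tensor, b: torch.Tensor, t: float, eps: float = 1e-8) -> torch.Tensor:
    """Spherical interpolation: t=0 returns a, t=1 returns b."""
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    omega = torch.acos(torch.clamp(a_unit @ b_unit, -1.0, 1.0))  # angle between tensors
    if omega.abs() < eps:
        # Nearly parallel tensors: fall back to plain linear interpolation.
        return ((1 - t) * a_flat + t * b_flat).reshape(a.shape).to(a.dtype)
    sin_omega = torch.sin(omega)
    mixed = (torch.sin((1 - t) * omega) / sin_omega) * a_flat \
          + (torch.sin(t * omega) / sin_omega) * b_flat
    return mixed.reshape(a.shape).to(a.dtype)

# With t=0.5 (the fallback) both parents contribute equally; t closer to 0 or 1
# weights the mix toward one parent or the other.
```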

## Some scoring I did myself

hf-causal-experimental (pretrained=/content/drive/MyDrive/Mistral-11B-AirOmniMix), limit: None, provide_description: False, num_fewshot: 0, batch_size: 4
|    Task     |Version| Metric |Value |   |Stderr|
|-------------|------:|--------|-----:|---|-----:|
|arc_challenge|      0|acc     |0.5452|±  |0.0146|
|             |       |acc_norm|0.5836|±  |0.0144|
|arc_easy     |      0|acc     |0.8321|±  |0.0077|
|             |       |acc_norm|0.8119|±  |0.0080|
|hellaswag    |      0|acc     |0.6381|±  |0.0048|
|             |       |acc_norm|0.8250|±  |0.0038|
|piqa         |      0|acc     |0.8096|±  |0.0092|
|             |       |acc_norm|0.8243|±  |0.0089|
|truthfulqa_mc|      1|mc1     |0.3941|±  |0.0171|
|             |       |mc2     |0.5606|±  |0.0155|
|winogrande   |      0|acc     |0.7395|±  |0.0123|
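
For anyone wanting to reproduce these numbers, the harness settings listed above the table correspond roughly to the following call into EleutherAI's lm-evaluation-harness. This is a sketch assuming the 0.3.x-era API (the one that exposes the `hf-causal-experimental` model type); newer releases of the harness changed the interface.

```python
# Sketch of the evaluation run above, assuming the 0.3.x lm-evaluation-harness API.
from lm_eval import evaluator

results = evaluator.simple_evaluate(
    model="hf-causal-experimental",
    model_args="pretrained=/content/drive/MyDrive/Mistral-11B-AirOmniMix",
    tasks=["arc_challenge", "arc_easy", "hellaswag", "piqa", "truthfulqa_mc", "winogrande"],
    num_fewshot=0,
    batch_size=4,
    limit=None,
)
print(results["results"])  # per-task acc / acc_norm / mc1 / mc2 values
```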


![image/png](https://cdn-uploads.huggingface.co/production/uploads/63ab1241ad514ca8d1430003/rnraBZz-I9CUD1GVNVF00.png)

## Others

Special thanks to Sushi, to [Henky](https://github.com/KoboldAI/KoboldAI-Client) for the machine he gave me for the big tasks, and to [Charles Goddard](https://github.com/cg123) for his amazing tool.

If you want to support me, you can [here](https://ko-fi.com/undiai).