---
base_model: []
library_name: transformers
tags:
- mergekit
- merge

---
## Evolutionary model merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

The recipe below was found by an evolutionary search over the merge hyperparameters (per-slice densities and weights), with a budget of 104 evaluations.
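
For intuition, here is a minimal sketch of such a search loop, using CMA-ES from the `cma` package. The gene layout and the dummy fitness are illustrative assumptions only; in a real run the fitness would merge the models with a candidate's parameters and score the result on a benchmark (mergekit's `mergekit-evolve` automates this kind of loop).

```python
# Illustrative only: CMA-ES over per-slice merge genes. The fitness is a
# dummy stand-in for "merge with these parameters, then score on an eval".
import numpy as np
import cma

N_SLICES, N_MODELS = 4, 3           # 4 slices of 8 layers, 3 tunable donors
N_GENES = N_SLICES * N_MODELS * 2   # a density and a weight per (slice, model)

def evaluate_merge(genes: np.ndarray) -> float:
    densities = np.clip(genes[::2], 0.0, 1.0)  # densities live in [0, 1]
    weights = genes[1::2]                      # weights may go negative
    # Dummy objective; a real run would build and benchmark the merge here.
    return float(np.sum((densities - 0.9) ** 2) + np.sum((weights - 0.3) ** 2))

es = cma.CMAEvolutionStrategy(np.full(N_GENES, 0.5), 0.25)
evaluations = 0
while not es.stop() and evaluations < 104:     # budget of 104 evaluations
    candidates = es.ask()
    es.tell(candidates, [evaluate_merge(np.asarray(c)) for c in candidates])
    evaluations += len(candidates)

print("best fitness:", es.result.fbest)
```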

## Merge Details
### Merge Method

This model was merged with the [DARE](https://arxiv.org/abs/2311.03099)-[TIES](https://arxiv.org/abs/2306.01708) merge method, using NeuralBeagle14-7B as the base model.
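
In rough terms, each donor model contributes a delta from the base; DARE randomly drops a fraction (1 - density) of each delta's entries and rescales the survivors by 1/density, and TIES then elects a per-entry sign and keeps only agreeing contributions. A toy sketch on flat numpy vectors, using rounded slice-0 values from the config below; mergekit's real implementation works tensor-by-tensor and differs in detail:

```python
# Toy DARE-TIES on flat vectors. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def dare(delta: np.ndarray, density: float) -> np.ndarray:
    """DARE: drop each delta entry with prob (1 - density), then rescale
    the survivors by 1/density so the expected update is unchanged."""
    mask = rng.random(delta.shape) < density
    return delta * mask / density

def ties(deltas: list[np.ndarray], weights: list[float]) -> np.ndarray:
    """TIES: elect a per-entry sign from the weighted sum, keep only the
    contributions that agree with it, and normalize by the surviving
    weights (mirroring the config's `normalize: 1.0`)."""
    sign = np.sign(sum(w * d for w, d in zip(weights, deltas)))
    num = np.zeros_like(deltas[0])
    den = np.zeros_like(deltas[0])
    for w, d in zip(weights, deltas):
        agree = np.sign(d) == sign
        num += np.where(agree, w * d, 0.0)
        den += np.where(agree, abs(w), 0.0)
    return np.divide(num, den, out=np.zeros_like(num), where=den != 0)

base = rng.normal(size=8)                       # stand-in base weights
donors = [base + rng.normal(scale=0.1, size=8) for _ in range(3)]
densities, weights = [0.91, 0.86, 1.0], [0.27, 0.70, 0.29]  # slice-0 values

deltas = [dare(m - base, p) for m, p in zip(donors, densities)]
merged = base + ties(deltas, weights)
print(merged)
```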

### Models Merged

The following models were included in the merge:
* Mistral-7B-v0.1-flashback-v2
* Mistral-7B-Merge-14-v0.2
* Starling-LM-7B-beta_581094980
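
To try the result, load it like any other Mistral-style causal LM. A minimal sketch with transformers, where `./merged-model` is a placeholder path for the merged weights:

```python
# Minimal usage sketch; "./merged-model" is a placeholder, not the
# published repo id of this merge.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged-model"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=torch.bfloat16)

inputs = tokenizer("Hello, my name is", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```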

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
dtype: bfloat16
merge_method: dare_ties
parameters:
  int8_mask: 1.0
  normalize: 1.0
slices:
- sources:
  - layer_range: [0, 8]
    model: /content/evol_merge_storage/input_models/Mistral-7B-v0.1-flashback-v2_2000655885
    parameters:
      density: 0.9063003498824225
      weight: 0.2716275746104375
  - layer_range: [0, 8]
    model: /content/evol_merge_storage/input_models/Mistral-7B-Merge-14-v0.2_3453453312
    parameters:
      density: 0.8605347663753816
      weight: 0.7040535407789865
  - layer_range: [0, 8]
    model: /content/evol_merge_storage/input_models/Starling-LM-7B-beta_581094980
    parameters:
      density: 1.0
      weight: 0.29417107478605065
  - layer_range: [0, 8]
    model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
- sources:
  - layer_range: [8, 16]
    model: /content/evol_merge_storage/input_models/Mistral-7B-v0.1-flashback-v2_2000655885
    parameters:
      density: 0.9575970148743844
      weight: 0.15956926996874868
  - layer_range: [8, 16]
    model: /content/evol_merge_storage/input_models/Mistral-7B-Merge-14-v0.2_3453453312
    parameters:
      density: 1.0
      weight: 0.4071229613448434
  - layer_range: [8, 16]
    model: /content/evol_merge_storage/input_models/Starling-LM-7B-beta_581094980
    parameters:
      density: 1.0
      weight: 0.29267434269480536
  - layer_range: [8, 16]
    model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
- sources:
  - layer_range: [16, 24]
    model: /content/evol_merge_storage/input_models/Mistral-7B-v0.1-flashback-v2_2000655885
    parameters:
      density: 0.853521244265145
      weight: 0.7268702601235844
  - layer_range: [16, 24]
    model: /content/evol_merge_storage/input_models/Mistral-7B-Merge-14-v0.2_3453453312
    parameters:
      density: 1.0
      weight: 0.3526854709444127
  - layer_range: [16, 24]
    model: /content/evol_merge_storage/input_models/Starling-LM-7B-beta_581094980
    parameters:
      density: 0.8904104909249966
      weight: 0.565939501390856
  - layer_range: [16, 24]
    model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
- sources:
  - layer_range: [24, 32]
    model: /content/evol_merge_storage/input_models/Mistral-7B-v0.1-flashback-v2_2000655885
    parameters:
      density: 1.0
      weight: 0.3075681562252658
  - layer_range: [24, 32]
    model: /content/evol_merge_storage/input_models/Mistral-7B-Merge-14-v0.2_3453453312
    parameters:
      density: 0.6564325638087776
      weight: -0.24554943561719403
  - layer_range: [24, 32]
    model: /content/evol_merge_storage/input_models/Starling-LM-7B-beta_581094980
    parameters:
      density: 0.5678792182777617
      weight: 0.218593901640624
  - layer_range: [24, 32]
    model: /content/evol_merge_storage/input_models/NeuralBeagle14-7B_2368216670
```
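
To reproduce the merge, save the config and feed it to mergekit, either via `mergekit-yaml config.yaml ./merged-model` on the command line or through the Python API. A sketch of the latter, with option names as in mergekit's README and placeholder paths:

```python
# Sketch: driving the merge from Python rather than the CLI.
# "config.yaml" and "./merged-model" are placeholders.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    out_path="./merged-model",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        copy_tokenizer=True,             # carry a tokenizer into the output
        lazy_unpickle=True,              # lower peak memory while loading
    ),
)
```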