limin(gate) committed
Commit
4b63f80
1 Parent(s): a3a434f

Update README.md

Files changed (1)
  1. README.md +75 -1
README.md CHANGED
@@ -7,7 +7,7 @@ tags:
 - liminerity/merge3
 - yam-peleg/Experiment26-7B
 ---
-
+MADE WITH LOVE BY LIMINERITY
 # INEX8-7B
 
 INEX8-7B is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
@@ -17,6 +17,80 @@ INEX8-7B is a merge of the following models using [mergekit](https://github.com/
 ## 🧩 Configuration
 
 ```yaml
+MODEL_NAME = "merge"
+slices:
+  - sources:
+      - model: MSL7/INEX4-7b
+        layer_range: [0, 32]
+      - model: yam-peleg/Experiment24-7B
+        layer_range: [0, 32]
+merge_method: slerp
+base_model: MSL7/INEX4-7b
+parameters:
+  t:
+    - filter: self_attn
+      value: [0, 0.5, 0.3, 0.7, 1]
+    - filter: mlp
+      value: [1, 0.5, 0.7, 0.3, 0]
+    - value: 0.5
+dtype: bfloat16
+
+MODEL_NAME = "merge1"
+slices:
+  - sources:
+      - model: liminerity/merge
+        layer_range: [0, 32]
+      - model: CorticalStack/shadow-clown-7B-dare
+        layer_range: [0, 32]
+merge_method: slerp
+base_model: liminerity/merge
+parameters:
+  t:
+    - filter: self_attn
+      value: [0, 0.5, 0.3, 0.7, 1]
+    - filter: mlp
+      value: [1, 0.5, 0.7, 0.3, 0]
+    - value: 0.5
+dtype: bfloat16
+
+MODEL_NAME = "merge2"
+slices:
+  - sources:
+      - model: liminerity/merge1
+        layer_range: [0, 32]
+      - model: bardsai/jaskier-7b-dpo-v6.1
+        layer_range: [0, 32]
+merge_method: slerp
+base_model: liminerity/merge1
+parameters:
+  t:
+    - filter: self_attn
+      value: [0, 0.5, 0.3, 0.7, 1]
+    - filter: mlp
+      value: [1, 0.5, 0.7, 0.3, 0]
+    - value: 0.5
+dtype: bfloat16
+
+MODEL_NAME = "merge3"
+slices:
+  - sources:
+      - model: liminerity/merge2
+        layer_range: [0, 32]
+      - model: eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO
+        layer_range: [0, 32]
+merge_method: slerp
+base_model: liminerity/merge2
+parameters:
+  t:
+    - filter: self_attn
+      value: [0, 0.5, 0.3, 0.7, 1]
+    - filter: mlp
+      value: [1, 0.5, 0.7, 0.3, 0]
+    - value: 0.5
+dtype: bfloat16
+
+
+MODEL_NAME: "INEX8-7b"
 slices:
   - sources:
       - model: liminerity/merge3
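
**A note on the merge method.** Every stage above uses `slerp` (spherical linear interpolation), which blends each pair of weight tensors along the arc between them rather than along a straight line, with the `t` schedule deciding how much of each parent survives at each depth. The sketch below is illustrative only, not mergekit's actual implementation:

```python
# Illustrative slerp between two flattened weight tensors; mergekit's real
# implementation additionally handles per-tensor filters and edge cases.
import numpy as np

def slerp(t: float, a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    a_norm = a / (np.linalg.norm(a) + eps)
    b_norm = b / (np.linalg.norm(b) + eps)
    # Angle between the two weight vectors
    omega = np.arccos(np.clip(np.dot(a_norm, b_norm), -1.0, 1.0))
    if np.sin(omega) < eps:
        # Nearly parallel tensors: slerp degenerates to linear interpolation
        return (1.0 - t) * a + t * b
    return (np.sin((1.0 - t) * omega) * a + np.sin(t * omega) * b) / np.sin(omega)

# t comes from schedules like [0, 0.5, 0.3, 0.7, 1]: the schedule is
# interpolated across layer depth, so t = 0 keeps the base model's tensor
# and t = 1 takes the other model's.
```

Note the mirrored `self_attn`/`mlp` schedules in the configs: at depths where the attention tensors lean toward one parent, the feed-forward tensors lean toward the other.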
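**Reproducing a stage.** The `MODEL_NAME = ...` lines above read like bookkeeping from whatever script generated these configs; mergekit itself expects one config per file, without them. Below is a sketch of running the first stage, assuming mergekit is installed (`pip install mergekit`) and its `mergekit-yaml` CLI is on PATH; the file name and output directory are placeholders, not from the commit:

```python
# Hypothetical reproduction of the first stage ("merge"); paths are placeholders.
import subprocess

CONFIG = """\
slices:
  - sources:
      - model: MSL7/INEX4-7b
        layer_range: [0, 32]
      - model: yam-peleg/Experiment24-7B
        layer_range: [0, 32]
merge_method: slerp
base_model: MSL7/INEX4-7b
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
"""

with open("stage1.yaml", "w", encoding="utf-8") as fp:
    fp.write(CONFIG)

# mergekit-yaml <config> <output-dir>; --copy-tokenizer carries the base
# model's tokenizer into the output directory.
subprocess.run(["mergekit-yaml", "stage1.yaml", "./merge", "--copy-tokenizer"], check=True)
```

Each later stage references the previous output (`liminerity/merge` → `liminerity/merge1` → `liminerity/merge2` → `liminerity/merge3`), so every stage has to be merged and published, or pointed at a local path, before the next one can run.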
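**Usage.** A minimal inference sketch with 🤗 Transformers, assuming the merged model is published as `liminerity/INEX8-7B` (inferred from the card title, not stated in the diff):

```python
# Minimal inference sketch; the repo id is an assumption, adjust as needed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "liminerity/INEX8-7B"  # assumption: inferred from the card title
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # keep the merged bfloat16 weights
    device_map="auto",   # requires `accelerate`
)

prompt = "Explain what a SLERP model merge is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```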