matchaaaaa committed
Commit 0b68b50
1 Parent(s): 6edb133

Update README.md

Files changed (1)
  1. README.md +52 -50
README.md CHANGED
@@ -1,50 +1,52 @@
- ---
- base_model: []
- library_name: transformers
- tags:
- - mergekit
- - merge
-
- ---
- # evo-6.1
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the passthrough merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * D:/MLnonsense/models/SanjiWatsuki_Kunoichi-7B
- * mega\Kuno-Fimbul-splice
- * mega\Frosty-Mytho
- * mega\Fimbul-Frosty-Mytho-splice
- * D:/MLnonsense/models/Sao10K_Fimbulvetr-11B-v2
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- dtype: float32
- merge_method: passthrough
- slices:
- - sources:
-   - layer_range: [0, 16]
-     model: D:/MLnonsense/models/SanjiWatsuki_Kunoichi-7B
- - sources:
-   - layer_range: [0, 8]
-     model: mega\Kuno-Fimbul-splice
- - sources:
-   - layer_range: [16, 32]
-     model: D:/MLnonsense/models/Sao10K_Fimbulvetr-11B-v2
- - sources:
-   - layer_range: [0, 8]
-     model: mega\Fimbul-Frosty-Mytho-splice
- - sources:
-   - layer_range: [16, 32]
-     model: mega\Frosty-Mytho
- ```
 
 
 
+ ---
+ base_model: []
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
+ ---
+ # Chaifighter Latte 14B
+
+
+ ## The Deets
+ ### Mergekit
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ### Merge Method
+
+ This model was merged using the passthrough merge method, which stacks layer ranges from the source models end-to-end rather than averaging their weights.
+
+ ### Models Merged
+
+ * [SanjiWatsuki/Kunoichi-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-7B)
+ * [Sao10K/Fimbulvetr-11B-v2](https://huggingface.co/Sao10K/Fimbulvetr-11B-v2)
+ * [Sao10K/Frostwind-v2.1-m7](https://huggingface.co/Sao10K/Frostwind-v2.1-m7)
+ * [Gryphe/MythoMist-7b](https://huggingface.co/Gryphe/MythoMist-7b)
+
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ dtype: float32
+ merge_method: passthrough
+ slices:
+ - sources:
+   - layer_range: [0, 16]
+     model: D:/MLnonsense/models/SanjiWatsuki_Kunoichi-7B
+ - sources:
+   - layer_range: [0, 8]
+     model: mega\Kuno-Fimbul-splice
+ - sources:
+   - layer_range: [16, 32]
+     model: D:/MLnonsense/models/Sao10K_Fimbulvetr-11B-v2
+ - sources:
+   - layer_range: [0, 8]
+     model: mega\Fimbul-Frosty-Mytho-splice
+ - sources:
+   - layer_range: [16, 32]
+     model: mega\Frosty-Mytho
+ ```
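
A config like the one above is applied with mergekit's `mergekit-yaml` entry point (e.g. `mergekit-yaml config.yaml ./output-dir`). Since the card declares `library_name: transformers`, the resulting checkpoint should load like any other causal LM. A minimal sketch follows; the repo ID `matchaaaaa/Chaifighter-Latte-14B` is a guess based on the card title and committer name, not something this commit confirms.

```python
# A minimal loading sketch. The repo ID below is hypothetical; substitute the
# actual Hugging Face repo this README ships with.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "matchaaaaa/Chaifighter-Latte-14B"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # keep the dtype the checkpoint was saved in
    device_map="auto",    # requires `accelerate`; drop for CPU-only loading
)

prompt = "Once upon a time,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```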