matchaaaaa committed
Commit 793a2d1
1 Parent(s): e1c5a89

Update README.md

Files changed (1)
  1. README.md +46 -39
README.md CHANGED
@@ -1,39 +1,46 @@
- ---
- base_model:
- - KatyTheCutie/LemonadeRP-4.5.3
- library_name: transformers
- tags:
- - mergekit
- - merge
-
- ---
- # Mytho-Lemon-11B
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the passthrough merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * C:\MLnonsense\text-generation-webui-snapshot-2024-03-03\models\Gryphe_MythoMist-7B
- * [KatyTheCutie/LemonadeRP-4.5.3](https://huggingface.co/KatyTheCutie/LemonadeRP-4.5.3)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- slices:
- - sources:
-   - model: KatyTheCutie/LemonadeRP-4.5.3
-     layer_range: [0, 24]
- - sources:
-   - model: C:\MLnonsense\text-generation-webui-snapshot-2024-03-03\models\Gryphe_MythoMist-7B
-     layer_range: [8, 32]
- merge_method: passthrough
- dtype: bfloat16
- ```
+ ---
+ base_model:
+ - KatyTheCutie/LemonadeRP-4.5.3
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
+ ---
+ # Mytho-Lemon-11B
+
+ Just a simple 11B frankenmerge of LemonadeRP and MythoMist, which was used in [matchaaaaa/Chaifighter-20B-v2](https://huggingface.co/matchaaaaa/Chaifighter-20B-v2).
+
+ I didn't have to merge the models like this in Chaifighter, but I already had this lying around from a previous attempt, so I just went with it. It's nothing special, but here it is!
+
+ ## Merge Details
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ### Merge Method
+
+ This model was merged using the passthrough merge method.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [Gryphe/MythoMist-7B](https://huggingface.co/Gryphe/MythoMist-7b)
+ * [KatyTheCutie/LemonadeRP-4.5.3](https://huggingface.co/KatyTheCutie/LemonadeRP-4.5.3)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ slices:
+ - sources:
+   - model: KatyTheCutie/LemonadeRP-4.5.3
+     layer_range: [0, 24]
+ - sources:
+   - model: Gryphe/MythoMist-7B
+     layer_range: [8, 32]
+ merge_method: passthrough
+ dtype: bfloat16
+ ```
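+
+ A quick note on how the slices stack up (a rough sketch below, assuming both parents are standard 32-layer Mistral-7B-architecture models and that `layer_range` is end-exclusive): LemonadeRP contributes layers 0-23 and MythoMist contributes layers 8-31, so the merged model ends up with 48 layers instead of 32, which is roughly where the ~11B parameter count comes from.
+
+ ```python
+ # Illustration only: layer math for the passthrough slices above.
+ # Assumes layer_range is half-open ([start, end)) and both parents are
+ # 32-layer Mistral-7B-architecture models.
+ slices = [
+     ("KatyTheCutie/LemonadeRP-4.5.3", range(0, 24)),  # layers 0-23
+     ("Gryphe/MythoMist-7B", range(8, 32)),            # layers 8-31
+ ]
+ total_layers = sum(len(layers) for _, layers in slices)
+ print(total_layers)  # 48, vs. 32 in each 7B parent -> roughly 11B parameters
+ ```
+
+ If you want to reproduce the merge, feeding this config to mergekit's `mergekit-yaml` CLI (something like `mergekit-yaml config.yml ./output-directory`, optionally with `--cuda`) should do it.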
+
+ Anyway, have a great day!