Commit 60597f4 by Masterjp123
Parent: 4e83611

Update README.md

Files changed (1):
  1. README.md +43 -1
README.md CHANGED
@@ -7,6 +7,7 @@ base_model:
 - KoboldAI/LLaMA2-13B-Erebus-v3
 - Henk717/echidna-tiefigther-25
 - Undi95/Unholy-v2-13B
+- ddh0/EstopianOrcaMaid-13b
 tags:
 - mergekit
 - merge
@@ -46,6 +47,7 @@ The following models were included in the merge:
 * [KoboldAI/LLaMA2-13B-Erebus-v3](https://huggingface.co/KoboldAI/LLaMA2-13B-Erebus-v3)
 * [Henk717/echidna-tiefigther-25](https://huggingface.co/Henk717/echidna-tiefigther-25)
 * [Undi95/Unholy-v2-13B](https://huggingface.co/Undi95/Unholy-v2-13B)
+* [EstopianOrcaMaid](https://huggingface.co/ddh0/EstopianOrcaMaid-13b)
 
 ### Configuration
 
@@ -117,4 +119,44 @@ slices:
       weight: 0.33
 ```
 
-For the final merge
+for the final merge
+```yaml
+base_model:
+  model:
+    path: TheBloke/Llama-2-13B-fp16
+dtype: bfloat16
+merge_method: ties
+parameters:
+  int8_mask: 1.0
+  normalize: 1.0
+slices:
+- sources:
+  - layer_range: [0, 40]
+    model:
+      model:
+        path: ddh0/EstopianOrcaMaid-13b
+    parameters:
+      density: [1.0, 0.7, 0.1]
+      weight: 1.0
+  - layer_range: [0, 40]
+    model:
+      model:
+        path: Masterjp123/snowyrpp1
+    parameters:
+      density: 0.5
+      weight: [0.0, 0.3, 0.7, 1.0]
+  - layer_range: [0, 40]
+    model:
+      model:
+        path: Masterjp123/snowyrpp2
+    parameters:
+      density: 0.33
+      weight:
+      - filter: mlp
+        value: 0.5
+      - value: 0.0
+  - layer_range: [0, 40]
+    model:
+      model:
+        path: TheBloke/Llama-2-13B-fp16
+```
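Not part of the commit, but worth unpacking: several parameters in the added config are gradient lists (`density: [1.0, 0.7, 0.1]`, `weight: [0.0, 0.3, 0.7, 1.0]`), which mergekit documents as being interpolated across the affected layers. A minimal Python sketch of that idea, assuming plain piecewise-linear interpolation with the list endpoints pinned to the first and last layer (`expand_gradient` is a hypothetical name, not mergekit API):

```python
def expand_gradient(values: list[float], num_layers: int) -> list[float]:
    """Piecewise-linearly interpolate a gradient list over num_layers layers."""
    if num_layers == 1 or len(values) == 1:
        return [values[0]] * num_layers
    out = []
    for layer in range(num_layers):
        # Map this layer onto a fractional position along the value list.
        pos = layer * (len(values) - 1) / (num_layers - 1)
        lo = int(pos)
        hi = min(lo + 1, len(values) - 1)
        frac = pos - lo
        out.append(values[lo] * (1.0 - frac) + values[hi] * frac)
    return out

# density: [1.0, 0.7, 0.1] over the 40-layer range used in the config above
print([round(v, 3) for v in expand_gradient([1.0, 0.7, 0.1], 40)[:4]])
# [1.0, 0.985, 0.969, 0.954]
```

Read this way, the config lets EstopianOrcaMaid's delta dominate the early layers and fade toward the top of the stack, while snowyrpp1's weight ramps up from 0.0 to 1.0 across the same range.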
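For readers unfamiliar with `merge_method: ties`: TIES-merging (Yadav et al., 2023) trims each model's task vector (its delta from the base) down to the top-magnitude fraction given by `density`, elects a majority sign per parameter, and combines only the deltas that agree with the elected sign; `normalize: 1.0` divides the result by the number of contributors. A hedged PyTorch sketch of that procedure, simplified to scalar weights and not mergekit's actual implementation:

```python
import torch

def ties_merge(base: torch.Tensor, tuned: list[torch.Tensor],
               densities: list[float], weights: list[float],
               normalize: bool = True) -> torch.Tensor:
    deltas = []
    for model, density, weight in zip(tuned, densities, weights):
        delta = model - base                          # task vector
        drop = delta.numel() - int(delta.numel() * density)
        if drop > 0:                                  # trim: keep top-density fraction
            threshold = delta.abs().flatten().kthvalue(drop).values
            delta = torch.where(delta.abs() > threshold, delta,
                                torch.zeros_like(delta))
        deltas.append(weight * delta)
    stacked = torch.stack(deltas)
    elected = torch.sign(stacked.sum(dim=0))          # majority sign per parameter
    agree = torch.sign(stacked) == elected
    kept = stacked * agree                            # drop deltas that disagree
    if normalize:
        # Simplified disjoint mean: divide by the count of agreeing models
        # (mergekit weighs this denominator by the models' weights).
        return base + kept.sum(dim=0) / agree.sum(dim=0).clamp(min=1)
    return base + kept.sum(dim=0)

# Toy usage with two fine-tunes of a shared base
torch.manual_seed(0)
base = torch.zeros(8)
tuned = [base + torch.randn(8), base + torch.randn(8)]
print(ties_merge(base, tuned, densities=[0.5, 0.5], weights=[1.0, 1.0]))
```

As for `int8_mask: 1.0`, that is (to our understanding) a memory-saving knob telling mergekit to store its intermediate masks as int8; it should not change the arithmetic sketched above. The config itself can be run with mergekit's CLI, e.g. `mergekit-yaml config.yml ./output-model`.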