Undi95 committed on
Commit 6014a90
1 Parent(s): 7af886f

Create README.md

Files changed (1)
  1. README.md +410 -0
README.md ADDED
---
license: cc-by-nc-4.0
tags:
- not-for-all-audiences
- nsfw
---
[[fp16](https://huggingface.co/Undi95/Dawn-v2-70B) - [gguf](https://huggingface.co/Undi95/Dawn-v2-70B-GGUF) - [exl2](https://huggingface.co/Undi95/Dawn-v2-70B-2.4bpw-h6-exl2)]
<!-- description start -->
## Description

This repo contains GGUF files of Dawn-v2-70B, a merge I made with the new [layer shuffle](https://github.com/cg123/mergekit/blob/main/mergekit/scripts/layershuffle.py) method from mergekit.

[UtopiaXL](https://huggingface.co/Undi95/UtopiaXL-13B) was a huge success for me and I really liked it, so I took the same path for this 70B: a good base, some psychological data, some medical data, a little bit of this and that, and LimaRP at the end, as always.

<!-- description end -->
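If you just want to try one of the GGUF quants, here is a minimal sketch using `huggingface_hub` and `llama-cpp-python`. The filename below is only a placeholder, so check this repo's file list for the real quant names, and adjust `n_gpu_layers` to your hardware.

```python
# Minimal sketch: download one GGUF quant from this repo and load it with llama-cpp-python.
# The filename is a placeholder -- check the "Files and versions" tab for the actual quant names.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="Undi95/Dawn-v2-70B-GGUF",
    filename="dawn-v2-70b.Q4_K_M.gguf",  # placeholder quant name
)

# n_gpu_layers=-1 offloads every layer to the GPU if it fits; use a smaller value otherwise.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)
```
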
<!-- models start -->
## Models and loras used

- [Sao10K/Euryale-1.3-L2-70B](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B)
- [Xwin-LM/Xwin-LM-70B-V0.1](https://huggingface.co/Xwin-LM/Xwin-LM-70B-V0.1)
- [ehartford/Samantha-1.11-70b](https://huggingface.co/ehartford/Samantha-1.11-70b)
- [NousResearch/Nous-Hermes-Llama2-70b](https://huggingface.co/NousResearch/Nous-Hermes-Llama2-70b)
- [augtoma/qCammel-70-x](https://huggingface.co/augtoma/qCammel-70-x)
- [jondurbin/airoboros-l2-c70b-3.1.2](https://huggingface.co/jondurbin/airoboros-l2-c70b-3.1.2)
- [fangloveskari/ORCA_LLaMA_70B_QLoRA](https://huggingface.co/fangloveskari/ORCA_LLaMA_70B_QLoRA)
- [Doctor-Shotgun/limarpv3-llama2-70b-qlora](https://huggingface.co/Doctor-Shotgun/limarpv3-llama2-70b-qlora)

<!-- models end -->
## The sauce

```
!mergekit-layershuffle ./Dawn-v2-70B \
  --model Sao10K/Euryale-1.3-L2-70B --weight 0.3 \
  --model Xwin-LM/Xwin-LM-70B-V0.1 --weight 0.2 \
  --model ehartford/Samantha-1.11-70b --weight 0.1 \
  --model NousResearch/Nous-Hermes-Llama2-70b --weight 0.05 \
  --model augtoma/qCammel-70-x --weight 0.05 \
  --model jondurbin/airoboros-l2-c70b-3.1.2 --weight 0.2 \
  --model fangloveskari/ORCA_LLaMA_70B_QLoRA --weight 0.1 \
  --write-yaml Dawn-v2-70B.yaml

=========================

merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 1]
    model: fangloveskari/ORCA_LLaMA_70B_QLoRA
- sources:
  - layer_range: [1, 2]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [2, 3]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [3, 4]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [4, 5]
    model: fangloveskari/ORCA_LLaMA_70B_QLoRA
- sources:
  - layer_range: [5, 6]
    model: ehartford/Samantha-1.11-70b
- sources:
  - layer_range: [6, 8]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [8, 9]
    model: ehartford/Samantha-1.11-70b
- sources:
  - layer_range: [9, 10]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [10, 11]
    model: ehartford/Samantha-1.11-70b
- sources:
  - layer_range: [11, 12]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [12, 13]
    model: fangloveskari/ORCA_LLaMA_70B_QLoRA
- sources:
  - layer_range: [13, 14]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [14, 15]
    model: fangloveskari/ORCA_LLaMA_70B_QLoRA
- sources:
  - layer_range: [15, 16]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [16, 17]
    model: fangloveskari/ORCA_LLaMA_70B_QLoRA
- sources:
  - layer_range: [17, 18]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [18, 19]
    model: NousResearch/Nous-Hermes-Llama2-70b
- sources:
  - layer_range: [19, 20]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [20, 21]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [21, 22]
    model: ehartford/Samantha-1.11-70b
- sources:
  - layer_range: [22, 23]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [23, 24]
    model: augtoma/qCammel-70-x
- sources:
  - layer_range: [24, 25]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [25, 27]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [27, 28]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [28, 29]
    model: ehartford/Samantha-1.11-70b
- sources:
  - layer_range: [29, 30]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [30, 32]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [32, 33]
    model: ehartford/Samantha-1.11-70b
- sources:
  - layer_range: [33, 34]
    model: augtoma/qCammel-70-x
- sources:
  - layer_range: [34, 35]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [35, 37]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [37, 38]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [38, 39]
    model: ehartford/Samantha-1.11-70b
- sources:
  - layer_range: [39, 40]
    model: augtoma/qCammel-70-x
- sources:
  - layer_range: [40, 41]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [41, 42]
    model: ehartford/Samantha-1.11-70b
- sources:
  - layer_range: [42, 43]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [43, 44]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [44, 45]
    model: NousResearch/Nous-Hermes-Llama2-70b
- sources:
  - layer_range: [45, 46]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [46, 48]
    model: ehartford/Samantha-1.11-70b
- sources:
  - layer_range: [48, 49]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [49, 50]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [50, 51]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [51, 54]
    model: fangloveskari/ORCA_LLaMA_70B_QLoRA
- sources:
  - layer_range: [54, 55]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [55, 56]
    model: fangloveskari/ORCA_LLaMA_70B_QLoRA
- sources:
  - layer_range: [56, 58]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [58, 59]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [59, 60]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [60, 62]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [62, 63]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [63, 64]
    model: fangloveskari/ORCA_LLaMA_70B_QLoRA
- sources:
  - layer_range: [64, 65]
    model: NousResearch/Nous-Hermes-Llama2-70b
- sources:
  - layer_range: [65, 66]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [66, 67]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [67, 68]
    model: augtoma/qCammel-70-x
- sources:
  - layer_range: [68, 70]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [70, 71]
    model: augtoma/qCammel-70-x
- sources:
  - layer_range: [71, 72]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [72, 73]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [73, 75]
    model: jondurbin/airoboros-l2-c70b-3.1.2
- sources:
  - layer_range: [75, 76]
    model: Sao10K/Euryale-1.3-L2-70B
- sources:
  - layer_range: [76, 77]
    model: augtoma/qCammel-70-x
- sources:
  - layer_range: [77, 78]
    model: Xwin-LM/Xwin-LM-70B-V0.1
- sources:
  - layer_range: [78, 79]
    model: NousResearch/Nous-Hermes-Llama2-70b
- sources:
  - layer_range: [79, 80]
    model: Xwin-LM/Xwin-LM-70B-V0.1

=========================

=> Applying Doctor-Shotgun/limarpv3-llama2-70b-qlora x 0.35
```
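If you're curious how the shuffle came out, here is a small sketch (not part of the merge itself) that reads the `Dawn-v2-70B.yaml` written by `--write-yaml` above and tallies how many of the 80 layers each parent donates. The `--weight` values appear to act as per-layer sampling odds, which is why the heavier-weighted models dominate the stack; the sketch assumes the YAML file sits in the current directory.

```python
# Sketch: tally how many of the 80 layers each parent model donates in the recipe above.
# Assumes Dawn-v2-70B.yaml (written by --write-yaml) is in the current directory.
from collections import Counter

import yaml

with open("Dawn-v2-70B.yaml") as f:
    config = yaml.safe_load(f)

layer_counts = Counter()
for slice_ in config["slices"]:
    source = slice_["sources"][0]      # each slice pulls from a single donor model
    start, end = source["layer_range"]
    layer_counts[source["model"]] += end - start

for model, n_layers in layer_counts.most_common():
    print(f"{model}: {n_layers} layers")
```
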
<!-- prompt-template start -->
## Prompt template: Alpaca

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:

```
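For completeness, a hedged sketch of wrapping a prompt in this Alpaca template and generating with the `Llama` object from the earlier snippet; the sampling settings are illustrative defaults, not tuned recommendations for Dawn.

```python
# Sketch: wrap a user prompt in the Alpaca template above and generate with the Llama
# object ("llm") loaded in the earlier snippet. Sampling values are illustrative only.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{prompt}\n\n### Response:\n"
)

def generate(llm, prompt: str) -> str:
    out = llm(
        ALPACA_TEMPLATE.format(prompt=prompt),
        max_tokens=512,
        temperature=0.8,
        stop=["### Instruction:"],  # stop before the model opens a new turn
    )
    return out["choices"][0]["text"].strip()

# Example: print(generate(llm, "Write a short scene set at dawn."))
```
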
A big thanks to [Charles](https://huggingface.co/chargoddard) for adding the layer shuffle method to his tool [mergekit](https://github.com/cg123/mergekit/tree/main), and to [Henky/KoboldAI](https://koboldai.org/) for the machine he let me use.

If you want to support me, you can [here](https://ko-fi.com/undiai).