---
library_name: transformers
license: other
license_name: qwen
license_link: https://huggingface.co/Qwen/Qwen2.5-72B-Instruct/blob/main/LICENSE
base_model: Qwen/Qwen2.5-72B
datasets:
- anthracite-org/kalo-opus-instruct-22k-no-refusal
- Nopm/Opus_WritingStruct
- Gryphe/Sonnet3.5-SlimOrcaDedupCleaned
- Gryphe/Sonnet3.5-Charcard-Roleplay
- Gryphe/ChatGPT-4o-Writing-Prompts
- Epiculous/Synthstruct-Gens-v1.1-Filtered-n-Cleaned
- Epiculous/SynthRP-Gens-v1.1-Filtered-n-Cleaned
- nothingiisreal/Reddit-Dirty-And-WritingPrompts
- allura-org/Celeste-1.x-data-mixture
- cognitivecomputations/dolphin-2.9.3
tags:
- generated_from_trainer
model-index:
- name: EVA-Qwen2.5-72B-SFFT-v0.1
  results: []
---

# This repo contains a copy of the original model quantized to INT8. Original: [EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1](https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-72B-v0.1)
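
The card does not state which INT8 scheme was used, so the following is a rough loading sketch only, assuming the checkpoint loads through `transformers`' standard `AutoModelForCausalLM` path; the repo id below is a hypothetical placeholder for this repository.

```python
# Minimal loading sketch. Assumptions: a standard transformers-loadable
# INT8 checkpoint; REPO_ID is a hypothetical placeholder for this repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

REPO_ID = "CalamitousFelicitousness/EVA-Qwen2.5-72B-v0.1-INT8"  # hypothetical id

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
model = AutoModelForCausalLM.from_pretrained(
    REPO_ID,
    device_map="auto",   # shard the 72B model across available GPUs
    torch_dtype="auto",  # keep the dtype stored in the checkpoint
)
```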

# EVA Qwen2.5-72B v0.1

<p>
A RP/storywriting specialist model, a full-parameter finetune of Qwen2.5-72B on a mixture of synthetic and natural data.<br>
It uses the Celeste 70B 0.1 data mixture, greatly expanding it to improve the versatility, creativity and "flavor" of the resulting model.<br>
</p>

<p>Dedicated to Nev.</p>

<p><b>Version notes for 0.1</b>: The dataset was reprocessed (by Cahvay for 32B v0.2, and used here as well), and the training config was readjusted for 8xH100 SXM. Significant improvements in instruction following, long-context understanding and overall coherence over v0.0.</p>

<p>Prompt format is ChatML.</p>

<h3>Recommended sampler values:</h3>
<ul>
<li>Temperature: 1</li>
<li>Min-P: 0.05</li>
<li>Top-A: 0.2</li>
<li>Repetition Penalty: 1.03</li>
</ul>
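
For illustration, a minimal sketch of applying the ChatML template and the sampler values above with `transformers`, reusing the `model` and `tokenizer` from the loading sketch. This assumes a `transformers` version recent enough to support `min_p`; Top-A is not available in `generate()` and is backend-specific, so it is omitted here.

```python
# Illustrative only: ChatML prompt plus the recommended samplers via
# transformers' generate(). Assumes `min_p` is supported by the installed
# transformers version; Top-A is backend-specific and omitted.
messages = [
    {"role": "system", "content": "You are a creative storytelling assistant."},
    {"role": "user", "content": "Write the opening scene of a heist story."},
]
# The tokenizer's chat template renders the ChatML format the model expects.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=512,
    do_sample=True,
    temperature=1.0,
    min_p=0.05,
    repetition_penalty=1.03,
)
# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```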

<h3>Recommended SillyTavern presets (via CalamitousFelicitousness):</h3>

- [Context](https://huggingface.co/EVA-UNIT-01/EVA-Yi-1.5-9B-32K-V1/blob/main/%5BChatML%5D%20Roleplay-v1.9%20Context.json)
- [Instruct and System Prompt](https://huggingface.co/EVA-UNIT-01/EVA-Yi-1.5-9B-32K-V1/blob/main/%5BChatML%5D%20Roleplay-v1.9%20Instruct.json)
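
The preset files can also be fetched programmatically with `huggingface_hub`; a small sketch, where the filenames are simply URL-decoded from the links above.

```python
# Fetch the recommended SillyTavern presets (the same files linked above).
from huggingface_hub import hf_hub_download

for filename in (
    "[ChatML] Roleplay-v1.9 Context.json",
    "[ChatML] Roleplay-v1.9 Instruct.json",
):
    path = hf_hub_download(
        repo_id="EVA-UNIT-01/EVA-Yi-1.5-9B-32K-V1",
        filename=filename,
    )
    print(path)  # import these JSON files via SillyTavern's formatting settings
```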

<h3>Training data:</h3>
<ul>
<li>Celeste 70B 0.1 data mixture minus the Opus Instruct subset. See that model's <a href="https://huggingface.co/nothingiisreal/L3.1-70B-Celeste-V0.1-BF16">card</a> for details.</li>
<li>Kalomaze's Opus_Instruct_25k dataset, filtered for refusals.</li>
<li>A subset (1k rows) of ChatGPT-4o-WritingPrompts by Gryphe.</li>
<li>A subset (2k rows) of Sonnet3.5-Charcard-Roleplay by Gryphe.</li>
<li>Synthstruct and SynthRP datasets by Epiculous.</li>
<li>A subset of Dolphin-2.9.3, including a filtered version of not_samantha and a small subset of systemchat.</li>
</ul>
<h3>Training time and hardware:</h3>
<ul><li>15 hours on 8xH100 SXM, provided by <a href="https://featherless.ai/">FeatherlessAI</a></li></ul>

<p>The model was created by Kearm, Auri and Cahvay.</p>
<h4>Special thanks:</h4>
<ul>
<li><b>to Cahvay for his work on investigating and reprocessing the corrupted dataset, removing the single biggest source of data poisoning.</b></li>
<li><b>to <a href="https://featherless.ai/">FeatherlessAI</a> for generously providing an 8xH100 SXM node for training this model</b></li>
<li>to Gryphe, Lemmy, Kalomaze, Nopm, Epiculous and CognitiveComputations for the data</li>
<li>and to Allura-org for support, feedback, beta-testing and quality control of EVA models.</li>
</ul>


[<img src="https://raw.githubusercontent.com/axolotl-ai-cloud/axolotl/main/image/axolotl-badge-web.png" alt="Built with Axolotl" width="200" height="32"/>](https://github.com/axolotl-ai-cloud/axolotl)
<details><summary>See axolotl config</summary>

axolotl version: `0.4.1`
```yaml
base_model: Qwen/Qwen2.5-72B

load_in_8bit: false
load_in_4bit: false
strict: false

plugins:
- axolotl.integrations.liger.LigerPlugin
liger_rope: true
liger_rms_norm: true
liger_swiglu: true
liger_fused_linear_cross_entropy: true

# plugins:
# - axolotl.integrations.spectrum.SpectrumPlugin

# spectrum_top_fraction: 0.5
# # Optional if using a pre-scanned model as your base_model. Useful if using a model mirror
# spectrum_model_name: Qwen/Qwen2.5-32B

datasets:
- path: datasets/Celeste_Filtered_utf8fix.jsonl
  type: sharegpt
- path: datasets/deduped_not_samantha_norefusals.jsonl
  type: sharegpt
- path: datasets/deduped_SynthRP-Gens_processed_ShareGPT_converted_cleaned.jsonl
  type: sharegpt
- path: datasets/deduped_Synthstruct-Gens_processed_sharegpt_converted_cleaned.jsonl
  type: sharegpt
- path: datasets/Gryphe-4o-WP-filtered-sharegpt_utf8fix.jsonl
  type: sharegpt
- path: datasets/opus-instruct-22k-no_refusals-filtered_utf8fix.jsonl
  type: sharegpt
- path: datasets/Sonnet3-5-charcard-names-filtered-sharegpt_utf8fix.jsonl
  type: sharegpt
- path: datasets/SystemChat_subset_filtered_sharegpt_utf8fix.jsonl
  type: sharegpt

chat_template: chatml
shuffle_merged_datasets: true
val_set_size: 0.001
output_dir: ./EVA-Qwen2.5-72B-SFFT-v0.1

sequence_len: 8192
sample_packing: true
eval_sample_packing: false
pad_to_sequence_len: false

# adapter: qlora
# lora_model_dir:
# lora_r: 64
# lora_alpha: 128
# lora_dropout: 0.05
# lora_target_linear: true
# peft_use_dora: true

unfrozen_parameters:
- ^lm_head.weight$
- ^model.embed_tokens.weight$
# mlp.down_proj layers
- model.layers.62.mlp.down_proj
- model.layers.64.mlp.down_proj
- model.layers.63.mlp.down_proj
- model.layers.66.mlp.down_proj
- model.layers.65.mlp.down_proj
- model.layers.67.mlp.down_proj
- model.layers.68.mlp.down_proj
- model.layers.31.mlp.down_proj
- model.layers.60.mlp.down_proj
- model.layers.69.mlp.down_proj
- model.layers.61.mlp.down_proj
- model.layers.59.mlp.down_proj
- model.layers.30.mlp.down_proj
- model.layers.70.mlp.down_proj
- model.layers.32.mlp.down_proj
- model.layers.34.mlp.down_proj
- model.layers.33.mlp.down_proj
- model.layers.76.mlp.down_proj
- model.layers.72.mlp.down_proj
- model.layers.71.mlp.down_proj
- model.layers.58.mlp.down_proj
- model.layers.75.mlp.down_proj
- model.layers.29.mlp.down_proj
- model.layers.56.mlp.down_proj
- model.layers.26.mlp.down_proj
- model.layers.35.mlp.down_proj
- model.layers.28.mlp.down_proj
- model.layers.57.mlp.down_proj
- model.layers.77.mlp.down_proj
- model.layers.36.mlp.down_proj
- model.layers.27.mlp.down_proj
- model.layers.25.mlp.down_proj
- model.layers.78.mlp.down_proj
- model.layers.37.mlp.down_proj
- model.layers.73.mlp.down_proj
- model.layers.55.mlp.down_proj
- model.layers.54.mlp.down_proj
- model.layers.74.mlp.down_proj
- model.layers.24.mlp.down_proj
- model.layers.53.mlp.down_proj
# mlp.gate_proj layers
- model.layers.78.mlp.gate_proj
- model.layers.77.mlp.gate_proj
- model.layers.76.mlp.gate_proj
- model.layers.79.mlp.gate_proj
- model.layers.75.mlp.gate_proj
- model.layers.74.mlp.gate_proj
- model.layers.73.mlp.gate_proj
- model.layers.72.mlp.gate_proj
- model.layers.71.mlp.gate_proj
- model.layers.70.mlp.gate_proj
- model.layers.69.mlp.gate_proj
- model.layers.57.mlp.gate_proj
- model.layers.54.mlp.gate_proj
- model.layers.55.mlp.gate_proj
- model.layers.68.mlp.gate_proj
- model.layers.63.mlp.gate_proj
- model.layers.53.mlp.gate_proj
- model.layers.44.mlp.gate_proj
- model.layers.45.mlp.gate_proj
- model.layers.49.mlp.gate_proj
- model.layers.58.mlp.gate_proj
- model.layers.46.mlp.gate_proj
- model.layers.56.mlp.gate_proj
- model.layers.67.mlp.gate_proj
- model.layers.62.mlp.gate_proj
- model.layers.50.mlp.gate_proj
- model.layers.64.mlp.gate_proj
- model.layers.52.mlp.gate_proj
- model.layers.40.mlp.gate_proj
- model.layers.43.mlp.gate_proj
- model.layers.48.mlp.gate_proj
- model.layers.66.mlp.gate_proj
- model.layers.47.mlp.gate_proj
- model.layers.59.mlp.gate_proj
- model.layers.65.mlp.gate_proj
- model.layers.61.mlp.gate_proj
- model.layers.60.mlp.gate_proj
- model.layers.42.mlp.gate_proj
- model.layers.51.mlp.gate_proj
- model.layers.41.mlp.gate_proj
# mlp.up_proj layers
- model.layers.70.mlp.up_proj
- model.layers.69.mlp.up_proj
- model.layers.71.mlp.up_proj
- model.layers.68.mlp.up_proj
- model.layers.72.mlp.up_proj
- model.layers.67.mlp.up_proj
- model.layers.66.mlp.up_proj
- model.layers.73.mlp.up_proj
- model.layers.46.mlp.up_proj
- model.layers.63.mlp.up_proj
- model.layers.75.mlp.up_proj
- model.layers.76.mlp.up_proj
- model.layers.74.mlp.up_proj
- model.layers.45.mlp.up_proj
- model.layers.62.mlp.up_proj
- model.layers.64.mlp.up_proj
- model.layers.65.mlp.up_proj
- model.layers.44.mlp.up_proj
- model.layers.53.mlp.up_proj
- model.layers.47.mlp.up_proj
- model.layers.49.mlp.up_proj
- model.layers.48.mlp.up_proj
- model.layers.57.mlp.up_proj
- model.layers.43.mlp.up_proj
- model.layers.42.mlp.up_proj
- model.layers.56.mlp.up_proj
- model.layers.61.mlp.up_proj
- model.layers.54.mlp.up_proj
- model.layers.40.mlp.up_proj
- model.layers.55.mlp.up_proj
- model.layers.77.mlp.up_proj
- model.layers.60.mlp.up_proj
- model.layers.41.mlp.up_proj
- model.layers.35.mlp.up_proj
- model.layers.37.mlp.up_proj
- model.layers.58.mlp.up_proj
- model.layers.34.mlp.up_proj
- model.layers.38.mlp.up_proj
- model.layers.33.mlp.up_proj
- model.layers.39.mlp.up_proj
# self_attn.k_proj layers
- model.layers.36.self_attn.k_proj
- model.layers.79.self_attn.k_proj
- model.layers.35.self_attn.k_proj
- model.layers.34.self_attn.k_proj
- model.layers.37.self_attn.k_proj
- model.layers.33.self_attn.k_proj
- model.layers.38.self_attn.k_proj
- model.layers.39.self_attn.k_proj
- model.layers.74.self_attn.k_proj
- model.layers.77.self_attn.k_proj
- model.layers.41.self_attn.k_proj
- model.layers.69.self_attn.k_proj
- model.layers.32.self_attn.k_proj
- model.layers.78.self_attn.k_proj
- model.layers.30.self_attn.k_proj
- model.layers.70.self_attn.k_proj
- model.layers.25.self_attn.k_proj
- model.layers.42.self_attn.k_proj
- model.layers.29.self_attn.k_proj
- model.layers.31.self_attn.k_proj
- model.layers.68.self_attn.k_proj
- model.layers.66.self_attn.k_proj
- model.layers.22.self_attn.k_proj
- model.layers.65.self_attn.k_proj
- model.layers.44.self_attn.k_proj
- model.layers.40.self_attn.k_proj
- model.layers.63.self_attn.k_proj
- model.layers.23.self_attn.k_proj
- model.layers.28.self_attn.k_proj
- model.layers.24.self_attn.k_proj
- model.layers.26.self_attn.k_proj
- model.layers.67.self_attn.k_proj
- model.layers.75.self_attn.k_proj
- model.layers.27.self_attn.k_proj
- model.layers.57.self_attn.k_proj
- model.layers.64.self_attn.k_proj
- model.layers.71.self_attn.k_proj
- model.layers.61.self_attn.k_proj
- model.layers.72.self_attn.k_proj
- model.layers.73.self_attn.k_proj
# self_attn.o_proj layers
- model.layers.69.self_attn.o_proj
- model.layers.39.self_attn.o_proj
- model.layers.16.self_attn.o_proj
- model.layers.14.self_attn.o_proj
- model.layers.19.self_attn.o_proj
- model.layers.42.self_attn.o_proj
- model.layers.12.self_attn.o_proj
- model.layers.15.self_attn.o_proj
- model.layers.17.self_attn.o_proj
- model.layers.38.self_attn.o_proj
- model.layers.23.self_attn.o_proj
- model.layers.22.self_attn.o_proj
- model.layers.13.self_attn.o_proj
- model.layers.29.self_attn.o_proj
- model.layers.41.self_attn.o_proj
- model.layers.44.self_attn.o_proj
- model.layers.46.self_attn.o_proj
- model.layers.45.self_attn.o_proj
- model.layers.43.self_attn.o_proj
- model.layers.49.self_attn.o_proj
- model.layers.30.self_attn.o_proj
- model.layers.26.self_attn.o_proj
- model.layers.25.self_attn.o_proj
- model.layers.37.self_attn.o_proj
- model.layers.47.self_attn.o_proj
- model.layers.11.self_attn.o_proj
- model.layers.18.self_attn.o_proj
- model.layers.28.self_attn.o_proj
- model.layers.20.self_attn.o_proj
- model.layers.27.self_attn.o_proj
- model.layers.53.self_attn.o_proj
- model.layers.52.self_attn.o_proj
- model.layers.35.self_attn.o_proj
- model.layers.71.self_attn.o_proj
- model.layers.10.self_attn.o_proj
- model.layers.3.self_attn.o_proj
- model.layers.21.self_attn.o_proj
- model.layers.24.self_attn.o_proj
- model.layers.68.self_attn.o_proj
- model.layers.48.self_attn.o_proj
# self_attn.q_proj layers
- model.layers.1.self_attn.q_proj
- model.layers.2.self_attn.q_proj
- model.layers.3.self_attn.q_proj
- model.layers.0.self_attn.q_proj
- model.layers.5.self_attn.q_proj
- model.layers.4.self_attn.q_proj
- model.layers.6.self_attn.q_proj
- model.layers.8.self_attn.q_proj
- model.layers.7.self_attn.q_proj
- model.layers.9.self_attn.q_proj
- model.layers.10.self_attn.q_proj
- model.layers.68.self_attn.q_proj
- model.layers.25.self_attn.q_proj
- model.layers.12.self_attn.q_proj
- model.layers.54.self_attn.q_proj
- model.layers.55.self_attn.q_proj
- model.layers.61.self_attn.q_proj
- model.layers.18.self_attn.q_proj
- model.layers.49.self_attn.q_proj
- model.layers.66.self_attn.q_proj
- model.layers.72.self_attn.q_proj
- model.layers.11.self_attn.q_proj
- model.layers.52.self_attn.q_proj
- model.layers.64.self_attn.q_proj
- model.layers.15.self_attn.q_proj
- model.layers.60.self_attn.q_proj
- model.layers.50.self_attn.q_proj
- model.layers.59.self_attn.q_proj
- model.layers.53.self_attn.q_proj
- model.layers.48.self_attn.q_proj
- model.layers.57.self_attn.q_proj
- model.layers.70.self_attn.q_proj
- model.layers.17.self_attn.q_proj
- model.layers.67.self_attn.q_proj
- model.layers.71.self_attn.q_proj
- model.layers.62.self_attn.q_proj
- model.layers.51.self_attn.q_proj
- model.layers.19.self_attn.q_proj
- model.layers.58.self_attn.q_proj
- model.layers.13.self_attn.q_proj
# self_attn.v_proj layers
- model.layers.23.self_attn.v_proj
- model.layers.25.self_attn.v_proj
- model.layers.26.self_attn.v_proj
- model.layers.27.self_attn.v_proj
- model.layers.28.self_attn.v_proj
- model.layers.29.self_attn.v_proj
- model.layers.30.self_attn.v_proj
- model.layers.31.self_attn.v_proj
- model.layers.34.self_attn.v_proj
- model.layers.35.self_attn.v_proj
- model.layers.36.self_attn.v_proj
- model.layers.37.self_attn.v_proj
- model.layers.38.self_attn.v_proj
- model.layers.42.self_attn.v_proj
- model.layers.48.self_attn.v_proj
- model.layers.57.self_attn.v_proj
- model.layers.58.self_attn.v_proj
- model.layers.61.self_attn.v_proj
- model.layers.63.self_attn.v_proj
- model.layers.64.self_attn.v_proj
- model.layers.65.self_attn.v_proj
- model.layers.66.self_attn.v_proj
- model.layers.69.self_attn.v_proj
- model.layers.70.self_attn.v_proj
- model.layers.74.self_attn.v_proj
- model.layers.75.self_attn.v_proj
- model.layers.72.self_attn.v_proj
- model.layers.39.self_attn.v_proj
- model.layers.41.self_attn.v_proj
- model.layers.40.self_attn.v_proj
- model.layers.33.self_attn.v_proj
- model.layers.59.self_attn.v_proj
- model.layers.16.self_attn.v_proj
- model.layers.15.self_attn.v_proj
- model.layers.76.self_attn.v_proj
- model.layers.24.self_attn.v_proj
- model.layers.68.self_attn.v_proj
- model.layers.67.self_attn.v_proj
- model.layers.55.self_attn.v_proj
- model.layers.44.self_attn.v_proj

wandb_project: EVA-Qwen2.5-72B-SFFT-v0.1
wandb_entity:
wandb_watch:
wandb_name: Unit-01
wandb_log_model:

gradient_accumulation_steps: 8
micro_batch_size: 1
num_epochs: 3
optimizer: paged_adamw_8bit
lr_scheduler: cosine
learning_rate: 0.00005
max_grad_norm: 3

train_on_inputs: false
group_by_length: false
bf16: auto
fp16:
tf32: false

gradient_checkpointing: "unsloth"
# gradient_checkpointing_kwargs:
#   use_reentrant: true
early_stopping_patience:
resume_from_checkpoint:
local_rank:
logging_steps: 1
xformers_attention:
flash_attention: true

warmup_steps: 20
evals_per_epoch: 4
saves_per_epoch: 4
save_safetensors: true
hub_model_id:
hub_strategy:
debug:
deepspeed: deepspeed_configs/zero3_bf16_cpuoffload_params.json
weight_decay: 0.1
# fsdp:
#   - full_shard
#   - auto_wrap
# fsdp_config:
#   fsdp_limit_all_gathers: true
#   fsdp_sync_module_states: false
#   fsdp_offload_params: true
#   fsdp_cpu_ram_efficient_loading: true
#   fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
#   fsdp_transformer_layer_cls_to_wrap: Qwen2DecoderLayer
#   fsdp_activation_checkpointing: true
#   fsdp_state_dict_type: SHARDED_STATE_DICT # Changed from FULL_STATE_DICT
#   fsdp_sharding_strategy: FULL_SHARD
#   fsdp_forward_prefetch: false # Added
#   fsdp_backward_prefetch: "BACKWARD_PRE" # Added
#   fsdp_backward_prefetch_limit: 1 # Added
#   fsdp_mixed_precision: BF16 # Added
```

</details><br>