shpotes committed on
Commit
5e5d1f5
1 Parent(s): 94cd824

Training in progress, step 500

This view is limited to 50 files because the commit contains too many changes.
Files changed (50)
  1. .gitattributes +1 -0
  2. README.md +12 -14
  3. all_results.json +0 -14
  4. eval_results.json +0 -9
  5. pytorch_model.bin +1 -1
  6. run-300M.sh +2 -0
  7. train_results.json +0 -8
  8. trainer_state.json +0 -337
  9. training_args.bin +1 -1
  10. wandb/debug-internal.log +1 -1
  11. wandb/debug.log +1 -1
  12. wandb/latest-run +1 -1
  13. wandb/run-20220130_021614-39ybnrhl/files/config.yaml +6 -0
  14. wandb/run-20220130_021614-39ybnrhl/files/output.log +6 -0
  15. wandb/run-20220130_021614-39ybnrhl/files/wandb-summary.json +0 -0
  16. wandb/run-20220130_021614-39ybnrhl/logs/debug-internal.log +124 -0
  17. wandb/run-20220130_021614-39ybnrhl/logs/debug.log +154 -0
  18. wandb/run-20220130_021614-39ybnrhl/run-39ybnrhl.wandb +2 -2
  19. wandb/run-20220130_235926-2f5mzr6c/files/config.yaml +655 -0
  20. wandb/run-20220130_235926-2f5mzr6c/files/output.log +35 -0
  21. wandb/run-20220130_235926-2f5mzr6c/files/requirements.txt +87 -0
  22. wandb/run-20220130_235926-2f5mzr6c/files/wandb-metadata.json +75 -0
  23. wandb/run-20220130_235926-2f5mzr6c/files/wandb-summary.json +1 -0
  24. wandb/run-20220130_235926-2f5mzr6c/logs/debug-internal.log +144 -0
  25. wandb/run-20220130_235926-2f5mzr6c/logs/debug.log +136 -0
  26. wandb/run-20220130_235926-2f5mzr6c/run-2f5mzr6c.wandb +0 -0
  27. wandb/run-20220131_000222-2f4y0tls/files/config.yaml +655 -0
  28. wandb/run-20220131_000222-2f4y0tls/files/output.log +35 -0
  29. wandb/run-20220131_000222-2f4y0tls/files/requirements.txt +87 -0
  30. wandb/run-20220131_000222-2f4y0tls/files/wandb-metadata.json +75 -0
  31. wandb/run-20220131_000222-2f4y0tls/files/wandb-summary.json +1 -0
  32. wandb/run-20220131_000222-2f4y0tls/logs/debug-internal.log +144 -0
  33. wandb/run-20220131_000222-2f4y0tls/logs/debug.log +136 -0
  34. wandb/run-20220131_000222-2f4y0tls/run-2f4y0tls.wandb +0 -0
  35. wandb/run-20220131_001044-248r0x8f/files/config.yaml +0 -0
  36. wandb/run-20220131_001044-248r0x8f/files/output.log +200 -0
  37. wandb/run-20220131_001044-248r0x8f/files/requirements.txt +87 -0
  38. wandb/run-20220131_001044-248r0x8f/files/wandb-metadata.json +75 -0
  39. wandb/run-20220131_001044-248r0x8f/files/wandb-summary.json +0 -0
  40. wandb/run-20220131_001044-248r0x8f/logs/debug-internal.log +0 -0
  41. wandb/run-20220131_001044-248r0x8f/logs/debug.log +24 -0
  42. wandb/run-20220131_001044-248r0x8f/run-248r0x8f.wandb +0 -0
  43. wandb/run-20220131_140404-gjg8nz5t/files/config.yaml +0 -0
  44. wandb/run-20220131_140404-gjg8nz5t/files/output.log +4449 -0
  45. wandb/run-20220131_140404-gjg8nz5t/files/requirements.txt +87 -0
  46. wandb/run-20220131_140404-gjg8nz5t/files/wandb-metadata.json +75 -0
  47. wandb/run-20220131_140404-gjg8nz5t/files/wandb-summary.json +0 -0
  48. wandb/run-20220131_140404-gjg8nz5t/logs/debug-internal.log +0 -0
  49. wandb/run-20220131_140404-gjg8nz5t/logs/debug.log +170 -0
  50. wandb/run-20220131_140404-gjg8nz5t/run-gjg8nz5t.wandb +3 -0
.gitattributes CHANGED
@@ -40,3 +40,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 .venv/lib/python3.9/site-packages/torch/lib/libtorch_cuda.so filter=lfs diff=lfs merge=lfs -text
 .venv/lib/python3.9/site-packages/torch/lib/libtorch_python.so filter=lfs diff=lfs merge=lfs -text
 wandb/run-20220130_021614-39ybnrhl/run-39ybnrhl.wandb filter=lfs diff=lfs merge=lfs -text
+wandb/run-20220131_140404-gjg8nz5t/run-gjg8nz5t.wandb filter=lfs diff=lfs merge=lfs -text
README.md CHANGED
@@ -4,10 +4,8 @@ language:
 license: apache-2.0
 tags:
 - automatic-speech-recognition
-- mozilla-foundation/common_voice_7_0
+- mozilla-foundation/common_voice_8_0
 - generated_from_trainer
-- et
-- robust-speech-event
 datasets:
 - common_voice
 model-index:
@@ -20,10 +18,10 @@ should probably proofread and complete it, then remove this comment. -->
 
 #
 
-This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - ET dataset.
+This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - ET dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.4783
-- Wer: 0.3439
+- Loss: 0.4927
+- Wer: 0.3536
 
 ## Model description
 
@@ -58,14 +56,14 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Wer |
 |:-------------:|:-----:|:----:|:---------------:|:------:|
-| 0.3382 | 12.5 | 500 | 0.3756 | 0.4654 |
-| 0.1941 | 25.0 | 1000 | 0.4314 | 0.4388 |
-| 0.1562 | 37.5 | 1500 | 0.4554 | 0.4155 |
-| 0.1267 | 50.0 | 2000 | 0.4749 | 0.4063 |
-| 0.1003 | 62.5 | 2500 | 0.4812 | 0.3939 |
-| 0.0751 | 75.0 | 3000 | 0.4776 | 0.3626 |
-| 0.0621 | 87.5 | 3500 | 0.4851 | 0.3497 |
-| 0.0568 | 100.0 | 4000 | 0.4783 | 0.3439 |
+| 0.3442 | 12.5 | 500 | 0.3825 | 0.4763 |
+| 0.1934 | 25.0 | 1000 | 0.4236 | 0.4414 |
+| 0.149 | 37.5 | 1500 | 0.4503 | 0.4190 |
+| 0.1253 | 50.0 | 2000 | 0.4674 | 0.4143 |
+| 0.0966 | 62.5 | 2500 | 0.4847 | 0.3925 |
+| 0.0741 | 75.0 | 3000 | 0.4745 | 0.3704 |
+| 0.0608 | 87.5 | 3500 | 0.4807 | 0.3568 |
+| 0.0541 | 100.0 | 4000 | 0.4927 | 0.3536 |
 
 
 ### Framework versions
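The updated model card above describes a CTC fine-tune of wav2vec2-xls-r-300m on Estonian Common Voice. As a quick illustration (not part of this commit's diff), the sketch below shows how such a checkpoint is typically loaded for inference with `transformers`. It is a minimal sketch under assumptions: the repo id `shpotes/xls-r-et` is taken from the push logs further down this page, the processor files on the Hub are assumed to be complete, and the silent 16 kHz placeholder waveform stands in for real audio.

```python
# Hypothetical usage sketch, not part of this commit: greedy CTC transcription
# with the fine-tuned checkpoint. Assumes repo id "shpotes/xls-r-et" and 16 kHz mono audio.
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model = Wav2Vec2ForCTC.from_pretrained("shpotes/xls-r-et")        # assumed repo id
processor = Wav2Vec2Processor.from_pretrained("shpotes/xls-r-et")

speech = [0.0] * 16000  # placeholder: one second of silence at 16 kHz
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits  # shape: (batch, time, vocab)

pred_ids = torch.argmax(logits, dim=-1)         # greedy decoding over the vocabulary
print(processor.batch_decode(pred_ids))         # collapse repeats/blanks to text
```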
all_results.json DELETED
@@ -1,14 +0,0 @@
-{
-    "epoch": 100.0,
-    "eval_loss": 0.4782812297344208,
-    "eval_runtime": 133.6277,
-    "eval_samples": 2609,
-    "eval_samples_per_second": 19.524,
-    "eval_steps_per_second": 0.277,
-    "eval_wer": 0.34393891186764236,
-    "train_loss": 0.34811098051071165,
-    "train_runtime": 35258.4892,
-    "train_samples": 5705,
-    "train_samples_per_second": 16.181,
-    "train_steps_per_second": 0.113
-}
eval_results.json DELETED
@@ -1,9 +0,0 @@
-{
-    "epoch": 100.0,
-    "eval_loss": 0.4782812297344208,
-    "eval_runtime": 133.6277,
-    "eval_samples": 2609,
-    "eval_samples_per_second": 19.524,
-    "eval_steps_per_second": 0.277,
-    "eval_wer": 0.34393891186764236
-}
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:09f4be720228bd8290e0ffdf5845506b1a5543ffdf4bc8088784140c4c2b1b65
+oid sha256:8e46f7437225e43f76d2ee2c7a3137b59f4088c38f3fa35c66ff2baafd8ae804
 size 1262083569
run-300M.sh CHANGED
@@ -36,3 +36,5 @@ python src/run_speech_recognition_ctc_bnb.py \
 --run_name="cosine+drop_proj+low_specaugment-300M" \
 --do_train --do_eval \
 --use_auth_token --push_to_hub
+
+# --use_auth_token --push_to_hub
train_results.json DELETED
@@ -1,8 +0,0 @@
-{
-    "epoch": 100.0,
-    "train_loss": 0.34811098051071165,
-    "train_runtime": 35258.4892,
-    "train_samples": 5705,
-    "train_samples_per_second": 16.181,
-    "train_steps_per_second": 0.113
-}
trainer_state.json DELETED
@@ -1,337 +0,0 @@
1
- {
2
- "best_metric": null,
3
- "best_model_checkpoint": null,
4
- "epoch": 100.0,
5
- "global_step": 4000,
6
- "is_hyper_param_search": false,
7
- "is_local_process_zero": true,
8
- "is_world_process_zero": true,
9
- "log_history": [
10
- {
11
- "epoch": 2.5,
12
- "learning_rate": 0.00019800000000000002,
13
- "loss": 4.5986,
14
- "step": 100
15
- },
16
- {
17
- "epoch": 5.0,
18
- "learning_rate": 0.000398,
19
- "loss": 2.913,
20
- "step": 200
21
- },
22
- {
23
- "epoch": 7.5,
24
- "learning_rate": 0.000598,
25
- "loss": 1.3181,
26
- "step": 300
27
- },
28
- {
29
- "epoch": 10.0,
30
- "learning_rate": 0.0007980000000000001,
31
- "loss": 0.4316,
32
- "step": 400
33
- },
34
- {
35
- "epoch": 12.5,
36
- "learning_rate": 0.000998,
37
- "loss": 0.3382,
38
- "step": 500
39
- },
40
- {
41
- "epoch": 12.5,
42
- "eval_loss": 0.37556204199790955,
43
- "eval_runtime": 131.256,
44
- "eval_samples_per_second": 19.877,
45
- "eval_steps_per_second": 0.282,
46
- "eval_wer": 0.46538339166401527,
47
- "step": 500
48
- },
49
- {
50
- "epoch": 15.0,
51
- "learning_rate": 0.0009980271764103532,
52
- "loss": 0.2923,
53
- "step": 600
54
- },
55
- {
56
- "epoch": 17.5,
57
- "learning_rate": 0.000992044732251972,
58
- "loss": 0.2513,
59
- "step": 700
60
- },
61
- {
62
- "epoch": 20.0,
63
- "learning_rate": 0.0009821006332271156,
64
- "loss": 0.2257,
65
- "step": 800
66
- },
67
- {
68
- "epoch": 22.5,
69
- "learning_rate": 0.0009682749433740962,
70
- "loss": 0.2067,
71
- "step": 900
72
- },
73
- {
74
- "epoch": 25.0,
75
- "learning_rate": 0.0009506789790182364,
76
- "loss": 0.1941,
77
- "step": 1000
78
- },
79
- {
80
- "epoch": 25.0,
81
- "eval_loss": 0.43141674995422363,
82
- "eval_runtime": 132.4466,
83
- "eval_samples_per_second": 19.699,
84
- "eval_steps_per_second": 0.279,
85
- "eval_wer": 0.43875278396436523,
86
- "step": 1000
87
- },
88
- {
89
- "epoch": 27.5,
90
- "learning_rate": 0.00092945441251827,
91
- "loss": 0.1976,
92
- "step": 1100
93
- },
94
- {
95
- "epoch": 30.0,
96
- "learning_rate": 0.0009047721316038118,
97
- "loss": 0.1801,
98
- "step": 1200
99
- },
100
- {
101
- "epoch": 32.5,
102
- "learning_rate": 0.0008768308634878388,
103
- "loss": 0.169,
104
- "step": 1300
105
- },
106
- {
107
- "epoch": 35.0,
108
- "learning_rate": 0.0008458555748320216,
109
- "loss": 0.1634,
110
- "step": 1400
111
- },
112
- {
113
- "epoch": 37.5,
114
- "learning_rate": 0.0008120956604474414,
115
- "loss": 0.1562,
116
- "step": 1500
117
- },
118
- {
119
- "epoch": 37.5,
120
- "eval_loss": 0.455356240272522,
121
- "eval_runtime": 130.437,
122
- "eval_samples_per_second": 20.002,
123
- "eval_steps_per_second": 0.284,
124
- "eval_wer": 0.41552656697422846,
125
- "step": 1500
126
- },
127
- {
128
- "epoch": 40.0,
129
- "learning_rate": 0.0007758229353142152,
130
- "loss": 0.1561,
131
- "step": 1600
132
- },
133
- {
134
- "epoch": 42.5,
135
- "learning_rate": 0.0007373294460870985,
136
- "loss": 0.1525,
137
- "step": 1700
138
- },
139
- {
140
- "epoch": 45.0,
141
- "learning_rate": 0.0006969251197075427,
142
- "loss": 0.137,
143
- "step": 1800
144
- },
145
- {
146
- "epoch": 47.5,
147
- "learning_rate": 0.0006549352680541975,
148
- "loss": 0.1305,
149
- "step": 1900
150
- },
151
- {
152
- "epoch": 50.0,
153
- "learning_rate": 0.000611697968722942,
154
- "loss": 0.1267,
155
- "step": 2000
156
- },
157
- {
158
- "epoch": 50.0,
159
- "eval_loss": 0.47493159770965576,
160
- "eval_runtime": 130.8969,
161
- "eval_samples_per_second": 19.932,
162
- "eval_steps_per_second": 0.283,
163
- "eval_wer": 0.40629971364937956,
164
- "step": 2000
165
- },
166
- {
167
- "epoch": 52.5,
168
- "learning_rate": 0.0005675613430248713,
169
- "loss": 0.1211,
170
- "step": 2100
171
- },
172
- {
173
- "epoch": 55.0,
174
- "learning_rate": 0.0005228807531181908,
175
- "loss": 0.1132,
176
- "step": 2200
177
- },
178
- {
179
- "epoch": 57.5,
180
- "learning_rate": 0.00047801594084106763,
181
- "loss": 0.1098,
182
- "step": 2300
183
- },
184
- {
185
- "epoch": 60.0,
186
- "learning_rate": 0.0004333281312818746,
187
- "loss": 0.1017,
188
- "step": 2400
189
- },
190
- {
191
- "epoch": 62.5,
192
- "learning_rate": 0.00038917712440717607,
193
- "loss": 0.1003,
194
- "step": 2500
195
- },
196
- {
197
- "epoch": 62.5,
198
- "eval_loss": 0.48115354776382446,
199
- "eval_runtime": 131.1388,
200
- "eval_samples_per_second": 19.895,
201
- "eval_steps_per_second": 0.282,
202
- "eval_wer": 0.3938593700286351,
203
- "step": 2500
204
- },
205
- {
206
- "epoch": 65.0,
207
- "learning_rate": 0.00034591839816395533,
208
- "loss": 0.0952,
209
- "step": 2600
210
- },
211
- {
212
- "epoch": 67.5,
213
- "learning_rate": 0.00030390024638020374,
214
- "loss": 0.0892,
215
- "step": 2700
216
- },
217
- {
218
- "epoch": 70.0,
219
- "learning_rate": 0.0002634609745078109,
220
- "loss": 0.0828,
221
- "step": 2800
222
- },
223
- {
224
- "epoch": 72.5,
225
- "learning_rate": 0.00022492617578598646,
226
- "loss": 0.0784,
227
- "step": 2900
228
- },
229
- {
230
- "epoch": 75.0,
231
- "learning_rate": 0.00018860610975594382,
232
- "loss": 0.0751,
233
- "step": 3000
234
- },
235
- {
236
- "epoch": 75.0,
237
- "eval_loss": 0.47760340571403503,
238
- "eval_runtime": 131.8409,
239
- "eval_samples_per_second": 19.789,
240
- "eval_steps_per_second": 0.281,
241
- "eval_wer": 0.36264715240216355,
242
- "step": 3000
243
- },
244
- {
245
- "epoch": 77.5,
246
- "learning_rate": 0.0001547932042335039,
247
- "loss": 0.0722,
248
- "step": 3100
249
- },
250
- {
251
- "epoch": 80.0,
252
- "learning_rate": 0.00012375970085226701,
253
- "loss": 0.0678,
254
- "step": 3200
255
- },
256
- {
257
- "epoch": 82.5,
258
- "learning_rate": 9.575546313405425e-05,
259
- "loss": 0.0657,
260
- "step": 3300
261
- },
262
- {
263
- "epoch": 85.0,
264
- "learning_rate": 7.100596473474763e-05,
265
- "loss": 0.0641,
266
- "step": 3400
267
- },
268
- {
269
- "epoch": 87.5,
270
- "learning_rate": 4.9710474062988955e-05,
271
- "loss": 0.0621,
272
- "step": 3500
273
- },
274
- {
275
- "epoch": 87.5,
276
- "eval_loss": 0.4850753843784332,
277
- "eval_runtime": 131.0079,
278
- "eval_samples_per_second": 19.915,
279
- "eval_steps_per_second": 0.282,
280
- "eval_wer": 0.34972955774737513,
281
- "step": 3500
282
- },
283
- {
284
- "epoch": 90.0,
285
- "learning_rate": 3.204044988812144e-05,
286
- "loss": 0.0593,
287
- "step": 3600
288
- },
289
- {
290
- "epoch": 92.5,
291
- "learning_rate": 1.8138160854995144e-05,
292
- "loss": 0.0572,
293
- "step": 3700
294
- },
295
- {
296
- "epoch": 95.0,
297
- "learning_rate": 8.115540020491363e-06,
298
- "loss": 0.0573,
299
- "step": 3800
300
- },
301
- {
302
- "epoch": 97.5,
303
- "learning_rate": 2.053283634363745e-06,
304
- "loss": 0.0563,
305
- "step": 3900
306
- },
307
- {
308
- "epoch": 100.0,
309
- "learning_rate": 2.0142048445803695e-10,
310
- "loss": 0.0568,
311
- "step": 4000
312
- },
313
- {
314
- "epoch": 100.0,
315
- "eval_loss": 0.4782812297344208,
316
- "eval_runtime": 130.3732,
317
- "eval_samples_per_second": 20.012,
318
- "eval_steps_per_second": 0.284,
319
- "eval_wer": 0.34393891186764236,
320
- "step": 4000
321
- },
322
- {
323
- "epoch": 100.0,
324
- "step": 4000,
325
- "total_flos": 1.0569685775816081e+20,
326
- "train_loss": 0.34811098051071165,
327
- "train_runtime": 35258.4892,
328
- "train_samples_per_second": 16.181,
329
- "train_steps_per_second": 0.113
330
- }
331
- ],
332
- "max_steps": 4000,
333
- "num_train_epochs": 100,
334
- "total_flos": 1.0569685775816081e+20,
335
- "trial_name": null,
336
- "trial_params": null
337
- }
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:2d1df92319f113f5c8e9351655b7b394f03cfde64572411bd15fe44a33d3ca39
+oid sha256:dbea5863d2f8f230bd8247e00c740ae4f41b777dcaff932c2756042244c24a81
 size 3055
wandb/debug-internal.log CHANGED
@@ -1 +1 @@
-run-20220130_021614-39ybnrhl/logs/debug-internal.log
+run-20220202_140930-21azk117/logs/debug-internal.log
wandb/debug.log CHANGED
@@ -1 +1 @@
-run-20220130_021614-39ybnrhl/logs/debug.log
+run-20220202_140930-21azk117/logs/debug.log
wandb/latest-run CHANGED
@@ -1 +1 @@
-run-20220130_021614-39ybnrhl
+run-20220202_140930-21azk117
wandb/run-20220130_021614-39ybnrhl/files/config.yaml CHANGED
@@ -4829,7 +4829,13 @@ _wandb:
 - 1
 - 5
 - 11
+2:
+- 1
+- 5
+- 11
 3:
+- 1
+- 7
 - 13
 4: 3.9.6
 5: 0.12.9
wandb/run-20220130_021614-39ybnrhl/files/output.log CHANGED
@@ -4466,3 +4466,9 @@ Upload file wandb/run-20220130_021614-39ybnrhl/run-39ybnrhl.wandb: 100%|██
 01/30/2022 12:06:39 - WARNING - huggingface_hub.repository - To https://huggingface.co/shpotes/xls-r-et
 3268d07e..314b5e1c main -> main
 Dropping the following result as it does not have all the necessary fields:
+{'dataset': {'name': 'MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - ET', 'type': 'common_voice', 'args': 'Config: et, Training split: train+validation, Eval split: test'}}
+Upload file wandb/run-20220130_021614-39ybnrhl/run-39ybnrhl.wandb: 2%|██▏ | 608k/27.6M [00:01<00:48, 589kB/s]To https://huggingface.co/shpotes/xls-r-et
+314b5e1c..69a8523f main -> main
+Upload file wandb/run-20220130_021614-39ybnrhl/run-39ybnrhl.wandb: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████| 27.6M/27.6M [00:02<00:00, 14.4MB/s]
+01/30/2022 12:06:49 - WARNING - huggingface_hub.repository - To https://huggingface.co/shpotes/xls-r-et
+314b5e1c..69a8523f main -> main
wandb/run-20220130_021614-39ybnrhl/files/wandb-summary.json CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220130_021614-39ybnrhl/logs/debug-internal.log CHANGED
@@ -11615,3 +11615,127 @@
11615
  2022-01-30 12:06:43,725 INFO Thread-8 :2586828 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/output.log
11616
  2022-01-30 12:06:44,212 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: stop_status
11617
  2022-01-30 12:06:44,213 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: stop_status
11618
+ 2022-01-30 12:06:49,728 INFO Thread-8 :2586828 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/output.log
11619
+ 2022-01-30 12:06:51,729 INFO Thread-8 :2586828 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/output.log
11620
+ 2022-01-30 12:06:54,537 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11621
+ 2022-01-30 12:06:54,538 DEBUG SenderThread:2586828 [sender.py:send():234] send: telemetry
11622
+ 2022-01-30 12:06:54,538 DEBUG SenderThread:2586828 [sender.py:send():234] send: exit
11623
+ 2022-01-30 12:06:54,538 INFO SenderThread:2586828 [sender.py:send_exit():366] handling exit code: 0
11624
+ 2022-01-30 12:06:54,539 INFO SenderThread:2586828 [sender.py:send_exit():368] handling runtime: 35438
11625
+ 2022-01-30 12:06:54,567 INFO SenderThread:2586828 [sender.py:_save_file():939] saving file wandb-summary.json with policy end
11626
+ 2022-01-30 12:06:54,567 INFO SenderThread:2586828 [sender.py:send_exit():374] send defer
11627
+ 2022-01-30 12:06:54,567 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11628
+ 2022-01-30 12:06:54,568 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11629
+ 2022-01-30 12:06:54,568 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 0
11630
+ 2022-01-30 12:06:54,569 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11631
+ 2022-01-30 12:06:54,569 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 0
11632
+ 2022-01-30 12:06:54,569 INFO SenderThread:2586828 [sender.py:transition_state():387] send defer: 1
11633
+ 2022-01-30 12:06:54,569 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11634
+ 2022-01-30 12:06:54,569 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 1
11635
+ 2022-01-30 12:06:54,722 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11636
+ 2022-01-30 12:06:54,722 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11637
+ 2022-01-30 12:06:54,723 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 1
11638
+ 2022-01-30 12:06:54,723 INFO SenderThread:2586828 [sender.py:transition_state():387] send defer: 2
11639
+ 2022-01-30 12:06:54,723 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11640
+ 2022-01-30 12:06:54,724 DEBUG SenderThread:2586828 [sender.py:send():234] send: stats
11641
+ 2022-01-30 12:06:54,724 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11642
+ 2022-01-30 12:06:54,725 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 2
11643
+ 2022-01-30 12:06:54,725 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11644
+ 2022-01-30 12:06:54,725 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 2
11645
+ 2022-01-30 12:06:54,725 INFO SenderThread:2586828 [sender.py:transition_state():387] send defer: 3
11646
+ 2022-01-30 12:06:54,726 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11647
+ 2022-01-30 12:06:54,726 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 3
11648
+ 2022-01-30 12:06:54,756 DEBUG SenderThread:2586828 [sender.py:send():234] send: summary
11649
+ 2022-01-30 12:06:54,761 INFO Thread-8 :2586828 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/output.log
11650
+ 2022-01-30 12:06:54,797 INFO Thread-8 :2586828 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/wandb-summary.json
11651
+ 2022-01-30 12:06:54,819 INFO SenderThread:2586828 [sender.py:_save_file():939] saving file wandb-summary.json with policy end
11652
+ 2022-01-30 12:06:54,819 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11653
+ 2022-01-30 12:06:54,819 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 3
11654
+ 2022-01-30 12:06:54,819 INFO SenderThread:2586828 [sender.py:transition_state():387] send defer: 4
11655
+ 2022-01-30 12:06:54,820 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11656
+ 2022-01-30 12:06:54,820 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 4
11657
+ 2022-01-30 12:06:54,820 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11658
+ 2022-01-30 12:06:54,820 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 4
11659
+ 2022-01-30 12:06:54,826 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11660
+ 2022-01-30 12:06:55,436 INFO SenderThread:2586828 [sender.py:transition_state():387] send defer: 5
11661
+ 2022-01-30 12:06:55,436 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11662
+ 2022-01-30 12:06:55,437 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11663
+ 2022-01-30 12:06:55,437 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 5
11664
+ 2022-01-30 12:06:55,437 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11665
+ 2022-01-30 12:06:55,437 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 5
11666
+ 2022-01-30 12:06:55,437 INFO SenderThread:2586828 [dir_watcher.py:finish():283] shutting down directory watcher
11667
+ 2022-01-30 12:06:55,538 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11668
+ 2022-01-30 12:06:55,757 INFO Thread-8 :2586828 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/config.yaml
11669
+ 2022-01-30 12:06:55,757 INFO SenderThread:2586828 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/wandb-summary.json
11670
+ 2022-01-30 12:06:55,758 INFO SenderThread:2586828 [dir_watcher.py:finish():313] scan: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files
11671
+ 2022-01-30 12:06:55,758 INFO SenderThread:2586828 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/config.yaml config.yaml
11672
+ 2022-01-30 12:06:55,758 INFO SenderThread:2586828 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/wandb-summary.json wandb-summary.json
11673
+ 2022-01-30 12:06:55,759 INFO SenderThread:2586828 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/requirements.txt requirements.txt
11674
+ 2022-01-30 12:06:55,759 INFO SenderThread:2586828 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/wandb-metadata.json wandb-metadata.json
11675
+ 2022-01-30 12:06:55,759 INFO SenderThread:2586828 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/output.log output.log
11676
+ 2022-01-30 12:06:55,759 INFO SenderThread:2586828 [sender.py:transition_state():387] send defer: 6
11677
+ 2022-01-30 12:06:55,765 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11678
+ 2022-01-30 12:06:55,766 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11679
+ 2022-01-30 12:06:55,772 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 6
11680
+ 2022-01-30 12:06:55,773 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11681
+ 2022-01-30 12:06:55,773 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 6
11682
+ 2022-01-30 12:06:55,773 INFO SenderThread:2586828 [file_pusher.py:finish():177] shutting down file pusher
11683
+ 2022-01-30 12:06:55,868 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11684
+ 2022-01-30 12:06:55,868 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11685
+ 2022-01-30 12:06:55,972 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11686
+ 2022-01-30 12:06:55,973 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11687
+ 2022-01-30 12:06:56,075 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11688
+ 2022-01-30 12:06:56,076 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11689
+ 2022-01-30 12:06:56,178 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11690
+ 2022-01-30 12:06:56,178 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11691
+ 2022-01-30 12:06:56,281 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11692
+ 2022-01-30 12:06:56,281 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11693
+ 2022-01-30 12:06:56,368 INFO Thread-14 :2586828 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/requirements.txt
11694
+ 2022-01-30 12:06:56,383 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11695
+ 2022-01-30 12:06:56,384 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11696
+ 2022-01-30 12:06:56,464 INFO Thread-15 :2586828 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/output.log
11697
+ 2022-01-30 12:06:56,486 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11698
+ 2022-01-30 12:06:56,487 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11699
+ 2022-01-30 12:06:56,591 INFO Thread-12 :2586828 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/config.yaml
11700
+ 2022-01-30 12:06:56,591 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11701
+ 2022-01-30 12:06:56,592 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11702
+ 2022-01-30 12:06:56,694 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11703
+ 2022-01-30 12:06:56,695 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11704
+ 2022-01-30 12:06:56,797 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11705
+ 2022-01-30 12:06:56,797 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11706
+ 2022-01-30 12:06:56,828 INFO Thread-13 :2586828 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/files/wandb-summary.json
11707
+ 2022-01-30 12:06:56,899 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11708
+ 2022-01-30 12:06:56,900 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11709
+ 2022-01-30 12:06:57,002 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11710
+ 2022-01-30 12:06:57,002 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11711
+ 2022-01-30 12:06:57,029 INFO Thread-7 :2586828 [sender.py:transition_state():387] send defer: 7
11712
+ 2022-01-30 12:06:57,030 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11713
+ 2022-01-30 12:06:57,030 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 7
11714
+ 2022-01-30 12:06:57,030 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11715
+ 2022-01-30 12:06:57,030 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 7
11716
+ 2022-01-30 12:06:57,105 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11717
+ 2022-01-30 12:06:58,369 INFO SenderThread:2586828 [sender.py:transition_state():387] send defer: 8
11718
+ 2022-01-30 12:06:58,369 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11719
+ 2022-01-30 12:06:58,370 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11720
+ 2022-01-30 12:06:58,370 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 8
11721
+ 2022-01-30 12:06:58,370 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11722
+ 2022-01-30 12:06:58,371 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 8
11723
+ 2022-01-30 12:06:58,371 INFO SenderThread:2586828 [sender.py:transition_state():387] send defer: 9
11724
+ 2022-01-30 12:06:58,372 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: defer
11725
+ 2022-01-30 12:06:58,373 DEBUG SenderThread:2586828 [sender.py:send():234] send: final
11726
+ 2022-01-30 12:06:58,373 INFO HandlerThread:2586828 [handler.py:handle_request_defer():147] handle defer: 9
11727
+ 2022-01-30 12:06:58,373 DEBUG SenderThread:2586828 [sender.py:send():234] send: footer
11728
+ 2022-01-30 12:06:58,374 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: defer
11729
+ 2022-01-30 12:06:58,374 INFO SenderThread:2586828 [sender.py:send_request_defer():383] handle sender defer: 9
11730
+ 2022-01-30 12:06:58,471 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: poll_exit
11731
+ 2022-01-30 12:06:58,472 DEBUG SenderThread:2586828 [sender.py:send_request():248] send_request: poll_exit
11732
+ 2022-01-30 12:06:58,472 INFO SenderThread:2586828 [file_pusher.py:join():182] waiting for file pusher
11733
+ 2022-01-30 12:06:58,752 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: get_summary
11734
+ 2022-01-30 12:06:58,809 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: sampled_history
11735
+ 2022-01-30 12:06:58,813 DEBUG HandlerThread:2586828 [handler.py:handle_request():130] handle_request: shutdown
11736
+ 2022-01-30 12:06:58,813 INFO HandlerThread:2586828 [handler.py:finish():731] shutting down handler
11737
+ 2022-01-30 12:06:59,373 INFO WriterThread:2586828 [datastore.py:close():281] close: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_021614-39ybnrhl/run-39ybnrhl.wandb
11738
+ 2022-01-30 12:06:59,749 INFO SenderThread:2586828 [sender.py:finish():1070] shutting down sender
11739
+ 2022-01-30 12:06:59,749 INFO SenderThread:2586828 [file_pusher.py:finish():177] shutting down file pusher
11740
+ 2022-01-30 12:06:59,749 INFO SenderThread:2586828 [file_pusher.py:join():182] waiting for file pusher
11741
+ 2022-01-30 12:06:59,756 INFO MainThread:2586828 [internal.py:handle_exit():77] Internal process exited
wandb/run-20220130_021614-39ybnrhl/logs/debug.log CHANGED
@@ -22,3 +22,157 @@ config: {}
22
  2022-01-30 02:16:15,933 INFO MainThread:2586598 [wandb_init.py:init():633] run started, returning control to user process
23
  2022-01-30 02:16:15,950 INFO MainThread:2586598 [wandb_run.py:_config_callback():956] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 36, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.16.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 39, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.1, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 72, 'per_device_eval_batch_size': 72, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 2, 'eval_accumulation_steps': 'None', 'learning_rate': 0.0003, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 100.0, 'max_steps': -1, 'lr_scheduler_type': 'cosine', 'warmup_ratio': 
0.0, 'warmup_steps': 500, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Jan30_02-15-54_ganymede.eafit.edu.co', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 1, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'cosine+drop_proj+low_specaugment-300M', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 72, 'eval_batch_size': 72}
24
  2022-01-30 02:16:15,953 INFO MainThread:2586598 [wandb_watch.py:watch():43] Watching
25
+ 2022-01-30 12:06:52,264 INFO MainThread:2586598 [wandb_run.py:_atexit_cleanup():1780] got exitcode: 0
26
+ 2022-01-30 12:06:52,266 INFO MainThread:2586598 [wandb_run.py:_restore():1752] restore
27
+ 2022-01-30 12:06:54,568 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
28
+ wandb_count: 1
29
+ }
30
+ pusher_stats {
31
+ uploaded_bytes: 2324
32
+ total_bytes: 2324
33
+ }
34
+
35
+ 2022-01-30 12:06:54,725 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
36
+ wandb_count: 1
37
+ }
38
+ pusher_stats {
39
+ uploaded_bytes: 2324
40
+ total_bytes: 2324
41
+ }
42
+
43
+ 2022-01-30 12:06:55,437 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
44
+ wandb_count: 1
45
+ }
46
+ pusher_stats {
47
+ uploaded_bytes: 2324
48
+ total_bytes: 2324
49
+ }
50
+
51
+ 2022-01-30 12:06:55,766 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
52
+ wandb_count: 5
53
+ }
54
+ pusher_stats {
55
+ uploaded_bytes: 2324
56
+ total_bytes: 823600
57
+ }
58
+
59
+ 2022-01-30 12:06:55,869 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
60
+ wandb_count: 5
61
+ }
62
+ pusher_stats {
63
+ uploaded_bytes: 2324
64
+ total_bytes: 823600
65
+ }
66
+
67
+ 2022-01-30 12:06:55,974 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
68
+ wandb_count: 5
69
+ }
70
+ pusher_stats {
71
+ uploaded_bytes: 2324
72
+ total_bytes: 823600
73
+ }
74
+
75
+ 2022-01-30 12:06:56,076 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
76
+ wandb_count: 5
77
+ }
78
+ pusher_stats {
79
+ uploaded_bytes: 648240
80
+ total_bytes: 823600
81
+ }
82
+
83
+ 2022-01-30 12:06:56,179 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
84
+ wandb_count: 5
85
+ }
86
+ pusher_stats {
87
+ uploaded_bytes: 823600
88
+ total_bytes: 823600
89
+ }
90
+
91
+ 2022-01-30 12:06:56,282 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
92
+ wandb_count: 5
93
+ }
94
+ pusher_stats {
95
+ uploaded_bytes: 823600
96
+ total_bytes: 823600
97
+ }
98
+
99
+ 2022-01-30 12:06:56,384 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
100
+ wandb_count: 5
101
+ }
102
+ pusher_stats {
103
+ uploaded_bytes: 823600
104
+ total_bytes: 823600
105
+ }
106
+
107
+ 2022-01-30 12:06:56,487 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
108
+ wandb_count: 5
109
+ }
110
+ pusher_stats {
111
+ uploaded_bytes: 823600
112
+ total_bytes: 823600
113
+ }
114
+
115
+ 2022-01-30 12:06:56,593 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
116
+ wandb_count: 5
117
+ }
118
+ pusher_stats {
119
+ uploaded_bytes: 823600
120
+ total_bytes: 823600
121
+ }
122
+
123
+ 2022-01-30 12:06:56,696 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
124
+ wandb_count: 5
125
+ }
126
+ pusher_stats {
127
+ uploaded_bytes: 823600
128
+ total_bytes: 823600
129
+ }
130
+
131
+ 2022-01-30 12:06:56,798 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
132
+ wandb_count: 5
133
+ }
134
+ pusher_stats {
135
+ uploaded_bytes: 823600
136
+ total_bytes: 823600
137
+ }
138
+
139
+ 2022-01-30 12:06:56,901 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
140
+ wandb_count: 5
141
+ }
142
+ pusher_stats {
143
+ uploaded_bytes: 823600
144
+ total_bytes: 823600
145
+ }
146
+
147
+ 2022-01-30 12:06:57,003 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
148
+ wandb_count: 5
149
+ }
150
+ pusher_stats {
151
+ uploaded_bytes: 823600
152
+ total_bytes: 823600
153
+ }
154
+
155
+ 2022-01-30 12:06:58,370 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
156
+ wandb_count: 5
157
+ }
158
+ pusher_stats {
159
+ uploaded_bytes: 823600
160
+ total_bytes: 823600
161
+ }
162
+
163
+ 2022-01-30 12:06:58,749 INFO MainThread:2586598 [wandb_run.py:_wait_for_finish():1912] got exit ret: done: true
164
+ exit_result {
165
+ }
166
+ file_counts {
167
+ wandb_count: 5
168
+ }
169
+ pusher_stats {
170
+ uploaded_bytes: 823600
171
+ total_bytes: 823600
172
+ }
173
+ local_info {
174
+ }
175
+
176
+ 2022-01-30 12:06:59,849 INFO MainThread:2586598 [wandb_run.py:_append_history():2130] rendering history
177
+ 2022-01-30 12:06:59,851 INFO MainThread:2586598 [wandb_run.py:_append_summary():2085] rendering summary
178
+ 2022-01-30 12:06:59,852 INFO MainThread:2586598 [wandb_run.py:_append_files():2180] logging synced files
wandb/run-20220130_021614-39ybnrhl/run-39ybnrhl.wandb CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:6eac2795fa7f814318ab928f073a82af9829ba5ae347710e438bbde295fd8803
-size 28982149
+oid sha256:30ee8fdaa5352284f95b60d219533a41ece77f180dedf05d88329a2405ba18e2
+size 29629456
wandb/run-20220130_235926-2f5mzr6c/files/config.yaml ADDED
@@ -0,0 +1,655 @@
1
+ wandb_version: 1
2
+
3
+ _n_gpu:
4
+ desc: null
5
+ value: 1
6
+ _name_or_path:
7
+ desc: null
8
+ value: facebook/wav2vec2-xls-r-300m
9
+ _wandb:
10
+ desc: null
11
+ value:
12
+ cli_version: 0.12.9
13
+ framework: huggingface
14
+ huggingface_version: 4.16.0.dev0
15
+ is_jupyter_run: false
16
+ is_kaggle_kernel: true
17
+ m:
18
+ - 1: train/global_step
19
+ 6:
20
+ - 3
21
+ python_version: 3.9.6
22
+ start_time: 1643605166
23
+ t:
24
+ 1:
25
+ - 1
26
+ - 5
27
+ - 11
28
+ 2:
29
+ - 1
30
+ - 5
31
+ - 11
32
+ 3:
33
+ - 1
34
+ - 7
35
+ - 13
36
+ 4: 3.9.6
37
+ 5: 0.12.9
38
+ 6: 4.16.0.dev0
39
+ 8:
40
+ - 2
41
+ - 5
42
+ activation_dropout:
43
+ desc: null
44
+ value: 0.0
45
+ adafactor:
46
+ desc: null
47
+ value: false
48
+ adam_beta1:
49
+ desc: null
50
+ value: 0.9
51
+ adam_beta2:
52
+ desc: null
53
+ value: 0.999
54
+ adam_epsilon:
55
+ desc: null
56
+ value: 1.0e-08
57
+ adapter_kernel_size:
58
+ desc: null
59
+ value: 3
60
+ adapter_stride:
61
+ desc: null
62
+ value: 2
63
+ add_adapter:
64
+ desc: null
65
+ value: false
66
+ add_cross_attention:
67
+ desc: null
68
+ value: false
69
+ apply_spec_augment:
70
+ desc: null
71
+ value: true
72
+ architectures:
73
+ desc: null
74
+ value:
75
+ - Wav2Vec2ForPreTraining
76
+ attention_dropout:
77
+ desc: null
78
+ value: 0.0
79
+ bad_words_ids:
80
+ desc: null
81
+ value: null
82
+ bf16:
83
+ desc: null
84
+ value: false
85
+ bf16_full_eval:
86
+ desc: null
87
+ value: false
88
+ bos_token_id:
89
+ desc: null
90
+ value: 1
91
+ chunk_size_feed_forward:
92
+ desc: null
93
+ value: 0
94
+ classifier_proj_size:
95
+ desc: null
96
+ value: 256
97
+ codevector_dim:
98
+ desc: null
99
+ value: 768
100
+ contrastive_logits_temperature:
101
+ desc: null
102
+ value: 0.1
103
+ conv_bias:
104
+ desc: null
105
+ value: true
106
+ conv_dim:
107
+ desc: null
108
+ value:
109
+ - 512
110
+ - 512
111
+ - 512
112
+ - 512
113
+ - 512
114
+ - 512
115
+ - 512
116
+ conv_kernel:
117
+ desc: null
118
+ value:
119
+ - 10
120
+ - 3
121
+ - 3
122
+ - 3
123
+ - 3
124
+ - 2
125
+ - 2
126
+ conv_stride:
127
+ desc: null
128
+ value:
129
+ - 5
130
+ - 2
131
+ - 2
132
+ - 2
133
+ - 2
134
+ - 2
135
+ - 2
136
+ cross_attention_hidden_size:
137
+ desc: null
138
+ value: null
139
+ ctc_loss_reduction:
140
+ desc: null
141
+ value: mean
142
+ ctc_zero_infinity:
143
+ desc: null
144
+ value: false
145
+ dataloader_drop_last:
146
+ desc: null
147
+ value: false
148
+ dataloader_num_workers:
149
+ desc: null
150
+ value: 0
151
+ dataloader_pin_memory:
152
+ desc: null
153
+ value: true
154
+ ddp_bucket_cap_mb:
155
+ desc: null
156
+ value: None
157
+ ddp_find_unused_parameters:
158
+ desc: null
159
+ value: None
160
+ debug:
161
+ desc: null
162
+ value: '[]'
163
+ decoder_start_token_id:
164
+ desc: null
165
+ value: null
166
+ deepspeed:
167
+ desc: null
168
+ value: None
169
+ disable_tqdm:
170
+ desc: null
171
+ value: false
172
+ diversity_loss_weight:
173
+ desc: null
174
+ value: 0.1
175
+ diversity_penalty:
176
+ desc: null
177
+ value: 0.0
178
+ do_eval:
179
+ desc: null
180
+ value: true
181
+ do_predict:
182
+ desc: null
183
+ value: false
184
+ do_sample:
185
+ desc: null
186
+ value: false
187
+ do_stable_layer_norm:
188
+ desc: null
189
+ value: true
190
+ do_train:
191
+ desc: null
192
+ value: true
193
+ early_stopping:
194
+ desc: null
195
+ value: false
196
+ encoder_no_repeat_ngram_size:
197
+ desc: null
198
+ value: 0
199
+ eos_token_id:
200
+ desc: null
201
+ value: 2
202
+ eval_accumulation_steps:
203
+ desc: null
204
+ value: None
205
+ eval_batch_size:
206
+ desc: null
207
+ value: 72
208
+ eval_steps:
209
+ desc: null
210
+ value: 500
211
+ evaluation_strategy:
212
+ desc: null
213
+ value: steps
214
+ feat_extract_activation:
215
+ desc: null
216
+ value: gelu
217
+ feat_extract_dropout:
218
+ desc: null
219
+ value: 0.0
220
+ feat_extract_norm:
221
+ desc: null
222
+ value: layer
223
+ feat_proj_dropout:
224
+ desc: null
225
+ value: 0.1
226
+ feat_quantizer_dropout:
227
+ desc: null
228
+ value: 0.0
229
+ final_dropout:
230
+ desc: null
231
+ value: 0.0
232
+ finetuning_task:
233
+ desc: null
234
+ value: null
235
+ forced_bos_token_id:
236
+ desc: null
237
+ value: null
238
+ forced_eos_token_id:
239
+ desc: null
240
+ value: null
241
+ fp16:
242
+ desc: null
243
+ value: true
244
+ fp16_backend:
245
+ desc: null
246
+ value: auto
247
+ fp16_full_eval:
248
+ desc: null
249
+ value: false
250
+ fp16_opt_level:
251
+ desc: null
252
+ value: O1
253
+ gradient_accumulation_steps:
254
+ desc: null
255
+ value: 2
256
+ gradient_checkpointing:
257
+ desc: null
258
+ value: true
259
+ greater_is_better:
260
+ desc: null
261
+ value: None
262
+ group_by_length:
263
+ desc: null
264
+ value: true
265
+ half_precision_backend:
266
+ desc: null
267
+ value: amp
268
+ hidden_act:
269
+ desc: null
270
+ value: gelu
271
+ hidden_dropout:
272
+ desc: null
273
+ value: 0.0
274
+ hidden_size:
275
+ desc: null
276
+ value: 1024
277
+ hub_model_id:
278
+ desc: null
279
+ value: None
280
+ hub_strategy:
281
+ desc: null
282
+ value: every_save
283
+ hub_token:
284
+ desc: null
285
+ value: <HUB_TOKEN>
286
+ id2label:
287
+ desc: null
288
+ value:
289
+ '0': LABEL_0
290
+ '1': LABEL_1
291
+ ignore_data_skip:
292
+ desc: null
293
+ value: false
294
+ initializer_range:
295
+ desc: null
296
+ value: 0.02
297
+ intermediate_size:
298
+ desc: null
299
+ value: 4096
300
+ is_decoder:
301
+ desc: null
302
+ value: false
303
+ is_encoder_decoder:
304
+ desc: null
305
+ value: false
306
+ label2id:
307
+ desc: null
308
+ value:
309
+ LABEL_0: 0
310
+ LABEL_1: 1
311
+ label_names:
312
+ desc: null
313
+ value: None
314
+ label_smoothing_factor:
315
+ desc: null
316
+ value: 0.0
317
+ layer_norm_eps:
318
+ desc: null
319
+ value: 1.0e-05
320
+ layerdrop:
321
+ desc: null
322
+ value: 0.0
323
+ learning_rate:
324
+ desc: null
325
+ value: 0.0003
326
+ length_column_name:
327
+ desc: null
328
+ value: input_length
329
+ length_penalty:
330
+ desc: null
331
+ value: 1.0
332
+ load_best_model_at_end:
333
+ desc: null
334
+ value: false
335
+ local_rank:
336
+ desc: null
337
+ value: -1
338
+ log_level:
339
+ desc: null
340
+ value: -1
341
+ log_level_replica:
342
+ desc: null
343
+ value: -1
344
+ log_on_each_node:
345
+ desc: null
346
+ value: true
347
+ logging_dir:
348
+ desc: null
349
+ value: ./runs/Jan30_23-55-50_ganymede.eafit.edu.co
350
+ logging_first_step:
351
+ desc: null
352
+ value: false
353
+ logging_nan_inf_filter:
354
+ desc: null
355
+ value: true
356
+ logging_steps:
357
+ desc: null
358
+ value: 100
359
+ logging_strategy:
360
+ desc: null
361
+ value: steps
362
+ lr_scheduler_type:
363
+ desc: null
364
+ value: cosine
365
+ mask_feature_length:
366
+ desc: null
367
+ value: 10
368
+ mask_feature_min_masks:
369
+ desc: null
370
+ value: 0
371
+ mask_feature_prob:
372
+ desc: null
373
+ value: 0.0
374
+ mask_time_length:
375
+ desc: null
376
+ value: 10
377
+ mask_time_min_masks:
378
+ desc: null
379
+ value: 2
380
+ mask_time_prob:
381
+ desc: null
382
+ value: 0.1
383
+ max_grad_norm:
384
+ desc: null
385
+ value: 1.0
386
+ max_length:
387
+ desc: null
388
+ value: 20
389
+ max_steps:
390
+ desc: null
391
+ value: -1
392
+ metric_for_best_model:
393
+ desc: null
394
+ value: None
395
+ min_length:
396
+ desc: null
397
+ value: 0
398
+ model_type:
399
+ desc: null
400
+ value: wav2vec2
401
+ mp_parameters:
402
+ desc: null
403
+ value: ''
404
+ no_cuda:
405
+ desc: null
406
+ value: false
407
+ no_repeat_ngram_size:
408
+ desc: null
409
+ value: 0
410
+ num_adapter_layers:
411
+ desc: null
412
+ value: 3
413
+ num_attention_heads:
414
+ desc: null
415
+ value: 16
416
+ num_beam_groups:
417
+ desc: null
418
+ value: 1
419
+ num_beams:
420
+ desc: null
421
+ value: 1
422
+ num_codevector_groups:
423
+ desc: null
424
+ value: 2
425
+ num_codevectors_per_group:
426
+ desc: null
427
+ value: 320
428
+ num_conv_pos_embedding_groups:
429
+ desc: null
430
+ value: 16
431
+ num_conv_pos_embeddings:
432
+ desc: null
433
+ value: 128
434
+ num_feat_extract_layers:
435
+ desc: null
436
+ value: 7
437
+ num_hidden_layers:
438
+ desc: null
439
+ value: 24
440
+ num_negatives:
441
+ desc: null
442
+ value: 100
443
+ num_return_sequences:
444
+ desc: null
445
+ value: 1
446
+ num_train_epochs:
447
+ desc: null
448
+ value: 100.0
449
+ optim:
450
+ desc: null
451
+ value: adamw_hf
452
+ output_attentions:
453
+ desc: null
454
+ value: false
455
+ output_dir:
456
+ desc: null
457
+ value: ./
458
+ output_hidden_size:
459
+ desc: null
460
+ value: 1024
461
+ output_hidden_states:
462
+ desc: null
463
+ value: false
464
+ output_scores:
465
+ desc: null
466
+ value: false
467
+ overwrite_output_dir:
468
+ desc: null
469
+ value: true
470
+ pad_token_id:
471
+ desc: null
472
+ value: 36
473
+ past_index:
474
+ desc: null
475
+ value: -1
476
+ per_device_eval_batch_size:
477
+ desc: null
478
+ value: 72
479
+ per_device_train_batch_size:
480
+ desc: null
481
+ value: 72
482
+ per_gpu_eval_batch_size:
483
+ desc: null
484
+ value: None
485
+ per_gpu_train_batch_size:
486
+ desc: null
487
+ value: None
488
+ prediction_loss_only:
489
+ desc: null
490
+ value: false
491
+ prefix:
492
+ desc: null
493
+ value: null
494
+ problem_type:
495
+ desc: null
496
+ value: null
497
+ proj_codevector_dim:
498
+ desc: null
499
+ value: 768
500
+ pruned_heads:
501
+ desc: null
502
+ value: {}
503
+ push_to_hub:
504
+ desc: null
505
+ value: false
506
+ push_to_hub_model_id:
507
+ desc: null
508
+ value: None
509
+ push_to_hub_organization:
510
+ desc: null
511
+ value: None
512
+ push_to_hub_token:
513
+ desc: null
514
+ value: <PUSH_TO_HUB_TOKEN>
515
+ remove_invalid_values:
516
+ desc: null
517
+ value: false
518
+ remove_unused_columns:
519
+ desc: null
520
+ value: true
521
+ repetition_penalty:
522
+ desc: null
523
+ value: 1.0
524
+ report_to:
525
+ desc: null
526
+ value: '[''wandb'']'
527
+ resume_from_checkpoint:
528
+ desc: null
529
+ value: None
530
+ return_dict:
531
+ desc: null
532
+ value: true
533
+ return_dict_in_generate:
534
+ desc: null
535
+ value: false
536
+ run_name:
537
+ desc: null
538
+ value: cosine+drop_proj+low_specaugment-300M+cv_8_0
539
+ save_on_each_node:
540
+ desc: null
541
+ value: false
542
+ save_steps:
543
+ desc: null
544
+ value: 500
545
+ save_strategy:
546
+ desc: null
547
+ value: steps
548
+ save_total_limit:
549
+ desc: null
550
+ value: 1
551
+ seed:
552
+ desc: null
553
+ value: 42
554
+ sep_token_id:
555
+ desc: null
556
+ value: null
557
+ sharded_ddp:
558
+ desc: null
559
+ value: '[]'
560
+ skip_memory_metrics:
561
+ desc: null
562
+ value: true
563
+ task_specific_params:
564
+ desc: null
565
+ value: null
566
+ tdnn_dilation:
567
+ desc: null
568
+ value:
569
+ - 1
570
+ - 2
571
+ - 3
572
+ - 1
573
+ - 1
574
+ tdnn_dim:
575
+ desc: null
576
+ value:
577
+ - 512
578
+ - 512
579
+ - 512
580
+ - 512
581
+ - 1500
582
+ tdnn_kernel:
583
+ desc: null
584
+ value:
585
+ - 5
586
+ - 3
587
+ - 3
588
+ - 1
589
+ - 1
590
+ temperature:
591
+ desc: null
592
+ value: 1.0
593
+ tf32:
594
+ desc: null
595
+ value: None
596
+ tie_encoder_decoder:
597
+ desc: null
598
+ value: false
599
+ tie_word_embeddings:
600
+ desc: null
601
+ value: true
602
+ tokenizer_class:
603
+ desc: null
604
+ value: null
605
+ top_k:
606
+ desc: null
607
+ value: 50
608
+ top_p:
609
+ desc: null
610
+ value: 1.0
611
+ torch_dtype:
612
+ desc: null
613
+ value: float32
614
+ torchscript:
615
+ desc: null
616
+ value: false
617
+ tpu_metrics_debug:
618
+ desc: null
619
+ value: false
620
+ tpu_num_cores:
621
+ desc: null
622
+ value: None
623
+ train_batch_size:
624
+ desc: null
625
+ value: 72
626
+ transformers_version:
627
+ desc: null
628
+ value: 4.16.0.dev0
629
+ use_bfloat16:
630
+ desc: null
631
+ value: false
632
+ use_legacy_prediction_loop:
633
+ desc: null
634
+ value: false
635
+ use_weighted_layer_sum:
636
+ desc: null
637
+ value: false
638
+ vocab_size:
639
+ desc: null
640
+ value: 39
641
+ warmup_ratio:
642
+ desc: null
643
+ value: 0.0
644
+ warmup_steps:
645
+ desc: null
646
+ value: 500
647
+ weight_decay:
648
+ desc: null
649
+ value: 0.0
650
+ xpu_backend:
651
+ desc: null
652
+ value: None
653
+ xvector_output_dim:
654
+ desc: null
655
+ value: 512
wandb/run-20220130_235926-2f5mzr6c/files/output.log ADDED
@@ -0,0 +1,35 @@
+
+ 0%| | 0/4000 [00:00<?, ?it/s]Traceback (most recent call last):
+ File "/home/sagrilaft/Project/audio/xls-r-et/src/run_speech_recognition_ctc_bnb.py", line 760, in <module>
+ main()
+ File "/home/sagrilaft/Project/audio/xls-r-et/src/run_speech_recognition_ctc_bnb.py", line 711, in main
+ train_result = trainer.train(resume_from_checkpoint=checkpoint)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/trainer.py", line 1365, in train
+ tr_loss_step = self.training_step(model, inputs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/trainer.py", line 1940, in training_step
+ loss = self.compute_loss(model, inputs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/trainer.py", line 1972, in compute_loss
+ outputs = model(**inputs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/models/wav2vec2/modeling_wav2vec2.py", line 1720, in forward
+ outputs = self.wav2vec2(
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/models/wav2vec2/modeling_wav2vec2.py", line 1313, in forward
+ extract_features = self.feature_extractor(input_values)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/models/wav2vec2/modeling_wav2vec2.py", line 482, in forward
+ hidden_states = conv_layer(hidden_states)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/models/wav2vec2/modeling_wav2vec2.py", line 357, in forward
+ hidden_states = self.layer_norm(hidden_states)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/normalization.py", line 189, in forward
+ return F.layer_norm(
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/functional.py", line 2347, in layer_norm
+ return torch.layer_norm(input, normalized_shape, weight, bias, eps, torch.backends.cudnn.enabled)
+ RuntimeError: CUDA out of memory. Tried to allocate 4.71 GiB (GPU 0; 31.75 GiB total capacity; 3.62 GiB already allocated; 1.22 GiB free; 3.62 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
wandb/run-20220130_235926-2f5mzr6c/files/requirements.txt ADDED
@@ -0,0 +1,87 @@
+ aiohttp==3.8.1
+ aiosignal==1.2.0
+ appdirs==1.4.4
+ async-timeout==4.0.2
+ attrs==21.4.0
+ audioread==2.1.9
+ bitsandbytes-cuda113==0.26.0
+ certifi==2021.10.8
+ cffi==1.15.0
+ charset-normalizer==2.0.10
+ click==8.0.3
+ clldutils==3.10.1
+ colorlog==6.6.0
+ configparser==5.2.0
+ csvw==1.11.0
+ datasets==1.18.1.dev0
+ decorator==5.1.1
+ dill==0.3.4
+ dlinfo==1.2.1
+ docker-pycreds==0.4.0
+ filelock==3.4.2
+ frozenlist==1.3.0
+ fsspec==2022.1.0
+ gitdb==4.0.9
+ gitpython==3.1.26
+ huggingface-hub==0.4.0
+ hypothesis==6.36.0
+ idna==3.3
+ isodate==0.6.1
+ jiwer==2.3.0
+ joblib==1.1.0
+ librosa==0.8.1
+ llvmlite==0.38.0
+ multidict==6.0.2
+ multiprocess==0.70.12.2
+ numba==0.55.0
+ numpy==1.21.5
+ packaging==21.3
+ pandas==1.4.0
+ pathtools==0.1.2
+ phonemizer==3.0.1
+ pip==21.3.1
+ pooch==1.6.0
+ promise==2.3
+ protobuf==3.19.3
+ psutil==5.9.0
+ pyarrow==6.0.1
+ pycparser==2.21
+ pyctcdecode==0.3.0
+ pygtrie==2.4.2
+ pyparsing==3.0.7
+ python-dateutil==2.8.2
+ python-levenshtein==0.12.2
+ pytz==2021.3
+ pyyaml==6.0
+ regex==2022.1.18
+ requests==2.27.1
+ resampy==0.2.2
+ rfc3986==2.0.0
+ sacremoses==0.0.47
+ scikit-learn==1.0.2
+ scipy==1.7.3
+ segments==2.2.0
+ sentry-sdk==1.5.4
+ setuptools==60.2.0
+ shortuuid==1.0.8
+ six==1.16.0
+ smmap==5.0.0
+ sortedcontainers==2.4.0
+ soundfile==0.10.3.post1
+ subprocess32==3.5.4
+ tabulate==0.8.9
+ termcolor==1.1.0
+ threadpoolctl==3.0.0
+ tokenizers==0.11.4
+ torch==1.10.1
+ torchaudio==0.10.1
+ tqdm==4.62.3
+ transformers==4.16.0.dev0
+ typing-extensions==4.0.1
+ uritemplate==4.1.1
+ urllib3==1.26.8
+ wandb==0.12.9
+ wheel==0.37.1
+ xxhash==2.0.2
+ yarl==1.7.2
+ yaspin==2.1.0
wandb/run-20220130_235926-2f5mzr6c/files/wandb-metadata.json ADDED
@@ -0,0 +1,75 @@
+ {
+ "os": "Linux-4.18.0-305.10.2.el8_4.x86_64-x86_64-with-glibc2.28",
+ "python": "3.9.6",
+ "heartbeatAt": "2022-01-31T04:59:28.040302",
+ "startedAt": "2022-01-31T04:59:26.944858",
+ "docker": null,
+ "gpu": "Tesla V100-PCIE-32GB",
+ "gpu_count": 3,
+ "cpu_count": 64,
+ "cuda": null,
+ "args": [
+ "--dataset_name=mozilla-foundation/common_voice_8_0",
+ "--model_name_or_path=facebook/wav2vec2-xls-r-300m",
+ "--dataset_config_name=et",
+ "--output_dir=./",
+ "--overwrite_output_dir",
+ "--num_train_epochs=100",
+ "--per_device_train_batch_size=72",
+ "--per_device_eval_batch_size=72",
+ "--gradient_accumulation_steps=2",
+ "--learning_rate=3e-4",
+ "--save_total_limit=1",
+ "--warmup_steps=500",
+ "--evaluation_strategy=steps",
+ "--text_column_name=sentence",
+ "--length_column_name=input_length",
+ "--save_steps=500",
+ "--eval_steps=500",
+ "--logging_steps=100",
+ "--layerdrop=0.0",
+ "--freeze_feature_encoder",
+ "--feat_proj_dropout=0.1",
+ "--chars_to_ignore",
+ ",",
+ "?",
+ ".",
+ "!",
+ "-",
+ ";",
+ ":",
+ "\"",
+ "\u201c",
+ "%",
+ "\u2018",
+ "\u201d",
+ "\ufffd",
+ "\u2014",
+ "\u2019",
+ "\u2026",
+ "\u2013",
+ "--gradient_checkpointing",
+ "--lr_scheduler_type=cosine",
+ "--fp16",
+ "--group_by_length",
+ "--mask_time_prob=0.1",
+ "--mask_time_length=10",
+ "--report_to=wandb",
+ "--run_name=cosine+drop_proj+low_specaugment-300M+cv_8_0",
+ "--do_train",
+ "--do_eval",
+ "--use_auth_token"
+ ],
+ "state": "running",
+ "program": "/home/sagrilaft/Project/audio/xls-r-et/src/run_speech_recognition_ctc_bnb.py",
+ "codePath": "src/run_speech_recognition_ctc_bnb.py",
+ "git": {
+ "remote": "https://huggingface.co/shpotes/xls-r-et",
+ "commit": "ff66b86f52be4c55fb5be74a60f889284554c939"
+ },
+ "email": "shpotes3@gmail.com",
+ "root": "/home/sagrilaft/Project/audio/xls-r-et",
+ "host": "ganymede.eafit.edu.co",
+ "username": "sagrilaft",
+ "executable": "/home/sagrilaft/Project/audio/xls-r-et/.venv/bin/python"
+ }
wandb/run-20220130_235926-2f5mzr6c/files/wandb-summary.json ADDED
@@ -0,0 +1 @@
+ {"_wandb": {"runtime": 6}}
wandb/run-20220130_235926-2f5mzr6c/logs/debug-internal.log ADDED
@@ -0,0 +1,144 @@
1
+ 2022-01-30 23:59:27,435 INFO MainThread:2678958 [internal.py:wandb_internal():87] W&B internal server running at pid: 2678958, started at: 2022-01-30 23:59:27.435029
2
+ 2022-01-30 23:59:27,437 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: check_version
3
+ 2022-01-30 23:59:27,437 INFO WriterThread:2678958 [datastore.py:open_for_write():77] open: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/run-2f5mzr6c.wandb
4
+ 2022-01-30 23:59:27,440 DEBUG SenderThread:2678958 [sender.py:send():234] send: header
5
+ 2022-01-30 23:59:27,440 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: check_version
6
+ 2022-01-30 23:59:27,856 DEBUG SenderThread:2678958 [sender.py:send():234] send: run
7
+ 2022-01-30 23:59:28,031 INFO SenderThread:2678958 [dir_watcher.py:__init__():169] watching files in: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files
8
+ 2022-01-30 23:59:28,032 INFO SenderThread:2678958 [sender.py:_start_run_threads():804] run started: 2f5mzr6c with start time 1643605166
9
+ 2022-01-30 23:59:28,032 DEBUG SenderThread:2678958 [sender.py:send():234] send: summary
10
+ 2022-01-30 23:59:28,032 INFO SenderThread:2678958 [sender.py:_save_file():939] saving file wandb-summary.json with policy end
11
+ 2022-01-30 23:59:28,033 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: run_start
12
+ 2022-01-30 23:59:28,040 DEBUG HandlerThread:2678958 [meta.py:__init__():40] meta init
13
+ 2022-01-30 23:59:28,040 DEBUG HandlerThread:2678958 [meta.py:__init__():54] meta init done
14
+ 2022-01-30 23:59:28,040 DEBUG HandlerThread:2678958 [meta.py:probe():214] probe
15
+ 2022-01-30 23:59:28,046 DEBUG HandlerThread:2678958 [meta.py:_setup_git():204] setup git
16
+ 2022-01-30 23:59:28,060 DEBUG HandlerThread:2678958 [meta.py:_setup_git():211] setup git done
17
+ 2022-01-30 23:59:28,060 DEBUG HandlerThread:2678958 [meta.py:_save_pip():58] save pip
18
+ 2022-01-30 23:59:28,060 DEBUG HandlerThread:2678958 [meta.py:_save_pip():72] save pip done
19
+ 2022-01-30 23:59:28,060 DEBUG HandlerThread:2678958 [meta.py:probe():252] probe done
20
+ 2022-01-30 23:59:28,063 DEBUG SenderThread:2678958 [sender.py:send():234] send: files
21
+ 2022-01-30 23:59:28,063 INFO SenderThread:2678958 [sender.py:_save_file():939] saving file wandb-metadata.json with policy now
22
+ 2022-01-30 23:59:28,071 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: stop_status
23
+ 2022-01-30 23:59:28,071 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: stop_status
24
+ 2022-01-30 23:59:28,199 DEBUG SenderThread:2678958 [sender.py:send():234] send: config
25
+ 2022-01-30 23:59:28,200 DEBUG SenderThread:2678958 [sender.py:send():234] send: metric
26
+ 2022-01-30 23:59:28,200 DEBUG SenderThread:2678958 [sender.py:send():234] send: metric
27
+ 2022-01-30 23:59:28,200 WARNING SenderThread:2678958 [sender.py:send_metric():897] Seen metric with glob (shouldnt happen)
28
+ 2022-01-30 23:59:28,627 INFO Thread-11 :2678958 [upload_job.py:push():137] Uploaded file /tmp/tmpp6bl39oswandb/6kg16zdm-wandb-metadata.json
29
+ 2022-01-30 23:59:29,033 INFO Thread-8 :2678958 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/wandb-summary.json
30
+ 2022-01-30 23:59:29,033 INFO Thread-8 :2678958 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/wandb-metadata.json
31
+ 2022-01-30 23:59:29,033 INFO Thread-8 :2678958 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/output.log
32
+ 2022-01-30 23:59:29,033 INFO Thread-8 :2678958 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/requirements.txt
33
+ 2022-01-30 23:59:31,032 INFO Thread-8 :2678958 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/output.log
34
+ 2022-01-30 23:59:34,715 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
35
+ 2022-01-30 23:59:34,715 DEBUG SenderThread:2678958 [sender.py:send():234] send: telemetry
36
+ 2022-01-30 23:59:34,716 DEBUG SenderThread:2678958 [sender.py:send():234] send: exit
37
+ 2022-01-30 23:59:34,716 INFO SenderThread:2678958 [sender.py:send_exit():366] handling exit code: 1
38
+ 2022-01-30 23:59:34,716 INFO SenderThread:2678958 [sender.py:send_exit():368] handling runtime: 6
39
+ 2022-01-30 23:59:34,754 INFO SenderThread:2678958 [sender.py:_save_file():939] saving file wandb-summary.json with policy end
40
+ 2022-01-30 23:59:34,754 INFO SenderThread:2678958 [sender.py:send_exit():374] send defer
41
+ 2022-01-30 23:59:34,755 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
42
+ 2022-01-30 23:59:34,755 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
43
+ 2022-01-30 23:59:34,755 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 0
44
+ 2022-01-30 23:59:34,756 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
45
+ 2022-01-30 23:59:34,756 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 0
46
+ 2022-01-30 23:59:34,756 INFO SenderThread:2678958 [sender.py:transition_state():387] send defer: 1
47
+ 2022-01-30 23:59:34,756 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
48
+ 2022-01-30 23:59:34,757 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 1
49
+ 2022-01-30 23:59:34,880 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
50
+ 2022-01-30 23:59:34,880 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 1
51
+ 2022-01-30 23:59:34,880 INFO SenderThread:2678958 [sender.py:transition_state():387] send defer: 2
52
+ 2022-01-30 23:59:34,880 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
53
+ 2022-01-30 23:59:34,881 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
54
+ 2022-01-30 23:59:34,881 DEBUG SenderThread:2678958 [sender.py:send():234] send: stats
55
+ 2022-01-30 23:59:34,882 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
56
+ 2022-01-30 23:59:34,882 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 2
57
+ 2022-01-30 23:59:34,882 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
58
+ 2022-01-30 23:59:34,883 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 2
59
+ 2022-01-30 23:59:34,883 INFO SenderThread:2678958 [sender.py:transition_state():387] send defer: 3
60
+ 2022-01-30 23:59:34,883 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
61
+ 2022-01-30 23:59:34,883 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 3
62
+ 2022-01-30 23:59:34,883 DEBUG SenderThread:2678958 [sender.py:send():234] send: summary
63
+ 2022-01-30 23:59:34,896 INFO SenderThread:2678958 [sender.py:_save_file():939] saving file wandb-summary.json with policy end
64
+ 2022-01-30 23:59:34,896 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
65
+ 2022-01-30 23:59:34,896 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 3
66
+ 2022-01-30 23:59:34,896 INFO SenderThread:2678958 [sender.py:transition_state():387] send defer: 4
67
+ 2022-01-30 23:59:34,897 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
68
+ 2022-01-30 23:59:34,897 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 4
69
+ 2022-01-30 23:59:34,897 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
70
+ 2022-01-30 23:59:34,897 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 4
71
+ 2022-01-30 23:59:34,984 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
72
+ 2022-01-30 23:59:35,034 INFO Thread-8 :2678958 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/wandb-summary.json
73
+ 2022-01-30 23:59:35,034 INFO Thread-8 :2678958 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/output.log
74
+ 2022-01-30 23:59:35,172 INFO SenderThread:2678958 [sender.py:transition_state():387] send defer: 5
75
+ 2022-01-30 23:59:35,173 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
76
+ 2022-01-30 23:59:35,173 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
77
+ 2022-01-30 23:59:35,173 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 5
78
+ 2022-01-30 23:59:35,173 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
79
+ 2022-01-30 23:59:35,173 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 5
80
+ 2022-01-30 23:59:35,173 INFO SenderThread:2678958 [dir_watcher.py:finish():283] shutting down directory watcher
81
+ 2022-01-30 23:59:35,275 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
82
+ 2022-01-30 23:59:36,034 INFO Thread-8 :2678958 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/config.yaml
83
+ 2022-01-30 23:59:36,035 INFO SenderThread:2678958 [dir_watcher.py:finish():313] scan: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files
84
+ 2022-01-30 23:59:36,035 INFO SenderThread:2678958 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/config.yaml config.yaml
85
+ 2022-01-30 23:59:36,035 INFO SenderThread:2678958 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/wandb-summary.json wandb-summary.json
86
+ 2022-01-30 23:59:36,035 INFO SenderThread:2678958 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/requirements.txt requirements.txt
87
+ 2022-01-30 23:59:36,036 INFO SenderThread:2678958 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/wandb-metadata.json wandb-metadata.json
88
+ 2022-01-30 23:59:36,036 INFO SenderThread:2678958 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/output.log output.log
89
+ 2022-01-30 23:59:36,036 INFO SenderThread:2678958 [sender.py:transition_state():387] send defer: 6
90
+ 2022-01-30 23:59:36,036 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
91
+ 2022-01-30 23:59:36,042 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
92
+ 2022-01-30 23:59:36,043 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 6
93
+ 2022-01-30 23:59:36,048 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
94
+ 2022-01-30 23:59:36,048 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 6
95
+ 2022-01-30 23:59:36,048 INFO SenderThread:2678958 [file_pusher.py:finish():177] shutting down file pusher
96
+ 2022-01-30 23:59:36,144 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
97
+ 2022-01-30 23:59:36,144 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
98
+ 2022-01-30 23:59:36,246 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
99
+ 2022-01-30 23:59:36,247 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
100
+ 2022-01-30 23:59:36,349 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
101
+ 2022-01-30 23:59:36,349 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
102
+ 2022-01-30 23:59:36,451 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
103
+ 2022-01-30 23:59:36,452 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
104
+ 2022-01-30 23:59:36,554 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
105
+ 2022-01-30 23:59:36,554 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
106
+ 2022-01-30 23:59:36,585 INFO Thread-12 :2678958 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/config.yaml
107
+ 2022-01-30 23:59:36,593 INFO Thread-14 :2678958 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/requirements.txt
108
+ 2022-01-30 23:59:36,600 INFO Thread-15 :2678958 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/output.log
109
+ 2022-01-30 23:59:36,609 INFO Thread-13 :2678958 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/files/wandb-summary.json
110
+ 2022-01-30 23:59:36,656 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
111
+ 2022-01-30 23:59:36,657 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
112
+ 2022-01-30 23:59:36,758 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
113
+ 2022-01-30 23:59:36,759 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
114
+ 2022-01-30 23:59:36,810 INFO Thread-7 :2678958 [sender.py:transition_state():387] send defer: 7
115
+ 2022-01-30 23:59:36,810 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
116
+ 2022-01-30 23:59:36,810 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 7
117
+ 2022-01-30 23:59:36,811 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
118
+ 2022-01-30 23:59:36,811 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 7
119
+ 2022-01-30 23:59:36,860 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
120
+ 2022-01-30 23:59:36,965 INFO SenderThread:2678958 [sender.py:transition_state():387] send defer: 8
121
+ 2022-01-30 23:59:36,966 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
122
+ 2022-01-30 23:59:36,966 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
123
+ 2022-01-30 23:59:36,966 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 8
124
+ 2022-01-30 23:59:36,967 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
125
+ 2022-01-30 23:59:36,967 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 8
126
+ 2022-01-30 23:59:36,967 INFO SenderThread:2678958 [sender.py:transition_state():387] send defer: 9
127
+ 2022-01-30 23:59:36,967 DEBUG SenderThread:2678958 [sender.py:send():234] send: final
128
+ 2022-01-30 23:59:36,968 DEBUG SenderThread:2678958 [sender.py:send():234] send: footer
129
+ 2022-01-30 23:59:36,968 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: defer
130
+ 2022-01-30 23:59:36,968 INFO HandlerThread:2678958 [handler.py:handle_request_defer():147] handle defer: 9
131
+ 2022-01-30 23:59:36,968 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: defer
132
+ 2022-01-30 23:59:36,968 INFO SenderThread:2678958 [sender.py:send_request_defer():383] handle sender defer: 9
133
+ 2022-01-30 23:59:37,067 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: poll_exit
134
+ 2022-01-30 23:59:37,068 DEBUG SenderThread:2678958 [sender.py:send_request():248] send_request: poll_exit
135
+ 2022-01-30 23:59:37,068 INFO SenderThread:2678958 [file_pusher.py:join():182] waiting for file pusher
136
+ 2022-01-30 23:59:37,319 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: get_summary
137
+ 2022-01-30 23:59:37,319 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: sampled_history
138
+ 2022-01-30 23:59:37,320 DEBUG HandlerThread:2678958 [handler.py:handle_request():130] handle_request: shutdown
139
+ 2022-01-30 23:59:37,320 INFO HandlerThread:2678958 [handler.py:finish():731] shutting down handler
140
+ 2022-01-30 23:59:37,968 INFO WriterThread:2678958 [datastore.py:close():281] close: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/run-2f5mzr6c.wandb
141
+ 2022-01-30 23:59:38,317 INFO SenderThread:2678958 [sender.py:finish():1070] shutting down sender
142
+ 2022-01-30 23:59:38,317 INFO SenderThread:2678958 [file_pusher.py:finish():177] shutting down file pusher
143
+ 2022-01-30 23:59:38,317 INFO SenderThread:2678958 [file_pusher.py:join():182] waiting for file pusher
144
+ 2022-01-30 23:59:38,319 INFO MainThread:2678958 [internal.py:handle_exit():77] Internal process exited
wandb/run-20220130_235926-2f5mzr6c/logs/debug.log ADDED
@@ -0,0 +1,136 @@
1
+ 2022-01-30 23:59:26,947 INFO MainThread:2678204 [wandb_setup.py:_flush():71] setting env: {'project': 'xls-r-estonian'}
2
+ 2022-01-30 23:59:26,948 INFO MainThread:2678204 [wandb_setup.py:_flush():71] setting login settings: {}
3
+ 2022-01-30 23:59:26,948 INFO MainThread:2678204 [wandb_init.py:_log_setup():371] Logging user logs to /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/logs/debug.log
4
+ 2022-01-30 23:59:26,948 INFO MainThread:2678204 [wandb_init.py:_log_setup():372] Logging internal logs to /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220130_235926-2f5mzr6c/logs/debug-internal.log
5
+ 2022-01-30 23:59:26,949 INFO MainThread:2678204 [wandb_init.py:init():404] calling init triggers
6
+ 2022-01-30 23:59:26,949 INFO MainThread:2678204 [wandb_init.py:init():409] wandb.init called with sweep_config: {}
7
+ config: {}
8
+ 2022-01-30 23:59:26,949 INFO MainThread:2678204 [wandb_init.py:init():460] starting backend
9
+ 2022-01-30 23:59:26,949 INFO MainThread:2678204 [backend.py:_multiprocessing_setup():99] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
10
+ 2022-01-30 23:59:26,976 INFO MainThread:2678204 [backend.py:ensure_launched():216] starting backend process...
11
+ 2022-01-30 23:59:26,998 INFO MainThread:2678204 [backend.py:ensure_launched():221] started backend process with pid: 2678958
12
+ 2022-01-30 23:59:26,999 INFO MainThread:2678204 [wandb_init.py:init():469] backend started and connected
13
+ 2022-01-30 23:59:27,004 INFO MainThread:2678204 [wandb_init.py:init():533] updated telemetry
14
+ 2022-01-30 23:59:27,053 INFO MainThread:2678204 [wandb_init.py:init():563] communicating current version
15
+ 2022-01-30 23:59:27,855 INFO MainThread:2678204 [wandb_init.py:init():568] got version response
16
+ 2022-01-30 23:59:27,855 INFO MainThread:2678204 [wandb_init.py:init():578] communicating run to backend with 30 second timeout
17
+ 2022-01-30 23:59:28,033 INFO MainThread:2678204 [wandb_init.py:init():606] starting run threads in backend
18
+ 2022-01-30 23:59:28,070 INFO MainThread:2678204 [wandb_run.py:_console_start():1810] atexit reg
19
+ 2022-01-30 23:59:28,071 INFO MainThread:2678204 [wandb_run.py:_redirect():1684] redirect: SettingsConsole.REDIRECT
20
+ 2022-01-30 23:59:28,072 INFO MainThread:2678204 [wandb_run.py:_redirect():1689] Redirecting console.
21
+ 2022-01-30 23:59:28,074 INFO MainThread:2678204 [wandb_run.py:_redirect():1745] Redirects installed.
22
+ 2022-01-30 23:59:28,074 INFO MainThread:2678204 [wandb_init.py:init():633] run started, returning control to user process
23
+ 2022-01-30 23:59:28,091 INFO MainThread:2678204 [wandb_run.py:_config_callback():956] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 36, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.16.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 39, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.1, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 72, 'per_device_eval_batch_size': 72, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 2, 'eval_accumulation_steps': 'None', 'learning_rate': 0.0003, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 100.0, 'max_steps': -1, 'lr_scheduler_type': 'cosine', 'warmup_ratio': 
0.0, 'warmup_steps': 500, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Jan30_23-55-50_ganymede.eafit.edu.co', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 1, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'cosine+drop_proj+low_specaugment-300M+cv_8_0', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 72, 'eval_batch_size': 72}
24
+ 2022-01-30 23:59:28,093 INFO MainThread:2678204 [wandb_watch.py:watch():43] Watching
25
+ 2022-01-30 23:59:32,346 INFO MainThread:2678204 [wandb_run.py:_atexit_cleanup():1780] got exitcode: 1
26
+ 2022-01-30 23:59:32,348 INFO MainThread:2678204 [wandb_run.py:_restore():1752] restore
27
+ 2022-01-30 23:59:34,755 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
28
+ wandb_count: 1
29
+ }
30
+ pusher_stats {
31
+ uploaded_bytes: 2306
32
+ total_bytes: 2306
33
+ }
34
+
35
+ 2022-01-30 23:59:34,882 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
36
+ wandb_count: 1
37
+ }
38
+ pusher_stats {
39
+ uploaded_bytes: 2306
40
+ total_bytes: 2306
41
+ }
42
+
43
+ 2022-01-30 23:59:35,173 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
44
+ wandb_count: 1
45
+ }
46
+ pusher_stats {
47
+ uploaded_bytes: 2306
48
+ total_bytes: 2306
49
+ }
50
+
51
+ 2022-01-30 23:59:36,042 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
52
+ wandb_count: 4
53
+ }
54
+ pusher_stats {
55
+ uploaded_bytes: 2306
56
+ total_bytes: 13222
57
+ }
58
+
59
+ 2022-01-30 23:59:36,145 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
60
+ wandb_count: 5
61
+ }
62
+ pusher_stats {
63
+ uploaded_bytes: 2306
64
+ total_bytes: 16724
65
+ }
66
+
67
+ 2022-01-30 23:59:36,247 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
68
+ wandb_count: 5
69
+ }
70
+ pusher_stats {
71
+ uploaded_bytes: 2306
72
+ total_bytes: 16724
73
+ }
74
+
75
+ 2022-01-30 23:59:36,350 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
76
+ wandb_count: 5
77
+ }
78
+ pusher_stats {
79
+ uploaded_bytes: 16724
80
+ total_bytes: 16724
81
+ }
82
+
83
+ 2022-01-30 23:59:36,452 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
84
+ wandb_count: 5
85
+ }
86
+ pusher_stats {
87
+ uploaded_bytes: 16724
88
+ total_bytes: 16724
89
+ }
90
+
91
+ 2022-01-30 23:59:36,555 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
92
+ wandb_count: 5
93
+ }
94
+ pusher_stats {
95
+ uploaded_bytes: 16724
96
+ total_bytes: 16724
97
+ }
98
+
99
+ 2022-01-30 23:59:36,657 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
100
+ wandb_count: 5
101
+ }
102
+ pusher_stats {
103
+ uploaded_bytes: 16724
104
+ total_bytes: 16724
105
+ }
106
+
107
+ 2022-01-30 23:59:36,759 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
108
+ wandb_count: 5
109
+ }
110
+ pusher_stats {
111
+ uploaded_bytes: 16724
112
+ total_bytes: 16724
113
+ }
114
+
115
+ 2022-01-30 23:59:36,966 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
116
+ wandb_count: 5
117
+ }
118
+ pusher_stats {
119
+ uploaded_bytes: 16724
120
+ total_bytes: 16724
121
+ }
122
+
123
+ 2022-01-30 23:59:37,317 INFO MainThread:2678204 [wandb_run.py:_wait_for_finish():1912] got exit ret: done: true
124
+ exit_result {
125
+ }
126
+ file_counts {
127
+ wandb_count: 5
128
+ }
129
+ pusher_stats {
130
+ uploaded_bytes: 16724
131
+ total_bytes: 16724
132
+ }
133
+ local_info {
134
+ }
135
+
136
+ 2022-01-30 23:59:38,396 INFO MainThread:2678204 [wandb_run.py:_append_files():2180] logging synced files
wandb/run-20220130_235926-2f5mzr6c/run-2f5mzr6c.wandb ADDED
Binary file (10.6 kB).
 
wandb/run-20220131_000222-2f4y0tls/files/config.yaml ADDED
@@ -0,0 +1,655 @@
1
+ wandb_version: 1
2
+
3
+ _n_gpu:
4
+ desc: null
5
+ value: 1
6
+ _name_or_path:
7
+ desc: null
8
+ value: facebook/wav2vec2-xls-r-300m
9
+ _wandb:
10
+ desc: null
11
+ value:
12
+ cli_version: 0.12.9
13
+ framework: huggingface
14
+ huggingface_version: 4.16.0.dev0
15
+ is_jupyter_run: false
16
+ is_kaggle_kernel: true
17
+ m:
18
+ - 1: train/global_step
19
+ 6:
20
+ - 3
21
+ python_version: 3.9.6
22
+ start_time: 1643605342
23
+ t:
24
+ 1:
25
+ - 1
26
+ - 5
27
+ - 11
28
+ 2:
29
+ - 1
30
+ - 5
31
+ - 11
32
+ 3:
33
+ - 1
34
+ - 7
35
+ - 13
36
+ 4: 3.9.6
37
+ 5: 0.12.9
38
+ 6: 4.16.0.dev0
39
+ 8:
40
+ - 2
41
+ - 5
42
+ activation_dropout:
43
+ desc: null
44
+ value: 0.0
45
+ adafactor:
46
+ desc: null
47
+ value: false
48
+ adam_beta1:
49
+ desc: null
50
+ value: 0.9
51
+ adam_beta2:
52
+ desc: null
53
+ value: 0.999
54
+ adam_epsilon:
55
+ desc: null
56
+ value: 1.0e-08
57
+ adapter_kernel_size:
58
+ desc: null
59
+ value: 3
60
+ adapter_stride:
61
+ desc: null
62
+ value: 2
63
+ add_adapter:
64
+ desc: null
65
+ value: false
66
+ add_cross_attention:
67
+ desc: null
68
+ value: false
69
+ apply_spec_augment:
70
+ desc: null
71
+ value: true
72
+ architectures:
73
+ desc: null
74
+ value:
75
+ - Wav2Vec2ForPreTraining
76
+ attention_dropout:
77
+ desc: null
78
+ value: 0.0
79
+ bad_words_ids:
80
+ desc: null
81
+ value: null
82
+ bf16:
83
+ desc: null
84
+ value: false
85
+ bf16_full_eval:
86
+ desc: null
87
+ value: false
88
+ bos_token_id:
89
+ desc: null
90
+ value: 1
91
+ chunk_size_feed_forward:
92
+ desc: null
93
+ value: 0
94
+ classifier_proj_size:
95
+ desc: null
96
+ value: 256
97
+ codevector_dim:
98
+ desc: null
99
+ value: 768
100
+ contrastive_logits_temperature:
101
+ desc: null
102
+ value: 0.1
103
+ conv_bias:
104
+ desc: null
105
+ value: true
106
+ conv_dim:
107
+ desc: null
108
+ value:
109
+ - 512
110
+ - 512
111
+ - 512
112
+ - 512
113
+ - 512
114
+ - 512
115
+ - 512
116
+ conv_kernel:
117
+ desc: null
118
+ value:
119
+ - 10
120
+ - 3
121
+ - 3
122
+ - 3
123
+ - 3
124
+ - 2
125
+ - 2
126
+ conv_stride:
127
+ desc: null
128
+ value:
129
+ - 5
130
+ - 2
131
+ - 2
132
+ - 2
133
+ - 2
134
+ - 2
135
+ - 2
136
+ cross_attention_hidden_size:
137
+ desc: null
138
+ value: null
139
+ ctc_loss_reduction:
140
+ desc: null
141
+ value: mean
142
+ ctc_zero_infinity:
143
+ desc: null
144
+ value: false
145
+ dataloader_drop_last:
146
+ desc: null
147
+ value: false
148
+ dataloader_num_workers:
149
+ desc: null
150
+ value: 0
151
+ dataloader_pin_memory:
152
+ desc: null
153
+ value: true
154
+ ddp_bucket_cap_mb:
155
+ desc: null
156
+ value: None
157
+ ddp_find_unused_parameters:
158
+ desc: null
159
+ value: None
160
+ debug:
161
+ desc: null
162
+ value: '[]'
163
+ decoder_start_token_id:
164
+ desc: null
165
+ value: null
166
+ deepspeed:
167
+ desc: null
168
+ value: None
169
+ disable_tqdm:
170
+ desc: null
171
+ value: false
172
+ diversity_loss_weight:
173
+ desc: null
174
+ value: 0.1
175
+ diversity_penalty:
176
+ desc: null
177
+ value: 0.0
178
+ do_eval:
179
+ desc: null
180
+ value: true
181
+ do_predict:
182
+ desc: null
183
+ value: false
184
+ do_sample:
185
+ desc: null
186
+ value: false
187
+ do_stable_layer_norm:
188
+ desc: null
189
+ value: true
190
+ do_train:
191
+ desc: null
192
+ value: true
193
+ early_stopping:
194
+ desc: null
195
+ value: false
196
+ encoder_no_repeat_ngram_size:
197
+ desc: null
198
+ value: 0
199
+ eos_token_id:
200
+ desc: null
201
+ value: 2
202
+ eval_accumulation_steps:
203
+ desc: null
204
+ value: None
205
+ eval_batch_size:
206
+ desc: null
207
+ value: 64
208
+ eval_steps:
209
+ desc: null
210
+ value: 500
211
+ evaluation_strategy:
212
+ desc: null
213
+ value: steps
214
+ feat_extract_activation:
215
+ desc: null
216
+ value: gelu
217
+ feat_extract_dropout:
218
+ desc: null
219
+ value: 0.0
220
+ feat_extract_norm:
221
+ desc: null
222
+ value: layer
223
+ feat_proj_dropout:
224
+ desc: null
225
+ value: 0.1
226
+ feat_quantizer_dropout:
227
+ desc: null
228
+ value: 0.0
229
+ final_dropout:
230
+ desc: null
231
+ value: 0.0
232
+ finetuning_task:
233
+ desc: null
234
+ value: null
235
+ forced_bos_token_id:
236
+ desc: null
237
+ value: null
238
+ forced_eos_token_id:
239
+ desc: null
240
+ value: null
241
+ fp16:
242
+ desc: null
243
+ value: true
244
+ fp16_backend:
245
+ desc: null
246
+ value: auto
247
+ fp16_full_eval:
248
+ desc: null
249
+ value: false
250
+ fp16_opt_level:
251
+ desc: null
252
+ value: O1
253
+ gradient_accumulation_steps:
254
+ desc: null
255
+ value: 3
256
+ gradient_checkpointing:
257
+ desc: null
258
+ value: true
259
+ greater_is_better:
+ desc: null
+ value: None
+ group_by_length:
+ desc: null
+ value: true
+ half_precision_backend:
+ desc: null
+ value: amp
+ hidden_act:
+ desc: null
+ value: gelu
+ hidden_dropout:
+ desc: null
+ value: 0.0
+ hidden_size:
+ desc: null
+ value: 1024
+ hub_model_id:
+ desc: null
+ value: None
+ hub_strategy:
+ desc: null
+ value: every_save
+ hub_token:
+ desc: null
+ value: <HUB_TOKEN>
+ id2label:
+ desc: null
+ value:
+ '0': LABEL_0
+ '1': LABEL_1
+ ignore_data_skip:
+ desc: null
+ value: false
+ initializer_range:
+ desc: null
+ value: 0.02
+ intermediate_size:
+ desc: null
+ value: 4096
+ is_decoder:
+ desc: null
+ value: false
+ is_encoder_decoder:
+ desc: null
+ value: false
+ label2id:
+ desc: null
+ value:
+ LABEL_0: 0
+ LABEL_1: 1
+ label_names:
+ desc: null
+ value: None
+ label_smoothing_factor:
+ desc: null
+ value: 0.0
+ layer_norm_eps:
+ desc: null
+ value: 1.0e-05
+ layerdrop:
+ desc: null
+ value: 0.0
+ learning_rate:
+ desc: null
+ value: 0.0003
+ length_column_name:
+ desc: null
+ value: input_length
+ length_penalty:
+ desc: null
+ value: 1.0
+ load_best_model_at_end:
+ desc: null
+ value: false
+ local_rank:
+ desc: null
+ value: -1
+ log_level:
+ desc: null
+ value: -1
+ log_level_replica:
+ desc: null
+ value: -1
+ log_on_each_node:
+ desc: null
+ value: true
+ logging_dir:
+ desc: null
+ value: ./runs/Jan31_00-00-32_ganymede.eafit.edu.co
+ logging_first_step:
+ desc: null
+ value: false
+ logging_nan_inf_filter:
+ desc: null
+ value: true
+ logging_steps:
+ desc: null
+ value: 100
+ logging_strategy:
+ desc: null
+ value: steps
+ lr_scheduler_type:
+ desc: null
+ value: cosine
+ mask_feature_length:
+ desc: null
+ value: 10
+ mask_feature_min_masks:
+ desc: null
+ value: 0
+ mask_feature_prob:
+ desc: null
+ value: 0.0
+ mask_time_length:
+ desc: null
+ value: 10
+ mask_time_min_masks:
+ desc: null
+ value: 2
+ mask_time_prob:
+ desc: null
+ value: 0.1
+ max_grad_norm:
+ desc: null
+ value: 1.0
+ max_length:
+ desc: null
+ value: 20
+ max_steps:
+ desc: null
+ value: -1
+ metric_for_best_model:
+ desc: null
+ value: None
+ min_length:
+ desc: null
+ value: 0
+ model_type:
+ desc: null
+ value: wav2vec2
+ mp_parameters:
+ desc: null
+ value: ''
+ no_cuda:
+ desc: null
+ value: false
+ no_repeat_ngram_size:
+ desc: null
+ value: 0
+ num_adapter_layers:
+ desc: null
+ value: 3
+ num_attention_heads:
+ desc: null
+ value: 16
+ num_beam_groups:
+ desc: null
+ value: 1
+ num_beams:
+ desc: null
+ value: 1
+ num_codevector_groups:
+ desc: null
+ value: 2
+ num_codevectors_per_group:
+ desc: null
+ value: 320
+ num_conv_pos_embedding_groups:
+ desc: null
+ value: 16
+ num_conv_pos_embeddings:
+ desc: null
+ value: 128
+ num_feat_extract_layers:
+ desc: null
+ value: 7
+ num_hidden_layers:
+ desc: null
+ value: 24
+ num_negatives:
+ desc: null
+ value: 100
+ num_return_sequences:
+ desc: null
+ value: 1
+ num_train_epochs:
+ desc: null
+ value: 100.0
+ optim:
+ desc: null
+ value: adamw_hf
+ output_attentions:
+ desc: null
+ value: false
+ output_dir:
+ desc: null
+ value: ./
+ output_hidden_size:
+ desc: null
+ value: 1024
+ output_hidden_states:
+ desc: null
+ value: false
+ output_scores:
+ desc: null
+ value: false
+ overwrite_output_dir:
+ desc: null
+ value: true
+ pad_token_id:
+ desc: null
+ value: 36
+ past_index:
+ desc: null
+ value: -1
+ per_device_eval_batch_size:
+ desc: null
+ value: 64
+ per_device_train_batch_size:
+ desc: null
+ value: 64
+ per_gpu_eval_batch_size:
+ desc: null
+ value: None
+ per_gpu_train_batch_size:
+ desc: null
+ value: None
+ prediction_loss_only:
+ desc: null
+ value: false
+ prefix:
+ desc: null
+ value: null
+ problem_type:
+ desc: null
+ value: null
+ proj_codevector_dim:
+ desc: null
+ value: 768
+ pruned_heads:
+ desc: null
+ value: {}
+ push_to_hub:
+ desc: null
+ value: false
+ push_to_hub_model_id:
+ desc: null
+ value: None
+ push_to_hub_organization:
+ desc: null
+ value: None
+ push_to_hub_token:
+ desc: null
+ value: <PUSH_TO_HUB_TOKEN>
+ remove_invalid_values:
+ desc: null
+ value: false
+ remove_unused_columns:
+ desc: null
+ value: true
+ repetition_penalty:
+ desc: null
+ value: 1.0
+ report_to:
+ desc: null
+ value: '[''wandb'']'
+ resume_from_checkpoint:
+ desc: null
+ value: None
+ return_dict:
+ desc: null
+ value: true
+ return_dict_in_generate:
+ desc: null
+ value: false
+ run_name:
+ desc: null
+ value: cosine+drop_proj+low_specaugment-300M+cv_8_0
+ save_on_each_node:
+ desc: null
+ value: false
+ save_steps:
+ desc: null
+ value: 500
+ save_strategy:
+ desc: null
+ value: steps
+ save_total_limit:
+ desc: null
+ value: 1
+ seed:
+ desc: null
+ value: 42
+ sep_token_id:
+ desc: null
+ value: null
+ sharded_ddp:
+ desc: null
+ value: '[]'
+ skip_memory_metrics:
+ desc: null
+ value: true
+ task_specific_params:
+ desc: null
+ value: null
+ tdnn_dilation:
+ desc: null
+ value:
+ - 1
+ - 2
+ - 3
+ - 1
+ - 1
+ tdnn_dim:
+ desc: null
+ value:
+ - 512
+ - 512
+ - 512
+ - 512
+ - 1500
+ tdnn_kernel:
+ desc: null
+ value:
+ - 5
+ - 3
+ - 3
+ - 1
+ - 1
+ temperature:
+ desc: null
+ value: 1.0
+ tf32:
+ desc: null
+ value: None
+ tie_encoder_decoder:
+ desc: null
+ value: false
+ tie_word_embeddings:
+ desc: null
+ value: true
+ tokenizer_class:
+ desc: null
+ value: null
+ top_k:
+ desc: null
+ value: 50
+ top_p:
+ desc: null
+ value: 1.0
+ torch_dtype:
+ desc: null
+ value: float32
+ torchscript:
+ desc: null
+ value: false
+ tpu_metrics_debug:
+ desc: null
+ value: false
+ tpu_num_cores:
+ desc: null
+ value: None
+ train_batch_size:
+ desc: null
+ value: 64
+ transformers_version:
+ desc: null
+ value: 4.16.0.dev0
+ use_bfloat16:
+ desc: null
+ value: false
+ use_legacy_prediction_loop:
+ desc: null
+ value: false
+ use_weighted_layer_sum:
+ desc: null
+ value: false
+ vocab_size:
+ desc: null
+ value: 39
+ warmup_ratio:
+ desc: null
+ value: 0.0
+ warmup_steps:
+ desc: null
+ value: 500
+ weight_decay:
+ desc: null
+ value: 0.0
+ xpu_backend:
+ desc: null
+ value: None
+ xvector_output_dim:
+ desc: null
+ value: 512
wandb/run-20220131_000222-2f4y0tls/files/output.log ADDED
@@ -0,0 +1,35 @@
+
+ 0%| | 0/3000 [00:00<?, ?it/s]Traceback (most recent call last):
+ File "/home/sagrilaft/Project/audio/xls-r-et/src/run_speech_recognition_ctc_bnb.py", line 760, in <module>
+ main()
+ File "/home/sagrilaft/Project/audio/xls-r-et/src/run_speech_recognition_ctc_bnb.py", line 711, in main
+ train_result = trainer.train(resume_from_checkpoint=checkpoint)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/trainer.py", line 1365, in train
+ tr_loss_step = self.training_step(model, inputs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/trainer.py", line 1940, in training_step
+ loss = self.compute_loss(model, inputs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/trainer.py", line 1972, in compute_loss
+ outputs = model(**inputs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/models/wav2vec2/modeling_wav2vec2.py", line 1720, in forward
+ outputs = self.wav2vec2(
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/models/wav2vec2/modeling_wav2vec2.py", line 1313, in forward
+ extract_features = self.feature_extractor(input_values)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/models/wav2vec2/modeling_wav2vec2.py", line 482, in forward
+ hidden_states = conv_layer(hidden_states)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib/python3.9/site-packages/transformers/models/wav2vec2/modeling_wav2vec2.py", line 357, in forward
+ hidden_states = self.layer_norm(hidden_states)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/module.py", line 1102, in _call_impl
+ return forward_call(*input, **kwargs)
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/modules/normalization.py", line 189, in forward
+ return F.layer_norm(
+ File "/home/sagrilaft/Project/audio/xls-r-et/.venv/lib64/python3.9/site-packages/torch/nn/functional.py", line 2347, in layer_norm
+ return torch.layer_norm(input, normalized_shape, weight, bias, eps, torch.backends.cudnn.enabled)
+ RuntimeError: CUDA out of memory. Tried to allocate 4.18 GiB (GPU 0; 31.75 GiB total capacity; 3.35 GiB already allocated; 1.49 GiB free; 3.35 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
wandb/run-20220131_000222-2f4y0tls/files/requirements.txt ADDED
@@ -0,0 +1,87 @@
1
+ aiohttp==3.8.1
2
+ aiosignal==1.2.0
3
+ appdirs==1.4.4
4
+ async-timeout==4.0.2
5
+ attrs==21.4.0
6
+ audioread==2.1.9
7
+ bitsandbytes-cuda113==0.26.0
8
+ certifi==2021.10.8
9
+ cffi==1.15.0
10
+ charset-normalizer==2.0.10
11
+ click==8.0.3
12
+ clldutils==3.10.1
13
+ colorlog==6.6.0
14
+ configparser==5.2.0
15
+ csvw==1.11.0
16
+ datasets==1.18.1.dev0
17
+ decorator==5.1.1
18
+ dill==0.3.4
19
+ dlinfo==1.2.1
20
+ docker-pycreds==0.4.0
21
+ filelock==3.4.2
22
+ frozenlist==1.3.0
23
+ fsspec==2022.1.0
24
+ gitdb==4.0.9
25
+ gitpython==3.1.26
26
+ huggingface-hub==0.4.0
27
+ hypothesis==6.36.0
28
+ idna==3.3
29
+ isodate==0.6.1
30
+ jiwer==2.3.0
31
+ joblib==1.1.0
32
+ librosa==0.8.1
33
+ llvmlite==0.38.0
34
+ multidict==6.0.2
35
+ multiprocess==0.70.12.2
36
+ numba==0.55.0
37
+ numpy==1.21.5
38
+ packaging==21.3
39
+ pandas==1.4.0
40
+ pathtools==0.1.2
41
+ phonemizer==3.0.1
42
+ pip==21.3.1
43
+ pooch==1.6.0
44
+ promise==2.3
45
+ protobuf==3.19.3
46
+ psutil==5.9.0
47
+ pyarrow==6.0.1
48
+ pycparser==2.21
49
+ pyctcdecode==0.3.0
50
+ pygtrie==2.4.2
51
+ pyparsing==3.0.7
52
+ python-dateutil==2.8.2
53
+ python-levenshtein==0.12.2
54
+ pytz==2021.3
55
+ pyyaml==6.0
56
+ regex==2022.1.18
57
+ requests==2.27.1
58
+ resampy==0.2.2
59
+ rfc3986==2.0.0
60
+ sacremoses==0.0.47
61
+ scikit-learn==1.0.2
62
+ scipy==1.7.3
63
+ segments==2.2.0
64
+ sentry-sdk==1.5.4
65
+ setuptools==60.2.0
66
+ shortuuid==1.0.8
67
+ six==1.16.0
68
+ smmap==5.0.0
69
+ sortedcontainers==2.4.0
70
+ soundfile==0.10.3.post1
71
+ subprocess32==3.5.4
72
+ tabulate==0.8.9
73
+ termcolor==1.1.0
74
+ threadpoolctl==3.0.0
75
+ tokenizers==0.11.4
76
+ torch==1.10.1
77
+ torchaudio==0.10.1
78
+ tqdm==4.62.3
79
+ transformers==4.16.0.dev0
80
+ typing-extensions==4.0.1
81
+ uritemplate==4.1.1
82
+ urllib3==1.26.8
83
+ wandb==0.12.9
84
+ wheel==0.37.1
85
+ xxhash==2.0.2
86
+ yarl==1.7.2
87
+ yaspin==2.1.0
wandb/run-20220131_000222-2f4y0tls/files/wandb-metadata.json ADDED
@@ -0,0 +1,75 @@
1
+ {
2
+ "os": "Linux-4.18.0-305.10.2.el8_4.x86_64-x86_64-with-glibc2.28",
3
+ "python": "3.9.6",
4
+ "heartbeatAt": "2022-01-31T05:02:23.826134",
5
+ "startedAt": "2022-01-31T05:02:22.877153",
6
+ "docker": null,
7
+ "gpu": "Tesla V100-PCIE-32GB",
8
+ "gpu_count": 3,
9
+ "cpu_count": 64,
10
+ "cuda": null,
11
+ "args": [
12
+ "--dataset_name=mozilla-foundation/common_voice_8_0",
13
+ "--model_name_or_path=facebook/wav2vec2-xls-r-300m",
14
+ "--dataset_config_name=et",
15
+ "--output_dir=./",
16
+ "--overwrite_output_dir",
17
+ "--num_train_epochs=100",
18
+ "--per_device_train_batch_size=64",
19
+ "--per_device_eval_batch_size=64",
20
+ "--gradient_accumulation_steps=3",
21
+ "--learning_rate=3e-4",
22
+ "--save_total_limit=1",
23
+ "--warmup_steps=500",
24
+ "--evaluation_strategy=steps",
25
+ "--text_column_name=sentence",
26
+ "--length_column_name=input_length",
27
+ "--save_steps=500",
28
+ "--eval_steps=500",
29
+ "--logging_steps=100",
30
+ "--layerdrop=0.0",
31
+ "--freeze_feature_encoder",
32
+ "--feat_proj_dropout=0.1",
33
+ "--chars_to_ignore",
34
+ ",",
35
+ "?",
36
+ ".",
37
+ "!",
38
+ "-",
39
+ ";",
40
+ ":",
41
+ "\"",
42
+ "\u201c",
43
+ "%",
44
+ "\u2018",
45
+ "\u201d",
46
+ "\ufffd",
47
+ "\u2014",
48
+ "\u2019",
49
+ "\u2026",
50
+ "\u2013",
51
+ "--gradient_checkpointing",
52
+ "--lr_scheduler_type=cosine",
53
+ "--fp16",
54
+ "--group_by_length",
55
+ "--mask_time_prob=0.1",
56
+ "--mask_time_length=10",
57
+ "--report_to=wandb",
58
+ "--run_name=cosine+drop_proj+low_specaugment-300M+cv_8_0",
59
+ "--do_train",
60
+ "--do_eval",
61
+ "--use_auth_token"
62
+ ],
63
+ "state": "running",
64
+ "program": "/home/sagrilaft/Project/audio/xls-r-et/src/run_speech_recognition_ctc_bnb.py",
65
+ "codePath": "src/run_speech_recognition_ctc_bnb.py",
66
+ "git": {
67
+ "remote": "https://huggingface.co/shpotes/xls-r-et",
68
+ "commit": "ff66b86f52be4c55fb5be74a60f889284554c939"
69
+ },
70
+ "email": "shpotes3@gmail.com",
71
+ "root": "/home/sagrilaft/Project/audio/xls-r-et",
72
+ "host": "ganymede.eafit.edu.co",
73
+ "username": "sagrilaft",
74
+ "executable": "/home/sagrilaft/Project/audio/xls-r-et/.venv/bin/python"
75
+ }
wandb/run-20220131_000222-2f4y0tls/files/wandb-summary.json ADDED
@@ -0,0 +1 @@
+ {"_wandb": {"runtime": 6}}
wandb/run-20220131_000222-2f4y0tls/logs/debug-internal.log ADDED
@@ -0,0 +1,144 @@
1
+ 2022-01-31 00:02:23,360 INFO MainThread:2679567 [internal.py:wandb_internal():87] W&B internal server running at pid: 2679567, started at: 2022-01-31 00:02:23.359907
2
+ 2022-01-31 00:02:23,362 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: check_version
3
+ 2022-01-31 00:02:23,362 INFO WriterThread:2679567 [datastore.py:open_for_write():77] open: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/run-2f4y0tls.wandb
4
+ 2022-01-31 00:02:23,365 DEBUG SenderThread:2679567 [sender.py:send():234] send: header
5
+ 2022-01-31 00:02:23,365 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: check_version
6
+ 2022-01-31 00:02:23,638 DEBUG SenderThread:2679567 [sender.py:send():234] send: run
7
+ 2022-01-31 00:02:23,818 INFO SenderThread:2679567 [dir_watcher.py:__init__():169] watching files in: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files
8
+ 2022-01-31 00:02:23,818 INFO SenderThread:2679567 [sender.py:_start_run_threads():804] run started: 2f4y0tls with start time 1643605342
9
+ 2022-01-31 00:02:23,818 DEBUG SenderThread:2679567 [sender.py:send():234] send: summary
10
+ 2022-01-31 00:02:23,819 INFO SenderThread:2679567 [sender.py:_save_file():939] saving file wandb-summary.json with policy end
11
+ 2022-01-31 00:02:23,820 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: run_start
12
+ 2022-01-31 00:02:23,825 DEBUG HandlerThread:2679567 [meta.py:__init__():40] meta init
13
+ 2022-01-31 00:02:23,825 DEBUG HandlerThread:2679567 [meta.py:__init__():54] meta init done
14
+ 2022-01-31 00:02:23,826 DEBUG HandlerThread:2679567 [meta.py:probe():214] probe
15
+ 2022-01-31 00:02:23,833 DEBUG HandlerThread:2679567 [meta.py:_setup_git():204] setup git
16
+ 2022-01-31 00:02:23,851 DEBUG HandlerThread:2679567 [meta.py:_setup_git():211] setup git done
17
+ 2022-01-31 00:02:23,851 DEBUG HandlerThread:2679567 [meta.py:_save_pip():58] save pip
18
+ 2022-01-31 00:02:23,852 DEBUG HandlerThread:2679567 [meta.py:_save_pip():72] save pip done
19
+ 2022-01-31 00:02:23,852 DEBUG HandlerThread:2679567 [meta.py:probe():252] probe done
20
+ 2022-01-31 00:02:23,854 DEBUG SenderThread:2679567 [sender.py:send():234] send: files
21
+ 2022-01-31 00:02:23,855 INFO SenderThread:2679567 [sender.py:_save_file():939] saving file wandb-metadata.json with policy now
22
+ 2022-01-31 00:02:23,864 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: stop_status
23
+ 2022-01-31 00:02:23,864 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: stop_status
24
+ 2022-01-31 00:02:24,005 DEBUG SenderThread:2679567 [sender.py:send():234] send: config
25
+ 2022-01-31 00:02:24,006 DEBUG SenderThread:2679567 [sender.py:send():234] send: metric
26
+ 2022-01-31 00:02:24,007 DEBUG SenderThread:2679567 [sender.py:send():234] send: metric
27
+ 2022-01-31 00:02:24,007 WARNING SenderThread:2679567 [sender.py:send_metric():897] Seen metric with glob (shouldnt happen)
28
+ 2022-01-31 00:02:24,402 INFO Thread-11 :2679567 [upload_job.py:push():137] Uploaded file /tmp/tmpbod3yngjwandb/1fkdlg3x-wandb-metadata.json
29
+ 2022-01-31 00:02:24,821 INFO Thread-8 :2679567 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/wandb-summary.json
30
+ 2022-01-31 00:02:24,822 INFO Thread-8 :2679567 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/requirements.txt
31
+ 2022-01-31 00:02:24,822 INFO Thread-8 :2679567 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/output.log
32
+ 2022-01-31 00:02:24,822 INFO Thread-8 :2679567 [dir_watcher.py:_on_file_created():217] file/dir created: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/wandb-metadata.json
33
+ 2022-01-31 00:02:26,819 INFO Thread-8 :2679567 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/output.log
34
+ 2022-01-31 00:02:30,026 DEBUG SenderThread:2679567 [sender.py:send():234] send: telemetry
35
+ 2022-01-31 00:02:30,026 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
36
+ 2022-01-31 00:02:30,026 DEBUG SenderThread:2679567 [sender.py:send():234] send: exit
37
+ 2022-01-31 00:02:30,027 INFO SenderThread:2679567 [sender.py:send_exit():366] handling exit code: 1
38
+ 2022-01-31 00:02:30,027 INFO SenderThread:2679567 [sender.py:send_exit():368] handling runtime: 6
39
+ 2022-01-31 00:02:30,027 INFO SenderThread:2679567 [sender.py:_save_file():939] saving file wandb-summary.json with policy end
40
+ 2022-01-31 00:02:30,028 INFO SenderThread:2679567 [sender.py:send_exit():374] send defer
41
+ 2022-01-31 00:02:30,028 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
42
+ 2022-01-31 00:02:30,029 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
43
+ 2022-01-31 00:02:30,029 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 0
44
+ 2022-01-31 00:02:30,029 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
45
+ 2022-01-31 00:02:30,029 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 0
46
+ 2022-01-31 00:02:30,030 INFO SenderThread:2679567 [sender.py:transition_state():387] send defer: 1
47
+ 2022-01-31 00:02:30,030 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
48
+ 2022-01-31 00:02:30,030 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 1
49
+ 2022-01-31 00:02:30,123 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
50
+ 2022-01-31 00:02:30,123 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 1
51
+ 2022-01-31 00:02:30,124 INFO SenderThread:2679567 [sender.py:transition_state():387] send defer: 2
52
+ 2022-01-31 00:02:30,124 DEBUG SenderThread:2679567 [sender.py:send():234] send: stats
53
+ 2022-01-31 00:02:30,125 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
54
+ 2022-01-31 00:02:30,125 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 2
55
+ 2022-01-31 00:02:30,125 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
56
+ 2022-01-31 00:02:30,126 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 2
57
+ 2022-01-31 00:02:30,126 INFO SenderThread:2679567 [sender.py:transition_state():387] send defer: 3
58
+ 2022-01-31 00:02:30,126 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
59
+ 2022-01-31 00:02:30,126 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 3
60
+ 2022-01-31 00:02:30,127 DEBUG SenderThread:2679567 [sender.py:send():234] send: summary
61
+ 2022-01-31 00:02:30,127 INFO SenderThread:2679567 [sender.py:_save_file():939] saving file wandb-summary.json with policy end
62
+ 2022-01-31 00:02:30,128 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
63
+ 2022-01-31 00:02:30,128 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 3
64
+ 2022-01-31 00:02:30,128 INFO SenderThread:2679567 [sender.py:transition_state():387] send defer: 4
65
+ 2022-01-31 00:02:30,128 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
66
+ 2022-01-31 00:02:30,128 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 4
67
+ 2022-01-31 00:02:30,129 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
68
+ 2022-01-31 00:02:30,129 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 4
69
+ 2022-01-31 00:02:30,138 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
70
+ 2022-01-31 00:02:30,334 INFO SenderThread:2679567 [sender.py:transition_state():387] send defer: 5
71
+ 2022-01-31 00:02:30,334 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
72
+ 2022-01-31 00:02:30,335 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
73
+ 2022-01-31 00:02:30,335 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 5
74
+ 2022-01-31 00:02:30,335 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
75
+ 2022-01-31 00:02:30,335 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 5
76
+ 2022-01-31 00:02:30,335 INFO SenderThread:2679567 [dir_watcher.py:finish():283] shutting down directory watcher
77
+ 2022-01-31 00:02:30,436 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
78
+ 2022-01-31 00:02:30,821 INFO SenderThread:2679567 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/wandb-summary.json
79
+ 2022-01-31 00:02:30,822 INFO SenderThread:2679567 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/config.yaml
80
+ 2022-01-31 00:02:30,822 INFO SenderThread:2679567 [dir_watcher.py:_on_file_modified():230] file/dir modified: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/output.log
81
+ 2022-01-31 00:02:30,822 INFO SenderThread:2679567 [dir_watcher.py:finish():313] scan: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files
82
+ 2022-01-31 00:02:30,823 INFO SenderThread:2679567 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/config.yaml config.yaml
83
+ 2022-01-31 00:02:30,823 INFO SenderThread:2679567 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/wandb-summary.json wandb-summary.json
84
+ 2022-01-31 00:02:30,823 INFO SenderThread:2679567 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/requirements.txt requirements.txt
85
+ 2022-01-31 00:02:30,823 INFO SenderThread:2679567 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/wandb-metadata.json wandb-metadata.json
86
+ 2022-01-31 00:02:30,824 INFO SenderThread:2679567 [dir_watcher.py:finish():327] scan save: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/output.log output.log
87
+ 2022-01-31 00:02:30,824 INFO SenderThread:2679567 [sender.py:transition_state():387] send defer: 6
88
+ 2022-01-31 00:02:30,830 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
89
+ 2022-01-31 00:02:30,847 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
90
+ 2022-01-31 00:02:30,848 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 6
91
+ 2022-01-31 00:02:30,848 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
92
+ 2022-01-31 00:02:30,848 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 6
93
+ 2022-01-31 00:02:30,848 INFO SenderThread:2679567 [file_pusher.py:finish():177] shutting down file pusher
94
+ 2022-01-31 00:02:30,933 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
95
+ 2022-01-31 00:02:30,933 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
96
+ 2022-01-31 00:02:31,035 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
97
+ 2022-01-31 00:02:31,036 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
98
+ 2022-01-31 00:02:31,138 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
99
+ 2022-01-31 00:02:31,139 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
100
+ 2022-01-31 00:02:31,241 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
101
+ 2022-01-31 00:02:31,241 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
102
+ 2022-01-31 00:02:31,344 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
103
+ 2022-01-31 00:02:31,344 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
104
+ 2022-01-31 00:02:31,361 INFO Thread-14 :2679567 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/requirements.txt
105
+ 2022-01-31 00:02:31,401 INFO Thread-13 :2679567 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/wandb-summary.json
106
+ 2022-01-31 00:02:31,433 INFO Thread-15 :2679567 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/output.log
107
+ 2022-01-31 00:02:31,446 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
108
+ 2022-01-31 00:02:31,446 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
109
+ 2022-01-31 00:02:31,525 INFO Thread-12 :2679567 [upload_job.py:push():137] Uploaded file /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/files/config.yaml
110
+ 2022-01-31 00:02:31,549 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
111
+ 2022-01-31 00:02:31,549 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
112
+ 2022-01-31 00:02:31,651 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
113
+ 2022-01-31 00:02:31,652 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
114
+ 2022-01-31 00:02:31,726 INFO Thread-7 :2679567 [sender.py:transition_state():387] send defer: 7
115
+ 2022-01-31 00:02:31,726 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
116
+ 2022-01-31 00:02:31,727 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 7
117
+ 2022-01-31 00:02:31,727 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
118
+ 2022-01-31 00:02:31,727 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 7
119
+ 2022-01-31 00:02:31,754 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
120
+ 2022-01-31 00:02:32,103 INFO SenderThread:2679567 [sender.py:transition_state():387] send defer: 8
121
+ 2022-01-31 00:02:32,104 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
122
+ 2022-01-31 00:02:32,104 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
123
+ 2022-01-31 00:02:32,105 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 8
124
+ 2022-01-31 00:02:32,105 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
125
+ 2022-01-31 00:02:32,105 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 8
126
+ 2022-01-31 00:02:32,105 INFO SenderThread:2679567 [sender.py:transition_state():387] send defer: 9
127
+ 2022-01-31 00:02:32,106 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: defer
128
+ 2022-01-31 00:02:32,106 DEBUG SenderThread:2679567 [sender.py:send():234] send: final
129
+ 2022-01-31 00:02:32,106 INFO HandlerThread:2679567 [handler.py:handle_request_defer():147] handle defer: 9
130
+ 2022-01-31 00:02:32,107 DEBUG SenderThread:2679567 [sender.py:send():234] send: footer
131
+ 2022-01-31 00:02:32,107 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: defer
132
+ 2022-01-31 00:02:32,107 INFO SenderThread:2679567 [sender.py:send_request_defer():383] handle sender defer: 9
133
+ 2022-01-31 00:02:32,206 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: poll_exit
134
+ 2022-01-31 00:02:32,207 DEBUG SenderThread:2679567 [sender.py:send_request():248] send_request: poll_exit
135
+ 2022-01-31 00:02:32,207 INFO SenderThread:2679567 [file_pusher.py:join():182] waiting for file pusher
136
+ 2022-01-31 00:02:32,476 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: get_summary
137
+ 2022-01-31 00:02:32,477 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: sampled_history
138
+ 2022-01-31 00:02:32,478 DEBUG HandlerThread:2679567 [handler.py:handle_request():130] handle_request: shutdown
139
+ 2022-01-31 00:02:32,478 INFO HandlerThread:2679567 [handler.py:finish():731] shutting down handler
140
+ 2022-01-31 00:02:33,107 INFO WriterThread:2679567 [datastore.py:close():281] close: /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/run-2f4y0tls.wandb
141
+ 2022-01-31 00:02:33,474 INFO SenderThread:2679567 [sender.py:finish():1070] shutting down sender
142
+ 2022-01-31 00:02:33,474 INFO SenderThread:2679567 [file_pusher.py:finish():177] shutting down file pusher
143
+ 2022-01-31 00:02:33,474 INFO SenderThread:2679567 [file_pusher.py:join():182] waiting for file pusher
144
+ 2022-01-31 00:02:33,477 INFO MainThread:2679567 [internal.py:handle_exit():77] Internal process exited
wandb/run-20220131_000222-2f4y0tls/logs/debug.log ADDED
@@ -0,0 +1,136 @@
1
+ 2022-01-31 00:02:22,880 INFO MainThread:2679148 [wandb_setup.py:_flush():71] setting env: {'project': 'xls-r-estonian'}
2
+ 2022-01-31 00:02:22,880 INFO MainThread:2679148 [wandb_setup.py:_flush():71] setting login settings: {}
3
+ 2022-01-31 00:02:22,880 INFO MainThread:2679148 [wandb_init.py:_log_setup():371] Logging user logs to /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/logs/debug.log
4
+ 2022-01-31 00:02:22,880 INFO MainThread:2679148 [wandb_init.py:_log_setup():372] Logging internal logs to /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_000222-2f4y0tls/logs/debug-internal.log
5
+ 2022-01-31 00:02:22,881 INFO MainThread:2679148 [wandb_init.py:init():404] calling init triggers
6
+ 2022-01-31 00:02:22,881 INFO MainThread:2679148 [wandb_init.py:init():409] wandb.init called with sweep_config: {}
7
+ config: {}
8
+ 2022-01-31 00:02:22,881 INFO MainThread:2679148 [wandb_init.py:init():460] starting backend
9
+ 2022-01-31 00:02:22,881 INFO MainThread:2679148 [backend.py:_multiprocessing_setup():99] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
10
+ 2022-01-31 00:02:22,908 INFO MainThread:2679148 [backend.py:ensure_launched():216] starting backend process...
11
+ 2022-01-31 00:02:22,930 INFO MainThread:2679148 [backend.py:ensure_launched():221] started backend process with pid: 2679567
12
+ 2022-01-31 00:02:22,932 INFO MainThread:2679148 [wandb_init.py:init():469] backend started and connected
13
+ 2022-01-31 00:02:22,937 INFO MainThread:2679148 [wandb_init.py:init():533] updated telemetry
14
+ 2022-01-31 00:02:22,990 INFO MainThread:2679148 [wandb_init.py:init():563] communicating current version
15
+ 2022-01-31 00:02:23,636 INFO MainThread:2679148 [wandb_init.py:init():568] got version response
16
+ 2022-01-31 00:02:23,636 INFO MainThread:2679148 [wandb_init.py:init():578] communicating run to backend with 30 second timeout
17
+ 2022-01-31 00:02:23,819 INFO MainThread:2679148 [wandb_init.py:init():606] starting run threads in backend
18
+ 2022-01-31 00:02:23,864 INFO MainThread:2679148 [wandb_run.py:_console_start():1810] atexit reg
19
+ 2022-01-31 00:02:23,865 INFO MainThread:2679148 [wandb_run.py:_redirect():1684] redirect: SettingsConsole.REDIRECT
20
+ 2022-01-31 00:02:23,865 INFO MainThread:2679148 [wandb_run.py:_redirect():1689] Redirecting console.
21
+ 2022-01-31 00:02:23,869 INFO MainThread:2679148 [wandb_run.py:_redirect():1745] Redirects installed.
22
+ 2022-01-31 00:02:23,869 INFO MainThread:2679148 [wandb_init.py:init():633] run started, returning control to user process
23
+ 2022-01-31 00:02:23,885 INFO MainThread:2679148 [wandb_run.py:_config_callback():956] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 36, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.16.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 39, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.1, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 64, 'per_device_eval_batch_size': 64, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 3, 'eval_accumulation_steps': 'None', 'learning_rate': 0.0003, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 100.0, 'max_steps': -1, 'lr_scheduler_type': 'cosine', 'warmup_ratio': 
0.0, 'warmup_steps': 500, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Jan31_00-00-32_ganymede.eafit.edu.co', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 1, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'cosine+drop_proj+low_specaugment-300M+cv_8_0', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 64, 'eval_batch_size': 64}
24
+ 2022-01-31 00:02:23,887 INFO MainThread:2679148 [wandb_watch.py:watch():43] Watching
25
+ 2022-01-31 00:02:27,505 INFO MainThread:2679148 [wandb_run.py:_atexit_cleanup():1780] got exitcode: 1
26
+ 2022-01-31 00:02:27,507 INFO MainThread:2679148 [wandb_run.py:_restore():1752] restore
27
+ 2022-01-31 00:02:30,029 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
28
+ wandb_count: 1
29
+ }
30
+ pusher_stats {
31
+ uploaded_bytes: 2306
32
+ total_bytes: 2306
33
+ }
34
+
35
+ 2022-01-31 00:02:30,335 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
36
+ wandb_count: 1
37
+ }
38
+ pusher_stats {
39
+ uploaded_bytes: 2306
40
+ total_bytes: 2306
41
+ }
42
+
43
+ 2022-01-31 00:02:30,831 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
44
+ wandb_count: 4
45
+ }
46
+ pusher_stats {
47
+ uploaded_bytes: 2306
48
+ total_bytes: 13222
49
+ }
50
+
51
+ 2022-01-31 00:02:30,934 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
52
+ wandb_count: 5
53
+ }
54
+ pusher_stats {
55
+ uploaded_bytes: 2306
56
+ total_bytes: 16724
57
+ }
58
+
59
+ 2022-01-31 00:02:31,037 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
60
+ wandb_count: 5
61
+ }
62
+ pusher_stats {
63
+ uploaded_bytes: 2306
64
+ total_bytes: 16724
65
+ }
66
+
67
+ 2022-01-31 00:02:31,140 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
68
+ wandb_count: 5
69
+ }
70
+ pusher_stats {
71
+ uploaded_bytes: 16724
72
+ total_bytes: 16724
73
+ }
74
+
75
+ 2022-01-31 00:02:31,242 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
76
+ wandb_count: 5
77
+ }
78
+ pusher_stats {
79
+ uploaded_bytes: 16724
80
+ total_bytes: 16724
81
+ }
82
+
83
+ 2022-01-31 00:02:31,345 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
84
+ wandb_count: 5
85
+ }
86
+ pusher_stats {
87
+ uploaded_bytes: 16724
88
+ total_bytes: 16724
89
+ }
90
+
91
+ 2022-01-31 00:02:31,447 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
92
+ wandb_count: 5
93
+ }
94
+ pusher_stats {
95
+ uploaded_bytes: 16724
96
+ total_bytes: 16724
97
+ }
98
+
99
+ 2022-01-31 00:02:31,550 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
100
+ wandb_count: 5
101
+ }
102
+ pusher_stats {
103
+ uploaded_bytes: 16724
104
+ total_bytes: 16724
105
+ }
106
+
107
+ 2022-01-31 00:02:31,652 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
108
+ wandb_count: 5
109
+ }
110
+ pusher_stats {
111
+ uploaded_bytes: 16724
112
+ total_bytes: 16724
113
+ }
114
+
115
+ 2022-01-31 00:02:32,105 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
116
+ wandb_count: 5
117
+ }
118
+ pusher_stats {
119
+ uploaded_bytes: 16724
120
+ total_bytes: 16724
121
+ }
122
+
123
+ 2022-01-31 00:02:32,474 INFO MainThread:2679148 [wandb_run.py:_wait_for_finish():1912] got exit ret: done: true
124
+ exit_result {
125
+ }
126
+ file_counts {
127
+ wandb_count: 5
128
+ }
129
+ pusher_stats {
130
+ uploaded_bytes: 16724
131
+ total_bytes: 16724
132
+ }
133
+ local_info {
134
+ }
135
+
136
+ 2022-01-31 00:02:33,564 INFO MainThread:2679148 [wandb_run.py:_append_files():2180] logging synced files
wandb/run-20220131_000222-2f4y0tls/run-2f4y0tls.wandb ADDED
Binary file (10.6 kB). View file
 
wandb/run-20220131_001044-248r0x8f/files/config.yaml ADDED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220131_001044-248r0x8f/files/output.log ADDED
@@ -0,0 +1,200 @@
+ 2%|████▏ | 100/4000 [14:16<10:37:06, 9.80s/it]
wandb/run-20220131_001044-248r0x8f/files/requirements.txt ADDED
@@ -0,0 +1,87 @@
1
+ aiohttp==3.8.1
2
+ aiosignal==1.2.0
3
+ appdirs==1.4.4
4
+ async-timeout==4.0.2
5
+ attrs==21.4.0
6
+ audioread==2.1.9
7
+ bitsandbytes-cuda113==0.26.0
8
+ certifi==2021.10.8
9
+ cffi==1.15.0
10
+ charset-normalizer==2.0.10
11
+ click==8.0.3
12
+ clldutils==3.10.1
13
+ colorlog==6.6.0
14
+ configparser==5.2.0
15
+ csvw==1.11.0
16
+ datasets==1.18.1.dev0
17
+ decorator==5.1.1
18
+ dill==0.3.4
19
+ dlinfo==1.2.1
20
+ docker-pycreds==0.4.0
21
+ filelock==3.4.2
22
+ frozenlist==1.3.0
23
+ fsspec==2022.1.0
24
+ gitdb==4.0.9
25
+ gitpython==3.1.26
26
+ huggingface-hub==0.4.0
27
+ hypothesis==6.36.0
28
+ idna==3.3
29
+ isodate==0.6.1
30
+ jiwer==2.3.0
31
+ joblib==1.1.0
32
+ librosa==0.8.1
33
+ llvmlite==0.38.0
34
+ multidict==6.0.2
35
+ multiprocess==0.70.12.2
36
+ numba==0.55.0
37
+ numpy==1.21.5
38
+ packaging==21.3
39
+ pandas==1.4.0
40
+ pathtools==0.1.2
41
+ phonemizer==3.0.1
42
+ pip==21.3.1
43
+ pooch==1.6.0
44
+ promise==2.3
45
+ protobuf==3.19.3
46
+ psutil==5.9.0
47
+ pyarrow==6.0.1
48
+ pycparser==2.21
49
+ pyctcdecode==0.3.0
50
+ pygtrie==2.4.2
51
+ pyparsing==3.0.7
52
+ python-dateutil==2.8.2
53
+ python-levenshtein==0.12.2
54
+ pytz==2021.3
55
+ pyyaml==6.0
56
+ regex==2022.1.18
57
+ requests==2.27.1
58
+ resampy==0.2.2
59
+ rfc3986==2.0.0
60
+ sacremoses==0.0.47
61
+ scikit-learn==1.0.2
62
+ scipy==1.7.3
63
+ segments==2.2.0
64
+ sentry-sdk==1.5.4
65
+ setuptools==60.2.0
66
+ shortuuid==1.0.8
67
+ six==1.16.0
68
+ smmap==5.0.0
69
+ sortedcontainers==2.4.0
70
+ soundfile==0.10.3.post1
71
+ subprocess32==3.5.4
72
+ tabulate==0.8.9
73
+ termcolor==1.1.0
74
+ threadpoolctl==3.0.0
75
+ tokenizers==0.11.4
76
+ torch==1.10.1
77
+ torchaudio==0.10.1
78
+ tqdm==4.62.3
79
+ transformers==4.16.0.dev0
80
+ typing-extensions==4.0.1
81
+ uritemplate==4.1.1
82
+ urllib3==1.26.8
83
+ wandb==0.12.9
84
+ wheel==0.37.1
85
+ xxhash==2.0.2
86
+ yarl==1.7.2
87
+ yaspin==2.1.0
wandb/run-20220131_001044-248r0x8f/files/wandb-metadata.json ADDED
@@ -0,0 +1,75 @@
1
+ {
2
+ "os": "Linux-4.18.0-305.10.2.el8_4.x86_64-x86_64-with-glibc2.28",
3
+ "python": "3.9.6",
4
+ "heartbeatAt": "2022-01-31T05:10:45.361510",
5
+ "startedAt": "2022-01-31T05:10:44.391320",
6
+ "docker": null,
7
+ "gpu": "Tesla V100-PCIE-32GB",
8
+ "gpu_count": 3,
9
+ "cpu_count": 64,
10
+ "cuda": null,
11
+ "args": [
12
+ "--dataset_name=mozilla-foundation/common_voice_8_0",
13
+ "--model_name_or_path=facebook/wav2vec2-xls-r-300m",
14
+ "--dataset_config_name=et",
15
+ "--output_dir=./",
16
+ "--overwrite_output_dir",
17
+ "--num_train_epochs=100",
18
+ "--per_device_train_batch_size=72",
19
+ "--per_device_eval_batch_size=72",
20
+ "--gradient_accumulation_steps=2",
21
+ "--learning_rate=3e-4",
22
+ "--save_total_limit=1",
23
+ "--warmup_steps=500",
24
+ "--evaluation_strategy=steps",
25
+ "--text_column_name=sentence",
26
+ "--length_column_name=input_length",
27
+ "--save_steps=500",
28
+ "--eval_steps=500",
29
+ "--logging_steps=100",
30
+ "--layerdrop=0.0",
31
+ "--freeze_feature_encoder",
32
+ "--feat_proj_dropout=0.1",
33
+ "--chars_to_ignore",
34
+ ",",
35
+ "?",
36
+ ".",
37
+ "!",
38
+ "-",
39
+ ";",
40
+ ":",
41
+ "\"",
42
+ "\u201c",
43
+ "%",
44
+ "\u2018",
45
+ "\u201d",
46
+ "\ufffd",
47
+ "\u2014",
48
+ "\u2019",
49
+ "\u2026",
50
+ "\u2013",
51
+ "--gradient_checkpointing",
52
+ "--lr_scheduler_type=cosine",
53
+ "--fp16",
54
+ "--group_by_length",
55
+ "--mask_time_prob=0.1",
56
+ "--mask_time_length=10",
57
+ "--report_to=wandb",
58
+ "--run_name=cosine+drop_proj+low_specaugment-300M+cv_8_0",
59
+ "--do_train",
60
+ "--do_eval",
61
+ "--use_auth_token"
62
+ ],
63
+ "state": "running",
64
+ "program": "/home/sagrilaft/Project/audio/xls-r-et/src/run_speech_recognition_ctc_bnb.py",
65
+ "codePath": "src/run_speech_recognition_ctc_bnb.py",
66
+ "git": {
67
+ "remote": "https://huggingface.co/shpotes/xls-r-et",
68
+ "commit": "ff66b86f52be4c55fb5be74a60f889284554c939"
69
+ },
70
+ "email": "shpotes3@gmail.com",
71
+ "root": "/home/sagrilaft/Project/audio/xls-r-et",
72
+ "host": "ganymede.eafit.edu.co",
73
+ "username": "sagrilaft",
74
+ "executable": "/home/sagrilaft/Project/audio/xls-r-et/.venv/bin/python"
75
+ }
wandb/run-20220131_001044-248r0x8f/files/wandb-summary.json ADDED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220131_001044-248r0x8f/logs/debug-internal.log ADDED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220131_001044-248r0x8f/logs/debug.log ADDED
@@ -0,0 +1,24 @@
1
+ 2022-01-31 00:10:44,393 INFO MainThread:2680825 [wandb_setup.py:_flush():71] setting env: {'project': 'xls-r-estonian'}
2
+ 2022-01-31 00:10:44,393 INFO MainThread:2680825 [wandb_setup.py:_flush():71] setting login settings: {}
3
+ 2022-01-31 00:10:44,393 INFO MainThread:2680825 [wandb_init.py:_log_setup():371] Logging user logs to /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_001044-248r0x8f/logs/debug.log
4
+ 2022-01-31 00:10:44,393 INFO MainThread:2680825 [wandb_init.py:_log_setup():372] Logging internal logs to /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_001044-248r0x8f/logs/debug-internal.log
5
+ 2022-01-31 00:10:44,393 INFO MainThread:2680825 [wandb_init.py:init():404] calling init triggers
6
+ 2022-01-31 00:10:44,394 INFO MainThread:2680825 [wandb_init.py:init():409] wandb.init called with sweep_config: {}
7
+ config: {}
8
+ 2022-01-31 00:10:44,394 INFO MainThread:2680825 [wandb_init.py:init():460] starting backend
9
+ 2022-01-31 00:10:44,394 INFO MainThread:2680825 [backend.py:_multiprocessing_setup():99] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
10
+ 2022-01-31 00:10:44,417 INFO MainThread:2680825 [backend.py:ensure_launched():216] starting backend process...
11
+ 2022-01-31 00:10:44,436 INFO MainThread:2680825 [backend.py:ensure_launched():221] started backend process with pid: 2681230
12
+ 2022-01-31 00:10:44,438 INFO MainThread:2680825 [wandb_init.py:init():469] backend started and connected
13
+ 2022-01-31 00:10:44,442 INFO MainThread:2680825 [wandb_init.py:init():533] updated telemetry
14
+ 2022-01-31 00:10:44,488 INFO MainThread:2680825 [wandb_init.py:init():563] communicating current version
15
+ 2022-01-31 00:10:45,185 INFO MainThread:2680825 [wandb_init.py:init():568] got version response
16
+ 2022-01-31 00:10:45,185 INFO MainThread:2680825 [wandb_init.py:init():578] communicating run to backend with 30 second timeout
17
+ 2022-01-31 00:10:45,354 INFO MainThread:2680825 [wandb_init.py:init():606] starting run threads in backend
18
+ 2022-01-31 00:10:45,387 INFO MainThread:2680825 [wandb_run.py:_console_start():1810] atexit reg
19
+ 2022-01-31 00:10:45,387 INFO MainThread:2680825 [wandb_run.py:_redirect():1684] redirect: SettingsConsole.REDIRECT
20
+ 2022-01-31 00:10:45,387 INFO MainThread:2680825 [wandb_run.py:_redirect():1689] Redirecting console.
21
+ 2022-01-31 00:10:45,389 INFO MainThread:2680825 [wandb_run.py:_redirect():1745] Redirects installed.
22
+ 2022-01-31 00:10:45,389 INFO MainThread:2680825 [wandb_init.py:init():633] run started, returning control to user process
23
+ 2022-01-31 00:10:45,405 INFO MainThread:2680825 [wandb_run.py:_config_callback():956] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 36, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.16.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 39, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.1, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 72, 'per_device_eval_batch_size': 72, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 2, 'eval_accumulation_steps': 'None', 'learning_rate': 0.0003, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 100.0, 'max_steps': -1, 'lr_scheduler_type': 'cosine', 'warmup_ratio': 
0.0, 'warmup_steps': 500, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Jan31_00-09-57_ganymede.eafit.edu.co', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 1, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'cosine+drop_proj+low_specaugment-300M+cv_8_0', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 72, 'eval_batch_size': 72}
24
+ 2022-01-31 00:10:45,407 INFO MainThread:2680825 [wandb_watch.py:watch():43] Watching
wandb/run-20220131_001044-248r0x8f/run-248r0x8f.wandb ADDED
Binary file (2.85 MB). View file
 
wandb/run-20220131_140404-gjg8nz5t/files/config.yaml ADDED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220131_140404-gjg8nz5t/files/output.log ADDED
@@ -0,0 +1,4449 @@
+   2%|▏         | 100/4000 [14:21<10:42:35, 9.89s/it]
+   5%|▌         | 199/4000 [28:13<8:38:23, 8.18s/it]
+   8%|▊         | 300/4000 [42:33<10:16:59, 10.01s/it]
+  10%|█         | 400/4000 [56:32<6:29:49, 6.50s/it]
+  12%|█▎        | 500/4000 [1:10:56<9:45:10, 10.03s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 2613
+ Batch size = 72
+ {'loss': 0.3442, 'learning_rate': 0.000998, 'epoch': 12.5}
+ Configuration saved in ./checkpoint-500/config.json
+ Model weights saved in ./checkpoint-500/pytorch_model.bin
+ Configuration saved in ./checkpoint-500/preprocessor_config.json
+ {'eval_loss': 0.3824974596500397, 'eval_wer': 0.47628709104742584, 'eval_runtime': 133.9974, 'eval_samples_per_second': 19.5, 'eval_steps_per_second': 0.276, 'epoch': 12.5}
+  15%|█▌        | 600/4000 [1:27:09<6:07:37, 6.49s/it]
+  17%|█▋        | 699/4000 [1:41:17<8:04:21, 8.80s/it]
+  20%|██        | 800/4000 [1:55:25<5:49:59, 6.56s/it]
+  22%|██▎       | 899/4000 [2:09:27<7:31:17, 8.73s/it]
+  25%|██▌       | 1000/4000 [2:23:35<5:34:34, 6.69s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 2613
+ Batch size = 72
+ {'loss': 0.1934, 'learning_rate': 0.0009506789790182364, 'epoch': 25.0}
+ 100%|██████████| 37/37 [02:07<00:00, 2.09s/it]
+ Configuration saved in ./checkpoint-1000/config.json
+ Model weights saved in ./checkpoint-1000/pytorch_model.bin
+ Configuration saved in ./checkpoint-1000/preprocessor_config.json
+ Deleting older checkpoint [checkpoint-500] due to args.save_total_limit
+  27%|██▋       | 1099/4000 [2:39:51<6:58:13, 8.65s/it]
+  30%|███       | 1200/4000 [2:53:55<5:10:00, 6.64s/it]
+  32%|███▎      | 1300/4000 [3:08:09<7:25:48, 9.91s/it]
+  35%|███▌      | 1399/4000 [3:21:57<5:40:57, 7.87s/it]
+  37%|███▋      | 1499/4000 [3:36:00<6:04:19, 8.74s/it]
+  38%|███▊      | 1500/4000 [3:36:12<6:49:33, 9.83s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 2613
+ Batch size = 72
+ 100%|██████████| 37/37 [02:07<00:00, 2.07s/it]
+ Configuration saved in ./checkpoint-1500/config.json
+ Model weights saved in ./checkpoint-1500/pytorch_model.bin
+ Configuration saved in ./checkpoint-1500/preprocessor_config.json
+ Deleting older checkpoint [checkpoint-1000] due to args.save_total_limit
+  40%|████      | 1599/4000 [3:52:16<5:12:57, 7.82s/it]
+  42%|████▎     | 1700/4000 [4:06:34<6:20:58, 9.94s/it]
+  45%|████▌     | 1800/4000 [4:20:24<3:56:42, 6.46s/it]
+  48%|████▊     | 1900/4000 [4:34:41<5:44:05, 9.83s/it]
+  50%|█████     | 1999/4000 [4:48:30<4:28:04, 8.04s/it]
+  50%|█████     | 2000/4000 [4:48:33<3:41:10, 6.64s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 2613
+ Batch size = 72
+ Configuration saved in ./checkpoint-2000/config.json
+ {'eval_loss': 0.46736055612564087, 'eval_wer': 0.41431016992137965, 'eval_runtime': 133.2229, 'eval_samples_per_second': 19.614, 'eval_steps_per_second': 0.278, 'epoch': 50.0}
+ Model weights saved in ./checkpoint-2000/pytorch_model.bin
+ Configuration saved in ./checkpoint-2000/preprocessor_config.json
+ Deleting older checkpoint [checkpoint-1500] due to args.save_total_limit
+  52%|█████▎    | 2099/4000 [5:04:49<4:36:30, 8.73s/it]
+  55%|█████▌    | 2200/4000 [5:18:51<3:16:16, 6.54s/it]
+  57%|█████▋    | 2299/4000 [5:32:49<4:06:37, 8.70s/it]
+  60%|██████    | 2400/4000 [5:46:47<2:47:55, 6.30s/it]
+  62%|██████▎   | 2499/4000 [6:00:45<3:36:50, 8.67s/it]
+  62%|██████▎   | 2500/4000 [6:00:58<4:04:42, 9.79s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 2613
+ Batch size = 72
+ Configuration saved in ./checkpoint-2500/config.json
+ {'eval_loss': 0.4847034215927124, 'eval_wer': 0.39253106771493784, 'eval_runtime': 132.6998, 'eval_samples_per_second': 19.691, 'eval_steps_per_second': 0.279, 'epoch': 62.5}
+ Model weights saved in ./checkpoint-2500/pytorch_model.bin
+ Configuration saved in ./checkpoint-2500/preprocessor_config.json
+ Deleting older checkpoint [checkpoint-2000] due to args.save_total_limit
+  65%|██████▌   | 2600/4000 [6:17:02<2:29:59, 6.43s/it]
+  68%|██████▊   | 2700/4000 [6:31:13<3:35:21, 9.94s/it]
+  70%|███████   | 2800/4000 [6:45:03<2:09:33, 6.48s/it]
+  72%|███████▎  | 2900/4000 [6:59:16<3:00:07, 9.82s/it]
+  75%|███████▌  | 3000/4000 [7:13:03<1:48:27, 6.51s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 2613
+ Batch size = 72
+ {'loss': 0.0741, 'learning_rate': 0.00018860610975594382, 'epoch': 75.0}
+ 100%|██████████| 37/37 [02:07<00:00, 2.06s/it]
+ Configuration saved in ./checkpoint-3000/config.json
+ Model weights saved in ./checkpoint-3000/pytorch_model.bin
+ Configuration saved in ./checkpoint-3000/preprocessor_config.json
+ Deleting older checkpoint [checkpoint-2500] due to args.save_total_limit
+  78%|███████▊  | 3100/4000 [7:29:27<2:29:56, 10.00s/it]
+  80%|████████  | 3200/4000 [7:43:16<1:27:34, 6.57s/it]
+  82%|████████▎ | 3299/4000 [7:57:15<1:42:16, 8.75s/it]
+  85%|████████▌ | 3399/4000 [8:11:14<1:19:33, 7.94s/it]
+  88%|████████▊ | 3500/4000 [8:25:31<1:22:42, 9.93s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 2613
+ Batch size = 72
+ {'loss': 0.0608, 'learning_rate': 4.9710474062988955e-05, 'epoch': 87.5}
+ 100%|██████████| 37/37 [02:07<00:00, 2.08s/it]
+ Configuration saved in ./checkpoint-3500/config.json
+ Model weights saved in ./checkpoint-3500/pytorch_model.bin
+ Configuration saved in ./checkpoint-3500/preprocessor_config.json
+ Deleting older checkpoint [checkpoint-3000] due to args.save_total_limit
+  90%|█████████ | 3599/4000 [8:41:30<52:10, 7.81s/it]
+  92%|█████████▎| 3700/4000 [8:55:43<49:07, 9.83s/it]
+  95%|█████████▌| 3800/4000 [9:09:27<22:19, 6.70s/it]
+  98%|█████████▊| 3900/4000 [9:23:35<16:13, 9.74s/it]
+ 100%|██████████| 4000/4000 [9:37:22<00:00, 6.61s/it]The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 2613
+ Batch size = 72
+ {'loss': 0.0541, 'learning_rate': 2.0142048445803695e-10, 'epoch': 100.0}
+ 100%|██████████| 37/37 [02:07<00:00, 2.07s/it]
+ Configuration saved in ./checkpoint-4000/config.json
+ Model weights saved in ./checkpoint-4000/pytorch_model.bin
+ Configuration saved in ./checkpoint-4000/preprocessor_config.json
+ {'train_runtime': 34778.5805, 'train_samples_per_second': 16.435, 'train_steps_per_second': 0.115, 'train_loss': 0.3524592447280884, 'epoch': 100.0}
+ Deleting older checkpoint [checkpoint-3500] due to args.save_total_limit
+ Training completed. Do not forget to share your model on huggingface.co/models =)
+ 100%|██████████| 4000/4000 [9:39:37<00:00, 8.69s/it]
+ Saving model checkpoint to ./
+ Configuration saved in ./config.json
+ Model weights saved in ./pytorch_model.bin
+ Configuration saved in ./preprocessor_config.json
+ The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+ ***** Running Evaluation *****
+ Num examples = 2613
+ Batch size = 72
+ ***** train metrics *****
+ epoch = 100.0
+ train_loss = 0.3525
+ train_runtime = 9:39:38.58
+ train_samples = 5716
+ train_samples_per_second = 16.435
+ train_steps_per_second = 0.115
+ 01/31/2022 23:43:44 - INFO - __main__ - *** Evaluate ***
+ 100%|██████████| 37/37 [02:08<00:00, 3.47s/it]
+ ***** eval metrics *****
+ epoch = 100.0
+ eval_loss = 0.4927
+ eval_runtime = 0:02:12.07
+ eval_samples = 2613
+ eval_samples_per_second = 19.785
+ eval_steps_per_second = 0.28
+ eval_wer = 0.3536
+ Dropping the following result as it does not have all the necessary fields:
+ {'dataset': {'name': 'MOZILLA-FOUNDATION/COMMON_VOICE_8_0 - ET', 'type': 'common_voice', 'args': 'Config: et, Training split: train+validation, Eval split: test'}}
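The log above reports word error rate (`eval_wer`) at each evaluation step. As a purely illustrative sketch (not code from this repo), a WER figure like these is typically computed with `jiwer`, which is pinned in this run's requirements.txt; the example strings below are made up.

```python
# Illustrative only: WER as computed by jiwer; the sentences are invented examples.
import jiwer

references = ["tere tulemast koju", "see on lihtne lause"]
predictions = ["tere tulemast kodu", "see on lihtne lause"]

wer = jiwer.wer(references, predictions)  # word-level edit distance / reference word count
print(f"WER: {wer:.4f}")
```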
wandb/run-20220131_140404-gjg8nz5t/files/requirements.txt ADDED
@@ -0,0 +1,87 @@
+ aiohttp==3.8.1
+ aiosignal==1.2.0
+ appdirs==1.4.4
+ async-timeout==4.0.2
+ attrs==21.4.0
+ audioread==2.1.9
+ bitsandbytes-cuda113==0.26.0
+ certifi==2021.10.8
+ cffi==1.15.0
+ charset-normalizer==2.0.10
+ click==8.0.3
+ clldutils==3.10.1
+ colorlog==6.6.0
+ configparser==5.2.0
+ csvw==1.11.0
+ datasets==1.18.1.dev0
+ decorator==5.1.1
+ dill==0.3.4
+ dlinfo==1.2.1
+ docker-pycreds==0.4.0
+ filelock==3.4.2
+ frozenlist==1.3.0
+ fsspec==2022.1.0
+ gitdb==4.0.9
+ gitpython==3.1.26
+ huggingface-hub==0.4.0
+ hypothesis==6.36.0
+ idna==3.3
+ isodate==0.6.1
+ jiwer==2.3.0
+ joblib==1.1.0
+ librosa==0.8.1
+ llvmlite==0.38.0
+ multidict==6.0.2
+ multiprocess==0.70.12.2
+ numba==0.55.0
+ numpy==1.21.5
+ packaging==21.3
+ pandas==1.4.0
+ pathtools==0.1.2
+ phonemizer==3.0.1
+ pip==21.3.1
+ pooch==1.6.0
+ promise==2.3
+ protobuf==3.19.3
+ psutil==5.9.0
+ pyarrow==6.0.1
+ pycparser==2.21
+ pyctcdecode==0.3.0
+ pygtrie==2.4.2
+ pyparsing==3.0.7
+ python-dateutil==2.8.2
+ python-levenshtein==0.12.2
+ pytz==2021.3
+ pyyaml==6.0
+ regex==2022.1.18
+ requests==2.27.1
+ resampy==0.2.2
+ rfc3986==2.0.0
+ sacremoses==0.0.47
+ scikit-learn==1.0.2
+ scipy==1.7.3
+ segments==2.2.0
+ sentry-sdk==1.5.4
+ setuptools==60.2.0
+ shortuuid==1.0.8
+ six==1.16.0
+ smmap==5.0.0
+ sortedcontainers==2.4.0
+ soundfile==0.10.3.post1
+ subprocess32==3.5.4
+ tabulate==0.8.9
+ termcolor==1.1.0
+ threadpoolctl==3.0.0
+ tokenizers==0.11.4
+ torch==1.10.1
+ torchaudio==0.10.1
+ tqdm==4.62.3
+ transformers==4.16.0.dev0
+ typing-extensions==4.0.1
+ uritemplate==4.1.1
+ urllib3==1.26.8
+ wandb==0.12.9
+ wheel==0.37.1
+ xxhash==2.0.2
+ yarl==1.7.2
+ yaspin==2.1.0
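The list above pins the exact environment captured by wandb for this run. As a small optional sketch (not part of the repo), the key pins could be verified at runtime before launching training; the selection of packages below is my own.

```python
# Hedged sketch: verify a few of the pinned versions above in the active environment.
from importlib.metadata import version, PackageNotFoundError

pins = {
    "transformers": "4.16.0.dev0",
    "datasets": "1.18.1.dev0",
    "torch": "1.10.1",
    "wandb": "0.12.9",
}
for name, expected in pins.items():
    try:
        installed = version(name)
    except PackageNotFoundError:
        installed = None
    status = "OK" if installed == expected else f"mismatch (found {installed})"
    print(f"{name}=={expected}: {status}")
```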
wandb/run-20220131_140404-gjg8nz5t/files/wandb-metadata.json ADDED
@@ -0,0 +1,75 @@
+ {
+   "os": "Linux-4.18.0-305.10.2.el8_4.x86_64-x86_64-with-glibc2.28",
+   "python": "3.9.6",
+   "heartbeatAt": "2022-01-31T19:04:05.461063",
+   "startedAt": "2022-01-31T19:04:04.537861",
+   "docker": null,
+   "gpu": "Tesla V100-PCIE-32GB",
+   "gpu_count": 3,
+   "cpu_count": 64,
+   "cuda": null,
+   "args": [
+     "--dataset_name=mozilla-foundation/common_voice_8_0",
+     "--model_name_or_path=facebook/wav2vec2-xls-r-300m",
+     "--dataset_config_name=et",
+     "--output_dir=./",
+     "--overwrite_output_dir",
+     "--num_train_epochs=100",
+     "--per_device_train_batch_size=72",
+     "--per_device_eval_batch_size=72",
+     "--gradient_accumulation_steps=2",
+     "--learning_rate=3e-4",
+     "--save_total_limit=1",
+     "--warmup_steps=500",
+     "--evaluation_strategy=steps",
+     "--text_column_name=sentence",
+     "--length_column_name=input_length",
+     "--save_steps=500",
+     "--eval_steps=500",
+     "--logging_steps=100",
+     "--layerdrop=0.0",
+     "--freeze_feature_encoder",
+     "--feat_proj_dropout=0.1",
+     "--chars_to_ignore",
+     ",",
+     "?",
+     ".",
+     "!",
+     "-",
+     ";",
+     ":",
+     "\"",
+     "\u201c",
+     "%",
+     "\u2018",
+     "\u201d",
+     "\ufffd",
+     "\u2014",
+     "\u2019",
+     "\u2026",
+     "\u2013",
+     "--gradient_checkpointing",
+     "--lr_scheduler_type=cosine",
+     "--fp16",
+     "--group_by_length",
+     "--mask_time_prob=0.1",
+     "--mask_time_length=10",
+     "--report_to=wandb",
+     "--run_name=cosine+drop_proj+low_specaugment-300M+cv_8_0",
+     "--do_train",
+     "--do_eval",
+     "--use_auth_token"
+   ],
+   "state": "running",
+   "program": "/home/sagrilaft/Project/audio/xls-r-et/src/run_speech_recognition_ctc_bnb.py",
+   "codePath": "src/run_speech_recognition_ctc_bnb.py",
+   "git": {
+     "remote": "https://huggingface.co/shpotes/xls-r-et",
+     "commit": "ff66b86f52be4c55fb5be74a60f889284554c939"
+   },
+   "email": "shpotes3@gmail.com",
+   "root": "/home/sagrilaft/Project/audio/xls-r-et",
+   "host": "ganymede.eafit.edu.co",
+   "username": "sagrilaft",
+   "executable": "/home/sagrilaft/Project/audio/xls-r-et/.venv/bin/python"
+ }
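The `args` array above records the command-line flags passed to `src/run_speech_recognition_ctc_bnb.py`. In transformers example scripts of this kind, such flags are usually parsed with `HfArgumentParser` into dataclasses plus `TrainingArguments`; the sketch below shows that general pattern only, and the `ModelArguments` fields are illustrative, not the script's exact definitions.

```python
# Hedged sketch of the HfArgumentParser pattern; field names here are assumptions.
from dataclasses import dataclass, field
from transformers import HfArgumentParser, TrainingArguments

@dataclass
class ModelArguments:
    model_name_or_path: str = field(default="facebook/wav2vec2-xls-r-300m")
    feat_proj_dropout: float = field(default=0.0)
    mask_time_prob: float = field(default=0.05)

# --output_dir and the other flags listed in the metadata come from the CLI.
parser = HfArgumentParser((ModelArguments, TrainingArguments))
model_args, training_args = parser.parse_args_into_dataclasses()
print(training_args.learning_rate, training_args.lr_scheduler_type)
```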
wandb/run-20220131_140404-gjg8nz5t/files/wandb-summary.json ADDED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220131_140404-gjg8nz5t/logs/debug-internal.log ADDED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220131_140404-gjg8nz5t/logs/debug.log ADDED
@@ -0,0 +1,170 @@
+ 2022-01-31 14:04:04,540 INFO MainThread:2738250 [wandb_setup.py:_flush():71] setting env: {'project': 'xls-r-estonian'}
+ 2022-01-31 14:04:04,540 INFO MainThread:2738250 [wandb_setup.py:_flush():71] setting login settings: {}
+ 2022-01-31 14:04:04,540 INFO MainThread:2738250 [wandb_init.py:_log_setup():371] Logging user logs to /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_140404-gjg8nz5t/logs/debug.log
+ 2022-01-31 14:04:04,540 INFO MainThread:2738250 [wandb_init.py:_log_setup():372] Logging internal logs to /home/sagrilaft/Project/audio/xls-r-et/wandb/run-20220131_140404-gjg8nz5t/logs/debug-internal.log
+ 2022-01-31 14:04:04,541 INFO MainThread:2738250 [wandb_init.py:init():404] calling init triggers
+ 2022-01-31 14:04:04,541 INFO MainThread:2738250 [wandb_init.py:init():409] wandb.init called with sweep_config: {}
+ config: {}
+ 2022-01-31 14:04:04,541 INFO MainThread:2738250 [wandb_init.py:init():460] starting backend
+ 2022-01-31 14:04:04,541 INFO MainThread:2738250 [backend.py:_multiprocessing_setup():99] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
+ 2022-01-31 14:04:04,562 INFO MainThread:2738250 [backend.py:ensure_launched():216] starting backend process...
+ 2022-01-31 14:04:04,580 INFO MainThread:2738250 [backend.py:ensure_launched():221] started backend process with pid: 2738674
+ 2022-01-31 14:04:04,582 INFO MainThread:2738250 [wandb_init.py:init():469] backend started and connected
+ 2022-01-31 14:04:04,586 INFO MainThread:2738250 [wandb_init.py:init():533] updated telemetry
+ 2022-01-31 14:04:04,634 INFO MainThread:2738250 [wandb_init.py:init():563] communicating current version
+ 2022-01-31 14:04:05,285 INFO MainThread:2738250 [wandb_init.py:init():568] got version response
+ 2022-01-31 14:04:05,285 INFO MainThread:2738250 [wandb_init.py:init():578] communicating run to backend with 30 second timeout
+ 2022-01-31 14:04:05,454 INFO MainThread:2738250 [wandb_init.py:init():606] starting run threads in backend
+ 2022-01-31 14:04:05,500 INFO MainThread:2738250 [wandb_run.py:_console_start():1810] atexit reg
+ 2022-01-31 14:04:05,501 INFO MainThread:2738250 [wandb_run.py:_redirect():1684] redirect: SettingsConsole.REDIRECT
+ 2022-01-31 14:04:05,501 INFO MainThread:2738250 [wandb_run.py:_redirect():1689] Redirecting console.
+ 2022-01-31 14:04:05,503 INFO MainThread:2738250 [wandb_run.py:_redirect():1745] Redirects installed.
+ 2022-01-31 14:04:05,503 INFO MainThread:2738250 [wandb_init.py:init():633] run started, returning control to user process
+ 2022-01-31 14:04:05,519 INFO MainThread:2738250 [wandb_run.py:_config_callback():956] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 36, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.16.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 39, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.1, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 72, 'per_device_eval_batch_size': 72, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 2, 'eval_accumulation_steps': 'None', 'learning_rate': 0.0003, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 100.0, 'max_steps': -1, 'lr_scheduler_type': 'cosine', 'warmup_ratio': 0.0, 'warmup_steps': 500, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Jan31_14-02-07_ganymede.eafit.edu.co', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 1, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': 'cosine+drop_proj+low_specaugment-300M+cv_8_0', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': False, 'metric_for_best_model': 'None', 'greater_is_better': 'None', 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': False, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 72, 'eval_batch_size': 72}
+ 2022-01-31 14:04:05,521 INFO MainThread:2738250 [wandb_watch.py:watch():43] Watching
+ 2022-01-31 23:45:57,065 INFO MainThread:2738250 [wandb_run.py:_atexit_cleanup():1780] got exitcode: 0
+ 2022-01-31 23:45:57,067 INFO MainThread:2738250 [wandb_run.py:_restore():1752] restore
+ 2022-01-31 23:45:59,548 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 1
+ }
+ pusher_stats {
+ uploaded_bytes: 2306
+ total_bytes: 2306
+ }
+
+ 2022-01-31 23:45:59,693 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 1
+ }
+ pusher_stats {
+ uploaded_bytes: 2306
+ total_bytes: 2306
+ }
+
+ 2022-01-31 23:46:00,350 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 1
+ }
+ pusher_stats {
+ uploaded_bytes: 2306
+ total_bytes: 2306
+ }
+
+ 2022-01-31 23:46:00,677 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 4
+ }
+ pusher_stats {
+ uploaded_bytes: 2306
+ total_bytes: 798601
+ }
+
+ 2022-01-31 23:46:00,779 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 2306
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:00,885 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 2306
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:00,988 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 2306
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:01,090 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:01,192 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:01,295 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:01,397 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:01,500 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:01,602 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:01,704 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:01,807 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:03,074 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+
+ 2022-01-31 23:46:03,447 INFO MainThread:2738250 [wandb_run.py:_wait_for_finish():1912] got exit ret: done: true
+ exit_result {
+ }
+ file_counts {
+ wandb_count: 5
+ }
+ pusher_stats {
+ uploaded_bytes: 829187
+ total_bytes: 829187
+ }
+ local_info {
+ }
+
+ 2022-01-31 23:46:04,560 INFO MainThread:2738250 [wandb_run.py:_append_history():2130] rendering history
+ 2022-01-31 23:46:04,562 INFO MainThread:2738250 [wandb_run.py:_append_summary():2085] rendering summary
+ 2022-01-31 23:46:04,563 INFO MainThread:2738250 [wandb_run.py:_append_files():2180] logging synced files
wandb/run-20220131_140404-gjg8nz5t/run-gjg8nz5t.wandb ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e2c3f1a476aa6d29faeae196b5d94a6d5a4ab1d94b0f862f997944cc5d7bd9b5
+ size 29809444