agnesluhtaru committed
Commit 1a9e6f4
1 parent: 0265a61

Training in progress, step 1000

Files changed (21)
  1. README.md +0 -65
  2. all_results.json +0 -12
  3. config.json +9 -9
  4. eval_results.json +0 -8
  5. pytorch_model.bin +2 -2
  6. runs/Dec12_20-15-25_nid007212/1670868963.1957881/events.out.tfevents.1670868963.nid007212.108213.1 +0 -3
  7. runs/Dec12_20-15-25_nid007212/events.out.tfevents.1670868963.nid007212.108213.0 +0 -3
  8. runs/Dec12_20-45-22_nid007212/1670870762.3217633/events.out.tfevents.1670870762.nid007212.21786.1 +0 -3
  9. runs/Dec12_20-45-22_nid007212/events.out.tfevents.1670870762.nid007212.21786.0 +0 -3
  10. runs/Dec12_20-47-24_nid007212/events.out.tfevents.1670870883.nid007212.24215.0 +0 -3
  11. runs/Dec13_12-09-42_nid006505/1670926221.4825838/events.out.tfevents.1670926221.nid006505.42921.1 +0 -3
  12. runs/Dec13_13-27-39_nid006766/1670930929.787034/events.out.tfevents.1670930929.nid006766.129058.1 +0 -3
  13. runs/Dec13_13-27-39_nid006766/events.out.tfevents.1670930929.nid006766.129058.0 +0 -3
  14. runs/Dec13_17-53-57_nid006703/1670946874.9508193/events.out.tfevents.1670946874.nid006703.128967.1 +0 -3
  15. runs/Dec13_17-53-57_nid006703/events.out.tfevents.1670946874.nid006703.128967.0 +0 -3
  16. runs/Dec13_17-53-57_nid006703/events.out.tfevents.1671051702.nid006703.128967.2 +0 -3
  17. runs/{Dec12_20-47-24_nid007212/1670870883.503076/events.out.tfevents.1670870883.nid007212.24215.1 → Dec15_13-16-07_nid006501/1671103005.8075087/events.out.tfevents.1671103005.nid006501.105699.1} +1 -1
  18. runs/{Dec13_12-09-42_nid006505/events.out.tfevents.1670926221.nid006505.42921.0 → Dec15_13-16-07_nid006501/events.out.tfevents.1671103005.nid006501.105699.0} +1 -1
  19. train_results.json +0 -7
  20. trainer_state.json +0 -1270
  21. training_args.bin +1 -1
README.md DELETED
@@ -1,65 +0,0 @@
- ---
- license: apache-2.0
- tags:
- - generated_from_trainer
- metrics:
- - wer
- model-index:
- - name: whisper-small-et
- results: []
- ---
-
- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->
-
- # whisper-small-et
-
- This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the None dataset.
- It achieves the following results on the evaluation set:
- - Loss: 1.0549
- - Wer: 37.2559
-
- ## Model description
-
- More information needed
-
- ## Intended uses & limitations
-
- More information needed
-
- ## Training and evaluation data
-
- More information needed
-
- ## Training procedure
-
- ### Training hyperparameters
-
- The following hyperparameters were used during training:
- - learning_rate: 1e-05
- - train_batch_size: 64
- - eval_batch_size: 32
- - seed: 42
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- - lr_scheduler_type: linear
- - lr_scheduler_warmup_steps: 500
- - training_steps: 5000
- - mixed_precision_training: Native AMP
-
- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | Wer |
- |:-------------:|:-----:|:----:|:---------------:|:-------:|
- | 0.0489 | 6.02 | 1000 | 0.6054 | 33.5155 |
- | 0.0067 | 13.02 | 2000 | 0.7319 | 33.8098 |
- | 0.0018 | 20.01 | 3000 | 0.8847 | 34.6952 |
- | 0.0014 | 27.01 | 4000 | 1.0163 | 37.2011 |
- | 0.0007 | 34.0 | 5000 | 1.0549 | 37.2559 |
-
-
- ### Framework versions
-
- - Transformers 4.26.0.dev0
- - Pytorch 1.12.1+rocm5.1.1
- - Datasets 2.7.1.dev0
- - Tokenizers 0.13.2
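The hyperparameter list in the deleted model card maps directly onto `Seq2SeqTrainingArguments` from `transformers`. A minimal sketch of that configuration, assuming single-device training (so `per_device_train_batch_size` equals the card's `train_batch_size`) and the output directory named in trainer_state.json; the actual training script is not part of this commit:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the arguments implied by the deleted README.md; the Adam betas/epsilon
# and linear scheduler listed in the card are the transformers defaults.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-et",
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=1000,              # the card reports an eval row every 1000 steps
    predict_with_generate=True,   # needed so WER can be computed from generated text
)
```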
all_results.json DELETED
@@ -1,12 +0,0 @@
- {
- "epoch": 34.0,
- "eval_loss": 0.6054422855377197,
- "eval_runtime": 6785.7386,
- "eval_samples_per_second": 1.166,
- "eval_steps_per_second": 0.037,
- "eval_wer": 33.51547271432038,
- "train_loss": 0.11123611520007252,
- "train_runtime": 97978.5764,
- "train_samples_per_second": 3.266,
- "train_steps_per_second": 0.051
- }
config.json CHANGED
@@ -1,5 +1,5 @@
  {
- "_name_or_path": "openai/whisper-medium",
+ "_name_or_path": "openai/whisper-small",
  "activation_dropout": 0.0,
  "activation_function": "gelu",
  "architectures": [
@@ -11,17 +11,17 @@
  50257
  ],
  "bos_token_id": 50257,
- "d_model": 1024,
- "decoder_attention_heads": 16,
- "decoder_ffn_dim": 4096,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
  "decoder_layerdrop": 0.0,
- "decoder_layers": 24,
+ "decoder_layers": 12,
  "decoder_start_token_id": 50258,
  "dropout": 0.0,
- "encoder_attention_heads": 16,
- "encoder_ffn_dim": 4096,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
  "encoder_layerdrop": 0.0,
- "encoder_layers": 24,
+ "encoder_layers": 12,
  "eos_token_id": 50257,
  "forced_decoder_ids": null,
  "init_std": 0.02,
@@ -30,7 +30,7 @@
  "max_source_positions": 1500,
  "max_target_positions": 448,
  "model_type": "whisper",
- "num_hidden_layers": 24,
+ "num_hidden_layers": 12,
  "num_mel_bins": 80,
  "pad_token_id": 50257,
  "scale_embedding": false,
eval_results.json DELETED
@@ -1,8 +0,0 @@
- {
- "epoch": 34.0,
- "eval_loss": 0.6054422855377197,
- "eval_runtime": 6785.7386,
- "eval_samples_per_second": 1.166,
- "eval_steps_per_second": 0.037,
- "eval_wer": 33.51547271432038
- }
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:71724ba3c430040d3129ac90d89f17498ca21cd24bdd3c20c804a2b8d1abab4c
- size 3055748571
+ oid sha256:561b68c787a7f63e54a08c737be04744b1f49e6f25ce6ee66ab84bba388423fe
+ size 967099139
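Both versions of pytorch_model.bin are Git LFS pointer files: the text above records only the sha256 and byte size of the real weights, and the drop from roughly 3.06 GB to 967 MB matches the medium-to-small swap in config.json. Once `git lfs` has fetched the actual weights, one way to check the file against the pointer, sketched here with the Python standard library, is to recompute the hash:

```python
import hashlib
from pathlib import Path

def file_sha256(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a large file and return its sha256 hex digest."""
    digest = hashlib.sha256()
    with Path(path).open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Should print the oid recorded in the new pointer:
# 561b68c787a7f63e54a08c737be04744b1f49e6f25ce6ee66ab84bba388423fe
print(file_sha256("pytorch_model.bin"))
```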
runs/Dec12_20-15-25_nid007212/1670868963.1957881/events.out.tfevents.1670868963.nid007212.108213.1 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:202307e1533d515ced252a98c29f8918137b5f0d3fc50e3469d78e94a1f21f9f
- size 5908
runs/Dec12_20-15-25_nid007212/events.out.tfevents.1670868963.nid007212.108213.0 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:97f8593ba27b71844f0ff045dd2c36d00d05ccd4d1528a9d84b23cb7f769492c
- size 4305
runs/Dec12_20-45-22_nid007212/1670870762.3217633/events.out.tfevents.1670870762.nid007212.21786.1 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:38f9196d80a3f7aefe2477ffa5085c20e35cbd673b21603b0fecc1bf228258dc
- size 5908
runs/Dec12_20-45-22_nid007212/events.out.tfevents.1670870762.nid007212.21786.0 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:d36b264c52f14ff0183f4925fb96fa80c7662482685979c2b47ad72cbd030e51
- size 4305
runs/Dec12_20-47-24_nid007212/events.out.tfevents.1670870883.nid007212.24215.0 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:a669428cc9b333f32b2af1fddc443f846023431d2e8a5503d4d0367bcd121892
- size 4305
runs/Dec13_12-09-42_nid006505/1670926221.4825838/events.out.tfevents.1670926221.nid006505.42921.1 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:a2e4eacf39aabe5df1d3b01cbfd66a12f45918652a0c06926d998239c3624e47
- size 5908
runs/Dec13_13-27-39_nid006766/1670930929.787034/events.out.tfevents.1670930929.nid006766.129058.1 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:8720ff82c16270543f47053fb6727d4870c617589e9d4697059a1af3f8ddb12a
- size 5908
runs/Dec13_13-27-39_nid006766/events.out.tfevents.1670930929.nid006766.129058.0 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:8dd5d6af65fe9362a427f41e6c77751aa3e131e48367e11cac07cb148b89ea39
- size 37282
runs/Dec13_17-53-57_nid006703/1670946874.9508193/events.out.tfevents.1670946874.nid006703.128967.1 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:cad0c0c916011a66c362f9ff644ec54f6aa11a5b6fe4d4ef4358250c6c53b7af
- size 5908
runs/Dec13_17-53-57_nid006703/events.out.tfevents.1670946874.nid006703.128967.0 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:b027357e29e0adf7a3742369b37217d0e1d9359003f79786baa614d9778a0081
- size 37634
runs/Dec13_17-53-57_nid006703/events.out.tfevents.1671051702.nid006703.128967.2 DELETED
@@ -1,3 +0,0 @@
- version https://git-lfs.github.com/spec/v1
- oid sha256:35017a055f78b77bb1506a3d5b6ec8eec51c7f195bcabf9eac8627ccd0274745
- size 358
runs/{Dec12_20-47-24_nid007212/1670870883.503076/events.out.tfevents.1670870883.nid007212.24215.1 → Dec15_13-16-07_nid006501/1671103005.8075087/events.out.tfevents.1671103005.nid006501.105699.1} RENAMED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7c197ff3d741a2e1236c5efd756aa016585f67d4de54eb84f201b54c60dbe26b
+ oid sha256:156a9651b48d10af24ce34673994c080d7af0e90218b849dc4b15b10a72cd503
  size 5908
runs/{Dec13_12-09-42_nid006505/events.out.tfevents.1670926221.nid006505.42921.0 → Dec15_13-16-07_nid006501/events.out.tfevents.1671103005.nid006501.105699.0} RENAMED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:96ea9b3c4c51a275b338af1d47a2cbcbfff4f45596414c29cbf0bcbc05675be4
+ oid sha256:72ac0066614e1b7b6f2852c6423bd3ffd97de5c57a7219450ab35a5008a232a3
  size 10888
train_results.json DELETED
@@ -1,7 +0,0 @@
- {
- "epoch": 34.0,
- "train_loss": 0.11123611520007252,
- "train_runtime": 97978.5764,
- "train_samples_per_second": 3.266,
- "train_steps_per_second": 0.051
- }
trainer_state.json DELETED
@@ -1,1270 +0,0 @@
1
- {
2
- "best_metric": 33.51547271432038,
3
- "best_model_checkpoint": "whisper-small-et/checkpoint-1000",
4
- "epoch": 34.0004,
5
- "global_step": 5000,
6
- "is_hyper_param_search": false,
7
- "is_local_process_zero": true,
8
- "is_world_process_zero": true,
9
- "log_history": [
10
- {
11
- "epoch": 0.01,
12
- "learning_rate": 4.6000000000000004e-07,
13
- "loss": 1.7783,
14
- "step": 25
15
- },
16
- {
17
- "epoch": 0.01,
18
- "learning_rate": 9.600000000000001e-07,
19
- "loss": 1.5296,
20
- "step": 50
21
- },
22
- {
23
- "epoch": 0.01,
24
- "learning_rate": 1.46e-06,
25
- "loss": 1.3037,
26
- "step": 75
27
- },
28
- {
29
- "epoch": 0.02,
30
- "learning_rate": 1.9600000000000003e-06,
31
- "loss": 1.2767,
32
- "step": 100
33
- },
34
- {
35
- "epoch": 0.03,
36
- "learning_rate": 2.46e-06,
37
- "loss": 1.5126,
38
- "step": 125
39
- },
40
- {
41
- "epoch": 1.0,
42
- "learning_rate": 2.96e-06,
43
- "loss": 1.4068,
44
- "step": 150
45
- },
46
- {
47
- "epoch": 1.01,
48
- "learning_rate": 3.46e-06,
49
- "loss": 1.2894,
50
- "step": 175
51
- },
52
- {
53
- "epoch": 1.01,
54
- "learning_rate": 3.96e-06,
55
- "loss": 1.1285,
56
- "step": 200
57
- },
58
- {
59
- "epoch": 1.02,
60
- "learning_rate": 4.4600000000000005e-06,
61
- "loss": 1.0276,
62
- "step": 225
63
- },
64
- {
65
- "epoch": 1.02,
66
- "learning_rate": 4.960000000000001e-06,
67
- "loss": 0.9116,
68
- "step": 250
69
- },
70
- {
71
- "epoch": 1.03,
72
- "learning_rate": 5.460000000000001e-06,
73
- "loss": 0.7987,
74
- "step": 275
75
- },
76
- {
77
- "epoch": 2.0,
78
- "learning_rate": 5.9600000000000005e-06,
79
- "loss": 0.7129,
80
- "step": 300
81
- },
82
- {
83
- "epoch": 2.01,
84
- "learning_rate": 6.460000000000001e-06,
85
- "loss": 0.6695,
86
- "step": 325
87
- },
88
- {
89
- "epoch": 2.01,
90
- "learning_rate": 6.96e-06,
91
- "loss": 0.6068,
92
- "step": 350
93
- },
94
- {
95
- "epoch": 2.02,
96
- "learning_rate": 7.4600000000000006e-06,
97
- "loss": 0.5721,
98
- "step": 375
99
- },
100
- {
101
- "epoch": 2.02,
102
- "learning_rate": 7.960000000000002e-06,
103
- "loss": 0.5162,
104
- "step": 400
105
- },
106
- {
107
- "epoch": 2.03,
108
- "learning_rate": 8.46e-06,
109
- "loss": 0.4458,
110
- "step": 425
111
- },
112
- {
113
- "epoch": 3.0,
114
- "learning_rate": 8.96e-06,
115
- "loss": 0.3935,
116
- "step": 450
117
- },
118
- {
119
- "epoch": 3.01,
120
- "learning_rate": 9.460000000000001e-06,
121
- "loss": 0.371,
122
- "step": 475
123
- },
124
- {
125
- "epoch": 3.01,
126
- "learning_rate": 9.960000000000001e-06,
127
- "loss": 0.3565,
128
- "step": 500
129
- },
130
- {
131
- "epoch": 3.02,
132
- "learning_rate": 9.94888888888889e-06,
133
- "loss": 0.3519,
134
- "step": 525
135
- },
136
- {
137
- "epoch": 3.02,
138
- "learning_rate": 9.893333333333334e-06,
139
- "loss": 0.2977,
140
- "step": 550
141
- },
142
- {
143
- "epoch": 3.03,
144
- "learning_rate": 9.837777777777778e-06,
145
- "loss": 0.2411,
146
- "step": 575
147
- },
148
- {
149
- "epoch": 4.0,
150
- "learning_rate": 9.782222222222222e-06,
151
- "loss": 0.2003,
152
- "step": 600
153
- },
154
- {
155
- "epoch": 4.01,
156
- "learning_rate": 9.726666666666668e-06,
157
- "loss": 0.1988,
158
- "step": 625
159
- },
160
- {
161
- "epoch": 4.01,
162
- "learning_rate": 9.671111111111112e-06,
163
- "loss": 0.202,
164
- "step": 650
165
- },
166
- {
167
- "epoch": 4.02,
168
- "learning_rate": 9.615555555555558e-06,
169
- "loss": 0.1985,
170
- "step": 675
171
- },
172
- {
173
- "epoch": 4.02,
174
- "learning_rate": 9.56e-06,
175
- "loss": 0.1661,
176
- "step": 700
177
- },
178
- {
179
- "epoch": 4.03,
180
- "learning_rate": 9.504444444444446e-06,
181
- "loss": 0.1213,
182
- "step": 725
183
- },
184
- {
185
- "epoch": 5.0,
186
- "learning_rate": 9.44888888888889e-06,
187
- "loss": 0.0985,
188
- "step": 750
189
- },
190
- {
191
- "epoch": 5.01,
192
- "learning_rate": 9.393333333333334e-06,
193
- "loss": 0.1113,
194
- "step": 775
195
- },
196
- {
197
- "epoch": 5.01,
198
- "learning_rate": 9.33777777777778e-06,
199
- "loss": 0.1164,
200
- "step": 800
201
- },
202
- {
203
- "epoch": 5.02,
204
- "learning_rate": 9.282222222222222e-06,
205
- "loss": 0.1083,
206
- "step": 825
207
- },
208
- {
209
- "epoch": 5.02,
210
- "learning_rate": 9.226666666666668e-06,
211
- "loss": 0.0924,
212
- "step": 850
213
- },
214
- {
215
- "epoch": 5.03,
216
- "learning_rate": 9.171111111111112e-06,
217
- "loss": 0.059,
218
- "step": 875
219
- },
220
- {
221
- "epoch": 6.0,
222
- "learning_rate": 9.115555555555556e-06,
223
- "loss": 0.0564,
224
- "step": 900
225
- },
226
- {
227
- "epoch": 6.01,
228
- "learning_rate": 9.060000000000001e-06,
229
- "loss": 0.06,
230
- "step": 925
231
- },
232
- {
233
- "epoch": 6.01,
234
- "learning_rate": 9.004444444444445e-06,
235
- "loss": 0.0646,
236
- "step": 950
237
- },
238
- {
239
- "epoch": 6.02,
240
- "learning_rate": 8.94888888888889e-06,
241
- "loss": 0.0624,
242
- "step": 975
243
- },
244
- {
245
- "epoch": 6.02,
246
- "learning_rate": 8.893333333333333e-06,
247
- "loss": 0.0489,
248
- "step": 1000
249
- },
250
- {
251
- "epoch": 6.02,
252
- "eval_loss": 0.6054422855377197,
253
- "eval_runtime": 6829.9531,
254
- "eval_samples_per_second": 1.158,
255
- "eval_steps_per_second": 0.036,
256
- "eval_wer": 33.51547271432038,
257
- "step": 1000
258
- },
259
- {
260
- "epoch": 6.03,
261
- "learning_rate": 8.83777777777778e-06,
262
- "loss": 0.0341,
263
- "step": 1025
264
- },
265
- {
266
- "epoch": 7.0,
267
- "learning_rate": 8.782222222222223e-06,
268
- "loss": 0.0324,
269
- "step": 1050
270
- },
271
- {
272
- "epoch": 7.01,
273
- "learning_rate": 8.726666666666667e-06,
274
- "loss": 0.0342,
275
- "step": 1075
276
- },
277
- {
278
- "epoch": 7.01,
279
- "learning_rate": 8.671111111111113e-06,
280
- "loss": 0.0407,
281
- "step": 1100
282
- },
283
- {
284
- "epoch": 7.02,
285
- "learning_rate": 8.615555555555555e-06,
286
- "loss": 0.0391,
287
- "step": 1125
288
- },
289
- {
290
- "epoch": 7.02,
291
- "learning_rate": 8.560000000000001e-06,
292
- "loss": 0.0289,
293
- "step": 1150
294
- },
295
- {
296
- "epoch": 7.03,
297
- "learning_rate": 8.504444444444445e-06,
298
- "loss": 0.0182,
299
- "step": 1175
300
- },
301
- {
302
- "epoch": 8.0,
303
- "learning_rate": 8.448888888888889e-06,
304
- "loss": 0.0197,
305
- "step": 1200
306
- },
307
- {
308
- "epoch": 8.01,
309
- "learning_rate": 8.393333333333335e-06,
310
- "loss": 0.0224,
311
- "step": 1225
312
- },
313
- {
314
- "epoch": 8.01,
315
- "learning_rate": 8.337777777777777e-06,
316
- "loss": 0.0229,
317
- "step": 1250
318
- },
319
- {
320
- "epoch": 8.02,
321
- "learning_rate": 8.282222222222223e-06,
322
- "loss": 0.0236,
323
- "step": 1275
324
- },
325
- {
326
- "epoch": 8.02,
327
- "learning_rate": 8.226666666666667e-06,
328
- "loss": 0.0175,
329
- "step": 1300
330
- },
331
- {
332
- "epoch": 9.0,
333
- "learning_rate": 8.171111111111113e-06,
334
- "loss": 0.0137,
335
- "step": 1325
336
- },
337
- {
338
- "epoch": 9.01,
339
- "learning_rate": 8.115555555555557e-06,
340
- "loss": 0.0142,
341
- "step": 1350
342
- },
343
- {
344
- "epoch": 9.01,
345
- "learning_rate": 8.06e-06,
346
- "loss": 0.0158,
347
- "step": 1375
348
- },
349
- {
350
- "epoch": 9.02,
351
- "learning_rate": 8.004444444444445e-06,
352
- "loss": 0.0174,
353
- "step": 1400
354
- },
355
- {
356
- "epoch": 9.02,
357
- "learning_rate": 7.948888888888889e-06,
358
- "loss": 0.0154,
359
- "step": 1425
360
- },
361
- {
362
- "epoch": 9.03,
363
- "learning_rate": 7.893333333333335e-06,
364
- "loss": 0.0122,
365
- "step": 1450
366
- },
367
- {
368
- "epoch": 10.0,
369
- "learning_rate": 7.837777777777779e-06,
370
- "loss": 0.0091,
371
- "step": 1475
372
- },
373
- {
374
- "epoch": 10.01,
375
- "learning_rate": 7.782222222222223e-06,
376
- "loss": 0.0094,
377
- "step": 1500
378
- },
379
- {
380
- "epoch": 10.01,
381
- "learning_rate": 7.726666666666667e-06,
382
- "loss": 0.011,
383
- "step": 1525
384
- },
385
- {
386
- "epoch": 10.02,
387
- "learning_rate": 7.67111111111111e-06,
388
- "loss": 0.0123,
389
- "step": 1550
390
- },
391
- {
392
- "epoch": 10.02,
393
- "learning_rate": 7.6155555555555564e-06,
394
- "loss": 0.0111,
395
- "step": 1575
396
- },
397
- {
398
- "epoch": 10.03,
399
- "learning_rate": 7.5600000000000005e-06,
400
- "loss": 0.0084,
401
- "step": 1600
402
- },
403
- {
404
- "epoch": 11.0,
405
- "learning_rate": 7.504444444444445e-06,
406
- "loss": 0.0071,
407
- "step": 1625
408
- },
409
- {
410
- "epoch": 11.01,
411
- "learning_rate": 7.44888888888889e-06,
412
- "loss": 0.0083,
413
- "step": 1650
414
- },
415
- {
416
- "epoch": 11.01,
417
- "learning_rate": 7.393333333333333e-06,
418
- "loss": 0.0086,
419
- "step": 1675
420
- },
421
- {
422
- "epoch": 11.02,
423
- "learning_rate": 7.337777777777778e-06,
424
- "loss": 0.0086,
425
- "step": 1700
426
- },
427
- {
428
- "epoch": 11.02,
429
- "learning_rate": 7.282222222222222e-06,
430
- "loss": 0.0078,
431
- "step": 1725
432
- },
433
- {
434
- "epoch": 11.03,
435
- "learning_rate": 7.226666666666667e-06,
436
- "loss": 0.0065,
437
- "step": 1750
438
- },
439
- {
440
- "epoch": 12.0,
441
- "learning_rate": 7.171111111111112e-06,
442
- "loss": 0.006,
443
- "step": 1775
444
- },
445
- {
446
- "epoch": 12.01,
447
- "learning_rate": 7.115555555555557e-06,
448
- "loss": 0.0059,
449
- "step": 1800
450
- },
451
- {
452
- "epoch": 12.01,
453
- "learning_rate": 7.06e-06,
454
- "loss": 0.0069,
455
- "step": 1825
456
- },
457
- {
458
- "epoch": 12.02,
459
- "learning_rate": 7.004444444444445e-06,
460
- "loss": 0.0071,
461
- "step": 1850
462
- },
463
- {
464
- "epoch": 12.02,
465
- "learning_rate": 6.948888888888889e-06,
466
- "loss": 0.0063,
467
- "step": 1875
468
- },
469
- {
470
- "epoch": 12.03,
471
- "learning_rate": 6.893333333333334e-06,
472
- "loss": 0.0046,
473
- "step": 1900
474
- },
475
- {
476
- "epoch": 13.0,
477
- "learning_rate": 6.837777777777779e-06,
478
- "loss": 0.0047,
479
- "step": 1925
480
- },
481
- {
482
- "epoch": 13.01,
483
- "learning_rate": 6.782222222222222e-06,
484
- "loss": 0.0052,
485
- "step": 1950
486
- },
487
- {
488
- "epoch": 13.01,
489
- "learning_rate": 6.726666666666667e-06,
490
- "loss": 0.006,
491
- "step": 1975
492
- },
493
- {
494
- "epoch": 13.02,
495
- "learning_rate": 6.671111111111112e-06,
496
- "loss": 0.0067,
497
- "step": 2000
498
- },
499
- {
500
- "epoch": 13.02,
501
- "eval_loss": 0.7319000363349915,
502
- "eval_runtime": 5844.1698,
503
- "eval_samples_per_second": 1.354,
504
- "eval_steps_per_second": 0.042,
505
- "eval_wer": 33.809818304224954,
506
- "step": 2000
507
- },
508
- {
509
- "epoch": 13.02,
510
- "learning_rate": 6.615555555555556e-06,
511
- "loss": 0.0051,
512
- "step": 2025
513
- },
514
- {
515
- "epoch": 13.03,
516
- "learning_rate": 6.560000000000001e-06,
517
- "loss": 0.0032,
518
- "step": 2050
519
- },
520
- {
521
- "epoch": 14.0,
522
- "learning_rate": 6.504444444444446e-06,
523
- "loss": 0.0029,
524
- "step": 2075
525
- },
526
- {
527
- "epoch": 14.01,
528
- "learning_rate": 6.448888888888889e-06,
529
- "loss": 0.0032,
530
- "step": 2100
531
- },
532
- {
533
- "epoch": 14.01,
534
- "learning_rate": 6.393333333333334e-06,
535
- "loss": 0.0039,
536
- "step": 2125
537
- },
538
- {
539
- "epoch": 14.02,
540
- "learning_rate": 6.3377777777777786e-06,
541
- "loss": 0.0047,
542
- "step": 2150
543
- },
544
- {
545
- "epoch": 14.02,
546
- "learning_rate": 6.282222222222223e-06,
547
- "loss": 0.0036,
548
- "step": 2175
549
- },
550
- {
551
- "epoch": 14.03,
552
- "learning_rate": 6.2266666666666675e-06,
553
- "loss": 0.003,
554
- "step": 2200
555
- },
556
- {
557
- "epoch": 15.0,
558
- "learning_rate": 6.171111111111112e-06,
559
- "loss": 0.0034,
560
- "step": 2225
561
- },
562
- {
563
- "epoch": 15.01,
564
- "learning_rate": 6.1155555555555555e-06,
565
- "loss": 0.0029,
566
- "step": 2250
567
- },
568
- {
569
- "epoch": 15.01,
570
- "learning_rate": 6.0600000000000004e-06,
571
- "loss": 0.0035,
572
- "step": 2275
573
- },
574
- {
575
- "epoch": 15.02,
576
- "learning_rate": 6.004444444444445e-06,
577
- "loss": 0.0031,
578
- "step": 2300
579
- },
580
- {
581
- "epoch": 15.02,
582
- "learning_rate": 5.948888888888889e-06,
583
- "loss": 0.0026,
584
- "step": 2325
585
- },
586
- {
587
- "epoch": 15.03,
588
- "learning_rate": 5.893333333333334e-06,
589
- "loss": 0.0025,
590
- "step": 2350
591
- },
592
- {
593
- "epoch": 16.0,
594
- "learning_rate": 5.837777777777777e-06,
595
- "loss": 0.0025,
596
- "step": 2375
597
- },
598
- {
599
- "epoch": 16.01,
600
- "learning_rate": 5.782222222222222e-06,
601
- "loss": 0.003,
602
- "step": 2400
603
- },
604
- {
605
- "epoch": 16.01,
606
- "learning_rate": 5.726666666666667e-06,
607
- "loss": 0.0026,
608
- "step": 2425
609
- },
610
- {
611
- "epoch": 16.02,
612
- "learning_rate": 5.671111111111112e-06,
613
- "loss": 0.0024,
614
- "step": 2450
615
- },
616
- {
617
- "epoch": 16.02,
618
- "learning_rate": 5.615555555555556e-06,
619
- "loss": 0.0021,
620
- "step": 2475
621
- },
622
- {
623
- "epoch": 17.0,
624
- "learning_rate": 5.560000000000001e-06,
625
- "loss": 0.0021,
626
- "step": 2500
627
- },
628
- {
629
- "epoch": 17.01,
630
- "learning_rate": 5.504444444444444e-06,
631
- "loss": 0.0021,
632
- "step": 2525
633
- },
634
- {
635
- "epoch": 17.01,
636
- "learning_rate": 5.448888888888889e-06,
637
- "loss": 0.0022,
638
- "step": 2550
639
- },
640
- {
641
- "epoch": 17.02,
642
- "learning_rate": 5.393333333333334e-06,
643
- "loss": 0.0024,
644
- "step": 2575
645
- },
646
- {
647
- "epoch": 17.02,
648
- "learning_rate": 5.337777777777779e-06,
649
- "loss": 0.0021,
650
- "step": 2600
651
- },
652
- {
653
- "epoch": 17.03,
654
- "learning_rate": 5.282222222222223e-06,
655
- "loss": 0.0019,
656
- "step": 2625
657
- },
658
- {
659
- "epoch": 18.0,
660
- "learning_rate": 5.226666666666667e-06,
661
- "loss": 0.0018,
662
- "step": 2650
663
- },
664
- {
665
- "epoch": 18.01,
666
- "learning_rate": 5.171111111111111e-06,
667
- "loss": 0.0019,
668
- "step": 2675
669
- },
670
- {
671
- "epoch": 18.01,
672
- "learning_rate": 5.115555555555556e-06,
673
- "loss": 0.0021,
674
- "step": 2700
675
- },
676
- {
677
- "epoch": 18.02,
678
- "learning_rate": 5.060000000000001e-06,
679
- "loss": 0.002,
680
- "step": 2725
681
- },
682
- {
683
- "epoch": 18.02,
684
- "learning_rate": 5.004444444444445e-06,
685
- "loss": 0.0019,
686
- "step": 2750
687
- },
688
- {
689
- "epoch": 18.03,
690
- "learning_rate": 4.94888888888889e-06,
691
- "loss": 0.0018,
692
- "step": 2775
693
- },
694
- {
695
- "epoch": 19.0,
696
- "learning_rate": 4.893333333333334e-06,
697
- "loss": 0.0017,
698
- "step": 2800
699
- },
700
- {
701
- "epoch": 19.01,
702
- "learning_rate": 4.837777777777778e-06,
703
- "loss": 0.0018,
704
- "step": 2825
705
- },
706
- {
707
- "epoch": 19.01,
708
- "learning_rate": 4.7822222222222226e-06,
709
- "loss": 0.0019,
710
- "step": 2850
711
- },
712
- {
713
- "epoch": 19.02,
714
- "learning_rate": 4.7266666666666674e-06,
715
- "loss": 0.0019,
716
- "step": 2875
717
- },
718
- {
719
- "epoch": 19.02,
720
- "learning_rate": 4.6711111111111115e-06,
721
- "loss": 0.0018,
722
- "step": 2900
723
- },
724
- {
725
- "epoch": 19.03,
726
- "learning_rate": 4.6155555555555555e-06,
727
- "loss": 0.0016,
728
- "step": 2925
729
- },
730
- {
731
- "epoch": 20.0,
732
- "learning_rate": 4.56e-06,
733
- "loss": 0.0017,
734
- "step": 2950
735
- },
736
- {
737
- "epoch": 20.01,
738
- "learning_rate": 4.504444444444444e-06,
739
- "loss": 0.0017,
740
- "step": 2975
741
- },
742
- {
743
- "epoch": 20.01,
744
- "learning_rate": 4.448888888888889e-06,
745
- "loss": 0.0018,
746
- "step": 3000
747
- },
748
- {
749
- "epoch": 20.01,
750
- "eval_loss": 0.8846777677536011,
751
- "eval_runtime": 6554.8609,
752
- "eval_samples_per_second": 1.207,
753
- "eval_steps_per_second": 0.038,
754
- "eval_wer": 34.695203575985786,
755
- "step": 3000
756
- },
757
- {
758
- "epoch": 20.02,
759
- "learning_rate": 4.393333333333334e-06,
760
- "loss": 0.0018,
761
- "step": 3025
762
- },
763
- {
764
- "epoch": 20.02,
765
- "learning_rate": 4.337777777777778e-06,
766
- "loss": 0.0017,
767
- "step": 3050
768
- },
769
- {
770
- "epoch": 20.03,
771
- "learning_rate": 4.282222222222222e-06,
772
- "loss": 0.0016,
773
- "step": 3075
774
- },
775
- {
776
- "epoch": 21.0,
777
- "learning_rate": 4.226666666666667e-06,
778
- "loss": 0.0016,
779
- "step": 3100
780
- },
781
- {
782
- "epoch": 21.01,
783
- "learning_rate": 4.171111111111111e-06,
784
- "loss": 0.0017,
785
- "step": 3125
786
- },
787
- {
788
- "epoch": 21.01,
789
- "learning_rate": 4.115555555555556e-06,
790
- "loss": 0.0017,
791
- "step": 3150
792
- },
793
- {
794
- "epoch": 21.02,
795
- "learning_rate": 4.060000000000001e-06,
796
- "loss": 0.0017,
797
- "step": 3175
798
- },
799
- {
800
- "epoch": 21.02,
801
- "learning_rate": 4.004444444444445e-06,
802
- "loss": 0.0016,
803
- "step": 3200
804
- },
805
- {
806
- "epoch": 21.03,
807
- "learning_rate": 3.948888888888889e-06,
808
- "loss": 0.0015,
809
- "step": 3225
810
- },
811
- {
812
- "epoch": 22.0,
813
- "learning_rate": 3.893333333333333e-06,
814
- "loss": 0.0015,
815
- "step": 3250
816
- },
817
- {
818
- "epoch": 22.01,
819
- "learning_rate": 3.837777777777778e-06,
820
- "loss": 0.0017,
821
- "step": 3275
822
- },
823
- {
824
- "epoch": 22.01,
825
- "learning_rate": 3.782222222222223e-06,
826
- "loss": 0.0016,
827
- "step": 3300
828
- },
829
- {
830
- "epoch": 22.02,
831
- "learning_rate": 3.726666666666667e-06,
832
- "loss": 0.0017,
833
- "step": 3325
834
- },
835
- {
836
- "epoch": 22.02,
837
- "learning_rate": 3.6711111111111113e-06,
838
- "loss": 0.0015,
839
- "step": 3350
840
- },
841
- {
842
- "epoch": 22.03,
843
- "learning_rate": 3.615555555555556e-06,
844
- "loss": 0.0015,
845
- "step": 3375
846
- },
847
- {
848
- "epoch": 23.0,
849
- "learning_rate": 3.5600000000000002e-06,
850
- "loss": 0.0015,
851
- "step": 3400
852
- },
853
- {
854
- "epoch": 23.01,
855
- "learning_rate": 3.5044444444444447e-06,
856
- "loss": 0.0016,
857
- "step": 3425
858
- },
859
- {
860
- "epoch": 23.01,
861
- "learning_rate": 3.4488888888888896e-06,
862
- "loss": 0.0016,
863
- "step": 3450
864
- },
865
- {
866
- "epoch": 23.02,
867
- "learning_rate": 3.3933333333333336e-06,
868
- "loss": 0.0016,
869
- "step": 3475
870
- },
871
- {
872
- "epoch": 23.02,
873
- "learning_rate": 3.337777777777778e-06,
874
- "loss": 0.0015,
875
- "step": 3500
876
- },
877
- {
878
- "epoch": 23.03,
879
- "learning_rate": 3.282222222222223e-06,
880
- "loss": 0.0014,
881
- "step": 3525
882
- },
883
- {
884
- "epoch": 24.0,
885
- "learning_rate": 3.226666666666667e-06,
886
- "loss": 0.0014,
887
- "step": 3550
888
- },
889
- {
890
- "epoch": 24.01,
891
- "learning_rate": 3.1711111111111114e-06,
892
- "loss": 0.0015,
893
- "step": 3575
894
- },
895
- {
896
- "epoch": 24.01,
897
- "learning_rate": 3.1155555555555555e-06,
898
- "loss": 0.0015,
899
- "step": 3600
900
- },
901
- {
902
- "epoch": 24.02,
903
- "learning_rate": 3.0600000000000003e-06,
904
- "loss": 0.0015,
905
- "step": 3625
906
- },
907
- {
908
- "epoch": 24.02,
909
- "learning_rate": 3.004444444444445e-06,
910
- "loss": 0.0014,
911
- "step": 3650
912
- },
913
- {
914
- "epoch": 24.03,
915
- "learning_rate": 2.948888888888889e-06,
916
- "loss": 0.0014,
917
- "step": 3675
918
- },
919
- {
920
- "epoch": 25.0,
921
- "learning_rate": 2.8933333333333337e-06,
922
- "loss": 0.0014,
923
- "step": 3700
924
- },
925
- {
926
- "epoch": 25.01,
927
- "learning_rate": 2.837777777777778e-06,
928
- "loss": 0.0015,
929
- "step": 3725
930
- },
931
- {
932
- "epoch": 25.02,
933
- "learning_rate": 2.7822222222222222e-06,
934
- "loss": 0.0015,
935
- "step": 3750
936
- },
937
- {
938
- "epoch": 25.02,
939
- "learning_rate": 2.726666666666667e-06,
940
- "loss": 0.0015,
941
- "step": 3775
942
- },
943
- {
944
- "epoch": 25.02,
945
- "learning_rate": 2.6711111111111116e-06,
946
- "loss": 0.0013,
947
- "step": 3800
948
- },
949
- {
950
- "epoch": 26.0,
951
- "learning_rate": 2.6155555555555556e-06,
952
- "loss": 0.0013,
953
- "step": 3825
954
- },
955
- {
956
- "epoch": 26.01,
957
- "learning_rate": 2.56e-06,
958
- "loss": 0.0014,
959
- "step": 3850
960
- },
961
- {
962
- "epoch": 26.01,
963
- "learning_rate": 2.504444444444445e-06,
964
- "loss": 0.0015,
965
- "step": 3875
966
- },
967
- {
968
- "epoch": 26.02,
969
- "learning_rate": 2.448888888888889e-06,
970
- "loss": 0.0014,
971
- "step": 3900
972
- },
973
- {
974
- "epoch": 26.02,
975
- "learning_rate": 2.3933333333333334e-06,
976
- "loss": 0.0014,
977
- "step": 3925
978
- },
979
- {
980
- "epoch": 26.03,
981
- "learning_rate": 2.337777777777778e-06,
982
- "loss": 0.0013,
983
- "step": 3950
984
- },
985
- {
986
- "epoch": 27.0,
987
- "learning_rate": 2.2822222222222223e-06,
988
- "loss": 0.0013,
989
- "step": 3975
990
- },
991
- {
992
- "epoch": 27.01,
993
- "learning_rate": 2.226666666666667e-06,
994
- "loss": 0.0014,
995
- "step": 4000
996
- },
997
- {
998
- "epoch": 27.01,
999
- "eval_loss": 1.016266107559204,
1000
- "eval_runtime": 6945.4562,
1001
- "eval_samples_per_second": 1.139,
1002
- "eval_steps_per_second": 0.036,
1003
- "eval_wer": 37.20105526025317,
1004
- "step": 4000
1005
- },
1006
- {
1007
- "epoch": 27.01,
1008
- "learning_rate": 2.1711111111111113e-06,
1009
- "loss": 0.0011,
1010
- "step": 4025
1011
- },
1012
- {
1013
- "epoch": 27.02,
1014
- "learning_rate": 2.1155555555555557e-06,
1015
- "loss": 0.001,
1016
- "step": 4050
1017
- },
1018
- {
1019
- "epoch": 27.02,
1020
- "learning_rate": 2.06e-06,
1021
- "loss": 0.0009,
1022
- "step": 4075
1023
- },
1024
- {
1025
- "epoch": 27.03,
1026
- "learning_rate": 2.0044444444444446e-06,
1027
- "loss": 0.0008,
1028
- "step": 4100
1029
- },
1030
- {
1031
- "epoch": 28.0,
1032
- "learning_rate": 1.948888888888889e-06,
1033
- "loss": 0.0008,
1034
- "step": 4125
1035
- },
1036
- {
1037
- "epoch": 28.01,
1038
- "learning_rate": 1.8933333333333333e-06,
1039
- "loss": 0.0009,
1040
- "step": 4150
1041
- },
1042
- {
1043
- "epoch": 28.01,
1044
- "learning_rate": 1.837777777777778e-06,
1045
- "loss": 0.0009,
1046
- "step": 4175
1047
- },
1048
- {
1049
- "epoch": 28.02,
1050
- "learning_rate": 1.7822222222222225e-06,
1051
- "loss": 0.0009,
1052
- "step": 4200
1053
- },
1054
- {
1055
- "epoch": 28.02,
1056
- "learning_rate": 1.7266666666666667e-06,
1057
- "loss": 0.0009,
1058
- "step": 4225
1059
- },
1060
- {
1061
- "epoch": 28.03,
1062
- "learning_rate": 1.6711111111111112e-06,
1063
- "loss": 0.0008,
1064
- "step": 4250
1065
- },
1066
- {
1067
- "epoch": 29.0,
1068
- "learning_rate": 1.6155555555555559e-06,
1069
- "loss": 0.0008,
1070
- "step": 4275
1071
- },
1072
- {
1073
- "epoch": 29.01,
1074
- "learning_rate": 1.56e-06,
1075
- "loss": 0.0008,
1076
- "step": 4300
1077
- },
1078
- {
1079
- "epoch": 29.01,
1080
- "learning_rate": 1.5044444444444446e-06,
1081
- "loss": 0.0008,
1082
- "step": 4325
1083
- },
1084
- {
1085
- "epoch": 29.02,
1086
- "learning_rate": 1.4488888888888892e-06,
1087
- "loss": 0.0009,
1088
- "step": 4350
1089
- },
1090
- {
1091
- "epoch": 29.02,
1092
- "learning_rate": 1.3933333333333335e-06,
1093
- "loss": 0.0008,
1094
- "step": 4375
1095
- },
1096
- {
1097
- "epoch": 29.03,
1098
- "learning_rate": 1.337777777777778e-06,
1099
- "loss": 0.0007,
1100
- "step": 4400
1101
- },
1102
- {
1103
- "epoch": 30.0,
1104
- "learning_rate": 1.2822222222222222e-06,
1105
- "loss": 0.0008,
1106
- "step": 4425
1107
- },
1108
- {
1109
- "epoch": 30.01,
1110
- "learning_rate": 1.2266666666666666e-06,
1111
- "loss": 0.0008,
1112
- "step": 4450
1113
- },
1114
- {
1115
- "epoch": 30.01,
1116
- "learning_rate": 1.171111111111111e-06,
1117
- "loss": 0.0008,
1118
- "step": 4475
1119
- },
1120
- {
1121
- "epoch": 30.02,
1122
- "learning_rate": 1.1155555555555558e-06,
1123
- "loss": 0.0008,
1124
- "step": 4500
1125
- },
1126
- {
1127
- "epoch": 30.02,
1128
- "learning_rate": 1.06e-06,
1129
- "loss": 0.0008,
1130
- "step": 4525
1131
- },
1132
- {
1133
- "epoch": 30.03,
1134
- "learning_rate": 1.0044444444444445e-06,
1135
- "loss": 0.0007,
1136
- "step": 4550
1137
- },
1138
- {
1139
- "epoch": 31.0,
1140
- "learning_rate": 9.488888888888889e-07,
1141
- "loss": 0.0007,
1142
- "step": 4575
1143
- },
1144
- {
1145
- "epoch": 31.01,
1146
- "learning_rate": 8.933333333333334e-07,
1147
- "loss": 0.0008,
1148
- "step": 4600
1149
- },
1150
- {
1151
- "epoch": 31.01,
1152
- "learning_rate": 8.37777777777778e-07,
1153
- "loss": 0.0008,
1154
- "step": 4625
1155
- },
1156
- {
1157
- "epoch": 31.02,
1158
- "learning_rate": 7.822222222222223e-07,
1159
- "loss": 0.0008,
1160
- "step": 4650
1161
- },
1162
- {
1163
- "epoch": 31.02,
1164
- "learning_rate": 7.266666666666668e-07,
1165
- "loss": 0.0008,
1166
- "step": 4675
1167
- },
1168
- {
1169
- "epoch": 31.03,
1170
- "learning_rate": 6.711111111111111e-07,
1171
- "loss": 0.0007,
1172
- "step": 4700
1173
- },
1174
- {
1175
- "epoch": 32.0,
1176
- "learning_rate": 6.155555555555556e-07,
1177
- "loss": 0.0007,
1178
- "step": 4725
1179
- },
1180
- {
1181
- "epoch": 32.01,
1182
- "learning_rate": 5.6e-07,
1183
- "loss": 0.0008,
1184
- "step": 4750
1185
- },
1186
- {
1187
- "epoch": 32.01,
1188
- "learning_rate": 5.044444444444445e-07,
1189
- "loss": 0.0008,
1190
- "step": 4775
1191
- },
1192
- {
1193
- "epoch": 32.02,
1194
- "learning_rate": 4.488888888888889e-07,
1195
- "loss": 0.0008,
1196
- "step": 4800
1197
- },
1198
- {
1199
- "epoch": 32.02,
1200
- "learning_rate": 3.9333333333333336e-07,
1201
- "loss": 0.0008,
1202
- "step": 4825
1203
- },
1204
- {
1205
- "epoch": 32.03,
1206
- "learning_rate": 3.3777777777777777e-07,
1207
- "loss": 0.0007,
1208
- "step": 4850
1209
- },
1210
- {
1211
- "epoch": 33.0,
1212
- "learning_rate": 2.822222222222222e-07,
1213
- "loss": 0.0007,
1214
- "step": 4875
1215
- },
1216
- {
1217
- "epoch": 33.01,
1218
- "learning_rate": 2.266666666666667e-07,
1219
- "loss": 0.0008,
1220
- "step": 4900
1221
- },
1222
- {
1223
- "epoch": 33.01,
1224
- "learning_rate": 1.7111111111111114e-07,
1225
- "loss": 0.0008,
1226
- "step": 4925
1227
- },
1228
- {
1229
- "epoch": 33.02,
1230
- "learning_rate": 1.1555555555555556e-07,
1231
- "loss": 0.0008,
1232
- "step": 4950
1233
- },
1234
- {
1235
- "epoch": 33.02,
1236
- "learning_rate": 6.000000000000001e-08,
1237
- "loss": 0.0007,
1238
- "step": 4975
1239
- },
1240
- {
1241
- "epoch": 34.0,
1242
- "learning_rate": 4.444444444444445e-09,
1243
- "loss": 0.0007,
1244
- "step": 5000
1245
- },
1246
- {
1247
- "epoch": 34.0,
1248
- "eval_loss": 1.0548938512802124,
1249
- "eval_runtime": 6875.5586,
1250
- "eval_samples_per_second": 1.151,
1251
- "eval_steps_per_second": 0.036,
1252
- "eval_wer": 37.255853641352424,
1253
- "step": 5000
1254
- },
1255
- {
1256
- "epoch": 34.0,
1257
- "step": 5000,
1258
- "total_flos": 9.221977335545856e+19,
1259
- "train_loss": 0.11123611520007252,
1260
- "train_runtime": 97978.5764,
1261
- "train_samples_per_second": 3.266,
1262
- "train_steps_per_second": 0.051
1263
- }
1264
- ],
1265
- "max_steps": 5000,
1266
- "num_train_epochs": 9223372036854775807,
1267
- "total_flos": 9.221977335545856e+19,
1268
- "trial_name": null,
1269
- "trial_params": null
1270
- }
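The deleted trainer_state.json is mostly the Trainer's `log_history`: one entry per 25-step logging interval plus an evaluation entry every 1000 steps. A short sketch, assuming a local copy of the old file, that pulls out only the evaluation rows (the source of the WER figures in the deleted model card):

```python
import json

with open("trainer_state.json") as f:
    state = json.load(f)

# Evaluation entries carry eval_* keys; plain training logs only have loss and learning_rate.
for entry in state["log_history"]:
    if "eval_wer" in entry:
        print(f'step {entry["step"]:>5}: eval_loss {entry["eval_loss"]:.4f}  WER {entry["eval_wer"]:.2f}')
```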
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:26c1a453988fda935770eaf75f250da376fe7af5e3fe223d78507eb2605e6450
+ oid sha256:ef358c6b774022254eb4f6934c5c403f823e7e24a501523ba3622c4894230724
  size 3631