fsicoli committed on
Commit
bb50e9e
1 Parent(s): 417d66d

Upload 72 files

This view is limited to 50 files because it contains too many changes.
Files changed (50)
  1. README.md +85 -0
  2. added_tokens.json +1609 -0
  3. all_results.json +14 -0
  4. checkpoint-1000/config.json +51 -0
  5. checkpoint-1000/generation_config.json +248 -0
  6. checkpoint-1000/model.safetensors +3 -0
  7. checkpoint-1000/optimizer.pt +3 -0
  8. checkpoint-1000/preprocessor_config.json +14 -0
  9. checkpoint-1000/rng_state_0.pth +3 -0
  10. checkpoint-1000/rng_state_1.pth +3 -0
  11. checkpoint-1000/scheduler.pt +3 -0
  12. checkpoint-1000/trainer_state.json +310 -0
  13. checkpoint-1000/training_args.bin +3 -0
  14. checkpoint-2000/config.json +51 -0
  15. checkpoint-2000/generation_config.json +248 -0
  16. checkpoint-2000/model.safetensors +3 -0
  17. checkpoint-2000/optimizer.pt +3 -0
  18. checkpoint-2000/preprocessor_config.json +14 -0
  19. checkpoint-2000/rng_state_0.pth +3 -0
  20. checkpoint-2000/rng_state_1.pth +3 -0
  21. checkpoint-2000/scheduler.pt +3 -0
  22. checkpoint-2000/trainer_state.json +599 -0
  23. checkpoint-2000/training_args.bin +3 -0
  24. checkpoint-3000/config.json +51 -0
  25. checkpoint-3000/generation_config.json +248 -0
  26. checkpoint-3000/model.safetensors +3 -0
  27. checkpoint-3000/optimizer.pt +3 -0
  28. checkpoint-3000/preprocessor_config.json +14 -0
  29. checkpoint-3000/rng_state_0.pth +3 -0
  30. checkpoint-3000/rng_state_1.pth +3 -0
  31. checkpoint-3000/scheduler.pt +3 -0
  32. checkpoint-3000/trainer_state.json +888 -0
  33. checkpoint-3000/training_args.bin +3 -0
  34. checkpoint-4000/config.json +51 -0
  35. checkpoint-4000/generation_config.json +248 -0
  36. checkpoint-4000/model.safetensors +3 -0
  37. checkpoint-4000/optimizer.pt +3 -0
  38. checkpoint-4000/preprocessor_config.json +14 -0
  39. checkpoint-4000/rng_state_0.pth +3 -0
  40. checkpoint-4000/rng_state_1.pth +3 -0
  41. checkpoint-4000/scheduler.pt +3 -0
  42. checkpoint-4000/trainer_state.json +1177 -0
  43. checkpoint-4000/training_args.bin +3 -0
  44. checkpoint-5000/config.json +51 -0
  45. checkpoint-5000/generation_config.json +248 -0
  46. checkpoint-5000/model.safetensors +3 -0
  47. checkpoint-5000/optimizer.pt +3 -0
  48. checkpoint-5000/preprocessor_config.json +14 -0
  49. checkpoint-5000/rng_state_0.pth +3 -0
  50. checkpoint-5000/rng_state_1.pth +3 -0
README.md ADDED
@@ -0,0 +1,85 @@
---
license: apache-2.0
base_model: openai/whisper-medium
tags:
- generated_from_trainer
datasets:
- mozilla-foundation/common_voice_16_1
metrics:
- wer
model-index:
- name: whisper-large-v3-pt-cv16-fleurs
  results:
  - task:
      name: Automatic Speech Recognition
      type: automatic-speech-recognition
    dataset:
      name: mozilla-foundation/common_voice_16_1 pt
      type: mozilla-foundation/common_voice_16_1
      config: pt
      split: test
      args: pt
    metrics:
    - name: Wer
      type: wer
      value: 0.11905377038591959
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# whisper-large-v3-pt-cv16-fleurs

This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the mozilla-foundation/common_voice_16_1 pt dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1975
- Wer: 0.1191

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-06
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 2
- total_eval_batch_size: 2
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- training_steps: 5000
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.2614        | 0.06  | 1000 | 0.2986          | 0.1466 |
| 0.2632        | 0.13  | 2000 | 0.2244          | 0.1316 |
| 0.1694        | 0.19  | 3000 | 0.2086          | 0.1234 |
| 0.1658        | 0.26  | 4000 | 0.1987          | 0.1205 |
| 0.1391        | 0.32  | 5000 | 0.1975          | 0.1191 |


### Framework versions

- Transformers 4.39.0.dev0
- Pytorch 2.2.1
- Datasets 2.16.1
- Tokenizers 0.15.2
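The WER figures above are word error rates: the word-level edit distance between the predicted and reference transcripts, divided by the number of reference words. A minimal sketch of that computation (a plain-Python Levenshtein over words; the function name and the Portuguese example sentence are illustrative, not from this repo):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref = reference.split()
    hyp = hypothesis.split()
    # Dynamic-programming edit distance over word sequences.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("o gato subiu no telhado", "o gato subiu telhado"))  # one deletion over five words -> 0.2
```

A WER of 0.1191 therefore means roughly one word in eight of the Common Voice test references is substituted, inserted, or deleted in the model's output.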
added_tokens.json ADDED
@@ -0,0 +1,1609 @@
{
  "<|0.00|>": 50364,
  "<|0.02|>": 50365,
  "<|0.04|>": 50366,
  "<|0.06|>": 50367,
  "<|0.08|>": 50368,
  "<|0.10|>": 50369,
  …
  "<|1.96|>": 50462,
  "<|1.98|>": 50463,
  "<|10.00|>": 50864,
  "<|10.02|>": 50865,
  …
  "<|26.88|>": 51708,
  "<|26.90|>": 51709,

(Diff truncated by the 50-file view. The entries are Whisper timestamp tokens in 0.02 s steps, listed in the file's lexicographic key order; each ID is 50364 plus the timestamp divided by 0.02.)
998
+ "<|26.92|>": 51710,
999
+ "<|26.94|>": 51711,
1000
+ "<|26.96|>": 51712,
1001
+ "<|26.98|>": 51713,
1002
+ "<|27.00|>": 51714,
1003
+ "<|27.02|>": 51715,
1004
+ "<|27.04|>": 51716,
1005
+ "<|27.06|>": 51717,
1006
+ "<|27.08|>": 51718,
1007
+ "<|27.10|>": 51719,
1008
+ "<|27.12|>": 51720,
1009
+ "<|27.14|>": 51721,
1010
+ "<|27.16|>": 51722,
1011
+ "<|27.18|>": 51723,
1012
+ "<|27.20|>": 51724,
1013
+ "<|27.22|>": 51725,
1014
+ "<|27.24|>": 51726,
1015
+ "<|27.26|>": 51727,
1016
+ "<|27.28|>": 51728,
1017
+ "<|27.30|>": 51729,
1018
+ "<|27.32|>": 51730,
1019
+ "<|27.34|>": 51731,
1020
+ "<|27.36|>": 51732,
1021
+ "<|27.38|>": 51733,
1022
+ "<|27.40|>": 51734,
1023
+ "<|27.42|>": 51735,
1024
+ "<|27.44|>": 51736,
1025
+ "<|27.46|>": 51737,
1026
+ "<|27.48|>": 51738,
1027
+ "<|27.50|>": 51739,
1028
+ "<|27.52|>": 51740,
1029
+ "<|27.54|>": 51741,
1030
+ "<|27.56|>": 51742,
1031
+ "<|27.58|>": 51743,
1032
+ "<|27.60|>": 51744,
1033
+ "<|27.62|>": 51745,
1034
+ "<|27.64|>": 51746,
1035
+ "<|27.66|>": 51747,
1036
+ "<|27.68|>": 51748,
1037
+ "<|27.70|>": 51749,
1038
+ "<|27.72|>": 51750,
1039
+ "<|27.74|>": 51751,
1040
+ "<|27.76|>": 51752,
1041
+ "<|27.78|>": 51753,
1042
+ "<|27.80|>": 51754,
1043
+ "<|27.82|>": 51755,
1044
+ "<|27.84|>": 51756,
1045
+ "<|27.86|>": 51757,
1046
+ "<|27.88|>": 51758,
1047
+ "<|27.90|>": 51759,
1048
+ "<|27.92|>": 51760,
1049
+ "<|27.94|>": 51761,
1050
+ "<|27.96|>": 51762,
1051
+ "<|27.98|>": 51763,
1052
+ "<|28.00|>": 51764,
1053
+ "<|28.02|>": 51765,
1054
+ "<|28.04|>": 51766,
1055
+ "<|28.06|>": 51767,
1056
+ "<|28.08|>": 51768,
1057
+ "<|28.10|>": 51769,
1058
+ "<|28.12|>": 51770,
1059
+ "<|28.14|>": 51771,
1060
+ "<|28.16|>": 51772,
1061
+ "<|28.18|>": 51773,
1062
+ "<|28.20|>": 51774,
1063
+ "<|28.22|>": 51775,
1064
+ "<|28.24|>": 51776,
1065
+ "<|28.26|>": 51777,
1066
+ "<|28.28|>": 51778,
1067
+ "<|28.30|>": 51779,
1068
+ "<|28.32|>": 51780,
1069
+ "<|28.34|>": 51781,
1070
+ "<|28.36|>": 51782,
1071
+ "<|28.38|>": 51783,
1072
+ "<|28.40|>": 51784,
1073
+ "<|28.42|>": 51785,
1074
+ "<|28.44|>": 51786,
1075
+ "<|28.46|>": 51787,
1076
+ "<|28.48|>": 51788,
1077
+ "<|28.50|>": 51789,
1078
+ "<|28.52|>": 51790,
1079
+ "<|28.54|>": 51791,
1080
+ "<|28.56|>": 51792,
1081
+ "<|28.58|>": 51793,
1082
+ "<|28.60|>": 51794,
1083
+ "<|28.62|>": 51795,
1084
+ "<|28.64|>": 51796,
1085
+ "<|28.66|>": 51797,
1086
+ "<|28.68|>": 51798,
1087
+ "<|28.70|>": 51799,
1088
+ "<|28.72|>": 51800,
1089
+ "<|28.74|>": 51801,
1090
+ "<|28.76|>": 51802,
1091
+ "<|28.78|>": 51803,
1092
+ "<|28.80|>": 51804,
1093
+ "<|28.82|>": 51805,
1094
+ "<|28.84|>": 51806,
1095
+ "<|28.86|>": 51807,
1096
+ "<|28.88|>": 51808,
1097
+ "<|28.90|>": 51809,
1098
+ "<|28.92|>": 51810,
1099
+ "<|28.94|>": 51811,
1100
+ "<|28.96|>": 51812,
1101
+ "<|28.98|>": 51813,
1102
+ "<|29.00|>": 51814,
1103
+ "<|29.02|>": 51815,
1104
+ "<|29.04|>": 51816,
1105
+ "<|29.06|>": 51817,
1106
+ "<|29.08|>": 51818,
1107
+ "<|29.10|>": 51819,
1108
+ "<|29.12|>": 51820,
1109
+ "<|29.14|>": 51821,
1110
+ "<|29.16|>": 51822,
1111
+ "<|29.18|>": 51823,
1112
+ "<|29.20|>": 51824,
1113
+ "<|29.22|>": 51825,
1114
+ "<|29.24|>": 51826,
1115
+ "<|29.26|>": 51827,
1116
+ "<|29.28|>": 51828,
1117
+ "<|29.30|>": 51829,
1118
+ "<|29.32|>": 51830,
1119
+ "<|29.34|>": 51831,
1120
+ "<|29.36|>": 51832,
1121
+ "<|29.38|>": 51833,
1122
+ "<|29.40|>": 51834,
1123
+ "<|29.42|>": 51835,
1124
+ "<|29.44|>": 51836,
1125
+ "<|29.46|>": 51837,
1126
+ "<|29.48|>": 51838,
1127
+ "<|29.50|>": 51839,
1128
+ "<|29.52|>": 51840,
1129
+ "<|29.54|>": 51841,
1130
+ "<|29.56|>": 51842,
1131
+ "<|29.58|>": 51843,
1132
+ "<|29.60|>": 51844,
1133
+ "<|29.62|>": 51845,
1134
+ "<|29.64|>": 51846,
1135
+ "<|29.66|>": 51847,
1136
+ "<|29.68|>": 51848,
1137
+ "<|29.70|>": 51849,
1138
+ "<|29.72|>": 51850,
1139
+ "<|29.74|>": 51851,
1140
+ "<|29.76|>": 51852,
1141
+ "<|29.78|>": 51853,
1142
+ "<|29.80|>": 51854,
1143
+ "<|29.82|>": 51855,
1144
+ "<|29.84|>": 51856,
1145
+ "<|29.86|>": 51857,
1146
+ "<|29.88|>": 51858,
1147
+ "<|29.90|>": 51859,
1148
+ "<|29.92|>": 51860,
1149
+ "<|29.94|>": 51861,
1150
+ "<|29.96|>": 51862,
1151
+ "<|29.98|>": 51863,
1152
+ "<|3.00|>": 50514,
1153
+ "<|3.02|>": 50515,
1154
+ "<|3.04|>": 50516,
1155
+ "<|3.06|>": 50517,
1156
+ "<|3.08|>": 50518,
1157
+ "<|3.10|>": 50519,
1158
+ "<|3.12|>": 50520,
1159
+ "<|3.14|>": 50521,
1160
+ "<|3.16|>": 50522,
1161
+ "<|3.18|>": 50523,
1162
+ "<|3.20|>": 50524,
1163
+ "<|3.22|>": 50525,
1164
+ "<|3.24|>": 50526,
1165
+ "<|3.26|>": 50527,
1166
+ "<|3.28|>": 50528,
1167
+ "<|3.30|>": 50529,
1168
+ "<|3.32|>": 50530,
1169
+ "<|3.34|>": 50531,
1170
+ "<|3.36|>": 50532,
1171
+ "<|3.38|>": 50533,
1172
+ "<|3.40|>": 50534,
1173
+ "<|3.42|>": 50535,
1174
+ "<|3.44|>": 50536,
1175
+ "<|3.46|>": 50537,
1176
+ "<|3.48|>": 50538,
1177
+ "<|3.50|>": 50539,
1178
+ "<|3.52|>": 50540,
1179
+ "<|3.54|>": 50541,
1180
+ "<|3.56|>": 50542,
1181
+ "<|3.58|>": 50543,
1182
+ "<|3.60|>": 50544,
1183
+ "<|3.62|>": 50545,
1184
+ "<|3.64|>": 50546,
1185
+ "<|3.66|>": 50547,
1186
+ "<|3.68|>": 50548,
1187
+ "<|3.70|>": 50549,
1188
+ "<|3.72|>": 50550,
1189
+ "<|3.74|>": 50551,
1190
+ "<|3.76|>": 50552,
1191
+ "<|3.78|>": 50553,
1192
+ "<|3.80|>": 50554,
1193
+ "<|3.82|>": 50555,
1194
+ "<|3.84|>": 50556,
1195
+ "<|3.86|>": 50557,
1196
+ "<|3.88|>": 50558,
1197
+ "<|3.90|>": 50559,
1198
+ "<|3.92|>": 50560,
1199
+ "<|3.94|>": 50561,
1200
+ "<|3.96|>": 50562,
1201
+ "<|3.98|>": 50563,
1202
+ "<|30.00|>": 51864,
1203
+ "<|4.00|>": 50564,
1204
+ "<|4.02|>": 50565,
1205
+ "<|4.04|>": 50566,
1206
+ "<|4.06|>": 50567,
1207
+ "<|4.08|>": 50568,
1208
+ "<|4.10|>": 50569,
1209
+ "<|4.12|>": 50570,
1210
+ "<|4.14|>": 50571,
1211
+ "<|4.16|>": 50572,
1212
+ "<|4.18|>": 50573,
1213
+ "<|4.20|>": 50574,
1214
+ "<|4.22|>": 50575,
1215
+ "<|4.24|>": 50576,
1216
+ "<|4.26|>": 50577,
1217
+ "<|4.28|>": 50578,
1218
+ "<|4.30|>": 50579,
1219
+ "<|4.32|>": 50580,
1220
+ "<|4.34|>": 50581,
1221
+ "<|4.36|>": 50582,
1222
+ "<|4.38|>": 50583,
1223
+ "<|4.40|>": 50584,
1224
+ "<|4.42|>": 50585,
1225
+ "<|4.44|>": 50586,
1226
+ "<|4.46|>": 50587,
1227
+ "<|4.48|>": 50588,
1228
+ "<|4.50|>": 50589,
1229
+ "<|4.52|>": 50590,
1230
+ "<|4.54|>": 50591,
1231
+ "<|4.56|>": 50592,
1232
+ "<|4.58|>": 50593,
1233
+ "<|4.60|>": 50594,
1234
+ "<|4.62|>": 50595,
1235
+ "<|4.64|>": 50596,
1236
+ "<|4.66|>": 50597,
1237
+ "<|4.68|>": 50598,
1238
+ "<|4.70|>": 50599,
1239
+ "<|4.72|>": 50600,
1240
+ "<|4.74|>": 50601,
1241
+ "<|4.76|>": 50602,
1242
+ "<|4.78|>": 50603,
1243
+ "<|4.80|>": 50604,
1244
+ "<|4.82|>": 50605,
1245
+ "<|4.84|>": 50606,
1246
+ "<|4.86|>": 50607,
1247
+ "<|4.88|>": 50608,
1248
+ "<|4.90|>": 50609,
1249
+ "<|4.92|>": 50610,
1250
+ "<|4.94|>": 50611,
1251
+ "<|4.96|>": 50612,
1252
+ "<|4.98|>": 50613,
1253
+ "<|5.00|>": 50614,
1254
+ "<|5.02|>": 50615,
1255
+ "<|5.04|>": 50616,
1256
+ "<|5.06|>": 50617,
1257
+ "<|5.08|>": 50618,
1258
+ "<|5.10|>": 50619,
1259
+ "<|5.12|>": 50620,
1260
+ "<|5.14|>": 50621,
1261
+ "<|5.16|>": 50622,
1262
+ "<|5.18|>": 50623,
1263
+ "<|5.20|>": 50624,
1264
+ "<|5.22|>": 50625,
1265
+ "<|5.24|>": 50626,
1266
+ "<|5.26|>": 50627,
1267
+ "<|5.28|>": 50628,
1268
+ "<|5.30|>": 50629,
1269
+ "<|5.32|>": 50630,
1270
+ "<|5.34|>": 50631,
1271
+ "<|5.36|>": 50632,
1272
+ "<|5.38|>": 50633,
1273
+ "<|5.40|>": 50634,
1274
+ "<|5.42|>": 50635,
1275
+ "<|5.44|>": 50636,
1276
+ "<|5.46|>": 50637,
1277
+ "<|5.48|>": 50638,
1278
+ "<|5.50|>": 50639,
1279
+ "<|5.52|>": 50640,
1280
+ "<|5.54|>": 50641,
1281
+ "<|5.56|>": 50642,
1282
+ "<|5.58|>": 50643,
1283
+ "<|5.60|>": 50644,
1284
+ "<|5.62|>": 50645,
1285
+ "<|5.64|>": 50646,
1286
+ "<|5.66|>": 50647,
1287
+ "<|5.68|>": 50648,
1288
+ "<|5.70|>": 50649,
1289
+ "<|5.72|>": 50650,
1290
+ "<|5.74|>": 50651,
1291
+ "<|5.76|>": 50652,
1292
+ "<|5.78|>": 50653,
1293
+ "<|5.80|>": 50654,
1294
+ "<|5.82|>": 50655,
1295
+ "<|5.84|>": 50656,
1296
+ "<|5.86|>": 50657,
1297
+ "<|5.88|>": 50658,
1298
+ "<|5.90|>": 50659,
1299
+ "<|5.92|>": 50660,
1300
+ "<|5.94|>": 50661,
1301
+ "<|5.96|>": 50662,
1302
+ "<|5.98|>": 50663,
1303
+ "<|6.00|>": 50664,
1304
+ "<|6.02|>": 50665,
1305
+ "<|6.04|>": 50666,
1306
+ "<|6.06|>": 50667,
1307
+ "<|6.08|>": 50668,
1308
+ "<|6.10|>": 50669,
1309
+ "<|6.12|>": 50670,
1310
+ "<|6.14|>": 50671,
1311
+ "<|6.16|>": 50672,
1312
+ "<|6.18|>": 50673,
1313
+ "<|6.20|>": 50674,
1314
+ "<|6.22|>": 50675,
1315
+ "<|6.24|>": 50676,
1316
+ "<|6.26|>": 50677,
1317
+ "<|6.28|>": 50678,
1318
+ "<|6.30|>": 50679,
1319
+ "<|6.32|>": 50680,
1320
+ "<|6.34|>": 50681,
1321
+ "<|6.36|>": 50682,
1322
+ "<|6.38|>": 50683,
1323
+ "<|6.40|>": 50684,
1324
+ "<|6.42|>": 50685,
1325
+ "<|6.44|>": 50686,
1326
+ "<|6.46|>": 50687,
1327
+ "<|6.48|>": 50688,
1328
+ "<|6.50|>": 50689,
1329
+ "<|6.52|>": 50690,
1330
+ "<|6.54|>": 50691,
1331
+ "<|6.56|>": 50692,
1332
+ "<|6.58|>": 50693,
1333
+ "<|6.60|>": 50694,
1334
+ "<|6.62|>": 50695,
1335
+ "<|6.64|>": 50696,
1336
+ "<|6.66|>": 50697,
1337
+ "<|6.68|>": 50698,
1338
+ "<|6.70|>": 50699,
1339
+ "<|6.72|>": 50700,
1340
+ "<|6.74|>": 50701,
1341
+ "<|6.76|>": 50702,
1342
+ "<|6.78|>": 50703,
1343
+ "<|6.80|>": 50704,
1344
+ "<|6.82|>": 50705,
1345
+ "<|6.84|>": 50706,
1346
+ "<|6.86|>": 50707,
1347
+ "<|6.88|>": 50708,
1348
+ "<|6.90|>": 50709,
1349
+ "<|6.92|>": 50710,
1350
+ "<|6.94|>": 50711,
1351
+ "<|6.96|>": 50712,
1352
+ "<|6.98|>": 50713,
1353
+ "<|7.00|>": 50714,
1354
+ "<|7.02|>": 50715,
1355
+ "<|7.04|>": 50716,
1356
+ "<|7.06|>": 50717,
1357
+ "<|7.08|>": 50718,
1358
+ "<|7.10|>": 50719,
1359
+ "<|7.12|>": 50720,
1360
+ "<|7.14|>": 50721,
1361
+ "<|7.16|>": 50722,
1362
+ "<|7.18|>": 50723,
1363
+ "<|7.20|>": 50724,
1364
+ "<|7.22|>": 50725,
1365
+ "<|7.24|>": 50726,
1366
+ "<|7.26|>": 50727,
1367
+ "<|7.28|>": 50728,
1368
+ "<|7.30|>": 50729,
1369
+ "<|7.32|>": 50730,
1370
+ "<|7.34|>": 50731,
1371
+ "<|7.36|>": 50732,
1372
+ "<|7.38|>": 50733,
1373
+ "<|7.40|>": 50734,
1374
+ "<|7.42|>": 50735,
1375
+ "<|7.44|>": 50736,
1376
+ "<|7.46|>": 50737,
1377
+ "<|7.48|>": 50738,
1378
+ "<|7.50|>": 50739,
1379
+ "<|7.52|>": 50740,
1380
+ "<|7.54|>": 50741,
1381
+ "<|7.56|>": 50742,
1382
+ "<|7.58|>": 50743,
1383
+ "<|7.60|>": 50744,
1384
+ "<|7.62|>": 50745,
1385
+ "<|7.64|>": 50746,
1386
+ "<|7.66|>": 50747,
1387
+ "<|7.68|>": 50748,
1388
+ "<|7.70|>": 50749,
1389
+ "<|7.72|>": 50750,
1390
+ "<|7.74|>": 50751,
1391
+ "<|7.76|>": 50752,
1392
+ "<|7.78|>": 50753,
1393
+ "<|7.80|>": 50754,
1394
+ "<|7.82|>": 50755,
1395
+ "<|7.84|>": 50756,
1396
+ "<|7.86|>": 50757,
1397
+ "<|7.88|>": 50758,
1398
+ "<|7.90|>": 50759,
1399
+ "<|7.92|>": 50760,
1400
+ "<|7.94|>": 50761,
1401
+ "<|7.96|>": 50762,
1402
+ "<|7.98|>": 50763,
1403
+ "<|8.00|>": 50764,
1404
+ "<|8.02|>": 50765,
1405
+ "<|8.04|>": 50766,
1406
+ "<|8.06|>": 50767,
1407
+ "<|8.08|>": 50768,
1408
+ "<|8.10|>": 50769,
1409
+ "<|8.12|>": 50770,
1410
+ "<|8.14|>": 50771,
1411
+ "<|8.16|>": 50772,
1412
+ "<|8.18|>": 50773,
1413
+ "<|8.20|>": 50774,
1414
+ "<|8.22|>": 50775,
1415
+ "<|8.24|>": 50776,
1416
+ "<|8.26|>": 50777,
1417
+ "<|8.28|>": 50778,
1418
+ "<|8.30|>": 50779,
1419
+ "<|8.32|>": 50780,
1420
+ "<|8.34|>": 50781,
1421
+ "<|8.36|>": 50782,
1422
+ "<|8.38|>": 50783,
1423
+ "<|8.40|>": 50784,
1424
+ "<|8.42|>": 50785,
1425
+ "<|8.44|>": 50786,
1426
+ "<|8.46|>": 50787,
1427
+ "<|8.48|>": 50788,
1428
+ "<|8.50|>": 50789,
1429
+ "<|8.52|>": 50790,
1430
+ "<|8.54|>": 50791,
1431
+ "<|8.56|>": 50792,
1432
+ "<|8.58|>": 50793,
1433
+ "<|8.60|>": 50794,
1434
+ "<|8.62|>": 50795,
1435
+ "<|8.64|>": 50796,
1436
+ "<|8.66|>": 50797,
1437
+ "<|8.68|>": 50798,
1438
+ "<|8.70|>": 50799,
1439
+ "<|8.72|>": 50800,
1440
+ "<|8.74|>": 50801,
1441
+ "<|8.76|>": 50802,
1442
+ "<|8.78|>": 50803,
1443
+ "<|8.80|>": 50804,
1444
+ "<|8.82|>": 50805,
1445
+ "<|8.84|>": 50806,
1446
+ "<|8.86|>": 50807,
1447
+ "<|8.88|>": 50808,
1448
+ "<|8.90|>": 50809,
1449
+ "<|8.92|>": 50810,
1450
+ "<|8.94|>": 50811,
1451
+ "<|8.96|>": 50812,
1452
+ "<|8.98|>": 50813,
1453
+ "<|9.00|>": 50814,
1454
+ "<|9.02|>": 50815,
1455
+ "<|9.04|>": 50816,
1456
+ "<|9.06|>": 50817,
1457
+ "<|9.08|>": 50818,
1458
+ "<|9.10|>": 50819,
1459
+ "<|9.12|>": 50820,
1460
+ "<|9.14|>": 50821,
1461
+ "<|9.16|>": 50822,
1462
+ "<|9.18|>": 50823,
1463
+ "<|9.20|>": 50824,
1464
+ "<|9.22|>": 50825,
1465
+ "<|9.24|>": 50826,
1466
+ "<|9.26|>": 50827,
1467
+ "<|9.28|>": 50828,
1468
+ "<|9.30|>": 50829,
1469
+ "<|9.32|>": 50830,
1470
+ "<|9.34|>": 50831,
1471
+ "<|9.36|>": 50832,
1472
+ "<|9.38|>": 50833,
1473
+ "<|9.40|>": 50834,
1474
+ "<|9.42|>": 50835,
1475
+ "<|9.44|>": 50836,
1476
+ "<|9.46|>": 50837,
1477
+ "<|9.48|>": 50838,
1478
+ "<|9.50|>": 50839,
1479
+ "<|9.52|>": 50840,
1480
+ "<|9.54|>": 50841,
1481
+ "<|9.56|>": 50842,
1482
+ "<|9.58|>": 50843,
1483
+ "<|9.60|>": 50844,
1484
+ "<|9.62|>": 50845,
1485
+ "<|9.64|>": 50846,
1486
+ "<|9.66|>": 50847,
1487
+ "<|9.68|>": 50848,
1488
+ "<|9.70|>": 50849,
1489
+ "<|9.72|>": 50850,
1490
+ "<|9.74|>": 50851,
1491
+ "<|9.76|>": 50852,
1492
+ "<|9.78|>": 50853,
1493
+ "<|9.80|>": 50854,
1494
+ "<|9.82|>": 50855,
1495
+ "<|9.84|>": 50856,
1496
+ "<|9.86|>": 50857,
1497
+ "<|9.88|>": 50858,
1498
+ "<|9.90|>": 50859,
1499
+ "<|9.92|>": 50860,
1500
+ "<|9.94|>": 50861,
1501
+ "<|9.96|>": 50862,
1502
+ "<|9.98|>": 50863,
1503
+ "<|af|>": 50327,
1504
+ "<|am|>": 50334,
1505
+ "<|ar|>": 50272,
1506
+ "<|as|>": 50350,
1507
+ "<|az|>": 50304,
1508
+ "<|ba|>": 50355,
1509
+ "<|be|>": 50330,
1510
+ "<|bg|>": 50292,
1511
+ "<|bn|>": 50302,
1512
+ "<|bo|>": 50347,
1513
+ "<|br|>": 50309,
1514
+ "<|bs|>": 50315,
1515
+ "<|ca|>": 50270,
1516
+ "<|cs|>": 50283,
1517
+ "<|cy|>": 50297,
1518
+ "<|da|>": 50285,
1519
+ "<|de|>": 50261,
1520
+ "<|el|>": 50281,
1521
+ "<|en|>": 50259,
1522
+ "<|es|>": 50262,
1523
+ "<|et|>": 50307,
1524
+ "<|eu|>": 50310,
1525
+ "<|fa|>": 50300,
1526
+ "<|fi|>": 50277,
1527
+ "<|fo|>": 50338,
1528
+ "<|fr|>": 50265,
1529
+ "<|gl|>": 50319,
1530
+ "<|gu|>": 50333,
1531
+ "<|haw|>": 50352,
1532
+ "<|ha|>": 50354,
1533
+ "<|he|>": 50279,
1534
+ "<|hi|>": 50276,
1535
+ "<|hr|>": 50291,
1536
+ "<|ht|>": 50339,
1537
+ "<|hu|>": 50286,
1538
+ "<|hy|>": 50312,
1539
+ "<|id|>": 50275,
1540
+ "<|is|>": 50311,
1541
+ "<|it|>": 50274,
1542
+ "<|ja|>": 50266,
1543
+ "<|jw|>": 50356,
1544
+ "<|ka|>": 50329,
1545
+ "<|kk|>": 50316,
1546
+ "<|km|>": 50323,
1547
+ "<|kn|>": 50306,
1548
+ "<|ko|>": 50264,
1549
+ "<|la|>": 50294,
1550
+ "<|lb|>": 50345,
1551
+ "<|ln|>": 50353,
1552
+ "<|lo|>": 50336,
1553
+ "<|lt|>": 50293,
1554
+ "<|lv|>": 50301,
1555
+ "<|mg|>": 50349,
1556
+ "<|mi|>": 50295,
1557
+ "<|mk|>": 50308,
1558
+ "<|ml|>": 50296,
1559
+ "<|mn|>": 50314,
1560
+ "<|mr|>": 50320,
1561
+ "<|ms|>": 50282,
1562
+ "<|mt|>": 50343,
1563
+ "<|my|>": 50346,
1564
+ "<|ne|>": 50313,
1565
+ "<|nl|>": 50271,
1566
+ "<|nn|>": 50342,
1567
+ "<|nocaptions|>": 50362,
1568
+ "<|notimestamps|>": 50363,
1569
+ "<|no|>": 50288,
1570
+ "<|oc|>": 50328,
1571
+ "<|pa|>": 50321,
1572
+ "<|pl|>": 50269,
1573
+ "<|ps|>": 50340,
1574
+ "<|pt|>": 50267,
1575
+ "<|ro|>": 50284,
1576
+ "<|ru|>": 50263,
1577
+ "<|sa|>": 50344,
1578
+ "<|sd|>": 50332,
1579
+ "<|si|>": 50322,
1580
+ "<|sk|>": 50298,
1581
+ "<|sl|>": 50305,
1582
+ "<|sn|>": 50324,
1583
+ "<|so|>": 50326,
1584
+ "<|sq|>": 50317,
1585
+ "<|sr|>": 50303,
1586
+ "<|startoflm|>": 50360,
1587
+ "<|startofprev|>": 50361,
1588
+ "<|startoftranscript|>": 50258,
1589
+ "<|su|>": 50357,
1590
+ "<|sv|>": 50273,
1591
+ "<|sw|>": 50318,
1592
+ "<|ta|>": 50287,
1593
+ "<|te|>": 50299,
1594
+ "<|tg|>": 50331,
1595
+ "<|th|>": 50289,
1596
+ "<|tk|>": 50341,
1597
+ "<|tl|>": 50348,
1598
+ "<|transcribe|>": 50359,
1599
+ "<|translate|>": 50358,
1600
+ "<|tr|>": 50268,
1601
+ "<|tt|>": 50351,
1602
+ "<|uk|>": 50280,
1603
+ "<|ur|>": 50290,
1604
+ "<|uz|>": 50337,
1605
+ "<|vi|>": 50278,
1606
+ "<|yi|>": 50335,
1607
+ "<|yo|>": 50325,
1608
+ "<|zh|>": 50260
1609
+ }
all_results.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "epoch": 0.32,
+ "eval_loss": 0.19753539562225342,
+ "eval_runtime": 7876.6978,
+ "eval_samples": 9414,
+ "eval_samples_per_second": 1.195,
+ "eval_steps_per_second": 0.598,
+ "eval_wer": 0.11905377038591959,
+ "train_loss": 0.32176169362068174,
+ "train_runtime": 49101.8688,
+ "train_samples": 31097,
+ "train_samples_per_second": 0.204,
+ "train_steps_per_second": 0.102
+ }
checkpoint-1000/config.json ADDED
@@ -0,0 +1,51 @@
+ {
+ "_name_or_path": "openai/whisper-medium",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 1024,
+ "decoder_attention_heads": 16,
+ "decoder_ffn_dim": 4096,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 24,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 16,
+ "encoder_ffn_dim": 4096,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 24,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 24,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.39.0.dev0",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
checkpoint-1000/generation_config.json ADDED
@@ -0,0 +1,248 @@
1
+ {
2
+ "alignment_heads": [
3
+ [
4
+ 13,
5
+ 15
6
+ ],
7
+ [
8
+ 15,
9
+ 4
10
+ ],
11
+ [
12
+ 15,
13
+ 15
14
+ ],
15
+ [
16
+ 16,
17
+ 1
18
+ ],
19
+ [
20
+ 20,
21
+ 0
22
+ ],
23
+ [
24
+ 23,
25
+ 4
26
+ ]
27
+ ],
28
+ "begin_suppress_tokens": [
29
+ 220,
30
+ 50257
31
+ ],
32
+ "bos_token_id": 50257,
33
+ "decoder_start_token_id": 50258,
34
+ "eos_token_id": 50257,
35
+ "forced_decoder_ids": [
36
+ [
37
+ 1,
38
+ null
39
+ ],
40
+ [
41
+ 2,
42
+ 50359
43
+ ]
44
+ ],
45
+ "is_multilingual": true,
46
+ "lang_to_id": {
47
+ "<|af|>": 50327,
48
+ "<|am|>": 50334,
49
+ "<|ar|>": 50272,
50
+ "<|as|>": 50350,
51
+ "<|az|>": 50304,
52
+ "<|ba|>": 50355,
53
+ "<|be|>": 50330,
54
+ "<|bg|>": 50292,
55
+ "<|bn|>": 50302,
56
+ "<|bo|>": 50347,
57
+ "<|br|>": 50309,
58
+ "<|bs|>": 50315,
59
+ "<|ca|>": 50270,
60
+ "<|cs|>": 50283,
61
+ "<|cy|>": 50297,
62
+ "<|da|>": 50285,
63
+ "<|de|>": 50261,
64
+ "<|el|>": 50281,
65
+ "<|en|>": 50259,
66
+ "<|es|>": 50262,
67
+ "<|et|>": 50307,
68
+ "<|eu|>": 50310,
69
+ "<|fa|>": 50300,
70
+ "<|fi|>": 50277,
71
+ "<|fo|>": 50338,
72
+ "<|fr|>": 50265,
73
+ "<|gl|>": 50319,
74
+ "<|gu|>": 50333,
75
+ "<|haw|>": 50352,
76
+ "<|ha|>": 50354,
77
+ "<|he|>": 50279,
78
+ "<|hi|>": 50276,
79
+ "<|hr|>": 50291,
80
+ "<|ht|>": 50339,
81
+ "<|hu|>": 50286,
82
+ "<|hy|>": 50312,
83
+ "<|id|>": 50275,
84
+ "<|is|>": 50311,
85
+ "<|it|>": 50274,
86
+ "<|ja|>": 50266,
87
+ "<|jw|>": 50356,
88
+ "<|ka|>": 50329,
89
+ "<|kk|>": 50316,
90
+ "<|km|>": 50323,
91
+ "<|kn|>": 50306,
92
+ "<|ko|>": 50264,
93
+ "<|la|>": 50294,
94
+ "<|lb|>": 50345,
95
+ "<|ln|>": 50353,
96
+ "<|lo|>": 50336,
97
+ "<|lt|>": 50293,
98
+ "<|lv|>": 50301,
99
+ "<|mg|>": 50349,
100
+ "<|mi|>": 50295,
101
+ "<|mk|>": 50308,
102
+ "<|ml|>": 50296,
103
+ "<|mn|>": 50314,
104
+ "<|mr|>": 50320,
105
+ "<|ms|>": 50282,
106
+ "<|mt|>": 50343,
107
+ "<|my|>": 50346,
108
+ "<|ne|>": 50313,
109
+ "<|nl|>": 50271,
110
+ "<|nn|>": 50342,
111
+ "<|no|>": 50288,
112
+ "<|oc|>": 50328,
113
+ "<|pa|>": 50321,
114
+ "<|pl|>": 50269,
115
+ "<|ps|>": 50340,
116
+ "<|pt|>": 50267,
117
+ "<|ro|>": 50284,
118
+ "<|ru|>": 50263,
119
+ "<|sa|>": 50344,
120
+ "<|sd|>": 50332,
121
+ "<|si|>": 50322,
122
+ "<|sk|>": 50298,
123
+ "<|sl|>": 50305,
124
+ "<|sn|>": 50324,
125
+ "<|so|>": 50326,
126
+ "<|sq|>": 50317,
127
+ "<|sr|>": 50303,
128
+ "<|su|>": 50357,
129
+ "<|sv|>": 50273,
130
+ "<|sw|>": 50318,
131
+ "<|ta|>": 50287,
132
+ "<|te|>": 50299,
133
+ "<|tg|>": 50331,
134
+ "<|th|>": 50289,
135
+ "<|tk|>": 50341,
136
+ "<|tl|>": 50348,
137
+ "<|tr|>": 50268,
138
+ "<|tt|>": 50351,
139
+ "<|uk|>": 50280,
140
+ "<|ur|>": 50290,
141
+ "<|uz|>": 50337,
142
+ "<|vi|>": 50278,
143
+ "<|yi|>": 50335,
144
+ "<|yo|>": 50325,
145
+ "<|zh|>": 50260
146
+ },
147
+ "max_initial_timestamp_index": 50,
148
+ "max_length": 448,
149
+ "no_timestamps_token_id": 50363,
150
+ "pad_token_id": 50257,
151
+ "prev_sot_token_id": 50361,
152
+ "return_timestamps": false,
153
+ "suppress_tokens": [
154
+ 1,
155
+ 2,
156
+ 7,
157
+ 8,
158
+ 9,
159
+ 10,
160
+ 14,
161
+ 25,
162
+ 26,
163
+ 27,
164
+ 28,
165
+ 29,
166
+ 31,
167
+ 58,
168
+ 59,
169
+ 60,
170
+ 61,
171
+ 62,
172
+ 63,
173
+ 90,
174
+ 91,
175
+ 92,
176
+ 93,
177
+ 359,
178
+ 503,
179
+ 522,
180
+ 542,
181
+ 873,
182
+ 893,
183
+ 902,
184
+ 918,
185
+ 922,
186
+ 931,
187
+ 1350,
188
+ 1853,
189
+ 1982,
190
+ 2460,
191
+ 2627,
192
+ 3246,
193
+ 3253,
194
+ 3268,
195
+ 3536,
196
+ 3846,
197
+ 3961,
198
+ 4183,
199
+ 4667,
200
+ 6585,
201
+ 6647,
202
+ 7273,
203
+ 9061,
204
+ 9383,
205
+ 10428,
206
+ 10929,
207
+ 11938,
208
+ 12033,
209
+ 12331,
210
+ 12562,
211
+ 13793,
212
+ 14157,
213
+ 14635,
214
+ 15265,
215
+ 15618,
216
+ 16553,
217
+ 16604,
218
+ 18362,
219
+ 18956,
220
+ 20075,
221
+ 21675,
222
+ 22520,
223
+ 26130,
224
+ 26161,
225
+ 26435,
226
+ 28279,
227
+ 29464,
228
+ 31650,
229
+ 32302,
230
+ 32470,
231
+ 36865,
232
+ 42863,
233
+ 47425,
234
+ 49870,
235
+ 50254,
236
+ 50258,
237
+ 50358,
238
+ 50359,
239
+ 50360,
240
+ 50361,
241
+ 50362
242
+ ],
243
+ "task_to_id": {
244
+ "transcribe": 50359,
245
+ "translate": 50358
246
+ },
247
+ "transformers_version": "4.39.0.dev0"
248
+ }
checkpoint-1000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1e0bedf8e7b9cd0952e56e16ab16358abe622a10abefb68d450abbfcb9b2a95f
+ size 3055544304
checkpoint-1000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:019e09b15fe12afea1465fbb3ef2ce53aa4f60ddf49a5fb7907f3a665779aa47
+ size 6099375168
checkpoint-1000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
checkpoint-1000/rng_state_0.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:df528c5d527661f823d282113926d0df3e36af7d1e7c60fd108fb3bf9f597a76
+ size 14512
checkpoint-1000/rng_state_1.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3c5e8c9bfda7c93dd1101f1a893002bcea94f2d4802b291c7c9a4c4ea449c257
+ size 14512
checkpoint-1000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:23fbc4302577877a0e0a4ca15f37438a3136a5a656b82bfb3078edcf0400b5c1
+ size 1064
checkpoint-1000/trainer_state.json ADDED
@@ -0,0 +1,310 @@
1
+ {
2
+ "best_metric": null,
3
+ "best_model_checkpoint": null,
4
+ "epoch": 0.06431281754453663,
5
+ "eval_steps": 1000,
6
+ "global_step": 1000,
7
+ "is_hyper_param_search": false,
8
+ "is_local_process_zero": true,
9
+ "is_world_process_zero": true,
10
+ "log_history": [
11
+ {
12
+ "epoch": 0.0,
13
+ "grad_norm": 58.440006256103516,
14
+ "learning_rate": 1e-08,
15
+ "loss": 1.3813,
16
+ "step": 25
17
+ },
18
+ {
19
+ "epoch": 0.0,
20
+ "grad_norm": 59.934757232666016,
21
+ "learning_rate": 2.2e-08,
22
+ "loss": 1.469,
23
+ "step": 50
24
+ },
25
+ {
26
+ "epoch": 0.0,
27
+ "grad_norm": 31.011018753051758,
28
+ "learning_rate": 3.4500000000000005e-08,
29
+ "loss": 1.2226,
30
+ "step": 75
31
+ },
32
+ {
33
+ "epoch": 0.01,
34
+ "grad_norm": 59.818233489990234,
35
+ "learning_rate": 4.7e-08,
36
+ "loss": 1.2458,
37
+ "step": 100
38
+ },
39
+ {
40
+ "epoch": 0.01,
41
+ "grad_norm": 60.51572036743164,
42
+ "learning_rate": 5.95e-08,
43
+ "loss": 1.2781,
44
+ "step": 125
45
+ },
46
+ {
47
+ "epoch": 0.01,
48
+ "grad_norm": 51.360103607177734,
49
+ "learning_rate": 7.2e-08,
50
+ "loss": 1.4055,
51
+ "step": 150
52
+ },
53
+ {
54
+ "epoch": 0.01,
55
+ "grad_norm": 73.35002136230469,
56
+ "learning_rate": 8.45e-08,
57
+ "loss": 1.3354,
58
+ "step": 175
59
+ },
60
+ {
61
+ "epoch": 0.01,
62
+ "grad_norm": 69.32823944091797,
63
+ "learning_rate": 9.7e-08,
64
+ "loss": 1.2005,
65
+ "step": 200
66
+ },
67
+ {
68
+ "epoch": 0.01,
69
+ "grad_norm": 51.02174377441406,
70
+ "learning_rate": 1.095e-07,
71
+ "loss": 1.3853,
72
+ "step": 225
73
+ },
74
+ {
75
+ "epoch": 0.02,
76
+ "grad_norm": 72.20179748535156,
77
+ "learning_rate": 1.2199999999999998e-07,
78
+ "loss": 1.4476,
79
+ "step": 250
80
+ },
81
+ {
82
+ "epoch": 0.02,
83
+ "grad_norm": 108.30382537841797,
84
+ "learning_rate": 1.345e-07,
85
+ "loss": 1.2339,
86
+ "step": 275
87
+ },
88
+ {
89
+ "epoch": 0.02,
90
+ "grad_norm": 66.15994262695312,
91
+ "learning_rate": 1.4699999999999998e-07,
92
+ "loss": 1.379,
93
+ "step": 300
94
+ },
95
+ {
96
+ "epoch": 0.02,
97
+ "grad_norm": 47.82923126220703,
98
+ "learning_rate": 1.595e-07,
99
+ "loss": 1.1467,
100
+ "step": 325
101
+ },
102
+ {
103
+ "epoch": 0.02,
104
+ "grad_norm": 85.7218246459961,
105
+ "learning_rate": 1.7199999999999998e-07,
106
+ "loss": 1.1622,
107
+ "step": 350
108
+ },
109
+ {
110
+ "epoch": 0.02,
111
+ "grad_norm": 68.25504302978516,
112
+ "learning_rate": 1.845e-07,
113
+ "loss": 1.1413,
114
+ "step": 375
115
+ },
116
+ {
117
+ "epoch": 0.03,
118
+ "grad_norm": 106.06077575683594,
119
+ "learning_rate": 1.97e-07,
120
+ "loss": 1.0855,
121
+ "step": 400
122
+ },
123
+ {
124
+ "epoch": 0.03,
125
+ "grad_norm": 79.60690307617188,
126
+ "learning_rate": 2.095e-07,
127
+ "loss": 0.929,
128
+ "step": 425
129
+ },
130
+ {
131
+ "epoch": 0.03,
132
+ "grad_norm": 42.14814376831055,
133
+ "learning_rate": 2.22e-07,
134
+ "loss": 0.8728,
135
+ "step": 450
136
+ },
137
+ {
138
+ "epoch": 0.03,
139
+ "grad_norm": 37.4913444519043,
140
+ "learning_rate": 2.3449999999999996e-07,
141
+ "loss": 0.6651,
142
+ "step": 475
143
+ },
144
+ {
145
+ "epoch": 0.03,
+ "grad_norm": 41.89991760253906, "learning_rate": 2.47e-07, "loss": 0.5875, "step": 500},
+ {"epoch": 0.03, "grad_norm": 75.21453094482422, "learning_rate": 2.595e-07, "loss": 0.6868, "step": 525},
+ {"epoch": 0.04, "grad_norm": 21.09180450439453, "learning_rate": 2.72e-07, "loss": 0.741, "step": 550},
+ {"epoch": 0.04, "grad_norm": 44.54707336425781, "learning_rate": 2.845e-07, "loss": 0.3898, "step": 575},
+ {"epoch": 0.04, "grad_norm": 31.656843185424805, "learning_rate": 2.9699999999999997e-07, "loss": 0.422, "step": 600},
+ {"epoch": 0.04, "grad_norm": 56.28642654418945, "learning_rate": 3.0949999999999996e-07, "loss": 0.3803, "step": 625},
+ {"epoch": 0.04, "grad_norm": 38.66410827636719, "learning_rate": 3.22e-07, "loss": 0.5062, "step": 650},
+ {"epoch": 0.04, "grad_norm": 31.183727264404297, "learning_rate": 3.345e-07, "loss": 0.4075, "step": 675},
+ {"epoch": 0.05, "grad_norm": 23.618703842163086, "learning_rate": 3.4699999999999997e-07, "loss": 0.3627, "step": 700},
+ {"epoch": 0.05, "grad_norm": 70.09487915039062, "learning_rate": 3.5949999999999996e-07, "loss": 0.3087, "step": 725},
+ {"epoch": 0.05, "grad_norm": 74.42188262939453, "learning_rate": 3.72e-07, "loss": 0.4021, "step": 750},
+ {"epoch": 0.05, "grad_norm": 44.99939727783203, "learning_rate": 3.845e-07, "loss": 0.3203, "step": 775},
+ {"epoch": 0.05, "grad_norm": 42.77998352050781, "learning_rate": 3.97e-07, "loss": 0.3797, "step": 800},
+ {"epoch": 0.05, "grad_norm": 64.61412811279297, "learning_rate": 4.0949999999999995e-07, "loss": 0.3403, "step": 825},
+ {"epoch": 0.05, "grad_norm": 29.286806106567383, "learning_rate": 4.2199999999999994e-07, "loss": 0.2879, "step": 850},
+ {"epoch": 0.06, "grad_norm": 58.146263122558594, "learning_rate": 4.345e-07, "loss": 0.4017, "step": 875},
+ {"epoch": 0.06, "grad_norm": 44.624202728271484, "learning_rate": 4.4699999999999997e-07, "loss": 0.3698, "step": 900},
+ {"epoch": 0.06, "grad_norm": 47.91656494140625, "learning_rate": 4.595e-07, "loss": 0.4008, "step": 925},
+ {"epoch": 0.06, "grad_norm": 36.263668060302734, "learning_rate": 4.7199999999999994e-07, "loss": 0.2041, "step": 950},
+ {"epoch": 0.06, "grad_norm": 12.398943901062012, "learning_rate": 4.845e-07, "loss": 0.2978, "step": 975},
+ {"epoch": 0.06, "grad_norm": 4.42283821105957, "learning_rate": 4.97e-07, "loss": 0.2614, "step": 1000},
+ {"epoch": 0.06, "eval_loss": 0.29864633083343506, "eval_runtime": 7667.7674, "eval_samples_per_second": 1.228, "eval_steps_per_second": 0.614, "eval_wer": 0.14664944291942517, "step": 1000}
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 1,
+ "save_steps": 1000,
+ "total_flos": 2.0412098362212352e+18,
+ "train_batch_size": 1,
+ "trial_name": null,
+ "trial_params": null
+ }
checkpoint-1000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:354821a80788dff5d057ffce4a4d80a406ce5bb0affa48cc6029ca3faa14edf2
+ size 5048
checkpoint-2000/config.json ADDED
@@ -0,0 +1,51 @@
+ {
+ "_name_or_path": "openai/whisper-medium",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [220, 50257],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 1024,
+ "decoder_attention_heads": 16,
+ "decoder_ffn_dim": 4096,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 24,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 16,
+ "encoder_ffn_dim": 4096,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 24,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 24,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.39.0.dev0",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
checkpoint-2000/generation_config.json ADDED
@@ -0,0 +1,248 @@
+ {
+ "alignment_heads": [[13, 15], [15, 4], [15, 15], [16, 1], [20, 0], [23, 4]],
+ "begin_suppress_tokens": [220, 50257],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [[1, null], [2, 50359]],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327, "<|am|>": 50334, "<|ar|>": 50272, "<|as|>": 50350, "<|az|>": 50304, "<|ba|>": 50355, "<|be|>": 50330, "<|bg|>": 50292, "<|bn|>": 50302, "<|bo|>": 50347,
+ "<|br|>": 50309, "<|bs|>": 50315, "<|ca|>": 50270, "<|cs|>": 50283, "<|cy|>": 50297, "<|da|>": 50285, "<|de|>": 50261, "<|el|>": 50281, "<|en|>": 50259, "<|es|>": 50262,
+ "<|et|>": 50307, "<|eu|>": 50310, "<|fa|>": 50300, "<|fi|>": 50277, "<|fo|>": 50338, "<|fr|>": 50265, "<|gl|>": 50319, "<|gu|>": 50333, "<|haw|>": 50352, "<|ha|>": 50354,
+ "<|he|>": 50279, "<|hi|>": 50276, "<|hr|>": 50291, "<|ht|>": 50339, "<|hu|>": 50286, "<|hy|>": 50312, "<|id|>": 50275, "<|is|>": 50311, "<|it|>": 50274, "<|ja|>": 50266,
+ "<|jw|>": 50356, "<|ka|>": 50329, "<|kk|>": 50316, "<|km|>": 50323, "<|kn|>": 50306, "<|ko|>": 50264, "<|la|>": 50294, "<|lb|>": 50345, "<|ln|>": 50353, "<|lo|>": 50336,
+ "<|lt|>": 50293, "<|lv|>": 50301, "<|mg|>": 50349, "<|mi|>": 50295, "<|mk|>": 50308, "<|ml|>": 50296, "<|mn|>": 50314, "<|mr|>": 50320, "<|ms|>": 50282, "<|mt|>": 50343,
+ "<|my|>": 50346, "<|ne|>": 50313, "<|nl|>": 50271, "<|nn|>": 50342, "<|no|>": 50288, "<|oc|>": 50328, "<|pa|>": 50321, "<|pl|>": 50269, "<|ps|>": 50340, "<|pt|>": 50267,
+ "<|ro|>": 50284, "<|ru|>": 50263, "<|sa|>": 50344, "<|sd|>": 50332, "<|si|>": 50322, "<|sk|>": 50298, "<|sl|>": 50305, "<|sn|>": 50324, "<|so|>": 50326, "<|sq|>": 50317,
+ "<|sr|>": 50303, "<|su|>": 50357, "<|sv|>": 50273, "<|sw|>": 50318, "<|ta|>": 50287, "<|te|>": 50299, "<|tg|>": 50331, "<|th|>": 50289, "<|tk|>": 50341, "<|tl|>": 50348,
+ "<|tr|>": 50268, "<|tt|>": 50351, "<|uk|>": 50280, "<|ur|>": 50290, "<|uz|>": 50337, "<|vi|>": 50278, "<|yi|>": 50335, "<|yo|>": 50325, "<|zh|>": 50260
+ },
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1, 2, 7, 8, 9, 10, 14, 25, 26, 27, 28, 29, 31, 58, 59, 60, 61, 62, 63, 90, 91, 92, 93, 359, 503, 522, 542, 873, 893, 902, 918, 922, 931, 1350, 1853, 1982, 2460, 2627,
+ 3246, 3253, 3268, 3536, 3846, 3961, 4183, 4667, 6585, 6647, 7273, 9061, 9383, 10428, 10929, 11938, 12033, 12331, 12562, 13793, 14157, 14635, 15265, 15618, 16553, 16604,
+ 18362, 18956, 20075, 21675, 22520, 26130, 26161, 26435, 28279, 29464, 31650, 32302, 32470, 36865, 42863, 47425, 49870, 50254, 50258, 50358, 50359, 50360, 50361, 50362
+ ],
+ "task_to_id": {"transcribe": 50359, "translate": 50358},
+ "transformers_version": "4.39.0.dev0"
+ }
checkpoint-2000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fcedba9c9e95f97e88e4dfe09bcfa72c1f4013a51282919c8c4a193e91daf85f
+ size 3055544304
checkpoint-2000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b543b1e36b9d7b2a757b27cc657ee82fd76a421af01d45102c2a1fa66a6084d0
+ size 6099375168
checkpoint-2000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
checkpoint-2000/rng_state_0.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8580ab37c23f0a0f3c1baabe4b6a4e789328843d26389b15775df688d64648e7
+ size 14512
checkpoint-2000/rng_state_1.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a0031f65afa9fc49af8ce3388f5e708e94c2e73e3de60dab6a7da06caf0b9654
+ size 14512
checkpoint-2000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:04c04d9b9dc8075de0e127d0d9cc62a5bdd2459cb835e4d52b71161b1dd07707
+ size 1064
checkpoint-2000/trainer_state.json ADDED
@@ -0,0 +1,599 @@
+ {
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 0.12862563508907326,
+ "eval_steps": 1000,
+ "global_step": 2000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {"epoch": 0.0, "grad_norm": 58.440006256103516, "learning_rate": 1e-08, "loss": 1.3813, "step": 25},
+ {"epoch": 0.0, "grad_norm": 59.934757232666016, "learning_rate": 2.2e-08, "loss": 1.469, "step": 50},
+ {"epoch": 0.0, "grad_norm": 31.011018753051758, "learning_rate": 3.4500000000000005e-08, "loss": 1.2226, "step": 75},
+ {"epoch": 0.01, "grad_norm": 59.818233489990234, "learning_rate": 4.7e-08, "loss": 1.2458, "step": 100},
+ {"epoch": 0.01, "grad_norm": 60.51572036743164, "learning_rate": 5.95e-08, "loss": 1.2781, "step": 125},
+ {"epoch": 0.01, "grad_norm": 51.360103607177734, "learning_rate": 7.2e-08, "loss": 1.4055, "step": 150},
+ {"epoch": 0.01, "grad_norm": 73.35002136230469, "learning_rate": 8.45e-08, "loss": 1.3354, "step": 175},
+ {"epoch": 0.01, "grad_norm": 69.32823944091797, "learning_rate": 9.7e-08, "loss": 1.2005, "step": 200},
+ {"epoch": 0.01, "grad_norm": 51.02174377441406, "learning_rate": 1.095e-07, "loss": 1.3853, "step": 225},
+ {"epoch": 0.02, "grad_norm": 72.20179748535156, "learning_rate": 1.2199999999999998e-07, "loss": 1.4476, "step": 250},
+ {"epoch": 0.02, "grad_norm": 108.30382537841797, "learning_rate": 1.345e-07, "loss": 1.2339, "step": 275},
+ {"epoch": 0.02, "grad_norm": 66.15994262695312, "learning_rate": 1.4699999999999998e-07, "loss": 1.379, "step": 300},
+ {"epoch": 0.02, "grad_norm": 47.82923126220703, "learning_rate": 1.595e-07, "loss": 1.1467, "step": 325},
+ {"epoch": 0.02, "grad_norm": 85.7218246459961, "learning_rate": 1.7199999999999998e-07, "loss": 1.1622, "step": 350},
+ {"epoch": 0.02, "grad_norm": 68.25504302978516, "learning_rate": 1.845e-07, "loss": 1.1413, "step": 375},
+ {"epoch": 0.03, "grad_norm": 106.06077575683594, "learning_rate": 1.97e-07, "loss": 1.0855, "step": 400},
+ {"epoch": 0.03, "grad_norm": 79.60690307617188, "learning_rate": 2.095e-07, "loss": 0.929, "step": 425},
+ {"epoch": 0.03, "grad_norm": 42.14814376831055, "learning_rate": 2.22e-07, "loss": 0.8728, "step": 450},
+ {"epoch": 0.03, "grad_norm": 37.4913444519043, "learning_rate": 2.3449999999999996e-07, "loss": 0.6651, "step": 475},
+ {"epoch": 0.03, "grad_norm": 41.89991760253906, "learning_rate": 2.47e-07, "loss": 0.5875, "step": 500},
+ {"epoch": 0.03, "grad_norm": 75.21453094482422, "learning_rate": 2.595e-07, "loss": 0.6868, "step": 525},
+ {"epoch": 0.04, "grad_norm": 21.09180450439453, "learning_rate": 2.72e-07, "loss": 0.741, "step": 550},
+ {"epoch": 0.04, "grad_norm": 44.54707336425781, "learning_rate": 2.845e-07, "loss": 0.3898, "step": 575},
+ {"epoch": 0.04, "grad_norm": 31.656843185424805, "learning_rate": 2.9699999999999997e-07, "loss": 0.422, "step": 600},
+ {"epoch": 0.04, "grad_norm": 56.28642654418945, "learning_rate": 3.0949999999999996e-07, "loss": 0.3803, "step": 625},
+ {"epoch": 0.04, "grad_norm": 38.66410827636719, "learning_rate": 3.22e-07, "loss": 0.5062, "step": 650},
+ {"epoch": 0.04, "grad_norm": 31.183727264404297, "learning_rate": 3.345e-07, "loss": 0.4075, "step": 675},
+ {"epoch": 0.05, "grad_norm": 23.618703842163086, "learning_rate": 3.4699999999999997e-07, "loss": 0.3627, "step": 700},
+ {"epoch": 0.05, "grad_norm": 70.09487915039062, "learning_rate": 3.5949999999999996e-07, "loss": 0.3087, "step": 725},
+ {"epoch": 0.05, "grad_norm": 74.42188262939453, "learning_rate": 3.72e-07, "loss": 0.4021, "step": 750},
+ {"epoch": 0.05, "grad_norm": 44.99939727783203, "learning_rate": 3.845e-07, "loss": 0.3203, "step": 775},
+ {"epoch": 0.05, "grad_norm": 42.77998352050781, "learning_rate": 3.97e-07, "loss": 0.3797, "step": 800},
+ {"epoch": 0.05, "grad_norm": 64.61412811279297, "learning_rate": 4.0949999999999995e-07, "loss": 0.3403, "step": 825},
+ {"epoch": 0.05, "grad_norm": 29.286806106567383, "learning_rate": 4.2199999999999994e-07, "loss": 0.2879, "step": 850},
+ {"epoch": 0.06, "grad_norm": 58.146263122558594, "learning_rate": 4.345e-07, "loss": 0.4017, "step": 875},
+ {"epoch": 0.06, "grad_norm": 44.624202728271484, "learning_rate": 4.4699999999999997e-07, "loss": 0.3698, "step": 900},
+ {"epoch": 0.06, "grad_norm": 47.91656494140625, "learning_rate": 4.595e-07, "loss": 0.4008, "step": 925},
+ {"epoch": 0.06, "grad_norm": 36.263668060302734, "learning_rate": 4.7199999999999994e-07, "loss": 0.2041, "step": 950},
+ {"epoch": 0.06, "grad_norm": 12.398943901062012, "learning_rate": 4.845e-07, "loss": 0.2978, "step": 975},
+ {"epoch": 0.06, "grad_norm": 4.42283821105957, "learning_rate": 4.97e-07, "loss": 0.2614, "step": 1000},
+ {"epoch": 0.06, "eval_loss": 0.29864633083343506, "eval_runtime": 7667.7674, "eval_samples_per_second": 1.228, "eval_steps_per_second": 0.614, "eval_wer": 0.14664944291942517, "step": 1000},
+ {"epoch": 0.07, "grad_norm": 59.681270599365234, "learning_rate": 5.095e-07, "loss": 0.2588, "step": 1025},
+ {"epoch": 0.07, "grad_norm": 27.44911766052246, "learning_rate": 5.22e-07, "loss": 0.2839, "step": 1050},
+ {"epoch": 0.07, "grad_norm": 56.26525115966797, "learning_rate": 5.344999999999999e-07, "loss": 0.2329, "step": 1075},
+ {"epoch": 0.07, "grad_norm": 112.37168884277344, "learning_rate": 5.47e-07, "loss": 0.2746, "step": 1100},
+ {"epoch": 0.07, "grad_norm": 92.82706451416016, "learning_rate": 5.595e-07, "loss": 0.2621, "step": 1125},
+ {"epoch": 0.07, "grad_norm": 17.20562744140625, "learning_rate": 5.719999999999999e-07, "loss": 0.1826, "step": 1150},
+ {"epoch": 0.08, "grad_norm": 31.119354248046875, "learning_rate": 5.845e-07, "loss": 0.2487, "step": 1175},
+ {"epoch": 0.08, "grad_norm": 7.4088850021362305, "learning_rate": 5.97e-07, "loss": 0.297, "step": 1200},
+ {"epoch": 0.08, "grad_norm": 72.84540557861328, "learning_rate": 6.095e-07, "loss": 0.3108, "step": 1225},
+ {"epoch": 0.08, "grad_norm": 38.68337631225586, "learning_rate": 6.219999999999999e-07, "loss": 0.2819, "step": 1250},
+ {"epoch": 0.08, "grad_norm": 5.215000152587891, "learning_rate": 6.344999999999999e-07, "loss": 0.1953, "step": 1275},
+ {"epoch": 0.08, "grad_norm": 41.42685317993164, "learning_rate": 6.47e-07, "loss": 0.2333, "step": 1300},
+ {"epoch": 0.09, "grad_norm": 6.224233150482178, "learning_rate": 6.595e-07, "loss": 0.2213, "step": 1325},
+ {"epoch": 0.09, "grad_norm": 48.12126541137695, "learning_rate": 6.72e-07, "loss": 0.3121, "step": 1350},
+ {"epoch": 0.09, "grad_norm": 23.151887893676758, "learning_rate": 6.845e-07, "loss": 0.2076, "step": 1375},
+ {"epoch": 0.09, "grad_norm": 53.516395568847656, "learning_rate": 6.97e-07, "loss": 0.2835, "step": 1400},
+ {"epoch": 0.09, "grad_norm": 41.62558364868164, "learning_rate": 7.094999999999999e-07, "loss": 0.2574, "step": 1425},
+ {"epoch": 0.09, "grad_norm": 98.05493927001953, "learning_rate": 7.219999999999999e-07, "loss": 0.3034, "step": 1450},
+ {"epoch": 0.09, "grad_norm": 86.28963470458984, "learning_rate": 7.345e-07, "loss": 0.2657, "step": 1475},
+ {"epoch": 0.1, "grad_norm": 2.8914854526519775, "learning_rate": 7.47e-07, "loss": 0.2636, "step": 1500},
+ {"epoch": 0.1, "grad_norm": 56.13273239135742, "learning_rate": 7.594999999999999e-07, "loss": 0.1522, "step": 1525},
+ {"epoch": 0.1, "grad_norm": 12.941767692565918, "learning_rate": 7.72e-07, "loss": 0.2097, "step": 1550},
+ {"epoch": 0.1, "grad_norm": 13.613518714904785, "learning_rate": 7.845e-07, "loss": 0.2303, "step": 1575},
+ {"epoch": 0.1, "grad_norm": 23.761892318725586, "learning_rate": 7.970000000000001e-07, "loss": 0.2763, "step": 1600},
+ {"epoch": 0.1, "grad_norm": 31.896230697631836, "learning_rate": 8.094999999999999e-07, "loss": 0.2722, "step": 1625},
+ {"epoch": 0.11, "grad_norm": 51.43158721923828, "learning_rate": 8.219999999999999e-07, "loss": 0.2228, "step": 1650},
+ {"epoch": 0.11, "grad_norm": 2.879077196121216, "learning_rate": 8.345e-07, "loss": 0.2694, "step": 1675},
+ {"epoch": 0.11, "grad_norm": 98.36167907714844, "learning_rate": 8.469999999999999e-07, "loss": 0.3088, "step": 1700},
+ {"epoch": 0.11, "grad_norm": 16.084274291992188, "learning_rate": 8.595e-07, "loss": 0.1828, "step": 1725},
+ {"epoch": 0.11, "grad_norm": 5.136277675628662, "learning_rate": 8.72e-07, "loss": 0.1753, "step": 1750},
+ {"epoch": 0.11, "grad_norm": 64.78803253173828, "learning_rate": 8.845e-07, "loss": 0.198, "step": 1775},
+ {"epoch": 0.12, "grad_norm": 41.84619903564453, "learning_rate": 8.969999999999999e-07, "loss": 0.2894, "step": 1800},
+ {"epoch": 0.12, "grad_norm": 45.18673324584961, "learning_rate": 9.094999999999999e-07, "loss": 0.1844, "step": 1825},
+ {"epoch": 0.12, "grad_norm": 20.42123794555664, "learning_rate": 9.22e-07, "loss": 0.2576, "step": 1850},
+ {"epoch": 0.12, "grad_norm": 8.751657485961914, "learning_rate": 9.344999999999999e-07, "loss": 0.2492, "step": 1875},
+ {"epoch": 0.12, "grad_norm": 29.69828224182129, "learning_rate": 9.469999999999999e-07, "loss": 0.1479, "step": 1900},
+ {"epoch": 0.12, "grad_norm": 22.91360855102539, "learning_rate": 9.594999999999999e-07, "loss": 0.2164, "step": 1925},
+ {"epoch": 0.13, "grad_norm": 17.22205352783203, "learning_rate": 9.72e-07, "loss": 0.2276, "step": 1950},
+ {"epoch": 0.13, "grad_norm": 83.4366683959961, "learning_rate": 9.845e-07, "loss": 0.2717, "step": 1975},
+ {"epoch": 0.13, "grad_norm": 11.829337120056152, "learning_rate": 9.97e-07, "loss": 0.2632, "step": 2000},
+ {"epoch": 0.13, "eval_loss": 0.22439254820346832, "eval_runtime": 7771.0673, "eval_samples_per_second": 1.211, "eval_steps_per_second": 0.606, "eval_wer": 0.13156789924107865, "step": 2000}
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 1,
+ "save_steps": 1000,
+ "total_flos": 4.0824196724424704e+18,
+ "train_batch_size": 1,
+ "trial_name": null,
+ "trial_params": null
+ }
checkpoint-2000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:354821a80788dff5d057ffce4a4d80a406ce5bb0affa48cc6029ca3faa14edf2
+ size 5048
checkpoint-3000/config.json ADDED
@@ -0,0 +1,51 @@
+ {
+ "_name_or_path": "openai/whisper-medium",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [220, 50257],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 1024,
+ "decoder_attention_heads": 16,
+ "decoder_ffn_dim": 4096,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 24,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 16,
+ "encoder_ffn_dim": 4096,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 24,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 24,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.39.0.dev0",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
checkpoint-3000/generation_config.json ADDED
@@ -0,0 +1,248 @@
+ {
+ "alignment_heads": [[13, 15], [15, 4], [15, 15], [16, 1], [20, 0], [23, 4]],
+ "begin_suppress_tokens": [220, 50257],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [[1, null], [2, 50359]],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327, "<|am|>": 50334, "<|ar|>": 50272, "<|as|>": 50350, "<|az|>": 50304, "<|ba|>": 50355, "<|be|>": 50330, "<|bg|>": 50292, "<|bn|>": 50302, "<|bo|>": 50347,
+ "<|br|>": 50309, "<|bs|>": 50315, "<|ca|>": 50270, "<|cs|>": 50283, "<|cy|>": 50297, "<|da|>": 50285, "<|de|>": 50261, "<|el|>": 50281, "<|en|>": 50259, "<|es|>": 50262,
+ "<|et|>": 50307, "<|eu|>": 50310, "<|fa|>": 50300, "<|fi|>": 50277, "<|fo|>": 50338, "<|fr|>": 50265, "<|gl|>": 50319, "<|gu|>": 50333, "<|haw|>": 50352, "<|ha|>": 50354,
+ "<|he|>": 50279, "<|hi|>": 50276, "<|hr|>": 50291, "<|ht|>": 50339, "<|hu|>": 50286, "<|hy|>": 50312, "<|id|>": 50275, "<|is|>": 50311, "<|it|>": 50274, "<|ja|>": 50266,
+ "<|jw|>": 50356, "<|ka|>": 50329, "<|kk|>": 50316, "<|km|>": 50323, "<|kn|>": 50306, "<|ko|>": 50264, "<|la|>": 50294, "<|lb|>": 50345, "<|ln|>": 50353, "<|lo|>": 50336,
+ "<|lt|>": 50293, "<|lv|>": 50301, "<|mg|>": 50349, "<|mi|>": 50295, "<|mk|>": 50308, "<|ml|>": 50296, "<|mn|>": 50314, "<|mr|>": 50320, "<|ms|>": 50282, "<|mt|>": 50343,
+ "<|my|>": 50346, "<|ne|>": 50313, "<|nl|>": 50271, "<|nn|>": 50342, "<|no|>": 50288, "<|oc|>": 50328, "<|pa|>": 50321, "<|pl|>": 50269, "<|ps|>": 50340, "<|pt|>": 50267,
+ "<|ro|>": 50284, "<|ru|>": 50263, "<|sa|>": 50344, "<|sd|>": 50332, "<|si|>": 50322, "<|sk|>": 50298, "<|sl|>": 50305, "<|sn|>": 50324, "<|so|>": 50326, "<|sq|>": 50317,
+ "<|sr|>": 50303, "<|su|>": 50357, "<|sv|>": 50273, "<|sw|>": 50318, "<|ta|>": 50287, "<|te|>": 50299, "<|tg|>": 50331, "<|th|>": 50289, "<|tk|>": 50341, "<|tl|>": 50348,
+ "<|tr|>": 50268, "<|tt|>": 50351, "<|uk|>": 50280, "<|ur|>": 50290, "<|uz|>": 50337, "<|vi|>": 50278, "<|yi|>": 50335, "<|yo|>": 50325, "<|zh|>": 50260
+ },
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1, 2, 7, 8, 9, 10, 14, 25, 26, 27, 28, 29, 31, 58, 59, 60, 61, 62, 63, 90, 91, 92, 93, 359, 503, 522, 542, 873, 893, 902, 918, 922, 931, 1350, 1853, 1982, 2460, 2627,
+ 3246, 3253, 3268, 3536, 3846, 3961, 4183, 4667, 6585, 6647, 7273, 9061, 9383, 10428, 10929, 11938, 12033, 12331, 12562, 13793, 14157, 14635, 15265, 15618, 16553, 16604,
+ 18362, 18956, 20075, 21675, 22520, 26130, 26161, 26435, 28279, 29464, 31650, 32302, 32470, 36865, 42863, 47425, 49870, 50254, 50258, 50358, 50359, 50360, 50361, 50362
+ ],
+ "task_to_id": {"transcribe": 50359, "translate": 50358},
+ "transformers_version": "4.39.0.dev0"
+ }
checkpoint-3000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d810c1853f6f998df01ab24d425af2809ded6b78390f5d7dff0954ad40ee5641
+ size 3055544304
checkpoint-3000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f8909f848ffe9b73bb01bac425c22c254ca0f922efce5bada3d9007e01c104f3
+ size 6099375168
checkpoint-3000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
checkpoint-3000/rng_state_0.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ba49a46979d1d35e1468674f8d32a431c4e6705fd648f67412bdbcb3f05a0a8c
+ size 14512
checkpoint-3000/rng_state_1.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:601f6b91dc79ca6a479677bcf31fcff4f6b62d22118568fd58d625f22b4f469c
+ size 14512
checkpoint-3000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3864c841ea419b2b63c4c301356c217667a74879f8f4dc4f54152a961e87803f
+ size 1064
checkpoint-3000/trainer_state.json ADDED
@@ -0,0 +1,888 @@
+ {
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 0.19293845263360987,
+ "eval_steps": 1000,
+ "global_step": 3000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.0,
+ "grad_norm": 58.440006256103516,
+ "learning_rate": 1e-08,
+ "loss": 1.3813,
+ "step": 25
+ },
+ {
+ "epoch": 0.0,
+ "grad_norm": 59.934757232666016,
+ "learning_rate": 2.2e-08,
+ "loss": 1.469,
+ "step": 50
+ },
+ {
+ "epoch": 0.0,
+ "grad_norm": 31.011018753051758,
+ "learning_rate": 3.4500000000000005e-08,
+ "loss": 1.2226,
+ "step": 75
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 59.818233489990234,
+ "learning_rate": 4.7e-08,
+ "loss": 1.2458,
+ "step": 100
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 60.51572036743164,
+ "learning_rate": 5.95e-08,
+ "loss": 1.2781,
+ "step": 125
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 51.360103607177734,
+ "learning_rate": 7.2e-08,
+ "loss": 1.4055,
+ "step": 150
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 73.35002136230469,
+ "learning_rate": 8.45e-08,
+ "loss": 1.3354,
+ "step": 175
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 69.32823944091797,
+ "learning_rate": 9.7e-08,
+ "loss": 1.2005,
+ "step": 200
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 51.02174377441406,
+ "learning_rate": 1.095e-07,
+ "loss": 1.3853,
+ "step": 225
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 72.20179748535156,
+ "learning_rate": 1.2199999999999998e-07,
+ "loss": 1.4476,
+ "step": 250
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 108.30382537841797,
+ "learning_rate": 1.345e-07,
+ "loss": 1.2339,
+ "step": 275
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 66.15994262695312,
+ "learning_rate": 1.4699999999999998e-07,
+ "loss": 1.379,
+ "step": 300
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 47.82923126220703,
+ "learning_rate": 1.595e-07,
+ "loss": 1.1467,
+ "step": 325
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 85.7218246459961,
+ "learning_rate": 1.7199999999999998e-07,
+ "loss": 1.1622,
+ "step": 350
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 68.25504302978516,
+ "learning_rate": 1.845e-07,
+ "loss": 1.1413,
+ "step": 375
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 106.06077575683594,
+ "learning_rate": 1.97e-07,
+ "loss": 1.0855,
+ "step": 400
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 79.60690307617188,
+ "learning_rate": 2.095e-07,
+ "loss": 0.929,
+ "step": 425
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 42.14814376831055,
+ "learning_rate": 2.22e-07,
+ "loss": 0.8728,
+ "step": 450
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 37.4913444519043,
+ "learning_rate": 2.3449999999999996e-07,
+ "loss": 0.6651,
+ "step": 475
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 41.89991760253906,
+ "learning_rate": 2.47e-07,
+ "loss": 0.5875,
+ "step": 500
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 75.21453094482422,
+ "learning_rate": 2.595e-07,
+ "loss": 0.6868,
+ "step": 525
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 21.09180450439453,
+ "learning_rate": 2.72e-07,
+ "loss": 0.741,
+ "step": 550
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 44.54707336425781,
+ "learning_rate": 2.845e-07,
+ "loss": 0.3898,
+ "step": 575
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 31.656843185424805,
+ "learning_rate": 2.9699999999999997e-07,
+ "loss": 0.422,
+ "step": 600
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 56.28642654418945,
+ "learning_rate": 3.0949999999999996e-07,
+ "loss": 0.3803,
+ "step": 625
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 38.66410827636719,
+ "learning_rate": 3.22e-07,
+ "loss": 0.5062,
+ "step": 650
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 31.183727264404297,
+ "learning_rate": 3.345e-07,
+ "loss": 0.4075,
+ "step": 675
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 23.618703842163086,
+ "learning_rate": 3.4699999999999997e-07,
+ "loss": 0.3627,
+ "step": 700
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 70.09487915039062,
+ "learning_rate": 3.5949999999999996e-07,
+ "loss": 0.3087,
+ "step": 725
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 74.42188262939453,
+ "learning_rate": 3.72e-07,
+ "loss": 0.4021,
+ "step": 750
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 44.99939727783203,
+ "learning_rate": 3.845e-07,
+ "loss": 0.3203,
+ "step": 775
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 42.77998352050781,
+ "learning_rate": 3.97e-07,
+ "loss": 0.3797,
+ "step": 800
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 64.61412811279297,
+ "learning_rate": 4.0949999999999995e-07,
+ "loss": 0.3403,
+ "step": 825
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 29.286806106567383,
+ "learning_rate": 4.2199999999999994e-07,
+ "loss": 0.2879,
+ "step": 850
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 58.146263122558594,
+ "learning_rate": 4.345e-07,
+ "loss": 0.4017,
+ "step": 875
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 44.624202728271484,
+ "learning_rate": 4.4699999999999997e-07,
+ "loss": 0.3698,
+ "step": 900
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 47.91656494140625,
+ "learning_rate": 4.595e-07,
+ "loss": 0.4008,
+ "step": 925
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 36.263668060302734,
+ "learning_rate": 4.7199999999999994e-07,
+ "loss": 0.2041,
+ "step": 950
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 12.398943901062012,
+ "learning_rate": 4.845e-07,
+ "loss": 0.2978,
+ "step": 975
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 4.42283821105957,
+ "learning_rate": 4.97e-07,
+ "loss": 0.2614,
+ "step": 1000
+ },
+ {
+ "epoch": 0.06,
+ "eval_loss": 0.29864633083343506,
+ "eval_runtime": 7667.7674,
+ "eval_samples_per_second": 1.228,
+ "eval_steps_per_second": 0.614,
+ "eval_wer": 0.14664944291942517,
+ "step": 1000
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 59.681270599365234,
+ "learning_rate": 5.095e-07,
+ "loss": 0.2588,
+ "step": 1025
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 27.44911766052246,
+ "learning_rate": 5.22e-07,
+ "loss": 0.2839,
+ "step": 1050
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 56.26525115966797,
+ "learning_rate": 5.344999999999999e-07,
+ "loss": 0.2329,
+ "step": 1075
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 112.37168884277344,
+ "learning_rate": 5.47e-07,
+ "loss": 0.2746,
+ "step": 1100
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 92.82706451416016,
+ "learning_rate": 5.595e-07,
+ "loss": 0.2621,
+ "step": 1125
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 17.20562744140625,
+ "learning_rate": 5.719999999999999e-07,
+ "loss": 0.1826,
+ "step": 1150
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 31.119354248046875,
+ "learning_rate": 5.845e-07,
+ "loss": 0.2487,
+ "step": 1175
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 7.4088850021362305,
+ "learning_rate": 5.97e-07,
+ "loss": 0.297,
+ "step": 1200
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 72.84540557861328,
+ "learning_rate": 6.095e-07,
+ "loss": 0.3108,
+ "step": 1225
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 38.68337631225586,
+ "learning_rate": 6.219999999999999e-07,
+ "loss": 0.2819,
+ "step": 1250
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 5.215000152587891,
+ "learning_rate": 6.344999999999999e-07,
+ "loss": 0.1953,
+ "step": 1275
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 41.42685317993164,
+ "learning_rate": 6.47e-07,
+ "loss": 0.2333,
+ "step": 1300
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 6.224233150482178,
+ "learning_rate": 6.595e-07,
+ "loss": 0.2213,
+ "step": 1325
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 48.12126541137695,
+ "learning_rate": 6.72e-07,
+ "loss": 0.3121,
+ "step": 1350
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 23.151887893676758,
+ "learning_rate": 6.845e-07,
+ "loss": 0.2076,
+ "step": 1375
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 53.516395568847656,
+ "learning_rate": 6.97e-07,
+ "loss": 0.2835,
+ "step": 1400
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 41.62558364868164,
+ "learning_rate": 7.094999999999999e-07,
+ "loss": 0.2574,
+ "step": 1425
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 98.05493927001953,
+ "learning_rate": 7.219999999999999e-07,
+ "loss": 0.3034,
+ "step": 1450
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 86.28963470458984,
+ "learning_rate": 7.345e-07,
+ "loss": 0.2657,
+ "step": 1475
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 2.8914854526519775,
+ "learning_rate": 7.47e-07,
+ "loss": 0.2636,
+ "step": 1500
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 56.13273239135742,
+ "learning_rate": 7.594999999999999e-07,
+ "loss": 0.1522,
+ "step": 1525
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 12.941767692565918,
+ "learning_rate": 7.72e-07,
+ "loss": 0.2097,
+ "step": 1550
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 13.613518714904785,
+ "learning_rate": 7.845e-07,
+ "loss": 0.2303,
+ "step": 1575
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 23.761892318725586,
+ "learning_rate": 7.970000000000001e-07,
+ "loss": 0.2763,
+ "step": 1600
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 31.896230697631836,
+ "learning_rate": 8.094999999999999e-07,
+ "loss": 0.2722,
+ "step": 1625
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 51.43158721923828,
+ "learning_rate": 8.219999999999999e-07,
+ "loss": 0.2228,
+ "step": 1650
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 2.879077196121216,
+ "learning_rate": 8.345e-07,
+ "loss": 0.2694,
+ "step": 1675
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 98.36167907714844,
+ "learning_rate": 8.469999999999999e-07,
+ "loss": 0.3088,
+ "step": 1700
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 16.084274291992188,
+ "learning_rate": 8.595e-07,
+ "loss": 0.1828,
+ "step": 1725
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 5.136277675628662,
+ "learning_rate": 8.72e-07,
+ "loss": 0.1753,
+ "step": 1750
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 64.78803253173828,
+ "learning_rate": 8.845e-07,
+ "loss": 0.198,
+ "step": 1775
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 41.84619903564453,
+ "learning_rate": 8.969999999999999e-07,
+ "loss": 0.2894,
+ "step": 1800
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 45.18673324584961,
+ "learning_rate": 9.094999999999999e-07,
+ "loss": 0.1844,
+ "step": 1825
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 20.42123794555664,
+ "learning_rate": 9.22e-07,
+ "loss": 0.2576,
+ "step": 1850
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 8.751657485961914,
+ "learning_rate": 9.344999999999999e-07,
+ "loss": 0.2492,
+ "step": 1875
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 29.69828224182129,
+ "learning_rate": 9.469999999999999e-07,
+ "loss": 0.1479,
+ "step": 1900
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 22.91360855102539,
+ "learning_rate": 9.594999999999999e-07,
+ "loss": 0.2164,
+ "step": 1925
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 17.22205352783203,
+ "learning_rate": 9.72e-07,
+ "loss": 0.2276,
+ "step": 1950
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 83.4366683959961,
+ "learning_rate": 9.845e-07,
+ "loss": 0.2717,
+ "step": 1975
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 11.829337120056152,
+ "learning_rate": 9.97e-07,
+ "loss": 0.2632,
+ "step": 2000
+ },
+ {
+ "epoch": 0.13,
+ "eval_loss": 0.22439254820346832,
+ "eval_runtime": 7771.0673,
+ "eval_samples_per_second": 1.211,
+ "eval_steps_per_second": 0.606,
+ "eval_wer": 0.13156789924107865,
+ "step": 2000
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 42.36271667480469,
+ "learning_rate": 9.936666666666667e-07,
+ "loss": 0.21,
+ "step": 2025
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 57.45354461669922,
+ "learning_rate": 9.853333333333333e-07,
+ "loss": 0.2244,
+ "step": 2050
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 17.302165985107422,
+ "learning_rate": 9.773333333333333e-07,
+ "loss": 0.2492,
+ "step": 2075
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 41.777069091796875,
+ "learning_rate": 9.69e-07,
+ "loss": 0.1598,
+ "step": 2100
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 37.14485549926758,
+ "learning_rate": 9.606666666666666e-07,
+ "loss": 0.2483,
+ "step": 2125
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 53.22433090209961,
+ "learning_rate": 9.523333333333333e-07,
+ "loss": 0.1913,
+ "step": 2150
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 78.79158782958984,
+ "learning_rate": 9.439999999999999e-07,
+ "loss": 0.3075,
+ "step": 2175
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 2.1396484375,
+ "learning_rate": 9.356666666666666e-07,
+ "loss": 0.2427,
+ "step": 2200
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 9.334494590759277,
+ "learning_rate": 9.273333333333333e-07,
+ "loss": 0.2201,
+ "step": 2225
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 8.948480606079102,
+ "learning_rate": 9.19e-07,
+ "loss": 0.2047,
+ "step": 2250
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 3.004768133163452,
+ "learning_rate": 9.106666666666666e-07,
+ "loss": 0.1928,
+ "step": 2275
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 3.458395481109619,
+ "learning_rate": 9.023333333333333e-07,
+ "loss": 0.1788,
+ "step": 2300
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 61.66895294189453,
+ "learning_rate": 8.939999999999999e-07,
+ "loss": 0.1959,
+ "step": 2325
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 45.452789306640625,
+ "learning_rate": 8.856666666666666e-07,
+ "loss": 0.211,
+ "step": 2350
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 18.07378578186035,
+ "learning_rate": 8.773333333333332e-07,
+ "loss": 0.2391,
+ "step": 2375
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 46.68052291870117,
+ "learning_rate": 8.69e-07,
+ "loss": 0.1782,
+ "step": 2400
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 5.451249599456787,
+ "learning_rate": 8.606666666666667e-07,
+ "loss": 0.1569,
+ "step": 2425
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 1.6330296993255615,
+ "learning_rate": 8.523333333333334e-07,
+ "loss": 0.1381,
+ "step": 2450
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 43.628761291503906,
+ "learning_rate": 8.439999999999999e-07,
+ "loss": 0.1943,
+ "step": 2475
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 42.83442687988281,
+ "learning_rate": 8.356666666666666e-07,
+ "loss": 0.1937,
+ "step": 2500
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 41.783485412597656,
+ "learning_rate": 8.273333333333333e-07,
+ "loss": 0.1738,
+ "step": 2525
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 43.905025482177734,
+ "learning_rate": 8.189999999999999e-07,
+ "loss": 0.2068,
+ "step": 2550
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 35.906982421875,
+ "learning_rate": 8.106666666666666e-07,
+ "loss": 0.3004,
+ "step": 2575
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 75.37654113769531,
+ "learning_rate": 8.023333333333333e-07,
+ "loss": 0.2073,
+ "step": 2600
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 0.4306688904762268,
+ "learning_rate": 7.94e-07,
+ "loss": 0.1722,
+ "step": 2625
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 48.76789093017578,
+ "learning_rate": 7.856666666666665e-07,
+ "loss": 0.2808,
+ "step": 2650
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 21.527475357055664,
+ "learning_rate": 7.773333333333333e-07,
+ "loss": 0.1416,
+ "step": 2675
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 6.267962455749512,
+ "learning_rate": 7.69e-07,
+ "loss": 0.1522,
+ "step": 2700
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 38.2205696105957,
+ "learning_rate": 7.606666666666667e-07,
+ "loss": 0.1988,
+ "step": 2725
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 2.1994435787200928,
+ "learning_rate": 7.523333333333333e-07,
+ "loss": 0.2384,
+ "step": 2750
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 21.002376556396484,
+ "learning_rate": 7.44e-07,
+ "loss": 0.198,
+ "step": 2775
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 66.96015167236328,
+ "learning_rate": 7.356666666666667e-07,
+ "loss": 0.2185,
+ "step": 2800
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 16.91470718383789,
+ "learning_rate": 7.273333333333333e-07,
+ "loss": 0.2149,
+ "step": 2825
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 12.189261436462402,
+ "learning_rate": 7.189999999999999e-07,
+ "loss": 0.1907,
+ "step": 2850
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 5.648806095123291,
+ "learning_rate": 7.106666666666666e-07,
+ "loss": 0.1634,
+ "step": 2875
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 6.627074241638184,
+ "learning_rate": 7.023333333333333e-07,
+ "loss": 0.2275,
+ "step": 2900
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 42.446556091308594,
+ "learning_rate": 6.939999999999999e-07,
+ "loss": 0.1888,
+ "step": 2925
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 19.29751968383789,
+ "learning_rate": 6.856666666666667e-07,
+ "loss": 0.1603,
+ "step": 2950
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 88.98928833007812,
+ "learning_rate": 6.773333333333334e-07,
+ "loss": 0.2295,
+ "step": 2975
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 13.686836242675781,
+ "learning_rate": 6.69e-07,
+ "loss": 0.1694,
+ "step": 3000
+ },
+ {
+ "epoch": 0.19,
+ "eval_loss": 0.20859745144844055,
+ "eval_runtime": 7797.8116,
+ "eval_samples_per_second": 1.207,
+ "eval_steps_per_second": 0.604,
+ "eval_wer": 0.12344582593250444,
+ "step": 3000
+ }
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 1,
+ "save_steps": 1000,
+ "total_flos": 6.123629508663706e+18,
+ "train_batch_size": 1,
+ "trial_name": null,
+ "trial_params": null
+ }
checkpoint-3000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:354821a80788dff5d057ffce4a4d80a406ce5bb0affa48cc6029ca3faa14edf2
+ size 5048
checkpoint-4000/config.json ADDED
@@ -0,0 +1,51 @@
+ {
+ "_name_or_path": "openai/whisper-medium",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 1024,
+ "decoder_attention_heads": 16,
+ "decoder_ffn_dim": 4096,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 24,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 16,
+ "encoder_ffn_dim": 4096,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 24,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 24,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.39.0.dev0",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
checkpoint-4000/generation_config.json ADDED
@@ -0,0 +1,248 @@
+ {
+ "alignment_heads": [
+ [
+ 13,
+ 15
+ ],
+ [
+ 15,
+ 4
+ ],
+ [
+ 15,
+ 15
+ ],
+ [
+ 16,
+ 1
+ ],
+ [
+ 20,
+ 0
+ ],
+ [
+ 23,
+ 4
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ null
+ ],
+ [
+ 2,
+ 50359
+ ]
+ ],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50358,
+ 50359,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.39.0.dev0"
+ }
checkpoint-4000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e65e3e32a85b888767c21b36c99de614f02680902f1eab86dd91ec5187927883
+ size 3055544304
checkpoint-4000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:71bf7e4aaa27faaa1feec848e29dbbec44bc0f9cd884d4012bf6c60d52759c66
+ size 6099375168
checkpoint-4000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
checkpoint-4000/rng_state_0.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:38b8b62da44e55a4e0fbe978adad4bd0b7ab2e9694ba015b067edc03782bcc72
+ size 14512
checkpoint-4000/rng_state_1.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:95dfd88a73f76ec273d80667d846df46c9b2acc1c256c6743b0b1403c53506c6
+ size 14512
checkpoint-4000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3f9ef7843f3fea1295dd17e342aa58e2cc76483953f923ff874eeda253c89f3b
+ size 1064
checkpoint-4000/trainer_state.json ADDED
@@ -0,0 +1,1177 @@
+ {
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 0.25725127017814653,
+ "eval_steps": 1000,
+ "global_step": 4000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.0,
+ "grad_norm": 58.440006256103516,
+ "learning_rate": 1e-08,
+ "loss": 1.3813,
+ "step": 25
+ },
+ {
+ "epoch": 0.0,
+ "grad_norm": 59.934757232666016,
+ "learning_rate": 2.2e-08,
+ "loss": 1.469,
+ "step": 50
+ },
+ {
+ "epoch": 0.0,
+ "grad_norm": 31.011018753051758,
+ "learning_rate": 3.4500000000000005e-08,
+ "loss": 1.2226,
+ "step": 75
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 59.818233489990234,
+ "learning_rate": 4.7e-08,
+ "loss": 1.2458,
+ "step": 100
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 60.51572036743164,
+ "learning_rate": 5.95e-08,
+ "loss": 1.2781,
+ "step": 125
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 51.360103607177734,
+ "learning_rate": 7.2e-08,
+ "loss": 1.4055,
+ "step": 150
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 73.35002136230469,
+ "learning_rate": 8.45e-08,
+ "loss": 1.3354,
+ "step": 175
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 69.32823944091797,
+ "learning_rate": 9.7e-08,
+ "loss": 1.2005,
+ "step": 200
+ },
+ {
+ "epoch": 0.01,
+ "grad_norm": 51.02174377441406,
+ "learning_rate": 1.095e-07,
+ "loss": 1.3853,
+ "step": 225
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 72.20179748535156,
+ "learning_rate": 1.2199999999999998e-07,
+ "loss": 1.4476,
+ "step": 250
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 108.30382537841797,
+ "learning_rate": 1.345e-07,
+ "loss": 1.2339,
+ "step": 275
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 66.15994262695312,
+ "learning_rate": 1.4699999999999998e-07,
+ "loss": 1.379,
+ "step": 300
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 47.82923126220703,
+ "learning_rate": 1.595e-07,
+ "loss": 1.1467,
+ "step": 325
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 85.7218246459961,
+ "learning_rate": 1.7199999999999998e-07,
+ "loss": 1.1622,
+ "step": 350
+ },
+ {
+ "epoch": 0.02,
+ "grad_norm": 68.25504302978516,
+ "learning_rate": 1.845e-07,
+ "loss": 1.1413,
+ "step": 375
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 106.06077575683594,
+ "learning_rate": 1.97e-07,
+ "loss": 1.0855,
+ "step": 400
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 79.60690307617188,
+ "learning_rate": 2.095e-07,
+ "loss": 0.929,
+ "step": 425
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 42.14814376831055,
+ "learning_rate": 2.22e-07,
+ "loss": 0.8728,
+ "step": 450
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 37.4913444519043,
+ "learning_rate": 2.3449999999999996e-07,
+ "loss": 0.6651,
+ "step": 475
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 41.89991760253906,
+ "learning_rate": 2.47e-07,
+ "loss": 0.5875,
+ "step": 500
+ },
+ {
+ "epoch": 0.03,
+ "grad_norm": 75.21453094482422,
+ "learning_rate": 2.595e-07,
+ "loss": 0.6868,
+ "step": 525
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 21.09180450439453,
+ "learning_rate": 2.72e-07,
+ "loss": 0.741,
+ "step": 550
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 44.54707336425781,
+ "learning_rate": 2.845e-07,
+ "loss": 0.3898,
+ "step": 575
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 31.656843185424805,
+ "learning_rate": 2.9699999999999997e-07,
+ "loss": 0.422,
+ "step": 600
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 56.28642654418945,
+ "learning_rate": 3.0949999999999996e-07,
+ "loss": 0.3803,
+ "step": 625
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 38.66410827636719,
+ "learning_rate": 3.22e-07,
+ "loss": 0.5062,
+ "step": 650
+ },
+ {
+ "epoch": 0.04,
+ "grad_norm": 31.183727264404297,
+ "learning_rate": 3.345e-07,
+ "loss": 0.4075,
+ "step": 675
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 23.618703842163086,
+ "learning_rate": 3.4699999999999997e-07,
+ "loss": 0.3627,
+ "step": 700
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 70.09487915039062,
+ "learning_rate": 3.5949999999999996e-07,
+ "loss": 0.3087,
+ "step": 725
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 74.42188262939453,
+ "learning_rate": 3.72e-07,
+ "loss": 0.4021,
+ "step": 750
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 44.99939727783203,
+ "learning_rate": 3.845e-07,
+ "loss": 0.3203,
+ "step": 775
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 42.77998352050781,
+ "learning_rate": 3.97e-07,
+ "loss": 0.3797,
+ "step": 800
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 64.61412811279297,
+ "learning_rate": 4.0949999999999995e-07,
+ "loss": 0.3403,
+ "step": 825
+ },
+ {
+ "epoch": 0.05,
+ "grad_norm": 29.286806106567383,
+ "learning_rate": 4.2199999999999994e-07,
+ "loss": 0.2879,
+ "step": 850
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 58.146263122558594,
+ "learning_rate": 4.345e-07,
+ "loss": 0.4017,
+ "step": 875
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 44.624202728271484,
+ "learning_rate": 4.4699999999999997e-07,
+ "loss": 0.3698,
+ "step": 900
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 47.91656494140625,
+ "learning_rate": 4.595e-07,
+ "loss": 0.4008,
+ "step": 925
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 36.263668060302734,
+ "learning_rate": 4.7199999999999994e-07,
+ "loss": 0.2041,
+ "step": 950
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 12.398943901062012,
+ "learning_rate": 4.845e-07,
+ "loss": 0.2978,
+ "step": 975
+ },
+ {
+ "epoch": 0.06,
+ "grad_norm": 4.42283821105957,
+ "learning_rate": 4.97e-07,
+ "loss": 0.2614,
+ "step": 1000
+ },
+ {
+ "epoch": 0.06,
+ "eval_loss": 0.29864633083343506,
+ "eval_runtime": 7667.7674,
+ "eval_samples_per_second": 1.228,
+ "eval_steps_per_second": 0.614,
+ "eval_wer": 0.14664944291942517,
+ "step": 1000
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 59.681270599365234,
+ "learning_rate": 5.095e-07,
+ "loss": 0.2588,
+ "step": 1025
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 27.44911766052246,
+ "learning_rate": 5.22e-07,
+ "loss": 0.2839,
+ "step": 1050
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 56.26525115966797,
+ "learning_rate": 5.344999999999999e-07,
+ "loss": 0.2329,
+ "step": 1075
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 112.37168884277344,
+ "learning_rate": 5.47e-07,
+ "loss": 0.2746,
+ "step": 1100
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 92.82706451416016,
+ "learning_rate": 5.595e-07,
+ "loss": 0.2621,
+ "step": 1125
+ },
+ {
+ "epoch": 0.07,
+ "grad_norm": 17.20562744140625,
+ "learning_rate": 5.719999999999999e-07,
+ "loss": 0.1826,
+ "step": 1150
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 31.119354248046875,
+ "learning_rate": 5.845e-07,
+ "loss": 0.2487,
+ "step": 1175
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 7.4088850021362305,
+ "learning_rate": 5.97e-07,
+ "loss": 0.297,
+ "step": 1200
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 72.84540557861328,
+ "learning_rate": 6.095e-07,
+ "loss": 0.3108,
+ "step": 1225
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 38.68337631225586,
+ "learning_rate": 6.219999999999999e-07,
+ "loss": 0.2819,
+ "step": 1250
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 5.215000152587891,
+ "learning_rate": 6.344999999999999e-07,
+ "loss": 0.1953,
+ "step": 1275
+ },
+ {
+ "epoch": 0.08,
+ "grad_norm": 41.42685317993164,
+ "learning_rate": 6.47e-07,
+ "loss": 0.2333,
+ "step": 1300
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 6.224233150482178,
+ "learning_rate": 6.595e-07,
+ "loss": 0.2213,
+ "step": 1325
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 48.12126541137695,
+ "learning_rate": 6.72e-07,
+ "loss": 0.3121,
+ "step": 1350
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 23.151887893676758,
+ "learning_rate": 6.845e-07,
+ "loss": 0.2076,
+ "step": 1375
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 53.516395568847656,
+ "learning_rate": 6.97e-07,
+ "loss": 0.2835,
+ "step": 1400
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 41.62558364868164,
+ "learning_rate": 7.094999999999999e-07,
+ "loss": 0.2574,
+ "step": 1425
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 98.05493927001953,
+ "learning_rate": 7.219999999999999e-07,
+ "loss": 0.3034,
+ "step": 1450
+ },
+ {
+ "epoch": 0.09,
+ "grad_norm": 86.28963470458984,
+ "learning_rate": 7.345e-07,
+ "loss": 0.2657,
+ "step": 1475
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 2.8914854526519775,
+ "learning_rate": 7.47e-07,
+ "loss": 0.2636,
+ "step": 1500
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 56.13273239135742,
+ "learning_rate": 7.594999999999999e-07,
+ "loss": 0.1522,
+ "step": 1525
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 12.941767692565918,
+ "learning_rate": 7.72e-07,
+ "loss": 0.2097,
+ "step": 1550
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 13.613518714904785,
+ "learning_rate": 7.845e-07,
+ "loss": 0.2303,
+ "step": 1575
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 23.761892318725586,
+ "learning_rate": 7.970000000000001e-07,
+ "loss": 0.2763,
+ "step": 1600
+ },
+ {
+ "epoch": 0.1,
+ "grad_norm": 31.896230697631836,
+ "learning_rate": 8.094999999999999e-07,
+ "loss": 0.2722,
+ "step": 1625
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 51.43158721923828,
+ "learning_rate": 8.219999999999999e-07,
+ "loss": 0.2228,
+ "step": 1650
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 2.879077196121216,
+ "learning_rate": 8.345e-07,
+ "loss": 0.2694,
+ "step": 1675
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 98.36167907714844,
+ "learning_rate": 8.469999999999999e-07,
+ "loss": 0.3088,
+ "step": 1700
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 16.084274291992188,
+ "learning_rate": 8.595e-07,
+ "loss": 0.1828,
+ "step": 1725
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 5.136277675628662,
+ "learning_rate": 8.72e-07,
+ "loss": 0.1753,
+ "step": 1750
+ },
+ {
+ "epoch": 0.11,
+ "grad_norm": 64.78803253173828,
+ "learning_rate": 8.845e-07,
+ "loss": 0.198,
+ "step": 1775
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 41.84619903564453,
+ "learning_rate": 8.969999999999999e-07,
+ "loss": 0.2894,
+ "step": 1800
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 45.18673324584961,
+ "learning_rate": 9.094999999999999e-07,
+ "loss": 0.1844,
+ "step": 1825
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 20.42123794555664,
+ "learning_rate": 9.22e-07,
+ "loss": 0.2576,
+ "step": 1850
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 8.751657485961914,
+ "learning_rate": 9.344999999999999e-07,
+ "loss": 0.2492,
+ "step": 1875
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 29.69828224182129,
+ "learning_rate": 9.469999999999999e-07,
+ "loss": 0.1479,
+ "step": 1900
+ },
+ {
+ "epoch": 0.12,
+ "grad_norm": 22.91360855102539,
+ "learning_rate": 9.594999999999999e-07,
+ "loss": 0.2164,
+ "step": 1925
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 17.22205352783203,
+ "learning_rate": 9.72e-07,
+ "loss": 0.2276,
+ "step": 1950
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 83.4366683959961,
+ "learning_rate": 9.845e-07,
+ "loss": 0.2717,
+ "step": 1975
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 11.829337120056152,
+ "learning_rate": 9.97e-07,
+ "loss": 0.2632,
+ "step": 2000
+ },
+ {
+ "epoch": 0.13,
+ "eval_loss": 0.22439254820346832,
+ "eval_runtime": 7771.0673,
+ "eval_samples_per_second": 1.211,
+ "eval_steps_per_second": 0.606,
+ "eval_wer": 0.13156789924107865,
+ "step": 2000
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 42.36271667480469,
+ "learning_rate": 9.936666666666667e-07,
+ "loss": 0.21,
+ "step": 2025
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 57.45354461669922,
+ "learning_rate": 9.853333333333333e-07,
+ "loss": 0.2244,
+ "step": 2050
+ },
+ {
+ "epoch": 0.13,
+ "grad_norm": 17.302165985107422,
+ "learning_rate": 9.773333333333333e-07,
+ "loss": 0.2492,
+ "step": 2075
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 41.777069091796875,
+ "learning_rate": 9.69e-07,
+ "loss": 0.1598,
+ "step": 2100
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 37.14485549926758,
+ "learning_rate": 9.606666666666666e-07,
+ "loss": 0.2483,
+ "step": 2125
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 53.22433090209961,
+ "learning_rate": 9.523333333333333e-07,
+ "loss": 0.1913,
+ "step": 2150
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 78.79158782958984,
+ "learning_rate": 9.439999999999999e-07,
+ "loss": 0.3075,
+ "step": 2175
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 2.1396484375,
+ "learning_rate": 9.356666666666666e-07,
+ "loss": 0.2427,
+ "step": 2200
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 9.334494590759277,
+ "learning_rate": 9.273333333333333e-07,
+ "loss": 0.2201,
+ "step": 2225
+ },
+ {
+ "epoch": 0.14,
+ "grad_norm": 8.948480606079102,
+ "learning_rate": 9.19e-07,
+ "loss": 0.2047,
+ "step": 2250
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 3.004768133163452,
+ "learning_rate": 9.106666666666666e-07,
+ "loss": 0.1928,
+ "step": 2275
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 3.458395481109619,
+ "learning_rate": 9.023333333333333e-07,
+ "loss": 0.1788,
+ "step": 2300
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 61.66895294189453,
+ "learning_rate": 8.939999999999999e-07,
+ "loss": 0.1959,
+ "step": 2325
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 45.452789306640625,
+ "learning_rate": 8.856666666666666e-07,
+ "loss": 0.211,
+ "step": 2350
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 18.07378578186035,
+ "learning_rate": 8.773333333333332e-07,
+ "loss": 0.2391,
+ "step": 2375
+ },
+ {
+ "epoch": 0.15,
+ "grad_norm": 46.68052291870117,
+ "learning_rate": 8.69e-07,
+ "loss": 0.1782,
+ "step": 2400
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 5.451249599456787,
+ "learning_rate": 8.606666666666667e-07,
+ "loss": 0.1569,
+ "step": 2425
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 1.6330296993255615,
+ "learning_rate": 8.523333333333334e-07,
+ "loss": 0.1381,
+ "step": 2450
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 43.628761291503906,
+ "learning_rate": 8.439999999999999e-07,
+ "loss": 0.1943,
+ "step": 2475
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 42.83442687988281,
+ "learning_rate": 8.356666666666666e-07,
+ "loss": 0.1937,
+ "step": 2500
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 41.783485412597656,
+ "learning_rate": 8.273333333333333e-07,
+ "loss": 0.1738,
+ "step": 2525
+ },
+ {
+ "epoch": 0.16,
+ "grad_norm": 43.905025482177734,
+ "learning_rate": 8.189999999999999e-07,
+ "loss": 0.2068,
+ "step": 2550
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 35.906982421875,
+ "learning_rate": 8.106666666666666e-07,
+ "loss": 0.3004,
+ "step": 2575
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 75.37654113769531,
+ "learning_rate": 8.023333333333333e-07,
+ "loss": 0.2073,
+ "step": 2600
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 0.4306688904762268,
+ "learning_rate": 7.94e-07,
+ "loss": 0.1722,
+ "step": 2625
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 48.76789093017578,
+ "learning_rate": 7.856666666666665e-07,
+ "loss": 0.2808,
+ "step": 2650
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 21.527475357055664,
+ "learning_rate": 7.773333333333333e-07,
+ "loss": 0.1416,
+ "step": 2675
+ },
+ {
+ "epoch": 0.17,
+ "grad_norm": 6.267962455749512,
+ "learning_rate": 7.69e-07,
+ "loss": 0.1522,
+ "step": 2700
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 38.2205696105957,
+ "learning_rate": 7.606666666666667e-07,
+ "loss": 0.1988,
+ "step": 2725
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 2.1994435787200928,
+ "learning_rate": 7.523333333333333e-07,
+ "loss": 0.2384,
+ "step": 2750
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 21.002376556396484,
+ "learning_rate": 7.44e-07,
+ "loss": 0.198,
+ "step": 2775
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 66.96015167236328,
+ "learning_rate": 7.356666666666667e-07,
+ "loss": 0.2185,
+ "step": 2800
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 16.91470718383789,
+ "learning_rate": 7.273333333333333e-07,
+ "loss": 0.2149,
+ "step": 2825
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 12.189261436462402,
+ "learning_rate": 7.189999999999999e-07,
+ "loss": 0.1907,
+ "step": 2850
+ },
+ {
+ "epoch": 0.18,
+ "grad_norm": 5.648806095123291,
+ "learning_rate": 7.106666666666666e-07,
+ "loss": 0.1634,
+ "step": 2875
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 6.627074241638184,
+ "learning_rate": 7.023333333333333e-07,
+ "loss": 0.2275,
+ "step": 2900
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 42.446556091308594,
+ "learning_rate": 6.939999999999999e-07,
+ "loss": 0.1888,
+ "step": 2925
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 19.29751968383789,
+ "learning_rate": 6.856666666666667e-07,
+ "loss": 0.1603,
+ "step": 2950
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 88.98928833007812,
+ "learning_rate": 6.773333333333334e-07,
+ "loss": 0.2295,
+ "step": 2975
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 13.686836242675781,
+ "learning_rate": 6.69e-07,
+ "loss": 0.1694,
+ "step": 3000
+ },
+ {
+ "epoch": 0.19,
+ "eval_loss": 0.20859745144844055,
+ "eval_runtime": 7797.8116,
+ "eval_samples_per_second": 1.207,
+ "eval_steps_per_second": 0.604,
+ "eval_wer": 0.12344582593250444,
+ "step": 3000
+ },
+ {
+ "epoch": 0.19,
+ "grad_norm": 44.50759506225586,
+ "learning_rate": 6.606666666666666e-07,
+ "loss": 0.2145,
+ "step": 3025
+ },
+ {
+ "epoch": 0.2,
+ "grad_norm": 53.09928512573242,
+ "learning_rate": 6.523333333333333e-07,
+ "loss": 0.1721,
+ "step": 3050
+ },
+ {
+ "epoch": 0.2,
+ "grad_norm": 53.15538024902344,
+ "learning_rate": 6.44e-07,
+ "loss": 0.2002,
+ "step": 3075
+ },
+ {
+ "epoch": 0.2,
+ "grad_norm": 28.469669342041016,
+ "learning_rate": 6.356666666666667e-07,
+ "loss": 0.184,
+ "step": 3100
+ },
+ {
+ "epoch": 0.2,
+ "grad_norm": 63.475502014160156,
+ "learning_rate": 6.273333333333333e-07,
+ "loss": 0.1787,
+ "step": 3125
+ },
+ {
+ "epoch": 0.2,
+ "grad_norm": 38.70827865600586,
+ "learning_rate": 6.189999999999999e-07,
+ "loss": 0.1815,
+ "step": 3150
+ },
+ {
+ "epoch": 0.2,
+ "grad_norm": 48.54985809326172,
+ "learning_rate": 6.106666666666666e-07,
+ "loss": 0.1589,
+ "step": 3175
+ },
+ {
+ "epoch": 0.21,
+ "grad_norm": 25.11480140686035,
+ "learning_rate": 6.023333333333333e-07,
+ "loss": 0.1795,
+ "step": 3200
+ },
+ {
+ "epoch": 0.21,
+ "grad_norm": 17.55191421508789,
+ "learning_rate": 5.939999999999999e-07,
+ "loss": 0.1464,
+ "step": 3225
+ },
+ {
+ "epoch": 0.21,
+ "grad_norm": 53.10033416748047,
+ "learning_rate": 5.856666666666667e-07,
+ "loss": 0.2208,
+ "step": 3250
+ },
+ {
+ "epoch": 0.21,
+ "grad_norm": 1.9264570474624634,
+ "learning_rate": 5.773333333333334e-07,
+ "loss": 0.1538,
+ "step": 3275
+ },
+ {
+ "epoch": 0.21,
+ "grad_norm": 11.413477897644043,
+ "learning_rate": 5.69e-07,
+ "loss": 0.2038,
+ "step": 3300
+ },
+ {
+ "epoch": 0.21,
+ "grad_norm": 53.29780578613281,
+ "learning_rate": 5.606666666666666e-07,
+ "loss": 0.1312,
+ "step": 3325
+ },
+ {
+ "epoch": 0.22,
+ "grad_norm": 3.7282674312591553,
+ "learning_rate": 5.523333333333333e-07,
+ "loss": 0.2328,
+ "step": 3350
+ },
+ {
+ "epoch": 0.22,
+ "grad_norm": Infinity,
+ "learning_rate": 5.443333333333333e-07,
+ "loss": 0.2432,
+ "step": 3375
+ },
+ {
+ "epoch": 0.22,
+ "grad_norm": 22.73953628540039,
+ "learning_rate": 5.36e-07,
+ "loss": 0.2353,
+ "step": 3400
+ },
+ {
+ "epoch": 0.22,
+ "grad_norm": 19.394702911376953,
+ "learning_rate": 5.276666666666666e-07,
+ "loss": 0.0976,
+ "step": 3425
+ },
+ {
+ "epoch": 0.22,
+ "grad_norm": 1.5477691888809204,
+ "learning_rate": 5.193333333333332e-07,
+ "loss": 0.1921,
+ "step": 3450
+ },
+ {
+ "epoch": 0.22,
+ "grad_norm": 33.865806579589844,
+ "learning_rate": 5.11e-07,
+ "loss": 0.1957,
+ "step": 3475
+ },
+ {
+ "epoch": 0.23,
+ "grad_norm": 8.566771507263184,
+ "learning_rate": 5.026666666666667e-07,
+ "loss": 0.1601,
+ "step": 3500
+ },
+ {
+ "epoch": 0.23,
+ "grad_norm": 22.204965591430664,
+ "learning_rate": 4.943333333333333e-07,
+ "loss": 0.1894,
+ "step": 3525
+ },
+ {
+ "epoch": 0.23,
+ "grad_norm": 32.81788635253906,
+ "learning_rate": 4.86e-07,
+ "loss": 0.2235,
+ "step": 3550
+ },
+ {
+ "epoch": 0.23,
+ "grad_norm": 60.057193756103516,
+ "learning_rate": 4.776666666666667e-07,
+ "loss": 0.1714,
+ "step": 3575
+ },
+ {
+ "epoch": 0.23,
+ "grad_norm": 11.461939811706543,
+ "learning_rate": 4.693333333333333e-07,
+ "loss": 0.1757,
+ "step": 3600
+ },
+ {
+ "epoch": 0.23,
+ "grad_norm": 29.48383331298828,
+ "learning_rate": 4.61e-07,
+ "loss": 0.2581,
+ "step": 3625
+ },
+ {
+ "epoch": 0.23,
+ "grad_norm": 5.3872270584106445,
+ "learning_rate": 4.526666666666666e-07,
+ "loss": 0.1493,
+ "step": 3650
+ },
+ {
+ "epoch": 0.24,
+ "grad_norm": 24.588903427124023,
+ "learning_rate": 4.4433333333333333e-07,
+ "loss": 0.2003,
+ "step": 3675
+ },
+ {
+ "epoch": 0.24,
+ "grad_norm": 42.52607727050781,
+ "learning_rate": 4.36e-07,
+ "loss": 0.2212,
+ "step": 3700
+ },
+ {
+ "epoch": 0.24,
+ "grad_norm": 17.575077056884766,
+ "learning_rate": 4.2766666666666664e-07,
+ "loss": 0.1639,
+ "step": 3725
+ },
+ {
+ "epoch": 0.24,
+ "grad_norm": 77.39998626708984,
+ "learning_rate": 4.193333333333333e-07,
+ "loss": 0.2154,
+ "step": 3750
+ },
+ {
+ "epoch": 0.24,
+ "grad_norm": 65.65005493164062,
+ "learning_rate": 4.1099999999999996e-07,
+ "loss": 0.2396,
+ "step": 3775
+ },
+ {
+ "epoch": 0.24,
+ "grad_norm": 45.75455093383789,
+ "learning_rate": 4.0266666666666667e-07,
+ "loss": 0.159,
+ "step": 3800
+ },
+ {
+ "epoch": 0.25,
+ "grad_norm": 44.56821060180664,
+ "learning_rate": 3.943333333333333e-07,
+ "loss": 0.1935,
+ "step": 3825
+ },
+ {
+ "epoch": 0.25,
+ "grad_norm": 11.89593505859375,
+ "learning_rate": 3.86e-07,
+ "loss": 0.1649,
+ "step": 3850
+ },
+ {
+ "epoch": 0.25,
+ "grad_norm": 7.169790267944336,
+ "learning_rate": 3.7766666666666665e-07,
+ "loss": 0.1925,
+ "step": 3875
+ },
+ {
+ "epoch": 0.25,
+ "grad_norm": 141.48680114746094,
+ "learning_rate": 3.693333333333333e-07,
+ "loss": 0.1757,
+ "step": 3900
+ },
+ {
+ "epoch": 0.25,
+ "grad_norm": 1.0640227794647217,
+ "learning_rate": 3.6099999999999996e-07,
+ "loss": 0.1733,
+ "step": 3925
+ },
+ {
+ "epoch": 0.25,
+ "grad_norm": 6.541628360748291,
+ "learning_rate": 3.526666666666667e-07,
+ "loss": 0.1857,
+ "step": 3950
+ },
+ {
+ "epoch": 0.26,
+ "grad_norm": 62.5667610168457,
+ "learning_rate": 3.4433333333333333e-07,
+ "loss": 0.1946,
+ "step": 3975
+ },
+ {
+ "epoch": 0.26,
+ "grad_norm": 2.5611398220062256,
+ "learning_rate": 3.36e-07,
+ "loss": 0.1658,
+ "step": 4000
+ },
+ {
+ "epoch": 0.26,
+ "eval_loss": 0.19874949753284454,
+ "eval_runtime": 7779.7728,
+ "eval_samples_per_second": 1.21,
+ "eval_steps_per_second": 0.605,
+ "eval_wer": 0.12049087679638301,
+ "step": 4000
+ }
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 1,
+ "save_steps": 1000,
+ "total_flos": 8.164839344884941e+18,
+ "train_batch_size": 1,
+ "trial_name": null,
+ "trial_params": null
+ }
checkpoint-4000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:354821a80788dff5d057ffce4a4d80a406ce5bb0affa48cc6029ca3faa14edf2
+ size 5048
checkpoint-5000/config.json ADDED
@@ -0,0 +1,51 @@
+ {
+ "_name_or_path": "openai/whisper-medium",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 1024,
+ "decoder_attention_heads": 16,
+ "decoder_ffn_dim": 4096,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 24,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 16,
+ "encoder_ffn_dim": 4096,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 24,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 24,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "torch_dtype": "float32",
+ "transformers_version": "4.39.0.dev0",
+ "use_cache": true,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
checkpoint-5000/generation_config.json ADDED
@@ -0,0 +1,248 @@
+ {
+ "alignment_heads": [
+ [
+ 13,
+ 15
+ ],
+ [
+ 15,
+ 4
+ ],
+ [
+ 15,
+ 15
+ ],
+ [
+ 16,
+ 1
+ ],
+ [
+ 20,
+ 0
+ ],
+ [
+ 23,
+ 4
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ null
+ ],
+ [
+ 2,
+ 50359
+ ]
+ ],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50358,
+ 50359,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.39.0.dev0"
248
+ }
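The `suppress_tokens` and `begin_suppress_tokens` fields in the generation config above list token ids that decoding must never emit (the latter only at the very first decode step). A minimal sketch of how such suppression works, not the Transformers implementation itself, is to force the logits of the banned ids to negative infinity before sampling; the ids below are copied from the config (the `suppress_tokens` list is truncated for brevity) and the logits are toy data:

```python
import math

# Values taken from generation_config.json above.
begin_suppress_tokens = [220, 50257]   # banned only at the first decode step
suppress_tokens = [1, 2, 7, 8, 9, 10]  # truncated list from the config

def apply_suppression(logits, step):
    """Return a copy of `logits` with suppressed token ids set to -inf."""
    out = list(logits)
    banned = set(suppress_tokens)
    if step == 0:
        banned |= set(begin_suppress_tokens)
    for tok in banned:
        if tok < len(out):  # guard: real vocab is ~51k ids, toy logits are short
            out[tok] = -math.inf
    return out

masked = apply_suppression([0.5] * 12, step=0)
assert masked[1] == -math.inf   # suppressed everywhere
assert masked[0] == 0.5         # untouched
```

After masking, a softmax over the logits assigns the banned ids zero probability, which is how the model is prevented from producing special tokens mid-transcript.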
checkpoint-5000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d6895584743a460ddcc75e7583354720bd420638e167d30f3a04f3e181a75e6d
+ size 3055544304
checkpoint-5000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:efbbca218494d51e6e1dd9a4db941b6ab626b16ddc75d03b7ca24c5ad2614371
+ size 6099375168
checkpoint-5000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
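The derived fields in the preprocessor config above are not independent: `n_samples` and `nb_max_frames` follow arithmetically from `chunk_length`, `sampling_rate`, and `hop_length`. A small sanity-check sketch, with values copied from the config:

```python
# Base settings from preprocessor_config.json above.
chunk_length = 30      # audio chunk length in seconds
sampling_rate = 16000  # Hz
hop_length = 160       # samples between successive STFT frames

# Derived values that must match the config's stored fields.
n_samples = chunk_length * sampling_rate  # samples per 30 s chunk
nb_max_frames = n_samples // hop_length   # log-mel frames per chunk

assert n_samples == 480000
assert nb_max_frames == 3000
```

With a 160-sample hop at 16 kHz, each frame covers 10 ms, so a 30-second chunk yields 3000 frames, which is exactly the fixed input length the Whisper encoder expects.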
checkpoint-5000/rng_state_0.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bdf2c1d98aa41546ecf319cd9bd73773d3f53fabd87cd6e3ea3528db38a0d927
+ size 14512
checkpoint-5000/rng_state_1.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9cfe56bc5ddaabc7b5ed388a44933bd48f6125f85c7457aa0b355f9afb0e88ac
+ size 14512