showgan committed
Commit: 72621ec
Parent: 3f7bb26

Training in progress, step 1000

This view is limited to 50 files because the commit contains too many changes.
Files changed (50):
  1. Models/hindi/added_tokens.json +1609 -0
  2. Models/hindi/checkpoint-1000/config.json +52 -0
  3. Models/hindi/checkpoint-1000/generation_config.json +265 -0
  4. Models/hindi/checkpoint-1000/model.safetensors +3 -0
  5. Models/hindi/checkpoint-1000/optimizer.pt +3 -0
  6. Models/hindi/checkpoint-1000/preprocessor_config.json +14 -0
  7. Models/hindi/checkpoint-1000/rng_state.pth +3 -0
  8. Models/hindi/checkpoint-1000/scheduler.pt +3 -0
  9. Models/hindi/checkpoint-1000/trainer_state.json +310 -0
  10. Models/hindi/checkpoint-1000/training_args.bin +3 -0
  11. Models/hindi/checkpoint-2000/config.json +52 -0
  12. Models/hindi/checkpoint-2000/generation_config.json +265 -0
  13. Models/hindi/checkpoint-2000/model.safetensors +3 -0
  14. Models/hindi/checkpoint-2000/optimizer.pt +3 -0
  15. Models/hindi/checkpoint-2000/preprocessor_config.json +14 -0
  16. Models/hindi/checkpoint-2000/rng_state.pth +3 -0
  17. Models/hindi/checkpoint-2000/scheduler.pt +3 -0
  18. Models/hindi/checkpoint-2000/trainer_state.json +599 -0
  19. Models/hindi/checkpoint-2000/training_args.bin +3 -0
  20. Models/hindi/checkpoint-3000/config.json +52 -0
  21. Models/hindi/checkpoint-3000/generation_config.json +265 -0
  22. Models/hindi/checkpoint-3000/model.safetensors +3 -0
  23. Models/hindi/checkpoint-3000/optimizer.pt +3 -0
  24. Models/hindi/checkpoint-3000/preprocessor_config.json +14 -0
  25. Models/hindi/checkpoint-3000/rng_state.pth +3 -0
  26. Models/hindi/checkpoint-3000/scheduler.pt +3 -0
  27. Models/hindi/checkpoint-3000/trainer_state.json +888 -0
  28. Models/hindi/checkpoint-3000/training_args.bin +3 -0
  29. Models/hindi/checkpoint-4000/config.json +52 -0
  30. Models/hindi/checkpoint-4000/generation_config.json +265 -0
  31. Models/hindi/checkpoint-4000/model.safetensors +3 -0
  32. Models/hindi/checkpoint-4000/optimizer.pt +3 -0
  33. Models/hindi/checkpoint-4000/preprocessor_config.json +14 -0
  34. Models/hindi/checkpoint-4000/rng_state.pth +3 -0
  35. Models/hindi/checkpoint-4000/scheduler.pt +3 -0
  36. Models/hindi/checkpoint-4000/trainer_state.json +1177 -0
  37. Models/hindi/checkpoint-4000/training_args.bin +3 -0
  38. Models/hindi/checkpoint-5000/config.json +52 -0
  39. Models/hindi/checkpoint-5000/generation_config.json +265 -0
  40. Models/hindi/checkpoint-5000/model.safetensors +3 -0
  41. Models/hindi/checkpoint-5000/optimizer.pt +3 -0
  42. Models/hindi/checkpoint-5000/preprocessor_config.json +14 -0
  43. Models/hindi/checkpoint-5000/rng_state.pth +3 -0
  44. Models/hindi/checkpoint-5000/scheduler.pt +3 -0
  45. Models/hindi/checkpoint-5000/trainer_state.json +1466 -0
  46. Models/hindi/checkpoint-5000/training_args.bin +3 -0
  47. Models/hindi/config.json +52 -0
  48. Models/hindi/generation_config.json +265 -0
  49. Models/hindi/merges.txt +0 -0
  50. Models/hindi/model.safetensors +3 -0
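The `checkpoint-1000` … `checkpoint-5000` directories listed above follow the standard Hugging Face `transformers` Trainer checkpoint layout (model weights plus optimizer, scheduler, and RNG state for resuming training). A minimal sketch that checks whether a checkpoint directory is complete — the file names are taken from the listing above, the directory path is an assumption:

```python
import os

# Files a transformers Trainer checkpoint needs to resume training,
# as seen in each checkpoint-N directory of this commit.
EXPECTED = [
    "config.json", "generation_config.json", "model.safetensors",
    "optimizer.pt", "scheduler.pt", "rng_state.pth",
    "trainer_state.json", "training_args.bin", "preprocessor_config.json",
]

def missing_files(ckpt_dir: str) -> list[str]:
    """Return the expected checkpoint files absent from ckpt_dir."""
    return [f for f in EXPECTED if not os.path.exists(os.path.join(ckpt_dir, f))]

# e.g. missing_files("Models/hindi/checkpoint-5000") returns [] for a complete checkpoint
```

Passing a complete checkpoint directory (or the latest one) to `Trainer.train(resume_from_checkpoint=...)` is the usual way these files are consumed.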
Models/hindi/added_tokens.json ADDED
@@ -0,0 +1,1609 @@
+ {
+ "<|0.00|>": 50364,
+ "<|0.02|>": 50365,
+ "<|0.04|>": 50366,
+ "<|0.06|>": 50367,
+ "<|0.08|>": 50368,
+ "<|0.10|>": 50369,
+ "<|0.12|>": 50370,
+ "<|0.14|>": 50371,
+ "<|0.16|>": 50372,
+ "<|0.18|>": 50373,
+ "<|0.20|>": 50374,
+ "<|0.22|>": 50375,
+ "<|0.24|>": 50376,
+ "<|0.26|>": 50377,
+ "<|0.28|>": 50378,
+ "<|0.30|>": 50379,
+ "<|0.32|>": 50380,
+ "<|0.34|>": 50381,
+ "<|0.36|>": 50382,
+ "<|0.38|>": 50383,
+ "<|0.40|>": 50384,
+ "<|0.42|>": 50385,
+ "<|0.44|>": 50386,
+ "<|0.46|>": 50387,
+ "<|0.48|>": 50388,
+ "<|0.50|>": 50389,
+ "<|0.52|>": 50390,
+ "<|0.54|>": 50391,
+ "<|0.56|>": 50392,
+ "<|0.58|>": 50393,
+ "<|0.60|>": 50394,
+ "<|0.62|>": 50395,
+ "<|0.64|>": 50396,
+ "<|0.66|>": 50397,
+ "<|0.68|>": 50398,
+ "<|0.70|>": 50399,
+ "<|0.72|>": 50400,
+ "<|0.74|>": 50401,
+ "<|0.76|>": 50402,
+ "<|0.78|>": 50403,
+ "<|0.80|>": 50404,
+ "<|0.82|>": 50405,
+ "<|0.84|>": 50406,
+ "<|0.86|>": 50407,
+ "<|0.88|>": 50408,
+ "<|0.90|>": 50409,
+ "<|0.92|>": 50410,
+ "<|0.94|>": 50411,
+ "<|0.96|>": 50412,
+ "<|0.98|>": 50413,
+ "<|1.00|>": 50414,
+ "<|1.02|>": 50415,
+ "<|1.04|>": 50416,
+ "<|1.06|>": 50417,
+ "<|1.08|>": 50418,
+ "<|1.10|>": 50419,
+ "<|1.12|>": 50420,
+ "<|1.14|>": 50421,
+ "<|1.16|>": 50422,
+ "<|1.18|>": 50423,
+ "<|1.20|>": 50424,
+ "<|1.22|>": 50425,
+ "<|1.24|>": 50426,
+ "<|1.26|>": 50427,
+ "<|1.28|>": 50428,
+ "<|1.30|>": 50429,
+ "<|1.32|>": 50430,
+ "<|1.34|>": 50431,
+ "<|1.36|>": 50432,
+ "<|1.38|>": 50433,
+ "<|1.40|>": 50434,
+ "<|1.42|>": 50435,
+ "<|1.44|>": 50436,
+ "<|1.46|>": 50437,
+ "<|1.48|>": 50438,
+ "<|1.50|>": 50439,
+ "<|1.52|>": 50440,
+ "<|1.54|>": 50441,
+ "<|1.56|>": 50442,
+ "<|1.58|>": 50443,
+ "<|1.60|>": 50444,
+ "<|1.62|>": 50445,
+ "<|1.64|>": 50446,
+ "<|1.66|>": 50447,
+ "<|1.68|>": 50448,
+ "<|1.70|>": 50449,
+ "<|1.72|>": 50450,
+ "<|1.74|>": 50451,
+ "<|1.76|>": 50452,
+ "<|1.78|>": 50453,
+ "<|1.80|>": 50454,
+ "<|1.82|>": 50455,
+ "<|1.84|>": 50456,
+ "<|1.86|>": 50457,
+ "<|1.88|>": 50458,
+ "<|1.90|>": 50459,
+ "<|1.92|>": 50460,
+ "<|1.94|>": 50461,
+ "<|1.96|>": 50462,
+ "<|1.98|>": 50463,
+ "<|10.00|>": 50864,
+ "<|10.02|>": 50865,
+ "<|10.04|>": 50866,
+ "<|10.06|>": 50867,
+ "<|10.08|>": 50868,
+ "<|10.10|>": 50869,
+ "<|10.12|>": 50870,
+ "<|10.14|>": 50871,
+ "<|10.16|>": 50872,
+ "<|10.18|>": 50873,
+ "<|10.20|>": 50874,
+ "<|10.22|>": 50875,
+ "<|10.24|>": 50876,
+ "<|10.26|>": 50877,
+ "<|10.28|>": 50878,
+ "<|10.30|>": 50879,
+ "<|10.32|>": 50880,
+ "<|10.34|>": 50881,
+ "<|10.36|>": 50882,
+ "<|10.38|>": 50883,
+ "<|10.40|>": 50884,
+ "<|10.42|>": 50885,
+ "<|10.44|>": 50886,
+ "<|10.46|>": 50887,
+ "<|10.48|>": 50888,
+ "<|10.50|>": 50889,
+ "<|10.52|>": 50890,
+ "<|10.54|>": 50891,
+ "<|10.56|>": 50892,
+ "<|10.58|>": 50893,
+ "<|10.60|>": 50894,
+ "<|10.62|>": 50895,
+ "<|10.64|>": 50896,
+ "<|10.66|>": 50897,
+ "<|10.68|>": 50898,
+ "<|10.70|>": 50899,
+ "<|10.72|>": 50900,
+ "<|10.74|>": 50901,
+ "<|10.76|>": 50902,
+ "<|10.78|>": 50903,
+ "<|10.80|>": 50904,
+ "<|10.82|>": 50905,
+ "<|10.84|>": 50906,
+ "<|10.86|>": 50907,
+ "<|10.88|>": 50908,
+ "<|10.90|>": 50909,
+ "<|10.92|>": 50910,
+ "<|10.94|>": 50911,
+ "<|10.96|>": 50912,
+ "<|10.98|>": 50913,
+ "<|11.00|>": 50914,
+ "<|11.02|>": 50915,
+ "<|11.04|>": 50916,
+ "<|11.06|>": 50917,
+ "<|11.08|>": 50918,
+ "<|11.10|>": 50919,
+ "<|11.12|>": 50920,
+ "<|11.14|>": 50921,
+ "<|11.16|>": 50922,
+ "<|11.18|>": 50923,
+ "<|11.20|>": 50924,
+ "<|11.22|>": 50925,
+ "<|11.24|>": 50926,
+ "<|11.26|>": 50927,
+ "<|11.28|>": 50928,
+ "<|11.30|>": 50929,
+ "<|11.32|>": 50930,
+ "<|11.34|>": 50931,
+ "<|11.36|>": 50932,
+ "<|11.38|>": 50933,
+ "<|11.40|>": 50934,
+ "<|11.42|>": 50935,
+ "<|11.44|>": 50936,
+ "<|11.46|>": 50937,
+ "<|11.48|>": 50938,
+ "<|11.50|>": 50939,
+ "<|11.52|>": 50940,
+ "<|11.54|>": 50941,
+ "<|11.56|>": 50942,
+ "<|11.58|>": 50943,
+ "<|11.60|>": 50944,
+ "<|11.62|>": 50945,
+ "<|11.64|>": 50946,
+ "<|11.66|>": 50947,
+ "<|11.68|>": 50948,
+ "<|11.70|>": 50949,
+ "<|11.72|>": 50950,
+ "<|11.74|>": 50951,
+ "<|11.76|>": 50952,
+ "<|11.78|>": 50953,
+ "<|11.80|>": 50954,
+ "<|11.82|>": 50955,
+ "<|11.84|>": 50956,
+ "<|11.86|>": 50957,
+ "<|11.88|>": 50958,
+ "<|11.90|>": 50959,
+ "<|11.92|>": 50960,
+ "<|11.94|>": 50961,
+ "<|11.96|>": 50962,
+ "<|11.98|>": 50963,
+ "<|12.00|>": 50964,
+ "<|12.02|>": 50965,
+ "<|12.04|>": 50966,
+ "<|12.06|>": 50967,
+ "<|12.08|>": 50968,
+ "<|12.10|>": 50969,
+ "<|12.12|>": 50970,
+ "<|12.14|>": 50971,
+ "<|12.16|>": 50972,
+ "<|12.18|>": 50973,
+ "<|12.20|>": 50974,
+ "<|12.22|>": 50975,
+ "<|12.24|>": 50976,
+ "<|12.26|>": 50977,
+ "<|12.28|>": 50978,
+ "<|12.30|>": 50979,
+ "<|12.32|>": 50980,
+ "<|12.34|>": 50981,
+ "<|12.36|>": 50982,
+ "<|12.38|>": 50983,
+ "<|12.40|>": 50984,
+ "<|12.42|>": 50985,
+ "<|12.44|>": 50986,
+ "<|12.46|>": 50987,
+ "<|12.48|>": 50988,
+ "<|12.50|>": 50989,
+ "<|12.52|>": 50990,
+ "<|12.54|>": 50991,
+ "<|12.56|>": 50992,
+ "<|12.58|>": 50993,
+ "<|12.60|>": 50994,
+ "<|12.62|>": 50995,
+ "<|12.64|>": 50996,
+ "<|12.66|>": 50997,
+ "<|12.68|>": 50998,
+ "<|12.70|>": 50999,
+ "<|12.72|>": 51000,
+ "<|12.74|>": 51001,
+ "<|12.76|>": 51002,
+ "<|12.78|>": 51003,
+ "<|12.80|>": 51004,
+ "<|12.82|>": 51005,
+ "<|12.84|>": 51006,
+ "<|12.86|>": 51007,
+ "<|12.88|>": 51008,
+ "<|12.90|>": 51009,
+ "<|12.92|>": 51010,
+ "<|12.94|>": 51011,
+ "<|12.96|>": 51012,
+ "<|12.98|>": 51013,
+ "<|13.00|>": 51014,
+ "<|13.02|>": 51015,
+ "<|13.04|>": 51016,
+ "<|13.06|>": 51017,
+ "<|13.08|>": 51018,
+ "<|13.10|>": 51019,
+ "<|13.12|>": 51020,
+ "<|13.14|>": 51021,
+ "<|13.16|>": 51022,
+ "<|13.18|>": 51023,
+ "<|13.20|>": 51024,
+ "<|13.22|>": 51025,
+ "<|13.24|>": 51026,
+ "<|13.26|>": 51027,
+ "<|13.28|>": 51028,
+ "<|13.30|>": 51029,
+ "<|13.32|>": 51030,
+ "<|13.34|>": 51031,
+ "<|13.36|>": 51032,
+ "<|13.38|>": 51033,
+ "<|13.40|>": 51034,
+ "<|13.42|>": 51035,
+ "<|13.44|>": 51036,
+ "<|13.46|>": 51037,
+ "<|13.48|>": 51038,
+ "<|13.50|>": 51039,
+ "<|13.52|>": 51040,
+ "<|13.54|>": 51041,
+ "<|13.56|>": 51042,
+ "<|13.58|>": 51043,
+ "<|13.60|>": 51044,
+ "<|13.62|>": 51045,
+ "<|13.64|>": 51046,
+ "<|13.66|>": 51047,
+ "<|13.68|>": 51048,
+ "<|13.70|>": 51049,
+ "<|13.72|>": 51050,
+ "<|13.74|>": 51051,
+ "<|13.76|>": 51052,
+ "<|13.78|>": 51053,
+ "<|13.80|>": 51054,
+ "<|13.82|>": 51055,
+ "<|13.84|>": 51056,
+ "<|13.86|>": 51057,
+ "<|13.88|>": 51058,
+ "<|13.90|>": 51059,
+ "<|13.92|>": 51060,
+ "<|13.94|>": 51061,
+ "<|13.96|>": 51062,
+ "<|13.98|>": 51063,
+ "<|14.00|>": 51064,
+ "<|14.02|>": 51065,
+ "<|14.04|>": 51066,
+ "<|14.06|>": 51067,
+ "<|14.08|>": 51068,
+ "<|14.10|>": 51069,
+ "<|14.12|>": 51070,
+ "<|14.14|>": 51071,
+ "<|14.16|>": 51072,
+ "<|14.18|>": 51073,
+ "<|14.20|>": 51074,
+ "<|14.22|>": 51075,
+ "<|14.24|>": 51076,
+ "<|14.26|>": 51077,
+ "<|14.28|>": 51078,
+ "<|14.30|>": 51079,
+ "<|14.32|>": 51080,
+ "<|14.34|>": 51081,
+ "<|14.36|>": 51082,
+ "<|14.38|>": 51083,
+ "<|14.40|>": 51084,
+ "<|14.42|>": 51085,
+ "<|14.44|>": 51086,
+ "<|14.46|>": 51087,
+ "<|14.48|>": 51088,
+ "<|14.50|>": 51089,
+ "<|14.52|>": 51090,
+ "<|14.54|>": 51091,
+ "<|14.56|>": 51092,
+ "<|14.58|>": 51093,
+ "<|14.60|>": 51094,
+ "<|14.62|>": 51095,
+ "<|14.64|>": 51096,
+ "<|14.66|>": 51097,
+ "<|14.68|>": 51098,
+ "<|14.70|>": 51099,
+ "<|14.72|>": 51100,
+ "<|14.74|>": 51101,
+ "<|14.76|>": 51102,
+ "<|14.78|>": 51103,
+ "<|14.80|>": 51104,
+ "<|14.82|>": 51105,
+ "<|14.84|>": 51106,
+ "<|14.86|>": 51107,
+ "<|14.88|>": 51108,
+ "<|14.90|>": 51109,
+ "<|14.92|>": 51110,
+ "<|14.94|>": 51111,
+ "<|14.96|>": 51112,
+ "<|14.98|>": 51113,
+ "<|15.00|>": 51114,
+ "<|15.02|>": 51115,
+ "<|15.04|>": 51116,
+ "<|15.06|>": 51117,
+ "<|15.08|>": 51118,
+ "<|15.10|>": 51119,
+ "<|15.12|>": 51120,
+ "<|15.14|>": 51121,
+ "<|15.16|>": 51122,
+ "<|15.18|>": 51123,
+ "<|15.20|>": 51124,
+ "<|15.22|>": 51125,
+ "<|15.24|>": 51126,
+ "<|15.26|>": 51127,
+ "<|15.28|>": 51128,
+ "<|15.30|>": 51129,
+ "<|15.32|>": 51130,
+ "<|15.34|>": 51131,
+ "<|15.36|>": 51132,
+ "<|15.38|>": 51133,
+ "<|15.40|>": 51134,
+ "<|15.42|>": 51135,
+ "<|15.44|>": 51136,
+ "<|15.46|>": 51137,
+ "<|15.48|>": 51138,
+ "<|15.50|>": 51139,
+ "<|15.52|>": 51140,
+ "<|15.54|>": 51141,
+ "<|15.56|>": 51142,
+ "<|15.58|>": 51143,
+ "<|15.60|>": 51144,
+ "<|15.62|>": 51145,
+ "<|15.64|>": 51146,
+ "<|15.66|>": 51147,
+ "<|15.68|>": 51148,
+ "<|15.70|>": 51149,
+ "<|15.72|>": 51150,
+ "<|15.74|>": 51151,
+ "<|15.76|>": 51152,
+ "<|15.78|>": 51153,
+ "<|15.80|>": 51154,
+ "<|15.82|>": 51155,
+ "<|15.84|>": 51156,
+ "<|15.86|>": 51157,
+ "<|15.88|>": 51158,
+ "<|15.90|>": 51159,
+ "<|15.92|>": 51160,
+ "<|15.94|>": 51161,
+ "<|15.96|>": 51162,
+ "<|15.98|>": 51163,
+ "<|16.00|>": 51164,
+ "<|16.02|>": 51165,
+ "<|16.04|>": 51166,
+ "<|16.06|>": 51167,
+ "<|16.08|>": 51168,
+ "<|16.10|>": 51169,
+ "<|16.12|>": 51170,
+ "<|16.14|>": 51171,
+ "<|16.16|>": 51172,
+ "<|16.18|>": 51173,
+ "<|16.20|>": 51174,
+ "<|16.22|>": 51175,
+ "<|16.24|>": 51176,
+ "<|16.26|>": 51177,
+ "<|16.28|>": 51178,
+ "<|16.30|>": 51179,
+ "<|16.32|>": 51180,
+ "<|16.34|>": 51181,
+ "<|16.36|>": 51182,
+ "<|16.38|>": 51183,
+ "<|16.40|>": 51184,
+ "<|16.42|>": 51185,
+ "<|16.44|>": 51186,
+ "<|16.46|>": 51187,
+ "<|16.48|>": 51188,
+ "<|16.50|>": 51189,
+ "<|16.52|>": 51190,
+ "<|16.54|>": 51191,
+ "<|16.56|>": 51192,
+ "<|16.58|>": 51193,
+ "<|16.60|>": 51194,
+ "<|16.62|>": 51195,
+ "<|16.64|>": 51196,
+ "<|16.66|>": 51197,
+ "<|16.68|>": 51198,
+ "<|16.70|>": 51199,
+ "<|16.72|>": 51200,
+ "<|16.74|>": 51201,
+ "<|16.76|>": 51202,
+ "<|16.78|>": 51203,
+ "<|16.80|>": 51204,
+ "<|16.82|>": 51205,
+ "<|16.84|>": 51206,
+ "<|16.86|>": 51207,
+ "<|16.88|>": 51208,
+ "<|16.90|>": 51209,
+ "<|16.92|>": 51210,
+ "<|16.94|>": 51211,
+ "<|16.96|>": 51212,
+ "<|16.98|>": 51213,
+ "<|17.00|>": 51214,
+ "<|17.02|>": 51215,
+ "<|17.04|>": 51216,
+ "<|17.06|>": 51217,
+ "<|17.08|>": 51218,
+ "<|17.10|>": 51219,
+ "<|17.12|>": 51220,
+ "<|17.14|>": 51221,
+ "<|17.16|>": 51222,
+ "<|17.18|>": 51223,
+ "<|17.20|>": 51224,
+ "<|17.22|>": 51225,
+ "<|17.24|>": 51226,
+ "<|17.26|>": 51227,
+ "<|17.28|>": 51228,
+ "<|17.30|>": 51229,
+ "<|17.32|>": 51230,
+ "<|17.34|>": 51231,
+ "<|17.36|>": 51232,
+ "<|17.38|>": 51233,
+ "<|17.40|>": 51234,
+ "<|17.42|>": 51235,
+ "<|17.44|>": 51236,
+ "<|17.46|>": 51237,
+ "<|17.48|>": 51238,
+ "<|17.50|>": 51239,
+ "<|17.52|>": 51240,
+ "<|17.54|>": 51241,
+ "<|17.56|>": 51242,
+ "<|17.58|>": 51243,
+ "<|17.60|>": 51244,
+ "<|17.62|>": 51245,
+ "<|17.64|>": 51246,
+ "<|17.66|>": 51247,
+ "<|17.68|>": 51248,
+ "<|17.70|>": 51249,
+ "<|17.72|>": 51250,
+ "<|17.74|>": 51251,
+ "<|17.76|>": 51252,
+ "<|17.78|>": 51253,
+ "<|17.80|>": 51254,
+ "<|17.82|>": 51255,
+ "<|17.84|>": 51256,
+ "<|17.86|>": 51257,
+ "<|17.88|>": 51258,
+ "<|17.90|>": 51259,
+ "<|17.92|>": 51260,
+ "<|17.94|>": 51261,
+ "<|17.96|>": 51262,
+ "<|17.98|>": 51263,
+ "<|18.00|>": 51264,
+ "<|18.02|>": 51265,
+ "<|18.04|>": 51266,
+ "<|18.06|>": 51267,
+ "<|18.08|>": 51268,
+ "<|18.10|>": 51269,
+ "<|18.12|>": 51270,
+ "<|18.14|>": 51271,
+ "<|18.16|>": 51272,
+ "<|18.18|>": 51273,
+ "<|18.20|>": 51274,
+ "<|18.22|>": 51275,
+ "<|18.24|>": 51276,
+ "<|18.26|>": 51277,
+ "<|18.28|>": 51278,
+ "<|18.30|>": 51279,
+ "<|18.32|>": 51280,
+ "<|18.34|>": 51281,
+ "<|18.36|>": 51282,
+ "<|18.38|>": 51283,
+ "<|18.40|>": 51284,
+ "<|18.42|>": 51285,
+ "<|18.44|>": 51286,
+ "<|18.46|>": 51287,
+ "<|18.48|>": 51288,
+ "<|18.50|>": 51289,
+ "<|18.52|>": 51290,
+ "<|18.54|>": 51291,
+ "<|18.56|>": 51292,
+ "<|18.58|>": 51293,
+ "<|18.60|>": 51294,
+ "<|18.62|>": 51295,
+ "<|18.64|>": 51296,
+ "<|18.66|>": 51297,
+ "<|18.68|>": 51298,
+ "<|18.70|>": 51299,
+ "<|18.72|>": 51300,
+ "<|18.74|>": 51301,
+ "<|18.76|>": 51302,
+ "<|18.78|>": 51303,
+ "<|18.80|>": 51304,
+ "<|18.82|>": 51305,
+ "<|18.84|>": 51306,
+ "<|18.86|>": 51307,
+ "<|18.88|>": 51308,
+ "<|18.90|>": 51309,
+ "<|18.92|>": 51310,
+ "<|18.94|>": 51311,
+ "<|18.96|>": 51312,
+ "<|18.98|>": 51313,
+ "<|19.00|>": 51314,
+ "<|19.02|>": 51315,
+ "<|19.04|>": 51316,
+ "<|19.06|>": 51317,
+ "<|19.08|>": 51318,
+ "<|19.10|>": 51319,
+ "<|19.12|>": 51320,
+ "<|19.14|>": 51321,
+ "<|19.16|>": 51322,
+ "<|19.18|>": 51323,
+ "<|19.20|>": 51324,
+ "<|19.22|>": 51325,
+ "<|19.24|>": 51326,
+ "<|19.26|>": 51327,
+ "<|19.28|>": 51328,
+ "<|19.30|>": 51329,
+ "<|19.32|>": 51330,
+ "<|19.34|>": 51331,
+ "<|19.36|>": 51332,
+ "<|19.38|>": 51333,
+ "<|19.40|>": 51334,
+ "<|19.42|>": 51335,
+ "<|19.44|>": 51336,
+ "<|19.46|>": 51337,
+ "<|19.48|>": 51338,
+ "<|19.50|>": 51339,
+ "<|19.52|>": 51340,
+ "<|19.54|>": 51341,
+ "<|19.56|>": 51342,
+ "<|19.58|>": 51343,
+ "<|19.60|>": 51344,
+ "<|19.62|>": 51345,
+ "<|19.64|>": 51346,
+ "<|19.66|>": 51347,
+ "<|19.68|>": 51348,
+ "<|19.70|>": 51349,
+ "<|19.72|>": 51350,
+ "<|19.74|>": 51351,
+ "<|19.76|>": 51352,
+ "<|19.78|>": 51353,
+ "<|19.80|>": 51354,
+ "<|19.82|>": 51355,
+ "<|19.84|>": 51356,
+ "<|19.86|>": 51357,
+ "<|19.88|>": 51358,
+ "<|19.90|>": 51359,
+ "<|19.92|>": 51360,
+ "<|19.94|>": 51361,
+ "<|19.96|>": 51362,
+ "<|19.98|>": 51363,
+ "<|2.00|>": 50464,
+ "<|2.02|>": 50465,
+ "<|2.04|>": 50466,
+ "<|2.06|>": 50467,
+ "<|2.08|>": 50468,
+ "<|2.10|>": 50469,
+ "<|2.12|>": 50470,
+ "<|2.14|>": 50471,
+ "<|2.16|>": 50472,
+ "<|2.18|>": 50473,
+ "<|2.20|>": 50474,
+ "<|2.22|>": 50475,
+ "<|2.24|>": 50476,
+ "<|2.26|>": 50477,
+ "<|2.28|>": 50478,
+ "<|2.30|>": 50479,
+ "<|2.32|>": 50480,
+ "<|2.34|>": 50481,
+ "<|2.36|>": 50482,
+ "<|2.38|>": 50483,
+ "<|2.40|>": 50484,
+ "<|2.42|>": 50485,
+ "<|2.44|>": 50486,
+ "<|2.46|>": 50487,
+ "<|2.48|>": 50488,
+ "<|2.50|>": 50489,
+ "<|2.52|>": 50490,
+ "<|2.54|>": 50491,
+ "<|2.56|>": 50492,
+ "<|2.58|>": 50493,
+ "<|2.60|>": 50494,
+ "<|2.62|>": 50495,
+ "<|2.64|>": 50496,
+ "<|2.66|>": 50497,
+ "<|2.68|>": 50498,
+ "<|2.70|>": 50499,
+ "<|2.72|>": 50500,
+ "<|2.74|>": 50501,
+ "<|2.76|>": 50502,
+ "<|2.78|>": 50503,
+ "<|2.80|>": 50504,
+ "<|2.82|>": 50505,
+ "<|2.84|>": 50506,
+ "<|2.86|>": 50507,
+ "<|2.88|>": 50508,
+ "<|2.90|>": 50509,
+ "<|2.92|>": 50510,
+ "<|2.94|>": 50511,
+ "<|2.96|>": 50512,
+ "<|2.98|>": 50513,
+ "<|20.00|>": 51364,
+ "<|20.02|>": 51365,
+ "<|20.04|>": 51366,
+ "<|20.06|>": 51367,
+ "<|20.08|>": 51368,
+ "<|20.10|>": 51369,
+ "<|20.12|>": 51370,
+ "<|20.14|>": 51371,
+ "<|20.16|>": 51372,
+ "<|20.18|>": 51373,
+ "<|20.20|>": 51374,
+ "<|20.22|>": 51375,
+ "<|20.24|>": 51376,
+ "<|20.26|>": 51377,
+ "<|20.28|>": 51378,
+ "<|20.30|>": 51379,
+ "<|20.32|>": 51380,
+ "<|20.34|>": 51381,
+ "<|20.36|>": 51382,
+ "<|20.38|>": 51383,
+ "<|20.40|>": 51384,
+ "<|20.42|>": 51385,
+ "<|20.44|>": 51386,
+ "<|20.46|>": 51387,
+ "<|20.48|>": 51388,
+ "<|20.50|>": 51389,
+ "<|20.52|>": 51390,
+ "<|20.54|>": 51391,
+ "<|20.56|>": 51392,
+ "<|20.58|>": 51393,
+ "<|20.60|>": 51394,
+ "<|20.62|>": 51395,
+ "<|20.64|>": 51396,
+ "<|20.66|>": 51397,
+ "<|20.68|>": 51398,
+ "<|20.70|>": 51399,
+ "<|20.72|>": 51400,
+ "<|20.74|>": 51401,
+ "<|20.76|>": 51402,
+ "<|20.78|>": 51403,
+ "<|20.80|>": 51404,
+ "<|20.82|>": 51405,
+ "<|20.84|>": 51406,
+ "<|20.86|>": 51407,
+ "<|20.88|>": 51408,
+ "<|20.90|>": 51409,
+ "<|20.92|>": 51410,
+ "<|20.94|>": 51411,
+ "<|20.96|>": 51412,
+ "<|20.98|>": 51413,
+ "<|21.00|>": 51414,
+ "<|21.02|>": 51415,
+ "<|21.04|>": 51416,
+ "<|21.06|>": 51417,
+ "<|21.08|>": 51418,
+ "<|21.10|>": 51419,
+ "<|21.12|>": 51420,
+ "<|21.14|>": 51421,
+ "<|21.16|>": 51422,
+ "<|21.18|>": 51423,
+ "<|21.20|>": 51424,
+ "<|21.22|>": 51425,
+ "<|21.24|>": 51426,
+ "<|21.26|>": 51427,
+ "<|21.28|>": 51428,
+ "<|21.30|>": 51429,
+ "<|21.32|>": 51430,
+ "<|21.34|>": 51431,
+ "<|21.36|>": 51432,
+ "<|21.38|>": 51433,
+ "<|21.40|>": 51434,
+ "<|21.42|>": 51435,
+ "<|21.44|>": 51436,
+ "<|21.46|>": 51437,
+ "<|21.48|>": 51438,
+ "<|21.50|>": 51439,
+ "<|21.52|>": 51440,
+ "<|21.54|>": 51441,
+ "<|21.56|>": 51442,
+ "<|21.58|>": 51443,
+ "<|21.60|>": 51444,
+ "<|21.62|>": 51445,
+ "<|21.64|>": 51446,
+ "<|21.66|>": 51447,
+ "<|21.68|>": 51448,
+ "<|21.70|>": 51449,
+ "<|21.72|>": 51450,
+ "<|21.74|>": 51451,
+ "<|21.76|>": 51452,
+ "<|21.78|>": 51453,
+ "<|21.80|>": 51454,
+ "<|21.82|>": 51455,
+ "<|21.84|>": 51456,
+ "<|21.86|>": 51457,
+ "<|21.88|>": 51458,
+ "<|21.90|>": 51459,
+ "<|21.92|>": 51460,
+ "<|21.94|>": 51461,
+ "<|21.96|>": 51462,
+ "<|21.98|>": 51463,
+ "<|22.00|>": 51464,
+ "<|22.02|>": 51465,
+ "<|22.04|>": 51466,
+ "<|22.06|>": 51467,
+ "<|22.08|>": 51468,
+ "<|22.10|>": 51469,
+ "<|22.12|>": 51470,
+ "<|22.14|>": 51471,
+ "<|22.16|>": 51472,
+ "<|22.18|>": 51473,
+ "<|22.20|>": 51474,
+ "<|22.22|>": 51475,
+ "<|22.24|>": 51476,
+ "<|22.26|>": 51477,
+ "<|22.28|>": 51478,
+ "<|22.30|>": 51479,
+ "<|22.32|>": 51480,
+ "<|22.34|>": 51481,
+ "<|22.36|>": 51482,
+ "<|22.38|>": 51483,
+ "<|22.40|>": 51484,
+ "<|22.42|>": 51485,
+ "<|22.44|>": 51486,
+ "<|22.46|>": 51487,
+ "<|22.48|>": 51488,
+ "<|22.50|>": 51489,
+ "<|22.52|>": 51490,
+ "<|22.54|>": 51491,
+ "<|22.56|>": 51492,
+ "<|22.58|>": 51493,
+ "<|22.60|>": 51494,
+ "<|22.62|>": 51495,
+ "<|22.64|>": 51496,
+ "<|22.66|>": 51497,
+ "<|22.68|>": 51498,
+ "<|22.70|>": 51499,
+ "<|22.72|>": 51500,
+ "<|22.74|>": 51501,
+ "<|22.76|>": 51502,
+ "<|22.78|>": 51503,
+ "<|22.80|>": 51504,
+ "<|22.82|>": 51505,
+ "<|22.84|>": 51506,
+ "<|22.86|>": 51507,
+ "<|22.88|>": 51508,
+ "<|22.90|>": 51509,
+ "<|22.92|>": 51510,
+ "<|22.94|>": 51511,
+ "<|22.96|>": 51512,
+ "<|22.98|>": 51513,
+ "<|23.00|>": 51514,
+ "<|23.02|>": 51515,
+ "<|23.04|>": 51516,
+ "<|23.06|>": 51517,
+ "<|23.08|>": 51518,
+ "<|23.10|>": 51519,
+ "<|23.12|>": 51520,
+ "<|23.14|>": 51521,
+ "<|23.16|>": 51522,
+ "<|23.18|>": 51523,
+ "<|23.20|>": 51524,
+ "<|23.22|>": 51525,
+ "<|23.24|>": 51526,
+ "<|23.26|>": 51527,
+ "<|23.28|>": 51528,
+ "<|23.30|>": 51529,
+ "<|23.32|>": 51530,
+ "<|23.34|>": 51531,
+ "<|23.36|>": 51532,
+ "<|23.38|>": 51533,
+ "<|23.40|>": 51534,
+ "<|23.42|>": 51535,
+ "<|23.44|>": 51536,
+ "<|23.46|>": 51537,
+ "<|23.48|>": 51538,
+ "<|23.50|>": 51539,
+ "<|23.52|>": 51540,
+ "<|23.54|>": 51541,
+ "<|23.56|>": 51542,
+ "<|23.58|>": 51543,
+ "<|23.60|>": 51544,
+ "<|23.62|>": 51545,
+ "<|23.64|>": 51546,
+ "<|23.66|>": 51547,
+ "<|23.68|>": 51548,
+ "<|23.70|>": 51549,
+ "<|23.72|>": 51550,
+ "<|23.74|>": 51551,
+ "<|23.76|>": 51552,
+ "<|23.78|>": 51553,
+ "<|23.80|>": 51554,
+ "<|23.82|>": 51555,
+ "<|23.84|>": 51556,
+ "<|23.86|>": 51557,
+ "<|23.88|>": 51558,
+ "<|23.90|>": 51559,
+ "<|23.92|>": 51560,
+ "<|23.94|>": 51561,
+ "<|23.96|>": 51562,
+ "<|23.98|>": 51563,
+ "<|24.00|>": 51564,
+ "<|24.02|>": 51565,
+ "<|24.04|>": 51566,
+ "<|24.06|>": 51567,
+ "<|24.08|>": 51568,
+ "<|24.10|>": 51569,
+ "<|24.12|>": 51570,
+ "<|24.14|>": 51571,
+ "<|24.16|>": 51572,
+ "<|24.18|>": 51573,
+ "<|24.20|>": 51574,
+ "<|24.22|>": 51575,
+ "<|24.24|>": 51576,
+ "<|24.26|>": 51577,
+ "<|24.28|>": 51578,
+ "<|24.30|>": 51579,
+ "<|24.32|>": 51580,
+ "<|24.34|>": 51581,
+ "<|24.36|>": 51582,
+ "<|24.38|>": 51583,
+ "<|24.40|>": 51584,
+ "<|24.42|>": 51585,
+ "<|24.44|>": 51586,
+ "<|24.46|>": 51587,
+ "<|24.48|>": 51588,
+ "<|24.50|>": 51589,
+ "<|24.52|>": 51590,
+ "<|24.54|>": 51591,
+ "<|24.56|>": 51592,
+ "<|24.58|>": 51593,
+ "<|24.60|>": 51594,
+ "<|24.62|>": 51595,
+ "<|24.64|>": 51596,
+ "<|24.66|>": 51597,
+ "<|24.68|>": 51598,
+ "<|24.70|>": 51599,
+ "<|24.72|>": 51600,
+ "<|24.74|>": 51601,
+ "<|24.76|>": 51602,
+ "<|24.78|>": 51603,
+ "<|24.80|>": 51604,
+ "<|24.82|>": 51605,
+ "<|24.84|>": 51606,
+ "<|24.86|>": 51607,
+ "<|24.88|>": 51608,
+ "<|24.90|>": 51609,
+ "<|24.92|>": 51610,
+ "<|24.94|>": 51611,
+ "<|24.96|>": 51612,
+ "<|24.98|>": 51613,
+ "<|25.00|>": 51614,
+ "<|25.02|>": 51615,
+ "<|25.04|>": 51616,
+ "<|25.06|>": 51617,
+ "<|25.08|>": 51618,
+ "<|25.10|>": 51619,
+ "<|25.12|>": 51620,
+ "<|25.14|>": 51621,
+ "<|25.16|>": 51622,
+ "<|25.18|>": 51623,
+ "<|25.20|>": 51624,
+ "<|25.22|>": 51625,
+ "<|25.24|>": 51626,
+ "<|25.26|>": 51627,
+ "<|25.28|>": 51628,
+ "<|25.30|>": 51629,
+ "<|25.32|>": 51630,
+ "<|25.34|>": 51631,
+ "<|25.36|>": 51632,
+ "<|25.38|>": 51633,
+ "<|25.40|>": 51634,
+ "<|25.42|>": 51635,
+ "<|25.44|>": 51636,
+ "<|25.46|>": 51637,
+ "<|25.48|>": 51638,
+ "<|25.50|>": 51639,
+ "<|25.52|>": 51640,
+ "<|25.54|>": 51641,
+ "<|25.56|>": 51642,
+ "<|25.58|>": 51643,
+ "<|25.60|>": 51644,
+ "<|25.62|>": 51645,
+ "<|25.64|>": 51646,
+ "<|25.66|>": 51647,
+ "<|25.68|>": 51648,
+ "<|25.70|>": 51649,
+ "<|25.72|>": 51650,
+ "<|25.74|>": 51651,
+ "<|25.76|>": 51652,
+ "<|25.78|>": 51653,
+ "<|25.80|>": 51654,
+ "<|25.82|>": 51655,
+ "<|25.84|>": 51656,
+ "<|25.86|>": 51657,
+ "<|25.88|>": 51658,
+ "<|25.90|>": 51659,
+ "<|25.92|>": 51660,
+ "<|25.94|>": 51661,
+ "<|25.96|>": 51662,
+ "<|25.98|>": 51663,
+ "<|26.00|>": 51664,
+ "<|26.02|>": 51665,
+ "<|26.04|>": 51666,
+ "<|26.06|>": 51667,
+ "<|26.08|>": 51668,
+ "<|26.10|>": 51669,
+ "<|26.12|>": 51670,
+ "<|26.14|>": 51671,
+ "<|26.16|>": 51672,
+ "<|26.18|>": 51673,
+ "<|26.20|>": 51674,
+ "<|26.22|>": 51675,
+ "<|26.24|>": 51676,
+ "<|26.26|>": 51677,
+ "<|26.28|>": 51678,
+ "<|26.30|>": 51679,
+ "<|26.32|>": 51680,
+ "<|26.34|>": 51681,
+ "<|26.36|>": 51682,
+ "<|26.38|>": 51683,
+ "<|26.40|>": 51684,
+ "<|26.42|>": 51685,
+ "<|26.44|>": 51686,
+ "<|26.46|>": 51687,
+ "<|26.48|>": 51688,
+ "<|26.50|>": 51689,
+ "<|26.52|>": 51690,
+ "<|26.54|>": 51691,
+ "<|26.56|>": 51692,
+ "<|26.58|>": 51693,
+ "<|26.60|>": 51694,
+ "<|26.62|>": 51695,
+ "<|26.64|>": 51696,
+ "<|26.66|>": 51697,
+ "<|26.68|>": 51698,
+ "<|26.70|>": 51699,
+ "<|26.72|>": 51700,
+ "<|26.74|>": 51701,
+ "<|26.76|>": 51702,
+ "<|26.78|>": 51703,
+ "<|26.80|>": 51704,
+ "<|26.82|>": 51705,
+ "<|26.84|>": 51706,
+ "<|26.86|>": 51707,
+ "<|26.88|>": 51708,
+ "<|26.90|>": 51709,
+ "<|26.92|>": 51710,
+ "<|26.94|>": 51711,
+ "<|26.96|>": 51712,
+ "<|26.98|>": 51713,
+ "<|27.00|>": 51714,
+ "<|27.02|>": 51715,
+ "<|27.04|>": 51716,
+ "<|27.06|>": 51717,
+ "<|27.08|>": 51718,
+ "<|27.10|>": 51719,
+ "<|27.12|>": 51720,
+ "<|27.14|>": 51721,
+ "<|27.16|>": 51722,
+ "<|27.18|>": 51723,
+ "<|27.20|>": 51724,
+ "<|27.22|>": 51725,
+ "<|27.24|>": 51726,
+ "<|27.26|>": 51727,
+ "<|27.28|>": 51728,
+ "<|27.30|>": 51729,
+ "<|27.32|>": 51730,
+ "<|27.34|>": 51731,
+ "<|27.36|>": 51732,
+ "<|27.38|>": 51733,
+ "<|27.40|>": 51734,
+ "<|27.42|>": 51735,
+ "<|27.44|>": 51736,
+ "<|27.46|>": 51737,
+ "<|27.48|>": 51738,
+ "<|27.50|>": 51739,
+ "<|27.52|>": 51740,
+ "<|27.54|>": 51741,
+ "<|27.56|>": 51742,
+ "<|27.58|>": 51743,
+ "<|27.60|>": 51744,
+ "<|27.62|>": 51745,
+ "<|27.64|>": 51746,
+ "<|27.66|>": 51747,
+ "<|27.68|>": 51748,
+ "<|27.70|>": 51749,
+ "<|27.72|>": 51750,
+ "<|27.74|>": 51751,
+ "<|27.76|>": 51752,
+ "<|27.78|>": 51753,
+ "<|27.80|>": 51754,
+ "<|27.82|>": 51755,
+ "<|27.84|>": 51756,
+ "<|27.86|>": 51757,
+ "<|27.88|>": 51758,
+ "<|27.90|>": 51759,
+ "<|27.92|>": 51760,
+ "<|27.94|>": 51761,
+ "<|27.96|>": 51762,
+ "<|27.98|>": 51763,
+ "<|28.00|>": 51764,
+ "<|28.02|>": 51765,
+ "<|28.04|>": 51766,
+ "<|28.06|>": 51767,
+ "<|28.08|>": 51768,
+ "<|28.10|>": 51769,
+ "<|28.12|>": 51770,
+ "<|28.14|>": 51771,
+ "<|28.16|>": 51772,
+ "<|28.18|>": 51773,
+ "<|28.20|>": 51774,
+ "<|28.22|>": 51775,
+ "<|28.24|>": 51776,
+ "<|28.26|>": 51777,
+ "<|28.28|>": 51778,
+ "<|28.30|>": 51779,
+ "<|28.32|>": 51780,
+ "<|28.34|>": 51781,
+ "<|28.36|>": 51782,
+ "<|28.38|>": 51783,
+ "<|28.40|>": 51784,
1073
+ "<|28.42|>": 51785,
1074
+ "<|28.44|>": 51786,
1075
+ "<|28.46|>": 51787,
1076
+ "<|28.48|>": 51788,
1077
+ "<|28.50|>": 51789,
1078
+ "<|28.52|>": 51790,
1079
+ "<|28.54|>": 51791,
1080
+ "<|28.56|>": 51792,
1081
+ "<|28.58|>": 51793,
1082
+ "<|28.60|>": 51794,
1083
+ "<|28.62|>": 51795,
1084
+ "<|28.64|>": 51796,
1085
+ "<|28.66|>": 51797,
1086
+ "<|28.68|>": 51798,
1087
+ "<|28.70|>": 51799,
1088
+ "<|28.72|>": 51800,
1089
+ "<|28.74|>": 51801,
1090
+ "<|28.76|>": 51802,
1091
+ "<|28.78|>": 51803,
1092
+ "<|28.80|>": 51804,
1093
+ "<|28.82|>": 51805,
1094
+ "<|28.84|>": 51806,
1095
+ "<|28.86|>": 51807,
1096
+ "<|28.88|>": 51808,
1097
+ "<|28.90|>": 51809,
1098
+ "<|28.92|>": 51810,
1099
+ "<|28.94|>": 51811,
1100
+ "<|28.96|>": 51812,
1101
+ "<|28.98|>": 51813,
1102
+ "<|29.00|>": 51814,
1103
+ "<|29.02|>": 51815,
1104
+ "<|29.04|>": 51816,
1105
+ "<|29.06|>": 51817,
1106
+ "<|29.08|>": 51818,
1107
+ "<|29.10|>": 51819,
1108
+ "<|29.12|>": 51820,
1109
+ "<|29.14|>": 51821,
1110
+ "<|29.16|>": 51822,
1111
+ "<|29.18|>": 51823,
1112
+ "<|29.20|>": 51824,
1113
+ "<|29.22|>": 51825,
1114
+ "<|29.24|>": 51826,
1115
+ "<|29.26|>": 51827,
1116
+ "<|29.28|>": 51828,
1117
+ "<|29.30|>": 51829,
1118
+ "<|29.32|>": 51830,
1119
+ "<|29.34|>": 51831,
1120
+ "<|29.36|>": 51832,
1121
+ "<|29.38|>": 51833,
1122
+ "<|29.40|>": 51834,
1123
+ "<|29.42|>": 51835,
1124
+ "<|29.44|>": 51836,
1125
+ "<|29.46|>": 51837,
1126
+ "<|29.48|>": 51838,
1127
+ "<|29.50|>": 51839,
1128
+ "<|29.52|>": 51840,
1129
+ "<|29.54|>": 51841,
1130
+ "<|29.56|>": 51842,
1131
+ "<|29.58|>": 51843,
1132
+ "<|29.60|>": 51844,
1133
+ "<|29.62|>": 51845,
1134
+ "<|29.64|>": 51846,
1135
+ "<|29.66|>": 51847,
1136
+ "<|29.68|>": 51848,
1137
+ "<|29.70|>": 51849,
1138
+ "<|29.72|>": 51850,
1139
+ "<|29.74|>": 51851,
1140
+ "<|29.76|>": 51852,
1141
+ "<|29.78|>": 51853,
1142
+ "<|29.80|>": 51854,
1143
+ "<|29.82|>": 51855,
1144
+ "<|29.84|>": 51856,
1145
+ "<|29.86|>": 51857,
1146
+ "<|29.88|>": 51858,
1147
+ "<|29.90|>": 51859,
1148
+ "<|29.92|>": 51860,
1149
+ "<|29.94|>": 51861,
1150
+ "<|29.96|>": 51862,
1151
+ "<|29.98|>": 51863,
1152
+ "<|3.00|>": 50514,
1153
+ "<|3.02|>": 50515,
1154
+ "<|3.04|>": 50516,
1155
+ "<|3.06|>": 50517,
1156
+ "<|3.08|>": 50518,
1157
+ "<|3.10|>": 50519,
1158
+ "<|3.12|>": 50520,
1159
+ "<|3.14|>": 50521,
1160
+ "<|3.16|>": 50522,
1161
+ "<|3.18|>": 50523,
1162
+ "<|3.20|>": 50524,
1163
+ "<|3.22|>": 50525,
1164
+ "<|3.24|>": 50526,
1165
+ "<|3.26|>": 50527,
1166
+ "<|3.28|>": 50528,
1167
+ "<|3.30|>": 50529,
1168
+ "<|3.32|>": 50530,
1169
+ "<|3.34|>": 50531,
1170
+ "<|3.36|>": 50532,
1171
+ "<|3.38|>": 50533,
1172
+ "<|3.40|>": 50534,
1173
+ "<|3.42|>": 50535,
1174
+ "<|3.44|>": 50536,
1175
+ "<|3.46|>": 50537,
1176
+ "<|3.48|>": 50538,
1177
+ "<|3.50|>": 50539,
1178
+ "<|3.52|>": 50540,
1179
+ "<|3.54|>": 50541,
1180
+ "<|3.56|>": 50542,
1181
+ "<|3.58|>": 50543,
1182
+ "<|3.60|>": 50544,
1183
+ "<|3.62|>": 50545,
1184
+ "<|3.64|>": 50546,
1185
+ "<|3.66|>": 50547,
1186
+ "<|3.68|>": 50548,
1187
+ "<|3.70|>": 50549,
1188
+ "<|3.72|>": 50550,
1189
+ "<|3.74|>": 50551,
1190
+ "<|3.76|>": 50552,
1191
+ "<|3.78|>": 50553,
1192
+ "<|3.80|>": 50554,
1193
+ "<|3.82|>": 50555,
1194
+ "<|3.84|>": 50556,
1195
+ "<|3.86|>": 50557,
1196
+ "<|3.88|>": 50558,
1197
+ "<|3.90|>": 50559,
1198
+ "<|3.92|>": 50560,
1199
+ "<|3.94|>": 50561,
1200
+ "<|3.96|>": 50562,
1201
+ "<|3.98|>": 50563,
1202
+ "<|30.00|>": 51864,
1203
+ "<|4.00|>": 50564,
1204
+ "<|4.02|>": 50565,
1205
+ "<|4.04|>": 50566,
1206
+ "<|4.06|>": 50567,
1207
+ "<|4.08|>": 50568,
1208
+ "<|4.10|>": 50569,
1209
+ "<|4.12|>": 50570,
1210
+ "<|4.14|>": 50571,
1211
+ "<|4.16|>": 50572,
1212
+ "<|4.18|>": 50573,
1213
+ "<|4.20|>": 50574,
1214
+ "<|4.22|>": 50575,
1215
+ "<|4.24|>": 50576,
1216
+ "<|4.26|>": 50577,
1217
+ "<|4.28|>": 50578,
1218
+ "<|4.30|>": 50579,
1219
+ "<|4.32|>": 50580,
1220
+ "<|4.34|>": 50581,
1221
+ "<|4.36|>": 50582,
1222
+ "<|4.38|>": 50583,
1223
+ "<|4.40|>": 50584,
1224
+ "<|4.42|>": 50585,
1225
+ "<|4.44|>": 50586,
1226
+ "<|4.46|>": 50587,
1227
+ "<|4.48|>": 50588,
1228
+ "<|4.50|>": 50589,
1229
+ "<|4.52|>": 50590,
1230
+ "<|4.54|>": 50591,
1231
+ "<|4.56|>": 50592,
1232
+ "<|4.58|>": 50593,
1233
+ "<|4.60|>": 50594,
1234
+ "<|4.62|>": 50595,
1235
+ "<|4.64|>": 50596,
1236
+ "<|4.66|>": 50597,
1237
+ "<|4.68|>": 50598,
1238
+ "<|4.70|>": 50599,
1239
+ "<|4.72|>": 50600,
1240
+ "<|4.74|>": 50601,
1241
+ "<|4.76|>": 50602,
1242
+ "<|4.78|>": 50603,
1243
+ "<|4.80|>": 50604,
1244
+ "<|4.82|>": 50605,
1245
+ "<|4.84|>": 50606,
1246
+ "<|4.86|>": 50607,
1247
+ "<|4.88|>": 50608,
1248
+ "<|4.90|>": 50609,
1249
+ "<|4.92|>": 50610,
1250
+ "<|4.94|>": 50611,
1251
+ "<|4.96|>": 50612,
1252
+ "<|4.98|>": 50613,
1253
+ "<|5.00|>": 50614,
1254
+ "<|5.02|>": 50615,
1255
+ "<|5.04|>": 50616,
1256
+ "<|5.06|>": 50617,
1257
+ "<|5.08|>": 50618,
1258
+ "<|5.10|>": 50619,
1259
+ "<|5.12|>": 50620,
1260
+ "<|5.14|>": 50621,
1261
+ "<|5.16|>": 50622,
1262
+ "<|5.18|>": 50623,
1263
+ "<|5.20|>": 50624,
1264
+ "<|5.22|>": 50625,
1265
+ "<|5.24|>": 50626,
1266
+ "<|5.26|>": 50627,
1267
+ "<|5.28|>": 50628,
1268
+ "<|5.30|>": 50629,
1269
+ "<|5.32|>": 50630,
1270
+ "<|5.34|>": 50631,
1271
+ "<|5.36|>": 50632,
1272
+ "<|5.38|>": 50633,
1273
+ "<|5.40|>": 50634,
1274
+ "<|5.42|>": 50635,
1275
+ "<|5.44|>": 50636,
1276
+ "<|5.46|>": 50637,
1277
+ "<|5.48|>": 50638,
1278
+ "<|5.50|>": 50639,
1279
+ "<|5.52|>": 50640,
1280
+ "<|5.54|>": 50641,
1281
+ "<|5.56|>": 50642,
1282
+ "<|5.58|>": 50643,
1283
+ "<|5.60|>": 50644,
1284
+ "<|5.62|>": 50645,
1285
+ "<|5.64|>": 50646,
1286
+ "<|5.66|>": 50647,
1287
+ "<|5.68|>": 50648,
1288
+ "<|5.70|>": 50649,
1289
+ "<|5.72|>": 50650,
1290
+ "<|5.74|>": 50651,
1291
+ "<|5.76|>": 50652,
1292
+ "<|5.78|>": 50653,
1293
+ "<|5.80|>": 50654,
1294
+ "<|5.82|>": 50655,
1295
+ "<|5.84|>": 50656,
1296
+ "<|5.86|>": 50657,
1297
+ "<|5.88|>": 50658,
1298
+ "<|5.90|>": 50659,
1299
+ "<|5.92|>": 50660,
1300
+ "<|5.94|>": 50661,
1301
+ "<|5.96|>": 50662,
1302
+ "<|5.98|>": 50663,
1303
+ "<|6.00|>": 50664,
1304
+ "<|6.02|>": 50665,
1305
+ "<|6.04|>": 50666,
1306
+ "<|6.06|>": 50667,
1307
+ "<|6.08|>": 50668,
1308
+ "<|6.10|>": 50669,
1309
+ "<|6.12|>": 50670,
1310
+ "<|6.14|>": 50671,
1311
+ "<|6.16|>": 50672,
1312
+ "<|6.18|>": 50673,
1313
+ "<|6.20|>": 50674,
1314
+ "<|6.22|>": 50675,
1315
+ "<|6.24|>": 50676,
1316
+ "<|6.26|>": 50677,
1317
+ "<|6.28|>": 50678,
1318
+ "<|6.30|>": 50679,
1319
+ "<|6.32|>": 50680,
1320
+ "<|6.34|>": 50681,
1321
+ "<|6.36|>": 50682,
1322
+ "<|6.38|>": 50683,
1323
+ "<|6.40|>": 50684,
1324
+ "<|6.42|>": 50685,
1325
+ "<|6.44|>": 50686,
1326
+ "<|6.46|>": 50687,
1327
+ "<|6.48|>": 50688,
1328
+ "<|6.50|>": 50689,
1329
+ "<|6.52|>": 50690,
1330
+ "<|6.54|>": 50691,
1331
+ "<|6.56|>": 50692,
1332
+ "<|6.58|>": 50693,
1333
+ "<|6.60|>": 50694,
1334
+ "<|6.62|>": 50695,
1335
+ "<|6.64|>": 50696,
1336
+ "<|6.66|>": 50697,
1337
+ "<|6.68|>": 50698,
1338
+ "<|6.70|>": 50699,
1339
+ "<|6.72|>": 50700,
1340
+ "<|6.74|>": 50701,
1341
+ "<|6.76|>": 50702,
1342
+ "<|6.78|>": 50703,
1343
+ "<|6.80|>": 50704,
1344
+ "<|6.82|>": 50705,
1345
+ "<|6.84|>": 50706,
1346
+ "<|6.86|>": 50707,
1347
+ "<|6.88|>": 50708,
1348
+ "<|6.90|>": 50709,
1349
+ "<|6.92|>": 50710,
1350
+ "<|6.94|>": 50711,
1351
+ "<|6.96|>": 50712,
1352
+ "<|6.98|>": 50713,
1353
+ "<|7.00|>": 50714,
1354
+ "<|7.02|>": 50715,
1355
+ "<|7.04|>": 50716,
1356
+ "<|7.06|>": 50717,
1357
+ "<|7.08|>": 50718,
1358
+ "<|7.10|>": 50719,
1359
+ "<|7.12|>": 50720,
1360
+ "<|7.14|>": 50721,
1361
+ "<|7.16|>": 50722,
1362
+ "<|7.18|>": 50723,
1363
+ "<|7.20|>": 50724,
1364
+ "<|7.22|>": 50725,
1365
+ "<|7.24|>": 50726,
1366
+ "<|7.26|>": 50727,
1367
+ "<|7.28|>": 50728,
1368
+ "<|7.30|>": 50729,
1369
+ "<|7.32|>": 50730,
1370
+ "<|7.34|>": 50731,
1371
+ "<|7.36|>": 50732,
1372
+ "<|7.38|>": 50733,
1373
+ "<|7.40|>": 50734,
1374
+ "<|7.42|>": 50735,
1375
+ "<|7.44|>": 50736,
1376
+ "<|7.46|>": 50737,
1377
+ "<|7.48|>": 50738,
1378
+ "<|7.50|>": 50739,
1379
+ "<|7.52|>": 50740,
1380
+ "<|7.54|>": 50741,
1381
+ "<|7.56|>": 50742,
1382
+ "<|7.58|>": 50743,
1383
+ "<|7.60|>": 50744,
1384
+ "<|7.62|>": 50745,
1385
+ "<|7.64|>": 50746,
1386
+ "<|7.66|>": 50747,
1387
+ "<|7.68|>": 50748,
1388
+ "<|7.70|>": 50749,
1389
+ "<|7.72|>": 50750,
1390
+ "<|7.74|>": 50751,
1391
+ "<|7.76|>": 50752,
1392
+ "<|7.78|>": 50753,
1393
+ "<|7.80|>": 50754,
1394
+ "<|7.82|>": 50755,
1395
+ "<|7.84|>": 50756,
1396
+ "<|7.86|>": 50757,
1397
+ "<|7.88|>": 50758,
1398
+ "<|7.90|>": 50759,
1399
+ "<|7.92|>": 50760,
1400
+ "<|7.94|>": 50761,
1401
+ "<|7.96|>": 50762,
1402
+ "<|7.98|>": 50763,
1403
+ "<|8.00|>": 50764,
1404
+ "<|8.02|>": 50765,
1405
+ "<|8.04|>": 50766,
1406
+ "<|8.06|>": 50767,
1407
+ "<|8.08|>": 50768,
1408
+ "<|8.10|>": 50769,
1409
+ "<|8.12|>": 50770,
1410
+ "<|8.14|>": 50771,
1411
+ "<|8.16|>": 50772,
1412
+ "<|8.18|>": 50773,
1413
+ "<|8.20|>": 50774,
1414
+ "<|8.22|>": 50775,
1415
+ "<|8.24|>": 50776,
1416
+ "<|8.26|>": 50777,
1417
+ "<|8.28|>": 50778,
1418
+ "<|8.30|>": 50779,
1419
+ "<|8.32|>": 50780,
1420
+ "<|8.34|>": 50781,
1421
+ "<|8.36|>": 50782,
1422
+ "<|8.38|>": 50783,
1423
+ "<|8.40|>": 50784,
1424
+ "<|8.42|>": 50785,
1425
+ "<|8.44|>": 50786,
1426
+ "<|8.46|>": 50787,
1427
+ "<|8.48|>": 50788,
1428
+ "<|8.50|>": 50789,
1429
+ "<|8.52|>": 50790,
1430
+ "<|8.54|>": 50791,
1431
+ "<|8.56|>": 50792,
1432
+ "<|8.58|>": 50793,
1433
+ "<|8.60|>": 50794,
1434
+ "<|8.62|>": 50795,
1435
+ "<|8.64|>": 50796,
1436
+ "<|8.66|>": 50797,
1437
+ "<|8.68|>": 50798,
1438
+ "<|8.70|>": 50799,
1439
+ "<|8.72|>": 50800,
1440
+ "<|8.74|>": 50801,
1441
+ "<|8.76|>": 50802,
1442
+ "<|8.78|>": 50803,
1443
+ "<|8.80|>": 50804,
1444
+ "<|8.82|>": 50805,
1445
+ "<|8.84|>": 50806,
1446
+ "<|8.86|>": 50807,
1447
+ "<|8.88|>": 50808,
1448
+ "<|8.90|>": 50809,
1449
+ "<|8.92|>": 50810,
1450
+ "<|8.94|>": 50811,
1451
+ "<|8.96|>": 50812,
1452
+ "<|8.98|>": 50813,
1453
+ "<|9.00|>": 50814,
1454
+ "<|9.02|>": 50815,
1455
+ "<|9.04|>": 50816,
1456
+ "<|9.06|>": 50817,
1457
+ "<|9.08|>": 50818,
1458
+ "<|9.10|>": 50819,
1459
+ "<|9.12|>": 50820,
1460
+ "<|9.14|>": 50821,
1461
+ "<|9.16|>": 50822,
1462
+ "<|9.18|>": 50823,
1463
+ "<|9.20|>": 50824,
1464
+ "<|9.22|>": 50825,
1465
+ "<|9.24|>": 50826,
1466
+ "<|9.26|>": 50827,
1467
+ "<|9.28|>": 50828,
1468
+ "<|9.30|>": 50829,
1469
+ "<|9.32|>": 50830,
1470
+ "<|9.34|>": 50831,
1471
+ "<|9.36|>": 50832,
1472
+ "<|9.38|>": 50833,
1473
+ "<|9.40|>": 50834,
1474
+ "<|9.42|>": 50835,
1475
+ "<|9.44|>": 50836,
1476
+ "<|9.46|>": 50837,
1477
+ "<|9.48|>": 50838,
1478
+ "<|9.50|>": 50839,
1479
+ "<|9.52|>": 50840,
1480
+ "<|9.54|>": 50841,
1481
+ "<|9.56|>": 50842,
1482
+ "<|9.58|>": 50843,
1483
+ "<|9.60|>": 50844,
1484
+ "<|9.62|>": 50845,
1485
+ "<|9.64|>": 50846,
1486
+ "<|9.66|>": 50847,
1487
+ "<|9.68|>": 50848,
1488
+ "<|9.70|>": 50849,
1489
+ "<|9.72|>": 50850,
1490
+ "<|9.74|>": 50851,
1491
+ "<|9.76|>": 50852,
1492
+ "<|9.78|>": 50853,
1493
+ "<|9.80|>": 50854,
1494
+ "<|9.82|>": 50855,
1495
+ "<|9.84|>": 50856,
1496
+ "<|9.86|>": 50857,
1497
+ "<|9.88|>": 50858,
1498
+ "<|9.90|>": 50859,
1499
+ "<|9.92|>": 50860,
1500
+ "<|9.94|>": 50861,
1501
+ "<|9.96|>": 50862,
1502
+ "<|9.98|>": 50863,
1503
+ "<|af|>": 50327,
1504
+ "<|am|>": 50334,
1505
+ "<|ar|>": 50272,
1506
+ "<|as|>": 50350,
1507
+ "<|az|>": 50304,
1508
+ "<|ba|>": 50355,
1509
+ "<|be|>": 50330,
1510
+ "<|bg|>": 50292,
1511
+ "<|bn|>": 50302,
1512
+ "<|bo|>": 50347,
1513
+ "<|br|>": 50309,
1514
+ "<|bs|>": 50315,
1515
+ "<|ca|>": 50270,
1516
+ "<|cs|>": 50283,
1517
+ "<|cy|>": 50297,
1518
+ "<|da|>": 50285,
1519
+ "<|de|>": 50261,
1520
+ "<|el|>": 50281,
1521
+ "<|en|>": 50259,
1522
+ "<|es|>": 50262,
1523
+ "<|et|>": 50307,
1524
+ "<|eu|>": 50310,
1525
+ "<|fa|>": 50300,
1526
+ "<|fi|>": 50277,
1527
+ "<|fo|>": 50338,
1528
+ "<|fr|>": 50265,
1529
+ "<|gl|>": 50319,
1530
+ "<|gu|>": 50333,
1531
+ "<|haw|>": 50352,
1532
+ "<|ha|>": 50354,
1533
+ "<|he|>": 50279,
1534
+ "<|hi|>": 50276,
1535
+ "<|hr|>": 50291,
1536
+ "<|ht|>": 50339,
1537
+ "<|hu|>": 50286,
1538
+ "<|hy|>": 50312,
1539
+ "<|id|>": 50275,
1540
+ "<|is|>": 50311,
1541
+ "<|it|>": 50274,
1542
+ "<|ja|>": 50266,
1543
+ "<|jw|>": 50356,
1544
+ "<|ka|>": 50329,
1545
+ "<|kk|>": 50316,
1546
+ "<|km|>": 50323,
1547
+ "<|kn|>": 50306,
1548
+ "<|ko|>": 50264,
1549
+ "<|la|>": 50294,
1550
+ "<|lb|>": 50345,
1551
+ "<|ln|>": 50353,
1552
+ "<|lo|>": 50336,
1553
+ "<|lt|>": 50293,
1554
+ "<|lv|>": 50301,
1555
+ "<|mg|>": 50349,
1556
+ "<|mi|>": 50295,
1557
+ "<|mk|>": 50308,
1558
+ "<|ml|>": 50296,
1559
+ "<|mn|>": 50314,
1560
+ "<|mr|>": 50320,
1561
+ "<|ms|>": 50282,
1562
+ "<|mt|>": 50343,
1563
+ "<|my|>": 50346,
1564
+ "<|ne|>": 50313,
1565
+ "<|nl|>": 50271,
1566
+ "<|nn|>": 50342,
1567
+ "<|nocaptions|>": 50362,
1568
+ "<|notimestamps|>": 50363,
1569
+ "<|no|>": 50288,
1570
+ "<|oc|>": 50328,
1571
+ "<|pa|>": 50321,
1572
+ "<|pl|>": 50269,
1573
+ "<|ps|>": 50340,
1574
+ "<|pt|>": 50267,
1575
+ "<|ro|>": 50284,
1576
+ "<|ru|>": 50263,
1577
+ "<|sa|>": 50344,
1578
+ "<|sd|>": 50332,
1579
+ "<|si|>": 50322,
1580
+ "<|sk|>": 50298,
1581
+ "<|sl|>": 50305,
1582
+ "<|sn|>": 50324,
1583
+ "<|so|>": 50326,
1584
+ "<|sq|>": 50317,
1585
+ "<|sr|>": 50303,
1586
+ "<|startoflm|>": 50360,
1587
+ "<|startofprev|>": 50361,
1588
+ "<|startoftranscript|>": 50258,
1589
+ "<|su|>": 50357,
1590
+ "<|sv|>": 50273,
1591
+ "<|sw|>": 50318,
1592
+ "<|ta|>": 50287,
1593
+ "<|te|>": 50299,
1594
+ "<|tg|>": 50331,
1595
+ "<|th|>": 50289,
1596
+ "<|tk|>": 50341,
1597
+ "<|tl|>": 50348,
1598
+ "<|transcribe|>": 50359,
1599
+ "<|translate|>": 50358,
1600
+ "<|tr|>": 50268,
1601
+ "<|tt|>": 50351,
1602
+ "<|uk|>": 50280,
1603
+ "<|ur|>": 50290,
1604
+ "<|uz|>": 50337,
1605
+ "<|vi|>": 50278,
1606
+ "<|yi|>": 50335,
1607
+ "<|yo|>": 50325,
1608
+ "<|zh|>": 50260
1609
+ }
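The timestamp entries in `added_tokens.json` above follow a regular arithmetic layout: ids increase by 1 for every 0.02 s step. A minimal sketch of that mapping, with the base id for `<|0.00|>` inferred from the entries shown (e.g. `<|3.00|>` → 50514, `<|30.00|>` → 51864); the helper name is illustrative, not part of the repository:

```python
# Sketch: map a Whisper timestamp (0.00-30.00 s, 0.02 s grid) to its
# added-token id. The base id 50364 for <|0.00|> is inferred from the
# entries shown above (e.g. <|3.00|> -> 50514 = 50364 + 150).
TIMESTAMP_BASE_ID = 50364

def timestamp_token_id(seconds: float) -> int:
    """Return the added-token id for a timestamp on the 0.02 s grid."""
    if not 0.0 <= seconds <= 30.0:
        raise ValueError("timestamp tokens cover 0.00-30.00 s only")
    return TIMESTAMP_BASE_ID + round(seconds / 0.02)

print(timestamp_token_id(25.38))  # matches "<|25.38|>": 51633 above
print(timestamp_token_id(30.00))  # matches "<|30.00|>": 51864 above
```

This reproduces every timestamp entry in the file; the non-timestamp entries (language and task tokens) have no such closed form and must be looked up directly.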
Models/hindi/checkpoint-1000/config.json ADDED
@@ -0,0 +1,52 @@
+ {
+ "_name_or_path": "openai/whisper-small",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "suppress_tokens": [],
+ "torch_dtype": "float32",
+ "transformers_version": "4.40.0.dev0",
+ "use_cache": false,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
Models/hindi/checkpoint-1000/generation_config.json ADDED
@@ -0,0 +1,265 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ null
+ ],
+ [
+ 2,
+ 50359
+ ]
+ ],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "hi",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50358,
+ 50359,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.40.0.dev0"
+ }
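The generation config above determines the decoder prompt Whisper starts from: `decoder_start_token_id`, then the language token from `lang_to_id` (the `null` slot in `forced_decoder_ids`), then the task id from `task_to_id`, then `no_timestamps_token_id` when timestamps are off. A minimal sketch using only ids taken from the config; the helper and its name are illustrative, not part of the repository:

```python
# Sketch: assemble the Whisper decoder prompt from the ids in
# generation_config.json above. Only a subset of lang_to_id is included.
lang_to_id = {"<|hi|>": 50276}                      # from "lang_to_id"
task_to_id = {"transcribe": 50359, "translate": 50358}  # from "task_to_id"

def decoder_prompt_ids(language="hi", task="transcribe", timestamps=False):
    ids = [50258]                       # decoder_start_token_id (<|startoftranscript|>)
    ids.append(lang_to_id[f"<|{language}|>"])
    ids.append(task_to_id[task])
    if not timestamps:
        ids.append(50363)               # no_timestamps_token_id
    return ids

print(decoder_prompt_ids())  # [50258, 50276, 50359, 50363]
```

With `"language": "hi"` set as above, every transcription request starts from this four-token prefix.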
Models/hindi/checkpoint-1000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1fd14aa6e594aaaee27a96da4e4572c377c57a1c513035f347084de089bcd7b8
+ size 966995080
Models/hindi/checkpoint-1000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:29a8164cac4af723754631c7da66bb23202c19d62a32b5956cca25908bc138a1
+ size 1925064044
Models/hindi/checkpoint-1000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
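The derived fields in `preprocessor_config.json` above are consistent with each other: `n_samples` is the 30 s window at 16 kHz, and `nb_max_frames` is that window divided by the STFT hop. A quick arithmetic check using only the values shown:

```python
# Consistency check of the derived fields in preprocessor_config.json.
chunk_length = 30       # seconds per audio window ("chunk_length")
sampling_rate = 16000   # Hz ("sampling_rate")
hop_length = 160        # samples between STFT frames ("hop_length")

n_samples = chunk_length * sampling_rate     # 480000, matches "n_samples"
nb_max_frames = n_samples // hop_length      # 3000, matches "nb_max_frames"

print(n_samples, nb_max_frames)  # 480000 3000
```

So each 30 s input yields 3000 frames of 80 mel bins, matching the encoder's `max_source_positions` of 1500 after the model's 2x convolutional downsampling.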
Models/hindi/checkpoint-1000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bded210308e42252bbae8eae4a69e1ce6c258f88afb96d000a6c88926d54924c
+ size 14244
Models/hindi/checkpoint-1000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6983ae7e0032d082d946848755da0680fcdb0cd12518cc8330956c3315a05a99
+ size 1064
Models/hindi/checkpoint-1000/trainer_state.json ADDED
@@ -0,0 +1,310 @@
+ {
+ "best_metric": 18.938284039923083,
+ "best_model_checkpoint": "./checkpoint-1000",
+ "epoch": 9.7799511002445,
+ "eval_steps": 1000,
+ "global_step": 1000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.24,
+ "grad_norm": 39.86528778076172,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.0555,
+ "step": 25
+ },
+ {
+ "epoch": 0.49,
+ "grad_norm": Infinity,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 1.5219,
+ "step": 50
+ },
+ {
+ "epoch": 0.73,
+ "grad_norm": 6.200109958648682,
+ "learning_rate": 1.48e-06,
+ "loss": 1.0167,
+ "step": 75
+ },
+ {
+ "epoch": 0.98,
+ "grad_norm": 5.486103057861328,
+ "learning_rate": 1.98e-06,
+ "loss": 0.7299,
+ "step": 100
+ },
+ {
+ "epoch": 1.22,
+ "grad_norm": 5.532894134521484,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.6318,
+ "step": 125
+ },
+ {
+ "epoch": 1.47,
+ "grad_norm": 4.895535945892334,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.5503,
+ "step": 150
+ },
+ {
+ "epoch": 1.71,
+ "grad_norm": 5.065937519073486,
+ "learning_rate": 3.48e-06,
+ "loss": 0.4999,
+ "step": 175
+ },
+ {
+ "epoch": 1.96,
+ "grad_norm": 4.807600498199463,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.4457,
+ "step": 200
+ },
+ {
+ "epoch": 2.2,
+ "grad_norm": 4.568106174468994,
+ "learning_rate": 4.48e-06,
+ "loss": 0.3779,
+ "step": 225
+ },
+ {
+ "epoch": 2.44,
+ "grad_norm": 4.705833911895752,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.3332,
+ "step": 250
+ },
+ {
+ "epoch": 2.69,
+ "grad_norm": 4.315070629119873,
+ "learning_rate": 5.480000000000001e-06,
+ "loss": 0.2858,
+ "step": 275
+ },
+ {
+ "epoch": 2.93,
+ "grad_norm": 2.660728693008423,
+ "learning_rate": 5.98e-06,
+ "loss": 0.2228,
+ "step": 300
+ },
+ {
+ "epoch": 3.18,
+ "grad_norm": 2.5569632053375244,
+ "learning_rate": 6.480000000000001e-06,
+ "loss": 0.1766,
+ "step": 325
+ },
+ {
+ "epoch": 3.42,
+ "grad_norm": 2.2663660049438477,
+ "learning_rate": 6.98e-06,
+ "loss": 0.1475,
+ "step": 350
+ },
+ {
+ "epoch": 3.67,
+ "grad_norm": 2.5257091522216797,
+ "learning_rate": 7.48e-06,
+ "loss": 0.1492,
+ "step": 375
+ },
+ {
+ "epoch": 3.91,
+ "grad_norm": 2.3954405784606934,
+ "learning_rate": 7.980000000000002e-06,
+ "loss": 0.142,
+ "step": 400
+ },
+ {
+ "epoch": 4.16,
+ "grad_norm": 2.181328296661377,
+ "learning_rate": 8.48e-06,
+ "loss": 0.1125,
+ "step": 425
+ },
+ {
+ "epoch": 4.4,
+ "grad_norm": 1.9877594709396362,
+ "learning_rate": 8.98e-06,
+ "loss": 0.0908,
+ "step": 450
+ },
+ {
+ "epoch": 4.65,
+ "grad_norm": 1.8853321075439453,
+ "learning_rate": 9.48e-06,
+ "loss": 0.0892,
+ "step": 475
+ },
+ {
+ "epoch": 4.89,
+ "grad_norm": 2.3802549839019775,
+ "learning_rate": 9.980000000000001e-06,
+ "loss": 0.091,
+ "step": 500
+ },
+ {
+ "epoch": 5.13,
+ "grad_norm": 1.3576879501342773,
+ "learning_rate": 9.946666666666667e-06,
+ "loss": 0.0713,
+ "step": 525
+ },
+ {
+ "epoch": 5.38,
+ "grad_norm": 2.4532103538513184,
+ "learning_rate": 9.891111111111113e-06,
+ "loss": 0.0534,
+ "step": 550
+ },
+ {
+ "epoch": 5.62,
+ "grad_norm": 1.3736106157302856,
+ "learning_rate": 9.835555555555556e-06,
+ "loss": 0.0512,
+ "step": 575
+ },
+ {
+ "epoch": 5.87,
+ "grad_norm": 2.0095458030700684,
+ "learning_rate": 9.780000000000001e-06,
+ "loss": 0.0571,
+ "step": 600
+ },
+ {
+ "epoch": 6.11,
+ "grad_norm": 1.2924531698226929,
+ "learning_rate": 9.724444444444445e-06,
+ "loss": 0.0453,
+ "step": 625
+ },
+ {
+ "epoch": 6.36,
+ "grad_norm": 1.0012321472167969,
+ "learning_rate": 9.66888888888889e-06,
+ "loss": 0.0292,
+ "step": 650
+ },
+ {
+ "epoch": 6.6,
+ "grad_norm": 1.4161434173583984,
+ "learning_rate": 9.613333333333335e-06,
+ "loss": 0.0325,
+ "step": 675
+ },
+ {
+ "epoch": 6.85,
+ "grad_norm": 5.784367084503174,
+ "learning_rate": 9.557777777777777e-06,
+ "loss": 0.031,
+ "step": 700
+ },
+ {
+ "epoch": 7.09,
+ "grad_norm": 0.8382102251052856,
+ "learning_rate": 9.502222222222223e-06,
+ "loss": 0.0247,
+ "step": 725
+ },
+ {
+ "epoch": 7.33,
+ "grad_norm": 1.2963491678237915,
+ "learning_rate": 9.446666666666667e-06,
+ "loss": 0.0162,
+ "step": 750
+ },
+ {
+ "epoch": 7.58,
+ "grad_norm": 1.7834402322769165,
+ "learning_rate": 9.391111111111111e-06,
+ "loss": 0.0175,
+ "step": 775
+ },
+ {
+ "epoch": 7.82,
+ "grad_norm": 0.9083292484283447,
+ "learning_rate": 9.335555555555557e-06,
+ "loss": 0.0193,
+ "step": 800
+ },
+ {
+ "epoch": 8.07,
+ "grad_norm": 0.5552634596824646,
+ "learning_rate": 9.280000000000001e-06,
+ "loss": 0.0157,
+ "step": 825
+ },
+ {
+ "epoch": 8.31,
+ "grad_norm": 1.1231069564819336,
+ "learning_rate": 9.224444444444445e-06,
+ "loss": 0.0105,
+ "step": 850
+ },
+ {
+ "epoch": 8.56,
+ "grad_norm": 1.206103801727295,
+ "learning_rate": 9.168888888888889e-06,
+ "loss": 0.0109,
+ "step": 875
+ },
+ {
+ "epoch": 8.8,
+ "grad_norm": 0.8872191309928894,
+ "learning_rate": 9.113333333333335e-06,
+ "loss": 0.0126,
+ "step": 900
+ },
+ {
+ "epoch": 9.05,
+ "grad_norm": 0.7421383261680603,
+ "learning_rate": 9.057777777777779e-06,
+ "loss": 0.0107,
+ "step": 925
+ },
+ {
+ "epoch": 9.29,
+ "grad_norm": 0.7581607103347778,
+ "learning_rate": 9.002222222222223e-06,
+ "loss": 0.006,
+ "step": 950
+ },
+ {
+ "epoch": 9.54,
+ "grad_norm": 0.6848894953727722,
+ "learning_rate": 8.946666666666669e-06,
+ "loss": 0.006,
+ "step": 975
+ },
+ {
+ "epoch": 9.78,
+ "grad_norm": 1.044122576713562,
+ "learning_rate": 8.891111111111111e-06,
+ "loss": 0.0067,
+ "step": 1000
+ },
+ {
+ "epoch": 9.78,
+ "eval_loss": 0.41375118494033813,
+ "eval_runtime": 1461.809,
+ "eval_samples_per_second": 1.98,
+ "eval_steps_per_second": 0.495,
+ "eval_wer": 18.938284039923083,
+ "step": 1000
+ }
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 50,
+ "save_steps": 1000,
+ "total_flos": 1.845907654606848e+19,
+ "train_batch_size": 8,
+ "trial_name": null,
+ "trial_params": null
+ }
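The `learning_rate` values logged above are consistent with a linear warmup to a peak of 1e-5 over the first 500 steps followed by linear decay to 0 at `max_steps` 5000. A minimal sketch of that schedule; the peak and warmup length are inferred from the logs, not stated in the file, and logged values can lag the formula by one step (e.g. after the step-50 update where `grad_norm` is `Infinity` and the optimizer step is presumably skipped):

```python
# Sketch of the LR schedule implied by the logged "learning_rate" values:
# linear warmup to PEAK_LR over WARMUP steps, then linear decay to 0
# at MAX_STEPS. PEAK_LR and WARMUP are inferred assumptions.
PEAK_LR, WARMUP, MAX_STEPS = 1e-5, 500, 5000

def lr_at(step: int) -> float:
    if step <= WARMUP:
        return PEAK_LR * step / WARMUP
    return PEAK_LR * (MAX_STEPS - step) / (MAX_STEPS - WARMUP)

print(lr_at(25))    # ~5e-07, matching the first log entry
print(lr_at(5000))  # 0.0
```

The decay-phase entries differ from this formula by at most one step's increment, consistent with one skipped optimizer step early in training.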
Models/hindi/checkpoint-1000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1562c17bb2dc7592b45af48209c138ce71faddb67bef288ecaf31f1c50f864ae
+ size 5048
Models/hindi/checkpoint-2000/config.json ADDED
@@ -0,0 +1,52 @@
+ {
+ "_name_or_path": "openai/whisper-small",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "suppress_tokens": [],
+ "torch_dtype": "float32",
+ "transformers_version": "4.40.0.dev0",
+ "use_cache": false,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
Models/hindi/checkpoint-2000/generation_config.json ADDED
@@ -0,0 +1,265 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ null
+ ],
+ [
+ 2,
+ 50359
+ ]
+ ],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "hi",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50358,
+ 50359,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.40.0.dev0"
+ }
Models/hindi/checkpoint-2000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:23a66aa9a065a7d9ef874c34a0cb322f63042c09bad65ea3c7b1f71300e8c8bd
+ size 966995080
Models/hindi/checkpoint-2000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6bb71a12746438d02b6196cca5941b09288c42e8d653690935063efb260c687c
+ size 1925064044
Models/hindi/checkpoint-2000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
Models/hindi/checkpoint-2000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:71a0fdb9e8f10ab8d6cd4b270c3479a9f43e8126a634ce9d1630a3b83bad9e79
+ size 14244
Models/hindi/checkpoint-2000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:33377184da97e177b0780baa2b74567aae5f37821389a4e05186d99b45f8185c
+ size 1064
Models/hindi/checkpoint-2000/trainer_state.json ADDED
@@ -0,0 +1,599 @@
+ {
+ "best_metric": 18.4735830052193,
+ "best_model_checkpoint": "./checkpoint-2000",
+ "epoch": 19.559902200489,
+ "eval_steps": 1000,
+ "global_step": 2000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.24,
+ "grad_norm": 39.86528778076172,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.0555,
+ "step": 25
+ },
+ {
+ "epoch": 0.49,
+ "grad_norm": Infinity,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 1.5219,
+ "step": 50
+ },
+ {
+ "epoch": 0.73,
+ "grad_norm": 6.200109958648682,
+ "learning_rate": 1.48e-06,
+ "loss": 1.0167,
+ "step": 75
+ },
+ {
+ "epoch": 0.98,
+ "grad_norm": 5.486103057861328,
+ "learning_rate": 1.98e-06,
+ "loss": 0.7299,
+ "step": 100
+ },
+ {
+ "epoch": 1.22,
+ "grad_norm": 5.532894134521484,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.6318,
+ "step": 125
+ },
+ {
+ "epoch": 1.47,
+ "grad_norm": 4.895535945892334,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.5503,
+ "step": 150
+ },
+ {
+ "epoch": 1.71,
+ "grad_norm": 5.065937519073486,
+ "learning_rate": 3.48e-06,
+ "loss": 0.4999,
+ "step": 175
+ },
+ {
+ "epoch": 1.96,
+ "grad_norm": 4.807600498199463,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.4457,
+ "step": 200
+ },
+ {
+ "epoch": 2.2,
+ "grad_norm": 4.568106174468994,
+ "learning_rate": 4.48e-06,
+ "loss": 0.3779,
+ "step": 225
+ },
+ {
+ "epoch": 2.44,
+ "grad_norm": 4.705833911895752,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.3332,
+ "step": 250
+ },
+ {
+ "epoch": 2.69,
+ "grad_norm": 4.315070629119873,
+ "learning_rate": 5.480000000000001e-06,
+ "loss": 0.2858,
+ "step": 275
+ },
+ {
+ "epoch": 2.93,
+ "grad_norm": 2.660728693008423,
+ "learning_rate": 5.98e-06,
+ "loss": 0.2228,
+ "step": 300
+ },
+ {
+ "epoch": 3.18,
+ "grad_norm": 2.5569632053375244,
+ "learning_rate": 6.480000000000001e-06,
+ "loss": 0.1766,
+ "step": 325
+ },
+ {
+ "epoch": 3.42,
+ "grad_norm": 2.2663660049438477,
+ "learning_rate": 6.98e-06,
+ "loss": 0.1475,
+ "step": 350
+ },
+ {
+ "epoch": 3.67,
+ "grad_norm": 2.5257091522216797,
+ "learning_rate": 7.48e-06,
+ "loss": 0.1492,
+ "step": 375
+ },
+ {
+ "epoch": 3.91,
+ "grad_norm": 2.3954405784606934,
+ "learning_rate": 7.980000000000002e-06,
+ "loss": 0.142,
+ "step": 400
+ },
+ {
+ "epoch": 4.16,
+ "grad_norm": 2.181328296661377,
+ "learning_rate": 8.48e-06,
+ "loss": 0.1125,
+ "step": 425
+ },
+ {
+ "epoch": 4.4,
+ "grad_norm": 1.9877594709396362,
+ "learning_rate": 8.98e-06,
+ "loss": 0.0908,
+ "step": 450
+ },
+ {
+ "epoch": 4.65,
+ "grad_norm": 1.8853321075439453,
+ "learning_rate": 9.48e-06,
+ "loss": 0.0892,
+ "step": 475
+ },
+ {
+ "epoch": 4.89,
+ "grad_norm": 2.3802549839019775,
+ "learning_rate": 9.980000000000001e-06,
+ "loss": 0.091,
+ "step": 500
+ },
+ {
+ "epoch": 5.13,
+ "grad_norm": 1.3576879501342773,
+ "learning_rate": 9.946666666666667e-06,
+ "loss": 0.0713,
+ "step": 525
+ },
+ {
+ "epoch": 5.38,
+ "grad_norm": 2.4532103538513184,
+ "learning_rate": 9.891111111111113e-06,
+ "loss": 0.0534,
+ "step": 550
+ },
+ {
+ "epoch": 5.62,
+ "grad_norm": 1.3736106157302856,
+ "learning_rate": 9.835555555555556e-06,
+ "loss": 0.0512,
+ "step": 575
+ },
+ {
+ "epoch": 5.87,
+ "grad_norm": 2.0095458030700684,
+ "learning_rate": 9.780000000000001e-06,
+ "loss": 0.0571,
+ "step": 600
+ },
+ {
+ "epoch": 6.11,
+ "grad_norm": 1.2924531698226929,
+ "learning_rate": 9.724444444444445e-06,
+ "loss": 0.0453,
+ "step": 625
+ },
+ {
+ "epoch": 6.36,
+ "grad_norm": 1.0012321472167969,
+ "learning_rate": 9.66888888888889e-06,
+ "loss": 0.0292,
+ "step": 650
+ },
+ {
+ "epoch": 6.6,
+ "grad_norm": 1.4161434173583984,
+ "learning_rate": 9.613333333333335e-06,
+ "loss": 0.0325,
+ "step": 675
+ },
+ {
+ "epoch": 6.85,
+ "grad_norm": 5.784367084503174,
+ "learning_rate": 9.557777777777777e-06,
+ "loss": 0.031,
+ "step": 700
+ },
+ {
+ "epoch": 7.09,
+ "grad_norm": 0.8382102251052856,
+ "learning_rate": 9.502222222222223e-06,
+ "loss": 0.0247,
+ "step": 725
+ },
+ {
+ "epoch": 7.33,
+ "grad_norm": 1.2963491678237915,
+ "learning_rate": 9.446666666666667e-06,
+ "loss": 0.0162,
+ "step": 750
+ },
+ {
+ "epoch": 7.58,
+ "grad_norm": 1.7834402322769165,
+ "learning_rate": 9.391111111111111e-06,
+ "loss": 0.0175,
+ "step": 775
+ },
+ {
+ "epoch": 7.82,
+ "grad_norm": 0.9083292484283447,
+ "learning_rate": 9.335555555555557e-06,
+ "loss": 0.0193,
+ "step": 800
+ },
+ {
+ "epoch": 8.07,
+ "grad_norm": 0.5552634596824646,
+ "learning_rate": 9.280000000000001e-06,
+ "loss": 0.0157,
+ "step": 825
+ },
+ {
+ "epoch": 8.31,
+ "grad_norm": 1.1231069564819336,
+ "learning_rate": 9.224444444444445e-06,
+ "loss": 0.0105,
+ "step": 850
+ },
+ {
+ "epoch": 8.56,
+ "grad_norm": 1.206103801727295,
+ "learning_rate": 9.168888888888889e-06,
+ "loss": 0.0109,
+ "step": 875
+ },
+ {
+ "epoch": 8.8,
+ "grad_norm": 0.8872191309928894,
+ "learning_rate": 9.113333333333335e-06,
+ "loss": 0.0126,
+ "step": 900
+ },
+ {
+ "epoch": 9.05,
+ "grad_norm": 0.7421383261680603,
+ "learning_rate": 9.057777777777779e-06,
+ "loss": 0.0107,
+ "step": 925
+ },
+ {
+ "epoch": 9.29,
+ "grad_norm": 0.7581607103347778,
+ "learning_rate": 9.002222222222223e-06,
+ "loss": 0.006,
+ "step": 950
+ },
+ {
+ "epoch": 9.54,
+ "grad_norm": 0.6848894953727722,
+ "learning_rate": 8.946666666666669e-06,
+ "loss": 0.006,
+ "step": 975
+ },
+ {
+ "epoch": 9.78,
+ "grad_norm": 1.044122576713562,
+ "learning_rate": 8.891111111111111e-06,
+ "loss": 0.0067,
+ "step": 1000
+ },
+ {
+ "epoch": 9.78,
+ "eval_loss": 0.41375118494033813,
+ "eval_runtime": 1461.809,
+ "eval_samples_per_second": 1.98,
+ "eval_steps_per_second": 0.495,
+ "eval_wer": 18.938284039923083,
+ "step": 1000
+ },
+ {
+ "epoch": 10.02,
+ "grad_norm": 0.6757261753082275,
+ "learning_rate": 8.835555555555557e-06,
+ "loss": 0.0058,
+ "step": 1025
+ },
+ {
+ "epoch": 10.27,
+ "grad_norm": 1.085519552230835,
+ "learning_rate": 8.78e-06,
+ "loss": 0.0037,
+ "step": 1050
+ },
+ {
+ "epoch": 10.51,
+ "grad_norm": 0.8559943437576294,
+ "learning_rate": 8.724444444444445e-06,
+ "loss": 0.0044,
+ "step": 1075
+ },
+ {
+ "epoch": 10.76,
+ "grad_norm": 1.7756787538528442,
+ "learning_rate": 8.66888888888889e-06,
+ "loss": 0.0056,
+ "step": 1100
+ },
+ {
+ "epoch": 11.0,
+ "grad_norm": 0.5664415955543518,
+ "learning_rate": 8.613333333333333e-06,
+ "loss": 0.0048,
+ "step": 1125
+ },
+ {
+ "epoch": 11.25,
+ "grad_norm": 0.621498703956604,
+ "learning_rate": 8.557777777777778e-06,
+ "loss": 0.0038,
+ "step": 1150
+ },
+ {
+ "epoch": 11.49,
+ "grad_norm": 0.9859088659286499,
+ "learning_rate": 8.502222222222223e-06,
+ "loss": 0.0035,
+ "step": 1175
+ },
+ {
+ "epoch": 11.74,
+ "grad_norm": 1.2961162328720093,
+ "learning_rate": 8.446666666666668e-06,
+ "loss": 0.0041,
+ "step": 1200
+ },
+ {
+ "epoch": 11.98,
+ "grad_norm": 0.5769420862197876,
+ "learning_rate": 8.391111111111112e-06,
+ "loss": 0.0035,
+ "step": 1225
+ },
+ {
+ "epoch": 12.22,
+ "grad_norm": 0.5504060387611389,
+ "learning_rate": 8.335555555555556e-06,
+ "loss": 0.0022,
+ "step": 1250
+ },
+ {
+ "epoch": 12.47,
+ "grad_norm": 0.7063620090484619,
+ "learning_rate": 8.28e-06,
+ "loss": 0.0027,
+ "step": 1275
+ },
+ {
+ "epoch": 12.71,
+ "grad_norm": 0.6650658845901489,
+ "learning_rate": 8.224444444444444e-06,
+ "loss": 0.0029,
+ "step": 1300
+ },
+ {
+ "epoch": 12.96,
+ "grad_norm": 0.6803381443023682,
+ "learning_rate": 8.16888888888889e-06,
+ "loss": 0.0023,
+ "step": 1325
+ },
+ {
+ "epoch": 13.2,
+ "grad_norm": 0.19391104578971863,
+ "learning_rate": 8.113333333333334e-06,
+ "loss": 0.0013,
+ "step": 1350
+ },
+ {
+ "epoch": 13.45,
+ "grad_norm": 0.43767812848091125,
+ "learning_rate": 8.057777777777778e-06,
+ "loss": 0.002,
+ "step": 1375
+ },
+ {
+ "epoch": 13.69,
+ "grad_norm": 0.6082565188407898,
+ "learning_rate": 8.002222222222222e-06,
+ "loss": 0.0022,
+ "step": 1400
+ },
+ {
+ "epoch": 13.94,
+ "grad_norm": 0.30705004930496216,
+ "learning_rate": 7.946666666666666e-06,
+ "loss": 0.002,
+ "step": 1425
+ },
+ {
+ "epoch": 14.18,
+ "grad_norm": 0.18880507349967957,
+ "learning_rate": 7.891111111111112e-06,
+ "loss": 0.0015,
+ "step": 1450
+ },
+ {
+ "epoch": 14.43,
+ "grad_norm": 0.32524725794792175,
+ "learning_rate": 7.835555555555556e-06,
+ "loss": 0.0015,
+ "step": 1475
+ },
+ {
+ "epoch": 14.67,
+ "grad_norm": 2.48786997795105,
+ "learning_rate": 7.78e-06,
+ "loss": 0.0015,
+ "step": 1500
+ },
+ {
+ "epoch": 14.91,
+ "grad_norm": 0.3373986482620239,
+ "learning_rate": 7.724444444444446e-06,
+ "loss": 0.0013,
+ "step": 1525
+ },
+ {
+ "epoch": 15.16,
+ "grad_norm": 0.29098883271217346,
+ "learning_rate": 7.66888888888889e-06,
+ "loss": 0.0011,
+ "step": 1550
+ },
+ {
+ "epoch": 15.4,
+ "grad_norm": 0.12477891892194748,
+ "learning_rate": 7.613333333333334e-06,
+ "loss": 0.0008,
+ "step": 1575
+ },
+ {
+ "epoch": 15.65,
+ "grad_norm": 0.06489470601081848,
+ "learning_rate": 7.557777777777779e-06,
+ "loss": 0.0011,
+ "step": 1600
+ },
+ {
+ "epoch": 15.89,
+ "grad_norm": 0.061178650707006454,
+ "learning_rate": 7.502222222222223e-06,
+ "loss": 0.001,
+ "step": 1625
+ },
+ {
+ "epoch": 16.14,
+ "grad_norm": 0.038977060467004776,
+ "learning_rate": 7.446666666666668e-06,
+ "loss": 0.0007,
+ "step": 1650
+ },
+ {
+ "epoch": 16.38,
+ "grad_norm": 0.22110821306705475,
+ "learning_rate": 7.3911111111111125e-06,
+ "loss": 0.0007,
+ "step": 1675
+ },
+ {
+ "epoch": 16.63,
+ "grad_norm": 0.5320185422897339,
+ "learning_rate": 7.335555555555556e-06,
+ "loss": 0.0007,
+ "step": 1700
+ },
+ {
+ "epoch": 16.87,
+ "grad_norm": 0.7823454737663269,
+ "learning_rate": 7.280000000000001e-06,
+ "loss": 0.0008,
+ "step": 1725
+ },
+ {
+ "epoch": 17.11,
+ "grad_norm": 0.043301377445459366,
+ "learning_rate": 7.224444444444445e-06,
+ "loss": 0.001,
+ "step": 1750
+ },
+ {
+ "epoch": 17.36,
+ "grad_norm": 0.06231601908802986,
+ "learning_rate": 7.1688888888888895e-06,
+ "loss": 0.0005,
+ "step": 1775
+ },
+ {
+ "epoch": 17.6,
+ "grad_norm": 0.05838339775800705,
+ "learning_rate": 7.113333333333334e-06,
+ "loss": 0.0005,
+ "step": 1800
+ },
+ {
+ "epoch": 17.85,
+ "grad_norm": 0.05545497685670853,
+ "learning_rate": 7.057777777777778e-06,
+ "loss": 0.0012,
+ "step": 1825
+ },
+ {
+ "epoch": 18.09,
+ "grad_norm": 0.4030478894710541,
+ "learning_rate": 7.0022222222222225e-06,
+ "loss": 0.0008,
+ "step": 1850
+ },
+ {
+ "epoch": 18.34,
+ "grad_norm": 0.27439093589782715,
+ "learning_rate": 6.946666666666667e-06,
+ "loss": 0.0007,
+ "step": 1875
+ },
+ {
+ "epoch": 18.58,
+ "grad_norm": 0.25452977418899536,
+ "learning_rate": 6.891111111111111e-06,
+ "loss": 0.0007,
+ "step": 1900
+ },
+ {
+ "epoch": 18.83,
+ "grad_norm": 0.06759922206401825,
+ "learning_rate": 6.835555555555556e-06,
+ "loss": 0.0007,
+ "step": 1925
+ },
+ {
+ "epoch": 19.07,
+ "grad_norm": 0.25859466195106506,
+ "learning_rate": 6.780000000000001e-06,
+ "loss": 0.0006,
+ "step": 1950
+ },
+ {
+ "epoch": 19.32,
+ "grad_norm": 0.7427995800971985,
+ "learning_rate": 6.724444444444444e-06,
+ "loss": 0.0005,
+ "step": 1975
+ },
+ {
+ "epoch": 19.56,
+ "grad_norm": 0.0788324698805809,
+ "learning_rate": 6.668888888888889e-06,
+ "loss": 0.0008,
+ "step": 2000
+ },
+ {
+ "epoch": 19.56,
+ "eval_loss": 0.49481505155563354,
+ "eval_runtime": 1457.5675,
+ "eval_samples_per_second": 1.985,
+ "eval_steps_per_second": 0.497,
+ "eval_wer": 18.4735830052193,
+ "step": 2000
+ }
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 50,
+ "save_steps": 1000,
+ "total_flos": 3.691699875053568e+19,
+ "train_batch_size": 8,
+ "trial_name": null,
+ "trial_params": null
+ }
Models/hindi/checkpoint-2000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1562c17bb2dc7592b45af48209c138ce71faddb67bef288ecaf31f1c50f864ae
+ size 5048
Models/hindi/checkpoint-3000/config.json ADDED
@@ -0,0 +1,52 @@
+ {
+ "_name_or_path": "openai/whisper-small",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "suppress_tokens": [],
+ "torch_dtype": "float32",
+ "transformers_version": "4.40.0.dev0",
+ "use_cache": false,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
Models/hindi/checkpoint-3000/generation_config.json ADDED
@@ -0,0 +1,265 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ null
+ ],
+ [
+ 2,
+ 50359
+ ]
+ ],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "hi",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50358,
+ 50359,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.40.0.dev0"
+ }
Models/hindi/checkpoint-3000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6af9607fcb9df94c69ca425e62e18d55770b0a44e787d56fddf42eb73894cbcf
+ size 966995080
Models/hindi/checkpoint-3000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:df39818885c0e6483378ff38f62b30b3d6d196a2e31ec4bef6938752bcb459c2
+ size 1925064044
Models/hindi/checkpoint-3000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
Models/hindi/checkpoint-3000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:add9825a42b571991ab3854fb2e0ba47bf55d930a1681859c2a938fc623d503c
+ size 14244
Models/hindi/checkpoint-3000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:62c6fc3c22ad845c5522d6bb20938bae92ca52dec79a56e7f1a42148dc7d838b
+ size 1064
Models/hindi/checkpoint-3000/trainer_state.json ADDED
@@ -0,0 +1,888 @@
+ {
+ "best_metric": 18.07297866495742,
+ "best_model_checkpoint": "./checkpoint-3000",
+ "epoch": 29.339853300733495,
+ "eval_steps": 1000,
+ "global_step": 3000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.24,
+ "grad_norm": 39.86528778076172,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.0555,
+ "step": 25
+ },
+ {
+ "epoch": 0.49,
+ "grad_norm": Infinity,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 1.5219,
+ "step": 50
+ },
+ {
+ "epoch": 0.73,
+ "grad_norm": 6.200109958648682,
+ "learning_rate": 1.48e-06,
+ "loss": 1.0167,
+ "step": 75
+ },
+ {
+ "epoch": 0.98,
+ "grad_norm": 5.486103057861328,
+ "learning_rate": 1.98e-06,
+ "loss": 0.7299,
+ "step": 100
+ },
+ {
+ "epoch": 1.22,
+ "grad_norm": 5.532894134521484,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.6318,
+ "step": 125
+ },
+ {
+ "epoch": 1.47,
+ "grad_norm": 4.895535945892334,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.5503,
+ "step": 150
+ },
+ {
+ "epoch": 1.71,
+ "grad_norm": 5.065937519073486,
+ "learning_rate": 3.48e-06,
+ "loss": 0.4999,
+ "step": 175
+ },
+ {
+ "epoch": 1.96,
+ "grad_norm": 4.807600498199463,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.4457,
+ "step": 200
+ },
+ {
+ "epoch": 2.2,
+ "grad_norm": 4.568106174468994,
+ "learning_rate": 4.48e-06,
+ "loss": 0.3779,
+ "step": 225
+ },
+ {
+ "epoch": 2.44,
+ "grad_norm": 4.705833911895752,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.3332,
+ "step": 250
+ },
+ {
+ "epoch": 2.69,
+ "grad_norm": 4.315070629119873,
+ "learning_rate": 5.480000000000001e-06,
+ "loss": 0.2858,
+ "step": 275
+ },
+ {
+ "epoch": 2.93,
+ "grad_norm": 2.660728693008423,
+ "learning_rate": 5.98e-06,
+ "loss": 0.2228,
+ "step": 300
+ },
+ {
+ "epoch": 3.18,
+ "grad_norm": 2.5569632053375244,
+ "learning_rate": 6.480000000000001e-06,
+ "loss": 0.1766,
+ "step": 325
+ },
+ {
+ "epoch": 3.42,
+ "grad_norm": 2.2663660049438477,
+ "learning_rate": 6.98e-06,
+ "loss": 0.1475,
+ "step": 350
+ },
+ {
+ "epoch": 3.67,
+ "grad_norm": 2.5257091522216797,
+ "learning_rate": 7.48e-06,
+ "loss": 0.1492,
+ "step": 375
+ },
+ {
+ "epoch": 3.91,
+ "grad_norm": 2.3954405784606934,
+ "learning_rate": 7.980000000000002e-06,
+ "loss": 0.142,
+ "step": 400
+ },
+ {
+ "epoch": 4.16,
+ "grad_norm": 2.181328296661377,
+ "learning_rate": 8.48e-06,
+ "loss": 0.1125,
+ "step": 425
+ },
+ {
+ "epoch": 4.4,
+ "grad_norm": 1.9877594709396362,
+ "learning_rate": 8.98e-06,
+ "loss": 0.0908,
+ "step": 450
+ },
+ {
+ "epoch": 4.65,
+ "grad_norm": 1.8853321075439453,
+ "learning_rate": 9.48e-06,
+ "loss": 0.0892,
+ "step": 475
+ },
+ {
+ "epoch": 4.89,
+ "grad_norm": 2.3802549839019775,
+ "learning_rate": 9.980000000000001e-06,
+ "loss": 0.091,
+ "step": 500
+ },
+ {
+ "epoch": 5.13,
+ "grad_norm": 1.3576879501342773,
+ "learning_rate": 9.946666666666667e-06,
+ "loss": 0.0713,
+ "step": 525
+ },
+ {
+ "epoch": 5.38,
+ "grad_norm": 2.4532103538513184,
+ "learning_rate": 9.891111111111113e-06,
+ "loss": 0.0534,
+ "step": 550
+ },
+ {
+ "epoch": 5.62,
+ "grad_norm": 1.3736106157302856,
+ "learning_rate": 9.835555555555556e-06,
+ "loss": 0.0512,
+ "step": 575
+ },
+ {
+ "epoch": 5.87,
+ "grad_norm": 2.0095458030700684,
+ "learning_rate": 9.780000000000001e-06,
+ "loss": 0.0571,
+ "step": 600
+ },
+ {
+ "epoch": 6.11,
+ "grad_norm": 1.2924531698226929,
+ "learning_rate": 9.724444444444445e-06,
+ "loss": 0.0453,
+ "step": 625
+ },
+ {
+ "epoch": 6.36,
+ "grad_norm": 1.0012321472167969,
+ "learning_rate": 9.66888888888889e-06,
+ "loss": 0.0292,
+ "step": 650
+ },
+ {
+ "epoch": 6.6,
+ "grad_norm": 1.4161434173583984,
+ "learning_rate": 9.613333333333335e-06,
+ "loss": 0.0325,
+ "step": 675
+ },
+ {
+ "epoch": 6.85,
+ "grad_norm": 5.784367084503174,
+ "learning_rate": 9.557777777777777e-06,
+ "loss": 0.031,
+ "step": 700
+ },
+ {
+ "epoch": 7.09,
+ "grad_norm": 0.8382102251052856,
+ "learning_rate": 9.502222222222223e-06,
+ "loss": 0.0247,
+ "step": 725
+ },
+ {
+ "epoch": 7.33,
+ "grad_norm": 1.2963491678237915,
+ "learning_rate": 9.446666666666667e-06,
+ "loss": 0.0162,
+ "step": 750
+ },
+ {
+ "epoch": 7.58,
+ "grad_norm": 1.7834402322769165,
+ "learning_rate": 9.391111111111111e-06,
+ "loss": 0.0175,
+ "step": 775
+ },
+ {
+ "epoch": 7.82,
+ "grad_norm": 0.9083292484283447,
+ "learning_rate": 9.335555555555557e-06,
+ "loss": 0.0193,
+ "step": 800
+ },
+ {
+ "epoch": 8.07,
+ "grad_norm": 0.5552634596824646,
+ "learning_rate": 9.280000000000001e-06,
+ "loss": 0.0157,
+ "step": 825
+ },
+ {
+ "epoch": 8.31,
+ "grad_norm": 1.1231069564819336,
+ "learning_rate": 9.224444444444445e-06,
+ "loss": 0.0105,
+ "step": 850
+ },
+ {
+ "epoch": 8.56,
+ "grad_norm": 1.206103801727295,
+ "learning_rate": 9.168888888888889e-06,
+ "loss": 0.0109,
+ "step": 875
+ },
+ {
+ "epoch": 8.8,
+ "grad_norm": 0.8872191309928894,
+ "learning_rate": 9.113333333333335e-06,
+ "loss": 0.0126,
+ "step": 900
+ },
+ {
+ "epoch": 9.05,
+ "grad_norm": 0.7421383261680603,
+ "learning_rate": 9.057777777777779e-06,
+ "loss": 0.0107,
+ "step": 925
+ },
+ {
+ "epoch": 9.29,
+ "grad_norm": 0.7581607103347778,
+ "learning_rate": 9.002222222222223e-06,
+ "loss": 0.006,
+ "step": 950
+ },
+ {
+ "epoch": 9.54,
+ "grad_norm": 0.6848894953727722,
+ "learning_rate": 8.946666666666669e-06,
+ "loss": 0.006,
+ "step": 975
+ },
+ {
+ "epoch": 9.78,
+ "grad_norm": 1.044122576713562,
+ "learning_rate": 8.891111111111111e-06,
+ "loss": 0.0067,
+ "step": 1000
+ },
+ {
+ "epoch": 9.78,
+ "eval_loss": 0.41375118494033813,
+ "eval_runtime": 1461.809,
+ "eval_samples_per_second": 1.98,
+ "eval_steps_per_second": 0.495,
+ "eval_wer": 18.938284039923083,
+ "step": 1000
+ },
+ {
+ "epoch": 10.02,
+ "grad_norm": 0.6757261753082275,
+ "learning_rate": 8.835555555555557e-06,
+ "loss": 0.0058,
+ "step": 1025
+ },
+ {
+ "epoch": 10.27,
+ "grad_norm": 1.085519552230835,
+ "learning_rate": 8.78e-06,
+ "loss": 0.0037,
+ "step": 1050
+ },
+ {
+ "epoch": 10.51,
+ "grad_norm": 0.8559943437576294,
+ "learning_rate": 8.724444444444445e-06,
+ "loss": 0.0044,
+ "step": 1075
+ },
+ {
+ "epoch": 10.76,
+ "grad_norm": 1.7756787538528442,
+ "learning_rate": 8.66888888888889e-06,
+ "loss": 0.0056,
+ "step": 1100
+ },
+ {
+ "epoch": 11.0,
+ "grad_norm": 0.5664415955543518,
+ "learning_rate": 8.613333333333333e-06,
+ "loss": 0.0048,
+ "step": 1125
+ },
+ {
+ "epoch": 11.25,
+ "grad_norm": 0.621498703956604,
+ "learning_rate": 8.557777777777778e-06,
+ "loss": 0.0038,
+ "step": 1150
+ },
+ {
+ "epoch": 11.49,
+ "grad_norm": 0.9859088659286499,
+ "learning_rate": 8.502222222222223e-06,
+ "loss": 0.0035,
+ "step": 1175
+ },
+ {
+ "epoch": 11.74,
+ "grad_norm": 1.2961162328720093,
+ "learning_rate": 8.446666666666668e-06,
+ "loss": 0.0041,
+ "step": 1200
+ },
+ {
+ "epoch": 11.98,
+ "grad_norm": 0.5769420862197876,
+ "learning_rate": 8.391111111111112e-06,
+ "loss": 0.0035,
+ "step": 1225
+ },
+ {
+ "epoch": 12.22,
+ "grad_norm": 0.5504060387611389,
+ "learning_rate": 8.335555555555556e-06,
+ "loss": 0.0022,
+ "step": 1250
+ },
+ {
+ "epoch": 12.47,
+ "grad_norm": 0.7063620090484619,
+ "learning_rate": 8.28e-06,
+ "loss": 0.0027,
+ "step": 1275
+ },
+ {
+ "epoch": 12.71,
+ "grad_norm": 0.6650658845901489,
+ "learning_rate": 8.224444444444444e-06,
+ "loss": 0.0029,
+ "step": 1300
+ },
+ {
+ "epoch": 12.96,
+ "grad_norm": 0.6803381443023682,
+ "learning_rate": 8.16888888888889e-06,
+ "loss": 0.0023,
+ "step": 1325
+ },
+ {
+ "epoch": 13.2,
+ "grad_norm": 0.19391104578971863,
+ "learning_rate": 8.113333333333334e-06,
+ "loss": 0.0013,
+ "step": 1350
+ },
+ {
+ "epoch": 13.45,
+ "grad_norm": 0.43767812848091125,
+ "learning_rate": 8.057777777777778e-06,
+ "loss": 0.002,
+ "step": 1375
+ },
+ {
+ "epoch": 13.69,
+ "grad_norm": 0.6082565188407898,
+ "learning_rate": 8.002222222222222e-06,
+ "loss": 0.0022,
+ "step": 1400
+ },
+ {
+ "epoch": 13.94,
+ "grad_norm": 0.30705004930496216,
+ "learning_rate": 7.946666666666666e-06,
+ "loss": 0.002,
+ "step": 1425
+ },
+ {
+ "epoch": 14.18,
+ "grad_norm": 0.18880507349967957,
+ "learning_rate": 7.891111111111112e-06,
+ "loss": 0.0015,
+ "step": 1450
+ },
+ {
+ "epoch": 14.43,
+ "grad_norm": 0.32524725794792175,
+ "learning_rate": 7.835555555555556e-06,
+ "loss": 0.0015,
+ "step": 1475
+ },
+ {
+ "epoch": 14.67,
+ "grad_norm": 2.48786997795105,
+ "learning_rate": 7.78e-06,
+ "loss": 0.0015,
+ "step": 1500
+ },
+ {
+ "epoch": 14.91,
+ "grad_norm": 0.3373986482620239,
+ "learning_rate": 7.724444444444446e-06,
+ "loss": 0.0013,
+ "step": 1525
+ },
+ {
+ "epoch": 15.16,
+ "grad_norm": 0.29098883271217346,
+ "learning_rate": 7.66888888888889e-06,
+ "loss": 0.0011,
+ "step": 1550
+ },
+ {
+ "epoch": 15.4,
+ "grad_norm": 0.12477891892194748,
+ "learning_rate": 7.613333333333334e-06,
+ "loss": 0.0008,
+ "step": 1575
+ },
+ {
+ "epoch": 15.65,
+ "grad_norm": 0.06489470601081848,
+ "learning_rate": 7.557777777777779e-06,
+ "loss": 0.0011,
+ "step": 1600
+ },
+ {
+ "epoch": 15.89,
+ "grad_norm": 0.061178650707006454,
+ "learning_rate": 7.502222222222223e-06,
+ "loss": 0.001,
+ "step": 1625
+ },
+ {
+ "epoch": 16.14,
+ "grad_norm": 0.038977060467004776,
+ "learning_rate": 7.446666666666668e-06,
+ "loss": 0.0007,
+ "step": 1650
+ },
+ {
+ "epoch": 16.38,
+ "grad_norm": 0.22110821306705475,
+ "learning_rate": 7.3911111111111125e-06,
+ "loss": 0.0007,
+ "step": 1675
+ },
+ {
+ "epoch": 16.63,
+ "grad_norm": 0.5320185422897339,
+ "learning_rate": 7.335555555555556e-06,
+ "loss": 0.0007,
+ "step": 1700
+ },
+ {
+ "epoch": 16.87,
+ "grad_norm": 0.7823454737663269,
+ "learning_rate": 7.280000000000001e-06,
+ "loss": 0.0008,
+ "step": 1725
+ },
+ {
+ "epoch": 17.11,
+ "grad_norm": 0.043301377445459366,
+ "learning_rate": 7.224444444444445e-06,
+ "loss": 0.001,
+ "step": 1750
+ },
+ {
+ "epoch": 17.36,
+ "grad_norm": 0.06231601908802986,
+ "learning_rate": 7.1688888888888895e-06,
+ "loss": 0.0005,
+ "step": 1775
+ },
+ {
+ "epoch": 17.6,
+ "grad_norm": 0.05838339775800705,
+ "learning_rate": 7.113333333333334e-06,
+ "loss": 0.0005,
+ "step": 1800
+ },
+ {
+ "epoch": 17.85,
+ "grad_norm": 0.05545497685670853,
+ "learning_rate": 7.057777777777778e-06,
+ "loss": 0.0012,
+ "step": 1825
+ },
+ {
+ "epoch": 18.09,
+ "grad_norm": 0.4030478894710541,
+ "learning_rate": 7.0022222222222225e-06,
+ "loss": 0.0008,
+ "step": 1850
+ },
+ {
+ "epoch": 18.34,
+ "grad_norm": 0.27439093589782715,
+ "learning_rate": 6.946666666666667e-06,
+ "loss": 0.0007,
+ "step": 1875
+ },
+ {
+ "epoch": 18.58,
+ "grad_norm": 0.25452977418899536,
+ "learning_rate": 6.891111111111111e-06,
+ "loss": 0.0007,
+ "step": 1900
+ },
+ {
+ "epoch": 18.83,
+ "grad_norm": 0.06759922206401825,
+ "learning_rate": 6.835555555555556e-06,
+ "loss": 0.0007,
+ "step": 1925
+ },
+ {
+ "epoch": 19.07,
+ "grad_norm": 0.25859466195106506,
+ "learning_rate": 6.780000000000001e-06,
+ "loss": 0.0006,
+ "step": 1950
+ },
+ {
+ "epoch": 19.32,
+ "grad_norm": 0.7427995800971985,
+ "learning_rate": 6.724444444444444e-06,
+ "loss": 0.0005,
+ "step": 1975
+ },
+ {
+ "epoch": 19.56,
+ "grad_norm": 0.0788324698805809,
+ "learning_rate": 6.668888888888889e-06,
+ "loss": 0.0008,
+ "step": 2000
+ },
+ {
+ "epoch": 19.56,
+ "eval_loss": 0.49481505155563354,
+ "eval_runtime": 1457.5675,
+ "eval_samples_per_second": 1.985,
+ "eval_steps_per_second": 0.497,
+ "eval_wer": 18.4735830052193,
+ "step": 2000
+ },
+ {
+ "epoch": 19.8,
+ "grad_norm": 0.04227956011891365,
+ "learning_rate": 6.613333333333334e-06,
+ "loss": 0.0008,
+ "step": 2025
+ },
+ {
+ "epoch": 20.05,
+ "grad_norm": 0.5580443739891052,
+ "learning_rate": 6.557777777777778e-06,
+ "loss": 0.001,
+ "step": 2050
+ },
+ {
+ "epoch": 20.29,
+ "grad_norm": 0.7394335865974426,
+ "learning_rate": 6.502222222222223e-06,
+ "loss": 0.0014,
+ "step": 2075
+ },
+ {
+ "epoch": 20.54,
+ "grad_norm": 0.8055688142776489,
+ "learning_rate": 6.446666666666668e-06,
+ "loss": 0.0011,
+ "step": 2100
+ },
+ {
+ "epoch": 20.78,
+ "grad_norm": 0.13119255006313324,
+ "learning_rate": 6.391111111111111e-06,
+ "loss": 0.0016,
+ "step": 2125
+ },
+ {
+ "epoch": 21.03,
+ "grad_norm": 0.21813702583312988,
+ "learning_rate": 6.335555555555556e-06,
+ "loss": 0.0014,
+ "step": 2150
+ },
+ {
+ "epoch": 21.27,
+ "grad_norm": 0.1066213995218277,
+ "learning_rate": 6.280000000000001e-06,
+ "loss": 0.0009,
+ "step": 2175
+ },
+ {
+ "epoch": 21.52,
+ "grad_norm": 0.8583650588989258,
+ "learning_rate": 6.224444444444445e-06,
+ "loss": 0.0012,
+ "step": 2200
+ },
+ {
+ "epoch": 21.76,
+ "grad_norm": 1.2513171434402466,
+ "learning_rate": 6.16888888888889e-06,
+ "loss": 0.0021,
+ "step": 2225
+ },
+ {
+ "epoch": 22.0,
+ "grad_norm": 0.8390223979949951,
+ "learning_rate": 6.113333333333333e-06,
+ "loss": 0.0018,
+ "step": 2250
+ },
+ {
+ "epoch": 22.25,
+ "grad_norm": 0.8746078610420227,
+ "learning_rate": 6.057777777777778e-06,
+ "loss": 0.0015,
+ "step": 2275
+ },
+ {
+ "epoch": 22.49,
+ "grad_norm": 0.13358770310878754,
+ "learning_rate": 6.002222222222223e-06,
+ "loss": 0.0016,
+ "step": 2300
+ },
+ {
+ "epoch": 22.74,
+ "grad_norm": 0.07681471109390259,
+ "learning_rate": 5.946666666666668e-06,
+ "loss": 0.0011,
+ "step": 2325
+ },
+ {
+ "epoch": 22.98,
+ "grad_norm": 0.5511406660079956,
+ "learning_rate": 5.891111111111112e-06,
+ "loss": 0.0013,
+ "step": 2350
+ },
+ {
+ "epoch": 23.23,
+ "grad_norm": 0.23318354785442352,
+ "learning_rate": 5.8355555555555565e-06,
+ "loss": 0.0011,
+ "step": 2375
+ },
+ {
+ "epoch": 23.47,
+ "grad_norm": 0.12396834790706635,
+ "learning_rate": 5.78e-06,
+ "loss": 0.0009,
+ "step": 2400
+ },
+ {
+ "epoch": 23.72,
+ "grad_norm": 0.0644838809967041,
+ "learning_rate": 5.724444444444445e-06,
+ "loss": 0.0006,
+ "step": 2425
+ },
+ {
+ "epoch": 23.96,
+ "grad_norm": 0.47172439098358154,
+ "learning_rate": 5.6688888888888895e-06,
+ "loss": 0.0007,
+ "step": 2450
+ },
+ {
+ "epoch": 24.21,
+ "grad_norm": 0.030231019482016563,
+ "learning_rate": 5.613333333333334e-06,
+ "loss": 0.0005,
+ "step": 2475
+ },
+ {
+ "epoch": 24.45,
+ "grad_norm": 0.01894545555114746,
+ "learning_rate": 5.557777777777778e-06,
+ "loss": 0.0003,
+ "step": 2500
+ },
+ {
+ "epoch": 24.69,
+ "grad_norm": 0.18070322275161743,
+ "learning_rate": 5.5022222222222224e-06,
+ "loss": 0.0009,
+ "step": 2525
+ },
+ {
+ "epoch": 24.94,
+ "grad_norm": 0.0435708686709404,
+ "learning_rate": 5.4466666666666665e-06,
+ "loss": 0.0009,
+ "step": 2550
+ },
+ {
+ "epoch": 25.18,
+ "grad_norm": 0.07220447063446045,
+ "learning_rate": 5.391111111111111e-06,
+ "loss": 0.0005,
+ "step": 2575
+ },
+ {
+ "epoch": 25.43,
+ "grad_norm": 0.01733986660838127,
+ "learning_rate": 5.335555555555556e-06,
+ "loss": 0.0004,
+ "step": 2600
+ },
+ {
+ "epoch": 25.67,
+ "grad_norm": 0.03520004078745842,
+ "learning_rate": 5.28e-06,
+ "loss": 0.0007,
+ "step": 2625
+ },
+ {
+ "epoch": 25.92,
+ "grad_norm": 0.03853292763233185,
+ "learning_rate": 5.224444444444445e-06,
+ "loss": 0.0005,
+ "step": 2650
+ },
+ {
+ "epoch": 26.16,
+ "grad_norm": 0.13450591266155243,
+ "learning_rate": 5.168888888888889e-06,
+ "loss": 0.0005,
+ "step": 2675
+ },
+ {
+ "epoch": 26.41,
+ "grad_norm": 0.029255390167236328,
+ "learning_rate": 5.113333333333333e-06,
+ "loss": 0.0004,
+ "step": 2700
+ },
+ {
+ "epoch": 26.65,
+ "grad_norm": 0.025706447660923004,
+ "learning_rate": 5.057777777777778e-06,
+ "loss": 0.0003,
+ "step": 2725
+ },
+ {
+ "epoch": 26.89,
+ "grad_norm": 0.902415931224823,
+ "learning_rate": 5.002222222222223e-06,
+ "loss": 0.0002,
+ "step": 2750
+ },
+ {
+ "epoch": 27.14,
+ "grad_norm": 0.013656423427164555,
+ "learning_rate": 4.946666666666667e-06,
+ "loss": 0.0002,
+ "step": 2775
+ },
+ {
+ "epoch": 27.38,
+ "grad_norm": 0.018052740022540092,
+ "learning_rate": 4.891111111111111e-06,
+ "loss": 0.0003,
+ "step": 2800
+ },
+ {
+ "epoch": 27.63,
+ "grad_norm": 0.07001502811908722,
+ "learning_rate": 4.835555555555556e-06,
+ "loss": 0.0002,
+ "step": 2825
+ },
+ {
+ "epoch": 27.87,
+ "grad_norm": 0.01241993810981512,
+ "learning_rate": 4.78e-06,
+ "loss": 0.0003,
+ "step": 2850
+ },
+ {
+ "epoch": 28.12,
+ "grad_norm": 0.013495378196239471,
+ "learning_rate": 4.724444444444445e-06,
+ "loss": 0.0001,
+ "step": 2875
+ },
+ {
+ "epoch": 28.36,
+ "grad_norm": 0.011048069223761559,
+ "learning_rate": 4.66888888888889e-06,
+ "loss": 0.0001,
+ "step": 2900
+ },
+ {
+ "epoch": 28.61,
+ "grad_norm": 0.015970442444086075,
+ "learning_rate": 4.613333333333334e-06,
+ "loss": 0.0002,
+ "step": 2925
+ },
+ {
+ "epoch": 28.85,
+ "grad_norm": 0.009559527039527893,
+ "learning_rate": 4.557777777777778e-06,
+ "loss": 0.0002,
+ "step": 2950
+ },
+ {
+ "epoch": 29.1,
+ "grad_norm": 0.010293275117874146,
+ "learning_rate": 4.502222222222223e-06,
+ "loss": 0.0001,
+ "step": 2975
+ },
+ {
+ "epoch": 29.34,
+ "grad_norm": 0.01116804126650095,
+ "learning_rate": 4.446666666666667e-06,
+ "loss": 0.0001,
+ "step": 3000
+ },
+ {
+ "epoch": 29.34,
+ "eval_loss": 0.5352661609649658,
+ "eval_runtime": 1458.2257,
+ "eval_samples_per_second": 1.985,
+ "eval_steps_per_second": 0.496,
+ "eval_wer": 18.07297866495742,
+ "step": 3000
+ }
+ ],
+ "logging_steps": 25,
+ "max_steps": 5000,
+ "num_input_tokens_seen": 0,
+ "num_train_epochs": 50,
+ "save_steps": 1000,
+ "total_flos": 5.537492095500288e+19,
+ "train_batch_size": 8,
+ "trial_name": null,
+ "trial_params": null
+ }
Models/hindi/checkpoint-3000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1562c17bb2dc7592b45af48209c138ce71faddb67bef288ecaf31f1c50f864ae
+ size 5048
Models/hindi/checkpoint-4000/config.json ADDED
@@ -0,0 +1,52 @@
+ {
+ "_name_or_path": "openai/whisper-small",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "suppress_tokens": [],
+ "torch_dtype": "float32",
+ "transformers_version": "4.40.0.dev0",
+ "use_cache": false,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
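The architecture values in this config are the stock `openai/whisper-small` dimensions. As a minimal consistency sketch (field meanings assumed from the standard Transformers Whisper architecture, not stated in this diff), the per-head width follows from the model width and head count:

```python
# Sketch: internal consistency of the checkpoint's config.json dimensions.
# Field meanings are assumed from the standard Whisper architecture.
d_model = 768                 # hidden size per token
encoder_attention_heads = 12  # attention heads per encoder layer

# The hidden size must split evenly across the attention heads.
assert d_model % encoder_attention_heads == 0
head_dim = d_model // encoder_attention_heads
print(head_dim)  # → 64
```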
Models/hindi/checkpoint-4000/generation_config.json ADDED
@@ -0,0 +1,265 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ null
+ ],
+ [
+ 2,
+ 50359
+ ]
+ ],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "hi",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50358,
+ 50359,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.40.0.dev0"
+ }
Models/hindi/checkpoint-4000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0e7ee6929d4a9289b779431fe8f499fff1d35555d5380656ae14fe72eda3de8e
+ size 966995080
Models/hindi/checkpoint-4000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:051e7b617e2783440bbf8e083f6a4234b94e39cc11317e059447c1c181331510
+ size 1925064044
Models/hindi/checkpoint-4000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
Models/hindi/checkpoint-4000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c76d2a773674218dbf191e2bd7972993110ab38f0f4ec23dc0737b952d379f3b
+ size 14244
Models/hindi/checkpoint-4000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a382a9c61605f8d81ded18f481590da40e63ed655cb410f92839fa7b08314e83
+ size 1064
Models/hindi/checkpoint-4000/trainer_state.json ADDED
@@ -0,0 +1,1177 @@
+ {
+ "best_metric": 18.056954491346946,
+ "best_model_checkpoint": "./checkpoint-4000",
+ "epoch": 39.119804400978,
+ "eval_steps": 1000,
+ "global_step": 4000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.24,
+ "grad_norm": 39.86528778076172,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.0555,
+ "step": 25
+ },
+ {
+ "epoch": 0.49,
+ "grad_norm": Infinity,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 1.5219,
+ "step": 50
+ },
+ {
+ "epoch": 0.73,
+ "grad_norm": 6.200109958648682,
+ "learning_rate": 1.48e-06,
+ "loss": 1.0167,
+ "step": 75
+ },
+ {
+ "epoch": 0.98,
+ "grad_norm": 5.486103057861328,
+ "learning_rate": 1.98e-06,
+ "loss": 0.7299,
+ "step": 100
+ },
+ {
+ "epoch": 1.22,
+ "grad_norm": 5.532894134521484,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.6318,
+ "step": 125
+ },
+ {
+ "epoch": 1.47,
+ "grad_norm": 4.895535945892334,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.5503,
+ "step": 150
+ },
+ {
+ "epoch": 1.71,
+ "grad_norm": 5.065937519073486,
+ "learning_rate": 3.48e-06,
+ "loss": 0.4999,
+ "step": 175
+ },
+ {
+ "epoch": 1.96,
+ "grad_norm": 4.807600498199463,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.4457,
+ "step": 200
+ },
+ {
+ "epoch": 2.2,
+ "grad_norm": 4.568106174468994,
+ "learning_rate": 4.48e-06,
+ "loss": 0.3779,
+ "step": 225
+ },
+ {
+ "epoch": 2.44,
+ "grad_norm": 4.705833911895752,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.3332,
+ "step": 250
+ },
+ {
+ "epoch": 2.69,
+ "grad_norm": 4.315070629119873,
84
+ "learning_rate": 5.480000000000001e-06,
85
+ "loss": 0.2858,
86
+ "step": 275
87
+ },
88
+ {
89
+ "epoch": 2.93,
90
+ "grad_norm": 2.660728693008423,
91
+ "learning_rate": 5.98e-06,
92
+ "loss": 0.2228,
93
+ "step": 300
94
+ },
95
+ {
96
+ "epoch": 3.18,
97
+ "grad_norm": 2.5569632053375244,
98
+ "learning_rate": 6.480000000000001e-06,
99
+ "loss": 0.1766,
100
+ "step": 325
101
+ },
102
+ {
103
+ "epoch": 3.42,
104
+ "grad_norm": 2.2663660049438477,
105
+ "learning_rate": 6.98e-06,
106
+ "loss": 0.1475,
107
+ "step": 350
108
+ },
109
+ {
110
+ "epoch": 3.67,
111
+ "grad_norm": 2.5257091522216797,
112
+ "learning_rate": 7.48e-06,
113
+ "loss": 0.1492,
114
+ "step": 375
115
+ },
116
+ {
117
+ "epoch": 3.91,
118
+ "grad_norm": 2.3954405784606934,
119
+ "learning_rate": 7.980000000000002e-06,
120
+ "loss": 0.142,
121
+ "step": 400
122
+ },
123
+ {
124
+ "epoch": 4.16,
125
+ "grad_norm": 2.181328296661377,
126
+ "learning_rate": 8.48e-06,
127
+ "loss": 0.1125,
128
+ "step": 425
129
+ },
130
+ {
131
+ "epoch": 4.4,
132
+ "grad_norm": 1.9877594709396362,
133
+ "learning_rate": 8.98e-06,
134
+ "loss": 0.0908,
135
+ "step": 450
136
+ },
137
+ {
138
+ "epoch": 4.65,
139
+ "grad_norm": 1.8853321075439453,
140
+ "learning_rate": 9.48e-06,
141
+ "loss": 0.0892,
142
+ "step": 475
143
+ },
144
+ {
145
+ "epoch": 4.89,
146
+ "grad_norm": 2.3802549839019775,
147
+ "learning_rate": 9.980000000000001e-06,
148
+ "loss": 0.091,
149
+ "step": 500
150
+ },
151
+ {
152
+ "epoch": 5.13,
153
+ "grad_norm": 1.3576879501342773,
154
+ "learning_rate": 9.946666666666667e-06,
155
+ "loss": 0.0713,
156
+ "step": 525
157
+ },
158
+ {
159
+ "epoch": 5.38,
160
+ "grad_norm": 2.4532103538513184,
161
+ "learning_rate": 9.891111111111113e-06,
162
+ "loss": 0.0534,
163
+ "step": 550
164
+ },
165
+ {
166
+ "epoch": 5.62,
167
+ "grad_norm": 1.3736106157302856,
168
+ "learning_rate": 9.835555555555556e-06,
169
+ "loss": 0.0512,
170
+ "step": 575
171
+ },
172
+ {
173
+ "epoch": 5.87,
174
+ "grad_norm": 2.0095458030700684,
175
+ "learning_rate": 9.780000000000001e-06,
176
+ "loss": 0.0571,
177
+ "step": 600
178
+ },
179
+ {
180
+ "epoch": 6.11,
181
+ "grad_norm": 1.2924531698226929,
182
+ "learning_rate": 9.724444444444445e-06,
183
+ "loss": 0.0453,
184
+ "step": 625
185
+ },
186
+ {
187
+ "epoch": 6.36,
188
+ "grad_norm": 1.0012321472167969,
189
+ "learning_rate": 9.66888888888889e-06,
190
+ "loss": 0.0292,
191
+ "step": 650
192
+ },
193
+ {
194
+ "epoch": 6.6,
195
+ "grad_norm": 1.4161434173583984,
196
+ "learning_rate": 9.613333333333335e-06,
197
+ "loss": 0.0325,
198
+ "step": 675
199
+ },
200
+ {
201
+ "epoch": 6.85,
202
+ "grad_norm": 5.784367084503174,
203
+ "learning_rate": 9.557777777777777e-06,
204
+ "loss": 0.031,
205
+ "step": 700
206
+ },
207
+ {
208
+ "epoch": 7.09,
209
+ "grad_norm": 0.8382102251052856,
210
+ "learning_rate": 9.502222222222223e-06,
211
+ "loss": 0.0247,
212
+ "step": 725
213
+ },
214
+ {
215
+ "epoch": 7.33,
216
+ "grad_norm": 1.2963491678237915,
217
+ "learning_rate": 9.446666666666667e-06,
218
+ "loss": 0.0162,
219
+ "step": 750
220
+ },
221
+ {
222
+ "epoch": 7.58,
223
+ "grad_norm": 1.7834402322769165,
224
+ "learning_rate": 9.391111111111111e-06,
225
+ "loss": 0.0175,
226
+ "step": 775
227
+ },
228
+ {
229
+ "epoch": 7.82,
230
+ "grad_norm": 0.9083292484283447,
231
+ "learning_rate": 9.335555555555557e-06,
232
+ "loss": 0.0193,
233
+ "step": 800
234
+ },
235
+ {
236
+ "epoch": 8.07,
237
+ "grad_norm": 0.5552634596824646,
238
+ "learning_rate": 9.280000000000001e-06,
239
+ "loss": 0.0157,
240
+ "step": 825
241
+ },
242
+ {
243
+ "epoch": 8.31,
244
+ "grad_norm": 1.1231069564819336,
245
+ "learning_rate": 9.224444444444445e-06,
246
+ "loss": 0.0105,
247
+ "step": 850
248
+ },
249
+ {
250
+ "epoch": 8.56,
251
+ "grad_norm": 1.206103801727295,
252
+ "learning_rate": 9.168888888888889e-06,
253
+ "loss": 0.0109,
254
+ "step": 875
255
+ },
256
+ {
257
+ "epoch": 8.8,
258
+ "grad_norm": 0.8872191309928894,
259
+ "learning_rate": 9.113333333333335e-06,
260
+ "loss": 0.0126,
261
+ "step": 900
262
+ },
263
+ {
264
+ "epoch": 9.05,
265
+ "grad_norm": 0.7421383261680603,
266
+ "learning_rate": 9.057777777777779e-06,
267
+ "loss": 0.0107,
268
+ "step": 925
269
+ },
270
+ {
271
+ "epoch": 9.29,
272
+ "grad_norm": 0.7581607103347778,
273
+ "learning_rate": 9.002222222222223e-06,
274
+ "loss": 0.006,
275
+ "step": 950
276
+ },
277
+ {
278
+ "epoch": 9.54,
279
+ "grad_norm": 0.6848894953727722,
280
+ "learning_rate": 8.946666666666669e-06,
281
+ "loss": 0.006,
282
+ "step": 975
283
+ },
284
+ {
285
+ "epoch": 9.78,
286
+ "grad_norm": 1.044122576713562,
287
+ "learning_rate": 8.891111111111111e-06,
288
+ "loss": 0.0067,
289
+ "step": 1000
290
+ },
291
+ {
292
+ "epoch": 9.78,
293
+ "eval_loss": 0.41375118494033813,
294
+ "eval_runtime": 1461.809,
295
+ "eval_samples_per_second": 1.98,
296
+ "eval_steps_per_second": 0.495,
297
+ "eval_wer": 18.938284039923083,
298
+ "step": 1000
299
+ },
300
+ {
301
+ "epoch": 10.02,
302
+ "grad_norm": 0.6757261753082275,
303
+ "learning_rate": 8.835555555555557e-06,
304
+ "loss": 0.0058,
305
+ "step": 1025
306
+ },
307
+ {
308
+ "epoch": 10.27,
309
+ "grad_norm": 1.085519552230835,
310
+ "learning_rate": 8.78e-06,
311
+ "loss": 0.0037,
312
+ "step": 1050
313
+ },
314
+ {
315
+ "epoch": 10.51,
316
+ "grad_norm": 0.8559943437576294,
317
+ "learning_rate": 8.724444444444445e-06,
318
+ "loss": 0.0044,
319
+ "step": 1075
320
+ },
321
+ {
322
+ "epoch": 10.76,
323
+ "grad_norm": 1.7756787538528442,
324
+ "learning_rate": 8.66888888888889e-06,
325
+ "loss": 0.0056,
326
+ "step": 1100
327
+ },
328
+ {
329
+ "epoch": 11.0,
330
+ "grad_norm": 0.5664415955543518,
331
+ "learning_rate": 8.613333333333333e-06,
332
+ "loss": 0.0048,
333
+ "step": 1125
334
+ },
335
+ {
336
+ "epoch": 11.25,
337
+ "grad_norm": 0.621498703956604,
338
+ "learning_rate": 8.557777777777778e-06,
339
+ "loss": 0.0038,
340
+ "step": 1150
341
+ },
342
+ {
343
+ "epoch": 11.49,
344
+ "grad_norm": 0.9859088659286499,
345
+ "learning_rate": 8.502222222222223e-06,
346
+ "loss": 0.0035,
347
+ "step": 1175
348
+ },
349
+ {
350
+ "epoch": 11.74,
351
+ "grad_norm": 1.2961162328720093,
352
+ "learning_rate": 8.446666666666668e-06,
353
+ "loss": 0.0041,
354
+ "step": 1200
355
+ },
356
+ {
357
+ "epoch": 11.98,
358
+ "grad_norm": 0.5769420862197876,
359
+ "learning_rate": 8.391111111111112e-06,
360
+ "loss": 0.0035,
361
+ "step": 1225
362
+ },
363
+ {
364
+ "epoch": 12.22,
365
+ "grad_norm": 0.5504060387611389,
366
+ "learning_rate": 8.335555555555556e-06,
367
+ "loss": 0.0022,
368
+ "step": 1250
369
+ },
370
+ {
371
+ "epoch": 12.47,
372
+ "grad_norm": 0.7063620090484619,
373
+ "learning_rate": 8.28e-06,
374
+ "loss": 0.0027,
375
+ "step": 1275
376
+ },
377
+ {
378
+ "epoch": 12.71,
379
+ "grad_norm": 0.6650658845901489,
380
+ "learning_rate": 8.224444444444444e-06,
381
+ "loss": 0.0029,
382
+ "step": 1300
383
+ },
384
+ {
385
+ "epoch": 12.96,
386
+ "grad_norm": 0.6803381443023682,
387
+ "learning_rate": 8.16888888888889e-06,
388
+ "loss": 0.0023,
389
+ "step": 1325
390
+ },
391
+ {
392
+ "epoch": 13.2,
393
+ "grad_norm": 0.19391104578971863,
394
+ "learning_rate": 8.113333333333334e-06,
395
+ "loss": 0.0013,
396
+ "step": 1350
397
+ },
398
+ {
399
+ "epoch": 13.45,
400
+ "grad_norm": 0.43767812848091125,
401
+ "learning_rate": 8.057777777777778e-06,
402
+ "loss": 0.002,
403
+ "step": 1375
404
+ },
405
+ {
406
+ "epoch": 13.69,
407
+ "grad_norm": 0.6082565188407898,
408
+ "learning_rate": 8.002222222222222e-06,
409
+ "loss": 0.0022,
410
+ "step": 1400
411
+ },
412
+ {
413
+ "epoch": 13.94,
414
+ "grad_norm": 0.30705004930496216,
415
+ "learning_rate": 7.946666666666666e-06,
416
+ "loss": 0.002,
417
+ "step": 1425
418
+ },
419
+ {
420
+ "epoch": 14.18,
421
+ "grad_norm": 0.18880507349967957,
422
+ "learning_rate": 7.891111111111112e-06,
423
+ "loss": 0.0015,
424
+ "step": 1450
425
+ },
426
+ {
427
+ "epoch": 14.43,
428
+ "grad_norm": 0.32524725794792175,
429
+ "learning_rate": 7.835555555555556e-06,
430
+ "loss": 0.0015,
431
+ "step": 1475
432
+ },
433
+ {
434
+ "epoch": 14.67,
435
+ "grad_norm": 2.48786997795105,
436
+ "learning_rate": 7.78e-06,
437
+ "loss": 0.0015,
438
+ "step": 1500
439
+ },
440
+ {
441
+ "epoch": 14.91,
442
+ "grad_norm": 0.3373986482620239,
443
+ "learning_rate": 7.724444444444446e-06,
444
+ "loss": 0.0013,
445
+ "step": 1525
446
+ },
447
+ {
448
+ "epoch": 15.16,
449
+ "grad_norm": 0.29098883271217346,
450
+ "learning_rate": 7.66888888888889e-06,
451
+ "loss": 0.0011,
452
+ "step": 1550
453
+ },
454
+ {
455
+ "epoch": 15.4,
456
+ "grad_norm": 0.12477891892194748,
457
+ "learning_rate": 7.613333333333334e-06,
458
+ "loss": 0.0008,
459
+ "step": 1575
460
+ },
461
+ {
462
+ "epoch": 15.65,
463
+ "grad_norm": 0.06489470601081848,
464
+ "learning_rate": 7.557777777777779e-06,
465
+ "loss": 0.0011,
466
+ "step": 1600
467
+ },
468
+ {
469
+ "epoch": 15.89,
470
+ "grad_norm": 0.061178650707006454,
471
+ "learning_rate": 7.502222222222223e-06,
472
+ "loss": 0.001,
473
+ "step": 1625
474
+ },
475
+ {
476
+ "epoch": 16.14,
477
+ "grad_norm": 0.038977060467004776,
478
+ "learning_rate": 7.446666666666668e-06,
479
+ "loss": 0.0007,
480
+ "step": 1650
481
+ },
482
+ {
483
+ "epoch": 16.38,
484
+ "grad_norm": 0.22110821306705475,
485
+ "learning_rate": 7.3911111111111125e-06,
486
+ "loss": 0.0007,
487
+ "step": 1675
488
+ },
489
+ {
490
+ "epoch": 16.63,
491
+ "grad_norm": 0.5320185422897339,
492
+ "learning_rate": 7.335555555555556e-06,
493
+ "loss": 0.0007,
494
+ "step": 1700
495
+ },
496
+ {
497
+ "epoch": 16.87,
498
+ "grad_norm": 0.7823454737663269,
499
+ "learning_rate": 7.280000000000001e-06,
500
+ "loss": 0.0008,
501
+ "step": 1725
502
+ },
503
+ {
504
+ "epoch": 17.11,
505
+ "grad_norm": 0.043301377445459366,
506
+ "learning_rate": 7.224444444444445e-06,
507
+ "loss": 0.001,
508
+ "step": 1750
509
+ },
510
+ {
511
+ "epoch": 17.36,
512
+ "grad_norm": 0.06231601908802986,
513
+ "learning_rate": 7.1688888888888895e-06,
514
+ "loss": 0.0005,
515
+ "step": 1775
516
+ },
517
+ {
518
+ "epoch": 17.6,
519
+ "grad_norm": 0.05838339775800705,
520
+ "learning_rate": 7.113333333333334e-06,
521
+ "loss": 0.0005,
522
+ "step": 1800
523
+ },
524
+ {
525
+ "epoch": 17.85,
526
+ "grad_norm": 0.05545497685670853,
527
+ "learning_rate": 7.057777777777778e-06,
528
+ "loss": 0.0012,
529
+ "step": 1825
530
+ },
531
+ {
532
+ "epoch": 18.09,
533
+ "grad_norm": 0.4030478894710541,
534
+ "learning_rate": 7.0022222222222225e-06,
535
+ "loss": 0.0008,
536
+ "step": 1850
537
+ },
538
+ {
539
+ "epoch": 18.34,
540
+ "grad_norm": 0.27439093589782715,
541
+ "learning_rate": 6.946666666666667e-06,
542
+ "loss": 0.0007,
543
+ "step": 1875
544
+ },
545
+ {
546
+ "epoch": 18.58,
547
+ "grad_norm": 0.25452977418899536,
548
+ "learning_rate": 6.891111111111111e-06,
549
+ "loss": 0.0007,
550
+ "step": 1900
551
+ },
552
+ {
553
+ "epoch": 18.83,
554
+ "grad_norm": 0.06759922206401825,
555
+ "learning_rate": 6.835555555555556e-06,
556
+ "loss": 0.0007,
557
+ "step": 1925
558
+ },
559
+ {
560
+ "epoch": 19.07,
561
+ "grad_norm": 0.25859466195106506,
562
+ "learning_rate": 6.780000000000001e-06,
563
+ "loss": 0.0006,
564
+ "step": 1950
565
+ },
566
+ {
567
+ "epoch": 19.32,
568
+ "grad_norm": 0.7427995800971985,
569
+ "learning_rate": 6.724444444444444e-06,
570
+ "loss": 0.0005,
571
+ "step": 1975
572
+ },
573
+ {
574
+ "epoch": 19.56,
575
+ "grad_norm": 0.0788324698805809,
576
+ "learning_rate": 6.668888888888889e-06,
577
+ "loss": 0.0008,
578
+ "step": 2000
579
+ },
580
+ {
581
+ "epoch": 19.56,
582
+ "eval_loss": 0.49481505155563354,
583
+ "eval_runtime": 1457.5675,
584
+ "eval_samples_per_second": 1.985,
585
+ "eval_steps_per_second": 0.497,
586
+ "eval_wer": 18.4735830052193,
587
+ "step": 2000
588
+ },
589
+ {
590
+ "epoch": 19.8,
591
+ "grad_norm": 0.04227956011891365,
592
+ "learning_rate": 6.613333333333334e-06,
593
+ "loss": 0.0008,
594
+ "step": 2025
595
+ },
596
+ {
597
+ "epoch": 20.05,
598
+ "grad_norm": 0.5580443739891052,
599
+ "learning_rate": 6.557777777777778e-06,
600
+ "loss": 0.001,
601
+ "step": 2050
602
+ },
603
+ {
604
+ "epoch": 20.29,
605
+ "grad_norm": 0.7394335865974426,
606
+ "learning_rate": 6.502222222222223e-06,
607
+ "loss": 0.0014,
608
+ "step": 2075
609
+ },
610
+ {
611
+ "epoch": 20.54,
612
+ "grad_norm": 0.8055688142776489,
613
+ "learning_rate": 6.446666666666668e-06,
614
+ "loss": 0.0011,
615
+ "step": 2100
616
+ },
617
+ {
618
+ "epoch": 20.78,
619
+ "grad_norm": 0.13119255006313324,
620
+ "learning_rate": 6.391111111111111e-06,
621
+ "loss": 0.0016,
622
+ "step": 2125
623
+ },
624
+ {
625
+ "epoch": 21.03,
626
+ "grad_norm": 0.21813702583312988,
627
+ "learning_rate": 6.335555555555556e-06,
628
+ "loss": 0.0014,
629
+ "step": 2150
630
+ },
631
+ {
632
+ "epoch": 21.27,
633
+ "grad_norm": 0.1066213995218277,
634
+ "learning_rate": 6.280000000000001e-06,
635
+ "loss": 0.0009,
636
+ "step": 2175
637
+ },
638
+ {
639
+ "epoch": 21.52,
640
+ "grad_norm": 0.8583650588989258,
641
+ "learning_rate": 6.224444444444445e-06,
642
+ "loss": 0.0012,
643
+ "step": 2200
644
+ },
645
+ {
646
+ "epoch": 21.76,
647
+ "grad_norm": 1.2513171434402466,
648
+ "learning_rate": 6.16888888888889e-06,
649
+ "loss": 0.0021,
650
+ "step": 2225
651
+ },
652
+ {
653
+ "epoch": 22.0,
654
+ "grad_norm": 0.8390223979949951,
655
+ "learning_rate": 6.113333333333333e-06,
656
+ "loss": 0.0018,
657
+ "step": 2250
658
+ },
659
+ {
660
+ "epoch": 22.25,
661
+ "grad_norm": 0.8746078610420227,
662
+ "learning_rate": 6.057777777777778e-06,
663
+ "loss": 0.0015,
664
+ "step": 2275
665
+ },
666
+ {
667
+ "epoch": 22.49,
668
+ "grad_norm": 0.13358770310878754,
669
+ "learning_rate": 6.002222222222223e-06,
670
+ "loss": 0.0016,
671
+ "step": 2300
672
+ },
673
+ {
674
+ "epoch": 22.74,
675
+ "grad_norm": 0.07681471109390259,
676
+ "learning_rate": 5.946666666666668e-06,
677
+ "loss": 0.0011,
678
+ "step": 2325
679
+ },
680
+ {
681
+ "epoch": 22.98,
682
+ "grad_norm": 0.5511406660079956,
683
+ "learning_rate": 5.891111111111112e-06,
684
+ "loss": 0.0013,
685
+ "step": 2350
686
+ },
687
+ {
688
+ "epoch": 23.23,
689
+ "grad_norm": 0.23318354785442352,
690
+ "learning_rate": 5.8355555555555565e-06,
691
+ "loss": 0.0011,
692
+ "step": 2375
693
+ },
694
+ {
695
+ "epoch": 23.47,
696
+ "grad_norm": 0.12396834790706635,
697
+ "learning_rate": 5.78e-06,
698
+ "loss": 0.0009,
699
+ "step": 2400
700
+ },
701
+ {
702
+ "epoch": 23.72,
703
+ "grad_norm": 0.0644838809967041,
704
+ "learning_rate": 5.724444444444445e-06,
705
+ "loss": 0.0006,
706
+ "step": 2425
707
+ },
708
+ {
709
+ "epoch": 23.96,
710
+ "grad_norm": 0.47172439098358154,
711
+ "learning_rate": 5.6688888888888895e-06,
712
+ "loss": 0.0007,
713
+ "step": 2450
714
+ },
715
+ {
716
+ "epoch": 24.21,
717
+ "grad_norm": 0.030231019482016563,
718
+ "learning_rate": 5.613333333333334e-06,
719
+ "loss": 0.0005,
720
+ "step": 2475
721
+ },
722
+ {
723
+ "epoch": 24.45,
724
+ "grad_norm": 0.01894545555114746,
725
+ "learning_rate": 5.557777777777778e-06,
726
+ "loss": 0.0003,
727
+ "step": 2500
728
+ },
729
+ {
730
+ "epoch": 24.69,
731
+ "grad_norm": 0.18070322275161743,
732
+ "learning_rate": 5.5022222222222224e-06,
733
+ "loss": 0.0009,
734
+ "step": 2525
735
+ },
736
+ {
737
+ "epoch": 24.94,
738
+ "grad_norm": 0.0435708686709404,
739
+ "learning_rate": 5.4466666666666665e-06,
740
+ "loss": 0.0009,
741
+ "step": 2550
742
+ },
743
+ {
744
+ "epoch": 25.18,
745
+ "grad_norm": 0.07220447063446045,
746
+ "learning_rate": 5.391111111111111e-06,
747
+ "loss": 0.0005,
748
+ "step": 2575
749
+ },
750
+ {
751
+ "epoch": 25.43,
752
+ "grad_norm": 0.01733986660838127,
753
+ "learning_rate": 5.335555555555556e-06,
754
+ "loss": 0.0004,
755
+ "step": 2600
756
+ },
757
+ {
758
+ "epoch": 25.67,
759
+ "grad_norm": 0.03520004078745842,
760
+ "learning_rate": 5.28e-06,
761
+ "loss": 0.0007,
762
+ "step": 2625
763
+ },
764
+ {
765
+ "epoch": 25.92,
766
+ "grad_norm": 0.03853292763233185,
767
+ "learning_rate": 5.224444444444445e-06,
768
+ "loss": 0.0005,
769
+ "step": 2650
770
+ },
771
+ {
772
+ "epoch": 26.16,
773
+ "grad_norm": 0.13450591266155243,
774
+ "learning_rate": 5.168888888888889e-06,
775
+ "loss": 0.0005,
776
+ "step": 2675
777
+ },
778
+ {
779
+ "epoch": 26.41,
780
+ "grad_norm": 0.029255390167236328,
781
+ "learning_rate": 5.113333333333333e-06,
782
+ "loss": 0.0004,
783
+ "step": 2700
784
+ },
785
+ {
786
+ "epoch": 26.65,
787
+ "grad_norm": 0.025706447660923004,
788
+ "learning_rate": 5.057777777777778e-06,
789
+ "loss": 0.0003,
790
+ "step": 2725
791
+ },
792
+ {
793
+ "epoch": 26.89,
794
+ "grad_norm": 0.902415931224823,
795
+ "learning_rate": 5.002222222222223e-06,
796
+ "loss": 0.0002,
797
+ "step": 2750
798
+ },
799
+ {
800
+ "epoch": 27.14,
801
+ "grad_norm": 0.013656423427164555,
802
+ "learning_rate": 4.946666666666667e-06,
803
+ "loss": 0.0002,
804
+ "step": 2775
805
+ },
806
+ {
807
+ "epoch": 27.38,
808
+ "grad_norm": 0.018052740022540092,
809
+ "learning_rate": 4.891111111111111e-06,
810
+ "loss": 0.0003,
811
+ "step": 2800
812
+ },
813
+ {
814
+ "epoch": 27.63,
815
+ "grad_norm": 0.07001502811908722,
816
+ "learning_rate": 4.835555555555556e-06,
817
+ "loss": 0.0002,
818
+ "step": 2825
819
+ },
820
+ {
821
+ "epoch": 27.87,
822
+ "grad_norm": 0.01241993810981512,
823
+ "learning_rate": 4.78e-06,
824
+ "loss": 0.0003,
825
+ "step": 2850
826
+ },
827
+ {
828
+ "epoch": 28.12,
829
+ "grad_norm": 0.013495378196239471,
830
+ "learning_rate": 4.724444444444445e-06,
831
+ "loss": 0.0001,
832
+ "step": 2875
833
+ },
834
+ {
835
+ "epoch": 28.36,
836
+ "grad_norm": 0.011048069223761559,
837
+ "learning_rate": 4.66888888888889e-06,
838
+ "loss": 0.0001,
839
+ "step": 2900
840
+ },
841
+ {
842
+ "epoch": 28.61,
843
+ "grad_norm": 0.015970442444086075,
844
+ "learning_rate": 4.613333333333334e-06,
845
+ "loss": 0.0002,
846
+ "step": 2925
847
+ },
848
+ {
849
+ "epoch": 28.85,
850
+ "grad_norm": 0.009559527039527893,
851
+ "learning_rate": 4.557777777777778e-06,
852
+ "loss": 0.0002,
853
+ "step": 2950
854
+ },
855
+ {
856
+ "epoch": 29.1,
857
+ "grad_norm": 0.010293275117874146,
858
+ "learning_rate": 4.502222222222223e-06,
859
+ "loss": 0.0001,
860
+ "step": 2975
861
+ },
862
+ {
863
+ "epoch": 29.34,
864
+ "grad_norm": 0.01116804126650095,
865
+ "learning_rate": 4.446666666666667e-06,
866
+ "loss": 0.0001,
867
+ "step": 3000
868
+ },
869
+ {
870
+ "epoch": 29.34,
871
+ "eval_loss": 0.5352661609649658,
872
+ "eval_runtime": 1458.2257,
873
+ "eval_samples_per_second": 1.985,
874
+ "eval_steps_per_second": 0.496,
875
+ "eval_wer": 18.07297866495742,
876
+ "step": 3000
877
+ },
878
+ {
879
+ "epoch": 29.58,
880
+ "grad_norm": 0.0082534896209836,
881
+ "learning_rate": 4.391111111111112e-06,
882
+ "loss": 0.0001,
883
+ "step": 3025
884
+ },
885
+ {
886
+ "epoch": 29.83,
887
+ "grad_norm": 0.007231541443616152,
888
+ "learning_rate": 4.3355555555555565e-06,
889
+ "loss": 0.0001,
890
+ "step": 3050
891
+ },
892
+ {
893
+ "epoch": 30.07,
894
+ "grad_norm": 0.008312270976603031,
895
+ "learning_rate": 4.2800000000000005e-06,
896
+ "loss": 0.0001,
897
+ "step": 3075
898
+ },
899
+ {
900
+ "epoch": 30.32,
901
+ "grad_norm": 0.007097834721207619,
902
+ "learning_rate": 4.2244444444444446e-06,
903
+ "loss": 0.0001,
904
+ "step": 3100
905
+ },
906
+ {
907
+ "epoch": 30.56,
908
+ "grad_norm": 0.007524041458964348,
909
+ "learning_rate": 4.168888888888889e-06,
910
+ "loss": 0.0001,
911
+ "step": 3125
912
+ },
913
+ {
914
+ "epoch": 30.81,
915
+ "grad_norm": 0.00781218009069562,
916
+ "learning_rate": 4.1133333333333335e-06,
917
+ "loss": 0.0001,
918
+ "step": 3150
919
+ },
920
+ {
921
+ "epoch": 31.05,
922
+ "grad_norm": 0.006727377884089947,
923
+ "learning_rate": 4.057777777777778e-06,
924
+ "loss": 0.0001,
925
+ "step": 3175
926
+ },
927
+ {
928
+ "epoch": 31.3,
929
+ "grad_norm": 0.00624061468988657,
930
+ "learning_rate": 4.002222222222222e-06,
931
+ "loss": 0.0001,
932
+ "step": 3200
933
+ },
934
+ {
935
+ "epoch": 31.54,
936
+ "grad_norm": 0.006174057722091675,
937
+ "learning_rate": 3.946666666666667e-06,
938
+ "loss": 0.0001,
939
+ "step": 3225
940
+ },
941
+ {
942
+ "epoch": 31.78,
943
+ "grad_norm": 0.006670523434877396,
944
+ "learning_rate": 3.891111111111111e-06,
945
+ "loss": 0.0001,
946
+ "step": 3250
947
+ },
948
+ {
949
+ "epoch": 32.03,
950
+ "grad_norm": 0.006069181486964226,
951
+ "learning_rate": 3.835555555555555e-06,
952
+ "loss": 0.0001,
953
+ "step": 3275
954
+ },
955
+ {
956
+ "epoch": 32.27,
957
+ "grad_norm": 0.005773240700364113,
958
+ "learning_rate": 3.7800000000000002e-06,
959
+ "loss": 0.0001,
960
+ "step": 3300
961
+ },
962
+ {
963
+ "epoch": 32.52,
964
+ "grad_norm": 0.005736664403229952,
965
+ "learning_rate": 3.724444444444445e-06,
966
+ "loss": 0.0001,
967
+ "step": 3325
968
+ },
969
+ {
970
+ "epoch": 32.76,
971
+ "grad_norm": 0.0057275379076600075,
972
+ "learning_rate": 3.668888888888889e-06,
973
+ "loss": 0.0001,
974
+ "step": 3350
975
+ },
976
+ {
977
+ "epoch": 33.01,
978
+ "grad_norm": 0.0070615834556519985,
979
+ "learning_rate": 3.6133333333333336e-06,
980
+ "loss": 0.0001,
981
+ "step": 3375
982
+ },
983
+ {
984
+ "epoch": 33.25,
985
+ "grad_norm": 0.005553886294364929,
986
+ "learning_rate": 3.5577777777777785e-06,
987
+ "loss": 0.0001,
988
+ "step": 3400
989
+ },
990
+ {
991
+ "epoch": 33.5,
992
+ "grad_norm": 0.00474073551595211,
993
+ "learning_rate": 3.5022222222222225e-06,
994
+ "loss": 0.0001,
995
+ "step": 3425
996
+ },
997
+ {
998
+ "epoch": 33.74,
999
+ "grad_norm": 0.006729442626237869,
1000
+ "learning_rate": 3.446666666666667e-06,
1001
+ "loss": 0.0001,
1002
+ "step": 3450
1003
+ },
1004
+ {
1005
+ "epoch": 33.99,
1006
+ "grad_norm": 0.006181245669722557,
1007
+ "learning_rate": 3.391111111111111e-06,
1008
+ "loss": 0.0001,
1009
+ "step": 3475
1010
+ },
1011
+ {
1012
+ "epoch": 34.23,
1013
+ "grad_norm": 0.004282401409000158,
1014
+ "learning_rate": 3.335555555555556e-06,
1015
+ "loss": 0.0001,
1016
+ "step": 3500
1017
+ },
1018
+ {
1019
+ "epoch": 34.47,
1020
+ "grad_norm": 0.0049114222638309,
1021
+ "learning_rate": 3.2800000000000004e-06,
1022
+ "loss": 0.0001,
1023
+ "step": 3525
1024
+ },
1025
+ {
1026
+ "epoch": 34.72,
1027
+ "grad_norm": 0.004997397307306528,
1028
+ "learning_rate": 3.2244444444444444e-06,
1029
+ "loss": 0.0001,
1030
+ "step": 3550
1031
+ },
1032
+ {
1033
+ "epoch": 34.96,
1034
+ "grad_norm": 0.004732249770313501,
1035
+ "learning_rate": 3.1688888888888893e-06,
1036
+ "loss": 0.0001,
1037
+ "step": 3575
1038
+ },
1039
+ {
1040
+ "epoch": 35.21,
1041
+ "grad_norm": 0.005105483811348677,
1042
+ "learning_rate": 3.1133333333333337e-06,
1043
+ "loss": 0.0001,
1044
+ "step": 3600
1045
+ },
1046
+ {
1047
+ "epoch": 35.45,
1048
+ "grad_norm": 0.005223471205681562,
1049
+ "learning_rate": 3.0577777777777778e-06,
1050
+ "loss": 0.0001,
1051
+ "step": 3625
1052
+ },
1053
+ {
1054
+ "epoch": 35.7,
1055
+ "grad_norm": 0.005028573330491781,
1056
+ "learning_rate": 3.0022222222222227e-06,
1057
+ "loss": 0.0001,
1058
+ "step": 3650
1059
+ },
1060
+ {
1061
+ "epoch": 35.94,
1062
+ "grad_norm": 0.0035760572645813227,
1063
+ "learning_rate": 2.946666666666667e-06,
1064
+ "loss": 0.0001,
1065
+ "step": 3675
1066
+ },
1067
+ {
1068
+ "epoch": 36.19,
1069
+ "grad_norm": 0.0042958687990903854,
1070
+ "learning_rate": 2.891111111111111e-06,
1071
+ "loss": 0.0001,
1072
+ "step": 3700
1073
+ },
1074
+ {
1075
+ "epoch": 36.43,
1076
+ "grad_norm": 0.004194607958197594,
1077
+ "learning_rate": 2.835555555555556e-06,
1078
+ "loss": 0.0001,
1079
+ "step": 3725
1080
+ },
1081
+ {
1082
+ "epoch": 36.67,
1083
+ "grad_norm": 0.004419374745339155,
1084
+ "learning_rate": 2.7800000000000005e-06,
1085
+ "loss": 0.0001,
1086
+ "step": 3750
1087
+ },
1088
+ {
1089
+ "epoch": 36.92,
1090
+ "grad_norm": 0.004165703430771828,
1091
+ "learning_rate": 2.7244444444444445e-06,
1092
+ "loss": 0.0001,
1093
+ "step": 3775
1094
+ },
1095
+ {
1096
+ "epoch": 37.16,
1097
+ "grad_norm": 0.0038676238618791103,
1098
+ "learning_rate": 2.6688888888888894e-06,
1099
+ "loss": 0.0001,
1100
+ "step": 3800
1101
+ },
1102
+ {
1103
+ "epoch": 37.41,
1104
+ "grad_norm": 0.004417106043547392,
1105
+ "learning_rate": 2.6133333333333334e-06,
1106
+ "loss": 0.0001,
1107
+ "step": 3825
1108
+ },
1109
+ {
1110
+ "epoch": 37.65,
1111
+ "grad_norm": 0.004240726120769978,
1112
+ "learning_rate": 2.557777777777778e-06,
1113
+ "loss": 0.0001,
1114
+ "step": 3850
1115
+ },
1116
+ {
1117
+ "epoch": 37.9,
1118
+ "grad_norm": 0.003732978831976652,
1119
+ "learning_rate": 2.5022222222222224e-06,
1120
+ "loss": 0.0001,
1121
+ "step": 3875
1122
+ },
1123
+ {
1124
+ "epoch": 38.14,
1125
+ "grad_norm": 0.0037496890872716904,
1126
+ "learning_rate": 2.446666666666667e-06,
1127
+ "loss": 0.0001,
1128
+ "step": 3900
1129
+ },
1130
+ {
1131
+ "epoch": 38.39,
1132
+ "grad_norm": 0.003918818198144436,
1133
+ "learning_rate": 2.3911111111111113e-06,
1134
+ "loss": 0.0001,
1135
+ "step": 3925
1136
+ },
1137
+ {
1138
+ "epoch": 38.63,
1139
+ "grad_norm": 0.003961279056966305,
1140
+ "learning_rate": 2.3355555555555557e-06,
1141
+ "loss": 0.0001,
1142
+ "step": 3950
1143
+ },
1144
+ {
1145
+ "epoch": 38.88,
1146
+ "grad_norm": 0.00358415674418211,
1147
+ "learning_rate": 2.28e-06,
1148
+ "loss": 0.0001,
1149
+ "step": 3975
1150
+ },
1151
+ {
1152
+ "epoch": 39.12,
1153
+ "grad_norm": 0.004146341234445572,
1154
+ "learning_rate": 2.2244444444444447e-06,
1155
+ "loss": 0.0001,
1156
+ "step": 4000
1157
+ },
1158
+ {
1159
+ "epoch": 39.12,
1160
+ "eval_loss": 0.5624426603317261,
1161
+ "eval_runtime": 1469.1635,
1162
+ "eval_samples_per_second": 1.97,
1163
+ "eval_steps_per_second": 0.493,
1164
+ "eval_wer": 18.056954491346946,
1165
+ "step": 4000
1166
+ }
1167
+ ],
1168
+ "logging_steps": 25,
1169
+ "max_steps": 5000,
1170
+ "num_input_tokens_seen": 0,
1171
+ "num_train_epochs": 50,
1172
+ "save_steps": 1000,
1173
+ "total_flos": 7.383284315947008e+19,
1174
+ "train_batch_size": 8,
1175
+ "trial_name": null,
1176
+ "trial_params": null
1177
+ }
Models/hindi/checkpoint-4000/training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:1562c17bb2dc7592b45af48209c138ce71faddb67bef288ecaf31f1c50f864ae
3
+ size 5048
Models/hindi/checkpoint-5000/config.json ADDED
@@ -0,0 +1,52 @@
1
+ {
2
+ "_name_or_path": "openai/whisper-small",
3
+ "activation_dropout": 0.0,
4
+ "activation_function": "gelu",
5
+ "apply_spec_augment": false,
6
+ "architectures": [
7
+ "WhisperForConditionalGeneration"
8
+ ],
9
+ "attention_dropout": 0.0,
10
+ "begin_suppress_tokens": [
11
+ 220,
12
+ 50257
13
+ ],
14
+ "bos_token_id": 50257,
15
+ "classifier_proj_size": 256,
16
+ "d_model": 768,
17
+ "decoder_attention_heads": 12,
18
+ "decoder_ffn_dim": 3072,
19
+ "decoder_layerdrop": 0.0,
20
+ "decoder_layers": 12,
21
+ "decoder_start_token_id": 50258,
22
+ "dropout": 0.0,
23
+ "encoder_attention_heads": 12,
24
+ "encoder_ffn_dim": 3072,
25
+ "encoder_layerdrop": 0.0,
26
+ "encoder_layers": 12,
27
+ "eos_token_id": 50257,
28
+ "forced_decoder_ids": null,
29
+ "init_std": 0.02,
30
+ "is_encoder_decoder": true,
31
+ "mask_feature_length": 10,
32
+ "mask_feature_min_masks": 0,
33
+ "mask_feature_prob": 0.0,
34
+ "mask_time_length": 10,
35
+ "mask_time_min_masks": 2,
36
+ "mask_time_prob": 0.05,
37
+ "max_length": 448,
38
+ "max_source_positions": 1500,
39
+ "max_target_positions": 448,
40
+ "median_filter_width": 7,
41
+ "model_type": "whisper",
42
+ "num_hidden_layers": 12,
43
+ "num_mel_bins": 80,
44
+ "pad_token_id": 50257,
45
+ "scale_embedding": false,
46
+ "suppress_tokens": [],
47
+ "torch_dtype": "float32",
48
+ "transformers_version": "4.40.0.dev0",
49
+ "use_cache": false,
50
+ "use_weighted_layer_sum": false,
51
+ "vocab_size": 51865
52
+ }
Models/hindi/checkpoint-5000/generation_config.json ADDED
@@ -0,0 +1,265 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ null
+ ],
+ [
+ 2,
+ 50359
+ ]
+ ],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "hi",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50358,
+ 50359,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.40.0.dev0"
+ }
Models/hindi/checkpoint-5000/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:549e8b957e8819d5d22674fa49e45aba17dc286245ea05544c114472f8c88f24
+ size 966995080
Models/hindi/checkpoint-5000/optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3b4848e1a8fbf5a0d628b7e120178c4a64c8ec23402d6366ebb7edb58e7b6acb
+ size 1925064044
Models/hindi/checkpoint-5000/preprocessor_config.json ADDED
@@ -0,0 +1,14 @@
+ {
+ "chunk_length": 30,
+ "feature_extractor_type": "WhisperFeatureExtractor",
+ "feature_size": 80,
+ "hop_length": 160,
+ "n_fft": 400,
+ "n_samples": 480000,
+ "nb_max_frames": 3000,
+ "padding_side": "right",
+ "padding_value": 0.0,
+ "processor_class": "WhisperProcessor",
+ "return_attention_mask": false,
+ "sampling_rate": 16000
+ }
Models/hindi/checkpoint-5000/rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9c6335e28c7e09aedee56f3bff4cd20c9fc1c85bd0d1d2cfb7d15790331554b3
+ size 14244
Models/hindi/checkpoint-5000/scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d79662b413883c41ee46022c57c3925985bc10951685b4f1ce2702c31792813b
+ size 1064
Models/hindi/checkpoint-5000/trainer_state.json ADDED
@@ -0,0 +1,1466 @@
+ {
+ "best_metric": 18.056954491346946,
+ "best_model_checkpoint": "./checkpoint-4000",
+ "epoch": 48.899755501222494,
+ "eval_steps": 1000,
+ "global_step": 5000,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.24,
+ "grad_norm": 39.86528778076172,
+ "learning_rate": 5.000000000000001e-07,
+ "loss": 2.0555,
+ "step": 25
+ },
+ {
+ "epoch": 0.49,
+ "grad_norm": Infinity,
+ "learning_rate": 9.800000000000001e-07,
+ "loss": 1.5219,
+ "step": 50
+ },
+ {
+ "epoch": 0.73,
+ "grad_norm": 6.200109958648682,
+ "learning_rate": 1.48e-06,
+ "loss": 1.0167,
+ "step": 75
+ },
+ {
+ "epoch": 0.98,
+ "grad_norm": 5.486103057861328,
+ "learning_rate": 1.98e-06,
+ "loss": 0.7299,
+ "step": 100
+ },
+ {
+ "epoch": 1.22,
+ "grad_norm": 5.532894134521484,
+ "learning_rate": 2.4800000000000004e-06,
+ "loss": 0.6318,
+ "step": 125
+ },
+ {
+ "epoch": 1.47,
+ "grad_norm": 4.895535945892334,
+ "learning_rate": 2.9800000000000003e-06,
+ "loss": 0.5503,
+ "step": 150
+ },
+ {
+ "epoch": 1.71,
+ "grad_norm": 5.065937519073486,
+ "learning_rate": 3.48e-06,
+ "loss": 0.4999,
+ "step": 175
+ },
+ {
+ "epoch": 1.96,
+ "grad_norm": 4.807600498199463,
+ "learning_rate": 3.980000000000001e-06,
+ "loss": 0.4457,
+ "step": 200
+ },
+ {
+ "epoch": 2.2,
+ "grad_norm": 4.568106174468994,
+ "learning_rate": 4.48e-06,
+ "loss": 0.3779,
+ "step": 225
+ },
+ {
+ "epoch": 2.44,
+ "grad_norm": 4.705833911895752,
+ "learning_rate": 4.980000000000001e-06,
+ "loss": 0.3332,
+ "step": 250
+ },
+ {
+ "epoch": 2.69,
+ "grad_norm": 4.315070629119873,
+ "learning_rate": 5.480000000000001e-06,
+ "loss": 0.2858,
+ "step": 275
+ },
+ {
+ "epoch": 2.93,
+ "grad_norm": 2.660728693008423,
+ "learning_rate": 5.98e-06,
+ "loss": 0.2228,
+ "step": 300
+ },
+ {
+ "epoch": 3.18,
+ "grad_norm": 2.5569632053375244,
+ "learning_rate": 6.480000000000001e-06,
+ "loss": 0.1766,
+ "step": 325
+ },
+ {
+ "epoch": 3.42,
+ "grad_norm": 2.2663660049438477,
+ "learning_rate": 6.98e-06,
+ "loss": 0.1475,
+ "step": 350
+ },
+ {
+ "epoch": 3.67,
+ "grad_norm": 2.5257091522216797,
+ "learning_rate": 7.48e-06,
+ "loss": 0.1492,
+ "step": 375
+ },
+ {
+ "epoch": 3.91,
+ "grad_norm": 2.3954405784606934,
+ "learning_rate": 7.980000000000002e-06,
+ "loss": 0.142,
+ "step": 400
+ },
+ {
+ "epoch": 4.16,
+ "grad_norm": 2.181328296661377,
+ "learning_rate": 8.48e-06,
+ "loss": 0.1125,
+ "step": 425
+ },
+ {
+ "epoch": 4.4,
+ "grad_norm": 1.9877594709396362,
+ "learning_rate": 8.98e-06,
+ "loss": 0.0908,
+ "step": 450
+ },
+ {
+ "epoch": 4.65,
+ "grad_norm": 1.8853321075439453,
+ "learning_rate": 9.48e-06,
+ "loss": 0.0892,
+ "step": 475
+ },
+ {
+ "epoch": 4.89,
+ "grad_norm": 2.3802549839019775,
+ "learning_rate": 9.980000000000001e-06,
+ "loss": 0.091,
+ "step": 500
+ },
+ {
+ "epoch": 5.13,
+ "grad_norm": 1.3576879501342773,
+ "learning_rate": 9.946666666666667e-06,
+ "loss": 0.0713,
+ "step": 525
+ },
+ {
+ "epoch": 5.38,
+ "grad_norm": 2.4532103538513184,
+ "learning_rate": 9.891111111111113e-06,
+ "loss": 0.0534,
+ "step": 550
+ },
+ {
+ "epoch": 5.62,
+ "grad_norm": 1.3736106157302856,
+ "learning_rate": 9.835555555555556e-06,
+ "loss": 0.0512,
+ "step": 575
+ },
+ {
+ "epoch": 5.87,
+ "grad_norm": 2.0095458030700684,
+ "learning_rate": 9.780000000000001e-06,
+ "loss": 0.0571,
+ "step": 600
+ },
+ {
+ "epoch": 6.11,
+ "grad_norm": 1.2924531698226929,
+ "learning_rate": 9.724444444444445e-06,
+ "loss": 0.0453,
+ "step": 625
+ },
+ {
+ "epoch": 6.36,
+ "grad_norm": 1.0012321472167969,
+ "learning_rate": 9.66888888888889e-06,
+ "loss": 0.0292,
+ "step": 650
+ },
+ {
+ "epoch": 6.6,
+ "grad_norm": 1.4161434173583984,
+ "learning_rate": 9.613333333333335e-06,
+ "loss": 0.0325,
+ "step": 675
+ },
+ {
+ "epoch": 6.85,
+ "grad_norm": 5.784367084503174,
+ "learning_rate": 9.557777777777777e-06,
+ "loss": 0.031,
+ "step": 700
+ },
+ {
+ "epoch": 7.09,
+ "grad_norm": 0.8382102251052856,
+ "learning_rate": 9.502222222222223e-06,
+ "loss": 0.0247,
+ "step": 725
+ },
+ {
+ "epoch": 7.33,
+ "grad_norm": 1.2963491678237915,
+ "learning_rate": 9.446666666666667e-06,
+ "loss": 0.0162,
+ "step": 750
+ },
+ {
+ "epoch": 7.58,
+ "grad_norm": 1.7834402322769165,
+ "learning_rate": 9.391111111111111e-06,
+ "loss": 0.0175,
+ "step": 775
+ },
+ {
+ "epoch": 7.82,
+ "grad_norm": 0.9083292484283447,
+ "learning_rate": 9.335555555555557e-06,
+ "loss": 0.0193,
+ "step": 800
+ },
+ {
+ "epoch": 8.07,
+ "grad_norm": 0.5552634596824646,
+ "learning_rate": 9.280000000000001e-06,
+ "loss": 0.0157,
+ "step": 825
+ },
+ {
+ "epoch": 8.31,
+ "grad_norm": 1.1231069564819336,
+ "learning_rate": 9.224444444444445e-06,
+ "loss": 0.0105,
+ "step": 850
+ },
+ {
+ "epoch": 8.56,
+ "grad_norm": 1.206103801727295,
+ "learning_rate": 9.168888888888889e-06,
+ "loss": 0.0109,
+ "step": 875
+ },
+ {
+ "epoch": 8.8,
+ "grad_norm": 0.8872191309928894,
+ "learning_rate": 9.113333333333335e-06,
+ "loss": 0.0126,
+ "step": 900
+ },
+ {
+ "epoch": 9.05,
+ "grad_norm": 0.7421383261680603,
+ "learning_rate": 9.057777777777779e-06,
+ "loss": 0.0107,
+ "step": 925
+ },
+ {
+ "epoch": 9.29,
+ "grad_norm": 0.7581607103347778,
+ "learning_rate": 9.002222222222223e-06,
+ "loss": 0.006,
+ "step": 950
+ },
+ {
+ "epoch": 9.54,
+ "grad_norm": 0.6848894953727722,
+ "learning_rate": 8.946666666666669e-06,
+ "loss": 0.006,
+ "step": 975
+ },
+ {
+ "epoch": 9.78,
+ "grad_norm": 1.044122576713562,
+ "learning_rate": 8.891111111111111e-06,
+ "loss": 0.0067,
+ "step": 1000
+ },
+ {
+ "epoch": 9.78,
+ "eval_loss": 0.41375118494033813,
+ "eval_runtime": 1461.809,
+ "eval_samples_per_second": 1.98,
+ "eval_steps_per_second": 0.495,
+ "eval_wer": 18.938284039923083,
+ "step": 1000
+ },
+ {
+ "epoch": 10.02,
+ "grad_norm": 0.6757261753082275,
+ "learning_rate": 8.835555555555557e-06,
+ "loss": 0.0058,
+ "step": 1025
+ },
+ {
+ "epoch": 10.27,
+ "grad_norm": 1.085519552230835,
+ "learning_rate": 8.78e-06,
+ "loss": 0.0037,
+ "step": 1050
+ },
+ {
+ "epoch": 10.51,
+ "grad_norm": 0.8559943437576294,
+ "learning_rate": 8.724444444444445e-06,
+ "loss": 0.0044,
+ "step": 1075
+ },
+ {
+ "epoch": 10.76,
+ "grad_norm": 1.7756787538528442,
+ "learning_rate": 8.66888888888889e-06,
+ "loss": 0.0056,
+ "step": 1100
+ },
+ {
+ "epoch": 11.0,
+ "grad_norm": 0.5664415955543518,
+ "learning_rate": 8.613333333333333e-06,
+ "loss": 0.0048,
+ "step": 1125
+ },
+ {
+ "epoch": 11.25,
+ "grad_norm": 0.621498703956604,
+ "learning_rate": 8.557777777777778e-06,
+ "loss": 0.0038,
+ "step": 1150
+ },
+ {
+ "epoch": 11.49,
+ "grad_norm": 0.9859088659286499,
+ "learning_rate": 8.502222222222223e-06,
+ "loss": 0.0035,
+ "step": 1175
+ },
+ {
+ "epoch": 11.74,
+ "grad_norm": 1.2961162328720093,
+ "learning_rate": 8.446666666666668e-06,
+ "loss": 0.0041,
+ "step": 1200
+ },
+ {
+ "epoch": 11.98,
+ "grad_norm": 0.5769420862197876,
+ "learning_rate": 8.391111111111112e-06,
+ "loss": 0.0035,
+ "step": 1225
+ },
+ {
+ "epoch": 12.22,
+ "grad_norm": 0.5504060387611389,
+ "learning_rate": 8.335555555555556e-06,
+ "loss": 0.0022,
+ "step": 1250
+ },
+ {
+ "epoch": 12.47,
+ "grad_norm": 0.7063620090484619,
+ "learning_rate": 8.28e-06,
+ "loss": 0.0027,
+ "step": 1275
+ },
+ {
+ "epoch": 12.71,
+ "grad_norm": 0.6650658845901489,
+ "learning_rate": 8.224444444444444e-06,
+ "loss": 0.0029,
+ "step": 1300
+ },
+ {
+ "epoch": 12.96,
+ "grad_norm": 0.6803381443023682,
+ "learning_rate": 8.16888888888889e-06,
+ "loss": 0.0023,
+ "step": 1325
+ },
+ {
+ "epoch": 13.2,
+ "grad_norm": 0.19391104578971863,
+ "learning_rate": 8.113333333333334e-06,
+ "loss": 0.0013,
+ "step": 1350
+ },
+ {
+ "epoch": 13.45,
+ "grad_norm": 0.43767812848091125,
+ "learning_rate": 8.057777777777778e-06,
+ "loss": 0.002,
+ "step": 1375
+ },
+ {
+ "epoch": 13.69,
+ "grad_norm": 0.6082565188407898,
+ "learning_rate": 8.002222222222222e-06,
+ "loss": 0.0022,
+ "step": 1400
+ },
+ {
+ "epoch": 13.94,
+ "grad_norm": 0.30705004930496216,
+ "learning_rate": 7.946666666666666e-06,
+ "loss": 0.002,
+ "step": 1425
+ },
+ {
+ "epoch": 14.18,
+ "grad_norm": 0.18880507349967957,
+ "learning_rate": 7.891111111111112e-06,
+ "loss": 0.0015,
+ "step": 1450
+ },
+ {
+ "epoch": 14.43,
+ "grad_norm": 0.32524725794792175,
+ "learning_rate": 7.835555555555556e-06,
+ "loss": 0.0015,
+ "step": 1475
+ },
+ {
+ "epoch": 14.67,
+ "grad_norm": 2.48786997795105,
+ "learning_rate": 7.78e-06,
+ "loss": 0.0015,
+ "step": 1500
+ },
+ {
+ "epoch": 14.91,
+ "grad_norm": 0.3373986482620239,
+ "learning_rate": 7.724444444444446e-06,
+ "loss": 0.0013,
+ "step": 1525
+ },
+ {
+ "epoch": 15.16,
+ "grad_norm": 0.29098883271217346,
+ "learning_rate": 7.66888888888889e-06,
+ "loss": 0.0011,
+ "step": 1550
+ },
+ {
+ "epoch": 15.4,
+ "grad_norm": 0.12477891892194748,
+ "learning_rate": 7.613333333333334e-06,
+ "loss": 0.0008,
+ "step": 1575
+ },
+ {
+ "epoch": 15.65,
+ "grad_norm": 0.06489470601081848,
+ "learning_rate": 7.557777777777779e-06,
+ "loss": 0.0011,
+ "step": 1600
+ },
+ {
+ "epoch": 15.89,
+ "grad_norm": 0.061178650707006454,
+ "learning_rate": 7.502222222222223e-06,
+ "loss": 0.001,
+ "step": 1625
+ },
+ {
+ "epoch": 16.14,
+ "grad_norm": 0.038977060467004776,
+ "learning_rate": 7.446666666666668e-06,
+ "loss": 0.0007,
+ "step": 1650
+ },
+ {
+ "epoch": 16.38,
+ "grad_norm": 0.22110821306705475,
+ "learning_rate": 7.3911111111111125e-06,
+ "loss": 0.0007,
+ "step": 1675
+ },
+ {
+ "epoch": 16.63,
+ "grad_norm": 0.5320185422897339,
+ "learning_rate": 7.335555555555556e-06,
+ "loss": 0.0007,
+ "step": 1700
+ },
+ {
+ "epoch": 16.87,
+ "grad_norm": 0.7823454737663269,
+ "learning_rate": 7.280000000000001e-06,
+ "loss": 0.0008,
+ "step": 1725
+ },
+ {
+ "epoch": 17.11,
+ "grad_norm": 0.043301377445459366,
+ "learning_rate": 7.224444444444445e-06,
+ "loss": 0.001,
+ "step": 1750
+ },
+ {
+ "epoch": 17.36,
+ "grad_norm": 0.06231601908802986,
+ "learning_rate": 7.1688888888888895e-06,
+ "loss": 0.0005,
+ "step": 1775
+ },
+ {
+ "epoch": 17.6,
+ "grad_norm": 0.05838339775800705,
+ "learning_rate": 7.113333333333334e-06,
+ "loss": 0.0005,
+ "step": 1800
+ },
+ {
+ "epoch": 17.85,
+ "grad_norm": 0.05545497685670853,
+ "learning_rate": 7.057777777777778e-06,
+ "loss": 0.0012,
+ "step": 1825
+ },
+ {
+ "epoch": 18.09,
+ "grad_norm": 0.4030478894710541,
+ "learning_rate": 7.0022222222222225e-06,
+ "loss": 0.0008,
+ "step": 1850
+ },
+ {
+ "epoch": 18.34,
+ "grad_norm": 0.27439093589782715,
+ "learning_rate": 6.946666666666667e-06,
+ "loss": 0.0007,
+ "step": 1875
+ },
+ {
+ "epoch": 18.58,
+ "grad_norm": 0.25452977418899536,
+ "learning_rate": 6.891111111111111e-06,
+ "loss": 0.0007,
+ "step": 1900
+ },
+ {
+ "epoch": 18.83,
+ "grad_norm": 0.06759922206401825,
+ "learning_rate": 6.835555555555556e-06,
+ "loss": 0.0007,
+ "step": 1925
+ },
+ {
+ "epoch": 19.07,
+ "grad_norm": 0.25859466195106506,
+ "learning_rate": 6.780000000000001e-06,
+ "loss": 0.0006,
+ "step": 1950
+ },
+ {
+ "epoch": 19.32,
+ "grad_norm": 0.7427995800971985,
+ "learning_rate": 6.724444444444444e-06,
+ "loss": 0.0005,
+ "step": 1975
+ },
+ {
+ "epoch": 19.56,
+ "grad_norm": 0.0788324698805809,
+ "learning_rate": 6.668888888888889e-06,
+ "loss": 0.0008,
+ "step": 2000
+ },
+ {
+ "epoch": 19.56,
+ "eval_loss": 0.49481505155563354,
+ "eval_runtime": 1457.5675,
+ "eval_samples_per_second": 1.985,
+ "eval_steps_per_second": 0.497,
+ "eval_wer": 18.4735830052193,
+ "step": 2000
+ },
+ {
+ "epoch": 19.8,
+ "grad_norm": 0.04227956011891365,
+ "learning_rate": 6.613333333333334e-06,
+ "loss": 0.0008,
+ "step": 2025
+ },
+ {
+ "epoch": 20.05,
+ "grad_norm": 0.5580443739891052,
+ "learning_rate": 6.557777777777778e-06,
+ "loss": 0.001,
+ "step": 2050
+ },
+ {
+ "epoch": 20.29,
+ "grad_norm": 0.7394335865974426,
+ "learning_rate": 6.502222222222223e-06,
+ "loss": 0.0014,
+ "step": 2075
+ },
+ {
+ "epoch": 20.54,
+ "grad_norm": 0.8055688142776489,
+ "learning_rate": 6.446666666666668e-06,
+ "loss": 0.0011,
+ "step": 2100
+ },
+ {
+ "epoch": 20.78,
+ "grad_norm": 0.13119255006313324,
+ "learning_rate": 6.391111111111111e-06,
+ "loss": 0.0016,
+ "step": 2125
+ },
+ {
+ "epoch": 21.03,
+ "grad_norm": 0.21813702583312988,
+ "learning_rate": 6.335555555555556e-06,
+ "loss": 0.0014,
+ "step": 2150
+ },
+ {
+ "epoch": 21.27,
+ "grad_norm": 0.1066213995218277,
+ "learning_rate": 6.280000000000001e-06,
+ "loss": 0.0009,
+ "step": 2175
+ },
+ {
+ "epoch": 21.52,
+ "grad_norm": 0.8583650588989258,
+ "learning_rate": 6.224444444444445e-06,
+ "loss": 0.0012,
+ "step": 2200
+ },
+ {
+ "epoch": 21.76,
+ "grad_norm": 1.2513171434402466,
+ "learning_rate": 6.16888888888889e-06,
+ "loss": 0.0021,
+ "step": 2225
+ },
+ {
+ "epoch": 22.0,
+ "grad_norm": 0.8390223979949951,
+ "learning_rate": 6.113333333333333e-06,
+ "loss": 0.0018,
+ "step": 2250
+ },
+ {
+ "epoch": 22.25,
+ "grad_norm": 0.8746078610420227,
+ "learning_rate": 6.057777777777778e-06,
+ "loss": 0.0015,
+ "step": 2275
+ },
+ {
+ "epoch": 22.49,
+ "grad_norm": 0.13358770310878754,
+ "learning_rate": 6.002222222222223e-06,
+ "loss": 0.0016,
+ "step": 2300
+ },
+ {
+ "epoch": 22.74,
+ "grad_norm": 0.07681471109390259,
+ "learning_rate": 5.946666666666668e-06,
+ "loss": 0.0011,
+ "step": 2325
+ },
+ {
+ "epoch": 22.98,
+ "grad_norm": 0.5511406660079956,
+ "learning_rate": 5.891111111111112e-06,
+ "loss": 0.0013,
+ "step": 2350
+ },
+ {
+ "epoch": 23.23,
+ "grad_norm": 0.23318354785442352,
+ "learning_rate": 5.8355555555555565e-06,
+ "loss": 0.0011,
+ "step": 2375
+ },
+ {
+ "epoch": 23.47,
+ "grad_norm": 0.12396834790706635,
+ "learning_rate": 5.78e-06,
+ "loss": 0.0009,
+ "step": 2400
+ },
+ {
+ "epoch": 23.72,
+ "grad_norm": 0.0644838809967041,
+ "learning_rate": 5.724444444444445e-06,
+ "loss": 0.0006,
+ "step": 2425
+ },
+ {
+ "epoch": 23.96,
+ "grad_norm": 0.47172439098358154,
+ "learning_rate": 5.6688888888888895e-06,
+ "loss": 0.0007,
+ "step": 2450
+ },
+ {
+ "epoch": 24.21,
+ "grad_norm": 0.030231019482016563,
+ "learning_rate": 5.613333333333334e-06,
+ "loss": 0.0005,
+ "step": 2475
+ },
+ {
+ "epoch": 24.45,
+ "grad_norm": 0.01894545555114746,
+ "learning_rate": 5.557777777777778e-06,
+ "loss": 0.0003,
+ "step": 2500
+ },
+ {
+ "epoch": 24.69,
+ "grad_norm": 0.18070322275161743,
+ "learning_rate": 5.5022222222222224e-06,
+ "loss": 0.0009,
+ "step": 2525
+ },
+ {
+ "epoch": 24.94,
+ "grad_norm": 0.0435708686709404,
+ "learning_rate": 5.4466666666666665e-06,
+ "loss": 0.0009,
+ "step": 2550
+ },
+ {
+ "epoch": 25.18,
+ "grad_norm": 0.07220447063446045,
+ "learning_rate": 5.391111111111111e-06,
+ "loss": 0.0005,
+ "step": 2575
+ },
+ {
+ "epoch": 25.43,
+ "grad_norm": 0.01733986660838127,
+ "learning_rate": 5.335555555555556e-06,
+ "loss": 0.0004,
+ "step": 2600
+ },
+ {
+ "epoch": 25.67,
+ "grad_norm": 0.03520004078745842,
+ "learning_rate": 5.28e-06,
+ "loss": 0.0007,
+ "step": 2625
+ },
+ {
+ "epoch": 25.92,
+ "grad_norm": 0.03853292763233185,
+ "learning_rate": 5.224444444444445e-06,
+ "loss": 0.0005,
+ "step": 2650
+ },
+ {
+ "epoch": 26.16,
+ "grad_norm": 0.13450591266155243,
+ "learning_rate": 5.168888888888889e-06,
+ "loss": 0.0005,
+ "step": 2675
+ },
+ {
+ "epoch": 26.41,
+ "grad_norm": 0.029255390167236328,
+ "learning_rate": 5.113333333333333e-06,
+ "loss": 0.0004,
+ "step": 2700
+ },
+ {
+ "epoch": 26.65,
+ "grad_norm": 0.025706447660923004,
+ "learning_rate": 5.057777777777778e-06,
+ "loss": 0.0003,
+ "step": 2725
+ },
+ {
+ "epoch": 26.89,
+ "grad_norm": 0.902415931224823,
+ "learning_rate": 5.002222222222223e-06,
+ "loss": 0.0002,
+ "step": 2750
+ },
+ {
+ "epoch": 27.14,
+ "grad_norm": 0.013656423427164555,
+ "learning_rate": 4.946666666666667e-06,
+ "loss": 0.0002,
+ "step": 2775
+ },
+ {
+ "epoch": 27.38,
+ "grad_norm": 0.018052740022540092,
+ "learning_rate": 4.891111111111111e-06,
+ "loss": 0.0003,
+ "step": 2800
+ },
+ {
+ "epoch": 27.63,
+ "grad_norm": 0.07001502811908722,
+ "learning_rate": 4.835555555555556e-06,
+ "loss": 0.0002,
+ "step": 2825
+ },
+ {
+ "epoch": 27.87,
+ "grad_norm": 0.01241993810981512,
+ "learning_rate": 4.78e-06,
+ "loss": 0.0003,
+ "step": 2850
+ },
+ {
+ "epoch": 28.12,
+ "grad_norm": 0.013495378196239471,
+ "learning_rate": 4.724444444444445e-06,
+ "loss": 0.0001,
+ "step": 2875
+ },
+ {
+ "epoch": 28.36,
+ "grad_norm": 0.011048069223761559,
+ "learning_rate": 4.66888888888889e-06,
+ "loss": 0.0001,
+ "step": 2900
+ },
+ {
+ "epoch": 28.61,
+ "grad_norm": 0.015970442444086075,
+ "learning_rate": 4.613333333333334e-06,
+ "loss": 0.0002,
+ "step": 2925
+ },
+ {
+ "epoch": 28.85,
+ "grad_norm": 0.009559527039527893,
+ "learning_rate": 4.557777777777778e-06,
+ "loss": 0.0002,
+ "step": 2950
+ },
+ {
+ "epoch": 29.1,
+ "grad_norm": 0.010293275117874146,
+ "learning_rate": 4.502222222222223e-06,
+ "loss": 0.0001,
+ "step": 2975
+ },
+ {
+ "epoch": 29.34,
+ "grad_norm": 0.01116804126650095,
+ "learning_rate": 4.446666666666667e-06,
+ "loss": 0.0001,
+ "step": 3000
+ },
+ {
+ "epoch": 29.34,
+ "eval_loss": 0.5352661609649658,
+ "eval_runtime": 1458.2257,
+ "eval_samples_per_second": 1.985,
+ "eval_steps_per_second": 0.496,
+ "eval_wer": 18.07297866495742,
+ "step": 3000
+ },
+ {
+ "epoch": 29.58,
+ "grad_norm": 0.0082534896209836,
+ "learning_rate": 4.391111111111112e-06,
+ "loss": 0.0001,
+ "step": 3025
+ },
+ {
+ "epoch": 29.83,
+ "grad_norm": 0.007231541443616152,
+ "learning_rate": 4.3355555555555565e-06,
+ "loss": 0.0001,
+ "step": 3050
+ },
+ {
+ "epoch": 30.07,
+ "grad_norm": 0.008312270976603031,
+ "learning_rate": 4.2800000000000005e-06,
+ "loss": 0.0001,
+ "step": 3075
+ },
+ {
+ "epoch": 30.32,
+ "grad_norm": 0.007097834721207619,
+ "learning_rate": 4.2244444444444446e-06,
+ "loss": 0.0001,
+ "step": 3100
+ },
+ {
+ "epoch": 30.56,
+ "grad_norm": 0.007524041458964348,
+ "learning_rate": 4.168888888888889e-06,
+ "loss": 0.0001,
+ "step": 3125
+ },
+ {
+ "epoch": 30.81,
+ "grad_norm": 0.00781218009069562,
+ "learning_rate": 4.1133333333333335e-06,
+ "loss": 0.0001,
+ "step": 3150
+ },
+ {
+ "epoch": 31.05,
+ "grad_norm": 0.006727377884089947,
+ "learning_rate": 4.057777777777778e-06,
+ "loss": 0.0001,
+ "step": 3175
+ },
+ {
+ "epoch": 31.3,
+ "grad_norm": 0.00624061468988657,
+ "learning_rate": 4.002222222222222e-06,
+ "loss": 0.0001,
+ "step": 3200
+ },
+ {
+ "epoch": 31.54,
+ "grad_norm": 0.006174057722091675,
+ "learning_rate": 3.946666666666667e-06,
+ "loss": 0.0001,
+ "step": 3225
+ },
+ {
+ "epoch": 31.78,
+ "grad_norm": 0.006670523434877396,
+ "learning_rate": 3.891111111111111e-06,
+ "loss": 0.0001,
+ "step": 3250
+ },
+ {
+ "epoch": 32.03,
+ "grad_norm": 0.006069181486964226,
+ "learning_rate": 3.835555555555555e-06,
+ "loss": 0.0001,
+ "step": 3275
+ },
+ {
+ "epoch": 32.27,
+ "grad_norm": 0.005773240700364113,
+ "learning_rate": 3.7800000000000002e-06,
+ "loss": 0.0001,
+ "step": 3300
+ },
+ {
+ "epoch": 32.52,
+ "grad_norm": 0.005736664403229952,
+ "learning_rate": 3.724444444444445e-06,
+ "loss": 0.0001,
+ "step": 3325
+ },
+ {
+ "epoch": 32.76,
+ "grad_norm": 0.0057275379076600075,
+ "learning_rate": 3.668888888888889e-06,
+ "loss": 0.0001,
+ "step": 3350
+ },
+ {
+ "epoch": 33.01,
+ "grad_norm": 0.0070615834556519985,
+ "learning_rate": 3.6133333333333336e-06,
+ "loss": 0.0001,
+ "step": 3375
+ },
+ {
+ "epoch": 33.25,
+ "grad_norm": 0.005553886294364929,
+ "learning_rate": 3.5577777777777785e-06,
+ "loss": 0.0001,
+ "step": 3400
+ },
+ {
+ "epoch": 33.5,
+ "grad_norm": 0.00474073551595211,
+ "learning_rate": 3.5022222222222225e-06,
+ "loss": 0.0001,
+ "step": 3425
+ },
+ {
+ "epoch": 33.74,
+ "grad_norm": 0.006729442626237869,
+ "learning_rate": 3.446666666666667e-06,
+ "loss": 0.0001,
+ "step": 3450
+ },
+ {
+ "epoch": 33.99,
+ "grad_norm": 0.006181245669722557,
+ "learning_rate": 3.391111111111111e-06,
+ "loss": 0.0001,
+ "step": 3475
+ },
+ {
+ "epoch": 34.23,
+ "grad_norm": 0.004282401409000158,
+ "learning_rate": 3.335555555555556e-06,
+ "loss": 0.0001,
+ "step": 3500
+ },
+ {
+ "epoch": 34.47,
+ "grad_norm": 0.0049114222638309,
+ "learning_rate": 3.2800000000000004e-06,
+ "loss": 0.0001,
+ "step": 3525
+ },
+ {
+ "epoch": 34.72,
+ "grad_norm": 0.004997397307306528,
+ "learning_rate": 3.2244444444444444e-06,
+ "loss": 0.0001,
+ "step": 3550
+ },
+ {
+ "epoch": 34.96,
+ "grad_norm": 0.004732249770313501,
+ "learning_rate": 3.1688888888888893e-06,
+ "loss": 0.0001,
+ "step": 3575
+ },
+ {
+ "epoch": 35.21,
+ "grad_norm": 0.005105483811348677,
+ "learning_rate": 3.1133333333333337e-06,
+ "loss": 0.0001,
+ "step": 3600
+ },
+ {
+ "epoch": 35.45,
+ "grad_norm": 0.005223471205681562,
+ "learning_rate": 3.0577777777777778e-06,
+ "loss": 0.0001,
+ "step": 3625
+ },
+ {
+ "epoch": 35.7,
+ "grad_norm": 0.005028573330491781,
+ "learning_rate": 3.0022222222222227e-06,
+ "loss": 0.0001,
+ "step": 3650
+ },
+ {
+ "epoch": 35.94,
+ "grad_norm": 0.0035760572645813227,
+ "learning_rate": 2.946666666666667e-06,
+ "loss": 0.0001,
+ "step": 3675
+ },
+ {
+ "epoch": 36.19,
+ "grad_norm": 0.0042958687990903854,
+ "learning_rate": 2.891111111111111e-06,
+ "loss": 0.0001,
+ "step": 3700
+ },
+ {
+ "epoch": 36.43,
+ "grad_norm": 0.004194607958197594,
+ "learning_rate": 2.835555555555556e-06,
+ "loss": 0.0001,
+ "step": 3725
+ },
+ {
+ "epoch": 36.67,
+ "grad_norm": 0.004419374745339155,
+ "learning_rate": 2.7800000000000005e-06,
+ "loss": 0.0001,
+ "step": 3750
+ },
+ {
+ "epoch": 36.92,
+ "grad_norm": 0.004165703430771828,
+ "learning_rate": 2.7244444444444445e-06,
+ "loss": 0.0001,
+ "step": 3775
+ },
+ {
+ "epoch": 37.16,
+ "grad_norm": 0.0038676238618791103,
+ "learning_rate": 2.6688888888888894e-06,
+ "loss": 0.0001,
+ "step": 3800
+ },
+ {
+ "epoch": 37.41,
+ "grad_norm": 0.004417106043547392,
+ "learning_rate": 2.6133333333333334e-06,
+ "loss": 0.0001,
+ "step": 3825
+ },
+ {
+ "epoch": 37.65,
+ "grad_norm": 0.004240726120769978,
+ "learning_rate": 2.557777777777778e-06,
+ "loss": 0.0001,
+ "step": 3850
+ },
+ {
+ "epoch": 37.9,
+ "grad_norm": 0.003732978831976652,
+ "learning_rate": 2.5022222222222224e-06,
+ "loss": 0.0001,
+ "step": 3875
+ },
+ {
+ "epoch": 38.14,
+ "grad_norm": 0.0037496890872716904,
+ "learning_rate": 2.446666666666667e-06,
+ "loss": 0.0001,
+ "step": 3900
+ },
+ {
+ "epoch": 38.39,
+ "grad_norm": 0.003918818198144436,
+ "learning_rate": 2.3911111111111113e-06,
+ "loss": 0.0001,
+ "step": 3925
+ },
+ {
+ "epoch": 38.63,
1139
+ "grad_norm": 0.003961279056966305,
1140
+ "learning_rate": 2.3355555555555557e-06,
1141
+ "loss": 0.0001,
1142
+ "step": 3950
1143
+ },
1144
+ {
1145
+ "epoch": 38.88,
1146
+ "grad_norm": 0.00358415674418211,
1147
+ "learning_rate": 2.28e-06,
1148
+ "loss": 0.0001,
1149
+ "step": 3975
1150
+ },
1151
+ {
1152
+ "epoch": 39.12,
1153
+ "grad_norm": 0.004146341234445572,
1154
+ "learning_rate": 2.2244444444444447e-06,
1155
+ "loss": 0.0001,
1156
+ "step": 4000
1157
+ },
1158
+ {
1159
+ "epoch": 39.12,
1160
+ "eval_loss": 0.5624426603317261,
1161
+ "eval_runtime": 1469.1635,
1162
+ "eval_samples_per_second": 1.97,
1163
+ "eval_steps_per_second": 0.493,
1164
+ "eval_wer": 18.056954491346946,
1165
+ "step": 4000
1166
+ },
1167
+ {
1168
+ "epoch": 39.36,
1169
+ "grad_norm": 0.004992615897208452,
1170
+ "learning_rate": 2.168888888888889e-06,
1171
+ "loss": 0.0001,
1172
+ "step": 4025
1173
+ },
1174
+ {
1175
+ "epoch": 39.61,
1176
+ "grad_norm": 0.0034102711360901594,
1177
+ "learning_rate": 2.1133333333333336e-06,
1178
+ "loss": 0.0001,
1179
+ "step": 4050
1180
+ },
1181
+ {
1182
+ "epoch": 39.85,
1183
+ "grad_norm": 0.0036119353026151657,
1184
+ "learning_rate": 2.057777777777778e-06,
1185
+ "loss": 0.0,
1186
+ "step": 4075
1187
+ },
1188
+ {
1189
+ "epoch": 40.1,
1190
+ "grad_norm": 0.0040961061604321,
1191
+ "learning_rate": 2.0022222222222225e-06,
1192
+ "loss": 0.0001,
1193
+ "step": 4100
1194
+ },
1195
+ {
1196
+ "epoch": 40.34,
1197
+ "grad_norm": 0.003933845553547144,
1198
+ "learning_rate": 1.9466666666666665e-06,
1199
+ "loss": 0.0001,
1200
+ "step": 4125
1201
+ },
1202
+ {
1203
+ "epoch": 40.59,
1204
+ "grad_norm": 0.00343153509311378,
1205
+ "learning_rate": 1.8911111111111114e-06,
1206
+ "loss": 0.0,
1207
+ "step": 4150
1208
+ },
1209
+ {
1210
+ "epoch": 40.83,
1211
+ "grad_norm": 0.0032181148417294025,
1212
+ "learning_rate": 1.8355555555555557e-06,
1213
+ "loss": 0.0,
1214
+ "step": 4175
1215
+ },
1216
+ {
1217
+ "epoch": 41.08,
1218
+ "grad_norm": 0.002862308407202363,
1219
+ "learning_rate": 1.7800000000000001e-06,
1220
+ "loss": 0.0,
1221
+ "step": 4200
1222
+ },
1223
+ {
1224
+ "epoch": 41.32,
1225
+ "grad_norm": 0.0038496414199471474,
1226
+ "learning_rate": 1.7244444444444448e-06,
1227
+ "loss": 0.0,
1228
+ "step": 4225
1229
+ },
1230
+ {
1231
+ "epoch": 41.56,
1232
+ "grad_norm": 0.0036370421294122934,
1233
+ "learning_rate": 1.668888888888889e-06,
1234
+ "loss": 0.0,
1235
+ "step": 4250
1236
+ },
1237
+ {
1238
+ "epoch": 41.81,
1239
+ "grad_norm": 0.003030687803402543,
1240
+ "learning_rate": 1.6133333333333335e-06,
1241
+ "loss": 0.0,
1242
+ "step": 4275
1243
+ },
1244
+ {
1245
+ "epoch": 42.05,
1246
+ "grad_norm": 0.003104611998423934,
1247
+ "learning_rate": 1.5577777777777777e-06,
1248
+ "loss": 0.0,
1249
+ "step": 4300
1250
+ },
1251
+ {
1252
+ "epoch": 42.3,
1253
+ "grad_norm": 0.0035941856913268566,
1254
+ "learning_rate": 1.5022222222222224e-06,
1255
+ "loss": 0.0,
1256
+ "step": 4325
1257
+ },
1258
+ {
1259
+ "epoch": 42.54,
1260
+ "grad_norm": 0.0029717443976551294,
1261
+ "learning_rate": 1.4466666666666669e-06,
1262
+ "loss": 0.0,
1263
+ "step": 4350
1264
+ },
1265
+ {
1266
+ "epoch": 42.79,
1267
+ "grad_norm": 0.0039994968101382256,
1268
+ "learning_rate": 1.3911111111111111e-06,
1269
+ "loss": 0.0,
1270
+ "step": 4375
1271
+ },
1272
+ {
1273
+ "epoch": 43.03,
1274
+ "grad_norm": 0.003351717721670866,
1275
+ "learning_rate": 1.3355555555555558e-06,
1276
+ "loss": 0.0,
1277
+ "step": 4400
1278
+ },
1279
+ {
1280
+ "epoch": 43.28,
1281
+ "grad_norm": 0.00382447661831975,
1282
+ "learning_rate": 1.28e-06,
1283
+ "loss": 0.0,
1284
+ "step": 4425
1285
+ },
1286
+ {
1287
+ "epoch": 43.52,
1288
+ "grad_norm": 0.0030940717551857233,
1289
+ "learning_rate": 1.2244444444444445e-06,
1290
+ "loss": 0.0,
1291
+ "step": 4450
1292
+ },
1293
+ {
1294
+ "epoch": 43.77,
1295
+ "grad_norm": 0.003348903963342309,
1296
+ "learning_rate": 1.168888888888889e-06,
1297
+ "loss": 0.0,
1298
+ "step": 4475
1299
+ },
1300
+ {
1301
+ "epoch": 44.01,
1302
+ "grad_norm": 0.002947325585409999,
1303
+ "learning_rate": 1.1133333333333334e-06,
1304
+ "loss": 0.0,
1305
+ "step": 4500
1306
+ },
1307
+ {
1308
+ "epoch": 44.25,
1309
+ "grad_norm": 0.0035392455756664276,
1310
+ "learning_rate": 1.0577777777777779e-06,
1311
+ "loss": 0.0,
1312
+ "step": 4525
1313
+ },
1314
+ {
1315
+ "epoch": 44.5,
1316
+ "grad_norm": 0.0031909053213894367,
1317
+ "learning_rate": 1.0022222222222223e-06,
1318
+ "loss": 0.0,
1319
+ "step": 4550
1320
+ },
1321
+ {
1322
+ "epoch": 44.74,
1323
+ "grad_norm": 0.0031825106125324965,
1324
+ "learning_rate": 9.466666666666667e-07,
1325
+ "loss": 0.0,
1326
+ "step": 4575
1327
+ },
1328
+ {
1329
+ "epoch": 44.99,
1330
+ "grad_norm": 0.003659332636743784,
1331
+ "learning_rate": 8.911111111111112e-07,
1332
+ "loss": 0.0,
1333
+ "step": 4600
1334
+ },
1335
+ {
1336
+ "epoch": 45.23,
1337
+ "grad_norm": 0.003141431836411357,
1338
+ "learning_rate": 8.355555555555556e-07,
1339
+ "loss": 0.0,
1340
+ "step": 4625
1341
+ },
1342
+ {
1343
+ "epoch": 45.48,
1344
+ "grad_norm": 0.0026865031104534864,
1345
+ "learning_rate": 7.8e-07,
1346
+ "loss": 0.0,
1347
+ "step": 4650
1348
+ },
1349
+ {
1350
+ "epoch": 45.72,
1351
+ "grad_norm": 0.0037062685005366802,
1352
+ "learning_rate": 7.244444444444446e-07,
1353
+ "loss": 0.0,
1354
+ "step": 4675
1355
+ },
1356
+ {
1357
+ "epoch": 45.97,
1358
+ "grad_norm": 0.0038293024990707636,
1359
+ "learning_rate": 6.68888888888889e-07,
1360
+ "loss": 0.0,
1361
+ "step": 4700
1362
+ },
1363
+ {
1364
+ "epoch": 46.21,
1365
+ "grad_norm": 0.0033506678882986307,
1366
+ "learning_rate": 6.133333333333333e-07,
1367
+ "loss": 0.0,
1368
+ "step": 4725
1369
+ },
1370
+ {
1371
+ "epoch": 46.45,
1372
+ "grad_norm": 0.0031830337829887867,
1373
+ "learning_rate": 5.577777777777779e-07,
1374
+ "loss": 0.0,
1375
+ "step": 4750
1376
+ },
1377
+ {
1378
+ "epoch": 46.7,
1379
+ "grad_norm": 0.002798918168991804,
1380
+ "learning_rate": 5.022222222222222e-07,
1381
+ "loss": 0.0,
1382
+ "step": 4775
1383
+ },
1384
+ {
1385
+ "epoch": 46.94,
1386
+ "grad_norm": 0.0032959673553705215,
1387
+ "learning_rate": 4.466666666666667e-07,
1388
+ "loss": 0.0,
1389
+ "step": 4800
1390
+ },
1391
+ {
1392
+ "epoch": 47.19,
1393
+ "grad_norm": 0.0029275265987962484,
1394
+ "learning_rate": 3.9111111111111115e-07,
1395
+ "loss": 0.0,
1396
+ "step": 4825
1397
+ },
1398
+ {
1399
+ "epoch": 47.43,
1400
+ "grad_norm": 0.0030392766930162907,
1401
+ "learning_rate": 3.3555555555555556e-07,
1402
+ "loss": 0.0,
1403
+ "step": 4850
1404
+ },
1405
+ {
1406
+ "epoch": 47.68,
1407
+ "grad_norm": 0.002865479327738285,
1408
+ "learning_rate": 2.8e-07,
1409
+ "loss": 0.0,
1410
+ "step": 4875
1411
+ },
1412
+ {
1413
+ "epoch": 47.92,
1414
+ "grad_norm": 0.0030035064555704594,
1415
+ "learning_rate": 2.2444444444444445e-07,
1416
+ "loss": 0.0,
1417
+ "step": 4900
1418
+ },
1419
+ {
1420
+ "epoch": 48.17,
1421
+ "grad_norm": 0.00276967347599566,
1422
+ "learning_rate": 1.6888888888888888e-07,
1423
+ "loss": 0.0,
1424
+ "step": 4925
1425
+ },
1426
+ {
1427
+ "epoch": 48.41,
1428
+ "grad_norm": 0.003191766096279025,
1429
+ "learning_rate": 1.1333333333333336e-07,
1430
+ "loss": 0.0,
1431
+ "step": 4950
1432
+ },
1433
+ {
1434
+ "epoch": 48.66,
1435
+ "grad_norm": 0.00321377394720912,
1436
+ "learning_rate": 5.777777777777778e-08,
1437
+ "loss": 0.0,
1438
+ "step": 4975
1439
+ },
1440
+ {
1441
+ "epoch": 48.9,
1442
+ "grad_norm": 0.0033689856063574553,
1443
+ "learning_rate": 2.2222222222222225e-09,
1444
+ "loss": 0.0,
1445
+ "step": 5000
1446
+ },
1447
+ {
1448
+ "epoch": 48.9,
1449
+ "eval_loss": 0.5714073181152344,
1450
+ "eval_runtime": 1474.141,
1451
+ "eval_samples_per_second": 1.963,
1452
+ "eval_steps_per_second": 0.491,
1453
+ "eval_wer": 18.063821994322865,
1454
+ "step": 5000
1455
+ }
1456
+ ],
1457
+ "logging_steps": 25,
1458
+ "max_steps": 5000,
1459
+ "num_input_tokens_seen": 0,
1460
+ "num_train_epochs": 50,
1461
+ "save_steps": 1000,
1462
+ "total_flos": 9.229191970553856e+19,
1463
+ "train_batch_size": 8,
1464
+ "trial_name": null,
1465
+ "trial_params": null
1466
+ }
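As an aside, the eval rows interleaved in the `log_history` above (the entries carrying `eval_loss` and `eval_wer` at steps 4000 and 5000) can be pulled out of any `trainer_state.json` with a few lines of standard-library Python. This is a sketch, not part of the repo; the file path is whatever checkpoint you point it at:

```python
import json

def eval_history(trainer_state_path):
    """Return (step, eval_loss, eval_wer) tuples from a Trainer trainer_state.json.

    Training rows in log_history carry "loss"; evaluation rows carry
    "eval_loss"/"eval_wer", so filtering on "eval_wer" selects the eval rows.
    """
    with open(trainer_state_path) as f:
        state = json.load(f)
    return [
        (entry["step"], entry["eval_loss"], entry["eval_wer"])
        for entry in state["log_history"]
        if "eval_wer" in entry
    ]
```

Run against the checkpoint-5000 state above, this would yield the two eval points, e.g. step 4000 with WER ≈ 18.06.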
Models/hindi/checkpoint-5000/training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1562c17bb2dc7592b45af48209c138ce71faddb67bef288ecaf31f1c50f864ae
+ size 5048
Models/hindi/config.json ADDED
@@ -0,0 +1,52 @@
+ {
+ "_name_or_path": "openai/whisper-small",
+ "activation_dropout": 0.0,
+ "activation_function": "gelu",
+ "apply_spec_augment": false,
+ "architectures": [
+ "WhisperForConditionalGeneration"
+ ],
+ "attention_dropout": 0.0,
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "classifier_proj_size": 256,
+ "d_model": 768,
+ "decoder_attention_heads": 12,
+ "decoder_ffn_dim": 3072,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 12,
+ "decoder_start_token_id": 50258,
+ "dropout": 0.0,
+ "encoder_attention_heads": 12,
+ "encoder_ffn_dim": 3072,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 12,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "mask_feature_length": 10,
+ "mask_feature_min_masks": 0,
+ "mask_feature_prob": 0.0,
+ "mask_time_length": 10,
+ "mask_time_min_masks": 2,
+ "mask_time_prob": 0.05,
+ "max_length": 448,
+ "max_source_positions": 1500,
+ "max_target_positions": 448,
+ "median_filter_width": 7,
+ "model_type": "whisper",
+ "num_hidden_layers": 12,
+ "num_mel_bins": 80,
+ "pad_token_id": 50257,
+ "scale_embedding": false,
+ "suppress_tokens": [],
+ "torch_dtype": "float32",
+ "transformers_version": "4.40.0.dev0",
+ "use_cache": false,
+ "use_weighted_layer_sum": false,
+ "vocab_size": 51865
+ }
Models/hindi/generation_config.json ADDED
@@ -0,0 +1,265 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ null
+ ],
+ [
+ 2,
+ 50359
+ ]
+ ],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "language": "hi",
+ "max_initial_timestamp_index": 50,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "prev_sot_token_id": 50361,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50358,
+ 50359,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.40.0.dev0"
+ }
Models/hindi/merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
Models/hindi/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0e7ee6929d4a9289b779431fe8f499fff1d35555d5380656ae14fe72eda3de8e
+ size 966995080
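Note that the `.bin` and `.safetensors` entries in this commit are Git LFS pointer files (a `version`/`oid`/`size` header), not the weights themselves; the actual 966 MB safetensors blob lives in LFS storage. A pointer can be decoded with plain string handling; this is an illustrative sketch, not part of the repo:

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a {key: value} dict.

    Each line is "<key> <value>"; expected keys are 'version', 'oid',
    and 'size' (size is returned as an int, the blob size in bytes).
    """
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = int(value) if key == "size" else value
    return fields
```

Applied to the model.safetensors pointer above, this would report a size of 966995080 bytes and the sha256 oid of the stored blob.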