artyomboyko committed on
Commit 35ba9b4
1 Parent(s): 4437eb5

End of training
README.md ADDED
@@ -0,0 +1,95 @@
+ ---
+ license: apache-2.0
+ base_model: artyomboyko/whisper-small-fine_tuned-ru-v2
+ tags:
+ - generated_from_trainer
+ metrics:
+ - wer
+ model-index:
+ - name: whisper-small-fine_tuned-ru-v2-v3
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # whisper-small-fine_tuned-ru-v2-v3
+
+ This model is a fine-tuned version of [artyomboyko/whisper-small-fine_tuned-ru-v2](https://huggingface.co/artyomboyko/whisper-small-fine_tuned-ru-v2) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.1329
+ - Wer: 12.6750
+ - Cer: 3.7305
+ - Learning Rate: 0.0000
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-08
+ - train_batch_size: 16
+ - eval_batch_size: 8
+ - seed: 42
+ - gradient_accumulation_steps: 2
+ - total_train_batch_size: 32
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 250
+ - training_steps: 15000
+ - mixed_precision_training: Native AMP
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Wer | Cer | Learning Rate |
+ |:-------------:|:-----:|:-----:|:---------------:|:-------:|:------:|:------:|
+ | 0.0661 | 0.09 | 500 | 0.1358 | 12.9097 | 3.8217 | 0.0000 |
+ | 0.0616 | 0.17 | 1000 | 0.1357 | 12.9620 | 3.8949 | 0.0000 |
+ | 0.0601 | 0.26 | 1500 | 0.1357 | 12.8795 | 3.8225 | 0.0000 |
+ | 0.0666 | 0.35 | 2000 | 0.1353 | 12.9481 | 3.8871 | 0.0000 |
+ | 0.0669 | 0.43 | 2500 | 0.1352 | 12.8284 | 3.8283 | 0.0000 |
+ | 0.0665 | 0.52 | 3000 | 0.1351 | 12.8203 | 3.7833 | 0.0000 |
+ | 0.0649 | 0.61 | 3500 | 0.1349 | 12.8098 | 3.7824 | 0.0000 |
+ | 0.0607 | 0.69 | 4000 | 0.1347 | 12.8110 | 3.8105 | 0.0000 |
+ | 0.0636 | 0.78 | 4500 | 0.1345 | 12.7994 | 3.7893 | 0.0000 |
+ | 0.063 | 0.87 | 5000 | 0.1342 | 12.8319 | 3.8084 | 0.0000 |
+ | 0.0589 | 0.95 | 5500 | 0.1341 | 12.8807 | 3.8551 | 0.0000 |
+ | 0.0734 | 1.04 | 6000 | 0.1341 | 12.7691 | 3.7604 | 0.0000 |
+ | 0.0577 | 1.13 | 6500 | 0.1340 | 12.7645 | 3.7602 | 0.0000 |
+ | 0.052 | 1.21 | 7000 | 0.1340 | 12.7610 | 3.7655 | 0.0000 |
+ | 0.0626 | 1.3 | 7500 | 0.1339 | 12.7657 | 3.7593 | 0.0000 |
+ | 0.0617 | 1.39 | 8000 | 0.1338 | 12.7912 | 3.8268 | 0.0000 |
+ | 0.063 | 1.47 | 8500 | 0.1337 | 12.7343 | 3.7573 | 0.0000 |
+ | 0.0668 | 1.56 | 9000 | 0.1336 | 12.7308 | 3.7198 | 0.0000 |
+ | 0.0634 | 1.65 | 9500 | 0.1335 | 12.7215 | 3.7400 | 0.0000 |
+ | 0.0604 | 1.73 | 10000 | 0.1333 | 12.7192 | 3.7515 | 0.0000 |
+ | 0.0707 | 1.82 | 10500 | 0.1333 | 12.7052 | 3.7568 | 0.0000 |
+ | 0.0639 | 1.91 | 11000 | 0.1332 | 12.6983 | 3.7617 | 0.0000 |
+ | 0.0617 | 1.99 | 11500 | 0.1331 | 12.6936 | 3.7402 | 0.0000 |
+ | 0.0601 | 2.08 | 12000 | 0.1330 | 12.6901 | 3.7586 | 0.0000 |
+ | 0.0632 | 2.17 | 12500 | 0.1330 | 12.6785 | 3.7279 | 0.0000 |
+ | 0.0626 | 2.25 | 13000 | 0.1330 | 12.6808 | 3.7333 | 0.0000 |
+ | 0.066 | 2.34 | 13500 | 0.1329 | 12.6704 | 3.7512 | 0.0000 |
+ | 0.0674 | 2.42 | 14000 | 0.1329 | 12.6599 | 3.7384 | 0.0000 |
+ | 0.0637 | 2.51 | 14500 | 0.1329 | 12.6797 | 3.7428 | 0.0000 |
+ | 0.0641 | 2.6 | 15000 | 0.1329 | 12.6750 | 3.7305 | 0.0000 |
+
+
+ ### Framework versions
+
+ - Transformers 4.36.0.dev0
+ - Pytorch 2.1.1+cu121
+ - Datasets 2.15.0
+ - Tokenizers 0.15.0
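The card above describes a multilingual Whisper small checkpoint fine-tuned for Russian (final eval WER 12.6750, CER 3.7305). As a minimal inference sketch, assuming the repository id matches the model name in the card (`artyomboyko/whisper-small-fine_tuned-ru-v2-v3`), a reasonably recent `transformers` release, and that `sample_ru.wav` is a placeholder for a local Russian speech recording:

```python
# Hedged inference sketch; the repo id and the audio path are assumptions, not part of this commit.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="artyomboyko/whisper-small-fine_tuned-ru-v2-v3",  # assumed repo id
    chunk_length_s=30,  # Whisper decodes audio in 30-second windows
)

# Pin the decoder to Russian transcription rather than relying on language detection.
result = asr(
    "sample_ru.wav",
    generate_kwargs={"language": "russian", "task": "transcribe"},
)
print(result["text"])
```

The Wer and Cer figures in the card are most likely percentages (100 × the raw error rate), which is the usual convention in Whisper fine-tuning scripts.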
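The values listed under "Training hyperparameters" map almost one-to-one onto `Seq2SeqTrainingArguments`. A hedged reconstruction follows: only the values named in the card come from the source; `output_dir`, the evaluation/save cadence and `predict_with_generate` are assumptions (the results table does report metrics every 500 steps).

```python
# Hedged reconstruction of the training setup implied by the card; anything not
# listed in the card is an assumption.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-fine_tuned-ru-v2-v3",  # assumed
    learning_rate=5e-8,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # 16 * 2 = total train batch size 32
    warmup_steps=250,
    max_steps=15000,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="steps",     # assumed; the card evaluates every 500 steps
    eval_steps=500,
    save_steps=500,                  # assumed
    predict_with_generate=True,      # assumed; needed to score WER/CER on generated text
)
# The default AdamW optimizer already matches the betas=(0.9, 0.999) and
# epsilon=1e-08 reported in the card, so no explicit optimizer setting is needed.
```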
generation_config.json ADDED
@@ -0,0 +1,263 @@
+ {
+ "alignment_heads": [
+ [
+ 5,
+ 3
+ ],
+ [
+ 5,
+ 9
+ ],
+ [
+ 8,
+ 0
+ ],
+ [
+ 8,
+ 4
+ ],
+ [
+ 8,
+ 7
+ ],
+ [
+ 8,
+ 8
+ ],
+ [
+ 9,
+ 0
+ ],
+ [
+ 9,
+ 7
+ ],
+ [
+ 9,
+ 9
+ ],
+ [
+ 10,
+ 5
+ ]
+ ],
+ "begin_suppress_tokens": [
+ 220,
+ 50257
+ ],
+ "bos_token_id": 50257,
+ "decoder_start_token_id": 50258,
+ "eos_token_id": 50257,
+ "forced_decoder_ids": [
+ [
+ 1,
+ null
+ ],
+ [
+ 2,
+ 50359
+ ]
+ ],
+ "is_multilingual": true,
+ "lang_to_id": {
+ "<|af|>": 50327,
+ "<|am|>": 50334,
+ "<|ar|>": 50272,
+ "<|as|>": 50350,
+ "<|az|>": 50304,
+ "<|ba|>": 50355,
+ "<|be|>": 50330,
+ "<|bg|>": 50292,
+ "<|bn|>": 50302,
+ "<|bo|>": 50347,
+ "<|br|>": 50309,
+ "<|bs|>": 50315,
+ "<|ca|>": 50270,
+ "<|cs|>": 50283,
+ "<|cy|>": 50297,
+ "<|da|>": 50285,
+ "<|de|>": 50261,
+ "<|el|>": 50281,
+ "<|en|>": 50259,
+ "<|es|>": 50262,
+ "<|et|>": 50307,
+ "<|eu|>": 50310,
+ "<|fa|>": 50300,
+ "<|fi|>": 50277,
+ "<|fo|>": 50338,
+ "<|fr|>": 50265,
+ "<|gl|>": 50319,
+ "<|gu|>": 50333,
+ "<|haw|>": 50352,
+ "<|ha|>": 50354,
+ "<|he|>": 50279,
+ "<|hi|>": 50276,
+ "<|hr|>": 50291,
+ "<|ht|>": 50339,
+ "<|hu|>": 50286,
+ "<|hy|>": 50312,
+ "<|id|>": 50275,
+ "<|is|>": 50311,
+ "<|it|>": 50274,
+ "<|ja|>": 50266,
+ "<|jw|>": 50356,
+ "<|ka|>": 50329,
+ "<|kk|>": 50316,
+ "<|km|>": 50323,
+ "<|kn|>": 50306,
+ "<|ko|>": 50264,
+ "<|la|>": 50294,
+ "<|lb|>": 50345,
+ "<|ln|>": 50353,
+ "<|lo|>": 50336,
+ "<|lt|>": 50293,
+ "<|lv|>": 50301,
+ "<|mg|>": 50349,
+ "<|mi|>": 50295,
+ "<|mk|>": 50308,
+ "<|ml|>": 50296,
+ "<|mn|>": 50314,
+ "<|mr|>": 50320,
+ "<|ms|>": 50282,
+ "<|mt|>": 50343,
+ "<|my|>": 50346,
+ "<|ne|>": 50313,
+ "<|nl|>": 50271,
+ "<|nn|>": 50342,
+ "<|no|>": 50288,
+ "<|oc|>": 50328,
+ "<|pa|>": 50321,
+ "<|pl|>": 50269,
+ "<|ps|>": 50340,
+ "<|pt|>": 50267,
+ "<|ro|>": 50284,
+ "<|ru|>": 50263,
+ "<|sa|>": 50344,
+ "<|sd|>": 50332,
+ "<|si|>": 50322,
+ "<|sk|>": 50298,
+ "<|sl|>": 50305,
+ "<|sn|>": 50324,
+ "<|so|>": 50326,
+ "<|sq|>": 50317,
+ "<|sr|>": 50303,
+ "<|su|>": 50357,
+ "<|sv|>": 50273,
+ "<|sw|>": 50318,
+ "<|ta|>": 50287,
+ "<|te|>": 50299,
+ "<|tg|>": 50331,
+ "<|th|>": 50289,
+ "<|tk|>": 50341,
+ "<|tl|>": 50348,
+ "<|tr|>": 50268,
+ "<|tt|>": 50351,
+ "<|uk|>": 50280,
+ "<|ur|>": 50290,
+ "<|uz|>": 50337,
+ "<|vi|>": 50278,
+ "<|yi|>": 50335,
+ "<|yo|>": 50325,
+ "<|zh|>": 50260
+ },
+ "max_initial_timestamp_index": 1,
+ "max_length": 448,
+ "no_timestamps_token_id": 50363,
+ "pad_token_id": 50257,
+ "return_timestamps": false,
+ "suppress_tokens": [
+ 1,
+ 2,
+ 7,
+ 8,
+ 9,
+ 10,
+ 14,
+ 25,
+ 26,
+ 27,
+ 28,
+ 29,
+ 31,
+ 58,
+ 59,
+ 60,
+ 61,
+ 62,
+ 63,
+ 90,
+ 91,
+ 92,
+ 93,
+ 359,
+ 503,
+ 522,
+ 542,
+ 873,
+ 893,
+ 902,
+ 918,
+ 922,
+ 931,
+ 1350,
+ 1853,
+ 1982,
+ 2460,
+ 2627,
+ 3246,
+ 3253,
+ 3268,
+ 3536,
+ 3846,
+ 3961,
+ 4183,
+ 4667,
+ 6585,
+ 6647,
+ 7273,
+ 9061,
+ 9383,
+ 10428,
+ 10929,
+ 11938,
+ 12033,
+ 12331,
+ 12562,
+ 13793,
+ 14157,
+ 14635,
+ 15265,
+ 15618,
+ 16553,
+ 16604,
+ 18362,
+ 18956,
+ 20075,
+ 21675,
+ 22520,
+ 26130,
+ 26161,
+ 26435,
+ 28279,
+ 29464,
+ 31650,
+ 32302,
+ 32470,
+ 36865,
+ 42863,
+ 47425,
+ 49870,
+ 50254,
+ 50258,
+ 50358,
+ 50359,
+ 50360,
+ 50361,
+ 50362
+ ],
+ "task_to_id": {
+ "transcribe": 50359,
+ "translate": 50358
+ },
+ "transformers_version": "4.36.0.dev0"
+ }
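The generation_config.json added here is the standard multilingual Whisper generation config: `lang_to_id` and `task_to_id` map the special language and task tokens to ids (for this Russian model the relevant entries are `"<|ru|>": 50263` and `"transcribe": 50359`), `forced_decoder_ids` leaves the language slot open (`[1, null]`) while fixing position 2 to the transcribe token, and `suppress_tokens` lists ids the decoder may never emit. A minimal sketch of how these ids get filled in at decode time, again assuming the repo id used above:

```python
# Sketch of how the ids in generation_config.json are used when forcing Russian transcription.
# The repo id is an assumption; the token ids referenced in the comments come from the config itself.
from transformers import WhisperProcessor, WhisperForConditionalGeneration

model_id = "artyomboyko/whisper-small-fine_tuned-ru-v2-v3"  # assumed
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)

# Fill the language/task slots that forced_decoder_ids leaves half-open; for Russian
# transcription this should yield [(1, 50263), (2, 50359), (3, 50363)], i.e.
# <|ru|>, <|transcribe|>, <|notimestamps|>, matching lang_to_id / task_to_id /
# no_timestamps_token_id in the config above.
forced_ids = processor.get_decoder_prompt_ids(language="russian", task="transcribe")
print(forced_ids)
```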
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:4d69e95f3673b8ec0d3342cdd39e40504cee3e0110a3a29b8e813a9dfa516119
+ oid sha256:330a26fc4b6d70eb1d1e3948e3de2c002d6d10ef2fd94bec793e5690651c9c75
  size 966995080
runs/Dec07_10-26-26_MSK-PC-01/events.out.tfevents.1701933987.MSK-PC-01.461.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:29ace9c615ffd51b7d3bfae629c7614b63e4ac6ac526e1d944883626f410b289
- size 108497
+ oid sha256:02cf540038f6edcea0d16fcdb487b79fb474c77ca4ac7d5e8680e26976999cc1
+ size 112472