fsicoli committed
Commit
5ad6f75
1 Parent(s): f90e3a2

Model save

README.md ADDED
@@ -0,0 +1,111 @@
+ ---
+ license: apache-2.0
+ base_model: openai/whisper-medium
+ tags:
+ - generated_from_trainer
+ metrics:
+ - wer
+ model-index:
+ - name: whisper-medium-pt-3000h
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # whisper-medium-pt-3000h
+
+ This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.9313
+ - Wer: 0.1103
+
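The reported Wer of 0.1103 is a word error rate of roughly 11%, i.e. about one recognition error per nine reference words. A minimal sketch of how this metric is computed with the `evaluate` library (the sample strings below are illustrative Portuguese pairs, not drawn from the actual evaluation set):

```python
# Minimal WER sketch using the `evaluate` library; the prediction/reference
# pairs are made-up examples, not from this model's eval set.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["ola mundo", "bom dia a todos"]
references = ["olá mundo", "bom dia a todos"]

# WER = (substitutions + insertions + deletions) / total reference words
wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```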
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (see the configuration sketch after this list):
+ - learning_rate: 5e-06
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 10000
+ - num_epochs: 10.0
+ - mixed_precision_training: Native AMP
+
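A minimal sketch of how the hyperparameters above map onto `Seq2SeqTrainingArguments` from `transformers`; the training script itself is not part of this commit, so `output_dir` (and all data handling) is an assumption:

```python
# Hedged sketch: reconstructs the listed hyperparameters as
# Seq2SeqTrainingArguments. The output_dir value is an assumption;
# the real training script is not included in this repository.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-medium-pt-3000h",  # assumed output path
    learning_rate=5e-6,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10_000,
    num_train_epochs=10.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```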
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:-----:|:-------:|:---------------:|:------:|
+ | 0.4423 | 0.2 | 20000 | 0.4723 | 0.1633 |
+ | 0.4963 | 0.39 | 40000 | 0.4921 | 0.1547 |
+ | 0.3853 | 0.59 | 60000 | 0.5099 | 0.1470 |
+ | 0.37 | 0.79 | 80000 | 0.4753 | 0.1439 |
+ | 0.3615 | 0.98 | 100000 | 0.5074 | 0.1386 |
+ | 0.2394 | 1.18 | 120000 | 0.4858 | 0.1341 |
+ | 0.227 | 1.38 | 140000 | 0.5758 | 0.1323 |
+ | 0.2461 | 1.57 | 160000 | 0.5067 | 0.1322 |
+ | 0.2078 | 1.77 | 180000 | 0.5087 | 0.1291 |
+ | 0.2138 | 1.97 | 200000 | 0.5201 | 0.1273 |
+ | 0.1188 | 2.16 | 220000 | 0.6359 | 0.1265 |
+ | 0.1009 | 2.36 | 240000 | 0.6229 | 0.1253 |
+ | 0.1394 | 2.56 | 260000 | 0.5734 | 0.1231 |
+ | 0.1383 | 2.75 | 280000 | 0.5914 | 0.1213 |
+ | 0.1332 | 2.95 | 300000 | 0.6174 | 0.1212 |
+ | 0.0634 | 3.15 | 320000 | 0.6461 | 0.1190 |
+ | 0.0667 | 3.34 | 340000 | 0.6330 | 0.1211 |
+ | 0.0546 | 3.54 | 360000 | 0.6927 | 0.1190 |
+ | 0.1029 | 3.74 | 380000 | 0.6777 | 0.1184 |
+ | 0.0664 | 3.93 | 400000 | 0.6367 | 0.1161 |
+ | 0.0665 | 4.13 | 420000 | 0.7467 | 0.1171 |
+ | 0.0695 | 4.33 | 440000 | 0.7332 | 0.1164 |
+ | 0.0708 | 4.52 | 460000 | 0.7141 | 0.1171 |
+ | 0.0695 | 4.72 | 480000 | 0.6869 | 0.1169 |
+ | 0.0758 | 4.92 | 500000 | 0.7360 | 0.1153 |
+ | 0.061 | 5.11 | 520000 | 0.7594 | 0.1161 |
+ | 0.0804 | 5.31 | 540000 | 0.7640 | 0.1158 |
+ | 0.0963 | 5.51 | 560000 | 0.7848 | 0.1157 |
+ | 0.0815 | 5.7 | 580000 | 0.7635 | 0.1145 |
+ | 0.0794 | 5.9 | 600000 | 0.7566 | 0.1134 |
+ | 0.0907 | 6.1 | 620000 | 0.8152 | 0.1147 |
+ | 0.0664 | 6.29 | 640000 | 0.8405 | 0.1123 |
+ | 0.0654 | 6.49 | 660000 | 0.8278 | 0.1119 |
+ | 0.0652 | 6.69 | 680000 | 0.8267 | 0.1134 |
+ | 0.1043 | 6.88 | 700000 | 0.8254 | 0.1122 |
+ | 0.0383 | 7.08 | 720000 | 0.8719 | 0.1122 |
+ | 0.0461 | 7.28 | 740000 | 0.8640 | 0.1130 |
+ | 0.0791 | 7.47 | 760000 | 0.8990 | 0.1122 |
+ | 0.0587 | 7.67 | 780000 | 0.9107 | 0.1122 |
+ | 0.0578 | 7.87 | 800000 | 0.9060 | 0.1124 |
+ | 0.0218 | 8.06 | 820000 | 0.8845 | 0.1111 |
+ | 0.0125 | 8.26 | 840000 | 0.9072 | 0.1112 |
+ | 0.0172 | 8.46 | 860000 | 0.8899 | 0.1107 |
+ | 0.0204 | 8.65 | 880000 | 0.9149 | 0.1108 |
+ | 0.0145 | 8.85 | 900000 | 0.9097 | 0.1103 |
+ | 0.0146 | 9.05 | 920000 | 0.9084 | 0.1107 |
+ | 0.0166 | 9.24 | 940000 | 0.9053 | 0.1103 |
+ | 0.0177 | 9.44 | 960000 | 0.9193 | 0.1100 |
+ | 0.0157 | 9.64 | 980000 | 0.9212 | 0.1101 |
+ | 0.0096 | 9.83 | 1000000 | 0.9313 | 0.1103 |
+
+
+ ### Framework versions
+
+ - Transformers 4.39.0.dev0
+ - Pytorch 2.2.1+cu121
+ - Datasets 2.18.1.dev0
+ - Tokenizers 0.15.0
generation_config.json ADDED
@@ -0,0 +1,249 @@
+ {
+   "alignment_heads": [
+     [
+       13,
+       15
+     ],
+     [
+       15,
+       4
+     ],
+     [
+       15,
+       15
+     ],
+     [
+       16,
+       1
+     ],
+     [
+       20,
+       0
+     ],
+     [
+       23,
+       4
+     ]
+   ],
+   "begin_suppress_tokens": [
+     220,
+     50257
+   ],
+   "bos_token_id": 50257,
+   "decoder_start_token_id": 50258,
+   "eos_token_id": 50257,
+   "forced_decoder_ids": [
+     [
+       1,
+       null
+     ],
+     [
+       2,
+       50359
+     ]
+   ],
+   "is_multilingual": true,
+   "lang_to_id": {
+     "<|af|>": 50327,
+     "<|am|>": 50334,
+     "<|ar|>": 50272,
+     "<|as|>": 50350,
+     "<|az|>": 50304,
+     "<|ba|>": 50355,
+     "<|be|>": 50330,
+     "<|bg|>": 50292,
+     "<|bn|>": 50302,
+     "<|bo|>": 50347,
+     "<|br|>": 50309,
+     "<|bs|>": 50315,
+     "<|ca|>": 50270,
+     "<|cs|>": 50283,
+     "<|cy|>": 50297,
+     "<|da|>": 50285,
+     "<|de|>": 50261,
+     "<|el|>": 50281,
+     "<|en|>": 50259,
+     "<|es|>": 50262,
+     "<|et|>": 50307,
+     "<|eu|>": 50310,
+     "<|fa|>": 50300,
+     "<|fi|>": 50277,
+     "<|fo|>": 50338,
+     "<|fr|>": 50265,
+     "<|gl|>": 50319,
+     "<|gu|>": 50333,
+     "<|haw|>": 50352,
+     "<|ha|>": 50354,
+     "<|he|>": 50279,
+     "<|hi|>": 50276,
+     "<|hr|>": 50291,
+     "<|ht|>": 50339,
+     "<|hu|>": 50286,
+     "<|hy|>": 50312,
+     "<|id|>": 50275,
+     "<|is|>": 50311,
+     "<|it|>": 50274,
+     "<|ja|>": 50266,
+     "<|jw|>": 50356,
+     "<|ka|>": 50329,
+     "<|kk|>": 50316,
+     "<|km|>": 50323,
+     "<|kn|>": 50306,
+     "<|ko|>": 50264,
+     "<|la|>": 50294,
+     "<|lb|>": 50345,
+     "<|ln|>": 50353,
+     "<|lo|>": 50336,
+     "<|lt|>": 50293,
+     "<|lv|>": 50301,
+     "<|mg|>": 50349,
+     "<|mi|>": 50295,
+     "<|mk|>": 50308,
+     "<|ml|>": 50296,
+     "<|mn|>": 50314,
+     "<|mr|>": 50320,
+     "<|ms|>": 50282,
+     "<|mt|>": 50343,
+     "<|my|>": 50346,
+     "<|ne|>": 50313,
+     "<|nl|>": 50271,
+     "<|nn|>": 50342,
+     "<|no|>": 50288,
+     "<|oc|>": 50328,
+     "<|pa|>": 50321,
+     "<|pl|>": 50269,
+     "<|ps|>": 50340,
+     "<|pt|>": 50267,
+     "<|ro|>": 50284,
+     "<|ru|>": 50263,
+     "<|sa|>": 50344,
+     "<|sd|>": 50332,
+     "<|si|>": 50322,
+     "<|sk|>": 50298,
+     "<|sl|>": 50305,
+     "<|sn|>": 50324,
+     "<|so|>": 50326,
+     "<|sq|>": 50317,
+     "<|sr|>": 50303,
+     "<|su|>": 50357,
+     "<|sv|>": 50273,
+     "<|sw|>": 50318,
+     "<|ta|>": 50287,
+     "<|te|>": 50299,
+     "<|tg|>": 50331,
+     "<|th|>": 50289,
+     "<|tk|>": 50341,
+     "<|tl|>": 50348,
+     "<|tr|>": 50268,
+     "<|tt|>": 50351,
+     "<|uk|>": 50280,
+     "<|ur|>": 50290,
+     "<|uz|>": 50337,
+     "<|vi|>": 50278,
+     "<|yi|>": 50335,
+     "<|yo|>": 50325,
+     "<|zh|>": 50260
+   },
+   "language": "<|pt|>",
+   "max_initial_timestamp_index": 50,
+   "max_length": 448,
+   "no_timestamps_token_id": 50363,
+   "pad_token_id": 50257,
+   "prev_sot_token_id": 50361,
+   "return_timestamps": false,
+   "suppress_tokens": [
+     1,
+     2,
+     7,
+     8,
+     9,
+     10,
+     14,
+     25,
+     26,
+     27,
+     28,
+     29,
+     31,
+     58,
+     59,
+     60,
+     61,
+     62,
+     63,
+     90,
+     91,
+     92,
+     93,
+     359,
+     503,
+     522,
+     542,
+     873,
+     893,
+     902,
+     918,
+     922,
+     931,
+     1350,
+     1853,
+     1982,
+     2460,
+     2627,
+     3246,
+     3253,
+     3268,
+     3536,
+     3846,
+     3961,
+     4183,
+     4667,
+     6585,
+     6647,
+     7273,
+     9061,
+     9383,
+     10428,
+     10929,
+     11938,
+     12033,
+     12331,
+     12562,
+     13793,
+     14157,
+     14635,
+     15265,
+     15618,
+     16553,
+     16604,
+     18362,
+     18956,
+     20075,
+     21675,
+     22520,
+     26130,
+     26161,
+     26435,
+     28279,
+     29464,
+     31650,
+     32302,
+     32470,
+     36865,
+     42863,
+     47425,
+     49870,
+     50254,
+     50258,
+     50358,
+     50359,
+     50360,
+     50361,
+     50362
+   ],
+   "task_to_id": {
+     "transcribe": 50359,
+     "translate": 50358
+   },
+   "transformers_version": "4.39.0.dev0"
+ }
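Because `language` is pinned to `<|pt|>` and `forced_decoder_ids` forces the `<|transcribe|>` task token (50359), the checkpoint defaults to Portuguese transcription as soon as it is loaded. A hedged usage sketch (the repo id and audio path are assumptions, not confirmed by this commit):

```python
# Hedged usage sketch: transcribe Portuguese speech with this checkpoint.
# The model id is assumed from the committer and model name; "audio.wav"
# is a placeholder for any 16 kHz mono recording.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="fsicoli/whisper-medium-pt-3000h",  # assumed repo id
    chunk_length_s=30,  # Whisper operates on 30-second windows
)

# language/task come from generation_config.json ("<|pt|>", transcribe),
# so no extra generate_kwargs are needed for Portuguese transcription.
result = asr("audio.wav")
print(result["text"])
```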
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:6e44c5504bbdc21d3b54e7af7c80ec8c73fe4b928f07e8e62936fc7d7956901d
+ oid sha256:6721d621cdda548dac7ee9e55c58d3ce39224fdcd468e7f46c92a1bed4a10509
  size 3055544304
runs/May19_18-02-06_DITEC2014063010/events.out.tfevents.1716179641.DITEC2014063010.1572992.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:78abc056fa10e56227577d98243b177388a828f217f0cacd6c07e2e178ed2b3f
- size 8748080
+ oid sha256:fc4043ffaf20c894ebfc4036cb1f4a83d237b54e41a5876fbd3ad314ea4470f1
+ size 8766070
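The two binary files above are stored as Git LFS pointers, so this commit only swaps the `oid` (the SHA-256 of the actual payload) and, for the TensorBoard log, the `size`. A hedged sketch of checking a downloaded file against its pointer metadata (the local path is a placeholder; the oid and size come from the pointer above):

```python
# Hedged sketch: verify a downloaded LFS object against its pointer metadata.
import hashlib
import os

def verify_lfs_object(path: str, expected_oid: str, expected_size: int) -> bool:
    """Return True if the file matches the pointer's sha256 oid and byte size."""
    if os.path.getsize(path) != expected_size:
        return False
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            h.update(chunk)
    return h.hexdigest() == expected_oid

print(verify_lfs_object(
    "model.safetensors",  # placeholder local path
    "6721d621cdda548dac7ee9e55c58d3ce39224fdcd468e7f46c92a1bed4a10509",
    3055544304,
))
```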