makhataei committed
Commit fb7edfb
Parent(s): 210aec4

End of training
README.md ADDED
@@ -0,0 +1,165 @@
+ ---
+ language:
+ - fa
+ license: apache-2.0
+ base_model: makhataei/Whisper-Small-Common-Voice
+ tags:
+ - fa-asr
+ - generated_from_trainer
+ metrics:
+ - wer
+ model-index:
+ - name: Whisper Small Persian
+   results: []
+ ---
+ 
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+ 
+ # Whisper Small Persian
+ 
+ This model is a fine-tuned version of [makhataei/Whisper-Small-Common-Voice](https://huggingface.co/makhataei/Whisper-Small-Common-Voice) on the Ctejarat dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.4755
+ - Wer: 26.8240
+ 
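A minimal inference sketch for this checkpoint (the repository id below is a placeholder for wherever the fine-tuned model is published, and the audio path is illustrative):

```python
# Hedged usage sketch with the transformers ASR pipeline (transformers 4.35.x).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="makhataei/Whisper-Small-Persian",  # placeholder repo id; substitute the actual checkpoint
)

# Pin decoding to Persian transcription so the multilingual model neither
# auto-detects the language nor translates to English.
result = asr(
    "sample.wav",  # illustrative path to a short audio file
    generate_kwargs={"language": "persian", "task": "transcribe"},
)
print(result["text"])
```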
+ ## Model description
+ 
+ More information needed
+ 
+ ## Intended uses & limitations
+ 
+ More information needed
+ 
+ ## Training and evaluation data
+ 
+ More information needed
+ 
+ ## Training procedure
+ 
+ ### Training hyperparameters
+ 
+ The following hyperparameters were used during training:
+ - learning_rate: 1e-07
+ - train_batch_size: 11
+ - eval_batch_size: 11
+ - seed: 42
+ - gradient_accumulation_steps: 8
+ - total_train_batch_size: 88
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 500
+ - training_steps: 10000
+ 
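For orientation, the list above maps onto `Seq2SeqTrainingArguments` roughly as in the sketch below. This is an assumption-laden reconstruction, not the actual training script; settings the card does not record (output path, mixed precision, evaluation cadence) are placeholders or guesses:

```python
# Hedged reconstruction of the hyperparameters above (transformers 4.35.x).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-persian",  # placeholder
    learning_rate=1e-07,
    per_device_train_batch_size=11,
    per_device_eval_batch_size=11,
    gradient_accumulation_steps=8,   # 11 * 8 = 88 total train batch size
    warmup_steps=500,
    max_steps=10000,
    lr_scheduler_type="linear",
    seed=42,
    # Adam with betas (0.9, 0.999) and eps 1e-08 is the library's default optimizer setup.
    evaluation_strategy="steps",     # assumed from the eval entries every 100 steps
    eval_steps=100,
    predict_with_generate=True,      # assumed; needed to compute WER on generated transcripts
)
```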
+ ### Training results
+ 
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:-------:|:-----:|:---------------:|:-------:|
+ | 0.9585 | 47.06 | 100 | 0.7643 | 39.2704 |
+ | 0.8272 | 94.12 | 200 | 0.7201 | 39.0558 |
+ | 0.6499 | 141.18 | 300 | 0.6593 | 39.6996 |
+ | 0.4717 | 188.24 | 400 | 0.5965 | 36.2661 |
+ | 0.2999 | 235.29 | 500 | 0.5380 | 34.3348 |
+ | 0.1746 | 282.35 | 600 | 0.4967 | 34.9785 |
+ | 0.0995 | 329.41 | 700 | 0.4795 | 35.8369 |
+ | 0.0509 | 376.47 | 800 | 0.4714 | 33.0472 |
+ | 0.0248 | 423.53 | 900 | 0.4669 | 30.9013 |
+ | 0.0144 | 470.59 | 1000 | 0.4643 | 30.6867 |
+ | 0.0093 | 517.65 | 1100 | 0.4625 | 29.8283 |
+ | 0.0066 | 564.71 | 1200 | 0.4618 | 29.3991 |
+ | 0.0051 | 611.76 | 1300 | 0.4616 | 29.6137 |
+ | 0.004 | 658.82 | 1400 | 0.4615 | 29.3991 |
+ | 0.0034 | 705.88 | 1500 | 0.4616 | 28.7554 |
+ | 0.0026 | 752.94 | 1600 | 0.4618 | 29.1845 |
+ | 0.0022 | 800.0 | 1700 | 0.4620 | 28.7554 |
+ | 0.0019 | 847.06 | 1800 | 0.4622 | 28.7554 |
+ | 0.0017 | 894.12 | 1900 | 0.4623 | 28.7554 |
+ | 0.0015 | 941.18 | 2000 | 0.4626 | 28.7554 |
+ | 0.0013 | 988.24 | 2100 | 0.4628 | 28.7554 |
+ | 0.0012 | 1035.29 | 2200 | 0.4630 | 28.3262 |
+ | 0.0011 | 1082.35 | 2300 | 0.4633 | 28.3262 |
+ | 0.001 | 1129.41 | 2400 | 0.4634 | 28.3262 |
+ | 0.0009 | 1176.47 | 2500 | 0.4636 | 28.3262 |
+ | 0.0008 | 1223.53 | 2600 | 0.4638 | 28.3262 |
+ | 0.0008 | 1270.59 | 2700 | 0.4640 | 27.8970 |
+ | 0.0007 | 1317.65 | 2800 | 0.4641 | 28.3262 |
+ | 0.0007 | 1364.71 | 2900 | 0.4644 | 28.3262 |
+ | 0.0006 | 1411.76 | 3000 | 0.4645 | 28.1116 |
+ | 0.0006 | 1458.82 | 3100 | 0.4647 | 27.8970 |
+ | 0.0005 | 1505.88 | 3200 | 0.4648 | 27.8970 |
+ | 0.0005 | 1552.94 | 3300 | 0.4650 | 28.1116 |
+ | 0.0005 | 1600.0 | 3400 | 0.4652 | 28.1116 |
+ | 0.0005 | 1647.06 | 3500 | 0.4654 | 27.8970 |
+ | 0.0004 | 1694.12 | 3600 | 0.4656 | 27.8970 |
+ | 0.0004 | 1741.18 | 3700 | 0.4657 | 27.8970 |
+ | 0.0004 | 1788.24 | 3800 | 0.4659 | 27.8970 |
+ | 0.0004 | 1835.29 | 3900 | 0.4661 | 27.4678 |
+ | 0.0004 | 1882.35 | 4000 | 0.4662 | 27.4678 |
+ | 0.0003 | 1929.41 | 4100 | 0.4664 | 27.4678 |
+ | 0.0003 | 1976.47 | 4200 | 0.4666 | 27.4678 |
+ | 0.0003 | 2023.53 | 4300 | 0.4668 | 27.4678 |
+ | 0.0003 | 2070.59 | 4400 | 0.4670 | 27.4678 |
+ | 0.0003 | 2117.65 | 4500 | 0.4672 | 27.4678 |
+ | 0.0003 | 2164.71 | 4600 | 0.4674 | 27.4678 |
+ | 0.0002 | 2211.76 | 4700 | 0.4676 | 27.4678 |
+ | 0.0002 | 2258.82 | 4800 | 0.4678 | 27.2532 |
+ | 0.0002 | 2305.88 | 4900 | 0.4680 | 27.2532 |
+ | 0.0002 | 2352.94 | 5000 | 0.4682 | 27.0386 |
+ | 0.0002 | 2400.0 | 5100 | 0.4684 | 27.0386 |
+ | 0.0002 | 2447.06 | 5200 | 0.4685 | 27.0386 |
+ | 0.0002 | 2494.12 | 5300 | 0.4688 | 27.0386 |
+ | 0.0002 | 2541.18 | 5400 | 0.4689 | 27.0386 |
+ | 0.0002 | 2588.24 | 5500 | 0.4691 | 27.0386 |
+ | 0.0002 | 2635.29 | 5600 | 0.4693 | 27.0386 |
+ | 0.0002 | 2682.35 | 5700 | 0.4695 | 27.0386 |
+ | 0.0002 | 2729.41 | 5800 | 0.4697 | 27.0386 |
+ | 0.0002 | 2776.47 | 5900 | 0.4699 | 27.0386 |
+ | 0.0002 | 2823.53 | 6000 | 0.4700 | 27.0386 |
+ | 0.0001 | 2870.59 | 6100 | 0.4702 | 27.0386 |
+ | 0.0001 | 2917.65 | 6200 | 0.4704 | 27.0386 |
+ | 0.0001 | 2964.71 | 6300 | 0.4706 | 27.0386 |
+ | 0.0001 | 3011.76 | 6400 | 0.4708 | 27.0386 |
+ | 0.0001 | 3058.82 | 6500 | 0.4710 | 27.0386 |
+ | 0.0001 | 3105.88 | 6600 | 0.4712 | 27.2532 |
+ | 0.0001 | 3152.94 | 6700 | 0.4714 | 27.2532 |
+ | 0.0001 | 3200.0 | 6800 | 0.4716 | 27.0386 |
+ | 0.0001 | 3247.06 | 6900 | 0.4718 | 27.0386 |
+ | 0.0001 | 3294.12 | 7000 | 0.4720 | 27.0386 |
+ | 0.0001 | 3341.18 | 7100 | 0.4721 | 27.0386 |
+ | 0.0001 | 3388.24 | 7200 | 0.4723 | 26.8240 |
+ | 0.0001 | 3435.29 | 7300 | 0.4725 | 26.8240 |
+ | 0.0001 | 3482.35 | 7400 | 0.4727 | 26.6094 |
+ | 0.0001 | 3529.41 | 7500 | 0.4728 | 26.6094 |
+ | 0.0001 | 3576.47 | 7600 | 0.4730 | 26.6094 |
+ | 0.0001 | 3623.53 | 7700 | 0.4732 | 26.6094 |
+ | 0.0001 | 3670.59 | 7800 | 0.4733 | 26.6094 |
+ | 0.0001 | 3717.65 | 7900 | 0.4735 | 26.6094 |
+ | 0.0001 | 3764.71 | 8000 | 0.4736 | 26.6094 |
+ | 0.0001 | 3811.76 | 8100 | 0.4738 | 26.6094 |
+ | 0.0001 | 3858.82 | 8200 | 0.4739 | 26.6094 |
+ | 0.0001 | 3905.88 | 8300 | 0.4741 | 26.6094 |
+ | 0.0001 | 3952.94 | 8400 | 0.4742 | 26.6094 |
+ | 0.0001 | 4000.0 | 8500 | 0.4744 | 26.6094 |
+ | 0.0001 | 4047.06 | 8600 | 0.4745 | 26.6094 |
+ | 0.0001 | 4094.12 | 8700 | 0.4746 | 26.6094 |
+ | 0.0001 | 4141.18 | 8800 | 0.4748 | 26.6094 |
+ | 0.0001 | 4188.24 | 8900 | 0.4749 | 26.6094 |
+ | 0.0001 | 4235.29 | 9000 | 0.4750 | 26.6094 |
+ | 0.0001 | 4282.35 | 9100 | 0.4751 | 26.6094 |
+ | 0.0001 | 4329.41 | 9200 | 0.4751 | 27.0386 |
+ | 0.0001 | 4376.47 | 9300 | 0.4752 | 27.0386 |
+ | 0.0001 | 4423.53 | 9400 | 0.4753 | 27.0386 |
+ | 0.0001 | 4470.59 | 9500 | 0.4754 | 27.0386 |
+ | 0.0001 | 4517.65 | 9600 | 0.4754 | 27.0386 |
+ | 0.0001 | 4564.71 | 9700 | 0.4755 | 26.8240 |
+ | 0.0001 | 4611.76 | 9800 | 0.4755 | 26.8240 |
+ | 0.0001 | 4658.82 | 9900 | 0.4755 | 26.8240 |
+ | 0.0001 | 4705.88 | 10000 | 0.4755 | 26.8240 |
+ 
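The Wer column above is a word error rate expressed as a percentage. As a hedged illustration (the metric code actually used during training is not shown here), such a score can be computed with the `evaluate` library:

```python
# Illustrative WER computation; predictions/references are placeholder transcripts.
import evaluate

wer_metric = evaluate.load("wer")
predictions = ["predicted transcript"]  # decoded model output
references = ["reference transcript"]   # ground-truth transcription
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```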
+ ### Framework versions
+ 
+ - Transformers 4.35.2
+ - Pytorch 2.0.1+cu117
+ - Datasets 2.15.0
+ - Tokenizers 0.15.0
generation_config.json ADDED
@@ -0,0 +1,263 @@
+ {
+   "alignment_heads": [[5, 3], [5, 9], [8, 0], [8, 4], [8, 7], [8, 8], [9, 0], [9, 7], [9, 9], [10, 5]],
+   "begin_suppress_tokens": [220, 50257],
+   "bos_token_id": 50257,
+   "decoder_start_token_id": 50258,
+   "eos_token_id": 50257,
+   "forced_decoder_ids": [[1, null], [2, 50359]],
+   "is_multilingual": true,
+   "lang_to_id": {
+     "<|af|>": 50327, "<|am|>": 50334, "<|ar|>": 50272, "<|as|>": 50350, "<|az|>": 50304, "<|ba|>": 50355,
+     "<|be|>": 50330, "<|bg|>": 50292, "<|bn|>": 50302, "<|bo|>": 50347, "<|br|>": 50309, "<|bs|>": 50315,
+     "<|ca|>": 50270, "<|cs|>": 50283, "<|cy|>": 50297, "<|da|>": 50285, "<|de|>": 50261, "<|el|>": 50281,
+     "<|en|>": 50259, "<|es|>": 50262, "<|et|>": 50307, "<|eu|>": 50310, "<|fa|>": 50300, "<|fi|>": 50277,
+     "<|fo|>": 50338, "<|fr|>": 50265, "<|gl|>": 50319, "<|gu|>": 50333, "<|haw|>": 50352, "<|ha|>": 50354,
+     "<|he|>": 50279, "<|hi|>": 50276, "<|hr|>": 50291, "<|ht|>": 50339, "<|hu|>": 50286, "<|hy|>": 50312,
+     "<|id|>": 50275, "<|is|>": 50311, "<|it|>": 50274, "<|ja|>": 50266, "<|jw|>": 50356, "<|ka|>": 50329,
+     "<|kk|>": 50316, "<|km|>": 50323, "<|kn|>": 50306, "<|ko|>": 50264, "<|la|>": 50294, "<|lb|>": 50345,
+     "<|ln|>": 50353, "<|lo|>": 50336, "<|lt|>": 50293, "<|lv|>": 50301, "<|mg|>": 50349, "<|mi|>": 50295,
+     "<|mk|>": 50308, "<|ml|>": 50296, "<|mn|>": 50314, "<|mr|>": 50320, "<|ms|>": 50282, "<|mt|>": 50343,
+     "<|my|>": 50346, "<|ne|>": 50313, "<|nl|>": 50271, "<|nn|>": 50342, "<|no|>": 50288, "<|oc|>": 50328,
+     "<|pa|>": 50321, "<|pl|>": 50269, "<|ps|>": 50340, "<|pt|>": 50267, "<|ro|>": 50284, "<|ru|>": 50263,
+     "<|sa|>": 50344, "<|sd|>": 50332, "<|si|>": 50322, "<|sk|>": 50298, "<|sl|>": 50305, "<|sn|>": 50324,
+     "<|so|>": 50326, "<|sq|>": 50317, "<|sr|>": 50303, "<|su|>": 50357, "<|sv|>": 50273, "<|sw|>": 50318,
+     "<|ta|>": 50287, "<|te|>": 50299, "<|tg|>": 50331, "<|th|>": 50289, "<|tk|>": 50341, "<|tl|>": 50348,
+     "<|tr|>": 50268, "<|tt|>": 50351, "<|uk|>": 50280, "<|ur|>": 50290, "<|uz|>": 50337, "<|vi|>": 50278,
+     "<|yi|>": 50335, "<|yo|>": 50325, "<|zh|>": 50260
+   },
+   "max_initial_timestamp_index": 1,
+   "max_length": 448,
+   "no_timestamps_token_id": 50363,
+   "pad_token_id": 50257,
+   "return_timestamps": false,
+   "suppress_tokens": [
+     1, 2, 7, 8, 9, 10, 14, 25, 26, 27, 28, 29, 31, 58, 59, 60, 61, 62, 63, 90, 91, 92, 93,
+     359, 503, 522, 542, 873, 893, 902, 918, 922, 931, 1350, 1853, 1982, 2460, 2627,
+     3246, 3253, 3268, 3536, 3846, 3961, 4183, 4667, 6585, 6647, 7273, 9061, 9383,
+     10428, 10929, 11938, 12033, 12331, 12562, 13793, 14157, 14635, 15265, 15618,
+     16553, 16604, 18362, 18956, 20075, 21675, 22520, 26130, 26161, 26435, 28279,
+     29464, 31650, 32302, 32470, 36865, 42863, 47425, 49870, 50254, 50258, 50358,
+     50359, 50360, 50361, 50362
+   ],
+   "task_to_id": {
+     "transcribe": 50359,
+     "translate": 50358
+   },
+   "transformers_version": "4.35.2"
+ }
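The lang_to_id and task_to_id maps above are what allow generation to be pinned to Persian transcription. A minimal sketch of how they are typically wired up at inference time (the repository id is a placeholder, not part of this configuration):

```python
# Hedged sketch: resolve the <|fa|> and <|transcribe|> prompt tokens via the processor.
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_id = "makhataei/Whisper-Small-Persian"  # placeholder repo id
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)

# Resolves to token ids 50300 (<|fa|>) and 50359 (<|transcribe|>) in the config above.
forced_ids = processor.get_decoder_prompt_ids(language="persian", task="transcribe")
model.generation_config.forced_decoder_ids = forced_ids
```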
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:24b7083dd7c8b8062d85f779c67fcfe6bab096f1e67fcbec8efbfad56e308f67
+ oid sha256:7a2d0ac49e91d94036448a3045656d7f60f9d02a49a489919804b58faca56068
  size 966995080
runs/Mar10_10-45-41_Software-AI/events.out.tfevents.1710054941.Software-AI.2060343.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:6e2dab93de28b5fb9d41927a8a2eec49aa3d8ebbd7c1b667bf0711aa60b8259c
- size 98780
+ oid sha256:e8c26395a273938018d9cd23501a899ded65b5b45380436f67418a80dffc4734
+ size 100080