# libri-alpha-0.75-Temp-1-attention-3-layers-distil-with-6-layers-mse-take-4-unfreeze-extractor
This model is a fine-tuned version of [rohitp1/libri-alpha-0.75-Temp-1-attention-3-layers-distil-with-6-layers-mse](https://huggingface.co/rohitp1/libri-alpha-0.75-Temp-1-attention-3-layers-distil-with-6-layers-mse) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 35.4977
- WER: 0.2414
## Model description
More information needed
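The repository name suggests a knowledge-distillation recipe: an interpolation weight (alpha = 0.75), a softmax temperature of 1, attention-map distillation over 3 layers, hidden-state MSE over 6 layers, and an unfrozen feature extractor. The exact loss is not documented in this card, but a speculative sketch of such a combined objective might look like the following (all tensor names, layer selections, and the weighting scheme are assumptions, not confirmed by this card):

```python
import torch.nn.functional as F

def distill_loss(ctc_loss, s_hidden, t_hidden, s_attn, t_attn,
                 alpha=0.75, n_attn=3, n_mse=6):
    """Hypothetical student-teacher loss implied by the model name.

    s_hidden / t_hidden: lists of student / teacher hidden states, each [B, T, H]
    s_attn / t_attn:     lists of student / teacher attention maps, each [B, heads, T, T]
    """
    # MSE between the last n_mse hidden states of the student and the frozen teacher
    h_mse = sum(F.mse_loss(s, t) for s, t in zip(s_hidden[-n_mse:], t_hidden[-n_mse:]))
    # MSE between the last n_attn attention maps
    a_mse = sum(F.mse_loss(s, t) for s, t in zip(s_attn[-n_attn:], t_attn[-n_attn:]))
    # alpha trades the distillation terms off against the supervised CTC loss
    return alpha * (h_mse + a_mse) + (1 - alpha) * ctc_loss
```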
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 40
- mixed_precision_training: Native AMP
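For reference, a minimal sketch of how these values map onto `transformers.TrainingArguments` (the `output_dir` is a placeholder; the Adam betas and epsilon listed above are the library defaults, so they need no explicit arguments):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder path
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=2,   # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_ratio=0.2,
    num_train_epochs=40,
    fp16=True,                       # native AMP mixed precision
)
```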
### Training results
Training Loss | Epoch | Step | Validation Loss | WER |
---|---|---|---|---|
557.7442 | 0.45 | 400 | 28.5437 | 0.3344 |
576.4579 | 0.9 | 800 | 28.0582 | 0.3313 |
557.924 | 1.35 | 1200 | 28.2032 | 0.3285 |
558.8386 | 1.79 | 1600 | 28.0733 | 0.3327 |
583.0312 | 2.24 | 2000 | 28.3506 | 0.3254 |
559.6182 | 2.69 | 2400 | 27.7517 | 0.3245 |
555.811 | 3.14 | 2800 | 28.1994 | 0.3275 |
555.9074 | 3.59 | 3200 | 28.2289 | 0.3267 |
569.4283 | 4.04 | 3600 | 27.9987 | 0.3247 |
523.5996 | 4.48 | 4000 | 27.9328 | 0.3178 |
543.8255 | 4.93 | 4400 | 28.0181 | 0.3192 |
508.707 | 5.38 | 4800 | 27.8667 | 0.3172 |
518.0536 | 5.83 | 5200 | 28.0461 | 0.3120 |
516.7025 | 6.28 | 5600 | 28.6324 | 0.3193 |
509.9804 | 6.73 | 6000 | 28.8554 | 0.3202 |
522.2005 | 7.17 | 6400 | 28.4986 | 0.3173 |
501.0925 | 7.62 | 6800 | 28.5744 | 0.3095 |
506.2044 | 8.07 | 7200 | 29.1753 | 0.3108 |
464.1213 | 8.52 | 7600 | 28.5564 | 0.3080 |
483.3067 | 8.97 | 8000 | 28.3099 | 0.3063 |
463.7952 | 9.42 | 8400 | 28.4788 | 0.2990 |
474.824 | 9.87 | 8800 | 27.5007 | 0.2959 |
441.7981 | 10.31 | 9200 | 28.3279 | 0.2906 |
445.6532 | 10.76 | 9600 | 27.6901 | 0.2881 |
427.3226 | 11.21 | 10000 | 28.5749 | 0.2860 |
419.5903 | 11.66 | 10400 | 27.3023 | 0.2825 |
425.3329 | 12.11 | 10800 | 28.3225 | 0.2803 |
401.3551 | 12.56 | 11200 | 28.1836 | 0.2814 |
409.8571 | 13.0 | 11600 | 27.9721 | 0.2806 |
382.0269 | 13.45 | 12000 | 28.2285 | 0.2798 |
363.1065 | 13.9 | 12400 | 28.9252 | 0.2821 |
386.975 | 14.35 | 12800 | 28.7444 | 0.2778 |
370.1886 | 14.8 | 13200 | 28.3816 | 0.2738 |
385.9398 | 15.25 | 13600 | 29.5411 | 0.2759 |
347.4368 | 15.7 | 14000 | 28.5876 | 0.2710 |
338.2872 | 16.14 | 14400 | 28.9052 | 0.2709 |
347.3471 | 16.59 | 14800 | 28.3766 | 0.2679 |
344.1634 | 17.04 | 15200 | 29.3270 | 0.2669 |
333.9699 | 17.49 | 15600 | 29.2184 | 0.2656 |
326.7914 | 17.94 | 16000 | 29.4644 | 0.2659 |
328.6156 | 18.39 | 16400 | 30.1155 | 0.2686 |
314.8902 | 18.83 | 16800 | 29.8135 | 0.2653 |
320.2311 | 19.28 | 17200 | 30.4169 | 0.2654 |
311.5116 | 19.73 | 17600 | 30.7323 | 0.2654 |
320.7442 | 20.18 | 18000 | 30.3148 | 0.2616 |
310.1395 | 20.63 | 18400 | 30.3432 | 0.2626 |
298.6844 | 21.08 | 18800 | 30.3217 | 0.2611 |
294.7287 | 21.52 | 19200 | 30.4799 | 0.2574 |
301.9398 | 21.97 | 19600 | 29.9043 | 0.2562 |
285.6117 | 22.42 | 20000 | 30.6270 | 0.2574 |
299.511 | 22.87 | 20400 | 30.4342 | 0.2580 |
271.373 | 23.32 | 20800 | 31.1784 | 0.2583 |
289.4111 | 23.77 | 21200 | 30.8436 | 0.2562 |
266.0083 | 24.22 | 21600 | 31.6785 | 0.2576 |
271.6104 | 24.66 | 22000 | 31.7733 | 0.2565 |
280.7621 | 25.11 | 22400 | 32.7097 | 0.2564 |
254.1648 | 25.56 | 22800 | 33.1091 | 0.2564 |
276.6574 | 26.01 | 23200 | 31.9279 | 0.2539 |
277.4295 | 26.46 | 23600 | 32.4169 | 0.2522 |
268.0675 | 26.91 | 24000 | 32.5259 | 0.2510 |
249.2665 | 27.35 | 24400 | 32.4788 | 0.2508 |
277.0122 | 27.8 | 24800 | 32.7013 | 0.2517 |
250.1679 | 28.25 | 25200 | 32.4869 | 0.2524 |
242.7224 | 28.7 | 25600 | 32.2633 | 0.2521 |
250.325 | 29.15 | 26000 | 33.0046 | 0.2491 |
233.9489 | 29.6 | 26400 | 32.7155 | 0.2485 |
246.6027 | 30.04 | 26800 | 33.6882 | 0.2485 |
244.4221 | 30.49 | 27200 | 34.2592 | 0.2492 |
239.4369 | 30.94 | 27600 | 33.6288 | 0.2492 |
239.1851 | 31.39 | 28000 | 34.0746 | 0.2484 |
234.8415 | 31.84 | 28400 | 34.1040 | 0.2466 |
225.2858 | 32.29 | 28800 | 34.6926 | 0.2483 |
241.6866 | 32.74 | 29200 | 34.0598 | 0.2474 |
224.4263 | 33.18 | 29600 | 34.8568 | 0.2459 |
227.2052 | 33.63 | 30000 | 34.8061 | 0.2456 |
226.6837 | 34.08 | 30400 | 34.9184 | 0.2450 |
219.9877 | 34.53 | 30800 | 34.8988 | 0.2441 |
225.5292 | 34.98 | 31200 | 34.9351 | 0.2447 |
215.8455 | 35.43 | 31600 | 34.9351 | 0.2437 |
210.303 | 35.87 | 32000 | 35.0217 | 0.2439 |
230.9594 | 36.32 | 32400 | 35.4323 | 0.2449 |
207.6091 | 36.77 | 32800 | 35.1739 | 0.2439 |
202.487 | 37.22 | 33200 | 35.3531 | 0.2441 |
209.1144 | 37.67 | 33600 | 35.4137 | 0.2419 |
212.8689 | 38.12 | 34000 | 35.4311 | 0.2434 |
201.1868 | 38.57 | 34400 | 35.6746 | 0.2426 |
206.6466 | 39.01 | 34800 | 35.5530 | 0.2420 |
218.2249 | 39.46 | 35200 | 35.4107 | 0.2415 |
226.1933 | 39.91 | 35600 | 35.4977 | 0.2414 |
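Assuming this checkpoint is a CTC-style speech-recognition model (the LibriSpeech-derived name suggests as much), a minimal sketch of reproducing the WER metric with the `evaluate` library on your own audio follows; `sample.flac` and the reference transcript are placeholders:

```python
import evaluate
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="rohitp1/libri-alpha-0.75-Temp-1-attention-3-layers-distil-with-6-layers-mse-take-4-unfreeze-extractor",
)
wer = evaluate.load("wer")

prediction = asr("sample.flac")["text"]   # placeholder audio file
reference = "the expected transcript"     # placeholder ground truth
print(wer.compute(predictions=[prediction.lower()], references=[reference.lower()]))
```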
### Framework versions
- Transformers 4.24.0
- PyTorch 1.12.1
- Datasets 2.7.1
- Tokenizers 0.11.0