
exp20-M04-both

This model is a fine-tuned version of yongjian/wav2vec2-large-a. The training dataset is not specified in the card metadata. It achieves the following results on the evaluation set:

  • Loss: 3.3358
  • WER: 1.1374
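
A minimal inference sketch is shown below. The repo id `exp20-M04-both` and the audio path are placeholders (the card does not give the full hub id), and the snippet assumes the checkpoint was saved together with its Wav2Vec2 processor files:

```python
# Minimal inference sketch for a fine-tuned wav2vec2 CTC model.
# "exp20-M04-both" and "sample.wav" are placeholders, not confirmed paths.
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "exp20-M04-both"  # hypothetical: replace with the actual hub repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2 models expect 16 kHz mono audio
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax over the vocabulary at each frame
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```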

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows this list):

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 30
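
A sketch of how these hyperparameters map onto `TrainingArguments`, assuming the Hugging Face `Trainer` API was used (which the card's auto-generated format suggests); model and dataset wiring are omitted, and the 500-step evaluation interval is an assumption inferred from the results table below:

```python
# Sketch only: reproduces the listed hyperparameters, not the full training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="exp20-M04-both",    # hypothetical output directory
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                 # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=30,
    evaluation_strategy="steps",    # assumption: evaluate every 500 steps,
    eval_steps=500,                 # consistent with the results table below
    logging_steps=500,
)
```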

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 37.2423       | 0.34  | 500   | 3.2577          | 1.0    |
| 3.1334        | 0.68  | 1000  | 3.0084          | 1.0    |
| 2.9616        | 1.02  | 1500  | 2.9946          | 1.0    |
| 2.8067        | 1.36  | 2000  | 2.6228          | 1.3130 |
| 2.732         | 1.7   | 2500  | 2.5059          | 1.5013 |
| 2.3673        | 2.04  | 3000  | 2.1828          | 1.4911 |
| 2.1378        | 2.38  | 3500  | 2.2066          | 1.4911 |
| 1.9853        | 2.72  | 4000  | 1.9877          | 1.4580 |
| 1.8574        | 3.06  | 4500  | 1.8850          | 1.4656 |
| 1.7085        | 3.4   | 5000  | 1.9121          | 1.4606 |
| 1.6161        | 3.74  | 5500  | 2.1036          | 1.4326 |
| 1.5304        | 4.08  | 6000  | 1.9807          | 1.4478 |
| 1.3531        | 4.42  | 6500  | 2.0211          | 1.4656 |
| 1.3269        | 4.77  | 7000  | 1.9231          | 1.3893 |
| 1.2312        | 5.11  | 7500  | 2.2652          | 1.4097 |
| 1.1161        | 5.45  | 8000  | 1.9543          | 1.4529 |
| 1.0305        | 5.79  | 8500  | 2.1463          | 1.4071 |
| 0.9403        | 6.13  | 9000  | 3.7872          | 1.4071 |
| 0.8723        | 6.47  | 9500  | 2.8466          | 1.4326 |
| 0.8752        | 6.81  | 10000 | 2.2215          | 1.3766 |
| 0.7774        | 7.15  | 10500 | 2.0462          | 1.3257 |
| 0.74          | 7.49  | 11000 | 2.1928          | 1.3333 |
| 0.7371        | 7.83  | 11500 | 2.8058          | 1.3410 |
| 0.7075        | 8.17  | 12000 | 2.3100          | 1.3308 |
| 0.6746        | 8.51  | 12500 | 2.6284          | 1.2875 |
| 0.6233        | 8.85  | 13000 | 2.2268          | 1.3003 |
| 0.7172        | 9.19  | 13500 | 2.1980          | 1.2926 |
| 0.5697        | 9.53  | 14000 | 2.1950          | 1.2468 |
| 0.5691        | 9.87  | 14500 | 2.1819          | 1.2316 |
| 0.5062        | 10.21 | 15000 | 2.1426          | 1.2621 |
| 0.4818        | 10.55 | 15500 | 2.2259          | 1.2545 |
| 0.5083        | 10.89 | 16000 | 2.1764          | 1.2214 |
| 0.3901        | 11.23 | 16500 | 2.2412          | 1.2341 |
| 0.4275        | 11.57 | 17000 | 2.3781          | 1.2290 |
| 0.4225        | 11.91 | 17500 | 2.1578          | 1.2443 |
| 0.4106        | 12.25 | 18000 | 2.5651          | 1.2341 |
| 0.3933        | 12.59 | 18500 | 2.1819          | 1.2265 |
| 0.3821        | 12.93 | 19000 | 2.0564          | 1.1934 |
| 0.3584        | 13.27 | 19500 | 2.5475          | 1.2290 |
| 0.3468        | 13.61 | 20000 | 2.5857          | 1.1781 |
| 0.3984        | 13.96 | 20500 | 2.2383          | 1.2239 |
| 0.308         | 14.3  | 21000 | 2.4947          | 1.2137 |
| 0.3356        | 14.64 | 21500 | 2.6563          | 1.2163 |
| 0.3406        | 14.98 | 22000 | 2.3337          | 1.2061 |
| 0.3297        | 15.32 | 22500 | 2.2793          | 1.1908 |
| 0.3028        | 15.66 | 23000 | 2.6462          | 1.1654 |
| 0.3226        | 16.0  | 23500 | 2.3785          | 1.1705 |
| 0.2605        | 16.34 | 24000 | 2.7212          | 1.1858 |
| 0.2669        | 16.68 | 24500 | 3.0365          | 1.2087 |
| 0.2967        | 17.02 | 25000 | 2.4898          | 1.1934 |
| 0.2547        | 17.36 | 25500 | 2.4020          | 1.1832 |
| 0.2779        | 17.7  | 26000 | 2.5558          | 1.1705 |
| 0.2341        | 18.04 | 26500 | 2.9406          | 1.1934 |
| 0.2304        | 18.38 | 27000 | 3.1528          | 1.1603 |
| 0.226         | 18.72 | 27500 | 3.0001          | 1.2163 |
| 0.2319        | 19.06 | 28000 | 3.0117          | 1.1603 |
| 0.1836        | 19.4  | 28500 | 2.8332          | 1.1858 |
| 0.2085        | 19.74 | 29000 | 2.8757          | 1.1603 |
| 0.2383        | 20.08 | 29500 | 3.2235          | 1.1934 |
| 0.2006        | 20.42 | 30000 | 3.0189          | 1.1603 |
| 0.1722        | 20.76 | 30500 | 2.8001          | 1.1527 |
| 0.1955        | 21.1  | 31000 | 3.0401          | 1.1578 |
| 0.1839        | 21.44 | 31500 | 3.2621          | 1.1578 |
| 0.1592        | 21.78 | 32000 | 3.1740          | 1.1552 |
| 0.1835        | 22.12 | 32500 | 3.3974          | 1.1934 |
| 0.197         | 22.46 | 33000 | 2.8283          | 1.1425 |
| 0.1788        | 22.8  | 33500 | 3.1983          | 1.1705 |
| 0.169         | 23.14 | 34000 | 3.1978          | 1.1425 |
| 0.1649        | 23.49 | 34500 | 3.1829          | 1.1552 |
| 0.1431        | 23.83 | 35000 | 3.0528          | 1.1272 |
| 0.1384        | 24.17 | 35500 | 3.3792          | 1.1196 |
| 0.1234        | 24.51 | 36000 | 3.3988          | 1.1425 |
| 0.1552        | 24.85 | 36500 | 3.1008          | 1.1170 |
| 0.124         | 25.19 | 37000 | 2.9486          | 1.1374 |
| 0.1439        | 25.53 | 37500 | 3.1028          | 1.1323 |
| 0.1612        | 25.87 | 38000 | 3.0209          | 1.1043 |
| 0.1456        | 26.21 | 38500 | 2.9466          | 1.1323 |
| 0.1333        | 26.55 | 39000 | 3.1298          | 1.1221 |
| 0.1368        | 26.89 | 39500 | 3.1051          | 1.1272 |
| 0.1263        | 27.23 | 40000 | 3.2888          | 1.1298 |
| 0.1198        | 27.57 | 40500 | 3.0984          | 1.1298 |
| 0.1202        | 27.91 | 41000 | 3.1653          | 1.1374 |
| 0.1252        | 28.25 | 41500 | 3.3016          | 1.1552 |
| 0.1177        | 28.59 | 42000 | 3.2566          | 1.1349 |
| 0.1072        | 28.93 | 42500 | 3.3303          | 1.1425 |
| 0.1497        | 29.27 | 43000 | 3.2549          | 1.1399 |
| 0.1089        | 29.61 | 43500 | 3.3121          | 1.1374 |
| 0.0936        | 29.95 | 44000 | 3.3358          | 1.1374 |
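
Note that the reported WER values exceed 1.0. This is possible because WER = (substitutions + deletions + insertions) / number of reference words, and insertions are unbounded. A quick illustration with the `jiwer` library (an assumption; the card does not state which WER implementation was used):

```python
# WER = (S + D + I) / N; insertions can push it above 1.0.
# jiwer is an assumption here; the card does not say which library computed WER.
import jiwer

reference = "hello world"
hypothesis = "a hello there world again"
print(jiwer.wer(reference, hypothesis))  # 1.5: 3 insertions over 2 reference words
```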

Framework versions

  • Transformers 4.23.1
  • PyTorch 1.12.1+cu113
  • Datasets 1.18.3
  • Tokenizers 0.13.2