
libri-alpha-0.5-Temp-1-mse

This model is a fine-tuned version of an unspecified base model on an unspecified dataset (the auto-generated card records it as None). It achieves the following results on the evaluation set:

  • Loss: 28.9681
  • WER: 0.1160

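The WER value above is word error rate. The card does not say how it was computed; a minimal sketch using the Hugging Face evaluate library, with purely illustrative strings:

```python
# Illustrative WER computation. Assumption: the card does not state how the
# reported WER was obtained; this only shows what the metric measures.
import evaluate

wer_metric = evaluate.load("wer")
score = wer_metric.compute(
    predictions=["the cat sat on the mat"],  # hypothetical model outputs
    references=["the cat sat on a mat"],     # hypothetical ground truth
)
print(score)  # 1 substitution over 6 reference words -> ~0.1667
```
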
Model description

More information needed

Intended uses & limitations

More information needed
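
While the card leaves usage undocumented, the LibriSpeech-style name and the WER metric suggest an automatic speech recognition checkpoint. A minimal inference sketch under that assumption; the repo id and audio path are placeholders, not values from the card:

```python
# Minimal ASR inference sketch. Assumptions: the checkpoint works with the
# automatic-speech-recognition pipeline, and the repo id below is hypothetical.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-namespace/libri-alpha-0.5-Temp-1-mse",  # hypothetical repo id
)
result = asr("sample.flac")  # placeholder path to a 16 kHz mono audio file
print(result["text"])
```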

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
  • mixed_precision_training: Native AMP
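
A minimal sketch of how these settings map onto transformers.TrainingArguments under Transformers 4.25.1; the output directory is a placeholder, and the actual training script is not included in the card:

```python
# Sketch only: reconstructs the listed hyperparameters as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="libri-alpha-0.5-Temp-1-mse",  # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 32 * 2 = 64
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,  # Native AMP mixed-precision training
)
```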

Training results

| Training Loss | Epoch | Step | Validation Loss | WER    |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 212.5522      | 0.75  | 100  | 55.9161         | 0.1500 |
| 171.0676      | 1.49  | 200  | 51.5701         | 0.1434 |
| 159.3247      | 2.24  | 300  | 40.6680         | 0.1416 |
| 147.7202      | 2.99  | 400  | 36.0320         | 0.1388 |
| 136.0871      | 3.73  | 500  | 32.8709         | 0.1323 |
| 126.3071      | 4.48  | 600  | 31.9204         | 0.1298 |
| 126.9502      | 5.22  | 700  | 31.0903         | 0.1281 |
| 117.0498      | 5.97  | 800  | 30.5398         | 0.1272 |
| 117.0928      | 6.72  | 900  | 30.2616         | 0.1262 |
| 116.35        | 7.46  | 1000 | 30.2445         | 0.1264 |
| 116.784       | 8.21  | 1100 | 30.0181         | 0.1268 |
| 111.6779      | 8.96  | 1200 | 29.6434         | 0.1252 |
| 110.2514      | 9.7   | 1300 | 29.6900         | 0.1233 |
| 112.603       | 10.45 | 1400 | 29.4023         | 0.1240 |
| 110.4294      | 11.19 | 1500 | 29.5929         | 0.1239 |
| 106.3693      | 11.94 | 1600 | 29.4228         | 0.1232 |
| 102.5095      | 12.69 | 1700 | 29.6106         | 0.1236 |
| 104.8351      | 13.43 | 1800 | 29.3908         | 0.1220 |
| 103.6225      | 14.18 | 1900 | 29.5250         | 0.1216 |
| 102.5769      | 14.93 | 2000 | 29.4744         | 0.1211 |
| 102.7153      | 15.67 | 2100 | 29.3769         | 0.1203 |
| 98.3215       | 16.42 | 2200 | 29.3692         | 0.1205 |
| 100.0971      | 17.16 | 2300 | 29.0029         | 0.1183 |
| 94.876        | 17.91 | 2400 | 28.9354         | 0.1181 |
| 100.2511      | 18.66 | 2500 | 28.9513         | 0.1168 |
| 95.3128       | 19.4  | 2600 | 29.0832         | 0.1166 |
| 95.2151       | 20.15 | 2700 | 29.0161         | 0.1157 |
| 92.6844       | 20.9  | 2800 | 29.0543         | 0.1152 |
| 96.837        | 21.64 | 2900 | 29.2276         | 0.1164 |
| 94.2866       | 22.39 | 3000 | 28.9697         | 0.1164 |
| 92.1945       | 23.13 | 3100 | 29.0823         | 0.1169 |
| 97.7153       | 23.88 | 3200 | 29.0628         | 0.1158 |
| 95.3836       | 24.63 | 3300 | 28.9681         | 0.1160 |

Framework versions

  • Transformers 4.25.1
  • PyTorch 1.12.1
  • Datasets 2.7.1
  • Tokenizers 0.11.0