libri-alpha-0.5-Temp-1-processor-change

This model is a fine-tuned version of an unspecified base model on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how the reported WER is typically computed follows the list):

  • Loss: 91.9750
  • WER: 0.1187
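
As an illustration, here is a minimal sketch of how a WER like the one above is usually computed in Transformers fine-tuning scripts, using the evaluate library (an assumption; the card does not say which tool produced its number):

```python
# Word error rate: word-level edit distance between hypothesis and
# reference, divided by the number of reference words.
import evaluate  # pip install evaluate jiwer

wer_metric = evaluate.load("wer")

predictions = ["the cat sat on the mat"]
references = ["the cat sat on a mat"]

wer = wer_metric.compute(predictions=predictions, references=references)
print(wer)  # one substitution over six reference words ≈ 0.1667
```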

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
  • mixed_precision_training: Native AMP
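
A minimal sketch of how the values above map onto transformers.TrainingArguments; the output directory is a hypothetical placeholder, and only the hyperparameter values themselves come from this card:

```python
# Sketch: the card's hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="libri-alpha-0.5-Temp-1-processor-change",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 32 * 2 = 64
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,  # "Native AMP" mixed-precision training
)
```

With gradient_accumulation_steps=2 and a per-device batch of 32, gradients are accumulated over two forward passes before each optimizer step, which yields the total train batch size of 64 listed above.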

Training results

Training Loss   Epoch   Step   Validation Loss   WER
569.0646        0.75    100    175.3549          0.1589
440.3574        1.49    200    146.3654          0.1640
398.2328        2.24    300    128.7082          0.1562
357.5816        2.99    400    117.7871          0.1495
344.3317        3.73    500    111.0376          0.1417
331.0486        4.48    600    106.5447          0.1398
321.4498        5.22    700    105.1187          0.1363
305.8177        5.97    800    103.2541          0.1365
304.2076        6.72    900    105.3060          0.1385
297.746         7.46    1000   101.1069          0.1307
285.7675        8.21    1100   99.9853           0.1303
284.6546        8.96    1200   98.5235           0.1292
281.672         9.7     1300   97.8004           0.1295
281.0029        10.45   1400   96.9385           0.1278
283.847         11.19   1500   96.3700           0.1275
274.4053        11.94   1600   95.9557           0.1281
271.8855        12.69   1700   95.5764           0.1250
275.416         13.43   1800   95.0451           0.1266
267.7354        14.18   1900   94.6620           0.1242
273.9816        14.93   2000   95.0889           0.1241
263.9812        15.67   2100   94.4231           0.1241
258.6033        16.42   2200   93.8011           0.1225
260.4275        17.16   2300   94.0336           0.1210
258.7905        17.91   2400   93.4633           0.1216
255.6817        18.66   2500   93.0448           0.1212
252.3298        19.4    2600   92.9945           0.1216
250.5598        20.15   2700   92.9767           0.1200
249.4384        20.9    2800   93.1555           0.1203
255.6291        21.64   2900   92.7784           0.1208
249.5222        22.39   3000   92.5792           0.1203
250.498         23.13   3100   92.4570           0.1205
252.2656        23.88   3200   92.3685           0.1199
248.1438        24.63   3300   92.3731           0.1198
240.2946        25.37   3400   92.1875           0.1192
256.2254        26.12   3500   91.9586           0.1192
248.603         26.87   3600   91.9599           0.1191
252.9337        27.61   3700   92.1080           0.1189
250.9757        28.36   3800   92.1051           0.1188
248.7415        29.1    3900   91.9927           0.1187
248.7394        29.85   4000   91.9750           0.1187

Framework versions

  • Transformers 4.25.1
  • PyTorch 1.12.1
  • Datasets 2.7.1
  • Tokenizers 0.11.0
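
To reproduce results in a matching environment, the pinned versions above can be checked at runtime; a small convenience sketch, not part of the original card:

```python
# Compare the installed library versions against the ones this card pins.
import datasets, tokenizers, torch, transformers

pinned = {transformers: "4.25.1", torch: "1.12.1",
          datasets: "2.7.1", tokenizers: "0.11.0"}
for module, expected in pinned.items():
    status = "OK" if module.__version__ == expected else f"card used {expected}"
    print(f"{module.__name__} {module.__version__} ({status})")
```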