---
license: apache-2.0
tags:
  - generated_from_trainer
datasets:
  - common_voice
base_model: facebook/hubert-large-ll60k
model-index:
  - name: hubert-large-xlsr-common1000asli-demo-colab-dd
    results: []
---

# hubert-large-xlsr-common1000asli-demo-colab-dd

This model is a fine-tuned version of [facebook/hubert-large-ll60k](https://huggingface.co/facebook/hubert-large-ll60k) on the common_voice dataset. It achieves the following results on the evaluation set:

- Loss: 1.0754
- Wer: 0.5189
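
Since this card was auto-generated, it ships without a usage example. Below is a minimal inference sketch, assuming the checkpoint carries a CTC head and that a `Wav2Vec2Processor` was saved alongside it; the repo id is a placeholder, not confirmed by this card.

```python
import torch
import torchaudio
from transformers import HubertForCTC, Wav2Vec2Processor

# Placeholder repo id -- replace with the actual Hub path of this checkpoint.
model_id = "<user>/hubert-large-xlsr-common1000asli-demo-colab-dd"
processor = Wav2Vec2Processor.from_pretrained(model_id)  # assumes a saved processor
model = HubertForCTC.from_pretrained(model_id)           # assumes a CTC head
model.eval()

# Load audio and resample to the 16 kHz rate HuBERT expects.
waveform, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```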

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 128
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1000
- mixed_precision_training: Native AMP
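
A hedged sketch of the corresponding `TrainingArguments` (argument names per Transformers 4.x; the Adam betas/epsilon and linear schedule above are the library defaults, so they need no explicit flags):

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters as TrainingArguments (output_dir is
# illustrative; gradient accumulation yields the effective batch size of 256).
training_args = TrainingArguments(
    output_dir="hubert-large-xlsr-common1000asli-demo-colab-dd",
    learning_rate=3e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 128 * 2 = 256 total train batch size
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```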

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:---:|:---:|:---:|:---:|:---:|
| 8.5628 | 10.53 | 400 | 1.4949 | 0.9944 |
| 0.7496 | 21.05 | 800 | 0.6398 | 0.6917 |
| 0.3298 | 31.58 | 1200 | 0.6116 | 0.6148 |
| 0.228 | 42.11 | 1600 | 0.6544 | 0.5835 |
| 0.17 | 52.63 | 2000 | 0.7028 | 0.5955 |
| 0.1466 | 63.16 | 2400 | 0.6935 | 0.5992 |
| 0.1261 | 73.68 | 2800 | 0.7101 | 0.5735 |
| 0.1109 | 84.21 | 3200 | 0.7360 | 0.5610 |
| 0.1001 | 94.74 | 3600 | 0.7924 | 0.5604 |
| 0.0856 | 105.26 | 4000 | 0.7975 | 0.5653 |
| 0.0821 | 115.79 | 4400 | 0.8027 | 0.5611 |
| 0.0783 | 126.32 | 4800 | 0.8238 | 0.5566 |
| 0.0691 | 136.84 | 5200 | 0.8109 | 0.5519 |
| 0.0627 | 147.37 | 5600 | 0.8231 | 0.5544 |
| 0.0589 | 157.89 | 6000 | 0.8747 | 0.5506 |
| 0.0548 | 168.42 | 6400 | 0.8440 | 0.5478 |
| 0.052 | 178.95 | 6800 | 0.8289 | 0.5393 |
| 0.0471 | 189.47 | 7200 | 0.8689 | 0.5492 |
| 0.0486 | 200.0 | 7600 | 0.8437 | 0.5372 |
| 0.0433 | 210.53 | 8000 | 0.8360 | 0.5453 |
| 0.0419 | 221.05 | 8400 | 0.8645 | 0.5391 |
| 0.0393 | 231.58 | 8800 | 0.8821 | 0.5506 |
| 0.0404 | 242.11 | 9200 | 0.9073 | 0.5419 |
| 1.318 | 252.63 | 9600 | 0.8408 | 0.5813 |
| 0.0489 | 263.16 | 10000 | 0.8206 | 0.5449 |
| 0.0406 | 273.68 | 10400 | 0.8592 | 0.5466 |
| 0.0359 | 284.21 | 10800 | 0.8597 | 0.5476 |
| 0.0344 | 294.74 | 11200 | 0.8349 | 0.5369 |
| 0.032 | 305.26 | 11600 | 0.8352 | 0.5379 |
| 0.0299 | 315.79 | 12000 | 0.8409 | 0.5420 |
| 0.0287 | 326.32 | 12400 | 0.8562 | 0.5441 |
| 0.0292 | 336.84 | 12800 | 0.9100 | 0.5519 |
| 0.0258 | 347.37 | 13200 | 0.9213 | 0.5447 |
| 0.0229 | 357.89 | 13600 | 0.9020 | 0.5343 |
| 0.0257 | 368.42 | 14000 | 0.9219 | 0.5531 |
| 0.0236 | 378.95 | 14400 | 0.9301 | 0.5516 |
| 0.0241 | 389.47 | 14800 | 0.9058 | 0.5359 |
| 0.022 | 400.0 | 15200 | 0.9067 | 0.5408 |
| 3.4199 | 410.53 | 15600 | 0.9661 | 0.6957 |
| 0.0554 | 421.05 | 16000 | 0.8984 | 0.5661 |
| 0.0289 | 431.58 | 16400 | 0.8843 | 0.5504 |
| 0.0234 | 442.11 | 16800 | 0.8943 | 0.5407 |
| 0.0219 | 452.63 | 17200 | 0.9325 | 0.5391 |
| 0.0194 | 463.16 | 17600 | 0.9588 | 0.5442 |
| 0.0195 | 473.68 | 18000 | 0.9660 | 0.5478 |
| 0.0184 | 484.21 | 18400 | 0.9325 | 0.5394 |
| 0.0178 | 494.74 | 18800 | 0.9526 | 0.5435 |
| 0.0171 | 505.26 | 19200 | 0.9533 | 0.5412 |
| 0.0174 | 515.79 | 19600 | 0.8962 | 0.5410 |
| 0.0165 | 526.32 | 20000 | 0.9699 | 0.5422 |
| 0.0153 | 536.84 | 20400 | 0.9252 | 0.5301 |
| 0.0141 | 547.37 | 20800 | 0.9364 | 0.5401 |
| 0.0148 | 557.89 | 21200 | 0.9479 | 0.5387 |
| 0.0141 | 568.42 | 21600 | 0.9692 | 0.5365 |
| 0.0136 | 578.95 | 22000 | 0.9779 | 0.5343 |
| 0.0127 | 589.47 | 22400 | 0.9684 | 0.5303 |
| 0.0122 | 600.0 | 22800 | 0.9930 | 0.5346 |
| 0.0122 | 610.53 | 23200 | 0.9733 | 0.5348 |
| 0.0112 | 621.05 | 23600 | 1.0059 | 0.5374 |
| 0.0113 | 631.58 | 24000 | 0.9801 | 0.5302 |
| 0.0114 | 642.11 | 24400 | 0.9901 | 0.5336 |
| 0.0101 | 652.63 | 24800 | 0.9943 | 0.5383 |
| 0.0106 | 663.16 | 25200 | 1.0296 | 0.5272 |
| 0.0099 | 673.68 | 25600 | 1.0321 | 0.5294 |
| 0.01 | 684.21 | 26000 | 1.0282 | 0.5310 |
| 0.01 | 694.74 | 26400 | 1.0336 | 0.5326 |
| 0.009 | 705.26 | 26800 | 1.0130 | 0.5247 |
| 0.0087 | 715.79 | 27200 | 1.0326 | 0.5261 |
| 0.0086 | 726.32 | 27600 | 1.0343 | 0.5255 |
| 0.0085 | 736.84 | 28000 | 1.0009 | 0.5338 |
| 0.0086 | 747.37 | 28400 | 1.0369 | 0.5279 |
| 0.008 | 757.89 | 28800 | 1.0063 | 0.5326 |
| 0.0095 | 768.42 | 29200 | 1.0152 | 0.5238 |
| 0.0072 | 778.95 | 29600 | 1.0313 | 0.5263 |
| 0.0073 | 789.47 | 30000 | 1.0440 | 0.5229 |
| 0.0068 | 800.0 | 30400 | 1.0348 | 0.5257 |
| 0.0076 | 810.53 | 30800 | 1.0040 | 0.5237 |
| 0.007 | 821.05 | 31200 | 1.0382 | 0.5205 |
| 0.0069 | 831.58 | 31600 | 1.0217 | 0.5276 |
| 0.0064 | 842.11 | 32000 | 1.0425 | 0.5301 |
| 0.0067 | 852.63 | 32400 | 1.0384 | 0.5262 |
| 0.006 | 863.16 | 32800 | 1.0698 | 0.5294 |
| 0.0058 | 873.68 | 33200 | 1.0412 | 0.5229 |
| 0.0063 | 884.21 | 33600 | 1.0423 | 0.5225 |
| 0.0053 | 894.74 | 34000 | 1.0554 | 0.5213 |
| 0.0055 | 905.26 | 34400 | 1.0593 | 0.5202 |
| 0.0051 | 915.79 | 34800 | 1.0716 | 0.5211 |
| 0.0052 | 926.32 | 35200 | 1.0668 | 0.5182 |
| 0.0048 | 936.84 | 35600 | 1.0840 | 0.5209 |
| 0.0052 | 947.37 | 36000 | 1.0633 | 0.5173 |
| 0.0046 | 957.89 | 36400 | 1.0747 | 0.5184 |
| 0.0051 | 968.42 | 36800 | 1.0766 | 0.5190 |
| 0.0052 | 978.95 | 37200 | 1.0748 | 0.5194 |
| 0.005 | 989.47 | 37600 | 1.0778 | 0.5186 |
| 0.005 | 1000.0 | 38000 | 1.0754 | 0.5189 |

### Framework versions

- Transformers 4.11.3
- Pytorch 1.10.0+cu102
- Datasets 1.13.3
- Tokenizers 0.10.3