---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: wav2vec2-large-960h-lv60-self-paper
    results: []
---

# wav2vec2-large-960h-lv60-self-paper

This model is a fine-tuned version of [facebook/wav2vec2-large-960h-lv60-self](https://huggingface.co/facebook/wav2vec2-large-960h-lv60-self) on an unspecified dataset. It achieves the following results on the evaluation set (see the inference sketch after the metrics):

- Loss: 1.0854
- Wer: 0.2950
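
The checkpoint can be loaded with the 🤗 Transformers CTC classes for inference. The snippet below is a minimal sketch, not part of the original card: the repo id `hts98/wav2vec2-large-960h-lv60-self-paper` is assumed from this card's name, and `sample.wav` is a placeholder audio path.

```python
import librosa
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "hts98/wav2vec2-large-960h-lv60-self-paper"  # assumed repo id for this card

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2 expects 16 kHz mono audio; "sample.wav" is a placeholder path
speech, _ = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```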

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 420
- num_epochs: 50.0
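
The `generated_from_trainer` tag indicates these values were passed to the 🤗 `Trainer`. The sketch below is a hedged reconstruction that maps the listed hyperparameters onto `TrainingArguments`; the actual training script is not part of this card, and `output_dir` and the evaluation strategy are assumptions.

```python
from transformers import TrainingArguments

# Sketch only: maps the hyperparameters listed above onto TrainingArguments fields.
training_args = TrainingArguments(
    output_dir="wav2vec2-large-960h-lv60-self-paper",  # placeholder output path
    learning_rate=5e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_steps=420,
    num_train_epochs=50.0,
    evaluation_strategy="epoch",  # assumption: the results table reports per-epoch eval
)
```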

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| No log        | 1.0   | 419   | 3.3473          | 1.0    |
| 5.8068        | 2.0   | 838   | 1.9191          | 0.8917 |
| 2.5663        | 3.0   | 1257  | 1.1006          | 0.5802 |
| 1.1433        | 4.0   | 1676  | 0.9009          | 0.4814 |
| 0.8522        | 5.0   | 2095  | 0.8215          | 0.4247 |
| 0.7256        | 6.0   | 2514  | 0.7522          | 0.3922 |
| 0.7256        | 7.0   | 2933  | 0.7202          | 0.3654 |
| 0.6239        | 8.0   | 3352  | 0.6909          | 0.3579 |
| 0.5618        | 9.0   | 3771  | 0.6887          | 0.3400 |
| 0.4998        | 10.0  | 4190  | 0.6788          | 0.3320 |
| 0.4569        | 11.0  | 4609  | 0.6805          | 0.3351 |
| 0.4156        | 12.0  | 5028  | 0.6910          | 0.3253 |
| 0.4156        | 13.0  | 5447  | 0.6859          | 0.3279 |
| 0.3763        | 14.0  | 5866  | 0.7075          | 0.3207 |
| 0.3473        | 15.0  | 6285  | 0.7174          | 0.3152 |
| 0.3141        | 16.0  | 6704  | 0.7284          | 0.3171 |
| 0.2884        | 17.0  | 7123  | 0.7537          | 0.3192 |
| 0.2771        | 18.0  | 7542  | 0.7312          | 0.3175 |
| 0.2771        | 19.0  | 7961  | 0.7669          | 0.3138 |
| 0.2538        | 20.0  | 8380  | 0.8143          | 0.3074 |
| 0.2319        | 21.0  | 8799  | 0.8185          | 0.3088 |
| 0.2206        | 22.0  | 9218  | 0.8111          | 0.3069 |
| 0.2093        | 23.0  | 9637  | 0.8248          | 0.3088 |
| 0.1979        | 24.0  | 10056 | 0.8572          | 0.3067 |
| 0.1979        | 25.0  | 10475 | 0.8710          | 0.3074 |
| 0.1852        | 26.0  | 10894 | 0.8922          | 0.3067 |
| 0.1742        | 27.0  | 11313 | 0.9040          | 0.3068 |
| 0.1688        | 28.0  | 11732 | 0.9144          | 0.3016 |
| 0.1578        | 29.0  | 12151 | 0.8990          | 0.3109 |
| 0.1557        | 30.0  | 12570 | 0.9465          | 0.3004 |
| 0.1557        | 31.0  | 12989 | 0.9480          | 0.3025 |
| 0.1456        | 32.0  | 13408 | 0.9731          | 0.3017 |
| 0.1398        | 33.0  | 13827 | 0.9633          | 0.3038 |
| 0.1343        | 34.0  | 14246 | 0.9844          | 0.3011 |
| 0.1275        | 35.0  | 14665 | 1.0078          | 0.2997 |
| 0.1266        | 36.0  | 15084 | 1.0066          | 0.2996 |
| 0.1243        | 37.0  | 15503 | 1.0133          | 0.3014 |
| 0.1243        | 38.0  | 15922 | 1.0387          | 0.2972 |
| 0.1182        | 39.0  | 16341 | 1.0173          | 0.3026 |
| 0.1152        | 40.0  | 16760 | 1.0527          | 0.2977 |
| 0.1134        | 41.0  | 17179 | 1.0491          | 0.2978 |
| 0.1101        | 42.0  | 17598 | 1.0662          | 0.2976 |
| 0.1083        | 43.0  | 18017 | 1.0544          | 0.2979 |
| 0.1083        | 44.0  | 18436 | 1.0599          | 0.2957 |
| 0.1073        | 45.0  | 18855 | 1.0767          | 0.2959 |
| 0.1045        | 46.0  | 19274 | 1.0773          | 0.2959 |
| 0.1024        | 47.0  | 19693 | 1.0731          | 0.2953 |
| 0.1015        | 48.0  | 20112 | 1.0823          | 0.2966 |
| 0.1016        | 49.0  | 20531 | 1.0885          | 0.2945 |
| 0.1016        | 50.0  | 20950 | 1.0854          | 0.2950 |
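
The `Wer` column is the word error rate (lower is better; 0.2950 corresponds to roughly 29.5% word-level errors). A minimal sketch of how such a score can be computed with the `evaluate` library is shown below; the transcript strings are placeholders, not data from this run.

```python
import evaluate

wer_metric = evaluate.load("wer")

# Placeholder transcripts; in practice these are model outputs vs. ground-truth references
predictions = ["the quick brown fox jumps over the dog"]
references = ["the quick brown fox jumps over the lazy dog"]

print(wer_metric.compute(predictions=predictions, references=references))
```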

### Framework versions

- Transformers 4.31.0.dev0
- Pytorch 2.0.0+cu117
- Datasets 2.7.0
- Tokenizers 0.13.2