---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: assis
    results: []
---

# assis

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 0.3836
- Wer: 1.0
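
Transcription with the fine-tuned checkpoint would typically look like the sketch below. The repo id `DRAGOO/assis`, the audio file name, and the assumption that a processor (feature extractor + tokenizer) is stored alongside the model weights are hypothetical; adjust them to the actual repository contents.

```python
import torch
import librosa  # any 16 kHz mono loader works
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Hypothetical repo id; replace with the actual model path.
model_id = "DRAGOO/assis"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# wav2vec2-base expects 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```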

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 3000
- training_steps: 5000
- mixed_precision_training: Native AMP
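
For orientation, these settings map roughly onto the `transformers` `TrainingArguments` shown below. This is a hypothetical reconstruction, not the author's training script; the `output_dir` and the evaluation cadence (every 100 steps, inferred from the results table) are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="assis",                # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,     # effective train batch size: 8 * 2 = 16
    lr_scheduler_type="linear",
    warmup_steps=3000,
    max_steps=5000,
    fp16=True,                         # Native AMP mixed-precision training
    evaluation_strategy="steps",       # inferred: evaluation every 100 steps
    eval_steps=100,
    logging_steps=100,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults.
)
```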

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 23.2159 | 0.6   | 100  | 22.1148 | 1.0 |
| 18.1848 | 1.2   | 200  | 16.7223 | 1.0 |
| 9.7817  | 1.8   | 300  | 7.9404  | 1.0 |
| 4.5091  | 2.4   | 400  | 3.7900  | 1.0 |
| 3.4946  | 2.99  | 500  | 3.2953  | 1.0 |
| 3.3286  | 3.59  | 600  | 3.1827  | 1.0 |
| 3.2078  | 4.19  | 700  | 3.1068  | 1.0 |
| 3.1528  | 4.79  | 800  | 3.0573  | 1.0 |
| 3.0709  | 5.39  | 900  | 3.0196  | 1.0 |
| 3.0163  | 5.99  | 1000 | 2.9919  | 1.0 |
| 2.9789  | 6.59  | 1100 | 2.9504  | 1.0 |
| 2.9468  | 7.19  | 1200 | 2.9272  | 1.0 |
| 2.9389  | 7.78  | 1300 | 2.9129  | 1.0 |
| 2.9192  | 8.38  | 1400 | 2.9005  | 1.0 |
| 2.9069  | 8.98  | 1500 | 2.8861  | 1.0 |
| 2.9074  | 9.58  | 1600 | 2.8816  | 1.0 |
| 2.883   | 10.18 | 1700 | 2.8746  | 1.0 |
| 2.8746  | 10.78 | 1800 | 2.8718  | 1.0 |
| 2.8637  | 11.38 | 1900 | 2.8567  | 1.0 |
| 2.8613  | 11.98 | 2000 | 2.8570  | 1.0 |
| 2.8598  | 12.57 | 2100 | 2.8449  | 1.0 |
| 2.8357  | 13.17 | 2200 | 2.8393  | 1.0 |
| 2.8352  | 13.77 | 2300 | 2.8350  | 1.0 |
| 2.8178  | 14.37 | 2400 | 2.7879  | 1.0 |
| 2.5089  | 14.97 | 2500 | 2.3686  | 1.0 |
| 2.0826  | 15.57 | 2600 | 1.8915  | 1.0 |
| 1.6003  | 16.17 | 2700 | 1.3513  | 1.0 |
| 1.2925  | 16.77 | 2800 | 1.0568  | 1.0 |
| 1.0837  | 17.37 | 2900 | 0.8760  | 1.0 |
| 0.9333  | 17.96 | 3000 | 0.7588  | 1.0 |
| 0.8214  | 18.56 | 3100 | 0.6841  | 1.0 |
| 0.7302  | 19.16 | 3200 | 0.6099  | 1.0 |
| 0.6815  | 19.76 | 3300 | 0.5459  | 1.0 |
| 0.6548  | 20.36 | 3400 | 0.5087  | 1.0 |
| 0.569   | 20.96 | 3500 | 0.4853  | 1.0 |
| 0.5919  | 21.56 | 3600 | 0.4666  | 1.0 |
| 0.5306  | 22.16 | 3700 | 0.4508  | 1.0 |
| 0.5228  | 22.75 | 3800 | 0.4389  | 1.0 |
| 0.5263  | 23.35 | 3900 | 0.4287  | 1.0 |
| 0.4945  | 23.95 | 4000 | 0.4182  | 1.0 |
| 0.4809  | 24.55 | 4100 | 0.4122  | 1.0 |
| 0.4813  | 25.15 | 4200 | 0.4112  | 1.0 |
| 0.4664  | 25.75 | 4300 | 0.3972  | 1.0 |
| 0.455   | 26.35 | 4400 | 0.3950  | 1.0 |
| 0.4415  | 26.95 | 4500 | 0.3962  | 1.0 |
| 0.4399  | 27.54 | 4600 | 0.3930  | 1.0 |
| 0.4451  | 28.14 | 4700 | 0.3864  | 1.0 |
| 0.4343  | 28.74 | 4800 | 0.3867  | 1.0 |
| 0.4418  | 29.34 | 4900 | 0.3865  | 1.0 |
| 0.4223  | 29.94 | 5000 | 0.3836  | 1.0 |
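
Note that the Wer column stays at 1.0 (100% word error rate) at every checkpoint, so the validation loss falls without the model yet producing usable transcriptions. For reference, WER is typically computed as in the snippet below with the `evaluate` library; this is an illustrative example, not the evaluation code used for this card.

```python
import evaluate

wer_metric = evaluate.load("wer")

# Toy example: a WER of 1.0 means every reference word was substituted, inserted, or deleted.
predictions = [""]               # empty (or fully incorrect) hypothesis
references = ["hello world"]
print(wer_metric.compute(predictions=predictions, references=references))  # 1.0
```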

### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3