---
license: apache-2.0
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: assis
    results: []
---

# assis

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:

- Loss: 0.8484
- Wer: 1.0
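
As a rough usage sketch only (not an official snippet from this repository), the checkpoint can be loaded like any other fine-tuned wav2vec2 CTC model; the checkpoint path and the audio input below are placeholders.

```python
import numpy as np
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder: point this at the actual checkpoint directory or Hub id.
checkpoint = "path/to/assis"
processor = Wav2Vec2Processor.from_pretrained(checkpoint)
model = Wav2Vec2ForCTC.from_pretrained(checkpoint)
model.eval()

# Stand-in audio: one second of silence at 16 kHz (wav2vec2-base expects 16 kHz input).
waveform = np.zeros(16_000, dtype=np.float32)

inputs = processor(waveform, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame, then collapse repeats.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```

Note that the reported Wer of 1.0 on the evaluation set indicates the model's transcriptions were essentially all wrong there, so outputs should be checked before any downstream use.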

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 3000
- training_steps: 5000
- mixed_precision_training: Native AMP
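
As a rough sketch only, assuming the standard Hugging Face `Trainer` API, these settings map to `TrainingArguments` as shown below; the output directory and the evaluation/logging cadence are assumptions (the cadence is inferred from the 100-step intervals in the results table), and the model, datasets, and data collator are omitted.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./assis",            # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,   # 8 x 2 = effective train batch size of 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=3000,
    max_steps=5000,                  # "training_steps" above
    fp16=True,                       # mixed precision via native AMP
    evaluation_strategy="steps",     # assumed: results below are logged every 100 steps
    eval_steps=100,
    logging_steps=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's default optimizer settings.
)
```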

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 22.3611       | 0.45  | 100  | 25.5854         | 1.0 |
| 20.8274       | 0.9   | 200  | 20.4977         | 1.0 |
| 12.1089       | 1.35  | 300  | 11.0220         | 1.0 |
| 5.043         | 1.81  | 400  | 4.3838          | 1.0 |
| 3.788         | 2.26  | 500  | 3.5831          | 1.0 |
| 3.445         | 2.71  | 600  | 3.4112          | 1.0 |
| 3.3042        | 3.16  | 700  | 3.3104          | 1.0 |
| 3.221         | 3.61  | 800  | 3.2255          | 1.0 |
| 3.1628        | 4.06  | 900  | 3.1618          | 1.0 |
| 3.0645        | 4.51  | 1000 | 3.1010          | 1.0 |
| 3.0913        | 4.97  | 1100 | 3.0624          | 1.0 |
| 3.0819        | 5.42  | 1200 | 3.0136          | 1.0 |
| 2.9502        | 5.87  | 1300 | 2.9883          | 1.0 |
| 2.9611        | 6.32  | 1400 | 2.9651          | 1.0 |
| 2.9287        | 6.77  | 1500 | 2.9474          | 1.0 |
| 2.9461        | 7.22  | 1600 | 2.9280          | 1.0 |
| 2.9176        | 7.67  | 1700 | 2.9148          | 1.0 |
| 2.8986        | 8.13  | 1800 | 2.9138          | 1.0 |
| 2.8896        | 8.58  | 1900 | 2.9050          | 1.0 |
| 2.8879        | 9.03  | 2000 | 2.9093          | 1.0 |
| 2.9085        | 9.48  | 2100 | 2.8998          | 1.0 |
| 2.876         | 9.93  | 2200 | 2.8807          | 1.0 |
| 2.8649        | 10.38 | 2300 | 2.8734          | 1.0 |
| 2.8653        | 10.84 | 2400 | 2.8681          | 1.0 |
| 2.8683        | 11.29 | 2500 | 2.8596          | 1.0 |
| 2.8452        | 11.74 | 2600 | 2.8667          | 1.0 |
| 2.8468        | 12.19 | 2700 | 2.8514          | 1.0 |
| 2.846         | 12.64 | 2800 | 2.8541          | 1.0 |
| 2.8415        | 13.09 | 2900 | 2.8493          | 1.0 |
| 2.8195        | 13.54 | 3000 | 2.8472          | 1.0 |
| 2.8103        | 14.0  | 3100 | 2.8244          | 1.0 |
| 2.6495        | 14.45 | 3200 | 2.5809          | 1.0 |
| 2.3126        | 14.9  | 3300 | 2.1612          | 1.0 |
| 1.92          | 15.35 | 3400 | 1.7312          | 1.0 |
| 1.5734        | 15.8  | 3500 | 1.4245          | 1.0 |
| 1.4081        | 16.25 | 3600 | 1.2659          | 1.0 |
| 1.2573        | 16.7  | 3700 | 1.1694          | 1.0 |
| 1.194         | 17.16 | 3800 | 1.0930          | 1.0 |
| 1.1053        | 17.61 | 3900 | 1.0393          | 1.0 |
| 1.072         | 18.06 | 4000 | 0.9792          | 1.0 |
| 1.0148        | 18.51 | 4100 | 0.9468          | 1.0 |
| 0.9995        | 18.96 | 4200 | 0.9228          | 1.0 |
| 0.9688        | 19.41 | 4300 | 0.9071          | 1.0 |
| 0.956         | 19.86 | 4400 | 0.8950          | 1.0 |
| 0.9565        | 20.32 | 4500 | 0.8632          | 1.0 |
| 0.9215        | 20.77 | 4600 | 0.8673          | 1.0 |
| 0.9006        | 21.22 | 4700 | 0.8647          | 1.0 |
| 0.8645        | 21.67 | 4800 | 0.8566          | 1.0 |
| 0.8768        | 22.12 | 4900 | 0.8527          | 1.0 |
| 0.8809        | 22.57 | 5000 | 0.8484          | 1.0 |
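
For reference, the Wer column above is typically produced by a `compute_metrics` callback like the sketch below (an assumption about the setup, using the `evaluate` library; the checkpoint path is a placeholder and is only needed for the processor's tokenizer).

```python
import numpy as np
import evaluate
from transformers import Wav2Vec2Processor

wer_metric = evaluate.load("wer")
processor = Wav2Vec2Processor.from_pretrained("path/to/assis")  # placeholder path

def compute_metrics(pred):
    # Greedy CTC decoding of the model's logits into predicted strings.
    pred_ids = np.argmax(pred.predictions, axis=-1)
    pred_str = processor.batch_decode(pred_ids)
    # -100 marks ignored label positions; map them back to the pad token before decoding.
    labels = np.where(pred.label_ids == -100, processor.tokenizer.pad_token_id, pred.label_ids)
    label_str = processor.batch_decode(labels, group_tokens=False)
    return {"wer": wer_metric.compute(predictions=pred_str, references=label_str)}
```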

### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.1+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3