---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: results
    results: []
---

# results

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:

- Loss: 3.0216
- Wer: 1.0
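
For reference, below is a minimal inference sketch using 🤗 Transformers. The repository id `celneo7/results` is an assumption based on this page, and the sketch assumes a `Wav2Vec2Processor` was saved alongside the model; audio is expected as a 16 kHz mono waveform.

```python
# Minimal inference sketch (hypothetical repo id "celneo7/results"; assumes a processor is in the repo).
import numpy as np
import torch
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "celneo7/results"  # assumption: inferred from this page, not stated in the card
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Replace this placeholder with a real 16 kHz mono float32 waveform.
speech = np.zeros(16_000, dtype=np.float32)  # one second of silence
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```

Note that with the reported Wer of 1.0, transcriptions decoded from this checkpoint are unlikely to be usable as-is.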

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows this list):

- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
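
As a rough guide, these values correspond to a `TrainingArguments` configuration along the lines of the sketch below. This is a reconstruction from the listed hyperparameters, not the original training script; the output directory and the 100-step evaluation/logging cadence are assumptions (the latter inferred from the results table).

```python
# Reconstructed sketch of the training configuration; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="results",              # assumed; matches the model name
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,     # effective train batch size: 8 * 2 = 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
    fp16=True,                         # "Native AMP" mixed precision
    evaluation_strategy="steps",       # `eval_strategy` in newer Transformers releases
    eval_steps=100,                    # assumed from the 100-step rows in the table below
    logging_steps=100,                 # assumed
)
```

The listed Adam settings (betas=(0.9,0.999), epsilon=1e-08) are the Transformers defaults, so no explicit optimizer arguments are needed.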

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer |
|:-------------:|:-------:|:----:|:---------------:|:---:|
| 6.0993        | 0.5714  | 100  | 3.1919          | 1.0 |
| 2.9332        | 1.1429  | 200  | 3.0675          | 1.0 |
| 2.878         | 1.7143  | 300  | 3.1538          | 1.0 |
| 2.873         | 2.2857  | 400  | 2.9688          | 1.0 |
| 2.8574        | 2.8571  | 500  | 3.0386          | 1.0 |
| 2.859         | 3.4286  | 600  | 3.0947          | 1.0 |
| 2.8631        | 4.0     | 700  | 3.2471          | 1.0 |
| 2.8612        | 4.5714  | 800  | 2.9827          | 1.0 |
| 2.8592        | 5.1429  | 900  | 3.0277          | 1.0 |
| 2.8617        | 5.7143  | 1000 | 3.1227          | 1.0 |
| 2.8644        | 6.2857  | 1100 | 3.0502          | 1.0 |
| 2.8618        | 6.8571  | 1200 | 3.0055          | 1.0 |
| 2.8638        | 7.4286  | 1300 | 3.0646          | 1.0 |
| 2.8608        | 8.0     | 1400 | 3.1780          | 1.0 |
| 2.8585        | 8.5714  | 1500 | 2.9719          | 1.0 |
| 2.8624        | 9.1429  | 1600 | 3.0521          | 1.0 |
| 2.8588        | 9.7143  | 1700 | 3.0839          | 1.0 |
| 2.8594        | 10.2857 | 1800 | 3.1120          | 1.0 |
| 2.8566        | 10.8571 | 1900 | 2.9648          | 1.0 |
| 2.8587        | 11.4286 | 2000 | 3.0812          | 1.0 |
| 2.8588        | 12.0    | 2100 | 3.1690          | 1.0 |
| 2.8607        | 12.5714 | 2200 | 2.9951          | 1.0 |
| 2.8561        | 13.1429 | 2300 | 3.0317          | 1.0 |
| 2.8565        | 13.7143 | 2400 | 3.0880          | 1.0 |
| 2.8638        | 14.2857 | 2500 | 3.0978          | 1.0 |
| 2.8563        | 14.8571 | 2600 | 2.9716          | 1.0 |
| 2.8592        | 15.4286 | 2700 | 3.0461          | 1.0 |
| 2.859         | 16.0    | 2800 | 3.1339          | 1.0 |
| 2.8584        | 16.5714 | 2900 | 3.0304          | 1.0 |
| 2.8562        | 17.1429 | 3000 | 2.9964          | 1.0 |
| 2.8574        | 17.7143 | 3100 | 3.0665          | 1.0 |
| 2.8609        | 18.2857 | 3200 | 3.1042          | 1.0 |
| 2.8564        | 18.8571 | 3300 | 2.9905          | 1.0 |
| 2.8601        | 19.4286 | 3400 | 3.0030          | 1.0 |
| 2.8562        | 20.0    | 3500 | 3.1000          | 1.0 |
| 2.8565        | 20.5714 | 3600 | 3.0409          | 1.0 |
| 2.8566        | 21.1429 | 3700 | 2.9837          | 1.0 |
| 2.8577        | 21.7143 | 3800 | 3.0294          | 1.0 |
| 2.8554        | 22.2857 | 3900 | 3.0737          | 1.0 |
| 2.854         | 22.8571 | 4000 | 3.0101          | 1.0 |
| 2.8556        | 23.4286 | 4100 | 3.0014          | 1.0 |
| 2.8557        | 24.0    | 4200 | 3.0693          | 1.0 |
| 2.8531        | 24.5714 | 4300 | 3.0308          | 1.0 |
| 2.8552        | 25.1429 | 4400 | 3.0050          | 1.0 |
| 2.8536        | 25.7143 | 4500 | 3.0215          | 1.0 |
| 2.855         | 26.2857 | 4600 | 3.0509          | 1.0 |
| 2.8513        | 26.8571 | 4700 | 3.0163          | 1.0 |
| 2.8533        | 27.4286 | 4800 | 3.0170          | 1.0 |
| 2.8552        | 28.0    | 4900 | 3.0345          | 1.0 |
| 2.8521        | 28.5714 | 5000 | 3.0259          | 1.0 |
| 2.8522        | 29.1429 | 5100 | 3.0219          | 1.0 |
| 2.8543        | 29.7143 | 5200 | 3.0216          | 1.0 |

### Framework versions

- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1