---
library_name: transformers
language:
  - yo
license: apache-2.0
base_model: openai/whisper-small
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: Whisper Small Naija
    results: []
---

# Whisper Small Naija

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset (recorded as `None` in the training run). It achieves the following results on the evaluation set:

- Loss: 0.5037
- Wer: 46.0115
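
As a rough usage sketch: assuming this checkpoint is published on the Hub under a repo id such as `okezieowen/whisper-small-naija` (the exact id is an assumption here), it can be loaded with the standard `transformers` speech-recognition pipeline:

```python
# Minimal inference sketch. The repo id is an assumption; substitute the actual
# Hub path of this checkpoint and point the pipeline at a local audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="okezieowen/whisper-small-naija",  # hypothetical repo id
)
print(asr("sample.wav")["text"])
```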

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
- mixed_precision_training: Native AMP
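
These settings map onto the Hugging Face `Seq2SeqTrainingArguments` roughly as sketched below. Only the values listed above come from this card; the output directory and evaluation cadence are illustrative assumptions (the results table in the next section suggests evaluation every 250 steps).

```python
# Sketch of a training configuration matching the hyperparameters above.
# output_dir and the eval cadence are assumptions, not taken from the card.
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the library default optimizer.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-naija",  # assumption
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=5000,
    fp16=True,                   # "Native AMP" mixed precision
    eval_strategy="steps",       # assumption: eval every 250 steps (see results table)
    eval_steps=250,
    predict_with_generate=True,  # typical for Whisper fine-tuning; an assumption here
)
```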

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 1.3494        | 0.1022 | 250  | 1.4026          | 80.6179 |
| 0.962         | 0.2045 | 500  | 1.0016          | 68.3649 |
| 0.751         | 0.3067 | 750  | 0.8457          | 58.7227 |
| 0.6622        | 0.4090 | 1000 | 0.7606          | 56.7281 |
| 0.601         | 0.5112 | 1250 | 0.7057          | 55.7731 |
| 0.6004        | 0.6135 | 1500 | 0.6700          | 51.7955 |
| 0.5235        | 0.7157 | 1750 | 0.6341          | 53.2861 |
| 0.4939        | 0.8180 | 2000 | 0.6102          | 53.3002 |
| 0.4897        | 0.9202 | 2250 | 0.5913          | 52.4227 |
| 0.3799        | 1.0225 | 2500 | 0.5749          | 50.2787 |
| 0.3693        | 1.1247 | 2750 | 0.5623          | 48.4396 |
| 0.3498        | 1.2270 | 3000 | 0.5506          | 48.1969 |
| 0.3438        | 1.3292 | 3250 | 0.5425          | 48.5770 |
| 0.3498        | 1.4315 | 3500 | 0.5342          | 46.8116 |
| 0.3126        | 1.5337 | 3750 | 0.5248          | 46.8427 |
| 0.3215        | 1.6360 | 4000 | 0.5172          | 46.2891 |
| 0.3318        | 1.7382 | 4250 | 0.5126          | 47.7971 |
| 0.3108        | 1.8405 | 4500 | 0.5080          | 46.3594 |
| 0.3499        | 1.9427 | 4750 | 0.5049          | 46.7832 |
| 0.2664        | 2.0450 | 5000 | 0.5037          | 46.0115 |
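
The Wer column above is reported in percent. As a sanity check on how the metric behind the `wer` tag in the card metadata is computed, here is a small sketch using the `evaluate` library; the reference and prediction strings are made-up examples, not evaluation data.

```python
# Sketch of the standard WER computation (values reported as percentages).
# The strings below are made-up examples, not taken from the evaluation set.
import evaluate

wer_metric = evaluate.load("wer")
wer = 100 * wer_metric.compute(
    references=["bawo ni o se wa"],
    predictions=["bawo ni se wa"],
)
print(f"WER: {wer:.2f}%")  # one deletion out of five reference words -> 20.00%
```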

### Framework versions

- Transformers 4.44.2
- Pytorch 2.0.1+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
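
When trying to reproduce the run, it can help to confirm the local environment matches these versions; a quick check (expected values copied from the list above):

```python
# Quick environment check; expected values are the versions listed above.
import datasets
import tokenizers
import torch
import transformers

print("transformers", transformers.__version__)  # 4.44.2
print("torch", torch.__version__)                # 2.0.1+cu118
print("datasets", datasets.__version__)          # 2.21.0
print("tokenizers", tokenizers.__version__)      # 0.19.1
```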