---
library_name: transformers
language:
  - ne
license: apache-2.0
base_model: openai/whisper-small
tags:
  - generated_from_trainer
model-index:
  - name: Whisper Small Ne - Jenny Poudel
    results: []
---

# Whisper Small Ne - Jenny Poudel

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.3732
- Cer (character error rate): 10.4532

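The checkpoint can be loaded with the standard `transformers` automatic-speech-recognition pipeline. The sketch below is illustrative only; the repo ID `jenrish/whisper-small-ne` and the audio file name are assumptions, not confirmed by this card.

```python
# Minimal inference sketch. The repo ID and audio path are placeholders;
# substitute the actual Hub ID or local checkpoint directory.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="jenrish/whisper-small-ne",  # hypothetical repo ID
)

# Transcribe a local audio file (any format ffmpeg can decode).
result = asr("sample_nepali.wav")
print(result["text"])
```
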
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 1000
- mixed_precision_training: Native AMP
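
These values map directly onto `Seq2SeqTrainingArguments` in the `transformers` Trainer API. The sketch below is a reconstruction under assumptions: the output directory, evaluation/save cadence, and `predict_with_generate` flag are not stated in this card.

```python
# Reconstruction of the training configuration from the hyperparameters above.
# Output dir, eval/save cadence, and predict_with_generate are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-ne",  # assumed output directory
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=1000,
    fp16=True,                        # "Native AMP" mixed precision
    # Adam betas (0.9, 0.999) and epsilon 1e-8 are the optimizer defaults.
    eval_strategy="steps",            # assumed; the table reports eval every 100 steps
    eval_steps=100,
    save_steps=100,
    predict_with_generate=True,       # assumed; needed to compute CER on generated text
)
```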

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Cer     |
|:-------------:|:-------:|:----:|:---------------:|:-------:|
| 0.5845        | 1.7857  | 100  | 0.5426          | 23.4003 |
| 0.306         | 3.5714  | 200  | 0.3589          | 15.5604 |
| 0.1432        | 5.3571  | 300  | 0.3052          | 12.7828 |
| 0.0634        | 7.1429  | 400  | 0.3039          | 12.4767 |
| 0.0331        | 8.9286  | 500  | 0.3155          | 11.6404 |
| 0.0134        | 10.7143 | 600  | 0.3395          | 10.8938 |
| 0.0047        | 12.5    | 700  | 0.3492          | 10.1023 |
| 0.0018        | 14.2857 | 800  | 0.3679          | 10.4682 |
| 0.0008        | 16.0714 | 900  | 0.3702          | 10.4980 |
| 0.0006        | 17.8571 | 1000 | 0.3732          | 10.4532 |
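
Cer here is the character error rate, reported as a percentage. As an illustration, a score of this kind can be computed with the `evaluate` library; the prediction and reference strings below are placeholders, not data from the actual evaluation set.

```python
# Sketch of computing a character error rate (CER) with the `evaluate` library.
# The prediction/reference strings are placeholders.
import evaluate

cer_metric = evaluate.load("cer")

predictions = ["नमस्ते संसार"]    # model outputs (placeholder)
references = ["नमस्ते संसार!"]    # ground-truth transcripts (placeholder)

# evaluate's CER returns a fraction; multiply by 100 to match the table above.
cer = 100 * cer_metric.compute(predictions=predictions, references=references)
print(f"CER: {cer:.4f}")
```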

### Framework versions

- Transformers 4.44.2
- Pytorch 1.13.1+cu117
- Datasets 2.21.0
- Tokenizers 0.19.1