whisper-small-Fleurs_AMMI_AFRIVOICE_LRSC-ln-10hrs-v1

This model is a fine-tuned version of openai/whisper-small. It achieves the following results on the evaluation set:

  • Loss: 0.9113
  • WER: 0.2693
  • CER: 0.1004
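
A quick way to try the checkpoint is the transformers ASR pipeline. This is a minimal sketch, not part of the original card: the repository ID is this model's, but the audio file name is a placeholder, and since the repository is gated you may need to accept its access conditions and log in (e.g. via `huggingface-cli login`) first.

```python
# Minimal inference sketch (assumes Hub access to this gated repository).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="asr-africa/whisper-small-Fleurs_AMMI_AFRIVOICE_LRSC-ln-10hrs-v1",
)

# "sample.wav" is a hypothetical placeholder; a 16 kHz mono recording works best.
print(asr("sample.wav")["text"])
```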

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: adamw_hf with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
  • mixed_precision_training: Native AMP
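
The list above maps onto transformers' `Seq2SeqTrainingArguments` roughly as follows. This is a hedged reconstruction, not the authors' training script; `output_dir` is a hypothetical name.

```python
# Sketch of the reported hyperparameters as Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-ln-10hrs",  # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=2,  # effective (total) train batch size: 8
    seed=42,
    optim="adamw_hf",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```

Note that although `num_epochs` is 100, the results table below stops at epoch 54, so training appears to have ended early; the card does not say why.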

Training results

| Training Loss | Epoch | Step  | Validation Loss | WER    | CER    |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|
| 2.662         | 1.0   | 384   | 1.1130          | 0.9642 | 0.4862 |
| 0.9465        | 2.0   | 768   | 0.7685          | 0.7183 | 0.3554 |
| 0.6467        | 3.0   | 1152  | 0.6290          | 0.7266 | 0.3898 |
| 0.4612        | 4.0   | 1536  | 0.5702          | 0.6397 | 0.3367 |
| 0.3264        | 5.0   | 1920  | 0.5509          | 0.6423 | 0.3565 |
| 0.2166        | 6.0   | 2304  | 0.5555          | 0.9089 | 0.5573 |
| 0.136         | 7.0   | 2688  | 0.5589          | 0.9389 | 0.5901 |
| 0.0854        | 8.0   | 3072  | 0.5840          | 0.9759 | 0.5946 |
| 0.0535        | 9.0   | 3456  | 0.6099          | 0.7188 | 0.4253 |
| 0.0401        | 10.0  | 3840  | 0.6250          | 0.8047 | 0.4718 |
| 0.0305        | 11.0  | 4224  | 0.6397          | 0.5552 | 0.3004 |
| 0.0234        | 12.0  | 4608  | 0.6584          | 0.4350 | 0.2223 |
| 0.0179        | 13.0  | 4992  | 0.6768          | 0.4224 | 0.1984 |
| 0.0141        | 14.0  | 5376  | 0.6962          | 0.4579 | 0.2346 |
| 0.0109        | 15.0  | 5760  | 0.7025          | 0.4128 | 0.1926 |
| 0.0096        | 16.0  | 6144  | 0.6962          | 0.3778 | 0.1732 |
| 0.0069        | 17.0  | 6528  | 0.7180          | 0.3991 | 0.1853 |
| 0.0061        | 18.0  | 6912  | 0.7368          | 0.3223 | 0.1319 |
| 0.0045        | 19.0  | 7296  | 0.7343          | 0.3606 | 0.1692 |
| 0.0054        | 20.0  | 7680  | 0.7521          | 0.2899 | 0.1133 |
| 0.0039        | 21.0  | 8064  | 0.7545          | 0.3050 | 0.1204 |
| 0.0052        | 22.0  | 8448  | 0.7758          | 0.2871 | 0.1105 |
| 0.0041        | 23.0  | 8832  | 0.7674          | 0.2952 | 0.1189 |
| 0.0045        | 24.0  | 9216  | 0.7896          | 0.2945 | 0.1143 |
| 0.0033        | 25.0  | 9600  | 0.7860          | 0.2891 | 0.1149 |
| 0.0026        | 26.0  | 9984  | 0.8004          | 0.2691 | 0.0999 |
| 0.0029        | 27.0  | 10368 | 0.8039          | 0.2696 | 0.0999 |
| 0.0036        | 28.0  | 10752 | 0.8287          | 0.2634 | 0.0939 |
| 0.0027        | 29.0  | 11136 | 0.7887          | 0.2707 | 0.1002 |
| 0.0032        | 30.0  | 11520 | 0.8300          | 0.2734 | 0.1032 |
| 0.0029        | 31.0  | 11904 | 0.8122          | 0.2628 | 0.0954 |
| 0.0032        | 32.0  | 12288 | 0.8459          | 0.2664 | 0.0962 |
| 0.0031        | 33.0  | 12672 | 0.8250          | 0.2741 | 0.1075 |
| 0.0023        | 34.0  | 13056 | 0.8579          | 0.2649 | 0.0966 |
| 0.0023        | 35.0  | 13440 | 0.8535          | 0.2606 | 0.0961 |
| 0.0018        | 36.0  | 13824 | 0.8601          | 0.2571 | 0.0930 |
| 0.0028        | 37.0  | 14208 | 0.8426          | 0.2603 | 0.0969 |
| 0.0021        | 38.0  | 14592 | 0.8617          | 0.2591 | 0.0967 |
| 0.0016        | 39.0  | 14976 | 0.8570          | 0.2572 | 0.0930 |
| 0.0022        | 40.0  | 15360 | 0.8496          | 0.2581 | 0.0926 |
| 0.0015        | 41.0  | 15744 | 0.8533          | 0.2578 | 0.0958 |
| 0.0014        | 42.0  | 16128 | 0.8752          | 0.2520 | 0.0891 |
| 0.0014        | 43.0  | 16512 | 0.8737          | 0.2552 | 0.0918 |
| 0.0013        | 44.0  | 16896 | 0.8928          | 0.2616 | 0.0974 |
| 0.0026        | 45.0  | 17280 | 0.9111          | 0.2613 | 0.0957 |
| 0.0029        | 46.0  | 17664 | 0.8834          | 0.2672 | 0.0999 |
| 0.0018        | 47.0  | 18048 | 0.8904          | 0.2555 | 0.0916 |
| 0.0014        | 48.0  | 18432 | 0.9028          | 0.2541 | 0.0893 |
| 0.0013        | 49.0  | 18816 | 0.8990          | 0.2558 | 0.0904 |
| 0.0014        | 50.0  | 19200 | 0.9087          | 0.2564 | 0.0927 |
| 0.0014        | 51.0  | 19584 | 0.9115          | 0.2610 | 0.0934 |
| 0.0007        | 52.0  | 19968 | 0.9291          | 0.2592 | 0.0938 |
| 0.0011        | 53.0  | 20352 | 0.9081          | 0.2603 | 0.0970 |
| 0.0014        | 54.0  | 20736 | 0.9113          | 0.2693 | 0.1004 |
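
WER and CER here are the standard word and character error rates. A minimal sketch of computing them with the `evaluate` library (which requires `jiwer`) follows; the reference and prediction strings are invented examples, not data from this model.

```python
# Computing WER/CER with `evaluate` (pip install evaluate jiwer); strings are made up.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["mbote na bino nyonso"]   # hypothetical Lingala reference transcript
predictions = ["mbote na bino nyoso"]   # hypothetical model hypothesis

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```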

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
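
To reproduce this environment, one possible pinned install is below; the CUDA 11.8 wheel index for PyTorch is an assumption about how the `+cu118` build was obtained.

```bash
pip install "transformers==4.46.3" "datasets==3.1.0" "tokenizers==0.20.3"
# torch 2.1.0+cu118 comes from PyTorch's CUDA 11.8 wheel index.
pip install "torch==2.1.0" --index-url https://download.pytorch.org/whl/cu118
```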