
Whisper Large-V3 Basque

This model is a fine-tuned version of openai/whisper-large-v3 on the Basque (eu) subset of the mozilla-foundation/common_voice_16_1 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3688
  • WER: 6.8880

Model description

This is openai/whisper-large-v3, a 1.54B-parameter encoder-decoder speech recognition model, fine-tuned for Basque automatic speech recognition on Common Voice 16.1.

Intended uses & limitations

The model is intended for transcribing Basque speech; a minimal usage sketch is shown below.
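
For example, a minimal transcription sketch with the transformers pipeline API, assuming the repository ID zuazo/whisper-large-v3-eu-cv16_1 and a placeholder local audio file:

```python
from transformers import pipeline

pipe = pipeline(
    "automatic-speech-recognition",
    model="zuazo/whisper-large-v3-eu-cv16_1",
    chunk_length_s=30,  # enables chunked long-form transcription
)

# "audio.wav" is a placeholder path; any format readable by ffmpeg works.
result = pipe("audio.wav", generate_kwargs={"language": "eu", "task": "transcribe"})
print(result["text"])
```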

Training and evaluation data

The model was fine-tuned and evaluated on the Basque (eu) subset of mozilla-foundation/common_voice_16_1; a loading sketch follows.
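
A minimal sketch of loading this dataset with the datasets library (the exact splits and preprocessing used for this run are not documented in the card, so the details below are assumptions):

```python
from datasets import Audio, load_dataset

# Common Voice 16.1 is a gated dataset on the Hugging Face Hub, so
# authentication with a Hugging Face token is required to download it.
common_voice = load_dataset("mozilla-foundation/common_voice_16_1", "eu")

# Whisper expects 16 kHz input, so resample the audio column accordingly.
common_voice = common_voice.cast_column("audio", Audio(sampling_rate=16_000))
print(common_voice)
```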

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 256
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • training_steps: 40000
  • mixed_precision_training: Native AMP
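
The card does not include the training script; as a rough illustration, these settings would correspond to Seq2SeqTrainingArguments along the following lines (output directory and evaluation cadence are assumptions, not values from the original run):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v3-eu-cv16_1",  # assumed name
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=8,   # effective batch size 32 * 8 = 256
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=40000,
    fp16=True,                       # native AMP mixed precision
    evaluation_strategy="steps",     # assumed; the results table evaluates every 1000 steps
    eval_steps=1000,
    predict_with_generate=True,      # assumed; needed to report WER during evaluation
)
```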

Training results

| Training Loss | Epoch  | Step  | Validation Loss | WER    |
|---------------|--------|-------|-----------------|--------|
| 0.0095        | 10.04  | 1000  | 0.2023          | 9.6803 |
| 0.0032        | 20.08  | 2000  | 0.2153          | 9.0521 |
| 0.0023        | 30.11  | 3000  | 0.2234          | 8.8645 |
| 0.0023        | 40.15  | 4000  | 0.2278          | 8.4366 |
| 0.0012        | 50.19  | 5000  | 0.2260          | 7.9911 |
| 0.0005        | 60.23  | 6000  | 0.2435          | 7.9060 |
| 0.0013        | 70.26  | 7000  | 0.2254          | 7.8484 |
| 0.0004        | 80.3   | 8000  | 0.2367          | 7.4830 |
| 0.0008        | 90.34  | 9000  | 0.2289          | 7.4420 |
| 0.0007        | 100.38 | 10000 | 0.2385          | 7.5319 |
| 0.001         | 110.41 | 11000 | 0.2293          | 7.6325 |
| 0.0001        | 120.45 | 12000 | 0.2473          | 7.1430 |
| 0.0001        | 130.49 | 13000 | 0.2488          | 7.1870 |
| 0.0004        | 140.53 | 14000 | 0.2398          | 7.1831 |
| 0.0           | 150.56 | 15000 | 0.2620          | 7.0590 |
| 0.0001        | 160.6  | 16000 | 0.2547          | 7.1967 |
| 0.0           | 170.64 | 17000 | 0.2768          | 7.0736 |
| 0.0           | 180.68 | 18000 | 0.2878          | 7.0004 |
| 0.0           | 190.72 | 19000 | 0.2962          | 6.9466 |
| 0.0013        | 200.75 | 20000 | 0.2354          | 7.6042 |
| 0.0           | 210.79 | 21000 | 0.2720          | 6.8948 |
| 0.0           | 220.83 | 22000 | 0.2865          | 6.8987 |
| 0.0           | 230.87 | 23000 | 0.2954          | 6.8890 |
| 0.0           | 240.9  | 24000 | 0.3031          | 6.8821 |
| 0.0           | 250.94 | 25000 | 0.3102          | 6.8772 |
| 0.0           | 260.98 | 26000 | 0.3166          | 6.8899 |
| 0.0           | 271.02 | 27000 | 0.3233          | 6.8919 |
| 0.0           | 281.05 | 28000 | 0.3248          | 6.8919 |
| 0.0           | 291.09 | 29000 | 0.3363          | 6.9026 |
| 0.0           | 301.13 | 30000 | 0.3419          | 6.9085 |
| 0.0           | 311.17 | 31000 | 0.3471          | 6.8851 |
| 0.0           | 321.2  | 32000 | 0.3526          | 6.8704 |
| 0.0           | 331.24 | 33000 | 0.3570          | 6.8831 |
| 0.0           | 341.28 | 34000 | 0.3614          | 6.8851 |
| 0.0           | 351.32 | 35000 | 0.3645          | 6.8782 |
| 0.0           | 361.36 | 36000 | 0.3663          | 6.8714 |
| 0.0           | 371.39 | 37000 | 0.3677          | 6.8675 |
| 0.0           | 381.43 | 38000 | 0.3681          | 6.8802 |
| 0.0           | 391.47 | 39000 | 0.3686          | 6.8880 |
| 0.0           | 401.51 | 40000 | 0.3688          | 6.8880 |
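
The WER values above are percentages on the Common Voice evaluation set. For reference, word error rate can be computed with the evaluate library along these lines (illustrative strings only, not the actual evaluation data or script):

```python
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["kaixo mundua"]  # hypothetical model output
references = ["kaixo mundu"]    # hypothetical reference transcript

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}%")  # -> WER: 50.00%
```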

Framework versions

  • Transformers 4.37.2
  • PyTorch 2.2.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.1
