
CngFSt10sec_model

This model is a fine-tuned version of openai/whisper-medium on the Marcusxx/CngFSt10sec dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0450
  • Cer: 37.7682
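
A minimal inference sketch, assuming the checkpoint loads as a standard Whisper model and that a 16 kHz audio clip is available (the file name below is a placeholder). If processor/tokenizer files are not bundled with this repository, they may need to be loaded from openai/whisper-medium instead.

```python
# Hedged usage sketch: the audio file name is a placeholder, not a value from the card.
import librosa
import torch
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_id = "Marcusxx/CngFSt10sec_model"
processor = WhisperProcessor.from_pretrained(model_id)
model = WhisperForConditionalGeneration.from_pretrained(model_id)

# Load a short clip resampled to Whisper's expected 16 kHz sampling rate.
audio, _ = librosa.load("sample.wav", sr=16000)
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    predicted_ids = model.generate(inputs.input_features)

print(processor.batch_decode(predicted_ids, skip_special_tokens=True)[0])
```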

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • training_steps: 2000
  • mixed_precision_training: Native AMP
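
A hedged sketch of how these hyperparameters might map onto Hugging Face Seq2SeqTrainingArguments; the output directory and any settings not listed above are assumptions. The Adam betas and epsilon listed above match the transformers defaults, so they are not passed explicitly.

```python
# Hedged configuration sketch: only the hyperparameters listed above come from the card;
# output_dir and anything else is an assumption.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="CngFSt10sec_model",   # assumption: not stated in the card
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    max_steps=2000,
    fp16=True,                        # corresponds to "Native AMP" mixed precision
)
```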

Training results

Training Loss | Epoch   | Step | Validation Loss | Cer
0.0901        | 2.0661  | 250  | 0.1106          | 52.6468
0.0164        | 4.1322  | 500  | 0.0532          | 66.9124
0.0042        | 6.1983  | 750  | 0.0421          | 30.8922
0.0006        | 8.2645  | 1000 | 0.0433          | 30.8011
0.0005        | 10.3306 | 1250 | 0.0439          | 28.2827
0.0004        | 12.3967 | 1500 | 0.0445          | 32.2674
0.0004        | 14.4628 | 1750 | 0.0448          | 34.0485
0.0004        | 16.5289 | 2000 | 0.0450          | 37.7682
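
For reference, a sketch of how a character error rate like the one reported above might be computed with the evaluate library; the prediction and reference strings are illustrative placeholders, and the percentage scaling is an assumption about how the table reports Cer.

```python
# Hedged CER sketch: the strings below are placeholders, not data from the evaluation set.
import evaluate

cer_metric = evaluate.load("cer")

predictions = ["transcribed hypothesis text"]
references = ["reference transcript text"]

cer = cer_metric.compute(predictions=predictions, references=references)
print(f"CER: {100 * cer:.4f}")  # assuming the table reports CER scaled to a percentage
```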

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.2.2+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1