w2v-V2

This model is a fine-tuned version of facebook/w2v-bert-2.0 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1706
  • WER: 0.1496

Model description

More information needed

Intended uses & limitations

More information needed
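
Pending fuller documentation, the sketch below shows one plausible way to run this checkpoint for speech-to-text inference. It assumes the checkpoint carries a CTC head and a saved processor (the usual setup for w2v-bert-2.0 ASR fine-tunes), that input audio is 16 kHz mono, and that sample.wav is a placeholder path; none of these details are stated in the card.

```python
# Minimal, untested inference sketch; the CTC head, saved processor, and
# "sample.wav" path are assumptions, not details confirmed by this card.
import torch
import librosa
from transformers import AutoModelForCTC, AutoProcessor

model_id = "BriereAssia/w2v-V2"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# Load a 16 kHz mono waveform ("sample.wav" is a placeholder path).
speech, _ = librosa.load("sample.wav", sr=16000)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: pick the most likely token per frame, then collapse repeats/blanks.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```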

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a TrainingArguments sketch reproducing them follows the list:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • training_steps: 5000
  • mixed_precision_training: Native AMP
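
As referenced above, here is a minimal sketch of a transformers TrainingArguments configuration that mirrors the listed values. The output_dir is a placeholder, and any settings not listed on the card (warmup, weight decay, gradient accumulation) are left at their defaults rather than taken from the training run.

```python
# Sketch only: reproduces the hyperparameters listed above; unlisted settings
# are assumptions left at library defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-V2",            # placeholder output directory
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=5000,
    fp16=True,                      # "Native AMP" mixed-precision training
)
```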

Training results

| Training Loss | Epoch  | Step | Validation Loss | WER    |
|---------------|--------|------|-----------------|--------|
| 0.3589        | 0.1049 | 300  | 0.2921          | 0.2762 |
| 0.3512        | 0.2099 | 600  | 0.2855          | 0.2767 |
| 0.2998        | 0.3148 | 900  | 0.2872          | 0.2550 |
| 0.3419        | 0.4197 | 1200 | 0.2641          | 0.2620 |
| 0.2757        | 0.5247 | 1500 | 0.2633          | 0.2332 |
| 0.2827        | 0.6296 | 1800 | 0.2473          | 0.2090 |
| 0.265         | 0.7345 | 2100 | 0.2304          | 0.2226 |
| 0.2985        | 0.8395 | 2400 | 0.2266          | 0.2109 |
| 0.2555        | 0.9444 | 2700 | 0.2279          | 0.1891 |
| 0.255         | 1.0493 | 3000 | 0.2129          | 0.1927 |
| 0.2194        | 1.1542 | 3300 | 0.1991          | 0.1821 |
| 0.172         | 1.2592 | 3600 | 0.1963          | 0.1710 |
| 0.2018        | 1.3641 | 3900 | 0.1860          | 0.1724 |
| 0.2098        | 1.4690 | 4200 | 0.1783          | 0.1717 |
| 0.1996        | 1.5740 | 4500 | 0.1709          | 0.1563 |
| 0.1926        | 1.6789 | 4800 | 0.1706          | 0.1496 |
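
The WER column is the word error rate on the held-out set (lower is better). As a reference for how the metric is computed, here is a minimal sketch using the evaluate library; the strings below are illustrative only and are not drawn from the evaluation data.

```python
# Sketch of computing word error rate (WER) with the `evaluate` library.
# The reference/prediction strings are illustrative, not from this model's eval set.
import evaluate

wer_metric = evaluate.load("wer")

references = ["the quick brown fox jumps over the lazy dog"]
predictions = ["the quick brown fox jumps over a lazy dog"]

# WER = (substitutions + insertions + deletions) / reference word count.
wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # e.g. the card's final eval WER of 0.1496 means roughly 15% word errors
```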

Framework versions

  • Transformers 4.51.1
  • PyTorch 2.5.1+cu121
  • Datasets 3.3.1
  • Tokenizers 0.21.0