250119-centralized_learning

This model is a fine-tuned version of facebook/wav2vec2-base-960h on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the list):

  • Loss: 3624.4978
  • Wer: 1.0
  • Cer: 0.9981
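
For reference, the checkpoint should be loadable with the standard Wav2Vec2 CTC classes from Transformers. This is a sketch, not part of the original card: it assumes the processor files were saved alongside the weights, and `sample.wav` is a placeholder for a 16 kHz mono recording.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Load the fine-tuned checkpoint (repo id taken from this card); this
# assumes the processor/tokenizer files are present in the repository.
repo_id = "zainulhakim/250119-centralized_learning"
model = Wav2Vec2ForCTC.from_pretrained(repo_id)
processor = Wav2Vec2Processor.from_pretrained(repo_id)

# "sample.wav" is a placeholder; wav2vec2-base expects 16 kHz mono audio.
speech, _ = librosa.load("sample.wav", sr=16_000, mono=True)

inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: pick the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```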

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
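
For reproducibility, the listed settings map onto a Trainer configuration roughly as follows. This is a sketch under assumptions, not the original training script; `output_dir` is a placeholder and the model/data pipeline around it is not shown in this card.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="250119-centralized_learning",
    learning_rate=1e-3,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,   # betas/epsilon as listed for the Adam optimizer
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,        # "Native AMP" mixed-precision training
)
```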

Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:---:|:---:|:---:|:---:|:---:|:---:|
| 3768.9383 | 1.0 | 2900 | 3624.4978 | 1.0 | 0.9981 |
| 3664.7982 | 2.0 | 5800 | 3623.8293 | 1.0 | 0.9981 |
| 3628.479 | 3.0 | 8700 | 3623.8401 | 1.0 | 0.9981 |
| 3629.406 | 4.0 | 11600 | 3623.7019 | 1.0 | 0.9981 |
| 3625.2683 | 5.0 | 14500 | 3623.6763 | 1.0 | 0.9981 |
| 3624.6528 | 6.0 | 17400 | 3623.6826 | 1.0 | 0.9981 |
| 3624.5862 | 7.0 | 20300 | 3623.6912 | 1.0 | 0.9981 |
| 3624.5 | 8.0 | 23200 | 3623.6826 | 1.0 | 0.9981 |
| 3623.9722 | 9.0 | 26100 | 3623.6606 | 1.0 | 0.9981 |
| 3625.5082 | 10.0 | 29000 | 3623.7019 | 1.0 | 0.9981 |
| 3624.1595 | 11.0 | 31900 | 3623.6472 | 1.0 | 0.9981 |
| 3625.175 | 12.0 | 34800 | 3623.6831 | 1.0 | 0.9981 |
| 3625.192 | 13.0 | 37700 | 3623.6731 | 1.0 | 0.9981 |
| 3623.9662 | 14.0 | 40600 | 3623.6777 | 1.0 | 0.9981 |
| 3623.9175 | 15.0 | 43500 | 3623.6653 | 1.0 | 0.9981 |
| 3623.9675 | 16.0 | 46400 | 3623.6497 | 1.0 | 0.9981 |
| 3624.471 | 17.0 | 49300 | 3623.6682 | 1.0 | 0.9981 |
| 3623.9983 | 18.0 | 52200 | 3623.6560 | 1.0 | 0.9981 |
| 3624.317 | 19.0 | 55100 | 3623.6587 | 1.0 | 0.9981 |
| 3623.9965 | 20.0 | 58000 | 3623.6743 | 1.0 | 0.9981 |
| 3623.9733 | 21.0 | 60900 | 3623.6799 | 1.0 | 0.9981 |
| 3623.9163 | 22.0 | 63800 | 3623.6455 | 1.0 | 0.9981 |
| 3623.817 | 23.0 | 66700 | 3623.6487 | 1.0 | 0.9981 |
| 3623.8545 | 24.0 | 69600 | 3623.6912 | 1.0 | 0.9981 |
| 3623.9572 | 25.0 | 72500 | 3623.6763 | 1.0 | 0.9981 |
| 3624.0855 | 26.0 | 75400 | 3623.6682 | 1.0 | 0.9981 |
| 3623.9163 | 27.0 | 78300 | 3623.6519 | 1.0 | 0.9981 |
| 3623.813 | 28.0 | 81200 | 3623.6863 | 1.0 | 0.9981 |
| 3624.0905 | 29.0 | 84100 | 3623.6797 | 1.0 | 0.9981 |
| 3623.847 | 30.0 | 87000 | 3623.6743 | 1.0 | 0.9981 |
| 3624.7505 | 31.0 | 89900 | 3623.6768 | 1.0 | 0.9981 |
| 3623.794 | 32.0 | 92800 | 3623.6716 | 1.0 | 0.9981 |
| 3623.775 | 33.0 | 95700 | 3623.6409 | 1.0 | 0.9981 |
| 3624.021 | 34.0 | 98600 | 3623.6431 | 1.0 | 0.9981 |
| 3623.7987 | 35.0 | 101500 | 3623.6394 | 1.0 | 0.9981 |
| 3623.8223 | 36.0 | 104400 | 3623.6831 | 1.0 | 0.9981 |
| 3623.979 | 37.0 | 107300 | 3623.6785 | 1.0 | 0.9981 |
| 3623.809 | 38.0 | 110200 | 3623.6826 | 1.0 | 0.9981 |
| 3623.857 | 39.0 | 113100 | 3623.6643 | 1.0 | 0.9981 |
| 3623.8252 | 40.0 | 116000 | 3623.6731 | 1.0 | 0.9981 |
| 3623.916 | 41.0 | 118900 | 3623.6838 | 1.0 | 0.9981 |
| 3623.9533 | 42.0 | 121800 | 3623.6694 | 1.0 | 0.9981 |
| 3618.18 | 43.0 | 124700 | nan | 1.0 | 0.9981 |
| 0.0 | 44.0 | 127600 | nan | 1.0 | 0.9981 |
| 0.0 | 45.0 | 130500 | nan | 1.0 | 0.9981 |
| 0.0 | 46.0 | 133400 | nan | 1.0 | 0.9981 |
| 0.0 | 47.0 | 136300 | nan | 1.0 | 0.9981 |
| 0.0 | 48.0 | 139200 | nan | 1.0 | 0.9981 |
| 0.0 | 49.0 | 142100 | nan | 1.0 | 0.9981 |
| 0.0 | 50.0 | 145000 | nan | 1.0 | 0.9981 |
| 0.0 | 51.0 | 147900 | nan | 1.0 | 0.9981 |
| 0.0 | 52.0 | 150800 | nan | 1.0 | 0.9981 |
| 0.0 | 53.0 | 153700 | nan | 1.0 | 0.9981 |
| 0.0 | 54.0 | 156600 | nan | 1.0 | 0.9981 |
| 0.0 | 55.0 | 159500 | nan | 1.0 | 0.9981 |
| 0.0 | 56.0 | 162400 | nan | 1.0 | 0.9981 |
| 0.0 | 57.0 | 165300 | nan | 1.0 | 0.9981 |
| 0.0 | 58.0 | 168200 | nan | 1.0 | 0.9981 |
| 0.0 | 59.0 | 171100 | nan | 1.0 | 0.9981 |
| 0.0 | 60.0 | 174000 | nan | 1.0 | 0.9981 |
| 0.0 | 61.0 | 176900 | nan | 1.0 | 0.9981 |
| 0.0 | 62.0 | 179800 | nan | 1.0 | 0.9981 |
| 0.0 | 63.0 | 182700 | nan | 1.0 | 0.9981 |
| 0.0 | 64.0 | 185600 | nan | 1.0 | 0.9981 |
| 0.0 | 65.0 | 188500 | nan | 1.0 | 0.9981 |
| 0.0 | 66.0 | 191400 | nan | 1.0 | 0.9981 |
| 0.0 | 67.0 | 194300 | nan | 1.0 | 0.9981 |
| 0.0 | 68.0 | 197200 | nan | 1.0 | 0.9981 |
| 0.0 | 69.0 | 200100 | nan | 1.0 | 0.9981 |
| 0.0 | 70.0 | 203000 | nan | 1.0 | 0.9981 |
| 0.0 | 71.0 | 205900 | nan | 1.0 | 0.9981 |
| 0.0 | 72.0 | 208800 | nan | 1.0 | 0.9981 |
| 0.0 | 73.0 | 211700 | nan | 1.0 | 0.9981 |
| 0.0 | 74.0 | 214600 | nan | 1.0 | 0.9981 |
| 0.0 | 75.0 | 217500 | nan | 1.0 | 0.9981 |
| 0.0 | 76.0 | 220400 | nan | 1.0 | 0.9981 |
| 0.0 | 77.0 | 223300 | nan | 1.0 | 0.9981 |
| 0.0 | 78.0 | 226200 | nan | 1.0 | 0.9981 |
| 0.0 | 79.0 | 229100 | nan | 1.0 | 0.9981 |
| 0.0 | 80.0 | 232000 | nan | 1.0 | 0.9981 |
| 0.0 | 81.0 | 234900 | nan | 1.0 | 0.9981 |
| 0.0 | 82.0 | 237800 | nan | 1.0 | 0.9981 |
| 0.0 | 83.0 | 240700 | nan | 1.0 | 0.9981 |
| 0.0 | 84.0 | 243600 | nan | 1.0 | 0.9981 |
| 0.0 | 85.0 | 246500 | nan | 1.0 | 0.9981 |
| 0.0 | 86.0 | 249400 | nan | 1.0 | 0.9981 |
| 0.0 | 87.0 | 252300 | nan | 1.0 | 0.9981 |
| 0.0 | 88.0 | 255200 | nan | 1.0 | 0.9981 |
| 0.0 | 89.0 | 258100 | nan | 1.0 | 0.9981 |
| 0.0 | 90.0 | 261000 | nan | 1.0 | 0.9981 |
| 0.0 | 91.0 | 263900 | nan | 1.0 | 0.9981 |
| 0.0 | 92.0 | 266800 | nan | 1.0 | 0.9981 |
| 0.0 | 93.0 | 269700 | nan | 1.0 | 0.9981 |
| 0.0 | 94.0 | 272600 | nan | 1.0 | 0.9981 |
| 0.0 | 95.0 | 275500 | nan | 1.0 | 0.9981 |
| 0.0 | 96.0 | 278400 | nan | 1.0 | 0.9981 |
| 0.0 | 97.0 | 281300 | nan | 1.0 | 0.9981 |
| 0.0 | 98.0 | 284200 | nan | 1.0 | 0.9981 |
| 0.0 | 99.0 | 287100 | nan | 1.0 | 0.9981 |
| 0.0 | 100.0 | 290000 | nan | 1.0 | 0.9981 |
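
Note that from epoch 43 onward the validation loss is nan and the logged training loss collapses to 0.0, which typically indicates a diverged run; the constant WER of 1.0 likewise suggests the transcriptions never matched the references. For reference, WER and CER as reported above can be computed with the evaluate library; the reference and prediction strings below are placeholders, not data from this run.

```python
import evaluate  # the "wer" metric uses the jiwer backend under the hood

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder strings; in evaluation these come from the dataset
# references and the model's decoded predictions.
references = ["the cat sat on the mat"]
predictions = ["the cat sat on a mat"]

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```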

Framework versions

  • Transformers 4.43.3
  • PyTorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.19.1

Model size

  • 94.4M parameters (F32, Safetensors)