---
license: mit
base_model: facebook/w2v-bert-2.0
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: w2v-bert-2-malayalam-combo-v1
    results: []
---

# w2v-bert-2-malayalam-combo-v1

This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on an unspecified Malayalam speech dataset (auto-logged as "None"). It achieves the following results on the evaluation set:

- Loss: inf
- Wer: 0.1007

The infinite loss values reported here and throughout training are most likely a numerical overflow in the CTC loss (a known artifact under mixed-precision training) rather than divergence; the steadily falling WER suggests training itself proceeded normally.
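
For quick transcription, here is a minimal inference sketch. The repo id and audio file name are hypothetical placeholders; `Wav2Vec2BertForCTC` is assumed here as the standard CTC head used with `w2v-bert-2.0` checkpoints.

```python
import torch
import librosa
from transformers import AutoProcessor, Wav2Vec2BertForCTC

# Hypothetical repo id and file name -- substitute the actual checkpoint and audio.
model_id = "iammahadev/w2v-bert-2-malayalam-combo-v1"
processor = AutoProcessor.from_pretrained(model_id)
model = Wav2Vec2BertForCTC.from_pretrained(model_id).eval()

# w2v-bert-2.0 expects 16 kHz mono audio.
audio, _ = librosa.load("sample_malayalam.wav", sr=16_000, mono=True)
inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: per-frame argmax, collapsed by the tokenizer.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```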

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` reconstruction sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
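
The list above maps onto Hugging Face `TrainingArguments` roughly as follows. This is a hedged sketch: `output_dir` is a hypothetical placeholder, and the Adam betas/epsilon match the Trainer defaults, so they are left unset.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2-malayalam-combo-v1",  # hypothetical
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 x 2 = total train batch size 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,  # "Native AMP" mixed-precision training
)
```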

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Wer    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 1.9859        | 0.2432 | 300   | inf             | 0.4513 |
| 0.2903        | 0.4864 | 600   | inf             | 0.4107 |
| 0.2294        | 0.7296 | 900   | inf             | 0.3331 |
| 0.2075        | 0.9728 | 1200  | inf             | 0.2968 |
| 0.1737        | 1.2161 | 1500  | inf             | 0.2862 |
| 0.1561        | 1.4593 | 1800  | inf             | 0.2603 |
| 0.1435        | 1.7025 | 2100  | inf             | 0.2496 |
| 0.1388        | 1.9457 | 2400  | inf             | 0.2329 |
| 0.1213        | 2.1889 | 2700  | inf             | 0.2271 |
| 0.1168        | 2.4321 | 3000  | inf             | 0.2202 |
| 0.1086        | 2.6753 | 3300  | inf             | 0.2273 |
| 0.1131        | 2.9185 | 3600  | inf             | 0.2132 |
| 0.0951        | 3.1617 | 3900  | inf             | 0.2068 |
| 0.0851        | 3.4049 | 4200  | inf             | 0.2075 |
| 0.0905        | 3.6482 | 4500  | inf             | 0.1969 |
| 0.0811        | 3.8914 | 4800  | inf             | 0.1941 |
| 0.0754        | 4.1346 | 5100  | inf             | 0.1717 |
| 0.0653        | 4.3778 | 5400  | inf             | 0.1704 |
| 0.0663        | 4.6210 | 5700  | inf             | 0.1737 |
| 0.0635        | 4.8642 | 6000  | inf             | 0.1551 |
| 0.0607        | 5.1074 | 6300  | inf             | 0.1479 |
| 0.05          | 5.3506 | 6600  | inf             | 0.1478 |
| 0.0519        | 5.5938 | 6900  | inf             | 0.1441 |
| 0.048         | 5.8370 | 7200  | inf             | 0.1410 |
| 0.0428        | 6.0803 | 7500  | inf             | 0.1362 |
| 0.0344        | 6.3235 | 7800  | inf             | 0.1325 |
| 0.0344        | 6.5667 | 8100  | inf             | 0.1242 |
| 0.0361        | 6.8099 | 8400  | inf             | 0.1247 |
| 0.031         | 7.0531 | 8700  | inf             | 0.1227 |
| 0.0256        | 7.2963 | 9000  | inf             | 0.1175 |
| 0.023         | 7.5395 | 9300  | inf             | 0.1172 |
| 0.0223        | 7.7827 | 9600  | inf             | 0.1161 |
| 0.0203        | 8.0259 | 9900  | inf             | 0.1099 |
| 0.014         | 8.2692 | 10200 | inf             | 0.1094 |
| 0.0158        | 8.5124 | 10500 | inf             | 0.1081 |
| 0.0147        | 8.7556 | 10800 | inf             | 0.1078 |
| 0.0132        | 8.9988 | 11100 | inf             | 0.1049 |
| 0.008         | 9.2420 | 11400 | inf             | 0.1048 |
| 0.0081        | 9.4852 | 11700 | inf             | 0.1010 |
| 0.0081        | 9.7284 | 12000 | inf             | 0.1010 |
| 0.0094        | 9.9716 | 12300 | inf             | 0.1007 |
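
The Wer column is word error rate, presumably computed with the `evaluate` library's `wer` metric as in most Trainer-based ASR recipes. A sketch with made-up example strings:

```python
import evaluate

wer_metric = evaluate.load("wer")

# Made-up example: the prediction drops one of the four reference words.
score = wer_metric.compute(
    predictions=["ente peru mahadev"],
    references=["ente peru mahadev aanu"],
)
print(score)  # 0.25 = (0 subs + 0 ins + 1 del) / 4 reference words
```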

### Framework versions

- Transformers 4.42.4
- Pytorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1