# wav2vec2-large-mms-1b-nhi-adapterft-ilv_fold1
This model is a fine-tuned version of [facebook/mms-1b-all](https://huggingface.co/facebook/mms-1b-all) on the audiofolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.6054
- Wer: 0.3871
- Cer: 0.1165
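A minimal transcription sketch for this checkpoint (an assumption, since the card does not document usage: the `model_id` must be the full Hub id including the owner namespace, which is not stated here, and MMS-family models expect 16 kHz mono audio):

```python
def transcribe(audio_path: str, model_id: str) -> str:
    """Greedy CTC decoding with a fine-tuned MMS checkpoint.

    model_id is assumed to be the full Hub id, e.g.
    "<owner>/wav2vec2-large-mms-1b-nhi-adapterft-ilv_fold1";
    the owner namespace is not given in this card.
    """
    import torch
    import librosa
    from transformers import AutoProcessor, Wav2Vec2ForCTC

    processor = AutoProcessor.from_pretrained(model_id)
    model = Wav2Vec2ForCTC.from_pretrained(model_id)

    # MMS models are trained on 16 kHz mono input.
    speech, _ = librosa.load(audio_path, sr=16_000)
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

    with torch.no_grad():
        logits = model(**inputs).logits
    ids = torch.argmax(logits, dim=-1)[0]
    return processor.decode(ids)
```

Greedy argmax decoding matches how the Trainer computes eval WER/CER; a language-model-backed decoder could lower the error rates further.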
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 20
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 100
- mixed_precision_training: Native AMP
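The list above maps directly onto `transformers.TrainingArguments`. A config-only sketch (model, dataset, and data-collator wiring omitted; `output_dir` is an assumption):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; Adam betas/epsilon shown
# are the transformers defaults, matching the values in this card.
args = TrainingArguments(
    output_dir="wav2vec2-large-mms-1b-nhi-adapterft-ilv_fold1",  # assumed
    learning_rate=1e-3,
    per_device_train_batch_size=20,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
)
```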
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
1.312 | 1.6529 | 200 | 0.8322 | 0.7050 | 0.2128 |
0.8371 | 3.3058 | 400 | 0.6980 | 0.5965 | 0.1790 |
0.7485 | 4.9587 | 600 | 0.6473 | 0.5388 | 0.1606 |
0.6799 | 6.6116 | 800 | 0.6305 | 0.5407 | 0.1561 |
0.6308 | 8.2645 | 1000 | 0.6215 | 0.4876 | 0.1467 |
0.6359 | 9.9174 | 1200 | 0.6104 | 0.4979 | 0.1457 |
0.5936 | 11.5702 | 1400 | 0.5851 | 0.4799 | 0.1431 |
0.5757 | 13.2231 | 1600 | 0.5906 | 0.4769 | 0.1389 |
0.5479 | 14.8760 | 1800 | 0.5851 | 0.4826 | 0.1377 |
0.5516 | 16.5289 | 2000 | 0.5596 | 0.4685 | 0.1373 |
0.5269 | 18.1818 | 2200 | 0.5788 | 0.4624 | 0.1334 |
0.5131 | 19.8347 | 2400 | 0.5708 | 0.4952 | 0.1411 |
0.495 | 21.4876 | 2600 | 0.5743 | 0.4532 | 0.1303 |
0.4898 | 23.1405 | 2800 | 0.5731 | 0.4421 | 0.1290 |
0.4718 | 24.7934 | 3000 | 0.5610 | 0.4471 | 0.1307 |
0.4717 | 26.4463 | 3200 | 0.5691 | 0.4498 | 0.1307 |
0.4573 | 28.0992 | 3400 | 0.5787 | 0.4348 | 0.1277 |
0.4587 | 29.7521 | 3600 | 0.5597 | 0.4345 | 0.1253 |
0.4265 | 31.4050 | 3800 | 0.5582 | 0.4219 | 0.1246 |
0.4383 | 33.0579 | 4000 | 0.5665 | 0.4127 | 0.1229 |
0.416 | 34.7107 | 4200 | 0.5615 | 0.4241 | 0.1256 |
0.408 | 36.3636 | 4400 | 0.5561 | 0.4264 | 0.1257 |
0.4014 | 38.0165 | 4600 | 0.5676 | 0.4272 | 0.1255 |
0.3929 | 39.6694 | 4800 | 0.5799 | 0.4199 | 0.1224 |
0.3764 | 41.3223 | 5000 | 0.5859 | 0.4119 | 0.1197 |
0.3765 | 42.9752 | 5200 | 0.5763 | 0.4092 | 0.1207 |
0.3783 | 44.6281 | 5400 | 0.5926 | 0.4264 | 0.1241 |
0.36 | 46.2810 | 5600 | 0.5598 | 0.4012 | 0.1182 |
0.3625 | 47.9339 | 5800 | 0.5811 | 0.4257 | 0.1228 |
0.3615 | 49.5868 | 6000 | 0.5826 | 0.4096 | 0.1215 |
0.3331 | 51.2397 | 6200 | 0.5792 | 0.4119 | 0.1211 |
0.3378 | 52.8926 | 6400 | 0.5749 | 0.3993 | 0.1192 |
0.3322 | 54.5455 | 6600 | 0.5691 | 0.4100 | 0.1224 |
0.3298 | 56.1983 | 6800 | 0.5781 | 0.4020 | 0.1188 |
0.3216 | 57.8512 | 7000 | 0.5834 | 0.4131 | 0.1207 |
0.3141 | 59.5041 | 7200 | 0.5874 | 0.3963 | 0.1184 |
0.303 | 61.1570 | 7400 | 0.5974 | 0.3966 | 0.1161 |
0.3128 | 62.8099 | 7600 | 0.5845 | 0.3982 | 0.1169 |
0.2936 | 64.4628 | 7800 | 0.5694 | 0.4043 | 0.1172 |
0.2951 | 66.1157 | 8000 | 0.5751 | 0.4054 | 0.1195 |
0.2889 | 67.7686 | 8200 | 0.6028 | 0.4035 | 0.1189 |
0.2735 | 69.4215 | 8400 | 0.5818 | 0.3951 | 0.1169 |
0.2858 | 71.0744 | 8600 | 0.6124 | 0.3924 | 0.1184 |
0.2707 | 72.7273 | 8800 | 0.5837 | 0.3863 | 0.1147 |
0.2788 | 74.3802 | 9000 | 0.5813 | 0.3943 | 0.1168 |
0.2649 | 76.0331 | 9200 | 0.5954 | 0.3921 | 0.1171 |
0.2606 | 77.6860 | 9400 | 0.5994 | 0.3898 | 0.1161 |
0.2698 | 79.3388 | 9600 | 0.5999 | 0.3921 | 0.1164 |
0.2529 | 80.9917 | 9800 | 0.5921 | 0.3917 | 0.1171 |
0.2504 | 82.6446 | 10000 | 0.5949 | 0.3901 | 0.1168 |
0.2618 | 84.2975 | 10200 | 0.5993 | 0.3970 | 0.1181 |
0.2549 | 85.9504 | 10400 | 0.6138 | 0.3936 | 0.1170 |
0.2529 | 87.6033 | 10600 | 0.6049 | 0.3890 | 0.1165 |
0.2448 | 89.2562 | 10800 | 0.6051 | 0.3882 | 0.1164 |
0.2461 | 90.9091 | 11000 | 0.6089 | 0.3913 | 0.1169 |
0.245 | 92.5620 | 11200 | 0.6056 | 0.3878 | 0.1172 |
0.2423 | 94.2149 | 11400 | 0.6094 | 0.3821 | 0.1151 |
0.2423 | 95.8678 | 11600 | 0.6002 | 0.3863 | 0.1151 |
0.2364 | 97.5207 | 11800 | 0.6006 | 0.3844 | 0.1155 |
0.2427 | 99.1736 | 12000 | 0.6054 | 0.3871 | 0.1165 |
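The Wer and Cer columns are word- and character-level edit-distance rates: edits needed to turn the hypothesis into the reference, divided by the reference length. The training run itself most likely used a metrics library such as `evaluate`; a pure-Python sketch for illustration:

```python
def edit_distance(ref, hyp) -> int:
    """Levenshtein distance between two sequences via one-row DP."""
    d = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        prev, d[0] = d[0], i
        for j, h in enumerate(hyp, start=1):
            cur = min(d[j] + 1,         # deletion
                      d[j - 1] + 1,     # insertion
                      prev + (r != h))  # substitution (free on match)
            prev, d[j] = d[j], cur
    return d[-1]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edits / reference word count."""
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edits / reference length."""
    return edit_distance(reference, hypothesis) / len(reference)
```

For example, `wer("hello world", "hello there world")` is 0.5: one inserted word against a two-word reference. Note that a final WER of 0.3871 means roughly 39 word errors per 100 reference words.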
### Framework versions
- Transformers 4.41.2
- Pytorch 2.4.0
- Datasets 2.19.1
- Tokenizers 0.19.1