# wav2vec2-large-xlsr-coraa-exp-3
This model is a fine-tuned version of [Edresson/wav2vec2-large-xlsr-coraa-portuguese](https://huggingface.co/Edresson/wav2vec2-large-xlsr-coraa-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.6735
- WER: 0.4171
- CER: 0.1973
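Both reported metrics are edit-distance based: WER (word error rate) is the Levenshtein distance between reference and hypothesis word sequences divided by the reference length, and CER (character error rate) is the same computed over characters. A minimal pure-Python sketch of how such scores are computed (function names are illustrative, not taken from the training code):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences, O(len(hyp)) memory."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))  # distances for the empty reference prefix
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            # deletion, insertion, or substitution/match
            dp[j] = min(dp[j] + 1, dp[j - 1] + 1,
                        prev + (ref[i - 1] != hyp[j - 1]))
            prev = cur
    return dp[n]

def wer(reference, hypothesis):
    ref = reference.split()
    return edit_distance(ref, hypothesis.split()) / len(ref)

def cer(reference, hypothesis):
    return edit_distance(list(reference), list(hypothesis)) / len(reference)

print(wer("o gato dorme", "o rato dorme"))  # one substitution in three words
```

A WER of 0.4171 therefore means roughly 42 word-level edits per 100 reference words; the much lower CER (0.1973) indicates many errors are near-misses within words.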
## Model description
More information needed
## Intended uses & limitations
More information needed
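The card does not include a usage snippet. Below is a hedged sketch of how a wav2vec2 CTC checkpoint of this type is typically loaded for inference with `transformers` and `torchaudio`; the repository id and audio path are placeholders (the base model's id is shown — substitute the actual location of this fine-tuned checkpoint):

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Placeholder id: replace with the repo hosting this fine-tuned checkpoint.
model_id = "Edresson/wav2vec2-large-xlsr-coraa-portuguese"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio and resample to the 16 kHz rate XLSR models expect.
waveform, sr = torchaudio.load("example.wav")
waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(),
                   sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse with the tokenizer.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```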
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
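The reported total train batch size of 32 follows from the per-device batch size and gradient accumulation. A quick sanity check, which also lets us estimate the (unstated) training-set size from the 14 optimizer steps per epoch visible in the results table:

```python
# Values from the hyperparameter list above.
train_batch_size = 16
gradient_accumulation_steps = 2
num_epochs = 100
total_steps = 1400  # final step count in the results table

# Effective examples consumed per optimizer update.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 32

# 1400 steps over 100 epochs -> 14 steps per epoch, implying roughly
# 14 * 32 = 448 training examples (an estimate; the card does not
# state the dataset size, and the last batch may be partial).
steps_per_epoch = total_steps // num_epochs
print(steps_per_epoch * total_train_batch_size)  # 448
```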
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|
36.864 | 1.0 | 14 | 23.1058 | 1.0 | 0.9617 |
36.864 | 2.0 | 28 | 6.7821 | 1.0 | 0.9619 |
36.864 | 3.0 | 42 | 4.4729 | 1.0 | 0.9619 |
36.864 | 4.0 | 56 | 3.9256 | 1.0 | 0.9619 |
36.864 | 5.0 | 70 | 3.6910 | 1.0 | 0.9619 |
36.864 | 6.0 | 84 | 3.5482 | 1.0 | 0.9619 |
36.864 | 7.0 | 98 | 3.4159 | 1.0 | 0.9619 |
8.8097 | 8.0 | 112 | 3.3477 | 1.0 | 0.9619 |
8.8097 | 9.0 | 126 | 3.2437 | 1.0 | 0.9619 |
8.8097 | 10.0 | 140 | 3.1897 | 1.0 | 0.9619 |
8.8097 | 11.0 | 154 | 3.1493 | 1.0 | 0.9619 |
8.8097 | 12.0 | 168 | 3.1030 | 1.0 | 0.9619 |
8.8097 | 13.0 | 182 | 3.0839 | 1.0 | 0.9619 |
8.8097 | 14.0 | 196 | 3.0636 | 1.0 | 0.9619 |
3.0836 | 15.0 | 210 | 3.0740 | 1.0 | 0.9619 |
3.0836 | 16.0 | 224 | 3.0493 | 1.0 | 0.9619 |
3.0836 | 17.0 | 238 | 3.0592 | 1.0 | 0.9619 |
3.0836 | 18.0 | 252 | 3.0454 | 1.0 | 0.9619 |
3.0836 | 19.0 | 266 | 3.0413 | 1.0 | 0.9619 |
3.0836 | 20.0 | 280 | 3.0225 | 1.0 | 0.9619 |
3.0836 | 21.0 | 294 | 3.0180 | 1.0 | 0.9619 |
2.962 | 22.0 | 308 | 3.0182 | 1.0 | 0.9619 |
2.962 | 23.0 | 322 | 3.0088 | 1.0 | 0.9619 |
2.962 | 24.0 | 336 | 3.0045 | 1.0 | 0.9619 |
2.962 | 25.0 | 350 | 3.0062 | 1.0 | 0.9619 |
2.962 | 26.0 | 364 | 3.0002 | 1.0 | 0.9619 |
2.962 | 27.0 | 378 | 3.0015 | 1.0 | 0.9619 |
2.962 | 28.0 | 392 | 2.9998 | 1.0 | 0.9619 |
2.9296 | 29.0 | 406 | 2.9963 | 1.0 | 0.9619 |
2.9296 | 30.0 | 420 | 2.9960 | 1.0 | 0.9619 |
2.9296 | 31.0 | 434 | 2.9941 | 1.0 | 0.9619 |
2.9296 | 32.0 | 448 | 2.9875 | 1.0 | 0.9619 |
2.9296 | 33.0 | 462 | 2.9809 | 1.0 | 0.9619 |
2.9296 | 34.0 | 476 | 2.9867 | 1.0 | 0.9619 |
2.9296 | 35.0 | 490 | 2.9806 | 1.0 | 0.9619 |
2.9173 | 36.0 | 504 | 2.9788 | 1.0 | 0.9619 |
2.9173 | 37.0 | 518 | 2.9758 | 1.0 | 0.9613 |
2.9173 | 38.0 | 532 | 2.9576 | 1.0 | 0.9573 |
2.9173 | 39.0 | 546 | 2.9418 | 1.0 | 0.9567 |
2.9173 | 40.0 | 560 | 2.9332 | 1.0 | 0.9513 |
2.9173 | 41.0 | 574 | 2.8844 | 1.0 | 0.9505 |
2.9173 | 42.0 | 588 | 2.8447 | 1.0 | 0.9594 |
2.8677 | 43.0 | 602 | 2.7853 | 1.0 | 0.9609 |
2.8677 | 44.0 | 616 | 2.7609 | 1.0 | 0.9611 |
2.8677 | 45.0 | 630 | 2.7352 | 1.0 | 0.9560 |
2.8677 | 46.0 | 644 | 2.6978 | 1.0 | 0.9458 |
2.8677 | 47.0 | 658 | 2.6429 | 1.0 | 0.9091 |
2.8677 | 48.0 | 672 | 2.4628 | 1.0 | 0.7621 |
2.8677 | 49.0 | 686 | 2.2944 | 1.0 | 0.7007 |
2.5853 | 50.0 | 700 | 2.1218 | 1.0 | 0.6241 |
2.5853 | 51.0 | 714 | 1.9631 | 1.0 | 0.5478 |
2.5853 | 52.0 | 728 | 1.7663 | 1.0 | 0.4962 |
2.5853 | 53.0 | 742 | 1.6149 | 1.0 | 0.4384 |
2.5853 | 54.0 | 756 | 1.5029 | 1.0 | 0.4164 |
2.5853 | 55.0 | 770 | 1.4372 | 0.9998 | 0.4015 |
2.5853 | 56.0 | 784 | 1.3467 | 0.9992 | 0.3944 |
2.5853 | 57.0 | 798 | 1.2405 | 0.9961 | 0.3846 |
1.8208 | 58.0 | 812 | 1.1700 | 0.9898 | 0.3715 |
1.8208 | 59.0 | 826 | 1.1102 | 0.9807 | 0.3599 |
1.8208 | 60.0 | 840 | 1.0782 | 0.9606 | 0.3472 |
1.8208 | 61.0 | 854 | 1.0312 | 0.9350 | 0.3325 |
1.8208 | 62.0 | 868 | 0.9807 | 0.8935 | 0.3137 |
1.8208 | 63.0 | 882 | 0.9468 | 0.7877 | 0.2842 |
1.8208 | 64.0 | 896 | 0.9241 | 0.6071 | 0.2397 |
1.2338 | 65.0 | 910 | 0.9088 | 0.5173 | 0.2245 |
1.2338 | 66.0 | 924 | 0.8704 | 0.5136 | 0.2231 |
1.2338 | 67.0 | 938 | 0.8294 | 0.4935 | 0.2174 |
1.2338 | 68.0 | 952 | 0.8129 | 0.4803 | 0.2133 |
1.2338 | 69.0 | 966 | 0.8117 | 0.4616 | 0.2106 |
1.2338 | 70.0 | 980 | 0.7918 | 0.4559 | 0.2091 |
1.2338 | 71.0 | 994 | 0.7759 | 0.4502 | 0.2068 |
0.9426 | 72.0 | 1008 | 0.7622 | 0.4496 | 0.2072 |
0.9426 | 73.0 | 1022 | 0.7588 | 0.4439 | 0.2055 |
0.9426 | 74.0 | 1036 | 0.7419 | 0.4423 | 0.2044 |
0.9426 | 75.0 | 1050 | 0.7495 | 0.4344 | 0.2030 |
0.9426 | 76.0 | 1064 | 0.7344 | 0.4321 | 0.2024 |
0.9426 | 77.0 | 1078 | 0.7324 | 0.4325 | 0.2028 |
0.9426 | 78.0 | 1092 | 0.7141 | 0.4311 | 0.2015 |
0.8254 | 79.0 | 1106 | 0.7201 | 0.4291 | 0.2010 |
0.8254 | 80.0 | 1120 | 0.7149 | 0.4279 | 0.2007 |
0.8254 | 81.0 | 1134 | 0.6996 | 0.4226 | 0.1991 |
0.8254 | 82.0 | 1148 | 0.7043 | 0.4204 | 0.1985 |
0.8254 | 83.0 | 1162 | 0.6972 | 0.4195 | 0.1980 |
0.8254 | 84.0 | 1176 | 0.6973 | 0.4187 | 0.1981 |
0.8254 | 85.0 | 1190 | 0.6902 | 0.4218 | 0.1986 |
0.7519 | 86.0 | 1204 | 0.6910 | 0.4212 | 0.1980 |
0.7519 | 87.0 | 1218 | 0.6867 | 0.4204 | 0.1980 |
0.7519 | 88.0 | 1232 | 0.6844 | 0.4187 | 0.1978 |
0.7519 | 89.0 | 1246 | 0.6873 | 0.4163 | 0.1976 |
0.7519 | 90.0 | 1260 | 0.6771 | 0.4165 | 0.1968 |
0.7519 | 91.0 | 1274 | 0.6828 | 0.4161 | 0.1975 |
0.7519 | 92.0 | 1288 | 0.6806 | 0.4159 | 0.1974 |
0.7161 | 93.0 | 1302 | 0.6787 | 0.4149 | 0.1970 |
0.7161 | 94.0 | 1316 | 0.6768 | 0.4161 | 0.1971 |
0.7161 | 95.0 | 1330 | 0.6735 | 0.4171 | 0.1973 |
0.7161 | 96.0 | 1344 | 0.6771 | 0.4159 | 0.1973 |
0.7161 | 97.0 | 1358 | 0.6747 | 0.4155 | 0.1969 |
0.7161 | 98.0 | 1372 | 0.6761 | 0.4161 | 0.1971 |
0.7161 | 99.0 | 1386 | 0.6769 | 0.4167 | 0.1973 |
0.6707 | 100.0 | 1400 | 0.6766 | 0.4153 | 0.1969 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.13.3