# wav2vec2-turkish-300m-8

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on the Turkish split of the common_voice_17_0 dataset. It achieves the following results on the evaluation set:
- Loss: 0.2539
- Wer: 0.1949
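The reported Wer is the word error rate: the word-level edit distance between the model's transcript and the reference, divided by the number of reference words. A minimal sketch of that computation (the `wer` function here is illustrative, not the training code, which typically uses the `evaluate`/`jiwer` implementation):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # One-row dynamic-programming Levenshtein distance over words.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev, d[0] = d[0], i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(d[j] + 1,                              # deletion
                       d[j - 1] + 1,                          # insertion
                       prev + (ref[i - 1] != hyp[j - 1]))     # substitution
            prev = cur
    return d[-1] / len(ref)

print(wer("merhaba nasılsın", "merhaba nasilsin"))  # 1 substitution / 2 words = 0.5
```

So the final Wer of 0.1949 means roughly one word-level error per five reference words.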
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 0.1
- num_epochs: 20
- mixed_precision_training: Native AMP
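A note on the schedule: `lr_scheduler_warmup_steps: 0.1` is fractional, so the logged value is almost certainly a warmup *ratio* (10% of total steps) rather than a step count. Under that assumption, a sketch of the resulting linear warmup-then-decay schedule, using the card's learning rate (1e-4) and the final step from the results table (58000):

```python
def linear_schedule_lr(step: int, base_lr: float = 1e-4,
                       total_steps: int = 58000,
                       warmup_ratio: float = 0.1) -> float:
    """Linear warmup to base_lr, then linear decay to 0.

    Assumption: the logged warmup value 0.1 is a fraction of total steps,
    i.e. 5800 warmup steps for a 58000-step run.
    """
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_schedule_lr(5800))   # peak at end of warmup → 1e-4
print(linear_schedule_lr(58000))  # end of training → 0.0
```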
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer |
---|---|---|---|---|
3.4377 | 0.1724 | 500 | 0.7853 | 0.6509 |
0.7415 | 0.3447 | 1000 | 0.4365 | 0.4855 |
0.472 | 0.5171 | 1500 | 0.3851 | 0.4410 |
0.3678 | 0.6894 | 2000 | 0.3468 | 0.4292 |
0.3512 | 0.8618 | 2500 | 0.3287 | 0.4139 |
0.3345 | 1.0341 | 3000 | 0.3030 | 0.3810 |
0.2976 | 1.2065 | 3500 | 0.3085 | 0.3702 |
0.2841 | 1.3788 | 4000 | 0.3024 | 0.3964 |
0.2674 | 1.5512 | 4500 | 0.2864 | 0.3471 |
0.2693 | 1.7235 | 5000 | 0.2664 | 0.3411 |
0.2564 | 1.8959 | 5500 | 0.2700 | 0.3399 |
0.2407 | 2.0683 | 6000 | 0.2649 | 0.3284 |
0.2225 | 2.2406 | 6500 | 0.2619 | 0.3243 |
0.2209 | 2.4130 | 7000 | 0.2634 | 0.3154 |
0.2221 | 2.5853 | 7500 | 0.2700 | 0.3250 |
0.2104 | 2.7577 | 8000 | 0.2576 | 0.3115 |
0.2095 | 2.9300 | 8500 | 0.2522 | 0.3123 |
0.2031 | 3.1024 | 9000 | 0.2453 | 0.2954 |
0.1849 | 3.2747 | 9500 | 0.2483 | 0.2949 |
0.1911 | 3.4471 | 10000 | 0.2454 | 0.2984 |
0.1784 | 3.6194 | 10500 | 0.2619 | 0.2956 |
0.1891 | 3.7918 | 11000 | 0.2520 | 0.2870 |
0.1822 | 3.9642 | 11500 | 0.2456 | 0.2945 |
0.1633 | 4.1365 | 12000 | 0.2473 | 0.2905 |
0.1594 | 4.3089 | 12500 | 0.2413 | 0.2863 |
0.1616 | 4.4812 | 13000 | 0.2499 | 0.2852 |
0.1633 | 4.6536 | 13500 | 0.2414 | 0.2844 |
0.1652 | 4.8259 | 14000 | 0.2330 | 0.2894 |
0.1659 | 4.9983 | 14500 | 0.2339 | 0.2703 |
0.1496 | 5.1706 | 15000 | 0.2405 | 0.2832 |
0.1468 | 5.3430 | 15500 | 0.2378 | 0.2731 |
0.1435 | 5.5153 | 16000 | 0.2328 | 0.2679 |
0.1386 | 5.6877 | 16500 | 0.2332 | 0.2715 |
0.1422 | 5.8600 | 17000 | 0.2328 | 0.2683 |
0.1429 | 6.0324 | 17500 | 0.2500 | 0.2715 |
0.1271 | 6.2048 | 18000 | 0.2447 | 0.2635 |
0.1374 | 6.3771 | 18500 | 0.2412 | 0.2679 |
0.1306 | 6.5495 | 19000 | 0.2403 | 0.2604 |
0.1287 | 6.7218 | 19500 | 0.2319 | 0.2541 |
0.131 | 6.8942 | 20000 | 0.2407 | 0.2600 |
0.1261 | 7.0665 | 20500 | 0.2335 | 0.2547 |
0.1202 | 7.2389 | 21000 | 0.2321 | 0.2509 |
0.1194 | 7.4112 | 21500 | 0.2380 | 0.2546 |
0.1216 | 7.5836 | 22000 | 0.2515 | 0.2560 |
0.1139 | 7.7559 | 22500 | 0.2295 | 0.2502 |
0.1159 | 7.9283 | 23000 | 0.2291 | 0.2529 |
0.1145 | 8.1007 | 23500 | 0.2471 | 0.2507 |
0.1072 | 8.2730 | 24000 | 0.2327 | 0.2456 |
0.1106 | 8.4454 | 24500 | 0.2243 | 0.2461 |
0.1069 | 8.6177 | 25000 | 0.2305 | 0.2456 |
0.1116 | 8.7901 | 25500 | 0.2397 | 0.2486 |
0.1079 | 8.9624 | 26000 | 0.2417 | 0.2528 |
0.094 | 9.1348 | 26500 | 0.2484 | 0.2442 |
0.0954 | 9.3071 | 27000 | 0.2385 | 0.2477 |
0.0981 | 9.4795 | 27500 | 0.2526 | 0.2516 |
0.1037 | 9.6518 | 28000 | 0.2346 | 0.2391 |
0.0934 | 9.8242 | 28500 | 0.2342 | 0.2414 |
0.0968 | 9.9966 | 29000 | 0.2385 | 0.2387 |
0.0954 | 10.1689 | 29500 | 0.2367 | 0.2389 |
0.0903 | 10.3413 | 30000 | 0.2346 | 0.2365 |
0.0931 | 10.5136 | 30500 | 0.2472 | 0.2385 |
0.0911 | 10.6860 | 31000 | 0.2562 | 0.2368 |
0.0902 | 10.8583 | 31500 | 0.2375 | 0.2390 |
0.0831 | 11.0307 | 32000 | 0.2265 | 0.2326 |
0.0822 | 11.2030 | 32500 | 0.2464 | 0.2305 |
0.083 | 11.3754 | 33000 | 0.2361 | 0.2299 |
0.0802 | 11.5477 | 33500 | 0.2440 | 0.2389 |
0.0757 | 11.7201 | 34000 | 0.2435 | 0.2261 |
0.0781 | 11.8925 | 34500 | 0.2410 | 0.2293 |
0.0823 | 12.0648 | 35000 | 0.2551 | 0.2423 |
0.0748 | 12.2372 | 35500 | 0.2448 | 0.2245 |
0.0724 | 12.4095 | 36000 | 0.2369 | 0.2208 |
0.0716 | 12.5819 | 36500 | 0.2462 | 0.2280 |
0.0734 | 12.7542 | 37000 | 0.2407 | 0.2255 |
0.0771 | 12.9266 | 37500 | 0.2461 | 0.2304 |
0.0715 | 13.0989 | 38000 | 0.2496 | 0.2237 |
0.0702 | 13.2713 | 38500 | 0.2515 | 0.2228 |
0.0697 | 13.4436 | 39000 | 0.2377 | 0.2217 |
0.0712 | 13.6160 | 39500 | 0.2446 | 0.2182 |
0.0641 | 13.7883 | 40000 | 0.2461 | 0.2187 |
0.0712 | 13.9607 | 40500 | 0.2534 | 0.2155 |
0.0644 | 14.1331 | 41000 | 0.2428 | 0.2140 |
0.0584 | 14.3054 | 41500 | 0.2595 | 0.2156 |
0.0621 | 14.4778 | 42000 | 0.2474 | 0.2139 |
0.0634 | 14.6501 | 42500 | 0.2571 | 0.2184 |
0.0643 | 14.8225 | 43000 | 0.2556 | 0.2180 |
0.0599 | 14.9948 | 43500 | 0.2532 | 0.2160 |
0.06 | 15.1672 | 44000 | 0.2468 | 0.2182 |
0.0555 | 15.3395 | 44500 | 0.2530 | 0.2152 |
0.0542 | 15.5119 | 45000 | 0.2530 | 0.2080 |
0.0533 | 15.6842 | 45500 | 0.2414 | 0.2111 |
0.0587 | 15.8566 | 46000 | 0.2457 | 0.2081 |
0.0556 | 16.0290 | 46500 | 0.2509 | 0.2085 |
0.0538 | 16.2013 | 47000 | 0.2500 | 0.2067 |
0.052 | 16.3737 | 47500 | 0.2472 | 0.2076 |
0.0504 | 16.5460 | 48000 | 0.2537 | 0.2080 |
0.0562 | 16.7184 | 48500 | 0.2512 | 0.2047 |
0.0487 | 16.8907 | 49000 | 0.2604 | 0.2058 |
0.0526 | 17.0631 | 49500 | 0.2530 | 0.2064 |
0.0457 | 17.2354 | 50000 | 0.2531 | 0.2034 |
0.0483 | 17.4078 | 50500 | 0.2532 | 0.2032 |
0.0456 | 17.5801 | 51000 | 0.2585 | 0.2040 |
0.0507 | 17.7525 | 51500 | 0.2550 | 0.2025 |
0.0471 | 17.9249 | 52000 | 0.2439 | 0.2003 |
0.0485 | 18.0972 | 52500 | 0.2517 | 0.1989 |
0.0472 | 18.2696 | 53000 | 0.2540 | 0.2007 |
0.0472 | 18.4419 | 53500 | 0.2595 | 0.2016 |
0.0464 | 18.6143 | 54000 | 0.2491 | 0.1987 |
0.0436 | 18.7866 | 54500 | 0.2581 | 0.1988 |
0.0443 | 18.9590 | 55000 | 0.2530 | 0.1978 |
0.0454 | 19.1313 | 55500 | 0.2525 | 0.1967 |
0.039 | 19.3037 | 56000 | 0.2537 | 0.1956 |
0.0432 | 19.4760 | 56500 | 0.2571 | 0.1975 |
0.0431 | 19.6484 | 57000 | 0.2543 | 0.1964 |
0.0449 | 19.8208 | 57500 | 0.2543 | 0.1950 |
0.0407 | 19.9931 | 58000 | 0.2539 | 0.1949 |
## Framework versions
- Transformers 4.40.0
- Pytorch 2.2.2+cu121
- Datasets 2.17.1
- Tokenizers 0.19.1