This model is a fine-tuned version of [facebook/w2v-bert-2.0](https://huggingface.co/facebook/w2v-bert-2.0) on an unspecified dataset. Its results on the evaluation set are reported in the training table below.
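For reference, the sketch below shows how a fine-tuned Wav2Vec2-BERT CTC checkpoint such as this one is typically loaded for inference with 🤗 Transformers. The repository id and audio file name are placeholders, not values taken from this card.

```python
# A hedged sketch of CTC inference with a fine-tuned Wav2Vec2-BERT model.
# "your-username/w2v-bert-2.0-lingala" and "sample.wav" are placeholders.
import torch
import librosa
from transformers import AutoProcessor, AutoModelForCTC

model_id = "your-username/w2v-bert-2.0-lingala"  # placeholder repo id
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)

# Load and resample the audio to 16 kHz, the rate the feature extractor expects.
speech, _ = librosa.load("sample.wav", sr=16000)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: pick the most likely token per frame, then let the
# tokenizer collapse repeats and blanks into text.
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```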
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The hyperparameters used during training are not listed in this card (more information needed).
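For orientation, here is a hedged sketch of how the training configuration for a run like this is typically declared with the 🤗 Transformers `TrainingArguments` class. Every value below is an illustrative placeholder, not a recorded setting of this run.

```python
# Illustrative only: none of these values are the actual hyperparameters of
# this training run; they merely show where such settings would be declared.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v-bert-2.0-lingala",  # placeholder output directory
    per_device_train_batch_size=8,      # placeholder
    gradient_accumulation_steps=2,      # placeholder
    learning_rate=5e-5,                 # placeholder
    warmup_steps=500,                   # placeholder
    num_train_epochs=54,                # placeholder; the table below ends at epoch 54
    logging_strategy="epoch",           # log once per epoch, as in the table below
    save_strategy="epoch",
)
```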
### Training results

Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
---|---|---|---|---|---|
8.0979 | 1.0 | 39 | 4.3776 | 1.0 | 1.0000 |
3.6468 | 2.0 | 78 | 3.1040 | 1.0052 | 0.7648 |
3.0608 | 3.0 | 117 | 2.9572 | 0.9997 | 0.9745 |
2.9092 | 4.0 | 156 | 2.8562 | 0.9994 | 0.8927 |
2.6916 | 5.0 | 195 | 2.0885 | 0.9851 | 0.6876 |
1.032 | 6.0 | 234 | 0.7733 | 0.4191 | 0.1298 |
0.5445 | 7.0 | 273 | 0.6238 | 0.3464 | 0.1085 |
0.4337 | 8.0 | 312 | 0.6581 | 0.3707 | 0.1118 |
0.3705 | 9.0 | 351 | 0.6407 | 0.3622 | 0.1106 |
0.318 | 10.0 | 390 | 0.6388 | 0.3649 | 0.1114 |
0.2505 | 11.0 | 429 | 0.6244 | 0.3410 | 0.1025 |
0.2108 | 12.0 | 468 | 0.6778 | 0.3368 | 0.1031 |
0.18 | 13.0 | 507 | 0.6533 | 0.3305 | 0.1026 |
0.1528 | 14.0 | 546 | 0.7003 | 0.3481 | 0.1030 |
0.1255 | 15.0 | 585 | 0.7181 | 0.3342 | 0.1035 |
0.1054 | 16.0 | 624 | 0.7766 | 0.3218 | 0.0980 |
0.0937 | 17.0 | 663 | 0.7155 | 0.3300 | 0.0995 |
0.0828 | 18.0 | 702 | 0.7353 | 0.3134 | 0.0955 |
0.0692 | 19.0 | 741 | 0.7471 | 0.3056 | 0.0930 |
0.0544 | 20.0 | 780 | 0.8148 | 0.3205 | 0.0988 |
0.0463 | 21.0 | 819 | 0.8425 | 0.3010 | 0.0940 |
0.0381 | 22.0 | 858 | 0.8396 | 0.3228 | 0.0971 |
0.0383 | 23.0 | 897 | 0.9645 | 0.3047 | 0.0968 |
0.0309 | 24.0 | 936 | 0.8552 | 0.3060 | 0.0929 |
0.0239 | 25.0 | 975 | 0.9528 | 0.3218 | 0.1018 |
0.0262 | 26.0 | 1014 | 0.9318 | 0.2996 | 0.0916 |
0.0189 | 27.0 | 1053 | 1.0495 | 0.2971 | 0.0926 |
0.0165 | 28.0 | 1092 | 0.9751 | 0.2924 | 0.0916 |
0.0132 | 29.0 | 1131 | 0.9325 | 0.2964 | 0.0924 |
0.0124 | 30.0 | 1170 | 0.9158 | 0.2960 | 0.0942 |
0.0147 | 31.0 | 1209 | 0.9964 | 0.2952 | 0.0926 |
0.0158 | 32.0 | 1248 | 1.0100 | 0.2850 | 0.0902 |
0.0077 | 33.0 | 1287 | 0.9393 | 0.2923 | 0.0921 |
0.0127 | 34.0 | 1326 | 0.9722 | 0.2982 | 0.0939 |
0.0044 | 35.0 | 1365 | 1.0325 | 0.2881 | 0.0901 |
0.0059 | 36.0 | 1404 | 1.0391 | 0.2785 | 0.0881 |
0.0027 | 37.0 | 1443 | 1.0116 | 0.2795 | 0.0866 |
0.0012 | 38.0 | 1482 | 1.0550 | 0.2735 | 0.0850 |
0.0006 | 39.0 | 1521 | 1.0673 | 0.2734 | 0.0851 |
0.0004 | 40.0 | 1560 | 1.0859 | 0.2762 | 0.0856 |
0.0004 | 41.0 | 1599 | 1.1013 | 0.2762 | 0.0858 |
0.0003 | 42.0 | 1638 | 1.1089 | 0.2745 | 0.0859 |
0.0002 | 43.0 | 1677 | 1.1119 | 0.2734 | 0.0856 |
0.0002 | 44.0 | 1716 | 1.1180 | 0.2721 | 0.0854 |
0.0001 | 45.0 | 1755 | 1.1242 | 0.2716 | 0.0852 |
0.0001 | 46.0 | 1794 | 1.1305 | 0.2712 | 0.0852 |
0.0001 | 47.0 | 1833 | 1.1367 | 0.2708 | 0.0852 |
0.0001 | 48.0 | 1872 | 1.1432 | 0.2709 | 0.0852 |
0.0001 | 49.0 | 1911 | 1.1477 | 0.2709 | 0.0852 |
0.0001 | 50.0 | 1950 | 1.1524 | 0.2708 | 0.0851 |
0.0001 | 51.0 | 1989 | 1.1563 | 0.2706 | 0.0851 |
0.0001 | 52.0 | 2028 | 1.1605 | 0.2703 | 0.0851 |
0.0001 | 53.0 | 2067 | 1.1637 | 0.2701 | 0.0850 |
0.0001 | 54.0 | 2106 | 1.1670 | 0.2698 | 0.0849 |
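The Wer and Cer columns are word error rate and character error rate on the evaluation set (lower is better). As a minimal sketch, this is how both metrics are commonly computed with the Hugging Face `evaluate` library; the transcripts below are made-up examples, not data from this run.

```python
# Minimal WER/CER computation sketch using the `evaluate` library (requires
# the `jiwer` backend). The strings are illustrative, not from this model.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["mbote na yo"]       # hypothetical reference transcript
predictions = ["mbote na yo te"]   # hypothetical model output

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```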