ft_0123_korean

This model is a fine-tuned version of facebook/wav2vec2-xls-r-300m on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3699
  • CER (character error rate): 0.0865
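
For reference, a minimal inference sketch (not part of the original card). It assumes the repository ships the usual Wav2Vec2 processor files; the file name sample.wav is a placeholder for 16 kHz Korean speech.

```python
# Minimal inference sketch for this checkpoint (assumes processor files are
# present in the repo; "sample.wav" is a placeholder input file).
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "yoon1000/ft_0123_korean"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# wav2vec2-xls-r-300m expects 16 kHz mono audio.
waveform, sample_rate = torchaudio.load("sample.wav")
if sample_rate != 16_000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16_000)

inputs = processor(waveform.squeeze(0), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: take the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```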

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • num_epochs: 20
  • mixed_precision_training: Native AMP
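
As a hedged reconstruction, the list above maps onto transformers TrainingArguments roughly as follows. The output path is a placeholder, and anything not documented in this card (dataset, data collator, trainer wiring) is omitted.

```python
# Hedged reconstruction of the documented hyperparameters; not the author's
# original training script. Dataset and trainer setup are not shown.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ft_0123_korean",   # placeholder output path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=20,
    fp16=True,                     # "Native AMP" mixed precision
)
```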

Training results

Training Loss   Epoch   Step   Validation Loss   CER
44.8594 0.08 100 44.3665 1.0
12.9149 0.16 200 6.5429 1.0
4.9281 0.24 300 5.0589 1.0
4.7706 0.32 400 5.0132 1.0
4.7775 0.4 500 4.9384 1.0
4.7654 0.48 600 4.8867 1.0
4.7027 0.56 700 4.8270 1.0
4.6665 0.64 800 4.8539 1.0
4.6345 0.72 900 4.7036 1.0
4.5947 0.8 1000 4.6693 1.0
4.5559 0.88 1100 4.5838 1.0
4.4675 0.96 1200 4.4082 0.9829
4.2824 1.04 1300 4.1135 0.9822
3.974 1.12 1400 3.5638 0.8459
3.4392 1.2 1500 2.9181 0.5621
3.0379 1.28 1600 2.6116 0.5058
2.8071 1.36 1700 2.3746 0.4765
2.6047 1.44 1800 2.2212 0.4516
2.4776 1.52 1900 2.0525 0.4275
2.3004 1.6 2000 1.9099 0.4083
2.2378 1.68 2100 1.8172 0.3878
2.1651 1.76 2200 1.7322 0.3740
2.0153 1.84 2300 1.6424 0.3547
1.9648 1.92 2400 1.5879 0.3416
1.8871 2.0 2500 1.5091 0.3311
1.8132 2.08 2600 1.4359 0.3203
1.7394 2.16 2700 1.3848 0.3193
1.6906 2.24 2800 1.3468 0.3147
1.6871 2.32 2900 1.3204 0.3095
1.6068 2.4 3000 1.2524 0.2957
1.5774 2.48 3100 1.2123 0.2852
1.5225 2.56 3200 1.1677 0.2785
1.4817 2.64 3300 1.1342 0.2727
1.4742 2.72 3400 1.0920 0.2667
1.4147 2.8 3500 1.0832 0.2616
1.4332 2.88 3600 1.0515 0.2588
1.3366 2.96 3700 1.0385 0.2558
1.3334 3.04 3800 0.9847 0.2501
1.277 3.12 3900 0.9959 0.2472
1.2866 3.2 4000 0.9506 0.2386
1.2299 3.28 4100 0.9189 0.2318
1.2261 3.36 4200 0.8910 0.2289
1.2111 3.44 4300 0.9037 0.2239
1.1714 3.52 4400 0.8729 0.2209
1.1327 3.6 4500 0.8688 0.2198
1.177 3.68 4600 0.8230 0.2143
1.1003 3.76 4700 0.8700 0.2196
1.083 3.84 4800 0.7949 0.2027
1.0884 3.92 4900 0.7844 0.2053
1.0457 4.0 5000 0.7730 0.2015
0.9918 4.08 5100 0.7647 0.1974
1.0337 4.16 5200 0.7602 0.1962
0.9958 4.24 5300 0.7328 0.1928
0.9732 4.32 5400 0.7355 0.1924
0.9486 4.4 5500 0.7216 0.1910
0.9375 4.48 5600 0.7011 0.1880
0.9509 4.56 5700 0.6768 0.1819
0.9454 4.64 5800 0.6813 0.1805
0.9493 4.72 5900 0.6705 0.1778
0.9406 4.8 6000 0.6510 0.1755
0.9441 4.88 6100 0.6461 0.1702
0.9469 4.96 6200 0.6496 0.1755
0.8764 5.04 6300 0.6510 0.1705
0.8494 5.12 6400 0.6289 0.1694
0.8175 5.2 6500 0.6184 0.1674
0.8543 5.28 6600 0.6155 0.1655
0.8571 5.36 6700 0.6156 0.1637
0.8629 5.44 6800 0.6085 0.1626
0.8203 5.52 6900 0.6037 0.1636
0.8155 5.6 7000 0.5979 0.1625
0.8114 5.68 7100 0.5817 0.1582
0.7884 5.76 7200 0.6055 0.1633
0.7826 5.84 7300 0.5757 0.1565
0.7918 5.92 7400 0.5803 0.1562
0.7835 6.0 7500 0.5714 0.1534
0.717 6.08 7600 0.5784 0.1557
0.726 6.16 7700 0.5566 0.1507
0.7377 6.24 7800 0.5686 0.1561
0.7187 6.32 7900 0.5505 0.1513
0.7374 6.4 8000 0.5610 0.1528
0.7278 6.48 8100 0.5579 0.1530
0.7383 6.56 8200 0.5479 0.1499
0.696 6.64 8300 0.5485 0.1484
0.726 6.72 8400 0.5355 0.1455
0.7024 6.8 8500 0.5213 0.1411
0.7294 6.88 8600 0.5213 0.1446
0.6634 6.96 8700 0.5211 0.1426
0.6888 7.04 8800 0.5341 0.1462
0.6623 7.12 8900 0.5123 0.1397
0.646 7.2 9000 0.5302 0.1447
0.6729 7.28 9100 0.5218 0.1412
0.6444 7.36 9200 0.5098 0.1384
0.6357 7.44 9300 0.5218 0.1414
0.6492 7.52 9400 0.5037 0.1365
0.6388 7.6 9500 0.5091 0.1368
0.6545 7.68 9600 0.4980 0.1365
0.6404 7.76 9700 0.4832 0.1318
0.644 7.84 9800 0.4904 0.1367
0.6401 7.92 9900 0.4985 0.1346
0.6312 8.0 10000 0.4863 0.1338
0.6088 8.08 10100 0.4851 0.1336
0.5566 8.16 10200 0.4844 0.1319
0.5806 8.24 10300 0.4865 0.1326
0.5604 8.32 10400 0.4833 0.1340
0.5977 8.4 10500 0.4726 0.1286
0.5909 8.48 10600 0.4784 0.1300
0.584 8.56 10700 0.4732 0.1282
0.5819 8.64 10800 0.4656 0.1261
0.5977 8.72 10900 0.4616 0.1249
0.5942 8.8 11000 0.4568 0.1249
0.5873 8.88 11100 0.4554 0.1252
0.576 8.96 11200 0.4600 0.1237
0.5503 9.04 11300 0.4699 0.1252
0.5586 9.12 11400 0.4695 0.1265
0.5146 9.2 11500 0.4544 0.1218
0.5505 9.28 11600 0.4509 0.1221
0.5358 9.36 11700 0.4577 0.1259
0.5465 9.44 11800 0.4481 0.1205
0.5255 9.52 11900 0.4427 0.1221
0.506 9.6 12000 0.4466 0.1193
0.5248 9.68 12100 0.4345 0.1167
0.569 9.76 12200 0.4429 0.1173
0.4933 9.84 12300 0.4377 0.1181
0.5381 9.92 12400 0.4359 0.1170
0.5206 10.0 12500 0.4401 0.1185
0.4755 10.08 12600 0.4409 0.1173
0.4899 10.16 12700 0.4350 0.1158
0.4834 10.24 12800 0.4351 0.1158
0.4973 10.32 12900 0.4202 0.1130
0.4798 10.4 13000 0.4335 0.1148
0.499 10.48 13100 0.4327 0.1149
0.4852 10.56 13200 0.4277 0.1149
0.475 10.64 13300 0.4233 0.1111
0.4653 10.72 13400 0.4169 0.1093
0.4776 10.8 13500 0.4341 0.1125
0.4803 10.88 13600 0.4305 0.1124
0.4721 10.96 13700 0.4212 0.1130
0.4655 11.04 13800 0.4262 0.1086
0.4513 11.12 13900 0.4295 0.1113
0.4563 11.2 14000 0.4299 0.1111
0.4537 11.28 14100 0.4192 0.1086
0.4273 11.36 14200 0.4176 0.1067
0.4667 11.44 14300 0.4125 0.1071
0.4468 11.52 14400 0.4080 0.1072
0.4546 11.6 14500 0.4108 0.1047
0.4468 11.68 14600 0.4104 0.1082
0.4322 11.76 14700 0.4071 0.1052
0.4379 11.84 14800 0.3994 0.1057
0.4471 11.92 14900 0.4043 0.1058
0.4394 12.0 15000 0.3971 0.1040
0.4162 12.08 15100 0.3995 0.1027
0.4183 12.16 15200 0.4092 0.1055
0.4192 12.24 15300 0.4078 0.1043
0.4154 12.32 15400 0.4062 0.1040
0.3932 12.4 15500 0.4032 0.1034
0.412 12.48 15600 0.4083 0.1041
0.4008 12.56 15700 0.3941 0.1025
0.4199 12.64 15800 0.4014 0.1031
0.4252 12.72 15900 0.3991 0.1014
0.436 12.8 16000 0.3938 0.1032
0.4137 12.88 16100 0.3902 0.1022
0.4201 12.96 16200 0.3927 0.1023
0.4052 13.04 16300 0.3967 0.1003
0.3852 13.12 16400 0.3942 0.0995
0.3912 13.2 16500 0.3959 0.1004
0.3658 13.28 16600 0.3970 0.1010
0.416 13.36 16700 0.4067 0.1036
0.396 13.44 16800 0.3964 0.0994
0.3972 13.52 16900 0.3937 0.1000
0.3863 13.6 17000 0.3990 0.0995
0.3688 13.68 17100 0.3901 0.0992
0.3712 13.76 17200 0.3828 0.0975
0.3711 13.84 17300 0.3853 0.0971
0.3845 13.92 17400 0.3909 0.0983
0.4011 14.0 17500 0.3859 0.0978
0.3762 14.08 17600 0.3961 0.0986
0.3734 14.16 17700 0.3930 0.0976
0.3577 14.24 17800 0.3883 0.0981
0.3723 14.32 17900 0.3864 0.0970
0.3596 14.4 18000 0.3900 0.0978
0.3642 14.48 18100 0.3848 0.0970
0.3684 14.56 18200 0.3949 0.0968
0.3553 14.64 18300 0.3968 0.0981
0.3564 14.72 18400 0.3942 0.0966
0.3709 14.8 18500 0.4025 0.0982
0.3653 14.88 18600 0.3840 0.0951
0.3679 14.96 18700 0.3777 0.0934
0.339 15.04 18800 0.3812 0.0941
0.3384 15.12 18900 0.3809 0.0935
0.3192 15.2 19000 0.3887 0.0942
0.3368 15.28 19100 0.3815 0.0939
0.3298 15.36 19200 0.3854 0.0935
0.3368 15.44 19300 0.3837 0.0936
0.3392 15.52 19400 0.3756 0.0935
0.3413 15.6 19500 0.3801 0.0934
0.3083 15.68 19600 0.3809 0.0930
0.3461 15.76 19700 0.3825 0.0927
0.3257 15.84 19800 0.3800 0.0928
0.3347 15.92 19900 0.3806 0.0919
0.3201 16.0 20000 0.3853 0.0927
0.3127 16.08 20100 0.3843 0.0917
0.2948 16.16 20200 0.3823 0.0907
0.3061 16.24 20300 0.3777 0.0900
0.3232 16.32 20400 0.3760 0.0904
0.3045 16.4 20500 0.3831 0.0908
0.3276 16.48 20600 0.3739 0.0898
0.3141 16.56 20700 0.3805 0.0907
0.3218 16.64 20800 0.3790 0.0906
0.3181 16.72 20900 0.3761 0.0904
0.3177 16.8 21000 0.3749 0.0903
0.3111 16.88 21100 0.3742 0.0894
0.3349 16.96 21200 0.3689 0.0883
0.3282 17.04 21300 0.3731 0.0889
0.2896 17.12 21400 0.3756 0.0895
0.2892 17.2 21500 0.3729 0.0886
0.3125 17.28 21600 0.3736 0.0882
0.31 17.36 21700 0.3701 0.0881
0.2853 17.44 21800 0.3742 0.0880
0.2954 17.52 21900 0.3757 0.0885
0.3003 17.6 22000 0.3720 0.0876
0.314 17.68 22100 0.3719 0.0883
0.3186 17.76 22200 0.3761 0.0890
0.31 17.84 22300 0.3698 0.0879
0.305 17.92 22400 0.3740 0.0881
0.2989 18.0 22500 0.3724 0.0878
0.2932 18.08 22600 0.3691 0.0872
0.2911 18.16 22700 0.3686 0.0873
0.2972 18.24 22800 0.3692 0.0872
0.2915 18.32 22900 0.3714 0.0874
0.2956 18.4 23000 0.3721 0.0873
0.2737 18.48 23100 0.3745 0.0875
0.3018 18.56 23200 0.3727 0.0873
0.2732 18.64 23300 0.3732 0.0871
0.2971 18.72 23400 0.3699 0.0876
0.3005 18.8 23500 0.3724 0.0871
0.2865 18.88 23600 0.3738 0.0869
0.2873 18.96 23700 0.3710 0.0865
0.2981 19.04 23800 0.3721 0.0869
0.2926 19.12 23900 0.3741 0.0869
0.2796 19.2 24000 0.3733 0.0867
0.2869 19.28 24100 0.3722 0.0860
0.2624 19.36 24200 0.3734 0.0862
0.2976 19.44 24300 0.3733 0.0862
0.2809 19.52 24400 0.3733 0.0864
0.273 19.6 24500 0.3716 0.0863
0.3105 19.68 24600 0.3702 0.0863
0.2937 19.76 24700 0.3702 0.0863
0.2849 19.84 24800 0.3697 0.0864
0.2865 19.92 24900 0.3700 0.0865
0.2872 20.0 25000 0.3699 0.0865
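
The CER column above is the character error rate on the validation set. A minimal sketch of computing CER with the Hugging Face evaluate library (the prediction and reference strings are illustrative, not from this model's data):

```python
# Illustrative CER computation; the "cer" metric requires jiwer installed.
import evaluate

cer_metric = evaluate.load("cer")
predictions = ["안녕하세요"]  # model transcription (illustrative)
references = ["안녕하세요"]   # ground-truth transcript (illustrative)
print(cer_metric.compute(predictions=predictions, references=references))
```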

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.1+cu121
  • Datasets 2.13.0
  • Tokenizers 0.15.0