xlm-roberta-large-finetuned-19March

This model is a fine-tuned version of xlm-roberta-large on an unspecified dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the metrics):

  • Best F1: 75.2859
  • Loss: 3.7291
  • Exact: 38.1052
  • F1: 56.2024
  • Total: 3821
  • HasAns Exact: 54.7305
  • HasAns F1: 80.7951
  • HasAns Total: 2653
  • NoAns Exact: 0.3425
  • NoAns F1: 0.3425
  • NoAns Total: 1168
  • Best Exact: 58.8589
  • Best Exact Thresh: 0.5893
  • Best F1 Thresh: 0.9986
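
The HasAns/NoAns splits and the Best Exact/Best F1 thresholds indicate SQuAD v2-style extractive question answering, where some questions are unanswerable. Below is a minimal usage sketch with the transformers pipeline; the repo id is an assumption (the exact Hub path is not stated in this card):

```python
# Minimal usage sketch for extractive QA with this checkpoint.
# The model id below is an assumption; substitute the actual Hub
# repo path or a local checkpoint directory.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="xlm-roberta-large-finetuned-19March",  # hypothetical repo id
)

result = qa(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
    # The NoAns metrics above suggest training with unanswerable questions,
    # so allow the pipeline to return an empty answer when nothing matches.
    handle_impossible_answer=True,
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': ...}
```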

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a sketch of the equivalent TrainingArguments follows the list:

  • learning_rate: 2e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • gradient_accumulation_steps: 16
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
  • mixed_precision_training: Native AMP
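
The effective batch size of 16 comes from a per-device batch size of 1 accumulated over 16 steps. A minimal sketch of equivalent Hugging Face TrainingArguments, assuming a standard Trainer-based setup (output_dir is a placeholder):

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir is a placeholder, not the actual training directory.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="xlm-roberta-large-finetuned-19March",
    learning_rate=2e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=16,  # effective train batch size: 1 * 16 = 16
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,  # "Native AMP" mixed precision
)
```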

Training results

| Training Loss | Epoch | Step | Best F1 | Validation Loss | Exact | F1 | Total | HasAns Exact | HasAns F1 | HasAns Total | NoAns Exact | NoAns F1 | NoAns Total | Best Exact | Best Exact Thresh | Best F1 Thresh |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 2.8178 | 0.11 | 200 | 46.8496 | 1.8065 | 25.3599 | 44.8486 | 3821 | 36.5247 | 64.5935 | 2653 | 0.0 | 0.0 | 1168 | 35.5666 | 0.8043 | 0.9449 |
| 1.7382 | 0.21 | 400 | 52.1438 | 1.5589 | 31.0913 | 49.9881 | 3821 | 44.7795 | 71.9957 | 2653 | 0.0 | 0.0 | 1168 | 39.5970 | 0.8782 | 0.9368 |
| 1.5846 | 0.32 | 600 | 55.6720 | 1.5377 | 34.7291 | 52.9053 | 3821 | 50.0188 | 76.1971 | 2653 | 0.0 | 0.0 | 1168 | 42.3973 | 0.7223 | 0.8983 |
| 1.3941 | 0.43 | 800 | 56.0026 | 1.5137 | 33.5514 | 52.0941 | 3821 | 48.3227 | 75.0289 | 2653 | 0.0 | 0.0 | 1168 | 43.4441 | 0.7816 | 0.9125 |
| 1.3771 | 0.54 | 1000 | 62.2027 | 1.3178 | 34.8076 | 53.0440 | 3821 | 50.1319 | 76.3970 | 2653 | 0.0 | 0.0 | 1168 | 48.2596 | 0.8079 | 0.8816 |
| 1.3422 | 0.64 | 1200 | 61.6569 | 1.3593 | 36.7705 | 54.3557 | 3821 | 52.9589 | 78.2862 | 2653 | 0.0 | 0.0 | 1168 | 49.3065 | 0.6991 | 0.8115 |
| 1.2506 | 0.75 | 1400 | 67.1569 | 1.1634 | 36.4826 | 54.6555 | 3821 | 52.5443 | 78.7180 | 2653 | 0.0 | 0.0 | 1168 | 53.4415 | 0.8273 | 0.9368 |
| 1.2003 | 0.86 | 1600 | 68.0239 | 1.1864 | 38.0005 | 55.5455 | 3821 | 54.4666 | 79.7359 | 2653 | 0.5993 | 0.5993 | 1168 | 53.9388 | 0.8636 | 0.9244 |
| 1.2101 | 0.97 | 1800 | 69.7667 | 1.1769 | 37.8958 | 56.0515 | 3821 | 54.5797 | 80.7285 | 2653 | 0.0 | 0.0 | 1168 | 55.3258 | 0.9193 | 0.9518 |
| 1.0566 | 1.07 | 2000 | 68.7591 | 1.2100 | 38.1314 | 55.7480 | 3821 | 54.9190 | 80.2914 | 2653 | 0.0 | 0.0 | 1168 | 54.3052 | 0.7215 | 0.8240 |
| 0.9504 | 1.18 | 2200 | 69.5176 | 1.1620 | 37.8173 | 55.3358 | 3821 | 54.4666 | 79.6977 | 2653 | 0.0 | 0.0 | 1168 | 55.0118 | 0.8300 | 0.8945 |
| 0.9177 | 1.29 | 2400 | 71.4471 | 1.1401 | 38.4978 | 56.1115 | 3821 | 55.4467 | 80.8150 | 2653 | 0.0 | 0.0 | 1168 | 57.1055 | 0.7949 | 0.8881 |
| 0.9203 | 1.4 | 2600 | 71.8718 | 1.1977 | 38.4978 | 56.1517 | 3821 | 55.4467 | 80.8729 | 2653 | 0.0 | 0.0 | 1168 | 56.8699 | 0.7610 | 0.8631 |
| 0.9513 | 1.5 | 2800 | 71.7460 | 1.1057 | 38.2361 | 55.9155 | 3821 | 54.9943 | 80.4572 | 2653 | 0.1712 | 0.1712 | 1168 | 56.9484 | 0.7965 | 0.8897 |
| 0.8996 | 1.61 | 3000 | 72.6287 | 1.1207 | 38.2884 | 55.7625 | 3821 | 55.1451 | 80.3122 | 2653 | 0.0 | 0.0 | 1168 | 57.6812 | 0.8633 | 0.9512 |
| 0.9045 | 1.72 | 3200 | 72.1882 | 1.1152 | 39.0212 | 56.3236 | 3821 | 56.2005 | 81.1205 | 2653 | 0.0 | 0.0 | 1168 | 57.7859 | 0.7800 | 0.8888 |
| 0.9005 | 1.82 | 3400 | 72.1757 | 1.1551 | 39.0474 | 56.2213 | 3821 | 56.1251 | 80.8599 | 2653 | 0.2568 | 0.2568 | 1168 | 57.1840 | 0.8174 | 0.9516 |
| 0.9102 | 1.93 | 3600 | 72.9329 | 1.1191 | 38.7071 | 56.4978 | 3821 | 55.7482 | 81.3714 | 2653 | 0.0 | 0.0 | 1168 | 57.8121 | 0.8652 | 0.9541 |
| 0.8203 | 2.04 | 3800 | 73.4690 | 1.1953 | 39.3091 | 56.5349 | 3821 | 56.6152 | 81.4247 | 2653 | 0.0 | 0.0 | 1168 | 58.2047 | 0.8819 | 0.9430 |
| 0.6482 | 2.15 | 4000 | 73.9489 | 1.1673 | 38.3407 | 56.1575 | 3821 | 55.2205 | 80.8812 | 2653 | 0.0 | 0.0 | 1168 | 57.8906 | 0.6748 | 0.9039 |
| 0.6331 | 2.25 | 4200 | 73.9252 | 1.1596 | 39.0997 | 56.3727 | 3821 | 56.3136 | 81.1912 | 2653 | 0.0 | 0.0 | 1168 | 58.5449 | 0.8977 | 0.9269 |
| 0.6239 | 2.36 | 4400 | 73.6730 | 1.1594 | 38.8903 | 56.4217 | 3821 | 55.9744 | 81.2240 | 2653 | 0.0856 | 0.0856 | 1168 | 58.0738 | 0.8784 | 0.9743 |
| 0.6572 | 2.47 | 4600 | 72.7751 | 1.1498 | 39.1259 | 55.9415 | 3821 | 56.2759 | 80.4948 | 2653 | 0.1712 | 0.1712 | 1168 | 58.4664 | 0.7339 | 0.8944 |
| 0.6652 | 2.58 | 4800 | 73.7635 | 1.1811 | 39.2306 | 56.3404 | 3821 | 56.4267 | 81.0692 | 2653 | 0.1712 | 0.1712 | 1168 | 58.2832 | 0.8527 | 0.8606 |
| 0.6604 | 2.68 | 5000 | 73.2122 | 1.1319 | 39.5446 | 56.4206 | 3821 | 56.9544 | 81.2601 | 2653 | 0.0 | 0.0 | 1168 | 58.7281 | 0.7900 | 0.9177 |
| 0.6514 | 2.79 | 5200 | 74.2678 | 1.2162 | 39.1521 | 56.5326 | 3821 | 56.3890 | 81.4215 | 2653 | 0.0 | 0.0 | 1168 | 59.2253 | 0.8502 | 0.9812 |
| 0.6718 | 2.9 | 5400 | 74.6439 | 1.1330 | 39.4138 | 56.6473 | 3821 | 56.7659 | 81.5867 | 2653 | 0.0 | 0.0 | 1168 | 59.5394 | 0.8374 | 0.9469 |
| 0.643 | 3.0 | 5600 | 73.0242 | 1.2631 | 37.8435 | 55.6916 | 3821 | 54.3159 | 80.0217 | 2653 | 0.4281 | 0.4281 | 1168 | 57.5766 | 0.7596 | 0.8457 |
| 0.4361 | 3.11 | 5800 | 74.1499 | 1.3032 | 39.4661 | 56.5452 | 3821 | 56.7282 | 81.3266 | 2653 | 0.2568 | 0.2568 | 1168 | 59.0683 | 0.7984 | 0.8484 |
| 0.4238 | 3.22 | 6000 | 74.5952 | 1.3679 | 38.9950 | 56.3652 | 3821 | 56.1628 | 81.1804 | 2653 | 0.0 | 0.0 | 1168 | 59.3824 | 0.7710 | 0.9094 |
| 0.4468 | 3.33 | 6200 | 74.4299 | 1.3699 | 38.3931 | 56.3625 | 3821 | 55.2959 | 81.1764 | 2653 | 0.0 | 0.0 | 1168 | 58.2570 | 0.7611 | 0.8728 |
| 0.4625 | 3.43 | 6400 | 74.7995 | 1.3095 | 38.6810 | 56.6461 | 3821 | 55.7105 | 81.5849 | 2653 | 0.0 | 0.0 | 1168 | 59.2253 | 0.7687 | 0.8944 |
| 0.4634 | 3.54 | 6600 | 74.5887 | 1.4208 | 39.7802 | 57.0180 | 3821 | 56.6152 | 81.4421 | 2653 | 1.5411 | 1.5411 | 1168 | 59.2515 | 0.7964 | 0.8398 |
| 0.47 | 3.65 | 6800 | 74.3833 | 1.3648 | 39.1521 | 56.3557 | 3821 | 56.3136 | 81.0912 | 2653 | 0.1712 | 0.1712 | 1168 | 59.1730 | 0.8667 | 0.9015 |
| 0.4598 | 3.76 | 7000 | 74.4817 | 1.3067 | 39.1782 | 56.2569 | 3821 | 56.4267 | 81.0244 | 2653 | 0.0 | 0.0 | 1168 | 59.5656 | 0.7476 | 0.9250 |
| 0.4608 | 3.86 | 7200 | 74.4170 | 1.3304 | 38.7857 | 56.1282 | 3821 | 55.8613 | 80.8390 | 2653 | 0.0 | 0.0 | 1168 | 58.8328 | 0.7717 | 0.8846 |
| 0.4743 | 3.97 | 7400 | 74.4807 | 1.3145 | 39.8063 | 56.8286 | 3821 | 56.9167 | 81.4332 | 2653 | 0.9418 | 0.9418 | 1168 | 59.5394 | 0.7264 | 0.8104 |
| 0.3466 | 4.08 | 7600 | 74.2807 | 1.5695 | 38.0529 | 55.9634 | 3821 | 54.7305 | 80.5262 | 2653 | 0.1712 | 0.1712 | 1168 | 58.6234 | 0.6575 | 0.8711 |
| 0.3209 | 4.19 | 7800 | 74.4014 | 1.6007 | 39.2829 | 56.8477 | 3821 | 56.0121 | 81.3099 | 2653 | 1.2842 | 1.2842 | 1168 | 59.1468 | 0.7379 | 0.8345 |
| 0.2965 | 4.29 | 8000 | 75.1669 | 1.6125 | 39.7016 | 56.9376 | 3821 | 56.3890 | 81.2131 | 2653 | 1.7979 | 1.7979 | 1168 | 59.8273 | 0.8465 | 0.8573 |
| 0.323 | 4.4 | 8200 | 75.2468 | 1.5257 | 39.5185 | 56.3139 | 3821 | 56.8790 | 81.0688 | 2653 | 0.0856 | 0.0856 | 1168 | 60.5601 | 0.8994 | 0.9968 |
| 0.3188 | 4.51 | 8400 | 74.5531 | 1.5630 | 38.4193 | 56.2742 | 3821 | 54.9943 | 80.7100 | 2653 | 0.7705 | 0.7705 | 1168 | 58.9898 | 0.6921 | 0.9107 |
| 0.3316 | 4.61 | 8600 | 73.7564 | 1.6488 | 38.7071 | 56.8133 | 3821 | 54.6928 | 80.7703 | 2653 | 2.3973 | 2.3973 | 1168 | 57.6027 | 0.6885 | 0.8446 |
| 0.3335 | 4.72 | 8800 | 75.0539 | 1.5713 | 39.8063 | 57.6728 | 3821 | 55.8236 | 81.5560 | 2653 | 3.4247 | 3.4247 | 1168 | 59.0421 | 0.6784 | 0.9024 |
| 0.3062 | 4.83 | 9000 | 73.9140 | 1.6366 | 38.4193 | 56.5598 | 3821 | 54.4289 | 80.5560 | 2653 | 2.0548 | 2.0548 | 1168 | 58.0738 | 0.6447 | 0.9738 |
| 0.3317 | 4.94 | 9200 | 75.1317 | 1.5375 | 40.6438 | 57.9963 | 3821 | 56.3890 | 81.3811 | 2653 | 4.8801 | 4.8801 | 1168 | 59.1992 | 0.8043 | 0.8979 |
| 0.2665 | 5.04 | 9400 | 74.5945 | 1.7715 | 42.1879 | 59.9249 | 3821 | 55.7105 | 81.2563 | 2653 | 11.4726 | 11.4726 | 1168 | 58.6496 | 0.7039 | 0.8899 |
| 0.2044 | 5.15 | 9600 | 74.6704 | 2.0130 | 39.3876 | 57.1735 | 3821 | 55.4844 | 81.1006 | 2653 | 2.8253 | 2.8253 | 1168 | 58.7804 | 0.5561 | 0.9753 |
| 0.2035 | 5.26 | 9800 | 73.9333 | 1.9572 | 40.1727 | 58.0513 | 3821 | 54.6551 | 80.4048 | 2653 | 7.2774 | 7.2774 | 1168 | 58.1000 | 0.6755 | 0.7745 |
| 0.237 | 5.37 | 10000 | 74.7114 | 1.9111 | 40.0157 | 58.0402 | 3821 | 54.3913 | 80.3512 | 2653 | 7.3630 | 7.3630 | 1168 | 58.6757 | 0.6207 | 0.9428 |
| 0.2194 | 5.47 | 10200 | 74.5000 | 1.9111 | 38.8380 | 56.3116 | 3821 | 55.3336 | 80.5001 | 2653 | 1.3699 | 1.3699 | 1168 | 58.7543 | 0.6829 | 0.9918 |
| 0.243 | 5.58 | 10400 | 74.6447 | 1.7084 | 38.1576 | 56.2303 | 3821 | 54.8059 | 80.8353 | 2653 | 0.3425 | 0.3425 | 1168 | 58.2832 | 0.5820 | 0.7634 |
| 0.2261 | 5.69 | 10600 | 75.2228 | 1.6893 | 44.4125 | 62.3606 | 3821 | 53.8259 | 79.6757 | 2653 | 23.0308 | 23.0308 | 1168 | 58.8589 | 0.7203 | 0.9326 |
| 0.2411 | 5.8 | 10800 | 75.1561 | 1.7086 | 39.2567 | 56.7270 | 3821 | 55.7482 | 80.9099 | 2653 | 1.7979 | 1.7979 | 1168 | 59.3300 | 0.7076 | 0.9906 |
| 0.2266 | 5.9 | 11000 | 74.8371 | 1.8812 | 41.1672 | 58.9705 | 3821 | 55.0697 | 80.7110 | 2653 | 9.5890 | 9.5890 | 1168 | 58.8851 | 0.7277 | 0.9859 |
| 0.2262 | 6.01 | 11200 | 74.9561 | 1.9699 | 40.0157 | 58.2772 | 3821 | 54.5420 | 80.8432 | 2653 | 7.0205 | 7.0205 | 1168 | 58.6234 | 0.6622 | 0.9921 |
| 0.1435 | 6.12 | 11400 | 75.2732 | 2.3215 | 41.4813 | 59.1006 | 3821 | 55.5974 | 80.9738 | 2653 | 9.4178 | 9.4178 | 1168 | 59.3300 | 0.6085 | 0.9580 |
| 0.1562 | 6.22 | 11600 | 74.8525 | 2.2761 | 37.7126 | 56.3116 | 3821 | 53.4112 | 80.1984 | 2653 | 2.0548 | 2.0548 | 1168 | 57.8906 | 0.9478 | 0.9993 |
| 0.1602 | 6.33 | 11800 | 75.1296 | 2.2181 | 41.5860 | 59.5824 | 3821 | 54.5797 | 80.4992 | 2653 | 12.0719 | 12.0719 | 1168 | 58.9113 | 0.9592 | 0.9972 |
| 0.1617 | 6.44 | 12000 | 74.7754 | 2.1303 | 37.6865 | 56.0801 | 3821 | 54.2405 | 80.7320 | 2653 | 0.0856 | 0.0856 | 1168 | 58.0738 | 0.6140 | 0.9971 |
| 0.1732 | 6.55 | 12200 | 75.7393 | 2.0434 | 38.6025 | 56.5949 | 3821 | 55.3336 | 81.2473 | 2653 | 0.5993 | 0.5993 | 1168 | 59.5917 | 0.6486 | 0.9946 |
| 0.1268 | 6.65 | 12400 | 74.6427 | 2.2969 | 37.4509 | 55.7997 | 3821 | 53.7505 | 80.1774 | 2653 | 0.4281 | 0.4281 | 1168 | 57.9429 | 0.5942 | 0.8802 |
| 0.1588 | 6.76 | 12600 | 74.9582 | 2.1332 | 38.1052 | 56.7031 | 3821 | 53.9389 | 80.7246 | 2653 | 2.1404 | 2.1404 | 1168 | 58.2570 | 0.5290 | 0.9927 |
| 0.1623 | 6.87 | 12800 | 75.0142 | 2.0222 | 39.3876 | 56.9883 | 3821 | 55.3336 | 80.6831 | 2653 | 3.1678 | 3.1678 | 1168 | 58.9113 | 0.8747 | 0.8747 |
| 0.148 | 6.98 | 13000 | 75.1339 | 2.0930 | 38.2099 | 56.1811 | 3821 | 54.7305 | 80.6137 | 2653 | 0.6849 | 0.6849 | 1168 | 58.6234 | 0.6673 | 0.9933 |
| 0.1309 | 7.08 | 13200 | 75.4867 | 2.4402 | 42.1094 | 59.9857 | 3821 | 54.6174 | 80.3638 | 2653 | 13.6986 | 13.6986 | 1168 | 59.1730 | 0.6728 | 0.9612 |
| 0.1173 | 7.19 | 13400 | 74.7539 | 2.7111 | 42.2141 | 59.6892 | 3821 | 55.4844 | 80.6531 | 2653 | 12.0719 | 12.0719 | 1168 | 58.5449 | 0.5282 | 0.9707 |
| 0.108 | 7.3 | 13600 | 75.4562 | 2.4802 | 41.4551 | 59.4454 | 3821 | 54.5420 | 80.4526 | 2653 | 11.7295 | 11.7295 | 1168 | 58.5972 | 0.6205 | 0.9876 |
| 0.0985 | 7.4 | 13800 | 75.5736 | 2.8397 | 41.2196 | 59.1842 | 3821 | 54.7682 | 80.6419 | 2653 | 10.4452 | 10.4452 | 1168 | 59.0683 | 0.8408 | 0.9942 |
| 0.1144 | 7.51 | 14000 | 74.9702 | 2.5953 | 38.8380 | 57.0815 | 3821 | 53.9766 | 80.2519 | 2653 | 4.4521 | 4.4521 | 1168 | 58.4140 | 0.5533 | 0.7640 |
| 0.1067 | 7.62 | 14200 | 75.4923 | 2.7441 | 38.6810 | 56.2112 | 3821 | 55.1451 | 80.3931 | 2653 | 1.2842 | 1.2842 | 1168 | 59.5394 | 0.8269 | 1.0000 |
| 0.1127 | 7.73 | 14400 | 74.7363 | 2.8387 | 37.8958 | 55.8558 | 3821 | 54.3913 | 80.2583 | 2653 | 0.4281 | 0.4281 | 1168 | 58.5449 | 0.4981 | 0.9928 |
| 0.1111 | 7.83 | 14600 | 75.0496 | 2.8232 | 38.8380 | 56.3759 | 3821 | 55.7859 | 81.0449 | 2653 | 0.3425 | 0.3425 | 1168 | 58.8589 | 0.6597 | 0.9983 |
| 0.104 | 7.94 | 14800 | 75.2988 | 2.7491 | 38.8903 | 56.3024 | 3821 | 55.8236 | 80.9014 | 2653 | 0.4281 | 0.4281 | 1168 | 59.4085 | 0.9766 | 0.9954 |
| 0.0988 | 8.05 | 15000 | 75.0794 | 2.9967 | 38.8642 | 56.1519 | 3821 | 55.7482 | 80.6470 | 2653 | 0.5137 | 0.5137 | 1168 | 59.1468 | 0.6109 | 0.9883 |
| 0.0627 | 8.16 | 15200 | 74.9803 | 3.1843 | 38.5501 | 56.4955 | 3821 | 54.7682 | 80.6142 | 2653 | 1.7123 | 1.7123 | 1168 | 58.8851 | 0.5983 | 0.9990 |
| 0.0511 | 8.26 | 15400 | 75.0023 | 3.3279 | 38.4716 | 56.3207 | 3821 | 54.7682 | 80.4754 | 2653 | 1.4555 | 1.4555 | 1168 | 58.6496 | 0.6087 | 0.9914 |
| 0.081 | 8.37 | 15600 | 75.0066 | 3.3160 | 37.9482 | 56.0321 | 3821 | 54.6174 | 80.6629 | 2653 | 0.0856 | 0.0856 | 1168 | 58.8589 | 0.6251 | 0.6604 |
| 0.0909 | 8.48 | 15800 | 74.9020 | 3.2023 | 37.7650 | 56.0174 | 3821 | 54.2405 | 80.5286 | 2653 | 0.3425 | 0.3425 | 1168 | 58.2308 | 0.6750 | 0.9895 |
| 0.0724 | 8.59 | 16000 | 75.1556 | 3.2594 | 39.2829 | 57.3387 | 3821 | 54.6928 | 80.6978 | 2653 | 4.2808 | 4.2808 | 1168 | 58.7281 | 0.5745 | 1.0000 |
| 0.0793 | 8.69 | 16200 | 75.2078 | 3.2888 | 38.2622 | 56.1814 | 3821 | 54.8059 | 80.6141 | 2653 | 0.6849 | 0.6849 | 1168 | 59.0160 | 0.8687 | 1.0000 |
| 0.0627 | 8.8 | 16400 | 75.3907 | 3.4785 | 39.0212 | 56.9735 | 3821 | 54.7682 | 80.6240 | 2653 | 3.2534 | 3.2534 | 1168 | 58.8589 | 0.6609 | 0.9997 |
| 0.0934 | 8.91 | 16600 | 75.4373 | 3.3474 | 38.4454 | 56.2844 | 3821 | 55.1451 | 80.8378 | 2653 | 0.5137 | 0.5137 | 1168 | 58.8589 | 0.9383 | 0.9991 |
| 0.0583 | 9.01 | 16800 | 75.2529 | 3.4352 | 38.4454 | 55.9520 | 3821 | 55.3336 | 80.5475 | 2653 | 0.0856 | 0.0856 | 1168 | 59.1992 | 0.7693 | 0.9870 |
| 0.0427 | 9.12 | 17000 | 75.3640 | 3.4907 | 38.4716 | 56.5872 | 3821 | 54.5797 | 80.6709 | 2653 | 1.8836 | 1.8836 | 1168 | 59.0160 | 0.8924 | 1.0000 |
| 0.046 | 9.23 | 17200 | 75.1963 | 3.5282 | 38.4454 | 56.5199 | 3821 | 54.6174 | 80.6493 | 2653 | 1.7123 | 1.7123 | 1168 | 58.7804 | 0.6665 | 1.0000 |
| 0.042 | 9.34 | 17400 | 75.2151 | 3.6017 | 37.8697 | 56.0853 | 3821 | 54.3159 | 80.5511 | 2653 | 0.5137 | 0.5137 | 1168 | 58.4402 | 0.8206 | 0.9998 |
| 0.0466 | 9.44 | 17600 | 75.4089 | 3.5608 | 38.1052 | 56.1973 | 3821 | 54.8059 | 80.8631 | 2653 | 0.1712 | 0.1712 | 1168 | 58.8851 | 0.9627 | 0.9795 |
| 0.0502 | 9.55 | 17800 | 75.3440 | 3.6178 | 38.2884 | 56.2233 | 3821 | 55.1074 | 80.9382 | 2653 | 0.0856 | 0.0856 | 1168 | 58.9636 | 0.7981 | 0.9991 |
| 0.0505 | 9.66 | 18000 | 75.2088 | 3.7243 | 37.9482 | 56.0745 | 3821 | 54.6551 | 80.7616 | 2653 | 0.0 | 0.0 | 1168 | 58.5449 | 0.5150 | 0.9954 |
| 0.0426 | 9.77 | 18200 | 75.2649 | 3.7307 | 37.9220 | 56.0874 | 3821 | 54.5797 | 80.7425 | 2653 | 0.0856 | 0.0856 | 1168 | 58.7543 | 0.4981 | 0.9938 |
| 0.0536 | 9.87 | 18400 | 75.2783 | 3.7090 | 37.9220 | 56.1133 | 3821 | 54.5797 | 80.7799 | 2653 | 0.0856 | 0.0856 | 1168 | 58.8851 | 0.7739 | 0.9990 |
| 0.0364 | 9.98 | 18600 | 75.2859 | 3.7291 | 38.1052 | 56.2024 | 3821 | 54.7305 | 80.7951 | 2653 | 0.3425 | 0.3425 | 1168 | 58.8589 | 0.5893 | 0.9986 |
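
The column names mirror the output keys of the SQuAD v2 metric, which scores has-answer and no-answer examples separately and searches for the no-answer-probability threshold that maximizes exact match and F1 (the "Best Exact Thresh" and "Best F1 Thresh" columns). A minimal sketch of computing such numbers with the evaluate library, using toy predictions rather than the actual evaluation set:

```python
# Sketch: how the per-split and best-threshold metrics in the table above
# are computed by the SQuAD v2 metric (toy data, not the real evaluation set).
import evaluate

squad_v2 = evaluate.load("squad_v2")

predictions = [
    {"id": "q1", "prediction_text": "Paris", "no_answer_probability": 0.1},
    {"id": "q2", "prediction_text": "", "no_answer_probability": 0.9},
]
references = [
    {"id": "q1", "answers": {"text": ["Paris"], "answer_start": [40]}},
    {"id": "q2", "answers": {"text": [], "answer_start": []}},  # unanswerable
]

results = squad_v2.compute(predictions=predictions, references=references)
# results includes: exact, f1, total, HasAns_exact, HasAns_f1, HasAns_total,
# NoAns_exact, NoAns_f1, NoAns_total, best_exact, best_exact_thresh,
# best_f1, best_f1_thresh
print(results)
```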

Framework versions

  • Transformers 4.37.2
  • Pytorch 2.0.1+cu117
  • Datasets 2.17.0
  • Tokenizers 0.15.2