label-transfer

This model is a fine-tuned version of saattrupdan/verdict-classifier on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0452
  • F1 Macro: 0.9872
  • F1 Misinformation: 0.9918
  • F1 Factual: 0.9979
  • F1 Other: 0.9720
  • Prec Macro: 0.9842
  • Prec Misinformation: 0.9958
  • Prec Factual: 0.9979
  • Prec Other: 0.9588

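For reference, the macro scores are the unweighted means of the three per-class values: F1 Macro ≈ (0.9918 + 0.9979 + 0.9720) / 3 ≈ 0.9872, and Prec Macro ≈ (0.9958 + 0.9979 + 0.9588) / 3 ≈ 0.9842.
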
Model description

More information needed

Intended uses & limitations

More information needed

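In the absence of documented usage, here is a minimal inference sketch. The checkpoint path is a placeholder, and the label set is assumed to be inherited from the base model saattrupdan/verdict-classifier (misinformation / factual / other):

```python
# Minimal inference sketch; "path/to/label-transfer" is a placeholder for the
# local checkpoint directory or Hub id, and the label names are assumed to be
# inherited from the base verdict classifier.
from transformers import pipeline

classifier = pipeline("text-classification", model="path/to/label-transfer")

result = classifier("Drinking bleach cures the flu.")
print(result)  # e.g. [{'label': 'misinformation', 'score': 0.99}] (illustrative)
```
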
Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 32
  • total_train_batch_size: 2048 (train_batch_size 64 × 32 gradient accumulation steps)
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1423
  • num_epochs: 1000

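As a sketch, these settings map onto transformers.TrainingArguments roughly as follows; output_dir is a placeholder and the actual training script is not documented here:

```python
# Illustrative mapping of the listed hyperparameters onto TrainingArguments;
# output_dir is a placeholder, and any early-stopping setup is an assumption
# (the training log below stops near epoch 85 despite num_epochs=1000).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="label-transfer",      # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=32,   # 64 * 32 = 2048 effective batch size
    lr_scheduler_type="linear",
    warmup_steps=1423,
    num_train_epochs=1000,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```
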
Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Misinformation | F1 Factual | F1 Other | Prec Macro | Prec Misinformation | Prec Factual | Prec Other |
|---------------|-------|------|-----------------|----------|-------------------|------------|----------|------------|---------------------|--------------|------------|
| 0.4236 | 0.9 | 5 | 0.4070 | 0.8866 | 0.9477 | 0.9658 | 0.7463 | 0.9306 | 0.9075 | 0.9766 | 0.9077 |
| 0.4175 | 1.9 | 10 | 0.4001 | 0.8872 | 0.9480 | 0.9658 | 0.7477 | 0.9308 | 0.9079 | 0.9766 | 0.9080 |
| 0.4115 | 2.9 | 15 | 0.3884 | 0.8896 | 0.9487 | 0.9668 | 0.7534 | 0.9323 | 0.9093 | 0.9787 | 0.9090 |
| 0.3932 | 3.9 | 20 | 0.3719 | 0.8943 | 0.9509 | 0.9668 | 0.7652 | 0.9343 | 0.9133 | 0.9787 | 0.9110 |
| 0.3785 | 4.9 | 25 | 0.3505 | 0.8973 | 0.9522 | 0.9668 | 0.7730 | 0.9353 | 0.9160 | 0.9787 | 0.9112 |
| 0.3653 | 5.9 | 30 | 0.3266 | 0.9009 | 0.9535 | 0.9683 | 0.7809 | 0.9369 | 0.9186 | 0.9818 | 0.9104 |
| 0.3337 | 6.9 | 35 | 0.3028 | 0.9143 | 0.9599 | 0.9694 | 0.8137 | 0.9425 | 0.9310 | 0.9818 | 0.9148 |
| 0.3181 | 7.9 | 40 | 0.2796 | 0.9181 | 0.9624 | 0.9673 | 0.8245 | 0.9431 | 0.9361 | 0.9807 | 0.9125 |
| 0.2976 | 8.9 | 45 | 0.2570 | 0.9199 | 0.9633 | 0.9673 | 0.8291 | 0.9434 | 0.9383 | 0.9807 | 0.9113 |
| 0.2845 | 9.9 | 50 | 0.2349 | 0.9242 | 0.9658 | 0.9668 | 0.8401 | 0.9453 | 0.9430 | 0.9797 | 0.9131 |
| 0.2649 | 10.9 | 55 | 0.2134 | 0.9270 | 0.9673 | 0.9668 | 0.8470 | 0.9451 | 0.9472 | 0.9797 | 0.9086 |
| 0.2399 | 11.9 | 60 | 0.1929 | 0.9330 | 0.9704 | 0.9668 | 0.8619 | 0.9467 | 0.9547 | 0.9797 | 0.9057 |
| 0.224 | 12.9 | 65 | 0.1735 | 0.9369 | 0.9724 | 0.9673 | 0.8710 | 0.9467 | 0.9608 | 0.9797 | 0.8996 |
| 0.1992 | 13.9 | 70 | 0.1564 | 0.9496 | 0.9783 | 0.9711 | 0.8995 | 0.9531 | 0.9744 | 0.9809 | 0.9039 |
| 0.1908 | 14.9 | 75 | 0.1427 | 0.9501 | 0.9784 | 0.9711 | 0.9006 | 0.9519 | 0.9765 | 0.9799 | 0.8993 |
| 0.1785 | 15.9 | 80 | 0.1309 | 0.9542 | 0.9790 | 0.9765 | 0.9072 | 0.9549 | 0.9782 | 0.9791 | 0.9076 |
| 0.1637 | 16.9 | 85 | 0.1215 | 0.9531 | 0.9791 | 0.9745 | 0.9056 | 0.9536 | 0.9784 | 0.9750 | 0.9073 |
| 0.151 | 17.9 | 90 | 0.1131 | 0.9540 | 0.9787 | 0.9771 | 0.9064 | 0.9549 | 0.9776 | 0.9771 | 0.9099 |
| 0.1395 | 18.9 | 95 | 0.1049 | 0.9555 | 0.9790 | 0.9787 | 0.9088 | 0.9558 | 0.9784 | 0.9772 | 0.9119 |
| 0.1285 | 19.9 | 100 | 0.0963 | 0.9600 | 0.9799 | 0.9833 | 0.9169 | 0.9602 | 0.9798 | 0.9843 | 0.9164 |
| 0.1228 | 20.9 | 105 | 0.0887 | 0.9654 | 0.9829 | 0.9844 | 0.9289 | 0.9639 | 0.9850 | 0.9854 | 0.9215 |
| 0.1163 | 21.9 | 110 | 0.0832 | 0.9672 | 0.9839 | 0.9849 | 0.9329 | 0.9655 | 0.9864 | 0.9864 | 0.9237 |
| 0.1045 | 22.9 | 115 | 0.0792 | 0.9690 | 0.9849 | 0.9849 | 0.9374 | 0.9666 | 0.9883 | 0.9864 | 0.9251 |
| 0.0975 | 23.9 | 120 | 0.0758 | 0.9701 | 0.9854 | 0.9854 | 0.9396 | 0.9682 | 0.9880 | 0.9864 | 0.9303 |
| 0.0957 | 24.9 | 125 | 0.0731 | 0.9710 | 0.9856 | 0.9864 | 0.9411 | 0.9691 | 0.9883 | 0.9885 | 0.9305 |
| 0.0911 | 25.9 | 130 | 0.0702 | 0.9743 | 0.9862 | 0.9901 | 0.9467 | 0.9722 | 0.9891 | 0.9896 | 0.9377 |
| 0.0884 | 26.9 | 135 | 0.0676 | 0.9759 | 0.9875 | 0.9901 | 0.9502 | 0.9728 | 0.9916 | 0.9886 | 0.9381 |
| 0.087 | 27.9 | 140 | 0.0652 | 0.9770 | 0.9878 | 0.9912 | 0.9521 | 0.9739 | 0.9919 | 0.9906 | 0.9392 |
| 0.0813 | 28.9 | 145 | 0.0631 | 0.9791 | 0.9880 | 0.9938 | 0.9555 | 0.9758 | 0.9925 | 0.9938 | 0.9412 |
| 0.0758 | 29.9 | 150 | 0.0612 | 0.9805 | 0.9887 | 0.9943 | 0.9584 | 0.9767 | 0.9938 | 0.9938 | 0.9424 |
| 0.0734 | 30.9 | 155 | 0.0598 | 0.9796 | 0.9882 | 0.9943 | 0.9564 | 0.9762 | 0.9927 | 0.9938 | 0.9422 |
| 0.0713 | 31.9 | 160 | 0.0586 | 0.9798 | 0.9883 | 0.9943 | 0.9569 | 0.9765 | 0.9927 | 0.9938 | 0.9430 |
| 0.0662 | 32.9 | 165 | 0.0568 | 0.9805 | 0.9887 | 0.9943 | 0.9584 | 0.9768 | 0.9936 | 0.9938 | 0.9432 |
| 0.063 | 33.9 | 170 | 0.0552 | 0.9813 | 0.9893 | 0.9943 | 0.9602 | 0.9778 | 0.9938 | 0.9938 | 0.9459 |
| 0.0623 | 34.9 | 175 | 0.0538 | 0.9819 | 0.9897 | 0.9943 | 0.9616 | 0.9785 | 0.9941 | 0.9938 | 0.9477 |
| 0.0601 | 35.9 | 180 | 0.0531 | 0.9828 | 0.9901 | 0.9948 | 0.9635 | 0.9793 | 0.9947 | 0.9938 | 0.9496 |
| 0.0549 | 36.9 | 185 | 0.0521 | 0.9826 | 0.9900 | 0.9948 | 0.9631 | 0.9790 | 0.9947 | 0.9938 | 0.9487 |
| 0.0539 | 37.9 | 190 | 0.0512 | 0.9824 | 0.9898 | 0.9948 | 0.9626 | 0.9789 | 0.9944 | 0.9938 | 0.9486 |
| 0.0525 | 38.9 | 195 | 0.0503 | 0.9827 | 0.9898 | 0.9953 | 0.9630 | 0.9792 | 0.9944 | 0.9938 | 0.9495 |
| 0.0494 | 39.9 | 200 | 0.0498 | 0.9831 | 0.9898 | 0.9958 | 0.9635 | 0.9796 | 0.9944 | 0.9948 | 0.9496 |
| 0.0502 | 40.9 | 205 | 0.0489 | 0.9838 | 0.9901 | 0.9964 | 0.9650 | 0.9804 | 0.9947 | 0.9958 | 0.9506 |
| 0.0499 | 41.9 | 210 | 0.0483 | 0.9845 | 0.9904 | 0.9969 | 0.9663 | 0.9813 | 0.9947 | 0.9958 | 0.9532 |
| 0.0484 | 42.9 | 215 | 0.0480 | 0.9847 | 0.9905 | 0.9969 | 0.9668 | 0.9814 | 0.9950 | 0.9958 | 0.9533 |
| 0.0465 | 43.9 | 220 | 0.0477 | 0.9852 | 0.9908 | 0.9969 | 0.9678 | 0.9816 | 0.9955 | 0.9958 | 0.9534 |
| 0.0453 | 44.9 | 225 | 0.0474 | 0.9856 | 0.9911 | 0.9969 | 0.9687 | 0.9822 | 0.9955 | 0.9958 | 0.9551 |
| 0.0452 | 45.9 | 230 | 0.0471 | 0.9856 | 0.9911 | 0.9969 | 0.9687 | 0.9822 | 0.9955 | 0.9958 | 0.9551 |
| 0.0453 | 46.9 | 235 | 0.0469 | 0.9854 | 0.9910 | 0.9969 | 0.9682 | 0.9821 | 0.9953 | 0.9958 | 0.9551 |
| 0.043 | 47.9 | 240 | 0.0468 | 0.9858 | 0.9912 | 0.9969 | 0.9692 | 0.9825 | 0.9955 | 0.9958 | 0.9560 |
| 0.0428 | 48.9 | 245 | 0.0465 | 0.9856 | 0.9911 | 0.9969 | 0.9687 | 0.9824 | 0.9953 | 0.9958 | 0.9560 |
| 0.0414 | 49.9 | 250 | 0.0465 | 0.9852 | 0.9911 | 0.9964 | 0.9682 | 0.9820 | 0.9953 | 0.9948 | 0.9560 |
| 0.0388 | 50.9 | 255 | 0.0462 | 0.9852 | 0.9911 | 0.9964 | 0.9682 | 0.9820 | 0.9953 | 0.9948 | 0.9560 |
| 0.0404 | 51.9 | 260 | 0.0458 | 0.9852 | 0.9911 | 0.9964 | 0.9682 | 0.9820 | 0.9953 | 0.9948 | 0.9560 |
| 0.0382 | 52.9 | 265 | 0.0454 | 0.9856 | 0.9911 | 0.9969 | 0.9687 | 0.9824 | 0.9953 | 0.9958 | 0.9560 |
| 0.042 | 53.9 | 270 | 0.0443 | 0.9862 | 0.9911 | 0.9979 | 0.9697 | 0.9831 | 0.9953 | 0.9979 | 0.9561 |
| 0.0369 | 54.9 | 275 | 0.0438 | 0.9862 | 0.9911 | 0.9979 | 0.9697 | 0.9831 | 0.9953 | 0.9979 | 0.9561 |
| 0.0383 | 55.9 | 280 | 0.0437 | 0.9862 | 0.9911 | 0.9979 | 0.9697 | 0.9831 | 0.9953 | 0.9979 | 0.9561 |
| 0.0373 | 56.9 | 285 | 0.0438 | 0.9862 | 0.9911 | 0.9979 | 0.9696 | 0.9833 | 0.9950 | 0.9979 | 0.9569 |
| 0.0402 | 57.9 | 290 | 0.0440 | 0.9862 | 0.9911 | 0.9979 | 0.9696 | 0.9833 | 0.9950 | 0.9979 | 0.9569 |
| 0.0389 | 58.9 | 295 | 0.0443 | 0.9858 | 0.9908 | 0.9979 | 0.9687 | 0.9831 | 0.9944 | 0.9979 | 0.9568 |
| 0.0361 | 59.9 | 300 | 0.0443 | 0.9860 | 0.9910 | 0.9979 | 0.9692 | 0.9832 | 0.9947 | 0.9979 | 0.9569 |
| 0.0369 | 60.9 | 305 | 0.0442 | 0.9860 | 0.9910 | 0.9979 | 0.9692 | 0.9832 | 0.9947 | 0.9979 | 0.9569 |
| 0.0353 | 61.9 | 310 | 0.0442 | 0.9862 | 0.9911 | 0.9979 | 0.9696 | 0.9833 | 0.9950 | 0.9979 | 0.9569 |
| 0.035 | 62.9 | 315 | 0.0446 | 0.9860 | 0.9910 | 0.9979 | 0.9692 | 0.9832 | 0.9947 | 0.9979 | 0.9569 |
| 0.0352 | 63.9 | 320 | 0.0449 | 0.9864 | 0.9912 | 0.9979 | 0.9701 | 0.9834 | 0.9953 | 0.9979 | 0.9570 |
| 0.0336 | 64.9 | 325 | 0.0451 | 0.9860 | 0.9910 | 0.9979 | 0.9692 | 0.9832 | 0.9947 | 0.9979 | 0.9569 |
| 0.0317 | 65.9 | 330 | 0.0448 | 0.9860 | 0.9910 | 0.9979 | 0.9692 | 0.9832 | 0.9947 | 0.9979 | 0.9569 |
| 0.0334 | 66.9 | 335 | 0.0447 | 0.9866 | 0.9914 | 0.9979 | 0.9705 | 0.9843 | 0.9944 | 0.9979 | 0.9605 |
| 0.0316 | 67.9 | 340 | 0.0447 | 0.9860 | 0.9910 | 0.9979 | 0.9691 | 0.9834 | 0.9944 | 0.9979 | 0.9577 |
| 0.0329 | 68.9 | 345 | 0.0451 | 0.9866 | 0.9914 | 0.9979 | 0.9706 | 0.9835 | 0.9955 | 0.9979 | 0.9570 |
| 0.0326 | 69.9 | 350 | 0.0454 | 0.9866 | 0.9914 | 0.9979 | 0.9706 | 0.9835 | 0.9955 | 0.9979 | 0.9570 |
| 0.032 | 70.9 | 355 | 0.0453 | 0.9868 | 0.9915 | 0.9979 | 0.9711 | 0.9838 | 0.9955 | 0.9979 | 0.9579 |
| 0.0325 | 71.9 | 360 | 0.0450 | 0.9864 | 0.9912 | 0.9979 | 0.9701 | 0.9836 | 0.9950 | 0.9979 | 0.9578 |
| 0.0319 | 72.9 | 365 | 0.0446 | 0.9868 | 0.9915 | 0.9979 | 0.9711 | 0.9838 | 0.9955 | 0.9979 | 0.9579 |
| 0.0326 | 73.9 | 370 | 0.0444 | 0.9868 | 0.9915 | 0.9979 | 0.9711 | 0.9838 | 0.9955 | 0.9979 | 0.9579 |
| 0.0315 | 74.9 | 375 | 0.0442 | 0.9873 | 0.9918 | 0.9979 | 0.9721 | 0.9840 | 0.9961 | 0.9979 | 0.9580 |
| 0.0304 | 75.9 | 380 | 0.0442 | 0.9866 | 0.9914 | 0.9979 | 0.9706 | 0.9837 | 0.9953 | 0.9979 | 0.9579 |
| 0.03 | 76.9 | 385 | 0.0444 | 0.9864 | 0.9912 | 0.9979 | 0.9702 | 0.9832 | 0.9955 | 0.9979 | 0.9561 |
| 0.0296 | 77.9 | 390 | 0.0448 | 0.9862 | 0.9911 | 0.9979 | 0.9697 | 0.9831 | 0.9953 | 0.9979 | 0.9561 |
| 0.0307 | 78.9 | 395 | 0.0452 | 0.9866 | 0.9914 | 0.9979 | 0.9706 | 0.9837 | 0.9953 | 0.9979 | 0.9579 |
| 0.0296 | 79.9 | 400 | 0.0453 | 0.9862 | 0.9911 | 0.9979 | 0.9697 | 0.9831 | 0.9953 | 0.9979 | 0.9561 |
| 0.0292 | 80.9 | 405 | 0.0454 | 0.9862 | 0.9911 | 0.9979 | 0.9697 | 0.9831 | 0.9953 | 0.9979 | 0.9561 |
| 0.0293 | 81.9 | 410 | 0.0452 | 0.9862 | 0.9911 | 0.9979 | 0.9697 | 0.9829 | 0.9955 | 0.9979 | 0.9552 |
| 0.0292 | 82.9 | 415 | 0.0454 | 0.9862 | 0.9911 | 0.9979 | 0.9697 | 0.9829 | 0.9955 | 0.9979 | 0.9552 |
| 0.0281 | 83.9 | 420 | 0.0454 | 0.9866 | 0.9914 | 0.9979 | 0.9706 | 0.9833 | 0.9958 | 0.9979 | 0.9562 |
| 0.0298 | 84.9 | 425 | 0.0452 | 0.9872 | 0.9918 | 0.9979 | 0.9720 | 0.9842 | 0.9958 | 0.9979 | 0.9588 |

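The scores above are standard F1 and precision metrics. As a minimal sketch, a compute_metrics function producing such values with scikit-learn could look like the following (the actual metric code and label ordering used in training are not documented here and are assumptions):

```python
# Illustrative metric computation; the original training script's metric code
# and label encoding are assumptions.
import numpy as np
from sklearn.metrics import f1_score, precision_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        "f1_macro": f1_score(labels, preds, average="macro"),
        "f1_per_class": f1_score(labels, preds, average=None).tolist(),
        "prec_macro": precision_score(labels, preds, average="macro"),
        "prec_per_class": precision_score(labels, preds, average=None).tolist(),
    }
```
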
Framework versions

  • Transformers 4.23.1
  • Pytorch 1.12.1+cu113
  • Datasets 2.6.1
  • Tokenizers 0.13.1