saattrupdan committed
Commit adcff3d
Parent: d5d4d0a

update model card README.md

Files changed (1): README.md +37 -35
README.md CHANGED
@@ -14,15 +14,15 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.1520
- - F1 Macro: 0.9013
- - F1 Misinformation: 0.9841
- - F1 Factual: 0.9697
- - F1 Other: 0.75
- - Prec Macro: 0.8643
- - Prec Misinformation: 0.9954
- - Prec Factual: 0.9412
- - Prec Other: 0.6562
+ - Loss: 0.1304
+ - F1 Macro: 0.8868
+ - F1 Misinformation: 0.9832
+ - F1 Factual: 0.9890
+ - F1 Other: 0.6882
+ - Prec Macro: 0.8580
+ - Prec Misinformation: 0.9918
+ - Prec Factual: 0.9783
+ - Prec Other: 0.6038
 
 ## Model description
 
@@ -49,38 +49,40 @@ The following hyperparameters were used during training:
 - total_train_batch_size: 32
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - lr_scheduler_warmup_steps: 550
+ - lr_scheduler_warmup_steps: 625
 - num_epochs: 1000
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Misinformation | F1 Factual | F1 Other | Prec Macro | Prec Misinformation | Prec Factual | Prec Other |
 |:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------------:|:----------:|:--------:|:----------:|:-------------------:|:------------:|:----------:|
- | 1.072 | 0.73 | 50 | 1.0233 | 0.3136 | 0.9408 | 0.0 | 0.0 | 0.2961 | 0.8882 | 0.0 | 0.0 |
- | 1.0077 | 1.47 | 100 | 0.8870 | 0.3136 | 0.9408 | 0.0 | 0.0 | 0.2961 | 0.8882 | 0.0 | 0.0 |
- | 0.9439 | 2.2 | 150 | 0.6889 | 0.3136 | 0.9408 | 0.0 | 0.0 | 0.2961 | 0.8882 | 0.0 | 0.0 |
- | 0.8743 | 2.93 | 200 | 0.3857 | 0.3129 | 0.9386 | 0.0 | 0.0 | 0.2959 | 0.8878 | 0.0 | 0.0 |
- | 0.7564 | 3.67 | 250 | 0.2474 | 0.4630 | 0.9716 | 0.0 | 0.4176 | 0.4225 | 0.9839 | 0.0 | 0.2836 |
- | 0.5366 | 4.41 | 300 | 0.1819 | 0.8054 | 0.9713 | 0.8772 | 0.5676 | 0.8043 | 0.9930 | 1.0 | 0.42 |
- | 0.4043 | 5.15 | 350 | 0.1344 | 0.8425 | 0.9738 | 0.9538 | 0.6 | 0.8093 | 0.9884 | 0.9394 | 0.5 |
- | 0.3792 | 5.87 | 400 | 0.1259 | 0.8645 | 0.9761 | 0.9841 | 0.6333 | 0.8388 | 0.9885 | 1.0 | 0.5278 |
- | 0.2756 | 6.61 | 450 | 0.1344 | 0.8576 | 0.9774 | 0.9538 | 0.6415 | 0.8366 | 0.9841 | 0.9394 | 0.5862 |
- | 0.2589 | 7.35 | 500 | 0.1188 | 0.8738 | 0.9783 | 0.9412 | 0.7018 | 0.8293 | 0.9931 | 0.8889 | 0.6061 |
- | 0.2175 | 8.09 | 550 | 0.1436 | 0.8573 | 0.9798 | 0.9538 | 0.6383 | 0.8571 | 0.9798 | 0.9394 | 0.6522 |
- | 0.1888 | 8.81 | 600 | 0.1566 | 0.8613 | 0.9761 | 0.9412 | 0.6667 | 0.8185 | 0.9907 | 0.8889 | 0.5758 |
- | 0.15 | 9.55 | 650 | 0.1549 | 0.8542 | 0.9773 | 0.9538 | 0.6316 | 0.8245 | 0.9885 | 0.9394 | 0.5455 |
- | 0.1464 | 10.29 | 700 | 0.1608 | 0.8633 | 0.9773 | 0.9697 | 0.6429 | 0.8307 | 0.9885 | 0.9412 | 0.5625 |
- | 0.0954 | 11.03 | 750 | 0.1520 | 0.9013 | 0.9841 | 0.9697 | 0.75 | 0.8643 | 0.9954 | 0.9412 | 0.6562 |
- | 0.1074 | 11.76 | 800 | 0.1655 | 0.8810 | 0.9819 | 0.9552 | 0.7059 | 0.8565 | 0.9886 | 0.9143 | 0.6667 |
- | 0.1078 | 12.49 | 850 | 0.1937 | 0.8989 | 0.9829 | 0.9552 | 0.7586 | 0.8530 | 0.9977 | 0.9143 | 0.6471 |
- | 0.098 | 13.23 | 900 | 0.2098 | 0.8767 | 0.9794 | 0.9412 | 0.7097 | 0.8226 | 1.0 | 0.8889 | 0.5789 |
- | 0.0931 | 13.96 | 950 | 0.1591 | 0.8755 | 0.9819 | 0.9538 | 0.6909 | 0.8477 | 0.9908 | 0.9394 | 0.6129 |
- | 0.0701 | 14.7 | 1000 | 0.2121 | 0.8926 | 0.9805 | 0.9552 | 0.7419 | 0.8398 | 1.0 | 0.9143 | 0.6053 |
- | 0.0692 | 15.44 | 1050 | 0.2118 | 0.8989 | 0.9829 | 0.9552 | 0.7586 | 0.8530 | 0.9977 | 0.9143 | 0.6471 |
- | 0.0848 | 16.17 | 1100 | 0.2094 | 0.8913 | 0.9818 | 0.9552 | 0.7368 | 0.8487 | 0.9954 | 0.9143 | 0.6364 |
- | 0.0471 | 16.9 | 1150 | 0.2197 | 0.8919 | 0.9818 | 0.9697 | 0.7241 | 0.8514 | 0.9954 | 0.9412 | 0.6176 |
- | 0.0399 | 17.64 | 1200 | 0.1997 | 0.9019 | 0.9852 | 0.9538 | 0.7667 | 0.8594 | 1.0 | 0.9394 | 0.6389 |
- | 0.0307 | 18.38 | 1250 | 0.2873 | 0.8830 | 0.9795 | 0.9697 | 0.7000 | 0.8400 | 0.9954 | 0.9412 | 0.5833 |
+ | 1.0588 | 0.64 | 50 | 1.0803 | 0.0256 | 0.0 | 0.0 | 0.0768 | 0.0133 | 0.0 | 0.0 | 0.0400 |
+ | 0.9885 | 1.28 | 100 | 1.0055 | 0.3497 | 0.9291 | 0.0 | 0.12 | 0.3910 | 0.8729 | 0.0 | 0.3 |
+ | 0.971 | 1.92 | 150 | 0.9218 | 0.3102 | 0.9306 | 0.0 | 0.0 | 0.2900 | 0.8701 | 0.0 | 0.0 |
+ | 0.9263 | 2.56 | 200 | 0.6035 | 0.3102 | 0.9306 | 0.0 | 0.0 | 0.2900 | 0.8701 | 0.0 | 0.0 |
+ | 0.8672 | 3.2 | 250 | 0.3639 | 0.4428 | 0.9337 | 0.0 | 0.3946 | 0.3976 | 0.9217 | 0.0 | 0.2710 |
+ | 0.743 | 3.84 | 300 | 0.2396 | 0.7944 | 0.9698 | 0.9091 | 0.5043 | 0.7893 | 0.9812 | 1.0 | 0.3867 |
+ | 0.5106 | 4.49 | 350 | 0.1579 | 0.8399 | 0.9733 | 0.9888 | 0.5577 | 0.8130 | 0.9859 | 1.0 | 0.4531 |
+ | 0.4215 | 5.13 | 400 | 0.1245 | 0.8174 | 0.9747 | 0.9834 | 0.4941 | 0.8076 | 0.9780 | 0.9780 | 0.4667 |
+ | 0.3941 | 5.77 | 450 | 0.1422 | 0.8298 | 0.9678 | 1.0 | 0.5217 | 0.7960 | 0.9880 | 1.0 | 0.4 |
+ | 0.3105 | 6.41 | 500 | 0.1352 | 0.8223 | 0.9696 | 0.9836 | 0.5138 | 0.7872 | 0.9881 | 0.9677 | 0.4058 |
+ | 0.3126 | 7.05 | 550 | 0.1126 | 0.8423 | 0.9756 | 0.9945 | 0.5567 | 0.8162 | 0.9859 | 0.9890 | 0.4737 |
+ | 0.2206 | 7.69 | 600 | 0.1206 | 0.8557 | 0.9761 | 0.9890 | 0.6019 | 0.8203 | 0.9905 | 0.9783 | 0.4921 |
+ | 0.2472 | 8.33 | 650 | 0.1296 | 0.8481 | 0.9731 | 0.9945 | 0.5766 | 0.8105 | 0.9917 | 0.9890 | 0.4507 |
+ | 0.1839 | 8.97 | 700 | 0.1357 | 0.8582 | 0.9761 | 0.9890 | 0.6095 | 0.8208 | 0.9917 | 0.9783 | 0.4923 |
+ | 0.1282 | 9.61 | 750 | 0.1465 | 0.8481 | 0.9756 | 0.9945 | 0.5743 | 0.8175 | 0.9882 | 0.9890 | 0.4754 |
+ | 0.1447 | 10.26 | 800 | 0.1621 | 0.8602 | 0.9767 | 0.9945 | 0.6095 | 0.8243 | 0.9917 | 0.9890 | 0.4923 |
+ | 0.1223 | 10.9 | 850 | 0.1304 | 0.8868 | 0.9832 | 0.9890 | 0.6882 | 0.8580 | 0.9918 | 0.9783 | 0.6038 |
+ | 0.1053 | 11.54 | 900 | 0.1640 | 0.8714 | 0.9797 | 0.9945 | 0.64 | 0.8380 | 0.9918 | 0.9890 | 0.5333 |
+ | 0.064 | 12.18 | 950 | 0.1983 | 0.8627 | 0.9791 | 0.9889 | 0.62 | 0.8321 | 0.9906 | 0.9889 | 0.5167 |
+ | 0.1085 | 12.82 | 1000 | 0.1811 | 0.8688 | 0.9803 | 0.9945 | 0.6316 | 0.8413 | 0.9895 | 0.9890 | 0.5455 |
+ | 0.0885 | 13.46 | 1050 | 0.2052 | 0.8710 | 0.9821 | 0.9945 | 0.6364 | 0.8532 | 0.9872 | 0.9890 | 0.5833 |
+ | 0.0799 | 14.1 | 1100 | 0.1826 | 0.8801 | 0.9827 | 0.9836 | 0.6742 | 0.8565 | 0.9895 | 0.9677 | 0.6122 |
+ | 0.0737 | 14.74 | 1150 | 0.2158 | 0.8556 | 0.9761 | 0.9945 | 0.5962 | 0.8213 | 0.9905 | 0.9890 | 0.4844 |
+ | 0.0564 | 15.38 | 1200 | 0.2283 | 0.8637 | 0.9797 | 0.9945 | 0.6170 | 0.8381 | 0.9883 | 0.9890 | 0.5370 |
+ | 0.0547 | 16.03 | 1250 | 0.2508 | 0.8693 | 0.9785 | 0.9888 | 0.6408 | 0.8381 | 0.9906 | 1.0 | 0.5238 |
+ | 0.0602 | 16.67 | 1300 | 0.2320 | 0.8555 | 0.9798 | 0.9889 | 0.5977 | 0.8420 | 0.9838 | 0.9889 | 0.5532 |
+ | 0.0576 | 17.31 | 1350 | 0.2346 | 0.8737 | 0.9803 | 0.9945 | 0.6465 | 0.8411 | 0.9918 | 0.9890 | 0.5424 |
 
 
 ### Framework versions
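
For reference, here is a minimal sketch of how the hyperparameters listed in this card would map onto Hugging Face `TrainingArguments`. The learning rate, per-device batch size, gradient accumulation steps, and output directory are not shown in this diff, so those values are illustrative placeholders only, not the author's actual settings:

```python
# Sketch only: parameters not shown in the diff are marked as placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="checkpoints",       # placeholder path
    per_device_train_batch_size=8,  # placeholder: 8 * 4 accumulation steps
    gradient_accumulation_steps=4,  #   would give the listed total_train_batch_size of 32
    adam_beta1=0.9,                 # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,             #   and epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_steps=625,               # raised from 550 to 625 in this commit
    num_train_epochs=1000,
    evaluation_strategy="steps",    # assumption: the table logs an eval every 50 steps
    eval_steps=50,
)
```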
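Likewise, a sketch of how the per-class and macro scores in the results table could be computed with scikit-learn. This is not the author's actual metrics code, and the integer label encoding is an assumption inferred from the column names:

```python
from sklearn.metrics import f1_score, precision_score

LABELS = [0, 1, 2]  # assumed encoding: 0=misinformation, 1=factual, 2=other

def compute_scores(y_true, y_pred):
    # average=None returns one score per label, in the order given by LABELS
    f1_per_class = f1_score(y_true, y_pred, labels=LABELS, average=None)
    prec_per_class = precision_score(y_true, y_pred, labels=LABELS, average=None)
    return {
        "f1_macro": f1_score(y_true, y_pred, labels=LABELS, average="macro"),
        "f1_misinformation": f1_per_class[0],
        "f1_factual": f1_per_class[1],
        "f1_other": f1_per_class[2],
        "prec_macro": precision_score(y_true, y_pred, labels=LABELS, average="macro"),
        "prec_misinformation": prec_per_class[0],
        "prec_factual": prec_per_class[1],
        "prec_other": prec_per_class[2],
    }
```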