oscarwu committed
Commit
0c5a746
1 Parent(s): 46a8ec6

update model card README.md

Files changed (1)
  1. README.md +72 -37
README.md CHANGED
@@ -14,15 +14,15 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [oscarwu/mlcovid19-classifier](https://huggingface.co/oscarwu/mlcovid19-classifier) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.4565
- - F1 Macro: 0.6596
- - F1 Misinformation: 0.8776
- - F1 Factual: 0.8823
- - F1 Other: 0.2188
- - Prec Macro: 0.7181
- - Prec Misinformation: 0.834
- - Prec Factual: 0.8952
- - Prec Other: 0.4251
+ - Loss: 0.2879
+ - F1 Macro: 0.7978
+ - F1 Misinformation: 0.9347
+ - F1 Factual: 0.9423
+ - F1 Other: 0.5166
+ - Prec Macro: 0.8156
+ - Prec Misinformation: 0.9277
+ - Prec Factual: 0.9345
+ - Prec Other: 0.5846
 
  ## Model description
 
@@ -41,41 +41,76 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - learning_rate: 2e-06
- - train_batch_size: 32
- - eval_batch_size: 32
+ - learning_rate: 2e-05
+ - train_batch_size: 128
+ - eval_batch_size: 128
  - seed: 42
  - gradient_accumulation_steps: 32
- - total_train_batch_size: 1024
+ - total_train_batch_size: 4096
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - lr_scheduler_warmup_steps: 2165
- - num_epochs: 50
+ - lr_scheduler_warmup_steps: 2607
+ - num_epochs: 400
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Misinformation | F1 Factual | F1 Other | Prec Macro | Prec Misinformation | Prec Factual | Prec Other |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------------:|:----------:|:--------:|:----------:|:-------------------:|:------------:|:----------:|
- | 1.1986 | 0.94 | 16 | 1.0988 | 0.6120 | 0.8270 | 0.7623 | 0.2466 | 0.6567 | 0.7636 | 0.8578 | 0.3487 |
- | 1.1885 | 1.94 | 32 | 1.0851 | 0.6163 | 0.8271 | 0.7645 | 0.2574 | 0.6588 | 0.7660 | 0.8557 | 0.3547 |
- | 1.1538 | 2.94 | 48 | 1.0625 | 0.6198 | 0.8275 | 0.7683 | 0.2635 | 0.6593 | 0.7695 | 0.8521 | 0.3565 |
- | 1.1386 | 3.94 | 64 | 1.0307 | 0.6235 | 0.8259 | 0.7722 | 0.2725 | 0.6564 | 0.7738 | 0.8452 | 0.3502 |
- | 1.0935 | 4.94 | 80 | 0.9911 | 0.6276 | 0.8259 | 0.7797 | 0.2771 | 0.6549 | 0.7803 | 0.8392 | 0.3452 |
- | 1.055 | 5.94 | 96 | 0.9445 | 0.6304 | 0.8271 | 0.7912 | 0.2730 | 0.6521 | 0.7893 | 0.8344 | 0.3327 |
- | 0.9925 | 6.94 | 112 | 0.8945 | 0.6340 | 0.8270 | 0.8001 | 0.2749 | 0.6518 | 0.7976 | 0.8251 | 0.3327 |
- | 0.9446 | 7.94 | 128 | 0.8448 | 0.6390 | 0.8303 | 0.8106 | 0.2760 | 0.6545 | 0.8088 | 0.8186 | 0.3360 |
- | 0.8813 | 8.94 | 144 | 0.7970 | 0.6448 | 0.8355 | 0.8238 | 0.2752 | 0.6598 | 0.8185 | 0.8214 | 0.3395 |
- | 0.8259 | 9.94 | 160 | 0.7475 | 0.6480 | 0.8405 | 0.8330 | 0.2704 | 0.6644 | 0.8243 | 0.8256 | 0.3434 |
- | 0.7721 | 10.94 | 176 | 0.6971 | 0.6532 | 0.8483 | 0.8430 | 0.2684 | 0.6746 | 0.8281 | 0.8375 | 0.3583 |
- | 0.7107 | 11.94 | 192 | 0.6542 | 0.6510 | 0.8527 | 0.8496 | 0.2507 | 0.6765 | 0.8290 | 0.8448 | 0.3557 |
- | 0.6742 | 12.94 | 208 | 0.6126 | 0.6527 | 0.8554 | 0.8544 | 0.2484 | 0.6793 | 0.8298 | 0.8521 | 0.3560 |
- | 0.6296 | 13.94 | 224 | 0.5735 | 0.6560 | 0.8603 | 0.8586 | 0.2491 | 0.6902 | 0.8298 | 0.8602 | 0.3804 |
- | 0.5947 | 14.94 | 240 | 0.5416 | 0.6592 | 0.8641 | 0.8624 | 0.2512 | 0.6986 | 0.8299 | 0.8689 | 0.3970 |
- | 0.5728 | 15.94 | 256 | 0.5164 | 0.6584 | 0.8678 | 0.8674 | 0.2402 | 0.7028 | 0.8312 | 0.8745 | 0.4026 |
- | 0.5424 | 16.94 | 272 | 0.4950 | 0.6620 | 0.8711 | 0.8720 | 0.2428 | 0.7110 | 0.8315 | 0.8836 | 0.4178 |
- | 0.5277 | 17.94 | 288 | 0.4798 | 0.6594 | 0.8727 | 0.8751 | 0.2305 | 0.7107 | 0.8316 | 0.8874 | 0.4130 |
- | 0.5204 | 18.94 | 304 | 0.4679 | 0.6613 | 0.8749 | 0.8767 | 0.2323 | 0.7183 | 0.8335 | 0.8868 | 0.4346 |
- | 0.5061 | 19.94 | 320 | 0.4565 | 0.6596 | 0.8776 | 0.8823 | 0.2188 | 0.7181 | 0.834 | 0.8952 | 0.4251 |
+ | Training Loss | Epoch | Step | Validation Loss | F1 Macro | F1 Misinformation | F1 Factual | F1 Other | Prec Macro | Prec Misinformation | Prec Factual | Prec Other |
+ |:-------------:|:------:|:----:|:---------------:|:--------:|:-----------------:|:----------:|:--------:|:----------:|:-------------------:|:------------:|:----------:|
+ | 0.4535 | 1.98 | 10 | 0.4122 | 0.6809 | 0.8906 | 0.8993 | 0.2529 | 0.7749 | 0.8433 | 0.9169 | 0.5646 |
+ | 0.4445 | 3.98 | 20 | 0.4056 | 0.6844 | 0.8918 | 0.9004 | 0.2611 | 0.7706 | 0.8461 | 0.9171 | 0.5487 |
+ | 0.4362 | 5.98 | 30 | 0.3966 | 0.6870 | 0.8930 | 0.9020 | 0.2658 | 0.7672 | 0.8490 | 0.9171 | 0.5356 |
+ | 0.4229 | 7.98 | 40 | 0.3864 | 0.6885 | 0.8955 | 0.9055 | 0.2645 | 0.7652 | 0.8531 | 0.9179 | 0.5246 |
+ | 0.4134 | 9.98 | 50 | 0.3774 | 0.6889 | 0.8983 | 0.9091 | 0.2594 | 0.7697 | 0.8573 | 0.9173 | 0.5345 |
+ | 0.4004 | 11.98 | 60 | 0.3682 | 0.6907 | 0.8996 | 0.9111 | 0.2616 | 0.7763 | 0.8605 | 0.9148 | 0.5536 |
+ | 0.3893 | 13.98 | 70 | 0.3583 | 0.6960 | 0.9014 | 0.9124 | 0.2740 | 0.7853 | 0.8629 | 0.9152 | 0.5778 |
+ | 0.3853 | 15.98 | 80 | 0.3483 | 0.7036 | 0.9031 | 0.9157 | 0.2920 | 0.7749 | 0.8683 | 0.9172 | 0.5390 |
+ | 0.369 | 17.98 | 90 | 0.3399 | 0.7011 | 0.9037 | 0.9167 | 0.2828 | 0.7775 | 0.8690 | 0.9159 | 0.5476 |
+ | 0.36 | 19.98 | 100 | 0.3312 | 0.7102 | 0.9056 | 0.9194 | 0.3055 | 0.7836 | 0.8733 | 0.9167 | 0.5609 |
+ | 0.3445 | 21.98 | 110 | 0.3237 | 0.7116 | 0.9065 | 0.9204 | 0.3078 | 0.7860 | 0.8749 | 0.9165 | 0.5667 |
+ | 0.3406 | 23.98 | 120 | 0.3181 | 0.7058 | 0.9068 | 0.9212 | 0.2893 | 0.7880 | 0.8740 | 0.9162 | 0.5738 |
+ | 0.3286 | 25.98 | 130 | 0.3094 | 0.7183 | 0.9099 | 0.9250 | 0.32 | 0.7932 | 0.8782 | 0.9216 | 0.5797 |
+ | 0.3213 | 27.98 | 140 | 0.3049 | 0.7187 | 0.9111 | 0.9254 | 0.3196 | 0.7957 | 0.8800 | 0.9204 | 0.5867 |
+ | 0.3111 | 29.98 | 150 | 0.3017 | 0.7219 | 0.9129 | 0.9264 | 0.3263 | 0.7983 | 0.8843 | 0.9178 | 0.5927 |
+ | 0.3087 | 31.98 | 160 | 0.2970 | 0.7231 | 0.9132 | 0.9276 | 0.3287 | 0.7977 | 0.8850 | 0.9188 | 0.5893 |
+ | 0.2992 | 33.98 | 170 | 0.2926 | 0.7243 | 0.9141 | 0.9293 | 0.3293 | 0.8003 | 0.8839 | 0.9235 | 0.5935 |
+ | 0.2924 | 35.98 | 180 | 0.2892 | 0.7312 | 0.9150 | 0.9303 | 0.3482 | 0.7971 | 0.8889 | 0.9218 | 0.5806 |
+ | 0.2878 | 37.98 | 190 | 0.2870 | 0.7356 | 0.9173 | 0.9324 | 0.3571 | 0.8027 | 0.8906 | 0.9246 | 0.5929 |
+ | 0.2811 | 39.98 | 200 | 0.2844 | 0.7439 | 0.9188 | 0.9328 | 0.3801 | 0.8109 | 0.8954 | 0.9213 | 0.6161 |
+ | 0.2751 | 41.98 | 210 | 0.2816 | 0.7500 | 0.9197 | 0.9340 | 0.3963 | 0.8060 | 0.8973 | 0.9250 | 0.5956 |
+ | 0.2683 | 43.98 | 220 | 0.2798 | 0.7517 | 0.9210 | 0.9339 | 0.4000 | 0.8068 | 0.8976 | 0.9272 | 0.5956 |
+ | 0.2643 | 45.98 | 230 | 0.2766 | 0.7544 | 0.9221 | 0.9349 | 0.4062 | 0.8064 | 0.8990 | 0.9290 | 0.5910 |
+ | 0.2619 | 47.98 | 240 | 0.2736 | 0.7579 | 0.9227 | 0.9356 | 0.4155 | 0.8085 | 0.9002 | 0.9298 | 0.5954 |
+ | 0.2539 | 49.98 | 250 | 0.2733 | 0.7567 | 0.9231 | 0.9357 | 0.4111 | 0.8060 | 0.9006 | 0.9302 | 0.5872 |
+ | 0.2496 | 51.98 | 260 | 0.2713 | 0.7600 | 0.9235 | 0.9360 | 0.4206 | 0.8070 | 0.9009 | 0.9320 | 0.5881 |
+ | 0.2455 | 53.98 | 270 | 0.2697 | 0.7575 | 0.9231 | 0.9356 | 0.4139 | 0.8052 | 0.9009 | 0.9304 | 0.5844 |
+ | 0.2371 | 55.98 | 280 | 0.2686 | 0.7652 | 0.9239 | 0.9356 | 0.4360 | 0.8058 | 0.9058 | 0.9283 | 0.5833 |
+ | 0.2316 | 57.98 | 290 | 0.2686 | 0.7664 | 0.9243 | 0.9361 | 0.4389 | 0.8037 | 0.9073 | 0.9288 | 0.5749 |
+ | 0.2258 | 59.98 | 300 | 0.2664 | 0.7680 | 0.9247 | 0.9360 | 0.4431 | 0.8018 | 0.9095 | 0.9279 | 0.5680 |
+ | 0.2207 | 61.98 | 310 | 0.2663 | 0.7736 | 0.9262 | 0.9373 | 0.4574 | 0.8015 | 0.9145 | 0.9276 | 0.5625 |
+ | 0.2167 | 63.98 | 320 | 0.2643 | 0.7715 | 0.9268 | 0.9380 | 0.4498 | 0.8003 | 0.9127 | 0.9312 | 0.5571 |
+ | 0.2131 | 65.98 | 330 | 0.2627 | 0.7753 | 0.9287 | 0.9398 | 0.4573 | 0.8064 | 0.9123 | 0.9356 | 0.5714 |
+ | 0.2075 | 67.98 | 340 | 0.2644 | 0.7760 | 0.9290 | 0.9397 | 0.4593 | 0.8056 | 0.9136 | 0.9349 | 0.5682 |
+ | 0.2049 | 69.98 | 350 | 0.2648 | 0.7768 | 0.9290 | 0.9390 | 0.4623 | 0.8050 | 0.9174 | 0.9292 | 0.5685 |
+ | 0.2016 | 71.98 | 360 | 0.2631 | 0.7771 | 0.9295 | 0.9394 | 0.4623 | 0.8055 | 0.9165 | 0.9316 | 0.5685 |
+ | 0.1979 | 73.98 | 370 | 0.2644 | 0.7793 | 0.9305 | 0.9397 | 0.4677 | 0.8041 | 0.9208 | 0.9295 | 0.5620 |
+ | 0.1939 | 75.98 | 380 | 0.2671 | 0.7909 | 0.9312 | 0.9392 | 0.5023 | 0.8099 | 0.9272 | 0.9256 | 0.5771 |
+ | 0.1932 | 77.98 | 390 | 0.2648 | 0.7927 | 0.9325 | 0.9422 | 0.5035 | 0.8104 | 0.9242 | 0.9361 | 0.5709 |
+ | 0.1856 | 79.98 | 400 | 0.2615 | 0.7922 | 0.9331 | 0.9431 | 0.5004 | 0.8111 | 0.9235 | 0.9379 | 0.5719 |
+ | 0.1837 | 81.98 | 410 | 0.2624 | 0.7898 | 0.9328 | 0.9447 | 0.4920 | 0.8141 | 0.9183 | 0.9432 | 0.5808 |
+ | 0.1781 | 83.98 | 420 | 0.2660 | 0.7988 | 0.9334 | 0.9432 | 0.5196 | 0.8128 | 0.9263 | 0.9388 | 0.5733 |
+ | 0.172 | 85.98 | 430 | 0.2642 | 0.7909 | 0.9335 | 0.9428 | 0.4964 | 0.8139 | 0.9234 | 0.9353 | 0.5829 |
+ | 0.172 | 87.98 | 440 | 0.2695 | 0.7880 | 0.9321 | 0.9430 | 0.4889 | 0.8121 | 0.9172 | 0.9422 | 0.5771 |
+ | 0.1656 | 89.98 | 450 | 0.2671 | 0.7928 | 0.9337 | 0.9436 | 0.5012 | 0.8145 | 0.9212 | 0.9411 | 0.5811 |
+ | 0.163 | 91.98 | 460 | 0.2693 | 0.7949 | 0.9331 | 0.9429 | 0.5088 | 0.8111 | 0.9232 | 0.9408 | 0.5692 |
+ | 0.1555 | 93.98 | 470 | 0.2696 | 0.7967 | 0.9332 | 0.9431 | 0.5138 | 0.8142 | 0.9203 | 0.9449 | 0.5776 |
+ | 0.1513 | 95.98 | 480 | 0.2710 | 0.7985 | 0.9340 | 0.9443 | 0.5172 | 0.8156 | 0.9220 | 0.9450 | 0.5798 |
+ | 0.1478 | 97.98 | 490 | 0.2722 | 0.7991 | 0.9342 | 0.9442 | 0.5189 | 0.8138 | 0.9243 | 0.9436 | 0.5736 |
+ | 0.1435 | 99.98 | 500 | 0.2725 | 0.7981 | 0.9343 | 0.9432 | 0.5166 | 0.8124 | 0.9248 | 0.9424 | 0.57 |
+ | 0.1409 | 101.98 | 510 | 0.2754 | 0.7994 | 0.9345 | 0.9432 | 0.5206 | 0.8161 | 0.9231 | 0.9433 | 0.5819 |
+ | 0.1384 | 103.98 | 520 | 0.2817 | 0.7991 | 0.9347 | 0.9441 | 0.5184 | 0.8166 | 0.9233 | 0.9436 | 0.5828 |
+ | 0.1333 | 105.98 | 530 | 0.2833 | 0.7934 | 0.9351 | 0.9434 | 0.5016 | 0.8178 | 0.9232 | 0.9380 | 0.5921 |
+ | 0.1267 | 107.98 | 540 | 0.2929 | 0.7884 | 0.9337 | 0.9429 | 0.4886 | 0.8167 | 0.9198 | 0.9377 | 0.5925 |
+ | 0.1234 | 109.98 | 550 | 0.2879 | 0.7978 | 0.9347 | 0.9423 | 0.5166 | 0.8156 | 0.9277 | 0.9345 | 0.5846 |
 
 
  ### Framework versions
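The evaluation block in the updated card reports macro-averaged and per-class F1 and precision for the three labels (misinformation, factual, other). The card does not include the evaluation code, so the snippet below is only a minimal sketch of how such metrics are conventionally computed with scikit-learn; the label ids, their order, and the toy predictions are illustrative assumptions, not taken from the training repo.

```python
# Sketch only: macro and per-class F1/precision in the style reported above.
# Label ids/order and the toy data are assumptions; the real mapping lives in
# the model's config.json (id2label).
from sklearn.metrics import f1_score, precision_score

label_names = ["misinformation", "factual", "other"]  # assumed order
label_ids = [0, 1, 2]

y_true = [0, 0, 1, 1, 2, 0]  # toy gold labels
y_pred = [0, 0, 1, 2, 2, 1]  # toy predictions

f1_per_class = f1_score(y_true, y_pred, labels=label_ids, average=None)
prec_per_class = precision_score(
    y_true, y_pred, labels=label_ids, average=None, zero_division=0
)

for name, f1, prec in zip(label_names, f1_per_class, prec_per_class):
    print(f"F1 {name}: {f1:.4f}  Prec {name}: {prec:.4f}")

print(f"F1 Macro: {f1_score(y_true, y_pred, average='macro'):.4f}")
print(f"Prec Macro: {precision_score(y_true, y_pred, average='macro', zero_division=0):.4f}")
```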
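The new hyperparameters (learning_rate 2e-05, per-device batch size 128 with 32 gradient-accumulation steps, a linear schedule with 2607 warmup steps, 400 epochs, Adam with betas (0.9, 0.999) and epsilon 1e-08, seed 42) read like output from the transformers Trainer. A rough sketch of the corresponding TrainingArguments is below; the output directory, dataset, and evaluation cadence are not stated in the card and are placeholders.

```python
# Rough sketch: the card's hyperparameters expressed as transformers
# TrainingArguments. output_dir is a placeholder; the dataset, model, and
# evaluation settings are not specified in the model card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mlcovid19-classifier-finetuned",  # placeholder name
    learning_rate=2e-05,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=32,  # 128 * 32 = 4096 effective train batch
    num_train_epochs=400,
    lr_scheduler_type="linear",
    warmup_steps=2607,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    seed=42,
)
```

Note that the reported total_train_batch_size of 4096 equals train_batch_size times gradient_accumulation_steps (128 × 32), consistent with training on a single device.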
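The card names only the base checkpoint, [oscarwu/mlcovid19-classifier](https://huggingface.co/oscarwu/mlcovid19-classifier). As a usage sketch, the snippet below loads that repo id with the standard sequence-classification API; substitute the fine-tuned checkpoint's own repo id once it is published. Label names are read from the model config rather than hardcoded.

```python
# Usage sketch with the base checkpoint named in the card; replace model_id
# with this fine-tuned model's repo id as appropriate.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "oscarwu/mlcovid19-classifier"  # base model from the card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "Example COVID-19 claim to classify."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = int(logits.argmax(dim=-1).item())
print(model.config.id2label[pred_id])
```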