Commit eefa2a8 (parent: a0fc30a) by jordyvl: update model card README.md
Files changed: README.md (+115 lines)
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_hint_rand
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# resnet101_rvl-cdip-cnn_rvl_cdip-NK1000_hint_rand

This model is a fine-tuned version of [bdpc/resnet101_rvl-cdip](https://huggingface.co/bdpc/resnet101_rvl-cdip) on an unspecified dataset (the Trainer did not record a dataset name).
It achieves the following results on the evaluation set:
- Loss: 19.9624
- Accuracy: 0.765
- Brier Loss: 0.3910
- NLL: 2.2998
- F1 Micro: 0.765
- F1 Macro: 0.7641
- ECE: 0.1669
- AURC: 0.0775

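The card reports calibration metrics (Brier loss, ECE) alongside accuracy. As a rough illustration of what these numbers measure, here is a minimal NumPy sketch; the function names and the equal-width 10-bin choice for ECE are illustrative assumptions, not taken from the training code:

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the class axis.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def brier_loss(probs, labels):
    # Multi-class Brier score: mean squared error between the predicted
    # distribution and the one-hot target.
    onehot = np.eye(probs.shape[1])[labels]
    return np.mean(np.sum((probs - onehot) ** 2, axis=1))

def ece(probs, labels, n_bins=10):
    # Expected Calibration Error: bin predictions by confidence and
    # average the |accuracy - confidence| gap, weighted by bin mass.
    conf = probs.max(axis=1)
    pred = probs.argmax(axis=1)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (conf > lo) & (conf <= hi)
        if mask.any():
            acc = (pred[mask] == labels[mask]).mean()
            total += mask.mean() * abs(acc - conf[mask].mean())
    return total
```

A low Brier loss requires both correct and confident predictions, while ECE is near zero only when stated confidence matches empirical accuracy, which is why the two can diverge as accuracy improves.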
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
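The linear scheduler with 10% warmup implies a learning-rate curve that can be sketched as a plain function of the step count. With 250 optimizer steps per epoch (per the results table) and 50 epochs, the run spans 12,500 steps, so warmup covers the first 1,250. This is an illustrative reconstruction, not the Trainer's actual scheduler code:

```python
def lr_at_step(step, total_steps, base_lr=1e-4, warmup_ratio=0.1):
    # Linear warmup from 0 to base_lr over the first warmup_ratio of
    # steps, then linear decay back to 0, mirroring
    # lr_scheduler_type: linear with lr_scheduler_warmup_ratio: 0.1.
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * (total_steps - step) / max(1, total_steps - warmup_steps)
```

Sanity check: the rate peaks at the configured 0.0001 at step 1,250 and reaches zero at step 12,500.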
### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Brier Loss | NLL    | F1 Micro | F1 Macro | ECE    | AURC   |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:----------:|:------:|:--------:|:--------:|:------:|:------:|
| No log        | 1.0   | 250   | 26.8546         | 0.2102   | 0.8652     | 3.4225 | 0.2102   | 0.1897   | 0.0659 | 0.6137 |
| 26.8891       | 2.0   | 500   | 25.9803         | 0.2845   | 0.8381     | 3.4360 | 0.2845   | 0.2457   | 0.0762 | 0.5573 |
| 26.8891       | 3.0   | 750   | 25.8252         | 0.337    | 0.8092     | 3.6483 | 0.337    | 0.3168   | 0.1091 | 0.4988 |
| 25.1656       | 4.0   | 1000  | 24.9957         | 0.4088   | 0.7588     | 3.0528 | 0.4088   | 0.3859   | 0.1283 | 0.4163 |
| 25.1656       | 5.0   | 1250  | 24.2209         | 0.5517   | 0.5964     | 2.6661 | 0.5517   | 0.5510   | 0.0726 | 0.2489 |
| 23.8526       | 6.0   | 1500  | 23.4086         | 0.5915   | 0.5431     | 2.5360 | 0.5915   | 0.5840   | 0.0549 | 0.1994 |
| 23.8526       | 7.0   | 1750  | 23.0800         | 0.625    | 0.5049     | 2.4839 | 0.625    | 0.6268   | 0.0608 | 0.1668 |
| 22.7511       | 8.0   | 2000  | 22.9512         | 0.6573   | 0.4660     | 2.4202 | 0.6573   | 0.6564   | 0.0565 | 0.1407 |
| 22.7511       | 9.0   | 2250  | 22.7991         | 0.6783   | 0.4509     | 2.4137 | 0.6783   | 0.6749   | 0.0634 | 0.1315 |
| 21.9881       | 10.0  | 2500  | 22.8533         | 0.6352   | 0.5056     | 2.6549 | 0.6352   | 0.6231   | 0.1085 | 0.1554 |
| 21.9881       | 11.0  | 2750  | 22.7499         | 0.669    | 0.4673     | 2.5291 | 0.669    | 0.6642   | 0.1053 | 0.1347 |
| 21.391        | 12.0  | 3000  | 22.6520         | 0.6757   | 0.4767     | 2.5038 | 0.6757   | 0.6745   | 0.1204 | 0.1355 |
| 21.391        | 13.0  | 3250  | 22.4767         | 0.6737   | 0.4850     | 2.6030 | 0.6737   | 0.6718   | 0.1385 | 0.1380 |
| 20.9347       | 14.0  | 3500  | 22.3023         | 0.6767   | 0.4832     | 2.5438 | 0.6767   | 0.6770   | 0.1594 | 0.1301 |
| 20.9347       | 15.0  | 3750  | 22.1482         | 0.693    | 0.4666     | 2.5622 | 0.693    | 0.6913   | 0.1581 | 0.1209 |
| 20.5776       | 16.0  | 4000  | 22.1655         | 0.6943   | 0.4849     | 2.5685 | 0.6943   | 0.6994   | 0.1766 | 0.1288 |
| 20.5776       | 17.0  | 4250  | 22.0213         | 0.686    | 0.4922     | 2.6576 | 0.686    | 0.6925   | 0.1749 | 0.1250 |
| 20.2836       | 18.0  | 4500  | 21.5434         | 0.7023   | 0.4560     | 2.5508 | 0.7023   | 0.7018   | 0.1720 | 0.1146 |
| 20.2836       | 19.0  | 4750  | 21.7105         | 0.715    | 0.4501     | 2.5953 | 0.715    | 0.7128   | 0.1738 | 0.1083 |
| 20.0339       | 20.0  | 5000  | 21.6301         | 0.7057   | 0.4645     | 2.6131 | 0.7057   | 0.7033   | 0.1701 | 0.1153 |
| 20.0339       | 21.0  | 5250  | 21.9130         | 0.7007   | 0.4825     | 2.7114 | 0.7007   | 0.6989   | 0.1917 | 0.1231 |
| 19.8026       | 22.0  | 5500  | 21.7975         | 0.713    | 0.4702     | 2.7340 | 0.713    | 0.7117   | 0.1879 | 0.1148 |
| 19.8026       | 23.0  | 5750  | 21.5577         | 0.7173   | 0.4621     | 2.7138 | 0.7173   | 0.7100   | 0.1931 | 0.1072 |
| 19.6001       | 24.0  | 6000  | 21.2486         | 0.722    | 0.4491     | 2.5651 | 0.722    | 0.7214   | 0.1853 | 0.1045 |
| 19.6001       | 25.0  | 6250  | 21.0363         | 0.7348   | 0.4344     | 2.4688 | 0.7348   | 0.7364   | 0.1780 | 0.0974 |
| 19.4158       | 26.0  | 6500  | 21.3527         | 0.728    | 0.4495     | 2.7492 | 0.728    | 0.7219   | 0.1864 | 0.1005 |
| 19.4158       | 27.0  | 6750  | 20.8258         | 0.7355   | 0.4339     | 2.4375 | 0.7355   | 0.7352   | 0.1838 | 0.0943 |
| 19.2585       | 28.0  | 7000  | 21.0491         | 0.729    | 0.4465     | 2.6324 | 0.729    | 0.7245   | 0.1953 | 0.1010 |
| 19.2585       | 29.0  | 7250  | 20.7774         | 0.7425   | 0.4283     | 2.4694 | 0.7425   | 0.7410   | 0.1799 | 0.0908 |
| 19.1051       | 30.0  | 7500  | 20.6908         | 0.741    | 0.4311     | 2.4924 | 0.7410   | 0.7405   | 0.1890 | 0.0888 |
| 19.1051       | 31.0  | 7750  | 20.8242         | 0.743    | 0.4264     | 2.5098 | 0.743    | 0.7407   | 0.1826 | 0.0903 |
| 18.9722       | 32.0  | 8000  | 20.6257         | 0.7435   | 0.4288     | 2.4740 | 0.7435   | 0.7432   | 0.1865 | 0.0872 |
| 18.9722       | 33.0  | 8250  | 20.6265         | 0.745    | 0.4289     | 2.4552 | 0.745    | 0.7435   | 0.1862 | 0.0929 |
| 18.854        | 34.0  | 8500  | 20.4251         | 0.7505   | 0.4124     | 2.4631 | 0.7505   | 0.7513   | 0.1790 | 0.0845 |
| 18.854        | 35.0  | 8750  | 20.4164         | 0.741    | 0.4278     | 2.3888 | 0.7410   | 0.7402   | 0.1859 | 0.0889 |
| 18.7477       | 36.0  | 9000  | 20.3432         | 0.751    | 0.4184     | 2.4020 | 0.751    | 0.7485   | 0.1800 | 0.0850 |
| 18.7477       | 37.0  | 9250  | 20.4310         | 0.7555   | 0.4154     | 2.4639 | 0.7555   | 0.7528   | 0.1759 | 0.0842 |
| 18.6548       | 38.0  | 9500  | 20.1987         | 0.7542   | 0.4111     | 2.2921 | 0.7542   | 0.7542   | 0.1792 | 0.0815 |
| 18.6548       | 39.0  | 9750  | 20.2326         | 0.7562   | 0.4017     | 2.3536 | 0.7562   | 0.7537   | 0.1767 | 0.0829 |
| 18.5776       | 40.0  | 10000 | 20.1571         | 0.7575   | 0.3985     | 2.3405 | 0.7575   | 0.7568   | 0.1703 | 0.0811 |
| 18.5776       | 41.0  | 10250 | 20.1580         | 0.7625   | 0.3962     | 2.3855 | 0.7625   | 0.7621   | 0.1713 | 0.0814 |
| 18.5133       | 42.0  | 10500 | 20.0952         | 0.7572   | 0.4038     | 2.3600 | 0.7572   | 0.7563   | 0.1768 | 0.0794 |
| 18.5133       | 43.0  | 10750 | 20.1483         | 0.7575   | 0.4008     | 2.3713 | 0.7575   | 0.7564   | 0.1755 | 0.0820 |
| 18.4613       | 44.0  | 11000 | 20.0749         | 0.762    | 0.3992     | 2.3372 | 0.762    | 0.7618   | 0.1720 | 0.0795 |
| 18.4613       | 45.0  | 11250 | 20.0664         | 0.7578   | 0.4035     | 2.3570 | 0.7577   | 0.7566   | 0.1769 | 0.0795 |
| 18.4218       | 46.0  | 11500 | 19.9611         | 0.7622   | 0.3946     | 2.3399 | 0.7622   | 0.7617   | 0.1674 | 0.0784 |
| 18.4218       | 47.0  | 11750 | 19.9678         | 0.7632   | 0.3907     | 2.3011 | 0.7632   | 0.7635   | 0.1692 | 0.0772 |
| 18.3945       | 48.0  | 12000 | 19.9950         | 0.763    | 0.3910     | 2.2773 | 0.763    | 0.7616   | 0.1695 | 0.0775 |
| 18.3945       | 49.0  | 12250 | 20.0013         | 0.7625   | 0.3911     | 2.2875 | 0.7625   | 0.7618   | 0.1705 | 0.0777 |
| 18.3792       | 50.0  | 12500 | 19.9624         | 0.765    | 0.3910     | 2.2998 | 0.765    | 0.7641   | 0.1669 | 0.0775 |

### Framework versions

- Transformers 4.26.1
- Pytorch 1.13.1.post200
- Datasets 2.9.0
- Tokenizers 0.13.2