sergiocannata committed
Commit
8d6bdbd
1 Parent(s): a846b75

update model card README.md

Files changed (1)
  1. README.md +139 -0
README.md ADDED
@@ -0,0 +1,139 @@
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
- f1
model-index:
- name: cvt-21-finetuned-brs2
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.660377358490566
    - name: F1
      type: f1
      value: 0.608695652173913
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# cvt-21-finetuned-brs2

This model is a fine-tuned version of [microsoft/cvt-21](https://huggingface.co/microsoft/cvt-21) on the imagefolder dataset.
It achieves the following results on the evaluation set (the confusion-matrix definitions behind these metrics are sketched after the list):
- Loss: 0.6947
- Accuracy: 0.6604
- F1: 0.6087
- Precision (PPV): 0.5385
- Recall (sensitivity): 0.7
- Specificity: 0.6364
- NPV: 0.7778
- AUC: 0.6682
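
These are standard binary-classification metrics; the sketch below shows how they relate to confusion-matrix counts. The helper and the example counts are illustrative only (one set of integer counts consistent with the values reported above), not part of the training code.

```python
# Illustrative only: how the reported metrics relate to confusion-matrix counts.
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    precision = tp / (tp + fp)        # PPV
    recall = tp / (tp + fn)           # sensitivity
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "f1": 2 * precision * recall / (precision + recall),
        "precision (ppv)": precision,
        "recall (sensitivity)": recall,
        "specificity": tn / (tn + fp),
        "npv": tn / (tn + fn),
    }

# For example, tp=14, fp=12, tn=21, fn=6 (a hypothetical 53-image evaluation
# split with 20 positives) reproduces the figures listed above.
print(classification_metrics(tp=14, fp=12, tn=21, fn=6))
```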

## Model description

More information needed

## Intended uses & limitations

More information needed
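
As a minimal usage sketch, image-classification inference with this checkpoint should look roughly like the following. The repo id `sergiocannata/cvt-21-finetuned-brs2` and the input image path are assumptions; the `AutoFeatureExtractor` API matches the Transformers 4.24 release listed under framework versions.

```python
import torch
from PIL import Image
from transformers import AutoFeatureExtractor, AutoModelForImageClassification

model_id = "sergiocannata/cvt-21-finetuned-brs2"  # assumed repo id

feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.png").convert("RGB")  # hypothetical input image
inputs = feature_extractor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

print(model.config.id2label[logits.argmax(-1).item()])
```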

## Training and evaluation data

More information needed
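
The metadata only records that the `imagefolder` dataset builder was used. Assuming the usual one-folder-per-class layout, loading the data would look roughly like this (the `data_dir` path is a placeholder):

```python
from datasets import load_dataset

# Hypothetical layout: data/train/<class_name>/*.png
dataset = load_dataset("imagefolder", data_dir="data")
print(dataset)              # DatasetDict with the discovered splits
print(dataset["train"][0])  # {'image': <PIL.Image ...>, 'label': 0}
```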

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
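
A `TrainingArguments` configuration matching the list above would look roughly as follows; argument names follow the Transformers 4.24 API, and the output directory is a placeholder.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="cvt-21-finetuned-brs2",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=4,       # effective train batch size of 4
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the optimizer default.
)
```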

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision (PPV) | Recall (sensitivity) | Specificity | NPV | AUC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------------:|:--------------------:|:-----------:|:------:|:------:|
| 0.8177 | 1.89 | 100 | 0.7113 | 0.5283 | 0.5098 | 0.4194 | 0.65 | 0.4545 | 0.6818 | 0.5523 |
| 0.736 | 3.77 | 200 | 0.7178 | 0.5283 | 0.3902 | 0.3810 | 0.4 | 0.6061 | 0.625 | 0.5030 |
| 0.5978 | 5.66 | 300 | 0.6889 | 0.6038 | 0.5532 | 0.4815 | 0.65 | 0.5758 | 0.7308 | 0.6129 |
| 0.5576 | 7.55 | 400 | 0.7349 | 0.4717 | 0.5484 | 0.4048 | 0.85 | 0.2424 | 0.7273 | 0.5462 |
| 0.5219 | 9.43 | 500 | 0.6522 | 0.6038 | 0.4 | 0.4667 | 0.35 | 0.7576 | 0.6579 | 0.5538 |
| 0.5326 | 11.32 | 600 | 0.6665 | 0.6226 | 0.5238 | 0.5 | 0.55 | 0.6667 | 0.7097 | 0.6083 |
| 0.4381 | 13.21 | 700 | 0.7685 | 0.4717 | 0.5333 | 0.4 | 0.8 | 0.2727 | 0.6923 | 0.5364 |
| 0.5598 | 15.09 | 800 | 0.7212 | 0.5283 | 0.1935 | 0.2727 | 0.15 | 0.7576 | 0.5952 | 0.4538 |
| 0.6887 | 16.98 | 900 | 0.6985 | 0.6604 | 0.64 | 0.5333 | 0.8 | 0.5758 | 0.8261 | 0.6879 |
| 0.7594 | 18.87 | 1000 | 0.7040 | 0.5472 | 0.4286 | 0.4091 | 0.45 | 0.6061 | 0.6452 | 0.5280 |
| 0.2177 | 20.75 | 1100 | 0.8056 | 0.4528 | 0.5397 | 0.3953 | 0.85 | 0.2121 | 0.7 | 0.5311 |
| 0.4893 | 22.64 | 1200 | 0.8821 | 0.3396 | 0.3860 | 0.2973 | 0.55 | 0.2121 | 0.4375 | 0.3811 |
| 0.5994 | 24.53 | 1300 | 0.8059 | 0.5660 | 0.5660 | 0.4545 | 0.75 | 0.4545 | 0.75 | 0.6023 |
| 0.5179 | 26.42 | 1400 | 0.6750 | 0.6038 | 0.4615 | 0.4737 | 0.45 | 0.6970 | 0.6765 | 0.5735 |
| 0.198 | 28.3 | 1500 | 0.7448 | 0.3962 | 0.3333 | 0.2857 | 0.4 | 0.3939 | 0.52 | 0.3970 |
| 0.6536 | 30.19 | 1600 | 0.7555 | 0.5094 | 0.4583 | 0.3929 | 0.55 | 0.4848 | 0.64 | 0.5174 |
| 0.7558 | 32.08 | 1700 | 0.6664 | 0.5849 | 0.4762 | 0.4545 | 0.5 | 0.6364 | 0.6774 | 0.5682 |
| 0.4915 | 33.96 | 1800 | 0.9213 | 0.3962 | 0.5152 | 0.3696 | 0.85 | 0.1212 | 0.5714 | 0.4856 |
| 0.3661 | 35.85 | 1900 | 0.9202 | 0.4528 | 0.4912 | 0.3784 | 0.7 | 0.3030 | 0.625 | 0.5015 |
| 0.4838 | 37.74 | 2000 | 0.9297 | 0.4528 | 0.5085 | 0.3846 | 0.75 | 0.2727 | 0.6429 | 0.5114 |
| 0.8461 | 39.62 | 2100 | 0.9464 | 0.4717 | 0.5758 | 0.4130 | 0.95 | 0.1818 | 0.8571 | 0.5659 |
| 0.6937 | 41.51 | 2200 | 0.7129 | 0.5094 | 0.48 | 0.4 | 0.6 | 0.4545 | 0.6522 | 0.5273 |
| 0.6302 | 43.4 | 2300 | 0.6866 | 0.5849 | 0.6071 | 0.4722 | 0.85 | 0.4242 | 0.8235 | 0.6371 |
| 0.0793 | 45.28 | 2400 | 0.7791 | 0.5094 | 0.5517 | 0.4211 | 0.8 | 0.3333 | 0.7333 | 0.5667 |
| 0.464 | 47.17 | 2500 | 0.8116 | 0.4340 | 0.4444 | 0.3529 | 0.6 | 0.3333 | 0.5789 | 0.4667 |
| 0.6131 | 49.06 | 2600 | 0.5970 | 0.6226 | 0.5455 | 0.5 | 0.6 | 0.6364 | 0.7241 | 0.6182 |
| 0.6937 | 50.94 | 2700 | 0.8201 | 0.4340 | 0.4 | 0.3333 | 0.5 | 0.3939 | 0.5652 | 0.4470 |
| 0.6552 | 52.83 | 2800 | 0.7168 | 0.5660 | 0.5306 | 0.4483 | 0.65 | 0.5152 | 0.7083 | 0.5826 |
| 0.7749 | 54.72 | 2900 | 0.6875 | 0.5849 | 0.5217 | 0.4615 | 0.6 | 0.5758 | 0.7037 | 0.5879 |
| 0.9482 | 56.6 | 3000 | 0.6392 | 0.6226 | 0.6296 | 0.5 | 0.85 | 0.4848 | 0.8421 | 0.6674 |
| 0.2467 | 58.49 | 3100 | 0.6281 | 0.6038 | 0.5333 | 0.48 | 0.6 | 0.6061 | 0.7143 | 0.6030 |
| 0.2903 | 60.38 | 3200 | 0.7383 | 0.5472 | 0.5556 | 0.4412 | 0.75 | 0.4242 | 0.7368 | 0.5871 |
| 0.5859 | 62.26 | 3300 | 0.7191 | 0.6226 | 0.5652 | 0.5 | 0.65 | 0.6061 | 0.7407 | 0.6280 |
| 0.3815 | 64.15 | 3400 | 0.7469 | 0.5283 | 0.4444 | 0.4 | 0.5 | 0.5455 | 0.6429 | 0.5227 |
| 0.531 | 66.04 | 3500 | 0.7566 | 0.6226 | 0.5652 | 0.5 | 0.65 | 0.6061 | 0.7407 | 0.6280 |
| 0.3892 | 67.92 | 3600 | 0.8168 | 0.5660 | 0.5490 | 0.4516 | 0.7 | 0.4848 | 0.7273 | 0.5924 |
| 0.6487 | 69.81 | 3700 | 0.9077 | 0.4340 | 0.4643 | 0.3611 | 0.65 | 0.3030 | 0.5882 | 0.4765 |
| 0.5525 | 71.7 | 3800 | 0.6961 | 0.6038 | 0.5116 | 0.4783 | 0.55 | 0.6364 | 0.7 | 0.5932 |
| 0.3137 | 73.58 | 3900 | 1.0817 | 0.3774 | 0.4590 | 0.3415 | 0.7 | 0.1818 | 0.5 | 0.4409 |
| 0.3526 | 75.47 | 4000 | 0.7684 | 0.5472 | 0.5862 | 0.4474 | 0.85 | 0.3636 | 0.8 | 0.6068 |
| 0.5938 | 77.36 | 4100 | 0.8786 | 0.4340 | 0.4828 | 0.3684 | 0.7 | 0.2727 | 0.6 | 0.4864 |
| 0.2431 | 79.25 | 4200 | 0.8925 | 0.4151 | 0.4746 | 0.3590 | 0.7 | 0.2424 | 0.5714 | 0.4712 |
| 0.1021 | 81.13 | 4300 | 1.0740 | 0.4528 | 0.4727 | 0.3714 | 0.65 | 0.3333 | 0.6111 | 0.4917 |
| 0.3429 | 83.02 | 4400 | 0.7723 | 0.4906 | 0.5091 | 0.4 | 0.7 | 0.3636 | 0.6667 | 0.5318 |
| 0.3836 | 84.91 | 4500 | 0.7247 | 0.5472 | 0.5556 | 0.4412 | 0.75 | 0.4242 | 0.7368 | 0.5871 |
| 0.4099 | 86.79 | 4600 | 0.8508 | 0.4340 | 0.4828 | 0.3684 | 0.7 | 0.2727 | 0.6 | 0.4864 |
| 0.8264 | 88.68 | 4700 | 0.7682 | 0.5849 | 0.5769 | 0.4688 | 0.75 | 0.4848 | 0.7619 | 0.6174 |
| 0.1928 | 90.57 | 4800 | 0.8738 | 0.4906 | 0.5574 | 0.4146 | 0.85 | 0.2727 | 0.75 | 0.5614 |
| 0.3422 | 92.45 | 4900 | 0.8810 | 0.5660 | 0.5965 | 0.4595 | 0.85 | 0.3939 | 0.8125 | 0.6220 |
| 0.5524 | 94.34 | 5000 | 1.0801 | 0.3774 | 0.4923 | 0.3556 | 0.8 | 0.1212 | 0.5 | 0.4606 |
| 0.464 | 96.23 | 5100 | 0.9417 | 0.5283 | 0.5902 | 0.4390 | 0.9 | 0.3030 | 0.8333 | 0.6015 |
| 0.7182 | 98.11 | 5200 | 1.0335 | 0.4151 | 0.4746 | 0.3590 | 0.7 | 0.2424 | 0.5714 | 0.4712 |
| 0.604 | 100.0 | 5300 | 0.6947 | 0.6604 | 0.6087 | 0.5385 | 0.7 | 0.6364 | 0.7778 | 0.6682 |

### Framework versions

- Transformers 4.24.0
- PyTorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.1