wcosmas committed on
Commit ab25088 · verified · 1 parent: c62f40f

Model save
README.md CHANGED
@@ -4,22 +4,37 @@ license: apache-2.0
 base_model: google/vit-base-patch16-224-in21k
 tags:
 - generated_from_trainer
+datasets:
+- imagefolder
 metrics:
 - accuracy
 model-index:
 - name: vit-base-patch16-224-in21k-finetuned-papsmear
-  results: []
+  results:
+  - task:
+      name: Image Classification
+      type: image-classification
+    dataset:
+      name: imagefolder
+      type: imagefolder
+      config: default
+      split: train
+      args: default
+    metrics:
+    - name: Accuracy
+      type: accuracy
+      value: 0.9191176470588235
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# vit-base-patch16-224-in21k-finetuned-biopsy
+# vit-base-patch16-224-in21k-finetuned-papsmear
 
-This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset.
+This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.1092
-- Accuracy: 0.9732
+- Loss: 0.2707
+- Accuracy: 0.9191
 
 ## Model description
 
@@ -51,63 +66,60 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Accuracy |
-|:-------------:|:-----:|:----:|:---------------:|:--------:|
-| 1.1553 | 1.0 | 42 | 1.0950 | 0.5477 |
-| 0.7791 | 2.0 | 84 | 0.6486 | 0.8526 |
-| 0.433 | 3.0 | 126 | 0.3716 | 0.9129 |
-| 0.3495 | 4.0 | 168 | 0.2869 | 0.9347 |
-| 0.2556 | 5.0 | 210 | 0.2722 | 0.9280 |
-| 0.2791 | 6.0 | 252 | 0.2611 | 0.9330 |
-| 0.2343 | 7.0 | 294 | 0.2377 | 0.9380 |
-| 0.186 | 8.0 | 336 | 0.2158 | 0.9397 |
-| 0.1984 | 9.0 | 378 | 0.2222 | 0.9347 |
-| 0.1751 | 10.0 | 420 | 0.1993 | 0.9514 |
-| 0.1529 | 11.0 | 462 | 0.2101 | 0.9430 |
-| 0.1616 | 12.0 | 504 | 0.2543 | 0.9296 |
-| 0.1404 | 13.0 | 546 | 0.2029 | 0.9397 |
-| 0.1078 | 14.0 | 588 | 0.2087 | 0.9414 |
-| 0.1109 | 15.0 | 630 | 0.1381 | 0.9615 |
-| 0.1072 | 16.0 | 672 | 0.1895 | 0.9414 |
-| 0.0949 | 17.0 | 714 | 0.1981 | 0.9397 |
-| 0.0908 | 18.0 | 756 | 0.1608 | 0.9581 |
-| 0.0809 | 19.0 | 798 | 0.1764 | 0.9581 |
-| 0.0708 | 20.0 | 840 | 0.1512 | 0.9531 |
-| 0.0757 | 21.0 | 882 | 0.2027 | 0.9481 |
-| 0.0919 | 22.0 | 924 | 0.1487 | 0.9615 |
-| 0.07 | 23.0 | 966 | 0.1667 | 0.9615 |
-| 0.0629 | 24.0 | 1008 | 0.1904 | 0.9531 |
-| 0.0584 | 25.0 | 1050 | 0.1521 | 0.9631 |
-| 0.0666 | 26.0 | 1092 | 0.1326 | 0.9665 |
-| 0.062 | 27.0 | 1134 | 0.1772 | 0.9564 |
-| 0.0568 | 28.0 | 1176 | 0.1465 | 0.9564 |
-| 0.0453 | 29.0 | 1218 | 0.1347 | 0.9682 |
-| 0.0469 | 30.0 | 1260 | 0.1687 | 0.9631 |
-| 0.0541 | 31.0 | 1302 | 0.1390 | 0.9715 |
-| 0.0602 | 32.0 | 1344 | 0.1618 | 0.9615 |
-| 0.0497 | 33.0 | 1386 | 0.1415 | 0.9615 |
-| 0.0493 | 34.0 | 1428 | 0.1521 | 0.9631 |
-| 0.0606 | 35.0 | 1470 | 0.1429 | 0.9698 |
-| 0.0332 | 36.0 | 1512 | 0.1671 | 0.9648 |
-| 0.0432 | 37.0 | 1554 | 0.1441 | 0.9665 |
-| 0.0354 | 38.0 | 1596 | 0.1593 | 0.9682 |
-| 0.0432 | 39.0 | 1638 | 0.1395 | 0.9665 |
-| 0.0363 | 40.0 | 1680 | 0.1092 | 0.9732 |
-| 0.0288 | 41.0 | 1722 | 0.1550 | 0.9665 |
-| 0.0305 | 42.0 | 1764 | 0.1462 | 0.9682 |
-| 0.0326 | 43.0 | 1806 | 0.1343 | 0.9682 |
-| 0.027 | 44.0 | 1848 | 0.1109 | 0.9732 |
-| 0.0233 | 45.0 | 1890 | 0.1315 | 0.9732 |
-| 0.042 | 46.0 | 1932 | 0.1261 | 0.9732 |
-| 0.0251 | 47.0 | 1974 | 0.1320 | 0.9732 |
-| 0.041 | 48.0 | 2016 | 0.1282 | 0.9732 |
-| 0.0445 | 49.0 | 2058 | 0.1296 | 0.9732 |
-| 0.0308 | 50.0 | 2100 | 0.1325 | 0.9732 |
+| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
+|:-------------:|:-------:|:----:|:---------------:|:--------:|
+| No log | 0.9231 | 9 | 1.7346 | 0.2647 |
+| 1.7645 | 1.9487 | 19 | 1.6152 | 0.3088 |
+| 1.661 | 2.9744 | 29 | 1.4663 | 0.4118 |
+| 1.496 | 4.0 | 39 | 1.2989 | 0.4853 |
+| 1.3097 | 4.9231 | 48 | 1.1491 | 0.5588 |
+| 1.091 | 5.9487 | 58 | 0.9933 | 0.7206 |
+| 0.9088 | 6.9744 | 68 | 0.9171 | 0.6985 |
+| 0.7858 | 8.0 | 78 | 0.8301 | 0.7721 |
+| 0.7016 | 8.9231 | 87 | 0.7925 | 0.7353 |
+| 0.6136 | 9.9487 | 97 | 0.6992 | 0.7647 |
+| 0.532 | 10.9744 | 107 | 0.6401 | 0.8309 |
+| 0.5018 | 12.0 | 117 | 0.5787 | 0.8382 |
+| 0.4279 | 12.9231 | 126 | 0.6130 | 0.8088 |
+| 0.4116 | 13.9487 | 136 | 0.5090 | 0.8382 |
+| 0.3848 | 14.9744 | 146 | 0.5165 | 0.8676 |
+| 0.3449 | 16.0 | 156 | 0.4843 | 0.8382 |
+| 0.3008 | 16.9231 | 165 | 0.5460 | 0.8456 |
+| 0.2797 | 17.9487 | 175 | 0.4985 | 0.8309 |
+| 0.2696 | 18.9744 | 185 | 0.5586 | 0.8456 |
+| 0.2633 | 20.0 | 195 | 0.4349 | 0.9044 |
+| 0.2569 | 20.9231 | 204 | 0.4017 | 0.8897 |
+| 0.27 | 21.9487 | 214 | 0.4758 | 0.8603 |
+| 0.2706 | 22.9744 | 224 | 0.4133 | 0.8897 |
+| 0.2211 | 24.0 | 234 | 0.3844 | 0.9118 |
+| 0.1977 | 24.9231 | 243 | 0.3497 | 0.9265 |
+| 0.1969 | 25.9487 | 253 | 0.3736 | 0.9044 |
+| 0.1776 | 26.9744 | 263 | 0.3797 | 0.9044 |
+| 0.1787 | 28.0 | 273 | 0.3949 | 0.8897 |
+| 0.18 | 28.9231 | 282 | 0.3278 | 0.9265 |
+| 0.1797 | 29.9487 | 292 | 0.3615 | 0.9044 |
+| 0.1665 | 30.9744 | 302 | 0.4174 | 0.8603 |
+| 0.163 | 32.0 | 312 | 0.3574 | 0.8971 |
+| 0.1498 | 32.9231 | 321 | 0.3591 | 0.9044 |
+| 0.1405 | 33.9487 | 331 | 0.3017 | 0.9191 |
+| 0.155 | 34.9744 | 341 | 0.3303 | 0.9265 |
+| 0.1519 | 36.0 | 351 | 0.3559 | 0.8971 |
+| 0.1415 | 36.9231 | 360 | 0.2890 | 0.9191 |
+| 0.1256 | 37.9487 | 370 | 0.3445 | 0.8897 |
+| 0.1217 | 38.9744 | 380 | 0.3435 | 0.9118 |
+| 0.1285 | 40.0 | 390 | 0.3025 | 0.9191 |
+| 0.1285 | 40.9231 | 399 | 0.3602 | 0.8824 |
+| 0.1301 | 41.9487 | 409 | 0.3336 | 0.8897 |
+| 0.1243 | 42.9744 | 419 | 0.2825 | 0.9338 |
+| 0.1191 | 44.0 | 429 | 0.2835 | 0.9265 |
+| 0.1221 | 44.9231 | 438 | 0.2724 | 0.9191 |
+| 0.1151 | 45.9487 | 448 | 0.2708 | 0.9191 |
+| 0.1195 | 46.1538 | 450 | 0.2707 | 0.9191 |
 
 
 ### Framework versions
 
-- Transformers 4.44.2
+- Transformers 4.45.2
 - Pytorch 2.4.1+cu121
-- Datasets 3.0.1
-- Tokenizers 0.19.1
+- Datasets 3.0.2
+- Tokenizers 0.20.1
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:5cd05294020150c9dc040c41927590e5f86a46db9571e01d2d4d0a9acc0e4c4c
+oid sha256:5e60d28d46f7a4efbb2d5699e3eb9ec7f91eca0d2d801679f817dfddd9d805a8
 size 343236280
runs/Oct22_18-03-39_549dd11dc1c2/events.out.tfevents.1729620228.549dd11dc1c2.1065.1 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:8bdf5008eab41ae9b6cb41c22a7061de5d9b9926ff6b80875847a109472fbafd
+oid sha256:7f13116da667838d3f5d30d258c1514c553a5452cfbca4917201b84c13b0038f
-size 29401
+size 30078
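The card updated above describes a ViT image classifier fine-tuned for Pap smear images. A minimal inference sketch follows; the repo id `wcosmas/vit-base-patch16-224-in21k-finetuned-papsmear` is an assumption based on the committer name and model name in the card, and `classify_pap_smear` is a hypothetical helper, not part of the repository.

```python
# Hypothetical usage sketch for the checkpoint described in the card above.
# Assumption: the model is published as
# "wcosmas/vit-base-patch16-224-in21k-finetuned-papsmear" on the Hub.

def classify_pap_smear(
    image_path,
    model_id="wcosmas/vit-base-patch16-224-in21k-finetuned-papsmear",
):
    """Classify one image with the fine-tuned ViT checkpoint."""
    # Lazy import so the sketch can be loaded without transformers installed.
    from transformers import pipeline

    # The "image-classification" pipeline wraps image preprocessing
    # (resize to 224x224, normalize) and the ViT forward pass.
    classifier = pipeline("image-classification", model=model_id)
    # Returns a list of {"label": ..., "score": ...} dicts, highest score first.
    return classifier(image_path)

if __name__ == "__main__":
    # Note: first call downloads the weights (size 343236280 bytes per the
    # model.safetensors pointer above).
    print(classify_pap_smear("sample_slide.png"))
```

The lazy import keeps the module importable in environments without `transformers`; in a long-running service you would build the pipeline once and reuse it rather than per call.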