lombardata committed on
Commit
d95f05a
1 Parent(s): fb60233

Model save

Files changed (2)
  1. README.md +113 -0
  2. pytorch_model.bin +1 -1
README.md ADDED
@@ -0,0 +1,113 @@
---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dino-large-2023_12_08-with_custom_head-imgsize1036
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dino-large-2023_12_08-with_custom_head-imgsize1036

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1054
- F1 Micro: 0.8508
- F1 Macro: 0.8102
- ROC AUC: 0.8956
- Accuracy: 0.5720
- Learning Rate: 0.0001

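The mix of micro/macro F1, ROC AUC, and a markedly lower exact-match accuracy suggests a multi-label classification setup. Under that assumption, the sketch below shows how such metrics are typically computed with scikit-learn; the 0.5 decision threshold, the array names, and the `compute_metrics` helper are illustrative, not taken from the training code.

```python
# Hypothetical sketch of the evaluation metrics for a multi-label classifier.
# The 0.5 threshold and the function/variable names are assumptions.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def compute_metrics(logits: np.ndarray, labels: np.ndarray) -> dict:
    probs = 1.0 / (1.0 + np.exp(-logits))  # sigmoid over per-label logits
    preds = (probs >= 0.5).astype(int)     # binarize at an assumed 0.5 threshold
    return {
        "f1_micro": f1_score(labels, preds, average="micro"),
        "f1_macro": f1_score(labels, preds, average="macro"),
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        "accuracy": accuracy_score(labels, preds),  # exact-match (subset) accuracy
    }
```

With a binary indicator matrix, `accuracy_score` counts only rows where every label is predicted correctly, which would explain why Accuracy sits well below F1 Micro here.
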
## Model description

More information needed

## Intended uses & limitations

More information needed

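As a starting point, the sketch below runs inference against the `facebook/dinov2-large` backbone this checkpoint was fine-tuned from. The custom classification head referenced in the model name is not a stock `transformers` class, so the linear head, the label count, and the image path are hypothetical placeholders, and the 1036 px input size is only inferred from the `imgsize1036` suffix.

```python
# Hypothetical sketch: DINOv2 feature extraction plus a placeholder head.
# The real checkpoint uses a custom head whose architecture is not documented here.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModel

processor = AutoImageProcessor.from_pretrained("facebook/dinov2-large")
backbone = AutoModel.from_pretrained("facebook/dinov2-large")
backbone.eval()

image = Image.open("example.jpg")  # placeholder path
# Training reportedly used ~1036 px inputs (per the model name); the processor's
# default resize/crop settings are kept here for simplicity.
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    features = backbone(**inputs).pooler_output  # (1, 1024) for dinov2-large

num_labels = 10  # hypothetical; replace with the real label count
head = torch.nn.Linear(features.shape[-1], num_labels)  # stand-in for the custom head
probs = torch.sigmoid(head(features))  # multi-label style scores (assumption)
```
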
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.01
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

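For reference, a minimal sketch of how the values above map onto `TrainingArguments`; the `output_dir` and the per-epoch evaluation/save strategies are assumptions, and the Adam betas/epsilon listed above are simply the optimizer defaults.

```python
# Sketch only: the hyperparameters above expressed as TrainingArguments.
# output_dir and the evaluation/save strategies are assumptions, not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dino-large-2023_12_08-with_custom_head-imgsize1036",
    learning_rate=1e-2,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumption: one evaluation per epoch, as in the table below
    save_strategy="epoch",        # assumption
)
```

Note that the learning-rate column in the results table drops in steps (0.01 → 0.001 → 0.0001) rather than decaying smoothly, so the effective schedule may not have been a plain linear decay.
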
### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | ROC AUC | Accuracy | Learning Rate |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
| 0.2502 | 1.0 | 536 | 0.1990 | 0.6427 | 0.5121 | 0.7510 | 0.4030 | 0.01 |
| 0.2164 | 2.0 | 1072 | 1.1451 | 0.6769 | 0.6296 | 0.7895 | 0.4048 | 0.01 |
| 0.2149 | 3.0 | 1608 | 0.2339 | 0.6516 | 0.5251 | 0.7524 | 0.3994 | 0.01 |
| 0.2159 | 4.0 | 2144 | 0.1699 | 0.7376 | 0.6028 | 0.8297 | 0.4116 | 0.01 |
| 0.2171 | 5.0 | 2680 | 0.1650 | 0.7364 | 0.6304 | 0.8166 | 0.4394 | 0.01 |
| 0.2166 | 6.0 | 3216 | 0.1748 | 0.6853 | 0.5123 | 0.7773 | 0.3923 | 0.01 |
| 0.2081 | 7.0 | 3752 | 0.1636 | 0.7455 | 0.6129 | 0.8291 | 0.4348 | 0.01 |
| 0.2111 | 8.0 | 4288 | 0.1651 | 0.7543 | 0.6445 | 0.8476 | 0.4277 | 0.01 |
| 0.209 | 9.0 | 4824 | 0.1750 | 0.7062 | 0.6522 | 0.7966 | 0.4359 | 0.01 |
| 0.2107 | 10.0 | 5360 | 0.1751 | 0.7244 | 0.5924 | 0.8146 | 0.3830 | 0.01 |
| 0.2162 | 11.0 | 5896 | 0.2229 | 0.7506 | 0.6780 | 0.8475 | 0.4252 | 0.01 |
| 0.2153 | 12.0 | 6432 | 0.1740 | 0.7501 | 0.6543 | 0.8550 | 0.4105 | 0.01 |
| 0.2197 | 13.0 | 6968 | 0.1745 | 0.7487 | 0.6605 | 0.8572 | 0.4187 | 0.01 |
| 0.18 | 14.0 | 7504 | 0.1348 | 0.8036 | 0.7455 | 0.8731 | 0.5059 | 0.001 |
| 0.164 | 15.0 | 8040 | 0.1308 | 0.8160 | 0.7783 | 0.8844 | 0.5173 | 0.001 |
| 0.162 | 16.0 | 8576 | 0.1305 | 0.8188 | 0.7530 | 0.8764 | 0.5202 | 0.001 |
| 0.1548 | 17.0 | 9112 | 0.1242 | 0.8291 | 0.7887 | 0.8945 | 0.5248 | 0.001 |
| 0.1532 | 18.0 | 9648 | 0.1247 | 0.8292 | 0.7823 | 0.8934 | 0.5227 | 0.001 |
| 0.152 | 19.0 | 10184 | 0.1272 | 0.8238 | 0.7688 | 0.8832 | 0.5280 | 0.001 |
| 0.1479 | 20.0 | 10720 | 0.1239 | 0.8280 | 0.7783 | 0.8834 | 0.5288 | 0.001 |
| 0.1483 | 21.0 | 11256 | 0.1376 | 0.8361 | 0.7914 | 0.8919 | 0.5341 | 0.001 |
| 0.1448 | 22.0 | 11792 | 0.1267 | 0.8292 | 0.7774 | 0.8842 | 0.5380 | 0.001 |
| 0.1456 | 23.0 | 12328 | 0.1217 | 0.8334 | 0.7914 | 0.8883 | 0.5448 | 0.001 |
| 0.1441 | 24.0 | 12864 | 0.1193 | 0.8283 | 0.7852 | 0.8801 | 0.5380 | 0.001 |
| 0.1406 | 25.0 | 13400 | 0.1185 | 0.8392 | 0.8020 | 0.8988 | 0.5341 | 0.001 |
| 0.1416 | 26.0 | 13936 | 0.1295 | 0.8351 | 0.7851 | 0.8889 | 0.5441 | 0.001 |
| 0.1417 | 27.0 | 14472 | 0.1390 | 0.8287 | 0.7699 | 0.8808 | 0.5305 | 0.001 |
| 0.142 | 28.0 | 15008 | 0.1256 | 0.8328 | 0.7857 | 0.8888 | 0.5441 | 0.001 |
| 0.14 | 29.0 | 15544 | 0.1268 | 0.8291 | 0.7759 | 0.8815 | 0.5359 | 0.001 |
| 0.1415 | 30.0 | 16080 | 0.1374 | 0.8240 | 0.7675 | 0.8722 | 0.5420 | 0.001 |
| 0.1414 | 31.0 | 16616 | 0.1281 | 0.8310 | 0.7795 | 0.8838 | 0.5406 | 0.001 |
| 0.1349 | 32.0 | 17152 | 0.1144 | 0.8389 | 0.7927 | 0.8892 | 0.5513 | 0.0001 |
| 0.1294 | 33.0 | 17688 | 0.1097 | 0.8414 | 0.7991 | 0.8915 | 0.5534 | 0.0001 |
| 0.1281 | 34.0 | 18224 | 0.1160 | 0.8425 | 0.7982 | 0.8925 | 0.5520 | 0.0001 |
| 0.1274 | 35.0 | 18760 | 0.1244 | 0.8441 | 0.7999 | 0.8935 | 0.5577 | 0.0001 |
| 0.1243 | 36.0 | 19296 | 0.1100 | 0.8434 | 0.7991 | 0.8898 | 0.5559 | 0.0001 |
| 0.1231 | 37.0 | 19832 | 0.1073 | 0.8485 | 0.8086 | 0.8989 | 0.5641 | 0.0001 |
| 0.1245 | 38.0 | 20368 | 0.1092 | 0.8456 | 0.8054 | 0.8916 | 0.5602 | 0.0001 |
| 0.1197 | 39.0 | 20904 | 0.1069 | 0.8483 | 0.8112 | 0.9002 | 0.5623 | 0.0001 |
| 0.1242 | 40.0 | 21440 | 0.1065 | 0.8468 | 0.8081 | 0.8949 | 0.5638 | 0.0001 |
| 0.1167 | 41.0 | 21976 | 0.1083 | 0.8462 | 0.8043 | 0.8934 | 0.5591 | 0.0001 |
| 0.1179 | 42.0 | 22512 | 0.1072 | 0.8505 | 0.8090 | 0.8978 | 0.5691 | 0.0001 |
| 0.1186 | 43.0 | 23048 | 0.1179 | 0.8483 | 0.8072 | 0.8945 | 0.5613 | 0.0001 |
| 0.1174 | 44.0 | 23584 | 0.1068 | 0.8477 | 0.8080 | 0.8946 | 0.5656 | 0.0001 |
| 0.1153 | 45.0 | 24120 | 0.1047 | 0.8534 | 0.8193 | 0.9025 | 0.5716 | 0.0001 |
| 0.1167 | 46.0 | 24656 | 0.1062 | 0.8535 | 0.8229 | 0.9080 | 0.5666 | 0.0001 |
| 0.1162 | 47.0 | 25192 | 0.1060 | 0.8522 | 0.8180 | 0.9006 | 0.5695 | 0.0001 |
| 0.1145 | 48.0 | 25728 | 0.1041 | 0.8529 | 0.8154 | 0.9002 | 0.5745 | 0.0001 |
| 0.1143 | 49.0 | 26264 | 0.1033 | 0.8542 | 0.8182 | 0.9043 | 0.5652 | 0.0001 |
| 0.1129 | 50.0 | 26800 | 0.1054 | 0.8508 | 0.8102 | 0.8956 | 0.5720 | 0.0001 |


### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:1350bc7bea4919d8f76536d33102625efa4a239b4cde720f127616a1cbaf19b3
+oid sha256:99b1a3cab590da814efabf2ed86688ab6d0de3585ac27a4ffbfcd884ffaf1914
 size 1228188838