---
license: apache-2.0
base_model: facebook/dinov2-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dino-base-2023_11_17-original_head
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dino-base-2023_11_17-original_head

This model is a fine-tuned version of [facebook/dinov2-base](https://huggingface.co/facebook/dinov2-base) on an unspecified dataset.
It achieves the following results on the evaluation set (see the metric sketch after this list):
- Loss: 0.1307
- F1 Micro: 0.8332
- F1 Macro: 0.7987
- ROC AUC: 0.8961
- Accuracy: 0.5248
- Learning Rate: 0.0001

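The combination of F1 micro/macro, ROC AUC, and a much lower accuracy is characteristic of multi-label classification, where accuracy is the strict subset accuracy (every label must match). A minimal sketch of how such metrics are commonly computed with scikit-learn; the sigmoid activation, the 0.5 threshold, and the micro averaging for ROC AUC are assumptions, not details taken from the training code:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def multilabel_metrics(logits: np.ndarray, labels: np.ndarray, threshold: float = 0.5) -> dict:
    """Illustrative multi-label metrics over per-class logits.

    logits: (n_samples, n_classes) raw model outputs.
    labels: (n_samples, n_classes) binary ground-truth matrix.
    The 0.5 threshold is an assumed default, not taken from this model's code.
    """
    probs = 1.0 / (1.0 + np.exp(-logits))        # sigmoid: each class scored independently
    preds = (probs >= threshold).astype(int)     # binarize per class
    return {
        "f1_micro": f1_score(labels, preds, average="micro"),
        "f1_macro": f1_score(labels, preds, average="macro"),
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        # subset accuracy: a sample counts only if *all* its labels are correct
        "accuracy": accuracy_score(labels, preds),
    }
```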
## Model description

More information needed

## Intended uses & limitations

More information needed

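Pending details from the author, a minimal inference sketch with 🤗 Transformers is given below. The repository id is inferred from the model name above, the input image path is a placeholder, and the multi-label sigmoid post-processing is an assumption based on the metrics reported in this card:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repository id assumed from the model name in this card.
repo = "lombardata/dino-base-2023_11_17-original_head"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label reading of the outputs (assumption): sigmoid + 0.5 threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```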
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 0.01
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

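For readers who want to reproduce this setup, a sketch of how the list above maps onto 🤗 Transformers `TrainingArguments` (argument names follow the 4.34 API shipped with this model). Only the values listed above come from this card; `output_dir` and the per-epoch evaluation cadence are placeholders:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; output_dir and evaluation_strategy
# are illustrative assumptions, not taken from the original training script.
training_args = TrainingArguments(
    output_dir="dino-base-2023_11_17-original_head",
    learning_rate=1e-2,                # initial learning rate
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",       # assumption: the results table is per epoch
)
```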
### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1 Micro | F1 Macro | ROC AUC | Accuracy | Learning Rate |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
| 0.4512        | 1.0   | 536   | 0.5206          | 0.7341   | 0.6738   | 0.8351  | 0.3994   | 0.01          |
| 0.4039        | 2.0   | 1072  | 0.3435          | 0.7977   | 0.7470   | 0.8819  | 0.4630   | 0.01          |
| 0.4097        | 3.0   | 1608  | 0.4639          | 0.7760   | 0.7255   | 0.8642  | 0.4237   | 0.01          |
| 0.3939        | 4.0   | 2144  | 0.3950          | 0.7937   | 0.7209   | 0.8757  | 0.4627   | 0.01          |
| 0.373         | 5.0   | 2680  | 0.4402          | 0.7570   | 0.7277   | 0.8610  | 0.4205   | 0.01          |
| 0.3838        | 6.0   | 3216  | 0.5527          | 0.7291   | 0.6525   | 0.8146  | 0.4101   | 0.01          |
| 0.3668        | 7.0   | 3752  | 0.4480          | 0.7590   | 0.7117   | 0.8412  | 0.4287   | 0.01          |
| 0.358         | 8.0   | 4288  | 0.4486          | 0.7743   | 0.7346   | 0.8785  | 0.4094   | 0.01          |
| 0.2861        | 9.0   | 4824  | 0.2277          | 0.8197   | 0.7896   | 0.8881  | 0.5002   | 0.001         |
| 0.1352        | 10.0  | 5360  | 0.2217          | 0.8174   | 0.7894   | 0.8949  | 0.4916   | 0.001         |
| 0.1151        | 11.0  | 5896  | 0.2070          | 0.8171   | 0.7840   | 0.8829  | 0.5091   | 0.001         |
| 0.106         | 12.0  | 6432  | 0.1962          | 0.8204   | 0.7974   | 0.9027  | 0.4995   | 0.001         |
| 0.1018        | 13.0  | 6968  | 0.1928          | 0.8178   | 0.7898   | 0.8933  | 0.4905   | 0.001         |
| 0.0925        | 14.0  | 7504  | 0.1798          | 0.8245   | 0.7847   | 0.8949  | 0.5002   | 0.001         |
| 0.0902        | 15.0  | 8040  | 0.1771          | 0.8159   | 0.7764   | 0.8798  | 0.5095   | 0.001         |
| 0.0871        | 16.0  | 8576  | 0.1733          | 0.8170   | 0.7821   | 0.8875  | 0.5055   | 0.001         |
| 0.084         | 17.0  | 9112  | 0.1710          | 0.8228   | 0.7924   | 0.9015  | 0.4930   | 0.001         |
| 0.0853        | 18.0  | 9648  | 0.1692          | 0.8218   | 0.7850   | 0.8905  | 0.4952   | 0.001         |
| 0.0841        | 19.0  | 10184 | 0.1660          | 0.8179   | 0.7836   | 0.8945  | 0.4945   | 0.001         |
| 0.0821        | 20.0  | 10720 | 0.1736          | 0.8107   | 0.7817   | 0.8831  | 0.4912   | 0.001         |
| 0.083         | 21.0  | 11256 | 0.1595          | 0.8178   | 0.7888   | 0.8955  | 0.4980   | 0.001         |
| 0.0801        | 22.0  | 11792 | 0.1613          | 0.8226   | 0.7895   | 0.8997  | 0.4991   | 0.001         |
| 0.0815        | 23.0  | 12328 | 0.1583          | 0.8177   | 0.7862   | 0.8899  | 0.5080   | 0.001         |
| 0.0822        | 24.0  | 12864 | 0.1555          | 0.8202   | 0.7782   | 0.8822  | 0.5134   | 0.001         |
| 0.0793        | 25.0  | 13400 | 0.1554          | 0.8207   | 0.7883   | 0.8986  | 0.5023   | 0.001         |
| 0.0788        | 26.0  | 13936 | 0.1543          | 0.8147   | 0.7822   | 0.8831  | 0.5016   | 0.001         |
| 0.0797        | 27.0  | 14472 | 0.1511          | 0.8230   | 0.7831   | 0.8886  | 0.5116   | 0.001         |
| 0.0795        | 28.0  | 15008 | 0.1510          | 0.8197   | 0.7831   | 0.8860  | 0.5038   | 0.001         |
| 0.079         | 29.0  | 15544 | 0.1465          | 0.8225   | 0.7879   | 0.8844  | 0.5120   | 0.001         |
| 0.0802        | 30.0  | 16080 | 0.1473          | 0.8229   | 0.7885   | 0.8966  | 0.5030   | 0.001         |
| 0.0786        | 31.0  | 16616 | 0.1627          | 0.8000   | 0.7544   | 0.8594  | 0.4955   | 0.001         |
| 0.0806        | 32.0  | 17152 | 0.1465          | 0.8221   | 0.7911   | 0.8916  | 0.4970   | 0.001         |
| 0.0776        | 33.0  | 17688 | 0.1477          | 0.8230   | 0.7925   | 0.9010  | 0.4998   | 0.001         |
| 0.0801        | 34.0  | 18224 | 0.1436          | 0.8221   | 0.7891   | 0.8961  | 0.5041   | 0.001         |
| 0.0797        | 35.0  | 18760 | 0.1497          | 0.8198   | 0.7843   | 0.8900  | 0.4905   | 0.001         |
| 0.0781        | 36.0  | 19296 | 0.1407          | 0.8254   | 0.7936   | 0.8924  | 0.5098   | 0.001         |
| 0.079         | 37.0  | 19832 | 0.1465          | 0.8229   | 0.7735   | 0.8898  | 0.5152   | 0.001         |
| 0.082         | 38.0  | 20368 | 0.1536          | 0.8102   | 0.7882   | 0.8861  | 0.4855   | 0.001         |
| 0.0781        | 39.0  | 20904 | 0.1463          | 0.8200   | 0.7856   | 0.8917  | 0.5052   | 0.001         |
| 0.0811        | 40.0  | 21440 | 0.1465          | 0.8159   | 0.7798   | 0.8885  | 0.5016   | 0.001         |
| 0.0786        | 41.0  | 21976 | 0.1521          | 0.8154   | 0.7669   | 0.8864  | 0.5027   | 0.001         |
| 0.0775        | 42.0  | 22512 | 0.1418          | 0.8256   | 0.7908   | 0.8961  | 0.5127   | 0.001         |
| 0.0641        | 43.0  | 23048 | 0.1318          | 0.8344   | 0.7996   | 0.8963  | 0.5259   | 0.0001        |
| 0.0633        | 44.0  | 23584 | 0.1312          | 0.8329   | 0.7964   | 0.8931  | 0.5313   | 0.0001        |
| 0.0627        | 45.0  | 24120 | 0.1313          | 0.8327   | 0.7981   | 0.8957  | 0.5277   | 0.0001        |
| 0.0627        | 46.0  | 24656 | 0.1307          | 0.8332   | 0.8015   | 0.8960  | 0.5270   | 0.0001        |
| 0.0619        | 47.0  | 25192 | 0.1307          | 0.8337   | 0.7990   | 0.8959  | 0.5273   | 0.0001        |
| 0.0626        | 48.0  | 25728 | 0.1309          | 0.8333   | 0.8008   | 0.8967  | 0.5234   | 0.0001        |
| 0.063         | 49.0  | 26264 | 0.1310          | 0.8324   | 0.7976   | 0.8954  | 0.5213   | 0.0001        |
| 0.0623        | 50.0  | 26800 | 0.1307          | 0.8332   | 0.7987   | 0.8961  | 0.5248   | 0.0001        |

### Framework versions

- Transformers 4.34.1
- PyTorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1