lombardata committed on
Commit
d06839c
1 Parent(s): e29a34a

Model save

README.md ADDED
@@ -0,0 +1,114 @@
---
license: apache-2.0
base_model: facebook/dinov2-large
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dinov2-large-2024_05_23-drone_batch-size512_epochs50_freeze
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dinov2-large-2024_05_23-drone_batch-size512_epochs50_freeze

This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on an unspecified dataset.
It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):
- Loss: 0.2367
- F1 Micro: 0.7652
- F1 Macro: 0.4087
- Roc Auc: 0.8436
- Accuracy: 0.1424
- Learning Rate: 0.0001

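This metric pattern (micro F1 well above accuracy) is characteristic of multi-label classification, where "Accuracy" is subset accuracy: a sample counts as correct only if every label matches. The sketch below shows how such a metric set can be computed with scikit-learn; the sigmoid outputs, the 0.5 threshold, and the multi-label setup are assumptions, since the card does not document the evaluation code.

```python
# Hedged sketch: computing this card's metric set for a multi-label classifier.
# The 0.5 decision threshold and micro-averaged ROC AUC are assumptions.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

probs = np.array([[0.9, 0.2, 0.7],   # sigmoid outputs, shape (n_samples, n_labels)
                  [0.1, 0.8, 0.4]])
y_true = np.array([[1, 0, 1],
                   [0, 1, 1]])
y_pred = (probs >= 0.5).astype(int)  # threshold probabilities into 0/1 labels

print("F1 Micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 Macro:", f1_score(y_true, y_pred, average="macro"))
print("Roc Auc: ", roc_auc_score(y_true, probs, average="micro"))
print("Accuracy:", accuracy_score(y_true, y_pred))  # subset accuracy: all labels must match
```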

## Model description

More information needed

## Intended uses & limitations

More information needed
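Since the card leaves usage undocumented, here is a minimal inference sketch. The repo id, the image path, and the sigmoid readout with a 0.5 threshold are assumptions (multi-label behaviour is inferred from the metrics above); verify against the checkpoint's config before relying on it.

```python
# Hedged inference sketch; the repo id, image path, and multi-label sigmoid
# readout are assumptions, not confirmed by this card.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "lombardata/dinov2-large-2024_05_23-drone_batch-size512_epochs50_freeze"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("drone_image.jpg")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.sigmoid(logits)[0]  # independent per-label probabilities
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p >= 0.5]
print(predicted)
```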

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` reconstruction follows the list):
- learning_rate: 0.001
- train_batch_size: 512
- eval_batch_size: 512
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
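These hyperparameters map naturally onto `transformers.TrainingArguments`. The sketch below is a hedged reconstruction, not the actual training script: the label count, the multi-label problem type, and the frozen backbone (suggested only by the `_freeze` suffix in the model name) are assumptions, and auto-generated cards do not say whether `train_batch_size` is per device or total.

```python
# Hedged reconstruction of the training setup from the hyperparameters above.
from transformers import AutoModelForImageClassification, TrainingArguments

model = AutoModelForImageClassification.from_pretrained(
    "facebook/dinov2-large",
    num_labels=8,  # placeholder: the real label count is not documented
    problem_type="multi_label_classification",  # assumption, based on the metrics
)
for param in model.dinov2.parameters():  # freeze the backbone (assumption from "_freeze")
    param.requires_grad = False

args = TrainingArguments(
    output_dir="dinov2-large-2024_05_23-drone_batch-size512_epochs50_freeze",
    learning_rate=1e-3,
    per_device_train_batch_size=512,  # the card may report total rather than per-device size
    per_device_eval_batch_size=512,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    fp16=True,  # "Native AMP" mixed precision
)
```

The Adam betas and epsilon listed in the card match the `Trainer` defaults, so they need no explicit arguments here.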

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Learning Rate |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
| No log | 1.0 | 28 | 0.5952 | 0.5739 | 0.4067 | 0.7528 | 0.0124 | 0.001 |
| No log | 2.0 | 56 | 0.4730 | 0.7307 | 0.4368 | 0.8401 | 0.0698 | 0.001 |
| No log | 3.0 | 84 | 0.3240 | 0.7499 | 0.3770 | 0.8378 | 0.1074 | 0.001 |
| No log | 4.0 | 112 | 0.2770 | 0.7521 | 0.3710 | 0.8372 | 0.1180 | 0.001 |
| No log | 5.0 | 140 | 0.2588 | 0.7507 | 0.3715 | 0.8353 | 0.1196 | 0.001 |
| No log | 6.0 | 168 | 0.2533 | 0.7520 | 0.3630 | 0.8354 | 0.1218 | 0.001 |
| No log | 7.0 | 196 | 0.2513 | 0.7517 | 0.3646 | 0.8347 | 0.1153 | 0.001 |
| No log | 8.0 | 224 | 0.2508 | 0.7576 | 0.3894 | 0.8407 | 0.1228 | 0.001 |
| No log | 9.0 | 252 | 0.2479 | 0.7550 | 0.3829 | 0.8360 | 0.1275 | 0.001 |
| No log | 10.0 | 280 | 0.2481 | 0.7583 | 0.3797 | 0.8407 | 0.1265 | 0.001 |
| No log | 11.0 | 308 | 0.2467 | 0.7601 | 0.3964 | 0.8431 | 0.1243 | 0.001 |
| No log | 12.0 | 336 | 0.2460 | 0.7565 | 0.3733 | 0.8362 | 0.1251 | 0.001 |
| No log | 13.0 | 364 | 0.2456 | 0.7582 | 0.3862 | 0.8399 | 0.1298 | 0.001 |
| No log | 14.0 | 392 | 0.2465 | 0.7526 | 0.3708 | 0.8323 | 0.1371 | 0.001 |
| No log | 15.0 | 420 | 0.2452 | 0.7541 | 0.3795 | 0.8344 | 0.1271 | 0.001 |
| No log | 16.0 | 448 | 0.2437 | 0.7597 | 0.3904 | 0.8409 | 0.1293 | 0.001 |
| No log | 17.0 | 476 | 0.2447 | 0.7526 | 0.3854 | 0.8317 | 0.1316 | 0.001 |
| 0.3126 | 18.0 | 504 | 0.2454 | 0.7534 | 0.3578 | 0.8326 | 0.1332 | 0.001 |
| 0.3126 | 19.0 | 532 | 0.2441 | 0.7568 | 0.3694 | 0.8367 | 0.1324 | 0.001 |
| 0.3126 | 20.0 | 560 | 0.2454 | 0.7509 | 0.3768 | 0.8288 | 0.1361 | 0.001 |
| 0.3126 | 21.0 | 588 | 0.2438 | 0.7602 | 0.3896 | 0.8416 | 0.1249 | 0.001 |
| 0.3126 | 22.0 | 616 | 0.2419 | 0.7576 | 0.3716 | 0.8368 | 0.1302 | 0.001 |
| 0.3126 | 23.0 | 644 | 0.2435 | 0.7629 | 0.3880 | 0.8454 | 0.1265 | 0.001 |
| 0.3126 | 24.0 | 672 | 0.2413 | 0.7561 | 0.3897 | 0.8344 | 0.1342 | 0.001 |
| 0.3126 | 25.0 | 700 | 0.2419 | 0.7599 | 0.3827 | 0.8415 | 0.1298 | 0.001 |
| 0.3126 | 26.0 | 728 | 0.2438 | 0.7593 | 0.3971 | 0.8401 | 0.1267 | 0.001 |
| 0.3126 | 27.0 | 756 | 0.2418 | 0.7614 | 0.3838 | 0.8422 | 0.1310 | 0.001 |
| 0.3126 | 28.0 | 784 | 0.2432 | 0.7498 | 0.3793 | 0.8275 | 0.1334 | 0.001 |
| 0.3126 | 29.0 | 812 | 0.2420 | 0.7622 | 0.3960 | 0.8436 | 0.1367 | 0.001 |
| 0.3126 | 30.0 | 840 | 0.2407 | 0.7620 | 0.3860 | 0.8404 | 0.1424 | 0.001 |
| 0.3126 | 31.0 | 868 | 0.2422 | 0.7612 | 0.3929 | 0.8429 | 0.1328 | 0.001 |
| 0.3126 | 32.0 | 896 | 0.2430 | 0.7516 | 0.3912 | 0.8298 | 0.1312 | 0.001 |
| 0.3126 | 33.0 | 924 | 0.2414 | 0.7589 | 0.3884 | 0.8388 | 0.1302 | 0.001 |
| 0.3126 | 34.0 | 952 | 0.2404 | 0.7625 | 0.4037 | 0.8419 | 0.1354 | 0.001 |
| 0.3126 | 35.0 | 980 | 0.2413 | 0.7602 | 0.3973 | 0.8400 | 0.1300 | 0.001 |
| 0.2465 | 36.0 | 1008 | 0.2419 | 0.7622 | 0.3876 | 0.8436 | 0.1357 | 0.001 |
| 0.2465 | 37.0 | 1036 | 0.2399 | 0.7598 | 0.3992 | 0.8381 | 0.1342 | 0.001 |
| 0.2465 | 38.0 | 1064 | 0.2400 | 0.7607 | 0.3933 | 0.8397 | 0.1330 | 0.001 |
| 0.2465 | 39.0 | 1092 | 0.2409 | 0.7619 | 0.4008 | 0.8412 | 0.1389 | 0.001 |
| 0.2465 | 40.0 | 1120 | 0.2399 | 0.7600 | 0.3925 | 0.8378 | 0.1354 | 0.001 |
| 0.2465 | 41.0 | 1148 | 0.2423 | 0.7640 | 0.4061 | 0.8464 | 0.1249 | 0.001 |
| 0.2465 | 42.0 | 1176 | 0.2426 | 0.7569 | 0.4005 | 0.8378 | 0.1310 | 0.001 |
| 0.2465 | 43.0 | 1204 | 0.2392 | 0.7594 | 0.4008 | 0.8369 | 0.1336 | 0.001 |
| 0.2465 | 44.0 | 1232 | 0.2418 | 0.7577 | 0.4064 | 0.8365 | 0.1304 | 0.001 |
| 0.2465 | 45.0 | 1260 | 0.2411 | 0.7591 | 0.3906 | 0.8384 | 0.1379 | 0.001 |
| 0.2465 | 46.0 | 1288 | 0.2396 | 0.7654 | 0.4106 | 0.8457 | 0.1363 | 0.001 |
| 0.2465 | 47.0 | 1316 | 0.2396 | 0.7575 | 0.3968 | 0.8349 | 0.1326 | 0.001 |
| 0.2465 | 48.0 | 1344 | 0.2423 | 0.7564 | 0.3878 | 0.8373 | 0.1287 | 0.001 |
| 0.2465 | 49.0 | 1372 | 0.2398 | 0.7608 | 0.4027 | 0.8390 | 0.1330 | 0.001 |
| 0.2465 | 50.0 | 1400 | 0.2367 | 0.7652 | 0.4087 | 0.8436 | 0.1424 | 0.0001 |

### Framework versions

- Transformers 4.41.0
- Pytorch 2.3.0+cu118
- Datasets 2.19.1
- Tokenizers 0.19.1
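To reduce surprises when reloading the checkpoint, the pinned versions above can be checked at runtime; a minimal sketch (version strings copied from the list) follows.

```python
# Minimal sketch: warn when the runtime diverges from the versions listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": (transformers.__version__, "4.41.0"),
    "torch": (torch.__version__, "2.3.0+cu118"),
    "datasets": (datasets.__version__, "2.19.1"),
    "tokenizers": (tokenizers.__version__, "0.19.1"),
}
for name, (found, wanted) in expected.items():
    if found != wanted:
        print(f"warning: {name} {found} != {wanted} used for training")
```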

logs/events.out.tfevents.1716443927.u22redoip03 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:9236d9189b81bc70d7118f3454ca2d309f7af1f7c0b5af5b071526ff6674e85b
-size 32806
+oid sha256:24ae2b1af1cb9b09f73175f675d96d419f500978520aa73205ab81153ba85556
+size 34290

model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f72aba36b3ae35da679988a94a9bbe37c79791e61f18819293924449b0c83ea5
+oid sha256:bbf57be4831d8e5ad0a2f6f713d7ce8de8a54958829b4c0f9967f23a814bfad7
 size 1222977224