---
language:
- eng
license: wtfpl
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: facebook/dinov2-large
model-index:
- name: dinov2-large-2024_05_23-drone_batch-size512_epochs50_freeze
  results: []
---

DinoVd'eau is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set (a computation sketch follows the list):

- Loss: 0.2361
- F1 Micro: 0.7694
- F1 Macro: 0.4048
- Roc Auc: 0.8448
- Accuracy: 0.1449
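
These are standard multilabel metrics. A minimal sketch of how they can be computed with scikit-learn is shown below; the 0.5 decision threshold is an assumption, and the comparatively low accuracy is consistent with subset (exact-match) accuracy, which counts a sample as correct only when every label matches.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

# Toy multilabel data: rows are samples, columns are classes.
y_true = np.array([[1, 0, 1], [0, 1, 1], [1, 1, 0]])
y_prob = np.array([[0.9, 0.2, 0.7], [0.4, 0.8, 0.6], [0.8, 0.3, 0.1]])
y_pred = (y_prob > 0.5).astype(int)  # assumed 0.5 threshold on sigmoid outputs

print("F1 Micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 Macro:", f1_score(y_true, y_pred, average="macro"))
print("Roc Auc: ", roc_auc_score(y_true, y_prob, average="micro"))
print("Accuracy:", accuracy_score(y_true, y_pred))  # subset accuracy: all labels must match
```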
 
---

# Model description
DinoVd'eau is built on top of the DINOv2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers, as sketched below.
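
A minimal sketch of such a head, assuming the 1024-dimensional embedding of dinov2-large and the 22 classes listed below; the exact layer sizes, ordering, and dropout rate are assumptions, not taken from the training code:

```python
import torch.nn as nn

def make_head(hidden_size: int = 1024, num_classes: int = 22, dropout: float = 0.5) -> nn.Sequential:
    """Hypothetical classification head combining the four layer types named above."""
    return nn.Sequential(
        nn.Linear(hidden_size, hidden_size),  # linear
        nn.BatchNorm1d(hidden_size),          # batch normalization
        nn.ReLU(),                            # ReLU
        nn.Dropout(dropout),                  # dropout
        nn.Linear(hidden_size, num_classes),  # one logit per class for multilabel output
    )
```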

The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)

---

# Intended uses & limitations
You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes and seagrass species.
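
As an illustration, inference could look like the sketch below; the repository id, image file name, and 0.5 threshold are assumptions:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repository id, matching the model name in this card.
model_id = "lombardata/dinov2-large-2024_05_23-drone_batch-size512_epochs50_freeze"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("underwater_image.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel classification: a sigmoid per class, keeping every class above threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs.tolist()) if p > 0.5]
print(predicted)
```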

---

# Training and evaluation data
Details on the number of images for each class are given in the following table:
| Class | train | val | test | Total |
|:-------------------------|--------:|------:|-------:|--------:|
| Acropore_branched | 1575 | 562 | 565 | 2702 |
| Acropore_digitised | 1020 | 356 | 370 | 1746 |
| Acropore_sub_massive | 198 | 56 | 60 | 314 |
| Acropore_tabular | 659 | 248 | 238 | 1145 |
| Algae_assembly | 7175 | 2447 | 2430 | 12052 |
| Algae_drawn_up | 439 | 156 | 156 | 751 |
| Algae_limestone | 4694 | 1576 | 1523 | 7793 |
| Algae_sodding | 7151 | 2460 | 2467 | 12078 |
| Bleached_coral | 352 | 162 | 150 | 664 |
| Dead_coral | 4615 | 1589 | 1553 | 7757 |
| Living_coral | 85 | 37 | 28 | 150 |
| Millepore | 860 | 287 | 313 | 1460 |
| No_acropore_encrusting | 1978 | 675 | 667 | 3320 |
| No_acropore_massive | 4539 | 1613 | 1585 | 7737 |
| No_acropore_sub_massive | 3696 | 1245 | 1252 | 6193 |
| Rock | 10810 | 3735 | 3718 | 18263 |
| Rubble | 9948 | 3429 | 3403 | 16780 |
| Sand | 10976 | 3659 | 3659 | 18294 |
| Sea_urchins | 400 | 147 | 135 | 682 |
| Sponge | 207 | 59 | 56 | 322 |
| Thalassodendron_ciliatum | 216 | 74 | 70 | 360 |
| Useless | 89 | 30 | 30 | 149 |

---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:

- **Number of Epochs**: 50
- **Learning Rate**: 0.001
- **Train Batch Size**: 512
- **Eval Batch Size**: 512
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 (see the sketch after this list)
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
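
A minimal sketch of this configuration in PyTorch; `model`, `train_one_epoch`, and `evaluate` are hypothetical, and the `model.dinov2` attribute name is an assumption:

```python
import torch

# Freeze Encoder: only the classification head receives gradient updates.
for p in model.dinov2.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-3)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode="min", factor=0.1, patience=5)

for epoch in range(50):
    train_one_epoch(model, optimizer)  # hypothetical training step
    val_loss = evaluate(model)         # hypothetical validation step
    scheduler.step(val_loss)           # divides the LR by 10 after 5 epochs without improvement
```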

## Data Augmentation
Data were augmented using the following transformations (a code sketch follows the lists):

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
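
The transform names match Kornia's `kornia.augmentation` API, so the train-time pipeline plausibly resembled the sketch below; the input resolution, jitter strengths, and normalization statistics are assumptions:

```python
import torch
import torch.nn as nn
import kornia.augmentation as K

# Hedged reconstruction of the train transforms listed above, not the exact training code.
train_transforms = nn.Sequential(
    K.Resize((518, 518)),                       # assumed DINOv2 input resolution
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(0.1, 0.1, 0.1, 0.1, p=0.25),  # brightness, contrast, saturation, hue
    K.RandomPerspective(p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),
                std=torch.tensor([0.229, 0.224, 0.225])),
)

batch = torch.rand(4, 3, 640, 640)  # dummy batch of images in [0, 1]
augmented = train_transforms(batch)
```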


## Training results

| Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
|:-----:|:---------------:|:--------:|:--------:|:--------:|:-------------:|
| 1.0 | 0.5952 | 0.0124 | 0.5739 | 0.4067 | 0.001 |
| 2.0 | 0.4730 | 0.0698 | 0.7307 | 0.4368 | 0.001 |
| 3.0 | 0.3240 | 0.1074 | 0.7499 | 0.3770 | 0.001 |
| 4.0 | 0.2770 | 0.1180 | 0.7521 | 0.3710 | 0.001 |
| 5.0 | 0.2588 | 0.1196 | 0.7507 | 0.3715 | 0.001 |
| 6.0 | 0.2533 | 0.1218 | 0.7520 | 0.3630 | 0.001 |
| 7.0 | 0.2513 | 0.1153 | 0.7517 | 0.3646 | 0.001 |
| 8.0 | 0.2508 | 0.1228 | 0.7576 | 0.3894 | 0.001 |
| 9.0 | 0.2479 | 0.1275 | 0.7550 | 0.3829 | 0.001 |
| 10.0 | 0.2481 | 0.1265 | 0.7583 | 0.3797 | 0.001 |
| 11.0 | 0.2467 | 0.1243 | 0.7601 | 0.3964 | 0.001 |
| 12.0 | 0.2460 | 0.1251 | 0.7565 | 0.3733 | 0.001 |
| 13.0 | 0.2456 | 0.1298 | 0.7582 | 0.3862 | 0.001 |
| 14.0 | 0.2465 | 0.1371 | 0.7526 | 0.3708 | 0.001 |
| 15.0 | 0.2452 | 0.1271 | 0.7541 | 0.3795 | 0.001 |
| 16.0 | 0.2437 | 0.1293 | 0.7597 | 0.3904 | 0.001 |
| 17.0 | 0.2447 | 0.1316 | 0.7526 | 0.3854 | 0.001 |
| 18.0 | 0.2454 | 0.1332 | 0.7534 | 0.3578 | 0.001 |
| 19.0 | 0.2441 | 0.1324 | 0.7568 | 0.3694 | 0.001 |
| 20.0 | 0.2454 | 0.1361 | 0.7509 | 0.3768 | 0.001 |
| 21.0 | 0.2438 | 0.1249 | 0.7602 | 0.3896 | 0.001 |
| 22.0 | 0.2419 | 0.1302 | 0.7576 | 0.3716 | 0.001 |
| 23.0 | 0.2435 | 0.1265 | 0.7629 | 0.3880 | 0.001 |
| 24.0 | 0.2413 | 0.1342 | 0.7561 | 0.3897 | 0.001 |
| 25.0 | 0.2419 | 0.1298 | 0.7599 | 0.3827 | 0.001 |
| 26.0 | 0.2438 | 0.1267 | 0.7593 | 0.3971 | 0.001 |
| 27.0 | 0.2418 | 0.1310 | 0.7614 | 0.3838 | 0.001 |
| 28.0 | 0.2432 | 0.1334 | 0.7498 | 0.3793 | 0.001 |
| 29.0 | 0.2420 | 0.1367 | 0.7622 | 0.3960 | 0.001 |
| 30.0 | 0.2407 | 0.1424 | 0.7620 | 0.3860 | 0.001 |
| 31.0 | 0.2422 | 0.1328 | 0.7612 | 0.3929 | 0.001 |
| 32.0 | 0.2430 | 0.1312 | 0.7516 | 0.3912 | 0.001 |
| 33.0 | 0.2414 | 0.1302 | 0.7589 | 0.3884 | 0.001 |
| 34.0 | 0.2404 | 0.1354 | 0.7625 | 0.4037 | 0.001 |
| 35.0 | 0.2413 | 0.1300 | 0.7602 | 0.3973 | 0.001 |
| 36.0 | 0.2419 | 0.1357 | 0.7622 | 0.3876 | 0.001 |
| 37.0 | 0.2399 | 0.1342 | 0.7598 | 0.3992 | 0.001 |
| 38.0 | 0.2400 | 0.1330 | 0.7607 | 0.3933 | 0.001 |
| 39.0 | 0.2409 | 0.1389 | 0.7619 | 0.4008 | 0.001 |
| 40.0 | 0.2399 | 0.1354 | 0.7600 | 0.3925 | 0.001 |
| 41.0 | 0.2423 | 0.1249 | 0.7640 | 0.4061 | 0.001 |
| 42.0 | 0.2426 | 0.1310 | 0.7569 | 0.4005 | 0.001 |
| 43.0 | 0.2392 | 0.1336 | 0.7594 | 0.4008 | 0.001 |
| 44.0 | 0.2418 | 0.1304 | 0.7577 | 0.4064 | 0.001 |
| 45.0 | 0.2411 | 0.1379 | 0.7591 | 0.3906 | 0.001 |
| 46.0 | 0.2396 | 0.1363 | 0.7654 | 0.4106 | 0.001 |
| 47.0 | 0.2396 | 0.1326 | 0.7575 | 0.3968 | 0.001 |
| 48.0 | 0.2423 | 0.1287 | 0.7564 | 0.3878 | 0.001 |
| 49.0 | 0.2398 | 0.1330 | 0.7608 | 0.4027 | 0.001 |
| 50.0 | 0.2367 | 0.1424 | 0.7652 | 0.4087 | 0.0001 |

---

# CO2 Emissions

The estimated CO2 emissions for training this model are documented below:

- **Emissions**: 0.0256 kg of CO2 equivalent
- **Source**: Code Carbon (see the sketch below)
- **Training Type**: fine-tuning
- **Geographical Location**: Brest, France
- **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB
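
A sketch of how such an estimate is typically produced with Code Carbon; `train` is a hypothetical entry point, and the tracker options actually used are not documented here:

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()
tracker.start()
train()                        # hypothetical training entry point
emissions_kg = tracker.stop()  # Code Carbon returns kilograms of CO2-equivalent
print(f"Estimated emissions: {emissions_kg:.4f} kg CO2eq")
```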

---

# Framework Versions

- **Transformers**: 4.41.0
- **Pytorch**: 2.3.0+cu118
- **Datasets**: 2.19.1
- **Tokenizers**: 0.19.1