
DinoVd'eau is a fine-tuned version of facebook/dinov2-large. It achieves the following results on the test set:

  • Loss: 0.1215
  • F1 Micro: 0.8204
  • F1 Macro: 0.7157
  • ROC AUC: 0.8823
  • Accuracy: 0.3104
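
Note that the accuracy above is much lower than the F1 scores, which suggests it is exact-match (subset) accuracy: an image counts as correct only if all of its labels are predicted correctly. A minimal sketch of computing these multilabel metrics with scikit-learn, assuming sigmoid outputs and a 0.5 decision threshold (the threshold and the dummy data are assumptions):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

# y_true: binary ground-truth matrix, y_prob: predicted probabilities,
# both of shape (num_images, num_classes). Dummy data for illustration.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=(100, 31))
y_prob = rng.random(size=(100, 31))

y_pred = (y_prob >= 0.5).astype(int)  # 0.5 threshold is an assumption

print("F1 Micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 Macro:", f1_score(y_true, y_pred, average="macro"))
print("ROC AUC :", roc_auc_score(y_true, y_prob, average="macro"))
print("Accuracy:", accuracy_score(y_true, y_pred))  # exact-match (subset) accuracy
```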

Model description

DinoVd'eau is built on top of the DINOv2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
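
A minimal sketch of such a head, assuming the 1024-dimensional dinov2-large embedding as input; the hidden width, dropout rate, and exact layer ordering are illustrative assumptions, not the published architecture:

```python
import torch.nn as nn

NUM_CLASSES = 31  # number of classes listed in the table below

# Hypothetical head: linear -> batch norm -> ReLU -> dropout -> linear.
classifier = nn.Sequential(
    nn.Linear(1024, 512),         # dinov2-large produces 1024-dim embeddings
    nn.BatchNorm1d(512),
    nn.ReLU(),
    nn.Dropout(p=0.1),            # dropout rate is an assumption
    nn.Linear(512, NUM_CLASSES),  # one logit per label; sigmoid at inference
)
```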

The source code for training the model can be found in this Git repository.


Intended uses & limitations

You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
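
A minimal usage sketch, assuming the checkpoint loads through the standard transformers auto classes and that a 0.5 sigmoid threshold is reasonable (both are assumptions; see the linked repository for the exact loading code):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "lombardata/DinoVdeau-large-2024_07_22-batch-size32_epochs150_freeze"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("reef_photo.jpg")  # hypothetical underwater image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel prediction: sigmoid per class, keep labels above the threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p >= 0.5]
print(predicted)
```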


Training and evaluation data

Details on the number of images for each class are given in the following table:

| Class | train | val | test | Total |
|:---|---:|---:|---:|---:|
| Acropore_branched | 1469 | 464 | 475 | 2408 |
| Acropore_digitised | 568 | 160 | 160 | 888 |
| Acropore_sub_massive | 150 | 50 | 43 | 243 |
| Acropore_tabular | 999 | 297 | 293 | 1589 |
| Algae_assembly | 2546 | 847 | 845 | 4238 |
| Algae_drawn_up | 367 | 126 | 127 | 620 |
| Algae_limestone | 1652 | 557 | 563 | 2772 |
| Algae_sodding | 3148 | 984 | 985 | 5117 |
| Atra/Leucospilota | 1084 | 348 | 360 | 1792 |
| Bleached_coral | 219 | 71 | 70 | 360 |
| Blurred | 191 | 67 | 62 | 320 |
| Dead_coral | 1979 | 642 | 643 | 3264 |
| Fish | 2018 | 656 | 647 | 3321 |
| Homo_sapiens | 161 | 62 | 59 | 282 |
| Human_object | 157 | 58 | 55 | 270 |
| Living_coral | 406 | 154 | 141 | 701 |
| Millepore | 385 | 127 | 125 | 637 |
| No_acropore_encrusting | 441 | 130 | 154 | 725 |
| No_acropore_foliaceous | 204 | 36 | 46 | 286 |
| No_acropore_massive | 1031 | 336 | 338 | 1705 |
| No_acropore_solitary | 202 | 53 | 48 | 303 |
| No_acropore_sub_massive | 1401 | 433 | 422 | 2256 |
| Rock | 4489 | 1495 | 1473 | 7457 |
| Rubble | 3092 | 1030 | 1001 | 5123 |
| Sand | 5842 | 1939 | 1938 | 9719 |
| Sea_cucumber | 1408 | 439 | 447 | 2294 |
| Sea_urchins | 327 | 107 | 111 | 545 |
| Sponge | 269 | 96 | 105 | 470 |
| Syringodium_isoetifolium | 1212 | 392 | 391 | 1995 |
| Thalassodendron_ciliatum | 782 | 261 | 260 | 1303 |
| Useless | 579 | 193 | 193 | 965 |

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • Number of Epochs: 150
  • Learning Rate: 0.001
  • Train Batch Size: 32
  • Eval Batch Size: 32
  • Optimizer: Adam
  • LR Scheduler Type: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
  • Freeze Encoder: Yes
  • Data Augmentation: Yes
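
A minimal PyTorch sketch of this configuration, with a frozen backbone, Adam on the trainable parameters only, and ReduceLROnPlateau stepped on the validation loss (the head shown is a stand-in, see above):

```python
import torch
import torch.nn as nn
from transformers import AutoModel

# "Freeze Encoder: Yes" -- the DINOv2 backbone is not updated.
backbone = AutoModel.from_pretrained("facebook/dinov2-large")
for param in backbone.parameters():
    param.requires_grad = False

head = nn.Linear(1024, 31)  # stand-in for the full classification head

optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)  # Learning Rate: 0.001
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

# Each of the 150 epochs would end with scheduler.step(val_loss): the learning
# rate is divided by 10 after 5 epochs without validation-loss improvement,
# matching the drops visible in the Learning Rate column of the results table.
```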

Data Augmentation

Data were augmented using the following transformations:

Train Transforms

  • PreProcess: No additional parameters
  • Resize: probability=1.00
  • RandomHorizontalFlip: probability=0.25
  • RandomVerticalFlip: probability=0.25
  • ColorJiggle: probability=0.25
  • RandomPerspective: probability=0.25
  • Normalize: probability=1.00

Val Transforms

  • PreProcess: No additional parameters
  • Resize: probability=1.00
  • Normalize: probability=1.00
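
The transform names match those of the Kornia augmentation library, so the train pipeline can plausibly be reconstructed as below; the target image size, jitter magnitudes, and normalization statistics are assumptions:

```python
import torch
import kornia.augmentation as K

train_transforms = torch.nn.Sequential(
    K.Resize((518, 518)),                       # input size is an assumption
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(0.2, 0.2, 0.2, 0.2, p=0.25),  # jitter magnitudes assumed
    K.RandomPerspective(p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),
                std=torch.tensor([0.229, 0.224, 0.225])),
)
# Applies to float image batches of shape (B, C, H, W) in [0, 1].
```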

Training results

| Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
|---:|---:|---:|---:|---:|---:|
| 1 | 0.1742 | 0.2086 | 0.7405 | 0.5163 | 0.001 |
| 2 | 0.1527 | 0.2457 | 0.7649 | 0.5709 | 0.001 |
| 3 | 0.1538 | 0.2332 | 0.7738 | 0.6134 | 0.001 |
| 4 | 0.1466 | 0.2529 | 0.7732 | 0.6214 | 0.001 |
| 5 | 0.1450 | 0.2523 | 0.7797 | 0.6360 | 0.001 |
| 6 | 0.1458 | 0.2574 | 0.7791 | 0.6226 | 0.001 |
| 7 | 0.1414 | 0.2561 | 0.7819 | 0.6315 | 0.001 |
| 8 | 0.1444 | 0.2529 | 0.7788 | 0.6320 | 0.001 |
| 9 | 0.1494 | 0.2540 | 0.7846 | 0.6438 | 0.001 |
| 10 | 0.1488 | 0.2606 | 0.7813 | 0.6370 | 0.001 |
| 11 | 0.1402 | 0.2668 | 0.7909 | 0.6485 | 0.001 |
| 12 | 0.1406 | 0.2644 | 0.7920 | 0.6511 | 0.001 |
| 13 | 0.1414 | 0.2585 | 0.7886 | 0.6549 | 0.001 |
| 14 | 0.1397 | 0.2609 | 0.7859 | 0.6429 | 0.001 |
| 15 | 0.1393 | 0.2547 | 0.7888 | 0.6478 | 0.001 |
| 16 | 0.1401 | 0.2602 | 0.7873 | 0.6383 | 0.001 |
| 17 | 0.1400 | 0.2613 | 0.7914 | 0.6589 | 0.001 |
| 18 | 0.1407 | 0.2623 | 0.7950 | 0.6570 | 0.001 |
| 19 | 0.1404 | 0.2633 | 0.7870 | 0.6388 | 0.001 |
| 20 | 0.1402 | 0.2581 | 0.7934 | 0.6626 | 0.001 |
| 21 | 0.1382 | 0.2613 | 0.7915 | 0.6474 | 0.001 |
| 22 | 0.1383 | 0.2661 | 0.7949 | 0.6544 | 0.001 |
| 23 | 0.1417 | 0.2606 | 0.7870 | 0.6522 | 0.001 |
| 24 | 0.1389 | 0.2644 | 0.7904 | 0.6572 | 0.001 |
| 25 | 0.1364 | 0.2644 | 0.7910 | 0.6468 | 0.001 |
| 26 | 0.1361 | 0.2741 | 0.7952 | 0.6561 | 0.001 |
| 27 | 0.1363 | 0.2706 | 0.7918 | 0.6552 | 0.001 |
| 28 | 0.1349 | 0.2737 | 0.7948 | 0.6586 | 0.001 |
| 29 | 0.1372 | 0.2592 | 0.7897 | 0.6518 | 0.001 |
| 30 | 0.1394 | 0.2633 | 0.7919 | 0.6443 | 0.001 |
| 31 | 0.1367 | 0.2720 | 0.7944 | 0.6574 | 0.001 |
| 32 | 0.1365 | 0.2654 | 0.7960 | 0.6559 | 0.001 |
| 33 | 0.1367 | 0.2672 | 0.7945 | 0.6583 | 0.001 |
| 34 | 0.1367 | 0.2762 | 0.7966 | 0.6625 | 0.001 |
| 35 | 0.1326 | 0.2814 | 0.8055 | 0.6805 | 0.0001 |
| 36 | 0.1295 | 0.2859 | 0.8038 | 0.6802 | 0.0001 |
| 37 | 0.1283 | 0.2897 | 0.8067 | 0.6823 | 0.0001 |
| 38 | 0.1284 | 0.2900 | 0.8067 | 0.6852 | 0.0001 |
| 39 | 0.1274 | 0.2876 | 0.8092 | 0.6867 | 0.0001 |
| 40 | 0.1271 | 0.2897 | 0.8113 | 0.6930 | 0.0001 |
| 41 | 0.1297 | 0.2845 | 0.8076 | 0.6865 | 0.0001 |
| 42 | 0.1270 | 0.2924 | 0.8102 | 0.6931 | 0.0001 |
| 43 | 0.1260 | 0.2935 | 0.8120 | 0.6927 | 0.0001 |
| 44 | 0.1271 | 0.2928 | 0.8114 | 0.6905 | 0.0001 |
| 45 | 0.1268 | 0.2970 | 0.8110 | 0.6963 | 0.0001 |
| 46 | 0.1256 | 0.2921 | 0.8119 | 0.6947 | 0.0001 |
| 47 | 0.1272 | 0.2900 | 0.8087 | 0.6947 | 0.0001 |
| 48 | 0.1249 | 0.2914 | 0.8121 | 0.6952 | 0.0001 |
| 49 | 0.1251 | 0.2959 | 0.8145 | 0.7037 | 0.0001 |
| 50 | 0.1250 | 0.2959 | 0.8125 | 0.6988 | 0.0001 |
| 51 | 0.1251 | 0.2997 | 0.8149 | 0.7030 | 0.0001 |
| 52 | 0.1250 | 0.2942 | 0.8111 | 0.6983 | 0.0001 |
| 53 | 0.1247 | 0.2994 | 0.8139 | 0.6984 | 0.0001 |
| 54 | 0.1254 | 0.2990 | 0.8116 | 0.6984 | 0.0001 |
| 55 | 0.1247 | 0.2959 | 0.8139 | 0.7015 | 0.0001 |
| 56 | 0.1240 | 0.2997 | 0.8165 | 0.7032 | 0.0001 |
| 57 | 0.1245 | 0.2959 | 0.8135 | 0.6991 | 0.0001 |
| 58 | 0.1238 | 0.3001 | 0.8156 | 0.7027 | 0.0001 |
| 59 | 0.1237 | 0.3025 | 0.8150 | 0.7059 | 0.0001 |
| 60 | 0.1240 | 0.2963 | 0.8122 | 0.7026 | 0.0001 |
| 61 | 0.1233 | 0.3004 | 0.8159 | 0.7095 | 0.0001 |
| 62 | 0.1232 | 0.3049 | 0.8169 | 0.7095 | 0.0001 |
| 63 | 0.1236 | 0.2990 | 0.8144 | 0.7090 | 0.0001 |
| 64 | 0.1234 | 0.2980 | 0.8161 | 0.7075 | 0.0001 |
| 65 | 0.1229 | 0.3039 | 0.8190 | 0.7126 | 0.0001 |
| 66 | 0.1228 | 0.3018 | 0.8170 | 0.7044 | 0.0001 |
| 67 | 0.1225 | 0.3025 | 0.8186 | 0.7112 | 0.0001 |
| 68 | 0.1225 | 0.3067 | 0.8211 | 0.7173 | 0.0001 |
| 69 | 0.1230 | 0.2997 | 0.8167 | 0.7111 | 0.0001 |
| 70 | 0.1230 | 0.3001 | 0.8159 | 0.7088 | 0.0001 |
| 71 | 0.1233 | 0.2949 | 0.8139 | 0.7078 | 0.0001 |
| 72 | 0.1227 | 0.2983 | 0.8161 | 0.7115 | 0.0001 |
| 73 | 0.1227 | 0.3018 | 0.8164 | 0.7057 | 0.0001 |
| 74 | 0.1227 | 0.3025 | 0.8152 | 0.7055 | 1e-05 |
| 75 | 0.1226 | 0.2997 | 0.8173 | 0.7084 | 1e-05 |
| 76 | 0.1223 | 0.3056 | 0.8187 | 0.7106 | 1e-05 |
| 77 | 0.1226 | 0.3046 | 0.8196 | 0.7134 | 1e-05 |
| 78 | 0.1225 | 0.3070 | 0.8195 | 0.7147 | 1e-05 |
| 79 | 0.1219 | 0.3049 | 0.8165 | 0.7087 | 1e-05 |
| 80 | 0.1220 | 0.3053 | 0.8192 | 0.7101 | 1e-05 |
| 81 | 0.1221 | 0.3039 | 0.8176 | 0.7107 | 1e-05 |
| 82 | 0.1216 | 0.3032 | 0.8184 | 0.7113 | 1e-05 |
| 83 | 0.1225 | 0.3008 | 0.8158 | 0.7051 | 1e-05 |
| 84 | 0.1218 | 0.3070 | 0.8199 | 0.7148 | 1e-05 |
| 85 | 0.1214 | 0.3108 | 0.8208 | 0.7150 | 1e-05 |
| 86 | 0.1217 | 0.3098 | 0.8196 | 0.7186 | 1e-05 |
| 87 | 0.1215 | 0.3053 | 0.8182 | 0.7098 | 1e-05 |
| 88 | 0.1215 | 0.3042 | 0.8199 | 0.7102 | 1e-05 |
| 89 | 0.1221 | 0.3063 | 0.8213 | 0.7247 | 1e-05 |
| 90 | 0.1217 | 0.3035 | 0.8208 | 0.7171 | 1e-05 |
| 91 | 0.1215 | 0.3063 | 0.8183 | 0.7081 | 1e-05 |
| 92 | 0.1213 | 0.3053 | 0.8197 | 0.7103 | 1e-06 |
| 93 | 0.1216 | 0.3073 | 0.8198 | 0.7111 | 1e-06 |
| 94 | 0.1214 | 0.3119 | 0.8198 | 0.7128 | 1e-06 |
| 95 | 0.1213 | 0.3067 | 0.8190 | 0.7111 | 1e-06 |
| 96 | 0.1213 | 0.3073 | 0.8200 | 0.7129 | 1e-06 |
| 97 | 0.1216 | 0.3063 | 0.8188 | 0.7103 | 1e-06 |
| 98 | 0.1215 | 0.3073 | 0.8197 | 0.7134 | 1e-06 |
| 99 | 0.1214 | 0.3060 | 0.8186 | 0.7125 | 1e-06 |
| 100 | 0.1214 | 0.3060 | 0.8185 | 0.7123 | 1e-06 |
| 101 | 0.1213 | 0.3080 | 0.8195 | 0.7134 | 1e-06 |
| 102 | 0.1216 | 0.3087 | 0.8186 | 0.7138 | 1e-06 |
| 103 | 0.1212 | 0.3119 | 0.8203 | 0.7141 | 1e-06 |
| 104 | 0.1217 | 0.3056 | 0.8198 | 0.7133 | 1e-06 |
| 105 | 0.1212 | 0.3060 | 0.8209 | 0.7173 | 1e-06 |
| 106 | 0.1213 | 0.3070 | 0.8202 | 0.7168 | 1e-06 |
| 107 | 0.1218 | 0.3077 | 0.8204 | 0.7127 | 1e-06 |
| 108 | 0.1212 | 0.3056 | 0.8194 | 0.7123 | 1e-06 |
| 109 | 0.1216 | 0.3028 | 0.8161 | 0.7071 | 1e-06 |
| 110 | 0.1217 | 0.3070 | 0.8188 | 0.7116 | 1e-06 |
| 111 | 0.1215 | 0.3067 | 0.8198 | 0.7115 | 1e-06 |
| 112 | 0.1214 | 0.3046 | 0.8199 | 0.7121 | 1e-07 |
| 113 | 0.1213 | 0.3063 | 0.8207 | 0.7145 | 1e-07 |
| 114 | 0.1212 | 0.3094 | 0.8204 | 0.7133 | 1e-07 |
| 115 | 0.1213 | 0.3080 | 0.8197 | 0.7172 | 1e-07 |

CO2 Emissions

The estimated CO2 emissions for training this model are documented below:

  • Emissions: approximately 1.83 grams of CO2eq
  • Source: CodeCarbon
  • Training Type: fine-tuning
  • Geographical Location: Brest, France
  • Hardware Used: NVIDIA Tesla V100 PCIe 32 GB
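
Emissions were measured with CodeCarbon; a minimal sketch of how such tracking is typically wrapped around a training run (the `train` function is a hypothetical stand-in):

```python
from codecarbon import EmissionsTracker

def train():
    """Hypothetical stand-in for the fine-tuning loop."""

tracker = EmissionsTracker()  # writes an emissions.csv report by default
tracker.start()
try:
    train()
finally:
    emissions_kg = tracker.stop()  # total emissions in kg CO2-eq
    print(f"{emissions_kg * 1000:.2f} g CO2-eq")
```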

Framework Versions

  • Transformers: 4.41.1
  • PyTorch: 2.3.0+cu121
  • Datasets: 2.19.1
  • Tokenizers: 0.19.1