lombardata committed on
Commit 12fa37c
1 Parent(s): b7bb1b9

Upload README.md

Files changed (1)
  1. README.md +168 -111
---
language:
- eng
license: wtfpl
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: facebook/dinov2-large
model-index:
- name: DinoVdeau_Aina-large-2024_06_12-batch-size32_epochs150_freeze
  results: []
---

DinoVd'eau is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:

- Loss: 0.1378
- F1 Micro: 0.8118
- F1 Macro: 0.5888
- Roc Auc: 0.8738
- Accuracy: 0.5906
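These are standard multilabel metrics; a minimal scikit-learn sketch of how such values are typically computed is shown below. The 0.5 decision threshold and the micro-averaged ROC AUC are assumptions, since the exact evaluation code lives in the training repository linked under Model description.

```python
# Hypothetical sketch of the reported multilabel metrics with scikit-learn;
# the actual evaluation code is in the DinoVdeau repository linked below.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

y_true = np.array([[1, 0, 1], [0, 1, 0]])               # multi-hot ground truth
y_prob = np.array([[0.9, 0.2, 0.7], [0.1, 0.8, 0.4]])   # sigmoid outputs
y_pred = (y_prob > 0.5).astype(int)                     # assumed 0.5 threshold

print("F1 Micro:", f1_score(y_true, y_pred, average="micro"))
print("F1 Macro:", f1_score(y_true, y_pred, average="macro"))
print("Roc Auc:", roc_auc_score(y_true, y_prob, average="micro"))
print("Accuracy:", accuracy_score(y_true, y_pred))      # exact-match (subset) accuracy
```

Note that for multilabel data, `accuracy_score` counts an image as correct only when all its labels match, which is why Accuracy sits well below F1 Micro here.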
 
---

# Model description
DinoVd'eau is a model built on top of the DINOv2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
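As a rough illustration, a head matching this description could look like the following PyTorch module. The hidden width, dropout rate, and exact layer ordering are assumptions, not the repository's actual architecture.

```python
import torch.nn as nn

# Hypothetical sketch of the described head: linear + ReLU + batch norm + dropout.
# Hidden size and dropout rate are assumptions; see the DinoVdeau repository
# for the real architecture. 1024 is the dinov2-large embedding size,
# 13 the number of classes in the data table below.
class ClassificationHead(nn.Module):
    def __init__(self, embed_dim=1024, hidden_dim=512, num_labels=13, dropout=0.2):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(embed_dim, hidden_dim),
            nn.ReLU(),
            nn.BatchNorm1d(hidden_dim),
            nn.Dropout(dropout),
            nn.Linear(hidden_dim, num_labels),  # one logit per class; sigmoid at inference
        )

    def forward(self, features):
        return self.head(features)
```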
The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
---

# Intended uses & limitations
You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
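A minimal inference sketch with the Transformers library is given below. The hub id is inferred from the model name above, and the 0.5 decision threshold is an assumption.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical hub id, inferred from the model name above.
model_id = "lombardata/DinoVdeau_Aina-large-2024_06_12-batch-size32_epochs150_freeze"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("underwater_photo.jpg")            # your own image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.sigmoid(logits)[0]                      # multilabel: sigmoid, not softmax
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```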
---

# Training and evaluation data
Details on the number of images for each class are given in the following table:

| Class | train | val | test | Total |
|:------|------:|----:|-----:|------:|
| Acr | 509 | 170 | 170 | 849 |
| Ech | 149 | 55 | 49 | 253 |
| Gal | 149 | 49 | 52 | 250 |
| Mtp | 278 | 93 | 92 | 463 |
| Poc | 166 | 54 | 60 | 280 |
| Por | 265 | 88 | 88 | 441 |
| ALGAE | 1221 | 407 | 407 | 2035 |
| RDC | 185 | 65 | 69 | 319 |
| SG | 1388 | 463 | 462 | 2313 |
| P | 198 | 66 | 66 | 330 |
| R | 1106 | 368 | 369 | 1843 |
| S | 2178 | 726 | 726 | 3630 |
| UNK | 132 | 44 | 44 | 220 |
---

# Training procedure

## Training hyperparameters

The following hyperparameters were used during training:

- **Number of Epochs**: 150
- **Learning Rate**: 0.001
- **Train Batch Size**: 32
- **Eval Batch Size**: 32
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 (see the sketch after this list)
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
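Under these settings, the optimizer and scheduler would be configured roughly as follows. This is a minimal sketch based only on the values listed above, not the repository's exact training loop, and the epoch helper is hypothetical.

```python
import torch
from transformers import AutoModelForImageClassification

model = AutoModelForImageClassification.from_pretrained(
    "facebook/dinov2-large",
    num_labels=13,                               # 13 classes in the table above
    problem_type="multi_label_classification",   # sigmoid + BCE loss
)
for param in model.dinov2.parameters():          # freeze the DINOv2 encoder
    param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)
# Divide the LR by 10 after 5 epochs without validation-loss improvement.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

for epoch in range(150):
    val_loss = train_one_epoch_and_evaluate(model, optimizer)  # hypothetical helper
    scheduler.step(val_loss)
```

The patience of 5 and factor of 0.1 match the learning-rate drops visible in the results table below (0.001 to 0.0001 at epoch 39, and so on).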
## Data Augmentation
Data were augmented using the following transformations:

Train Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00

Val Transforms
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
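The transform names (notably ColorJiggle) suggest a Kornia augmentation pipeline. The sketch below reconstructs the train transforms under that assumption; only the probabilities come from this card, while the image size and all magnitudes are assumptions.

```python
import torch
import kornia.augmentation as K

# Sketch of the train-time pipeline. Only the probabilities come from this card;
# the image size and jitter/perspective magnitudes are assumptions.
train_transforms = K.AugmentationSequential(
    K.Resize((224, 224)),
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(brightness=0.2, contrast=0.2, saturation=0.2, hue=0.1, p=0.25),
    K.RandomPerspective(distortion_scale=0.5, p=0.25),
    # ImageNet statistics, as used by the DINOv2 image processor.
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),
                std=torch.tensor([0.229, 0.224, 0.225])),
)
```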
## Training results

| Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate |
|:-----:|:---------------:|:--------:|:--------:|:--------:|:-------------:|
| 1 | 0.2556 | 0.5343 | 0.7704 | 0.4260 | 0.001 |
| 2 | 0.1856 | 0.5450 | 0.7608 | 0.4057 | 0.001 |
| 3 | 0.1580 | 0.5335 | 0.7773 | 0.4770 | 0.001 |
| 4 | 0.1584 | 0.5274 | 0.7700 | 0.3944 | 0.001 |
| 5 | 0.1562 | 0.5442 | 0.7614 | 0.3748 | 0.001 |
| 6 | 0.1484 | 0.5549 | 0.7822 | 0.4447 | 0.001 |
| 7 | 0.1445 | 0.5648 | 0.7904 | 0.5285 | 0.001 |
| 8 | 0.1458 | 0.5488 | 0.7778 | 0.4805 | 0.001 |
| 9 | 0.1419 | 0.5655 | 0.7990 | 0.4775 | 0.001 |
| 10 | 0.1426 | 0.5663 | 0.7917 | 0.5507 | 0.001 |
| 11 | 0.1459 | 0.5556 | 0.7766 | 0.4314 | 0.001 |
| 12 | 0.1424 | 0.5625 | 0.7899 | 0.4704 | 0.001 |
| 13 | 0.1420 | 0.5587 | 0.7957 | 0.5304 | 0.001 |
| 14 | 0.1415 | 0.5686 | 0.7959 | 0.5303 | 0.001 |
| 15 | 0.1447 | 0.5747 | 0.7903 | 0.5021 | 0.001 |
| 16 | 0.1505 | 0.5549 | 0.7780 | 0.4408 | 0.001 |
| 17 | 0.1504 | 0.5625 | 0.7849 | 0.5071 | 0.001 |
| 18 | 0.1584 | 0.5633 | 0.7869 | 0.4938 | 0.001 |
| 19 | 0.1395 | 0.5678 | 0.7954 | 0.5251 | 0.001 |
| 20 | 0.1405 | 0.5678 | 0.8031 | 0.5903 | 0.001 |
| 21 | 0.1434 | 0.5640 | 0.7966 | 0.4817 | 0.001 |
| 22 | 0.1466 | 0.5511 | 0.7923 | 0.5288 | 0.001 |
| 23 | 0.1456 | 0.5747 | 0.7919 | 0.4974 | 0.001 |
| 24 | 0.1398 | 0.5587 | 0.7889 | 0.5008 | 0.001 |
| 25 | 0.1392 | 0.5808 | 0.8018 | 0.5881 | 0.001 |
| 26 | 0.1458 | 0.5846 | 0.8024 | 0.5379 | 0.001 |
| 27 | 0.1388 | 0.5716 | 0.7990 | 0.5310 | 0.001 |
| 28 | 0.1475 | 0.5648 | 0.7953 | 0.4926 | 0.001 |
| 29 | 0.1428 | 0.5755 | 0.7916 | 0.4731 | 0.001 |
| 30 | 0.1446 | 0.5648 | 0.7960 | 0.5270 | 0.001 |
| 31 | 0.2518 | 0.5556 | 0.7860 | 0.5163 | 0.001 |
| 32 | 0.1363 | 0.5732 | 0.7985 | 0.5224 | 0.001 |
| 33 | 0.1413 | 0.5831 | 0.7980 | 0.5095 | 0.001 |
| 34 | 0.1392 | 0.5617 | 0.7939 | 0.5399 | 0.001 |
| 35 | 0.1400 | 0.5877 | 0.8021 | 0.4873 | 0.001 |
| 36 | 0.1510 | 0.5854 | 0.8016 | 0.5150 | 0.001 |
| 37 | 0.1443 | 0.5572 | 0.7901 | 0.4850 | 0.001 |
| 38 | 0.1441 | 0.5709 | 0.7946 | 0.5063 | 0.001 |
| 39 | 0.1354 | 0.5892 | 0.8024 | 0.5354 | 0.0001 |
| 40 | 0.1358 | 0.5938 | 0.8035 | 0.5341 | 0.0001 |
| 41 | 0.1350 | 0.5945 | 0.8055 | 0.5376 | 0.0001 |
| 42 | 0.1369 | 0.5892 | 0.8035 | 0.5480 | 0.0001 |
| 43 | 0.1357 | 0.5869 | 0.8032 | 0.5562 | 0.0001 |
| 44 | 0.1349 | 0.5892 | 0.8040 | 0.5433 | 0.0001 |
| 45 | 0.1361 | 0.5816 | 0.8025 | 0.5564 | 0.0001 |
| 46 | 0.1350 | 0.5938 | 0.8073 | 0.5738 | 0.0001 |
| 47 | 0.1388 | 0.5869 | 0.8036 | 0.5399 | 0.0001 |
| 48 | 0.1371 | 0.5869 | 0.8029 | 0.5475 | 0.0001 |
| 49 | 0.1369 | 0.5770 | 0.8004 | 0.5558 | 0.0001 |
| 50 | 0.1356 | 0.5854 | 0.8055 | 0.5775 | 0.0001 |
| 51 | 0.1357 | 0.5899 | 0.8057 | 0.5639 | 1e-05 |
| 52 | 0.1353 | 0.5877 | 0.8033 | 0.5524 | 1e-05 |
| 53 | 0.1346 | 0.5892 | 0.8048 | 0.5553 | 1e-05 |
| 54 | 0.1351 | 0.5869 | 0.8040 | 0.5514 | 1e-05 |
| 55 | 0.1366 | 0.5899 | 0.8044 | 0.5586 | 1e-05 |
| 56 | 0.1358 | 0.5877 | 0.8056 | 0.5596 | 1e-05 |
| 57 | 0.1357 | 0.5892 | 0.8046 | 0.5592 | 1e-05 |
| 58 | 0.1356 | 0.5869 | 0.8042 | 0.5653 | 1e-05 |
| 59 | 0.1348 | 0.5884 | 0.8065 | 0.5642 | 1e-05 |
| 60 | 0.1367 | 0.5899 | 0.8047 | 0.5544 | 1e-06 |
| 61 | 0.1341 | 0.5877 | 0.8046 | 0.5561 | 1e-06 |
| 62 | 0.1345 | 0.5861 | 0.8052 | 0.5646 | 1e-06 |
| 63 | 0.1371 | 0.5899 | 0.8060 | 0.5661 | 1e-06 |
| 64 | 0.1348 | 0.5930 | 0.8057 | 0.5522 | 1e-06 |
| 65 | 0.1358 | 0.5899 | 0.8048 | 0.5594 | 1e-06 |
| 66 | 0.1352 | 0.5854 | 0.8041 | 0.5681 | 1e-06 |
| 67 | 0.1348 | 0.5877 | 0.8059 | 0.5690 | 1e-06 |
| 68 | 0.1348 | 0.5854 | 0.8047 | 0.5681 | 1e-07 |
| 69 | 0.1351 | 0.5884 | 0.8036 | 0.5524 | 1e-07 |
| 70 | 0.1365 | 0.5854 | 0.8054 | 0.5680 | 1e-07 |
| 71 | 0.1347 | 0.5907 | 0.8073 | 0.5701 | 1e-07 |
---

# CO2 Emissions

The estimated CO2 emissions for training this model are documented below:

- **Emissions**: 0.2101 grams of CO2
- **Source**: Code Carbon
- **Training Type**: fine-tuning
- **Geographical Location**: Brest, France
- **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB
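Estimates like this are typically produced with the codecarbon package, along the lines of the sketch below (not necessarily this project's exact instrumentation).

```python
from codecarbon import EmissionsTracker

# Sketch of how a Code Carbon estimate is typically collected;
# not necessarily this project's exact instrumentation.
tracker = EmissionsTracker()
tracker.start()
# ... run the fine-tuning here ...
emissions_kg = tracker.stop()                 # emissions in kg of CO2-equivalent
print(f"Emissions: {emissions_kg * 1000:.4f} g CO2")
```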
---

# Framework Versions

- **Transformers**: 4.41.1
- **Pytorch**: 2.3.0+cu121
- **Datasets**: 2.19.1
- **Tokenizers**: 0.19.1