---
language:
- eng
license: apache-2.0
base_model: facebook/dinov2-giant
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dinov2-giant-2024_01_02-kornia_img-size518_batch-size32_epochs20_freeze
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dinov2-giant-2024_01_02-kornia_img-size518_batch-size32_epochs20_freeze

This model is a fine-tuned version of [facebook/dinov2-giant](https://huggingface.co/facebook/dinov2-giant) on the multilabel_complete_dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1266
- F1 Micro: 0.8142
- F1 Macro: 0.7719
- Roc Auc: 0.8801
- Accuracy: 0.5121
- Learning Rate: 0.001
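
This checkpoint is a multilabel classifier, so at inference time the logits are typically passed through a sigmoid and thresholded independently per label rather than through a softmax. A minimal usage sketch under that assumption follows; the repository id, image path, and the 0.5 threshold are placeholders, not values taken from this card.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repository id; replace with the actual path of this checkpoint.
model_id = "dinov2-giant-2024_01_02-kornia_img-size518_batch-size32_epochs20_freeze"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel head: sigmoid per label, then an independent decision threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```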

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.01
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
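
For reference, these values map onto a `TrainingArguments` configuration roughly like the sketch below; the output directory and the per-epoch evaluation strategy are assumptions rather than logged settings.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dinov2-giant-multilabel",  # placeholder output directory
    learning_rate=1e-2,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",           # assumed: metrics below are reported once per epoch
)
```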

### Training results

| Training Loss | Epoch | Step | Accuracy | F1 Macro | F1 Micro | Validation Loss | Roc Auc | Learning Rate |
|:-------------:|:-----:|:----:|:--------:|:--------:|:--------:|:---------------:|:-------:|:-------------:|
| No log        | 1.0   | 268  | 0.4055   | 0.5114   | 0.6258   | 0.2231          | 0.7463  | 0.01          |
| 0.2273        | 2.0   | 536  | 0.3812   | 0.4511   | 0.6106   | 0.2505          | 0.7360  | 0.01          |
| 0.2273        | 3.0   | 804  | 0.4176   | 0.6952   | 0.7531   | 0.1782          | 0.8425  | 0.01          |
| 0.196         | 4.0   | 1072 | 0.4241   | 0.6667   | 0.7646   | 0.1578          | 0.8562  | 0.01          |
| 0.196         | 5.0   | 1340 | 0.3551   | 0.6463   | 0.7290   | 0.1978          | 0.8616  | 0.01          |
| 0.1916        | 6.0   | 1608 | 0.4548   | 0.6155   | 0.7534   | 0.1570          | 0.8332  | 0.01          |
| 0.1916        | 7.0   | 1876 | 0.4076   | 0.7034   | 0.7711   | 0.1704          | 0.8893  | 0.01          |
| 0.1935        | 8.0   | 2144 | 0.4487   | 0.7240   | 0.7783   | 0.1584          | 0.8759  | 0.01          |
| 0.1935        | 9.0   | 2412 | 0.4434   | 0.7026   | 0.7725   | 0.1614          | 0.8787  | 0.01          |
| 0.1945        | 10.0  | 2680 | 0.4366   | 0.6245   | 0.7438   | 0.1569          | 0.8239  | 0.01          |
| 0.1945        | 11.0  | 2948 | 0.4298   | 0.6986   | 0.7639   | 0.1666          | 0.8614  | 0.01          |
| 0.1951        | 12.0  | 3216 | 0.4477   | 0.6291   | 0.7448   | 0.1585          | 0.8242  | 0.01          |
| 0.1951        | 13.0  | 3484 | 0.4380   | 0.6650   | 0.7624   | 0.1565          | 0.8443  | 0.01          |
| 0.1953        | 14.0  | 3752 | 0.4466   | 0.5022   | 0.6728   | 0.1728          | 0.7639  | 0.01          |
| 0.1945        | 15.0  | 4020 | 0.4555   | 0.6524   | 0.7441   | 0.1565          | 0.8177  | 0.01          |
| 0.1945        | 16.0  | 4288 | 0.4580   | 0.6439   | 0.7515   | 0.1576          | 0.8311  | 0.01          |
| 0.1929        | 17.0  | 4556 | 0.4312   | 0.5707   | 0.7359   | 0.1701          | 0.8337  | 0.01          |
| 0.1929        | 18.0  | 4824 | 0.4230   | 0.6534   | 0.7531   | 0.1599          | 0.8451  | 0.01          |
| 0.1952        | 19.0  | 5092 | 0.4548   | 0.6658   | 0.7347   | 0.1603          | 0.8118  | 0.01          |
| 0.1952        | 20.0  | 5360 | 0.5263   | 0.7677   | 0.8134   | 0.1276          | 0.8759  | 0.001         |
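
In a multilabel setting the "Accuracy" column is usually exact-match (subset) accuracy, which is why it stays well below the F1 scores. A minimal sketch of how metrics of this kind can be computed with scikit-learn, assuming sigmoid outputs thresholded at 0.5 (the averaging choices are assumptions, not taken from the training script):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def multilabel_metrics(logits: np.ndarray, labels: np.ndarray, threshold: float = 0.5) -> dict:
    """Compute multilabel scores from raw logits and a 0/1 label matrix."""
    probs = 1.0 / (1.0 + np.exp(-logits))      # sigmoid
    preds = (probs >= threshold).astype(int)   # independent per-label decisions
    return {
        "f1_micro": f1_score(labels, preds, average="micro", zero_division=0),
        "f1_macro": f1_score(labels, preds, average="macro", zero_division=0),
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        "accuracy": accuracy_score(labels, preds),  # exact-match (subset) accuracy
    }
```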


### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1