---
language:
- en
license: apache-2.0
base_model: facebook/dinov2-base
tags:
- image-classification
- multilabel
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: dino-base-2023_11_24-unfreeze
  results: []
---

# dino-base-2023_11_24-unfreeze

This model is a fine-tuned version of [facebook/dinov2-base](https://huggingface.co/facebook/dinov2-base) on the `multilabel_complete_dataset` dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2045
- F1 Micro: 0.6595
- F1 Macro: 0.5161
- Roc Auc: 0.7681
- Accuracy: 0.2735
- Learning Rate: 0.001
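
The metrics above follow the usual multilabel conventions: per-label sigmoid scores are thresholded into 0/1 predictions, F1 is pooled over all label decisions (micro) or averaged per label (macro), and "Accuracy" is subset accuracy, which requires every label of an image to be predicted correctly. A minimal dependency-free sketch of these computations (the 0.5 threshold is an assumption; the card does not state it):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def decode(logits, threshold=0.5):
    # Each label is an independent binary decision (sigmoid), not a softmax class.
    return [1 if sigmoid(z) >= threshold else 0 for z in logits]

def f1(tp, fp, fn):
    return 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0

def multilabel_metrics(y_true, y_pred):
    n_labels = len(y_true[0])
    per = [[0, 0, 0] for _ in range(n_labels)]  # tp, fp, fn per label
    for t_row, p_row in zip(y_true, y_pred):
        for j, (t, p) in enumerate(zip(t_row, p_row)):
            per[j][0] += t and p            # true positive
            per[j][1] += (not t) and p      # false positive
            per[j][2] += t and (not p)      # false negative
    tp, fp, fn = (sum(col) for col in zip(*per))
    micro = f1(tp, fp, fn)                  # pooled over all label decisions
    macro = sum(f1(*c) for c in per) / n_labels  # unweighted mean over labels
    # Subset accuracy: the whole label vector must match, which is why it is
    # much lower than F1 in the results above.
    subset = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return micro, macro, subset

y_true = [[1, 0, 1], [0, 1, 0]]
y_pred = [[1, 0, 0], [0, 1, 0]]
print(multilabel_metrics(y_true, y_pred))  # (0.8, 0.666..., 0.5)
```

This also explains why macro F1 trails micro F1 throughout training: macro weights every label equally, so rare, poorly-predicted labels pull it down.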

## Model description

This model adds a multi-label classification head on top of the [facebook/dinov2-base](https://huggingface.co/facebook/dinov2-base) ViT backbone. As the `-unfreeze` suffix in the run name suggests, the backbone weights appear to have been updated during fine-tuning rather than kept frozen.

## Intended uses & limitations

The model is intended for multi-label image classification, where each image may carry several labels at once. Note that subset accuracy (0.27) is far below micro F1 (0.66), which is typical for multilabel tasks, and the gap between micro F1 (0.66) and macro F1 (0.52) suggests weaker performance on infrequent labels.

## Training and evaluation data

The model was trained and evaluated on the `multilabel_complete_dataset` dataset. At 536 optimizer steps per epoch with a batch size of 16, the training split contains roughly 8,500 examples; no further details about the labels or the evaluation split are available.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.01
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
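
With a linear scheduler and no warmup (warmup is not stated in the card, so it is assumed to be zero), the learning rate would decay from 0.01 to 0 over the 16,080 total steps implied by the results table (536 steps/epoch × 30 epochs). A minimal sketch:

```python
# Sketch of the linear LR schedule implied by the listed hyperparameters.
# Assumptions: no warmup, 16,080 total steps (536 steps/epoch x 30 epochs).
BASE_LR = 0.01
TOTAL_STEPS = 16_080

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS):
    """Linearly decay the learning rate from base_lr to 0 over total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))       # 0.01 at the start of training
print(linear_lr(8_040))   # 0.005 halfway through (end of epoch 15)
print(linear_lr(16_080))  # 0.0 at the end of training
```

Note that the per-epoch learning rates logged in the results table (a flat 0.01 followed by a step drop to 0.001 after epoch 16) do not match a linear decay, so the scheduler setting above may not reflect what was actually applied.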

### Training results

| Training Loss | Epoch | Step  | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Learning Rate |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:-------------:|
| 0.3961        | 1.0   | 536   | 0.3293          | 0.3296   | 0.0966   | 0.6005  | 0.0736   | 0.01  |
| 0.3418        | 2.0   | 1072  | 0.3382          | 0.3379   | 0.1283   | 0.6054  | 0.0439   | 0.01  |
| 0.3334        | 3.0   | 1608  | 0.3551          | 0.3905   | 0.2420   | 0.6319  | 0.0786   | 0.01  |
| 0.3323        | 4.0   | 2144  | 0.3213          | 0.2555   | 0.1099   | 0.5720  | 0.0625   | 0.01  |
| 0.3248        | 5.0   | 2680  | 0.3164          | 0.3355   | 0.1298   | 0.6024  | 0.0450   | 0.01  |
| 0.3235        | 6.0   | 3216  | 0.3346          | 0.2864   | 0.0806   | 0.5834  | 0.0239   | 0.01  |
| 0.32          | 7.0   | 3752  | 0.3029          | 0.4594   | 0.1968   | 0.6663  | 0.0682   | 0.01  |
| 0.3138        | 8.0   | 4288  | 0.2866          | 0.5468   | 0.2940   | 0.7240  | 0.0579   | 0.01  |
| 0.3052        | 9.0   | 4824  | 0.2807          | 0.4767   | 0.2993   | 0.6672  | 0.1225   | 0.01  |
| 0.3157        | 10.0  | 5360  | 0.2955          | 0.4752   | 0.2091   | 0.6733  | 0.0707   | 0.01  |
| 0.3119        | 11.0  | 5896  | 0.3405          | 0.4028   | 0.2160   | 0.6336  | 0.1361   | 0.01  |
| 0.3162        | 12.0  | 6432  | 0.4163          | 0.4899   | 0.2965   | 0.6863  | 0.0532   | 0.01  |
| 0.3184        | 13.0  | 6968  | 0.2964          | 0.5429   | 0.3299   | 0.7170  | 0.1047   | 0.01  |
| 0.3142        | 14.0  | 7504  | 0.3005          | 0.5253   | 0.3154   | 0.7072  | 0.0832   | 0.01  |
| 0.3104        | 15.0  | 8040  | 3.1991          | 0.1673   | 0.0674   | 0.4879  | 0.0      | 0.01  |
| 0.3042        | 16.0  | 8576  | 0.2820          | 0.4544   | 0.2746   | 0.6519  | 0.1583   | 0.001 |
| 0.2788        | 17.0  | 9112  | 0.2741          | 0.5744   | 0.3842   | 0.7205  | 0.1640   | 0.001 |
| 0.2724        | 18.0  | 9648  | 0.2424          | 0.5903   | 0.3936   | 0.7256  | 0.2072   | 0.001 |
| 0.2642        | 19.0  | 10184 | 0.2414          | 0.6021   | 0.4095   | 0.7347  | 0.2186   | 0.001 |
| 0.2597        | 20.0  | 10720 | 0.2269          | 0.6079   | 0.4156   | 0.7347  | 0.2251   | 0.001 |
| 0.2575        | 21.0  | 11256 | 0.2249          | 0.6231   | 0.4253   | 0.7463  | 0.2340   | 0.001 |
| 0.253         | 22.0  | 11792 | 0.2261          | 0.6291   | 0.4639   | 0.7521  | 0.2429   | 0.001 |
| 0.2491        | 23.0  | 12328 | 0.2163          | 0.6454   | 0.4856   | 0.7627  | 0.2537   | 0.001 |
| 0.2484        | 24.0  | 12864 | 0.2212          | 0.6262   | 0.4635   | 0.7460  | 0.2569   | 0.001 |
| 0.2465        | 25.0  | 13400 | 0.2118          | 0.6486   | 0.4780   | 0.7622  | 0.2772   | 0.001 |
| 0.241         | 26.0  | 13936 | 0.2106          | 0.6602   | 0.5159   | 0.7727  | 0.2558   | 0.001 |
| 0.2413        | 27.0  | 14472 | 0.2135          | 0.6390   | 0.4979   | 0.7536  | 0.2722   | 0.001 |
| 0.2385        | 28.0  | 15008 | 0.2182          | 0.6103   | 0.4596   | 0.7319  | 0.2772   | 0.001 |
| 0.2366        | 29.0  | 15544 | 0.2132          | 0.6615   | 0.5354   | 0.7758  | 0.2708   | 0.001 |
| 0.2345        | 30.0  | 16080 | 0.2069          | 0.6566   | 0.5122   | 0.7658  | 0.2747   | 0.001 |


### Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.14.1