---
license: apache-2.0
base_model: facebook/dinov2-base-imagenet1k-1-layer
tags:
- image-classification
- vision
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: dinov2-base-imagenet1k-1-layer-finetuned-galaxy_mnist
  results: []
---


# dinov2-base-imagenet1k-1-layer-finetuned-galaxy_mnist

This model is a fine-tuned version of [facebook/dinov2-base-imagenet1k-1-layer](https://huggingface.co/facebook/dinov2-base-imagenet1k-1-layer) on the matthieulel/galaxy_mnist dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1875
- Accuracy: 0.9365
- Precision: 0.9367
- Recall: 0.9365
- F1: 0.9365
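
For quick inference, the fine-tuned checkpoint can be loaded with the `transformers` image-classification pipeline. A minimal sketch, assuming the model is hosted under the repo id below (inferred from the model name in this card; adjust to the actual Hub path):

```python
# Assumed Hub repo id, based on this card's model name; change if hosted elsewhere.
CHECKPOINT = "matthieulel/dinov2-base-imagenet1k-1-layer-finetuned-galaxy_mnist"


def classify_galaxy(image, checkpoint=CHECKPOINT):
    """Return image-classification scores for a galaxy image (path, URL, or PIL.Image)."""
    # Imported lazily so the heavy dependency is only loaded when inference is requested.
    from transformers import pipeline

    classifier = pipeline("image-classification", model=checkpoint)
    return classifier(image)
```

Calling `classify_galaxy("cutout.png")` returns a list of `{label, score}` dicts, highest score first.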

## Model description

The base checkpoint, [facebook/dinov2-base-imagenet1k-1-layer](https://huggingface.co/facebook/dinov2-base-imagenet1k-1-layer), is a self-supervised DINOv2 ViT-B/14 backbone with a single linear classification head trained on ImageNet-1k. For this model, that checkpoint was fine-tuned (with the classification head presumably re-initialised for the new label set) to classify galaxy images from the GalaxyMNIST dataset.

## Intended uses & limitations

The model is intended for classifying galaxy cutout images into the four GalaxyMNIST morphology classes. Performance on natural images, or on astronomical imaging that differs substantially from the training distribution (other surveys, resolutions, or preprocessing), has not been evaluated.

## Training and evaluation data

Training and evaluation used the [matthieulel/galaxy_mnist](https://huggingface.co/datasets/matthieulel/galaxy_mnist) dataset, a Hub version of GalaxyMNIST: galaxy images labelled with morphology classes derived from Galaxy Zoo volunteer classifications. The results reported above are computed on the held-out evaluation split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
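
These settings correspond to `transformers.TrainingArguments` fields; note that the total train batch size of 256 is derived rather than set directly. A small sketch making the relationship explicit (field names follow the Trainer API):

```python
# Hyperparameters from this card, keyed by their TrainingArguments field names.
hparams = {
    "learning_rate": 5e-6,
    "per_device_train_batch_size": 64,
    "per_device_eval_batch_size": 64,
    "seed": 42,
    "gradient_accumulation_steps": 4,
    "lr_scheduler_type": "linear",
    "warmup_ratio": 0.1,
    "num_train_epochs": 30,
}

# total_train_batch_size = per-device batch size x gradient accumulation steps.
effective_batch = (
    hparams["per_device_train_batch_size"] * hparams["gradient_accumulation_steps"]
)
print(effective_batch)  # 256
```

With one optimizer step every 4 forward passes, 256 examples contribute to each weight update, which matches the `total_train_batch_size: 256` listed above.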

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.8401        | 0.99  | 31   | 0.6220          | 0.7605   | 0.7579    | 0.7605 | 0.7543 |
| 0.3857        | 1.98  | 62   | 0.2696          | 0.8875   | 0.8881    | 0.8875 | 0.8877 |
| 0.3144        | 2.98  | 93   | 0.2491          | 0.9015   | 0.9028    | 0.9015 | 0.9010 |
| 0.2769        | 4.0   | 125  | 0.2179          | 0.9130   | 0.9129    | 0.9130 | 0.9128 |
| 0.2858        | 4.99  | 156  | 0.2455          | 0.9025   | 0.9070    | 0.9025 | 0.9020 |
| 0.2704        | 5.98  | 187  | 0.2121          | 0.9155   | 0.9234    | 0.9155 | 0.9156 |
| 0.2557        | 6.98  | 218  | 0.2177          | 0.9155   | 0.9190    | 0.9155 | 0.9152 |
| 0.2069        | 8.0   | 250  | 0.1864          | 0.9255   | 0.9256    | 0.9255 | 0.9255 |
| 0.2344        | 8.99  | 281  | 0.1894          | 0.9230   | 0.9237    | 0.9230 | 0.9230 |
| 0.1996        | 9.98  | 312  | 0.1993          | 0.9235   | 0.9260    | 0.9235 | 0.9234 |
| 0.2011        | 10.98 | 343  | 0.1828          | 0.9280   | 0.9280    | 0.9280 | 0.9279 |
| 0.2229        | 12.0  | 375  | 0.2358          | 0.9155   | 0.9233    | 0.9155 | 0.9145 |
| 0.1792        | 12.99 | 406  | 0.1897          | 0.9205   | 0.9214    | 0.9205 | 0.9205 |
| 0.1898        | 13.98 | 437  | 0.2017          | 0.9210   | 0.9217    | 0.9210 | 0.9208 |
| 0.1735        | 14.98 | 468  | 0.1954          | 0.9270   | 0.9270    | 0.9270 | 0.9269 |
| 0.1751        | 16.0  | 500  | 0.1918          | 0.9295   | 0.9299    | 0.9295 | 0.9294 |
| 0.1732        | 16.99 | 531  | 0.1906          | 0.922    | 0.9225    | 0.922  | 0.9219 |
| 0.1738        | 17.98 | 562  | 0.1846          | 0.9310   | 0.9317    | 0.9310 | 0.9310 |
| 0.1694        | 18.98 | 593  | 0.1875          | 0.9365   | 0.9367    | 0.9365 | 0.9365 |
| 0.1723        | 20.0  | 625  | 0.1941          | 0.9285   | 0.9293    | 0.9285 | 0.9284 |
| 0.1574        | 20.99 | 656  | 0.1905          | 0.9335   | 0.9337    | 0.9335 | 0.9336 |
| 0.1485        | 21.98 | 687  | 0.1869          | 0.9315   | 0.9313    | 0.9315 | 0.9314 |
| 0.1537        | 22.98 | 718  | 0.1830          | 0.9360   | 0.9360    | 0.9360 | 0.9360 |
| 0.1406        | 24.0  | 750  | 0.1975          | 0.9320   | 0.9322    | 0.9320 | 0.9320 |
| 0.1326        | 24.99 | 781  | 0.1918          | 0.9315   | 0.9316    | 0.9315 | 0.9315 |
| 0.1238        | 25.98 | 812  | 0.2105          | 0.9275   | 0.9288    | 0.9275 | 0.9276 |
| 0.1299        | 26.98 | 843  | 0.2022          | 0.9325   | 0.9327    | 0.9325 | 0.9324 |
| 0.1387        | 28.0  | 875  | 0.2011          | 0.9335   | 0.9337    | 0.9335 | 0.9336 |
| 0.1279        | 28.99 | 906  | 0.2005          | 0.9310   | 0.9310    | 0.9310 | 0.9310 |
| 0.1256        | 29.76 | 930  | 0.2004          | 0.9310   | 0.9310    | 0.9310 | 0.9310 |
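
The precision, recall and F1 values above are consistent with support-weighted averaging; in particular, weighted recall mathematically equals accuracy, which is why those two columns agree in every row. A dependency-free sketch of that computation (the card itself was presumably generated with `evaluate`/scikit-learn, so this is an illustration, not the exact code used):

```python
from collections import Counter


def weighted_metrics(labels, preds):
    """Accuracy plus support-weighted precision/recall/F1, mirroring
    sklearn's precision_recall_fscore_support(average="weighted")."""
    n = len(labels)
    support = Counter(labels)  # examples per true class
    accuracy = sum(l == p for l, p in zip(labels, preds)) / n

    precision = recall = f1 = 0.0
    for cls, count in support.items():
        tp = sum(1 for l, p in zip(labels, preds) if l == cls and p == cls)
        fp = sum(1 for l, p in zip(labels, preds) if l != cls and p == cls)
        fn = count - tp
        p_c = tp / (tp + fp) if tp + fp else 0.0
        r_c = tp / (tp + fn) if tp + fn else 0.0
        f_c = 2 * p_c * r_c / (p_c + r_c) if p_c + r_c else 0.0
        # Each class contributes in proportion to its support.
        w = count / n
        precision += w * p_c
        recall += w * r_c
        f1 += w * f_c
    return {"accuracy": accuracy, "precision": precision, "recall": recall, "f1": f1}
```

Because weighted recall is `sum(tp_c) / n`, it collapses to accuracy exactly, matching the table.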


### Framework versions

- Transformers 4.37.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.15.1