---
license: apache-2.0
base_model: google/vit-base-patch16-224
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: vit-base-patch16-224-dmae-va-U5-42C
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# vit-base-patch16-224-dmae-va-U5-42C

This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1112
- Accuracy: 0.5667

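Until the usage sections below are filled in, here is a minimal inference sketch for reference. The repo id is a placeholder for wherever this checkpoint is actually hosted, the image path is illustrative, and the label names depend on the (unspecified) training dataset:

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id -- replace with the namespace this checkpoint is pushed to.
repo_id = "your-username/vit-base-patch16-224-dmae-va-U5-42C"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

# Any RGB image; the path here is only an example.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```
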
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative `TrainingArguments` sketch reproducing them appears after the list):
- learning_rate: 1e-06
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 42

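The list above maps onto `transformers.TrainingArguments` roughly as follows. This is a reconstruction, not the exact training script: `output_dir` and any options not listed (evaluation, logging, and saving cadence) are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-dmae-va-U5-42C",  # assumed
    learning_rate=1e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    num_train_epochs=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```
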
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.9   | 7    | 1.4546          | 0.1333   |
| 1.5342        | 1.94  | 15   | 1.4379          | 0.1333   |
| 1.5342        | 2.97  | 23   | 1.4115          | 0.1667   |
| 1.5331        | 4.0   | 31   | 1.3787          | 0.2      |
| 1.4639        | 4.9   | 38   | 1.3513          | 0.2833   |
| 1.4639        | 5.94  | 46   | 1.3290          | 0.3333   |
| 1.4056        | 6.97  | 54   | 1.3114          | 0.3833   |
| 1.3679        | 8.0   | 62   | 1.2941          | 0.4333   |
| 1.3679        | 8.9   | 69   | 1.2827          | 0.4667   |
| 1.3387        | 9.94  | 77   | 1.2678          | 0.5      |
| 1.2992        | 10.97 | 85   | 1.2557          | 0.4667   |
| 1.2992        | 12.0  | 93   | 1.2454          | 0.4667   |
| 1.2797        | 12.9  | 100  | 1.2345          | 0.4833   |
| 1.2507        | 13.94 | 108  | 1.2215          | 0.4833   |
| 1.2507        | 14.97 | 116  | 1.2109          | 0.5      |
| 1.2337        | 16.0  | 124  | 1.2005          | 0.5      |
| 1.2337        | 16.9  | 131  | 1.1904          | 0.5      |
| 1.2076        | 17.94 | 139  | 1.1796          | 0.5167   |
| 1.1968        | 18.97 | 147  | 1.1699          | 0.5333   |
| 1.1968        | 20.0  | 155  | 1.1610          | 0.5333   |
| 1.171         | 20.9  | 162  | 1.1544          | 0.5333   |
| 1.1572        | 21.94 | 170  | 1.1476          | 0.5333   |
| 1.1572        | 22.97 | 178  | 1.1411          | 0.5333   |
| 1.1383        | 24.0  | 186  | 1.1350          | 0.5333   |
| 1.14          | 24.9  | 193  | 1.1298          | 0.5333   |
| 1.14          | 25.94 | 201  | 1.1256          | 0.55     |
| 1.1114        | 26.97 | 209  | 1.1212          | 0.55     |
| 1.1094        | 28.0  | 217  | 1.1173          | 0.55     |
| 1.1094        | 28.9  | 224  | 1.1143          | 0.55     |
| 1.0872        | 29.94 | 232  | 1.1112          | 0.5667   |
| 1.0941        | 30.97 | 240  | 1.1078          | 0.5667   |
| 1.0941        | 32.0  | 248  | 1.1054          | 0.5667   |
| 1.0882        | 32.9  | 255  | 1.1032          | 0.5667   |
| 1.0882        | 33.94 | 263  | 1.1012          | 0.5667   |
| 1.0685        | 34.97 | 271  | 1.0998          | 0.5667   |
| 1.0775        | 36.0  | 279  | 1.0988          | 0.5667   |
| 1.0775        | 36.9  | 286  | 1.0983          | 0.5667   |
| 1.0817        | 37.94 | 294  | 1.0981          | 0.5667   |


### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2