# vit-base-patch16-224-dmae-va-U5-40-45-5e-05
This model is a fine-tuned version of google/vit-base-patch16-224 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.5841
- Accuracy: 0.8333
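
The card does not include usage code; below is a minimal inference sketch using the Transformers image-classification pipeline. The model path is a placeholder (the actual Hub repo id or local checkpoint directory for this fine-tune is not given here), and the label set depends on the undocumented training data.

```python
# Minimal inference sketch. The checkpoint path below is a placeholder;
# substitute the actual Hub repo id or local output directory of this fine-tune.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="path/to/vit-base-patch16-224-dmae-va-U5-40-45-5e-05",  # placeholder
)

image = Image.open("example.jpg")  # any RGB image
print(classifier(image))  # list of {"label": ..., "score": ...} dicts
```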
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):
- learning_rate: 4.5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 40
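
As a rough guide, the hyperparameters above map onto a Transformers `TrainingArguments` object as sketched below. This is an assumption-laden reconstruction: the dataset, image preprocessing, and evaluation/save strategies are not documented in this card.

```python
# Sketch of a TrainingArguments setup matching the listed hyperparameters.
# Assumptions: output_dir, evaluation/save strategies, and best-model selection
# are guesses; the actual dataset and preprocessing are not documented here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="vit-base-patch16-224-dmae-va-U5-40-45-5e-05",
    learning_rate=4.5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # 32 * 4 = 128 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.05,
    num_train_epochs=40,
    evaluation_strategy="epoch",     # assumption: per-epoch eval, as in the results table
    save_strategy="epoch",           # assumption
    load_best_model_at_end=True,     # assumption
    metric_for_best_model="accuracy",
)
```

The Adam settings listed above (betas=(0.9, 0.999), epsilon=1e-08) are the Transformers defaults, so they need no explicit arguments.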
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.9   | 7    | 1.3748          | 0.45     |
| 1.3825        | 1.94  | 15   | 1.3172          | 0.5      |
| 1.3327        | 2.97  | 23   | 1.2210          | 0.55     |
| 1.2072        | 4.0   | 31   | 1.1094          | 0.5833   |
| 1.2072        | 4.9   | 38   | 1.0342          | 0.5667   |
| 1.0691        | 5.94  | 46   | 0.9669          | 0.6167   |
| 0.9546        | 6.97  | 54   | 0.9254          | 0.7      |
| 0.8633        | 8.0   | 62   | 0.9027          | 0.75     |
| 0.8633        | 8.9   | 69   | 0.8455          | 0.75     |
| 0.7627        | 9.94  | 77   | 0.7551          | 0.7667   |
| 0.665         | 10.97 | 85   | 0.6982          | 0.8      |
| 0.55          | 12.0  | 93   | 0.7471          | 0.7333   |
| 0.4657        | 12.9  | 100  | 0.6946          | 0.7833   |
| 0.4657        | 13.94 | 108  | 0.5841          | 0.8333   |
| 0.3706        | 14.97 | 116  | 0.6047          | 0.8      |
| 0.3222        | 16.0  | 124  | 0.6454          | 0.7667   |
| 0.2909        | 16.9  | 131  | 0.5383          | 0.8333   |
| 0.2909        | 17.94 | 139  | 0.5574          | 0.8      |
| 0.2887        | 18.97 | 147  | 0.6286          | 0.8167   |
| 0.237         | 20.0  | 155  | 0.6517          | 0.8      |
| 0.2071        | 20.9  | 162  | 0.5069          | 0.8333   |
| 0.2076        | 21.94 | 170  | 0.6049          | 0.7833   |
| 0.2076        | 22.97 | 178  | 0.6403          | 0.7833   |
| 0.1789        | 24.0  | 186  | 0.6471          | 0.8167   |
| 0.1582        | 24.9  | 193  | 0.6160          | 0.8167   |
| 0.1508        | 25.94 | 201  | 0.6432          | 0.8      |
| 0.1508        | 26.97 | 209  | 0.5986          | 0.8333   |
| 0.1566        | 28.0  | 217  | 0.6011          | 0.8      |
| 0.122         | 28.9  | 224  | 0.5663          | 0.8333   |
| 0.1251        | 29.94 | 232  | 0.5868          | 0.8333   |
| 0.117         | 30.97 | 240  | 0.6516          | 0.8333   |
| 0.117         | 32.0  | 248  | 0.5860          | 0.8333   |
| 0.1136        | 32.9  | 255  | 0.5576          | 0.8167   |
| 0.1099        | 33.94 | 263  | 0.5949          | 0.8167   |
| 0.1336        | 34.97 | 271  | 0.6318          | 0.8167   |
| 0.1336        | 36.0  | 279  | 0.6262          | 0.8167   |
| 0.1052        | 36.13 | 280  | 0.6258          | 0.8167   |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0
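
When reproducing results, it can help to confirm the environment matches the versions listed above; the snippet below is a small optional check, not part of the original card.

```python
# Compare installed library versions against the versions listed in this card.
# A mismatch is reported as a difference, not treated as an error.
import datasets, tokenizers, torch, transformers

expected = {
    "transformers": "4.36.2",
    "torch": "2.1.2+cu118",
    "datasets": "2.16.1",
    "tokenizers": "0.15.0",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    got = installed[name]
    status = "OK" if got == want else f"differs (card lists {want})"
    print(f"{name}: {got} {status}")
```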