# 0.50-200Train-100Test-swinv2-large
This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12-192-22k](https://huggingface.co/microsoft/swinv2-large-patch4-window12-192-22k) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.7669
- Accuracy: 0.8233
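As a quick orientation for using the checkpoint, the sketch below loads it with the `transformers` Auto classes for image classification. The repository id, the example image path, and the label names are assumptions: use whatever id this card is published under, and note that the labels depend on the (undocumented) fine-tuning dataset.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Placeholder repo id -- replace with the actual model id on the Hub.
model_id = "your-username/0.50-200Train-100Test-swinv2-large"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

# Any RGB image; "example.jpg" is a placeholder path.
image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_idx = logits.argmax(-1).item()
print(model.config.id2label[predicted_idx])
```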
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20
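For reference, these hyperparameters map onto a `TrainingArguments` configuration roughly like the sketch below. This is an assumed reconstruction rather than the original training script; `output_dir`, the evaluation/save strategies, and the best-model selection are guesses.

```python
from transformers import TrainingArguments

# Sketch matching the hyperparameters listed above; paths and strategies are placeholders.
training_args = TrainingArguments(
    output_dir="swinv2-large-finetune",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # effective train batch size: 16 * 4 = 64
    num_train_epochs=20,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    eval_strategy="epoch",           # assumed: per-epoch eval, as in the results table
    save_strategy="epoch",
    load_best_model_at_end=True,     # assumed
    metric_for_best_model="accuracy",
)
```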
### Training results
| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 2.4602        | 0.9825  | 14   | 1.7254          | 0.4318   |
| 1.7105        | 1.9649  | 28   | 0.8579          | 0.7047   |
| 0.6096        | 2.9474  | 42   | 0.7268          | 0.7562   |
| 0.3983        | 4.0     | 57   | 0.6706          | 0.7852   |
| 0.1083        | 4.9825  | 71   | 0.7051          | 0.7897   |
| 0.0952        | 5.9649  | 85   | 0.8423          | 0.7696   |
| 0.1106        | 6.9474  | 99   | 0.6406          | 0.8121   |
| 0.0357        | 8.0     | 114  | 0.8410          | 0.7897   |
| 0.0522        | 8.9825  | 128  | 0.8197          | 0.7987   |
| 0.0274        | 9.9649  | 142  | 0.8788          | 0.8098   |
| 0.0203        | 10.9474 | 156  | 0.8037          | 0.8233   |
| 0.0361        | 12.0    | 171  | 0.7932          | 0.8076   |
| 0.0204        | 12.9825 | 185  | 0.7503          | 0.8210   |
| 0.0165        | 13.9649 | 199  | 0.7416          | 0.8098   |
| 0.0129        | 14.9474 | 213  | 0.8474          | 0.8277   |
| 0.0062        | 16.0    | 228  | 0.7788          | 0.8233   |
| 0.0028        | 16.9825 | 242  | 0.7687          | 0.8255   |
| 0.001         | 17.9649 | 256  | 0.7730          | 0.8255   |
| 0.0019        | 18.9474 | 270  | 0.7681          | 0.8255   |
| 0.0014        | 19.6491 | 280  | 0.7669          | 0.8233   |
### Framework versions
- Transformers 4.41.2
- Pytorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1