
swinv2-tiny-patch4-window8-256-dmae-va-U5-42B

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9637
  • Accuracy: 0.6667
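
A minimal inference sketch is shown below. It assumes the checkpoint is loaded from this repository (Augusto777/swinv2-tiny-patch4-window8-256-dmae-va-U5-42B); the image path is a placeholder, and the label names come from whatever dataset the model was fine-tuned on.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repo id as published; swap in a local path if you use your own checkpoint.
model_id = "Augusto777/swinv2-tiny-patch4-window8-256-dmae-va-U5-42B"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

# "example.png" is a placeholder input image.
image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```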

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 4e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 42
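
As a sketch of how these settings map onto the Trainer API, the snippet below reconstructs the equivalent TrainingArguments. The output directory, evaluation/save strategies, and best-model selection are assumptions not stated in this card; the Adam betas and epsilon listed above are the library defaults, so they need no explicit flags.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-va-U5-42B",
    learning_rate=4e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # effective train batch size: 32 * 4 = 128
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=42,
    # Assumptions: per-epoch evaluation (consistent with the results table)
    # and best-checkpoint selection by accuracy.
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
)
```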

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 0.9   | 7    | 7.8663          | 0.1167   |
| 6.936         | 1.94  | 15   | 7.7572          | 0.1167   |
| 6.936         | 2.97  | 23   | 7.1790          | 0.1167   |
| 6.7016        | 4.0   | 31   | 5.9033          | 0.1167   |
| 5.5439        | 4.9   | 38   | 4.6116          | 0.1167   |
| 5.5439        | 5.94  | 46   | 3.2830          | 0.1167   |
| 3.6477        | 6.97  | 54   | 2.2014          | 0.1167   |
| 2.2506        | 8.0   | 62   | 1.5647          | 0.45     |
| 2.2506        | 8.9   | 69   | 1.3160          | 0.45     |
| 1.5088        | 9.94  | 77   | 1.3676          | 0.3333   |
| 1.3868        | 10.97 | 85   | 1.3390          | 0.45     |
| 1.3868        | 12.0  | 93   | 1.3223          | 0.3833   |
| 1.351         | 12.9  | 100  | 1.3156          | 0.45     |
| 1.3271        | 13.94 | 108  | 1.3485          | 0.4833   |
| 1.3271        | 14.97 | 116  | 1.2646          | 0.4833   |
| 1.2322        | 16.0  | 124  | 1.2308          | 0.4833   |
| 1.2322        | 16.9  | 131  | 1.2160          | 0.5      |
| 1.22          | 17.94 | 139  | 1.2015          | 0.5      |
| 1.1899        | 18.97 | 147  | 1.2008          | 0.5      |
| 1.1899        | 20.0  | 155  | 1.1606          | 0.5      |
| 1.109         | 20.9  | 162  | 1.1182          | 0.5667   |
| 1.0603        | 21.94 | 170  | 1.0855          | 0.5333   |
| 1.0603        | 22.97 | 178  | 1.0763          | 0.5667   |
| 1.0264        | 24.0  | 186  | 1.1153          | 0.5833   |
| 1.0086        | 24.9  | 193  | 1.0770          | 0.65     |
| 1.0086        | 25.94 | 201  | 1.0041          | 0.6167   |
| 0.9301        | 26.97 | 209  | 0.9637          | 0.6667   |
| 0.9077        | 28.0  | 217  | 0.9824          | 0.5833   |
| 0.9077        | 28.9  | 224  | 0.9485          | 0.6      |
| 0.8725        | 29.94 | 232  | 0.9294          | 0.6167   |
| 0.8203        | 30.97 | 240  | 0.9348          | 0.6167   |
| 0.8203        | 32.0  | 248  | 0.9295          | 0.6      |
| 0.8211        | 32.9  | 255  | 0.9167          | 0.6      |
| 0.8211        | 33.94 | 263  | 0.9281          | 0.5833   |
| 0.7916        | 34.97 | 271  | 0.8803          | 0.6333   |
| 0.7822        | 36.0  | 279  | 0.8785          | 0.6333   |
| 0.7822        | 36.9  | 286  | 0.8906          | 0.6      |
| 0.7937        | 37.94 | 294  | 0.8899          | 0.6      |

Framework versions

  • Transformers 4.36.2
  • PyTorch 2.1.2+cu118
  • Datasets 2.16.1
  • Tokenizers 0.15.0