
save-model-final-fork

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window16-256 on the jbarat/plant_species dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0002
  • Accuracy: 1.0
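
The original card does not include a usage example. The snippet below is a minimal inference sketch, not taken from the card: the repository id your-username/save-model-final-fork and the image path are placeholders, and it assumes the fine-tuned checkpoint is loaded through the standard Transformers image-classification classes.

```python
# Minimal inference sketch (assumed, not from the original card).
# "your-username/save-model-final-fork" and "leaf.jpg" are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "your-username/save-model-final-fork"  # placeholder repository id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("leaf.jpg").convert("RGB")    # any plant image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```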

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

  • learning_rate: 0.0003
  • train_batch_size: 64
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 21
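
The values above map directly onto transformers.TrainingArguments. The block below is a hedged sketch of that mapping; the output directory and the per-epoch evaluation strategy are assumptions, not taken from the original training script.

```python
# Hedged sketch of the training configuration implied by the hyperparameter list.
# Only the hyperparameter values come from the card; output_dir and
# evaluation_strategy are assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="save-model-final-fork",  # assumed output directory
    learning_rate=3e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=21,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",         # assumed; the results table logs metrics once per epoch
)
```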

Training results

| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| No log        | 1.0   | 10   | 0.725    | 0.7459          |
| No log        | 2.0   | 20   | 0.875    | 0.5996          |
| No log        | 3.0   | 30   | 0.7375   | 0.8398          |
| No log        | 4.0   | 40   | 0.8125   | 0.6444          |
| No log        | 5.0   | 50   | 0.8375   | 0.6891          |
| No log        | 6.0   | 60   | 0.825    | 0.6675          |
| No log        | 7.0   | 70   | 0.8375   | 0.7300          |
| No log        | 8.0   | 80   | 0.85     | 0.8635          |
| No log        | 9.0   | 90   | 0.95     | 0.3333          |
| 0.1657        | 10.0  | 100  | 0.9375   | 0.2634          |
| 0.1657        | 11.0  | 110  | 0.9375   | 0.3821          |
| 0.1657        | 12.0  | 120  | 0.9625   | 0.2343          |
| 0.1657        | 13.0  | 130  | 0.9375   | 0.3103          |
| 0.1657        | 14.0  | 140  | 0.95     | 0.3481          |
| 0.1657        | 15.0  | 150  | 0.9625   | 0.2419          |
| 0.1657        | 16.0  | 160  | 0.9375   | 0.2408          |
| 0.1657        | 17.0  | 170  | 0.95     | 0.3202          |
| 0.1657        | 18.0  | 180  | 0.9125   | 0.4016          |
| 0.1657        | 19.0  | 190  | 0.925    | 0.3918          |
| 0.0254        | 20.0  | 200  | 0.9375   | 0.3874          |
| 0.0254        | 21.0  | 210  | 1.0      | 0.0006          |
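
The accuracy column above is typically produced by a compute_metrics callback passed to the Trainer. A minimal sketch of such a callback, assuming the evaluate library's accuracy metric (not confirmed by the card):

```python
# Hedged sketch of a per-epoch accuracy metric like the one in the table above.
# Assumes the `evaluate` library; the original metric code is not published in the card.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```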

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.1.2
  • Datasets 2.18.0
  • Tokenizers 0.15.2
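
To check that a local environment matches the versions listed above, a small verification sketch (the expected values are taken from this list):

```python
# Verify installed versions against the ones listed in this card.
import transformers, torch, datasets, tokenizers

print(transformers.__version__)  # expected 4.39.3
print(torch.__version__)         # expected 2.1.2
print(datasets.__version__)      # expected 2.18.0
print(tokenizers.__version__)    # expected 0.15.2
```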