
CGIAR-Crop-disease

This model is a fine-tuned version of gianlab/swin-tiny-patch4-window7-224-finetuned-plantdisease. The fine-tuning dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.7438
  • Accuracy: 0.6964
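
Since the base checkpoint is a Swin Transformer image-classification model, the fine-tuned weights can be loaded with the standard transformers image-classification pipeline. The snippet below is a minimal inference sketch; the repository id and image filename are placeholders, not values taken from this card.

```python
from transformers import pipeline

# Minimal inference sketch. The model id below is a hypothetical placeholder;
# replace it with the actual Hub repository id of this fine-tuned checkpoint.
classifier = pipeline(
    "image-classification",
    model="your-username/CGIAR-Crop-disease",
)

# Classify a local crop/leaf image; the label set depends on the
# (unspecified) dataset the model was fine-tuned on.
predictions = classifier("leaf.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.3f}")
```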

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 40
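
For reference, these settings map roughly onto the following transformers TrainingArguments. This is a sketch assuming the standard Trainer API was used; the output directory is an assumption and not part of the original training script.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed above.
# The output_dir is a hypothetical placeholder.
training_args = TrainingArguments(
    output_dir="swin-tiny-cgiar-crop-disease",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=40,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 corresponds to the
    # library's default optimizer settings, so no explicit optimizer
    # arguments are required here.
)
```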

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.0386        | 1.0   | 652   | 0.9385          | 0.5669   |
| 0.9619        | 2.0   | 1304  | 0.9422          | 0.5811   |
| 0.9193        | 3.0   | 1956  | 0.8806          | 0.6348   |
| 0.8876        | 4.0   | 2608  | 0.8703          | 0.6488   |
| 0.8777        | 5.0   | 3260  | 0.8361          | 0.6607   |
| 0.863         | 6.0   | 3912  | 0.8543          | 0.6417   |
| 0.8316        | 7.0   | 4564  | 0.8101          | 0.6607   |
| 0.8301        | 8.0   | 5216  | 0.8197          | 0.6609   |
| 0.8264        | 9.0   | 5868  | 0.8111          | 0.6720   |
| 0.8283        | 10.0  | 6520  | 0.8065          | 0.6669   |
| 0.816         | 11.0  | 7172  | 0.8115          | 0.6578   |
| 0.8263        | 12.0  | 7824  | 0.8029          | 0.6753   |
| 0.8017        | 13.0  | 8476  | 0.7929          | 0.6707   |
| 0.8005        | 14.0  | 9128  | 0.8025          | 0.6661   |
| 0.7989        | 15.0  | 9780  | 0.8153          | 0.6594   |
| 0.7961        | 16.0  | 10432 | 0.8033          | 0.6720   |
| 0.7769        | 17.0  | 11084 | 0.7879          | 0.6682   |
| 0.7757        | 18.0  | 11736 | 0.7868          | 0.6732   |
| 0.7713        | 19.0  | 12388 | 0.7773          | 0.6747   |
| 0.7638        | 20.0  | 13040 | 0.7678          | 0.6811   |
| 0.7645        | 21.0  | 13692 | 0.7826          | 0.6795   |
| 0.7497        | 22.0  | 14344 | 0.7931          | 0.6807   |
| 0.761         | 23.0  | 14996 | 0.7719          | 0.6820   |
| 0.7486        | 24.0  | 15648 | 0.7641          | 0.6895   |
| 0.7446        | 25.0  | 16300 | 0.7686          | 0.6832   |
| 0.7418        | 26.0  | 16952 | 0.7683          | 0.6904   |
| 0.7344        | 27.0  | 17604 | 0.7549          | 0.6895   |
| 0.7369        | 28.0  | 18256 | 0.7501          | 0.6891   |
| 0.7238        | 29.0  | 18908 | 0.7454          | 0.6933   |
| 0.7264        | 30.0  | 19560 | 0.7565          | 0.6876   |
| 0.7185        | 31.0  | 20212 | 0.7524          | 0.6880   |
| 0.7112        | 32.0  | 20864 | 0.7712          | 0.6807   |
| 0.7073        | 33.0  | 21516 | 0.7532          | 0.6897   |
| 0.7102        | 34.0  | 22168 | 0.7457          | 0.6960   |
| 0.7053        | 35.0  | 22820 | 0.7438          | 0.6964   |
| 0.6979        | 36.0  | 23472 | 0.7449          | 0.6933   |
| 0.6973        | 37.0  | 24124 | 0.7477          | 0.6929   |
| 0.6967        | 38.0  | 24776 | 0.7508          | 0.6926   |
| 0.6939        | 39.0  | 25428 | 0.7481          | 0.6933   |
| 0.6936        | 40.0  | 26080 | 0.7460          | 0.6968   |

Framework versions

  • Transformers 4.37.1
  • PyTorch 2.0.0
  • Datasets 2.16.1
  • Tokenizers 0.15.0