---
license: apache-2.0
base_model: gianlab/swin-tiny-patch4-window7-224-finetuned-plantdisease
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: CGIAR-Crop-disease
    results: []
---

# CGIAR-Crop-disease

This model is a fine-tuned version of [gianlab/swin-tiny-patch4-window7-224-finetuned-plantdisease](https://huggingface.co/gianlab/swin-tiny-patch4-window7-224-finetuned-plantdisease) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.7448
- Accuracy: 0.6857

## Model description

More information needed

## Intended uses & limitations

More information needed
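
No intended-use details are given in the card. As an image-classification checkpoint, the model can be loaded with the standard Transformers `pipeline`; the sketch below is only an illustration, and both the Hub repo id (`CGIAR-Crop-disease`) and the input file name are placeholders, not values confirmed by this card.

```python
from transformers import pipeline
from PIL import Image

# Placeholder repo id: replace with the actual namespace/model path on the Hub.
classifier = pipeline("image-classification", model="CGIAR-Crop-disease")

# Hypothetical input: a photo of a crop leaf to classify.
image = Image.open("leaf.jpg")

for prediction in classifier(image):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```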

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 30
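
The values above correspond roughly to the `TrainingArguments` sketch below. This is a reconstruction from the list, not the original training script; the output directory, the single-device batch-size reading, and the per-epoch evaluation setting are assumptions.

```python
from transformers import TrainingArguments

# Sketch of a configuration matching the listed hyperparameters
# (output_dir is a placeholder; the original training script is not provided).
training_args = TrainingArguments(
    output_dir="CGIAR-Crop-disease",
    learning_rate=1e-3,
    per_device_train_batch_size=32,   # assumes single-device training
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=30,
    evaluation_strategy="epoch",      # assumption, consistent with the per-epoch results table
)
```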

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.9931        | 1.0   | 652   | 0.8450          | 0.6346   |
| 0.9034        | 2.0   | 1304  | 0.8367          | 0.6456   |
| 0.8734        | 3.0   | 1956  | 0.8165          | 0.6601   |
| 0.851         | 4.0   | 2608  | 0.8982          | 0.6047   |
| 0.8444        | 5.0   | 3260  | 0.8000          | 0.6626   |
| 0.8261        | 6.0   | 3912  | 0.8339          | 0.6321   |
| 0.8262        | 7.0   | 4564  | 0.7984          | 0.6613   |
| 0.8152        | 8.0   | 5216  | 0.7859          | 0.6740   |
| 0.8081        | 9.0   | 5868  | 0.8387          | 0.6400   |
| 0.8012        | 10.0  | 6520  | 0.8229          | 0.6463   |
| 0.7957        | 11.0  | 7172  | 0.7807          | 0.6715   |
| 0.7975        | 12.0  | 7824  | 0.7752          | 0.6816   |
| 0.7885        | 13.0  | 8476  | 0.7885          | 0.6694   |
| 0.7896        | 14.0  | 9128  | 0.7806          | 0.6774   |
| 0.7871        | 15.0  | 9780  | 0.7713          | 0.6786   |
| 0.7696        | 16.0  | 10432 | 0.7881          | 0.6615   |
| 0.7742        | 17.0  | 11084 | 0.7616          | 0.6797   |
| 0.7638        | 18.0  | 11736 | 0.7509          | 0.6878   |
| 0.7655        | 19.0  | 12388 | 0.7995          | 0.6646   |
| 0.7624        | 20.0  | 13040 | 0.7712          | 0.6768   |
| 0.7544        | 21.0  | 13692 | 0.7491          | 0.6885   |
| 0.7567        | 22.0  | 14344 | 0.7472          | 0.6841   |
| 0.7487        | 23.0  | 14996 | 0.7608          | 0.6818   |
| 0.7427        | 24.0  | 15648 | 0.7494          | 0.6870   |
| 0.7468        | 25.0  | 16300 | 0.7543          | 0.6812   |
| 0.7365        | 26.0  | 16952 | 0.7494          | 0.6855   |
| 0.7328        | 27.0  | 17604 | 0.7448          | 0.6857   |
| 0.7398        | 28.0  | 18256 | 0.7461          | 0.6855   |
| 0.7266        | 29.0  | 18908 | 0.7513          | 0.6822   |
| 0.7286        | 30.0  | 19560 | 0.7456          | 0.6868   |

### Framework versions

- Transformers 4.37.1
- Pytorch 2.0.0
- Datasets 2.16.1
- Tokenizers 0.15.0