
plant-seedlings-freeze-0-6-aug-3-whole-data-train

This model is a fine-tuned version of google/vit-base-patch16-224 on an imagefolder-formatted dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 0.0001
  • Accuracy: 1.0
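
The base card provides no usage example, so here is a minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub. The "your-username" namespace and the image path are placeholders, not taken from the card:

```python
from transformers import pipeline

# Placeholder repo id: substitute the namespace this checkpoint is
# actually published under.
classifier = pipeline(
    "image-classification",
    model="your-username/plant-seedlings-freeze-0-6-aug-3-whole-data-train",
)

# Returns a list of {"label": ..., "score": ...} dicts, best guess first.
print(classifier("path/to/seedling.jpg"))
```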

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
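
The card names only the imagefolder format. As a hedged illustration (the directory path below is a placeholder, since the card does not describe the data layout), a class-per-subdirectory image tree loads with the datasets library like this:

```python
from datasets import load_dataset

# imagefolder infers class labels from subdirectory names.
dataset = load_dataset("imagefolder", data_dir="path/to/plant_seedlings")
print(dataset["train"].features["label"].names)
```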

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 0.0002
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
  • mixed_precision_training: Native AMP
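
For reproducibility, the listed values map onto TrainingArguments roughly as sketched below. This is not the author's script: output_dir, num_labels, and the layer-freezing loop (suggested only by the "freeze-0-6" fragment of the model name) are assumptions.

```python
from transformers import TrainingArguments, ViTForImageClassification

training_args = TrainingArguments(
    output_dir="plant-seedlings-freeze-0-6-aug-3-whole-data-train",  # assumed
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,  # "Native AMP" mixed precision
)

# num_labels=12 assumes the 12-class Plant Seedlings task; the card does
# not state the label count.
model = ViTForImageClassification.from_pretrained(
    "google/vit-base-patch16-224",
    num_labels=12,
    ignore_mismatched_sizes=True,
)

# "freeze-0-6" in the model name suggests encoder blocks 0-6 were kept
# frozen; this loop is inferred from the name, not documented in the card.
for i, block in enumerate(model.vit.encoder.layer):
    if i <= 6:
        for param in block.parameters():
            param.requires_grad = False
```

Passing these arguments and model to a Trainer, together with the imagefolder splits above, would reproduce the stated schedule.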

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6146 | 0.16 | 100 | 0.0402 | 1.0 |
| 0.6062 | 0.31 | 200 | 0.2393 | 1.0 |
| 0.4847 | 0.47 | 300 | 0.2164 | 1.0 |
| 0.5282 | 0.63 | 400 | 0.0427 | 1.0 |
| 0.4153 | 0.79 | 500 | 1.0996 | 0.0 |
| 0.3295 | 0.94 | 600 | 0.0499 | 1.0 |
| 0.3541 | 1.1 | 700 | 0.0009 | 1.0 |
| 0.4617 | 1.26 | 800 | 0.0106 | 1.0 |
| 0.3014 | 1.42 | 900 | 0.0045 | 1.0 |
| 0.3558 | 1.57 | 1000 | 0.0038 | 1.0 |
| 0.2357 | 1.73 | 1100 | 0.0140 | 1.0 |
| 0.3055 | 1.89 | 1200 | 0.0809 | 1.0 |
| 0.3278 | 2.04 | 1300 | 0.0077 | 1.0 |
| 0.3075 | 2.2 | 1400 | 0.0059 | 1.0 |
| 0.3462 | 2.36 | 1500 | 0.0377 | 1.0 |
| 0.2968 | 2.52 | 1600 | 0.0082 | 1.0 |
| 0.3392 | 2.67 | 1700 | 0.8628 | 0.0 |
| 0.2155 | 2.83 | 1800 | 0.0022 | 1.0 |
| 0.3521 | 2.99 | 1900 | 0.0671 | 1.0 |
| 0.3968 | 3.14 | 2000 | 0.0014 | 1.0 |
| 0.32 | 3.3 | 2100 | 0.0075 | 1.0 |
| 0.1787 | 3.46 | 2200 | 0.0015 | 1.0 |
| 0.2598 | 3.62 | 2300 | 0.0086 | 1.0 |
| 0.3424 | 3.77 | 2400 | 0.0008 | 1.0 |
| 0.2371 | 3.93 | 2500 | 0.0054 | 1.0 |
| 0.2773 | 4.09 | 2600 | 0.0028 | 1.0 |
| 0.3192 | 4.25 | 2700 | 0.0088 | 1.0 |
| 0.2173 | 4.4 | 2800 | 0.1174 | 1.0 |
| 0.2181 | 4.56 | 2900 | 0.0056 | 1.0 |
| 0.2476 | 4.72 | 3000 | 0.0006 | 1.0 |
| 0.2417 | 4.87 | 3100 | 0.0005 | 1.0 |
| 0.1915 | 5.03 | 3200 | 0.0002 | 1.0 |
| 0.149 | 5.19 | 3300 | 0.0004 | 1.0 |
| 0.1618 | 5.35 | 3400 | 0.1542 | 1.0 |
| 0.1752 | 5.5 | 3500 | 0.0001 | 1.0 |
| 0.1094 | 5.66 | 3600 | 0.0045 | 1.0 |
| 0.2532 | 5.82 | 3700 | 0.0016 | 1.0 |
| 0.1606 | 5.97 | 3800 | 0.0004 | 1.0 |
| 0.1781 | 6.13 | 3900 | 0.0007 | 1.0 |
| 0.1459 | 6.29 | 4000 | 0.0003 | 1.0 |
| 0.2357 | 6.45 | 4100 | 2.6113 | 0.0 |
| 0.2524 | 6.6 | 4200 | 0.0003 | 1.0 |
| 0.1708 | 6.76 | 4300 | 0.0006 | 1.0 |
| 0.1875 | 6.92 | 4400 | 0.0011 | 1.0 |
| 0.1462 | 7.08 | 4500 | 0.0004 | 1.0 |
| 0.1534 | 7.23 | 4600 | 0.0002 | 1.0 |
| 0.2834 | 7.39 | 4700 | 0.0003 | 1.0 |
| 0.2264 | 7.55 | 4800 | 0.0001 | 1.0 |
| 0.1007 | 7.7 | 4900 | 0.0001 | 1.0 |
| 0.2376 | 7.86 | 5000 | 0.0006 | 1.0 |
| 0.2233 | 8.02 | 5100 | 0.0002 | 1.0 |
| 0.1804 | 8.18 | 5200 | 0.0034 | 1.0 |
| 0.185 | 8.33 | 5300 | 0.0002 | 1.0 |
| 0.1149 | 8.49 | 5400 | 0.0007 | 1.0 |
| 0.2048 | 8.65 | 5500 | 0.0009 | 1.0 |
| 0.0786 | 8.81 | 5600 | 0.9478 | 0.0 |
| 0.2222 | 8.96 | 5700 | 0.0007 | 1.0 |
| 0.1289 | 9.12 | 5800 | 0.0009 | 1.0 |
| 0.2248 | 9.28 | 5900 | 0.0005 | 1.0 |
| 0.0987 | 9.43 | 6000 | 0.0002 | 1.0 |
| 0.2897 | 9.59 | 6100 | 0.0002 | 1.0 |
| 0.2023 | 9.75 | 6200 | 0.0042 | 1.0 |
| 0.1481 | 9.91 | 6300 | 0.0003 | 1.0 |
| 0.1224 | 10.06 | 6400 | 0.0009 | 1.0 |
| 0.1353 | 10.22 | 6500 | 0.0080 | 1.0 |
| 0.0659 | 10.38 | 6600 | 0.0006 | 1.0 |
| 0.1692 | 10.53 | 6700 | 0.0005 | 1.0 |
| 0.1713 | 10.69 | 6800 | 0.0006 | 1.0 |
| 0.1131 | 10.85 | 6900 | 0.0012 | 1.0 |
| 0.2325 | 11.01 | 7000 | 0.0003 | 1.0 |
| 0.0817 | 11.16 | 7100 | 0.0003 | 1.0 |
| 0.1854 | 11.32 | 7200 | 0.0001 | 1.0 |
| 0.0956 | 11.48 | 7300 | 0.0002 | 1.0 |
| 0.0758 | 11.64 | 7400 | 0.0127 | 1.0 |
| 0.0928 | 11.79 | 7500 | 0.0002 | 1.0 |
| 0.1563 | 11.95 | 7600 | 0.0004 | 1.0 |
| 0.0596 | 12.11 | 7700 | 0.0003 | 1.0 |
| 0.1266 | 12.26 | 7800 | 0.0031 | 1.0 |
| 0.1788 | 12.42 | 7900 | 0.0002 | 1.0 |
| 0.1663 | 12.58 | 8000 | 0.0071 | 1.0 |
| 0.064 | 12.74 | 8100 | 0.0003 | 1.0 |
| 0.1459 | 12.89 | 8200 | 0.0005 | 1.0 |
| 0.1237 | 13.05 | 8300 | 0.0001 | 1.0 |
| 0.1334 | 13.21 | 8400 | 0.0001 | 1.0 |
| 0.0802 | 13.36 | 8500 | 0.0001 | 1.0 |
| 0.1418 | 13.52 | 8600 | 0.0000 | 1.0 |
| 0.048 | 13.68 | 8700 | 0.0001 | 1.0 |
| 0.1267 | 13.84 | 8800 | 0.0121 | 1.0 |
| 0.1298 | 13.99 | 8900 | 0.0001 | 1.0 |
| 0.16 | 14.15 | 9000 | 0.0001 | 1.0 |
| 0.1295 | 14.31 | 9100 | 0.0001 | 1.0 |
| 0.1714 | 14.47 | 9200 | 0.0001 | 1.0 |
| 0.1377 | 14.62 | 9300 | 0.0001 | 1.0 |
| 0.1336 | 14.78 | 9400 | 0.0001 | 1.0 |
| 0.1293 | 14.94 | 9500 | 0.0001 | 1.0 |
| 0.111 | 15.09 | 9600 | 0.0001 | 1.0 |
| 0.0818 | 15.25 | 9700 | 0.0000 | 1.0 |
| 0.1884 | 15.41 | 9800 | 0.0001 | 1.0 |
| 0.1004 | 15.57 | 9900 | 0.0002 | 1.0 |
| 0.1029 | 15.72 | 10000 | 0.0000 | 1.0 |
| 0.0772 | 15.88 | 10100 | 0.0000 | 1.0 |
| 0.1573 | 16.04 | 10200 | 0.0001 | 1.0 |
| 0.0748 | 16.19 | 10300 | 0.0001 | 1.0 |
| 0.088 | 16.35 | 10400 | 0.0001 | 1.0 |
| 0.1062 | 16.51 | 10500 | 0.0001 | 1.0 |
| 0.0237 | 16.67 | 10600 | 0.0001 | 1.0 |
| 0.0729 | 16.82 | 10700 | 0.0000 | 1.0 |
| 0.1028 | 16.98 | 10800 | 0.0001 | 1.0 |
| 0.0423 | 17.14 | 10900 | 0.0000 | 1.0 |
| 0.0922 | 17.3 | 11000 | 0.0002 | 1.0 |
| 0.0788 | 17.45 | 11100 | 0.0001 | 1.0 |
| 0.0357 | 17.61 | 11200 | 0.0001 | 1.0 |
| 0.0519 | 17.77 | 11300 | 0.0000 | 1.0 |
| 0.108 | 17.92 | 11400 | 0.0001 | 1.0 |
| 0.1746 | 18.08 | 11500 | 0.1221 | 1.0 |
| 0.1 | 18.24 | 11600 | 0.0006 | 1.0 |
| 0.0798 | 18.4 | 11700 | 0.0001 | 1.0 |
| 0.0118 | 18.55 | 11800 | 0.0001 | 1.0 |
| 0.1151 | 18.71 | 11900 | 0.0001 | 1.0 |
| 0.0617 | 18.87 | 12000 | 0.0001 | 1.0 |
| 0.1577 | 19.03 | 12100 | 0.0001 | 1.0 |
| 0.1928 | 19.18 | 12200 | 0.0001 | 1.0 |
| 0.0462 | 19.34 | 12300 | 0.0001 | 1.0 |
| 0.0461 | 19.5 | 12400 | 0.3145 | 1.0 |
| 0.0454 | 19.65 | 12500 | 0.0001 | 1.0 |
| 0.0637 | 19.81 | 12600 | 0.0001 | 1.0 |
| 0.0733 | 19.97 | 12700 | 0.0001 | 1.0 |

Framework versions

  • Transformers 4.28.1
  • Pytorch 2.0.0+cu118
  • Datasets 2.11.0
  • Tokenizers 0.13.3