ebayes/tree-crown-latest

This model is a fine-tuned version of google/vit-base-patch16-224-in21k, trained on an image dataset loaded with the Hugging Face Datasets generic imagefolder builder. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.6589
  • Accuracy: 0.8636
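
The card ships no usage snippet. A minimal inference sketch with the Transformers pipeline API might look like the following; the image path is a placeholder, and the labels printed are whatever class names were stored in the checkpoint config:

```python
from transformers import pipeline
from PIL import Image

# Load the fine-tuned checkpoint directly from the Hub.
classifier = pipeline("image-classification", model="ebayes/tree-crown-latest")

# "example.jpg" is a placeholder; substitute any RGB image of interest.
image = Image.open("example.jpg")

# The pipeline returns the top classes with their softmax scores.
for pred in classifier(image):
    print(f"{pred['label']}: {pred['score']:.4f}")
```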

Model description

Not documented by the authors. What the card itself establishes: this is a ViT-Base image classifier (16×16 patches, 224×224 input, roughly 85.8M parameters stored as F32 safetensors) fine-tuned from google/vit-base-patch16-224-in21k.

Intended uses & limitations

More information needed

Training and evaluation data

Not documented by the authors. The training log below implies an evaluation set of about 44 images (every reported accuracy is a multiple of 1/44) and a training set of roughly 350 to 360 images (36 optimizer steps per epoch at a batch size of 10). A loading sketch for an imagefolder-style dataset follows.
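
The imagefolder builder expects one sub-directory per class. A hedged sketch, assuming a hypothetical data/tree-crowns directory (the real location and class names are undocumented):

```python
from datasets import load_dataset

# Hypothetical layout: data/tree-crowns/<class_name>/<image>.jpg
# The generic imagefolder builder turns folder names into a ClassLabel.
dataset = load_dataset("imagefolder", data_dir="data/tree-crowns")

print(dataset)                    # DatasetDict, usually with a single "train" split
print(dataset["train"].features)  # {"image": Image(...), "label": ClassLabel(...)}
```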

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 10
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
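
A minimal sketch of how these settings map onto the Transformers Trainer API, assuming the hypothetical data directory from above, a 90/10 train/eval split, and an accuracy metric (none of which the card documents). Trainer's default optimizer is AdamW with betas=(0.9, 0.999) and eps=1e-8, which matches the Adam settings listed:

```python
import numpy as np
import torch
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

BASE = "google/vit-base-patch16-224-in21k"

# Pinned versions from "Framework versions" below:
#   pip install transformers==4.41.2 torch==2.3.0 datasets==2.20.0 tokenizers==0.19.1

# Hypothetical data directory; the real train/eval split is undocumented.
ds = load_dataset("imagefolder", data_dir="data/tree-crowns")
splits = ds["train"].train_test_split(test_size=0.1, seed=42)
labels = splits["train"].features["label"].names

processor = AutoImageProcessor.from_pretrained(BASE)

def transform(batch):
    # Resize and normalize images to the 224x224 input ViT expects.
    inputs = processor([img.convert("RGB") for img in batch["image"]], return_tensors="pt")
    inputs["labels"] = batch["label"]
    return inputs

train_ds = splits["train"].with_transform(transform)
eval_ds = splits["test"].with_transform(transform)

def collate(examples):
    return {
        "pixel_values": torch.stack([e["pixel_values"] for e in examples]),
        "labels": torch.tensor([e["labels"] for e in examples]),
    }

def accuracy(eval_pred):
    preds = np.argmax(eval_pred.predictions, axis=1)
    return {"accuracy": float((preds == eval_pred.label_ids).mean())}

model = AutoModelForImageClassification.from_pretrained(
    BASE,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={name: i for i, name in enumerate(labels)},
)

# Mirrors the hyperparameter list above; everything else is left at defaults.
args = TrainingArguments(
    output_dir="tree-crown-latest",
    learning_rate=2e-5,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=4,
    num_train_epochs=150,
    lr_scheduler_type="linear",
    seed=42,
    eval_strategy="epoch",        # `evaluation_strategy` on Transformers < 4.41
    remove_unused_columns=False,  # keep the raw "image" column for the transform
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    data_collator=collate,
    compute_metrics=accuracy,
)
trainer.train()
```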

Training results

Across the run, validation loss bottoms out at 0.4997 (epoch 84) and accuracy peaks at 0.9091 (epoch 106); the epoch-150 values reported at the top of the card (0.6589 loss, 0.8636 accuracy) come from the final checkpoint, not the best-scoring one.

Training Loss | Epoch | Step | Validation Loss | Accuracy
No log 1.0 36 1.5994 0.6364
No log 2.0 72 1.2587 0.6818
No log 3.0 108 1.0993 0.7045
No log 4.0 144 0.9721 0.7955
No log 5.0 180 0.9282 0.7955
No log 6.0 216 0.8947 0.7955
No log 7.0 252 0.8858 0.7727
No log 8.0 288 0.8159 0.7955
No log 9.0 324 0.7772 0.7727
No log 10.0 360 0.7519 0.7955
No log 11.0 396 0.6982 0.7955
No log 12.0 432 0.6639 0.7955
No log 13.0 468 0.6690 0.8409
0.6601 14.0 504 0.6565 0.8409
0.6601 15.0 540 0.6401 0.8409
0.6601 16.0 576 0.5868 0.8864
0.6601 17.0 612 0.5840 0.8864
0.6601 18.0 648 0.6214 0.8409
0.6601 19.0 684 0.6447 0.8636
0.6601 20.0 720 0.6387 0.8409
0.6601 21.0 756 0.5714 0.8636
0.6601 22.0 792 0.5483 0.8864
0.6601 23.0 828 0.5600 0.8864
0.6601 24.0 864 0.5785 0.8864
0.6601 25.0 900 0.5806 0.8864
0.6601 26.0 936 0.5598 0.8636
0.6601 27.0 972 0.5549 0.8864
0.1909 28.0 1008 0.5145 0.8864
0.1909 29.0 1044 0.5294 0.8636
0.1909 30.0 1080 0.5846 0.8636
0.1909 31.0 1116 0.5347 0.8864
0.1909 32.0 1152 0.5251 0.8864
0.1909 33.0 1188 0.5193 0.8864
0.1909 34.0 1224 0.6406 0.8409
0.1909 35.0 1260 0.5039 0.8864
0.1909 36.0 1296 0.5137 0.8864
0.1909 37.0 1332 0.6023 0.8636
0.1909 38.0 1368 0.5625 0.8864
0.1909 39.0 1404 0.5752 0.8864
0.1909 40.0 1440 0.5903 0.8864
0.1909 41.0 1476 0.5143 0.8864
0.0968 42.0 1512 0.5261 0.8864
0.0968 43.0 1548 0.5942 0.8864
0.0968 44.0 1584 0.6026 0.8636
0.0968 45.0 1620 0.5638 0.8864
0.0968 46.0 1656 0.6019 0.8864
0.0968 47.0 1692 0.5953 0.8864
0.0968 48.0 1728 0.6043 0.8864
0.0968 49.0 1764 0.5866 0.8864
0.0968 50.0 1800 0.5090 0.8864
0.0968 51.0 1836 0.5704 0.8864
0.0968 52.0 1872 0.5579 0.8636
0.0968 53.0 1908 0.5058 0.8864
0.0968 54.0 1944 0.5418 0.8864
0.0968 55.0 1980 0.5708 0.8864
0.0656 56.0 2016 0.5818 0.8864
0.0656 57.0 2052 0.5539 0.8864
0.0656 58.0 2088 0.5849 0.8864
0.0656 59.0 2124 0.5396 0.8864
0.0656 60.0 2160 0.5631 0.8864
0.0656 61.0 2196 0.5919 0.8864
0.0656 62.0 2232 0.5955 0.8864
0.0656 63.0 2268 0.5438 0.8864
0.0656 64.0 2304 0.5989 0.8636
0.0656 65.0 2340 0.5062 0.8864
0.0656 66.0 2376 0.5820 0.8636
0.0656 67.0 2412 0.5301 0.8864
0.0656 68.0 2448 0.6138 0.8864
0.0656 69.0 2484 0.5710 0.8636
0.0491 70.0 2520 0.6141 0.8636
0.0491 71.0 2556 0.6304 0.8636
0.0491 72.0 2592 0.5568 0.8636
0.0491 73.0 2628 0.6437 0.8636
0.0491 74.0 2664 0.5329 0.8864
0.0491 75.0 2700 0.6453 0.8864
0.0491 76.0 2736 0.6267 0.8636
0.0491 77.0 2772 0.6246 0.8636
0.0491 78.0 2808 0.6408 0.8636
0.0491 79.0 2844 0.6208 0.8636
0.0491 80.0 2880 0.5944 0.8636
0.0491 81.0 2916 0.6848 0.8636
0.0491 82.0 2952 0.6700 0.8409
0.0491 83.0 2988 0.5625 0.8864
0.0474 84.0 3024 0.4997 0.8864
0.0474 85.0 3060 0.6110 0.8864
0.0474 86.0 3096 0.5661 0.8864
0.0474 87.0 3132 0.5681 0.8864
0.0474 88.0 3168 0.5794 0.8636
0.0474 89.0 3204 0.6098 0.8864
0.0474 90.0 3240 0.6009 0.8636
0.0474 91.0 3276 0.5000 0.8864
0.0474 92.0 3312 0.5285 0.8864
0.0474 93.0 3348 0.5774 0.8864
0.0474 94.0 3384 0.5896 0.8864
0.0474 95.0 3420 0.5478 0.8864
0.0474 96.0 3456 0.5815 0.8864
0.0474 97.0 3492 0.5675 0.8864
0.0393 98.0 3528 0.5773 0.8864
0.0393 99.0 3564 0.6099 0.8864
0.0393 100.0 3600 0.7255 0.8409
0.0393 101.0 3636 0.6300 0.8864
0.0393 102.0 3672 0.5979 0.8409
0.0393 103.0 3708 0.6031 0.8864
0.0393 104.0 3744 0.6200 0.8864
0.0393 105.0 3780 0.6120 0.8864
0.0393 106.0 3816 0.5514 0.9091
0.0393 107.0 3852 0.6425 0.8864
0.0393 108.0 3888 0.6152 0.8864
0.0393 109.0 3924 0.6023 0.8864
0.0393 110.0 3960 0.6170 0.8864
0.0393 111.0 3996 0.6556 0.8864
0.0404 112.0 4032 0.6380 0.8864
0.0404 113.0 4068 0.6216 0.8864
0.0404 114.0 4104 0.5775 0.8864
0.0404 115.0 4140 0.6120 0.8864
0.0404 116.0 4176 0.6221 0.8864
0.0404 117.0 4212 0.6807 0.8636
0.0404 118.0 4248 0.6805 0.8636
0.0404 119.0 4284 0.6660 0.8636
0.0404 120.0 4320 0.6626 0.8636
0.0404 121.0 4356 0.6656 0.8636
0.0404 122.0 4392 0.6151 0.8636
0.0404 123.0 4428 0.6525 0.8636
0.0404 124.0 4464 0.6311 0.8636
0.0268 125.0 4500 0.6375 0.8636
0.0268 126.0 4536 0.6252 0.8636
0.0268 127.0 4572 0.6182 0.8409
0.0268 128.0 4608 0.6195 0.8636
0.0268 129.0 4644 0.6417 0.8636
0.0268 130.0 4680 0.6440 0.8636
0.0268 131.0 4716 0.6726 0.8636
0.0268 132.0 4752 0.6781 0.8636
0.0268 133.0 4788 0.6412 0.8636
0.0268 134.0 4824 0.6514 0.8636
0.0268 135.0 4860 0.6452 0.8636
0.0268 136.0 4896 0.6453 0.8864
0.0268 137.0 4932 0.6408 0.8864
0.0268 138.0 4968 0.6461 0.8864
0.0244 139.0 5004 0.6597 0.8864
0.0244 140.0 5040 0.6539 0.8864
0.0244 141.0 5076 0.6415 0.8864
0.0244 142.0 5112 0.6438 0.8864
0.0244 143.0 5148 0.6581 0.8636
0.0244 144.0 5184 0.6570 0.8636
0.0244 145.0 5220 0.6626 0.8636
0.0244 146.0 5256 0.6622 0.8636
0.0244 147.0 5292 0.6647 0.8636
0.0244 148.0 5328 0.6619 0.8636
0.0244 149.0 5364 0.6591 0.8636
0.0244 150.0 5400 0.6589 0.8636

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1