
ebayes/tree_crown_model-test23

This model is a fine-tuned version of google/vit-base-patch16-224-in21k, trained on an image dataset loaded with the generic imagefolder builder. It achieves the following results on the evaluation set:

  • Loss: 0.2838
  • Accuracy: 0.8696
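
The card provides no usage example. As a minimal sketch, assuming the checkpoint works with the standard transformers image-classification pipeline (the image path below is a hypothetical placeholder), inference could look like this:

```python
# Minimal inference sketch, not part of the original card: it assumes the
# checkpoint is a standard ViT image classifier hosted on the Hub, and
# "crown.png" is a hypothetical placeholder path for an input image.
from transformers import pipeline

classifier = pipeline("image-classification", model="ebayes/tree_crown_model-test23")

# The pipeline accepts a local path, URL, or PIL.Image and returns a list of
# {"label": ..., "score": ...} dicts sorted by score.
for prediction in classifier("crown.png"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```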

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 10
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
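
As a rough sketch, these values map onto Hugging Face TrainingArguments as shown below. The output directory is a hypothetical placeholder, and the per-epoch evaluation strategy is inferred from the results table rather than stated in the card:

```python
# Sketch mapping the listed hyperparameters onto TrainingArguments
# (Transformers 4.40.x). output_dir is a hypothetical placeholder and
# evaluation_strategy="epoch" is an assumption inferred from the per-epoch
# results table below.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tree_crown_model-test23",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150,
    evaluation_strategy="epoch",  # assumption
)
```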

Training results

The "No log" entries in the training-loss column mean that no training loss had been logged yet at that evaluation checkpoint, most likely because the Trainer logs every 500 steps by default and the first 45 epochs end before step 500.

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 11 | 1.9719 | 0.6522 |
| No log | 2.0 | 22 | 1.6381 | 0.6522 |
| No log | 3.0 | 33 | 1.3958 | 0.6522 |
| No log | 4.0 | 44 | 1.2541 | 0.6522 |
| No log | 5.0 | 55 | 1.1207 | 0.6957 |
| No log | 6.0 | 66 | 1.0262 | 0.8261 |
| No log | 7.0 | 77 | 0.9421 | 0.8261 |
| No log | 8.0 | 88 | 0.9031 | 0.8261 |
| No log | 9.0 | 99 | 0.8398 | 0.8261 |
| No log | 10.0 | 110 | 0.7975 | 0.8261 |
| No log | 11.0 | 121 | 0.7547 | 0.8696 |
| No log | 12.0 | 132 | 0.7451 | 0.8696 |
| No log | 13.0 | 143 | 0.7017 | 0.8696 |
| No log | 14.0 | 154 | 0.6789 | 0.8696 |
| No log | 15.0 | 165 | 0.6688 | 0.8696 |
| No log | 16.0 | 176 | 0.6809 | 0.8696 |
| No log | 17.0 | 187 | 0.6342 | 0.8696 |
| No log | 18.0 | 198 | 0.6437 | 0.8696 |
| No log | 19.0 | 209 | 0.5902 | 0.8696 |
| No log | 20.0 | 220 | 0.5874 | 0.8696 |
| No log | 21.0 | 231 | 0.6042 | 0.8696 |
| No log | 22.0 | 242 | 0.5682 | 0.8696 |
| No log | 23.0 | 253 | 0.5395 | 0.8696 |
| No log | 24.0 | 264 | 0.5487 | 0.8696 |
| No log | 25.0 | 275 | 0.5239 | 0.8696 |
| No log | 26.0 | 286 | 0.5436 | 0.8696 |
| No log | 27.0 | 297 | 0.5169 | 0.8696 |
| No log | 28.0 | 308 | 0.5189 | 0.8696 |
| No log | 29.0 | 319 | 0.5314 | 0.8261 |
| No log | 30.0 | 330 | 0.4707 | 0.8696 |
| No log | 31.0 | 341 | 0.5169 | 0.8261 |
| No log | 32.0 | 352 | 0.5229 | 0.8696 |
| No log | 33.0 | 363 | 0.4598 | 0.8696 |
| No log | 34.0 | 374 | 0.4911 | 0.8696 |
| No log | 35.0 | 385 | 0.4516 | 0.8696 |
| No log | 36.0 | 396 | 0.4121 | 0.9130 |
| No log | 37.0 | 407 | 0.4875 | 0.8696 |
| No log | 38.0 | 418 | 0.4147 | 0.9130 |
| No log | 39.0 | 429 | 0.5118 | 0.8696 |
| No log | 40.0 | 440 | 0.4266 | 0.8696 |
| No log | 41.0 | 451 | 0.4114 | 0.8696 |
| No log | 42.0 | 462 | 0.4549 | 0.8261 |
| No log | 43.0 | 473 | 0.3795 | 0.9565 |
| No log | 44.0 | 484 | 0.4286 | 0.8696 |
| No log | 45.0 | 495 | 0.4409 | 0.8696 |
| 0.6437 | 46.0 | 506 | 0.4099 | 0.8696 |
| 0.6437 | 47.0 | 517 | 0.4075 | 0.9130 |
| 0.6437 | 48.0 | 528 | 0.3886 | 0.9130 |
| 0.6437 | 49.0 | 539 | 0.3900 | 0.8696 |
| 0.6437 | 50.0 | 550 | 0.3947 | 0.8696 |
| 0.6437 | 51.0 | 561 | 0.3676 | 0.8696 |
| 0.6437 | 52.0 | 572 | 0.3560 | 0.9130 |
| 0.6437 | 53.0 | 583 | 0.4100 | 0.8696 |
| 0.6437 | 54.0 | 594 | 0.4078 | 0.8696 |
| 0.6437 | 55.0 | 605 | 0.4357 | 0.8696 |
| 0.6437 | 56.0 | 616 | 0.3815 | 0.8696 |
| 0.6437 | 57.0 | 627 | 0.4172 | 0.8696 |
| 0.6437 | 58.0 | 638 | 0.4781 | 0.8696 |
| 0.6437 | 59.0 | 649 | 0.3847 | 0.8696 |
| 0.6437 | 60.0 | 660 | 0.3260 | 0.9130 |
| 0.6437 | 61.0 | 671 | 0.3578 | 0.8696 |
| 0.6437 | 62.0 | 682 | 0.3096 | 0.9130 |
| 0.6437 | 63.0 | 693 | 0.2946 | 0.9130 |
| 0.6437 | 64.0 | 704 | 0.3383 | 0.8696 |
| 0.6437 | 65.0 | 715 | 0.3748 | 0.8696 |
| 0.6437 | 66.0 | 726 | 0.3199 | 0.9130 |
| 0.6437 | 67.0 | 737 | 0.3761 | 0.8696 |
| 0.6437 | 68.0 | 748 | 0.3332 | 0.8696 |
| 0.6437 | 69.0 | 759 | 0.2815 | 0.9130 |
| 0.6437 | 70.0 | 770 | 0.3236 | 0.8696 |
| 0.6437 | 71.0 | 781 | 0.2962 | 0.9130 |
| 0.6437 | 72.0 | 792 | 0.3075 | 0.9130 |
| 0.6437 | 73.0 | 803 | 0.3461 | 0.8696 |
| 0.6437 | 74.0 | 814 | 0.3138 | 0.9130 |
| 0.6437 | 75.0 | 825 | 0.3043 | 0.9130 |
| 0.6437 | 76.0 | 836 | 0.2967 | 0.9130 |
| 0.6437 | 77.0 | 847 | 0.3008 | 0.9130 |
| 0.6437 | 78.0 | 858 | 0.2856 | 0.9130 |
| 0.6437 | 79.0 | 869 | 0.2939 | 0.9130 |
| 0.6437 | 80.0 | 880 | 0.3491 | 0.9130 |
| 0.6437 | 81.0 | 891 | 0.3049 | 0.9130 |
| 0.6437 | 82.0 | 902 | 0.3577 | 0.8696 |
| 0.6437 | 83.0 | 913 | 0.3369 | 0.8696 |
| 0.6437 | 84.0 | 924 | 0.2952 | 0.9130 |
| 0.6437 | 85.0 | 935 | 0.2881 | 0.9130 |
| 0.6437 | 86.0 | 946 | 0.3349 | 0.8696 |
| 0.6437 | 87.0 | 957 | 0.3025 | 0.9130 |
| 0.6437 | 88.0 | 968 | 0.2943 | 0.8696 |
| 0.6437 | 89.0 | 979 | 0.3035 | 0.9130 |
| 0.6437 | 90.0 | 990 | 0.2599 | 0.9130 |
| 0.1677 | 91.0 | 1001 | 0.3061 | 0.8696 |
| 0.1677 | 92.0 | 1012 | 0.4316 | 0.8261 |
| 0.1677 | 93.0 | 1023 | 0.3431 | 0.8696 |
| 0.1677 | 94.0 | 1034 | 0.3246 | 0.8696 |
| 0.1677 | 95.0 | 1045 | 0.3256 | 0.8696 |
| 0.1677 | 96.0 | 1056 | 0.2846 | 0.9130 |
| 0.1677 | 97.0 | 1067 | 0.3077 | 0.8696 |
| 0.1677 | 98.0 | 1078 | 0.2988 | 0.9130 |
| 0.1677 | 99.0 | 1089 | 0.2957 | 0.9130 |
| 0.1677 | 100.0 | 1100 | 0.2983 | 0.9130 |
| 0.1677 | 101.0 | 1111 | 0.2908 | 0.8696 |
| 0.1677 | 102.0 | 1122 | 0.2715 | 0.9130 |
| 0.1677 | 103.0 | 1133 | 0.3208 | 0.9130 |
| 0.1677 | 104.0 | 1144 | 0.3622 | 0.8261 |
| 0.1677 | 105.0 | 1155 | 0.3314 | 0.8696 |
| 0.1677 | 106.0 | 1166 | 0.3226 | 0.9130 |
| 0.1677 | 107.0 | 1177 | 0.3009 | 0.9565 |
| 0.1677 | 108.0 | 1188 | 0.3162 | 0.9130 |
| 0.1677 | 109.0 | 1199 | 0.2927 | 0.9565 |
| 0.1677 | 110.0 | 1210 | 0.2434 | 0.9130 |
| 0.1677 | 111.0 | 1221 | 0.3389 | 0.8696 |
| 0.1677 | 112.0 | 1232 | 0.3686 | 0.8696 |
| 0.1677 | 113.0 | 1243 | 0.3192 | 0.9130 |
| 0.1677 | 114.0 | 1254 | 0.2720 | 0.8696 |
| 0.1677 | 115.0 | 1265 | 0.2955 | 0.8696 |
| 0.1677 | 116.0 | 1276 | 0.3318 | 0.9130 |
| 0.1677 | 117.0 | 1287 | 0.3248 | 0.9130 |
| 0.1677 | 118.0 | 1298 | 0.3115 | 0.8696 |
| 0.1677 | 119.0 | 1309 | 0.2711 | 0.9130 |
| 0.1677 | 120.0 | 1320 | 0.2592 | 0.8696 |
| 0.1677 | 121.0 | 1331 | 0.2830 | 0.8696 |
| 0.1677 | 122.0 | 1342 | 0.2956 | 0.9130 |
| 0.1677 | 123.0 | 1353 | 0.3158 | 0.9130 |
| 0.1677 | 124.0 | 1364 | 0.3328 | 0.8696 |
| 0.1677 | 125.0 | 1375 | 0.3487 | 0.8696 |
| 0.1677 | 126.0 | 1386 | 0.3375 | 0.8696 |
| 0.1677 | 127.0 | 1397 | 0.3488 | 0.8696 |
| 0.1677 | 128.0 | 1408 | 0.3377 | 0.8696 |
| 0.1677 | 129.0 | 1419 | 0.3295 | 0.8696 |
| 0.1677 | 130.0 | 1430 | 0.3198 | 0.8696 |
| 0.1677 | 131.0 | 1441 | 0.2813 | 0.9130 |
| 0.1677 | 132.0 | 1452 | 0.2730 | 0.9130 |
| 0.1677 | 133.0 | 1463 | 0.2861 | 0.8696 |
| 0.1677 | 134.0 | 1474 | 0.3158 | 0.8696 |
| 0.1677 | 135.0 | 1485 | 0.3229 | 0.8696 |
| 0.1677 | 136.0 | 1496 | 0.3169 | 0.8696 |
| 0.1074 | 137.0 | 1507 | 0.3215 | 0.8696 |
| 0.1074 | 138.0 | 1518 | 0.3039 | 0.8696 |
| 0.1074 | 139.0 | 1529 | 0.2803 | 0.9130 |
| 0.1074 | 140.0 | 1540 | 0.2707 | 0.9130 |
| 0.1074 | 141.0 | 1551 | 0.2601 | 0.9130 |
| 0.1074 | 142.0 | 1562 | 0.2599 | 0.9130 |
| 0.1074 | 143.0 | 1573 | 0.2647 | 0.9130 |
| 0.1074 | 144.0 | 1584 | 0.2697 | 0.9130 |
| 0.1074 | 145.0 | 1595 | 0.2738 | 0.9130 |
| 0.1074 | 146.0 | 1606 | 0.2759 | 0.9130 |
| 0.1074 | 147.0 | 1617 | 0.2797 | 0.9130 |
| 0.1074 | 148.0 | 1628 | 0.2798 | 0.9130 |
| 0.1074 | 149.0 | 1639 | 0.2829 | 0.8696 |
| 0.1074 | 150.0 | 1650 | 0.2838 | 0.8696 |
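
The card does not show how the accuracy column was computed. A common pattern for ViT fine-tuning is an evaluate-based compute_metrics hook passed to the Trainer; the following is a sketch of that pattern, not the card's confirmed code:

```python
# Hedged sketch of an accuracy compute_metrics hook of the kind commonly
# passed to Trainer for image classification; the card does not include the
# actual metric code, so treat this as an assumption.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # predicted class per example
    return accuracy.compute(predictions=predictions, references=labels)
```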

Framework versions

  • Transformers 4.40.1
  • PyTorch 2.2.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1