
astrophotography-object-classifier-alpha5

This model is a fine-tuned version of google/vit-base-patch16-224-in21k, trained on an astrophotography image dataset loaded with the Datasets imagefolder builder. It achieves the following results on the evaluation set (a minimal usage sketch follows the results):

  • Loss: 0.1827
  • Accuracy: 0.9516
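These values match the epoch-3 row of the training-results table below, so the published weights presumably correspond to that checkpoint rather than to the final epoch.

A minimal inference sketch, assuming the checkpoint is loaded by the Hub id bortle/astrophotography-object-classifier-alpha5 shown on this page; the image path is a placeholder:

```python
from transformers import pipeline

# Load the fine-tuned ViT classifier (Hub id taken from this card).
classifier = pipeline(
    "image-classification",
    model="bortle/astrophotography-object-classifier-alpha5",
)

# "example.jpg" is a placeholder path to a local astrophotography image.
for prediction in classifier("example.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```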

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments equivalent is sketched after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 1337
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150.0
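The list above maps onto the Trainer API roughly as follows. This is an illustrative sketch, not the original training script: the output directory is a placeholder, and the evaluation/checkpointing settings are assumptions based on the per-epoch results table and on the reported metrics matching the epoch-3 checkpoint.

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="./astrophotography-object-classifier-alpha5",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=1337,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=150.0,
    evaluation_strategy="epoch",  # assumption: metrics were logged once per epoch
    save_strategy="epoch",        # assumption
    load_best_model_at_end=True,  # assumption: reported metrics match the best (epoch-3) checkpoint
)
```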

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2639 | 1.0 | 2575 | 0.2192 | 0.9461 |
| 0.2457 | 2.0 | 5150 | 0.2065 | 0.9464 |
| 0.3157 | 3.0 | 7725 | 0.1827 | 0.9516 |
| 0.3149 | 4.0 | 10300 | 0.1855 | 0.9488 |
| 0.1212 | 5.0 | 12875 | 0.2079 | 0.9480 |
| 0.078 | 6.0 | 15450 | 0.2008 | 0.9516 |
| 0.3493 | 7.0 | 18025 | 0.2038 | 0.9497 |
| 0.131 | 8.0 | 20600 | 0.2059 | 0.9510 |
| 0.2658 | 9.0 | 23175 | 0.2089 | 0.9510 |
| 0.0762 | 10.0 | 25750 | 0.2068 | 0.9541 |
| 0.127 | 11.0 | 28325 | 0.1986 | 0.9543 |
| 0.181 | 12.0 | 30900 | 0.2227 | 0.9513 |
| 0.1072 | 13.0 | 33475 | 0.2303 | 0.9502 |
| 0.0179 | 14.0 | 36050 | 0.2240 | 0.9483 |
| 0.1447 | 15.0 | 38625 | 0.2364 | 0.9505 |
| 0.0933 | 16.0 | 41200 | 0.2372 | 0.9532 |
| 0.17 | 17.0 | 43775 | 0.2166 | 0.9557 |
| 0.0463 | 18.0 | 46350 | 0.2852 | 0.9461 |
| 0.1207 | 19.0 | 48925 | 0.2653 | 0.9508 |
| 0.1761 | 20.0 | 51500 | 0.2443 | 0.9521 |
| 0.1441 | 21.0 | 54075 | 0.2464 | 0.9535 |
| 0.1279 | 22.0 | 56650 | 0.2681 | 0.9499 |
| 0.1811 | 23.0 | 59225 | 0.2626 | 0.9538 |
| 0.1737 | 24.0 | 61800 | 0.2604 | 0.9541 |
| 0.0275 | 25.0 | 64375 | 0.2625 | 0.9510 |
| 0.1757 | 26.0 | 66950 | 0.2819 | 0.9488 |
| 0.1257 | 27.0 | 69525 | 0.2708 | 0.9521 |
| 0.1097 | 28.0 | 72100 | 0.2801 | 0.9519 |
| 0.0772 | 29.0 | 74675 | 0.2870 | 0.9499 |
| 0.132 | 30.0 | 77250 | 0.2824 | 0.9497 |
| 0.0652 | 31.0 | 79825 | 0.2628 | 0.9538 |
| 0.0324 | 32.0 | 82400 | 0.3223 | 0.9453 |
| 0.1774 | 33.0 | 84975 | 0.2749 | 0.9549 |
| 0.1178 | 34.0 | 87550 | 0.2905 | 0.9513 |
| 0.0804 | 35.0 | 90125 | 0.3100 | 0.9480 |
| 0.0617 | 36.0 | 92700 | 0.3131 | 0.9475 |
| 0.0348 | 37.0 | 95275 | 0.3341 | 0.9486 |
| 0.0057 | 38.0 | 97850 | 0.3225 | 0.9466 |
| 0.0409 | 39.0 | 100425 | 0.3206 | 0.9483 |
| 0.1052 | 40.0 | 103000 | 0.3212 | 0.9494 |
| 0.0943 | 41.0 | 105575 | 0.3075 | 0.9508 |
| 0.0018 | 42.0 | 108150 | 0.3062 | 0.9519 |
| 0.0287 | 43.0 | 110725 | 0.3224 | 0.9469 |
| 0.0384 | 44.0 | 113300 | 0.3086 | 0.9488 |
| 0.1214 | 45.0 | 115875 | 0.3145 | 0.9494 |
| 0.1735 | 46.0 | 118450 | 0.3191 | 0.9494 |
| 0.0477 | 47.0 | 121025 | 0.3004 | 0.9521 |
| 0.0221 | 48.0 | 123600 | 0.3205 | 0.9480 |
| 0.0939 | 49.0 | 126175 | 0.3431 | 0.9486 |
| 0.0599 | 50.0 | 128750 | 0.3167 | 0.9516 |
| 0.1785 | 51.0 | 131325 | 0.3274 | 0.9513 |
| 0.1039 | 52.0 | 133900 | 0.3114 | 0.9519 |
| 0.0527 | 53.0 | 136475 | 0.3252 | 0.9477 |
| 0.0584 | 54.0 | 139050 | 0.3200 | 0.9510 |
| 0.1022 | 55.0 | 141625 | 0.3284 | 0.9491 |
| 0.013 | 56.0 | 144200 | 0.3386 | 0.9475 |
| 0.0488 | 57.0 | 146775 | 0.3290 | 0.9505 |
| 0.0514 | 58.0 | 149350 | 0.3126 | 0.9535 |
| 0.0184 | 59.0 | 151925 | 0.3196 | 0.9532 |
| 0.1233 | 60.0 | 154500 | 0.3270 | 0.9516 |
| 0.1667 | 61.0 | 157075 | 0.3250 | 0.9502 |
| 0.0497 | 62.0 | 159650 | 0.3375 | 0.9466 |
| 0.0445 | 63.0 | 162225 | 0.3493 | 0.9502 |
| 0.114 | 64.0 | 164800 | 0.3368 | 0.9488 |
| 0.048 | 65.0 | 167375 | 0.3358 | 0.9510 |
| 0.2337 | 66.0 | 169950 | 0.3330 | 0.9510 |
| 0.0705 | 67.0 | 172525 | 0.3480 | 0.9510 |
| 0.094 | 68.0 | 175100 | 0.3508 | 0.9497 |
| 0.0498 | 69.0 | 177675 | 0.3328 | 0.9508 |
| 0.0535 | 70.0 | 180250 | 0.3558 | 0.9499 |
| 0.0217 | 71.0 | 182825 | 0.3583 | 0.9488 |
| 0.0264 | 72.0 | 185400 | 0.3600 | 0.9477 |
| 0.0108 | 73.0 | 187975 | 0.3629 | 0.9491 |
| 0.0446 | 74.0 | 190550 | 0.3570 | 0.9508 |
| 0.0702 | 75.0 | 193125 | 0.3600 | 0.9502 |
| 0.141 | 76.0 | 195700 | 0.3428 | 0.9527 |
| 0.0226 | 77.0 | 198275 | 0.3594 | 0.9502 |
| 0.0055 | 78.0 | 200850 | 0.3653 | 0.9508 |
| 0.1442 | 79.0 | 203425 | 0.3437 | 0.9530 |
| 0.0834 | 80.0 | 206000 | 0.3431 | 0.9524 |
| 0.0388 | 81.0 | 208575 | 0.3426 | 0.9521 |
| 0.0321 | 82.0 | 211150 | 0.3555 | 0.9497 |
| 0.051 | 83.0 | 213725 | 0.3730 | 0.9505 |
| 0.0049 | 84.0 | 216300 | 0.3549 | 0.9527 |
| 0.043 | 85.0 | 218875 | 0.3592 | 0.9524 |
| 0.0284 | 86.0 | 221450 | 0.3749 | 0.9499 |
| 0.0923 | 87.0 | 224025 | 0.3527 | 0.9513 |
| 0.1188 | 88.0 | 226600 | 0.3725 | 0.9486 |
| 0.1493 | 89.0 | 229175 | 0.3560 | 0.9521 |
| 0.0164 | 90.0 | 231750 | 0.3573 | 0.9508 |
| 0.0477 | 91.0 | 234325 | 0.3679 | 0.9502 |
| 0.0827 | 92.0 | 236900 | 0.3683 | 0.9486 |
| 0.0799 | 93.0 | 239475 | 0.3667 | 0.9510 |
| 0.0413 | 94.0 | 242050 | 0.3604 | 0.9516 |
| 0.071 | 95.0 | 244625 | 0.3725 | 0.9483 |
| 0.2079 | 96.0 | 247200 | 0.3688 | 0.9483 |
| 0.0665 | 97.0 | 249775 | 0.3576 | 0.9521 |
| 0.0673 | 98.0 | 252350 | 0.3636 | 0.9513 |
| 0.062 | 99.0 | 254925 | 0.3688 | 0.9513 |
| 0.1217 | 100.0 | 257500 | 0.3742 | 0.9508 |
| 0.0951 | 101.0 | 260075 | 0.3718 | 0.9491 |
| 0.0118 | 102.0 | 262650 | 0.3849 | 0.9491 |
| 0.0307 | 103.0 | 265225 | 0.3644 | 0.9535 |
| 0.0157 | 104.0 | 267800 | 0.3647 | 0.9524 |
| 0.0125 | 105.0 | 270375 | 0.3994 | 0.9486 |
| 0.0213 | 106.0 | 272950 | 0.3775 | 0.9499 |
| 0.1249 | 107.0 | 275525 | 0.3902 | 0.9491 |
| 0.0333 | 108.0 | 278100 | 0.3637 | 0.9516 |
| 0.0545 | 109.0 | 280675 | 0.3663 | 0.9521 |
| 0.1136 | 110.0 | 283250 | 0.3847 | 0.9502 |
| 0.0751 | 111.0 | 285825 | 0.3818 | 0.9513 |
| 0.001 | 112.0 | 288400 | 0.3811 | 0.9521 |
| 0.0282 | 113.0 | 290975 | 0.3843 | 0.9510 |
| 0.1117 | 114.0 | 293550 | 0.3790 | 0.9521 |
| 0.0022 | 115.0 | 296125 | 0.3717 | 0.9521 |
| 0.0203 | 116.0 | 298700 | 0.3794 | 0.9530 |
| 0.0437 | 117.0 | 301275 | 0.3807 | 0.9527 |
| 0.0045 | 118.0 | 303850 | 0.3821 | 0.9530 |
| 0.0015 | 119.0 | 306425 | 0.3867 | 0.9527 |
| 0.1152 | 120.0 | 309000 | 0.3842 | 0.9521 |
| 0.0748 | 121.0 | 311575 | 0.3839 | 0.9527 |
| 0.0955 | 122.0 | 314150 | 0.3805 | 0.9516 |
| 0.0043 | 123.0 | 316725 | 0.3833 | 0.9521 |
| 0.0249 | 124.0 | 319300 | 0.3745 | 0.9497 |
| 0.0002 | 125.0 | 321875 | 0.3744 | 0.9519 |
| 0.0169 | 126.0 | 324450 | 0.3808 | 0.9510 |
| 0.0277 | 127.0 | 327025 | 0.3735 | 0.9524 |
| 0.0082 | 128.0 | 329600 | 0.3831 | 0.9527 |
| 0.0737 | 129.0 | 332175 | 0.3891 | 0.9524 |
| 0.0517 | 130.0 | 334750 | 0.3839 | 0.9530 |
| 0.0218 | 131.0 | 337325 | 0.3863 | 0.9527 |
| 0.0228 | 132.0 | 339900 | 0.3913 | 0.9519 |
| 0.0094 | 133.0 | 342475 | 0.3968 | 0.9513 |
| 0.0784 | 134.0 | 345050 | 0.3871 | 0.9532 |
| 0.0116 | 135.0 | 347625 | 0.3890 | 0.9538 |
| 0.015 | 136.0 | 350200 | 0.3846 | 0.9530 |
| 0.0307 | 137.0 | 352775 | 0.3850 | 0.9530 |
| 0.0081 | 138.0 | 355350 | 0.3852 | 0.9532 |
| 0.0705 | 139.0 | 357925 | 0.3859 | 0.9527 |
| 0.0442 | 140.0 | 360500 | 0.3871 | 0.9524 |
| 0.0888 | 141.0 | 363075 | 0.3851 | 0.9535 |
| 0.0169 | 142.0 | 365650 | 0.3908 | 0.9527 |
| 0.0132 | 143.0 | 368225 | 0.3923 | 0.9527 |
| 0.0349 | 144.0 | 370800 | 0.3880 | 0.9527 |
| 0.0014 | 145.0 | 373375 | 0.3875 | 0.9535 |
| 0.0495 | 146.0 | 375950 | 0.3898 | 0.9535 |
| 0.0006 | 147.0 | 378525 | 0.3908 | 0.9530 |
| 0.0226 | 148.0 | 381100 | 0.3899 | 0.9527 |
| 0.0927 | 149.0 | 383675 | 0.3895 | 0.9527 |
| 0.081 | 150.0 | 386250 | 0.3896 | 0.9527 |

Framework versions

  • Transformers 4.36.0.dev0
  • Pytorch 2.1.1+cu121
  • Datasets 2.15.0
  • Tokenizers 0.15.0
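A quick way to compare a local environment against these versions (a sketch; the Transformers entry on this card is a development build, so an exact match may not be installable from PyPI):

```python
import datasets
import tokenizers
import torch
import transformers

# Versions reported on this card.
expected = {
    "transformers": "4.36.0.dev0",
    "torch": "2.1.1+cu121",
    "datasets": "2.15.0",
    "tokenizers": "0.15.0",
}

installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}

for name, version in expected.items():
    print(f"{name}: expected {version}, installed {installed[name]}")
```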