vit-base-patch16-224-in21k-FINALLaneClassifier-VIT50AUGMENTED

This model is a fine-tuned version of google/vit-base-patch16-224-in21k on a local image dataset loaded in imagefolder format. It achieves the following results on the evaluation set:

  • Loss: 0.0000
  • Accuracy: 1.0
  • F1: 1.0
  • Precision: 1.0
  • Recall: 1.0

Model description

More information needed

Intended uses & limitations

More information needed
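
Although the intended uses are not documented, the checkpoint can be loaded like any image-classification model on the Hub. The sketch below uses the Transformers pipeline API; the image path is a placeholder, and the label names depend on the (undocumented) training classes.

```python
from PIL import Image
from transformers import pipeline

# Minimal usage sketch: the checkpoint id matches this repository, but the
# image path is a placeholder and the labels depend on the training classes.
classifier = pipeline(
    "image-classification",
    model="mmomm25/vit-base-patch16-224-in21k-FINALLaneClassifier-VIT50AUGMENTED",
)

predictions = classifier(Image.open("lane_example.jpg"))  # placeholder image
print(predictions)  # list of {'label': ..., 'score': ...} dicts
```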

Training and evaluation data

More information needed
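
The card only notes that the data was loaded in imagefolder format. A minimal sketch of that loading path is below, assuming a placeholder directory layout and the base checkpoint's image processor; none of the paths or split names come from the card.

```python
from datasets import load_dataset
from transformers import ViTImageProcessor

# Assumed data pipeline: the directory path is a placeholder, since the card
# does not document the actual dataset.
dataset = load_dataset("imagefolder", data_dir="path/to/lane_images")
processor = ViTImageProcessor.from_pretrained("google/vit-base-patch16-224-in21k")

def preprocess(batch):
    # Convert PIL images into the pixel_values tensor that ViT expects
    inputs = processor([img.convert("RGB") for img in batch["image"]], return_tensors="pt")
    batch["pixel_values"] = inputs["pixel_values"]
    return batch

dataset = dataset.with_transform(preprocess)
```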

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
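
As a rough reconstruction (not the author's actual training script), these settings map onto Transformers TrainingArguments as follows; output_dir and the per-epoch evaluation strategy are assumptions.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; output_dir and
# eval_strategy are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="FINALLaneClassifier-VIT50AUGMENTED",  # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=4,  # 4 * 4 = total train batch size of 16
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    eval_strategy="epoch",  # assumed; the table below reports metrics per epoch
    # Adam betas (0.9, 0.999) and epsilon 1e-08 match the optimizer defaults.
)
```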

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:-----:|:------:|:---------------:|:--------:|:--:|:---------:|:------:|
| 0.013 | 1.0 | 2098 | 0.0503 | 0.9872512808292625 | 0.9872500637993007 | 0.9875320438126312 | 0.9872891423140888 |
| 0.0202 | 2.0 | 4196 | 0.0034 | 0.9991659716430359 | 0.9991659678069054 | 0.999164877117633 | 0.999168448562604 |
| 0.0007 | 3.0 | 6294 | 0.0340 | 0.9864172524722984 | 0.9864157249694355 | 0.986738017682643 | 0.9864575908766928 |
| 0.0002 | 4.0 | 8392 | 0.0078 | 0.9972596211128322 | 0.9972596209572226 | 0.9972664606608035 | 0.9972677595628415 |
| 0.0001 | 5.0 | 10490 | 0.0051 | 0.9986893840104849 | 0.9986893803637995 | 0.998688915375447 | 0.9986932763126634 |
| 0.0001 | 6.0 | 12588 | 0.0122 | 0.9965447396640057 | 0.9965447388791924 | 0.9965582720151911 | 0.9965550011879306 |
| 0.0002 | 7.0 | 14686 | 0.0019 | 0.999523412367449 | 0.9995234093837869 | 0.9995224450811844 | 0.9995248277500595 |
| 0.0002 | 8.0 | 16784 | 0.0089 | 0.9979745025616585 | 0.9979744996862612 | 0.9979755665421167 | 0.9979798081321687 |
| 0.0413 | 9.0 | 18882 | 0.0082 | 0.9971404742046944 | 0.9971404741641006 | 0.997148288973384 | 0.9971489665003563 |
| 0.0001 | 10.0 | 20980 | 0.0451 | 0.9908256880733946 | 0.9908253358952392 | 0.9909645623093171 | 0.9908529341886434 |
| 0.0 | 11.0 | 23078 | 0.0075 | 0.998212796377934 | 0.9982127634963612 | 0.998220079886156 | 0.9982088765901349 |
| 0.0 | 12.0 | 25176 | 0.0039 | 0.9991659716430359 | 0.9991659678069054 | 0.999164877117633 | 0.999168448562604 |
| 0.013 | 13.0 | 27274 | 0.0107 | 0.997736208745383 | 0.9977362066886293 | 0.9977383781306159 | 0.9977422220071985 |
| 0.0537 | 14.0 | 29372 | 0.0013 | 0.9996425592755868 | 0.9996425558453789 | 0.9996429388720207 | 0.9996422012013773 |
| 0.0018 | 15.0 | 31470 | 0.0115 | 0.9973787680209698 | 0.997378766197631 | 0.997381574328435 | 0.9973851330141593 |
| 0.0049 | 16.0 | 33568 | 0.0040 | 0.9986893840104849 | 0.9986893803637995 | 0.998688915375447 | 0.9986932763126634 |
| 0.0032 | 17.0 | 35666 | 0.0002 | 0.9998808530918623 | 0.9998808519484597 | 0.9998812351543943 | 0.9998804971319312 |
| 0.0002 | 18.0 | 37764 | 0.0018 | 0.9994042654593114 | 0.9994042620764765 | 0.9994031988541419 | 0.9994060346875742 |
| 0.0003 | 19.0 | 39862 | 0.0028 | 0.9986893840104849 | 0.9986893803637995 | 0.998688915375447 | 0.9986932763126634 |
| 0.0 | 20.0 | 41960 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 21.0 | 44058 | 0.0013 | 0.9996425592755868 | 0.9996425568196459 | 0.99964174826845 | 0.9996436208125445 |
| 0.0005 | 22.0 | 46156 | 0.0032 | 0.9990468247348981 | 0.9990468198500874 | 0.9990457151585668 | 0.9990489456945351 |
| 0.0 | 23.0 | 48254 | 0.0030 | 0.999523412367449 | 0.9995234087884033 | 0.9995228131486541 | 0.9995241179444757 |
| 0.0 | 24.0 | 50352 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 25.0 | 52450 | 0.0039 | 0.9990468247348981 | 0.9990468208243473 | 0.9990458015267176 | 0.9990496555001188 |
| 0.0 | 26.0 | 54548 | 0.0028 | 0.9992851185511736 | 0.9992851148875656 | 0.9992840095465394 | 0.9992872416250891 |
| 0.0 | 27.0 | 56646 | 0.0010 | 0.9996425592755868 | 0.9996425568196459 | 0.99964174826845 | 0.9996436208125445 |
| 0.0002 | 28.0 | 58744 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 29.0 | 60842 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 30.0 | 62940 | 0.0018 | 0.999523412367449 | 0.9995234093837869 | 0.9995224450811844 | 0.9995248277500595 |
| 0.0001 | 31.0 | 65038 | 0.0020 | 0.9996425592755868 | 0.9996425558453789 | 0.9996429388720207 | 0.9996422012013773 |
| 0.0002 | 32.0 | 67136 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 33.0 | 69234 | 0.0014 | 0.9996425592755868 | 0.9996425568196459 | 0.99964174826845 | 0.9996436208125445 |
| 0.0 | 34.0 | 71332 | 0.0110 | 0.9984510901942094 | 0.9984510584424513 | 0.9984604452865941 | 0.9984464627151052 |
| 0.0004 | 35.0 | 73430 | 0.0009 | 0.9998808530918623 | 0.9998808521176034 | 0.9998805256869773 | 0.9998812069375149 |
| 0.0 | 36.0 | 75528 | 0.0009 | 0.9998808530918623 | 0.9998808521176034 | 0.9998805256869773 | 0.9998812069375149 |
| 0.0 | 37.0 | 77626 | 0.0002 | 0.9998808530918623 | 0.9998808521176034 | 0.9998805256869773 | 0.9998812069375149 |
| 0.0 | 38.0 | 79724 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 39.0 | 81822 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 40.0 | 83920 | 0.0024 | 0.9994042654593114 | 0.999404257847879 | 0.999406739439962 | 0.9994024856596558 |
| 0.0 | 41.0 | 86018 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 42.0 | 88116 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 43.0 | 90214 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 44.0 | 92312 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 45.0 | 94410 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 46.0 | 96508 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 47.0 | 98606 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 48.0 | 100704 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 49.0 | 102802 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 50.0 | 104900 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
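
The accuracy, F1, precision, and recall columns are consistent with metrics computed via the Evaluate library inside a Trainer compute_metrics callback. A minimal sketch under that assumption follows; the "weighted" averaging for F1/precision/recall is a guess, as the card does not state how the multi-class scores were aggregated.

```python
import numpy as np
import evaluate

# Assumed metric setup; "weighted" averaging is a guess, not documented in the card.
accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")
precision = evaluate.load("precision")
recall = evaluate.load("recall")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {
        **accuracy.compute(predictions=preds, references=labels),
        **f1.compute(predictions=preds, references=labels, average="weighted"),
        **precision.compute(predictions=preds, references=labels, average="weighted"),
        **recall.compute(predictions=preds, references=labels, average="weighted"),
    }
```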

Framework versions

  • Transformers 4.43.3
  • Pytorch 2.3.1
  • Datasets 2.20.0
  • Tokenizers 0.19.1