# vit-base-patch16-224-in21k-FINALConcreteClassifier-VIT50epochsAUGMENTED

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

- Loss: 0.0001
- Accuracy: 1.0
- F1: 1.0
- Precision: 1.0
- Recall: 1.0
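
For quick testing, a minimal inference sketch is shown below. It is not taken from the model card or the training code: the repository id is inferred from the model name above, the image path is a placeholder, and the label set depends on the concrete-defect dataset the model was trained on.

```python
from PIL import Image
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Repository id inferred from the model name; adjust if the checkpoint lives elsewhere.
model_id = "mmomm25/vit-base-patch16-224-in21k-FINALConcreteClassifier-VIT50epochsAUGMENTED"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("concrete_sample.jpg")  # placeholder path to a test image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its class name.
predicted_class = model.config.id2label[logits.argmax(-1).item()]
print(predicted_class)
```
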
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
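
As an illustration, the hyperparameters above correspond roughly to the following `transformers.TrainingArguments` configuration. This is a reconstruction, not the exact training script; `output_dir` and the evaluation/save strategies are assumptions.

```python
from transformers import TrainingArguments

# Sketch of the configuration implied by the listed hyperparameters.
training_args = TrainingArguments(
    output_dir="vit-concrete-classifier",  # assumed name, not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # 16 * 4 = 64 effective train batch size
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    eval_strategy="epoch",           # assumed; the results table reports metrics once per epoch
    save_strategy="epoch",           # assumed
)
```
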
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 0.8388 | 0.9994 | 407 | 0.7311 | 0.9157 | 0.9178 | 0.9216 | 0.9170 |
| 0.3038 | 1.9988 | 814 | 0.2005 | 0.9949 | 0.9951 | 0.9951 | 0.9951 |
| 0.1876 | 2.9982 | 1221 | 0.1047 | 0.9940 | 0.9941 | 0.9944 | 0.9940 |
| 0.1113 | 4.0 | 1629 | 0.0507 | 0.9975 | 0.9976 | 0.9977 | 0.9976 |
| 0.0796 | 4.9994 | 2036 | 0.0309 | 0.9968 | 0.9969 | 0.9970 | 0.9968 |
| 0.083 | 5.9988 | 2443 | 0.0251 | 0.9966 | 0.9968 | 0.9969 | 0.9967 |
| 0.0571 | 6.9982 | 2850 | 0.0134 | 0.9979 | 0.9980 | 0.9979 | 0.9980 |
| 0.0422 | 8.0 | 3258 | 0.0114 | 0.9982 | 0.9983 | 0.9983 | 0.9982 |
| 0.0358 | 8.9994 | 3665 | 0.0092 | 0.9979 | 0.9980 | 0.9980 | 0.9980 |
| 0.0294 | 9.9988 | 4072 | 0.0068 | 0.9974 | 0.9975 | 0.9975 | 0.9976 |
| 0.047 | 10.9982 | 4479 | 0.0059 | 0.9979 | 0.9980 | 0.9980 | 0.9980 |
| 0.0195 | 12.0 | 4887 | 0.0031 | 0.9995 | 0.9996 | 0.9996 | 0.9996 |
| 0.0158 | 12.9994 | 5294 | 0.0023 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.009 | 13.9988 | 5701 | 0.0036 | 0.9991 | 0.9991 | 0.9991 | 0.9991 |
| 0.0106 | 14.9982 | 6108 | 0.0025 | 0.9994 | 0.9994 | 0.9994 | 0.9994 |
| 0.0044 | 16.0 | 6516 | 0.0032 | 0.9991 | 0.9991 | 0.9991 | 0.9991 |
| 0.0286 | 16.9994 | 6923 | 0.0012 | 0.9998 | 0.9998 | 0.9999 | 0.9998 |
| 0.034 | 17.9988 | 7330 | 0.0019 | 0.9994 | 0.9994 | 0.9994 | 0.9994 |
| 0.0017 | 18.9982 | 7737 | 0.0015 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0377 | 20.0 | 8145 | 0.0007 | 0.9998 | 0.9999 | 0.9999 | 0.9999 |
| 0.0021 | 20.9994 | 8552 | 0.0008 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0138 | 21.9988 | 8959 | 0.0006 | 0.9998 | 0.9999 | 0.9999 | 0.9999 |
| 0.0086 | 22.9982 | 9366 | 0.0039 | 0.9989 | 0.9990 | 0.9990 | 0.9990 |
| 0.0089 | 24.0 | 9774 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0064 | 24.9994 | 10181 | 0.0004 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0121 | 25.9988 | 10588 | 0.0013 | 0.9998 | 0.9998 | 0.9999 | 0.9998 |
| 0.0123 | 26.9982 | 10995 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0129 | 28.0 | 11403 | 0.0012 | 0.9995 | 0.9996 | 0.9996 | 0.9996 |
| 0.0143 | 28.9994 | 11810 | 0.0003 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0029 | 29.9988 | 12217 | 0.0030 | 0.9994 | 0.9994 | 0.9994 | 0.9994 |
| 0.0059 | 30.9982 | 12624 | 0.0020 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0026 | 32.0 | 13032 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0097 | 32.9994 | 13439 | 0.0005 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0035 | 33.9988 | 13846 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0132 | 34.9982 | 14253 | 0.0008 | 0.9997 | 0.9997 | 0.9997 | 0.9997 |
| 0.0011 | 36.0 | 14661 | 0.0029 | 0.9995 | 0.9995 | 0.9995 | 0.9996 |
| 0.006 | 36.9994 | 15068 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0038 | 37.9988 | 15475 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0125 | 38.9982 | 15882 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.002 | 40.0 | 16290 | 0.0007 | 0.9998 | 0.9998 | 0.9999 | 0.9998 |
| 0.0017 | 40.9994 | 16697 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 41.9988 | 17104 | 0.0003 | 0.9998 | 0.9999 | 0.9999 | 0.9999 |
| 0.0069 | 42.9982 | 17511 | 0.0005 | 0.9998 | 0.9999 | 0.9999 | 0.9999 |
| 0.0008 | 44.0 | 17919 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0129 | 44.9994 | 18326 | 0.0002 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0184 | 45.9988 | 18733 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 46.9982 | 19140 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0095 | 48.0 | 19548 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0003 | 48.9994 | 19955 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0002 | 49.9693 | 20350 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
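
The accuracy, F1, precision, and recall columns above are consistent with a `compute_metrics` callback along the lines of the sketch below. This is an assumption for illustration only; the card does not state which metric library or averaging mode was actually used (macro averaging is assumed here).

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Compute the four metrics reported in the results table."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # average="macro" and zero_division=0 are assumptions, not stated in the card.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```
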
### Framework versions

- Transformers 4.43.3
- Pytorch 2.3.1
- Datasets 2.20.0
- Tokenizers 0.19.1