---
license: apache-2.0
base_model: microsoft/beit-base-patch16-224-pt22k-ft22k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: Boya1_SGD_1-e3_20Epoch_09Momentum_Beit-base-patch16_fold5
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: test
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.4277190127474912
---

# Boya1_SGD_1-e3_20Epoch_09Momentum_Beit-base-patch16_fold5

This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7637
- Accuracy: 0.4277

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 2.3338        | 1.0   | 924   | 2.4489          | 0.2086   |
| 2.3427        | 2.0   | 1848  | 2.3125          | 0.2517   |
| 2.1284        | 3.0   | 2772  | 2.2064          | 0.2853   |
| 2.0324        | 4.0   | 3696  | 2.1236          | 0.3106   |
| 1.929         | 5.0   | 4620  | 2.0514          | 0.3369   |
| 1.9691        | 6.0   | 5544  | 1.9984          | 0.3537   |
| 2.0646        | 7.0   | 6468  | 1.9525          | 0.3653   |
| 1.8686        | 8.0   | 7392  | 1.9172          | 0.3813   |
| 1.972         | 9.0   | 8316  | 1.8843          | 0.3916   |
| 2.0678        | 10.0  | 9240  | 1.8632          | 0.3973   |
| 1.8342        | 11.0  | 10164 | 1.8414          | 0.3976   |
| 1.9641        | 12.0  | 11088 | 1.8250          | 0.4057   |
| 1.6663        | 13.0  | 12012 | 1.8107          | 0.4093   |
| 1.7839        | 14.0  | 12936 | 1.7966          | 0.4193   |
| 1.7724        | 15.0  | 13860 | 1.7857          | 0.4258   |
| 1.7746        | 16.0  | 14784 | 1.7787          | 0.4245   |
| 1.9266        | 17.0  | 15708 | 1.7714          | 0.4261   |
| 1.8612        | 18.0  | 16632 | 1.7673          | 0.4280   |
| 1.7224        | 19.0  | 17556 | 1.7664          | 0.4291   |
| 1.7078        | 20.0  | 18480 | 1.7637          | 0.4277   |

### Framework versions

- Transformers 4.35.0
- Pytorch 2.1.0
- Datasets 2.14.6
- Tokenizers 0.14.1
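
The hyperparameters listed under *Training hyperparameters* above correspond to a standard `transformers.Trainer` run (the Adam betas and epsilon shown are the Trainer defaults). Below is a minimal, unverified sketch of how that configuration is expressed in code; the `output_dir` and `evaluation_strategy` values are assumptions, while everything else is copied from the card.

```python
from transformers import TrainingArguments

# Sketch of the training configuration described in this card.
# output_dir and evaluation_strategy are assumptions; the remaining values
# are copied from the "Training hyperparameters" section above.
training_args = TrainingArguments(
    output_dir="Boya1_SGD_1-e3_20Epoch_09Momentum_Beit-base-patch16_fold5",
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumption: the results table reports one evaluation per epoch
)
```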
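
For inference, the fine-tuned checkpoint can be loaded with the standard image-classification pipeline from Transformers. This is an untested sketch: the repository id below is a placeholder for wherever this checkpoint is published, and `example.jpg` stands for any local image path or URL.

```python
from transformers import pipeline

# Placeholder repository id: substitute the actual namespace/name of this checkpoint.
classifier = pipeline(
    "image-classification",
    model="<namespace>/Boya1_SGD_1-e3_20Epoch_09Momentum_Beit-base-patch16_fold5",
)

# The pipeline returns a list of {"label": ..., "score": ...} dicts, best prediction first.
for prediction in classifier("example.jpg"):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```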