---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
base_model: microsoft/beit-base-patch16-224-pt22k-ft22k
model-index:
- name: chest-beit-base-finetuned
  results: []
---

# chest-beit-base-finetuned

This model is a fine-tuned version of [microsoft/beit-base-patch16-224-pt22k-ft22k](https://huggingface.co/microsoft/beit-base-patch16-224-pt22k-ft22k) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2620
- Accuracy: 0.9107
- Precision: 0.8923
- Recall: 0.8923
- F1: 0.8923

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the framework versions below):
- learning_rate: 0.005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.4775        | 0.99  | 63   | 0.2264          | 0.9142   | 0.8850    | 0.8962 | 0.8903 |
| 0.7117        | 1.99  | 127  | 0.4008          | 0.7391   | 0.3695    | 0.5000 | 0.4250 |
| 0.4115        | 3.0   | 191  | 0.4358          | 0.8155   | 0.7871    | 0.8645 | 0.7957 |
| 0.3631        | 4.0   | 255  | 0.3091          | 0.8798   | 0.8381    | 0.8708 | 0.8518 |
| 0.3794        | 4.99  | 318  | 0.2802          | 0.8798   | 0.8393    | 0.8623 | 0.8495 |
| 0.3713        | 5.99  | 382  | 0.2805          | 0.8773   | 0.8371    | 0.8542 | 0.8449 |
| 0.3953        | 7.0   | 446  | 0.3397          | 0.8584   | 0.8185    | 0.8872 | 0.8367 |
| 0.3218        | 8.0   | 510  | 0.3072          | 0.8670   | 0.8257    | 0.8898 | 0.8448 |
| 0.3219        | 8.99  | 573  | 0.2633          | 0.8961   | 0.8582    | 0.8872 | 0.8708 |
| 0.3049        | 9.88  | 630  | 0.2739          | 0.8927   | 0.8528    | 0.8912 | 0.8685 |

### Framework versions

- PEFT 0.9.0
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
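
### Reproducing the training setup

The hyperparameters above map directly onto `transformers.TrainingArguments`. The sketch below is a minimal, hedged reconstruction: `output_dir` is a placeholder, and the dataset loading, PEFT adapter configuration, and metric function are omitted because the card does not specify them. The Adam betas and epsilon listed above match the `Trainer` defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# A minimal sketch of the reported settings; output_dir is an assumption.
training_args = TrainingArguments(
    output_dir="chest-beit-base-finetuned",
    learning_rate=5e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,  # 16 * 4 = total train batch size of 64
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,  # "Native AMP" mixed precision
)
```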
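
## How to use

The snippet below is an illustrative sketch of loading the PEFT adapter on top of the base BEiT checkpoint for inference. The adapter repo id, the image path, and `num_labels=2` are assumptions (the card does not state the label set; two classes are inferred from the macro-averaged metrics above), so adjust them to match the actual adapter.

```python
import torch
from PIL import Image
from peft import PeftModel
from transformers import AutoImageProcessor, AutoModelForImageClassification

base_id = "microsoft/beit-base-patch16-224-pt22k-ft22k"
adapter_id = "chest-beit-base-finetuned"  # assumed adapter repo id

processor = AutoImageProcessor.from_pretrained(base_id)
# num_labels=2 is an assumption; ignore_mismatched_sizes replaces the
# 22k-class head of the base checkpoint with a freshly sized one.
model = AutoModelForImageClassification.from_pretrained(
    base_id, num_labels=2, ignore_mismatched_sizes=True
)
model = PeftModel.from_pretrained(model, adapter_id)
model.eval()

image = Image.open("chest_xray.png").convert("RGB")  # hypothetical input file
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(-1).item()
# Without an id2label mapping in the adapter config, labels default to
# LABEL_0 / LABEL_1.
print(pred, model.config.id2label[pred])
```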