---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
base_model: facebook/deit-base-patch16-224
model-index:
- name: chest-deit-base-finetuned
  results: []
---

# chest-deit-base-finetuned

This model is a fine-tuned version of [facebook/deit-base-patch16-224](https://huggingface.co/facebook/deit-base-patch16-224) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0979
- Accuracy: 0.9622
- Precision: 0.9531
- Recall: 0.9562
- F1: 0.9546

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| 0.2556        | 0.99  | 63   | 0.2019          | 0.9185   | 0.9446    | 0.8469 | 0.8822 |
| 0.2302        | 1.99  | 127  | 0.1098          | 0.9614   | 0.9396    | 0.9654 | 0.9514 |
| 0.2258        | 3.0   | 191  | 0.1151          | 0.9622   | 0.9641    | 0.9372 | 0.9496 |
| 0.1465        | 4.0   | 255  | 0.0733          | 0.9725   | 0.9653    | 0.9633 | 0.9643 |
| 0.1763        | 4.99  | 318  | 0.0763          | 0.9725   | 0.9703    | 0.9580 | 0.9639 |
| 0.1627        | 5.99  | 382  | 0.1057          | 0.9571   | 0.9315    | 0.9656 | 0.9466 |
| 0.1509        | 7.0   | 446  | 0.0701          | 0.9751   | 0.9638    | 0.9725 | 0.9680 |
| 0.1209        | 8.0   | 510  | 0.1047          | 0.9571   | 0.9315    | 0.9656 | 0.9466 |
| 0.0961        | 8.99  | 573  | 0.0721          | 0.9734   | 0.9577    | 0.9756 | 0.9662 |
| 0.1063        | 9.88  | 630  | 0.0885          | 0.9622   | 0.9398    | 0.9681 | 0.9526 |

### Framework versions

- PEFT 0.9.0
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
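
As a rough illustration of the hyperparameters listed above, the training configuration could be expressed with `transformers.TrainingArguments` roughly as follows. This is a sketch, not the exact script used to train this model; `output_dir` is a placeholder, and the Adam betas/epsilon shown in the card are the Transformers defaults, so they are not set explicitly.

```python
# Hedged sketch of a TrainingArguments setup matching the listed hyperparameters.
# output_dir is a placeholder; this is not the card author's actual training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="chest-deit-base-finetuned",
    learning_rate=0.005,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # effective train batch size: 16 * 4 = 64
    num_train_epochs=10,
    lr_scheduler_type="linear",
    seed=42,
    fp16=True,                       # "Native AMP" mixed-precision training
)
```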
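
Since this is a PEFT adapter on top of `facebook/deit-base-patch16-224`, inference would load the base model and then attach the adapter. The sketch below makes two assumptions not confirmed by this card: the adapter repo id (`your-username/chest-deit-base-finetuned` is hypothetical) and a binary label space (`num_labels=2`), which would need to match the head the adapter was trained with.

```python
# Hedged inference sketch. The adapter repo id and num_labels=2 are assumptions.
import torch
from PIL import Image
from peft import PeftModel
from transformers import AutoImageProcessor, AutoModelForImageClassification

BASE = "facebook/deit-base-patch16-224"
ADAPTER = "your-username/chest-deit-base-finetuned"  # hypothetical repo id

processor = AutoImageProcessor.from_pretrained(BASE)
base_model = AutoModelForImageClassification.from_pretrained(
    BASE,
    num_labels=2,                  # assumption: binary classification task
    ignore_mismatched_sizes=True,  # replace the 1000-class ImageNet head
)
model = PeftModel.from_pretrained(base_model, ADAPTER).eval()

image = Image.open("chest_xray.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(-1))  # class probabilities
```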