---
license: mit
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: Yepes_2e-05_250
  results: []
---

# Yepes_2e-05_250

This model is a fine-tuned version of [microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1279
- Precision: 0.6833
- Recall: 0.5100
- F1: 0.5840
- Accuracy: 0.9788

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 500

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.8467        | 1.39  | 25   | 0.2149          | 0.0       | 0.0    | 0.0    | 0.9672   |
| 0.1988        | 2.78  | 50   | 0.1959          | 0.0       | 0.0    | 0.0    | 0.9672   |
| 0.156         | 4.17  | 75   | 0.1439          | 0.3268    | 0.2065 | 0.2530 | 0.9691   |
| 0.1128        | 5.56  | 100  | 0.1324          | 0.49      | 0.2438 | 0.3256 | 0.9730   |
| 0.0978        | 6.94  | 125  | 0.1222          | 0.4964    | 0.3433 | 0.4059 | 0.9747   |
| 0.0788        | 8.33  | 150  | 0.1154          | 0.5193    | 0.3682 | 0.4309 | 0.9760   |
| 0.067         | 9.72  | 175  | 0.1162          | 0.4711    | 0.3856 | 0.4241 | 0.9749   |
| 0.058         | 11.11 | 200  | 0.1236          | 0.5275    | 0.3582 | 0.4267 | 0.9761   |
| 0.0491        | 12.5  | 225  | 0.1177          | 0.4940    | 0.4104 | 0.4484 | 0.9754   |
| 0.0443        | 13.89 | 250  | 0.1235          | 0.5472    | 0.4179 | 0.4739 | 0.9767   |
| 0.0383        | 15.28 | 275  | 0.1198          | 0.5764    | 0.4502 | 0.5056 | 0.9770   |
| 0.0369        | 16.67 | 300  | 0.1219          | 0.5892    | 0.4602 | 0.5168 | 0.9776   |
| 0.0326        | 18.06 | 325  | 0.1261          | 0.7       | 0.4701 | 0.5625 | 0.9790   |
| 0.0305        | 19.44 | 350  | 0.1269          | 0.6904    | 0.4826 | 0.5681 | 0.9787   |
| 0.0269        | 20.83 | 375  | 0.1252          | 0.6656    | 0.5    | 0.5710 | 0.9783   |
| 0.025         | 22.22 | 400  | 0.1253          | 0.6529    | 0.5100 | 0.5726 | 0.9782   |
| 0.0244        | 23.61 | 425  | 0.1284          | 0.6875    | 0.4925 | 0.5739 | 0.9790   |
| 0.0224        | 25.0  | 450  | 0.1279          | 0.6833    | 0.5100 | 0.5840 | 0.9788   |

### Framework versions

- Transformers 4.27.4
- Pytorch 1.13.1+cu116
- Datasets 2.11.0
- Tokenizers 0.13.2
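
The card does not yet document intended use or training setup in code. As a minimal sketch only: the seqeval-style precision/recall/F1/accuracy metrics suggest a token-classification (NER-style) fine-tune, and the hyperparameters listed above map roughly onto a `TrainingArguments` object. The `Yepes_2e-05_250` model id/path and the token-classification task are assumptions, not confirmed by this card.

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    TrainingArguments,
    pipeline,
)

# Assumption: the checkpoint lives in a local "Yepes_2e-05_250" directory
# (or the corresponding Hub repo id) and was trained for token classification.
model_name = "Yepes_2e-05_250"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

# Quick inference over a biomedical sentence; the entity label set depends on
# the (undocumented) training data.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
print(ner("The BRCA1 mutation was associated with an increased cancer risk."))

# The hyperparameters listed above correspond roughly to these TrainingArguments.
# Adam betas (0.9, 0.999) and epsilon 1e-08 are the library defaults; the dataset,
# data collator, and Trainer wiring are not documented in this card.
training_args = TrainingArguments(
    output_dir="Yepes_2e-05_250",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    max_steps=500,
    lr_scheduler_type="linear",
    evaluation_strategy="steps",
    eval_steps=25,  # the results table logs validation every 25 steps
)
```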