---
license: mit
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: tmVar_5e-05_30_03
  results: []
---

# tmVar_5e-05_30_03

This model is a fine-tuned version of [microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext](https://huggingface.co/microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0230
- Precision: 0.8677
- Recall: 0.8865
- F1: 0.8770
- Accuracy: 0.9964

## Model description

Trained on a token-classification dataset with max_length=475.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 500

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.3602        | 1.39  | 25   | 0.0547          | 0.4823    | 0.3676 | 0.4172 | 0.9851   |
| 0.0498        | 2.78  | 50   | 0.0305          | 0.4518    | 0.5568 | 0.4988 | 0.9912   |
| 0.0237        | 4.17  | 75   | 0.0198          | 0.6338    | 0.7297 | 0.6784 | 0.9942   |
| 0.0089        | 5.56  | 100  | 0.0164          | 0.7895    | 0.8919 | 0.8376 | 0.9960   |
| 0.0036        | 6.94  | 125  | 0.0138          | 0.7826    | 0.8757 | 0.8265 | 0.9967   |
| 0.0023        | 8.33  | 150  | 0.0148          | 0.8462    | 0.8919 | 0.8684 | 0.9969   |
| 0.0012        | 9.72  | 175  | 0.0159          | 0.7890    | 0.9297 | 0.8536 | 0.9966   |
| 0.0012        | 11.11 | 200  | 0.0163          | 0.8450    | 0.9135 | 0.8779 | 0.9970   |
| 0.0010        | 12.50 | 225  | 0.0165          | 0.8534    | 0.8811 | 0.8670 | 0.9967   |
| 0.0012        | 13.89 | 250  | 0.0215          | 0.8020    | 0.8757 | 0.8372 | 0.9961   |
| 0.0008        | 15.28 | 275  | 0.0192          | 0.8750    | 0.9081 | 0.8912 | 0.9970   |
| 0.0007        | 16.67 | 300  | 0.0192          | 0.8750    | 0.9081 | 0.8912 | 0.9970   |
| 0.0005        | 18.06 | 325  | 0.0192          | 0.8750    | 0.9081 | 0.8912 | 0.9970   |
| 0.0009        | 19.44 | 350  | 0.0230          | 0.8677    | 0.8865 | 0.8770 | 0.9964   |

### Framework versions

- Transformers 4.27.4
- Pytorch 1.13.1+cu116
- Datasets 2.11.0
- Tokenizers 0.13.2
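
### Hyperparameter sketch

As a rough illustration, the hyperparameters listed above can be expressed as `transformers` `TrainingArguments`. This is a minimal sketch, not the original training script: the dataset preparation, label list, and data collator are omitted, and the number of labels is an assumption. Adam's betas and epsilon match the Trainer defaults, so they are not set explicitly.

```python
# Sketch mapping the reported hyperparameters onto TrainingArguments.
# Assumptions: token-classification head, num_labels is a placeholder,
# dataset loading and evaluation metrics are left out.
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    TrainingArguments,
)

base = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract-fulltext"
tokenizer = AutoTokenizer.from_pretrained(base, model_max_length=475)
model = AutoModelForTokenClassification.from_pretrained(base, num_labels=3)  # label count assumed

args = TrainingArguments(
    output_dir="tmVar_5e-05_30_03",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=500,
    evaluation_strategy="steps",
    eval_steps=25,  # matches the 25-step evaluation interval in the results table
)

# trainer = Trainer(model=model, args=args, train_dataset=..., eval_dataset=..., tokenizer=tokenizer)
# trainer.train()
```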
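
### Usage sketch

The token-level precision/recall/F1 metrics indicate a token-classification (NER-style) model, so inference can be run through the standard `token-classification` pipeline. This is a minimal sketch; the model id below is a hypothetical local path or Hub repository name, and the example sentence is only for illustration.

```python
# Minimal inference sketch for a token-classification checkpoint.
# "tmVar_5e-05_30_03" is a hypothetical path/id; point it at the actual checkpoint.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "tmVar_5e-05_30_03"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",  # merge sub-word pieces into whole spans
)

text = "The patient carried the BRCA1 c.68_69delAG mutation."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```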