---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-ner-cadec-active
  results: []
---

# distilbert-base-uncased-finetuned-ner-cadec-active

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4412
- Precision: 0.4150
- Recall: 0.3935
- F1: 0.4039
- Accuracy: 0.8743
- Adr Precision: 0.3457
- Adr Recall: 0.3922
- Adr F1: 0.3675
- Disease Precision: 0.0
- Disease Recall: 0.0
- Disease F1: 0.0
- Drug Precision: 0.7453
- Drug Recall: 0.6090
- Drug F1: 0.6703
- Finding Precision: 0.0
- Finding Recall: 0.0
- Finding F1: 0.0
- Symptom Precision: 0.0
- Symptom Recall: 0.0
- Symptom F1: 0.0
- B-adr Precision: 0.6327
- B-adr Recall: 0.5360
- B-adr F1: 0.5803
- B-disease Precision: 0.0
- B-disease Recall: 0.0
- B-disease F1: 0.0
- B-drug Precision: 0.9536
- B-drug Recall: 0.6275
- B-drug F1: 0.7569
- B-finding Precision: 0.0
- B-finding Recall: 0.0
- B-finding F1: 0.0
- B-symptom Precision: 0.0
- B-symptom Recall: 0.0
- B-symptom F1: 0.0
- I-adr Precision: 0.3171
- I-adr Recall: 0.3569
- I-adr F1: 0.3358
- I-disease Precision: 0.0
- I-disease Recall: 0.0
- I-disease F1: 0.0
- I-drug Precision: 0.8425
- I-drug Recall: 0.6805
- I-drug F1: 0.7529
- I-finding Precision: 0.0
- I-finding Recall: 0.0
- I-finding F1: 0.0
- I-symptom Precision: 0.0
- I-symptom Recall: 0.0
- I-symptom F1: 0.0
- Macro Avg F1: 0.2426
- Weighted Avg F1: 0.4834

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Adr Precision | Adr Recall | Adr F1 | Disease Precision | Disease Recall | Disease F1 | Drug Precision | Drug Recall | Drug F1 | Finding Precision | Finding Recall | Finding F1 | Symptom Precision | Symptom Recall | Symptom F1 | B-adr Precision | B-adr Recall | B-adr F1 | B-disease Precision | B-disease Recall | B-disease F1 | B-drug Precision | B-drug Recall | B-drug F1 | B-finding Precision | B-finding Recall | B-finding F1 | B-symptom Precision | B-symptom Recall | B-symptom F1 | I-adr Precision | I-adr Recall | I-adr F1 | I-disease Precision | I-disease Recall | I-disease F1 | I-drug Precision | I-drug Recall | I-drug F1 | I-finding Precision | I-finding Recall | I-finding F1 | I-symptom Precision | I-symptom Recall | I-symptom F1 | Macro Avg F1 | Weighted Avg F1 |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 16 | 0.8351 | 0.0 | 0.0 | 0.0 | 0.7726 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 32 | 0.6134 | 0.1945 | 0.1326 | 0.1577 | 0.8141 | 0.2006 | 0.1919 | 0.1962 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.1023 | 0.1857 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0445 | 0.0519 | 0.0479 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0234 | 0.0357 |
| No log | 3.0 | 48 | 0.5354 | 0.3707 | 0.2591 | 0.3050 | 0.8433 | 0.2612 | 0.2214 | 0.2396 | 0.0 | 0.0 | 0.0 | 0.9378 | 0.5274 | 0.6751 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6628 | 0.0314 | 0.0600 | 0.0 | 0.0 | 0.0 | 0.9920 | 0.5583 | 0.7145 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0717 | 0.0736 | 0.0726 | 0.0 | 0.0 | 0.0 | 0.9882 | 0.5307 | 0.6905 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1538 | 0.2005 |
| No log | 4.0 | 64 | 0.4985 | 0.3865 | 0.2823 | 0.3263 | 0.8498 | 0.2805 | 0.2491 | 0.2639 | 0.0 | 0.0 | 0.0 | 0.9429 | 0.5480 | 0.6931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7011 | 0.0860 | 0.1532 | 0.0 | 0.0 | 0.0 | 0.9829 | 0.5691 | 0.7209 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0971 | 0.1036 | 0.1002 | 0.0 | 0.0 | 0.0 | 0.9762 | 0.5542 | 0.7070 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1681 | 0.2455 |
| No log | 5.0 | 80 | 0.4648 | 0.3699 | 0.3099 | 0.3373 | 0.8555 | 0.2758 | 0.2817 | 0.2787 | 0.0 | 0.0 | 0.0 | 0.8730 | 0.5731 | 0.6919 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6282 | 0.2127 | 0.3178 | 0.0 | 0.0 | 0.0 | 0.9750 | 0.5943 | 0.7384 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1388 | 0.1654 | 0.1509 | 0.0 | 0.0 | 0.0 | 0.9095 | 0.5894 | 0.7152 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1922 | 0.3240 |
| No log | 6.0 | 96 | 0.4521 | 0.3858 | 0.3462 | 0.3649 | 0.8659 | 0.3021 | 0.3311 | 0.3159 | 0.0 | 0.0 | 0.0 | 0.8378 | 0.5839 | 0.6882 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6235 | 0.4111 | 0.4955 | 0.0 | 0.0 | 0.0 | 0.9695 | 0.5987 | 0.7403 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2352 | 0.2718 | 0.2522 | 0.0 | 0.0 | 0.0 | 0.8953 | 0.6173 | 0.7308 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2219 | 0.4222 |
| No log | 7.0 | 112 | 0.4401 | 0.3818 | 0.3471 | 0.3636 | 0.8670 | 0.3023 | 0.3295 | 0.3153 | 0.0 | 0.0 | 0.0 | 0.7662 | 0.5937 | 0.6690 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6297 | 0.4026 | 0.4912 | 0.0 | 0.0 | 0.0 | 0.9566 | 0.6131 | 0.7473 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2293 | 0.2661 | 0.2463 | 0.0 | 0.0 | 0.0 | 0.8524 | 0.6516 | 0.7386 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2223 | 0.4204 |
| No log | 8.0 | 128 | 0.4457 | 0.4104 | 0.3754 | 0.3921 | 0.8719 | 0.3365 | 0.3697 | 0.3523 | 0.0 | 0.0 | 0.0 | 0.7706 | 0.5964 | 0.6724 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6325 | 0.4941 | 0.5548 | 0.0 | 0.0 | 0.0 | 0.9607 | 0.6140 | 0.7492 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2955 | 0.3279 | 0.3109 | 0.0 | 0.0 | 0.0 | 0.8663 | 0.6606 | 0.7496 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2364 | 0.4651 |
| No log | 9.0 | 144 | 0.4409 | 0.4104 | 0.3911 | 0.4005 | 0.8740 | 0.3411 | 0.3903 | 0.3641 | 0.0 | 0.0 | 0.0 | 0.7478 | 0.6036 | 0.6680 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6279 | 0.5404 | 0.5809 | 0.0 | 0.0 | 0.0 | 0.9533 | 0.6230 | 0.7535 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3126 | 0.3531 | 0.3316 | 0.0 | 0.0 | 0.0 | 0.8482 | 0.6760 | 0.7524 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2418 | 0.4819 |
| No log | 10.0 | 160 | 0.4412 | 0.4150 | 0.3935 | 0.4039 | 0.8743 | 0.3457 | 0.3922 | 0.3675 | 0.0 | 0.0 | 0.0 | 0.7453 | 0.6090 | 0.6703 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6327 | 0.5360 | 0.5803 | 0.0 | 0.0 | 0.0 | 0.9536 | 0.6275 | 0.7569 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3171 | 0.3569 | 0.3358 | 0.0 | 0.0 | 0.0 | 0.8425 | 0.6805 | 0.7529 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2426 | 0.4834 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
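The per-type metrics reported above come in two flavors: the entity-level rows (Adr, Disease, Drug, Finding, Symptom) score exact span matches over the decoded BIO tags, while the B-*/I-* rows score individual tokens. The sketch below illustrates the entity-level computation under a strict-BIO reading; function names and example tag sequences are illustrative, and this is not the evaluation code that produced this card's numbers:

```python
def extract_entities(tags):
    """Collect (type, start, end) spans from a strict-BIO tag sequence.

    Stray I- tags with no open span of the same type are ignored in this sketch.
    """
    entities, etype, start = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # "O" sentinel flushes the last span
        if not (tag.startswith("I-") and etype == tag[2:]):
            if etype is not None:           # the current span ends just before i
                entities.append((etype, start, i))
                etype, start = None, None
            if tag.startswith("B-"):        # a new span opens at i
                etype, start = tag[2:], i
    return entities


def entity_scores(gold_tags, pred_tags):
    """Exact-match precision/recall/F1 over entity spans."""
    gold = set(extract_entities(gold_tags))
    pred = set(extract_entities(pred_tags))
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    denom = precision + recall
    return precision, recall, (2 * precision * recall / denom if denom else 0.0)


# Toy example: the prediction recovers the "adr" span but misses the "drug" one.
p, r, f1 = entity_scores(
    ["O", "B-adr", "I-adr", "O", "B-drug"],
    ["O", "B-adr", "I-adr", "O", "O"],
)
print(p, r, round(f1, 4))  # 1.0 0.5 0.6667
```

The same harmonic-mean formula links each precision/recall pair to its F1 in the table; e.g. the final overall precision 0.4150 and recall 0.3935 combine to the reported F1 of roughly 0.4039.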