---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-ner-cadec-active
  results: []
---

# distilbert-base-uncased-finetuned-ner-cadec-active

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3991
- Precision: 0.4343
- Recall: 0.3889
- F1: 0.4104
- Accuracy: 0.8844
- Adr Precision: 0.3362
- Adr Recall: 0.3387
- Adr F1: 0.3374
- Disease Precision: 0.0
- Disease Recall: 0.0
- Disease F1: 0.0
- Drug Precision: 0.8077
- Drug Recall: 0.7819
- Drug F1: 0.7946
- Finding Precision: 0.0
- Finding Recall: 0.0
- Finding F1: 0.0
- Symptom Precision: 0.0
- Symptom Recall: 0.0
- Symptom F1: 0.0
- B-adr Precision: 0.6463
- B-adr Recall: 0.3827
- B-adr F1: 0.4807
- B-disease Precision: 0.0
- B-disease Recall: 0.0
- B-disease F1: 0.0
- B-drug Precision: 0.9463
- B-drug Recall: 0.75
- B-drug F1: 0.8368
- B-finding Precision: 0.0
- B-finding Recall: 0.0
- B-finding F1: 0.0
- B-symptom Precision: 0.0
- B-symptom Recall: 0.0
- B-symptom F1: 0.0
- I-adr Precision: 0.2517
- I-adr Recall: 0.2675
- I-adr F1: 0.2594
- I-disease Precision: 0.0
- I-disease Recall: 0.0
- I-disease F1: 0.0
- I-drug Precision: 0.7790
- I-drug Recall: 0.7540
- I-drug F1: 0.7663
- I-finding Precision: 0.0
- I-finding Recall: 0.0
- I-finding F1: 0.0
- I-symptom Precision: 0.0
- I-symptom Recall: 0.0
- I-symptom F1: 0.0
- Macro Avg F1: 0.2343
- Weighted Avg F1: 0.4310

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Adr Precision | Adr Recall | Adr F1 | Disease Precision | Disease Recall | Disease F1 | Drug Precision | Drug Recall | Drug F1 | Finding Precision | Finding Recall | Finding F1 | Symptom Precision | Symptom Recall | Symptom F1 | B-adr Precision | B-adr Recall | B-adr F1 | B-disease Precision | B-disease Recall | B-disease F1 | B-drug Precision | B-drug Recall | B-drug F1 | B-finding Precision | B-finding Recall | B-finding F1 | B-symptom Precision | B-symptom Recall | B-symptom F1 | I-adr Precision | I-adr Recall | I-adr F1 | I-disease Precision | I-disease Recall | I-disease F1 | I-drug Precision | I-drug Recall | I-drug F1 | I-finding Precision | I-finding Recall | I-finding F1 | I-symptom Precision | I-symptom Recall | I-symptom F1 | Macro Avg F1 | Weighted Avg F1 |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 16 | 0.8107 | 0.0 | 0.0 | 0.0 | 0.7876 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 32 | 0.5639 | 0.2126 | 0.1382 | 0.1675 | 0.8328 | 0.2139 | 0.1962 | 0.2047 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0213 | 0.0417 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0555 | 0.0628 | 0.0589 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0101 | 0.0234 |
| No log | 3.0 | 48 | 0.4822 | 0.4020 | 0.2835 | 0.3325 | 0.8593 | 0.3002 | 0.2456 | 0.2702 | 0.0 | 0.0 | 0.0 | 0.8571 | 0.5745 | 0.6879 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9841 | 0.6596 | 0.7898 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0552 | 0.0557 | 0.0554 | 0.0 | 0.0 | 0.0 | 0.9730 | 0.5775 | 0.7248 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1570 | 0.1809 |
| No log | 4.0 | 64 | 0.4422 | 0.4168 | 0.3255 | 0.3655 | 0.8608 | 0.3084 | 0.2820 | 0.2946 | 0.0 | 0.0 | 0.0 | 0.9254 | 0.6596 | 0.7702 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5 | 0.0047 | 0.0094 | 0.0 | 0.0 | 0.0 | 0.9767 | 0.6702 | 0.7950 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0653 | 0.0736 | 0.0692 | 0.0 | 0.0 | 0.0 | 0.9549 | 0.6791 | 0.7937 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1667 | 0.1967 |
| No log | 5.0 | 80 | 0.4164 | 0.4113 | 0.3347 | 0.3691 | 0.8649 | 0.3130 | 0.2907 | 0.3014 | 0.0 | 0.0 | 0.0 | 0.8141 | 0.6755 | 0.7384 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5833 | 0.0551 | 0.1007 | 0.0 | 0.0 | 0.0 | 0.9699 | 0.6862 | 0.8037 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0905 | 0.1023 | 0.0960 | 0.0 | 0.0 | 0.0 | 0.8903 | 0.7380 | 0.8070 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1808 | 0.2409 |
| No log | 6.0 | 96 | 0.4044 | 0.4079 | 0.3490 | 0.3762 | 0.8746 | 0.3103 | 0.3009 | 0.3055 | 0.0 | 0.0 | 0.0 | 0.7929 | 0.7128 | 0.7507 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6182 | 0.2142 | 0.3181 | 0.0 | 0.0 | 0.0 | 0.9710 | 0.7128 | 0.8221 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1532 | 0.1706 | 0.1614 | 0.0 | 0.0 | 0.0 | 0.8512 | 0.7647 | 0.8056 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2107 | 0.3430 |
| No log | 7.0 | 112 | 0.4013 | 0.4118 | 0.3797 | 0.3951 | 0.8755 | 0.3155 | 0.3256 | 0.3205 | 0.0 | 0.0 | 0.0 | 0.7696 | 0.7819 | 0.7757 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6202 | 0.2803 | 0.3861 | 0.0 | 0.0 | 0.0 | 0.9524 | 0.7447 | 0.8358 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1751 | 0.2047 | 0.1887 | 0.0 | 0.0 | 0.0 | 0.7696 | 0.7861 | 0.7778 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2188 | 0.3750 |
| No log | 8.0 | 128 | 0.4008 | 0.4297 | 0.3756 | 0.4009 | 0.8820 | 0.3275 | 0.3241 | 0.3258 | 0.0 | 0.0 | 0.0 | 0.8324 | 0.7660 | 0.7978 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6507 | 0.3433 | 0.4495 | 0.0 | 0.0 | 0.0 | 0.9653 | 0.7394 | 0.8373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2271 | 0.2406 | 0.2337 | 0.0 | 0.0 | 0.0 | 0.8129 | 0.7433 | 0.7765 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2297 | 0.4125 |
| No log | 9.0 | 144 | 0.3987 | 0.4323 | 0.3920 | 0.4112 | 0.8843 | 0.3348 | 0.3430 | 0.3388 | 0.0 | 0.0 | 0.0 | 0.8122 | 0.7819 | 0.7967 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6380 | 0.3969 | 0.4893 | 0.0 | 0.0 | 0.0 | 0.9658 | 0.75 | 0.8443 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2487 | 0.2639 | 0.2561 | 0.0 | 0.0 | 0.0 | 0.7790 | 0.7540 | 0.7663 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2356 | 0.4339 |
| No log | 10.0 | 160 | 0.3991 | 0.4343 | 0.3889 | 0.4104 | 0.8844 | 0.3362 | 0.3387 | 0.3374 | 0.0 | 0.0 | 0.0 | 0.8077 | 0.7819 | 0.7946 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6463 | 0.3827 | 0.4807 | 0.0 | 0.0 | 0.0 | 0.9463 | 0.75 | 0.8368 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2517 | 0.2675 | 0.2594 | 0.0 | 0.0 | 0.0 | 0.7790 | 0.7540 | 0.7663 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2343 | 0.4310 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
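As a sanity check on the evaluation numbers above: each reported F1 (overall, per-entity, and per-tag) is the harmonic mean of its precision and recall, which is the convention used by seqeval-style token-classification metrics (an assumption here, since the metric code is not shown). A minimal sketch in Python, with values copied from the final-epoch results:

```python
# Verify that each reported F1 equals the harmonic mean of precision and recall.
# Values below are copied from the evaluation results in this card.
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; 0.0 when both are zero."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

reported = {
    "overall": (0.4343, 0.3889, 0.4104),
    "Drug":    (0.8077, 0.7819, 0.7946),
    "B-drug":  (0.9463, 0.75,   0.8368),
    "I-drug":  (0.7790, 0.7540, 0.7663),
}

for name, (p, r, f) in reported.items():
    # Allow rounding slack, since the card reports four decimal places.
    assert abs(f1(p, r) - f) < 5e-4, name
    print(f"{name}: F1 = {f1(p, r):.4f}")
```

The same relation explains the zero rows: any label with zero precision and zero recall (Disease, Finding, Symptom) necessarily reports an F1 of zero, which drags the macro-average F1 (0.2343) well below the weighted average (0.4310).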