---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-ner-cadec-active
  results: []
---
# distilbert-base-uncased-finetuned-ner-cadec-active

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on a dataset the trainer did not record (the model name suggests the CADEC corpus). It achieves the following results on the evaluation set:
- Loss: 0.3873
- Precision: 0.4488
- Recall: 0.4483
- F1: 0.4485
- Accuracy: 0.8907
- Adr Precision: 0.3791
- Adr Recall: 0.4375
- Adr F1: 0.4062
- Disease Precision: 0.0
- Disease Recall: 0.0
- Disease F1: 0.0
- Drug Precision: 0.7527
- Drug Recall: 0.7287
- Drug F1: 0.7405
- Finding Precision: 0.0
- Finding Recall: 0.0
- Finding F1: 0.0
- Symptom Precision: 0.0
- Symptom Recall: 0.0
- Symptom F1: 0.0
- B-adr Precision: 0.6329
- B-adr Recall: 0.5512
- B-adr F1: 0.5892
- B-disease Precision: 0.0
- B-disease Recall: 0.0
- B-disease F1: 0.0
- B-drug Precision: 0.9718
- B-drug Recall: 0.7340
- B-drug F1: 0.8364
- B-finding Precision: 0.0
- B-finding Recall: 0.0
- B-finding F1: 0.0
- B-symptom Precision: 0.0
- B-symptom Recall: 0.0
- B-symptom F1: 0.0
- I-adr Precision: 0.3287
- I-adr Recall: 0.3860
- I-adr F1: 0.3551
- I-disease Precision: 0.0
- I-disease Recall: 0.0
- I-disease F1: 0.0
- I-drug Precision: 0.8066
- I-drug Recall: 0.7807
- I-drug F1: 0.7935
- I-finding Precision: 0.0
- I-finding Recall: 0.0
- I-finding F1: 0.0
- I-symptom Precision: 0.0
- I-symptom Recall: 0.0
- I-symptom F1: 0.0
- Macro Avg F1: 0.2574
- Weighted Avg F1: 0.5041
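Each F1 value above is the harmonic mean of the corresponding precision and recall, which can be cross-checked against the reported numbers (a quick sketch using this card's overall and Drug figures):

```python
# Verify that the reported F1 scores are the harmonic mean of
# precision and recall, using figures from the evaluation set above.

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (0.0 if both are 0)."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Overall: precision 0.4488, recall 0.4483
print(round(f1_score(0.4488, 0.4483), 4))  # -> 0.4485
# Drug: precision 0.7527, recall 0.7287
print(round(f1_score(0.7527, 0.7287), 4))  # -> 0.7405
```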
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
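With `lr_scheduler_type: linear`, the learning rate decays from its initial value to zero over the total number of optimizer steps. A minimal sketch, assuming no warmup steps (the Trainer default) and the 160 total steps shown in the results table:

```python
# Sketch of the linear learning-rate schedule used for this run,
# assuming zero warmup steps: lr falls linearly from 2e-05 to 0.

def linear_lr(step: int, total_steps: int, initial_lr: float = 2e-05) -> float:
    """Learning rate after `step` optimizer steps under linear decay."""
    return initial_lr * max(0.0, 1.0 - step / total_steps)

total = 160  # 10 epochs x 16 steps per epoch, per the results table
print(linear_lr(0, total))    # -> 2e-05 (start of training)
print(linear_lr(80, total))   # -> 1e-05 (halfway)
print(linear_lr(160, total))  # -> 0.0 (end of training)
```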
### Training results
Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Adr Precision | Adr Recall | Adr F1 | Disease Precision | Disease Recall | Disease F1 | Drug Precision | Drug Recall | Drug F1 | Finding Precision | Finding Recall | Finding F1 | Symptom Precision | Symptom Recall | Symptom F1 | B-adr Precision | B-adr Recall | B-adr F1 | B-disease Precision | B-disease Recall | B-disease F1 | B-drug Precision | B-drug Recall | B-drug F1 | B-finding Precision | B-finding Recall | B-finding F1 | B-symptom Precision | B-symptom Recall | B-symptom F1 | I-adr Precision | I-adr Recall | I-adr F1 | I-disease Precision | I-disease Recall | I-disease F1 | I-drug Precision | I-drug Recall | I-drug F1 | I-finding Precision | I-finding Recall | I-finding F1 | I-symptom Precision | I-symptom Recall | I-symptom F1 | Macro Avg F1 | Weighted Avg F1 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 16 | 0.8554 | 0.0 | 0.0 | 0.0 | 0.7876 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
No log | 2.0 | 32 | 0.6110 | 0.1709 | 0.0901 | 0.1180 | 0.8226 | 0.1709 | 0.1279 | 0.1463 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0699 | 0.0646 | 0.0672 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0067 | 0.0215 |
No log | 3.0 | 48 | 0.5114 | 0.2118 | 0.1433 | 0.1709 | 0.8496 | 0.2612 | 0.2035 | 0.2288 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.55 | 0.0173 | 0.0336 | 0.0 | 0.0 | 0.0 | 0.984 | 0.6543 | 0.7859 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0918 | 0.0880 | 0.0898 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0909 | 0.1259 |
No log | 4.0 | 64 | 0.4618 | 0.4412 | 0.3224 | 0.3726 | 0.8660 | 0.3271 | 0.2791 | 0.3012 | 0.0 | 0.0 | 0.0 | 0.9685 | 0.6543 | 0.7810 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6375 | 0.0803 | 0.1427 | 0.0 | 0.0 | 0.0 | 0.9843 | 0.6649 | 0.7937 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1209 | 0.1257 | 0.1232 | 0.0 | 0.0 | 0.0 | 0.9685 | 0.6578 | 0.7834 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1843 | 0.2613 |
No log | 5.0 | 80 | 0.4254 | 0.4072 | 0.3460 | 0.3741 | 0.8679 | 0.3080 | 0.3125 | 0.3102 | 0.0 | 0.0 | 0.0 | 0.9318 | 0.6543 | 0.7688 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5960 | 0.1858 | 0.2833 | 0.0 | 0.0 | 0.0 | 0.9843 | 0.6649 | 0.7937 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1381 | 0.1652 | 0.1504 | 0.0 | 0.0 | 0.0 | 0.9394 | 0.6631 | 0.7774 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2005 | 0.3207 |
No log | 6.0 | 96 | 0.4048 | 0.4377 | 0.4063 | 0.4214 | 0.8835 | 0.3634 | 0.3983 | 0.3800 | 0.0 | 0.0 | 0.0 | 0.8039 | 0.6543 | 0.7214 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6335 | 0.4409 | 0.5200 | 0.0 | 0.0 | 0.0 | 0.9766 | 0.6649 | 0.7911 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2772 | 0.3250 | 0.2992 | 0.0 | 0.0 | 0.0 | 0.8618 | 0.7005 | 0.7729 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2383 | 0.4538 |
No log | 7.0 | 112 | 0.3952 | 0.4114 | 0.3920 | 0.4015 | 0.8815 | 0.3303 | 0.3663 | 0.3473 | 0.0 | 0.0 | 0.0 | 0.7798 | 0.6968 | 0.7360 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6121 | 0.4126 | 0.4929 | 0.0 | 0.0 | 0.0 | 0.9784 | 0.7234 | 0.8318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2394 | 0.2926 | 0.2633 | 0.0 | 0.0 | 0.0 | 0.8383 | 0.7487 | 0.7910 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2379 | 0.4388 |
No log | 8.0 | 128 | 0.3922 | 0.4575 | 0.4411 | 0.4492 | 0.8884 | 0.3821 | 0.4331 | 0.4060 | 0.0 | 0.0 | 0.0 | 0.8210 | 0.7074 | 0.76 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6331 | 0.5354 | 0.5802 | 0.0 | 0.0 | 0.0 | 0.9784 | 0.7234 | 0.8318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3281 | 0.3788 | 0.3517 | 0.0 | 0.0 | 0.0 | 0.8758 | 0.7540 | 0.8103 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2574 | 0.5010 |
No log | 9.0 | 144 | 0.3886 | 0.4549 | 0.4391 | 0.4469 | 0.8887 | 0.3815 | 0.4259 | 0.4025 | 0.0 | 0.0 | 0.0 | 0.7771 | 0.7234 | 0.7493 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6271 | 0.5244 | 0.5712 | 0.0 | 0.0 | 0.0 | 0.9716 | 0.7287 | 0.8328 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3297 | 0.3770 | 0.3518 | 0.0 | 0.0 | 0.0 | 0.8333 | 0.7754 | 0.8033 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2559 | 0.4971 |
No log | 10.0 | 160 | 0.3873 | 0.4488 | 0.4483 | 0.4485 | 0.8907 | 0.3791 | 0.4375 | 0.4062 | 0.0 | 0.0 | 0.0 | 0.7527 | 0.7287 | 0.7405 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6329 | 0.5512 | 0.5892 | 0.0 | 0.0 | 0.0 | 0.9718 | 0.7340 | 0.8364 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3287 | 0.3860 | 0.3551 | 0.0 | 0.0 | 0.0 | 0.8066 | 0.7807 | 0.7935 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2574 | 0.5041 |
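The entity-level scores (e.g. the final Adr F1 of 0.4062) are stricter than the per-tag B-/I- scores: an entity counts as correct only when its whole B-…/I-… span matches. A minimal sketch of how BIO tags are grouped into spans before entity-level scoring (the grouping logic is illustrative, not the exact seqeval implementation; tag names mirror this card's label set):

```python
# Group token-level BIO tags into entity spans. Entity-level metrics
# score whole spans, which is why they can sit well below the
# per-tag B-/I- scores reported above.

def bio_to_spans(tags):
    """Return (label, start, end) spans from a BIO tag sequence.

    `end` is exclusive; orphan I- tags without a preceding B- are dropped.
    """
    spans, start, label = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel flushes the last span
        if tag.startswith("B-") or tag == "O" or (
            tag.startswith("I-") and tag[2:] != label
        ):
            if label is not None:
                spans.append((label, start, i))
            start, label = (i, tag[2:]) if tag.startswith("B-") else (None, None)
    return spans

tags = ["B-drug", "I-drug", "O", "B-adr", "I-adr", "I-adr"]
print(bio_to_spans(tags))
# -> [('drug', 0, 2), ('adr', 3, 6)]
```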
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0