---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-ner-cadec-active
  results: []
---

# distilbert-base-uncased-finetuned-ner-cadec-active

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset (the model name suggests the CADEC corpus).
It achieves the following results on the evaluation set:
- Loss: 0.4418
- Precision: 0.4111
- Recall: 0.3715
- F1: 0.3903
- Accuracy: 0.8709
- Adr Precision: 0.3331
- Adr Recall: 0.3514
- Adr F1: 0.3420
- Disease Precision: 0.0
- Disease Recall: 0.0
- Disease F1: 0.0
- Drug Precision: 0.7366
- Drug Recall: 0.6395
- Drug F1: 0.6846
- Finding Precision: 0.0
- Finding Recall: 0.0
- Finding F1: 0.0
- Symptom Precision: 0.0
- Symptom Recall: 0.0
- Symptom F1: 0.0
- B-adr Precision: 0.6633
- B-adr Recall: 0.3935
- B-adr F1: 0.4939
- B-disease Precision: 0.0
- B-disease Recall: 0.0
- B-disease F1: 0.0
- B-drug Precision: 0.9637
- B-drug Recall: 0.6427
- B-drug F1: 0.7711
- B-finding Precision: 0.0
- B-finding Recall: 0.0
- B-finding F1: 0.0
- B-symptom Precision: 0.0
- B-symptom Recall: 0.0
- B-symptom F1: 0.0
- I-adr Precision: 0.2541
- I-adr Recall: 0.2887
- I-adr F1: 0.2703
- I-disease Precision: 0.0
- I-disease Recall: 0.0
- I-disease F1: 0.0
- I-drug Precision: 0.8042
- I-drug Recall: 0.6895
- I-drug F1: 0.7425
- I-finding Precision: 0.0
- I-finding Recall: 0.0
- I-finding F1: 0.0
- I-symptom Precision: 0.0
- I-symptom Recall: 0.0
- I-symptom F1: 0.0
- Macro Avg F1: 0.2278
- Weighted Avg F1: 0.4320

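The reported Macro Avg F1 is consistent with the unweighted mean of the ten per-tag F1 scores listed above (zero-score tags included). A minimal sanity check, with the scores copied from the evaluation results:

```python
# Per-tag F1 scores copied from the evaluation results above.
tag_f1 = {
    "B-adr": 0.4939, "B-disease": 0.0, "B-drug": 0.7711,
    "B-finding": 0.0, "B-symptom": 0.0,
    "I-adr": 0.2703, "I-disease": 0.0, "I-drug": 0.7425,
    "I-finding": 0.0, "I-symptom": 0.0,
}

# Macro average: unweighted mean over all tags, including the zero-score ones.
macro_f1 = sum(tag_f1.values()) / len(tag_f1)
print(round(macro_f1, 4))  # 0.2278, matching the reported Macro Avg F1
```

The zero scores for the disease, finding, and symptom classes drag the macro average well below the weighted average, which is dominated by the more frequent adr and drug tags.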
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
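With `lr_scheduler_type: linear`, the learning rate decays linearly from `learning_rate` to zero over the total number of training steps (160 here, per the results table: 16 steps per epoch × 10 epochs). A minimal sketch of that schedule, assuming zero warmup steps (the Trainer default; the warmup setting is not stated in this card):

```python
def linear_lr(step: int, base_lr: float = 2e-05, total_steps: int = 160,
              warmup_steps: int = 0) -> float:
    """Linear warmup (if any) followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))    # 2e-05 at the start of training
print(linear_lr(80))   # 1e-05 halfway through (end of epoch 5)
print(linear_lr(160))  # 0.0 at the final step
```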
### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | Adr Precision | Adr Recall | Adr F1 | Disease Precision | Disease Recall | Disease F1 | Drug Precision | Drug Recall | Drug F1 | Finding Precision | Finding Recall | Finding F1 | Symptom Precision | Symptom Recall | Symptom F1 | B-adr Precision | B-adr Recall | B-adr F1 | B-disease Precision | B-disease Recall | B-disease F1 | B-drug Precision | B-drug Recall | B-drug F1 | B-finding Precision | B-finding Recall | B-finding F1 | B-symptom Precision | B-symptom Recall | B-symptom F1 | I-adr Precision | I-adr Recall | I-adr F1 | I-disease Precision | I-disease Recall | I-disease F1 | I-drug Precision | I-drug Recall | I-drug F1 | I-finding Precision | I-finding Recall | I-finding F1 | I-symptom Precision | I-symptom Recall | I-symptom F1 | Macro Avg F1 | Weighted Avg F1 |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|:-------------:|:----------:|:------:|:-----------------:|:--------------:|:----------:|:--------------:|:-----------:|:-------:|:-----------------:|:--------------:|:----------:|:-----------------:|:--------------:|:----------:|:---------------:|:------------:|:--------:|:-------------------:|:----------------:|:------------:|:----------------:|:-------------:|:---------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:---------------:|:------------:|:--------:|:-------------------:|:----------------:|:------------:|:----------------:|:-------------:|:---------:|:-------------------:|:----------------:|:------------:|:-------------------:|:----------------:|:------------:|:------------:|:---------------:|
| No log | 1.0 | 16 | 0.9134 | 0.0 | 0.0 | 0.0 | 0.7726 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 32 | 0.6436 | 0.1614 | 0.0574 | 0.0847 | 0.8059 | 0.1615 | 0.0830 | 0.1097 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 1.0 | 0.0009 | 0.0018 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0879 | 0.0551 | 0.0678 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0070 | 0.0215 |
| No log | 3.0 | 48 | 0.5510 | 0.2477 | 0.1653 | 0.1982 | 0.8339 | 0.2492 | 0.2005 | 0.2222 | 0.0 | 0.0 | 0.0 | 0.2403 | 0.1327 | 0.1710 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6667 | 0.0088 | 0.0174 | 0.0 | 0.0 | 0.0 | 0.9919 | 0.5485 | 0.7064 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0677 | 0.0663 | 0.0670 | 0.0 | 0.0 | 0.0 | 1.0 | 0.1336 | 0.2357 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1026 | 0.1321 |
| No log | 4.0 | 64 | 0.5042 | 0.3921 | 0.2704 | 0.3201 | 0.8458 | 0.2796 | 0.2332 | 0.2543 | 0.0 | 0.0 | 0.0 | 0.9634 | 0.5435 | 0.6950 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7672 | 0.0245 | 0.0475 | 0.0 | 0.0 | 0.0 | 0.9873 | 0.5575 | 0.7126 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0692 | 0.0701 | 0.0696 | 0.0 | 0.0 | 0.0 | 0.9838 | 0.5469 | 0.7030 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1533 | 0.1963 |
| No log | 5.0 | 80 | 0.4672 | 0.3841 | 0.2960 | 0.3344 | 0.8510 | 0.2863 | 0.2668 | 0.2762 | 0.0 | 0.0 | 0.0 | 0.8818 | 0.5552 | 0.6813 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7578 | 0.0931 | 0.1659 | 0.0 | 0.0 | 0.0 | 0.9802 | 0.5763 | 0.7258 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0978 | 0.1099 | 0.1035 | 0.0 | 0.0 | 0.0 | 0.9165 | 0.5650 | 0.6991 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1694 | 0.2509 |
| No log | 6.0 | 96 | 0.4543 | 0.3814 | 0.3116 | 0.3430 | 0.8580 | 0.2925 | 0.2833 | 0.2878 | 0.0 | 0.0 | 0.0 | 0.7839 | 0.5758 | 0.6639 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7212 | 0.1810 | 0.2894 | 0.0 | 0.0 | 0.0 | 0.9721 | 0.5934 | 0.7369 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1459 | 0.1679 | 0.1562 | 0.0 | 0.0 | 0.0 | 0.8661 | 0.6245 | 0.7257 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1908 | 0.3164 |
| No log | 7.0 | 112 | 0.4443 | 0.3829 | 0.3356 | 0.3577 | 0.8623 | 0.3027 | 0.3084 | 0.3055 | 0.0 | 0.0 | 0.0 | 0.7103 | 0.6090 | 0.6557 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6896 | 0.2571 | 0.3745 | 0.0 | 0.0 | 0.0 | 0.9639 | 0.6239 | 0.7575 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1725 | 0.2027 | 0.1863 | 0.0 | 0.0 | 0.0 | 0.8104 | 0.6868 | 0.7435 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2062 | 0.3610 |
| No log | 8.0 | 128 | 0.4512 | 0.4119 | 0.3448 | 0.3754 | 0.8680 | 0.3295 | 0.3219 | 0.3257 | 0.0 | 0.0 | 0.0 | 0.7559 | 0.6081 | 0.6740 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6898 | 0.3345 | 0.4505 | 0.0 | 0.0 | 0.0 | 0.9652 | 0.6221 | 0.7566 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2328 | 0.2530 | 0.2425 | 0.0 | 0.0 | 0.0 | 0.8540 | 0.6760 | 0.7547 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2204 | 0.4072 |
| No log | 9.0 | 144 | 0.4413 | 0.4046 | 0.3637 | 0.3831 | 0.8697 | 0.3250 | 0.3420 | 0.3333 | 0.0 | 0.0 | 0.0 | 0.7416 | 0.6332 | 0.6831 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6717 | 0.3703 | 0.4774 | 0.0 | 0.0 | 0.0 | 0.9648 | 0.6391 | 0.7689 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2378 | 0.2744 | 0.2548 | 0.0 | 0.0 | 0.0 | 0.8135 | 0.6850 | 0.7438 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2245 | 0.4210 |
| No log | 10.0 | 160 | 0.4418 | 0.4111 | 0.3715 | 0.3903 | 0.8709 | 0.3331 | 0.3514 | 0.3420 | 0.0 | 0.0 | 0.0 | 0.7366 | 0.6395 | 0.6846 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6633 | 0.3935 | 0.4939 | 0.0 | 0.0 | 0.0 | 0.9637 | 0.6427 | 0.7711 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2541 | 0.2887 | 0.2703 | 0.0 | 0.0 | 0.0 | 0.8042 | 0.6895 | 0.7425 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.2278 | 0.4320 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0