blaze999 committed
Commit 4d0745a (1 parent: 8ec23d7)

Training complete

README.md ADDED
@@ -0,0 +1,114 @@
+ ---
+ license: mit
+ base_model: microsoft/deberta-v3-base
+ tags:
+ - generated_from_trainer
+ metrics:
+ - precision
+ - recall
+ - f1
+ - accuracy
+ model-index:
+ - name: clinical-ner
+   results: []
+ ---
+ 
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+ 
+ # clinical-ner
+ 
+ This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.8058
+ - Precision: 0.5786
+ - Recall: 0.6683
+ - F1: 0.6202
+ - Accuracy: 0.8099
+ 
+ ## Model description
+ 
+ A token-classification (NER) model: `microsoft/deberta-v3-base` with a classification head over 83 BIO tags (41 clinical entity types such as SIGN_SYMPTOM, MEDICATION, and DIAGNOSTIC_PROCEDURE, plus `O`); the full label inventory is listed in `config.json` below.
+ 
+ ## Intended uses & limitations
+ 
+ More information needed
+ 
+ ## Training and evaluation data
+ 
+ More information needed
+ 
+ ## Training procedure
+ 
+ ### Training hyperparameters
+ 
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - gradient_accumulation_steps: 2
+ - total_train_batch_size: 32
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: cosine
+ - lr_scheduler_warmup_ratio: 0.1
+ - num_epochs: 45
+ - mixed_precision_training: Native AMP
+ 
+ ### Training results
+ 
+ | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
+ | No log        | 1.0   | 5    | 4.7713          | 0.0002    | 0.001  | 0.0004 | 0.0182   |
+ | No log        | 2.0   | 10   | 4.2264          | 0.0002    | 0.0008 | 0.0003 | 0.1481   |
+ | No log        | 3.0   | 15   | 3.6238          | 0.0004    | 0.0003 | 0.0003 | 0.4575   |
+ | 4.2324        | 4.0   | 20   | 2.8751          | 0.0       | 0.0    | 0.0    | 0.4734   |
+ | 4.2324        | 5.0   | 25   | 2.4550          | 0.0306    | 0.0008 | 0.0015 | 0.4739   |
+ | 4.2324        | 6.0   | 30   | 2.1920          | 0.0722    | 0.0437 | 0.0545 | 0.5007   |
+ | 4.2324        | 7.0   | 35   | 1.9841          | 0.1137    | 0.1087 | 0.1112 | 0.5392   |
+ | 2.3521        | 8.0   | 40   | 1.8153          | 0.1956    | 0.189  | 0.1922 | 0.5829   |
+ | 2.3521        | 9.0   | 45   | 1.6504          | 0.2539    | 0.2617 | 0.2578 | 0.6218   |
+ | 2.3521        | 10.0  | 50   | 1.4801          | 0.3607    | 0.3787 | 0.3695 | 0.6782   |
+ | 2.3521        | 11.0  | 55   | 1.3417          | 0.3933    | 0.433  | 0.4122 | 0.7021   |
+ | 1.6185        | 12.0  | 60   | 1.2333          | 0.4054    | 0.4795 | 0.4394 | 0.7203   |
+ | 1.6185        | 13.0  | 65   | 1.1490          | 0.4307    | 0.5125 | 0.4680 | 0.7347   |
+ | 1.6185        | 14.0  | 70   | 1.0750          | 0.4412    | 0.543  | 0.4868 | 0.7503   |
+ | 1.6185        | 15.0  | 75   | 1.0179          | 0.4816    | 0.5637 | 0.5195 | 0.7619   |
+ | 1.1438        | 16.0  | 80   | 0.9774          | 0.4899    | 0.578  | 0.5303 | 0.7689   |
+ | 1.1438        | 17.0  | 85   | 0.9475          | 0.5005    | 0.5955 | 0.5439 | 0.7743   |
+ | 1.1438        | 18.0  | 90   | 0.9192          | 0.5082    | 0.6078 | 0.5535 | 0.7788   |
+ | 1.1438        | 19.0  | 95   | 0.8923          | 0.5151    | 0.6085 | 0.5579 | 0.7828   |
+ | 0.8863        | 20.0  | 100  | 0.8691          | 0.5263    | 0.6242 | 0.5711 | 0.7882   |
+ | 0.8863        | 21.0  | 105  | 0.8604          | 0.5358    | 0.6342 | 0.5809 | 0.7907   |
+ | 0.8863        | 22.0  | 110  | 0.8474          | 0.5429    | 0.641  | 0.5879 | 0.7946   |
+ | 0.8863        | 23.0  | 115  | 0.8362          | 0.5493    | 0.644  | 0.5929 | 0.7969   |
+ | 0.7361        | 24.0  | 120  | 0.8284          | 0.5531    | 0.6512 | 0.5982 | 0.7994   |
+ | 0.7361        | 25.0  | 125  | 0.8325          | 0.5555    | 0.6565 | 0.6018 | 0.8001   |
+ | 0.7361        | 26.0  | 130  | 0.8156          | 0.5686    | 0.6562 | 0.6093 | 0.8035   |
+ | 0.7361        | 27.0  | 135  | 0.8177          | 0.5634    | 0.6625 | 0.6089 | 0.8039   |
+ | 0.6449        | 28.0  | 140  | 0.8152          | 0.5643    | 0.6567 | 0.6070 | 0.8036   |
+ | 0.6449        | 29.0  | 145  | 0.8109          | 0.5700    | 0.6647 | 0.6137 | 0.8066   |
+ | 0.6449        | 30.0  | 150  | 0.8164          | 0.5697    | 0.6653 | 0.6138 | 0.8055   |
+ | 0.6449        | 31.0  | 155  | 0.8081          | 0.5742    | 0.6627 | 0.6153 | 0.8085   |
+ | 0.5912        | 32.0  | 160  | 0.8130          | 0.5687    | 0.6677 | 0.6142 | 0.8067   |
+ | 0.5912        | 33.0  | 165  | 0.8048          | 0.5779    | 0.6637 | 0.6179 | 0.8089   |
+ | 0.5912        | 34.0  | 170  | 0.8096          | 0.5760    | 0.669  | 0.6190 | 0.8085   |
+ | 0.5912        | 35.0  | 175  | 0.8063          | 0.5790    | 0.6677 | 0.6202 | 0.8091   |
+ | 0.5625        | 36.0  | 180  | 0.8052          | 0.5755    | 0.6673 | 0.6180 | 0.8094   |
+ | 0.5625        | 37.0  | 185  | 0.8063          | 0.5753    | 0.6667 | 0.6176 | 0.8093   |
+ | 0.5625        | 38.0  | 190  | 0.8055          | 0.5783    | 0.6677 | 0.6198 | 0.8103   |
+ | 0.5625        | 39.0  | 195  | 0.8052          | 0.5792    | 0.668  | 0.6205 | 0.8099   |
+ | 0.5442        | 40.0  | 200  | 0.8052          | 0.5798    | 0.6685 | 0.6210 | 0.8097   |
+ | 0.5442        | 41.0  | 205  | 0.8055          | 0.5784    | 0.6683 | 0.6201 | 0.8098   |
+ | 0.5442        | 42.0  | 210  | 0.8056          | 0.5789    | 0.6685 | 0.6205 | 0.8100   |
+ | 0.5442        | 43.0  | 215  | 0.8057          | 0.5786    | 0.6683 | 0.6202 | 0.8100   |
+ | 0.5397        | 44.0  | 220  | 0.8057          | 0.5786    | 0.6683 | 0.6202 | 0.8099   |
+ | 0.5397        | 45.0  | 225  | 0.8058          | 0.5786    | 0.6683 | 0.6202 | 0.8099   |
+ 
+ 
+ ### Framework versions
+ 
+ - Transformers 4.37.0
+ - Pytorch 2.1.2
+ - Datasets 2.1.0
+ - Tokenizers 0.15.1
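
For reference, a minimal inference sketch against this checkpoint (untested; the Hub id `blaze999/clinical-ner` is an assumption based on the committer and model name — substitute a local path to this repo if it differs):

```python
# Minimal NER inference sketch. The model id below is an assumption; point it
# at a local clone of this repo if the checkpoint is published elsewhere.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="blaze999/clinical-ner",   # assumed Hub id
    aggregation_strategy="simple",   # merge B-/I- pieces into entity spans
)

text = "A 45-year-old man presented with chest pain and was given 300 mg of aspirin."
for entity in ner(text):
    print(entity["entity_group"], "->", entity["word"], round(float(entity["score"]), 3))
```

And a hedged reconstruction of the training setup from the hyperparameters listed above (a sketch, not the exact training script; dataset loading, tokenization, and metric computation are omitted):

```python
# Sketch reproducing the hyperparameters from the card above. Supplying the
# train/eval datasets is left out, so the Trainer call stays commented.
from transformers import (
    AutoModelForTokenClassification,
    Trainer,
    TrainingArguments,
)

model = AutoModelForTokenClassification.from_pretrained(
    "microsoft/deberta-v3-base", num_labels=83
)

args = TrainingArguments(
    output_dir="clinical-ner",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,   # total effective train batch size: 32
    num_train_epochs=45,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="epoch",     # assumption, matching the per-epoch table
)
# trainer = Trainer(model=model, args=args,
#                   train_dataset=..., eval_dataset=...)
# trainer.train()
```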
added_tokens.json ADDED
@@ -0,0 +1,3 @@
+ {
+   "[MASK]": 128000
+ }
config.json ADDED
@@ -0,0 +1,205 @@
+ {
+   "_name_or_path": "microsoft/deberta-v3-base",
+   "architectures": [
+     "DebertaV2ForTokenClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "O",
+     "1": "B-ACTIVITY",
+     "2": "I-ACTIVITY",
+     "3": "I-ADMINISTRATION",
+     "4": "B-ADMINISTRATION",
+     "5": "B-AGE",
+     "6": "I-AGE",
+     "7": "I-AREA",
+     "8": "B-AREA",
+     "9": "B-BIOLOGICAL_ATTRIBUTE",
+     "10": "I-BIOLOGICAL_ATTRIBUTE",
+     "11": "I-BIOLOGICAL_STRUCTURE",
+     "12": "B-BIOLOGICAL_STRUCTURE",
+     "13": "B-CLINICAL_EVENT",
+     "14": "I-CLINICAL_EVENT",
+     "15": "B-COLOR",
+     "16": "I-COLOR",
+     "17": "I-COREFERENCE",
+     "18": "B-COREFERENCE",
+     "19": "B-DATE",
+     "20": "I-DATE",
+     "21": "I-DETAILED_DESCRIPTION",
+     "22": "B-DETAILED_DESCRIPTION",
+     "23": "I-DIAGNOSTIC_PROCEDURE",
+     "24": "B-DIAGNOSTIC_PROCEDURE",
+     "25": "I-DISEASE_DISORDER",
+     "26": "B-DISEASE_DISORDER",
+     "27": "B-DISTANCE",
+     "28": "I-DISTANCE",
+     "29": "B-DOSAGE",
+     "30": "I-DOSAGE",
+     "31": "I-DURATION",
+     "32": "B-DURATION",
+     "33": "I-FAMILY_HISTORY",
+     "34": "B-FAMILY_HISTORY",
+     "35": "B-FREQUENCY",
+     "36": "I-FREQUENCY",
+     "37": "I-HEIGHT",
+     "38": "B-HEIGHT",
+     "39": "B-HISTORY",
+     "40": "I-HISTORY",
+     "41": "I-LAB_VALUE",
+     "42": "B-LAB_VALUE",
+     "43": "I-MASS",
+     "44": "B-MASS",
+     "45": "I-MEDICATION",
+     "46": "B-MEDICATION",
+     "47": "I-NONBIOLOGICAL_LOCATION",
+     "48": "B-NONBIOLOGICAL_LOCATION",
+     "49": "I-OCCUPATION",
+     "50": "B-OCCUPATION",
+     "51": "B-OTHER_ENTITY",
+     "52": "I-OTHER_ENTITY",
+     "53": "B-OTHER_EVENT",
+     "54": "I-OTHER_EVENT",
+     "55": "I-OUTCOME",
+     "56": "B-OUTCOME",
+     "57": "I-PERSONAL_BACKGROUND",
+     "58": "B-PERSONAL_BACKGROUND",
+     "59": "B-QUALITATIVE_CONCEPT",
+     "60": "I-QUALITATIVE_CONCEPT",
+     "61": "I-QUANTITATIVE_CONCEPT",
+     "62": "B-QUANTITATIVE_CONCEPT",
+     "63": "B-SEVERITY",
+     "64": "I-SEVERITY",
+     "65": "B-SEX",
+     "66": "I-SEX",
+     "67": "B-SHAPE",
+     "68": "I-SHAPE",
+     "69": "B-SIGN_SYMPTOM",
+     "70": "I-SIGN_SYMPTOM",
+     "71": "B-SUBJECT",
+     "72": "I-SUBJECT",
+     "73": "B-TEXTURE",
+     "74": "I-TEXTURE",
+     "75": "B-THERAPEUTIC_PROCEDURE",
+     "76": "I-THERAPEUTIC_PROCEDURE",
+     "77": "I-TIME",
+     "78": "B-TIME",
+     "79": "B-VOLUME",
+     "80": "I-VOLUME",
+     "81": "I-WEIGHT",
+     "82": "B-WEIGHT"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "B-ACTIVITY": 1,
+     "B-ADMINISTRATION": 4,
+     "B-AGE": 5,
+     "B-AREA": 8,
+     "B-BIOLOGICAL_ATTRIBUTE": 9,
+     "B-BIOLOGICAL_STRUCTURE": 12,
+     "B-CLINICAL_EVENT": 13,
+     "B-COLOR": 15,
+     "B-COREFERENCE": 18,
+     "B-DATE": 19,
+     "B-DETAILED_DESCRIPTION": 22,
+     "B-DIAGNOSTIC_PROCEDURE": 24,
+     "B-DISEASE_DISORDER": 26,
+     "B-DISTANCE": 27,
+     "B-DOSAGE": 29,
+     "B-DURATION": 32,
+     "B-FAMILY_HISTORY": 34,
+     "B-FREQUENCY": 35,
+     "B-HEIGHT": 38,
+     "B-HISTORY": 39,
+     "B-LAB_VALUE": 42,
+     "B-MASS": 44,
+     "B-MEDICATION": 46,
+     "B-NONBIOLOGICAL_LOCATION": 48,
+     "B-OCCUPATION": 50,
+     "B-OTHER_ENTITY": 51,
+     "B-OTHER_EVENT": 53,
+     "B-OUTCOME": 56,
+     "B-PERSONAL_BACKGROUND": 58,
+     "B-QUALITATIVE_CONCEPT": 59,
+     "B-QUANTITATIVE_CONCEPT": 62,
+     "B-SEVERITY": 63,
+     "B-SEX": 65,
+     "B-SHAPE": 67,
+     "B-SIGN_SYMPTOM": 69,
+     "B-SUBJECT": 71,
+     "B-TEXTURE": 73,
+     "B-THERAPEUTIC_PROCEDURE": 75,
+     "B-TIME": 78,
+     "B-VOLUME": 79,
+     "B-WEIGHT": 82,
+     "I-ACTIVITY": 2,
+     "I-ADMINISTRATION": 3,
+     "I-AGE": 6,
+     "I-AREA": 7,
+     "I-BIOLOGICAL_ATTRIBUTE": 10,
+     "I-BIOLOGICAL_STRUCTURE": 11,
+     "I-CLINICAL_EVENT": 14,
+     "I-COLOR": 16,
+     "I-COREFERENCE": 17,
+     "I-DATE": 20,
+     "I-DETAILED_DESCRIPTION": 21,
+     "I-DIAGNOSTIC_PROCEDURE": 23,
+     "I-DISEASE_DISORDER": 25,
+     "I-DISTANCE": 28,
+     "I-DOSAGE": 30,
+     "I-DURATION": 31,
+     "I-FAMILY_HISTORY": 33,
+     "I-FREQUENCY": 36,
+     "I-HEIGHT": 37,
+     "I-HISTORY": 40,
+     "I-LAB_VALUE": 41,
+     "I-MASS": 43,
+     "I-MEDICATION": 45,
+     "I-NONBIOLOGICAL_LOCATION": 47,
+     "I-OCCUPATION": 49,
+     "I-OTHER_ENTITY": 52,
+     "I-OTHER_EVENT": 54,
+     "I-OUTCOME": 55,
+     "I-PERSONAL_BACKGROUND": 57,
+     "I-QUALITATIVE_CONCEPT": 60,
+     "I-QUANTITATIVE_CONCEPT": 61,
+     "I-SEVERITY": 64,
+     "I-SEX": 66,
+     "I-SHAPE": 68,
+     "I-SIGN_SYMPTOM": 70,
+     "I-SUBJECT": 72,
+     "I-TEXTURE": 74,
+     "I-THERAPEUTIC_PROCEDURE": 76,
+     "I-TIME": 77,
+     "I-VOLUME": 80,
+     "I-WEIGHT": 81,
+     "O": 0
+   },
+   "layer_norm_eps": 1e-07,
+   "max_position_embeddings": 512,
+   "max_relative_positions": -1,
+   "model_type": "deberta-v2",
+   "norm_rel_ebd": "layer_norm",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 0,
+   "pooler_dropout": 0,
+   "pooler_hidden_act": "gelu",
+   "pooler_hidden_size": 768,
+   "pos_att_type": [
+     "p2c",
+     "c2p"
+   ],
+   "position_biased_input": false,
+   "position_buckets": 256,
+   "relative_attention": true,
+   "share_att_key": true,
+   "torch_dtype": "float32",
+   "transformers_version": "4.37.0",
+   "type_vocab_size": 0,
+   "vocab_size": 128100
+ }
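
The `id2label`/`label2id` maps above are what turn raw classifier outputs into BIO tags. A short sketch of manual decoding with them (untested; the model id is the same assumption as in the README example):

```python
# Decode raw token-classification logits into BIO tags via config.id2label.
# "blaze999/clinical-ner" is an assumed id; a local path to this repo also works.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("blaze999/clinical-ner")
model = AutoModelForTokenClassification.from_pretrained("blaze999/clinical-ner")

inputs = tokenizer("Patient reports severe headache.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits                # shape (1, seq_len, 83)

pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, pred_ids):
    print(token, model.config.id2label[label_id])  # e.g. "▁headache B-SIGN_SYMPTOM"
```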
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6f45a1b7573817eaa3b4466cf258596f449234e58a363875654b3eb763363995
+ size 735605900
runs/Feb13_07-07-02_d318ae9f03d8/events.out.tfevents.1707808031.d318ae9f03d8.34.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e96305887f58564f149909961254c38307dd315bd8912abd8bb005c95b79a306
+ size 31831
special_tokens_map.json ADDED
@@ -0,0 +1,15 @@
+ {
+   "bos_token": "[CLS]",
+   "cls_token": "[CLS]",
+   "eos_token": "[SEP]",
+   "mask_token": "[MASK]",
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "unk_token": {
+     "content": "[UNK]",
+     "lstrip": false,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
spm.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c679fbf93643d19aab7ee10c0b99e460bdbc02fedf34b92b05af343b4af586fd
+ size 2464616
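
`spm.model` is the SentencePiece vocabulary behind the DeBERTa-v3 tokenizer. If needed, it can be inspected directly (a sketch, assuming the `sentencepiece` package and a local copy of the file):

```python
# Inspect the SentencePiece model that backs DebertaV2Tokenizer.
import sentencepiece as spm

sp = spm.SentencePieceProcessor(model_file="spm.model")
print(sp.vocab_size())                        # expected ~128000 pieces; config.json's
                                              # vocab_size of 128100 leaves room for
                                              # added ids such as [MASK] = 128000
print(sp.encode("chest pain", out_type=str))  # subword pieces, e.g. ['▁chest', '▁pain']
```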
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,58 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "[PAD]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "[CLS]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "[SEP]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "3": {
+       "content": "[UNK]",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "128000": {
+       "content": "[MASK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "[CLS]",
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "[CLS]",
+   "do_lower_case": false,
+   "eos_token": "[SEP]",
+   "mask_token": "[MASK]",
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "sp_model_kwargs": {},
+   "split_by_punct": false,
+   "tokenizer_class": "DebertaV2Tokenizer",
+   "unk_token": "[UNK]",
+   "vocab_type": "spm"
+ }
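
One practical note on this tokenizer for NER fine-tuning: word-level BIO labels have to be re-aligned to subword tokens. A minimal sketch using a fast tokenizer's `word_ids()` (an assumption — the slow `DebertaV2Tokenizer` configured above does not expose `word_ids()`):

```python
# Align word-level BIO labels to subword tokens with a fast tokenizer.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-base", use_fast=True)

words = ["Patient", "reports", "severe", "headache"]
word_labels = ["O", "O", "B-SEVERITY", "B-SIGN_SYMPTOM"]

enc = tokenizer(words, is_split_into_words=True)
aligned = []
for word_id in enc.word_ids():
    if word_id is None:
        aligned.append("IGNORE")   # special tokens; use -100 as the loss label
    else:
        aligned.append(word_labels[word_id])

print(list(zip(tokenizer.convert_ids_to_tokens(enc["input_ids"]), aligned)))
# In a real training script, map tags to ids with config.json's label2id and
# decide whether continuation subwords get the word's tag or -100.
```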
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:102e3baeac11cbbce67557f93e45e24b9d0b164b4994953350efdd61bf6123e7
+ size 4728