---
license: mit
tags:
- generated_from_trainer
datasets:
- xtreme
metrics:
- f1
- precision
- recall
- accuracy
model-index:
- name: xlm-roberta-base-finetuned-panx-en
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: xtreme
      type: xtreme
      config: PAN-X.en
      split: validation
      args: PAN-X.en
    metrics:
    - name: F1
      type: f1
      value: 0.8236654056326187
    - name: Precision
      type: precision
      value: 0.8163449520899875
    - name: Recall
      type: recall
      value: 0.8311183373391772
    - name: Accuracy
      type: accuracy
      value: 0.8236654056326187
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# xlm-roberta-base-finetuned-panx-en

This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the PAN-X.en (English) subset of the xtreme dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2487
- F1: 0.8237
- Precision: 0.8163
- Recall: 0.8311
- Accuracy: 0.8237

Per-entity classification report on the evaluation set:

|              | Precision | Recall | F1-score | Support |
|:-------------|:---------:|:------:|:--------:|:-------:|
| LOC          | 0.85      | 0.85   | 0.85     | 4834    |
| ORG          | 0.78      | 0.74   | 0.76     | 4677    |
| PER          | 0.89      | 0.89   | 0.89     | 4635    |
| micro avg    | 0.84      | 0.83   | 0.83     | 14146   |
| macro avg    | 0.84      | 0.83   | 0.83     | 14146   |
| weighted avg | 0.84      | 0.83   | 0.83     | 14146   |

## Model description

More information needed

## Intended uses & limitations

More information needed

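No usage example ships with this card; the sketch below is a minimal, hedged illustration of running the checkpoint as an NER pipeline. The repository id is inferred from the model name and uploader and is an assumption, not something stated in the card.

```python
from transformers import pipeline

# Assumed repo id (not stated in the card) -- replace with the actual checkpoint path if different.
model_id = "maren-hugg/xlm-roberta-base-finetuned-panx-en"

# Token-classification pipeline; aggregation_strategy="simple" merges word-piece tokens
# back into whole entity spans (PER, ORG, LOC).
ner = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

print(ner("George Washington lived in Mount Vernon, Virginia."))
# Each result dict contains: entity_group, score, word, start, end.
```
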
## Training and evaluation data

More information needed

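For reference, the configuration named in the metadata (`xtreme`, `PAN-X.en`) can be loaded with 🤗 Datasets as sketched below; the exact preprocessing used for training is not documented in this card, so this only shows the raw data.

```python
from datasets import load_dataset

# PAN-X.en is the English WikiANN NER subset of the XTREME benchmark (assumption: this is
# the configuration referenced by the card's metadata).
panx_en = load_dataset("xtreme", name="PAN-X.en")

print(panx_en)                    # train / validation / test splits
example = panx_en["train"][0]
print(example["tokens"])          # list of word tokens
print(example["ner_tags"])        # integer labels (O, B-PER, I-PER, B-ORG, I-ORG, B-LOC, I-LOC)
```
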
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3

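These values map roughly onto a `TrainingArguments` configuration as sketched below; the output directory and the every-24-step evaluation cadence (visible in the results table) are inferred, not stated explicitly in the card.

```python
from transformers import TrainingArguments

# Illustrative reconstruction of the hyperparameters listed above; output_dir and the
# evaluation/logging cadence are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="xlm-roberta-base-finetuned-panx-en",
    learning_rate=5e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",   # the table below logs metrics every 24 steps
    eval_steps=24,
    logging_steps=24,
)
```
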
### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 | Precision | Recall | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:---------:|:------:|:--------:|
| 1.2644 | 0.03 | 24 | 0.8175 | 0.4212 | 0.3696 | 0.4897 | 0.4212 |
| 0.7209 | 0.06 | 48 | 0.5633 | 0.4817 | 0.4190 | 0.5665 | 0.4817 |
| 0.5951 | 0.09 | 72 | 0.4670 | 0.6059 | 0.5588 | 0.6617 | 0.6059 |
| 0.4475 | 0.12 | 96 | 0.4425 | 0.6659 | 0.6336 | 0.7016 | 0.6659 |
| 0.4978 | 0.14 | 120 | 0.4469 | 0.6375 | 0.5930 | 0.6892 | 0.6375 |
| 0.4383 | 0.17 | 144 | 0.4093 | 0.7003 | 0.6668 | 0.7374 | 0.7003 |
| 0.4148 | 0.2 | 168 | 0.3688 | 0.7122 | 0.6877 | 0.7387 | 0.7122 |
| 0.4513 | 0.23 | 192 | 0.3700 | 0.7236 | 0.7081 | 0.7397 | 0.7236 |
| 0.3786 | 0.26 | 216 | 0.3666 | 0.7304 | 0.7125 | 0.7493 | 0.7304 |
| 0.425 | 0.29 | 240 | 0.3652 | 0.7046 | 0.6874 | 0.7227 | 0.7046 |
| 0.4014 | 0.32 | 264 | 0.3438 | 0.7246 | 0.6964 | 0.7552 | 0.7246 |
| 0.3789 | 0.35 | 288 | 0.3533 | 0.7208 | 0.6922 | 0.7519 | 0.7208 |
| 0.4032 | 0.37 | 312 | 0.3567 | 0.7252 | 0.7125 | 0.7383 | 0.7252 |
| 0.371 | 0.4 | 336 | 0.3282 | 0.7433 | 0.7255 | 0.7620 | 0.7433 |
| 0.3397 | 0.43 | 360 | 0.3304 | 0.7522 | 0.7312 | 0.7745 | 0.7522 |
| 0.3871 | 0.46 | 384 | 0.3244 | 0.7427 | 0.7160 | 0.7715 | 0.7427 |
| 0.3461 | 0.49 | 408 | 0.3284 | 0.7520 | 0.7298 | 0.7756 | 0.7520 |
| 0.3504 | 0.52 | 432 | 0.3049 | 0.7574 | 0.7418 | 0.7737 | 0.7574 |
| 0.3387 | 0.55 | 456 | 0.3178 | 0.7717 | 0.7537 | 0.7906 | 0.7717 |
| 0.3259 | 0.58 | 480 | 0.3026 | 0.7738 | 0.7636 | 0.7843 | 0.7738 |
| 0.3473 | 0.6 | 504 | 0.3254 | 0.7324 | 0.7090 | 0.7574 | 0.7324 |
| 0.2893 | 0.63 | 528 | 0.3102 | 0.7689 | 0.7571 | 0.7810 | 0.7689 |
| 0.3669 | 0.66 | 552 | 0.3119 | 0.7631 | 0.7528 | 0.7737 | 0.7631 |
| 0.312 | 0.69 | 576 | 0.2963 | 0.7818 | 0.7734 | 0.7905 | 0.7818 |
| 0.297 | 0.72 | 600 | 0.3217 | 0.7542 | 0.7332 | 0.7765 | 0.7542 |
| 0.3095 | 0.75 | 624 | 0.3038 | 0.7732 | 0.7580 | 0.7891 | 0.7732 |
| 0.3514 | 0.78 | 648 | 0.2913 | 0.7794 | 0.7669 | 0.7924 | 0.7794 |
| 0.2824 | 0.81 | 672 | 0.3008 | 0.7813 | 0.7752 | 0.7876 | 0.7813 |
| 0.3203 | 0.83 | 696 | 0.2915 | 0.7807 | 0.7641 | 0.7980 | 0.7807 |
| 0.3089 | 0.86 | 720 | 0.2941 | 0.7838 | 0.7755 | 0.7923 | 0.7838 |
| 0.3174 | 0.89 | 744 | 0.2986 | 0.7770 | 0.7609 | 0.7937 | 0.7770 |
| 0.3264 | 0.92 | 768 | 0.2783 | 0.7788 | 0.7630 | 0.7951 | 0.7788 |
| 0.2815 | 0.95 | 792 | 0.2861 | 0.7848 | 0.7704 | 0.7998 | 0.7848 |
| 0.2895 | 0.98 | 816 | 0.2799 | 0.7842 | 0.7702 | 0.7988 | 0.7842 |
| 0.3023 | 1.01 | 840 | 0.2818 | 0.7876 | 0.7722 | 0.8038 | 0.7876 |
| 0.2358 | 1.04 | 864 | 0.2924 | 0.7836 | 0.7750 | 0.7925 | 0.7836 |
| 0.2819 | 1.06 | 888 | 0.2861 | 0.7761 | 0.7696 | 0.7828 | 0.7761 |
| 0.2692 | 1.09 | 912 | 0.2924 | 0.7756 | 0.7680 | 0.7833 | 0.7756 |
| 0.2478 | 1.12 | 936 | 0.2963 | 0.7833 | 0.7599 | 0.8082 | 0.7833 |
| 0.2557 | 1.15 | 960 | 0.2960 | 0.7783 | 0.7814 | 0.7751 | 0.7783 |
| 0.3003 | 1.18 | 984 | 0.2656 | 0.7862 | 0.7727 | 0.8002 | 0.7862 |
| 0.2254 | 1.21 | 1008 | 0.2791 | 0.8007 | 0.7890 | 0.8129 | 0.8007 |
| 0.2496 | 1.24 | 1032 | 0.2702 | 0.7877 | 0.7701 | 0.8062 | 0.7877 |
| 0.2124 | 1.27 | 1056 | 0.2888 | 0.7952 | 0.7895 | 0.8011 | 0.7952 |
| 0.2841 | 1.29 | 1080 | 0.2761 | 0.7946 | 0.7870 | 0.8023 | 0.7946 |
| 0.2517 | 1.32 | 1104 | 0.2659 | 0.8026 | 0.7909 | 0.8146 | 0.8026 |
| 0.2355 | 1.35 | 1128 | 0.2681 | 0.8003 | 0.7876 | 0.8134 | 0.8003 |
| 0.2402 | 1.38 | 1152 | 0.2701 | 0.7991 | 0.7892 | 0.8093 | 0.7991 |
| 0.2296 | 1.41 | 1176 | 0.2753 | 0.7946 | 0.7819 | 0.8077 | 0.7946 |
| 0.2453 | 1.44 | 1200 | 0.2696 | 0.8029 | 0.7912 | 0.8149 | 0.8029 |
| 0.2689 | 1.47 | 1224 | 0.2700 | 0.7936 | 0.7819 | 0.8056 | 0.7936 |
| 0.2362 | 1.5 | 1248 | 0.2705 | 0.8028 | 0.8005 | 0.8051 | 0.8028 |
| 0.226 | 1.53 | 1272 | 0.2642 | 0.8042 | 0.7910 | 0.8180 | 0.8042 |
| 0.2139 | 1.55 | 1296 | 0.2690 | 0.8013 | 0.7942 | 0.8084 | 0.8013 |
| 0.2744 | 1.58 | 1320 | 0.2619 | 0.7999 | 0.7841 | 0.8163 | 0.7999 |
| 0.2015 | 1.61 | 1344 | 0.2640 | 0.8066 | 0.8035 | 0.8098 | 0.8066 |
| 0.1949 | 1.64 | 1368 | 0.2750 | 0.8075 | 0.8023 | 0.8129 | 0.8075 |
| 0.2259 | 1.67 | 1392 | 0.2669 | 0.8092 | 0.7997 | 0.8189 | 0.8092 |
| 0.1884 | 1.7 | 1416 | 0.2729 | 0.8061 | 0.7990 | 0.8133 | 0.8061 |
| 0.1868 | 1.73 | 1440 | 0.2679 | 0.8083 | 0.8007 | 0.8161 | 0.8083 |
| 0.2292 | 1.76 | 1464 | 0.2658 | 0.8055 | 0.7954 | 0.8158 | 0.8055 |
| 0.22 | 1.78 | 1488 | 0.2610 | 0.8066 | 0.8006 | 0.8126 | 0.8066 |
| 0.2335 | 1.81 | 1512 | 0.2613 | 0.7997 | 0.7816 | 0.8185 | 0.7997 |
| 0.2379 | 1.84 | 1536 | 0.2495 | 0.8081 | 0.7975 | 0.8190 | 0.8081 |
| 0.2394 | 1.87 | 1560 | 0.2619 | 0.8063 | 0.7951 | 0.8177 | 0.8063 |
| 0.2526 | 1.9 | 1584 | 0.2502 | 0.8116 | 0.8032 | 0.8202 | 0.8116 |
| 0.2167 | 1.93 | 1608 | 0.2528 | 0.8134 | 0.8000 | 0.8273 | 0.8134 |
| 0.2354 | 1.96 | 1632 | 0.2449 | 0.8099 | 0.8013 | 0.8188 | 0.8099 |
| 0.2808 | 1.99 | 1656 | 0.2469 | 0.8067 | 0.7938 | 0.8201 | 0.8067 |
| 0.1924 | 2.01 | 1680 | 0.2487 | 0.8077 | 0.7930 | 0.8229 | 0.8077 |
| 0.1498 | 2.04 | 1704 | 0.2619 | 0.8127 | 0.8015 | 0.8242 | 0.8127 |
| 0.2 | 2.07 | 1728 | 0.2590 | 0.8133 | 0.8044 | 0.8224 | 0.8133 |
| 0.151 | 2.1 | 1752 | 0.2623 | 0.8066 | 0.7949 | 0.8186 | 0.8066 |
| 0.1646 | 2.13 | 1776 | 0.2632 | 0.8186 | 0.8137 | 0.8236 | 0.8186 |
| 0.1659 | 2.16 | 1800 | 0.2561 | 0.8188 | 0.8096 | 0.8281 | 0.8188 |
| 0.1888 | 2.19 | 1824 | 0.2549 | 0.8136 | 0.8038 | 0.8237 | 0.8136 |
| 0.2084 | 2.22 | 1848 | 0.2557 | 0.8141 | 0.8087 | 0.8197 | 0.8141 |
| 0.1571 | 2.24 | 1872 | 0.2697 | 0.8150 | 0.8053 | 0.8249 | 0.8150 |
| 0.1541 | 2.27 | 1896 | 0.2605 | 0.8191 | 0.8121 | 0.8262 | 0.8191 |
| 0.1586 | 2.3 | 1920 | 0.2742 | 0.8109 | 0.8073 | 0.8144 | 0.8109 |
| 0.1641 | 2.33 | 1944 | 0.2679 | 0.8148 | 0.8104 | 0.8193 | 0.8148 |
| 0.1914 | 2.36 | 1968 | 0.2596 | 0.8159 | 0.8056 | 0.8265 | 0.8159 |
| 0.1441 | 2.39 | 1992 | 0.2644 | 0.8183 | 0.8139 | 0.8226 | 0.8183 |
| 0.1672 | 2.42 | 2016 | 0.2652 | 0.8180 | 0.8081 | 0.8281 | 0.8180 |
| 0.1852 | 2.45 | 2040 | 0.2576 | 0.8205 | 0.8101 | 0.8313 | 0.8205 |
| 0.192 | 2.47 | 2064 | 0.2459 | 0.8179 | 0.8063 | 0.8298 | 0.8179 |
| 0.1698 | 2.5 | 2088 | 0.2482 | 0.8213 | 0.8149 | 0.8277 | 0.8213 |
| 0.1802 | 2.53 | 2112 | 0.2519 | 0.8155 | 0.8066 | 0.8247 | 0.8155 |
| 0.1619 | 2.56 | 2136 | 0.2582 | 0.8175 | 0.8036 | 0.8319 | 0.8175 |
| 0.1974 | 2.59 | 2160 | 0.2535 | 0.8184 | 0.8108 | 0.8261 | 0.8184 |
| 0.1655 | 2.62 | 2184 | 0.2514 | 0.8229 | 0.8165 | 0.8295 | 0.8229 |
| 0.1844 | 2.65 | 2208 | 0.2536 | 0.8208 | 0.8152 | 0.8264 | 0.8208 |
| 0.1601 | 2.68 | 2232 | 0.2531 | 0.8194 | 0.8104 | 0.8286 | 0.8194 |
| 0.161 | 2.71 | 2256 | 0.2508 | 0.8226 | 0.8145 | 0.8310 | 0.8226 |
| 0.1672 | 2.73 | 2280 | 0.2527 | 0.8216 | 0.8137 | 0.8296 | 0.8216 |
| 0.2053 | 2.76 | 2304 | 0.2482 | 0.8208 | 0.8112 | 0.8306 | 0.8208 |
| 0.1776 | 2.79 | 2328 | 0.2486 | 0.8215 | 0.8143 | 0.8288 | 0.8215 |
| 0.1559 | 2.82 | 2352 | 0.2495 | 0.8233 | 0.8156 | 0.8312 | 0.8233 |
| 0.1509 | 2.85 | 2376 | 0.2472 | 0.8231 | 0.8142 | 0.8322 | 0.8231 |
| 0.1695 | 2.88 | 2400 | 0.2465 | 0.8229 | 0.8134 | 0.8326 | 0.8229 |
| 0.1523 | 2.91 | 2424 | 0.2466 | 0.8234 | 0.8154 | 0.8315 | 0.8234 |
| 0.1525 | 2.94 | 2448 | 0.2478 | 0.8241 | 0.8165 | 0.8319 | 0.8241 |
| 0.1386 | 2.96 | 2472 | 0.2486 | 0.8236 | 0.8164 | 0.8309 | 0.8236 |
| 0.1532 | 2.99 | 2496 | 0.2487 | 0.8237 | 0.8163 | 0.8311 | 0.8237 |

### Framework versions

- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
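
A quick way to check that a local environment matches these versions (a convenience snippet, not part of the original card):

```python
import datasets, tokenizers, torch, transformers

# Compare the installed versions against the ones reported above.
for name, module in [("Transformers", transformers), ("Pytorch", torch),
                     ("Datasets", datasets), ("Tokenizers", tokenizers)]:
    print(f"{name}: {module.__version__}")
```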