## Update README.md
This is the standard 4-class NER model for English that ships with [Flair](https://github.com/flairNLP/flair/).

F1-Score: **92.98** (CoNLL-03)

Predicts 4 tags:

| **tag** | **meaning**       |
|---------|-------------------|
| PER     | person name       |
| LOC     | location name     |
| ORG     | organization name |
| MISC    | other name        |

Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.

---
### Demo: How to use in Flair
Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
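
The Python snippet itself is elided in this diff, so the block below is only a minimal sketch of the usual Flair workflow; the model identifier `flair/ner-english` is an assumption, not something stated on this page:

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load the NER tagger (hypothetical identifier; point this at the model you actually want)
tagger = SequenceTagger.load("flair/ner-english")

# make a sentence
sentence = Sentence("George Washington went to Washington")

# run NER over the sentence
tagger.predict(sentence)

# iterate over the predicted entity spans and print them
for entity in sentence.get_spans("ner"):
    print(entity)
```

A script along these lines prints spans of the following form: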
```
Span [1,2]: "George Washington" [− Labels: PER (0.9968)]
Span [5]: "Washington" [− Labels: LOC (0.9994)]
```
So, the entities "*George Washington*" (labeled as a **person**) and "*Washington*" (labeled as a **location**) are found in the sentence "*George Washington went to Washington*".
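
To read the predictions programmatically rather than just printing them, a small sketch like this (continuing from the `sentence` object above and using Flair's span and label accessors) can be used:

```python
# a sketch: text, label value and confidence of each predicted span
for entity in sentence.get_spans("ner"):
    label = entity.get_label("ner")
    print(entity.text, label.value, round(label.score, 4))
```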
---
### Cite
Please cite the following paper when using this model.
```bibtex
@inproceedings{akbik2018coling,
  title     = {Contextual String Embeddings for Sequence Labeling},
  author    = {Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages     = {1638--1649},
  year      = {2018}
}
```
---
### Training: Script to train this model
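
The script itself is not included in this excerpt; the block below is only a sketch of how a comparable CoNLL-03 tagger is typically trained with Flair. The corpus loader, embedding choices, output path and hyperparameters are illustrative assumptions, not necessarily the exact settings behind this model:

```python
from flair.datasets import CONLL_03
from flair.embeddings import FlairEmbeddings, StackedEmbeddings, WordEmbeddings
from flair.models import SequenceTagger
from flair.trainers import ModelTrainer

# 1. load the corpus (expects the CoNLL-03 files to be available locally)
corpus = CONLL_03()

# 2. the tag we want to predict
tag_type = "ner"

# 3. build the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. stack classic word embeddings with contextual Flair embeddings
embeddings = StackedEmbeddings([
    WordEmbeddings("glove"),
    FlairEmbeddings("news-forward"),
    FlairEmbeddings("news-backward"),
])

# 5. initialize a BiLSTM-CRF sequence tagger
tagger = SequenceTagger(
    hidden_size=256,
    embeddings=embeddings,
    tag_dictionary=tag_dictionary,
    tag_type=tag_type,
    use_crf=True,
)

# 6. train
trainer = ModelTrainer(tagger, corpus)
trainer.train("resources/taggers/ner-english", max_epochs=150)
```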