Commit 6376136 (committed by alanakbik)
1 Parent(s): b2f5995

initial model commit

Files changed (4)
  1. README.md +188 -0
  2. loss.tsv +151 -0
  3. pytorch_model.bin +3 -0
  4. training.log +0 -0
README.md ADDED
@@ -0,0 +1,188 @@
+ ---
+ tags:
+ - flair
+ - token-classification
+ - sequence-tagger-model
+ language: en
+ datasets:
+ - ontonotes
+ inference: false
+ ---
+
+ ## English Part-of-Speech Tagging in Flair (fast model)
+
+ This is the fast part-of-speech tagging model for English that ships with [Flair](https://github.com/flairNLP/flair/).
+
+ F1-Score: **98.10** (Ontonotes)
+
+ Predicts fine-grained POS tags:
+
+ | **tag** | **meaning** |
+ |---------|-------------|
+ | ADD | Email |
+ | AFX | Affix |
+ | CC | Coordinating conjunction |
+ | CD | Cardinal number |
+ | DT | Determiner |
+ | EX | Existential there |
+ | FW | Foreign word |
+ | HYPH | Hyphen |
+ | IN | Preposition or subordinating conjunction |
+ | JJ | Adjective |
+ | JJR | Adjective, comparative |
+ | JJS | Adjective, superlative |
+ | LS | List item marker |
+ | MD | Modal |
+ | NFP | Superfluous punctuation |
+ | NN | Noun, singular or mass |
+ | NNP | Proper noun, singular |
+ | NNPS | Proper noun, plural |
+ | NNS | Noun, plural |
+ | PDT | Predeterminer |
+ | POS | Possessive ending |
+ | PRP | Personal pronoun |
+ | PRP$ | Possessive pronoun |
+ | RB | Adverb |
+ | RBR | Adverb, comparative |
+ | RBS | Adverb, superlative |
+ | RP | Particle |
+ | SYM | Symbol |
+ | TO | to |
+ | UH | Interjection |
+ | VB | Verb, base form |
+ | VBD | Verb, past tense |
+ | VBG | Verb, gerund or present participle |
+ | VBN | Verb, past participle |
+ | VBP | Verb, non-3rd person singular present |
+ | VBZ | Verb, 3rd person singular present |
+ | WDT | Wh-determiner |
+ | WP | Wh-pronoun |
+ | WP$ | Possessive wh-pronoun |
+ | WRB | Wh-adverb |
+ | XX | Unknown |
+
+ Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
+
+ ---
+
+ ### Demo: How to use in Flair
+
+ Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
+
+ ```python
+ from flair.data import Sentence
+ from flair.models import SequenceTagger
+
+ # load tagger
+ tagger = SequenceTagger.load("flair/pos-english")
+
+ # make example sentence
+ sentence = Sentence("I love Berlin.")
+
+ # predict POS tags
+ tagger.predict(sentence)
+
+ # print sentence
+ print(sentence)
+
+ # print predicted POS spans
+ print('The following POS tags are found:')
+ # iterate over tagged spans and print
+ for entity in sentence.get_spans('pos'):
+     print(entity)
+ ```
+
+ This yields the following output:
+ ```
+ Span [1]: "I" [− Labels: PRP (1.0)]
+ Span [2]: "love" [− Labels: VBP (1.0)]
+ Span [3]: "Berlin" [− Labels: NNP (0.9999)]
+ Span [4]: "." [− Labels: . (1.0)]
+ ```
+
+ So, the word "*I*" is labeled as a **pronoun** (PRP), "*love*" is labeled as a **verb** (VBP) and "*Berlin*" is labeled as a **proper noun** (NNP) in the sentence "*I love Berlin*".
+
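+ For token-level access to the predictions, the sketch below reads each token's tag and confidence score. Note this is only illustrative, and the accessor name depends on your Flair version: recent releases use `token.get_label('pos')`, older ones `token.get_tag('pos')`.
+
+ ```python
+ # print each token with its predicted tag and confidence
+ # (assumes `sentence` was tagged as in the demo above)
+ for token in sentence:
+     label = token.get_label('pos')  # on older Flair versions: token.get_tag('pos')
+     print(f"{token.text}\t{label.value}\t{label.score:.4f}")
+ ```
+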
+ ---
+
+ ### Training: Script to train this model
+
+ The following Flair script was used to train this model:
+
+ ```python
+ from flair.data import Corpus
+ from flair.datasets import ColumnCorpus
+ from flair.embeddings import StackedEmbeddings, FlairEmbeddings
+
+ # 1. load the corpus (Ontonotes does not ship with Flair, you need to download and reformat into a column format yourself)
+ corpus: Corpus = ColumnCorpus(
+     "resources/tasks/onto-ner",
+     column_format={0: "text", 1: "pos", 2: "upos", 3: "ner"},
+     tag_to_bioes="ner",
+ )
+
+ # 2. what tag do we want to predict?
+ tag_type = 'pos'
+
+ # 3. make the tag dictionary from the corpus
+ tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
+
+ # 4. initialize each embedding we use
+ embedding_types = [
+
+     # contextual string embeddings, forward
+     FlairEmbeddings('news-forward'),
+
+     # contextual string embeddings, backward
+     FlairEmbeddings('news-backward'),
+ ]
+
+ # embedding stack consists of the forward and backward Flair embeddings
+ embeddings = StackedEmbeddings(embeddings=embedding_types)
+
+ # 5. initialize sequence tagger
+ from flair.models import SequenceTagger
+
+ tagger = SequenceTagger(hidden_size=256,
+                         embeddings=embeddings,
+                         tag_dictionary=tag_dictionary,
+                         tag_type=tag_type)
+
+ # 6. initialize trainer
+ from flair.trainers import ModelTrainer
+
+ trainer = ModelTrainer(tagger, corpus)
+
+ # 7. run training
+ trainer.train('resources/taggers/pos-english',
+               train_with_dev=True,
+               max_epochs=150)
+ ```
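+
+ Training also writes a `loss.tsv` and a `training.log` into the output folder (both files are part of this commit). The trained checkpoint is stored there as well; by Flair convention it is saved as `final-model.pt` when training with `train_with_dev=True`. A minimal sketch for loading it back, assuming the output folder used above:
+
+ ```python
+ from flair.models import SequenceTagger
+
+ # load the trained model back from the training output folder
+ tagger = SequenceTagger.load('resources/taggers/pos-english/final-model.pt')
+ ```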
+
+ ---
+
+ ### Cite
+
+ Please cite the following paper when using this model.
+
+ ```
+ @inproceedings{akbik2018coling,
+   title     = {Contextual String Embeddings for Sequence Labeling},
+   author    = {Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
+   booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
+   pages     = {1638--1649},
+   year      = {2018}
+ }
+ ```
+
+ ---
+
+ ### Issues?
+
+ The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
loss.tsv ADDED
@@ -0,0 +1,151 @@
+ EPOCH TIMESTAMP BAD_EPOCHS LEARNING_RATE TRAIN_LOSS
+ 1 10:40:13 0 0.1000 5.191687386260843
+ 2 10:49:44 0 0.1000 2.990255600686343
+ 3 10:59:11 0 0.1000 2.630100125231833
+ 4 11:08:38 0 0.1000 2.439984718876065
+ 5 11:18:04 0 0.1000 2.3081886467393837
+ 6 11:27:31 0 0.1000 2.2055936109794763
+ 7 11:36:56 0 0.1000 2.132141686632948
+ 8 11:46:16 0 0.1000 2.0610849993633775
+ 9 11:55:45 0 0.1000 2.021561934610583
+ 10 12:05:06 0 0.1000 1.9721916137888746
+ 11 12:14:29 0 0.1000 1.937625719016453
+ 12 12:23:50 0 0.1000 1.9136744570282271
+ 13 12:33:15 0 0.1000 1.8815782972326818
+ 14 12:42:38 0 0.1000 1.8704040062202598
+ 15 12:52:01 0 0.1000 1.8314167969181852
+ 16 13:01:23 0 0.1000 1.815200958094507
+ 17 13:10:42 0 0.1000 1.7952995858777243
+ 18 13:19:59 0 0.1000 1.7827841561020545
+ 19 13:29:26 0 0.1000 1.7682916816675438
+ 20 13:38:51 0 0.1000 1.7474985779681296
+ 21 13:48:12 0 0.1000 1.7368936350210658
+ 22 13:57:39 0 0.1000 1.7231756714380013
+ 23 14:07:08 0 0.1000 1.7124459424333753
+ 24 14:16:39 0 0.1000 1.6981768243492774
+ 25 14:26:08 1 0.1000 1.6988864801514822
+ 26 14:35:29 0 0.1000 1.6889761099500475
+ 27 14:44:55 0 0.1000 1.6771138597546884
+ 28 14:54:17 1 0.1000 1.6777101769987142
+ 29 15:03:39 0 0.1000 1.66600777979167
+ 30 15:13:09 0 0.1000 1.6517529875602361
+ 31 15:22:38 0 0.1000 1.64806510362985
+ 32 15:31:57 0 0.1000 1.6456873242022856
+ 33 15:41:10 0 0.1000 1.6344037736811727
+ 34 15:50:20 0 0.1000 1.6267880096300593
+ 35 15:59:28 1 0.1000 1.6295387614223191
+ 36 16:08:34 0 0.1000 1.616454012506413
+ 37 16:17:48 0 0.1000 1.610561783516182
+ 38 16:26:56 1 0.1000 1.617817803542569
+ 39 16:36:12 0 0.1000 1.591378326933339
+ 40 16:45:49 1 0.1000 1.6020105891632583
+ 41 16:55:26 0 0.1000 1.5872514364179575
+ 42 17:05:06 1 0.1000 1.5884696119236497
+ 43 17:14:52 0 0.1000 1.5812612841939027
+ 44 17:24:30 1 0.1000 1.581793251487444
+ 45 17:33:56 2 0.1000 1.5866447465824631
+ 46 17:43:19 3 0.1000 1.5841673767791604
+ 47 17:52:41 0 0.1000 1.5737929779291153
+ 48 18:02:01 1 0.1000 1.5756766953220906
+ 49 18:11:19 0 0.1000 1.564068170223596
+ 50 18:20:28 1 0.1000 1.5647186986230455
+ 51 18:29:34 0 0.1000 1.555480633681675
+ 52 18:38:36 1 0.1000 1.5621746712360742
+ 53 18:47:36 0 0.1000 1.5449711579196859
+ 54 18:56:38 0 0.1000 1.543854748258051
+ 55 19:05:41 1 0.1000 1.5526732677783606
+ 56 19:14:44 2 0.1000 1.5500289052724838
+ 57 19:23:46 0 0.1000 1.5427457933605842
+ 58 19:32:47 1 0.1000 1.5492016454575197
+ 59 19:41:48 2 0.1000 1.5442483125542694
+ 60 19:50:50 0 0.1000 1.5416108293353388
+ 61 19:59:48 0 0.1000 1.539070179327479
+ 62 20:08:51 0 0.1000 1.5283891413009392
+ 63 20:17:51 1 0.1000 1.5301074168929514
+ 64 20:26:53 2 0.1000 1.5304946095313665
+ 65 20:35:54 3 0.1000 1.5365828336747187
+ 66 20:45:02 4 0.1000 1.5349518755022085
+ 67 20:54:01 0 0.0500 1.4676669784874286
+ 68 21:03:02 0 0.0500 1.4282581994218646
+ 69 21:12:03 0 0.0500 1.4178193369676482
+ 70 21:21:01 0 0.0500 1.3994816730832154
+ 71 21:30:01 0 0.0500 1.3936838644068197
+ 72 21:39:01 0 0.0500 1.3778960541846617
+ 73 21:48:00 0 0.0500 1.3765544548911868
+ 74 21:56:58 0 0.0500 1.3616676717546752
+ 75 22:05:56 0 0.0500 1.3601923845848947
+ 76 22:14:54 1 0.0500 1.3641906388863079
+ 77 22:23:51 0 0.0500 1.3486524650060905
+ 78 22:32:50 0 0.0500 1.3401584661344312
+ 79 22:41:48 0 0.0500 1.3396736347225477
+ 80 22:50:47 0 0.0500 1.3384974124296656
+ 81 22:59:45 0 0.0500 1.3200992156087228
+ 82 23:08:44 0 0.0500 1.3120858202902776
+ 83 23:17:44 1 0.0500 1.312913179678737
+ 84 23:26:44 2 0.0500 1.3215642819989402
+ 85 23:35:44 0 0.0500 1.3076230350192988
+ 86 23:44:44 1 0.0500 1.3107236595423717
+ 87 23:53:46 0 0.0500 1.307279589041224
+ 88 00:02:47 0 0.0500 1.2990785246862555
+ 89 00:11:47 1 0.0500 1.3011387617633028
+ 90 00:20:45 0 0.0500 1.289999634511066
+ 91 00:29:45 1 0.0500 1.2988943146422225
+ 92 00:38:45 0 0.0500 1.28956611776127
+ 93 00:47:46 1 0.0500 1.2910581589982195
+ 94 00:56:54 0 0.0500 1.2823612826950146
+ 95 01:05:55 0 0.0500 1.2734393208994055
+ 96 01:14:54 1 0.0500 1.2823268779381267
+ 97 01:23:52 0 0.0500 1.27245156982035
+ 98 01:32:49 0 0.0500 1.266549679760663
+ 99 01:41:45 1 0.0500 1.2840989648508576
+ 100 01:50:43 2 0.0500 1.2673725502895858
+ 101 01:59:39 0 0.0500 1.2620051703475557
+ 102 02:08:38 0 0.0500 1.2609278967582955
+ 103 02:17:37 1 0.0500 1.262451319773242
+ 104 02:26:34 0 0.0500 1.2540171742889117
+ 105 02:35:32 1 0.0500 1.2569747807732168
+ 106 02:44:30 2 0.0500 1.2591986549013066
+ 107 02:53:31 0 0.0500 1.2485761789218435
+ 108 03:02:31 0 0.0500 1.248099730858263
+ 109 03:11:33 1 0.0500 1.2514484322520922
+ 110 03:20:35 0 0.0500 1.2448666573358032
+ 111 03:29:38 0 0.0500 1.2394069942100994
+ 112 03:38:39 0 0.0500 1.2357601409250836
+ 113 03:47:40 1 0.0500 1.2411275048525827
+ 114 03:56:40 2 0.0500 1.2393213147712203
+ 115 04:05:40 3 0.0500 1.236610540995058
+ 116 04:14:40 4 0.0500 1.241174234995302
+ 117 04:23:40 0 0.0250 1.2169747731933054
+ 118 04:32:41 0 0.0250 1.205440902293853
+ 119 04:41:41 0 0.0250 1.1918721127060223
+ 120 04:50:41 0 0.0250 1.1729565850293862
+ 121 04:59:41 1 0.0250 1.1737796594399326
+ 122 05:08:48 2 0.0250 1.1757655947050958
+ 123 05:17:46 3 0.0250 1.185993371605873
+ 124 05:26:45 0 0.0250 1.1611660176740501
+ 125 05:36:19 1 0.0250 1.1613049927297627
+ 126 05:45:49 2 0.0250 1.1627364553483028
+ 127 05:55:13 0 0.0250 1.1572273018225183
+ 128 06:04:34 0 0.0250 1.1571208929340795
+ 129 06:13:56 0 0.0250 1.1518588769098497
+ 130 06:23:14 0 0.0250 1.148175410572088
+ 131 06:32:26 1 0.0250 1.1503704931263654
+ 132 06:41:30 0 0.0250 1.1390686752661219
+ 133 06:50:25 1 0.0250 1.1395786151571095
+ 134 06:59:21 2 0.0250 1.1404599001272668
+ 135 07:08:18 0 0.0250 1.1327537002316062
+ 136 07:18:20 0 0.0250 1.1325569068485837
+ 137 07:28:02 0 0.0250 1.1305628216266632
+ 138 07:37:42 1 0.0250 1.1355784202071855
+ 139 07:47:32 0 0.0250 1.128019032298394
+ 140 07:57:07 0 0.0250 1.1219576685833481
+ 141 08:06:40 1 0.0250 1.1247122208577283
+ 142 08:16:13 0 0.0250 1.1212555271274638
+ 143 08:25:42 0 0.0250 1.1188698373538144
+ 144 08:35:12 1 0.0250 1.1210243045721413
+ 145 08:44:43 0 0.0250 1.1187412588776282
+ 146 08:54:12 1 0.0250 1.1203261433457428
+ 147 09:03:44 2 0.0250 1.1201197473283084
+ 148 09:13:21 0 0.0250 1.105603715833628
+ 149 09:22:53 1 0.0250 1.1072929360394208
+ 150 09:32:35 2 0.0250 1.1075447516058976
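
The table tracks per-epoch training loss and the learning-rate schedule (annealed from 0.1 to 0.05 to 0.025 after stretches without improvement). A convenience sketch for plotting the curve, assuming pandas and matplotlib are installed:

```python
import pandas as pd
import matplotlib.pyplot as plt

# loss.tsv is tab-separated with EPOCH ... TRAIN_LOSS columns
df = pd.read_csv("loss.tsv", sep="\t")
plt.plot(df["EPOCH"], df["TRAIN_LOSS"])
plt.xlabel("epoch")
plt.ylabel("training loss")
plt.show()
```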
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2897b43fa61696cb584642b5aab18728fa78580d88073e93343e65aafc3925de
+ size 75266317
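
Note: the weights file is stored via Git LFS (the three lines above are the LFS pointer, not the model itself); fetching the actual ~75 MB binary requires git-lfs (e.g. `git lfs install` before cloning).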
training.log ADDED
The diff for this file is too large to render. See raw diff