alanakbik committed
Commit 772cf50
1 Parent(s): 24e3595

initial model commit

Files changed (4)
  1. README.md +184 -0
  2. loss.tsv +151 -0
  3. pytorch_model.bin +3 -0
  4. training.log +0 -0
README.md ADDED
---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- ontonotes
inference: false
---

## English Part-of-Speech Tagging in Flair (default model)

This is the standard part-of-speech tagging model for English that ships with [Flair](https://github.com/flairNLP/flair/).

F1-Score: **98.19** (Ontonotes)

Predicts fine-grained POS tags:

| **tag** | **meaning** |
|---------|-------------|
| ADD | Email |
| AFX | Affix |
| CC | Coordinating conjunction |
| CD | Cardinal number |
| DT | Determiner |
| EX | Existential there |
| FW | Foreign word |
| HYPH | Hyphen |
| IN | Preposition or subordinating conjunction |
| JJ | Adjective |
| JJR | Adjective, comparative |
| JJS | Adjective, superlative |
| LS | List item marker |
| MD | Modal |
| NFP | Superfluous punctuation |
| NN | Noun, singular or mass |
| NNP | Proper noun, singular |
| NNPS | Proper noun, plural |
| NNS | Noun, plural |
| PDT | Predeterminer |
| POS | Possessive ending |
| PRP | Personal pronoun |
| PRP$ | Possessive pronoun |
| RB | Adverb |
| RBR | Adverb, comparative |
| RBS | Adverb, superlative |
| RP | Particle |
| SYM | Symbol |
| TO | to |
| UH | Interjection |
| VB | Verb, base form |
| VBD | Verb, past tense |
| VBG | Verb, gerund or present participle |
| VBN | Verb, past participle |
| VBP | Verb, non-3rd person singular present |
| VBZ | Verb, 3rd person singular present |
| WDT | Wh-determiner |
| WP | Wh-pronoun |
| WP$ | Possessive wh-pronoun |
| WRB | Wh-adverb |
| XX | Unknown |

Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.

---

### Demo: How to use in Flair

Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load tagger
tagger = SequenceTagger.load("flair/pos-english")

# make example sentence
sentence = Sentence("I love Berlin")

# predict POS tags
tagger.predict(sentence)

# print sentence
print(sentence)

# print predicted POS spans
print('The following POS tags are found:')
# iterate over tagged spans and print
for entity in sentence.get_spans('pos'):
    print(entity)
```

This yields the following output:
```
Span [1]: "I"   [− Labels: PRP (1.0)]
Span [2]: "love"   [− Labels: VBP (1.0)]
Span [3]: "Berlin"   [− Labels: NNP (0.9999)]
```

So, the word "*I*" is labeled as a **pronoun** (PRP), "*love*" is labeled as a **verb** (VBP) and "*Berlin*" is labeled as a **proper noun** (NNP) in the sentence "*I love Berlin*".
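
If you want the predictions as data rather than printed spans, you can also read the label off each token directly. A minimal sketch, assuming the Flair 0.x `get_tag` accessor (newer Flair releases expose the same information via `get_label`):

```python
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load("flair/pos-english")

sentence = Sentence("I love Berlin")
tagger.predict(sentence)

# collect (token, tag, confidence) triples from the tagged sentence
for token in sentence:
    label = token.get_tag('pos')  # on newer Flair versions: token.get_label('pos')
    print(token.text, label.value, round(label.score, 4))
```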

---

### Training: Script to train this model

The following Flair script can be used to train such a model. Note that the released checkpoint was trained on Ontonotes (per the F1 score above), which does not ship with Flair; the corpus path and column layout below are placeholders for a local copy:

```python
from flair.data import Corpus
from flair.datasets import ColumnCorpus
from flair.embeddings import StackedEmbeddings, FlairEmbeddings

# 1. get the corpus (Ontonotes must be obtained separately; adapt the path
#    and column layout to your local copy)
corpus: Corpus = ColumnCorpus('resources/tasks/onto-pos',
                              column_format={0: 'text', 1: 'pos'})

# 2. what tag do we want to predict?
tag_type = 'pos'

# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize each embedding we use
embedding_types = [

    # contextual string embeddings, forward
    FlairEmbeddings('news-forward'),

    # contextual string embeddings, backward
    FlairEmbeddings('news-backward'),
]

# embedding stack consists of forward and backward Flair embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)

# 5. initialize sequence tagger
from flair.models import SequenceTagger

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type)

# 6. initialize trainer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# 7. run training
trainer.train('resources/taggers/pos-english',
              train_with_dev=True,
              max_epochs=150)
```
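
Once training finishes, the resulting checkpoint can be loaded back from the output folder and used like the released model. A quick sanity check, assuming the output path from the script above (with `train_with_dev=True`, Flair saves a `final-model.pt`):

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load the model trained by the script above
tagger = SequenceTagger.load('resources/taggers/pos-english/final-model.pt')

# tag a test sentence and print tokens with their predicted POS tags
sentence = Sentence("The happy man has been eating at the diner")
tagger.predict(sentence)
print(sentence.to_tagged_string())
```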

---

### Cite

Please cite the following paper when using this model.

```
@inproceedings{akbik2018coling,
  title={Contextual String Embeddings for Sequence Labeling},
  author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages = {1638--1649},
  year = {2018}
}
```

---

### Issues?

The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
loss.tsv ADDED
EPOCH TIMESTAMP BAD_EPOCHS LEARNING_RATE TRAIN_LOSS
1 20:52:08 0 0.1000 4.440161232385996
2 21:02:54 0 0.1000 2.5081334869816616
3 21:13:41 0 0.1000 2.1583339795976313
4 21:24:30 0 0.1000 1.9826479059570241
5 21:35:13 0 0.1000 1.8439768840007063
6 21:46:04 0 0.1000 1.7556186997440626
7 21:56:58 0 0.1000 1.6729187929180434
8 22:07:41 0 0.1000 1.6175819635391235
9 22:18:22 0 0.1000 1.567191819548607
10 22:28:59 0 0.1000 1.5158289558707543
11 22:39:33 0 0.1000 1.4816847196390044
12 22:50:13 0 0.1000 1.4487488953684862
13 23:00:55 0 0.1000 1.4116378486156465
14 23:11:41 0 0.1000 1.3804849133289085
15 23:22:21 0 0.1000 1.3604626842939629
16 23:32:56 0 0.1000 1.3348652415567974
17 23:43:29 0 0.1000 1.3190998460099381
18 23:54:02 0 0.1000 1.300972046728404
19 00:04:36 0 0.1000 1.2754099613540577
20 00:15:11 0 0.1000 1.2620930742880083
21 00:25:48 0 0.1000 1.2427125974421231
22 00:36:23 0 0.1000 1.2239304436490221
23 00:47:03 0 0.1000 1.2220602732222035
24 00:57:53 0 0.1000 1.207376890598603
25 01:08:41 0 0.1000 1.191661370452845
26 01:19:19 0 0.1000 1.1854410221779121
27 01:29:54 0 0.1000 1.1639552125953279
28 01:40:32 1 0.1000 1.1674171195390088
29 01:51:16 0 0.1000 1.1548268491142202
30 02:02:03 0 0.1000 1.1502379114560362
31 02:12:43 0 0.1000 1.1394139466195736
32 02:23:23 0 0.1000 1.1333867625020584
33 02:33:57 0 0.1000 1.1169679287469612
34 02:44:23 1 0.1000 1.1183975887860893
35 02:54:51 0 0.1000 1.1031492047827198
36 03:05:17 1 0.1000 1.106654071999046
37 03:15:55 0 0.1000 1.0981387158384863
38 03:26:23 0 0.1000 1.091500723361969
39 03:36:48 0 0.1000 1.078726376306336
40 03:47:12 0 0.1000 1.0701815563665247
41 03:57:38 0 0.1000 1.0676479135256893
42 04:08:04 1 0.1000 1.0709096380449692
43 04:18:31 0 0.1000 1.0584135180599286
44 04:29:01 1 0.1000 1.0603101778592703
45 04:39:33 2 0.1000 1.0599101366299504
46 04:50:16 0 0.1000 1.054228850344442
47 05:00:49 0 0.1000 1.0398632440252125
48 05:11:17 1 0.1000 1.043083128501784
49 05:21:45 0 0.1000 1.032125227125186
50 05:32:13 0 0.1000 1.0312004477797814
51 05:42:41 0 0.1000 1.0224073643954295
52 05:53:08 1 0.1000 1.0266485528676015
53 06:03:34 2 0.1000 1.0261119301813952
54 06:13:59 0 0.1000 1.0190398614811447
55 06:24:24 0 0.1000 1.0189366444214336
56 06:34:49 0 0.1000 1.0186952622656553
57 06:45:15 0 0.1000 1.014220685722693
58 06:55:40 0 0.1000 1.012852365082165
59 07:06:06 0 0.1000 1.0120688318419007
60 07:16:40 0 0.1000 0.9965613165216626
61 07:27:18 0 0.1000 0.9953716235115843
62 07:38:02 1 0.1000 1.001837087363567
63 07:48:53 2 0.1000 1.006256850496778
64 07:59:37 3 0.1000 0.9984066509525731
65 08:10:10 0 0.1000 0.9925584944351664
66 08:20:57 0 0.1000 0.9878121419785157
67 08:31:51 1 0.1000 0.9894720081225881
68 08:42:45 2 0.1000 0.992577243672227
69 08:53:30 3 0.1000 0.9881071116227024
70 09:04:04 0 0.1000 0.9730017746169612
71 09:14:40 1 0.1000 0.991095079498471
72 09:25:23 2 0.1000 0.9847190643818873
73 09:36:13 3 0.1000 0.9873424543074841
74 09:47:25 4 0.1000 0.9847348382225577
75 09:58:33 0 0.0500 0.9315909094967932
76 10:09:28 0 0.0500 0.9022325243252628
77 10:20:20 0 0.0500 0.8902195256048778
78 10:31:19 0 0.0500 0.8723525498835546
79 10:42:19 0 0.0500 0.8651090322125633
80 10:53:15 0 0.0500 0.8573019430322467
81 11:04:12 0 0.0500 0.8505386984460759
82 11:15:07 0 0.0500 0.8416592055446697
83 11:26:06 0 0.0500 0.840492116291568
84 11:37:04 0 0.0500 0.8269484595082841
85 11:48:07 0 0.0500 0.8259104798312457
86 11:58:57 0 0.0500 0.8142883822940431
87 12:09:37 1 0.0500 0.815013348145305
88 12:20:15 2 0.0500 0.81506850761625
89 12:30:54 0 0.0500 0.8126563506756189
90 12:41:32 0 0.0500 0.8073379599373296
91 12:52:12 0 0.0500 0.8044627973774694
92 13:02:46 0 0.0500 0.7935339151126034
93 13:13:17 1 0.0500 0.7999999434318182
94 13:23:57 0 0.0500 0.7855544437777321
95 13:34:30 0 0.0500 0.7852934920450426
96 13:45:07 0 0.0500 0.7823002677481129
97 13:55:44 1 0.0500 0.7847004000195917
98 14:06:25 0 0.0500 0.7812922019103788
99 14:17:07 0 0.0500 0.7765507183715983
100 14:27:54 0 0.0500 0.7713469461618729
101 14:38:45 0 0.0500 0.7689368553431529
102 14:49:40 0 0.0500 0.7649135219600965
103 15:00:28 1 0.0500 0.7698996701667894
104 15:11:35 2 0.0500 0.7652068016439114
105 15:22:40 0 0.0500 0.7561835708820595
106 15:33:42 1 0.0500 0.7572112994149046
107 15:44:39 2 0.0500 0.757130035375649
108 15:55:31 0 0.0500 0.749976492377947
109 16:06:31 0 0.0500 0.7495477832600755
110 16:17:43 0 0.0500 0.7494549221025323
111 16:28:54 1 0.0500 0.7499363383257164
112 16:40:03 0 0.0500 0.742774221880256
113 16:51:15 0 0.0500 0.7355138572881806
114 17:02:32 1 0.0500 0.7430367245707872
115 17:13:47 2 0.0500 0.7362946672597022
116 17:25:03 0 0.0500 0.7296903043029443
117 17:36:24 1 0.0500 0.7316516229285384
118 17:47:40 2 0.0500 0.7298012239190768
119 17:58:58 0 0.0500 0.725832704181941
120 18:10:14 1 0.0500 0.7342031982147469
121 18:21:33 2 0.0500 0.7373537805563999
122 18:32:57 0 0.0500 0.720425709441023
123 18:44:06 0 0.0500 0.7198627921770204
124 18:55:14 0 0.0500 0.7186782485246659
125 19:06:19 0 0.0500 0.7143201651550689
126 19:16:59 1 0.0500 0.7179321614076506
127 19:27:32 2 0.0500 0.7232625953037783
128 19:38:01 3 0.0500 0.7178108556877892
129 19:48:24 4 0.0500 0.7165789126000315
130 19:58:50 0 0.0250 0.6979938816518154
131 20:09:13 0 0.0250 0.6868638727068901
132 20:19:38 0 0.0250 0.6839717018829202
133 20:30:03 0 0.0250 0.6678646053908006
134 20:40:32 1 0.0250 0.6725394108554102
135 20:50:55 0 0.0250 0.667677206284595
136 21:01:21 1 0.0250 0.6694862040596188
137 21:11:51 0 0.0250 0.6669475417598238
138 21:22:47 0 0.0250 0.6587011924892102
139 21:33:48 1 0.0250 0.6638432242116838
140 21:44:53 0 0.0250 0.656541748890337
141 21:55:59 0 0.0250 0.6493324265671226
142 22:07:02 1 0.0250 0.6537159925249387
143 22:18:05 0 0.0250 0.644002894607355
144 22:29:04 0 0.0250 0.643809141061216
145 22:39:50 1 0.0250 0.6446365239035409
146 22:50:34 0 0.0250 0.6404419986144552
147 23:01:13 0 0.0250 0.6318214131301304
148 23:11:56 1 0.0250 0.6437739029076864
149 23:22:52 2 0.0250 0.6382293396176032
150 23:33:44 0 0.0250 0.6294730271258444
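
The BAD_EPOCHS and LEARNING_RATE columns above show Flair's annealing schedule at work: after several epochs without improvement, the learning rate is halved (0.1 to 0.05 at epoch 75, 0.05 to 0.025 at epoch 130), each time followed by a visible drop in training loss. A small sketch for inspecting the curve, assuming pandas and matplotlib are available and the file is whitespace-separated as rendered above:

```python
import pandas as pd
import matplotlib.pyplot as plt

# read the training log; use sep='\t' instead if your copy is tab-delimited
df = pd.read_csv('loss.tsv', sep=r'\s+')

fig, ax = plt.subplots()
ax.plot(df['EPOCH'], df['TRAIN_LOSS'], label='train loss')

# mark the epochs where the scheduler halved the learning rate
for epoch in df.loc[df['LEARNING_RATE'].diff() < 0, 'EPOCH']:
    ax.axvline(epoch, linestyle='--', alpha=0.5)

ax.set_xlabel('epoch')
ax.set_ylabel('training loss')
ax.legend()
plt.show()
```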
pytorch_model.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:68330eb5a23b498ae33e13ff43799d95471ea29b5ef22e504dea809cf34e7fbc
size 249072763
training.log ADDED
The diff for this file is too large to render.