alanakbik committed on
Commit 5064d3a
1 Parent(s): 2bb6106

initial model commit

Files changed (4)
  1. README.md +145 -0
  2. loss.tsv +151 -0
  3. pytorch_model.bin +3 -0
  4. training.log +0 -0
README.md ADDED
@@ -0,0 +1,145 @@
---
tags:
- flair
- token-classification
- sequence-tagger-model
language: en
datasets:
- conll2003
inference: false
---

## English NER in Flair (fast model)

This is the fast 4-class NER model for English that ships with [Flair](https://github.com/flairNLP/flair/).

F1-score: **92.92** (corrected CoNLL-03)

Predicts 4 tags:

| **tag** | **meaning**       |
|---------|-------------------|
| PER     | person name       |
| LOC     | location name     |
| ORG     | organization name |
| MISC    | other name        |

Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
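
Internally, the CRF decodes prefixed variants of these four classes (typically a BIOES scheme) rather than the bare tags. If you want to inspect the model's full tag inventory, a minimal sketch (the attribute is named `tag_dictionary` in older Flair releases, so treat the exact name as an assumption):

```python
from flair.models import SequenceTagger

tagger = SequenceTagger.load("flair/ner-english-fast")

# prints the prefixed variants of PER/LOC/ORG/MISC plus special tags
print(tagger.label_dictionary.get_items())
```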

---

### Demo: How to use in Flair

Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)

```python
from flair.data import Sentence
from flair.models import SequenceTagger

# load the tagger
tagger = SequenceTagger.load("flair/ner-english-fast")

# make an example sentence
sentence = Sentence("George Washington went to Washington")

# predict NER tags
tagger.predict(sentence)

# print the sentence with predicted tags
print(sentence)

# iterate over predicted NER spans and print each
print('The following NER tags are found:')
for entity in sentence.get_spans('ner'):
    print(entity)
```

This yields the following output:
```
Span [1,2]: "George Washington" [− Labels: PER (0.9968)]
Span [5]: "Washington" [− Labels: LOC (0.9994)]
```

So, the entities "*George Washington*" (labeled as a **person**) and "*Washington*" (labeled as a **location**) are found in the sentence "*George Washington went to Washington*".
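If you need the predicted tag and confidence as plain values rather than printed strings, the spans expose them directly. A minimal sketch, assuming a recent Flair release (older versions expose `entity.tag` and `entity.score` instead of `get_label`):

```python
# extract entity text, tag and confidence as plain values
for entity in sentence.get_spans('ner'):
    label = entity.get_label('ner')
    print(entity.text, label.value, round(label.score, 4))
```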

---

### Training: Script to train this model

The following Flair script was used to train this model:

```python
from flair.data import Corpus
from flair.datasets import CONLL_03
from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings

# 1. get the corpus
corpus: Corpus = CONLL_03()

# 2. what tag do we want to predict?
tag_type = 'ner'

# 3. make the tag dictionary from the corpus
tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)

# 4. initialize each embedding we use
embedding_types = [

    # GloVe embeddings
    WordEmbeddings('glove'),

    # contextual string embeddings, forward
    FlairEmbeddings('news-forward'),

    # contextual string embeddings, backward
    FlairEmbeddings('news-backward'),
]

# embedding stack consists of Flair and GloVe embeddings
embeddings = StackedEmbeddings(embeddings=embedding_types)

# 5. initialize sequence tagger
from flair.models import SequenceTagger

tagger = SequenceTagger(hidden_size=256,
                        embeddings=embeddings,
                        tag_dictionary=tag_dictionary,
                        tag_type=tag_type)

# 6. initialize trainer
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)

# 7. run training
trainer.train('resources/taggers/ner-english',
              train_with_dev=True,
              max_epochs=150)
```
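
When training finishes, the resulting model can be loaded back from the output folder. A small sketch: with `train_with_dev=True` Flair writes the weights as `final-model.pt`, though the exact filename may differ across Flair versions:

```python
# load the model produced by the training run above
from flair.models import SequenceTagger

tagger = SequenceTagger.load('resources/taggers/ner-english/final-model.pt')
```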

---

### Cite

Please cite the following paper when using this model.

```
@inproceedings{akbik2018coling,
  title     = {Contextual String Embeddings for Sequence Labeling},
  author    = {Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
  booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
  pages     = {1638--1649},
  year      = {2018}
}
```

---

### Issues?

The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
loss.tsv ADDED
@@ -0,0 +1,151 @@
EPOCH	TIMESTAMP	BAD_EPOCHS	LEARNING_RATE	TRAIN_LOSS
1	21:27:37	0	0.1000	3.576245712941583
2	21:29:01	0	0.1000	1.6267465572285502
3	21:30:26	0	0.1000	1.2932199398645117
4	21:31:51	0	0.1000	1.1177818077274515
5	21:33:15	0	0.1000	1.0296866451449032
6	21:34:40	0	0.1000	0.9302731227554097
7	21:36:05	0	0.1000	0.8873416783783254
8	21:37:30	0	0.1000	0.8345710260769988
9	21:38:55	0	0.1000	0.8028050488969193
10	21:40:19	0	0.1000	0.769371804132869
11	21:41:44	0	0.1000	0.7434782063989323
12	21:43:09	0	0.1000	0.7134135925694357
13	21:44:35	0	0.1000	0.697478834633963
14	21:46:00	0	0.1000	0.654981232660858
15	21:47:25	0	0.1000	0.6490681288363058
16	21:48:51	0	0.1000	0.6350203718801465
17	21:50:16	0	0.1000	0.6019481727735528
18	21:51:41	0	0.1000	0.5960373881779895
19	21:53:05	0	0.1000	0.571467437958227
20	21:54:30	0	0.1000	0.5631542286682355
21	21:55:56	0	0.1000	0.5591391156887329
22	21:57:21	0	0.1000	0.5469877312926552
23	21:58:46	0	0.1000	0.543492383406132
24	22:00:10	0	0.1000	0.5150289349493724
25	22:01:35	1	0.1000	0.5185878202554923
26	22:03:00	0	0.1000	0.5032382932597701
27	22:04:26	1	0.1000	0.5042508806844678
28	22:05:51	0	0.1000	0.48771740151922915
29	22:07:17	0	0.1000	0.47727425799622564
30	22:08:42	1	0.1000	0.4816816206549919
31	22:10:06	0	0.1000	0.46041105548509315
32	22:11:46	1	0.1000	0.4666691600474753
33	22:13:11	0	0.1000	0.4550447771915152
34	22:14:36	0	0.1000	0.4482554853716983
35	22:16:00	0	0.1000	0.4392374910980086
36	22:17:26	1	0.1000	0.44741923635519004
37	22:18:50	0	0.1000	0.4324368980206266
38	22:20:15	0	0.1000	0.42393419299793395
39	22:21:41	1	0.1000	0.4377899808644117
40	22:23:06	0	0.1000	0.40397032364448415
41	22:24:32	1	0.1000	0.4180835625274649
42	22:25:59	2	0.1000	0.40589749664539776
43	22:27:43	3	0.1000	0.4139155642707137
44	22:29:08	0	0.1000	0.40055191476793983
45	22:30:33	0	0.1000	0.3847549316013538
46	22:31:59	1	0.1000	0.39164391398146936
47	22:33:24	0	0.1000	0.378595719444035
48	22:34:49	0	0.1000	0.37545035072142563
49	22:36:15	1	0.1000	0.37854261922685406
50	22:37:39	0	0.1000	0.3668022963065135
51	22:39:04	0	0.1000	0.3653561896821366
52	22:40:29	1	0.1000	0.36685169342009327
53	22:41:55	0	0.1000	0.3547235012431688
54	22:43:19	1	0.1000	0.35747438044393365
55	22:44:44	2	0.1000	0.36117520110233675
56	22:46:09	0	0.1000	0.3506846820251851
57	22:47:35	0	0.1000	0.34869071834166593
58	22:49:01	1	0.1000	0.34916296806422215
59	22:50:26	0	0.1000	0.3443586528065461
60	22:51:50	1	0.1000	0.34437141102986246
61	22:53:15	2	0.1000	0.346735600452823
62	22:54:40	0	0.1000	0.3406031290894445
63	22:56:05	0	0.1000	0.3351760045448436
64	22:57:29	0	0.1000	0.3322265410112052
65	22:58:53	0	0.1000	0.32590263230796856
66	23:00:18	0	0.1000	0.31927544255799883
67	23:01:43	1	0.1000	0.3250151945018693
68	23:03:08	2	0.1000	0.32256935213845744
69	23:04:33	3	0.1000	0.3210709377463105
70	23:05:58	4	0.1000	0.32127140860863124
71	23:07:25	0	0.0500	0.30183341349416143
72	23:09:15	0	0.0500	0.2842321318774661
73	23:10:40	1	0.0500	0.2871976368412187
74	23:12:05	0	0.0500	0.2702093680825415
75	23:13:29	1	0.0500	0.27023764628964136
76	23:14:55	0	0.0500	0.26883385820856576
77	23:16:21	0	0.0500	0.26329286245605615
78	23:17:46	0	0.0500	0.2628642472002325
79	23:19:11	0	0.0500	0.259990004138856
80	23:20:36	1	0.0500	0.27014149534457094
81	23:22:01	0	0.0500	0.2593999018869068
82	23:23:26	0	0.0500	0.2551785586283931
83	23:24:51	0	0.0500	0.2534079396319163
84	23:26:15	1	0.0500	0.2551872474671919
85	23:27:40	2	0.0500	0.2594162785224145
86	23:29:07	0	0.0500	0.2524074929990346
87	23:30:32	0	0.0500	0.25090720037682146
88	23:31:58	0	0.0500	0.23916956306069712
89	23:33:22	0	0.0500	0.2356912124241832
90	23:34:47	1	0.0500	0.24508511117081852
91	23:36:11	2	0.0500	0.24216246373857123
92	23:37:37	3	0.0500	0.23845135922767693
93	23:39:02	4	0.0500	0.23764705532996716
94	23:40:27	1	0.0250	0.24339046411782125
95	23:41:51	0	0.0250	0.2290070934716282
96	23:43:16	0	0.0250	0.2265663307065828
97	23:44:41	1	0.0250	0.22918056114282034
98	23:46:06	0	0.0250	0.2230998661982108
99	23:47:30	0	0.0250	0.21604776231548453
100	23:48:56	1	0.0250	0.23747224881773507
101	23:50:20	2	0.0250	0.21889492183263543
102	23:51:44	0	0.0250	0.2143383389220962
103	23:53:08	1	0.0250	0.21810815311213838
104	23:54:31	2	0.0250	0.21440149303761463
105	23:55:54	0	0.0250	0.21424433980372887
106	23:57:18	1	0.0250	0.2152267113988158
107	23:58:41	0	0.0250	0.2141137710194799
108	00:00:04	0	0.0250	0.21356280777556233
109	00:01:27	1	0.0250	0.22050891868487188
110	00:02:50	2	0.0250	0.21593614047558246
111	00:04:13	0	0.0250	0.21179587707583664
112	00:05:35	0	0.0250	0.20590584845388238
113	00:07:03	1	0.0250	0.20764590038245992
114	00:08:37	0	0.0250	0.2053749050143399
115	00:10:00	1	0.0250	0.20565480036260206
116	00:11:23	2	0.0250	0.20719853168518482
117	00:12:45	3	0.0250	0.20710589055302023
118	00:14:07	0	0.0250	0.19874572183323813
119	00:15:30	1	0.0250	0.21330269343585154
120	00:16:52	2	0.0250	0.20283179949450342
121	00:18:15	3	0.0250	0.2028056960081375
122	00:19:37	4	0.0250	0.20502184220601485
123	00:20:59	0	0.0125	0.1984874309052395
124	00:22:22	0	0.0125	0.19242043826210348
125	00:23:44	1	0.0125	0.20257633792448648
126	00:25:05	2	0.0125	0.19872382566144195
127	00:26:28	3	0.0125	0.19759908735846418
128	00:27:51	4	0.0125	0.20143914043526107
129	00:29:15	1	0.0063	0.19428333055369462
130	00:30:37	2	0.0063	0.19641075125317783
131	00:31:59	3	0.0063	0.20426463074039056
132	00:33:22	4	0.0063	0.1924995852374955
133	00:34:44	0	0.0031	0.18798776621682733
134	00:36:06	1	0.0031	0.19395236284295214
135	00:37:28	2	0.0031	0.19617622008523608
136	00:38:51	3	0.0031	0.1928124738314861
137	00:40:13	4	0.0031	0.20205567409343358
138	00:41:35	1	0.0016	0.19410402266473709
139	00:42:58	2	0.0016	0.1933639193590306
140	00:44:20	3	0.0016	0.1959089887670324
141	00:45:43	4	0.0016	0.19630460231270216
142	00:47:06	1	0.0008	0.19898560986252903
143	00:48:32	2	0.0008	0.19213544053933287
144	00:49:56	3	0.0008	0.18811244633095928
145	00:51:18	4	0.0008	0.1902414702256269
146	00:52:41	1	0.0004	0.189671358352975
147	00:54:03	0	0.0004	0.1872933999741379
148	00:55:25	1	0.0004	0.19926253187505505
149	00:56:48	2	0.0004	0.19951951173664648
150	00:58:10	0	0.0004	0.1853659376855704
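
The `BAD_EPOCHS` and `LEARNING_RATE` columns show Flair's anneal-on-plateau schedule at work: once the bad-epoch counter reaches its patience threshold, the learning rate is halved (0.1 → 0.05 → … → 0.0004). A quick way to inspect the run, as a sketch using pandas and matplotlib (not part of the original commit):

```python
# plot the training loss and the learning-rate annealing steps from loss.tsv
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv('loss.tsv', sep='\t')

fig, ax1 = plt.subplots()
ax1.plot(df['EPOCH'], df['TRAIN_LOSS'], label='train loss')
ax1.set_xlabel('epoch')
ax1.set_ylabel('train loss')

# second axis makes the halving steps of the learning rate visible
ax2 = ax1.twinx()
ax2.step(df['EPOCH'], df['LEARNING_RATE'], where='post', color='gray')
ax2.set_ylabel('learning rate')

plt.show()
```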
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6e5f0b76cd61ac10b332ed01d131c7fba58cfa93caf4736f09747b2dc1533399
size 256731629
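
This is a Git LFS pointer file: the repository itself stores only the sha256 and byte size, while the ~257 MB weights live in LFS storage. To fetch the actual file programmatically, a sketch using the `huggingface_hub` client (the repo id `flair/ner-english-fast` is taken from the README above):

```python
# download the LFS-backed weights through the Hugging Face Hub client
from huggingface_hub import hf_hub_download

path = hf_hub_download(repo_id="flair/ner-english-fast", filename="pytorch_model.bin")
print(path)  # local cache path of the downloaded model file
```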
training.log ADDED
The diff for this file is too large to render. See raw diff