alanakbik committed
Commit 847e261
1 Parent(s): e32234e

initial model commit

Files changed (3)
  1. README.md +151 -0
  2. loss.tsv +132 -0
  3. pytorch_model.bin +3 -0
README.md ADDED
@@ -0,0 +1,151 @@
+ ---
+ tags:
+ - flair
+ - token-classification
+ - sequence-tagger-model
+ language: en
+ datasets:
+ - conll2000
+ inference: false
+ ---
+
+ ## English Chunking in Flair (fast model)
+
+ This is the fast phrase chunking model for English that ships with [Flair](https://github.com/flairNLP/flair/).
+
+ F1-Score: **96.48** (corrected CoNLL-2000)
+
+ Predicts 10 tags:
+
+ | **tag** | **meaning** |
+ |---------|-------------|
+ | ADJP | adjectival phrase |
+ | ADVP | adverbial phrase |
+ | CONJP | conjunction |
+ | INTJ | interjection |
+ | LST | list marker |
+ | NP | noun phrase |
+ | PP | prepositional phrase |
+ | PRT | particle |
+ | SBAR | subordinate clause |
+ | VP | verb phrase |
+
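+ In the underlying CoNLL-2000 corpus these chunks are BIO-encoded per token; a short illustrative excerpt in the corpus's word / POS-tag / chunk-tag column format (not part of the original card):
+
+ ```
+ He      PRP  B-NP
+ reckons VBZ  B-VP
+ the     DT   B-NP
+ current JJ   I-NP
+ account NN   I-NP
+ deficit NN   I-NP
+ ```
+
+ Flair merges consecutive B-/I- tags of the same type into the spans returned by `get_spans('np')`.
+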
+ Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
+
+ ---
+
+ ### Demo: How to use in Flair
+
+ Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
+
+ ```python
+ from flair.data import Sentence
+ from flair.models import SequenceTagger
+
+ # load tagger
+ tagger = SequenceTagger.load("flair/chunk-english")
+
+ # make example sentence
+ sentence = Sentence("The happy man has been eating at the diner")
+
+ # predict chunk tags
+ tagger.predict(sentence)
+
+ # print sentence
+ print(sentence)
+
+ # print predicted chunk spans
+ print('The following chunk tags are found:')
+ # iterate over chunks and print
+ for entity in sentence.get_spans('np'):
+     print(entity)
+ ```
+
+ This yields the following output:
+ ```
+ Span [1,2,3]: "The happy man" [− Labels: NP (0.9958)]
+ Span [4,5,6]: "has been eating" [− Labels: VP (0.8759)]
+ Span [7]: "at" [− Labels: PP (1.0)]
+ Span [8,9]: "the diner" [− Labels: NP (0.9991)]
+ ```
+
+ So, the spans "*The happy man*" and "*the diner*" are labeled as **noun phrases** (NP) and "*has been eating*" is labeled as a **verb phrase** (VP) in the sentence "*The happy man has been eating at the diner*".
+
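+ To work with the predictions programmatically, read each span's label object; a minimal sketch (not part of the original card), assuming a Flair version where spans expose a `labels` list whose entries carry `value` and `score` fields:
+
+ ```python
+ # continuing from the demo above: text, tag and confidence per chunk
+ for span in sentence.get_spans('np'):
+     for label in span.labels:
+         print(f"{span.text!r} -> {label.value} ({label.score:.2f})")
+ ```
+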
+ ---
+
+ ### Training: Script to train this model
+
+ The following Flair script was used to train this model:
+
+ ```python
+ from flair.data import Corpus
+ from flair.datasets import CONLL_2000
+ from flair.embeddings import StackedEmbeddings, FlairEmbeddings
+
+ # 1. get the corpus
+ corpus: Corpus = CONLL_2000()
+
+ # 2. what tag do we want to predict?
+ tag_type = 'np'
+
+ # 3. make the tag dictionary from the corpus
+ tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
+
+ # 4. initialize each embedding we use
+ embedding_types = [
+
+     # contextual string embeddings, forward
+     FlairEmbeddings('news-forward'),
+
+     # contextual string embeddings, backward
+     FlairEmbeddings('news-backward'),
+ ]
+
+ # embedding stack consists of forward and backward Flair embeddings
+ embeddings = StackedEmbeddings(embeddings=embedding_types)
+
+ # 5. initialize sequence tagger
+ from flair.models import SequenceTagger
+
+ tagger = SequenceTagger(hidden_size=256,
+                         embeddings=embeddings,
+                         tag_dictionary=tag_dictionary,
+                         tag_type=tag_type)
+
+ # 6. initialize trainer
+ from flair.trainers import ModelTrainer
+
+ trainer = ModelTrainer(tagger, corpus)
+
+ # 7. run training
+ trainer.train('resources/taggers/chunk-english',
+               train_with_dev=True,
+               max_epochs=150)
+ ```
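+
+ After training, the saved checkpoint can be loaded back for tagging; a minimal sketch (not part of the original card; the `final-model.pt` file name assumes Flair's default output layout when training with `train_with_dev=True`):
+
+ ```python
+ from flair.data import Sentence
+ from flair.models import SequenceTagger
+
+ # load the checkpoint written by the training run above
+ tagger = SequenceTagger.load('resources/taggers/chunk-english/final-model.pt')
+
+ # tag a new sentence and print the predicted chunk spans
+ sentence = Sentence("The quick brown fox jumps over the lazy dog")
+ tagger.predict(sentence)
+ print(sentence.get_spans('np'))
+ ```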
+
+ ---
+
+ ### Cite
+
+ Please cite the following paper when using this model.
+
+ ```
+ @inproceedings{akbik2018coling,
+   title={Contextual String Embeddings for Sequence Labeling},
+   author={Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
+   booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
+   pages = {1638--1649},
+   year = {2018}
+ }
+ ```
+
+ ---
+
+ ### Issues?
+
+ The Flair issue tracker is available [here](https://github.com/flairNLP/flair/issues/).
loss.tsv ADDED
@@ -0,0 +1,132 @@
+ EPOCH TIMESTAMP BAD_EPOCHS LEARNING_RATE TRAIN_LOSS
+ 0 05:21:01 0 0.1000 17.34242542215756
+ 1 05:22:04 0 0.1000 5.6257953405380245
+ 2 05:23:06 0 0.1000 4.303623300790787
+ 3 05:24:08 0 0.1000 3.684522943837302
+ 4 05:25:10 0 0.1000 3.3204722804682594
+ 5 05:26:13 0 0.1000 3.0727916134255273
+ 6 05:27:16 0 0.1000 2.84662518118109
+ 7 05:28:18 0 0.1000 2.7059059996690067
+ 8 05:29:21 0 0.1000 2.5761325248650144
+ 9 05:30:23 0 0.1000 2.440997558406421
+ 10 05:31:25 0 0.1000 2.3059833283935274
+ 11 05:32:25 0 0.1000 2.255575483611652
+ 12 05:33:26 0 0.1000 2.160707050561905
+ 13 05:34:28 0 0.1000 2.1140442326664926
+ 14 05:35:30 0 0.1000 2.010098801766123
+ 15 05:36:33 0 0.1000 1.9878978673900878
+ 16 05:37:35 0 0.1000 1.9592591504965509
+ 17 05:38:36 0 0.1000 1.8743698948196001
+ 18 05:39:38 0 0.1000 1.8482042870351247
+ 19 05:40:40 0 0.1000 1.8149086647800037
+ 20 05:41:41 1 0.1000 1.844334631732532
+ 21 05:42:42 0 0.1000 1.764170822075435
+ 22 05:43:44 0 0.1000 1.7315230512193271
+ 23 05:44:46 0 0.1000 1.636917947445597
+ 24 05:45:48 1 0.1000 1.6695456045014518
+ 25 05:46:50 2 0.1000 1.651555403641292
+ 26 05:47:52 0 0.1000 1.6244623371532985
+ 27 05:48:54 0 0.1000 1.5596783099429947
+ 28 05:49:56 0 0.1000 1.514996995457581
+ 29 05:50:58 0 0.1000 1.5029920841966355
+ 30 05:52:00 0 0.1000 1.431032625905105
+ 31 05:53:01 1 0.1000 1.4837342532617706
+ 32 05:54:03 2 0.1000 1.4580539977976255
+ 33 05:55:05 3 0.1000 1.4399318626948765
+ 34 05:56:07 4 0.1000 1.4475846835545132
+ 35 05:57:09 0 0.0500 1.2653039670416286
+ 36 05:58:11 0 0.0500 1.235799746428217
+ 37 05:59:13 1 0.0500 1.2381288805178234
+ 38 06:00:15 0 0.0500 1.2142414430422441
+ 39 06:01:16 0 0.0500 1.1662018648215702
+ 40 06:02:17 1 0.0500 1.19737099728414
+ 41 06:03:19 2 0.0500 1.1784159956233842
+ 42 06:04:20 0 0.0500 1.136076490368162
+ 43 06:05:21 1 0.0500 1.1422300245080674
+ 44 06:06:23 0 0.0500 1.1005895431552615
+ 45 06:07:24 0 0.0500 1.0893675114427295
+ 46 06:08:26 1 0.0500 1.115694132979427
+ 47 06:09:28 2 0.0500 1.1003576530941894
+ 48 06:10:31 0 0.0500 1.0747537553310393
+ 49 06:11:34 1 0.0500 1.093895332302366
+ 50 06:12:37 0 0.0500 1.045550647271531
+ 51 06:13:39 0 0.0500 1.0378333334411893
+ 52 06:14:43 1 0.0500 1.052635036621775
+ 53 06:15:46 2 0.0500 1.0383459685104235
+ 54 06:16:48 3 0.0500 1.0617959085319724
+ 55 06:17:51 0 0.0500 1.0143299671156065
+ 56 06:18:54 1 0.0500 1.0251009391886847
+ 57 06:19:55 0 0.0500 0.9945888205298355
+ 58 06:20:56 0 0.0500 0.9834732017346791
+ 59 06:21:58 0 0.0500 0.9683706868972097
+ 60 06:23:00 1 0.0500 1.0107239616768702
+ 61 06:24:03 0 0.0500 0.9520225398242473
+ 62 06:25:06 1 0.0500 0.9631213486194611
+ 63 06:26:08 0 0.0500 0.9376531612660204
+ 64 06:27:10 1 0.0500 0.9862911478749343
+ 65 06:28:10 2 0.0500 0.9777858261551176
+ 66 06:29:11 3 0.0500 0.9611380408917155
+ 67 06:30:11 4 0.0500 0.9442767329514027
+ 68 06:31:12 0 0.0250 0.8669117001550538
+ 69 06:32:13 1 0.0250 0.9229169498596873
+ 70 06:33:15 2 0.0250 0.8993198933345931
+ 71 06:34:16 3 0.0250 0.880038060354335
+ 72 06:35:17 0 0.0250 0.8603246239147015
+ 73 06:36:18 1 0.0250 0.8772120903645243
+ 74 06:37:21 0 0.0250 0.8479232568825994
+ 75 06:38:23 0 0.0250 0.7854352353938988
+ 76 06:39:24 1 0.0250 0.8638150128935065
+ 77 06:40:24 2 0.0250 0.8433980496866362
+ 78 06:41:25 3 0.0250 0.8262564968849931
+ 79 06:42:26 4 0.0250 0.8527375097785678
+ 80 06:43:27 1 0.0125 0.8060048690864018
+ 81 06:44:29 0 0.0125 0.7787820570170879
+ 82 06:45:30 1 0.0125 0.8047857295189585
+ 83 06:46:30 2 0.0125 0.8353619445647512
+ 84 06:47:31 3 0.0125 0.7827996164560318
+ 85 06:48:32 4 0.0125 0.7826738729008607
+ 86 06:49:33 1 0.0063 0.7854060885097299
+ 87 06:50:34 0 0.0063 0.7773133634456566
+ 88 06:51:35 1 0.0063 0.7824090438229697
+ 89 06:52:35 0 0.0063 0.7511799194983073
+ 90 06:53:36 1 0.0063 0.7688422220093863
+ 91 06:54:36 2 0.0063 0.7514194880213056
+ 92 06:55:37 0 0.0063 0.7506189524063043
+ 93 06:56:37 1 0.0063 0.7805234477988311
+ 94 06:57:38 2 0.0063 0.7742435120046138
+ 95 06:58:38 3 0.0063 0.7982145274324076
+ 96 06:59:38 4 0.0063 0.7590283545000213
+ 97 07:00:39 0 0.0031 0.7236002066305706
+ 98 07:01:40 1 0.0031 0.7341974345701081
+ 99 07:02:41 2 0.0031 0.7550298560942922
+ 100 07:03:41 0 0.0031 0.712286435599838
+ 101 07:04:43 1 0.0031 0.7448591279132025
+ 102 07:05:44 2 0.0031 0.7522817966129098
+ 103 07:06:45 3 0.0031 0.7498979801578182
+ 104 07:07:46 4 0.0031 0.769592082287584
+ 105 07:08:46 1 0.0016 0.7390777413334165
+ 106 07:09:47 0 0.0016 0.7049496266458716
+ 107 07:10:48 1 0.0016 0.7368284867278168
+ 108 07:11:48 2 0.0016 0.7685823498027665
+ 109 07:12:50 3 0.0016 0.7776820374386652
+ 110 07:13:52 4 0.0016 0.7452258818915912
+ 111 07:14:53 1 0.0008 0.7270229106502873
+ 112 07:15:54 2 0.0008 0.7077597582978862
+ 113 07:16:56 0 0.0008 0.7014804634132555
+ 114 07:17:57 1 0.0008 0.7204148694872856
+ 115 07:18:58 0 0.0008 0.6946687193853515
+ 116 07:19:58 1 0.0008 0.7694109963519232
+ 117 07:20:59 2 0.0008 0.7071356939417975
+ 118 07:22:00 3 0.0008 0.759676700511149
+ 119 07:23:01 4 0.0008 0.771170526849372
+ 120 07:24:03 1 0.0004 0.7428989289062363
+ 121 07:25:05 2 0.0004 0.7075921507818358
+ 122 07:26:06 3 0.0004 0.7152813235563892
+ 123 07:27:08 4 0.0004 0.709876735402005
+ 124 07:28:10 1 0.0002 0.7386109314858913
+ 125 07:29:12 2 0.0002 0.731379884587867
+ 126 07:30:14 0 0.0002 0.6848466526184763
+ 127 07:31:16 1 0.0002 0.7517304641859872
+ 128 07:32:17 2 0.0002 0.7443433770111629
+ 129 07:33:17 3 0.0002 0.7162831548069205
+ 130 07:34:19 4 0.0002 0.7369625336357526
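
loss.tsv records one row per epoch; the LEARNING_RATE column shows Flair's annealing schedule halving the rate whenever the training loss stops improving for several epochs (the BAD_EPOCHS counter). Not part of the commit: a quick sketch for plotting the curve, assuming pandas and matplotlib are installed and the file is tab-separated as its extension suggests:

```python
import pandas as pd
import matplotlib.pyplot as plt

# loss.tsv is tab-separated with the header row shown above
df = pd.read_csv('loss.tsv', sep='\t')

plt.plot(df['EPOCH'], df['TRAIN_LOSS'])
plt.xlabel('epoch')
plt.ylabel('train loss')
plt.title('chunk-english training loss')
plt.show()
```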
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:359acc17acefa7b5d45dfb6b9cad5a3292c3f6bfef402a10f2042d50f41b845f
+ size 75233247
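
The three lines above are a Git LFS pointer; the actual ~75 MB weight file lives in LFS storage. A hedged sketch for fetching it directly with `huggingface_hub` (the repo id is assumed from the card's `SequenceTagger.load` call):

```python
from huggingface_hub import hf_hub_download

# resolves the LFS pointer and returns the local cache path of the weights
path = hf_hub_download(repo_id="flair/chunk-english", filename="pytorch_model.bin")
print(path)
```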