alanakbik committed
Commit 6cdc8fa
1 Parent(s): e8e5382

French model

Files changed (5)
  1. README.md +139 -0
  2. loss.tsv +151 -0
  3. pytorch_model.bin +3 -0
  4. test.tsv +0 -0
  5. training.log +0 -0
README.md ADDED
@@ -0,0 +1,139 @@
+ ---
+ tags:
+ - flair
+ - token-classification
+ - sequence-tagger-model
+ language: en
+ datasets:
+ - conll2003
+ inference: false
+ ---
+
+ ## English NER in Flair (default model)
+
+ This is the standard 4-class NER model for English that ships with [Flair](https://github.com/flairNLP/flair/).
+
+ F1-Score: **92.98** (CoNLL-03)
+
+ Predicts 4 tags:
+
+ | **tag** | **meaning**       |
+ |---------|-------------------|
+ | PER     | person name       |
+ | LOC     | location name     |
+ | ORG     | organization name |
+ | MISC    | other name        |
+
+ Based on [Flair embeddings](https://www.aclweb.org/anthology/C18-1139/) and LSTM-CRF.
+
+ ---
+
+ ### Demo: How to use in Flair
+
+ Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`)
+
+ ```python
+ from flair.data import Sentence
+ from flair.models import SequenceTagger
+
+ # load tagger
+ tagger = SequenceTagger.load("flair/ner-english")
+
+ # make example sentence
+ sentence = Sentence("George Washington went to Washington")
+
+ # predict NER tags
+ tagger.predict(sentence)
+
+ # print sentence
+ print(sentence)
+
+ # print predicted NER spans
+ print('The following NER tags are found:')
+ # iterate over entities and print
+ for entity in sentence.get_spans('ner'):
+     print(entity)
+ ```
+
+ This yields the following output:
+ ```
+ Span [1,2]: "George Washington" [− Labels: PER (0.9968)]
+ Span [5]: "Washington" [− Labels: LOC (0.9994)]
+ ```
+
+ So, the entities "*George Washington*" (labeled as a **person**) and "*Washington*" (labeled as a **location**) are found in the sentence "*George Washington went to Washington*".
+
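+ For programmatic access, each predicted span also exposes its text, label and confidence directly. A minimal sketch (note: the exact accessors have shifted between Flair versions; `entity.text`, `entity.tag` and `entity.score` are one common form):
+
+ ```python
+ # read out text, label and confidence of each predicted span
+ for entity in sentence.get_spans('ner'):
+     print(f"{entity.text} -> {entity.tag} ({entity.score:.4f})")
+ ```
+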
+ ---
+
+ ### Training: Script to train this model
+
+ The following Flair script was used to train this model:
+
+ ```python
+ from flair.data import Corpus
+ from flair.datasets import CONLL_03
+ from flair.embeddings import WordEmbeddings, StackedEmbeddings, FlairEmbeddings
+
+ # 1. get the corpus
+ corpus: Corpus = CONLL_03()
+
+ # 2. what tag do we want to predict?
+ tag_type = 'ner'
+
+ # 3. make the tag dictionary from the corpus
+ tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type)
+
+ # 4. initialize each embedding we use
+ embedding_types = [
+
+     # GloVe embeddings
+     WordEmbeddings('glove'),
+
+     # contextual string embeddings, forward
+     FlairEmbeddings('news-forward'),
+
+     # contextual string embeddings, backward
+     FlairEmbeddings('news-backward'),
+ ]
+
+ # embedding stack consists of Flair and GloVe embeddings
+ embeddings = StackedEmbeddings(embeddings=embedding_types)
+
+ # 5. initialize sequence tagger
+ from flair.models import SequenceTagger
+
+ tagger = SequenceTagger(hidden_size=256,
+                         embeddings=embeddings,
+                         tag_dictionary=tag_dictionary,
+                         tag_type=tag_type)
+
+ # 6. initialize trainer
+ from flair.trainers import ModelTrainer
+
+ trainer = ModelTrainer(tagger, corpus)
+
+ # 7. run training
+ trainer.train('resources/taggers/ner-english',
+               train_with_dev=True,
+               max_epochs=150)
+ ```
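+
+ Once training finishes, the resulting checkpoint can be loaded just like the released model. A minimal sketch, assuming Flair's default output layout in which `trainer.train(...)` writes `final-model.pt` into the target directory:
+
+ ```python
+ from flair.models import SequenceTagger
+
+ # load the locally trained model instead of the released one
+ tagger = SequenceTagger.load('resources/taggers/ner-english/final-model.pt')
+ ```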
+
+ ---
+
+ ### Cite
+
+ Please cite the following paper when using this model.
+
+ ```
+ @inproceedings{akbik2018coling,
+   title     = {Contextual String Embeddings for Sequence Labeling},
+   author    = {Akbik, Alan and Blythe, Duncan and Vollgraf, Roland},
+   booktitle = {{COLING} 2018, 27th International Conference on Computational Linguistics},
+   pages     = {1638--1649},
+   year      = {2018}
+ }
+ ```
loss.tsv ADDED
@@ -0,0 +1,151 @@
+ EPOCH TIMESTAMP BAD_EPOCHS LEARNING_RATE TRAIN_LOSS
+ 0 19:26:37 0 0.1000 2.318331193891905
+ 1 19:54:00 0 0.1000 1.4398467032979894
+ 2 20:21:10 0 0.1000 1.2915569943365872
+ 3 20:48:33 0 0.1000 1.2083583032411913
+ 4 21:15:44 0 0.1000 1.1410765122341853
+ 5 21:42:55 0 0.1000 1.0989998259531555
+ 6 22:10:08 0 0.1000 1.0612774609958613
+ 7 22:37:15 0 0.1000 1.0287262405038522
+ 8 23:04:18 0 0.1000 1.0085712932130342
+ 9 23:31:39 0 0.1000 0.989349162009775
+ 10 23:58:50 0 0.1000 0.9717200679324006
+ 11 00:25:49 0 0.1000 0.9578037910804312
+ 12 00:52:51 0 0.1000 0.9408924878925405
+ 13 01:19:55 0 0.1000 0.929271377663138
+ 14 01:47:12 0 0.1000 0.9172978740465897
+ 15 02:14:12 0 0.1000 0.9044446516581761
+ 16 02:41:22 0 0.1000 0.8984834992917635
+ 17 03:08:35 0 0.1000 0.8855764541094021
+ 18 03:35:46 0 0.1000 0.8802844247830811
+ 19 04:03:03 0 0.1000 0.8727591283138721
+ 20 04:30:16 0 0.1000 0.869586915186336
+ 21 04:57:34 0 0.1000 0.8578218414498273
+ 22 05:24:44 0 0.1000 0.8486296324159509
+ 23 05:51:50 0 0.1000 0.8477299566590978
+ 24 06:19:03 0 0.1000 0.843290219976697
+ 25 06:46:17 0 0.1000 0.8351735146776322
+ 26 07:13:28 0 0.1000 0.833952986104514
+ 27 07:40:30 0 0.1000 0.8276725968446141
+ 28 08:07:37 0 0.1000 0.8232993299842521
+ 29 08:34:39 0 0.1000 0.8223834224525959
+ 30 09:01:41 0 0.1000 0.8124921200095966
+ 31 09:28:46 0 0.1000 0.8077815129071153
+ 32 09:55:57 1 0.1000 0.809081745187762
+ 33 10:23:19 0 0.1000 0.8029743133453272
+ 34 10:50:30 0 0.1000 0.796815457227089
+ 35 11:17:40 0 0.1000 0.791879897572661
+ 36 11:44:54 0 0.1000 0.7891512475266892
+ 37 12:12:11 0 0.1000 0.7862848894810804
+ 38 12:39:24 0 0.1000 0.7852521677491485
+ 39 13:06:45 0 0.1000 0.7785590070389932
+ 40 13:33:56 1 0.1000 0.7785631892902236
+ 41 14:01:15 2 0.1000 0.7807862566683882
+ 42 14:28:34 0 0.1000 0.7746844343520621
+ 43 14:55:52 0 0.1000 0.7691374521742584
+ 44 15:23:04 0 0.1000 0.7650815657870744
+ 45 15:50:16 1 0.1000 0.7660518090811468
+ 46 16:17:34 0 0.1000 0.7606712778008753
+ 47 16:44:48 1 0.1000 0.7616593170230107
+ 48 17:12:07 0 0.1000 0.7600816034341371
+ 49 17:39:16 1 0.1000 0.761806222219621
+ 50 18:06:27 2 0.1000 0.7619497269231786
+ 51 18:33:35 0 0.1000 0.7542538451411391
+ 52 19:00:47 1 0.1000 0.7549237621487469
+ 53 19:27:54 0 0.1000 0.7506475397495813
+ 54 19:55:01 0 0.1000 0.7470439873395428
+ 55 20:22:19 1 0.1000 0.7478391278975753
+ 56 20:49:30 0 0.1000 0.7456930509738384
+ 57 21:16:48 0 0.1000 0.7435161036068714
+ 58 21:43:52 0 0.1000 0.7407694635452122
+ 59 22:11:05 0 0.1000 0.7395734377285486
+ 60 22:38:25 1 0.1000 0.7396345261844897
+ 61 23:05:34 0 0.1000 0.7342302677051354
+ 62 23:32:45 1 0.1000 0.7374787427204591
+ 63 00:00:00 2 0.1000 0.7380644889528393
+ 64 00:27:11 0 0.1000 0.7336558495798419
+ 65 00:54:19 0 0.1000 0.728423485648568
+ 66 01:21:31 1 0.1000 0.7338024039983109
+ 67 01:48:42 2 0.1000 0.7318043743730873
+ 68 02:16:01 3 0.1000 0.7306094528045706
+ 69 02:43:19 0 0.1000 0.7277526330242875
+ 70 03:10:33 0 0.1000 0.722931616184532
+ 71 03:37:47 1 0.1000 0.7245286869506041
+ 72 04:05:05 2 0.1000 0.7244877224968326
+ 73 04:32:13 0 0.1000 0.7220636573129444
+ 74 04:59:20 0 0.1000 0.7186857984030759
+ 75 05:26:37 1 0.1000 0.7230376542976467
+ 76 05:53:49 2 0.1000 0.7191732061646318
+ 77 06:20:59 3 0.1000 0.7190409482647014
+ 78 06:48:18 0 0.1000 0.7167494126305144
+ 79 07:15:22 1 0.1000 0.7171490530573552
+ 80 07:42:42 2 0.1000 0.7201235358913739
+ 81 08:09:56 0 0.1000 0.7160712421581309
+ 82 08:37:12 0 0.1000 0.7155344606006658
+ 83 09:04:30 0 0.1000 0.7102095014786207
+ 84 09:31:35 1 0.1000 0.7110982454752409
+ 85 09:58:51 0 0.1000 0.7094549292678474
+ 86 10:26:24 1 0.1000 0.7138914770256447
+ 87 10:53:37 2 0.1000 0.7165087141497161
+ 88 11:20:48 3 0.1000 0.7106979087796262
+ 89 11:48:04 4 0.1000 0.7134444465880753
+ 90 12:15:04 0 0.0500 0.6728190737385904
+ 91 12:42:15 0 0.0500 0.6580921416080767
+ 92 13:09:39 0 0.0500 0.6505834398410654
+ 93 13:36:55 0 0.0500 0.6460238384303226
+ 94 14:03:53 0 0.0500 0.6412927795401825
+ 95 14:31:18 0 0.0500 0.6308042398383541
+ 96 14:58:28 1 0.0500 0.6308209504010857
+ 97 15:25:36 0 0.0500 0.6290363089051297
+ 98 15:52:47 0 0.0500 0.6214969258154592
+ 99 16:19:49 1 0.0500 0.6226950392767947
+ 100 16:46:56 0 0.0500 0.6151195442964954
+ 101 17:14:15 1 0.0500 0.6158701281794297
+ 102 17:41:38 2 0.0500 0.619198265183036
+ 103 18:08:52 0 0.0500 0.6121407378745335
+ 104 18:35:57 0 0.0500 0.611304380792764
+ 105 19:05:12 0 0.0500 0.6052835427464978
+ 106 19:40:35 1 0.0500 0.6075447963690886
+ 107 20:15:53 2 0.0500 0.6123977672669195
+ 108 20:51:14 0 0.0500 0.6021773795927724
+ 109 21:26:30 0 0.0500 0.6012489927712307
+ 110 22:01:42 1 0.0500 0.6024624848678227
+ 111 22:37:06 0 0.0500 0.5993795943035874
+ 112 23:12:26 0 0.0500 0.5977486755338407
+ 113 23:47:29 0 0.0500 0.5971653429250563
+ 114 00:22:36 0 0.0500 0.5954294447937319
+ 115 00:57:45 0 0.0500 0.5939625836508249
+ 116 01:32:57 1 0.0500 0.5965152155247426
+ 117 02:08:25 0 0.0500 0.587880958817018
+ 118 02:43:43 1 0.0500 0.5949470286568006
+ 119 03:18:58 0 0.0500 0.5856260614609846
+ 120 03:51:34 0 0.0500 0.5831995654410572
+ 121 04:18:43 1 0.0500 0.5854527208193016
+ 122 04:45:49 2 0.0500 0.5886225696933526
+ 123 05:13:01 3 0.0500 0.584342171468081
+ 124 05:40:16 4 0.0500 0.5834616551636368
+ 125 06:07:30 0 0.0250 0.566932532944346
+ 126 06:34:47 0 0.0250 0.56178293470093
+ 127 07:02:01 1 0.0250 0.5625877075217744
+ 128 07:29:14 0 0.0250 0.5547944932495074
+ 129 07:56:28 1 0.0250 0.5561661093225402
+ 130 08:23:51 0 0.0250 0.5522261777311884
+ 131 08:51:11 0 0.0250 0.5481663380018486
+ 132 09:18:23 0 0.0250 0.5458652658828644
+ 133 09:45:48 0 0.0250 0.5442436329339461
+ 134 10:13:00 1 0.0250 0.547432091471649
+ 135 10:40:15 2 0.0250 0.5470055787874165
+ 136 11:09:52 0 0.0250 0.5414540123715196
+ 137 11:49:55 1 0.0250 0.5435578107112838
+ 138 12:29:34 2 0.0250 0.5451218750047427
+ 139 13:08:59 0 0.0250 0.5362602096651831
+ 140 13:48:27 1 0.0250 0.5379885851856201
+ 141 14:28:18 2 0.0250 0.5380577692661875
+ 142 15:07:56 3 0.0250 0.5401486105896453
+ 143 15:47:42 0 0.0250 0.5301057130979594
+ 144 16:27:39 1 0.0250 0.5354876810664772
+ 145 17:07:09 2 0.0250 0.5328541969580035
+ 146 17:46:21 3 0.0250 0.5340422660833405
+ 147 18:25:58 0 0.0250 0.5279961425130085
+ 148 19:05:27 1 0.0250 0.533709565021338
+ 149 19:44:58 2 0.0250 0.5291525019593136
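
The log above shows Flair's annealing schedule at work: the learning rate is halved from 0.1 to 0.05 at epoch 90 and again to 0.025 at epoch 125, each time after the BAD_EPOCHS counter exhausts its patience. A quick sketch for inspecting the curve (an illustration only, assuming pandas and matplotlib are available; `loss.tsv` is tab-separated):

```python
import pandas as pd
import matplotlib.pyplot as plt

# columns: EPOCH, TIMESTAMP, BAD_EPOCHS, LEARNING_RATE, TRAIN_LOSS
log = pd.read_csv('loss.tsv', sep='\t')

# plot the training loss per epoch
plt.plot(log['EPOCH'], log['TRAIN_LOSS'])
plt.xlabel('epoch')
plt.ylabel('train loss')
plt.show()
```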
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f0dc8f7d09d9bfbe1f89bbea91dab03a6ec5a0cd7d28189aa6589b52dfb94b09
+ size 1331932638
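
Note that `pytorch_model.bin` (~1.3 GB) is stored as a Git LFS pointer, so a plain `git clone` without git-lfs fetches only the three-line stub above. As an alternative sketch, the weights can be pulled directly with `huggingface_hub` (assuming the repo id `flair/ner-english` used in the README):

```python
from huggingface_hub import hf_hub_download

# downloads the LFS-backed weights and returns the local cache path
path = hf_hub_download(repo_id="flair/ner-english", filename="pytorch_model.bin")
print(path)
```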
test.tsv ADDED
The diff for this file is too large to render. See raw diff
 
training.log ADDED
The diff for this file is too large to render. See raw diff