egumasa committed on
Commit
10d4d4f
1 Parent(s): b46f8af

Update spaCy pipeline

.gitattributes CHANGED
@@ -33,3 +33,7 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ en_engagement_spl_RoBERTa_base_attention-any-py3-none-any.whl filter=lfs diff=lfs merge=lfs -text
+ spancat/model filter=lfs diff=lfs merge=lfs -text
+ trainable_transformer/model filter=lfs diff=lfs merge=lfs -text
+ transformer/model filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,105 @@
+ ---
+ tags:
+ - spacy
+ - token-classification
+ language:
+ - en
+ model-index:
+ - name: en_engagement_spl_RoBERTa_base_attention
+   results:
+   - task:
+       name: NER
+       type: token-classification
+     metrics:
+     - name: NER Precision
+       type: precision
+       value: 0.0
+     - name: NER Recall
+       type: recall
+       value: 0.0
+     - name: NER F Score
+       type: f_score
+       value: 0.0
+   - task:
+       name: TAG
+       type: token-classification
+     metrics:
+     - name: TAG (XPOS) Accuracy
+       type: accuracy
+       value: 0.0
+   - task:
+       name: LEMMA
+       type: token-classification
+     metrics:
+     - name: Lemma Accuracy
+       type: accuracy
+       value: 0.0
+   - task:
+       name: UNLABELED_DEPENDENCIES
+       type: token-classification
+     metrics:
+     - name: Unlabeled Attachment Score (UAS)
+       type: f_score
+       value: 0.0
+   - task:
+       name: LABELED_DEPENDENCIES
+       type: token-classification
+     metrics:
+     - name: Labeled Attachment Score (LAS)
+       type: f_score
+       value: 0.0
+   - task:
+       name: SENTS
+       type: token-classification
+     metrics:
+     - name: Sentences F-Score
+       type: f_score
+       value: 0.9469411424
+ ---
+ | Feature | Description |
+ | --- | --- |
+ | **Name** | `en_engagement_spl_RoBERTa_base_attention` |
+ | **Version** | `0.0.1` |
+ | **spaCy** | `>=3.6.0,<3.7.0` |
+ | **Default Pipeline** | `transformer`, `parser`, `tagger`, `ner`, `attribute_ruler`, `lemmatizer`, `trainable_transformer`, `spancat` |
+ | **Components** | `transformer`, `parser`, `tagger`, `ner`, `attribute_ruler`, `lemmatizer`, `trainable_transformer`, `spancat` |
+ | **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
+ | **Sources** | n/a |
+ | **License** | n/a |
+ | **Author** | [n/a]() |
+
+ ### Label Scheme
+
+ <details>
+
+ <summary>View label scheme (122 labels for 4 components)</summary>
+
+ | Component | Labels |
+ | --- | --- |
+ | **`parser`** | `ROOT`, `acl`, `acomp`, `advcl`, `advmod`, `agent`, `amod`, `appos`, `attr`, `aux`, `auxpass`, `case`, `cc`, `ccomp`, `compound`, `conj`, `csubj`, `csubjpass`, `dative`, `dep`, `det`, `dobj`, `expl`, `intj`, `mark`, `meta`, `neg`, `nmod`, `npadvmod`, `nsubj`, `nsubjpass`, `nummod`, `oprd`, `parataxis`, `pcomp`, `pobj`, `poss`, `preconj`, `predet`, `prep`, `prt`, `punct`, `quantmod`, `relcl`, `xcomp` |
+ | **`tagger`** | `$`, `''`, `,`, `-LRB-`, `-RRB-`, `.`, `:`, `ADD`, `AFX`, `CC`, `CD`, `DT`, `EX`, `FW`, `HYPH`, `IN`, `JJ`, `JJR`, `JJS`, `LS`, `MD`, `NFP`, `NN`, `NNP`, `NNPS`, `NNS`, `PDT`, `POS`, `PRP`, `PRP$`, `RB`, `RBR`, `RBS`, `RP`, `SYM`, `TO`, `UH`, `VB`, `VBD`, `VBG`, `VBN`, `VBP`, `VBZ`, `WDT`, `WP`, `WP$`, `WRB`, `XX`, ```` |
+ | **`ner`** | `CARDINAL`, `DATE`, `EVENT`, `FAC`, `GPE`, `LANGUAGE`, `LAW`, `LOC`, `MONEY`, `NORP`, `ORDINAL`, `ORG`, `PERCENT`, `PERSON`, `PRODUCT`, `QUANTITY`, `TIME`, `WORK_OF_ART` |
+ | **`spancat`** | `ATTRIBUTION`, `ENTERTAIN`, `PROCLAIM`, `SOURCES`, `MONOGLOSS`, `CITATION`, `ENDOPHORIC`, `DENY`, `JUSTIFYING`, `COUNTER` |
+
+ </details>
+
+ ### Accuracy
+
+ | Type | Score |
+ | --- | --- |
+ | `DEP_UAS` | 0.00 |
+ | `DEP_LAS` | 0.00 |
+ | `DEP_LAS_PER_TYPE` | 0.00 |
+ | `SENTS_P` | 93.64 |
+ | `SENTS_R` | 95.78 |
+ | `SENTS_F` | 94.69 |
+ | `TAG_ACC` | 0.00 |
+ | `ENTS_F` | 0.00 |
+ | `ENTS_P` | 0.00 |
+ | `ENTS_R` | 0.00 |
+ | `LEMMA_ACC` | 0.00 |
+ | `SPANS_SC_F` | 77.65 |
+ | `SPANS_SC_P` | 78.19 |
+ | `SPANS_SC_R` | 77.12 |
+ | `TRAINABLE_TRANSFORMER_LOSS` | 5917.79 |
+ | `SPANCAT_LOSS` | 76188.74 |
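As a quick sanity check on the accuracy table, an F-score is the harmonic mean of the corresponding precision and recall, so `SENTS_F` and `SPANS_SC_F` can be reproduced from the P/R rows (up to rounding, since the tabulated inputs are themselves rounded):

```python
def f_score(p: float, r: float) -> float:
    # Harmonic mean of precision and recall.
    return 2 * p * r / (p + r)

# Values taken from the accuracy table above.
sents_f = f_score(93.64, 95.78)   # close to the reported SENTS_F of 94.69
spans_f = f_score(78.19, 77.12)   # close to the reported SPANS_SC_F of 77.65
```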
attribute_ruler/patterns ADDED
Binary file (14.8 kB).
 
config.cfg ADDED
@@ -0,0 +1,265 @@
+ [paths]
+ train = "data/engagement_spl_train.spacy"
+ dev = "data/engagement_spl_dev.spacy"
+ vectors = null
+ init_tok2vec = null
+
+ [system]
+ gpu_allocator = "pytorch"
+ seed = 0
+
+ [nlp]
+ lang = "en"
+ pipeline = ["transformer","parser","tagger","ner","attribute_ruler","lemmatizer","trainable_transformer","spancat"]
+ batch_size = 32
+ disabled = []
+ before_creation = null
+ after_creation = null
+ after_pipeline_creation = null
+ tokenizer = {"@tokenizers":"spacy.Tokenizer.v1"}
+
+ [components]
+
+ [components.attribute_ruler]
+ factory = "attribute_ruler"
+ scorer = {"@scorers":"spacy.attribute_ruler_scorer.v1"}
+ validate = false
+
+ [components.lemmatizer]
+ factory = "lemmatizer"
+ mode = "rule"
+ model = null
+ overwrite = false
+ scorer = {"@scorers":"spacy.lemmatizer_scorer.v1"}
+
+ [components.ner]
+ factory = "ner"
+ incorrect_spans_key = null
+ moves = null
+ scorer = {"@scorers":"spacy.ner_scorer.v1"}
+ update_with_oracle_cut_size = 100
+
+ [components.ner.model]
+ @architectures = "spacy.TransitionBasedParser.v2"
+ state_type = "ner"
+ extra_state_tokens = false
+ hidden_width = 64
+ maxout_pieces = 2
+ use_upper = false
+ nO = null
+
+ [components.ner.model.tok2vec]
+ @architectures = "spacy-transformers.TransformerListener.v1"
+ grad_factor = 1.0
+ upstream = "transformer"
+ pooling = {"@layers":"reduce_mean.v1"}
+
+ [components.parser]
+ factory = "parser"
+ learn_tokens = false
+ min_action_freq = 30
+ moves = null
+ scorer = {"@scorers":"spacy.parser_scorer.v1"}
+ update_with_oracle_cut_size = 100
+
+ [components.parser.model]
+ @architectures = "spacy.TransitionBasedParser.v2"
+ state_type = "parser"
+ extra_state_tokens = false
+ hidden_width = 64
+ maxout_pieces = 2
+ use_upper = false
+ nO = null
+
+ [components.parser.model.tok2vec]
+ @architectures = "spacy-transformers.TransformerListener.v1"
+ grad_factor = 1.0
+ upstream = "transformer"
+ pooling = {"@layers":"reduce_mean.v1"}
+
+ [components.spancat]
+ factory = "spancat"
+ max_positive = null
+ scorer = {"@scorers":"spacy.spancat_scorer.v1"}
+ spans_key = ${vars.spans_key}
+ threshold = 0.5
+
+ [components.spancat.model]
+ @architectures = "Attention_SpanCategorizer.v3"
+
+ [components.spancat.model.reducer]
+ @layers = "spacy.mean_max_reducer.v1"
+ hidden_size = 128
+
+ [components.spancat.model.scorer]
+ @layers = "spacy.LinearLogistic.v1"
+ nO = null
+ nI = null
+
+ [components.spancat.model.tok2vec]
+ @architectures = "spacy-transformers.TransformerListener.v1"
+ grad_factor = 1.0
+ pooling = {"@layers":"reduce_mean.v1"}
+ upstream = "trainable_transformer"
+
+ [components.spancat.suggester]
+ @misc = "spacy-experimental.ngram_subtree_suggester.v1"
+ sizes = [1,2,3,4,5,6,7,8,9,10,11,12]
+
+ [components.tagger]
+ factory = "tagger"
+ label_smoothing = 0.0
+ neg_prefix = "!"
+ overwrite = false
+ scorer = {"@scorers":"spacy.tagger_scorer.v1"}
+
+ [components.tagger.model]
+ @architectures = "spacy.Tagger.v2"
+ nO = null
+ normalize = false
+
+ [components.tagger.model.tok2vec]
+ @architectures = "spacy-transformers.TransformerListener.v1"
+ grad_factor = 1.0
+ upstream = "transformer"
+ pooling = {"@layers":"reduce_mean.v1"}
+
+ [components.trainable_transformer]
+ factory = "transformer"
+ max_batch_items = 4096
+ set_extra_annotations = {"@annotation_setters":"spacy-transformers.null_annotation_setter.v1"}
+
+ [components.trainable_transformer.model]
+ @architectures = "spacy-transformers.TransformerModel.v1"
+ name = "roberta-base"
+
+ [components.trainable_transformer.model.get_spans]
+ @span_getters = "spacy-transformers.strided_spans.v1"
+ window = 128
+ stride = 96
+
+ [components.trainable_transformer.model.tokenizer_config]
+ use_fast = true
+
+ [components.transformer]
+ factory = "transformer"
+ max_batch_items = 4096
+ set_extra_annotations = {"@annotation_setters":"spacy-transformers.null_annotation_setter.v1"}
+
+ [components.transformer.model]
+ name = "roberta-base"
+ @architectures = "spacy-transformers.TransformerModel.v3"
+ mixed_precision = false
+
+ [components.transformer.model.get_spans]
+ @span_getters = "spacy-transformers.strided_spans.v1"
+ window = 128
+ stride = 96
+
+ [components.transformer.model.grad_scaler_config]
+
+ [components.transformer.model.tokenizer_config]
+ use_fast = true
+
+ [components.transformer.model.transformer_config]
+
+ [corpora]
+
+ [corpora.dev]
+ @readers = "spacy.Corpus.v1"
+ path = ${paths.dev}
+ max_length = 0
+ gold_preproc = false
+ limit = 0
+ augmenter = null
+
+ [corpora.train]
+ @readers = "spacy.Corpus.v1"
+ path = ${paths.train}
+ max_length = 2000
+ gold_preproc = false
+ limit = 0
+ augmenter = null
+
+ [training]
+ dev_corpus = "corpora.dev"
+ train_corpus = "corpora.train"
+ seed = ${system.seed}
+ gpu_allocator = ${system.gpu_allocator}
+ dropout = 0.1
+ accumulate_gradient = 4
+ patience = 3000
+ max_epochs = 0
+ max_steps = 20000
+ eval_frequency = 200
+ frozen_components = ["transformer","parser","tagger","ner","attribute_ruler","lemmatizer"]
+ annotating_components = ["parser"]
+ before_to_disk = null
+ before_update = null
+
+ [training.batcher]
+ @batchers = "spacy.batch_by_words.v1"
+ discard_oversize = false
+ tolerance = 0.2
+ get_length = null
+
+ [training.batcher.size]
+ @schedules = "compounding.v1"
+ start = 500
+ stop = 1000
+ compound = 1.0002
+ t = 0.0
+
+ [training.logger]
+ @loggers = "spacy.ConsoleLogger.v1"
+ progress_bar = true
+
+ [training.optimizer]
+ @optimizers = "Adam.v1"
+ beta1 = 0.9
+ beta2 = 0.999
+ L2_is_weight_decay = true
+ L2 = 0.01
+ grad_clip = 1.0
+ use_averages = false
+ eps = 0.00000001
+
+ [training.optimizer.learn_rate]
+ @schedules = "warmup_linear.v1"
+ warmup_steps = 1000
+ total_steps = 20000
+ initial_rate = 0.00003
+
+ [training.score_weights]
+ dep_uas = null
+ dep_las = null
+ dep_las_per_type = null
+ sents_p = null
+ sents_r = null
+ sents_f = null
+ tag_acc = null
+ ents_f = null
+ ents_p = null
+ ents_r = null
+ ents_per_type = null
+ lemma_acc = null
+ spans_sc_f = 0.5
+ spans_sc_p = 0.0
+ spans_sc_r = 0.5
+
+ [pretraining]
+
+ [initialize]
+ vectors = ${paths.vectors}
+ init_tok2vec = ${paths.init_tok2vec}
+ vocab_data = null
+ lookups = null
+ before_init = null
+ after_init = null
+
+ [initialize.components]
+
+ [initialize.tokenizer]
+
+ [vars]
+ spans_key = "sc"
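The `[training.batcher.size]` block above uses Thinc's `compounding.v1` schedule: the batch size starts at `start`, is multiplied by `compound` on every step, and saturates at `stop`. A minimal pure-Python sketch of that behaviour (an illustration, not the Thinc implementation):

```python
def compounding(start: float, stop: float, compound: float):
    """Yield values that grow by a compound factor each step, capped at stop."""
    value = start
    while True:
        yield min(value, stop)
        value *= compound

# The schedule from the config: start=500, stop=1000, compound=1.0002.
sizes = compounding(start=500, stop=1000, compound=1.0002)
first = next(sizes)                 # begins at 500
for _ in range(5000):               # after enough steps it is capped at 1000
    last = next(sizes)
```

With `compound = 1.0002` the batch size roughly doubles every ~3,500 steps, so it reaches the cap well within the configured 20,000 `max_steps`.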
custom_functions.py ADDED
@@ -0,0 +1,1117 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ from functools import partial
2
+ from pathlib import Path
3
+ from typing import Iterable, Callable, Optional
4
+ import spacy
5
+ from spacy.training import Example
6
+ from spacy.tokens import DocBin, Doc
7
+
8
+ from typing import List, Tuple, cast
9
+ from thinc.layers.chain import init as init_chain
10
+ from thinc.api import Model, with_getitem, chain, list2ragged, Logistic, Softmax, clone, LayerNorm, ParametricAttention, Dropout
11
+ from thinc.api import Maxout, Mish, Linear, Gelu, concatenate, glorot_uniform_init, PyTorchLSTM, residual
12
+ from thinc.api import reduce_mean, reduce_max, reduce_first, reduce_last, reduce_sum
13
+ from thinc.types import Ragged, Floats2d
14
+
15
+ from spacy.util import registry
16
+ from spacy.tokens import Doc
17
+ from spacy.ml.extract_spans import extract_spans
18
+
19
+ ## For initializing parametric attention
20
+ from spacy.ml.models.tok2vec import get_tok2vec_width
21
+ # @registry.layers("spacy.LinearLogistic.v1")
22
+ # def build_linear_logistic(nO=None, nI=None) -> Model[Floats2d, Floats2d]:
23
+ # """An output layer for multi-label classification. It uses a linear layer
24
+ # followed by a logistic activation.
25
+ # """
26
+ # return chain(Linear(nO=nO, nI=nI, init_W=glorot_uniform_init), Logistic())
27
+
28
+
29
+ # from typing import Callable, Optional, Tuple, cast
30
+
31
+ # from thinc.config import registry
32
+ # from thinc.model import Model
33
+ # from thinc.types import Floats2d, Ragged
34
+ # from thinc.util import get_width
35
+ # from thinc.layers.noop import noop
36
+
37
+ # InT = Ragged
38
+ # OutT = Ragged
39
+
40
+ # KEY_TRANSFORM_REF: str = "key_transform"
41
+
42
+ # # @registry.layers("ParametricAttention.v2")
43
+ # def ParametricAttention_v2(
44
+
45
+ # key_transform: Optional[Model[Floats2d, Floats2d]] = None,
46
+ # nO: Optional[int] = None
47
+ # ) -> Model[InT, OutT]:
48
+ # if key_transform is None:
49
+ # key_transform = noop()
50
+
51
+ # """Weight inputs by similarity to a learned vector"""
52
+ # return Model(
53
+ # "para-attn",
54
+ # forward,
55
+ # init=init,
56
+ # params={"Q": None},
57
+ # dims={"nO": nO},
58
+ # refs={KEY_TRANSFORM_REF: key_transform},
59
+ # layers=[key_transform],
60
+ # )
61
+
62
+
63
+ # def forward(model: Model[InT, OutT], Xr: InT, is_train: bool) -> Tuple[OutT, Callable]:
64
+ # Q = model.get_param("Q")
65
+ # key_transform = model.get_ref(KEY_TRANSFORM_REF)
66
+
67
+ # attention, bp_attention = _get_attention(
68
+ # model.ops, Q, key_transform, Xr.dataXd, Xr.lengths, is_train
69
+ # )
70
+ # output, bp_output = _apply_attention(model.ops, attention, Xr.dataXd, Xr.lengths)
71
+
72
+ # def backprop(dYr: OutT) -> InT:
73
+ # dX, d_attention = bp_output(dYr.dataXd)
74
+ # dQ, dX2 = bp_attention(d_attention)
75
+ # model.inc_grad("Q", dQ.ravel())
76
+ # dX += dX2
77
+ # return Ragged(dX, dYr.lengths)
78
+
79
+ # return Ragged(output, Xr.lengths), backprop
80
+
81
+
82
+ # def init(
83
+ # model: Model[InT, OutT], X: Optional[InT] = None, Y: Optional[OutT] = None
84
+ # ) -> None:
85
+ # key_transform = model.get_ref(KEY_TRANSFORM_REF)
86
+ # width = get_width(X) if X is not None else None
87
+ # if width:
88
+ # model.set_dim("nO", width)
89
+ # if key_transform.has_dim("nO"):
90
+ # key_transform.set_dim("nO", width)
91
+
92
+ # # Randomly initialize the parameter, as though it were an embedding.
93
+ # Q = model.ops.alloc1f(model.get_dim("nO"))
94
+ # Q += model.ops.xp.random.uniform(-0.1, 0.1, Q.shape)
95
+ # model.set_param("Q", Q)
96
+
97
+ # X_array = X.dataXd if X is not None else None
98
+ # Y_array = Y.dataXd if Y is not None else None
99
+
100
+ # key_transform.initialize(X_array, Y_array)
101
+
102
+
103
+ # def _get_attention(ops, Q, key_transform, X, lengths, is_train):
104
+ # K, K_bp = key_transform(X, is_train=is_train)
105
+
106
+ # attention = ops.gemm(K, ops.reshape2f(Q, -1, 1))
107
+ # attention = ops.softmax_sequences(attention, lengths)
108
+
109
+ # def get_attention_bwd(d_attention):
110
+ # d_attention = ops.backprop_softmax_sequences(d_attention, attention, lengths)
111
+ # dQ = ops.gemm(K, d_attention, trans1=True)
112
+ # dY = ops.xp.outer(d_attention, Q)
113
+ # dX = K_bp(dY)
114
+ # return dQ, dX
115
+
116
+ # return attention, get_attention_bwd
117
+
118
+
119
+ # def _apply_attention(ops, attention, X, lengths):
120
+ # output = X * attention
121
+
122
+ # def apply_attention_bwd(d_output):
123
+ # d_attention = (X * d_output).sum(axis=1, keepdims=True)
124
+ # dX = d_output * attention
125
+ # return dX, d_attention
126
+
127
+ # return output, apply_attention_bwd
128
+
129
+
130
+
131
+ @registry.architectures("CustomSpanCategorizer.v2")
132
+ def build_spancat_model(
133
+ tok2vec: Model[List[Doc], List[Floats2d]],
134
+ reducer1: Model[Ragged, Floats2d],
135
+ reducer2: Model[Ragged, Floats2d],
136
+ scorer: Model[Floats2d, Floats2d],
137
+ ) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
138
+ """Build a span categorizer model, given a token-to-vector model, a
139
+ reducer model to map the sequence of vectors for each span down to a single
140
+ vector, and a scorer model to map the vectors to probabilities.
141
+ tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
142
+ reducer (Model[Ragged, Floats2d]): The reducer model.
143
+ scorer (Model[Floats2d, Floats2d]): The scorer model.
144
+ """
145
+ model = chain(
146
+ cast(
147
+ Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
148
+ with_getitem(
149
+ 0,
150
+ chain(tok2vec,
151
+ cast(Model[List[Floats2d], Ragged], list2ragged()))),
152
+ ),
153
+ extract_spans(),
154
+ concatenate(reducer1, reducer2),
155
+ scorer,
156
+ )
157
+ model.set_ref("tok2vec", tok2vec)
158
+ model.set_ref("reducer1", reducer1)
159
+ model.set_ref("reducer2", reducer2)
160
+ model.set_ref("scorer", scorer)
161
+ return model
162
+
163
+
164
+ @registry.architectures("LSTM_SpanCategorizer.v1")
165
+ def build_spancat_LSTM_model(
166
+ tok2vec: Model[List[Doc], List[Floats2d]],
167
+ reducer: Model[Ragged, Floats2d],
168
+ scorer: Model[Floats2d, Floats2d],
169
+ LSTMdepth: int = 2,
170
+ LSTMdropout: float = 0.0,
171
+ LSTMhidden: int = 200) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
172
+ """Build a span categorizer model, given a token-to-vector model, a
173
+ reducer model to map the sequence of vectors for each span down to a single
174
+ vector, and a scorer model to map the vectors to probabilities.
175
+ tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
176
+ reducer (Model[Ragged, Floats2d]): The reducer model.
177
+ scorer (Model[Floats2d, Floats2d]): The scorer model.
178
+ """
179
+ embedding = cast(
180
+ Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
181
+ with_getitem(
182
+ 0,
183
+ chain(
184
+ tok2vec,
185
+ PyTorchLSTM(nI=768,
186
+ nO=LSTMhidden,
187
+ bi=True,
188
+ depth=LSTMdepth,
189
+ dropout=LSTMdropout),
190
+ cast(Model[List[Floats2d], Ragged], list2ragged()))))
191
+ # LSTM = PyTorchLSTM(nO = None, nI= None, bi = True, depth = LSTMdepth, dropout = LSTMdropout)
192
+
193
+ model = chain(
194
+ embedding,
195
+ extract_spans(),
196
+ reducer,
197
+ scorer,
198
+ )
199
+ model.set_ref("tok2vec", tok2vec)
200
+ model.set_ref("reducer", reducer)
201
+ model.set_ref("scorer", scorer)
202
+ return model
203
+
204
+
205
+ # @registry.architectures("Attention_SpanCategorizer.v1")
206
+ # def build_spancat_attention_model(
207
+ # tok2vec: Model[List[Doc], List[Floats2d]],
208
+ # reducer: Model[Ragged, Floats2d],
209
+ # scorer: Model[Floats2d, Floats2d],
210
+ # LSTMdepth: int = 2,
211
+ # LSTMdropout: float = 0.0,
212
+ # LSTMhidden: int = 200) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
213
+ # """Build a span categorizer model, given a token-to-vector model, a
214
+ # reducer model to map the sequence of vectors for each span down to a single
215
+ # vector, and a scorer model to map the vectors to probabilities.
216
+ # tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
217
+ # reducer (Model[Ragged, Floats2d]): The reducer model.
218
+ # scorer (Model[Floats2d, Floats2d]): The scorer model.
219
+ # """
220
+ # with Model.define_operators({">>": chain, "|": concatenate}):
221
+ # width = tok2vec.maybe_get_dim("nO")
222
+ # attention_layer = ParametricAttention(width)
223
+ # maxout_layer = Maxout(nO=width, nI=width)
224
+ # norm_layer = LayerNorm(nI=width)
225
+ # cnn_model = (
226
+ # cast(
227
+ # Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
228
+ # with_getitem(
229
+ # 0,
230
+ # chain(
231
+ # tok2vec,
232
+ # cast(Model[List[Floats2d], Ragged], list2ragged()))))
233
+ # # >> list2ragged()
234
+ # >> attention_layer
235
+ # >> reduce_sum()
236
+ # >> residual(maxout_layer >> norm_layer >> Dropout(0.0))
237
+ # )
238
+
239
+ # embedding = cast(
240
+ # Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
241
+ # with_getitem(
242
+ # 0,
243
+ # chain(
244
+ # tok2vec,
245
+ # cast(Model[List[Floats2d], Ragged], list2ragged()))))
246
+ # # LSTM = PyTorchLSTM(nO = None, nI= None, bi = True, depth = LSTMdepth, dropout = LSTMdropout)
247
+
248
+ # model = chain(
249
+ # concatenate(embedding, cnn_model),
250
+ # extract_spans(),
251
+ # reducer,
252
+ # scorer,
253
+ # )
254
+ # model.set_ref("tok2vec", tok2vec)
255
+ # model.set_ref("reducer", reducer)
256
+ # model.set_ref("scorer", scorer)
257
+ # return model
258
+
259
+
260
+ @registry.architectures("Attention_SpanCategorizer.v1")
261
+ def build_spancat_LSTM_model(
262
+ tok2vec: Model[List[Doc], List[Floats2d]],
263
+ reducer: Model[Ragged, Floats2d],
264
+ scorer: Model[Floats2d, Floats2d],
265
+ ) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
266
+ """Build a span categorizer model, given a token-to-vector model, a
267
+ reducer model to map the sequence of vectors for each span down to a single
268
+ vector, and a scorer model to map the vectors to probabilities.
269
+ tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
270
+ reducer (Model[Ragged, Floats2d]): The reducer model.
271
+ scorer (Model[Floats2d, Floats2d]): The scorer model.
272
+ """
273
+ width = tok2vec.maybe_get_dim("nO")
274
+ embedding = cast(
275
+ Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
276
+ with_getitem(
277
+ 0,
278
+ chain(
279
+ tok2vec,
280
+ ParametricAttention(nO = 768),
281
+ cast(Model[List[Floats2d], Ragged], list2ragged()))))
282
+
283
+
284
+ model = chain(
285
+ embedding,
286
+ extract_spans(),
287
+ reducer,
288
+ scorer,
289
+ )
290
+ model.set_ref("tok2vec", tok2vec)
291
+ model.set_ref("reducer", reducer)
292
+ model.set_ref("scorer", scorer)
293
+ return model
294
+
295
+ @registry.architectures("Attention_SpanCategorizer.v2")
296
+ def build_spancat_LSTM_model(
297
+ tok2vec: Model[List[Doc], List[Floats2d]],
298
+ reducer: Model[Ragged, Floats2d],
299
+ scorer: Model[Floats2d, Floats2d],
300
+ ) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
301
+ """Build a span categorizer model, given a token-to-vector model, a
302
+ reducer model to map the sequence of vectors for each span down to a single
303
+ vector, and a scorer model to map the vectors to probabilities.
304
+ tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
305
+ reducer (Model[Ragged, Floats2d]): The reducer model.
306
+ scorer (Model[Floats2d, Floats2d]): The scorer model.
307
+ """
308
+ width = tok2vec.maybe_get_dim("nO")
309
+ # embedding = cast(
310
+ # Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
311
+ # with_getitem(
312
+ # 0,
313
+ # chain(
314
+ # tok2vec,
315
+ # cast(Model[List[Floats2d], Ragged], list2ragged()))))
316
+ # LSTM = PyTorchLSTM(nO = None, nI= None, bi = True, depth = LSTMdepth, dropout = LSTMdropout)
317
+
318
+ model = chain(
319
+ tok2vec,
320
+ list2ragged(),
321
+ ParametricAttention(width),
322
+ extract_spans(),
323
+ reducer,
324
+ scorer,
325
+ )
326
+ model.set_ref("tok2vec", tok2vec)
327
+ model.set_ref("reducer", reducer)
328
+ model.set_ref("scorer", scorer)
329
+ return model
330
+
331
+ @registry.architectures("Attention_SpanCategorizer.v3")
332
+ def build_spancat_attention_model(
333
+ tok2vec: Model[List[Doc], List[Floats2d]],
334
+ reducer: Model[Ragged, Floats2d],
335
+ scorer: Model[Floats2d, Floats2d],
336
+ ) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
337
+ """Build a span categorizer model, given a token-to-vector model, a
338
+ reducer model to map the sequence of vectors for each span down to a single
339
+ vector, and a scorer model to map the vectors to probabilities.
340
+ tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
341
+ reducer (Model[Ragged, Floats2d]): The reducer model.
342
+ scorer (Model[Floats2d, Floats2d]): The scorer model.
343
+ """
344
+ width = tok2vec.maybe_get_dim("nO")
345
+ embedding = cast(
346
+ Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
347
+ with_getitem(
348
+ 0,
349
+ chain(
350
+ tok2vec,
351
+ cast(Model[List[Floats2d], Ragged], list2ragged()))))
352
+
353
+
354
+ model = chain(
355
+ embedding,
356
+ extract_spans(),
357
+ ParametricAttention(nO = width),
358
+ reducer,
359
+ scorer,
360
+ )
361
+ model.set_ref("tok2vec", tok2vec)
362
+ model.set_ref("reducer", reducer)
363
+ model.set_ref("scorer", scorer)
364
+ return model
365
+
366
+ @registry.architectures("Attention_SpanCategorizer.v4")
367
+ def build_spancat_LSTM_model(
368
+ tok2vec: Model[List[Doc], List[Floats2d]],
369
+ reducer: Model[Ragged, Floats2d],
370
+ scorer: Model[Floats2d, Floats2d],
371
+ ) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
372
+ """Build a span categorizer model, given a token-to-vector model, a
373
+ reducer model to map the sequence of vectors for each span down to a single
374
+ vector, and a scorer model to map the vectors to probabilities.
375
+ tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
376
+ reducer (Model[Ragged, Floats2d]): The reducer model.
377
+ scorer (Model[Floats2d, Floats2d]): The scorer model.
378
+ """
379
+ width = tok2vec.maybe_get_dim("nO")
380
+ embedding = cast(
381
+ Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
382
+ with_getitem(
383
+ 0,
384
+ chain(
385
+ tok2vec,
386
+ cast(Model[List[Floats2d], Ragged], list2ragged()))))
387
+
388
+ attention_layer = chain(
389
+ ParametricAttention(nO = width),
390
+ list2ragged())
391
+
392
+
393
+ model = chain(
394
+ embedding,
395
+ attention_layer,
396
+ extract_spans(),
397
+ reducer,
398
+ scorer,
399
+ )
400
+ model.set_ref("tok2vec", tok2vec)
401
+ model.set_ref("reducer", reducer)
402
+ model.set_ref("scorer", scorer)
403
+ return model
404
+
405
+
406
+ @registry.architectures("CustomSpanCategorizer.v2")
407
+ def build_spancat_model(
408
+ tok2vec: Model[List[Doc], List[Floats2d]],
409
+ reducer: Model[Ragged, Floats2d],
410
+ scorer: Model[Floats2d, Floats2d],
411
+ ) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
412
+ """Build a span categorizer model, given a token-to-vector model, a
413
+ reducer model to map the sequence of vectors for each span down to a single
414
+ vector, and a scorer model to map the vectors to probabilities.
415
+ tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
416
+ reducer (Model[Ragged, Floats2d]): The reducer model.
417
+ scorer (Model[Floats2d, Floats2d]): The scorer model.
418
+ """
419
+ model = chain(
420
+ cast(
421
+ Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
422
+ with_getitem(
423
+ 0,
424
+ chain(tok2vec,
425
+ cast(Model[List[Floats2d], Ragged], list2ragged()))),
426
+ ),
427
+ extract_spans(),
428
+ reducer,
429
+ scorer,
430
+ )
431
+
432
+ model.set_ref("tok2vec", tok2vec)
433
+ model.set_ref("reducer", reducer)
434
+ model.set_ref("scorer", scorer)
435
+ return model
436
+
437
+ @registry.architectures("SpanCatParametricAttention.v1")
438
+ def build_textcat_parametric_attention_v1(
439
+ tok2vec: Model[List[Doc], List[Floats2d]],
440
+ exclusive_classes: bool = False,
441
+ nO: Optional[int] = None,
442
+ ) -> Model[List[Doc], Floats2d]:
443
+ width = tok2vec.maybe_get_dim("nO")
444
+ parametric_attention = _build_parametric_attention_with_residual_nonlinear(
445
+ tok2vec=tok2vec,
446
+ nonlinear_layer=Maxout(nI=width, nO=width),
447
+ key_transform=Gelu(nI=width, nO=width),
448
+ )
449
+ with Model.define_operators({">>": chain}):
450
+ if exclusive_classes:
451
+ output_layer = Softmax(nO=nO)
452
+ else:
453
+ output_layer = Linear(nO=nO, init_W=glorot_uniform_init) >> Logistic()
454
+ model = parametric_attention >> output_layer
455
+ if model.has_dim("nO") is not False and nO is not None:
456
+ model.set_dim("nO", cast(int, nO))
457
+ model.set_ref("output_layer", output_layer)
458
+
459
+ return model
460
+
461
+
462
+ def _build_parametric_attention_with_residual_nonlinear(
+     *,
+     tok2vec: Model[List[Doc], List[Floats2d]],
+     nonlinear_layer: Model[Floats2d, Floats2d],
+     key_transform: Optional[Model[Floats2d, Floats2d]] = None,
+ ) -> Model[List[Doc], Floats2d]:
+     with Model.define_operators({">>": chain, "|": concatenate}):
+         width = tok2vec.maybe_get_dim("nO")
+         attention_layer = ParametricAttention(nO=width)
+         norm_layer = LayerNorm(nI=width)
+         parametric_attention = (
+             tok2vec
+             >> list2ragged()
+             >> attention_layer
+             >> reduce_sum()
+             >> residual(nonlinear_layer >> norm_layer >> Dropout(0.0))
+         )
+
+     parametric_attention.init = _init_parametric_attention_with_residual_nonlinear
+
+     parametric_attention.set_ref("tok2vec", tok2vec)
+     parametric_attention.set_ref("attention_layer", attention_layer)
+     # parametric_attention.set_ref("key_transform", key_transform)
+     parametric_attention.set_ref("nonlinear_layer", nonlinear_layer)
+     parametric_attention.set_ref("norm_layer", norm_layer)
+
+     return parametric_attention
+
+
+ def _init_parametric_attention_with_residual_nonlinear(model, X, Y) -> Model:
+     # When tok2vec is lazily initialized, we need to initialize it before
+     # the rest of the chain to ensure that we can get its width.
+     tok2vec = model.get_ref("tok2vec")
+     tok2vec.initialize(X)
+
+     tok2vec_width = get_tok2vec_width(model)
+     model.get_ref("attention_layer").set_dim("nO", tok2vec_width)
+     # model.get_ref("key_transform").set_dim("nI", tok2vec_width)
+     # model.get_ref("key_transform").set_dim("nO", tok2vec_width)
+     model.get_ref("nonlinear_layer").set_dim("nI", tok2vec_width)
+     model.get_ref("nonlinear_layer").set_dim("nO", tok2vec_width)
+     model.get_ref("norm_layer").set_dim("nI", tok2vec_width)
+     model.get_ref("norm_layer").set_dim("nO", tok2vec_width)
+     init_chain(model, X, Y)
+     return model
+
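For intuition, the pooling that the `ParametricAttention >> reduce_sum` combination above performs can be sketched in plain NumPy. This is a toy illustration with made-up token vectors and query, not the Thinc implementation:

```python
import numpy as np

def parametric_attention_pool(X: np.ndarray, q: np.ndarray) -> np.ndarray:
    """Pool a (tokens, width) matrix into one (width,) vector: score each
    token by its dot product with a learned query vector q, softmax the
    scores, and sum the weighted tokens (cf. ParametricAttention >> reduce_sum)."""
    scores = X @ q                  # (tokens,)
    scores = scores - scores.max()  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum()
    return (weights[:, None] * X).sum(axis=0)

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # toy token vectors
q = np.array([1.0, 0.0])                             # toy query
pooled = parametric_attention_pool(X, q)
print(pooled.shape)  # one fixed-width vector per sequence
```

The residual `nonlinear_layer >> norm_layer` block is then applied on top of this pooled vector.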
+ @registry.architectures("SpanCatEnsemble.v2")
+ def build_text_classifier_v2(
+     tok2vec: Model[List[Doc], List[Floats2d]],
+     exclusive_classes: bool = False,
+     nO: Optional[int] = None,
+ ) -> Model[List[Doc], Floats2d]:
+     # TODO: build the model with _build_parametric_attention_with_residual_nonlinear
+     # in spaCy v4. We don't do this in spaCy v3 to preserve model
+     # compatibility.
+
+     with Model.define_operators({">>": chain, "|": concatenate}):
+         width = tok2vec.maybe_get_dim("nO")
+         attention_layer = ParametricAttention(width)
+         maxout_layer = Maxout(nO=width, nI=width)
+         norm_layer = LayerNorm(nI=width)
+         cnn_model = (
+             tok2vec
+             >> list2ragged()
+             >> attention_layer
+             >> reduce_sum()
+             >> residual(maxout_layer >> norm_layer >> Dropout(0.0))
+         )
+
+         nO_double = nO * 2 if nO else None
+         if exclusive_classes:
+             output_layer = Softmax(nO=nO, nI=nO_double)
+         else:
+             output_layer = Linear(nO=nO, nI=nO_double) >> Logistic()
+         model = cnn_model >> output_layer
+         model.set_ref("tok2vec", tok2vec)
+         if model.has_dim("nO") is not False and nO is not None:
+             model.set_dim("nO", cast(int, nO))
+
+         model.set_ref("attention_layer", attention_layer)
+         model.set_ref("maxout_layer", maxout_layer)
+         model.set_ref("norm_layer", norm_layer)
+
+     model.init = init_ensemble_textcat  # type: ignore[assignment]
+     return model
+
+ @registry.architectures("Ensemble_SpanCategorizer.v1")
+ def build_spancat_model3(
+     tok2vec: Model[List[Doc], List[Floats2d]],
+     tok2vec_trf: Model[List[Doc], List[Floats2d]],
+     reducer1: Model[Ragged, Floats2d],
+     reducer2: Model[Ragged, Floats2d],
+     scorer: Model[Floats2d, Floats2d],
+ ) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
+     """Build a span categorizer model from two token-to-vector models, one
+     reducer per branch to map the sequence of vectors for each span down to
+     a single vector, and a scorer model to map the concatenated vectors to
+     probabilities.
+     tok2vec (Model[List[Doc], List[Floats2d]]): The trainable tok2vec model.
+     tok2vec_trf (Model[List[Doc], List[Floats2d]]): The pretrained transformer tok2vec model.
+     reducer1, reducer2 (Model[Ragged, Floats2d]): The reducer models.
+     scorer (Model[Floats2d, Floats2d]): The scorer model.
+     """
+     trainable_trf = cast(
+         Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
+         with_getitem(
+             0,
+             chain(tok2vec, cast(Model[List[Floats2d], Ragged],
+                                 list2ragged()))),
+     )
+     en_core_web_trf = cast(
+         Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
+         with_getitem(
+             0,
+             chain(tok2vec_trf,
+                   cast(Model[List[Floats2d], Ragged], list2ragged()))),
+     )
+     reduce_trainable = chain(trainable_trf, extract_spans(), reducer1)
+     reduce_default = chain(en_core_web_trf, extract_spans(), reducer2)
+     model = chain(
+         concatenate(reduce_trainable, reduce_default),
+         # Mish(),
+         # LayerNorm(),
+         scorer,
+     )
+     model.set_ref("tok2vec", tok2vec)
+     model.set_ref("tok2vec_trf", tok2vec_trf)
+     model.set_ref("reducer1", reducer1)
+     model.set_ref("reducer2", reducer2)
+     model.set_ref("scorer", scorer)
+     return model
+
+
+ @registry.architectures("Ensemble_SpanCategorizer.v2")
+ def build_spancat_model3_v2(
+     tok2vec: Model[List[Doc], List[Floats2d]],
+     tok2vec_trf: Model[List[Doc], List[Floats2d]],
+     reducer1: Model[Ragged, Floats2d],
+     reducer2: Model[Ragged, Floats2d],
+     scorer: Model[Floats2d, Floats2d],
+     LSTMhidden: int = 200,
+     LSTMdepth: int = 1,
+     LSTMdropout: float = 0.0,
+ ) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
+     """Like Ensemble_SpanCategorizer.v1, but passes the pretrained
+     transformer branch through a bidirectional PyTorch LSTM before span
+     extraction.
+     """
+     trainable_trf = cast(
+         Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
+         with_getitem(
+             0,
+             chain(tok2vec, cast(Model[List[Floats2d], Ragged],
+                                 list2ragged()))),
+     )
+     en_core_web_trf = cast(
+         Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
+         with_getitem(
+             0,
+             chain(
+                 tok2vec_trf,
+                 PyTorchLSTM(nI=768,
+                             nO=LSTMhidden,
+                             bi=True,
+                             depth=LSTMdepth,
+                             dropout=LSTMdropout),
+                 cast(Model[List[Floats2d], Ragged], list2ragged()))),
+     )
+     reduce_trainable = chain(trainable_trf, extract_spans(), reducer1)
+     reduce_default = chain(en_core_web_trf, extract_spans(), reducer2)
+     model = chain(
+         concatenate(reduce_trainable, reduce_default),
+         # Mish(),
+         # LayerNorm(),
+         scorer,
+     )
+     model.set_ref("tok2vec", tok2vec)
+     model.set_ref("tok2vec_trf", tok2vec_trf)
+     model.set_ref("reducer1", reducer1)
+     model.set_ref("reducer2", reducer2)
+     model.set_ref("scorer", scorer)
+     return model
+
+
+ @registry.architectures("Ensemble_SpanCategorizer.v4")
+ def build_spancat_model3_v4(
+     tok2vec: Model[List[Doc], List[Floats2d]],
+     tok2vec_trf: Model[List[Doc], List[Floats2d]],
+     reducer1: Model[Ragged, Floats2d],
+     reducer2: Model[Ragged, Floats2d],
+     scorer: Model[Floats2d, Floats2d],
+     LSTMhidden: int = 200,
+     LSTMdepth: int = 1,
+     LSTMdropout: float = 0.0,
+ ) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
+     """Like Ensemble_SpanCategorizer.v2, but inserts a Mish + LayerNorm
+     block between the concatenated reducers and the scorer.
+     """
+     trainable_trf = cast(
+         Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
+         with_getitem(
+             0,
+             chain(tok2vec, cast(Model[List[Floats2d], Ragged],
+                                 list2ragged()))),
+     )
+     en_core_web_trf = cast(
+         Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
+         with_getitem(
+             0,
+             chain(
+                 tok2vec_trf,
+                 PyTorchLSTM(nI=768,
+                             nO=LSTMhidden,
+                             bi=True,
+                             depth=LSTMdepth,
+                             dropout=LSTMdropout),
+                 cast(Model[List[Floats2d], Ragged], list2ragged()))),
+     )
+     reduce_trainable = chain(trainable_trf, extract_spans(), reducer1)
+     reduce_default = chain(en_core_web_trf, extract_spans(), reducer2)
+     model = chain(
+         concatenate(reduce_trainable, reduce_default),
+         Mish(nO=128),
+         LayerNorm(),
+         scorer,
+     )
+     model.set_ref("tok2vec", tok2vec)
+     model.set_ref("tok2vec_trf", tok2vec_trf)
+     model.set_ref("reducer1", reducer1)
+     model.set_ref("reducer2", reducer2)
+     model.set_ref("scorer", scorer)
+     return model
+
+ @registry.layers("mean_max_reducer.v1.5")
+ def build_mean_max_reducer1(hidden_size: int,
+                             dropout: float = 0.0,
+                             depth: int = 1) -> Model[Ragged, Floats2d]:
+     """Reduce sequences by concatenating their last, first, mean and max
+     pooled vectors, and then combine the concatenated vectors with a stack
+     of hidden Maxout layers.
+     """
+     return chain(
+         concatenate(
+             cast(Model[Ragged, Floats2d], reduce_last()),
+             cast(Model[Ragged, Floats2d], reduce_first()),
+             reduce_mean(),
+             reduce_max(),
+         ),
+         clone(Maxout(nO=hidden_size, normalize=True, dropout=dropout), depth),
+     )
+
+
+ @registry.layers("sum_reducer.v1")
+ def build_sum_reducer(hidden_size: int,
+                       dropout: float = 0.0,
+                       depth: int = 1) -> Model[Ragged, Floats2d]:
+     """Reduce sequences by sum pooling, and then combine the pooled vectors
+     with a stack of hidden Maxout layers.
+     """
+     return chain(
+         reduce_sum(),
+         clone(Maxout(nO=hidden_size, normalize=True, dropout=dropout), depth),
+     )
+
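A registered reducer like the ones above is selected from the spaCy training config by its registry name. A hypothetical config fragment (the section path is an assumption about how this pipeline wires its spancat component, not copied from this repo's config) might look like:

```ini
[components.spancat.model.reducer]
@layers = "mean_max_reducer.v1.5"
hidden_size = 128
dropout = 0.0
depth = 1
```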
+ # @registry.layers("mean_max_reducer.v2")
+ # def build_mean_max_reducer2(hidden_size: int,
+ #                             dropout: float = 0.0) -> Model[Ragged, Floats2d]:
+ #     """Reduce sequences by concatenating their mean and max pooled vectors,
+ #     and then combine the concatenated vectors with a hidden layer.
+ #     """
+ #     return chain(
+ #         concatenate(
+ #             cast(Model[Ragged, Floats2d], reduce_last()),
+ #             cast(Model[Ragged, Floats2d], reduce_first()),
+ #             reduce_mean(),
+ #             reduce_mean(),
+ #             reduce_max(),
+ #         ),
+ #         Maxout(nO=hidden_size, normalize=True, dropout=dropout),
+ #     )
+
+ @registry.layers("Gelu_mean_max_reducer.v1")
+ def build_mean_max_reducer_gelu(hidden_size: int,
+                                 dropout: float = 0.0,
+                                 depth: int = 1) -> Model[Ragged, Floats2d]:
+     """Reduce sequences by concatenating their last, first, mean and max
+     pooled vectors, and then combine the concatenated vectors with a stack
+     of Gelu hidden layers.
+     """
+     gelu_unit = Gelu(nO=hidden_size, normalize=True, dropout=dropout)
+     return chain(
+         concatenate(
+             cast(Model[Ragged, Floats2d], reduce_last()),
+             cast(Model[Ragged, Floats2d], reduce_first()),
+             reduce_mean(),
+             reduce_max(),
+         ),
+         clone(gelu_unit, depth),
+     )
+
+
+ @registry.layers("Mish_mean_max_reducer.v1")
+ def build_mish_mean_max_reducer(hidden_size: int,
+                                 dropout: float = 0.0,
+                                 depth: int = 4) -> Model[Ragged, Floats2d]:
+     """Reduce sequences by concatenating their last, first, mean and max
+     pooled vectors, and then combine the concatenated vectors with a stack
+     of Mish hidden layers.
+     """
+     mish_unit = Mish(nO=hidden_size, normalize=True, dropout=dropout)
+     return chain(
+         concatenate(
+             cast(Model[Ragged, Floats2d], reduce_last()),
+             cast(Model[Ragged, Floats2d], reduce_first()),
+             reduce_mean(),
+             reduce_max(),
+         ),
+         clone(mish_unit, depth),
+     )
+
+
+ @registry.layers("Maxout_mean_max_reducer.v2")
+ def build_maxout_mean_max_reducer(hidden_size: int,
+                                   dropout: float = 0.0,
+                                   depth: int = 4) -> Model[Ragged, Floats2d]:
+     """Reduce sequences by concatenating their last, first, mean and max
+     pooled vectors, and then combine the concatenated vectors with a stack
+     of Maxout hidden layers.
+     """
+     maxout_unit = Maxout(nO=hidden_size, normalize=True, dropout=dropout)
+     return chain(
+         concatenate(
+             cast(Model[Ragged, Floats2d], reduce_last()),
+             cast(Model[Ragged, Floats2d], reduce_first()),
+             reduce_mean(),
+             reduce_max(),
+         ),
+         clone(maxout_unit, depth),
+     )
+
+
+ @registry.layers("mean_max_reducer.v2")
+ def build_mean_max_reducer2(hidden_size: int,
+                             dropout: float = 0.0) -> Model[Ragged, Floats2d]:
+     """Reduce sequences by concatenating their last, first, mean and max
+     pooled vectors, and then combine the concatenated vectors with a hidden
+     layer.
+     """
+     return chain(
+         concatenate(
+             cast(Model[Ragged, Floats2d], reduce_last()),
+             cast(Model[Ragged, Floats2d], reduce_first()),
+             reduce_mean(),
+             reduce_mean(),  # mean appears twice, doubling its weight in the concatenation
+             reduce_max(),
+         ),
+         Maxout(nO=hidden_size, normalize=True, dropout=dropout),
+     )
+
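The concatenated pooling that these reducers perform over each span can be sketched in plain NumPy. This is a toy illustration of the `reduce_last | reduce_first | reduce_mean | reduce_max` branch only (the span matrix is invented, and the Thinc version operates on Ragged batches, not single spans):

```python
import numpy as np

def mean_max_reduce(span: np.ndarray) -> np.ndarray:
    """Concatenate last, first, mean and max pooling of a (tokens, width)
    span matrix, mirroring reduce_last | reduce_first | reduce_mean | reduce_max."""
    return np.concatenate([span[-1], span[0], span.mean(axis=0), span.max(axis=0)])

span = np.array([[0.0, 1.0], [2.0, 3.0], [4.0, 5.0]])  # 3 tokens, width 2
pooled = mean_max_reduce(span)
print(pooled.shape)  # 4 * width = 8 features, fed into the hidden Maxout layer
```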
+ @registry.layers("two_way_reducer.v1")
+ def build_two_way_reducer(hidden_size: int,
+                           dropout: float = 0.0) -> Model[Ragged, Floats2d]:
+     """Reduce sequences with two parallel branches: last/first/mean/max
+     pooling and mean/sum pooling, each followed by a Maxout hidden layer,
+     with the branch outputs concatenated.
+     """
+     default_reducer = concatenate(
+         cast(Model[Ragged, Floats2d], reduce_last()),
+         cast(Model[Ragged, Floats2d], reduce_first()),
+         reduce_mean(),
+         reduce_max(),
+     )
+     mean_sum_reducer = concatenate(reduce_mean(), reduce_sum())
+
+     return concatenate(
+         chain(default_reducer,
+               Maxout(nO=hidden_size, normalize=True, dropout=dropout)),
+         chain(mean_sum_reducer,
+               Maxout(nO=hidden_size // 2, normalize=True, dropout=dropout)))
+
+
+ @registry.layers("Mish_two_way_reducer.v1")
+ def build_Mish_two_way_reducer(hidden_size: int,
+                                dropout: float = 0.0,
+                                depth: int = 1) -> Model[Ragged, Floats2d]:
+     """Same two-branch reduction as two_way_reducer.v1, but with stacked
+     Mish hidden layers on both branches.
+     """
+     default_reducer = concatenate(
+         cast(Model[Ragged, Floats2d], reduce_last()),
+         cast(Model[Ragged, Floats2d], reduce_first()),
+         reduce_mean(),
+         reduce_max(),
+     )
+     mean_sum_reducer = concatenate(reduce_mean(), reduce_sum())
+
+     return concatenate(
+         chain(
+             default_reducer,
+             clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
+                   depth)),
+         chain(
+             mean_sum_reducer,
+             clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
+                   depth)))
+
+
+ @registry.layers("Mish_two_way_reducer.v2")
+ def build_Mish_two_way_reducer2(hidden_size: int,
+                                 dropout: float = 0.0,
+                                 depth: int = 1) -> Model[Ragged, Floats2d]:
+     """Like Mish_two_way_reducer.v1, but the second branch pools
+     last/first/mean/sum instead of mean/sum.
+     """
+     default_reducer = concatenate(
+         cast(Model[Ragged, Floats2d], reduce_last()),
+         cast(Model[Ragged, Floats2d], reduce_first()),
+         reduce_mean(),
+         reduce_max(),
+     )
+     mean_sum_reducer = concatenate(
+         cast(Model[Ragged, Floats2d], reduce_last()),
+         cast(Model[Ragged, Floats2d], reduce_first()),
+         reduce_mean(),
+         reduce_sum(),
+     )
+
+     return concatenate(
+         chain(
+             default_reducer,
+             clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
+                   depth)),
+         chain(
+             mean_sum_reducer,
+             clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
+                   depth)))
+
+ @registry.layers("three_way_reducer.v3")
+ def build_three_way_reducer(hidden_size: int,
+                             dropout: float = 0.0,
+                             depth: int = 2) -> Model[Ragged, Floats2d]:
+     """Reduce sequences with three parallel branches: last/first/mean/max
+     pooling through a Maxout layer, plus two mean/sum pooling branches, one
+     with a single Maxout layer and one with a stack of them, all
+     concatenated.
+     """
+     default_reducer = concatenate(
+         cast(Model[Ragged, Floats2d], reduce_last()),
+         cast(Model[Ragged, Floats2d], reduce_first()),
+         reduce_mean(),
+         reduce_max(),
+     )
+     mean_sum_reducer = concatenate(reduce_mean(), reduce_sum())
+
+     return concatenate(
+         chain(default_reducer,
+               Maxout(nO=hidden_size, normalize=True, dropout=dropout)),
+         chain(mean_sum_reducer,
+               Maxout(nO=hidden_size // 2, normalize=True, dropout=dropout)),
+         chain(mean_sum_reducer,
+               clone(Maxout(nO=hidden_size // 2, normalize=True, dropout=dropout),
+                     depth)))
+
+ @registry.layers("Maxout_three_way_reducer.v1")
+ def build_Maxout_three_way_reducer(hidden_size: int,
+                                    dropout: float = 0.0,
+                                    depth: int = 2) -> Model[Ragged, Floats2d]:
+     """Reduce sequences with three parallel branches: last/first/mean/max
+     pooling through stacked Maxout layers, plus two mean/sum pooling
+     branches with shallow and stacked Maxout layers, all concatenated.
+     """
+     default_reducer = concatenate(
+         cast(Model[Ragged, Floats2d], reduce_last()),
+         cast(Model[Ragged, Floats2d], reduce_first()),
+         reduce_mean(),
+         reduce_max(),
+     )
+     mean_sum_reducer = concatenate(reduce_mean(), reduce_sum())
+
+     return concatenate(
+         chain(
+             default_reducer,
+             clone(Maxout(nO=hidden_size // 2, normalize=True, dropout=dropout),
+                   depth)),
+         chain(mean_sum_reducer,
+               Maxout(nO=hidden_size // 4, normalize=True, dropout=dropout)),
+         chain(
+             mean_sum_reducer,
+             clone(Maxout(nO=hidden_size // 4, normalize=True, dropout=dropout),
+                   depth)))
+
+
+ @registry.layers("Mish_three_way_reducer.v1")
+ def build_Mish_three_way_reducer(hidden_size: int,
+                                  dropout: float = 0.0,
+                                  depth: int = 2) -> Model[Ragged, Floats2d]:
+     """Same three-branch reduction as Maxout_three_way_reducer.v1, but with
+     Mish hidden layers.
+     """
+     default_reducer = concatenate(
+         cast(Model[Ragged, Floats2d], reduce_last()),
+         cast(Model[Ragged, Floats2d], reduce_first()),
+         reduce_mean(),
+         reduce_max(),
+     )
+     mean_sum_reducer = concatenate(reduce_mean(), reduce_sum())
+
+     return concatenate(
+         chain(
+             default_reducer,
+             clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
+                   depth)),
+         chain(mean_sum_reducer,
+               Mish(nO=hidden_size // 4, normalize=True, dropout=dropout)),
+         chain(
+             mean_sum_reducer,
+             clone(Mish(nO=hidden_size // 4, normalize=True, dropout=dropout),
+                   depth)))
+
+ @registry.layers("mean_max_reducer.v4")
+ def build_mean_max_reducer_v4(hidden_size: int,
+                               maxout_pieces: int = 3,
+                               dropout: float = 0.0) -> Model[Ragged, Floats2d]:
+     """Reduce sequences by concatenating their last, first, mean and max
+     pooled vectors, then pass the result through three stacked Maxout
+     layers, halving the width after the first.
+     """
+     hidden_size2 = hidden_size // 2
+     hidden_size3 = hidden_size // 2
+     return chain(
+         concatenate(
+             cast(Model[Ragged, Floats2d], reduce_last()),
+             cast(Model[Ragged, Floats2d], reduce_first()),
+             reduce_mean(),
+             reduce_max(),
+         ),
+         Maxout(nO=hidden_size,
+                nP=maxout_pieces,
+                normalize=True,
+                dropout=dropout),
+         Maxout(nO=hidden_size2,
+                nP=maxout_pieces,
+                normalize=True,
+                dropout=dropout),
+         Maxout(nO=hidden_size3,
+                nP=maxout_pieces,
+                normalize=True,
+                dropout=dropout))
+
+
+ @registry.layers("mean_max_reducer.v3.3")
+ def build_mean_max_reducer4(hidden_size: int,
+                             depth: int) -> Model[Ragged, Floats2d]:
+     """Reduce sequences by concatenating their last, first, mean and max
+     pooled vectors, then pass the result through three stacked Maxout
+     layers with nP=3 and no dropout.
+     """
+     hidden_size2 = hidden_size // 2
+     hidden_size3 = hidden_size // 2
+     return chain(
+         concatenate(
+             cast(Model[Ragged, Floats2d], reduce_last()),
+             cast(Model[Ragged, Floats2d], reduce_first()),
+             reduce_mean(),
+             reduce_max(),
+         ), Maxout(nO=hidden_size, nP=3, normalize=True, dropout=0.0),
+         Maxout(nO=hidden_size2, nP=3, normalize=True, dropout=0.0),
+         Maxout(nO=hidden_size3, nP=3, normalize=True, dropout=0.0))
+
+ @registry.layers("attention_reducer.v1")
+ def build_attention_reducer(hidden_size: int,
+                             dropout: float = 0.0,
+                             depth: int = 1) -> Model[Ragged, Floats2d]:
+     """Reduce sequences by concatenating their last, first, mean and max
+     pooled vectors, and then combine the concatenated vectors with a stack
+     of hidden Maxout layers.
+
+     NOTE: despite the registry name, no attention branch is wired in; a
+     draft ParametricAttention block in this function referenced an
+     undefined tok2vec and was never reachable, so this reducer currently
+     behaves like mean_max_reducer.v1.5.
+     """
+     return chain(
+         concatenate(
+             cast(Model[Ragged, Floats2d], reduce_last()),
+             cast(Model[Ragged, Floats2d], reduce_first()),
+             reduce_mean(),
+             reduce_max(),
+         ),
+         clone(Maxout(nO=hidden_size, normalize=True, dropout=dropout), depth),
+     )
+
+ # @registry.architectures("spacy.MaxoutWindowEncoder.v2")
+ # def MaxoutWindowEncoder(
+ #     width: int, window_size: int, maxout_pieces: int, depth: int
+ # ) -> Model[List[Floats2d], List[Floats2d]]:
+ #     """Encode context using convolutions with maxout activation, layer
+ #     normalization and residual connections.
+ #     width (int): The input and output width. These are required to be the same,
+ #         to allow residual connections. This value will be determined by the
+ #         width of the inputs. Recommended values are between 64 and 300.
+ #     window_size (int): The number of words to concatenate around each token
+ #         to construct the convolution. Recommended value is 1.
+ #     maxout_pieces (int): The number of maxout pieces to use. Recommended
+ #         values are 2 or 3.
+ #     depth (int): The number of convolutional layers. Recommended value is 4.
+ #     """
+ #     cnn = chain(
+ #         expand_window(window_size=window_size),
+ #         Maxout(
+ #             nO=width,
+ #             nI=width * ((window_size * 2) + 1),
+ #             nP=maxout_pieces,
+ #             dropout=0.0,
+ #             normalize=True,
+ #         ),
+ #     )
+ #     model = clone(residual(cnn), depth)
+ #     model.set_dim("nO", width)
+ #     receptive_field = window_size * depth
+ #     return with_array(model, pad=receptive_field)
+
+
+ # @registry.architectures("spacy.MishWindowEncoder.v2")
+ # def MishWindowEncoder(
+ #     width: int, window_size: int, depth: int
+ # ) -> Model[List[Floats2d], List[Floats2d]]:
+ #     """Encode context using convolutions with mish activation, layer
+ #     normalization and residual connections.
+ #     width (int): The input and output width. These are required to be the same,
+ #         to allow residual connections. This value will be determined by the
+ #         width of the inputs. Recommended values are between 64 and 300.
+ #     window_size (int): The number of words to concatenate around each token
+ #         to construct the convolution. Recommended value is 1.
+ #     depth (int): The number of convolutional layers. Recommended value is 4.
+ #     """
+ #     cnn = chain(
+ #         expand_window(window_size=window_size),
+ #         Mish(nO=width, nI=width * ((window_size * 2) + 1), dropout=0.0, normalize=True),
+ #     )
+ #     model = clone(residual(cnn), depth)
+ #     model.set_dim("nO", width)
+ #     return with_array(model)
+
en_engagement_spl_RoBERTa_base_attention-any-py3-none-any.whl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a2b2dd5c15ced10fb287fd7b894d189522d585bdfe8bd743c2d26ffff2df02ae
+ size 903886799
lemmatizer/lookups/lookups.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:eb64f40c0f8396d1762730c0ddf4dad2a52d138f5a389f71a1a1d088173b7737
+ size 972893
meta.json ADDED
@@ -0,0 +1,329 @@
+ {
+   "lang":"en",
+   "name":"engagement_spl_RoBERTa_base_attention",
+   "version":"0.0.1",
+   "description":"",
+   "author":"",
+   "email":"",
+   "url":"",
+   "license":"",
+   "spacy_version":">=3.6.0,<3.7.0",
+   "spacy_git_version":"6fc153a26",
+   "vectors":{"width":0,"vectors":0,"keys":0,"name":null},
+   "labels":{
+     "transformer":[],
+     "parser":["ROOT","acl","acomp","advcl","advmod","agent","amod","appos","attr","aux","auxpass","case","cc","ccomp","compound","conj","csubj","csubjpass","dative","dep","det","dobj","expl","intj","mark","meta","neg","nmod","npadvmod","nsubj","nsubjpass","nummod","oprd","parataxis","pcomp","pobj","poss","preconj","predet","prep","prt","punct","quantmod","relcl","xcomp"],
+     "tagger":["$","''",",","-LRB-","-RRB-",".",":","ADD","AFX","CC","CD","DT","EX","FW","HYPH","IN","JJ","JJR","JJS","LS","MD","NFP","NN","NNP","NNPS","NNS","PDT","POS","PRP","PRP$","RB","RBR","RBS","RP","SYM","TO","UH","VB","VBD","VBG","VBN","VBP","VBZ","WDT","WP","WP$","WRB","XX","``"],
+     "ner":["CARDINAL","DATE","EVENT","FAC","GPE","LANGUAGE","LAW","LOC","MONEY","NORP","ORDINAL","ORG","PERCENT","PERSON","PRODUCT","QUANTITY","TIME","WORK_OF_ART"],
+     "attribute_ruler":[],
+     "lemmatizer":[],
+     "trainable_transformer":[],
+     "spancat":["ATTRIBUTION","ENTERTAIN","PROCLAIM","SOURCES","MONOGLOSS","CITATION","ENDOPHORIC","DENY","JUSTIFYING","COUNTER"]
+   },
+   "pipeline":["transformer","parser","tagger","ner","attribute_ruler","lemmatizer","trainable_transformer","spancat"],
+   "components":["transformer","parser","tagger","ner","attribute_ruler","lemmatizer","trainable_transformer","spancat"],
+   "disabled":[],
+   "performance":{
+     "dep_uas":0.0,
+     "dep_las":0.0,
+     "dep_las_per_type":0.0,
+     "sents_p":0.936353211,
+     "sents_r":0.957771261,
+     "sents_f":0.9469411424,
+     "tag_acc":0.0,
+     "ents_f":0.0,
+     "ents_p":0.0,
+     "ents_r":0.0,
+     "ents_per_type":{
+       "ORDINAL":{"p":0.0,"r":0.0,"f":0.0},
+       "ENTERTAIN":{"p":0.0,"r":0.0,"f":0.0},
+       "PROCLAIM":{"p":0.0,"r":0.0,"f":0.0},
+       "COUNTER":{"p":0.0,"r":0.0,"f":0.0},
+       "NORP":{"p":0.0,"r":0.0,"f":0.0},
+       "FAC":{"p":0.0,"r":0.0,"f":0.0},
+       "PERSON":{"p":0.0,"r":0.0,"f":0.0},
+       "ORG":{"p":0.0,"r":0.0,"f":0.0},
+       "ATTRIBUTION":{"p":0.0,"r":0.0,"f":0.0},
+       "CARDINAL":{"p":0.0,"r":0.0,"f":0.0},
+       "DENY":{"p":0.0,"r":0.0,"f":0.0},
+       "MONOGLOSS":{"p":0.0,"r":0.0,"f":0.0},
+       "DATE":{"p":0.0,"r":0.0,"f":0.0},
+       "JUSTIFYING":{"p":0.0,"r":0.0,"f":0.0},
+       "LOC":{"p":0.0,"r":0.0,"f":0.0},
+       "PERCENT":{"p":0.0,"r":0.0,"f":0.0},
+       "GPE":{"p":0.0,"r":0.0,"f":0.0},
+       "QUANTITY":{"p":0.0,"r":0.0,"f":0.0},
+       "WORK_OF_ART":{"p":0.0,"r":0.0,"f":0.0},
+       "LAW":{"p":0.0,"r":0.0,"f":0.0},
+       "PRODUCT":{"p":0.0,"r":0.0,"f":0.0},
+       "EVENT":{"p":0.0,"r":0.0,"f":0.0},
+       "TIME":{"p":0.0,"r":0.0,"f":0.0},
+       "LANGUAGE":{"p":0.0,"r":0.0,"f":0.0}
+     },
+     "lemma_acc":0.0,
+     "spans_sc_f":0.7765328988,
+     "spans_sc_p":0.7819487179,
+     "spans_sc_r":0.7711915841,
+     "trainable_transformer_loss":59.1778603495,
+     "spancat_loss":761.8874162047
+   },
+   "requirements":["spacy-transformers>=1.2.5,<1.3.0","spacy-experimental>=0.6.4,<0.7.0"]
+ }
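As a quick sanity check on the span-categorizer scores reported in the `performance` block, the F-score is the harmonic mean of the precision and recall above. A minimal sketch using the reported values (the JSON literal below is a hand-copied excerpt, not read from the repo):

```python
import json

# Excerpt of the "performance" section from the pipeline's meta.json.
meta = json.loads(
    '{"performance": {"spans_sc_f": 0.7765328988, '
    '"spans_sc_p": 0.7819487179, "spans_sc_r": 0.7711915841}}'
)
perf = meta["performance"]
# F = 2PR / (P + R): recompute the F-score from precision and recall.
f1 = 2 * perf["spans_sc_p"] * perf["spans_sc_r"] / (perf["spans_sc_p"] + perf["spans_sc_r"])
print(round(f1, 4), round(perf["spans_sc_f"], 4))
```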
ner/cfg ADDED
@@ -0,0 +1,13 @@
+ {
+   "moves":null,
+   "update_with_oracle_cut_size":100,
+   "multitasks":[],
+   "min_action_freq":1,
+   "learn_tokens":false,
+   "beam_width":1,
+   "beam_density":0.0,
+   "beam_update_prob":0.0,
+   "incorrect_spans_key":null
+ }
ner/model ADDED
Binary file (314 kB)
ner/moves ADDED
@@ -0,0 +1 @@
+ ��moves�{"0":{},"1":{"ORG":56356,"DATE":40381,"PERSON":36475,"GPE":26716,"MONEY":15121,"CARDINAL":14096,"NORP":9638,"PERCENT":9182,"WORK_OF_ART":4475,"LOC":4047,"TIME":3670,"QUANTITY":3114,"FAC":3042,"EVENT":3015,"ORDINAL":2142,"PRODUCT":1782,"LAW":1620,"LANGUAGE":355},"2":{"ORG":56356,"DATE":40381,"PERSON":36475,"GPE":26716,"MONEY":15121,"CARDINAL":14096,"NORP":9638,"PERCENT":9182,"WORK_OF_ART":4475,"LOC":4047,"TIME":3670,"QUANTITY":3114,"FAC":3042,"EVENT":3015,"ORDINAL":2142,"PRODUCT":1782,"LAW":1620,"LANGUAGE":355},"3":{"ORG":56356,"DATE":40381,"PERSON":36475,"GPE":26716,"MONEY":15121,"CARDINAL":14096,"NORP":9638,"PERCENT":9182,"WORK_OF_ART":4475,"LOC":4047,"TIME":3670,"QUANTITY":3114,"FAC":3042,"EVENT":3015,"ORDINAL":2142,"PRODUCT":1782,"LAW":1620,"LANGUAGE":355},"4":{"ORG":56356,"DATE":40381,"PERSON":36475,"GPE":26716,"MONEY":15121,"CARDINAL":14096,"NORP":9638,"PERCENT":9182,"WORK_OF_ART":4475,"LOC":4047,"TIME":3670,"QUANTITY":3114,"FAC":3042,"EVENT":3015,"ORDINAL":2142,"PRODUCT":1782,"LAW":1620,"LANGUAGE":355,"":1},"5":{"":1}}�cfg��neg_key�
parser/cfg ADDED
@@ -0,0 +1,13 @@
+ {
+   "moves":null,
+   "update_with_oracle_cut_size":100,
+   "multitasks":[],
+   "min_action_freq":30,
+   "learn_tokens":false,
+   "beam_width":1,
+   "beam_density":0.0,
+   "beam_update_prob":0.0,
+   "incorrect_spans_key":null
+ }
parser/model ADDED
Binary file (640 kB)
parser/moves ADDED
@@ -0,0 +1,2 @@
+ ��moves�
+ {"0":{"":994267},"1":{"":990803},"2":{"det":172595,"nsubj":165748,"compound":116623,"amod":105184,"aux":86667,"punct":65478,"advmod":62763,"poss":36443,"mark":27941,"nummod":22598,"auxpass":15594,"prep":14001,"nsubjpass":13856,"neg":12357,"cc":10739,"nmod":9562,"advcl":9062,"npadvmod":8168,"quantmod":7101,"intj":6464,"ccomp":5896,"dobj":3427,"expl":3360,"dep":2806,"predet":1944,"parataxis":1837,"csubj":1428,"preconj":621,"pobj||prep":616,"attr":578,"meta":376,"advmod||conj":368,"dobj||xcomp":352,"acomp":284,"nsubj||ccomp":224,"dative":206,"advmod||xcomp":149,"dobj||ccomp":70,"csubjpass":64,"dobj||conj":62,"prep||conj":51,"acl":48,"prep||nsubj":41,"prep||dobj":36,"xcomp":34,"advmod||ccomp":32,"oprd":31},"3":{"punct":183790,"pobj":182191,"prep":174008,"dobj":89615,"conj":59687,"cc":51930,"ccomp":30385,"advmod":22861,"xcomp":21021,"relcl":20969,"advcl":19828,"attr":17741,"acomp":16922,"appos":15265,"case":13388,"acl":12085,"pcomp":10324,"npadvmod":9796,"prt":8179,"agent":3903,"dative":3866,"nsubj":3470,"neg":2906,"amod":2839,"intj":2819,"nummod":2732,"oprd":2301,"dep":1487,"parataxis":1261,"quantmod":319,"nmod":294,"acl||dobj":200,"prep||dobj":190,"prep||nsubj":162,"acl||nsubj":159,"appos||nsubj":145,"relcl||dobj":134,"relcl||nsubj":111,"aux":103,"expl":96,"meta":92,"appos||dobj":86,"preconj":71,"csubj":65,"prep||nsubjpass":55,"prep||advmod":54,"prep||acomp":53,"det":51,"nsubjpass":45,"relcl||pobj":42,"acl||nsubjpass":42,"mark":40,"auxpass":39,"prep||pobj":36,"relcl||nsubjpass":32,"appos||nsubjpass":31},"4":{"ROOT":111664}}�cfg��neg_key�
spancat/cfg ADDED
@@ -0,0 +1,19 @@
+ {
+ "labels":[
+ "ATTRIBUTION",
+ "ENTERTAIN",
+ "PROCLAIM",
+ "SOURCES",
+ "MONOGLOSS",
+ "CITATION",
+ "ENDOPHORIC",
+ "DENY",
+ "JUSTIFYING",
+ "COUNTER"
+ ],
+ "spans_key":"sc",
+ "threshold":0.5,
+ "max_positive":null,
+ "negative_weight":null,
+ "allow_overlap":true
+ }
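The spancat config defines the engagement label set and the decision threshold. A stdlib-only sketch (the per-span scores below are hypothetical, purely to illustrate the `threshold` semantics):

```python
import json

# Span-categorizer settings as listed in spancat/cfg above.
spancat_cfg = json.loads("""
{
  "labels": ["ATTRIBUTION", "ENTERTAIN", "PROCLAIM", "SOURCES", "MONOGLOSS",
             "CITATION", "ENDOPHORIC", "DENY", "JUSTIFYING", "COUNTER"],
  "spans_key": "sc",
  "threshold": 0.5,
  "max_positive": null,
  "negative_weight": null,
  "allow_overlap": true
}
""")

# Hypothetical label scores for one candidate span: only labels scoring
# at or above the threshold would be emitted as predicted spans.
scores = {"ENTERTAIN": 0.82, "DENY": 0.61, "COUNTER": 0.34}
kept = [label for label, s in scores.items() if s >= spancat_cfg["threshold"]]
print(kept)  # ['ENTERTAIN', 'DENY']
```

In a loaded pipeline, predictions appear under `doc.spans["sc"]` (the `spans_key` above), and `allow_overlap: true` permits overlapping engagement spans, which suits this label scheme where e.g. an ATTRIBUTION span can contain a DENY span.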
spancat/model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:57f0eae0e1a723c68852e60899998d0ab54d3c5dd3a9605b47c87568f5e1c46f
+ size 4731308
tagger/cfg ADDED
@@ -0,0 +1,56 @@
+ {
+ "label_smoothing":0.0,
+ "labels":[
+ "$",
+ "''",
+ ",",
+ "-LRB-",
+ "-RRB-",
+ ".",
+ ":",
+ "ADD",
+ "AFX",
+ "CC",
+ "CD",
+ "DT",
+ "EX",
+ "FW",
+ "HYPH",
+ "IN",
+ "JJ",
+ "JJR",
+ "JJS",
+ "LS",
+ "MD",
+ "NFP",
+ "NN",
+ "NNP",
+ "NNPS",
+ "NNS",
+ "PDT",
+ "POS",
+ "PRP",
+ "PRP$",
+ "RB",
+ "RBR",
+ "RBS",
+ "RP",
+ "SYM",
+ "TO",
+ "UH",
+ "VB",
+ "VBD",
+ "VBG",
+ "VBN",
+ "VBP",
+ "VBZ",
+ "WDT",
+ "WP",
+ "WP$",
+ "WRB",
+ "XX",
+ "``"
+ ],
+ "neg_prefix":"!",
+ "overwrite":false
+ }
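The tagger's label inventory is the extended Penn Treebank (PTB) tagset used by OntoNotes-trained spaCy taggers. A stdlib-only sketch confirming the inventory shown above (the literal is copied from the diff):

```python
import json

# Tagger settings as listed in tagger/cfg above.
tagger_cfg = json.loads("""
{
  "label_smoothing": 0.0,
  "labels": ["$", "''", ",", "-LRB-", "-RRB-", ".", ":", "ADD", "AFX", "CC",
             "CD", "DT", "EX", "FW", "HYPH", "IN", "JJ", "JJR", "JJS", "LS",
             "MD", "NFP", "NN", "NNP", "NNPS", "NNS", "PDT", "POS", "PRP",
             "PRP$", "RB", "RBR", "RBS", "RP", "SYM", "TO", "UH", "VB", "VBD",
             "VBG", "VBN", "VBP", "VBZ", "WDT", "WP", "WP$", "WRB", "XX", "``"],
  "neg_prefix": "!",
  "overwrite": false
}
""")

# 49 fine-grained tags in total; the verbal tags share the "VB" prefix.
verb_tags = [t for t in tagger_cfg["labels"] if t.startswith("VB")]
print(len(tagger_cfg["labels"]))  # 49
print(verb_tags)  # ['VB', 'VBD', 'VBG', 'VBN', 'VBP', 'VBZ']
```

In a loaded pipeline these values surface as `token.tag_` (the fine-grained XPOS tag), alongside the coarse `token.pos_`.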
tagger/model ADDED
Binary file (151 kB)
 
tokenizer ADDED
@@ -0,0 +1,3 @@
+ ��prefix_search� �^§|^%|^=|^—|^–|^\+(?![0-9])|^…|^……|^,|^:|^;|^\!|^\?|^¿|^؟|^¡|^\(|^\)|^\[|^\]|^\{|^\}|^<|^>|^_|^#|^\*|^&|^。|^?|^!|^,|^、|^;|^:|^~|^·|^।|^،|^۔|^؛|^٪|^\.\.+|^…|^\'|^"|^”|^“|^`|^‘|^´|^’|^‚|^,|^„|^»|^«|^「|^」|^『|^』|^(|^)|^〔|^〕|^【|^】|^《|^》|^〈|^〉|^〈|^〉|^⟦|^⟧|^\$|^£|^€|^¥|^฿|^US\$|^C\$|^A\$|^₽|^﷼|^₴|^₠|^₡|^₢|^₣|^₤|^₥|^₦|^₧|^₨|^₩|^₪|^₫|^€|^₭|^₮|^₯|^₰|^₱|^₲|^₳|^₴|^₵|^₶|^₷|^₸|^₹|^₺|^₻|^₼|^₽|^₾|^₿|^[\u00A6\u00A9\u00AE\u00B0\u0482\u058D\u058E\u060E\u060F\u06DE\u06E9\u06FD\u06FE\u07F6\u09FA\u0B70\u0BF3-\u0BF8\u0BFA\u0C7F\u0D4F\u0D79\u0F01-\u0F03\u0F13\u0F15-\u0F17\u0F1A-\u0F1F\u0F34\u0F36\u0F38\u0FBE-\u0FC5\u0FC7-\u0FCC\u0FCE\u0FCF\u0FD5-\u0FD8\u109E\u109F\u1390-\u1399\u1940\u19DE-\u19FF\u1B61-\u1B6A\u1B74-\u1B7C\u2100\u2101\u2103-\u2106\u2108\u2109\u2114\u2116\u2117\u211E-\u2123\u2125\u2127\u2129\u212E\u213A\u213B\u214A\u214C\u214D\u214F\u218A\u218B\u2195-\u2199\u219C-\u219F\u21A1\u21A2\u21A4\u21A5\u21A7-\u21AD\u21AF-\u21CD\u21D0\u21D1\u21D3\u21D5-\u21F3\u2300-\u2307\u230C-\u231F\u2322-\u2328\u232B-\u237B\u237D-\u239A\u23B4-\u23DB\u23E2-\u2426\u2440-\u244A\u249C-\u24E9\u2500-\u25B6\u25B8-\u25C0\u25C2-\u25F7\u2600-\u266E\u2670-\u2767\u2794-\u27BF\u2800-\u28FF\u2B00-\u2B2F\u2B45\u2B46\u2B4D-\u2B73\u2B76-\u2B95\u2B98-\u2BC8\u2BCA-\u2BFE\u2CE5-\u2CEA\u2E80-\u2E99\u2E9B-\u2EF3\u2F00-\u2FD5\u2FF0-\u2FFB\u3004\u3012\u3013\u3020\u3036\u3037\u303E\u303F\u3190\u3191\u3196-\u319F\u31C0-\u31E3\u3200-\u321E\u322A-\u3247\u3250\u3260-\u327F\u328A-\u32B0\u32C0-\u32FE\u3300-\u33FF\u4DC0-\u4DFF\uA490-\uA4C6\uA828-\uA82B\uA836\uA837\uA839\uAA77-\uAA79\uFDFD\uFFE4\uFFE8\uFFED\uFFEE\uFFFC\uFFFD\U00010137-\U0001013F\U00010179-\U00010189\U0001018C-\U0001018E\U00010190-\U0001019B\U000101A0\U000101D0-\U000101FC\U00010877\U00010878\U00010AC8\U0001173F\U00016B3C-\U00016B3F\U00016B45\U0001BC9C\U0001D000-\U0001D0F5\U0001D100-\U0001D126\U0001D129-\U0001D164\U0001D16A-\U0001D16C\U0001D183\U0001D184\U0001D18C-\U0001D1A9\U0001D1AE-\U0001D1E8\U0001D200-\U0001D241\U0001D245\U0001D300-\U0001D356\
U0001D800-\U0001D9FF\U0001DA37-\U0001DA3A\U0001DA6D-\U0001DA74\U0001DA76-\U0001DA83\U0001DA85\U0001DA86\U0001ECAC\U0001F000-\U0001F02B\U0001F030-\U0001F093\U0001F0A0-\U0001F0AE\U0001F0B1-\U0001F0BF\U0001F0C1-\U0001F0CF\U0001F0D1-\U0001F0F5\U0001F110-\U0001F16B\U0001F170-\U0001F1AC\U0001F1E6-\U0001F202\U0001F210-\U0001F23B\U0001F240-\U0001F248\U0001F250\U0001F251\U0001F260-\U0001F265\U0001F300-\U0001F3FA\U0001F400-\U0001F6D4\U0001F6E0-\U0001F6EC\U0001F6F0-\U0001F6F9\U0001F700-\U0001F773\U0001F780-\U0001F7D8\U0001F800-\U0001F80B\U0001F810-\U0001F847\U0001F850-\U0001F859\U0001F860-\U0001F887\U0001F890-\U0001F8AD\U0001F900-\U0001F90B\U0001F910-\U0001F93E\U0001F940-\U0001F970\U0001F973-\U0001F976\U0001F97A\U0001F97C-\U0001F9A2\U0001F9B0-\U0001F9B9\U0001F9C0-\U0001F9C2\U0001F9D0-\U0001F9FF\U0001FA60-\U0001FA6D]�suffix_search�2�…$|……$|,$|:$|;$|\!$|\?$|¿$|؟$|¡$|\($|\)$|\[$|\]$|\{$|\}$|<$|>$|_$|#$|\*$|&$|。$|?$|!$|,$|、$|;$|:$|~$|·$|।$|،$|۔$|؛$|٪$|\.\.+$|…$|\'$|"$|”$|“$|`$|‘$|´$|’$|‚$|,$|„$|»$|«$|「$|」$|『$|』$|($|)$|〔$|〕$|【$|】$|《$|》$|〈$|〉$|〈$|〉$|⟦$|⟧$|[\u00A6\u00A9\u00AE\u00B0\u0482\u058D\u058E\u060E\u060F\u06DE\u06E9\u06FD\u06FE\u07F6\u09FA\u0B70\u0BF3-\u0BF8\u0BFA\u0C7F\u0D4F\u0D79\u0F01-\u0F03\u0F13\u0F15-\u0F17\u0F1A-\u0F1F\u0F34\u0F36\u0F38\u0FBE-\u0FC5\u0FC7-\u0FCC\u0FCE\u0FCF\u0FD5-\u0FD8\u109E\u109F\u1390-\u1399\u1940\u19DE-\u19FF\u1B61-\u1B6A\u1B74-\u1B7C\u2100\u2101\u2103-\u2106\u2108\u2109\u2114\u2116\u2117\u211E-\u2123\u2125\u2127\u2129\u212E\u213A\u213B\u214A\u214C\u214D\u214F\u218A\u218B\u2195-\u2199\u219C-\u219F\u21A1\u21A2\u21A4\u21A5\u21A7-\u21AD\u21AF-\u21CD\u21D0\u21D1\u21D3\u21D5-\u21F3\u2300-\u2307\u230C-\u231F\u2322-\u2328\u232B-\u237B\u237D-\u239A\u23B4-\u23DB\u23E2-\u2426\u2440-\u244A\u249C-\u24E9\u2500-\u25B6\u25B8-\u25C0\u25C2-\u25F7\u2600-\u266E\u2670-\u2767\u2794-\u27BF\u2800-\u28FF\u2B00-\u2B2F\u2B45\u2B46\u2B4D-\u2B73\u2B76-\u2B95\u2B98-\u2BC8\u2BCA-\u2BFE\u2CE5-\u2CEA\u2E80-\u2E99\u2E9B-\u2EF3\u2F00-\u2FD5\u2FF0-\u2FFB\u3004\u3012\u3013\u3020\u3036
\u3037\u303E\u303F\u3190\u3191\u3196-\u319F\u31C0-\u31E3\u3200-\u321E\u322A-\u3247\u3250\u3260-\u327F\u328A-\u32B0\u32C0-\u32FE\u3300-\u33FF\u4DC0-\u4DFF\uA490-\uA4C6\uA828-\uA82B\uA836\uA837\uA839\uAA77-\uAA79\uFDFD\uFFE4\uFFE8\uFFED\uFFEE\uFFFC\uFFFD\U00010137-\U0001013F\U00010179-\U00010189\U0001018C-\U0001018E\U00010190-\U0001019B\U000101A0\U000101D0-\U000101FC\U00010877\U00010878\U00010AC8\U0001173F\U00016B3C-\U00016B3F\U00016B45\U0001BC9C\U0001D000-\U0001D0F5\U0001D100-\U0001D126\U0001D129-\U0001D164\U0001D16A-\U0001D16C\U0001D183\U0001D184\U0001D18C-\U0001D1A9\U0001D1AE-\U0001D1E8\U0001D200-\U0001D241\U0001D245\U0001D300-\U0001D356\U0001D800-\U0001D9FF\U0001DA37-\U0001DA3A\U0001DA6D-\U0001DA74\U0001DA76-\U0001DA83\U0001DA85\U0001DA86\U0001ECAC\U0001F000-\U0001F02B\U0001F030-\U0001F093\U0001F0A0-\U0001F0AE\U0001F0B1-\U0001F0BF\U0001F0C1-\U0001F0CF\U0001F0D1-\U0001F0F5\U0001F110-\U0001F16B\U0001F170-\U0001F1AC\U0001F1E6-\U0001F202\U0001F210-\U0001F23B\U0001F240-\U0001F248\U0001F250\U0001F251\U0001F260-\U0001F265\U0001F300-\U0001F3FA\U0001F400-\U0001F6D4\U0001F6E0-\U0001F6EC\U0001F6F0-\U0001F6F9\U0001F700-\U0001F773\U0001F780-\U0001F7D8\U0001F800-\U0001F80B\U0001F810-\U0001F847\U0001F850-\U0001F859\U0001F860-\U0001F887\U0001F890-\U0001F8AD\U0001F900-\U0001F90B\U0001F910-\U0001F93E\U0001F940-\U0001F970\U0001F973-\U0001F976\U0001F97A\U0001F97C-\U0001F9A2\U0001F9B0-\U0001F9B9\U0001F9C0-\U0001F9C2\U0001F9D0-\U0001F9FF\U0001FA60-\U0001FA6D]$|'s$|'S$|’s$|’S$|—$|–$|(?<=[0-9])\+$|(?<=°[FfCcKk])\.$|(?<=[0-9])(?:\$|£|€|¥|฿|US\$|C\$|A\$|₽|﷼|₴|₠|₡|₢|₣|₤|₥|₦|₧|₨|₩|₪|₫|€|₭|₮|₯|₰|₱|₲|₳|₴|₵|₶|₷|₸|₹|₺|₻|₼|₽|₾|₿)$|(?<=[0-9])(?:km|km²|km³|m|m²|m³|dm|dm²|dm³|cm|cm²|cm³|mm|mm²|mm³|ha|µm|nm|yd|in|ft|kg|g|mg|µg|t|lb|oz|m/s|km/h|kmh|mph|hPa|Pa|mbar|mb|MB|kb|KB|gb|GB|tb|TB|T|G|M|K|%|км|км²|км³|м|м²|м³|дм|дм²|дм³|см|см²|см³|мм|мм²|мм³|нм|кг|г|мг|м/с|км/ч|кПа|Па|мбар|Кб|КБ|кб|Мб|МБ|мб|Гб|ГБ|гб|Тб|ТБ|тбكم|كم²|كم³|م|م²|م³|سم|سم²|سم³|مم|مم²|مم³|كم|غرام|جرام|جم|كغ|ملغ|كوب|اكواب)$|(?<=[0-9a-z\
uFF41-\uFF5A\u00DF-\u00F6\u00F8-\u00FF\u0101\u0103\u0105\u0107\u0109\u010B\u010D\u010F\u0111\u0113\u0115\u0117\u0119\u011B\u011D\u011F\u0121\u0123\u0125\u0127\u0129\u012B\u012D\u012F\u0131\u0133\u0135\u0137\u0138\u013A\u013C\u013E\u0140\u0142\u0144\u0146\u0148\u0149\u014B\u014D\u014F\u0151\u0153\u0155\u0157\u0159\u015B\u015D\u015F\u0161\u0163\u0165\u0167\u0169\u016B\u016D\u016F\u0171\u0173\u0175\u0177\u017A\u017C\u017E\u017F\u0180\u0183\u0185\u0188\u018C\u018D\u0192\u0195\u0199-\u019B\u019E\u01A1\u01A3\u01A5\u01A8\u01AA\u01AB\u01AD\u01B0\u01B4\u01B6\u01B9\u01BA\u01BD-\u01BF\u01C6\u01C9\u01CC\u01CE\u01D0\u01D2\u01D4\u01D6\u01D8\u01DA\u01DC\u01DD\u01DF\u01E1\u01E3\u01E5\u01E7\u01E9\u01EB\u01ED\u01EF\u01F0\u01F3\u01F5\u01F9\u01FB\u01FD\u01FF\u0201\u0203\u0205\u0207\u0209\u020B\u020D\u020F\u0211\u0213\u0215\u0217\u0219\u021B\u021D\u021F\u0221\u0223\u0225\u0227\u0229\u022B\u022D\u022F\u0231\u0233-\u0239\u023C\u023F\u0240\u0242\u0247\u0249\u024B\u024D\u024F\u2C61\u2C65\u2C66\u2C68\u2C6A\u2C6C\u2C71\u2C73\u2C74\u2C76-\u2C7B\uA723\uA725\uA727\uA729\uA72B\uA72D\uA72F-\uA731\uA733\uA735\uA737\uA739\uA73B\uA73D\uA73F\uA741\uA743\uA745\uA747\uA749\uA74B\uA74D\uA74F\uA751\uA753\uA755\uA757\uA759\uA75B\uA75D\uA75F\uA761\uA763\uA765\uA767\uA769\uA76B\uA76D\uA76F\uA771-\uA778\uA77A\uA77C\uA77F\uA781\uA783\uA785\uA787\uA78C\uA78E\uA791\uA793-\uA795\uA797\uA799\uA79B\uA79D\uA79F\uA7A1\uA7A3\uA7A5\uA7A7\uA7A9\uA7AF\uA7B5\uA7B7\uA7B9\uA7FA\uAB30-\uAB5A\uAB60-\uAB64\u0250-\u02AF\u1D00-\u1D25\u1D6B-\u1D77\u1D79-\u1D9A\u1E01\u1E03\u1E05\u1E07\u1E09\u1E0B\u1E0D\u1E0F\u1E11\u1E13\u1E15\u1E17\u1E19\u1E1B\u1E1D\u1E1F\u1E21\u1E23\u1E25\u1E27\u1E29\u1E2B\u1E2D\u1E2F\u1E31\u1E33\u1E35\u1E37\u1E39\u1E3B\u1E3D\u1E3F\u1E41\u1E43\u1E45\u1E47\u1E49\u1E4B\u1E4D\u1E4F\u1E51\u1E53\u1E55\u1E57\u1E59\u1E5B\u1E5D\u1E5F\u1E61\u1E63\u1E65\u1E67\u1E69\u1E6B\u1E6D\u1E6F\u1E71\u1E73\u1E75\u1E77\u1E79\u1E7B\u1E7D\u1E7F\u1E81\u1E83\u1E85\u1E87\u1E89\u1E8B\u1E8D\u1E8F\u1E91\u1E93\u1E95-\u1E9D\u1E9F\u1EA1\u1EA3\u1E
A5\u1EA7\u1EA9\u1EAB\u1EAD\u1EAF\u1EB1\u1EB3\u1EB5\u1EB7\u1EB9\u1EBB\u1EBD\u1EBF\u1EC1\u1EC3\u1EC5\u1EC7\u1EC9\u1ECB\u1ECD\u1ECF\u1ED1\u1ED3\u1ED5\u1ED7\u1ED9\u1EDB\u1EDD\u1EDF\u1EE1\u1EE3\u1EE5\u1EE7\u1EE9\u1EEB\u1EED\u1EEF\u1EF1\u1EF3\u1EF5\u1EF7\u1EF9\u1EFB\u1EFD\u1EFFёа-яәөүҗңһα-ωάέίόώήύа-щюяіїєґѓѕјљњќѐѝ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F%²\-\+…|……|,|:|;|\!|\?|¿|؟|¡|\(|\)|\[|\]|\{|\}|<|>|_|#|\*|&|。|?|!|,|、|;|:|~|·|।|،|۔|؛|٪(?:\'"”“`‘´’‚,„»«「」『』()〔〕【】《》〈〉〈〉⟦⟧)])\.$|(?<=[A-Z\uFF21-\uFF3A\u00C0-\u00D6\u00D8-\u00DE\u0100\u0102\u0104\u0106\u0108\u010A\u010C\u010E\u0110\u0112\u0114\u0116\u0118\u011A\u011C\u011E\u0120\u0122\u0124\u0126\u0128\u012A\u012C\u012E\u0130\u0132\u0134\u0136\u0139\u013B\u013D\u013F\u0141\u0143\u0145\u0147\u014A\u014C\u014E\u0150\u0152\u0154\u0156\u0158\u015A\u015C\u015E\u0160\u0162\u0164\u0166\u0168\u016A\u016C\u016E\u0170\u0172\u0174\u0176\u0178\u0179\u017B\u017D\u0181\u0182\u0184\u0186\u0187\u0189-\u018B\u018E-\u0191\u0193\u0194\u0196-\u0198\u019C\u019D\u019F\u01A0\u01A2\u01A4\u01A6\u01A7\u01A9\u01AC\u01AE\u01AF\u01B1-\u01B3\u01B5\u01B7\u01B8\u01BC\u01C4\u01C7\u01CA\u01CD\u01CF\u01D1\u01D3\u01D5\u01D7\u01D9\u01DB\u01DE\u01E0\u01E2\u01E4\u01E6\u01E8\u01EA\u01EC\u01EE\u01F1\u01F4\u01F6-\u01F8\u01FA\u01FC\u01FE\u0200\u
0202\u0204\u0206\u0208\u020A\u020C\u020E\u0210\u0212\u0214\u0216\u0218\u021A\u021C\u021E\u0220\u0222\u0224\u0226\u0228\u022A\u022C\u022E\u0230\u0232\u023A\u023B\u023D\u023E\u0241\u0243-\u0246\u0248\u024A\u024C\u024E\u2C60\u2C62-\u2C64\u2C67\u2C69\u2C6B\u2C6D-\u2C70\u2C72\u2C75\u2C7E\u2C7F\uA722\uA724\uA726\uA728\uA72A\uA72C\uA72E\uA732\uA734\uA736\uA738\uA73A\uA73C\uA73E\uA740\uA742\uA744\uA746\uA748\uA74A\uA74C\uA74E\uA750\uA752\uA754\uA756\uA758\uA75A\uA75C\uA75E\uA760\uA762\uA764\uA766\uA768\uA76A\uA76C\uA76E\uA779\uA77B\uA77D\uA77E\uA780\uA782\uA784\uA786\uA78B\uA78D\uA790\uA792\uA796\uA798\uA79A\uA79C\uA79E\uA7A0\uA7A2\uA7A4\uA7A6\uA7A8\uA7AA-\uA7AE\uA7B0-\uA7B4\uA7B6\uA7B8\u1E00\u1E02\u1E04\u1E06\u1E08\u1E0A\u1E0C\u1E0E\u1E10\u1E12\u1E14\u1E16\u1E18\u1E1A\u1E1C\u1E1E\u1E20\u1E22\u1E24\u1E26\u1E28\u1E2A\u1E2C\u1E2E\u1E30\u1E32\u1E34\u1E36\u1E38\u1E3A\u1E3C\u1E3E\u1E40\u1E42\u1E44\u1E46\u1E48\u1E4A\u1E4C\u1E4E\u1E50\u1E52\u1E54\u1E56\u1E58\u1E5A\u1E5C\u1E5E\u1E60\u1E62\u1E64\u1E66\u1E68\u1E6A\u1E6C\u1E6E\u1E70\u1E72\u1E74\u1E76\u1E78\u1E7A\u1E7C\u1E7E\u1E80\u1E82\u1E84\u1E86\u1E88\u1E8A\u1E8C\u1E8E\u1E90\u1E92\u1E94\u1E9E\u1EA0\u1EA2\u1EA4\u1EA6\u1EA8\u1EAA\u1EAC\u1EAE\u1EB0\u1EB2\u1EB4\u1EB6\u1EB8\u1EBA\u1EBC\u1EBE\u1EC0\u1EC2\u1EC4\u1EC6\u1EC8\u1ECA\u1ECC\u1ECE\u1ED0\u1ED2\u1ED4\u1ED6\u1ED8\u1EDA\u1EDC\u1EDE\u1EE0\u1EE2\u1EE4\u1EE6\u1EE8\u1EEA\u1EEC\u1EEE\u1EF0\u1EF2\u1EF4\u1EF6\u1EF8\u1EFA\u1EFC\u1EFEЁА-ЯӘӨҮҖҢҺΑ-ΩΆΈΊΌΏΉΎА-ЩЮЯІЇЄҐЃЅЈЉЊЌЀЍ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6D
F\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F][A-Z\uFF21-\uFF3A\u00C0-\u00D6\u00D8-\u00DE\u0100\u0102\u0104\u0106\u0108\u010A\u010C\u010E\u0110\u0112\u0114\u0116\u0118\u011A\u011C\u011E\u0120\u0122\u0124\u0126\u0128\u012A\u012C\u012E\u0130\u0132\u0134\u0136\u0139\u013B\u013D\u013F\u0141\u0143\u0145\u0147\u014A\u014C\u014E\u0150\u0152\u0154\u0156\u0158\u015A\u015C\u015E\u0160\u0162\u0164\u0166\u0168\u016A\u016C\u016E\u0170\u0172\u0174\u0176\u0178\u0179\u017B\u017D\u0181\u0182\u0184\u0186\u0187\u0189-\u018B\u018E-\u0191\u0193\u0194\u0196-\u0198\u019C\u019D\u019F\u01A0\u01A2\u01A4\u01A6\u01A7\u01A9\u01AC\u01AE\u01AF\u01B1-\u01B3\u01B5\u01B7\u01B8\u01BC\u01C4\u01C7\u01CA\u01CD\u01CF\u01D1\u01D3\u01D5\u01D7\u01D9\u01DB\u01DE\u01E0\u01E2\u01E4\u01E6\u01E8\u01EA\u01EC\u01EE\u01F1\u01F4\u01F6-\u01F8\u01FA\u01FC\u01FE\u0200\u0202\u0204\u0206\u0208\u020A\u020C\u020E\u0210\u0212\u0214\u0216\u0218\u021A\u021C\u021E\u0220\u0222\u0224\u0226\u0228\u022A\u022C\u022E\u0230\u0232\u023A\u023B\u023D\u023E\u0241\u0243-\u0246\u0248\u024A\u024C\u024E\u2C60\u2C62-\u2C64\u2C67\u2C69\u2C6B\u2C6D-\u2C70\u2C72\u2C75\u2C7E\u2C7F\uA722\uA724\uA726\uA728\uA72A\uA72C\uA72E\uA732\uA734\uA736\uA738\uA73A\uA73C\uA73E\uA740\uA742\uA744\uA746\uA748\uA74A\uA74C\uA74E\uA750\uA752\uA754\uA756\uA758\uA75A\uA75C\uA75E\uA760\uA762\uA764\uA766\uA768\uA76A\uA76C\uA76E\uA779\uA77B\uA77D\uA77E\uA780\uA782\uA784\uA786\uA78B\uA78D\uA790\uA792\uA796\uA798\uA79A\uA79C\uA79E\uA7A0\uA7A2\uA7A4\uA7A6\uA7A8\uA7AA-\uA7AE\uA7B0-\uA7B4\uA7B6\uA7B8\u1E00\u1E02\u1E04\u1E06\u1E08\u1E0A\u1E0C\u1E0E\u1E10\u1E12\u1E14\u1E16\u1E18\u1E1A\u1E1C\u1E1E\u1E20\u1E22\u1E24\u1E26\u1E28\u1E2A\u1E2C\u1E2E\u1E30\u1E32\u1E34\u1E36\u1E38\u1E3A\u1E3C\u1E3E\u1E40\u1E42\u1E44\u1E46\u1E48\u1E4A\u1E4C\u1E4E\u1E50\u1E52\u1E54\u1E56\u1E58\u1E5A\u1E5C\u1E
5E\u1E60\u1E62\u1E64\u1E66\u1E68\u1E6A\u1E6C\u1E6E\u1E70\u1E72\u1E74\u1E76\u1E78\u1E7A\u1E7C\u1E7E\u1E80\u1E82\u1E84\u1E86\u1E88\u1E8A\u1E8C\u1E8E\u1E90\u1E92\u1E94\u1E9E\u1EA0\u1EA2\u1EA4\u1EA6\u1EA8\u1EAA\u1EAC\u1EAE\u1EB0\u1EB2\u1EB4\u1EB6\u1EB8\u1EBA\u1EBC\u1EBE\u1EC0\u1EC2\u1EC4\u1EC6\u1EC8\u1ECA\u1ECC\u1ECE\u1ED0\u1ED2\u1ED4\u1ED6\u1ED8\u1EDA\u1EDC\u1EDE\u1EE0\u1EE2\u1EE4\u1EE6\u1EE8\u1EEA\u1EEC\u1EEE\u1EF0\u1EF2\u1EF4\u1EF6\u1EF8\u1EFA\u1EFC\u1EFEЁА-ЯӘӨҮҖҢҺΑ-ΩΆΈΊΌΏΉΎА-ЩЮЯІЇЄҐЃЅЈЉЊЌЀЍ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F])\.$�infix_finditer�>�\.\.+|…|[\u00A6\u00A9\u00AE\u00B0\u0482\u058D\u058E\u060E\u060F\u06DE\u06E9\u06FD\u06FE\u07F6\u09FA\u0B70\u0BF3-\u0BF8\u0BFA\u0C7F\u0D4F\u0D79\u0F01-\u0F03\u0F13\u0F15-\u0F17\u0F1A-\u0F1F\u0F34\u0F36\u0F38\u0FBE-\u0FC5\u0FC7-\u0FCC\u0FCE\u0FCF\u0FD5-\u0FD8\u109E\u109F\u1390-\u1399\u1940\u19DE-\u19FF\u1B61-\u1B6A\u1B74-\u1B7C\u2100\u2101\u2103-\u2106\u2108\u2109\u2114\u2116\u2117\u211E-\u2123\u2125\u2127\u2129\u212E\u213A\u213B\u214A\u214C\u214D\u214F\u218A\u218B\u2195-\u2199\u219C-\u219F\u21A1\u21A2\u21A4\u21A5\u21A7-\u21AD\u21AF-\u21CD\u21D0\u21D1\u21D3\u21D5-\u21F3\u2300-\u2307\u230C-\u231F\u2322-\u2328\u232B-\u237B\u237D-\u239A\u23B4-\u23DB\u23E2-\u2426\u2440-\u244A\u249C-\u24E9\u2500-\u25B6\u2
5B8-\u25C0\u25C2-\u25F7\u2600-\u266E\u2670-\u2767\u2794-\u27BF\u2800-\u28FF\u2B00-\u2B2F\u2B45\u2B46\u2B4D-\u2B73\u2B76-\u2B95\u2B98-\u2BC8\u2BCA-\u2BFE\u2CE5-\u2CEA\u2E80-\u2E99\u2E9B-\u2EF3\u2F00-\u2FD5\u2FF0-\u2FFB\u3004\u3012\u3013\u3020\u3036\u3037\u303E\u303F\u3190\u3191\u3196-\u319F\u31C0-\u31E3\u3200-\u321E\u322A-\u3247\u3250\u3260-\u327F\u328A-\u32B0\u32C0-\u32FE\u3300-\u33FF\u4DC0-\u4DFF\uA490-\uA4C6\uA828-\uA82B\uA836\uA837\uA839\uAA77-\uAA79\uFDFD\uFFE4\uFFE8\uFFED\uFFEE\uFFFC\uFFFD\U00010137-\U0001013F\U00010179-\U00010189\U0001018C-\U0001018E\U00010190-\U0001019B\U000101A0\U000101D0-\U000101FC\U00010877\U00010878\U00010AC8\U0001173F\U00016B3C-\U00016B3F\U00016B45\U0001BC9C\U0001D000-\U0001D0F5\U0001D100-\U0001D126\U0001D129-\U0001D164\U0001D16A-\U0001D16C\U0001D183\U0001D184\U0001D18C-\U0001D1A9\U0001D1AE-\U0001D1E8\U0001D200-\U0001D241\U0001D245\U0001D300-\U0001D356\U0001D800-\U0001D9FF\U0001DA37-\U0001DA3A\U0001DA6D-\U0001DA74\U0001DA76-\U0001DA83\U0001DA85\U0001DA86\U0001ECAC\U0001F000-\U0001F02B\U0001F030-\U0001F093\U0001F0A0-\U0001F0AE\U0001F0B1-\U0001F0BF\U0001F0C1-\U0001F0CF\U0001F0D1-\U0001F0F5\U0001F110-\U0001F16B\U0001F170-\U0001F1AC\U0001F1E6-\U0001F202\U0001F210-\U0001F23B\U0001F240-\U0001F248\U0001F250\U0001F251\U0001F260-\U0001F265\U0001F300-\U0001F3FA\U0001F400-\U0001F6D4\U0001F6E0-\U0001F6EC\U0001F6F0-\U0001F6F9\U0001F700-\U0001F773\U0001F780-\U0001F7D8\U0001F800-\U0001F80B\U0001F810-\U0001F847\U0001F850-\U0001F859\U0001F860-\U0001F887\U0001F890-\U0001F8AD\U0001F900-\U0001F90B\U0001F910-\U0001F93E\U0001F940-\U0001F970\U0001F973-\U0001F976\U0001F97A\U0001F97C-\U0001F9A2\U0001F9B0-\U0001F9B9\U0001F9C0-\U0001F9C2\U0001F9D0-\U0001F9FF\U0001FA60-\U0001FA6D]|(?<=[0-9])[+\-\*^](?=[0-9-])|(?<=[a-z\uFF41-\uFF5A\u00DF-\u00F6\u00F8-\u00FF\u0101\u0103\u0105\u0107\u0109\u010B\u010D\u010F\u0111\u0113\u0115\u0117\u0119\u011B\u011D\u011F\u0121\u0123\u0125\u0127\u0129\u012B\u012D\u012F\u0131\u0133\u0135\u0137\u0138\u013A\u013C\u013E\u0140\u0142\u0144\u0
146\u0148\u0149\u014B\u014D\u014F\u0151\u0153\u0155\u0157\u0159\u015B\u015D\u015F\u0161\u0163\u0165\u0167\u0169\u016B\u016D\u016F\u0171\u0173\u0175\u0177\u017A\u017C\u017E\u017F\u0180\u0183\u0185\u0188\u018C\u018D\u0192\u0195\u0199-\u019B\u019E\u01A1\u01A3\u01A5\u01A8\u01AA\u01AB\u01AD\u01B0\u01B4\u01B6\u01B9\u01BA\u01BD-\u01BF\u01C6\u01C9\u01CC\u01CE\u01D0\u01D2\u01D4\u01D6\u01D8\u01DA\u01DC\u01DD\u01DF\u01E1\u01E3\u01E5\u01E7\u01E9\u01EB\u01ED\u01EF\u01F0\u01F3\u01F5\u01F9\u01FB\u01FD\u01FF\u0201\u0203\u0205\u0207\u0209\u020B\u020D\u020F\u0211\u0213\u0215\u0217\u0219\u021B\u021D\u021F\u0221\u0223\u0225\u0227\u0229\u022B\u022D\u022F\u0231\u0233-\u0239\u023C\u023F\u0240\u0242\u0247\u0249\u024B\u024D\u024F\u2C61\u2C65\u2C66\u2C68\u2C6A\u2C6C\u2C71\u2C73\u2C74\u2C76-\u2C7B\uA723\uA725\uA727\uA729\uA72B\uA72D\uA72F-\uA731\uA733\uA735\uA737\uA739\uA73B\uA73D\uA73F\uA741\uA743\uA745\uA747\uA749\uA74B\uA74D\uA74F\uA751\uA753\uA755\uA757\uA759\uA75B\uA75D\uA75F\uA761\uA763\uA765\uA767\uA769\uA76B\uA76D\uA76F\uA771-\uA778\uA77A\uA77C\uA77F\uA781\uA783\uA785\uA787\uA78C\uA78E\uA791\uA793-\uA795\uA797\uA799\uA79B\uA79D\uA79F\uA7A1\uA7A3\uA7A5\uA7A7\uA7A9\uA7AF\uA7B5\uA7B7\uA7B9\uA7FA\uAB30-\uAB5A\uAB60-\uAB64\u0250-\u02AF\u1D00-\u1D25\u1D6B-\u1D77\u1D79-\u1D9A\u1E01\u1E03\u1E05\u1E07\u1E09\u1E0B\u1E0D\u1E0F\u1E11\u1E13\u1E15\u1E17\u1E19\u1E1B\u1E1D\u1E1F\u1E21\u1E23\u1E25\u1E27\u1E29\u1E2B\u1E2D\u1E2F\u1E31\u1E33\u1E35\u1E37\u1E39\u1E3B\u1E3D\u1E3F\u1E41\u1E43\u1E45\u1E47\u1E49\u1E4B\u1E4D\u1E4F\u1E51\u1E53\u1E55\u1E57\u1E59\u1E5B\u1E5D\u1E5F\u1E61\u1E63\u1E65\u1E67\u1E69\u1E6B\u1E6D\u1E6F\u1E71\u1E73\u1E75\u1E77\u1E79\u1E7B\u1E7D\u1E7F\u1E81\u1E83\u1E85\u1E87\u1E89\u1E8B\u1E8D\u1E8F\u1E91\u1E93\u1E95-\u1E9D\u1E9F\u1EA1\u1EA3\u1EA5\u1EA7\u1EA9\u1EAB\u1EAD\u1EAF\u1EB1\u1EB3\u1EB5\u1EB7\u1EB9\u1EBB\u1EBD\u1EBF\u1EC1\u1EC3\u1EC5\u1EC7\u1EC9\u1ECB\u1ECD\u1ECF\u1ED1\u1ED3\u1ED5\u1ED7\u1ED9\u1EDB\u1EDD\u1EDF\u1EE1\u1EE3\u1EE5\u1EE7\u1EE9\u1EEB\u1EED\u1EEF\u1EF1\u1EF3\u1EF5\u1EF7\u1
EF9\u1EFB\u1EFD\u1EFFёа-яәөүҗңһα-ωάέίόώήύа-щюяіїєґѓѕјљњќѐѝ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F\'"”“`‘´’‚,„»«「」『』()〔〕【】《》〈〉〈〉⟦⟧])\.(?=[A-Z\uFF21-\uFF3A\u00C0-\u00D6\u00D8-\u00DE\u0100\u0102\u0104\u0106\u0108\u010A\u010C\u010E\u0110\u0112\u0114\u0116\u0118\u011A\u011C\u011E\u0120\u0122\u0124\u0126\u0128\u012A\u012C\u012E\u0130\u0132\u0134\u0136\u0139\u013B\u013D\u013F\u0141\u0143\u0145\u0147\u014A\u014C\u014E\u0150\u0152\u0154\u0156\u0158\u015A\u015C\u015E\u0160\u0162\u0164\u0166\u0168\u016A\u016C\u016E\u0170\u0172\u0174\u0176\u0178\u0179\u017B\u017D\u0181\u0182\u0184\u0186\u0187\u0189-\u018B\u018E-\u0191\u0193\u0194\u0196-\u0198\u019C\u019D\u019F\u01A0\u01A2\u01A4\u01A6\u01A7\u01A9\u01AC\u01AE\u01AF\u01B1-\u01B3\u01B5\u01B7\u01B8\u01BC\u01C4\u01C7\u01CA\u01CD\u01CF\u01D1\u01D3\u01D5\u01D7\u01D9\u01DB\u01DE\u01E0\u01E2\u01E4\u01E6\u01E8\u01EA\u01EC\u01EE\u01F1\u01F4\u01F6-\u01F8\u01FA\u01FC\u01FE\u0200\u0202\u0204\u0206\u0208\u020A\u020C\u020E\u0210\u0212\u0214\u0216\u0218\u021A\u021C\u021E\u0220\u0222\u0224\u0226\u0228\u022A\u022C\u022E\u0230\u0232\u023A\u023B\u023D\u023E\u0241\u0243-\u0246\u0248\u024A\u024C\u024E\u2C60\u2C62-\u2C64\u2C67\u2C69\u2C6B\u2C6D-\u2C70\u2C72\u2C75\u2C7E\u2C7F\uA722\uA724\uA726\uA728\uA72A\uA72C\uA72E\uA732\uA734\u
A736\uA738\uA73A\uA73C\uA73E\uA740\uA742\uA744\uA746\uA748\uA74A\uA74C\uA74E\uA750\uA752\uA754\uA756\uA758\uA75A\uA75C\uA75E\uA760\uA762\uA764\uA766\uA768\uA76A\uA76C\uA76E\uA779\uA77B\uA77D\uA77E\uA780\uA782\uA784\uA786\uA78B\uA78D\uA790\uA792\uA796\uA798\uA79A\uA79C\uA79E\uA7A0\uA7A2\uA7A4\uA7A6\uA7A8\uA7AA-\uA7AE\uA7B0-\uA7B4\uA7B6\uA7B8\u1E00\u1E02\u1E04\u1E06\u1E08\u1E0A\u1E0C\u1E0E\u1E10\u1E12\u1E14\u1E16\u1E18\u1E1A\u1E1C\u1E1E\u1E20\u1E22\u1E24\u1E26\u1E28\u1E2A\u1E2C\u1E2E\u1E30\u1E32\u1E34\u1E36\u1E38\u1E3A\u1E3C\u1E3E\u1E40\u1E42\u1E44\u1E46\u1E48\u1E4A\u1E4C\u1E4E\u1E50\u1E52\u1E54\u1E56\u1E58\u1E5A\u1E5C\u1E5E\u1E60\u1E62\u1E64\u1E66\u1E68\u1E6A\u1E6C\u1E6E\u1E70\u1E72\u1E74\u1E76\u1E78\u1E7A\u1E7C\u1E7E\u1E80\u1E82\u1E84\u1E86\u1E88\u1E8A\u1E8C\u1E8E\u1E90\u1E92\u1E94\u1E9E\u1EA0\u1EA2\u1EA4\u1EA6\u1EA8\u1EAA\u1EAC\u1EAE\u1EB0\u1EB2\u1EB4\u1EB6\u1EB8\u1EBA\u1EBC\u1EBE\u1EC0\u1EC2\u1EC4\u1EC6\u1EC8\u1ECA\u1ECC\u1ECE\u1ED0\u1ED2\u1ED4\u1ED6\u1ED8\u1EDA\u1EDC\u1EDE\u1EE0\u1EE2\u1EE4\u1EE6\u1EE8\u1EEA\u1EEC\u1EEE\u1EF0\u1EF2\u1EF4\u1EF6\u1EF8\u1EFA\u1EFC\u1EFEЁА-ЯӘӨҮҖҢҺΑ-ΩΆΈΊΌΏΉΎА-ЩЮЯІЇЄҐЃЅЈЉЊЌЀЍ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F\'"”“`‘´’‚,„»«「」『』()〔〕【】《》〈〉〈〉⟦⟧])|(?<=[A-Za-z\uFF21-\uFF3A\uFF41-\uFF5A\u00C0-\u00D6\u00D8-\u00F6\u0
0F8-\u00FF\u0100-\u017F\u0180-\u01BF\u01C4-\u024F\u2C60-\u2C7B\u2C7E\u2C7F\uA722-\uA76F\uA771-\uA787\uA78B-\uA78E\uA790-\uA7B9\uA7FA\uAB30-\uAB5A\uAB60-\uAB64\u0250-\u02AF\u1D00-\u1D25\u1D6B-\u1D77\u1D79-\u1D9A\u1E00-\u1EFFёа-яЁА-ЯәөүҗңһӘӨҮҖҢҺα-ωάέίόώήύΑ-ΩΆΈΊΌΏΉΎа-щюяіїєґА-ЩЮЯІЇЄҐѓѕјљњќѐѝЃЅЈЉЊЌЀЍ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F]),(?=[A-Za-z\uFF21-\uFF3A\uFF41-\uFF5A\u00C0-\u00D6\u00D8-\u00F6\u00F8-\u00FF\u0100-\u017F\u0180-\u01BF\u01C4-\u024F\u2C60-\u2C7B\u2C7E\u2C7F\uA722-\uA76F\uA771-\uA787\uA78B-\uA78E\uA790-\uA7B9\uA7FA\uAB30-\uAB5A\uAB60-\uAB64\u0250-\u02AF\u1D00-\u1D25\u1D6B-\u1D77\u1D79-\u1D9A\u1E00-\u1EFFёа-яЁА-ЯәөүҗңһӘӨҮҖҢҺα-ωάέίόώήύΑ-ΩΆΈΊΌΏΉΎа-щюяіїєґА-ЩЮЯІЇЄҐѓѕјљњќѐѝЃЅЈЉЊЌЀЍ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B7
40-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F])|(?<=[A-Za-z\uFF21-\uFF3A\uFF41-\uFF5A\u00C0-\u00D6\u00D8-\u00F6\u00F8-\u00FF\u0100-\u017F\u0180-\u01BF\u01C4-\u024F\u2C60-\u2C7B\u2C7E\u2C7F\uA722-\uA76F\uA771-\uA787\uA78B-\uA78E\uA790-\uA7B9\uA7FA\uAB30-\uAB5A\uAB60-\uAB64\u0250-\u02AF\u1D00-\u1D25\u1D6B-\u1D77\u1D79-\u1D9A\u1E00-\u1EFFёа-яЁА-ЯәөүҗңһӘӨҮҖҢҺα-ωάέίόώήύΑ-ΩΆΈΊΌΏΉΎа-щюяіїєґА-ЩЮЯІЇЄҐѓѕјљњќѐѝЃЅЈЉЊЌЀЍ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F0-9])(?:-|–|—|--|---|——|~)(?=[A-Za-z\uFF21-\uFF3A\uFF41-\uFF5A\u00C0-\u00D6\u00D8-\u00F6\u00F8-\u00FF\u0100-\u017F\u0180-\u01BF\u01C4-\u024F\u2C60-\u2C7B\u2C7E\u2C7F\uA722-\uA76F\uA771-\uA787\uA78B-\uA78E\uA790-\uA7B9\uA7FA\uAB30-\uAB5A\uAB60-\uAB64\u0250-\u02AF\u1D00-\u1D25\u1D6B-\u1D77\u1D79-\u1D9A\u1E00-\u1EFFёа-яЁА-ЯәөүҗңһӘӨҮҖҢҺα-ωάέίόώήύΑ-ΩΆΈΊΌΏΉΎа-щюяіїєґА-ЩЮЯІЇЄҐѓѕјљњќѐѝЃЅЈЉЊЌЀЍ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\
u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F])|(?<=[A-Za-z\uFF21-\uFF3A\uFF41-\uFF5A\u00C0-\u00D6\u00D8-\u00F6\u00F8-\u00FF\u0100-\u017F\u0180-\u01BF\u01C4-\u024F\u2C60-\u2C7B\u2C7E\u2C7F\uA722-\uA76F\uA771-\uA787\uA78B-\uA78E\uA790-\uA7B9\uA7FA\uAB30-\uAB5A\uAB60-\uAB64\u0250-\u02AF\u1D00-\u1D25\u1D6B-\u1D77\u1D79-\u1D9A\u1E00-\u1EFFёа-яЁА-ЯәөүҗңһӘӨҮҖҢҺα-ωάέίόώήύΑ-ΩΆΈΊΌΏΉΎа-щюяіїєґА-ЩЮЯІЇЄҐѓѕјљњќѐѝЃЅЈЉЊЌЀЍ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F0-9])[:<>=/](?=[A-Za-z\uFF21-\uFF3A\uFF41-\uFF5A\u00C0-\u00D6\u00D8-\u00F6\u00F8-\u00FF\u0100-\u017F\u0180-\u01BF\u01C4-\u024F\u2C60-\u2C7B\u2C7E\u2C7F\uA722-\uA76F\uA771-\uA787\uA78B-\uA78E\uA790-\uA7B9\uA7FA\uAB30-\uAB5A\uAB60-\uAB64\u0250-\u02AF\u1D00-\u1D25\u1D6B-\u1D77\u1D79-\u1D9A\u1E00-\u1EFFёа-яЁА-ЯәөүҗңһӘӨҮҖҢҺα-ωάέίόώήύΑ-ΩΆΈΊΌΏ
ΉΎа-щюяіїєґА-ЩЮЯІЇЄҐѓѕјљњќѐѝЃЅЈЉЊЌЀЍ\u1200-\u137F\u0980-\u09FF\u0591-\u05F4\uFB1D-\uFB4F\u0620-\u064A\u066E-\u06D5\u06E5-\u06FF\u0750-\u077F\u08A0-\u08BD\uFB50-\uFBB1\uFBD3-\uFD3D\uFD50-\uFDC7\uFDF0-\uFDFB\uFE70-\uFEFC\U0001EE00-\U0001EEBB\u0D80-\u0DFF\u0900-\u097F\u0C80-\u0CFF\u0B80-\u0BFF\u0C00-\u0C7F\uAC00-\uD7AF\u1100-\u11FF\u3040-\u309F\u30A0-\u30FFー\u4E00-\u62FF\u6300-\u77FF\u7800-\u8CFF\u8D00-\u9FFF\u3400-\u4DBF\U00020000-\U000215FF\U00021600-\U000230FF\U00023100-\U000245FF\U00024600-\U000260FF\U00026100-\U000275FF\U00027600-\U000290FF\U00029100-\U0002A6DF\U0002A700-\U0002B73F\U0002B740-\U0002B81F\U0002B820-\U0002CEAF\U0002CEB0-\U0002EBEF\u2E80-\u2EFF\u2F00-\u2FDF\u2FF0-\u2FFF\u3000-\u303F\u31C0-\u31EF\u3200-\u32FF\u3300-\u33FF\uF900-\uFAFF\uFE30-\uFE4F\U0001F200-\U0001F2FF\U0002F800-\U0002FA1F])�token_match��url_match�
2
+ ��A�
3
+ � ��A� �'��A�'�''��A�''�'Cause��A�'CauseC�because�'Cos��A�'CosC�because�'Coz��A�'CozC�because�'Cuz��A�'CuzC�because�'S��A�'SC�'s�'bout��A�'boutC�about�'cause��A�'causeC�because�'cos��A�'cosC�because�'coz��A�'cozC�because�'cuz��A�'cuzC�because�'d��A�'d�'em��A�'emC�them�'ll��A�'llC�will�'nuff��A�'nuffC�enough�'re��A�'reC�are�'s��A�'sC�'s�(*_*)��A�(*_*)�(-8��A�(-8�(-:��A�(-:�(-;��A�(-;�(-_-)��A�(-_-)�(._.)��A�(._.)�(:��A�(:�(;��A�(;�(=��A�(=�(>_<)��A�(>_<)�(^_^)��A�(^_^)�(o:��A�(o:�(¬_¬)��A�(¬_¬)�(ಠ_ಠ)��A�(ಠ_ಠ)�(╯°□°)╯︵┻━┻��A�(╯°□°)╯︵┻━┻�)-:��A�)-:�):��A�):�-_-��A�-_-�-__-��A�-__-�._.��A�._.�0.0��A�0.0�0.o��A�0.o�0_0��A�0_0�0_o��A�0_o�10a.m.��A�10�A�a.m.C�a.m.�10am��A�10�A�amC�a.m.�10p.m.��A�10�A�p.m.C�p.m.�10pm��A�10�A�pmC�p.m.�11a.m.��A�11�A�a.m.C�a.m.�11am��A�11�A�amC�a.m.�11p.m.��A�11�A�p.m.C�p.m.�11pm��A�11�A�pmC�p.m.�12a.m.��A�12�A�a.m.C�a.m.�12am��A�12�A�amC�a.m.�12p.m.��A�12�A�p.m.C�p.m.�12pm��A�12�A�pmC�p.m.�1a.m.��A�1�A�a.m.C�a.m.�1am��A�1�A�amC�a.m.�1p.m.��A�1�A�p.m.C�p.m.�1pm��A�1�A�pmC�p.m.�2a.m.��A�2�A�a.m.C�a.m.�2am��A�2�A�amC�a.m.�2p.m.��A�2�A�p.m.C�p.m.�2pm��A�2�A�pmC�p.m.�3a.m.��A�3�A�a.m.C�a.m.�3am��A�3�A�amC�a.m.�3p.m.��A�3�A�p.m.C�p.m.�3pm��A�3�A�pmC�p.m.�4a.m.��A�4�A�a.m.C�a.m.�4am��A�4�A�amC�a.m.�4p.m.��A�4�A�p.m.C�p.m.�4pm��A�4�A�pmC�p.m.�5a.m.��A�5�A�a.m.C�a.m.�5am��A�5�A�amC�a.m.�5p.m.��A�5�A�p.m.C�p.m.�5pm��A�5�A�pmC�p.m.�6a.m.��A�6�A�a.m.C�a.m.�6am��A�6�A�amC�a.m.�6p.m.��A�6�A�p.m.C�p.m.�6pm��A�6�A�pmC�p.m.�7a.m.��A�7�A�a.m.C�a.m.�7am��A�7�A�amC�a.m.�7p.m.��A�7�A�p.m.C�p.m.�7pm��A�7�A�pmC�p.m.�8)��A�8)�8-)��A�8-)�8-D��A�8-D�8D��A�8D�8a.m.��A�8�A�a.m.C�a.m.�8am��A�8�A�amC�a.m.�8p.m.��A�8�A�p.m.C�p.m.�8pm��A�8�A�pmC�p.m.�9a.m.��A�9�A�a.m.C�a.m.�9am��A�9�A�amC�a.m.�9p.m.��A�9�A�p.m.C�p.m.�9pm��A�9�A�pmC�p.m.�:'(��A�:'(�:')��A�:')�:'-(��A�:'-(�:'-)��A�:'-)�:(��A�:(�:((��A�:((�:(((��A�:(((�:()��A�:()�:)��A�:)�:))��A�:))�:)))��A�:)))�:*��A�:*�:-(��A�:-(�:-((��A�:-((�:-(((��A�:-(((�:-)��A�:-)�:-))��A�:-))�:-)))��A�:-)))�:-*��A�:-*�:-/��A�:-/�:-0�
�A�:-0�:-3��A�:-3�:->��A�:->�:-D��A�:-D�:-O��A�:-O�:-P��A�:-P�:-X��A�:-X�:-]��A�:-]�:-o��A�:-o�:-p��A�:-p�:-x��A�:-x�:-|��A�:-|�:-}��A�:-}�:/��A�:/�:0��A�:0�:1��A�:1�:3��A�:3�:>��A�:>�:D��A�:D�:O��A�:O�:P��A�:P�:X��A�:X�:]��A�:]�:o��A�:o�:o)��A�:o)�:p��A�:p�:x��A�:x�:|��A�:|�:}��A�:}�:’(��A�:’(�:’)��A�:’)�:’-(��A�:’-(�:’-)��A�:’-)�;)��A�;)�;-)��A�;-)�;-D��A�;-D�;D��A�;D�;_;��A�;_;�<.<��A�<.<�</3��A�</3�<3��A�<3�<33��A�<33�<333��A�<333�<space>��A�<space>�=(��A�=(�=)��A�=)�=/��A�=/�=3��A�=3�=D��A�=D�=[��A�=[�=]��A�=]�=|��A�=|�>.<��A�>.<�>.>��A�>.>�>:(��A�>:(�>:o��A�>:o�><(((*>��A�><(((*>�@_@��A�@_@�Adm.��A�Adm.�Ain't��A�Ai�A�n'tC�not�Aint��A�Ai�A�ntC�not�Ain’t��A�Ai�A�n’tC�not�Ak.��A�Ak.C�Alaska�Ala.��A�Ala.C�Alabama�Apr.��A�Apr.C�April�Aren't��A�AreC�are�A�n'tC�not�Arent��A�AreC�are�A�ntC�not�Aren’t��A�AreC�are�A�n’tC�not�Ariz.��A�Ariz.C�Arizona�Ark.��A�Ark.C�Arkansas�Aug.��A�Aug.C�August�Bros.��A�Bros.�C'mon��A�C'mC�come�A�on�C++��A�C++�Calif.��A�Calif.C�California�Can't��A�CaC�can�A�n'tC�not�Can't've��A�CaC�can�A�n'tC�not�A�'veC�have�Cannot��A�CanC�can�A�not�Cant��A�CaC�can�A�ntC�not�Cantve��A�CaC�can�A�ntC�not�A�veC�have�Can’t��A�CaC�can�A�n’tC�not�Can’t’ve��A�CaC�can�A�n’tC�not�A�’veC�have�Co.��A�Co.�Colo.��A�Colo.C�Colorado�Conn.��A�Conn.C�Connecticut�Corp.��A�Corp.�Could've��A�CouldC�could�A�'ve�Couldn't��A�CouldC�could�A�n'tC�not�Couldn't've��A�CouldC�could�A�n'tC�not�A�'veC�have�Couldnt��A�CouldC�could�A�ntC�not�Couldntve��A�CouldC�could�A�ntC�not�A�veC�have�Couldn’t��A�CouldC�could�A�n’tC�not�Couldn’t’ve��A�CouldC�could�A�n’tC�not�A�’veC�have�Couldve��A�CouldC�could�A�ve�Could’ve��A�CouldC�could�A�’ve�C’mon��A�C’mC�come�A�on�D.C.��A�D.C.�Daren't��A�DareC�dare�A�n'tC�not�Darent��A�DareC�dare�A�ntC�not�Daren’t��A�DareC�dare�A�n’tC�not�Dec.��A�Dec.C�December�Del.��A�Del.C�Delaware�Didn't��A�DidC�do�A�n'tC�not�Didn't've��A�DidC�do�A�n'tC�not�A�'veC�have�Didnt��A�DidC�do�A�ntC�not�Didntve��A�DidC�do�A�ntC�not�A�veC�have�Didn’t��A�DidC�do�A�n’tC�not�Didn’t’ve��A�D
idC�do�A�n’tC�not�A�’veC�have�Doesn't��A�DoesC�does�A�n'tC�not�Doesn't've��A�DoesC�does�A�n'tC�not�A�'veC�have�Doesnt��A�DoesC�does�A�ntC�not�Doesntve��A�DoesC�does�A�ntC�not�A�veC�have�Doesn’t��A�DoesC�does�A�n’tC�not�Doesn’t’ve��A�DoesC�does�A�n’tC�not�A�’veC�have�Doin��A�DoinC�doing�Doin'��A�Doin'C�doing�Doin’��A�Doin’C�doing�Don't��A�DoC�do�A�n'tC�not�Don't've��A�DoC�do�A�n'tC�not�A�'veC�have�Dont��A�DoC�do�A�ntC�not�Dontve��A�DoC�do�A�ntC�not�A�veC�have�Don’t��A�DoC�do�A�n’tC�not�Don’t’ve��A�DoC�do�A�n’tC�not�A�’veC�have�Dr.��A�Dr.�E.G.��A�E.G.�E.g.��A�E.g.�Feb.��A�Feb.C�February�Fla.��A�Fla.C�Florida�Ga.��A�Ga.C�Georgia�Gen.��A�Gen.�Goin��A�GoinC�going�Goin'��A�Goin'C�going�Goin’��A�Goin’C�going�Gonna��A�GonC�going�A�naC�to�Gotta��A�GotC�got�A�taC�to�Gov.��A�Gov.�Hadn't��A�HadC�have�A�n'tC�not�Hadn't've��A�HadC�have�A�n'tC�not�A�'veC�have�Hadnt��A�HadC�have�A�ntC�not�Hadntve��A�HadC�have�A�ntC�not�A�veC�have�Hadn’t��A�HadC�have�A�n’tC�not�Hadn’t’ve��A�HadC�have�A�n’tC�not�A�’veC�have�Hasn't��A�HasC�has�A�n'tC�not�Hasnt��A�HasC�has�A�ntC�not�Hasn’t��A�HasC�has�A�n’tC�not�Haven't��A�HaveC�have�A�n'tC�not�Havent��A�HaveC�have�A�ntC�not�Haven’t��A�HaveC�have�A�n’tC�not�Havin��A�HavinC�having�Havin'��A�Havin'C�having�Havin’��A�Havin’C�having�He'd��A�HeC�he�A�'dC�'d�He'd've��A�HeC�he�A�'dC�would�A�'veC�have�He'll��A�HeC�he�A�'llC�will�He'll've��A�HeC�he�A�'llC�will�A�'veC�have�He's��A�HeC�he�A�'sC�'s�Hed��A�HeC�he�A�dC�'d�Hedve��A�HeC�he�A�dC�would�A�veC�have�Hellve��A�HeC�he�A�llC�will�A�veC�have�Hes��A�HeC�he�A�s�He’d��A�HeC�he�A�’dC�'d�He’d’ve��A�HeC�he�A�’dC�would�A�’veC�have�He’ll��A�HeC�he�A�’llC�will�He’ll’ve��A�HeC�he�A�’llC�will�A�’veC�have�He’s��A�HeC�he�A�’sC�'s�How'd��A�HowC�how�A�'dC�'d�How'd've��A�HowC�how�A�'dC�would�A�'veC�have�How'd'y��A�HowC�how�A�'d�A�'yC�you�How'll��A�HowC�how�A�'llC�will�How'll've��A�HowC�how�A�'llC�will�A�'veC�have�How're��A�HowC�how�A�'reC�are�How's��A�HowC�how�A�'sC�'s�How've��A�HowC�how�A�'ve�Howd��A�HowC�how�A�dC�'d�Howdve�
�A�HowC�how�A�dC�would�A�veC�have�Howll��A�HowC�how�A�llC�will�Howllve��A�HowC�how�A�llC�will�A�veC�have�Howre��A�HowC�how�A�reC�are�Hows��A�HowC�how�A�s�Howve��A�How�A�veC�have�How’d��A�HowC�how�A�’dC�'d�How’d’ve��A�HowC�how�A�’dC�would�A�’veC�have�How’d’y��A�HowC�how�A�’d�A�’yC�you�How’ll��A�HowC�how�A�’llC�will�How’ll’ve��A�HowC�how�A�’llC�will�A�’veC�have�How’re��A�HowC�how�A�’reC�are�How’s��A�HowC�how�A�’sC�'s�How’ve��A�HowC�how�A�’ve�I'd��A�IC�i�A�'dC�'d�I'd've��A�IC�i�A�'dC�would�A�'veC�have�I'll��A�IC�i�A�'llC�will�I'll've��A�IC�i�A�'llC�will�A�'veC�have�I'm��A�IC�i�A�'mC�am�I'ma��A�IC�i�A�'mC�am�A�aC�gonna�I've��A�IC�i�A�'veC�have�I.E.��A�I.E.�I.e.��A�I.e.�Ia.��A�Ia.C�Iowa�Id��A�IC�i�A�dC�'d�Id.��A�Id.C�Idaho�Idve��A�IC�i�A�dC�would�A�veC�have�Ill.��A�Ill.C�Illinois�Illve��A�IC�i�A�llC�will�A�veC�have�Im��A�IC�i�A�m�Ima��A�IC�i�A�mC�am�A�aC�gonna�Inc.��A�Inc.�Ind.��A�Ind.C�Indiana�Isn't��A�IsC�is�A�n'tC�not�Isnt��A�IsC�is�A�ntC�not�Isn’t��A�IsC�is�A�n’tC�not�It'd��A�ItC�it�A�'dC�'d�It'd've��A�ItC�it�A�'dC�would�A�'veC�have�It'll��A�ItC�it�A�'llC�will�It'll've��A�ItC�it�A�'llC�will�A�'veC�have�It's��A�ItC�it�A�'sC�'s�Itd��A�ItC�it�A�dC�'d�Itdve��A�ItC�it�A�dC�would�A�veC�have�Itll��A�ItC�it�A�llC�will�Itllve��A�ItC�it�A�llC�will�A�veC�have�It’d��A�ItC�it�A�’dC�'d�It’d’ve��A�ItC�it�A�’dC�would�A�’veC�have�It’ll��A�ItC�it�A�’llC�will�It’ll’ve��A�ItC�it�A�’llC�will�A�’veC�have�It’s��A�ItC�it�A�’sC�'s�Ive��A�IC�i�A�veC�have�I’d��A�IC�i�A�’dC�'d�I’d’ve��A�IC�i�A�’dC�would�A�’veC�have�I’ll��A�IC�i�A�’llC�will�I’ll’ve��A�IC�i�A�’llC�will�A�’veC�have�I’m��A�IC�i�A�’mC�am�I’ma��A�IC�i�A�’mC�am�A�aC�gonna�I’ve��A�IC�i�A�’veC�have�Jan.��A�Jan.C�January�Jr.��A�Jr.�Jul.��A�Jul.C�July�Jun.��A�Jun.C�June�Kan.��A�Kan.C�Kansas�Kans.��A�Kans.C�Kansas�Ky.��A�Ky.C�Kentucky�La.��A�La.C�Louisiana�Let's��A�LetC�let�A�'sC�us�Let’s��A�LetC�let�A�’sC�us�Lovin��A�LovinC�loving�Lovin'��A�Lovin'C�loving�Lovin’��A�Lovin’C�loving�Ltd.��A�Ltd.�Ma'am��A�Ma'amC�madam�Mar.��A�Mar.C�March�Mass
.��A�Mass.C�Massachusetts�Mayn't��A�MayC�may�A�n'tC�not�Mayn't've��A�MayC�may�A�n'tC�not�A�'veC�have�Maynt��A�MayC�may�A�ntC�not�Mayntve��A�MayC�may�A�ntC�not�A�veC�have�Mayn’t��A�MayC�may�A�n’tC�not�Mayn’t’ve��A�MayC�may�A�n’tC�not�A�’veC�have�Ma’am��A�Ma’amC�madam�Md.��A�Md.�Messrs.��A�Messrs.�Mich.��A�Mich.C�Michigan�Might've��A�MightC�might�A�'ve�Mightn't��A�MightC�might�A�n'tC�not�Mightn't've��A�MightC�might�A�n'tC�not�A�'veC�have�Mightnt��A�MightC�might�A�ntC�not�Mightntve��A�MightC�might�A�ntC�not�A�veC�have�Mightn’t��A�MightC�might�A�n’tC�not�Mightn’t’ve��A�MightC�might�A�n’tC�not�A�’veC�have�Mightve��A�MightC�might�A�ve�Might’ve��A�MightC�might�A�’ve�Minn.��A�Minn.C�Minnesota�Miss.��A�Miss.C�Mississippi�Mo.��A�Mo.�Mont.��A�Mont.�Mr.��A�Mr.�Mrs.��A�Mrs.�Ms.��A�Ms.�Mt.��A�Mt.C�Mount�Must've��A�MustC�must�A�'ve�Mustn't��A�MustC�must�A�n'tC�not�Mustn't've��A�MustC�must�A�n'tC�not�A�'veC�have�Mustnt��A�MustC�must�A�ntC�not�Mustntve��A�MustC�must�A�ntC�not�A�veC�have�Mustn’t��A�MustC�must�A�n’tC�not�Mustn’t’ve��A�MustC�must�A�n’tC�not�A�’veC�have�Mustve��A�MustC�must�A�ve�Must’ve��A�MustC�must�A�’ve�N.C.��A�N.C.C�North Carolina�N.D.��A�N.D.C�North Dakota�N.H.��A�N.H.C�New Hampshire�N.J.��A�N.J.C�New Jersey�N.M.��A�N.M.C�New Mexico�N.Y.��A�N.Y.C�New 
York�Neb.��A�Neb.C�Nebraska�Nebr.��A�Nebr.C�Nebraska�Needn't��A�NeedC�need�A�n'tC�not�Needn't've��A�NeedC�need�A�n'tC�not�A�'veC�have�Neednt��A�NeedC�need�A�ntC�not�Needntve��A�NeedC�need�A�ntC�not�A�veC�have�Needn’t��A�NeedC�need�A�n’tC�not�Needn’t’ve��A�NeedC�need�A�n’tC�not�A�’veC�have�Nev.��A�Nev.C�Nevada�Not've��A�NotC�not�A�'veC�have�Nothin��A�NothinC�nothing�Nothin'��A�Nothin'C�nothing�Nothin’��A�Nothin’C�nothing�Notve��A�NotC�not�A�veC�have�Not’ve��A�NotC�not�A�’veC�have�Nov.��A�Nov.C�November�Nuthin��A�NuthinC�nothing�Nuthin'��A�Nuthin'C�nothing�Nuthin’��A�Nuthin’C�nothing�O'clock��A�O'clockC�o'clock�O.O��A�O.O�O.o��A�O.o�O_O��A�O_O�O_o��A�O_o�Oct.��A�Oct.C�October�Okla.��A�Okla.C�Oklahoma�Ol��A�OlC�old�Ol'��A�Ol'C�old�Ol’��A�Ol’C�old�Ore.��A�Ore.C�Oregon�Oughtn't��A�OughtC�ought�A�n'tC�not�Oughtn't've��A�OughtC�ought�A�n'tC�not�A�'veC�have�Oughtnt��A�OughtC�ought�A�ntC�not�Oughtntve��A�OughtC�ought�A�ntC�not�A�veC�have�Oughtn’t��A�OughtC�ought�A�n’tC�not�Oughtn’t’ve��A�OughtC�ought�A�n’tC�not�A�’veC�have�O’clock��A�O’clockC�o'clock�Pa.��A�Pa.C�Pennsylvania�Ph.D.��A�Ph.D.�Prof.��A�Prof.�Rep.��A�Rep.�Rev.��A�Rev.�S.C.��A�S.C.C�South 
Carolina�Sen.��A�Sen.�Sep.��A�Sep.C�September�Sept.��A�Sept.C�September�Shan't��A�ShaC�shall�A�n'tC�not�Shan't've��A�ShaC�shall�A�n'tC�not�A�'veC�have�Shant��A�ShaC�shall�A�ntC�not�Shantve��A�ShaC�shall�A�ntC�not�A�veC�have�Shan’t��A�ShaC�shall�A�n’tC�not�Shan’t’ve��A�ShaC�shall�A�n’tC�not�A�’veC�have�She'd��A�SheC�she�A�'dC�'d�She'd've��A�SheC�she�A�'dC�would�A�'veC�have�She'll��A�SheC�she�A�'llC�will�She'll've��A�SheC�she�A�'llC�will�A�'veC�have�She's��A�SheC�she�A�'sC�'s�Shedve��A�SheC�she�A�dC�would�A�veC�have�Shellve��A�SheC�she�A�llC�will�A�veC�have�Shes��A�SheC�she�A�s�She’d��A�SheC�she�A�’dC�'d�She’d’ve��A�SheC�she�A�’dC�would�A�’veC�have�She’ll��A�SheC�she�A�’llC�will�She’ll’ve��A�SheC�she�A�’llC�will�A�’veC�have�She’s��A�SheC�she�A�’sC�'s�Should've��A�ShouldC�should�A�'ve�Shouldn't��A�ShouldC�should�A�n'tC�not�Shouldn't've��A�ShouldC�should�A�n'tC�not�A�'veC�have�Shouldnt��A�ShouldC�should�A�ntC�not�Shouldntve��A�ShouldC�should�A�ntC�not�A�veC�have�Shouldn’t��A�ShouldC�should�A�n’tC�not�Shouldn’t’ve��A�ShouldC�should�A�n’tC�not�A�’veC�have�Shouldve��A�ShouldC�should�A�ve�Should’ve��A�ShouldC�should�A�’ve�Somethin��A�SomethinC�something�Somethin'��A�Somethin'C�something�Somethin’��A�Somethin’C�something�St.��A�St.�Tenn.��A�Tenn.C�Tennessee�That'd��A�ThatC�that�A�'dC�'d�That'd've��A�ThatC�that�A�'dC�would�A�'veC�have�That'll��A�ThatC�that�A�'llC�will�That'll've��A�ThatC�that�A�'llC�will�A�'veC�have�That's��A�ThatC�that�A�'sC�'s�Thatd��A�ThatC�that�A�dC�'d�Thatdve��A�ThatC�that�A�dC�would�A�veC�have�Thatll��A�ThatC�that�A�llC�will�Thatllve��A�ThatC�that�A�llC�will�A�veC�have�Thats��A�ThatC�that�A�s�That’d��A�ThatC�that�A�’dC�'d�That’d’ve��A�ThatC�that�A�’dC�would�A�’veC�have�That’ll��A�ThatC�that�A�’llC�will�That’ll’ve��A�ThatC�that�A�’llC�will�A�’veC�have�That’s��A�ThatC�that�A�’sC�'s�There'd��A�ThereC�there�A�'dC�'d�There'd've��A�ThereC�there�A�'dC�would�A�'veC�have�There'll��A�ThereC�there�A�'llC�will�There'll've��A�ThereC�there�A�'llC�will�A�'veC�have�The
re're��A�ThereC�there�A�'reC�are�There's��A�ThereC�there�A�'sC�'s�There've��A�ThereC�there�A�'ve�Thered��A�ThereC�there�A�dC�'d�Theredve��A�ThereC�there�A�dC�would�A�veC�have�Therell��A�ThereC�there�A�llC�will�Therellve��A�ThereC�there�A�llC�will�A�veC�have�Therere��A�ThereC�there�A�reC�are�Theres��A�ThereC�there�A�s�Thereve��A�There�A�veC�have�There’d��A�ThereC�there�A�’dC�'d�There’d’ve��A�ThereC�there�A�’dC�would�A�’veC�have�There’ll��A�ThereC�there�A�’llC�will�There’ll’ve��A�ThereC�there�A�’llC�will�A�’veC�have�There’re��A�ThereC�there�A�’reC�are�There’s��A�ThereC�there�A�’sC�'s�There’ve��A�ThereC�there�A�’ve�These'd��A�TheseC�these�A�'dC�'d�These'd've��A�TheseC�these�A�'dC�would�A�'veC�have�These'll��A�TheseC�these�A�'llC�will�These'll've��A�TheseC�these�A�'llC�will�A�'veC�have�These're��A�TheseC�these�A�'reC�are�These've��A�TheseC�these�A�'ve�Thesed��A�TheseC�these�A�dC�'d�Thesedve��A�TheseC�these�A�dC�would�A�veC�have�Thesell��A�TheseC�these�A�llC�will�Thesellve��A�TheseC�these�A�llC�will�A�veC�have�Thesere��A�TheseC�these�A�reC�are�Theseve��A�These�A�veC�have�These’d��A�TheseC�these�A�’dC�'d�These’d’ve��A�TheseC�these�A�’dC�would�A�’veC�have�These’ll��A�TheseC�these�A�’llC�will�These’ll’ve��A�TheseC�these�A�’llC�will�A�’veC�have�These’re��A�TheseC�these�A�’reC�are�These’ve��A�TheseC�these�A�’ve�They'd��A�TheyC�they�A�'dC�'d�They'd've��A�TheyC�they�A�'dC�would�A�'veC�have�They'll��A�TheyC�they�A�'llC�will�They'll've��A�TheyC�they�A�'llC�will�A�'veC�have�They're��A�TheyC�they�A�'reC�are�They've��A�TheyC�they�A�'veC�have�Theyd��A�TheyC�they�A�dC�'d�Theydve��A�TheyC�they�A�dC�would�A�veC�have�Theyll��A�TheyC�they�A�llC�will�Theyllve��A�TheyC�they�A�llC�will�A�veC�have�Theyre��A�TheyC�they�A�reC�are�Theyve��A�TheyC�they�A�veC�have�They’d��A�TheyC�they�A�’dC�'d�They’d’ve��A�TheyC�they�A�’dC�would�A�’veC�have�They’ll��A�TheyC�they�A�’llC�will�They’ll’ve��A�TheyC�they�A�’llC�will�A�’veC�have�They’re��A�TheyC�they�A�’reC�are�They’ve��A�TheyC�they�A�’veC�have�This'd��A
�ThisC�this�A�'dC�'d�This'd've��A�ThisC�this�A�'dC�would�A�'veC�have�This'll��A�ThisC�this�A�'llC�will�This'll've��A�ThisC�this�A�'llC�will�A�'veC�have�This's��A�ThisC�this�A�'sC�'s�Thisd��A�ThisC�this�A�dC�'d�Thisdve��A�ThisC�this�A�dC�would�A�veC�have�Thisll��A�ThisC�this�A�llC�will�Thisllve��A�ThisC�this�A�llC�will�A�veC�have�Thiss��A�ThisC�this�A�s�This’d��A�ThisC�this�A�’dC�'d�This’d’ve��A�ThisC�this�A�’dC�would�A�’veC�have�This’ll��A�ThisC�this�A�’llC�will�This’ll’ve��A�ThisC�this�A�’llC�will�A�’veC�have�This’s��A�ThisC�this�A�’sC�'s�Those'd��A�ThoseC�those�A�'dC�'d�Those'd've��A�ThoseC�those�A�'dC�would�A�'veC�have�Those'll��A�ThoseC�those�A�'llC�will�Those'll've��A�ThoseC�those�A�'llC�will�A�'veC�have�Those're��A�ThoseC�those�A�'reC�are�Those've��A�ThoseC�those�A�'ve�Thosed��A�ThoseC�those�A�dC�'d�Thosedve��A�ThoseC�those�A�dC�would�A�veC�have�Thosell��A�ThoseC�those�A�llC�will�Thosellve��A�ThoseC�those�A�llC�will�A�veC�have�Thosere��A�ThoseC�those�A�reC�are�Thoseve��A�Those�A�veC�have�Those’d��A�ThoseC�those�A�’dC�'d�Those’d’ve��A�ThoseC�those�A�’dC�would�A�’veC�have�Those’ll��A�ThoseC�those�A�’llC�will�Those’ll’ve��A�ThoseC�those�A�’llC�will�A�’veC�have�Those’re��A�ThoseC�those�A�’reC�are�Those’ve��A�ThoseC�those�A�’ve�V.V��A�V.V�V_V��A�V_V�Va.��A�Va.C�Virginia�Wash.��A�Wash.C�Washington�Wasn't��A�WasC�was�A�n'tC�not�Wasnt��A�WasC�was�A�ntC�not�Wasn’t��A�WasC�was�A�n’tC�not�We'd��A�WeC�we�A�'dC�'d�We'd've��A�WeC�we�A�'dC�would�A�'veC�have�We'll��A�WeC�we�A�'llC�will�We'll've��A�WeC�we�A�'llC�will�A�'veC�have�We're��A�WeC�we�A�'reC�are�We've��A�WeC�we�A�'veC�have�Wed��A�WeC�we�A�dC�'d�Wedve��A�WeC�we�A�dC�would�A�veC�have�Wellve��A�WeC�we�A�llC�will�A�veC�have�Weren't��A�WereC�were�A�n'tC�not�Werent��A�WereC�were�A�ntC�not�Weren’t��A�WereC�were�A�n’tC�not�Weve��A�WeC�we�A�veC�have�We’d��A�WeC�we�A�’dC�'d�We’d’ve��A�WeC�we�A�’dC�would�A�’veC�have�We’ll��A�WeC�we�A�’llC�will�We’ll’ve��A�WeC�we�A�’llC�will�A�’veC�have�We’re��A�WeC�we�A�’reC�are�We’ve��A�WeC�we
�A�’veC�have�What'd��A�WhatC�what�A�'dC�'d�What'd've��A�WhatC�what�A�'dC�would�A�'veC�have�What'll��A�WhatC�what�A�'llC�will�What'll've��A�WhatC�what�A�'llC�will�A�'veC�have�What're��A�WhatC�what�A�'reC�are�What's��A�WhatC�what�A�'sC�'s�What've��A�WhatC�what�A�'ve�Whatd��A�WhatC�what�A�dC�'d�Whatdve��A�WhatC�what�A�dC�would�A�veC�have�Whatll��A�WhatC�what�A�llC�will�Whatllve��A�WhatC�what�A�llC�will�A�veC�have�Whatre��A�WhatC�what�A�reC�are�Whats��A�WhatC�what�A�s�Whatve��A�What�A�veC�have�What’d��A�WhatC�what�A�’dC�'d�What’d’ve��A�WhatC�what�A�’dC�would�A�’veC�have�What’ll��A�WhatC�what�A�’llC�will�What’ll’ve��A�WhatC�what�A�’llC�will�A�’veC�have�What’re��A�WhatC�what�A�’reC�are�What’s��A�WhatC�what�A�’sC�'s�What’ve��A�WhatC�what�A�’ve�When'd��A�WhenC�when�A�'dC�'d�When'd've��A�WhenC�when�A�'dC�would�A�'veC�have�When'll��A�WhenC�when�A�'llC�will�When'll've��A�WhenC�when�A�'llC�will�A�'veC�have�When're��A�WhenC�when�A�'reC�are�When's��A�WhenC�when�A�'sC�'s�When've��A�WhenC�when�A�'ve�Whend��A�WhenC�when�A�dC�'d�Whendve��A�WhenC�when�A�dC�would�A�veC�have�Whenll��A�WhenC�when�A�llC�will�Whenllve��A�WhenC�when�A�llC�will�A�veC�have�Whenre��A�WhenC�when�A�reC�are�Whens��A�WhenC�when�A�s�Whenve��A�When�A�veC�have�When’d��A�WhenC�when�A�’dC�'d�When’d’ve��A�WhenC�when�A�’dC�would�A�’veC�have�When’ll��A�WhenC�when�A�’llC�will�When’ll’ve��A�WhenC�when�A�’llC�will�A�’veC�have�When’re��A�WhenC�when�A�’reC�are�When’s��A�WhenC�when�A�’sC�'s�When’ve��A�WhenC�when�A�’ve�Where'd��A�WhereC�where�A�'dC�'d�Where'd've��A�WhereC�where�A�'dC�would�A�'veC�have�Where'll��A�WhereC�where�A�'llC�will�Where'll've��A�WhereC�where�A�'llC�will�A�'veC�have�Where're��A�WhereC�where�A�'reC�are�Where's��A�WhereC�where�A�'sC�'s�Where've��A�WhereC�where�A�'ve�Whered��A�WhereC�where�A�dC�'d�Wheredve��A�WhereC�where�A�dC�would�A�veC�have�Wherell��A�WhereC�where�A�llC�will�Wherellve��A�WhereC�where�A�llC�will�A�veC�have�Wherere��A�WhereC�where�A�reC�are�Wheres��A�WhereC�where�A�s�Whereve��A�Where�A�veC�h
ave�Where’d��A�WhereC�where�A�’dC�'d�Where’d’ve��A�WhereC�where�A�’dC�would�A�’veC�have�Where’ll��A�WhereC�where�A�’llC�will�Where’ll’ve��A�WhereC�where�A�’llC�will�A�’veC�have�Where’re��A�WhereC�where�A�’reC�are�Where’s��A�WhereC�where�A�’sC�'s�Where’ve��A�WhereC�where�A�’ve�Who'd��A�WhoC�who�A�'dC�'d�Who'd've��A�WhoC�who�A�'dC�would�A�'veC�have�Who'll��A�WhoC�who�A�'llC�will�Who'll've��A�WhoC�who�A�'llC�will�A�'veC�have�Who're��A�WhoC�who�A�'reC�are�Who's��A�WhoC�who�A�'sC�'s�Who've��A�WhoC�who�A�'ve�Whod��A�WhoC�who�A�dC�'d�Whodve��A�WhoC�who�A�dC�would�A�veC�have�Wholl��A�WhoC�who�A�llC�will�Whollve��A�WhoC�who�A�llC�will�A�veC�have�Whos��A�WhoC�who�A�s�Whove��A�Who�A�veC�have�Who’d��A�WhoC�who�A�’dC�'d�Who’d’ve��A�WhoC�who�A�’dC�would�A�’veC�have�Who’ll��A�WhoC�who�A�’llC�will�Who’ll’ve��A�WhoC�who�A�’llC�will�A�’veC�have�Who’re��A�WhoC�who�A�’reC�are�Who’s��A�WhoC�who�A�’sC�'s�Who’ve��A�WhoC�who�A�’ve�Why'd��A�WhyC�why�A�'dC�'d�Why'd've��A�WhyC�why�A�'dC�would�A�'veC�have�Why'll��A�WhyC�why�A�'llC�will�Why'll've��A�WhyC�why�A�'llC�will�A�'veC�have�Why're��A�WhyC�why�A�'reC�are�Why's��A�WhyC�why�A�'sC�'s�Why've��A�WhyC�why�A�'ve�Whyd��A�WhyC�why�A�dC�'d�Whydve��A�WhyC�why�A�dC�would�A�veC�have�Whyll��A�WhyC�why�A�llC�will�Whyllve��A�WhyC�why�A�llC�will�A�veC�have�Whyre��A�WhyC�why�A�reC�are�Whys��A�WhyC�why�A�s�Whyve��A�Why�A�veC�have�Why’d��A�WhyC�why�A�’dC�'d�Why’d’ve��A�WhyC�why�A�’dC�would�A�’veC�have�Why’ll��A�WhyC�why�A�’llC�will�Why’ll’ve��A�WhyC�why�A�’llC�will�A�’veC�have�Why’re��A�WhyC�why�A�’reC�are�Why’s��A�WhyC�why�A�’sC�'s�Why’ve��A�WhyC�why�A�’ve�Wis.��A�Wis.C�Wisconsin�Won't��A�WoC�will�A�n'tC�not�Won't've��A�WoC�will�A�n'tC�not�A�'veC�have�Wont��A�WoC�will�A�ntC�not�Wontve��A�WoC�will�A�ntC�not�A�veC�have�Won’t��A�WoC�will�A�n’tC�not�Won’t’ve��A�WoC�will�A�n’tC�not�A�’veC�have�Would've��A�WouldC�would�A�'ve�Wouldn't��A�WouldC�would�A�n'tC�not�Wouldn't've��A�WouldC�would�A�n'tC�not�A�'veC�have�Wouldnt��A�WouldC�would�A�ntC�not�Wouldntve��A�Would
C�would�A�ntC�not�A�veC�have�Wouldn’t��A�WouldC�would�A�n’tC�not�Wouldn’t’ve��A�WouldC�would�A�n’tC�not�A�’veC�have�Wouldve��A�WouldC�would�A�ve�Would’ve��A�WouldC�would�A�’ve�XD��A�XD�XDD��A�XDD�You'd��A�YouC�you�A�'dC�'d�You'd've��A�YouC�you�A�'dC�would�A�'veC�have�You'll��A�YouC�you�A�'llC�will�You'll've��A�YouC�you�A�'llC�will�A�'veC�have�You're��A�YouC�you�A�'reC�are�You've��A�YouC�you�A�'veC�have�Youd��A�YouC�you�A�dC�'d�Youdve��A�YouC�you�A�dC�would�A�veC�have�Youll��A�YouC�you�A�llC�will�Youllve��A�YouC�you�A�llC�will�A�veC�have�Youre��A�YouC�you�A�reC�are�Youve��A�YouC�you�A�veC�have�You’d��A�YouC�you�A�’dC�'d�You’d’ve��A�YouC�you�A�’dC�would�A�’veC�have�You’ll��A�YouC�you�A�’llC�will�You’ll’ve��A�YouC�you�A�’llC�will�A�’veC�have�You’re��A�YouC�you�A�’reC�are�You’ve��A�YouC�you�A�’veC�have�[-:��A�[-:�[:��A�[:�[=��A�[=�\")��A�\")�\n��A�\n�\t��A�\t�]=��A�]=�^_^��A�^_^�^__^��A�^__^�^___^��A�^___^�a.��A�a.�a.m.��A�a.m.�ain't��A�ai�A�n'tC�not�aint��A�ai�A�ntC�not�ain’t��A�ai�A�n’tC�not�and/or��A�and/orC�and/or�aren't��A�areC�are�A�n'tC�not�arent��A�areC�are�A�ntC�not�aren’t��A�areC�are�A�n’tC�not�b.��A�b.�c'mon��A�c'mC�come�A�on�c.��A�c.�can't��A�caC�can�A�n'tC�not�can't've��A�caC�can�A�n'tC�not�A�'veC�have�cannot��A�can�A�not�cant��A�caC�can�A�ntC�not�cantve��A�caC�can�A�ntC�not�A�veC�have�can’t��A�caC�can�A�n’tC�not�can’t’ve��A�caC�can�A�n’tC�not�A�’veC�have�co.��A�co.�could've��A�couldC�could�A�'ve�couldn't��A�couldC�could�A�n'tC�not�couldn't've��A�couldC�could�A�n'tC�not�A�'veC�have�couldnt��A�couldC�could�A�ntC�not�couldntve��A�couldC�could�A�ntC�not�A�veC�have�couldn’t��A�couldC�could�A�n’tC�not�couldn’t’ve��A�couldC�could�A�n’tC�not�A�’veC�have�couldve��A�couldC�could�A�ve�could’ve��A�couldC�could�A�’ve�c’mon��A�c’mC�come�A�on�d.��A�d.�daren't��A�dareC�dare�A�n'tC�not�darent��A�dareC�dare�A�ntC�not�daren’t��A�dareC�dare�A�n’tC�not�didn't��A�didC�do�A�n'tC�not�didn't've��A�didC�do�A�n'tC�not�A�'veC�have�didnt��A�didC�do�A�ntC�not�didntve��A�didC�do�A�ntC�n
ot�A�veC�have�didn’t��A�didC�do�A�n’tC�not�didn’t’ve��A�didC�do�A�n’tC�not�A�’veC�have�doesn't��A�doesC�does�A�n'tC�not�doesn't've��A�doesC�does�A�n'tC�not�A�'veC�have�doesnt��A�doesC�does�A�ntC�not�doesntve��A�doesC�does�A�ntC�not�A�veC�have�doesn’t��A�doesC�does�A�n’tC�not�doesn’t’ve��A�doesC�does�A�n’tC�not�A�’veC�have�doin��A�doinC�doing�doin'��A�doin'C�doing�doin’��A�doin’C�doing�don't��A�doC�do�A�n'tC�not�don't've��A�doC�do�A�n'tC�not�A�'veC�have�dont��A�doC�do�A�ntC�not�dontve��A�doC�do�A�ntC�not�A�veC�have�don’t��A�doC�do�A�n’tC�not�don’t’ve��A�doC�do�A�n’tC�not�A�’veC�have�e.��A�e.�e.g.��A�e.g.�em��A�emC�them�f.��A�f.�g.��A�g.�goin��A�goinC�going�goin'��A�goin'C�going�goin’��A�goin’C�going�gonna��A�gonC�going�A�naC�to�gotta��A�got�A�taC�to�h.��A�h.�hadn't��A�hadC�have�A�n'tC�not�hadn't've��A�hadC�have�A�n'tC�not�A�'veC�have�hadnt��A�hadC�have�A�ntC�not�hadntve��A�hadC�have�A�ntC�not�A�veC�have�hadn’t��A�hadC�have�A�n’tC�not�hadn’t’ve��A�hadC�have�A�n’tC�not�A�’veC�have�hasn't��A�hasC�has�A�n'tC�not�hasnt��A�hasC�has�A�ntC�not�hasn’t��A�hasC�has�A�n’tC�not�haven't��A�haveC�have�A�n'tC�not�havent��A�haveC�have�A�ntC�not�haven’t��A�haveC�have�A�n’tC�not�havin��A�havinC�having�havin'��A�havin'C�having�havin’��A�havin’C�having�he'd��A�heC�he�A�'dC�'d�he'd've��A�heC�he�A�'dC�would�A�'veC�have�he'll��A�heC�he�A�'llC�will�he'll've��A�heC�he�A�'llC�will�A�'veC�have�he's��A�heC�he�A�'sC�'s�hed��A�heC�he�A�dC�'d�hedve��A�heC�he�A�dC�would�A�veC�have�hellve��A�heC�he�A�llC�will�A�veC�have�hes��A�heC�he�A�s�he’d��A�heC�he�A�’dC�'d�he’d’ve��A�heC�he�A�’dC�would�A�’veC�have�he’ll��A�heC�he�A�’llC�will�he’ll’ve��A�heC�he�A�’llC�will�A�’veC�have�he’s��A�heC�he�A�’sC�'s�how'd��A�howC�how�A�'dC�'d�how'd've��A�howC�how�A�'dC�would�A�'veC�have�how'd'y��A�how�A�'d�A�'yC�you�how'll��A�howC�how�A�'llC�will�how'll've��A�howC�how�A�'llC�will�A�'veC�have�how're��A�howC�how�A�'reC�are�how's��A�howC�how�A�'sC�'s�how've��A�howC�how�A�'ve�howd��A�howC�how�A�dC�'d�howdve��A�howC�how�A�dC�
would�A�veC�have�howll��A�howC�how�A�llC�will�howllve��A�howC�how�A�llC�will�A�veC�have�howre��A�howC�how�A�reC�are�hows��A�howC�how�A�s�howve��A�how�A�veC�have�how’d��A�howC�how�A�’dC�'d�how’d’ve��A�howC�how�A���dC�would�A�’veC�have�how’d’y��A�how�A�’d�A�’yC�you�how’ll��A�howC�how�A�’llC�will�how’ll’ve��A�howC�how�A�’llC�will�A�’veC�have�how’re��A�howC�how�A�’reC�are�how’s��A�howC�how�A�’sC�'s�how’ve��A�howC�how�A�’ve�i'd��A�iC�i�A�'dC�'d�i'd've��A�iC�i�A�'dC�would�A�'veC�have�i'll��A�iC�i�A�'llC�will�i'll've��A�iC�i�A�'llC�will�A�'veC�have�i'm��A�iC�i�A�'mC�am�i'ma��A�iC�i�A�'mC�am�A�aC�gonna�i've��A�iC�i�A�'veC�have�i.��A�i.�i.e.��A�i.e.�id��A�iC�i�A�dC�'d�idve��A�iC�i�A�dC�would�A�veC�have�illve��A�iC�i�A�llC�will�A�veC�have�im��A�iC�i�A�m�ima��A�iC�i�A�mC�am�A�aC�gonna�isn't��A�isC�is�A�n'tC�not�isnt��A�isC�is�A�ntC�not�isn’t��A�isC�is�A�n’tC�not�it'd��A�itC�it�A�'dC�'d�it'd've��A�itC�it�A�'dC�would�A�'veC�have�it'll��A�itC�it�A�'llC�will�it'll've��A�itC�it�A�'llC�will�A�'veC�have�it's��A�itC�it�A�'sC�'s�itd��A�itC�it�A�dC�'d�itdve��A�itC�it�A�dC�would�A�veC�have�itll��A�itC�it�A�llC�will�itllve��A�itC�it�A�llC�will�A�veC�have�it’d��A�itC�it�A�’dC�'d�it’d’ve��A�itC�it�A�’dC�would�A�’veC�have�it’ll��A�itC�it�A�’llC�will�it’ll’ve��A�itC�it�A�’llC�will�A�’veC�have�it’s��A�itC�it�A�’sC�'s�ive��A�iC�i�A�veC�have�i’d��A�iC�i�A�’dC�'d�i’d’ve��A�iC�i�A�’dC�would�A�’veC�have�i’ll��A�iC�i�A�’llC�will�i’ll’ve��A�iC�i�A�’llC�will�A�’veC�have�i’m��A�iC�i�A�’mC�am�i’ma��A�iC�i�A�’mC�am�A�aC�gonna�i’ve��A�iC�i�A�’veC�have�j.��A�j.�k.��A�k.�l.��A�l.�let's��A�let�A�'sC�us�let’s��A�let�A�’sC�us�ll��A�llC�will�lovin��A�lovinC�loving�lovin'��A�lovin'C�loving�lovin’��A�lovin’C�loving�m.��A�m.�ma'am��A�ma'amC�madam�mayn't��A�mayC�may�A�n'tC�not�mayn't've��A�mayC�may�A�n'tC�not�A�'veC�have�maynt��A�mayC�may�A�ntC�not�mayntve��A�mayC�may�A�ntC�not�A�veC�have�mayn’t��A�mayC�may�A�n’tC�not�mayn’t’ve��A�mayC�may�A�n’tC�not�A�’veC�have�ma’am��A�ma’amC�madam�might've��A�mightC�might�A�'ve�
mightn't��A�mightC�might�A�n'tC�not�mightn't've��A�mightC�might�A�n'tC�not�A�'veC�have�mightnt��A�mightC�might�A�ntC�not�mightntve��A�mightC�might�A�ntC�not�A�veC�have�mightn’t��A�mightC�might�A�n’tC�not�mightn’t’ve��A�mightC�might�A�n’tC�not�A�’veC�have�mightve��A�mightC�might�A�ve�might’ve��A�mightC�might�A�’ve�must've��A�mustC�must�A�'ve�mustn't��A�mustC�must�A�n'tC�not�mustn't've��A�mustC�must�A�n'tC�not�A�'veC�have�mustnt��A�mustC�must�A�ntC�not�mustntve��A�mustC�must�A�ntC�not�A�veC�have�mustn’t��A�mustC�must�A�n’tC�not�mustn’t’ve��A�mustC�must�A�n’tC�not�A�’veC�have�mustve��A�mustC�must�A�ve�must’ve��A�mustC�must�A�’ve�n.��A�n.�needn't��A�needC�need�A�n'tC�not�needn't've��A�needC�need�A�n'tC�not�A�'veC�have�neednt��A�needC�need�A�ntC�not�needntve��A�needC�need�A�ntC�not�A�veC�have�needn’t��A�needC�need�A�n’tC�not�needn’t’ve��A�needC�need�A�n’tC�not�A�’veC�have�not've��A�not�A�'veC�have�nothin��A�nothinC�nothing�nothin'��A�nothin'C�nothing�nothin’��A�nothin’C�nothing�notve��A�not�A�veC�have�not’ve��A�not�A�’veC�have�nuff��A�nuffC�enough�nuthin��A�nuthinC�nothing�nuthin'��A�nuthin'C�nothing�nuthin’��A�nuthin’C�nothing�o'clock��A�o'clockC�o'clock�o.��A�o.�o.0��A�o.0�o.O��A�o.O�o.o��A�o.o�o_0��A�o_0�o_O��A�o_O�o_o��A�o_o�ol��A�olC�old�ol'��A�ol'C�old�ol’��A�ol’C�old�oughtn't��A�oughtC�ought�A�n'tC�not�oughtn't've��A�oughtC�ought�A�n'tC�not�A�'veC�have�oughtnt��A�oughtC�ought�A�ntC�not�oughtntve��A�oughtC�ought�A�ntC�not�A�veC�have�oughtn’t��A�oughtC�ought�A�n’tC�not�oughtn’t’ve��A�oughtC�ought�A�n’tC�not�A�’veC�have�o’clock��A�o’clockC�o'clock�p.��A�p.�p.m.��A�p.m.�q.��A�q.�r.��A�r.�s.��A�s.�shan't��A�shaC�shall�A�n'tC�not�shan't've��A�shaC�shall�A�n'tC�not�A�'veC�have�shant��A�shaC�shall�A�ntC�not�shantve��A�shaC�shall�A�ntC�not�A�veC�have�shan’t��A�shaC�shall�A�n’tC�not�shan’t’ve��A�shaC�shall�A�n’tC�not�A�’veC�have�she'd��A�sheC�she�A�'dC�'d�she'd've��A�sheC�she�A�'dC�would�A�'veC�have�she'll��A�sheC�she�A�'llC�will�she'll've��A�sheC�she�A�'llC�will�A�'veC�hav
e�she's��A�sheC�she�A�'sC�'s�shedve��A�sheC�she�A�dC�would�A�veC�have�shellve��A�sheC�she�A�llC�will�A�veC�have�shes��A�sheC�she�A�s�she’d��A�sheC�she�A�’dC�'d�she’d’ve��A�sheC�she�A�’dC�would�A�’veC�have�she’ll��A�sheC�she�A�’llC�will�she’ll’ve��A�sheC�she�A�’llC�will�A�’veC�have�she’s��A�sheC�she�A�’sC�'s�should've��A�shouldC�should�A�'ve�shouldn't��A�shouldC�should�A�n'tC�not�shouldn't've��A�shouldC�should�A�n'tC�not�A�'veC�have�shouldnt��A�shouldC�should�A�ntC�not�shouldntve��A�shouldC�should�A�ntC�not�A�veC�have�shouldn’t��A�shouldC�should�A�n’tC�not�shouldn’t’ve��A�shouldC�should�A�n’tC�not�A�’veC�have�shouldve��A�shouldC�should�A�ve�should’ve��A�shouldC�should�A�’ve�somethin��A�somethinC�something�somethin'��A�somethin'C�something�somethin’��A�somethin’C�something�t.��A�t.�that'd��A�thatC�that�A�'dC�'d�that'd've��A�thatC�that�A�'dC�would�A�'veC�have�that'll��A�thatC�that�A�'llC�will�that'll've��A�thatC�that�A�'llC�will�A�'veC�have�that's��A�thatC�that�A�'sC�'s�thatd��A�thatC�that�A�dC�'d�thatdve��A�thatC�that�A�dC�would�A�veC�have�thatll��A�thatC�that�A�llC�will�thatllve��A�thatC�that�A�llC�will�A�veC�have�thats��A�thatC�that�A�s�that’d��A�thatC�that�A�’dC�'d�that’d’ve��A�thatC�that�A�’dC�would�A�’veC�have�that’ll��A�thatC�that�A�’llC�will�that’ll’ve��A�thatC�that�A�’llC�will�A�’veC�have�that’s��A�thatC�that�A�’sC�'s�there'd��A�thereC�there�A�'dC�'d�there'd've��A�thereC�there�A�'dC�would�A�'veC�have�there'll��A�thereC�there�A�'llC�will�there'll've��A�thereC�there�A�'llC�will�A�'veC�have�there're��A�thereC�there�A�'reC�are�there's��A�thereC�there�A�'sC�'s�there've��A�thereC�there�A�'ve�thered��A�thereC�there�A�dC�'d�theredve��A�thereC�there�A�dC�would�A�veC�have�therell��A�thereC�there�A�llC�will�therellve��A�thereC�there�A�llC�will�A�veC�have�therere��A�thereC�there�A�reC�are�theres��A�thereC�there�A�s�thereve��A�there�A�veC�have�there’d��A�thereC�there�A�’dC�'d�there’d’ve��A�thereC�there�A�’dC�would�A�’veC�have�there’ll��A�thereC�there�A�’llC�will�there’ll’v
(Binary spaCy tokenizer data, not rendered: serialized English tokenizer exception rules mapping contractions and variants to their norms, e.g. they'd → they + 'd, won't → will + not, plus punctuation/emoticon special cases, ending with the faster_heuristics flag.)
trainable_transformer/cfg ADDED
@@ -0,0 +1,3 @@
+ {
+ "max_batch_items":4096
+ }
trainable_transformer/model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:fe7414c3a1d546d9d1e5e579db20f3b41d324d07e083d16fff4de6861996948b
+ size 502028268
transformer/cfg ADDED
@@ -0,0 +1,3 @@
+ {
+ "max_batch_items":4096
+ }
transformer/model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:bb9fc57108532fd3a96a6c638e5a4c55c19990e900eac50266b189f1c23da3bf
+ size 502028329
vocab/key2row ADDED
@@ -0,0 +1 @@
+
vocab/lookups.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:76be8b528d0075f7aae98d6fa57a6d3c83ae480a8469e668d7b0af968995ac71
+ size 1
vocab/strings.json ADDED
The diff for this file is too large to render. See raw diff
 
vocab/vectors ADDED
Binary file (128 Bytes). View file
 
vocab/vectors.cfg ADDED
@@ -0,0 +1,3 @@
+ {
+ "mode":"default"
+ }