Update spaCy pipeline
- .gitattributes +4 -0
- README.md +105 -0
- attribute_ruler/patterns +0 -0
- config.cfg +287 -0
- custom_functions.py +855 -0
- en_engagement_Dual_RoBERTa_acad3_f4-any-py3-none-any.whl +3 -0
- lemmatizer/lookups/lookups.bin +3 -0
- meta.json +344 -0
- ner/cfg +13 -0
- ner/model +0 -0
- ner/moves +1 -0
- parser/cfg +13 -0
- parser/model +0 -0
- parser/moves +2 -0
- spancat/cfg +17 -0
- spancat/model +3 -0
- tagger/cfg +55 -0
- tagger/model +0 -0
- tokenizer +3 -0
- trainable_transformer/cfg +3 -0
- trainable_transformer/model +3 -0
- transformer/cfg +3 -0
- transformer/model +3 -0
- vocab/key2row +1 -0
- vocab/lookups.bin +3 -0
- vocab/strings.json +0 -0
- vocab/vectors +0 -0
- vocab/vectors.cfg +3 -0
.gitattributes
CHANGED
@@ -32,3 +32,7 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+en_engagement_Dual_RoBERTa_acad3_f4-any-py3-none-any.whl filter=lfs diff=lfs merge=lfs -text
+spancat/model filter=lfs diff=lfs merge=lfs -text
+trainable_transformer/model filter=lfs diff=lfs merge=lfs -text
+transformer/model filter=lfs diff=lfs merge=lfs -text
README.md
ADDED
@@ -0,0 +1,105 @@
---
tags:
- spacy
- token-classification
language:
- en
model-index:
- name: en_engagement_Dual_RoBERTa_acad3_f4
  results:
  - task:
      name: NER
      type: token-classification
    metrics:
    - name: NER Precision
      type: precision
      value: 0.0
    - name: NER Recall
      type: recall
      value: 0.0
    - name: NER F Score
      type: f_score
      value: 0.0
  - task:
      name: TAG
      type: token-classification
    metrics:
    - name: TAG (XPOS) Accuracy
      type: accuracy
      value: 0.0
  - task:
      name: LEMMA
      type: token-classification
    metrics:
    - name: Lemma Accuracy
      type: accuracy
      value: 0.0
  - task:
      name: UNLABELED_DEPENDENCIES
      type: token-classification
    metrics:
    - name: Unlabeled Attachment Score (UAS)
      type: f_score
      value: 0.0
  - task:
      name: LABELED_DEPENDENCIES
      type: token-classification
    metrics:
    - name: Labeled Attachment Score (LAS)
      type: f_score
      value: 0.0
  - task:
      name: SENTS
      type: token-classification
    metrics:
    - name: Sentences F-Score
      type: f_score
      value: 0.8446808511
---

| Feature | Description |
| --- | --- |
| **Name** | `en_engagement_Dual_RoBERTa_acad3_f4` |
| **Version** | `1.0.0` |
| **spaCy** | `>=3.4.4,<3.5.0` |
| **Default Pipeline** | `transformer`, `parser`, `tagger`, `ner`, `attribute_ruler`, `lemmatizer`, `trainable_transformer`, `spancat` |
| **Components** | `transformer`, `parser`, `tagger`, `ner`, `attribute_ruler`, `lemmatizer`, `trainable_transformer`, `spancat` |
| **Vectors** | 0 keys, 0 unique vectors (0 dimensions) |
| **Sources** | n/a |
| **License** | n/a |
| **Author** | [n/a]() |

### Label Scheme

<details>

<summary>View label scheme (122 labels for 4 components)</summary>

| Component | Labels |
| --- | --- |
| **`parser`** | `ROOT`, `acl`, `acomp`, `advcl`, `advmod`, `agent`, `amod`, `appos`, `attr`, `aux`, `auxpass`, `case`, `cc`, `ccomp`, `compound`, `conj`, `csubj`, `csubjpass`, `dative`, `dep`, `det`, `dobj`, `expl`, `intj`, `mark`, `meta`, `neg`, `nmod`, `npadvmod`, `nsubj`, `nsubjpass`, `nummod`, `oprd`, `parataxis`, `pcomp`, `pobj`, `poss`, `preconj`, `predet`, `prep`, `prt`, `punct`, `quantmod`, `relcl`, `xcomp` |
| **`tagger`** | `$`, `''`, `,`, `-LRB-`, `-RRB-`, `.`, `:`, `ADD`, `AFX`, `CC`, `CD`, `DT`, `EX`, `FW`, `HYPH`, `IN`, `JJ`, `JJR`, `JJS`, `LS`, `MD`, `NFP`, `NN`, `NNP`, `NNPS`, `NNS`, `PDT`, `POS`, `PRP`, `PRP$`, `RB`, `RBR`, `RBS`, `RP`, `SYM`, `TO`, `UH`, `VB`, `VBD`, `VBG`, `VBN`, `VBP`, `VBZ`, `WDT`, `WP`, `WP$`, `WRB`, `XX`, ```` |
| **`ner`** | `CARDINAL`, `DATE`, `EVENT`, `FAC`, `GPE`, `LANGUAGE`, `LAW`, `LOC`, `MONEY`, `NORP`, `ORDINAL`, `ORG`, `PERCENT`, `PERSON`, `PRODUCT`, `QUANTITY`, `TIME`, `WORK_OF_ART` |
| **`spancat`** | `MONOGLOSS`, `ATTRIBUTION`, `ENTERTAIN`, `PROCLAIM`, `JUSTIFYING`, `SOURCES`, `CITATION`, `COUNTER`, `DENY`, `ENDOPHORIC` |

</details>

### Accuracy

| Type | Score |
| --- | --- |
| `DEP_UAS` | 0.00 |
| `DEP_LAS` | 0.00 |
| `DEP_LAS_PER_TYPE` | 0.00 |
| `SENTS_P` | 80.73 |
| `SENTS_R` | 88.57 |
| `SENTS_F` | 84.47 |
| `TAG_ACC` | 0.00 |
| `ENTS_F` | 0.00 |
| `ENTS_P` | 0.00 |
| `ENTS_R` | 0.00 |
| `LEMMA_ACC` | 0.00 |
| `SPANS_SC_F` | 71.14 |
| `SPANS_SC_P` | 71.74 |
| `SPANS_SC_R` | 70.55 |
| `TRAINABLE_TRANSFORMER_LOSS` | 359.10 |
| `SPANCAT_LOSS` | 74753.57 |
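A minimal usage sketch, assuming the wheel shipped in this repo has been installed (for example `pip install en_engagement_Dual_RoBERTa_acad3_f4-any-py3-none-any.whl`). The `spancat` component stores its predictions under the `"sc"` span key, which `config.cfg` below sets via `[vars] spans_key`:

```python
import spacy

# Load the installed package; a wheel built with `spacy package --code`
# bundles the custom architectures, so no extra imports should be needed.
nlp = spacy.load("en_engagement_Dual_RoBERTa_acad3_f4")

doc = nlp("Previous studies suggest that this approach may improve accuracy.")

# Engagement spans live in doc.spans under the "sc" key (see config.cfg).
for span in doc.spans["sc"]:
    print(span.start_char, span.end_char, span.label_, span.text)
```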
attribute_ruler/patterns
ADDED
Binary file (14.8 kB).
config.cfg
ADDED
@@ -0,0 +1,287 @@
[paths]
train = "data/engagement_three_train4.spacy"
dev = "data/engagement_three_dev4.spacy"
vectors = null
init_tok2vec = null

[system]
gpu_allocator = "pytorch"
seed = 0

[nlp]
lang = "en"
pipeline = ["transformer","parser","tagger","ner","attribute_ruler","lemmatizer","trainable_transformer","spancat"]
batch_size = 10
disabled = []
before_creation = null
after_creation = null
after_pipeline_creation = null
tokenizer = {"@tokenizers":"spacy.Tokenizer.v1"}

[components]

[components.attribute_ruler]
factory = "attribute_ruler"
scorer = {"@scorers":"spacy.attribute_ruler_scorer.v1"}
validate = false

[components.lemmatizer]
factory = "lemmatizer"
mode = "rule"
model = null
overwrite = false
scorer = {"@scorers":"spacy.lemmatizer_scorer.v1"}

[components.ner]
factory = "ner"
incorrect_spans_key = null
moves = null
scorer = {"@scorers":"spacy.ner_scorer.v1"}
update_with_oracle_cut_size = 100

[components.ner.model]
@architectures = "spacy.TransitionBasedParser.v2"
state_type = "ner"
extra_state_tokens = false
hidden_width = 64
maxout_pieces = 2
use_upper = false
nO = null

[components.ner.model.tok2vec]
@architectures = "spacy-transformers.TransformerListener.v1"
grad_factor = 1.0
upstream = "transformer"
pooling = {"@layers":"reduce_mean.v1"}

[components.parser]
factory = "parser"
learn_tokens = false
min_action_freq = 30
moves = null
scorer = {"@scorers":"spacy.parser_scorer.v1"}
update_with_oracle_cut_size = 100

[components.parser.model]
@architectures = "spacy.TransitionBasedParser.v2"
state_type = "parser"
extra_state_tokens = false
hidden_width = 64
maxout_pieces = 2
use_upper = false
nO = null

[components.parser.model.tok2vec]
@architectures = "spacy-transformers.TransformerListener.v1"
grad_factor = 1.0
upstream = "transformer"
pooling = {"@layers":"reduce_mean.v1"}

[components.spancat]
factory = "spancat"
max_positive = null
scorer = {"@scorers":"spacy.spancat_scorer.v1"}
spans_key = ${vars.spans_key}
threshold = 0.5

[components.spancat.model]
@architectures = "Ensemble_SpanCategorizer.v2"
LSTMhidden = 200
LSTMdepth = 1
LSTMdropout = 0.0

[components.spancat.model.reducer1]
@layers = "Mish_two_way_reducer.v2"
depth = 2
dropout = 0.2
hidden_size = 256

[components.spancat.model.reducer2]
@layers = "Mish_mean_max_reducer.v1"
depth = 1
dropout = 0.4
hidden_size = 128

[components.spancat.model.scorer]
@layers = "spacy.LinearLogistic.v1"
nO = null
nI = null

[components.spancat.model.tok2vec]
@architectures = "spacy-transformers.TransformerListener.v1"
grad_factor = 1.0
pooling = {"@layers":"reduce_mean.v1"}
upstream = "trainable_transformer"

[components.spancat.model.tok2vec_trf]
@architectures = "spacy-transformers.TransformerListener.v1"
grad_factor = 0
pooling = {"@layers":"reduce_mean.v1"}
upstream = "transformer"

[components.spancat.suggester]
@misc = "spacy-experimental.ngram_subtree_suggester.v1"
sizes = [1,2,3,4,5,6,7,8,9,10,11,12]

[components.tagger]
factory = "tagger"
neg_prefix = "!"
overwrite = false
scorer = {"@scorers":"spacy.tagger_scorer.v1"}

[components.tagger.model]
@architectures = "spacy.Tagger.v2"
nO = null
normalize = false

[components.tagger.model.tok2vec]
@architectures = "spacy-transformers.TransformerListener.v1"
grad_factor = 1.0
upstream = "transformer"
pooling = {"@layers":"reduce_mean.v1"}

[components.trainable_transformer]
factory = "transformer"
max_batch_items = 4096
set_extra_annotations = {"@annotation_setters":"spacy-transformers.null_annotation_setter.v1"}

[components.trainable_transformer.model]
name = "egumasa/roberta-base-academic3"
@architectures = "spacy-transformers.TransformerModel.v1"

[components.trainable_transformer.model.get_spans]
@span_getters = "spacy-transformers.strided_spans.v1"
window = 384
stride = 288

[components.trainable_transformer.model.tokenizer_config]
use_fast = true

[components.transformer]
factory = "transformer"
max_batch_items = 4096
set_extra_annotations = {"@annotation_setters":"spacy-transformers.null_annotation_setter.v1"}

[components.transformer.model]
@architectures = "spacy-transformers.TransformerModel.v3"
name = "roberta-base"
mixed_precision = false

[components.transformer.model.get_spans]
@span_getters = "spacy-transformers.strided_spans.v1"
window = 128
stride = 96

[components.transformer.model.grad_scaler_config]

[components.transformer.model.tokenizer_config]
use_fast = true

[components.transformer.model.transformer_config]

[corpora]

[corpora.dev]
@readers = "spacy.Corpus.v1"
path = ${paths.dev}
max_length = 0
gold_preproc = false
limit = 0
augmenter = null

[corpora.train]
@readers = "spacy.Corpus.v1"
path = ${paths.train}
max_length = 2000
gold_preproc = false
limit = 0
augmenter = null

[training]
dev_corpus = "corpora.dev"
train_corpus = "corpora.train"
seed = ${system.seed}
gpu_allocator = ${system.gpu_allocator}
dropout = 0.1
accumulate_gradient = 4
patience = 3000
max_epochs = 0
max_steps = 20000
eval_frequency = 200
frozen_components = ["transformer","parser","tagger","ner","attribute_ruler","lemmatizer"]
annotating_components = ["transformer","parser"]
before_to_disk = null

[training.batcher]
@batchers = "spacy.batch_by_words.v1"
discard_oversize = false
tolerance = 0.2
get_length = null

[training.batcher.size]
start = 500
@schedules = "compounding.v1"
stop = 1000
compound = 1.0002
t = 0.0

[training.logger]
@loggers = "spacy.WandbLogger.v4"
project_name = "Spancat_5-fold"
remove_config_values = ["paths.train","paths.dev","corpora.train.path","corpora.dev.path"]
model_log_interval = null
entity = "e-masaki0101"
log_dataset_dir = null
run_name = null
log_best_dir = null
log_latest_dir = null

[training.optimizer]
@optimizers = "Adam.v1"
beta1 = 0.9
beta2 = 0.999
L2_is_weight_decay = true
L2 = 0.01
grad_clip = 1.0
use_averages = false
eps = 0.00000001

[training.optimizer.learn_rate]
initial_rate = 0.0000565344
@schedules = "warmup_linear.v1"
warmup_steps = 1000
total_steps = 20000

[training.score_weights]
dep_uas = null
dep_las = null
dep_las_per_type = null
sents_p = null
sents_r = null
sents_f = null
tag_acc = null
ents_f = null
ents_p = null
ents_r = null
ents_per_type = null
lemma_acc = null
spans_sc_f = 0.5
spans_sc_p = 0.0
spans_sc_r = 0.5

[pretraining]

[initialize]
vectors = ${paths.vectors}
init_tok2vec = ${paths.init_tok2vec}
vocab_data = null
lookups = null
before_init = null
after_init = null

[initialize.components]

[initialize.tokenizer]

[vars]
spans_key = "sc"
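This config references custom registered names (`Ensemble_SpanCategorizer.v2`, `Mish_two_way_reducer.v2`, `Mish_mean_max_reducer.v1`) defined in `custom_functions.py`, plus the `spacy-experimental` ngram subtree suggester, so both must be importable when the config is resolved. A hedged sketch of launching training from Python; the output directory is an assumption, and the CLI equivalent would be `python -m spacy train config.cfg --code custom_functions.py`:

```python
from pathlib import Path

# Importing the module runs its @registry.architectures / @registry.layers
# decorators, registering the custom functions before the config is resolved.
import custom_functions  # noqa: F401

from spacy.cli.train import train

train(
    Path("config.cfg"),
    output_path=Path("training_output"),  # hypothetical output directory
    use_gpu=0,  # the config sets gpu_allocator = "pytorch"
)
```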
custom_functions.py
ADDED
@@ -0,0 +1,855 @@
from functools import partial
from pathlib import Path
from typing import Iterable, Callable
import spacy
from spacy.training import Example
from spacy.tokens import DocBin, Doc

from typing import List, Tuple, cast
from thinc.api import Model, with_getitem, chain, list2ragged, Logistic, clone, LayerNorm
from thinc.api import Maxout, Mish, Linear, Gelu, concatenate, glorot_uniform_init, PyTorchLSTM, residual
from thinc.api import reduce_mean, reduce_max, reduce_first, reduce_last, reduce_sum
from thinc.types import Ragged, Floats2d

from spacy.util import registry
from spacy.tokens import Doc
from spacy.ml.extract_spans import extract_spans

# @registry.layers("spacy.LinearLogistic.v1")
# def build_linear_logistic(nO=None, nI=None) -> Model[Floats2d, Floats2d]:
#     """An output layer for multi-label classification. It uses a linear layer
#     followed by a logistic activation.
#     """
#     return chain(Linear(nO=nO, nI=nI, init_W=glorot_uniform_init), Logistic())


@registry.architectures("CustomSpanCategorizer.v2")
def build_spancat_model(
    tok2vec: Model[List[Doc], List[Floats2d]],
    reducer1: Model[Ragged, Floats2d],
    reducer2: Model[Ragged, Floats2d],
    scorer: Model[Floats2d, Floats2d],
) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
    """Build a span categorizer model, given a token-to-vector model, a
    reducer model to map the sequence of vectors for each span down to a single
    vector, and a scorer model to map the vectors to probabilities.
    tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
    reducer (Model[Ragged, Floats2d]): The reducer model.
    scorer (Model[Floats2d, Floats2d]): The scorer model.
    """
    model = chain(
        cast(
            Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
            with_getitem(
                0,
                chain(tok2vec,
                      cast(Model[List[Floats2d], Ragged], list2ragged()))),
        ),
        extract_spans(),
        concatenate(reducer1, reducer2),
        scorer,
    )
    model.set_ref("tok2vec", tok2vec)
    model.set_ref("reducer1", reducer1)
    model.set_ref("reducer2", reducer2)
    model.set_ref("scorer", scorer)
    return model


@registry.architectures("LSTM_SpanCategorizer.v1")
def build_spancat_LSTM_model(
    tok2vec: Model[List[Doc], List[Floats2d]],
    reducer: Model[Ragged, Floats2d],
    scorer: Model[Floats2d, Floats2d],
    LSTMdepth: int = 2,
    LSTMdropout: float = 0.0,
    LSTMhidden: int = 200) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
    """Build a span categorizer model, given a token-to-vector model, a
    reducer model to map the sequence of vectors for each span down to a single
    vector, and a scorer model to map the vectors to probabilities.
    tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
    reducer (Model[Ragged, Floats2d]): The reducer model.
    scorer (Model[Floats2d, Floats2d]): The scorer model.
    """
    embedding = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(
                tok2vec,
                PyTorchLSTM(nI=768,
                            nO=LSTMhidden,
                            bi=True,
                            depth=LSTMdepth,
                            dropout=LSTMdropout),
                cast(Model[List[Floats2d], Ragged], list2ragged()))))
    # LSTM = PyTorchLSTM(nO=None, nI=None, bi=True, depth=LSTMdepth, dropout=LSTMdropout)

    model = chain(
        embedding,
        extract_spans(),
        reducer,
        scorer,
    )
    model.set_ref("tok2vec", tok2vec)
    model.set_ref("reducer", reducer)
    model.set_ref("scorer", scorer)
    return model


# NOTE: reuses the function name above; harmless in Python, since registration
# happens at definition time under the distinct name "LSTM_SpanCategorizer.v1.1".
@registry.architectures("LSTM_SpanCategorizer.v1.1")
def build_spancat_LSTM_model(
    tok2vec: Model[List[Doc], List[Floats2d]],
    reducer: Model[Ragged, Floats2d],
    scorer: Model[Floats2d, Floats2d],
    lstmdepth: int = 2,
    lstmdropout: float = 0.0,
    lstmhidden: int = 200) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
    """Build a span categorizer model, given a token-to-vector model, a
    reducer model to map the sequence of vectors for each span down to a single
    vector, and a scorer model to map the vectors to probabilities.
    tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
    reducer (Model[Ragged, Floats2d]): The reducer model.
    scorer (Model[Floats2d, Floats2d]): The scorer model.
    """
    embedding = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(
                tok2vec,
                PyTorchLSTM(nI=768,
                            nO=lstmhidden,
                            bi=True,
                            depth=lstmdepth,
                            dropout=lstmdropout),
                cast(Model[List[Floats2d], Ragged], list2ragged()))))

    model = chain(
        embedding,
        extract_spans(),
        reducer,
        scorer,
    )
    model.set_ref("tok2vec", tok2vec)
    model.set_ref("reducer", reducer)
    model.set_ref("scorer", scorer)
    return model


# @registry.architectures("LSTM_SpanCategorizer.v2")
# def build_spancat_LSTM_model(
#     tok2vec: Model[List[Doc], List[Floats2d]],
#     reducer: Model[Ragged, Floats2d],
#     scorer: Model[Floats2d, Floats2d],
#     LSTMdepth: int = 2,
#     LSTMdropout: float = 0.0,
#     LSTMhidden: int = 200) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
#     """Build a span categorizer model, given a token-to-vector model, a
#     reducer model to map the sequence of vectors for each span down to a single
#     vector, and a scorer model to map the vectors to probabilities.
#     tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
#     reducer (Model[Ragged, Floats2d]): The reducer model.
#     scorer (Model[Floats2d, Floats2d]): The scorer model.
#     """
#     embedding = cast(
#         Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
#         with_getitem(
#             0,
#             chain(
#                 tok2vec,
#                 cast(Model[List[Floats2d], Ragged], list2ragged()))))
#
#     lstm_layer = PyTorchLSTM(nO=LSTMhidden, nI=768, bi=True, depth=LSTMdepth, dropout=LSTMdropout)
#
#     model = chain(
#         embedding,
#         lstm_layer,
#         extract_spans(),
#         reducer,
#         scorer,
#     )
#     model.set_ref("tok2vec", tok2vec)
#     model.set_ref("reducer", reducer)
#     model.set_ref("scorer", scorer)
#     return model


@registry.architectures("Ensemble_SpanCategorizer.v1")
def build_dual_transformer_model(
    tok2vec: Model[List[Doc], List[Floats2d]],
    tok2vec_trf: Model[List[Doc], List[Floats2d]],
    reducer1: Model[Ragged, Floats2d],
    reducer2: Model[Ragged, Floats2d],
    scorer: Model[Floats2d, Floats2d],
    LSTMhidden: int = 200,
    LSTMdepth: int = 1,
    LSTMdropout: float = 0.0,
    lstmhidden: int = 200,
    lstmdepth: int = 1,
    lstmdropout: float = 0.0,
) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
    """Build a span categorizer model, given a token-to-vector model, a
    reducer model to map the sequence of vectors for each span down to a single
    vector, and a scorer model to map the vectors to probabilities.
    tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
    reducer (Model[Ragged, Floats2d]): The reducer model.
    scorer (Model[Floats2d, Floats2d]): The scorer model.
    """
    trainable_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(tok2vec, cast(Model[List[Floats2d], Ragged],
                                list2ragged()))),
    )
    en_core_web_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(tok2vec_trf,
                  cast(Model[List[Floats2d], Ragged], list2ragged()))),
    )
    reduce_trainable = chain(trainable_trf, extract_spans(), reducer1)
    reduce_default = chain(en_core_web_trf, extract_spans(), reducer2)
    model = chain(
        concatenate(reduce_trainable, reduce_default),
        scorer,
    )
    model.set_ref("tok2vec", tok2vec)
    model.set_ref("tok2vec_trf", tok2vec_trf)
    model.set_ref("reducer1", reducer1)
    model.set_ref("reducer2", reducer2)
    model.set_ref("scorer", scorer)
    return model


@registry.architectures("Ensemble_SpanCategorizer.v2")
def build_dual_transformer_model2(
    tok2vec: Model[List[Doc], List[Floats2d]],
    tok2vec_trf: Model[List[Doc], List[Floats2d]],
    reducer1: Model[Ragged, Floats2d],
    reducer2: Model[Ragged, Floats2d],
    scorer: Model[Floats2d, Floats2d],
    LSTMhidden: int = 200,
    LSTMdepth: int = 1,
    LSTMdropout: float = 0.0,
    lstmhidden: int = 200,
    lstmdepth: int = 1,
    lstmdropout: float = 0.0,
) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
    """Build a span categorizer model, given a token-to-vector model, a
    reducer model to map the sequence of vectors for each span down to a single
    vector, and a scorer model to map the vectors to probabilities.
    tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
    reducer (Model[Ragged, Floats2d]): The reducer model.
    scorer (Model[Floats2d, Floats2d]): The scorer model.
    """
    trainable_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(tok2vec, cast(Model[List[Floats2d], Ragged],
                                list2ragged()))),
    )
    en_core_web_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(
                tok2vec_trf,
                PyTorchLSTM(nI=768,
                            nO=lstmhidden,
                            bi=True,
                            depth=lstmdepth,
                            dropout=lstmdropout),
                cast(Model[List[Floats2d], Ragged], list2ragged()))),
    )
    reduce_trainable = chain(trainable_trf, extract_spans(), reducer1)
    reduce_default = chain(en_core_web_trf, extract_spans(), reducer2)
    model = chain(
        concatenate(reduce_trainable, reduce_default),
        # Mish(),
        # LayerNorm(),
        scorer,
    )
    model.set_ref("tok2vec", tok2vec)
    model.set_ref("tok2vec_trf", tok2vec_trf)
    model.set_ref("reducer1", reducer1)
    model.set_ref("reducer2", reducer2)
    model.set_ref("scorer", scorer)
    return model


@registry.architectures("Ensemble_SpanCategorizer.v3")
def build_dual_transformer_model3(
    tok2vec: Model[List[Doc], List[Floats2d]],
    tok2vec_trf: Model[List[Doc], List[Floats2d]],
    reducer1: Model[Ragged, Floats2d],
    reducer2: Model[Ragged, Floats2d],
    scorer: Model[Floats2d, Floats2d],
    LSTMhidden: int = 200,
    LSTMdepth: int = 1,
    LSTMdropout: float = 0.0,
    lstmhidden: int = 200,
    lstmdepth: int = 1,
    lstmdropout: float = 0.0,
) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
    """Build a span categorizer model, given a token-to-vector model, a
    reducer model to map the sequence of vectors for each span down to a single
    vector, and a scorer model to map the vectors to probabilities.
    tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
    reducer (Model[Ragged, Floats2d]): The reducer model.
    scorer (Model[Floats2d, Floats2d]): The scorer model.
    """
    en_core_web_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(tok2vec_trf, cast(Model[List[Floats2d], Ragged],
                                    list2ragged()))),
    )
    trainable_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(
                tok2vec,
                PyTorchLSTM(nI=768,
                            nO=lstmhidden,
                            bi=True,
                            depth=lstmdepth,
                            dropout=lstmdropout),
                cast(Model[List[Floats2d], Ragged], list2ragged()))),
    )
    reduce_trainable = chain(trainable_trf, extract_spans(), reducer1)
    reduce_default = chain(en_core_web_trf, extract_spans(), reducer2)
    model = chain(
        concatenate(reduce_trainable, reduce_default),
        scorer,
    )
    model.set_ref("tok2vec", tok2vec)
    model.set_ref("tok2vec_trf", tok2vec_trf)
    model.set_ref("reducer1", reducer1)
    model.set_ref("reducer2", reducer2)
    model.set_ref("scorer", scorer)
    return model


@registry.architectures("Ensemble_SpanCategorizer.v4")
def build_dual_transformer_model4(
    tok2vec: Model[List[Doc], List[Floats2d]],
    tok2vec_trf: Model[List[Doc], List[Floats2d]],
    reducer1: Model[Ragged, Floats2d],
    reducer2: Model[Ragged, Floats2d],
    scorer: Model[Floats2d, Floats2d],
    LSTMhidden: int = 200,
    LSTMdepth: int = 1,
    LSTMdropout: float = 0.0,
    lstmhidden: int = 200,
    lstmdepth: int = 1,
    lstmdropout: float = 0.0,
) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
    """Build a span categorizer model, given a token-to-vector model, a
    reducer model to map the sequence of vectors for each span down to a single
    vector, and a scorer model to map the vectors to probabilities.
    tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
    reducer (Model[Ragged, Floats2d]): The reducer model.
    scorer (Model[Floats2d, Floats2d]): The scorer model.
    """
    trainable_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(tok2vec, cast(Model[List[Floats2d], Ragged],
                                list2ragged()))),
    )
    en_core_web_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(
                tok2vec_trf,
                PyTorchLSTM(nI=768,
                            nO=lstmhidden,
                            bi=True,
                            depth=lstmdepth,
                            dropout=lstmdropout),
                cast(Model[List[Floats2d], Ragged], list2ragged()))),
    )
    reduce_trainable = chain(trainable_trf, extract_spans(), reducer1)
    reduce_default = chain(en_core_web_trf, extract_spans(), reducer2)
    model = chain(
        concatenate(reduce_trainable, reduce_default),
        Mish(nO=128),
        LayerNorm(),
        scorer,
    )
    model.set_ref("tok2vec", tok2vec)
    model.set_ref("tok2vec_trf", tok2vec_trf)
    model.set_ref("reducer1", reducer1)
    model.set_ref("reducer2", reducer2)
    model.set_ref("scorer", scorer)
    return model


# NOTE: reuses the name build_dual_transformer_model3; only the registry name
# "Ensemble_SpanCategorizer.v5" distinguishes this variant (LSTM on both streams).
@registry.architectures("Ensemble_SpanCategorizer.v5")
def build_dual_transformer_model3(
    tok2vec: Model[List[Doc], List[Floats2d]],
    tok2vec_trf: Model[List[Doc], List[Floats2d]],
    reducer1: Model[Ragged, Floats2d],
    reducer2: Model[Ragged, Floats2d],
    scorer: Model[Floats2d, Floats2d],
    LSTMhidden: int = 200,
    LSTMdepth: int = 1,
    LSTMdropout: float = 0.0,
    lstmhidden: int = 200,
    lstmdepth: int = 1,
    lstmdropout: float = 0.0,
) -> Model[Tuple[List[Doc], Ragged], Floats2d]:
    """Build a span categorizer model, given a token-to-vector model, a
    reducer model to map the sequence of vectors for each span down to a single
    vector, and a scorer model to map the vectors to probabilities.
    tok2vec (Model[List[Doc], List[Floats2d]]): The tok2vec model.
    reducer (Model[Ragged, Floats2d]): The reducer model.
    scorer (Model[Floats2d, Floats2d]): The scorer model.
    """
    en_core_web_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(tok2vec_trf,
                  PyTorchLSTM(nI=768,
                              nO=lstmhidden,
                              bi=True,
                              depth=lstmdepth,
                              dropout=lstmdropout),
                  cast(Model[List[Floats2d], Ragged],
                       list2ragged()))),
    )
    trainable_trf = cast(
        Model[Tuple[List[Doc], Ragged], Tuple[Ragged, Ragged]],
        with_getitem(
            0,
            chain(
                tok2vec,
                PyTorchLSTM(nI=768,
                            nO=lstmhidden,
                            bi=True,
                            depth=lstmdepth,
                            dropout=lstmdropout),
                cast(Model[List[Floats2d], Ragged], list2ragged()))),
    )
    reduce_trainable = chain(trainable_trf, extract_spans(), reducer1)
    reduce_default = chain(en_core_web_trf, extract_spans(), reducer2)
    model = chain(
        concatenate(reduce_trainable, reduce_default),
        scorer,
    )
    model.set_ref("tok2vec", tok2vec)
    model.set_ref("tok2vec_trf", tok2vec_trf)
    model.set_ref("reducer1", reducer1)
    model.set_ref("reducer2", reducer2)
    model.set_ref("scorer", scorer)
    return model


@registry.layers("mean_max_reducer.v1.5")
def build_mean_max_reducer1(hidden_size: int,
                            dropout: float = 0.0,
                            depth: int = 1) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    return chain(
        concatenate(
            cast(Model[Ragged, Floats2d], reduce_last()),
            cast(Model[Ragged, Floats2d], reduce_first()),
            reduce_mean(),
            reduce_max(),
        ),
        clone(Maxout(nO=hidden_size, normalize=True, dropout=dropout), depth),
    )


# @registry.layers("mean_max_reducer.v2")
# def build_mean_max_reducer2(hidden_size: int,
#                             dropout: float = 0.0) -> Model[Ragged, Floats2d]:
#     """Reduce sequences by concatenating their mean and max pooled vectors,
#     and then combine the concatenated vectors with a hidden layer.
#     """
#     return chain(
#         concatenate(
#             cast(Model[Ragged, Floats2d], reduce_last()),
#             cast(Model[Ragged, Floats2d], reduce_first()),
#             reduce_mean(),
#             reduce_mean(),
#             reduce_max(),
#         ),
#         Maxout(nO=hidden_size, normalize=True, dropout=dropout),
#     )


@registry.layers("Gelu_mean_max_reducer.v1")
def build_mean_max_reducer_gelu(hidden_size: int,
                                dropout: float = 0.0,
                                depth: int = 1) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    gelu_unit = Gelu(nO=hidden_size, normalize=True, dropout=dropout)
    return chain(
        concatenate(
            cast(Model[Ragged, Floats2d], reduce_last()),
            cast(Model[Ragged, Floats2d], reduce_first()),
            reduce_mean(),
            reduce_max(),
        ),
        clone(gelu_unit, depth),
    )


@registry.layers("Mish_mean_max_reducer.v1")
def build_mean_max_reducer3(hidden_size: int,
                            dropout: float = 0.0,
                            depth: int = 4) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    mish_unit = Mish(nO=hidden_size, normalize=True, dropout=dropout)
    return chain(
        concatenate(
            cast(Model[Ragged, Floats2d], reduce_last()),
            cast(Model[Ragged, Floats2d], reduce_first()),
            reduce_mean(),
            reduce_max(),
        ),
        clone(mish_unit, depth),
    )


# NOTE: reuses the name build_mean_max_reducer3; registered separately as
# "Maxout_mean_max_reducer.v2".
@registry.layers("Maxout_mean_max_reducer.v2")
def build_mean_max_reducer3(hidden_size: int,
                            dropout: float = 0.0,
                            depth: int = 4) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    maxout_unit = Maxout(nO=hidden_size, normalize=True, dropout=dropout)
    return chain(
        concatenate(
            cast(Model[Ragged, Floats2d], reduce_last()),
            cast(Model[Ragged, Floats2d], reduce_first()),
            reduce_mean(),
            reduce_max(),
        ),
        clone(maxout_unit, depth),
    )


@registry.layers("mean_max_reducer.v2")
def build_mean_max_reducer2(hidden_size: int,
                            dropout: float = 0.0) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    return chain(
        concatenate(
            cast(Model[Ragged, Floats2d], reduce_last()),
            cast(Model[Ragged, Floats2d], reduce_first()),
            reduce_mean(),
            reduce_mean(),
            reduce_max(),
        ),
        Maxout(nO=hidden_size, normalize=True, dropout=dropout),
    )


@registry.layers("two_way_reducer.v1")
def build_two_way_reducer(hidden_size: int,
                          dropout: float = 0.0) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    default_reducer = concatenate(
        cast(Model[Ragged, Floats2d], reduce_last()),
        cast(Model[Ragged, Floats2d], reduce_first()),
        reduce_mean(),
        reduce_max(),
    )
    mean_sum_reducer = concatenate(reduce_mean(), reduce_sum())

    return concatenate(
        chain(default_reducer,
              Maxout(nO=hidden_size, normalize=True, dropout=dropout)),
        chain(mean_sum_reducer,
              Maxout(nO=hidden_size // 2, normalize=True, dropout=dropout)))


@registry.layers("Mish_two_way_reducer.v1")
def build_Mish_two_way_reducer(hidden_size: int,
                               dropout: float = 0.0,
                               depth: int = 1) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    default_reducer = concatenate(
        cast(Model[Ragged, Floats2d], reduce_last()),
        cast(Model[Ragged, Floats2d], reduce_first()),
        reduce_mean(),
        reduce_max(),
    )
    mean_sum_reducer = concatenate(reduce_mean(), reduce_sum())

    return concatenate(
        chain(
            default_reducer,
            clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
                  depth)),
        chain(
            mean_sum_reducer,
            clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
                  depth)))


@registry.layers("Mish_two_way_reducer.v2")
def build_Mish_two_way_reducer2(hidden_size: int,
                                dropout: float = 0.0,
                                depth: int = 1) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    default_reducer = concatenate(
        cast(Model[Ragged, Floats2d], reduce_last()),
        cast(Model[Ragged, Floats2d], reduce_first()),
        reduce_mean(),
        reduce_max(),
    )
    mean_sum_reducer = concatenate(
        cast(Model[Ragged, Floats2d], reduce_last()),
        cast(Model[Ragged, Floats2d], reduce_first()),
        reduce_mean(),
        reduce_sum(),
    )

    return concatenate(
        chain(
            default_reducer,
            clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
                  depth)),
        chain(
            mean_sum_reducer,
            clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
                  depth)))


@registry.layers("Mish_two_way_reducer.v3")
def build_Mish_two_way_reducer3(hidden_size: int,
                                dropout: float = 0.0,
                                depth: int = 1) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    default_reducer = concatenate(
        cast(Model[Ragged, Floats2d], reduce_last()),
        cast(Model[Ragged, Floats2d], reduce_first()),
        reduce_mean(),
        reduce_max(),
    )
    # Name kept from the source; this variant actually pools last/first/max.
    mean_sum_reducer = concatenate(
        cast(Model[Ragged, Floats2d], reduce_last()),
        cast(Model[Ragged, Floats2d], reduce_first()),
        reduce_max(),
    )

    return concatenate(
        chain(
            default_reducer,
            clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
                  depth)),
        chain(
            mean_sum_reducer,
            clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
                  depth)))


@registry.layers("three_way_reducer.v3")
def build_mean_max_reducer2(hidden_size: int,
                            dropout: float = 0.0,
                            depth: int = 2) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    default_reducer = concatenate(
        cast(Model[Ragged, Floats2d], reduce_last()),
        cast(Model[Ragged, Floats2d], reduce_first()),
        reduce_mean(),
        reduce_max(),
    )
    mean_sum_reducer = concatenate(
        reduce_mean(),
        reduce_sum())

    return concatenate(
        chain(default_reducer,
              Maxout(nO=hidden_size, normalize=True, dropout=dropout)),
        chain(mean_sum_reducer,
              Maxout(nO=hidden_size // 2, normalize=True, dropout=dropout)),
        chain(mean_sum_reducer,
              clone(Maxout(nO=hidden_size // 2, normalize=True, dropout=dropout), depth))
    )


@registry.layers("Maxout_three_way_reducer.v1")
def build_Maxout_three_way_reducer(hidden_size: int,
                                   dropout: float = 0.0,
                                   depth: int = 2) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    default_reducer = concatenate(
        cast(Model[Ragged, Floats2d], reduce_last()),
        cast(Model[Ragged, Floats2d], reduce_first()),
        reduce_mean(),
        reduce_max(),
    )
    mean_sum_reducer = concatenate(reduce_mean(), reduce_sum())

    return concatenate(
        chain(
            default_reducer,
            clone(Maxout(nO=hidden_size // 2, normalize=True, dropout=dropout),
                  depth)),
        chain(mean_sum_reducer,
              Maxout(nO=hidden_size // 4, normalize=True, dropout=dropout)),
        chain(
            mean_sum_reducer,
            clone(Maxout(nO=hidden_size // 4, normalize=True, dropout=dropout),
                  depth)))


@registry.layers("Mish_three_way_reducer.v1")
def build_Mish_three_way_reducer(hidden_size: int,
                                 dropout: float = 0.0,
                                 depth: int = 2) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    default_reducer = concatenate(
        cast(Model[Ragged, Floats2d], reduce_last()),
        cast(Model[Ragged, Floats2d], reduce_first()),
        reduce_mean(),
        reduce_max(),
    )
    mean_sum_reducer = concatenate(reduce_mean(), reduce_sum())

    return concatenate(
        chain(
            default_reducer,
            clone(Mish(nO=hidden_size // 2, normalize=True, dropout=dropout),
                  depth)),
        chain(mean_sum_reducer,
              Mish(nO=hidden_size // 4, normalize=True, dropout=dropout)),
        chain(
            mean_sum_reducer,
            clone(Mish(nO=hidden_size // 4, normalize=True, dropout=dropout),
                  depth)))


@registry.layers("mean_max_reducer.v4")
def build_mean_max_reducer3(hidden_size: int,
                            maxout_pieces: int = 3,
                            dropout: float = 0.0) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    hidden_size2 = int(hidden_size / 2)
    hidden_size3 = int(hidden_size / 2)
    return chain(
        concatenate(
            cast(Model[Ragged, Floats2d], reduce_last()),
            cast(Model[Ragged, Floats2d], reduce_first()),
            reduce_mean(),
            reduce_max(),
        ),
        Maxout(nO=hidden_size,
               nP=maxout_pieces,
               normalize=True,
               dropout=dropout),
        Maxout(nO=hidden_size2,
               nP=maxout_pieces,
               normalize=True,
               dropout=dropout),
        Maxout(nO=hidden_size3,
               nP=maxout_pieces,
               normalize=True,
               dropout=dropout))


@registry.layers("mean_max_reducer.v3.3")
def build_mean_max_reducer4(hidden_size: int,
                            depth: int) -> Model[Ragged, Floats2d]:
    """Reduce sequences by concatenating their mean and max pooled vectors,
    and then combine the concatenated vectors with a hidden layer.
    """
    hidden_size2 = int(hidden_size / 2)
    hidden_size3 = int(hidden_size / 2)
    return chain(
        concatenate(
            cast(Model[Ragged, Floats2d], reduce_last()),
            cast(Model[Ragged, Floats2d], reduce_first()),
            reduce_mean(),
            reduce_max(),
        ), Maxout(nO=hidden_size, nP=3, normalize=True, dropout=0.0),
        Maxout(nO=hidden_size2, nP=3, normalize=True, dropout=0.0),
        Maxout(nO=hidden_size3, nP=3, normalize=True, dropout=0.0))


# @registry.architectures("spacy.MaxoutWindowEncoder.v2")
# def MaxoutWindowEncoder(
#     width: int, window_size: int, maxout_pieces: int, depth: int
# ) -> Model[List[Floats2d], List[Floats2d]]:
#     """Encode context using convolutions with maxout activation, layer
#     normalization and residual connections.
#     width (int): The input and output width. These are required to be the same,
#         to allow residual connections. This value will be determined by the
#         width of the inputs. Recommended values are between 64 and 300.
#     window_size (int): The number of words to concatenate around each token
#         to construct the convolution. Recommended value is 1.
#     maxout_pieces (int): The number of maxout pieces to use. Recommended
#         values are 2 or 3.
#     depth (int): The number of convolutional layers. Recommended value is 4.
#     """
#     cnn = chain(
#         expand_window(window_size=window_size),
#         Maxout(
#             nO=width,
#             nI=width * ((window_size * 2) + 1),
#             nP=maxout_pieces,
#             dropout=0.0,
#             normalize=True,
#         ),
#     )
#     model = clone(residual(cnn), depth)
#     model.set_dim("nO", width)
#     receptive_field = window_size * depth
#     return with_array(model, pad=receptive_field)


# @registry.architectures("spacy.MishWindowEncoder.v2")
# def MishWindowEncoder(
#     width: int, window_size: int, depth: int
# ) -> Model[List[Floats2d], List[Floats2d]]:
#     """Encode context using convolutions with mish activation, layer
#     normalization and residual connections.
#     width (int): The input and output width. These are required to be the same,
#         to allow residual connections. This value will be determined by the
#         width of the inputs. Recommended values are between 64 and 300.
#     window_size (int): The number of words to concatenate around each token
#         to construct the convolution. Recommended value is 1.
#     depth (int): The number of convolutional layers. Recommended value is 4.
#     """
#     cnn = chain(
#         expand_window(window_size=window_size),
#         Mish(nO=width, nI=width * ((window_size * 2) + 1), dropout=0.0, normalize=True),
#     )
#     model = clone(residual(cnn), depth)
#     model.set_dim("nO", width)
#     return with_array(model)
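A hedged sketch of exercising one of these registered reducers in isolation; the shapes and sample values are arbitrary, and only the registry name `Mish_mean_max_reducer.v1` comes from the file above:

```python
import numpy
from thinc.types import Ragged
from spacy.util import registry

import custom_functions  # noqa: F401  (decorators register the layers at import time)

# Two "spans" of lengths 3 and 2, each token a 4-dimensional vector.
spans = Ragged(
    numpy.random.rand(5, 4).astype("float32"),
    numpy.asarray([3, 2], dtype="int32"),
)

build_reducer = registry.layers.get("Mish_mean_max_reducer.v1")
reducer = build_reducer(hidden_size=8, dropout=0.0, depth=1)
reducer.initialize(X=spans)      # infer the input width from the sample
pooled = reducer.predict(spans)  # one fixed-size row per span -> shape (2, 8)
print(pooled.shape)
```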
en_engagement_Dual_RoBERTa_acad3_f4-any-py3-none-any.whl
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:64e8d062bfbaedc9f2157b1315a1f7e963df4c6d376953ef34db131183618d61
size 930703767
lemmatizer/lookups/lookups.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:eb64f40c0f8396d1762730c0ddf4dad2a52d138f5a389f71a1a1d088173b7737
size 972893
meta.json
ADDED
@@ -0,0 +1,344 @@
{
  "lang":"en",
  "name":"engagement_Dual_RoBERTa_acad3_f4",
  "version":"1.0.0",
  "description":"",
  "author":"",
  "email":"",
  "url":"",
  "license":"",
  "spacy_version":">=3.4.4,<3.5.0",
  "spacy_git_version":"Unknown",
  "vectors":{
    "width":0,
    "vectors":0,
    "keys":0,
    "name":null
  },
  "labels":{
    "transformer":[

    ],
    "parser":[
      "ROOT",
      "acl",
      "acomp",
      "advcl",
      "advmod",
      "agent",
      "amod",
      "appos",
      "attr",
      "aux",
      "auxpass",
      "case",
      "cc",
      "ccomp",
      "compound",
      "conj",
      "csubj",
      "csubjpass",
      "dative",
      "dep",
      "det",
      "dobj",
      "expl",
      "intj",
      "mark",
      "meta",
      "neg",
      "nmod",
      "npadvmod",
      "nsubj",
      "nsubjpass",
      "nummod",
      "oprd",
      "parataxis",
      "pcomp",
      "pobj",
      "poss",
      "preconj",
      "predet",
      "prep",
      "prt",
      "punct",
      "quantmod",
      "relcl",
      "xcomp"
    ],
    "tagger":[
      "$",
      "''",
      ",",
      "-LRB-",
      "-RRB-",
      ".",
      ":",
      "ADD",
      "AFX",
      "CC",
      "CD",
      "DT",
      "EX",
      "FW",
      "HYPH",
      "IN",
      "JJ",
      "JJR",
      "JJS",
      "LS",
      "MD",
      "NFP",
      "NN",
      "NNP",
      "NNPS",
      "NNS",
      "PDT",
      "POS",
      "PRP",
      "PRP$",
      "RB",
      "RBR",
      "RBS",
      "RP",
      "SYM",
      "TO",
      "UH",
      "VB",
      "VBD",
      "VBG",
      "VBN",
      "VBP",
      "VBZ",
      "WDT",
      "WP",
      "WP$",
      "WRB",
      "XX",
      "``"
    ],
    "ner":[
      "CARDINAL",
      "DATE",
      "EVENT",
      "FAC",
      "GPE",
      "LANGUAGE",
      "LAW",
      "LOC",
      "MONEY",
      "NORP",
      "ORDINAL",
      "ORG",
      "PERCENT",
      "PERSON",
      "PRODUCT",
      "QUANTITY",
      "TIME",
      "WORK_OF_ART"
    ],
    "attribute_ruler":[

    ],
    "lemmatizer":[

    ],
    "trainable_transformer":[

    ],
    "spancat":[
      "MONOGLOSS",
      "ATTRIBUTION",
      "ENTERTAIN",
      "PROCLAIM",
      "JUSTIFYING",
      "SOURCES",
      "CITATION",
      "COUNTER",
      "DENY",
      "ENDOPHORIC"
    ]
  },
  "pipeline":[
    "transformer",
    "parser",
    "tagger",
    "ner",
    "attribute_ruler",
    "lemmatizer",
    "trainable_transformer",
    "spancat"
  ],
  "components":[
    "transformer",
    "parser",
    "tagger",
    "ner",
    "attribute_ruler",
    "lemmatizer",
    "trainable_transformer",
    "spancat"
  ],
  "disabled":[

  ],
  "performance":{
    "dep_uas":0.0,
    "dep_las":0.0,
    "dep_las_per_type":0.0,
    "sents_p":0.8073207931,
    "sents_r":0.8856664808,
    "sents_f":0.8446808511,
    "tag_acc":0.0,
    "ents_f":0.0,
    "ents_p":0.0,
    "ents_r":0.0,
    "ents_per_type":{
      "PERSON":{
        "p":0.0,
        "r":0.0,
        "f":0.0
      },
      "ATTRIBUTION":{
        "p":0.0,
        "r":0.0,
        "f":0.0
      },
      "ENTERTAIN":{
        "p":0.0,
        "r":0.0,
        "f":0.0
      },
      "PROCLAIM":{
        "p":0.0,
|
214 |
+
"r":0.0,
|
215 |
+
"f":0.0
|
216 |
+
},
|
217 |
+
"TIME":{
|
218 |
+
"p":0.0,
|
219 |
+
"r":0.0,
|
220 |
+
"f":0.0
|
221 |
+
},
|
222 |
+
"MONOGLOSS":{
|
223 |
+
"p":0.0,
|
224 |
+
"r":0.0,
|
225 |
+
"f":0.0
|
226 |
+
},
|
227 |
+
"CARDINAL":{
|
228 |
+
"p":0.0,
|
229 |
+
"r":0.0,
|
230 |
+
"f":0.0
|
231 |
+
},
|
232 |
+
"WORK_OF_ART":{
|
233 |
+
"p":0.0,
|
234 |
+
"r":0.0,
|
235 |
+
"f":0.0
|
236 |
+
},
|
237 |
+
"NORP":{
|
238 |
+
"p":0.0,
|
239 |
+
"r":0.0,
|
240 |
+
"f":0.0
|
241 |
+
},
|
242 |
+
"GPE":{
|
243 |
+
"p":0.0,
|
244 |
+
"r":0.0,
|
245 |
+
"f":0.0
|
246 |
+
},
|
247 |
+
"COUNTER":{
|
248 |
+
"p":0.0,
|
249 |
+
"r":0.0,
|
250 |
+
"f":0.0
|
251 |
+
},
|
252 |
+
"DATE":{
|
253 |
+
"p":0.0,
|
254 |
+
"r":0.0,
|
255 |
+
"f":0.0
|
256 |
+
},
|
257 |
+
"JUSTIFYING":{
|
258 |
+
"p":0.0,
|
259 |
+
"r":0.0,
|
260 |
+
"f":0.0
|
261 |
+
},
|
262 |
+
"SOURCES":{
|
263 |
+
"p":0.0,
|
264 |
+
"r":0.0,
|
265 |
+
"f":0.0
|
266 |
+
},
|
267 |
+
"QUANTITY":{
|
268 |
+
"p":0.0,
|
269 |
+
"r":0.0,
|
270 |
+
"f":0.0
|
271 |
+
},
|
272 |
+
"DENY":{
|
273 |
+
"p":0.0,
|
274 |
+
"r":0.0,
|
275 |
+
"f":0.0
|
276 |
+
},
|
277 |
+
"ORG":{
|
278 |
+
"p":0.0,
|
279 |
+
"r":0.0,
|
280 |
+
"f":0.0
|
281 |
+
},
|
282 |
+
"LANGUAGE":{
|
283 |
+
"p":0.0,
|
284 |
+
"r":0.0,
|
285 |
+
"f":0.0
|
286 |
+
},
|
287 |
+
"ORDINAL":{
|
288 |
+
"p":0.0,
|
289 |
+
"r":0.0,
|
290 |
+
"f":0.0
|
291 |
+
},
|
292 |
+
"PERCENT":{
|
293 |
+
"p":0.0,
|
294 |
+
"r":0.0,
|
295 |
+
"f":0.0
|
296 |
+
},
|
297 |
+
"FAC":{
|
298 |
+
"p":0.0,
|
299 |
+
"r":0.0,
|
300 |
+
"f":0.0
|
301 |
+
},
|
302 |
+
"LOC":{
|
303 |
+
"p":0.0,
|
304 |
+
"r":0.0,
|
305 |
+
"f":0.0
|
306 |
+
},
|
307 |
+
"PRODUCT":{
|
308 |
+
"p":0.0,
|
309 |
+
"r":0.0,
|
310 |
+
"f":0.0
|
311 |
+
},
|
312 |
+
"ENDOPHORIC":{
|
313 |
+
"p":0.0,
|
314 |
+
"r":0.0,
|
315 |
+
"f":0.0
|
316 |
+
},
|
317 |
+
"MONEY":{
|
318 |
+
"p":0.0,
|
319 |
+
"r":0.0,
|
320 |
+
"f":0.0
|
321 |
+
},
|
322 |
+
"EVENT":{
|
323 |
+
"p":0.0,
|
324 |
+
"r":0.0,
|
325 |
+
"f":0.0
|
326 |
+
},
|
327 |
+
"LAW":{
|
328 |
+
"p":0.0,
|
329 |
+
"r":0.0,
|
330 |
+
"f":0.0
|
331 |
+
}
|
332 |
+
},
|
333 |
+
"lemma_acc":0.0,
|
334 |
+
"spans_sc_f":0.7113899614,
|
335 |
+
"spans_sc_p":0.7173913043,
|
336 |
+
"spans_sc_r":0.705488194,
|
337 |
+
"trainable_transformer_loss":3.5909547597,
|
338 |
+
"spancat_loss":747.535664727
|
339 |
+
},
|
340 |
+
"requirements":[
|
341 |
+
"spacy-transformers>=1.1.8,<1.2.0",
|
342 |
+
"spacy-experimental>=0.6.1,<0.7.0"
|
343 |
+
]
|
344 |
+
}
|
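Given the pipeline order and the spancat label inventory above, a minimal usage sketch (assuming the wheel from this commit is installed; the sample sentence is illustrative, and `attrs["scores"]` is spaCy's general mechanism for exposing per-span scores, not something specific to this model):

```python
import spacy

# pip install en_engagement_Dual_RoBERTa_acad3_f4-any-py3-none-any.whl
nlp = spacy.load("en_engagement_Dual_RoBERTa_acad3_f4")

doc = nlp("Previous studies have arguably shown that this may not hold.")

# Engagement spans are written under the "sc" key (see spancat/cfg below),
# with one confidence score per predicted span.
spans = doc.spans["sc"]
for span, score in zip(spans, spans.attrs["scores"]):
    print(f"{span.text!r:40} {span.label_:12} {float(score):.3f}")
```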
ner/cfg
ADDED
@@ -0,0 +1,13 @@
{
  "moves":null,
  "update_with_oracle_cut_size":100,
  "multitasks":[],
  "min_action_freq":1,
  "learn_tokens":false,
  "beam_width":1,
  "beam_density":0.0,
  "beam_update_prob":0.0,
  "incorrect_spans_key":null
}
ner/model
ADDED
Binary file (314 kB)
ner/moves
ADDED
@@ -0,0 +1 @@
moves:
{"0":{},
 "1":{"ORG":56356,"DATE":40381,"PERSON":36475,"GPE":26716,"MONEY":15121,"CARDINAL":14096,"NORP":9638,"PERCENT":9182,"WORK_OF_ART":4475,"LOC":4047,"TIME":3670,"QUANTITY":3114,"FAC":3042,"EVENT":3015,"ORDINAL":2142,"PRODUCT":1782,"LAW":1620,"LANGUAGE":355},
 "2":{ ...same counts as "1"... },
 "3":{ ...same counts as "1"... },
 "4":{ ...same counts as "1"..., "":1 },
 "5":{"":1}}
cfg: {"neg_key": …}
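These are the serialized transition frequencies of the sourced NER component (the 0.0 NER metrics in meta.json most likely just mean it was not re-evaluated on the engagement data). Entities are read off `doc.ents` as usual; a quick sketch, reusing the `nlp` object loaded above:

```python
doc = nlp("The survey was run at Stanford University in 2019.")
for ent in doc.ents:
    print(ent.text, ent.label_)  # labels come from the NER inventory in meta.json
```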
parser/cfg
ADDED
@@ -0,0 +1,13 @@
{
  "moves":null,
  "update_with_oracle_cut_size":100,
  "multitasks":[],
  "min_action_freq":30,
  "learn_tokens":false,
  "beam_width":1,
  "beam_density":0.0,
  "beam_update_prob":0.0,
  "incorrect_spans_key":null
}
parser/model
ADDED
Binary file (640 kB)
parser/moves
ADDED
@@ -0,0 +1,2 @@
moves:
{"0":{"":994267},"1":{"":990803},
 "2":{"det":172595,"nsubj":165748,"compound":116623,"amod":105184,"aux":86667,"punct":65478,"advmod":62763,"poss":36443,"mark":27941,"nummod":22598,"auxpass":15594,"prep":14001,"nsubjpass":13856,"neg":12357,"cc":10739,"nmod":9562,"advcl":9062,"npadvmod":8168,"quantmod":7101,"intj":6464,"ccomp":5896,"dobj":3427,"expl":3360,"dep":2806,"predet":1944,"parataxis":1837,"csubj":1428,"preconj":621,"pobj||prep":616,"attr":578,"meta":376,"advmod||conj":368,"dobj||xcomp":352,"acomp":284,"nsubj||ccomp":224,"dative":206,"advmod||xcomp":149,"dobj||ccomp":70,"csubjpass":64,"dobj||conj":62,"prep||conj":51,"acl":48,"prep||nsubj":41,"prep||dobj":36,"xcomp":34,"advmod||ccomp":32,"oprd":31},
 "3":{"punct":183790,"pobj":182191,"prep":174008,"dobj":89615,"conj":59687,"cc":51930,"ccomp":30385,"advmod":22861,"xcomp":21021,"relcl":20969,"advcl":19828,"attr":17741,"acomp":16922,"appos":15265,"case":13388,"acl":12085,"pcomp":10324,"npadvmod":9796,"prt":8179,"agent":3903,"dative":3866,"nsubj":3470,"neg":2906,"amod":2839,"intj":2819,"nummod":2732,"oprd":2301,"dep":1487,"parataxis":1261,"quantmod":319,"nmod":294,"acl||dobj":200,"prep||dobj":190,"prep||nsubj":162,"acl||nsubj":159,"appos||nsubj":145,"relcl||dobj":134,"relcl||nsubj":111,"aux":103,"expl":96,"meta":92,"appos||dobj":86,"preconj":71,"csubj":65,"prep||nsubjpass":55,"prep||advmod":54,"prep||acomp":53,"det":51,"nsubjpass":45,"relcl||pobj":42,"acl||nsubjpass":42,"mark":40,"auxpass":39,"prep||pobj":36,"relcl||nsubjpass":32,"appos||nsubjpass":31},
 "4":{"ROOT":111664}}
cfg: {"neg_key": …}
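The dependency labels and counts above belong to the sourced parser; its output drives the sentence segmentation scored in meta.json (sents_f 0.845). A quick sketch of reading the parse, again reusing `nlp`:

```python
doc = nlp("Researchers claim that the effect is robust.")
for tok in doc:
    print(f"{tok.text:12} {tok.dep_:8} <- {tok.head.text}")
```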
spancat/cfg
ADDED
@@ -0,0 +1,17 @@
{
  "labels":[ "MONOGLOSS","ATTRIBUTION","ENTERTAIN","PROCLAIM","JUSTIFYING","SOURCES","CITATION","COUNTER","DENY","ENDOPHORIC" ],
  "spans_key":"sc",
  "threshold":0.5,
  "max_positive":null
}
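These settings can be inspected, and the decision threshold adjusted, on the loaded component. A sketch, assuming the same `nlp` object as above; `key`, `labels`, and `cfg` are standard `SpanCategorizer` attributes in spaCy 3.4, but treat this as unverified against this exact pin:

```python
spancat = nlp.get_pipe("spancat")

print(spancat.key)               # "sc": the doc.spans key the component writes to
print(spancat.labels)            # ("MONOGLOSS", "ATTRIBUTION", ...)
print(spancat.cfg["threshold"])  # 0.5: minimum score for a span to be kept

# Raising the threshold keeps only higher-confidence engagement spans;
# max_positive=null means a span may receive more than one label.
spancat.cfg["threshold"] = 0.7
```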
spancat/model
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:8e7912d854104fca7ed3bfdf829eb820a9d94f17843a909ee7f504538e26d029
size 6502651
tagger/cfg
ADDED
@@ -0,0 +1,55 @@
{
  "labels":[ "$","''",",","-LRB-","-RRB-",".",":","ADD","AFX","CC","CD","DT","EX","FW","HYPH","IN","JJ","JJR","JJS","LS","MD","NFP","NN","NNP","NNPS","NNS","PDT","POS","PRP","PRP$","RB","RBR","RBS","RP","SYM","TO","UH","VB","VBD","VBG","VBN","VBP","VBZ","WDT","WP","WP$","WRB","XX","``" ],
  "neg_prefix":"!",
  "overwrite":false
}
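The label set above is the Penn Treebank (XPOS) tag inventory; combined with the attribute_ruler and lookup lemmatizer listed in meta.json, each token gets a fine-grained tag, a coarse POS, and a lemma. A quick sketch, reusing `nlp`:

```python
doc = nlp("The spans were annotated by two raters.")
for tok in doc:
    print(f"{tok.text:10} {tok.tag_:4} {tok.pos_:5} {tok.lemma_}")
```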
tagger/model
ADDED
Binary file (151 kB)
tokenizer
ADDED
@@ -0,0 +1,3 @@
[Serialized tokenizer (msgpack, shown here as a placeholder): prefix_search, suffix_search, and infix_finditer punctuation/symbol regexes, token_match/url_match patterns, and the standard English tokenizer exception table (contractions such as "don't"/"won't", abbreviations such as "Dr."/"Jan.", a.m./p.m. forms, and emoticons).]
C�have�You’ll��A�YouC�you�A�’llC�will�You’ll’ve��A�YouC�you�A�’llC�will�A�’veC�have�You’re��A�YouC�you�A�’reC�are�You’ve��A�YouC�you�A�’veC�have�[-:��A�[-:�[:��A�[:�[=��A�[=�\")��A�\")�\n��A�\n�\t��A�\t�]=��A�]=�^_^��A�^_^�^__^��A�^__^�^___^��A�^___^�a.��A�a.�a.m.��A�a.m.�ain't��A�ai�A�n'tC�not�aint��A�ai�A�ntC�not�ain’t��A�ai�A�n’tC�not�and/or��A�and/orC�and/or�aren't��A�areC�are�A�n'tC�not�arent��A�areC�are�A�ntC�not�aren’t��A�areC�are�A�n’tC�not�b.��A�b.�c'mon��A�c'mC�come�A�on�c.��A�c.�can't��A�caC�can�A�n'tC�not�can't've��A�caC�can�A�n'tC�not�A�'veC�have�cannot��A�can�A�not�cant��A�caC�can�A�ntC�not�cantve��A�caC�can�A�ntC�not�A�veC�have�can’t��A�caC�can�A�n’tC�not�can’t’ve��A�caC�can�A�n’tC�not�A�’veC�have�co.��A�co.�could've��A�couldC�could�A�'ve�couldn't��A�couldC�could�A�n'tC�not�couldn't've��A�couldC�could�A�n'tC�not�A�'veC�have�couldnt��A�couldC�could�A�ntC�not�couldntve��A�couldC�could�A�ntC�not�A�veC�have�couldn’t��A�couldC�could�A�n’tC�not�couldn’t’ve��A�couldC�could�A�n’tC�not�A�’veC�have�couldve��A�couldC�could�A�ve�could’ve��A�couldC�could�A�’ve�c’mon��A�c’mC�come�A�on�d.��A�d.�daren't��A�dareC�dare�A�n'tC�not�darent��A�dareC�dare�A�ntC�not�daren’t��A�dareC�dare�A�n’tC�not�didn't��A�didC�do�A�n'tC�not�didn't've��A�didC�do�A�n'tC�not�A�'veC�have�didnt��A�didC�do�A�ntC�not�didntve��A�didC�do�A�ntC�not�A�veC�have�didn’t��A�didC�do�A�n’tC�not�didn’t’ve��A�didC�do�A�n’tC�not�A�’veC�have�doesn't��A�doesC�does�A�n'tC�not�doesn't've��A�doesC�does�A�n'tC�not�A�'veC�have�doesnt��A�doesC�does�A�ntC�not�doesntve��A�doesC�does�A�ntC�not�A�veC�have�doesn’t��A�doesC�does�A�n’tC�not�doesn’t’ve��A�doesC�does�A�n’tC�not�A�’veC�have�doin��A�doinC�doing�doin'��A�doin'C�doing�doin’��A�doin’C�doing�don't��A�doC�do�A�n'tC�not�don't've��A�doC�do�A�n'tC�not�A�'veC�have�dont��A�doC�do�A�ntC�not�dontve��A�doC�do�A�ntC�not�A�veC�have�don’t��A�doC�do�A�n’tC�not�don’t’ve��A�doC�do�A�n’tC�not�A�’veC�have�e.��A�e.�e.g.��A�e.g.�em��A�emC�them�f.��A�f.�g.��A�g.�goin��A�goinC�going�goin'��A�goin'C�going�goin’��A�goin’C�going�gonna��A�gonC�going�A�naC�to�gotta��A�got�A�taC�to�h.��A�h.�hadn't��A�hadC�have�A�n'tC�not�hadn't've��A�hadC�have�A�n'tC�not�A�'veC�have�hadnt��A�hadC�have�A�ntC�not�hadntve��A�hadC�have�A�ntC�not�A�veC�have�hadn’t��A�hadC�have�A�n’tC�not�hadn’t’ve��A�hadC�have�A�n’tC�not�A�’veC�have�hasn't��A�hasC�has�A�n'tC�not�hasnt��A�hasC�has�A�ntC�not�hasn’t��A�hasC�has�A�n’tC�not�haven't��A�haveC�have�A�n'tC�not�havent��A�haveC�have�A�ntC�not�haven’t��A�haveC�have�A�n’tC�not�havin��A�havinC�having�havin'��A�havin'C�having�havin’��A�havin’C�having�he'd��A�heC�he�A�'dC�'d�he'd've��A�heC�he�A�'dC�would�A�'veC�have�he'll��A�heC�he�A�'llC�will�he'll've��A�heC�he�A�'llC�will�A�'veC�have�he's��A�heC�he�A�'sC�'s�hed��A�heC�he�A�dC�'d�hedve��A�heC�he�A�dC�would�A�veC�have�hellve��A�heC�he�A�llC�will�A�veC�have�hes��A�heC�he�A�s�he’d��A�heC�he�A�’dC�'d�he’d’ve��A�heC�he�A�’dC�would�A�’veC�have�he’ll��A�heC�he�A�’llC�will�he’ll’ve��A�heC�he�A�’llC�will�A�’veC�have�he’s��A�heC�he�A�’sC�'s�how'd��A�howC�how�A�'dC�'d�how'd've��A�howC�how�A�'dC�would�A�'veC�have�how'd'y��A�how�A�'d�A�'yC�you�how'll��A�howC�how�A�'llC�will�how'll've��A�howC�how�A�'llC�will�A�'veC�have�how're��A�howC�how�A�'reC�are�how's��A�howC�how�A�'sC�'s�how've��A�howC�how�A�'ve�howd��A�howC�how�A�dC�'d�howdve��A�howC�how�A�dC�would�A�veC�have�howll��A�howC�how�A�llC�will�howllve��A�howC�how�A�llC�will�A�veC�have�howre��A�howC�how�A�reC�are�hows��A�howC�how�A�s�howve��A�how�A�veC�have�how’d��A�howC�how�A�’dC�'d�how’d’ve��A�howC�how�A�’dC�would
�A�’veC�have�how’d’y��A�how�A�’d�A�’yC�you�how’ll��A�howC�how�A�’llC�will�how’ll’ve��A�howC�how�A�’llC�will�A�’veC�have�how’re��A�howC�how�A�’reC�are�how’s��A�howC�how�A�’sC�'s�how’ve��A�howC�how�A�’ve�i'd��A�iC�i�A�'dC�'d�i'd've��A�iC�i�A�'dC�would�A�'veC�have�i'll��A�iC�i�A�'llC�will�i'll've��A�iC�i�A�'llC�will�A�'veC�have�i'm��A�iC�i�A�'mC�am�i'ma��A�iC�i�A�'mC�am�A�aC�gonna�i've��A�iC�i�A�'veC�have�i.��A�i.�i.e.��A�i.e.�id��A�iC�i�A�dC�'d�idve��A�iC�i�A�dC�would�A�veC�have�illve��A�iC�i�A�llC�will�A�veC�have�im��A�iC�i�A�m�ima��A�iC�i�A�mC�am�A�aC�gonna�isn't��A�isC�is�A�n'tC�not�isnt��A�isC�is�A�ntC�not�isn’t��A�isC�is�A�n’tC�not�it'd��A�itC�it�A�'dC�'d�it'd've��A�itC�it�A�'dC�would�A�'veC�have�it'll��A�itC�it�A�'llC�will�it'll've��A�itC�it�A�'llC�will�A�'veC�have�it's��A�itC�it�A�'sC�'s�itd��A�itC�it�A�dC�'d�itdve��A�itC�it�A�dC�would�A�veC�have�itll��A�itC�it�A�llC�will�itllve��A�itC�it�A�llC�will�A�veC�have�it’d��A�itC�it�A�’dC�'d�it’d’ve��A�itC�it�A�’dC�would�A�’veC�have�it’ll��A�itC�it�A�’llC�will�it’ll’ve��A�itC�it�A�’llC�will�A�’veC�have�it’s��A�itC�it�A�’sC�'s�ive��A�iC�i�A�veC�have�i’d��A�iC�i�A�’dC�'d�i’d’ve��A�iC�i�A�’dC�would�A�’veC�have�i’ll��A�iC�i�A�’llC�will�i’ll’ve��A�iC�i�A�’llC�will�A�’veC�have�i’m��A�iC�i�A�’mC�am�i’ma��A�iC�i�A�’mC�am�A�aC�gonna�i’ve��A�iC�i�A�’veC�have�j.��A�j.�k.��A�k.�l.��A�l.�let's��A�let�A�'sC�us�let’s��A�let�A�’sC�us�ll��A�llC�will�lovin��A�lovinC�loving�lovin'��A�lovin'C�loving�lovin’��A�lovin’C�loving�m.��A�m.�ma'am��A�ma'amC�madam�mayn't��A�mayC�may�A�n'tC�not�mayn't've��A�mayC�may�A�n'tC�not�A�'veC�have�maynt��A�mayC�may�A�ntC�not�mayntve��A�mayC�may�A�ntC�not�A�veC�have�mayn’t��A�mayC�may�A�n’tC�not�mayn’t’ve��A�mayC�may�A�n’tC�not�A�’veC�have�ma’am��A�ma’amC�madam�might've��A�mightC�might�A�'ve�mightn't��A�mightC�might�A�n'tC�not�mightn't've��A�mightC�might�A�n'tC�not�A�'veC�have�mightnt��A�mightC�might�A�ntC�not�mightntve��A�mightC�might�A�ntC�not�A�veC�have�mightn’t��A�mightC�might�A�n’tC�not�mightn’t’ve��A�mightC�might�A�n’tC�not�A�’veC�have�mightve��A�mightC�might�A�ve�might’ve��A�mightC�might�A�’ve�must've��A�mustC�must�A�'ve�mustn't��A�mustC�must�A�n'tC�not�mustn't've��A�mustC�must�A�n'tC�not�A�'veC�have�mustnt��A�mustC�must�A�ntC�not�mustntve��A�mustC�must�A�ntC�not�A�veC�have�mustn’t��A�mustC�must�A�n’tC�not�mustn’t’ve��A�mustC�must�A�n’tC�not�A�’veC�have�mustve��A�mustC�must�A�ve�must’ve��A�mustC�must�A�’ve�n.��A�n.�needn't��A�needC�need�A�n'tC�not�needn't've��A�needC�need�A�n'tC�not�A�'veC�have�neednt��A�needC�need�A�ntC�not�needntve��A�needC�need�A�ntC�not�A�veC�have�needn’t��A�needC�need�A�n’tC�not�needn’t’ve��A�needC�need�A�n’tC�not�A�’veC�have�not've��A�not�A�'veC�have�nothin��A�nothinC�nothing�nothin'��A�nothin'C�nothing�nothin’��A�nothin’C�nothing�notve��A�not�A�veC�have�not’ve��A�not�A�’veC�have�nuff��A�nuffC�enough�nuthin��A�nuthinC�nothing�nuthin'��A�nuthin'C�nothing�nuthin’��A�nuthin’C�nothing�o'clock��A�o'clockC�o'clock�o.��A�o.�o.0��A�o.0�o.O��A�o.O�o.o��A�o.o�o_0��A�o_0�o_O��A�o_O�o_o��A�o_o�ol��A�olC�old�ol'��A�ol'C�old�ol’��A�ol’C�old�oughtn't��A�oughtC�ought�A�n'tC�not�oughtn't've��A�oughtC�ought�A�n'tC�not�A�'veC�have�oughtnt��A�oughtC�ought�A�ntC�not�oughtntve��A�oughtC�ought�A�ntC�not�A�veC�have�oughtn’t��A�oughtC�ought�A�n’tC�not�oughtn’t’ve��A�oughtC�ought�A�n’tC�not�A�’veC�have�o’clock��A�o’clockC�o'clock�p.��A�p.�p.m.��A�p.m.�q.��A�q.�r.��A�r.�s.��A�s.�shan't��A�shaC�shall�A�n'tC�not�shan't've��A�shaC�shall�A�n'tC�not�A�'veC�have�shant��A�shaC�shall�A�ntC�not�shantve��A�shaC�shall�A�ntC�not�A�veC�have�sh
an’t��A�shaC�shall�A�n’tC�not�shan’t’ve��A�shaC�shall�A�n’tC�not�A�’veC�have�she'd��A�sheC�she�A�'dC�'d�she'd've��A�sheC�she�A�'dC�would�A�'veC�have�she'll��A�sheC�she�A�'llC�will�she'll've��A�sheC�she�A�'llC�will�A�'veC�have�she's��A�sheC�she�A�'sC�'s�shedve��A�sheC�she�A�dC�would�A�veC�have�shellve��A�sheC�she�A�llC�will�A�veC�have�shes��A�sheC�she�A�s�she’d��A�sheC�she�A�’dC�'d�she’d’ve��A�sheC�she�A�’dC�would�A�’veC�have�she’ll��A�sheC�she�A�’llC�will�she’ll’ve��A�sheC�she�A�’llC�will�A�’veC�have�she’s��A�sheC�she�A�’sC�'s�should've��A�shouldC�should�A�'ve�shouldn't��A�shouldC�should�A�n'tC�not�shouldn't've��A�shouldC�should�A�n'tC�not�A�'veC�have�shouldnt��A�shouldC�should�A�ntC�not�shouldntve��A�shouldC�should�A�ntC�not�A�veC�have�shouldn’t��A�shouldC�should�A�n’tC�not�shouldn’t’ve��A�shouldC�should�A�n’tC�not�A�’veC�have�shouldve��A�shouldC�should�A�ve�should’ve��A�shouldC�should�A�’ve�somethin��A�somethinC�something�somethin'��A�somethin'C�something�somethin’��A�somethin’C�something�t.��A�t.�that'd��A�thatC�that�A�'dC�'d�that'd've��A�thatC�that�A�'dC�would�A�'veC�have�that'll��A�thatC�that�A�'llC�will�that'll've��A�thatC�that�A�'llC�will�A�'veC�have�that's��A�thatC�that�A�'sC�'s�thatd��A�thatC�that�A�dC�'d�thatdve��A�thatC�that�A�dC�would�A�veC�have�thatll��A�thatC�that�A�llC�will�thatllve��A�thatC�that�A�llC�will�A�veC�have�thats��A�thatC�that�A�s�that’d��A�thatC�that�A�’dC�'d�that’d’ve��A�thatC�that�A�’dC�would�A�’veC�have�that’ll��A�thatC�that�A�’llC�will�that’ll’ve��A�thatC�that�A�’llC�will�A�’veC�have�that’s��A�thatC�that�A�’sC�'s�there'd��A�thereC�there�A�'dC�'d�there'd've��A�thereC�there�A�'dC�would�A�'veC�have�there'll��A�thereC�there�A�'llC�will�there'll've��A�thereC�there�A�'llC�will�A�'veC�have�there're��A�thereC�there�A�'reC�are�there's��A�thereC�there�A�'sC�'s�there've��A�thereC�there�A�'ve�thered��A�thereC�there�A�dC�'d�theredve��A�thereC�there�A�dC�would�A�veC�have�therell��A�thereC�there�A�llC�will�therellve��A�thereC�there�A�llC�will�A�veC�have�therere��A�thereC�there�A�reC�are�theres��A�thereC�there�A�s�thereve��A�there�A�veC�have�there’d��A�thereC�there�A�’dC�'d�there’d’ve��A�thereC�there�A�’dC�would�A�’veC�have�there’ll��A�thereC�there�A�’llC�will�there’ll’ve��A�thereC�there�A�’llC�will�A�’veC�have�there’re��A�thereC�there�A�’reC�are�there’s��A�thereC�there�A�’sC�'s�there’ve��A�thereC�there�A�’ve�these'd��A�theseC�these�A�'dC�'d�these'd've��A�theseC�these�A�'dC�would�A�'veC�have�these'll��A�theseC�these�A�'llC�will�these'll've��A�theseC�these�A�'llC�will�A�'veC�have�these're��A�theseC�these�A�'reC�are�these've��A�theseC�these�A�'ve�thesed��A�theseC�these�A�dC�'d�thesedve��A�theseC�these�A�dC�would�A�veC�have�thesell��A�theseC�these�A�llC�will�thesellve��A�theseC�these�A�llC�will�A�veC�have�thesere��A�theseC�these�A�reC�are�theseve��A�these�A�veC�have�these’d��A�theseC�these�A�’dC�'d�these’d’ve��A�theseC�these�A�’dC�would�A�’veC�have�these’ll��A�theseC�these�A�’llC�will�these’ll’ve��A�theseC�these�A�’llC�will�A�’veC�have�these’re��A�theseC�these�A�’reC�are�these’ve��A�theseC�these�A�’ve�they'd��A�theyC�they�A�'dC�'d�they'd've��A�theyC�they�A�'dC�would�A�'veC�have�they'll��A�theyC�they�A�'llC�will�they'll've��A�theyC�they�A�'llC�will�A�'veC�have�they're��A�theyC�they�A�'reC�are�they've��A�theyC�they�A�'veC�have�theyd��A�theyC�they�A�dC�'d�theydve��A�theyC�they�A�dC�would�A�veC�have�theyll��A�theyC�they�A�llC�will�theyllve��A�theyC�they�A�llC�will�A�veC�have�theyre��A�theyC�they�A�reC�are�theyve��A�theyC�they�A�veC�have�they’d��A�theyC�they�A�’dC�'d�they’d’ve��A�the
yC�they�A�’dC�would�A�’veC�have�they’ll��A�theyC�they�A�’llC�will�they’ll’ve��A�theyC�they�A�’llC�will�A�’veC�have�they’re��A�theyC�they�A�’reC�are�they’ve��A�theyC�they�A�’veC�have�this'd��A�thisC�this�A�'dC�'d�this'd've��A�thisC�this�A�'dC�would�A�'veC�have�this'll��A�thisC�this�A�'llC�will�this'll've��A�thisC�this�A�'llC�will�A�'veC�have�this's��A�thisC�this�A�'sC�'s�thisd��A�thisC�this�A�dC�'d�thisdve��A�thisC�this�A�dC�would�A�veC�have�thisll��A�thisC�this�A�llC�will�thisllve��A�thisC�this�A�llC�will�A�veC�have�thiss��A�thisC�this�A�s�this’d��A�thisC�this�A�’dC�'d�this’d’ve��A�thisC�this�A�’dC�would�A�’veC�have�this’ll��A�thisC�this�A�’llC�will�this’ll’ve��A�thisC�this�A�’llC�will�A�’veC�have�this’s��A�thisC�this�A�’sC�'s�those'd��A�thoseC�those�A�'dC�'d�those'd've��A�thoseC�those�A�'dC�would�A�'veC�have�those'll��A�thoseC�those�A�'llC�will�those'll've��A�thoseC�those�A�'llC�will�A�'veC�have�those're��A�thoseC�those�A�'reC�are�those've��A�thoseC�those�A�'ve�thosed��A�thoseC�those�A�dC�'d�thosedve��A�thoseC�those�A�dC�would�A�veC�have�thosell��A�thoseC�those�A�llC�will�thosellve��A�thoseC�those�A�llC�will�A�veC�have�thosere��A�thoseC�those�A�reC�are�thoseve��A�those�A�veC�have�those’d��A�thoseC�those�A�’dC�'d�those’d’ve��A�thoseC�those�A�’dC�would�A�’veC�have�those’ll��A�thoseC�those�A�’llC�will�those’ll’ve��A�thoseC�those�A�’llC�will�A�’veC�have�those’re��A�thoseC�those�A�’reC�are�those’ve��A�thoseC�those�A�’ve�u.��A�u.�v.��A�v.�v.s.��A�v.s.�v.v��A�v.v�v_v��A�v_v�vs.��A�vs.�w.��A�w.�w/o��A�w/oC�without�wasn't��A�wasC�was�A�n'tC�not�wasnt��A�wasC�was�A�ntC�not�wasn’t��A�wasC�was�A�n’tC�not�we'd��A�weC�we�A�'dC�'d�we'd've��A�weC�we�A�'dC�would�A�'veC�have�we'll��A�weC�we�A�'llC�will�we'll've��A�weC�we�A�'llC�will�A�'veC�have�we're��A�weC�we�A�'reC�are�we've��A�weC�we�A�'veC�have�wed��A�weC�we�A�dC�'d�wedve��A�weC�we�A�dC�would�A�veC�have�wellve��A�weC�we�A�llC�will�A�veC�have�weren't��A�wereC�were�A�n'tC�not�werent��A�wereC�were�A�ntC�not�weren’t��A�wereC�were�A�n’tC�not�weve��A�weC�we�A�veC�have�we’d��A�weC�we�A�’dC�'d�we’d’ve��A�weC�we�A�’dC�would�A�’veC�have�we’ll��A�weC�we�A�’llC�will�we’ll’ve��A�weC�we�A�’llC�will�A�’veC�have�we’re��A�weC�we�A�’reC�are�we’ve��A�weC�we�A�’veC�have�what'd��A�whatC�what�A�'dC�'d�what'd've��A�whatC�what�A�'dC�would�A�'veC�have�what'll��A�whatC�what�A�'llC�will�what'll've��A�whatC�what�A�'llC�will�A�'veC�have�what're��A�whatC�what�A�'reC�are�what's��A�whatC�what�A�'sC�'s�what've��A�whatC�what�A�'ve�whatd��A�whatC�what�A�dC�'d�whatdve��A�whatC�what�A�dC�would�A�veC�have�whatll��A�whatC�what�A�llC�will�whatllve��A�whatC�what�A�llC�will�A�veC�have�whatre��A�whatC�what�A�reC�are�whats��A�whatC�what�A�s�whatve��A�what�A�veC�have�what’d��A�whatC�what�A�’dC�'d�what’d’ve��A�whatC�what�A�’dC�would�A�’veC�have�what’ll��A�whatC�what�A�’llC�will�what’ll’ve��A�whatC�what�A�’llC�will�A�’veC�have�what’re��A�whatC�what�A�’reC�are�what’s��A�whatC�what�A�’sC�'s�what’ve��A�whatC�what�A�’ve�when'd��A�whenC�when�A�'dC�'d�when'd've��A�whenC�when�A�'dC�would�A�'veC�have�when'll��A�whenC�when�A�'llC�will�when'll've��A�whenC�when�A�'llC�will�A�'veC�have�when're��A�whenC�when�A�'reC�are�when's��A�whenC�when�A�'sC�'s�when've��A�whenC�when�A�'ve�whend��A�whenC�when�A�dC�'d�whendve��A�whenC�when�A�dC�would�A�veC�have�whenll��A�whenC�when�A�llC�will�whenllve��A�whenC�when�A�llC�will�A�veC�have�whenre��A�whenC�when�A�reC�are�whens��A�whenC�when�A�s�whenve��A�when�A�veC�have�when’d��A�whenC�when�A�’dC�'d�when’d’ve��A�whenC�when�A�’dC�would�A�’veC�have�when’ll��A�whenC�when�A�’llC�will�
when’ll’ve��A�whenC�when�A�’llC�will�A�’veC�have�when’re��A�whenC�when�A�’reC�are�when’s��A�whenC�when�A�’sC�'s�when’ve��A�whenC�when�A�’ve�where'd��A�whereC�where�A�'dC�'d�where'd've��A�whereC�where�A�'dC�would�A�'veC�have�where'll��A�whereC�where�A�'llC�will�where'll've��A�whereC�where�A�'llC�will�A�'veC�have�where're��A�whereC�where�A�'reC�are�where's��A�whereC�where�A�'sC�'s�where've��A�whereC�where�A�'ve�whered��A�whereC�where�A�dC�'d�wheredve��A�whereC�where�A�dC�would�A�veC�have�wherell��A�whereC�where�A�llC�will�wherellve��A�whereC�where�A�llC�will�A�veC�have�wherere��A�whereC�where�A�reC�are�wheres��A�whereC�where�A�s�whereve��A�where�A�veC�have�where’d��A�whereC�where�A�’dC�'d�where’d’ve��A�whereC�where�A�’dC�would�A�’veC�have�where’ll��A�whereC�where�A�’llC�will�where’ll’ve��A�whereC�where�A�’llC�will�A�’veC�have�where’re��A�whereC�where�A�’reC�are�where’s��A�whereC�where�A�’sC�'s�where’ve��A�whereC�where�A�’ve�who'd��A�whoC�who�A�'dC�'d�who'd've��A�whoC�who�A�'dC�would�A�'veC�have�who'll��A�whoC�who�A�'llC�will�who'll've��A�whoC�who�A�'llC�will�A�'veC�have�who're��A�whoC�who�A�'reC�are�who's��A�whoC�who�A�'sC�'s�who've��A�whoC�who�A�'ve�whod��A�whoC�who�A�dC�'d�whodve��A�whoC�who�A�dC�would�A�veC�have�wholl��A�whoC�who�A�llC�will�whollve��A�whoC�who�A�llC�will�A�veC�have�whos��A�whoC�who�A�s�whove��A�who�A�veC�have�who’d��A�whoC�who�A�’dC�'d�who’d’ve��A�whoC�who�A�’dC�would�A�’veC�have�who’ll��A�whoC�who�A�’llC�will�who’ll’ve��A�whoC�who�A�’llC�will�A�’veC�have�who’re��A�whoC�who�A�’reC�are�who’s��A�whoC�who�A�’sC�'s�who’ve��A�whoC�who�A�’ve�why'd��A�whyC�why�A�'dC�'d�why'd've��A�whyC�why�A�'dC�would�A�'veC�have�why'll��A�whyC�why�A�'llC�will�why'll've��A�whyC�why�A�'llC�will�A�'veC�have�why're��A�whyC�why�A�'reC�are�why's��A�whyC�why�A�'sC�'s�why've��A�whyC�why�A�'ve�whyd��A�whyC�why�A�dC�'d�whydve��A�whyC�why�A�dC�would�A�veC�have�whyll��A�whyC�why�A�llC�will�whyllve��A�whyC�why�A�llC�will�A�veC�have�whyre��A�whyC�why�A�reC�are�whys��A�whyC�why�A�s�whyve��A�why�A�veC�have�why’d��A�whyC�why�A�’dC�'d�why’d’ve��A�whyC�why�A�’dC�would�A�’veC�have�why’ll��A�whyC�why�A�’llC�will�why’ll’ve��A�whyC�why�A�’llC�will�A�’veC�have�why’re��A�whyC�why�A�’reC�are�why’s��A�whyC�why�A�’sC�'s�why’ve��A�whyC�why�A�’ve�won't��A�woC�will�A�n'tC�not�won't've��A�woC�will�A�n'tC�not�A�'veC�have�wont��A�woC�will�A�ntC�not�wontve��A�woC�will�A�ntC�not�A�veC�have�won’t��A�woC�will�A�n’tC�not�won’t’ve��A�woC�will�A�n’tC�not�A�’veC�have�would've��A�wouldC�would�A�'ve�wouldn't��A�wouldC�would�A�n'tC�not�wouldn't've��A�wouldC�would�A�n'tC�not�A�'veC�have�wouldnt��A�wouldC�would�A�ntC�not�wouldntve��A�wouldC�would�A�ntC�not�A�veC�have�wouldn’t��A�wouldC�would�A�n’tC�not�wouldn’t’ve��A�wouldC�would�A�n’tC�not�A�’veC�have�wouldve��A�wouldC�would�A�ve�would’ve��A�wouldC�would�A�’ve�x.��A�x.�xD��A�xD�xDD��A�xDD�y'all��A�y'C�you�A�all�y.��A�y.�yall��A�yC�you�A�all�you'd��A�youC�you�A�'dC�'d�you'd've��A�youC�you�A�'dC�would�A�'veC�have�you'll��A�youC�you�A�'llC�will�you'll've��A�youC�you�A�'llC�will�A�'veC�have�you're��A�youC�you�A�'reC�are�you've��A�youC�you�A�'veC�have�youd��A�youC�you�A�dC�'d�youdve��A�youC�you�A�dC�would�A�veC�have�youll��A�youC�you�A�llC�will�youllve��A�youC�you�A�llC�will�A�veC�have�youre��A�youC�you�A�reC�are�youve��A�youC�you�A�veC�have�you’d��A�youC�you�A�’dC�'d�you’d’ve��A�youC�you�A�’dC�would�A�’veC�have�you’ll��A�youC�you�A�’llC�will�you’ll’ve��A�youC�you�A�’llC�will�A�’veC�have�you’re��A�youC�you�A�’reC�are�you’ve��A�youC�you�A�’veC�have�y’all��A�y’C�you�A�all�z.��A�z.� ��A� C� 
�¯\(ツ)/¯��A�¯\(ツ)/¯�°C.��A�°�A�C�A�.�°F.��A�°�A�F�A�.�°K.��A�°�A�K�A�.�°c.��A�°�A�c�A�.�°f.��A�°�A�f�A�.�°k.��A�°�A�k�A�.�ä.��A�ä.�ö.��A�ö.�ü.��A�ü.�ಠ_ಠ��A�ಠ_ಠ�ಠ︵ಠ��A�ಠ︵ಠ�—��A�—�‘S��A�‘SC�'s�‘s��A�‘sC�'s�’��A�’�’Cause��A�’CauseC�because�’Cos��A�’CosC�because�’Coz��A�’CozC�because�’Cuz��A�’CuzC�because�’S��A�’SC�'s�’bout��A�’boutC�about�’cause��A�’causeC�because�’cos��A�’cosC�because�’coz��A�’cozC�because�’cuz��A�’cuzC�because�’d��A�’d�’em��A�’emC�them�’ll��A�’llC�will�’nuff��A�’nuffC�enough�’re��A�’reC�are�’s��A�’sC�'s�’’��A�’’�faster_heuristics�
|
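As a reading aid: the blob above encodes the tokenizer's special-case rules. Below is a minimal sketch of how the same rules surface through spaCy's public API — it assumes the packaged wheel from this repo is installed, and the `"b/c"` rule at the end is a hypothetical example, not part of this pipeline:

```python
# Hedged sketch: inspect the contraction rules the tokenizer blob encodes.
import spacy
from spacy.attrs import ORTH, NORM

nlp = spacy.load("en_engagement_Dual_RoBERTa_acad3_f4")

# Contractions are split into sub-tokens; NORM carries the expansion,
# e.g. "They've" -> ["They", "'ve"] with norms ["they", "have"].
doc = nlp("They've said they'd help.")
print([(t.text, t.norm_) for t in doc])

# The same mechanism accepts new special cases at runtime
# ("b/c" is a hypothetical addition for illustration only):
nlp.tokenizer.add_special_case("b/c", [{ORTH: "b/c", NORM: "because"}])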
trainable_transformer/cfg
ADDED
@@ -0,0 +1,3 @@
+{
+"max_batch_items":4096
+}
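This cfg is plain JSON consumed by the transformer component; in spacy-transformers, `max_batch_items` caps the size of a padded batch. A small sketch of reading it, assuming the file has been checked out locally:

```python
# Hedged sketch: the cfg files are plain JSON, readable with srsly
# (the serialization helper spaCy itself uses).
import srsly

cfg = srsly.read_json("trainable_transformer/cfg")
print(cfg["max_batch_items"])  # -> 4096
```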
trainable_transformer/model
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ab54f54a86b0b191d2a61cae8688eb0613391f0735e1359a6e0ea7f9b516a8ca
+size 502030693
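The model weights are stored via Git LFS, so the committed file is only a pointer recording the spec version, a sha256 oid, and the byte size. A minimal sketch (not part of the repo) of verifying a file fetched with `git lfs pull` against that pointer:

```python
# Hedged sketch: check a downloaded LFS object against its pointer.
import hashlib
from pathlib import Path

EXPECTED_OID = "ab54f54a86b0b191d2a61cae8688eb0613391f0735e1359a6e0ea7f9b516a8ca"
EXPECTED_SIZE = 502030693  # bytes, from the pointer above

data = Path("trainable_transformer/model").read_bytes()
assert len(data) == EXPECTED_SIZE, "size mismatch"
assert hashlib.sha256(data).hexdigest() == EXPECTED_OID, "oid mismatch"
print("LFS object verified")
```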
transformer/cfg
ADDED
@@ -0,0 +1,3 @@
+{
+"max_batch_items":4096
+}
transformer/model
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9b901f5c2249e02bee9b4efa59d01fb065a4c1bbf4e4ac9f8548f9529bd7f97e
+size 502030652
vocab/key2row
ADDED
@@ -0,0 +1 @@
+(single-byte binary payload; not rendered)
vocab/lookups.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:76be8b528d0075f7aae98d6fa57a6d3c83ae480a8469e668d7b0af968995ac71
+size 1
vocab/strings.json
ADDED
The diff for this file is too large to render. See raw diff
vocab/vectors
ADDED
Binary file (128 Bytes). View file
vocab/vectors.cfg
ADDED
@@ -0,0 +1,3 @@
+{
+"mode":"default"
+}
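`vectors.cfg` records the static-vector mode: `"default"` is a plain key-to-row table (as opposed to `"floret"` subword vectors). Given the 128-byte `vocab/vectors` file above, the table appears effectively empty — plausible, since this pipeline draws its representations from the RoBERTa transformer rather than static vectors. A quick check, assuming the packaged wheel is installed:

```python
# Hedged sketch: confirm the static vector table is (near) empty.
import spacy

nlp = spacy.load("en_engagement_Dual_RoBERTa_acad3_f4")
print(nlp.vocab.vectors.shape)  # expected: an empty table, e.g. (0, 0)
print(len(nlp.vocab.vectors))   # expected: 0 stored vectors
```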