diff --git a/README.md b/README.md new file mode 100644 index 0000000000000000000000000000000000000000..a8fbc62116a29f95ab62ae769de34050d1e9b843 --- /dev/null +++ b/README.md @@ -0,0 +1,165 @@ +
+ +# 🔥 Flame + +
+
+A minimal framework for training FLA models, whether from scratch or through finetuning.
+
+Built on the robust infrastructure of 🤗, `flame` enables you to train large language models with just a few lines of code:
+we use `datasets` for data processing, `transformers` for model definitions, and `accelerate`[^1] for seamless distributed training.
+
+In this README, we will guide you through the process of using `flame` to train GLA models.
+
+## Setup
+
+To get started, you'll need to install the required packages.
+Both `fla` and `flame` have minimal dependencies.
+Clone the `fla` repository and install the necessary packages as follows:
+
+```bash
+git clone https://github.com/sustcsonglin/flash-linear-attention.git
+cd flash-linear-attention
+pip install .
+pip install accelerate wandb deepspeed
+```
+
+> [!CAUTION]
+> The 🤗 `tokenizers` have some [memory leak issues](https://github.com/huggingface/tokenizers/issues/1539) when processing very long documents.
+> To address this, please ensure you install `tokenizers>=0.20.4`.
+
+## Preprocessing
+
+Before training, you need to download and pre-tokenize your dataset.
+We provide a straightforward script for this.
+For instance, to tokenize a 10B-token sample of the `fineweb-edu` dataset, run:
+
+```bash
+python preprocess.py \
+  --dataset HuggingFaceFW/fineweb-edu \
+  --name sample-10BT \
+  --split train \
+  --context_length 2048
+```
+
+Or run an even smaller example, just for testing:
+
+```bash
+python preprocess.py \
+  --dataset alturing/gutenberg-texts \
+  --split train \
+  --context_length 2048
+```
+
+This will cache the processed dataset at `data/HuggingFaceFW/fineweb-edu/sample-10BT/train`.
+
+GLA is pretrained on a subset of SlimPajama [in the paper](https://proceedings.mlr.press/v235/yang24ab.html).
+Given the size of the dataset, the fastest way to download it is using `git lfs` (refer to [this issue](https://huggingface.co/datasets/cerebras/SlimPajama-627B/discussions/2)):
+
+```bash
+git lfs install
+git clone https://huggingface.co/datasets/cerebras/SlimPajama-627B
+python preprocess.py \
+  --dataset SlimPajama-627B \
+  --split train \
+  --context_length 2048
+```
+
+## Training from scratch
+
+To train a 340M model from scratch, execute the following command:
+
+```bash
+bash train.sh \
+  type=gla \
+  lr=3e-4 \
+  steps=20480 \
+  batch=8 \
+  update=1 \
+  warmup=1024 \
+  context=2048 \
+  path=exp/gla-340M-10B \
+  project=fla \
+  model=configs/gla_340M.json \
+  data=HuggingFaceFW/fineweb-edu \
+  name=sample-10BT \
+  cache=data/HuggingFaceFW/fineweb-edu/sample-10BT/train
+```
+
+Or, for testing SCAN:
+
+```bash
+bash train.sh \
+  type=scan \
+  lr=3e-4 \
+  steps=1000 \
+  batch=8 \
+  update=1 \
+  warmup=100 \
+  context=2048 \
+  path=exp/scan-340M-test \
+  project=fla \
+  model=configs/scan_340M.json \
+  data=alturing/gutenberg-texts \
+  cache=data/alturing/gutenberg-texts/train
+```
+
+`flame` also supports resuming interrupted training by specifying the checkpoint path.
+Simply use the following command to resume training:
+
+```bash
+bash train.sh \
+  type=gla \
+  lr=3e-4 \
+  steps=20480 \
+  batch=8 \
+  update=1 \
+  warmup=1024 \
+  context=2048 \
+  path=exp/gla-340M-10B \
+  project=fla \
+  model=configs/gla_340M.json \
+  data=HuggingFaceFW/fineweb-edu \
+  name=sample-10BT \
+  cache=data/HuggingFaceFW/fineweb-edu/sample-10BT/train \
+  checkpoint=exp/gla-340M-10B/checkpoint-8192
+```
+
+You can also use `wandb` to monitor your training process effectively.
+
+![wandb](https://github.com/user-attachments/assets/05ca031c-1cae-41c9-bfcb-5b6b6d0df729)
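+
+Once training completes, the run directory can be loaded like any other 🤗 checkpoint for a quick smoke test.
+The snippet below is a minimal sketch, assuming the final weights and tokenizer were saved in 🤗 format under `exp/gla-340M-10B` (importing `fla` registers the custom architectures with the `Auto*` classes):
+
+```py
+import fla  # noqa: F401, registers the fla architectures with the Auto* classes
+import torch
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+path = 'exp/gla-340M-10B'  # hypothetical: wherever your run saved its weights
+tokenizer = AutoTokenizer.from_pretrained(path)
+model = AutoModelForCausalLM.from_pretrained(path, torch_dtype=torch.bfloat16).cuda()
+
+inputs = tokenizer('Once upon a time', return_tensors='pt').to(model.device)
+outputs = model.generate(**inputs, max_new_tokens=32)
+print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+```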
+
+## Continual Pretraining
+
+`flame` supports continual training from a pretrained checkpoint.
+Below, we provide an example of how to finetune Mistral-7B into GLA.
+You can follow similar steps to reproduce the results in the [GSA paper](https://arxiv.org/abs/2409.07146):
+
+1. Initialize a brand-new GLA-7B model from the config and copy the matched pretrained weights from Mistral-7B:
+```bash
+cd ../utils
+python convert_from_llama.py \
+  --model mistralai/Mistral-7B-v0.1 \
+  --config ../training/configs/gla_7B.json \
+  --output ../training/converted/gla-7B
+cd -
+```
+
+2. Directly launch training from the converted checkpoint:
+```bash
+bash train.sh \
+  type=gla \
+  lr=3e-5 \
+  steps=10240 \
+  batch=4 \
+  update=8 \
+  warmup=512 \
+  context=2048 \
+  path=exp/gla-7B-20B \
+  project=fla \
+  model=converted/gla-7B \
+  data=SlimPajama-627B \
+  cache=data/SlimPajama-627B/train
+```
+
+Please be aware that finetuning on a single node may not be the most efficient approach.
+If available, consider leveraging multi-node GPUs for optimal performance.
+You can find guidance on how to launch a multi-node job in the [accelerate tutorial](https://github.com/huggingface/accelerate/blob/main/examples/slurm/submit_multinode.sh).
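+
+Before scaling step 2 out to many GPUs, it can be worth a quick sanity check that the converted checkpoint from step 1 loads and looks right.
+A minimal sketch, assuming the conversion wrote a 🤗-format checkpoint to `converted/gla-7B` relative to the training directory:
+
+```py
+import fla  # noqa: F401, registers the GLA architecture with the Auto* classes
+import torch
+from transformers import AutoModelForCausalLM
+
+model = AutoModelForCausalLM.from_pretrained('converted/gla-7B', torch_dtype=torch.bfloat16)
+print(model.config.model_type, model.config.hidden_size)  # expected: 'gla' 4096
+```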
+
+[^1]: The `accelerate` library supports various distributed frameworks, like `deepspeed` and `megatron` for large-scale training. We use `deepspeed` in our case.
diff --git a/config.json b/config.json
new file mode 100644
index 0000000000000000000000000000000000000000..1bcdb7580c0dd61927f7e2596d2e4b1af15cac2c
--- /dev/null
+++ b/config.json
@@ -0,0 +1,42 @@
+{
+  "_name_or_path": "configs/scan_16M_8192.json",
+  "architectures": [
+    "SCANForCausalLM"
+  ],
+  "attn": null,
+  "attn_mode": "parallel",
+  "bos_token_id": 1,
+  "clamp_max": null,
+  "clamp_min": null,
+  "elementwise_affine": true,
+  "eos_token_id": 2,
+  "expand_k": 1,
+  "expand_v": 1,
+  "fuse_cross_entropy": true,
+  "fuse_norm": true,
+  "gate_act": "softmax",
+  "gate_logit_normalizer": 8,
+  "hidden_act": "swish",
+  "hidden_ratio": 4,
+  "hidden_size": 256,
+  "initializer_range": 0.02,
+  "intermediate_size": null,
+  "max_position_embeddings": 8192,
+  "model_type": "scan",
+  "norm_eps": 1e-06,
+  "norm_first": true,
+  "num_heads": 4,
+  "num_hidden_layers": 10,
+  "num_kv_heads": null,
+  "state_size": 16,
+  "tie_word_embeddings": true,
+  "torch_dtype": "bfloat16",
+  "transformers_version": "4.47.0",
+  "use_cache": true,
+  "use_gk": true,
+  "use_gv": false,
+  "use_norm": true,
+  "use_output_gate": false,
+  "vocab_size": 32000,
+  "window_size": 128
+}
diff --git a/configs/deepspeed.yaml b/configs/deepspeed.yaml
new file mode 100644
index 0000000000000000000000000000000000000000..4d3ad966736455a6fb7ffeef1803784d2a6bf997
--- /dev/null
+++ b/configs/deepspeed.yaml
@@ -0,0 +1,10 @@
+compute_environment: LOCAL_MACHINE
+distributed_type: DEEPSPEED
+deepspeed_config:
+  deepspeed_config_file: configs/ds_config.json
+  zero3_init_flag: true
+machine_rank: 0
+main_training_function: main
+num_machines: 1
+num_processes: 1
+use_cpu: false
diff --git a/configs/ds_config.json b/configs/ds_config.json
new file mode 100644
index 0000000000000000000000000000000000000000..272589af1272826bce4abe07ba41e22471f276db
--- /dev/null
+++ b/configs/ds_config.json
@@ -0,0 +1,19 @@
+{
+  "train_batch_size": "auto",
+  "train_micro_batch_size_per_gpu": "auto",
+  "gradient_accumulation_steps": "auto",
+  "gradient_clipping": "auto",
+  "zero_allow_untested_optimizer": true,
+  "bf16": {
+    "enabled": true
+  },
+  "zero_optimization": {
+    "stage": 2,
+    "allgather_partitions": true,
+    "allgather_bucket_size": 5e8,
+    "reduce_scatter": true,
+    "reduce_bucket_size": 5e8,
+    "overlap_comm": false,
+    "contiguous_gradients": true
+  }
+}
diff --git a/configs/gla_16M.json b/configs/gla_16M.json
new file mode 100644
index 0000000000000000000000000000000000000000..ba97225ae9ffac48f79c486a3802f7ff59290320
--- /dev/null
+++ b/configs/gla_16M.json
@@ -0,0 +1,26 @@
+{
+  "attn_mode": "chunk",
+  "bos_token_id": 1,
+  "clamp_min": null,
+  "eos_token_id": 2,
+  "expand_k": 0.5,
+  "expand_v": 1,
+  "fuse_cross_entropy": true,
+  "fuse_norm": true,
+  "hidden_act": "swish",
+  "hidden_ratio": 4,
+  "hidden_size": 256,
+  "initializer_range": 0.02,
+  "intermediate_size": null,
+  "max_position_embeddings": 2048,
+  "model_type": "gla",
+  "num_heads": 4,
+  "num_hidden_layers": 10,
+  "norm_eps": 1e-06,
+  "tie_word_embeddings": true,
+  "transformers_version": "4.38.2",
+  "use_cache": true,
+  "use_gk": true,
+  "use_gv": false,
+  "vocab_size": 32000
+}
\ No newline at end of file
diff --git a/configs/gla_1B.json b/configs/gla_1B.json
new file mode 100644
index 0000000000000000000000000000000000000000..b727f4e7a749055886d98ae452e2093fc954cdfe
--- /dev/null
+++ b/configs/gla_1B.json
@@ -0,0 +1,26 @@
+{
+  "attn_mode": "chunk",
+  "bos_token_id": 1,
+  "clamp_min": null,
+  "eos_token_id": 2,
+  "expand_k": 0.5,
+  "expand_v": 1,
+  "fuse_cross_entropy": true,
+  "fuse_norm": true,
+  "hidden_act": "swish",
+  "hidden_ratio": 4,
+  "hidden_size": 2048,
+  "initializer_range": 0.02,
+  "intermediate_size": null,
+  "max_position_embeddings": 2048,
+  "model_type": "gla",
+  "num_heads": 4,
+  "num_hidden_layers": 24,
+  "norm_eps": 1e-06,
+  "tie_word_embeddings": false,
+  "transformers_version": "4.38.2",
+  "use_cache": true,
+  "use_gk": true,
+  "use_gv": false,
+  "vocab_size": 32000
+}
diff --git a/configs/gla_340M.json b/configs/gla_340M.json
new file mode 100644
index 0000000000000000000000000000000000000000..bcb0beec65bf9abebf2bfbd08c1f6864f57e5fef
--- /dev/null
+++ b/configs/gla_340M.json
@@ -0,0 +1,26 @@
+{
+  "attn_mode": "chunk",
+  "bos_token_id": 1,
+  "clamp_min": null,
+  "eos_token_id": 2,
+  "expand_k": 0.5,
+  "expand_v": 1,
+  "fuse_cross_entropy": true,
+  "fuse_norm": true,
+  "hidden_act": "swish",
+  "hidden_ratio": 4,
+  "hidden_size": 1024,
+  "initializer_range": 0.02,
+  "intermediate_size": null,
+  "max_position_embeddings": 2048,
+  "model_type": "gla",
+  "num_heads": 4,
+  "num_hidden_layers": 24,
+  "norm_eps": 1e-06,
+  "tie_word_embeddings": true,
+  "transformers_version": "4.38.2",
+  "use_cache": true,
+  "use_gk": true,
+  "use_gv": false,
+  "vocab_size": 32000
+}
diff --git a/configs/gla_7B.json b/configs/gla_7B.json
new file mode 100644
index 0000000000000000000000000000000000000000..48107c45a764967b6e9cd544a1b5f026bcdc3c0b
--- /dev/null
+++ b/configs/gla_7B.json
@@ -0,0 +1,29 @@
+{
+  "attn_mode": "chunk",
+  "bos_token_id": 1,
+  "clamp_min": null,
+  "eos_token_id": 2,
+  "expand_k": 1,
+  "expand_v": 1,
+  "feature_map": "relu",
+  "fuse_cross_entropy": true,
+  "fuse_norm": true,
+  "hidden_act": "swish",
+  "hidden_ratio": 4,
+  "hidden_size": 4096,
+  "initializer_range": 0.02,
+  "intermediate_size": 14336,
+  "max_position_embeddings": 32768,
+  "model_type": "gla",
+  "num_heads": 32,
+  "num_kv_heads": 8,
+  "num_hidden_layers": 32,
+  "norm_eps": 1e-05,
+  "tie_word_embeddings": false,
+  "transformers_version": "4.40.0",
+  "use_cache": true,
+  "use_output_gate": false,
+  "use_gk": true,
+  "use_gv": false,
+  
"vocab_size": 32000 +} diff --git a/configs/gsa_16M.json b/configs/gsa_16M.json new file mode 100644 index 0000000000000000000000000000000000000000..caa6f36f1b368671006986d13a2bfc4f0c95923f --- /dev/null +++ b/configs/gsa_16M.json @@ -0,0 +1,27 @@ +{ + "attn_mode": "chunk", + "bos_token_id": 1, + "clamp_min": null, + "eos_token_id": 2, + "expand_k": 1, + "expand_v": 1, + "fuse_cross_entropy": true, + "fuse_norm": true, + "hidden_act": "swish", + "hidden_ratio": 4, + "hidden_size": 256, + "initializer_range": 0.02, + "intermediate_size": null, + "max_position_embeddings": 2048, + "model_type": "gsa", + "num_slots": 16, + "num_heads": 4, + "num_hidden_layers": 10, + "norm_eps": 1e-06, + "tie_word_embeddings": true, + "transformers_version": "4.38.2", + "use_cache": true, + "use_gk": true, + "use_gv": false, + "vocab_size": 32000 +} \ No newline at end of file diff --git a/configs/scan_16M.json b/configs/scan_16M.json new file mode 100644 index 0000000000000000000000000000000000000000..acbf33ec6b88f754f95d6af7f30fd8f4145bbb30 --- /dev/null +++ b/configs/scan_16M.json @@ -0,0 +1,29 @@ +{ + "attn_mode": "parallel", + "bos_token_id": 1, + "clamp_min": null, + "eos_token_id": 2, + "expand_k": 1, + "expand_v": 1, + "fuse_cross_entropy": true, + "fuse_norm": true, + "hidden_act": "swish", + "gate_act": "softmax", + "hidden_ratio": 4, + "hidden_size": 256, + "window_size": 128, + "state_size": 16, + "initializer_range": 0.02, + "intermediate_size": null, + "max_position_embeddings": 2048, + "model_type": "scan", + "num_heads": 4, + "num_hidden_layers": 10, + "norm_eps": 1e-06, + "tie_word_embeddings": true, + "transformers_version": "4.38.2", + "use_cache": true, + "use_gk": true, + "use_gv": false, + "vocab_size": 32000 +} \ No newline at end of file diff --git a/configs/scan_16M_8192.json b/configs/scan_16M_8192.json new file mode 100644 index 0000000000000000000000000000000000000000..c084a9e83f632b73aeed436b3563af739eca97c6 --- /dev/null +++ b/configs/scan_16M_8192.json @@ -0,0 +1,29 @@ +{ + "attn_mode": "parallel", + "bos_token_id": 1, + "clamp_min": null, + "eos_token_id": 2, + "expand_k": 1, + "expand_v": 1, + "fuse_cross_entropy": true, + "fuse_norm": true, + "hidden_act": "swish", + "gate_act": "softmax", + "hidden_ratio": 4, + "hidden_size": 256, + "window_size": 128, + "state_size": 16, + "initializer_range": 0.02, + "intermediate_size": null, + "max_position_embeddings": 8192, + "model_type": "scan", + "num_heads": 4, + "num_hidden_layers": 10, + "norm_eps": 1e-06, + "tie_word_embeddings": true, + "transformers_version": "4.38.2", + "use_cache": true, + "use_gk": true, + "use_gv": false, + "vocab_size": 32000 +} \ No newline at end of file diff --git a/configs/scan_20M.json b/configs/scan_20M.json new file mode 100644 index 0000000000000000000000000000000000000000..075bbeff341d0d206f544356b88c25516cba1af2 --- /dev/null +++ b/configs/scan_20M.json @@ -0,0 +1,29 @@ +{ + "attn_mode": "parallel", + "bos_token_id": 1, + "clamp_min": null, + "eos_token_id": 2, + "expand_k": 1, + "expand_v": 1, + "fuse_cross_entropy": true, + "fuse_norm": true, + "hidden_act": "swish", + "gate_act": "softmax", + "hidden_ratio": 4, + "hidden_size": 384, + "window_size": 128, + "state_size": 16, + "initializer_range": 0.02, + "intermediate_size": null, + "max_position_embeddings": 2048, + "model_type": "scan", + "num_heads": 6, + "num_hidden_layers": 10, + "norm_eps": 1e-06, + "tie_word_embeddings": true, + "transformers_version": "4.38.2", + "use_cache": true, + "use_gk": true, + "use_gv": false, + "vocab_size": 
32000
+}
\ No newline at end of file
diff --git a/configs/scan_340M.json b/configs/scan_340M.json
new file mode 100644
index 0000000000000000000000000000000000000000..702a09d8c55d49ddd1b12bf41fc39c433624a74e
--- /dev/null
+++ b/configs/scan_340M.json
@@ -0,0 +1,29 @@
+{
+  "attn_mode": "parallel",
+  "bos_token_id": 1,
+  "clamp_min": null,
+  "eos_token_id": 2,
+  "expand_k": 1,
+  "expand_v": 1,
+  "fuse_cross_entropy": true,
+  "fuse_norm": true,
+  "hidden_act": "swish",
+  "gate_act": "softmax",
+  "hidden_ratio": 4,
+  "hidden_size": 1024,
+  "window_size": 128,
+  "state_size": 32,
+  "initializer_range": 0.02,
+  "intermediate_size": null,
+  "max_position_embeddings": 2048,
+  "model_type": "scan",
+  "num_heads": 4,
+  "num_hidden_layers": 24,
+  "norm_eps": 1e-06,
+  "tie_word_embeddings": true,
+  "transformers_version": "4.38.2",
+  "use_cache": true,
+  "use_gk": true,
+  "use_gv": false,
+  "vocab_size": 32000
+}
\ No newline at end of file
diff --git a/configs/transformer_16M.json b/configs/transformer_16M.json
new file mode 100644
index 0000000000000000000000000000000000000000..1a77d74d76b4d835cc2d530360390e631fe368a7
--- /dev/null
+++ b/configs/transformer_16M.json
@@ -0,0 +1,26 @@
+{
+  "model_type": "transformer",
+  "attention_bias": false,
+  "bos_token_id": 1,
+  "clamp_min": null,
+  "eos_token_id": 2,
+  "fuse_cross_entropy": true,
+  "fuse_norm": true,
+  "hidden_act": "swish",
+  "hidden_ratio": 4,
+  "hidden_size": 256,
+  "state_size": 16,
+  "initializer_range": 0.02,
+  "intermediate_size": null,
+  "max_position_embeddings": 2048,
+  "num_heads": 4,
+  "num_kv_heads": 4,
+  "num_hidden_layers": 10,
+  "norm_eps": 1e-06,
+  "tie_word_embeddings": true,
+  "transformers_version": "4.38.2",
+  "use_cache": true,
+  "use_gk": true,
+  "use_gv": false,
+  "vocab_size": 32000
+}
\ No newline at end of file
diff --git a/configs/transformer_16M_8192.json b/configs/transformer_16M_8192.json
new file mode 100644
index 0000000000000000000000000000000000000000..b7af0ae009161ebe9df0567bcb5cbb71dabdde68
--- /dev/null
+++ b/configs/transformer_16M_8192.json
@@ -0,0 +1,26 @@
+{
+  "model_type": "transformer",
+  "attention_bias": false,
+  "bos_token_id": 1,
+  "clamp_min": null,
+  "eos_token_id": 2,
+  "fuse_cross_entropy": true,
+  "fuse_norm": true,
+  "hidden_act": "swish",
+  "hidden_ratio": 4,
+  "hidden_size": 256,
+  "state_size": 16,
+  "initializer_range": 0.02,
+  "intermediate_size": null,
+  "max_position_embeddings": 8192,
+  "num_heads": 4,
+  "num_kv_heads": 4,
+  "num_hidden_layers": 10,
+  "norm_eps": 1e-06,
+  "tie_word_embeddings": true,
+  "transformers_version": "4.38.2",
+  "use_cache": true,
+  "use_gk": true,
+  "use_gv": false,
+  "vocab_size": 32000
+}
\ No newline at end of file
diff --git a/fla/__init__.py b/fla/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..265ee395c4ecd87345fa7f9a1849f94c570c86aa
--- /dev/null
+++ b/fla/__init__.py
@@ -0,0 +1,58 @@
+# -*- coding: utf-8 -*-
+
+from fla.layers import (ABCAttention, Attention, BasedLinearAttention,
+                        BitAttention, DeltaNet, GatedLinearAttention,
+                        GatedSlotAttention, HGRN2Attention, HGRNAttention,
+                        LinearAttention, MultiScaleRetention,
+                        ReBasedLinearAttention)
+from fla.models import (ABCForCausalLM, ABCModel, BitNetForCausalLM,
+                        BitNetModel, DeltaNetForCausalLM, DeltaNetModel,
+                        GLAForCausalLM, GLAModel, GSAForCausalLM, GSAModel,
+                        HGRN2ForCausalLM, HGRN2Model, HGRNForCausalLM,
+                        HGRNModel, LinearAttentionForCausalLM,
+                        LinearAttentionModel, RetNetForCausalLM, RetNetModel,
+                        RWKV6ForCausalLM, RWKV6Model, TransformerForCausalLM,
+                        TransformerModel)
+# kernel-level ops re-exported at the package root; they are listed in `__all__`
+# below but were previously never imported (module paths assumed to follow
+# fla.ops.{based,gla,retention}, matching the layer modules elsewhere)
+from fla.ops.based import fused_chunk_based
+from fla.ops.gla import chunk_gla, fused_chunk_gla
+from fla.ops.retention import chunk_retention, fused_chunk_retention
+
+__all__ = [
+    'ABCAttention',
+    'Attention',
+    'BasedLinearAttention',
+    'BitAttention',
+    'DeltaNet',
+    'HGRNAttention',
+    'HGRN2Attention',
+    'GatedLinearAttention',
+    'GatedSlotAttention',
+    'LinearAttention',
+    'MultiScaleRetention',
+    'ReBasedLinearAttention',
+    'ABCForCausalLM',
+    'ABCModel',
+    'BitNetForCausalLM',
+    'BitNetModel',
+    'DeltaNetForCausalLM',
+    'DeltaNetModel',
+    'HGRNForCausalLM',
+    'HGRNModel',
+    'HGRN2ForCausalLM',
+    'HGRN2Model',
+    'GLAForCausalLM',
+    'GLAModel',
+    'GSAForCausalLM',
+    'GSAModel',
+    'LinearAttentionForCausalLM',
+    'LinearAttentionModel',
+    'RetNetForCausalLM',
+    'RetNetModel',
+    'RWKV6ForCausalLM',
+    'RWKV6Model',
+    'TransformerForCausalLM',
+    'TransformerModel',
+    'chunk_gla',
+    'chunk_retention',
+    'fused_chunk_based',
+    'fused_chunk_gla',
+    'fused_chunk_retention'
+]
+
+__version__ = '0.1'
diff --git a/fla/layers/__init__.py b/fla/layers/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..5f1879d844ed10d3515a4fe744fa017ce01d100d
--- /dev/null
+++ b/fla/layers/__init__.py
@@ -0,0 +1,31 @@
+# -*- coding: utf-8 -*-
+
+from .abc import ABCAttention
+from .attn import Attention
+from .based import BasedLinearAttention
+from .bitattn import BitAttention
+from .delta_net import DeltaNet
+from .gla import GatedLinearAttention
+from .gsa import GatedSlotAttention
+from .hgrn import HGRNAttention
+from .hgrn2 import HGRN2Attention
+from .linear_attn import LinearAttention
+from .multiscale_retention import MultiScaleRetention
+from .rebased import ReBasedLinearAttention
+from .rwkv6 import RWKV6Attention
+
+__all__ = [
+    'ABCAttention',
+    'Attention',
+    'BasedLinearAttention',
+    'BitAttention',
+    'DeltaNet',
+    'GatedLinearAttention',
+    'GatedSlotAttention',
+    'HGRNAttention',
+    'HGRN2Attention',
+    'LinearAttention',
+    'MultiScaleRetention',
+    'ReBasedLinearAttention',
+    'RWKV6Attention',
+]
diff --git a/fla/layers/abc.py b/fla/layers/abc.py
new file mode 100644
index 0000000000000000000000000000000000000000..ad114ea1c7a75fd11f6af1bd655e82281971de44
--- /dev/null
+++ b/fla/layers/abc.py
@@ -0,0 +1,207 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from __future__ import annotations
+
+import warnings
+from typing import TYPE_CHECKING, Optional, Tuple
+
+import torch
+import torch.nn as nn
+from einops import rearrange
+
+from fla.modules import (FusedRMSNormSwishGate, RMSNorm, RotaryEmbedding,
+                         ShortConvolution)
+from fla.modules.activations import swiglu, swish
+from fla.ops.abc.chunk import chunk_abc
+
+if TYPE_CHECKING:
+    from fla.models.utils import Cache
+
+
+class ABCAttention(nn.Module):
+
+    def __init__(
+        self,
+        hidden_size: int = 1024,
+        expand_k: float = 0.5,
+        expand_v: float = 1.0,
+        num_heads: int = 4,
+        use_short_conv: bool = False,
+        conv_size: int = 4,
+        conv_bias: bool = False,
+        num_slots: Optional[int] = None,
+        elementwise_affine: Optional[bool] = True,
+        norm_eps: float = 1e-5,
+        gate_low_rank_dim: int = 16,
+        gate_logit_normalizer: int = 16,
+        use_input_gate: bool = False,
+        use_output_gate: bool = True,
+        use_norm: bool = True,
+        use_rope: bool = True,  # was missing from the signature although `self.use_rope` is referenced below
+        clamp_min: Optional[float] = -32,
+        clamp_max: Optional[float] = 32,
+        layer_idx: Optional[int] = None,
+        **kwargs
+    ) -> ABCAttention:
+        super().__init__()
+
+        self.hidden_size = hidden_size
+        self.expand_k = expand_k
+        self.expand_v = expand_v
+        self.num_heads = num_heads
+        self.key_dim = int(self.hidden_size * self.expand_k)
+        self.value_dim = int(self.hidden_size * self.expand_v)
+        self.head_k_dim = self.key_dim // self.num_heads
+        self.head_v_dim = self.value_dim // self.num_heads
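+
+        # ABC ("Attention with Bounded-memory Control") compresses the context into a
+        # fixed number of memory slots per head: `num_slots` defaults to the key head
+        # dim below, and the slot logits from `s_proj` are clamped to a bounded range.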
+
+        self.use_short_conv = use_short_conv
+        self.conv_size = conv_size
+        self.conv_bias = conv_bias
+
+        self.gate_low_rank_dim = gate_low_rank_dim
+        self.gate_logit_normalizer = gate_logit_normalizer
+
+        self.use_input_gate = use_input_gate
+        self.use_output_gate = use_output_gate
+        self.use_norm = use_norm
+        self.use_rope = use_rope
+
+        if num_slots is None:
+            num_slots = self.head_k_dim
+        self.num_slots = num_slots
+
+        self.norm_eps = norm_eps
+
+        self.clamp_min = clamp_min
+        self.clamp_max = clamp_max
+        self.layer_idx = layer_idx
+
+        if layer_idx is None:
+            warnings.warn(
+                f"Instantiating {self.__class__.__name__} without passing `layer_idx` is not recommended and will lead "
+                "to errors during the forward call, if caching is used. Please make sure to provide a `layer_idx` "
+                "when creating this class."
+            )
+
+        self.q_proj = nn.Linear(self.hidden_size, self.key_dim, bias=False)
+        self.k_proj = nn.Linear(self.hidden_size, self.key_dim, bias=False)
+        self.v_proj = nn.Linear(self.hidden_size, self.value_dim, bias=False)
+
+        if use_output_gate:
+            self.g_proj = nn.Linear(self.hidden_size, self.value_dim, bias=False)
+        self.s_proj = nn.Linear(self.hidden_size, self.num_heads * self.num_slots, bias=False)
+        self.o_proj = nn.Linear(self.value_dim, self.hidden_size, bias=False)
+
+        if use_short_conv:
+            self.conv_size = conv_size
+            self.q_conv1d = ShortConvolution(self.key_dim, conv_size, activation='silu')
+            self.k_conv1d = ShortConvolution(self.key_dim, conv_size, activation='silu')
+            self.v_conv1d = ShortConvolution(self.value_dim, conv_size, activation='silu')
+
+        if self.use_norm:
+            if self.use_output_gate:
+                self.g_norm = FusedRMSNormSwishGate(self.head_v_dim, elementwise_affine, norm_eps)
+            else:
+                self.g_norm = RMSNorm(hidden_size=self.head_v_dim, elementwise_affine=elementwise_affine, eps=norm_eps)
+
+        if self.use_rope:
+            self.rotary = RotaryEmbedding(self.head_k_dim)
+
+        self.apply(self._initialize_weights)
+
+    def _initialize_weights(self, module: nn.Module):
+        if getattr(module, "_is_hf_initialized", False):
+            return
+        if isinstance(module, nn.Linear):
+            nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5)
+            if module.bias is not None:
+                nn.init.zeros_(module.bias)
+        module._is_hf_initialized = True
+
+    def forward(
+        self,
+        hidden_states: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+        past_key_values: Optional[Cache] = None,
+        use_cache: Optional[bool] = False,
+        output_attentions: Optional[bool] = False,
+        **kwargs
+    ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Cache]]:
+        if attention_mask is not None:
+            assert len(attention_mask.shape) == 2, (
+                "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] "
+                "for padding purposes (0 indicating padding). "
+                "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed."
+            )
+
+        last_state = None
+        if past_key_values is not None and len(past_key_values) > self.layer_idx:
+            last_state = past_key_values[self.layer_idx]
+
+        if self.use_short_conv:
+            conv_state_q, conv_state_k, conv_state_v = None, None, None
+            if last_state is not None:
+                conv_state_q, conv_state_k, conv_state_v = last_state['conv_state']
+            conv_mask = attention_mask[:, -hidden_states.shape[1]:] if attention_mask is not None else None
+            q, conv_state_q = self.q_conv1d(x=self.q_proj(hidden_states),
+                                            mask=conv_mask,
+                                            cache=conv_state_q,
+                                            output_final_state=use_cache)
+            k, conv_state_k = self.k_conv1d(x=self.k_proj(hidden_states),
+                                            mask=conv_mask,
+                                            cache=conv_state_k,
+                                            output_final_state=use_cache)
+            v, conv_state_v = self.v_conv1d(x=self.v_proj(hidden_states),
+                                            mask=conv_mask,
+                                            cache=conv_state_v,
+                                            output_final_state=use_cache)
+        else:
+            q = self.q_proj(hidden_states)
+            k = self.k_proj(hidden_states)
+            v = self.v_proj(hidden_states)
+
+        if self.use_input_gate:
+            q, k, v = map(lambda x: swish(x), (q, k, v))
+        # dealing with left-padding
+        if attention_mask is not None:
+            v = v.mul_(attention_mask[:, -v.shape[-2]:, None])
+
+        q, k, v = map(lambda x: rearrange(x, '... (h d) -> ... h d', h=self.num_heads), (q, k, v))
+        if self.use_rope:
+            seqlen_offset = 0
+            if past_key_values is not None:
+                seqlen_offset = past_key_values.get_seq_length(self.layer_idx)
+            q, k = self.rotary(q, k, seqlen_offset)
+
+        s = rearrange(self.s_proj(hidden_states), '... (h m) -> ... h m', h=self.num_heads)
+        s = s.clamp_(self.clamp_min, self.clamp_max)
+
+        recurrent_state = last_state['recurrent_state'] if last_state is not None else None
+        o, recurrent_state = chunk_abc(
+            q=q,
+            k=k,
+            v=v,
+            s=s,
+            initial_state=recurrent_state,
+            output_final_state=use_cache,
+            head_first=False
+        )
+        if past_key_values is not None:
+            past_key_values.update(
+                recurrent_state=recurrent_state,
+                conv_state=(conv_state_q, conv_state_k, conv_state_v) if self.use_short_conv else None,
+                layer_idx=self.layer_idx,
+                offset=q.shape[1]  # number of new tokens; q is (batch, seq_len, heads, dim) here
+            )
+
+        if self.use_norm and not self.use_output_gate:
+            o = self.g_norm(o)
+        elif self.use_output_gate:
+            g = rearrange(self.g_proj(hidden_states), '... (h d) -> ... h d', h=self.num_heads)
+            o = self.g_norm(o, g) if self.use_norm else swiglu(g, o)
+        o = rearrange(o, '... h d -> ... (h d)')
+        o = self.o_proj(o)
+
+        return o, None, past_key_values
+
+    def state_size(self, seq_len: int = 2048):
+        return self.num_heads * self.key_dim * self.head_v_dim
diff --git a/fla/layers/attn.py b/fla/layers/attn.py
new file mode 100644
index 0000000000000000000000000000000000000000..b6659b73be25944ffed9905140e358a3f450be50
--- /dev/null
+++ b/fla/layers/attn.py
@@ -0,0 +1,182 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from __future__ import annotations
+
+import warnings
+from typing import TYPE_CHECKING, Optional, Tuple
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import torch.utils.checkpoint
+from einops import rearrange
+from transformers.utils import logging
+
+from fla.modules import RMSNorm, RotaryEmbedding
+
+if TYPE_CHECKING:
+    from fla.models.utils import Cache
+
+try:
+    from flash_attn import flash_attn_func, flash_attn_varlen_func
+    from flash_attn.bert_padding import (index_first_axis, pad_input,
+                                         unpad_input)
+except ImportError:
+    warnings.warn(
+        "Flash Attention is not installed. Please install it via `pip install flash-attn --no-build-isolation`",
+        category=ImportWarning
+    )
+    flash_attn_func = None
+
+logger = logging.get_logger(__name__)
+
+
+class Attention(nn.Module):
+
+    def __init__(
+        self,
+        hidden_size: int = 2048,
+        num_heads: int = 32,
+        num_kv_heads: Optional[int] = None,
+        window_size: Optional[int] = None,
+        rope_theta: Optional[float] = 10000.,
+        max_position_embeddings: Optional[int] = None,
+        norm_first: bool = False,
+        norm_eps: float = 1e-5,
+        layer_idx: Optional[int] = None
+    ):
+        super().__init__()
+
+        self.num_heads = num_heads
+        if num_kv_heads is None:
+            self.num_kv_heads = self.num_heads
+        else:
+            self.num_kv_heads = num_kv_heads
+        self.num_kv_groups = num_heads // self.num_kv_heads
+        self.hidden_size = hidden_size
+        self.head_dim = self.hidden_size // self.num_heads
+        self.kv_dim = self.num_kv_heads * self.head_dim
+        self.window_size = window_size
+        self.rope_theta = rope_theta
+        self.max_position_embeddings = max_position_embeddings
+        self.norm_first = norm_first
+        self.layer_idx = layer_idx
+
+        if norm_first:
+            self.norm = RMSNorm(self.hidden_size, eps=norm_eps)
+        self.q_proj = nn.Linear(self.hidden_size, self.hidden_size, bias=False)
+        self.k_proj = nn.Linear(self.hidden_size, self.kv_dim, bias=False)
+        self.v_proj = nn.Linear(self.hidden_size, self.kv_dim, bias=False)
+        self.o_proj = nn.Linear(self.hidden_size, self.hidden_size, bias=False)
+
+        self.rotary = RotaryEmbedding(dim=self.head_dim, base=self.rope_theta)
+
+    def forward(
+        self,
+        hidden_states: torch.Tensor,
+        attention_mask: Optional[torch.LongTensor] = None,
+        past_key_values: Optional[Cache] = None,
+        output_attentions: bool = False,
+        use_cache: bool = False,
+        **kwargs,
+    ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Tuple[torch.Tensor]]]:
+        if attention_mask is not None:
+            assert len(attention_mask.shape) == 2, (
+                "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] "
+                "for padding purposes (0 indicating padding). "
+                "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed."
+            )
+
+        batch_size, q_len, _ = hidden_states.size()
+
+        if self.norm_first:
+            hidden_states = self.norm(hidden_states)
+
+        q = rearrange(self.q_proj(hidden_states), '... (h d) -> ... h d', h=self.num_heads)
+        k = rearrange(self.k_proj(hidden_states), '... (h d) -> ... h d', h=self.num_kv_heads)
+        v = rearrange(self.v_proj(hidden_states), '... (h d) -> ... h d', h=self.num_kv_heads)
+
+        seqlen_offset, max_seqlen = 0, q_len
+        if past_key_values is not None:
+            seqlen_offset = past_key_values.get_seq_length(self.layer_idx)
+            max_seqlen = q.shape[1] + seqlen_offset
+
+            if attention_mask is not None:
+                # to account for the offsets introduced by left-side padding tokens
+                seqlen_offset = (seqlen_offset + attention_mask.sum(-1) - attention_mask.shape[-1]).clamp(min=0)
+                max_seqlen = q.shape[1] + max(seqlen_offset)
+
+        if self.max_position_embeddings is not None:
+            max_seqlen = max(max_seqlen, self.max_position_embeddings)
+        q, k = self.rotary(q, k, seqlen_offset, max_seqlen)
+
+        if past_key_values is not None:
+            k, v = past_key_values.update(
+                attn_state=(k.flatten(-2, -1), v.flatten(-2, -1)),
+                layer_idx=self.layer_idx,
+                offset=q_len,
+                cache_kwargs=dict(window_size=self.window_size)
+            )['attn_state']
+            k = rearrange(k, '... (h d) -> ... h d', h=self.num_kv_heads)
+            v = rearrange(v, '... (h d) -> ... 
h d', h=self.num_kv_heads) + + if flash_attn_func is None: + raise ImportError("Please install Flash Attention via `pip install flash-attn --no-build-isolation` first") + + # Contains at least one padding token in the sequence + if attention_mask is not None: + q, k, v, indices_q, cu_seq_lens, max_seq_lens = self._upad_input(q, k, v, attention_mask, q_len) + cu_seqlens_q, cu_seqlens_k = cu_seq_lens + max_seqlen_q, max_seqlen_k = max_seq_lens + o = flash_attn_varlen_func( + q, k, v, + cu_seqlens_q=cu_seqlens_q, + cu_seqlens_k=cu_seqlens_k, + max_seqlen_q=max_seqlen_q, + max_seqlen_k=max_seqlen_k, + causal=True, + window_size=(-1, -1) if self.window_size is None else (self.window_size-1, 0) + ) + o = pad_input(o, indices_q, batch_size, q_len) + else: + o = flash_attn_func( + q, k, v, + causal=True, + window_size=(-1, -1) if self.window_size is None else (self.window_size-1, 0) + ) + o = o.reshape(batch_size, q_len, self.hidden_size) + o = self.o_proj(o) + + if not output_attentions: + attentions = None + + return o, attentions, past_key_values + + def _upad_input(self, q, k, v, attention_mask, q_len): + seqlens = attention_mask.sum(-1, dtype=torch.int32) + indices_k = torch.nonzero(attention_mask.flatten(), as_tuple=False).flatten() + max_seqlen_k = seqlens.max().item() + cu_seqlens_k = F.pad(torch.cumsum(seqlens, dim=0, dtype=torch.int32), (1, 0)) + batch_size, seq_len, num_key_value_heads, head_dim = k.shape + + k = index_first_axis(k.reshape(batch_size * seq_len, num_key_value_heads, head_dim), indices_k) + v = index_first_axis(v.reshape(batch_size * seq_len, num_key_value_heads, head_dim), indices_k) + if q_len == seq_len: + q = index_first_axis(q.reshape(batch_size * seq_len, self.num_heads, head_dim), indices_k) + cu_seqlens_q = cu_seqlens_k + max_seqlen_q = max_seqlen_k + indices_q = indices_k + elif q_len == 1: + max_seqlen_q = 1 + # There is a memcpy here, that is very bad. + cu_seqlens_q = torch.arange(batch_size + 1, dtype=torch.int32, device=q.device) + indices_q = cu_seqlens_q[:-1] + q = q.squeeze(1) + else: + # The -q_len: slice assumes left padding. + attention_mask = attention_mask[:, -q_len:] + q, indices_q, cu_seqlens_q, max_seqlen_q = unpad_input(q, attention_mask) + + return q, k, v, indices_q, (cu_seqlens_q, cu_seqlens_k), (max_seqlen_q, max_seqlen_k) diff --git a/fla/layers/based.py b/fla/layers/based.py new file mode 100644 index 0000000000000000000000000000000000000000..77cc09570e3cdd56b17f773141ee964afd7be04e --- /dev/null +++ b/fla/layers/based.py @@ -0,0 +1,105 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +""" +Linear attention in Based. 
+https://github.com/HazyResearch/zoology/blob/main/zoology/mixers/based.py +""" + +import torch +import torch.nn as nn +from einops import rearrange + +from fla.modules.feature_map import TaylorFeatureMap +from fla.ops.based import parallel_based +from fla.ops.linear_attn import chunk_linear_attn, fused_chunk_linear_attn + + +class BasedLinearAttention(nn.Module): + + def __init__( + self, + hidden_size: int, + feature_dim: int = 16, + num_key_value_heads: int = 12, + num_heads: int = 12, + feature_name: str = "taylor_exp", + eps: float = 1e-12, + causal: bool = True, + mode: str = "parallel", + ): + super().__init__() + + self.hidden_size = hidden_size + self.mode = mode + self.feature_name = feature_name + self.feature_dim = feature_dim + self.num_key_value_heads = num_key_value_heads + self.num_heads = num_heads + self.head_dim = self.hidden_size // self.num_key_value_heads + self.causal = causal + + self.q_proj = nn.Linear(self.hidden_size, self.feature_dim * self.num_heads, bias=False) + self.k_proj = nn.Linear(self.hidden_size, self.feature_dim * self.num_heads, bias=False) + self.v_proj = nn.Linear(self.hidden_size, self.num_key_value_heads * self.head_dim, bias=False) + self.o_proj = nn.Linear(self.num_heads * self.head_dim, self.hidden_size, bias=False) + self.dropout = nn.Identity() + self.feature_map = TaylorFeatureMap(feature_dim) + self.eps = eps + + self.apply(self._initialize_weights) + + def _initialize_weights(self, module: nn.Module): + if getattr(module, "_is_hf_initialized", False): + return + if isinstance(module, nn.Linear): + nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5) + if module.bias is not None: + nn.init.zeros_(module.bias) + module._is_hf_initialized = True + + def forward(self, hidden_states: torch.Tensor, **kwargs): + mode = self.mode + q, k, v = self.q_proj(hidden_states), self.k_proj(hidden_states), self.v_proj(hidden_states) + q, k, v = map(lambda x: rearrange(x, "... (h d) -> ... 
h d", h=self.num_heads), [q, k, v]) + if mode == "fused_chunk": + q, k = self.feature_map(q), self.feature_map(k) + o = fused_chunk_linear_attn(q, k, v, normalize=True, scale=1, head_first=False) + elif mode == 'chunk': + q, k = self.feature_map(q), self.feature_map(k) + o = chunk_linear_attn(q, k, v, normalize=True, scale=1, head_first=False) + elif mode == 'parallel': + assert q.shape[-1] <= 128 + o = parallel_based(q, k, v, True, True, head_first=False) + o = self.o_proj(o) + o = self.dropout(o) + return o + + # https://github.com/HazyResearch/zoology/blob/main/zoology/mixers/based.py#L119 + + def forward_reference(self, hidden_states: torch.Tensor, filters: torch.Tensor = None, *args, **kwargs): + """ + x (torch.Tensor): tensor of shape (b, d, t) + y (torch.Tensor): tensor of shape (b, d, t) + """ + # hidden_states = hidden_states.transpose(1, 2) + b, t, _ = hidden_states.size() + q, k, v = self.q_proj(hidden_states), self.k_proj(hidden_states), self.v_proj(hidden_states) + + q = q.view(b, t, self.num_heads, self.feature_dim).transpose(1, 2) + k = k.view(b, t, self.num_key_value_heads, self.feature_dim).transpose(1, 2) + v = v.view(b, t, self.num_key_value_heads, self.head_dim).transpose(1, 2) + + # Linear attention + q, k = self.feature_map(q), self.feature_map(k) + q, k, v = q.unsqueeze(-2), k.unsqueeze(-2), v.unsqueeze(-1) + + # Compute attention + if self.causal: + y = ((q * (k * v).cumsum(2)).sum(-1) / ((q * k.cumsum(2)).sum(-1) + self.eps)) + else: + y = ((q * (k * v).sum(2, True)).sum(-1) / ((q * k.sum(2, True)).sum(-1) + self.eps)) + y = rearrange(y, 'b h t d -> b t (h d)') + y = self.o_proj(y.to(hidden_states.dtype)) + y = self.dropout(y) + return y.to(hidden_states.dtype) diff --git a/fla/layers/bitattn.py b/fla/layers/bitattn.py new file mode 100644 index 0000000000000000000000000000000000000000..0faea1a2e91fabb2da787bd6f07315712154ec07 --- /dev/null +++ b/fla/layers/bitattn.py @@ -0,0 +1,183 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from __future__ import annotations + +import warnings +from typing import TYPE_CHECKING, Optional, Tuple + +import torch +import torch.nn as nn +import torch.nn.functional as F +import torch.utils.checkpoint +from einops import rearrange +from transformers.utils import logging + +from fla.modules import RMSNorm, RotaryEmbedding +from fla.modules.fused_bitlinear import FusedBitLinear + +if TYPE_CHECKING: + from fla.models.utils import Cache + +try: + from flash_attn import flash_attn_func, flash_attn_varlen_func + from flash_attn.bert_padding import (index_first_axis, pad_input, + unpad_input) +except ImportError: + warnings.warn( + "Flash Attention is not installed. 
+        category=ImportWarning
+    )
+    flash_attn_func = None
+
+logger = logging.get_logger(__name__)
+
+
+class BitAttention(nn.Module):
+
+    def __init__(
+        self,
+        hidden_size: int = 2048,
+        num_heads: int = 32,
+        num_kv_heads: Optional[int] = None,
+        window_size: Optional[int] = None,
+        rope_theta: Optional[float] = 10000.,
+        max_position_embeddings: Optional[int] = None,
+        norm_first: bool = False,
+        norm_eps: float = 1e-5,
+        layer_idx: Optional[int] = None
+    ):
+        super().__init__()
+
+        self.num_heads = num_heads
+        if num_kv_heads is None:
+            self.num_kv_heads = self.num_heads
+        else:
+            self.num_kv_heads = num_kv_heads
+        self.num_kv_groups = num_heads // self.num_kv_heads
+        self.hidden_size = hidden_size
+        self.head_dim = self.hidden_size // self.num_heads
+        self.kv_dim = self.num_kv_heads * self.head_dim
+        self.window_size = window_size
+        self.rope_theta = rope_theta
+        self.max_position_embeddings = max_position_embeddings
+        self.norm_first = norm_first
+        self.layer_idx = layer_idx
+
+        if norm_first:
+            self.norm = RMSNorm(self.hidden_size, eps=norm_eps)
+        self.q_proj = FusedBitLinear(self.hidden_size, self.hidden_size, bias=False)
+        self.k_proj = FusedBitLinear(self.hidden_size, self.kv_dim, bias=False)
+        self.v_proj = FusedBitLinear(self.hidden_size, self.kv_dim, bias=False)
+        self.o_proj = FusedBitLinear(self.hidden_size, self.hidden_size, bias=False)
+
+        self.rotary = RotaryEmbedding(dim=self.head_dim, base=self.rope_theta)
+
+    def forward(
+        self,
+        hidden_states: torch.Tensor,
+        attention_mask: Optional[torch.LongTensor] = None,
+        past_key_values: Optional[Cache] = None,
+        output_attentions: bool = False,
+        use_cache: bool = False,
+        **kwargs,
+    ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Tuple[torch.Tensor]]]:
+        if attention_mask is not None:
+            assert len(attention_mask.shape) == 2, (
+                "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] "
+                "for padding purposes (0 indicating padding). "
+                "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed."
+            )
+
+        batch_size, q_len, _ = hidden_states.size()
+
+        if self.norm_first:
+            hidden_states = self.norm(hidden_states)
+
+        q = rearrange(self.q_proj(hidden_states), '... (h d) -> ... h d', h=self.num_heads)
+        k = rearrange(self.k_proj(hidden_states), '... (h d) -> ... h d', h=self.num_kv_heads)
+        v = rearrange(self.v_proj(hidden_states), '... (h d) -> ... h d', h=self.num_kv_heads)
+
+        seqlen_offset, max_seqlen = 0, q_len
+        if past_key_values is not None:
+            seqlen_offset = past_key_values.get_seq_length(self.layer_idx)
+            max_seqlen = q.shape[1] + seqlen_offset
+
+            if attention_mask is not None:
+                # to account for the offsets introduced by left-side padding tokens
+                seqlen_offset = (seqlen_offset + attention_mask.sum(-1) - attention_mask.shape[-1]).clamp(min=0)
+                max_seqlen = q.shape[1] + max(seqlen_offset)
+
+        if self.max_position_embeddings is not None:
+            max_seqlen = max(max_seqlen, self.max_position_embeddings)
+        q, k = self.rotary(q, k, seqlen_offset, max_seqlen)
+
+        if past_key_values is not None:
+            k, v = past_key_values.update(
+                attn_state=(k.flatten(-2, -1), v.flatten(-2, -1)),
+                layer_idx=self.layer_idx,
+                offset=q_len,
+                cache_kwargs=dict(window_size=self.window_size)
+            )['attn_state']
+            k = rearrange(k, '... (h d) -> ... h d', h=self.num_kv_heads)
+            v = rearrange(v, '... (h d) -> ... 
h d', h=self.num_kv_heads) + + if flash_attn_func is None: + raise ImportError("Please install Flash Attention via `pip install flash-attn --no-build-isolation` first") + + # Contains at least one padding token in the sequence + if attention_mask is not None: + q, k, v, indices_q, cu_seq_lens, max_seq_lens = self._upad_input(q, k, v, attention_mask, q_len) + cu_seqlens_q, cu_seqlens_k = cu_seq_lens + max_seqlen_q, max_seqlen_k = max_seq_lens + o = flash_attn_varlen_func( + q, k, v, + cu_seqlens_q=cu_seqlens_q, + cu_seqlens_k=cu_seqlens_k, + max_seqlen_q=max_seqlen_q, + max_seqlen_k=max_seqlen_k, + causal=True, + window_size=(-1, -1) if self.window_size is None else (self.window_size-1, 0) + ) + o = pad_input(o, indices_q, batch_size, q_len) + else: + o = flash_attn_func( + q, k, v, + causal=True, + window_size=(-1, -1) if self.window_size is None else (self.window_size-1, 0) + ) + o = o.reshape(batch_size, q_len, self.hidden_size) + o = self.o_proj(o) + + if not output_attentions: + attentions = None + + return o, attentions, past_key_values + + def _upad_input(self, q, k, v, attention_mask, q_len): + seqlens = attention_mask.sum(-1, dtype=torch.int32) + indices_k = torch.nonzero(attention_mask.flatten(), as_tuple=False).flatten() + max_seqlen_k = seqlens.max().item() + cu_seqlens_k = F.pad(torch.cumsum(seqlens, dim=0, dtype=torch.int32), (1, 0)) + batch_size, seq_len, num_key_value_heads, head_dim = k.shape + + k = index_first_axis(k.reshape(batch_size * seq_len, num_key_value_heads, head_dim), indices_k) + v = index_first_axis(v.reshape(batch_size * seq_len, num_key_value_heads, head_dim), indices_k) + if q_len == seq_len: + q = index_first_axis(q.reshape(batch_size * seq_len, self.num_heads, head_dim), indices_k) + cu_seqlens_q = cu_seqlens_k + max_seqlen_q = max_seqlen_k + indices_q = indices_k + elif q_len == 1: + max_seqlen_q = 1 + # There is a memcpy here, that is very bad. + cu_seqlens_q = torch.arange(batch_size + 1, dtype=torch.int32, device=q.device) + indices_q = cu_seqlens_q[:-1] + q = q.squeeze(1) + else: + # The -q_len: slice assumes left padding. 
+            attention_mask = attention_mask[:, -q_len:]
+            q, indices_q, cu_seqlens_q, max_seqlen_q = unpad_input(q, attention_mask)
+
+        return q, k, v, indices_q, (cu_seqlens_q, cu_seqlens_k), (max_seqlen_q, max_seqlen_k)
diff --git a/fla/layers/delta_net.py b/fla/layers/delta_net.py
new file mode 100644
index 0000000000000000000000000000000000000000..cd9737a83dfd8fa9f58d8b53dcd38609993d239f
--- /dev/null
+++ b/fla/layers/delta_net.py
@@ -0,0 +1,267 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+# Section 4.2 of "Linear Transformers Are Secretly Fast Weight Programmers" (https://arxiv.org/abs/2102.11174)
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Optional, Tuple
+
+import torch
+import torch.nn as nn
+from einops import rearrange
+from torch.nn import functional as F
+
+from fla.modules import FusedRMSNormSwishGate, RMSNorm, ShortConvolution
+from fla.modules.l2norm import l2_norm
+from fla.ops.delta_rule import (chunk_delta_rule, fused_chunk_delta_rule,
+                                fused_recurrent_delta_rule)
+
+if TYPE_CHECKING:
+    from fla.models.utils import Cache
+
+
+def elu_p1(x):
+    return (F.elu(x, 1., False) + 1.).to(x)
+
+
+def sum_norm(x):
+    return (x / x.sum(-1, keepdim=True)).to(x)
+
+# https://github.com/IDSIA/recurrent-fwp/blob/master/algorithmic/layers.py#L86C1-L146C1
+
+
+class DeltaNet(nn.Module):
+    def __init__(
+        self,
+        d_model: int = None,
+        hidden_size: int = 1024,
+        expand_k: float = 1.0,
+        expand_v: float = 1.0,
+        num_heads: int = 4,
+        mode: str = 'chunk',
+        use_beta: bool = True,
+        use_gate: bool = False,
+        use_output_norm: bool = True,
+        use_elu: bool = False,
+        use_short_conv: bool = True,
+        conv_size: int = 4,
+        conv_bias: bool = False,
+        layer_idx: int = None,
+        qk_activation: str = 'silu',
+        qk_norm: str = 'l2',
+        norm_first: bool = False,
+        norm_eps: float = 1e-5,
+        **kwargs
+    ) -> DeltaNet:
+        super().__init__()
+
+        self.mode = mode
+        self.qk_activation = qk_activation
+        self.qk_norm = qk_norm
+
+        assert self.qk_activation in ['silu', 'relu', 'elu', 'identity']
+        assert self.qk_norm in ['l2', 'sum']
+
+        if d_model is not None:
+            hidden_size = d_model
+        self.hidden_size = hidden_size
+        self.expand_k = expand_k
+        self.expand_v = expand_v
+        self.num_heads = num_heads
+        self.use_gate = use_gate
+        self.use_output_norm = use_output_norm
+        self.use_short_conv = use_short_conv
+        self.conv_size = conv_size
+        self.conv_bias = conv_bias
+
+        self.key_dim = int(hidden_size * expand_k)
+        self.value_dim = int(hidden_size * expand_v)
+        self.head_qk_dim = self.key_dim // num_heads
+        self.head_v_dim = self.value_dim // num_heads
+        self.norm_first = norm_first
+        self.layer_idx = layer_idx
+
+        self.silu = nn.SiLU()
+
+        assert mode in ['chunk', 'fused_chunk', 'fused_recurrent'], f"Not supported mode `{mode}`."
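+        # NOTE: regardless of the configured `mode`, the forward pass below falls back
+        # to `fused_recurrent` for short inputs (< 64 tokens), e.g. during decoding.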
+        assert self.key_dim % num_heads == 0, f"key dim must be divisible by num_heads of {num_heads}"
+        assert self.value_dim % num_heads == 0, f"value dim must be divisible by num_heads of {num_heads}"
+
+        if norm_first:
+            self.norm = RMSNorm(self.hidden_size, eps=norm_eps)
+
+        self.q_proj = nn.Linear(hidden_size, self.key_dim, bias=False)
+        self.k_proj = nn.Linear(hidden_size, self.key_dim, bias=False)
+        self.v_proj = nn.Linear(hidden_size, self.value_dim, bias=False)
+
+        self.use_beta = use_beta
+        self.use_elu = use_elu
+        if self.use_beta:
+            self.b_proj = nn.Linear(hidden_size, self.num_heads, bias=False)
+        if use_short_conv:
+            self.conv_size = conv_size
+            self.q_conv1d = ShortConvolution(
+                hidden_size=self.key_dim,
+                kernel_size=conv_size,
+                activation='silu' if qk_activation == 'silu' else None
+            )
+            self.k_conv1d = ShortConvolution(
+                hidden_size=self.key_dim,
+                kernel_size=conv_size,
+                activation='silu' if qk_activation == 'silu' else None
+            )
+            self.v_conv1d = ShortConvolution(
+                hidden_size=self.value_dim,
+                kernel_size=conv_size,
+                activation='silu'
+            )
+        else:
+            raise UserWarning(
+                "ShortConvolution is crucial to the performance. "
+                "Do not turn it off, i.e., do not set `use_short_conv=False`, unless you know what you are doing."
+            )
+        if use_gate:
+            self.g_proj = nn.Linear(hidden_size, self.value_dim, bias=False)
+            self.o_norm = FusedRMSNormSwishGate(self.head_v_dim, eps=norm_eps)
+        else:
+            self.o_norm = RMSNorm(self.head_v_dim, eps=norm_eps)
+
+        self.o_proj = nn.Linear(self.value_dim, hidden_size, bias=False)
+
+        self.apply(self._initialize_weights)
+
+    def _initialize_weights(self, module: nn.Module):
+        if getattr(module, "_is_hf_initialized", False):
+            return
+        if isinstance(module, nn.Linear):
+            nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5)
+            if module.bias is not None:
+                nn.init.zeros_(module.bias)
+        module._is_hf_initialized = True
+
+    def forward(
+        self,
+        hidden_states: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+        past_key_values: Optional[Cache] = None,
+        use_cache: Optional[bool] = False,
+        output_attentions: Optional[bool] = False,
+        **kwargs
+    ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Cache]]:
+        if attention_mask is not None:
+            assert len(attention_mask.shape) == 2, (
+                "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] "
+                "for padding purposes (0 indicating padding). "
+                "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed."
+            )
+
+        # use the fused recurrent kernel for short inputs (e.g., single-token decoding)
+        mode = 'fused_recurrent' if hidden_states.shape[1] < 64 else self.mode
+
+        if self.norm_first:
+            hidden_states = self.norm(hidden_states)
+
+        last_state = None
+        if past_key_values is not None and len(past_key_values) > self.layer_idx:
+            last_state = past_key_values[self.layer_idx]
+
+        if self.use_short_conv:
+            conv_state_q, conv_state_k, conv_state_v = None, None, None
+            if last_state is not None:
+                conv_state_q, conv_state_k, conv_state_v = last_state['conv_state']
+            conv_mask = attention_mask[:, -hidden_states.shape[1]:] if attention_mask is not None else None
+            q, conv_state_q = self.q_conv1d(x=self.q_proj(hidden_states),
+                                            mask=conv_mask,
+                                            cache=conv_state_q,
+                                            output_final_state=use_cache)
+            k, conv_state_k = self.k_conv1d(x=self.k_proj(hidden_states),
+                                            mask=conv_mask,
+                                            cache=conv_state_k,
+                                            output_final_state=use_cache)
+            v, conv_state_v = self.v_conv1d(x=self.v_proj(hidden_states),
+                                            mask=conv_mask,
+                                            cache=conv_state_v,
+                                            output_final_state=use_cache)
+        else:
+            q = self.q_proj(hidden_states)
+            k = self.k_proj(hidden_states)
+            v = self.silu(self.v_proj(hidden_states))
+
+        q, k, v = map(lambda x: rearrange(x, 'b t (h d) -> b t h d', h=self.num_heads), (q, k, v))
+        if self.qk_activation != 'silu':
+            if self.qk_activation == 'relu':
+                q, k = q.relu(), k.relu()
+            elif self.qk_activation == 'elu':
+                q, k = elu_p1(q), elu_p1(k)
+            elif self.qk_activation == 'identity':
+                pass
+            else:
+                raise NotImplementedError
+
+        if self.qk_norm is not None:
+            if self.qk_norm == 'l2':
+                q = l2_norm(q)
+                k = l2_norm(k)
+            elif self.qk_norm == 'sum':
+                q = sum_norm(q).to(q)
+                k = sum_norm(k).to(k)
+
+        if self.use_beta:
+            beta = self.b_proj(hidden_states).sigmoid()
+        else:
+            beta = q.new_ones(q.shape[0], q.shape[1], q.shape[2])
+
+        # dealing with padding
+        if attention_mask is not None:
+            beta = beta.mul(attention_mask[:, -beta.shape[-2]:, None])
+        recurrent_state = last_state['recurrent_state'] if last_state is not None else None
+        if mode == 'fused_recurrent':
+            o, recurrent_state = fused_recurrent_delta_rule(
+                q=q,
+                k=k,
+                v=v,
+                beta=beta,
+                initial_state=recurrent_state,
+                output_final_state=use_cache,
+                head_first=False
+            )
+        elif mode == 'fused_chunk':
+            o, recurrent_state = fused_chunk_delta_rule(
+                q=q,
+                k=k,
+                v=v,
+                beta=beta,
+                initial_state=recurrent_state,
+                output_final_state=use_cache,
+                head_first=False
+            )
+        elif mode == 'chunk':
+            o, recurrent_state = chunk_delta_rule(
+                q=q,
+                k=k,
+                v=v,
+                beta=beta,
+                initial_state=recurrent_state,
+                output_final_state=use_cache,
+                head_first=False
+            )
+        else:
+            raise NotImplementedError(f"Not supported mode `{mode}`.")
+
+        if past_key_values is not None:
+            past_key_values.update(
+                recurrent_state=recurrent_state,
+                conv_state=(conv_state_q, conv_state_k, conv_state_v) if self.use_short_conv else None,
+                layer_idx=self.layer_idx,
+                offset=q.shape[1]  # number of new tokens; q is (batch, seq_len, heads, dim) here
+            )
+
+        if self.use_gate:
+            g = rearrange(self.g_proj(hidden_states), 'b t (h d) -> b t h d', h=self.num_heads)
+            o = self.o_norm(o, g)
+        else:
+            o = self.o_norm(o)
+        o = rearrange(o, 'b t h d -> b t (h d)')
+        o = self.o_proj(o)
+
+        return o, None, past_key_values
diff --git a/fla/layers/gla.py b/fla/layers/gla.py
new file mode 100644
index 0000000000000000000000000000000000000000..8cc22d5069819309a4ef4aa2b361013fbd4ee47a
--- /dev/null
+++ b/fla/layers/gla.py
@@ -0,0 +1,280 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Optional, Tuple
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+from einops import rearrange, repeat
+
+from fla.modules import FusedRMSNormSwishGate, RMSNorm, ShortConvolution
+from fla.modules.activations import ACT2FN
+from fla.ops.gla import chunk_gla, fused_chunk_gla, fused_recurrent_gla
+
+if TYPE_CHECKING:
+    from fla.models.utils import Cache
+
+
+class GatedLinearAttention(nn.Module):
+    r"""
+    The layer implementation for [Gated Linear Attention Transformers with Hardware-Efficient Training](https://arxiv.org/abs/2312.06635).  # noqa
+
+    Args:
+        mode (str, Optional):
+            Which GLA kernel to use.
+            Currently available: `chunk`, `fused_recurrent`, and `fused_chunk`.
+            Default: `chunk`.
+        hidden_size (int, Optional):
+            The hidden size of the input. Default: 1024.
+        expand_k (float, Optional):
+            The expansion ratio for the key dim. Default: 0.5.
+        expand_v (float, Optional):
+            The expansion ratio for the value dim. Default: 1.0.
+        num_heads (int, Optional):
+            The number of heads. Default: 4.
+        num_kv_heads (int, Optional):
+            The number of key/value heads, used for MQA. Default: None.
+        feature_map (str, Optional):
+            Feature map function applied to queries/keys. Default: None.
+        use_short_conv (bool, Optional):
+            Whether to use short convolutions. Default: `False`.
+        conv_size (int, Optional):
+            The kernel size of the short convolution, only used when `use_short_conv` is `True`. Default: 4.
+        conv_bias (bool, Optional):
+            Whether to use bias in the short convolution, only used when `use_short_conv` is `True`. Default: `False`.
+        use_output_gate (bool, Optional):
+            Whether to use output gate. Default: `True`.
+        gate_fn (str, Optional):
+            The activation function for the output gate. Default: `swish`.
+        elementwise_affine (bool, Optional):
+            If `True`, applies elementwise affine to LayerNorm with learnable parameters. Default: `True`.
+        norm_eps (float, Optional):
+            The epsilon value for the layernorm/rmsnorm layer. Default: 1e-5.
+        gate_logit_normalizer (int, Optional):
+            The normalizer for the gate logits, applied after `logsigmoid`. Default: 16.
+        gate_low_rank_dim (int, Optional):
+            The low rank dim for the gate projection. Default: 16.
+        clamp_min (float, Optional):
+            The minimum value for the gate logits. Default: None.
+        fuse_norm (bool, Optional):
+            Whether to fuse the norm and the output gate for better memory footprint. Default: `True`.
+        layer_idx (int, Optional):
+            The index of the layer. Default: None.
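+
+    Example:
+        A minimal usage sketch (assuming a CUDA device, since the underlying kernels are Triton-based):
+
+        >>> import torch
+        >>> from fla.layers import GatedLinearAttention
+        >>> layer = GatedLinearAttention(hidden_size=1024, num_heads=4).cuda()
+        >>> x = torch.randn(2, 128, 1024).cuda()
+        >>> o, _, cache = layer(x)  # -> (output, attentions, past_key_values)
+        >>> o.shape
+        torch.Size([2, 128, 1024])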
+ """ + + def __init__( + self, + mode: str = 'chunk', + hidden_size: int = 1024, + expand_k: float = 0.5, + expand_v: float = 1.0, + num_heads: int = 4, + num_kv_heads: Optional[int] = None, + feature_map: Optional[str] = None, + use_short_conv: bool = False, + conv_size: int = 4, + conv_bias: bool = False, + use_output_gate: bool = True, + gate_fn: str = 'swish', + elementwise_affine: Optional[bool] = True, + norm_eps: float = 1e-5, + gate_logit_normalizer: int = 16, + gate_low_rank_dim: int = 16, + clamp_min: Optional[float] = None, + fuse_norm: bool = True, + layer_idx: int = None, + ) -> GatedLinearAttention: + super().__init__() + + self.mode = mode + self.hidden_size = hidden_size + self.expand_k = expand_k + self.expand_v = expand_v + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads if num_kv_heads is not None else num_heads + self.num_kv_groups = self.num_heads // self.num_kv_heads + self.feature_map_fn = ACT2FN[feature_map] if feature_map is not None else None + + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.conv_bias = conv_bias + self.use_output_gate = use_output_gate + + self.key_dim = int(hidden_size * expand_k) + self.value_dim = int(hidden_size * expand_v) + self.key_dim_per_group = self.key_dim // self.num_kv_groups + self.value_dim_per_group = self.value_dim // self.num_kv_groups + self.clamp_min = clamp_min + self.layer_idx = layer_idx + + assert mode in ['chunk', 'fused_recurrent', 'fused_chunk'], f"Not suppoerted mode `{mode}`." + assert self.key_dim % num_heads == 0, f"key dim must be divisible by num_heads of {num_heads}" + assert self.value_dim % num_heads == 0, f"value dim must be divisible by num_heads of {num_heads}" + + self.head_qk_dim = self.key_dim // num_heads + self.head_v_dim = self.value_dim // num_heads + + self.q_proj = nn.Linear(hidden_size, self.key_dim, bias=False) + self.k_proj = nn.Linear(hidden_size, self.key_dim_per_group, bias=False) + self.v_proj = nn.Linear(hidden_size, self.value_dim_per_group, bias=False) + if self.use_output_gate: + self.g_proj = nn.Linear(hidden_size, self.value_dim, bias=False) + + if use_short_conv: + self.conv_size = conv_size + self.q_conv1d = ShortConvolution(self.key_dim, conv_size, activation='silu') + self.k_conv1d = ShortConvolution(self.key_dim_per_group, conv_size, activation='silu') + self.v_conv1d = ShortConvolution(self.value_dim_per_group, conv_size, activation='silu') + + self.gk_proj = nn.Sequential(nn.Linear(hidden_size, gate_low_rank_dim, bias=False), + nn.Linear(gate_low_rank_dim, self.key_dim_per_group, bias=True)) + self.o_proj = nn.Linear(self.value_dim, hidden_size, bias=False) + + if gate_fn == 'swish' and fuse_norm and use_output_gate: + self.g_norm_swish_gate = FusedRMSNormSwishGate(self.head_v_dim, elementwise_affine, norm_eps) + self.fuse_norm_and_gate = True + else: + self.fuse_norm_and_gate = False + self.g_norm = RMSNorm(hidden_size=self.head_v_dim, elementwise_affine=elementwise_affine, eps=norm_eps) + self.gate_fn = ACT2FN[gate_fn] + + self.gate_logit_normalizer = gate_logit_normalizer + + self.apply(self._initialize_weights) + + def _initialize_weights(self, module: nn.Module): + if getattr(module, "_is_hf_initialized", False): + return + if isinstance(module, nn.Linear): + nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5) + if module.bias is not None: + nn.init.zeros_(module.bias) + module._is_hf_initialized = True + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: 
Optional[Cache] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs + ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Cache]]: + if attention_mask is not None: + assert len(attention_mask.shape) == 2, ( + "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] " + "for padding purposes (0 indicating padding). " + "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed." + ) + + # launching the triton kernel for just one token will actually be slower + mode = 'fused_recurrent' if hidden_states.shape[1] == 1 else self.mode + + last_state = None + if past_key_values is not None and len(past_key_values) > self.layer_idx: + last_state = past_key_values[self.layer_idx] + if self.use_short_conv: + conv_state_q, conv_state_k, conv_state_v = None, None, None + if last_state is not None: + conv_state_q, conv_state_k, conv_state_v = last_state['conv_state'] + conv_mask = attention_mask[:, -hidden_states.shape[1]:] if attention_mask is not None else None + q, conv_state_q = self.q_conv1d(x=self.q_proj(hidden_states), + mask=conv_mask, + cache=conv_state_q, + output_final_state=use_cache) + k, conv_state_k = self.k_conv1d(x=self.k_proj(hidden_states), + mask=conv_mask, + cache=conv_state_k, + output_final_state=use_cache) + v, conv_state_v = self.v_conv1d(x=self.v_proj(hidden_states), + mask=conv_mask, + cache=conv_state_v, + output_final_state=use_cache) + else: + q = self.q_proj(hidden_states) + k = self.k_proj(hidden_states) + v = self.v_proj(hidden_states) + gk = self.gk_proj(hidden_states) + + if self.feature_map_fn is not None: + q, k = map(self.feature_map_fn, (q, k)) + # dealing with left-padding + if attention_mask is not None: + v = v.mul_(attention_mask[:, -v.shape[-2]:, None]) + q = rearrange(q, 'b t (h d) -> b t h d', h=self.num_heads) + if self.num_kv_groups > 1: + k, v, gk = (repeat(x, 'b t (h d) -> b t (h g) d', h=self.num_kv_heads, g=self.num_kv_groups) for x in (k, v, gk)) + else: + k, v, gk = (rearrange(x, 'b t (h d) -> b t h d', h=self.num_kv_heads) for x in (k, v, gk)) + gk = F.logsigmoid(gk) / self.gate_logit_normalizer + + if self.clamp_min is not None: + gk = torch.clamp_min(gk, self.clamp_min) + + recurrent_state = last_state['recurrent_state'] if last_state is not None else None + if mode == 'fused_recurrent': + o, recurrent_state = fused_recurrent_gla( + q=q, + k=k, + v=v, + gk=gk, + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + elif mode == 'fused_chunk': + o, recurrent_state = fused_chunk_gla( + q=q, + k=k, + v=v, + g=gk, + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + elif mode == 'chunk': + o, recurrent_state = chunk_gla( + q=q, + k=k, + v=v, + g=gk, + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + else: + raise NotImplementedError(f"Not supported mode `{mode}`.") + + if past_key_values is not None: + past_key_values.update( + recurrent_state=recurrent_state, + conv_state=(conv_state_q, conv_state_k, conv_state_v) if self.use_short_conv else None, + layer_idx=self.layer_idx, + offset=q.shape[2] + ) + + if self.use_output_gate: + g = self.g_proj(hidden_states) + if self.fuse_norm_and_gate: + g = rearrange(g, 'b t (h d) -> b t h d', h=self.num_heads) + o = self.g_norm_swish_gate(o, g) + o = rearrange(o, 'b t h d -> b t (h d)') + else: + o = rearrange(self.g_norm(o), 'b t h d -> b t (h d)') + o = o * self.gate_fn(g) + else: + o = 
rearrange(self.g_norm(o), 'b t h d -> b t (h d)')
+        o = self.o_proj(o)
+
+        return o, None, past_key_values
+
+    def state_size(self, **kwargs) -> int:
+        state_size = self.key_dim * self.head_v_dim
+        for module in self.children():
+            if isinstance(module, ShortConvolution):
+                state_size += module.state_size
+        return state_size
diff --git a/fla/layers/gsa.py b/fla/layers/gsa.py
new file mode 100644
index 0000000000000000000000000000000000000000..0b0c5f77659bd2ee0ef512c082465eb647a2253c
--- /dev/null
+++ b/fla/layers/gsa.py
@@ -0,0 +1,233 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from __future__ import annotations
+
+import warnings
+from typing import TYPE_CHECKING, Optional, Tuple
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+from einops import rearrange
+
+from fla.modules import RMSNorm, ShortConvolution
+from fla.modules.activations import swish
+from fla.modules.feature_map import (ReLUFeatureMap, SwishFeatureMap,
+                                     T2RFeatureMap)
+from fla.modules.layernorm import rms_norm_linear
+from fla.ops.gsa import chunk_gsa, fused_recurrent_gsa
+
+if TYPE_CHECKING:
+    from fla.models.utils import Cache
+
+
+class GatedSlotAttention(nn.Module):
+
+    def __init__(
+        self,
+        mode: str = 'chunk',
+        hidden_size: int = 1024,
+        expand_k: float = 1.,
+        expand_v: float = 1.,
+        num_heads: int = 4,
+        num_kv_heads: Optional[int] = None,
+        use_short_conv: bool = False,
+        conv_size: int = 4,
+        conv_bias: bool = False,
+        num_slots: Optional[int] = None,
+        elementwise_affine: Optional[bool] = True,
+        norm_first: bool = True,
+        norm_eps: float = 1e-5,
+        gate_logit_normalizer: int = 8,
+        feature_map: str = 'swish',
+        use_output_gate: bool = False,
+        use_norm: bool = True,
+        layer_idx: Optional[int] = None,
+        scale: Optional[float] = 1.,
+        **kwargs
+    ) -> GatedSlotAttention:
+        super().__init__()
+
+        self.mode = mode
+        self.hidden_size = hidden_size
+        self.expand_k = expand_k
+        self.expand_v = expand_v
+        self.num_heads = num_heads
+        self.num_kv_heads = num_heads if num_kv_heads is None else num_kv_heads
+        self.num_kv_groups = self.num_heads // self.num_kv_heads
+        self.key_dim = int(hidden_size * expand_k)
+        self.value_dim = int(hidden_size * expand_v)
+        self.key_dim_per_group = self.key_dim // self.num_kv_groups
+        self.value_dim_per_group = self.value_dim // self.num_kv_groups
+        self.head_k_dim = self.key_dim // self.num_heads
+        self.head_v_dim = self.value_dim // self.num_heads
+
+        self.use_short_conv = use_short_conv
+        self.conv_size = conv_size
+        self.conv_bias = conv_bias
+
+        self.gate_logit_normalizer = gate_logit_normalizer
+
+        self.use_output_gate = use_output_gate
+        self.use_norm = use_norm
+        self.scale = scale
+
+        if num_slots is None:
+            num_slots = self.head_k_dim
+        self.num_slots = num_slots
+        self.norm_first = norm_first
+
+        self.layer_idx = layer_idx
+
+        if layer_idx is None:
+            warnings.warn(
+                f"Instantiating {self.__class__.__name__} without passing `layer_idx` is not recommended and will lead "
+                "to errors during the forward call, if caching is used. Please make sure to provide a `layer_idx` "
+                "when creating this class."
+ ) + + if norm_first: + self.norm = RMSNorm(self.hidden_size, eps=norm_eps) + self.register_module('feature_map', None) + if feature_map == 'swish': + self.feature_map = SwishFeatureMap() + elif feature_map == 'relu': + self.feature_map = ReLUFeatureMap() + elif feature_map == 't2r': + self.feature_map = T2RFeatureMap(self.head_k_dim, self.head_k_dim) + else: + raise NotImplementedError(f"Feature map `{feature_map}` is not supported now.") + + self.q_proj = nn.Linear(self.hidden_size, self.key_dim, bias=False) + self.k_proj = nn.Linear(self.hidden_size, self.key_dim_per_group, bias=False) + self.v_proj = nn.Linear(self.hidden_size, self.value_dim_per_group, bias=False) + self.f_proj = nn.Linear(self.hidden_size, self.num_kv_heads * self.num_slots, bias=False) + + if use_short_conv: + self.conv_size = conv_size + self.q_conv1d = ShortConvolution(self.key_dim, conv_size, activation='silu') + self.k_conv1d = ShortConvolution(self.key_dim_per_group, conv_size, activation='silu') + self.v_conv1d = ShortConvolution(self.value_dim_per_group, conv_size, activation='silu') + + self.g_norm = RMSNorm(self.hidden_size, elementwise_affine, eps=norm_eps) + self.o_proj = nn.Linear(self.value_dim, self.hidden_size, bias=False) + + self.apply(self._initialize_weights) + + def _initialize_weights(self, module: nn.Module): + if getattr(module, "_is_hf_initialized", False): + return + if isinstance(module, nn.Linear): + nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5) + if module.bias is not None: + nn.init.zeros_(module.bias) + module._is_hf_initialized = True + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Cache] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs + ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Cache]]: + if attention_mask is not None: + assert len(attention_mask.shape) == 2, ( + "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] " + "for padding purposes (0 indicating padding). " + "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed." 
+ ) + + # launching the triton kernel for just one token will actually be slower + mode = 'fused_recurrent' if hidden_states.shape[1] == 1 else self.mode + + if self.norm_first: + hidden_states = self.norm(hidden_states) + + last_state = None + if past_key_values is not None and len(past_key_values) > self.layer_idx: + last_state = past_key_values[self.layer_idx] + + if self.use_short_conv: + conv_state_q, conv_state_k, conv_state_v = None, None, None + if last_state is not None: + conv_state_q, conv_state_k, conv_state_v = last_state['conv_state'] + conv_mask = attention_mask[:, -hidden_states.shape[1]:] if attention_mask is not None else None + q, conv_state_q = self.q_conv1d(x=self.q_proj(hidden_states), + mask=conv_mask, + cache=conv_state_q, + output_final_state=use_cache) + k, conv_state_k = self.k_conv1d(x=self.k_proj(hidden_states), + mask=conv_mask, + cache=conv_state_k, + output_final_state=use_cache) + v, conv_state_v = self.v_conv1d(x=self.v_proj(hidden_states), + mask=conv_mask, + cache=conv_state_v, + output_final_state=use_cache) + else: + q = self.q_proj(hidden_states) + k = self.k_proj(hidden_states) + v = self.v_proj(hidden_states) + f = self.f_proj(hidden_states) + + q = rearrange(q, 'b t (h d) -> b t h d', h=self.num_heads) + k = rearrange(k, 'b t (h d) -> b t h d', h=self.num_kv_heads) + v = rearrange(v, 'b t (h d) -> b t h d', h=self.num_kv_heads) + f = rearrange(f, 'b t (h m) -> b t h m', h=self.num_kv_heads) + + if self.feature_map is not None: + q, k = map(lambda x: self.feature_map(x), (q, k)) + v = swish(v) + + f = F.logsigmoid(f) / self.gate_logit_normalizer + s = (1 - f.exp()).to(f.dtype) + # dealing with left-padding + if attention_mask is not None: + s = s.mul_(attention_mask[:, -s.shape[1]:, None, None]) + v = v.mul_(attention_mask[:, -v.shape[1]:, None, None]) + + recurrent_state = last_state['recurrent_state'] if last_state is not None else None + if mode == 'fused_recurrent': + o, recurrent_state = fused_recurrent_gsa( + q=q, + k=k, + v=v, + s=s, + g=f, + initial_state=recurrent_state, + output_final_state=use_cache, + scale=self.scale, + head_first=False + ) + elif mode == 'chunk': + o, recurrent_state = chunk_gsa( + q=q, + k=k, + v=v, + s=s, + g=f, + initial_state=recurrent_state, + output_final_state=use_cache, + scale=self.scale, + head_first=False + ) + else: + raise NotImplementedError(f"Not supported mode `{mode}`.") + + if past_key_values is not None: + past_key_values.update( + recurrent_state=recurrent_state, + conv_state=(conv_state_q, conv_state_k, conv_state_v) if self.use_short_conv else None, + layer_idx=self.layer_idx, + offset=q.shape[2] + ) + + o = rearrange(o, 'b t h d -> b t (h d)') + o = rms_norm_linear(swish(o), self.g_norm.weight, self.g_norm.bias, self.o_proj.weight, self.o_proj.bias) + return o, None, past_key_values + + def state_size(self, *args, **kwargs) -> int: + return 2 * self.num_slots * self.hidden_size diff --git a/fla/layers/hgrn.py b/fla/layers/hgrn.py new file mode 100644 index 0000000000000000000000000000000000000000..97549de05f313cd41fdd918074c37985f7a0edcd --- /dev/null +++ b/fla/layers/hgrn.py @@ -0,0 +1,153 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +# "Hierarchically Gated Recurrent Neural Network for Sequence Modeling" [https://arxiv.org/abs/2311.04823] + +from __future__ import annotations + +from typing import TYPE_CHECKING, Optional, Tuple + +import torch +import torch.nn as nn +import torch.nn.functional as F + +from fla.modules import FusedRMSNormSwishGate, ShortConvolution 
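+# A minimal reference sketch of the recurrence that the fused kernels below
+# implement (gates assumed to be already in (0, 1); the fused ops instead take
+# log-space forget gates, as produced in `forward` via `F.logsigmoid`):
+#
+#     def hgrn_reference(i, f):  # i, f: (batch, seq_len, dim)
+#         h, outputs = torch.zeros_like(i[:, 0]), []
+#         for t in range(i.shape[1]):
+#             h = f[:, t] * h + i[:, t]  # h[t] = f[t] * h[t-1] + i[t]
+#             outputs.append(h)
+#         return torch.stack(outputs, dim=1)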
+from fla.modules.activations import swiglu
+from fla.ops.hgrn import chunk_hgrn, fused_recurrent_hgrn
+
+if TYPE_CHECKING:
+    from fla.models.utils import Cache
+
+
+class HGRNAttention(nn.Module):
+
+    def __init__(
+        self,
+        mode: str = 'chunk',
+        hidden_size: int = 1024,
+        expand_ratio: Optional[int] = 1,
+        use_short_conv: bool = False,
+        conv_size: int = 4,
+        conv_bias: bool = False,
+        elementwise_affine: Optional[bool] = True,
+        norm_eps: float = 1e-5,
+        layer_idx: int = None
+    ) -> HGRNAttention:
+        super().__init__()
+
+        self.mode = mode
+        self.hidden_size = hidden_size
+        self.expand_ratio = expand_ratio
+        self.input_dim = int(hidden_size * expand_ratio)
+
+        self.use_short_conv = use_short_conv
+        self.conv_size = conv_size
+        self.conv_bias = conv_bias
+
+        self.layer_idx = layer_idx
+
+        assert mode in ['chunk', 'fused_recurrent'], f"Not supported mode `{mode}`."
+
+        self.i_proj = nn.Linear(hidden_size, self.input_dim, bias=False)
+        self.f_proj = nn.Linear(hidden_size, self.input_dim, bias=False)
+        self.g_proj = nn.Linear(hidden_size, self.input_dim, bias=False)
+
+        if use_short_conv:
+            self.conv_size = conv_size
+            self.q_conv1d = ShortConvolution(self.input_dim, conv_size, activation=None)
+            self.f_conv1d = ShortConvolution(self.input_dim, conv_size, activation=None)
+            self.i_conv1d = ShortConvolution(self.input_dim, conv_size, activation=None)
+
+        self.g_norm = FusedRMSNormSwishGate(self.input_dim, elementwise_affine, norm_eps)
+        self.o_proj = nn.Linear(self.input_dim, hidden_size, bias=False)
+
+        self.apply(self._initialize_weights)
+
+    def _initialize_weights(self, module: nn.Module):
+        if getattr(module, "_is_hf_initialized", False):
+            return
+        if isinstance(module, nn.Linear):
+            nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5)
+            if module.bias is not None:
+                nn.init.zeros_(module.bias)
+        module._is_hf_initialized = True
+
+    def forward(
+        self,
+        hidden_states: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+        past_key_values: Optional[Cache] = None,
+        use_cache: Optional[bool] = False,
+        output_attentions: Optional[bool] = False,
+        lower_bound: Optional[torch.Tensor] = None,
+        **kwargs
+    ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Cache]]:
+        if attention_mask is not None:
+            assert len(attention_mask.shape) == 2, (
+                "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] "
+                "for padding purposes (0 indicating padding). "
+                "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed."
+ ) + + # launching the triton kernel for just one token will actually be slower + mode = 'fused_recurrent' if hidden_states.shape[1] == 1 else self.mode + + last_state = None + if past_key_values is not None and len(past_key_values) > self.layer_idx: + last_state = past_key_values[self.layer_idx] + + if self.use_short_conv: + conv_state_i, conv_state_f = None, None + if last_state is not None: + conv_state_i, conv_state_f = last_state['conv_state'] + conv_mask = attention_mask[:, -hidden_states.shape[1]:] if attention_mask is not None else None + i, conv_state_i = self.i_conv1d(x=self.i_proj(hidden_states), + mask=conv_mask, + cache=conv_state_i, + output_final_state=use_cache) + f, conv_state_f = self.f_conv1d(x=self.f_proj(hidden_states), + mask=conv_mask, + cache=conv_state_f, + output_final_state=use_cache) + else: + i = self.i_proj(hidden_states) + f = self.f_proj(hidden_states) + + # the lower bound for the first layer is zero + if lower_bound is None or self.layer_idx == 0: + i, f = swiglu(i, 1 - f.sigmoid()), F.logsigmoid(f) + else: + g = lower_bound + (1 - lower_bound) * f.sigmoid() + i, f = swiglu(i, 1 - g), g.log() + + # dealing with left-padding + if attention_mask is not None: + i = i.mul_(attention_mask[:, -i.shape[-2]:, None]) + + recurrent_state = last_state['recurrent_state'] if last_state is not None else None + if mode == 'chunk': + o, recurrent_state = chunk_hgrn(i, f, recurrent_state, use_cache) + elif mode == 'fused_recurrent': + o, recurrent_state = fused_recurrent_hgrn(i, f, recurrent_state, use_cache) + else: + raise NotImplementedError(f"Not supported mode `{mode}`.") + + if past_key_values is not None: + past_key_values.update( + recurrent_state=recurrent_state, + conv_state=(conv_state_i, conv_state_f) if self.use_short_conv else None, + layer_idx=self.layer_idx, + offset=i.shape[2] + ) + + o = self.g_norm(o, self.g_proj(hidden_states)) + o = self.o_proj(o) + + return o, None, past_key_values + + def state_size(self, **kwargs) -> int: + state_size = self.hidden_size + for module in self.children(): + if isinstance(module, ShortConvolution): + state_size += module.state_size + return state_size diff --git a/fla/layers/hgrn2.py b/fla/layers/hgrn2.py new file mode 100644 index 0000000000000000000000000000000000000000..769c19b7d2ed12f97ebe4cb44f474d3eba6f72fc --- /dev/null +++ b/fla/layers/hgrn2.py @@ -0,0 +1,207 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +# "HGRN2: Gated Linear RNNs with State Expansion"[https://arxiv.org/abs/2404.07904] + +from __future__ import annotations + +from typing import TYPE_CHECKING, Optional, Tuple + +import torch +import torch.nn as nn +import torch.nn.functional as F +from einops import rearrange + +from fla.modules import RMSNorm, ShortConvolution +from fla.modules.activations import swish +from fla.modules.layernorm import rms_norm_linear +from fla.ops.gla import chunk_gla, fused_chunk_gla, fused_recurrent_gla + +if TYPE_CHECKING: + from fla.models.utils import Cache + + +class HGRN2Attention(nn.Module): + + def __init__( + self, + mode: str = 'chunk', + hidden_size: int = 1024, + num_heads: Optional[int] = None, + expand_ratio: Optional[int] = 128, + use_short_conv: bool = False, + conv_size: int = 4, + conv_bias: bool = False, + elementwise_affine: Optional[bool] = True, + norm_eps: float = 1e-5, + layer_idx: int = None + ) -> HGRN2Attention: + super().__init__() + + self.mode = mode + self.hidden_size = hidden_size + + if expand_ratio is None and num_heads is not None: + expand_ratio = 
hidden_size // num_heads
+        elif expand_ratio is not None and num_heads is None:
+            num_heads = hidden_size // expand_ratio
+        elif expand_ratio is None and num_heads is None:
+            raise RuntimeError("One of `expand_ratio` or `num_heads` should be provided.")
+        self.num_heads = num_heads
+        self.expand_ratio = expand_ratio
+
+        self.use_short_conv = use_short_conv
+        self.conv_size = conv_size
+        self.conv_bias = conv_bias
+
+        self.forget_dim = int(self.num_heads * self.expand_ratio)
+        self.input_dim = hidden_size
+        self.layer_idx = layer_idx
+
+        assert mode in ['chunk', 'fused_recurrent', 'fused_chunk'], f"Not supported mode `{mode}`."
+        assert self.forget_dim % num_heads == 0, f"forget dim must be divisible by num_heads of {num_heads}"
+        assert self.input_dim % num_heads == 0, f"input dim must be divisible by num_heads of {num_heads}"
+
+        self.head_f_dim = self.expand_ratio
+        self.head_i_dim = self.hidden_size // num_heads
+
+        self.q_proj = nn.Linear(hidden_size, self.forget_dim, bias=False)
+        self.f_proj = nn.Linear(hidden_size, self.forget_dim, bias=False)
+        self.i_proj = nn.Linear(hidden_size, self.input_dim, bias=False)
+
+        if use_short_conv:
+            self.conv_size = conv_size
+            self.q_conv1d = ShortConvolution(self.forget_dim, conv_size, activation=None)
+            self.f_conv1d = ShortConvolution(self.forget_dim, conv_size, activation=None)
+            self.i_conv1d = ShortConvolution(self.input_dim, conv_size, activation=None)
+
+        self.g_norm = RMSNorm(hidden_size=self.hidden_size, elementwise_affine=elementwise_affine, eps=norm_eps)
+        self.o_proj = nn.Linear(self.input_dim, hidden_size, bias=False)
+
+        self.apply(self._initialize_weights)
+
+    def _initialize_weights(self, module: nn.Module):
+        if getattr(module, "_is_hf_initialized", False):
+            return
+        if isinstance(module, nn.Linear):
+            nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5)
+            if module.bias is not None:
+                nn.init.zeros_(module.bias)
+        module._is_hf_initialized = True
+
+    def forward(
+        self,
+        hidden_states: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+        past_key_values: Optional[Cache] = None,
+        use_cache: Optional[bool] = False,
+        output_attentions: Optional[bool] = False,
+        lower_bound: Optional[torch.Tensor] = None,
+        **kwargs
+    ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Cache]]:
+        if attention_mask is not None:
+            assert len(attention_mask.shape) == 2, (
+                "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] "
+                "for padding purposes (0 indicating padding). "
+                "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed."
+ ) + + # launching the triton kernel for just one token will actually be slower + mode = 'fused_recurrent' if hidden_states.shape[1] == 1 else self.mode + + last_state = None + if past_key_values is not None and len(past_key_values) > self.layer_idx: + last_state = past_key_values[self.layer_idx] + + if self.use_short_conv: + conv_state_q, conv_state_f, conv_state_i = None, None, None + if last_state is not None: + conv_state_q, conv_state_f, conv_state_i = last_state['conv_state'] + conv_mask = attention_mask[:, -hidden_states.shape[1]:] if attention_mask is not None else None + q, conv_state_q = self.q_conv1d(x=self.q_proj(hidden_states), + mask=conv_mask, + cache=conv_state_q, + output_final_state=use_cache) + f, conv_state_f = self.f_conv1d(x=self.f_proj(hidden_states), + mask=conv_mask, + cache=conv_state_f, + output_final_state=use_cache) + i, conv_state_i = self.i_conv1d(x=self.i_proj(hidden_states), + mask=conv_mask, + cache=conv_state_i, + output_final_state=use_cache) + else: + q = self.q_proj(hidden_states) + f = self.f_proj(hidden_states) + i = self.i_proj(hidden_states) + + # dealing with left-padding + if attention_mask is not None: + i = i.mul_(attention_mask[:, -i.shape[-2]:, None]) + + q = swish(q) + + # improve precision + f = f.float() + + # the lower bound for the first layer is zero + if lower_bound is None or self.layer_idx == 0: + k, g = 1 - f.sigmoid(), F.logsigmoid(f) + else: + g = lower_bound + (1 - lower_bound) * f.sigmoid() + k, g = 1 - g, g.log() + + q, k, i, g = map(lambda x: rearrange(x, '... (h d) -> ... h d', h=self.num_heads), (q, k.to(i), i, g)) + + recurrent_state = last_state['recurrent_state'] if last_state is not None else None + if mode == 'fused_recurrent': + o, recurrent_state = fused_recurrent_gla( + q=q, + k=k, + v=i, + gk=g, + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + elif mode == 'fused_chunk': + o, recurrent_state = fused_chunk_gla( + q=q, + k=k, + v=i, + g=g, + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + elif mode == 'chunk': + o, recurrent_state = chunk_gla( + q=q, + k=k, + v=i, + g=g, + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + else: + raise NotImplementedError(f"Not supported mode `{mode}`.") + + if past_key_values is not None: + past_key_values.update( + recurrent_state=recurrent_state, + conv_state=(conv_state_q, conv_state_f, conv_state_i) if self.use_short_conv else None, + layer_idx=self.layer_idx, + offset=q.shape[2] + ) + + o = rearrange(o, '... h d -> ... 
(h d)')
+        o = rms_norm_linear(o, self.g_norm.weight, self.g_norm.bias, self.o_proj.weight, self.o_proj.bias)
+        return o, None, past_key_values
+
+    def state_size(self, **kwargs) -> int:
+        state_size = self.forget_dim * self.head_i_dim
+        for module in self.children():
+            if isinstance(module, ShortConvolution):
+                state_size += module.state_size
+        return state_size
diff --git a/fla/layers/linear_attn.py b/fla/layers/linear_attn.py
new file mode 100644
index 0000000000000000000000000000000000000000..7aae4e4371963e61e8bd52ffd7a3c97eaf81ee49
--- /dev/null
+++ b/fla/layers/linear_attn.py
@@ -0,0 +1,171 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from typing import Optional
+
+import torch.nn as nn
+import torch.nn.functional as F
+from einops import rearrange, repeat
+
+from fla.modules import RMSNorm
+from fla.modules.feature_map import (DPFPFeatureMap, HadamardFeatureMap,
+                                     HedgehogFeatureMap, T2RFeatureMap)
+from fla.ops.linear_attn import (chunk_linear_attn, fused_chunk_linear_attn,
+                                 fused_recurrent_linear_attn)
+
+
+class LinearAttention(nn.Module):
+    def __init__(
+        self,
+        mode: str = 'chunk',
+        hidden_size: int = 1024,
+        expand_k: float = 1.0,
+        expand_v: float = 1.0,
+        num_heads: int = 8,
+        num_kv_heads: Optional[int] = None,
+        feature_map: str = 'elementwise_product',
+        tie_feature_map_qk: bool = False,
+        output_norm: str = 'rmsnorm',
+        norm_q: bool = False,
+        norm_k: bool = False,
+        # standard linear attention normalization
+        do_feature_map_norm: bool = False,
+        elementwise_affine: bool = True,
+        norm_eps: float = 1e-5,
+        **kwargs
+    ):
+        super().__init__()
+
+        self.hidden_size = hidden_size
+        self.mode = mode
+        self.num_heads = num_heads
+        self.num_kv_heads = num_kv_heads if num_kv_heads is not None else num_heads
+        self.num_kv_groups = self.num_heads // self.num_kv_heads
+        self.key_dim = int(hidden_size * expand_k)
+        self.value_dim = int(hidden_size * expand_v)
+        self.key_dim_per_group = self.key_dim // self.num_kv_groups
+        self.value_dim_per_group = self.value_dim // self.num_kv_groups
+
+        assert mode in ['chunk', 'fused_chunk', 'fused_recurrent'], f"Not supported mode `{mode}`."
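+        # e.g. with the defaults (hidden_size=1024, expand_k=expand_v=1.0, num_heads=8),
+        # key_dim = value_dim = 1024, so each head works with 128-dim queries/keys/values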
+ assert self.key_dim % num_heads == 0, f"key dim must be divisible by num_heads of {num_heads}" + assert self.value_dim % num_heads == 0, f"value dim must be divisible by num_heads of {num_heads}" + + self.head_qk_dim = self.key_dim // num_heads + self.head_v_dim = self.value_dim // num_heads + self.do_feature_map_norm = do_feature_map_norm + + if feature_map == 'hedgehog': + if tie_feature_map_qk: + self.feature_map_q = self.feature_map_k = HedgehogFeatureMap(head_dim=self.head_qk_dim) + else: + self.feature_map_q = HedgehogFeatureMap(head_dim=self.head_qk_dim) + self.feature_map_k = HedgehogFeatureMap(head_dim=self.head_qk_dim) + + elif feature_map == 't2r': + if tie_feature_map_qk: + self.feature_map_q = self.feature_map_k = T2RFeatureMap(head_dim=self.head_qk_dim) + else: + self.feature_map_q = T2RFeatureMap(head_dim=self.head_qk_dim) + self.feature_map_k = T2RFeatureMap(head_dim=self.head_qk_dim) + + elif feature_map == 'elementwise_product': + if tie_feature_map_qk: + self.feature_map_q = self.feature_map_k = HadamardFeatureMap(head_dim=self.head_qk_dim) + else: + self.feature_map_q = HadamardFeatureMap(head_dim=self.head_qk_dim) + self.feature_map_k = HadamardFeatureMap(head_dim=self.head_qk_dim) + + elif feature_map == 'dpfp': + self.feature_map_q = DPFPFeatureMap(head_dim=self.head_qk_dim) + self.feature_map_k = DPFPFeatureMap(head_dim=self.head_qk_dim) + + elif feature_map == 'elu': + def elu(x): + return F.elu(x) + 1 + self.feature_map_q = elu + self.feature_map_k = elu + + elif feature_map == 'relu': + self.feature_map_q = nn.ReLU() + self.feature_map_k = nn.ReLU() + + elif feature_map == 'identity': + self.feature_map_q = nn.Identity() + self.feature_map_k = nn.Identity() + else: + raise NotImplementedError(f"Not supported feature map `{feature_map}`.") + + self.q_proj = nn.Linear(hidden_size, self.key_dim, bias=False) + self.k_proj = nn.Linear(hidden_size, self.key_dim_per_group, bias=False) + self.v_proj = nn.Linear(hidden_size, self.value_dim_per_group, bias=False) + + if output_norm == 'rmsnorm': + self.norm = RMSNorm(hidden_size=self.head_v_dim, elementwise_affine=elementwise_affine, eps=norm_eps) + elif output_norm == 'identity': + self.norm = nn.Identity() + else: + raise NotImplementedError(f"Not supported output norm `{output_norm}`.") + + self.o_proj = nn.Linear(self.value_dim, hidden_size, bias=False) + + self.norm_q = norm_q + self.norm_k = norm_k + + self.apply(self._initialize_weights) + + def _initialize_weights(self, module: nn.Module): + if getattr(module, "_is_hf_initialized", False): + return + if isinstance(module, nn.Linear): + nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5) + if module.bias is not None: + nn.init.zeros_(module.bias) + module._is_hf_initialized = True + + def forward(self, x): + mode = self.mode + q = self.q_proj(x) + k = self.k_proj(x) + v = self.v_proj(x) + + q = rearrange(q, '... (h d) -> ... h d', h=self.num_heads) + if self.num_kv_groups > 1: + k, v = (repeat(x, '... (h d) -> ... (h g) d', h=self.num_kv_heads, g=self.num_kv_groups) for x in (k, v)) + else: + k, v = (rearrange(x, '... (h d) -> ... 
h d', h=self.num_kv_heads) for x in (k, v))
+
+        q = self.feature_map_q(q)
+        k = self.feature_map_k(k)
+
+        if self.norm_q:
+            q = q / (q.sum(-1, True) + 1e-4)
+        if self.norm_k:
+            k = k / (k.sum(-1, True) + 1e-4)
+
+        if mode == 'chunk':
+            o, final_state = chunk_linear_attn(
+                q=q,
+                k=k,
+                v=v,
+                normalize=self.do_feature_map_norm,
+                head_first=False
+            )
+        elif mode == 'fused_chunk':
+            o, final_state = fused_chunk_linear_attn(
+                q=q,
+                k=k,
+                v=v,
+                normalize=self.do_feature_map_norm,
+            )
+        elif mode == 'fused_recurrent':
+            o, final_state = fused_recurrent_linear_attn(
+                q=q,
+                k=k,
+                v=v,
+                normalize=self.do_feature_map_norm,
+            )
+        else:
+            raise NotImplementedError
+        o = self.norm(o)
+        o = self.o_proj(o)
+        return o
diff --git a/fla/layers/multiscale_retention.py b/fla/layers/multiscale_retention.py
new file mode 100644
index 0000000000000000000000000000000000000000..45312bec92bd592e61ce646e2d6036b07e76fa41
--- /dev/null
+++ b/fla/layers/multiscale_retention.py
@@ -0,0 +1,282 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Optional, Tuple
+
+import torch
+import torch.nn as nn
+from einops import rearrange, repeat
+from transformers.activations import ACT2FN
+
+from fla.modules import FusedRMSNormSwishGate, RMSNorm, ShortConvolution
+from fla.modules.rotary import RotaryEmbedding
+from fla.ops.retention import (chunk_retention, fused_chunk_retention,
+                               fused_recurrent_retention, parallel_retention)
+
+if TYPE_CHECKING:
+    from fla.models.utils import Cache
+
+
+class MultiScaleRetention(nn.Module):
+    r"""
+    The layer implementation for [Retentive Network: A Successor to Transformer for Large Language Models](https://arxiv.org/pdf/2307.08621.pdf). # noqa
+
+    Args:
+        mode (str, Optional):
+            Which Retention kernel to use.
+            Currently available: `chunk`, `fused_recurrent`, `parallel`, and `fused_chunk`.
+            Default: `chunk`.
+        hidden_size (int, Optional):
+            The hidden size of the input. Default: 1024.
+        expand_k (float, Optional):
+            The expansion ratio for the key dim. Default: 1.0.
+        expand_v (float, Optional):
+            The expansion ratio for the value dim. Default: 2.0.
+        num_heads (int, Optional):
+            The number of heads. Default: 8.
+        num_kv_heads (int, Optional):
+            The number of key/value heads, used for MQA. Default: None.
+        feature_map (str, Optional):
+            Feature map function applied to queries/keys. Default: None.
+        use_short_conv (bool, Optional):
+            Whether to use short convolutions. Default: `False`.
+        conv_size (int, Optional):
+            The kernel size of the short convolution, only used when `use_short_conv` is `True`. Default: 4.
+        conv_bias (bool, Optional):
+            Whether to use bias in the short convolution, only used when `use_short_conv` is `True`. Default: `False`.
+        use_output_gate (bool, Optional):
+            Whether to use output gate. Default: `True`.
+        gate_fn (str, Optional):
+            The activation function for the output gate. Default: `swish`.
+        elementwise_affine (bool, Optional):
+            If `True`, applies elementwise affine to LayerNorm with learnable parameters. Default: `True`.
+        norm_eps (float, Optional):
+            The epsilon value for the layernorm/rmsnorm layer. Default: 1e-5.
+        fuse_norm (bool, Optional):
+            Whether to fuse the norm and the output gate for better memory footprint. Default: `True`.
+        layer_idx (int, Optional):
+            The index of the layer. Default: None.
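+
+    Example (an illustrative usage sketch, not from the original docs; it assumes
+    a CUDA device, since the retention kernels are implemented in Triton):
+        >>> import torch
+        >>> from fla.layers.multiscale_retention import MultiScaleRetention
+        >>> layer = MultiScaleRetention(hidden_size=1024, num_heads=8).cuda()
+        >>> x = torch.randn(2, 128, 1024, device='cuda')
+        >>> o, _, past_key_values = layer(x)
+        >>> o.shape  # the expanded value dim is projected back to hidden_size
+        torch.Size([2, 128, 1024])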
+ """ + + def __init__( + self, + mode: str = 'chunk', + hidden_size: int = 1024, + expand_k: float = 1.0, + expand_v: float = 2.0, + num_heads: int = 8, + num_kv_heads: Optional[int] = None, + feature_map: Optional[str] = None, + use_short_conv: bool = False, + conv_size: int = 4, + conv_bias: bool = False, + use_output_gate: bool = True, + gate_fn: str = 'swish', + elementwise_affine: Optional[bool] = True, + norm_eps: float = 1e-5, + fuse_norm: bool = True, + layer_idx: int = None, + **kwargs + ) -> MultiScaleRetention: + super().__init__() + + self.mode = mode + self.hidden_size = hidden_size + self.expand_k = expand_k + self.expand_v = expand_v + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads if num_kv_heads is not None else num_heads + self.num_kv_groups = self.num_heads // self.num_kv_heads + self.feature_map_fn = ACT2FN[feature_map] if feature_map is not None else None + + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.conv_bias = conv_bias + self.use_output_gate = use_output_gate + + self.key_dim = int(hidden_size * expand_k) + self.value_dim = int(hidden_size * expand_v) + self.key_dim_per_group = self.key_dim // self.num_kv_groups + self.value_dim_per_group = self.value_dim // self.num_kv_groups + self.layer_idx = layer_idx + + assert mode in ['chunk', 'fused_chunk', 'parallel', 'fused_recurrent'], f"Not suppoerted mode `{mode}`." + assert self.key_dim % num_heads == 0, f"key dim must be divisible by num_heads of {num_heads}" + assert self.value_dim % num_heads == 0, f"value dim must be divisible by num_heads of {num_heads}" + + self.head_qk_dim = self.key_dim // num_heads + self.head_v_dim = self.value_dim // num_heads + + self.q_proj = nn.Linear(hidden_size, self.key_dim, bias=False) + self.k_proj = nn.Linear(hidden_size, self.key_dim_per_group, bias=False) + self.v_proj = nn.Linear(hidden_size, self.value_dim_per_group, bias=False) + if self.use_output_gate: + self.g_proj = nn.Linear(hidden_size, self.value_dim, bias=False) + + if use_short_conv: + self.conv_size = conv_size + self.q_conv1d = ShortConvolution(self.key_dim, conv_size, activation='silu') + self.k_conv1d = ShortConvolution(self.key_dim_per_group, conv_size, activation='silu') + self.v_conv1d = ShortConvolution(self.value_dim_per_group, conv_size, activation='silu') + + self.o_proj = nn.Linear(self.value_dim, hidden_size, bias=False) + + if gate_fn == 'swish' and fuse_norm and use_output_gate: + self.g_norm_swish_gate = FusedRMSNormSwishGate(self.head_v_dim, elementwise_affine, norm_eps) + self.fuse_norm_and_gate = True + else: + self.fuse_norm_and_gate = False + self.g_norm = RMSNorm(hidden_size=self.head_v_dim, elementwise_affine=elementwise_affine, eps=norm_eps) + self.gate_fn = ACT2FN[gate_fn] + + # TODO: fix this issue + # https://github.com/Dao-AILab/flash-attention/blob/main/flash_attn/ops/triton/rotary.py#L180 + # Ideally, we would want to support arbitrary d_head_qk + assert self.head_qk_dim <= 256, "head_qk_dim must be less than or equal to 256" + self.rotary = RotaryEmbedding(dim=self.head_qk_dim) + + self.apply(self._initialize_weights) + + def _initialize_weights(self, module: nn.Module): + if getattr(module, "_is_hf_initialized", False): + return + if isinstance(module, nn.Linear): + nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5) + if module.bias is not None: + nn.init.zeros_(module.bias) + module._is_hf_initialized = True + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: 
Optional[Cache] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs + ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Cache]]: + if attention_mask is not None: + assert len(attention_mask.shape) == 2, ( + "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] " + "for padding purposes (0 indicating padding). " + "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed." + ) + + # launching the triton kernel for just one token will actually be slower + mode = 'fused_recurrent' if hidden_states.shape[1] == 1 else self.mode + + last_state = None + if past_key_values is not None and len(past_key_values) > self.layer_idx: + last_state = past_key_values[self.layer_idx] + + if self.use_short_conv: + conv_state_q, conv_state_k, conv_state_v = None, None, None + if last_state is not None: + conv_state_q, conv_state_k, conv_state_v = last_state['conv_state'] + conv_mask = attention_mask[:, -hidden_states.shape[1]:] if attention_mask is not None else None + q, conv_state_q = self.q_conv1d(x=self.q_proj(hidden_states), + mask=conv_mask, + cache=conv_state_q, + output_final_state=use_cache) + k, conv_state_k = self.k_conv1d(x=self.k_proj(hidden_states), + mask=conv_mask, + cache=conv_state_k, + output_final_state=use_cache) + v, conv_state_v = self.v_conv1d(x=self.v_proj(hidden_states), + mask=conv_mask, + cache=conv_state_v, + output_final_state=use_cache) + else: + q = self.q_proj(hidden_states) + k = self.k_proj(hidden_states) + v = self.v_proj(hidden_states) + + # dealing with left-padding + if attention_mask is not None: + v = v.mul_(attention_mask[:, -v.shape[-2]:, None]) + q = rearrange(q, '... (h d) -> ... h d', h=self.num_heads) + k = rearrange(k, '... (h d) -> ... 
h d', h=self.num_kv_heads)
+        if self.feature_map_fn is not None:
+            q, k = map(self.feature_map_fn, (q, k))
+
+        seqlen_offset, max_seqlen = 0, q.shape[1]
+        if past_key_values is not None:
+            seqlen_offset = past_key_values.get_seq_length(self.layer_idx)
+            max_seqlen = q.shape[1] + seqlen_offset
+
+        if attention_mask is not None:
+            # adjust the offsets by the number of padding tokens in each left-padded sequence
+            seqlen_offset = (seqlen_offset + attention_mask.sum(-1) - attention_mask.shape[-1]).clamp(min=0)
+            max_seqlen = q.shape[1] + max(seqlen_offset)
+
+        q, k = self.rotary(q, k, seqlen_offset, max_seqlen)
+        if self.num_kv_groups > 1:
+            k = repeat(k, 'b t h d -> b t (h g) d', h=self.num_kv_heads, g=self.num_kv_groups)
+            v = repeat(v, 'b t (h d) -> b t (h g) d', h=self.num_kv_heads, g=self.num_kv_groups)
+        else:
+            # k already has shape (b, t, h, d) after the rotary embedding
+            v = rearrange(v, 'b t (h d) -> b t h d', h=self.num_kv_heads)
+
+        recurrent_state = last_state['recurrent_state'] if last_state is not None else None
+        if mode == 'chunk':
+            o, recurrent_state = chunk_retention(
+                q=q,
+                k=k,
+                v=v,
+                initial_state=recurrent_state,
+                output_final_state=use_cache,
+                head_first=False
+            )
+        elif mode == 'fused_chunk':
+            o, recurrent_state = fused_chunk_retention(
+                q=q,
+                k=k,
+                v=v,
+                initial_state=recurrent_state,
+                output_final_state=use_cache,
+                head_first=False
+            )
+        elif mode == 'parallel':
+            o, recurrent_state = parallel_retention(q, k, v, head_first=False)
+        elif mode == 'fused_recurrent':
+            o, recurrent_state = fused_recurrent_retention(
+                q=q,
+                k=k,
+                v=v,
+                initial_state=recurrent_state,
+                output_final_state=use_cache,
+                head_first=False
+            )
+        else:
+            raise NotImplementedError(f"Not supported mode `{mode}`.")
+
+        if past_key_values is not None:
+            past_key_values.update(
+                recurrent_state=recurrent_state,
+                conv_state=(conv_state_q, conv_state_k, conv_state_v) if self.use_short_conv else None,
+                layer_idx=self.layer_idx,
+                offset=q.shape[2]
+            )
+
+        if self.use_output_gate:
+            g = self.g_proj(hidden_states)
+            if self.fuse_norm_and_gate:
+                g = rearrange(g, 'b t (h d) -> b t h d', h=self.num_heads)
+                o = self.g_norm_swish_gate(o, g)
+                o = rearrange(o, 'b t h d -> b t (h d)')
+            else:
+                o = rearrange(self.g_norm(o), 'b t h d -> b t (h d)')
+                o = o * self.gate_fn(g)
+        else:
+            o = rearrange(self.g_norm(o), 'b t h d -> b t (h d)')
+        o = self.o_proj(o)
+
+        return o, None, past_key_values
+
+    def state_size(self, **kwargs) -> int:
+        state_size = self.key_dim * self.head_v_dim
+        for module in self.children():
+            if isinstance(module, ShortConvolution):
+                state_size += module.state_size
+        return state_size
diff --git a/fla/layers/rebased.py b/fla/layers/rebased.py
new file mode 100644
index 0000000000000000000000000000000000000000..b55a1df64ee1f4e8373c3be0d822043f0f635c25
--- /dev/null
+++ b/fla/layers/rebased.py
@@ -0,0 +1,136 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+"""
+https://github.com/corl-team/rebased/blob/main/flash_linear_attention/fla/layers/rebased_fast.py
+"""
+
+from __future__ import annotations
+
+from typing import Optional
+
+import torch
+import torch.nn as nn
+from einops import rearrange
+
+from fla.modules.feature_map import RebasedFeatureMap
+from fla.ops.linear_attn import chunk_linear_attn, fused_chunk_linear_attn
+from fla.ops.rebased import parallel_rebased
+
+
+class ReBasedLinearAttention(nn.Module):
+    def __init__(
+        self,
+        hidden_size: int,
+        l_max: int = 2048,
+        feature_dim: int = 16,
+        num_key_value_heads: int = 16,
+        num_heads: int = 16,
+        use_gamma: Optional[bool] = True,
+        use_beta: Optional[bool] = True,
+        normalize: Optional[bool] = True,
+        causal: bool = True,
+        eps: float = 1e-5,
+        mode: str = "parallel",
+        layer_idx: Optional[int] = None,
+        **kwargs
+    ) -> ReBasedLinearAttention:
+        super().__init__()
+        self.hidden_size = hidden_size
+        self.l_max = l_max
+        self.mode = mode
+        assert self.mode in ["fused_chunk", "parallel", 'chunk']
+
+        # linear attention
+        self.feature_dim = feature_dim
+        self.num_key_value_heads = num_key_value_heads
+        self.num_heads = num_heads
+        self.head_dim = self.hidden_size // self.num_key_value_heads
+        self.use_gamma = use_gamma
+        self.use_beta = use_beta
+        self.normalize = normalize
+        self.causal = causal
+
+        self.feature_map = RebasedFeatureMap(self.feature_dim, use_gamma, use_beta, normalize)
+        self.q_proj = nn.Linear(self.hidden_size, self.feature_dim * self.num_heads, bias=False)
+        self.k_proj = nn.Linear(self.hidden_size, self.feature_dim * self.num_heads, bias=False)
+        self.v_proj = nn.Linear(self.hidden_size, self.num_key_value_heads * self.head_dim, bias=False)
+        self.o_proj = nn.Linear(self.num_heads * self.head_dim, self.hidden_size, bias=False)
+        self.dropout = nn.Identity()
+        self.eps = eps
+
+        self.apply(self._initialize_weights)
+
+    def _initialize_weights(self, module: nn.Module):
+        if getattr(module, "_is_hf_initialized", False):
+            return
+        if isinstance(module, nn.Linear):
+            nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5)
+            if module.bias is not None:
+                nn.init.zeros_(module.bias)
+        module._is_hf_initialized = True
+
+    def forward(self, hidden_states: torch.Tensor, **kwargs):
+        mode = self.mode
+        q, k, v = self.q_proj(hidden_states), self.k_proj(hidden_states), self.v_proj(hidden_states)
+        q, k, v = map(lambda x: rearrange(x, "... (h d) -> ... h d", h=self.num_heads), [q, k, v])
+        q, k = self.feature_map(q, flatten=(mode != 'parallel')), self.feature_map(k, flatten=(mode != 'parallel'))
+        if mode == "fused_chunk":
+            # the linear-attention kernels also return a final state; it is unused here
+            o, _ = fused_chunk_linear_attn(
+                q=q,
+                k=k,
+                v=v,
+                normalize=True,
+                scale=1,
+                head_first=False
+            )
+        elif mode == 'chunk':
+            o, _ = chunk_linear_attn(
+                q=q,
+                k=k,
+                v=v,
+                normalize=True,
+                scale=1,
+                head_first=False
+            )
+        elif mode == 'parallel':
+            assert q.shape[-1] <= 128
+            o = parallel_rebased(
+                q=q,
+                k=k,
+                v=v,
+                eps=self.eps,
+                use_scale=True,
+                use_normalize=True,
+                head_first=False
+            )
+        o = self.o_proj(o)
+        o = self.dropout(o)
+        return o
+
+    # https://github.com/HazyResearch/zoology/blob/main/zoology/mixers/based.py#L119
+    def forward_reference(self, hidden_states: torch.Tensor, filters: torch.Tensor = None, *args, **kwargs):
+        """
+        hidden_states (torch.Tensor): input of shape (b, t, d)
+        returns (torch.Tensor): output of shape (b, t, d)
+        """
+        b, t, _ = hidden_states.size()
+        q, k, v = self.q_proj(hidden_states), self.k_proj(hidden_states), self.v_proj(hidden_states)
+
+        q = q.view(b, t, self.num_heads, self.feature_dim).transpose(1, 2)
+        k = k.view(b, t, self.num_key_value_heads, self.feature_dim).transpose(1, 2)
+        v = v.view(b, t, self.num_key_value_heads, self.head_dim).transpose(1, 2)
+
+        # Linear attention
+        q, k = self.feature_map(q), self.feature_map(k)
+        q, k, v = q.unsqueeze(-2), k.unsqueeze(-2), v.unsqueeze(-1)
+
+        # Compute attention
+        if self.causal:
+            y = ((q * (k * v).cumsum(2)).sum(-1) / ((q * k.cumsum(2)).sum(-1) + self.eps))
+        else:
+            y = ((q * (k * v).sum(2, True)).sum(-1) / ((q * k.sum(2, True)).sum(-1) + self.eps))
+        y = rearrange(y, 'b h t d -> b t (h d)')
+        y = self.o_proj(y.to(hidden_states.dtype))
+        y = self.dropout(y)
+        return y.to(hidden_states.dtype)
diff --git a/fla/layers/rwkv6.py b/fla/layers/rwkv6.py
new file mode 100644
index 0000000000000000000000000000000000000000..f00e17a7f56d68e4881353487b4dddf7c386e146
--- /dev/null
+++ b/fla/layers/rwkv6.py
@@ -0,0 +1,291 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+# "Eagle and Finch: RWKV with Matrix-Valued States and Dynamic Recurrence" [https://arxiv.org/abs/2404.05892]
+
+from __future__ import annotations
+
+from typing import TYPE_CHECKING, Optional, Tuple
+
+import torch
+import torch.nn as nn
+from einops import rearrange
+
+from fla.modules import GroupNorm
+from fla.modules.activations import ACT2FN
+from fla.ops.rwkv6 import chunk_rwkv6, fused_recurrent_rwkv6
+
+if TYPE_CHECKING:
+    from fla.models.utils import Cache
+
+
+class RWKV6Attention(nn.Module):
+
+    def __init__(
+        self,
+        mode: str = 'chunk',
+        hidden_size: int = 1024,
+        expand_k: float = 0.5,
+        expand_v: float = 1.0,
+        num_heads: int = 4,
+        gate_fn: str = 'swish',
+        proj_low_rank_dim: int = 32,
+        gate_low_rank_dim: int = 64,
+        fuse_norm: bool = True,
+        elementwise_affine: Optional[bool] = True,
+        norm_eps: float = 1e-5,
+        layer_idx: int = None,
+        **kwargs
+    ) -> RWKV6Attention:
+        super().__init__()
+
+        self.mode = mode
+        self.hidden_size = hidden_size
+        self.expand_k = expand_k
+        self.expand_v = expand_v
+        self.num_heads = num_heads
+        self.proj_low_rank_dim = proj_low_rank_dim
+        self.gate_low_rank_dim = gate_low_rank_dim
+
+        self.key_dim = int(hidden_size * expand_k)
+        self.value_dim = int(hidden_size * expand_v)
+        self.layer_idx = layer_idx
+
+        assert mode in ['chunk', 'fused_recurrent'], f"Not supported mode `{mode}`."
+        assert self.key_dim % num_heads == 0, f"key dim must be divisible by num_heads of {num_heads}"
+        assert self.value_dim % num_heads == 0, f"value dim must be divisible by num_heads of {num_heads}"
+
+        self.head_qk_dim = self.key_dim // num_heads
+        self.head_v_dim = self.value_dim // num_heads
+
+        self.time_shift = nn.ZeroPad2d((0, 0, 1, -1))
+        self.x_proj = nn.Sequential(
+            LerpLinear(hidden_size, proj_low_rank_dim * 5),
+            nn.Tanh(),
+            nn.Linear(proj_low_rank_dim * 5, hidden_size, bias=False)
+        )
+        self.x_bias = nn.Parameter(torch.zeros(5, hidden_size))
+
+        self.r_proj = DDLerpLinear(hidden_size, self.key_dim)
+        self.w_proj = DDLerpLinear(hidden_size, self.key_dim, low_rank_dim=gate_low_rank_dim)
+        self.k_proj = DDLerpLinear(hidden_size, self.key_dim)
+        self.v_proj = DDLerpLinear(hidden_size, self.value_dim)
+        self.g_proj = DDLerpLinear(hidden_size, self.value_dim)
+        self.bonus = nn.Parameter(torch.zeros(num_heads, self.head_qk_dim))
+
+        # TODO: fuse GroupNorm and output gate
+        self.g_norm = GroupNorm(self.num_heads, self.value_dim, elementwise_affine=elementwise_affine, bias=True, eps=norm_eps)
+        self.o_proj = nn.Linear(self.value_dim, hidden_size, bias=False)
+        self.gate_fn = ACT2FN[gate_fn]
+
+        self.apply(self._initialize_weights)
+
+    def _initialize_weights(self, module: nn.Module):
+        if getattr(module, "_is_hf_initialized", False):
+            return
+        if isinstance(module, nn.Linear):
+            nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5)
+            if module.bias is not None:
+                nn.init.zeros_(module.bias)
+        if isinstance(module, nn.Parameter):
+            nn.init.xavier_uniform_(module, gain=2 ** -2.5)
+        module._is_hf_initialized = True
+
+    def forward(
+        self,
+        hidden_states: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+        past_key_values: Optional[Cache] = None,
+        use_cache: Optional[bool] = False,
+        output_attentions: Optional[bool] = False,
+        **kwargs
+    ) -> Tuple[torch.Tensor, 
Optional[torch.Tensor], Optional[Cache]]: + if attention_mask is not None: + assert len(attention_mask.shape) == 2, ( + "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] " + "for padding purposes (0 indicating padding). " + "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed." + ) + + batch_size, seq_len, hidden_size = hidden_states.shape + # launching the triton kernel for just one token will actually be slower + mode = 'fused_recurrent' if hidden_states.shape[1] == 1 else self.mode + + last_state = None + if past_key_values is not None and len(past_key_values) > self.layer_idx: + last_state = past_key_values[self.layer_idx] + + if attention_mask is not None: + hidden_states = hidden_states.mul_(attention_mask[:, -hidden_states.shape[-2]:, None]) + if hidden_states.shape[1] == 1 and last_state is not None: + shifted = last_state['conv_state'].unsqueeze(1) + else: + shifted = self.time_shift(hidden_states) + if last_state is not None: + shifted[:, 0] = last_state['conv_state'][0] + + delta = shifted - hidden_states + x = self.x_proj[0](hidden_states, delta).view(batch_size, seq_len, -1, self.proj_low_rank_dim) + x = torch.einsum('b t n r, h n r-> b t n h', self.x_proj[1](x), self.x_proj[2].weight.view(hidden_size, 5, -1)) + + r, w, k, v, g = x.add_(self.x_bias).unbind(-2) + r = self.r_proj(hidden_states, r, delta) + w = self.w_proj(hidden_states, w, delta) + k = self.k_proj(hidden_states, k, delta) + v = self.v_proj(hidden_states, v, delta) + g = self.g_proj(hidden_states, g, delta) + + # dealing with left-padding + if attention_mask is not None: + v = v.mul_(attention_mask[:, -v.shape[-2]:, None]) + r, w, k, v = map(lambda x: rearrange(x, 'b t (h d) -> b t h d', h=self.num_heads), (r, w, k, v)) + w = -torch.exp(w) + u = self.bonus + + recurrent_state = last_state['recurrent_state'] if last_state is not None else None + if mode == 'fused_recurrent': + o, recurrent_state = fused_recurrent_rwkv6( + r=r, + k=k, + v=v, + w=w, + u=u, + scale=1., + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + elif mode == 'chunk': + o, recurrent_state = chunk_rwkv6( + q=r, + k=k, + v=v, + g=w, + u=u, + scale=1., + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + else: + raise NotImplementedError(f"Not supported mode `{mode}`.") + + if past_key_values is not None: + past_key_values.update( + recurrent_state=recurrent_state, + conv_state=hidden_states[:, -1], + layer_idx=self.layer_idx, + offset=r.shape[2] + ) + + o = self.g_norm(rearrange(o, '... h d -> ... 
(h d)')) * self.gate_fn(g) + o = self.o_proj(o) + + return o, None, past_key_values + + +class LoRA(nn.Module): + + def __init__( + self, + input_dim: int, + output_dim: int, + low_rank_dim: int, + bias: Optional[bool] = True + ): + super().__init__() + + self.input_dim = input_dim + self.output_dim = output_dim + self.low_rank_dim = low_rank_dim + self.bias = bias + + self.lora = nn.Sequential( + nn.Linear(input_dim, low_rank_dim, bias=False), + nn.Tanh(), + nn.Linear(low_rank_dim, output_dim, bias=bias) + ) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}(" + s += f"input_dim={self.input_dim}, low_rank_dim={self.low_rank_dim}, output_dim={self.output_dim}" + if not self.bias: + s += f", bias={self.bias}" + s += ")" + return s + + def forward(self, x: torch.Tensor) -> torch.Tensor: + return self.lora(x) + + +class LerpLinear(nn.Module): + + def __init__( + self, + input_dim: int, + output_dim: int, + low_rank_dim: Optional[int] = None + ): + super().__init__() + + self.input_dim = input_dim + self.output_dim = output_dim + self.low_rank_dim = low_rank_dim + + self.time_shift = nn.ZeroPad2d((0, 0, 1, -1)) + if low_rank_dim is None: + self.linear = nn.Linear(input_dim, output_dim, bias=False) + else: + self.linear = LoRA(input_dim, output_dim, low_rank_dim) + self.mu = nn.Parameter(torch.zeros(input_dim)) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.input_dim}, {self.output_dim}" + if self.low_rank_dim is not None: + s += f", low_rank_dim={self.low_rank_dim}" + s += ")" + return s + + def forward(self, x: torch.Tensor, delta: Optional[torch.Tensor] = None) -> torch.Tensor: + if delta is None: + shifted = self.time_shift(x) + if len(shifted.shape) == 2: + shifted = shifted.unsqueeze(1) + delta = shifted - x + return self.linear(x + delta * self.mu) + + +class DDLerpLinear(nn.Module): + + def __init__( + self, + input_dim: int, + output_dim: int, + low_rank_dim: Optional[int] = None + ): + super().__init__() + + self.input_dim = input_dim + self.output_dim = output_dim + self.low_rank_dim = low_rank_dim + + self.time_shift = nn.ZeroPad2d((0, 0, 1, -1)) + if low_rank_dim is None: + self.linear = nn.Linear(input_dim, output_dim, bias=False) + else: + self.linear = LoRA(input_dim, output_dim, low_rank_dim) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.input_dim}, {self.output_dim}" + if self.low_rank_dim is not None: + s += f", low_rank_dim={self.low_rank_dim}" + s += ")" + return s + + def forward(self, x: torch.Tensor, mu: torch.Tensor, delta: Optional[torch.Tensor] = None) -> torch.Tensor: + if delta is None: + shifted = self.time_shift(x) + if len(shifted.shape) == 2: + shifted = shifted.unsqueeze(1) + delta = shifted - x + return self.linear(x + delta * mu) diff --git a/fla/layers/scan.py b/fla/layers/scan.py new file mode 100644 index 0000000000000000000000000000000000000000..a7d167ac64493fc11cd1e70d5c1137a94b49bf32 --- /dev/null +++ b/fla/layers/scan.py @@ -0,0 +1,237 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from __future__ import annotations + +import warnings +from typing import TYPE_CHECKING, Optional, Tuple + +import torch +import torch.nn as nn +import torch.nn.functional as F +from einops import rearrange + +from fla.modules import RMSNorm +from fla.modules.activations import swish, sigmoid +from fla.modules.layernorm import rms_norm_linear +from fla.ops.scan import parallel_scan, naive_recurrent_scan + +if TYPE_CHECKING: + from fla.models.utils import Cache + +def 
build_alibi_tensor_scan(head_num, seq_len, window_len, state_size):
+    slopes = torch.tensor([2 ** (-8.0 * i / head_num) for i in range(head_num)])
+    alibi = torch.zeros((head_num, seq_len, window_len))
+    for i in range(seq_len):
+        for j in range(window_len):
+            if i < window_len:
+                alibi[:, i, j] = slopes * (j - window_len + 1) if i > (window_len - j - 2) else 0
+            else:
+                alibi[:, i, j] = alibi[:, window_len-1, j]
+    # now concat a zeros tensor of size (head_num, seq_len, state_size) to the left of the square tensor above
+    alibi = torch.cat((torch.zeros(head_num, seq_len, state_size), alibi), dim=2)
+    return alibi  # shape: (head_num, seq_len, state_size + window_size) or (H, T, S + W)
+
+def scores_mask(T, W, S):
+    # create a lower-right triangular mask of shape (W, W)
+    mask = torch.tril(torch.ones(W, W)).flip(1)
+    # concat ones of shape (T-W, W) along dim 0
+    mask = torch.cat((mask, torch.ones(T-W, W)), dim=0)
+    # concat ones of shape (T, S) along dim 1
+    mask = torch.cat((torch.ones(T, S), mask), dim=1)
+    return mask  # shape: (T, S + W)
+
+class SemiCompressedAttention(nn.Module):
+
+    def __init__(
+        self,
+        mode: str = 'parallel',
+        hidden_size: int = 1024,
+        window_size: int = 512,
+        state_size: int = 64,
+        gate_act: str = 'softmax',
+        max_position_embeddings: Optional[int] = 2048,
+        expand_k: float = 1.,
+        expand_v: float = 1.,
+        num_heads: int = 4,
+        num_kv_heads: Optional[int] = None,
+        elementwise_affine: Optional[bool] = True,
+        norm_first: bool = True,
+        norm_eps: float = 1e-5,
+        gate_logit_normalizer: int = 8,
+        use_output_gate: bool = False,
+        use_norm: bool = True,
+        layer_idx: Optional[int] = None,
+        scale: Optional[float] = 1.,
+        **kwargs
+    ) -> SemiCompressedAttention:
+        super().__init__()
+
+        self.mode = mode
+        self.hidden_size = hidden_size
+        self.window_size = window_size
+        self.state_size = state_size
+        self.gate_act = gate_act
+        self.max_position_embeddings = max_position_embeddings
+        self.expand_k = expand_k
+        self.expand_v = expand_v
+        self.num_heads = num_heads
+        self.num_kv_heads = num_heads if num_kv_heads is None else num_kv_heads
+        self.num_kv_groups = self.num_heads // self.num_kv_heads
+        self.key_dim = int(hidden_size * expand_k)
+        self.value_dim = int(hidden_size * expand_v)
+        self.key_dim_per_group = self.key_dim // self.num_kv_groups
+        self.value_dim_per_group = self.value_dim // self.num_kv_groups
+        self.head_k_dim = self.key_dim // self.num_heads
+        self.head_v_dim = self.value_dim // self.num_heads
+
+        self.gate_logit_normalizer = gate_logit_normalizer
+
+        self.use_output_gate = use_output_gate
+        self.use_norm = use_norm
+        self.scale = scale
+
+        self.norm_first = norm_first
+
+        self.layer_idx = layer_idx
+
+        if layer_idx is None:
+            warnings.warn(
+                f"Instantiating {self.__class__.__name__} without passing `layer_idx` is not recommended and will lead "
+                "to errors during the forward call, if caching is used. Please make sure to provide a `layer_idx` "
+                "when creating this class."
+        ) + + if norm_first: + self.norm = RMSNorm(self.hidden_size, eps=norm_eps) + + self.q_proj = nn.Linear(self.hidden_size, self.key_dim, bias=False) + self.k_proj = nn.Linear(self.hidden_size, self.key_dim_per_group, bias=False) + self.v_proj = nn.Linear(self.hidden_size, self.value_dim_per_group, bias=False) + self.s_proj = nn.Linear(self.hidden_size, self.key_dim_per_group, bias=False) + self.g_proj = nn.Linear(self.hidden_size, self.num_heads * self.state_size, bias=False) + + # output-side norm; kept distinct from the optional pre-norm above so that `norm_first` does not clobber it + self.o_norm = RMSNorm(self.hidden_size, elementwise_affine, eps=norm_eps) + self.o_proj = nn.Linear(self.value_dim, self.hidden_size, bias=False) + + self.apply(self._initialize_weights) + + self.register_buffer('alibi', build_alibi_tensor_scan(self.num_heads, self.max_position_embeddings, self.window_size, self.state_size)) + self.register_buffer('mask', scores_mask(self.max_position_embeddings, self.window_size, self.state_size)) + + def _initialize_weights(self, module: nn.Module): + if getattr(module, "_is_hf_initialized", False): + return + if isinstance(module, nn.Linear): + nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5) + if module.bias is not None: + nn.init.zeros_(module.bias) + module._is_hf_initialized = True + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Cache] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs + ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Cache]]: + if attention_mask is not None: + assert len(attention_mask.shape) == 2, ( + "Expected attention_mask as a 0-1 matrix with shape [batch_size, seq_len] " + "for padding purposes (0 indicating padding). " + "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed."
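+ # Causality and windowing are enforced inside the SCAN kernels via `scores_mask` above, + # so only a padding mask is meaningful here; an arbitrary [seq_len, seq_len] mask has no + # counterpart in the windowed/recurrent formulation.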
+        ) + + # launching the triton kernel for just one token will actually be slower + mode = 'naive' if past_key_values is not None else self.mode + + if self.norm_first: + hidden_states = self.norm(hidden_states) + + last_state = None + if past_key_values is not None and len(past_key_values) > self.layer_idx: + last_state = past_key_values[self.layer_idx] + + q = self.q_proj(hidden_states) + k = self.k_proj(hidden_states) + v = self.v_proj(hidden_states) + s = self.s_proj(hidden_states) + g = self.g_proj(hidden_states) + + if self.gate_act == 'softmax': + g = F.softmax(g, dim=-1) + elif self.gate_act == 'sigmoid': + g = sigmoid(g) + else: + raise NotImplementedError(f"Gate activation `{self.gate_act}` is not supported.") + + # KV cache is updated before going into SCAN + if past_key_values is not None: + k, v = past_key_values.update( + attn_state=(k, v), + layer_idx=self.layer_idx, + offset=q.shape[2], + # We actually don't want to crop to window for the initial prompt, only for subsequent autoregressive tokens + cache_kwargs=dict(window_size=self.window_size) if q.shape[-2] == 1 else dict() + )['attn_state'] + + recurrent_state = last_state['recurrent_state'] if last_state is not None else None + if mode == 'parallel': + # Split heads (but merge with batch dimension because kernels receive (B T C) shape) + q = rearrange(q, 'b t (h c) -> (b h) t c', h=self.num_heads) + k = rearrange(k, 'b t (h c) -> (b h) t c', h=self.num_kv_heads) + v = rearrange(v, 'b t (h c) -> (b h) t c', h=self.num_kv_heads) + s = rearrange(s, 'b t (h c) -> (b h) t c', h=self.num_kv_heads) + g = rearrange(g, 'b t (h s) -> (b h) t s', h=self.num_kv_heads) + o, recurrent_state = parallel_scan( + q=q, + k=k, + v=v, + s=s, + g=g, + window_size=self.window_size, + num_heads=self.num_heads, + alibi=self.alibi.to(q.device), + mask=self.mask.to(q.device), + initial_state=recurrent_state, + output_final_state=use_cache, + scale=self.scale, + head_first=False + ) + o = rearrange(o, '(b h) t c -> b t (h c)', h=self.num_heads) + elif mode == 'naive': + # TODO: Implement naive recurrent SCAN for inference + q = rearrange(q, 'b t (h c) -> b h t c', h=self.num_heads) + k = rearrange(k, 'b t (h c) -> b h t c', h=self.num_kv_heads) + v = rearrange(v, 'b t (h c) -> b h t c', h=self.num_kv_heads) + s = rearrange(s, 'b t (h c) -> b h t c', h=self.num_kv_heads) + g = rearrange(g, 'b t (h s) -> b h t s', h=self.num_kv_heads) + o, recurrent_state = naive_recurrent_scan( + q=q, + k=k, + v=v, + s=s, + g=g, + window_size=self.window_size, + alibi=self.alibi.to(q.device), + mask=self.mask.to(q.device), + initial_state=recurrent_state, + output_final_state=use_cache, + scale=self.scale, + head_first=False + ) + o = rearrange(o, 'b h t c -> b t (h c)', h=self.num_heads) + else: + raise NotImplementedError(f"Not supported mode `{mode}`.") + + # Update the recurrent state after SCAN + if past_key_values is not None: + past_key_values.update( + recurrent_state=recurrent_state, + layer_idx=self.layer_idx + ) + + o = rms_norm_linear(swish(o), self.o_norm.weight, self.o_norm.bias, self.o_proj.weight, self.o_proj.bias) + return o, None, past_key_values diff --git a/fla/layers/simple_gla.py b/fla/layers/simple_gla.py new file mode 100644 index 0000000000000000000000000000000000000000..d25ad738960121f8a51a757d357a219d998c54d7 --- /dev/null +++ b/fla/layers/simple_gla.py @@ -0,0 +1,252 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from __future__ import annotations + +from typing import TYPE_CHECKING, Optional, Tuple + +import 
torch +import torch.nn as nn +import torch.nn.functional as F +from einops import rearrange, repeat + +from fla.modules import FusedRMSNormSwishGate, RMSNorm, ShortConvolution +from fla.modules.activations import ACT2FN +from fla.ops.simple_gla import chunk_simple_gla, fused_recurrent_simple_gla + +if TYPE_CHECKING: + from fla.models.utils import Cache + + +class SimpleGatedLinearAttention(nn.Module): + r""" + The layer implementation for [Gated Linear Attention Transformers with Hardware-Efficient Training](https://arxiv.org/abs/2312.06635). # noqa + This layer calls the simplified GLA kernel in which the gating is head-wise instead of elementwise. + + Args: + mode (str, Optional): + Which GLA kernel to use. + Currently available: `chunk` and `fused_recurrent`. + Default: `chunk`. + hidden_size (int, Optional): + The hidden size of the input. Default: 1024. + expand_k (float, Optional): + The expansion ratio for the key dim. Default: 1.0. + expand_v (float, Optional): + The expansion ratio for the value dim. Default: 1.0. + num_heads (int, Optional): + The number of heads. Default: 4. + num_kv_heads (int, Optional): + The number of key/value heads, used for MQA. Default: None. + feature_map (str, Optional): + Feature map function applied to queries/keys. Default: None. + use_short_conv (bool, Optional): + Whether to use short convolutions. Default: `True`. + conv_size (int, Optional): + The kernel size of the short convolution, only used when `use_short_conv` is `True`. Default: 4. + conv_bias (bool, Optional): + Whether to use bias in the short convolution, only used when `use_short_conv` is `True`. Default: `False`. + gate_fn (str, Optional): + The activation function for the output gate. Default: `swish`. + elementwise_affine (bool, Optional): + If `True`, applies elementwise affine to LayerNorm with learnable parameters. Default: `True`. + norm_eps (float, Optional): + The epsilon value for the layernorm/rmsnorm layer. Default: 1e-5. + gate_logit_normalizer (int, Optional): + The normalizer for the gate logits, applied after `logsigmoid`. Default: 16. + fuse_norm (bool, Optional): + Whether to fuse the norm and the output gate for better memory footprint. Default: `True`. + layer_idx (int, Optional): + The index of the layer. Default: None. 
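+ + Example (a minimal usage sketch, not taken from the library's test suite; it assumes a CUDA device since the underlying kernels are written in Triton): + >>> import torch + >>> from fla.layers.simple_gla import SimpleGatedLinearAttention + >>> layer = SimpleGatedLinearAttention(hidden_size=1024, num_heads=4, layer_idx=0).cuda() + >>> x = torch.randn(2, 128, 1024, device='cuda')  # (batch_size, seq_len, hidden_size) + >>> o, attns, cache = layer(x)  # attns is always None for this layer + >>> o.shape + torch.Size([2, 128, 1024])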
+ """ + + def __init__( + self, + mode: str = 'chunk', + hidden_size: int = 1024, + expand_k: float = 1., + expand_v: float = 1., + num_heads: int = 4, + num_kv_heads: Optional[int] = None, + feature_map: Optional[str] = None, + use_short_conv: bool = True, + conv_size: int = 4, + conv_bias: bool = False, + gate_fn: str = 'swish', + elementwise_affine: Optional[bool] = True, + norm_eps: float = 1e-5, + gate_logit_normalizer: int = 16, + fuse_norm: bool = True, + layer_idx: int = None, + ) -> SimpleGatedLinearAttention: + super().__init__() + + self.mode = mode + self.hidden_size = hidden_size + self.expand_k = expand_k + self.expand_v = expand_v + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads if num_kv_heads is not None else num_heads + self.num_kv_groups = self.num_heads // self.num_kv_heads + self.feature_map_fn = ACT2FN[feature_map] if feature_map is not None else None + + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.conv_bias = conv_bias + + self.key_dim = int(hidden_size * expand_k) + self.value_dim = int(hidden_size * expand_v) + self.key_dim_per_group = self.key_dim // self.num_kv_groups + self.value_dim_per_group = self.value_dim // self.num_kv_groups + self.layer_idx = layer_idx + + assert mode in ['chunk', "fused_recurrent"], f"Not suppoerted mode `{mode}`." + assert self.key_dim % num_heads == 0, f"key dim must be divisible by num_heads of {num_heads}" + assert self.value_dim % num_heads == 0, f"value dim must be divisible by num_heads of {num_heads}" + + self.head_qk_dim = self.key_dim // num_heads + self.head_v_dim = self.value_dim // num_heads + + self.q_proj = nn.Linear(hidden_size, self.key_dim, bias=False) + self.k_proj = nn.Linear(hidden_size, self.key_dim_per_group, bias=False) + self.v_proj = nn.Linear(hidden_size, self.value_dim_per_group, bias=False) + self.g_proj = nn.Linear(hidden_size, self.value_dim, bias=False) + + if use_short_conv: + self.conv_size = conv_size + self.q_conv1d = ShortConvolution(self.key_dim, conv_size, activation='silu') + self.k_conv1d = ShortConvolution(self.key_dim_per_group, conv_size, activation='silu') + self.v_conv1d = ShortConvolution(self.value_dim_per_group, conv_size, activation='silu') + + self.gk_proj = nn.Linear(hidden_size, self.num_heads) + + if gate_fn == 'swish' and fuse_norm: + self.g_norm_swish_gate = FusedRMSNormSwishGate(self.head_v_dim, elementwise_affine, norm_eps) + self.fuse_norm_and_gate = True + else: + self.fuse_norm_and_gate = False + self.g_norm = RMSNorm(hidden_size=self.head_v_dim, elementwise_affine=elementwise_affine, eps=norm_eps) + self.gate_fn = ACT2FN[gate_fn] + self.o_proj = nn.Linear(self.value_dim, hidden_size, bias=False) + + self.gate_logit_normalizer = gate_logit_normalizer + + self.apply(self._initialize_weights) + + def _initialize_weights(self, module: nn.Module): + if getattr(module, "_is_hf_initialized", False): + return + if isinstance(module, nn.Linear): + nn.init.xavier_uniform_(module.weight, gain=2 ** -2.5) + if module.bias is not None: + nn.init.zeros_(module.bias) + module._is_hf_initialized = True + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Cache] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs + ) -> Tuple[torch.Tensor, Optional[torch.Tensor], Optional[Cache]]: + if attention_mask is not None: + assert len(attention_mask.shape) == 2, ( + "Expected attention_mask as a 0-1 matrix with shape [batch_size, 
seq_len] " + "for padding purposes (0 indicating padding). " + "Arbitrary attention masks of shape [batch_size, seq_len, seq_len] are not allowed." + ) + + # launching the triton kernel for just one token will actually be slower + mode = 'fused_recurrent' if hidden_states.shape[1] == 1 else self.mode + + last_state = None + if past_key_values is not None and len(past_key_values) > self.layer_idx: + last_state = past_key_values[self.layer_idx] + + if self.use_short_conv: + conv_state_q, conv_state_k, conv_state_v = None, None, None + if last_state is not None: + conv_state_q, conv_state_k, conv_state_v = last_state['conv_state'] + conv_mask = attention_mask[:, -hidden_states.shape[1]:] if attention_mask is not None else None + q, conv_state_q = self.q_conv1d(x=self.q_proj(hidden_states), + mask=conv_mask, + cache=conv_state_q, + output_final_state=use_cache) + k, conv_state_k = self.k_conv1d(x=self.k_proj(hidden_states), + mask=conv_mask, + cache=conv_state_k, + output_final_state=use_cache) + v, conv_state_v = self.v_conv1d(x=self.v_proj(hidden_states), + mask=conv_mask, + cache=conv_state_v, + output_final_state=use_cache) + else: + q = self.q_proj(hidden_states) + k = self.k_proj(hidden_states) + v = self.v_proj(hidden_states) + gk = self.gk_proj(hidden_states) + + if self.feature_map_fn is not None: + q, k = map(self.feature_map_fn, (q, k)) + # dealing with left-padding + if attention_mask is not None: + v = v.mul_(attention_mask[:, -v.shape[-2]:, None]) + q = rearrange(q, '... (h d) -> ... h d', h=self.num_heads) + if self.num_kv_groups > 1: + k, v = (repeat(x, '... (h d) -> ... (h g) d', h=self.num_kv_heads, g=self.num_kv_groups) for x in (k, v)) + else: + k, v = (rearrange(x, '... (h d) -> ... h d', h=self.num_kv_heads) for x in (k, v)) + gk = F.logsigmoid(gk) / self.gate_logit_normalizer + + recurrent_state = last_state['recurrent_state'] if last_state is not None else None + if mode == 'chunk': + o, recurrent_state = chunk_simple_gla( + q=q, + k=k, + v=v, + gk=gk, + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + elif mode == 'fused_recurrent': + o, recurrent_state = fused_recurrent_simple_gla( + q=q, + k=k, + v=v, + gk=gk, + initial_state=recurrent_state, + output_final_state=use_cache, + head_first=False + ) + else: + raise NotImplementedError(f"Not supported mode `{mode}`.") + + if past_key_values is not None: + past_key_values.update( + recurrent_state=recurrent_state, + conv_state=(conv_state_q, conv_state_k, conv_state_v) if self.use_short_conv else None, + layer_idx=self.layer_idx, + offset=q.shape[2] + ) + + g = self.g_proj(hidden_states) + if self.fuse_norm_and_gate: + g = rearrange(g, 'b t (h d) -> b t h d', h=self.num_heads) + o = self.g_norm_swish_gate(o, g) + o = rearrange(o, 'b t h d -> b t (h d)') + else: + o = rearrange(self.g_norm(o), 'b t h d -> b t (h d)') + o = o * self.gate_fn(g) + o = self.o_proj(o) + + return o, None, past_key_values + + def state_size(self, **kwargs) -> int: + state_size = self.key_dim * self.head_v_dim + for module in self.children(): + if isinstance(module, ShortConvolution): + state_size += module.state_size + return state_size diff --git a/fla/models/__init__.py b/fla/models/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..dfb1a000adc48c8bf62d827a8f78ad648b36421a --- /dev/null +++ b/fla/models/__init__.py @@ -0,0 +1,39 @@ +# -*- coding: utf-8 -*- + +from fla.models.abc import ABCConfig, ABCForCausalLM, ABCModel +from fla.models.bitnet import BitNetConfig, 
BitNetForCausalLM, BitNetModel +from fla.models.delta_net import (DeltaNetConfig, DeltaNetForCausalLM, + DeltaNetModel) +from fla.models.gla import GLAConfig, GLAForCausalLM, GLAModel +from fla.models.gsa import GSAConfig, GSAForCausalLM, GSAModel +from fla.models.hgrn import HGRNConfig, HGRNForCausalLM, HGRNModel +from fla.models.hgrn2 import HGRN2Config, HGRN2ForCausalLM, HGRN2Model +from fla.models.linear_attn import (LinearAttentionConfig, + LinearAttentionForCausalLM, + LinearAttentionModel) +from fla.models.mamba import MambaConfig, MambaForCausalLM, MambaModel +from fla.models.mamba2 import Mamba2Config, Mamba2ForCausalLM, Mamba2Model +from fla.models.retnet import RetNetConfig, RetNetForCausalLM, RetNetModel +from fla.models.rwkv6 import RWKV6Config, RWKV6ForCausalLM, RWKV6Model +from fla.models.samba import SambaConfig, SambaForCausalLM, SambaModel +from fla.models.scan import SCANConfig, SCANForCausalLM, SCANModel +from fla.models.transformer import (TransformerConfig, TransformerForCausalLM, + TransformerModel) + +__all__ = [ + 'ABCConfig', 'ABCForCausalLM', 'ABCModel', + 'BitNetConfig', 'BitNetForCausalLM', 'BitNetModel', + 'DeltaNetConfig', 'DeltaNetForCausalLM', 'DeltaNetModel', + 'GLAConfig', 'GLAForCausalLM', 'GLAModel', + 'GSAConfig', 'GSAForCausalLM', 'GSAModel', + 'HGRNConfig', 'HGRNForCausalLM', 'HGRNModel', + 'HGRN2Config', 'HGRN2ForCausalLM', 'HGRN2Model', + 'LinearAttentionConfig', 'LinearAttentionForCausalLM', 'LinearAttentionModel', + 'MambaConfig', 'MambaForCausalLM', 'MambaModel', + 'Mamba2Config', 'Mamba2ForCausalLM', 'Mamba2Model', + 'RetNetConfig', 'RetNetForCausalLM', 'RetNetModel', + 'RWKV6Config', 'RWKV6ForCausalLM', 'RWKV6Model', + 'SambaConfig', 'SambaForCausalLM', 'SambaModel', + 'SCANConfig', 'SCANForCausalLM', 'SCANModel', + 'TransformerConfig', 'TransformerForCausalLM', 'TransformerModel' +] diff --git a/fla/models/abc/__init__.py b/fla/models/abc/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..f7021f22ff0f9781432bd3969473520851f4b553 --- /dev/null +++ b/fla/models/abc/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.abc.configuration_abc import ABCConfig +from fla.models.abc.modeling_abc import ABCForCausalLM, ABCModel + +AutoConfig.register(ABCConfig.model_type, ABCConfig) +AutoModel.register(ABCConfig, ABCModel) +AutoModelForCausalLM.register(ABCConfig, ABCForCausalLM) + + +__all__ = ['ABCConfig', 'ABCForCausalLM', 'ABCModel'] diff --git a/fla/models/abc/configuration_abc.py b/fla/models/abc/configuration_abc.py new file mode 100644 index 0000000000000000000000000000000000000000..bdb7f4e2276db932ad5cdd80adbcb14a49e41927 --- /dev/null +++ b/fla/models/abc/configuration_abc.py @@ -0,0 +1,84 @@ +# -*- coding: utf-8 -*- + +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class ABCConfig(PretrainedConfig): + + model_type = 'abc' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + hidden_size: int = 2048, + gate_low_rank_dim: int = 16, + clamp_min: float = -32, + clamp_max: float = 32, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + num_hidden_layers: int = 24, + num_heads: int = 4, + num_slots: Optional[int] = 64, + use_short_conv: bool = False, + conv_size: int = 4, + expand_k: float = 0.5, + expand_v: float = 1, + hidden_act: str = "swish", + max_position_embeddings: int = 2048, + elementwise_affine: 
Optional[bool] = True, + norm_eps: float = 1e-6, + attn: Optional[Dict] = None, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + tie_word_embeddings: bool = False, + initializer_range: float = 0.02, + fuse_norm: bool = True, + fuse_cross_entropy: bool = True, + vocab_size: int = 32000, + **kwargs + ): + self.hidden_size = hidden_size + self.gate_low_rank_dim = gate_low_rank_dim + self.clamp_min = clamp_min + self.clamp_max = clamp_max + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.num_hidden_layers = num_hidden_layers + self.num_heads = num_heads + self.num_slots = num_slots + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.expand_k = expand_k + self.expand_v = expand_v + self.hidden_act = hidden_act + self.max_position_embeddings = max_position_embeddings + self.elementwise_affine = elementwise_affine + self.norm_eps = norm_eps + self.attn = attn + self.use_cache = use_cache + self.initializer_range = initializer_range + self.fuse_norm = fuse_norm + self.fuse_cross_entropy = fuse_cross_entropy + self.vocab_size = vocab_size + + if attn is not None: + if not isinstance(attn, Dict): + raise ValueError("attn must be a dictionary") + if 'layers' not in attn: + raise ValueError("Layer indices must be provided to initialize hybrid attention layers") + if 'num_heads' not in attn: + raise ValueError("Number of heads must be provided to initialize hybrid attention layers") + attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads']) + attn['window_size'] = attn.get('window_size', None) + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/abc/modeling_abc.py b/fla/models/abc/modeling_abc.py new file mode 100644 index 0000000000000000000000000000000000000000..3db6a5b05275c87e8c2ec089be58b51d91c61a3d --- /dev/null +++ b/fla/models/abc/modeling_abc.py @@ -0,0 +1,403 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.abc import ABCAttention +from fla.layers.attn import Attention +from fla.models.abc.configuration_abc import ABCConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear + +logger = logging.get_logger(__name__) + + +class ABCMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish' + ) -> ABCMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + 
self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + y = self.gate_proj(x) + gate, y = y.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class ABCBlock(nn.Module): + def __init__(self, config: ABCConfig, layer_idx: int): + super().__init__() + self.hidden_size = config.hidden_size + + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + if config.attn is not None and layer_idx in config.attn['layers']: + self.attn = Attention( + hidden_size=config.hidden_size, + num_heads=config.attn['num_heads'], + num_kv_heads=config.attn['num_kv_heads'], + window_size=config.attn['window_size'], + max_position_embeddings=config.max_position_embeddings, + layer_idx=layer_idx + ) + else: + self.attn = ABCAttention( + hidden_size=config.hidden_size, + expand_k=config.expand_k, + expand_v=config.expand_v, + num_heads=config.num_heads, + num_slots=config.num_slots, + use_short_conv=config.use_short_conv, + conv_size=config.conv_size, + gate_fn=config.hidden_act, + elementwise_affine=config.elementwise_affine, + norm_eps=config.norm_eps, + clamp_min=config.clamp_min, + clamp_max=config.clamp_max, + fuse_norm=config.fuse_norm, + layer_idx=layer_idx + ) + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = ABCMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act + ) + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs, + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + + residual = hidden_states + + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states, attentions, past_key_values) + + return outputs + + +class ABCPreTrainedModel(PreTrainedModel): + + config_class = ABCConfig + supports_gradient_checkpointing = True + _no_split_modules = ['ABCBlock'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = True, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if 
rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. + # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class ABCModel(ABCPreTrainedModel): + + def __init__(self, config: ABCConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList([ABCBlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn("`ABCModel` does not `output_attentions` now, setting it to `False`.") + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. 
Setting `use_cache=False`...") + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + for layer in self.layers: + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if self.gradient_checkpointing and self.training: + hidden_states, attentions, past_key_values = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + use_cache, + output_attentions + ) + else: + hidden_states, attentions, past_key_values = layer( + hidden_states, + attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + + if output_attentions: + all_attns += (attentions,) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(i for i in [hidden_states, past_key_values, all_hidden_states, all_attns] if i is not None) + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=past_key_values, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class ABCForCausalLM(ABCPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.model = ABCModel(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def generate(self, *args, **kwargs): + try: + return super().generate(*args, **kwargs) + except AttributeError as exception: + if 'past_key_values' in str(exception): + raise AttributeError( + f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, " + f"which is not supported for {self.__class__.__name__}. " + f"Try another generation strategy instead. " + f"For the available generation strategies, check this doc: " + f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies" + ) + else: + raise exception + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + inputs_embeds: Optional[torch.FloatTensor] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. 
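+ # (The recurrent state already summarizes the entire prefix in a fixed-size state, + # so replaying earlier tokens would only recompute what the cache already holds.)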
+ if past_key_values is not None: + input_ids = input_ids[:, -1:] + + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + model_inputs = {'input_ids': input_ids} + model_inputs['past_key_values'] = past_key_values + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/bitnet/__init__.py b/fla/models/bitnet/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..bede22c64707be1ff17f402c0af6ed9da1ff1aee --- /dev/null +++ b/fla/models/bitnet/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.bitnet.configuration_bitnet import BitNetConfig +from fla.models.bitnet.modeling_bitnet import BitNetForCausalLM, BitNetModel + +AutoConfig.register(BitNetConfig.model_type, BitNetConfig) +AutoModel.register(BitNetConfig, BitNetModel) +AutoModelForCausalLM.register(BitNetConfig, BitNetForCausalLM) + + +__all__ = ['BitNetConfig', 'BitNetForCausalLM', 'BitNetModel'] diff --git a/fla/models/bitnet/configuration_bitnet.py 
b/fla/models/bitnet/configuration_bitnet.py new file mode 100644 index 0000000000000000000000000000000000000000..b6c50f8aae9f4671f08a332d36b4db8cfefbf071 --- /dev/null +++ b/fla/models/bitnet/configuration_bitnet.py @@ -0,0 +1,68 @@ +# -*- coding: utf-8 -*- + +from typing import Optional + +from transformers.configuration_utils import PretrainedConfig + + +class BitNetConfig(PretrainedConfig): + + model_type = 'bitnet' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + vocab_size: int = 32000, + hidden_size: int = 2048, + num_hidden_layers: int = 24, + num_heads: int = 32, + num_kv_heads: int = None, + window_size: Optional[int] = None, + rope_theta: Optional[float] = 10000., + max_position_embeddings: int = 2048, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + hidden_act: str = "swish", + initializer_range: float = 0.02, + elementwise_affine: Optional[bool] = True, + norm_first: bool = False, + norm_eps: float = 1e-6, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + tie_word_embeddings: bool = False, + attention_bias: bool = False, + fuse_norm: bool = True, + fuse_cross_entropy: bool = True, + **kwargs, + ): + self.vocab_size = vocab_size + self.hidden_size = hidden_size + self.num_hidden_layers = num_hidden_layers + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads + self.window_size = window_size + self.rope_theta = rope_theta + self.max_position_embeddings = max_position_embeddings + + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.hidden_act = hidden_act + + self.initializer_range = initializer_range + self.elementwise_affine = elementwise_affine + self.norm_first = norm_first + self.norm_eps = norm_eps + self.use_cache = use_cache + self.attention_bias = attention_bias + self.fuse_cross_entropy = fuse_cross_entropy + self.fuse_norm = fuse_norm + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/bitnet/modeling_bitnet.py b/fla/models/bitnet/modeling_bitnet.py new file mode 100644 index 0000000000000000000000000000000000000000..5e55e0033639450e16a728d64ced18683c980463 --- /dev/null +++ b/fla/models/bitnet/modeling_bitnet.py @@ -0,0 +1,428 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.bitattn import BitAttention +from fla.models.bitnet.configuration_bitnet import BitNetConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_bitlinear +from fla.modules.fused_bitlinear import BitLinear, rms_norm_linear_quant + +logger = logging.get_logger(__name__) + + +class BitNetMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish', + norm_first: bool = True, + norm_eps: float = 1e-5 + ) -> 
BitNetMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.norm_first = norm_first + + if norm_first: + self.norm = RMSNorm(hidden_size=hidden_size, eps=norm_eps) + + self.gate_proj = BitLinear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = BitLinear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + if self.norm_first: + x = rms_norm_linear_quant(x, self.norm.weight, self.norm.bias, self.gate_proj.weight, self.gate_proj.bias) + else: + x = self.gate_proj(x) + gate, y = x.chunk(2, -1) + return swiglu_bitlinear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class BitNetBlock(nn.Module): + + def __init__(self, config: BitNetConfig, layer_idx: int): + super().__init__() + + self.hidden_size = config.hidden_size + + if not config.norm_first: + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.attn = BitAttention( + hidden_size=config.hidden_size, + num_heads=config.num_heads, + num_kv_heads=config.num_kv_heads, + window_size=config.window_size, + rope_theta=config.rope_theta, + max_position_embeddings=config.max_position_embeddings, + norm_first=config.norm_first, + norm_eps=config.norm_eps, + layer_idx=layer_idx + ) + if not config.norm_first: + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = BitNetMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act, + norm_first=config.norm_first, + norm_eps=config.norm_eps + ) + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Tuple[torch.Tensor]] = None, + output_attentions: Optional[bool] = False, + use_cache: Optional[bool] = False, + **kwargs, + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + + residual = hidden_states + if hasattr(self, 'attn_norm'): + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + if hasattr(self, 'mlp_norm'): + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + else: + hidden_states = residual + hidden_states + residual = hidden_states + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states,) + + if output_attentions: + outputs += (attentions,) + + if use_cache: + outputs += (past_key_values,) + + return outputs + + +class BitNetPreTrainedModel(PreTrainedModel): + + config_class = BitNetConfig + supports_gradient_checkpointing = True + _no_split_modules = ['BitNetBlock'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = False, + num_residuals_per_layer: int = 
2, + ): + if isinstance(module, (BitLinear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. + # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class BitNetModel(BitNetPreTrainedModel): + + def __init__(self, config: BitNetConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList([BitNetBlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[List[torch.FloatTensor]] = None, + inputs_embeds: Optional[torch.FloatTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, CausalLMOutputWithPast]: + if output_attentions: + warnings.warn( + "`BitNetModel` does not support output attention weights now, so `output_attentions` is set to `False`." 
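+ # (Returning per-head attention maps would require materializing the full + # [seq_len, seq_len] score matrix, which the fused attention path avoids.)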
+ ) + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + elif input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + + # embed positions + hidden_states = inputs_embeds + + if self.gradient_checkpointing and self.training: + if use_cache: + logger.warning_once( + "`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`..." + ) + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + next_cache = None + + for layer in self.layers: + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if self.gradient_checkpointing and self.training: + layer_outputs = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + output_attentions, + use_cache + ) + else: + layer_outputs = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + output_attentions=output_attentions, + use_cache=use_cache + ) + + hidden_states = layer_outputs[0] + + if use_cache: + next_cache = layer_outputs[2 if output_attentions else 1] + + if output_attentions: + all_attns += (layer_outputs[1],) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(v for v in [hidden_states, next_cache, all_hidden_states, all_attns] if v is not None) + + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=next_cache, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class BitNetForCausalLM(BitNetPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.model = BitNetModel(config) + self.vocab_size = config.vocab_size + self.lm_head = BitLinear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + use_cache: 
bool = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. + if past_key_values is not None: + input_ids = input_ids[:, -1:] + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. + # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. + model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + inputs_embeds: Optional[torch.FloatTensor] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + past_key_values=past_key_values, + inputs_embeds=inputs_embeds, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/delta_net/__init__.py b/fla/models/delta_net/__init__.py new file mode 100644 index 
0000000000000000000000000000000000000000..258908922fef01c223727a92958355d4ae5f78d6 --- /dev/null +++ b/fla/models/delta_net/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.delta_net.configuration_delta_net import DeltaNetConfig +from fla.models.delta_net.modeling_delta_net import (DeltaNetForCausalLM, + DeltaNetModel) + +AutoConfig.register(DeltaNetConfig.model_type, DeltaNetConfig) +AutoModel.register(DeltaNetConfig, DeltaNetModel) +AutoModelForCausalLM.register(DeltaNetConfig, DeltaNetForCausalLM) + +__all__ = ['DeltaNetConfig', 'DeltaNetForCausalLM', 'DeltaNetModel'] diff --git a/fla/models/delta_net/configuration_delta_net.py b/fla/models/delta_net/configuration_delta_net.py new file mode 100644 index 0000000000000000000000000000000000000000..45ba7b498ad754b516bd6d2de248838d3d85552d --- /dev/null +++ b/fla/models/delta_net/configuration_delta_net.py @@ -0,0 +1,87 @@ +# -*- coding: utf-8 -*- + +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class DeltaNetConfig(PretrainedConfig): + + model_type = 'delta_net' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + attn_mode: str = "chunk", + hidden_size: int = 2048, + expand_k: int = 1, + expand_v: int = 1, + use_gate: bool = False, + use_short_conv: bool = True, + conv_size: int = 4, + use_beta: bool = True, + use_output_norm: bool = True, + num_heads: int = 16, + qk_norm: str = 'l2', + qk_activation: str = 'silu', + max_position_embeddings: int = 2048, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + hidden_act: str = "swish", + num_hidden_layers: int = 24, + norm_first: bool = False, + norm_eps: float = 1e-6, + attn: Optional[Dict] = None, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + tie_word_embeddings: bool = False, + initializer_range: float = 0.02, + fuse_cross_entropy: bool = True, + vocab_size: int = 32000, + **kwargs + ): + self.attn_mode = attn_mode + self.hidden_size = hidden_size + self.expand_k = expand_k + self.expand_v = expand_v + self.use_gate = use_gate + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.use_beta = use_beta + self.use_output_norm = use_output_norm + self.num_heads = num_heads + self.qk_norm = qk_norm + self.qk_activation = qk_activation + self.max_position_embeddings = max_position_embeddings + + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.hidden_act = hidden_act + self.num_hidden_layers = num_hidden_layers + self.norm_first = norm_first + self.norm_eps = norm_eps + self.attn = attn + self.use_cache = use_cache + self.initializer_range = initializer_range + self.fuse_cross_entropy = fuse_cross_entropy + self.vocab_size = vocab_size + + if attn is not None: + if not isinstance(attn, Dict): + raise ValueError("attn must be a dictionary") + if 'layers' not in attn: + raise ValueError("Layer indices must be provided to initialize hybrid attention layers") + if 'num_heads' not in attn: + raise ValueError("Number of heads must be provided to initialize hybrid attention layers") + attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads']) + attn['window_size'] = attn.get('window_size', None) + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git 
a/fla/models/delta_net/modeling_delta_net.py b/fla/models/delta_net/modeling_delta_net.py new file mode 100644 index 0000000000000000000000000000000000000000..6fe6f6bf87d98f74dddd8a600e61daebdcd624e8 --- /dev/null +++ b/fla/models/delta_net/modeling_delta_net.py @@ -0,0 +1,439 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.attn import Attention +from fla.layers.delta_net import DeltaNet +from fla.models.delta_net.configuration_delta_net import DeltaNetConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear +from fla.modules.layernorm import rms_norm_linear + +logger = logging.get_logger(__name__) + + +class DeltaNetMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish', + norm_first: bool = True, + norm_eps: float = 1e-5 + ) -> DeltaNetMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.norm_first = norm_first + + if norm_first: + self.norm = RMSNorm(hidden_size=hidden_size, eps=norm_eps) + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + if self.norm_first: + x = rms_norm_linear(x, self.norm.weight, self.norm.bias, self.gate_proj.weight, self.gate_proj.bias) + else: + x = self.gate_proj(x) + gate, y = x.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class DeltaNetBlock(nn.Module): + def __init__(self, config: DeltaNetConfig, layer_idx: int): + super().__init__() + self.hidden_size = config.hidden_size + + if not config.norm_first: + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + if config.attn is not None and layer_idx in config.attn['layers']: + self.attn = Attention( + hidden_size=config.hidden_size, + num_heads=config.attn['num_heads'], + num_kv_heads=config.attn['num_kv_heads'], + window_size=config.attn['window_size'], + max_position_embeddings=config.max_position_embeddings, + layer_idx=layer_idx + ) + else: + self.attn = DeltaNet( + mode=config.attn_mode, + hidden_size=config.hidden_size, + expand_k=config.expand_k, + expand_v=config.expand_v, + num_heads=config.num_heads, + use_gate=config.use_gate, + use_beta=config.use_beta, + use_short_conv=config.use_short_conv, + use_output_norm=config.use_output_norm, + conv_size=config.conv_size, + 
qk_norm=config.qk_norm, + qk_activation=config.qk_activation, + norm_first=config.norm_first, + norm_eps=config.norm_eps, + layer_idx=layer_idx + ) + if not config.norm_first: + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = DeltaNetMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act, + norm_first=config.norm_first, + norm_eps=config.norm_eps + ) + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + + residual = hidden_states + if hasattr(self, 'attn_norm'): + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + if hasattr(self, 'mlp_norm'): + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + else: + hidden_states = residual + hidden_states + residual = hidden_states + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states, attentions, past_key_values) + + return outputs + + +class DeltaNetPreTrainedModel(PreTrainedModel): + + config_class = DeltaNetConfig + supports_gradient_checkpointing = True + _no_split_modules = ['DeltaNetBlock'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = True, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. 
+ # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class DeltaNetModel(DeltaNetPreTrainedModel): + + def __init__(self, config: DeltaNetConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList([DeltaNetBlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn("`DeltaNetModel` does not `output_attentions` now, setting it to `False`.") + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. 
Setting `use_cache=False`...") + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + for layer in self.layers: + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if self.gradient_checkpointing and self.training: + hidden_states, attentions, past_key_values = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + use_cache, + output_attentions + ) + else: + hidden_states, attentions, past_key_values = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + + if output_attentions: + all_attns += (attentions,) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + next_cache = past_key_values + if not return_dict: + return tuple(x for x in [hidden_states, next_cache, all_hidden_states, all_attns] if x is not None) + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=next_cache, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class DeltaNetForCausalLM(DeltaNetPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.model = DeltaNetModel(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def generate(self, *args, **kwargs): + try: + return super().generate(*args, **kwargs) + except AttributeError as exception: + if 'past_key_values' in str(exception): + raise AttributeError( + f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, " + f"which is not supported for {self.__class__.__name__}. " + f"Try another generation strategy instead. " + f"For the available generation strategies, check this doc: " + f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies" + ) + else: + raise exception + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + use_cache: bool = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. + if past_key_values is not None: + input_ids = input_ids[:, -1:] + + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. 
+ # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. + model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/gla/__init__.py b/fla/models/gla/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..edccb515af8f04144308bfcbb72be8e91e714cd7 --- /dev/null +++ b/fla/models/gla/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.gla.configuration_gla import GLAConfig +from fla.models.gla.modeling_gla import GLAForCausalLM, GLAModel + +AutoConfig.register(GLAConfig.model_type, GLAConfig) +AutoModel.register(GLAConfig, GLAModel) +AutoModelForCausalLM.register(GLAConfig, GLAForCausalLM) + + +__all__ = ['GLAConfig', 'GLAForCausalLM', 'GLAModel'] diff --git a/fla/models/gla/configuration_gla.py 
b/fla/models/gla/configuration_gla.py new file mode 100644 index 0000000000000000000000000000000000000000..7991112b2c7bdf2fdad7211cf4c619641599ed87 --- /dev/null +++ b/fla/models/gla/configuration_gla.py @@ -0,0 +1,90 @@ +# -*- coding: utf-8 -*- + +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class GLAConfig(PretrainedConfig): + + model_type = 'gla' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + hidden_size: int = 2048, + expand_k: float = 0.5, + expand_v: float = 1, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + num_hidden_layers: int = 24, + num_heads: int = 4, + num_kv_heads: Optional[int] = None, + feature_map: Optional[str] = None, + attn_mode: str = "chunk", + use_short_conv: bool = False, + conv_size: int = 4, + use_output_gate: bool = True, + clamp_min: Optional[float] = None, + hidden_act: str = "swish", + max_position_embeddings: int = 2048, + elementwise_affine: Optional[bool] = True, + norm_eps: float = 1e-6, + use_gk: bool = True, + use_gv: bool = False, + attn: Optional[Dict] = None, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + tie_word_embeddings: bool = False, + initializer_range: float = 0.02, + fuse_norm: bool = True, + fuse_cross_entropy: bool = True, + vocab_size: int = 32000, + **kwargs + ): + self.hidden_size = hidden_size + self.expand_k = expand_k + self.expand_v = expand_v + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.num_hidden_layers = num_hidden_layers + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads + self.feature_map = feature_map + self.attn_mode = attn_mode + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.use_output_gate = use_output_gate + self.clamp_min = clamp_min + self.hidden_act = hidden_act + self.max_position_embeddings = max_position_embeddings + self.elementwise_affine = elementwise_affine + self.norm_eps = norm_eps + self.use_gk = use_gk + self.use_gv = use_gv + self.attn = attn + self.use_cache = use_cache + self.initializer_range = initializer_range + self.fuse_norm = fuse_norm + self.fuse_cross_entropy = fuse_cross_entropy + self.vocab_size = vocab_size + + if attn is not None: + if not isinstance(attn, Dict): + raise ValueError("attn must be a dictionary") + if 'layers' not in attn: + raise ValueError("Layer indices must be provided to initialize hybrid attention layers") + if 'num_heads' not in attn: + raise ValueError("Number of heads must be provided to initialize hybrid attention layers") + attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads']) + attn['window_size'] = attn.get('window_size', None) + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/gla/modeling_gla.py b/fla/models/gla/modeling_gla.py new file mode 100644 index 0000000000000000000000000000000000000000..bfaa29568a5d026443b4a23f87ba4ba698e0b5d0 --- /dev/null +++ b/fla/models/gla/modeling_gla.py @@ -0,0 +1,418 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import
(BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.attn import Attention +from fla.layers.gla import GatedLinearAttention +from fla.models.gla.configuration_gla import GLAConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear + +logger = logging.get_logger(__name__) + + +class GLAMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish' + ) -> GLAMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + y = self.gate_proj(x) + gate, y = y.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class GLABlock(nn.Module): + def __init__(self, config: GLAConfig, layer_idx: int): + super().__init__() + self.hidden_size = config.hidden_size + + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + if config.attn is not None and layer_idx in config.attn['layers']: + self.attn = Attention( + hidden_size=config.hidden_size, + num_heads=config.attn['num_heads'], + num_kv_heads=config.attn['num_kv_heads'], + window_size=config.attn['window_size'], + max_position_embeddings=config.max_position_embeddings, + layer_idx=layer_idx + ) + else: + self.attn = GatedLinearAttention( + mode=config.attn_mode, + hidden_size=config.hidden_size, + expand_k=config.expand_k, + expand_v=config.expand_v, + num_heads=config.num_heads, + num_kv_heads=config.num_kv_heads, + feature_map=config.feature_map, + use_short_conv=config.use_short_conv, + conv_size=config.conv_size, + use_output_gate=config.use_output_gate, + gate_fn=config.hidden_act, + elementwise_affine=config.elementwise_affine, + norm_eps=config.norm_eps, + clamp_min=config.clamp_min, + fuse_norm=config.fuse_norm, + layer_idx=layer_idx + ) + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = GLAMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act + ) + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs, + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + residual = hidden_states + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + 
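# Attention (softmax) and GatedLinearAttention share this call signature; both return (hidden_states, attentions, past_key_values) +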
past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states, attentions, past_key_values) + + return outputs + + +class GLAPreTrainedModel(PreTrainedModel): + + config_class = GLAConfig + supports_gradient_checkpointing = True + _no_split_modules = ['GLABlock'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = True, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. + # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class GLAModel(GLAPreTrainedModel): + + def __init__(self, config: GLAConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList([GLABlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn("`GLAModel` does not `output_attentions` now, setting it to `False`.") + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else 
self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`...") + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + for layer in self.layers: + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if self.gradient_checkpointing and self.training: + hidden_states, attentions, past_key_values = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + use_cache, + output_attentions + ) + else: + hidden_states, attentions, past_key_values = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + + if output_attentions: + all_attns += (attentions,) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(i for i in [hidden_states, past_key_values, all_hidden_states, all_attns] if i is not None) + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=past_key_values, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class GLAForCausalLM(GLAPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.model = GLAModel(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def generate(self, *args, **kwargs): + try: + return super().generate(*args, **kwargs) + except AttributeError as exception: + if 'past_key_values' in str(exception): + raise AttributeError( + f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, " + f"which is not supported for {self.__class__.__name__}. " + f"Try another generation strategy instead. 
" + f"For the available generation strategies, check this doc: " + f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies" + ) + else: + raise exception + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + use_cache: bool = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. + if past_key_values is not None: + input_ids = input_ids[:, -1:] + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. + # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. + model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + 
if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/gsa/__init__.py b/fla/models/gsa/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..a134f758e0bea0eb844a2db73957936078f889b6 --- /dev/null +++ b/fla/models/gsa/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.gsa.configuration_gsa import GSAConfig +from fla.models.gsa.modeling_gsa import GSAForCausalLM, GSAModel + +AutoConfig.register(GSAConfig.model_type, GSAConfig) +AutoModel.register(GSAConfig, GSAModel) +AutoModelForCausalLM.register(GSAConfig, GSAForCausalLM) + + +__all__ = ['GSAConfig', 'GSAForCausalLM', 'GSAModel'] diff --git a/fla/models/gsa/configuration_gsa.py b/fla/models/gsa/configuration_gsa.py new file mode 100644 index 0000000000000000000000000000000000000000..b2b37c8438cfdc05ea611950a955c431f40354ba --- /dev/null +++ b/fla/models/gsa/configuration_gsa.py @@ -0,0 +1,94 @@ +# -*- coding: utf-8 -*- + +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class GSAConfig(PretrainedConfig): + + model_type = 'gsa' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + hidden_size: int = 2048, + gate_logit_normalizer: Optional[int] = 8, + clamp_min: Optional[float] = None, + clamp_max: Optional[float] = None, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + num_hidden_layers: int = 24, + num_heads: int = 4, + num_kv_heads: Optional[int] = None, + num_slots: Optional[int] = 64, + use_short_conv: bool = False, + conv_size: int = 4, + expand_k: float = 1, + expand_v: float = 1, + feature_map: str = 'swish', + use_output_gate: bool = False, + use_norm: bool = True, + max_position_embeddings: int = 2048, + hidden_act: str = "swish", + elementwise_affine: Optional[bool] = True, + norm_first: bool = True, + norm_eps: float = 1e-6, + attn: Optional[Dict] = None, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + initializer_range: float = 0.02, + tie_word_embeddings: bool = False, + fuse_norm: bool = True, + fuse_cross_entropy: bool = True, + vocab_size: int = 32000, + **kwargs + ): + self.hidden_size = hidden_size + self.gate_logit_normalizer = gate_logit_normalizer + self.clamp_min = clamp_min + self.clamp_max = clamp_max + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.num_hidden_layers = num_hidden_layers + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads + self.num_slots = num_slots + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.expand_k = expand_k + self.expand_v = expand_v + self.feature_map = feature_map + self.use_output_gate = use_output_gate + self.use_norm = use_norm + self.max_position_embeddings = max_position_embeddings + self.hidden_act = hidden_act + self.elementwise_affine = elementwise_affine + self.norm_first = norm_first + self.norm_eps = norm_eps + self.attn = attn + self.use_cache = use_cache + self.initializer_range = initializer_range + self.fuse_cross_entropy = fuse_cross_entropy + self.fuse_norm = fuse_norm + self.vocab_size = vocab_size + + if attn is not None: + if not isinstance(attn,
Dict): + raise ValueError("attn must be a dictionary") + if 'layers' not in attn: + raise ValueError("Layer indices must be provided to initialize hybrid attention layers") + if 'num_heads' not in attn: + raise ValueError("Number of heads must be provided to initialize hybrid attention layers") + attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads']) + attn['window_size'] = attn.get('window_size', None) + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/gsa/modeling_gsa.py b/fla/models/gsa/modeling_gsa.py new file mode 100644 index 0000000000000000000000000000000000000000..4133e6ca0cfb3579ae368ca9a35ce8074f8a7fd2 --- /dev/null +++ b/fla/models/gsa/modeling_gsa.py @@ -0,0 +1,442 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.attn import Attention +from fla.layers.gsa import GatedSlotAttention +from fla.models.gsa.configuration_gsa import GSAConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear +from fla.modules.layernorm import rms_norm_linear + +logger = logging.get_logger(__name__) + + +class GSAMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish', + norm_first: bool = True, + norm_eps: float = 1e-5 + ) -> GSAMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.norm_first = norm_first + + if norm_first: + self.norm = RMSNorm(hidden_size=hidden_size, eps=norm_eps) + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + if self.norm_first: + x = rms_norm_linear(x, self.norm.weight, self.norm.bias, self.gate_proj.weight, self.gate_proj.bias) + else: + x = self.gate_proj(x) + gate, y = x.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class GSABlock(nn.Module): + def __init__(self, config: GSAConfig, layer_idx: int): + super().__init__() + self.hidden_size = config.hidden_size + + if not config.norm_first: + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + if config.attn is not None and layer_idx in config.attn['layers']: + self.attn = Attention( + hidden_size=config.hidden_size, + num_heads=config.attn['num_heads'], 
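+ # layer indices listed in config.attn['layers'] get standard softmax attention; all remaining layers use GatedSlotAttention below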
+ num_kv_heads=config.attn['num_kv_heads'], + window_size=config.attn['window_size'], + max_position_embeddings=config.max_position_embeddings, + layer_idx=layer_idx + ) + else: + self.attn = GatedSlotAttention( + hidden_size=config.hidden_size, + expand_k=config.expand_k, + expand_v=config.expand_v, + num_heads=config.num_heads, + num_kv_heads=config.num_kv_heads, + num_slots=config.num_slots, + use_short_conv=config.use_short_conv, + conv_size=config.conv_size, + feature_map=config.feature_map, + use_output_gate=config.use_output_gate, + use_norm=config.use_norm, + gate_fn=config.hidden_act, + gate_logit_normalizer=config.gate_logit_normalizer, + elementwise_affine=config.elementwise_affine, + norm_first=config.norm_first, + norm_eps=config.norm_eps, + fuse_norm=config.fuse_norm, + layer_idx=layer_idx + ) + if not config.norm_first: + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = GSAMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act, + norm_first=config.norm_first, + norm_eps=config.norm_eps + ) + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + + residual = hidden_states + if hasattr(self, 'attn_norm'): + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + if hasattr(self, 'mlp_norm'): + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + else: + hidden_states = residual + hidden_states + residual = hidden_states + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states, attentions, past_key_values) + + return outputs + + +class GSAPreTrainedModel(PreTrainedModel): + + config_class = GSAConfig + supports_gradient_checkpointing = True + _no_split_modules = ['GSABlock'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = True, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. 
+ # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class GSAModel(GSAPreTrainedModel): + + def __init__(self, config: GSAConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList([GSABlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn("`GSAModel` does not `output_attentions` now, setting it to `False`.") + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. 
Setting `use_cache=False`...") + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + + for i, layer in enumerate(self.layers): + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if self.gradient_checkpointing and self.training: + hidden_states, attentions, past_key_values = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + use_cache, + output_attentions, + ) + else: + hidden_states, attentions, past_key_values = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + + if output_attentions: + all_attns += (attentions,) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(i for i in [hidden_states, past_key_values, all_hidden_states, all_attns] if i is not None) + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=past_key_values, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class GSAForCausalLM(GSAPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + + super().__init__(config) + self.model = GSAModel(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def generate(self, *args, **kwargs): + try: + return super().generate(*args, **kwargs) + except AttributeError as exception: + if 'past_key_values' in str(exception): + raise AttributeError( + f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, " + f"which is not supported for {self.__class__.__name__}. " + f"Try another generation strategy instead. " + f"For the available generation strategies, check this doc: " + f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies" + ) + else: + raise exception + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + use_cache: bool = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. + if past_key_values is not None: + input_ids = input_ids[:, -1:] + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. 
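+ # i.e., without contiguous() the stride of the sliced ids could change between steps, failing the guard and forcing a recompile at each decoding step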
+ # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. + model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/hgrn/__init__.py b/fla/models/hgrn/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..3b29a3dd82da6d64bac6cc887e24295a03de5b23 --- /dev/null +++ b/fla/models/hgrn/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.hgrn.configuration_hgrn import HGRNConfig +from fla.models.hgrn.modeling_hgrn import HGRNForCausalLM, HGRNModel + +AutoConfig.register(HGRNConfig.model_type, HGRNConfig) +AutoModel.register(HGRNConfig, HGRNModel) +AutoModelForCausalLM.register(HGRNConfig, HGRNForCausalLM) + + +__all__ = ['HGRNConfig', 'HGRNForCausalLM', 'HGRNModel'] diff --git a/fla/models/hgrn/configuration_hgrn.py 
b/fla/models/hgrn/configuration_hgrn.py new file mode 100644 index 0000000000000000000000000000000000000000..39dd38db6f855029f070862f03ad6f47ef913bbe --- /dev/null +++ b/fla/models/hgrn/configuration_hgrn.py @@ -0,0 +1,74 @@ +# -*- coding: utf-8 -*- + +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class HGRNConfig(PretrainedConfig): + + model_type = 'hgrn' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + attn_mode: str = "chunk", + hidden_size: int = 2048, + num_hidden_layers: int = 24, + expand_ratio: Optional[int] = 1, + use_short_conv: bool = False, + conv_size: int = 4, + use_lower_bound: bool = True, + max_position_embeddings: int = 2048, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + hidden_act: str = "swish", + elementwise_affine: Optional[bool] = True, + norm_eps: float = 1e-6, + attn: Optional[Dict] = None, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + tie_word_embeddings: bool = False, + initializer_range: float = 0.02, + fuse_cross_entropy: bool = True, + vocab_size: int = 32000, + **kwargs + ): + self.attn_mode = attn_mode + self.hidden_size = hidden_size + self.num_hidden_layers = num_hidden_layers + self.expand_ratio = expand_ratio + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.use_lower_bound = use_lower_bound + self.max_position_embeddings = max_position_embeddings + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.elementwise_affine = elementwise_affine + self.attn = attn + self.norm_eps = norm_eps + self.hidden_act = hidden_act + self.use_cache = use_cache + self.initializer_range = initializer_range + self.fuse_cross_entropy = fuse_cross_entropy + self.vocab_size = vocab_size + + if attn is not None: + if not isinstance(attn, Dict): + raise ValueError("attn must be a dictionary") + if 'layers' not in attn: + raise ValueError("Layer indices must be provided to initialize hybrid attention layers") + if 'num_heads' not in attn: + raise ValueError("Number of heads must be provided to initialize hybrid attention layers") + attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads']) + attn['window_size'] = attn.get('window_size', None) + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/hgrn/modeling_hgrn.py b/fla/models/hgrn/modeling_hgrn.py new file mode 100644 index 0000000000000000000000000000000000000000..3091b40fd86ecb601d813f1b7a77affd40ba3c05 --- /dev/null +++ b/fla/models/hgrn/modeling_hgrn.py @@ -0,0 +1,421 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.attn import Attention +from fla.layers.hgrn import HGRNAttention +from fla.models.hgrn.configuration_hgrn import HGRNConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from 
fla.modules.activations import swiglu_linear + +logger = logging.get_logger(__name__) + + +class HGRNMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish' + ) -> HGRNMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + y = self.gate_proj(x) + gate, y = y.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class HGRNBlock(nn.Module): + def __init__(self, config: HGRNConfig, layer_idx: int): + super().__init__() + self.hidden_size = config.hidden_size + + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + if config.attn is not None and layer_idx in config.attn['layers']: + self.attn = Attention( + hidden_size=config.hidden_size, + num_heads=config.attn['num_heads'], + num_kv_heads=config.attn['num_kv_heads'], + window_size=config.attn['window_size'], + max_position_embeddings=config.max_position_embeddings, + layer_idx=layer_idx + ) + else: + self.attn = HGRNAttention( + mode=config.attn_mode, + hidden_size=config.hidden_size, + expand_ratio=config.expand_ratio, + use_short_conv=config.use_short_conv, + conv_size=config.conv_size, + elementwise_affine=config.elementwise_affine, + norm_eps=config.norm_eps, + layer_idx=layer_idx + ) + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = HGRNMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act + ) + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + lower_bound: Optional[torch.Tensor] = None, + **kwargs, + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + residual = hidden_states + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + lower_bound=lower_bound + ) + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states, attentions, past_key_values) + + return outputs + + +class HGRNPreTrainedModel(PreTrainedModel): + + config_class = HGRNConfig + supports_gradient_checkpointing = True + _no_split_modules = ['HGRNBlock'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, +
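# rescale_prenorm_residual / num_residuals_per_layer implement the GPT-2 scheme below: residual-path projections are divided by sqrt(num_residuals_per_layer * num_hidden_layers) +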
rescale_prenorm_residual: bool = True, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. + # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class HGRNModel(HGRNPreTrainedModel): + + def __init__(self, config: HGRNConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + if config.use_lower_bound: + self.lower_bounds = nn.Parameter(torch.zeros(config.num_hidden_layers, config.hidden_size)) + self.layers = nn.ModuleList([HGRNBlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn("`HGRNModel` does not `output_attentions` now, setting it to `False`.") + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids 
and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`...") + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + + if self.config.use_lower_bound: + lower_bounds = self.lower_bounds.softmax(0) + lower_bounds = lower_bounds.cumsum(0) - lower_bounds[0] + for i, layer in enumerate(self.layers): + if output_hidden_states: + all_hidden_states += (hidden_states,) + + lower_bound = lower_bounds[i] if self.config.use_lower_bound else None + if self.gradient_checkpointing and self.training: + hidden_states, attentions, past_key_values = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + use_cache, + output_attentions, + lower_bound + ) + else: + hidden_states, attentions, past_key_values = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + lower_bound=lower_bound + ) + + if output_attentions: + all_attns += (attentions,) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(i for i in [hidden_states, past_key_values, all_hidden_states, all_attns] if i is not None) + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=past_key_values, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class HGRNForCausalLM(HGRNPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.model = HGRNModel(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def generate(self, *args, **kwargs): + try: + return super().generate(*args, **kwargs) + except AttributeError as exception: + if 'past_key_values' in str(exception): + raise AttributeError( + f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, " + f"which is not supported for {self.__class__.__name__}. " + f"Try another generation strategy instead. 
" + f"For the available generation strategies, check this doc: " + f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies" + ) + else: + raise exception + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + use_cache: bool = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. + if past_key_values is not None: + input_ids = input_ids[:, -1:] + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. + # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. + model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + 
if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/hgrn2/__init__.py b/fla/models/hgrn2/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..306b8082220a57091f2e99cd689c011690db0439 --- /dev/null +++ b/fla/models/hgrn2/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.hgrn2.configuration_hgrn2 import HGRN2Config +from fla.models.hgrn2.modeling_hgrn2 import HGRN2ForCausalLM, HGRN2Model + +AutoConfig.register(HGRN2Config.model_type, HGRN2Config) +AutoModel.register(HGRN2Config, HGRN2Model) +AutoModelForCausalLM.register(HGRN2Config, HGRN2ForCausalLM) + + +__all__ = ['HGRN2Config', 'HGRN2ForCausalLM', 'HGRN2Model'] diff --git a/fla/models/hgrn2/configuration_hgrn2.py b/fla/models/hgrn2/configuration_hgrn2.py new file mode 100644 index 0000000000000000000000000000000000000000..d7a5945b8586d66cc8f0f9e734cc5667fc4ddead --- /dev/null +++ b/fla/models/hgrn2/configuration_hgrn2.py @@ -0,0 +1,76 @@ +# -*- coding: utf-8 -*- + +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class HGRN2Config(PretrainedConfig): + + model_type = 'hgrn2' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + hidden_size: int = 2048, + num_hidden_layers: int = 24, + attn_mode: str = "chunk", + num_heads: Optional[int] = None, + expand_ratio: Optional[int] = 128, + use_short_conv: bool = False, + conv_size: int = 4, + use_lower_bound: bool = True, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + hidden_act: str = "swish", + max_position_embeddings: int = 2048, + elementwise_affine: Optional[bool] = True, + norm_eps: float = 1e-6, + attn: Optional[Dict] = None, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + tie_word_embeddings: bool = False, + initializer_range: float = 0.02, + fuse_cross_entropy: bool = True, + vocab_size: int = 32000, + **kwargs + ): + self.hidden_size = hidden_size + self.num_hidden_layers = num_hidden_layers + self.attn_mode = attn_mode + self.num_heads = num_heads + self.expand_ratio = expand_ratio + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.use_lower_bound = use_lower_bound + self.max_position_embeddings = max_position_embeddings + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.hidden_act = hidden_act + self.elementwise_affine = elementwise_affine + self.norm_eps = norm_eps + self.attn = attn + self.use_cache = use_cache + self.initializer_range = initializer_range + self.fuse_cross_entropy = fuse_cross_entropy + self.vocab_size = vocab_size + + if attn is not None: + if not isinstance(attn, Dict): + raise ValueError("attn must be a dictionary") + if 'layers' not in attn: + raise ValueError("Layer indices must be provided to initialize hybrid attention layers") + if 'num_heads' not in attn: + raise ValueError("Number of heads must be provided to initialize hybrid attention layers") + attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads']) + attn['window_size'] = attn.get('window_size', None) + + super().__init__( + pad_token_id=pad_token_id, + 
bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/hgrn2/modeling_hgrn2.py b/fla/models/hgrn2/modeling_hgrn2.py new file mode 100644 index 0000000000000000000000000000000000000000..f9fec7317941f9d9597fd7ee926ea49b802de6f7 --- /dev/null +++ b/fla/models/hgrn2/modeling_hgrn2.py @@ -0,0 +1,422 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.attn import Attention +from fla.layers.hgrn2 import HGRN2Attention +from fla.models.hgrn2.configuration_hgrn2 import HGRN2Config +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear + +logger = logging.get_logger(__name__) + + +class HGRN2MLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish' + ) -> HGRN2MLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + y = self.gate_proj(x) + gate, y = y.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class HGRN2Block(nn.Module): + def __init__(self, config: HGRN2Config, layer_idx: int): + super().__init__() + self.hidden_size = config.hidden_size + + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + if config.attn is not None and layer_idx in config.attn['layers']: + self.attn = Attention( + hidden_size=config.hidden_size, + num_heads=config.attn['num_heads'], + num_kv_heads=config.attn['num_kv_heads'], + window_size=config.attn['window_size'], + max_position_embeddings=config.max_position_embeddings, + layer_idx=layer_idx + ) + else: + self.attn = HGRN2Attention( + mode=config.attn_mode, + hidden_size=config.hidden_size, + num_heads=config.num_heads, + expand_ratio=config.expand_ratio, + use_short_conv=config.use_short_conv, + conv_size=config.conv_size, + elementwise_affine=config.elementwise_affine, + norm_eps=config.norm_eps, + layer_idx=layer_idx + ) + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = HGRN2MLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act + ) + + def forward( + self, + hidden_states: torch.Tensor, + 
attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + lower_bound: Optional[torch.Tensor] = False, + **kwargs, + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + residual = hidden_states + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + lower_bound=lower_bound + ) + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states, attentions, past_key_values) + + return outputs + + +class HGRN2PreTrainedModel(PreTrainedModel): + + config_class = HGRN2Config + supports_gradient_checkpointing = True + _no_split_modules = ['HGRN2Block'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = True, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. 
+ # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class HGRN2Model(HGRN2PreTrainedModel): + + def __init__(self, config: HGRN2Config): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + if config.use_lower_bound: + self.lower_bounds = nn.Parameter(torch.zeros(config.num_hidden_layers, config.hidden_size)) + self.layers = nn.ModuleList([HGRN2Block(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn("`HGRN2Model` does not `output_attentions` now, setting it to `False`.") + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. 
Setting `use_cache=False`...") + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + + if self.config.use_lower_bound: + lower_bounds = self.lower_bounds.softmax(0) + lower_bounds = lower_bounds.cumsum(0) - lower_bounds[0] + for i, layer in enumerate(self.layers): + if output_hidden_states: + all_hidden_states += (hidden_states,) + + lower_bound = lower_bounds[i] if self.config.use_lower_bound else None + if self.gradient_checkpointing and self.training: + hidden_states, attentions, past_key_values = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + use_cache, + output_attentions, + lower_bound + ) + else: + hidden_states, attentions, past_key_values = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + lower_bound=lower_bound + ) + + if output_attentions: + all_attns += (attentions,) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(i for i in [hidden_states, past_key_values, all_hidden_states, all_attns] if i is not None) + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=past_key_values, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class HGRN2ForCausalLM(HGRN2PreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.model = HGRN2Model(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def generate(self, *args, **kwargs): + try: + return super().generate(*args, **kwargs) + except AttributeError as exception: + if 'past_key_values' in str(exception): + raise AttributeError( + f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, " + f"which is not supported for {self.__class__.__name__}. " + f"Try another generation strategy instead. " + f"For the available generation strategies, check this doc: " + f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies" + ) + else: + raise exception + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + use_cache: bool = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. 
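+        # Since the recurrent layers carry their running state inside `past_key_values`,
+        # each decoding step only needs to process the newest token rather than the full prefix.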
+ if past_key_values is not None: + input_ids = input_ids[:, -1:] + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. + # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. + model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/linear_attn/__init__.py b/fla/models/linear_attn/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..72d5d022de95afe9dc6cf76d3c2026a6a7f9e7a0 --- /dev/null +++ b/fla/models/linear_attn/__init__.py @@ -0,0 +1,14 @@ +# -*- coding: utf-8 -*- + +from 
transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.linear_attn.configuration_linear_attn import \ + LinearAttentionConfig +from fla.models.linear_attn.modeling_linear_attn import ( + LinearAttentionForCausalLM, LinearAttentionModel) + +AutoConfig.register(LinearAttentionConfig.model_type, LinearAttentionConfig) +AutoModel.register(LinearAttentionConfig, LinearAttentionModel) +AutoModelForCausalLM.register(LinearAttentionConfig, LinearAttentionForCausalLM) + +__all__ = ['LinearAttentionConfig', 'LinearAttentionForCausalLM', 'LinearAttentionModel'] diff --git a/fla/models/linear_attn/configuration_linear_attn.py b/fla/models/linear_attn/configuration_linear_attn.py new file mode 100644 index 0000000000000000000000000000000000000000..d1bff79e2ecbd61fc07479b4dafbccc40cfa94d8 --- /dev/null +++ b/fla/models/linear_attn/configuration_linear_attn.py @@ -0,0 +1,83 @@ +# -*- coding: utf-8 -*- + +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class LinearAttentionConfig(PretrainedConfig): + + model_type = 'linear_attn' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + attn_mode: str = "fused_chunk", + hidden_size: int = 2048, + expand_k: int = 1, + expand_v: int = 1, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + num_hidden_layers: int = 24, + num_heads: int = 4, + num_kv_heads: Optional[int] = None, + feature_map: str = "elementwise_product", + tie_feature_map_qk: bool = False, + norm_q: bool = False, + norm_k: bool = False, + norm_feature_map: bool = False, + hidden_act: str = "swish", + max_position_embeddings: int = 2048, + elementwise_affine: Optional[bool] = True, + norm_eps: float = 1e-6, + attn: Optional[Dict] = None, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + tie_word_embeddings: bool = False, + initializer_range: float = 0.02, + fuse_cross_entropy: bool = True, + vocab_size: int = 32000, + **kwargs + ): + self.attn_mode = attn_mode + self.hidden_size = hidden_size + self.expand_k = expand_k + self.expand_v = expand_v + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.num_hidden_layers = num_hidden_layers + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads + self.feature_map = feature_map + self.tie_feature_map_qk = tie_feature_map_qk + self.norm_q = norm_q + self.norm_k = norm_k + self.norm_feature_map = norm_feature_map + self.max_position_embeddings = max_position_embeddings + self.elementwise_affine = elementwise_affine + self.norm_eps = norm_eps + self.attn = attn + self.use_cache = use_cache + self.initializer_range = initializer_range + self.fuse_cross_entropy = fuse_cross_entropy + self.vocab_size = vocab_size + + if attn is not None: + if not isinstance(attn, Dict): + raise ValueError("attn must be a dictionary") + if 'layers' not in attn: + raise ValueError("Layer indices must be provided to initialize hybrid attention layers") + if 'num_heads' not in attn: + raise ValueError("Number of heads must be provided to initialize hybrid attention layers") + attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads']) + attn['window_size'] = attn.get('window_size', None) + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/linear_attn/modeling_linear_attn.py 
b/fla/models/linear_attn/modeling_linear_attn.py new file mode 100644 index 0000000000000000000000000000000000000000..e1906bfd856a801c4b2073a1ba7094ff93304da5 --- /dev/null +++ b/fla/models/linear_attn/modeling_linear_attn.py @@ -0,0 +1,427 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.attn import Attention +from fla.layers.linear_attn import LinearAttention +from fla.models.linear_attn.configuration_linear_attn import \ + LinearAttentionConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear + +logger = logging.get_logger(__name__) + + +class LinearAttentionMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish' + ) -> LinearAttentionMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + y = self.gate_proj(x) + gate, y = y.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class LinearAttentionBlock(nn.Module): + def __init__(self, config: LinearAttentionConfig, layer_idx: int): + super().__init__() + self.hidden_size = config.hidden_size + + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + if config.attn is not None and layer_idx in config.attn['layers']: + self.attn = Attention( + hidden_size=config.hidden_size, + num_heads=config.attn['num_heads'], + num_kv_heads=config.attn['num_kv_heads'], + window_size=config.attn['window_size'], + max_position_embeddings=config.max_position_embeddings, + layer_idx=layer_idx + ) + else: + self.attn = LinearAttention( + mode=config.attn_mode, + hidden_size=config.hidden_size, + expand_k=config.expand_k, + expand_v=config.expand_v, + num_heads=config.num_heads, + num_kv_heads=config.num_kv_heads, + feature_map=config.feature_map, + tie_feature_map_qk=config.tie_feature_map_qk, + norm_q=config.norm_q, + norm_k=config.norm_k, + do_feature_map_norm=config.norm_feature_map, + elementwise_affine=config.elementwise_affine, + norm_eps=config.norm_eps, + layer_idx=layer_idx + ) + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = LinearAttentionMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + 
intermediate_size=config.intermediate_size,
+            hidden_act=config.hidden_act
+        )
+
+    def forward(
+        self,
+        hidden_states: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+        past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None,
+        use_cache: Optional[bool] = False,
+        output_attentions: Optional[bool] = False,
+        **kwargs,
+    ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]:
+        residual = hidden_states
+        # attention weights and incremental decoding states are currently not supported
+        attn_weights = None
+
+        hidden_states = self.attn_norm(hidden_states)
+        hidden_states = self.attn(hidden_states)
+        hidden_states, residual = self.mlp_norm(hidden_states, residual, True)
+        hidden_states = self.mlp(hidden_states)
+        hidden_states = residual + hidden_states
+
+        # return a fixed-arity tuple so that `LinearAttentionModel` can unpack it
+        # exactly like the other FLA blocks; `past_key_values` is passed through
+        # unchanged since this block does not update any state
+        outputs = (hidden_states, attn_weights, past_key_values)
+
+        return outputs
+
+
+class LinearAttentionPreTrainedModel(PreTrainedModel):
+
+    config_class = LinearAttentionConfig
+    supports_gradient_checkpointing = True
+    _no_split_modules = ['LinearAttentionBlock']
+
+    def __init__(self, *inputs, **kwargs):
+        super().__init__(*inputs, **kwargs)
+
+    def _init_weights(
+        self,
+        module: nn.Module,
+        rescale_prenorm_residual: bool = True,
+        num_residuals_per_layer: int = 2,
+    ):
+        if isinstance(module, (nn.Linear, nn.Conv1d)):
+            # Slightly different from the TF version, which uses truncated_normal for initialization
+            # cf https://github.com/pytorch/pytorch/pull/5617
+            nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range)
+            if module.bias is not None:
+                nn.init.zeros_(module.bias)
+        elif isinstance(module, nn.Embedding):
+            nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range)
+            if module.padding_idx is not None:
+                module.weight.data[module.padding_idx].zero_()
+
+        if rescale_prenorm_residual:
+            # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme:
+            # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale
+            # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers.
+ # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class LinearAttentionModel(LinearAttentionPreTrainedModel): + + def __init__(self, config: LinearAttentionConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList([LinearAttentionBlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn( + "`LinearAttentionModel` does not support output attention weights now, " + "so `output_attentions` is set to `False`." + ) + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. 
Setting `use_cache=False`...")
+            use_cache = False
+
+        all_hidden_states = () if output_hidden_states else None
+        all_attns = () if output_attentions else None
+
+        for layer in self.layers:
+            if output_hidden_states:
+                all_hidden_states += (hidden_states,)
+
+            if self.gradient_checkpointing and self.training:
+                hidden_states, attentions, past_key_values = self._gradient_checkpointing_func(
+                    layer.__call__,
+                    hidden_states,
+                    attention_mask,
+                    past_key_values,
+                    use_cache,
+                    output_attentions,
+                )
+            else:
+                hidden_states, attentions, past_key_values = layer(
+                    hidden_states,
+                    attention_mask=attention_mask,
+                    past_key_values=past_key_values,
+                    use_cache=use_cache,
+                    output_attentions=output_attentions
+                )
+
+            if output_attentions:
+                all_attns += (attentions,)
+
+        hidden_states = self.norm(hidden_states)
+
+        # add hidden states from the last decoder layer
+        if output_hidden_states:
+            all_hidden_states += (hidden_states,)
+
+        if not return_dict:
+            return tuple(i for i in [hidden_states, past_key_values, all_hidden_states, all_attns] if i is not None)
+        return BaseModelOutputWithPast(
+            last_hidden_state=hidden_states,
+            past_key_values=past_key_values,
+            hidden_states=all_hidden_states,
+            attentions=all_attns
+        )
+
+
+class LinearAttentionForCausalLM(LinearAttentionPreTrainedModel, GenerationMixin):
+
+    _tied_weights_keys = ["lm_head.weight"]
+
+    def __init__(self, config):
+        super().__init__(config)
+        self.model = LinearAttentionModel(config)
+        self.vocab_size = config.vocab_size
+        self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False)
+
+        # Initialize weights and apply final processing
+        self.post_init()
+
+    def get_input_embeddings(self):
+        return self.model.embeddings
+
+    def set_input_embeddings(self, value):
+        self.model.embeddings = value
+
+    def get_output_embeddings(self):
+        return self.lm_head
+
+    def set_output_embeddings(self, new_embeddings):
+        self.lm_head = new_embeddings
+
+    def set_decoder(self, decoder):
+        self.model = decoder
+
+    def get_decoder(self):
+        return self.model
+
+    def generate(self, *args, **kwargs):
+        try:
+            return super().generate(*args, **kwargs)
+        except AttributeError as exception:
+            if 'past_key_values' in str(exception):
+                raise AttributeError(
+                    f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, "
+                    f"which is not supported for {self.__class__.__name__}. "
+                    f"Try another generation strategy instead. "
+                    f"For the available generation strategies, check this doc: "
+                    f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies"
+                )
+            else:
+                raise exception
+
+    def prepare_inputs_for_generation(
+        self,
+        input_ids: torch.LongTensor = None,
+        past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None,
+        attention_mask: Optional[torch.Tensor] = None,
+        inputs_embeds: Optional[torch.Tensor] = None,
+        use_cache: bool = True,
+        num_logits_to_keep: Optional[int] = None,
+        **kwargs
+    ):
+        # only use the last token of `input_ids` if `past_key_values` is passed along.
+        if past_key_values is not None:
+            input_ids = input_ids[:, -1:]
+        # if `inputs_embeds` are passed, we only want to use them in the 1st generation step
+        if inputs_embeds is not None and past_key_values is None:
+            model_inputs = {'inputs_embeds': inputs_embeds}
+        else:
+            # The `contiguous()` here is necessary to have a static stride during decoding.
torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. + # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. + model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/mamba/__init__.py b/fla/models/mamba/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..a0eff2ea26f3a11bcf2333002509686eca2289aa --- /dev/null +++ b/fla/models/mamba/__init__.py @@ -0,0 +1,14 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.mamba.configuration_mamba import MambaConfig +from fla.models.mamba.modeling_mamba import (MambaBlock, MambaForCausalLM, + MambaModel) + +AutoConfig.register(MambaConfig.model_type, MambaConfig, True) +AutoModel.register(MambaConfig, MambaModel, True) +AutoModelForCausalLM.register(MambaConfig, 
MambaForCausalLM, True)
+
+
+__all__ = ['MambaConfig', 'MambaForCausalLM', 'MambaModel', 'MambaBlock']
diff --git a/fla/models/mamba/configuration_mamba.py b/fla/models/mamba/configuration_mamba.py
new file mode 100644
index 0000000000000000000000000000000000000000..f25d6e3e134c5a549351fbfe256c674ecbbec29b
--- /dev/null
+++ b/fla/models/mamba/configuration_mamba.py
@@ -0,0 +1,166 @@
+# coding=utf-8
+# Copyright 2024 The HuggingFace Inc. team.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""MAMBA configuration"""
+
+import math
+
+from transformers.configuration_utils import PretrainedConfig
+
+
+class MambaConfig(PretrainedConfig):
+    """
+    This is the configuration class to store the configuration of a [`MambaModel`]. It is used to instantiate a MAMBA
+    model according to the specified arguments, defining the model architecture. Instantiating a configuration with the
+    defaults will yield a similar configuration to that of the MAMBA
+    [state-spaces/mamba-2.8b](https://huggingface.co/state-spaces/mamba-2.8b) architecture.
+
+    Configuration objects inherit from [`PretrainedConfig`] and can be used to control the model outputs. Read the
+    documentation from [`PretrainedConfig`] for more information.
+
+
+    Args:
+        vocab_size (`int`, *optional*):
+            Vocabulary size of the Mamba model.
+        hidden_size (`int`, *optional*):
+            Dimensionality of the embeddings and hidden states. Default: 2048.
+        state_size (`int`, *optional*):
+            Shape of the state space latents. Default: 16.
+        num_hidden_layers (`int`, *optional*):
+            Number of hidden layers in the model. Default: 48.
+        layer_norm_epsilon (`float`, *optional*):
+            The epsilon to use in the layer normalization layers. Default: 1e-5.
+        pad_token_id (`int`, *optional*):
+            Padding token id. Default: 0.
+        bos_token_id (`int`, *optional*):
+            The id of the beginning of sentence token in the vocabulary. Default: 1.
+        eos_token_id (`int`, *optional*):
+            The id of the end of sentence token in the vocabulary. Default: 2.
+        expand (`int`, *optional*):
+            Expanding factor used to determine the intermediate size. Default: 2.
+        conv_kernel (`int`, *optional*):
+            Size of the convolution kernel. Default: 4.
+        use_bias (`bool`, *optional*):
+            Whether or not to use bias in ["in_proj", "out_proj"] of the mixer block. Default: `False`.
+        use_conv_bias (`bool`, *optional*):
+            Whether or not to use bias in the convolution layer of the mixer block. Default: `True`.
+        hidden_act (`str`, *optional*):
+            The non-linear activation function (function or string) in the decoder. Default: `"silu"`.
+        initializer_range (`float`, *optional*):
+            The standard deviation of the truncated_normal_initializer for initializing all weight matrices. Default: 0.1.
+        residual_in_fp32 (`bool`, *optional*):
+            Whether or not residuals should be in `float32`.
+            If set to `False`, residuals will keep the same `dtype` as the rest of the model. Default: `False`.
+        time_step_rank (`Union[int,str]`, *optional*):
+            Rank of the discretization projection matrix.
+ `"auto"` means that it will default to `math.ceil(self.hidden_size / 16)`. Default: `"auto"`. + time_step_scale (`float`, *optional*): + Scale used used to scale `dt_proj.bias`. Default: 1.0. + time_step_min (`float`, *optional*): + Minimum `time_step` used to bound `dt_proj.bias`. Default: 0.001. + time_step_max (`float`, *optional*): + Maximum `time_step` used to bound `dt_proj.bias`. Default: 0.1. + time_step_init_scheme (`float`, *optional*): + Init scheme used for `dt_proj.weight`. Should be one of `["random","uniform"]`. Default: `"random"`. + time_step_floor (`float`, *optional*): + Minimum clamping value of the `dt_proj.bias` layer initialization. Default: 0.0001. + window_size (`int`, *optional*): + The window size used for sliding window attention. Default: 2048. + rescale_prenorm_residual (`bool`, *optional*): + Whether or not to rescale `out_proj` weights when initializing. Default: `False`. + use_cache (`bool`, *optional*): + Whether or not the cache should be used. Default: `True`. + + + Example: + + ```python + >>> from transformers import MambaConfig, MambaModel + + >>> # Initializing a Mamba configuration + >>> configuration = MambaConfig() + + >>> # Initializing a model (with random weights) from the configuration + >>> model = MambaModel(configuration) + + >>> # Accessing the model configuration + >>> configuration = model.config + ```""" + + model_type = "mamba" + + def __init__( + self, + vocab_size: int = 32000, + hidden_size: int = 2048, + state_size: int = 16, + num_hidden_layers: int = 48, + layer_norm_epsilon=1e-5, + pad_token_id: int = 0, + bos_token_id: int = 1, + eos_token_id: int = 2, + expand: int = 2, + conv_kernel: int = 4, + use_bias: bool = False, + use_conv_bias: bool = True, + hidden_act: str = "silu", + initializer_range: str = 0.1, + residual_in_fp32: bool = False, + time_step_rank: str = "auto", + time_step_scale: float = 1.0, + time_step_min: float = 0.001, + time_step_max: float = 0.1, + time_step_init_scheme: str = "random", + time_step_floor: float = 1e-4, + rescale_prenorm_residual: bool = False, + use_cache: bool = True, + fuse_norm: bool = True, + fuse_cross_entropy: bool = True, + tie_word_embeddings: bool = False, + **kwargs, + ): + self.vocab_size = vocab_size + self.hidden_size = hidden_size + self.state_size = state_size + self.num_hidden_layers = num_hidden_layers + self.layer_norm_epsilon = layer_norm_epsilon + self.conv_kernel = conv_kernel + self.expand = expand + self.intermediate_size = int(expand * self.hidden_size) + self.bos_token_id = bos_token_id + self.eos_token_id = eos_token_id + self.pad_token_id = pad_token_id + self.use_bias = use_bias + self.use_conv_bias = use_conv_bias + self.hidden_act = hidden_act + self.initializer_range = initializer_range + self.time_step_rank = math.ceil(self.hidden_size / 16) if time_step_rank == "auto" else time_step_rank + self.time_step_scale = time_step_scale + self.time_step_min = time_step_min + self.time_step_max = time_step_max + self.time_step_init_scheme = time_step_init_scheme + self.time_step_floor = time_step_floor + self.rescale_prenorm_residual = rescale_prenorm_residual + self.residual_in_fp32 = residual_in_fp32 + self.use_cache = use_cache + self.fuse_cross_entropy = fuse_cross_entropy + self.fuse_norm = fuse_norm + + super().__init__( + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + pad_token_id=pad_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs + ) diff --git a/fla/models/mamba/modeling_mamba.py b/fla/models/mamba/modeling_mamba.py new file mode 
100644 index 0000000000000000000000000000000000000000..3ffbd2cd9b5e6d93ceca620679c268bb67624276 --- /dev/null +++ b/fla/models/mamba/modeling_mamba.py @@ -0,0 +1,837 @@ +# coding=utf-8 +# Copyright 2024 state-spaces/mamba org and HuggingFace Inc. team. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +"""PyTorch MAMBA model.""" + +import math +import warnings +from dataclasses import dataclass +from typing import Any, Dict, Optional, Tuple, Union + +import torch +import torch.utils.checkpoint +from torch import nn +from transformers.activations import ACT2FN +from transformers.configuration_utils import PretrainedConfig +from transformers.generation import GenerationMixin +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import ModelOutput, logging + +from fla.models.mamba.configuration_mamba import MambaConfig +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) + +logger = logging.get_logger(__name__) + + +with warnings.catch_warnings(): + warnings.simplefilter('ignore') + try: + from mamba_ssm.ops.selective_scan_interface import (mamba_inner_fn, + selective_scan_fn) + from mamba_ssm.ops.triton.selective_state_update import \ + selective_state_update + except ImportError: + selective_state_update, selective_scan_fn, mamba_inner_fn = None, None, None + + try: + from causal_conv1d import causal_conv1d_fn, causal_conv1d_update + except ImportError: + causal_conv1d_update, causal_conv1d_fn = None, None + is_fast_path_available = all(( + selective_state_update, + selective_scan_fn, + causal_conv1d_fn, + causal_conv1d_update, + mamba_inner_fn + )) + + +class MambaCache: + """ + Cache for mamba model which does not have attention mechanism and key value states. + + Arguments: + config (`PretrainedConfig): + The configuration file defining the shape-related attributes required to initialize the static cache. + batch_size (`int`): + The batch size with which the model will be used. Note that a new instance must be instantiated if a + smaller batch size is used. + dtype (`torch.dtype`, *optional*, defaults to `torch.float16`): + The default `dtype` to use when initializing the layer. + device (`torch.device` or `str`, *optional*): + The device on which the cache should be initialized. Should be the same as the layer. + + Attributes: + dtype: (`torch.dtype`): + The default `dtype` used to initializing the cache. + intermediate_size: (`int`): + Model's intermediate_size taken from config. + ssm_state_size: (`int`): + Model's state_size taken from config. + conv_kernel_size: (`int`): + Model's convolution kernel size taken from config + conv_states: (`torch.Tensor`): + A tensor of shape `[layer_idx, batch_size, intermediate_size, conv_kernel_size]` that holds convolutional states. 
+ ssm_states: (`torch.Tensor`): + A tensor of shape `[layer_idx, batch_size, intermediate_size, ssm_state_size]` that holds ssm states + + Example: + + ```python + >>> from transformers import AutoTokenizer, MambaForCausalLM, MambaCache + + >>> model = MambaForCausalLM.from_pretrained("state-spaces/mamba-130m-hf") + >>> tokenizer = AutoTokenizer.from_pretrained("state-spaces/mamba-130m-hf") + + >>> inputs = tokenizer(text="My name is Mamba", return_tensors="pt") + + >>> # Prepare a cache class and pass it to model's forward + >>> past_key_values = MambaCache(config=model.config, batch_size=1, device=model.device, dtype=model.dtype) + >>> outputs = model(**inputs, past_key_values=past_key_values, use_cache=True) + >>> outputs.past_key_values + MambaCache() + ``` + """ + + # TODO (joao): remove `=None` in non-optional arguments in v4.46. Remove from `OBJECTS_TO_IGNORE` as well. + def __init__( + self, + config: PretrainedConfig, + batch_size: int = None, + dtype: torch.dtype = torch.float16, + device: Optional[Union[torch.device, str]] = None, + max_batch_size: Optional[int] = None, + ): + if max_batch_size is not None: + logger.warning_once( + f"The 'max_batch_size' argument of {self.__class__.__name__} is deprecated and will be removed in " + "v4.46. Use the more precisely named 'batch_size' argument instead." + ) + self.dtype = dtype + self.batch_size = batch_size or max_batch_size + self.intermediate_size = config.intermediate_size + self.ssm_state_size = config.state_size + self.conv_kernel_size = config.conv_kernel + + self.conv_states: torch.Tensor = torch.zeros( + config.num_hidden_layers, + self.batch_size, + self.intermediate_size, + self.conv_kernel_size, + device=device, + dtype=dtype, + ) + self.ssm_states: torch.Tensor = torch.zeros( + config.num_hidden_layers, + self.batch_size, + self.intermediate_size, + self.ssm_state_size, + device=device, + dtype=dtype, + ) + + torch._dynamo.mark_static_address(self.conv_states) + torch._dynamo.mark_static_address(self.ssm_states) + + def update_conv_state( + self, layer_idx: int, new_conv_state: torch.Tensor, cache_position: torch.LongTensor + ) -> torch.Tensor: + conv_state = self.conv_states[layer_idx] + cache_position = cache_position.clamp(0, self.conv_kernel_size - 1) + + conv_state = conv_state.roll(shifts=-1, dims=-1) + conv_state[:, :, cache_position] = new_conv_state.to(conv_state.device) + self.conv_states[layer_idx].zero_() + self.conv_states[layer_idx] += conv_state + return self.conv_states[layer_idx] + + def update_ssm_state(self, layer_idx: int, new_ssm_state: torch.Tensor): + self.ssm_states[layer_idx] = new_ssm_state.to(self.ssm_states.device) + return self.ssm_states[layer_idx] + + def reset(self): + self.conv_states.zero_() + self.ssm_states.zero_() + + +class MambaMixer(nn.Module): + """ + Compute ∆, A, B, C, and D the state space parameters and compute the `contextualized_states`. 
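+    Concretely, the discretized recurrence amounts to h_t = exp(∆_t·A)·h_{t-1} + ∆_t·B_t·x_t with readout
+    y_t = C_t·h_t + D·x_t (the reference loop in `slow_forward` computes this per channel, followed by the gating branch).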
+    A, D are input independent (see Mamba paper [1] Section 3.5.2 "Interpretation of A" for why A isn't selective)
+    ∆, B, C are input-dependent (this is a key difference between Mamba and the linear time invariant S4,
+    and is why Mamba is called **selective** state spaces)
+    """
+
+    def __init__(self, config: MambaConfig, layer_idx: int):
+        super().__init__()
+        self.config = config
+        self.hidden_size = config.hidden_size
+        self.ssm_state_size = config.state_size
+        self.conv_kernel_size = config.conv_kernel
+        self.intermediate_size = config.intermediate_size
+        self.time_step_rank = int(config.time_step_rank)
+        self.layer_idx = layer_idx
+        self.use_conv_bias = config.use_conv_bias
+        self.conv1d = nn.Conv1d(
+            in_channels=self.intermediate_size,
+            out_channels=self.intermediate_size,
+            bias=config.use_conv_bias,
+            kernel_size=config.conv_kernel,
+            groups=self.intermediate_size,
+            padding=config.conv_kernel - 1,
+        )
+
+        self.activation = config.hidden_act
+        self.act = ACT2FN[config.hidden_act]
+
+        # projection of the input hidden states
+        self.in_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=config.use_bias)
+        # selective projection used to make dt, B and C input dependent
+        self.x_proj = nn.Linear(self.intermediate_size, self.time_step_rank + self.ssm_state_size * 2, bias=False)
+        # time step projection (discretization)
+        self.dt_proj = nn.Linear(self.time_step_rank, self.intermediate_size, bias=True)
+
+        # S4D real initialization. These are not discretized!
+        # The core is to load them, compute the discrete states, then write the updated state. Keeps the memory bounded
+        A = torch.arange(1, self.ssm_state_size + 1, dtype=torch.float32)[None, :]
+        A = A.expand(self.intermediate_size, -1).contiguous()
+
+        self.A_log = nn.Parameter(torch.log(A))
+        self.D = nn.Parameter(torch.ones(self.intermediate_size))
+        self.out_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=config.use_bias)
+        self.use_bias = config.use_bias
+
+        if not is_fast_path_available:
+            logger.warning_once(
+                "The fast path is not available because one of "
+                "`(selective_state_update, selective_scan_fn, causal_conv1d_fn, causal_conv1d_update, mamba_inner_fn)`"
+                " is None. Falling back to the naive implementation. "
+                "To install follow https://github.com/state-spaces/mamba/#installation and"
+                " https://github.com/Dao-AILab/causal-conv1d"
+            )
+
+    def cuda_kernels_forward(
+        self,
+        hidden_states: torch.Tensor,
+        cache_params: Optional[MambaCache] = None,
+        cache_position: Optional[torch.LongTensor] = None,
+        attention_mask: Optional[torch.LongTensor] = None,
+    ):
+        # 1. Gated MLP's linear projection
+        projected_states = self.in_proj(hidden_states).transpose(1, 2)
+
+        if self.training and cache_params is None:  # Doesn't support outputting the states -> used for training
+            contextualized_states = mamba_inner_fn(
+                projected_states,
+                self.conv1d.weight,
+                self.conv1d.bias if self.use_conv_bias else None,
+                self.x_proj.weight,
+                self.dt_proj.weight,
+                self.out_proj.weight,
+                self.out_proj.bias.float() if self.use_bias else None,
+                -torch.exp(self.A_log.float()),
+                None,  # input-dependent B
+                None,  # input-dependent C
+                self.D.float(),
+                delta_bias=self.dt_proj.bias.float(),
+                delta_softplus=True,
+            )
+
+        else:
+            hidden_states, gate = projected_states.chunk(2, dim=1)
+
+            if attention_mask is not None:
+                hidden_states = hidden_states * attention_mask.unsqueeze(1)
+
+            # 2.
Convolution sequence transformation + conv_weights = self.conv1d.weight.view(self.conv1d.weight.size(0), self.conv1d.weight.size(2)) + if cache_params is not None and cache_position[0] > 0: + hidden_states = causal_conv1d_update( + hidden_states.squeeze(-1), + cache_params.conv_states[self.layer_idx], + conv_weights, + self.conv1d.bias, + self.activation, + ) + hidden_states = hidden_states.unsqueeze(-1) + else: + if cache_params is not None: + conv_states = nn.functional.pad( + hidden_states, (self.conv_kernel_size - hidden_states.shape[-1], 0) + ) + cache_params.update_conv_state(self.layer_idx, conv_states, cache_position) + hidden_states = causal_conv1d_fn( + hidden_states, conv_weights, self.conv1d.bias, activation=self.activation + ) + + if attention_mask is not None: + hidden_states = hidden_states * attention_mask.unsqueeze(1) + + # 3. State Space Model sequence transformation + # 3.a. input varying initialization of time_step, B and C + ssm_parameters = self.x_proj(hidden_states.transpose(1, 2)) + time_step, B, C = torch.split( + ssm_parameters, [self.time_step_rank, self.ssm_state_size, self.ssm_state_size], dim=-1 + ) + discrete_time_step = self.dt_proj.weight @ time_step.transpose(1, 2) + + A = -torch.exp(self.A_log.float()) + # 3.c perform the recurrence y ← SSM(A, B, C)(x) + time_proj_bias = self.dt_proj.bias.float() if hasattr(self.dt_proj, "bias") else None + if cache_params is not None and cache_position[0] > 0: + scan_outputs = selective_state_update( + cache_params.ssm_states[self.layer_idx], + hidden_states[..., 0], + discrete_time_step[..., 0], + A, + B[:, 0], + C[:, 0], + self.D, + gate[..., 0], + time_proj_bias, + dt_softplus=True, + ).unsqueeze(-1) + else: + scan_outputs, ssm_state = selective_scan_fn( + hidden_states, + discrete_time_step, + A, + B.transpose(1, 2), + C.transpose(1, 2), + self.D.float(), + gate, + time_proj_bias, + delta_softplus=True, + return_last_state=True, + ) + if ssm_state is not None and cache_params is not None: + cache_params.update_ssm_state(self.layer_idx, ssm_state) + + # 4. Final linear projection + contextualized_states = self.out_proj(scan_outputs.transpose(1, 2)) + return contextualized_states + + def slow_forward( + self, + input_states, + cache_params: Optional[MambaCache] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.LongTensor] = None + ): + batch_size, seq_len, _ = input_states.shape + dtype = input_states.dtype + # 1. Gated MLP's linear projection + # [batch, 2 * intermediate_size, seq_len] + projected_states = self.in_proj(input_states).transpose(1, 2) + hidden_states, gate = projected_states.chunk(2, dim=1) + + if attention_mask is not None: + hidden_states = hidden_states * attention_mask.unsqueeze(1) + + # 2. 
Convolution sequence transformation
+        if cache_params is not None:
+            ssm_state = cache_params.ssm_states[self.layer_idx].clone()
+            ssm_state = ssm_state.to(hidden_states.device)
+            # use `cache_position.shape[0]` to check whether we are in prefill
+            # stage, it's equivalent to checking `cache_position[0] == 0`, which
+            # breaks dynamo fullgraph constraints
+            if cache_position.shape[0] == self.conv_kernel_size:
+                conv_state = nn.functional.pad(
+                    hidden_states,
+                    (self.conv_kernel_size - hidden_states.shape[-1], 0)
+                )
+
+                cache_params.update_conv_state(self.layer_idx, conv_state, cache_position)
+                # [batch, intermediate_size, seq_len]
+                hidden_states = self.act(self.conv1d(hidden_states)[..., :seq_len])
+            else:
+                conv_state = cache_params.update_conv_state(self.layer_idx, hidden_states, cache_position)
+                hidden_states = torch.sum(conv_state * self.conv1d.weight[:, 0, :], dim=-1)
+                if self.use_conv_bias:
+                    hidden_states += self.conv1d.bias
+                # [batch, intermediate_size, 1] : decoding
+                hidden_states = self.act(hidden_states).to(dtype).unsqueeze(-1)
+        else:
+            ssm_state = torch.zeros(
+                (batch_size, self.intermediate_size, self.ssm_state_size),
+                device=hidden_states.device, dtype=dtype
+            )
+            # [batch, intermediate_size, seq_len]
+            hidden_states = self.act(self.conv1d(hidden_states)[..., :seq_len])
+
+        if attention_mask is not None:
+            hidden_states = hidden_states * attention_mask.unsqueeze(1)
+
+        # 3. State Space Model sequence transformation
+        # 3.a. Selection: [batch, seq_len, self.time_step_rank + self.ssm_state_size * 2]
+        ssm_parameters = self.x_proj(hidden_states.transpose(1, 2))
+        time_step, B, C = torch.split(
+            ssm_parameters, [self.time_step_rank, self.ssm_state_size, self.ssm_state_size], dim=-1
+        )
+        # [batch, seq_len, intermediate_size]
+        discrete_time_step = self.dt_proj(time_step)
+        # [batch, intermediate_size, seq_len]
+        discrete_time_step = nn.functional.softplus(discrete_time_step).transpose(1, 2)
+
+        # 3.b. Discretization: B and C to [batch, seq_len, intermediate_size, ssm_state_size] (SRAM)
+        # [intermediate_size, ssm_state_size]
+        A = -torch.exp(self.A_log.float())
+        # [batch, intermediate_size, seq_len, ssm_state_size]
+        discrete_A = torch.exp(A[None, :, None, :] * discrete_time_step[:, :, :, None])
+        # [batch, intermediate_size, seq_len, ssm_state_size]
+        discrete_B = discrete_time_step[:, :, :, None] * B[:, None, :, :].float()
+        deltaB_u = discrete_B * hidden_states[:, :, :, None].float()
+
+        # 3.c perform the recurrence y ← SSM(A, B, C)(x)
+        scan_outputs = []
+        for i in range(seq_len):
+            # [batch, intermediate_size, ssm_state]
+            ssm_state = discrete_A[:, :, i, :] * ssm_state + deltaB_u[:, :, i, :]
+            # [batch, intermediate_size, 1]
+            scan_output = torch.matmul(ssm_state.to(dtype), C[:, i, :].unsqueeze(-1))
+            scan_outputs.append(scan_output[:, :, 0])
+        # [batch, seq_len, intermediate_size]
+        scan_output = torch.stack(scan_outputs, dim=-1)
+        scan_output = scan_output + (hidden_states * self.D[None, :, None])
+        scan_output = (scan_output * self.act(gate))
+
+        if cache_params is not None:
+            cache_params.ssm_states[self.layer_idx].copy_(ssm_state)
+
+        # 4.
Final linear projection + # [batch, seq_len, hidden_size] + contextualized_states = self.out_proj(scan_output.transpose(1, 2)) + return contextualized_states + # fmt: on + + def forward( + self, + hidden_states, + cache_params: Optional[MambaCache] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.LongTensor] = None, + ): + if is_fast_path_available and "cuda" in self.x_proj.weight.device.type: + return self.cuda_kernels_forward(hidden_states, cache_params, cache_position, attention_mask) + return self.slow_forward(hidden_states, cache_params, cache_position, attention_mask) + + +class MambaBlock(nn.Module): + def __init__(self, config, layer_idx): + super().__init__() + self.config = config + self.layer_idx = layer_idx + self.residual_in_fp32 = config.residual_in_fp32 + self.norm = RMSNorm(config.hidden_size, eps=config.layer_norm_epsilon) + self.mixer = MambaMixer(config, layer_idx=layer_idx) + + def forward( + self, + hidden_states, + cache_params: Optional[MambaCache] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.LongTensor] = None, + ): + residual = hidden_states + hidden_states = self.norm(hidden_states) + if self.residual_in_fp32: + residual = residual.to(torch.float32) + + hidden_states = self.mixer( + hidden_states, cache_params=cache_params, cache_position=cache_position, attention_mask=attention_mask + ) + hidden_states = residual + hidden_states + return hidden_states + + +class MambaPreTrainedModel(PreTrainedModel): + """ + An abstract class to handle weights initialization and a simple interface for downloading and loading pretrained + models. + """ + + config_class = MambaConfig + base_model_prefix = "backbone" + _no_split_modules = ["MambaBlock", "MambaMixer"] + supports_gradient_checkpointing = True + _is_stateful = True + + def _init_weights(self, module): + """Initialize the weights.""" + if isinstance(module, MambaMixer): + module.A_log._no_weight_decay = True + module.D._no_weight_decay = True + + dt_init_std = self.config.time_step_rank**-0.5 * self.config.time_step_scale + if self.config.time_step_init_scheme == "constant": + nn.init.constant_(module.dt_proj.weight, dt_init_std) + elif self.config.time_step_init_scheme == "random": + nn.init.uniform_(module.dt_proj.weight, -dt_init_std, dt_init_std) + + dt = torch.exp( + torch.rand(self.config.intermediate_size) + * (math.log(self.config.time_step_max) - math.log(self.config.time_step_min)) + + math.log(self.config.time_step_min) + ).clamp(min=self.config.time_step_floor) + # # Inverse of softplus: https://github.com/pytorch/pytorch/issues/72759 + inv_dt = dt + torch.log(-torch.expm1(-dt)) + with torch.no_grad(): + module.dt_proj.bias.copy_(inv_dt) + module.dt_proj.bias._no_reinit = True + + if isinstance(module, nn.Linear): + if module.bias is not None: + if not getattr(module.bias, "_no_reinit", False): + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, std=self.config.initializer_range) + + if self.config.rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. 
+ # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["out_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + nn.init.kaiming_uniform_(p, a=math.sqrt(5)) + with torch.no_grad(): + p /= math.sqrt(self.config.num_hidden_layers) + + +@dataclass +class MambaOutput(ModelOutput): + """ + Class for the MAMBA model outputs. + + Args: + last_hidden_state (`torch.FloatTensor` of shape `(batch_size, sequence_length, hidden_size)`): + Sequence of hidden-states at the output of the last layer of the model. + cache_params (`MambaCache`): + The state of the model at the last time step. Can be used in a forward method with the next `input_ids` to + avoid providing the old `input_ids`. + + Includes both the State space model state matrices after the selective scan, and the Convolutional states + hidden_states (`tuple(torch.FloatTensor)`, *optional*, + returned when `output_hidden_states=True` is passed or when `config.output_hidden_states=True`): + Tuple of `torch.FloatTensor` (one for the output of the embeddings, if the model has an embedding layer, + + one for the output of each layer) of shape `(batch_size, sequence_length, hidden_size)`. + + Hidden-states of the model at the output of each layer plus the optional initial embedding outputs. + """ + + last_hidden_state: Optional[torch.FloatTensor] = None + cache_params: Optional[MambaCache] = None + hidden_states: Optional[Tuple[torch.FloatTensor]] = None + + +@dataclass +class MambaCausalLMOutput(ModelOutput): + """ + Base class for causal language model (or autoregressive) outputs. + + Args: + loss (`torch.FloatTensor` of shape `(1,)`, *optional*, returned when `labels` is provided): + Language modeling loss (for next-token prediction). + logits (`torch.FloatTensor` of shape `(batch_size, sequence_length, config.vocab_size)`): + Prediction scores of the language modeling head (scores for each vocabulary token before SoftMax). + cache_params (`MambaCache`): + The state of the model at the last time step. Can be used in a forward method with the next `input_ids` to + avoid providing the old `input_ids`. + + Includes both the State space model state matrices after the selective scan, and the Convolutional states + hidden_states (`tuple(torch.FloatTensor)`, *optional*, + returned when `output_hidden_states=True` is passed or when `config.output_hidden_states=True`): + Tuple of `torch.FloatTensor` (one for the output of the embeddings, if the model has an embedding layer, + + one for the output of each layer) of shape `(batch_size, sequence_length, hidden_size)`. + + Hidden-states of the model at the output of each layer plus the optional initial embedding outputs. 
+ """ + + loss: Optional[torch.FloatTensor] = None + logits: Optional[torch.FloatTensor] = None + cache_params: Optional[MambaCache] = None + hidden_states: Optional[Tuple[torch.FloatTensor]] = None + + +class MambaModel(MambaPreTrainedModel): + def __init__(self, config): + super().__init__(config) + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size) + self.layers = nn.ModuleList([MambaBlock(config, layer_idx=idx) for idx in range(config.num_hidden_layers)]) + + self.gradient_checkpointing = False + self.norm_f = RMSNorm(config.hidden_size, eps=config.layer_norm_epsilon) + # Initialize weights and apply final processing + self._register_load_state_dict_pre_hook(self.load_hook) + self.post_init() + + def load_hook(self, state_dict, prefix, *args): + for k in state_dict: + if "embedding." in k: + state_dict[k.replace("embedding.", "embeddings.")] = state_dict.pop(k) + break + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, new_embeddings): + self.embeddings = new_embeddings + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + inputs_embeds: Optional[torch.LongTensor] = None, + cache_params: Optional[MambaCache] = None, + use_cache: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.LongTensor] = None, + ) -> Union[Tuple, MambaOutput]: + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + if (input_ids is None) ^ (inputs_embeds is not None): # ^ is python for xor + raise ValueError( + "You cannot specify both input_ids and inputs_embeds at the same time, and must specify either one" + ) + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + + if self.gradient_checkpointing and self.training and use_cache: + use_cache = False + + if use_cache: + if cache_params is None: + cache_params = MambaCache( + self.config, inputs_embeds.size(0), device=inputs_embeds.device, dtype=inputs_embeds.dtype + ) + cache_position = torch.arange(0, self.config.conv_kernel, device=inputs_embeds.device) + elif cache_position is None: + # cases when we do manual forward instead of using `model.generate` which will initiate + # `cache_position` and makes sure it is not None, throw error here instead of doing some + # hack to conjecture the current cache position + raise ValueError( + "You have to specify the `cache_position` manually when `use_cache=True` and `cache_params` is passed, " + "you don't have to pass a `cache_params` if you are in prefilling stage because in that case it will " + "be initialized for you automatically" + ) + else: + cache_params = None + + hidden_states = inputs_embeds + all_hidden_states = () if output_hidden_states else None + for mixer_block in self.layers: + if self.gradient_checkpointing and self.training: + hidden_states = self._gradient_checkpointing_func( + mixer_block.__call__, hidden_states, cache_params, cache_position, attention_mask + ) + else: + hidden_states = mixer_block( + hidden_states, + cache_params=cache_params, + cache_position=cache_position, + attention_mask=attention_mask, + ) + + if output_hidden_states: + all_hidden_states = all_hidden_states 
+ (hidden_states,) + + hidden_states = self.norm_f(hidden_states) + + if output_hidden_states: + all_hidden_states = all_hidden_states + (hidden_states,) + + if not return_dict: + return tuple(v for v in [hidden_states, cache_params, all_hidden_states] if v is not None) + + return MambaOutput( + last_hidden_state=hidden_states, + cache_params=cache_params if use_cache else None, + hidden_states=all_hidden_states, + ) + + +class MambaForCausalLM(MambaPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.backbone = MambaModel(config) + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + # Initialize weights and apply final processing + self.post_init() + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def get_input_embeddings(self): + return self.backbone.get_input_embeddings() + + def set_input_embeddings(self, new_embeddings): + return self.backbone.set_input_embeddings(new_embeddings) + + def _update_model_kwargs_for_generation( + self, outputs: ModelOutput, + model_kwargs: Dict[str, Any], + num_new_tokens: int = 1, + **kwargs + ) -> Dict[str, Any]: + model_kwargs["cache_params"] = outputs.get("cache_params", None) + if ( + model_kwargs.get("use_cache", True) + and "cache_position" in model_kwargs + and model_kwargs["cache_position"] is not None + ): + model_kwargs["cache_position"] = model_kwargs["cache_position"][-1:] + num_new_tokens + + if "attention_mask" in model_kwargs: + attention_mask = model_kwargs["attention_mask"] + model_kwargs["attention_mask"] = torch.cat( + [attention_mask, attention_mask.new_ones((attention_mask.shape[0], 1))], dim=-1 + ) + + return model_kwargs + + def prepare_inputs_for_generation( + self, + input_ids, + inputs_embeds=None, + use_cache=None, + cache_params: Optional[MambaCache] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.LongTensor] = None, + num_logits_to_keep: Optional[int] = None, + **kwargs, + ): + if use_cache: + # `cache_position` should have been initialized in `generate` + if cache_position is None: + raise ValueError( + "`cache_position` should not be None as it should have been initialized in " + "`model.generate`, you are responsible for passing in a valid `cache_position` if " + "you are calling `prepare_inputs_for_generation` directly with `use_cache=True`" + ) + if cache_position[0] > 0: + input_ids = input_ids[:, -1].unsqueeze(-1) + + if attention_mask is not None: + attention_mask = None + + else: + # we initialize the `cache_position` to full size of `conv_states` at prefill stage + # considering padding will be applied when input length is shorter, and truncation + # will be applied when it is longer, so it will be equivalent to always have it match + # the length of `cache_params.conv_states`, which is `config.conv_kernel` + cache_position = torch.arange(0, self.config.conv_kernel, device=input_ids.device) + + if inputs_embeds is not None and cache_params is None: + model_inputs = {"inputs_embeds": inputs_embeds} + else: + model_inputs = {"input_ids": input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'cache_params': cache_params, + 'use_cache': use_cache, + 'cache_position': cache_position, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + 
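+        # The conv/SSM states in `cache_params` play the role that a KV cache plays for attention models:
+        # past tokens never need to be re-fed, which is why decoding above passes only the newest token.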
return model_inputs + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.LongTensor] = None, + inputs_embeds: Optional[torch.FloatTensor] = None, + cache_params: Optional[MambaCache] = None, + labels: Optional[torch.LongTensor] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + use_cache: Optional[bool] = None, + cache_position: Optional[torch.Tensor] = None, + num_logits_to_keep: Optional[int] = 0, + **kwargs, # for now we need this for generation + ) -> Union[Tuple, MambaCausalLMOutput]: + r""" + labels (`torch.LongTensor` of shape `(batch_size, sequence_length)`, *optional*): + Labels for language modeling. Note that the labels **are shifted** inside the model, i.e. you can set + `labels = input_ids` Indices are selected in `[-100, 0, ..., config.vocab_size]` All labels set to `-100` + are ignored (masked), the loss is only computed for labels in `[0, ..., config.vocab_size]` + """ + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + mamba_outputs = self.backbone( + input_ids, + cache_params=cache_params, + inputs_embeds=inputs_embeds, + output_hidden_states=output_hidden_states, + return_dict=return_dict, + use_cache=use_cache, + cache_position=cache_position, + attention_mask=attention_mask, + ) + hidden_states = mamba_outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + mamba_outputs[1:] + return (loss,) + output if loss is not None else output + + return MambaCausalLMOutput( + loss=loss, + logits=logits, + cache_params=mamba_outputs.cache_params, + hidden_states=mamba_outputs.hidden_states, + ) diff --git a/fla/models/mamba2/__init__.py b/fla/models/mamba2/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..0b8ac62a700590e06d1e524979b2f21353aa5188 --- /dev/null +++ b/fla/models/mamba2/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.mamba2.configuration_mamba2 import Mamba2Config +from fla.models.mamba2.modeling_mamba2 import Mamba2ForCausalLM, Mamba2Model + +AutoConfig.register(Mamba2Config.model_type, Mamba2Config, True) +AutoModel.register(Mamba2Config, Mamba2Model, True) +AutoModelForCausalLM.register(Mamba2Config, Mamba2ForCausalLM, True) + + +__all__ = ['Mamba2Config', 'Mamba2ForCausalLM', 'Mamba2Model'] diff --git a/fla/models/mamba2/configuration_mamba2.py b/fla/models/mamba2/configuration_mamba2.py new file mode 100644 index 0000000000000000000000000000000000000000..264c95bfa2a8ba697c8e2d70ab65758b70ca8888 --- /dev/null +++ 
b/fla/models/mamba2/configuration_mamba2.py
@@ -0,0 +1,168 @@
+# Copyright 2024 The HuggingFace Inc. team.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+"""MAMBA2 configuration"""
+
+import math
+
+from transformers.configuration_utils import PretrainedConfig
+
+
+class Mamba2Config(PretrainedConfig):
+    """
+    This is the configuration class to store the configuration of a [`Mamba2Model`]. It is used to instantiate a MAMBA2
+    model according to the specified arguments, defining the model architecture. Instantiating a configuration with the
+    defaults will yield a similar configuration to that of the MAMBA2
+    [state-spaces/mamba2-2.8b](https://huggingface.co/state-spaces/mamba2-2.8b) architecture.
+
+    Configuration objects inherit from [`PretrainedConfig`] and can be used to control the model outputs. Read the
+    documentation from [`PretrainedConfig`] for more information.
+
+
+    Args:
+        num_heads (`int`, *optional*, defaults to 64):
+            Number of heads for the evolution matrices of mamba 2.
+        head_dim (`int`, *optional*, defaults to 64):
+            Dimension of each head.
+        vocab_size (`int`, *optional*, defaults to 32000):
+            Vocabulary size of the MAMBA2 model. Defines the number of different tokens that can be represented by the
+            `inputs_ids` passed when calling [`Mamba2Model`].
+        hidden_size (`int`, *optional*, defaults to 2048):
+            Dimensionality of the embeddings and hidden states.
+        state_size (`int`, *optional*, defaults to 128): Shape of the state space latents.
+        num_hidden_layers (`int`, *optional*, defaults to 48):
+            Number of hidden layers in the model.
+        layer_norm_epsilon (`float`, *optional*, defaults to 1e-05):
+            The epsilon to use in the layer normalization layers.
+        pad_token_id (`int`, *optional*, defaults to 0):
+            Padding token id.
+        bos_token_id (`int`, *optional*, defaults to 1):
+            The id of the beginning of sentence token in the vocabulary.
+        eos_token_id (`int`, *optional*, defaults to 2):
+            The id of the end of sentence token in the vocabulary.
+        expand (`int`, *optional*, defaults to 2): Expanding factor used to determine the intermediate size.
+        conv_kernel (`int`, *optional*, defaults to 4): Size of the convolution kernel.
+        n_groups (`int`, *optional*, defaults to 1):
+            Number of groups for the evolution matrices of mamba 2.
+        use_bias (`bool`, *optional*, defaults to `False`):
+            Whether or not to use bias in ["in_proj", "out_proj"] of the mixer block.
+        use_conv_bias (`bool`, *optional*, defaults to `True`):
+            Whether or not to use bias in the convolution layer of the mixer block.
+        hidden_act (`str`, *optional*, defaults to `"silu"`):
+            The non-linear activation function (function or string) in the decoder.
+        initializer_range (`float`, *optional*, defaults to 0.1):
+            The standard deviation of the truncated_normal_initializer for initializing all weight matrices.
+        residual_in_fp32 (`bool`, *optional*, defaults to `True`):
+            Whether or not residuals should be in `float32`.
+            If set to `False` residuals will keep the same `dtype` as the rest of the model
+        time_step_rank (`Union[int,str]`, *optional*, defaults to `"auto"`):
+            Rank of the discretization projection matrix.
+            `"auto"` means that it will default to `math.ceil(self.hidden_size / 16)`
+        time_step_min (`float`, *optional*, defaults to 0.001):
+            Minimum `time_step` used to bound `dt_proj.bias`.
+        time_step_max (`float`, *optional*, defaults to 0.1):
+            Maximum `time_step` used to bound `dt_proj.bias`.
+        time_step_floor (`float`, *optional*, defaults to 0.0001):
+            Minimum clamping value of the `dt_proj.bias` layer initialization.
+        time_step_limit (`tuple`, *optional*, defaults to `(0.0, inf)`):
+            Accepted range of time step values.
+        rescale_prenorm_residual (`bool`, *optional*, defaults to `True`):
+            Whether or not to rescale `out_proj` weights when initializing.
+        use_cache (`bool`, *optional*, defaults to `True`):
+            Whether or not the cache should be used.
+        rms_norm (`bool`, *optional*, defaults to `True`):
+            Whether to use RMS norm or not.
+        chunk_size (`int`, *optional*, defaults to 256):
+            Size of the chunks that will comprise the sequence.
+        tie_word_embeddings (`bool`, *optional*, defaults to `False`):
+            Whether to tie word embeddings or not.
+    """

+    model_type = "mamba2"
+
+    def __init__(
+        self,
+        num_heads: int = 64,
+        head_dim: int = 64,
+        vocab_size: int = 32000,
+        hidden_size: int = 2048,
+        state_size: int = 128,
+        num_hidden_layers: int = 48,
+        layer_norm_epsilon: float = 1e-5,
+        pad_token_id: int = 0,
+        bos_token_id: int = 1,
+        eos_token_id: int = 2,
+        expand: int = 2,
+        conv_kernel: int = 4,
+        n_groups: int = 1,
+        use_bias: bool = False,
+        use_conv_bias: bool = True,
+        hidden_act: str = "silu",
+        initializer_range: float = 0.1,
+        residual_in_fp32: bool = True,
+        time_step_rank: str = "auto",
+        time_step_min: float = 0.001,
+        time_step_max: float = 0.1,
+        time_step_floor: float = 1e-4,
+        time_step_limit=(0.0, float("inf")),
+        rescale_prenorm_residual: bool = True,
+        use_cache: bool = True,
+        rms_norm: bool = True,
+        chunk_size: int = 256,
+        fuse_cross_entropy: bool = True,
+        tie_word_embeddings: bool = False,
+        **kwargs,
+    ):
+        self.vocab_size = vocab_size
+        self.hidden_size = hidden_size
+        self.state_size = state_size
+        self.num_hidden_layers = num_hidden_layers
+        self.layer_norm_epsilon = layer_norm_epsilon
+        self.conv_kernel = conv_kernel
+        self.expand = expand
+
+        self.bos_token_id = bos_token_id
+        self.eos_token_id = eos_token_id
+        self.pad_token_id = pad_token_id
+        self.use_bias = use_bias
+        self.use_conv_bias = use_conv_bias
+        self.hidden_act = hidden_act
+        self.initializer_range = initializer_range
+        self.time_step_rank = (
+            math.ceil(self.hidden_size / 16)
+            if time_step_rank == "auto"
+            else time_step_rank
+        )
+        self.time_step_min = time_step_min
+        self.time_step_max = time_step_max
+        self.time_step_floor = time_step_floor
+        self.rescale_prenorm_residual = rescale_prenorm_residual
+        self.residual_in_fp32 = residual_in_fp32
+        self.use_cache = use_cache
+        self.n_groups = n_groups
+        self.num_heads = num_heads
+        self.head_dim = head_dim
+        self.rms_norm = rms_norm
+        self.chunk_size = chunk_size
+        self.time_step_limit = time_step_limit
+        self.fuse_cross_entropy = fuse_cross_entropy
+        self.tie_word_embeddings = tie_word_embeddings
+
+        super().__init__(
+            bos_token_id=bos_token_id,
+            eos_token_id=eos_token_id,
+            pad_token_id=pad_token_id,
+            tie_word_embeddings=tie_word_embeddings,
+            **kwargs,
+        )
diff --git a/fla/models/mamba2/modeling_mamba2.py
b/fla/models/mamba2/modeling_mamba2.py new file mode 100644 index 0000000000000000000000000000000000000000..79431927a54a3d6826a4a7f9a51007dac54357eb --- /dev/null +++ b/fla/models/mamba2/modeling_mamba2.py @@ -0,0 +1,1030 @@ +# Copyright 2024 state-spaces/mamba2 org and HuggingFace Inc. team. +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +"""PyTorch MAMBA2 model.""" + +import math +import warnings +from dataclasses import dataclass +from typing import Optional, Tuple, Union + +import torch +import torch.utils.checkpoint +from torch import nn +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import ModelOutput, logging + +from fla.models.mamba2.configuration_mamba2 import Mamba2Config +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.layernorm_gated import RMSNormGated + +logger = logging.get_logger(__name__) + +with warnings.catch_warnings(): + warnings.simplefilter('ignore') + try: + from mamba_ssm.ops.triton.selective_state_update import \ + selective_state_update + from mamba_ssm.ops.triton.ssd_combined import ( + mamba_chunk_scan_combined, mamba_split_conv1d_scan_combined) + except ImportError: + ( + selective_state_update, + mamba_chunk_scan_combined, + mamba_split_conv1d_scan_combined, + ) = (None, None, None) + try: + from causal_conv1d import causal_conv1d_fn, causal_conv1d_update + except ImportError: + causal_conv1d_update, causal_conv1d_fn = None, None + is_fast_path_available = all(( + selective_state_update, + causal_conv1d_fn, + causal_conv1d_update + )) + + +def pad_tensor_by_size(input_tensor: torch.Tensor, pad_size: int): + """ + Padding x tensor with `pad_size` on the seq_len dim (dim=1) + + Assumes that we only have tensors of either size 4 or 3 + """ + pad_shape = (0, 0, 0, 0, 0, pad_size, 0, 0) if len(input_tensor.shape) == 4 else (0, 0, 0, pad_size, 0, 0) + + return torch.nn.functional.pad(input_tensor, pad_shape, mode="constant", value=0) + + +def reshape_into_chunks(input_tensor, pad_size, chunk_size): + """ + Padding input_tensor with `pad_size` on the seq_len dim (dim=1) and + simultaneously splitting it into chunk sequences. + + Assumes that we only have tensors of either size 4 or 3 + """ + # [bsz, seq_len, ...] -> [bsz, seq_len multiple of chunk_size, ...] + input_tensor = pad_tensor_by_size(input_tensor, pad_size) + + if len(input_tensor.shape) == 3: + # [bsz, seq_len multiple of chunk_size, num_heads] -> [bsz, -1, chunk_size, num_heads] + return input_tensor.reshape(input_tensor.shape[0], -1, chunk_size, input_tensor.shape[2]) + else: + # [bsz, seq_len multiple of chunk_size, num_heads, head_dim or state_size] -> + # [bsz, -1, chunk_size, num_heads, head_dim or state_size] + return input_tensor.reshape( + input_tensor.shape[0], -1, chunk_size, input_tensor.shape[2], input_tensor.shape[3] + ) + + +def segment_sum(input_tensor): + """ + More stable segment sum calculation. 
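+    In effect, for a length-`L` input `a` along the last dimension, `out[i, j] = a[j+1] + ... + a[i]` for `i >= j`
+    and `-inf` above the diagonal, so `exp(segment_sum(A))` gives the cumulative decay between any two positions of a chunk.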
Uses cumulative sums and masking instead of direct subtractions.
+    """
+    chunk_size = input_tensor.size(-1)
+    # 1. expand input tensor to have an additional dimension and repeat along that dimension
+    # [..., chunk_size] -> [..., chunk_size, chunk_size]
+    input_tensor = input_tensor[..., None].expand(*input_tensor.size(), chunk_size)
+    # 2. create a strictly lower triangular mask so that elements on and above the diagonal are zeroed out
+    mask = torch.tril(torch.ones(chunk_size, chunk_size, device=input_tensor.device, dtype=torch.bool), diagonal=-1)
+    input_tensor = input_tensor.masked_fill(~mask, 0)
+    # 3. compute actual cumsum
+    tensor_segsum = torch.cumsum(input_tensor, dim=-2)
+
+    # 4. apply mask to keep only the lower triangular part of the cumulative sum result (incl diagonal this time)
+    mask = torch.tril(torch.ones(chunk_size, chunk_size, device=input_tensor.device, dtype=torch.bool), diagonal=0)
+    tensor_segsum = tensor_segsum.masked_fill(~mask, -torch.inf)
+    return tensor_segsum
+
+
+class Mamba2Cache:
+    """
+    Arguments:
+        config: Mamba2Config
+        batch_size: int
+        dtype: torch.dtype
+        device: torch.device
+
+    Attributes:
+        seqlen_offset: int
+        dtype: torch.dtype
+        conv_states: Dict[int, torch.Tensor]  # layer_idx -> [batch_size, intermediate_size + 2 * n_groups * state_size, conv_kernel_size]
+        ssm_states: Dict[int, torch.Tensor]  # layer_idx -> [batch_size, num_heads, head_dim, state_size]
+    """
+
+    def __init__(
+        self,
+        config: Mamba2Config,
+        batch_size: int,
+        dtype: torch.dtype = torch.float16,
+        device: Optional[str] = None,
+    ):
+        self.seqlen_offset = 0
+        self.dtype = dtype
+        self.conv_kernel_size = config.conv_kernel
+        self.intermediate_size = int(config.expand * config.hidden_size)
+
+        self.conv_states = {
+            i: torch.zeros(
+                batch_size,
+                self.intermediate_size + 2 * config.n_groups * config.state_size,
+                self.conv_kernel_size,
+                device=device,
+                dtype=dtype,
+            )
+            for i in range(config.num_hidden_layers)
+        }
+        self.ssm_states = {
+            i: torch.zeros(
+                batch_size,
+                config.num_heads,
+                config.head_dim,
+                config.state_size,
+                device=device,
+                dtype=dtype,
+            )
+            for i in range(config.num_hidden_layers)
+        }
+        self.activation = config.hidden_act
+        self.act = ACT2FN[config.hidden_act]
+
+    def update_conv_state(
+        self,
+        layer_idx: int,
+        new_conv_state: torch.Tensor,
+        cache_position: torch.LongTensor,
+    ) -> torch.Tensor:
+        conv_state = self.conv_states[layer_idx]
+        cache_position = cache_position.clamp(0, self.conv_kernel_size - 1)
+
+        conv_state = conv_state.roll(shifts=-1, dims=-1)
+        conv_state[:, :, cache_position] = new_conv_state.to(conv_state.device)
+        self.conv_states[layer_idx].zero_()
+        self.conv_states[layer_idx] += conv_state
+        return self.conv_states[layer_idx]
+
+    def reset(self):
+        # `conv_states` and `ssm_states` are dicts of per-layer tensors, so zero each tensor in place
+        for conv_state in self.conv_states.values():
+            conv_state.zero_()
+        for ssm_state in self.ssm_states.values():
+            ssm_state.zero_()
+
+
+class Mamba2Mixer(nn.Module):
+    """
+    Compute ∆, A, B, C, and D the state space parameters and compute the `contextualized_states`.
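+    Unlike Mamba's per-channel `A` of shape `[intermediate_size, state_size]`, `A` here is a single scalar decay per
+    head (`A_log` has shape `[num_heads]`), which is what the chunked state-space-duality path in `torch_forward` exploits.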
+ A, D are input independent (see Mamba paper [1] Section 3.5.2 "Interpretation of A" for why A isn't selective) + ∆, B, C are input-dependent (this is a key difference between Mamba and the linear time invariant S4, + and is why Mamba is called **selective** state spaces) + """ + + def __init__(self, config: Mamba2Config, layer_idx: int): + super().__init__() + self.num_heads = config.num_heads + self.hidden_size = config.hidden_size + self.ssm_state_size = config.state_size + self.conv_kernel_size = config.conv_kernel + self.intermediate_size = int(config.expand * self.hidden_size) + self.time_step_rank = int(config.time_step_rank) + self.layer_idx = layer_idx + self.use_conv_bias = config.use_conv_bias + self.activation = config.hidden_act + self.act = ACT2FN[config.hidden_act] + + self.layer_norm_epsilon = config.layer_norm_epsilon + self.rms_norm = config.rms_norm + + self.n_groups = config.n_groups + self.head_dim = config.head_dim + self.chunk_size = config.chunk_size + + self.time_step_limit = config.time_step_limit + self.time_step_min = config.time_step_min + self.time_step_max = config.time_step_max + + self.conv_dim = self.intermediate_size + 2 * self.n_groups * self.ssm_state_size + self.conv1d = nn.Conv1d( + in_channels=self.conv_dim, + out_channels=self.conv_dim, + bias=config.use_conv_bias, + kernel_size=config.conv_kernel, + groups=self.conv_dim, + padding=config.conv_kernel - 1, + ) + + # projection of the input hidden states + projection_size = self.intermediate_size + self.conv_dim + self.num_heads + self.in_proj = nn.Linear( + self.hidden_size, + projection_size, + bias=config.use_bias, + ) + # selective projection used to make dt, B and C input dependant + + # time step projection (discretization) + # instantiate once and copy inv_dt in init_weights of PretrainedModel + self.dt_bias = nn.Parameter(torch.ones(self.num_heads)) + + # S4D real initialization. These are not discretized! + # The core is to load them, compute the discrete states, then write the updated state. Keeps the memory bounded + A = torch.arange(1, self.num_heads + 1) + self.A_log = nn.Parameter(torch.log(A)) + self.A_log._no_weight_decay = True + self.norm = RMSNormGated( + self.intermediate_size, eps=self.layer_norm_epsilon, norm_before_gate=False + ) + self.D = nn.Parameter(torch.ones(self.num_heads)) + self.D._no_weight_decay = True + + self.out_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=config.use_bias) + self.use_bias = config.use_bias + + if not is_fast_path_available: + logger.warning_once( + "The fast path is not available because one of " + "`(selective_state_update, causal_conv1d_fn, causal_conv1d_update)` is None. " + "Falling back to the naive implementation. 
" + "To install follow https://github.com/state-spaces/mamba/#installation and" + "https://github.com/Dao-AILab/causal-conv1d" + ) + + def cuda_kernels_forward( + self, + hidden_states: torch.Tensor, + cache_params: Optional[Mamba2Cache] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, + ): + # set up dimensions for reshapes later + batch_size, seq_len, _ = hidden_states.shape + groups_time_state_size = self.n_groups * self.ssm_state_size + d_to_remove = 2 * self.intermediate_size + 2 * self.n_groups * self.ssm_state_size + self.num_heads + + # getting projected states from cache if it exists + if cache_params is not None and cache_params.seqlen_offset > 0: + in_projected_states = self.in_proj(hidden_states.squeeze(1)) # (B 2D) + d_mlp = (in_projected_states.shape[-1] - d_to_remove) // 2 + split_projection_dim = [d_mlp, d_mlp, self.intermediate_size, self.conv_dim, self.num_heads] + _, _, gate, hidden_states_B_C, dt = torch.split(in_projected_states, split_projection_dim, dim=-1) + + hidden_states_B_C = causal_conv1d_update( + hidden_states_B_C, + cache_params.conv_states[self.layer_idx], + self.conv1d.weight.squeeze(1), + self.conv1d.bias, + self.activation, + ) + + hidden_states, B, C = torch.split( + hidden_states_B_C, + [ + self.intermediate_size, + groups_time_state_size, + groups_time_state_size, + ], + dim=-1, + ) + A = -torch.exp(self.A_log.float()) # (nheads,) + + A = A[:, None, ...][:, :, None].expand(-1, self.head_dim, self.ssm_state_size).to(dtype=torch.float32) + dt = dt[:, :, None].expand(-1, -1, self.head_dim) + dt_bias = self.dt_bias[:, None, ...].expand(-1, self.head_dim) + D = self.D[:, None, ...].expand(-1, self.head_dim) + B = B.view(batch_size, self.n_groups, B.shape[1] // self.n_groups) + C = C.view(batch_size, self.n_groups, C.shape[1] // self.n_groups) + hidden_states_reshaped = hidden_states.view(batch_size, self.num_heads, self.head_dim) + + hidden_states = selective_state_update( + cache_params.ssm_states[self.layer_idx], + hidden_states_reshaped, + dt, + A, + B, + C, + D, + z=None, + dt_bias=dt_bias, + dt_softplus=True, + ) + hidden_states = hidden_states.view(batch_size, self.num_heads * self.head_dim) + hidden_states = self.norm(hidden_states, gate) + out = self.out_proj(hidden_states)[:, None, ...] + # if no cache is found, calling the kernel + else: + if attention_mask is not None and attention_mask.shape[1] > 1 and attention_mask.shape[0] > 1: + # tune out hidden states for pad tokens, see https://github.com/state-spaces/mamba/issues/66 + dtype = hidden_states.dtype + hidden_states = (hidden_states * attention_mask[:, :, None]).to(dtype) + # 1. 
Gated MLP's linear projection + projected_states = self.in_proj(hidden_states) + A = -torch.exp(self.A_log.float()) # (num_heads) or (intermediate_size, state_size) + dt_limit_kwargs = {} if self.time_step_limit == (0.0, float("inf")) else {"dt_limit": self.time_step_limit} + + if self.training and cache_params is None: + out, ssm_state = mamba_split_conv1d_scan_combined( + projected_states, + self.conv1d.weight.squeeze(1), + self.conv1d.bias, + self.dt_bias, + A, + D=self.D, + chunk_size=self.chunk_size, + seq_idx=None, # was seq_idx + activation=self.activation, + rmsnorm_weight=self.norm.weight, + rmsnorm_eps=self.norm.eps, + outproj_weight=self.out_proj.weight, + outproj_bias=self.out_proj.bias, + headdim=self.head_dim, + ngroups=self.n_groups, + norm_before_gate=False, + return_final_states=True, + **dt_limit_kwargs, + ) + + else: + gate, hidden_states_B_C, time_step = torch.split( + projected_states, + [self.intermediate_size, self.conv_dim, self.num_heads], + dim=-1, + ) + + # 1D Convolution + if causal_conv1d_fn is None or self.activation not in ["silu", "swish"]: + hidden_states_B_C = self.act( + self.conv1d(hidden_states_B_C.transpose(1, 2)).transpose(1, 2)[:, :seq_len] + ) # (B, L, self.d_inner + 2 * ngroups * d_state) + else: + hidden_states_B_C = causal_conv1d_fn( + x=hidden_states_B_C.transpose(1, 2), + weight=self.conv1d.weight.squeeze(1), + bias=self.conv1d.bias, + activation=self.activation, + ).transpose(1, 2)[:, :seq_len] + hidden_states, B, C = torch.split( + hidden_states_B_C, + [self.intermediate_size, groups_time_state_size, groups_time_state_size], + dim=-1, + ) + if attention_mask is not None and attention_mask.shape[1] > 1 and attention_mask.shape[0] > 1: + # tune out hidden states for pad tokens, see https://github.com/state-spaces/mamba/issues/66 + dtype = hidden_states.dtype + hidden_states = (hidden_states * attention_mask[:, :, None]).to(dtype) + scan_output, ssm_state = mamba_chunk_scan_combined( + hidden_states.view(batch_size, seq_len, -1, self.head_dim), + time_step, + A, + B.view(batch_size, seq_len, self.n_groups, -1), + C.view(batch_size, seq_len, self.n_groups, -1), + chunk_size=self.chunk_size, + D=self.D, + z=None, + seq_idx=None, + return_final_states=True, + dt_bias=self.dt_bias, + dt_softplus=True, + **dt_limit_kwargs, + ) + if ssm_state is not None and cache_params is not None: + cache_params.ssm_states[self.layer_idx].copy_(ssm_state) + scan_output = scan_output.view(batch_size, seq_len, -1) + # Multiply "gate" branch and apply extra normalization layer + scan_output = self.norm(scan_output, gate) + out = self.out_proj(scan_output) + return out + + # fmt: off + def torch_forward( + self, + input_states, + cache_params: Optional[Mamba2Cache] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None + ): + batch_size, seq_len, _ = input_states.shape + dtype = input_states.dtype + # Gated MLP's linear projection + projected_states = self.in_proj(input_states.squeeze(1)) + d_mlp = (projected_states.shape[-1] - 2 * self.intermediate_size - + 2 * self.n_groups * self.ssm_state_size - self.num_heads) // 2 + _, _, gate, hidden_states, dt = projected_states.split( + [d_mlp, d_mlp, self.intermediate_size, self.conv_dim, self.num_heads], dim=-1 + ) + + # Convolution sequence transformation + if cache_params is not None: + ssm_state = cache_params.ssm_states[self.layer_idx].clone() + ssm_state = ssm_state.to(hidden_states.device) + if cache_params.seqlen_offset > 0: + # [batch, intermediate_size, 
conv_kernel_size] + conv_state = cache_params.conv_states[self.layer_idx] + conv_state = torch.roll(conv_state, shifts=-1, dims=-1) + # handle batched generation - states are copied through + conv_state[:, :, -1] = hidden_states[:, 0, :] if hidden_states.ndim == 3 else hidden_states + cache_params.conv_states[self.layer_idx].copy_(conv_state) + hidden_states = torch.sum(conv_state.to(projected_states.device) * self.conv1d.weight[:, 0, :], dim=-1) + if self.use_conv_bias: + hidden_states += self.conv1d.bias + # [batch, 1, intermediate_size] : decoding + hidden_states = self.act(hidden_states).to(dtype)[:, None, ...] + else: + hidden_states = hidden_states.transpose(1, 2) + conv_state = nn.functional.pad( + hidden_states, + (self.conv_kernel_size - hidden_states.shape[-1], 0) + ) + cache_params.conv_states[self.layer_idx].copy_(conv_state) + # [batch, intermediate_size, seq_len] + hidden_states = self.act(self.conv1d(hidden_states).transpose(1, 2))[:, :seq_len, :] + if attention_mask is not None and attention_mask.shape[1] > 1 and attention_mask.shape[0] > 1: + dtype = hidden_states.dtype + # tune out hidden states for pad tokens, see https://github.com/state-spaces/mamba/issues/66 + hidden_states = (hidden_states * attention_mask[:, :, None]).to(dtype) + else: + ssm_state = torch.zeros( + (batch_size, self.num_heads, self.head_dim, self.ssm_state_size), + device=hidden_states.device, dtype=dtype + ) + hidden_states = self.act(self.conv1d(hidden_states.transpose(1, 2))[..., :seq_len].transpose(1, 2)) + hidden_states, B, C = torch.split( + hidden_states, + [self.intermediate_size, self.n_groups * self.ssm_state_size, self.n_groups * self.ssm_state_size], + dim=-1 + ) + A = -torch.exp(self.A_log.float()) # [num_heads] + if cache_params is not None and cache_params.seqlen_offset > 0: + # Note: there is no need to pad parameter matrices here, as there is just one new token + # for batched generation + dt = dt[:, None, ...] if dt.ndim == 2 else dt[:, 0, :][:, None, ...] 
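+                # Single-token decoding step: nothing needs to be scanned, so the cached state is advanced in
+                # closed form as h <- exp(dt*A)*h + dt*B*x and read out as y = C*h + D*x in the lines below.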
+ dt = dt.transpose(1, 2).expand(batch_size, dt.shape[-1], self.head_dim) + # [num_heads] -> [num_heads, head_dim] + dt_bias = self.dt_bias[..., None].expand(self.dt_bias.shape[0], self.head_dim) + + dt = torch.nn.functional.softplus(dt + dt_bias.to(dt.dtype)) + dt = torch.clamp(dt, self.time_step_min) + A = A[..., None, None].expand(self.num_heads, self.head_dim, self.ssm_state_size).to(dtype=torch.float32) + # [bsz, num_heads, head_dim, state_size] + dA = torch.exp(dt[..., None] * A) + + # Discretize B + # [bsz, n_groups * state_size] -> [bsz, n_groups, 1, state_size] -> + # -> [bsz, n_groups, group to head repetition factor, state_size] -> [bsz, num_heads, state_size] + B = B.reshape(batch_size, self.n_groups, -1)[..., None, :] + B = B.expand(batch_size, self.n_groups, self.num_heads // self.n_groups, B.shape[-1]).contiguous() + B = B.reshape(batch_size, -1, B.shape[-1]) + # [bsz, num_heads, head_dim, state_size] + dB = dt[..., None] * B[..., None, :] + + # Discretize x into dB + # [bsz, intermediate_size] -> [bsz, num_heads, head_dim] + hidden_states = hidden_states.reshape(batch_size, -1, self.head_dim) + dBx = dB * hidden_states[..., None] + + # State calculation + cache_params.ssm_states[self.layer_idx].copy_( + cache_params.ssm_states[self.layer_idx] * dA + dBx + ) + + # Subsequent output + # [bsz, n_groups * state_size] -> [bsz, num_heads, state_size] + C = C.reshape(batch_size, self.n_groups, -1)[..., None, :] + C = C.expand(batch_size, self.n_groups, self.num_heads // self.n_groups, C.shape[-1]).contiguous() + C = C.reshape(batch_size, -1, C.shape[-1]) + # [bsz, num_heads, head_dim] + + ssm_states = cache_params.ssm_states[self.layer_idx].to(C.dtype) # Shape: [b, h, d, n] + # Reshape ssm_states to merge the first two dimensions + # Shape: [b*h, d, n] + ssm_states_reshaped = ssm_states.view(batch_size * self.num_heads, self.head_dim, self.ssm_state_size) + C_reshaped = C.view(batch_size * self.num_heads, self.ssm_state_size, 1) # Shape: [b*h, n, 1] + y = torch.bmm(ssm_states_reshaped, C_reshaped) + y = y.view(batch_size, self.num_heads, self.head_dim) + + # D skip connection + # [num_heads] -> [num_heads, head_dim] + D = self.D[..., None].expand(self.D.shape[0], self.head_dim) + y = (y + hidden_states * D).to(y.dtype) + + # [bsz, num_heads, head_dim] -> [bsz, 1, intermediate_size] + y = y.reshape(batch_size, -1)[:, None, ...] + else: + # begin ssd naive implementation without einsums + dt = nn.functional.softplus(dt + self.dt_bias) + dt = torch.clamp(dt, self.time_step_min) + hidden_states = hidden_states.reshape(batch_size, seq_len, -1, self.head_dim).float() + B = B.reshape(batch_size, seq_len, -1, self.ssm_state_size).float() + C = C.reshape(batch_size, seq_len, -1, self.ssm_state_size).float() + B = B.repeat(1, 1, self.num_heads // self.n_groups, 1) + C = C.repeat(1, 1, self.num_heads // self.n_groups, 1) + pad_size = (self.chunk_size - seq_len % self.chunk_size) % self.chunk_size + + D_residual = self.D[..., None] * pad_tensor_by_size(hidden_states, pad_size) + + # Discretize x and A + hidden_states = hidden_states * dt[..., None] + A = A.to(hidden_states.dtype) * dt + + # Rearrange into blocks/chunks + hidden_states, A, B, C = [reshape_into_chunks(t, pad_size, self.chunk_size) for t in (hidden_states, A, B, C)] + + # [bsz, -1, chunk_size, num_heads] -> [bsz, num_heads, -1, chunk_size] + A = A.permute(0, 3, 1, 2) + A_cumsum = torch.cumsum(A, dim=-1) + + # 1. 
Compute the output for each intra-chunk (diagonal blocks) + # This is the analog of a causal mask + L = torch.exp(segment_sum(A)) + + # Contraction of C and B to get G (attention-weights like) + # shape: (b, c, l, s, h, n) + G_intermediate = C[:, :, :, None, :, :] * B[:, :, None, :, :, :] + G = G_intermediate.sum(dim=-1) # shape: (b, c, l, s, h) + + # Compute M, equivalent to applying attention mask to weights + M_intermediate = G[..., None] * L.permute(0, 2, 3, 4, 1)[..., None] + M = M_intermediate.sum(dim=-1) + + # Compute Y_diag (apply to values) + Y_diag = (M[..., None] * hidden_states[:, :, None]).sum(dim=3) + + # 2. Compute the state for each intra-chunk + # (right term of low-rank factorization of off-diagonal blocks; B terms) + decay_states = torch.exp((A_cumsum[:, :, :, -1:] - A_cumsum)) + B_decay = B * decay_states.permute(0, -2, -1, 1)[..., None] + states = (B_decay[..., None, :] * hidden_states[..., None]).sum(dim=2) + + # 3. Compute the inter-chunk SSM recurrence; produces correct SSM states at chunk boundaries + # (middle term of factorization of off-diag blocks; A terms) + if cache_params is not None and cache_params.seqlen_offset > 0: + previous_states = cache_params.ssm_states[self.layer_idx][:, None, ...] + else: + previous_states = torch.zeros_like(states[:, :1]) + states = torch.cat([previous_states, states], dim=1) + decay_chunk = torch.exp(segment_sum(nn.functional.pad(A_cumsum[:, :, :, -1], (1, 0)))) + decay_chunk = decay_chunk.transpose(1, 3) + new_states = (decay_chunk[..., None, None] * states[:, :, None, ...]).sum(dim=1) + states, ssm_state = new_states[:, :-1], new_states[:, -1] + + # 4. Compute state -> output conversion per chunk + # (left term of low-rank factorization of off-diagonal blocks; C terms) + state_decay_out = torch.exp(A_cumsum) + C_times_states = (C[..., None, :] * states[:, :, None, ...]) + state_decay_out_permuted = state_decay_out.permute(0, 2, 3, 1) + Y_off = (C_times_states.sum(-1) * state_decay_out_permuted[..., None]) + + # Add output of intra-chunk and inter-chunk terms (diagonal and off-diagonal blocks) + y = Y_diag + Y_off + # [bsz, -1, self.chunk_size, num_heads, head_dim] -> [bsz, (padded) seq_len, num_heads, head_dim] + y = y.reshape(batch_size, -1, self.num_heads, self.head_dim) + + y = y + D_residual + # Cutting off padded chunks + if pad_size > 0: + y = y[:, :seq_len, :, :] + y = y.reshape(batch_size, seq_len, -1) + if ssm_state is not None and cache_params is not None: + cache_params.ssm_states[self.layer_idx].copy_(ssm_state) + + scan_output = self.norm(y, gate) + # end ssd naive + + # 4. 
Final linear projection + contextualized_states = self.out_proj(scan_output.to(dtype)) # [batch, seq_len, hidden_size] + return contextualized_states + # fmt: on + + def forward( + self, + hidden_states, + cache_params: Optional[Mamba2Cache] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, + ): + if is_fast_path_available and "cuda" in self.in_proj.weight.device.type: + return self.cuda_kernels_forward(hidden_states, cache_params, cache_position, attention_mask) + dtype = hidden_states.dtype + if attention_mask is not None and attention_mask.shape[1] > 1 and attention_mask.shape[0] > 1: + # tune out hidden states for pad tokens, see https://github.com/state-spaces/mamba/issues/66 + hidden_states = (hidden_states * attention_mask[:, :, None]).to(dtype) + + return self.torch_forward(hidden_states, cache_params, cache_position, attention_mask) + + +class Mamba2Block(nn.Module): + def __init__(self, config, layer_idx): + super().__init__() + self.config = config + self.layer_idx = layer_idx + self.residual_in_fp32 = config.residual_in_fp32 + self.norm = RMSNorm(config.hidden_size, eps=config.layer_norm_epsilon) + self.mixer = Mamba2Mixer(config, layer_idx=layer_idx) + + def forward( + self, + hidden_states, + cache_params: Optional[Mamba2Cache] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, + ): + residual = hidden_states + hidden_states = self.norm(hidden_states.to(dtype=self.norm.weight.dtype)) + if self.residual_in_fp32: + residual = residual.to(torch.float32) + + hidden_states = self.mixer( + hidden_states, + cache_params=cache_params, + cache_position=cache_position, + attention_mask=attention_mask, + ) + hidden_states = residual + hidden_states + return hidden_states + + +class Mamba2PreTrainedModel(PreTrainedModel, GenerationMixin): + """ + An abstract class to handle weights initialization and a simple interface for downloading and loading pretrained + models. + """ + + config_class = Mamba2Config + base_model_prefix = "backbone" + _no_split_modules = ["Mamba2Block"] + supports_gradient_checkpointing = True + _is_stateful = True + + def _init_weights(self, module): + """Initialize the weights.""" + if isinstance(module, Mamba2Mixer): + module.A_log._no_weight_decay = True + module.D._no_weight_decay = True + + dt = torch.exp( + torch.rand(self.config.num_heads) + * (math.log(self.config.time_step_max) - math.log(self.config.time_step_min)) + + math.log(self.config.time_step_min) + ).clamp(min=self.config.time_step_floor) + + # # Inverse of softplus: https://github.com/pytorch/pytorch/issues/72759 + inv_dt = dt + torch.log(-torch.expm1(-dt)) + with torch.no_grad(): + module.dt_bias.copy_(inv_dt) + module.dt_bias._no_reinit = True + + if isinstance(module, nn.Linear): + if module.bias is not None: + if not getattr(module.bias, "_no_reinit", False): + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, std=self.config.initializer_range) + + if self.config.rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. 
+ # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["out_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + nn.init.kaiming_uniform_(p, a=math.sqrt(5)) + with torch.no_grad(): + p /= math.sqrt(self.config.num_hidden_layers) + + +@dataclass +# Copied from transformers.models.mamba.modeling_mamba.MambaOutput with MAMBA->MAMBA2,Mamba->Mamba2 +class Mamba2Output(ModelOutput): + """ + Class for the MAMBA2 model outputs. + + Args: + last_hidden_state (`torch.FloatTensor` of shape `(batch_size, sequence_length, hidden_size)`): + Sequence of hidden-states at the output of the last layer of the model. + cache_params (`Mamba2Cache`): + The state of the model at the last time step. Can be used in a forward method with the next `input_ids` to + avoid providing the old `input_ids`. + + Includes both the State space model state matrices after the selective scan, and the Convolutional states + hidden_states (`tuple(torch.FloatTensor)`, *optional*, + returned when `output_hidden_states=True` is passed or when `config.output_hidden_states=True`): + Tuple of `torch.FloatTensor` (one for the output of the embeddings, if the model has an embedding layer, + + one for the output of each layer) of shape `(batch_size, sequence_length, hidden_size)`. + + Hidden-states of the model at the output of each layer plus the optional initial embedding outputs. + """ + + last_hidden_state: Optional[torch.FloatTensor] = None + cache_params: Optional[Mamba2Cache] = None + hidden_states: Optional[Tuple[torch.FloatTensor]] = None + + +@dataclass +# Copied from transformers.models.mamba.modeling_mamba.MambaCausalLMOutput with Mamba->Mamba2 +class Mamba2CausalLMOutput(ModelOutput): + """ + Base class for causal language model (or autoregressive) outputs. + + Args: + loss (`torch.FloatTensor` of shape `(1,)`, *optional*, returned when `labels` is provided): + Language modeling loss (for next-token prediction). + logits (`torch.FloatTensor` of shape `(batch_size, sequence_length, config.vocab_size)`): + Prediction scores of the language modeling head (scores for each vocabulary token before SoftMax). + cache_params (`Mamba2Cache`): + The state of the model at the last time step. Can be used in a forward method with the next `input_ids` to + avoid providing the old `input_ids`. + + Includes both the State space model state matrices after the selective scan, and the Convolutional states + hidden_states (`tuple(torch.FloatTensor)`, *optional*, + returned when `output_hidden_states=True` is passed or when `config.output_hidden_states=True`): + Tuple of `torch.FloatTensor` (one for the output of the embeddings, if the model has an embedding layer, + + one for the output of each layer) of shape `(batch_size, sequence_length, hidden_size)`. + + Hidden-states of the model at the output of each layer plus the optional initial embedding outputs. 
+ """ + + loss: Optional[torch.FloatTensor] = None + logits: Optional[torch.FloatTensor] = None + cache_params: Optional[Mamba2Cache] = None + hidden_states: Optional[Tuple[torch.FloatTensor]] = None + + +class Mamba2Model(Mamba2PreTrainedModel): + def __init__(self, config): + super().__init__(config) + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size) + self.layers = nn.ModuleList([Mamba2Block(config, layer_idx=idx) for idx in range(config.num_hidden_layers)]) + + self.gradient_checkpointing = False + self.norm_f = RMSNorm(config.hidden_size, eps=config.layer_norm_epsilon) + # Initialize weights and apply final processing + self._register_load_state_dict_pre_hook(self.load_hook) + self.post_init() + + def load_hook(self, state_dict, prefix, *args): + for k in state_dict: + if "embedding." in k: + state_dict[k.replace("embedding.", "embeddings.")] = state_dict.pop(k) + break + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, new_embeddings): + self.embeddings = new_embeddings + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + inputs_embeds: Optional[torch.LongTensor] = None, + cache_params: Optional[Mamba2Cache] = None, + use_cache: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, + **kwargs, + ) -> Union[Tuple, Mamba2Output]: + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + if (input_ids is None) ^ (inputs_embeds is not None): # ^ is python for xor + raise ValueError("You must specify exactly one of input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + + if self.gradient_checkpointing and self.training and use_cache: + use_cache = False + + if use_cache: + if cache_params is None: + cache_params = Mamba2Cache( + self.config, inputs_embeds.size(0), device=inputs_embeds.device, dtype=inputs_embeds.dtype + ) + cache_position = torch.arange(0, self.config.conv_kernel, device=inputs_embeds.device) + elif cache_position is None: + # cases when we do manual forward instead of using `model.generate` which will initiate + # `cache_position` and makes sure it is not None, throw error here instead of doing some + # hack to conjecture the current cache position + raise ValueError( + "You have to specify the `cache_position` manually when `use_cache=True` and `cache_params` is passed, " + "you don't have to pass a `cache_params` if you are in prefilling stage because in that case it will " + "be initialized for you automatically" + ) + else: + cache_params = None + + hidden_states = inputs_embeds + all_hidden_states = () if output_hidden_states else None + for mixer_block in self.layers: + if self.gradient_checkpointing and self.training: + hidden_states = self._gradient_checkpointing_func( + mixer_block.__call__, + hidden_states, + cache_params, + cache_position, + attention_mask, + ) + else: + hidden_states = mixer_block( + hidden_states, + cache_params=cache_params, + cache_position=cache_position, + attention_mask=attention_mask, + ) + + if output_hidden_states: + all_hidden_states = all_hidden_states + (hidden_states,) + 
+ if use_cache: + cache_params.seqlen_offset += inputs_embeds.shape[1] + + hidden_states = self.norm_f(hidden_states) + + if output_hidden_states: + all_hidden_states = all_hidden_states + (hidden_states,) + + if not return_dict: + return tuple(v for v in [hidden_states, cache_params, all_hidden_states] if v is not None) + + return Mamba2Output( + last_hidden_state=hidden_states, + cache_params=cache_params if use_cache else None, + hidden_states=all_hidden_states, + ) + + +class Mamba2ForCausalLM(Mamba2PreTrainedModel): + _tied_weights_keys = [] + + def __init__(self, config): + super().__init__(config) + self.backbone = Mamba2Model(config) + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + # Initialize weights and apply final processing + self.post_init() + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def get_input_embeddings(self): + return self.backbone.get_input_embeddings() + + def set_input_embeddings(self, new_embeddings): + return self.backbone.set_input_embeddings(new_embeddings) + + def prepare_inputs_for_generation( + self, + input_ids, + inputs_embeds=None, + use_cache=None, + cache_params: Optional[Mamba2Cache] = None, + cache_position: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, + num_logits_to_keep: Optional[int] = None, + **kwargs, + ): + if inputs_embeds is not None: + past_len = inputs_embeds.shape[1] + input_ids.shape[1] + else: + past_len = input_ids.shape[1] + if use_cache: + # `cache_position` should have been initialized in `generate` + if cache_position is None: + raise ValueError( + "`cache_position` should not be None as it should have been initialized in " + "`model.generate`, you are responsible for passing in a valid `cache_position` if " + "you are calling `prepare_inputs_for_generation` directly with `use_cache=True`" + ) + # how do we detect that we are in decoding without cache? 
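+            # `cache_position[0] > 0` means a prefill pass has already populated
+            # the cache, i.e. we are decoding: only the last token (and the
+            # matching slice of the attention mask) has to be fed to the model.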
+ if cache_position[0] > 0: + input_ids = input_ids[:, -1][..., None] + attention_mask = attention_mask[:, -1][..., None] + else: + # we initialize the `cache_position` to full size of `conv_states` at prefill stage + # considering padding will be applied when input length is shorter, and truncation + # will be applied when it is longer, so it will be equivalent to always have it match + # the length of `cache_params.conv_states`, which is `config.conv_kernel` + cache_position = torch.arange(0, past_len, device=input_ids.device) + # if the cache is not used, we also do have to extend the attention mask here + # TODO there is likely a cleverer way to do this + extended_mask = torch.ones( + attention_mask.size(0), past_len - attention_mask.shape[1], device=attention_mask.device + ) + attention_mask = torch.cat([attention_mask, extended_mask], dim=1) + cache_params = None + + if attention_mask.shape[1] < past_len: + # we have to update manually the attention mask if + # we are in decoding without cache + # and we don't have position_ids here + # TODO but we should be able to use cache_position though at a later time + extended_mask = torch.ones( + attention_mask.size(0), past_len - attention_mask.shape[1], device=attention_mask.device + ) + attention_mask = torch.cat([attention_mask, extended_mask], dim=1) + if inputs_embeds is not None and cache_params is None: + model_inputs = {"inputs_embeds": inputs_embeds} + else: + model_inputs = {"input_ids": input_ids} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'attention_mask': attention_mask, + 'cache_params': cache_params, + 'use_cache': use_cache, + 'cache_position': cache_position, + 'num_logits_to_keep': num_logits_to_keep + }) + return model_inputs + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + inputs_embeds: Optional[torch.FloatTensor] = None, + cache_params: Optional[Mamba2Cache] = None, + labels: Optional[torch.LongTensor] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + use_cache: Optional[bool] = None, + cache_position: Optional[torch.Tensor] = None, + attention_mask: Optional[torch.Tensor] = None, + num_logits_to_keep: Optional[int] = 0, + **kwargs, # for now we need this for generation + ) -> Union[Tuple, Mamba2CausalLMOutput]: + r""" + labels (`torch.LongTensor` of shape `(batch_size, sequence_length)`, *optional*): + Labels for language modeling. Note that the labels **are shifted** inside the model, i.e. 
you can set + `labels = input_ids` Indices are selected in `[-100, 0, ..., config.vocab_size]` All labels set to `-100` + are ignored (masked), the loss is only computed for labels in `[0, ..., config.vocab_size]` + """ + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.backbone( + input_ids, + cache_params=cache_params, + inputs_embeds=inputs_embeds, + output_hidden_states=output_hidden_states, + return_dict=return_dict, + use_cache=use_cache, + cache_position=cache_position, + attention_mask=attention_mask, + ) + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return Mamba2CausalLMOutput( + loss=loss, + logits=logits, + cache_params=outputs.cache_params, + hidden_states=outputs.hidden_states, + ) diff --git a/fla/models/retnet/__init__.py b/fla/models/retnet/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..ad7d9e9da930819a2a6728e3e189090651b82a2e --- /dev/null +++ b/fla/models/retnet/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.retnet.configuration_retnet import RetNetConfig +from fla.models.retnet.modeling_retnet import RetNetForCausalLM, RetNetModel + +AutoConfig.register(RetNetConfig.model_type, RetNetConfig) +AutoModel.register(RetNetConfig, RetNetModel) +AutoModelForCausalLM.register(RetNetConfig, RetNetForCausalLM) + + +__all__ = ['RetNetConfig', 'RetNetForCausalLM', 'RetNetModel'] diff --git a/fla/models/retnet/configuration_retnet.py b/fla/models/retnet/configuration_retnet.py new file mode 100644 index 0000000000000000000000000000000000000000..535841629c557f226bf240a5e0bd6dc1493e317f --- /dev/null +++ b/fla/models/retnet/configuration_retnet.py @@ -0,0 +1,87 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class RetNetConfig(PretrainedConfig): + + model_type = 'retnet' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + attn_mode: str = "chunk", + hidden_size: int = 2048, + expand_k: int = 1, + expand_v: int = 2, + hidden_ratio: Optional[int] = 2, + intermediate_size: Optional[int] = None, + num_hidden_layers: int = 24, + num_heads: int = 8, + num_kv_heads: Optional[int] = None, + feature_map: Optional[str] = None, + hidden_act: str = "swish", + use_short_conv: bool = False, + conv_size: int = 4, + use_output_gate: bool = True, + max_position_embeddings: int = 2048, + elementwise_affine: 
Optional[bool] = True, + norm_eps: float = 1e-6, + attn: Optional[Dict] = None, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + tie_word_embeddings: bool = False, + initializer_range: float = 0.02, + fuse_norm: bool = True, + fuse_cross_entropy: bool = True, + vocab_size: int = 32000, + **kwargs + ) -> RetNetConfig: + self.attn_mode = attn_mode + self.hidden_size = hidden_size + self.expand_k = expand_k + self.expand_v = expand_v + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.num_hidden_layers = num_hidden_layers + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads + self.feature_map = feature_map + self.hidden_act = hidden_act + self.use_short_conv = use_short_conv + self.conv_size = conv_size + self.use_output_gate = use_output_gate + self.hidden_act = hidden_act + self.max_position_embeddings = max_position_embeddings + self.elementwise_affine = elementwise_affine + self.norm_eps = norm_eps + self.attn = attn + self.use_cache = use_cache + self.initializer_range = initializer_range + self.fuse_norm = fuse_norm + self.fuse_cross_entropy = fuse_cross_entropy + self.vocab_size = vocab_size + + if attn is not None: + if not isinstance(attn, Dict): + raise ValueError("attn must be a dictionary") + if 'layers' not in attn: + raise ValueError("Layer indices must be provided to initialize hybrid attention layers") + if 'num_heads' not in attn: + raise ValueError("Number of heads must be provided to initialize hybrid attention layers") + attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads']) + attn['window_size'] = attn.get('window_size', None) + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/retnet/modeling_retnet.py b/fla/models/retnet/modeling_retnet.py new file mode 100644 index 0000000000000000000000000000000000000000..f70c737a1504fd5d3fb37d394ae93b39a3de6bab --- /dev/null +++ b/fla/models/retnet/modeling_retnet.py @@ -0,0 +1,426 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.attn import Attention +from fla.layers.multiscale_retention import MultiScaleRetention +from fla.models.retnet.configuration_retnet import RetNetConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear + +logger = logging.get_logger(__name__) + + +class RetNetMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish' + ) -> RetNetMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = 
int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + y = self.gate_proj(x) + gate, y = y.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class RetNetBlock(nn.Module): + def __init__(self, config: RetNetConfig, layer_idx: int): + super().__init__() + self.hidden_size = config.hidden_size + + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + if config.attn is not None and layer_idx in config.attn['layers']: + self.attn = Attention( + hidden_size=config.hidden_size, + num_heads=config.attn['num_heads'], + num_kv_heads=config.attn['num_kv_heads'], + window_size=config.attn['window_size'], + max_position_embeddings=config.max_position_embeddings, + layer_idx=layer_idx + ) + else: + self.attn = MultiScaleRetention( + mode=config.attn_mode, + hidden_size=config.hidden_size, + expand_k=config.expand_k, + expand_v=config.expand_v, + num_heads=config.num_heads, + num_kv_heads=config.num_kv_heads, + feature_map=config.feature_map, + use_output_gate=config.use_output_gate, + gate_fn=config.hidden_act, + elementwise_affine=config.elementwise_affine, + norm_eps=config.norm_eps, + fuse_norm=config.fuse_norm, + layer_idx=layer_idx + ) + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = RetNetMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act + ) + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[List[torch.FloatTensor]] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs, + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + + residual = hidden_states + + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states, attentions, past_key_values) + + return outputs + + +class RetNetPreTrainedModel(PreTrainedModel): + + config_class = RetNetConfig + supports_gradient_checkpointing = True + _no_split_modules = ['RetNetBlock'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = True, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if 
module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. + # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class RetNetModel(RetNetPreTrainedModel): + + def __init__(self, config: RetNetConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList( + [RetNetBlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)] + ) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[List[torch.FloatTensor]] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn( + "`RetNetModel` does not support output attention weights now, so `output_attentions` is set to `False`." + ) + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. 
Setting `use_cache=False`...") + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + for layer in self.layers: + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if self.gradient_checkpointing and self.training: + hidden_states, attentions, past_key_values = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + use_cache, + output_attentions + ) + else: + hidden_states, attentions, past_key_values = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + + if output_attentions: + all_attns += (attentions,) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(i for i in [hidden_states, past_key_values, all_hidden_states, all_attns] if i is not None) + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=past_key_values, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class RetNetForCausalLM(RetNetPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.model = RetNetModel(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def generate(self, *args, **kwargs): + try: + return super().generate(*args, **kwargs) + except AttributeError as exception: + # Expected exception: "AttributeError: '(object name)' object has no attribute 'past_key_values'" + if 'past_key_values' in str(exception): + raise AttributeError( + f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, " + f"which is not supported for {self.__class__.__name__}. " + f"Try another generation strategy instead. " + f"For the available generation strategies, check this doc: " + f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies" + ) + else: + raise exception + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[torch.Tensor] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.FloatTensor] = None, + use_cache: Optional[bool] = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. + if past_key_values is not None: + input_ids = input_ids[:, -1:] + + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. 
+ # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. + model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[List[torch.FloatTensor]] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # decoder outputs consists of (dec_features, layer_state, dec_hidden, dec_attn) + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/rwkv6/__init__.py b/fla/models/rwkv6/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..942c6dc203bf6c867ffd5111e7f2ae1e7c060386 --- /dev/null +++ b/fla/models/rwkv6/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.rwkv6.configuration_rwkv6 import RWKV6Config +from fla.models.rwkv6.modeling_rwkv6 import RWKV6ForCausalLM, RWKV6Model + +AutoConfig.register(RWKV6Config.model_type, RWKV6Config) +AutoModel.register(RWKV6Config, RWKV6Model) +AutoModelForCausalLM.register(RWKV6Config, RWKV6ForCausalLM) + + +__all__ = 
['RWKV6Config', 'RWKV6ForCausalLM', 'RWKV6Model']
diff --git a/fla/models/rwkv6/configuration_rwkv6.py b/fla/models/rwkv6/configuration_rwkv6.py
new file mode 100644
index 0000000000000000000000000000000000000000..6e56614bf908dee1b34b3864629b58a5d10295c7
--- /dev/null
+++ b/fla/models/rwkv6/configuration_rwkv6.py
@@ -0,0 +1,80 @@
+# -*- coding: utf-8 -*-
+
+from typing import Dict, Optional
+
+from transformers.configuration_utils import PretrainedConfig
+
+
+class RWKV6Config(PretrainedConfig):
+
+    model_type = 'rwkv6'
+    keys_to_ignore_at_inference = ['past_key_values']
+
+    def __init__(
+        self,
+        attn_mode: str = "chunk",
+        hidden_size: int = 2048,
+        expand_k: float = 0.5,
+        expand_v: int = 1,
+        hidden_ratio: Optional[float] = 3.5,
+        intermediate_size: Optional[int] = None,
+        num_hidden_layers: int = 24,
+        num_heads: int = 4,
+        proj_low_rank_dim: int = 32,
+        gate_low_rank_dim: int = 64,
+        hidden_act: str = "sqrelu",
+        max_position_embeddings: int = 2048,
+        norm_first: bool = True,
+        norm_bias: bool = True,
+        norm_eps: float = 1e-5,
+        attn: Optional[Dict] = None,
+        use_cache: bool = True,
+        pad_token_id: Optional[int] = None,
+        bos_token_id: int = 1,
+        eos_token_id: int = 2,
+        tie_word_embeddings: bool = False,
+        initializer_range: float = 0.02,
+        fuse_norm: bool = True,
+        fuse_cross_entropy: bool = True,
+        vocab_size: int = 32000,
+        **kwargs
+    ):
+        self.attn_mode = attn_mode
+        self.hidden_size = hidden_size
+        self.expand_k = expand_k
+        self.expand_v = expand_v
+        self.hidden_ratio = hidden_ratio
+        self.intermediate_size = intermediate_size
+        self.norm_first = norm_first
+        self.num_hidden_layers = num_hidden_layers
+        self.num_heads = num_heads
+        self.proj_low_rank_dim = proj_low_rank_dim
+        self.gate_low_rank_dim = gate_low_rank_dim
+        self.hidden_act = hidden_act
+        self.max_position_embeddings = max_position_embeddings
+        self.norm_bias = norm_bias
+        self.norm_eps = norm_eps
+        self.attn = attn
+        self.use_cache = use_cache
+        self.initializer_range = initializer_range
+        self.fuse_norm = fuse_norm
+        self.fuse_cross_entropy = fuse_cross_entropy
+        self.vocab_size = vocab_size
+
+        if attn is not None:
+            if not isinstance(attn, Dict):
+                raise ValueError("attn must be a dictionary")
+            if 'layers' not in attn:
+                raise ValueError("Layer indices must be provided to initialize hybrid attention layers")
+            if 'num_heads' not in attn:
+                raise ValueError("Number of heads must be provided to initialize hybrid attention layers")
+            attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads'])
+            attn['window_size'] = attn.get('window_size', None)
+
+        super().__init__(
+            pad_token_id=pad_token_id,
+            bos_token_id=bos_token_id,
+            eos_token_id=eos_token_id,
+            tie_word_embeddings=tie_word_embeddings,
+            **kwargs,
+        )
diff --git a/fla/models/rwkv6/modeling_rwkv6.py b/fla/models/rwkv6/modeling_rwkv6.py
new file mode 100644
index 0000000000000000000000000000000000000000..a5b68d2a941d1ecdc3575f2356662b3f7c33ce75
--- /dev/null
+++ b/fla/models/rwkv6/modeling_rwkv6.py
@@ -0,0 +1,442 @@
+# -*- coding: utf-8 -*-
+
+from __future__ import annotations
+
+import math
+import warnings
+from typing import Optional, Tuple, Union
+
+import torch
+import torch.nn as nn
+import torch.utils.checkpoint
+from transformers.generation import GenerationMixin
+from transformers.modeling_outputs import (BaseModelOutputWithPast,
+                                           CausalLMOutputWithPast)
+from transformers.modeling_utils import PreTrainedModel
+from transformers.utils import logging
+
+from fla.layers.attn import Attention
+from fla.layers.rwkv6 import LerpLinear, RWKV6Attention
+from fla.models.rwkv6.configuration_rwkv6 import RWKV6Config
+from fla.models.utils import Cache
+from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss,
+                         LayerNorm)
+from fla.modules.activations import ACT2FN
+
+logger = logging.get_logger(__name__)
+
+
+class RWKV6FeedForward(nn.Module):
+
+    def __init__(
+        self,
+        hidden_size: int,
+        hidden_ratio: Optional[float] = None,
+        intermediate_size: Optional[int] = None,
+        hidden_act: str = 'sqrelu',
+        layer_idx: Optional[int] = None
+    ) -> RWKV6FeedForward:
+        super().__init__()
+
+        self.hidden_size = hidden_size
+        if hidden_ratio is None:
+            hidden_ratio = 3.5
+        if intermediate_size is None:
+            intermediate_size = int(hidden_size * hidden_ratio)
+            intermediate_size = 32 * ((intermediate_size + 32 - 1) // 32)
+        self.hidden_ratio = hidden_ratio
+        self.intermediate_size = intermediate_size
+
+        self.time_shift = nn.ZeroPad2d((0, 0, 1, -1))
+
+        self.key = LerpLinear(hidden_size, intermediate_size)
+        self.value = nn.Linear(intermediate_size, hidden_size, bias=False)
+        self.receptance = LerpLinear(hidden_size, hidden_size)
+        self.act_fn = ACT2FN[hidden_act]
+
+        self.layer_idx = layer_idx
+
+    def forward(
+        self,
+        x: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+        state: Optional[Cache] = None
+    ) -> Tuple[torch.Tensor, Optional[Cache]]:
+        if attention_mask is not None:
+            x = x.mul_(attention_mask[:, -x.shape[-2]:, None])
+        if x.shape[1] == 1 and state is not None:
+            shifted = state[self.layer_idx]['ffn_state'].unsqueeze(1)
+        else:
+            shifted = self.time_shift(x)
+            if state is not None and state[self.layer_idx]['ffn_state'] is not None:
+                shifted[:, 0] = state[self.layer_idx]['ffn_state'][-1]
+        delta = shifted - x
+        key = self.act_fn(self.key(x, delta))
+        value = self.value(key)
+        receptance = self.receptance(x, delta)
+
+        if state is not None:
+            # no need to update the offset twice
+            state.update(ffn_state=x[:, -1], layer_idx=self.layer_idx, offset=0)
+        return receptance.sigmoid() * value, state
+
+
+class RWKV6Block(nn.Module):
+    def __init__(self, config: RWKV6Config, layer_idx: int):
+        super().__init__()
+        self.hidden_size = config.hidden_size
+
+        self.config = config
+        self.layer_idx = layer_idx
+
+        if config.norm_first and layer_idx == 0:
+            self.pre_norm = LayerNorm(hidden_size=config.hidden_size, bias=config.norm_bias, eps=config.norm_eps)
+        self.attn_norm = LayerNorm(hidden_size=config.hidden_size, bias=config.norm_bias, eps=config.norm_eps)
+        if config.attn is not None and layer_idx in config.attn['layers']:
+            self.attn = Attention(
+                hidden_size=config.hidden_size,
+                num_heads=config.attn['num_heads'],
+                num_kv_heads=config.attn['num_kv_heads'],
+                window_size=config.attn['window_size'],
+                max_position_embeddings=config.max_position_embeddings,
+                layer_idx=layer_idx
+            )
+        else:
+            self.attn = RWKV6Attention(
+                mode=config.attn_mode,
+                hidden_size=config.hidden_size,
+                expand_k=config.expand_k,
+                expand_v=config.expand_v,
+                num_heads=config.num_heads,
+                proj_low_rank_dim=config.proj_low_rank_dim,
+                gate_low_rank_dim=config.gate_low_rank_dim,
+                norm_eps=config.norm_eps,
+                fuse_norm=config.fuse_norm,
+                layer_idx=layer_idx
+            )
+        self.ffn_norm = LayerNorm(hidden_size=config.hidden_size, bias=config.norm_bias, eps=config.norm_eps)
+        self.ffn = RWKV6FeedForward(
+            hidden_size=config.hidden_size,
+            hidden_ratio=config.hidden_ratio,
+            intermediate_size=config.intermediate_size,
+            hidden_act=config.hidden_act,
+            layer_idx=layer_idx
+        )
+
+    def forward(
+        self,
+        hidden_states: torch.Tensor,
+        attention_mask: Optional[torch.Tensor] = None,
+        past_key_values: Optional[Cache] = 
None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs, + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + residual = self.pre_norm(hidden_states) if hasattr(self, 'pre_norm') else hidden_states + hidden_states = self.attn_norm(residual) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + hidden_states, residual = self.ffn_norm(hidden_states, residual, True) + hidden_states, past_key_values = self.ffn(hidden_states, attention_mask, past_key_values) + hidden_states = residual + hidden_states + + outputs = (hidden_states, attentions, past_key_values) + + return outputs + + +class RWKV6PreTrainedModel(PreTrainedModel): + + config_class = RWKV6Config + supports_gradient_checkpointing = True + _no_split_modules = ['RWKV6Block'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = True, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Parameter): + nn.init.normal_(module, mean=0.0, std=self.config.initializer_range) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. 
+ # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class RWKV6Model(RWKV6PreTrainedModel): + + def __init__(self, config: RWKV6Config): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList([RWKV6Block(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = LayerNorm(config.hidden_size, bias=config.norm_bias, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[Cache] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn("`RWKV6Model` does not `output_attentions` now, setting it to `False`.") + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. 
Setting `use_cache=False`...") + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + for layer in self.layers: + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if self.gradient_checkpointing and self.training: + hidden_states, attentions, past_key_values = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + use_cache, + output_attentions + ) + else: + hidden_states, attentions, past_key_values = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + + if output_attentions: + all_attns += (attentions,) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(i for i in [hidden_states, past_key_values, all_hidden_states, all_attns] if i is not None) + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=past_key_values, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class RWKV6ForCausalLM(RWKV6PreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.model = RWKV6Model(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def generate(self, *args, **kwargs): + try: + return super().generate(*args, **kwargs) + except AttributeError as exception: + if 'past_key_values' in str(exception): + raise AttributeError( + f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, " + f"which is not supported for {self.__class__.__name__}. " + f"Try another generation strategy instead. " + f"For the available generation strategies, check this doc: " + f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies" + ) + else: + raise exception + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Cache] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + use_cache: bool = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. + if past_key_values is not None: + input_ids = input_ids[:, -1:] + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. + # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. 
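+            # When a cache is present, `input_ids` was already truncated to shape
+            # [batch_size, 1] above, so the contiguous copy below is cheap (and a
+            # no-op for tensors that are already contiguous).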
+ model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + past_key_values: Optional[Cache] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/samba/__init__.py b/fla/models/samba/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..244913e776944de23878781f4be7bd037fac89ab --- /dev/null +++ b/fla/models/samba/__init__.py @@ -0,0 +1,14 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.samba.configuration_samba import SambaConfig +from fla.models.samba.modeling_samba import (SambaBlock, SambaForCausalLM, + SambaModel) + +AutoConfig.register(SambaConfig.model_type, SambaConfig, True) +AutoModel.register(SambaConfig, SambaModel, True) +AutoModelForCausalLM.register(SambaConfig, SambaForCausalLM, True) + + +__all__ = ['SambaConfig', 'SambaForCausalLM', 'SambaModel', 'SambaBlock'] diff --git a/fla/models/samba/configuration_samba.py b/fla/models/samba/configuration_samba.py new file mode 100644 index 
0000000000000000000000000000000000000000..fd7b008ef73f780c4487517c1883056d4d58b525 --- /dev/null +++ b/fla/models/samba/configuration_samba.py @@ -0,0 +1,87 @@ +# -*- coding: utf-8 -*- + +import math +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class SambaConfig(PretrainedConfig): + + model_type = "samba" + + def __init__( + self, + vocab_size: int = 32000, + hidden_size: int = 2304, + state_size: int = 16, + num_hidden_layers: int = 18, + norm_eps=1e-5, + pad_token_id: int = 0, + bos_token_id: int = 1, + eos_token_id: int = 2, + expand: int = 2, + conv_kernel: int = 4, + use_bias: bool = False, + use_conv_bias: bool = True, + hidden_act: str = "silu", + initializer_range: str = 0.02, + residual_in_fp32: bool = False, + time_step_rank: str = "auto", + time_step_scale: float = 1.0, + time_step_min: float = 0.001, + time_step_max: float = 0.1, + time_step_init_scheme: str = "random", + time_step_floor: float = 1e-4, + max_position_embeddings: int = 2048, + attn: Optional[Dict] = { + 'layers': (1, 3, 5, 7, 9, 11, 13, 15, 17), + 'num_heads': 18, + 'num_kv_heads': 18, + 'window_size': 2048 + }, + hidden_ratio: Optional[int] = 4, + rescale_prenorm_residual: bool = False, + use_cache: bool = True, + fuse_norm: bool = True, + fuse_cross_entropy: bool = True, + tie_word_embeddings: bool = False, + **kwargs, + ): + self.vocab_size = vocab_size + self.hidden_size = hidden_size + self.state_size = state_size + self.num_hidden_layers = num_hidden_layers + self.norm_eps = norm_eps + self.conv_kernel = conv_kernel + self.expand = expand + self.intermediate_size = int(expand * self.hidden_size) + self.bos_token_id = bos_token_id + self.eos_token_id = eos_token_id + self.pad_token_id = pad_token_id + self.use_bias = use_bias + self.use_conv_bias = use_conv_bias + self.hidden_act = hidden_act + self.initializer_range = initializer_range + self.time_step_rank = math.ceil(self.hidden_size / 16) if time_step_rank == "auto" else time_step_rank + self.time_step_scale = time_step_scale + self.time_step_min = time_step_min + self.time_step_max = time_step_max + self.time_step_init_scheme = time_step_init_scheme + self.time_step_floor = time_step_floor + self.max_position_embeddings = max_position_embeddings + self.attn = attn + self.hidden_ratio = hidden_ratio + self.rescale_prenorm_residual = rescale_prenorm_residual + self.residual_in_fp32 = residual_in_fp32 + self.use_cache = use_cache + self.fuse_cross_entropy = fuse_cross_entropy + self.fuse_norm = fuse_norm + + super().__init__( + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + pad_token_id=pad_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs + ) diff --git a/fla/models/samba/modeling_samba.py b/fla/models/samba/modeling_samba.py new file mode 100644 index 0000000000000000000000000000000000000000..b23ea4fcfe5951bb1429c1c1122a1d3834eee139 --- /dev/null +++ b/fla/models/samba/modeling_samba.py @@ -0,0 +1,418 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +from dataclasses import dataclass +from typing import Any, Dict, Optional, Tuple, Union + +import torch +import torch.utils.checkpoint +from torch import nn +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import ModelOutput, logging + +from fla.layers.attn import Attention +from fla.models.mamba.modeling_mamba import MambaCache, MambaMixer +from 
fla.models.samba.configuration_samba import SambaConfig +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear + +logger = logging.get_logger(__name__) + + +class SambaMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + hidden_act: str = 'swish' + ) -> SambaMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + self.hidden_ratio = hidden_ratio + + self.intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + self.intermediate_size = 256 * ((self.intermediate_size + 256 - 1) // 256) + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + y = self.gate_proj(x) + gate, y = y.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class SambaBlock(nn.Module): + def __init__(self, config, layer_idx): + super().__init__() + + self.config = config + self.hidden_size = config.hidden_size + self.layer_idx = layer_idx + + self.mixer_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + if config.attn is not None and layer_idx in config.attn['layers']: + self.mixer = Attention( + hidden_size=config.hidden_size, + num_heads=config.attn['num_heads'], + num_kv_heads=config.attn['num_kv_heads'], + window_size=config.attn['window_size'], + max_position_embeddings=config.max_position_embeddings, + layer_idx=layer_idx + ) + else: + self.mixer = MambaMixer(config, layer_idx=layer_idx) + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = SambaMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + hidden_act=config.hidden_act + ) + + def forward( + self, + hidden_states: torch.Tensor, + cache_params: Optional[Tuple[torch.Tensor]] = None, + **kwargs, + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + + residual = hidden_states + hidden_states = self.mixer_norm(hidden_states) + if isinstance(self.mixer, MambaMixer): + hidden_states = self.mixer(hidden_states, cache_params=cache_params) + else: + hidden_states, _, cache_params = self.mixer(hidden_states=hidden_states, past_key_values=cache_params) + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + return hidden_states + + +class SambaPreTrainedModel(PreTrainedModel): + """ + An abstract class to handle weights initialization and a simple interface for downloading and loading pretrained + models. 
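+
+    Concretely, it binds `SambaConfig` to the Samba model classes and implements
+    `_init_weights`, including the inverse-softplus initialization of the
+    `MambaMixer` time-step bias below.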
+ """ + + config_class = SambaConfig + base_model_prefix = "backbone" + _no_split_modules = ["SambaBlock"] + supports_gradient_checkpointing = True + + def _init_weights(self, module): + """Initialize the weights.""" + if isinstance(module, MambaMixer): + module.A_log._no_weight_decay = True + module.D._no_weight_decay = True + + dt_init_std = self.config.time_step_rank**-0.5 * self.config.time_step_scale + if self.config.time_step_init_scheme == "constant": + nn.init.constant_(module.dt_proj.weight, dt_init_std) + elif self.config.time_step_init_scheme == "random": + nn.init.uniform_(module.dt_proj.weight, -dt_init_std, dt_init_std) + + dt = torch.exp( + torch.rand(self.config.intermediate_size) + * (math.log(self.config.time_step_max) - math.log(self.config.time_step_min)) + + math.log(self.config.time_step_min) + ).clamp(min=self.config.time_step_floor) + # # Inverse of softplus: https://github.com/pytorch/pytorch/issues/72759 + inv_dt = dt + torch.log(-torch.expm1(-dt)) + with torch.no_grad(): + module.dt_proj.bias.copy_(inv_dt) + module.dt_proj.bias._no_reinit = True + + if isinstance(module, nn.Linear): + if module.bias is not None: + if not getattr(module.bias, "_no_reinit", False): + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, std=self.config.initializer_range) + + if self.config.rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. + # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["out_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + nn.init.kaiming_uniform_(p, a=math.sqrt(5)) + with torch.no_grad(): + p /= math.sqrt(self.config.num_layers) + + +@dataclass +class SambaOutput(ModelOutput): + """ + Class for the Samba model outputs. + + Args: + last_hidden_state (`torch.FloatTensor` of shape `(batch_size, sequence_length, hidden_size)`): + Sequence of hidden-states at the output of the last layer of the model. + cache_params (`MambaCache`): + The state of the model at the last time step. Can be used in a forward method with the next `input_ids` to + avoid providing the old `input_ids`. + + Includes both the State space model state matrices after the selective scan, and the Convolutional states + hidden_states (`tuple(torch.FloatTensor)`, *optional*, + returned when `output_hidden_states=True` is passed or when `config.output_hidden_states=True`): + Tuple of `torch.FloatTensor` (one for the output of the embeddings, if the model has an embedding layer, + + one for the output of each layer) of shape `(batch_size, sequence_length, hidden_size)`. + + Hidden-states of the model at the output of each layer plus the optional initial embedding outputs. 
+ """ + + last_hidden_state: Optional[torch.FloatTensor] = None + cache_params: Optional[MambaCache] = None + hidden_states: Optional[Tuple[torch.FloatTensor]] = None + + +@dataclass +class SambaCausalLMOutput(ModelOutput): + """ + Base class for causal language model (or autoregressive) outputs. + + Args: + loss (`torch.FloatTensor` of shape `(1,)`, *optional*, returned when `labels` is provided): + Language modeling loss (for next-token prediction). + logits (`torch.FloatTensor` of shape `(batch_size, sequence_length, config.vocab_size)`): + Prediction scores of the language modeling head (scores for each vocabulary token before SoftMax). + cache_params (`MambaCache`): + The state of the model at the last time step. Can be used in a forward method with the next `input_ids` to + avoid providing the old `input_ids`. + + Includes both the State space model state matrices after the selective scan, and the Convolutional states + hidden_states (`tuple(torch.FloatTensor)`, *optional*, + returned when `output_hidden_states=True` is passed or when `config.output_hidden_states=True`): + Tuple of `torch.FloatTensor` (one for the output of the embeddings, if the model has an embedding layer, + + one for the output of each layer) of shape `(batch_size, sequence_length, hidden_size)`. + + Hidden-states of the model at the output of each layer plus the optional initial embedding outputs. + """ + + loss: Optional[torch.FloatTensor] = None + logits: Optional[torch.FloatTensor] = None + cache_params: Optional[MambaCache] = None + hidden_states: Optional[Tuple[torch.FloatTensor]] = None + + +class SambaModel(SambaPreTrainedModel): + def __init__(self, config): + super().__init__(config) + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size) + self.layers = nn.ModuleList([SambaBlock(config, layer_idx=idx) for idx in range(config.num_hidden_layers)]) + + self.gradient_checkpointing = False + self.norm_f = RMSNorm(config.hidden_size, eps=config.norm_eps) + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, new_embeddings): + self.embeddings = new_embeddings + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + inputs_embeds: Optional[torch.LongTensor] = None, + cache_params: Optional[MambaCache] = None, + use_cache: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + **kwargs, # `attention_mask` is passed by the tokenizer and we don't want it + ) -> Union[Tuple, SambaOutput]: + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + if (input_ids is None) ^ (inputs_embeds is not None): # ^ is python for xor + raise ValueError( + "You cannot specify both input_ids and inputs_embeds at the same time, and must specify either one" + ) + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + + if self.gradient_checkpointing and self.training and use_cache: + use_cache = False + + if cache_params is None and use_cache: + cache_params = MambaCache( + self.config, inputs_embeds.size(0), device=inputs_embeds.device, dtype=inputs_embeds.dtype + ) + + hidden_states = inputs_embeds + all_hidden_states = () if 
output_hidden_states else None + for mixer_block in self.layers: + if self.gradient_checkpointing and self.training: + hidden_states = self._gradient_checkpointing_func(mixer_block.__call__, hidden_states, cache_params) + else: + hidden_states = mixer_block(hidden_states, cache_params=cache_params) + + if output_hidden_states: + all_hidden_states = all_hidden_states + (hidden_states,) + + if use_cache: + cache_params.seqlen_offset += inputs_embeds.shape[1] + + hidden_states = self.norm_f(hidden_states) + + if output_hidden_states: + all_hidden_states = all_hidden_states + (hidden_states,) + + if not return_dict: + return tuple(v for v in [hidden_states, cache_params, all_hidden_states] if v is not None) + + return SambaOutput( + last_hidden_state=hidden_states, + cache_params=cache_params if use_cache else None, + hidden_states=all_hidden_states, + ) + + +class SambaForCausalLM(SambaPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.backbone = SambaModel(config) + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + # Initialize weights and apply final processing + self.post_init() + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def get_input_embeddings(self): + return self.backbone.get_input_embeddings() + + def set_input_embeddings(self, new_embeddings): + return self.backbone.set_input_embeddings(new_embeddings) + + def _update_model_kwargs_for_generation( + self, outputs: ModelOutput, model_kwargs: Dict[str, Any], **kwargs + ) -> Dict[str, Any]: + model_kwargs["cache_params"] = outputs.get("cache_params", None) + return model_kwargs + + def prepare_inputs_for_generation( + self, + input_ids, + cache_params: + Optional[MambaCache] = None, + inputs_embeds=None, + attention_mask=None, + use_cache: Optional[bool] = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for inputs_ids if the state is passed along. + if cache_params is not None: + input_ids = input_ids[:, -1].unsqueeze(-1) + + if inputs_embeds is not None and cache_params is None: + model_inputs = {"inputs_embeds": inputs_embeds} + else: + model_inputs = {"input_ids": input_ids} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'cache_params': cache_params, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + cache_params: Optional[MambaCache] = None, + labels: Optional[torch.LongTensor] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + use_cache: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0, + **kwargs, # for now we need this for generation + ) -> Union[Tuple, SambaCausalLMOutput]: + r""" + labels (`torch.LongTensor` of shape `(batch_size, sequence_length)`, *optional*): + Labels for language modeling. Note that the labels **are shifted** inside the model, i.e. 
you can set + `labels = input_ids` Indices are selected in `[-100, 0, ..., config.vocab_size]` All labels set to `-100` + are ignored (masked), the loss is only computed for labels in `[0, ..., config.vocab_size]` + """ + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + samba_outputs = self.backbone( + input_ids, + cache_params=cache_params, + inputs_embeds=inputs_embeds, + output_hidden_states=output_hidden_states, + return_dict=return_dict, + use_cache=use_cache, + ) + hidden_states = samba_outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + samba_outputs[1:] + return (loss,) + output if loss is not None else output + + return SambaCausalLMOutput( + loss=loss, + logits=logits, + cache_params=samba_outputs.cache_params, + hidden_states=samba_outputs.hidden_states, + ) diff --git a/fla/models/scan/__init__.py b/fla/models/scan/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..55e172cf0ab129cc2568b85064a1df632d0d3cb3 --- /dev/null +++ b/fla/models/scan/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.scan.configuration_scan import SCANConfig +from fla.models.scan.modeling_scan import SCANForCausalLM, SCANModel + +AutoConfig.register(SCANConfig.model_type, SCANConfig) +AutoModel.register(SCANConfig, SCANModel) +AutoModelForCausalLM.register(SCANConfig, SCANForCausalLM) + + +__all__ = ['SCANConfig', 'SCANForCausalLM', 'SCANModel'] diff --git a/fla/models/scan/configuration_scan.py b/fla/models/scan/configuration_scan.py new file mode 100644 index 0000000000000000000000000000000000000000..a89b1adfd382d4f36fde1bd077a64cc5af158031 --- /dev/null +++ b/fla/models/scan/configuration_scan.py @@ -0,0 +1,92 @@ +# -*- coding: utf-8 -*- + +from typing import Dict, Optional + +from transformers.configuration_utils import PretrainedConfig + + +class SCANConfig(PretrainedConfig): + + model_type = 'scan' + # keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + hidden_size: int = 2048, + window_size: int = 512, + gate_logit_normalizer: Optional[int] = 8, + clamp_min: Optional[float] = None, + clamp_max: Optional[float] = None, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + num_hidden_layers: int = 24, + num_heads: int = 4, + num_kv_heads: Optional[int] = None, + state_size: Optional[int] = 64, + expand_k: float = 1, + expand_v: float = 1, + gate_act: str = 'softmax', + use_output_gate: bool = False, + use_norm: bool = True, + hidden_act: str = "swish", + elementwise_affine: Optional[bool] = True, + 
max_position_embeddings: Optional[int] = 2048, + norm_first: bool = True, + norm_eps: float = 1e-6, + attn: Optional[Dict] = None, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + initializer_range: float = 0.02, + tie_word_embeddings: bool = False, + fuse_norm: bool = True, + fuse_cross_entropy: bool = True, + vocab_size: int = 32000, + **kwargs + ): + self.hidden_size = hidden_size + self.window_size = window_size + self.gate_logit_normalizer = gate_logit_normalizer + self.clamp_min = clamp_min + self.clamp_max = clamp_max + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.num_hidden_layers = num_hidden_layers + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads + self.state_size = state_size + self.expand_k = expand_k + self.expand_v = expand_v + self.gate_act = gate_act + self.use_output_gate = use_output_gate + self.use_norm = use_norm + self.hidden_act = hidden_act + self.elementwise_affine = elementwise_affine + self.max_position_embeddings = max_position_embeddings + self.norm_first = norm_first + self.norm_eps = norm_eps + self.attn = attn + self.use_cache = use_cache + self.initializer_range = initializer_range + self.fuse_cross_entropy = fuse_cross_entropy + self.fuse_norm = fuse_norm + self.vocab_size = vocab_size + + if attn is not None: + if not isinstance(attn, Dict): + raise ValueError("attn must be a dictionary") + if 'layers' not in attn: + raise ValueError("Layer indices must be provided to initialize hybrid attention layers") + if 'num_heads' not in attn: + raise ValueError("Number of heads must be provided to initialize hybrid attention layers") + attn['num_kv_heads'] = attn.get('num_kv_heads', attn['num_heads']) + attn['window_size'] = attn.get('window_size', None) + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/scan/modeling_scan.py b/fla/models/scan/modeling_scan.py new file mode 100644 index 0000000000000000000000000000000000000000..3197aeca52b499be8c079447998f88620800a5d8 --- /dev/null +++ b/fla/models/scan/modeling_scan.py @@ -0,0 +1,442 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.attn import Attention +from fla.layers.scan import SemiCompressedAttention +from fla.models.scan.configuration_scan import SCANConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear +from fla.modules.layernorm import rms_norm_linear + +logger = logging.get_logger(__name__) + + +class SCANMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish', + norm_first: bool = True, + norm_eps: float = 1e-5 + ) -> SCANMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # 
`intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.norm_first = norm_first + + if norm_first: + self.norm = RMSNorm(hidden_size=hidden_size, eps=norm_eps) + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + if self.norm_first: + x = rms_norm_linear(x, self.norm.weight, self.norm.bias, self.gate_proj.weight, self.gate_proj.bias) + else: + x = self.gate_proj(x) + gate, y = x.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class SCANBlock(nn.Module): + def __init__(self, config: SCANConfig, layer_idx: int): + super().__init__() + self.hidden_size = config.hidden_size + + if not config.norm_first: + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + # if config.attn is not None and layer_idx in config.attn['layers']: + # self.attn = Attention( + # hidden_size=config.hidden_size, + # num_heads=config.attn['num_heads'], + # num_kv_heads=config.attn['num_kv_heads'], + # window_size=config.attn['window_size'], + # max_position_embeddings=config.max_position_embeddings, + # layer_idx=layer_idx + # ) + # else: # No need for hybrid option because the SCAN module is inherently hybrid + self.attn = SemiCompressedAttention( + hidden_size=config.hidden_size, + window_size=config.window_size, + state_size=config.state_size, + gate_act=config.gate_act, + max_position_embeddings=config.max_position_embeddings, + expand_k=config.expand_k, + expand_v=config.expand_v, + num_heads=config.num_heads, + num_kv_heads=config.num_kv_heads, + use_output_gate=config.use_output_gate, + use_norm=config.use_norm, + gate_fn=config.hidden_act, + gate_logit_normalizer=config.gate_logit_normalizer, + elementwise_affine=config.elementwise_affine, + norm_first=config.norm_first, + norm_eps=config.norm_eps, + fuse_norm=config.fuse_norm, + layer_idx=layer_idx + ) + if not config.norm_first: + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = SCANMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act, + norm_first=config.norm_first, + norm_eps=config.norm_eps + ) + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = False, + output_attentions: Optional[bool] = False, + **kwargs + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + + residual = hidden_states + if hasattr(self, 'attn_norm'): + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + if hasattr(self, 'mlp_norm'): + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + else: + hidden_states = residual + hidden_states + residual 
= hidden_states + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states, attentions, past_key_values) + + return outputs + + +class SCANPreTrainedModel(PreTrainedModel): + + config_class = SCANConfig + supports_gradient_checkpointing = True + _no_split_modules = ['SCANBlock'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = True, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. + # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class SCANModel(SCANPreTrainedModel): + + def __init__(self, config: SCANConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList([SCANBlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, # noqa + inputs_embeds: Optional[torch.FloatTensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, BaseModelOutputWithPast]: + if output_attentions: + warnings.warn("`SCANModel` does not `output_attentions` now, setting it to `False`.") + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + 
use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + if input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + hidden_states = inputs_embeds + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if self.gradient_checkpointing and self.training and use_cache: + logger.warning_once("`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`...") + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + + for i, layer in enumerate(self.layers): + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if self.gradient_checkpointing and self.training: + hidden_states, attentions, past_key_values = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + use_cache, + output_attentions, + ) + else: + hidden_states, attentions, past_key_values = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + + if output_attentions: + all_attns += (attentions,) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(i for i in [hidden_states, past_key_values, all_hidden_states, all_attns] if i is not None) + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=past_key_values, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class SCANForCausalLM(SCANPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + + super().__init__(config) + self.model = SCANModel(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def generate(self, *args, **kwargs): + try: + return super().generate(*args, **kwargs) + except AttributeError as exception: + if 'past_key_values' in str(exception): + raise AttributeError( + f"You tried to call `generate` with a decoding strategy that manipulates `past_key_values`, " + f"which is not supported for {self.__class__.__name__}. " + f"Try another generation strategy instead. 
" + f"For the available generation strategies, check this doc: " + f"https://huggingface.co/docs/transformers/en/generation_strategies#decoding-strategies" + ) + else: + raise exception + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + use_cache: bool = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. + if past_key_values is not None: + input_ids = input_ids[:, -1:] + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. + # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. + model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + inputs_embeds=inputs_embeds, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + 
if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/transformer/__init__.py b/fla/models/transformer/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..47df999fe1446258dc9930e8b0aa6941f1c93f58 --- /dev/null +++ b/fla/models/transformer/__init__.py @@ -0,0 +1,14 @@ +# -*- coding: utf-8 -*- + +from transformers import AutoConfig, AutoModel, AutoModelForCausalLM + +from fla.models.transformer.configuration_transformer import TransformerConfig +from fla.models.transformer.modeling_transformer import ( + TransformerForCausalLM, TransformerModel) + +AutoConfig.register(TransformerConfig.model_type, TransformerConfig) +AutoModel.register(TransformerConfig, TransformerModel) +AutoModelForCausalLM.register(TransformerConfig, TransformerForCausalLM) + + +__all__ = ['TransformerConfig', 'TransformerForCausalLM', 'TransformerModel'] diff --git a/fla/models/transformer/configuration_transformer.py b/fla/models/transformer/configuration_transformer.py new file mode 100644 index 0000000000000000000000000000000000000000..35e27113cdcfb0e5ed9c0ec70ec08761f6ed4232 --- /dev/null +++ b/fla/models/transformer/configuration_transformer.py @@ -0,0 +1,68 @@ +# -*- coding: utf-8 -*- + +from typing import Optional + +from transformers.configuration_utils import PretrainedConfig + + +class TransformerConfig(PretrainedConfig): + + model_type = 'transformer' + keys_to_ignore_at_inference = ['past_key_values'] + + def __init__( + self, + vocab_size: int = 32000, + hidden_size: int = 2048, + num_hidden_layers: int = 24, + num_heads: int = 32, + num_kv_heads: int = None, + window_size: Optional[int] = None, + rope_theta: Optional[float] = 10000., + max_position_embeddings: int = 2048, + hidden_ratio: Optional[int] = 4, + intermediate_size: Optional[int] = None, + hidden_act: str = "swish", + initializer_range: float = 0.02, + elementwise_affine: Optional[bool] = True, + norm_first: bool = False, + norm_eps: float = 1e-6, + use_cache: bool = True, + pad_token_id: int = None, + bos_token_id: int = 1, + eos_token_id: int = 2, + tie_word_embeddings: bool = False, + attention_bias: bool = False, + fuse_norm: bool = True, + fuse_cross_entropy: bool = True, + **kwargs, + ): + self.vocab_size = vocab_size + self.hidden_size = hidden_size + self.num_hidden_layers = num_hidden_layers + self.num_heads = num_heads + self.num_kv_heads = num_kv_heads + self.window_size = window_size + self.rope_theta = rope_theta + self.max_position_embeddings = max_position_embeddings + + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.hidden_act = hidden_act + + self.initializer_range = initializer_range + self.elementwise_affine = elementwise_affine + self.norm_first = norm_first + self.norm_eps = norm_eps + self.use_cache = use_cache + self.attention_bias = attention_bias + self.fuse_cross_entropy = fuse_cross_entropy + self.fuse_norm = fuse_norm + + super().__init__( + pad_token_id=pad_token_id, + bos_token_id=bos_token_id, + eos_token_id=eos_token_id, + tie_word_embeddings=tie_word_embeddings, + **kwargs, + ) diff --git a/fla/models/transformer/modeling_transformer.py b/fla/models/transformer/modeling_transformer.py new file mode 100644 index 
0000000000000000000000000000000000000000..d5c3aa943887648b7656365676888916b42331b3 --- /dev/null +++ b/fla/models/transformer/modeling_transformer.py @@ -0,0 +1,428 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +import warnings +from typing import List, Optional, Tuple, Union + +import torch +import torch.nn as nn +import torch.utils.checkpoint +from transformers.activations import ACT2FN +from transformers.generation import GenerationMixin +from transformers.modeling_outputs import (BaseModelOutputWithPast, + CausalLMOutputWithPast) +from transformers.modeling_utils import PreTrainedModel +from transformers.utils import logging + +from fla.layers.attn import Attention +from fla.models.transformer.configuration_transformer import TransformerConfig +from fla.models.utils import Cache +from fla.modules import (FusedCrossEntropyLoss, FusedLinearCrossEntropyLoss, + RMSNorm) +from fla.modules.activations import swiglu_linear +from fla.modules.layernorm import rms_norm_linear + +logger = logging.get_logger(__name__) + + +class TransformerMLP(nn.Module): + + def __init__( + self, + hidden_size: int, + hidden_ratio: Optional[int] = None, + intermediate_size: Optional[int] = None, + hidden_act: str = 'swish', + norm_first: bool = True, + norm_eps: float = 1e-5 + ) -> TransformerMLP: + super().__init__() + + self.hidden_size = hidden_size + # the final number of params is `hidden_ratio * hidden_size^2` + # `intermediate_size` is chosen to be a multiple of 256 closest to `2/3 * hidden_size * hidden_ratio` + if hidden_ratio is None: + hidden_ratio = 4 + if intermediate_size is None: + intermediate_size = int(hidden_size * hidden_ratio * 2 / 3) + intermediate_size = 256 * ((intermediate_size + 256 - 1) // 256) + self.hidden_ratio = hidden_ratio + self.intermediate_size = intermediate_size + self.norm_first = norm_first + + if norm_first: + self.norm = RMSNorm(hidden_size=hidden_size, eps=norm_eps) + + self.gate_proj = nn.Linear(self.hidden_size, self.intermediate_size * 2, bias=False) + self.down_proj = nn.Linear(self.intermediate_size, self.hidden_size, bias=False) + self.act_fn = ACT2FN[hidden_act] + + def forward(self, x): + if self.norm_first: + x = rms_norm_linear(x, self.norm.weight, self.norm.bias, self.gate_proj.weight, self.gate_proj.bias) + else: + x = self.gate_proj(x) + gate, y = x.chunk(2, -1) + return swiglu_linear(gate, y, self.down_proj.weight, self.down_proj.bias) + + +class TransformerBlock(nn.Module): + + def __init__(self, config: TransformerConfig, layer_idx: int): + super().__init__() + + self.hidden_size = config.hidden_size + + if not config.norm_first: + self.attn_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.attn = Attention( + hidden_size=config.hidden_size, + num_heads=config.num_heads, + num_kv_heads=config.num_kv_heads, + window_size=config.window_size, + rope_theta=config.rope_theta, + max_position_embeddings=config.max_position_embeddings, + norm_first=config.norm_first, + norm_eps=config.norm_eps, + layer_idx=layer_idx + ) + if not config.norm_first: + self.mlp_norm = RMSNorm(hidden_size=config.hidden_size, eps=config.norm_eps) + self.mlp = TransformerMLP( + hidden_size=config.hidden_size, + hidden_ratio=config.hidden_ratio, + intermediate_size=config.intermediate_size, + hidden_act=config.hidden_act, + norm_first=config.norm_first, + norm_eps=config.norm_eps + ) + + def forward( + self, + hidden_states: torch.Tensor, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: 
Optional[Tuple[torch.Tensor]] = None, + output_attentions: Optional[bool] = False, + use_cache: Optional[bool] = False, + **kwargs, + ) -> Tuple[torch.FloatTensor, Optional[Tuple[torch.FloatTensor, torch.FloatTensor]]]: + + residual = hidden_states + if hasattr(self, 'attn_norm'): + hidden_states = self.attn_norm(hidden_states) + hidden_states, attentions, past_key_values = self.attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + use_cache=use_cache, + output_attentions=output_attentions + ) + if hasattr(self, 'mlp_norm'): + hidden_states, residual = self.mlp_norm(hidden_states, residual, True) + else: + hidden_states = residual + hidden_states + residual = hidden_states + hidden_states = self.mlp(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states,) + + if output_attentions: + outputs += (attentions,) + + if use_cache: + outputs += (past_key_values,) + + return outputs + + +class TransformerPreTrainedModel(PreTrainedModel): + + config_class = TransformerConfig + supports_gradient_checkpointing = True + _no_split_modules = ['TransformerBlock'] + + def __init__(self, *inputs, **kwargs): + super().__init__(*inputs, **kwargs) + + def _init_weights( + self, + module: nn.Module, + rescale_prenorm_residual: bool = False, + num_residuals_per_layer: int = 2, + ): + if isinstance(module, (nn.Linear, nn.Conv1d)): + # Slightly different from the TF version which uses truncated_normal for initialization + # cf https://github.com/pytorch/pytorch/pull/5617 + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.bias is not None: + nn.init.zeros_(module.bias) + elif isinstance(module, nn.Embedding): + nn.init.normal_(module.weight, mean=0.0, std=self.config.initializer_range) + if module.padding_idx is not None: + module.weight.data[module.padding_idx].zero_() + + if rescale_prenorm_residual: + # Reinitialize selected weights subject to the OpenAI GPT-2 Paper Scheme: + # > A modified initialization which accounts for the accumulation on the residual path with model depth. Scale + # > the weights of residual layers at initialization by a factor of 1/√N where N is the # of residual layers. 
+ # > -- GPT-2 :: https://openai.com/blog/better-language-models/ + # + # Reference (Megatron-LM): https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py + for name, p in module.named_parameters(): + if name in ["o_proj.weight", "down_proj.weight"]: + # Special Scaled Initialization --> There are 2 Layer Norms per Transformer Block + # Following Pytorch init, except scale by 1/sqrt(2 * n_layer) + # We need to reinit p since this code could be called multiple times + # Having just p *= scale would repeatedly scale it down + with torch.no_grad(): + p /= math.sqrt(num_residuals_per_layer * self.config.num_hidden_layers) + + +class TransformerModel(TransformerPreTrainedModel): + + def __init__(self, config: TransformerConfig): + super().__init__(config) + self.padding_idx = config.pad_token_id + self.vocab_size = config.vocab_size + + self.embeddings = nn.Embedding(config.vocab_size, config.hidden_size, self.padding_idx) + self.layers = nn.ModuleList([TransformerBlock(config, layer_idx) for layer_idx in range(config.num_hidden_layers)]) + self.norm = RMSNorm(config.hidden_size, eps=config.norm_eps) + + self.gradient_checkpointing = False + + self.post_init() + + def get_input_embeddings(self): + return self.embeddings + + def set_input_embeddings(self, value): + self.embeddings = value + + def forward( + self, + input_ids: Optional[torch.LongTensor] = None, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[List[torch.FloatTensor]] = None, + inputs_embeds: Optional[torch.FloatTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None + ) -> Union[Tuple, CausalLMOutputWithPast]: + if output_attentions: + warnings.warn( + "`TransformerModel` does not support output attention weights now, so `output_attentions` is set to `False`." + ) + output_attentions = False + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + use_cache = use_cache if use_cache is not None else (self.config.use_cache if not self.training else False) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + # retrieve input_ids and inputs_embeds + if input_ids is not None and inputs_embeds is not None: + raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time") + elif input_ids is None and inputs_embeds is None: + raise ValueError("You have to specify either input_ids or inputs_embeds") + + if use_cache and not isinstance(past_key_values, Cache): + past_key_values = Cache.from_legacy_cache(past_key_values) + + if inputs_embeds is None: + inputs_embeds = self.embeddings(input_ids) + + # embed positions + hidden_states = inputs_embeds + + if self.gradient_checkpointing and self.training: + if use_cache: + logger.warning_once( + "`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`..." 
+ ) + use_cache = False + + all_hidden_states = () if output_hidden_states else None + all_attns = () if output_attentions else None + next_cache = None + + for layer in self.layers: + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if self.gradient_checkpointing and self.training: + layer_outputs = self._gradient_checkpointing_func( + layer.__call__, + hidden_states, + attention_mask, + past_key_values, + output_attentions, + use_cache + ) + else: + layer_outputs = layer( + hidden_states, + attention_mask=attention_mask, + past_key_values=past_key_values, + output_attentions=output_attentions, + use_cache=use_cache + ) + + hidden_states = layer_outputs[0] + + if use_cache: + next_cache = layer_outputs[2 if output_attentions else 1] + + if output_attentions: + all_attns += (layer_outputs[1],) + + hidden_states = self.norm(hidden_states) + + # add hidden states from the last decoder layer + if output_hidden_states: + all_hidden_states += (hidden_states,) + + if not return_dict: + return tuple(v for v in [hidden_states, next_cache, all_hidden_states, all_attns] if v is not None) + + return BaseModelOutputWithPast( + last_hidden_state=hidden_states, + past_key_values=next_cache, + hidden_states=all_hidden_states, + attentions=all_attns + ) + + +class TransformerForCausalLM(TransformerPreTrainedModel, GenerationMixin): + + _tied_weights_keys = ["lm_head.weight"] + + def __init__(self, config): + super().__init__(config) + self.model = TransformerModel(config) + self.vocab_size = config.vocab_size + self.lm_head = nn.Linear(config.hidden_size, config.vocab_size, bias=False) + + # Initialize weights and apply final processing + self.post_init() + + def get_input_embeddings(self): + return self.model.embeddings + + def set_input_embeddings(self, value): + self.model.embeddings = value + + def get_output_embeddings(self): + return self.lm_head + + def set_output_embeddings(self, new_embeddings): + self.lm_head = new_embeddings + + def set_decoder(self, decoder): + self.model = decoder + + def get_decoder(self): + return self.model + + def prepare_inputs_for_generation( + self, + input_ids: torch.LongTensor = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + attention_mask: Optional[torch.Tensor] = None, + inputs_embeds: Optional[torch.Tensor] = None, + use_cache: bool = True, + num_logits_to_keep: Optional[int] = None, + **kwargs + ): + # only last token for `inputs_ids` if the `past_key_values` is passed along. + if past_key_values is not None: + input_ids = input_ids[:, -1:] + # if `inputs_embeds` are passed, we only want to use them in the 1st generation step + if inputs_embeds is not None and past_key_values is None: + model_inputs = {'inputs_embeds': inputs_embeds} + else: + # The `contiguous()` here is necessary to have a static stride during decoding. torchdynamo otherwise + # recompiles graphs as the stride of the inputs is a guard. + # Ref: https://github.com/huggingface/transformers/pull/29114 + # TODO: use `next_tokens` directly instead. 
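+            # Unlike the recurrent models above, this cache grows with the sequence:
+            # `Cache.update` concatenates the new key/value states, or rolls them in
+            # place when a sliding `window_size` is configured (see fla/models/utils.py).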
+ model_inputs = {'input_ids': input_ids.contiguous()} + + if num_logits_to_keep is not None: + model_inputs['num_logits_to_keep'] = num_logits_to_keep + + model_inputs.update({ + 'past_key_values': past_key_values, + 'use_cache': use_cache, + 'attention_mask': attention_mask, + 'num_logits_to_keep': num_logits_to_keep, + }) + return model_inputs + + def forward( + self, + input_ids: torch.LongTensor = None, + attention_mask: Optional[torch.Tensor] = None, + past_key_values: Optional[Union[Cache, List[torch.FloatTensor]]] = None, + inputs_embeds: Optional[torch.FloatTensor] = None, + labels: Optional[torch.LongTensor] = None, + use_cache: Optional[bool] = None, + output_attentions: Optional[bool] = None, + output_hidden_states: Optional[bool] = None, + return_dict: Optional[bool] = None, + num_logits_to_keep: Optional[int] = 0 + ) -> Union[Tuple, CausalLMOutputWithPast]: + output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions + output_hidden_states = ( + output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states + ) + return_dict = return_dict if return_dict is not None else self.config.use_return_dict + + outputs = self.model( + input_ids=input_ids, + attention_mask=attention_mask, + past_key_values=past_key_values, + inputs_embeds=inputs_embeds, + use_cache=use_cache, + output_attentions=output_attentions, + output_hidden_states=output_hidden_states, + return_dict=return_dict + ) + + hidden_states = outputs[0] + fuse_linear_and_cross_entropy = self.config.fuse_cross_entropy and self.training + logits = None if fuse_linear_and_cross_entropy else self.lm_head(hidden_states[:, -num_logits_to_keep:]) + + loss = None + if labels is not None: + if self.config.fuse_cross_entropy: + if fuse_linear_and_cross_entropy: + loss_fct = FusedLinearCrossEntropyLoss() + else: + loss_fct = FusedCrossEntropyLoss(inplace_backward=True) + else: + loss_fct = nn.CrossEntropyLoss() + # Enable model parallelism + labels = labels.to(hidden_states.device) + labels = torch.cat((labels[..., 1:], torch.full_like(labels[:, :1], loss_fct.ignore_index)), 1) + if fuse_linear_and_cross_entropy: + loss = loss_fct(hidden_states.view(-1, self.config.hidden_size), + labels.view(-1), + self.lm_head.weight, + self.lm_head.bias) + else: + loss = loss_fct(logits.view(-1, self.config.vocab_size), labels.view(-1)) + + if not return_dict: + output = (logits,) + outputs[1:] + return (loss,) + output if loss is not None else output + + return CausalLMOutputWithPast( + loss=loss, + logits=logits, + past_key_values=outputs.past_key_values, + hidden_states=outputs.hidden_states, + attentions=outputs.attentions, + ) diff --git a/fla/models/utils.py b/fla/models/utils.py new file mode 100644 index 0000000000000000000000000000000000000000..ed6fa9002feee9bc172ada761481a56bc4c98cc1 --- /dev/null +++ b/fla/models/utils.py @@ -0,0 +1,143 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +from typing import Any, Dict, List, Optional, Tuple + +import torch +import transformers + + +class Cache(transformers.cache_utils.Cache): + """ + A cache used for storing hidden states produced by flash linear attention models. + + It stores the states of each layer as the tensor of shape `[batch_size, key_dim, value_dim]`. 
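+
+    Each layer entry is a dict holding (possibly `None`) `recurrent_state`,
+    `attn_state`, `conv_state` and `ffn_state` tensors; see `update` below.
+    A minimal sketch of the intended usage, assuming hypothetical key/value
+    tensors `k`/`v` of shape `[batch_size, num_heads, seq_len, head_dim]`:
+
+    ```python
+    >>> cache = Cache()
+    >>> state = cache.update(attn_state=(k, v), layer_idx=0, offset=k.shape[-2],
+    ...                      cache_kwargs={'window_size': None})
+    >>> cache.get_seq_length()  # == k.shape[-2]
+    ```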
+ """ + + def __init__( + self, + seen_tokens: int = 0 + ) -> Cache: + + self.states: List[Dict[str, Any]] = [] + + self._seen_tokens = seen_tokens # Used in `generate` to keep tally of how many tokens the cache has seen + + def __getitem__(self, layer_idx: int) -> Dict[str, Any]: + if layer_idx < len(self): + return self.states[layer_idx] + else: + raise KeyError(f"Cache only has {len(self)} layers, attempted to access layer with index {layer_idx}") + + def __iter__(self): + for state in self.states: + yield state + + def __len__(self): + return len(self.states) + + def update( + self, + recurrent_state: torch.Tensor = None, + attn_state: Tuple[torch.Tensor, torch.Tensor] = None, + conv_state: Tuple[torch.Tensor] = None, + ffn_state: torch.Tensor = None, + layer_idx: int = 0, + offset: Optional[int] = 1, + cache_kwargs: Optional[Dict[str, Any]] = None, + ) -> Dict[str, Any]: + """ + Updates the cache with the new `recurrent_state`/`attn_state`/`conv_state` for the layer `layer_idx`. + + Args: + recurrent_state (`torch.Tensor`, `optional`): + The new recurrent state to cache. + attn_state (`Tuple[torch.Tensor, torch.Tensor]`, `optional`): + The new attention key/value states to cache. + conv_state (`Tuple[torch.Tensor]`, `optional`): + The new convolution state to cache. + layer_idx (`int`, defaults to 0): + The index of the layer to cache the states for. + offset (`int`, `optional`, defaults to 1): + The number of new tokens being processed. + cache_kwargs (`Dict[str, Any]`, `optional`): + Additional arguments for the cache subclass. + + Return: + Dictionary of the updated state. + """ + + # Update the number of seen tokens + if layer_idx == 0: + self._seen_tokens += offset + + if attn_state is not None: + input_size = attn_state[0].shape[-2] + window_size = cache_kwargs.get('window_size', None) + if not isinstance(attn_state, Tuple) or len(attn_state) != 2: + raise ValueError("`attn_state` must be a tuple of two tensors for key/value states") + if len(self.states) <= layer_idx: + if attn_state is not None: + if window_size is not None and input_size > window_size: + attn_state = (attn_state[0][..., -window_size:, :].contiguous(), + attn_state[1][..., -window_size:, :].contiguous()) + state = dict( + recurrent_state=recurrent_state, + attn_state=attn_state, + conv_state=conv_state, + ffn_state=ffn_state + ) + self.states.append(state) + else: + state = self.states[layer_idx] + if recurrent_state is not None: + state['recurrent_state'] = recurrent_state + if attn_state is not None: + key_state, value_state = state['attn_state'] + if window_size is not None and key_state.shape[-2] == window_size: + # DO NOT allocate new memory if the cache is full + # roll the key/value states to the left by `input_size` + key_state = key_state.roll(-input_size, -2) + value_state = value_state.roll(-input_size, -2) + # replace the last `input_size` tokens with the new key/value states + key_state[..., -input_size:, :] = attn_state[0] + value_state[..., -input_size:, :] = attn_state[1] + attn_state = (key_state, value_state) + else: + attn_state = (torch.cat([key_state, attn_state[0]], -2), + torch.cat([value_state, attn_state[1]], -2),) + state['attn_state'] = attn_state + if conv_state is not None: + state['conv_state'] = conv_state + if ffn_state is not None: + state['ffn_state'] = ffn_state + + return state + + def get_seq_length(self, layer_idx: Optional[int] = 0) -> int: + """Returns the sequence length of the cached states. 
+
+    def get_seq_length(self, layer_idx: Optional[int] = 0) -> int:
+        """Returns the sequence length of the cached states.
+        A layer index can be optionally passed."""
+        if len(self.states) <= layer_idx:
+            return 0
+        return self._seen_tokens
+
+    def get_max_length(self) -> Optional[int]:
+        """Returns the maximum sequence length of the cached states. Cache does not have a maximum length."""
+        return None
+
+    def to_legacy_cache(self) -> Tuple:
+        return tuple(self.states)
+
+    @classmethod
+    def from_legacy_cache(
+        cls,
+        past_key_values: Optional[Tuple] = None,
+        seen_tokens: int = 0
+    ) -> Cache:
+        """Converts a cache in the legacy cache format into an equivalent `Cache`."""
+
+        cache = cls(seen_tokens)
+        if past_key_values is not None:
+            for layer_idx in range(len(past_key_values)):
+                cache.states.append(past_key_values[layer_idx])
+        return cache
diff --git a/fla/modules/__init__.py b/fla/modules/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..12ed06c29b480e04228b3799c089930f5e79b94f
--- /dev/null
+++ b/fla/modules/__init__.py
@@ -0,0 +1,22 @@
+# -*- coding: utf-8 -*-
+
+from fla.modules.convolution import (ImplicitLongConvolution, LongConvolution,
+                                     ShortConvolution)
+from fla.modules.fused_cross_entropy import FusedCrossEntropyLoss
+from fla.modules.fused_kl_div import FusedKLDivLoss
+from fla.modules.fused_linear_cross_entropy import FusedLinearCrossEntropyLoss
+from fla.modules.fused_norm_gate import (FusedLayerNormSwishGate,
+                                         FusedLayerNormSwishGateLinear,
+                                         FusedRMSNormSwishGate,
+                                         FusedRMSNormSwishGateLinear)
+from fla.modules.layernorm import (GroupNorm, GroupNormLinear, LayerNorm,
+                                   LayerNormLinear, RMSNorm, RMSNormLinear)
+from fla.modules.rotary import RotaryEmbedding
+
+__all__ = [
+    'ImplicitLongConvolution', 'LongConvolution', 'ShortConvolution',
+    'FusedCrossEntropyLoss', 'FusedLinearCrossEntropyLoss', 'FusedKLDivLoss',
+    'GroupNorm', 'GroupNormLinear', 'LayerNorm', 'LayerNormLinear', 'RMSNorm', 'RMSNormLinear',
+    'FusedLayerNormSwishGate', 'FusedLayerNormSwishGateLinear', 'FusedRMSNormSwishGate', 'FusedRMSNormSwishGateLinear',
+    'RotaryEmbedding'
+]
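Taken together, the `Cache` above can be driven by hand; a minimal sketch (the state shape is illustrative, and the import path assumes the `fla/models/utils.py` layout from this diff):

```python
import torch
from fla.models.utils import Cache

cache = Cache()
# Prefill: store one layer's recurrent state for 4 tokens.
state = cache.update(recurrent_state=torch.zeros(1, 64, 64), layer_idx=0, offset=4)
assert len(cache) == 1 and cache.get_seq_length() == 4
assert state['recurrent_state'].shape == (1, 64, 64)

# Decoding: each later call overwrites the stored state in place.
cache.update(recurrent_state=torch.randn(1, 64, 64), layer_idx=0, offset=1)
assert cache.get_seq_length() == 5
```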
diff --git a/fla/modules/activations.py b/fla/modules/activations.py
new file mode 100644
index 0000000000000000000000000000000000000000..c4dd608b453ac39d28d64b729fb24ad4e31ca0ef
--- /dev/null
+++ b/fla/modules/activations.py
@@ -0,0 +1,434 @@
+# -*- coding: utf-8 -*-
+
+# Copyright (c) 2023-2024, Tri Dao, Yu Zhang, Songlin Yang.
+
+import torch
+import torch.nn.functional as F
+import triton
+import triton.language as tl
+
+import fla.modules.fused_bitlinear as fused_bitlinear
+from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous
+
+sigmoid_fwd_codestring = """
+template <typename T> T sigmoid_fwd(T x) {
+    return 1.0f / (1.0f + ::exp(-float(x)));
+}
+"""
+sigmoid_bwd_codestring = """
+template <typename T> T sigmoid_bwd(T x, T g) {
+    float x_sigmoid = 1.0f / (1.0f + ::exp(-float(x)));
+    return float(g) * x_sigmoid * (1.0f - x_sigmoid);
+}
+"""
+
+sigmoid_fwd = torch.cuda.jiterator._create_jit_fn(sigmoid_fwd_codestring)
+sigmoid_bwd = torch.cuda.jiterator._create_jit_fn(sigmoid_bwd_codestring)
+
+
+class SigmoidFunction(torch.autograd.Function):
+
+    @staticmethod
+    def forward(ctx, x):
+        ctx.save_for_backward(x)
+        return sigmoid_fwd(x)
+
+    @staticmethod
+    def backward(ctx, dout):
+        x, = ctx.saved_tensors
+        return sigmoid_bwd(x, dout)
+
+
+sigmoid = SigmoidFunction.apply
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({}, num_warps=1),
+        triton.Config({}, num_warps=2),
+        triton.Config({}, num_warps=4),
+        triton.Config({}, num_warps=8),
+        triton.Config({}, num_warps=16),
+        triton.Config({}, num_warps=32)
+    ],
+    key=['D']
+)
+@triton.jit
+def logsigmoid_fwd_kernel(
+    x,
+    y,
+    temperature,
+    T: tl.constexpr,
+    D: tl.constexpr,
+    B: tl.constexpr
+):
+    i = tl.program_id(0)
+    o_i = i * B + tl.arange(0, B)
+    m_i = o_i < T
+
+    b_x = tl.load(x + o_i, mask=m_i, other=0.).to(tl.float32)
+    b_m = tl.minimum(0., b_x)
+    b_z = 1. + tl.exp(-tl.abs(b_x))
+    b_y = (b_m - tl.log(b_z)) / temperature
+    tl.store(y + o_i, b_y.to(y.dtype.element_ty), mask=m_i)
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({}, num_warps=1),
+        triton.Config({}, num_warps=2),
+        triton.Config({}, num_warps=4),
+        triton.Config({}, num_warps=8),
+        triton.Config({}, num_warps=16),
+        triton.Config({}, num_warps=32)
+    ],
+    key=['D']
+)
+@triton.jit
+def logsigmoid_bwd_kernel(
+    x,
+    dx,
+    dy,
+    temperature,
+    T: tl.constexpr,
+    D: tl.constexpr,
+    B: tl.constexpr
+):
+    i = tl.program_id(0)
+    o_i = i * B + tl.arange(0, B)
+    m_i = o_i < T
+
+    b_x = tl.load(x + o_i, mask=m_i, other=0.).to(tl.float32)
+    b_dy = tl.load(dy + o_i, mask=m_i, other=0.).to(tl.float32)
+    b_dx = b_dy * (1. - tl.sigmoid(b_x)) / temperature
+    tl.store(dx + o_i, b_dx.to(dx.dtype.element_ty), mask=m_i)
+
+
+def logsigmoid_fwd(x: torch.Tensor, temperature: float = 1.) -> torch.Tensor:
+    T, D = x.numel(), x.shape[-1]
+    B = triton.next_power_of_2(triton.cdiv(T, torch.cuda.get_device_properties(x.device).multi_processor_count))
+    y = torch.empty_like(x)
+    logsigmoid_fwd_kernel[(triton.cdiv(T, B),)](
+        x=x,
+        y=y,
+        temperature=temperature,
+        T=T,
+        D=D,
+        B=B
+    )
+    return y
+
+
+def logsigmoid_bwd(x: torch.Tensor, dy: torch.Tensor, temperature: float = 1.) -> torch.Tensor:
+    T, D = x.numel(), x.shape[-1]
+    B = triton.next_power_of_2(triton.cdiv(T, torch.cuda.get_device_properties(x.device).multi_processor_count))
+    dx = torch.empty_like(x)
+    logsigmoid_bwd_kernel[(triton.cdiv(T, B),)](
+        x=x,
+        dx=dx,
+        dy=dy,
+        temperature=temperature,
+        T=T,
+        D=D,
+        B=B
+    )
+    return dx
+
+
+class LogSigmoidFunction(torch.autograd.Function):
+
+    @staticmethod
+    @contiguous
+    def forward(ctx, x, temperature):
+        ctx.save_for_backward(x,)
+        ctx.temperature = temperature
+        return logsigmoid_fwd(x, temperature)
+
+    @staticmethod
+    @contiguous
+    def backward(ctx, dy):
+        x, = ctx.saved_tensors
+        return logsigmoid_bwd(x, dy, ctx.temperature), None
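From the kernel above, `logsigmoid` simply divides the usual log-sigmoid by `temperature`, so it should match PyTorch's reference up to that rescaling. A quick check (CUDA-only, since the kernels are Triton):

```python
import torch
import torch.nn.functional as F
from fla.modules.activations import logsigmoid

x = torch.randn(1024, 32, device='cuda')
assert torch.allclose(logsigmoid(x, 1.0), F.logsigmoid(x), atol=1e-6)
assert torch.allclose(logsigmoid(x, 2.0), F.logsigmoid(x) / 2, atol=1e-6)
```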
+
+
+def logsigmoid(x: torch.Tensor, temperature: float = 1.) -> torch.Tensor:
+    return LogSigmoidFunction.apply(x, temperature)
+
+
+swish_fwd_codestring = """
+template <typename T> T swish_fwd(T x) {
+    float x_sigmoid = 1.0f / (1.0f + ::exp(-float(x)));
+    return float(x) * x_sigmoid;
+}
+"""
+swish_bwd_codestring = """
+template <typename T> T swish_bwd(T x, T g) {
+    float x_sigmoid = 1.0f / (1.0f + ::exp(-float(x)));
+    return float(g) * x_sigmoid * (1.0f - float(x) * x_sigmoid + float(x));
+}
+"""
+
+swish_fwd = torch.cuda.jiterator._create_jit_fn(swish_fwd_codestring)
+swish_bwd = torch.cuda.jiterator._create_jit_fn(swish_bwd_codestring)
+
+
+class SwishFunction(torch.autograd.Function):
+
+    @staticmethod
+    def forward(ctx, x):
+        ctx.save_for_backward(x)
+        return swish_fwd(x)
+
+    @staticmethod
+    def backward(ctx, dout):
+        x, = ctx.saved_tensors
+        return swish_bwd(x, dout)
+
+
+swish = SwishFunction.apply
+
+# 1/sqrt(2*pi)-> 0.3989423
+# 1/sqrt(2)   -> 0.70710678
+# sqrt(2/pi)  -> 0.79788456
+
+
+# this function is the tanh approximation of gelu
+# actual gelu is:
+# x * 0.5 * (1.0 + torch.erf(x * 0.70710678))
+@torch.jit.script
+def bias_gelu(y, bias):
+    x = bias + y
+    return (x * 0.5 * (1.0 + torch.tanh(0.79788456 * x * (1 + 0.044715 * x * x)))).to(dtype=y.dtype)
+
+
+# gradient of the tanh approximation of gelu
+# gradient of actual gelu is:
+# 0.5 * (1. + torch.erf(x * 0.70710678)) + 0.3989423 * x * torch.exp(-0.5 * x * x)
+@torch.jit.script
+def bias_gelu_bwd(g, y, bias):
+    """Assume that y has shape (B, D) and bias has shape (D)"""
+    x = bias + y
+    tanh_out = torch.tanh(0.79788456 * x * (1 + 0.044715 * x * x))
+    # sqrt(2/pi) * 3 * 0.044715 -> 0.1070322243
+    ff = 0.5 * x * ((1 - tanh_out * tanh_out) * (0.79788456 + 0.1070322243 * x * x)) + 0.5 * (
+        1 + tanh_out
+    )
+    grad_y = ff * g
+    return grad_y.to(dtype=y.dtype), grad_y.sum(dim=(0), dtype=bias.dtype)
+
+
+class GeLUFunction(torch.autograd.Function):
+
+    @staticmethod
+    # bias is an optional argument
+    def forward(ctx, input, bias):
+        ctx.save_for_backward(input, bias)
+        return bias_gelu(input, bias)
+
+    @staticmethod
+    def backward(ctx, grad_output):
+        input, bias = ctx.saved_tensors
+        # `bias_gelu_bwd` already returns the (grad_input, grad_bias) pair
+        return bias_gelu_bwd(grad_output, input, bias)
+
+
+bias_gelu_impl = GeLUFunction.apply
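`bias_gelu` above (and `gelu_fwd` just below) implement the tanh approximation described in the comments; its deviation from the exact erf-based GELU is small, and PyTorch ships the same approximation natively. A quick comparison, assuming the import path from this diff:

```python
import torch
import torch.nn.functional as F
from fla.modules.activations import bias_gelu

x = torch.linspace(-4, 4, steps=101)
approx = bias_gelu(x, torch.zeros(101))     # tanh approximation, with a zero bias
exact = F.gelu(x)                           # erf-based reference

assert (approx - exact).abs().max() < 1e-3  # the approximation is close...
# ...and coincides with PyTorch's own tanh variant.
assert torch.allclose(approx, F.gelu(x, approximate='tanh'), atol=1e-6)
```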
+
+
+# this function is the tanh approximation of gelu
+# actual gelu is:
+# x * 0.5 * (1.0 + torch.erf(x * 0.70710678))
+@torch.jit.script
+def gelu_fwd(x):
+    return (x * 0.5 * (1.0 + torch.tanh(0.79788456 * x * (1 + 0.044715 * x * x)))).to(dtype=x.dtype)
+
+
+# gradient of the tanh approximation of gelu
+# gradient of actual gelu is:
+# 0.5 * (1. + torch.erf(x * 0.70710678)) + 0.3989423 * x * torch.exp(-0.5 * x * x)
+@torch.jit.script
+def gelu_bwd(g, x):
+    tanh_out = torch.tanh(0.79788456 * x * (1 + 0.044715 * x * x))
+    # sqrt(2/pi) * 3 * 0.044715 -> 0.1070322243
+    ff = 0.5 * x * ((1 - tanh_out * tanh_out) * (0.79788456 + 0.1070322243 * x * x)) + 0.5 * (
+        1 + tanh_out
+    )
+    return (ff * g).to(dtype=x.dtype)
+
+
+class FastGeLUFunction(torch.autograd.Function):
+    @staticmethod
+    def forward(ctx, input):
+        ctx.save_for_backward(input)
+        return gelu_fwd(input)
+
+    @staticmethod
+    def backward(ctx, grad_output):
+        (input,) = ctx.saved_tensors
+        tmp = gelu_bwd(grad_output, input)
+        return tmp
+
+
+fast_gelu_impl = FastGeLUFunction.apply
+
+
+@torch.jit.script
+def relu_bwd(g, x):
+    return torch.where(x >= 0, g, 0.0).to(dtype=x.dtype)
+
+
+@torch.jit.script
+def sqrelu_fwd(x):
+    r = F.relu(x)
+    return (r * r).to(dtype=x.dtype)
+
+
+@torch.jit.script
+def sqrelu_bwd(g, x):
+    return (2.0 * g * F.relu(x)).to(dtype=x.dtype)
+
+
+class SquaredReLUFunction(torch.autograd.Function):
+
+    @staticmethod
+    def forward(ctx, input):
+        ctx.save_for_backward(input)
+        return sqrelu_fwd(input)
+
+    @staticmethod
+    def backward(ctx, grad_output):
+        input, = ctx.saved_tensors
+        return sqrelu_bwd(grad_output, input)
+
+
+sqrelu = SquaredReLUFunction.apply
+
+
+swiglu_fwd_codestring = """
+template <typename T> T swiglu_fwd(T x, T y) {
+    return float(x) * float(y) / (1.0f + ::exp(-float(x)));
+}
+"""
+swiglu_bwd_codestring = """
+template <typename T> T swiglu_bwd(T x, T y, T g, T& dx, T& dy) {
+    float x_sigmoid = 1.0f / (1.0f + ::exp(-float(x)));
+    dx = x_sigmoid * (1 + float(x) * (1.0f - x_sigmoid)) * float(g) * float(y);
+    dy = float(x) * x_sigmoid * float(g);
+}
+"""
+
+swiglu_bwd_with_output_codestring = """
+template <typename T> T swiglu_bwd_with_output(T x, T y, T g, T& dx, T& dy, T& z) {
+    float x_sigmoid = 1.0f / (1.0f + ::exp(-float(x)));
+    float x_swish = float(x) * x_sigmoid;
+    dx = x_sigmoid * (1 + float(x) * (1.0f - x_sigmoid)) * float(g) * float(y);
+    dy = x_swish * float(g);
+    z = x_swish * float(y);
+}
+"""
+
+swiglu_fwd = torch.cuda.jiterator._create_jit_fn(swiglu_fwd_codestring)
+swiglu_bwd = torch.cuda.jiterator._create_multi_output_jit_fn(swiglu_bwd_codestring, num_outputs=2)
+swiglu_bwd_with_output = torch.cuda.jiterator._create_multi_output_jit_fn(swiglu_bwd_with_output_codestring, num_outputs=3)
+
+
+class SwiGLUFunction(torch.autograd.Function):
+    r"""
+    Swish-Gated Linear Unit (SwiGLU) function.
+
+    .. math::
+        \text{SwiGLU}(x, y) = swish(x) * y = \frac{x}{1 + \exp(-x)} * y
+    """
+
+    @staticmethod
+    def forward(ctx, x, y):
+        ctx.save_for_backward(x, y)
+        return swiglu_fwd(x, y)
+
+    @staticmethod
+    def backward(ctx, dout):
+        x, y = ctx.saved_tensors
+        return swiglu_bwd(x, y, dout)
+
+
+class SwiGLULinearFunction(torch.autograd.Function):
+    r"""
+    Swish-Gated Linear Unit (SwiGLU) function followed by a linear transformation.
+
+    .. math::
+        \text{SwiGLULinear}(x, y, W, b) = (swish(x) * y) W + b
+
+    This simple wrap discards the intermediate results of SwiGLU(x, y) to save memory.
+ """ + + @staticmethod + @autocast_custom_fwd + def forward(ctx, x, y, weight, bias): + z = swiglu_fwd(x, y) + out = F.linear(z, weight, bias) + # We don't store z, will be recomputed in the backward pass to save memory + ctx.save_for_backward(x, y, weight) + ctx.linear_bias_is_none = bias is None + return out + + @staticmethod + @autocast_custom_bwd + def backward(ctx, dout, *args): + x, y, weight = ctx.saved_tensors + dout = dout.reshape(-1, dout.shape[-1]) + dz = F.linear(dout, weight.t()).view_as(x) + dx, dy, z = swiglu_bwd_with_output(x, y, dz) + dlinear_weight = torch.einsum("bo,bi->oi", dout, z.reshape(-1, z.shape[-1])) + dlinear_bias = None if ctx.linear_bias_is_none else dout.sum(0) + return dx, dy, dlinear_weight, dlinear_bias + + +class SwiGLUBitLinearFunction(torch.autograd.Function): + r""" + Swish-Gated Linear Unit (SwiGLU) function followed by a linear transformation. + + .. math:: + \text{SwiGLULinear}(x, y, W, b) = (swish(x) * y) W + b + + This simple wrap discards the intermediate results of SwiGLU(x, y) to save memory. + """ + + @staticmethod + @autocast_custom_fwd + def forward(ctx, x, y, weight, bias): + z = swiglu_fwd(x, y) + out = fused_bitlinear.bit_linear(z, weight, bias) + # We don't store z, will be recomputed in the backward pass to save memory + ctx.save_for_backward(x, y, weight) + ctx.linear_bias_is_none = bias is None + return out + + @staticmethod + @autocast_custom_bwd + def backward(ctx, dout, *args): + x, y, weight = ctx.saved_tensors + dout = dout.reshape(-1, dout.shape[-1]) + dz = fused_bitlinear.bit_linear(dout, weight.t()).view_as(x) + dx, dy, z = swiglu_bwd_with_output(x, y, dz) + dlinear_weight = torch.einsum("bo,bi->oi", dout, z.reshape(-1, z.shape[-1])) + dlinear_bias = None if ctx.linear_bias_is_none else dout.sum(0) + return dx, dy, dlinear_weight, dlinear_bias + + +swiglu = SwiGLUFunction.apply + +swiglu_linear = SwiGLULinearFunction.apply + +swiglu_bitlinear = SwiGLUBitLinearFunction.apply + +ACT2FN = { + 'relu': F.relu, + 'sigmoid': sigmoid, + 'logsigmoid': logsigmoid, + 'silu': swish, + 'swish': swish, + 'sqrelu': sqrelu, + 'gelu': fast_gelu_impl, + 'bias_gelu': bias_gelu_impl, +} diff --git a/fla/modules/convolution.py b/fla/modules/convolution.py new file mode 100644 index 0000000000000000000000000000000000000000..5a7d0228d49a14b00c886a264e6201c661d58964 --- /dev/null +++ b/fla/modules/convolution.py @@ -0,0 +1,353 @@ +# -*- coding: utf-8 -*- + +# from https://github.com/HazyResearch/zoology/blob/main/zoology/mixers/convolution.py + +import math +import warnings +from typing import Optional, Tuple + +import torch +import torch.nn as nn +import torch.nn.functional as F +from einops import rearrange + +from fla.modules.activations import ACT2FN +from fla.utils import checkpoint + +try: + from causal_conv1d import causal_conv1d_fn, causal_conv1d_update +except ImportError: + causal_conv1d_fn = None + causal_conv1d_update = None + + +def fft_conv(u, k, dropout_mask, gelu=True, k_rev=None): + seqlen = u.shape[-1] + fft_size = 2 * seqlen + k_f = torch.fft.rfft(k, n=fft_size) / fft_size + if k_rev is not None: + k_rev_f = torch.fft.rfft(k_rev, n=fft_size) / fft_size + k_f = k_f + k_rev_f.conj() + u_f = torch.fft.rfft(u.to(dtype=k.dtype), n=fft_size) + + if len(u.shape) > 3: + k_f = k_f.unsqueeze(1) + y = torch.fft.irfft(u_f * k_f, n=fft_size, norm="forward")[..., :seqlen] + + out = y + u + if gelu: + out = F.gelu(out) + if dropout_mask is not None: + return (out * rearrange(dropout_mask, "b H -> b H 1")).to(dtype=u.dtype) + else: + return 
out.to(dtype=u.dtype)
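Stripped of the residual and GELU options, `fft_conv` computes a causal convolution y[t] = sum over j <= t of k[j] * u[t - j] in O(L log L): zero-padding to twice the sequence length makes the circular FFT convolution coincide with the linear one. A sketch of that identity with the same normalization as above (sizes are illustrative):

```python
import torch

B, H, L = 2, 4, 16
u, k = torch.randn(B, H, L), torch.randn(H, L)

fft_size = 2 * L  # zero-pad so circular convolution equals linear convolution
k_f = torch.fft.rfft(k, n=fft_size) / fft_size
u_f = torch.fft.rfft(u, n=fft_size)
y = torch.fft.irfft(u_f * k_f, n=fft_size, norm='forward')[..., :L]

# Direct O(L^2) reference.
ref = torch.stack(
    [sum(k[:, j] * u[..., t - j] for j in range(t + 1)) for t in range(L)], -1
)
assert torch.allclose(y, ref, atol=1e-4)
```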
+
+
+@checkpoint
+def proj_then_conv1d(
+    x: torch.Tensor,
+    proj_weight: torch.Tensor,
+    conv1d_weight: torch.Tensor,
+    conv1d_bias: Optional[torch.Tensor] = None,
+    cache: Optional[torch.Tensor] = None
+) -> torch.Tensor:
+    # We do matmul and transpose BLH -> HBL at the same time
+    x = rearrange(proj_weight @ rearrange(x, "b t d -> d (b t)"), "d (b t) -> b d t", t=x.shape[-2])
+
+    if causal_conv1d_fn is None:
+        raise ImportError("`causal_conv1d_fn` is not available. Please install `causal-conv1d` first.")
+    if cache is None:
+        x = causal_conv1d_fn(
+            x=x,
+            weight=rearrange(conv1d_weight, "d 1 w -> d w"),
+            bias=conv1d_bias,
+            activation="silu",
+        ).transpose(1, 2)
+    else:
+        assert x.shape[-1] == 1, "Only support decoding with 1 token at a time for now"
+        x = x.squeeze(-1)
+        x = causal_conv1d_update(
+            x=x,
+            weight=rearrange(conv1d_weight, "d 1 w -> d w"),
+            bias=conv1d_bias,
+            cache=cache,
+            activation="silu",
+        )
+    return x
+
+
+class ShortConvolution(nn.Conv1d):
+    """
+    Simple wrapper around `nn.Conv1d` that accepts dimension last.
+    """
+
+    def __init__(
+        self,
+        hidden_size: int,
+        kernel_size: int,
+        bias: bool = False,
+        activation: Optional[str] = 'silu',
+        use_fast_conv1d: Optional[bool] = True
+    ):
+        super().__init__(
+            in_channels=hidden_size,
+            out_channels=hidden_size,
+            kernel_size=kernel_size,
+            groups=hidden_size,
+            bias=bias,
+            padding=kernel_size - 1
+        )
+
+        self.hidden_size = hidden_size
+        self.activation = None
+        if activation is not None:
+            assert activation in ['silu', 'swish'], f"Activation `{activation}` not supported yet."
+            self.activation = activation
+
+        if causal_conv1d_fn is None:
+            if use_fast_conv1d:
+                raise RuntimeError(
+                    "Please either install `causal-conv1d>=1.4.0` to enable fast causal short convolution CUDA kernel "
+                    "or set `use_fast_conv1d` to False"
+                )
+            else:
+                warnings.warn(
+                    "The naive PyTorch version is very slow in practice, "
+                    "please run `pip install causal-conv1d>=1.4.0` to install the fast causal short convolution CUDA kernel",
+                    category=ImportWarning
+                )
+        self.use_fast_conv1d = use_fast_conv1d
+
+    def extra_repr(self):
+        s = ('{in_channels}, {out_channels}, kernel_size={kernel_size}'
+             ', stride={stride}')
+        if self.padding != (0,) * len(self.padding):
+            s += ', padding={padding}'
+        if self.dilation != (1,) * len(self.dilation):
+            s += ', dilation={dilation}'
+        if self.output_padding != (0,) * len(self.output_padding):
+            s += ', output_padding={output_padding}'
+        if self.groups != 1:
+            s += ', groups={groups}'
+        if self.bias is None:
+            s += ', bias=False'
+        if self.padding_mode != 'zeros':
+            s += ', padding_mode={padding_mode}'
+        if self.activation is not None:
+            s += ', activation={activation}'
+        if not self.use_fast_conv1d:
+            s += ', use_fast_conv1d={use_fast_conv1d}'
+        return s.format(**self.__dict__)
+
+    def forward(
+        self,
+        x: torch.Tensor,
+        mask: Optional[torch.Tensor] = None,
+        cache: Optional[torch.Tensor] = None,
+        output_final_state: bool = False
+    ) -> Tuple[torch.Tensor, torch.Tensor]:
+        """
+        Args:
+            x (`torch.Tensor`):
+                Tensor of shape `[batch_size, seq_len, hidden_size]`
+            mask (`Optional[torch.Tensor]`):
+                Attention mask dealing with padded positions.
+            cache (`Optional[torch.Tensor]`):
+                Previous cache tensor of shape `[batch_size, hidden_size, kernel_size]`.
+                If provided, the cache is updated **inplace**.
+            output_final_state (Optional[bool]):
+                Whether to output the final state of shape `[batch_size, hidden_size, kernel_size]`. Default: `False`.
+        Returns:
+            Tensor of shape `[batch_size, seq_len, hidden_size]`.
+        """
+
+        batch_size, _, hidden_size = x.shape
+        if mask is not None:
+            x = x.mul_(mask.unsqueeze(-1))
+        if output_final_state and cache is None:
+            cache = x.new_zeros(batch_size, hidden_size, self.kernel_size[0])
+        if cache is not None and x.shape[1] == 1:
+            return self.step(x, cache)
+        x = rearrange(x, "b t d -> b d t")
+        # Update state (B D W)
+        if cache is not None:
+            cache.copy_(F.pad(x, (self.kernel_size[0] - x.shape[-1], 0)))
+        if self.use_fast_conv1d:
+            x = causal_conv1d_fn(
+                x=x,
+                weight=rearrange(self.weight, "d 1 w -> d w"),
+                bias=self.bias,
+                activation=self.activation,
+            )
+        else:
+            x = self._conv_forward(x, self.weight, self.bias)[..., :x.shape[-1]]
+            if self.activation is not None:
+                x = ACT2FN[self.activation](x)
+        return rearrange(x, "b d t -> b t d"), cache
+
+    def step(
+        self,
+        x: torch.Tensor,
+        cache: torch.Tensor
+    ):
+        assert x.shape[1] == 1, "Only support decoding with 1 token at a time for now"
+
+        x = x.squeeze(1)
+        if self.use_fast_conv1d:
+            x = causal_conv1d_update(
+                x=x,
+                conv_state=cache,
+                weight=rearrange(self.weight, "d 1 w -> d w"),
+                bias=self.bias,
+                activation=self.activation,
+            )
+        else:
+            dtype = x.dtype
+            cache.copy_(torch.roll(cache, shifts=-1, dims=-1))
+            cache[:, :, -1] = x
+            x = torch.sum(cache * rearrange(self.weight, "d 1 w -> d w"), dim=-1)
+            if self.bias is not None:
+                x = x + self.bias
+            if self.activation is not None:
+                x = ACT2FN[self.activation](x).to(dtype=dtype)
+        return x.unsqueeze(1), cache
+
+    @property
+    def state_size(self) -> int:
+        # `nn.Conv1d` stores `kernel_size` as a tuple, so index it to return an `int`
+        return self.hidden_size * self.kernel_size[0]
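A usage sketch for `ShortConvolution` (illustrative sizes; `use_fast_conv1d=False` picks the slower native path so the example runs without the optional `causal-conv1d` package):

```python
import torch
from fla.modules import ShortConvolution

conv = ShortConvolution(hidden_size=64, kernel_size=4, use_fast_conv1d=False)

x = torch.randn(2, 16, 64)                   # [batch_size, seq_len, hidden_size]
y, cache = conv(x, output_final_state=True)  # prefill
assert y.shape == (2, 16, 64)
assert cache.shape == (2, 64, 4)             # [batch_size, hidden_size, kernel_size]

# Decoding: given a cache and a single token, `forward` dispatches to `step`.
y1, cache = conv(torch.randn(2, 1, 64), cache=cache)
assert y1.shape == (2, 1, 64)
```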
+
+
+class LongConvolution(nn.Module):
+    """
+    LongConvolution applies a convolution operation on the input tensor using a fixed
+    filter of length max_len.
+    The filter is learned during training and is applied using FFT convolution.
+    Args:
+        hidden_size (int): The number of expected features in the input and output.
+        max_len (int): The maximum sequence length.
+    Returns:
+        y: [batch_size, seq_len, hidden_size] tensor
+    """
+
+    def __init__(
+        self,
+        hidden_size: int,
+        max_len: int,
+        **kwargs,
+    ):
+        """
+        Initializes the LongConvolution module.
+        Args:
+            hidden_size (int): The number of expected features in the input and output.
+            max_len (int): The maximum sequence length.
+        """
+        super().__init__()
+        self.hidden_size = hidden_size
+        self.filter = nn.Parameter(torch.randn(self.hidden_size, max_len), requires_grad=True)
+
+    def forward(self, x: torch.Tensor, *args, **kwargs):
+        """
+        Applies the LongConvolution operation on the input tensor.
+        Args:
+            x: [batch_size, seq_len, hidden_size] tensor
+        Returns:
+            y: [batch_size, seq_len, hidden_size] tensor
+        """
+        x = x.transpose(1, 2)
+        y = fft_conv(x, self.filter, dropout_mask=None, gelu=False)
+        y = y.transpose(1, 2)
+        return y.to(dtype=x.dtype)
+
+
+class PositionalEmbedding(nn.Module):
+    def __init__(self, emb_dim: int, seq_len: int, **kwargs):
+        """Complex exponential positional embeddings for implicit long convolution filters."""
+        super().__init__()
+
+        self.seq_len = seq_len
+        # The time embedding fed to the filters is normalized so that t_f = 1
+        t = torch.linspace(0, 1, self.seq_len)[None, :, None]  # 1, L, 1
+
+        if emb_dim > 1:
+            bands = (emb_dim - 1) // 2
+        # To compute the right embeddings we use the "proper" linspace
+        t_rescaled = torch.linspace(0, seq_len - 1, seq_len)[None, :, None]
+        w = 2 * math.pi * t_rescaled / seq_len  # 1, L, 1
+
+        f = torch.linspace(1e-4, bands - 1, bands)[None, None]
+        z = torch.exp(-1j * f * w)
+        z = torch.cat([t, z.real, z.imag], dim=-1)
+        self.z = nn.Parameter(z, requires_grad=False)
+
+    def forward(self, L):
+        return self.z[:, :L]
+
+
+class ImplicitLongConvolution(nn.Module):
+    """
+    Long convolution with implicit filter parameterized by an MLP.
+
+    Args:
+        hidden_size (int):
+            The number of expected features in the input and output.
+        max_len (int):
+            The maximum sequence length.
+        d_emb (Optional[int]):
+            The dimension of the positional embeddings. Must be odd and greater or equal to 3 (time, sine and cosine).
+            Defaults to 3.
+        d_hidden (Optional[int]):
+            The number of features in the hidden layer of the MLP. Defaults to 16.
+
+    Attributes:
+        pos_emb (`PositionalEmbedding`): The positional embedding layer.
+        mlp (`nn.Sequential`): The MLP that parameterizes the implicit filter.
+
+    """
+
+    def __init__(
+        self,
+        hidden_size: int,
+        max_len: int,
+        d_emb: int = 3,
+        d_hidden: int = 16,
+        **kwargs,
+    ):
+        """
+        Long convolution with implicit filter parameterized by an MLP.
+ + + """ + super().__init__() + self.hidden_size = hidden_size + self.d_emb = d_emb + + assert ( + d_emb % 2 != 0 and d_emb >= 3 + ), "d_emb must be odd and greater or equal to 3 (time, sine and cosine)" + self.pos_emb = PositionalEmbedding(d_emb, max_len) + + # final linear layer + self.mlp = nn.Sequential( + nn.Linear(d_emb, d_hidden), + torch.nn.ReLU(), + nn.Linear(d_hidden, hidden_size), + ) + + def filter(self, seq_len: int, *args, **kwargs): + k = self.mlp(self.pos_emb(seq_len)) + + return k.transpose(1, 2) + + def forward(self, x: torch.Tensor, *args, **kwargs): + """ + Args: + x: [batch_size, seq_len, hidden_size] tensor + Returns: + y: [batch_size, seq_len, hidden_size] tensor + """ + x = x.transpose(1, 2) + k = self.filter(x.shape[-1]) + y = fft_conv(x, k, dropout_mask=None, gelu=False) + + y = y.transpose(1, 2) + return y.to(dtype=x.dtype) diff --git a/fla/modules/feature_map.py b/fla/modules/feature_map.py new file mode 100644 index 0000000000000000000000000000000000000000..6af81e74d3975f67b8df23c1dfa60cd01b5a4950 --- /dev/null +++ b/fla/modules/feature_map.py @@ -0,0 +1,300 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import math +from typing import Optional + +import torch +import torch.nn.functional as F +from torch import nn + +from fla.modules.activations import fast_gelu_impl, sigmoid, sqrelu, swish +from fla.modules.layernorm import layer_norm +from fla.utils import checkpoint + + +@checkpoint +def flatten_diag_outer_product(x, y): + z = torch.einsum("...i,...j->...ij", x, y) + N = z.size(-1) + indicies = torch.triu_indices(N, N) + return z[..., indicies[0], indicies[1]] + + +@checkpoint +def flatten_diag_outer_product_off1(x, y): + z = torch.einsum("...i,...j->...ij", x, y) + N = z.size(-1) + indicies = torch.triu_indices(N, N, 1) + indices2 = torch.arange(0, N) + return z[..., indicies[0], indicies[1]], z[..., indices2, indices2] + + +def is_power_of_2(n): + return (n & (n - 1) == 0) and n != 0 + + +class HedgehogFeatureMap(nn.Module): + + r""" + Hedgehog feature map as introduced in + `The Hedgehog & the Porcupine: Expressive Linear Attentions with Softmax Mimicry `_ + """ + + def __init__( + self, + head_dim: int + ) -> HedgehogFeatureMap: + super().__init__() + # Trainable map + self.layer = nn.Linear(head_dim, head_dim) + self.init_weights_() + + def init_weights_(self): + """Initialize trainable map as identity""" + with torch.no_grad(): + identity = torch.eye(*self.layer.weight.shape[-2:], dtype=torch.float) + self.layer.weight.copy_(identity.to(self.layer.weight)) + nn.init.zeros_(self.layer.bias) + + def forward(self, x: torch.Tensor): + x = self.layer(x) # shape b, h, l, d + return torch.cat([2*x, -2*x], dim=-1).softmax(-1) + + +class T2RFeatureMap(nn.Module): + + r""" + Simple linear mapping feature map as in + `Finetuning Pretrained Transformers into RNNs `_ + """ + + def __init__( + self, + head_dim: int, + dot_dim: int = None, + bias: Optional[bool] = False + ) -> T2RFeatureMap: + super().__init__() + # Trainable map + if dot_dim is None: + dot_dim = head_dim + + self.head_dim = head_dim + self.dot_dim = dot_dim + self.bias = bias + + self.layer = nn.Linear(head_dim, dot_dim, bias=bias) + + def __repr__(self) -> str: + return f"{self.__class__.__name__}(head_dim={self.head_dim}, dot_dim={self.dot_dim}, bias={self.bias})" + + def forward(self, x: torch.Tensor): + return self.layer(x).relu() + + +class DPFPFeatureMap(nn.Module): + + r""" + Deterministic Parameter-Free Projection (DPFP) feature map in + `Linear Transformers Are 
Secretly Fast Weight Programmers <https://arxiv.org/abs/2102.11174>`_
+    """
+
+    def __init__(
+        self,
+        head_dim: int,
+        nu: int = 4
+    ) -> DPFPFeatureMap:
+        super().__init__()
+        self.nu = nu
+
+    def forward(self, x: torch.Tensor):
+        x = torch.cat([x.relu(), -x.relu()], dim=-1)
+        x_rolled = torch.cat([x.roll(shifts=j, dims=-1) for j in range(1, self.nu+1)], dim=-1)
+        x_repeat = torch.cat([x] * self.nu, dim=-1)
+        return x_repeat * x_rolled
+
+
+class HadamardFeatureMap(nn.Module):
+    def __init__(
+        self,
+        head_dim: int
+    ) -> HadamardFeatureMap:
+        super().__init__()
+        # Trainable map
+        self.layer1 = nn.Linear(head_dim, head_dim)
+        self.layer2 = nn.Linear(head_dim, head_dim)
+
+    def forward(self, x: torch.Tensor):
+        return self.layer1(x) * self.layer2(x)
+
+
+class LearnableOuterProductFeatureMap(nn.Module):
+    def __init__(
+        self,
+        head_dim: int,
+        feature_dim: int
+    ) -> LearnableOuterProductFeatureMap:
+        super().__init__()
+        # Trainable map
+        self.layer1 = nn.Linear(head_dim, feature_dim, bias=False)
+        self.layer2 = nn.Linear(head_dim, feature_dim, bias=False)
+        self.normalizer = feature_dim ** -0.5
+
+    def forward(self, x: torch.Tensor):
+        return flatten_diag_outer_product(self.layer1(x), self.layer2(x))
+
+
+class LearnablePolySketchNonNegativeFeatureMap(nn.Module):
+
+    def __init__(
+        self,
+        head_dim: int,
+        sketch_size: Optional[int] = None,
+        degree: Optional[int] = 2
+    ) -> LearnablePolySketchNonNegativeFeatureMap:
+        super().__init__()
+
+        assert is_power_of_2(degree) and degree >= 2, f"The degree {degree} must be a power of 2"
+
+        self.head_dim = head_dim
+        self.sketch_size = sketch_size if sketch_size is not None else head_dim
+        self.degree = degree
+
+        self.gamma = nn.Parameter(torch.ones(head_dim))
+        self.beta = nn.Parameter(torch.zeros(head_dim))
+        # NOTE: the sketch layers defined here are quite different from the original paper
+        # currently we simply use linear layers without any non-linear activations
+        self.sketches1 = nn.ModuleList([
+            nn.Linear(head_dim, sketch_size, bias=False),
+            *[nn.Linear(sketch_size, sketch_size, bias=False) for _ in range(int(math.log2(self.degree)) - 2)]
+        ])
+        self.sketches2 = nn.ModuleList([
+            nn.Linear(head_dim, sketch_size, bias=False),
+            *[nn.Linear(sketch_size, sketch_size, bias=False) for _ in range(int(math.log2(self.degree)) - 2)]
+        ])
+
+    def forward(self, x: torch.Tensor):
+        # Section 2.1
+        x = layer_norm(x, self.gamma, self.beta)
+        # first map the input to sketch size with learnable parameters
+        x = self.sketches1[0](x) * self.sketches2[0](x) * self.head_dim ** -0.5
+        for i in range(1, int(math.log2(self.degree)) - 1):
+            x = self.sketches1[i](x) * self.sketches2[i](x) * self.head_dim ** -0.5
+        # do sketch mapping for log2(p) - 1 times in total
+        # do p=2 mapping to ensure non-negativity
+        return flatten_diag_outer_product(x, x)
+
+
+class TaylorFeatureMap(nn.Module):
+    def __init__(
+        self,
+        head_dim: int
+    ) -> TaylorFeatureMap:
+        super().__init__()
+        self.head_dim = head_dim
+        self.r2 = math.sqrt(2)
+        self.rd = math.sqrt(self.head_dim)
+        self.rrd = math.sqrt(self.rd)
+
+    def forward(self, x: torch.Tensor):
+        x2_1, x2_2 = flatten_diag_outer_product_off1(x, x)
+        return torch.cat([torch.ones_like(x[..., 0:1]), x / self.rrd, x2_2 / (self.rd * self.r2), x2_1 / self.rd], dim=-1)
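Two quick numerical properties of the maps above (shapes are illustrative): `HedgehogFeatureMap` yields softmax-normalized features of twice the head dimension, and `TaylorFeatureMap` is laid out so that the feature dot product reproduces the second-order Taylor expansion of exp(q.k / sqrt(d)):

```python
import torch
from fla.modules.feature_map import HedgehogFeatureMap, TaylorFeatureMap

d = 16
q = torch.randn(2, 4, 8, d)  # [batch_size, num_heads, seq_len, head_dim]
k = torch.randn(2, 4, 8, d)

feats = HedgehogFeatureMap(head_dim=d)(q)
assert feats.shape == (2, 4, 8, 2 * d)
assert torch.allclose(feats.sum(-1), torch.ones(2, 4, 8), atol=1e-5)

dot = (TaylorFeatureMap(head_dim=d)(q) * TaylorFeatureMap(head_dim=d)(k)).sum(-1)
qk = (q * k).sum(-1) / d ** 0.5
assert torch.allclose(dot, 1 + qk + qk ** 2 / 2, atol=1e-3)
```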
+
+
+class RebasedFeatureMap(nn.Module):
+
+    def __init__(
+        self,
+        head_dim: int,
+        use_gamma: Optional[bool] = True,
+        use_beta: Optional[bool] = True,
+        normalize: Optional[bool] = True
+    ) -> RebasedFeatureMap:
+        super().__init__()
+
+        self.head_dim = head_dim
+        self.use_gamma = use_gamma
+        self.use_beta = use_beta
+        self.normalize = normalize
+
+        self.gamma = None
+        self.beta = None
+        if use_gamma:
+            self.gamma = nn.Parameter(torch.ones(head_dim))
+        if use_beta:
+            self.beta = nn.Parameter(torch.zeros(head_dim))
+
+    def forward(self, x: torch.Tensor, flatten: Optional[bool] = True):
+        if self.use_beta and self.use_gamma and self.normalize:
+            x = layer_norm(x, self.gamma, self.beta)
+        elif self.normalize:
+            x = F.layer_norm(x, (self.head_dim,), self.gamma, self.beta)
+        elif self.use_gamma and self.use_beta:
+            x = torch.addcmul(self.beta, x, self.gamma)
+        elif self.use_gamma:
+            x = x.mul(self.gamma)
+        else:
+            raise RuntimeError(f"Unsupported combination of `use_gamma`, `use_beta` and `normalize`, "
+                               f"which is currently set as (`{self.use_gamma}`, `{self.use_beta}`, `{self.normalize}`)")
+        if not flatten:
+            return x
+        x2_1, x2_2 = flatten_diag_outer_product_off1(x, x)
+        # rebased uses learnable parameters to approximate any quadratic function
+        return torch.cat([x2_2 * self.head_dim ** -0.5, x2_1 * (2 / self.head_dim) ** 0.5], dim=-1)
+
+
+class ReLUFeatureMap(nn.Module):
+
+    def __init__(
+        self,
+    ) -> ReLUFeatureMap:
+        super().__init__()
+
+    def forward(self, x: torch.Tensor):
+        return F.relu(x)
+
+
+class SquaredReLUFeatureMap(nn.Module):
+
+    def __init__(
+        self,
+    ) -> SquaredReLUFeatureMap:
+        super().__init__()
+
+    def forward(self, x: torch.Tensor):
+        return sqrelu(x)
+
+
+class GELUFeatureMap(nn.Module):
+
+    def __init__(
+        self,
+    ) -> GELUFeatureMap:
+        super().__init__()
+
+    def forward(self, x: torch.Tensor):
+        return fast_gelu_impl(x)
+
+
+class SwishFeatureMap(nn.Module):
+
+    def __init__(
+        self,
+    ) -> SwishFeatureMap:
+        super().__init__()
+
+    def forward(self, x: torch.Tensor):
+        return swish(x)
+
+
+class SigmoidFeatureMap(nn.Module):
+
+    def __init__(
+        self,
+    ) -> SigmoidFeatureMap:
+        super().__init__()
+
+    def forward(self, x: torch.Tensor):
+        return sigmoid(x)
diff --git a/fla/modules/fused_bitlinear.py b/fla/modules/fused_bitlinear.py
new file mode 100644
index 0000000000000000000000000000000000000000..341abec4e0a1ad926103e5c91f91bede0379d215
--- /dev/null
+++ b/fla/modules/fused_bitlinear.py
@@ -0,0 +1,625 @@
+# -*- coding: utf-8 -*-
+
+# Implementations of BitLinear layer with fused LayerNorm and quantized Linear layer.
+# [The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits](https://arxiv.org/abs/2402.17764)
+# [Scalable MatMul-free Language Modeling](https://arxiv.org/abs/2406.02528)
+
+# Code adapted from https://github.com/ridgerchu/matmulfreellm/
+
+from __future__ import annotations
+
+import math
+
+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import triton
+import triton.language as tl
+
+from fla.modules.layernorm import RMSNorm
+from fla.utils import contiguous
+
+
+def activation_quant(x):
+    """
+    Per-token quantization to 8 bits. No grouping is needed for quantization.
+
+    Args:
+        x: An activation tensor with shape [n, d].
+
+    Returns:
+        A quantized activation tensor with shape [n, d].
+    """
+    # Compute the scale factor
+    scale = 127.0 / x.abs().max(dim=-1, keepdim=True).values.clamp_(min=1e-5)
+    # Quantize and then de-quantize the tensor
+    y = (x * scale).round().clamp_(-128, 127) / scale
+    return y
+
+
+def weight_quant(w):
+    """
+    Per-tensor quantization to 1.58 bits. No grouping is needed for quantization.
+
+    Args:
+        w: A weight tensor with shape [d, k].
+
+    Returns:
+        A quantized weight tensor with shape [d, k].
+ """ + # Compute the scale factor + scale = 1.0 / w.abs().mean().clamp_(min=1e-5) + # Quantize and then de-quantize the tensor + u = (w * scale).round().clamp_(-1, 1) / scale + return u + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32), + ], + key=["N", "HAS_RESIDUAL", "STORE_RESIDUAL_OUT", "IS_RMS_NORM", "HAS_BIAS"], +) +# @triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +# @triton.heuristics({"HAS_RESIDUAL": lambda args: args["RESIDUAL"] is not None}) +@triton.jit +def _layer_norm_fwd_quant_kernel( + X, # pointer to the input + Y, # pointer to the output + W, # pointer to the weights + B, # pointer to the biases + RESIDUAL, # pointer to the residual + RESIDUAL_OUT, # pointer to the residual + Mean, # pointer to the mean + Rstd, # pointer to the 1/std + stride_x_row, # how much to increase the pointer when moving by 1 row + stride_y_row, + stride_res_row, + stride_res_out_row, + N, # number of columns in X + eps, # epsilon to avoid division by zero + IS_RMS_NORM: tl.constexpr, + BLOCK_N: tl.constexpr, + HAS_RESIDUAL: tl.constexpr, + STORE_RESIDUAL_OUT: tl.constexpr, + HAS_WEIGHT: tl.constexpr, + HAS_BIAS: tl.constexpr +): + # Map the program id to the row of X and Y it should compute. + row = tl.program_id(0) + X += row * stride_x_row + Y += row * stride_y_row + if HAS_RESIDUAL: + RESIDUAL += row * stride_res_row + if STORE_RESIDUAL_OUT: + RESIDUAL_OUT += row * stride_res_out_row + # Compute mean and variance + cols = tl.arange(0, BLOCK_N) + x = tl.load(X + cols, mask=cols < N, other=0.0).to(tl.float32) + if HAS_RESIDUAL: + residual = tl.load(RESIDUAL + cols, mask=cols < N, other=0.0).to(tl.float32) + x += residual + if STORE_RESIDUAL_OUT: + tl.store(RESIDUAL_OUT + cols, x, mask=cols < N) + if not IS_RMS_NORM: + mean = tl.sum(x, axis=0) / N + tl.store(Mean + row, mean) + xbar = tl.where(cols < N, x - mean, 0.0) + var = tl.sum(xbar * xbar, axis=0) / N + else: + xbar = tl.where(cols < N, x, 0.0) + var = tl.sum(xbar * xbar, axis=0) / N + rstd = 1 / tl.sqrt(var + eps) + tl.store(Rstd + row, rstd) + # Normalize and apply linear transformation + mask = cols < N + if HAS_WEIGHT: + w = tl.load(W + cols, mask=mask).to(tl.float32) + if HAS_BIAS: + b = tl.load(B + cols, mask=mask).to(tl.float32) + x_hat = (x - mean) * rstd if not IS_RMS_NORM else x * rstd + + y = x_hat * w if HAS_WEIGHT else x_hat + if HAS_BIAS: + y = y + b + + # Aply quantization to the output + scale = 127.0 / tl.maximum(tl.max(tl.abs(y), 0), 1e-5) + # Quantize and then de-quantize the tensor + y = tl.math.round(y * scale) + y = tl.maximum(tl.minimum(y, 127), -128) / scale + + # Write output + tl.store(Y + cols, y, mask=mask) + + +def _layer_norm_fwd_quant( + x, weight, bias, eps, residual=None, out_dtype=None, residual_dtype=None, is_rms_norm=False +): + if residual is not None: + residual_dtype = residual.dtype + M, N = x.shape + # allocate output + y = torch.empty_like(x, dtype=x.dtype if out_dtype is None else out_dtype) + if residual is not None or (residual_dtype is not None and residual_dtype != x.dtype): + residual_out = torch.empty(M, N, device=x.device, dtype=residual_dtype) + else: + residual_out = None + mean = torch.empty((M,), dtype=torch.float32, device="cuda") if not is_rms_norm else None + rstd = torch.empty((M,), dtype=torch.float32, device="cuda") + # Less than 64KB per feature: enqueue fused kernel + 
MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(N)) + if N > BLOCK_N: + raise RuntimeError("This layer norm doesn't support feature dim >= 64KB.") + # heuristics for number of warps + with torch.cuda.device(x.device.index): + _layer_norm_fwd_quant_kernel[(M,)]( + x, + y, + weight, + bias, + residual, + residual_out, + mean, + rstd, + x.stride(0), + y.stride(0), + residual.stride(0) if residual is not None else 0, + residual_out.stride(0) if residual_out is not None else 0, + N, + eps, + is_rms_norm, + BLOCK_N, + residual is not None, + residual_out is not None, + weight is not None, + bias is not None, + ) + # residual_out is None if residual is None and residual_dtype == input_dtype + return y, mean, rstd, residual_out if residual_out is not None else x + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32), + ], + key=["N", "HAS_DRESIDUAL", "STORE_DRESIDUAL", "IS_RMS_NORM", "HAS_BIAS"], +) +# @triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +# @triton.heuristics({"HAS_DRESIDUAL": lambda args: args["DRESIDUAL"] is not None}) +# @triton.heuristics({"STORE_DRESIDUAL": lambda args: args["DRESIDUAL_IN"] is not None}) +@triton.heuristics({"RECOMPUTE_OUTPUT": lambda args: args["Y"] is not None}) +@triton.jit +def _layer_norm_bwd_kernel( + X, # pointer to the input + W, # pointer to the weights + B, # pointer to the biases + Y, # pointer to the output to be recomputed + DY, # pointer to the output gradient + DX, # pointer to the input gradient + DW, # pointer to the partial sum of weights gradient + DB, # pointer to the partial sum of biases gradient + DRESIDUAL, + DRESIDUAL_IN, + Mean, # pointer to the mean + Rstd, # pointer to the 1/std + stride_x_row, # how much to increase the pointer when moving by 1 row + stride_y_row, + stride_dy_row, + stride_dx_row, + stride_dres_row, + stride_dres_in_row, + M, # number of rows in X + N, # number of columns in X + eps, # epsilon to avoid division by zero + rows_per_program, + IS_RMS_NORM: tl.constexpr, + BLOCK_N: tl.constexpr, + HAS_DRESIDUAL: tl.constexpr, + STORE_DRESIDUAL: tl.constexpr, + HAS_WEIGHT: tl.constexpr, + HAS_BIAS: tl.constexpr, + RECOMPUTE_OUTPUT: tl.constexpr, +): + # Map the program id to the elements of X, DX, and DY it should compute. 
+ row_block_id = tl.program_id(0) + row_start = row_block_id * rows_per_program + cols = tl.arange(0, BLOCK_N) + mask = cols < N + X += row_start * stride_x_row + if HAS_DRESIDUAL: + DRESIDUAL += row_start * stride_dres_row + if STORE_DRESIDUAL: + DRESIDUAL_IN += row_start * stride_dres_in_row + DY += row_start * stride_dy_row + DX += row_start * stride_dx_row + if RECOMPUTE_OUTPUT: + Y += row_start * stride_y_row + if HAS_WEIGHT: + w = tl.load(W + cols, mask=mask).to(tl.float32) + dw = tl.zeros((BLOCK_N,), dtype=tl.float32) + if RECOMPUTE_OUTPUT and HAS_BIAS: + b = tl.load(B + cols, mask=mask, other=0.0).to(tl.float32) + if HAS_BIAS: + db = tl.zeros((BLOCK_N,), dtype=tl.float32) + row_end = min((row_block_id + 1) * rows_per_program, M) + for row in range(row_start, row_end): + # Load data to SRAM + x = tl.load(X + cols, mask=mask, other=0).to(tl.float32) + dy = tl.load(DY + cols, mask=mask, other=0).to(tl.float32) + if not IS_RMS_NORM: + mean = tl.load(Mean + row) + rstd = tl.load(Rstd + row) + # Compute dx + xhat = (x - mean) * rstd if not IS_RMS_NORM else x * rstd + xhat = tl.where(mask, xhat, 0.0) + if RECOMPUTE_OUTPUT: + y = xhat * w if HAS_WEIGHT else xhat + if HAS_BIAS: + y = y + b + + # Aply quantization to the output + scale = 127.0 / tl.maximum(tl.max(tl.abs(y), 0), 1e-5) + # Quantize and then de-quantize the tensor + y = tl.math.round(y * scale) + y = tl.maximum(tl.minimum(y, 127), -128) / scale + + tl.store(Y + cols, y, mask=mask) + wdy = dy + if HAS_WEIGHT: + wdy = dy * w + dw += dy * xhat + if HAS_BIAS: + db += dy + if not IS_RMS_NORM: + c1 = tl.sum(xhat * wdy, axis=0) / N + c2 = tl.sum(wdy, axis=0) / N + dx = (wdy - (xhat * c1 + c2)) * rstd + else: + c1 = tl.sum(xhat * wdy, axis=0) / N + dx = (wdy - xhat * c1) * rstd + if HAS_DRESIDUAL: + dres = tl.load(DRESIDUAL + cols, mask=mask, other=0).to(tl.float32) + dx += dres + # Write dx + if STORE_DRESIDUAL: + tl.store(DRESIDUAL_IN + cols, dx, mask=mask) + tl.store(DX + cols, dx, mask=mask) + + X += stride_x_row + if HAS_DRESIDUAL: + DRESIDUAL += stride_dres_row + if STORE_DRESIDUAL: + DRESIDUAL_IN += stride_dres_in_row + if RECOMPUTE_OUTPUT: + Y += stride_y_row + DY += stride_dy_row + DX += stride_dx_row + if HAS_WEIGHT: + tl.store(DW + row_block_id * N + cols, dw, mask=mask) + if HAS_BIAS: + tl.store(DB + row_block_id * N + cols, db, mask=mask) + + +def _layer_norm_bwd( + dy, + x, + weight, + bias, + eps, + mean, + rstd, + dresidual=None, + has_residual=False, + is_rms_norm=False, + x_dtype=None, + recompute_output=False, +): + M, N = x.shape + # allocate output + dx = torch.empty_like(x) if x_dtype is None else torch.empty(M, N, dtype=x_dtype, device=x.device) + dresidual_in = torch.empty_like(x) if has_residual and dx.dtype != x.dtype else None + y = torch.empty(M, N, dtype=dy.dtype, device=dy.device) if recompute_output else None + + # Less than 64KB per feature: enqueue fused kernel + MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(N)) + if N > BLOCK_N: + raise RuntimeError("This layer norm doesn't support feature dim >= 64KB.") + sm_count = torch.cuda.get_device_properties(x.device).multi_processor_count + _dw = torch.empty((sm_count, N), dtype=torch.float32, device=weight.device) if weight is not None else None + _db = torch.empty((sm_count, N), dtype=torch.float32, device=bias.device) if bias is not None else None + rows_per_program = math.ceil(M / sm_count) + grid = (sm_count,) + with torch.cuda.device(x.device.index): + _layer_norm_bwd_kernel[grid]( + x, + weight, + 
bias, + y, + dy, + dx, + _dw, + _db, + dresidual, + dresidual_in, + mean, + rstd, + x.stride(0), + 0 if not recompute_output else y.stride(0), + dy.stride(0), + dx.stride(0), + dresidual.stride(0) if dresidual is not None else 0, + dresidual_in.stride(0) if dresidual_in is not None else 0, + M, + N, + eps, + rows_per_program, + is_rms_norm, + BLOCK_N, + dresidual is not None, + dresidual_in is not None, + weight is not None, + bias is not None, + ) + dw = _dw.sum(0).to(weight.dtype) if weight is not None else None + db = _db.sum(0).to(bias.dtype) if bias is not None else None + # Don't need to compute dresidual_in separately in this case + if has_residual and dx.dtype == x.dtype: + dresidual_in = dx + return (dx, dw, db, dresidual_in) if not recompute_output else (dx, dw, db, dresidual_in, y) + + +class LayerNormLinearQuantFn(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + x, + norm_weight, + norm_bias, + linear_weight, + linear_bias, + residual=None, + eps=1e-6, + prenorm=False, + residual_in_fp32=False, + is_rms_norm=False, + ): + x_shape_og = x.shape + # reshape input data into 2D tensor + x = x.reshape(-1, x.shape[-1]) + if residual is not None: + assert residual.shape == x_shape_og + residual = residual.reshape(-1, residual.shape[-1]) + residual_dtype = residual.dtype if residual is not None else (torch.float32 if residual_in_fp32 else None) + y, mean, rstd, residual_out = _layer_norm_fwd_quant( + x, + norm_weight, + norm_bias, + eps, + residual, + out_dtype=None if not torch.is_autocast_enabled() else torch.get_autocast_gpu_dtype(), + residual_dtype=residual_dtype, + is_rms_norm=is_rms_norm, + ) + y = y.reshape(x_shape_og) + dtype = torch.get_autocast_gpu_dtype() if torch.is_autocast_enabled() else y.dtype + linear_weight = weight_quant(linear_weight).to(dtype) + linear_bias = linear_bias.to(dtype) if linear_bias is not None else None + out = F.linear(y.to(linear_weight.dtype), linear_weight, linear_bias) + # We don't store y, will be recomputed in the backward pass to save memory + ctx.save_for_backward(residual_out, norm_weight, norm_bias, linear_weight, mean, rstd) + ctx.x_shape_og = x_shape_og + ctx.eps = eps + ctx.is_rms_norm = is_rms_norm + ctx.has_residual = residual is not None + ctx.prenorm = prenorm + ctx.x_dtype = x.dtype + ctx.linear_bias_is_none = linear_bias is None + return out if not prenorm else (out, residual_out.reshape(x_shape_og)) + + @staticmethod + @contiguous + def backward(ctx, dout, *args): + x, norm_weight, norm_bias, linear_weight, mean, rstd = ctx.saved_tensors + dout = dout.reshape(-1, dout.shape[-1]) + dy = F.linear(dout, linear_weight.t()) + dlinear_bias = None if ctx.linear_bias_is_none else dout.sum(0) + assert dy.shape == x.shape + if ctx.prenorm: + dresidual = args[0] + dresidual = dresidual.reshape(-1, dresidual.shape[-1]) + assert dresidual.shape == x.shape + else: + dresidual = None + dx, dnorm_weight, dnorm_bias, dresidual_in, y = _layer_norm_bwd( + dy, + x, + norm_weight, + norm_bias, + ctx.eps, + mean, + rstd, + dresidual, + ctx.has_residual, + ctx.is_rms_norm, + x_dtype=ctx.x_dtype, + recompute_output=True + ) + dlinear_weight = torch.einsum("bo,bi->oi", dout, y) + return ( + dx.reshape(ctx.x_shape_og), + dnorm_weight, + dnorm_bias, + dlinear_weight, + dlinear_bias, + dresidual_in.reshape(ctx.x_shape_og) if ctx.has_residual else None, + None, + None, + None, + None, + ) + + +def layer_norm_linear_quant_fn( + x, + norm_weight, + norm_bias, + linear_weight, + linear_bias, + residual=None, + eps=1e-6, + 
prenorm=False,
+    residual_in_fp32=False,
+    is_rms_norm=False,
+):
+    return LayerNormLinearQuantFn.apply(
+        x,
+        norm_weight,
+        norm_bias,
+        linear_weight,
+        linear_bias,
+        residual,
+        eps,
+        prenorm,
+        residual_in_fp32,
+        is_rms_norm,
+    )
+
+
+def rms_norm_linear_quant(
+    x: torch.Tensor,
+    norm_weight: torch.Tensor,
+    norm_bias: torch.Tensor,
+    linear_weight: torch.Tensor,
+    linear_bias: torch.Tensor,
+    residual: torch.Tensor = None,
+    eps: float = 1e-5,
+    prenorm: bool = False,
+    residual_in_fp32: bool = False
+):
+    return layer_norm_linear_quant_fn(
+        x=x,
+        norm_weight=norm_weight,
+        norm_bias=norm_bias,
+        linear_weight=linear_weight,
+        linear_bias=linear_bias,
+        residual=residual,
+        eps=eps,
+        prenorm=prenorm,
+        residual_in_fp32=residual_in_fp32,
+        is_rms_norm=True
+    )
+
+
+def bit_linear(x, weight, bias=None, norm_weight=None, norm_bias=None, eps=1e-8):
+    """
+    A functional version of BitLinear that applies quantization to activations and weights.
+
+    Args:
+        x: Input tensor with shape [n, d].
+        weight: Weight tensor with shape [out_features, in_features].
+        bias: Bias tensor with shape [out_features] (optional).
+        norm_weight: Weight tensor for RMS normalization with shape [in_features].
+        norm_bias: Bias tensor for RMS normalization with shape [in_features].
+        eps: A small constant for numerical stability in normalization.
+
+    Returns:
+        Output tensor with shape [n, out_features].
+    """
+    return layer_norm_linear_quant_fn(
+        x,
+        norm_weight,
+        norm_bias,
+        weight,
+        bias,
+        is_rms_norm=True
+    )
+
+
+class BitLinear(nn.Linear):
+    """
+    A custom linear layer that applies quantization on both activations and weights.
+    This is primarily for training; kernel optimization is needed for efficiency in deployment.
+    """
+
+    def __init__(self, in_features, out_features, bias=False):
+        """
+        Initializes the BitLinear layer.
+
+        Args:
+            in_features: Size of each input sample.
+            out_features: Size of each output sample.
+            bias: If set to False, the layer will not learn an additive bias. Default: False.
+        """
+        # Initialize the superclass nn.Linear with the given parameters
+        super(BitLinear, self).__init__(in_features, out_features, bias=bias)
+
+        self.norm = RMSNorm(in_features, eps=1e-8)
+
+    def forward(self, x):
+        """
+        Overrides the forward pass to include quantization.
+
+        Args:
+            x: An input tensor with shape [n, d].
+
+        Returns:
+            An output tensor with shape [n, out_features].
+        """
+        # Weight tensor
+        w = self.weight
+
+        # Apply RMS normalization to the input
+        x_norm = self.norm(x)
+
+        # Apply quantization to both activations and weights
+        # Uses Straight-Through Estimator (STE) trick with .detach() for gradient flow
+        x_quant = x_norm + (activation_quant(x_norm) - x_norm).detach()
+        w_quant = w + (weight_quant(w) - w).detach()
+        # Perform linear operation with quantized values
+        y = F.linear(x_quant, w_quant)
+
+        return y
+
+
+class FusedBitLinear(BitLinear):
+    """
+    A custom linear layer that applies quantization on both activations and weights.
+    This is primarily for training; kernel optimization is needed for efficiency in deployment.
+    """
+
+    def __init__(self, in_features, out_features, bias=False):
+        """
+        Initializes the BitLinear layer.
+
+        Args:
+            in_features: Size of each input sample.
+            out_features: Size of each output sample.
+            bias: If set to False, the layer will not learn an additive bias. Default: False.
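The `.detach()` pattern in `BitLinear.forward` is the straight-through estimator: the forward pass sees quantized values while gradients behave as if quantization were the identity. A small sketch (toy weight; import path as in this diff):

```python
import torch
from fla.modules.fused_bitlinear import weight_quant

w = torch.tensor([[0.8, -0.05], [-1.2, 0.3]], requires_grad=True)

# Fake quantization: every entry of `weight_quant(w)` lies in {-1, 0, +1} / scale.
scale = 1.0 / w.detach().abs().mean()
assert set((weight_quant(w).detach() * scale).round().unique().tolist()) <= {-1.0, 0.0, 1.0}

# Straight-through estimator: quantized forward, identity backward.
w_quant = w + (weight_quant(w) - w).detach()
w_quant.sum().backward()
assert torch.equal(w.grad, torch.ones_like(w))
```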
+ """ + # Initialize the superclass nn.Linear with the given parameters + super(FusedBitLinear, self).__init__(in_features, out_features, bias=bias) + + def forward(self, x): + return layer_norm_linear_quant_fn( + x, + self.norm.weight, + self.norm.bias, + self.weight, + self.bias, + is_rms_norm=True + ) diff --git a/fla/modules/fused_cross_entropy.py b/fla/modules/fused_cross_entropy.py new file mode 100644 index 0000000000000000000000000000000000000000..b87c1f6ecd20f1f8dd633cb2663f7bdb66e5e79c --- /dev/null +++ b/fla/modules/fused_cross_entropy.py @@ -0,0 +1,423 @@ +# -*- coding: utf-8 -*- + +# Copyright (c) 2023, Tri Dao. + +from typing import Any, Tuple + +import torch +import torch.nn as nn +import triton +import triton.language as tl + +from fla.utils import contiguous + +# `all_gather_into_tensor` and `reduce_scatter_tensor` are new placeholders for +# `_all_gather_base` and `_reduce_scatter_base`. They require the most recent +# version of PyTorch. The following 2 lines are for backward compatibility with +# older PyTorch. +if "all_gather_into_tensor" not in dir(torch.distributed): + torch.distributed.all_gather_into_tensor = torch.distributed._all_gather_base + + +@triton.heuristics({ + "HAS_SMOOTHING": lambda args: args["label_smoothing"] > 0.0, +}) +@triton.jit +def cross_entropy_fwd_kernel( + loss_ptr, # data ptrs + lse_ptr, + z_loss_ptr, + logits_ptr, + labels_ptr, + label_smoothing, + logit_scale, + lse_square_scale, + ignore_index, + total_classes, + class_start_idx, # Useful for tensor parallel when each rank only has a subset of classes + n_cols, # shapes + n_rows, + logits_row_stride, # strides + BLOCK_SIZE: tl.constexpr, + HAS_SMOOTHING: tl.constexpr, + # if SPLIT (e.g. tensor parallel), don't include the LSE in the loss since it's not the final LSE + SPLIT: tl.constexpr, +): + row_idx = tl.program_id(0) + col_block_idx = tl.program_id(1) + logits_ptr = logits_ptr + row_idx * logits_row_stride.to(tl.int64) + col_offsets = col_block_idx * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE) + label_idx = tl.load(labels_ptr + row_idx) + logits = tl.load(logits_ptr + col_offsets, mask=col_offsets < n_cols, other=-float("inf")) + logits = logits.to(tl.float32) * logit_scale + max_logits = tl.max(logits, 0) + if HAS_SMOOTHING: + sum_logits = tl.sum(tl.where(col_offsets < n_cols, logits, 0.0), 0) + lse = tl.log(tl.sum(tl.exp(logits - max_logits), 0)) + max_logits + tl.store(lse_ptr + col_block_idx * n_rows + row_idx, lse) + if label_idx == ignore_index: + loss = 0.0 + z_loss = 0.0 + else: + label_idx -= class_start_idx + if label_idx >= col_block_idx * BLOCK_SIZE and label_idx < min( + n_cols, (col_block_idx + 1) * BLOCK_SIZE + ): + logits_label = tl.load(logits_ptr + label_idx) * logit_scale + if HAS_SMOOTHING: + loss = ( + (lse if not SPLIT else 0.0) + - label_smoothing * sum_logits / total_classes + - (1 - label_smoothing) * logits_label + ) + else: + loss = (lse if not SPLIT else 0.0) - logits_label + else: + # If label is out of bounds, we set the CE loss to 0.0. 
But we still want the label_smoothing loss + if HAS_SMOOTHING: + loss = label_smoothing * ((lse if not SPLIT else 0.0) - sum_logits / total_classes) + else: + loss = 0.0 + if not SPLIT: + z_loss = lse_square_scale * lse * lse + loss += z_loss + else: + z_loss = 0.0 + tl.store(loss_ptr + col_block_idx * n_rows + row_idx, loss) + if not SPLIT: + tl.store(z_loss_ptr + col_block_idx * n_rows + row_idx, z_loss) + + +@triton.heuristics({ + "HAS_SMOOTHING": lambda args: args["label_smoothing"] > 0.0, +}) +@triton.jit +def cross_entropy_bwd_kernel( + dlogits_ptr, # data ptrs + dloss_ptr, + logits_ptr, + lse_ptr, + labels_ptr, + label_smoothing, + logit_scale, + lse_square_scale, + ignore_index, + total_classes, + class_start_idx, # Useful for tensor parallel when each rank only has a subset of classes + n_cols, # shapes + logits_row_stride, # strides + dlogits_row_stride, + dloss_row_stride, + BLOCK_SIZE: tl.constexpr, + HAS_SMOOTHING: tl.constexpr, +): + row_idx = tl.program_id(0) + col_block_idx = tl.program_id(1) + logits_ptr = logits_ptr + row_idx * logits_row_stride.to(tl.int64) + dlogits_ptr = dlogits_ptr + row_idx * dlogits_row_stride.to(tl.int64) + col_offsets = col_block_idx * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE) + label_idx = tl.load(labels_ptr + row_idx) + if label_idx != ignore_index: + dloss = tl.load(dloss_ptr + row_idx * dloss_row_stride) + else: + dloss = 0.0 + logits = tl.load(logits_ptr + col_offsets, mask=col_offsets < n_cols, other=-float("inf")).to( + tl.float32 + ) * logit_scale + lse = tl.load(lse_ptr + row_idx) + probs = tl.exp(logits - lse) + probs += 2.0 * lse_square_scale * lse * probs + label_idx -= class_start_idx + if HAS_SMOOTHING: + smooth_negative = label_smoothing / total_classes + probs = tl.where(col_offsets == label_idx, probs - (1 - label_smoothing), probs) - smooth_negative + else: + probs = tl.where(col_offsets == label_idx, probs - 1.0, probs) + tl.store(dlogits_ptr + col_offsets, (dloss * logit_scale) * probs, mask=col_offsets < n_cols) + + +def fused_cross_entropy_forward( + logits: torch.Tensor, + target: torch.Tensor, + label_smoothing: float = 0.0, + logit_scale: float = 1.0, + lse_square_scale: float = 0.0, + ignore_index: int = -100, + process_group=None, +): + n_rows, n_cols = logits.shape + assert target.shape == (n_rows,) + world_size = 1 if process_group is None else torch.distributed.get_world_size(process_group) + total_classes = world_size * n_cols + rank = 0 if process_group is None else torch.distributed.get_rank(process_group) + class_start_idx = rank * n_cols + + if logits.stride(-1) != 1: + logits = logits.contiguous() + # Set these similar to https://github.com/openai/triton/blob/main/python/tutorials/02-fused-softmax.py + MAX_BLOCK_SIZE = 64 * 1024 + BLOCK_SIZE = min(triton.next_power_of_2(n_cols), MAX_BLOCK_SIZE) + num_warps = ( + 4 + if BLOCK_SIZE < 2048 + else (8 if BLOCK_SIZE < 8192 else (16 if BLOCK_SIZE < 128 * 1024 else 32)) + ) + # We may split the lse computation across multiple blocks, then do a reduction + # lse(local_lse) to get the final LSE. This is faster for large n_cols (e.g., > 64k) + # where having just one thread block processing more than 64k elements is slow. 
+ split = world_size > 1 or n_cols > MAX_BLOCK_SIZE + n_splits = (n_cols + BLOCK_SIZE - 1) // BLOCK_SIZE + loss_shape = (n_splits, n_rows) if n_splits > 1 else (n_rows,) + losses = torch.empty(*loss_shape, dtype=torch.float, device=logits.device) + lse = torch.empty(*loss_shape, dtype=torch.float, device=logits.device) + z_losses = torch.empty(*loss_shape, dtype=torch.float, device=logits.device) + # Need this, otherwise Triton tries to launch from cuda:0 and we get + # ValueError: Pointer argument (at 0) cannot be accessed from Triton (cpu tensor?) + with torch.cuda.device(logits.device.index): + cross_entropy_fwd_kernel[(n_rows, n_splits)]( + losses, # data ptrs + lse, + z_losses, + logits, + target, + label_smoothing, + logit_scale, + lse_square_scale, + ignore_index, + total_classes, + class_start_idx, + n_cols, # shapes + n_rows, + logits.stride(0), # strides + BLOCK_SIZE=BLOCK_SIZE, # constants + num_warps=num_warps, + SPLIT=split + ) + + if split: + # If there's no label_smoothing, if target are in the vocab of this partition, losses contains + # - predicted logit, and 0 otherwise. + # If there's label_smoothing=0.1, for target in the vocab of this partition, losses contains + # -0.9 * predicted logit - 0.1 * sum logit / total_classes. + # For target not in the vocab of this partition, losses contains + # -0.1 * sum logit / total_classes. + if n_splits > 1: + lse = torch.logsumexp(lse, dim=0) + losses = losses.sum(dim=0) + if world_size > 1: + lse_allgather = torch.empty(world_size, n_rows, dtype=lse.dtype, device=lse.device) + torch.distributed.all_gather_into_tensor(lse_allgather, lse, group=process_group) + handle_losses = torch.distributed.all_reduce( + losses, op=torch.distributed.ReduceOp.SUM, group=process_group, async_op=True + ) + lse = torch.logsumexp(lse_allgather, dim=0) + handle_losses.wait() + # After the allreduce, if there's no label_smoothing, the total losses are - predicted_logit, + # we just have to add the (global) lse. + # If there's label_smoothing=0.1, the total losses are + # -0.9 * predicted_logit - 0.1 * sum logit / total_classes. + # Again, we just have to add the (global) lse. + losses += lse + if lse_square_scale != 0.0: + z_losses = lse_square_scale * lse.square() + z_losses.masked_fill_(target == ignore_index, 0.0) + losses += z_losses + else: + z_losses = torch.zeros_like(losses) + losses.masked_fill_(target == ignore_index, 0.0) + + return losses, z_losses, lse, total_classes, class_start_idx + + +class CrossEntropyLossFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + logits, + target, + label_smoothing=0.0, + logit_scale=1.0, + lse_square_scale=0.0, + ignore_index=-100, + inplace_backward=False, + process_group=None, + ): + losses, z_losses, lse, total_classes, class_start_idx = fused_cross_entropy_forward( + logits, + target, + label_smoothing, + logit_scale, + lse_square_scale, + ignore_index, + process_group, + ) + ctx.save_for_backward(logits, lse, target) + ctx.mark_non_differentiable(z_losses) + ctx.label_smoothing = label_smoothing + ctx.logit_scale = logit_scale + ctx.lse_square_scale = lse_square_scale + ctx.ignore_index = ignore_index + ctx.total_classes = total_classes + ctx.class_start_idx = class_start_idx + ctx.inplace_backward = inplace_backward + + return losses, z_losses + + @staticmethod + @contiguous + def backward(ctx, grad_losses, grad_z_losses): + del grad_z_losses # z_losses are only for logging. 
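With the default settings (no label smoothing, no z-loss), the fused path should agree with PyTorch's own cross entropy. A toy check against the `FusedCrossEntropyLoss` module defined below (CUDA-only, since the kernels are Triton):

```python
import torch
import torch.nn.functional as F
from fla.modules import FusedCrossEntropyLoss

logits = torch.randn(8, 1000, device='cuda', requires_grad=True)
target = torch.randint(0, 1000, (8,), device='cuda')
target[0] = -100  # one ignored position

loss = FusedCrossEntropyLoss()(logits, target)
ref = F.cross_entropy(logits, target, ignore_index=-100)
assert torch.allclose(loss, ref, atol=1e-5)
```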
+ + logits, lse, target = ctx.saved_tensors + dlogits = logits if ctx.inplace_backward else torch.empty_like(logits) + n_rows, n_cols = logits.shape + BLOCK_SIZE = min(triton.next_power_of_2(n_cols), 4 * 1024) + num_warps = 4 if BLOCK_SIZE < 2048 else (8 if BLOCK_SIZE < 8192 else 16) + def grid(META): return (n_rows, triton.cdiv(n_cols, META["BLOCK_SIZE"])) # noqa + # Need this, otherwise Triton tries to launch from cuda:0 and we get + # ValueError: Pointer argument (at 0) cannot be accessed from Triton (cpu tensor?) + with torch.cuda.device(logits.device.index): + cross_entropy_bwd_kernel[grid]( + dlogits, # data ptrs + grad_losses, + logits, + lse, + target, + ctx.label_smoothing, + ctx.logit_scale, + ctx.lse_square_scale, + ctx.ignore_index, + ctx.total_classes, + ctx.class_start_idx, + n_cols, # shapes + logits.stride(0), # strides + dlogits.stride(0), + grad_losses.stride(0), + BLOCK_SIZE=BLOCK_SIZE, # constants + num_warps=num_warps, + ) + return dlogits, None, None, None, None, None, None, None, None + + +def cross_entropy_loss( + logits: torch.Tensor, + target: torch.Tensor, + label_smoothing: float = 0.0, + logit_scale: float = 1.0, + lse_square_scale: float = 0.0, + ignore_index=-100, + inplace_backward: bool = False, + process_group=None, +) -> Tuple[torch.Tensor, torch.Tensor]: + """ + Arguments: + logits: [batch, vocab_size] + target: [batch,] + label_smoothing: float + logit_scale: float. + Multiply logits by this scale before calculating the loss. + lse_square_scale: float. + If > 0, we add lse_square_scale * lse(logits) ^ 2 to the loss. + This is also referred to as "z-loss". + ignore_index: int. + If target == ignore_index, the loss is set to 0.0. + inplace_backward: bool. + If True, we do the backward pass in-place by modifying the logits. + This saves memory. + process_group: + if not None, we're doing Tensor Parallel: each process is responsible for + one part of the vocab. The loss will be aggregated across processes. + Returns: + losses: [batch,], float + z_losses: [batch,], float + """ + return CrossEntropyLossFunction.apply( + logits, + target, + label_smoothing, + logit_scale, + lse_square_scale, + ignore_index, + inplace_backward, + process_group, + ) + + +class FusedCrossEntropyLoss(nn.Module): + def __init__( + self, + ignore_index: int = -100, + reduction: str = "mean", + label_smoothing: float = 0.0, + logit_scale: float = 1.0, + lse_square_scale: float = 0.0, + inplace_backward: bool = False, + process_group: Any = None, + return_z_loss: bool = False, + ): + """ + Arguments: + ignore_index: int. If target == ignore_index, the loss is set to 0.0. + label_smoothing: float + lse_square_scale: float. If > 0, we add lse_square_scale * lse(logits) ^ 2 to the loss. + This is also referred to as "z-loss". + inplace_backward: bool. If True, we do the backward pass in-place by modifying the logits. + This saves memory. + process_group: if not None, we're doing Tensor Parallel: each process is responsible for + one part of the vocab. The loss will be aggregated across processes. + return_z_loss: bool. If True, we return the component of the loss contributed by + the lse_square_scale value. This value is only for logging and does not support + backprop. 
+ """ + super().__init__() + if reduction not in ["mean", "none", "sum"]: + raise NotImplementedError("Only support reduction = 'mean' or 'none' or 'sum'") + self.ignore_index = ignore_index + self.reduction = reduction + self.label_smoothing = label_smoothing + self.logit_scale = logit_scale + self.lse_square_scale = lse_square_scale + self.inplace_backward = inplace_backward + self.process_group = process_group + self.return_z_loss = return_z_loss + + def forward(self, input, target): + """ + Arguments: + input: (batch, vocab_size) + target: (batch,) + Returns: + losses: (batch,) if reduction is 'none', else (1,), dtype float + z_loss: (batch,) if reduction is 'none', else (1,), dtype float (if self.return_z_loss) + """ + assert input.is_cuda and target.is_cuda, "Only support CUDA tensors" + loss, z_loss = cross_entropy_loss( + input, + target, + label_smoothing=self.label_smoothing, + logit_scale=self.logit_scale, + lse_square_scale=self.lse_square_scale, + ignore_index=self.ignore_index, + inplace_backward=self.inplace_backward, + process_group=self.process_group, + ) + if self.reduction == "mean": + loss = loss.sum() / (target != self.ignore_index).sum() + elif self.reduction == "sum": + loss = loss.sum() + else: + loss = loss + + if not self.return_z_loss: + return loss + + if self.reduction == "mean": + z_loss = z_loss.sum() / (target != self.ignore_index).sum() + elif self.reduction == "sum": + z_loss = z_loss.sum() + else: + z_loss = z_loss + + return loss, z_loss diff --git a/fla/modules/fused_kl_div.py b/fla/modules/fused_kl_div.py new file mode 100644 index 0000000000000000000000000000000000000000..a69c9d6ab2cfa8a6ce2eb2d0794ddff206caef37 --- /dev/null +++ b/fla/modules/fused_kl_div.py @@ -0,0 +1,321 @@ +# -*- coding: utf-8 -*- + +from typing import Tuple + +import torch +import torch.nn as nn +import torch.nn.functional as F +import triton +import triton.language as tl + +from fla.utils import contiguous + +# The hard limit of TRITON_MAX_TENSOR_NUMEL is 1048576 +# https://github.com/triton-lang/triton/blob/ba42a5c68fd0505f8c42f4202d53be0f8d9a5fe0/python/triton/language/core.py#L19 +# However, setting limit as 65536 as in LayerNorm tutorial is faster because of less register spilling +# The optimal maximum block size depends on your hardware, your kernel, and your dtype +MAX_FUSED_SIZE = 65536 // 2 + + +@triton.jit +def kl_div_kernel( + logits, + target_logits, + loss, + s_logits, + s_loss, + reduction: tl.constexpr, + N: tl.constexpr, + V: tl.constexpr, + BV: tl.constexpr +): + # https://github.com/triton-lang/triton/issues/1058 + # If N*V is too large, i_n * stride will overflow out of int32, so we convert to int64 + i_n = tl.program_id(0).to(tl.int64) + + logits += i_n * s_logits + target_logits += i_n * s_logits + + # m is the max value. use the notation from the paper + sm, tm = float('-inf'), float('-inf') + # d is the sum. use the notation from the paper + sd, td = 0.0, 0.0 + + NV = tl.cdiv(V, BV) + for iv in range(0, NV): + o_x = iv * BV + tl.arange(0, BV) + # for student + b_sl = tl.load(logits + o_x, mask=o_x < V, other=float('-inf')) + b_sm = tl.max(b_sl) + m_new = tl.maximum(sm, b_sm) + sd = sd * tl.exp(sm - m_new) + tl.sum(tl.exp(b_sl - m_new)) + sm = m_new + # for teacher + b_tl = tl.load(target_logits + o_x, mask=o_x < V, other=float('-inf')) + b_tm = tl.max(b_tl) + m_new = tl.maximum(tm, b_tm) + td = td * tl.exp(tm - m_new) + tl.sum(tl.exp(b_tl - m_new)) + tm = m_new + + b_loss = 0. 
+    # KL(p_T || p_S) = sum_j p_T[j] * (log p_T[j] - log p_S[j]),
+    # where p_T / p_S are the teacher / student softmax distributions of this row
+    for iv in range(0, NV):
+        o_x = iv * BV + tl.arange(0, BV)
+        b_sl = tl.load(logits + o_x, mask=o_x < V, other=float('-inf'))
+        b_tl = tl.load(target_logits + o_x, mask=o_x < V, other=float('-inf'))
+        b_sp_log = b_sl - sm - tl.log(sd)
+        b_tp_log = b_tl - tm - tl.log(td)
+        b_sp = tl.exp(b_sp_log)
+        b_tp = tl.exp(b_tp_log)
+        b_kl = tl.where(o_x < V, b_tp * (b_tp_log - b_sp_log), 0)
+        # d KL / d student_logit = softmax(student) - softmax(teacher)
+        b_dl = -b_tp + b_sp
+        b_loss += tl.sum(b_kl)
+        if reduction == 'batchmean':
+            b_dl = b_dl / N
+        tl.store(logits + o_x, b_dl, mask=o_x < V)
+
+    # Normalize the loss by the number of rows if reduction is 'batchmean'
+    if reduction == 'batchmean':
+        b_loss = b_loss / N
+
+    tl.store(loss + i_n * s_loss, b_loss)
+
+
+@triton.jit
+def elementwise_mul_kernel(
+    x,
+    g,
+    N: tl.constexpr,
+    B: tl.constexpr
+):
+    """
+    This function multiplies each element of the tensor pointed by x with the value pointed by g.
+    The multiplication is performed in-place on the tensor pointed by x.
+
+    Parameters:
+    x:
+        Pointer to the input tensor.
+    g:
+        Pointer to the gradient output value.
+    N (int):
+        The number of elements in the input tensor.
+    B (int):
+        The block size for Triton operations.
+    """
+
+    # Get the program ID and convert it to int64 to avoid overflow
+    i_x = tl.program_id(0).to(tl.int64)
+    o_x = i_x * B + tl.arange(0, B)
+
+    # Load the gradient output value
+    b_g = tl.load(g)
+    b_x = tl.load(x + o_x, mask=o_x < N)
+    tl.store(x + o_x, b_x * b_g, mask=o_x < N)
+
+
+def fused_kl_div_forward(
+    x: torch.Tensor,
+    target_x: torch.Tensor,
+    weight: torch.Tensor,
+    target_weight: torch.Tensor,
+    reduction: str = 'batchmean'
+):
+    device = x.device
+
+    # ideally, we would like to achieve the same memory consumption as [N, H],
+    # so the expected chunk size should be:
+    # NC = ceil(V / H)
+    # C = ceil(N / NC)
+    # for ex: N = 4096*4, V = 32000, H = 4096 ==> NC = 8, C = ceil(N / NC) = 2048
+    N, H, V = *x.shape, weight.shape[0]
+    BV = min(MAX_FUSED_SIZE, triton.next_power_of_2(V))
+    # TODO: in real cases, we may need to limit the number of chunks NC to
+    # ensure the precision of accumulated gradients
+    NC = min(8, triton.cdiv(V, H))
+    C = triton.next_power_of_2(triton.cdiv(N, NC))
+    NC = triton.cdiv(N, C)
+
+    dx = torch.zeros_like(x, device=device)
+    dw = torch.zeros_like(weight, device=device) if weight is not None else None
+    # we use fp32 for loss accumulator
+    loss = torch.zeros(N, dtype=torch.float32, device=device)
+
+    for ic in range(NC):
+        start, end = ic * C, min((ic + 1) * C, N)
+        # [C, H]
+        c_sx = x[start:end]
+        c_tx = target_x[start:end]
+        # when doing matmul, use the original precision
+        # [C, V]
+        c_sl = F.linear(c_sx, weight)
+        c_tl = F.linear(c_tx, target_weight)
+
+        # unreduced loss
+        c_loss = loss[start:end]
+
+        # Here we calculate the gradient of c_sx in place so we can save memory.
+        kl_div_kernel[(c_sx.shape[0],)](
+            logits=c_sl,
+            target_logits=c_tl,
+            loss=c_loss,
+            s_logits=c_sl.stride(-2),
+            s_loss=c_loss.stride(-1),
+            reduction=reduction,
+            N=N,
+            V=V,
+            BV=BV,
+            num_warps=32
+        )
+
+        # gradient of logits is computed in-place by the above triton kernel and is of shape: C x V
+        # thus dx[start:end] should be of shape: C x H
+        # unlike the cross-entropy version, there is no ignore_index here: every token
+        # contributes to the loss, and the 'batchmean' scaling by N is already applied
+        # inside the kernel, so no extra rescaling of the gradients is needed
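+        # Chain rule through the fused projection (a sketch): with s_logits = c_sx @ weight.T,
+        # the kernel above overwrote c_sl with d(loss)/d(logits), so
+        #   d(loss)/d(c_sx) = c_sl @ weight        # [C, V] @ [V, H] -> [C, H]
+        #   d(loss)/d(weight) += c_sl.T @ c_sx     # [V, C] @ [C, H] -> [V, H]
+        # which is exactly what the mm/addmm below accumulate chunk by chunk.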
+ # [C, H] + + dx[start:end] = torch.mm(c_sl, weight) + + if weight is not None: + torch.addmm(input=dw, mat1=c_sl.t(), mat2=c_sx, out=dw) + + loss = loss.sum() + return loss, dx, dw + + +def fused_kl_div_backward( + do: torch.Tensor, + dx: torch.Tensor, + dw: torch.Tensor +): + # If cross entropy is the last layer, do is 1.0. Skip the mul to save time + if torch.ne(do, torch.tensor(1.0, device=do.device)): + # We use a Triton kernel instead of a PyTorch operation because modifying inputs in-place + # for gradient storage and backward multiple times causes anomalies with PyTorch but not with Triton. + N, H = dx.shape + B = min(MAX_FUSED_SIZE, triton.next_power_of_2(H)) + + elementwise_mul_kernel[(triton.cdiv(N * H, B),)]( + x=dx, + g=do, + N=N*H, + B=B, + num_warps=32, + ) + + # handle dw + if dw is not None: + V, H = dw.shape + elementwise_mul_kernel[(triton.cdiv(V * H, B),)]( + x=dw, + g=do, + N=V*H, + B=B, + num_warps=32, + ) + + return dx, dw + + +class FusedKLDivLossFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + x: torch.Tensor, + target_x: torch.Tensor, + weight: torch.Tensor, + target_weight: torch.Tensor, + reduction: str + ): + loss, dx, dw = fused_kl_div_forward( + x=x, + target_x=target_x, + weight=weight, + target_weight=target_weight, + reduction=reduction + ) + ctx.save_for_backward(dx, dw) + return loss + + @staticmethod + @contiguous + def backward(ctx, do): + dx, dw = ctx.saved_tensors + dx, dw = fused_kl_div_backward(do, dx, dw) + return dx, None, dw, None, None + + +def fused_kl_div_loss( + x: torch.Tensor, + target_x: torch.Tensor, + weight: torch.Tensor, + target_weight: torch.Tensor, + reduction: str = 'batchmean' +) -> Tuple[torch.Tensor, torch.Tensor]: + """ + Args: + x (torch.Tensor): [batch_size * seq_len, hidden_size] + target_x (torch.Tensor): [batch_size * seq_len, hidden_size] + weight (torch.Tensor): [vocab_size, hidden_size] + where `vocab_size` is the number of classes. + target_weight (torch.Tensor): [vocab_size, hidden_size] + where `vocab_size` is the number of classes. + reduction: + Specifies the reduction to apply to the output: 'batchmean'. Default: 'batchmean'. + Returns: + loss + """ + return FusedKLDivLossFunction.apply( + x, + target_x, + weight, + target_weight, + reduction + ) + + +class FusedKLDivLoss(nn.Module): + + def __init__( + self, + reduction: str = 'batchmean' + ): + """ + Args: + reduction: + Specifies the reduction to apply to the output: 'batchmean'. Default: 'batchmean'. + """ + super().__init__() + + assert reduction in ['batchmean'], f"reduction: {reduction} is not supported" + + self.reduction = reduction + + def forward( + self, + x: torch.Tensor, + target_x: torch.Tensor, + weight: torch.Tensor, + target_weight: torch.Tensor + ): + """ + Args: + x (torch.Tensor): [batch_size * seq_len, hidden_size] + target_x (torch.Tensor): [batch_size * seq_len, hidden_size] + weight (torch.Tensor): [vocab_size, hidden_size] + where `vocab_size` is the number of classes. + target_weight (torch.Tensor): [vocab_size, hidden_size] + where `vocab_size` is the number of classes. 
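+        A minimal usage sketch (tensor names here are illustrative): distill a
+        student against a frozen teacher without materializing either [N, V]
+        logits tensor:
+            kl = FusedKLDivLoss(reduction='batchmean')
+            loss = kl(student_h, teacher_h, student_head.weight, teacher_head.weight)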
+        Returns:
+            loss
+        """
+        loss = fused_kl_div_loss(
+            x=x,
+            target_x=target_x,
+            weight=weight,
+            target_weight=target_weight,
+            reduction=self.reduction
+        )
+        return loss
diff --git a/fla/modules/fused_linear_cross_entropy.py b/fla/modules/fused_linear_cross_entropy.py
new file mode 100644
index 0000000000000000000000000000000000000000..e1e3ac8a4218fcf946ecabc7bfba0978ee1ca40d
--- /dev/null
+++ b/fla/modules/fused_linear_cross_entropy.py
@@ -0,0 +1,509 @@
+# -*- coding: utf-8 -*-

+# Code adapted from
+# https://github.com/linkedin/Liger-Kernel/blob/main/src/liger_kernel/ops/fused_linear_cross_entropy.py

+from typing import Optional, Tuple

+import torch
+import torch.nn as nn
+import torch.nn.functional as F
+import triton
+import triton.language as tl

+from fla.ops.utils import logsumexp_fwd
+from fla.utils import contiguous

+# The hard limit of TRITON_MAX_TENSOR_NUMEL is 1048576
+# https://github.com/triton-lang/triton/blob/ba42a5c68fd0505f8c42f4202d53be0f8d9a5fe0/python/triton/language/core.py#L19
+# However, setting limit as 65536 as in LayerNorm tutorial is faster because of less register spilling
+# The optimal maximum block size depends on your hardware, your kernel, and your dtype
+MAX_FUSED_SIZE = 65536 // 2


+@triton.jit
+def cross_entropy_kernel(
+    logits,
+    lse,
+    target,
+    loss,
+    total,
+    ignore_index,
+    label_smoothing: tl.constexpr,
+    logit_scale: tl.constexpr,
+    reduction: tl.constexpr,
+    V: tl.constexpr,
+    BV: tl.constexpr
+):
+    """
+    This kernel computes both cross entropy loss and the gradient of the input.
+    We only consider hard labels and mean reduction for now.
+    Please refer to https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html for the math.
+
+    Args:
+        logits:
+            Pointer to logits tensor.
+        lse:
+            Pointer to logsumexp tensor.
+        target:
+            Pointer to target tensor.
+        loss:
+            Pointer to tensor to store the loss.
+        total (int):
+            The number of non-ignored tokens.
+        ignore_index (int):
+            The index to ignore in the target.
+        label_smoothing (float):
+            The amount of smoothing when computing the loss, where 0.0 means no smoothing.
+        logit_scale (float):
+            A scaling factor applied to the logits.
+        reduction (str):
+            The string for the reduction to apply.
+        V (int):
+            The number of columns in the input tensor.
+        BV (int):
+            The block size for vocab.
+    """
+
+    # https://github.com/triton-lang/triton/issues/1058
+    # If B*T*V is too large, i_n * stride will overflow out of int32, so we convert to int64
+    i_n = tl.program_id(0).to(tl.int64)
+    NV = tl.cdiv(V, BV)
+
+    # 1. Load target first because if the target is ignore_index, we can return right away
+    b_y = tl.load(target + i_n)
+
+    # 2. locate the start index
+    logits += i_n * V
+
+    if b_y == ignore_index:
+        # set all x as 0
+        for i in range(0, V, BV):
+            o_v = i + tl.arange(0, BV)
+            tl.store(logits + o_v, 0.0, mask=o_v < V)
+        return
+
+    # Online softmax: 2 loads + 1 store (compared with 3 loads + 1 store for the safe softmax)
+    # Refer to Algorithm 3 in the paper: https://arxiv.org/pdf/1805.02867
+
+    # 3. [Online softmax] first pass: compute logsumexp
+    # we did this in another kernel
+    b_l = tl.load(logits + b_y) * logit_scale
+    b_lse = tl.load(lse + i_n)
+
+    # 4. Calculate the loss
+    # loss = lse - logits_l
+    b_loss = b_lse - b_l
+
+    # Label smoothing is a general case of normal cross entropy
+    # See the full derivation at https://github.com/linkedin/Liger-Kernel/pull/198#issue-2503665310
+    b_z = 0.0
+    eps = label_smoothing / V
+
+    # We need tl.debug_barrier() as mentioned in
+    # https://github.com/triton-lang/triton/blob/ba42a5c68fd0505f8c42f4202d53be0f8d9a5fe0/python/triton/ops/cross_entropy.py#L34
+    tl.debug_barrier()
+
+    # 5. [Online Softmax] Second pass: compute gradients
+    # For 'mean' reduction, gradients are normalized by the number of non-ignored elements N:
+    #   dx_y = (softmax(x_y) - 1) / N
+    #   dx_i = softmax(x_i) / N, i != y
+    # For label smoothing:
+    #   dx_i = (softmax(x_i) - label_smoothing / V) / N, i != y
+    #   dx_y = (softmax(x_y) - label_smoothing / V - (1 - label_smoothing)) / N
+    #        = dx_i - (1 - label_smoothing) / N
+    for iv in range(0, NV):
+        o_v = iv * BV + tl.arange(0, BV)
+        b_logits = tl.load(logits + o_v, mask=o_v < V, other=float('-inf')) * logit_scale
+        if label_smoothing > 0:
+            # scale X beforehand to avoid overflow
+            b_z += tl.sum(tl.where(o_v < V, -eps * b_logits, 0.0))
+        b_p = (tl.exp(b_logits - b_lse) - eps) * logit_scale
+        if reduction == "mean":
+            b_p = b_p / total
+        tl.store(logits + o_v, b_p, mask=o_v < V)
+
+    tl.debug_barrier()
+
+    # Original loss = H(q, p), with label smoothing regularization = H(q', p) and (label_smoothing / V) = eps
+    # H(q', p) = (1 - label_smoothing) * H(q, p) + label_smoothing * H(u, p)
+    #          = (1 - label_smoothing) * H(q, p) - eps * sum(logsoftmax(x_i))
+    # By using m (global max of xi) and d (sum of e^(xi-m)), we can simplify as:
+    #          = (1 - label_smoothing) * H(q, p) + (-sum(x_i * eps) + label_smoothing * (m + logd))
+    # Refer to H(q', p) in section 7 of the paper:
+    # https://arxiv.org/pdf/1512.00567
+    # pytorch:
+    # https://github.com/pytorch/pytorch/blob/2981534f54d49fa3a9755c9b0855e7929c2527f0/aten/src/ATen/native/LossNLL.cpp#L516
+    # See full derivation at https://github.com/linkedin/Liger-Kernel/pull/198#issuecomment-2333753087
+    if label_smoothing > 0:
+        b_loss = b_loss * (1 - label_smoothing) + (b_z + label_smoothing * b_lse)
+
+    # 6. Specially handle the i == y case, where `dx_y = dx_i - (1 - label_smoothing) / N`
+    b_l = tl.load(logits + b_y)
+
+    # Normalize the loss by the number of non-ignored elements if reduction is "mean"
+    if reduction == 'mean':
+        b_loss = b_loss / total
+        b_l += (label_smoothing - 1) / total * logit_scale
+    else:
+        b_l += (label_smoothing - 1) * logit_scale
+
+    tl.store(loss + i_n, b_loss)
+    tl.store(logits + b_y, b_l)


+@triton.jit
+def elementwise_mul_kernel(
+    x,
+    g,
+    N: tl.constexpr,
+    B: tl.constexpr
+):
+    """
+    This function multiplies each element of the tensor pointed by x with the value pointed by g.
+    The multiplication is performed in-place on the tensor pointed by x.
+
+    Parameters:
+    x:
+        Pointer to the input tensor.
+    g:
+        Pointer to the gradient output value.
+    N (int):
+        The number of elements in the input tensor.
+    B (int):
+        The block size for Triton operations.
+ """ + + # Get the program ID and convert it to int64 to avoid overflow + i_x = tl.program_id(0).to(tl.int64) + o_x = i_x * B + tl.arange(0, B) + + # Load the gradient output value + b_g = tl.load(g) + b_x = tl.load(x + o_x, mask=o_x < N) + tl.store(x + o_x, b_x * b_g, mask=o_x < N) + + +def fused_linear_cross_entropy_forward( + x: torch.Tensor, + target: torch.LongTensor, + weight: torch.Tensor, + bias: torch.Tensor = None, + ignore_index: int = -100, + label_smoothing: float = 0.0, + logit_scale: float = 1.0, + num_chunks: int = 8, + reduction: str = "mean" +): + device = x.device + + # inputs have shape: [N, H] + # materialized activations will have shape: [N, V] + # the increase in memory = [N, V] + # reduction can be achieved by partitioning the number of tokens N into smaller chunks. + + # ideally, we would like to achieve the same memory consumption as [N, H], + # so the expected chunk size should be: + # NC = ceil(V / H) + # C = ceil(N / NC) + # for ex: N = 4096*4, V = 32000, H = 4096 ==> NC = 8, C = ceil(N / NC) = 2048 + N, H, V = *x.shape, weight.shape[0] + BV = min(MAX_FUSED_SIZE, triton.next_power_of_2(V)) + # TODO: in real cases, we may need to limit the number of chunks NC to + # ensure the precisions of accumulated gradients + NC = min(num_chunks, triton.cdiv(V, H)) + C = triton.next_power_of_2(triton.cdiv(N, NC)) + NC = triton.cdiv(N, C) + + dx = torch.zeros_like(x, device=device) + dw = torch.zeros_like(weight, device=device) if weight is not None else None + db = torch.zeros_like(bias, device=device) if bias is not None else None + # we use fp32 for loss accumulator + loss = torch.zeros(N, dtype=torch.float32, device=device) + + total = target.ne(ignore_index).sum().item() + + for ic in range(NC): + start, end = ic * C, min((ic + 1) * C, N) + # [C, N] + c_x = x[start:end] + # when doing matmul, use the original precision + # [C, V] + c_logits = F.linear(c_x, weight, bias) + c_target = target[start:end] + # [C] + # keep lse in fp32 to maintain precision + c_lse = logsumexp_fwd(c_logits, scale=logit_scale, dtype=torch.float) + + # unreduced loss + c_loss = loss[start:end] + + # Here we calculate the gradient of c_logits in place so we can save memory. + cross_entropy_kernel[(c_logits.shape[0],)]( + logits=c_logits, + lse=c_lse, + target=c_target, + loss=c_loss, + total=total, + ignore_index=ignore_index, + label_smoothing=label_smoothing, + logit_scale=logit_scale, + reduction=reduction, + V=V, + BV=BV, + num_warps=32 + ) + + # gradient of logits is computed in-place by the above triton kernel and is of shape: C x V + # thus dx should be of shape: C x H + dx[start:end] = torch.mm(c_logits, weight) + + # keep dw in fp32 to maintain precision + if weight is not None: + dw += c_logits.t() @ c_x + + if bias is not None: + torch.add(input=db, other=c_logits.sum(0), out=db) + + loss = loss.sum() + if dw is not None: + dw = dw.to(weight) + if db is not None: + db = db.to(bias) + return loss, dx, dw, db + + +def fused_linear_cross_entropy_backward( + do: torch.Tensor, + dx: torch.Tensor, + dw: torch.Tensor, + db: torch.Tensor +): + # If cross entropy is the last layer, do is 1.0. Skip the mul to save time + if torch.ne(do, torch.tensor(1.0, device=do.device)): + # We use a Triton kernel instead of a PyTorch operation because modifying inputs in-place + # for gradient storage and backward multiple times causes anomalies with PyTorch but not with Triton. 
+ N, H = dx.shape + B = min(MAX_FUSED_SIZE, triton.next_power_of_2(H)) + + elementwise_mul_kernel[(triton.cdiv(N * H, B),)]( + x=dx, + g=do, + N=N*H, + B=B, + num_warps=32, + ) + + # handle dw + if dw is not None: + V, H = dw.shape + elementwise_mul_kernel[(triton.cdiv(V * H, B),)]( + x=dw, + g=do, + N=V*H, + B=B, + num_warps=32, + ) + + if db is not None: + V = db.shape[0] + elementwise_mul_kernel[(triton.cdiv(V, B),)]( + x=db, + g=do, + N=V, + B=B, + num_warps=32, + ) + return dx, dw, db + + +class FusedLinearCrossEntropyFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + x: torch.Tensor, + target: torch.LongTensor, + weight: torch.Tensor, + bias: torch.Tensor = None, + ignore_index: int = -100, + label_smoothing: float = 0.0, + logit_scale: float = 1.0, + num_chunks: int = 8, + reduction: str = "mean" + ): + """ + Fusing the last linear layer with cross-entropy loss + Reference: https://github.com/mgmalek/efficient_cross_entropy + + Handle the forward and backward pass of the final linear layer via cross-entropy loss by avoiding + the materialization of the large logits tensor. Since Cross Entropy Loss is the last layer, we can + compute the gradient at the forward pass. By doing so, we don't have to store the x and target + for the backward pass. + + x (torch.Tensor): [batch_size * seq_len, hidden_size] + target (torch.LongTensor): [batch_size * seq_len] + where each value is in [0, vocab_size). + weight (torch.Tensor): [vocab_size, hidden_size] + where `vocab_size` is the number of classes. + bias (Optional[torch.Tensor]): [vocab_size] + where `vocab_size` is the number of classes. + ignore_index: + the index to ignore in the target. + label_smoothing: + the amount of smoothing when computing the loss, where 0.0 means no smoothing. + logit_scale: float = 1.0, + A scaling factor applied to the logits. Default: 1.0 + num_chunks: int + The number of chunks to split the input tensor into for processing. + This can help optimize memory usage and computation speed. + Default: 8 + reduction: + Specifies the reduction to apply to the output: 'mean' | 'sum'. + 'mean': the weighted mean of the output is taken, + 'sum': the output will be summed. + Default: 'mean'. + """ + loss, dx, dw, db = fused_linear_cross_entropy_forward( + x, + target, + weight, + bias, + ignore_index, + label_smoothing, + logit_scale, + num_chunks, + reduction + ) + # downcast to dtype and store for backward + ctx.save_for_backward( + dx.detach(), + dw.detach() if weight is not None else None, + db.detach() if bias is not None else None, + ) + return loss + + @staticmethod + @contiguous + def backward(ctx, do): + dx, dw, db = ctx.saved_tensors + dx, dw, db = fused_linear_cross_entropy_backward(do, dx, dw, db) + return dx, None, dw, db, None, None, None, None, None + + +def fused_linear_cross_entropy_loss( + x: torch.Tensor, + target: torch.LongTensor, + weight: torch.Tensor, + bias: torch.Tensor = None, + ignore_index: int = -100, + label_smoothing: float = 0.0, + logit_scale: float = 1.0, + num_chunks: int = 8, + reduction: str = "mean" +) -> Tuple[torch.Tensor, torch.Tensor]: + """ + Args: + x (torch.Tensor): [batch_size * seq_len, hidden_size] + target (torch.LongTensor): [batch_size * seq_len] + where each value is in [0, vocab_size). + weight (torch.Tensor): [vocab_size, hidden_size] + where `vocab_size` is the number of classes. + bias (Optional[torch.Tensor]): [vocab_size] + where `vocab_size` is the number of classes. + ignore_index: int. 
+ If target == ignore_index, the loss is set to 0.0. + label_smoothing: float + logit_scale: float + A scaling factor applied to the logits. Default: 1.0 + num_chunks: int + The number of chunks to split the input tensor into for processing. + This can help optimize memory usage and computation speed. + Default: 8 + reduction: + Specifies the reduction to apply to the output: 'mean' | 'sum'. + 'mean': the weighted mean of the output is taken, + 'sum': the output will be summed. + Default: 'mean'. + Returns: + losses: [batch,], float + """ + return FusedLinearCrossEntropyFunction.apply( + x, + target, + weight, + bias, + ignore_index, + label_smoothing, + logit_scale, + num_chunks, + reduction + ) + + +class FusedLinearCrossEntropyLoss(nn.Module): + + def __init__( + self, + ignore_index: int = -100, + label_smoothing: float = 0.0, + logit_scale: float = 1.0, + num_chunks: int = 8, + reduction: str = "mean" + ): + """ + Args: + ignore_index: int. + If target == ignore_index, the loss is set to 0.0. + label_smoothing: float + logit_scale: float + A scaling factor applied to the logits. Default: 1.0 + num_chunks: int + The number of chunks to split the input tensor into for processing. + This can help optimize memory usage and computation speed. + Default: 8 + reduction: + Specifies the reduction to apply to the output: 'mean' | 'sum'. + 'mean': the weighted mean of the output is taken, + 'sum': the output will be summed. + Default: 'mean'. + """ + super().__init__() + + assert reduction in ["none", "mean", "sum"], f"reduction: {reduction} is not supported" + + self.ignore_index = ignore_index + self.label_smoothing = label_smoothing + self.logit_scale = logit_scale + self.num_chunks = num_chunks + self.reduction = reduction + + def forward( + self, + x: torch.Tensor, + target: torch.LongTensor, + weight: torch.Tensor, + bias: Optional[torch.Tensor] = None + ): + """ + Args: + x (torch.Tensor): [batch_size * seq_len, hidden_size] + target (torch.LongTensor): [batch_size * seq_len] + where each value is in [0, V). + weight (torch.Tensor): [vocab_size, hidden_size] + where `vocab_size` is the number of classes. + bias (Optional[torch.Tensor]): [vocab_size] + where `vocab_size` is the number of classes. + Returns: + loss + """ + loss = fused_linear_cross_entropy_loss( + x, + target, + weight=weight, + bias=bias, + ignore_index=self.ignore_index, + label_smoothing=self.label_smoothing, + logit_scale=self.logit_scale, + num_chunks=self.num_chunks, + reduction=self.reduction + ) + return loss diff --git a/fla/modules/fused_norm_gate.py b/fla/modules/fused_norm_gate.py new file mode 100644 index 0000000000000000000000000000000000000000..739b5ae46ca4e15d263fabbebfa70dcd6424ed7d --- /dev/null +++ b/fla/modules/fused_norm_gate.py @@ -0,0 +1,889 @@ +# -*- coding: utf-8 -*- + +# Copyright (c) 2023, Tri Dao. +# https://github.com/state-spaces/mamba/blob/fb7b5310fa865dbd62aa059b1e26f2b431363e2a/mamba_ssm/ops/triton/layernorm.py +# Implement residual + layer_norm / rms_norm. + +# Based on the Triton LayerNorm tutorial: https://triton-lang.org/main/getting-started/tutorials/05-layer-norm.html +# For the backward pass, we keep weight_grad and bias_grad in registers and accumulate. +# This is faster for dimensions up to 8k, but after that it's much slower due to register spilling. +# The models we train have hidden dim up to 8k anyway (e.g. Llama 70B), so this is fine. 
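+#
+# The functions below fuse the normalization with a swish (SiLU) output gate.
+# As a reference for what the fused kernels compute (a sketch that ignores the
+# optional residual / prenorm paths):
+#     y = norm(x) * o * sigmoid(o)
+# where norm is LayerNorm or RMSNorm; an eager-mode equivalent would be roughly
+# F.silu(o) * F.layer_norm(x, (N,), weight, bias, eps).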
+ +from __future__ import annotations + +import math + +import torch +import torch.nn as nn +import torch.nn.functional as F +import triton +import triton.language as tl + +from fla.utils import contiguous + + +def layer_norm_ref(x, weight, bias, residual=None, eps=1e-6, prenorm=False, upcast=False): + dtype = x.dtype + if upcast: + weight = weight.float() + bias = bias.float() if bias is not None else None + if upcast: + x = x.float() + residual = residual.float() if residual is not None else residual + if residual is not None: + x = (x + residual).to(x.dtype) + out = F.layer_norm(x.to(weight.dtype), x.shape[-1:], weight=weight, bias=bias, eps=eps).to( + dtype + ) + return out if not prenorm else (out, x) + + +def rms_norm_ref(x, weight, bias, residual=None, eps=1e-6, prenorm=False, upcast=False): + dtype = x.dtype + if upcast: + weight = weight.float() + bias = bias.float() if bias is not None else None + if upcast: + x = x.float() + residual = residual.float() if residual is not None else residual + if residual is not None: + x = (x + residual).to(x.dtype) + rstd = 1 / torch.sqrt((x.square()).mean(dim=-1, keepdim=True) + eps) + out = (x * rstd * weight) + \ + bias if bias is not None else (x * rstd * weight) + out = out.to(dtype) + return out if not prenorm else (out, x) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32), + ], + key=["N", "HAS_RESIDUAL", "STORE_RESIDUAL_OUT", "IS_RMS_NORM", "HAS_BIAS"], +) +# @triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +# @triton.heuristics({"HAS_RESIDUAL": lambda args: args["RESIDUAL"] is not None}) +@triton.jit +def _layer_norm_fwd_1pass_kernel( + X, # pointer to the input + O, # pointer to the gate + Y, # pointer to the output + W, # pointer to the weights + B, # pointer to the biases + RESIDUAL, # pointer to the residual + RESIDUAL_OUT, # pointer to the residual + Mean, # pointer to the mean + Rstd, # pointer to the 1/std + stride_x_row, # how much to increase the pointer when moving by 1 row + stride_y_row, + stride_res_row, + stride_res_out_row, + N, # number of columns in X + eps, # epsilon to avoid division by zero + IS_RMS_NORM: tl.constexpr, + BLOCK_N: tl.constexpr, + HAS_RESIDUAL: tl.constexpr, + STORE_RESIDUAL_OUT: tl.constexpr, + HAS_WEIGHT: tl.constexpr, + HAS_BIAS: tl.constexpr +): + # Map the program id to the row of X and Y it should compute. 
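+    # One program normalizes one row of the (M, N) inputs, so the host side
+    # launches an (M,) grid and requires BLOCK_N to cover the full feature dim N.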
+ row = tl.program_id(0) + X += row * stride_x_row + Y += row * stride_y_row + O += row * stride_x_row + if HAS_RESIDUAL: + RESIDUAL += row * stride_res_row + if STORE_RESIDUAL_OUT: + RESIDUAL_OUT += row * stride_res_out_row + # Compute mean and variance + cols = tl.arange(0, BLOCK_N) + x = tl.load(X + cols, mask=cols < N, other=0.0).to(tl.float32) + if HAS_RESIDUAL: + residual = tl.load(RESIDUAL + cols, mask=cols < + N, other=0.0).to(tl.float32) + x += residual + if STORE_RESIDUAL_OUT: + tl.store(RESIDUAL_OUT + cols, x, mask=cols < N) + if not IS_RMS_NORM: + mean = tl.sum(x, axis=0) / N + tl.store(Mean + row, mean) + xbar = tl.where(cols < N, x - mean, 0.0) + var = tl.sum(xbar * xbar, axis=0) / N + else: + xbar = tl.where(cols < N, x, 0.0) + var = tl.sum(xbar * xbar, axis=0) / N + rstd = 1 / tl.sqrt(var + eps) + tl.store(Rstd + row, rstd) + # Normalize and apply linear transformation + mask = cols < N + if HAS_WEIGHT: + w = tl.load(W + cols, mask=mask).to(tl.float32) + if HAS_BIAS: + b = tl.load(B + cols, mask=mask).to(tl.float32) + x_hat = (x - mean) * rstd if not IS_RMS_NORM else x * rstd + y = x_hat * w if HAS_WEIGHT else x_hat + if HAS_BIAS: + y = y + b + + # Swish output gate + o = tl.load(O + cols, mask=cols < N, other=0.0).to(tl.float32) + y = y * o * tl.sigmoid(o) + + # Write output + tl.store(Y + cols, y, mask=mask) + + +def _layer_norm_fwd( + x, o, weight, bias, eps, residual=None, out_dtype=None, residual_dtype=None, is_rms_norm=False +): + if residual is not None: + residual_dtype = residual.dtype + M, N = x.shape + assert x.stride(-1) == 1 + if residual is not None: + assert residual.stride(-1) == 1 + assert residual.shape == (M, N) + if weight is not None: + assert weight.shape == (N,) + assert weight.stride(-1) == 1 + if bias is not None: + assert bias.stride(-1) == 1 + assert bias.shape == (N,) + # allocate output + y = torch.empty_like(x, dtype=x.dtype if out_dtype is None else out_dtype) + assert y.stride(-1) == 1 + if residual is not None or (residual_dtype is not None and residual_dtype != x.dtype): + residual_out = torch.empty(M, N, device=x.device, dtype=residual_dtype) + assert residual_out.stride(-1) == 1 + else: + residual_out = None + mean = torch.empty((M,), dtype=torch.float32, + device="cuda") if not is_rms_norm else None + rstd = torch.empty((M,), dtype=torch.float32, device="cuda") + # Less than 64KB per feature: enqueue fused kernel + MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(N)) + if N > BLOCK_N: + raise RuntimeError( + "This layer norm doesn't support feature dim >= 64KB.") + # heuristics for number of warps + with torch.cuda.device(x.device.index): + _layer_norm_fwd_1pass_kernel[(M,)]( + x, + o, + y, + weight, + bias, + residual, + residual_out, + mean, + rstd, + x.stride(0), + y.stride(0), + residual.stride(0) if residual is not None else 0, + residual_out.stride(0) if residual_out is not None else 0, + N, + eps, + is_rms_norm, + BLOCK_N, + residual is not None, + residual_out is not None, + weight is not None, + bias is not None, + ) + # residual_out is None if residual is None and residual_dtype == input_dtype + return y, mean, rstd, residual_out if residual_out is not None else x + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32), + ], + key=["N", "HAS_DRESIDUAL", "STORE_DRESIDUAL", "IS_RMS_NORM", "HAS_BIAS"], 
+) +# @triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +# @triton.heuristics({"HAS_DRESIDUAL": lambda args: args["DRESIDUAL"] is not None}) +# @triton.heuristics({"STORE_DRESIDUAL": lambda args: args["DRESIDUAL_IN"] is not None}) +@triton.heuristics({"RECOMPUTE_OUTPUT": lambda args: args["Y"] is not None}) +@triton.jit +def _layer_norm_bwd_kernel( + X, # pointer to the input + O, # pointer to the gate + W, # pointer to the weights + B, # pointer to the biases + Y, # pointer to the output to be recomputed + DY, # pointer to the output gradient + DX, # pointer to the input gradient + DO, # pointer to the gate gradient + DW, # pointer to the partial sum of weights gradient + DB, # pointer to the partial sum of biases gradient + DRESIDUAL, + DRESIDUAL_IN, + Mean, # pointer to the mean + Rstd, # pointer to the 1/std + stride_x_row, # how much to increase the pointer when moving by 1 row + stride_y_row, + stride_dy_row, + stride_dx_row, + stride_dres_row, + stride_dres_in_row, + M, # number of rows in X + N, # number of columns in X + eps, # epsilon to avoid division by zero + rows_per_program, + IS_RMS_NORM: tl.constexpr, + BLOCK_N: tl.constexpr, + HAS_DRESIDUAL: tl.constexpr, + STORE_DRESIDUAL: tl.constexpr, + HAS_WEIGHT: tl.constexpr, + HAS_BIAS: tl.constexpr, + RECOMPUTE_OUTPUT: tl.constexpr, +): + # Map the program id to the elements of X, DX, and DY it should compute. + row_block_id = tl.program_id(0) + row_start = row_block_id * rows_per_program + cols = tl.arange(0, BLOCK_N) + mask = cols < N + X += row_start * stride_x_row + O += row_start * stride_x_row + if HAS_DRESIDUAL: + DRESIDUAL += row_start * stride_dres_row + if STORE_DRESIDUAL: + DRESIDUAL_IN += row_start * stride_dres_in_row + DY += row_start * stride_dy_row + DX += row_start * stride_dx_row + DO += row_start * stride_dx_row + if RECOMPUTE_OUTPUT: + Y += row_start * stride_y_row + if HAS_WEIGHT: + w = tl.load(W + cols, mask=mask).to(tl.float32) + dw = tl.zeros((BLOCK_N,), dtype=tl.float32) + if RECOMPUTE_OUTPUT and HAS_BIAS: + b = tl.load(B + cols, mask=mask, other=0.0).to(tl.float32) + if HAS_BIAS: + db = tl.zeros((BLOCK_N,), dtype=tl.float32) + row_end = min((row_block_id + 1) * rows_per_program, M) + for row in range(row_start, row_end): + # Load data to SRAM + x = tl.load(X + cols, mask=mask, other=0).to(tl.float32) + o = tl.load(O + cols, mask=mask, other=0).to(tl.float32) + dy = tl.load(DY + cols, mask=mask, other=0).to(tl.float32) + + if not IS_RMS_NORM: + mean = tl.load(Mean + row) + rstd = tl.load(Rstd + row) + # Compute dx + xhat = (x - mean) * rstd if not IS_RMS_NORM else x * rstd + xhat = tl.where(mask, xhat, 0.0) + + y = xhat * w if HAS_WEIGHT else xhat + if HAS_BIAS: + y = y + b + if RECOMPUTE_OUTPUT: + tl.store(Y + cols, y, mask=mask) + + sigmoid_o = tl.sigmoid(o) + do = dy * y * (sigmoid_o + o * sigmoid_o * (1 - sigmoid_o)) + dy = dy * o * sigmoid_o + wdy = dy + if HAS_WEIGHT: + wdy = dy * w + dw += dy * xhat + if HAS_BIAS: + db += dy + if not IS_RMS_NORM: + c1 = tl.sum(xhat * wdy, axis=0) / N + c2 = tl.sum(wdy, axis=0) / N + dx = (wdy - (xhat * c1 + c2)) * rstd + else: + c1 = tl.sum(xhat * wdy, axis=0) / N + dx = (wdy - xhat * c1) * rstd + if HAS_DRESIDUAL: + dres = tl.load(DRESIDUAL + cols, mask=mask, other=0).to(tl.float32) + dx += dres + # Write dx + if STORE_DRESIDUAL: + tl.store(DRESIDUAL_IN + cols, dx, mask=mask) + tl.store(DX + cols, dx, mask=mask) + tl.store(DO + cols, do, mask=mask) + + X += stride_x_row + O += stride_x_row + if HAS_DRESIDUAL: + DRESIDUAL += stride_dres_row + if 
STORE_DRESIDUAL: + DRESIDUAL_IN += stride_dres_in_row + if RECOMPUTE_OUTPUT: + Y += stride_y_row + DY += stride_dy_row + DX += stride_dx_row + DO += stride_dx_row + if HAS_WEIGHT: + tl.store(DW + row_block_id * N + cols, dw, mask=mask) + if HAS_BIAS: + tl.store(DB + row_block_id * N + cols, db, mask=mask) + + +def _layer_norm_bwd( + dy, + x, + o, + weight, + bias, + eps, + mean, + rstd, + dresidual=None, + has_residual=False, + is_rms_norm=False, + x_dtype=None, + recompute_output=False, +): + M, N = x.shape + assert x.stride(-1) == 1 + assert dy.stride(-1) == 1 + assert dy.shape == (M, N) + if dresidual is not None: + assert dresidual.stride(-1) == 1 + assert dresidual.shape == (M, N) + if weight is not None: + assert weight.shape == (N,) + assert weight.stride(-1) == 1 + if bias is not None: + assert bias.stride(-1) == 1 + assert bias.shape == (N,) + # allocate output + dx = ( + torch.empty_like(x) + if x_dtype is None + else torch.empty(M, N, dtype=x_dtype, device=x.device) + ) + do = ( + torch.empty_like(o) + if x_dtype is None + else torch.empty(M, N, dtype=x_dtype, device=x.device) + ) + dresidual_in = torch.empty_like(x) if has_residual and dx.dtype != x.dtype else None + y = torch.empty(M, N, dtype=dy.dtype, device=dy.device) if recompute_output else None + + # Less than 64KB per feature: enqueue fused kernel + MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(N)) + if N > BLOCK_N: + raise RuntimeError("This layer norm doesn't support feature dim >= 64KB.") + sm_count = torch.cuda.get_device_properties(x.device).multi_processor_count + _dw = ( + torch.empty((sm_count, N), dtype=torch.float32, device=weight.device) + if weight is not None + else None + ) + _db = ( + torch.empty((sm_count, N), dtype=torch.float32, device=bias.device) + if bias is not None + else None + ) + rows_per_program = math.ceil(M / sm_count) + grid = (sm_count,) + with torch.cuda.device(x.device.index): + _layer_norm_bwd_kernel[grid]( + x, + o, + weight, + bias, + y, + dy, + dx, + do, + _dw, + _db, + dresidual, + dresidual_in, + mean, + rstd, + x.stride(0), + 0 if not recompute_output else y.stride(0), + dy.stride(0), + dx.stride(0), + dresidual.stride(0) if dresidual is not None else 0, + dresidual_in.stride(0) if dresidual_in is not None else 0, + M, + N, + eps, + rows_per_program, + is_rms_norm, + BLOCK_N, + dresidual is not None, + dresidual_in is not None, + weight is not None, + bias is not None, + ) + dw = _dw.sum(0).to(weight.dtype) if weight is not None else None + db = _db.sum(0).to(bias.dtype) if bias is not None else None + # Don't need to compute dresidual_in separately in this case + if has_residual and dx.dtype == x.dtype: + dresidual_in = dx + return (dx, do, dw, db, dresidual_in) if not recompute_output else (dx, do, dw, db, dresidual_in, y) + + +class LayerNormSwishGateFn(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + x, + o, + weight, + bias, + residual=None, + eps=1e-6, + prenorm=False, + residual_in_fp32=False, + is_rms_norm=False, + ): + x_shape_og = x.shape + o_shape_og = o.shape + # reshape input data into 2D tensor + x = x.reshape(-1, x.shape[-1]) + o = o.reshape(-1, o.shape[-1]) + if residual is not None: + assert residual.shape == x_shape_og + residual = residual.reshape(-1, residual.shape[-1]) + residual_dtype = ( + residual.dtype + if residual is not None + else (torch.float32 if residual_in_fp32 else None) + ) + y, mean, rstd, residual_out = _layer_norm_fwd( + x, o, weight, bias, eps, residual, 
residual_dtype=residual_dtype, is_rms_norm=is_rms_norm + ) + ctx.save_for_backward(residual_out, o, weight, bias, mean, rstd) + ctx.x_shape_og = x_shape_og + ctx.o_shape_og = o_shape_og + ctx.eps = eps + ctx.is_rms_norm = is_rms_norm + ctx.has_residual = residual is not None + ctx.prenorm = prenorm + ctx.x_dtype = x.dtype + y = y.reshape(x_shape_og) + return y if not prenorm else (y, residual_out.reshape(x_shape_og)) + + @staticmethod + @contiguous + def backward(ctx, dy, *args): + x, o, weight, bias, mean, rstd = ctx.saved_tensors + dy = dy.reshape(-1, dy.shape[-1]) + assert dy.shape == x.shape + if ctx.prenorm: + dresidual = args[0] + dresidual = dresidual.reshape(-1, dresidual.shape[-1]) + assert dresidual.shape == x.shape + else: + dresidual = None + dx, do, dw, db, dresidual_in = _layer_norm_bwd( + dy, + x, + o, + weight, + bias, + ctx.eps, + mean, + rstd, + dresidual, + ctx.has_residual, + ctx.is_rms_norm, + x_dtype=ctx.x_dtype, + ) + return ( + dx.reshape(ctx.x_shape_og), + do.reshape(ctx.o_shape_og), + dw, + db, + dresidual_in.reshape(ctx.x_shape_og) if ctx.has_residual else None, + None, + None, + None, + None, + ) + + +class LayerNormSwishGateLinearFn(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + x, + o, + norm_weight, + norm_bias, + linear_weight, + linear_bias, + residual=None, + eps=1e-6, + prenorm=False, + residual_in_fp32=False, + is_rms_norm=False, + ): + x_shape_og = x.shape + o_shape_og = o.shape + # reshape input data into 2D tensor + x = x.reshape(-1, x.shape[-1]) + o = o.reshape(-1, o.shape[-1]) + if residual is not None: + assert residual.shape == x_shape_og + residual = residual.reshape(-1, residual.shape[-1]) + residual_dtype = ( + residual.dtype + if residual is not None + else (torch.float32 if residual_in_fp32 else None) + ) + y, mean, rstd, residual_out = _layer_norm_fwd( + x, + o, + norm_weight, + norm_bias, + eps, + residual, + residual_dtype=residual_dtype, + is_rms_norm=is_rms_norm + ) + y = y.reshape(x_shape_og) + dtype = torch.get_autocast_gpu_dtype() if torch.is_autocast_enabled() else y.dtype + linear_weight = linear_weight.to(dtype) + linear_bias = linear_bias.to(dtype) if linear_bias is not None else None + out = F.linear(y.to(linear_weight.dtype), linear_weight, linear_bias) + # We don't store y, will be recomputed in the backward pass to save memory + ctx.save_for_backward(residual_out, o, norm_weight, norm_bias, linear_weight, mean, rstd) + ctx.x_shape_og = x_shape_og + ctx.o_shape_og = o_shape_og + ctx.eps = eps + ctx.is_rms_norm = is_rms_norm + ctx.has_residual = residual is not None + ctx.prenorm = prenorm + ctx.x_dtype = x.dtype + ctx.linear_bias_is_none = linear_bias is None + return out if not prenorm else (out, residual_out.reshape(x_shape_og)) + + @staticmethod + @contiguous + def backward(ctx, dout, *args): + x, o, norm_weight, norm_bias, linear_weight, mean, rstd = ctx.saved_tensors + dout = dout.reshape(-1, dout.shape[-1]) + dy = F.linear(dout, linear_weight.t()) + dlinear_bias = None if ctx.linear_bias_is_none else dout.sum(0) + assert dy.shape == x.shape + if ctx.prenorm: + dresidual = args[0] + dresidual = dresidual.reshape(-1, dresidual.shape[-1]) + assert dresidual.shape == x.shape + else: + dresidual = None + dx, do, dnorm_weight, dnorm_bias, dresidual_in, y = _layer_norm_bwd( + dy, + x, + o, + norm_weight, + norm_bias, + ctx.eps, + mean, + rstd, + dresidual=dresidual, + has_residual=ctx.has_residual, + is_rms_norm=ctx.is_rms_norm, + x_dtype=ctx.x_dtype, + recompute_output=True, + ) + 
dlinear_weight = torch.einsum("bo,bi->oi", dout, y) + return ( + dx.reshape(ctx.x_shape_og), + do.reshape(ctx.o_shape_og), + dnorm_weight, + dnorm_bias, + dlinear_weight, + dlinear_bias, + dresidual_in.reshape(ctx.x_shape_og) if ctx.has_residual else None, + None, + None, + None, + None, + ) + + +def layer_norm_swish_gate_fn( + x, + o, + weight, + bias, + residual=None, + prenorm=False, + residual_in_fp32=False, + eps=1e-6 +): + return LayerNormSwishGateFn.apply( + x, + o, + weight, + bias, + residual, + eps, + prenorm, + residual_in_fp32, + False + ) + + +def rms_norm_swish_gate_fn( + x, + o, + weight, + bias, + residual=None, + prenorm=False, + residual_in_fp32=False, + eps=1e-6 +): + return LayerNormSwishGateFn.apply( + x, + o, + weight, + bias, + residual, + eps, + prenorm, + residual_in_fp32, + True + ) + + +def layer_norm_swish_gate_linear_fn( + x, + o, + norm_weight, + norm_bias, + linear_weight, + linear_bias, + residual=None, + prenorm=False, + residual_in_fp32=False, + eps=1e-6 +): + return LayerNormSwishGateLinearFn.apply( + x, + o, + norm_weight, + norm_bias, + linear_weight, + linear_bias, + residual, + eps, + prenorm, + residual_in_fp32, + False + ) + + +def rms_norm_swish_gate_linear_fn( + x, + o, + norm_weight, + norm_bias, + linear_weight, + linear_bias, + residual=None, + prenorm=False, + residual_in_fp32=False, + eps=1e-6 +): + return LayerNormSwishGateLinearFn.apply( + x, + o, + norm_weight, + norm_bias, + linear_weight, + linear_bias, + residual, + eps, + prenorm, + residual_in_fp32, + True + ) + + +class FusedLayerNormSwishGate(nn.Module): + + def __init__( + self, + hidden_size, + elementwise_affine: bool = True, + eps=1e-5 + ) -> FusedLayerNormSwishGate: + super().__init__() + + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + else: + self.register_parameter("weight", None) + self.register_parameter("bias", None) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, o, residual=None, prenorm=False, residual_in_fp32=False): + return layer_norm_swish_gate_fn( + x, + o, + self.weight, + self.bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32 + ) + + +class FusedRMSNormSwishGate(nn.Module): + + def __init__( + self, + hidden_size, + elementwise_affine: bool = True, + eps=1e-5 + ) -> FusedRMSNormSwishGate: + super().__init__() + + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + else: + self.register_parameter("weight", None) + self.register_parameter("bias", None) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, o, residual=None, prenorm=False, residual_in_fp32=False): + return rms_norm_swish_gate_fn( + x, + o, + self.weight, + self.bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32 + ) + + +class FusedLayerNormSwishGateLinear(nn.Module): + + def __init__( + self, + hidden_size, + elementwise_affine: bool = True, + eps=1e-5 + ) -> 
FusedLayerNormSwishGateLinear: + super().__init__() + + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + else: + self.register_parameter("weight", None) + self.register_parameter("bias", None) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, o, weight, bias, residual=None, prenorm=False, residual_in_fp32=False): + return layer_norm_swish_gate_linear_fn( + x, + o, + self.weight, + self.bias, + weight, + bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32 + ) + + +class FusedRMSNormSwishGateLinear(nn.Module): + + def __init__( + self, + hidden_size, + elementwise_affine: bool = True, + eps=1e-5 + ) -> FusedRMSNormSwishGateLinear: + super().__init__() + + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + else: + self.register_parameter("weight", None) + self.register_parameter("bias", None) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, o, weight, bias, residual=None, prenorm=False, residual_in_fp32=False): + return rms_norm_swish_gate_linear_fn( + x, + o, + self.weight, + self.bias, + weight, + bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32 + ) diff --git a/fla/modules/l2norm.py b/fla/modules/l2norm.py new file mode 100644 index 0000000000000000000000000000000000000000..4206125bb2d6985cd84999420e63b48a84b29ead --- /dev/null +++ b/fla/modules/l2norm.py @@ -0,0 +1,201 @@ +# -*- coding: utf-8 -*- + +import torch +import triton +import triton.language as tl + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32), + ], + key=["N"], +) +# @triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +# @triton.heuristics({"HAS_RESIDUAL": lambda args: args["RESIDUAL"] is not None}) +@triton.jit +def _l2_norm_fwd_1pass_kernel( + X, # pointer to the input + Y, # pointer to the output + stride_x_row, # how much to increase the pointer when moving by 1 row + N, # number of columns in X + eps, # epsilon to avoid division by zero + BLOCK_N: tl.constexpr, +): + # Map the program id to the row of X and Y it should compute. 
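+    # Per-row L2 normalization, i.e. y = x / sqrt(sum_j x_j^2 + eps); note that,
+    # unlike RMSNorm, the squared sum is not divided by N and there is no
+    # learnable weight.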
+ row = tl.program_id(0) + X += row * stride_x_row + Y += row * stride_x_row + # Compute mean and variance + cols = tl.arange(0, BLOCK_N) + x = tl.load(X + cols, mask=cols < N, other=0.0).to(tl.float32) + xbar = tl.where(cols < N, x, 0.0) + var = tl.sum(xbar * xbar, axis=0) + rstd = 1 / tl.sqrt(var + eps) + # tl.store(Rstd + row, rstd) + # Normalize and apply linear transformation + mask = cols < N + y = x * rstd + # Write output + tl.store(Y + cols, y, mask=mask) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32), + ], + key=["N"], +) +# @triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +# @triton.heuristics({"HAS_DRESIDUAL": lambda args: args["DRESIDUAL"] is not None}) +# @triton.heuristics({"STORE_DRESIDUAL": lambda args: args["DRESIDUAL_IN"] is not None}) +# @triton.heuristics({"RECOMPUTE_OUTPUT": lambda args: args["Y"] is not None}) +@triton.jit +def _l2_norm_bwd_kernel( + X, # pointer to the input + # Y, # pointer to the output to be recomputed + DY, # pointer to the output gradient + DX, # pointer to the input gradient + stride_x_row, # how much to increase the pointer when moving by 1 row + N, # number of columns in X + eps, # epsilon to avoid division by zero + BLOCK_N: tl.constexpr, +): + # Map the program id to the elements of X, DX, and DY it should compute. + # Map the program id to the row of X and Y it should compute. + row = tl.program_id(0) + X += row * stride_x_row + DX += row * stride_x_row + DY += row * stride_x_row + + # Y += row * stride_y_row + cols = tl.arange(0, BLOCK_N) + x = tl.load(X + cols, mask=cols < N, other=0.0).to(tl.float32) + x = tl.where(cols < N, x, 0.0) + var = tl.sum(x * x) + rstd = 1 / tl.sqrt(var + eps) + # tl.store(Rstd + row, rstd) + # Normalize and apply linear transformation + mask = cols < N + # y = x * rstd + dy = tl.load(DY + cols, mask=cols < N, other=0.0).to(tl.float32) + dy = tl.where(cols < N, dy, 0.0) + # dx = dy * rstd - tl.sum(dy * x) * (1 / (var+eps)) * rstd * x + dx = dy * rstd - tl.sum(dy * x) * (1 / (var+eps)) * rstd * x + tl.store(DX + cols, dx, mask=mask) + + +def _l2_norm_fwd( + x, eps=1e-6 +): + x_shape_og = x.shape + x = x.reshape(-1, x.shape[-1]) + if x.stride(-1) != 1: + x = x.contiguous() + M, N = x.shape + assert x.stride(-1) == 1 + # allocate output + y = torch.empty_like(x) + assert y.stride(-1) == 1 + N = x.shape[-1] + M = x.shape[0] + # rstd = torch.empty((M,), dtype=torch.float32, device="cuda") + # Less than 64KB per feature: enqueue fused kernel + MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(N)) + if N > BLOCK_N: + raise RuntimeError( + "This layer norm doesn't support feature dim >= 64KB.") + # heuristics for number of warps + with torch.cuda.device(x.device.index): + _l2_norm_fwd_1pass_kernel[(M,)]( + x, + y, + x.stride(0), + N, + eps, + # is_rms_norm, + BLOCK_N, + # residual is not None, + # residual_out is not None, + # bias is not None, + ) + return y.reshape(x_shape_og) + + +def _l2_norm_bwd( + x, dy, eps=1e-5, +): + x_shape_og = x.shape + x = x.reshape(-1, dy.shape[-1]) + dy = dy.reshape(-1, dy.shape[-1]) + if dy.stride(-1) != 1: + dy = dy.contiguous() + assert dy.shape == x.shape + # allocate output + dx = torch.empty_like(x) + N = x.shape[-1] + M = x.shape[0] + assert x.stride(-1) == 1 + assert dy.stride(-1) == 1 + # rstd = torch.empty((M,), 
dtype=torch.float32, device="cuda") + # Less than 64KB per feature: enqueue fused kernel + MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(N)) + if N > BLOCK_N: + raise RuntimeError( + "This layer norm doesn't support feature dim >= 64KB.") + # heuristics for number of warps + with torch.cuda.device(x.device.index): + _l2_norm_bwd_kernel[(M,)]( + x, + dy, + dx, + x.stride(0), + N, + eps, + BLOCK_N, + ) + return dx.reshape(x_shape_og) + + +class L2NormFunction(torch.autograd.Function): + + @staticmethod + def forward( + ctx, + x, + eps=1e-6, + ): + # reshape input data into 2D tensor + y = _l2_norm_fwd(x, eps) + ctx.eps = eps + ctx.x_dtype = x.dtype + ctx.save_for_backward(x) + return y + + @staticmethod + def backward(ctx, dy, *args): + x, = ctx.saved_tensors + dx = _l2_norm_bwd( + x, + dy, + ctx.eps, + ) + return ( + dx, + None + ) + + +l2_norm = L2NormFunction.apply diff --git a/fla/modules/layernorm.py b/fla/modules/layernorm.py new file mode 100644 index 0000000000000000000000000000000000000000..de226f27e71a4ba35605792bd181fb3c21712b9f --- /dev/null +++ b/fla/modules/layernorm.py @@ -0,0 +1,1009 @@ +# -*- coding: utf-8 -*- + +# Copyright (c) 2023, Tri Dao. +# https://github.com/state-spaces/mamba/blob/fb7b5310fa865dbd62aa059b1e26f2b431363e2a/mamba_ssm/ops/triton/layernorm.py +# Implement residual + layer_norm / rms_norm. + +# Based on the Triton LayerNorm tutorial: https://triton-lang.org/main/getting-started/tutorials/05-layer-norm.html +# For the backward pass, we keep weight_grad and bias_grad in registers and accumulate. +# This is faster for dimensions up to 8k, but after that it's much slower due to register spilling. +# The models we train have hidden dim up to 8k anyway (e.g. Llama 70B), so this is fine. 
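+#
+# Unlike the gated variants in fused_norm_gate.py, this file implements the plain
+# layer_norm / rms_norm fast paths, with optional grouped parameters: conceptually
+# (a sketch),
+#     y = (x - mean(x)) / sqrt(var(x) + eps) * weight[group] + bias[group],
+# where the mean is skipped for RMSNorm and group = row % num_groups.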
+ +from __future__ import annotations + +import torch +import torch.nn as nn +import torch.nn.functional as F +import triton +import triton.language as tl + +from fla.utils import contiguous + + +def layer_norm_ref( + x: torch.Tensor, + weight: torch.Tensor, + bias: torch.Tensor, + residual: torch.Tensor = None, + eps: float = 1e-5, + prenorm: bool = False, + upcast: bool = False +): + dtype = x.dtype + if upcast: + weight = weight.float() + bias = bias.float() if bias is not None else None + if upcast: + x = x.float() + residual = residual.float() if residual is not None else residual + if residual is not None: + x = (x + residual).to(x.dtype) + out = F.layer_norm(x.to(weight.dtype), x.shape[-1:], weight=weight, bias=bias, eps=eps).to( + dtype + ) + return out if not prenorm else (out, x) + + +def rms_norm_ref( + x: torch.Tensor, + weight: torch.Tensor, + bias: torch.Tensor, + residual: torch.Tensor = None, + eps: float = 1e-5, + prenorm: bool = False, + upcast: bool = False +): + dtype = x.dtype + if upcast: + weight = weight.float() + bias = bias.float() if bias is not None else None + if upcast: + x = x.float() + residual = residual.float() if residual is not None else residual + if residual is not None: + x = (x + residual).to(x.dtype) + rstd = 1 / torch.sqrt((x.square()).mean(dim=-1, keepdim=True) + eps) + out = (x * rstd * weight) + bias if bias is not None else (x * rstd * weight) + out = out.to(dtype) + return out if not prenorm else (out, x) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32), + ], + key=["N", "HAS_RESIDUAL", "STORE_RESIDUAL_OUT", "IS_RMS_NORM", "HAS_BIAS"], +) +# @triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +# @triton.heuristics({"HAS_RESIDUAL": lambda args: args["RESIDUAL"] is not None}) +@triton.jit +def _layer_norm_fwd_1pass_kernel( + X, # pointer to the input + Y, # pointer to the output + W, # pointer to the weights + B, # pointer to the biases + RESIDUAL, # pointer to the residual + RESIDUAL_OUT, # pointer to the residual + Mean, # pointer to the mean + Rstd, # pointer to the 1/std + stride_x_row, # how much to increase the pointer when moving by 1 row + stride_y_row, + stride_res_row, + stride_res_out_row, + N, # number of columns in X + G, # number of groups + eps, # epsilon to avoid division by zero + IS_RMS_NORM: tl.constexpr, + BLOCK_N: tl.constexpr, + HAS_RESIDUAL: tl.constexpr, + STORE_RESIDUAL_OUT: tl.constexpr, + HAS_WEIGHT: tl.constexpr, + HAS_BIAS: tl.constexpr +): + # Map the program id to the row of X and Y it should compute. 
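+    # The launch grid is (M,): each program normalizes one row of the (M, N)
+    # input, and `group` (row % G) picks the slice of the (G * N,) weight/bias.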
+ row = tl.program_id(0) + group = row % G + X += row * stride_x_row + Y += row * stride_y_row + if HAS_RESIDUAL: + RESIDUAL += row * stride_res_row + if STORE_RESIDUAL_OUT: + RESIDUAL_OUT += row * stride_res_out_row + # Compute mean and variance + cols = tl.arange(0, BLOCK_N) + x = tl.load(X + cols, mask=cols < N, other=0.0).to(tl.float32) + if HAS_RESIDUAL: + residual = tl.load(RESIDUAL + cols, mask=cols < N, other=0.0).to(tl.float32) + x += residual + if STORE_RESIDUAL_OUT: + tl.store(RESIDUAL_OUT + cols, x, mask=cols < N) + if not IS_RMS_NORM: + mean = tl.sum(x, axis=0) / N + tl.store(Mean + row, mean) + xbar = tl.where(cols < N, x - mean, 0.0) + var = tl.sum(xbar * xbar, axis=0) / N + else: + xbar = tl.where(cols < N, x, 0.0) + var = tl.sum(xbar * xbar, axis=0) / N + rstd = 1 / tl.sqrt(var + eps) + tl.store(Rstd + row, rstd) + # Normalize and apply linear transformation + mask = cols < N + if HAS_WEIGHT: + w = tl.load(W + group * stride_x_row + cols, mask=mask).to(tl.float32) + if HAS_BIAS: + b = tl.load(B + group * stride_x_row + cols, mask=mask).to(tl.float32) + x_hat = (x - mean) * rstd if not IS_RMS_NORM else x * rstd + + y = x_hat * w if HAS_WEIGHT else x_hat + if HAS_BIAS: + y = y + b + # Write output + tl.store(Y + cols, y, mask=mask) + + +def _layer_norm_fwd( + x, + weight, + bias, + eps, + residual=None, + out_dtype=None, + residual_dtype=None, + is_rms_norm=False, + num_groups=1 +): + if residual is not None: + residual_dtype = residual.dtype + M, N, G = *x.shape, num_groups + if residual is not None: + assert residual.shape == (M, N) + if weight is not None: + assert weight.shape == (G * N,) + if bias is not None: + assert bias.shape == (G * N,) + # allocate output + y = torch.empty_like(x, dtype=x.dtype if out_dtype is None else out_dtype) + if residual is not None or (residual_dtype is not None and residual_dtype != x.dtype): + residual_out = torch.empty(M, N, device=x.device, dtype=residual_dtype) + else: + residual_out = None + mean = torch.empty((M,), dtype=torch.float32, device="cuda") if not is_rms_norm else None + rstd = torch.empty((M,), dtype=torch.float32, device="cuda") + # Less than 64KB per feature: enqueue fused kernel + MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(N)) + if N > BLOCK_N: + raise RuntimeError("This layer norm doesn't support feature dim >= 64KB.") + # heuristics for number of warps + with torch.cuda.device(x.device.index): + _layer_norm_fwd_1pass_kernel[(M,)]( + x, + y, + weight, + bias, + residual, + residual_out, + mean, + rstd, + x.stride(0), + y.stride(0), + residual.stride(0) if residual is not None else 0, + residual_out.stride(0) if residual_out is not None else 0, + N, + G, + eps, + is_rms_norm, + BLOCK_N, + residual is not None, + residual_out is not None, + weight is not None, + bias is not None, + ) + # residual_out is None if residual is None and residual_dtype == input_dtype + return y, mean, rstd, residual_out if residual_out is not None else x + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32), + ], + key=["N", "HAS_DRESIDUAL", "STORE_DRESIDUAL", "IS_RMS_NORM", "HAS_BIAS"], +) +# @triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +# @triton.heuristics({"HAS_DRESIDUAL": lambda args: args["DRESIDUAL"] is not None}) +# @triton.heuristics({"STORE_DRESIDUAL": lambda args: 
args["DRESIDUAL_IN"] is not None}) +@triton.heuristics({"RECOMPUTE_OUTPUT": lambda args: args["Y"] is not None}) +@triton.jit +def _layer_norm_bwd_kernel( + X, # pointer to the input + W, # pointer to the weights + B, # pointer to the biases + Y, # pointer to the output to be recomputed + DY, # pointer to the output gradient + DX, # pointer to the input gradient + DW, # pointer to the partial sum of weights gradient + DB, # pointer to the partial sum of biases gradient + DRESIDUAL, + DRESIDUAL_IN, + Mean, # pointer to the mean + Rstd, # pointer to the 1/std + stride_x_row, # how much to increase the pointer when moving by 1 row + stride_y_row, + stride_dy_row, + stride_dx_row, + stride_dres_row, + stride_dres_in_row, + M, # number of rows in X + N, # number of columns in X + G, # number of groups + rows_per_program, + programs_per_group, + IS_RMS_NORM: tl.constexpr, + BLOCK_N: tl.constexpr, + HAS_DRESIDUAL: tl.constexpr, + STORE_DRESIDUAL: tl.constexpr, + HAS_WEIGHT: tl.constexpr, + HAS_BIAS: tl.constexpr, + RECOMPUTE_OUTPUT: tl.constexpr, +): + row_block_id = tl.program_id(0) + group_id, program_id_in_group = row_block_id // programs_per_group, row_block_id % programs_per_group + + row_start = group_id + program_id_in_group * G * rows_per_program + row_end = min(row_start + G * rows_per_program, M) + + cols = tl.arange(0, BLOCK_N) + mask = cols < N + + if HAS_WEIGHT: + w = tl.load(W + group_id * stride_x_row + cols, mask=mask).to(tl.float32) + dw = tl.zeros((BLOCK_N,), dtype=tl.float32) + if RECOMPUTE_OUTPUT and HAS_BIAS: + b = tl.load(B + group_id * stride_x_row + cols, mask=mask, other=0.0).to(tl.float32) + if HAS_BIAS: + db = tl.zeros((BLOCK_N,), dtype=tl.float32) + + for row in range(row_start, row_end, G): + # Load data to SRAM + x = tl.load(X + row * stride_x_row + cols, mask=mask, other=0).to(tl.float32) + dy = tl.load(DY + row * stride_dy_row + cols, mask=mask, other=0).to(tl.float32) + if not IS_RMS_NORM: + mean = tl.load(Mean + row) + rstd = tl.load(Rstd + row) + # Compute dx + xhat = (x - mean) * rstd if not IS_RMS_NORM else x * rstd + xhat = tl.where(mask, xhat, 0.0) + if RECOMPUTE_OUTPUT: + y = xhat * w if HAS_WEIGHT else xhat + if HAS_BIAS: + y = y + b + tl.store(Y + row * stride_y_row + cols, y, mask=mask) + wdy = dy + if HAS_WEIGHT: + wdy = dy * w + dw += dy * xhat + if HAS_BIAS: + db += dy + if not IS_RMS_NORM: + c1 = tl.sum(xhat * wdy, axis=0) / N + c2 = tl.sum(wdy, axis=0) / N + dx = (wdy - (xhat * c1 + c2)) * rstd + else: + c1 = tl.sum(xhat * wdy, axis=0) / N + dx = (wdy - xhat * c1) * rstd + if HAS_DRESIDUAL: + dres = tl.load(DRESIDUAL + row * stride_dres_row + cols, mask=mask, other=0).to(tl.float32) + dx += dres + # Write dx + if STORE_DRESIDUAL: + tl.store(DRESIDUAL_IN + row * stride_dres_in_row + cols, dx, mask=mask) + tl.store(DX + row * stride_dx_row + cols, dx, mask=mask) + + if HAS_WEIGHT: + tl.store(DW + row_block_id * N + cols, dw, mask=mask) + if HAS_BIAS: + tl.store(DB + row_block_id * N + cols, db, mask=mask) + + +def _layer_norm_bwd( + dy, + x, + weight, + bias, + eps, + mean, + rstd, + dresidual=None, + has_residual=False, + is_rms_norm=False, + x_dtype=None, + recompute_output=False, + num_groups=1 +): + M, N, G = *x.shape, num_groups + assert dy.shape == (M, N) + if dresidual is not None: + assert dresidual.shape == (M, N) + if weight is not None: + assert weight.shape == (G * N,) + if bias is not None: + assert bias.shape == (G * N,) + # allocate output + dx = torch.empty_like(x) if x_dtype is None else torch.empty(M, N, dtype=x_dtype, 
device=x.device) + dresidual_in = torch.empty_like(x) if has_residual and dx.dtype != x.dtype else None + y = torch.empty(M, N, dtype=dy.dtype, device=dy.device) if recompute_output else None + + # Less than 64KB per feature: enqueue fused kernel + MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(N)) + if N > BLOCK_N: + raise RuntimeError("This layer norm doesn't support feature dim >= 64KB.") + # each program handles one group only + S = triton.cdiv(torch.cuda.get_device_properties(x.device).multi_processor_count, G) * G + dw = torch.empty((S, N), dtype=torch.float32, device=weight.device) if weight is not None else None + db = torch.empty((S, N), dtype=torch.float32, device=bias.device) if bias is not None else None + rows_per_program = triton.cdiv(M, S) + programs_per_group = S // G + grid = (S,) + with torch.cuda.device(x.device.index): + _layer_norm_bwd_kernel[grid]( + x, + weight, + bias, + y, + dy, + dx, + dw, + db, + dresidual, + dresidual_in, + mean, + rstd, + x.stride(0), + 0 if not recompute_output else y.stride(0), + dy.stride(0), + dx.stride(0), + dresidual.stride(0) if dresidual is not None else 0, + dresidual_in.stride(0) if dresidual_in is not None else 0, + M, + N, + G, + rows_per_program, + programs_per_group, + is_rms_norm, + BLOCK_N, + dresidual is not None, + dresidual_in is not None, + weight is not None, + bias is not None, + ) + dw = dw.view(G, -1, N).sum(1).to(weight).view_as(weight) if weight is not None else None + db = db.view(G, -1, N).sum(1).to(bias).view_as(bias) if bias is not None else None + # Don't need to compute dresidual_in separately in this case + if has_residual and dx.dtype == x.dtype: + dresidual_in = dx + return (dx, dw, db, dresidual_in) if not recompute_output else (dx, dw, db, dresidual_in, y) + + +class LayerNormFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + x, + weight, + bias, + residual=None, + eps=1e-5, + prenorm=False, + residual_in_fp32=False, + is_rms_norm=False, + num_groups=1 + ): + x_shape_og = x.shape + + if x.shape[-1] % num_groups != 0: + raise ValueError('num_channels must be divisible by num_groups') + # reshape input data into 2D tensor + x = x.reshape(-1, (x.shape[-1] // num_groups)) + if residual is not None: + assert residual.shape == x_shape_og + residual = residual.reshape_as(x) + residual_dtype = ( + residual.dtype + if residual is not None + else (torch.float32 if residual_in_fp32 else None) + ) + y, mean, rstd, residual_out = _layer_norm_fwd( + x, weight, bias, eps, residual, + residual_dtype=residual_dtype, + is_rms_norm=is_rms_norm, + num_groups=num_groups + ) + ctx.save_for_backward(residual_out, weight, bias, mean, rstd) + ctx.x_shape_og = x_shape_og + ctx.eps = eps + ctx.is_rms_norm = is_rms_norm + ctx.num_groups = num_groups + ctx.has_residual = residual is not None + ctx.prenorm = prenorm + ctx.x_dtype = x.dtype + y = y.reshape(x_shape_og) + return y if not prenorm else (y, residual_out.reshape(x_shape_og)) + + @staticmethod + @contiguous + def backward(ctx, dy, *args): + x, weight, bias, mean, rstd = ctx.saved_tensors + dy = dy.reshape(-1, (dy.shape[-1] // ctx.num_groups)) + assert dy.shape == x.shape + if ctx.prenorm: + dresidual = args[0] + dresidual = dresidual.reshape(-1, x.shape[-1]) + assert dresidual.shape == x.shape + else: + dresidual = None + dx, dw, db, dresidual_in = _layer_norm_bwd( + dy, + x, + weight, + bias, + ctx.eps, + mean, + rstd, + dresidual, + ctx.has_residual, + ctx.is_rms_norm, + 
x_dtype=ctx.x_dtype, + num_groups=ctx.num_groups + ) + return ( + dx.reshape(ctx.x_shape_og), + dw, + db, + dresidual_in.reshape(ctx.x_shape_og) if ctx.has_residual else None, + None, + None, + None, + None, + None + ) + + +def layer_norm( + x: torch.Tensor, + weight: torch.Tensor, + bias: torch.Tensor, + residual: torch.Tensor = None, + eps: float = 1e-5, + prenorm: bool = False, + residual_in_fp32: bool = False, + is_rms_norm: bool = False +): + return LayerNormFunction.apply( + x, + weight, + bias, + residual, + eps, + prenorm, + residual_in_fp32, + is_rms_norm + ) + + +def group_norm( + x: torch.Tensor, + weight: torch.Tensor, + bias: torch.Tensor, + residual: torch.Tensor = None, + eps: float = 1e-5, + prenorm: bool = False, + residual_in_fp32: bool = False, + is_rms_norm: bool = False, + num_groups: int = 1 +): + return LayerNormFunction.apply( + x, + weight, + bias, + residual, + eps, + prenorm, + residual_in_fp32, + is_rms_norm, + num_groups + ) + + +def rms_norm( + x: torch.Tensor, + weight: torch.Tensor, + bias: torch.Tensor, + residual: torch.Tensor = None, + eps: float = 1e-5, + prenorm: bool = False, + residual_in_fp32: bool = False +): + return LayerNormFunction.apply( + x, + weight, + bias, + residual, + eps, + prenorm, + residual_in_fp32, + True + ) + + +class LayerNorm(nn.Module): + + def __init__( + self, + hidden_size: int, + elementwise_affine: bool = True, + bias: bool = False, + eps: float = 1e-5 + ) -> LayerNorm: + super().__init__() + + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + self.register_parameter("weight", None) + self.register_parameter("bias", None) + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + if bias: + self.bias = nn.Parameter(torch.zeros(hidden_size)) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, residual=None, prenorm=False, residual_in_fp32=False): + return layer_norm( + x, + self.weight, + self.bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32 + ) + + +class GroupNorm(nn.Module): + + def __init__( + self, + num_groups: int, + hidden_size: int, + elementwise_affine: bool = True, + bias: bool = False, + eps: float = 1e-5 + ) -> GroupNorm: + super().__init__() + + if hidden_size % num_groups != 0: + raise ValueError('num_channels must be divisible by num_groups') + + self.num_groups = num_groups + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + self.register_parameter("weight", None) + self.register_parameter("bias", None) + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + if bias: + self.bias = nn.Parameter(torch.zeros(hidden_size)) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.num_groups}, {self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, residual=None, prenorm=False, residual_in_fp32=False): + return group_norm( + x, + self.weight, + self.bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32, + num_groups=self.num_groups + ) + + +class RMSNorm(nn.Module): + + def __init__( + self, + hidden_size: int, + elementwise_affine: bool = True, 
+ bias: bool = False, + eps: float = 1e-5 + ) -> RMSNorm: + super().__init__() + + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + self.register_parameter("weight", None) + self.register_parameter("bias", None) + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + if bias: + self.bias = nn.Parameter(torch.zeros(hidden_size)) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, residual=None, prenorm=False, residual_in_fp32=False): + return rms_norm( + x, + self.weight, + self.bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32, + ) + + +class LayerNormLinearFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + x, + norm_weight, + norm_bias, + linear_weight, + linear_bias, + residual=None, + eps=1e-5, + prenorm=False, + residual_in_fp32=False, + is_rms_norm=False, + num_groups=1 + ): + x_shape_og = x.shape + + if x.shape[-1] % num_groups != 0: + raise ValueError('num_channels must be divisible by num_groups') + # reshape input data into 2D tensor + x = x.reshape(-1, (x.shape[-1] // num_groups)) + if residual is not None: + assert residual.shape == x_shape_og + residual = residual.reshape_as(x) + residual_dtype = ( + residual.dtype + if residual is not None + else (torch.float32 if residual_in_fp32 else None) + ) + y, mean, rstd, residual_out = _layer_norm_fwd( + x, + norm_weight, + norm_bias, + eps, + residual, + out_dtype=None if not torch.is_autocast_enabled() else torch.get_autocast_gpu_dtype(), + residual_dtype=residual_dtype, + is_rms_norm=is_rms_norm, + num_groups=num_groups + ) + y = y.reshape(x_shape_og) + dtype = torch.get_autocast_gpu_dtype() if torch.is_autocast_enabled() else y.dtype + linear_weight = linear_weight.to(dtype) + linear_bias = linear_bias.to(dtype) if linear_bias is not None else None + out = F.linear(y.to(linear_weight.dtype), linear_weight, linear_bias) + # We don't store y, will be recomputed in the backward pass to save memory + ctx.save_for_backward(residual_out, norm_weight, norm_bias, linear_weight, mean, rstd) + ctx.x_shape_og = x_shape_og + ctx.eps = eps + ctx.is_rms_norm = is_rms_norm + ctx.num_groups = num_groups + ctx.has_residual = residual is not None + ctx.prenorm = prenorm + ctx.x_dtype = x.dtype + ctx.linear_bias_is_none = linear_bias is None + return out if not prenorm else (out, residual_out.reshape(x_shape_og)) + + @staticmethod + @contiguous + def backward(ctx, dout, *args): + x, norm_weight, norm_bias, linear_weight, mean, rstd = ctx.saved_tensors + dout = dout.reshape(-1, dout.shape[-1]) + dy = F.linear(dout, linear_weight.t()) + dy = dy.reshape(-1, (dy.shape[-1] // ctx.num_groups)) + dlinear_bias = None if ctx.linear_bias_is_none else dout.sum(0) + assert dy.shape == x.shape + if ctx.prenorm: + dresidual = args[0] + dresidual = dresidual.reshape(-1, x.shape[-1]) + assert dresidual.shape == x.shape + else: + dresidual = None + dx, dnorm_weight, dnorm_bias, dresidual_in, y = _layer_norm_bwd( + dy, + x, + norm_weight, + norm_bias, + ctx.eps, + mean, + rstd, + dresidual, + ctx.has_residual, + ctx.is_rms_norm, + x_dtype=ctx.x_dtype, + recompute_output=True, + num_groups=ctx.num_groups + ) + dlinear_weight = torch.einsum("bo,bi->oi", dout, y.view(-1, linear_weight.shape[-1])) + return ( + 
dx.reshape(ctx.x_shape_og), + dnorm_weight, + dnorm_bias, + dlinear_weight, + dlinear_bias, + dresidual_in.reshape(ctx.x_shape_og) if ctx.has_residual else None, + None, + None, + None, + None, + None + ) + + +class LayerNormLinear(nn.Module): + + def __init__( + self, + hidden_size, + elementwise_affine: bool = True, + bias: bool = False, + eps: float = 1e-5 + ) -> LayerNormLinear: + super().__init__() + + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + self.register_parameter("weight", None) + self.register_parameter("bias", None) + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + if bias: + self.bias = nn.Parameter(torch.zeros(hidden_size)) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, weight, bias, residual=None, prenorm=False, residual_in_fp32=False): + return layer_norm_linear( + x=x, + norm_weight=self.weight, + norm_bias=self.bias, + linear_weight=weight, + linear_bias=bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32, + is_rms_norm=False + ) + + +class GroupNormLinear(nn.Module): + + def __init__( + self, + num_groups: int, + hidden_size: int, + elementwise_affine: bool = True, + bias: bool = False, + eps: float = 1e-5 + ) -> GroupNormLinear: + super().__init__() + + if hidden_size % num_groups != 0: + raise ValueError('num_channels must be divisible by num_groups') + + self.num_groups = num_groups + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + self.register_parameter("weight", None) + self.register_parameter("bias", None) + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + if bias: + self.bias = nn.Parameter(torch.zeros(hidden_size)) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.num_groups}, {self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, weight, bias, residual=None, prenorm=False, residual_in_fp32=False): + return layer_norm_linear( + x=x, + norm_weight=self.weight, + norm_bias=self.bias, + linear_weight=weight, + linear_bias=bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32, + is_rms_norm=False, + num_groups=self.num_groups + ) + + +class RMSNormLinear(nn.Module): + + def __init__( + self, + hidden_size, + elementwise_affine: bool = True, + bias: bool = False, + eps: float = 1e-5 + ) -> RMSNormLinear: + super().__init__() + + self.hidden_size = hidden_size + self.elementwise_affine = elementwise_affine + self.eps = eps + + self.register_parameter("weight", None) + self.register_parameter("bias", None) + if elementwise_affine: + self.weight = nn.Parameter(torch.ones(hidden_size)) + if bias: + self.bias = nn.Parameter(torch.zeros(hidden_size)) + + def __repr__(self) -> str: + s = f"{self.__class__.__name__}({self.hidden_size}" + if not self.elementwise_affine: + s += f", elementwise_affine={self.elementwise_affine}" + s += f", eps={self.eps}" + s += ")" + return s + + def forward(self, x, weight, bias, residual=None, prenorm=False, residual_in_fp32=False): + return layer_norm_linear( + x=x, + norm_weight=self.weight, + norm_bias=self.bias, + linear_weight=weight, + 
linear_bias=bias, + residual=residual, + eps=self.eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32, + is_rms_norm=True + ) + + +def layer_norm_linear( + x: torch.Tensor, + norm_weight: torch.Tensor, + norm_bias: torch.Tensor, + linear_weight: torch.Tensor, + linear_bias: torch.Tensor, + residual: torch.Tensor = None, + eps: float = 1e-5, + prenorm: bool = False, + residual_in_fp32: bool = False, + is_rms_norm: bool = False, + num_groups: int = 1 +): + return LayerNormLinearFunction.apply( + x, + norm_weight, + norm_bias, + linear_weight, + linear_bias, + residual, + eps, + prenorm, + residual_in_fp32, + is_rms_norm, + num_groups + ) + + +def rms_norm_linear( + x: torch.Tensor, + norm_weight: torch.Tensor, + norm_bias: torch.Tensor, + linear_weight: torch.Tensor, + linear_bias: torch.Tensor, + residual: torch.Tensor = None, + eps: float = 1e-5, + prenorm: bool = False, + residual_in_fp32: bool = False +): + return layer_norm_linear( + x=x, + norm_weight=norm_weight, + norm_bias=norm_bias, + linear_weight=linear_weight, + linear_bias=linear_bias, + residual=residual, + eps=eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32, + is_rms_norm=True + ) + + +def group_norm_linear( + x: torch.Tensor, + norm_weight: torch.Tensor, + norm_bias: torch.Tensor, + linear_weight: torch.Tensor, + linear_bias: torch.Tensor, + residual: torch.Tensor = None, + eps: float = 1e-5, + prenorm: bool = False, + residual_in_fp32: bool = False, + is_rms_norm: bool = False, + num_groups: int = 1 +): + return layer_norm_linear( + x=x, + norm_weight=norm_weight, + norm_bias=norm_bias, + linear_weight=linear_weight, + linear_bias=linear_bias, + residual=residual, + eps=eps, + prenorm=prenorm, + residual_in_fp32=residual_in_fp32, + is_rms_norm=is_rms_norm, + num_groups=num_groups + ) diff --git a/fla/modules/layernorm_gated.py b/fla/modules/layernorm_gated.py new file mode 100644 index 0000000000000000000000000000000000000000..5faf02f4d84169901f800152509d494868aac49c --- /dev/null +++ b/fla/modules/layernorm_gated.py @@ -0,0 +1,447 @@ +# Copyright (c) 2024, Tri Dao. +# Based on the Triton LayerNorm tutorial: https://triton-lang.org/main/getting-started/tutorials/05-layer-norm.html +# For the backward pass, we keep weight_grad and bias_grad in registers and accumulate. +# This backward pass is faster for dimensions up to 8k, but after that it's much slower due to register spilling. +# The models we train have hidden dim up to 8k anyway (e.g. Llama 70B), so this is fine. + +import math + +import torch +import torch.nn.functional as F +import triton +import triton.language as tl +from einops import rearrange + + +def rms_norm_ref(x, weight, bias, z=None, eps=1e-6, group_size=None, norm_before_gate=True, upcast=True): + dtype = x.dtype + weight = weight.float() + bias = bias.float() if bias is not None else None + if upcast: + x = x.float() + z = z.float() if z is not None else z + if z is not None and not norm_before_gate: + x = x * F.silu(z) + if group_size is None: + rstd = 1 / torch.sqrt((x.square()).mean(dim=-1, keepdim=True) + eps) + out = (x * rstd * weight) + bias if bias is not None else (x * rstd * weight) + else: + x_group = rearrange(x, "... (g d) -> ... g d", d=group_size) + rstd = 1 / torch.sqrt((x_group.square()).mean(dim=-1, keepdim=True) + eps) + out = rearrange(x_group * rstd, "... g d -> ... 
(g d)") * weight + if bias is not None: + out = out + bias + if z is not None and norm_before_gate: + out *= F.silu(z) + return out.to(dtype) + + +@triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +@triton.heuristics({"HAS_Z": lambda args: args["Z"] is not None}) +@triton.jit +def _layer_norm_fwd_1pass_kernel( + X, # pointer to the input + Y, # pointer to the output + W, # pointer to the weights + B, # pointer to the biases + Z, # pointer to the other branch + Mean, # pointer to the mean + Rstd, # pointer to the 1/std + stride_x_row, # how much to increase the pointer when moving by 1 row + stride_y_row, + stride_z_row, + M, # number of rows in X + N, # number of columns in X + eps, # epsilon to avoid division by zero + BLOCK_N: tl.constexpr, + HAS_BIAS: tl.constexpr, + HAS_Z: tl.constexpr, + NORM_BEFORE_GATE: tl.constexpr, + IS_RMS_NORM: tl.constexpr, +): + # Map the program id to the row of X and Y it should compute. + row = tl.program_id(0) + group = tl.program_id(1) + X += row * stride_x_row + group * N + Y += row * stride_y_row + group * N + if HAS_Z: + Z += row * stride_z_row + group * N + if not IS_RMS_NORM: + Mean += group * M + Rstd += group * M + W += group * N + if HAS_BIAS: + B += group * N + # Compute mean and variance + cols = tl.arange(0, BLOCK_N) + x = tl.load(X + cols, mask=cols < N, other=0.).to(tl.float32) + if HAS_Z and not NORM_BEFORE_GATE: + z = tl.load(Z + cols, mask=cols < N).to(tl.float32) + x *= z * tl.sigmoid(z) + if not IS_RMS_NORM: + mean = tl.sum(x, axis=0) / N + tl.store(Mean + row, mean) + xbar = tl.where(cols < N, x - mean, 0.) + var = tl.sum(xbar * xbar, axis=0) / N + else: + xbar = tl.where(cols < N, x, 0.) + var = tl.sum(xbar * xbar, axis=0) / N + rstd = 1 / tl.sqrt(var + eps) + tl.store(Rstd + row, rstd) + # Normalize and apply linear transformation + mask = cols < N + w = tl.load(W + cols, mask=mask).to(tl.float32) + if HAS_BIAS: + b = tl.load(B + cols, mask=mask).to(tl.float32) + x_hat = (x - mean) * rstd if not IS_RMS_NORM else x * rstd + y = x_hat * w + b if HAS_BIAS else x_hat * w + if HAS_Z and NORM_BEFORE_GATE: + z = tl.load(Z + cols, mask=mask).to(tl.float32) + y *= z * tl.sigmoid(z) + # Write output + tl.store(Y + cols, y, mask=mask) + + +def _layer_norm_fwd(x, weight, bias, eps, z=None, out=None, group_size=None, norm_before_gate=True, is_rms_norm=False): + M, N = x.shape + if group_size is None: + group_size = N + assert N % group_size == 0 + ngroups = N // group_size + assert x.stride(-1) == 1 + if z is not None: + assert z.stride(-1) == 1 + assert z.shape == (M, N) + assert weight.shape == (N,) + assert weight.stride(-1) == 1 + if bias is not None: + assert bias.stride(-1) == 1 + assert bias.shape == (N,) + # allocate output + if out is not None: + assert out.shape == x.shape + else: + out = torch.empty_like(x) + assert out.stride(-1) == 1 + mean = torch.empty((ngroups * M, ), dtype=torch.float32, device=x.device) if not is_rms_norm else None + rstd = torch.empty((ngroups * M, ), dtype=torch.float32, device=x.device) + # Less than 64KB per feature: enqueue fused kernel + MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(group_size)) + if group_size > BLOCK_N: + raise RuntimeError("This layer norm doesn't support feature dim >= 64KB.") + # heuristics for number of warps + num_warps = min(max(BLOCK_N // 256, 1), 8) + grid = (M, ngroups) + with torch.cuda.device(x.device.index): + _layer_norm_fwd_1pass_kernel[grid](x, out, weight, bias, z, mean, rstd, + x.stride(0), 
out.stride(0), z.stride(0) if z is not None else 0, + M, group_size, eps, + BLOCK_N=BLOCK_N, + NORM_BEFORE_GATE=norm_before_gate, + IS_RMS_NORM=is_rms_norm, + num_warps=num_warps) + return out, mean, rstd + + +@triton.heuristics({"HAS_BIAS": lambda args: args["B"] is not None}) +@triton.heuristics({"HAS_Z": lambda args: args["Z"] is not None}) +@triton.heuristics({"RECOMPUTE_OUTPUT": lambda args: args["Y"] is not None}) +@triton.jit +def _layer_norm_bwd_kernel( + X, # pointer to the input + W, # pointer to the weights + B, # pointer to the biases + Z, # pointer to the other branch + Y, # pointer to the output to be recomputed + DY, # pointer to the output gradient + DX, # pointer to the input gradient + DW, # pointer to the partial sum of weights gradient + DB, # pointer to the partial sum of biases gradient + DZ, # pointer to the other branch + Mean, # pointer to the mean + Rstd, # pointer to the 1/std + stride_x_row, # how much to increase the pointer when moving by 1 row + stride_z_row, + stride_y_row, + stride_dy_row, + stride_dx_row, + stride_dz_row, + stride_dw_row, + stride_db_row, + M, # number of rows in X + N, # number of columns in X + eps, # epsilon to avoid division by zero + rows_per_program, + NORM_BEFORE_GATE: tl.constexpr, + IS_RMS_NORM: tl.constexpr, + HAS_BIAS: tl.constexpr, + HAS_Z: tl.constexpr, + RECOMPUTE_OUTPUT: tl.constexpr, + BLOCK_N: tl.constexpr, +): + # Map the program id to the elements of X, DX, and DY it should compute. + row_block_id = tl.program_id(0) + group = tl.program_id(1) + row_start = row_block_id * rows_per_program + cols = tl.arange(0, BLOCK_N) + mask = cols < N + X += row_start * stride_x_row + group * N + if HAS_Z: + Z += row_start * stride_z_row + group * N + DZ += row_start * stride_dz_row + group * N + DY += row_start * stride_dy_row + group * N + DX += row_start * stride_dx_row + group * N + if RECOMPUTE_OUTPUT: + Y += row_start * stride_y_row + group * N + if not IS_RMS_NORM: + Mean += group * M + Rstd += group * M + W += group * N + w = tl.load(W + cols, mask=mask).to(tl.float32) + if (RECOMPUTE_OUTPUT or HAS_Z) and HAS_BIAS: + B += group * N + b = tl.load(B + cols, mask=mask, other=0.).to(tl.float32) + dw = tl.zeros((BLOCK_N,), dtype=tl.float32) + if HAS_BIAS: + db = tl.zeros((BLOCK_N,), dtype=tl.float32) + row_end = min((row_block_id + 1) * rows_per_program, M) + for row in range(row_start, row_end): + # Load data to SRAM + x = tl.load(X + cols, mask=mask, other=0).to(tl.float32) + dy = tl.load(DY + cols, mask=mask, other=0).to(tl.float32) + if not IS_RMS_NORM: + mean = tl.load(Mean + row) + if HAS_Z and not NORM_BEFORE_GATE: + z = tl.load(Z + cols, mask=mask, other=0.).to(tl.float32) + x_og = x + x = x_og * z * tl.sigmoid(z) + rstd = tl.load(Rstd + row) + # Compute dx + xhat = (x - mean) * rstd if not IS_RMS_NORM else x * rstd + xhat = tl.where(mask, xhat, 0.) 
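+        # With norm-before-gate (out = norm(x) * silu(z)), first recover the
+        # gate gradient dz from the recomputed y, then rescale dy by silu(z)
+        # before the standard norm backward below.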
+ if HAS_Z and NORM_BEFORE_GATE: + z = tl.load(Z + cols, mask=mask, other=0.).to(tl.float32) + z_sigmoid = tl.sigmoid(z) + y = xhat * w + b if HAS_BIAS else xhat * w + if RECOMPUTE_OUTPUT: + tl.store(Y + cols, y * z * z_sigmoid, mask=mask) + dz = dy * y * z_sigmoid * (1 + z * (1 - z_sigmoid)) + tl.store(DZ + cols, dz, mask=mask) + dy *= z * z_sigmoid + else: + if RECOMPUTE_OUTPUT: + y = xhat * w + b if HAS_BIAS else xhat * w + tl.store(Y + cols, y, mask=mask) + wdy = w * dy + c1 = tl.sum(xhat * wdy, axis=0) / N + if not IS_RMS_NORM: + c2 = tl.sum(wdy, axis=0) / N + dx = (wdy - (xhat * c1 + c2)) * rstd + else: + dx = (wdy - xhat * c1) * rstd + dw += dy * xhat + if HAS_BIAS: + db += dy + if HAS_Z and not NORM_BEFORE_GATE: + z_sigmoid = tl.sigmoid(z) + dz = dx * x_og * z_sigmoid * (1 + z * (1 - z_sigmoid)) + tl.store(DZ + cols, dz, mask=mask) + dx *= z * z_sigmoid + # Write dx + tl.store(DX + cols, dx, mask=mask) + + X += stride_x_row + if HAS_Z: + Z += stride_z_row + DZ += stride_dz_row + if RECOMPUTE_OUTPUT: + Y += stride_y_row + DY += stride_dy_row + DX += stride_dx_row + tl.store(DW + row_block_id * stride_dw_row + group * N + cols, dw, mask=mask) + if HAS_BIAS: + tl.store(DB + row_block_id * stride_db_row + group * N + cols, db, mask=mask) + + +def _layer_norm_bwd(dy, x, weight, bias, eps, mean, rstd, z=None, group_size=None, + norm_before_gate=True, is_rms_norm=False, recompute_output=False, dz=None, out=None): + M, N = x.shape + if group_size is None: + group_size = N + assert N % group_size == 0 + ngroups = N // group_size + assert x.stride(-1) == 1 + assert dy.stride(-1) == 1 + assert dy.shape == (M, N) + if z is not None: + assert z.stride(-1) == 1 + assert z.shape == (M, N) + assert weight.shape == (N,) + assert weight.stride(-1) == 1 + if bias is not None: + assert bias.stride(-1) == 1 + assert bias.shape == (N,) + # allocate output + dx = torch.empty_like(x) + if dz is not None: + assert z is not None + assert dz.shape == z.shape + assert dz.stride(-1) == 1 + else: + dz = torch.empty_like(z) if z is not None else None + if recompute_output: + if out is None: + out = torch.empty_like(x) + assert out.shape == x.shape + + # Less than 64KB per feature: enqueue fused kernel + MAX_FUSED_SIZE = 65536 // x.element_size() + BLOCK_N = min(MAX_FUSED_SIZE, triton.next_power_of_2(group_size)) + if group_size > BLOCK_N: + raise RuntimeError("This layer norm doesn't support feature dim >= 64KB.") + # heuristics for number of warps + num_warps = min(max(BLOCK_N // 256, 1), 8) + sm_count = torch.cuda.get_device_properties(x.device).multi_processor_count + # If group size is small (e.g., 64), we're only using 1 warp. So having just 108 programs + # would limit the occupancy. 
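+    # Oversubscribe the SMs with several row blocks per group; each block writes
+    # a partial dw/db row that is reduced by a sum over dim 0 afterwards.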
+    nrow_groups = math.ceil(sm_count * math.ceil(4 / num_warps) / ngroups)
+    _dw = torch.empty((nrow_groups, N), dtype=torch.float32, device=weight.device)
+    _db = torch.empty((nrow_groups, N), dtype=torch.float32, device=bias.device) if bias is not None else None
+    rows_per_program = math.ceil(M / nrow_groups)
+    grid = (nrow_groups, ngroups)
+    with torch.cuda.device(x.device.index):
+        _layer_norm_bwd_kernel[grid](x, weight, bias, z, out if recompute_output else None,
+                                     dy, dx, _dw, _db, dz, mean, rstd,
+                                     x.stride(0),
+                                     z.stride(0) if z is not None else 0,
+                                     0 if not recompute_output else out.stride(0),
+                                     dy.stride(0), dx.stride(0),
+                                     dz.stride(0) if dz is not None else 0,
+                                     _dw.stride(0),
+                                     _db.stride(0) if _db is not None else 0,
+                                     M, group_size, eps,
+                                     rows_per_program,
+                                     BLOCK_N=BLOCK_N,
+                                     NORM_BEFORE_GATE=norm_before_gate,
+                                     IS_RMS_NORM=is_rms_norm,
+                                     num_warps=num_warps)
+    dw = _dw.sum(0).to(weight.dtype)
+    db = _db.sum(0).to(bias.dtype) if bias is not None else None
+    return (dx, dw, db, dz) if not recompute_output else (dx, dw, db, dz, out)
+
+
+class LayerNormFn(torch.autograd.Function):
+
+    @staticmethod
+    def forward(ctx, x, weight, bias, z=None, eps=1e-6, group_size=None, norm_before_gate=True,
+                is_rms_norm=False):
+        """If z is not None, we do norm(x) * silu(z) if norm_before_gate, else norm(x * silu(z))
+        """
+
+        x_shape_og = x.shape
+        # reshape input data into 2D tensor
+        x = x.reshape(-1, x.shape[-1])
+        if x.stride(-1) != 1:
+            x = x.contiguous()
+        if z is not None:
+            assert z.shape == x_shape_og
+            z = z.reshape(-1, z.shape[-1])
+            if z.stride(-1) != 1:
+                z = z.contiguous()
+        weight = weight.contiguous()
+        if bias is not None:
+            bias = bias.contiguous()
+        y, mean, rstd = _layer_norm_fwd(x, weight, bias, eps, z=z, group_size=group_size,
+                                        norm_before_gate=norm_before_gate, is_rms_norm=is_rms_norm)
+        ctx.save_for_backward(x, weight, bias, mean, rstd, z)
+        ctx.x_shape_og = x_shape_og
+        ctx.eps = eps
+        ctx.group_size = group_size
+        ctx.norm_before_gate = norm_before_gate
+        ctx.is_rms_norm = is_rms_norm
+        return y.reshape(x_shape_og)
+
+    @staticmethod
+    def backward(ctx, dy):
+        x, weight, bias, mean, rstd, z = ctx.saved_tensors
+        dy = dy.reshape(-1, dy.shape[-1])
+        if dy.stride(-1) != 1:
+            dy = dy.contiguous()
+        assert dy.shape == x.shape
+        dx, dw, db, dz = _layer_norm_bwd(
+            dy,
+            x,
+            weight,
+            bias,
+            ctx.eps,
+            mean,
+            rstd,
+            z,
+            ctx.group_size,
+            ctx.norm_before_gate,
+            ctx.is_rms_norm
+        )
+        dx = dx.reshape(ctx.x_shape_og)
+        dz = dz.reshape(ctx.x_shape_og) if dz is not None else None
+        return dx, dw, db, dz, None, None, None, None
+
+
+def layernorm_fn(x, weight, bias, z=None, eps=1e-6, group_size=None, norm_before_gate=True, is_rms_norm=False):
+    return LayerNormFn.apply(x, weight, bias, z, eps, group_size, norm_before_gate, is_rms_norm)
+
+
+def rmsnorm_fn(x, weight, bias, z=None, eps=1e-6, group_size=None, norm_before_gate=True):
+    return LayerNormFn.apply(x, weight, bias, z, eps, group_size, norm_before_gate, True)
+
+
+class LayerNormGated(torch.nn.Module):
+
+    def __init__(self, hidden_size, eps=1e-5, group_size=None, norm_before_gate=True, device=None, dtype=None):
+        """If group_size is not None, we do GroupNorm with each group having group_size elements.
+        group_size=None is equivalent to group_size=hidden_size (i.e. there's only 1 group).
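+
+        A minimal sketch of the gated path (shapes here are assumptions):
+            norm = LayerNormGated(512, norm_before_gate=True).cuda()
+            y = norm(x, z)  # layernorm(x) * silu(z) for x, z of shape (batch, seqlen, 512)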
+ """ + + factory_kwargs = {"device": device, "dtype": dtype} + super().__init__() + self.eps = eps + self.weight = torch.nn.Parameter(torch.empty(hidden_size, **factory_kwargs)) + self.bias = torch.nn.Parameter(torch.empty(hidden_size, **factory_kwargs)) + self.group_size = group_size + self.norm_before_gate = norm_before_gate + self.reset_parameters() + + def reset_parameters(self): + torch.nn.init.ones_(self.weight) + torch.nn.init.zeros_(self.bias) + + def forward(self, x, z=None): + """If z is not None, we do norm(x) * silu(z) if norm_before_gate, else norm(x * silu(z)) + """ + return layernorm_fn(x, self.weight, self.bias, z=z, group_size=self.group_size, eps=self.eps, + norm_before_gate=self.norm_before_gate) + + +class RMSNormGated(torch.nn.Module): + + def __init__(self, hidden_size, eps=1e-5, group_size=None, norm_before_gate=False, device=None, dtype=None): + """If group_size is not None, we do GroupNorm with each group having group_size elements. + group_size=None is equivalent to group_size=hidden_size (i.e. there's only 1 group). + """ + factory_kwargs = {"device": device, "dtype": dtype} + super().__init__() + self.eps = eps + self.weight = torch.nn.Parameter(torch.empty(hidden_size, **factory_kwargs)) + self.register_parameter("bias", None) + self.group_size = group_size + self.norm_before_gate = norm_before_gate + self.reset_parameters() + + def reset_parameters(self): + torch.nn.init.ones_(self.weight) + + def forward(self, x, z=None): + """If z is not None, we do norm(x) * silu(z) if norm_before_gate, else norm(x * silu(z)) + """ + return rmsnorm_fn(x, self.weight, self.bias, z=z, eps=self.eps, group_size=self.group_size, + norm_before_gate=self.norm_before_gate) diff --git a/fla/modules/rotary.py b/fla/modules/rotary.py new file mode 100644 index 0000000000000000000000000000000000000000..ed7ed9c99749975c6a13cb1ab8a788578f8d5864 --- /dev/null +++ b/fla/modules/rotary.py @@ -0,0 +1,304 @@ +# -*- coding: utf-8 -*- + +# Copyright (c) 2023, Tri Dao. + +from typing import Optional, Tuple, Union + +import torch +from einops import rearrange, repeat + +from fla.ops.rotary import apply_rotary + + +def rotate_half(x, interleaved=False): + if not interleaved: + x1, x2 = x.chunk(2, dim=-1) + return torch.cat((-x2, x1), dim=-1) + else: + x1, x2 = x[..., ::2], x[..., 1::2] + return rearrange(torch.stack((-x2, x1), dim=-1), "... d two -> ... (d two)", two=2) + + +def rotary_embedding_torch(x, cos, sin, interleaved=False): + """ + x: (batch_size, seqlen, nheads, headdim) + cos, sin: (seqlen, rotary_dim / 2) or (batch_size, seqlen, rotary_dim / 2) + """ + ro_dim = cos.shape[-1] * 2 + assert ro_dim <= x.shape[-1] + cos = repeat( + cos, "... d -> ... 1 (2 d)" if not interleaved else "... d -> ... 1 (d 2)") + sin = repeat( + sin, "... d -> ... 1 (2 d)" if not interleaved else "... d -> ... 
1 (d 2)") + return torch.cat( + [x[..., :ro_dim] * cos + + rotate_half(x[..., :ro_dim], interleaved) * sin, x[..., ro_dim:]], + dim=-1, + ) + + +class RotaryEmbeddingFunction(torch.autograd.Function): + + @staticmethod + def forward( + ctx, + x, + cos, + sin, + interleaved=False, + inplace=False, + seqlen_offsets: Union[int, torch.Tensor] = 0, + cu_seqlens: Optional[torch.Tensor] = None, + max_seqlen: Optional[int] = None, + ): + out = apply_rotary( + x, + cos, + sin, + seqlen_offsets=seqlen_offsets, + cu_seqlens=cu_seqlens, + max_seqlen=max_seqlen, + interleaved=interleaved, + inplace=inplace, + ) + if isinstance(seqlen_offsets, int): + # Can't save int with save_for_backward + ctx.save_for_backward(cos, sin, cu_seqlens) + ctx.seqlen_offsets = seqlen_offsets + else: + ctx.save_for_backward(cos, sin, cu_seqlens, seqlen_offsets) + ctx.seqlen_offsets = None + ctx.interleaved = interleaved + ctx.inplace = inplace + ctx.max_seqlen = max_seqlen + return out if not inplace else x + + @staticmethod + def backward(ctx, do): + seqlen_offsets = ctx.seqlen_offsets + if seqlen_offsets is None: + cos, sin, cu_seqlens, seqlen_offsets = ctx.saved_tensors + else: + cos, sin, cu_seqlens = ctx.saved_tensors + # TD [2023-09-02]: For some reason Triton (2.0.0.post1) errors with + # "[CUDA]: invalid device context", and cloning makes it work. Idk why. Triton 2.1.0 works. + if not ctx.interleaved and not ctx.inplace: + do = do.clone() + dx = apply_rotary( + do, + cos, + sin, + seqlen_offsets=seqlen_offsets, + cu_seqlens=cu_seqlens, + max_seqlen=ctx.max_seqlen, + interleaved=ctx.interleaved, + inplace=ctx.inplace, + conjugate=True, + ) + return dx, None, None, None, None, None, None, None + + +def rotary_embedding( + x, + cos, + sin, + interleaved=False, + inplace=False, + seqlen_offsets: Union[int, torch.Tensor] = 0, + cu_seqlens: Optional[torch.Tensor] = None, + max_seqlen: Optional[int] = None, +): + """ + Arguments: + x: (batch_size, seqlen, nheads, headdim) if cu_seqlens is None + else (total_seqlen, nheads, headdim) + cos, sin: (seqlen_rotary, rotary_dim / 2) + interleaved: if True, rotate pairs of even and odd dimensions (GPT-J style) instead + of 1st half and 2nd half (GPT-NeoX style). + inplace: if True, apply rotary embedding in-place. + seqlen_offsets: (batch_size,) or int. Each sequence in x is shifted by this amount. + Most commonly used in inference when we have KV cache. + cu_seqlens: (batch + 1,) or None + max_seqlen: int + Return: + out: (batch_size, seqlen, nheads, headdim) if cu_seqlens is None + else (total_seqlen, nheads, headdim) + rotary_dim must be <= headdim + Apply rotary embedding to the first rotary_dim of x. + """ + return RotaryEmbeddingFunction.apply( + x, cos, sin, interleaved, inplace, seqlen_offsets, cu_seqlens, max_seqlen + ) + + +class RotaryEmbedding(torch.nn.Module): + """ + The rotary position embeddings from RoFormer_ (Su et. al). + A crucial insight from the method is that the query and keys are + transformed by rotation matrices which depend on the relative positions. + + Other implementations are available in the Rotary Transformer repo_ and in + GPT-NeoX_, GPT-NeoX was an inspiration + + .. _RoFormer: https://arxiv.org/abs/2104.09864 + .. _repo: https://github.com/ZhuiyiTechnology/roformer + .. _GPT-NeoX: https://github.com/EleutherAI/gpt-neox + + If scale_base is not None, this implements XPos (Sun et al., https://arxiv.org/abs/2212.10554). 
+ A recommended value for scale_base is 512: https://github.com/HazyResearch/flash-attention/issues/96 + Reference: https://github.com/sunyt32/torchscale/blob/main/torchscale/component/xpos_relative_position.py + """ + + def __init__( + self, + dim: int, + base=10000.0, + interleaved=False, + scale_base=None, + pos_idx_in_fp32=True, + device=None, + ): + """ + interleaved: if True, rotate pairs of even and odd dimensions (GPT-J style) instead + of 1st half and 2nd half (GPT-NeoX style). + pos_idx_in_fp32: if True, the position indices [0.0, ..., seqlen - 1] are in fp32, + otherwise they might be in lower precision. + This option was added because previously (before 2023-07-02), when we construct + the position indices, we use the dtype of self.inv_freq. In most cases this would + be fp32, but if the model is trained in pure bf16 (not mixed precision), then + self.inv_freq would be bf16, and the position indices are also in bf16. + Because of the limited precision of bf16 (e.g. 1995.0 is rounded to 2000.0), the + embeddings for some positions will coincide. + To maintain compatibility with models previously trained in pure bf16, + we add this option. + """ + super().__init__() + self.dim = dim + self.base = float(base) + self.pos_idx_in_fp32 = pos_idx_in_fp32 + # Generate and save the inverse frequency buffer (non trainable) + inv_freq = self._compute_inv_freq(device) + self.register_buffer("inv_freq", inv_freq, persistent=False) + self.interleaved = interleaved + self.scale_base = scale_base + scale = ( + (torch.arange(0, dim, 2, device=device, + dtype=torch.float32) + 0.4 * dim) / (1.4 * dim) + if scale_base is not None + else None + ) + self.register_buffer("scale", scale, persistent=False) + + self._seq_len_cached = 0 + self._cos_cached = None + self._sin_cached = None + self._cos_k_cached = None + self._sin_k_cached = None + + def _compute_inv_freq(self, device=None): + return 1.0 / ( + self.base + ** (torch.arange(0, self.dim, 2, device=device, dtype=torch.float32) / self.dim) + ) + + def _update_cos_sin_cache(self, seqlen, device=None, dtype=None): + # Reset the tables if the sequence length has changed, + # if we're on a new device (possibly due to tracing for instance), + # or if we're switching from inference mode to training + if ( + seqlen > self._seq_len_cached + or self._cos_cached is None + or self._cos_cached.device != device + or self._cos_cached.dtype != dtype + or (self.training and self._cos_cached.is_inference()) + ): + self._seq_len_cached = seqlen + # We want fp32 here, not self.inv_freq.dtype, since the model could be loaded in bf16 + # And the output of arange can be quite large, so bf16 would lose a lot of precision. + # However, for compatibility reason, we add an option to use the dtype of self.inv_freq. + if self.pos_idx_in_fp32: + t = torch.arange(seqlen, device=device, dtype=torch.float32) + # We want fp32 here as well since inv_freq will be multiplied with t, and the output + # will be large. Having it in bf16 will lose a lot of precision and cause the + # cos & sin output to change significantly. 
+ # We want to recompute self.inv_freq if it was not loaded in fp32 + if self.inv_freq.dtype != torch.float32: + inv_freq = self._compute_inv_freq(device=device) + else: + inv_freq = self.inv_freq + else: + t = torch.arange(seqlen, device=device, dtype=self.inv_freq.dtype) + inv_freq = self.inv_freq + # Don't do einsum, it converts fp32 to fp16 under AMP + # freqs = torch.einsum("i,j->ij", t, self.inv_freq) + freqs = torch.outer(t, inv_freq) + if self.scale is None: + self._cos_cached = torch.cos(freqs).to(dtype) + self._sin_cached = torch.sin(freqs).to(dtype) + else: + power = ( + torch.arange(seqlen, dtype=self.scale.dtype, device=self.scale.device) + - seqlen // 2 + ) / self.scale_base + scale = self.scale.to(device=power.device) ** rearrange(power, "s -> s 1") + # We want the multiplication by scale to happen in fp32 + self._cos_cached = (torch.cos(freqs) * scale).to(dtype) + self._sin_cached = (torch.sin(freqs) * scale).to(dtype) + self._cos_k_cached = (torch.cos(freqs) / scale).to(dtype) + self._sin_k_cached = (torch.sin(freqs) / scale).to(dtype) + + def forward( + self, + q: torch.Tensor, + k: torch.Tensor, + seqlen_offset: Union[int, torch.Tensor] = 0, + max_seqlen: Optional[int] = None, + ) -> Union[torch.Tensor, Tuple[torch.Tensor, torch.Tensor]]: + """ + q: (batch, seqlen, nheads, headdim) + k: (batch, seqlen, nheads, headdim) + seqlen_offset: + (batch_size,) or int. Each sequence in x is shifted by this amount. + Most commonly used in inference when we have KV cache. + If it's a tensor of shape (batch_size,), then to update the cos / sin cache, one + should pass in max_seqlen, which will update the cos / sin cache up to that length. + max_seqlen: int + """ + seqlen = q.shape[1] + if max_seqlen is not None: + self._update_cos_sin_cache(max_seqlen, device=q.device, dtype=q.dtype) + elif isinstance(seqlen_offset, int): + self._update_cos_sin_cache(seqlen + seqlen_offset, device=q.device, dtype=q.dtype) + if self.scale is None: + q = rotary_embedding( + q, + self._cos_cached, + self._sin_cached, + interleaved=self.interleaved, + seqlen_offsets=seqlen_offset + ) + k = rotary_embedding( + k, + self._cos_cached, + self._sin_cached, + interleaved=self.interleaved, + seqlen_offsets=seqlen_offset + ) + + else: + q = rotary_embedding( + q, + self._cos_cached, + self._sin_cached, + interleaved=self.interleaved, + seqlen_offsets=seqlen_offset + ) + k = rotary_embedding( + k, + self._cos_k_cached, + self._sin_k_cached, + interleaved=self.interleaved, + seqlen_offsets=seqlen_offset + ) + + return q, k diff --git a/fla/ops/__init__.py b/fla/ops/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..4f8681d2a16bb0c9b86fc0f3cb268c4bb69ce5b8 --- /dev/null +++ b/fla/ops/__init__.py @@ -0,0 +1,18 @@ +# -*- coding: utf-8 -*- + +from .based import fused_chunk_based, parallel_based +from .gla import chunk_gla, fused_chunk_gla, fused_recurrent_gla +from .retention import (chunk_retention, fused_chunk_retention, + fused_recurrent_retention, parallel_retention) + +__all__ = [ + 'fused_chunk_based', + 'parallel_based', + 'chunk_gla', + 'fused_chunk_gla', + 'fused_recurrent_gla', + 'chunk_retention', + 'fused_chunk_retention', + 'fused_recurrent_retention', + 'parallel_retention' +] diff --git a/fla/ops/abc/__init__.py b/fla/ops/abc/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..fdac8d900fc51485a55716443ee1f00424b522b9 --- /dev/null +++ b/fla/ops/abc/__init__.py @@ -0,0 +1,7 @@ +# -*- coding: utf-8 -*- + +from .chunk import chunk_abc + +__all__ = 
[ + 'chunk_abc' +] diff --git a/fla/ops/abc/chunk.py b/fla/ops/abc/chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..da419716a776d4e301a929b8dcbc44ec2474a6cd --- /dev/null +++ b/fla/ops/abc/chunk.py @@ -0,0 +1,1220 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.ops.utils import (logcumsumexp_fwd_kernel, softmax_bwd_kernel, + softmax_fwd_kernel) +from fla.utils import contiguous + + +@triton.jit +def chunk_abc_fwd_kernel_h( + k, + v, + z, + h, + h0, + ht, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + s_h_d, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NT: tl.constexpr, + NORMK: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr +): + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + b_h = tl.zeros([BK, BV], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h = tl.make_block_ptr(h0 + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_h += tl.load(p_h, boundary_check=(0, 1)).to(tl.float32) + if NORMK: + p_z0 = tl.make_block_ptr(z + i_bh * s_k_h, (T * K,), (s_k_d,), (i_k * BK,), (BK,), (0,)) + else: + p_z0 = tl.make_block_ptr(z + i_bh * s_v_h, (T * V,), (s_v_d,), (i_v * BV,), (BV,), (0,)) + b_zp = tl.load(p_z0).to(tl.float32) + for i_t in range(NT): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h + i_t * K * V, (K, V), (s_h_t, s_h_d), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + tl.store(p_h, b_h.to(p_h.dtype.element_ty), boundary_check=(0, 1)) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + if NORMK: + p_zc = tl.make_block_ptr(z + i_bh * s_k_h, (T * K,), (s_k_d,), ((i_t * BT + BT - 1) * K + i_k * BK,), (BK,), (0,)) + # [BK,] + b_zc = tl.load(p_zc, boundary_check=(0,)) + b_r, b_zp = tl.exp(b_zp - b_zc), b_zc + # [BK, BV] + b_h = b_h * b_r[:, None] + b_k = tl.exp(b_k - b_zc[:, None]).to(b_k.dtype) + else: + p_zc = tl.make_block_ptr(z + i_bh * s_v_h, (T * V,), (s_v_d,), ((i_t * BT + BT - 1) * V + i_v * BV,), (BV,), (0,)) + # [BV,] + b_zc = tl.load(p_zc, boundary_check=(0,)) + b_r, b_zp = tl.exp(b_zp - b_zc), b_zc + # [BK, BV] + b_h = b_h * b_r[None, :] + b_v = tl.exp(b_v - b_zc[None, :]).to(b_v.dtype) + # [BK, BV] + b_h += tl.dot(b_k, b_v, allow_tf32=False) + + if STORE_FINAL_STATE: + p_h = tl.make_block_ptr(ht + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_h, b_h.to(p_h.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_abc_fwd_kernel_intra_K( + v, + z, + o, + A, + s_v_h, + s_v_t, + s_v_d, + T: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BV: tl.constexpr, + NC: tl.constexpr +): + i_v, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_t, i_i = i_c // NC, i_c % NC + + p_z = tl.make_block_ptr(z + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_zn = tl.make_block_ptr(z + i_bh * s_v_h, (T * V,), (s_v_d,), ((i_t * BT + i_i * BC) * V + i_v * BV,), (BV,), (0,)) + # [BV,] + b_zn = tl.load(p_zn, boundary_check=(0,)) + # [BC, BV] + b_o = 
tl.zeros([BC, BV], dtype=tl.float32) + for i_j in range(0, i_i): + p_A = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_j * BC, i_v * BV), (BC, BV), (1, 0)) + # [BC, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BC, BC] + b_A = tl.load(p_A, boundary_check=(0, 1)) + b_o += tl.dot(b_A, tl.exp(b_v - b_zn[None, :]).to(b_v.dtype), allow_tf32=False) + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_o *= tl.exp(b_zn[None, :] - b_z) + + o_i = tl.arange(0, BC) + o_A = i_bh * T * BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BT + i_i * BC + m_A = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + for j in range(0, BC): + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T * V,), (1,), ((i_t * BT + i_i * BC + j) * V + i_v * BV,), (BV,), (0,)) + # [BC,] + b_A = tl.load(A + o_A + j, mask=m_A, other=0) + # [BV,] + b_v = tl.load(p_v, boundary_check=(0,)).to(tl.float32) + # [BC, BV] + # avoid 0 * inf = inf + m_i = o_i[:, None] >= j + b_o += tl.where(m_i, b_A[:, None] * tl.exp(b_v[None, :] - b_z), 0) + p_o = tl.make_block_ptr(o + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_abc_fwd_kernel_K( + q, + k, + z, + h, + o, + A, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + s_h_d, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr +): + i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_p = tl.maximum(i_t * BT - 1, 0) + + o_i = tl.arange(0, BT) + m_s = o_i[:, None] >= o_i[None, :] + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + b_A = tl.zeros([BT, BT], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h + i_t * K * V, (K, V), (s_h_t, s_h_d), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BK, BV] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # [BT, BV] + b_o += tl.dot(b_q, b_h, allow_tf32=False) + # [BT, BT] + b_A += tl.dot(b_q, b_k, allow_tf32=False) + p_z = tl.make_block_ptr(z + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + # [BT, BV] + b_z = tl.load(p_z, boundary_check=(0, 1)) + # [BT, BV] + p_zp = tl.make_block_ptr(z + i_bh * s_v_h, (T * V,), (s_v_d,), (i_p * V + i_v * BV,), (BV,), (0,)) + b_zp = tl.load(p_zp, boundary_check=(0,)) + b_o = b_o * tl.exp(b_zp[None, :] - b_z) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + p_A = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + # [BT, BT] + b_A = tl.where(m_s, b_A, 0.) 
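+    # A is identical across all value blocks i_v, so only the i_v == 0 programs
+    # write it back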
+ if i_v == 0: + tl.store(p_A, b_A.to(p_A.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_abc_fwd_kernel_intra_V( + q, + k, + z, + A, + s_k_h, + s_k_t, + s_k_d, + scale, + T: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + NC: tl.constexpr +): + i_k, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_t, i_i, i_j = i_c // (NC * NC), (i_c % (NC * NC)) // NC, (i_c % (NC * NC)) % NC + n_bh = tl.num_programs(2) + + if i_i > i_j: + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT + i_j * BC), (BK, BC), (0, 1)) + p_z = tl.make_block_ptr(z + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_A = tl.make_block_ptr(A + (i_k*n_bh+i_bh)*T*BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + p_zn = tl.make_block_ptr(z + i_bh * s_k_h, (T * K,), (s_k_d,), ((i_t * BT + i_i * BC) * K + i_k * BK,), (BK,), (0,)) + # [BK,] + b_zn = tl.load(p_zn, boundary_check=(0,)) + # [BC, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_q = (b_q * tl.exp(b_zn[None, :] - b_z) * scale).to(b_q.dtype) + # [BK, BC] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_k = tl.exp(b_k - b_zn[:, None]).to(b_k.dtype) + # [BC, BC] + b_A = tl.dot(b_q, b_k, allow_tf32=False) + tl.store(p_A, b_A.to(A.dtype.element_ty), boundary_check=(0, 1)) + elif i_i == i_j: + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T * K,), (s_k_d,), ((i_t * BT + i_j * BC) * K + i_k * BK,), (BK,), (0,)) + p_z = tl.make_block_ptr(z + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + # [BC, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_z = tl.load(p_z, boundary_check=(0, 1)) + + o_i = tl.arange(0, BC) + o_A = (i_bh + i_k * n_bh) * T * BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BT + i_j * BC + m_A = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + for j in range(0, BC): + # [BK,] + b_k = tl.load(p_k, boundary_check=(0,)).to(tl.float32) + # [BC,] + b_A = tl.sum(b_q * tl.exp(b_k[None, :] - b_z) * scale, 1) + b_A = tl.where(o_i >= j, b_A, 0.) 
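+            # keep only the causal entries (row index >= j) and scatter them as
+            # the j-th column of the on-diagonal score block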
+ tl.store(A + o_A + j, b_A.to(b_q.dtype), mask=m_A) + + p_k = tl.advance(p_k, (K,)) + + +@triton.jit +def chunk_abc_fwd_kernel_V( + q, + v, + z, + h, + o, + A, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + s_h_d, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr +): + i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_p = tl.maximum(i_t * BT - 1, 0) + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_z = tl.make_block_ptr(z + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h + i_t * K * V, (K, V), (s_h_t, s_h_d), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + p_zp = tl.make_block_ptr(z + i_bh * s_k_h, (T * K,), (s_k_d,), (i_p * K + i_k * BK,), (BK,), (0,)) + + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BT, BK] + b_z = tl.load(p_z, boundary_check=(0, 1)) + # [BT, BK] + b_zp = tl.load(p_zp, boundary_check=(0,)) + b_q = (b_q * tl.exp(b_zp[None, :] - b_z)).to(b_q.dtype) + # [BK, BV] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # works but dkw, owing to divine benevolence + # [BT, BV] + if i_k >= 0: + b_o += tl.dot(b_q, b_h, allow_tf32=False) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_A = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BT] + b_A = tl.load(p_A, boundary_check=(0, 1)) + b_o += tl.dot(b_A, b_v, allow_tf32=False) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_abc_bwd_kernel_dh( + q, + z, + do, + dh, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + s_h_d, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NT: tl.constexpr, + NORMK: tl.constexpr +): + i_k, i_v, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + b_zp = tl.full([BK if NORMK else BV], float('inf'), dtype=tl.float32) + for i_t in range(NT - 1, -1, -1): + i_p = tl.maximum(i_t * BT - 1, 0) + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dh = tl.make_block_ptr(dh + i_bh * s_h_h + i_t * K*V, (K, V), (s_h_t, s_h_d), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + # [BK, BT] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BT, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + + tl.store(p_dh, b_dh.to(p_dh.dtype.element_ty), boundary_check=(0, 1)) + if NORMK: + p_z = tl.make_block_ptr(z + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_zc = tl.make_block_ptr(z + i_bh * s_k_h, (T * K,), (s_k_d,), (i_p * K + i_k * BK,), (BK,), (0,)) + # [BK,] + b_zc = tl.load(p_zc, boundary_check=(0,)) + b_r, b_zp = tl.exp(b_zc - b_zp), b_zc + # [BK, BT] + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_q = (b_q * tl.exp(b_zc[:, None] - 
b_z)).to(b_q.dtype) + # [BK, BV] + b_dh = b_dh * b_r[:, None] + else: + p_z = tl.make_block_ptr(z + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_zc = tl.make_block_ptr(z + i_bh * s_v_h, (T * V,), (s_v_d,), (i_p * V + i_v * BV,), (BV,), (0,)) + # [BV,] + b_zc = tl.load(p_zc, boundary_check=(0,)) + b_r, b_zp = tl.exp(b_zc - b_zp), b_zc + # [BT, BV] + b_z = tl.load(p_z, boundary_check=(0,)) + b_do = (b_do * tl.exp(b_zc[None, :] - b_z)).to(b_do.dtype) + # [BK, BV] + b_dh = b_dh * b_r[None, :] + # [BK, BV] + b_dh += tl.dot(b_q, b_do, allow_tf32=False) + + +@triton.jit +def chunk_abc_bwd_kernel_V( + k, + v, + z, + h, + A, + do, + dh, + dq, + dk, + dv, + dA, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + s_h_d, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr +): + i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_p = tl.maximum(i_t * BT - 1, 0) + n_bh = tl.num_programs(2) + + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_zc = tl.make_block_ptr(z + i_bh * s_k_h, (T * K,), (s_k_d,), ((i_t * BT + BT - 1) * K + i_k * BK,), (BK,), (0,)) + p_A = tl.make_block_ptr(A + i_bh * T * BT, (BT, T), (1, BT), (0, i_t * BT), (BT, BT), (0, 1)) + + # [BK,] + b_zc = tl.load(p_zc, boundary_check=(0,)) + # [BT, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_k = tl.exp(b_k - b_zc[None, :]).to(b_k.dtype) + # [BT, BT] + b_A = tl.load(p_A, boundary_check=(0, 1)) + + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + b_dA = tl.zeros([BT, BT], dtype=tl.float32) + for i_v in range(tl.cdiv(V, BV)): + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h + i_t * V * K, (V, K), (s_h_d, s_h_t), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dh = tl.make_block_ptr(dh + i_bh * s_h_h + i_t * K*V, (K, V), (s_h_t, s_h_d), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_k*n_bh+i_bh) * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BV, BK] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # [BT, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BK, BV] + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + + # [BT, BV] + b_dv = tl.dot(b_k, b_dh, allow_tf32=False) + if i_k == 0: + b_dv += tl.dot(b_A, b_do, allow_tf32=False) + b_do = (b_do * scale).to(b_do.dtype) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + # [BT, BT] + b_dA += tl.dot(b_do, tl.trans(b_v), allow_tf32=False) + # [BT, BK] + b_dq += tl.dot(b_do, b_h, allow_tf32=False) + # [BT, BK] + b_dk += tl.dot(b_v, tl.trans(b_dh), allow_tf32=False) + p_z = tl.make_block_ptr(z + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_zp = tl.make_block_ptr(z + i_bh * s_k_h, (T * K,), (s_k_d,), (i_p * K + i_k * BK,), (BK,), (0,)) + # [BK,] + b_zp = tl.load(p_zp, boundary_check=(0,)) + # [BT, BK] + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_z = tl.exp(b_zp[None, :] - b_z) + # [BT, BK] + b_dq = b_dq * b_z + b_dk = b_dk * b_k + + p_dq = tl.make_block_ptr(dq + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = 
tl.make_block_ptr(dk + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dA = tl.make_block_ptr(dA + i_bh * T * BT, (T, BT,), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + o_i = tl.arange(0, BT) + m_s = o_i[:, None] >= o_i[None, :] + # [BT, BT] + b_dA = tl.where(m_s, b_dA, 0.).to(b_k.dtype) + if i_k == 0: + tl.store(p_dA, b_dA.to(p_dA.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_abc_bwd_kernel_intra_V( + q, + k, + z, + dA, + dq, + dk, + s_k_h, + s_k_t, + s_k_d, + T: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + NC: tl.constexpr +): + i_k, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_t, i_i = i_c // NC, i_c % NC + + p_z = tl.make_block_ptr(z + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_zn = tl.make_block_ptr(z + i_bh * s_k_h, (T * K,), (s_k_d,), ((i_t * BT + i_i * BC) * K + i_k * BK,), (BK,), (0,)) + # [BK,] + b_zn = tl.load(p_zn, boundary_check=(0,)) + # [BC, BK] + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_zq = tl.exp(b_zn[None, :] - b_z) + b_dq = tl.zeros([BC, BK], dtype=tl.float32) + for i_j in range(0, i_i): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_dA = tl.make_block_ptr(dA + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + # [BC, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_kz = tl.exp(b_k - b_zn[None, :]).to(b_k.dtype) + # [BC, BC] + b_dA = tl.load(p_dA, boundary_check=(0, 1)) + # [BC, BK] + b_dq += tl.dot(b_dA, b_kz, allow_tf32=False) + b_dq *= b_zq + + o_i = tl.arange(0, BC) + o_dA = i_bh * T * BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BT + i_i * BC + m_dA = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + for j in range(0, BC): + p_kj = tl.make_block_ptr(k + i_bh * s_k_h, (T * K,), (1,), ((i_t * BT + i_i*BC+j) * K + i_k * BK,), (BK,), (0,)) + # [BC,] + b_dA = tl.load(dA + o_dA + j, mask=m_dA, other=0) + # [BK,] + b_kj = tl.load(p_kj, boundary_check=(0,)).to(tl.float32) + # [BC, BK] + m_i = o_i[:, None] >= j + # [BC, BK] + b_dq += tl.where(m_i, b_dA[:, None] * tl.exp(b_kj[None, :] - b_z), 0.) 
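+    # b_dq now combines the off-diagonal contributions (earlier sub-chunks) with
+    # the on-diagonal ones accumulated above; write this (BC, BK) tile back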
+ p_dq = tl.make_block_ptr(dq + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + + tl.debug_barrier() + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_zn = tl.make_block_ptr(z + i_bh * s_k_h, (T*K,), (s_k_d,), ((i_t * BT + i_i * BC + BC - 1) * K + i_k * BK,), (BK,), (0,)) + # [BK,] + b_zn = tl.load(p_zn, boundary_check=(0,)) + # [BC, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_kz = tl.exp(b_k - b_zn[None, :]) + b_dk = tl.zeros([BC, BK], dtype=tl.float32) + for i_j in range(i_i + 1, NC): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_z = tl.make_block_ptr(z + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_dA = tl.make_block_ptr(dA + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT + i_j * BC, i_i * BC), (BC, BC), (1, 0)) + # [BC, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_qz = (b_q * tl.exp(b_zn[None, :] - b_z)).to(b_q.dtype) + # [BC, BC] + b_dA = tl.load(p_dA, boundary_check=(0, 1)) + # [BC, BK] + b_dk += tl.dot(tl.trans(b_dA), b_qz, allow_tf32=False) + b_dk *= b_kz + + o_dA = i_bh * T * BT + (i_t * BT + i_i * BC) * BT + i_i * BC + tl.arange(0, BC) + for j in range(0, BC): + p_qj = tl.make_block_ptr(q + i_bh * s_k_h, (T * K,), (1,), ((i_t * BT + i_i * BC + j) * K + i_k * BK,), (BK,), (0,)) + p_zj = tl.make_block_ptr(z + i_bh * s_k_h, (T * K,), (1,), ((i_t * BT + i_i * BC + j) * K + i_k * BK,), (BK,), (0,)) + # [BC,] + b_dA = tl.load(dA + o_dA + j * BT, mask=(i_t * BT + i_i * BC + j < T), other=0) + # [BK,] + b_qj = tl.load(p_qj, boundary_check=(0,)).to(tl.float32) + b_zj = tl.load(p_zj, boundary_check=(0,)).to(tl.float32) + # [BC, BK] + m_i = o_i[:, None] <= j + b_dk += tl.where(m_i, b_dA[:, None] * b_qj[None, :] * tl.exp(b_k - b_zj[None, :]), 0.) 
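+    # symmetric to dq above: b_dk has gathered gradients from later sub-chunks
+    # plus the on-diagonal block, so it can be written out now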
+ p_dk = tl.make_block_ptr(dk + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_abc_bwd_kernel_intra_K( + v, + z, + do, + dA, + s_v_h, + s_v_t, + s_v_d, + scale, + T: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BV: tl.constexpr, + NC: tl.constexpr +): + i_v, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_t, i_i, i_j = i_c // (NC * NC), (i_c % (NC * NC)) // NC, (i_c % (NC * NC)) % NC + n_bh = tl.num_programs(2) + + if i_i > i_j: + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i_t * BT + i_j * BC), (BV, BC), (0, 1)) + p_z = tl.make_block_ptr(z + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_zn = tl.make_block_ptr(z + i_bh * s_v_h, (T * V,), (s_v_d,), ((i_t * BT + i_i * BC) * V + i_v * BV,), (BV,), (0,)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_dA = tl.make_block_ptr(dA+(i_bh+i_v*n_bh)*T*BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + # [BV,] + b_zn = tl.load(p_zn, boundary_check=(0,)) + # [BC, BV] + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_do = (b_do * tl.exp(b_zn[None, :] - b_z) * scale).to(b_do.dtype) + # [BV, BC] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_v = tl.exp(b_v - b_zn[:, None]).to(b_v.dtype) + # [BC, BC] + b_dA = tl.dot(b_do, b_v, allow_tf32=False) + tl.store(p_dA, b_dA.to(dA.dtype.element_ty), boundary_check=(0, 1)) + elif i_i == i_j: + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T * V,), (s_v_d,), ((i_t * BT + i_j * BC) * V + i_v * BV,), (BV,), (0,)) + p_z = tl.make_block_ptr(z + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + # [BC, BV] + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) * scale + + o_i = tl.arange(0, BC) + o_A = (i_bh + i_v * n_bh) * T * BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BT + i_j * BC + m_A = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + for j in range(0, BC): + # [BV,] + b_v = tl.load(p_v, boundary_check=(0,)).to(tl.float32) + # [BC,] + b_dA = tl.sum(b_do * tl.exp(b_v[None, :] - b_z), 1) + b_dA = tl.where(o_i >= j, b_dA, 0) + tl.store(dA + o_A + j, b_dA.to(b_do.dtype), mask=m_A) + + p_v = tl.advance(p_v, (V,)) + + +@triton.jit +def chunk_abc_bwd_kernel_K( + q, + k, + v, + z, + h, + A, + do, + dh, + dq, + dk, + dv, + dA, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + s_h_d, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr +): + i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_p = tl.maximum(i_t * BT - 1, 0) + n_bh = tl.num_programs(2) + + o_i = tl.arange(0, BT) + m_s = o_i[:, None] >= o_i[None, :] + + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_A = tl.make_block_ptr(A + (i_k*n_bh+i_bh) * T * BT, (T, BT, ), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_k = 
tl.load(p_k, boundary_check=(0, 1)) + # [BT, BT] + b_A = tl.dot((b_q * scale).to(b_q.dtype), tl.trans(b_k), allow_tf32=False) + b_A = tl.where(m_s, b_A, 0.) + tl.store(p_A, b_A.to(p_A.dtype.element_ty), boundary_check=(0, 1)) + + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + for i_v in range(tl.cdiv(V, BV)): + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_z = tl.make_block_ptr(z + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_zp = tl.make_block_ptr(z + i_bh * s_v_h, (T * V,), (s_v_d,), (i_p * V + i_v * BV,), (BV,), (0,)) + p_zc = tl.make_block_ptr(z + i_bh * s_v_h, (T * V,), (s_v_d,), ((i_t * BT + BT - 1) * V + i_v * BV,), (BV,), (0,)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h + i_t * K*V, (V, K), (s_h_d, s_h_t), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dh = tl.make_block_ptr(dh + i_bh * s_h_h + i_t * K*V, (K, V), (s_h_t, s_h_d), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_k*n_bh+i_bh) * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + + # [BV,] + b_zp = tl.load(p_zp, boundary_check=(0,)) + b_zc = tl.load(p_zc, boundary_check=(0,)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_v = tl.exp(b_v - b_zc[None, :]).to(b_v.dtype) + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_z = tl.exp(b_zp[None, :] - b_z) + # [BV, BK] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # [BT, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_do = (b_do * b_z * scale).to(b_do.dtype) + # [BK, BV] + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + + # [BT, BK] + b_dq += tl.dot(b_do, b_h, allow_tf32=False) + b_dk += tl.dot(b_v, tl.trans(b_dh), allow_tf32=False) + # [BT, BV] + b_dv = b_v * tl.dot(b_k, b_dh, allow_tf32=False) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + p_dA = tl.make_block_ptr(dA + i_bh * T * BT, (T, BT, ), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + # [BT, BT] + b_dA = tl.load(p_dA, boundary_check=(0, 1)) + # [BT, BK] + b_dq += tl.dot(b_dA, b_k, allow_tf32=False) + b_dk += tl.dot(tl.trans(b_dA).to(b_k.dtype), b_q, allow_tf32=False) + + p_dq = tl.make_block_ptr(dq + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_abc_bwd_kernel_intra_KV( + v, + z, + A, + do, + dv, + s_v_h, + s_v_t, + s_v_d, + T: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BV: tl.constexpr, + NC: tl.constexpr +): + i_v, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_t, i_i = i_c // NC, i_c % NC + + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_zn = tl.make_block_ptr(z + i_bh * s_v_h, (T*V,), (s_v_d,), ((i_t * BT + i_i * BC + BC - 1) * V + i_v * BV,), (BV,), (0,)) + # [BV,] + b_zn = tl.load(p_zn, boundary_check=(0,)) + # [BC, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_dv = tl.zeros([BC, BV], dtype=tl.float32) + for i_j in range(i_i + 1, NC): + p_z = tl.make_block_ptr(z + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_j * BC, i_v * BV), (BC, 
BV), (1, 0)) + p_A = tl.make_block_ptr(A + i_bh * T * BT, (BT, T), (1, BT), (i_i * BC, i_t * BT + i_j * BC), (BC, BC), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_j * BC, i_v * BV), (BC, BV), (1, 0)) + # [BC, BV] + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_do = (b_do * tl.exp(b_zn[None, :] - b_z)).to(b_do.dtype) + # [BC, BC] + b_A = tl.load(p_A, boundary_check=(0, 1)) + b_dv += tl.dot(b_A, b_do, allow_tf32=False) + b_dv *= tl.exp(b_v - b_zn[None, :]) + + o_i = tl.arange(0, BC) + for j in range(0, BC): + p_z = tl.make_block_ptr(z + i_bh * s_v_h, (T * V,), (1,), ((i_t * BT + i_i * BC + j) * V + i_v * BV,), (BV,), (0,)) + p_A = tl.make_block_ptr(A + i_bh * T * BT, (T * BT,), (1,), ((i_t * BT + i_i * BC + j) * BT + i_i * BC,), (BC,), (0,)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T * V,), (1,), ((i_t * BT + i_i * BC + j) * V + i_v * BV,), (BV,), (0,)) + # [BC,] + b_A = tl.load(p_A, boundary_check=(0,)) + # [BV,] + b_z = tl.load(p_z, boundary_check=(0,)) + b_do = tl.load(p_do, boundary_check=(0,)) + # [BC, BV] + m_i = o_i[:, None] <= j + b_dv += tl.where(m_i, tl.exp(b_v - b_z[None, :]) * b_A[:, None] * b_do[None, :], 0.) + p_dv = tl.make_block_ptr(dv + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_abc_bwd_kernel_rcum_inter( + s, + z, + ss, + doo, + s_s_h, + s_s_t, + s_s_d, + T: tl.constexpr, + S: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + NT: tl.constexpr +): + i_m, i_bh = tl.program_id(0), tl.program_id(1) + + b_sp = tl.zeros([BS,], dtype=tl.float32) + b_zp = tl.full([BS,], float('inf'), dtype=tl.float32) + for i_t in range(NT - 1, -1, -1): + p_s = tl.make_block_ptr(s + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT, i_m * BS), (BT, BS), (1, 0)) + p_z = tl.make_block_ptr(z + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT, i_m * BS), (BT, BS), (1, 0)) + p_zc = tl.make_block_ptr(z + i_bh * s_s_h, (T * S,), (s_s_d,), ((i_t * BT) * S + i_m * BS,), (BS,), (0,)) + p_ss = tl.make_block_ptr(ss + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT, i_m * BS), (BT, BS), (1, 0)) + p_doo = tl.make_block_ptr(doo + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT, i_m * BS), (BT, BS), (1, 0)) + # [BS,] + b_zc = tl.load(p_zc, boundary_check=(0,)) + # [BT, BS] + b_s = tl.load(p_s, boundary_check=(0, 1)) + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_ss = tl.load(p_ss, boundary_check=(0, 1)) + + b_doo = tl.exp(b_s - b_zp[None, :]) * b_sp[None, :] + tl.store(p_doo, b_doo.to(p_doo.dtype.element_ty), boundary_check=(0, 1)) + # [BS,] + b_sp = b_sp * tl.exp(b_zc - b_zp) + tl.sum(b_ss * tl.exp(b_zc[None, :] - b_z), 0) + b_zp = b_zc + + +@triton.jit +def chunk_abc_bwd_kernel_rcum_intra( + s, + z, + ss, + doo, + s_s_h, + s_s_t, + s_s_d, + T: tl.constexpr, + S: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BS: tl.constexpr, + NC: tl.constexpr +): + i_s, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_t, i_i = i_c // NC, i_c % NC + + o_i = tl.arange(0, BC) + m_o = tl.full([BC, BC], 1., dtype=tl.float32) + + p_s = tl.make_block_ptr(s + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT + i_i * BC, i_s * BS), (BC, BS), (1, 0)) + p_zn = tl.make_block_ptr(z + i_bh * s_s_h, (T*S,), (s_s_d,), ((i_t * BT + i_i * BC + BC - 1) * S + i_s * BS,), (BS,), (0,)) + p_doo = tl.make_block_ptr(doo + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT + i_i * 
BC, i_s * BS), (BC, BS), (1, 0)) + # [BC, BS] + b_s = tl.load(p_s, boundary_check=(0, 1)) + # [BS,] + b_zn = tl.load(p_zn, boundary_check=(0,)) + + b_doo = tl.zeros([BC, BS], dtype=tl.float32) + for i_j in range(i_i + 1, NC): + p_z = tl.make_block_ptr(z + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT + i_j * BC, i_s * BS), (BC, BS), (1, 0)) + p_ss = tl.make_block_ptr(ss + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT + i_j * BC, i_s * BS), (BC, BS), (1, 0)) + # [BC, BS] + b_z = tl.load(p_z, boundary_check=(0, 1)) + b_ss = tl.load(p_ss, boundary_check=(0, 1)) + # [BC, BS] + b_doo += b_ss * tl.exp(b_zn[None, :] - b_z) + b_doo = tl.exp(b_s - b_zn[None, :]) * tl.dot(m_o.to(b_s.dtype), b_doo.to(b_s.dtype), allow_tf32=False) + + for j in range(0, BC): + p_z = tl.make_block_ptr(z + i_bh * s_s_h, (T * S,), (1,), ((i_t * BT + i_i * BC + j) * S + i_s * BS,), (BS,), (0,)) + p_ss = tl.make_block_ptr(ss + i_bh * s_s_h, (T * S,), (1,), ((i_t * BT + i_i * BC + j) * S + i_s * BS,), (BS,), (0,)) + # [BS,] + b_z = tl.load(p_z, boundary_check=(0,)) + b_ss = tl.load(p_ss, boundary_check=(0,)) + # [BC, BS] + m_i = o_i[:, None] <= j + b_doo += tl.where(m_i, tl.exp(b_s - b_z[None, :]) * b_ss[None, :], 0.) + b_doo += tl.load(p_doo, boundary_check=(0, 1)) + tl.store(p_doo, b_doo.to(p_doo.dtype.element_ty), boundary_check=(0, 1)) + + +class ChunkABCFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward(ctx, q, k, v, s, initial_state, output_final_state): + B, H, T, K, V, M = *q.shape, v.shape[-1], s.shape[-1] + BT, BC = 64, 16 + BK = min(64, triton.next_power_of_2(K)) + BV = min(64, triton.next_power_of_2(V)) + BM = min(64, triton.next_power_of_2(M)) + NT, NC = triton.cdiv(T, BT), triton.cdiv(BT, BC) + NV, NM = triton.cdiv(V, BV), triton.cdiv(M, BM) + num_warps = 4 if BK == 64 else 2 + num_stages = 1 + + def fwd_pre(s, B, H, T, S): + # keep cummulative normalizer in fp32 + z = torch.empty_like(s, dtype=torch.float) + grid = (B * H,) + logcumsumexp_fwd_kernel[grid]( + s, z, + s.stride(1), s.stride(2), s.stride(3), + T=T, S=S + ) + return z + + def fwd_inner(q, k, v, z, B, H, T, K, V, BT, BK, BV, NT, normk=False, h0=None, ht=None): + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + h = q.new_empty(B, H, NT * K, V) + grid = (NV, NK, B * H) + chunk_abc_fwd_kernel_h[grid]( + k, v, z, h, h0, ht, + k.stride(1), k.stride(2), k.stride(3), + v.stride(1), v.stride(2), v.stride(3), + h.stride(1), h.stride(2), h.stride(3), + T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, NT=NT, + NORMK=normk, + USE_INITIAL_STATE=h0 is not None, + STORE_FINAL_STATE=ht is not None, + num_warps=num_warps, + num_stages=num_stages + ) + return h + + final_state = None + if output_final_state: + final_state = (q.new_empty(B, H, K, M, dtype=torch.float), + q.new_empty(B, H, M, V, dtype=torch.float)) + + z = fwd_pre(s, B, H, T, M) + scale = K ** -0.5 + hk = fwd_inner( + q=q, k=k, v=s, z=z, + B=B, H=H, T=T, K=K, V=M, BT=BT, BK=BK, BV=BM, NT=NT, + normk=False, + h0=initial_state[0] if initial_state is not None else None, + ht=final_state[0] if final_state is not None else None + ) + ok1 = torch.empty_like(s) + Ak = q.new_empty(B, H, T, BT) + grid = (NM, NT, B * H) + chunk_abc_fwd_kernel_K[grid]( + q, k, z, hk, ok1, Ak, + k.stride(1), k.stride(2), k.stride(3), + s.stride(1), s.stride(2), s.stride(3), + hk.stride(1), hk.stride(2), hk.stride(3), + scale=scale, + T=T, K=K, V=M, BT=BT, BK=BK, BV=BM, + num_warps=num_warps, + num_stages=num_stages + ) + ok0 = torch.empty_like(s) + grid = (NM, NT * NC, B * H) + 
chunk_abc_fwd_kernel_intra_K[grid]( + s, z, ok0, Ak, + s.stride(1), s.stride(2), s.stride(3), + T=T, V=M, BT=BT, BC=BC, BV=BM, NC=NC, + num_warps=2, + num_stages=num_stages + ) + ok = ok0.add_(ok1) + + scale = 1. + # equivalent to: + # p = ok.softmax(-1, torch.float) + # p is kept in fp32 for safe softmax backward + p = torch.empty_like(ok, dtype=torch.float) + grid = (NT, B * H) + softmax_fwd_kernel[grid]( + ok, p, + s.stride(1), s.stride(2), s.stride(3), + T=T, S=M, BT=BT + ) + qv = p.to(q.dtype) + + scale = 1. + hv = fwd_inner( + q=qv, k=s, v=v, z=z, + B=B, H=H, T=T, K=M, V=V, BT=BT, BK=BM, BV=BV, NT=NT, + normk=True, + h0=initial_state[1] if initial_state is not None else None, + ht=final_state[1] if final_state is not None else None + ) + Av = q.new_zeros(NM, B, H, T, BT) + grid = (NM, NT * NC * NC, B * H) + chunk_abc_fwd_kernel_intra_V[grid]( + qv, s, z, Av, + s.stride(1), s.stride(2), s.stride(3), + scale=scale, + T=T, K=M, BT=BT, BC=BC, BK=BM, NC=NC, + num_warps=2, + num_stages=num_stages + ) + Av = Av.sum(0) + ov = torch.empty_like(v) + grid = (NV, NT, B * H) + chunk_abc_fwd_kernel_V[grid]( + qv, v, z, hv, ov, Av, + s.stride(1), s.stride(2), s.stride(3), + v.stride(1), v.stride(2), v.stride(3), + hv.stride(1), hv.stride(2), hv.stride(3), + scale=scale, + T=T, K=M, V=V, BT=BT, BK=BM, BV=BV, + num_warps=num_warps, + num_stages=num_stages + ) + ctx.save_for_backward(q, k, v, s, z, ok, p, hk, hv, Av) + ctx.BT = BT + return ov, final_state + + @staticmethod + @contiguous + def backward(ctx, dov, dht=None): + q, k, v, s, z, ok, p, hk, hv, Av = ctx.saved_tensors + B, H, T, K, V, M = *q.shape, v.shape[-1], s.shape[-1] + BT, BC = ctx.BT, 16 + BK = min(64, triton.next_power_of_2(K)) + BV = min(64, triton.next_power_of_2(V)) + BM = min(64, triton.next_power_of_2(M)) + NT, NC = triton.cdiv(T, BT), triton.cdiv(BT, BC) + NK, NM = triton.cdiv(K, BK), triton.cdiv(M, BM) + num_warps = 4 if BK == 64 else 2 + num_stages = 1 + + def bwd_inner(q, z, do, B, H, T, K, V, BT, BK, BV, NT, scale, normk=False): + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + dh = q.new_empty(B, H, NT * K, V) + grid = (NK, NV, B * H) + chunk_abc_bwd_kernel_dh[grid]( + q, z, do, dh, + q.stride(1), q.stride(2), q.stride(3), + do.stride(1), do.stride(2), do.stride(3), + dh.stride(1), dh.stride(2), dh.stride(3), + scale=scale, + T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, NT=NT, + NORMK=normk, + num_warps=num_warps, + num_stages=num_stages + ) + return dh + + def bwd_post(s, z, ss, B, H, T, S, BT, BC, BS, NT, NC, NS): + doo = torch.empty_like(s) + grid = (NS, B * H) + chunk_abc_bwd_kernel_rcum_inter[grid]( + s, z, ss, doo, + s.stride(1), s.stride(2), s.stride(3), + T=T, S=S, BT=BT, BS=BS, NT=NT, + num_warps=num_warps, + num_stages=num_stages + ) + grid = (NS, NT * NC, B * H) + chunk_abc_bwd_kernel_rcum_intra[grid]( + s, z, ss, doo, + s.stride(1), s.stride(2), s.stride(3), + T=T, S=S, BT=BT, BC=BC, BS=BS, NC=NC, + num_warps=num_warps, + num_stages=num_stages + ) + return doo + + scale = 1. 
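+        # the slot (V) stage runs with unit scale: qv below is a softmax
+        # distribution over the M slots, so no 1/sqrt(K) rescaling is involved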
+ qv = p.to(q.dtype) + dhv = bwd_inner( + qv, z, dov, + B=B, H=H, T=T, K=M, V=V, BT=BT, BK=BM, BV=BV, NT=NT, + scale=scale, + normk=True + ) + dp1 = torch.empty_like(p) + dsv1 = torch.empty_like(s, dtype=torch.float) + dv = v.new_empty(NM, *v.shape) + dAv = q.new_zeros(B, H, T, BT) + grid = (NM, NT, B * H) + chunk_abc_bwd_kernel_V[grid]( + s, v, z, hv, Av, dov, dhv, dp1, dsv1, dv, dAv, + s.stride(1), s.stride(2), s.stride(3), + v.stride(1), v.stride(2), v.stride(3), + hv.stride(1), hv.stride(2), hv.stride(3), + scale=scale, + T=T, K=M, V=V, BT=BT, BK=BM, BV=BV, + num_warps=num_warps, + num_stages=num_stages + ) + dv = dv.sum(0) + dp0 = torch.empty_like(p) + dsv0 = s.new_zeros(s.shape, dtype=torch.float) + grid = (NM, NT * NC, B * H) + chunk_abc_bwd_kernel_intra_V[grid]( + qv, s, z, dAv, dp0, dsv0, + s.stride(1), s.stride(2), s.stride(3), + T=T, K=M, BT=BT, BC=BC, BK=BM, NC=NC, + num_warps=2, + num_stages=num_stages + ) + dp = dp1.add_(dp0) + dsv = dsv1.add_(dsv0) + + # softmax gradient, equivalent to: + # dok = p * (dp - (p * dp).sum(-1, True)) + dok = torch.empty_like(ok) + grid = (NT, B * H) + softmax_bwd_kernel[grid]( + p, dp, dok, + s.stride(1), s.stride(2), s.stride(3), + T=T, S=M, BT=BT + ) + + scale = K ** -0.5 + dhk = bwd_inner( + q, z, dok, + B=B, H=H, T=T, K=K, V=M, BT=BT, BK=BK, BV=BM, NT=NT, + scale=scale, + normk=False + ) + dAk = q.new_zeros(NM, B, H, T, BT) + grid = (NM, NT * NC * NC, B * H) + chunk_abc_bwd_kernel_intra_K[grid]( + s, z, dok, dAk, + s.stride(1), s.stride(2), s.stride(3), + scale=scale, + T=T, V=M, BT=BT, BC=BC, BV=BM, NC=NC, + num_warps=2, + num_stages=num_stages + ) + dAk = dAk.sum(0) + + Ak = q.new_zeros(NK, B, H, T, BT) + dq = torch.empty_like(q) + dk = torch.empty_like(k) + dsk1 = s.new_empty(NK, *s.shape, dtype=torch.float) + grid = (NK, NT, B * H) + chunk_abc_bwd_kernel_K[grid]( + q, k, s, z, hk, Ak, dok, dhk, dq, dk, dsk1, dAk, + q.stride(1), q.stride(2), q.stride(3), + s.stride(1), s.stride(2), s.stride(3), + hk.stride(1), hk.stride(2), hk.stride(3), + scale=scale, + T=T, K=K, V=M, BT=BT, BK=BK, BV=BM, + num_warps=num_warps, + num_stages=num_stages + ) + Ak = Ak.sum(0) + dsk1 = dsk1.sum(0) + dsk0 = torch.empty_like(s, dtype=torch.float) + grid = (NM, NT * NC, B * H) + chunk_abc_bwd_kernel_intra_KV[grid]( + s, z, Ak, dok, dsk0, + s.stride(1), s.stride(2), s.stride(3), + T=T, V=M, BT=BT, BC=BC, BV=BM, NC=NC, + num_warps=2, + num_stages=num_stages + ) + ds = dsv.add_(dsk1.add_(dsk0)) + ds -= bwd_post(s, z, ok * dok + p * dp, B, H, T, M, BT, BC, BM, NT, NC, NM) + ds = ds.to(s.dtype) + return dq, dk, dv, ds, None, None + + +def chunk_abc( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + initial_state: Optional[Tuple[torch.Tensor]] = None, + output_final_state: bool = False, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]` + s (torch.Tensor): + slot representations of shape `[B, H, T, M]` if `head_first=True` else `[B, T, H, M]` + initial_state (Optional[Tuple[torch.Tensor, torch.Tensor]]): + Initial states of shape `[B, H, K, M]` and `[B, H, M, V]`. Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state of shape `[B, H, K, M]` and `[B, H, M, V]`. Default: `False`. 
+ head_first (Optional[bool]): + Whether the inputs are in the head-first format. + Default: `True`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + final_state (torch.Tensor): + Final state of shape `[B, H, K, M]` and `[B, H, M, V]` if `output_final_state=True` else `None`. + """ + if not head_first: + q, k, v, s = map(lambda x: x.transpose(1, 2), (q, k, v, s)) + o, final_state = ChunkABCFunction.apply(q, k, v, s, initial_state, output_final_state) + if not head_first: + o = o.transpose(1, 2) + return o, final_state diff --git a/fla/ops/abc/naive.py b/fla/ops/abc/naive.py new file mode 100644 index 0000000000000000000000000000000000000000..a7f25c40db73bcf33d1599761be0008cc5be7c59 --- /dev/null +++ b/fla/ops/abc/naive.py @@ -0,0 +1,96 @@ +# -*- coding: utf-8 -*- + +from typing import Optional + +import torch +from einops import repeat + + +def naive_recurrent_abc( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: Optional[torch.Tensor] = None, + scale: Optional[int] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: Optional[bool] = False +) -> torch.Tensor: + dtype = q.dtype + + NG = q.shape[1]//k.shape[1] + # [batch_size, n_heads, seq_len, n_slots] + if g is None: + z = s.float().logcumsumexp(2) + g = torch.cat((z[:, :, :1], z[:, :, :-1]), 2) - z + s = torch.exp(s - z) + q, k, v, s, g = map(lambda x: x.float(), (q, k, v, s, g)) + k, v, s, g = map(lambda x: repeat(x, 'b h t d -> b (h g) t d', g=NG), (k, v, s, g)) + if initial_state is not None: + initial_state = tuple(map(lambda x: repeat(x, 'b h k v -> b (h g) k v', g=NG), initial_state)) + + B, H, T, K, V, M = *q.shape, v.shape[-1], s.shape[-1] + + hk = torch.zeros(B, H, K, M, dtype=torch.float, device=q.device) + ok = torch.zeros_like(s) + + if scale is None: + scale = q.shape[-1] ** -0.5 + + final_state = None + if initial_state is not None: + hk += initial_state[0] + + for i in range(T): + q_i = q[:, :, i] * scale + k_i = k[:, :, i] + v_i = s[:, :, i] + g_i = g[:, :, i].exp() + hk = hk * g_i[..., None, :] + k_i[..., None] * v_i[..., None, :] + ok[:, :, i] = (q_i[..., None] * hk).sum(-2) + + qv = ok.softmax(-1) + hv = torch.zeros(B, H, M, V, dtype=torch.float, device=q.device) + ov = torch.zeros_like(v) + if initial_state is not None: + hv += initial_state[1] + + for i in range(T): + q_i = qv[:, :, i] + k_i = s[:, :, i] + v_i = v[:, :, i] + g_i = g[:, :, i].exp() + hv = hv * g_i[..., :, None] + k_i[..., None] * v_i[..., None, :] + ov[:, :, i] = (q_i[..., None] * hv).sum(-2) + + if output_final_state: + final_state = (hk.view(B, -1, NG, K, M)[:, :, 0], hv.view(B, -1, NG, M, V)[:, :, 0]) + return ov.to(dtype), final_state + + +def naive_cumsum_abc( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor +) -> torch.Tensor: + """ + A simple implementation of vanilla ABC that is more aligned with the descriptions in the paper. + This is just for demonstration purposes, with no numerical stabilities guaranteed. 
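+
+    A minimal usage sketch (all shapes below are illustrative, not required):
+
+    >>> import torch
+    >>> B, H, T, D, M = 2, 4, 128, 64, 32
+    >>> q, k, v = (torch.randn(B, H, T, D) for _ in range(3))
+    >>> s = torch.randn(B, H, T, M)  # slot logits
+    >>> o, _ = naive_cumsum_abc(q, k, v, s)
+    >>> o.shape
+    torch.Size([2, 4, 128, 64])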
+ """ + + dtype = q.dtype + q, k, v, s = map(lambda x: x.float(), (q, k, v, s)) + + scale = q.shape[-1] ** -0.5 + # [batch_size, n_heads, seq_len, n_slots] + s = (s - s.max(2, True)[0]).exp() + z = s.cumsum(2) + # [batch_size, n_heads, seq_len, n_slots, d_head] + K = (s.unsqueeze(-1) * k.unsqueeze(-2)).cumsum(2) / z.unsqueeze(-1) + V = (s.unsqueeze(-1) * v.unsqueeze(-2)).cumsum(2) / z.unsqueeze(-1) + # [batch_size, n_heads, seq_len, n_slots] + p = torch.einsum('...d,...md->...m', q * scale, K).softmax(-1) + # [batch_size, n_heads, seq_len, d_head] + o = torch.einsum('...m,...md->...d', p, V) + return o.to(dtype), None diff --git a/fla/ops/based/__init__.py b/fla/ops/based/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..f20b31ba0ea4c7d345761fbd6ab5f6ced5136236 --- /dev/null +++ b/fla/ops/based/__init__.py @@ -0,0 +1,9 @@ +# -*- coding: utf-8 -*- + +from .fused_chunk import fused_chunk_based +from .parallel import parallel_based + +__all__ = [ + 'fused_chunk_based', + 'parallel_based' +] diff --git a/fla/ops/based/fused_chunk.py b/fla/ops/based/fused_chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..3e3c5df52859073ea2952aa86488a28c037600bf --- /dev/null +++ b/fla/ops/based/fused_chunk.py @@ -0,0 +1,390 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional + +import torch +import triton +import triton.language as tl + +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.jit +def fused_chunk_based_fwd_kernel( + q, # query [B, H, L, K] + k, # key [B, H, L, V] + v, # value [B, H, L, V] + o, # output [B, H, L, V] + z, # normalizer [B, H, L, 1] + s_k_h, # stride size: L * K + s_k_t, # stride size: K + s_k_d, # stride size: 1 + s_v_h, # stride size: L * V + s_v_t, # stride size: V + s_v_d, # stride size: 1 + scale, # K ** -0.5 + B: tl.constexpr, # batch size + H: tl.constexpr, # H + T: tl.constexpr, # T + K: tl.constexpr, # K + V: tl.constexpr, # V + BT: tl.constexpr, # BLOCK SIZE along the sequence dimension, a.k.a. 
chunk size + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension +): + # indices + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + o_i = tl.arange(0, BT) + + # [BT, BT] + m_s = o_i[:, None] >= o_i[None, :] + + # [BV], zero-order taylor expansion + b_h_0o = tl.zeros([BV], dtype=tl.float32) + # [BK, BV], first-order taylor expansion + b_h_1o = tl.zeros([BK, BV], dtype=tl.float32) + # [BK, BK, BV] second-order taylor expansion + b_h_2o = tl.zeros([BK*BK, BV], dtype=tl.float32) + + # make block pointers + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (0, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, 0), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + (i_bh + i_k*B*H) * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + + p_z = z + (i_bh + i_k * B * H) * T + tl.arange(0, BT) + k_2o = tl.zeros([1, BK * BK], dtype=tl.float32) + k_1o = tl.zeros([1, BK], dtype=tl.float32) + k_0o = 0 + + for i in range(0, tl.cdiv(T, BT)): + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BK*BK, BT] + b_k_2o = b_k[:, None, :] * b_k[None, :, :] + b_k_2o = tl.reshape(b_k_2o, [BK * BK, BT]).to(b_k.dtype) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BK] + b_q = (tl.load(p_q, boundary_check=(0, 1)) * scale).to(b_k.dtype) + b_o = tl.zeros([BT, BV], dtype=tl.float32) + b_z = tl.zeros([BT], dtype=tl.float32) + + # interchunk + # zero-order + b_o += b_h_0o + b_z += k_0o + # first-order + b_o += tl.dot(b_q, b_h_1o.to(b_q.dtype), allow_tf32=False) + b_z += tl.sum(b_q * k_1o, axis=1) + # second-order + b_q_2o = b_q[:, :, None] * b_q[:, None, :] + b_q_2o = tl.reshape(b_q_2o, [BT, BK * BK]).to(b_k.dtype) + b_o += tl.dot(b_q_2o, b_h_2o.to(b_q_2o.dtype), allow_tf32=False) * 0.5 + b_z += tl.sum(b_q_2o * k_2o, axis=1) * 0.5 + + # update running statistics + k_1o += tl.sum(b_k, axis=1)[None, :] + k_2o += tl.sum(b_k_2o, axis=1)[None, :] + k_0o += BT + + # intrachunk + # [BT, BT] + b_s = tl.dot(b_q, b_k, allow_tf32=False) + b_s = 1 + b_s + 0.5 * b_s * b_s + b_s = tl.where(m_s, b_s, 0) + b_z += tl.sum(b_s, axis=1) + b_o += tl.dot(b_s.to(b_q.dtype), b_v, allow_tf32=False) + # [TB, BV] + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_z, b_z.to(p_z.dtype.element_ty), mask=(i * BT + tl.arange(0, BT)) < T) + + # update hidden state + # [BK, BV] + b_h_2o = b_h_2o + tl.dot(b_k_2o.to(b_v.dtype), b_v, allow_tf32=False) + b_h_1o = b_h_1o + tl.dot(b_k, b_v, allow_tf32=False) + b_h_0o = b_h_0o + tl.sum(b_v, axis=0) + + p_q = tl.advance(p_q, (BT, 0)) + p_k = tl.advance(p_k, (0, BT)) + p_v = tl.advance(p_v, (BT, 0)) + p_o = tl.advance(p_o, (BT, 0)) + p_z += BT + + +# Similar to Algorithm1 of https://arxiv.org/abs/2006.16236 +@triton.jit +def fused_chunk_based_bwd_kernel( + # NV: number of split in the V dimension. 
NK: number of split in the K dimension + q, # query [B, H, L, K] + k, # key [B, H, L, V] + v, # value [B, H, L, V] + do, # gradient of output [B, H, L, V] + dz, # gradient of normalizer [B, H, L] + dq, # gradient of query [NV, B, H, L, K] + dk, # gradient of key [NV, B, H, L, K] + dv, # gradient of value [NK, B, H, L, V] + s_k_h, # stride size: L * K + s_k_t, # stride size: K + s_k_d, # stride size: 1 + s_v_h, # stride size: L * V + s_v_t, # stride size: V + s_v_d, # stride size: 1 + scale, # K ** -0.5 + B: tl.constexpr, # B + H: tl.constexpr, # H + T: tl.constexpr, # T + K: tl.constexpr, # K + V: tl.constexpr, # V + BT: tl.constexpr, # BLOCK SIZE along the sequence dimension, a.k.a. chunk size + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension +): + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + o_i = tl.arange(0, BT) + m_s = o_i[:, None] >= o_i[None, :] + + # [BV], zero-order taylor expansion + # b_h_0o = tl.zeros([BV], dtype=tl.float32) + # [BK, BV], first-order taylor expansion + b_h_1o = tl.zeros([BV, BK], dtype=tl.float32) + # [BK, BK, BV] second-order taylor expansion + b_h_2o = tl.zeros([BV, BK*BK], dtype=tl.float32) + + k_1o = tl.zeros([1, BK], dtype=tl.float32) + k_2o = tl.zeros([1, BK * BK], dtype=tl.float32) + + for i in range(0, tl.cdiv(T, BT)): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i * BT, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i * BT), (BV, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i * BT, i_v * BV), (BT, BV), (1, 0)) + p_dq = tl.make_block_ptr(dq + (i_bh + i_v*B*H) * s_k_h, (T, K), (s_k_t, s_k_d), (i*BT, i_k*BK), (BT, BK), (1, 0)) + p_dz = dz + (i_bh) * T + tl.arange(0, BT) + i * BT + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + + # load tensors + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)).to(b_q.dtype) + b_dz = tl.load(p_dz, mask=(tl.arange(0, BT) + i * BT) < T) + # [BV, BT] + b_v = tl.load(p_v, boundary_check=(0, 1)) + + # inter-chunk + b_dq += tl.dot(b_do, (b_h_1o).to(b_do.dtype), allow_tf32=False) + if i_v == 0: + b_dq += b_dz[:, None] * k_1o + b_dq_2o = tl.dot(b_do, (b_h_2o).to(b_do.dtype), allow_tf32=False) * 0.5 + if i_v == 0: + b_dq_2o += (b_dz[:, None] * k_2o) * 0.5 + b_dq_2o = tl.reshape(b_dq_2o, [BT, BK, BK]) + b_dq += tl.sum(b_dq_2o * b_q[:, :, None], axis=1) + b_dq += tl.sum(b_dq_2o * b_q[:, None, :], axis=2) + b_dq *= scale + + # intra-chunk + # [BT, BT] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) + if i_v == 0: + b_ds += b_dz[:, None] + b_ds = tl.where(m_s, b_ds, 0) * scale + b_s = tl.dot(b_q, tl.trans(b_k), allow_tf32=False) + b_s = tl.where(m_s, b_s, 0) + b_dq += tl.dot((b_ds * (1 + b_s)).to(b_q.dtype), b_k, allow_tf32=False) + + # store + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + + # update hidden state + # [BT, BK*BK] + b_k_2o = b_k[:, :, None] * b_k[:, None, :] + b_k_2o = tl.reshape(b_k_2o, [BT, BK * BK]).to(b_k.dtype) + # [BV, BK*BK] + b_h_2o = b_h_2o + tl.dot(b_v, b_k_2o.to(b_v.dtype), allow_tf32=False) + # [BV, BK] + b_h_1o = b_h_1o + tl.dot(b_v, b_k, allow_tf32=False) + + if i_v == 0: + # update running statistics + k_1o += tl.sum(b_k, axis=0)[None, :] + 
k_2o += tl.sum(b_k_2o, axis=0)[None, :] + + tl.debug_barrier() + b_h_1o = None + b_h_2o = None + + # [BK, BV], first-order taylor expansion + b_dh_1o = tl.zeros([BK, BV], dtype=tl.float32) + # [BK, BK, BV] second-order taylor expansion + b_dh_2o = tl.zeros([BK*BK, BV], dtype=tl.float32) + b_dh_0o = tl.zeros([BV], dtype=tl.float32) + m_s = tl.arange(0, BT)[:, None] <= tl.arange(0, BT)[None, :] + + dq_1o = tl.zeros([1, BK], dtype=tl.float32) + dq_2o = tl.zeros([BK * BK, 1], dtype=tl.float32) + + for i in range(tl.cdiv(T, BT) * BT - BT, -BT, -BT): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i, i_v * BV), (BT, BV), (1, 0)) + p_dk = tl.make_block_ptr(dk + (i_bh+i_v*B*H) * s_k_h, (T, K), (s_k_t, s_k_d), (i, i_k*BK), (BT, BK), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_bh+i_k*B*H) * s_v_h, (T, V), (s_v_t, s_v_d), (i, i_v*BV), (BT, BV), (1, 0)) + p_dz = dz + (i_bh) * T + tl.arange(0, BT) + i + + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + b_dv = tl.zeros([BT, BV], dtype=tl.float32) + + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)).to(b_q.dtype) + b_dz = tl.load(p_dz, mask=(tl.arange(0, BT)+i) < T) + b_q = (b_q * scale).to(b_k.dtype) + + # intra chunk + b_ds = tl.dot(b_v, tl.trans(b_do), allow_tf32=False) + if i_v == 0: + b_ds += b_dz[None, :] + b_ds = tl.where(m_s, b_ds, 0) + b_s = tl.dot(b_k, b_q, allow_tf32=False) + b_s2 = 1 + b_s + 0.5 * b_s * b_s + b_s = tl.where(m_s, b_s, 0) + b_s2 = tl.where(m_s, b_s2, 0) + b_ds *= (1+b_s) + + b_dk += tl.dot(b_ds.to(b_k.dtype), tl.trans(b_q), allow_tf32=False) + b_dv += tl.dot(b_s2.to(b_do.dtype), b_do, allow_tf32=False) + + # inter chunk + b_k_2o = b_k[:, :, None] * b_k[:, None, :] + b_k_2o = tl.reshape(b_k_2o, [BT, BK * BK]).to(b_k.dtype) + + b_dv += tl.dot(b_k, b_dh_1o.to(b_k.dtype), allow_tf32=False) + b_dv += tl.dot(b_k_2o, b_dh_2o.to(b_k.dtype), allow_tf32=False) + b_dv += b_dh_0o + + b_dk += tl.dot(b_v, tl.trans(b_dh_1o).to(b_k.dtype), allow_tf32=False) + + if i_v == 0: + b_dk += dq_1o + + b_dk_2o = tl.dot(b_dh_2o.to(b_k.dtype), tl.trans(b_v), allow_tf32=False) + if i_v == 0: + b_dk_2o += dq_2o + b_dk_2o = tl.reshape(b_dk_2o, [BK, BK, BT]) + b_k_fp32 = tl.trans(b_k.to(tl.float32)) + b_dk2 = tl.sum(b_dk_2o * b_k_fp32[:, None, :], axis=0) + b_dk2 += tl.sum(b_dk_2o * b_k_fp32[None, :, :], axis=1) + b_dk += tl.trans(b_dk2) + + # hidden state update + b_dh_0o += tl.sum(b_do, axis=0) + b_dh_1o = b_dh_1o + tl.dot(b_q, b_do, allow_tf32=False) + b_q_2o = b_q[None, :, :] * b_q[:, None, :] + b_q_2o = tl.reshape(b_q_2o, [BK * BK, BT]).to(b_k.dtype) + b_dh_2o = b_dh_2o + tl.dot(b_q_2o, b_do, allow_tf32=False) * 0.5 + + if i_v == 0: + dq_1o += (tl.sum(b_dz[None, :] * b_q, axis=1))[None, :] + dq_2o += (tl.sum(b_dz[None, :] * b_q_2o, axis=1) * 0.5)[:, None] + + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + +class FusedChunkBasedFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward(ctx, q, k, v, scale=1): + B, H, T, K, V = *k.shape, v.shape[-1] + + scale = scale + BT 
= 16 + BK, BV = min(K, 16), min(V, 32) + BK, BV = max(BK, 16), max(BV, 16) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + + num_warps = 4 + + # the norm of o might explode, so we need to use float32 here + o = q.new_empty(NK, B, H, T, V, dtype=torch.float32) + z = q.new_empty(NK, B, H, T, dtype=torch.float32) + + grid = (NV, NK, B * H) + fused_chunk_based_fwd_kernel[grid]( + q, k, v, o, z, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale, + B=B, H=H, T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, + num_warps=num_warps, + ) + o = o.sum(0) + z = z.sum(0) + ctx.save_for_backward(q, k, v) + ctx.scale = scale + return o.to(q.dtype), z.to(z.dtype) + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dz): + q, k, v = ctx.saved_tensors + B, H, T, K, V = *k.shape, v.shape[-1] + scale = ctx.scale + + BT = 16 + BK, BV = min(K, 16), min(V, 32) + BK, BV = max(BK, 16), max(BV, 16) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + num_stages = 1 + num_warps = 4 + + dq = q.new_empty(NV, B, H, T, K) + dk = q.new_empty(NV, B, H, T, K) + dv = q.new_empty(NK, B, H, T, V) + grid = (NV, NK, B * H) + + fused_chunk_based_bwd_kernel[grid]( + q, k, v, do, dz, dq, dk, dv, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale, + B=B, H=H, T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, + num_warps=num_warps, + num_stages=num_stages + ) + dq = dq.sum(0) + dk = dk.sum(0) + dv = dv.sum(0) + return dq.to(q.dtype), dk.to(k.dtype), dv.to(v.dtype), None + + +def fused_chunk_based( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + use_norm: bool = True, + head_first: bool = True +): + assert q.shape[-1] <= 16, 'only support feature dimension up to 16.' + if scale is None: + scale = q.shape[-1] ** -0.5 + if not head_first: + q, k, v = map(lambda x: x.transpose(1, 2), (q, k, v)) + o, z = FusedChunkBasedFunction.apply(q, k, v, scale) + if use_norm: + o = o / (z[..., None] + 1e-6) + if not head_first: + o = o.transpose(1, 2) + return o.to(q.dtype) diff --git a/fla/ops/based/naive.py b/fla/ops/based/naive.py new file mode 100644 index 0000000000000000000000000000000000000000..4de614137ed28567ebb1df39c0892f498b91fb5a --- /dev/null +++ b/fla/ops/based/naive.py @@ -0,0 +1,72 @@ +# -*- coding: utf-8 -*- + +from typing import Optional + +import torch +from einops import rearrange + + +def naive_parallel_based( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + use_norm: bool = True +): + if scale is None: + scale = q.shape[-1] ** -0.5 + q = q * scale + attn = q @ k.transpose(-2, -1) + attn = 1 + attn + 1/2 * (attn ** 2) + attn.masked_fill_(~torch.tril(torch.ones( + q.shape[-2], q.shape[-2], dtype=torch.bool, device=q.device)), 0) + o = attn @ v + if use_norm: + z = attn.sum(-1) + return o / (z[..., None] + 1e-6) + else: + return o + + +def naive_chunk_based(q, k, v, chunk_size=256): + q = q * (q.shape[-1] ** -0.5) + # compute normalizer. 
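+    # z_t = sum_{s<=t} (1 + q_t.k_s + 0.5 * (q_t.k_s)^2): cumulative sums of k
+    # and of the outer products k_s k_s^T turn each Taylor order into a single
+    # contraction with q, and the zeroth-order term is just the position count t + 1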
+    k_cumsum = torch.cumsum(k, dim=-2)
+    kk_cumsum = torch.cumsum(k.unsqueeze(-1) * k.unsqueeze(-2), dim=-3)
+    # first order
+    z = (q * k_cumsum).sum(-1)
+    # second order
+    z += (q.unsqueeze(-1) * q.unsqueeze(-2) * kk_cumsum).sum((-1, -2)) * 0.5
+    # zero-th order
+    z += (torch.arange(0, q.shape[-2]).to(z.device) * 1.0 + 1.0)[None, None, :]
+
+    # compute o
+    # constant term
+    _o = v.cumsum(-2)
+
+    q = rearrange(q, 'b h (n c) d -> b h n c d', c=chunk_size)
+
+    k = rearrange(k, 'b h (n c) d -> b h n c d', c=chunk_size)
+    v = rearrange(v, 'b h (n c) d -> b h n c d', c=chunk_size)
+
+    intra_chunk_attn = q @ k.transpose(-2, -1)
+    intra_chunk_attn = intra_chunk_attn + 1/2 * (intra_chunk_attn ** 2)
+    intra_chunk_attn.masked_fill_(~torch.tril(torch.ones(chunk_size, chunk_size, dtype=torch.bool, device=q.device)), 0)
+    o = intra_chunk_attn @ v
+
+    # quadratic term
+    kv = torch.einsum('b h n c x, b h n c y, b h n c z -> b h n x y z', k, k, v)
+    kv = kv.cumsum(2)
+    kv = torch.cat([torch.zeros_like(kv[:, :, :1]), kv[:, :, :-1]], dim=2)
+
+    o += 0.5 * torch.einsum('b h n x y z, b h n c x, b h n c y -> b h n c z', kv, q, q)
+
+    # linear term
+    kv = torch.einsum('b h n c x, b h n c y -> b h n x y', k, v)
+    kv = kv.cumsum(2)
+    kv = torch.cat([torch.zeros_like(kv[:, :, :1]), kv[:, :, :-1]], dim=2)
+    o += torch.einsum('b h n x y, b h n c x -> b h n c y', kv, q)
+
+    o = rearrange(o, 'b h n c d -> b h (n c) d')
+    o = o + _o
+    return o / (z[..., None] + 1e-6)
diff --git a/fla/ops/based/parallel.py b/fla/ops/based/parallel.py
new file mode 100644
index 0000000000000000000000000000000000000000..70330eae283756155b1f1aa875550a5a9aa0d591
--- /dev/null
+++ b/fla/ops/based/parallel.py
@@ -0,0 +1,409 @@
+
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from typing import Optional
+
+import torch
+import triton
+import triton.language as tl
+
+from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous
+
+# Based: An Educational and Effective Sequence Mixer
+# https://hazyresearch.stanford.edu/blog/2023-12-11-zoology2-based
+
+
+@triton.jit
+def parallel_based_fwd_kernel(
+    q,  # query [B, H, L, K]
+    k,  # key [B, H, L, K]
+    v,  # value [B, H, L, V]
+    o,  # output [B, H, L, V]
+    z,  # normalizer [B, H, L]
+    s_k_h,  # stride size: L * K
+    s_k_t,  # stride size: K
+    s_k_d,  # stride size: 1
+    s_v_h,  # stride size: L * V
+    s_v_t,  # stride size: V
+    s_v_d,  # stride size: 1
+    scale,  # K ** -0.5
+    B: tl.constexpr,  # batch size
+    H: tl.constexpr,  # H
+    T: tl.constexpr,  # T
+    K: tl.constexpr,  # K
+    V: tl.constexpr,  # V
+    BTL: tl.constexpr,  # BLOCK SIZE along the sequence dimension for Q
+    BTS: tl.constexpr,  # BLOCK SIZE along the sequence dimension for K/V
+    BK: tl.constexpr,  # BLOCK SIZE along the K dimension
+    BV: tl.constexpr,  # BLOCK SIZE along the V dimension
+):
+    # i_c: chunk index. 
used for sequence parallelism + i_kv, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + NV = tl.cdiv(V, BV) + i_k = i_kv // (NV) + i_v = i_kv % (NV) + + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_c * BTL, i_k * BK), (BTL, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, 0), (BK, BTS), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BTS, BV), (1, 0)) + + # [BQ, BD] block Q, in the shared memory throughout the whole kernel + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + b_o = tl.zeros([BTL, BV], dtype=tl.float32) + b_z = tl.zeros([BTL], dtype=tl.float32) + + # Q block and K block have no overlap + # no need for mask, thereby saving flops + for _ in range(0, i_c * BTL, BTS): + # [BK, BTS] + b_k = tl.load(p_k, boundary_check=(0, 1)) + + # [BTS, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BTL, BTS] + b_s = tl.dot(b_q, (b_k), allow_tf32=False) + b_s = 1 + b_s + 0.5 * b_s * b_s + b_z += tl.sum(b_s, axis=1) + + # [BQ, BD] + b_o = b_o + tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False) + p_k = tl.advance(p_k, (0, BTS)) + p_v = tl.advance(p_v, (BTS, 0)) + + # # rescale interchunk output + tl.debug_barrier() + o_q = tl.arange(0, BTL) + # # sync threads, easy for compiler to optimize + # tl.debug_barrier() + + o_k = tl.arange(0, BTS) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_c * BTL), (BK, BTS), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_c * BTL, i_v * BV), (BTS, BV), (1, 0)) + # Q block and K block have overlap. masks required + for _ in range(i_c * BTL, (i_c + 1) * BTL, BTS): + # [BK, BTS] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BTS, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BTL, BTS] + m_s = o_q[:, None] >= o_k[None, :] + b_s = tl.dot(b_q, b_k, allow_tf32=False) + b_s = 1 + b_s + 0.5 * b_s * b_s + b_s = tl.where(m_s, b_s, 0) + b_z += tl.sum(b_s, axis=1) + # [BTL, BV] + b_o += tl.dot(b_s.to(b_q.dtype), b_v, allow_tf32=False) + + p_k = tl.advance(p_k, (0, BTS)) + p_v = tl.advance(p_v, (BTS, 0)) + o_k += BTS + + p_o = tl.make_block_ptr(o + (i_bh + B * H * i_k) * s_v_h, (T, V), (s_v_t, s_v_d), (i_c*BTL, i_v*BV), (BTL, BV), (1, 0)) + p_z = z + (i_bh + B * H * i_k) * T + i_c * BTL + tl.arange(0, BTL) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_z, b_z.to(p_z.dtype.element_ty), mask=((i_c * BTL + tl.arange(0, BTL)) < T)) + + +@triton.jit +def _parallel_based_bwd_dq( + i_bh, + i_c, + i_k, + i_v, + i_h, + q, + k, + v, + do, + dz, + dq, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, s_v_d, B, H, T, scale, + BTL: tl.constexpr, + BTS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, +): + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), + (i_c * BTL, i_v * BV), (BTL, BV), (1, 0)) + p_q = tl.make_block_ptr(q + (i_bh) * s_k_h, (T, K), + (s_k_t, s_k_d), (i_c*BTL, i_k*BK), (BTL, BK), (1, 0)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)).to(b_q.dtype) + b_q = (b_q * scale).to(b_q.dtype) + b_dq = tl.zeros([BTL, BK], dtype=tl.float32) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (0, i_k * BK), (BTS, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, 0), (BV, BTS), (0, 1)) + p_dz = dz + i_bh * T + i_c * BTL + tl.arange(0, BTL) + b_dz = tl.load(p_dz, 
mask=(i_c * BTL + tl.arange(0, BTL)) < T) + + for _ in range(0, i_c * BTL, BTS): + # [BTS, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BV, BTS] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BTL, BTS] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) + if i_v == 0: + b_ds += b_dz[:, None] + else: + b_ds = b_ds + b_s = tl.dot(b_q, tl.trans(b_k), allow_tf32=False) + # [BQ, BD] + b_dq += tl.dot((b_ds * (1 + b_s)).to(b_v.dtype), b_k, allow_tf32=False) + p_k = tl.advance(p_k, (BTS, 0)) + p_v = tl.advance(p_v, (0, BTS)) + + b_dq *= scale + o_q = tl.arange(0, BTL) + o_k = tl.arange(0, BTS) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_c * BTL, i_k * BK), (BTS, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i_c * BTL), (BV, BTS), (0, 1)) + # Q block and K block have overlap. masks required + for _ in range(i_c * BTL, (i_c + 1) * BTL, BTS): + # [BTS, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BV, BTS] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BTL, BTS] + m_s = o_q[:, None] >= o_k[None, :] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) + if i_v == 0: + b_ds += b_dz[:, None] + else: + b_ds = b_ds + b_ds = tl.where(m_s, b_ds, 0) * scale + b_s = tl.dot(b_q, tl.trans(b_k), allow_tf32=False) + b_s = tl.where(m_s, b_s, 0) + # [BTL, BK] + b_dq += tl.dot((b_ds + b_ds * b_s).to(b_k.dtype), + b_k, allow_tf32=False) + p_k = tl.advance(p_k, (BTS, 0)) + p_v = tl.advance(p_v, (0, BTS)) + o_k += BTS + p_dq = tl.make_block_ptr(dq + (i_bh + B * H * i_v) * s_k_h, (T, K), + (s_k_t, s_k_d), (i_c*BTL, i_k*BK), (BTL, BK), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + return + + +@triton.jit +def _parallel_based_bwd_dkv( + i_bh, i_c, i_k, i_v, i_h, + q, k, v, do, dz, dk, dv, s_k_h, s_k_t, s_k_d, s_v_h, + s_v_t, s_v_d, B, H, T, scale, + BTL: tl.constexpr, BTS: tl.constexpr, BK: tl.constexpr, BV: tl.constexpr, + K: tl.constexpr, V: tl.constexpr, +): + # compute dk dv + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_c * BTL, i_k * BK), (BTL, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_c * BTL, i_v * BV), (BTL, BV), (1, 0)) + b_k, b_v = tl.load(p_k, boundary_check=(0, 1)), tl.load( + p_v, boundary_check=(0, 1)) + b_dk, b_dv = tl.zeros([BTL, BK], dtype=tl.float32), tl.zeros( + [BTL, BV], dtype=tl.float32) + + for i in range((tl.cdiv(T, BTS) * BTS)-BTS, (i_c + 1) * BTL - BTS, -BTS): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i), (BK, BTS), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i), (BV, BTS), (0, 1)) + p_dz = dz + i_bh * T + i + tl.arange(0, BTS) + b_q = tl.load(p_q, boundary_check=(0, 1)) # [BK, BTS] + b_do = tl.load(p_do, boundary_check=(0, 1)).to(b_q.dtype) # [BV, BTS] + b_dz = tl.load(p_dz, mask=(i + tl.arange(0, BTS)) < T) + b_s = tl.dot(b_k.to(b_q.dtype), b_q, allow_tf32=False) * \ + scale # [BTL, BTS] + b_s2 = 1 + b_s + 0.5 * b_s * b_s + b_dv += tl.dot(b_s2.to(b_q.dtype), tl.trans(b_do), allow_tf32=False) + b_ds = tl.dot(b_v, b_do, allow_tf32=False) * scale + if i_v == 0: + b_ds += b_dz[None, :] * scale + else: + b_ds = b_ds + b_dk += tl.dot((b_ds + b_ds * b_s).to(b_q.dtype), tl.trans(b_q), allow_tf32=False) + + tl.debug_barrier() + o_q, o_k = tl.arange(0, BTS), tl.arange(0, BTL) + for i in range(i_c*BTL, (i_c+1)*BTL, BTS): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i), (BK, BTS), (0, 1)) + p_do = 
tl.make_block_ptr(do + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i), (BV, BTS), (0, 1)) + p_dz = dz + i_bh * T + i + tl.arange(0, BTS) + b_q = tl.load(p_q, boundary_check=(0, 1)) # [BD, BQ] + b_do = tl.load(p_do, boundary_check=(0, 1)).to(b_q.dtype) + b_dz = tl.load(p_dz, mask=(i + tl.arange(0, BTS)) < T) + # [BK, BQ] + m_s = o_k[:, None] <= o_q[None, :] + b_s = tl.dot(b_k, b_q, allow_tf32=False) * scale + b_s2 = 1 + b_s + 0.5 * b_s * b_s + b_s = tl.where(m_s, b_s, 0) + b_s2 = tl.where(m_s, b_s2, 0) + + b_ds = tl.dot(b_v, b_do, allow_tf32=False) + if i_v == 0: + b_ds += b_dz[None, :] + else: + b_ds = b_ds + b_ds = tl.where(m_s, b_ds, 0) * scale + # [BK, BD] + b_dv += tl.dot(b_s2.to(b_q.dtype), tl.trans(b_do), allow_tf32=False) + b_dk += tl.dot((b_ds + b_ds * b_s).to(b_q.dtype), + tl.trans(b_q), allow_tf32=False) + o_q += BTS + + p_dk = tl.make_block_ptr(dk + (i_bh + B * H * i_v) * s_k_h, (T, K), + (s_k_t, s_k_d), (i_c*BTL, i_k*BK), (BTL, BK), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_bh + B * H * i_k) * s_v_h, (T, V), + (s_v_t, s_v_d), (i_c*BTL, i_v*BV), (BTL, BV), (1, 0)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + return + + +@triton.jit +def parallel_based_bwd_kernel( + q, + k, + v, + do, + dz, + dq, + dk, + dv, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BTL: tl.constexpr, + BTS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, +): + i_kv, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + NV = tl.cdiv(V, BV) + i_k = i_kv // (NV) + i_v = i_kv % (NV) + i_h = i_bh % H + _parallel_based_bwd_dq( + i_bh, i_c, i_k, i_v, i_h, + q, k, v, do, dz, dq, s_k_h, s_k_t, s_k_d, s_v_h, + s_v_t, s_v_d, B, H, T, scale, BTL=BTL, BTS=BTS, BK=BK, BV=BV, K=K, V=V + ) + tl.debug_barrier() + _parallel_based_bwd_dkv( + i_bh, i_c, i_k, i_v, i_h, + q, k, v, do, dz, dk, dv, s_k_h, s_k_t, s_k_d, s_v_h, + s_v_t, s_v_d, B, H, T, scale, BTL, BTS, BK, BV, K, V + ) + + +class ParallelBasedFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward(ctx, q, k, v, scale): + BTL, BTS = 128, 32 + assert BTL % BTS == 0 + # assert q.shape[-1] % 16 == 0 + BK = min(128, triton.next_power_of_2(k.shape[-1])) + BV = min(128, triton.next_power_of_2(v.shape[-1])) + BK, BV = max(BK, 16), max(BV, 16) + B, H, T, K, V = *k.shape, v.shape[-1] + num_stages = 2 + num_warps = 4 + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + grid = (NK * NV, triton.cdiv(T, BTL), B * H) + + assert NK == 1, "will encounter some synchronization issue if not." 
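+        # note: `o` and `z` below are allocated with a leading NK split
+        # dimension; with NK == 1 the trailing `.sum(0)` is effectively a
+        # squeeze, while NK > 1 would sum partial results across key blocks
+        # (hence the assert above).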
+ + o = torch.empty(NK, B, H, T, V, device=q.device) + z = torch.empty(NK, B, H, T, device=q.device) + parallel_based_fwd_kernel[grid]( + q, k, v, o, z, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale, + B=B, H=H, T=T, K=K, V=V, + BTL=BTL, BTS=BTS, BK=BK, BV=BV, + num_warps=num_warps, + num_stages=num_stages + ) + ctx.save_for_backward(q, k, v) + ctx.scale = scale + return o.sum(0).to(q.dtype), z.sum(0).to(q.dtype) + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dz): + q, k, v = ctx.saved_tensors + scale = ctx.scale + BTL, BTS = 64, 32 + assert BTL % BTS == 0 + BK = min(128, triton.next_power_of_2(k.shape[-1])) + BV = min(128, triton.next_power_of_2(v.shape[-1])) + BK, BV = max(BK, 16), max(BV, 16) + B, H, T, K, V = *k.shape, v.shape[-1] + num_stages = 2 + num_warps = 4 + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + grid = (NK * NV, triton.cdiv(T, BTL), B * H) + + assert NK == 1, "will encounter some synchronization issue if not" + + dq = torch.empty(NV, B, H, T, K, dtype=q.dtype, device=q.device) + dk = torch.empty(NV, B, H, T, K, dtype=q.dtype, device=q.device) + dv = torch.empty(NK, B, H, T, V, dtype=q.dtype, device=q.device) + + parallel_based_bwd_kernel[grid]( + q, k, v, do, dz, dq, dk, dv, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale, + B=B, H=H, T=T, K=K, V=V, + BTL=BTL, BTS=BTS, BK=BK, BV=BV, + num_warps=num_warps, + num_stages=num_stages + ) + + return dq.sum(0).to(q.dtype), dk.sum(0).to(k.dtype), dv.sum(0).to(v.dtype), None + + +triton_parallel_based = ParallelBasedFunction.apply + + +def parallel_based( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + use_norm: bool = True, + head_first: bool = True +): + assert q.shape[-1] <= 128, "only support feature dim up to 128" + if scale is None: + scale = q.shape[-1] ** -0.5 + if not head_first: + q, k, v = map(lambda x: x.transpose(1, 2), (q, k, v)) + o, z = triton_parallel_based(q, k, v, scale) + if use_norm: + o = o / (z[..., None] + 1e-6) + if not head_first: + o = o.transpose(1, 2) + return o.to(q.dtype) diff --git a/fla/ops/common/__init__.py b/fla/ops/common/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..40a96afc6ff09d58a702b76e3f7dd412fe975e26 --- /dev/null +++ b/fla/ops/common/__init__.py @@ -0,0 +1 @@ +# -*- coding: utf-8 -*- diff --git a/fla/ops/common/chunk_h.py b/fla/ops/common/chunk_h.py new file mode 100644 index 0000000000000000000000000000000000000000..580cd01c460e114cd54f5e19524fe02a34e763da --- /dev/null +++ b/fla/ops/common/chunk_h.py @@ -0,0 +1,395 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + + +@triton.heuristics({ + 'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'STORE_FINAL_STATE': lambda args: args['ht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.autotune( + configs=[ + triton.Config({'BK': BK, 'BV': BV}, num_warps=num_warps) + for BK in [32, 64, 128] + for BV in [32, 64, 128] + for num_warps in [1, 2, 4, 8] + ], + key=['BT', 'USE_G', 'USE_GK', 'USE_GV'], +) +@triton.jit +def chunk_fwd_kernel_h( + k, + v, + h, + g, + gk, + gv, + h0, + ht, + offsets, + c_offsets, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_G: tl.constexpr, + USE_GK: 
tl.constexpr, + USE_GV: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_v, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_n, i_h = i_nh // H, i_nh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + boh = tl.load(c_offsets + i_n).to(tl.int32) + else: + bos, eos = i_n * T, i_n * T + T + NT = tl.cdiv(T, BT) + boh = i_n * NT + + # [BK, BV] + b_h = tl.zeros([BK, BV], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h0 = tl.make_block_ptr(h0 + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_h = tl.load(p_h0, boundary_check=(0, 1)).to(tl.float32) + + for i_t in range(NT): + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_nh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + i_nh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + (i_nh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos*H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + ((boh + i_t) * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + tl.store(p_h, b_h.to(p_h.dtype.element_ty), boundary_check=(0, 1)) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + last_idx = min((i_t + 1) * BT, T) - 1 + + # scalar decay + if USE_G: + if HEAD_FIRST: + b_g_last = tl.load(g + i_nh * T + last_idx) + p_g = g + i_nh * T + i_t * BT + tl.arange(0, BT) + p_g = tl.max_contiguous(tl.multiple_of(p_g, BT), BT) + else: + b_g_last = tl.load(g + bos * H + last_idx * H + i_h) + p_g = g + bos*H + (i_t * BT + tl.arange(0, BT)) * H + i_h + b_h *= tl.exp(b_g_last) + b_g = tl.load(p_g, mask=(i_t * BT + tl.arange(0, BT) < T), other=0.) + b_v = (b_v * tl.exp(b_g_last - b_g)[:, None]).to(b_v.dtype) + + # vector decay, h = Diag(gk) @ h + if USE_GK: + if HEAD_FIRST: + p_gk = tl.make_block_ptr(gk + i_nh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_gk_last = gk + i_nh * T*K + last_idx * K + i_k * BK + tl.arange(0, BK) + else: + p_gk = tl.make_block_ptr(gk + (bos*H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_gk_last = gk + (bos + last_idx) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_gk_last = tl.max_contiguous(tl.multiple_of(p_gk_last, BK), BK) + b_gk_last = tl.load(p_gk_last, mask=(i_k * BK + tl.arange(0, BK) < K), other=0.) + b_h *= tl.exp(b_gk_last)[:, None] + + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + b_k = (b_k * tl.exp(b_gk_last[:, None] - b_gk)).to(b_k.dtype) + + # vector decay, h = h @ Diag(gv) + if USE_GV: + if HEAD_FIRST: + p_gv = tl.make_block_ptr(gv + i_nh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_gv_last = gv + i_nh * T*V + last_idx * V + i_v * BV + tl.arange(0, BV) + else: + p_gv = tl.make_block_ptr(gv + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_gv_last = gv + (bos + last_idx) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_gv_last = tl.max_contiguous(tl.multiple_of(p_gv_last, BV), BV) + b_gv_last = tl.load(p_gv_last, mask=(i_v * BV + tl.arange(0, BV) < V), other=0.) 
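+            # as with the scalar/key gates above: decay the running state by
+            # the chunk's total value gate, and rescale b_v by
+            # exp(g_last - g_t) so that every position enters the rank-BT
+            # update below at a consistent scale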
+ b_h *= tl.exp(b_gv_last)[None, :] + + b_gv = tl.load(p_gv, boundary_check=(0, 1)) + b_v = (b_v * tl.exp(b_gv_last[None, :] - b_gv)).to(b_v.dtype) + + b_h += tl.dot(b_k, b_v) + + if STORE_FINAL_STATE: + p_ht = tl.make_block_ptr(ht + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.heuristics({ + 'STORE_INITIAL_STATE_GRADIENT': lambda args: args['dh0'] is not None, + 'USE_FINAL_STATE_GRADIENT': lambda args: args['dht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.autotune( + configs=[ + triton.Config({'BK': BK, 'BV': BV}, num_warps=num_warps) + for BK in [32, 64, 128] + for BV in [32, 64, 128] + for num_warps in [1, 2, 4, 8] + ], + key=['BT', 'USE_G', 'USE_GK', 'USE_GV'], +) +@triton.jit +def chunk_bwd_kernel_dh( + q, + g, + gk, + gv, + do, + dh, + dht, + dh0, + offsets, + c_offsets, + scale, + T: tl.constexpr, + HQ: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NG: tl.constexpr, + USE_G: tl.constexpr, + USE_GK: tl.constexpr, + USE_GV: tl.constexpr, + STORE_INITIAL_STATE_GRADIENT: tl.constexpr, + USE_FINAL_STATE_GRADIENT: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_v, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_bg = i_nh // NG + i_n, i_hq = i_nh // HQ, i_nh % HQ + i_h = i_hq // NG + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + boh = tl.load(c_offsets + i_n).to(tl.int32) + else: + bos, eos = i_n * T, i_n * T + T + NT = tl.cdiv(T, BT) + boh = i_n * NT + + # [BK, BV] + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + if USE_FINAL_STATE_GRADIENT: + p_dht = tl.make_block_ptr(dht + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_dh += tl.load(p_dht, boundary_check=(0, 1)).to(tl.float32) + + for i_t in range(NT - 1, -1, -1): + if HEAD_FIRST: + p_dh = tl.make_block_ptr(dh + (i_nh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_dh = tl.make_block_ptr(dh + ((boh+i_t) * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_dh, b_dh.to(p_dh.dtype.element_ty), boundary_check=(0, 1)) + last_idx = min(i_t * BT + BT, T) - 1 + # [BK, BT] + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_nh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_nh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos*HQ + i_hq) * K, (K, T), (1, HQ*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_do = tl.make_block_ptr(do + (bos*HQ + i_hq) * V, (T, V), (HQ*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BT, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + + if USE_G: + if HEAD_FIRST: + p_g = g + i_bg * T + i_t * BT + tl.arange(0, BT) + p_g = tl.max_contiguous(tl.multiple_of(p_g, BT), BT) + b_g_last = tl.load(g + i_bg * T + last_idx) + else: + p_g = g + (bos + i_t * BT + tl.arange(0, BT)) * H + i_h + b_g_last = tl.load(g + (bos + last_idx) * H + i_h) + b_g = tl.load(p_g, mask=(i_t * BT + tl.arange(0, BT) < T), other=0.) 
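+            # mirror the forward recurrence: each query is scaled by its own
+            # per-step gate, and the accumulated state gradient is decayed by
+            # the chunk's total gate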
+ b_q = (b_q * tl.exp(b_g)[None, :]).to(b_q.dtype) + + b_dh *= tl.exp(b_g_last) + + if USE_GK: + if HEAD_FIRST: + p_gk = tl.make_block_ptr(gk + i_bg * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_gk_last = gk + (i_bg * T + last_idx) * K + i_k * BK + tl.arange(0, BK) + + else: + p_gk = tl.make_block_ptr(gk + (bos*H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_gk_last = gk + (bos + last_idx) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_gk_last = tl.max_contiguous(tl.multiple_of(p_gk_last, BK), BK) + + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + b_q = (b_q * tl.exp(b_gk)).to(b_q.dtype) + b_gk_last = tl.load(p_gk_last, mask=(i_k * BK + tl.arange(0, BK) < K), other=0.) + b_dh *= tl.exp(b_gk_last)[:, None] + + if USE_GV: + if HEAD_FIRST: + p_gv = tl.make_block_ptr(gv + i_bg * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_gv_last = gv + (i_bg * T + last_idx) * V + i_v * BV + tl.arange(0, BV) + else: + p_gv = tl.make_block_ptr(gv + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_gv_last = gv + (bos + last_idx) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_gv_last = tl.max_contiguous(tl.multiple_of(p_gv_last, BV), BV) + + b_gv = tl.load(p_gv, boundary_check=(0, 1)) + b_do = (b_do * tl.exp(b_gv)).to(b_do.dtype) + + b_gv_last = tl.load(p_gv_last, mask=(i_v * BV + tl.arange(0, BV) < V), other=0.) + b_dh *= tl.exp(b_gv_last)[None, :] + + b_dh += tl.dot(b_q, b_do) + + if STORE_INITIAL_STATE_GRADIENT: + p_dh0 = tl.make_block_ptr(dh0 + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), boundary_check=(0, 1)) + + +def chunk_fwd_h( + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + gk: torch.Tensor, + gv: torch.Tensor, + h0: torch.Tensor, + output_final_state: bool, + states_in_fp32: bool = False, + offsets: Optional[torch.Tensor] = None, + c_offsets: Optional[torch.Tensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + BT = chunk_size + # N: the actual number of sequences in the batch with either equal or variable lengths + if offsets is None: + N, NT, c_offsets = B, triton.cdiv(T, BT), None + else: + N = len(offsets) - 1 + if c_offsets is None: + c_offsets = torch.cat([offsets.new_tensor([0]), triton.cdiv(offsets[1:] - offsets[:-1], BT)]).cumsum(-1) + NT = c_offsets[-1] + + if head_first: + h = k.new_empty(B, H, NT, K, V, dtype=k.dtype if not states_in_fp32 else torch.float32) + else: + h = k.new_empty(B, NT, H, K, V, dtype=k.dtype if not states_in_fp32 else torch.float32) + ht = k.new_empty(N, H, K, V, dtype=torch.float32) if output_final_state else None + + def grid(meta): return (triton.cdiv(K, meta['BK']), triton.cdiv(V, meta['BV']), N * H) + chunk_fwd_kernel_h[grid]( + k=k, + v=v, + h=h, + g=g, + gk=gk, + gv=gv, + h0=h0, + ht=ht, + offsets=offsets, + c_offsets=c_offsets, + T=T, + H=H, + K=K, + V=V, + BT=BT, + USE_G=g is not None, + USE_GK=gk is not None, + USE_GV=gv is not None, + HEAD_FIRST=head_first + ) + return h, ht + + +def chunk_bwd_dh( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + gk: torch.Tensor, + gv: torch.Tensor, + do: torch.Tensor, + h0: torch.Tensor, + dht: torch.Tensor, + scale: float, + states_in_fp32: bool = False, + offsets: Optional[torch.Tensor] = None, + c_offsets: Optional[torch.Tensor] = None, + 
head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + HQ = q.shape[1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + HQ = q.shape[2] + BT = chunk_size + # N: the actual number of sequences in the batch with either equal or variable lengths + if offsets is None: + N, NT, c_offsets = B, triton.cdiv(T, BT), None + else: + N = len(offsets) - 1 + if c_offsets is None: + c_offsets = torch.cat([offsets.new_tensor([0]), triton.cdiv(offsets[1:] - offsets[:-1], BT)]).cumsum(-1) + NT = c_offsets[-1] + # number of groups in GQA + NG = HQ // H + + if head_first: + dh = k.new_empty(B, HQ, NT, K, V, dtype=k.dtype if not states_in_fp32 else torch.float32) + else: + dh = k.new_empty(B, NT, HQ, K, V, dtype=k.dtype if not states_in_fp32 else torch.float32) + dh0 = torch.empty_like(h0, dtype=torch.float32) if h0 is not None else None + + def grid(meta): return (triton.cdiv(K, meta['BK']), triton.cdiv(V, meta['BV']), N * H) + chunk_bwd_kernel_dh[grid]( + q=q, + g=g, + gk=gk, + gv=gv, + do=do, + dh=dh, + dht=dht, + dh0=dh0, + offsets=offsets, + c_offsets=c_offsets, + scale=scale, + T=T, + HQ=HQ, + H=H, + K=K, + V=V, + BT=BT, + NG=NG, + USE_G=g is not None, + USE_GK=gk is not None, + USE_GV=gv is not None, + HEAD_FIRST=head_first, + ) + return dh, dh0 diff --git a/fla/ops/common/fused_recurrent.py b/fla/ops/common/fused_recurrent.py new file mode 100644 index 0000000000000000000000000000000000000000..e418f8a10c5e6b6702c248b6b0b2a15d584b2c21 --- /dev/null +++ b/fla/ops/common/fused_recurrent.py @@ -0,0 +1,577 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional + +import torch +import triton +import triton.language as tl + +from fla.ops.utils import chunk_global_cumsum +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + ], + key=["BK", "BV", "USE_GK", "USE_GV", "USE_G"], +) +@triton.heuristics({ + 'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'STORE_FINAL_STATE': lambda args: args['ht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def fused_recurrent_fwd_kernel( + q, # query [B, H, T, K]/[B, T, H, K] + k, # key [B, H, T, K]/[B, T, H, K] + v, # value [B, H, T, V]/[B, T, H, V] + g, # log gate [B, H, T]/[B, T, H] or None + gk, # log gate [B, H, T, K]/[B, T, H, K] or None + gv, # log gate [B, H, T, V]/[B, T, H, V] or None + o, # output [NK, B, H, T, V]/[NK, B, T, H, V] + h0, # initial hidden state [B, H, K, V] + ht, # final hidden state [B, H, K, V] + offsets, + scale, + B: tl.constexpr, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + REVERSE: tl.constexpr, # whether to reverse the recurrence + USE_G: tl.constexpr, # whether to use g + USE_GK: tl.constexpr, # whether to use gk + USE_GV: tl.constexpr, # whether to use gv + USE_INITIAL_STATE: tl.constexpr, # whether to use initial state + STORE_FINAL_STATE: tl.constexpr, # whether to store final state + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + # indices + i_v, i_k, i_nh = tl.program_id(0).to(tl.int64), tl.program_id(1).to(tl.int64), tl.program_id(2).to(tl.int64) + i_n, i_h = i_nh // H, i_nh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int64), tl.load(offsets + i_n + 1).to(tl.int64) + all = T + T = 
eos - bos + else: + bos, eos = i_n * T, i_n * T + T + all = B * T + + if HEAD_FIRST: + p_q = q + i_nh * T*K + ((T-1) * K if REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_k = k + i_nh * T*K + ((T-1) * K if REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_v = v + i_nh * T*V + ((T-1) * V if REVERSE else 0) + i_v * BV + tl.arange(0, BV) + p_o = o + (i_k * B*H + i_nh) * T*V + ((T-1) * V if REVERSE else 0) + i_v * BV + tl.arange(0, BV) + if USE_G: + p_g = g + i_nh * T + ((T-1) if REVERSE else 0) + if USE_GK: + p_gk = gk + i_nh * T*K + ((T-1) * K if REVERSE else 0) + i_k * BK + tl.arange(0, BK) + if USE_GV: + p_gv = gv + i_nh * T*V + ((T-1) * V if REVERSE else 0) + i_v * BV + tl.arange(0, BV) + else: + p_q = q + (bos + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_k = k + (bos + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_v = v + (bos + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_o = o + ((i_k * all + bos) + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + if USE_G: + p_g = g + (bos + ((T-1) if REVERSE else 0)) * H + i_h + if USE_GK: + p_gk = gk + (bos + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + if USE_GV: + p_gv = gv + (bos + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + + mask_k = (i_k * BK + tl.arange(0, BK)) < K + mask_v = (i_v * BV + tl.arange(0, BV)) < V + mask_h = mask_k[None, :] & mask_v[:, None] + b_h = tl.zeros([BV, BK], dtype=tl.float32) + + if USE_INITIAL_STATE: + p_h0 = h0 + i_nh * K*V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None]) + b_h += tl.load(p_h0, mask=mask_h, other=0).to(tl.float32) + + for _ in range(0, T): + b_q = tl.load(p_q, mask=mask_k, other=0).to(tl.float32) * scale + b_k = tl.load(p_k, mask=mask_k, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_v, other=0).to(tl.float32) + if USE_GK: + b_gk = tl.load(p_gk, mask=mask_k, other=0).to(tl.float32) + b_h = b_h * tl.exp(b_gk[None, :]) + if USE_GV: + b_gv = tl.load(p_gv, mask=mask_v, other=0).to(tl.float32) + b_h = b_h * tl.exp(b_gv[:, None]) + if USE_G: + b_g = tl.load(p_g).to(tl.float32) + b_h = b_h * tl.exp(b_g) + b_h += b_k[None, :] * b_v[:, None] + b_o = b_h * b_q[None, :] + b_o = tl.sum(b_o, axis=1) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), mask=mask_v) + p_q += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + p_k += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + p_v += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + p_o += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + if USE_GK: + p_gk += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + if USE_GV: + p_gv += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + if USE_G: + p_g += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) + + if STORE_FINAL_STATE: + p_ht = ht + i_nh * K*V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None]) + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), mask=mask_h) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + ], + key=["BK", "BV", "USE_GK", "USE_GV", "USE_G"], +) +@triton.heuristics({ + 'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'STORE_INITIAL_STATE_GRADIENT': lambda args: args['dh0'] is not None, + 'USE_FINAL_STATE_GRADIENT': lambda args: args['dht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) 
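+# The kernel below makes two sweeps over the sequence: a first pass replays
+# the forward recurrence to rebuild the running state and emit dq, then,
+# after a barrier, a second pass in the opposite direction accumulates the
+# state gradient to emit dk/dv (and dh0 if an initial state was given).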
+# Similar to Algorithm1 of https://arxiv.org/abs/2006.16236 +@triton.jit +def fused_recurrent_bwd_kernel( + q, # query [B, H, T, K]/[B, T, H, K] + k, # key [B, H, T, V]/[B, T, H, V] + v, # value [B, H, T, V]/[B, T, H, V] + g, # log gate [B, H, T]/[B, T, H] or None + gk, # log gate [B, H, T, K]/[B, T, H, K] or None + gv, # log gate [B, H, T, V]/[B, T, H, V] or None + h0, # initial hidden state [B, H, K, V] + do, # gradient wrt output [B, H, T, V]/[B, T, H, V] + dq, # gradient wrt query [NV, B, H, T, K]/[NK, B, T, H, K] + dk, # gradient wrt key [NV, B, H, T, K]/[NK, B, T, H, K] + dv, # gradient wrt value [NK, B, H, T, V]/[NV, B, T, H, V] + dht, # gradient wrt final hidden state [B, H, K, V] + dh0, # gradient wrt initial hidden state [B, H, K, V] + offsets, + scale, + B: tl.constexpr, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + REVERSE: tl.constexpr, # whether to do autoregressive modeling in the reverse direction + USE_G: tl.constexpr, # whether to use g + USE_GK: tl.constexpr, # whether to use gk + USE_GV: tl.constexpr, # whether to use gv + USE_INITIAL_STATE: tl.constexpr, # whether to use initial state + STORE_INITIAL_STATE_GRADIENT: tl.constexpr, # whether to store gradient wrt initial state + USE_FINAL_STATE_GRADIENT: tl.constexpr, # whether to compute gradient wrt final state + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_k, i_nh = tl.program_id(0).to(tl.int64), tl.program_id(1).to(tl.int64), tl.program_id(2).to(tl.int64) + i_n, i_h = i_nh // H, i_nh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int64), tl.load(offsets + i_n + 1).to(tl.int64) + all = T + T = eos - bos + else: + bos, eos = i_n * T, i_n * T + T + all = B * T + + if HEAD_FIRST: + p_k = k + i_nh * T*K + ((T-1) * K if REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_v = v + i_nh * T*V + ((T-1) * V if REVERSE else 0) + i_v * BV + tl.arange(0, BV) + p_do = do + i_nh * T*V + ((T-1) * V if REVERSE else 0) + i_v * BV + tl.arange(0, BV) + p_dq = dq + (i_v * B*H + i_nh) * T*K + ((T-1) * K if REVERSE else 0) + i_k * BK + tl.arange(0, BK) + if USE_G: + p_g = g + i_nh * T + ((T-1) if REVERSE else 0) + if USE_GK: + p_gk = gk + i_nh * T*K + ((T-1) * K if REVERSE else 0) + i_k * BK + tl.arange(0, BK) + if USE_GV: + p_gv = gv + i_nh * T*V + ((T-1) * V if REVERSE else 0) + i_v * BV + tl.arange(0, BV) + else: + p_k = k + (bos + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_v = v + (bos + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_do = do + (bos + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_dq = dq + ((i_v * all + bos) + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + if USE_G: + p_g = g + (bos + ((T-1) if REVERSE else 0)) * H + i_h + if USE_GK: + p_gk = gk + (bos + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + if USE_GV: + p_gv = gv + (bos + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + + mask_k = i_k * BK + tl.arange(0, BK) < K + mask_v = i_v * BV + tl.arange(0, BV) < V + mask_h = mask_k[:, None] & mask_v[None, :] + + b_h = tl.zeros([BK, BV], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h0 = h0 + i_nh * K*V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + b_h += tl.load(p_h0, mask=mask_h, other=0).to(tl.float32) + + for _ in range(0, T): + b_k = tl.load(p_k, mask=mask_k, other=0).to(tl.float32) + b_v = 
tl.load(p_v, mask=mask_v, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask_v, other=0).to(tl.float32) + if USE_G: + b_g = tl.load(p_g).to(tl.float32) + b_h = b_h * tl.exp(b_g) + if USE_GK: + b_gk = tl.load(p_gk, mask=mask_k, other=0).to(tl.float32) + b_h = b_h * tl.exp(b_gk[:, None]) + if USE_GV: + b_gv = tl.load(p_gv, mask=mask_v, other=0).to(tl.float32) + b_h = b_h * tl.exp(b_gv[None, :]) + b_h += b_k[:, None] * b_v[None, :] + b_dq = b_h * b_do[None, :] + b_dq = tl.sum(b_dq, axis=1) * scale + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), mask=mask_k) + + p_k += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + p_v += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + p_do += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + p_dq += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + if USE_G: + p_g += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) + if USE_GK: + p_gk += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + if USE_GV: + p_gv += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + + # sync threads + tl.debug_barrier() + + if HEAD_FIRST: + p_q = q + i_nh * T*K + ((T - 1) * K if not REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_k = k + i_nh * T*K + ((T - 1) * K if not REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_v = v + i_nh * T*V + ((T - 1) * V if not REVERSE else 0) + i_v * BV + tl.arange(0, BV) + p_do = do + i_nh * T*V + ((T - 1) * V if not REVERSE else 0) + i_v * BV + tl.arange(0, BV) + p_dk = dk + (i_v * B*H + i_nh) * T*K + ((T - 1) * K if not REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_dv = dv + (i_k * B*H + i_nh) * T*V + ((T - 1) * V if not REVERSE else 0) + i_v * BV + tl.arange(0, BV) + if USE_G: + p_g = g + i_nh * T + ((T - 1) if not REVERSE else 0) + if USE_GK: + p_gk = gk + i_nh * T*K + ((T - 1) * K if not REVERSE else 0) + i_k * BK + tl.arange(0, BK) + if USE_GV: + p_gv = gv + i_nh * T*V + ((T - 1) * V if not REVERSE else 0) + i_v * BV + tl.arange(0, BV) + else: + p_q = q + (bos + ((T - 1) if not REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_k = k + (bos + ((T - 1) if not REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_v = v + (bos + ((T - 1) if not REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_do = do + (bos + ((T - 1) if not REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_dk = dk + ((i_v * all + bos) + ((T - 1) if not REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_dv = dv + ((i_k * all + bos) + ((T - 1) if not REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + if USE_G: + p_g = g + (bos + ((T - 1) if not REVERSE else 0)) * H + i_h + if USE_GK: + p_gk = gk + (bos + ((T - 1) if not REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + if USE_GV: + p_gv = gv + (bos + ((T - 1) if not REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + if USE_FINAL_STATE_GRADIENT: + p_dht = dht + i_nh * K*V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + b_dh += tl.load(p_dht, mask=mask_h, other=0).to(tl.float32) + + for _ in range(T): + b_q = tl.load(p_q, mask=mask_k, other=0).to(tl.float32) * scale + b_k = tl.load(p_k, mask=mask_k, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_v, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask_v, other=0).to(tl.float32) + b_dh += b_q[:, None] * b_do[None, :] + b_dk = tl.sum(b_dh * b_v[None, :], axis=1) + b_dv = tl.sum(b_dh * b_k[:, None], axis=0) + if 
USE_G: + b_g = tl.load(p_g).to(tl.float32) + b_dh *= tl.exp(b_g) + if USE_GK: + b_gk = tl.load(p_gk, mask=mask_k, other=0).to(tl.float32) + b_dh *= tl.exp(b_gk)[:, None] + if USE_GV: + b_gv = tl.load(p_gv, mask=mask_v, other=0).to(tl.float32) + b_dh *= tl.exp(b_gv)[None, :] + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), mask=mask_k) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), mask=mask_v) + + p_q += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * K + p_k += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * K + p_v += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * V + p_do += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * V + p_dk += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * K + p_dv += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * V + if USE_G: + p_g += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) + if USE_GK: + p_gk += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * K + if USE_GV: + p_gv += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * V + + if STORE_INITIAL_STATE_GRADIENT: + p_dh0 = dh0 + i_nh * K*V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), mask=mask_h) + + +def fused_recurrent_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: Optional[torch.Tensor] = None, + gk: Optional[torch.Tensor] = None, + gv: Optional[torch.Tensor] = None, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +): + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + N = B if offsets is None else len(offsets) - 1 + BK, BV = min(K, 64), min(V, 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + + h0 = initial_state + if output_final_state: + ht = q.new_empty(N, H, K, V, dtype=torch.float32) + else: + ht = None + o = q.new_empty(NK, *v.shape, dtype=torch.float32) + + grid = (NV, NK, N * H) + fused_recurrent_fwd_kernel[grid]( + q, + k, + v, + g, + gk, + gv, + o, + h0, + ht, + offsets, + scale, + B=B, + T=T, + H=H, + K=K, + V=V, + BK=BK, + BV=BV, + USE_G=g is not None, + USE_GK=gk is not None, + USE_GV=gv is not None, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + o = o.sum(0) + return o, ht + + +def fused_recurrent_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: Optional[torch.Tensor] = None, + gk: Optional[torch.Tensor] = None, + gv: Optional[torch.Tensor] = None, + o: Optional[torch.Tensor] = None, + do: Optional[torch.Tensor] = None, + dht: Optional[torch.Tensor] = None, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +): + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + N = B if offsets is None else len(offsets) - 1 + + BK, BV = min(K, 64), min(V, 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + + dq = q.new_empty(NV, *q.shape, dtype=torch.float32) + dk = q.new_empty(NV, *k.shape, dtype=torch.float32) + dv = q.new_empty(NK, *v.shape, dtype=torch.float32) + h0 = initial_state + dh0 = torch.empty_like(initial_state) if initial_state is not None else None + + grid = (NV, NK, N * H) + fused_recurrent_bwd_kernel[grid]( + q, + k, + v, + g, + gk, + gv, + h0, + do, + dq, + dk, + dv, + dht, + dh0, + offsets, + scale, + B=B, + T=T, + 
H=H, + K=K, + V=V, + BK=BK, + BV=BV, + USE_G=g is not None, + USE_GK=gk is not None, + USE_GV=gv is not None, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + dq = dq.sum(0) + dk = dk.sum(0) + dv = dv.sum(0) + dg, dgk, dgv = None, None, None + if g is not None: + dg = chunk_global_cumsum( + (dq * q.float() - dk * k.float()).sum(-1), + reverse=not reverse, + offsets=offsets, + head_first=head_first + ) + if gk is not None: + dgk = chunk_global_cumsum( + dq * q.float() - dk * k.float(), + reverse=not reverse, + offsets=offsets, + head_first=head_first + ) + if gv is not None: + dgv = chunk_global_cumsum( + do.float() * o.float() - dv * v.float(), + reverse=not reverse, + offsets=offsets, + head_first=head_first + ) + + return dq, dk, dv, dg, dgk, dgv, dh0 + + +class FusedRecurrentFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward( + ctx, + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: Optional[torch.Tensor] = None, + gk: Optional[torch.Tensor] = None, + gv: Optional[torch.Tensor] = None, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True + ): + o, ht = fused_recurrent_fwd( + q=q, + k=k, + v=v, + g=g, + gk=gk, + gv=gv, + scale=scale, + initial_state=initial_state, + output_final_state=output_final_state, + reverse=reverse, + offsets=offsets, + head_first=head_first + ) + ctx.save_for_backward(q, k, v, g, gk, gv, initial_state, o) + ctx.scale = scale + ctx.reverse = reverse + ctx.offsets = offsets + ctx.head_first = head_first + return o.to(q.dtype), ht + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dht): + q, k, v, g, gk, gv, initial_state, o = ctx.saved_tensors + + # not supported yet. 
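+        # gate gradients are recovered from a global cumsum of dq*q - dk*k in
+        # fused_recurrent_bwd; propagating an external final-state gradient
+        # through trainable gates is not handled, hence the checks below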
+ if dht is not None: + if g is not None: + assert g.requires_grad is False, "Cannot load final state gradient and use gates at the same time" + if gk is not None: + assert gk.requires_grad is False, "Cannot load final state gradient and use gates at the same time" + if gv is not None: + assert gv.requires_grad is False, "Cannot load final state gradient and use gates at the same time" + dq, dk, dv, dg, dgk, dgv, dh0 = fused_recurrent_bwd( + q=q, + k=k, + v=v, + g=g, + gk=gk, + gv=gv, + o=o, + do=do, + dht=dht, + scale=ctx.scale, + initial_state=initial_state, + reverse=ctx.reverse, + offsets=ctx.offsets, + head_first=ctx.head_first + ) + return dq.to(q.dtype), dk.to(k.dtype), dv.to(v.dtype), dg, dgk, dgv, None, dh0, None, None, None, None + + +def fused_recurrent( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: Optional[torch.Tensor] = None, + gk: Optional[torch.Tensor] = None, + gv: Optional[torch.Tensor] = None, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +): + if scale is None: + scale = k.shape[-1] ** -0.5 + return FusedRecurrentFunction.apply( + q, + k, + v, + g, + gk, + gv, + scale, + initial_state, + output_final_state, + reverse, + offsets, + head_first + ) diff --git a/fla/ops/delta_rule/README.md b/fla/ops/delta_rule/README.md new file mode 100644 index 0000000000000000000000000000000000000000..1ab2d485a9552d70238c1f68288c72c62f9e0ef2 --- /dev/null +++ b/fla/ops/delta_rule/README.md @@ -0,0 +1,4 @@ +- Delta Rule + +The implementation of delta rule described in https://arxiv.org/abs/2102.11174 + diff --git a/fla/ops/delta_rule/__init__.py b/fla/ops/delta_rule/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..e0acb6a7d0e4eec9a8dc697615604783b8858d13 --- /dev/null +++ b/fla/ops/delta_rule/__init__.py @@ -0,0 +1,11 @@ +# -*- coding: utf-8 -*- + +from .chunk import chunk_delta_rule +from .fused_chunk import fused_chunk_delta_rule +from .fused_recurrent import fused_recurrent_delta_rule + +__all__ = [ + 'fused_chunk_delta_rule', + 'fused_recurrent_delta_rule', + 'chunk_delta_rule' +] diff --git a/fla/ops/delta_rule/chunk.py b/fla/ops/delta_rule/chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..458bb55ec1cfd8a29ae93fbed124b9806f396e09 --- /dev/null +++ b/fla/ops/delta_rule/chunk.py @@ -0,0 +1,1116 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.ops.delta_rule.wy_fast import (bwd_prepare_wy_repr, + fwd_prepare_wy_repr, fwd_recompute_w_u) +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + ], + key=['BT', 'BK', 'BV'], +) +@triton.heuristics({ + 'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'STORE_FINAL_STATE': lambda args: args['ht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def chunk_delta_rule_fwd_kernel_h( + k, + v, + d, + v_new, + h, + h0, + ht, + offsets, + c_offsets, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + 
NT: tl.constexpr,
+    USE_INITIAL_STATE: tl.constexpr,
+    STORE_FINAL_STATE: tl.constexpr,
+    USE_OFFSETS: tl.constexpr,
+    HEAD_FIRST: tl.constexpr
+):
+    i_k, i_v, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+    i_n, i_h = i_nh // H, i_nh % H
+    if USE_OFFSETS:
+        bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32)
+        T = eos - bos
+        NT = tl.cdiv(T, BT)
+        boh = tl.load(c_offsets + i_n).to(tl.int32)
+    else:
+        bos, eos = i_n * T, i_n * T + T
+        NT = tl.cdiv(T, BT)
+        boh = i_n * NT
+
+    # [BK, BV]
+    b_h = tl.zeros([BK, BV], dtype=tl.float32)
+    if USE_INITIAL_STATE:
+        p_h0 = tl.make_block_ptr(h0 + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0))
+        b_h = tl.load(p_h0, boundary_check=(0, 1)).to(tl.float32)
+
+    for i_t in range(NT):
+        if HEAD_FIRST:
+            p_h = tl.make_block_ptr(h + (i_nh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0))
+        else:
+            p_h = tl.make_block_ptr(h + ((boh + i_t) * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0))
+        tl.store(p_h, b_h.to(p_h.dtype.element_ty), boundary_check=(0, 1))
+
+        b_hc = tl.zeros([BK, BV], dtype=tl.float32)
+        # since the entire key dimension must be kept in SRAM, we face a severe SRAM memory burden; subchunking alleviates it
+        for i_c in range(tl.cdiv(min(BT, T - i_t * BT), BC)):
+            if HEAD_FIRST:
+                p_k = tl.make_block_ptr(k + i_nh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT + i_c * BC), (BK, BC), (0, 1))
+                p_d = tl.make_block_ptr(d + i_nh * T*K, (T, K), (K, 1), (i_t * BT + i_c * BC, i_k * BK), (BC, BK), (1, 0))
+                p_v = tl.make_block_ptr(v + i_nh * T*V, (T, V), (V, 1), (i_t * BT + i_c * BC, i_v * BV), (BC, BV), (1, 0))
+                p_v_new = tl.make_block_ptr(v_new+i_nh*T*V, (T, V), (V, 1), (i_t * BT + i_c * BC, i_v * BV), (BC, BV), (1, 0))
+            else:
+                p_k = tl.make_block_ptr(k+(bos*H+i_h)*K, (K, T), (1, H*K), (i_k * BK, i_t * BT + i_c * BC), (BK, BC), (0, 1))
+                p_d = tl.make_block_ptr(d+(bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT + i_c * BC, i_k * BK), (BC, BK), (1, 0))
+                p_v = tl.make_block_ptr(v+(bos*H+i_h)*V, (T, V), (H*V, 1), (i_t * BT + i_c * BC, i_v * BV), (BC, BV), (1, 0))
+                p_v_new = tl.make_block_ptr(v_new+(bos*H+i_h)*V, (T, V), (H*V, 1), (i_t*BT+i_c*BC, i_v * BV), (BC, BV), (1, 0))
+            # [BK, BC]
+            b_k = tl.load(p_k, boundary_check=(0, 1))
+            # [BC, BK]
+            b_d = tl.load(p_d, boundary_check=(0, 1))
+            # [BC, BV]
+            b_v = tl.load(p_v, boundary_check=(0, 1))
+            b_v -= tl.dot(b_d, b_h.to(b_k.dtype))
+            # [BK, BV]
+            tl.store(p_v_new, b_v.to(p_v_new.dtype.element_ty), boundary_check=(0, 1))
+            b_hc += tl.dot(b_k, b_v.to(b_k.dtype), allow_tf32=False)
+        b_h += b_hc
+
+    if STORE_FINAL_STATE:
+        p_ht = tl.make_block_ptr(ht + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0))
+        tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), boundary_check=(0, 1))
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({}, num_warps=1),
+        triton.Config({}, num_warps=2),
+        triton.Config({}, num_warps=4),
+    ],
+    key=['BT', 'BK', 'BV'],
+)
+@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None})
+@triton.jit
+def chunk_delta_rule_fwd_kernel_o(
+    q,
+    k,
+    v,
+    h,
+    o,
+    offsets,
+    indices,
+    scale,
+    T: tl.constexpr,
+    H: tl.constexpr,
+    K: tl.constexpr,
+    V: tl.constexpr,
+    BT: tl.constexpr,
+    BK: tl.constexpr,
+    BV: tl.constexpr,
+    USE_OFFSETS: tl.constexpr,
+    HEAD_FIRST: tl.constexpr
+):
+    i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+    i_b, i_h = i_bh // H, i_bh % H
+    if USE_OFFSETS:
+        i_tg = i_t
+        i_n, i_t = tl.load(indices + i_t *
2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + else: + NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + + o_i = tl.arange(0, BT) + m_s = o_i[:, None] >= o_i[None, :] + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + b_s = tl.zeros([BT, BT], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + (i_bh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + (i_tg * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BK, BV] + b_h = tl.load(p_h, boundary_check=(0, 1)) + b_o += tl.dot(b_q, b_h, allow_tf32=False) + b_s += tl.dot(b_q, b_k, allow_tf32=False) + + b_s = tl.where(m_s, b_s, 0) + if HEAD_FIRST: + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + else: + p_v = tl.make_block_ptr(v + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_o = (b_o + tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False)) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + ], + key=['BT', 'BK', 'BV'], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_delta_rule_fwd_kernel_prepare_dv( + q, + k, + do, + dv, + offsets, + indices, + scale, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + b_A = tl.zeros([BT, BT], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_q = 
tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_k.dtype) + b_A += tl.dot(b_k, b_q, allow_tf32=False) + + b_A = tl.where(tl.arange(0, BT)[:, None] <= tl.arange(0, BT)[None, :], b_A, 0).to(do.dtype.element_ty) + for i_v in range(tl.cdiv(V, BV)): + if HEAD_FIRST: + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + else: + p_do = tl.make_block_ptr(do + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_dv = tl.dot(b_A, b_do, allow_tf32=False) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4) + ], + key=['BT', 'BK', 'BV'], +) +@triton.heuristics({ + 'USE_FINAL_STATE_GRADIENT': lambda args: args['dht'] is not None, + 'USE_INITIAL_STATE': lambda args: args['dh0'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def chunk_delta_rule_bwd_kernel_dhu( + q, + k, + d, + dht, + dh0, + do, + dh, + dv, + dv2, + offsets, + c_offsets, + scale, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_FINAL_STATE_GRADIENT: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_v, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_n, i_h = i_nh // H, i_nh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + boh = tl.load(c_offsets + i_n).to(tl.int32) + else: + bos, eos = i_n * T, i_n * T + T + NT = tl.cdiv(T, BT) + boh = i_n * NT + + # [BK, BV] + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + if USE_FINAL_STATE_GRADIENT: + p_dht = tl.make_block_ptr(dht + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_dh += tl.load(p_dht, boundary_check=(0, 1)) + + for i_t in range(NT - 1, -1, -1): + if HEAD_FIRST: + p_dh = tl.make_block_ptr(dh + (i_nh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_dh = tl.make_block_ptr(dh + ((boh+i_t) * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_dh, b_dh.to(p_dh.dtype.element_ty), boundary_check=(0, 1)) + b_dh_tmp = tl.zeros([BK, BV], dtype=tl.float32) + for i_c in range(tl.cdiv(BT, BC) - 1, -1, -1): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_nh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT + i_c * BC), (BK, BC), (0, 1)) + p_k = tl.make_block_ptr(k + i_nh * T*K, (T, K), (K, 1), (i_t * BT + i_c * BC, i_k * BK), (BC, BK), (1, 0)) + p_d = tl.make_block_ptr(d + i_nh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT + i_c * BC), (BK, BC), (0, 1)) + p_dv = tl.make_block_ptr(dv + i_nh * T*V, (T, V), (V, 1), (i_t * BT + i_c * BC, i_v * BV), (BC, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_nh * T*V, (T, V), (V, 1), (i_t * BT + i_c * BC, i_v * BV), (BC, BV), (1, 0)) + p_dv2 = tl.make_block_ptr(dv2 + i_nh * T*V, (T, V), (V, 1), (i_t * BT + i_c * BC, i_v * BV), (BC, BV), (1, 0)) + else: + p_q = tl.make_block_ptr(q+(bos*H+i_h)*K, (K, T), (1, H*K), (i_k * BK, i_t * BT + i_c * BC), (BK, 
BC), (0, 1))
+            p_k = tl.make_block_ptr(k+(bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT + i_c * BC, i_k * BK), (BC, BK), (1, 0))
+            p_d = tl.make_block_ptr(d+(bos*H+i_h)*K, (K, T), (1, H*K), (i_k * BK, i_t * BT + i_c * BC), (BK, BC), (0, 1))
+            p_dv = tl.make_block_ptr(dv+(bos*H+i_h)*V, (T, V), (H*V, 1), (i_t*BT + i_c * BC, i_v * BV), (BC, BV), (1, 0))
+            p_do = tl.make_block_ptr(do+(bos*H+i_h)*V, (T, V), (H*V, 1), (i_t*BT + i_c * BC, i_v * BV), (BC, BV), (1, 0))
+            p_dv2 = tl.make_block_ptr(dv2+(bos*H+i_h)*V, (T, V), (H*V, 1), (i_t*BT + i_c * BC, i_v * BV), (BC, BV), (1, 0))
+        # [BK, BC]
+        b_q = tl.load(p_q, boundary_check=(0, 1))
+        b_q = (b_q * scale).to(b_q.dtype)
+        # [BC, BK]
+        b_k = tl.load(p_k, boundary_check=(0, 1))
+        b_d = tl.load(p_d, boundary_check=(0, 1))
+        # [BC, BV]
+        b_do = tl.load(p_do, boundary_check=(0, 1))
+
+        b_dv = tl.load(p_dv, boundary_check=(0, 1))
+        b_dv += tl.dot(b_k, b_dh.to(b_k.dtype), allow_tf32=False)
+
+        tl.store(p_dv2, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1))
+        # [BK, BV]
+        b_dh_tmp += tl.dot(b_q, b_do.to(b_q.dtype), allow_tf32=False)
+        b_dh_tmp -= tl.dot(b_d, b_dv.to(b_q.dtype), allow_tf32=False)
+    b_dh += b_dh_tmp
+
+    if USE_INITIAL_STATE:
+        p_dh0 = tl.make_block_ptr(dh0 + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0))
+        tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), boundary_check=(0, 1))
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({}, num_warps=1),
+        triton.Config({}, num_warps=2),
+        triton.Config({}, num_warps=4)
+    ],
+    key=['BT', 'BK', 'BV'],
+)
+@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None})
+@triton.jit
+def chunk_delta_rule_bwd_kernel_dqkw(
+    q,
+    k,
+    v,
+    h,
+    do,
+    dh,
+    dq,
+    dk,
+    dv,
+    dw,
+    offsets,
+    indices,
+    scale,
+    T: tl.constexpr,
+    H: tl.constexpr,
+    K: tl.constexpr,
+    V: tl.constexpr,
+    BT: tl.constexpr,
+    BK: tl.constexpr,
+    BV: tl.constexpr,
+    NT: tl.constexpr,
+    USE_OFFSETS: tl.constexpr,
+    HEAD_FIRST: tl.constexpr
+):
+    i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+    i_b, i_h = i_bh // H, i_bh % H
+    if USE_OFFSETS:
+        i_tg = i_t
+        i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32)
+        bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32)
+        T = eos - bos
+        NT = tl.cdiv(T, BT)
+    else:
+        NT = tl.cdiv(T, BT)
+        i_tg = i_b * NT + i_t
+        bos, eos = i_b * T, i_b * T + T
+
+    o_i = tl.arange(0, BT)
+
+    if HEAD_FIRST:
+        p_q = tl.make_block_ptr(q + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1))
+        p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+    else:
+        p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1))
+        p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+
+    b_dq = tl.zeros([BT, BK], dtype=tl.float32)
+    b_dk = tl.zeros([BT, BK], dtype=tl.float32)
+    b_dw = tl.zeros([BT, BK], dtype=tl.float32)
+    b_ds = tl.zeros([BT, BT], dtype=tl.float32)
+    for i_v in range(tl.cdiv(V, BV)):
+        if HEAD_FIRST:
+            p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_dv = tl.make_block_ptr(dv + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_h = tl.make_block_ptr(h + i_bh * NT*K*V + i_t * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1))
+            p_dh = tl.make_block_ptr(dh + i_bh * NT*K*V + i_t * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1))
+        else:
+            p_v = tl.make_block_ptr(v + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_do = tl.make_block_ptr(do + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_dv = tl.make_block_ptr(dv + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_h = tl.make_block_ptr(h + (i_tg * H + i_h) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1))
+            p_dh = tl.make_block_ptr(dh + (i_tg * H + i_h) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1))
+
+        # [BT, BV]
+        b_v = tl.load(p_v, boundary_check=(0, 1))
+        b_do = tl.load(p_do, boundary_check=(0, 1))
+        # [BV, BK]
+        b_h = tl.load(p_h, boundary_check=(0, 1))
+        # [BV, BK]
+        b_dh = tl.load(p_dh, boundary_check=(0, 1))
+        # [BT, BT]
+        b_ds += tl.dot(b_do, tl.trans(b_v), allow_tf32=False)
+        # [BT, BK]
+        b_dq += tl.dot(b_do, b_h, allow_tf32=False)
+        b_dk += tl.dot(b_v, b_dh, allow_tf32=False)
+
+        b_dv = tl.load(p_dv, boundary_check=(0, 1))
+        b_dw += tl.dot(b_dv.to(b_v.dtype), b_h.to(b_v.dtype), allow_tf32=False)
+
+    # [BK, BT]
+    b_q = tl.load(p_q, boundary_check=(0, 1))
+    b_q = (b_q * scale).to(b_q.dtype)
+    b_k = tl.load(p_k, boundary_check=(0, 1))
+    b_ds = tl.where(o_i[:, None] >= o_i[None, :], b_ds, 0).to(b_q.dtype)
+    b_dq += tl.dot(b_ds, b_k, allow_tf32=False)
+    b_dq *= scale
+    b_dk += tl.trans(tl.dot(b_q, b_ds, allow_tf32=False))
+
+    if HEAD_FIRST:
+        p_dq = tl.make_block_ptr(dq + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dk = tl.make_block_ptr(dk + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dw = tl.make_block_ptr(dw + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+    else:
+        p_dq = tl.make_block_ptr(dq + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dk = tl.make_block_ptr(dk + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dw = tl.make_block_ptr(dw + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+
+    tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1))
+    tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1))
+    tl.store(p_dw, -b_dw.to(p_dw.dtype.element_ty), boundary_check=(0, 1))
+
+
+def chunk_delta_rule_fwd_prepare_dv(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    do: torch.Tensor,
+    scale: float,
+    offsets: Optional[torch.LongTensor] = None,
+    indices: Optional[torch.LongTensor] = None,
+    head_first: bool = True,
+    chunk_size: int = 64
+) -> torch.Tensor:
+    if head_first:
+        B, H, T, K, V = *k.shape, do.shape[-1]
+    else:
+        B, T, H, K, V = *k.shape, do.shape[-1]
+    BT = min(chunk_size, max(triton.next_power_of_2(T), 16))
+    if offsets is None:
+        NT = triton.cdiv(T, BT)
+    else:
+        if indices is None:
+            indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()])
+            indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets)
+        NT = len(indices)
+    BK = min(triton.next_power_of_2(K), 64)
+    BV = min(triton.next_power_of_2(V), 64)
+
+    dv = torch.empty_like(do)
+    chunk_delta_rule_fwd_kernel_prepare_dv[(NT, B * H)](
+        q,
+        k,
+        do,
+        dv,
+        offsets,
+        indices,
+        scale,
+        T=T,
+        H=H,
+        K=K,
+        V=V,
+        BT=BT,
+        BK=BK,
+        BV=BV,
+        HEAD_FIRST=head_first
+    )
+    return dv
+
+
+def chunk_delta_rule_fwd_h(
+    k: torch.Tensor,
+    w: torch.Tensor,
+    u: torch.Tensor,
+    initial_state: 
Optional[torch.Tensor] = None, + output_final_state: bool = False, + offsets: Optional[torch.LongTensor] = None, + c_offsets: Optional[torch.Tensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, u.shape[-1] + else: + B, T, H, K, V = *k.shape, u.shape[-1] + BT = min(chunk_size, max(triton.next_power_of_2(T), 16)) + # N: the actual number of sequences in the batch with either equal or variable lengths + if offsets is None: + N, NT, c_offsets = B, triton.cdiv(T, BT), None + else: + N = len(offsets) - 1 + if c_offsets is None: + c_offsets = torch.cat([offsets.new_tensor([0]), triton.cdiv(offsets[1:] - offsets[:-1], BT)]).cumsum(-1) + NT = c_offsets[-1] + BK = triton.next_power_of_2(K) + assert BK <= 256, "current kernel does not support head dimension larger than 256." + # H100 can have larger block size + if torch.cuda.get_device_capability()[0] >= 9: + BV = 64 + BC = 64 + # A100 + elif torch.cuda.get_device_capability() == (8, 0): + BV = 32 + BC = 64 + else: + BV = 32 + BC = 64 if K <= 128 else 32 + BC = min(BT, BC) + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + assert NK == 1, 'NK > 1 is not supported because it involves time-consuming synchronization' + + if head_first: + h = k.new_empty(B, H, NT, K, V) + else: + h = k.new_empty(B, NT, H, K, V) + final_state = k.new_empty(N, H, K, V, dtype=torch.float32) if output_final_state else None + + v_new = torch.empty_like(u) + grid = (NK, NV, N * H) + chunk_delta_rule_fwd_kernel_h[grid]( + k=k, + v=u, + d=w, + v_new=v_new, + h=h, + h0=initial_state, + ht=final_state, + offsets=offsets, + c_offsets=c_offsets, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BC=BC, + BK=BK, + BV=BV, + NT=NT, + HEAD_FIRST=head_first + ) + return h, v_new, final_state + + +def chunk_delta_rule_bwd_dhu( + q: torch.Tensor, + k: torch.Tensor, + w: torch.Tensor, + h0: torch.Tensor, + dht: Optional[torch.Tensor], + do: torch.Tensor, + dv: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + c_offsets: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *q.shape, do.shape[-1] + else: + B, T, H, K, V = *q.shape, do.shape[-1] + BT = min(chunk_size, max(triton.next_power_of_2(T), 16)) + # N: the actual number of sequences in the batch with either equal or variable lengths + if offsets is None: + N, NT, c_offsets = B, triton.cdiv(T, BT), None + else: + N = len(offsets) - 1 + if c_offsets is None: + c_offsets = torch.cat([offsets.new_tensor([0]), triton.cdiv(offsets[1:] - offsets[:-1], BT)]).cumsum(-1) + NT = c_offsets[-1] + + BK = triton.next_power_of_2(K) + assert BK <= 256, "current kernel does not support head dimension being larger than 256." 
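+    # NOTE: the block sizes below are per-architecture heuristics (presumably
+    # trading off shared-memory and register usage), not hard requirements:
+    # Hopper (sm90+) gets a larger value block BV, while Ampere (sm80) and
+    # older GPUs use a smaller BV and shrink the sub-chunk size BC when the
+    # head dimension K is large.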
+ # H100 + if torch.cuda.get_device_capability()[0] >= 9: + BV = 64 + BC = 64 + # A100 + elif torch.cuda.get_device_capability() == (8, 0): + BV = 32 + BC = 64 if K <= 128 else 32 + else: + BV = 32 + BC = 64 if K <= 128 else 32 + BC = min(BT, BC) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + assert NK == 1, 'NK > 1 is not supported because it involves time-consuming synchronization' + + if head_first: + dh = q.new_empty(B, H, NT, K, V) + else: + dh = q.new_empty(B, NT, H, K, V) + dh0 = torch.empty_like(h0, dtype=torch.float32) if h0 is not None else None + dv2 = torch.empty_like(dv) + + grid = (NK, NV, N * H) + chunk_delta_rule_bwd_kernel_dhu[grid]( + q=q, + k=k, + d=w, + dht=dht, + dh0=dh0, + do=do, + dh=dh, + dv=dv, + dv2=dv2, + offsets=offsets, + c_offsets=c_offsets, + scale=scale, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BC=BC, + BK=BK, + BV=BV, + HEAD_FIRST=head_first + ) + return dh, dh0, dv2 + + +def chunk_delta_rule_fwd_o( + q: torch.Tensor, + k: torch.Tensor, + v_new: torch.Tensor, + h: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> torch.Tensor: + if head_first: + B, H, T, K, V = *q.shape, v_new.shape[-1] + else: + B, T, H, K, V = *q.shape, v_new.shape[-1] + BT = min(chunk_size, max(triton.next_power_of_2(T), 16)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(triton.next_power_of_2(K), 64) + BV = min(triton.next_power_of_2(V), 64) + NV = triton.cdiv(V, BV) + + o = torch.empty_like(v_new) + grid = (NV, NT, B * H) + chunk_delta_rule_fwd_kernel_o[grid]( + q=q, + k=k, + v=v_new, + h=h, + o=o, + offsets=offsets, + indices=indices, + scale=scale, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + HEAD_FIRST=head_first + ) + return o + + +def chunk_delta_rule_bwd_dqkw( + q: torch.Tensor, + k: torch.Tensor, + v_new: torch.Tensor, + w: torch.Tensor, + h: torch.Tensor, + du: torch.Tensor, + do: torch.Tensor, + dh: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *q.shape, v_new.shape[-1] + else: + B, T, H, K, V = *q.shape, v_new.shape[-1] + BT = min(chunk_size, max(triton.next_power_of_2(T), 16)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(triton.next_power_of_2(K), 64) + BV = min(triton.next_power_of_2(V), 64) + NK = triton.cdiv(K, BK) + + dq = torch.empty_like(q) + dk = torch.empty_like(k) + dw = torch.empty_like(w) + grid = (NK, NT, B * H) + chunk_delta_rule_bwd_kernel_dqkw[grid]( + q=q, + k=k, + v=v_new, + h=h, + do=do, + dh=dh, + dq=dq, + dk=dk, + dv=du, + dw=dw, + offsets=offsets, + indices=indices, + scale=scale, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + NT=NT, + HEAD_FIRST=head_first + ) + return dq, dk, dw + + +def chunk_delta_rule_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + beta: torch.Tensor, + scale: float, + 
initial_state: torch.Tensor, + output_final_state: bool, + checkpoint_level: int = 1, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + T = q.shape[2] if head_first else q.shape[1] + BT = min(chunk_size, max(triton.next_power_of_2(T), 16)) + # obtain WY representation. u is actually the new v. + w, u, A = fwd_prepare_wy_repr( + k=k, + v=v, + beta=beta, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + + h, v_new, final_state = chunk_delta_rule_fwd_h( + k=k, + w=w, + u=u, + initial_state=initial_state, + output_final_state=output_final_state, + offsets=offsets, + head_first=head_first, + chunk_size=BT + ) + + # obtain output + o = chunk_delta_rule_fwd_o( + q=q, + k=k, + v_new=v_new, + h=h, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + if checkpoint_level == 1: + h, v_new = None, None + return o, A, h, v_new, final_state + + +def chunk_delta_rule_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + beta: torch.Tensor, + A: torch.Tensor, + h: torch.Tensor, + v_new: torch.Tensor, + scale: float, + initial_state: torch.Tensor, + do: torch.Tensor, + dht: torch.Tensor, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + T = q.shape[2] if head_first else q.shape[1] + BT = min(chunk_size, max(triton.next_power_of_2(T), 16)) + w, u = fwd_recompute_w_u( + k=k, + v=v, + beta=beta, + A=A, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + if h is None: + h, v_new, _ = chunk_delta_rule_fwd_h( + k=k, + w=w, + u=u, + initial_state=initial_state, + output_final_state=False, + offsets=offsets, + head_first=head_first, + chunk_size=BT + ) + dv = chunk_delta_rule_fwd_prepare_dv( + q=q, + k=k, + do=do, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + dh, dh0, dv = chunk_delta_rule_bwd_dhu( + q=q, + k=k, + w=w, + h0=initial_state, + dht=dht, + do=do, + dv=dv, + scale=scale, + offsets=offsets, + head_first=head_first, + chunk_size=BT + ) + dq, dk, dw = chunk_delta_rule_bwd_dqkw( + q=q, + k=k, + v_new=v_new, + w=w, + h=h, + du=dv, + do=do, + dh=dh, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + dk2, dv, db = bwd_prepare_wy_repr( + k=k, + v=v, + beta=beta, + A=A, + dw=dw, + du=dv, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + dk.add_(dk2) + return dq, dk, dv, db, dh0 + + +class ChunkDeltaRuleFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward( + ctx, + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + beta: torch.Tensor, + scale: float, + initial_state: torch.Tensor, + output_final_state: bool, + checkpoint_level: int = 1, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True + ): + T = q.shape[2] if head_first else q.shape[1] + chunk_size = min(64, max(triton.next_power_of_2(T), 16)) + + # 2-d indices denoting the offsets of chunks in each sequence + # for example, if the passed `offsets` is [0, 100, 356] and `chunk_size` is 64, + # then there are 2 and 4 chunks in the 1st and 2nd sequences respectively, and `indices` will be + # [[0, 0], [0, 1], [1, 0], [1, 1], [1, 2], [1, 3]] + indices = None + if offsets is not None: + indices = torch.cat([torch.arange(n) for n in 
triton.cdiv(offsets[1:] - offsets[:-1], chunk_size).tolist()])
+            indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets)
+
+        o, A, h, v_new, final_state = chunk_delta_rule_fwd(
+            q=q,
+            k=k,
+            v=v,
+            beta=beta,
+            scale=scale,
+            initial_state=initial_state,
+            output_final_state=output_final_state,
+            checkpoint_level=checkpoint_level,
+            offsets=offsets,
+            indices=indices,
+            head_first=head_first,
+            chunk_size=chunk_size
+        )
+        ctx.save_for_backward(q, k, v, beta, A, h, v_new, initial_state)
+        ctx.chunk_size = chunk_size
+        ctx.scale = scale
+        ctx.offsets = offsets
+        ctx.indices = indices
+        ctx.head_first = head_first
+        return o.to(q.dtype), final_state
+
+    @staticmethod
+    @contiguous
+    @autocast_custom_bwd
+    def backward(
+        ctx,
+        do: torch.Tensor,
+        dht: torch.Tensor
+    ):
+        q, k, v, beta, A, h, v_new, initial_state = ctx.saved_tensors
+        dq, dk, dv, db, dh0 = chunk_delta_rule_bwd(
+            q=q,
+            k=k,
+            v=v,
+            beta=beta,
+            A=A,
+            h=h,
+            v_new=v_new,
+            scale=ctx.scale,
+            initial_state=initial_state,
+            do=do,
+            dht=dht,
+            offsets=ctx.offsets,
+            indices=ctx.indices,
+            head_first=ctx.head_first,
+            chunk_size=ctx.chunk_size
+        )
+        return dq.to(q.dtype), dk.to(k.dtype), dv.to(v.dtype), db.to(beta.dtype), None, dh0, None, None, None, None
+
+
+def chunk_delta_rule(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    beta: torch.Tensor,
+    scale: float = None,
+    initial_state: torch.Tensor = None,
+    output_final_state: bool = False,
+    checkpoint_level: int = 1,
+    offsets: Optional[torch.LongTensor] = None,
+    head_first: bool = True
+):
+    r"""
+    Args:
+        q (torch.Tensor):
+            queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        k (torch.Tensor):
+            keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        v (torch.Tensor):
+            values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        beta (torch.Tensor):
+            betas of shape `[B, H, T]` if `head_first=True` else `[B, T, H]`.
+        scale (Optional[float]):
+            Scale factor for the attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[torch.Tensor]):
+            Initial state of shape `[N, H, K, V]` for `N` input sequences.
+            For equal-length input sequences, `N` equals the batch size `B`.
+            Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state of shape `[N, H, K, V]`. Default: `False`.
+        checkpoint_level (Optional[int]):
+            Checkpointing level; higher levels save more memory at the cost of more recomputation in the backward pass.
+            Default: `1`:
+            - Level `0`: no memory saved, no recomputation.
+            - Level `1`: recompute the forward hidden states during backward.
+        offsets (Optional[torch.LongTensor]):
+            Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch.
+            For example,
+            if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively.
+            If provided, the inputs are concatenated and the batch size `B` is expected to be 1.
+            Default: `None`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format, which is not supported for variable-length inputs.
+            Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        final_state (torch.Tensor):
+            Final state of shape `[N, H, K, V]` if `output_final_state=True` else `None`.
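+
+    For reference, the per-step recurrence that this op evaluates chunkwise (via
+    the WY representation) is `S_t = (I - beta_t k_t k_t^T) S_{t-1} + beta_t k_t v_t^T`
+    with outputs `o_t = S_t^T (q_t * scale)`; a step-by-step reference
+    implementation is available in `fla/ops/delta_rule/naive.py`.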
+
+    Examples::
+        >>> import torch
+        >>> import torch.nn.functional as F
+        >>> from einops import rearrange
+        >>> from fla.ops.delta_rule import chunk_delta_rule
+        # inputs with equal lengths
+        >>> B, T, H, K, V = 4, 2048, 4, 512, 512
+        >>> q = torch.randn(B, T, H, K, dtype=torch.bfloat16, device='cuda')
+        >>> k = F.normalize(torch.randn(B, T, H, K, dtype=torch.bfloat16, device='cuda'), p=2, dim=-1)
+        >>> v = torch.randn(B, T, H, V, dtype=torch.bfloat16, device='cuda')
+        >>> beta = torch.rand(B, T, H, dtype=torch.bfloat16, device='cuda').sigmoid()
+        >>> h0 = torch.randn(B, H, K, V, dtype=torch.bfloat16, device='cuda')
+        >>> o, ht = chunk_delta_rule(q, k, v, beta,
+                                     initial_state=h0,
+                                     output_final_state=True,
+                                     head_first=False)
+        # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required
+        >>> q, k, v, beta = map(lambda x: rearrange(x, 'b t ... -> 1 (b t) ...'), (q, k, v, beta))
+        # for a batch with 4 sequences, offsets with 5 start/end positions are expected
+        >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long)
+        >>> o_var, ht_var = chunk_delta_rule(q, k, v, beta,
+                                             initial_state=h0,
+                                             output_final_state=True,
+                                             offsets=offsets,
+                                             head_first=False)
+    """
+    assert q.dtype == k.dtype == v.dtype
+    assert q.dtype != torch.float32, "ChunkDeltaRuleFunction does not support float32. Please use bfloat16."
+    assert len(beta.shape) == 3, "beta must be of shape (batch size, num of heads, seq len)."
+
+    if offsets is not None:
+        if q.shape[0] != 1:
+            raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`. "
+                             f"Please flatten variable-length inputs before processing.")
+        if head_first:
+            raise RuntimeError("Sequences with variable lengths are not supported for head-first mode")
+        if initial_state is not None and initial_state.shape[0] != len(offsets) - 1:
+            raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, "
+                             f"i.e., {len(offsets) - 1} rather than {initial_state.shape[0]}.")
+    if scale is None:
+        scale = k.shape[-1] ** -0.5
+    o, final_state = ChunkDeltaRuleFunction.apply(
+        q,
+        k,
+        v,
+        beta,
+        scale,
+        initial_state,
+        output_final_state,
+        checkpoint_level,
+        offsets,
+        head_first
+    )
+    return o, final_state
diff --git a/fla/ops/delta_rule/fused_chunk.py b/fla/ops/delta_rule/fused_chunk.py
new file mode 100644
index 0000000000000000000000000000000000000000..c74b013b4abeffb0fe558600e5eb7fb2e23d86ce
--- /dev/null
+++ b/fla/ops/delta_rule/fused_chunk.py
@@ -0,0 +1,409 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+import torch
+import triton
+import triton.language as tl
+
+from fla.ops.delta_rule.wy_fast import (bwd_prepare_wy_repr,
+                                        fwd_prepare_wy_repr, fwd_recompute_w)
+from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({}, num_warps=1),
+        triton.Config({}, num_warps=2),
+        triton.Config({}, num_warps=4)
+    ],
+    key=["BT", "BK"],
+)
+@triton.jit
+def fused_chunk_delta_rule_fwd_kernel(
+    # B: batch_size, H: n_heads, T: seq_len, D: d_head
+    q,  # query [B, H, L, D_head_K]
+    k,  # key [B, H, L, D_head_K]
+    v,  # value [B, H, L, D_head_V]
+    v_new,
+    d,  # decay [B, H, L, D_head_K]
+    o,  # output [B, H, L, D_head_V]
+    initial_state,  # initial state of the chunk [B, H, D_head_K, D_head_V]
+    final_state,  # final state of the chunk [B, H, D_head_K, D_head_V]
+    s_k_h,  # stride size: L * D_head_K
+    s_k_t,  # stride size: D_head_K
+    s_k_d,  # stride size: 1
+ s_v_h, # stride size: L * D_head_V + s_v_t, # stride size: D_head_V + s_v_d, # stride size: 1 + B, # batch size + H, # n_heads + T, # seq_len + scale, # D_head_K ** -0.5 + BT: tl.constexpr, # BLOCK SIZE along the sequence dimension, a.k.a. chunk size + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension + DK: tl.constexpr, # D_head_K + DV: tl.constexpr, # D_head_V + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr, + CHECK: tl.constexpr +): + # indices + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + o_i = tl.arange(0, BT) + + # [BT, BT] + m_s = o_i[:, None] >= o_i[None, :] + # [BK, BV] + b_h = tl.zeros([BK, BV], dtype=tl.float32) + + # make block pointers + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, DK), (s_k_t, s_k_d), (0, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (DK, T), (s_k_d, s_k_t), (i_k * BK, 0), (BK, BT), (0, 1)) + p_d = tl.make_block_ptr(d + i_bh * s_k_h, (T, DK), (s_k_t, s_k_d), (0, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, DV), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + (i_bh+i_k*B*H) * s_v_h, (T, DV), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + p_v_new = tl.make_block_ptr(v_new + i_bh * s_v_h, (T, DV), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + + if USE_INITIAL_STATE: + p_h = tl.make_block_ptr(initial_state + i_bh * DK * DV, (DK, DV), (DV, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_h = tl.load(p_h, boundary_check=(0, 1)).to(tl.float32) + + for i in range(0, tl.cdiv(T, BT)): + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_d = tl.load(p_d, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_k.dtype) + + # [BT, BT] + b_s = tl.dot(b_q, b_k, allow_tf32=False) + b_s = tl.where(m_s, b_s, 0) + # [BT, BV] + b_v_prime = tl.dot(b_d, b_h.to(b_q.dtype), allow_tf32=False) + b_v = b_v - b_v_prime + tl.store(p_v_new, b_v.to(p_v.dtype.element_ty), boundary_check=(0, 1)) + + b_o = tl.dot(b_s.to(b_q.dtype), b_v.to(b_q.dtype), allow_tf32=False) + if CHECK and i == 0: + b_o += tl.dot(b_q, b_h.to(b_q.dtype), allow_tf32=False) + b_h = b_h + tl.dot(b_k, b_v.to(b_k.dtype), allow_tf32=False) + else: + b_o += tl.dot(b_q, b_h.to(b_q.dtype), allow_tf32=False) + b_h = b_h + tl.dot(b_k, b_v.to(b_k.dtype), allow_tf32=False) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + p_q = tl.advance(p_q, (BT, 0)) + p_k = tl.advance(p_k, (0, BT)) + p_v = tl.advance(p_v, (BT, 0)) + p_v_new = tl.advance(p_v_new, (BT, 0)) + p_o = tl.advance(p_o, (BT, 0)) + p_d = tl.advance(p_d, (BT, 0)) + + if STORE_FINAL_STATE: + p_final = tl.make_block_ptr(final_state + i_bh * DK * DV, (DK, DV), (DV, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_final, b_h.to(p_final.dtype.element_ty), boundary_check=(0, 1)) + + +# Similar to Algorithm1 of https://arxiv.org/abs/2006.16236 +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8) + ], + key=["BT", "BK", "BV"], +) +@triton.jit +def fused_chunk_delta_rule_bwd_kernel( + # B: batch_size, H: n_heads, T: seq_len, D: d_head + # NV: number of split in the V dimension. 
NK: number of split in the K dimension
+    q,  # query [B, H, L, D_head_K]
+    k,  # key [B, H, L, D_head_K]
+    v,  # value [B, H, L, D_head_V]
+    d,  # decay [B, H, L, D_head_K]
+    dht,  # gradient of final state [B, H, D_head_K, D_head_V]
+    dh0,  # gradient of initial state [B, H, D_head_K, D_head_V]
+    do,  # gradient of output [B, H, L, D_head_V]
+    dq,  # gradient of query [NV, B, H, L, D_head_K]
+    dk,  # gradient of key [NV, B, H, L, D_head_K]
+    dv,  # gradient of value [NK, B, H, L, D_head_V]
+    dd,  # gradient of decay [NV, B, H, L, D_head_K]
+    initial_state,  # initial state of the chunk [B, H, D_head_K, D_head_V]
+    s_k_h,  # stride size: L * D_head_K
+    s_k_t,  # stride size: D_head_K
+    s_k_d,  # stride size: 1
+    s_v_h,  # stride size: L * D_head_V
+    s_v_t,  # stride size: D_head_V
+    s_v_d,  # stride size: 1
+    B,  # batch_size
+    H,  # n_heads
+    T,  # seq_len
+    scale,  # D_head_K ** -0.5 by default
+    BT: tl.constexpr,  # BLOCK SIZE along the sequence dimension, a.k.a. chunk size
+    BK: tl.constexpr,  # BLOCK SIZE along the K dimension
+    BV: tl.constexpr,  # BLOCK SIZE along the V dimension
+    DK: tl.constexpr,  # D_head_K
+    DV: tl.constexpr,  # D_head_V
+    USE_INITIAL_STATE: tl.constexpr,  # whether to use initial state
+    USE_DHT: tl.constexpr,  # whether to use final state gradient
+    USE_DHO: tl.constexpr,  # whether to use initial state gradient
+    CHECK: tl.constexpr
+):
+    i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+    o_i = tl.arange(0, BT)
+
+    # first reverse
+    # [BK, BV]
+    b_dh = tl.zeros([BK, BV], dtype=tl.float32)
+    if USE_DHT:
+        p_dht = tl.make_block_ptr(dht + i_bh * DK * DV, (DK, DV), (DV, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0))
+        b_dh += tl.load(p_dht, boundary_check=(0, 1)).to(tl.float32)
+    m_s = o_i[:, None] <= o_i[None, :]
+
+    for i in range(tl.cdiv(T, BT) - 1, -1, -1):
+        p_q = tl.make_block_ptr(q + i_bh * s_k_h, (DK, T), (s_k_d, s_k_t), (i_k * BK, i * BT), (BK, BT), (0, 1))
+        p_d = tl.make_block_ptr(d + i_bh * s_k_h, (DK, T), (s_k_d, s_k_t), (i_k * BK, i * BT), (BK, BT), (0, 1))
+        p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, DK), (s_k_t, s_k_d), (i * BT, i_k * BK), (BT, BK), (1, 0))
+        p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, DV), (s_v_t, s_v_d), (i * BT, i_v * BV), (BT, BV), (1, 0))
+        p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, DV), (s_v_t, s_v_d), (i * BT, i_v * BV), (BT, BV), (1, 0))
+        p_dk = tl.make_block_ptr(dk + (i_bh+i_v*B*H) * s_k_h, (T, DK), (s_k_t, s_k_d), (i*BT, i_k*BK), (BT, BK), (1, 0))
+        p_dv = tl.make_block_ptr(dv + (i_bh+i_k*B*H) * s_v_h, (T, DV), (s_v_t, s_v_d), (i*BT, i_v*BV), (BT, BV), (1, 0))
+        # [DK, BT]
+        b_q = tl.load(p_q, boundary_check=(0, 1))
+        b_q = (b_q * scale).to(b_q.dtype)
+        # [BT, DK]
+        b_k = tl.load(p_k, boundary_check=(0, 1))
+        # [BT, DV]
+        b_v = tl.load(p_v, boundary_check=(0, 1))
+        b_do = tl.load(p_do, boundary_check=(0, 1))
+
+        # [BT, BT]
+        b_ds = tl.dot(b_v, tl.trans(b_do), allow_tf32=False)
+        b_ds = tl.where(m_s, b_ds, 0).to(b_q.dtype)
+        # [BT, BT]
+        b_s = tl.dot(b_k, b_q, allow_tf32=False)
+        b_s = tl.where(m_s, b_s, 0).to(b_q.dtype)
+        # [BT, DK]
+        b_dk = tl.dot(b_ds, tl.trans(b_q), allow_tf32=False)
+        # [BT, DV]
+        b_dv = tl.dot(b_s, b_do, allow_tf32=False)
+        b_d = tl.load(p_d, boundary_check=(0, 1))
+        b_dk += tl.dot(b_v, tl.trans(b_dh).to(b_v.dtype), allow_tf32=False)
+        b_dv += tl.dot(b_k, b_dh.to(b_k.dtype), allow_tf32=False)
+        b_dh += tl.dot(b_q, b_do, allow_tf32=False)
+        b_dh -= tl.dot(b_d, b_dv.to(b_d.dtype), allow_tf32=False)
+
+        tl.store(p_dk, (b_dk).to(p_dk.dtype.element_ty), boundary_check=(0, 1))
+        tl.store(p_dv, 
b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + if USE_DHO: + p_dh0 = tl.make_block_ptr(dh0 + i_bh * DK * DV, (DK, DV), (DV, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), boundary_check=(0, 1)) + + # sync threads + b_h = None + tl.debug_barrier() + m_s = o_i[:, None] >= o_i[None, :] + b_h = tl.zeros([BV, BK], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h = tl.make_block_ptr(initial_state + i_bh * DK * DV, (DV, DK), (1, DV), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + b_h += tl.load(p_h, boundary_check=(0, 1)).to(tl.float32) + NT = tl.cdiv(T, BT) + for i in range(0, NT): + p_dv = tl.make_block_ptr(dv + i_bh * s_v_h, (T, DV), (s_v_t, s_v_d), (i * BT, i_v * BV), (BT, BV), (1, 0)) + b_dv = tl.load(p_dv, boundary_check=(0, 1)) + b_dd = tl.dot(b_dv.to(k.dtype.element_ty), b_h.to(k.dtype.element_ty), allow_tf32=False) + p_dd = tl.make_block_ptr(dd + (i_bh + i_v*B*H) * s_k_h, (T, DK), (s_k_t, s_k_d), + (i * BT, i_k * BK), (BT, BK), (1, 0)) + tl.store(p_dd, -b_dd.to(p_dd.dtype.element_ty), boundary_check=(0, 1)) + + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, DK), (s_k_t, s_k_d), (i * BT, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (DV, T), (s_v_d, s_v_t), (i_v * BV, i * BT), (BV, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, DV), (s_v_t, s_v_d), (i * BT, i_v * BV), (BT, BV), (1, 0)) + p_dq = tl.make_block_ptr(dq + (i_bh + i_v*B*H) * s_k_h, (T, DK), (s_k_t, s_k_d), (i*BT, i_k*BK), (BT, BK), (1, 0)) + # [BT, DK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [DV, BT] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, DV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + + # [BT, BT] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) + b_ds = tl.where(m_s, b_ds, 0) + # [BT, DK] + b_dq = tl.dot(b_ds.to(b_k.dtype), b_k, allow_tf32=False) + # [DV, DK] + if CHECK and i == 0: + b_dq += tl.dot(b_do, b_h.to(b_do.dtype), allow_tf32=False) + b_h = b_h + tl.dot(b_v, b_k, allow_tf32=False) + else: + b_dq += tl.dot(b_do, b_h.to(b_do.dtype), allow_tf32=False) + b_h = b_h + tl.dot(b_v, b_k, allow_tf32=False) + b_dq *= scale + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + + +def fused_chunk_delta_rule_fwd(q, k, v, d, BT, scale, initial_state, output_final_state): + batch_size, n_heads, seq_len, d_head_qk = q.shape + d_head_v = v.shape[-1] + BT = BT + BK, BV = triton.next_power_of_2(d_head_qk), min(triton.next_power_of_2(d_head_v), 32) + NK, NV = triton.cdiv(d_head_qk, BK), triton.cdiv(d_head_v, BV) + assert NK == 1, 'NK should be 1' + o = q.new_empty(batch_size, n_heads, seq_len, d_head_v) + if output_final_state: + final_state = q.new_empty(batch_size, n_heads, d_head_qk, d_head_v, dtype=torch.float32, requires_grad=False) + else: + final_state = None + CHECK = True + # if version.parse(triton.__version__) < version.parse('2.2.0'): + # import warnings + # warnings.warn( + # "Triton<2.2.0 detected for running this kernel, " + # "which is known to have some weird compiler issues (refer to https://github.com/openai/triton/issues/2852) " + # "that lead to significant precision loss. " + # "We've add some initial condition checks to resolve this, sadly at the sacrifice of the speed. " + # "For optimal performance, it is recommended to install Triton>=2.2.0 (if possible)." 
+ # ) + # CHECK = True + grid = (NV, NK, batch_size * n_heads) + v_new = torch.empty_like(v) + fused_chunk_delta_rule_fwd_kernel[grid]( + q, k, v, v_new, d, o, initial_state, final_state, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + batch_size, n_heads, seq_len, scale, + BT=BT, DK=d_head_qk, DV=d_head_v, BK=BK, BV=BV, + USE_INITIAL_STATE=initial_state is not None, + STORE_FINAL_STATE=output_final_state, + CHECK=CHECK, + ) + return o, v_new, CHECK, final_state + + +def fused_chunk_delta_rule_bwd(q, k, v, d, dht, dh0, do, BT, CHECK, initial_state, scale): + batch_size, n_heads, seq_len, d_head_qk = q.shape + d_head_v = v.shape[-1] + BK, BV = triton.next_power_of_2(d_head_qk), min(triton.next_power_of_2(d_head_v), 32) + NK, NV = triton.cdiv(d_head_qk, BK), triton.cdiv(d_head_v, BV) + assert NK == 1 + dq = q.new_empty(NV, batch_size, n_heads, seq_len, d_head_qk) + dk = q.new_empty(NV, batch_size, n_heads, seq_len, d_head_qk) + dd = q.new_empty(NV, batch_size, n_heads, seq_len, d_head_qk) + dv = q.new_empty(NK, batch_size, n_heads, seq_len, d_head_v) + grid = (NV, NK, batch_size * n_heads) + fused_chunk_delta_rule_bwd_kernel[grid]( + q, k, v, d, dht, dh0, do, dq, dk, dv, dd, initial_state, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + batch_size, n_heads, seq_len, scale, + BT=BT, DK=d_head_qk, DV=d_head_v, BK=BK, BV=BV, + USE_INITIAL_STATE=initial_state is not None, + USE_DHT=dht is not None, + USE_DHO=dh0 is not None, + CHECK=CHECK + # num_warps=num_warps, + # num_stages=num_stages + ) + dq = dq.sum(0) + dk = dk.sum(0) + dv = dv.sum(0) + dd = dd.sum(0) + return dq, dk, dv, dd + + +class FusedChunkDeltaRuleFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward(ctx, q, k, v, beta, BT, scale, initial_state, output_final_state): + # obtain WY representation. u is actually the new v. + w, u, A = fwd_prepare_wy_repr(k, v, beta, BT) + # ### forward_h + final_state = None + if output_final_state: + final_state = q.new_empty(q.shape[0], q.shape[1], q.shape[-1], v.shape[-1], + dtype=torch.float32, requires_grad=False) + o, v_new, CHECK, final_state = fused_chunk_delta_rule_fwd(q, k, u, w, BT, scale, initial_state, output_final_state) + ctx.save_for_backward(q, k, v, beta, A, v_new, initial_state) + ctx.CHECK = CHECK + ctx.BT = BT + ctx.scale = scale + return o.to(q.dtype), final_state + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dht): + q, k, v, beta, A, v_new, initial_state = ctx.saved_tensors + BT = ctx.BT + scale = ctx.scale + w = fwd_recompute_w(k, beta, A, BT) + if initial_state is not None and initial_state.requires_grad: + dh0 = torch.empty_like(initial_state, dtype=torch.float32) + else: + dh0 = None + dq, dk, dv, dw = fused_chunk_delta_rule_bwd(q, k, v_new, w, dht, dh0, do, BT, ctx.CHECK, initial_state, scale) + dk2, dv, dbeta = bwd_prepare_wy_repr(k, v, beta, A, dw, dv, BT) + dk.add_(dk2) + return dq.to(q.dtype), dk.to(k.dtype), dv.to(v.dtype), dbeta.to(beta.dtype), None, None, dh0, None, None, None + + +def fused_chunk_delta_rule( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + beta: torch.Tensor, + scale: float = None, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + head_first: bool = True +): + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. 
+        v (torch.Tensor):
+            values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        beta (torch.Tensor):
+            betas of shape `[B, H, T]` if `head_first=True` else `[B, T, H]`.
+        scale (Optional[float]):
+            Scale factor for the attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[torch.Tensor]):
+            Initial state of shape `[B, H, K, V]`. Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state of shape `[B, H, K, V]`. Default: `False`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format.
+            Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        final_state (torch.Tensor):
+            Final state of shape `[B, H, K, V]` if `output_final_state=True` else `None`.
+    """
+    BT = 32 if q.shape[-1] <= 128 else 16
+    assert q.dtype == k.dtype == v.dtype
+    assert q.dtype != torch.float32, "FusedChunkDeltaRuleFunction does not support float32. Please use bfloat16."
+    assert len(beta.shape) == 3, "beta must be of shape (batch size, num of heads, seq len)."
+    if scale is None:
+        scale = k.shape[-1] ** -0.5
+    else:
+        assert scale > 0, "scale must be positive."
+    if not head_first:
+        q, k, v, beta = map(lambda x: x.transpose(1, 2), (q, k, v, beta))
+    o, final_state = FusedChunkDeltaRuleFunction.apply(q, k, v, beta, BT, scale, initial_state, output_final_state)
+    if not head_first:
+        o = o.transpose(1, 2)
+    return o, final_state
diff --git a/fla/ops/delta_rule/fused_recurrent.py b/fla/ops/delta_rule/fused_recurrent.py
new file mode 100644
index 0000000000000000000000000000000000000000..5f5511487881f0fe150202624d48be43e995d64e
--- /dev/null
+++ b/fla/ops/delta_rule/fused_recurrent.py
@@ -0,0 +1,578 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from typing import Optional, Tuple
+
+import torch
+import triton
+import triton.language as tl
+
+from fla.utils import contiguous
+
+
+@triton.heuristics({
+    'USE_INITIAL_STATE': lambda args: args['h0'] is not None,
+    'STORE_FINAL_STATE': lambda args: args['ht'] is not None,
+    'USE_OFFSETS': lambda args: args['offsets'] is not None
+})
+@triton.jit
+def fused_recurrent_delta_rule_fwd_kernel(
+    q,
+    k,
+    v,
+    u,
+    beta,
+    o,
+    h0,
+    ht,
+    offsets,
+    scale,
+    B: tl.constexpr,
+    T: tl.constexpr,
+    H: tl.constexpr,
+    K: tl.constexpr,
+    V: tl.constexpr,
+    BK: tl.constexpr,
+    BV: tl.constexpr,
+    USE_INITIAL_STATE: tl.constexpr,  # whether to use initial state
+    STORE_FINAL_STATE: tl.constexpr,  # whether to store final state
+    IS_BETA_HEADWISE: tl.constexpr,  # whether beta is a headwise vector or a scalar
+    USE_OFFSETS: tl.constexpr,
+    HEAD_FIRST: tl.constexpr
+):
+    i_v, i_k, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+    i_n, i_h = i_nh // H, i_nh % H
+    if USE_OFFSETS:
+        bos, eos = tl.load(offsets + i_n).to(tl.int64), tl.load(offsets + i_n + 1).to(tl.int64)
+        all = T
+        T = eos - bos
+    else:
+        bos, eos = i_n * T, i_n * T + T
+        all = B * T
+
+    if HEAD_FIRST:
+        p_q = q + i_nh * T*K + i_k * BK + tl.arange(0, BK)
+        p_k = k + i_nh * T*K + i_k * BK + tl.arange(0, BK)
+        p_v = v + i_nh * T*V + i_v * BV + tl.arange(0, BV)
+        p_u = u + i_nh * T*V + i_v * BV + tl.arange(0, BV)
+        if IS_BETA_HEADWISE:
+            p_beta = beta + i_nh * T*V + i_v * BV + tl.arange(0, BV)
+        else:
+            p_beta = beta + i_nh * T
+        p_o = o + (i_k * B*H + i_nh) * T*V + i_v * BV + tl.arange(0, BV)
+    else:
+        p_q = q + (bos * H + i_h) * K + i_k * BK + tl.arange(0, BK)
+        p_k = k + (bos * H + i_h) * 
K + i_k * BK + tl.arange(0, BK) + p_v = v + (bos * H + i_h) * V + i_v * BV + tl.arange(0, BV) + p_u = u + (bos * H + i_h) * V + i_v * BV + tl.arange(0, BV) + if IS_BETA_HEADWISE: + p_beta = beta + (bos * H + i_h) * V + i_v * BV + tl.arange(0, BV) + else: + p_beta = beta + bos * H + i_h + p_o = o + ((i_k * all + bos) * H + i_h) * V + i_v * BV + tl.arange(0, BV) + + mask_k = (i_k * BK + tl.arange(0, BK)) < K + mask_v = (i_v * BV + tl.arange(0, BV)) < V + mask_h = mask_k[None, :] & mask_v[:, None] + + b_h = tl.zeros([BV, BK], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h0 = h0 + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None]) + b_h += tl.load(p_h0, mask=mask_h, other=0).to(tl.float32) + + for _ in range(0, T): + b_k = tl.load(p_k, mask=mask_k, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_v, other=0).to(tl.float32) + b_q = tl.load(p_q, mask=mask_k, other=0).to(tl.float32) * scale + b_v_minus = tl.sum(b_h * b_k[None, :], axis=1) + b_v -= b_v_minus + if IS_BETA_HEADWISE: + b_beta = tl.load(p_beta, mask=mask_v, other=0).to(tl.float32) + else: + b_beta = tl.load(p_beta).to(tl.float32) + tl.store(p_u, b_v.to(p_v.dtype.element_ty), mask=mask_v) + b_v *= b_beta + b_h += b_k[None, :] * b_v[:, None] + b_o = b_h * b_q[None, :] + b_o = tl.sum(b_o, axis=1) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), mask=mask_v) + + p_q += K if HEAD_FIRST else H*K + p_k += K if HEAD_FIRST else H*K + p_o += V if HEAD_FIRST else H*V + p_v += V if HEAD_FIRST else H*V + p_u += V if HEAD_FIRST else H*V + p_beta += (1 if HEAD_FIRST else H) * (V if IS_BETA_HEADWISE else 1) + + if STORE_FINAL_STATE: + p_ht = ht + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None]) + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), mask=mask_h) + + +@triton.heuristics({ + 'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'USE_FINAL_STATE_GRADIENT': lambda args: args['dht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def fused_recurrent_delta_rule_bwd_kernel( + q, + k, + v, + beta, + h0, + dh0, + dht, + do, + dq, + dk, + dv, + db, + offsets, + scale, + B: tl.constexpr, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NK: tl.constexpr, + IS_BETA_HEADWISE: tl.constexpr, # whether beta is headwise vector or scalar + USE_INITIAL_STATE: tl.constexpr, # whether to use dh0 + USE_FINAL_STATE_GRADIENT: tl.constexpr, # whether to use dht + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_k, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_n, i_h = i_nh // H, i_nh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int64), tl.load(offsets + i_n + 1).to(tl.int64) + all = T + T = eos - bos + else: + bos, eos = i_n * T, i_n * T + T + all = B * T + + mask_k = i_k * BK + tl.arange(0, BK) < K + mask_v = i_v * BV + tl.arange(0, BV) < V + + if HEAD_FIRST: + p_q = q + i_nh * T*K + i_k * BK + tl.arange(0, BK) + (T - 1) * K + p_k = k + i_nh * T*K + i_k * BK + tl.arange(0, BK) + (T - 1) * K + p_v = v + i_nh * T*V + i_v * BV + tl.arange(0, BV) + (T - 1) * V + p_do = do + i_nh * T*V + i_v * BV + tl.arange(0, BV) + (T - 1) * V + p_dk = dk + (i_v * B*H + i_nh) * T*K + i_k * BK + tl.arange(0, BK) + (T - 1) * K + p_dv = dv + (i_k * B*H + i_nh) * T*V + i_v * BV + tl.arange(0, BV) + (T - 1) * V + if IS_BETA_HEADWISE: + p_beta = beta + i_nh * T*V + i_v * BV + tl.arange(0, BV) + (T - 1) * V + p_dbeta = db + 
(i_v * NK*B*H + i_k * B*H + i_nh) * T*V + tl.arange(0, BV) + (T - 1) * V + else: + p_beta = beta + i_nh * T + T - 1 + p_dbeta = db + (i_v * B*H + i_nh) * T + T - 1 + else: + p_q = q + (bos * H + i_h) * K + i_k * BK + tl.arange(0, BK) + (T - 1) * H*K + p_k = k + (bos * H + i_h) * K + i_k * BK + tl.arange(0, BK) + (T - 1) * H*K + p_v = v + (bos * H + i_h) * V + i_v * BV + tl.arange(0, BV) + (T - 1) * H*V + p_do = do + (bos * H + i_h) * V + i_v * BV + tl.arange(0, BV) + (T - 1) * H*V + p_dk = dk + ((i_v * all + bos) * H + i_h) * K + i_k * BK + tl.arange(0, BK) + (T - 1) * H*K + p_dv = dv + ((i_k * all + bos) * H + i_h) * V + i_v * BV + tl.arange(0, BV) + (T - 1) * H*V + if IS_BETA_HEADWISE: + p_beta = beta + (bos + T - 1) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_dbeta = db + ((i_v * NK + i_k) * all + bos + T - 1) * H*V + i_h * V + tl.arange(0, BV) + else: + p_beta = beta + (bos + T - 1) * H + i_h + p_dbeta = db + (i_v * all + bos + T - 1) * H + i_h + + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + if USE_FINAL_STATE_GRADIENT: + p_ht = dht + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + b_dh += tl.load(p_ht, mask=mask_k[:, None] & mask_v[None, :], other=0).to(tl.float32) + + for _ in range(T): + b_q = tl.load(p_q, mask=mask_k, other=0).to(tl.float32) * scale + b_k = tl.load(p_k, mask=mask_k, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_v, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask_v, other=0).to(tl.float32) + if IS_BETA_HEADWISE: + b_beta = tl.load(p_beta, mask=mask_v, other=0).to(tl.float32) + else: + b_beta = tl.load(p_beta).to(tl.float32) + b_dh += b_q[:, None] * b_do[None, :] + b_dk = tl.sum(b_dh * (b_v * b_beta)[None, :], axis=1) + b_dv = tl.sum(b_dh * b_k[:, None], axis=0) + + b_db = b_dv * b_v if IS_BETA_HEADWISE else tl.sum(b_dv * b_v) + b_dv = b_dv * b_beta + + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), mask=mask_k) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), mask=mask_v) + if IS_BETA_HEADWISE: + tl.store(p_dbeta, b_db.to(p_dbeta.dtype.element_ty), mask=mask_v) + else: + tl.store(p_dbeta, b_db.to(p_dbeta.dtype.element_ty)) + + b_dh -= b_k[:, None] * b_dv[None, :] + + p_q -= K if HEAD_FIRST else H*K + p_k -= K if HEAD_FIRST else H*K + p_v -= V if HEAD_FIRST else H*V + p_do -= V if HEAD_FIRST else H*V + p_dk -= K if HEAD_FIRST else H*K + p_dv -= V if HEAD_FIRST else H*V + p_dbeta -= (1 if HEAD_FIRST else H) * (V if IS_BETA_HEADWISE else 1) + p_beta -= (1 if HEAD_FIRST else H) * (V if IS_BETA_HEADWISE else 1) + + if USE_INITIAL_STATE: + p_dh0 = dh0 + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), mask=mask_k[:, None] & mask_v[None, :]) + + tl.debug_barrier() + + b_h = tl.zeros([BK, BV], dtype=tl.float32) + + if HEAD_FIRST: + p_q = q + i_nh * T*K + i_k * BK + tl.arange(0, BK) + p_k = k + i_nh * T*K + i_k * BK + tl.arange(0, BK) + p_v = v + i_nh * T*V + i_v * BV + tl.arange(0, BV) + if IS_BETA_HEADWISE: + p_beta = beta + i_nh * T*V + i_v * BV + tl.arange(0, BV) + else: + p_beta = beta + i_nh * T + p_do = do + i_nh * T*V + i_v * BV + tl.arange(0, BV) + p_dq = dq + (i_v * B*H + i_nh) * T*K + i_k * BK + tl.arange(0, BK) + p_dk = dk + (i_v * B*H + i_nh) * T*K + i_k * BK + tl.arange(0, BK) + p_dv = dv + (i_k * B*H + i_nh) * T*V + i_v * BV + tl.arange(0, BV) + else: + p_q = q + (bos * H + i_h) * K + i_k * BK + tl.arange(0, BK) + p_k = k + (bos * H + i_h) * K + i_k * BK + tl.arange(0, BK) + p_v = v + 
(bos * H + i_h) * V + i_v * BV + tl.arange(0, BV) + if IS_BETA_HEADWISE: + p_beta = beta + (bos * H + i_h) * V + i_v * BV + tl.arange(0, BV) + else: + p_beta = beta + bos * H + i_h + p_do = do + (bos * H + i_h) * V + i_v * BV + tl.arange(0, BV) + p_dq = dq + ((i_v * all + bos) * H + i_h) * K + i_k * BK + tl.arange(0, BK) + p_dk = dk + ((i_v * all + bos) * H + i_h) * K + i_k * BK + tl.arange(0, BK) + p_dv = dv + ((i_k * all + bos) * H + i_h) * V + i_v * BV + tl.arange(0, BV) + + if USE_INITIAL_STATE: + mask_h = mask_k[:, None] & mask_v[None, :] + p_h0 = h0 + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + b_h += tl.load(p_h0, mask=mask_h, other=0).to(tl.float32) + + for _ in range(0, T): + b_dk = tl.load(p_dk, mask=mask_k, other=0).to(tl.float32) + b_dv = tl.load(p_dv, mask=mask_v, other=0).to(tl.float32) + b_dk -= tl.sum(b_dv[None, :] * b_h, axis=1) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), mask=mask_k) + + b_k = tl.load(p_k, mask=mask_k, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_v, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask_v, other=0).to(tl.float32) + if IS_BETA_HEADWISE: + b_beta = tl.load(p_beta, mask=mask_v, other=0).to(tl.float32) + else: + b_beta = tl.load(p_beta).to(tl.float32) + b_v *= b_beta + + b_h += b_k[:, None] * b_v[None, :] + b_dq = b_h * b_do[None, :] + d_q = tl.sum(b_dq, axis=1) * scale + tl.store(p_dq, d_q.to(p_dq.dtype.element_ty), mask=mask_k) + + p_k += K if HEAD_FIRST else H*K + p_v += V if HEAD_FIRST else H*V + p_do += V if HEAD_FIRST else H*V + p_dq += K if HEAD_FIRST else H*K + p_dk += K if HEAD_FIRST else H*K + p_dv += V if HEAD_FIRST else H*V + p_beta += (1 if HEAD_FIRST else H) * (V if IS_BETA_HEADWISE else 1) + + +def fused_recurrent_delta_rule_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + beta: torch.Tensor, + scale: float, + initial_state: torch.Tensor, + output_final_state: bool, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + N = B if offsets is None else len(offsets) - 1 + BK, BV = triton.next_power_of_2(K), min(triton.next_power_of_2(V), 8) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + assert NK == 1, "NK > 1 is not supported yet" + num_stages = 1 + num_warps = 1 + + o = q.new_empty(NK, *v.shape) + if output_final_state: + final_state = q.new_empty(N, H, K, V, dtype=torch.float32) + else: + final_state = None + + grid = (NV, NK, N * H) + u = torch.empty_like(v) + fused_recurrent_delta_rule_fwd_kernel[grid]( + q, + k, + v, + u, + beta, + o, + initial_state, + final_state, + offsets, + scale, + B=B, + T=T, + H=H, + K=K, + V=V, + BK=BK, + BV=BV, + IS_BETA_HEADWISE=beta.ndim == v.ndim, + HEAD_FIRST=head_first, + num_warps=num_warps, + num_stages=num_stages, + ) + o = o.squeeze(0) + return o, u, final_state + + +def fused_recurrent_delta_rule_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + beta: torch.Tensor, + dht: torch.Tensor, + do: torch.Tensor, + scale: float, + initial_state: torch.Tensor, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor, torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + N = B if offsets is None else len(offsets) - 1 + BK, BV = triton.next_power_of_2(K), min(triton.next_power_of_2(V), 
32)
+    NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV)
+    assert NK == 1, "NK > 1 is not supported yet"
+    num_stages = 1
+    num_warps = 2
+
+    beta_vector = beta.ndim == v.ndim
+
+    dq = q.new_empty(NV, *q.shape)
+    dk = q.new_empty(NV, *k.shape)
+    dv = q.new_empty(NK, *v.shape)
+    if beta_vector:
+        db = q.new_empty(NV, NK, B, H, T, V) if head_first else q.new_empty(NV, NK, B, T, H, V)
+    else:
+        db = q.new_empty(NV, B, H, T) if head_first else q.new_empty(NV, B, T, H)
+    grid = (NV, NK, N * H)
+
+    if initial_state is not None and initial_state.requires_grad:
+        dh0 = torch.empty_like(initial_state, dtype=torch.float32)
+    else:
+        dh0 = None
+
+    fused_recurrent_delta_rule_bwd_kernel[grid](
+        q,
+        k,
+        v,
+        beta,
+        initial_state,
+        dh0,
+        dht,
+        do,
+        dq,
+        dk,
+        dv,
+        db,
+        offsets,
+        scale,
+        B=B,
+        T=T,
+        H=H,
+        K=K,
+        V=V,
+        BK=BK,
+        BV=BV,
+        NK=NK,
+        IS_BETA_HEADWISE=beta_vector,
+        HEAD_FIRST=head_first,
+        num_warps=num_warps,
+        num_stages=num_stages
+    )
+    dq = dq.sum(0)
+    dk = dk.sum(0)
+    dv = dv.sum(0)
+    db = db.sum((0, 1)) if beta_vector else db.sum(0)
+
+    return dq, dk, dv, db, dh0
+
+
+class FusedRecurrentFunction(torch.autograd.Function):
+
+    @staticmethod
+    @contiguous
+    def forward(
+        ctx,
+        q: torch.Tensor,
+        k: torch.Tensor,
+        v: torch.Tensor,
+        beta: torch.Tensor,
+        scale: float,
+        initial_state: torch.Tensor,
+        output_final_state: bool,
+        offsets: Optional[torch.LongTensor] = None,
+        head_first: bool = True
+    ):
+        o, u, final_state = fused_recurrent_delta_rule_fwd(
+            q=q,
+            k=k,
+            v=v,
+            beta=beta,
+            scale=scale,
+            initial_state=initial_state,
+            output_final_state=output_final_state,
+            offsets=offsets,
+            head_first=head_first
+        )
+
+        ctx.save_for_backward(q, k, u, beta, initial_state)
+        ctx.scale = scale
+        ctx.offsets = offsets
+        ctx.head_first = head_first
+        return o, final_state
+
+    @staticmethod
+    @contiguous
+    def backward(ctx, do, dht):
+        q, k, v, beta, initial_state = ctx.saved_tensors
+        dq, dk, dv, db, dh0 = fused_recurrent_delta_rule_bwd(
+            q=q,
+            k=k,
+            v=v,
+            beta=beta,
+            dht=dht,
+            do=do,
+            scale=ctx.scale,
+            initial_state=initial_state,
+            offsets=ctx.offsets,
+            head_first=ctx.head_first
+        )
+        return dq.to(q), dk.to(k), dv.to(v), db.to(beta), None, dh0, None, None, None
+
+
+def fused_recurrent_delta_rule(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    beta: torch.Tensor = None,
+    scale: float = None,
+    initial_state: torch.Tensor = None,
+    output_final_state: bool = False,
+    offsets: Optional[torch.LongTensor] = None,
+    head_first: bool = True
+) -> Tuple[torch.Tensor, torch.Tensor]:
+    r"""
+    Args:
+        q (torch.Tensor):
+            queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        k (torch.Tensor):
+            keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        v (torch.Tensor):
+            values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        beta (Optional[torch.Tensor]):
+            betas of shape `[B, H, T]` if `head_first=True` else `[B, T, H]`.
+            If not provided, it defaults to all ones. Default: `None`.
+        scale (Optional[float]):
+            Scale factor for the attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[torch.Tensor]):
+            Initial state of shape `[N, H, K, V]` for `N` input sequences.
+            For equal-length input sequences, `N` equals the batch size `B`.
+            Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state of shape `[N, H, K, V]`. Default: `False`.
+        offsets (Optional[torch.LongTensor]):
+            Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch.
+            For example,
+            if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively.
+            If provided, the inputs are concatenated and the batch size `B` is expected to be 1.
+            Default: `None`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format, which is not supported for variable-length inputs.
+            Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        final_state (torch.Tensor):
+            Final state of shape `[N, H, K, V]` if `output_final_state=True` else `None`.
+
+    Examples::
+        >>> import torch
+        >>> import torch.nn.functional as F
+        >>> from einops import rearrange
+        >>> from fla.ops.delta_rule import fused_recurrent_delta_rule
+        # inputs with equal lengths
+        >>> B, T, H, K, V = 4, 2048, 4, 512, 512
+        >>> q = torch.randn(B, T, H, K, device='cuda')
+        >>> k = F.normalize(torch.randn(B, T, H, K, device='cuda'), p=2, dim=-1)
+        >>> v = torch.randn(B, T, H, V, device='cuda')
+        >>> beta = torch.rand(B, T, H, device='cuda').sigmoid()
+        >>> h0 = torch.randn(B, H, K, V, device='cuda')
+        >>> o, ht = fused_recurrent_delta_rule(q, k, v, beta,
+                                               initial_state=h0,
+                                               output_final_state=True,
+                                               head_first=False)
+        # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required
+        >>> q, k, v, beta = map(lambda x: rearrange(x, 'b t ... -> 1 (b t) ...'), (q, k, v, beta))
+        # for a batch with 4 sequences, offsets with 5 start/end positions are expected
+        >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long)
+        >>> o_var, ht_var = fused_recurrent_delta_rule(q, k, v, beta,
+                                                       initial_state=h0,
+                                                       output_final_state=True,
+                                                       offsets=offsets,
+                                                       head_first=False)
+        >>> assert o.allclose(o_var.view(o.shape))
+        >>> assert ht.allclose(ht_var)
+    """
+    if offsets is not None:
+        if q.shape[0] != 1:
+            raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`. "
+ f"Please flatten variable-length inputs before processing.") + if head_first: + raise RuntimeError("Sequences with variable lengths are not supported for head-first mode") + if initial_state is not None and initial_state.shape[0] != len(offsets) - 1: + raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, " + f"i.e., {len(offsets) - 1} rather than {initial_state.shape[0]}.") + if scale is None: + scale = k.shape[-1] ** -0.5 + else: + assert scale > 0, "scale must be positive" + if beta is None: + beta = torch.ones_like(q[..., 0]) + o, final_state = FusedRecurrentFunction.apply( + q, + k, + v, + beta, + scale, + initial_state, + output_final_state, + offsets, + head_first + ) + return o, final_state diff --git a/fla/ops/delta_rule/naive.py b/fla/ops/delta_rule/naive.py new file mode 100644 index 0000000000000000000000000000000000000000..bdd73cf29a345f24c49de38b9c4b7986c21573ab --- /dev/null +++ b/fla/ops/delta_rule/naive.py @@ -0,0 +1,164 @@ +# -*- coding: utf-8 -*- + +import torch +from einops import rearrange + + +def delta_rule_recurrence(q, k, v, beta, initial_state=None, output_final_state=True): + orig_dtype = q.dtype + b, h, l, d_k = q.shape + q, k, v, beta = map(lambda x: x.float(), [q, k, v, beta]) + d_v = v.shape[-1] + o = torch.zeros_like(v) + S = torch.zeros(b, h, d_k, d_v).to(v) + q = q * (d_k ** -0.5) + + if beta.ndim < v.ndim: + beta = beta[..., None] + + if initial_state is not None: + S += initial_state + + for i in range(l): + _k = k[:, :, i] + _q = q[:, :, i] + _v = v[:, :, i].clone() + beta_i = beta[:, :, i] + _v = _v - (S.clone() * _k[..., None]).sum(-2) + _v = _v * beta_i + S = S.clone() + _k.unsqueeze(-1) * _v.unsqueeze(-2) + o[:, :, i] = torch.einsum('bhd,bhdm->bhm', _q, S) + S = None if output_final_state is False else S + return o.to(orig_dtype), S + + +def delta_rule_chunkwise(q, k, v, beta, chunk_size=32): + b, h, l, d_k = q.shape + d_v = v.shape[-1] + q = q * (d_k ** -0.5) + v = v * beta[..., None] + k_beta = k * beta[..., None] + + assert l % chunk_size == 0 + + # compute (I - tri(diag(beta) KK^T))^{-1} + mask = torch.triu(torch.ones(chunk_size, chunk_size, dtype=torch.bool, device=q.device), diagonal=0) + q, k, v, k_beta = map(lambda x: rearrange(x, 'b h (n c) d -> b h n c d', c=chunk_size), [q, k, v, k_beta]) + attn = -(k_beta @ k.transpose(-1, -2)).masked_fill(mask, 0) + for i in range(1, chunk_size): + attn[..., i, :i] = attn[..., i, :i] + (attn[..., i, :, None].clone() * attn[..., :, :i].clone()).sum(-2) + attn = attn + torch.eye(chunk_size, dtype=torch.float, device=q.device) + + u = attn @ v + w = attn @ k_beta + S = k.new_zeros(b, h, d_k, d_v) + o = torch.zeros_like(v) + mask = torch.triu(torch.ones(chunk_size, chunk_size, dtype=torch.bool, device=q.device), diagonal=1) + for i in range(0, l // chunk_size): + q_i, k_i = q[:, :, i], k[:, :, i] + attn = (q_i @ k_i.transpose(-1, -2)).masked_fill_(mask, 0) + u_i = u[:, :, i] - w[:, :, i] @ S + o_inter = q_i @ S + o[:, :, i] = o_inter + attn @ u_i + S = S + k_i.transpose(-1, -2) @ u_i + + return rearrange(o, 'b h n c d -> b h (n c) d'), S + + +def delta_rule_parallel(q, k, v, beta, BM=128, BN=32): + b, h, l, d_k = q.shape + # d_v = v.shape[-1] + q = q * (d_k ** -0.5) + v = v * beta[..., None] + k_beta = k * beta[..., None] + # compute (I - tri(diag(beta) KK^T))^{-1} + q, k, v, k_beta = map(lambda x: rearrange(x, 'b h (n c) d -> b h n c d', c=BN), [q, k, v, k_beta]) + mask = torch.triu(torch.ones(BN, BN, dtype=torch.bool, device=q.device), diagonal=0) + T 
= -(k_beta @ k.transpose(-1, -2)).masked_fill(mask, 0) + for i in range(1, BN): + T[..., i, :i] = T[..., i, :i].clone() + (T[..., i, :, None].clone() * T[..., :, :i].clone()).sum(-2) + T = T + torch.eye(BN, dtype=torch.float, device=q.device) + + mask2 = torch.triu(torch.ones(BN, BN, dtype=torch.bool, device=q.device), diagonal=1) + A_local = (q @ k.transpose(-1, -2)).masked_fill(mask2, 0) @ T + o_intra = A_local @ v + + # apply cumprod transition matrices on k to the last position within the chunk + k = k - ((k @ k.transpose(-1, -2)).masked_fill(mask, 0) @ T).transpose(-1, -2) @ k_beta + # apply cumprod transition matrices on q to the first position within the chunk + q = q - A_local @ k_beta + o_intra = A_local @ v + + A = torch.zeros(b, h, l, l, device=q.device) + + q, k, v, k_beta, o_intra = map(lambda x: rearrange(x, 'b h n c d -> b h (n c) d'), [q, k, v, k_beta, o_intra]) + o = torch.empty_like(v) + for i in range(0, l, BM): + q_i = q[:, :, i:i+BM] + o_i = o_intra[:, :, i:i+BM] + # intra block + for j in range(i + BM - 2 * BN, i-BN, -BN): + k_j = k[:, :, j:j+BN] + A_ij = q_i @ k_j.transpose(-1, -2) + mask = torch.arange(i, i+BM) >= (j + BN) + A_ij = A_ij.masked_fill_(~mask[:, None].to(A_ij.device), 0) + A[:, :, i:i+BM, j:j+BN] = A_ij + q_i = q_i - A_ij @ k_beta[:, :, j:j+BN] + o_i += A_ij @ v[:, :, j:j+BN] + # inter block + for j in range(i - BN, -BN, -BN): + k_j = k[:, :, j:j+BN] + A_ij = q_i @ k_j.transpose(-1, -2) + A[:, :, i:i+BM, j:j+BN] = A_ij + q_i = q_i - A_ij @ k_beta[:, :, j:j+BN] + o_i += A_ij @ v[:, :, j:j+BN] + o[:, :, i:i+BM] = o_i + + for i in range(0, l//BN): + A[:, :, i*BN:i*BN+BN, i*BN:i*BN+BN] = A_local[:, :, i] + + return o, A + + +if __name__ == '__main__': + B = 2 + H = 4 + L = 512 + DK = 128 + DV = 128 + q = (torch.randn(B, H, L, DK)).cuda().requires_grad_(True) + k = (torch.randn(B, H, L, DK)).cuda() + k = torch.nn.functional.normalize(k, dim=-1, p=2).requires_grad_(True) + v = (torch.randn(B, H, L, DV)).cuda().requires_grad_(True) + beta = torch.randn(B, H, L).cuda().sigmoid().requires_grad_(True) + + o, _ = delta_rule_recurrence(q, k, v, beta) + do = torch.randn(B, H, L, DV).cuda() + o.backward(do, retain_graph=True) + q_grad, q.grad = q.grad, None + k_grad, k.grad = k.grad, None + v_grad, v.grad = v.grad, None + beta_grad, beta.grad = beta.grad, None + + o2, _ = delta_rule_chunkwise(q, k, v, beta) + o2.backward(do) + assert torch.allclose(o, o2, atol=1e-4), breakpoint() + assert torch.allclose(q.grad, q_grad, atol=1e-4), breakpoint() + assert torch.allclose(k.grad, k_grad, atol=1e-4), breakpoint() + assert torch.allclose(v.grad, v_grad, atol=1e-4), breakpoint() + assert torch.allclose(beta.grad, beta_grad, atol=1e-4), breakpoint() + + q_grad, q.grad = q.grad, None + k_grad, k.grad = k.grad, None + v_grad, v.grad = v.grad, None + beta_grad, beta.grad = beta.grad, None + + o3, _ = delta_rule_parallel(q, k, v, beta) + o3.backward(do) + assert torch.allclose(o, o3, atol=1e-4), breakpoint() + assert torch.allclose(q.grad, q_grad, atol=1e-4), breakpoint() + assert torch.allclose(k.grad, k_grad, atol=1e-4), breakpoint() + assert torch.allclose(v.grad, v_grad, atol=1e-4), breakpoint() + assert torch.allclose(beta.grad, beta_grad, atol=1e-4), breakpoint() + + print("All passed!") diff --git a/fla/ops/delta_rule/parallel.py b/fla/ops/delta_rule/parallel.py new file mode 100644 index 0000000000000000000000000000000000000000..5613dcaffcf0342f81849eeff5279bf993e64b4a --- /dev/null +++ b/fla/ops/delta_rule/parallel.py @@ -0,0 +1,400 @@ + + +# -*- coding: utf-8 -*- +# 
Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Tuple + +import torch +import triton +import triton.language as tl +from einops import rearrange + +from fla.ops.delta_rule.wy_fast import fwd_prepare_T +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + ], + key=["BT", "K", "V"], +) +@triton.jit +def chunk_transform_qk_fwd_kernel( + q, + k, + v, + beta, + o, + A, + q_new, + k_new, + A_local, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + BT: tl.constexpr, + OUTPUT_ATTENTIONS: tl.constexpr, + # SAVE_ATTENTION: tl.constexpr +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, 0), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, 0), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, 0), (BT, BV), (1, 0)) + b_q = (tl.load(p_q, boundary_check=(0, 1)) * scale).to(p_q.dtype.element_ty) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_v = tl.load(p_v, boundary_check=(0, 1)) + + p_T = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + b_T = tl.load(p_T, boundary_check=(0, 1)) + + o_i = tl.arange(0, BT) + m_t = o_i[:, None] >= o_i[None, :] + b_qk = tl.where(m_t, tl.dot(b_q, tl.trans(b_k), allow_tf32=False), 0).to(b_q.dtype) + m_t = o_i[:, None] > o_i[None, :] + b_kk = tl.where(m_t, tl.dot(b_k, tl.trans(b_k), allow_tf32=False), 0).to(b_k.dtype) + + p_beta = tl.make_block_ptr(beta + i_bh * T, (T, ), (1, ), (i_t * BT, ), (BT, ), (0, )) + b_beta = tl.load(p_beta, boundary_check=(0, )) + b_k_beta = (b_k * b_beta[:, None]).to(b_k.dtype) + + b_qkT = tl.dot(b_qk, b_T, allow_tf32=False).to(b_k.dtype) + + if OUTPUT_ATTENTIONS: + p_a = tl.make_block_ptr(A_local + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + tl.store(p_a, b_qkT.to(p_a.dtype.element_ty), boundary_check=(0, 1)) + + b_kkT = tl.dot(b_kk, b_T, allow_tf32=False).to(b_k.dtype) + p_o = tl.make_block_ptr(o + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, 0), (BT, BV), (1, 0)) + tl.store(p_o, tl.dot(b_qkT, b_v).to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + p_q_new = tl.make_block_ptr(q_new + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, 0), (BT, BK), (1, 0)) + tl.store(p_q_new, (b_q - tl.dot(b_qkT, b_k_beta, allow_tf32=False)).to(p_q_new.dtype.element_ty), boundary_check=(0, 1)) + + p_k_new = tl.make_block_ptr(k_new + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, 0), (BT, BK), (1, 0)) + tl.store(p_k_new, (b_k - tl.dot(tl.trans(b_kkT), b_k_beta, allow_tf32=False) + ).to(p_k_new.dtype.element_ty), boundary_check=(0, 1)) + + +def chunk_transform_qk_fwd_fn(q, k, v, beta, A, scale, BT, output_attentions): + B, H, T, K = k.shape + q_new = torch.empty_like(q) + k_new = torch.empty_like(k) + o = torch.empty_like(v) + grid = (triton.cdiv(T, BT), B*H) + V = v.shape[-1] + A_local = torch.empty_like(A) if output_attentions else None + chunk_transform_qk_fwd_kernel[grid]( + q, k, v, beta, o, A, q_new, k_new, A_local, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale=scale, + T=T, + K=K, + V=V, + BT=BT, + BK=triton.next_power_of_2(K), + BV=triton.next_power_of_2(V), + 
OUTPUT_ATTENTIONS=output_attentions + ) + return q_new, k_new, o, A_local + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + ], + key=["BT"], +) +@triton.jit +def save_intra_chunk_attn( + A, + A_local, + T: tl.constexpr, + BT: tl.constexpr, +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + p_A = tl.make_block_ptr(A + i_bh * T * T, (T, T), (T, 1), (i_t * BT, i_t * BT), (BT, BT), (1, 0)) + p_A_local = tl.make_block_ptr(A_local + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + b_A_local = tl.load(p_A_local, boundary_check=(0, 1)) + tl.store(p_A, b_A_local.to(p_A.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.heuristics({ + 'OUTPUT_ATTENTIONS': lambda args: args['attn'] is not None +}) +@triton.jit +def parallel_delta_rule_fwd_kernel( + q, + k, + k2, # original k + v, + beta, + o, + o_new, + attn, + s_k_h, + s_k_t, + s_v_h, + s_v_t, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + OUTPUT_ATTENTIONS: tl.constexpr +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, 1), (i_t * BT, 0), (BT, BK), (1, 0)) + + # the Q block is kept in the shared memory throughout the whole kernel + # [BT, BK] + b_q = tl.zeros([BT, BK], dtype=tl.float32) + b_q += tl.load(p_q, boundary_check=(0, 1)) + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + p_o = tl.make_block_ptr(o + i_bh * s_v_h, (T, V), (s_v_t, 1), (i_t * BT, 0), (BT, BV), (1, 0)) + b_o += tl.load(p_o, boundary_check=(0, 1)) + + # As opposed to Flashattention, this kernel requires scanning the KV blocks from right to left + # Q block and K block have overlap. + # masks required + for offset in range((i_t + 1) * BT - 2 * BS, i_t * BT - BS, -BS): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (1, s_k_t), (0, offset), (BK, BS), (0, 1)) + p_k2 = tl.make_block_ptr(k2 + i_bh * s_k_h, (T, K), (s_k_t, 1), (offset, 0), (BS, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, 1), (offset, 0), (BS, BV), (1, 0)) + p_beta = tl.make_block_ptr(beta + i_bh * T, (T, ), (1, ), (offset, ), (BS, ), (0,)) + # [BK, BS] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BS, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BS] + b_beta = tl.load(p_beta, boundary_check=(0,)) + # [BT, BS] + m_s = tl.arange(0, BT) >= (offset - i_t*BT + BS) + b_s = tl.dot(b_q.to(b_k.dtype), b_k, allow_tf32=False) + b_s = tl.where(m_s[:, None], b_s, 0) + + b_o += tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False) + b_k2 = (tl.load(p_k2, boundary_check=(0, 1)) * b_beta[:, None]).to(b_v.dtype) + b_q -= tl.dot(b_s.to(b_v.dtype), b_k2, allow_tf32=False) + + if OUTPUT_ATTENTIONS: + p_a = tl.make_block_ptr(attn + i_bh * T * T, (T, T), (T, 1), (i_t * BT, offset), (BT, BS), (1, 0)) + tl.store(p_a, b_s.to(p_a.dtype.element_ty), boundary_check=(0, 1)) + + # Q block and K block have no overlap + # no need for mask, thereby saving flops + for offset in range(i_t * BT - BS, -BS, -BS): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (1, s_k_t), (0, offset), (BK, BS), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, 1), (offset, 0), (BS, BV), (1, 0)) + p_beta = tl.make_block_ptr(beta + i_bh * T, (T, ), (1, ), (offset, ), (BS, ), (0,)) + p_k2 = tl.make_block_ptr(k2 + i_bh * s_k_h, (T, K), (s_k_t, 1), (offset, 0), (BS, BK), (1, 0)) + + # [BK, BS] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BS, BV] + b_v = tl.load(p_v, 
boundary_check=(0, 1))
+        # [BS]
+        b_beta = tl.load(p_beta, boundary_check=(0,))
+        # [BT, BS]
+        b_s = tl.dot(b_q.to(b_k.dtype), b_k, allow_tf32=False)
+        # [BT, BV]
+        b_o += tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False)
+        b_k2 = (tl.load(p_k2, boundary_check=(0, 1)) * b_beta[:, None]).to(b_v.dtype)
+        b_q -= tl.dot(b_s.to(b_v.dtype), b_k2, allow_tf32=False).to(b_q.dtype)
+
+        if OUTPUT_ATTENTIONS:
+            p_a = tl.make_block_ptr(attn + i_bh * T * T, (T, T), (T, 1), (i_t * BT, offset), (BT, BS), (1, 0))
+            tl.store(p_a, b_s.to(p_a.dtype.element_ty), boundary_check=(0, 1))
+
+    p_o_new = tl.make_block_ptr(o_new + i_bh * s_v_h, (T, V), (s_v_t, 1), (i_t*BT, 0), (BT, BV), (1, 0))
+    tl.store(p_o_new, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1))
+
+
+class ParallelDeltaRuleFunction(torch.autograd.Function):
+
+    @staticmethod
+    @contiguous
+    @autocast_custom_fwd
+    def forward(ctx, q, k, v, beta, scale, output_attentions):
+        B, H, T, K, V = *k.shape, v.shape[-1]
+        assert q.shape[-1] <= 128, 'The maximum supported head dimension is 128.'
+        BT, BS = 128, 32
+        BK = triton.next_power_of_2(k.shape[-1])
+        BV = triton.next_power_of_2(v.shape[-1])
+        assert BT % BS == 0
+
+        A = fwd_prepare_T(k, beta, None, None, True, BS)
+        attn = q.new_zeros(B, H, T, T) if output_attentions else None
+        q_new, k_new, o, A_local = chunk_transform_qk_fwd_fn(q, k, v, beta, A, scale, BS, output_attentions)
+
+        num_stages = 3 if K <= 64 else 2
+        num_warps = 4
+        grid = (triton.cdiv(T, BT), B * H)
+        o_new = torch.empty_like(o)
+
+        parallel_delta_rule_fwd_kernel[grid](
+            q=q_new,
+            k=k_new,
+            k2=k,
+            v=v,
+            beta=beta,
+            o=o,
+            o_new=o_new,
+            attn=attn,
+            s_k_h=k.stride(1),
+            s_k_t=k.stride(2),
+            s_v_h=v.stride(1),
+            s_v_t=v.stride(2),
+            T=T,
+            K=K,
+            V=V,
+            BT=BT,
+            BS=BS,
+            BK=BK,
+            BV=BV,
+            num_stages=num_stages,
+            num_warps=num_warps
+        )
+
+        if output_attentions:
+            grid = (triton.cdiv(T, BS), B * H)
+            save_intra_chunk_attn[grid](
+                A=attn, A_local=A_local, T=T, BT=BS
+            )
+        return o_new.to(q.dtype), attn
+
+    @staticmethod
+    @contiguous
+    @autocast_custom_bwd
+    def backward(ctx, do, d_attn=None):
+        raise NotImplementedError('Backward pass is not implemented. Stay tuned!')
+
+
+def parallel_delta_rule(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    beta: torch.Tensor,
+    scale: float = None,
+    output_attentions: bool = False,
+    head_first: bool = True
+) -> Tuple[torch.Tensor, torch.Tensor]:
+    r"""
+    Args:
+        q (torch.Tensor):
+            queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        k (torch.Tensor):
+            keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        v (torch.Tensor):
+            values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        beta (torch.Tensor):
+            betas of shape `[B, H, T]` if `head_first=True` else `[B, T, H]`.
+        scale (Optional[float]):
+            Scale factor for attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        output_attentions (bool):
+            Whether to output the materialized attention scores of shape [B, H, T, T]. Default: `False`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format.
+            Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        attn (torch.Tensor):
+            Attention scores of shape `[B, H, T, T]` if `output_attentions=True` else `None`.
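+
+    Example (a minimal usage sketch; the shapes below are illustrative, not required):
+        >>> import torch
+        >>> B, H, T, K, V = 2, 4, 512, 64, 64
+        >>> q = torch.randn(B, H, T, K, dtype=torch.bfloat16, device='cuda')
+        >>> k = torch.nn.functional.normalize(
+        ...     torch.randn(B, H, T, K, dtype=torch.bfloat16, device='cuda'), p=2, dim=-1)
+        >>> v = torch.randn(B, H, T, V, dtype=torch.bfloat16, device='cuda')
+        >>> beta = torch.rand(B, H, T, dtype=torch.bfloat16, device='cuda').sigmoid()
+        >>> o, attn = parallel_delta_rule(q, k, v, beta, scale=K**-0.5, output_attentions=True)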
+ """ + if not head_first: + q, k, v, beta = map(lambda x: x.transpose(1, 2), (q, k, v, beta)) + o, attn = ParallelDeltaRuleFunction.apply(q, k, v, beta, scale, output_attentions) + if not head_first: + o = o.transpose(1, 2) + return o, attn + + +def naive_delta_rule_parallel(q, k, v, beta, BM=128, BN=32): + b, h, l, d_k = q.shape + q = q * (d_k ** -0.5) + v = v * beta[..., None] + k_beta = k * beta[..., None] + # compute (I - tri(diag(beta) KK^T))^{-1} + q, k, v, k_beta = map(lambda x: rearrange(x, 'b h (n c) d -> b h n c d', c=BN), [q, k, v, k_beta]) + mask = torch.triu(torch.ones(BN, BN, dtype=torch.bool, device=q.device), diagonal=0) + T = -(k_beta @ k.transpose(-1, -2)).masked_fill(mask, 0) + for i in range(1, BN): + T[..., i, :i] = T[..., i, :i].clone() + (T[..., i, :, None].clone() * T[..., :, :i].clone()).sum(-2) + T = T + torch.eye(BN, dtype=q.dtype, device=q.device) + + mask2 = torch.triu(torch.ones(BN, BN, dtype=torch.bool, device=q.device), diagonal=1) + A_local = (q @ k.transpose(-1, -2)).masked_fill(mask2, 0) @ T + o_intra = A_local @ v + + # apply cumprod transition matrices on k to the last position within the chunk + k = k - ((k @ k.transpose(-1, -2)).masked_fill(mask, 0) @ T).transpose(-1, -2) @ k_beta + # apply cumprod transition matrices on q to the first position within the chunk + q = q - A_local @ k_beta + o_intra = A_local @ v + + A = torch.zeros(b, h, l, l, device=q.device) + + q, k, v, k_beta, o_intra = map(lambda x: rearrange(x, 'b h n c d -> b h (n c) d'), [q, k, v, k_beta, o_intra]) + o = torch.empty_like(v) + for i in range(0, l, BM): + q_i = q[:, :, i:i+BM] + o_i = o_intra[:, :, i:i+BM] + # intra block + for j in range(i + BM - 2 * BN, i-BN, -BN): + k_j = k[:, :, j:j+BN] + A_ij = q_i @ k_j.transpose(-1, -2) + mask = torch.arange(i, i+BM) >= (j + BN) + A_ij = A_ij.masked_fill_(~mask[:, None].to(A_ij.device), 0) + A[:, :, i:i+BM, j:j+BN] = A_ij + q_i = q_i - A_ij @ k_beta[:, :, j:j+BN] + o_i += A_ij @ v[:, :, j:j+BN] + # inter block + for j in range(i - BN, -BN, -BN): + k_j = k[:, :, j:j+BN] + A_ij = q_i @ k_j.transpose(-1, -2) + A[:, :, i:i+BM, j:j+BN] = A_ij + q_i = q_i - A_ij @ k_beta[:, :, j:j+BN] + o_i += A_ij @ v[:, :, j:j+BN] + o[:, :, i:i+BM] = o_i + + for i in range(0, l//BN): + A[:, :, i*BN:i*BN+BN, i*BN:i*BN+BN] = A_local[:, :, i] + + return o, A + + +if __name__ == "__main__": + B, H, T, K, V = 2, 4, 512, 64, 64 + torch.set_default_dtype(torch.bfloat16) + + q = torch.randn[B, H, T, K].cuda() + k = torch.nn.functional.normalize(torch.randn[B, H, T, K].cuda(), p=2, dim=-1) + v = torch.randn[B, H, T, V].cuda() + beta = torch.ones(B, H, T).cuda() + + output_attentions = True + ref_o, ref_attn = naive_delta_rule_parallel(q.clone(), k.clone(), v.clone(), beta.clone()) + o, attn = parallel_delta_rule(q.clone(), k.clone(), v.clone(), beta.clone(), K**-0.5, output_attentions) + print((ref_o-o).abs().max()) + print((ref_attn-attn).abs().max()) diff --git a/fla/ops/delta_rule/wy_fast.py b/fla/ops/delta_rule/wy_fast.py new file mode 100644 index 0000000000000000000000000000000000000000..ea8962bd0c97fd0e6800272af2edad8ca0dcc4b4 --- /dev/null +++ b/fla/ops/delta_rule/wy_fast.py @@ -0,0 +1,805 @@ +# -*- coding: utf-8 -*- + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl +from einops import rearrange + +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +# Inspired by "THE WY REPRESENTATION FOR PRODUCTS OF HOUSEHOLDER MATRICES" https://epubs.siam.org/doi/pdf/10.1137/0908009 
+@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16) + ], + key=["BK"] +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def fwd_prepare_wy_repr_kernel_chunk32( + k, + beta, + A, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BC: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr, +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + if HEAD_FIRST: + p_beta = tl.make_block_ptr(beta + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + else: + p_beta = tl.make_block_ptr(beta + bos * H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + b_beta = tl.load(p_beta, boundary_check=(0,)) + + b_A = tl.zeros([BT, BT], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_kb = (b_k * b_beta[:, None]).to(b_k.dtype) + b_A += tl.dot(b_kb, tl.trans(b_k), allow_tf32=False) + + b_A = -tl.where(tl.arange(0, BT)[:, None] > tl.arange(0, BT)[None, :], b_A, 0) + for i in range(1, BT): + mask = tl.arange(0, BT) == i + b_a = tl.sum(tl.where(mask[:, None], b_A, 0), 0) + b_a = b_a + tl.sum(b_a[:, None] * b_A, 0) * (tl.arange(0, BT) < i) + b_A = tl.where(mask[:, None], b_a, b_A) + b_A += tl.arange(0, BT)[:, None] == tl.arange(0, BT)[None, :] + + if HEAD_FIRST: + p_A = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + else: + p_A = tl.make_block_ptr(A + (bos*H + i_h) * BT, (T, BT), (H*BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + tl.store(p_A, (b_A).to(p_A.dtype.element_ty), boundary_check=(0, 1)) + b_A = b_A.to(k.dtype.element_ty) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16) + ], + key=["BK"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def fwd_prepare_wy_repr_kernel_chunk64( + k, + beta, + A, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BC: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + b_A = tl.zeros([BC, BC], dtype=tl.float32) + b_A2 = tl.zeros([BC, BC], dtype=tl.float32) + b_A3 = tl.zeros([BC, BC], dtype=tl.float32) + + if HEAD_FIRST: + p_beta = tl.make_block_ptr(beta + i_bh * T, (T,), (1,), (i_t * BT,), (BC,), (0,)) + else: + p_beta = tl.make_block_ptr(beta + bos * H + i_h, (T,), 
(H,), (i_t * BT,), (BC,), (0,)) + b_beta = tl.load(p_beta, boundary_check=(0,)) + + if HEAD_FIRST: + p_beta2 = tl.make_block_ptr(beta + i_bh * T, (T,), (1,), (i_t * BT + BC,), (BC,), (0,)) + else: + p_beta2 = tl.make_block_ptr(beta + bos * H + i_h, (T,), (H,), (i_t * BT + BC,), (BC,), (0,)) + b_beta2 = tl.load(p_beta2, boundary_check=(0,)) + + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BC, BK), (1, 0)) + p_k2 = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT + BC, i_k * BK), (BC, BK), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BC, BK), (1, 0)) + p_k2 = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT + BC, i_k * BK), (BC, BK), (1, 0)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_kb = (b_k * b_beta[:, None]).to(b_k.dtype) + b_k2 = tl.load(p_k2, boundary_check=(0, 1)) + b_kb2 = (b_k2 * b_beta2[:, None]).to(b_k2.dtype) + b_A += tl.dot(b_kb, tl.trans(b_k), allow_tf32=False) + b_A2 += tl.dot(b_kb2, tl.trans(b_k2), allow_tf32=False) + b_A3 += tl.dot(b_kb2, tl.trans(b_k), allow_tf32=False) + + b_A = -tl.where(tl.arange(0, BC)[:, None] > tl.arange(0, BC)[None, :], b_A, 0) + b_A2 = -tl.where(tl.arange(0, BC)[:, None] > tl.arange(0, BC)[None, :], b_A2, 0) + for i in range(1, BC): + mask = tl.arange(0, BC) == i + b_a = tl.sum(tl.where(mask[:, None], b_A, 0), 0) + b_a2 = tl.sum(tl.where(mask[:, None], b_A2, 0), 0) + b_a = b_a + tl.sum(b_a[:, None] * b_A, 0) * (tl.arange(0, BC) < i) + b_a2 = b_a2 + tl.sum(b_a2[:, None] * b_A2, 0) * (tl.arange(0, BC) < i) + b_A = tl.where(mask[:, None], b_a, b_A) + b_A2 = tl.where(mask[:, None], b_a2, b_A2) + + # blockwise computation of lower triangular matrix's inverse + # i.e., [A11, 0; A21, A22]^-1 = [A11^-1, 0; -A22^-1 A21 A11^-1, A22^-1] + b_A += tl.arange(0, BC)[:, None] == tl.arange(0, BC)[None, :] + b_A2 += tl.arange(0, BC)[:, None] == tl.arange(0, BC)[None, :] + b_A3 = -tl.dot(tl.dot(b_A2, b_A3, allow_tf32=False), b_A, allow_tf32=False) + + if HEAD_FIRST: + p_A1 = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, 0), (BC, BC), (1, 0)) + p_A2 = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT + BC, BC), (BC, BC), (1, 0)) + p_A3 = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT + BC, 0), (BC, BC), (1, 0)) + p_A4 = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, BC), (BC, BC), (1, 0)) + else: + p_A1 = tl.make_block_ptr(A + (bos*H + i_h) * BT, (T, BT), (H*BT, 1), (i_t * BT, 0), (BC, BC), (1, 0)) + p_A2 = tl.make_block_ptr(A + (bos*H + i_h) * BT, (T, BT), (H*BT, 1), (i_t * BT + BC, BC), (BC, BC), (1, 0)) + p_A3 = tl.make_block_ptr(A + (bos*H + i_h) * BT, (T, BT), (H*BT, 1), (i_t * BT + BC, 0), (BC, BC), (1, 0)) + p_A4 = tl.make_block_ptr(A + (bos*H + i_h) * BT, (T, BT), (H*BT, 1), (i_t * BT, BC), (BC, BC), (1, 0)) + tl.store(p_A1, b_A.to(p_A1.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_A2, b_A2.to(p_A2.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_A3, b_A3.to(p_A3.dtype.element_ty), boundary_check=(0, 1)) + # causal mask + tl.store(p_A4, tl.zeros([BC, BC], dtype=tl.float32).to(p_A4.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8) + ], + key=["BT", "BK", "BV"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is 
not None}) +@triton.jit +def fwd_recompute_w_u_kernel( + k, + v, + beta, + w, + u, + A, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + if HEAD_FIRST: + p_beta = tl.make_block_ptr(beta + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + p_A = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + else: + p_beta = tl.make_block_ptr(beta + bos * H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + p_A = tl.make_block_ptr(A + (bos*H + i_h) * BT, (T, BT), (H*BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + b_beta = tl.load(p_beta, boundary_check=(0,)) + b_A = tl.load(p_A, boundary_check=(0, 1)) + + for i_v in range(tl.cdiv(V, BV)): + if HEAD_FIRST: + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_u = tl.make_block_ptr(u + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + else: + p_v = tl.make_block_ptr(v + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_u = tl.make_block_ptr(u + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_vb = (b_v * b_beta[:, None]).to(b_v.dtype) + b_u = tl.dot(b_A.to(b_vb.dtype), b_vb, allow_tf32=False) + tl.store(p_u, (b_u).to(p_u.dtype.element_ty), boundary_check=(0, 1)) + + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_w = tl.make_block_ptr(w + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_w = tl.make_block_ptr(w + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_kb = (b_k * b_beta[:, None]).to(b_k.dtype) + b_w = tl.dot(b_A.to(b_kb.dtype), b_kb, allow_tf32=False) + tl.store(p_w, b_w.to(p_w.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8) + ], + key=["BT", "BK"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def fwd_recompute_w_kernel( + k, + beta, + w, + A, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + if HEAD_FIRST: + p_beta = tl.make_block_ptr(beta + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + p_A = tl.make_block_ptr(A + i_bh * T * BT, (T, 
BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0))
+    else:
+        p_beta = tl.make_block_ptr(beta + bos * H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,))
+        p_A = tl.make_block_ptr(A + (bos*H + i_h) * BT, (T, BT), (H*BT, 1), (i_t * BT, 0), (BT, BT), (1, 0))
+    b_beta = tl.load(p_beta, boundary_check=(0,))
+    b_A = tl.load(p_A, boundary_check=(0, 1)).to(k.dtype.element_ty)
+
+    for i_k in range(tl.cdiv(K, BK)):
+        if HEAD_FIRST:
+            p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+            p_w = tl.make_block_ptr(w + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        else:
+            p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+            p_w = tl.make_block_ptr(w + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        b_k = tl.load(p_k, boundary_check=(0, 1))
+        b_kb = (b_k * b_beta[:, None]).to(b_k.dtype)
+        b_w = tl.dot(b_A, b_kb, allow_tf32=False)
+        tl.store(p_w, b_w.to(p_w.dtype.element_ty), boundary_check=(0, 1))
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({}, num_warps=1),
+        triton.Config({}, num_warps=2),
+        triton.Config({}, num_warps=4),
+        triton.Config({}, num_warps=8),
+        triton.Config({}, num_warps=16)
+    ],
+    key=["BT", "BK", "BV"],
+)
+@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None})
+@triton.jit
+def bwd_prepare_wy_repr_kernel(
+    k,
+    v,
+    beta,
+    A,
+    dw,
+    du,
+    dk,
+    dv,
+    dbeta,
+    offsets,
+    indices,
+    T: tl.constexpr,
+    H: tl.constexpr,
+    K: tl.constexpr,
+    V: tl.constexpr,
+    BT: tl.constexpr,
+    BK: tl.constexpr,
+    BV: tl.constexpr,
+    USE_OFFSETS: tl.constexpr,
+    HEAD_FIRST: tl.constexpr
+):
+    i_t, i_bh = tl.program_id(0), tl.program_id(1)
+    i_b, i_h = i_bh // H, i_bh % H
+    if USE_OFFSETS:
+        i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32)
+        bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32)
+        T = eos - bos
+    else:
+        bos, eos = i_b * T, i_b * T + T
+
+    if HEAD_FIRST:
+        p_beta = tl.make_block_ptr(beta + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,))
+        p_A = tl.make_block_ptr(A + i_bh * T * BT, (BT, T), (1, BT), (0, i_t * BT), (BT, BT), (0, 1))
+    else:
+        p_beta = tl.make_block_ptr(beta + bos * H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,))
+        p_A = tl.make_block_ptr(A + (bos*H + i_h) * BT, (BT, T), (1, H*BT), (0, i_t * BT), (BT, BT), (0, 1))
+
+    b_beta = tl.load(p_beta, boundary_check=(0,))
+    b_A = tl.load(p_A, boundary_check=(0, 1))
+
+    b_dbeta = tl.zeros([BT], dtype=tl.float32)
+    b_dA = tl.zeros([BT, BT], dtype=tl.float32)
+    for i_v in range(tl.cdiv(V, BV)):
+        if HEAD_FIRST:
+            p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_dv = tl.make_block_ptr(dv + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_du = tl.make_block_ptr(du + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+        else:
+            p_v = tl.make_block_ptr(v + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_dv = tl.make_block_ptr(dv + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+            p_du = tl.make_block_ptr(du + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0))
+
+        b_v = tl.load(p_v, boundary_check=(0, 1))
+        b_v_beta = (b_v * b_beta[:, None]).to(b_v.dtype)
+        b_du = tl.load(p_du, boundary_check=(0, 1))
+        b_dA += tl.dot(b_du, tl.trans(b_v_beta), allow_tf32=False)
+        b_dv_beta = tl.dot(b_A, b_du,
allow_tf32=False) + b_dv = b_dv_beta * b_beta[:, None] + b_dbeta += tl.sum(b_dv_beta * b_v, 1) + + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dw = tl.make_block_ptr(dw + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dw = tl.make_block_ptr(dw + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_k_beta = (b_k * b_beta[:, None]).to(b_k.dtype) + b_dw = tl.load(p_dw, boundary_check=(0, 1)) + b_dA += tl.dot(b_dw, tl.trans(b_k_beta), allow_tf32=False) + b_dk_beta = tl.dot(b_A, b_dw, allow_tf32=False) + b_dk = b_dk_beta * b_beta[:, None] + b_dbeta += tl.sum(b_dk_beta * b_k, 1) + + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + b_dA = tl.where(tl.arange(0, BT)[:, None] > tl.arange(0, BT)[None, :], b_dA, 0) + b_dA = tl.dot(b_dA.to(b_A.dtype), b_A) + b_dA = tl.dot(b_A, b_dA.to(b_A.dtype)) + b_dA = tl.where(tl.arange(0, BT)[:, None] > tl.arange(0, BT)[None, :], -b_dA, 0).to(k.dtype.element_ty) + + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_dk = tl.load(p_dk, boundary_check=(0, 1)) + b_k_beta = (b_k * b_beta[:, None]).to(b_k.dtype) + + b_dk_beta = tl.dot(b_dA, b_k, allow_tf32=False) + b_dbeta += tl.sum(b_dk_beta * b_k, 1) + b_dk += tl.dot(tl.trans(b_dA), b_k_beta, allow_tf32=False) + b_dk += b_dk_beta * b_beta[:, None] + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + if HEAD_FIRST: + p_dbeta = tl.make_block_ptr(dbeta + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + else: + p_dbeta = tl.make_block_ptr(dbeta + bos * H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + tl.store(p_dbeta, b_dbeta.to(p_dbeta.dtype.element_ty), boundary_check=(0,)) + + +def fwd_prepare_wy_repr( + k: torch.Tensor, + v: torch.Tensor, + beta: torch.Tensor, + offsets: Optional[torch.LongTensor], + indices: Optional[torch.LongTensor], + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K = k.shape + else: + B, T, H, K = k.shape + BT = min(chunk_size, max(triton.next_power_of_2(T), 16)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BC = min(BT, 32) + BK = min(triton.next_power_of_2(K), 64) + + u = torch.empty_like(v) + w = torch.empty_like(k) + A = torch.empty(B, *((H, T) if head_first else (T, 
H)), BT, device=k.device, dtype=k.dtype)
+    fwd_fn = fwd_prepare_wy_repr_kernel_chunk64 if BT == 64 else fwd_prepare_wy_repr_kernel_chunk32
+    fwd_fn[(NT, B * H)](
+        k=k,
+        beta=beta,
+        A=A,
+        offsets=offsets,
+        indices=indices,
+        T=T,
+        H=H,
+        K=K,
+        BT=BT,
+        BK=BK,
+        BC=BC,
+        HEAD_FIRST=head_first
+    )
+    w, u = fwd_recompute_w_u(
+        k=k,
+        v=v,
+        beta=beta,
+        A=A,
+        offsets=offsets,
+        indices=indices,
+        head_first=head_first,
+        chunk_size=chunk_size
+    )
+    return w, u, A
+
+
+def fwd_prepare_T(
+    k: torch.Tensor,
+    beta: torch.Tensor,
+    offsets: Optional[torch.LongTensor],
+    indices: Optional[torch.LongTensor],
+    head_first: bool,
+    chunk_size: int
+) -> torch.Tensor:
+    if head_first:
+        B, H, T, K = k.shape
+    else:
+        B, T, H, K = k.shape
+    BT = min(chunk_size, max(triton.next_power_of_2(T), 16))
+    assert BT in [16, 32, 64]
+    if offsets is None:
+        NT = triton.cdiv(T, BT)
+    else:
+        if indices is None:
+            indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()])
+            indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets)
+        NT = len(indices)
+    BC = min(BT, 32)
+
+    BK = min(triton.next_power_of_2(K), 64)
+    A = torch.empty(B, *((H, T) if head_first else (T, H)), BT, device=k.device, dtype=k.dtype)
+    fwd_fn = fwd_prepare_wy_repr_kernel_chunk64 if BT == 64 else fwd_prepare_wy_repr_kernel_chunk32
+    fwd_fn[(NT, B * H)](
+        k=k,
+        beta=beta,
+        A=A,
+        offsets=offsets,
+        indices=indices,
+        T=T,
+        H=H,
+        K=K,
+        BT=BT,
+        BK=BK,
+        BC=BC,
+        HEAD_FIRST=head_first
+    )
+    return A
+
+
+def fwd_recompute_w_u(
+    k: torch.Tensor,
+    v: torch.Tensor,
+    beta: torch.Tensor,
+    A: torch.Tensor,
+    offsets: Optional[torch.LongTensor],
+    indices: Optional[torch.LongTensor],
+    head_first: bool,
+    chunk_size: int
+) -> Tuple[torch.Tensor, torch.Tensor]:
+    if head_first:
+        B, H, T, K, V = *k.shape, v.shape[-1]
+    else:
+        B, T, H, K, V = *k.shape, v.shape[-1]
+    BT = min(chunk_size, max(triton.next_power_of_2(T), 16))
+    if offsets is None:
+        NT = triton.cdiv(T, BT)
+    else:
+        if indices is None:
+            indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()])
+            indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets)
+        NT = len(indices)
+    BK = min(triton.next_power_of_2(K), 64)
+    BV = min(triton.next_power_of_2(V), 64)
+
+    u = torch.empty_like(v)
+    w = torch.empty_like(k)
+    fwd_recompute_w_u_kernel[(NT, B*H)](
+        k,
+        v,
+        beta,
+        w,
+        u,
+        A,
+        offsets=offsets,
+        indices=indices,
+        T=T,
+        H=H,
+        K=K,
+        V=V,
+        BT=BT,
+        BK=BK,
+        BV=BV,
+        HEAD_FIRST=head_first
+    )
+    return w, u
+
+
+def fwd_recompute_w(
+    k: torch.Tensor,
+    beta: torch.Tensor,
+    A: torch.Tensor,
+    offsets: Optional[torch.LongTensor],
+    indices: Optional[torch.LongTensor],
+    head_first: bool,
+    chunk_size: int
+) -> torch.Tensor:
+    if head_first:
+        B, H, T, K = k.shape
+    else:
+        B, T, H, K = k.shape
+    BT = min(chunk_size, max(triton.next_power_of_2(T), 16))
+    if offsets is None:
+        NT = triton.cdiv(T, BT)
+    else:
+        if indices is None:
+            indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()])
+            indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets)
+        NT = len(indices)
+    BK = min(triton.next_power_of_2(K), 64)
+
+    w = torch.empty_like(k)
+    fwd_recompute_w_kernel[(NT, B*H)](
+        k,
+        beta,
+        w,
+        A,
+        offsets=offsets,
+        indices=indices,
+        T=T,
+        H=H,
+        K=K,
+        BT=BT,
+        BK=BK,
+        HEAD_FIRST=head_first
+    )
+    return w
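+
+
+# Usage sketch for the helpers above (illustrative only; head-first layout and a
+# chunk size of 64 assumed, with `k`, `v`, `beta` shaped as elsewhere in this file):
+#   A = fwd_prepare_T(k, beta, None, None, True, 64)
+#   w, u = fwd_recompute_w_u(k, v, beta, A, None, None, True, 64)
+# Per chunk, `w = A @ (beta * k)` and `u = A @ (beta * v)`, so both are cheap to
+# rebuild from `A` on demand (e.g., in a backward pass) rather than being stored.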
+
+
+def bwd_prepare_wy_repr(
+    k: torch.Tensor,
+    v: torch.Tensor,
+    beta: torch.Tensor,
+    A: torch.Tensor,
+    dw: torch.Tensor,
+    du: torch.Tensor,
+    offsets: Optional[torch.LongTensor],
+    indices: Optional[torch.LongTensor],
+    head_first: bool,
+    chunk_size: int
+) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]:
+    if head_first:
+        B, H, T, K, V = *k.shape, v.shape[-1]
+    else:
+        B, T, H, K, V = *k.shape, v.shape[-1]
+    BT = min(chunk_size, max(triton.next_power_of_2(T), 16))
+    if offsets is None:
+        NT = triton.cdiv(T, BT)
+    else:
+        if indices is None:
+            indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()])
+            indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets)
+        NT = len(indices)
+    BK = min(triton.next_power_of_2(K), 64)
+    BV = min(triton.next_power_of_2(V), 64)
+
+    dk = torch.empty_like(k)
+    dv = torch.empty_like(v)
+    dbeta = torch.empty_like(beta)
+    bwd_prepare_wy_repr_kernel[(NT, B * H)](
+        k,
+        v,
+        beta,
+        A,
+        dw,
+        du,
+        dk,
+        dv,
+        dbeta,
+        offsets=offsets,
+        indices=indices,
+        T=T,
+        H=H,
+        K=K,
+        V=V,
+        BT=BT,
+        BK=BK,
+        BV=BV,
+        HEAD_FIRST=head_first
+    )
+    return dk, dv, dbeta
+
+
+class WYRepresentationPreparation(torch.autograd.Function):
+
+    @staticmethod
+    @contiguous
+    @autocast_custom_fwd
+    def forward(
+        ctx,
+        k: torch.Tensor,
+        v: torch.Tensor,
+        beta: torch.Tensor,
+        offsets: Optional[torch.LongTensor] = None,
+        head_first: bool = True,
+        chunk_size: int = 64
+    ):
+        assert chunk_size in [16, 32, 64]
+        # 2-d indices denoting the offsets of chunks in each sequence
+        # for example, if the passed `offsets` is [0, 100, 356] and `chunk_size` is 64,
+        # then there are 2 and 4 chunks in the 1st and 2nd sequences respectively, and `indices` will be
+        # [[0, 0], [0, 1], [1, 0], [1, 1], [1, 2], [1, 3]]
+        indices = None
+        if offsets is not None:
+            indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], chunk_size).tolist()])
+            indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets)
+
+        w, u, A = fwd_prepare_wy_repr(
+            k=k,
+            v=v,
+            beta=beta,
+            offsets=offsets,
+            indices=indices,
+            head_first=head_first,
+            chunk_size=chunk_size
+        )
+        ctx.offsets = offsets
+        ctx.indices = indices
+        ctx.head_first = head_first
+        ctx.chunk_size = chunk_size
+        ctx.save_for_backward(k, v, beta, A)
+        return w, u
+
+    @staticmethod
+    @contiguous
+    @autocast_custom_bwd
+    def backward(
+        ctx,
+        dw: torch.Tensor,
+        du: torch.Tensor
+    ):
+        k, v, beta, A = ctx.saved_tensors
+        dk, dv, dbeta = bwd_prepare_wy_repr(
+            k=k,
+            v=v,
+            beta=beta,
+            A=A,
+            dw=dw,
+            du=du,
+            offsets=ctx.offsets,
+            indices=ctx.indices,
+            head_first=ctx.head_first,
+            chunk_size=ctx.chunk_size
+        )
+        return dk, dv, dbeta, None, None, None
+
+
+prepare_wy_repr = WYRepresentationPreparation.apply
+
+
+def naive(k, v, beta, chunk_size):
+    l_org = k.shape[2]
+    l_new = triton.next_power_of_2(l_org)
+    # pad k, v, beta
+    k = torch.cat([k, torch.zeros_like(k)[:, :, :l_new-l_org, :]], dim=2)
+    v = torch.cat([v, torch.zeros_like(v)[:, :, :l_new-l_org, :]], dim=2)
+    beta = torch.cat([beta, torch.zeros_like(beta)[:, :, :l_new-l_org]], dim=2)
+
+    k, v = map(lambda x: rearrange(x, 'b h (n c) d -> b h n c d', c=chunk_size), (k, v))
+    # k = torch.nn.functional.normalize(k, dim=-1, p=2)
+    beta = rearrange(beta, 'b h (n c) -> b h n c', c=chunk_size)
+    mask = torch.triu(torch.ones(chunk_size, chunk_size, dtype=torch.bool, device=k.device), diagonal=0)
+    k_beta = k * beta[..., None]
+    v = v * beta[..., None]
+    attn = (k @ k.transpose(-1, -2)).masked_fill_(mask, 0)
+    attn = attn * beta[..., None]
+    x = attn @ v
+
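+    # The loop below is a forward substitution: it solves
+    #     (I + tril(diag(beta) K K^T, -1)) o  = diag(beta) K
+    #     (I + tril(diag(beta) K K^T, -1)) o2 = x
+    # one row at a time, mirroring what the fused kernels above compute per chunk.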
+    o = torch.zeros_like(k)
+    o2 = torch.zeros_like(v)
+
+    o[..., 0, :] = k_beta[..., 0, :].clone()
+    o2[..., 0, :] = x[..., 0, :].clone()
+    for i in range(1, chunk_size):
+        o_i = (o[..., :i, :]).clone()
+        o[..., i, :] = -(attn[..., i, :i, None] * o_i).sum(3) + k_beta[..., i, :]
+        o2_i = (o2[..., :i, :]).clone()
+        o2[..., i, :] = -(attn[..., i, :i, None] * o2_i).sum(3) + x[..., i, :]
+    return map(lambda x: rearrange(x, 'b h n c d -> b h (n c) d')[:, :, :l_org], (o, v-o2))
+
+
+if __name__ == "__main__":
+    torch.set_default_dtype(torch.bfloat16)
+    seq_len = 1024
+    b = 4
+    h = 4
+    k = torch.nn.functional.normalize(torch.randn(b, h, seq_len, 128), dim=-1, p=2)
+    v = torch.randn(b, h, seq_len, 128)
+    beta = torch.rand(b, h, seq_len).sigmoid()
+    # beta = torch.ones(b, h, seq_len)
+    require_grad = True
+
+    k, v, beta = map(lambda x: x.cuda().requires_grad_(require_grad), (k, v, beta))
+    do = torch.rand_like(k)
+    do2 = torch.rand_like(v)
+
+    o1, o2 = naive(k.clone(), v.clone(), beta.clone(), 64)
+    if require_grad:
+        o1.backward(do, retain_graph=True)
+        o2.backward(do2, retain_graph=True)
+        k_grad2, v_grad2, beta_grad2 = k.grad, v.grad, beta.grad
+        k.grad = v.grad = beta.grad = None
+    o3, o4 = prepare_wy_repr(k.clone(), v.clone(), beta.clone(), None, True, 64)
+    print((o1-o3).abs().max())
+    print((o2-o4).abs().max())
+
+    if require_grad:
+        o3.backward(do, retain_graph=True)
+        o4.backward(do2, retain_graph=True)
+        k_grad, v_grad, beta_grad = k.grad, v.grad, beta.grad
+        print((k_grad2-k_grad).abs().max())
+        print((v_grad2-v_grad).abs().max())
+        print((beta_grad2-beta_grad).abs().max())
diff --git a/fla/ops/generalized_delta_rule/__init__.py b/fla/ops/generalized_delta_rule/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/fla/ops/generalized_delta_rule/iplr/__init__.py b/fla/ops/generalized_delta_rule/iplr/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391
diff --git a/fla/ops/generalized_delta_rule/iplr/fused_recurrent.py b/fla/ops/generalized_delta_rule/iplr/fused_recurrent.py
new file mode 100644
index 0000000000000000000000000000000000000000..1cbffdbcf586e286cf01bb7988530fa64718ef67
--- /dev/null
+++ b/fla/ops/generalized_delta_rule/iplr/fused_recurrent.py
@@ -0,0 +1,349 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from typing import Tuple
+
+import torch
+import triton
+import triton.language as tl
+
+from fla.utils import contiguous
+
+
+@triton.jit
+def fused_recurrent_fwd_kernel(
+    # B: batch_size, H: n_heads, T: seq_len, D: d_head
+    q,  # query [B, H, L, K]
+    k,  # key [B, H, L, K]
+    v,  # value [B, H, L, V]
+    alpha,  # alpha [B, H, L, K]
+    beta,  # beta [B, H, L, K]
+    o,  # output [B, H, L, V]
+    ha,  # tmp variable [B, H, L, V] for storing intermediate results of (h * alpha[None, :]).sum(0)
+    h0,
+    ht,  # final hidden state [B, H, K, V]
+    s_k_h,  # stride size: L * K
+    s_v_h,  # stride size: L * V
+    scale,  # K ** -0.5
+    B,  # batch size
+    H,  # n_heads
+    T,  # seq_len
+    K: tl.constexpr,  # K
+    V: tl.constexpr,  # V
+    BK: tl.constexpr,  # BLOCK SIZE along the K dimension
+    BV: tl.constexpr,  # BLOCK SIZE along the V dimension
+    USE_INITIAL_STATE: tl.constexpr,  # whether to use initial state
+    STORE_FINAL_STATE: tl.constexpr,  # whether to store final state
+):
+
+    # indices
+    i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+
+    p_q = q + i_bh * s_k_h + i_k * BK + tl.arange(0, BK)
+    p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK)
+    p_v = v + i_bh * s_v_h + i_v * BV + tl.arange(0, BV)
+    p_alpha = alpha + i_bh * s_k_h + i_k * BK + tl.arange(0, BK)
+    p_beta = beta + i_bh * s_k_h + i_k * BK + tl.arange(0, BK)
+    p_o = o + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV)
+    p_ha = ha + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV)
+
+    mask_bk = (i_k * BK + tl.arange(0, BK)) < K
+    mask_bv = (i_v * BV + tl.arange(0, BV)) < V
+    mask_kv = mask_bk[None, :] & mask_bv[:, None]
+
+    h = tl.zeros([BV, BK], dtype=tl.float32)
+
+    if USE_INITIAL_STATE:
+        p_h0 = h0 + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None])
+        h += tl.load(p_h0, mask=mask_kv, other=0).to(tl.float32)
+
+    for _ in range(0, T):
+        b_k = tl.load(p_k, mask=mask_bk, other=0).to(tl.float32)
+        b_v = tl.load(p_v, mask=mask_bv, other=0).to(tl.float32)
+        b_q = tl.load(p_q, mask=mask_bk, other=0).to(tl.float32) * scale
+        b_alpha = tl.load(p_alpha, mask=mask_bk, other=0).to(tl.float32)
+        b_beta = tl.load(p_beta, mask=mask_bk, other=0).to(tl.float32)
+        # to store
+        tmp = tl.sum(h * b_alpha[None, :], axis=1)
+        h += (tmp[:, None] * b_beta[None, :] + b_k[None, :] * b_v[:, None])
+        _o = h * b_q[None, :]
+        _o = tl.sum(_o, axis=1)
+        tl.store(p_o, _o.to(p_o.dtype.element_ty), mask=mask_bv)
+        tl.store(p_ha, tmp.to(p_ha.dtype.element_ty), mask=mask_bv)
+        p_q += K
+        p_k += K
+        p_o += V
+        p_v += V
+        p_ha += V
+        p_alpha += K
+        p_beta += K
+
+    if STORE_FINAL_STATE:
+        p_ht = ht + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None])
+        tl.store(p_ht, h.to(p_ht.dtype.element_ty), mask=mask_kv)
+
+
+# Similar to Algorithm1 of https://arxiv.org/abs/2006.16236
+@triton.jit
+def fused_recurrent_bwd_kernel(
+    # B: batch_size, H: n_heads, T: seq_len, D: d_head
+    # NV: number of split in the V dimension.
NK: number of split in the K dimension + q, # query [B, H, L, K] + k, # key [B, H, L, V] + v, # value [B, H, L, V] + alpha, # alpha [B, H, L, K] + beta, # beta [B, H, L, K] + ha, # ha [B, H, L, V] + dht, # gradient of final state [B, H, K, V] + dh0, # gradient of initial state [B, H, K, V] + do, # gradient of output [B, H, L, V] + dq, # gradient of query [NV, B, H, L, K] + dk, # gradient of key [NV, B, H, L, K] + dv, # gradient of value [NK, B, H, L, V] + dalpha, # gradient of alpha [NV, B, H, L, K] + dbeta, # gradient of beta [NV, B, H, L, K] + dha, # gradient of ha [NK, B, H, L, V] + h0, # initial state [B, H, K, V] + s_k_h, # stride size: L * K + s_v_h, # stride size: L * V + NK, # NK block size + scale, # K ** -0.5 + B, # batch_size + H, # n_heads + T, # seq_len + K: tl.constexpr, # K + V: tl.constexpr, # V + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension + USE_INITIAL_STATE: tl.constexpr, # whether to use initial state h0 + USE_DH0: tl.constexpr, # whether to use dh0 + USE_DHT: tl.constexpr, # whether to use dht +): + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + mask_bk = i_k * BK + tl.arange(0, BK) < K + mask_bv = i_v * BV + tl.arange(0, BV) < V + + p_q = q + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (T - 1) * K + p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (T - 1) * K + p_do = do + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + (T - 1) * V + p_v = v + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + (T - 1) * V + p_ha = ha + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + (T - 1) * V + p_alpha = alpha + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (T - 1) * K + p_beta = beta + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (T - 1) * K + + p_dk = dk + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + (T - 1) * K + p_dbeta = dbeta + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + (T - 1) * K + p_dv = dv + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV) + (T - 1) * V + p_dha = dha + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV) + (T - 1) * V + d_h = tl.zeros([BK, BV], dtype=tl.float32) + + if USE_DHT: + p_ht = dht + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + d_h += tl.load(p_ht, mask=mask_bk[:, None] & mask_bv[None, :], other=0).to(tl.float32) + + for _ in range(T): + b_q = tl.load(p_q, mask=mask_bk, other=0).to(tl.float32) * scale + b_k = tl.load(p_k, mask=mask_bk, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_bv, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask_bv, other=0).to(tl.float32) + b_beta = tl.load(p_beta, mask=mask_bk, other=0).to(tl.float32) + b_alpha = tl.load(p_alpha, mask=mask_bk, other=0).to(tl.float32) + b_ha = tl.load(p_ha, mask=mask_bv, other=0).to(tl.float32) + + d_h += b_q[:, None] * b_do[None, :] + d_k = tl.sum(d_h * b_v[None, :], axis=1) + d_v = tl.sum(d_h * b_k[:, None], axis=0) + tl.store(p_dk, d_k.to(p_dk.dtype.element_ty), mask=mask_bk) + tl.store(p_dv, d_v.to(p_dv.dtype.element_ty), mask=mask_bv) + + b_dha = tl.sum(d_h * b_beta[:, None], axis=0) + tl.store(p_dha, b_dha.to(p_dha.dtype.element_ty), mask=mask_bv) + b_dbeta = tl.sum(d_h * b_ha[None, :], axis=1) + tl.store(p_dbeta, b_dbeta.to(p_dbeta.dtype.element_ty), mask=mask_bk) + + d_h += b_dha[None, :] * b_alpha[:, None] + p_do -= V + p_q -= K + p_k -= K + p_v -= V + p_dk -= K + p_dv -= V + p_beta -= K + p_dbeta -= K + p_alpha -= K + p_dha -= V + p_ha -= V + + if USE_DH0: + p_dh0 = dh0 + i_bh * K 
* V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + tl.store(p_dh0, d_h.to(p_dh0.dtype.element_ty), mask=mask_bk[:, None] & mask_bv[None, :]) + + tl.debug_barrier() + + h = tl.zeros([BK, BV], dtype=tl.float32) + p_q = q + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + p_beta = beta + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + p_v = v + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + p_ha = ha + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + p_do = do + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + p_dq = dq + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + p_dv = dv + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV) + p_dha = dha + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV) + p_alpha = alpha + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + p_dalpha = dalpha + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + + if USE_INITIAL_STATE: + mask_kv = mask_bk[:, None] & mask_bv[None, :] + p_h0 = h0 + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + h += tl.load(p_h0, mask=mask_kv, other=0).to(tl.float32) + + for i in range(0, T): + d_ha = tl.load(p_dha, mask=mask_bv, other=0).to(tl.float32) + d_alpha = tl.sum(d_ha[None, :] * h, axis=1) + tl.store(p_dalpha, d_alpha.to(p_dalpha.dtype.element_ty), mask=mask_bk) + b_k = tl.load(p_k, mask=mask_bk, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_bv, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask_bv, other=0).to(tl.float32) + b_beta = tl.load(p_beta, mask=mask_bk, other=0).to(tl.float32) + b_ha = tl.load(p_ha, mask=mask_bv, other=0).to(tl.float32) + h += b_k[:, None] * b_v[None, :] + b_beta[:, None] * b_ha[None, :] + _d_q = h * b_do[None, :] + d_q = tl.sum(_d_q, axis=1) * scale + tl.store(p_dq, d_q.to(p_dq.dtype.element_ty), mask=mask_bk) + + p_k += K + p_do += V + p_v += V + p_dk += K + p_dalpha += K + p_dha += V + p_ha += V + p_dq += K + p_beta += K + + +class FusedRecurrentFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward(ctx, q, k, v, alpha, beta, scale=None, initial_state=None, output_final_state=False): + B, H, T, K, V = *q.shape, v.shape[-1] + BK, BV = triton.next_power_of_2(K), min(triton.next_power_of_2(V), 8) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + num_stages = 1 + num_warps = 1 + assert NK == 1, "NK > 1 is not supported yet" + o = q.new_empty(NK, B, H, T, V) + + if output_final_state: + final_state = q.new_empty(B, H, K, V, dtype=torch.float32) + else: + final_state = None + + ha = torch.empty_like(v, dtype=torch.float32) + + grid = (NV, NK, B * H) + fused_recurrent_fwd_kernel[grid]( + q, k, v, alpha, beta, o, ha, initial_state, final_state, + q.stride(1), + v.stride(1), + scale, + B=B, H=H, T=T, K=K, V=V, + BK=BK, BV=BV, + USE_INITIAL_STATE=initial_state is not None, + STORE_FINAL_STATE=final_state is not None, + num_warps=num_warps, + num_stages=num_stages, + ) + o = o.squeeze(0) + ctx.save_for_backward(q, k, v, alpha, beta, ha, initial_state) + ctx.scale = scale + return o, final_state + + @staticmethod + @contiguous + def backward(ctx, do, dht): + q, k, v, alpha, beta, ha, initial_state = ctx.saved_tensors + B, H, T, K, V = *q.shape, v.shape[-1] + scale = ctx.scale + BK, BV = triton.next_power_of_2(K), min(triton.next_power_of_2(V), 32) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + assert NK == 1, "NK > 1 is not supported yet" + num_stages = 1 + num_warps = 2 + + dq = 
q.new_empty(NV, B, H, T, K)
+        dk = k.new_empty(NV, B, H, T, K)
+        dalpha = alpha.new_empty(NV, B, H, T, K)
+        dbeta = beta.new_empty(NV, B, H, T, K)
+        dv = v.new_empty(NK, B, H, T, V)
+        dha = ha.new_empty(NK, B, H, T, V)
+
+        grid = (NV, NK, B * H)
+
+        if initial_state is not None and initial_state.requires_grad:
+            dh0 = torch.empty_like(initial_state, dtype=torch.float32)
+        else:
+            dh0 = None
+
+        fused_recurrent_bwd_kernel[grid](
+            q, k, v, alpha, beta, ha, dht, dh0, do, dq, dk, dv, dalpha, dbeta, dha, initial_state,
+            q.stride(1),
+            v.stride(1),
+            NK, scale,
+            B=B, H=H, T=T, K=K, V=V,
+            BK=BK, BV=BV,
+            USE_INITIAL_STATE=initial_state is not None,
+            USE_DH0=dh0 is not None,
+            USE_DHT=dht is not None,
+            num_warps=num_warps,
+            num_stages=num_stages
+        )
+        dq = dq.sum(0)
+        dk = dk.sum(0)
+        dv = dv.sum(0)
+        dalpha = dalpha.sum(0)
+        dbeta = dbeta.sum(0)
+        return dq.to(q), dk.to(k), dv.to(v), dalpha.to(alpha), dbeta.to(beta), None, dh0, None
+
+
+def fused_recurrent_iplr(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    alpha: torch.Tensor,
+    beta: torch.Tensor,
+    scale: float = None,
+    initial_state: torch.Tensor = None,
+    output_final_state: bool = False
+) -> Tuple[torch.Tensor, torch.Tensor]:
+    r"""
+    This function computes the recurrence S_t = S_{t-1} @ (I + alpha_t beta_t^T) + v_t k_t^T in a recurrent manner.
+    Since the transition matrix is identity-plus-low-rank, we refer to this as the Identity-Plus-Low-Rank (IPLR) formulation.
+
+    Args:
+        q (torch.Tensor):
+            queries of shape `[B, H, T, K]`
+        k (torch.Tensor):
+            keys of shape `[B, H, T, K]`
+        v (torch.Tensor):
+            values of shape `[B, H, T, V]`
+        alpha (torch.Tensor):
+            alphas of shape `[B, H, T, K]`
+        beta (torch.Tensor):
+            betas of shape `[B, H, T, K]`
+        scale (Optional[float]):
+            Scale factor for attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[torch.Tensor]):
+            Initial state of shape `[B, H, K, V]`. Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state of shape `[B, H, K, V]`. Default: `False`.
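+
+    Example (a minimal usage sketch; shapes are illustrative, and alpha/beta are
+    chosen as in the accompanying naive test):
+        >>> import torch
+        >>> B, H, T, K, V = 2, 4, 128, 64, 64
+        >>> q = torch.randn(B, H, T, K, device='cuda')
+        >>> k = torch.randn(B, H, T, K, device='cuda')
+        >>> v = torch.randn(B, H, T, V, device='cuda')
+        >>> alpha = -torch.randn(B, H, T, K, device='cuda').softmax(-1)
+        >>> beta = torch.randn(B, H, T, K, device='cuda').softmax(-1)
+        >>> o, ht = fused_recurrent_iplr(q, k, v, alpha, beta, output_final_state=True)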
+ """ + if scale is None: + scale = q.shape[-1] ** -0.5 + else: + assert scale > 0, "scale must be positive" + o, final_state = FusedRecurrentFunction.apply(q, k, v, alpha, beta, scale, initial_state, output_final_state) + return o, final_state diff --git a/fla/ops/generalized_delta_rule/iplr/naive.py b/fla/ops/generalized_delta_rule/iplr/naive.py new file mode 100644 index 0000000000000000000000000000000000000000..1123d4e2a719c49d4bc07384dcf51bad23094464 --- /dev/null +++ b/fla/ops/generalized_delta_rule/iplr/naive.py @@ -0,0 +1,100 @@ +# -*- coding: utf-8 -*- + +import torch +from einops import rearrange + + +# S_t = S_t @ (I + alpha_t beta_t^T) + v_t k_t^T +# q, k, alpha, beta [B, H, L, D_K] +# v [B, H, L, D_V] +def iplr_recurrence(q, k, v, alpha, beta, initial_state=None, output_final_state=True): + orig_dtype = q.dtype + b, h, l, d_k = q.shape + q, k, v, beta = map(lambda x: x.float(), [q, k, v, beta]) + d_v = v.shape[-1] + o = torch.zeros_like(v) + S = torch.zeros(b, h, d_k, d_v).to(v) + q = q * (d_k ** -0.5) + + if initial_state is not None: + S += initial_state + + for i in range(l): + _k = k[:, :, i] + _q = q[:, :, i] + _v = v[:, :, i] + _alpha = alpha[:, :, i] + _beta = beta[:, :, i] + _kv = _k[..., None] * _v[..., None, :] + (S.clone() * _alpha[..., None]).sum(-2, keepdim=True) * _beta[..., None] + S = S + _kv + o[:, :, i] = torch.einsum('bhd,bhdm->bhm', _q, S) + S = None if output_final_state is False else S + return o.to(orig_dtype), S + + +def iplr_chunkwise(q, k, v, alpha, beta, initial_state=None, output_final_state=True, chunk_size=32): + b, h, l, d_k = q.shape + d_v = v.shape[-1] + q = q * (d_k ** -0.5) + v = v + assert l % chunk_size == 0 + + if initial_state is not None: + S += initial_state + + # note that diagonal is masked. + mask = torch.triu(torch.ones(chunk_size, chunk_size, dtype=torch.bool, device=q.device), diagonal=0) + q, k, v, alpha, beta = map(lambda x: rearrange(x, 'b h (n c) d -> b h n c d', c=chunk_size), [q, k, v, alpha, beta]) + + v2 = (alpha @ k.transpose(-1, -2)).masked_fill_(mask, 0) @ v + attn = (alpha @ beta.transpose(-1, -2)).masked_fill(mask, 0) + for i in range(1, chunk_size): + attn[..., i, :i] = attn[..., i, :i] + (attn[..., i, :, None].clone() * attn[..., :, :i].clone()).sum(-2) + + attn = attn + torch.eye(chunk_size, dtype=torch.float, device=q.device) + u = attn @ v2 + w = attn @ alpha + S = k.new_zeros(b, h, d_k, d_v) + o = torch.zeros_like(v) + mask = torch.triu(torch.ones(chunk_size, chunk_size, dtype=torch.bool, device=q.device), diagonal=1) + for i in range(0, l // chunk_size): + q_i, k_i, v_i, u_i, w_i, beta_i = q[:, :, i], k[:, :, i], v[:, :, i], u[:, :, i], w[:, :, i], beta[:, :, i] + o_1 = (q_i @ k_i.transpose(-1, -2)).masked_fill_(mask, 0) @ v_i + v2_i = u_i + w_i @ S + o_2 = (q_i @ beta_i.transpose(-1, -2)).masked_fill_(mask, 0) @ (v2_i) + o_3 = q_i @ S + o[:, :, i] = o_1 + o_2 + o_3 + S = S + k_i.transpose(-1, -2) @ v_i + beta_i.transpose(-1, -2) @ v2_i + S = None if output_final_state is False else S + return rearrange(o, 'b h n c d -> b h (n c) d'), S + + +if __name__ == '__main__': + B = 2 + H = 4 + L = 128 + DK = 128 + DV = 128 + q = (torch.randn(B, H, L, DK)).cuda().requires_grad_(True) + k = (torch.randn(B, H, L, DK)).cuda().requires_grad_(True) + v = (torch.randn(B, H, L, DV)).cuda().requires_grad_(True) + alpha = torch.randn(B, H, L, DK).cuda().softmax(-1).requires_grad_(True) + beta = torch.randn(B, H, L, DK).cuda().softmax(-1).requires_grad_(True) + + o, s = iplr_recurrence(q, k, v, -alpha, beta) + do = 
torch.randn_like(o).cuda() + o.backward(do, retain_graph=True) + q_grad, q.grad = q.grad, None + k_grad, k.grad = k.grad, None + v_grad, v.grad = v.grad, None + beta_grad, beta.grad = beta.grad, None + + o2, s2 = iplr_chunkwise(q, k, v, -alpha, beta) + o2.backward(do) + assert torch.allclose(o, o2, atol=1e-4), breakpoint() + assert torch.allclose(s, s2, atol=1e-4), breakpoint() + assert torch.allclose(q.grad, q_grad, atol=1e-4), breakpoint() + assert torch.allclose(k.grad, k_grad, atol=1e-4), breakpoint() + assert torch.allclose(v.grad, v_grad, atol=1e-4), breakpoint() + assert torch.allclose(beta.grad, beta_grad, atol=1e-4), breakpoint() + print("All passed!") diff --git a/fla/ops/gla/__init__.py b/fla/ops/gla/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..367c85442a26fe56516716622433f8b6f87afd2c --- /dev/null +++ b/fla/ops/gla/__init__.py @@ -0,0 +1,11 @@ +# -*- coding: utf-8 -*- + +from .chunk import chunk_gla +from .fused_chunk import fused_chunk_gla +from .fused_recurrent import fused_recurrent_gla + +__all__ = [ + 'chunk_gla', + 'fused_chunk_gla', + 'fused_recurrent_gla' +] diff --git a/fla/ops/gla/chunk.py b/fla/ops/gla/chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..c3aecc20582492188e44ff8d13aa205f629c5de4 --- /dev/null +++ b/fla/ops/gla/chunk.py @@ -0,0 +1,1514 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.ops.common.chunk_h import chunk_bwd_dh, chunk_fwd_h +from fla.ops.utils import chunk_local_cumsum +from fla.utils import contiguous + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BC", "BK"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gla_fwd_A_kernel_intra_sub_inter( + q, + k, + g, + A, + offsets, + indices, + scale, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + NC: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_t, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + i_i, i_j = i_c // NC, i_c % NC + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + if i_t * BT + i_i * BC >= T: + return + if i_i <= i_j: + return + + b_A = tl.zeros([BC, BC], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + o_k = i_k * BK + tl.arange(0, BK) + m_k = o_k < K + + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_g = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT + i_j * BC), (BK, BC), (0, 1)) + p_gk = tl.make_block_ptr(g + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT + i_j * BC), (BK, BC), (0, 1)) + p_gn = tl.max_contiguous(tl.multiple_of(g + (i_bh * T + i_t * BT + i_i * BC) * K + o_k, BK), BK) + else: + p_q = tl.make_block_ptr(q + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_g = 
tl.make_block_ptr(g + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_k = tl.make_block_ptr(k + (bos*H+i_h)*K, (K, T), (1, H*K), (i_k * BK, i_t * BT + i_j * BC), (BK, BC), (0, 1)) + p_gk = tl.make_block_ptr(g + (bos*H+i_h)*K, (K, T), (1, H*K), (i_k * BK, i_t * BT + i_j * BC), (BK, BC), (0, 1)) + p_gn = tl.max_contiguous(tl.multiple_of(g + (bos + i_t * BT + i_i * BC) * H*K + i_h * K + o_k, BK), BK) + + # [BK,] + b_gn = tl.load(p_gn, mask=m_k, other=0) + # [BC, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_g = tl.load(p_g, boundary_check=(0, 1)) + b_qg = b_q * tl.exp(b_g - b_gn[None, :]) * scale + # [BK, BC] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + b_kg = b_k * tl.exp(b_gn[:, None] - b_gk) + # [BC, BC] using tf32 to improve precision here. + b_A += tl.dot(b_qg, b_kg) + + if HEAD_FIRST: + p_A = tl.make_block_ptr(A + i_bh*T*BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + else: + p_A = tl.make_block_ptr(A + (bos*H + i_h)*BT, (T, BT), (H*BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + tl.store(p_A, b_A.to(A.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BK", "BT"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gla_fwd_A_kernel_intra_sub_intra( + q, + k, + g, + A, + offsets, + indices, + scale, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_t, i_i, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + i_j = i_i + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + if i_t * BT + i_i * BC >= T: + return + + o_i = tl.arange(0, BC) + o_k = tl.arange(0, BK) + m_k = o_k < K + m_A = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + if HEAD_FIRST: + o_A = i_bh * T*BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BT + i_j * BC + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, 0), (BC, BK), (1, 0)) + p_g = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, 0), (BC, BK), (1, 0)) + p_k = tl.max_contiguous(tl.multiple_of(k + (i_bh * T + i_t * BT + i_j * BC) * K + o_k, BK), BK) + p_gk = tl.max_contiguous(tl.multiple_of(g + (i_bh * T + i_t * BT + i_j * BC) * K + o_k, BK), BK) + else: + o_A = (bos + i_t * BT + i_i * BC + tl.arange(0, BC)) * H*BT + i_h * BT + i_j * BC + p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, 0), (BC, BK), (1, 0)) + p_g = tl.make_block_ptr(g + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, 0), (BC, BK), (1, 0)) + p_k = tl.max_contiguous(tl.multiple_of(k + (bos + i_t * BT + i_j * BC) * H*K + i_h * K + o_k, BK), BK) + p_gk = tl.max_contiguous(tl.multiple_of(g + (bos + i_t * BT + i_j * BC) * H*K + i_h * K + o_k, BK), BK) + + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_g = tl.load(p_g, boundary_check=(0, 1)) + for j in range(0, min(BC, T - i_t * BT - i_i * BC)): + b_k = tl.load(p_k, mask=m_k, other=0).to(tl.float32) + b_gk = tl.load(p_gk, 
mask=m_k, other=0).to(tl.float32) + b_A = tl.sum(b_q * b_k[None, :] * tl.exp(b_g - b_gk[None, :]), 1) + b_A = tl.where(o_i >= j, b_A * scale, 0.) + + tl.store(A + o_A + j, b_A, mask=m_A) + p_k += K if HEAD_FIRST else H*K + p_gk += K if HEAD_FIRST else H*K + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BC", "BK"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gla_fwd_A_kernel_intra_sub_intra_split( + q, + k, + g, + A, + offsets, + indices, + scale, + B: tl.constexpr, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + NC: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_tc, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + i_t, i_i = i_tc // NC, i_tc % NC + i_j = i_i + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + all = T + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + all = B * T + + if i_t * BT + i_i * BC >= T: + return + + o_i = tl.arange(0, BC) + o_k = i_k * BK + tl.arange(0, BK) + m_k = o_k < K + m_A = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + + if HEAD_FIRST: + o_A = (i_k * B*H + i_bh) * T * BC + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BC + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_g = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_k = tl.max_contiguous(tl.multiple_of(k + (i_bh * T + i_t * BT + i_j * BC) * K + o_k, BK), BK) + p_gk = tl.max_contiguous(tl.multiple_of(g + (i_bh * T + i_t * BT + i_j * BC) * K + o_k, BK), BK) + else: + o_A = (i_k * all + bos + i_t * BT + i_i * BC + tl.arange(0, BC)) * H*BC + i_h * BC + p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_g = tl.make_block_ptr(g + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_k = tl.max_contiguous(tl.multiple_of(k + (bos + i_t * BT + i_j * BC) * H*K + i_h * K + o_k, BK), BK) + p_gk = tl.max_contiguous(tl.multiple_of(g + (bos + i_t * BT + i_j * BC) * H*K + i_h * K + o_k, BK), BK) + + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_g = tl.load(p_g, boundary_check=(0, 1)) + for j in range(0, min(BC, T - i_t * BT - i_i * BC)): + b_A = tl.zeros([BC], dtype=tl.float32) + b_k = tl.load(p_k, mask=m_k, other=0).to(tl.float32) + b_gk = tl.load(p_gk, mask=m_k, other=0).to(tl.float32) + b_A += tl.sum(b_q * b_k[None, :] * tl.exp(b_g - b_gk[None, :]), 1) + b_A = tl.where(o_i >= j, b_A * scale, 0.) 
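+        # store the j-th column of this [BC, BC] intra-chunk attention block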
+ tl.store(A + o_A + j, b_A, mask=m_A) + p_k += K if HEAD_FIRST else H*K + p_gk += K if HEAD_FIRST else H*K + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BC"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gla_fwd_A_kernel_intra_sub_intra_merge( + A, + A2, + offsets, + indices, + B: tl.constexpr, + T: tl.constexpr, + H: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + NK: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_t, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + all = T + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + all = B * T + + if i_t * BT + i_c * BC >= T: + return + + b_A = tl.zeros([BC, BC], dtype=tl.float32) + for i_k in range(0, NK): + if HEAD_FIRST: + p_A = tl.make_block_ptr(A + (i_k*B*H+i_bh)*T*BC, (T, BC), (BC, 1), (i_t*BT + i_c*BC, 0), (BC, BC), (1, 0)) + else: + p_A = tl.make_block_ptr(A + (i_k*all+bos)*H*BC+i_h*BC, (T, BC), (H*BC, 1), (i_t*BT + i_c*BC, 0), (BC, BC), (1, 0)) + b_A += tl.load(p_A, boundary_check=(0, 1)) + if HEAD_FIRST: + p_A2 = tl.make_block_ptr(A2 + i_bh*T*BT, (T, BT), (BT, 1), (i_t * BT + i_c * BC, i_c * BC), (BC, BC), (1, 0)) + else: + p_A2 = tl.make_block_ptr(A2 + (bos*H+i_h)*BT, (T, BT), (H*BT, 1), (i_t * BT + i_c * BC, i_c * BC), (BC, BC), (1, 0)) + tl.store(p_A2, b_A.to(A2.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BK", "BV", "BT"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gla_fwd_kernel_o( + q, + v, + g, + h, + o, + A, + offsets, + indices, + scale, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_tg = i_t + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + else: + NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + + m_s = tl.arange(0, BT)[:, None] >= tl.arange(0, BT)[None, :] + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_g = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_h = tl.make_block_ptr(h + (i_bh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_g = tl.make_block_ptr(g + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_h = tl.make_block_ptr(h + (i_tg * H 
+ i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BT, BK] + b_g = tl.load(p_g, boundary_check=(0, 1)) + # [BT, BK] + b_qg = (b_q * tl.exp(b_g)).to(b_q.dtype) + # [BK, BV] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # works but dkw, owing to divine benevolence + # [BT, BV] + if i_k >= 0: + b_o += tl.dot(b_qg, b_h.to(b_qg.dtype)) + if HEAD_FIRST: + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_A = tl.make_block_ptr(A + i_bh * T*BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + else: + p_v = tl.make_block_ptr(v + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_A = tl.make_block_ptr(A + (bos * H + i_h) * BT, (T, BT), (H*BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BT] + b_A = tl.load(p_A, boundary_check=(0, 1)) + b_A = tl.where(m_s, b_A, 0.).to(b_v.dtype) + b_o += tl.dot(b_A, b_v, allow_tf32=False) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BK", "NC", "BT"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gla_bwd_kernel_intra( + q, + k, + g, + dA, + dq, + dk, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + NC: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + i_t, i_i = i_c // NC, i_c % NC + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + else: + bos, eos = i_b * T, i_b * T + T + T = eos - bos + if i_t * BT + i_i * BC >= T: + return + + o_k = i_k * BK + tl.arange(0, BK) + m_k = o_k < K + + if HEAD_FIRST: + p_g = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + else: + p_g = tl.make_block_ptr(g + (bos*H + i_h) * K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + # [BC, BK] + b_g = tl.load(p_g, boundary_check=(0, 1)) + b_dq = tl.zeros([BC, BK], dtype=tl.float32) + if i_i > 0: + if HEAD_FIRST: + p_gn = tl.max_contiguous(tl.multiple_of(g + (i_bh * T + i_t * BT + i_i * BC) * K + o_k, BK), BK) + else: + p_gn = tl.max_contiguous(tl.multiple_of(g + (bos + i_t * BT + i_i * BC) * H*K + i_h*K + o_k, BK), BK) + # [BK,] + b_gn = tl.load(p_gn, mask=m_k, other=0) + for i_j in range(0, i_i): + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_gk = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_dA = tl.make_block_ptr(dA + i_bh * T*BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + else: + p_k = tl.make_block_ptr(k+(bos*H+i_h)*K, (T, K), (H*K, 1), (i_t*BT+i_j*BC, 
i_k * BK), (BC, BK), (1, 0)) + p_gk = tl.make_block_ptr(g+(bos*H+i_h)*K, (T, K), (H*K, 1), (i_t*BT+i_j*BC, i_k * BK), (BC, BK), (1, 0)) + p_dA = tl.make_block_ptr(dA+(bos*H+i_h)*BT, (T, BT), (H*BT, 1), (i_t*BT+i_i*BC, i_j * BC), (BC, BC), (1, 0)) + # [BC, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + b_kg = (b_k * tl.exp(b_gn[None, :] - b_gk)) + # [BC, BC] + b_dA = tl.load(p_dA, boundary_check=(0, 1)) + # [BC, BK] + b_dq += tl.dot(b_dA, b_kg) + b_dq *= tl.exp(b_g - b_gn[None, :]) + + o_i = tl.arange(0, BC) + m_dA = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + if HEAD_FIRST: + o_dA = i_bh * T*BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BT + i_i * BC + p_kj = tl.max_contiguous(tl.multiple_of(k + (i_bh * T + i_t * BT + i_i * BC) * K + o_k, BK), BK) + p_gkj = tl.max_contiguous(tl.multiple_of(g + (i_bh * T + i_t * BT + i_i * BC) * K + o_k, BK), BK) + p_dq = tl.make_block_ptr(dq + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + else: + o_dA = bos*H*BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * H*BT + i_h * BT + i_i * BC + p_kj = tl.max_contiguous(tl.multiple_of(k + (bos + i_t * BT + i_i * BC) * H*K + i_h * K + o_k, BK), BK) + p_gkj = tl.max_contiguous(tl.multiple_of(g + (bos + i_t * BT + i_i * BC) * H*K + i_h * K + o_k, BK), BK) + p_dq = tl.make_block_ptr(dq + (bos*H + i_h) * K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + + for j in range(0, min(BC, T - i_t * BT - i_i * BC)): + # [BC,] + b_dA = tl.load(dA + o_dA + j, mask=m_dA, other=0) + # [BK,] + b_kj = tl.load(p_kj, mask=m_k, other=0).to(tl.float32) + b_gkj = tl.load(p_gkj, mask=m_k, other=0).to(tl.float32) + # [BC, BK] + m_i = o_i[:, None] >= j + # [BC, BK] + # (SY 09/17) important to not use bf16 here to have a good precision. + b_dq += tl.where(m_i, b_dA[:, None] * b_kj[None, :] * tl.exp(b_g - b_gkj[None, :]), 0.) 
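+        # move the per-step key/gate pointers forward by one time step within the chunk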
+ p_kj += K if HEAD_FIRST else H*K + p_gkj += K if HEAD_FIRST else H*K + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + + tl.debug_barrier() + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_gk = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos*H + i_h) * K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_gk = tl.make_block_ptr(g + (bos*H + i_h) * K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + + # [BC, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + b_dk = tl.zeros([BC, BK], dtype=tl.float32) + + NC = min(NC, tl.cdiv(T - i_t * BT, BC)) + if i_i < NC - 1: + if HEAD_FIRST: + p_gn = tl.max_contiguous(tl.multiple_of(g + i_bh*T*K + (i_t * BT + i_i * BC + BC - 1)*K + o_k, BK), BK) + else: + p_gn = tl.max_contiguous(tl.multiple_of(g + bos*H*K + (i_t * BT + i_i * BC + BC - 1)*H*K + i_h*K + o_k, BK), BK) + # [BK,] + b_gn = tl.load(p_gn, mask=m_k, other=0) + for i_j in range(i_i + 1, NC): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_g = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_dA = tl.make_block_ptr(dA + i_bh * T * BT, (BT, T), (1, BT), (i_i*BC, i_t*BT + i_j*BC), (BC, BC), (0, 1)) + else: + p_q = tl.make_block_ptr(q + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_g = tl.make_block_ptr(g + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_dA = tl.make_block_ptr(dA + (bos*H+i_h)*BT, (BT, T), (1, H*BT), (i_i*BC, i_t*BT + i_j*BC), (BC, BC), (0, 1)) + # [BC, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_g = tl.load(p_g, boundary_check=(0, 1)) + b_qg = (b_q * tl.exp(b_g - b_gn[None, :])) + # [BC, BC] + b_dA = tl.load(p_dA, boundary_check=(0, 1)) + # [BC, BK] + # (SY 09/17) important to not use bf16 here to have a good precision. + b_dk += tl.dot(b_dA, b_qg) + b_dk *= tl.exp(b_gn[None, :] - b_gk) + if HEAD_FIRST: + o_dA = i_bh * T * BT + (i_t * BT + i_i * BC) * BT + i_i * BC + tl.arange(0, BC) + p_qj = tl.max_contiguous(tl.multiple_of(q + (i_bh * T + i_t * BT + i_i * BC) * K + o_k, BK), BK) + p_gqj = tl.max_contiguous(tl.multiple_of(g + (i_bh * T + i_t * BT + i_i * BC) * K + o_k, BK), BK) + p_dk = tl.make_block_ptr(dk + i_bh*T*K, (T, K), (K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + else: + o_dA = bos*H*BT + (i_t * BT + i_i * BC) * H*BT + i_h * BT + i_i * BC + tl.arange(0, BC) + p_qj = tl.max_contiguous(tl.multiple_of(q + (bos + i_t * BT + i_i * BC) * H*K + i_h * K + o_k, BK), BK) + p_gqj = tl.max_contiguous(tl.multiple_of(g + (bos + i_t * BT + i_i * BC) * H*K + i_h * K + o_k, BK), BK) + p_dk = tl.make_block_ptr(dk + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + for j in range(0, min(BC, T - i_t * BT - i_i * BC)): + # [BC,] + b_dA = tl.load(dA + o_dA + j * (1 if HEAD_FIRST else H) * BT) + # [BK,] + b_qj = tl.load(p_qj, mask=m_k, other=0).to(tl.float32) + b_gqj = tl.load(p_gqj, mask=m_k, other=0).to(tl.float32) + # [BC, BK] + m_i = o_i[:, None] <= j + b_dk += tl.where(m_i, b_dA[:, None] * b_qj[None, :] * tl.exp(b_gqj[None, :] - b_gk), 0.) 
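+        # move the per-step query/gate pointers forward by one time step within the chunk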
+ p_qj += K if HEAD_FIRST else H*K + p_gqj += K if HEAD_FIRST else H*K + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BV", "BT"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gla_bwd_kernel_dA( + v, + do, + dA, + offsets, + indices, + scale, + T: tl.constexpr, + H: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BV: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + else: + bos, eos = i_b * T, i_b * T + T + T = eos - bos + + b_dA = tl.zeros([BT, BT], dtype=tl.float32) + for i_v in range(tl.cdiv(V, BV)): + if HEAD_FIRST: + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * T*V, (V, T), (1, V), (i_v * BV, i_t * BT), (BV, BT), (0, 1)) + else: + p_do = tl.make_block_ptr(do + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_v = tl.make_block_ptr(v + (bos*H + i_h) * V, (V, T), (1, H*V), (i_v * BV, i_t * BT), (BV, BT), (0, 1)) + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_dA += tl.dot(b_do, b_v) + if HEAD_FIRST: + p_dA = tl.make_block_ptr(dA + i_bh * T*BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + else: + p_dA = tl.make_block_ptr(dA + (bos * H + i_h) * BT, (T, BT), (H*BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + m_s = tl.arange(0, BT)[:, None] >= tl.arange(0, BT)[None, :] + b_dA = tl.where(m_s, b_dA * scale, 0.) 
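+    # write the causally masked, scaled dA block back to global memory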
+ tl.store(p_dA, b_dA.to(p_dA.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BK", "BV", "BT"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gla_bwd_kernel_dv( + k, + g, + A, + do, + dh, + dv, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_tg = i_t + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + else: + NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + + if HEAD_FIRST: + p_A = tl.make_block_ptr(A + i_bh * T * BT, (BT, T), (1, BT), (0, i_t * BT), (BT, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + else: + p_A = tl.make_block_ptr(A + (bos * H + i_h) * BT, (BT, T), (1, H*BT), (0, i_t * BT), (BT, BT), (0, 1)) + p_do = tl.make_block_ptr(do + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + + b_A = tl.load(p_A, boundary_check=(0, 1)) + b_A = tl.where(tl.arange(0, BT)[:, None] <= tl.arange(0, BT)[None, :], b_A, 0.) + b_do = tl.load(p_do, boundary_check=(0, 1)) + # (SY 09/17) important to disallow tf32 here to maintain a good precision. 
+ b_dv = tl.dot(b_A, b_do.to(b_A.dtype), allow_tf32=False) + + for i_k in range(tl.cdiv(K, BK)): + o_k = i_k * BK + tl.arange(0, BK) + m_k = o_k < K + + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_gk = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + i_bh * T*K + min(i_t * BT + BT, T) * K - K + o_k, BK), BK) + p_dh = tl.make_block_ptr(dh + (i_bh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_gk = tl.make_block_ptr(g + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + (bos + min(i_t * BT + BT, T) - 1)*H*K + i_h * K + o_k, BK), BK) + p_dh = tl.make_block_ptr(dh + (i_tg * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + b_gn = tl.exp(tl.load(p_gn, mask=m_k, other=0)[None, :] - b_gk) + b_k = (b_k * b_gn).to(b_k.dtype) + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + # [BT, BV] + # (SY 09/17) it is ok to have bf16 interchunk gradient contribution here + b_dv += tl.dot(b_k, b_dh.to(b_k.dtype)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BK", "BV", "BT"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gla_bwd_kernel_inter( + q, + k, + v, + h, + g, + do, + dh, + dq, + dk, + dq2, + dk2, + dg, + offsets, + indices, + scale, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_tg = i_t + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + else: + NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + o_k = i_k * BK + tl.arange(0, BK) + m_k = o_k < K + + if HEAD_FIRST: + p_gk = tl.make_block_ptr(g + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + i_bh * T*K + (min(T, i_t * BT + BT)-1) * K + o_k, BK), BK) + else: + p_gk = tl.make_block_ptr(g + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + (bos + min(T, i_t * BT + BT)-1) * H*K + i_h * K + o_k, BK), BK) + b_gn = tl.load(p_gn, mask=m_k, other=0) + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + b_dgk = tl.zeros([BK,], dtype=tl.float32) + + for i_v in range(tl.cdiv(V, BV)): + if HEAD_FIRST: + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + i_bh * NT*K*V + i_t * K*V, (V, K), (1, V), (i_v 
* BV, i_k * BK), (BV, BK), (0, 1)) + p_dh = tl.make_block_ptr(dh + i_bh * NT*K*V + i_t * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + else: + p_v = tl.make_block_ptr(v + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + (i_tg * H + i_h) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + p_dh = tl.make_block_ptr(dh + (i_tg * H + i_h) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BV, BK] + b_h = tl.load(p_h, boundary_check=(0, 1)) + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + # [BK] + b_dgk += tl.sum(b_h * b_dh, axis=0) + # [BT, BK] + b_dq += tl.dot(b_do, b_h.to(b_do.dtype)) + b_dk += tl.dot(b_v, b_dh.to(b_v.dtype)) + b_dgk *= tl.exp(b_gn) + b_dq *= scale + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + b_dq = b_dq * tl.exp(b_gk) + b_dk = b_dk * tl.exp(b_gn[None, :] - b_gk) + + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dq = tl.make_block_ptr(dq + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dq = tl.make_block_ptr(dq + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + (bos*H+i_h)*K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_dgk += tl.sum(b_dk * b_k, axis=0) + b_dq += tl.load(p_dq, boundary_check=(0, 1)) + b_dk += tl.load(p_dk, boundary_check=(0, 1)) + b_dg = b_q * b_dq - b_k * b_dk + # tl.debug_barrier() + b_dg = b_dg - tl.cumsum(b_dg, axis=0) + tl.sum(b_dg, axis=0)[None, :] + b_dgk[None, :] + # Buggy due to strange triton compiler issue. + # m_s = tl.where(tl.arange(0, BT)[:, None] <= tl.arange(0, BT)[None, :], 1., 0.) 
+ # b_dg = tl.dot(m_s, b_dg, allow_tf32=False) + b_dgk[None, :] + if HEAD_FIRST: + p_dq = tl.make_block_ptr(dq2 + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk2 + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dg = tl.make_block_ptr(dg + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_dq = tl.make_block_ptr(dq2 + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk2 + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dg = tl.make_block_ptr(dg + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dg, b_dg.to(p_dg.dtype.element_ty), boundary_check=(0, 1)) + + +def chunk_gla_fwd_intra_gk( + q: torch.Tensor, + k: torch.Tensor, + g: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + if head_first: + B, H, T, K = k.shape + else: + B, T, H, K = k.shape + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BC = min(16, triton.next_power_of_2(T)) + BK = min(64, triton.next_power_of_2(K)) + NC = triton.cdiv(BT, BC) + + A = q.new_empty(B, *((H, T) if head_first else (T, H)), BT, dtype=torch.float32) + grid = (NT, NC * NC, B * H) + chunk_gla_fwd_A_kernel_intra_sub_inter[grid]( + q, + k, + g, + A, + offsets, + indices, + scale, + T=T, + H=H, + K=K, + BT=BT, + BC=BC, + BK=BK, + NC=NC, + HEAD_FIRST=head_first + ) + + grid = (NT, NC, B * H) + # load the entire [BC, K] blocks into SRAM at once + if K <= 256: + BK = triton.next_power_of_2(K) + chunk_gla_fwd_A_kernel_intra_sub_intra[grid]( + q, + k, + g, + A, + offsets, + indices, + scale, + T=T, + H=H, + K=K, + BT=BT, + BC=BC, + BK=BK, + HEAD_FIRST=head_first + ) + # split then merge + else: + BK = min(128, triton.next_power_of_2(K)) + NK = triton.cdiv(K, BK) + A_intra = q.new_empty(NK, B, *((H, T) if head_first else (T, H)), BC, dtype=torch.float32) + + grid = (NK, NT * NC, B * H) + chunk_gla_fwd_A_kernel_intra_sub_intra_split[grid]( + q, + k, + g, + A_intra, + offsets, + indices, + scale, + B=B, + T=T, + H=H, + K=K, + BT=BT, + BC=BC, + BK=BK, + NC=NC, + HEAD_FIRST=head_first + ) + + grid = (NT, NC, B * H) + chunk_gla_fwd_A_kernel_intra_sub_intra_merge[grid]( + A_intra, + A, + offsets, + indices, + B=B, + T=T, + H=H, + BT=BT, + BC=BC, + NK=NK, + HEAD_FIRST=head_first + ) + return A + + +def chunk_gla_fwd_o_gk( + q: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + A: torch.Tensor, + h: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + if head_first: + B, H, T, K, V = *q.shape, v.shape[-1] + else: + B, T, H, K, V = *q.shape, v.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], 
BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(32, triton.next_power_of_2(K)) + BV = min(32, triton.next_power_of_2(V)) + NV = triton.cdiv(V, BV) + + grid = (NV, NT, B * H) + o = torch.empty_like(v) + chunk_gla_fwd_kernel_o[grid]( + q, + v, + g, + h, + o, + A, + offsets, + indices, + scale, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + HEAD_FIRST=head_first + ) + return o + + +def chunk_gla_bwd_dA( + v: torch.Tensor, + do: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + if head_first: + B, H, T, V = v.shape + else: + B, T, H, V = v.shape + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BV = min(64, triton.next_power_of_2(V)) + + dA = v.new_empty(B, *((H, T) if head_first else (T, H)), BT, dtype=torch.float32) + grid = (NT, B * H) + chunk_gla_bwd_kernel_dA[grid]( + v, + do, + dA, + offsets, + indices, + scale, + T=T, + H=H, + V=V, + BT=BT, + BV=BV, + HEAD_FIRST=head_first + ) + return dA + + +def chunk_gla_bwd_dv( + k: torch.Tensor, + g: torch.Tensor, + A: torch.Tensor, + do: torch.Tensor, + dh: torch.Tensor, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + if head_first: + B, H, T, K, V = *k.shape, do.shape[-1] + else: + B, T, H, K, V = *k.shape, do.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(64, triton.next_power_of_2(K)) + BV = min(32, triton.next_power_of_2(V)) + + dv = torch.empty_like(do) + grid = (triton.cdiv(V, BV), NT, B * H) + chunk_gla_bwd_kernel_dv[grid]( + k, + g, + A, + do, + dh, + dv, + offsets, + indices, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + HEAD_FIRST=head_first + ) + return dv + + +def chunk_gla_bwd_dqk_intra( + q: torch.Tensor, + k: torch.Tensor, + g: torch.Tensor, + dA: torch.Tensor, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + if head_first: + B, H, T, K = q.shape + else: + B, T, H, K = q.shape + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BC = min(16, triton.next_power_of_2(T)) + BK = min(64, triton.next_power_of_2(K)) + NK = triton.cdiv(K, BK) + NC = triton.cdiv(BT, BC) + + dq = torch.empty_like(q, dtype=torch.float32) + dk = torch.empty_like(k, dtype=torch.float32) + grid = (NK, NT * NC, B * H) + chunk_gla_bwd_kernel_intra[grid]( + q, + k, + g, + dA, + dq, + dk, + offsets, + indices, + T=T, + H=H, + K=K, + BT=BT, + BC=BC, + BK=BK, + NC=NC, + HEAD_FIRST=head_first + ) + return 
dq, dk + + +def chunk_gla_bwd_dqkg( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + h: torch.Tensor, + g: torch.Tensor, + do: torch.Tensor, + dh: torch.Tensor, + dq: torch.Tensor, + dk: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], chunk_size).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(64, triton.next_power_of_2(K)) + BV = min(64, triton.next_power_of_2(V)) + NK = triton.cdiv(K, BK) + + dg = torch.empty_like(g) + grid = (NK, NT, B * H) + # work around triton compiler bugs. + dq2 = torch.empty_like(dq) + dk2 = torch.empty_like(dk) + chunk_gla_bwd_kernel_inter[grid]( + q, + k, + v, + h, + g, + do, + dh, + dq, + dk, + dq2, + dk2, + dg, + offsets, + indices, + scale, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + HEAD_FIRST=head_first + ) + return dq2, dk2, dg + + +def chunk_gla_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + g_cumsum: Optional[torch.Tensor], + scale: float, + initial_state: torch.Tensor, + output_final_state: bool, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + T = q.shape[2] if head_first else q.shape[1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if g_cumsum is None: + g_cumsum = chunk_local_cumsum(g, BT, offsets=offsets, head_first=head_first) + + h, ht = chunk_fwd_h( + k=k, + v=v, + g=None, + gk=g_cumsum, + gv=None, + h0=initial_state, + output_final_state=output_final_state, + states_in_fp32=False, + offsets=offsets, + head_first=head_first, + chunk_size=BT + ) + + # the intra A is kept in fp32 + # the computation has very marginal effect on the entire throughput + A = chunk_gla_fwd_intra_gk( + q=q, + k=k, + g=g_cumsum, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + o = chunk_gla_fwd_o_gk( + q=q, + v=v, + g=g_cumsum, + A=A, + h=h, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + return g_cumsum, A, h, ht, o + + +def chunk_gla_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + g_cumsum: Optional[torch.Tensor], + scale: float, + initial_state: torch.Tensor, + h: torch.Tensor, + A: torch.Tensor, + do: torch.Tensor, + dht: torch.Tensor, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + T = q.shape[2] if head_first else q.shape[1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if g_cumsum is None: + g_cumsum = chunk_local_cumsum(g, BT, offsets=offsets, head_first=head_first) + + if h is None: + h, _ = chunk_fwd_h( + k=k, + v=v, + g=None, + gk=g_cumsum, + gv=None, + h0=initial_state, + output_final_state=False, + states_in_fp32=True, + offsets=offsets, + head_first=head_first, + chunk_size=BT + ) + dh, dh0 = chunk_bwd_dh( + q=q, + k=k, + v=v, + g=None, + gk=g_cumsum, + gv=None, + do=do, + h0=initial_state, + 
dht=dht, + scale=scale, + states_in_fp32=True, + offsets=offsets, + head_first=head_first, + chunk_size=BT + ) + dv = chunk_gla_bwd_dv( + k=k, + g=g_cumsum, + A=A, + do=do, + dh=dh, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + # dq dk in fp32 + dA = chunk_gla_bwd_dA( + v=v, + do=do, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + dq, dk = chunk_gla_bwd_dqk_intra( + q=q, + k=k, + g=g_cumsum, + dA=dA, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + dq, dk, dg = chunk_gla_bwd_dqkg( + q=q, + k=k, + v=v, + h=h, + g=g_cumsum, + do=do, + dh=dh, + dq=dq, + dk=dk, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=BT + ) + return dq, dk, dv, dg, dh0 + + +class ChunkGLAFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + q, + k, + v, + g, + scale, + initial_state, + output_final_state, + offsets, + head_first + ): + T = q.shape[2] if head_first else q.shape[1] + chunk_size = min(64, triton.next_power_of_2(T)) + + # 2-d indices denoting the offsets of chunks in each sequence + # for example, if the passed `offsets` is [0, 100, 356] and `chunk_size` is 64, + # then there are 2 and 4 chunks in the 1st and 2nd sequences respectively, and `indices` will be + # [[0, 0], [0, 1], [1, 0], [1, 1], [1, 2], [1, 3]] + indices = None + if offsets is not None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], chunk_size).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + g_cumsum, A, h, ht, o = chunk_gla_fwd( + q=q, + k=k, + v=v, + g=g, + g_cumsum=None, + scale=scale, + initial_state=initial_state, + output_final_state=output_final_state, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + # recompute g_cumsum in bwd pass + if g.dtype != torch.float32: + g_cumsum = None + else: + g = None + ctx.save_for_backward(q, k, v, g, g_cumsum, initial_state, A) + ctx.chunk_size = chunk_size + ctx.scale = scale + ctx.offsets = offsets + ctx.indices = indices + ctx.head_first = head_first + return o, ht + + @staticmethod + @contiguous + def backward(ctx, do, dht): + q, k, v, g, g_cumsum, initial_state, A = ctx.saved_tensors + chunk_size, scale, offsets, indices, head_first = ctx.chunk_size, ctx.scale, ctx.offsets, ctx.indices, ctx.head_first + dq, dk, dv, dg, dh0 = chunk_gla_bwd( + q=q, + k=k, + v=v, + g=g, + g_cumsum=g_cumsum, + scale=scale, + h=None, + A=A, + initial_state=initial_state, + do=do, + dht=dht, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return dq.to(q), dk.to(k), dv.to(v), dg, None, dh0, None, None, None + + +def chunk_gla( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + scale: Optional[int] = None, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + g (torch.Tensor): + Forget gates of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` applied to keys. 
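+            Gate values are expected in log space (e.g., produced by `F.logsigmoid`, as in the example below).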
+        scale (Optional[float]):
+            Scale factor for the attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[torch.Tensor]):
+            Initial state of shape `[N, H, K, V]` for `N` input sequences.
+            For equal-length input sequences, `N` equals the batch size `B`.
+            Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state of shape `[N, H, K, V]`. Default: `False`.
+        offsets (Optional[torch.LongTensor]):
+            Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch.
+            For example, if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively.
+            If provided, the inputs are concatenated and the batch size `B` is expected to be 1.
+            Default: `None`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format, which is not supported for variable-length inputs.
+            Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        final_state (torch.Tensor):
+            Final state of shape `[N, H, K, V]` if `output_final_state=True` else `None`.
+
+    Examples::
+        >>> import torch
+        >>> import torch.nn.functional as F
+        >>> from einops import rearrange
+        >>> from fla.ops.gla import chunk_gla
+        # inputs with equal lengths
+        >>> B, T, H, K, V = 4, 2048, 4, 512, 512
+        >>> q = torch.randn(B, T, H, K, device='cuda')
+        >>> k = torch.randn(B, T, H, K, device='cuda')
+        >>> v = torch.randn(B, T, H, V, device='cuda')
+        >>> g = F.logsigmoid(torch.randn(B, T, H, K, device='cuda'))
+        >>> h0 = torch.randn(B, H, K, V, device='cuda')
+        >>> o, ht = chunk_gla(q, k, v, g,
+                              initial_state=h0,
+                              output_final_state=True,
+                              head_first=False)
+        # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required
+        >>> q, k, v, g = map(lambda x: rearrange(x, 'b t h d -> 1 (b t) h d'), (q, k, v, g))
+        # for a batch with 4 sequences, offsets with 5 start/end positions are expected
+        >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long)
+        >>> o_var, ht_var = chunk_gla(q, k, v, g,
+                                      initial_state=h0,
+                                      output_final_state=True,
+                                      offsets=offsets,
+                                      head_first=False)
+        >>> assert o.allclose(o_var.view(o.shape))
+        >>> assert ht.allclose(ht_var)
+    """
+    if offsets is not None:
+        if q.shape[0] != 1:
+            raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`. "
+ f"Please flatten variable-length inputs before processing.") + if head_first: + raise RuntimeError("Sequences with variable lengths are not supported for head-first mode") + if initial_state is not None and initial_state.shape[0] != len(offsets) - 1: + raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, " + f"i.e., {len(offsets) - 1} rather than {initial_state.shape[0]}.") + if scale is None: + scale = q.shape[-1] ** -0.5 + o, final_state = ChunkGLAFunction.apply(q, k, v, g, scale, initial_state, output_final_state, offsets, head_first) + return o, final_state diff --git a/fla/ops/gla/fused_chunk.py b/fla/ops/gla/fused_chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..5313f789fc672a895ff1e31b1a4aa3910863d40e --- /dev/null +++ b/fla/ops/gla/fused_chunk.py @@ -0,0 +1,644 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +# Gated Linear Attention Transformers with Hardware-Efficient Training: https://arxiv.org/abs/2312.06635 + +from typing import Tuple + +import torch +import torch.nn.functional as F +import triton +import triton.language as tl +from einops import rearrange +from packaging import version + +from fla.ops.utils import chunk_local_cumsum +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.jit +def prepare_qg_kg( + q, + k, + g, + qg, + kg, + s_k_h, + scale, + K: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr +): + + i_k, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + p_q = q + i_bh * s_k_h + i_c * BT * K + i_k * BK + tl.arange(0, BK) + p_g = g + i_bh * s_k_h + i_c * BT * K + i_k * BK + tl.arange(0, BK) + p_k = k + i_bh * s_k_h + i_c * BT * K + i_k * BK + tl.arange(0, BK) + p_qg = qg + i_bh * s_k_h + i_c * BT * K + i_k * BK + tl.arange(0, BK) + p_kg = kg + i_bh * s_k_h + i_c * BT * K + i_k * BK + tl.arange(0, BK) + + mask = (i_k * BK + tl.arange(0, BK)) < K + + last_decay = tl.load(g + i_bh * s_k_h + (i_c * BT + BT - 1) * K + i_k * BK + tl.arange(0, BK)) + + for i in range(BT): + b_q = tl.load(p_q, mask=mask, other=0) + b_k = tl.load(p_k, mask=mask, other=0) + _g = tl.load(p_g, mask=mask, other=0).to(tl.float32) + b_q *= tl.exp(_g) * scale + b_k *= tl.exp(last_decay - _g) + tl.store(p_kg, b_k.to(p_kg.dtype.element_ty), mask=mask) + tl.store(p_qg, b_q.to(p_qg.dtype.element_ty), mask=mask) + p_q += K + p_g += K + p_k += K + p_kg += K + p_qg += K + + +@triton.jit +def bwd_decay_global_cumsum( + dq_inner, + dq_inter, + dk_inner, + dk_inter, + q, k, g, dg, + s_k_h, + BT: tl.constexpr, + BK: tl.constexpr, + K: tl.constexpr +): + i_k, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + p_q = q + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (i_c * BT + BT - 1) * K + p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (i_c * BT + BT - 1) * K + p_g = g + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (i_c * BT + BT - 1) * K + p_dg = dg + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (i_c * BT + BT - 1) * K + p_dq_inner = dq_inner + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (i_c * BT + BT - 1) * K + p_dk_inner = dk_inner + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (i_c * BT + BT - 1) * K + p_dq_inter = dq_inter + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (i_c * BT + BT - 1) * K + p_dk_inter = dk_inter + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (i_c * BT + BT - 1) * K + cum_grad_dg = tl.zeros([BK], dtype=tl.float32) + mask = (i_k * BK + tl.arange(0, BK)) < K + last_g = tl.zeros([BK], 
dtype=tl.float32) + for j in range(BT-1, -1, -1): + _g = tl.load(p_g, mask=mask, other=0).to(tl.float32) + if j == (BT-1): + last_g = _g + b_dq1 = tl.load(p_dq_inner, mask=mask, other=0) + b_dq2 = tl.load(p_dq_inter, mask=mask, other=0) + b_dq2 *= tl.exp(_g) + b_dq = b_dq1 + b_dq2 + tl.store(p_dq_inter, b_dq, mask=mask) + b_dk1 = tl.load(p_dk_inner, mask=mask, other=0) + b_dk2 = tl.load(p_dk_inter, mask=mask, other=0) + b_dk2 *= tl.exp(last_g - _g) + b_dk = b_dk1 + b_dk2 + tl.store(p_dk_inter, b_dk, mask=mask) + b_q = tl.load(p_q, mask=mask, other=0) + b_k = tl.load(p_k, mask=mask, other=0) + b_dg = b_dq * b_q - b_dk * b_k + cum_grad_dg += b_dg + tl.store(p_dg, cum_grad_dg.to(p_dg.dtype.element_ty), mask=mask) + p_g -= K + p_k -= K + p_q -= K + p_dq_inner -= K + p_dk_inner -= K + p_dq_inter -= K + p_dk_inter -= K + p_dg -= K + + +@triton.jit +def fused_chunk_gla_fwd_kernel( + q, # query [B, H, L, K] + k, # key [B, H, L, K] + v, # value [B, H, L, V] + g, # cumulative sum of log decay [B, H, L, K] + o, # output [B, H, L, V] + + h0, # initial state of the chunk [B, H, K, V] + ht, # final state of the chunk [B, H, K, V] + + s_k_h, # stride size: L * K + s_k_t, # stride size: K + s_k_d, # stride size: 1 + + s_v_h, # stride size: L * V + s_v_t, # stride size: V + s_v_d, # stride size: 1 + + B: tl.constexpr, # batch size + H: tl.constexpr, # H + T: tl.constexpr, # T + K: tl.constexpr, # K + V: tl.constexpr, # V + BT: tl.constexpr, # BLOCK SIZE along the sequence dimension, a.k.a. chunk size + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr, + CHECK: tl.constexpr +): + # indices + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + b_h = tl.zeros([BK, BV], dtype=tl.float32) + + # make block pointers + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (0, i_k * BK), (BT, BK), (1, 0)) + p_db = g + i_bh * s_k_h + (BT - 1) * s_k_t + i_k * BK + tl.arange(0, BK) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, 0), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + (i_bh + i_k * B * H) * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + + if USE_INITIAL_STATE: + p_h = tl.make_block_ptr(h0 + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_h += tl.load(p_h, boundary_check=(0, 1)).to(tl.float32) + + mask = (i_k * BK + tl.arange(0, BK)) < K + + for i in range(0, tl.cdiv(T, BT)): + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + d_b = tl.load(p_db, mask=mask, other=0).to(tl.float32) + if CHECK and i == 0: + b_o = tl.dot(b_q.to(b_v.dtype), b_h.to(b_v.dtype), allow_tf32=False) + b_h = b_h * tl.exp(d_b)[:, None] + tl.dot(b_k.to(b_v.dtype), b_v, allow_tf32=False) + else: + b_o = tl.dot(b_q.to(b_v.dtype), b_h.to(b_v.dtype), allow_tf32=False) + b_h = b_h * tl.exp(d_b)[:, None] + tl.dot(b_k.to(b_v.dtype), b_v, allow_tf32=False) + + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + p_q = tl.advance(p_q, (BT, 0)) + p_k = tl.advance(p_k, (0, BT)) + p_v = tl.advance(p_v, (BT, 0)) + p_o = tl.advance(p_o, (BT, 0)) + p_db += BT * K + + if STORE_FINAL_STATE: + p_final = tl.make_block_ptr(ht + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), 
(1, 0)) + tl.store(p_final, b_h.to(p_final.dtype.element_ty), boundary_check=(0, 1)) + + +# Similar to Algorithm1 of https://arxiv.org/abs/2006.16236 +@triton.jit +def fused_chunk_gla_bwd_kernel( + q, k, v, g, + do, # gradient of output [B, H, L, V] + dq, # gradient of query [NV, B, H, L, K] + dk, # gradient of key [NV, B, H, L, K] + dv, # gradient of value [NK, B, H, L, V] + + h0, # initial state of the chunk [B, H, K, V] + + s_k_h, # stride size: L * K + s_k_t, # stride size: K + s_k_d, # stride size: 1 + + s_v_h, # stride size: L * V + s_v_t, # stride size: V + s_v_d, # stride size: 1 + scale, # K ** -0.5 + + B: tl.constexpr, # B + H: tl.constexpr, # H + T: tl.constexpr, # T + K: tl.constexpr, # K + V: tl.constexpr, # V + # clamp_min, # minimum log value of the gate for numerical stability. default: -5 + BT: tl.constexpr, # BLOCK SIZE along the sequence dimension, a.k.a. chunk size + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension + USE_INITIAL_STATE: tl.constexpr, + CHECK: tl.constexpr +): + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + # [BV, BK] + b_h = tl.zeros([BV, BK], dtype=tl.float32) + + if USE_INITIAL_STATE: + p_h = tl.make_block_ptr(h0 + i_bh * K * V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + b_h += tl.load(p_h, boundary_check=(0, 1)).to(tl.float32) + + mask = (i_k * BK + tl.arange(0, BK)) < K + for i in range(0, tl.cdiv(T, BT)): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i * BT, i_k * BK), (BT, BK), (1, 0)) + p_db = g + i_bh * s_k_h + ((i+1) * BT - 1) * s_k_t + i_k * BK + tl.arange(0, BK) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i * BT), (BV, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i * BT, i_v * BV), (BT, BV), (1, 0)) + p_dq = tl.make_block_ptr(dq + (i_bh+i_v*B*H)*s_k_h, (T, K), (s_k_t, s_k_d), (i * BT, i_k * BK), (BT, BK), (1, 0)) + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + # [BT, K] + b_k = tl.load(p_k, boundary_check=(0, 1)) + d_b = tl.load(p_db, mask=mask, other=0).to(tl.float32) + + # [V, BT] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, V] + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [V, K] + if CHECK and i == 0: + b_dq += tl.dot(b_do, b_h.to(b_do.dtype), allow_tf32=False) + b_h = b_h * tl.exp(d_b)[None, :] + tl.dot(b_v, b_k.to(b_v.dtype), allow_tf32=False) + else: + b_dq += tl.dot(b_do, b_h.to(b_do.dtype), allow_tf32=False) + b_h = b_h * tl.exp(d_b)[None, :] + tl.dot(b_v, b_k.to(b_v.dtype), allow_tf32=False) + b_dq *= scale + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + + # sync threads + b_h = None + tl.debug_barrier() + # [BK, BV] + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + + # cum = tl.zeros([BK], dtype=tl.float32) + for i in range(1, tl.cdiv(T, BT) + 1): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, T - i * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (T - i * BT, i_k * BK), (BT, BK), (1, 0)) + p_db = g + i_bh * s_k_h + (T - (i-1) * BT - 1) * s_k_t + i_k * BK + tl.arange(0, BK) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (T - i * BT, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (T - i * BT, i_v * BV), (BT, BV), (1, 0)) + p_dk = tl.make_block_ptr(dk + (i_bh + i_v * B * H) * s_k_h, (T, K), + (s_k_t, s_k_d), (T - i * BT, i_k * BK), (BT, BK), (1, 0)) + p_dv = 
tl.make_block_ptr(dv + (i_bh + i_k * B * H) * s_v_h, (T, V), + (s_v_t, s_v_d), (T - i * BT, i_v * BV), (BT, BV), (1, 0)) + # [K, BT] + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BT, K] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, V] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_db = tl.load(p_db, mask=mask, other=0).to(tl.float32) + + # inter-chunk + # [K, V] + if CHECK and i == 1: + b_dk = tl.trans(tl.dot(b_dh.to(b_v.dtype), tl.trans(b_v), allow_tf32=False)) + b_dv = tl.dot((b_k).to(b_v.dtype), b_dh.to(b_v.dtype), allow_tf32=False) + b_dh = b_dh * tl.exp(b_db)[:, None] + tl.dot(b_q.to(b_do.dtype), b_do, allow_tf32=False) + else: + b_dk = tl.trans(tl.dot(b_dh.to(b_v.dtype), tl.trans(b_v), allow_tf32=False)) + b_dv = tl.dot((b_k).to(b_v.dtype), b_dh.to(b_v.dtype), allow_tf32=False) + b_dh = b_dh * tl.exp(b_db)[:, None] + tl.dot(b_q.to(b_do.dtype), b_do, allow_tf32=False) + + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def fwd_inner_chunk( + q, k, g, A, + s_k_h, # stride size: L * K + s_k_t, # stride size: K + s_k_d, # stride size: 1 + scale, # K ** -0.5 + B: tl.constexpr, # B + H: tl.constexpr, # H + T: tl.constexpr, # T + K: tl.constexpr, # K + BT: tl.constexpr, # BLOCK SIZE along the sequence dimension, a.k.a. chunk size + BK: tl.constexpr # BLOCK SIZE along the K dimension +): + + i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + + b_k = tl.load(p_k, boundary_check=(0, 1)) + + p_g = tl.make_block_ptr(g + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + + b_g = tl.load(p_g, boundary_check=(0, 1)).to(tl.float32) + + mask = (i_k * BK + tl.arange(0, BK)) < K + o_i = tl.arange(0, BT) + + p_q = q + i_bh * s_k_h + i_k * BK + i_t * BT * K + tl.arange(0, BK) + p_gq = g + i_bh * s_k_h + i_k * BK + i_t * BT * K + tl.arange(0, BK) + p_A = A + (i_bh + (i_k * B * H)) * (tl.cdiv(T, BT) * BT * BT) + i_t * BT * BT + tl.arange(0, BT) + + for i in range(BT): + _q = tl.load(p_q, mask=mask, other=0) * scale + gq = tl.load(p_gq, mask=mask, other=0).to(tl.float32) + s = _q[None, :] * b_k * tl.exp(gq[None, :] - b_g) + score = tl.sum(s, axis=1) + score = tl.where(o_i <= i, score, 0) + tl.store(p_A, score.to(p_A.dtype.element_ty)) + p_q += K + p_gq += K + p_A += BT + + +@triton.jit +def bwd_inner_chunk( + q, + k, + g, + dA, + dq, + dk, + s_k_h, # stride size: L * K + s_k_t, # stride size: K + s_k_d, # stride size: 1 + T: tl.constexpr, # T + K: tl.constexpr, # K + # clamp_min, # minimum log value of the gate for numerical stability. default: -5 + BT: tl.constexpr, # BLOCK SIZE along the sequence dimension, a.k.a. 
chunk size
+    BK: tl.constexpr,  # BLOCK SIZE along the K dimension
+):
+    i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+    p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+    b_k = tl.load(p_k, boundary_check=(0, 1))
+    p_g = tl.make_block_ptr(g + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+    b_g = tl.load(p_g, boundary_check=(0, 1)).to(tl.float32)
+
+    mask = (i_k * BK + tl.arange(0, BK)) < K
+    o_i = tl.arange(0, BT)
+
+    p_q = q + i_bh * s_k_h + i_k * BK + i_t * BT * K + tl.arange(0, BK)
+    p_dq = dq + (i_bh) * s_k_h + i_k * BK + i_t * BT * K + tl.arange(0, BK)
+    p_gq = g + i_bh * s_k_h + i_k * BK + i_t * BT * K + tl.arange(0, BK)
+    p_dA = dA + i_bh * (tl.cdiv(T, BT) * BT * BT) + i_t * BT * BT + tl.arange(0, BT)
+
+    b_dk = tl.zeros([BT, BK], dtype=tl.float32)
+
+    for i in range(BT):
+        _q = tl.load(p_q, mask=mask, other=0)
+        gq = tl.load(p_gq, mask=mask, other=0).to(tl.float32)
+        score = tl.exp(gq[None, :] - b_g)
+        score = tl.where(o_i[:, None] <= i, score, 0)
+        _dA = tl.load(p_dA)
+        _dA = tl.where(o_i <= i, _dA, 0)
+        b_dk += (_dA[:, None] * score * _q[None, :])
+        b_dq = tl.sum(_dA[:, None] * score * b_k, axis=0)
+        tl.store(p_dq, b_dq, mask=mask)
+        p_q += K
+        p_dq += K
+        p_gq += K
+        p_dA += BT
+
+    p_dk = tl.make_block_ptr(dk + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+    tl.store(p_dk, b_dk.to(dk.dtype.element_ty), boundary_check=(0, 1))
+
+
+class FusedChunkGLAFunction(torch.autograd.Function):
+
+    @staticmethod
+    @contiguous
+    @autocast_custom_fwd
+    def forward(ctx, q, k, v, g, scale, initial_state, output_final_state):
+        ctx.g_dtype = g.dtype
+        ctx.scale = scale
+        B, H, T, K, V = *k.shape, v.shape[-1]
+        BT = 16  # chunk_size
+        BK, BV = min(K, 64), min(V, 64)
+        NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV)
+        num_stages = 1
+        num_warps = 2
+
+        g_org = g
+        # the cumulative decay should be kept in float32; otherwise the error will be accumulated and amplified.
+        g = chunk_local_cumsum(g_org, chunk_size=BT)
+        o = q.new_empty(NK, B, H, T, V)
+        q_g = torch.empty_like(q)
+        k_g = torch.empty_like(k)
+
+        grid = (NK, triton.cdiv(T, BT), B * H)
+        prepare_qg_kg[grid](
+            q, k, g, q_g, k_g,
+            q.stride(1),
+            scale,
+            K=K,
+            BT=BT,
+            BK=BK,
+            num_warps=1
+        )
+
+        if output_final_state:
+            final_state = q.new_empty(B, H, K, V, dtype=torch.float, requires_grad=False)
+        else:
+            final_state = None
+        # the bug still exists even for Triton 2.2 on H100 GPUs,
+        # so we always enable the initial checks
+        CHECK = True
+        if version.parse(triton.__version__) < version.parse('2.2.0'):
+            import warnings
+            warnings.warn(
+                "Triton<2.2.0 detected for running this kernel, "
+                "which is known to have some weird compiler issues (refer to https://github.com/openai/triton/issues/2852) "
+                "that lead to significant precision loss. "
+                "We've added some initial condition checks to resolve this, sadly at the cost of some speed. "
+                "For optimal performance, it is recommended to install Triton>=2.2.0 (if possible)."
+ ) + CHECK = True + + grid = (NV, NK, B * H) + fused_chunk_gla_fwd_kernel[grid]( + q_g, k_g, v, g, o, initial_state, final_state, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + B=B, + H=H, + T=T, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + USE_INITIAL_STATE=initial_state is not None, + STORE_FINAL_STATE=output_final_state, + CHECK=CHECK, + num_warps=num_warps, + num_stages=num_stages + ) + + o = o.sum(0) + + # intra-chunk + chunk_size = 16 + num_chunk = T // chunk_size + v2 = rearrange(v, 'b h (n c) d -> b h n c d', n=num_chunk) + BK = min(K, 64) + NK = triton.cdiv(K, BK) + A = q.new_empty(NK, B, H, triton.cdiv(T, BT), BT, BT) + grid = (NK, triton.cdiv(T, BT), B * H) + fwd_inner_chunk[grid]( + q, k, g, A, + q.stride(1), q.stride(2), q.stride(3), + scale, + B=B, + H=H, + T=T, + K=K, + BT=BT, + BK=BK, + num_stages=3, + num_warps=4 + ) + A = A.sum(0) + o2 = A @ v2 + o2 = rearrange(o2, 'b h n c d -> b h (n c) d') + # combine inner and inter + o.add_(o2) + ctx.save_for_backward(q, k, v, g_org, A, initial_state) + ctx.CHECK = CHECK + return o.to(v), final_state + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dht=None): + q, k, v, g_org, A, initial_state = ctx.saved_tensors + B, H, T, K, V = *k.shape, v.shape[-1] + scale = ctx.scale + + # recomputation + # inter-chunk + BT = 16 # chunk_size + g = chunk_local_cumsum(g_org, chunk_size=BT) + BK, BV = min(K, 64), min(V, 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + q_g = torch.empty_like(q) + k_g = torch.empty_like(k) + grid = (NK, triton.cdiv(T, BT), B * H) + prepare_qg_kg[grid]( + q, k, g, q_g, k_g, + q.stride(1), + scale, + K=K, + BT=BT, + BK=BK, + num_warps=1 + ) + + # inter-chunk + BT = 16 + BK, BV = min(triton.next_power_of_2(K), 64), min(triton.next_power_of_2(V), 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + num_stages = 1 + num_warps = 2 + dq = q.new_empty(NV, B, H, T, K) + dk = q.new_empty(NV, B, H, T, K) + dv = q.new_empty(NK, B, H, T, V) + + grid = (NV, NK, B * H) + + fused_chunk_gla_bwd_kernel[grid]( + q_g, k_g, v, g, do, dq, dk, dv, initial_state, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale, + B=B, + H=H, + T=T, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + USE_INITIAL_STATE=initial_state is not None, + CHECK=ctx.CHECK, + num_warps=num_warps, + num_stages=num_stages, + ) + dq = dq.sum(0) + dk = dk.sum(0) + dv = dv.sum(0) + + # intra chunk + num_chunk = T // BT + v2 = rearrange(v, 'b h (n c) d -> b h n c d', n=num_chunk) + do2 = rearrange(do, 'b h (n c) d -> b h n c d', n=num_chunk) + dA2 = (do2 @ v2.transpose(-2, -1)) * scale + dv2 = A.transpose(-1, -2) @ do2 + dv2 = rearrange(dv2, 'b h n c d -> b h (n c) d', n=num_chunk) + + BK = min(triton.next_power_of_2(K), 16) + NK = triton.cdiv(K, BK) + dk2 = torch.empty_like(k) + dq2 = torch.empty_like(q) + + grid = (NK, triton.cdiv(T, BT), B * H) + bwd_inner_chunk[grid]( + q, k, g, + dA2, dq2, dk2, + q.stride(1), q.stride(2), q.stride(3), + T=T, + K=K, + BT=BT, + BK=BK, + num_warps=1, + num_stages=3 + ) + + BK = min(triton.next_power_of_2(K), 32) + NK = triton.cdiv(K, BK) + dg = torch.empty_like(g, dtype=torch.float32) + grid = (NK, triton.cdiv(T, BT), B * H) + bwd_decay_global_cumsum[grid]( + dq2, dq, dk2, dk, q, k, g, dg, + q.stride(1), + K=K, + BT=BT, + BK=BK, + num_warps=1, + num_stages=1 + ) + dg = rearrange(dg, 'b h (n c) d -> b h n c d', c=BT) + + def rev_cumsum_exclusive(x): + cumsum_x = x.cumsum(-2) + rev_cumsum_x = cumsum_x[..., -1, None, :] - cumsum_x + 
return rev_cumsum_x
+
+        rev_cumsum_dg = rev_cumsum_exclusive(dg[..., 0, :])
+        dg.add_(rev_cumsum_dg.unsqueeze(-2))
+        dv.add_(dv2)
+        dg = rearrange(dg, 'b h n c d -> b h (n c) d')
+
+        return dq.to(q), dk.to(k), dv.to(v), dg.to(ctx.g_dtype), None, None, None
+
+
+def pad(x, chunk_size=16):
+    T = x.shape[-2]
+    padded_seq_len = ceildiv(T, chunk_size) * chunk_size
+    if x.shape[-2] % chunk_size != 0:
+        x = F.pad(x, (0, 0, 0, padded_seq_len - T))
+
+    return x
+
+
+def ceildiv(a, b):
+    return -(a // -b)
+
+
+def fused_chunk_gla(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    g: torch.Tensor,
+    scale: float = -1,
+    initial_state: torch.Tensor = None,
+    output_final_state: bool = False,
+    head_first: bool = True
+) -> Tuple[torch.Tensor, torch.Tensor]:
+    if scale == -1:
+        scale = q.shape[-1] ** -0.5
+    if initial_state is not None:
+        initial_state = initial_state.detach()
+    if not head_first:
+        q, k, v, g = map(lambda x: x.transpose(1, 2), (q, k, v, g))
+    seq_len = q.shape[-2]
+    q, k, v, g = map(lambda x: pad(x), [q, k, v, g])
+    o, final_state = FusedChunkGLAFunction.apply(q, k, v, g, scale, initial_state, output_final_state)
+    o = o[..., :seq_len, :]
+    if not head_first:
+        o = o.transpose(1, 2)
+    return o, final_state
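+
+
+# A minimal usage sketch of `fused_chunk_gla` (illustrative only: it assumes a CUDA
+# device and the default head-first layout `[B, H, T, K/V]`, with `g` holding
+# log-space decay gates, e.g. produced by `F.logsigmoid`):
+#
+#     >>> import torch
+#     >>> import torch.nn.functional as F
+#     >>> B, H, T, K, V = 4, 4, 2048, 512, 512
+#     >>> q = torch.randn(B, H, T, K, device='cuda')
+#     >>> k = torch.randn(B, H, T, K, device='cuda')
+#     >>> v = torch.randn(B, H, T, V, device='cuda')
+#     >>> g = F.logsigmoid(torch.randn(B, H, T, K, device='cuda'))
+#     >>> o, ht = fused_chunk_gla(q, k, v, g, output_final_state=True)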
diff --git a/fla/ops/gla/fused_recurrent.py b/fla/ops/gla/fused_recurrent.py
new file mode 100644
index 0000000000000000000000000000000000000000..68eb1492d58ce5c9229bd4b9bd3acd320bfb223d
--- /dev/null
+++ b/fla/ops/gla/fused_recurrent.py
@@ -0,0 +1,116 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from typing import Optional, Tuple
+
+import torch
+
+from fla.ops.common.fused_recurrent import fused_recurrent
+
+
+def fused_recurrent_gla(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    gk: Optional[torch.Tensor] = None,
+    gv: Optional[torch.Tensor] = None,
+    scale: Optional[float] = None,
+    initial_state: Optional[torch.Tensor] = None,
+    output_final_state: bool = False,
+    reverse: bool = False,
+    offsets: Optional[torch.LongTensor] = None,
+    head_first: bool = True
+) -> Tuple[torch.Tensor, torch.Tensor]:
+    r"""
+    Args:
+        q (torch.Tensor):
+            queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        k (torch.Tensor):
+            keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        v (torch.Tensor):
+            values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        gk (torch.Tensor):
+            Forget gates of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` applied to keys.
+        gv (torch.Tensor):
+            Forget gates of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]` applied to values.
+        scale (Optional[float]):
+            Scale factor for the attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[torch.Tensor]):
+            Initial state of shape `[N, H, K, V]` for `N` input sequences.
+            For equal-length input sequences, `N` equals the batch size `B`.
+            Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state of shape `[N, H, K, V]`. Default: `False`.
+        reverse (Optional[bool]):
+            If `True`, process the state passing in reverse order. Default: `False`.
+        offsets (Optional[torch.LongTensor]):
+            Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch.
+            For example,
+            if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively.
+            If provided, the inputs are concatenated and the batch size `B` is expected to be 1.
+            Default: `None`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format, which is not supported for variable-length inputs.
+            Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        final_state (torch.Tensor):
+            Final state of shape `[N, H, K, V]` if `output_final_state=True` else `None`.
+
+    Examples::
+        >>> import torch
+        >>> import torch.nn.functional as F
+        >>> from einops import rearrange
+        >>> from fla.ops.gla import fused_recurrent_gla
+        # inputs with equal lengths
+        >>> B, T, H, K, V = 4, 2048, 4, 512, 512
+        >>> q = torch.randn(B, T, H, K, device='cuda')
+        >>> k = torch.randn(B, T, H, K, device='cuda')
+        >>> v = torch.randn(B, T, H, V, device='cuda')
+        >>> g = F.logsigmoid(torch.randn(B, T, H, K, device='cuda'))
+        >>> h0 = torch.randn(B, H, K, V, device='cuda')
+        >>> o, ht = fused_recurrent_gla(q, k, v, g,
+                                        initial_state=h0,
+                                        output_final_state=True,
+                                        head_first=False)
+        # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required
+        >>> q, k, v, g = map(lambda x: rearrange(x, 'b t h d -> 1 (b t) h d'), (q, k, v, g))
+        # for a batch with 4 sequences, offsets with 5 start/end positions are expected
+        >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long)
+        >>> o_var, ht_var = fused_recurrent_gla(q, k, v, g,
+                                                initial_state=h0,
+                                                output_final_state=True,
+                                                offsets=offsets,
+                                                head_first=False)
+        >>> assert o.allclose(o_var.view(o.shape))
+        >>> assert ht.allclose(ht_var)
+    """
+    if offsets is not None:
+        if q.shape[0] != 1:
+            raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`. "
+                             f"Please flatten variable-length inputs before processing.")
+        if head_first:
+            raise RuntimeError("Sequences with variable lengths are not supported for head-first mode")
+        if initial_state is not None and initial_state.shape[0] != len(offsets) - 1:
+            raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, "
+                             f"i.e., {len(offsets) - 1} rather than {initial_state.shape[0]}.")
+    if scale is None:
+        scale = k.shape[-1] ** -0.5
+    o, final_state = fused_recurrent(
+        q=q,
+        k=k,
+        v=v,
+        g=None,
+        gk=gk,
+        gv=gv,
+        scale=scale,
+        initial_state=initial_state,
+        output_final_state=output_final_state,
+        reverse=reverse,
+        offsets=offsets,
+        head_first=head_first
+    )
+    return o, final_state
diff --git a/fla/ops/gla/naive.py b/fla/ops/gla/naive.py
new file mode 100644
index 0000000000000000000000000000000000000000..507a7395c0c28b0a9c54008e1735098cd3fbdc85
--- /dev/null
+++ b/fla/ops/gla/naive.py
@@ -0,0 +1,41 @@
+# -*- coding: utf-8 -*-
+
+from typing import Optional
+
+import torch
+
+
+def ceildiv(a, b):
+    return -(a // -b)
+
+
+def naive_recurrent_gla(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    gk: torch.Tensor,
+    initial_state: Optional[torch.Tensor] = None,
+    output_final_state: bool = False
+):
+    dtype = q.dtype
+    q, k, v, gk = map(lambda x: x.float(), (q, k, v, gk))
+    B, H, T, K, V = *q.shape, v.shape[-1]
+    o = torch.zeros_like(v)
+    scale = K ** -0.5
+
+    h = q.new_zeros(B, H, K, V, dtype=torch.float32)
+    if initial_state is not None:
+        h += initial_state.float()
+
+    for i in range(T):
+        q_i = q[:, :, i] * scale
+        k_i = k[:, :, i]
+        v_i = v[:, :, i]
+        gk_i = gk[:, :, i].exp()
+        kv_i = k_i[..., None] * v_i[..., None, :]
+        h = h * gk_i[..., None] + kv_i
+        o[:, :, i] = (q_i[..., None] * h).sum(-2)
+
+    if not output_final_state:
+        h = None
+    return 
o.to(dtype), h diff --git a/fla/ops/gsa/__init__.py b/fla/ops/gsa/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..ed8a88014ddfc3143e67d3a48c38a54b75d7f3d6 --- /dev/null +++ b/fla/ops/gsa/__init__.py @@ -0,0 +1,9 @@ +# -*- coding: utf-8 -*- + +from .chunk import chunk_gsa +from .fused_recurrent import fused_recurrent_gsa + +__all__ = [ + 'chunk_gsa', + 'fused_recurrent_gsa' +] diff --git a/fla/ops/gsa/chunk.py b/fla/ops/gsa/chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..45825184dce1765adc568eeec20b05208b9f8be9 --- /dev/null +++ b/fla/ops/gsa/chunk.py @@ -0,0 +1,1255 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl +from einops import reduce + +from fla.ops.common.chunk_h import chunk_bwd_dh, chunk_fwd_h +from fla.ops.gla.chunk import chunk_gla_bwd, chunk_gla_fwd +from fla.ops.utils import chunk_local_cumsum, softmax_bwd, softmax_fwd +from fla.utils import contiguous + + +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gsa_fwd_k_kernel_inter( + q, + k, + h, + g, + o, + A, + offsets, + indices, + scale, + T: tl.constexpr, + HQ: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NG: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_bg = i_bh // NG + i_b, i_hq = i_bh // HQ, i_bh % HQ + i_h = i_hq // NG + if USE_OFFSETS: + i_tg = i_t + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + else: + NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + + o_i = tl.arange(0, BT) + m_s = o_i[:, None] >= o_i[None, :] + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + b_A = tl.zeros([BT, BT], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bg * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + (i_bg * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos * HQ + i_hq) * K, (T, K), (HQ*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + (i_tg * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BK, BV] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # [BT, BV] + b_o += tl.dot(b_q, b_h) + # [BT, BT] + b_A += tl.dot(b_q, b_k) + if HEAD_FIRST: + p_g = tl.make_block_ptr(g + i_bg * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_A = tl.make_block_ptr(A + i_bh * T*BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + else: + p_g = tl.make_block_ptr(g + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + 
p_o = tl.make_block_ptr(o + (bos * HQ + i_hq) * V, (T, V), (HQ*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_A = tl.make_block_ptr(A + (bos * HQ + i_hq) * BT, (T, BT), (HQ*BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + # [BT, BV] + b_g = tl.load(p_g, boundary_check=(0, 1)) + b_o = b_o * tl.exp(b_g) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + # [BT, BT] + b_A = tl.where(m_s, b_A, 0.) + if i_v == 0: + tl.store(p_A, b_A.to(p_A.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gsa_fwd_k_kernel_intra( + v, + g, + o, + A, + offsets, + indices, + T: tl.constexpr, + HQ: tl.constexpr, + H: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BV: tl.constexpr, + NC: tl.constexpr, + NG: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_bg = i_bh // NG + i_b, i_hq = i_bh // HQ, i_bh % HQ + i_h = i_hq // NG + i_t, i_i = i_c // NC, i_c % NC + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + o_v = i_v * BV + tl.arange(0, BV) + m_v = o_v < V + + if i_t * BT + i_i * BC > T: + return + + if HEAD_FIRST: + p_g = tl.make_block_ptr(g + i_bg * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + i_bg * T*V + min(i_t * BT + i_i * BC, T) * V + o_v, BV), BV) + else: + p_g = tl.make_block_ptr(g + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + (bos + min(i_t * BT + i_i * BC, T)) * H*V + i_h * V + o_v, BV), BV) + # [BV,] + b_gn = tl.load(p_gn, mask=m_v, other=0) + # [BC, BV] + b_o = tl.zeros([BC, BV], dtype=tl.float32) + for i_j in range(0, i_i): + if HEAD_FIRST: + p_A = tl.make_block_ptr(A + i_bh * T*BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + p_v = tl.make_block_ptr(v + i_bg * T*V, (T, V), (V, 1), (i_t * BT + i_j * BC, i_v * BV), (BC, BV), (1, 0)) + p_gv = tl.make_block_ptr(g + i_bg * T*V, (T, V), (V, 1), (i_t * BT + i_j * BC, i_v * BV), (BC, BV), (1, 0)) + else: + p_A = tl.make_block_ptr(A + (bos*HQ+i_hq) * BT, (T, BT), (HQ*BT, 1), (i_t*BT+i_i*BC, i_j * BC), (BC, BC), (1, 0)) + p_v = tl.make_block_ptr(v + (bos*H+i_h) * V, (T, V), (H*V, 1), (i_t * BT + i_j * BC, i_v * BV), (BC, BV), (1, 0)) + p_gv = tl.make_block_ptr(g + (bos*H+i_h) * V, (T, V), (H*V, 1), (i_t * BT + i_j * BC, i_v * BV), (BC, BV), (1, 0)) + # [BC, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_gv = tl.load(p_gv, boundary_check=(0, 1)) + b_vg = (b_v * tl.exp(b_gn[None, :] - b_gv)).to(b_v.dtype) + # [BC, BC] + b_A = tl.load(p_A, boundary_check=(0, 1)) + b_o += tl.dot(b_A, b_vg) + # [BC, BV] + b_g = tl.load(p_g, boundary_check=(0, 1)) + b_o *= tl.exp(b_g - b_gn[None, :]) + + o_i = tl.arange(0, BC) + if HEAD_FIRST: + o_A = i_bh * T*BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BT + i_i * BC + else: + o_A = (bos + i_t * BT + i_i * BC + tl.arange(0, BC)) * HQ*BT + i_hq * BT + i_i * BC + m_A = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + for j in range(0, min(BC, T - i_t * BT - i_i * BC)): + if HEAD_FIRST: + p_v = tl.max_contiguous(tl.multiple_of(v + i_bg * T*V + (i_t * BT + i_i * BC + j) * V + o_v, BV), 
BV) + p_gv = tl.max_contiguous(tl.multiple_of(g + i_bg * T*V + (i_t * BT + i_i * BC + j) * V + o_v, BV), BV) + else: + p_v = tl.max_contiguous(tl.multiple_of(v + (bos + i_t * BT + i_i * BC + j) * H*V + i_h * V + o_v, BV), BV) + p_gv = tl.max_contiguous(tl.multiple_of(g + (bos + i_t * BT + i_i * BC + j) * H*V + i_h * V + o_v, BV), BV) + # [BC,] + b_A = tl.load(A + o_A + j, mask=m_A, other=0) + # [BV,] + b_v = tl.load(p_v, mask=m_v, other=0).to(tl.float32) + b_gv = tl.load(p_gv, mask=m_v, other=0).to(tl.float32) + # [BC, BV] + b_vg = b_v[None, :] * tl.exp(b_g - b_gv[None, :]) + # avoid 0 * inf = inf + b_o += tl.where(o_i[:, None] >= j, b_A[:, None] * b_vg, 0.) + if HEAD_FIRST: + p_o = tl.make_block_ptr(o + i_bh * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + else: + p_o = tl.make_block_ptr(o + (bos*HQ + i_hq) * V, (T, V), (HQ*V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + b_o += tl.load(p_o, boundary_check=(0, 1)) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gsa_bwd_k_kernel_dA( + v, + g, + do, + dA, + indices, + offsets, + scale, + B: tl.constexpr, + T: tl.constexpr, + HQ: tl.constexpr, + H: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BV: tl.constexpr, + NC: tl.constexpr, + NG: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_bg = i_bh // NG + i_b, i_hq = i_bh // HQ, i_bh % HQ + i_h = i_hq // NG + i_t, i_i, i_j = i_c // (NC * NC), (i_c % (NC * NC)) // NC, (i_c % (NC * NC)) % NC + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + all = T + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + all = B * T + + o_v = i_v * BV + tl.arange(0, BV) + m_v = o_v < V + + if i_t * BT + i_i * BC > T: + return + + if HEAD_FIRST: + p_dA = tl.make_block_ptr(dA+(i_v*B*H+i_bh)*T*BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + else: + p_dA = tl.make_block_ptr(dA+((i_v*all+bos)*HQ+i_hq)*BT, (T, BT), (HQ*BT, 1), (i_t*BT+i_i*BC, i_j*BC), (BC, BC), (1, 0)) + + # [BC, BC] + b_dA = tl.zeros([BC, BC], dtype=tl.float32) + if i_i > i_j: + if HEAD_FIRST: + p_v = tl.make_block_ptr(v + i_bg * T*V, (V, T), (1, V), (i_v * BV, i_t * BT + i_j * BC), (BV, BC), (0, 1)) + p_gv = tl.make_block_ptr(g + i_bg * T*V, (V, T), (1, V), (i_v * BV, i_t * BT + i_j * BC), (BV, BC), (0, 1)) + p_gn = tl.max_contiguous(tl.multiple_of(g + i_bg * T*V + (i_t * BT + i_i * BC) * V + o_v, BV), BV) + p_g = tl.make_block_ptr(g + i_bg * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + else: + p_v = tl.make_block_ptr(v + (bos*H+i_h) * V, (V, T), (1, H*V), (i_v * BV, i_t*BT + i_j*BC), (BV, BC), (0, 1)) + p_gv = tl.make_block_ptr(g + (bos*H+i_h) * V, (V, T), (1, H*V), (i_v * BV, i_t*BT + i_j*BC), (BV, BC), (0, 1)) + p_gn = tl.max_contiguous(tl.multiple_of(g + (bos + i_t*BT + i_i*BC) * H*V + i_h * V + o_v, BV), BV) + p_g = tl.make_block_ptr(g + (bos*H+i_h) * V, (T, V), (H*V, 1), (i_t*BT + i_i*BC, i_v*BV), (BC, BV), (1, 0)) + p_do = tl.make_block_ptr(do + (bos*HQ+i_hq) * V, (T, V), (HQ*V, 1), (i_t*BT + i_i*BC, i_v*BV), (BC, BV), (1, 0)) + # 
[BV,] + b_gn = tl.load(p_gn, mask=m_v, other=0.) + # [BC, BV] + b_g = tl.load(p_g, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_do = (b_do * tl.exp(b_g - b_gn[None, :]) * scale).to(b_do.dtype) + # [BV, BC] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_gv = tl.load(p_gv, boundary_check=(0, 1)) + b_vg = (b_v * tl.exp(b_gn[:, None] - b_gv)).to(b_v.dtype) + # [BC, BC] + b_dA = tl.dot(b_do, b_vg) + elif i_i == i_j: + if HEAD_FIRST: + p_g = tl.make_block_ptr(g + i_bg * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_v = tl.max_contiguous(tl.multiple_of(v + i_bg * T*V + (i_t * BT + i_j * BC) * V + o_v, BV), BV) + p_gv = tl.max_contiguous(tl.multiple_of(g + i_bg * T*V + (i_t * BT + i_j * BC) * V + o_v, BV), BV) + else: + p_g = tl.make_block_ptr(g + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t*BT + i_i*BC, i_v*BV), (BC, BV), (1, 0)) + p_do = tl.make_block_ptr(do + (bos*HQ + i_hq) * V, (T, V), (HQ*V, 1), (i_t*BT + i_i*BC, i_v*BV), (BC, BV), (1, 0)) + p_v = tl.max_contiguous(tl.multiple_of(v + (bos + i_t*BT + i_j*BC) * H*V + i_h * V + o_v, BV), BV) + p_gv = tl.max_contiguous(tl.multiple_of(g + (bos + i_t*BT + i_j*BC) * H*V + i_h * V + o_v, BV), BV) + # [BC, BV] + b_g = tl.load(p_g, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) * scale + m_v = o_v < V + + o_i = tl.arange(0, BC) + # [BC, BC] + m_dA = o_i[:, None] >= o_i[None, :] + for j in range(0, min(BC, T - i_t * BT - i_j * BC)): + # [BV,] + b_v = tl.load(p_v, mask=m_v, other=0).to(tl.float32) + b_gv = tl.load(p_gv, mask=m_v, other=0).to(tl.float32) + # [BC,] + b_dAj = tl.sum(b_do * b_v[None, :] * tl.exp(b_g - b_gv[None, :]), 1) + b_dA = tl.where((o_i == j)[None, :], b_dAj[:, None], b_dA) + + p_v += (1 if HEAD_FIRST else H) * V + p_gv += (1 if HEAD_FIRST else H) * V + b_dA = tl.where(m_dA, b_dA, 0.) 
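+    # each program writes back a single (BC, BC) tile of dA; tiles strictly above
+    # the sub-chunk diagonal (i_i < i_j) keep their zero initialization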
+ tl.store(p_dA, b_dA.to(dA.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gsa_bwd_k_kernel_dqkvg( + q, + k, + v, + h, + g, + A, + do, + dh, + dq, + dk, + dv, + dg, + dgv, + dA, + offsets, + indices, + scale, + B: tl.constexpr, + T: tl.constexpr, + HQ: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NG: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_bg = i_bh // NG + i_b, i_hq = i_bh // HQ, i_bh % HQ + i_h = i_hq // NG + if USE_OFFSETS: + i_tg = i_t + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + all = T + T = eos - bos + NT = tl.cdiv(T, BT) + else: + NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + all = B * T + + o_i = tl.arange(0, BT) + o_t = min(i_t * BT + BT, T) + m_s = o_i[:, None] >= o_i[None, :] + + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bg * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_A = tl.make_block_ptr(A + (i_k*B*H+i_bh) * T*BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos*HQ+i_hq) * K, (T, K), (HQ*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + (bos*H+i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_A = tl.make_block_ptr(A + ((i_k*all+bos)*HQ+i_hq)*BT, (T, BT), (HQ*BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BT] + b_A = tl.dot((b_q * scale).to(b_q.dtype), tl.trans(b_k)) + b_A = tl.where(m_s, b_A, 0.) 
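+    # the causal attention scores A are recomputed here instead of being saved in
+    # the forward pass; they are cached for the intra-chunk dv/dg kernel launched later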
+ tl.store(p_A, b_A.to(p_A.dtype.element_ty), boundary_check=(0, 1)) + + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + for i_v in range(tl.cdiv(V, BV)): + o_v = i_v * BV + tl.arange(0, BV) + if HEAD_FIRST: + p_v = tl.make_block_ptr(v + i_bg * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_g = tl.make_block_ptr(g + i_bg * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + i_bg * T*V + (o_t - 1) * V + o_v, BV), BV) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_k*B*H+i_bh) * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dg = tl.make_block_ptr(dg + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dgv = tl.make_block_ptr(dgv + (i_k*B*H+i_bh) * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + i_bg * NT*K*V + i_t * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + p_dh = tl.make_block_ptr(dh + i_bh * NT*K*V + i_t * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_v = tl.make_block_ptr(v + (bos*H+i_h)*V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_g = tl.make_block_ptr(g + (bos*H+i_h)*V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + (bos + o_t - 1) * H*V + i_h * V + o_v, BV), BV) + p_do = tl.make_block_ptr(do + (bos*HQ+i_hq)*V, (T, V), (HQ*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + ((i_k*all+bos)*HQ+i_hq)*V, (T, V), (HQ*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dg = tl.make_block_ptr(dg + (bos*HQ+i_hq)*V, (T, V), (HQ*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dgv = tl.make_block_ptr(dgv+((i_k*all+bos)*HQ+i_hq)*V, (T, V), (HQ*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + (i_tg * H + i_h) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + p_dh = tl.make_block_ptr(dh + (i_tg * HQ + i_hq) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + m_v = o_v < V + + # [BV,] + b_gn = tl.load(p_gn, mask=m_v, other=0) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_g = tl.load(p_g, boundary_check=(0, 1)) + b_gv = tl.exp(b_gn[None, :] - b_g) + # [BV, BK] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # [BT, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_do = (b_do * tl.exp(b_g) * scale).to(b_do.dtype) + # [BK, BV] + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + # [BV] + b_dg = tl.sum(tl.trans(b_h) * b_dh, 0) * tl.exp(b_gn) + + b_dh = b_dh.to(b_k.dtype) + # [BT, BK] + b_dq += tl.dot(b_do, b_h.to(b_k.dtype)) + b_dk += tl.dot((b_v * b_gv).to(b_v.dtype), tl.trans(b_dh)) + # [BT, BV] + b_dv = tl.dot(b_k, b_dh) * b_gv + # [BV] + b_dg += tl.sum(b_dv * b_v, 0) + + if i_k == 0: + b_dgv = tl.load(p_dg, boundary_check=(0, 1)) + b_dg[None, :] + else: + b_dgv = tl.zeros([BT, BV], dtype=tl.float32) + b_dg[None, :] + + tl.store(p_dgv, b_dgv.to(p_dgv.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + if HEAD_FIRST: + p_dA = tl.make_block_ptr(dA + i_bh * T*BT, (T, BT, ), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + p_dq = tl.make_block_ptr(dq + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_dA = 
tl.make_block_ptr(dA + (bos*HQ + i_hq) * BT, (T, BT), (HQ*BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + p_dq = tl.make_block_ptr(dq + (bos*HQ + i_hq) * K, (T, K), (HQ*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + (bos*HQ + i_hq) * K, (T, K), (HQ*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + # [BT, BT] + b_dA = tl.load(p_dA, boundary_check=(0, 1)) + # [BT, BK] + b_dq += tl.dot(b_dA, b_k) + b_dk += tl.dot(tl.trans(b_dA).to(b_k.dtype), b_q) + + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_gsa_bwd_k_kernel_intra_dvg( + v, + g, + o, + A, + do, + dv, + dg, + offsets, + indices, + T: tl.constexpr, + HQ: tl.constexpr, + H: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BV: tl.constexpr, + NC: tl.constexpr, + NG: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_bg = i_bh // NG + i_b, i_hq = i_bh // HQ, i_bh % HQ + i_h = i_hq // NG + i_t, i_i = i_c // NC, i_c % NC + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + o_v = i_v * BV + tl.arange(0, BV) + m_v = o_v < V + + if i_t * BT + i_i * BC > T: + return + + if HEAD_FIRST: + p_gv = tl.make_block_ptr(g + i_bg * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + i_bg * T*V + (min(i_t * BT + i_i * BC + BC, T) - 1) * V + o_v, BV), BV) + else: + p_gv = tl.make_block_ptr(g + (bos*H+i_h)*V, (T, V), (H*V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_gn = tl.max_contiguous(tl.multiple_of(g + (bos + min(i_t * BT + i_i * BC + BC, T)-1)*H*V + i_h*V + o_v, BV), BV) + # [BV,] + b_gn = tl.load(p_gn, mask=m_v, other=0) + # [BC, BV] + b_gv = tl.load(p_gv, boundary_check=(0, 1)) + b_dv = tl.zeros([BC, BV], dtype=tl.float32) + for i_j in range(i_i + 1, NC): + if HEAD_FIRST: + p_g = tl.make_block_ptr(g + i_bg * T*V, (T, V), (V, 1), (i_t * BT + i_j * BC, i_v * BV), (BC, BV), (1, 0)) + p_A = tl.make_block_ptr(A + i_bh * T*BT, (BT, T), (1, BT), (i_i * BC, i_t * BT + i_j * BC), (BC, BC), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT + i_j * BC, i_v * BV), (BC, BV), (1, 0)) + else: + p_g = tl.make_block_ptr(g + (bos*H+i_h) * V, (T, V), (H*V, 1), (i_t * BT + i_j * BC, i_v * BV), (BC, BV), (1, 0)) + p_A = tl.make_block_ptr(A + (bos*HQ+i_hq) * BT, (BT, T), (1, HQ*BT), (i_i*BC, i_t*BT + i_j*BC), (BC, BC), (0, 1)) + p_do = tl.make_block_ptr(do + (bos*HQ+i_hq) * V, (T, V), (HQ*V, 1), (i_t*BT + i_j*BC, i_v*BV), (BC, BV), (1, 0)) + # [BC, BV] + b_g = tl.load(p_g, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_do = (b_do * tl.exp(b_g - b_gn[None, :])).to(b_do.dtype) + # [BC, BC] + b_A = tl.load(p_A, boundary_check=(0, 1)) + b_dv += tl.dot(b_A, b_do) + b_dv *= tl.exp(b_gn[None, :] - b_gv) + + o_i = tl.arange(0, BC) + o_c = i_i * BC + tl.arange(0, BC) + + if HEAD_FIRST: + p_g = tl.max_contiguous(tl.multiple_of(g + i_bg * T*V + (i_t * BT + i_i * BC) * V + o_v, BV), BV) + p_A = tl.max_contiguous(tl.multiple_of(A + i_bh * T*BT + (i_t * BT + i_i * BC) * BT + o_c, BC), BC) + p_do = 
tl.max_contiguous(tl.multiple_of(do + i_bh * T*V + (i_t * BT + i_i * BC) * V + o_v, BV), BV) + else: + p_g = tl.max_contiguous(tl.multiple_of(g + (bos + i_t * BT + i_i * BC) * H*V + i_h * V + o_v, BV), BV) + p_A = tl.max_contiguous(tl.multiple_of(A + (bos + i_t*BT + i_i*BC) * HQ*BT + i_hq * BT + o_c, BC), BC) + p_do = tl.max_contiguous(tl.multiple_of(do + (bos + i_t*BT + i_i*BC) * HQ*V + i_hq * V + o_v, BV), BV) + + for j in range(0, min(BC, T - i_t * BT - i_i * BC)): + # [BC,] + b_A = tl.load(p_A) + # [BV,] + b_g = tl.load(p_g, mask=m_v, other=0) + b_do = tl.load(p_do, mask=m_v, other=0) + # [BC, BV] + m_i = o_i[:, None] <= j + b_dv += tl.where(m_i, tl.exp(b_g[None, :] - b_gv) * b_A[:, None] * b_do[None, :], 0.) + + p_g += (1 if HEAD_FIRST else H) * V + p_A += (1 if HEAD_FIRST else HQ) * BT + p_do += (1 if HEAD_FIRST else HQ) * V + if HEAD_FIRST: + p_o = tl.make_block_ptr(o + i_bh * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_v = tl.make_block_ptr(v + i_bg * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + i_bh * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + p_dg = tl.make_block_ptr(dg + i_bh * T*V, (T, V), (V, 1), (i_t * BT + i_i * BC, i_v * BV), (BC, BV), (1, 0)) + else: + p_o = tl.make_block_ptr(o + (bos*HQ+i_hq)*V, (T, V), (HQ*V, 1), (i_t*BT + i_i*BC, i_v*BV), (BC, BV), (1, 0)) + p_v = tl.make_block_ptr(v + (bos*H+i_h)*V, (T, V), (H*V, 1), (i_t*BT + i_i*BC, i_v*BV), (BC, BV), (1, 0)) + p_do = tl.make_block_ptr(do + (bos*HQ+i_hq)*V, (T, V), (HQ*V, 1), (i_t*BT + i_i*BC, i_v*BV), (BC, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + (bos*HQ+i_hq)*V, (T, V), (HQ*V, 1), (i_t*BT + i_i*BC, i_v*BV), (BC, BV), (1, 0)) + p_dg = tl.make_block_ptr(dg + (bos*HQ+i_hq)*V, (T, V), (HQ*V, 1), (i_t*BT + i_i*BC, i_v*BV), (BC, BV), (1, 0)) + + b_o = tl.load(p_o, boundary_check=(0, 1)).to(tl.float32) + b_v = tl.load(p_v, boundary_check=(0, 1)).to(tl.float32) + b_do = tl.load(p_do, boundary_check=(0, 1)).to(tl.float32) + b_dv = b_dv + tl.load(p_dv, boundary_check=(0, 1)).to(tl.float32) + b_dg = b_o * b_do - b_v * b_dv + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dg, b_dg.to(p_dg.dtype.element_ty), boundary_check=(0, 1)) + + +def chunk_gsa_fwd_v( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + scale: float = 1., + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor, torch.Tensor]: + _, A, h, ht, o = chunk_gla_fwd( + q=q, + k=k, + v=v, + g=None, + g_cumsum=g, + scale=scale, + initial_state=initial_state, + output_final_state=output_final_state, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return A, h, ht, o + + +def chunk_gsa_fwd_k( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + h0: Optional[torch.Tensor] = None, + output_final_state: bool = False, + scale: float = 1., + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, 
v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BC = min(16, BT) + BK = min(64, triton.next_power_of_2(K)) + BV = min(64, triton.next_power_of_2(V)) + HQ = q.shape[1] if head_first else q.shape[2] + NV = triton.cdiv(V, BV) + NC = triton.cdiv(BT, BC) + NG = HQ // H + num_warps = 4 if BK == 64 else 2 + num_stages = 1 + + h, ht = chunk_fwd_h( + k=k, + v=v, + g=None, + gk=None, + gv=g, + h0=h0, + output_final_state=output_final_state, + states_in_fp32=False, + offsets=offsets, + head_first=head_first, + chunk_size=BT + ) + o = v.new_empty(B, *((HQ, T) if head_first else (T, HQ)), V) + A = q.new_empty(B, *((HQ, T) if head_first else (T, HQ)), BT) + grid = (NV, NT, B * HQ) + chunk_gsa_fwd_k_kernel_inter[grid]( + q, + k, + h, + g, + o, + A, + offsets=offsets, + indices=indices, + scale=scale, + T=T, + HQ=HQ, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + NG=NG, + HEAD_FIRST=head_first, + num_warps=num_warps, + num_stages=num_stages + ) + grid = (NV, NT * NC, B * HQ) + chunk_gsa_fwd_k_kernel_intra[grid]( + v, + g, + o, + A, + offsets=offsets, + indices=indices, + T=T, + HQ=HQ, + H=H, + V=V, + BT=BT, + BC=BC, + BV=BV, + NC=NC, + NG=NG, + HEAD_FIRST=head_first, + num_warps=num_warps, + num_stages=num_stages + ) + return A, h, ht, o + + +def chunk_gsa_bwd_v( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + h0: torch.Tensor, + h: torch.Tensor, + A: torch.Tensor, + do: torch.Tensor, + dht: torch.Tensor, + dg: torch.Tensor, + scale: float = 1., + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + dq, dk, dv, dg, dh0 = chunk_gla_bwd( + q=q, + k=k, + v=v, + g=None, + g_cumsum=g, + scale=scale, + initial_state=h0, + h=h, + A=A, + do=do, + dht=dht, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return dq, dk, dv, dg, dh0 + + +def chunk_gsa_bwd_k( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + h: torch.Tensor, + h0: torch.Tensor, + o: torch.Tensor, + do: torch.Tensor, + dht: torch.Tensor, + dg: torch.Tensor, + scale: float = 1., + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +): + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BC = min(16, BT) + BK = min(64, triton.next_power_of_2(K)) + BV = min(64, triton.next_power_of_2(V)) + HQ = q.shape[1] if head_first else q.shape[2] + NC = triton.cdiv(BT, BC) + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + NG = HQ // H + num_warps = 4 if BK == 64 else 2 + num_stages = 1 + + if h is None: + h, _ = chunk_fwd_h( + k=k, + v=v, + g=None, + gk=None, + gv=g, + h0=h0, + output_final_state=False, + states_in_fp32=False, + offsets=offsets, + 
head_first=head_first, + chunk_size=chunk_size + ) + dh, dh0 = chunk_bwd_dh( + q=q, + k=k, + v=v, + g=None, + gk=None, + gv=g, + do=do, + h0=h0, + dht=dht, + scale=scale, + states_in_fp32=True, + offsets=offsets, + head_first=head_first, + chunk_size=BT + ) + dA = q.new_empty(NV, B, *((HQ, T) if head_first else (T, HQ)), BT) + grid = (NV, NT * NC * NC, B * HQ) + chunk_gsa_bwd_k_kernel_dA[grid]( + v, + g, + do, + dA, + offsets=offsets, + indices=indices, + scale=scale, + B=B, + T=T, + HQ=HQ, + H=H, + V=V, + BT=BT, + BC=BC, + BV=BV, + NC=NC, + NG=NG, + HEAD_FIRST=head_first, + num_warps=num_warps, + num_stages=num_stages + ) + dA = dA.sum(0, dtype=dA.dtype) + + A = do.new_empty(NK, B, *((HQ, T) if head_first else (T, HQ)), BT) + dq = torch.empty_like(q) + dk = k.new_empty(B, *((HQ, T) if head_first else (T, HQ)), K) + dv = v.new_empty(NK, B, *((HQ, T) if head_first else (T, HQ)), V) + dgv = g.new_empty(NK, B, *((HQ, T) if head_first else (T, HQ)), V, dtype=torch.float) + grid = (NK, NT, B * HQ) + chunk_gsa_bwd_k_kernel_dqkvg[grid]( + q, + k, + v, + h, + g, + A, + do, + dh, + dq, + dk, + dv, + dg, + dgv, + dA, + offsets=offsets, + indices=indices, + scale=scale, + B=B, + T=T, + HQ=HQ, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + NG=NG, + HEAD_FIRST=head_first, + num_warps=num_warps, + num_stages=num_stages + ) + A = A.sum(0, dtype=A.dtype) + dv = dv.sum(0, dtype=dv.dtype) + dgv = dgv.sum(0, dtype=dgv.dtype) + + grid = (NV, NT * NC, B * HQ) + chunk_gsa_bwd_k_kernel_intra_dvg[grid]( + v, + g, + o, + A, + do, + dv, + dg, + offsets=offsets, + indices=indices, + T=T, + HQ=HQ, + H=H, + V=V, + BT=BT, + BC=BC, + BV=BV, + NC=NC, + NG=NG, + HEAD_FIRST=head_first, + num_warps=num_warps, + num_stages=num_stages + ) + dg = dgv.add_(chunk_local_cumsum(dg, chunk_size=BT, reverse=True, offsets=offsets, indices=indices, head_first=head_first)) + + return dq, dk, dv, dg, dh0 + + +def chunk_gsa_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + initial_state: Optional[Tuple[torch.Tensor, torch.Tensor]] = None, + output_final_state: bool = False, + scale: float = 1., + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor, torch.Tensor]: + hk0, hv0 = None, None + if initial_state is not None: + hk0, hv0 = initial_state + Ak, hk, hkt, ok = chunk_gsa_fwd_k( + q=q, + k=k, + v=s, + g=g, + h0=hk0, + output_final_state=output_final_state, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + + # p is kept in fp32 for safe softmax backward + p = softmax_fwd(ok, dtype=torch.float) + + qv = p.to(q.dtype) + Av, hv, hvt, ov = chunk_gsa_fwd_v( + q=qv, + k=s, + v=v, + g=g, + scale=1., + initial_state=hv0, + output_final_state=output_final_state, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return Ak, hk, hkt, ok, p, Av, hv, hvt, ov + + +def chunk_gsa_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + ok: torch.Tensor, + p: torch.Tensor, + A: Tuple[torch.Tensor, torch.Tensor], + h: Tuple[torch.Tensor, torch.Tensor], + initial_state: Optional[Tuple[torch.Tensor, torch.Tensor]], + scale: float, + do: torch.Tensor, + dht: Tuple[torch.Tensor, torch.Tensor], + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: 
int = 64 +): + hk0, hv0 = None, None + if initial_state is not None: + hk0, hv0 = initial_state + + _, Av = A + hk, hv = h + dhkt, dhvt = dht + + qv = p.to(q.dtype) + dqv, dsv, dv, dg, dhv0 = chunk_gsa_bwd_v( + q=qv, + k=s, + v=v, + g=g, + h0=hv0, + h=hv, + A=Av, + do=do, + dht=dhvt, + dg=None, + scale=1., + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + + # softmax gradient, equivalent to: + # dok = qv * (dqv - (qv * dqv).sum(-1, True)) + dok = softmax_bwd(p, dqv, dtype=ok.dtype) + + dq, dk, dsk, dg, dhk0 = chunk_gsa_bwd_k( + q=q, + k=k, + v=s, + g=g, + h0=hk0, + h=hk, + o=ok, + do=dok, + dht=dhkt, + dg=dg, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + + ds = dsv.add_(dsk) + if q.shape[1] != k.shape[1]: + dk, dv, ds, dg = map(lambda x: reduce(x, 'b (h g) ... -> b h ...', 'sum', h=k.shape[1]), (dk, dv, ds, dg)) + dg = dg.to(s.dtype) + return dq, dk, dv, ds, dg, dhk0, dhv0 + + +class ChunkGSAFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + scale: float, + hk0: Optional[torch.Tensor], + hv0: Optional[torch.Tensor], + output_final_state: bool, + checkpoint_level: int, + offsets: Optional[torch.LongTensor], + head_first: bool = True + ) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + T = q.shape[2] if head_first else q.shape[1] + chunk_size = min(64, triton.next_power_of_2(T)) + + # 2-d indices denoting the offsets of chunks in each sequence + # for example, if the passed `offsets` is [0, 100, 356] and `chunk_size` is 64, + # then there are 2 and 4 chunks in the 1st and 2nd sequences respectively, and `indices` will be + # [[0, 0], [0, 1], [1, 0], [1, 1], [1, 2], [1, 3]] + indices = None + if offsets is not None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], chunk_size).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + g_org, g = g, chunk_local_cumsum(g, chunk_size, offsets=offsets, indices=indices, head_first=head_first) + Ak, hk, hkt, ok, p, Av, hv, hvt, ov = chunk_gsa_fwd( + q=q, + k=k, + v=v, + s=s, + g=g, + initial_state=(hk0, hv0), + output_final_state=output_final_state, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + + if checkpoint_level >= 1: + del g + g = g_org + if checkpoint_level > 1: + del hk + del hv + hk, hv = None, None + else: + hk0, hv0 = None, None + + ctx.save_for_backward(q, k, v, s, g, ok, p, Av, hk0, hv0, hk, hv) + ctx.checkpoint_level = checkpoint_level + ctx.scale = scale + ctx.offsets = offsets + ctx.indices = indices + ctx.head_first = head_first + ctx.chunk_size = chunk_size + return ov, hkt, hvt + + @staticmethod + @contiguous + def backward(ctx, dov, dhkt=None, dhvt=None): + q, k, v, s, g, ok, p, Av, hk0, hv0, hk, hv = ctx.saved_tensors + scale = ctx.scale + offsets = ctx.offsets + indices = ctx.indices + head_first = ctx.head_first + chunk_size = ctx.chunk_size + + if ctx.checkpoint_level >= 1: + g = chunk_local_cumsum(g, chunk_size, offsets=offsets, indices=indices, head_first=head_first) + dq, dk, dv, ds, dg, dhk0, dhv0 = chunk_gsa_bwd( + q=q, + k=k, + v=v, + s=s, + g=g, + ok=ok, + p=p, + A=(None, Av), + h=(hk, hv), + initial_state=(hk0, hv0), + scale=scale, + do=dov, + dht=(dhkt, dhvt), + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size 
+        )
+        return dq, dk, dv, ds, dg, None, dhk0, dhv0, None, None, None, None
+
+
+def chunk_gsa(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    s: torch.Tensor,
+    g: Optional[torch.Tensor] = None,
+    scale: Optional[float] = None,
+    initial_state: Optional[Tuple[torch.Tensor]] = None,
+    output_final_state: Optional[bool] = False,
+    checkpoint_level: Optional[int] = 2,
+    offsets: Optional[torch.LongTensor] = None,
+    head_first: Optional[bool] = True
+) -> Tuple[torch.Tensor, torch.Tensor]:
+    r"""
+    Args:
+        q (torch.Tensor):
+            queries of shape `[B, HQ, T, K]` if `head_first=True` else `[B, T, HQ, K]`.
+        k (torch.Tensor):
+            keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+            GQA is performed if `H` is not equal to `HQ`.
+        v (torch.Tensor):
+            values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        s (torch.Tensor):
+            slot representations of shape `[B, H, T, M]` if `head_first=True` else `[B, T, H, M]`.
+        g (torch.Tensor):
+            Forget gates of shape `[B, H, T, M]` applied to keys.
+            If not provided, this function is equivalent to vanilla ABC.
+        scale (Optional[float]):
+            Scale factor for attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[Tuple[torch.Tensor]]):
+            Initial state tuple having tensors of shape `[N, H, K, M]` and `[N, H, M, V]` for `N` input sequences.
+            For equal-length input sequences, `N` equals the batch size `B`.
+            Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state tuple, having tensors of shape `[N, H, K, M]` and `[N, H, M, V]`.
+            Default: `False`.
+        checkpoint_level (Optional[int]):
+            Checkpointing level; higher values save more memory but require more recomputation during backward.
+            Default: `2`:
+            - Level `0`: no memory saved, no recomputation.
+            - Level `1`: recompute the fp32 cumulative values during backward.
+            - Level `2`: recompute the fp32 cumulative values and forward hidden states during backward.
+        offsets (Optional[torch.LongTensor]):
+            Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch.
+            For example,
+            if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively.
+            If provided, the inputs are concatenated and the batch size `B` is expected to be 1.
+            Default: `None`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format, which is not supported for variable-length inputs.
+            Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        final_state (Tuple[torch.Tensor]):
+            Final state tuple having tensors of shape `[N, H, K, M]` and `[N, H, M, V]` if `output_final_state=True`.
+            `None` otherwise.
+
+    Examples::
+        >>> import torch
+        >>> import torch.nn.functional as F
+        >>> from einops import rearrange
+        >>> from fla.ops.gsa import chunk_gsa
+        # inputs with equal lengths
+        >>> B, T, H, K, V, M = 4, 2048, 4, 512, 512, 64
+        >>> q = torch.randn(B, T, H, K, device='cuda')
+        >>> k = torch.randn(B, T, H, K, device='cuda')
+        >>> v = torch.randn(B, T, H, V, device='cuda')
+        >>> s = torch.randn(B, T, H, M, device='cuda')
+        >>> g = F.logsigmoid(torch.randn(B, T, H, M, device='cuda'))
+        >>> h0 = (torch.randn(B, H, K, M, device='cuda'), torch.randn(B, H, M, V, device='cuda'))
+        >>> o, (hk, hv) = chunk_gsa(q, k, v, s, g,
+                                    initial_state=h0,
+                                    output_final_state=True,
+                                    head_first=False)
+        # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required
+        >>> q, k, v, s, g = map(lambda x: rearrange(x, 'b t h d -> 1 (b t) h d'), (q, k, v, s, g))
+        # for a batch with 4 sequences, offsets with 5 start/end positions are expected
+        >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long)
+        >>> o_var, (hk_var, hv_var) = chunk_gsa(q, k, v, s, g,
+                                                initial_state=h0,
+                                                output_final_state=True,
+                                                offsets=offsets,
+                                                head_first=False)
+        >>> assert o.allclose(o_var.view(o.shape))
+        >>> assert hk.allclose(hk_var)
+        >>> assert hv.allclose(hv_var)
+    """
+    if offsets is not None:
+        if q.shape[0] != 1:
+            raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`. "
+                             f"Please flatten variable-length inputs before processing.")
+        if head_first:
+            raise RuntimeError("Sequences with variable lengths are not supported for head-first mode")
+        if initial_state is not None and initial_state[0].shape[0] != len(offsets) - 1:
+            raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, "
+                             f"i.e., {len(offsets) - 1} rather than {initial_state[0].shape[0]}.")
+    assert checkpoint_level in [0, 1, 2]
+    if g is None:
+        # TODO: these 3 steps take a huge amount of time and ought to be optimized
+        z = s.float().logcumsumexp(2)
+        g = torch.cat((z[:, :, :1], z[:, :, :-1]), 2) - z
+        s = torch.exp(s - z).to(k.dtype)
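+        # note: with Z_t = sum_{i<=t} exp(s_i) (from `logcumsumexp`), the lines above
+        # set g_t = log(Z_{t-1} / Z_t) and s_t = exp(s_t) / Z_t, so a recurrence of the
+        # form h_t = exp(g_t) * h_{t-1} + s_t * v_t maintains a running softmax over
+        # the slot logits, matching the vanilla ABC formulation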
+
+    Examples::
+        >>> import torch
+        >>> import torch.nn.functional as F
+        >>> from einops import rearrange
+        >>> from fla.ops.gsa import chunk_gsa
+        # inputs with equal lengths
+        >>> B, T, H, K, V, M = 4, 2048, 4, 512, 512, 64
+        >>> q = torch.randn(B, T, H, K, device='cuda')
+        >>> k = torch.randn(B, T, H, K, device='cuda')
+        >>> v = torch.randn(B, T, H, V, device='cuda')
+        >>> s = torch.randn(B, T, H, M, device='cuda')
+        >>> g = F.logsigmoid(torch.randn(B, T, H, M, device='cuda'))
+        >>> h0 = (torch.randn(B, H, K, M, device='cuda'), torch.randn(B, H, M, V, device='cuda'))
+        >>> o, (hk, hv) = chunk_gsa(q, k, v, s, g,
+                                    initial_state=h0,
+                                    output_final_state=True,
+                                    head_first=False)
+        # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required
+        >>> q, k, v, s, g = map(lambda x: rearrange(x, 'b t h d -> 1 (b t) h d'), (q, k, v, s, g))
+        # for a batch with 4 sequences, offsets with 5 start/end positions are expected
+        >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long)
+        >>> o_var, (hk_var, hv_var) = chunk_gsa(q, k, v, s, g,
+                                                initial_state=h0,
+                                                output_final_state=True,
+                                                offsets=offsets,
+                                                head_first=False)
+        >>> assert o.allclose(o_var.view(o.shape))
+        >>> assert hk.allclose(hk_var)
+        >>> assert hv.allclose(hv_var)
+    """
+    if offsets is not None:
+        if q.shape[0] != 1:
+            raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`. "
+                             f"Please flatten variable-length inputs before processing.")
+        if head_first:
+            raise RuntimeError("Sequences with variable lengths are not supported for head-first mode")
+        if initial_state is not None and initial_state[0].shape[0] != len(offsets) - 1:
+            raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, "
+                             f"i.e., {len(offsets) - 1} rather than {initial_state[0].shape[0]}.")
+    assert checkpoint_level in [0, 1, 2]
+    if g is None:
+        # TODO: these three steps take a huge amount of time and ought to be optimized
+        z = s.float().logcumsumexp(2)
+        g = torch.cat((z[:, :, :1], z[:, :, :-1]), 2) - z
+        s = torch.exp(s - z).to(k.dtype)
+    if scale is None:
+        scale = q.shape[-1] ** -0.5
+
+    hk0, hv0 = None, None
+    if initial_state is not None:
+        hk0, hv0 = initial_state
+    o, *final_state = ChunkGSAFunction.apply(
+        q,
+        k,
+        v,
+        s,
+        g,
+        scale,
+        hk0,
+        hv0,
+        output_final_state,
+        checkpoint_level,
+        offsets,
+        head_first
+    )
+    return o, final_state
diff --git a/fla/ops/gsa/fused_recurrent.py b/fla/ops/gsa/fused_recurrent.py
new file mode 100644
index 0000000000000000000000000000000000000000..bebc04c6fae7a521199a533d10caece277f34630
--- /dev/null
+++ b/fla/ops/gsa/fused_recurrent.py
@@ -0,0 +1,565 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from typing import Optional, Tuple
+
+import torch
+import triton
+import triton.language as tl
+
+from fla.ops.common.fused_recurrent import (fused_recurrent_bwd_kernel,
+                                            fused_recurrent_fwd_kernel)
+from fla.ops.utils import chunk_global_cumsum
+from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous
+
+
+@triton.jit
+def fused_recurrent_gsa_inference_kernel(
+    q,
+    k,
+    v,
+    s,
+    g,
+    o,
+    hk0,
+    hv0,
+    hkt,
+    hvt,
+    scale,
+    K: tl.constexpr,
+    V: tl.constexpr,
+    M: tl.constexpr,
+    BK: tl.constexpr,
+    BV: tl.constexpr,
+    NG: tl.constexpr
+):
+    i_bh = tl.program_id(0)
+    i_bg = i_bh // NG
+
+    b_s = tl.load(s + i_bg * M + tl.arange(0, M)).to(tl.float32)
+    b_g = tl.load(g + i_bg * M + tl.arange(0, M)).to(tl.float32)
+    b_g = 
tl.exp(b_g) + + b_ok = tl.zeros([M], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + o_k = i_k * BK + tl.arange(0, BK) + + p_hk0 = hk0 + i_bg * K * M + (o_k[None, :]) * M + tl.arange(0, M)[:, None] + # [BK,] + mask_k = o_k < K + # [M, BK] + mask_hk = (tl.arange(0, M) < M)[:, None] & mask_k[None, :] + # [M, BK] + b_hk = tl.load(p_hk0, mask=mask_hk, other=0.).to(tl.float32) + # [BK,] + b_q = tl.load(q + i_bh * K + o_k, mask=mask_k, other=0.).to(tl.float32) * scale + b_k = tl.load(k + i_bg * K + o_k, mask=mask_k, other=0.).to(tl.float32) + b_hk = b_hk * b_g[:, None] + b_k[None, :] * b_s[:, None] + b_ok += tl.sum(b_hk * b_q[None, :], axis=1) + + if i_bh % NG == 0: + p_hkt = hkt + i_bg * K * M + o_k[None, :] * M + tl.arange(0, M)[:, None] + tl.store(p_hkt, b_hk.to(p_hkt.dtype.element_ty), mask=mask_hk) + + b_qv = tl.softmax(b_ok) + for i_v in range(tl.cdiv(V, BV)): + o_v = i_v * BV + tl.arange(0, BV) + + p_hv0 = hv0 + i_bg * M * V + tl.arange(0, M)[None, :] * V + o_v[:, None] + # [BV,] + mask_v = o_v < V + # [BV, M] + mask_hv = mask_v[:, None] & (tl.arange(0, M) < M)[None, :] + # [BV, M] + b_hv = tl.load(p_hv0, mask=mask_hv, other=0).to(tl.float32) + # [BV,] + b_v = tl.load(v + i_bg * V + o_v, mask=mask_v, other=0).to(tl.float32) + b_hv = b_hv * b_g[None, :] + b_s[None, :] * b_v[:, None] + b_ov = tl.sum(b_hv * b_qv[None, :], axis=1) + + tl.store(o + i_bh * V + o_v, b_ov.to(o.dtype.element_ty), mask=mask_v) + + if i_bh % NG == 0: + p_hvt = hvt + i_bg * M * V + tl.arange(0, M)[None, :] * V + o_v[:, None] + tl.store(p_hvt, b_hv.to(p_hvt.dtype.element_ty), mask=mask_hv) + + +def fused_recurrent_gsa_inference( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + initial_state: Optional[Tuple[torch.Tensor, torch.Tensor]] = None, + output_final_state: bool = False, + scale: float = 1., + head_first: bool = True +) -> torch.Tensor: + if head_first: + B, H, T, K, V, M = *k.shape, v.shape[-1], s.shape[-1] + else: + B, T, H, K, V, M = *k.shape, v.shape[-1], s.shape[-1] + HQ = q.shape[1] if head_first else q.shape[2] + BK, BV = min(triton.next_power_of_2(K), 64), min(triton.next_power_of_2(V), 64) + NG = HQ // H + + hk0, hv0 = None, None + if initial_state is not None: + hk0, hv0 = initial_state + hkt, hvt = None, None + if output_final_state: + if NG == 1: + hkt, hvt = hk0, hv0 + else: + hkt, hvt = q.new_empty(B, H, K, M, dtype=torch.float), q.new_empty(B, H, M, V, dtype=torch.float) + + o = v.new_empty(B, HQ, T, V) if head_first else v.new_empty(B, T, HQ, V) + grid = (B * HQ,) + fused_recurrent_gsa_inference_kernel[grid]( + q, + k, + v, + s, + g, + o, + hk0, + hv0, + hkt, + hvt, + scale=scale, + K=K, + V=V, + M=M, + BK=BK, + BV=BV, + NG=NG + ) + return o, (hkt, hvt) + + +def fused_recurrent_gsa_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + initial_state: Optional[Tuple[torch.Tensor, torch.Tensor]] = None, + output_final_state: bool = False, + scale: float = 1., + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, Tuple[torch.Tensor]]: + if head_first: + B, H, T, K, V, M = *k.shape, v.shape[-1], s.shape[-1] + else: + B, T, H, K, V, M = *k.shape, v.shape[-1], s.shape[-1] + N = B if offsets is None else len(offsets) - 1 + HQ = q.shape[1] if head_first else q.shape[2] + if HQ != H: + raise ValueError("GQA not supported yet.") + + BK, BV, BM = min(triton.next_power_of_2(K), 64), min(triton.next_power_of_2(V), 64), min(M, 64) + NK, NV, 
NM = triton.cdiv(K, BK), triton.cdiv(V, BV), triton.cdiv(M, BM) + + hk0, hv0 = None, None + if initial_state is not None: + hk0, hv0 = initial_state + hkt, hvt = None, None + if output_final_state: + hkt, hvt = q.new_empty(N, H, K, M, dtype=torch.float), q.new_empty(N, H, M, V, dtype=torch.float) + + ok = q.new_empty(NK, *s.shape, dtype=torch.float) + gk, gv = None, g + grid = (NM, NK, N * H) + fused_recurrent_fwd_kernel[grid]( + q=q, + k=k, + v=s, + g=None, + gk=gk, + gv=gv, + o=ok, + h0=hk0, + ht=hkt, + offsets=offsets, + scale=scale, + B=B, + T=T, + H=H, + K=K, + V=M, + BK=BK, + BV=BM, + USE_G=False, + USE_GK=False, + USE_GV=True, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + ok = ok.sum(0) + + qv = ok.softmax(-1, dtype=torch.float) + ov = q.new_empty(NM, *v.shape, dtype=torch.float) + gk, gv = g, None + grid = (NV, NM, N * H) + fused_recurrent_fwd_kernel[grid]( + q=qv, + k=s, + v=v, + g=None, + gk=gk, + gv=gv, + o=ov, + h0=hv0, + ht=hvt, + offsets=offsets, + scale=1., + B=B, + T=T, + H=H, + K=M, + V=V, + BK=BM, + BV=BV, + USE_G=False, + USE_GK=True, + USE_GV=False, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + ov = ov.sum(0) + return ok, hkt, qv, ov, hvt + + +def fused_recurrent_gsa_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + qv: torch.Tensor, + hk0: Optional[torch.Tensor] = None, + hv0: Optional[torch.Tensor] = None, + ok: Optional[torch.Tensor] = None, + do: Optional[torch.Tensor] = None, + dhkt: Optional[torch.Tensor] = None, + dhvt: Optional[torch.Tensor] = None, + scale: float = 1., + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor]: + if head_first: + B, H, T, K, V, M = *q.shape, v.shape[-1], s.shape[-1] + else: + B, T, H, K, V, M = *q.shape, v.shape[-1], s.shape[-1] + N = B if offsets is None else len(offsets) - 1 + + BK, BV, BM = min(K, 64), min(V, 64), min(M, 64) + NK, NV, NM = triton.cdiv(K, BK), triton.cdiv(V, BV), triton.cdiv(M, BM) + + if head_first: + dqv = q.new_empty(NV, B, H, T, M, dtype=torch.float) + dsv = q.new_empty(NV, B, H, T, M, dtype=torch.float) + dv = q.new_empty(NM, B, H, T, V, dtype=torch.float) + else: + dqv = q.new_empty(NV, B, T, H, M, dtype=torch.float) + dsv = q.new_empty(NV, B, T, H, M, dtype=torch.float) + dv = q.new_empty(NM, B, T, H, V, dtype=torch.float) + dhk0 = torch.empty_like(hk0)if hk0 is not None else None + dhv0 = torch.empty_like(hv0)if hv0 is not None else None + + gk, gv = g, None + grid = (NV, NM, N * H) + fused_recurrent_bwd_kernel[grid]( + q=qv, + k=s, + v=v, + g=None, + gk=gk, + gv=gv, + h0=hv0, + do=do, + dq=dqv, + dk=dsv, + dv=dv, + dht=dhvt, + dh0=dhv0, + offsets=offsets, + scale=1., + B=B, + T=T, + H=H, + K=M, + V=V, + BK=BM, + BV=BV, + USE_G=False, + USE_GK=True, + USE_GV=False, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + dqv = dqv.sum(0) + dsv = dsv.sum(0) + dv = dv.sum(0) + dgk = chunk_global_cumsum(dqv * qv.float() - dsv * s.float(), + reverse=not reverse, + offsets=offsets, + head_first=head_first) + + dok = qv * (dqv - (qv * dqv).sum(-1, True)) + if head_first: + dq = q.new_empty(NM, B, H, T, K, dtype=torch.float) + dk = q.new_empty(NM, B, H, T, K, dtype=torch.float) + dsk = q.new_empty(NK, B, H, T, M, dtype=torch.float) + else: + dq = q.new_empty(NM, B, T, H, K, dtype=torch.float) + dk = q.new_empty(NM, B, T, H, K, dtype=torch.float) + dsk = q.new_empty(NK, B, T, H, M, dtype=torch.float) + gk, gv = None, g + grid = (NM, NK, N * H) + fused_recurrent_bwd_kernel[grid]( + q=q, + k=k, + 
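+        # `s` plays the role of the value tensor in this first-stage (key-pass) kernel:
+        # its "V" dimension here is M, with the log gates g applied along it via gv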
v=s, + g=None, + gk=gk, + gv=gv, + h0=hk0, + do=dok, + dq=dq, + dk=dk, + dv=dsk, + dht=dhkt, + dh0=dhk0, + offsets=offsets, + scale=scale, + B=B, + T=T, + H=H, + K=K, + V=M, + BK=BK, + BV=BM, + USE_G=False, + USE_GK=False, + USE_GV=True, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + dq = dq.sum(0) + dk = dk.sum(0) + dsk = dsk.sum(0) + + dgv = chunk_global_cumsum(dok.float() * ok.float() - dsk * s.float(), + reverse=not reverse, + offsets=offsets, + head_first=head_first) + + ds = dsk.add_(dsv) + dg = dgk.add_(dgv) + + return dq, dk, dv, ds, dg, dhk0, dhv0 + + +class FusedRecurrentGSAFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward( + ctx, + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + scale: Optional[float] = None, + hk0: Optional[torch.Tensor] = None, + hv0: Optional[torch.Tensor] = None, + output_final_state: bool = False, + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True + ) -> Tuple[torch.Tensor, Tuple[torch.Tensor]]: + T = q.shape[2] if head_first else q.shape[1] + if T == 1 and not q.requires_grad: + o, (hkt, hvt) = fused_recurrent_gsa_inference( + q=q, + k=k, + v=v, + s=s, + g=g, + initial_state=(hk0, hv0), + output_final_state=output_final_state, + scale=scale, + head_first=head_first + ) + return o, (hkt, hvt) + ok, hkt, qv, ov, hvt = fused_recurrent_gsa_fwd( + q=q, + k=k, + v=v, + s=s, + g=g, + initial_state=(hk0, hv0), + output_final_state=output_final_state, + scale=scale, + reverse=reverse, + offsets=offsets, + head_first=head_first + ) + ctx.save_for_backward(q, k, v, s, g, qv, hk0, hv0, ok) + ctx.scale = scale + ctx.reverse = reverse + ctx.offsets = offsets + ctx.head_first = head_first + return ov.to(q.dtype), hkt, hvt + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dhkt=None, dhvt=None): + q, k, v, s, g, qv, hk0, hv0, ok = ctx.saved_tensors + scale = ctx.scale + reverse = ctx.reverse + offsets = ctx.offsets + head_first = ctx.head_first + + # not supported yet. + if dhkt is not None or dhvt is not None: + if g is not None: + assert g.requires_grad is False, "Cannot load final state gradient and use gates at the same time" + dq, dk, dv, ds, dg, dhk0, dhv0 = fused_recurrent_gsa_bwd( + q=q, + k=k, + v=v, + s=s, + g=g, + qv=qv, + hk0=hk0, + hv0=hv0, + ok=ok, + do=do, + dhkt=dhkt, + dhvt=dhvt, + scale=scale, + reverse=reverse, + offsets=offsets, + head_first=head_first + ) + return dq.to(q), dk.to(k), dv.to(v), ds.to(s), dg.to(g), None, dhk0, dhv0, None, None, None, None + + +def fused_recurrent_gsa( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: Optional[torch.Tensor] = None, + scale: Optional[int] = None, + initial_state: Optional[Tuple[torch.Tensor]] = None, + output_final_state: Optional[bool] = False, + reverse: Optional[bool] = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + s (torch.Tensor): + slot representations of shape `[B, H, T, M]` if `head_first=True` else `[B, T, H, M]`. + g (torch.Tensor): + Forget gates of shape `[B, H, T, M]` applied to keys. 
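+            Gates are expected in log space (e.g., the output of `F.logsigmoid`, as in the
+            examples below), and are exponentiated inside the kernels.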
+        scale (Optional[int]):
+            Scale factor for the attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[Tuple[torch.Tensor]]):
+            Initial state tuple having tensors of shape `[N, H, K, M]` and `[N, H, M, V]` for `N` input sequences.
+            For equal-length input sequences, `N` equals the batch size `B`.
+            Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state tuple, having tensors of shape `[N, H, K, M]` and `[N, H, M, V]`.
+            Default: `False`.
+        reverse (Optional[bool]):
+            If `True`, process the state passing in reverse order. Default: `False`.
+        offsets (Optional[torch.LongTensor]):
+            Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch.
+            For example,
+            if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively.
+            If provided, the inputs are concatenated and the batch size `B` is expected to be 1.
+            Default: `None`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format, which is not supported for variable-length inputs.
+            Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        final_state (Tuple[torch.Tensor]):
+            Final state tuple having tensors of shape `[N, H, K, M]` and `[N, H, M, V]` if `output_final_state=True`.
+            `None` otherwise.
+
+    Examples::
+        >>> import torch
+        >>> import torch.nn.functional as F
+        >>> from einops import rearrange
+        >>> from fla.ops.gsa import fused_recurrent_gsa
+        # inputs with equal lengths
+        >>> B, T, H, K, V, M = 4, 2048, 4, 512, 512, 64
+        >>> q = torch.randn(B, T, H, K, device='cuda')
+        >>> k = torch.randn(B, T, H, K, device='cuda')
+        >>> v = torch.randn(B, T, H, V, device='cuda')
+        >>> s = torch.randn(B, T, H, M, device='cuda')
+        >>> g = F.logsigmoid(torch.randn(B, T, H, M, device='cuda'))
+        >>> h0 = (torch.randn(B, H, K, M, device='cuda'), torch.randn(B, H, M, V, device='cuda'))
+        >>> o, (hk, hv) = fused_recurrent_gsa(q, k, v, s, g,
+                                              initial_state=h0,
+                                              output_final_state=True,
+                                              head_first=False)
+        # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required
+        >>> q, k, v, s, g = map(lambda x: rearrange(x, 'b t h d -> 1 (b t) h d'), (q, k, v, s, g))
+        # for a batch with 4 sequences, offsets with 5 start/end positions are expected
+        >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long)
+        >>> o_var, (hk_var, hv_var) = fused_recurrent_gsa(q, k, v, s, g,
+                                                          initial_state=h0,
+                                                          output_final_state=True,
+                                                          offsets=offsets,
+                                                          head_first=False)
+        >>> assert o.allclose(o_var.view(o.shape))
+        >>> assert hk.allclose(hk_var)
+        >>> assert hv.allclose(hv_var)
+    """
+    if offsets is not None:
+        if q.shape[0] != 1:
+            raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`. "
+ f"Please flatten variable-length inputs before processing.") + if head_first: + raise RuntimeError("Sequences with variable lengths are not supported for head-first mode") + if initial_state is not None and initial_state[0].shape[0] != len(offsets) - 1: + raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, " + f"i.e., {len(offsets) - 1} rather than {initial_state[0].shape[0]}.") + if scale is None: + scale = k.shape[-1] ** -0.5 + if initial_state is None: + initial_state = (None, None) + o, final_state = FusedRecurrentGSAFunction.apply( + q, + k, + v, + s, + g, + scale, + *initial_state, + output_final_state, + reverse, + offsets, + head_first + ) + return o, final_state diff --git a/fla/ops/gsa/naive.py b/fla/ops/gsa/naive.py new file mode 100644 index 0000000000000000000000000000000000000000..699b2a4d1e5b4b8415a98c27da142a6130797685 --- /dev/null +++ b/fla/ops/gsa/naive.py @@ -0,0 +1,68 @@ +# -*- coding: utf-8 -*- + +from typing import Optional + +import torch +from einops import repeat + + +def naive_recurrent_gsa( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: Optional[torch.Tensor] = None, + scale: Optional[int] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: Optional[bool] = False +) -> torch.Tensor: + dtype = q.dtype + + NG = q.shape[1]//k.shape[1] + # [batch_size, n_heads, seq_len, n_slots] + if g is None: + z = s.float().logcumsumexp(2) + g = torch.cat((z[:, :, :1], z[:, :, :-1]), 2) - z + s = torch.exp(s - z) + q, k, v, s, g = map(lambda x: x.float(), (q, k, v, s, g)) + k, v, s, g = map(lambda x: repeat(x, 'b h t d -> b (h g) t d', g=NG), (k, v, s, g)) + if initial_state is not None: + initial_state = tuple(map(lambda x: repeat(x, 'b h k v -> b (h g) k v', g=NG), initial_state)) + + B, H, T, K, V, M = *q.shape, v.shape[-1], s.shape[-1] + + hk = torch.zeros(B, H, K, M, dtype=torch.float, device=q.device) + ok = torch.zeros_like(s) + + if scale is None: + scale = q.shape[-1] ** -0.5 + + final_state = None + if initial_state is not None: + hk += initial_state[0] + + for i in range(T): + q_i = q[:, :, i] * scale + k_i = k[:, :, i] + v_i = s[:, :, i] + g_i = g[:, :, i].exp() + hk = hk * g_i[..., None, :] + k_i[..., None] * v_i[..., None, :] + ok[:, :, i] = (q_i[..., None] * hk).sum(-2) + + qv = ok.softmax(-1) + hv = torch.zeros(B, H, M, V, dtype=torch.float, device=q.device) + ov = torch.zeros_like(v) + if initial_state is not None: + hv += initial_state[1] + + for i in range(T): + q_i = qv[:, :, i] + k_i = s[:, :, i] + v_i = v[:, :, i] + g_i = g[:, :, i].exp() + hv = hv * g_i[..., :, None] + k_i[..., None] * v_i[..., None, :] + ov[:, :, i] = (q_i[..., None] * hv).sum(-2) + + if output_final_state: + final_state = (hk.view(B, -1, NG, K, M)[:, :, 0], hv.view(B, -1, NG, M, V)[:, :, 0]) + return ov.to(dtype), final_state diff --git a/fla/ops/hgrn/__init__.py b/fla/ops/hgrn/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..f2012c3c15f125271df225ce755ed3b2dbe01a83 --- /dev/null +++ b/fla/ops/hgrn/__init__.py @@ -0,0 +1,9 @@ +# -*- coding: utf-8 -*- + +from .chunk import chunk_hgrn +from .fused_recurrent import fused_recurrent_hgrn + +__all__ = [ + 'chunk_hgrn', + 'fused_recurrent_hgrn' +] diff --git a/fla/ops/hgrn/chunk.py b/fla/ops/hgrn/chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..5d71cc90da258ff6a1112b0097ae686ed35d2b95 --- /dev/null +++ b/fla/ops/hgrn/chunk.py @@ -0,0 +1,289 @@ +# -*- coding: utf-8 -*- +# 
Copyright (c) 2024, Songlin Yang, Yu Zhang + +# this function implements the chunkwise form of HGRN, inspired by +# [Volodymyr Kyrylov in his blog post](https://proger.github.io/posts/scan/chunk.html) +# also refer to the `accelerated-scan` lib: https://github.com/proger/accelerated-scan + +# from tests on H800, with B, D = 16, 128, we see that the chunk can be greatly faster than the recurrent: +# +# Performance: +# seq_len chunk recurrent chunk_bwd recurrent_bwd +# 0 128.0 0.039360 0.061056 0.312160 0.205008 +# 1 256.0 0.045824 0.123712 0.308784 0.297696 +# 2 512.0 0.058688 0.241952 0.310720 0.626528 +# 3 1024.0 0.088288 0.476992 0.313184 1.333152 +# 4 2048.0 0.169472 0.943264 0.452464 2.724864 +# 5 4096.0 0.329920 1.886144 0.881600 5.551520 +# 6 8192.0 0.647872 3.755040 1.740496 11.117184 +# 7 16384.0 1.272064 7.520576 3.446608 22.362528 + +from typing import Tuple + +import torch +import triton +import triton.language as tl + +from fla.utils import contiguous + + +@triton.autotune( + configs=[ + triton.Config({'BD': 32}, num_warps=1), + triton.Config({'BD': 32}, num_warps=2), + triton.Config({'BD': 32}, num_warps=4), + triton.Config({'BD': 32}, num_warps=8), + triton.Config({'BD': 64}, num_warps=1), + triton.Config({'BD': 64}, num_warps=2), + triton.Config({'BD': 64}, num_warps=4), + triton.Config({'BD': 64}, num_warps=8), + triton.Config({'BD': 128}, num_warps=1), + triton.Config({'BD': 128}, num_warps=2), + triton.Config({'BD': 128}, num_warps=4), + triton.Config({'BD': 128}, num_warps=8), + ], + key=['D'] +) +@triton.jit +def chunk_hgrn_fwd_kernel_h( + x, + g, + gc, + o, + h0, + T: tl.constexpr, + D: tl.constexpr, + BT: tl.constexpr, + BD: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr +): + i_d, i_t, i_b = tl.program_id(0), tl.program_id(1), tl.program_id(2) + o_d = i_d * BD + tl.arange(0, BD) + mask = o_d < D + + p_x = x + i_b * T * D + i_t * BT * D + o_d + p_g = g + i_b * T * D + i_t * BT * D + o_d + p_gc = gc + i_b * T * D + i_t * BT * D + o_d + p_o = o + i_b * T * D + i_t * BT * D + o_d + + b_h = tl.zeros([BD], dtype=tl.float32) + b_gc = tl.zeros([BD], dtype=tl.float32) + if USE_INITIAL_STATE: + if i_t == 0: + b_h += tl.load(h0 + i_b * D + o_d, mask=mask, other=0).to(tl.float32) + for i in range(0, BT): + mask_t = mask & ((i_t * BT + i) < T) + b_x = tl.load(p_x, mask=mask_t, other=0).to(tl.float32) + b_g = tl.load(p_g, mask=mask_t, other=0).to(tl.float32) + b_h = tl.exp(b_g) * b_h + b_x + b_gc = b_gc + b_g + tl.store(p_gc, b_gc.to(p_o.dtype.element_ty), mask=mask_t) + tl.store(p_o, b_h.to(p_o.dtype.element_ty), mask=mask_t) + + p_x += D + p_g += D + p_gc += D + p_o += D + + +@triton.jit +def chunk_hgrn_fwd_kernel_o( + gc, + o, + s_b, + s_t, + s_d, + T: tl.constexpr, + D: tl.constexpr, + BT: tl.constexpr, + BD: tl.constexpr +): + i_d, i_b = tl.program_id(0), tl.program_id(1) + o_d = i_d * BD + tl.arange(0, BD) + mask = o_d < D + + for i_t in range(1, tl.cdiv(T, BT)): + p_gc = tl.make_block_ptr(gc + i_b * s_b, (T, D), (s_t, s_d), (i_t * BT, i_d * BD), (BT, BD), (1, 0)) + p_o = tl.make_block_ptr(o + i_b * s_b, (T, D), (s_t, s_d), (i_t * BT, i_d * BD), (BT, BD), (1, 0)) + + # [BD,] + b_h0 = tl.load(o + i_b * T * D + i_t * BT * D - D + o_d, mask=mask, other=0).to(tl.float32) + # [BT, BD] + b_gc = tl.load(p_gc, boundary_check=(0, 1)).to(tl.float32) + b_o = tl.load(p_o, boundary_check=(0, 1)).to(tl.float32) + b_o = b_o + tl.exp(b_gc) * b_h0[None, :] + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({'BD': 32}, 
num_warps=1), + triton.Config({'BD': 32}, num_warps=2), + triton.Config({'BD': 32}, num_warps=4), + triton.Config({'BD': 32}, num_warps=8), + triton.Config({'BD': 64}, num_warps=1), + triton.Config({'BD': 64}, num_warps=2), + triton.Config({'BD': 64}, num_warps=4), + triton.Config({'BD': 64}, num_warps=8), + triton.Config({'BD': 128}, num_warps=1), + triton.Config({'BD': 128}, num_warps=2), + triton.Config({'BD': 128}, num_warps=4), + triton.Config({'BD': 128}, num_warps=8), + ], + key=['D'] +) +@triton.jit +def chunk_hgrn_bwd_kernel_h( + g, + gc, + dx, + do, + T: tl.constexpr, + D: tl.constexpr, + BT: tl.constexpr, + BD: tl.constexpr +): + i_d, i_t, i_b = tl.program_id(0), tl.program_id(1), tl.program_id(2) + o_d = i_d * BD + tl.arange(0, BD) + mask = o_d < D + BC = min(BT, T - i_t * BT) + NT = tl.num_programs(1) + + p_g = g + (i_b * T + i_t * BT + BC - 1) * D + o_d + p_gc = gc + (i_b * T + i_t * BT + BC - 1) * D + o_d + p_dx = dx + (i_b * T + i_t * BT + BC - 1) * D + o_d + p_do = do + (i_b * T + i_t * BT + BC - 1) * D + o_d + + if i_t == NT - 1: + b_gc = tl.zeros([BD], dtype=tl.float32) + else: + b_gc = tl.load(g + (i_b * T + i_t * BT + BT) * D + o_d, mask=mask, other=0).to(tl.float32) + b_dh = tl.zeros([BD], dtype=tl.float32) + for _ in range(BC - 1, -1, -1): + tl.store(p_gc, b_gc.to(p_gc.dtype.element_ty), mask=mask) + + b_g = tl.load(p_g, mask=mask, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask, other=0).to(tl.float32) + + b_gc = b_gc + b_g + b_dh = b_dh + b_do + b_dx = b_dh + b_dh = b_dh * tl.exp(b_g) + + tl.store(p_dx, b_dx.to(p_dx.dtype.element_ty), mask=mask) + + p_g -= D + p_gc -= D + p_dx -= D + p_do -= D + + +@triton.jit +def chunk_hgrn_bwd_kernel_o( + g, + gc, + o, + dx, + dg, + s_b, + s_t, + s_d, + T: tl.constexpr, + D: tl.constexpr, + BT: tl.constexpr, + BD: tl.constexpr +): + i_d, i_b = tl.program_id(0), tl.program_id(1) + o_d = i_d * BD + tl.arange(0, BD) + mask = o_d < D + + for i_t in range(tl.cdiv(T, BT) - 1, -1, -1): + p_g = tl.make_block_ptr(g + i_b * s_b, (T, D), (s_t, s_d), (i_t * BT, i_d * BD), (BT, BD), (1, 0)) + p_gc = tl.make_block_ptr(gc + i_b * s_b, (T, D), (s_t, s_d), (i_t * BT, i_d * BD), (BT, BD), (1, 0)) + p_o = tl.make_block_ptr(o + i_b * s_b, (T, D), (s_t, s_d), (i_t * BT - 1, i_d * BD), (BT, BD), (1, 0)) + p_dx = tl.make_block_ptr(dx + i_b * s_b, (T, D), (s_t, s_d), (i_t * BT, i_d * BD), (BT, BD), (1, 0)) + p_dg = tl.make_block_ptr(dg + i_b * s_b, (T, D), (s_t, s_d), (i_t * BT, i_d * BD), (BT, BD), (1, 0)) + + # [BD,] + mask_t = mask & ((i_t + 1) * BT < T) + b_ht = tl.load(dx + i_b * T * D + (i_t + 1) * BT * D + o_d, mask=mask_t, other=0).to(tl.float32) + # [BT, BD] + b_g = tl.load(p_g, boundary_check=(0, 1)).to(tl.float32) + b_gc = tl.load(p_gc, boundary_check=(0, 1)).to(tl.float32) + b_o = tl.load(p_o, boundary_check=(0, 1)).to(tl.float32) + b_dx = tl.load(p_dx, boundary_check=(0, 1)).to(tl.float32) + + b_dx = b_dx + tl.exp(b_gc) * b_ht[None, :] + b_dg = b_o * b_dx * tl.exp(b_g) + tl.store(p_dx, b_dx.to(p_dx.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dg, b_dg.to(p_dg.dtype.element_ty), boundary_check=(0, 1)) + + +class ChunkHGRNFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward(ctx, x, g, initial_state=None, output_final_state=False): + B, T, D = x.shape + BT, BD = 128, min(64, triton.next_power_of_2(D)) + num_warps = 8 if BD == 64 else 4 + + gc = torch.empty_like(g, dtype=torch.float) + o = torch.empty_like(x, dtype=torch.float) + def grid(meta): return (triton.cdiv(D, meta['BD']), triton.cdiv(T, 
meta['BT']), B) + chunk_hgrn_fwd_kernel_h[grid]( + x, g, gc, o, initial_state, + T=T, D=D, BT=BT, + USE_INITIAL_STATE=initial_state is not None + ) + def grid(meta): return (triton.cdiv(D, meta['BD']), B) + chunk_hgrn_fwd_kernel_o[grid]( + gc, o, + o.stride(-3), o.stride(-2), o.stride(-1), + T=T, D=D, BT=BT, BD=BD, + num_warps=num_warps + ) + final_state = None + if output_final_state: + final_state = o[:, -1].clone() + o = o.to(x.dtype) + ctx.save_for_backward(g, o, initial_state) + return o, final_state + + @staticmethod + @contiguous + def backward(ctx, do, dht=None): + g, o, initial_state = ctx.saved_tensors + B, T, D = do.shape + BT, BD = 128, min(64, triton.next_power_of_2(D)) + num_warps = 8 if BD == 64 else 4 + + gc = torch.empty_like(g, dtype=torch.float) + dx = torch.empty_like(o, dtype=torch.float) + def grid(meta): return (triton.cdiv(D, meta['BD']), triton.cdiv(T, meta['BT']), B) + chunk_hgrn_bwd_kernel_h[grid]( + g, gc, dx, do, + T=T, D=D, BT=BT + ) + + dg = torch.empty_like(g, dtype=torch.float) + def grid(meta): return (triton.cdiv(D, meta['BD']), B) + chunk_hgrn_bwd_kernel_o[grid]( + g, gc, o, dx, dg, + o.stride(-3), o.stride(-2), o.stride(-1), + T=T, D=D, BT=BT, BD=BD, + num_warps=num_warps + ) + if initial_state is not None: + dg[:, 0] = (initial_state * dx[:, 0] * g[:, 0].float().exp()).to(dg.dtype) + + return dx.to(o.dtype), dg, None, None + + +def chunk_hgrn( + x: torch.Tensor, + g: torch.Tensor, + initial_state: torch.Tensor = None, + output_final_state: bool = False +) -> Tuple[torch.Tensor, torch.Tensor]: + return ChunkHGRNFunction.apply(x, g, initial_state, output_final_state) diff --git a/fla/ops/hgrn/fused_recurrent.py b/fla/ops/hgrn/fused_recurrent.py new file mode 100644 index 0000000000000000000000000000000000000000..3a88980db42b59820a771ee742bfc13675599bbe --- /dev/null +++ b/fla/ops/hgrn/fused_recurrent.py @@ -0,0 +1,327 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.utils import contiguous + + +@triton.autotune( + configs=[ + triton.Config({'BD': 32}, num_warps=1), + triton.Config({'BD': 32}, num_warps=2), + triton.Config({'BD': 32}, num_warps=4), + triton.Config({'BD': 32}, num_warps=8), + triton.Config({'BD': 64}, num_warps=1), + triton.Config({'BD': 64}, num_warps=2), + triton.Config({'BD': 64}, num_warps=4), + triton.Config({'BD': 64}, num_warps=8), + triton.Config({'BD': 128}, num_warps=1), + triton.Config({'BD': 128}, num_warps=2), + triton.Config({'BD': 128}, num_warps=4), + triton.Config({'BD': 128}, num_warps=8), + ], + key=['D'] +) +@triton.heuristics({ + 'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'STORE_FINAL_STATE': lambda args: args['ht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def fused_recurrent_hgrn_fwd_kernel( + x, + g, + o, + h0, + ht, + offsets, + T: tl.constexpr, + D: tl.constexpr, + BD: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_d, i_n = tl.program_id(0), tl.program_id(1) + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int64), tl.load(offsets + i_n + 1).to(tl.int64) + T = eos - bos + else: + bos, eos = i_n * T, i_n * T + T + + o_d = i_d * BD + tl.arange(0, BD) + mask = o_d < D + + p_x = x + bos * D + o_d + p_g = g + bos * D + o_d + p_o = o + bos * D + o_d + + b_h = tl.zeros([BD], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h0 = h0 + i_n * D + o_d 
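+        # fold the per-sequence initial state h0[i_n] into the running state; the loop below
+        # then evaluates the elementwise recurrence h_t = exp(g_t) * h_{t-1} + x_t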
+ b_h += tl.load(p_h0, mask=mask, other=0).to(tl.float32) + for _ in range(0, T): + b_x = tl.load(p_x, mask=mask, other=0).to(tl.float32) + b_g = tl.load(p_g, mask=mask, other=0).to(tl.float32) + b_h = tl.exp(b_g) * b_h + b_x + tl.store(p_o, b_h.to(p_o.dtype.element_ty), mask=mask) + + p_x += D + p_g += D + p_o += D + + if STORE_FINAL_STATE: + p_ht = ht + i_n * D + o_d + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), mask=mask) + + +@triton.autotune( + configs=[ + triton.Config({'BD': 32}, num_warps=1), + triton.Config({'BD': 32}, num_warps=2), + triton.Config({'BD': 32}, num_warps=4), + triton.Config({'BD': 32}, num_warps=8), + triton.Config({'BD': 64}, num_warps=1), + triton.Config({'BD': 64}, num_warps=2), + triton.Config({'BD': 64}, num_warps=4), + triton.Config({'BD': 64}, num_warps=8), + triton.Config({'BD': 128}, num_warps=1), + triton.Config({'BD': 128}, num_warps=2), + triton.Config({'BD': 128}, num_warps=4), + triton.Config({'BD': 128}, num_warps=8), + ], + key=['D'] +) +@triton.heuristics({ + 'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'USE_FINAL_STATE_GRADIENT': lambda args: args['dht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def fused_recurrent_hgrn_bwd_kernel( + g, + o, + h0, + dx, + dg, + do, + dht, + dh0, + offsets, + T: tl.constexpr, + D: tl.constexpr, + BD: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + USE_FINAL_STATE_GRADIENT: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_d, i_n = tl.program_id(0), tl.program_id(1) + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int64), tl.load(offsets + i_n + 1).to(tl.int64) + T = eos - bos + else: + bos, eos = i_n * T, i_n * T + T + + o_d = i_d * BD + tl.arange(0, BD) + mask = o_d < D + + p_g = g + (bos + T - 1) * D + o_d + p_o = o + (bos + T - 2) * D + o_d + p_dx = dx + (bos + T - 1) * D + o_d + p_dg = dg + (bos + T - 1) * D + o_d + p_do = do + (bos + T - 1) * D + o_d + + b_dh = tl.zeros([BD], dtype=tl.float32) + if USE_FINAL_STATE_GRADIENT: + p_dht = dht + i_n * D + o_d + b_dh += tl.load(p_dht, mask=mask, other=0).to(tl.float32) + + for i in range(T - 1, -1, -1): + b_g = tl.load(p_g, mask=mask, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask, other=0).to(tl.float32) + if i > 0: + b_o = tl.load(p_o, mask=mask, other=0).to(tl.float32) + elif USE_INITIAL_STATE: + b_o = tl.load(h0 + i_n * D + o_d, mask=mask, other=0).to(tl.float32) + else: + b_o = tl.zeros([BD], dtype=tl.float32) + + b_dh = b_dh + b_do + b_dx = b_dh + b_dh = b_dh * tl.exp(b_g) + b_dg = b_dh * b_o + tl.store(p_dx, b_dx.to(p_dx.dtype.element_ty), mask=mask) + tl.store(p_dg, b_dg.to(p_dg.dtype.element_ty), mask=mask) + + p_g -= D + p_o -= D + p_dx -= D + p_dg -= D + p_do -= D + + if USE_INITIAL_STATE: + p_dh0 = dh0 + i_n * D + o_d + tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), mask=mask) + + +def fused_recurrent_hgrn_fwd( + x: torch.Tensor, + g: torch.Tensor, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + offsets: Optional[torch.LongTensor] = None, +) -> Tuple[torch.Tensor, torch.Tensor]: + B, T, D = x.shape + N = B if offsets is None else len(offsets) - 1 + + o = torch.empty_like(x) + final_state = x.new_empty(N, D) if output_final_state else None + + def grid(meta): return (triton.cdiv(D, meta['BD']), N) + fused_recurrent_hgrn_fwd_kernel[grid]( + x=x, + g=g, + o=o, + h0=initial_state, + ht=final_state, + offsets=offsets, + T=T, + D=D + ) + return o, final_state + + +def fused_recurrent_hgrn_bwd( + g: torch.Tensor, + o: torch.Tensor, + do: 
torch.Tensor, + dht: torch.Tensor = None, + initial_state: torch.Tensor = None, + offsets: Optional[torch.LongTensor] = None +) -> Tuple[torch.Tensor, torch.Tensor]: + B, T, D = do.shape + N = B if offsets is None else len(offsets) - 1 + + dx = torch.empty_like(o, dtype=torch.float) + dg = torch.empty_like(g, dtype=torch.float) + dh0 = torch.empty_like(initial_state, dtype=torch.float) if initial_state is not None else None + def grid(meta): return (triton.cdiv(D, meta['BD']), N) + fused_recurrent_hgrn_bwd_kernel[grid]( + g=g, + o=o, + h0=initial_state, + dx=dx, + dg=dg, + do=do, + dht=dht, + dh0=dh0, + offsets=offsets, + T=T, + D=D + ) + return dx, dg, dh0 + + +class FusedRecurrentHGRNFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + x: torch.Tensor, + g: torch.Tensor, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + offsets: Optional[torch.LongTensor] = None + ): + o, ht = fused_recurrent_hgrn_fwd( + x=x, + g=g, + initial_state=initial_state, + output_final_state=output_final_state, + offsets=offsets + ) + ctx.save_for_backward(g, o, initial_state) + ctx.offsets = offsets + return o, ht + + @staticmethod + @contiguous + def backward(ctx, do, dht=None): + g, o, initial_state = ctx.saved_tensors + offsets = ctx.offsets + + dx, dg, dh0 = fused_recurrent_hgrn_bwd( + g=g, + o=o, + do=do, + dht=dht, + initial_state=initial_state, + offsets=offsets + ) + return dx, dg, dh0, None, None + + +def fused_recurrent_hgrn( + x: torch.Tensor, + g: torch.Tensor, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + offsets: Optional[torch.LongTensor] = None +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + x (torch.Tensor): + inputs of shape `[B, T, D]. + g (torch.Tensor): + Forget gates of shape `[B, T, D]`. + initial_state (Optional[torch.Tensor]): + Initial state of shape `[N, D]` for `N` input sequences. + For equal-length input sequences, `N` equals the batch size `B`. + Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state of shape `[N, D]`. Default: `False`. + offsets (Optional[torch.LongTensor]): + Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch. + For example, + if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively. + If provided, the inputs are concatenated and the batch size `B` is expected to be 1. + Default: `None`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, T, D]`. + final_state (torch.Tensor): + Final state of shape `[N, D]` if `output_final_state=True` else `None`. 
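+
+    Note:
+        With `offsets`, step `t` of sequence `n` lives at position `offsets[n] + t` of the
+        flattened inputs; e.g. `offsets = [0, 3, 8]` describes two sequences of lengths 3 and 5.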
+
+    Examples::
+        >>> import torch
+        >>> import torch.nn.functional as F
+        >>> from einops import rearrange
+        >>> from fla.ops.hgrn import fused_recurrent_hgrn
+        # inputs with equal lengths
+        >>> B, T, D = 4, 2048, 512
+        >>> x = torch.randn(B, T, D, device='cuda')
+        >>> g = F.logsigmoid(torch.randn(B, T, D, device='cuda'))
+        >>> h0 = torch.randn(B, D, device='cuda')
+        >>> o, ht = fused_recurrent_hgrn(x, g, initial_state=h0, output_final_state=True)
+        # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required
+        >>> x, g = map(lambda x: rearrange(x, 'b t d -> 1 (b t) d'), (x, g))
+        # for a batch with 4 sequences, offsets with 5 start/end positions are expected
+        >>> offsets = x.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long)
+        >>> o_var, ht_var = fused_recurrent_hgrn(x, g, initial_state=h0, output_final_state=True, offsets=offsets)
+        >>> assert o.allclose(o_var.view(o.shape))
+        >>> assert ht.allclose(ht_var)
+    """
+    return FusedRecurrentHGRNFunction.apply(
+        x,
+        g,
+        initial_state,
+        output_final_state,
+        offsets
+    )
diff --git a/fla/ops/hgrn/naive.py b/fla/ops/hgrn/naive.py
new file mode 100644
index 0000000000000000000000000000000000000000..9bcddc1967b31c5181d330704c7b5ff2127e9d68
--- /dev/null
+++ b/fla/ops/hgrn/naive.py
@@ -0,0 +1,63 @@
+# -*- coding: utf-8 -*-
+
+from typing import Optional
+
+import torch
+
+
+def naive_recurrent_hgrn(
+    x: torch.Tensor,
+    g: torch.Tensor,
+    initial_state: Optional[torch.Tensor] = None,
+    output_final_state: Optional[bool] = False
+) -> torch.Tensor:
+    dtype = x.dtype
+    x, g = map(lambda i: i.float(), (x, g))
+    B, T, D = x.shape
+
+    h = torch.zeros(B, D, dtype=torch.float, device=x.device)
+    o = torch.zeros_like(x)
+
+    final_state = None
+    if initial_state is not None:
+        h += initial_state
+
+    for i in range(T):
+        h = g[:, i].exp() * h + x[:, i]
+        o[:, i] = h
+
+    if output_final_state:
+        final_state = h
+    return o.to(dtype), final_state
+
+
+def naive_chunk_hgrn(
+    x: torch.Tensor,
+    g: torch.Tensor,
+    initial_state: Optional[torch.Tensor] = None,
+    output_final_state: Optional[bool] = False,
+    chunk_size: int = 64
+) -> torch.Tensor:
+    dtype = x.dtype
+    x, g = map(lambda i: i.float(), (x, g))
+    B, T, D = x.shape
+
+    # chunk-local cumulative sum of the log gates
+    gc = g.view(B, -1, chunk_size, D).cumsum(-2).view_as(g)
+    h = torch.zeros(B, D, dtype=torch.float, device=x.device)
+    o = torch.zeros_like(x)
+
+    final_state = None
+    if initial_state is not None:
+        h += initial_state
+
+    for i in range(0, T, chunk_size):
+        hp = h
+        h = torch.zeros(B, D, dtype=torch.float, device=x.device)
+        for j in range(i, i + chunk_size):
+            # intra-chunk recurrence plus the carried-in state rescaled by the cumulative gates
+            h = g[:, j].exp() * h + x[:, j]
+            o[:, j] = hp * gc[:, j].exp() + h
+        # the full state at the chunk boundary is the last output of the chunk
+        h = o[:, i + chunk_size - 1].clone()
+
+    if output_final_state:
+        final_state = h
+    return o.to(dtype), final_state
diff --git a/fla/ops/linear_attn/__init__.py b/fla/ops/linear_attn/__init__.py
new file mode 100644
index 0000000000000000000000000000000000000000..1a981054aaf9ab98b30ac08fa525bde73e68e7e4
--- /dev/null
+++ b/fla/ops/linear_attn/__init__.py
@@ -0,0 +1,11 @@
+# -*- coding: utf-8 -*-
+
+from .chunk import chunk_linear_attn
+from .fused_chunk import fused_chunk_linear_attn
+from .fused_recurrent import fused_recurrent_linear_attn
+
+__all__ = [
+    'chunk_linear_attn',
+    'fused_chunk_linear_attn',
+    'fused_recurrent_linear_attn'
+]
diff --git a/fla/ops/linear_attn/chunk.py b/fla/ops/linear_attn/chunk.py
new file mode 100644
index 0000000000000000000000000000000000000000..3f00b056f9908eb6856fc8b455176336f30b05f8
--- /dev/null
+++ b/fla/ops/linear_attn/chunk.py
@@ -0,0 +1,374 @@
+# 
-*- coding: utf-8 -*- +# Copyright (c) 2023, Yu Zhang, Songlin Yang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.ops.linear_attn.utils import normalize_output +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.jit +def chunk_linear_attn_fwd_kernel_h( + k, + v, + h, + h0, + ht, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NT: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr +): + i_k, i_v, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + # [BK, BV] + b_h = tl.zeros([BK, BV], dtype=tl.float32) + + if USE_INITIAL_STATE: + p_h0 = tl.make_block_ptr(h0 + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_h = tl.load(p_h0, boundary_check=(0, 1)).to(tl.float32) + + for i_t in range(NT): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h + i_t * K * V, (K, V), (s_h_t, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + tl.store(p_h, b_h.to(p_h.dtype.element_ty), boundary_check=(0, 1)) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BK, BV] + b_h += tl.dot(b_k, b_v, allow_tf32=False) + + if STORE_FINAL_STATE: + p_ht = tl.make_block_ptr(ht + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_linear_attn_fwd_kernel_o( + q, + k, + v, + h, + o, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr +): + i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + o_i = tl.arange(0, BT) + m_s = o_i[:, None] >= o_i[None, :] + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + b_s = tl.zeros([BT, BT], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h + i_t * K * V, (K, V), (s_h_t, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BK, BV] + b_h = tl.load(p_h, boundary_check=(0, 1)) + b_o += tl.dot(b_q, b_h, allow_tf32=False) + b_s += tl.dot(b_q, b_k, allow_tf32=False) + b_s = tl.where(m_s, b_s, 0) + + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_o = (b_o + tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False)) * scale + + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def chunk_linear_attn_bwd_kernel_dh( + q, + do, + dh, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + scale, + T: tl.constexpr, + K: tl.constexpr, + 
V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NT: tl.constexpr +): + i_k, i_v, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + # [BK, BV] + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + for i_t in range(NT - 1, -1, -1): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dh = tl.make_block_ptr(dh + i_bh * s_h_h + i_t * K * V, (K, V), (s_h_t, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + tl.store(p_dh, b_dh.to(p_dh.dtype.element_ty), boundary_check=(0, 1)) + # [BK, BT] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BT, V] + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BK, BV] + b_dh += tl.dot(b_q, b_do.to(b_q.dtype), allow_tf32=False) + + +@triton.jit +def chunk_linear_attn_bwd_kernel_dqkv( + q, + k, + v, + h, + do, + dh, + dq, + dk, + dv, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NT: tl.constexpr +): + i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + n_bh = tl.num_programs(2) + o_i = tl.arange(0, BT) + + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_s = tl.dot(b_k, b_q, allow_tf32=False) * scale + b_s = tl.where(o_i[:, None] <= o_i[None, :], b_s, 0) + + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + b_ds = tl.zeros([BT, BT], dtype=tl.float32) + for i_v in range(tl.cdiv(V, BV)): + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h, (V, NT * K), (1, s_h_t), (i_v * BV, i_t * K + i_k * BK), (BV, BK), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dh = tl.make_block_ptr(dh + i_bh * s_h_h, (NT * K, V), (s_h_t, 1), (i_t * K + i_k * BK, i_v * BV), (BK, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_k*n_bh+i_bh)*s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BV, BK] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # [BK, BV] + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + # [BT, BT] + b_ds += tl.dot(b_do, tl.trans(b_v), allow_tf32=False) + # [BT, BK] + b_dq += tl.dot(b_do, b_h, allow_tf32=False) * scale + b_dk += tl.dot(b_v, tl.trans(b_dh), allow_tf32=False) + # [BT, BV] + b_dv = tl.dot(b_k, b_dh, allow_tf32=False) + tl.dot(b_s.to(b_q.dtype), b_do, allow_tf32=False) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + # [BT, BT] + b_ds = tl.where(o_i[:, None] >= o_i[None, :], b_ds * scale, 0).to(b_q.dtype) + # [BT, BK] + b_dq += tl.dot(b_ds, b_k, allow_tf32=False) + b_dk += tl.trans(tl.dot(b_q, b_ds, allow_tf32=False)) + + p_dq = tl.make_block_ptr(dq + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + 
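+    # write the per-chunk query/key gradients accumulated above back to global memory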
tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + +class ChunkLinearAttentionFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward(ctx, q, k, v, scale, initial_state, output_final_state): + B, H, T, K, V = *q.shape, v.shape[-1] + BT = 64 + BK, BV = min(64, triton.next_power_of_2(K)), min(64, triton.next_power_of_2(V)) + NT, NK, NV = triton.cdiv(T, BT), triton.cdiv(K, BK), triton.cdiv(V, BV) + num_stages = 1 + num_warps = 4 if BK == 64 else 2 + ctx.scale = scale + + final_state = None + if output_final_state: + final_state = q.new_empty(B, H, K, V, dtype=torch.float32, requires_grad=False) + + h = q.new_empty(B, H, NT * K, V) + grid = (NK, NV, B * H) + chunk_linear_attn_fwd_kernel_h[grid]( + k, v, h, initial_state, final_state, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + h.stride(1), h.stride(2), + T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, NT=NT, + USE_INITIAL_STATE=initial_state is not None, + STORE_FINAL_STATE=output_final_state, + num_warps=num_warps, + num_stages=num_stages + ) + grid = (NV, NT, B * H) + o = torch.empty_like(v) + chunk_linear_attn_fwd_kernel_o[grid]( + q, k, v, h, o, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + h.stride(1), h.stride(2), + scale, + T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, + num_warps=num_warps, + num_stages=num_stages + ) + ctx.save_for_backward(q, k, v, h) + return o.to(q.dtype), final_state + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dht=None): + q, k, v, h = ctx.saved_tensors + + B, H, T, K, V = *q.shape, v.shape[-1] + BT = 64 + BK, BV = min(64, triton.next_power_of_2(K)), min(32 if q.dtype == torch.float32 else 64, triton.next_power_of_2(V)) + NT, NK, NV = triton.cdiv(T, BT), triton.cdiv(K, BK), triton.cdiv(V, BV) + num_stages = 1 + num_warps = 4 if BK == 64 else 2 + scale = ctx.scale + + dh = q.new_empty(B, H, NT * K, V) + grid = (NK, NV, B * H) + chunk_linear_attn_bwd_kernel_dh[grid]( + q, do, dh, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + dh.stride(1), dh.stride(2), + scale, + T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, NT=NT, + num_warps=num_warps, + num_stages=num_stages + ) + + grid = (NK, NT, B * H) + dq = torch.empty_like(q) + dk = torch.empty_like(k) + dv = v.new_empty(NK, *v.shape) + num_stages = 1 + num_warps = 4 if BK == 64 else 2 + chunk_linear_attn_bwd_kernel_dqkv[grid]( + q, k, v, h, do, dh, dq, dk, dv, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + dh.stride(1), dh.stride(2), + scale, + T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, NT=NT, + num_warps=num_warps, + num_stages=num_stages + ) + dv = dv.sum(0) + return dq.to(q.dtype), dk.to(k.dtype), dv.to(v.dtype), None, None, None + + +def chunk_linear_attn( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + normalize: bool = True, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]` + scale (Optional[int]): + Scale factor for the linear attention scores. 
+ If not provided, it will default to `1 / sqrt(K)`. Default: `None`. + initial_state (Optional[torch.Tensor]): + Initial state of shape `[B, H, K, V]`. Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state of shape `[B, H, K, V]`. Default: `False`. + normalize (bool): + Whether to normalize the output. Default: `True`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format. Default: `True`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]` + final_state (torch.Tensor): + Final state of shape `[B, H, K, V]` if `output_final_state=True` else `None` + """ + if scale is None: + scale = q.shape[-1] ** -0.5 + if not head_first: + q, k, v = map(lambda x: x.transpose(1, 2), (q, k, v)) + o, final_state = ChunkLinearAttentionFunction.apply(q, k, v, scale, initial_state, output_final_state) + if normalize: + o = normalize_output(q * scale, k, o) + if not head_first: + o = o.transpose(1, 2) + return o, final_state diff --git a/fla/ops/linear_attn/fused_chunk.py b/fla/ops/linear_attn/fused_chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..c13ce2c19854b8b5a8e0204ed05ec42702b24d24 --- /dev/null +++ b/fla/ops/linear_attn/fused_chunk.py @@ -0,0 +1,336 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl +from packaging import version + +from fla.ops.linear_attn.utils import normalize_output +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.jit +def fused_chunk_linear_attn_fwd_kernel( + q, # query [B, H, T, K] + k, # key [B, H, T, V] + v, # value [B, H, T, V] + o, # output [B, H, T, V] + h0, + ht, + s_k_h, # stride size: T * K + s_k_t, # stride size: K + s_k_d, # stride size: 1 + s_v_h, # stride size: T * V + s_v_t, # stride size: V + s_v_d, # stride size: 1 + scale, + B, # batch size + H, # H + T, # T + K: tl.constexpr, # K + V: tl.constexpr, # V + BT: tl.constexpr, # BLOCK SIZE along the sequence dimension, a.k.a. 
chunk size + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr, + CHECK: tl.constexpr +): + # indices + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + o_i = tl.arange(0, BT) + + # [BT, BT] + m_s = o_i[:, None] >= o_i[None, :] + # [BK, BV] + b_h = tl.zeros([BK, BV], dtype=tl.float32) + + # make block pointers + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (0, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, 0), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + (i_bh+i_k*B*H) * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + + if USE_INITIAL_STATE: + p_h0 = tl.make_block_ptr(h0 + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_h = tl.load(p_h0, boundary_check=(0, 1)).to(tl.float32) + + for i in range(0, tl.cdiv(T, BT)): + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + + # [BT, BT] + b_s = tl.dot(b_q, b_k, allow_tf32=False) + b_s = tl.where(m_s, b_s, 0) + # [BT, BV] + b_o = tl.dot(b_s.to(b_q.dtype), b_v, allow_tf32=False) + if CHECK and i == 0: + b_o += tl.dot(b_q, b_h.to(b_q.dtype), allow_tf32=False) + b_h = b_h + tl.dot(b_k, b_v, allow_tf32=False) + else: + b_o += tl.dot(b_q, b_h.to(b_q.dtype), allow_tf32=False) + b_h = b_h + tl.dot(b_k, b_v, allow_tf32=False) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + p_q = tl.advance(p_q, (BT, 0)) + p_k = tl.advance(p_k, (0, BT)) + p_v = tl.advance(p_v, (BT, 0)) + p_o = tl.advance(p_o, (BT, 0)) + + if STORE_FINAL_STATE: + p_ht = tl.make_block_ptr(ht + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def fused_chunk_linear_attn_bwd_kernel( + q, # query [B, H, T, K] + k, # key [B, H, T, V] + v, # value [B, H, T, V] + do, # gradient of output [B, H, T, V] + dq, # gradient of query [NV, B, H, T, K] + dk, # gradient of key [NV, B, H, T, K] + dv, # gradient of value [NK, B, H, T, V] + + h0, # initial state of the chunk [B, H, K, V] + + s_k_h, # stride size: T * K + s_k_t, # stride size: K + s_k_d, # stride size: 1 + s_v_h, # stride size: T * V + s_v_t, # stride size: V + s_v_d, # stride size: 1 + scale, # K ** -0.5 + B, # B + H, # H + T, # T + K: tl.constexpr, # K + V: tl.constexpr, # V + BT: tl.constexpr, # BLOCK SIZE along the sequence dimension, a.k.a. 
chunk size + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension + USE_INITIAL_STATE: tl.constexpr, + CHECK: tl.constexpr +): + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + o_i = tl.arange(0, BT) + + m_s = o_i[:, None] >= o_i[None, :] + # [BV, BK] + b_h = tl.zeros([BV, BK], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h = tl.make_block_ptr(h0 + i_bh * K * V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + b_h = tl.load(p_h, boundary_check=(0, 1)).to(tl.float32) + + for i in range(0, tl.cdiv(T, BT)): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i * BT, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i * BT), (BV, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i * BT, i_v * BV), (BT, BV), (1, 0)) + p_dq = tl.make_block_ptr(dq + (i_bh + i_v*B*H) * s_k_h, (T, K), (s_k_t, s_k_d), (i*BT, i_k*BK), (BT, BK), (1, 0)) + + # [BT, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [V, BT] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, V] + b_do = tl.load(p_do, boundary_check=(0, 1)) + + # [BT, BT] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) + b_ds = tl.where(m_s, b_ds, 0) + # [BT, BK] + b_dq = tl.dot(b_ds.to(b_k.dtype), b_k, allow_tf32=False) + # [BV, BK] + if CHECK and i == 0: + b_dq += tl.dot(b_do, b_h.to(b_do.dtype), allow_tf32=False) + b_h = b_h + tl.dot(b_v, b_k, allow_tf32=False) + else: + b_dq += tl.dot(b_do, b_h.to(b_do.dtype), allow_tf32=False) + b_h = b_h + tl.dot(b_v, b_k, allow_tf32=False) + b_dq *= scale + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + + # sync threads + b_h = None + tl.debug_barrier() + # [BK, BV] + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + m_s = o_i[:, None] <= o_i[None, :] + for i in range(1, tl.cdiv(T, BT) + 1): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, T - i * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (T - i * BT, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (T - i * BT, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (T - i * BT, i_v * BV), (BT, BV), (1, 0)) + p_dk = tl.make_block_ptr(dk + (i_bh+i_v*B*H) * s_k_h, (T, K), (s_k_t, s_k_d), (T - i*BT, i_k*BK), (BT, BK), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_bh+i_k*B*H) * s_v_h, (T, V), (s_v_t, s_v_d), (T - i*BT, i_v*BV), (BT, BV), (1, 0)) + # [BK, BT] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BT, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + + # [BT, BT] + b_s = tl.dot(b_k, b_q, allow_tf32=False) + b_s = tl.where(m_s, b_s, 0).to(b_q.dtype) + # [BT, BT] + b_ds = tl.dot(b_v, tl.trans(b_do), allow_tf32=False) + b_ds = tl.where(m_s, b_ds, 0).to(b_q.dtype) + # [BT, BK] + b_dk = tl.dot(b_ds, tl.trans(b_q), allow_tf32=False) + # [BT, BV] + b_dv = tl.dot(b_s, b_do, allow_tf32=False) + if CHECK and i == 1: + b_dk += tl.dot(b_v, tl.trans(b_dh).to(b_v.dtype), allow_tf32=False) + b_dv += tl.dot(b_k, b_dh.to(b_k.dtype), allow_tf32=False) + b_dh += tl.dot(b_q, b_do, allow_tf32=False) + else: + b_dk += tl.dot(b_v, tl.trans(b_dh).to(b_v.dtype), allow_tf32=False) + b_dv += tl.dot(b_k, b_dh.to(b_k.dtype), allow_tf32=False) + b_dh += tl.dot(b_q, b_do, 
allow_tf32=False)
+
+    tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1))
+    tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1))
+
+
+class FusedChunkLinearAttentionFunction(torch.autograd.Function):
+
+    @staticmethod
+    @contiguous
+    @autocast_custom_fwd
+    def forward(ctx, q, k, v, scale, initial_state, output_final_state):
+        B, H, T, K, V = *k.shape, v.shape[-1]
+        BT = 64
+        BK, BV = min(triton.next_power_of_2(K), 64), min(triton.next_power_of_2(V), 64)
+        NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV)
+        num_warps = 4
+        num_stages = 1
+
+        o = q.new_empty(NK, B, H, T, V)
+        final_state = q.new_empty(B, H, K, V, dtype=torch.float) if output_final_state else None
+        # the bug still exists even with Triton 2.2 on H100 GPUs,
+        # so we always enable the initial checks
+        CHECK = True
+        if version.parse(triton.__version__) < version.parse('2.2.0'):
+            import warnings
+            warnings.warn(
+                "Triton<2.2.0 detected for running this kernel, "
+                "which is known to have some weird compiler issues (refer to https://github.com/openai/triton/issues/2852) "
+                "that lead to significant precision loss. "
+                "We've added some initial condition checks to resolve this, at the cost of some speed. "
+                "For optimal performance, it is recommended to install Triton>=2.2.0 (if possible)."
+            )
+            CHECK = True
+
+        grid = (NV, NK, B * H)
+        fused_chunk_linear_attn_fwd_kernel[grid](
+            q, k, v, o, initial_state, final_state,
+            q.stride(1), q.stride(2), q.stride(3),
+            v.stride(1), v.stride(2), v.stride(3),
+            scale,
+            B=B, H=H, T=T, K=K, V=V, BT=BT, BK=BK, BV=BV,
+            USE_INITIAL_STATE=initial_state is not None,
+            STORE_FINAL_STATE=output_final_state,
+            CHECK=CHECK,
+            num_warps=num_warps,
+            num_stages=num_stages
+        )
+        o = o.sum(0) if NK > 1 else o[0]
+
+        ctx.save_for_backward(q, k, v, initial_state)
+        ctx.scale = scale
+        ctx.CHECK = CHECK
+        return o.to(q.dtype), final_state
+
+    @staticmethod
+    @contiguous
+    @autocast_custom_bwd
+    def backward(ctx, do, dht=None):
+        q, k, v, initial_state = ctx.saved_tensors
+        B, H, T, K, V = *k.shape, v.shape[-1]
+        scale = ctx.scale
+
+        BT = 64
+        BK, BV = min(triton.next_power_of_2(K), 64), min(triton.next_power_of_2(V), 64)
+        NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV)
+        num_warps = 4
+        num_stages = 1
+
+        dq = q.new_empty(NV, B, H, T, K)
+        dk = q.new_empty(NV, B, H, T, K)
+        dv = q.new_empty(NK, B, H, T, V)
+        grid = (NV, NK, B * H)
+
+        fused_chunk_linear_attn_bwd_kernel[grid](
+            q, k, v, do, dq, dk, dv, initial_state,
+            q.stride(1), q.stride(2), q.stride(3),
+            v.stride(1), v.stride(2), v.stride(3),
+            scale,
+            B=B, H=H, T=T, K=K, V=V, BT=BT, BK=BK, BV=BV,
+            USE_INITIAL_STATE=initial_state is not None,
+            CHECK=ctx.CHECK,
+            num_warps=num_warps,
+            num_stages=num_stages
+        )
+        dq = dq.sum(0)
+        dk = dk.sum(0)
+        dv = dv.sum(0)
+        return dq.to(q.dtype), dk.to(k.dtype), dv.to(v.dtype), None, None, None
+
+
+def fused_chunk_linear_attn(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    scale: Optional[float] = None,
+    initial_state: Optional[torch.Tensor] = None,
+    output_final_state: bool = False,
+    normalize: bool = True,
+    head_first: bool = True
+) -> Tuple[torch.Tensor, torch.Tensor]:
+    r"""
+    Args:
+        q (torch.Tensor):
+            queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`
+        k (torch.Tensor):
+            keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`
+        v (torch.Tensor):
+            values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`
+        scale (Optional[float]):
+            Scale factor for linear attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[torch.Tensor]):
+            Initial state of shape `[B, H, K, V]`. Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state of shape `[B, H, K, V]`. Default: `False`.
+        normalize (bool):
+            Whether to normalize the output. Default: `True`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format. Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`
+        final_state (torch.Tensor):
+            Final state of shape `[B, H, K, V]` if `output_final_state=True` else `None`
+    """
+    if scale is None:
+        scale = q.shape[-1] ** -0.5
+    if not head_first:
+        q, k, v = map(lambda x: x.transpose(1, 2), (q, k, v))
+    o, final_state = FusedChunkLinearAttentionFunction.apply(q, k, v, scale, initial_state, output_final_state)
+    if normalize:
+        o = normalize_output(q * scale, k, o)
+    if not head_first:
+        o = o.transpose(1, 2)
+    return o, final_state
diff --git a/fla/ops/linear_attn/fused_recurrent.py b/fla/ops/linear_attn/fused_recurrent.py
new file mode 100644
index 0000000000000000000000000000000000000000..c8be0d62328658bf9527032849afc332d43e1182
--- /dev/null
+++ b/fla/ops/linear_attn/fused_recurrent.py
@@ -0,0 +1,251 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from typing import Optional, Tuple
+
+import torch
+import triton
+import triton.language as tl
+
+from fla.ops.linear_attn.utils import normalize_output
+from fla.utils import contiguous
+
+
+@triton.jit
+def fused_recurrent_linear_attn_fwd_kernel(
+    q,  # query [B, H, L, K]
+    k,  # key [B, H, L, K]
+    v,  # value [B, H, L, V]
+    o,  # output [B, H, L, V]
+    h0,  # initial hidden state [B, H, K, V]
+    ht,  # final hidden state [B, H, K, V]
+
+    s_k_h,  # stride size: L * K
+    s_v_h,  # stride size: L * V
+
+    scale,
+    B,  # batch size
+    H,  # number of heads
+    T,  # sequence length
+    K: tl.constexpr,  # K
+    V: tl.constexpr,  # V
+    BK: tl.constexpr,  # BLOCK SIZE along the K dimension
+    BV: tl.constexpr,  # BLOCK SIZE along the V dimension
+    USE_INITIAL_STATE: tl.constexpr,  # whether to use initial state
+    STORE_FINAL_STATE: tl.constexpr,  # whether to store final state
+):
+    # indices
+    i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+
+    p_q = q + i_bh * s_k_h + i_k * BK + tl.arange(0, BK)
+    p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK)
+    p_v = v + i_bh * s_v_h + i_v * BV + tl.arange(0, BV)
+    p_o = o + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV)
+
+    mask_bk = (i_k * BK + tl.arange(0, BK)) < K
+    mask_bv = (i_v * BV + tl.arange(0, BV)) < V
+    mask_kv = mask_bk[None, :] & mask_bv[:, None]
+
+    b_h = tl.zeros([BV, BK], dtype=tl.float32)
+
+    if USE_INITIAL_STATE:
+        p_h0 = h0 + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None])
+        b_h += tl.load(p_h0, mask=mask_kv, other=0).to(tl.float32)
+
+    for _ in range(0, T):
+        b_k = tl.load(p_k, mask=mask_bk, other=0).to(tl.float32)
+        b_v = tl.load(p_v, mask=mask_bv, other=0).to(tl.float32)
+        b_q = tl.load(p_q, mask=mask_bk, other=0).to(tl.float32) * scale
+
+        b_h += b_k[None, :] * b_v[:, None]
+        b_o = b_h * b_q[None, :]
+        b_o = tl.sum(b_o, axis=1)
+        tl.store(p_o, b_o.to(p_o.dtype.element_ty), mask=mask_bv)
+
+        p_q += K
+        p_k += K
+        p_o += V
+        p_v += V
+
+    if STORE_FINAL_STATE:
+        p_ht = ht + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None])
+        tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), mask=mask_kv)
+
+
+# Similar to Algorithm 1 of
https://arxiv.org/abs/2006.16236
+@triton.jit
+def fused_recurrent_linear_attn_bwd_kernel(
+    q,  # query [B, H, L, K]
+    k,  # key [B, H, L, K]
+    v,  # value [B, H, L, V]
+
+    do,  # gradient of output [B, H, L, V]
+    dq,  # gradient of query [NV, B, H, L, K]
+    dk,  # gradient of key [NV, B, H, L, K]
+    dv,  # gradient of value [NK, B, H, L, V]
+    h0,  # initial hidden state [B, H, K, V]
+
+    s_k_h,  # stride size: L * K
+    s_v_h,  # stride size: L * V
+    scale,  # K ** -0.5
+
+    B,  # batch size
+    H,  # number of heads
+    T,  # sequence length
+    K: tl.constexpr,  # K
+    V: tl.constexpr,  # V
+    BK: tl.constexpr,  # BLOCK SIZE along the K dimension
+    BV: tl.constexpr,  # BLOCK SIZE along the V dimension
+    USE_INITIAL_STATE: tl.constexpr,  # whether to use initial state
+):
+    i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+
+    p_q = q + i_bh * s_k_h + i_k * BK + tl.arange(0, BK)
+    p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK)
+    p_v = v + i_bh * s_v_h + i_v * BV + tl.arange(0, BV)
+    p_do = do + i_bh * s_v_h + i_v * BV + tl.arange(0, BV)
+
+    p_dq = dq + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK)
+    mask_bk = i_k * BK + tl.arange(0, BK) < K
+    mask_bv = i_v * BV + tl.arange(0, BV) < V
+
+    b_h = tl.zeros([BK, BV], dtype=tl.float32)
+
+    if USE_INITIAL_STATE:
+        mask_kv = mask_bk[:, None] & mask_bv[None, :]
+        p_h0 = h0 + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :])
+        b_h += tl.load(p_h0, mask=mask_kv, other=0).to(tl.float32)
+
+    for _ in range(0, T):
+        b_k = tl.load(p_k, mask=mask_bk, other=0).to(tl.float32)
+        b_v = tl.load(p_v, mask=mask_bv, other=0).to(tl.float32)
+        b_do = tl.load(p_do, mask=mask_bv, other=0).to(tl.float32)
+
+        b_h += b_k[:, None] * b_v[None, :]
+        _d_q = b_h * b_do[None, :]
+        d_q = tl.sum(_d_q, axis=1) * scale
+        tl.store(p_dq, d_q.to(p_dq.dtype.element_ty), mask=mask_bk)
+
+        p_k += K
+        p_do += V
+        p_v += V
+        p_dq += K
+
+    # sync threads
+    tl.debug_barrier()
+
+    p_q = q + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (T - 1) * K
+    p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + (T - 1) * K
+    p_do = do + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + (T - 1) * V
+    p_v = v + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + (T - 1) * V
+    p_dk = dk + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + (T - 1) * K
+    p_dv = dv + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV) + (T - 1) * V
+    d_h = tl.zeros([BK, BV], dtype=tl.float32)
+
+    for _ in range(T):
+        b_do = tl.load(p_do, mask=mask_bv, other=0).to(tl.float32)
+        b_q = tl.load(p_q, mask=mask_bk, other=0).to(tl.float32) * scale
+        b_k = tl.load(p_k, mask=mask_bk, other=0).to(tl.float32)
+        b_v = tl.load(p_v, mask=mask_bv, other=0).to(tl.float32)
+        d_h += b_q[:, None] * b_do[None, :]
+        d_k = tl.sum(d_h * b_v[None, :], axis=1)
+        d_v = tl.sum(d_h * b_k[:, None], axis=0)
+
+        tl.store(p_dk, d_k.to(p_dk.dtype.element_ty), mask=mask_bk)
+        tl.store(p_dv, d_v.to(p_dv.dtype.element_ty), mask=mask_bv)
+
+        p_do -= V
+        p_q -= K
+        p_k -= K
+        p_v -= V
+        p_dk -= K
+        p_dv -= V
+
+
+class FusedRecurrentLinearAttentionFunction(torch.autograd.Function):
+
+    @staticmethod
+    @contiguous
+    def forward(ctx, q, k, v, scale, initial_state=None, output_final_state=False):
+        B, H, T, K = q.shape
+        V = v.shape[-1]
+
+        BK, BV = min(K, 32), min(V, 32)
+        NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV)
+        num_warps = 1
+        num_stages = 1
+
+        o = q.new_empty(NK, B, H, T, V)
+        final_state = q.new_empty(B, H, K, V) if output_final_state else None
+
+        grid = (NV, NK, B * H)
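+        # grid maps one program to each (value block, key block, batch*head) triple;
+        # every key block writes a partial output into its own slice of `o`,
+        # which is reduced with `o.sum(0)` after the launch
+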
fused_recurrent_linear_attn_fwd_kernel[grid]( + q, k, v, o, initial_state, final_state, + q.stride(1), + v.stride(1), scale, + B=B, H=H, T=T, K=K, V=V, BK=BK, BV=BV, + USE_INITIAL_STATE=initial_state is not None, + STORE_FINAL_STATE=final_state is not None, + num_warps=num_warps, + num_stages=num_stages + ) + + o = o.sum(0) + ctx.save_for_backward(q, k, v, initial_state) + ctx.scale = scale + return o, final_state + + @staticmethod + @contiguous + def backward(ctx, do, dht=None): + q, k, v, initial_state = ctx.saved_tensors + B, H, T, K = q.shape + V = v.shape[-1] + scale = ctx.scale + + BK, BV = min(K, 32), min(V, 32) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + num_warps = 1 + num_stages = 1 + + dq = q.new_empty(NV, B, H, T, K) + dk = q.new_empty(NV, B, H, T, K) + dv = q.new_empty(NK, B, H, T, V) + grid = (NV, NK, B * H) + + fused_recurrent_linear_attn_bwd_kernel[grid]( + q, k, v, do, dq, dk, dv, initial_state, + q.stride(1), + v.stride(1), + scale, + B=B, H=H, T=T, K=K, V=V, BK=BK, BV=BV, + USE_INITIAL_STATE=initial_state is not None, + num_warps=num_warps, + num_stages=num_stages + ) + dq = dq.sum(0) + dk = dk.sum(0) + dv = dv.sum(0) + return dq, dk, dv, None, None, None + + +def fused_recurrent_linear_attn( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + normalize: bool = False, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + if scale is None: + scale = q.shape[-1] ** -0.5 + if not head_first: + q, k, v = map(lambda x: x.transpose(1, 2), (q, k, v)) + o, final_state = FusedRecurrentLinearAttentionFunction.apply(q, k, v, scale, initial_state, output_final_state) + if normalize: + o = normalize_output(q * scale, k, o) + if not head_first: + o = o.transpose(1, 2) + return o, final_state diff --git a/fla/ops/linear_attn/naive.py b/fla/ops/linear_attn/naive.py new file mode 100644 index 0000000000000000000000000000000000000000..b6ecf2718fcac8eef80f445ed02b95f36329f3c4 --- /dev/null +++ b/fla/ops/linear_attn/naive.py @@ -0,0 +1,36 @@ +# -*- coding: utf-8 -*- + +from typing import Optional, Tuple + +import torch +from einops import rearrange + +from fla.ops.linear_attn.utils import normalize_output + + +def naive_chunk_linear_attn( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + normalize: bool = False +) -> Tuple[torch.Tensor, torch.Tensor]: + if scale is None: + scale = q.shape[-1] ** -0.5 + chunk_size = 64 + q = rearrange(q, 'b h (n c) d -> b h n c d', c=chunk_size) * scale + k = rearrange(k, 'b h (n c) d -> b h n c d', c=chunk_size) + v = rearrange(v, 'b h (n c) d -> b h n c d', c=chunk_size) + kv = k.transpose(-1, -2) @ v + kv = kv.cumsum(2) + kv = torch.cat([torch.zeros_like(kv[:, :, :1]), kv[:, :, :-1]], dim=2) + inter = q @ kv + intra = (( + q @ k.transpose(-1, -2)).masked_fill_( + torch.triu(torch.ones(chunk_size, chunk_size, dtype=bool, device=q.device), diagonal=1), + 0 + )) @ v + o = inter + intra + if normalize: + o = normalize_output(q * scale, k, o) + return rearrange(o, 'b h n c d -> b h (n c) d') diff --git a/fla/ops/linear_attn/utils.py b/fla/ops/linear_attn/utils.py new file mode 100644 index 0000000000000000000000000000000000000000..b444376833f5d512af6fc2db387db75a43a92e5d --- /dev/null +++ b/fla/ops/linear_attn/utils.py @@ -0,0 +1,10 @@ +# -*- coding: utf-8 -*- + +import torch + + +@torch.jit.script +def normalize_output(q, k, o): + k = k.cumsum(-2) + z = (q * k).sum(-1, keepdim=True) 
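+    # z_t = q_t · (sum_{s<=t} k_s) is the normalizer of linear attention;
+    # the small epsilon below guards against division by zero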
+ return o / (z + 1e-10) diff --git a/fla/ops/rebased/__init__.py b/fla/ops/rebased/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..6ec6a0cb31f7f635aa528cad753d5e19196a2028 --- /dev/null +++ b/fla/ops/rebased/__init__.py @@ -0,0 +1,7 @@ +# -*- coding: utf-8 -*- + +from .parallel import parallel_rebased + +__all__ = [ + 'parallel_rebased' +] diff --git a/fla/ops/rebased/naive.py b/fla/ops/rebased/naive.py new file mode 100644 index 0000000000000000000000000000000000000000..e9436a0802c964485354082dcc9fbcd437e5d7f7 --- /dev/null +++ b/fla/ops/rebased/naive.py @@ -0,0 +1,48 @@ +# -*- coding: utf-8 -*- + +import torch + +from fla.ops.rebased.parallel import parallel_rebased + + +def naive_parallel_rebased(q, k, v, use_scale=True, use_norm=True): + if use_scale: + q = q * (q.shape[-1] ** -0.5) + attn = q @ k.transpose(-2, -1) + attn = (attn ** 2) + attn.masked_fill_(~torch.tril(torch.ones( + q.shape[-2], q.shape[-2], dtype=torch.bool, device=q.device)), 0) + o = attn @ v + if use_norm: + z = attn.sum(-1) + return o / (z[..., None] + 1e-6) + else: + return o + + +if __name__ == "__main__": + B = 4 + H = 4 + L = 128 + # D = 15 + dtype = torch.float32 + q = (torch.randn(B, H, L, 16).cuda().to(dtype)).requires_grad_(True) + k = (torch.randn(B, H, L, 16).cuda().to(dtype)).requires_grad_(True) + v = torch.randn(B, H, L, 128).cuda().to(dtype).requires_grad_(True) + + do = torch.randn_like(v).cuda() + ref = naive_parallel_rebased(q, k, v, True, True) + ref.backward(do, retain_graph=True) + ref_dq, q.grad = q.grad.clone(), None + ref_dk, k.grad = k.grad.clone(), None + ref_dv, v.grad = v.grad.clone(), None + + tri = parallel_rebased(q, k, v, 1e-6, True, True) + tri.backward(do, retain_graph=True) + tri_dq, q.grad = q.grad.clone(), None + tri_dk, k.grad = k.grad.clone(), None + tri_dv, v.grad = v.grad.clone(), None + print((ref-tri).abs().max()) + print((ref_dq-tri_dq).abs().max()) + print((ref_dk-tri_dk).abs().max()) + print((ref_dv-tri_dv).abs().max()) diff --git a/fla/ops/rebased/parallel.py b/fla/ops/rebased/parallel.py new file mode 100644 index 0000000000000000000000000000000000000000..7a0aecb81b6350fa08c143f7967452968321c1c8 --- /dev/null +++ b/fla/ops/rebased/parallel.py @@ -0,0 +1,440 @@ + +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +import torch +import triton +import triton.language as tl + +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + +# Rebased: Linear Transformers with Learnable Kernel Functions are Better In-Context Models +# https://github.com/corl-team/rebased/blob/main/flash_linear_attention/fla/ops/triton/rebased_fast/parallel.py + + +@triton.jit +def parallel_rebased_fwd_kernel( + q, # query [B, H, L, D_head_K] + k, # key [B, H, L, D_head_V] + v, # value [B, H, L, D_head_V] + o, # output [B, H, L, D_head_V] + z, # normalizer [B, H, L] + s_k_h, # stride size: L * D_head_K + s_k_t, # stride size: D_head_K + s_k_d, # stride size: 1 + s_v_h, # stride size: L * D_head_V + s_v_t, # stride size: D_head_V + s_v_d, # stride size: 1 + scale, # D_head_K ** -0.5 + B, # batch size + H, # H + T, # T + K: tl.constexpr, # D_head_K + V: tl.constexpr, # D_head_V + BTL: tl.constexpr, # BLOCK SIZE along the sequence dimension for Q + BTS: tl.constexpr, # BLOCK SIZE along the sequence dimension for K/V + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension +): + # i_c: chunk index. 
used for sequence parallelism + i_kv, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + NV = tl.cdiv(V, BV) + i_k = i_kv // (NV) + i_v = i_kv % (NV) + + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_c * BTL, i_k * BK), (BTL, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, 0), (BK, BTS), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BTS, BV), (1, 0)) + + # [BQ, BD] block Q, in the shared memory throughout the whole kernel + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + b_o = tl.zeros([BTL, BV], dtype=tl.float32) + b_z = tl.zeros([BTL], dtype=tl.float32) + + # Q block and K block have no overlap + # no need for mask, thereby saving flops + for _ in range(0, i_c * BTL, BTS): + # [BK, BTS] + b_k = tl.load(p_k, boundary_check=(0, 1)) + + # [BTS, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BTL, BTS] + b_s = tl.dot(b_q, (b_k), allow_tf32=False) + b_s = b_s * b_s + b_z += tl.sum(b_s, axis=1) + + # [BQ, BD] + b_o = b_o + tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False) + p_k = tl.advance(p_k, (0, BTS)) + p_v = tl.advance(p_v, (BTS, 0)) + + # # rescale interchunk output + tl.debug_barrier() + o_q = tl.arange(0, BTL) + # # sync threads, easy for compiler to optimize + # tl.debug_barrier() + + o_k = tl.arange(0, BTS) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_c * BTL), (BK, BTS), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_c * BTL, i_v * BV), (BTS, BV), (1, 0)) + # Q block and K block have overlap. masks required + for _ in range(i_c * BTL, (i_c + 1) * BTL, BTS): + # [BK, BTS] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BTS, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BTL, BTS] + m_s = o_q[:, None] >= o_k[None, :] + b_s = tl.dot(b_q, b_k, allow_tf32=False) + b_s = b_s * b_s + b_s = tl.where(m_s, b_s, 0) + b_z += tl.sum(b_s, axis=1) + # [BTL, BV] + b_o += tl.dot(b_s.to(b_q.dtype), b_v, allow_tf32=False) + p_k = tl.advance(p_k, (0, BTS)) + p_v = tl.advance(p_v, (BTS, 0)) + o_k += BTS + + p_o = tl.make_block_ptr(o + (i_bh + B * H * i_k) * s_v_h, (T, V), (s_v_t, s_v_d), (i_c*BTL, i_v*BV), (BTL, BV), (1, 0)) + p_z = z + (i_bh + B * H * i_k) * T + i_c * BTL + tl.arange(0, BTL) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_z, b_z.to(p_z.dtype.element_ty), + mask=((i_c * BTL + tl.arange(0, BTL)) < T)) + + +@triton.jit +def _parallel_rebased_bwd_dq( + i_bh, + i_c, + i_k, + i_v, + i_h, + q, + k, + v, + do, + dz, + dq, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BTL: tl.constexpr, + BTS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr +): + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), + (i_c * BTL, i_v * BV), (BTL, BV), (1, 0)) + p_q = tl.make_block_ptr(q + (i_bh) * s_k_h, (T, K), + (s_k_t, s_k_d), (i_c*BTL, i_k*BK), (BTL, BK), (1, 0)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)).to(b_q.dtype) + b_q = (b_q * scale).to(b_q.dtype) + b_dq = tl.zeros([BTL, BK], dtype=tl.float32) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), + (s_k_t, s_k_d), (0, i_k * BK), (BTS, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), + (s_v_d, s_v_t), (i_v * BV, 0), (BV, BTS), (0, 1)) + p_dz = dz + i_bh * T + i_c * BTL + tl.arange(0, BTL) + 
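+    # b_dz holds the gradient w.r.t. the normalizer z; it is folded into the
+    # score gradients below only by the first value block (i_v == 0), so it is
+    # not double-counted when V is partitioned across programs
+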
b_dz = tl.load(p_dz, mask=(i_c * BTL + tl.arange(0, BTL)) < T) + + for _ in range(0, i_c * BTL, BTS): + # [BTS, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BV, BTS] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BTL, BTS] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) + if i_v == 0: + b_ds += b_dz[:, None] + else: + b_ds = b_ds + b_s = tl.dot(b_q, tl.trans(b_k), allow_tf32=False) + # [BQ, BD] + b_dq += tl.dot((2 * b_ds * b_s).to(b_v.dtype), b_k, allow_tf32=False) + p_k = tl.advance(p_k, (BTS, 0)) + p_v = tl.advance(p_v, (0, BTS)) + + b_dq *= scale + o_q = tl.arange(0, BTL) + o_k = tl.arange(0, BTS) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), + (s_k_t, s_k_d), (i_c * BTL, i_k * BK), (BTS, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), + (s_v_d, s_v_t), (i_v * BV, i_c * BTL), (BV, BTS), (0, 1)) + # Q block and K block have overlap. masks required + for _ in range(i_c * BTL, (i_c + 1) * BTL, BTS): + # [BTS, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BV, BTS] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BTL, BTS] + m_s = o_q[:, None] >= o_k[None, :] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) + if i_v == 0: + b_ds += b_dz[:, None] + else: + b_ds = b_ds + b_ds = tl.where(m_s, b_ds, 0) * scale + b_s = tl.dot(b_q, tl.trans(b_k), allow_tf32=False) + b_s = tl.where(m_s, b_s, 0) + # [BTL, BK] + b_dq += tl.dot((2 * b_ds * b_s).to(b_k.dtype), + b_k, allow_tf32=False) + p_k = tl.advance(p_k, (BTS, 0)) + p_v = tl.advance(p_v, (0, BTS)) + o_k += BTS + p_dq = tl.make_block_ptr(dq + (i_bh + B * H * i_v) * s_k_h, (T, K), + (s_k_t, s_k_d), (i_c*BTL, i_k*BK), (BTL, BK), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + return + + +@triton.jit +def _parallel_rebased_bwd_dkv( + i_bh, i_c, i_k, i_v, i_h, + q, k, v, do, dz, dk, dv, s_k_h, s_k_t, s_k_d, s_v_h, + s_v_t, s_v_d, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BTL: tl.constexpr, + BTS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, +): + # compute dk dv + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), + (i_c * BTL, i_k * BK), (BTL, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), + (i_c * BTL, i_v * BV), (BTL, BV), (1, 0)) + b_k, b_v = tl.load(p_k, boundary_check=(0, 1)), tl.load( + p_v, boundary_check=(0, 1)) + b_dk, b_dv = tl.zeros([BTL, BK], dtype=tl.float32), tl.zeros( + [BTL, BV], dtype=tl.float32) + + for i in range((tl.cdiv(T, BTS) * BTS)-BTS, (i_c + 1) * BTL - BTS, -BTS): + p_q = tl.make_block_ptr( + q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i), (BK, BTS), (0, 1)) + p_do = tl.make_block_ptr( + do + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i), (BV, BTS), (0, 1)) + p_dz = dz + i_bh * T + i + tl.arange(0, BTS) + b_q = tl.load(p_q, boundary_check=(0, 1)) # [BK, BTS] + b_do = tl.load(p_do, boundary_check=(0, 1)).to(b_q.dtype) # [BV, BTS] + b_dz = tl.load(p_dz, mask=(i + tl.arange(0, BTS)) < T) + b_s = tl.dot(b_k.to(b_q.dtype), b_q, allow_tf32=False) * \ + scale # [BTL, BTS] + b_s2 = b_s * b_s + b_dv += tl.dot(b_s2.to(b_q.dtype), tl.trans(b_do), allow_tf32=False) + b_ds = tl.dot(b_v, b_do, allow_tf32=False) * scale + if i_v == 0: + b_ds += b_dz[None, :] * scale + else: + b_ds = b_ds + b_dk += tl.dot((2 * b_ds * b_s).to(b_q.dtype), + tl.trans(b_q), allow_tf32=False) + + tl.debug_barrier() + o_q, o_k = tl.arange(0, BTS), tl.arange(0, BTL) + for i in range(i_c*BTL, (i_c+1)*BTL, BTS): + p_q = tl.make_block_ptr( + q + i_bh * s_k_h, (K, T), 
(s_k_d, s_k_t), (i_k * BK, i), (BK, BTS), (0, 1)) + p_do = tl.make_block_ptr( + do + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i), (BV, BTS), (0, 1)) + p_dz = dz + i_bh * T + i + tl.arange(0, BTS) + b_q = tl.load(p_q, boundary_check=(0, 1)) # [BD, BQ] + b_do = tl.load(p_do, boundary_check=(0, 1)).to(b_q.dtype) + b_dz = tl.load(p_dz, mask=(i + tl.arange(0, BTS)) < T) + # [BK, BQ] + m_s = o_k[:, None] <= o_q[None, :] + b_s = tl.dot(b_k, b_q, allow_tf32=False) * scale + b_s2 = b_s * b_s + b_s = tl.where(m_s, b_s, 0) + b_s2 = tl.where(m_s, b_s2, 0) + + b_ds = tl.dot(b_v, b_do, allow_tf32=False) + if i_v == 0: + b_ds += b_dz[None, :] + else: + b_ds = b_ds + b_ds = tl.where(m_s, b_ds, 0) * scale + # [BK, BD] + b_dv += tl.dot(b_s2.to(b_q.dtype), tl.trans(b_do), allow_tf32=False) + b_dk += tl.dot((2 * b_ds * b_s).to(b_q.dtype), + tl.trans(b_q), allow_tf32=False) + o_q += BTS + + p_dk = tl.make_block_ptr(dk + (i_bh + B * H * i_v) * s_k_h, + (T, K), (s_k_t, s_k_d), (i_c*BTL, i_k*BK), (BTL, BK), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_bh + B * H * i_k) * s_v_h, + (T, V), (s_v_t, s_v_d), (i_c*BTL, i_v*BV), (BTL, BV), (1, 0)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + return + + +@triton.jit +def parallel_rebased_bwd_kernel( + q, + k, + v, + do, + dz, + dq, + dk, + dv, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BTL: tl.constexpr, + BTS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr +): + i_kv, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + NV = tl.cdiv(V, BV) + i_k = i_kv // (NV) + i_v = i_kv % (NV) + i_h = i_bh % H + _parallel_rebased_bwd_dq( + i_bh, i_c, i_k, i_v, i_h, + q, k, v, do, dz, dq, s_k_h, s_k_t, s_k_d, s_v_h, + s_v_t, s_v_d, scale, + B=B, H=H, T=T, K=K, V=V, BTL=BTL, BTS=BTS, BK=BK, BV=BV + ) + tl.debug_barrier() + _parallel_rebased_bwd_dkv( + i_bh, i_c, i_k, i_v, i_h, + q, k, v, do, dz, dk, dv, s_k_h, s_k_t, s_k_d, s_v_h, + s_v_t, s_v_d, + scale, + B=B, H=H, T=T, K=K, V=V, BTL=BTL, BTS=BTS, BK=BK, BV=BV + ) + + +class ParallelBasedFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward(ctx, q, k, v, scale): + BTL, BTS = 128, 32 + assert BTL % BTS == 0 + # assert q.shape[-1] % 16 == 0 + BK = min(128, triton.next_power_of_2(k.shape[-1])) + BV = min(128, triton.next_power_of_2(v.shape[-1])) + BK, BV = max(BK, 16), max(BV, 16) + B, H, T, K, V = *k.shape, v.shape[-1] + num_stages = 2 + num_warps = 4 + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + grid = (NK * NV, triton.cdiv(T, BTL), B * H) + + assert NK == 1, "will encounter some synchronization issue if not." 
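+        # NK == 1 is required here: the squared score (q·k)^2 is not additive
+        # across key blocks, so the K dimension cannot be partitioned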
+ + o = torch.empty(NK, B, H, T, V, device=q.device) + z = torch.empty(NK, B, H, T, device=q.device) + parallel_rebased_fwd_kernel[grid]( + q, k, v, o, z, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale, + B=B, H=H, T=T, K=K, V=V, + BTL=BTL, BTS=BTS, BK=BK, BV=BV, + num_warps=num_warps, + num_stages=num_stages + ) + ctx.save_for_backward(q, k, v) + ctx.scale = scale + return o.sum(0).to(q.dtype), z.sum(0).to(q.dtype) + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dz): + q, k, v = ctx.saved_tensors + scale = ctx.scale + BTL, BTS = 64, 32 + assert BTL % BTS == 0 + BK = min(128, triton.next_power_of_2(k.shape[-1])) + BV = min(128, triton.next_power_of_2(v.shape[-1])) + BK, BV = max(BK, 16), max(BV, 16) + B, H, T, K, V = *k.shape, v.shape[-1] + num_stages = 2 + num_warps = 4 + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + grid = (NK * NV, triton.cdiv(T, BTL), B * H) + + assert NK == 1, "will encounter some synchronization issue if not" + + dq = torch.empty(NV, B, H, T, K, dtype=q.dtype, device=q.device) + dk = torch.empty(NV, B, H, T, K, dtype=q.dtype, device=q.device) + dv = torch.empty(NK, B, H, T, V, dtype=q.dtype, device=q.device) + + parallel_rebased_bwd_kernel[grid]( + q, k, v, do, dz, dq, dk, dv, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale, + B=B, H=H, T=T, K=K, V=V, + BTL=BTL, BTS=BTS, BK=BK, BV=BV, + num_warps=num_warps, + num_stages=num_stages + ) + + return dq.sum(0).to(q.dtype), dk.sum(0).to(k.dtype), dv.sum(0).to(v.dtype), None + + +triton_parallel_based = ParallelBasedFunction.apply + + +def parallel_rebased( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + eps: float = 1e-5, + use_scale: bool = True, + use_normalize: bool = True, + return_both: bool = False, + head_first: bool = True +): + assert q.shape[-1] <= 128, "only support feature dim up to 128" + if use_scale: + scale = q.shape[-1] ** -0.5 + else: + scale = 1 + if not head_first: + q, k, v = map(lambda x: x.transpose(1, 2), (q, k, v)) + o, z = triton_parallel_based(q, k, v, scale) + if return_both: + return o, z + if use_normalize: + o = o / (z[..., None] + eps) + if not head_first: + o = o.transpose(1, 2) + return o.to(q.dtype) diff --git a/fla/ops/retention/__init__.py b/fla/ops/retention/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..a38ab43c9982c9751bb9db146b9d9fe05663964a --- /dev/null +++ b/fla/ops/retention/__init__.py @@ -0,0 +1,13 @@ +# -*- coding: utf-8 -*- + +from .chunk import chunk_retention +from .fused_chunk import fused_chunk_retention +from .fused_recurrent import fused_recurrent_retention +from .parallel import parallel_retention + +__all__ = [ + 'chunk_retention', + 'fused_chunk_retention', + 'parallel_retention', + 'fused_recurrent_retention' +] diff --git a/fla/ops/retention/chunk.py b/fla/ops/retention/chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..6b3c152cdf7df3c01b5e9501ba3a246ec34de5f5 --- /dev/null +++ b/fla/ops/retention/chunk.py @@ -0,0 +1,845 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4) + ], + key=["BT", "BK", "BV"], +) +@triton.heuristics({ + 
'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'STORE_FINAL_STATE': lambda args: args['ht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def chunk_retention_fwd_kernel_h( + k, + v, + h, + h0, + ht, + offsets, + c_offsets, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_v, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_n, i_h = i_nh // H, i_nh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + boh = tl.load(c_offsets + i_n).to(tl.int32) + else: + bos, eos = i_n * T, i_n * T + T + NT = tl.cdiv(T, BT) + boh = i_n * NT + b_b = tl.math.log2(1 - tl.math.exp2(-5 - i_h * 1.0)) + + o_i = tl.arange(0, BT) + d_b, d_i = tl.math.exp2(BT * b_b), tl.math.exp2((BT - o_i - 1) * b_b) + + # [BK, BV] + b_h = tl.zeros([BK, BV], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h0 = tl.make_block_ptr(h0 + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_h = tl.load(p_h0, boundary_check=(0, 1)).to(tl.float32) + + for i_t in range(NT): + if HEAD_FIRST: + p_k = tl.make_block_ptr(k + i_nh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + i_nh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + (i_nh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos*H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + ((boh + i_t) * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + tl.store(p_h, b_h.to(p_h.dtype.element_ty), boundary_check=(0, 1)) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BK, BV] + if i_t == NT - 1 and (T % BT) != 0: + d_b = tl.math.exp2((T % BT) * b_b) + d_i = tl.math.exp2(((T % BT) - o_i - 1) * b_b) + b_h = d_b * b_h + tl.dot(b_k, (b_v * d_i[:, None]).to(b_k.dtype), allow_tf32=False) + + if STORE_FINAL_STATE: + p_ht = tl.make_block_ptr(ht + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4) + ], + key=["BT", "BK", "BV"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_retention_fwd_kernel_o( + q, + k, + v, + h, + o, + offsets, + indices, + scale, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NT: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_tg = i_t + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + else: 
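+        # equal-length batch: every sequence has length T and the same number of chunks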
+ NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + b_b = tl.math.log2(1 - tl.math.exp2(-5 - i_h * 1.0)) + + o_i = tl.arange(0, BT) + d_i = tl.math.exp2((o_i + 1) * b_b) + m_s = o_i[:, None] >= o_i[None, :] + d_s = tl.where(m_s, tl.math.exp2((o_i[:, None] - o_i[None, :]) * b_b), 0) + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + b_s = tl.zeros([BT, BT], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + (i_bh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + (i_tg * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BK, BV] + b_h = tl.load(p_h, boundary_check=(0, 1)) + b_o += tl.dot(b_q, b_h, allow_tf32=False) + b_s += tl.dot(b_q, b_k, allow_tf32=False) + + b_o = b_o * d_i[:, None] + b_s = b_s * d_s + if HEAD_FIRST: + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + else: + p_v = tl.make_block_ptr(v + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_o = (b_o + tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False)) * scale + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4) + ], + key=["BT", "BK", "BV"], +) +@triton.heuristics({ + 'STORE_INITIAL_STATE_GRADIENT': lambda args: args['dh0'] is not None, + 'USE_FINAL_STATE_GRADIENT': lambda args: args['dht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def chunk_retention_bwd_kernel_dh( + q, + do, + dh, + dh0, + dht, + offsets, + c_offsets, + scale, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + STORE_INITIAL_STATE_GRADIENT: tl.constexpr, + USE_FINAL_STATE_GRADIENT: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_v, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_n, i_h = i_nh // H, i_nh % H + b_b = tl.math.log2(1 - tl.math.exp2(-5 - i_h * 1.0)) + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + boh = tl.load(c_offsets + i_n).to(tl.int32) + else: + bos, eos = i_n * T, i_n * T + T + NT = tl.cdiv(T, BT) + boh = i_n * NT + + o_i = tl.arange(0, BT) + d_i = tl.math.exp2((o_i + 1) * b_b) + + # [BK, BV] + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + if USE_FINAL_STATE_GRADIENT: + p_dht = tl.make_block_ptr(dht + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_dh += tl.load(p_dht, 
boundary_check=(0, 1)).to(tl.float32) + for i_t in range(NT - 1, -1, -1): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_nh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_nh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dh = tl.make_block_ptr(dh + (i_nh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos*H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_do = tl.make_block_ptr(do + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dh = tl.make_block_ptr(dh + ((boh+i_t) * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_dh, b_dh.to(p_dh.dtype.element_ty), boundary_check=(0, 1)) + + d_b = tl.math.exp2(min(BT, T - i_t * BT) * b_b) + # [BK, BT] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BT, V] + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BK, BV] + b_dh = d_b * b_dh + tl.dot(b_q, (b_do * d_i[:, None]).to(b_q.dtype), allow_tf32=False) + + if STORE_INITIAL_STATE_GRADIENT: + p_dh0 = tl.make_block_ptr(dh0 + i_nh * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4) + ], + key=["BT", "BK", "BV"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_retention_bwd_kernel_dqkv( + q, + k, + v, + h, + do, + dh, + dq, + dk, + dv, + offsets, + indices, + scale, + B: tl.constexpr, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NT: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_tg = i_t + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + all = T + T = eos - bos + NT = tl.cdiv(T, BT) + else: + NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + all = B * T + b_b = tl.math.log2(1 - tl.math.exp2(-5 - i_h * 1.0)) + + o_i = tl.arange(0, BT) + d_q, d_k = tl.math.exp2((o_i + 1) * b_b), tl.math.exp2((min(BT, T - i_t * BT) - o_i - 1) * b_b) + d_q = (d_q * scale).to(d_q.dtype) + m_s = o_i[:, None] >= o_i[None, :] + d_s = tl.where(m_s, tl.math.exp2((o_i[:, None] - o_i[None, :]) * b_b), 0) * scale + + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_s = tl.dot(b_k, b_q, allow_tf32=False) * tl.trans(d_s) + + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + b_ds = tl.zeros([BT, BT], dtype=tl.float32) + for i_v in range(tl.cdiv(V, BV)): + if HEAD_FIRST: + p_v = 
tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_k * B*H + i_bh) * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + (i_bh * NT + i_t) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + p_dh = tl.make_block_ptr(dh + (i_bh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_v = tl.make_block_ptr(v + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + ((i_k*all+bos)*H+i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + (i_tg * H + i_h) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + p_dh = tl.make_block_ptr(dh + (i_tg * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BV, BK] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # [BK, BV] + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + + # [BT, BT] + b_ds += tl.dot(b_do, tl.trans(b_v), allow_tf32=False) + # [BT, BK] + b_dq += tl.dot(b_do, b_h.to(b_do.dtype), allow_tf32=False) + b_dk += tl.dot(b_v, tl.trans(b_dh).to(b_v.dtype), allow_tf32=False) + # [BT, BV] + b_dv = tl.dot(b_k, b_dh, allow_tf32=False) * d_k[:, None] + tl.dot(b_s.to(b_q.dtype), b_do, allow_tf32=False) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + # [BT, BT] + b_ds = (b_ds * d_s).to(b_q.dtype) + # [BT, BK] + b_dq = b_dq * d_q[:, None] + tl.dot(b_ds, b_k, allow_tf32=False) + b_dk = b_dk * d_k[:, None] + tl.trans(tl.dot(b_q, b_ds, allow_tf32=False)) + + if HEAD_FIRST: + p_dq = tl.make_block_ptr(dq + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_dq = tl.make_block_ptr(dq + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + +def chunk_retention_fwd_h( + k: torch.Tensor, + v: torch.Tensor, + h0: torch.Tensor, + output_final_state: bool, + offsets: Optional[torch.Tensor] = None, + c_offsets: Optional[torch.Tensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + BT = chunk_size + # N: the actual number of sequences in the batch with either equal or variable lengths + if offsets is None: + N, NT, c_offsets = B, triton.cdiv(T, BT), None + else: + N = len(offsets) - 1 + if c_offsets is None: + c_offsets = torch.cat([offsets.new_tensor([0]), triton.cdiv(offsets[1:] - offsets[:-1], BT)]).cumsum(-1) + NT = c_offsets[-1] + BK = min(triton.next_power_of_2(K), 64) + BV = min(triton.next_power_of_2(V), 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + + if head_first: + h = k.new_empty(B, H, NT, K, V, dtype=k.dtype) + else: + h = k.new_empty(B, NT, H, K, V, dtype=k.dtype) + ht = 
k.new_empty(N, H, K, V, dtype=torch.float32) if output_final_state else None + + grid = (NK, NV, N * H) + chunk_retention_fwd_kernel_h[grid]( + k=k, + v=v, + h=h, + h0=h0, + ht=ht, + offsets=offsets, + c_offsets=c_offsets, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + HEAD_FIRST=head_first + ) + return h, ht + + +def chunk_retention_fwd_o( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + h: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> torch.Tensor: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(triton.next_power_of_2(K), 64) + BV = min(triton.next_power_of_2(V), 64) + NV = triton.cdiv(V, BV) + + o = torch.empty_like(v) + grid = (NV, NT, B * H) + chunk_retention_fwd_kernel_o[grid]( + q=q, + k=k, + v=v, + h=h, + o=o, + offsets=offsets, + indices=indices, + scale=scale, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + NT=NT, + HEAD_FIRST=head_first + ) + return o + + +def chunk_retention_bwd_dh( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + do: torch.Tensor, + h0: torch.Tensor, + dht: torch.Tensor, + scale: float, + offsets: Optional[torch.Tensor] = None, + c_offsets: Optional[torch.Tensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + BT = chunk_size + # N: the actual number of sequences in the batch with either equal or variable lengths + if offsets is None: + N, NT, c_offsets = B, triton.cdiv(T, BT), None + else: + N = len(offsets) - 1 + if c_offsets is None: + c_offsets = torch.cat([offsets.new_tensor([0]), triton.cdiv(offsets[1:] - offsets[:-1], BT)]).cumsum(-1) + NT = c_offsets[-1] + BK = min(triton.next_power_of_2(K), 64) + BV = min(triton.next_power_of_2(V), 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + + if head_first: + dh = k.new_empty(B, H, NT, K, V, dtype=k.dtype) + else: + dh = k.new_empty(B, NT, H, K, V, dtype=k.dtype) + dh0 = torch.empty_like(h0, dtype=torch.float32) if h0 is not None else None + + grid = (NK, NV, N * H) + chunk_retention_bwd_kernel_dh[grid]( + q=q, + do=do, + dh=dh, + dh0=dh0, + dht=dht, + offsets=offsets, + c_offsets=c_offsets, + scale=scale, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + HEAD_FIRST=head_first + ) + return dh, dh0 + + +def chunk_retention_bwd_dqkv( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + h: torch.Tensor, + do: torch.Tensor, + dh: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], 
chunk_size).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(64, triton.next_power_of_2(K)) + BV = min(64, triton.next_power_of_2(V)) + NK = triton.cdiv(K, BK) + + dq = torch.empty_like(q) + dk = torch.empty_like(k) + dv = v.new_empty(NK, *v.shape) + grid = (NK, NT, B * H) + chunk_retention_bwd_kernel_dqkv[grid]( + q=q, + k=k, + v=v, + h=h, + do=do, + dh=dh, + dq=dq, + dk=dk, + dv=dv, + offsets=offsets, + indices=indices, + scale=scale, + B=B, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + NT=NT, + HEAD_FIRST=head_first + ) + dv = dv.sum(0) + return dq, dk, dv + + +def chunk_retention_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: float, + initial_state: torch.Tensor, + output_final_state: bool, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor]: + h, ht = chunk_retention_fwd_h( + k=k, + v=v, + h0=initial_state, + output_final_state=output_final_state, + offsets=offsets, + head_first=head_first, + chunk_size=chunk_size + ) + o = chunk_retention_fwd_o( + q=q, + k=k, + v=v, + h=h, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return o, ht + + +def chunk_retention_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float], + initial_state: Optional[torch.Tensor], + do: torch.Tensor, + dht: torch.Tensor, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + h, _ = chunk_retention_fwd_h( + k=k, + v=v, + h0=initial_state, + output_final_state=False, + offsets=offsets, + head_first=head_first, + chunk_size=chunk_size + ) + dh, dh0 = chunk_retention_bwd_dh( + q=q, + k=k, + v=v, + do=do, + h0=initial_state, + dht=dht, + scale=scale, + offsets=offsets, + head_first=head_first, + chunk_size=chunk_size + ) + dq, dk, dv = chunk_retention_bwd_dqkv( + q=q, + k=k, + v=v, + h=h, + do=do, + dh=dh, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return dq, dk, dv, dh0 + + +class ChunkRetentionFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward( + ctx, + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: float, + initial_state: torch.Tensor, + output_final_state: bool, + offsets: torch.LongTensor, + head_first: bool + ): + T = q.shape[2] if head_first else q.shape[1] + chunk_size = min(64, triton.next_power_of_2(T)) + + # 2-d indices denoting the offsets of chunks in each sequence + # for example, if the passed `offsets` is [0, 100, 356] and `chunk_size` is 64, + # then there are 2 and 4 chunks in the 1st and 2nd sequences respectively, and `indices` will be + # [[0, 0], [0, 1], [1, 0], [1, 1], [1, 2], [1, 3]] + indices = None + if offsets is not None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], chunk_size).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + + o, ht = chunk_retention_fwd( + q=q, + k=k, + v=v, + scale=scale, + initial_state=initial_state, + output_final_state=output_final_state, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + ctx.save_for_backward(q, k, v, initial_state) + 
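+        # tensors needed in backward go through save_for_backward; other
+        # bookkeeping (scale, offsets, indices, head_first, chunk_size) is
+        # stashed directly on ctx
+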
ctx.scale = scale + ctx.offsets = offsets + ctx.indices = indices + ctx.head_first = head_first + ctx.chunk_size = chunk_size + return o.to(q.dtype), ht + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dht): + q, k, v, initial_state = ctx.saved_tensors + chunk_size, scale, offsets, indices, head_first = ctx.chunk_size, ctx.scale, ctx.offsets, ctx.indices, ctx.head_first + dq, dk, dv, dh0 = chunk_retention_bwd( + q=q, + k=k, + v=v, + scale=scale, + initial_state=initial_state, + do=do, + dht=dht, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return dq.to(q.dtype), dk.to(k.dtype), dv.to(v.dtype), None, dh0, None, None, None + + +def chunk_retention( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]` + scale (Optional[int]): + Scale factor for the attention scores. + If not provided, it will default to `1 / sqrt(K)`. Default: `None`. + initial_state (Optional[torch.Tensor]): + Initial state of shape `[N, H, K, V]` for `N` input sequences. + For equal-length input sequences, `N` equals the batch size `B`. + Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state of shape `[N, H, K, V]`. Default: `False`. + offsets (Optional[torch.LongTensor]): + Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch. + For example, + if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively. + If provided, the inputs are concatenated and the batch size `B` is expected to be 1. + Default: `None`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format, which is not supported for variable-length inputs. + Default: `True`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + final_state (torch.Tensor): + Final state of shape `[N, H, K, V]` if `output_final_state=True` else `None`. 
+ + Examples:: + >>> import torch + >>> import torch.nn.functional as F + >>> from einops import rearrange + >>> from fla.ops.retention import chunk_retention + # inputs with equal lengths + >>> B, T, H, K, V = 4, 2048, 4, 512, 512 + >>> q = torch.randn(B, T, H, K, device='cuda') + >>> k = torch.randn(B, T, H, K, device='cuda') + >>> v = torch.randn(B, T, H, V, device='cuda') + >>> h0 = torch.randn(B, H, K, V, device='cuda') + >>> o, ht = chunk_retention(q, k, v, + initial_state=h0, + output_final_state=True, + head_first=False) + # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required + >>> q, k, v = map(lambda x: rearrange(x, 'b t h d -> 1 (b t) h d'), (q, k, v)) + # for a batch with 4 sequences, offsets with 5 start/end positions are expected + >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long) + >>> o_var, ht_var = chunk_retention(q, k, v, + initial_state=h0, + output_final_state=True, + offsets=offsets, + head_first=False) + >>> assert o.allclose(o_var.view(o.shape)) + >>> assert ht.allclose(ht_var) + """ + if offsets is not None: + if q.shape[0] != 1: + raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`." + f"Please flatten variable-length inputs before processing.") + if head_first: + raise RuntimeError("Sequences with variable lengths are not supported for head-first mode") + if initial_state is not None and initial_state.shape[0] != len(offsets) - 1: + raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, " + f"i.e., {len(offsets) - 1} rather than {initial_state.shape[0]}.") + if scale is None: + scale = k.shape[-1] ** -0.5 + o, final_state = ChunkRetentionFunction.apply( + q, + k, + v, + scale, + initial_state, + output_final_state, + offsets, + head_first + ) + return o, final_state diff --git a/fla/ops/retention/fused_chunk.py b/fla/ops/retention/fused_chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..ae490cdb8a58b4c1a0e557667532d174eced3439 --- /dev/null +++ b/fla/ops/retention/fused_chunk.py @@ -0,0 +1,353 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl +from packaging import version + +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.jit +def fused_chunk_retention_fwd_kernel( + q, + k, + v, + o, + h0, + ht, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr, + CHECK: tl.constexpr +): + # indices + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_h = i_bh % H + + o_i = tl.arange(0, BT) + # decay rate given the head index + b_b = tl.math.log2(1 - tl.math.exp2(-5 - i_h * 1.0)) + + # d_b: overall decay for the entire chunk + # d_o: cumulative decay from the start of the chunk + # d_h: cumulative decay from the end of the chunk + d_b, d_o, d_h = tl.math.exp2(BT * b_b), tl.math.exp2((o_i + 1) * b_b), tl.math.exp2((BT - o_i - 1) * b_b) + + # [BT, BT] + m_s = o_i[:, None] >= o_i[None, :] + d_s = tl.where(m_s, tl.math.exp2((o_i[:, None] - o_i[None, :]) * b_b), 0) + # [BK, BV] + b_h = tl.zeros([BK, BV], dtype=tl.float32) + + # make block pointers + p_q = tl.make_block_ptr(q + i_bh 
* s_k_h, (T, K), (s_k_t, s_k_d), (0, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, 0), (BK, BT), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + (i_bh+i_k*B*H) * s_v_h, (T, V), (s_v_t, s_v_d), (0, i_v * BV), (BT, BV), (1, 0)) + + if USE_INITIAL_STATE: + p_h = tl.make_block_ptr(h0 + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_h = tl.load(p_h, boundary_check=(0, 1)).to(tl.float32) + + NT = tl.cdiv(T, BT) + for i in range(0, NT): + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_k.dtype) + + # [BT, BT] + b_s = tl.dot(b_q, b_k, allow_tf32=False) * d_s + # [BT, BV] + b_o = tl.dot(b_s.to(b_q.dtype), b_v, allow_tf32=False) + if CHECK and i == 0: + b_o += tl.dot(b_q, b_h.to(b_q.dtype), allow_tf32=False) * d_o[:, None] + b_h = d_b * b_h + tl.dot(b_k, (b_v * d_h[:, None]).to(b_k.dtype), allow_tf32=False) + else: + b_o += tl.dot(b_q, b_h.to(b_q.dtype), allow_tf32=False) * d_o[:, None] + if i == NT - 1 and (T % BT) != 0: + d_b = tl.math.exp2((T % BT) * b_b) + d_h = tl.math.exp2(((T % BT) - o_i - 1) * b_b) + b_h = d_b * b_h + tl.dot(b_k, (b_v * d_h[:, None]).to(b_k.dtype), allow_tf32=False) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + p_q = tl.advance(p_q, (BT, 0)) + p_k = tl.advance(p_k, (0, BT)) + p_v = tl.advance(p_v, (BT, 0)) + p_o = tl.advance(p_o, (BT, 0)) + + if STORE_FINAL_STATE: + p_ht = tl.make_block_ptr(ht + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def fused_chunk_retention_bwd_kernel( + q, + k, + v, + do, + dq, + dk, + dv, + + h0, + + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + CHECK: tl.constexpr +): + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_h = i_bh % H + + o_i = tl.arange(0, BT) + b_b = tl.math.log2(1 - tl.math.exp2(-5 - i_h * 1.0)) + d_q, d_k = tl.math.exp2((o_i+1) * b_b) * scale, tl.math.exp2((BT - o_i - 1) * b_b) + d_b = tl.math.exp2(BT * b_b) + + m_s = o_i[:, None] >= o_i[None, :] + d_s = tl.where(m_s, tl.math.exp2((o_i[:, None] - o_i[None, :]) * b_b), 0) * scale + # [BV, BK] + b_h = tl.zeros([BV, BK], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h = tl.make_block_ptr(h0 + i_bh * K * V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + b_h = tl.load(p_h, boundary_check=(0, 1)).to(tl.float32) + + for i in range(0, tl.cdiv(T, BT)): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i * BT, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), (s_v_d, s_v_t), (i_v * BV, i * BT), (BV, BT), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i * BT, i_v * BV), (BT, BV), (1, 0)) + p_dq = tl.make_block_ptr(dq + (i_bh + i_v*B*H) * s_k_h, (T, K), (s_k_t, s_k_d), (i*BT, i_k*BK), (BT, BK), (1, 0)) + + # [BT, K] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [V, BT] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, V] + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_dd = (b_do * d_q[:, 
None]).to(b_do.dtype) + + # [BT, BT] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) + b_ds = (b_ds * d_s).to(b_k.dtype) + # [BT, K] + b_dq = tl.dot(b_ds, b_k, allow_tf32=False) + # [V, K] + if CHECK and i == 0: + b_dq += tl.dot(b_dd, b_h.to(b_k.dtype), allow_tf32=False) + b_h = d_b * b_h + tl.dot((b_v * d_k[None, :]).to(b_k.dtype), b_k, allow_tf32=False) + else: + b_dq += tl.dot(b_dd, b_h.to(b_k.dtype), allow_tf32=False) + b_h = d_b * b_h + tl.dot((b_v * d_k[None, :]).to(b_k.dtype), b_k, allow_tf32=False) + + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + + # sync threads + b_h = None + tl.debug_barrier() + d_s = tl.trans(d_s) + # [BK, BV] + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + for i in range(1, tl.cdiv(T, BT) + 1): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, T - i * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (T - i * BT, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (T - i * BT, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (T - i * BT, i_v * BV), (BT, BV), (1, 0)) + p_dk = tl.make_block_ptr(dk + (i_bh+i_v*B*H) * s_k_h, (T, K), (s_k_t, s_k_d), (T - i*BT, i_k*BK), (BT, BK), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_bh+i_k*B*H) * s_v_h, (T, V), (s_v_t, s_v_d), (T - i*BT, i_v*BV), (BT, BV), (1, 0)) + # [K, BT] + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BT, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_dd = (b_do * d_q[:, None]).to(b_do.dtype) + + # [BT, BT] + b_ds = tl.dot(b_v, tl.trans(b_do), allow_tf32=False) + b_ds = (b_ds * d_s).to(b_k.dtype) + + # [BT, BT] + b_s = tl.dot(b_k, b_q, allow_tf32=False) * d_s + # [BT, BK] + b_dk = tl.dot(b_ds, tl.trans(b_q), allow_tf32=False) + # [BT, BV] + b_dv = tl.dot(b_s.to(b_q.dtype), b_do, allow_tf32=False) + if CHECK and i == 1: + b_dk += tl.dot(b_v, tl.trans(b_dh).to(b_v.dtype), allow_tf32=False) * d_k[:, None] + b_dv += tl.dot(b_k, b_dh.to(b_k.dtype), allow_tf32=False) * d_k[:, None] + b_dh = d_b * b_dh + tl.dot(b_q, b_dd, allow_tf32=False) + else: + b_dk += tl.dot(b_v, tl.trans(b_dh).to(b_v.dtype), allow_tf32=False) * d_k[:, None] + b_dv += tl.dot(b_k, b_dh.to(b_k.dtype), allow_tf32=False) * d_k[:, None] + b_dh = d_b * b_dh + tl.dot(b_q, b_dd, allow_tf32=False) + + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + +class FusedChunkRetentionFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward(ctx, q, k, v, scale, initial_state, output_final_state): + B, H, T, K, V = *k.shape, v.shape[-1] + + BT = 64 + BK, BV = min(triton.next_power_of_2(K), 64), min(triton.next_power_of_2(V), 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + num_stages = 1 + num_warps = 4 + + o = q.new_empty(NK, B, H, T, V) + + if output_final_state: + final_state = q.new_empty(B, H, K, V, dtype=torch.float32, requires_grad=False) + else: + final_state = None + # the bug still exists even for Triton 2.2 on H100 GPUs + # so we always enable initial checks + CHECK = True + if version.parse(triton.__version__) < version.parse('2.2.0'): + import warnings + warnings.warn( + "Triton<2.2.0 detected for running this kernel, " + "which is known to have some weird compiler issues (refer to 
https://github.com/openai/triton/issues/2852) " + "that lead to significant precision loss. " + "We've add some initial condition checks to resolve this, sadly at the sacrifice of the speed. " + "For optimal performance, it is recommended to install Triton>=2.2.0 (if possible)." + ) + CHECK = True + + grid = (NV, NK, B * H) + fused_chunk_retention_fwd_kernel[grid]( + q, k, v, o, initial_state, final_state, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale, + B=B, H=H, T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, + USE_INITIAL_STATE=initial_state is not None, + STORE_FINAL_STATE=output_final_state, + CHECK=CHECK, + num_warps=num_warps, + num_stages=num_stages + ) + + o = o.sum(0) + ctx.save_for_backward(q, k, v, initial_state) + ctx.CHECK = CHECK + return o.to(q.dtype), final_state + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dht=None): + q, k, v, initial_state = ctx.saved_tensors + B, H, T, K, V = *k.shape, v.shape[-1] + scale = K ** -0.5 + + BT = 64 + BK, BV = min(triton.next_power_of_2(K), 64), min(triton.next_power_of_2(V), 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + num_stages = 1 + num_warps = 4 + + dq = q.new_empty(NV, B, H, T, K) + dk = q.new_empty(NV, B, H, T, K) + dv = q.new_empty(NK, B, H, T, V) + grid = (NV, NK, B * H) + + fused_chunk_retention_bwd_kernel[grid]( + q, k, v, do, dq, dk, dv, initial_state, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + scale, + B=B, H=H, T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, + USE_INITIAL_STATE=initial_state is not None, + CHECK=ctx.CHECK, + num_warps=num_warps, + num_stages=num_stages + ) + dq = dq.sum(0) + dk = dk.sum(0) + dv = dv.sum(0) + return dq.to(q.dtype), dk.to(k.dtype), dv.to(v.dtype), None, None, None + + +def fused_chunk_retention( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]` + scale (Optional[int]): + Scale factor for the attention scores. + If not provided, it will default to `1 / sqrt(K)`. Default: `None`. + initial_state (Optional[torch.Tensor]): + Initial state of shape `[B, H, K, V]`. Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state of shape `[B, H, K, V]`. Default: `False`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format. + Default: `True`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + final_state (torch.Tensor): + Final state of shape `[B, H, K, V]` if `output_final_state=True` else `None`. 
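+
+    Examples::
+        >>> import torch
+        >>> from fla.ops.retention import fused_chunk_retention
+        # a minimal sketch mirroring the `chunk_retention` example, here with the
+        # default head-first layout; shapes follow the Args above
+        >>> B, H, T, K, V = 4, 4, 2048, 512, 512
+        >>> q = torch.randn(B, H, T, K, device='cuda')
+        >>> k = torch.randn(B, H, T, K, device='cuda')
+        >>> v = torch.randn(B, H, T, V, device='cuda')
+        >>> h0 = torch.randn(B, H, K, V, device='cuda')
+        >>> o, ht = fused_chunk_retention(q, k, v,
+                                          initial_state=h0,
+                                          output_final_state=True)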
+ """ + assert q.dim() == k.dim() == v.dim() == 4, "q, k, v must have 4 dimensions" + assert q.dtype == k.dtype == v.dtype, "q, k, v must have the same dtype" + if scale is None: + scale = k.shape[-1] ** -0.5 + o, final_state = FusedChunkRetentionFunction.apply(q, k, v, scale, initial_state, output_final_state, head_first) + return o, final_state diff --git a/fla/ops/retention/fused_recurrent.py b/fla/ops/retention/fused_recurrent.py new file mode 100644 index 0000000000000000000000000000000000000000..05a5b57c6ab8cb7724b4c1f08a695cd66286c2ba --- /dev/null +++ b/fla/ops/retention/fused_recurrent.py @@ -0,0 +1,472 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.utils import contiguous + + +@triton.heuristics({ + 'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'STORE_FINAL_STATE': lambda args: args['ht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def fused_recurrent_retention_fwd_kernel( + q, + k, + v, + o, + h0, + ht, + offsets, + scale, + B: tl.constexpr, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + REVERSE: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + STORE_FINAL_STATE: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_k, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_n, i_h = i_nh // H, i_nh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int64), tl.load(offsets + i_n + 1).to(tl.int64) + all = T + T = eos - bos + else: + bos, eos = i_n * T, i_n * T + T + all = B * T + + # decay rate given the head index + b_b = (1 - tl.math.exp2(-5 - i_h * 1.0)) + + if HEAD_FIRST: + p_q = q + i_nh * T*K + i_k * BK + tl.arange(0, BK) + p_k = k + i_nh * T*K + i_k * BK + tl.arange(0, BK) + p_v = v + i_nh * T*V + i_v * BV + tl.arange(0, BV) + p_o = o + (i_k * B*H + i_nh) * T*V + i_v * BV + tl.arange(0, BV) + else: + p_q = q + (bos + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_k = k + (bos + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_v = v + (bos + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_o = o + ((i_k * all + bos) + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + + mask_k = (i_k * BK + tl.arange(0, BK)) < K + mask_v = (i_v * BV + tl.arange(0, BV)) < V + mask_h = mask_k[None, :] & mask_v[:, None] + + b_h = tl.zeros([BV, BK], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h0 = h0 + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None]) + b_h += tl.load(p_h0, mask=mask_h, other=0).to(tl.float32) + + for _ in range(0, T): + b_q = tl.load(p_q, mask=mask_k, other=0).to(tl.float32) * scale + b_k = tl.load(p_k, mask=mask_k, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_v, other=0).to(tl.float32) + + b_h = b_b * b_h + b_k[None, :] * b_v[:, None] + b_o = b_h * b_q[None, :] + b_o = tl.sum(b_o, axis=1) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), mask=mask_v) + + p_q += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + p_k += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + p_v += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + p_o += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + + if STORE_FINAL_STATE: + p_ht = ht + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + 
(i_v * BV + tl.arange(0, BV)[:, None]) + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), mask=mask_h) + + +@triton.heuristics({ + 'USE_INITIAL_STATE': lambda args: args['h0'] is not None, + 'USE_FINAL_STATE_GRADIENT': lambda args: args['dht'] is not None, + 'USE_OFFSETS': lambda args: args['offsets'] is not None +}) +@triton.jit +def fused_recurrent_retention_bwd_kernel( + q, + k, + v, + h0, + do, + dq, + dk, + dv, + dh0, + dht, + offsets, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + REVERSE: tl.constexpr, + USE_INITIAL_STATE: tl.constexpr, + USE_FINAL_STATE_GRADIENT: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_k, i_nh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_n, i_h = i_nh // H, i_nh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_n).to(tl.int64), tl.load(offsets + i_n + 1).to(tl.int64) + all = T + T = eos - bos + else: + bos, eos = i_n * T, i_n * T + T + all = B * T + + b_b = 1 - tl.math.exp2(-5 - i_h * 1.0) + + if HEAD_FIRST: + p_k = k + i_nh * T*K + ((T-1) * K if REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_v = v + i_nh * T*V + ((T-1) * V if REVERSE else 0) + i_v * BV + tl.arange(0, BV) + p_do = do + i_nh * T*V + ((T-1) * V if REVERSE else 0) + i_v * BV + tl.arange(0, BV) + p_dq = dq + (i_v * B*H + i_nh) * T*K + ((T-1) * K if REVERSE else 0) + i_k * BK + tl.arange(0, BK) + else: + p_k = k + (bos + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_v = v + (bos + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_do = do + (bos + ((T-1) if REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_dq = dq + ((i_v * all + bos) + ((T-1) if REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + mask_k = i_k * BK + tl.arange(0, BK) < K + mask_v = i_v * BV + tl.arange(0, BV) < V + mask_h = mask_k[:, None] & mask_v[None, :] + + b_h = tl.zeros([BK, BV], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h0 = h0 + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + b_h += tl.load(p_h0, mask=mask_h, other=0).to(tl.float32) + + for _ in range(0, T): + b_k = tl.load(p_k, mask=mask_k, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_v, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask_v, other=0).to(tl.float32) + + b_h = b_b * b_h + b_k[:, None] * b_v[None, :] + b_dq = tl.sum(b_h * b_do[None, :], axis=1) * scale + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), mask=mask_k) + + p_k += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + p_v += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + p_do += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * V + p_dq += (-1 if REVERSE else 1) * (1 if HEAD_FIRST else H) * K + + # sync threads + tl.debug_barrier() + + if HEAD_FIRST: + p_q = q + i_nh * T*K + ((T - 1) * K if not REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_k = k + i_nh * T*K + ((T - 1) * K if not REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_v = v + i_nh * T*V + ((T - 1) * V if not REVERSE else 0) + i_v * BV + tl.arange(0, BV) + p_do = do + i_nh * T*V + ((T - 1) * V if not REVERSE else 0) + i_v * BV + tl.arange(0, BV) + p_dk = dk + (i_v * B*H + i_nh) * T*K + ((T - 1) * K if not REVERSE else 0) + i_k * BK + tl.arange(0, BK) + p_dv = dv + (i_k * B*H + i_nh) * T*V + ((T - 1) * V if not REVERSE else 0) + i_v * BV + tl.arange(0, BV) + else: + p_q = q + (bos + ((T - 1) if not REVERSE else 0)) 
* H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_k = k + (bos + ((T - 1) if not REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_v = v + (bos + ((T - 1) if not REVERSE else 0))*H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_do = do + (bos + ((T - 1) if not REVERSE else 0))*H*V + i_h * V + i_v * BV + tl.arange(0, BV) + p_dk = dk + ((i_v * all + bos) + ((T - 1) if not REVERSE else 0)) * H*K + i_h * K + i_k * BK + tl.arange(0, BK) + p_dv = dv + ((i_k * all + bos) + ((T - 1) if not REVERSE else 0)) * H*V + i_h * V + i_v * BV + tl.arange(0, BV) + + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + if USE_FINAL_STATE_GRADIENT: + p_ht = dht + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + b_dh += tl.load(p_ht, mask=mask_h, other=0).to(tl.float32) + + for _ in range(T): + b_q = tl.load(p_q, mask=mask_k, other=0).to(tl.float32) * scale + b_k = tl.load(p_k, mask=mask_k, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_v, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask_v, other=0).to(tl.float32) + + b_dh += b_q[:, None] * b_do[None, :] + b_dk = tl.sum(b_dh * b_v[None, :], axis=1) + b_dv = tl.sum(b_dh * b_k[:, None], axis=0) + + b_dh *= b_b + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), mask=mask_k) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), mask=mask_v) + + p_q += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * K + p_k += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * K + p_v += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * V + p_do += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * V + p_dk += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * K + p_dv += (1 if REVERSE else -1) * (1 if HEAD_FIRST else H) * V + + if USE_INITIAL_STATE: + p_dh0 = dh0 + i_nh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), mask=mask_h) + + +def fused_recurrent_retention_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + reverse: bool = False, + offsets: Optional[torch.Tensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + N = B if offsets is None else len(offsets) - 1 + BK, BV = min(K, 64), min(V, 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + + h0 = initial_state + ht = q.new_empty(N, H, K, V, dtype=torch.float32) if output_final_state else None + o = q.new_empty(NK, *v.shape, dtype=torch.float) + + grid = (NV, NK, N * H) + fused_recurrent_retention_fwd_kernel[grid]( + q, + k, + v, + o, + h0, + ht, + offsets, + scale, + B=B, + T=T, + H=H, + K=K, + V=V, + BK=BK, + BV=BV, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + o = o.sum(0) + return o.to(v.dtype), ht + + +def fused_recurrent_retention_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + do: torch.Tensor, + dht: torch.Tensor, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + reverse: bool = False, + offsets: Optional[torch.Tensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, H, K, V = *k.shape, v.shape[-1] + N = B if offsets is None else len(offsets) - 1 + + BK, BV = min(K, 64), min(V, 64) + NK, NV = triton.cdiv(K, BK), 
triton.cdiv(V, BV) + + dq = q.new_empty(NV, *q.shape, dtype=torch.float) + dk = q.new_empty(NV, *k.shape, dtype=torch.float) + dv = q.new_empty(NK, *v.shape, dtype=torch.float) + h0 = initial_state + dh0 = torch.empty_like(initial_state) if initial_state is not None else None + + grid = (NV, NK, N * H) + fused_recurrent_retention_bwd_kernel[grid]( + q, + k, + v, + h0, + do, + dq, + dk, + dv, + dh0, + dht, + offsets, + scale, + B=B, + T=T, + H=H, + K=K, + V=V, + BK=BK, + BV=BV, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + dq = dq.sum(0) + dk = dk.sum(0) + dv = dv.sum(0) + return dq, dk, dv, dh0 + + +class FusedRecurrentRetentionFunction(torch.autograd.Function): + + @staticmethod + @contiguous + def forward( + ctx, + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True + ): + o, ht = fused_recurrent_retention_fwd( + q=q, + k=k, + v=v, + scale=scale, + initial_state=initial_state, + output_final_state=output_final_state, + reverse=reverse, + offsets=offsets, + head_first=head_first + ) + ctx.save_for_backward(q, k, v, initial_state) + ctx.scale = scale + ctx.reverse = reverse + ctx.offsets = offsets + ctx.head_first = head_first + return o.to(v.dtype), ht + + @staticmethod + @contiguous + def backward(ctx, do, dht): + q, k, v, initial_state = ctx.saved_tensors + dq, dk, dv, dh0 = fused_recurrent_retention_bwd( + q=q, + k=k, + v=v, + do=do, + dht=dht, + scale=ctx.scale, + initial_state=initial_state, + reverse=ctx.reverse, + offsets=ctx.offsets, + head_first=ctx.head_first + ) + return dq.to(q), dk.to(k), dv.to(v), None, dh0, None, None, None, None + + +def fused_recurrent_retention( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: float = None, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]` + scale (Optional[int]): + Scale factor for the attention scores. + If not provided, it will default to `1 / sqrt(K)`. Default: `None`. + initial_state (Optional[torch.Tensor]): + Initial state of shape `[N, H, K, V]` for `N` input sequences. + For equal-length input sequences, `N` equals the batch size `B`. + Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state of shape `[N, H, K, V]`. Default: `False`. + reverse (Optional[bool]): + If `True`, process the state passing in reverse order. Default: `False`. + offsets (Optional[torch.LongTensor]): + Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch. + For example, + if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively. + If provided, the inputs are concatenated and the batch size `B` is expected to be 1. + Default: `None`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format, which is not supported for variable-length inputs. + Default: `True`. 
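+
+    Note:
+        The recurrence is evaluated one timestep at a time, so each kernel launch
+        runs in time linear in the sequence length; for long-sequence training the
+        chunked variants are typically the faster choice.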
+ + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + final_state (torch.Tensor): + Final state of shape `[N, H, K, V]` if `output_final_state=True` else `None`. + + Examples:: + >>> import torch + >>> import torch.nn.functional as F + >>> from einops import rearrange + >>> from fla.ops.retention import fused_recurrent_retention + # inputs with equal lengths + >>> B, T, H, K, V = 4, 2048, 4, 512, 512 + >>> q = torch.randn(B, T, H, K, device='cuda') + >>> k = torch.randn(B, T, H, K, device='cuda') + >>> v = torch.randn(B, T, H, V, device='cuda') + >>> h0 = torch.randn(B, H, K, V, device='cuda') + >>> o, ht = fused_recurrent_retention(q, k, v, + initial_state=h0, + output_final_state=True, + head_first=False) + # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required + >>> q, k, v = map(lambda x: rearrange(x, 'b t h d -> 1 (b t) h d'), (q, k, v)) + # for a batch with 4 sequences, offsets with 5 start/end positions are expected + >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long) + >>> o_var, ht_var = fused_recurrent_retention(q, k, v, + initial_state=h0, + output_final_state=True, + offsets=offsets, + head_first=False) + >>> assert o.allclose(o_var.view(o.shape)) + >>> assert ht.allclose(ht_var) + """ + if offsets is not None: + if q.shape[0] != 1: + raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`." + f"Please flatten variable-length inputs before processing.") + if head_first: + raise RuntimeError("Sequences with variable lengths are not supported for head-first mode") + if initial_state is not None and initial_state.shape[0] != len(offsets) - 1: + raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, " + f"i.e., {len(offsets) - 1} rather than {initial_state.shape[0]}.") + if scale is None: + scale = k.shape[-1] ** -0.5 + o, final_state = FusedRecurrentRetentionFunction.apply( + q, + k, + v, + scale, + initial_state, + output_final_state, + reverse, + offsets, + head_first + ) + return o, final_state diff --git a/fla/ops/retention/naive.py b/fla/ops/retention/naive.py new file mode 100644 index 0000000000000000000000000000000000000000..15611bf649779d2d956d2ab390b7d72dbb12201d --- /dev/null +++ b/fla/ops/retention/naive.py @@ -0,0 +1,15 @@ +# -*- coding: utf-8 -*- + +import torch + + +def naive_retention(q, k, v): + orig_type = q.dtype + q, k, v = q.float(), k.float(), v.float() + _, n_heads, seq_len, d_head = q.shape + s = (1 - q.new_tensor(2., dtype=torch.float).pow(-5. 
- q.new_tensor(range(n_heads), dtype=torch.float))).log2() + n = q.new_tensor(range(seq_len), dtype=torch.float) + n = torch.exp2((n.unsqueeze(-1) - n) * s.view(-1, 1, 1)) * n.unsqueeze(-1).ge(n) + s = torch.einsum('bhqd,bhkd,hqk->bhqk', q * d_head ** -0.5, k, n.to(q.dtype)) + o = torch.einsum('bhqk,bhkd->bhqd', s, v) + return o.to(orig_type) diff --git a/fla/ops/retention/parallel.py b/fla/ops/retention/parallel.py new file mode 100644 index 0000000000000000000000000000000000000000..7321663e5a933596048a281c39190237ec77bee0 --- /dev/null +++ b/fla/ops/retention/parallel.py @@ -0,0 +1,509 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Tuple + +import torch +import triton +import triton.language as tl + +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.heuristics({ + 'NV': lambda args: triton.cdiv(args['V'], args['BV']), + 'OUTPUT_ATTENTIONS': lambda args: args['attn'] is not None +}) +@triton.jit +def parallel_retention_fwd_kernel( + q, + k, + v, + o, + attn, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NV: tl.constexpr, + OUTPUT_ATTENTIONS: tl.constexpr +): + i_kv, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_k, i_v = i_kv // NV, i_kv % NV + i_h = i_bh % H + # decay rate given the head index + b_b = tl.math.log2(1 - tl.math.exp2(-5 - i_h * 1.0)) + # cumulative decay from the end of the chunk + # [BS] + o_k = tl.arange(0, BS) + d_h = tl.math.exp2((BS - o_k) * b_b) + + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * T*K, (K, T), (1, K), (i_k * BK, 0), (BK, BS), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (0, i_v * BV), (BS, BV), (1, 0)) + if OUTPUT_ATTENTIONS: + p_a = tl.make_block_ptr(attn + (i_k*B*H + i_bh) * T * T, (T, T), (T, 1), (i_t * BT, 0), (BT, BS), (1, 0)) + + # the Q block is kept in the shared memory throughout the whole kernel + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + # Q block and K block have no overlap + # no need for mask, thereby saving flops + for i in range(0, i_t * BT, BS): + # [BK, BS] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BS, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BS] + b_s = tl.dot(b_q, b_k, allow_tf32=False) * d_h + # do this check to avoid some layout bugs + # [[BT, BV] + if i > 0: + b_o = b_o * tl.math.exp2(b_b * BS) + b_o += tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False) + p_k = tl.advance(p_k, (0, BS)) + p_v = tl.advance(p_v, (BS, 0)) + if OUTPUT_ATTENTIONS: + tl.store(p_a, b_s.to(p_a.dtype.element_ty), boundary_check=(0, 1)) + p_a = tl.advance(p_a, (0, BS)) + + tl.debug_barrier() + + o_q = tl.arange(0, BT) + d_q = tl.math.exp2(tl.arange(0, BT) * b_b) + # rescale interchunk output + b_o *= d_q[:, None] + + p_k = tl.make_block_ptr(k + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BS), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BS, BV), (1, 0)) + if OUTPUT_ATTENTIONS: + p_a = tl.make_block_ptr(attn + (i_k*B*H + i_bh) * T * T, (T, T), (T, 1), (i_t * BT, i_t * BT), (BT, BS), (1, 0)) + + # Q block and K block have overlap. 
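+    # for the diagonal blocks, a key at position j may only contribute to a query
+    # at position i >= j, and its score is decayed by exp2((i - j) * b_b);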
+ # masks required + for _ in range(i_t * BT, (i_t + 1) * BT, BS): + # [BK, BS] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BS, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BS] + m_s = o_q[:, None] >= o_k[None, :] + d_s = tl.where(m_s, tl.math.exp2((o_q[:, None] - o_k[None, :]) * b_b), 0) + b_s = tl.dot(b_q, b_k, allow_tf32=False) * d_s + # [BT, BV] + b_o += tl.dot(b_s.to(b_q.dtype), b_v, allow_tf32=False) + + if OUTPUT_ATTENTIONS: + tl.store(p_a, b_s.to(p_a.dtype.element_ty), boundary_check=(0, 1)) + p_a = tl.advance(p_a, (0, BS)) + p_k = tl.advance(p_k, (0, BS)) + p_v = tl.advance(p_v, (BS, 0)) + o_k += BS + + p_o = tl.make_block_ptr(o + (i_bh + B * H * i_k) * T*V, (T, V), (V, 1), (i_t*BT, i_v*BV), (BT, BV), (1, 0)) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def parallel_retention_bwd_kernel_dq( + i_bh, + i_t, + i_k, + i_v, + i_h, + k, + v, + do, + dq, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, +): + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (0, i_k * BK), (BS, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * T*V, (V, T), (1, V), (i_v * BV, 0), (BV, BS), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + # decay rate given the head index + b_b = tl.math.log2(1 - tl.math.exp2(-5 - i_h * 1.0)) + # overall decay rate for an entire block + d_b = tl.math.exp2(b_b * BS) + # cumulative decay from the end of the chunk + d_h = tl.math.exp2((BS - tl.arange(0, BS)) * b_b) + for i in range(0, i_t * BT, BS): + # [BS, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BV, BS] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BS] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) * d_h[None, :] + # [BT, BK] + if i != 0: + b_dq *= d_b + b_dq += tl.dot(b_ds.to(b_v.dtype), b_k, allow_tf32=False) + + p_k = tl.advance(p_k, (BS, 0)) + p_v = tl.advance(p_v, (0, BS)) + b_dq *= tl.math.exp2(tl.arange(0, BT) * b_b)[:, None] * scale + + o_q = tl.arange(0, BT) + o_k = tl.arange(0, BS) + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BS, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * T*V, (V, T), (1, V), (i_v * BV, i_t * BT), (BV, BS), (0, 1)) + # Q block and K block have overlap. masks required + for _ in range(i_t * BT, (i_t + 1) * BT, BS): + # [BS, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BV, BS] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BS] + m_s = o_q[:, None] >= o_k[None, :] + d_s = tl.where(m_s, tl.math.exp2((o_q[:, None] - o_k[None, :]) * b_b), 0) + b_ds = tl.dot(b_do, b_v, allow_tf32=False) * d_s * scale + # [BT, BK] + b_dq += tl.dot(b_ds.to(b_k.dtype), b_k, allow_tf32=False) + + p_k = tl.advance(p_k, (BS, 0)) + p_v = tl.advance(p_v, (0, BS)) + o_k += BS + p_dq = tl.make_block_ptr(dq + (i_bh + B * H * i_v) * T*K, (T, K), (K, 1), (i_t*BT, i_k*BK), (BT, BK), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def parallel_retention_bwd_kernel_dkv( + i_bh, + i_t, + i_k, + i_v, + i_h, + q, + k, + v, + do, + dk, + dv, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, +): + # no overlap. no need for mask. 
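+    # dk/dv for this key/value chunk gather contributions from all later queries:
+    # the first loop below scans those future blocks from the end of the sequence
+    # back towards this chunk, rescaling the accumulators by the block decay d_b
+    # at every step; a second masked loop then handles the diagonal block.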
+ b_b = tl.math.log2(1 - tl.math.exp2(-5 - i_h * 1.0)) + # overall decay rate for an entire block + d_b = tl.math.exp2(b_b * BS) + # compute dk dv + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + # [BT, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_dv = tl.zeros([BT, BV], dtype=tl.float32) + + NTS = tl.cdiv(T, BS) + # [BT] + d_h = tl.math.exp2((BT - tl.arange(0, BT)) * b_b) + # [BT, BK] + b_kd = (b_k * d_h[:, None]).to(b_k.dtype) + # [BS] + d_q = tl.math.exp2(tl.arange(0, BS) * b_b) + for i in range(NTS * BS - BS, (i_t + 1) * BT - BS, -BS): + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i, i_k * BK), (BS, BK), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i, i_v * BV), (BS, BV), (1, 0)) + # [BS, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BS, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_do = (b_do * d_q[:, None]).to(b_do.dtype) + + b_dk *= d_b + b_dv *= d_b + # [BT, BS] + b_ds = tl.dot(b_v, tl.trans(b_do), allow_tf32=False) + b_s = tl.dot(b_kd, tl.trans(b_q), allow_tf32=False) + # [BT, BK] + b_dk += tl.dot(b_ds.to(b_q.dtype), b_q, allow_tf32=False) + # [BT, BV] + b_dv += tl.dot(b_s.to(b_do.dtype), b_do, allow_tf32=False) + b_dk *= d_h[:, None] * scale + b_dv *= scale + + tl.debug_barrier() + o_q, o_k = tl.arange(0, BS), tl.arange(0, BT) + for i in range(i_t * BT, (i_t + 1) * BT, BS): + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i, i_k * BK), (BS, BK), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i, i_v * BV), (BS, BV), (1, 0)) + # [BS, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BS, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BT, BS] + m_s = o_k[:, None] <= o_q[None, :] + d_s = tl.where(m_s, tl.math.exp2((-o_k[:, None] + o_q[None, :]) * b_b.to(tl.float32)), 0) * scale + + b_ds = tl.dot(b_v, tl.trans(b_do), allow_tf32=False) * d_s + b_s = tl.dot(b_k, tl.trans(b_q), allow_tf32=False) * d_s + # [BT, BK] + b_dk += tl.dot(b_ds.to(b_q.dtype), b_q, allow_tf32=False) + b_dv += tl.dot(b_s.to(b_q.dtype), b_do, allow_tf32=False) + o_q += BS + p_dk = tl.make_block_ptr(dk + (i_v * B * H + i_bh) * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_k * B * H + i_bh) * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.heuristics({ + 'NV': lambda args: triton.cdiv(args['V'], args['BV']) +}) +@triton.jit +def parallel_retention_bwd_kernel( + q, + k, + v, + do, + dq, + dk, + dv, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NV: tl.constexpr +): + i_kv, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_k, i_v = i_kv // NV, i_kv % NV + i_h = i_bh % H + parallel_retention_bwd_kernel_dq( + i_bh, + i_t, + i_k, + i_v, + i_h, + k, + v, + do, + dq, + scale, + B=B, + H=H, + T=T, + K=K, + V=V, + BT=BT, + BS=BS, + BK=BK, + BV=BV + ) + tl.debug_barrier() + parallel_retention_bwd_kernel_dkv( + i_bh, + i_t, + i_k, + i_v, + i_h, + q, + k, + v, + do, + dk, + dv, + scale, + B, + H, + T, + K, + V, + 
BT, + BS, + BK, + BV + ) + + +def parallel_retention_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: float, + output_attentions: bool = False +): + B, H, T, K, V = *k.shape, v.shape[-1] + BT, BS = 64, 32 + if torch.cuda.get_device_capability()[0] >= 9: + BK = min(256, triton.next_power_of_2(K)) + BV = min(256, triton.next_power_of_2(V)) + else: + BK = min(128, triton.next_power_of_2(K)) + BV = min(128, triton.next_power_of_2(V)) + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + assert BT % BS == 0 + + num_stages = 3 if K <= 64 else 2 + num_warps = 4 + + grid = (NK * NV, triton.cdiv(T, BT), B * H) + o = torch.empty(NK, B, H, T, V, dtype=q.dtype, device=q.device) + attn = q.new_zeros(NK, B, H, T, T) if output_attentions else None + parallel_retention_fwd_kernel[grid]( + q=q, + k=k, + v=v, + o=o, + attn=attn, + scale=scale, + B=B, + H=H, + T=T, + K=K, + V=V, + BT=BT, + BS=BS, + BK=BK, + BV=BV, + num_stages=num_stages, + num_warps=num_warps + ) + o = o.sum(0) + if output_attentions: + attn = attn.sum(0) + return o, attn + + +def parallel_retention_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + do: torch.Tensor, + scale: float, +): + B, H, T, K, V = *k.shape, v.shape[-1] + BT, BS = 64, 32 + BK = min(128, triton.next_power_of_2(k.shape[-1])) + BV = min(128, triton.next_power_of_2(v.shape[-1])) + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + assert BT % BS == 0 + + num_stages = 3 if K <= 64 else 2 + num_warps = 4 + + dq = torch.empty(NV, B, H, T, K, dtype=q.dtype, device=q.device) + dk = torch.empty(NV, B, H, T, K, dtype=q.dtype, device=q.device) + dv = torch.empty(NK, B, H, T, V, dtype=q.dtype, device=q.device) + grid = (NK * NV, triton.cdiv(T, BT), B * H) + parallel_retention_bwd_kernel[grid]( + q=q, + k=k, + v=v, + do=do, + dq=dq, + dk=dk, + dv=dv, + scale=scale, + B=B, + H=H, + T=T, + K=K, + V=V, + BT=BT, + BS=BS, + BK=BK, + BV=BV, + num_stages=num_stages, + num_warps=num_warps + ) + return dq, dk, dv + + +class ParallelRetentionFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward(ctx, q, k, v, scale, output_attentions): + o, attn = parallel_retention_fwd(q, k, v, scale, output_attentions) + ctx.save_for_backward(q, k, v) + ctx.scale = scale + return o.to(q.dtype), attn + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, d_attn=None): + q, k, v = ctx.saved_tensors + dq, dk, dv = parallel_retention_bwd(q, k, v, do, ctx.scale) + return dq.sum(0).to(q.dtype), dk.sum(0).to(k.dtype), dv.sum(0).to(v.dtype), None, None + + +def parallel_retention( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + scale: float = None, + output_attentions: bool = False, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]` + scale (Optional[int]): + Scale factor for the attention scores. + If not provided, it will default to `1 / sqrt(K)`. Default: `None`. + output_attentions (bool): + Whether to output the materialized attention scores of shape [B, H, T, T]. Default: `False`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format. + Default: `True`. 
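+
+    Note:
+        Setting `output_attentions=True` materializes the full score matrix of
+        shape `[B, H, T, T]`, whose memory footprint grows quadratically with the
+        sequence length, so it is mainly useful for inspection and debugging.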
+ + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]` + attn (torch.Tensor): + Attention scores of shape `[B, H, T, T]` if `output_attentions=True` else `None` + """ + if scale is None: + scale = k.shape[-1] ** -0.5 + if not head_first: + q, k, v = map(lambda x: x.transpose(1, 2), (q, k, v)) + o, attn = ParallelRetentionFunction.apply(q, k, v, scale, output_attentions) + if not head_first: + o = o.transpose(1, 2) + return o, attn diff --git a/fla/ops/rotary.py b/fla/ops/rotary.py new file mode 100644 index 0000000000000000000000000000000000000000..0443685c1cda4f055d8d2c02737059cec1b6b47a --- /dev/null +++ b/fla/ops/rotary.py @@ -0,0 +1,229 @@ +# Copyright (c) 2023, Tri Dao. +# https://github.com/Dao-AILab/flash-attention/blob/main/flash_attn/ops/triton/rotary.py + +from typing import Optional, Union + +import torch +import triton +import triton.language as tl + + +# @triton.autotune( +# configs=[ +# triton.Config({"BLOCK_M": 2}), +# triton.Config({"BLOCK_M": 4}), +# triton.Config({"BLOCK_M": 8}), +# triton.Config({"BLOCK_M": 16}), +# ], +# key=["CACHE_KEY_SEQLEN", "BLOCK_K", "INTERLEAVED"], +# ) +@triton.jit +def rotary_kernel( + OUT, # Pointers to matrices + X, + COS, + SIN, + CU_SEQLENS, + SEQLEN_OFFSETS, # this could be int or a pointer + # Matrix dimensions + seqlen, + nheads, + rotary_dim, + seqlen_ro, + CACHE_KEY_SEQLEN, + # strides + stride_out_batch, + stride_out_seqlen, + stride_out_nheads, + stride_out_headdim, + stride_x_batch, + stride_x_seqlen, + stride_x_nheads, + stride_x_headdim, + # Meta-parameters + BLOCK_K: tl.constexpr, + IS_SEQLEN_OFFSETS_TENSOR: tl.constexpr, + IS_VARLEN: tl.constexpr, + INTERLEAVED: tl.constexpr, + CONJUGATE: tl.constexpr, + BLOCK_M: tl.constexpr, +): + pid_m = tl.program_id(axis=0) + pid_batch = tl.program_id(axis=1) + pid_head = tl.program_id(axis=2) + rotary_dim_half = rotary_dim // 2 + + if not IS_VARLEN: + X = X + pid_batch * stride_x_batch + pid_head * stride_x_nheads + OUT = OUT + pid_batch * stride_out_batch + pid_head * stride_out_nheads + else: + start_idx = tl.load(CU_SEQLENS + pid_batch) + seqlen = tl.load(CU_SEQLENS + pid_batch + 1) - start_idx + X = X + start_idx * stride_x_seqlen + pid_head * stride_x_nheads + OUT = OUT + start_idx * stride_out_seqlen + pid_head * stride_out_nheads + + if pid_m * BLOCK_M >= seqlen: + return + rm = pid_m * BLOCK_M + tl.arange(0, BLOCK_M) + if not IS_SEQLEN_OFFSETS_TENSOR: + rm_cs = rm + SEQLEN_OFFSETS + else: + rm_cs = rm + tl.load(SEQLEN_OFFSETS + pid_batch) + rk = tl.arange(0, BLOCK_K) + rk_half = tl.arange(0, BLOCK_K // 2) + + if not INTERLEAVED: + # Load the 1st and 2nd halves of X, do calculation, then store to 1st and 2nd halves of OUT + X = X + (rm[:, None] * stride_x_seqlen + rk_half[None, :] * stride_x_headdim) + COS = COS + (rm_cs[:, None] * rotary_dim_half + rk_half[None, :]) + SIN = SIN + (rm_cs[:, None] * rotary_dim_half + rk_half[None, :]) + cos = tl.load(COS, mask=(rm_cs[:, None] < seqlen_ro) & (rk_half[None, :] < rotary_dim_half), other=1.0).to(tl.float32) + sin = tl.load(SIN, mask=(rm_cs[:, None] < seqlen_ro) & (rk_half[None, :] < rotary_dim_half), other=0.0).to(tl.float32) + x0 = tl.load(X, mask=(rm[:, None] < seqlen) & (rk_half[None, :] < rotary_dim_half), other=0.0).to(tl.float32) + x1 = tl.load( + X + rotary_dim_half * stride_x_headdim, + mask=(rm[:, None] < seqlen) & (rk_half[None, :] < rotary_dim_half), + other=0.0, + ).to(tl.float32) + if CONJUGATE: + sin = -sin + o0 = x0 * cos - x1 * sin + o1 = x0 * sin + x1 * cos + # write 
back result + OUT = OUT + (rm[:, None] * stride_out_seqlen + rk_half[None, :] * stride_out_headdim) + tl.store(OUT, o0, mask=(rm[:, None] < seqlen) & (rk_half[None, :] < rotary_dim_half)) + tl.store( + OUT + rotary_dim_half * stride_out_headdim, + o1, + mask=(rm[:, None] < seqlen) & (rk_half[None, :] < rotary_dim_half), + ) + else: + # We don't want to load X[0, 2, 4, ...] and X[1, 3, 5, ...] separately since both are slow. + # Instead, we load x0 = X[0, 1, 2, 3, ...] and x1 = X[1, 0, 3, 2, ...]. + # Loading x0 will be fast but x1 will be slow. + # Then we load cos = COS[0, 0, 1, 1, ...] and sin = SIN[0, 0, 1, 1, ...]. + # Then we do the calculation and use tl.where to pick put the right outputs for the even + # and for the odd indices. + rk_swap = rk + ((rk + 1) % 2) * 2 - 1 # 1, 0, 3, 2, 5, 4, ... + rk_repeat = tl.arange(0, BLOCK_K) // 2 + X0 = X + (rm[:, None] * stride_x_seqlen + rk[None, :] * stride_x_headdim) + X1 = X + (rm[:, None] * stride_x_seqlen + rk_swap[None, :] * stride_x_headdim) + COS = COS + (rm_cs[:, None] * rotary_dim_half + rk_repeat[None, :]) + SIN = SIN + (rm_cs[:, None] * rotary_dim_half + rk_repeat[None, :]) + cos = tl.load( + COS, + mask=(rm_cs[:, None] < seqlen_ro) & (rk_repeat[None, :] < rotary_dim_half), + other=1.0, + ).to(tl.float32) + sin = tl.load( + SIN, + mask=(rm_cs[:, None] < seqlen_ro) & (rk_repeat[None, :] < rotary_dim_half), + other=0.0, + ).to(tl.float32) + x0 = tl.load(X0, mask=(rm[:, None] < seqlen) & (rk[None, :] < rotary_dim), other=0.0).to(tl.float32) + x1 = tl.load(X1, mask=(rm[:, None] < seqlen) & (rk_swap[None, :] < rotary_dim), other=0.0).to(tl.float32) + if CONJUGATE: + sin = -sin + x0_cos = x0 * cos + x1_sin = x1 * sin + out = tl.where(rk[None, :] % 2 == 0, x0_cos - x1_sin, x0_cos + x1_sin) + OUT = OUT + (rm[:, None] * stride_out_seqlen + rk[None, :] * stride_out_headdim) + tl.store(OUT, out, mask=(rm[:, None] < seqlen) & (rk[None, :] < rotary_dim)) + + +def apply_rotary( + x: torch.Tensor, + cos: torch.Tensor, + sin: torch.Tensor, + seqlen_offsets: Union[int, torch.Tensor] = 0, + cu_seqlens: Optional[torch.Tensor] = None, + max_seqlen: Optional[int] = None, + interleaved: bool = False, + inplace: bool = False, + conjugate: bool = False, +) -> torch.Tensor: + """ + Arguments: + x: (batch, seqlen, nheads, headdim) if cu_seqlens is None + else (total_seqlen, nheads, headdim). 
+ cos: (seqlen_ro, rotary_dim / 2) + sin: (seqlen_ro, rotary_dim / 2) + seqlen_offsets: integer or integer tensor of size (batch,) + cu_seqlens: (batch + 1,) or None + max_seqlen: int + Returns: + y: (batch, seqlen, nheads, headdim) + """ + is_varlen = cu_seqlens is not None + if not is_varlen: + batch, seqlen, nheads, headdim = x.shape + else: + assert max_seqlen is not None, "If cu_seqlens is passed in, then max_seqlen must be passed" + _, nheads, headdim = x.shape + batch_p_1 = cu_seqlens.shape[0] + batch = batch_p_1 - 1 + seqlen = max_seqlen + seqlen_ro, rotary_dim = cos.shape + assert sin.shape == cos.shape + rotary_dim *= 2 + assert rotary_dim <= headdim, "rotary_dim must be <= headdim" + assert headdim <= 256, "Only support headdim <= 256" + assert seqlen_ro >= seqlen, "seqlen_ro must be >= seqlen" + + assert cos.dtype == sin.dtype, f"cos and sin must have the same dtype, got {cos.dtype} and {sin.dtype}" + assert x.dtype == cos.dtype, f"Input and cos/sin must have the same dtype, got {x.dtype} and {cos.dtype}" + + cos, sin = cos.contiguous(), sin.contiguous() + if isinstance(seqlen_offsets, torch.Tensor): + assert seqlen_offsets.shape == (batch,) + assert seqlen_offsets.dtype in [torch.int32, torch.int64] + seqlen_offsets = seqlen_offsets.contiguous() + else: + assert seqlen_offsets + seqlen <= seqlen_ro + + output = torch.empty_like(x) if not inplace else x + if rotary_dim < headdim and not inplace: + output[..., rotary_dim:].copy_(x[..., rotary_dim:]) + + BLOCK_K = ( + 32 + if rotary_dim <= 32 + else (64 if rotary_dim <= 64 else (128 if rotary_dim <= 128 else 256)) + ) + def grid(META): return (triton.cdiv(seqlen, META["BLOCK_M"]), batch, nheads) # noqa + BLOCK_M = 4 if interleaved else (8 if rotary_dim <= 64 else 4) + + # Need this, otherwise Triton tries to launch from cuda:0 and we get + # ValueError: Pointer argument (at 0) cannot be accessed from Triton (cpu tensor?) 
+ with torch.cuda.device(x.device.index): + rotary_kernel[grid]( + output, # data ptrs + x, + cos, + sin, + cu_seqlens, + seqlen_offsets, + seqlen, # shapes + nheads, + rotary_dim, + seqlen_ro, + # key for triton cache (limit number of compilations) + seqlen // 128, + # batch_strides if not varlen else 0 + output.stride(0) if not is_varlen else 0, + output.stride(-3), # seqlen_stride or total_seqlen_stride + output.stride(-2), # nheads_stride + output.stride(-1), # headdim_stride + # batch_strides if not varlen else 0 + x.stride(0) if not is_varlen else 0, + x.stride(-3), # seqlen stride or total_seqlen_stride + x.stride(-2), # nheads stride + x.stride(-1), # headdim stride + BLOCK_K, + isinstance(seqlen_offsets, torch.Tensor), + is_varlen, + interleaved, + conjugate, + BLOCK_M, + ) + return output diff --git a/fla/ops/rwkv4/__init__.py b/fla/ops/rwkv4/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..49de2cf83aeec67069b67e0972cfccef8a81383a --- /dev/null +++ b/fla/ops/rwkv4/__init__.py @@ -0,0 +1,7 @@ +# -*- coding: utf-8 -*- + +from .fused_recurrent import fused_recurrent_rwkv4 + +__all__ = [ + 'fused_recurrent_rwkv4' +] diff --git a/fla/ops/rwkv4/fused_recurrent.py b/fla/ops/rwkv4/fused_recurrent.py new file mode 100644 index 0000000000000000000000000000000000000000..27f8adf28e0f55a8e0a8ae2170639086c2de02fc --- /dev/null +++ b/fla/ops/rwkv4/fused_recurrent.py @@ -0,0 +1,484 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Any, cast + +import torch +import triton +import triton.language as tl +from torch import Tensor +from torch.autograd.function import Function, FunctionCtx, once_differentiable + + +def get_block_size_c(chans: int) -> int: + if chans < 32: + return 32 + if chans < 64: + return 64 + return 128 + + +@triton.jit +def fused_recurrent_rwkv4_forward_kernel( + # W + w_ptr, + w_s_c, + # U + u_ptr, + u_s_c, + # K + k_ptr, + k_s_b, + k_s_t, + k_s_c, + # V + v_ptr, + v_s_b, + v_s_t, + v_s_c, + # State + state_ptr, + state_s_b, + state_s_abe, + state_s_c, + # WKV + wkv_ptr, + wkv_s_b, + wkv_s_t, + wkv_s_c, + # Output state + state_out_ptr, + state_out_s_b, + state_out_s_abe, + state_out_s_t, + state_out_s_c, + # Params + chans, + tsz, + BLOCK_SIZE_C: tl.constexpr, +): + # Parallelize over the batch dimension. + b_idx = tl.program_id(0) + c_idx = tl.program_id(1) + + cs = (c_idx * BLOCK_SIZE_C) + tl.arange(0, BLOCK_SIZE_C) + cmask = cs < chans + + # Pointers to the batch (and possibly channel) for the input tensors. + k_ptr = k_ptr + b_idx * k_s_b + v_ptr = v_ptr + b_idx * v_s_b + alpha_ptr = state_ptr + b_idx * state_s_b + beta_ptr = state_ptr + b_idx * state_s_b + state_s_abe + eps_ptr = state_ptr + b_idx * state_s_b + 2 * state_s_abe + + # Pointers to the batch (and possibly channel) for the output tensors. + wkv_ptr = wkv_ptr + b_idx * wkv_s_b + alpha_out_ptr = state_out_ptr + b_idx * state_out_s_b + beta_out_ptr = state_out_ptr + b_idx * state_out_s_b + state_out_s_abe + eps_out_ptr = state_out_ptr + b_idx * state_out_s_b + 2 * state_out_s_abe + + # Loads parameters. 
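+    # the state is kept in a numerically stable form: alpha and beta are the
+    # exponentially weighted numerator and denominator of the wkv average, while
+    # eps tracks the running maximum exponent, which is subtracted before every
+    # exp() below so that intermediates never overflow.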
+ alpha = tl.load(alpha_ptr + cs * state_s_c, mask=cmask).to(tl.float32) + beta = tl.load(beta_ptr + cs * state_s_c, mask=cmask).to(tl.float32) + eps = tl.load(eps_ptr + cs * state_s_c, mask=cmask).to(tl.float32) + w = tl.load(w_ptr + cs * w_s_c, mask=cmask).to(tl.float32) + u = tl.load(u_ptr + cs * u_s_c, mask=cmask).to(tl.float32) + + for t in range(tsz): + kt = tl.load(k_ptr + t * k_s_t + cs * k_s_c, mask=cmask).to(tl.float32) + vt = tl.load(v_ptr + t * v_s_t + cs * v_s_c, mask=cmask).to(tl.float32) + + ukt = u + kt + tau = tl.maximum(ukt, eps) + e1a = tl.exp(eps - tau) + e2a = tl.exp(ukt - tau) + wkv = (e1a * alpha + e2a * vt) / (e1a * beta + e2a) + tl.store(wkv_ptr + t * wkv_s_t + cs * wkv_s_c, wkv, mask=cmask) + + w_eps = w + eps + eps = tl.maximum(w_eps, kt) + e1b = tl.exp(w_eps - eps) + e2b = tl.exp(kt - eps) + alpha = e1b * alpha + e2b * vt + beta = e1b * beta + e2b + tl.store(alpha_out_ptr + t * state_out_s_t + cs * state_out_s_c, alpha, mask=cmask) + tl.store(beta_out_ptr + t * state_out_s_t + cs * state_out_s_c, beta, mask=cmask) + tl.store(eps_out_ptr + t * state_out_s_t + cs * state_out_s_c, eps, mask=cmask) + + +def fused_recurrent_rwkv4_forward( + w: Tensor, + u: Tensor, + k: Tensor, + v: Tensor, + state: Tensor, +) -> tuple[Tensor, Tensor]: + (bsz, tsz, chans) = k.shape + + # New tensors to output. + wkvs = k.new_empty(bsz, tsz, chans) + state_out = k.new_empty(bsz, 3, tsz, chans) + + # Constants. + block_size_c = get_block_size_c(chans) + + def grid(meta: dict[str, Any]) -> tuple[int, ...]: + return (bsz, triton.cdiv(chans, meta["BLOCK_SIZE_C"])) + + fused_recurrent_rwkv4_forward_kernel[grid]( + # W + w, + w.stride(0), + # U + u, + u.stride(0), + # K + k, + k.stride(0), + k.stride(1), + k.stride(2), + # V + v, + v.stride(0), + v.stride(1), + v.stride(2), + # State + state, + state.stride(0), + state.stride(1), + state.stride(3), + # WKV + wkvs, + wkvs.stride(0), + wkvs.stride(1), + wkvs.stride(2), + # Output state + state_out, + state_out.stride(0), + state_out.stride(1), + state_out.stride(2), + state_out.stride(3), + # Params + chans, + tsz, + BLOCK_SIZE_C=block_size_c, + ) + + state_out = torch.cat((state, state_out), dim=2) + + return wkvs, state_out + + +@triton.jit +def fused_recurrent_rwkv4_backward_kernel( + # W + w_ptr, + w_s_c, + # U + u_ptr, + u_s_c, + # K + k_ptr, + k_s_b, + k_s_t, + k_s_c, + # V + v_ptr, + v_s_b, + v_s_t, + v_s_c, + # State + state_ptr, + state_s_b, + state_s_abe, + state_s_t, + state_s_c, + # WKV grad + gwkv_ptr, + gwkv_s_b, + gwkv_s_t, + gwkv_s_c, + # Output state grad + gstate_out_ptr, + gstate_out_s_b, + gstate_out_s_abe, + gstate_out_s_c, + # W grad + gw_ptr, + gw_s_c, + # U grad + gu_ptr, + gu_s_c, + # K grad + gk_ptr, + gk_s_b, + gk_s_t, + gk_s_c, + # V grad + gv_ptr, + gv_s_b, + gv_s_t, + gv_s_c, + # State grad + gstate_ptr, + gstate_s_b, + gstate_s_abe, + gstate_s_c, + # Params + tsz, + chans, + BLOCK_SIZE_C: tl.constexpr, +): + # Parallelize over the batch dimension. + b_idx = tl.program_id(0) + c_idx = tl.program_id(1) + + cs = (c_idx * BLOCK_SIZE_C) + tl.arange(0, BLOCK_SIZE_C) + cmask = cs < chans + + # Pointers to the batch (and possibly channel) for the input tensors. + k_ptr = k_ptr + b_idx * k_s_b + v_ptr = v_ptr + b_idx * v_s_b + alpha_ptr = state_ptr + b_idx * state_s_b + beta_ptr = state_ptr + b_idx * state_s_b + state_s_abe + eps_ptr = state_ptr + b_idx * state_s_b + 2 * state_s_abe + + # Pointers to the batch (and possibly channel) for the output tensors. 
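+    # the backward pass replays the recurrence in reverse time order (tc = tsz-1..0),
+    # reading back the per-step alpha/beta/eps that the forward kernel saved rather
+    # than recomputing them.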
+ gk_ptr = gk_ptr + b_idx * gk_s_b + gv_ptr = gv_ptr + b_idx * gv_s_b + + # Pointers to gradients which were recieved by the function. + gwkv_ptr = gwkv_ptr + b_idx * gwkv_s_b + galpha_out_ptr = gstate_out_ptr + b_idx * gstate_out_s_b + gbeta_out_ptr = gstate_out_ptr + b_idx * gstate_out_s_b + gstate_out_s_abe + geps_out_ptr = gstate_out_ptr + b_idx * gstate_out_s_b + 2 * gstate_out_s_abe + + # Loads parameters. + galpha = tl.load(galpha_out_ptr + gstate_out_s_c * cs, mask=cmask).to(tl.float32) + gbeta = tl.load(gbeta_out_ptr + gstate_out_s_c * cs, mask=cmask).to(tl.float32) + geps = tl.load(geps_out_ptr + gstate_out_s_c * cs, mask=cmask).to(tl.float32) + w = tl.load(w_ptr + w_s_c * cs, mask=cmask).to(tl.float32) + u = tl.load(u_ptr + u_s_c * cs, mask=cmask).to(tl.float32) + + # Gradient accumulators. + gw = tl.zeros_like(w) + gu = tl.zeros_like(u) + + alpha_prev = tl.load(alpha_ptr + tsz * state_s_t + state_s_c * cs, mask=cmask).to(tl.float32) + beta_prev = tl.load(beta_ptr + tsz * state_s_t + state_s_c * cs, mask=cmask).to(tl.float32) + eps_prev = tl.load(eps_ptr + tsz * state_s_t + state_s_c * cs, mask=cmask).to(tl.float32) + + for t in range(tsz): + tc = tsz - t - 1 + + kt = tl.load(k_ptr + tc * k_s_t + k_s_c * cs, mask=cmask).to(tl.float32) + vt = tl.load(v_ptr + tc * v_s_t + v_s_c * cs, mask=cmask).to(tl.float32) + + alpha_curr = alpha_prev + beta_curr = beta_prev + eps_curr = eps_prev + + alpha_prev = tl.load(alpha_ptr + tc * state_s_t + state_s_c * cs, mask=cmask).to(tl.float32) + beta_prev = tl.load(beta_ptr + tc * state_s_t + state_s_c * cs, mask=cmask).to(tl.float32) + eps_prev = tl.load(eps_ptr + tc * state_s_t + state_s_c * cs, mask=cmask).to(tl.float32) + + ukt = u + kt + tau = tl.maximum(ukt, eps_prev) + e1 = tl.exp(eps_prev - tau) + e2 = tl.exp(ukt - tau) + + euke = tl.exp(ukt + eps_prev - 2 * tau) + + denom = e1 * beta_prev + e2 + denom_sq = denom * denom + + gwkvt = tl.load(gwkv_ptr + tc * gwkv_s_t + gwkv_s_c * cs, mask=cmask).to(tl.float32) + + # Backpropagates wkv gradients. + guk = gwkvt * e2 * (e1 * beta_prev * vt - e1 * alpha_prev) / denom_sq + gu += guk + gk = guk + gv = gwkvt * e2 / denom + + galpha_wkv = gwkvt * e1 / denom + gbeta_wkv = -gwkvt * e1 * (e2 * vt + e1 * alpha_prev) / denom_sq + geps_wkv_denom = e1 * beta_prev + e2 + geps_wkv = gwkvt * euke * (alpha_prev - vt * beta_prev) / (geps_wkv_denom * geps_wkv_denom) + + e1 = tl.exp(w + eps_prev - eps_curr) + e2 = tl.exp(kt - eps_curr) + + # Backpropagates alpha gradients. + galpha_we = galpha * e1 * alpha_prev + gw += galpha_we + gk += galpha * e2 * vt + gv += galpha * e2 + geps += galpha * -alpha_curr + + # Backpropagates beta gradients. + gbeta_we = gbeta * e1 * beta_prev + gw += gbeta_we + gk += gbeta * e2 + geps += gbeta * -beta_curr + + # Backpropagates epsilon gradients. + geps_mask = w + eps_prev > kt + geps_we = tl.where(geps_mask, geps, tl.zeros_like(geps)) + gw += geps_we + gk += tl.where(geps_mask, tl.zeros_like(geps), geps) + + # Stores the gradients for k and v. + tl.store(gk_ptr + tc * gk_s_t + gk_s_c * cs, gk, mask=cmask) + tl.store(gv_ptr + tc * gv_s_t + gv_s_c * cs, gv, mask=cmask) + + # Computes new gradients for alpha and beta. + galpha = galpha * e1 + galpha_wkv + gbeta = gbeta * e1 + gbeta_wkv + geps = galpha_we + gbeta_we + geps_we + geps_wkv + + # Stores final gradients for alpha and beta. 
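+    # (the epsilon gradient is written alongside alpha's and beta's; all three
+    # slots live in `gstate`)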
+ galpha_ptr = gstate_ptr + b_idx * gstate_s_b + gbeta_ptr = gstate_ptr + b_idx * gstate_s_b + gstate_s_abe + geps_ptr = gstate_ptr + b_idx * gstate_s_b + 2 * gstate_s_abe + tl.store(galpha_ptr + gstate_s_c * cs, galpha, mask=cmask) + tl.store(gbeta_ptr + gstate_s_c * cs, gbeta, mask=cmask) + tl.store(geps_ptr + gstate_s_c * cs, geps, mask=cmask) + + # Stores final gradients for w and u. + gw_temp = tl.load(gw_ptr + gw_s_c * cs, mask=cmask).to(tl.float32) + gw_temp += gw + tl.store(gw_ptr + gw_s_c * cs, gw_temp, mask=cmask) + gu_temp = tl.load(gu_ptr + gu_s_c * cs, mask=cmask).to(tl.float32) + gu_temp += gu + tl.store(gu_ptr + gu_s_c * cs, gu_temp, mask=cmask) + + +def fused_recurrent_rwkv4_backward( + w: Tensor, + u: Tensor, + k: Tensor, + v: Tensor, + state: Tensor, + grad_wkv: Tensor, + grad_state: Tensor, +) -> tuple[Tensor, Tensor, Tensor, Tensor, Tensor]: + bsz, tsz, chans = k.shape + + gw = torch.zeros_like(w) # New tensors to output. + gu = torch.zeros_like(u) + gk = torch.empty_like(k) + gv = torch.empty_like(v) + gstate = k.new_empty(bsz, 3, 1, chans) + + block_size_c = get_block_size_c(chans) # Constants. + + def grid(meta: dict[str, Any]) -> tuple[int, ...]: + return (bsz, triton.cdiv(chans, meta["BLOCK_SIZE_C"])) + + fused_recurrent_rwkv4_backward_kernel[grid]( + # W + w, + w.stride(0), + # U + u, + u.stride(0), + # K + k, + k.stride(0), + k.stride(1), + k.stride(2), + # V + v, + v.stride(0), + v.stride(1), + v.stride(2), + # State + state, + state.stride(0), + state.stride(1), + state.stride(2), + state.stride(3), + # WKV grad + grad_wkv, + grad_wkv.stride(0), + grad_wkv.stride(1), + grad_wkv.stride(2), + # Output state grad + grad_state, + grad_state.stride(0), + grad_state.stride(1), + grad_state.stride(3), + # W grad + gw, + gw.stride(0), + # U grad + gu, + gu.stride(0), + # K grad + gk, + gk.stride(0), + gk.stride(1), + gk.stride(2), + # V grad + gv, + gv.stride(0), + gv.stride(1), + gv.stride(2), + # State grad + gstate, + gstate.stride(0), + gstate.stride(1), + gstate.stride(3), + # Params + tsz, + chans, + BLOCK_SIZE_C=block_size_c, + ) + + return gw, gu, gk, gv, gstate + + +class FusedRecurrentRWKV4Function(Function): + @staticmethod + def forward( + ctx: FunctionCtx, + w: Tensor, + u: Tensor, + k: Tensor, + v: Tensor, + state: Tensor, + ) -> tuple[Tensor, Tensor]: + ctx.input_dtype = k.dtype + + if ( + w.device.type != "cuda" + or u.device.type != "cuda" + or k.device.type != "cuda" + or v.device.type != "cuda" + ): + raise ValueError( + "Calling the CUDA kernel for wkv attention requires all tensors to be on CUDA devices." 
+ ) + + w = -torch.exp(w.float().contiguous()) + if k.dtype == torch.float16: + u = u.float() + k = k.float() + v = v.float() + u = u.contiguous() + k = k.contiguous() + v = v.contiguous() + wkv, state_out = fused_recurrent_rwkv4_forward(w, u, k, v, state) + ctx.save_for_backward(w, u, k, v, state_out[:, :, :-1]) + return wkv, state_out[:, :, -1:] + + @staticmethod + @once_differentiable + def backward(ctx: FunctionCtx, gwkv: Tensor, gstate: Tensor) -> tuple[Tensor, Tensor, Tensor, Tensor, Tensor]: + w, u, k, v, state = cast(tuple[Tensor, ...], ctx.saved_tensors) + gw, gu, gk, gv, gstate = fused_recurrent_rwkv4_backward(w, u, k, v, state, gwkv, gstate) + return gw, gu, gk, gv, gstate + + +def fused_recurrent_rwkv4(w: Tensor, u: Tensor, k: Tensor, v: Tensor, state: Tensor) -> tuple[Tensor, Tensor]: + return FusedRecurrentRWKV4Function.apply(w, u, k, v, state) diff --git a/fla/ops/rwkv6/__init__.py b/fla/ops/rwkv6/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..b3c7c218eb873a1a2115b5587530fe55f29a9d02 --- /dev/null +++ b/fla/ops/rwkv6/__init__.py @@ -0,0 +1,9 @@ +# -*- coding: utf-8 -*- + +from .chunk import chunk_rwkv6 +from .fused_recurrent import fused_recurrent_rwkv6 + +__all__ = [ + 'chunk_rwkv6', + 'fused_recurrent_rwkv6' +] diff --git a/fla/ops/rwkv6/chunk.py b/fla/ops/rwkv6/chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..444e6cb1d9a102ed91fe31d07cba107b77896bb0 --- /dev/null +++ b/fla/ops/rwkv6/chunk.py @@ -0,0 +1,936 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.ops.common.chunk_h import chunk_fwd_h +from fla.ops.gla.chunk import chunk_gla_bwd_dA, chunk_gla_bwd_dv +from fla.utils import contiguous + + +@triton.autotune( + configs=[ + triton.Config({'BS': 16}, num_warps=2), + triton.Config({'BS': 16}, num_warps=4), + triton.Config({'BS': 16}, num_warps=8), + triton.Config({'BS': 32}, num_warps=2), + triton.Config({'BS': 32}, num_warps=4), + triton.Config({'BS': 32}, num_warps=8), + triton.Config({'BS': 64}, num_warps=2), + triton.Config({'BS': 64}, num_warps=4), + triton.Config({'BS': 64}, num_warps=8), + ], + key=['S'] +) +@triton.jit +def chunk_rwkv6_fwd_cumsum_kernel( + s, + o, + o_minus_s, + s_s_h, + s_s_t, + s_s_d, + T: tl.constexpr, + S: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr +): + i_s, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + o_i = tl.arange(0, BT) + m_s = tl.where(o_i[:, None] >= o_i[None, :], 1., 0.) 
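+    # m_s is a lower-triangular [BT, BT] mask, so the tl.dot below yields an
+    # inclusive prefix sum of the gates along each chunk (o) and, after
+    # subtracting b_s, the exclusive prefix sum (o_minus_s).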
+
+    p_s = tl.make_block_ptr(s + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT, i_s * BS), (BT, BS), (1, 0))
+    p_o = tl.make_block_ptr(o + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT, i_s * BS), (BT, BS), (1, 0))
+    p_o_minus_s = tl.make_block_ptr(o_minus_s + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT, i_s * BS), (BT, BS), (1, 0))
+    # [BT, BS]
+    b_s = tl.load(p_s, boundary_check=(0, 1)).to(tl.float32)
+    b_o = tl.dot(m_s, b_s, allow_tf32=False)
+    tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1))
+    tl.store(p_o_minus_s, (b_o - b_s).to(p_o_minus_s.dtype.element_ty), boundary_check=(0, 1))
+
+
+def chunk_rwkv6_fwd_cumsum(g, BT):
+    B, H, T, K = g.shape
+    NT = triton.cdiv(T, BT)
+    gi = torch.empty_like(g, dtype=torch.float)
+    ge = torch.empty_like(g, dtype=torch.float)
+
+    def grid(meta): return (triton.cdiv(meta['S'], meta['BS']), NT, B * H)
+    chunk_rwkv6_fwd_cumsum_kernel[grid](
+        g, gi, ge,
+        g.stride(1), g.stride(2), g.stride(3),
+        T=T,
+        S=K,
+        BT=BT
+    )
+    return gi, ge
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({}, num_warps=1),
+        triton.Config({}, num_warps=2),
+        triton.Config({}, num_warps=4),
+        triton.Config({}, num_warps=8),
+    ],
+    key=["BC", "BK"],
+)
+@triton.jit
+def chunk_rwkv6_fwd_A_kernel_intra_sub_inter(
+    q,
+    k,
+    gi,  # cumulative decay inclusive
+    ge,  # cumulative decay exclusive
+    A,
+    s_k_h,
+    s_k_t,
+    s_k_d,
+    scale,
+    T: tl.constexpr,
+    K: tl.constexpr,
+    BT: tl.constexpr,
+    BC: tl.constexpr,
+    BK: tl.constexpr,
+    NC: tl.constexpr
+):
+    i_t, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+    i_i, i_j = i_c // NC, i_c % NC
+    if i_i <= i_j:
+        return
+    if i_t * BT + i_i * BC >= T:
+        return
+    b_A = tl.zeros([BC, BC], dtype=tl.float32)
+    for i_k in range(tl.cdiv(K, BK)):
+        p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0))
+        # q block, exclusive cumulative decay
+        p_gq = tl.make_block_ptr(ge + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0))
+        p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT + i_j * BC), (BK, BC), (0, 1))
+        # k block, inclusive cumulative decay
+        p_gk = tl.make_block_ptr(gi + i_bh * s_k_h, (K, T), (s_k_d, s_k_t), (i_k * BK, i_t * BT + i_j * BC), (BK, BC), (0, 1))
+        # the last position of the k block, inclusive
+        p_gn = tl.make_block_ptr(gi + i_bh * s_k_h, (T * K,), (s_k_d,),
+                                 ((i_t * BT + i_j * BC + BC - 1) * K + i_k * BK,), (BK,), (0,))
+        # [BK,]
+        b_gn = tl.load(p_gn, boundary_check=(0,))
+        # [BC, BK]
+        b_q = tl.load(p_q, boundary_check=(0, 1))
+        b_gq = tl.load(p_gq, boundary_check=(0, 1))
+        b_qg = (b_q * tl.exp(b_gq - b_gn[None, :]) * scale)
+        # [BK, BC]
+        b_k = tl.load(p_k, boundary_check=(0, 1))
+        b_gk = tl.load(p_gk, boundary_check=(0, 1))
+        b_kg = (b_k * tl.exp(b_gn[:, None] - b_gk))
+        # [BC, BC]; the tf32 default of tl.dot is acceptable here for speed.
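+        # (precision-critical dots elsewhere in this file pass allow_tf32=False
+        # explicitly, e.g. the cumsum kernel above and the dA/dk dots below.)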
+ b_A += tl.dot(b_qg, b_kg) + p_A = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + tl.store(p_A, b_A.to(A.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BK", "BT"], +) +@triton.jit +def chunk_rwkv6_fwd_A_kernel_intra_sub_intra( + q, + k, + gi, + ge, + u, + A, + s_k_h, + s_k_t, + s_k_d, + scale, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + NC: tl.constexpr +): + i_t, i_i, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + if i_t * BT + i_i * BC >= T: + return + + i_j = i_i + i_h = i_bh % H + o_i = tl.arange(0, BC) + o_A = i_bh * T * BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BT + i_j * BC + m_A = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + i_k = 0 + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_g = tl.make_block_ptr(ge + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_g = tl.load(p_g, boundary_check=(0, 1)) + p_u = tl.make_block_ptr(u + i_h * s_k_t, (s_k_t,), (1,), (i_k * BK), (BK,), (0,)) + b_u = tl.load(p_u, boundary_check=(0,)) + + for j in range(0, min(BC, T-i_t*BT-i_i*BC)): + b_A = tl.zeros([BC], dtype=tl.float32) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T * K,), (s_k_d,), ((i_t * BT + i_j * BC + j) * K + i_k * BK,), (BK,), (0,)) + p_gk = tl.make_block_ptr(gi + i_bh * s_k_h, (T*K,), (s_k_d,), ((i_t * BT + i_j * BC + j) * K + i_k * BK,), (BK,), (0,)) + b_k = tl.load(p_k, boundary_check=(0,)).to(tl.float32) + b_gk = tl.load(p_gk, boundary_check=(0,)).to(tl.float32) + b_A += tl.sum(b_q * b_k[None, :] * tl.exp(b_g - b_gk[None, :]), 1) + b_A = tl.where(o_i > j, b_A * scale, 0.) 
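+        # b_A above keeps only the strictly-causal entries (i > j); the
+        # diagonal (i == j) is filled below using the per-head bonus u
+        # instead of the decay path.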
+ p_qi = tl.make_block_ptr(q + i_bh * s_k_h, (T * K,), (s_k_d,), + ((i_t * BT + i_j * BC + j) * K + i_k * BK,), (BK,), (0,)) + b_qi = tl.load(p_qi, boundary_check=(0,)) + A_jj = tl.sum(b_qi * b_k * b_u * scale) + b_A = tl.where(o_i != j, b_A, A_jj) + tl.store(A + o_A + j, b_A, mask=m_A) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BC", "BK"], +) +@triton.jit +def chunk_rwkv6_fwd_A_kernel_intra_sub_intra_split( + q, + k, + gi, + ge, + u, + A, + s_k_h, + s_k_t, + s_k_d, + scale, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + NC: tl.constexpr +): + i_k, i_tc, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + n_bh = tl.num_programs(2) + i_t, i_i = i_tc // NC, i_tc % NC + if i_t * BT + i_i * BC >= T: + return + + i_j = i_i + i_h = i_bh % H + o_i = tl.arange(0, BC) + o_A = (i_bh + i_k * n_bh) * T * BC + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BC + m_A = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_g = tl.make_block_ptr(ge + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_g = tl.load(p_g, boundary_check=(0, 1)) + p_u = tl.make_block_ptr(u + i_h * s_k_t, (s_k_t,), (1,), (i_k * BK), (BK,), (0,)) + b_u = tl.load(p_u, boundary_check=(0,)) + + for j in range(0, min(BC, T-i_t*BT-i_i*BC)): + b_A = tl.zeros([BC], dtype=tl.float32) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T * K,), (s_k_d,), ((i_t * BT + i_j * BC + j) * K + i_k * BK,), (BK,), (0,)) + p_gk = tl.make_block_ptr(gi + i_bh * s_k_h, (T*K,), (s_k_d,), ((i_t * BT + i_j * BC + j) * K + i_k * BK,), (BK,), (0,)) + b_k = tl.load(p_k, boundary_check=(0,)).to(tl.float32) + b_gk = tl.load(p_gk, boundary_check=(0,)).to(tl.float32) + b_A += tl.sum(b_q * b_k[None, :] * tl.exp(b_g - b_gk[None, :]), 1) + b_A = tl.where(o_i > j, b_A * scale, 0.) 
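+        # Same diagonal/bonus handling as in the non-split kernel above.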
+ p_qi = tl.make_block_ptr(q + i_bh * s_k_h, (T * K,), (s_k_d,), + ((i_t * BT + i_j * BC + j) * K + i_k * BK,), (BK,), (0,)) + b_qi = tl.load(p_qi, boundary_check=(0,)) + A_jj = tl.sum(b_qi * b_k * b_u * scale) + b_A = tl.where(o_i != j, b_A, A_jj) + tl.store(A + o_A + j, b_A, mask=m_A) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BC"], +) +@triton.jit +def chunk_rwkv6_fwd_A_kernel_intra_sub_intra_merge( + A, + A2, + T: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + NK: tl.constexpr +): + i_t, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + if i_t * BT + i_c * BC >= T: + return + + n_bh = tl.num_programs(2) + b_A = tl.zeros([BC, BC], dtype=tl.float32) + for i_k in range(0, NK): + p_A = tl.make_block_ptr(A + (i_bh + i_k*n_bh) * T * BC, (T, BC), (BC, 1), (i_t * BT + i_c * BC, 0), (BC, BC), (1, 0)) + b_A += tl.load(p_A, boundary_check=(0, 1)) + p_A2 = tl.make_block_ptr(A2 + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT + i_c * BC, i_c * BC), (BC, BC), (1, 0)) + tl.store(p_A2, b_A.to(A2.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BK", "BV", "BT"], +) +@triton.jit +def chunk_rwkv6_fwd_kernel_inter( + q, + v, + g, + h, + o, + A, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + s_h_d, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr +): + i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_ge = tl.make_block_ptr(g + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h + i_t * K * V, (K, V), (s_h_t, s_h_d), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_q = (b_q * scale).to(b_q.dtype) + # [BT, BK] + b_g = tl.load(p_ge, boundary_check=(0, 1)) + # [BT, BK] + b_qg = (b_q * tl.exp(b_g)).to(b_q.dtype) + # [BK, BV] + b_h = tl.load(p_h, boundary_check=(0, 1)) + # works but dkw, owing to divine benevolence + # [BT, BV] + if i_k >= 0: + b_o += tl.dot(b_qg, b_h.to(b_qg.dtype)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_A = tl.make_block_ptr(A + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT, 0), (BT, BT), (1, 0)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BT, BT] + b_A = tl.load(p_A, boundary_check=(0, 1)) + m_s = tl.arange(0, BT)[:, None] >= tl.arange(0, BT)[None, :] + b_A = tl.where(m_s, b_A, 0.) 
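+    # Keep only the causal part of the intra-chunk scores before adding their
+    # contribution to the inter-chunk output accumulated above.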
+ b_o += tl.dot(b_A.to(b_v.dtype), b_v, allow_tf32=False) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BK", "NC", "BT"], +) +@triton.jit +def chunk_rwkv6_bwd_kernel_intra( + q, + k, + gi, + ge, + dA, + dq, + dk, + s_k_h, + s_k_t, + s_k_d, + scale, + T: tl.constexpr, + K: tl.constexpr, + BT: tl.constexpr, + BC: tl.constexpr, + BK: tl.constexpr, + NC: tl.constexpr +): + i_k, i_c, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_t, i_i = i_c // NC, i_c % NC + if i_t * BT + i_i * BC >= T: + return + + o_k = i_k * BK + tl.arange(0, BK) + o_q = i_t * BT + i_i * BC + m_k = o_k < K + + p_ge = tl.make_block_ptr(ge + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + # [BC, BK] + b_ge = tl.load(p_ge, boundary_check=(0, 1)) + b_dq = tl.zeros([BC, BK], dtype=tl.float32) + b_dk = tl.zeros([BC, BK], dtype=tl.float32) + o_i = tl.arange(0, BC) + m_dA = (i_t * BT + i_i * BC + tl.arange(0, BC)) < T + + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + + b_dq = tl.zeros([BC, BK], dtype=tl.float32) + + if i_i > 0: + b_gn = tl.load(gi + i_bh * T * K + (o_q - 1) * K + o_k, mask=(m_k & (i_i > 0) & (o_q <= T)), other=0) + for i_j in range(0, i_i): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), + (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_gk = tl.make_block_ptr(gi + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), + (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_dA = tl.make_block_ptr(dA + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT + i_i * BC, i_j * BC), (BC, BC), (1, 0)) + # [BC, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + b_kg = (b_k * tl.exp(b_gn[None, :] - b_gk)) + # [BC, BC] + b_dA = tl.load(p_dA, boundary_check=(0, 1)) + # [BC, BK] + b_dq += tl.dot(b_dA, b_kg) + b_dq *= tl.exp(b_ge - b_gn[None, :]) + + o_dA = i_bh * T * BT + (i_t * BT + i_i * BC + tl.arange(0, BC)) * BT + i_i * BC + for j in range(0, min(BC, T-i_t*BT-i_i*BC)): + p_kj = tl.make_block_ptr(k + i_bh * s_k_h, (T * K,), (1,), ((i_t * BT + i_i*BC+j) * K + i_k * BK,), (BK,), (0,)) + p_gkj = tl.make_block_ptr(gi + i_bh * s_k_h, (T * K,), (1,), ((i_t * BT + i_i*BC+j) * K + i_k * BK,), (BK,), (0,)) + # [BC,] + b_dA = tl.load(dA + o_dA + j, mask=m_dA, other=0) + # [BK,] + b_kj = tl.load(p_kj, boundary_check=(0,)).to(tl.float32) + b_gkj = tl.load(p_gkj, boundary_check=(0,)).to(tl.float32) + # [BC, BK] + m_i = o_i[:, None] > j + # [BC, BK] + # (SY 09/17) important to not use bf16 for b_dA to have a good precision. + tmp = tl.exp(b_ge - b_gkj[None, :]) + b_dq += tl.where(m_i, b_dA[:, None] * b_kj[None, :] * tmp, 0.) 
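+    # b_dq now holds both the cross-sub-chunk and the within-sub-chunk
+    # contributions; write it out before the registers are reused for dk.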
+ p_dq = tl.make_block_ptr(dq + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + tl.debug_barrier() + b_dk = tl.zeros([BC, BK], dtype=tl.float32) + p_gk = tl.make_block_ptr(gi + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + # [BC, BK] + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + + max_block_idx = min(NC, tl.cdiv(T-i_t*BT, BC)) + if i_i < max_block_idx - 1: + p_gn = tl.make_block_ptr(gi + i_bh * s_k_h, (T*K,), (s_k_d,), + ((i_t * BT + i_i * BC + BC - 1) * K + i_k * BK,), (BK,), (0,)) + # [BK,] + b_gn = tl.load(p_gn, boundary_check=(0,)) + for i_j in range(i_i + 1, NC): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), + (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_ge = tl.make_block_ptr(ge + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), + (i_t * BT + i_j * BC, i_k * BK), (BC, BK), (1, 0)) + p_dA = tl.make_block_ptr(dA + i_bh * T * BT, (T, BT), (BT, 1), (i_t * BT + i_j * BC, i_i * BC), (BC, BC), (1, 0)) + # [BC, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_ge = tl.load(p_ge, boundary_check=(0, 1)) + b_qg = b_q * tl.exp(b_ge - b_gn[None, :]) + # [BC, BC] + b_dA = tl.load(p_dA, boundary_check=(0, 1)) + # [BC, BK] fp32 + b_dk += tl.dot(tl.trans(b_dA), b_qg, allow_tf32=False) + b_dk *= tl.exp(b_gn[None, :] - b_gk) + o_dA = i_bh * T * BT + (i_t * BT + i_i * BC) * BT + i_i * BC + tl.arange(0, BC) + for j in range(0, min(BC, T-i_t*BT-i_i*BC)): + p_qj = tl.make_block_ptr(q + i_bh * s_k_h, (T * K,), (1,), ((i_t * BT + i_i * BC + j) * K + i_k * BK,), (BK,), (0,)) + p_gqj = tl.make_block_ptr(ge + i_bh * s_k_h, (T * K,), (1,), ((i_t * BT + i_i * BC + j) * K + i_k * BK,), (BK,), (0,)) + # [BC,] + b_dA = tl.load(dA + o_dA + j * BT, mask=(i_t * BT + i_i * BC + j < T), other=0) + # [BK,] + b_qj = tl.load(p_qj, boundary_check=(0,)).to(tl.float32) + b_gqj = tl.load(p_gqj, boundary_check=(0,)).to(tl.float32) + # [BC, BK] + m_i = o_i[:, None] < j + b_dk += tl.where(m_i, b_dA[:, None] * b_qj[None, :] * tl.exp(b_gqj[None, :] - b_gk), 0.) 
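+    # Mirror of the dq path: b_dk gathers gradients from queries at later
+    # positions (later sub-chunks, plus j > i within the sub-chunk).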
+ p_dk = tl.make_block_ptr(dk + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT + i_i * BC, i_k * BK), (BC, BK), (1, 0)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + # triton.Config({}, num_warps=1), + # triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BK", "BV", "BT"], +) +@triton.jit +def chunk_rwkv6_bwd_kernel_inter( + q, + k, + v, + h, + gi, + ge, + u, + do, + dh, + dA, + dq, + dk, + dq2, + dk2, + dg, + du, + s_k_h, + s_k_t, + s_k_d, + s_v_h, + s_v_t, + s_v_d, + s_h_h, + s_h_t, + s_h_d, + scale, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr +): + i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_h = i_bh % H + n_bh = tl.num_programs(2) + + last_idx = min(T, i_t * BT + BT) - 1 + p_gn = tl.make_block_ptr(gi + i_bh * s_k_h, (T * K,), (s_k_d,), (last_idx * K + i_k * BK,), (BK,), (0,)) + b_gn = tl.load(p_gn, boundary_check=(0,)) + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + b_dgk = tl.zeros([BK,], dtype=tl.float32) + + for i_v in range(tl.cdiv(V, BV)): + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + i_bh * s_h_h + i_t * V * K, (V, K), (s_h_d, s_h_t), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, s_v_d), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dh = tl.make_block_ptr(dh + i_bh * s_h_h + i_t * V * K, (V, K), + (s_h_d, s_h_t), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BV, BK] + b_h = tl.load(p_h, boundary_check=(0, 1)) + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + # [BK] + b_dgk += tl.sum(b_h * b_dh, axis=0) + # [BT, BK] + b_dq += tl.dot(b_do, b_h.to(b_do.dtype)) + b_dk += tl.dot(b_v, b_dh.to(b_v.dtype)) + p_gk = tl.make_block_ptr(ge + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + b_dgk *= tl.exp(b_gn) + b_dq *= scale + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + p_gi = tl.make_block_ptr(gi + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + b_gi = tl.load(p_gi, boundary_check=(0, 1)) + b_dq = b_dq * tl.exp(b_gk) + b_dk = b_dk * tl.exp(b_gn[None, :] - b_gi) + p_dq = tl.make_block_ptr(dq + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_dgk += tl.sum(b_dk * b_k, axis=0) + + b_dq += tl.load(p_dq, boundary_check=(0, 1)) + b_dk += tl.load(p_dk, boundary_check=(0, 1)) + b_dg = b_q * b_dq - b_k * b_dk + b_dg = b_dg - tl.cumsum(b_dg, axis=0) + tl.sum(b_dg, axis=0)[None, :] + b_dgk[None, :] - b_q * b_dq + + o_i = tl.arange(0, BT) + p_dA_dig = dA + i_bh * T * BT + (i_t * BT + o_i) * BT + o_i + b_dA_dig = tl.load(p_dA_dig, mask=(i_t * BT + o_i) < T, other=0) + p_u = tl.make_block_ptr(u + i_h * K, (K,), (1,), (i_k * BK,), (BK,), (0,)) + b_u = tl.load(p_u, 
boundary_check=(0,)) + # scale is already applied to b_dA_diag + b_dq += (b_dA_dig[:, None] * b_u[None, :] * b_k) + b_dk += (b_dA_dig[:, None] * b_u[None, :] * b_q) + b_du = tl.sum(b_dA_dig[:, None] * b_q * b_k, axis=0) + p_du = tl.make_block_ptr(du + (i_h + i_t * n_bh) * K, (K,), (1,), (i_k * BK,), (BK,), (0,)) + tl.store(p_du, b_du, boundary_check=(0,)) + + # Buggy due to strange triton compiler issue. + # m_s = tl.where(tl.arange(0, BT)[:, None] <= tl.arange(0, BT)[None, :], 1., 0.) + # b_dg = tl.dot(m_s, b_dg, allow_tf32=False) + b_dgk[None, :] + p_dg = tl.make_block_ptr(dg + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + # work around triton compiler bugs. + p_dq = tl.make_block_ptr(dq2 + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dk = tl.make_block_ptr(dk2 + i_bh * s_k_h, (T, K), (s_k_t, s_k_d), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dg, b_dg.to(p_dg.dtype.element_ty), boundary_check=(0, 1)) + + +def chunk_rwkv6_fwd_intra_A_gated(q, k, gi, ge, u, scale, BT): + BC = 16 + B, H, T, K = q.shape + A = q.new_empty(B, H, T, BT, dtype=torch.float32) + NC = triton.cdiv(BT, BC) + NT = triton.cdiv(T, BT) + grid = (triton.cdiv(T, BT), NC * NC, B * H) + BK = min(64, triton.next_power_of_2(K)) + chunk_rwkv6_fwd_A_kernel_intra_sub_inter[grid]( + q, k, gi, ge, A, + k.stride(1), k.stride(2), k.stride(3), + scale, + T=T, K=K, BT=BT, BC=BC, BK=BK, NC=NC + ) + grid = (NT, NC, B * H) + # TODO: can we merge the two kernels? + # load the entire [BC, K] blocks into SRAM at once + if K <= 256: + chunk_rwkv6_fwd_A_kernel_intra_sub_intra[grid]( + q, k, gi, ge, u, A, + k.stride(1), k.stride(2), k.stride(3), + scale, + H=H, T=T, K=K, BT=BT, BC=BC, BK=triton.next_power_of_2(K), NC=NC + ) + # split then merge + else: + BK = 128 + NK = triton.cdiv(K, BK) + A_intra = q.new_empty(NK, B, H, T, BC, dtype=torch.float32) + grid = (NK, NT * NC, B * H) + chunk_rwkv6_fwd_A_kernel_intra_sub_intra_split[grid]( + q, k, gi, ge, u, A_intra, + k.stride(1), k.stride(2), k.stride(3), + scale, + H=H, T=T, K=K, BT=BT, BC=BC, BK=BK, NC=NC + ) + grid = (NT, NC, B * H) + chunk_rwkv6_fwd_A_kernel_intra_sub_intra_merge[grid]( + A_intra, A, + T=T, BT=BT, BC=BC, NK=NK + ) + return A + + +def chunk_rwkv6_fwd_o_gated_gk(q, v, g_cumsum, A, h, BT, scale): + B, H, T, K, V = *q.shape, v.shape[-1] + BV = min(32, triton.next_power_of_2(V)) + BK = min(32, triton.next_power_of_2(K)) + NV = triton.cdiv(V, BV) + NT = triton.cdiv(T, BT) + grid = (NV, NT, B * H) + o = torch.empty_like(v) + chunk_rwkv6_fwd_kernel_inter[grid]( + q, v, g_cumsum, h, o, A, + q.stride(1), q.stride(2), q.stride(3), + v.stride(1), v.stride(2), v.stride(3), + h.stride(1), h.stride(2), h.stride(3), + scale, + T=T, K=K, V=V, BT=BT, BK=BK, BV=BV + ) + return o + + +def chunk_rwkv6_bwd_dqk_intra(q, k, g_cumsum_inclusive, g_cumsum_exclusive, dA, BT, scale): + B, H, T, K = q.shape + BC = 16 + BK = min(64, triton.next_power_of_2(K)) + NK = triton.cdiv(K, BK) + NT = triton.cdiv(T, BT) + NC = triton.cdiv(BT, BC) + dq = torch.empty_like(q, dtype=torch.float32) + dk = torch.empty_like(k, dtype=torch.float32) + grid = (NK, NT * NC, B * H) + chunk_rwkv6_bwd_kernel_intra[grid]( + q, k, g_cumsum_inclusive, g_cumsum_exclusive, dA, dq, dk, + k.stride(1), k.stride(2), k.stride(3), scale, + T=T, K=K, BT=BT, BC=BC, BK=BK, NC=NC + ) + return dq, dk + + +def 
chunk_rwkv6_bwd_dqkgu(q, k, v, h, g_cumsum_inclusive, g_cumsum_exclusive, u, do, dh, dA, dq, dk, BT, scale): + B, H, T, K, V = *q.shape, v.shape[-1] + dg = torch.empty_like(g_cumsum_inclusive) + BK = 64 + BV = 64 + NK = triton.cdiv(K, BK) + NT = triton.cdiv(T, BT) + grid = (NK, NT, B * H) + # work around triton compiler bugs. + dq2 = torch.empty_like(dq) + dk2 = torch.empty_like(dk) + du = torch.empty(NT, B, H, K, dtype=torch.float32, device=u.device) + chunk_rwkv6_bwd_kernel_inter[grid]( + q, k, v, h, g_cumsum_inclusive, g_cumsum_exclusive, u, do, dh, dA, dq, dk, dq2, dk2, dg, du, + k.stride(1), k.stride(2), k.stride(3), + v.stride(1), v.stride(2), v.stride(3), + h.stride(1), h.stride(2), h.stride(3), + scale, H=H, T=T, K=K, V=V, BT=BT, BK=BK, BV=BV + ) + du = du.sum([0, 1]) + return dq2, dk2, dg, du + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + ], + key=["BT", "BK", "BV"], +) +@triton.heuristics({ + 'STORE_INITIAL_STATE_GRADIENT': lambda args: args['dh0'] is not None, + 'USE_FINAL_STATE_GRADIENT': lambda args: args['dht'] is not None +}) +@triton.jit +def chunk_rwkv6_bwd_kernel_dh( + q, + gi, + ge, + do, + dh, + dht, + dh0, + s_k_h, + s_k_t, + s_v_h, + s_v_t, + s_h_h, + s_h_t, + scale, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NT: tl.constexpr, + NG: tl.constexpr, + STORE_INITIAL_STATE_GRADIENT: tl.constexpr, + USE_FINAL_STATE_GRADIENT: tl.constexpr +): + i_k, i_v, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_bg = i_bh // NG + # [BK, BV] + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + if USE_FINAL_STATE_GRADIENT: + p_dht = tl.make_block_ptr(dht + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + b_dh += tl.load(p_dht, boundary_check=(0, 1)).to(tl.float32) + + for i_t in range(NT - 1, -1, -1): + p_dh = tl.make_block_ptr(dh + i_bh * s_h_h + i_t * K * V, (K, V), (s_h_t, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_dh, b_dh.to(p_dh.dtype.element_ty), boundary_check=(0, 1)) + last_idx = min(i_t * BT + BT, T) - 1 + # [BK, BT] + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (K, T), (1, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BT, BV] + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + p_gk = tl.make_block_ptr(ge + i_bg * s_k_h, (K, T), (1, s_k_t), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + b_gk = tl.load(p_gk, boundary_check=(0, 1)) + b_q = (b_q * tl.exp(b_gk) * scale).to(b_q.dtype) + p_gk_last = gi + i_bg * s_k_h + last_idx * K + i_k * BK + tl.arange(0, BK) + p_gk_last = tl.max_contiguous(tl.multiple_of(p_gk_last, BK), BK) + b_gk_last = tl.load(p_gk_last, mask=(i_k * BK + tl.arange(0, BK) < K), other=0.) 
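+        # Backward-in-time state recurrence: decay dh by the chunk's final
+        # cumulative gate, then add this chunk's q^T @ do contribution below.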
+ b_dh *= tl.exp(b_gk_last)[:, None] + b_dh += tl.dot(b_q, b_do) + + if STORE_INITIAL_STATE_GRADIENT: + p_dh0 = tl.make_block_ptr(dh0 + i_bh * K * V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), boundary_check=(0, 1)) + + +def chunk_rwkv6_bwd_dh(q, k, v, g_cumsum_inclusive, g_cumsum_exclusive, do, h0, dht, BT, scale, states_in_fp32=False): + HQ = q.shape[1] + B, H, T, K, V = *k.shape, v.shape[-1] + BT = 64 + BK = min(triton.next_power_of_2(K), 64) + BV = min(triton.next_power_of_2(V), 64) + NT, NK, NV = triton.cdiv(T, BT), triton.cdiv(K, BK), triton.cdiv(V, BV) + NG = HQ // H + + dh = k.new_empty(B, HQ, NT * K, V, dtype=k.dtype if not states_in_fp32 else torch.float32) + if h0 is not None: + dh0 = torch.empty_like(h0, dtype=torch.float32) if h0.requires_grad else None + else: + dh0 = None + chunk_rwkv6_bwd_kernel_dh[(NK, NV, B * HQ)]( + q, g_cumsum_inclusive, g_cumsum_exclusive, do, dh, dht, dh0, + q.stride(1), q.stride(2), + v.stride(1), v.stride(2), + dh.stride(1), dh.stride(2), + scale, + T=T, K=K, V=V, BT=BT, BK=BK, BV=BV, NT=NT, NG=NG + ) + return dh, dh0 + + +class ChunkRWKV6Function(torch.autograd.Function): + + @staticmethod + @contiguous + def forward(ctx, q, k, v, g, u, scale, initial_state, output_final_state): + BT = 64 + g_cumsum_inclusive, g_cumsum_exclusive = chunk_rwkv6_fwd_cumsum(g, BT=BT) # gi, ge for short + h, ht = chunk_fwd_h( + k=k, + v=v, + g=None, + gk=g_cumsum_inclusive, + gv=None, + h0=initial_state, + output_final_state=output_final_state, + states_in_fp32=False, + chunk_size=BT + ) + A = chunk_rwkv6_fwd_intra_A_gated(q, k, g_cumsum_inclusive, g_cumsum_exclusive, u, scale, BT) + o = chunk_rwkv6_fwd_o_gated_gk(q, v, g_cumsum_exclusive, A, h, BT, scale) + ctx.save_for_backward(q, k, v, g, initial_state, A, u) + ctx.BT = BT + ctx.scale = scale + return o, ht + + @staticmethod + @contiguous + def backward(ctx, do, dht): + q, k, v, g, initial_state, A, u = ctx.saved_tensors + BT, scale = ctx.BT, ctx.scale + g_cumsum_inclusive, g_cumsum_exclusive = chunk_rwkv6_fwd_cumsum(g, BT=BT) # gi, ge for short + h, _ = chunk_fwd_h( + k=k, + v=v, + g=None, + gk=g_cumsum_inclusive, + gv=None, + h0=initial_state, + output_final_state=False, + states_in_fp32=True, + chunk_size=BT + ) + dh, dh0 = chunk_rwkv6_bwd_dh( + q=q, + k=k, + v=v, + g_cumsum_inclusive=g_cumsum_inclusive, + g_cumsum_exclusive=g_cumsum_exclusive, + do=do, + h0=initial_state, + dht=dht, + BT=BT, + scale=scale, + states_in_fp32=True + ) + # dq dk in fp32 + dA = chunk_gla_bwd_dA(v=v, do=do, scale=scale, chunk_size=BT) + dv = chunk_gla_bwd_dv(k=k, g=g_cumsum_inclusive, A=A, do=do, dh=dh, chunk_size=BT) + dq, dk = chunk_rwkv6_bwd_dqk_intra( + q=q, + k=k, + g_cumsum_inclusive=g_cumsum_inclusive, + g_cumsum_exclusive=g_cumsum_exclusive, + dA=dA, + BT=BT, + scale=scale + ) + dq, dk, dg, du = chunk_rwkv6_bwd_dqkgu( + q=q, + k=k, + v=v, + h=h, + g_cumsum_inclusive=g_cumsum_inclusive, + g_cumsum_exclusive=g_cumsum_exclusive, + u=u, + do=do, + dh=dh, + dA=dA, + dq=dq, + dk=dk, + BT=BT, + scale=scale + ) + return dq.to(q), dk.to(k), dv.to(v), dg.to(g), du.to(u), None, dh0, None + + +def chunk_rwkv6( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + u: torch.Tensor, + scale: Optional[int] = None, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, 
K]`.
+        k (torch.Tensor):
+            keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        v (torch.Tensor):
+            values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        g (torch.Tensor):
+            forget gates of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`.
+        u (torch.Tensor):
+            bonus representations of shape `[H, K]`.
+        scale (Optional[float]):
+            Scale factor for the rwkv6 attention scores.
+            If not provided, it will default to `1 / sqrt(K)`. Default: `None`.
+        initial_state (Optional[torch.Tensor]):
+            Initial state of shape `[B, H, K, V]`. Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state of shape `[B, H, K, V]`. Default: `False`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format. Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        final_state (Optional[torch.Tensor]):
+            Final state of shape `[B, H, K, V]` if `output_final_state=True` else `None`.
+    """
+    if scale is None:
+        scale = q.shape[-1] ** -0.5
+    if not head_first:
+        q, k, v, g = map(lambda x: x.transpose(1, 2) if x is not None else None, (q, k, v, g))
+    o, final_state = ChunkRWKV6Function.apply(q, k, v, g, u, scale, initial_state, output_final_state)
+    if not head_first:
+        o = o.transpose(1, 2)
+    return o, final_state
diff --git a/fla/ops/rwkv6/chunk_naive.py b/fla/ops/rwkv6/chunk_naive.py
new file mode 100644
index 0000000000000000000000000000000000000000..4a2ac664f5079a20eabe9b11c19c1cff6755c658
--- /dev/null
+++ b/fla/ops/rwkv6/chunk_naive.py
@@ -0,0 +1,43 @@
+# -*- coding: utf-8 -*-
+
+import torch
+from einops import rearrange
+
+
+def naive_chunk_rwkv6(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    w: torch.Tensor,
+    u: torch.Tensor,
+    chunk_size: int = 32
+):
+    assert q.shape[-2] % chunk_size == 0
+    orig_dtype = q.dtype
+    num_chunk = q.shape[-2] // chunk_size
+    u = u.unsqueeze(0)
+
+    q, k, v, w = map(lambda x: rearrange(x, 'b h (n c) d -> b h n c d', c=chunk_size).float(), (q, k, v, w))
+
+    w_cumsum = w.cumsum(-2)
+
+    kw = k * (w_cumsum[..., -1, None, :] - w_cumsum).exp()
+    wkv = kw.transpose(-1, -2) @ v
+
+    wkv_new = torch.zeros_like(wkv)
+
+    for i in range(num_chunk - 1):
+        wkv_new[:, :, i+1] = (wkv_new[:, :, i] * w_cumsum[:, :, i, -1, :, None].exp()) + wkv[:, :, i]
+
+    o_inter = torch.einsum('b h n d p, b h n c d -> b h n c p', wkv_new, (q * (w_cumsum - w).exp()))
+
+    o_intra = torch.zeros_like(o_inter)
+    for i in range(chunk_size):
+        attn = (q[:, :, :, i, None] * k * (w_cumsum[:, :, :, i, None] - w[:, :, :, i, None] - w_cumsum).exp()).sum(-1)
+        mask = (torch.arange(0, chunk_size) < i).to(attn.device)
+        attn.masked_fill_(~mask, 0)
+        intra_inter_o = (attn.unsqueeze(-1) * v).sum(-2)
+        intra_intra_o = (q[:, :, :, i] * u.unsqueeze(2) * k[:, :, :, i]).sum(-1).unsqueeze(-1) * v[:, :, :, i]
+        o_intra[:, :, :, i] = intra_inter_o + intra_intra_o
+    o = o_inter + o_intra
+    return rearrange(o, 'b h n c d -> b h (n c) d').to(orig_dtype)
diff --git a/fla/ops/rwkv6/fused_recurrent.py b/fla/ops/rwkv6/fused_recurrent.py
new file mode 100644
index 0000000000000000000000000000000000000000..2e8643762401806790c5730c0c70775dd744d862
--- /dev/null
+++ b/fla/ops/rwkv6/fused_recurrent.py
@@ -0,0 +1,380 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from typing import Tuple
+
+import torch
+import triton
+import triton.language as tl
+
+from fla.ops.utils import chunk_global_cumsum
+from
fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.jit +def fused_recurrent_rwkv6_fwd_kernel( + q, # query [B, H, T, K] + k, # key [B, H, T, K] + v, # value [B, H, T, V] + w, # log gate [B, H, T, K] + u, # bonus [B, H, K] + o, # output [B, H, T, V] + # initial hidden state initialization [B, H, K, V] + h0, + ht, # final hidden state [B, H, K, V] + s_k_h, # stride size: T * K + s_v_h, # stride size: T * V + scale, # K ** -0.5 + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension + USE_INITIAL_STATE: tl.constexpr, # whether to use initial state + STORE_FINAL_STATE: tl.constexpr, # whether to store final state + REVERSE: tl.constexpr, # whether to do autoregressive modeling in the reverse direction +): + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_h = i_bh % H + + p_q = q + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + ((T-1) * K if REVERSE else 0) + p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + ((T-1) * K if REVERSE else 0) + p_v = v + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + ((T-1) * V if REVERSE else 0) + p_o = o + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV) + ((T-1) * V if REVERSE else 0) + p_w = w + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + ((T-1) * K if REVERSE else 0) + p_u = u + i_h * K + tl.arange(0, BK) + i_k * BK + + mask_bk = (i_k * BK + tl.arange(0, BK)) < K + mask_bv = (i_v * BV + tl.arange(0, BV)) < V + mask_kv = mask_bv[:, None] & mask_bk[None, :] + + b_h = tl.zeros([BV, BK], dtype=tl.float32) + if USE_INITIAL_STATE: + p_h0 = h0 + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None]) + b_h += tl.load(p_h0, mask=mask_kv, other=0).to(tl.float32) + + b_u = tl.load(p_u, mask=mask_bk, other=0).to(tl.float32) + for _ in range(0, T): + b_k = tl.load(p_k, mask=mask_bk, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_bv, other=0).to(tl.float32) + b_q = tl.load(p_q, mask=mask_bk, other=0).to(tl.float32) * scale + b_w = tl.load(p_w, mask=mask_bk, other=0).to(tl.float32) + b_w = tl.exp(b_w) + b_kv = b_k[None, :] * b_v[:, None] + b_o = (b_h + b_kv * b_u[None, :]) * b_q[None, :] + b_o = tl.sum(b_o, axis=1) + b_h = b_h * b_w[None, :] + b_h += b_kv + tl.store(p_o, b_o.to(p_o.dtype.element_ty), mask=mask_bv) + p_q += -K if REVERSE else K + p_k += -K if REVERSE else K + p_o += -V if REVERSE else V + p_v += -V if REVERSE else V + p_w += -K if REVERSE else K + + if STORE_FINAL_STATE: + p_ht = ht + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None]) + tl.store(p_ht, b_h.to(p_ht.dtype.element_ty), mask=mask_kv) + + +# Similar to Algorithm1 of https://arxiv.org/abs/2006.16236 +@triton.jit +def fused_recurrent_rwkv6_bwd_kernel_dq( + # B: B, H: H, T: T, D: d_head + # NV: number of split in the V dimension. 
NK: number of split in the K dimension + k, # key [B, H, T, V] + v, # value [B, H, T, V] + w, # log gate [B, H, T, K] + u, # bonus [B, H, K] + + do, # gradient of output [B, H, T, V] + dq, # gradient of query [NV, B, H, T, K] + dq_aux, # gradient of query_aux [NV, B, H, T, K] + + # initial hidden state initialization [B, H, K, V] + h0, + + s_k_h, # stride size: T * K + s_v_h, # stride size: T * V + + scale, # K ** -0.5 + B: tl.constexpr, # B + H: tl.constexpr, # H + T: tl.constexpr, # T + K: tl.constexpr, # K + V: tl.constexpr, # V + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension + USE_INITIAL_STATE: tl.constexpr, # whether to use initial state + REVERSE: tl.constexpr, # whether to do autoregressive modeling in the reverse direction +): + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_h = i_bh % H + p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + ((T-1) * K if REVERSE else 0) + p_v = v + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + ((T-1) * V if REVERSE else 0) + p_do = do + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + ((T-1) * V if REVERSE else 0) + p_dq = dq + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + ((T-1) * K if REVERSE else 0) + p_dq_aux = dq_aux + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + ((T-1) * K if REVERSE else 0) + p_w = w + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + ((T-1) * K if REVERSE else 0) + p_u = u + i_h * K + tl.arange(0, BK) + i_k * BK + + mask_bk = i_k * BK + tl.arange(0, BK) < K + mask_bv = i_v * BV + tl.arange(0, BV) < V + mask_kv = mask_bv[:, None] & mask_bk[None, :] + b_u = tl.load(p_u, mask=mask_bk, other=0).to(tl.float32) + b_h = tl.zeros([BV, BK], dtype=tl.float32) + + if USE_INITIAL_STATE: + p_h0 = h0 + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[None, :]) * V + (i_v * BV + tl.arange(0, BV)[:, None]) + b_h += tl.load(p_h0, mask=mask_kv, other=0).to(tl.float32) + + for _ in range(0, T): + b_k = tl.load(p_k, mask=mask_bk, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_bv, other=0).to(tl.float32) + b_kv = b_k[None, :] * b_v[:, None] + b_do = tl.load(p_do, mask=mask_bv, other=0).to(tl.float32) + b_w = tl.load(p_w, mask=mask_bk, other=0).to(tl.float32) + b_w = tl.exp(b_w) + h_q = b_h * b_do[:, None] + b_dq = tl.sum(h_q + b_kv * b_u[None, :] * b_do[:, None], axis=0) + b_dq *= scale + b_dq_aux = tl.sum(h_q, axis=0) + b_h = b_h * b_w[None, :] + b_h += b_kv + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), mask=mask_bk) + tl.store(p_dq_aux, b_dq_aux.to(p_dq_aux.dtype.element_ty), mask=mask_bk) + p_k += -K if REVERSE else K + p_do += -V if REVERSE else V + p_v += -V if REVERSE else V + p_w += -K if REVERSE else K + p_dq += -K if REVERSE else K + p_dq_aux += -K if REVERSE else K + + +@triton.jit +def fused_recurrent_rwkv6_bwd_kernel_dkv( + # B: B, H: H, T: T, D: d_head + # NV: number of split in the V dimension. 
NK: number of split in the K dimension + q, # query [B, H, T, K] + k, # key [B, H, T, V] + v, # value [B, H, T, V] + w, # log gate [B, H, T, K] + u, # bonus [B, H, K] + + do, # gradient of output [B, H, T, V] + dk, + dk_aux, + dv, + dh0, + + # initial hidden state initialization [B, H, K, V] + s_k_h, # stride size: T * K + s_v_h, # stride size: T * V + + scale, # K ** -0.5 + B: tl.constexpr, # B + H: tl.constexpr, # H + T: tl.constexpr, # T + K: tl.constexpr, # K + V: tl.constexpr, # V + BK: tl.constexpr, # BLOCK SIZE along the K dimension + BV: tl.constexpr, # BLOCK SIZE along the V dimension + USE_INITIAL_STATE: tl.constexpr, # whether to use initial state + REVERSE: tl.constexpr, # whether to do autoregressive modeling in the reverse direction +): + i_v, i_k, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_h = i_bh % H + p_q = q + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + ((T - 1) * K if not REVERSE else 0) + p_k = k + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + ((T - 1) * K if not REVERSE else 0) + p_do = do + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + ((T - 1) * V if not REVERSE else 0) + p_v = v + i_bh * s_v_h + i_v * BV + tl.arange(0, BV) + ((T - 1) * V if not REVERSE else 0) + p_dk = dk + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + ((T - 1) * K if not REVERSE else 0) + p_dk_aux = dk_aux + (i_bh + i_v * B * H) * s_k_h + i_k * BK + tl.arange(0, BK) + ((T - 1) * K if not REVERSE else 0) + p_dv = dv + (i_bh + i_k * B * H) * s_v_h + i_v * BV + tl.arange(0, BV) + ((T - 1) * V if not REVERSE else 0) + p_w = w + i_bh * s_k_h + i_k * BK + tl.arange(0, BK) + ((T - 1) * K if not REVERSE else 0) + b_dh = tl.zeros([BK, BV], dtype=tl.float32) + mask_bk = i_k * BK + tl.arange(0, BK) < K + mask_bv = i_v * BV + tl.arange(0, BV) < V + mask_kv = mask_bk[:, None] & mask_bv[None, :] + + p_u = u + i_h * K + tl.arange(0, BK) + i_k * BK + b_u = tl.load(p_u, mask=mask_bk, other=0).to(tl.float32) + + for _ in range(T-1, -1, -1): + b_q = tl.load(p_q, mask=mask_bk, other=0).to(tl.float32) * scale + b_k = tl.load(p_k, mask=mask_bk, other=0).to(tl.float32) + b_v = tl.load(p_v, mask=mask_bv, other=0).to(tl.float32) + b_w = tl.load(p_w, mask=mask_bk, other=0).to(tl.float32) + b_do = tl.load(p_do, mask=mask_bv, other=0).to(tl.float32) + b_dkv = b_q[:, None] * b_do[None, :] + b_dk = tl.sum(b_dh * b_v[None, :], axis=1) + tl.store(p_dk_aux, b_dk.to(p_dk_aux.dtype.element_ty), mask=mask_bk) + b_dk += tl.sum(b_dkv * b_u[:, None] * b_v[None, :], axis=1) + b_dv = tl.sum((b_dh + (b_dkv * b_u[:, None])) * b_k[:, None], axis=0) + + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), mask=mask_bk) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), mask=mask_bv) + b_dh *= tl.exp(b_w)[:, None] + b_dh += b_dkv + + p_q += K if REVERSE else -K + p_k += K if REVERSE else -K + p_v += V if REVERSE else -V + p_w += K if REVERSE else -K + p_do += V if REVERSE else -V + p_dk += K if REVERSE else -K + p_dk_aux += K if REVERSE else -K + p_dv += V if REVERSE else -V + + if USE_INITIAL_STATE: + p_dh0 = dh0 + i_bh * K * V + (i_k * BK + tl.arange(0, BK)[:, None]) * V + (i_v * BV + tl.arange(0, BV)[None, :]) + tl.store(p_dh0, b_dh.to(p_dh0.dtype.element_ty), mask=mask_kv) + + +class FusedRecurrentRWKV6Function(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward(ctx, r, k, v, w, u, scale=None, initial_state=None, output_final_state=False, reverse=False): + q = r + B, H, T, K, V = *q.shape, v.shape[-1] + + BK, BV = min(triton.next_power_of_2(K), 32), 
min(triton.next_power_of_2(V), 32) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + num_stages = 1 + num_warps = 1 + + final_state = q.new_empty(B, H, K, V) if output_final_state else None + + o = q.new_empty(NK, B, H, T, V, dtype=torch.float32) + grid = (NV, NK, B * H) + fused_recurrent_rwkv6_fwd_kernel[grid]( + q, k, v, w, u, o, initial_state, final_state, + k.stride(1), + v.stride(1), + scale, + B=B, H=H, T=T, K=K, V=V, BK=BK, BV=BV, + USE_INITIAL_STATE=initial_state is not None, + STORE_FINAL_STATE=final_state is not None, + REVERSE=reverse, + num_warps=num_warps, + num_stages=num_stages + ) + + o = o.sum(0) + ctx.save_for_backward(q, k, v, w, u, initial_state) + ctx.scale = scale + ctx.reverse = reverse + return o.to(q.dtype), final_state + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dht=None): + q, k, v, w, u, initial_state = ctx.saved_tensors + B, H, T, K, V = *q.shape, v.shape[-1] + scale = ctx.scale + + BK, BV = min(triton.next_power_of_2(K), 16), min(triton.next_power_of_2(V), 64) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + num_stages = 1 + num_warps = 1 + dq = q.new_empty(NV, B, H, T, K, dtype=torch.float32) + dq_aux = torch.empty_like(dq) + grid = (NV, NK, B * H) + + fused_recurrent_rwkv6_bwd_kernel_dq[grid]( + k, v, w, u, do, dq, dq_aux, initial_state, + q.stride(1), + v.stride(1), + scale, + B=B, H=H, T=T, K=K, V=V, BK=BK, BV=BV, + USE_INITIAL_STATE=initial_state is not None, + REVERSE=ctx.reverse, + num_warps=num_warps, + num_stages=num_stages + ) + dq = dq.sum(0).to(q) + dq_aux = dq_aux.sum(0) + + BK, BV = min(triton.next_power_of_2(K), 32), min(triton.next_power_of_2(V), 32) + NK, NV = triton.cdiv(K, BK), triton.cdiv(V, BV) + + dk = q.new_empty(NV, B, H, T, K, dtype=torch.float32) + dk_aux = q.new_empty(NV, B, H, T, K, dtype=torch.float32) + dv = q.new_empty(NK, B, H, T, V, dtype=torch.float32) + dh0 = initial_state.new_empty(B, H, K, V) if initial_state is not None else None + grid = (NV, NK, B * H) + fused_recurrent_rwkv6_bwd_kernel_dkv[grid]( + q, k, v, w, u, do, dk, dk_aux, dv, dh0, + q.stride(1), + v.stride(1), + scale, + B=B, H=H, T=T, K=K, V=V, BK=BK, BV=BV, + num_warps=num_warps, + num_stages=num_stages, + USE_INITIAL_STATE=initial_state is not None, + REVERSE=ctx.reverse, + ) + dk = dk.sum(0).to(k) + dv = dv.sum(0).to(v) + dk_aux = dk_aux.sum(0) + + dw = (dq_aux * q * scale)[:, :, 1:] - (dk_aux * k)[:, :, 0:-1] + dw = torch.nn.functional.pad(dw, (0, 0, 0, 1, 0, 0, 0, 0), value=0) + dw = chunk_global_cumsum(dw, reverse=True).to(w) + + du = ((do * v).sum(-1)[..., None] * k * q * scale).sum([0, -2]).to(u) + return dq, dk, dv, dw, du, None, dh0, None, None + + +def fused_recurrent_rwkv6( + r: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + w: torch.Tensor, + u: torch.Tensor, + scale: float = -1, + initial_state: torch.Tensor = None, + output_final_state: bool = False, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + r (torch.Tensor): + reception of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. Alias: q, query in linear attention. + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + w (torch.Tensor): + data-dependent decays of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]` in log space! Alias: g. + u (torch.Tensor): + bonus of shape `[H]` + scale (Optional[int]): + Scale factor for the RWKV6 attention scores. 
+            If set to `-1` (the default), it will fall back to `1 / sqrt(K)`.
+        initial_state (Optional[torch.Tensor]):
+            Initial state of shape `[B, H, K, V]`. Default: `None`.
+        output_final_state (Optional[bool]):
+            Whether to output the final state of shape `[B, H, K, V]`. Default: `False`.
+        head_first (Optional[bool]):
+            Whether the inputs are in the head-first format. Default: `True`.
+
+    Returns:
+        o (torch.Tensor):
+            Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`.
+        final_state (Optional[torch.Tensor]):
+            Final state of shape `[B, H, K, V]` if `output_final_state=True` else `None`.
+    """
+    if scale == -1:
+        scale = r.shape[-1] ** -0.5
+    if not head_first:
+        r, k, v, w = map(lambda x: x.transpose(1, 2), (r, k, v, w))
+    o, final_state = FusedRecurrentRWKV6Function.apply(r, k, v, w, u, scale, initial_state, output_final_state)
+    if not head_first:
+        o = o.transpose(1, 2)
+    return o, final_state
diff --git a/fla/ops/rwkv6/recurrent_naive.py b/fla/ops/rwkv6/recurrent_naive.py
new file mode 100644
index 0000000000000000000000000000000000000000..ba2268759b5d4ce7f9be1be1f9c2e1a2f2a8e6c3
--- /dev/null
+++ b/fla/ops/rwkv6/recurrent_naive.py
@@ -0,0 +1,103 @@
+# -*- coding: utf-8 -*-
+
+from typing import Optional
+
+import torch
+
+
+def naive_recurrent_rwkv6(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    w: torch.Tensor,
+    u: torch.Tensor,
+    scale: Optional[float] = None,
+    initial_state: Optional[torch.Tensor] = None,
+    output_final_state: Optional[bool] = False
+):
+    orig_dtype = q.dtype
+    B, H, T, K, V = *q.shape, v.shape[-1]
+    q, k, v, w, u = map(lambda x: x.float(), (q, k, v, w, u))
+    h = torch.zeros(B, H, K, V, dtype=torch.float32, device=q.device)
+    o = torch.zeros_like(v)
+
+    if scale is None:
+        scale = K ** -0.5
+
+    if initial_state is not None:
+        h += initial_state
+
+    for i in range(T):
+        q_i = q[:, :, i, :] * scale
+        k_i = k[:, :, i]
+        v_i = v[:, :, i, :]
+        w_i = w[:, :, i].exp()
+        kv_i = k_i[..., None] * v_i[..., None, :]
+        o_i = (h + u[None, ..., None] * kv_i) * q_i[..., None]
+        o[:, :, i] = o_i.sum(-2)
+        h = h * w_i[..., None] + kv_i
+    ht = h if output_final_state else None
+    return o.to(orig_dtype), ht
+
+
+@torch.no_grad
+@torch.jit.script
+def naive_recurrent_rwkv6_bwd(
+    q: torch.Tensor,
+    k: torch.Tensor,
+    v: torch.Tensor,
+    w: torch.Tensor,
+    u: torch.Tensor,
+    o: torch.Tensor,
+    do: torch.Tensor,
+    initial_state: Optional[torch.Tensor] = None
+):
+    q, k, v, w, u, o, do = (x.to(dtype=torch.float32) for x in (q, k, v, w, u, o, do))
+    B, H, T, K, V = q.shape[0], q.shape[1], q.shape[2], q.shape[3], v.shape[-1]
+    h = torch.zeros(B, H, K, V, dtype=torch.float32, device=q.device)
+    dq = torch.zeros_like(q)
+    dq_aux = torch.zeros_like(q)
+
+    if initial_state is not None:
+        h += initial_state
+
+    for i in range(T):
+        k_i = k[:, :, i]
+        v_i = v[:, :, i]
+        w_i = w[:, :, i].exp()
+        kv_i = k_i[..., None] * v_i[..., None, :]
+        h_i = (h + u[None, ..., None] * kv_i)
+        dq_i = (do[:, :, i, None, :] * h_i).sum(-1)
+        dq_aux_i = (do[:, :, i, None, :] * h).sum(-1)
+        dq[:, :, i] = dq_i
+        dq_aux[:, :, i] = dq_aux_i
+        h = h * w_i[..., None] + kv_i
+
+    du = torch.zeros_like(u)
+    dh = torch.zeros_like(h)
+    dk = torch.zeros_like(k)
+    dk_aux = torch.zeros_like(k)
+    dv = torch.zeros_like(v)
+
+    for i in range(T - 1, -1, -1):
+        d_kv_i = do[:, :, i, None, :] * q[:, :, i, :, None]
+        k_i = k[:, :, i]
+        v_i = v[:, :, i]
+        du_i = (d_kv_i * k_i[..., None] * v_i[..., None, :]).sum(-1)
+        du += du_i.sum(0)
+        dk_i = (dh * v_i[..., None,
:]).sum(-1) + dk_aux[:, :, i] = dk_i + dk_i += (d_kv_i * u[None, ..., None] * v_i[..., None, :]).sum(-1) + dv_i = (d_kv_i * u[None, ..., None] * k_i[..., None]).sum(-2) + dv_i += (dh * k_i[..., None]).sum(-2) + + dk[:, :, i] = dk_i + dv[:, :, i] = dv_i + dh = dh * w[:, :, i, :, None].exp() + d_kv_i + + # dw = q * dq_aux - k * dk_aux + dw = torch.zeros_like(w) + for i in range(T - 2, -1, -1): + dw[:, :, i] = dw[:, :, i+1] + dq_aux[:, :, i+1] * q[:, :, i+1] - dk_aux[:, :, i] * k[:, :, i] + + return dq, dk, dv, dw, du, dh diff --git a/fla/ops/scan/__init__.py b/fla/ops/scan/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..692bfc522dda32fcb629308b0c0341d70db63395 --- /dev/null +++ b/fla/ops/scan/__init__.py @@ -0,0 +1,9 @@ +# -*- coding: utf-8 -*- + +from .parallel import parallel_scan +from .naive import naive_recurrent_scan + +__all__ = [ + 'parallel_scan', + 'naive_recurrent_scan' +] diff --git a/fla/ops/scan/fused_recurrent.py b/fla/ops/scan/fused_recurrent.py new file mode 100644 index 0000000000000000000000000000000000000000..47ffe9a6f77d50eaa55d4633e427c13f69e14877 --- /dev/null +++ b/fla/ops/scan/fused_recurrent.py @@ -0,0 +1,565 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.ops.common.fused_recurrent import (fused_recurrent_bwd_kernel, + fused_recurrent_fwd_kernel) +from fla.ops.utils import chunk_global_cumsum +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.jit +def fused_recurrent_gsa_inference_kernel( + q, + k, + v, + s, + g, + o, + hk0, + hv0, + hkt, + hvt, + scale, + K: tl.constexpr, + V: tl.constexpr, + M: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NG: tl.constexpr +): + i_bh = tl.program_id(0) + i_bg = i_bh // NG + + b_s = tl.load(s + i_bg * M + tl.arange(0, M)).to(tl.float32) + b_g = tl.load(g + i_bg * M + tl.arange(0, M)).to(tl.float32) + b_g = tl.exp(b_g) + + b_ok = tl.zeros([M], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + o_k = i_k * BK + tl.arange(0, BK) + + p_hk0 = hk0 + i_bg * K * M + (o_k[None, :]) * M + tl.arange(0, M)[:, None] + # [BK,] + mask_k = o_k < K + # [M, BK] + mask_hk = (tl.arange(0, M) < M)[:, None] & mask_k[None, :] + # [M, BK] + b_hk = tl.load(p_hk0, mask=mask_hk, other=0.).to(tl.float32) + # [BK,] + b_q = tl.load(q + i_bh * K + o_k, mask=mask_k, other=0.).to(tl.float32) * scale + b_k = tl.load(k + i_bg * K + o_k, mask=mask_k, other=0.).to(tl.float32) + b_hk = b_hk * b_g[:, None] + b_k[None, :] * b_s[:, None] + b_ok += tl.sum(b_hk * b_q[None, :], axis=1) + + if i_bh % NG == 0: + p_hkt = hkt + i_bg * K * M + o_k[None, :] * M + tl.arange(0, M)[:, None] + tl.store(p_hkt, b_hk.to(p_hkt.dtype.element_ty), mask=mask_hk) + + b_qv = tl.softmax(b_ok) + for i_v in range(tl.cdiv(V, BV)): + o_v = i_v * BV + tl.arange(0, BV) + + p_hv0 = hv0 + i_bg * M * V + tl.arange(0, M)[None, :] * V + o_v[:, None] + # [BV,] + mask_v = o_v < V + # [BV, M] + mask_hv = mask_v[:, None] & (tl.arange(0, M) < M)[None, :] + # [BV, M] + b_hv = tl.load(p_hv0, mask=mask_hv, other=0).to(tl.float32) + # [BV,] + b_v = tl.load(v + i_bg * V + o_v, mask=mask_v, other=0).to(tl.float32) + b_hv = b_hv * b_g[None, :] + b_s[None, :] * b_v[:, None] + b_ov = tl.sum(b_hv * b_qv[None, :], axis=1) + + tl.store(o + i_bh * V + o_v, b_ov.to(o.dtype.element_ty), mask=mask_v) + + if i_bh % NG == 0: + p_hvt = hvt + i_bg * M * V + tl.arange(0, M)[None, :] * V + o_v[:, None] + 
tl.store(p_hvt, b_hv.to(p_hvt.dtype.element_ty), mask=mask_hv) + + +def fused_recurrent_gsa_inference( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + initial_state: Optional[Tuple[torch.Tensor, torch.Tensor]] = None, + output_final_state: bool = False, + scale: float = 1., + head_first: bool = True +) -> torch.Tensor: + if head_first: + B, H, T, K, V, M = *k.shape, v.shape[-1], s.shape[-1] + else: + B, T, H, K, V, M = *k.shape, v.shape[-1], s.shape[-1] + HQ = q.shape[1] if head_first else q.shape[2] + BK, BV = min(triton.next_power_of_2(K), 64), min(triton.next_power_of_2(V), 64) + NG = HQ // H + + hk0, hv0 = None, None + if initial_state is not None: + hk0, hv0 = initial_state + hkt, hvt = None, None + if output_final_state: + if NG == 1: + hkt, hvt = hk0, hv0 + else: + hkt, hvt = q.new_empty(B, H, K, M, dtype=torch.float), q.new_empty(B, H, M, V, dtype=torch.float) + + o = v.new_empty(B, HQ, T, V) if head_first else v.new_empty(B, T, HQ, V) + grid = (B * HQ,) + fused_recurrent_gsa_inference_kernel[grid]( + q, + k, + v, + s, + g, + o, + hk0, + hv0, + hkt, + hvt, + scale=scale, + K=K, + V=V, + M=M, + BK=BK, + BV=BV, + NG=NG + ) + return o, (hkt, hvt) + + +def fused_recurrent_gsa_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + initial_state: Optional[Tuple[torch.Tensor, torch.Tensor]] = None, + output_final_state: bool = False, + scale: float = 1., + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, Tuple[torch.Tensor]]: + if head_first: + B, H, T, K, V, M = *k.shape, v.shape[-1], s.shape[-1] + else: + B, T, H, K, V, M = *k.shape, v.shape[-1], s.shape[-1] + N = B if offsets is None else len(offsets) - 1 + HQ = q.shape[1] if head_first else q.shape[2] + if HQ != H: + raise ValueError("GQA not supported yet.") + + BK, BV, BM = min(triton.next_power_of_2(K), 64), min(triton.next_power_of_2(V), 64), min(M, 64) + NK, NV, NM = triton.cdiv(K, BK), triton.cdiv(V, BV), triton.cdiv(M, BM) + + hk0, hv0 = None, None + if initial_state is not None: + hk0, hv0 = initial_state + hkt, hvt = None, None + if output_final_state: + hkt, hvt = q.new_empty(N, H, K, M, dtype=torch.float), q.new_empty(N, H, M, V, dtype=torch.float) + + ok = q.new_empty(NK, *s.shape, dtype=torch.float) + gk, gv = None, g + grid = (NM, NK, N * H) + fused_recurrent_fwd_kernel[grid]( + q=q, + k=k, + v=s, + g=None, + gk=gk, + gv=gv, + o=ok, + h0=hk0, + ht=hkt, + offsets=offsets, + scale=scale, + B=B, + T=T, + H=H, + K=K, + V=M, + BK=BK, + BV=BM, + USE_G=False, + USE_GK=False, + USE_GV=True, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + ok = ok.sum(0) + + qv = ok.softmax(-1, dtype=torch.float) + ov = q.new_empty(NM, *v.shape, dtype=torch.float) + gk, gv = g, None + grid = (NV, NM, N * H) + fused_recurrent_fwd_kernel[grid]( + q=qv, + k=s, + v=v, + g=None, + gk=gk, + gv=gv, + o=ov, + h0=hv0, + ht=hvt, + offsets=offsets, + scale=1., + B=B, + T=T, + H=H, + K=M, + V=V, + BK=BM, + BV=BV, + USE_G=False, + USE_GK=True, + USE_GV=False, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + ov = ov.sum(0) + return ok, hkt, qv, ov, hvt + + +def fused_recurrent_gsa_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + qv: torch.Tensor, + hk0: Optional[torch.Tensor] = None, + hv0: Optional[torch.Tensor] = None, + ok: Optional[torch.Tensor] = None, + do: Optional[torch.Tensor] = None, + dhkt: Optional[torch.Tensor] = None, + dhvt: 
Optional[torch.Tensor] = None, + scale: float = 1., + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor]: + if head_first: + B, H, T, K, V, M = *q.shape, v.shape[-1], s.shape[-1] + else: + B, T, H, K, V, M = *q.shape, v.shape[-1], s.shape[-1] + N = B if offsets is None else len(offsets) - 1 + + BK, BV, BM = min(K, 64), min(V, 64), min(M, 64) + NK, NV, NM = triton.cdiv(K, BK), triton.cdiv(V, BV), triton.cdiv(M, BM) + + if head_first: + dqv = q.new_empty(NV, B, H, T, M, dtype=torch.float) + dsv = q.new_empty(NV, B, H, T, M, dtype=torch.float) + dv = q.new_empty(NM, B, H, T, V, dtype=torch.float) + else: + dqv = q.new_empty(NV, B, T, H, M, dtype=torch.float) + dsv = q.new_empty(NV, B, T, H, M, dtype=torch.float) + dv = q.new_empty(NM, B, T, H, V, dtype=torch.float) + dhk0 = torch.empty_like(hk0)if hk0 is not None else None + dhv0 = torch.empty_like(hv0)if hv0 is not None else None + + gk, gv = g, None + grid = (NV, NM, N * H) + fused_recurrent_bwd_kernel[grid]( + q=qv, + k=s, + v=v, + g=None, + gk=gk, + gv=gv, + h0=hv0, + do=do, + dq=dqv, + dk=dsv, + dv=dv, + dht=dhvt, + dh0=dhv0, + offsets=offsets, + scale=1., + B=B, + T=T, + H=H, + K=M, + V=V, + BK=BM, + BV=BV, + USE_G=False, + USE_GK=True, + USE_GV=False, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + dqv = dqv.sum(0) + dsv = dsv.sum(0) + dv = dv.sum(0) + dgk = chunk_global_cumsum(dqv * qv.float() - dsv * s.float(), + reverse=not reverse, + offsets=offsets, + head_first=head_first) + + dok = qv * (dqv - (qv * dqv).sum(-1, True)) + if head_first: + dq = q.new_empty(NM, B, H, T, K, dtype=torch.float) + dk = q.new_empty(NM, B, H, T, K, dtype=torch.float) + dsk = q.new_empty(NK, B, H, T, M, dtype=torch.float) + else: + dq = q.new_empty(NM, B, T, H, K, dtype=torch.float) + dk = q.new_empty(NM, B, T, H, K, dtype=torch.float) + dsk = q.new_empty(NK, B, T, H, M, dtype=torch.float) + gk, gv = None, g + grid = (NM, NK, N * H) + fused_recurrent_bwd_kernel[grid]( + q=q, + k=k, + v=s, + g=None, + gk=gk, + gv=gv, + h0=hk0, + do=dok, + dq=dq, + dk=dk, + dv=dsk, + dht=dhkt, + dh0=dhk0, + offsets=offsets, + scale=scale, + B=B, + T=T, + H=H, + K=K, + V=M, + BK=BK, + BV=BM, + USE_G=False, + USE_GK=False, + USE_GV=True, + REVERSE=reverse, + HEAD_FIRST=head_first + ) + dq = dq.sum(0) + dk = dk.sum(0) + dsk = dsk.sum(0) + + dgv = chunk_global_cumsum(dok.float() * ok.float() - dsk * s.float(), + reverse=not reverse, + offsets=offsets, + head_first=head_first) + + ds = dsk.add_(dsv) + dg = dgk.add_(dgv) + + return dq, dk, dv, ds, dg, dhk0, dhv0 + + +class FusedRecurrentGSAFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward( + ctx, + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + scale: Optional[float] = None, + hk0: Optional[torch.Tensor] = None, + hv0: Optional[torch.Tensor] = None, + output_final_state: bool = False, + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True + ) -> Tuple[torch.Tensor, Tuple[torch.Tensor]]: + T = q.shape[2] if head_first else q.shape[1] + if T == 1 and not q.requires_grad: + o, (hkt, hvt) = fused_recurrent_gsa_inference( + q=q, + k=k, + v=v, + s=s, + g=g, + initial_state=(hk0, hv0), + output_final_state=output_final_state, + scale=scale, + head_first=head_first + ) + return o, (hkt, hvt) + ok, hkt, qv, ov, hvt = fused_recurrent_gsa_fwd( + q=q, + k=k, + v=v, + s=s, + g=g, + initial_state=(hk0, hv0), + 
output_final_state=output_final_state, + scale=scale, + reverse=reverse, + offsets=offsets, + head_first=head_first + ) + ctx.save_for_backward(q, k, v, s, g, qv, hk0, hv0, ok) + ctx.scale = scale + ctx.reverse = reverse + ctx.offsets = offsets + ctx.head_first = head_first + return ov.to(q.dtype), hkt, hvt + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dhkt=None, dhvt=None): + q, k, v, s, g, qv, hk0, hv0, ok = ctx.saved_tensors + scale = ctx.scale + reverse = ctx.reverse + offsets = ctx.offsets + head_first = ctx.head_first + + # not supported yet. + if dhkt is not None or dhvt is not None: + if g is not None: + assert g.requires_grad is False, "Cannot load final state gradient and use gates at the same time" + dq, dk, dv, ds, dg, dhk0, dhv0 = fused_recurrent_gsa_bwd( + q=q, + k=k, + v=v, + s=s, + g=g, + qv=qv, + hk0=hk0, + hv0=hv0, + ok=ok, + do=do, + dhkt=dhkt, + dhvt=dhvt, + scale=scale, + reverse=reverse, + offsets=offsets, + head_first=head_first + ) + return dq.to(q), dk.to(k), dv.to(v), ds.to(s), dg.to(g), None, dhk0, dhv0, None, None, None, None + + +def fused_recurrent_gsa( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: Optional[torch.Tensor] = None, + scale: Optional[int] = None, + initial_state: Optional[Tuple[torch.Tensor]] = None, + output_final_state: Optional[bool] = False, + reverse: Optional[bool] = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + s (torch.Tensor): + slot representations of shape `[B, H, T, M]` if `head_first=True` else `[B, T, H, M]`. + g (torch.Tensor): + Forget gates of shape `[B, H, T, M]` applied to keys. + scale (Optional[int]): + Scale factor for the attention scores. + If not provided, it will default to `1 / sqrt(K)`. Default: `None`. + initial_state (Optional[Tuple[torch.Tensor]]): + Initial state tuple having tensors of shape `[N, H, K, M]` and `[N, H, M, V]` for `N` input sequences. + For equal-length input sequences, `N` equals the batch size `B`. + Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state of shape `[N, H, K, V]` and `[N, H, M, V]`. + Default: `False`. + reverse (Optional[bool]): + If `True`, process the state passing in reverse order. Default: `False`. + offsets (Optional[torch.LongTensor]): + Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch. + For example, + if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively. + If provided, the inputs are concatenated and the batch size `B` is expected to be 1. + Default: `None`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format, which is not supported for variable-length inputs. + Default: `True`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + final_state (Tuple[torch.Tensor]): + Final state tuple having tensors of shape `[N, H, K, M]` and `[N, H, M, V]`. 
+ + Examples:: + >>> import torch + >>> import torch.nn.functional as F + >>> from einops import rearrange + >>> from fla.ops.gsa import fused_recurrent_gsa + # inputs with equal lengths + >>> B, T, H, K, V, M = 4, 2048, 4, 512, 512, 64 + >>> q = torch.randn(B, T, H, K, device='cuda') + >>> k = torch.randn(B, T, H, K, device='cuda') + >>> v = torch.randn(B, T, H, V, device='cuda') + >>> s = torch.randn(B, T, H, M, device='cuda') + >>> g = F.logsigmoid(torch.randn(B, T, H, M, device='cuda')) + >>> h0 = (torch.randn(B, H, K, M, device='cuda'), torch.randn(B, H, M, V, device='cuda')) + >>> o, (hk, hv) = fused_recurrent_gsa(q, k, v, s, g, + initial_state=h0, + output_final_state=True, + head_first=False) + # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required + >>> q, k, v, s, g = map(lambda x: rearrange(x, 'b t h d -> 1 (b t) h d'), (q, k, v, s, g)) + # for a batch with 4 sequences, offsets with 5 start/end positions are expected + >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long) + >>> o_var, (hk_var, hv_var) = fused_recurrent_gsa(q, k, v, s, g, + initial_state=h0, + output_final_state=True, + offsets=offsets, + head_first=False) + >>> assert o.allclose(o_var.view(o.shape)) + >>> assert hk.allclose(hk_var) + >>> assert hv.allclose(hv_var) + """ + if offsets is not None: + if q.shape[0] != 1: + raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`." + f"Please flatten variable-length inputs before processing.") + if head_first: + raise RuntimeError("Sequences with variable lengths are not supported for head-first mode") + if initial_state is not None and initial_state[0].shape[0] != len(offsets) - 1: + raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, " + f"i.e., {len(offsets) - 1} rather than {initial_state[0].shape[0]}.") + if scale is None: + scale = k.shape[-1] ** -0.5 + if initial_state is None: + initial_state = (None, None) + o, *final_state = FusedRecurrentGSAFunction.apply( + q, + k, + v, + s, + g, + scale, + *initial_state, + output_final_state, + reverse, + offsets, + head_first + ) + return o, final_state diff --git a/fla/ops/scan/naive.py b/fla/ops/scan/naive.py new file mode 100644 index 0000000000000000000000000000000000000000..11a1dbc49e93148faca79c639539e1e988151c09 --- /dev/null +++ b/fla/ops/scan/naive.py @@ -0,0 +1,62 @@ +# -*- coding: utf-8 -*- + +from typing import Optional, Tuple + +import torch +from einops import repeat + + +def naive_recurrent_scan( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + window_size: int, + alibi: torch.Tensor, + mask: torch.Tensor, + scale: Optional[int] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: Optional[bool] = False, + head_first: Optional[bool] = True +) -> Tuple[torch.Tensor, torch.Tensor]: + B, H, T, C, W, S = *q.shape, window_size, g.shape[-1] + Tk = k.shape[2] + + if scale is None: + scale = C ** -0.5 + + sg = torch.einsum("bhts, bhtc -> bhtsc", g, s) # (B, H, T, S, C) + gi = 1 - g # (B, H, T, S) + prev_state = initial_state if initial_state is not None else torch.zeros((B, H, S, C), device=q.device, dtype=q.dtype) + outs = [] + + for t in range(T): # this will only loop more than once in the first prefill pass + prev_state = torch.einsum("bhs, bhsc -> bhsc", gi[:, :, t], prev_state) # (B, H, S, C) + state = prev_state + sg[:, :, t] # (B, H, S, C) + + if T == Tk: # first prefill pass + 
k_window = k[:, :, max(0, t - W):t] # (B, H, W, C) + v_window = v[:, :, max(0, t - W):t] + else: # subsequent passes + k_window = k + v_window = v + Tw = k_window.shape[-2] + # if the window crop is less than W, pad with zeros on the left + if Tw < W: + k_window = torch.cat((torch.zeros((B, H, W - Tw, C), device=k.device, dtype=k.dtype), k_window), dim=2) + v_window = torch.cat((torch.zeros((B, H, W - Tw, C), device=v.device, dtype=v.dtype), v_window), dim=2) + all_keys = torch.cat((state, k_window), dim=2) # (B, H, S, C) + (B, H, W, C) -> (B, H, S+W, C) + all_values = torch.cat((state, v_window), dim=2) # (B, H, S, C) + (B, H, W, C) -> (B, H, S+W, C) + scores = torch.einsum("bhc, bhxc -> bhx", q[:, :, 0], all_keys) * scale # (B, H, C) @ (B, H, S+W, C) -> (B, H, S+W) + scores += alibi[:, Tw] # (B, H, S+W) + scores = scores.masked_fill(mask[Tw] == 0, float("-inf")) + scores = torch.softmax(scores, dim=-1) + out = torch.einsum("bhx, bhxc -> bhc", scores, all_values) + outs.append(out) + + prev_state = state + final_state = prev_state + outs = torch.stack(outs, dim=2) + + return outs, final_state diff --git a/fla/ops/scan/parallel.py b/fla/ops/scan/parallel.py new file mode 100644 index 0000000000000000000000000000000000000000..f956384e8ec0064d547b30387dff275a6663d82d --- /dev/null +++ b/fla/ops/scan/parallel.py @@ -0,0 +1,1086 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import math +import torch +import triton +import triton.language as tl + +# triton kernel +# @triton.autotune( +# configs=[ +# triton.Config({'BLOCK_SIZE_S': 16, 'BLOCK_SIZE_W': 16}, num_warps=2), +# # triton.Config({'BLOCK_SIZE_S': 16, 'BLOCK_SIZE_W': 16}, num_warps=4), +# # triton.Config({'BLOCK_SIZE_S': 16, 'BLOCK_SIZE_W': 16}, num_warps=8), +# # triton.Config({'BLOCK_SIZE_S': 32, 'BLOCK_SIZE_W': 32}, num_warps=2), +# # triton.Config({'BLOCK_SIZE_S': 32, 'BLOCK_SIZE_W': 32}, num_warps=4), +# # triton.Config({'BLOCK_SIZE_S': 32, 'BLOCK_SIZE_W': 32}, num_warps=8), +# # triton.Config({'BLOCK_SIZE_S': 64, 'BLOCK_SIZE_W': 64}, num_warps=2), +# # triton.Config({'BLOCK_SIZE_S': 64, 'BLOCK_SIZE_W': 64}, num_warps=4), +# # triton.Config({'BLOCK_SIZE_S': 64, 'BLOCK_SIZE_W': 64}, num_warps=8), +# ], +# key=[] +# ) +@triton.autotune( + configs=[ + triton.Config({'BLOCK_SIZE_S': 8, 'BLOCK_SIZE_W': 8}, num_warps=2), + triton.Config({'BLOCK_SIZE_S': 8, 'BLOCK_SIZE_W': 8}, num_warps=4), + triton.Config({'BLOCK_SIZE_S': 8, 'BLOCK_SIZE_W': 8}, num_warps=8), + triton.Config({'BLOCK_SIZE_S': 16, 'BLOCK_SIZE_W': 16}, num_warps=2), + triton.Config({'BLOCK_SIZE_S': 16, 'BLOCK_SIZE_W': 16}, num_warps=4), + triton.Config({'BLOCK_SIZE_S': 16, 'BLOCK_SIZE_W': 16}, num_warps=8), + # triton.Config({'BLOCK_SIZE_S': 32, 'BLOCK_SIZE_W': 32}, num_warps=2), + # triton.Config({'BLOCK_SIZE_S': 32, 'BLOCK_SIZE_W': 32}, num_warps=4), + # triton.Config({'BLOCK_SIZE_S': 32, 'BLOCK_SIZE_W': 32}, num_warps=8), + # triton.Config({'BLOCK_SIZE_S': 64, 'BLOCK_SIZE_W': 64}, num_warps=2), + # triton.Config({'BLOCK_SIZE_S': 64, 'BLOCK_SIZE_W': 64}, num_warps=4), + # triton.Config({'BLOCK_SIZE_S': 64, 'BLOCK_SIZE_W': 64}, num_warps=8), + ], + key=[] +) +@triton.jit +def afak_fwd_kernel( + q_ptr, k_ptr, states_ptr, y_ptr, + B: tl.constexpr, T: tl.constexpr, S:tl.constexpr, C: tl.constexpr, W: tl.constexpr, + BLOCK_SIZE_S: tl.constexpr, + BLOCK_SIZE_W: tl.constexpr, +): + # Use multiple program IDs for better parallelization + b_id = tl.program_id(axis=0) + t_id = tl.program_id(axis=1) + sw_block_id = 
tl.program_id(axis=2) + num_s_blocks = triton.cdiv(S, BLOCK_SIZE_S) + num_w_blocks = triton.cdiv(W, BLOCK_SIZE_W) + SW = S + W + + # Compute base pointers + q_base = q_ptr + b_id * T * C + k_base = k_ptr + b_id * T * C + states_base = states_ptr + b_id * T * S * C + y_base = y_ptr + b_id * T * W + + # Fetch the query at [b_id, t_id, :] + q_block_ptr = tl.make_block_ptr( + base=q_ptr, + shape=(B, T, C), + strides=(T * C, C, 1), + offsets=(b_id, t_id, 0), + block_shape=(1, 1, C), + order=(0, 1, 2), + ) + q = tl.load(q_block_ptr) # (1, 1, C) + + if sw_block_id < num_s_blocks: + s_first_id = sw_block_id * BLOCK_SIZE_S + # Fetch the states at [b_id, t_id, s_first_id:s_first_id+BLOCK_SIZE_S, :] + s_block_ptr = tl.make_block_ptr( + base=states_ptr, + shape=(B, T, S, C), + strides=(T * S * C, S * C, C, 1), + offsets=(b_id, t_id, s_first_id, 0), + block_shape=(1, 1, BLOCK_SIZE_S, C), + order=(0, 1, 2, 3), + ) + s = tl.load(s_block_ptr) # (1, 1, BLOCK_SIZE_S, C) + o = q[:, :, None, :] * s # (1, 1, BLOCK_SIZE_S, C) + o = tl.sum(o, axis=-1) # (1, 1, BLOCK_SIZE_S) + # Store the result + y_block_ptr = tl.make_block_ptr( + base=y_ptr, + shape=(B, T, SW), + strides=(T * SW, SW, 1), + offsets=(b_id, t_id, s_first_id), + block_shape=(1, 1, BLOCK_SIZE_S), + order=(0, 1, 2), + ) + tl.store(y_block_ptr, o.to(y_block_ptr.dtype.element_ty)) # (1, 1, BLOCK_SIZE_S) + else: + w_first_id = (sw_block_id - num_s_blocks) * BLOCK_SIZE_W + # Fetch the key at [b_id, t_id-W+1+(w_block_id*BLOCK_SIZE_W):t_id+(w_block_id*BLOCK_SIZE_W), :] + # need to load the keys manually because make_block_ptr doesn't support masks + tw_offs = tl.arange(0, BLOCK_SIZE_W) + c_offs = tl.arange(0, C) + k_block_ptr = k_base + (t_id - W + 1 + (w_first_id + tw_offs[:, None])) * C + c_offs[None, :] + mask = w_first_id + tl.arange(0, BLOCK_SIZE_W)[:, None] > (W - t_id - 2) + k = tl.load(k_block_ptr, mask=mask) # (BLOCK_SIZE_W, C) + # Compute the dot product (but not with tl.dot because it has a minimum size of 16) + y = q * k[None, :] # (1, BLOCK_SIZE_W, C) + y = tl.sum(y, axis=-1) # (1, BLOCK_SIZE_W) + # Store the result + y_block_ptr = tl.make_block_ptr( + base=y_ptr, + shape=(B, T, SW), + strides=(T * SW, SW, 1), + offsets=(b_id, t_id, S + w_first_id), + block_shape=(1, 1, BLOCK_SIZE_W), + order=(0, 1, 2), + ) + tl.store(y_block_ptr, y[None, :].to(y_block_ptr.dtype.element_ty)) # (1, 1, BLOCK_SIZE_W) + +# @triton.autotune( +# configs=[ +# triton.Config({ +# 'BLOCK_SIZE_C': bs_c, +# }, num_warps=warps) +# for bs_c in [16] #, 32, 64] +# for warps in [2] # 4, 8] +# ], +# key=[] +# ) +@triton.autotune( + configs=[ + triton.Config({ + 'BLOCK_SIZE_C': bs_c, + }, num_warps=warps) + for bs_c in [16, 32, 64] + for warps in [2, 4, 8] + ], + key=[] +) +@triton.jit +def afak_bwd_kernel( + q_ptr, k_ptr, states_ptr, dy_ptr, dq_ptr, dk_ptr, ds_ptr, + B: tl.constexpr, T: tl.constexpr, S: tl.constexpr, C: tl.constexpr, W: tl.constexpr, + BLOCK_SIZE_C: tl.constexpr, +): + # Use multiple program IDs for better parallelization + b_id = tl.program_id(axis=0) + t_id = tl.program_id(axis=1) + c_block_id = tl.program_id(axis=2) + c_first_id = c_block_id * BLOCK_SIZE_C + SW = S + W + + # Compute base pointers + q_base = q_ptr + b_id * T * C + k_base = k_ptr + b_id * T * C + dy_base = dy_ptr + b_id * T * SW + dq_base = dq_ptr + b_id * T * C + dk_base = dk_ptr + b_id * T * C + + # First calculate the gradients for q + # Fetch original keys at [b_id, t_id-W+1:t_id, c_first_id:c_first_id+BLOCK_SIZE_C] + # using a block ptr also disallows the use of masks when loading, so 
let's just make a ptr manually + tw_offs = tl.arange(0, W) + c_offs = tl.arange(0, BLOCK_SIZE_C) + k_block_ptr = k_base + (t_id - W + 1 + tw_offs[:, None]) * C + c_first_id + c_offs[None, :] + mask = tl.arange(0, W)[:, None] > (W - t_id - 2) + k = tl.load(k_block_ptr, mask=mask) # (W, BLOCK_SIZE_C) + # Fetch output gradients at [b_id, t_id, S:W] + dy_block_ptr = tl.make_block_ptr( + base=dy_ptr, + shape=(B, T, SW), + strides=(T * SW, SW, 1), + offsets=(b_id, t_id, S), + block_shape=(1, 1, W), + order=(0, 1, 2), + ) + dy = tl.load(dy_block_ptr) # (1, 1, W) + # Compute the gradients for q + dqk = dy.permute(0, 2, 1) * k[None, :] # (1, W, BLOCK_SIZE_C) + dqk = tl.sum(dqk, axis=1) # (1, BLOCK_SIZE_C) + # Then we also have to add the gradients from the states + # Fetch the states at [b_id, t_id, c_first_id:c_first_id+BLOCK_SIZE_C] + s_block_ptr = tl.make_block_ptr( + base=states_ptr, + shape=(B, T, S, C), + strides=(T * S * C, S * C, C, 1), + offsets=(b_id, t_id, 0, c_first_id), + block_shape=(1, 1, S, BLOCK_SIZE_C), + order=(0, 1, 2, 3), + ) + s = tl.load(s_block_ptr) # (1, 1, S, BLOCK_SIZE_C) + # Fetch the output gradients at [b_id, t_id, :S] + dy_block_ptr = tl.make_block_ptr( + base=dy_ptr, + shape=(B, T, SW), + strides=(T * SW, SW, 1), + offsets=(b_id, t_id, 0), + block_shape=(1, 1, S), + order=(0, 1, 2), + ) + dy = tl.load(dy_block_ptr) # (1, 1, S) + # Compute the gradients for q + dqs = dy[:, :, :, None] * s # (1, 1, S, BLOCK_SIZE_C) + dqs = tl.sum(dqs, axis=2) # (1, 1, BLOCK_SIZE_C) + dq = dqk[None, :] + dqs # (1, 1, BLOCK_SIZE_C) + # Store the result + dq_block_ptr = tl.make_block_ptr( + base=dq_ptr, + shape=(B, T, C), + strides=(T * C, C, 1), + offsets=(b_id, t_id, c_first_id), + block_shape=(1, 1, BLOCK_SIZE_C), + order=(0, 1, 2), + ) + tl.store(dq_block_ptr, dq.to(dq_block_ptr.dtype.element_ty)) # (1, 1, BLOCK_SIZE_C) + + # Calculate the gradients for states while we're at it + # Fetch the query at [b_id, t_id, c_first_id:c_first_id+BLOCK_SIZE_C] + q_block_ptr = tl.make_block_ptr( + base=q_ptr, + shape=(B, T, C), + strides=(T * C, C, 1), + offsets=(b_id, t_id, c_first_id), + block_shape=(1, 1, BLOCK_SIZE_C), + order=(0, 1, 2), + ) + q = tl.load(q_block_ptr) # (1, 1, BLOCK_SIZE_C) + # Compute the gradients for states + ds = dy[:, :, :, None] * q[:, :, None, :] # (1, 1, S, BLOCK_SIZE_C) + # Store the result + ds_block_ptr = tl.make_block_ptr( + base=ds_ptr, + shape=(B, T, S, C), + strides=(T * S * C, S * C, C, 1), + offsets=(b_id, t_id, 0, c_first_id), + block_shape=(1, 1, S, BLOCK_SIZE_C), + order=(0, 1, 2, 3), + ) + tl.store(ds_block_ptr, ds.to(ds_block_ptr.dtype.element_ty)) # (1, 1, S, BLOCK_SIZE_C) + + # Then calculate the gradients for k + # same thing here, let's just make the ptr manually + tw_offs = tl.arange(0, W) + c_offs = tl.arange(0, BLOCK_SIZE_C) + q_block_ptr = q_base + (t_id + tw_offs[:, None]) * C + c_first_id + c_offs[None, :] + mask = tl.arange(0, W)[:, None] < T - t_id + q = tl.load(q_block_ptr, mask=mask) # (W, BLOCK_SIZE_C) + # Fetch original gradients at [b_id, t_id, :] + # This one is tricky bc we have to fetch a diagonal from dy + # going from [b_id, t_id, W] to [b_id, t_id+W, 0] + w_offs = tl.arange(0, W) + diag_dy_base = dy_base + t_id * SW + S + tl.flip(w_offs, 0) + dy_block_ptr = diag_dy_base + w_offs * SW + mask = tl.arange(0, W) < T - t_id + dy = tl.load(dy_block_ptr, mask=mask) # (W) + # Compute the gradients for k + dk = dy.reshape(W, 1) * q # (W, BLOCK_SIZE_C) + dk = tl.sum(dk, axis=0) # (BLOCK_SIZE_C) + # Store the result + dk_block_ptr = 
tl.make_block_ptr( + base=dk_ptr, + shape=(B, T, C), + strides=(T * C, C, 1), + offsets=(b_id, t_id, c_first_id), + block_shape=(1, 1, BLOCK_SIZE_C), + order=(0, 1, 2), + ) + tl.store(dk_block_ptr, dk.reshape(1, 1, BLOCK_SIZE_C).to(dk_block_ptr.dtype.element_ty)) # (1, 1, BLOCK_SIZE_C) + +class AttendFoldedAllKeysTriton(torch.autograd.Function): + # @torch.compiler.disable + @staticmethod + def forward(ctx, q, k, states, W): + B, T, C = q.shape + B, T, S, C = states.shape + q = q.contiguous() + k = k.contiguous() + states = states.contiguous() + ctx.save_for_backward(q, k, states) + ctx.W = W + + # Calculate grid dimensions + grid = lambda meta: (B, T, triton.cdiv(S, meta['BLOCK_SIZE_S']) + triton.cdiv(W, meta['BLOCK_SIZE_W'])) + + # Allocate output tensor + y = torch.zeros((B, T, S+W), dtype=q.dtype, device=q.device).contiguous() + + # Launch kernel + afak_fwd_kernel[grid]( + q, k, states, y, + B, T, S, C, W, + ) + + return y + + # @torch.compiler.disable + @staticmethod + def backward(ctx, grad_output): + grad_output = grad_output.contiguous() + q, k, states = ctx.saved_tensors + B, T, S, C = states.shape + W = ctx.W + + # Calculate grid dimensions + grid = lambda meta: (B, T, triton.cdiv(C, meta['BLOCK_SIZE_C'])) + + gq = torch.zeros_like(q).contiguous() + gk = torch.zeros_like(k).contiguous() + gs = torch.zeros_like(states).contiguous() + + # Launch kernel + afak_bwd_kernel[grid]( + q, k, states, grad_output, gq, gk, gs, + B, T, S, C, W + ) + + return gq, gk, gs, None + +# triton kernel +# @triton.autotune( +# configs=[ +# triton.Config({ +# 'BLOCK_SIZE_C': bs_c, +# }, num_warps=warps) +# for bs_c in [16] #, 32, 64] +# for warps in [2] # 4, 8] +# ], +# key=[] +# ) +@triton.autotune( + configs=[ + triton.Config({ + 'BLOCK_SIZE_C': bs_c, + }, num_warps=warps) + for bs_c in [16, 32, 64] + for warps in [2, 4, 8] + ], + key=[] +) +@triton.jit +def afav_fwd_kernel( + s_ptr, v_ptr, states_ptr, y_ptr, + B: tl.constexpr, T: tl.constexpr, S: tl.constexpr, C: tl.constexpr, W: tl.constexpr, + BLOCK_SIZE_C: tl.constexpr, +): + # Use multiple program IDs for better parallelization + b_id = tl.program_id(axis=0) + t_id = tl.program_id(axis=1) + c_block_id = tl.program_id(axis=2) + c_first_id = c_block_id * BLOCK_SIZE_C + SW = S + W + + # Compute base pointers + s_base = s_ptr + b_id * T * W + v_base = v_ptr + b_id * T * C + y_base = y_ptr + b_id * T * C + + # First we accumulate the values + # Fetch the scores at [b_id, t_id, S:W] + sv_block_ptr = tl.make_block_ptr( + base=s_ptr, + shape=(B, T, SW), + strides=(T * SW, SW, 1), + offsets=(b_id, t_id, S), + block_shape=(1, 1, W), + order=(0, 1, 2), + ) + sv = tl.load(sv_block_ptr) # (1, 1, W) + # Fetch the value at [b_id, t_id-W+1:t_id, c_first_id:c_first_id+BLOCK_SIZE_C] + # need to load the keys manually because make_block_ptr doesn't support masks + tw_offs = tl.arange(0, W) + c_offs = tl.arange(0, BLOCK_SIZE_C) + v_block_ptr = v_base + (t_id - W + 1 + tw_offs[:, None]) * C + c_first_id + c_offs[None, :] + mask = tl.arange(0, W)[:, None] > (W - t_id - 2) + v = tl.load(v_block_ptr, mask=mask) # (W, BLOCK_SIZE_C) but W can vary (W - t_id - 2) + v = tl.load(v_block_ptr, mask=mask) # (BLOCK_SIZE_W, C) + + # We already fetched output gradients dy at [b_id, t_id, :] w/ size (1, 1, C) + # Compute the gradients for v + dv = dy * s.reshape(1, BLOCK_SIZE_W, 1) # (1, BLOCK_SIZE_W, C) + + # Compute the gradients for q + dsv = dy * v[None, :] # (1, BLOCK_SIZE_W, C) + dsv = tl.sum(dsv, axis=-1) # (1, BLOCK_SIZE_W) + + # Store the result + dsv_block_ptr = 
tl.make_block_ptr( + base=ds_ptr, + shape=(B, T, SW), + strides=(T * SW, SW, 1), + offsets=(b_id, t_id, S+w_first_id), + block_shape=(1, 1, BLOCK_SIZE_W), + order=(0, 1, 2), + ) + tl.store(dsv_block_ptr, dsv[None, :].to(dsv_block_ptr.dtype.element_ty)) # (1, 1, BLOCK_SIZE_W) + + # Store the result + # need to make a ptr manually because make_block_ptr doesn't support masks + tw_offs = tl.arange(0, BLOCK_SIZE_W) + c_offs = tl.arange(0, C) + dv_block_ptr = dv_base + (t_id - W + 1 + (w_first_id + tw_offs[:, None])) * C + c_offs[None, :] + mask = w_first_id + tl.arange(0, BLOCK_SIZE_W)[:, None] > (W - t_id - 2) + # now we have to atomically add the gradients to the original values + tl.atomic_add(dv_block_ptr[None, :], dv) + else: + s_first_id = sw_block_id * BLOCK_SIZE_S + # Here we calculate the gradients for s[:, :, :S] and for states + # First calculate the gradients for s + # Fetch states at [b_id, t_id, s_first_id:s_first_id+BLOCK_SIZE_S, :] + states_block_ptr = tl.make_block_ptr( + base=states_ptr, + shape=(B, T, S, C), + strides=(T * S * C, S * C, C, 1), + offsets=(b_id, t_id, s_first_id, 0), + block_shape=(1, 1, BLOCK_SIZE_S, C), + order=(0, 1, 2, 3), + ) + states = tl.load(states_block_ptr) # (1, 1, BLOCK_SIZE_S, C) + # Fetch original output gradients at [b_id, t_id, :] + dy_block_ptr = tl.make_block_ptr( + base=dy_ptr, + shape=(B, T, C), + strides=(T * C, C, 1), + offsets=(b_id, t_id, 0), + block_shape=(1, 1, C), + order=(0, 1, 2), + ) + dy = tl.load(dy_block_ptr) # (1, 1, C) + # Fetch the scores at [b_id, t_id, :S] + ss_block_ptr = tl.make_block_ptr( + base=s_ptr, + shape=(B, T, SW), + strides=(T * SW, SW, 1), + offsets=(b_id, t_id, s_first_id), + block_shape=(1, 1, BLOCK_SIZE_S), + order=(0, 1, 2), + ) + ss = tl.load(ss_block_ptr) # (1, 1, BLOCK_SIZE_S) + + # Compute the gradients for s + dss = dy[:, :, None, :] * states # (1, 1, BLOCK_SIZE_S, C) + dss = tl.sum(dss, axis=-1) # (1, 1, BLOCK_SIZE_S) + + # Then calculate the gradients for states + dstates = dy[:, :, None, :] * ss[:, :, :, None] # (1, 1, BLOCK_SIZE_S, C) + + # Store the result gradients of s at [b_id, t_id, :S] + dss_block_ptr = tl.make_block_ptr( + base=ds_ptr, + shape=(B, T, SW), + strides=(T * SW, SW, 1), + offsets=(b_id, t_id, s_first_id), + block_shape=(1, 1, BLOCK_SIZE_S), + order=(0, 1, 2), + ) + tl.store(dss_block_ptr, dss.to(dss_block_ptr.dtype.element_ty)) # (1, 1, BLOCK_SIZE_S) + + # Store the result gradients of states at [b_id, t_id, s_first_id:s_first_id+BLOCK_SIZE_S, :] + dstates_block_ptr = tl.make_block_ptr( + base=dstates_ptr, + shape=(B, T, S, C), + strides=(T * S * C, S * C, C, 1), + offsets=(b_id, t_id, s_first_id, 0), + block_shape=(1, 1, BLOCK_SIZE_S, C), + order=(0, 1, 2, 3), + ) + tl.store(dstates_block_ptr, dstates.to(dstates_block_ptr.dtype.element_ty)) # (1, 1, BLOCK_SIZE_S, C) + +class AccumulateFoldedAllValuesTriton(torch.autograd.Function): + # @torch.compiler.disable + @staticmethod + def forward(ctx, s, v, states, W): + B, T, S, C = states.shape + s = s.contiguous() + v = v.contiguous() + states = states.contiguous() + ctx.save_for_backward(s, v, states) + ctx.W = W + + # Calculate grid dimensions + grid = lambda meta: (B, T, triton.cdiv(C, meta['BLOCK_SIZE_C'])) + + # Allocate output tensor + y = torch.zeros((B, T, C), dtype=v.dtype, device=v.device).contiguous() + + # Launch kernel + afav_fwd_kernel[grid]( + s, v, states, y, + B, T, S, C, W, + ) + + return y + + # @torch.compiler.disable + @staticmethod + def backward(ctx, grad_output): + grad_output = grad_output.contiguous() + s, 
v, states = ctx.saved_tensors + B, T, S, C = states.shape + W = ctx.W + + # Calculate grid dimensions + grid = lambda meta: (B, T, triton.cdiv(S, meta['BLOCK_SIZE_S']) + triton.cdiv(W, meta['BLOCK_SIZE_W'])) + + gs = torch.zeros_like(s).contiguous() + # for gv we want an additional W at the start of the time dimension bc we can't mask atomic add + gv = torch.zeros((B, T+W-1, C), device=v.device).contiguous() + gst = torch.zeros_like(states).contiguous() + + # Launch kernel + afav_bwd_kernel[grid]( + s, v, states, grad_output, gs, gv, gst, + B, T, S, C, W, + ) + + # No need for the additional W at the start of the time dimension for gv + return gs, gv[:, W-1:].to(s.dtype), gst, None + +# @triton.autotune( +# configs=[ +# triton.Config({ +# 'BLOCK_SIZE': bs, +# 'BLOCK_SIZE_S': bs_s, +# 'BLOCK_SIZE_C': bs_c +# }, num_warps=warps) +# for bs in [16] #, 32, 64] +# for bs_s in [16] #, 32, 64] +# for bs_c in [16] #, 32, 64] +# for warps in [2] # 4, 8] +# ], +# key=[] +# ) +@triton.autotune( + configs=[ + triton.Config({ + 'BLOCK_SIZE': bs, + 'BLOCK_SIZE_S': bs_s, + 'BLOCK_SIZE_C': bs_c + }, num_warps=warps) + for bs in [16, 32, 64] + for bs_s in [8, 16] #, 32, 64] + for bs_c in [16, 32, 64] + for warps in [2, 4, 8] + ], + key=[] +) +@triton.jit +def cg2d_fwd_kernel( + xg_ptr, gi_ptr, + B: tl.constexpr, S: tl.constexpr, C: tl.constexpr, T: tl.constexpr, nstages: tl.constexpr, + BLOCK_SIZE: tl.constexpr, + # Add more constants for tiling + BLOCK_SIZE_S: tl.constexpr, + BLOCK_SIZE_C: tl.constexpr, +): + # Use multiple program IDs for better parallelization + pid = tl.program_id(axis=0) + # Compute batch, spatial, and channel indices + num_s_blocks = tl.cdiv(S, BLOCK_SIZE_S) + num_c_blocks = tl.cdiv(C, BLOCK_SIZE_C) + b = pid // (num_s_blocks * num_c_blocks) + rem = pid % (num_s_blocks * num_c_blocks) + s_block = rem // num_c_blocks + c_block = rem % num_c_blocks + + # Compute actual indices + s_offs = tl.arange(0, BLOCK_SIZE_S) + c_offs = tl.arange(0, BLOCK_SIZE_C) + s_mask = s_offs < (S - s_block * BLOCK_SIZE_S) + c_mask = c_offs < (C - c_block * BLOCK_SIZE_C) + s_offs = s_block * BLOCK_SIZE_S + s_offs + c_offs = c_block * BLOCK_SIZE_C + c_offs + + # Compute base pointers + xg_base = xg_ptr + b * T * S * C + gi_base = gi_ptr + b * T * S + + # Precompute stages for better efficiency + # nstages = tl.ceil(tl.log2(float(T))).to(tl.int32) + offs = tl.arange(0, BLOCK_SIZE) + + for stage in tl.range(nstages): # CHANGE BACK TO tl.static_range() IN FINAL VERSION + group_stride = 1 << stage + # Process multiple elements per thread using BLOCK_SIZE + for block_start in tl.range(0, T//2, BLOCK_SIZE): + block_mask = offs < (T//2 - block_start) + block_s_mask = block_mask[:, None] & s_mask[None, :] + block_s_c_mask = block_mask[:, None, None] & s_mask[None, :, None] & c_mask[None, None, :] + + # Compute indices with vectorization + initial_indices = group_stride + ((offs + block_start) // group_stride) * group_stride * 2 + t_targets = initial_indices + ((offs + block_start) % group_stride) + t_adders = initial_indices - 1 + + xg_targets_ptr = xg_base + t_targets[:, None, None] * S * C + s_offs[None, :, None] * C + c_offs[None, None, :] # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + xg_adders_ptr = xg_base + t_adders[:, None, None] * S * C + s_offs[None, :, None] * C + c_offs[None, None, :] # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + gi_targets_ptr = gi_base + t_targets[:, None] * S + s_offs[None, :] # (BLOCK_SIZE, BLOCK_SIZE_S) + gi_adders_ptr = gi_base + t_adders[:, None] * S + s_offs[None, :] # (BLOCK_SIZE, 
BLOCK_SIZE_S) + + xg_targets = tl.load(xg_targets_ptr, mask=block_s_c_mask) # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + xg_adders = tl.load(xg_adders_ptr, mask=block_s_c_mask) # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + gi_targets = tl.load(gi_targets_ptr, mask=block_s_mask) # (BLOCK_SIZE, BLOCK_SIZE_S) + gi_adders = tl.load(gi_adders_ptr, mask=block_s_mask) # (BLOCK_SIZE, BLOCK_SIZE_S) + + # Compute and store results + xg_targets += xg_adders * gi_targets[:, :, None] # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + # Update gates + gi_targets *= gi_adders # (BLOCK_SIZE, BLOCK_SIZE_S) + + tl.store(xg_targets_ptr, xg_targets.to(xg_targets_ptr.dtype.element_ty), mask=block_s_c_mask) # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + tl.store(gi_targets_ptr, gi_targets.to(gi_targets_ptr.dtype.element_ty), mask=block_s_mask) # (BLOCK_SIZE, BLOCK_SIZE_S) + +# @triton.autotune( +# configs=[ +# triton.Config({ +# 'BLOCK_SIZE': bs, +# 'BLOCK_SIZE_S': bs_s, +# 'BLOCK_SIZE_C': bs_c +# }, num_warps=warps) +# for bs in [16] #, 32, 64] +# for bs_s in [16] #, 32, 64] +# for bs_c in [16] #, 32, 64] +# for warps in [2] #, 32, 64] +# ], +# key=[] +# ) +@triton.autotune( + configs=[ + triton.Config({ + 'BLOCK_SIZE': bs, + 'BLOCK_SIZE_S': bs_s, + 'BLOCK_SIZE_C': bs_c + }, num_warps=warps) + for bs in [16, 32, 64] + for bs_s in [8, 16] #, 32, 64] + for bs_c in [16, 32, 64] + for warps in [2, 4, 8] + ], + key=[] +) +@triton.jit +def cg2d_gxg_bwd_kernel( + gi_ptr, go_ptr, + B: tl.constexpr, S: tl.constexpr, C: tl.constexpr, T: tl.constexpr, nstages: tl.constexpr, + BLOCK_SIZE: tl.constexpr, + BLOCK_SIZE_S: tl.constexpr, + BLOCK_SIZE_C: tl.constexpr, +): + # Similar structure to forward kernel with reversed indices + pid = tl.program_id(axis=0) + num_s_blocks = tl.cdiv(S, BLOCK_SIZE_S) + num_c_blocks = tl.cdiv(C, BLOCK_SIZE_C) + b = pid // (num_s_blocks * num_c_blocks) + rem = pid % (num_s_blocks * num_c_blocks) + s_block = rem // num_c_blocks + c_block = rem % num_c_blocks + + s_offs = tl.arange(0, BLOCK_SIZE_S) + c_offs = tl.arange(0, BLOCK_SIZE_C) + s_mask = s_offs < (S - s_block * BLOCK_SIZE_S) + c_mask = c_offs < (C - c_block * BLOCK_SIZE_C) + s_offs = s_block * BLOCK_SIZE_S + s_offs + c_offs = c_block * BLOCK_SIZE_C + c_offs + + gi_base = gi_ptr + b * T * S + go_base = go_ptr + b * T * S * C + + # nstages = tl.ceil(tl.log2(float(T))).to(tl.int32) + offs = tl.arange(0, BLOCK_SIZE) + + for stage in tl.range(nstages): # CHANGE BACK TO tl.static_range() IN FINAL VERSION + group_stride = 1 << stage + for block_start in tl.range(0, T//2, BLOCK_SIZE): + block_mask = offs < (T//2 - block_start) + block_s_mask = block_mask[:, None] & s_mask[None, :] + block_s_c_mask = block_mask[:, None, None] & s_mask[None, :, None] & c_mask[None, None, :] + + initial_indices = T - 1 - group_stride - ((offs + block_start) // group_stride) * group_stride * 2 + t_targets = initial_indices - ((offs + block_start) % group_stride) + t_adders = initial_indices + 1 + + go_targets_ptr = go_base + t_targets[:, None, None] * S * C + s_offs[None, :, None] * C + c_offs[None, None, :] # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + go_adders_ptr = go_base + t_adders[:, None, None] * S * C + s_offs[None, :, None] * C + c_offs[None, None, :] # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + gi_targets_ptr = gi_base + t_targets[:, None] * S + s_offs[None, :] # (BLOCK_SIZE, BLOCK_SIZE_S) + gi_adders_ptr = gi_base + t_adders[:, None] * S + s_offs[None, :] # (BLOCK_SIZE, BLOCK_SIZE_S) + + # Load with block masking + go_targets = tl.load(go_targets_ptr, 
mask=block_s_c_mask) # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + go_adders = tl.load(go_adders_ptr, mask=block_s_c_mask) # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + gi_targets = tl.load(gi_targets_ptr, mask=block_s_mask) # (BLOCK_SIZE, BLOCK_SIZE_S) + gi_adders = tl.load(gi_adders_ptr, mask=block_s_mask) # (BLOCK_SIZE, BLOCK_SIZE_S) + + # Compute and store results + go_targets += go_adders * gi_targets[:, :, None] # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + gi_targets *= gi_adders # (BLOCK_SIZE, BLOCK_SIZE_S) + + tl.store(go_targets_ptr, go_targets.to(go_targets_ptr.dtype.element_ty), mask=block_s_c_mask) # (BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + tl.store(gi_targets_ptr, gi_targets.to(gi_targets_ptr.dtype.element_ty), mask=block_s_mask) # (BLOCK_SIZE, BLOCK_SIZE_S) + +# @triton.autotune( +# configs=[ +# triton.Config({ +# 'BLOCK_SIZE': bs, +# 'BLOCK_SIZE_S': bs_s, +# 'BLOCK_SIZE_C': bs_c +# }, num_warps=warps) +# for bs in [16] #, 32, 64] +# for bs_s in [16] #, 32, 64] +# for bs_c in [16] #, 32, 64] +# for warps in [2] #, 4, 8] +# ], +# key=[] +# ) +@triton.autotune( + configs=[ + triton.Config({ + 'BLOCK_SIZE': bs, + 'BLOCK_SIZE_S': bs_s, + 'BLOCK_SIZE_C': bs_c + }, num_warps=warps) + for bs in [16, 32, 64] + for bs_s in [8, 16] #, 32, 64] + for bs_c in [16, 32, 64] + for warps in [2, 4, 8] + ], + key=[] +) +@triton.jit +def cg2d_ggi_bwd_kernel( + go_ptr, y_ptr, grad_gi_ptr, + B: tl.constexpr, S: tl.constexpr, C: tl.constexpr, T: tl.constexpr, + BLOCK_SIZE: tl.constexpr, + BLOCK_SIZE_S: tl.constexpr, + BLOCK_SIZE_C: tl.constexpr +): + b = tl.program_id(axis=0) + pid = tl.program_id(axis=1) + num_t_blocks = tl.cdiv(T, BLOCK_SIZE) + num_s_blocks = tl.cdiv(S, BLOCK_SIZE_S) + num_c_blocks = tl.cdiv(C, BLOCK_SIZE_C) + t_block = pid // (num_s_blocks * num_c_blocks) + rem = pid % (num_s_blocks * num_c_blocks) + s_block = rem // num_c_blocks + c_block = rem % num_c_blocks + + t_offs = tl.arange(0, BLOCK_SIZE) + s_offs = tl.arange(0, BLOCK_SIZE_S) + c_offs = tl.arange(0, BLOCK_SIZE_C) + t_mask = t_offs < (T - t_block * BLOCK_SIZE) + s_mask = s_offs < (S - s_block * BLOCK_SIZE_S) + c_mask = c_offs < (C - c_block * BLOCK_SIZE_C) + t_offs = t_block * BLOCK_SIZE + t_offs + s_offs = s_block * BLOCK_SIZE_S + s_offs + c_offs = c_block * BLOCK_SIZE_C + c_offs + + # Compute grad_gi + # torch: + # grad_gi = grad_output * y + # grad_gi = grad_gi.sum(-1) + grad_gi_base = grad_gi_ptr + b * T * S + t_first_id = t_block * BLOCK_SIZE + s_first_id = s_block * BLOCK_SIZE_S + c_first_id = c_block * BLOCK_SIZE_C + # We can use make_block_ptr since the blocks we need are contiguous + go_block_ptr = tl.make_block_ptr( + base=go_ptr, + shape=(B, T, S, C), + strides=(T * S * C, S * C, C, 1), + offsets=(b, t_first_id, s_first_id, c_first_id), + block_shape=(1, BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C), + order=(0, 1, 2, 3) + ) + go_block = tl.load(go_block_ptr) # (1, BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + y_block_ptr = tl.make_block_ptr( + base=y_ptr, + shape=(B, T, S, C), + strides=(T * S * C, S * C, C, 1), + offsets=(b, t_first_id, s_first_id, c_first_id), # y is already shifted to the right by 1 + block_shape=(1, BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C), + order=(0, 1, 2, 3) + ) + y_block = tl.load(y_block_ptr) # (1, BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + + grad_gi = go_block * y_block # (1, BLOCK_SIZE, BLOCK_SIZE_S, BLOCK_SIZE_C) + grad_gi = tl.sum(grad_gi, axis=-1) # (1, BLOCK_SIZE, BLOCK_SIZE_S) + + # Need to use atomic add for accumulation between S blocks, so we also need to use manual pointer bc it's 
what atomic add accepts + grad_gi_block_ptr = grad_gi_base + t_offs[:, None] * S + s_offs[None, :] + grad_gi_mask = t_mask[:, None] & s_mask[None, :] + tl.atomic_add(grad_gi_block_ptr[None, :], grad_gi, mask=grad_gi_mask[None, :]) + +class CumulativeGating2DTriton(torch.autograd.Function): + # @torch.compiler.disable + @staticmethod + def forward(ctx, xg, gi): + xg = xg.contiguous() + gi = gi.contiguous() + orig_gi = gi.clone() + B, T, S, C = xg.shape + + # Calculate grid dimensions + grid = lambda meta: (B * triton.cdiv(S, meta['BLOCK_SIZE_S']) * triton.cdiv(C, meta['BLOCK_SIZE_C']),) + + # Launch kernel + nstages = math.ceil(math.log2(T)) + cg2d_fwd_kernel[grid]( + xg, gi, + B, S, C, T, nstages, + ) + + ctx.save_for_backward(xg, orig_gi) + return xg + + # @torch.compiler.disable + @staticmethod + def backward(ctx, grad_output): + grad_output = grad_output.contiguous() + y, gi = ctx.saved_tensors + B, T, S, C = y.shape + + # Calculate grid dimensions + grid = lambda meta: (B * triton.cdiv(S, meta['BLOCK_SIZE_S']) * triton.cdiv(C, meta['BLOCK_SIZE_C']),) + + gi = torch.cat((gi[:, 1:], torch.ones_like(gi[:, -1:])), dim=1).contiguous() + grad_xg = grad_output.clone() + y = torch.cat((torch.zeros_like(y[:, :1]), y[:, :-1]), dim=1).contiguous() + grad_gi = torch.zeros((B, T, S), device=gi.device).contiguous() # torch.zeros_like(gi) + + # Launch kernel + nstages = math.ceil(math.log2(T)) + cg2d_gxg_bwd_kernel[grid]( + gi, grad_xg, + B, S, C, T, nstages, + ) + + # Launch kernel + grid = lambda meta: (B, triton.cdiv(T, meta['BLOCK_SIZE']) * triton.cdiv(S, meta['BLOCK_SIZE_S']) * triton.cdiv(C, meta['BLOCK_SIZE_C'])) + cg2d_ggi_bwd_kernel[grid]( + grad_xg, y, grad_gi, + B, S, C, T, + ) + + return grad_xg, grad_gi.to(gi.dtype) + +# Parallel Semi-Compressed Attention +def parallel_scan( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + s: torch.Tensor, + g: torch.Tensor, + window_size: int, + num_heads: int, + alibi: torch.Tensor, + mask: torch.Tensor, + scale: Optional[int] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: Optional[bool] = False, + checkpoint_level: Optional[int] = 2, + offsets: Optional[torch.LongTensor] = None, + head_first: Optional[bool] = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, HQ, T, K]` if `head_first=True` else `[B, T, HQ, K]`. + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + GQA is performed if `H` is not equal to `HQ`. + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + s (torch.Tensor): + slot representations of shape `[B, H, T, M]` if `head_first=True` else `[B, T, H, M]`. + g (torch.Tensor): + Forget gates of shape `[B, H, T, M]` applied to keys. + If not provided, this function is equivalent to vanilla ABC. + scale (Optional[int]): + Scale factor for attention scores. + If not provided, it will default to `1 / sqrt(K)`. Default: `None`. + initial_state (Optional[Tuple[torch.Tensor]]): + Initial state tuple having tensors of shape `[N, H, K, M]` and `[N, H, M, V]` for `N` input sequences. + For equal-length input sequences, `N` equals the batch size `B`. + Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state tuple, having tensors of shape `[N, H, K, M]` and `[N, H, M, V]`. + Default: `False`. + checkpoint_level (Optional[int]): + Checkpointing level; higher values will save more memories and do more recomputations during backward. 
+ Default: `2`: + - Level `0`: no memory saved, no recomputation. + - Level `1`: recompute the fp32 cumulative values during backward. + - Level `2`: recompute the fp32 cumulative values and forward hidden states during backward. + offsets (Optional[torch.LongTensor]): + Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch. + For example, + if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively. + If provided, the inputs are concatenated and the batch size `B` is expected to be 1. + Default: `None`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format, which is not supported for variable-length inputs. + Default: `True`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + final_state (Tuple[torch.Tensor]): + Final state tuple having tensors of shape `[N, H, K, M]` and `[N, H, M, V]` if `output_final_state=True`. + `None` otherwise. + + Examples:: + >>> import torch + >>> import torch.nn.functional as F + >>> from einops import rearrange + >>> from fla.ops.gsa import fused_recurrent_gsa + # inputs with equal lengths + >>> B, T, H, K, V, M = 4, 2048, 4, 512, 512, 64 + >>> q = torch.randn(B, T, H, K, device='cuda') + >>> k = torch.randn(B, T, H, K, device='cuda') + >>> v = torch.randn(B, T, H, V, device='cuda') + >>> s = torch.randn(B, T, H, M, device='cuda') + >>> g = F.logsigmoid(torch.randn(B, T, H, M, device='cuda')) + >>> h0 = (torch.randn(B, H, K, M, device='cuda'), torch.randn(B, H, M, V, device='cuda')) + >>> o, (hk, hv) = chunk_gsa(q, k, v, s, g, + initial_state=h0, + output_final_state=True, + head_first=False) + # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required + >>> q, k, v, s, g = map(lambda x: rearrange(x, 'b t h d -> 1 (b t) h d'), (q, k, v, s, g)) + # for a batch with 4 sequences, offsets with 5 start/end positions are expected + >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long) + >>> o_var, (hk_var, hv_var) = chunk_gsa(q, k, v, s, g, + initial_state=h0, + output_final_state=True, + offsets=offsets, + head_first=False) + >>> assert o.allclose(o_var.view(o.shape)) + >>> assert hk.allclose(hk_var) + >>> assert hv.allclose(hv_var) + """ + if offsets is not None: + if q.shape[0] != 1: + raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`." 
+ f"Please flatten variable-length inputs before processing.") + if head_first: + raise RuntimeError("Sequences with variable lengths are not supported for head-first mode") + if initial_state is not None and initial_state[0].shape[0] != len(offsets) - 1: + raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, " + f"i.e., {len(offsets) - 1} rather than {initial_state[0].shape[0]}.") + assert checkpoint_level in [0, 1, 2] + + if scale is None: + scale = q.shape[-1] ** -0.5 + + BH, T, S = g.shape + # Do semi-compressed attention + sg = torch.einsum('bts,btc->btsc', g, s) + gi = 1 - g + states = CumulativeGating2DTriton.apply(sg, gi) # states (B*H, T, S, C) at all time steps + scores = AttendFoldedAllKeysTriton.apply(q, k, states, window_size) * scale # scores (B*H, T, S+W) + # bring back to (B, H, T, S+W) to apply alibi with shape (H, T, S+W) + scores = scores.view(-1, num_heads, T, S + window_size) + alibi[:, :T] + scores = scores.masked_fill(mask[:T] == 0, float('-inf')) + scores = torch.softmax(scores, dim=-1).view(BH, T, S + window_size) + o = AccumulateFoldedAllValuesTriton.apply(scores, v, states, window_size) # outputs (B*H, T, C) + + final_state = None # TODO: fix for inference + return o, final_state diff --git a/fla/ops/simple_gla/README.md b/fla/ops/simple_gla/README.md new file mode 100644 index 0000000000000000000000000000000000000000..2a64f3dcdee7ff9863089a6b47ef694f6234ab8f --- /dev/null +++ b/fla/ops/simple_gla/README.md @@ -0,0 +1,10 @@ +# Simple GLA + +Gating mechanism in [Gated RFA](https://arxiv.org/abs/2103.02143), [Mamba2](https://arxiv.org/abs/2405.21060) and [YOCO](https://arxiv.org/abs/2405.05254) (a.k.a., Gated RetNet). + +Compared to GLA, the gating is head-wise instead of elementwise. +As a result, we can adapt the RetNet kernel for training using matmul w/o numerical instability. +It is faster than GLA but has less expressive power. +I will use it as a baseline for the GLA. + +$S_{t+1} = g_{t+1} \odot S_{t} + K_{t+1} V_{t+1}^{\top}$ where $g$ is a scalar. 
diff --git a/fla/ops/simple_gla/__init__.py b/fla/ops/simple_gla/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..561e3afbf81e8ab1b0fe738e5c5e8d1e1626868e --- /dev/null +++ b/fla/ops/simple_gla/__init__.py @@ -0,0 +1,11 @@ +# -*- coding: utf-8 -*- + +from .chunk import chunk_simple_gla +from .fused_recurrent import fused_recurrent_simple_gla +from .parallel import parallel_simple_gla + +__all__ = [ + 'chunk_simple_gla', + 'fused_recurrent_simple_gla', + 'parallel_simple_gla' +] diff --git a/fla/ops/simple_gla/chunk.py b/fla/ops/simple_gla/chunk.py new file mode 100644 index 0000000000000000000000000000000000000000..4095aee8a4be100c52b294e2c61fd11398bf940f --- /dev/null +++ b/fla/ops/simple_gla/chunk.py @@ -0,0 +1,760 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch +import triton +import triton.language as tl + +from fla.ops.common.chunk_h import chunk_bwd_dh, chunk_fwd_h +from fla.ops.utils import chunk_local_cumsum +from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=4), + ], + key=["BT", "BK", "BV"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_simple_gla_fwd_kernel_o( + q, + k, + v, + h, + g, + o, + offsets, + indices, + scale, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NT: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_tg = i_t + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + NT = tl.cdiv(T, BT) + else: + NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + + o_i = tl.arange(0, BT) + m_s = o_i[:, None] >= o_i[None, :] + + b_o = tl.zeros([BT, BV], dtype=tl.float32) + b_s = tl.zeros([BT, BT], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + (i_bh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_h = tl.make_block_ptr(h + (i_tg * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + # [BT, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BK, BT] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BK, BV] + b_h = tl.load(p_h, boundary_check=(0, 1)) + b_o += tl.dot(b_q, b_h, allow_tf32=False) + b_s += tl.dot(b_q, b_k, allow_tf32=False) + + if HEAD_FIRST: + p_g = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + else: + p_g = tl.make_block_ptr(g + bos * H + i_h, (T,), (H,), (i_t * 
BT,), (BT,), (0,)) + p_v = tl.make_block_ptr(v + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_o = tl.make_block_ptr(o + (bos * H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + b_g = tl.load(p_g, boundary_check=(0,)) + b_o = b_o * tl.exp(b_g)[:, None] + b_s = b_s * tl.exp(b_g[:, None] - b_g[None, :]) + b_s = tl.where(m_s, b_s, 0) + + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_o = (b_o + tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False)) * scale + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8) + ], + key=["BT", "BK", "BV"], +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_simple_gla_bwd_kernel_dqkg( + q, + k, + v, + h, + g, + do, + dh, + dq, + dk, + dg, + offsets, + indices, + scale, + B: tl.constexpr, + T: tl.constexpr, + H: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + USE_OFFSETS: tl.constexpr, + HEAD_FIRST: tl.constexpr +): + i_k, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_tg = i_t + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + all = T + T = eos - bos + NT = tl.cdiv(T, BT) + else: + NT = tl.cdiv(T, BT) + i_tg = i_b * NT + i_t + bos, eos = i_b * T, i_b * T + T + all = B * T + o_i = tl.arange(0, BT) + + if HEAD_FIRST: + p_g = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + b_g_last = tl.load(g + i_bh * T + min(i_t * BT + BT, T) - 1) + else: + p_g = tl.make_block_ptr(g + bos * H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + b_g_last = tl.load(g + (bos + min(i_t * BT + BT, T) - 1) * H + i_h) + b_g = tl.load(p_g, boundary_check=(0,)) + + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + b_ds = tl.zeros([BT, BT], dtype=tl.float32) + b_dg = tl.zeros([BT,], dtype=tl.float32) + b_dg_last = tl.zeros([1,], dtype=tl.float32) + + for i_v in range(tl.cdiv(V, BV)): + if HEAD_FIRST: + p_v = tl.make_block_ptr(v + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + (i_bh * NT + i_t) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + p_dh = tl.make_block_ptr(dh + (i_bh * NT + i_t) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + else: + p_v = tl.make_block_ptr(v + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_do = tl.make_block_ptr(do + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_h = tl.make_block_ptr(h + (i_tg * H + i_h) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + p_dh = tl.make_block_ptr(dh + (i_tg * H + i_h) * K*V, (V, K), (1, V), (i_v * BV, i_k * BK), (BV, BK), (0, 1)) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BV, BK] + b_h = tl.load(p_h, boundary_check=(0, 1)) + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + + b_dg_last += (tl.sum(b_h * b_dh)) + b_ds += tl.dot(b_do, tl.trans(b_v)) + b_dq += tl.dot(b_do, b_h.to(b_do.dtype)) + b_dk += tl.dot(b_v, b_dh.to(b_v.dtype)) + + if HEAD_FIRST: + 
+        p_q = tl.make_block_ptr(q + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dq = tl.make_block_ptr(dq + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dk = tl.make_block_ptr(dk + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dg = tl.make_block_ptr(dg + (i_k*B*H + i_bh) * T, (T,), (1,), (i_t * BT,), (BT,), (0,))
+    else:
+        p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dq = tl.make_block_ptr(dq + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dk = tl.make_block_ptr(dk + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+        p_dg = tl.make_block_ptr(dg + (i_k*all + bos) * H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,))
+    b_q = tl.load(p_q, boundary_check=(0, 1))
+    b_k = tl.load(p_k, boundary_check=(0, 1))
+    b_dg_last *= tl.exp(b_g_last)
+    b_dq = b_dq * tl.exp(b_g)[:, None] * scale
+    b_dk = b_dk * tl.exp(-b_g + b_g_last)[:, None]
+    b_dg_last += tl.sum(b_dk * b_k)
+    b_ds = tl.where(o_i[:, None] >= o_i[None, :], b_ds * scale * tl.exp(b_g[:, None] - b_g[None, :]), 0)
+    b_ds = b_ds.to(b_k.dtype)
+    # [BT, BK]
+    b_dq += tl.dot(b_ds, b_k)
+    b_dk += tl.dot(tl.trans(b_ds), b_q)
+    b_dg += tl.sum(b_q * b_dq - b_k * b_dk, axis=1)
+    # (SY 09/21) revcumsum in a separate kernel due to strange triton compiler issue
+    # b_dg = tl.dot(tl.where(o_i[:, None] <= o_i[None, :], 1., 0.), b_dg, allow_tf32=False) + b_dg_last)
+    b_dg = tl.where(o_i < min(BT, T-i_t*BT) - 1, b_dg, b_dg + b_dg_last)
+    tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1))
+    tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1))
+    tl.store(p_dg, b_dg.to(p_dg.dtype.element_ty), boundary_check=(0,))
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({}, num_warps=1),
+        triton.Config({}, num_warps=2),
+        triton.Config({}, num_warps=4),
+        triton.Config({}, num_warps=8),
+    ],
+    key=["BT", "BK", "BV"],
+)
+@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None})
+@triton.jit
+def chunk_simple_gla_bwd_kernel_dv(
+    q,
+    k,
+    g,
+    do,
+    dv,
+    dh,
+    offsets,
+    indices,
+    scale,
+    T: tl.constexpr,
+    H: tl.constexpr,
+    K: tl.constexpr,
+    V: tl.constexpr,
+    BT: tl.constexpr,
+    BK: tl.constexpr,
+    BV: tl.constexpr,
+    USE_OFFSETS: tl.constexpr,
+    HEAD_FIRST: tl.constexpr
+):
+    i_v, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+    i_b, i_h = i_bh // H, i_bh % H
+    if USE_OFFSETS:
+        i_tg = i_t
+        i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32)
+        bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32)
+        T = eos - bos
+        NT = tl.cdiv(T, BT)
+    else:
+        NT = tl.cdiv(T, BT)
+        i_tg = i_b * NT + i_t
+        bos, eos = i_b * T, i_b * T + T
+
+    if HEAD_FIRST:
+        b_g = tl.load(g + i_bh * T + i_t * BT + tl.arange(0, BT))
+        b_g_last = tl.load(g + i_bh * T + min(i_t * BT + BT, T) - 1)
+    else:
+        b_g = tl.load(g + (bos + i_t * BT + tl.arange(0, BT)) * H + i_h)
+        b_g_last = tl.load(g + (bos + min(i_t * BT + BT, T) - 1) * H + i_h)
+    b_dv = tl.zeros([BT, BV], dtype=tl.float32)
+    for i_k in range(tl.cdiv(K, BK)):
+        if HEAD_FIRST:
+            p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+            p_dh =
tl.make_block_ptr(dh + (i_bh * NT + i_t) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + else: + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dh = tl.make_block_ptr(dh + (i_tg * H + i_h) * K*V, (K, V), (V, 1), (i_k * BK, i_v * BV), (BK, BV), (1, 0)) + + # [BT, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BK, BV] + b_dh = tl.load(p_dh, boundary_check=(0, 1)) + b_dv += tl.dot(b_k, b_dh.to(b_k.dtype)) * tl.exp(-b_g + b_g_last)[:, None] + + b_A = tl.zeros([BT, BT], dtype=tl.float32) + for i_k in range(tl.cdiv(K, BK)): + if HEAD_FIRST: + p_q = tl.make_block_ptr(q + i_bh * T*K, (K, T), (1, K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + i_bh * T*K, (T, K), (K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + else: + p_q = tl.make_block_ptr(q + (bos * H + i_h) * K, (K, T), (1, H*K), (i_k * BK, i_t * BT), (BK, BT), (0, 1)) + p_k = tl.make_block_ptr(k + (bos * H + i_h) * K, (T, K), (H*K, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_A += tl.dot(b_k, b_q, allow_tf32=False) + + mask = (tl.arange(0, BT)[:, None] <= tl.arange(0, BT)[None, :]) & (i_t * BT + tl.arange(0, BT) < T) + b_A = b_A * tl.exp(b_g[None, :] - b_g[:, None]) * scale + b_A = tl.where(mask, b_A, 0).to(do.dtype.element_ty) + + if HEAD_FIRST: + p_do = tl.make_block_ptr(do + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + i_bh * T*V, (T, V), (V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + else: + p_do = tl.make_block_ptr(do + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dv = tl.make_block_ptr(dv + (bos*H + i_h) * V, (T, V), (H*V, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_dv += tl.dot(b_A, b_do) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + +def chunk_simple_gla_fwd_o( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + h: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> torch.Tensor: + if head_first: + B, H, T, K, V = *q.shape, v.shape[-1] + else: + B, T, H, K, V = *q.shape, v.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(triton.next_power_of_2(K), 64) + BV = min(triton.next_power_of_2(V), 64) + NV = triton.cdiv(V, BV) + + o = torch.empty_like(v) + grid = (NV, NT, B * H) + chunk_simple_gla_fwd_kernel_o[grid]( + q, + k, + v, + h, + g, + o, + offsets, + indices, + scale, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + NT=NT, + HEAD_FIRST=head_first + ) + return o + + +def chunk_simple_gla_bwd_dqkg( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + h: torch.Tensor, + do: torch.Tensor, + dh: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + if head_first: + B, H, T, K, V = *k.shape, v.shape[-1] + else: + B, T, 
H, K, V = *k.shape, v.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(triton.next_power_of_2(K), 64) + BV = min(triton.next_power_of_2(V), 64) + NK = triton.cdiv(K, BK) + + dq = torch.empty_like(q) + dk = torch.empty_like(k) + dg = torch.empty(NK, *g.shape, dtype=torch.float32, device=g.device) + grid = (NK, NT, B * H) + chunk_simple_gla_bwd_kernel_dqkg[grid]( + q, + k, + v, + h, + g, + do, + dh, + dq, + dk, + dg, + offsets, + indices, + scale, + B=B, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + HEAD_FIRST=head_first + ) + dg = chunk_local_cumsum(dg.sum(0), chunk_size, reverse=True, offsets=offsets, head_first=head_first) + return dq, dk, dg + + +def chunk_simple_gla_bwd_dv( + q: torch.Tensor, + k: torch.Tensor, + g: torch.Tensor, + do: torch.Tensor, + dh: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> torch.Tensor: + if head_first: + B, H, T, K, V = *k.shape, do.shape[-1] + else: + B, T, H, K, V = *k.shape, do.shape[-1] + BT = min(chunk_size, triton.next_power_of_2(T)) + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + NT = len(indices) + BK = min(triton.next_power_of_2(K), 64) + BV = min(triton.next_power_of_2(V), 64) + NV = triton.cdiv(V, BV) + + dv = torch.empty_like(do) + grid = (NV, NT, B * H) + chunk_simple_gla_bwd_kernel_dv[grid]( + q, + k, + g, + do, + dv, + dh, + offsets, + indices, + scale, + T=T, + H=H, + K=K, + V=V, + BT=BT, + BK=BK, + BV=BV, + HEAD_FIRST=head_first + ) + return dv + + +def chunk_simple_gla_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + scale: float, + initial_state: torch.Tensor, + output_final_state: bool, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + g = chunk_local_cumsum(g, chunk_size, offsets=offsets, head_first=head_first) + h, ht = chunk_fwd_h( + k=k, + v=v, + g=g, + gk=None, + gv=None, + h0=initial_state, + output_final_state=output_final_state, + states_in_fp32=False, + offsets=offsets, + head_first=head_first, + chunk_size=chunk_size + ) + o = chunk_simple_gla_fwd_o( + q=q, + k=k, + v=v, + g=g, + h=h, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return g, o, ht + + +def chunk_simple_gla_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + initial_state: torch.Tensor, + do: torch.Tensor, + dht: torch.Tensor, + scale: float, + offsets: Optional[torch.LongTensor] = None, + indices: Optional[torch.LongTensor] = None, + head_first: bool = True, + chunk_size: int = 64 +) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]: + # (SY 09/22) states_in_fp32 seems not affecting the error of dg but for safety, set to True + h, _ = chunk_fwd_h( + k=k, + v=v, + g=g, + gk=None, + gv=None, + h0=initial_state, + output_final_state=False, + 
states_in_fp32=True, + offsets=offsets, + head_first=head_first, + chunk_size=chunk_size + ) + dh, dh0 = chunk_bwd_dh( + q=q, + k=k, + v=v, + g=g, + gk=None, + gv=None, + do=do, + h0=initial_state, + dht=dht, + scale=scale, + states_in_fp32=True, + offsets=offsets, + head_first=head_first, + chunk_size=chunk_size + ) + dq, dk, dg = chunk_simple_gla_bwd_dqkg( + q=q, + k=k, + v=v, + g=g, + h=h, + do=do, + dh=dh, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + dv = chunk_simple_gla_bwd_dv( + q=q, + k=k, + g=g, + do=do, + dh=dh, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return dq, dk, dv, dg, dh0 + + +class ChunkSimpleGLAFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward( + ctx, + q, + k, + v, + g, + scale, + initial_state, + output_final_state, + offsets, + head_first + ): + T = q.shape[2] if head_first else q.shape[1] + chunk_size = min(64, triton.next_power_of_2(T)) + + # 2-d indices denoting the offsets of chunks in each sequence + # for example, if the passed `offsets` is [0, 100, 356] and `chunk_size` is 64, + # then there are 2 and 4 chunks in the 1st and 2nd sequences respectively, and `indices` will be + # [[0, 0], [0, 1], [1, 0], [1, 1], [1, 2], [1, 3]] + indices = None + if offsets is not None: + indices = torch.cat([torch.arange(n) for n in triton.cdiv(offsets[1:] - offsets[:-1], chunk_size).tolist()]) + indices = torch.stack([indices.eq(0).cumsum(0) - 1, indices], 1).to(offsets) + + g, o, ht = chunk_simple_gla_fwd( + q=q, + k=k, + v=v, + g=g, + scale=scale, + initial_state=initial_state, + output_final_state=output_final_state, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + ctx.save_for_backward(q, k, v, g, initial_state) + ctx.chunk_size = chunk_size + ctx.scale = scale + ctx.offsets = offsets + ctx.indices = indices + ctx.head_first = head_first + return o.to(q.dtype), ht + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, dht): + chunk_size, scale, offsets, indices, head_first = ctx.chunk_size, ctx.scale, ctx.offsets, ctx.indices, ctx.head_first + q, k, v, g, initial_state = ctx.saved_tensors + dq, dk, dv, dg, dh0 = chunk_simple_gla_bwd( + q=q, + k=k, + v=v, + g=g, + initial_state=initial_state, + do=do, + dht=dht, + scale=scale, + offsets=offsets, + indices=indices, + head_first=head_first, + chunk_size=chunk_size + ) + return dq.to(q.dtype), dk.to(k.dtype), dv.to(v.dtype), dg.to(g.dtype), None, dh0, None, None, None + + +def chunk_simple_gla( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, # log decay + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + g (torch.Tensor): + Forget gates of shape `[B, H, T]` if `head_first=True` else `[B, T, H]`. + Compared to GLA, the gating is head-wise instead of elementwise. + scale (Optional[int]): + Scale factor for the attention scores. + If not provided, it will default to `1 / sqrt(K)`. 
Default: `None`. + initial_state (Optional[torch.Tensor]): + Initial state of shape `[N, H, K, V]` for `N` input sequences. + For equal-length input sequences, `N` equals the batch size `B`. + Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state of shape `[N, H, K, V]`. Default: `False`. + offsets (Optional[torch.LongTensor]): + Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch. + For example, + if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively. + If provided, the inputs are concatenated and the batch size `B` is expected to be 1. + Default: `None`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format, which is not supported for variable-length inputs. + Default: `True`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + final_state (torch.Tensor): + Final state of shape `[N, H, K, V]` if `output_final_state=True` else `None`. + + Examples:: + >>> import torch + >>> import torch.nn.functional as F + >>> from einops import rearrange + >>> from fla.ops.simple_gla import chunk_simple_gla + # inputs with equal lengths + >>> B, T, H, K, V = 4, 2048, 4, 512, 512 + >>> q = torch.randn(B, T, H, K, device='cuda') + >>> k = torch.randn(B, T, H, K, device='cuda') + >>> v = torch.randn(B, T, H, V, device='cuda') + >>> g = F.logsigmoid(torch.randn(B, T, H, device='cuda')) + >>> o, ht = chunk_simple_gla(q, k, v, g, + initial_state=None, + output_final_state=True, + head_first=False) + # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required + >>> q, k, v, g = map(lambda x: rearrange(x, 'b t ... -> 1 (b t) ...'), (q, k, v, g)) + # for a batch with 4 sequences, offsets with 5 start/end positions are expected + >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long) + >>> o_var, ht_var = chunk_simple_gla(q, k, v, g, + initial_state=None, + output_final_state=True, + offsets=offsets, + head_first=False) + >>> assert o.allclose(o_var.view(o.shape)) + >>> assert ht.allclose(ht_var) + """ + if offsets is not None: + if q.shape[0] != 1: + raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`." 
+ f"Please flatten variable-length inputs before processing.") + if head_first: + raise RuntimeError("Sequences with variable lengths are not supported for head-first mode") + if initial_state is not None and initial_state.shape[0] != len(offsets) - 1: + raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, " + f"i.e., {len(offsets) - 1} rather than {initial_state.shape[0]}.") + if scale is None: + scale = k.shape[-1] ** -0.5 + o, final_state = ChunkSimpleGLAFunction.apply( + q, + k, + v, + g, + scale, + initial_state, + output_final_state, + offsets, + head_first + ) + return o, final_state diff --git a/fla/ops/simple_gla/fused_recurrent.py b/fla/ops/simple_gla/fused_recurrent.py new file mode 100644 index 0000000000000000000000000000000000000000..a155609360f0835a91d78e79113e711284c2277a --- /dev/null +++ b/fla/ops/simple_gla/fused_recurrent.py @@ -0,0 +1,112 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2024, Songlin Yang, Yu Zhang + +from typing import Optional, Tuple + +import torch + +from fla.ops.common.fused_recurrent import fused_recurrent + + +def fused_recurrent_simple_gla( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + scale: Optional[float] = None, + initial_state: Optional[torch.Tensor] = None, + output_final_state: bool = False, + reverse: bool = False, + offsets: Optional[torch.LongTensor] = None, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + k (torch.Tensor): + keys of shape `[B, H, T, K]` if `head_first=True` else `[B, T, H, K]`. + v (torch.Tensor): + values of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + g (torch.Tensor): + Forget gates of shape `[B, H, T]` if `head_first=True` else `[B, T, H]`. + Compared to GLA, the gating is head-wise instead of elementwise. + scale (Optional[int]): + Scale factor for the attention scores. + If not provided, it will default to `1 / sqrt(K)`. Default: `None`. + initial_state (Optional[torch.Tensor]): + Initial state of shape `[N, H, K, V]` for `N` input sequences. + For equal-length input sequences, `N` equals the batch size `B`. + Default: `None`. + output_final_state (Optional[bool]): + Whether to output the final state of shape `[N, H, K, V]`. Default: `False`. + reverse (Optional[bool]): + If `True`, process the state passing in reverse order. Default: `False`. + offsets (Optional[torch.LongTensor]): + Offsets of shape `[N+1]` defining the bos/eos positions of `N` variable-length sequences in the batch. + For example, + if `offsets` is `[0, 1, 3, 6, 10, 15]`, there are `N=5` sequences with lengths 1, 2, 3, 4 and 5 respectively. + If provided, the inputs are concatenated and the batch size `B` is expected to be 1. + Default: `None`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format, which is not supported for variable-length inputs. + Default: `True`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + final_state (torch.Tensor): + Final state of shape `[N, H, K, V]` if `output_final_state=True` else `None`. 
+
+    Examples::
+        >>> import torch
+        >>> import torch.nn.functional as F
+        >>> from einops import rearrange
+        >>> from fla.ops.simple_gla import fused_recurrent_simple_gla
+        # inputs with equal lengths
+        >>> B, T, H, K, V = 4, 2048, 4, 512, 512
+        >>> q = torch.randn(B, T, H, K, device='cuda')
+        >>> k = torch.randn(B, T, H, K, device='cuda')
+        >>> v = torch.randn(B, T, H, V, device='cuda')
+        >>> g = F.logsigmoid(torch.randn(B, T, H, device='cuda'))
+        >>> h0 = torch.randn(B, H, K, V, device='cuda')
+        >>> o, ht = fused_recurrent_simple_gla(q, k, v, g,
                                               initial_state=h0,
                                               output_final_state=True,
                                               head_first=False)
+        # for variable-length inputs, the batch size `B` is expected to be 1 and `offsets` is required
+        >>> q, k, v, g = map(lambda x: rearrange(x, 'b t ... -> 1 (b t) ...'), (q, k, v, g))
+        # for a batch with 4 sequences, offsets with 5 start/end positions are expected
+        >>> offsets = q.new_tensor([0, 2048, 4096, 6144, 8192], dtype=torch.long)
+        >>> o_var, ht_var = fused_recurrent_simple_gla(q, k, v, g,
                                                       initial_state=h0,
                                                       output_final_state=True,
                                                       offsets=offsets,
                                                       head_first=False)
+        >>> assert o.allclose(o_var.view(o.shape))
+        >>> assert ht.allclose(ht_var)
+    """
+    if offsets is not None:
+        if q.shape[0] != 1:
+            raise ValueError(f"The batch size is expected to be 1 rather than {q.shape[0]} when using `offsets`. "
+                             f"Please flatten variable-length inputs before processing.")
+        if head_first:
+            raise RuntimeError("Sequences with variable lengths are not supported for head-first mode")
+        if initial_state is not None and initial_state.shape[0] != len(offsets) - 1:
+            raise ValueError(f"The number of initial states is expected to be equal to the number of input sequences, "
+                             f"i.e., {len(offsets) - 1} rather than {initial_state.shape[0]}.")
+    if scale is None:
+        scale = k.shape[-1] ** -0.5
+    o, final_state = fused_recurrent(
+        q=q,
+        k=k,
+        v=v,
+        g=g,
+        scale=scale,
+        initial_state=initial_state,
+        output_final_state=output_final_state,
+        reverse=reverse,
+        offsets=offsets,
+        head_first=head_first
+    )
+    return o, final_state
diff --git a/fla/ops/simple_gla/naive.py b/fla/ops/simple_gla/naive.py
new file mode 100644
index 0000000000000000000000000000000000000000..5fcc96ebeb720cc8b9699793ee6bdf8d3d39fdaa
--- /dev/null
+++ b/fla/ops/simple_gla/naive.py
@@ -0,0 +1,54 @@
+# -*- coding: utf-8 -*-
+
+import torch
+from einops import rearrange
+
+
+def torch_simple_gla(q, k, v, g, chunk_size=64, scale=None):
+    if scale is None:
+        scale = (q.shape[-1] ** -0.5)
+    q = rearrange(q, 'b h (n c) d -> b h n c d', c=chunk_size) * scale
+    k = rearrange(k, 'b h (n c) d -> b h n c d', c=chunk_size)
+    v = rearrange(v, 'b h (n c) d -> b h n c d', c=chunk_size)
+    g = rearrange(g, 'b h (n c) -> b h n c', c=chunk_size)
+    g = g.cumsum(-1)
+    kv = k.transpose(-1, -2) @ (v * (-g + g[:, :, :, -1, None]).exp()[..., None])
+    S = torch.zeros_like(kv)
+
+    for i in range(1, g.shape[-2]):
+        S[:, :, i] = S[:, :, i-1].clone() * g[:, :, i-1, -1, None, None].exp() + kv[:, :, i-1]
+
+    inter = (q * g[..., None].exp()) @ S
+    attn = q @ k.transpose(-1, -2)
+    attn = attn * (g[..., None] - g[..., None, :]).exp()
+    attn = attn.masked_fill(torch.triu(torch.ones(chunk_size, chunk_size, dtype=bool, device=q.device), diagonal=1), 0)
+    intra = attn @ v
+    o = inter + intra
+    return rearrange(o, 'b h n c d -> b h (n c) d')
+
+
+def torch_simple_gla_recurrent(q, k, v, g, scale=None, initial_state=None, output_final_state=True):
+    B, H, T, DK = q.shape
+    original_dtype = q.dtype
+    q, k, v, g = q.float(), k.float(), v.float(), g.float()
+    if scale is None:
+        scale = DK ** -0.5
+    q = q * scale
+    _, _, _, DV = v.shape
+    if initial_state is None:
+        # keep the running state on the same device/dtype as the (float) inputs
+        S = torch.zeros(B, H, DK, DV).to(q)
+    else:
+        S = initial_state
+    o = torch.zeros(B, H, T, DV).to(q)
+    for i in range(T):
+        gate = g[:, :, i].exp()
+        key = k[:, :, i]
+        value = v[:, :, i]
+        kv = key.unsqueeze(-1) * value.unsqueeze(-2)
+        S = S.clone() * gate.unsqueeze(-1).unsqueeze(-1) + kv
+        q_i = q[:, :, i, :]
+        o_i = (q_i.unsqueeze(-1) * S).sum(-2)
+        o[:, :, i] = o_i
+    if not output_final_state:
+        S = None
+    return o.to(original_dtype), S
diff --git a/fla/ops/simple_gla/parallel.py b/fla/ops/simple_gla/parallel.py
new file mode 100644
index 0000000000000000000000000000000000000000..00c21469677f151fdfa4c9777288615c18bfad30
--- /dev/null
+++ b/fla/ops/simple_gla/parallel.py
@@ -0,0 +1,607 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2024, Songlin Yang, Yu Zhang
+
+from typing import Tuple
+
+import torch
+import triton
+import triton.language as tl
+
+from fla.ops.utils import chunk_global_cumsum, chunk_local_cumsum
+from fla.utils import autocast_custom_bwd, autocast_custom_fwd, contiguous
+
+
+@triton.heuristics({
+    'NV': lambda args: triton.cdiv(args['V'], args['BV']),
+    'OUTPUT_ATTENTIONS': lambda args: args['attn'] is not None
+})
+@triton.jit
+def parallel_simple_gla_fwd_kernel(
+    q,
+    k,
+    v,
+    g,
+    o,
+    attn,
+    s_k_h,
+    s_k_t,
+    s_v_h,
+    s_v_t,
+    scale,
+    B: tl.constexpr,
+    H: tl.constexpr,
+    T: tl.constexpr,
+    K: tl.constexpr,
+    V: tl.constexpr,
+    BT: tl.constexpr,
+    BS: tl.constexpr,
+    BK: tl.constexpr,
+    BV: tl.constexpr,
+    NV: tl.constexpr,
+    OUTPUT_ATTENTIONS: tl.constexpr
+):
+    i_kv, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2)
+    i_k, i_v = i_kv // NV, i_kv % NV
+
+    p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0))
+    if OUTPUT_ATTENTIONS:
+        p_a = tl.make_block_ptr(attn + (i_k * B * H + i_bh) * T * T, (T, T), (T, 1), (i_t * BT, 0), (BT, BS), (1, 0))
+
+    # the Q block is kept in the shared memory throughout the whole kernel
+    # [BT, BK]
+    b_q = tl.load(p_q, boundary_check=(0, 1))
+    b_q = (b_q * scale).to(b_q.dtype)
+
+    b_o = tl.zeros([BT, BV], dtype=tl.float32)
+    # Q block and K block have no overlap
+    # no need for mask, thereby saving flops
+    for i_s in range(0, i_t * BT, BS):
+        p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (1, s_k_t), (i_k * BK, i_s), (BK, BS), (0, 1))
+        p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, 1), (i_s, i_v * BV), (BS, BV), (1, 0))
+        p_g = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_s,), (BS,), (0,))
+        # [BK, BS]
+        b_k = tl.load(p_k, boundary_check=(0, 1))
+        # [BS, BV]
+        b_v = tl.load(p_v, boundary_check=(0, 1))
+        # [BS,]
+        b_g = tl.load(p_g, boundary_check=(0,))
+
+        b_gn = tl.load(g + i_bh * T + min(i_s + BS, T) - 1)
+        b_gp = tl.load(g + i_bh * T + i_s - 1) if i_s % BT > 0 else 0.
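For orientation, the inter-chunk accumulation here (keys rescaled by `exp(b_gn - b_g)`, the partial output carried forward with `exp(b_gn - b_gp)`) realizes a decay-weighted causal attention. A minimal single-head PyTorch sketch of the same operator, assuming `g` holds a *global* cumulative log-decay (the kernel instead keeps chunk-local cumsums and stitches blocks together with the rescaling above); the helper name is illustrative:

```python
import torch

def simple_gla_attn_ref(q, k, v, g, scale):
    # q, k: [T, K]; v: [T, V]; g: [T] cumulative log-decay
    # A[i, j] = (q_i · k_j) * scale * exp(g_i - g_j) for j <= i
    A = (q @ k.t()) * scale * (g[:, None] - g[None, :]).exp()
    return A.tril() @ v  # causal mask, then decay-weighted sum of values
```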
+ + b_kg = (b_k * tl.exp(b_gn - b_g)).to(b_k.dtype) + # [BT, BS] + b_s = tl.dot(b_q, b_kg, allow_tf32=False) + # do this check to avoid some layout bugs + # [[BT, BV] + if i_s > 0: + b_o = b_o * tl.exp(b_gn - b_gp) + b_o += tl.dot(b_s.to(b_v.dtype), b_v, allow_tf32=False) + + if OUTPUT_ATTENTIONS: + tl.store(p_a, b_s.to(p_a.dtype.element_ty), boundary_check=(0, 1)) + p_a = tl.advance(p_a, (0, BS)) + + tl.debug_barrier() + + p_g = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + # [BT,] + b_gq = tl.load(p_g, boundary_check=(0,)) + # rescale interchunk output + b_o *= tl.exp(b_gq)[:, None] + + if OUTPUT_ATTENTIONS: + p_a = tl.make_block_ptr(attn + (i_k * B * H + i_bh) * T * T, (T, T), (T, 1), (i_t * BT, i_t * BT), (BT, BS), (1, 0)) + + # [BT] + o_q = i_t * BT + tl.arange(0, BT) + # [BS] + o_k = i_t * BT + tl.arange(0, BS) + # Q block and K block have overlap. + # masks required + for i_s in range(i_t * BT, min((i_t + 1) * BT, T), BS): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (K, T), (1, s_k_t), (i_k * BK, i_s), (BK, BS), (0, 1)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, 1), (i_s, i_v * BV), (BS, BV), (1, 0)) + p_gk = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_s,), (BS,), (0,)) + # [BK, BS] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BS, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BS,] + b_gk = tl.load(p_gk, boundary_check=(0,)) + # [BT, BS] + m_s = o_q[:, None] >= o_k[None, :] + b_s = tl.where(m_s, tl.dot(b_q, b_k, allow_tf32=False) * tl.exp(b_gq[:, None] - b_gk[None, :]), 0) + # [BT, BV] + b_o += tl.dot(b_s.to(b_q.dtype), b_v, allow_tf32=False) + + if OUTPUT_ATTENTIONS: + tl.store(p_a, b_s.to(p_a.dtype.element_ty), boundary_check=(0, 1)) + p_a = tl.advance(p_a, (0, BS)) + o_k += BS + + p_o = tl.make_block_ptr(o + (i_bh + B * H * i_k) * s_v_h, (T, V), (s_v_t, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.jit +def parallel_simple_gla_bwd_kernel_dq( + i_bh, + i_t, + i_k, + i_v, + q, + k, + v, + g, + do, + dq, + dg, + s_k_h, + s_k_t, + s_v_h, + s_v_t, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, +): + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + # [BT, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BT, BK] + b_dq = tl.zeros([BT, BK], dtype=tl.float32) + + for i_s in range(0, i_t * BT, BS): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, 1), (i_s, i_k * BK), (BS, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), (1, s_v_t), (i_v * BV, i_s), (BV, BS), (0, 1)) + p_g = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_s,), (BS,), (0,)) + # [BS, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BV, BS] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BS] + b_g = tl.load(p_g, boundary_check=(0,)) + + b_gn = tl.load(g + i_bh * T + min(i_s + BS, T) - 1) + b_gp = tl.load(g + i_bh * T + i_s - 1) if i_s % BT > 0 else 0. 
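The two loops in this kernel split one dense gradient between non-overlapping and overlapping key blocks. As a sanity reference, the full `dq` in one shot (single head, same hypothetical global-cumsum `g` convention as in the forward sketch above):

```python
import torch

def simple_gla_dq_ref(k, v, g, do, scale):
    # dq[i] = sum_{j<=i} (do_i · v_j) * exp(g_i - g_j) * scale * k_j
    dA = (do @ v.t()) * (g[:, None] - g[None, :]).exp() * scale
    return dA.tril() @ k
```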
+ # [BT, BS] + b_ds = tl.dot(b_do, b_v, allow_tf32=False) * tl.exp(b_gn - b_g)[None, :] + # [BT, BK] + if i_s > 0: + b_dq *= tl.exp(b_gn - b_gp) + b_dq += tl.dot(b_ds.to(b_v.dtype), b_k, allow_tf32=False) + + p_gq = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + # [BT,] + b_gq = tl.load(p_gq, boundary_check=(0,)) + # [BT, BK] + b_dq *= tl.exp(b_gq)[:, None] * scale + + # [BT] + o_q = i_t * BT + tl.arange(0, BT) + # [BS] + o_k = i_t * BT + tl.arange(0, BS) + # Q block and K block have overlap. masks required + for i_s in range(i_t * BT, min((i_t + 1) * BT, T), BS): + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, 1), (i_s, i_k * BK), (BS, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (V, T), (1, s_v_t), (i_v * BV, i_s), (BV, BS), (0, 1)) + p_gk = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_s,), (BS,), (0,)) + # [BS, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + # [BV, BS] + b_v = tl.load(p_v, boundary_check=(0, 1)) + # [BS] + b_gk = tl.load(p_gk, boundary_check=(0,)) + # [BT, BS] + m_s = o_q[:, None] >= o_k[None, :] + b_ds = tl.where(m_s, tl.dot(b_do, b_v, allow_tf32=False) * tl.exp((b_gq[:, None] - b_gk[None, :])), 0) * scale + # [BT, BK] + b_dq += tl.dot(b_ds.to(b_k.dtype), b_k, allow_tf32=False) + + o_k += BS + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dq = tl.make_block_ptr(dq + (i_v * B * H + i_bh) * s_k_h, (T, K), (s_k_t, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dg = tl.make_block_ptr(dg + (i_v * B * H + i_bh) * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + + b_q = tl.load(p_q, boundary_check=(0, 1)) + b_dg = tl.sum(b_dq * b_q, 1) + tl.store(p_dq, b_dq.to(p_dq.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dg, b_dg.to(p_dg.dtype.element_ty), boundary_check=(0,)) + + +@triton.jit +def parallel_simple_gla_bwd_kernel_dkv( + i_bh, + i_t, + i_k, + i_v, + q, + k, + v, + g, + do, + dk, + dv, + dg, + s_k_h, + s_k_t, + s_v_h, + s_v_t, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, +): + # compute dk dv + p_k = tl.make_block_ptr(k + i_bh * s_k_h, (T, K), (s_k_t, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_v = tl.make_block_ptr(v + i_bh * s_v_h, (T, V), (s_v_t, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_gk = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + # [BT, BK] + b_k = tl.load(p_k, boundary_check=(0, 1)) + b_dk = tl.zeros([BT, BK], dtype=tl.float32) + # [BT, BV] + b_v = tl.load(p_v, boundary_check=(0, 1)) + b_dv = tl.zeros([BT, BV], dtype=tl.float32) + # [BT,] + b_gk = tl.load(p_gk, boundary_check=(0,)) + + NTS = tl.cdiv(T, BS) + # [BT, BK] + b_kg = (b_k * tl.exp(tl.load(g + i_bh * T + min(i_t * BT + BT, T) - 1) - b_gk)[:, None]).to(b_k.dtype) + + for i_s in range(NTS * BS - BS, (i_t + 1) * BT - BS, -BS): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, 1), (i_s, i_k * BK), (BS, BK), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, 1), (i_s, i_v * BV), (BS, BV), (1, 0)) + p_gq = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_s,), (BS,), (0,)) + # [BS, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BS,] + b_gq = tl.load(p_gq, boundary_check=(0,)) + + b_gp = tl.load(g + i_bh * T + min(i_s + BS, T) - 1) + b_gn = tl.load(g + i_bh * T + i_s - 1) if i_s % BT > 0 else 0. 
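The `dk` and `dv` accumulated here are, in dense form, the transposed-side gradients of the decay-weighted score matrix; a compact single-head sketch under the same global-cumsum convention (names illustrative):

```python
import torch

def simple_gla_dkv_ref(q, k, v, g, do, scale):
    # D[i, j] = scale * exp(g_i - g_j) for j <= i, else 0
    D = ((g[:, None] - g[None, :]).exp() * scale).tril()
    dk = ((do @ v.t()) * D).t() @ q   # dk_j = sum_{i>=j} D[i,j] (do_i·v_j) q_i
    dv = ((q @ k.t()) * D).t() @ do   # dv_j = sum_{i>=j} D[i,j] (q_i·k_j) do_i
    return dk, dv
```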
+ # [BS, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + b_do = (b_do * tl.exp(b_gq - b_gn)[:, None]).to(b_do.dtype) + + # overall decay rate for an entire block + # [BS, BK] + b_dk *= tl.exp(b_gp - b_gn) + # [BS, BV] + b_dv *= tl.exp(b_gp - b_gn) + # [BT, BS] + b_ds = tl.dot(b_v, tl.trans(b_do), allow_tf32=False) + b_s = tl.dot(b_kg, tl.trans(b_q), allow_tf32=False) + # [BT, BK] + b_dk += tl.dot(b_ds.to(b_q.dtype), b_q, allow_tf32=False) + # [BT, BV] + b_dv += tl.dot(b_s.to(b_do.dtype), b_do, allow_tf32=False) + + # [BT, BK] + b_dk *= tl.exp(tl.load(g + i_bh * T + min(T, i_t * BT + BT) - 1) - b_gk)[:, None] * scale + # [BT, BV] + b_dv *= scale + + tl.debug_barrier() + o_q = i_t * BT + tl.arange(0, BS) + o_k = i_t * BT + tl.arange(0, BT) + for i_s in range(i_t * BT, min((i_t + 1) * BT, T), BS): + p_q = tl.make_block_ptr(q + i_bh * s_k_h, (T, K), (s_k_t, 1), (i_s, i_k * BK), (BS, BK), (1, 0)) + p_do = tl.make_block_ptr(do + i_bh * s_v_h, (T, V), (s_v_t, 1), (i_s, i_v * BV), (BS, BV), (1, 0)) + p_gq = tl.make_block_ptr(g + i_bh * T, (T,), (1,), (i_s,), (BS,), (0,)) + # [BS, BK] + b_q = tl.load(p_q, boundary_check=(0, 1)) + # [BS, BV] + b_do = tl.load(p_do, boundary_check=(0, 1)) + # [BS] + b_gq = tl.load(p_gq, boundary_check=(0,)) + # [BT, BS] + m_s = o_k[:, None] <= o_q[None, :] + d_s = tl.where(m_s, tl.exp(-b_gk[:, None] + b_gq[None, :]), 0) * scale + + b_ds = tl.dot(b_v, tl.trans(b_do), allow_tf32=False) * d_s + b_s = tl.dot(b_k, tl.trans(b_q), allow_tf32=False) * d_s + # [BT, BK] + b_dk += tl.dot(b_ds.to(b_q.dtype), b_q, allow_tf32=False) + b_dv += tl.dot(b_s.to(b_q.dtype), b_do, allow_tf32=False) + o_q += BS + p_dk = tl.make_block_ptr(dk + (i_v * B * H + i_bh) * s_k_h, (T, K), (s_k_t, 1), (i_t * BT, i_k * BK), (BT, BK), (1, 0)) + p_dv = tl.make_block_ptr(dv + (i_k * B * H + i_bh) * s_v_h, (T, V), (s_v_t, 1), (i_t * BT, i_v * BV), (BT, BV), (1, 0)) + p_dg = tl.make_block_ptr(dg + (i_v * B * H + i_bh) * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + tl.store(p_dk, b_dk.to(p_dk.dtype.element_ty), boundary_check=(0, 1)) + tl.store(p_dv, b_dv.to(p_dv.dtype.element_ty), boundary_check=(0, 1)) + + b_dg = tl.load(p_dg, boundary_check=(0,)) + b_dg -= tl.sum(b_dk * b_k, 1) + tl.store(p_dg, b_dg.to(p_dg.dtype.element_ty), boundary_check=(0,)) + + +@triton.heuristics({ + 'NV': lambda args: triton.cdiv(args['V'], args['BV']) +}) +@triton.jit +def parallel_simple_gla_bwd_kernel( + q, + k, + v, + g, + do, + dq, + dk, + dv, + dg, + s_k_h, + s_k_t, + s_v_h, + s_v_t, + scale, + B: tl.constexpr, + H: tl.constexpr, + T: tl.constexpr, + K: tl.constexpr, + V: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + BK: tl.constexpr, + BV: tl.constexpr, + NV: tl.constexpr +): + i_kv, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_k, i_v = i_kv // NV, i_kv % NV + + parallel_simple_gla_bwd_kernel_dq( + i_bh, + i_t, + i_k, + i_v, + q, + k, + v, + g, + do, + dq, + dg, + s_k_h, + s_k_t, + s_v_h, + s_v_t, + scale, + B=B, + H=H, + T=T, + K=K, + V=V, + BT=BT, + BS=BS, + BK=BK, + BV=BV + ) + tl.debug_barrier() + parallel_simple_gla_bwd_kernel_dkv( + i_bh, + i_t, + i_k, + i_v, + q, + k, + v, + g, + do, + dk, + dv, + dg, + s_k_h, + s_k_t, + s_v_h, + s_v_t, + scale, + B, + H, + T, + K, + V, + BT, + BS, + BK, + BV + ) + + +def parallel_simple_gla_fwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + scale: float, + output_attentions: bool = False, + chunk_size: int = 128 +): + B, H, T, K, V = *k.shape, v.shape[-1] + BT, BS = chunk_size, 32 + if 
torch.cuda.get_device_capability()[0] >= 9: + BK = min(256, triton.next_power_of_2(K)) + BV = min(256, triton.next_power_of_2(V)) + else: + BK = min(128, triton.next_power_of_2(K)) + BV = min(128, triton.next_power_of_2(V)) + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + assert BT % BS == 0 + + num_stages = 3 if K <= 64 else 2 + num_warps = 4 + + # local cumulative decay in log space + g = chunk_local_cumsum(g, BT) + + grid = (NK * NV, triton.cdiv(T, BT), B * H) + o = torch.empty(NK, B, H, T, V, dtype=q.dtype, device=q.device) + attn = q.new_zeros(NK, B, H, T, T) if output_attentions else None + parallel_simple_gla_fwd_kernel[grid]( + q=q, + k=k, + v=v, + g=g, + o=o, + attn=attn, + s_k_h=k.stride(1), + s_k_t=k.stride(2), + s_v_h=v.stride(1), + s_v_t=v.stride(2), + scale=scale, + B=B, + H=H, + T=T, + K=K, + V=V, + BT=BT, + BS=BS, + BK=BK, + BV=BV, + num_stages=num_stages, + num_warps=num_warps + ) + o = o.sum(0) + if output_attentions: + attn = attn.sum(0) + return o, g, attn + + +def parallel_simple_gla_bwd( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + do: torch.Tensor, + scale: float, + chunk_size: int = 128 +): + B, H, T, K, V = *k.shape, v.shape[-1] + BT, BS = chunk_size, 32 + BK = min(128, triton.next_power_of_2(k.shape[-1])) + BV = min(128, triton.next_power_of_2(v.shape[-1])) + NK = triton.cdiv(K, BK) + NV = triton.cdiv(V, BV) + assert BT % BS == 0 + + num_stages = 3 if K <= 64 else 2 + num_warps = 4 + + dq = torch.empty(NV, B, H, T, K, dtype=q.dtype, device=q.device) + dk = torch.empty(NV, B, H, T, K, dtype=q.dtype, device=q.device) + dv = torch.empty(NK, B, H, T, V, dtype=q.dtype, device=q.device) + dg = torch.empty(NV, B, H, T, dtype=torch.float, device=q.device) + grid = (NK * NV, triton.cdiv(T, BT), B * H) + parallel_simple_gla_bwd_kernel[grid]( + q=q, + k=k, + v=v, + g=g, + do=do, + dq=dq, + dk=dk, + dv=dv, + dg=dg, + s_k_h=k.stride(1), + s_k_t=k.stride(2), + s_v_h=v.stride(1), + s_v_t=v.stride(2), + scale=scale, + B=B, + H=H, + T=T, + K=K, + V=V, + BT=BT, + BS=BS, + BK=BK, + BV=BV, + num_stages=num_stages, + num_warps=num_warps + ) + dq = dq.sum(0) + dk = dk.sum(0) + dv = dv.sum(0) + dg = dg.sum(0) + dg = chunk_global_cumsum(dg, reverse=True) + return dq, dk, dv, dg + + +class ParallelSimpleGLAFunction(torch.autograd.Function): + + @staticmethod + @contiguous + @autocast_custom_fwd + def forward(ctx, q, k, v, g, scale, output_attentions): + BT = 128 + ctx.dtype = g.dtype + o, g, attn = parallel_simple_gla_fwd(q, k, v, g, scale, output_attentions, BT) + ctx.save_for_backward(q, k, v, g) + ctx.scale = scale + ctx.BT = BT + return o.to(q.dtype), attn + + @staticmethod + @contiguous + @autocast_custom_bwd + def backward(ctx, do, da=None): + q, k, v, g = ctx.saved_tensors + dq, dk, dv, dg = parallel_simple_gla_bwd(q, k, v, g, do, ctx.scale, ctx.BT) + return dq.to(q), dk.to(k), dv.to(v), dg.to(ctx.dtype), None, None + + +def parallel_simple_gla( + q: torch.Tensor, + k: torch.Tensor, + v: torch.Tensor, + g: torch.Tensor, + scale: float = None, + output_attentions: bool = False, + head_first: bool = True +) -> Tuple[torch.Tensor, torch.Tensor]: + r""" + Args: + q (torch.Tensor): + queries of shape `[B, H, T, K]` + k (torch.Tensor): + keys of shape `[B, H, T, K]` + v (torch.Tensor): + values of shape `[B, H, T, V]` + g (torch.Tensor): + Forget gates of shape `[B, H, T]` applied to keys. + Compared to GLA, the gating is head-wise instead of elementwise. + scale (Optional[int]): + Scale factor for attention scores. 
+ If not provided, it will default to `1 / sqrt(K)`. Default: `None`. + output_attentions (bool): + Whether to output the materialized attention scores of shape [B, H, T, T]. Default: `False`. + head_first (Optional[bool]): + Whether the inputs are in the head-first format. Default: `True`. + + Returns: + o (torch.Tensor): + Outputs of shape `[B, H, T, V]` if `head_first=True` else `[B, T, H, V]`. + attn (torch.Tensor): + Attention scores of shape `[B, H, T, T]` if `output_attentions=True` else `None` + """ + if scale is None: + scale = k.shape[-1] ** -0.5 + if not head_first: + q, k, v, g = map(lambda x: x.transpose(1, 2) if x is not None else None, (q, k, v, g)) + o, attn = ParallelSimpleGLAFunction.apply(q, k, v, g, scale, output_attentions) + if not head_first: + o = o.transpose(1, 2).contiguous() + return o, attn diff --git a/fla/ops/utils/__init__.py b/fla/ops/utils/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..383a8dc9c9b184b52c8d17873ed47b20b7c274fb --- /dev/null +++ b/fla/ops/utils/__init__.py @@ -0,0 +1,38 @@ +# -*- coding: utf-8 -*- + +from .cumsum import (chunk_global_cumsum, chunk_global_cumsum_scalar, + chunk_global_cumsum_scalar_kernel, + chunk_global_cumsum_vector, + chunk_global_cumsum_vector_kernel, chunk_local_cumsum, + chunk_local_cumsum_scalar, + chunk_local_cumsum_scalar_kernel, + chunk_local_cumsum_vector, + chunk_local_cumsum_vector_kernel) +from .logcumsumexp import logcumsumexp_fwd_kernel +from .logsumexp import logsumexp_fwd, logsumexp_fwd_kernel +from .matmul import addmm, matmul, matmul_kernel +from .softmax import (softmax_bwd, softmax_bwd_kernel, softmax_fwd, + softmax_fwd_kernel) + +__all__ = [ + 'chunk_global_cumsum', + 'chunk_global_cumsum_scalar', + 'chunk_global_cumsum_scalar_kernel', + 'chunk_global_cumsum_vector', + 'chunk_global_cumsum_vector_kernel', + 'chunk_local_cumsum', + 'chunk_local_cumsum_scalar', + 'chunk_local_cumsum_scalar_kernel', + 'chunk_local_cumsum_vector', + 'chunk_local_cumsum_vector_kernel', + 'logcumsumexp_fwd_kernel', + 'logsumexp_fwd', + 'logsumexp_fwd_kernel', + 'addmm', + 'matmul', + 'matmul_kernel', + 'softmax_bwd', + 'softmax_bwd_kernel', + 'softmax_fwd', + 'softmax_fwd_kernel', +] diff --git a/fla/ops/utils/cumsum.py b/fla/ops/utils/cumsum.py new file mode 100644 index 0000000000000000000000000000000000000000..efdea0c52abf93ed7c8214579d91bb8b78af1ab1 --- /dev/null +++ b/fla/ops/utils/cumsum.py @@ -0,0 +1,632 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2023-2024, Songlin Yang, Yu Zhang + +from typing import Optional + +import torch +import triton +import triton.language as tl + +from fla.utils import contiguous + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8) + ], + key=['BT'] +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_local_cumsum_scalar_kernel( + s, + o, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + BT: tl.constexpr, + HEAD_FIRST: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + if HEAD_FIRST: + p_s = tl.make_block_ptr(s + i_bh * T, (T,), (1,), (i_t * 
BT,), (BT,), (0,)) + p_o = tl.make_block_ptr(o + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + else: + p_s = tl.make_block_ptr(s + bos*H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + p_o = tl.make_block_ptr(o + bos*H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + # [BT] + b_s = tl.load(p_s, boundary_check=(0,)).to(tl.float32) + b_o = tl.cumsum(b_s, axis=0) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0,)) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8) + ], + key=['BT'] +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_local_reversed_cumsum_scalar_kernel( + s, + o, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + BT: tl.constexpr, + HEAD_FIRST: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_t, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + if HEAD_FIRST: + p_s = tl.make_block_ptr(s + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + p_o = tl.make_block_ptr(o + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + else: + p_s = tl.make_block_ptr(s + bos*H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + p_o = tl.make_block_ptr(o + bos*H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + # [BT] + b_s = tl.load(p_s, boundary_check=(0,)).to(tl.float32) + b_z = tl.sum(b_s, axis=0) + b_o = b_z[None] - tl.cumsum(b_s, axis=0) + b_s + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0,)) + + +@triton.autotune( + configs=[ + triton.Config({'BS': 16}, num_warps=2), + triton.Config({'BS': 16}, num_warps=4), + triton.Config({'BS': 16}, num_warps=8), + triton.Config({'BS': 32}, num_warps=2), + triton.Config({'BS': 32}, num_warps=4), + triton.Config({'BS': 32}, num_warps=8), + triton.Config({'BS': 64}, num_warps=2), + triton.Config({'BS': 64}, num_warps=4), + triton.Config({'BS': 64}, num_warps=8), + ], + key=['S', 'BT'] +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_local_cumsum_vector_kernel( + s, + o, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + S: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + HEAD_FIRST: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_s, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + o_i = tl.arange(0, BT) + m_s = tl.where(o_i[:, None] >= o_i[None, :], 1., 0.) 
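The lower-triangular mask `m_s` turns the within-chunk cumulative sum into a single `tl.dot` that runs on tensor cores. The underlying identity, in plain PyTorch:

```python
import torch

T, S = 64, 16
x = torch.randn(T, S)
m = torch.ones(T, T).tril()  # m[i, j] = 1 for j <= i
assert torch.allclose(m @ x, x.cumsum(0), atol=1e-5)
```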
+ + if HEAD_FIRST: + p_s = tl.make_block_ptr(s + i_bh * T*S, (T, S), (S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * T*S, (T, S), (S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + else: + p_s = tl.make_block_ptr(s + (bos * H + i_h) * S, (T, S), (H*S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + p_o = tl.make_block_ptr(o + (bos * H + i_h) * S, (T, S), (H*S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + # [BT, BS] + b_s = tl.load(p_s, boundary_check=(0, 1)).to(tl.float32) + b_o = tl.dot(m_s, b_s, allow_tf32=False) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({'BS': 16}, num_warps=2), + triton.Config({'BS': 16}, num_warps=4), + triton.Config({'BS': 16}, num_warps=8), + triton.Config({'BS': 32}, num_warps=2), + triton.Config({'BS': 32}, num_warps=4), + triton.Config({'BS': 32}, num_warps=8), + triton.Config({'BS': 64}, num_warps=2), + triton.Config({'BS': 64}, num_warps=4), + triton.Config({'BS': 64}, num_warps=8), + ], + key=['S', 'BT'] +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_local_reversed_cumsum_vector_kernel( + s, + o, + offsets, + indices, + T: tl.constexpr, + H: tl.constexpr, + S: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + HEAD_FIRST: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_s, i_t, i_bh = tl.program_id(0), tl.program_id(1), tl.program_id(2) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + i_n, i_t = tl.load(indices + i_t * 2).to(tl.int32), tl.load(indices + i_t * 2 + 1).to(tl.int32) + bos, eos = tl.load(offsets + i_n).to(tl.int32), tl.load(offsets + i_n + 1).to(tl.int32) + T = eos - bos + else: + bos, eos = i_b * T, i_b * T + T + + o_i = tl.arange(0, BT) + m_s = tl.where(o_i[:, None] <= o_i[None, :], 1., 0.) 
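The reversed variant only flips the mask to upper-triangular, which yields the inclusive reverse (suffix) cumulative sum:

```python
import torch

T, S = 64, 16
x = torch.randn(T, S)
m = torch.ones(T, T).triu()  # m[i, j] = 1 for j >= i
assert torch.allclose(m @ x, x.flip(0).cumsum(0).flip(0), atol=1e-5)
```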
+ + if HEAD_FIRST: + p_s = tl.make_block_ptr(s + i_bh * T*S, (T, S), (S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + p_o = tl.make_block_ptr(o + i_bh * T*S, (T, S), (S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + else: + p_s = tl.make_block_ptr(s + (bos * H + i_h) * S, (T, S), (H*S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + p_o = tl.make_block_ptr(o + (bos * H + i_h) * S, (T, S), (H*S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + # [BT, BS] + b_s = tl.load(p_s, boundary_check=(0, 1)).to(tl.float32) + b_o = tl.dot(m_s, b_s, allow_tf32=False) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0, 1)) + + +@triton.autotune( + configs=[ + triton.Config({'BT': 16}, num_warps=2), + triton.Config({'BT': 32}, num_warps=4), + triton.Config({'BT': 32}, num_warps=2), + triton.Config({'BT': 64}, num_warps=8), + triton.Config({'BT': 64}, num_warps=4), + ], + key=[] +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_global_cumsum_scalar_kernel( + s, + o, + offsets, + T: tl.constexpr, + H: tl.constexpr, + BT: tl.constexpr, + HEAD_FIRST: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_bh = tl.program_id(0) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_b).to(tl.int32), tl.load(offsets + i_b + 1).to(tl.int32) + else: + bos, eos = i_b * T, i_b * T + T + T = eos - bos + + b_z = tl.zeros([], dtype=tl.float32) + for i_t in range(tl.cdiv(T, BT)): + if HEAD_FIRST: + p_s = tl.make_block_ptr(s + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + p_o = tl.make_block_ptr(o + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + else: + p_s = tl.make_block_ptr(s + bos*H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + p_o = tl.make_block_ptr(o + bos*H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + b_s = tl.load(p_s, boundary_check=(0,)).to(tl.float32) + b_o = tl.cumsum(b_s, axis=0) + b_z[None] + b_z += tl.sum(b_s, axis=0) + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0,)) + + +@triton.autotune( + configs=[ + triton.Config({'BT': 16}, num_warps=2), + triton.Config({'BT': 32}, num_warps=4), + triton.Config({'BT': 32}, num_warps=2), + triton.Config({'BT': 64}, num_warps=8), + triton.Config({'BT': 64}, num_warps=4), + ], + key=[] +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_global_reversed_cumsum_scalar_kernel( + s, + o, + offsets, + T: tl.constexpr, + H: tl.constexpr, + BT: tl.constexpr, + HEAD_FIRST: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_bh = tl.program_id(0) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_b).to(tl.int32), tl.load(offsets + i_b + 1).to(tl.int32) + else: + bos, eos = i_b * T, i_b * T + T + T = eos - bos + + b_z = tl.zeros([], dtype=tl.float32) + for i_t in range(tl.cdiv(T, BT) - 1, -1, -1): + if HEAD_FIRST: + p_s = tl.make_block_ptr(s + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + p_o = tl.make_block_ptr(o + i_bh * T, (T,), (1,), (i_t * BT,), (BT,), (0,)) + else: + p_s = tl.make_block_ptr(s + bos*H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + p_o = tl.make_block_ptr(o + bos*H + i_h, (T,), (H,), (i_t * BT,), (BT,), (0,)) + b_s = tl.load(p_s, boundary_check=(0,)).to(tl.float32) + b_zz = tl.sum(b_s, axis=0) + b_z += b_zz + b_o = b_s - tl.cumsum(b_s, axis=0) + b_z[None] + tl.store(p_o, b_o.to(p_o.dtype.element_ty), boundary_check=(0,)) + + +@triton.autotune( + configs=[ + triton.Config({'BT': 16}, num_warps=2), + triton.Config({'BT': 16}, num_warps=4), + 
triton.Config({'BT': 16}, num_warps=8), + triton.Config({'BT': 32}, num_warps=2), + triton.Config({'BT': 32}, num_warps=4), + triton.Config({'BT': 32}, num_warps=8), + triton.Config({'BT': 64}, num_warps=2), + triton.Config({'BT': 64}, num_warps=4), + triton.Config({'BT': 64}, num_warps=8), + ], + key=['S'] +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_global_cumsum_vector_kernel( + s, + z, + offsets, + T: tl.constexpr, + H: tl.constexpr, + S: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + HEAD_FIRST: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_s, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_b).to(tl.int32), tl.load(offsets + i_b + 1).to(tl.int32) + else: + bos, eos = i_b * T, i_b * T + T + T = eos - bos + + o_i = tl.arange(0, BT) + m_s = tl.where(o_i[:, None] >= o_i[None, :], 1., 0.) + + b_z = tl.zeros([BS], dtype=tl.float32) + for i_t in range(tl.cdiv(T, BT)): + if HEAD_FIRST: + p_s = tl.make_block_ptr(s + i_bh * T*S, (T, S), (S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + p_z = tl.make_block_ptr(z + i_bh * T*S, (T, S), (S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + else: + p_s = tl.make_block_ptr(s + (bos * H + i_h) * S, (T, S), (H*S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + p_z = tl.make_block_ptr(z + (bos * H + i_h) * S, (T, S), (H*S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + # [BT, BS] + b_s = tl.load(p_s, boundary_check=(0, 1)).to(tl.float32) + b_c = b_z[None, :] + tl.dot(m_s, b_s, allow_tf32=False) + tl.store(p_z, b_c.to(p_z.dtype.element_ty), boundary_check=(0, 1)) + if i_t >= 0: + b_z += tl.sum(b_s, 0) + + +@triton.autotune( + configs=[ + triton.Config({'BT': 16}, num_warps=2), + triton.Config({'BT': 16}, num_warps=4), + triton.Config({'BT': 16}, num_warps=8), + triton.Config({'BT': 32}, num_warps=2), + triton.Config({'BT': 32}, num_warps=4), + triton.Config({'BT': 32}, num_warps=8), + triton.Config({'BT': 64}, num_warps=2), + triton.Config({'BT': 64}, num_warps=4), + triton.Config({'BT': 64}, num_warps=8), + ], + key=['S'] +) +@triton.heuristics({'USE_OFFSETS': lambda args: args['offsets'] is not None}) +@triton.jit +def chunk_global_reversed_cumsum_vector_kernel( + s, + z, + offsets, + T: tl.constexpr, + H: tl.constexpr, + S: tl.constexpr, + BT: tl.constexpr, + BS: tl.constexpr, + HEAD_FIRST: tl.constexpr, + USE_OFFSETS: tl.constexpr +): + i_s, i_bh = tl.program_id(0), tl.program_id(1) + i_b, i_h = i_bh // H, i_bh % H + if USE_OFFSETS: + bos, eos = tl.load(offsets + i_b).to(tl.int32), tl.load(offsets + i_b + 1).to(tl.int32) + else: + bos, eos = i_b * T, i_b * T + T + T = eos - bos + + o_i = tl.arange(0, BT) + m_s = tl.where(o_i[:, None] <= o_i[None, :], 1., 0.) 
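Unlike the local kernels, the global variants walk over blocks sequentially and carry a running total `b_z` across them (traversed back-to-front in the reversed case). The forward form of that carry scheme, sketched in PyTorch:

```python
import torch

def chunked_cumsum(x, bt):
    # equivalent to x.cumsum(0), computed block by block with a carried total
    z = x.new_zeros(x.shape[-1])
    out = torch.empty_like(x)
    for i in range(0, x.shape[0], bt):
        blk = x[i:i + bt]
        out[i:i + bt] = blk.cumsum(0) + z
        z = z + blk.sum(0)
    return out

x = torch.randn(200, 16)
assert torch.allclose(chunked_cumsum(x, 64), x.cumsum(0), atol=1e-4)
```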
+ + b_z = tl.zeros([BS], dtype=tl.float32) + for i_t in range(tl.cdiv(T, BT) - 1, -1, -1): + if HEAD_FIRST: + p_s = tl.make_block_ptr(s + i_bh * T*S, (T, S), (S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + p_z = tl.make_block_ptr(z + i_bh * T*S, (T, S), (S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + else: + p_s = tl.make_block_ptr(s + (bos * H + i_h) * S, (T, S), (H*S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + p_z = tl.make_block_ptr(z + (bos * H + i_h) * S, (T, S), (H*S, 1), (i_t * BT, i_s * BS), (BT, BS), (1, 0)) + # [BT, BS] + b_s = tl.load(p_s, boundary_check=(0, 1)).to(tl.float32) + b_c = b_z[None, :] + tl.dot(m_s, b_s, allow_tf32=False) + tl.store(p_z, b_c.to(p_z.dtype.element_ty), boundary_check=(0, 1)) + + if i_t >= 0: + b_z += tl.sum(b_s, 0) + + +def chunk_local_cumsum_scalar( + g: torch.Tensor, + chunk_size: int, + reverse: bool = False, + offsets: Optional[torch.Tensor] = None, + indices: Optional[torch.Tensor] = None, + head_first: bool = True +) -> torch.Tensor: + if head_first: + B, H, T = g.shape + else: + B, T, H = g.shape + if offsets is not None: + B = len(offsets) - 1 + BT = chunk_size + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([ + torch.stack([offsets.new_full((n,), i), offsets.new_tensor(range(n))], 1) + for i, n in enumerate(triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()) + ]) + NT = len(indices) + g_org, g = g, torch.empty_like(g, dtype=torch.float) + grid = (NT, B * H) + if reverse: + chunk_local_reversed_cumsum_scalar_kernel[grid]( + g_org, + g, + offsets, + indices, + T=T, + H=H, + BT=BT, + HEAD_FIRST=head_first + ) + else: + chunk_local_cumsum_scalar_kernel[grid]( + g_org, + g, + offsets, + indices, + T=T, + H=H, + BT=BT, + HEAD_FIRST=head_first + ) + return g + + +def chunk_local_cumsum_vector( + g: torch.Tensor, + chunk_size: int, + reverse: bool = False, + offsets: Optional[torch.Tensor] = None, + indices: Optional[torch.Tensor] = None, + head_first: bool = True +) -> torch.Tensor: + if head_first: + B, H, T, S = g.shape + else: + B, T, H, S = g.shape + BT = chunk_size + if offsets is None: + NT = triton.cdiv(T, BT) + else: + if indices is None: + indices = torch.cat([ + torch.stack([offsets.new_full((n,), i), offsets.new_tensor(range(n))], 1) + for i, n in enumerate(triton.cdiv(offsets[1:] - offsets[:-1], BT).tolist()) + ]) + NT = len(indices) + g_org, g = g, torch.empty_like(g, dtype=torch.float) + def grid(meta): return (triton.cdiv(meta['S'], meta['BS']), NT, B * H) + # keep cummulative normalizer in fp32 + # this kernel is equivalent to + # g = g.view(B, H, NT, BT, -1).cumsum(-2).view(B, H, T, -1) + if reverse: + chunk_local_reversed_cumsum_vector_kernel[grid]( + g_org, + g, + offsets, + indices, + T=T, + H=H, + S=S, + BT=BT, + HEAD_FIRST=head_first + ) + else: + chunk_local_cumsum_vector_kernel[grid]( + g_org, + g, + offsets, + indices, + T=T, + H=H, + S=S, + BT=BT, + HEAD_FIRST=head_first + ) + return g + + +@contiguous +def chunk_local_cumsum( + g: torch.Tensor, + chunk_size: int, + reverse: bool = False, + offsets: Optional[torch.Tensor] = None, + indices: Optional[torch.Tensor] = None, + head_first: bool = True +) -> torch.Tensor: + if offsets is not None: + assert not head_first, "Sequences with variable lengths are not supported for head-first mode" + assert g.shape[0] == 1, "Only batch size 1 is supported when offsets are provided" + if len(g.shape) == 3: + return chunk_local_cumsum_scalar(g, chunk_size, reverse, offsets, indices, head_first) + elif len(g.shape) == 4: + return 
+    else:
+        raise ValueError(f"Unsupported input shape {g.shape}, "
+                         f"which should be [B, H, T]/[B, H, T, D] if `head_first=True` "
+                         f"or [B, T, H]/[B, T, H, D] otherwise")
+
+
+@contiguous
+def chunk_global_cumsum_scalar(
+    s: torch.Tensor,
+    dtype: Optional[torch.dtype] = None,
+    reverse: bool = False,
+    offsets: Optional[torch.Tensor] = None,
+    head_first: bool = True
+) -> torch.Tensor:
+    dtype = dtype or s.dtype
+    if head_first:
+        B, H, T = s.shape
+    else:
+        B, T, H = s.shape
+    if offsets is not None:
+        B = len(offsets) - 1
+    grid = (B * H,)
+    z = torch.empty_like(s, dtype=dtype)
+    if reverse:
+        chunk_global_reversed_cumsum_scalar_kernel[grid](
+            s,
+            z,
+            offsets,
+            T=T,
+            H=H,
+            HEAD_FIRST=head_first
+        )
+    else:
+        chunk_global_cumsum_scalar_kernel[grid](
+            s,
+            z,
+            offsets,
+            T=T,
+            H=H,
+            HEAD_FIRST=head_first
+        )
+    return z
+
+
+@contiguous
+def chunk_global_cumsum_vector(
+    s: torch.Tensor,
+    dtype: Optional[torch.dtype] = None,
+    reverse: bool = False,
+    offsets: Optional[torch.Tensor] = None,
+    head_first: bool = True
+) -> torch.Tensor:
+    dtype = dtype or s.dtype
+    if head_first:
+        B, H, T, S = s.shape
+    else:
+        B, T, H, S = s.shape
+    BS = min(32, S)
+    if offsets is not None:
+        B = len(offsets) - 1
+    grid = (triton.cdiv(S, BS), B * H)
+    z = torch.empty_like(s, dtype=dtype)
+    if reverse:
+        chunk_global_reversed_cumsum_vector_kernel[grid](
+            s,
+            z,
+            offsets,
+            T=T,
+            H=H,
+            S=S,
+            BS=BS,
+            HEAD_FIRST=head_first
+        )
+    else:
+        chunk_global_cumsum_vector_kernel[grid](
+            s,
+            z,
+            offsets,
+            T=T,
+            H=H,
+            S=S,
+            BS=BS,
+            HEAD_FIRST=head_first
+        )
+    return z
+
+
+@contiguous
+def chunk_global_cumsum(
+    s: torch.Tensor,
+    dtype: Optional[torch.dtype] = None,
+    reverse: bool = False,
+    offsets: Optional[torch.Tensor] = None,
+    head_first: bool = True
+) -> torch.Tensor:
+    if offsets is not None:
+        assert not head_first, "Sequences with variable lengths are not supported for head-first mode"
+        assert s.shape[0] == 1, "Only batch size 1 is supported when offsets are provided"
+    if len(s.shape) == 3:
+        return chunk_global_cumsum_scalar(s, dtype, reverse, offsets, head_first)
+    elif len(s.shape) == 4:
+        return chunk_global_cumsum_vector(s, dtype, reverse, offsets, head_first)
+    else:
+        raise ValueError(f"Unsupported input shape {s.shape}, "
+                         f"which should be [B, H, T]/[B, H, T, D] if `head_first=True` "
+                         f"or [B, T, H]/[B, T, H, D] otherwise")
diff --git a/fla/ops/utils/logcumsumexp.py b/fla/ops/utils/logcumsumexp.py
new file mode 100644
index 0000000000000000000000000000000000000000..2d6c3028c7bb359b630e9d641dc8b9c8c26a56b4
--- /dev/null
+++ b/fla/ops/utils/logcumsumexp.py
@@ -0,0 +1,61 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2023-2024, Songlin Yang, Yu Zhang
+
+import triton
+import triton.language as tl
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({'BT': 16}, num_warps=2),
+        triton.Config({'BT': 16}, num_warps=4),
+        triton.Config({'BT': 16}, num_warps=8),
+        triton.Config({'BT': 32}, num_warps=2),
+        triton.Config({'BT': 32}, num_warps=4),
+        triton.Config({'BT': 32}, num_warps=8),
+        triton.Config({'BT': 64}, num_warps=2),
+        triton.Config({'BT': 64}, num_warps=4),
+        triton.Config({'BT': 64}, num_warps=8),
+    ],
+    key=['S']
+)
+@triton.jit
+def logcumsumexp_fwd_kernel(
+    s,
+    z,
+    s_s_h,
+    s_s_t,
+    s_s_d,
+    T: tl.constexpr,
+    S: tl.constexpr,
+    BT: tl.constexpr
+):
+    i_bh = tl.program_id(0)
+    o_i = tl.arange(0, BT)
+    m_s = tl.where(o_i[:, None] >= o_i[None, :], 1., 0.)
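+
+    # `m_s` is a lower-triangular matrix of ones, so `tl.dot(m_s, x)` yields
+    # the inclusive cumulative sum of the rows of `x` within a chunk.
+    # The loop below is an online-softmax-style streaming recurrence over
+    # chunks of BT rows: `b_mp` carries the running max of everything seen so
+    # far and `b_zp` the running sum rescaled to that max, so each new chunk
+    # needs only a single rescaling step.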
+    b_mp = tl.full([S,], float('-inf'), dtype=tl.float32)
+    b_zp = tl.zeros([S,], dtype=tl.float32)
+    for i_t in range(tl.cdiv(T, BT)):
+        p_s = tl.make_block_ptr(s + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT, 0), (BT, S), (1, 0))
+        p_z = tl.make_block_ptr(z + i_bh * s_s_h, (T, S), (s_s_t, s_s_d), (i_t * BT, 0), (BT, S), (1, 0))
+
+        # [BT, S]
+        b_s = tl.load(p_s, boundary_check=(0, 1)).to(tl.float32)
+        # [S,]
+        b_mc = tl.max(b_s, 0)
+        # workaround for compiler bugs
+        if i_t > 0:
+            b_mc = tl.maximum(b_mp, b_mc)
+            b_zp = b_zp * tl.exp(b_mp - b_mc)
+        # [BT, S]
+        b_s = tl.exp(b_s - b_mc)
+        b_z = tl.dot(m_s, b_s, allow_tf32=False) + b_zp
+        # [S,]
+        b_zc = tl.max(b_z, 0)
+        b_mp = b_mc
+        b_zp = b_zc
+        # [BT, S]
+        # small eps to prevent underflows
+        b_z = tl.log(tl.where(b_z != 0, b_z, 1e-20)) + b_mc
+        tl.store(p_z, b_z.to(p_z.dtype.element_ty), boundary_check=(0, 1))
diff --git a/fla/ops/utils/logsumexp.py b/fla/ops/utils/logsumexp.py
new file mode 100644
index 0000000000000000000000000000000000000000..58c6188a12d2cd06e921fe0504d5f938a0f100a8
--- /dev/null
+++ b/fla/ops/utils/logsumexp.py
@@ -0,0 +1,82 @@
+# -*- coding: utf-8 -*-
+# Copyright (c) 2023-2024, Songlin Yang, Yu Zhang
+
+from typing import Optional
+
+import torch
+import triton
+import triton.language as tl
+
+
+@triton.autotune(
+    configs=[
+        triton.Config({}, num_warps=1),
+        triton.Config({}, num_warps=2),
+        triton.Config({}, num_warps=4),
+        triton.Config({}, num_warps=8),
+        triton.Config({}, num_warps=16),
+        triton.Config({}, num_warps=32),
+    ],
+    key=['D']
+)
+@triton.heuristics({
+    'HAS_SCALE': lambda args: args['scale'] is not None
+})
+@triton.jit
+def logsumexp_fwd_kernel(
+    x,
+    z,
+    scale,
+    D: tl.constexpr,
+    B: tl.constexpr,
+    HAS_SCALE: tl.constexpr
+):
+    i_n, i_d = tl.program_id(0), tl.program_id(1)
+    o_d = i_d * B + tl.arange(0, B)
+    m_d = o_d < D
+
+    b_x = tl.load(x + i_n * D + o_d, mask=m_d, other=-float('inf'))
+    if HAS_SCALE:
+        b_x = b_x * scale
+    b_m = tl.max(b_x, 0)
+    b_z = tl.log(tl.sum(tl.exp(b_x - b_m), 0)) + b_m
+    tl.store(z + i_n * tl.cdiv(D, B) + i_d, b_z)
+
+
+def logsumexp_fwd(
+    x,
+    scale: Optional[float] = None,
+    dtype: Optional[torch.dtype] = None
+):
+    r"""
+    Compute the logsumexp of the input tensor over the last dimension.
+
+    Args:
+        x (Tensor):
+            The input tensor of any shape.
+        scale (Optional[float]):
+            The scale applied to the input tensor. Default: `None`.
+        dtype (Optional[torch.dtype]):
+            The data type of the output tensor. Default: `None`.
+    Returns:
+        Tensor: The logsumexp of the input tensor.
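+
+    Example (illustrative usage only; assumes a CUDA device, since the
+    reduction runs as a Triton kernel):
+        >>> x = torch.randn(4, 1024, device='cuda')
+        >>> z = logsumexp_fwd(x)  # shape [4], accumulated in fp32
+        >>> torch.allclose(z, x.float().logsumexp(-1), atol=1e-4)
+        True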
+ """ + + shape = x.shape + x = x.view(-1, shape[-1]) + N, D = x.shape + B = min(triton.next_power_of_2(D), 64 * 1024) + ND = triton.cdiv(D, B) + + z = x.new_empty(N, ND, dtype=torch.float) + logsumexp_fwd_kernel[(N, ND)]( + x=x, + z=z, + scale=scale, + D=D, + B=B + ) + z = z.logsumexp(-1).view(*shape[:-1]) + if dtype is not None and dtype != torch.float: + z = z.to(dtype) + return z diff --git a/fla/ops/utils/matmul.py b/fla/ops/utils/matmul.py new file mode 100644 index 0000000000000000000000000000000000000000..7dfedd3a5249bd7205fd96683987b14fa0cd126b --- /dev/null +++ b/fla/ops/utils/matmul.py @@ -0,0 +1,194 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2023-2024, Songlin Yang, Yu Zhang + +# code adapted from +# https://triton-lang.org/main/getting-started/tutorials/03-matrix-multiplication.html + +from typing import Optional + +import torch +import triton +import triton.language as tl + +from fla.utils import contiguous + + +# `triton.jit`'ed functions can be auto-tuned by using the `triton.autotune` decorator, which consumes: +# - A list of `triton.Config` objects that define different configurations of +# meta-parameters (e.g., `BM`) and compilation options (e.g., `num_warps`) to try +# - An auto-tuning *key* whose change in values will trigger evaluation of all the +# provided configs +@triton.autotune( + configs=[ + triton.Config({'BM': 128, 'BK': 64, 'BN': 256, 'G': 4}, num_stages=3, num_warps=8), + triton.Config({'BM': 64, 'BK': 32, 'BN': 256, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 128, 'BK': 32, 'BN': 128, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 128, 'BK': 32, 'BN': 64, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 64, 'BK': 32, 'BN': 128, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 128, 'BK': 32, 'BN': 32, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 64, 'BK': 32, 'BN': 32, 'G': 4}, num_stages=5, num_warps=2), + triton.Config({'BM': 32, 'BK': 32, 'BN': 64, 'G': 4}, num_stages=5, num_warps=2), + # Good config for fp8 inputs. + triton.Config({'BM': 128, 'BK': 128, 'BN': 256, 'G': 4}, num_stages=3, num_warps=8), + triton.Config({'BM': 256, 'BK': 128, 'BN': 128, 'G': 4}, num_stages=3, num_warps=8), + triton.Config({'BM': 256, 'BK': 128, 'BN': 64, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 64, 'BK': 128, 'BN': 256, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 128, 'BK': 128, 'BN': 128, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 128, 'BK': 64, 'BN': 64, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 64, 'BK': 64, 'BN': 128, 'G': 4}, num_stages=4, num_warps=4), + triton.Config({'BM': 128, 'BK': 64, 'BN': 32, 'G': 4}, num_stages=4, num_warps=4) + ], + key=['M', 'N', 'K'], +) +@triton.heuristics({ + 'HAS_INPUT': lambda args: args['input'] is not None, + 'HAS_ALPHA': lambda args: args['alpha'] is not None, + 'HAS_BETA': lambda args: args['beta'] is not None +}) +@triton.jit +def matmul_kernel( + # Pointers to matrices + a, + b, + c, + input, + alpha, + beta, + # Matrix dimensions + M, + N, + K, + # The stride variables represent how much to increase the ptr by when moving by 1 + # element in a particular dimension. E.g. `s_am` is how much to increase `a` + # by to get the element one row down (A has M rows). 
+    s_am,
+    s_ak,
+    s_bk,
+    s_bn,
+    s_cm,
+    s_cn,
+    # Meta-parameters
+    BM: tl.constexpr,
+    BK: tl.constexpr,
+    BN: tl.constexpr,
+    G: tl.constexpr,
+    ACTIVATION: tl.constexpr,
+    HAS_INPUT: tl.constexpr,
+    HAS_ALPHA: tl.constexpr,
+    HAS_BETA: tl.constexpr
+):
+    """Kernel for computing the matmul C = A x B.
+    A has shape (M, K), B has shape (K, N) and C has shape (M, N)
+    """
+    # -----------------------------------------------------------
+    # Map program ids `pid` to the block of C it should compute.
+    # This is done in a grouped ordering to promote L2 data reuse.
+    # See the `L2 Cache Optimizations` section of the Triton matmul tutorial for details.
+    NM, NN = tl.num_programs(0), tl.num_programs(1)
+    i_m, i_n = tl.program_id(0), tl.program_id(1)
+    i_m, i_n = tl.swizzle2d(i_m, i_n, NM, NN, G)
+
+    # ----------------------------------------------------------
+    # Create pointers for the first blocks of A and B.
+    # We will advance this pointer as we move in the K direction
+    # and accumulate
+    # `p_a` is a block of [BM, BK] pointers
+    # `p_b` is a block of [BK, BN] pointers
+    # See the `Pointer Arithmetic` section of the tutorial for details
+    o_am = (i_m * BM + tl.arange(0, BM)) % M
+    o_bn = (i_n * BN + tl.arange(0, BN)) % N
+    o_k = tl.arange(0, BK)
+
+    p_a = a + (o_am[:, None] * s_am + o_k[None, :] * s_ak)
+    p_b = b + (o_k[:, None] * s_bk + o_bn[None, :] * s_bn)
+
+    b_acc = tl.zeros((BM, BN), dtype=tl.float32)
+    for k in range(0, tl.cdiv(K, BK)):
+        # Load the next block of A and B, generate a mask by checking the K dimension.
+        # If it is out of bounds, set it to 0.
+        b_a = tl.load(p_a, mask=o_k[None, :] < K - k * BK, other=0.0)
+        b_b = tl.load(p_b, mask=o_k[:, None] < K - k * BK, other=0.0)
+        # We accumulate along the K dimension.
+        b_acc += tl.dot(b_a, b_b, allow_tf32=False)
+        # Advance the ptrs to the next K block.
+        p_a += BK * s_ak
+        p_b += BK * s_bk
+
+    o_cm = i_m * BM + tl.arange(0, BM)
+    o_cn = i_n * BN + tl.arange(0, BN)
+    mask = (o_cm[:, None] < M) & (o_cn[None, :] < N)
+
+    b_c = b_acc
+    # You can fuse arbitrary activation functions here
+    # while the b_acc is still in FP32!
+    if ACTIVATION == "leaky_relu":
+        b_c = leaky_relu(b_c)
+    if HAS_ALPHA:
+        b_c *= tl.load(alpha)
+    if HAS_INPUT:
+        p_i = input + s_cm * o_cm[:, None] + s_cn * o_cn[None, :]
+        b_i = tl.load(p_i, mask=mask, other=0.0).to(tl.float32)
+        if HAS_BETA:
+            b_i *= tl.load(beta)
+        b_c += b_i
+
+    # -----------------------------------------------------------
+    # Write back the block of the output matrix C with masks.
+    p_c = c + s_cm * o_cm[:, None] + s_cn * o_cn[None, :]
+    tl.store(p_c, b_c.to(c.dtype.element_ty), mask=mask)
+
+
+# We can fuse `leaky_relu` by providing it as an `ACTIVATION` meta-parameter in `matmul_kernel`.
+@triton.jit
+def leaky_relu(x):
+    return tl.where(x >= 0, x, 0.01 * x)
+
+
+@contiguous
+def matmul(a, b, activation=''):
+    assert a.shape[1] == b.shape[0], 'Incompatible dimensions (A: {}x{}, B: {}x{})'.format(*a.shape, *b.shape)
+
+    M, K = a.shape
+    K, N = b.shape
+    # Allocates output.
+    c = a.new_empty(M, N)
+    # 2D launch kernel where each [BM, BN] tile of C gets its own program.
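+    # `grid` is a callable: Triton re-evaluates it with the meta-parameters of
+    # whichever autotune config is selected, so the launch shape follows the
+    # chosen `BM`/`BN` tile sizes.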
+ + def grid(meta): return (triton.cdiv(M, meta['BM']), triton.cdiv(N, meta['BN'])) + matmul_kernel[grid]( + a, b, c, None, None, None, + M, N, K, + a.stride(0), a.stride(1), + b.stride(0), b.stride(1), + c.stride(0), c.stride(1), + ACTIVATION=activation, + ) + return c + + +@contiguous +def addmm( + x: torch.Tensor, + a: torch.Tensor, + b: torch.Tensor, + alpha: Optional[float] = None, + beta: Optional[float] = None, + inplace: Optional[bool] = False +) -> torch.Tensor: + assert a.shape[1] == b.shape[0], 'Incompatible dimensions (A: {}x{}, B: {}x{})'.format(*a.shape, *b.shape) + + M, K = a.shape + K, N = b.shape + # Allocates output. + c = x if inplace else a.new_empty(M, N) + + def grid(meta): return (triton.cdiv(M, meta['BM']), triton.cdiv(N, meta['BN'])) + matmul_kernel[grid]( + a, b, c, x, alpha, beta, + M, N, K, + a.stride(0), a.stride(1), + b.stride(0), b.stride(1), + c.stride(0), c.stride(1), + ACTIVATION=None, + ) + return c diff --git a/fla/ops/utils/softmax.py b/fla/ops/utils/softmax.py new file mode 100644 index 0000000000000000000000000000000000000000..ff54cea1ea7c0a60d7b360b0d91e98fc20c48de0 --- /dev/null +++ b/fla/ops/utils/softmax.py @@ -0,0 +1,109 @@ +# -*- coding: utf-8 -*- +# Copyright (c) 2023-2024, Songlin Yang, Yu Zhang + +from typing import Optional + +import torch +import triton +import triton.language as tl + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32) + ], + key=['D'] +) +@triton.jit +def softmax_fwd_kernel( + x, + p, + D: tl.constexpr, + B: tl.constexpr +): + i_n = tl.program_id(0) + o_d = tl.arange(0, B) + m_d = o_d < D + + b_x = tl.load(x + i_n * D + o_d, mask=m_d, other=-float('inf')) + b_m = tl.max(b_x, 0) + b_x = tl.exp(b_x - b_m) + b_p = b_x / tl.sum(b_x, 0) + + tl.store(p + i_n * D + o_d, b_p.to(p.dtype.element_ty), mask=m_d) + + +@triton.autotune( + configs=[ + triton.Config({}, num_warps=1), + triton.Config({}, num_warps=2), + triton.Config({}, num_warps=4), + triton.Config({}, num_warps=8), + triton.Config({}, num_warps=16), + triton.Config({}, num_warps=32) + ], + key=['D'] +) +@triton.jit +def softmax_bwd_kernel( + p, + dp, + ds, + D: tl.constexpr, + B: tl.constexpr +): + i_n = tl.program_id(0) + o_d = tl.arange(0, B) + m_d = o_d < D + + b_p = tl.load(p + i_n * D + o_d, mask=m_d, other=0.) + b_dp = tl.load(dp + i_n * D + o_d, mask=m_d, other=0.) 
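+    # With p = softmax(x), the vector-Jacobian product is
+    # dx = p * (dp - <p, dp>): subtract the probability-weighted mean of the
+    # incoming gradient, then rescale by p, as the two lines below compute.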
+ b_pp = tl.sum(b_p * b_dp, 0) + b_ds = b_p * b_dp - b_p * b_pp + tl.store(ds + i_n * D + o_d, b_ds.to(ds.dtype.element_ty), mask=m_d) + + +def softmax_fwd( + x: torch.Tensor, + dtype: Optional[torch.dtype] = torch.float +) -> torch.Tensor: + shape = x.shape + x = x.view(-1, x.shape[-1]) + + N, D = x.shape + B = triton.next_power_of_2(D) + + p = torch.empty_like(x, dtype=dtype) + softmax_fwd_kernel[(N,)]( + x=x, + p=p, + D=D, + B=B + ) + return p.view(*shape) + + +def softmax_bwd( + p: torch.Tensor, + dp: torch.Tensor, + dtype: Optional[torch.dtype] = torch.float +) -> torch.Tensor: + shape = p.shape + p = p.view(-1, p.shape[-1]) + ds = torch.empty_like(p, dtype=dtype) + + N, D = p.shape + B = triton.next_power_of_2(D) + softmax_bwd_kernel[(N,)]( + p=p, + dp=dp, + ds=ds, + D=D, + B=B + ) + return ds.view(*shape) diff --git a/fla/utils.py b/fla/utils.py new file mode 100644 index 0000000000000000000000000000000000000000..604c755abbf91d7f7407f475e22a7b122b79c4c8 --- /dev/null +++ b/fla/utils.py @@ -0,0 +1,48 @@ +# -*- coding: utf-8 -*- + +import functools + +import torch +from packaging import version + + +def contiguous(fn): + """ + Make sure all input tensors are contiguous. + """ + @functools.wraps(fn) + def wrapper(ctx, *args, **kwargs): + return fn(ctx, + *(i if not isinstance(i, torch.Tensor) else i.contiguous() for i in args), + **{k: (v if not isinstance(v, torch.Tensor) else v.contiguous()) for k, v in kwargs.items()}) + return wrapper + + +def require_version(version, hint): + """ + Perform a runtime check of the dependency versions, using the exact same syntax used by pip. + """ + def decorator(fn): + @functools.wraps(fn) + def wrapper(ctx, *args, **kwargs): + from transformers.utils.versions import require_version + require_version(version, hint) + return fn(ctx, + *(i if not isinstance(i, torch.Tensor) else i.contiguous() for i in args), + **{k: (v if not isinstance(v, torch.Tensor) else v.contiguous()) for k, v in kwargs.items()}) + return wrapper + return decorator + + +def checkpoint(func): + def wrapper(*args, **kwargs): + return torch.utils.checkpoint.checkpoint(func, *args, **kwargs) + return wrapper + + +if version.parse(torch.__version__) >= version.parse("2.4"): + autocast_custom_fwd = functools.partial(torch.amp.custom_fwd, device_type="cuda") + autocast_custom_bwd = functools.partial(torch.amp.custom_bwd, device_type="cuda") +else: + autocast_custom_fwd = torch.cuda.amp.custom_fwd + autocast_custom_bwd = torch.cuda.amp.custom_bwd diff --git a/flame/__init__.py b/flame/__init__.py new file mode 100644 index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 diff --git a/flame/__pycache__/__init__.cpython-312.pyc b/flame/__pycache__/__init__.cpython-312.pyc new file mode 100644 index 0000000000000000000000000000000000000000..9bbf08b4cfc5a632583ad6ee8921de9f0f5bc340 Binary files /dev/null and b/flame/__pycache__/__init__.cpython-312.pyc differ diff --git a/flame/__pycache__/data.cpython-312.pyc b/flame/__pycache__/data.cpython-312.pyc new file mode 100644 index 0000000000000000000000000000000000000000..d7b50f7b4a6ace23f8c05b80c7935997cb1f7deb Binary files /dev/null and b/flame/__pycache__/data.cpython-312.pyc differ diff --git a/flame/__pycache__/logging.cpython-312.pyc b/flame/__pycache__/logging.cpython-312.pyc new file mode 100644 index 0000000000000000000000000000000000000000..608991b0075a4b4e2f4bd2093e88475f67905cc8 Binary files /dev/null and b/flame/__pycache__/logging.cpython-312.pyc differ diff --git 
a/flame/__pycache__/parser.cpython-312.pyc b/flame/__pycache__/parser.cpython-312.pyc new file mode 100644 index 0000000000000000000000000000000000000000..2c51a4e9fb8a40cd98bba88487a938155d2f4063 Binary files /dev/null and b/flame/__pycache__/parser.cpython-312.pyc differ diff --git a/flame/data.py b/flame/data.py new file mode 100644 index 0000000000000000000000000000000000000000..fd2806b43dede99ea494702c99afa22e0dfacf0e --- /dev/null +++ b/flame/data.py @@ -0,0 +1,86 @@ +# -*- coding: utf-8 -*- + +from dataclasses import dataclass +from typing import Any, Dict, List, Union + +import numpy as np +import torch +from transformers import PreTrainedTokenizer + + +@dataclass +class DataCollatorForLanguageModeling: + """ + Data collator used for language modeling. Inputs are dynamically padded to the maximum length of a batch if they + are not all of the same length. + + Args: + tokenizer ([`PreTrainedTokenizer`] or [`PreTrainedTokenizerFast`]): + The tokenizer used for encoding the data. + varlen (`bool`): + Whether to return sequences with variable lengths. + If `True`, the offsets indicating the start and end of each sequence will be returned. + For example, if the sequence lengths are `[4, 8, 12]`, + the returned `input_ids` will be a long flattened tensor of shape `[1, 24]`, with `offsets` being `[0, 4, 12, 24]`. + If `False`, the `input_ids` with shape `[batch_size, seq_len]` will be returned directly. + return_tensors (`str`): + The type of Tensor to return. Allowable values are "pt". + + + + For best performance, this data collator should be used with a dataset having items that are dictionaries or + BatchEncoding, with the `"special_tokens_mask"` key, as returned by a [`PreTrainedTokenizer`] or a + [`PreTrainedTokenizerFast`] with the argument `return_special_tokens_mask=True`. + + """ + + tokenizer: PreTrainedTokenizer + varlen: bool = False + return_tensors: str = "pt" + + def __call__( + self, + examples: List[Union[List[int], Dict[str, Any]]] + ) -> Dict[str, Any]: + if not isinstance(examples[0], Dict): + examples = [{'input_ids': x} for x in examples] + if isinstance(examples[0]['input_ids'], List): + examples = [{'input_ids': torch.tensor(x['input_ids'], dtype=torch.long)} for x in examples] + elif isinstance(examples[0]['input_ids'], np.ndarray): + examples = [{'input_ids': torch.from_numpy(x['input_ids'])} for x in examples] + + if not self.varlen: + length_of_first = examples[0]['input_ids'].size(0) + # Check if padding is necessary. + if all(x['input_ids'].size(0) == length_of_first for x in examples): + batch = {'input_ids': torch.stack([x['input_ids'] for x in examples], dim=0)} + else: + # If yes, check if we have a `pad_token`. + if self.tokenizer._pad_token is None: + raise ValueError( + f"You are attempting to pad samples but the tokenizer you are using " + f"({self.tokenizer.__class__.__name__}) does not have a pad token." 
+                    )
+                batch = self.tokenizer.pad(examples, return_tensors=self.return_tensors, return_attention_mask=False)
+        else:
+            batch = {'input_ids': torch.cat([x['input_ids'] for x in examples], dim=0).unsqueeze(0)}
+            if self.tokenizer.add_bos_token:
+                offsets = []
+                if batch['input_ids'][0, 0] != self.tokenizer.bos_token_id:
+                    offsets.append(torch.tensor([0], dtype=torch.long))
+                offsets.append(torch.where(batch['input_ids'].eq(self.tokenizer.bos_token_id))[1])
+                offsets.append(torch.tensor([len(batch['input_ids'][0])], dtype=torch.long))
+                batch['offsets'] = torch.cat(offsets, dim=0)
+            elif self.tokenizer.add_eos_token:
+                offsets = [torch.tensor([0], dtype=torch.long)]
+                offsets.append(torch.where(batch['input_ids'].eq(self.tokenizer.eos_token_id))[1] + 1)
+                if batch['input_ids'][0, -1] != self.tokenizer.eos_token_id:
+                    offsets.append(torch.tensor([len(batch['input_ids'][0])], dtype=torch.long))
+                batch['offsets'] = torch.cat(offsets, dim=0)
+            else:
+                raise ValueError("You must allow the tokenizer to add either a bos or eos token as separators.")
+        labels = batch['input_ids'].clone()
+        if self.tokenizer.pad_token_id is not None:
+            labels[labels == self.tokenizer.pad_token_id] = -100
+        batch["labels"] = labels
+        return batch
diff --git a/flame/logging.py b/flame/logging.py
new file mode 100644
index 0000000000000000000000000000000000000000..5ebdf2e1281a187f246430ed8b1074b40ebfb450
--- /dev/null
+++ b/flame/logging.py
@@ -0,0 +1,149 @@
+# -*- coding: utf-8 -*-
+
+import json
+import logging
+import os
+import sys
+import time
+
+from transformers.trainer_callback import (ExportableState, TrainerCallback,
+                                           TrainerControl, TrainerState)
+from transformers.training_args import TrainingArguments
+
+
+class LoggerHandler(logging.Handler):
+    r"""
+    Logger handler used in Web UI.
+    """
+
+    def __init__(self):
+        super().__init__()
+        self.log = ""
+
+    def reset(self):
+        self.log = ""
+
+    def emit(self, record):
+        if record.name == "httpx":
+            return
+        log_entry = self.format(record)
+        self.log += log_entry
+        self.log += "\n\n"
+
+
+def get_logger(name: str) -> logging.Logger:
+    r"""
+    Gets a standard logger with a stream handler to stdout.
+    """
+    formatter = logging.Formatter(
+        fmt="%(asctime)s - %(levelname)s - %(name)s - %(message)s", datefmt="%m/%d/%Y %H:%M:%S"
+    )
+    handler = logging.StreamHandler(sys.stdout)
+    handler.setFormatter(formatter)
+
+    logger = logging.getLogger(name)
+    logger.setLevel(logging.INFO)
+    logger.addHandler(handler)
+
+    return logger
+
+
+def reset_logging() -> None:
+    r"""
+    Removes all handlers and filters from the root logger (unused in this script).
+    """
+    root = logging.getLogger()
+    list(map(root.removeHandler, root.handlers))
+    list(map(root.removeFilter, root.filters))
+
+
+logger = get_logger(__name__)
+
+LOG_FILE_NAME = "trainer_log.jsonl"
+
+
+class LogCallback(TrainerCallback, ExportableState):
+    def __init__(self, start_time: float = None, elapsed_time: float = None):
+
+        self.start_time = time.time() if start_time is None else start_time
+        self.elapsed_time = 0 if elapsed_time is None else elapsed_time
+        self.last_time = self.start_time
+
+    def on_train_begin(
+        self,
+        args: TrainingArguments,
+        state: TrainerState,
+        control: TrainerControl,
+        **kwargs
+    ):
+        r"""
+        Event called at the beginning of training.
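+        When training resumes from a checkpoint, the timing state saved via
+        `state()` is restored from `state.stateful_callbacks['LogCallback']`,
+        so the reported throughput accounts for time spent in previous runs.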
+ """ + if state.is_local_process_zero: + if not args.resume_from_checkpoint: + self.start_time = time.time() + self.elapsed_time = 0 + else: + self.start_time = state.stateful_callbacks['LogCallback']['start_time'] + self.elapsed_time = state.stateful_callbacks['LogCallback']['elapsed_time'] + + if args.save_on_each_node: + if not state.is_local_process_zero: + return + else: + if not state.is_world_process_zero: + return + + self.last_time = time.time() + if os.path.exists(os.path.join(args.output_dir, LOG_FILE_NAME)) and args.overwrite_output_dir: + logger.warning("Previous log file in this folder will be deleted.") + os.remove(os.path.join(args.output_dir, LOG_FILE_NAME)) + + def on_log( + self, + args: TrainingArguments, + state: TrainerState, + control: TrainerControl, + logs, + **kwargs + ): + if args.save_on_each_node: + if not state.is_local_process_zero: + return + else: + if not state.is_world_process_zero: + return + + self.elapsed_time += time.time() - self.last_time + self.last_time = time.time() + if 'num_input_tokens_seen' in logs: + logs['num_tokens'] = logs.pop('num_input_tokens_seen') + state.log_history[-1].pop('num_input_tokens_seen') + throughput = logs['num_tokens'] / args.world_size / self.elapsed_time + state.log_history[-1]['throughput'] = logs['throughput'] = throughput + state.stateful_callbacks["LogCallback"] = self.state() + + logs = dict( + current_steps=state.global_step, + total_steps=state.max_steps, + loss=state.log_history[-1].get("loss", None), + eval_loss=state.log_history[-1].get("eval_loss", None), + predict_loss=state.log_history[-1].get("predict_loss", None), + learning_rate=state.log_history[-1].get("learning_rate", None), + epoch=state.log_history[-1].get("epoch", None), + percentage=round(state.global_step / state.max_steps * 100, 2) if state.max_steps != 0 else 100, + ) + + os.makedirs(args.output_dir, exist_ok=True) + with open(os.path.join(args.output_dir, "trainer_log.jsonl"), "a", encoding="utf-8") as f: + f.write(json.dumps(logs) + "\n") + + def state(self) -> dict: + return { + 'start_time': self.start_time, + 'elapsed_time': self.elapsed_time + } + + @classmethod + def from_state(cls, state): + return cls(state['start_time'], state['elapsed_time']) diff --git a/flame/parser.py b/flame/parser.py new file mode 100644 index 0000000000000000000000000000000000000000..621a0ef157f0e6c96f016d8f21abe4e6b2b8772d --- /dev/null +++ b/flame/parser.py @@ -0,0 +1,90 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +from dataclasses import dataclass, field +from typing import Optional + +import transformers +from transformers import HfArgumentParser, TrainingArguments + +from flame.logging import get_logger + +logger = get_logger(__name__) + + +@dataclass +class TrainingArguments(TrainingArguments): + + model_name_or_path: str = field( + default=None, + metadata={ + "help": "Path to the model weight or identifier from huggingface.co/models or modelscope.cn/models." + }, + ) + tokenizer: str = field( + default="mistralai/Mistral-7B-v0.1", + metadata={"help": "Name of the tokenizer to use."} + ) + use_fast_tokenizer: bool = field( + default=False, + metadata={"help": "Whether or not to use one of the fast tokenizer (backed by the tokenizers library)."}, + ) + from_config: bool = field( + default=True, + metadata={"help": "Whether to initialize models from scratch."}, + ) + dataset: Optional[str] = field( + default=None, + metadata={"help": "The dataset(s) to use. 
Use commas to separate multiple datasets."}, + ) + dataset_name: Optional[str] = field( + default=None, + metadata={"help": "The name of provided dataset(s) to use."}, + ) + cache_dir: str = field( + default=None, + metadata={"help": "Path to the cached tokenized dataset."}, + ) + split: str = field( + default="train", + metadata={"help": "Which dataset split to use for training and evaluation."}, + ) + streaming: bool = field( + default=False, + metadata={"help": "Enable dataset streaming."}, + ) + hf_hub_token: Optional[str] = field( + default=None, + metadata={"help": "Auth token to log in with Hugging Face Hub."}, + ) + preprocessing_num_workers: Optional[int] = field( + default=None, + metadata={"help": "The number of processes to use for the pre-processing."}, + ) + buffer_size: int = field( + default=2048, + metadata={"help": "Size of the buffer to randomly sample examples from in dataset streaming."}, + ) + context_length: int = field( + default=2048, + metadata={"help": "The context length of the tokenized inputs in the dataset."}, + ) + + +def get_train_args(): + parser = HfArgumentParser(TrainingArguments) + args, unknown_args = parser.parse_args_into_dataclasses(return_remaining_strings=True) + + if unknown_args: + print(parser.format_help()) + print("Got unknown args, potentially deprecated arguments: {}".format(unknown_args)) + raise ValueError("Some specified arguments are not used by the HfArgumentParser: {}".format(unknown_args)) + + if args.should_log: + transformers.utils.logging.set_verbosity(args.get_process_log_level()) + transformers.utils.logging.enable_default_handler() + transformers.utils.logging.enable_explicit_format() + # set seeds manually + transformers.set_seed(args.seed) + return args diff --git a/model.safetensors b/model.safetensors new file mode 100644 index 0000000000000000000000000000000000000000..1631c6c8953ab70d5cd91247b9a26ad48607e150 --- /dev/null +++ b/model.safetensors @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:077ea186538f6591ceaaa5ff30ddde7385b35d39ebb98b7e10538dc67e588936 +size 169434248 diff --git a/preprocess.py b/preprocess.py new file mode 100644 index 0000000000000000000000000000000000000000..d175b0cb8f0a26737295989c7f4a5a2c0d7d8517 --- /dev/null +++ b/preprocess.py @@ -0,0 +1,114 @@ +# -*- coding: utf-8 -*- + +from __future__ import annotations + +import argparse +import logging +from itertools import chain +from typing import Any, Dict, List, Optional + +from datasets import load_dataset +from transformers import AutoTokenizer + +logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s') + + +def tokenize( + examples: Dict[str, List[Any]], + tokenizer: AutoTokenizer, + context_length: int +) -> Dict[str, List[List[int]]]: + """ + Tokenize the input text and split into chunks of specified context length. + + Args: + examples: + Dictionary containing the input text. + tokenizer: + Initialized tokenizer. + context_length: + Length of each context chunk. 
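+
+    For example, with `context_length=8`, three documents that tokenize to
+    5, 7 and 4 ids are concatenated into one stream of 16 ids and split into
+    two chunks of 8; a trailing remainder shorter than `context_length` is
+    dropped.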
+ + Returns: + Dictionary containing tokenized and chunked input ids + """ + text = examples['text'] + input_ids = tokenizer(text)['input_ids'] + input_ids = list(chain(*input_ids)) + total_length = len(input_ids) + total_length = (total_length // context_length) * context_length + # The last chunk smaller than context_length will be discarded + return {'input_ids': [input_ids[i:i+context_length] for i in range(0, total_length, context_length)]} + + +def preprocess( + dataset: str, + name: Optional[str] = None, + split: str = 'train', + output: str = 'data', + model: str = 'mistralai/Mistral-7B-v0.1', + num_proc: int = 64, + context_length: int = 8192 +) -> None: + """ + Load, tokenize, and save the processed dataset. + + Args: + dataset: + Path or name of the dataset. + name: + Name of the dataset configuration. + split: + Dataset split to process. + output: + Output directory. + model: + Model name for tokenizer. + num_proc: + Number of processes for parallel processing. + context_length: + Context length for tokenization. + """ + tokenized_path = f'{output}/{dataset}/{name}/{split}' if name is not None else f'{output}/{dataset}/{split}' + + logging.info(f'Initializing tokenizer of {model}') + tokenizer = AutoTokenizer.from_pretrained(model, trust_remote_code=True) + logging.info(f'Tokenizer initialized: {tokenizer}') + + logging.info(f'Loading dataset: {dataset}') + dataset = load_dataset(dataset, name=name, split=split) + + remove_columns = list(next(iter(dataset)).keys()) + logging.info('Tokenizing and processing dataset') + dataset = dataset.map( + lambda examples: tokenize(examples, tokenizer, context_length), + batched=True, + remove_columns=remove_columns, + num_proc=num_proc, + desc="Running tokenizer on dataset" + ) + + logging.info(f'Saving processed dataset to {tokenized_path}') + dataset.save_to_disk(tokenized_path, num_proc=num_proc) + + +if __name__ == "__main__": + parser = argparse.ArgumentParser(description="Preprocess and tokenize dataset") + parser.add_argument("--dataset", default="HuggingFaceFW/fineweb-edu", help="Path or name of the dataset") + parser.add_argument("--name", default=None, help="Name of the dataset configuration") + parser.add_argument("--split", default="train", help="Dataset split to process") + parser.add_argument("--output", default="data", help="Output directory") + parser.add_argument("--model", default="mistralai/Mistral-7B-v0.1", help="Model name for tokenizer") + parser.add_argument("--num_proc", type=int, default=64, help="Number of processes for parallel processing") + parser.add_argument("--context_length", type=int, default=8192, help="Context length for tokenization") + args = parser.parse_args() + + preprocess( + dataset=args.dataset, + name=args.name, + split=args.split, + output=args.output, + model=args.model, + num_proc=args.num_proc, + context_length=args.context_length + ) diff --git a/profile.sh b/profile.sh new file mode 100644 index 0000000000000000000000000000000000000000..0678f13bcc5390387c3e3e501559060d58dff082 --- /dev/null +++ b/profile.sh @@ -0,0 +1,194 @@ +args=$@ +for arg in $args; do + eval "$arg" +done + +echo "model: ${model:=mistralai/Mistral-7B-v0.1}" +echo "tokenizer: ${tokenizer:=mistralai/Mistral-7B-v0.1}" +echo "project: ${project:=fla}" +echo "type: ${type:=gla}" +echo "data: ${data:=}" +echo "name: ${name:=}" +echo "cache: ${cache:=}" +echo "seed: ${seed:=42}" +echo "context: ${context:=2048}" +echo "steps: ${steps:=0}" +echo "save: ${save:=2048}" +echo "limit: ${limit:=16}" +echo "preprocessing: 
${preprocessing:=32}"
+echo "workers: ${workers:=32}"
+echo "logging: ${logging:=32}"
+echo "config: ${config:=configs/deepspeed.yaml}"
+echo "push: ${push:=False}"
+
+echo "lr: ${lr:=3e-4}"
+echo "scheduler: ${scheduler:=cosine_with_min_lr}"
+echo "epochs: ${epochs:=1}"
+echo "optim: ${optim:=adamw_torch_fused}"
+echo "decay: ${decay:=0.01}"
+echo "beta1: ${beta1:=0.9}"
+echo "beta2: ${beta2:=0.95}"
+echo "norm: ${norm:=1.0}"
+echo "batch: ${batch:=32}"
+echo "update: ${update:=4}"
+echo "warmup: ${warmup:=512}"
+echo "path: ${path:=}"
+echo "checkpoint: ${checkpoint:=}"
+echo "node: ${node:=}"
+echo "rank: ${rank:=}"
+echo "ip: ${ip:=}"
+echo "port: ${port:=}"
+echo "nodes: ${nodes:=1}"
+
+params="--model_name_or_path $model \
+    --tokenizer $tokenizer \
+    --use_fast_tokenizer \
+    --do_train \
+    --dataset $data \
+    --context_length $context \
+    --streaming \
+    --preprocessing_num_workers $preprocessing \
+    --dataloader_num_workers $workers \
+    --dataloader_prefetch_factor 2 \
+    --ignore_data_skip \
+    --output_dir $path \
+    --overwrite_output_dir \
+    --logging_steps $logging \
+    --include_num_input_tokens_seen \
+    --save_steps $save \
+    --save_total_limit $limit \
+    --learning_rate $lr \
+    --lr_scheduler_type $scheduler \
+    --warmup_steps $warmup \
+    --optim $optim \
+    --weight_decay $decay \
+    --adam_beta1=$beta1 \
+    --adam_beta2=$beta2 \
+    --max_grad_norm $norm \
+    --num_train_epochs $epochs \
+    --per_device_train_batch_size $batch \
+    --gradient_accumulation_steps $update \
+    --seed $seed \
+    --push_to_hub $push \
+    --bf16"
+
+if [ $steps -gt 0 ]; then
+    params+=" --max_steps $steps"
+fi
+
+if [ "$name" != "" ]; then
+    params+=" --dataset_name $name"
+fi
+if [ "$cache" != "" ]; then
+    params+=" --cache_dir $cache"
+fi
+if [ "$checkpoint" != "" ]; then
+    params+=" --resume_from_checkpoint $checkpoint"
+fi
+if [ "$WANDB_DISABLED" != "true" ]; then
+    params+=" --report_to wandb \
+    --run_name $type.$(basename $path)"
+else
+    params+=" --report_to none"
+fi
+
+NUM_GPUS=$(nvidia-smi --list-gpus | wc -l)
+echo "Launching training..."
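+
+# Multi-node flags for `accelerate launch`, assembled only when an explicit
+# rank is given. (In this profiling script they are left unused, since the
+# final command below invokes `python run.py` directly under `ncu`.)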
+accelerate_params=""
+if [ "$rank" != "" ]; then
+    accelerate_params+=" --machine_rank $rank \
+    --num_processes $((nodes * $NUM_GPUS)) \
+    --num_machines $nodes \
+    --main_process_ip $ip \
+    --main_process_port $port \
+    --same_network"
+fi
+
+if [[ $config == *"deepspeed"* ]]; then
+cat <<EOF > "configs/ds_config.json"
+{
+    "train_batch_size": "auto",
+    "train_micro_batch_size_per_gpu": "auto",
+    "gradient_accumulation_steps": "auto",
+    "gradient_clipping": "auto",
+    "zero_allow_untested_optimizer": true,
+    "bf16": {
+        "enabled": true
+    },
+    "zero_optimization": {
+        "stage": 2,
+        "allgather_partitions": true,
+        "allgather_bucket_size": 5e8,
+        "reduce_scatter": true,
+        "reduce_bucket_size": 5e8,
+        "overlap_comm": false,
+        "contiguous_gradients": true
+    }
+}
+EOF
+cat <<EOF > $config
+compute_environment: LOCAL_MACHINE
+distributed_type: DEEPSPEED
+deepspeed_config:
+  deepspeed_config_file: configs/ds_config.json
+  zero3_init_flag: true
+machine_rank: 0
+main_training_function: main
+num_machines: 1
+num_processes: $NUM_GPUS
+use_cpu: false
+EOF
+fi
+if [[ $config == *"fsdp"* ]]; then
+cat <<EOF > $config
+compute_environment: LOCAL_MACHINE
+distributed_type: FSDP
+fsdp_config:
+  fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP
+  fsdp_backward_prefetch: BACKWARD_PRE
+  fsdp_forward_prefetch: false
+  fsdp_cpu_ram_efficient_loading: true
+  fsdp_offload_params: false
+  fsdp_sharding_strategy: HYBRID_SHARD_ZERO2
+  fsdp_state_dict_type: SHARDED_STATE_DICT
+  fsdp_sync_module_states: true
+  fsdp_use_orig_params: true
+machine_rank: 0
+main_training_function: main
+mixed_precision: bf16
+num_machines: $nodes
+num_processes: $((nodes * $NUM_GPUS))
+rdzv_backend: static
+same_network: true
+tpu_env: []
+tpu_use_cluster: false
+tpu_use_sudo: false
+use_cpu: false
+EOF
+fi
+
+cat $config
+
+set -x
+mkdir -p $path
+cp * $path
+cp -r configs $path
+cp -r flame $path
+cp -r ../fla $path
+
+export TRANSFORMERS_OFFLINE=1
+export HF_DATASETS_OFFLINE=1
+if [ "$date" == "" ]; then
+    date=$(date +%Y%m%d%H%M)
+fi
+# export WANDB_RESUME=allow
+# export WANDB_NAME="$type.$(basename $path)"
+# export WANDB_PROJECT=$project
+# export WANDB_RUN_ID="$WANDB_NAME-$date"
+export WANDB_MODE=offline
+export HF_HUB_OFFLINE=0
+export TRITON_PRINT_AUTOTUNING=1
+ncu --set all -o profiling/train-profile python run.py $params
+
+echo "RUNNING DONE!"
diff --git a/run.py b/run.py
new file mode 100644
index 0000000000000000000000000000000000000000..b9e1be1999ee910335e700112954c30b9b1e43e5
--- /dev/null
+++ b/run.py
@@ -0,0 +1,75 @@
+# -*- coding: utf-8 -*-
+
+from datasets import load_from_disk
+from transformers import (AutoConfig, AutoModelForCausalLM, AutoTokenizer,
+                          Trainer)
+
+import fla  # noqa
+from flame.data import DataCollatorForLanguageModeling
+from flame.logging import LogCallback, get_logger
+from flame.parser import get_train_args
+
+logger = get_logger(__name__)
+
+
+def main():
+    args = get_train_args()
+    logger.info(args)
+
+    tokenizer = AutoTokenizer.from_pretrained(
+        args.tokenizer,
+        use_fast=args.use_fast_tokenizer,
+        trust_remote_code=True,
+        add_bos_token=True,
+        add_eos_token=False
+    )
+    if tokenizer.pad_token_id is None:
+        tokenizer.pad_token = tokenizer.eos_token
+        logger.info("Added pad token: {}".format(tokenizer.pad_token))
+    if args.from_config:
+        logger.info("All model params are randomly initialized for from-scratch training.")
+        model = AutoModelForCausalLM.from_config(AutoConfig.from_pretrained(args.model_name_or_path))
+    else:
+        logger.info(f"Loading pretrained checkpoint {args.model_name_or_path}")
+        model = AutoModelForCausalLM.from_pretrained(args.model_name_or_path)
+    model.train()
+
+    trainable_params, all_param = model.num_parameters(only_trainable=True), model.num_parameters()
+    logger.info(f"% of trainable params: {trainable_params:d} / {all_param:d} = {trainable_params / all_param:.2%}")
+    logger.info(f"{tokenizer}\n{model}\n{model.config}")
+
+    logger.info(f"Loading the `{args.split}` split directly from the cache {args.cache_dir}...")
+    dataset = load_from_disk(args.cache_dir)
+    logger.info(f"{dataset}")
+    logger.info(f"Shuffling the dataset with seed {args.seed}")
+    dataset = dataset.shuffle(seed=args.seed)
+    data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer)
+
+    if args.lr_scheduler_type == 'cosine_with_min_lr':
+        args.lr_scheduler_kwargs = {'min_lr_rate': 0.1}
+    if args.lr_scheduler_type == 'warmup_stable_decay':
+        args.lr_scheduler_kwargs = {
+            'num_stable_steps': args.max_steps * 0.9 - args.warmup_steps,
+            'num_decay_steps': args.max_steps * 0.1
+        }
+
+    trainer = Trainer(
+        model=model,
+        args=args,
+        tokenizer=tokenizer,
+        data_collator=data_collator,
+        callbacks=[LogCallback()],
+        train_dataset=dataset
+    )
+
+    results = trainer.train(resume_from_checkpoint=args.resume_from_checkpoint)
+    trainer.save_model()
+    tokenizer.save_pretrained(trainer.args.output_dir)
+
+    trainer.log_metrics("train", results.metrics)
+    trainer.save_metrics("train", results.metrics)
+    trainer.save_state()
+
+
+if __name__ == "__main__":
+    main()
diff --git a/special_tokens_map.json b/special_tokens_map.json
new file mode 100644
index 0000000000000000000000000000000000000000..72ecfeeb7e14d244c936169d2ed139eeae235ef1
--- /dev/null
+++ b/special_tokens_map.json
@@ -0,0 +1,24 @@
+{
+  "bos_token": {
+    "content": "<s>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "eos_token": {
+    "content": "</s>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "pad_token": "</s>",
+  "unk_token": {
+    "content": "<unk>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  }
+}
diff --git a/tokenizer.json b/tokenizer.json
new file mode 100644
index 0000000000000000000000000000000000000000..65d75227e9e5faa60d4f81dee357d9222a450184
--- /dev/null
+++ b/tokenizer.json
@@ -0,0 +1,268053 @@
+{
+  "version": "1.0",
"truncation": null, + "padding": null, + "added_tokens": [ + { + "id": 0, + "content": "", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 1, + "content": "", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 2, + "content": "", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + } + ], + "normalizer": null, + "pre_tokenizer": { + "type": "Metaspace", + "replacement": "▁", + "prepend_scheme": "first", + "split": false + }, + "post_processor": { + "type": "TemplateProcessing", + "single": [ + { + "SpecialToken": { + "id": "", + "type_id": 0 + } + }, + { + "Sequence": { + "id": "A", + "type_id": 0 + } + } + ], + "pair": [ + { + "SpecialToken": { + "id": "", + "type_id": 0 + } + }, + { + "Sequence": { + "id": "A", + "type_id": 0 + } + }, + { + "SpecialToken": { + "id": "", + "type_id": 1 + } + }, + { + "Sequence": { + "id": "B", + "type_id": 1 + } + } + ], + "special_tokens": { + "": { + "id": "", + "ids": [ + 1 + ], + "tokens": [ + "" + ] + } + } + }, + "decoder": { + "type": "Sequence", + "decoders": [ + { + "type": "Replace", + "pattern": { + "String": "▁" + }, + "content": " " + }, + { + "type": "ByteFallback" + }, + { + "type": "Fuse" + }, + { + "type": "Strip", + "content": " ", + "start": 1, + "stop": 0 + } + ] + }, + "model": { + "type": "BPE", + "dropout": null, + "unk_token": "", + "continuing_subword_prefix": null, + "end_of_word_suffix": null, + "fuse_unk": true, + "byte_fallback": true, + "ignore_merges": false, + "vocab": { + "": 0, + "": 1, + "": 2, + "<0x00>": 3, + "<0x01>": 4, + "<0x02>": 5, + "<0x03>": 6, + "<0x04>": 7, + "<0x05>": 8, + "<0x06>": 9, + "<0x07>": 10, + "<0x08>": 11, + "<0x09>": 12, + "<0x0A>": 13, + "<0x0B>": 14, + "<0x0C>": 15, + "<0x0D>": 16, + "<0x0E>": 17, + "<0x0F>": 18, + "<0x10>": 19, + "<0x11>": 20, + "<0x12>": 21, + "<0x13>": 22, + "<0x14>": 23, + "<0x15>": 24, + "<0x16>": 25, + "<0x17>": 26, + "<0x18>": 27, + "<0x19>": 28, + "<0x1A>": 29, + "<0x1B>": 30, + "<0x1C>": 31, + "<0x1D>": 32, + "<0x1E>": 33, + "<0x1F>": 34, + "<0x20>": 35, + "<0x21>": 36, + "<0x22>": 37, + "<0x23>": 38, + "<0x24>": 39, + "<0x25>": 40, + "<0x26>": 41, + "<0x27>": 42, + "<0x28>": 43, + "<0x29>": 44, + "<0x2A>": 45, + "<0x2B>": 46, + "<0x2C>": 47, + "<0x2D>": 48, + "<0x2E>": 49, + "<0x2F>": 50, + "<0x30>": 51, + "<0x31>": 52, + "<0x32>": 53, + "<0x33>": 54, + "<0x34>": 55, + "<0x35>": 56, + "<0x36>": 57, + "<0x37>": 58, + "<0x38>": 59, + "<0x39>": 60, + "<0x3A>": 61, + "<0x3B>": 62, + "<0x3C>": 63, + "<0x3D>": 64, + "<0x3E>": 65, + "<0x3F>": 66, + "<0x40>": 67, + "<0x41>": 68, + "<0x42>": 69, + "<0x43>": 70, + "<0x44>": 71, + "<0x45>": 72, + "<0x46>": 73, + "<0x47>": 74, + "<0x48>": 75, + "<0x49>": 76, + "<0x4A>": 77, + "<0x4B>": 78, + "<0x4C>": 79, + "<0x4D>": 80, + "<0x4E>": 81, + "<0x4F>": 82, + "<0x50>": 83, + "<0x51>": 84, + "<0x52>": 85, + "<0x53>": 86, + "<0x54>": 87, + "<0x55>": 88, + "<0x56>": 89, + "<0x57>": 90, + "<0x58>": 91, + "<0x59>": 92, + "<0x5A>": 93, + "<0x5B>": 94, + "<0x5C>": 95, + "<0x5D>": 96, + "<0x5E>": 97, + "<0x5F>": 98, + "<0x60>": 99, + "<0x61>": 100, + "<0x62>": 101, + "<0x63>": 102, + "<0x64>": 103, + "<0x65>": 104, + "<0x66>": 105, + "<0x67>": 106, + "<0x68>": 107, + "<0x69>": 108, + "<0x6A>": 109, + "<0x6B>": 110, + "<0x6C>": 111, + "<0x6D>": 112, + "<0x6E>": 113, + "<0x6F>": 114, + "<0x70>": 115, + "<0x71>": 116, + "<0x72>": 117, + "<0x73>": 118, + 
"<0x74>": 119, + "<0x75>": 120, + "<0x76>": 121, + "<0x77>": 122, + "<0x78>": 123, + "<0x79>": 124, + "<0x7A>": 125, + "<0x7B>": 126, + "<0x7C>": 127, + "<0x7D>": 128, + "<0x7E>": 129, + "<0x7F>": 130, + "<0x80>": 131, + "<0x81>": 132, + "<0x82>": 133, + "<0x83>": 134, + "<0x84>": 135, + "<0x85>": 136, + "<0x86>": 137, + "<0x87>": 138, + "<0x88>": 139, + "<0x89>": 140, + "<0x8A>": 141, + "<0x8B>": 142, + "<0x8C>": 143, + "<0x8D>": 144, + "<0x8E>": 145, + "<0x8F>": 146, + "<0x90>": 147, + "<0x91>": 148, + "<0x92>": 149, + "<0x93>": 150, + "<0x94>": 151, + "<0x95>": 152, + "<0x96>": 153, + "<0x97>": 154, + "<0x98>": 155, + "<0x99>": 156, + "<0x9A>": 157, + "<0x9B>": 158, + "<0x9C>": 159, + "<0x9D>": 160, + "<0x9E>": 161, + "<0x9F>": 162, + "<0xA0>": 163, + "<0xA1>": 164, + "<0xA2>": 165, + "<0xA3>": 166, + "<0xA4>": 167, + "<0xA5>": 168, + "<0xA6>": 169, + "<0xA7>": 170, + "<0xA8>": 171, + "<0xA9>": 172, + "<0xAA>": 173, + "<0xAB>": 174, + "<0xAC>": 175, + "<0xAD>": 176, + "<0xAE>": 177, + "<0xAF>": 178, + "<0xB0>": 179, + "<0xB1>": 180, + "<0xB2>": 181, + "<0xB3>": 182, + "<0xB4>": 183, + "<0xB5>": 184, + "<0xB6>": 185, + "<0xB7>": 186, + "<0xB8>": 187, + "<0xB9>": 188, + "<0xBA>": 189, + "<0xBB>": 190, + "<0xBC>": 191, + "<0xBD>": 192, + "<0xBE>": 193, + "<0xBF>": 194, + "<0xC0>": 195, + "<0xC1>": 196, + "<0xC2>": 197, + "<0xC3>": 198, + "<0xC4>": 199, + "<0xC5>": 200, + "<0xC6>": 201, + "<0xC7>": 202, + "<0xC8>": 203, + "<0xC9>": 204, + "<0xCA>": 205, + "<0xCB>": 206, + "<0xCC>": 207, + "<0xCD>": 208, + "<0xCE>": 209, + "<0xCF>": 210, + "<0xD0>": 211, + "<0xD1>": 212, + "<0xD2>": 213, + "<0xD3>": 214, + "<0xD4>": 215, + "<0xD5>": 216, + "<0xD6>": 217, + "<0xD7>": 218, + "<0xD8>": 219, + "<0xD9>": 220, + "<0xDA>": 221, + "<0xDB>": 222, + "<0xDC>": 223, + "<0xDD>": 224, + "<0xDE>": 225, + "<0xDF>": 226, + "<0xE0>": 227, + "<0xE1>": 228, + "<0xE2>": 229, + "<0xE3>": 230, + "<0xE4>": 231, + "<0xE5>": 232, + "<0xE6>": 233, + "<0xE7>": 234, + "<0xE8>": 235, + "<0xE9>": 236, + "<0xEA>": 237, + "<0xEB>": 238, + "<0xEC>": 239, + "<0xED>": 240, + "<0xEE>": 241, + "<0xEF>": 242, + "<0xF0>": 243, + "<0xF1>": 244, + "<0xF2>": 245, + "<0xF3>": 246, + "<0xF4>": 247, + "<0xF5>": 248, + "<0xF6>": 249, + "<0xF7>": 250, + "<0xF8>": 251, + "<0xF9>": 252, + "<0xFA>": 253, + "<0xFB>": 254, + "<0xFC>": 255, + "<0xFD>": 256, + "<0xFE>": 257, + "<0xFF>": 258, + "▁▁": 259, + "▁▁▁▁": 260, + "▁t": 261, + "in": 262, + "er": 263, + "▁a": 264, + "he": 265, + "on": 266, + "re": 267, + "▁s": 268, + "en": 269, + "at": 270, + "or": 271, + "▁the": 272, + "▁▁▁▁▁▁▁▁": 273, + "es": 274, + "▁w": 275, + "an": 276, + "▁c": 277, + "is": 278, + "it": 279, + "ou": 280, + "▁d": 281, + "al": 282, + "ar": 283, + "▁p": 284, + "▁f": 285, + "ed": 286, + "▁b": 287, + "ing": 288, + "▁o": 289, + "▁m": 290, + "le": 291, + "nd": 292, + "as": 293, + "ic": 294, + "▁h": 295, + "ion": 296, + "▁in": 297, + "▁to": 298, + "et": 299, + "om": 300, + "el": 301, + "▁of": 302, + "st": 303, + "▁and": 304, + "▁l": 305, + "▁th": 306, + "▁n": 307, + "ent": 308, + "il": 309, + "ct": 310, + "ro": 311, + "▁re": 312, + "id": 313, + "am": 314, + "▁I": 315, + "ad": 316, + "▁e": 317, + "▁S": 318, + "▁g": 319, + "▁T": 320, + "im": 321, + "ot": 322, + "ac": 323, + "ur": 324, + "▁(": 325, + "ig": 326, + "▁=": 327, + "ol": 328, + "ut": 329, + "▁A": 330, + "se": 331, + "▁u": 332, + "ve": 333, + "▁C": 334, + "if": 335, + "ow": 336, + "▁y": 337, + "ch": 338, + "ay": 339, + "▁de": 340, + "▁st": 341, + "▁|": 342, + "ver": 343, + ");": 344, + "▁\"": 345, + "ly": 346, + "▁be": 
347, + "**": 348, + "▁is": 349, + "od": 350, + "▁M": 351, + "ation": 352, + "ul": 353, + "▁for": 354, + "▁▁▁▁▁": 355, + "▁on": 356, + "ag": 357, + "ce": 358, + "▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁▁": 359, + "ter": 360, + "ir": 361, + "th": 362, + "▁v": 363, + "qu": 364, + "▁B": 365, + "em": 366, + "▁P": 367, + "▁you": 368, + "▁that": 369, + "un": 370, + "▁{": 371, + "ith": 372, + "ri": 373, + "est": 374, + "ab": 375, + "--": 376, + "ap": 377, + "▁it": 378, + "▁con": 379, + "ate": 380, + "us": 381, + "▁H": 382, + "um": 383, + "▁D": 384, + "os": 385, + "pe": 386, + "▁-": 387, + "▁wh": 388, + "▁al": 389, + "▁as": 390, + "and": 391, + "ist": 392, + "▁L": 393, + "▁W": 394, + "▁with": 395, + "▁an": 396, + "ere": 397, + "▁*": 398, + "▁R": 399, + "▁he": 400, + "▁F": 401, + "oc": 402, + "▁was": 403, + "ers": 404, + "ke": 405, + "out": 406, + "ht": 407, + "▁r": 408, + "ess": 409, + "op": 410, + "res": 411, + "ie": 412, + "▁E": 413, + "▁\\": 414, + "▁The": 415, + "end": 416, + "ld": 417, + "▁N": 418, + "ort": 419, + "▁G": 420, + "//": 421, + "▁#": 422, + "our": 423, + "te": 424, + "ill": 425, + "ain": 426, + "▁se": 427, + "▁▁▁▁▁▁": 428, + "▁$": 429, + "▁pro": 430, + "ore": 431, + "▁com": 432, + "ame": 433, + "tr": 434, + "▁ne": 435, + "rom": 436, + "ub": 437, + "▁at": 438, + "▁ex": 439, + "ant": 440, + "ue": 441, + "▁or": 442, + "▁}": 443, + "art": 444, + "ction": 445, + "▁k": 446, + "pt": 447, + "nt": 448, + "iv": 449, + "de": 450, + "▁O": 451, + "pl": 452, + "urn": 453, + "ight": 454, + "all": 455, + "▁this": 456, + "ser": 457, + "ave": 458, + "▁not": 459, + "▁are": 460, + "▁j": 461, + "▁le": 462, + "iz": 463, + "▁'": 464, + "age": 465, + "ment": 466, + "▁tr": 467, + "ack": 468, + "ust": 469, + "()": 470, + "->": 471, + "ity": 472, + "ine": 473, + "ould": 474, + "▁J": 475, + "og": 476, + "▁from": 477, + "▁we": 478, + "ell": 479, + "▁sh": 480, + "▁en": 481, + "ure": 482, + "port": 483, + "▁ch": 484, + "ne": 485, + "▁by": 486, + "per": 487, + "ard": 488, + "ass": 489, + "ge": 490, + "ak": 491, + "are": 492, + "ok": 493, + "av": 494, + "ive": 495, + "ff": 496, + "ies": 497, + "ath": 498, + "turn": 499, + "▁U": 500, + "int": 501, + "----": 502, + "▁im": 503, + "ost": 504, + "ial": 505, + "▁have": 506, + "ind": 507, + "ip": 508, + "ans": 509, + "xt": 510, + "▁do": 511, + "cl": 512, + "▁if": 513, + "con": 514, + "ia": 515, + "▁his": 516, + "ult": 517, + "rou": 518, + "▁su": 519, + "ra": 520, + "▁un": 521, + "able": 522, + "▁<": 523, + "▁K": 524, + "ome": 525, + "▁qu": 526, + "get": 527, + "▁me": 528, + "ast": 529, + "ect": 530, + "▁##": 531, + "to": 532, + "▁cl": 533, + "▁ab": 534, + "ice": 535, + "ire": 536, + "ber": 537, + "one": 538, + "ich": 539, + "hen": 540, + "▁can": 541, + "▁Th": 542, + "▁la": 543, + "▁all": 544, + "ime": 545, + "ile": 546, + "ide": 547, + "\",": 548, + "▁pl": 549, + "▁V": 550, + "ru": 551, + "orm": 552, + "▁had": 553, + "ud": 554, + "ase": 555, + "ord": 556, + "),": 557, + "▁▁▁▁▁▁▁▁▁▁▁▁": 558, + "▁her": 559, + "▁In": 560, + "ace": 561, + "▁but": 562, + "ata": 563, + "::": 564, + "****": 565, + "ong": 566, + "▁&": 567, + "..": 568, + "▁▁▁▁▁▁▁▁▁▁▁▁▁": 569, + "ite": 570, + "ype": 571, + "act": 572, + "ode": 573, + "▁your": 574, + "▁out": 575, + "▁go": 576, + "lic": 577, + "ally": 578, + "▁so": 579, + "ork": 580, + "au": 581, + "▁up": 582, + "▁_": 583, + "ll": 584, + "==": 585, + "▁my": 586, + "pp": 587, + "cc": 588, + "▁//": 589, + "▁they": 590, + "gh": 591, + "▁us": 592, + "ib": 593, + "ions": 594, + "ach": 595, + "ens": 596, + "▁ar": 597, + "ob": 598, + "elf": 599, + "ook": 600, + "ated": 601, + 
"ang": 602, + "ign": 603, + "▁return": 604, + "▁res": 605, + "ck": 606, + "ous": 607, + "ст": 608, + ").": 609, + "▁п": 610, + ".\"": 611, + "на": 612, + "▁i": 613, + "ail": 614, + "ep": 615, + "▁ad": 616, + "ance": 617, + "(\"": 618, + "▁**": 619, + "ther": 620, + "ake": 621, + "▁will": 622, + "▁comp": 623, + "▁one": 624, + "▁get": 625, + "ov": 626, + "▁Y": 627, + "ary": 628, + "ock": 629, + "▁she": 630, + "che": 631, + "ft": 632, + "▁new": 633, + "▁des": 634, + "▁li": 635, + "ence": 636, + "▁sa": 637, + "ress": 638, + "▁el": 639, + "▁und": 640, + "eg": 641, + "fer": 642, + "ry": 643, + "ear": 644, + "ose": 645, + "very": 646, + "',": 647, + "▁+": 648, + "▁в": 649, + "▁He": 650, + "ublic": 651, + "▁their": 652, + "ize": 653, + "▁were": 654, + "ink": 655, + "own": 656, + "In": 657, + "{\\": 658, + "▁has": 659, + "▁per": 660, + "▁It": 661, + "▁St": 662, + "her": 663, + "ject": 664, + "ра": 665, + "ild": 666, + "so": 667, + "▁sp": 668, + "ни": 669, + "du": 670, + "row": 671, + "alue": 672, + "set": 673, + "form": 674, + "com": 675, + "▁man": 676, + "ont": 677, + "ull": 678, + "▁cont": 679, + "▁more": 680, + "ick": 681, + "▁would": 682, + "▁ev": 683, + "▁about": 684, + "ition": 685, + "▁z": 686, + "ound": 687, + "ree": 688, + "▁Ch": 689, + "▁which": 690, + "io": 691, + "();": 692, + "▁who": 693, + "err": 694, + "ory": 695, + "ount": 696, + "ations": 697, + "▁с": 698, + "ring": 699, + "": 876, + "▁em": 877, + "▁$\\": 878, + "▁year": 879, + "wn": 880, + "},": 881, + "▁del": 882, + "ale": 883, + "ty": 884, + "fig": 885, + "sp": 886, + "hed": 887, + "round": 888, + "ew": 889, + "▁di": 890, + "▁der": 891, + "ри": 892, + "red": 893, + "this": 894, + "let": 895, + "RE": 896, + "ax": 897, + "fr": 898, + "essage": 899, + "ough": 900, + "▁comm": 901, + "fo": 902, + "uch": 903, + "oy": 904, + "▁people": 905, + "ystem": 906, + "▁first": 907, + "▁function": 908, + "ange": 909, + "▁how": 910, + "▁et": 911, + "ah": 912, + "▁look": 913, + "то": 914, + "und": 915, + "▁under": 916, + "ка": 917, + "▁!": 918, + "ray": 919, + "ST": 920, + "ific": 921, + "ли": 922, + "read": 923, + "▁bet": 924, + "ious": 925, + "arg": 926, + "▁need": 927, + "math": 928, + "▁на": 929, + "ert": 930, + "▁op": 931, + "▁acc": 932, + "Pro": 933, + "▁est": 934, + "▁Un": 935, + "▁ent": 936, + "▁rec": 937, + "▁use": 938, + "ен": 939, + "▁par": 940, + "az": 941, + "▁д": 942, + "▁Wh": 943, + "self": 944, + "▁ke": 945, + "та": 946, + "▁want": 947, + "▁end": 948, + "▁don": 949, + "ek": 950, + "ren": 951, + "Name": 952, + "▁=>": 953, + "▁app": 954, + "▁que": 955, + "igh": 956, + "▁bu": 957, + "equ": 958, + "vel": 959, + "▁act": 960, + "cre": 961, + "AT": 962, + "▁var": 963, + "cess": 964, + "====": 965, + "Ex": 966, + "▁add": 967, + "▁mod": 968, + "ung": 969, + "▁where": 970, + "ning": 971, + "▁fl": 972, + "als": 973, + "tern": 974, + "}}": 975, + "▁Al": 976, + "▁pos": 977, + "ank": 978, + "▁ap": 979, + "eng": 980, + "▁“": 981, + "ble": 982, + "▁reg": 983, + "^{": 984, + "▁She": 985, + "▁*/": 986, + "ude": 987, + "add": 988, + "▁two": 989, + "▁col": 990, + "▁sm": 991, + "air": 992, + "▁may": 993, + "fore": 994, + "▁You": 995, + "rough": 996, + "▁che": 997, + "▁att": 998, + "oth": 999, + "ла": 1000, + "▁co": 1001, + "ates": 1002, + "▁rem": 1003, + "ood": 1004, + "Type": 1005, + "led": 1006, + "ful": 1007, + "▁self": 1008, + "of": 1009, + "▁Ar": 1010, + "que": 1011, + "▁every": 1012, + "ref": 1013, + "The": 1014, + "▁And": 1015, + "▁rel": 1016, + "OR": 1017, + "Id": 1018, + "▁even": 1019, + "EN": 1020, + "▁hand": 1021, + "ait": 1022, + "▁should": 
1023, + "▁after": 1024, + "▁dif": 1025, + "ght": 1026, + "ife": 1027, + "ator": 1028, + "ash": 1029, + "ribut": 1030, + "umber": 1031, + "▁see": 1032, + "ms": 1033, + "▁call": 1034, + "yn": 1035, + "dd": 1036, + "▁es": 1037, + "▁make": 1038, + "other": 1039, + "▁—": 1040, + "\");": 1041, + "str": 1042, + "▁long": 1043, + "lement": 1044, + "▁wor": 1045, + "its": 1046, + "▁If": 1047, + "alse": 1048, + "ль": 1049, + "ward": 1050, + "▁по": 1051, + "val": 1052, + "ons": 1053, + "▁Z": 1054, + "▁now": 1055, + "data": 1056, + "amp": 1057, + "ense": 1058, + "▁through": 1059, + "▁down": 1060, + "att": 1061, + "▁static": 1062, + "ics": 1063, + "##": 1064, + "pos": 1065, + "▁void": 1066, + "aw": 1067, + "oun": 1068, + "▁way": 1069, + "ible": 1070, + "vent": 1071, + "ower": 1072, + "▁think": 1073, + "ts": 1074, + "*/": 1075, + "▁again": 1076, + "ating": 1077, + "те": 1078, + "ner": 1079, + "▁most": 1080, + "line": 1081, + "ym": 1082, + "▁sub": 1083, + "erson": 1084, + "▁requ": 1085, + "AL": 1086, + "AR": 1087, + "abel": 1088, + "ond": 1089, + "));": 1090, + "▁Se": 1091, + "▁But": 1092, + "alk": 1093, + "▁An": 1094, + "new": 1095, + "▁because": 1096, + "ger": 1097, + "ular": 1098, + "roup": 1099, + "ta": 1100, + "...": 1101, + "▁cons": 1102, + "▁right": 1103, + "▁fr": 1104, + "be": 1105, + "ily": 1106, + "ки": 1107, + "▁ph": 1108, + "ead": 1109, + "?\"": 1110, + "▁gu": 1111, + "▁else": 1112, + "▁som": 1113, + "rent": 1114, + "co": 1115, + "ement": 1116, + "▁str": 1117, + "ault": 1118, + "▁з": 1119, + "ло": 1120, + "sert": 1121, + "var": 1122, + "type": 1123, + "▁Com": 1124, + "ле": 1125, + "ins": 1126, + "me": 1127, + "way": 1128, + "ident": 1129, + "▁prov": 1130, + "▁м": 1131, + "▁true": 1132, + "▁Pro": 1133, + "fl": 1134, + "▁sl": 1135, + "▁As": 1136, + "}\\": 1137, + "ID": 1138, + "ues": 1139, + "▁inst": 1140, + "▁name": 1141, + "ox": 1142, + "▁)": 1143, + "li": 1144, + "ames": 1145, + "Res": 1146, + "▁sur": 1147, + "param": 1148, + "▁start": 1149, + "aj": 1150, + "SE": 1151, + "ask": 1152, + "IT": 1153, + "String": 1154, + "▁ass": 1155, + "▁play": 1156, + "ting": 1157, + "ton": 1158, + "▁before": 1159, + "▁pol": 1160, + "arch": 1161, + "▁well": 1162, + "Com": 1163, + "any": 1164, + "olog": 1165, + "▁err": 1166, + "▁these": 1167, + "ars": 1168, + "eb": 1169, + "▁br": 1170, + "▁incl": 1171, + "▁hel": 1172, + "ern": 1173, + "ody": 1174, + "во": 1175, + "▁ind": 1176, + "----------------": 1177, + "▁data": 1178, + "▁good": 1179, + "LE": 1180, + "],": 1181, + "▁av": 1182, + "▁ac": 1183, + "ider": 1184, + "не": 1185, + "▁Q": 1186, + "▁min": 1187, + "▁much": 1188, + "ci": 1189, + "els": 1190, + "▁cur": 1191, + "▁value": 1192, + "ery": 1193, + "uf": 1194, + "▁loc": 1195, + "reak": 1196, + "ative": 1197, + "imes": 1198, + "Cl": 1199, + "▁,": 1200, + "▁ser": 1201, + "▁die": 1202, + "▁trans": 1203, + "▁result": 1204, + "ext": 1205, + "▁aut": 1206, + "land": 1207, + "▁&&": 1208, + "Ch": 1209, + "ten": 1210, + "}$": 1211, + "▁type": 1212, + "cond": 1213, + "ices": 1214, + "▁very": 1215, + "▁own": 1216, + "▁fil": 1217, + "ities": 1218, + "▁produ": 1219, + "▁read": 1220, + "▁form": 1221, + "▁case": 1222, + "ather": 1223, + "ти": 1224, + "да": 1225, + "ер": 1226, + "Th": 1227, + "aut": 1228, + "▁spec": 1229, + "ij": 1230, + "bl": 1231, + "ility": 1232, + "▁é": 1233, + "▁er": 1234, + "▁does": 1235, + "▁here": 1236, + "the": 1237, + "ures": 1238, + "▁%": 1239, + "min": 1240, + "▁null": 1241, + "rap": 1242, + "\")": 1243, + "rr": 1244, + "List": 1245, + "right": 1246, + "▁User": 1247, + "UL": 1248, + "ational": 1249, + 
"▁being": 1250, + "AN": 1251, + "sk": 1252, + "▁car": 1253, + "ole": 1254, + "▁dist": 1255, + "plic": 1256, + "ollow": 1257, + "▁pres": 1258, + "▁such": 1259, + "ream": 1260, + "ince": 1261, + "gan": 1262, + "▁For": 1263, + "\":": 1264, + "son": 1265, + "rivate": 1266, + "▁years": 1267, + "▁serv": 1268, + "▁made": 1269, + "def": 1270, + ";\r": 1271, + "▁gl": 1272, + "▁bel": 1273, + "▁list": 1274, + "▁cor": 1275, + "▁det": 1276, + "ception": 1277, + "egin": 1278, + "▁б": 1279, + "▁char": 1280, + "trans": 1281, + "▁fam": 1282, + "▁!=": 1283, + "ouse": 1284, + "▁dec": 1285, + "ica": 1286, + "▁many": 1287, + "aking": 1288, + "▁à": 1289, + "▁sim": 1290, + "ages": 1291, + "uff": 1292, + "ased": 1293, + "man": 1294, + "▁Sh": 1295, + "iet": 1296, + "irect": 1297, + "▁Re": 1298, + "▁differ": 1299, + "▁find": 1300, + "ethod": 1301, + "▁\r": 1302, + "ines": 1303, + "▁inv": 1304, + "▁point": 1305, + "▁They": 1306, + "▁used": 1307, + "ctions": 1308, + "▁still": 1309, + "ió": 1310, + "ined": 1311, + "▁while": 1312, + "It": 1313, + "ember": 1314, + "▁say": 1315, + "▁help": 1316, + "▁cre": 1317, + "▁x": 1318, + "▁Tr": 1319, + "ument": 1320, + "▁sk": 1321, + "ought": 1322, + "ually": 1323, + "message": 1324, + "▁Con": 1325, + "▁mon": 1326, + "ared": 1327, + "work": 1328, + "):": 1329, + "ister": 1330, + "arn": 1331, + "ized": 1332, + "Data": 1333, + "orn": 1334, + "▁head": 1335, + "DE": 1336, + "▁Le": 1337, + "▁person": 1338, + "ments": 1339, + "ength": 1340, + "▁false": 1341, + "▁med": 1342, + "▁De": 1343, + "ache": 1344, + "ited": 1345, + "▁let": 1346, + "▁show": 1347, + "▁same": 1348, + "uss": 1349, + "▁gener": 1350, + "▁у": 1351, + "cur": 1352, + "▁real": 1353, + "ced": 1354, + "\">": 1355, + "struct": 1356, + "begin": 1357, + "cept": 1358, + "▁bo": 1359, + "ired": 1360, + "▁Fr": 1361, + "▁stud": 1362, + "dev": 1363, + "Ar": 1364, + "(\\": 1365, + "▁Cl": 1366, + "ween": 1367, + "▁too": 1368, + "▁test": 1369, + "▁day": 1370, + "oh": 1371, + "▁follow": 1372, + "ature": 1373, + "ze": 1374, + "ien": 1375, + "reg": 1376, + "ces": 1377, + "uring": 1378, + "amb": 1379, + "ina": 1380, + "cri": 1381, + "▁ed": 1382, + "SS": 1383, + "uck": 1384, + "▁/*": 1385, + "CT": 1386, + "▁There": 1387, + "▁take": 1388, + "par": 1389, + "ule": 1390, + "cal": 1391, + "for": 1392, + "****************": 1393, + "source": 1394, + "▁those": 1395, + "col": 1396, + "▁eff": 1397, + "mod": 1398, + "cont": 1399, + "}{": 1400, + "▁around": 1401, + "press": 1402, + "by": 1403, + "▁going": 1404, + "ponse": 1405, + "▁С": 1406, + "▁line": 1407, + "date": 1408, + "code": 1409, + "['": 1410, + "▁life": 1411, + "ason": 1412, + "▁using": 1413, + "▁val": 1414, + "▁du": 1415, + "yp": 1416, + "▁▁▁▁▁▁▁▁▁▁▁▁▁▁": 1417, + "▁On": 1418, + "▁found": 1419, + "olut": 1420, + "']": 1421, + "arent": 1422, + "▁string": 1423, + "▁met": 1424, + "▁wr": 1425, + "ush": 1426, + "string": 1427, + "size": 1428, + "▁ver": 1429, + "▁each": 1430, + "value": 1431, + "▁last": 1432, + "▁got": 1433, + "ven": 1434, + "back": 1435, + "Set": 1436, + "ey": 1437, + "rol": 1438, + "▁cr": 1439, + "thing": 1440, + "ret": 1441, + "és": 1442, + "ism": 1443, + "▁between": 1444, + "Ob": 1445, + "ething": 1446, + "mp": 1447, + "▁lo": 1448, + "ats": 1449, + "▁New": 1450, + "ви": 1451, + "ado": 1452, + "dex": 1453, + "ди": 1454, + "▁pass": 1455, + "wh": 1456, + "▁den": 1457, + "Get": 1458, + "apt": 1459, + "▁ask": 1460, + "▁sup": 1461, + "Value": 1462, + "ны": 1463, + "▁try": 1464, + "lation": 1465, + "day": 1466, + "ness": 1467, + "ets": 1468, + "▁exper": 1469, + "Tr": 1470, + "▁Mar": 
1471, + "serv": 1472, + "br": 1473, + "▁number": 1474, + "inal": 1475, + "cent": 1476, + "/*": 1477, + "not": 1478, + "ional": 1479, + "▁final": 1480, + "')": 1481, + "▁run": 1482, + "over": 1483, + "▁never": 1484, + "uc": 1485, + "▁high": 1486, + "yle": 1487, + "▁ins": 1488, + "▁best": 1489, + "ittle": 1490, + "ric": 1491, + "▁sign": 1492, + "▁dem": 1493, + "iness": 1494, + "gy": 1495, + "▁war": 1496, + "ished": 1497, + "▁giv": 1498, + "key": 1499, + "▁X": 1500, + "($": 1501, + "▁child": 1502, + "less": 1503, + "ways": 1504, + "incl": 1505, + "rop": 1506, + "raw": 1507, + "://": 1508, + "▁«": 1509, + "no": 1510, + "indow": 1511, + "fe": 1512, + "riend": 1513, + "▁les": 1514, + "▁los": 1515, + "file": 1516, + "formation": 1517, + "ccess": 1518, + "▁В": 1519, + "na": 1520, + "▁il": 1521, + "ision": 1522, + "ler": 1523, + "▁art": 1524, + "Cont": 1525, + "▁world": 1526, + "▁turn": 1527, + "▁really": 1528, + "▁Ex": 1529, + "ма": 1530, + "▁П": 1531, + "ters": 1532, + "arget": 1533, + "Err": 1534, + "▁happ": 1535, + "time": 1536, + "▁So": 1537, + "div": 1538, + "▁didn": 1539, + "ada": 1540, + "oot": 1541, + "})": 1542, + "▁sch": 1543, + "▁cle": 1544, + "▁something": 1545, + "().": 1546, + "▁cour": 1547, + "ever": 1548, + "ants": 1549, + "▁?": 1550, + "To": 1551, + "▁`": 1552, + "try": 1553, + "ux": 1554, + "ais": 1555, + "ross": 1556, + "hip": 1557, + "▁rep": 1558, + "label": 1559, + "▁both": 1560, + "*,": 1561, + "ott": 1562, + "ми": 1563, + "ane": 1564, + "▁open": 1565, + "ww": 1566, + "▁come": 1567, + "▁ext": 1568, + "rem": 1569, + "_{\\": 1570, + "▁old": 1571, + "ched": 1572, + "._": 1573, + "ME": 1574, + "ify": 1575, + "gg": 1576, + "Col": 1577, + "view": 1578, + "▁bus": 1579, + "▁must": 1580, + "▁different": 1581, + "log": 1582, + "ists": 1583, + "roll": 1584, + "ai": 1585, + "▁за": 1586, + "▁system": 1587, + "ivers": 1588, + "atus": 1589, + "ote": 1590, + "med": 1591, + "].": 1592, + "akes": 1593, + "RO": 1594, + "▁cent": 1595, + "gram": 1596, + "▁private": 1597, + "▁great": 1598, + "\";": 1599, + "opy": 1600, + "▁feel": 1601, + "▁How": 1602, + "////": 1603, + "IC": 1604, + "▁dr": 1605, + "ains": 1606, + "lock": 1607, + "En": 1608, + "▁Sch": 1609, + "▁mat": 1610, + "▁home": 1611, + "perty": 1612, + "test": 1613, + "loc": 1614, + "▁wom": 1615, + "sw": 1616, + "arly": 1617, + "▁En": 1618, + "▁ко": 1619, + "den": 1620, + "ста": 1621, + "▁а": 1622, + "eter": 1623, + "▁includ": 1624, + "ULL": 1625, + "▁mem": 1626, + "▁po": 1627, + "▁little": 1628, + "▁arg": 1629, + "▁},": 1630, + "include": 1631, + "eta": 1632, + "▁place": 1633, + "idth": 1634, + "ustom": 1635, + "▁||": 1636, + "▁tem": 1637, + "ried": 1638, + "▁fact": 1639, + "ience": 1640, + "▁Pl": 1641, + "opt": 1642, + "ele": 1643, + "go": 1644, + "AC": 1645, + "inter": 1646, + "========": 1647, + "(),": 1648, + "ots": 1649, + "ral": 1650, + "ique": 1651, + "aving": 1652, + "ml": 1653, + "▁thought": 1654, + "frac": 1655, + "▁care": 1656, + "());": 1657, + "▁put": 1658, + "▁might": 1659, + "▁Amer": 1660, + "▁(!": 1661, + "ample": 1662, + "alth": 1663, + "▁few": 1664, + "▁state": 1665, + "sub": 1666, + "▁Or": 1667, + "];": 1668, + "▁size": 1669, + "▁Sp": 1670, + "▁without": 1671, + "▁poss": 1672, + "eq": 1673, + "play": 1674, + "▁expect": 1675, + "▁second": 1676, + "▁String": 1677, + "uild": 1678, + "▁next": 1679, + "++": 1680, + "requ": 1681, + "▁All": 1682, + "▁men": 1683, + "▁When": 1684, + "iter": 1685, + "ament": 1686, + "net": 1687, + "▁К": 1688, + "ron": 1689, + "aint": 1690, + "▁Is": 1691, + "ве": 1692, + "pend": 1693, + 
"translation": 1694, + "▁го": 1695, + "че": 1696, + "▁van": 1697, + "▁another": 1698, + "▁ret": 1699, + "▁La": 1700, + "Mod": 1701, + "ION": 1702, + "list": 1703, + "▁post": 1704, + "da": 1705, + "ware": 1706, + "▁word": 1707, + "Error": 1708, + "▁seem": 1709, + "▁contin": 1710, + "atic": 1711, + "▁three": 1712, + "Object": 1713, + "▁partic": 1714, + "$.": 1715, + "▁mark": 1716, + "▁vis": 1717, + "rc": 1718, + "▁sw": 1719, + "ptions": 1720, + "▁break": 1721, + "▁things": 1722, + "ute": 1723, + "ui": 1724, + "▁That": 1725, + "urs": 1726, + "gl": 1727, + "ру": 1728, + "▁file": 1729, + "use": 1730, + "igned": 1731, + "part": 1732, + "Un": 1733, + "▁equ": 1734, + "(&": 1735, + "▁lead": 1736, + "rm": 1737, + "ained": 1738, + "▁Be": 1739, + "path": 1740, + "▁small": 1741, + "ager": 1742, + "▁always": 1743, + "▁El": 1744, + "▁order": 1745, + "▁ey": 1746, + "▁won": 1747, + "ape": 1748, + "▁left": 1749, + "ava": 1750, + "item": 1751, + "hor": 1752, + "▁away": 1753, + "bb": 1754, + "fun": 1755, + "▁Ind": 1756, + "mb": 1757, + "▁struct": 1758, + "▁process": 1759, + "▁support": 1760, + ");\r": 1761, + "ión": 1762, + "LO": 1763, + "▁oper": 1764, + "UT": 1765, + "▁·": 1766, + "PE": 1767, + "load": 1768, + "off": 1769, + "▁No": 1770, + "ives": 1771, + "ican": 1772, + "▁ve": 1773, + "action": 1774, + "';": 1775, + "▁vo": 1776, + "$,": 1777, + "▁Gr": 1778, + "pre": 1779, + "ny": 1780, + "aining": 1781, + "ior": 1782, + "init": 1783, + "lection": 1784, + "arm": 1785, + "umn": 1786, + "ags": 1787, + "ци": 1788, + "ско": 1789, + "version": 1790, + "▁To": 1791, + "▁ref": 1792, + "stand": 1793, + "▁At": 1794, + "ift": 1795, + "▁ein": 1796, + "face": 1797, + "bo": 1798, + "ified": 1799, + "ved": 1800, + "sum": 1801, + "une": 1802, + "ital": 1803, + "ump": 1804, + "comm": 1805, + "▁mov": 1806, + "elt": 1807, + "▁von": 1808, + "velop": 1809, + "ctor": 1810, + "head": 1811, + "cle": 1812, + "▁build": 1813, + "inc": 1814, + ".'": 1815, + "bs": 1816, + "info": 1817, + "chn": 1818, + "▁week": 1819, + "▁book": 1820, + "HE": 1821, + "bar": 1822, + "icense": 1823, + "▁What": 1824, + "▁quest": 1825, + "urch": 1826, + "ato": 1827, + "left": 1828, + "▁mar": 1829, + "▁top": 1830, + "FF": 1831, + "▁friend": 1832, + "▁beh": 1833, + "▁field": 1834, + "▁against": 1835, + "ract": 1836, + "ization": 1837, + "user": 1838, + "chen": 1839, + "▁keep": 1840, + "AD": 1841, + "itor": 1842, + "▁non": 1843, + "ird": 1844, + "ope": 1845, + "▁rest": 1846, + "▁dev": 1847, + "▁__": 1848, + "▁una": 1849, + "▁term": 1850, + "IS": 1851, + "▁pop": 1852, + "rist": 1853, + "▁since": 1854, + "ves": 1855, + "▁hard": 1856, + "pi": 1857, + "util": 1858, + "▁soc": 1859, + "ene": 1860, + "Exception": 1861, + "▁local": 1862, + "▁direct": 1863, + "▁sure": 1864, + "▁bro": 1865, + "▁da": 1866, + "▁": 2370, + "aim": 2371, + "▁service": 2372, + "▁within": 2373, + "angu": 2374, + "▁Д": 2375, + "uffer": 2376, + "AG": 2377, + "▁Do": 2378, + "▁incre": 2379, + "▁understand": 2380, + "}^": 2381, + "▁looked": 2382, + "gen": 2383, + "ailed": 2384, + "▁е": 2385, + "ayer": 2386, + "▁One": 2387, + "▁bas": 2388, + "▁job": 2389, + "mu": 2390, + "but": 2391, + "elta": 2392, + "▁Christ": 2393, + "uration": 2394, + "▁record": 2395, + "▁Univers": 2396, + "ivid": 2397, + "valid": 2398, + "▁Р": 2399, + "▁hold": 2400, + "▁table": 2401, + "ones": 2402, + "link": 2403, + "▁Ge": 2404, + "▁offer": 2405, + "ster": 2406, + "Form": 2407, + "={": 2408, + "▁не": 2409, + "stance": 2410, + "▁govern": 2411, + "▁techn": 2412, + "▁prim": 2413, + "*.": 2414, + "cho": 2415, + "max": 2416, + 
"▁fore": 2417, + "▁Can": 2418, + "▁polit": 2419, + "ories": 2420, + "▁times": 2421, + "▁dans": 2422, + "▁air": 2423, + "▁anything": 2424, + "▁sever": 2425, + "acy": 2426, + "}_": 2427, + "He": 2428, + "▁least": 2429, + "ips": 2430, + "ENT": 2431, + "do": 2432, + "▁от": 2433, + "▁cost": 2434, + ".”": 2435, + "▁children": 2436, + "ability": 2437, + "But": 2438, + "▁path": 2439, + "result": 2440, + "acter": 2441, + "▁element": 2442, + "ee": 2443, + "▁wait": 2444, + "▁money": 2445, + "Map": 2446, + "td": 2447, + "oin": 2448, + "iving": 2449, + "icht": 2450, + "icy": 2451, + "sch": 2452, + "ste": 2453, + "ду": 2454, + "ored": 2455, + "oud": 2456, + "ille": 2457, + "ised": 2458, + "plication": 2459, + "▁custom": 2460, + "▁having": 2461, + "ponent": 2462, + "▁By": 2463, + "ules": 2464, + "ued": 2465, + "atter": 2466, + "And": 2467, + "itive": 2468, + "Def": 2469, + "▁moment": 2470, + "aterial": 2471, + "Class": 2472, + "ograph": 2473, + "ike": 2474, + "▁large": 2475, + "▁####": 2476, + "▁either": 2477, + "duct": 2478, + "▁Then": 2479, + "▁Gu": 2480, + "olean": 2481, + "pert": 2482, + "▁Get": 2483, + "▁Ab": 2484, + "▁short": 2485, + "On": 2486, + "iment": 2487, + "▁project": 2488, + "cript": 2489, + "▁including": 2490, + "ния": 2491, + "▁making": 2492, + "▁someone": 2493, + "▁Fl": 2494, + "▁sat": 2495, + "▁company": 2496, + "ocus": 2497, + "pu": 2498, + "▁God": 2499, + "ification": 2500, + "No": 2501, + "▁sn": 2502, + "ano": 2503, + "ga": 2504, + "▁au": 2505, + "▁cou": 2506, + "ás": 2507, + "ended": 2508, + "ту": 2509, + "ober": 2510, + "▁nothing": 2511, + "▁net": 2512, + "▁pot": 2513, + "▁typ": 2514, + "▁item": 2515, + "rew": 2516, + "Att": 2517, + "▁young": 2518, + "}\r": 2519, + "nder": 2520, + "start": 2521, + "▁Sc": 2522, + "*)": 2523, + "▁enc": 2524, + "▁women": 2525, + "▁looking": 2526, + "▁ро": 2527, + "▁health": 2528, + "Path": 2529, + "▁After": 2530, + "▁mult": 2531, + "▁{\\": 2532, + "▁land": 2533, + "orld": 2534, + "▁Des": 2535, + "▁eng": 2536, + "input": 2537, + "▁Pol": 2538, + "\"\"": 2539, + "Code": 2540, + "▁supp": 2541, + "ainer": 2542, + "heck": 2543, + "▁mor": 2544, + "▁mill": 2545, + "▁aw": 2546, + "fs": 2547, + "▁doing": 2548, + "tings": 2549, + "ades": 2550, + "▁toget": 2551, + "▁certain": 2552, + "▁together": 2553, + "CE": 2554, + "ideo": 2555, + "▁American": 2556, + "ony": 2557, + "idd": 2558, + "II": 2559, + "ged": 2560, + "ables": 2561, + "▁ident": 2562, + "iod": 2563, + "▁parent": 2564, + "For": 2565, + "ambda": 2566, + "ando": 2567, + "=\\": 2568, + "aged": 2569, + "ending": 2570, + "Int": 2571, + "▁possible": 2572, + "▁со": 2573, + "ivity": 2574, + "num": 2575, + "rt": 2576, + "ajor": 2577, + "create": 2578, + "ride": 2579, + "▁knew": 2580, + "bit": 2581, + "itional": 2582, + "▁lik": 2583, + "▁Her": 2584, + "ension": 2585, + "\".": 2586, + "oto": 2587, + "▁exist": 2588, + "aken": 2589, + "▁actually": 2590, + "ca": 2591, + "▁Г": 2592, + "хо": 2593, + "inn": 2594, + "All": 2595, + "buf": 2596, + "▁Me": 2597, + "▁seen": 2598, + "ops": 2599, + "▁▁▁▁▁▁▁▁▁": 2600, + "Not": 2601, + "▁control": 2602, + "▁respon": 2603, + "};": 2604, + "ilt": 2605, + "isk": 2606, + "▁bad": 2607, + "▁often": 2608, + "▁past": 2609, + "aper": 2610, + "▁reason": 2611, + "eters": 2612, + "▁wanted": 2613, + "ura": 2614, + "table": 2615, + "ormal": 2616, + "width": 2617, + "га": 2618, + "ptr": 2619, + "▁dest": 2620, + "▁design": 2621, + "▁sound": 2622, + "▁plan": 2623, + "▁base": 2624, + "hand": 2625, + "gs": 2626, + "▁says": 2627, + "function": 2628, + "▁tri": 2629, + "mt": 2630, + "▁invest": 2631, + 
"▁available": 2632, + "ayout": 2633, + "▁och": 2634, + "▁las": 2635, + "illed": 2636, + "Val": 2637, + "▁ф": 2638, + "iety": 2639, + "mon": 2640, + "Hand": 2641, + "Fr": 2642, + "iam": 2643, + "pace": 2644, + "▁Ob": 2645, + "▁para": 2646, + "▁meet": 2647, + "▁sum": 2648, + "Message": 2649, + "ici": 2650, + "▁known": 2651, + "▁gen": 2652, + "amma": 2653, + "arr": 2654, + "▁tre": 2655, + "oke": 2656, + "uth": 2657, + "~\\": 2658, + "▁experience": 2659, + "icle": 2660, + "▁Il": 2661, + "▁sent": 2662, + "▁others": 2663, + "▁soft": 2664, + "IP": 2665, + "▁max": 2666, + "ball": 2667, + "▁market": 2668, + "▁pour": 2669, + "pression": 2670, + "eps": 2671, + "▁saw": 2672, + "▁across": 2673, + "▁Su": 2674, + "Over": 2675, + "ние": 2676, + "ulation": 2677, + "▁Reg": 2678, + "▁+=": 2679, + "body": 2680, + ")\\": 2681, + "▁print": 2682, + "▁при": 2683, + "db": 2684, + "ources": 2685, + "wards": 2686, + "▁black": 2687, + "со": 2688, + "ili": 2689, + "▁Ed": 2690, + "▁complet": 2691, + "▁single": 2692, + "▁IN": 2693, + "ached": 2694, + "bt": 2695, + "▁code": 2696, + "▁bool": 2697, + "▁area": 2698, + "▁require": 2699, + "▁problem": 2700, + "aced": 2701, + "Equ": 2702, + "▁config": 2703, + "vec": 2704, + "ney": 2705, + "cy": 2706, + "Al": 2707, + "▁account": 2708, + "ymbol": 2709, + "▁ste": 2710, + "ges": 2711, + "Array": 2712, + "empl": 2713, + "context": 2714, + "Des": 2715, + "Result": 2716, + "ecut": 2717, + "▁target": 2718, + "▁getting": 2719, + "\"/>": 2720, + "ogle": 2721, + "▁himself": 2722, + "▁wasn": 2723, + "▁block": 2724, + "▁ant": 2725, + "▁York": 2726, + "▁become": 2727, + "iff": 2728, + "ports": 2729, + "reate": 2730, + "='": 2731, + "cd": 2732, + "location": 2733, + "ет": 2734, + "▁access": 2735, + "gress": 2736, + "ros": 2737, + "Up": 2738, + "▁working": 2739, + "▁Am": 2740, + "iqu": 2741, + "cer": 2742, + "▁((": 2743, + "▁Per": 2744, + "▁func": 2745, + "▁girl": 2746, + "▁above": 2747, + "pen": 2748, + "пи": 2749, + "ido": 2750, + "▁version": 2751, + "TY": 2752, + "▁;": 2753, + "mary": 2754, + "abled": 2755, + "annel": 2756, + "▁example": 2757, + "▁context": 2758, + "OP": 2759, + "▁red": 2760, + "▁cir": 2761, + "sm": 2762, + "Log": 2763, + "▁space": 2764, + "▁fut": 2765, + "▁Gener": 2766, + "ills": 2767, + "▁dri": 2768, + "_.": 2769, + "▁felt": 2770, + "▁offic": 2771, + "▁===": 2772, + "ii": 2773, + "▁started": 2774, + "▁Т": 2775, + "▁});": 2776, + "js": 2777, + "▁front": 2778, + "▁almost": 2779, + "irm": 2780, + "!\"": 2781, + "signed": 2782, + "▁yet": 2783, + "▁trad": 2784, + "ients": 2785, + "ama": 2786, + "▁input": 2787, + "lim": 2788, + "па": 2789, + "▁ка": 2790, + "▁camp": 2791, + "ibr": 2792, + "fect": 2793, + "unt": 2794, + "▁half": 2795, + "▁cover": 2796, + "anguage": 2797, + "▁ben": 2798, + "ha": 2799, + "▁diff": 2800, + "_\\": 2801, + "▁об": 2802, + "])": 2803, + "odes": 2804, + "hel": 2805, + "ios": 2806, + "▁О": 2807, + "▁mot": 2808, + "▁social": 2809, + "////////": 2810, + "▁stre": 2811, + "ground": 2812, + "ів": 2813, + "object": 2814, + "ples": 2815, + "reed": 2816, + "▁een": 2817, + "▁based": 2818, + "▁range": 2819, + "An": 2820, + "urg": 2821, + "▁learn": 2822, + "▁exc": 2823, + "▁imp": 2824, + "▁means": 2825, + "▁wur": 2826, + "ends": 2827, + "void": 2828, + "▁std": 2829, + "▁particular": 2830, + "ja": 2831, + "▁source": 2832, + "default": 2833, + "py": 2834, + "▁als": 2835, + "scri": 2836, + "status": 2837, + "▁story": 2838, + "▁begin": 2839, + "▁position": 2840, + "▁special": 2841, + "php": 2842, + "▁bar": 2843, + "▁pract": 2844, + "call": 2845, + "▁das": 2846, + 
"▁rad": 2847, + "▁close": 2848, + "www": 2849, + "ере": 2850, + "gu": 2851, + "▁Er": 2852, + "▁dom": 2853, + "AM": 2854, + "▁bed": 2855, + "▁several": 2856, + "aul": 2857, + "box": 2858, + "▁low": 2859, + "pack": 2860, + "Reg": 2861, + "Of": 2862, + "atures": 2863, + "én": 2864, + "eder": 2865, + "uilder": 2866, + "cast": 2867, + "conom": 2868, + "raft": 2869, + "▁makes": 2870, + "Loc": 2871, + "http": 2872, + "▁abs": 2873, + "resh": 2874, + "▁Will": 2875, + "break": 2876, + "▁options": 2877, + "fort": 2878, + "▁из": 2879, + "▁anal": 2880, + "▁env": 2881, + "({": 2882, + "event": 2883, + "▁page": 2884, + "ternal": 2885, + "▁distribut": 2886, + "▁food": 2887, + "check": 2888, + "CK": 2889, + "▁во": 2890, + "assert": 2891, + "án": 2892, + "base": 2893, + "▁whole": 2894, + "ación": 2895, + "OD": 2896, + "▁turned": 2897, + "igma": 2898, + "▁response": 2899, + "▁University": 2900, + "▁div": 2901, + "apter": 2902, + "▁results": 2903, + "▁represent": 2904, + "▁everything": 2905, + "▁Cent": 2906, + "utes": 2907, + "rix": 2908, + "▁Some": 2909, + "▁behind": 2910, + "▁creat": 2911, + "place": 2912, + "su": 2913, + "▁Part": 2914, + "umb": 2915, + "mathbb": 2916, + "ping": 2917, + "▁match": 2918, + "Out": 2919, + "dom": 2920, + "▁situ": 2921, + "dr": 2922, + "ara": 2923, + "▁window": 2924, + "ns": 2925, + "lished": 2926, + "▁Ver": 2927, + "▁message": 2928, + "▁Em": 2929, + "▁human": 2930, + "perties": 2931, + "лу": 2932, + "lem": 2933, + "ORT": 2934, + "▁early": 2935, + "▁quick": 2936, + "▁та": 2937, + "roid": 2938, + "▁country": 2939, + "▁due": 2940, + "▁Die": 2941, + "▁trying": 2942, + "▁live": 2943, + "▁press": 2944, + "INT": 2945, + "With": 2946, + "oved": 2947, + "▁specific": 2948, + "▁fall": 2949, + "uk": 2950, + "yl": 2951, + "▁general": 2952, + "му": 2953, + "ну": 2954, + "▁names": 2955, + "where": 2956, + "▁These": 2957, + "▁sil": 2958, + "ét": 2959, + "▁ener": 2960, + "▁Now": 2961, + "▁address": 2962, + "Response": 2963, + "▁Mr": 2964, + "▁answ": 2965, + "▁film": 2966, + "▁strong": 2967, + "▁bring": 2968, + "▁United": 2969, + "▁ge": 2970, + "▁woman": 2971, + "New": 2972, + "ett": 2973, + ".)": 2974, + "ename": 2975, + "▁AN": 2976, + "▁describ": 2977, + "за": 2978, + "ising": 2979, + "EL": 2980, + "ql": 2981, + "▁fur": 2982, + "ying": 2983, + "▁Cal": 2984, + "▁Dr": 2985, + "ERR": 2986, + "▁\\\\": 2987, + "angle": 2988, + "urope": 2989, + "▁city": 2990, + "▁index": 2991, + "▁action": 2992, + "▁However": 2993, + "▁fig": 2994, + "ias": 2995, + "▁question": 2996, + "▁Jan": 2997, + "▁Med": 2998, + "▁Cont": 2999, + "amed": 3000, + "Call": 3001, + "plied": 3002, + "tty": 3003, + "▁individ": 3004, + "page": 3005, + "▁comb": 3006, + "section": 3007, + "▁Comm": 3008, + "uel": 3009, + "▁het": 3010, + "▁Bar": 3011, + "agement": 3012, + "fin": 3013, + "▁major": 3014, + "oper": 3015, + "api": 3016, + "room": 3017, + "▁„": 3018, + "▁hab": 3019, + "зи": 3020, + "▁auf": 3021, + "current": 3022, + "ni": 3023, + "▁include": 3024, + "▁qui": 3025, + "va": 3026, + "UE": 3027, + "▁idea": 3028, + ",'": 3029, + "▁required": 3030, + "▁heart": 3031, + "ibility": 3032, + "iction": 3033, + "Model": 3034, + "write": 3035, + "▁content": 3036, + "▁wer": 3037, + "▁hands": 3038, + "zen": 3039, + "char": 3040, + "}^{": 3041, + "▁mass": 3042, + "ply": 3043, + "▁nat": 3044, + "rel": 3045, + "▁dat": 3046, + "================": 3047, + "imal": 3048, + "▁probably": 3049, + "unch": 3050, + "▁mer": 3051, + "ilar": 3052, + "ires": 3053, + "▁watch": 3054, + "SI": 3055, + "▁cult": 3056, + "▁mother": 3057, + "▁government": 3058, + 
"ording": 3059, + "▁()": 3060, + "▁pri": 3061, + "▁link": 3062, + "group": 3063, + "OL": 3064, + "▁near": 3065, + "▁Ser": 3066, + "Ser": 3067, + "ito": 3068, + "▁values": 3069, + "▁java": 3070, + "fully": 3071, + "Count": 3072, + "++)": 3073, + "▁vi": 3074, + "▁white": 3075, + "mat": 3076, + "ctx": 3077, + "▁conc": 3078, + "▁stay": 3079, + "ging": 3080, + "▁clear": 3081, + "▁copy": 3082, + "selves": 3083, + "▁provide": 3084, + "▁words": 3085, + "comp": 3086, + "args": 3087, + "▁pick": 3088, + "uly": 3089, + "▁vari": 3090, + "▁believe": 3091, + "▁Co": 3092, + "Property": 3093, + "Group": 3094, + "▁ten": 3095, + "ischen": 3096, + "eturn": 3097, + "ival": 3098, + "System": 3099, + "CL": 3100, + "bed": 3101, + "▁total": 3102, + "▁ist": 3103, + "Input": 3104, + "uments": 3105, + "Manager": 3106, + "ши": 3107, + "▁win": 3108, + "leep": 3109, + "PI": 3110, + "ного": 3111, + "ruction": 3112, + "▁inte": 3113, + "App": 3114, + "avor": 3115, + "▁respect": 3116, + "ators": 3117, + "▁como": 3118, + "▁cut": 3119, + "FA": 3120, + "▁sus": 3121, + "▁App": 3122, + "rect": 3123, + "FI": 3124, + "▁began": 3125, + "oph": 3126, + "▁sort": 3127, + "though": 3128, + "је": 3129, + "icro": 3130, + "Trans": 3131, + "лі": 3132, + "▁Inst": 3133, + "request": 3134, + "ор": 3135, + "▁relations": 3136, + "-\\": 3137, + "Status": 3138, + "жи": 3139, + "▁father": 3140, + "cs": 3141, + "▁sex": 3142, + "isch": 3143, + "vo": 3144, + "}_{": 3145, + "aven": 3146, + "▁Ne": 3147, + "ATE": 3148, + "itten": 3149, + "▁ess": 3150, + "TH": 3151, + "ights": 3152, + "▁hom": 3153, + "▁today": 3154, + "▁zu": 3155, + "ita": 3156, + "▁isn": 3157, + "▁opt": 3158, + "ogn": 3159, + "ér": 3160, + "▁whether": 3161, + "ixed": 3162, + "phi": 3163, + "idence": 3164, + "ald": 3165, + "Client": 3166, + "At": 3167, + "▁death": 3168, + "▁Let": 3169, + "ius": 3170, + "ги": 3171, + "▁ре": 3172, + "ben": 3173, + ")\r": 3174, + "ba": 3175, + ">": 3193, + "▁Just": 3194, + "What": 3195, + "atal": 3196, + "▁Min": 3197, + "▁Cor": 3198, + "▁dark": 3199, + "rl": 3200, + "▁larg": 3201, + "ding": 3202, + "ón": 3203, + "ouch": 3204, + "▁um": 3205, + "▁elect": 3206, + "▁dam": 3207, + "▁needs": 3208, + "▁matter": 3209, + "▁rather": 3210, + "from": 3211, + "ram": 3212, + "▁і": 3213, + "▁taken": 3214, + "▁deal": 3215, + "▁period": 3216, + "▁Mon": 3217, + "▁Л": 3218, + "▁Aug": 3219, + "run": 3220, + "mm": 3221, + "elle": 3222, + "▁export": 3223, + "Sc": 3224, + "vis": 3225, + "abor": 3226, + "▁author": 3227, + "ère": 3228, + "▁remember": 3229, + "▁redu": 3230, + "▁List": 3231, + "▁focus": 3232, + "▁character": 3233, + "Table": 3234, + "▁individual": 3235, + "▁needed": 3236, + "bum": 3237, + "▁style": 3238, + "inary": 3239, + "ersion": 3240, + "oute": 3241, + "▁Pe": 3242, + "▁hon": 3243, + "mut": 3244, + "see": 3245, + "▁became": 3246, + "▁dire": 3247, + "▁document": 3248, + "sec": 3249, + "ening": 3250, + "▁visit": 3251, + "▁fac": 3252, + "tx": 3253, + "down": 3254, + "plit": 3255, + "▁phys": 3256, + "itting": 3257, + "joy": 3258, + "▁hig": 3259, + "This": 3260, + "Ad": 3261, + "▁Brit": 3262, + "▁employ": 3263, + "▁ré": 3264, + "▁т": 3265, + "lambda": 3266, + "▁impro": 3267, + "▁Bo": 3268, + "iding": 3269, + "▁online": 3270, + "mem": 3271, + "atform": 3272, + "▁War": 3273, + "▁cas": 3274, + "asure": 3275, + "▁pur": 3276, + "medi": 3277, + "Dis": 3278, + "▁Germ": 3279, + "pc": 3280, + "са": 3281, + "▁friends": 3282, + "▁Mc": 3283, + "DI": 3284, + "▁plus": 3285, + "▁Set": 3286, + "iddle": 3287, + "itut": 3288, + "▁depend": 3289, + "rest": 3290, + "▁Je": 3291, + "▁hor": 
3292, + "▁entire": 3293, + "Query": 3294, + "▁refer": 3295, + "▁hot": 3296, + "▁Aust": 3297, + "▁common": 3298, + "ці": 3299, + "▁pull": 3300, + "▁Add": 3301, + "▁season": 3302, + "▁invol": 3303, + "▁World": 3304, + "client": 3305, + "now": 3306, + "true": 3307, + "append": 3308, + "itted": 3309, + "empt": 3310, + "){": 3311, + "///": 3312, + "▁prop": 3313, + "imate": 3314, + "SC": 3315, + "▁hours": 3316, + "▁hope": 3317, + "andom": 3318, + "ід": 3319, + "istic": 3320, + "▁property": 3321, + "sg": 3322, + ">(": 3323, + "▁write": 3324, + "mark": 3325, + "find": 3326, + "▁personal": 3327, + "][": 3328, + "rown": 3329, + "Ph": 3330, + "▁foot": 3331, + "▁research": 3332, + "ironment": 3333, + "▁nom": 3334, + "▁instance": 3335, + "▁held": 3336, + "De": 3337, + "▁members": 3338, + "▁fire": 3339, + "▁history": 3340, + "▁map": 3341, + "▁discuss": 3342, + "▁espec": 3343, + "▁taking": 3344, + "▁services": 3345, + "▁indust": 3346, + "igen": 3347, + "▁Ass": 3348, + "▁expected": 3349, + "▁wurde": 3350, + "dir": 3351, + "▁among": 3352, + "▁sugg": 3353, + "rec": 3354, + "Inter": 3355, + "block": 3356, + "▁Rep": 3357, + "▁pain": 3358, + "▁five": 3359, + "▁fund": 3360, + "rid": 3361, + "arrow": 3362, + "▁treat": 3363, + "▁heard": 3364, + "▁determ": 3365, + "icult": 3366, + "▁sense": 3367, + "ese": 3368, + "Fun": 3369, + "▁months": 3370, + "json": 3371, + ",”": 3372, + "TI": 3373, + "orage": 3374, + "▁У": 3375, + "▁everyone": 3376, + "▁clos": 3377, + "iers": 3378, + "airs": 3379, + "define": 3380, + "If": 3381, + "osp": 3382, + "▁wonder": 3383, + "NA": 3384, + "query": 3385, + "pg": 3386, + "ites": 3387, + "▁material": 3388, + "yd": 3389, + "Read": 3390, + "html": 3391, + "TE": 3392, + "Pr": 3393, + "^{\\": 3394, + "▁gave": 3395, + "▁IS": 3396, + "▁suggest": 3397, + "Override": 3398, + "rodu": 3399, + "From": 3400, + "▁Europe": 3401, + "PO": 3402, + "▁soon": 3403, + "host": 3404, + "▁Ber": 3405, + "....": 3406, + "▁Har": 3407, + "▁energy": 3408, + "><": 3409, + "aves": 3410, + "▁easy": 3411, + "▁bre": 3412, + "frame": 3413, + "▁ground": 3414, + "with": 3415, + "▁inside": 3416, + "ief": 3417, + "▁mo": 3418, + "pm": 3419, + "pan": 3420, + "igr": 3421, + "▁om": 3422, + "next": 3423, + "omet": 3424, + "▁status": 3425, + "▁}\r": 3426, + "▁music": 3427, + "ora": 3428, + "iles": 3429, + "ki": 3430, + "▁esc": 3431, + "▁bes": 3432, + "▁Dis": 3433, + "▁host": 3434, + "▁comes": 3435, + "used": 3436, + "▁future": 3437, + "lick": 3438, + "aid": 3439, + "▁compet": 3440, + "▁voice": 3441, + "▁load": 3442, + "evel": 3443, + "▁neg": 3444, + "▁command": 3445, + "▁für": 3446, + "▁pie": 3447, + "▁quite": 3448, + "▁blo": 3449, + "agn": 3450, + "ilon": 3451, + "▁claim": 3452, + "▁teach": 3453, + "▁previous": 3454, + "▁site": 3455, + "color": 3456, + "attr": 3457, + "▁accept": 3458, + "▁exact": 3459, + ")}": 3460, + "aft": 3461, + "roller": 3462, + "он": 3463, + "oo": 3464, + "Date": 3465, + "▁ou": 3466, + "sy": 3467, + "▁pretty": 3468, + "▁image": 3469, + "BU": 3470, + "▁terms": 3471, + "▁search": 3472, + "▁è": 3473, + "▁Val": 3474, + "▁‘": 3475, + "▁Dav": 3476, + "MS": 3477, + "src": 3478, + "mar": 3479, + "incip": 3480, + "▁couldn": 3481, + "ados": 3482, + "▁dro": 3483, + "beta": 3484, + "imum": 3485, + "▁minutes": 3486, + "▁grand": 3487, + "▁»": 3488, + "▁Our": 3489, + "Str": 3490, + "VER": 3491, + "maz": 3492, + "▁original": 3493, + "ini": 3494, + "▁coll": 3495, + "loat": 3496, + "▁os": 3497, + "});": 3498, + "summary": 3499, + "▁wall": 3500, + "Color": 3501, + "▁vers": 3502, + "▁della": 3503, + "▁\"\"\"": 3504, + "mathbf": 
3505, + "zer": 3506, + "aur": 3507, + "▁track": 3508, + "▁associ": 3509, + "▁suff": 3510, + "▁inde": 3511, + "ague": 3512, + "▁Apr": 3513, + "Le": 3514, + "roups": 3515, + "board": 3516, + "▁attack": 3517, + "▁series": 3518, + "▁instead": 3519, + "ham": 3520, + "book": 3521, + "▁six": 3522, + "▁Rec": 3523, + "▁coming": 3524, + "urt": 3525, + "▁global": 3526, + "▁necess": 3527, + "lege": 3528, + "Pos": 3529, + "▁leave": 3530, + "▁pod": 3531, + "ategory": 3532, + "uz": 3533, + "▁deep": 3534, + "▁km": 3535, + "▁outside": 3536, + "has": 3537, + "options": 3538, + "▁Sm": 3539, + "Sub": 3540, + "rows": 3541, + "▁ви": 3542, + "▁States": 3543, + "▁wrong": 3544, + "▁however": 3545, + "▁sem": 3546, + "▁catch": 3547, + "\"),": 3548, + "model": 3549, + "▁http": 3550, + "▁option": 3551, + "rie": 3552, + "▁ста": 3553, + "▁är": 3554, + "▁enjoy": 3555, + "nu": 3556, + "▁pas": 3557, + "▁amount": 3558, + "▁respons": 3559, + "▁Intern": 3560, + "▁myself": 3561, + "▁opp": 3562, + "▁Sim": 3563, + "▁sens": 3564, + "Ed": 3565, + "▁(\\": 3566, + "▁students": 3567, + "нов": 3568, + "▁points": 3569, + "arning": 3570, + "UP": 3571, + "elling": 3572, + "▁cannot": 3573, + "Be": 3574, + "▁length": 3575, + "null": 3576, + "uint": 3577, + "wise": 3578, + "▁double": 3579, + "ige": 3580, + "ista": 3581, + "▁estab": 3582, + "anch": 3583, + "▁ago": 3584, + "▁bound": 3585, + "▁fa": 3586, + "▁clean": 3587, + "▁simple": 3588, + "mi": 3589, + "########": 3590, + "ifier": 3591, + "▁General": 3592, + "▁seemed": 3593, + "ena": 3594, + "▁age": 3595, + "ной": 3596, + "endif": 3597, + "AA": 3598, + "▁caus": 3599, + "▁educ": 3600, + "▁cell": 3601, + "Gener": 3602, + "space": 3603, + "▁Your": 3604, + "▁beaut": 3605, + "gt": 3606, + "▁limit": 3607, + "▁date": 3608, + "Util": 3609, + "▁National": 3610, + "ows": 3611, + "pat": 3612, + "quad": 3613, + "▁ok": 3614, + "▁И": 3615, + "arth": 3616, + "hat": 3617, + "▁community": 3618, + "oul": 3619, + "▁econom": 3620, + "Component": 3621, + "bor": 3622, + "usion": 3623, + "▁below": 3624, + "earch": 3625, + "ores": 3626, + "ban": 3627, + "▁August": 3628, + "▁further": 3629, + "sigma": 3630, + "▁ha": 3631, + "ji": 3632, + "▁comput": 3633, + "гра": 3634, + "▁None": 3635, + "▁ter": 3636, + "▁anyone": 3637, + "▁task": 3638, + "ente": 3639, + "position": 3640, + "pped": 3641, + "▁aus": 3642, + "Attribute": 3643, + "req": 3644, + "addr": 3645, + "light": 3646, + "ше": 3647, + "▁arm": 3648, + "cover": 3649, + "upport": 3650, + "▁Gl": 3651, + "▁San": 3652, + "▁writing": 3653, + "▁lost": 3654, + "▁Mark": 3655, + "▁gre": 3656, + "TYPE": 3657, + "▁South": 3658, + "▁perfect": 3659, + "▁package": 3660, + "▁infl": 3661, + "haps": 3662, + "▁Ang": 3663, + "respon": 3664, + "ris": 3665, + "ptember": 3666, + "▁building": 3667, + "VAL": 3668, + "free": 3669, + "▁ce": 3670, + "HT": 3671, + "▁From": 3672, + "ds": 3673, + "roy": 3674, + "achine": 3675, + "nown": 3676, + "▁saying": 3677, + "▁бы": 3678, + "oe": 3679, + "Ref": 3680, + "▁network": 3681, + "parent": 3682, + "uge": 3683, + "▁similar": 3684, + ">\r": 3685, + "Builder": 3686, + "▁living": 3687, + "▁continue": 3688, + "anger": 3689, + "▁Red": 3690, + "▁hair": 3691, + "anced": 3692, + "ians": 3693, + "▁dead": 3694, + "▁boolean": 3695, + "ication": 3696, + "▁де": 3697, + "▁client": 3698, + "uct": 3699, + "▁•": 3700, + "SP": 3701, + "older": 3702, + "пе": 3703, + "udio": 3704, + "▁deg": 3705, + "asing": 3706, + "▁step": 3707, + "▁pers": 3708, + "ção": 3709, + "obj": 3710, + "oz": 3711, + "ula": 3712, + "▁round": 3713, + "▁upon": 3714, + "▁resource": 3715, + 
"▁valid": 3716, + "▁II": 3717, + "bug": 3718, + "std": 3719, + "▁ang": 3720, + "span": 3721, + "pol": 3722, + "ialog": 3723, + "▁phot": 3724, + "?'": 3725, + "DB": 3726, + "▁Fin": 3727, + "VE": 3728, + "Em": 3729, + "▁cam": 3730, + "target": 3731, + "pected": 3732, + "Hel": 3733, + "▁ut": 3734, + "▁Test": 3735, + "▁town": 3736, + "align": 3737, + "▁webs": 3738, + "inner": 3739, + "augh": 3740, + "▁except": 3741, + "▁initial": 3742, + "enty": 3743, + "lich": 3744, + "▁Aut": 3745, + "top": 3746, + "▁fail": 3747, + "ona": 3748, + "▁benef": 3749, + "anks": 3750, + "ische": 3751, + ".*": 3752, + "▁signific": 3753, + "▁contact": 3754, + "Rec": 3755, + "ario": 3756, + "ottom": 3757, + "▁relationship": 3758, + "]);": 3759, + "▁На": 3760, + "Head": 3761, + "format": 3762, + "▁ét": 3763, + "▁More": 3764, + "actory": 3765, + "portun": 3766, + "+\\": 3767, + "▁simply": 3768, + "▁ep": 3769, + "▁Russ": 3770, + "ní": 3771, + "ua": 3772, + "erc": 3773, + "▁longer": 3774, + "inition": 3775, + "ector": 3776, + "aption": 3777, + "▁profess": 3778, + "▁Mus": 3779, + "ilities": 3780, + "ès": 3781, + "▁Act": 3782, + "offset": 3783, + "▁ill": 3784, + "band": 3785, + "▁Ag": 3786, + "▁По": 3787, + "би": 3788, + "content": 3789, + "icon": 3790, + "▁works": 3791, + "ynam": 3792, + "plement": 3793, + "Resource": 3794, + "Action": 3795, + "▁difficult": 3796, + "▁West": 3797, + "▁video": 3798, + "▁THE": 3799, + "▁decl": 3800, + "ondon": 3801, + "ded": 3802, + "}{\\": 3803, + "ocr": 3804, + "▁City": 3805, + "▁я": 3806, + "uer": 3807, + "cz": 3808, + "▁imag": 3809, + "cr": 3810, + "ete": 3811, + "idget": 3812, + "▁Mod": 3813, + "▁forward": 3814, + "▁pict": 3815, + "orge": 3816, + "▁subject": 3817, + "update": 3818, + "attle": 3819, + "sa": 3820, + "▁Ant": 3821, + "▁running": 3822, + "▁sal": 3823, + "conne": 3824, + "▁output": 3825, + "adata": 3826, + "ML": 3827, + "Check": 3828, + "ledge": 3829, + "▁paper": 3830, + "params": 3831, + "avy": 3832, + "▁af": 3833, + "▁eine": 3834, + "▁jour": 3835, + "AY": 3836, + "▁itself": 3837, + "▁Str": 3838, + "style": 3839, + "That": 3840, + "▁million": 3841, + "▁language": 3842, + "OS": 3843, + "ving": 3844, + "▁ма": 3845, + "▁то": 3846, + ")(": 3847, + "▁buy": 3848, + "./": 3849, + "▁...": 3850, + "▁tried": 3851, + "▁compl": 3852, + "▁activ": 3853, + "apped": 3854, + "Button": 3855, + "Token": 3856, + "▁provided": 3857, + "iber": 3858, + "▁created": 3859, + "curity": 3860, + "End": 3861, + "ał": 3862, + "uster": 3863, + "izing": 3864, + "omb": 3865, + "▁sich": 3866, + "▁compon": 3867, + "▁See": 3868, + "▁uint": 3869, + "▁label": 3870, + "vol": 3871, + "ów": 3872, + "ocol": 3873, + "▁received": 3874, + "▁intern": 3875, + "це": 3876, + "Run": 3877, + "▁road": 3878, + "▁Oct": 3879, + "▁Comp": 3880, + "▁study": 3881, + "▁те": 3882, + "Act": 3883, + "▁tour": 3884, + "▁State": 3885, + "▁added": 3886, + "https": 3887, + "stream": 3888, + "▁lower": 3889, + "▁box": 3890, + "▁Sk": 3891, + "▁themselves": 3892, + "▁cross": 3893, + "▁echo": 3894, + "▁device": 3895, + "pose": 3896, + "▁games": 3897, + "PL": 3898, + "Window": 3899, + "ises": 3900, + "title": 3901, + "Stream": 3902, + "zt": 3903, + "▁Sw": 3904, + "▁role": 3905, + "iant": 3906, + "ku": 3907, + "sequ": 3908, + "▁late": 3909, + "▁sold": 3910, + "ря": 3911, + "Comm": 3912, + "▁entre": 3913, + "▁dog": 3914, + "device": 3915, + "Par": 3916, + "▁likely": 3917, + "^{-": 3918, + "▁len": 3919, + "▁Paul": 3920, + "▁tool": 3921, + "Off": 3922, + "▁famil": 3923, + "▁draw": 3924, + "apping": 3925, + "▁events": 3926, + "cret": 3927, + "rought": 
3928, + "Content": 3929, + "▁software": 3930, + "ria": 3931, + "msg": 3932, + "gamma": 3933, + "▁hear": 3934, + "Oper": 3935, + "▁yourself": 3936, + "▁liter": 3937, + "emp": 3938, + "▁separ": 3939, + "▁З": 3940, + "▁title": 3941, + "Method": 3942, + "mathrm": 3943, + "▁slow": 3944, + "▁Rom": 3945, + "!!": 3946, + "▁tax": 3947, + "ска": 3948, + "emplate": 3949, + "oi": 3950, + "▁Art": 3951, + "false": 3952, + "astic": 3953, + "сть": 3954, + "ocket": 3955, + "▁ens": 3956, + "TO": 3957, + "amente": 3958, + "local": 3959, + "chie": 3960, + "▁pan": 3961, + "ний": 3962, + "chema": 3963, + "▁North": 3964, + "зо": 3965, + "▁>=": 3966, + "Aut": 3967, + "▁dig": 3968, + "▁seems": 3969, + "▁morning": 3970, + "sole": 3971, + "umer": 3972, + "delta": 3973, + "ité": 3974, + "abase": 3975, + "raf": 3976, + "▁observ": 3977, + "▁Est": 3978, + "▁seg": 3979, + "▁[]": 3980, + "▁Pres": 3981, + "iful": 3982, + "push": 3983, + "▁Off": 3984, + "ipe": 3985, + "ati": 3986, + "▁dim": 3987, + "ceed": 3988, + "Ent": 3989, + "____": 3990, + "entry": 3991, + "▁fight": 3992, + "▁cred": 3993, + "▁OR": 3994, + "▁Dep": 3995, + "${": 3996, + "лен": 3997, + "Create": 3998, + "▁April": 3999, + "ministr": 4000, + "FL": 4001, + "▁Ap": 4002, + "▁Here": 4003, + "private": 4004, + "Instance": 4005, + "iem": 4006, + "▁office": 4007, + "▁third": 4008, + "▁update": 4009, + "Line": 4010, + "tag": 4011, + "▁especially": 4012, + "▁года": 4013, + "▁cu": 4014, + "▁kill": 4015, + "aught": 4016, + "▁swe": 4017, + "Options": 4018, + "IM": 4019, + "CC": 4020, + "▁compan": 4021, + "just": 4022, + "▁While": 4023, + "izer": 4024, + "▁мо": 4025, + "ке": 4026, + "▁auto": 4027, + "▁band": 4028, + "мен": 4029, + "iques": 4030, + "▁ple": 4031, + "NO": 4032, + "▁OF": 4033, + "▁song": 4034, + "▁Acc": 4035, + "EXT": 4036, + "ensor": 4037, + "ining": 4038, + "▁lat": 4039, + "big": 4040, + "▁King": 4041, + "och": 4042, + "si": 4043, + "▁Hist": 4044, + "▁quality": 4045, + "mode": 4046, + "▁opportun": 4047, + "▁wouldn": 4048, + ":**": 4049, + "output": 4050, + "▁feet": 4051, + "▁mis": 4052, + "df": 4053, + "aging": 4054, + "▁ме": 4055, + "▁tro": 4056, + "▁defined": 4057, + "▁review": 4058, + "▁Fil": 4059, + ">>": 4060, + "▁princip": 4061, + "Base": 4062, + "dict": 4063, + "verage": 4064, + "icient": 4065, + "IF": 4066, + "▁hit": 4067, + "Page": 4068, + "▁perm": 4069, + "cel": 4070, + "ít": 4071, + "▁express": 4072, + "▁indic": 4073, + "▁September": 4074, + "image": 4075, + "▁products": 4076, + "▁media": 4077, + "change": 4078, + "igger": 4079, + "▁send": 4080, + "last": 4081, + "ming": 4082, + "pa": 4083, + "uary": 4084, + "▁speak": 4085, + "ный": 4086, + "ще": 4087, + "ysis": 4088, + "lying": 4089, + "▁ч": 4090, + "like": 4091, + "ры": 4092, + "ві": 4093, + "▁Mich": 4094, + "MO": 4095, + "▁Jah": 4096, + "ensive": 4097, + "▁share": 4098, + "▁development": 4099, + "CP": 4100, + "spec": 4101, + "▁fast": 4102, + "het": 4103, + "HO": 4104, + "▁particip": 4105, + "Block": 4106, + "▁viol": 4107, + "▁frame": 4108, + "▁qual": 4109, + "tre": 4110, + "▁Ф": 4111, + "▁toward": 4112, + "fg": 4113, + "Box": 4114, + "Column": 4115, + "▁milit": 4116, + "▁March": 4117, + "▁various": 4118, + "pass": 4119, + "▁Park": 4120, + "▁Ben": 4121, + "Frame": 4122, + "▁normal": 4123, + "open": 4124, + "px": 4125, + "▁phone": 4126, + "▁Even": 4127, + "▁ma": 4128, + "ibrary": 4129, + "Start": 4130, + "idden": 4131, + "rho": 4132, + "graph": 4133, + "acing": 4134, + "'.": 4135, + "arter": 4136, + "mes": 4137, + "inst": 4138, + "▁ir": 4139, + "active": 4140, + "▁fem": 4141, + "▁moved": 4142, 
+ "▁store": 4143, + "▁price": 4144, + "\").": 4145, + "berg": 4146, + "▁nov": 4147, + "▁card": 4148, + "ellow": 4149, + "▁party": 4150, + "▁Mor": 4151, + "ael": 4152, + "▁percent": 4153, + "▁training": 4154, + "▁ing": 4155, + "imer": 4156, + "▁Sam": 4157, + "Default": 4158, + "▁fuck": 4159, + "▁complete": 4160, + "uid": 4161, + "▁details": 4162, + "▁led": 4163, + "Point": 4164, + "▁Count": 4165, + "▁regard": 4166, + "zo": 4167, + "▁Bro": 4168, + "▁recogn": 4169, + "▁Hol": 4170, + "UM": 4171, + "element": 4172, + "Mode": 4173, + "▁exam": 4174, + "▁EX": 4175, + "Image": 4176, + "verse": 4177, + "riter": 4178, + "soft": 4179, + "▁introdu": 4180, + "▁surpr": 4181, + "Buffer": 4182, + "lector": 4183, + "aren": 4184, + "anged": 4185, + "▁Pat": 4186, + "▁Pal": 4187, + "▁contr": 4188, + "Handler": 4189, + "▁features": 4190, + "iple": 4191, + "▁CON": 4192, + "Fil": 4193, + "▁Port": 4194, + "▁thinking": 4195, + "doc": 4196, + "wer": 4197, + "▁worked": 4198, + "PC": 4199, + "cm": 4200, + "dat": 4201, + "PRO": 4202, + "▁Every": 4203, + "▁era": 4204, + "▁First": 4205, + "gn": 4206, + "▁immedi": 4207, + "ovember": 4208, + "apan": 4209, + "▁extra": 4210, + "▁section": 4211, + "▁June": 4212, + "▁via": 4213, + "▁gone": 4214, + "come": 4215, + "▁stri": 4216, + "^\\": 4217, + "antly": 4218, + "▁arch": 4219, + "Source": 4220, + "▁conv": 4221, + "▁London": 4222, + "Number": 4223, + "▁questions": 4224, + "andid": 4225, + "▁played": 4226, + "env": 4227, + "▁School": 4228, + "▁natural": 4229, + "can": 4230, + "▁news": 4231, + "DR": 4232, + "▁chall": 4233, + "▁Soc": 4234, + "▁э": 4235, + "▁attempt": 4236, + "*}": 4237, + "Null": 4238, + "rote": 4239, + "▁bi": 4240, + "▁written": 4241, + "▁blood": 4242, + "▁happened": 4243, + "▁cause": 4244, + "ashing": 4245, + "▁William": 4246, + "adem": 4247, + "▁brought": 4248, + "▁display": 4249, + "ima": 4250, + "▁finally": 4251, + "tab": 4252, + "▁returned": 4253, + "ных": 4254, + "nie": 4255, + "▁q": 4256, + "▁hers": 4257, + "▁Pre": 4258, + "▁dou": 4259, + "buffer": 4260, + "▁effort": 4261, + "aine": 4262, + "xy": 4263, + "▁histor": 4264, + "enu": 4265, + "▁arriv": 4266, + "▁Dem": 4267, + "▁favor": 4268, + "▁handle": 4269, + "SET": 4270, + "▁Public": 4271, + "rupt": 4272, + "▁ur": 4273, + "▁force": 4274, + "▁és": 4275, + "ube": 4276, + "Pre": 4277, + "рі": 4278, + "iny": 4279, + "theta": 4280, + "isf": 4281, + "▁national": 4282, + "Equal": 4283, + "rench": 4284, + "▁wife": 4285, + "▁capt": 4286, + "▁Inter": 4287, + "tau": 4288, + "▁sleep": 4289, + "../../": 4290, + "▁issue": 4291, + "▁member": 4292, + "▁await": 4293, + "▁Dan": 4294, + "zi": 4295, + "inate": 4296, + "▁sym": 4297, + "chan": 4298, + "▁Jack": 4299, + "▁English": 4300, + "▁sz": 4301, + "ributes": 4302, + "▁ign": 4303, + "ál": 4304, + "▁appear": 4305, + "rad": 4306, + "idge": 4307, + "▁couple": 4308, + "▁ship": 4309, + "lig": 4310, + "web": 4311, + "▁usually": 4312, + "▁ready": 4313, + "▁vill": 4314, + "▁Why": 4315, + "ebru": 4316, + "▁grad": 4317, + "ords": 4318, + "▁inf": 4319, + "▁loss": 4320, + "▁od": 4321, + "▁Phil": 4322, + "server": 4323, + "▁Up": 4324, + "▁buff": 4325, + "▁filename": 4326, + "ABLE": 4327, + "iting": 4328, + "efore": 4329, + "()->": 4330, + "▁conditions": 4331, + "vm": 4332, + "eld": 4333, + "itz": 4334, + "▁Trans": 4335, + "▁weight": 4336, + "▁higher": 4337, + "▁rate": 4338, + "▁accom": 4339, + "vider": 4340, + "OM": 4341, + "▁ways": 4342, + "coming": 4343, + "▁lock": 4344, + "▁etc": 4345, + "▁avec": 4346, + "▁takes": 4347, + "▁Char": 4348, + "▁November": 4349, + "method": 4350, + 
"▁Austral": 4351, + "▁America": 4352, + "long": 4353, + "cember": 4354, + "▁political": 4355, + "flow": 4356, + "▁maybe": 4357, + "▁amb": 4358, + "Layout": 4359, + "iled": 4360, + "omen": 4361, + "ola": 4362, + "icip": 4363, + "partial": 4364, + "True": 4365, + "▁floor": 4366, + "▁Def": 4367, + "▁concern": 4368, + "yr": 4369, + "▁shows": 4370, + "ih": 4371, + "▁answer": 4372, + "acc": 4373, + "▁ball": 4374, + "▁Rev": 4375, + "▁sun": 4376, + "▁quickly": 4377, + "▁somet": 4378, + "mente": 4379, + "▁Mal": 4380, + "undred": 4381, + "▁issues": 4382, + "ecause": 4383, + "pes": 4384, + "▁player": 4385, + "▁parents": 4386, + "▁popular": 4387, + "▁mode": 4388, + "▁mention": 4389, + "NE": 4390, + "Load": 4391, + "▁regular": 4392, + "aved": 4393, + "?:": 4394, + "year": 4395, + "func": 4396, + "▁performance": 4397, + "▁July": 4398, + "thern": 4399, + "▁website": 4400, + "ford": 4401, + "PR": 4402, + "ela": 4403, + "level": 4404, + "uit": 4405, + "flags": 4406, + "▁worth": 4407, + "▁correspon": 4408, + "▁British": 4409, + "sim": 4410, + "▁alone": 4411, + "▁har": 4412, + "▁ones": 4413, + "obile": 4414, + "▁dru": 4415, + "chi": 4416, + "▁David": 4417, + "▁problems": 4418, + "▁column": 4419, + "();\r": 4420, + "ZE": 4421, + "▁relig": 4422, + "ological": 4423, + "▁region": 4424, + "ady": 4425, + "IO": 4426, + "ander": 4427, + "Net": 4428, + "▁built": 4429, + "▁install": 4430, + "▁approach": 4431, + "Cur": 4432, + "▁fine": 4433, + "▁talking": 4434, + "▁changes": 4435, + "Style": 4436, + "▁Mart": 4437, + "лю": 4438, + "response": 4439, + "teger": 4440, + "{\r": 4441, + "irit": 4442, + "▁protected": 4443, + "▁rele": 4444, + "ership": 4445, + "тель": 4446, + "unsigned": 4447, + "ialize": 4448, + "▁https": 4449, + "Tag": 4450, + "▁$(": 4451, + "more": 4452, + "ypes": 4453, + "▁stream": 4454, + "etch": 4455, + "▁engine": 4456, + "KE": 4457, + "cmd": 4458, + "script": 4459, + "ttp": 4460, + "▁avoid": 4461, + "▁terr": 4462, + "▁rock": 4463, + "▁ful": 4464, + "Update": 4465, + "▁environment": 4466, + "▁prec": 4467, + "▁са": 4468, + "▁cases": 4469, + "▁offset": 4470, + "▁rais": 4471, + "lib": 4472, + "ées": 4473, + "aa": 4474, + "yt": 4475, + "▁arr": 4476, + "opyright": 4477, + "first": 4478, + "▁util": 4479, + "▁feature": 4480, + "posed": 4481, + "ffect": 4482, + "жа": 4483, + "itude": 4484, + "ements": 4485, + "asc": 4486, + "ador": 4487, + "lections": 4488, + "▁club": 4489, + "]{": 4490, + "▁*)": 4491, + "ство": 4492, + "▁imm": 4493, + "▁former": 4494, + "▁rights": 4495, + "▁decided": 4496, + "▁rev": 4497, + "▁ment": 4498, + "ani": 4499, + "▁stru": 4500, + "▁attention": 4501, + "artment": 4502, + "▁Ital": 4503, + "alle": 4504, + "▁bis": 4505, + "gener": 4506, + "▁integr": 4507, + "ello": 4508, + "rypt": 4509, + "▁achie": 4510, + "nes": 4511, + "▁stra": 4512, + "sb": 4513, + "▁types": 4514, + "▁RE": 4515, + "Init": 4516, + "▁comment": 4517, + "▁addition": 4518, + "▁ID": 4519, + "ART": 4520, + "FO": 4521, + "щи": 4522, + "Conne": 4523, + "▁squ": 4524, + "▁considered": 4525, + "idad": 4526, + "▁October": 4527, + "cial": 4528, + "▁Of": 4529, + "▁travel": 4530, + "▁boy": 4531, + "').": 4532, + "uy": 4533, + "illa": 4534, + "istry": 4535, + "▁va": 4536, + "▁Che": 4537, + "ERT": 4538, + "ende": 4539, + "ungen": 4540, + "aby": 4541, + "▁Rober": 4542, + "▁playing": 4543, + "ils": 4544, + "▁sam": 4545, + "▁execut": 4546, + "▁Us": 4547, + "▁mut": 4548, + "▁bal": 4549, + "asse": 4550, + "▁kids": 4551, + "▁financ": 4552, + "gor": 4553, + "▁Sec": 4554, + "bert": 4555, + "▁High": 4556, + "▁је": 4557, + "▁kept": 4558, + 
"button": 4559, + "itory": 4560, + "▁Rem": 4561, + "▁DE": 4562, + "▁reach": 4563, + "▁bur": 4564, + "Label": 4565, + "át": 4566, + "ago": 4567, + "▁passed": 4568, + "▁behav": 4569, + "xFF": 4570, + "▁Return": 4571, + "STR": 4572, + "▁Les": 4573, + "▁ord": 4574, + "ala": 4575, + "inger": 4576, + "▁Since": 4577, + "▁experi": 4578, + "▁shall": 4579, + "▁star": 4580, + "non": 4581, + "▁gun": 4582, + "▁Bel": 4583, + "▁obj": 4584, + "ares": 4585, + "rs": 4586, + "▁weeks": 4587, + "nen": 4588, + "▁Stre": 4589, + "oring": 4590, + "▁î": 4591, + "▁serious": 4592, + "times": 4593, + "▁House": 4594, + "▁roll": 4595, + "▁register": 4596, + "▁module": 4597, + "▁applic": 4598, + "IR": 4599, + "▁cook": 4600, + "aux": 4601, + "▁save": 4602, + "▁Cr": 4603, + ",\r": 4604, + "▁states": 4605, + "▁empty": 4606, + "▁autom": 4607, + "figure": 4608, + "iance": 4609, + "▁happy": 4610, + "▁fn": 4611, + "▁jud": 4612, + "▁hat": 4613, + "ACK": 4614, + "▁Fe": 4615, + "$-": 4616, + "ivil": 4617, + "oted": 4618, + "▁sizeof": 4619, + "▁situation": 4620, + "▁lives": 4621, + "▁feeling": 4622, + "▁risk": 4623, + "▁January": 4624, + "▁Object": 4625, + "▁recomm": 4626, + "▁вы": 4627, + "▁potential": 4628, + "eah": 4629, + "▁complex": 4630, + "printf": 4631, + "istance": 4632, + "irth": 4633, + "lik": 4634, + "aste": 4635, + "▁whose": 4636, + "Arg": 4637, + "▁modern": 4638, + "iones": 4639, + "▁че": 4640, + "▁sett": 4641, + "▁Mag": 4642, + "ae": 4643, + "▁condition": 4644, + "Length": 4645, + "▁fit": 4646, + "ounds": 4647, + "▁changed": 4648, + "▁guy": 4649, + "filter": 4650, + "atever": 4651, + "éd": 4652, + "remove": 4653, + "▁hop": 4654, + "▁Out": 4655, + "▁Rich": 4656, + "child": 4657, + "▁included": 4658, + "$\\": 4659, + "▁Tom": 4660, + "eline": 4661, + "▁sometimes": 4662, + "▁drink": 4663, + "▁quant": 4664, + "▁please": 4665, + "▁Int": 4666, + "rief": 4667, + "▁exactly": 4668, + "cing": 4669, + "▁allowed": 4670, + "build": 4671, + "▁beautiful": 4672, + "▁Well": 4673, + "▁looks": 4674, + "▁ü": 4675, + "▁chance": 4676, + "▁wrote": 4677, + "▁nor": 4678, + "▁failed": 4679, + "Met": 4680, + "▁prior": 4681, + "▁hundred": 4682, + "ской": 4683, + "oria": 4684, + "▁cy": 4685, + "▁web": 4686, + "▁mess": 4687, + "leq": 4688, + "dy": 4689, + "tex": 4690, + "▁anim": 4691, + "atur": 4692, + "▁structure": 4693, + "option": 4694, + "▁actual": 4695, + "▁Franc": 4696, + "enced": 4697, + ".": 4884, + "▁production": 4885, + "iger": 4886, + "▁ст": 4887, + "show": 4888, + "▁population": 4889, + "▁park": 4890, + "▁Ze": 4891, + "▁necessary": 4892, + "▁trust": 4893, + "▁shown": 4894, + "module": 4895, + "GE": 4896, + "▁lay": 4897, + "▁announ": 4898, + "▁className": 4899, + "▁calcul": 4900, + "Function": 4901, + "▁Sal": 4902, + "OK": 4903, + "TP": 4904, + "▁entry": 4905, + "▁Stud": 4906, + "▁items": 4907, + "▁security": 4908, + "Entry": 4909, + "float": 4910, + "ls": 4911, + "ibly": 4912, + "▁contribut": 4913, + "▁Check": 4914, + "MD": 4915, + "▁improve": 4916, + "Part": 4917, + "▁systems": 4918, + "Bl": 4919, + "▁policy": 4920, + "▁screen": 4921, + "▁Any": 4922, + "▁opened": 4923, + "alloc": 4924, + "▁December": 4925, + "▁É": 4926, + "▁email": 4927, + "ader": 4928, + "=>": 4929, + "▁Hen": 4930, + "▁info": 4931, + "▁float": 4932, + "▁switch": 4933, + "ран": 4934, + "urance": 4935, + "▁assum": 4936, + "ustr": 4937, + "▁groups": 4938, + "▁Read": 4939, + "▁wat": 4940, + "Sp": 4941, + "вер": 4942, + "RAN": 4943, + "hib": 4944, + "ALL": 4945, + "▁hus": 4946, + "Spec": 4947, + "\"))": 4948, + "▁French": 4949, + "▁Class": 4950, + "▁president": 4951, + 
"▁definit": 4952, + "▁Nor": 4953, + "▁Thom": 4954, + "aign": 4955, + "Width": 4956, + "Do": 4957, + "▁{@": 4958, + "agon": 4959, + "▁Lu": 4960, + "▁followed": 4961, + "MM": 4962, + "asons": 4963, + "tmp": 4964, + "▁throws": 4965, + "ITY": 4966, + "ном": 4967, + "▁fair": 4968, + "▁pen": 4969, + "ég": 4970, + "▁interface": 4971, + "▁saf": 4972, + "oon": 4973, + "Back": 4974, + "▁speed": 4975, + "▁extends": 4976, + "empty": 4977, + "▁пере": 4978, + "▁proper": 4979, + "▁driv": 4980, + "фи": 4981, + "▁center": 4982, + "header": 4983, + "▁})": 4984, + "wa": 4985, + "▁middle": 4986, + "▁choose": 4987, + "▁Stad": 4988, + "SO": 4989, + "Factory": 4990, + "Dev": 4991, + "icles": 4992, + "▁application": 4993, + "▁models": 4994, + "pite": 4995, + "cap": 4996, + "xi": 4997, + "ospital": 4998, + "▁dream": 4999, + "END": 5000, + "▁contract": 5001, + "icrosoft": 5002, + "▁thous": 5003, + "izes": 5004, + "▁да": 5005, + "▁CO": 5006, + "▁direction": 5007, + "▁``": 5008, + "▁drive": 5009, + "Max": 5010, + "cia": 5011, + "▁continu": 5012, + "▁Alex": 5013, + "▁gold": 5014, + "▁prep": 5015, + "▁origin": 5016, + "▁rap": 5017, + "Op": 5018, + "ously": 5019, + "▁areas": 5020, + "PORT": 5021, + "она": 5022, + "▁safe": 5023, + "▁professional": 5024, + "apache": 5025, + "▁temper": 5026, + "sz": 5027, + "▁unit": 5028, + "▁cop": 5029, + "eqn": 5030, + "Listener": 5031, + "▁format": 5032, + "select": 5033, + "▁comfort": 5034, + "▁meant": 5035, + "iday": 5036, + "eme": 5037, + "▁active": 5038, + "▁note": 5039, + "▁Mil": 5040, + "only": 5041, + "▁<=": 5042, + "▁neigh": 5043, + "ao": 5044, + "▁blue": 5045, + "▁TV": 5046, + "Child": 5047, + "▁reached": 5048, + "Address": 5049, + "ств": 5050, + "▁closed": 5051, + "inder": 5052, + "olo": 5053, + "▁alt": 5054, + "▁adm": 5055, + "Format": 5056, + "UI": 5057, + "▁Ham": 5058, + "▁frequ": 5059, + "▁independ": 5060, + "▁easily": 5061, + "▁Land": 5062, + "▁tor": 5063, + "ography": 5064, + "infty": 5065, + "▁Work": 5066, + "iven": 5067, + "▁County": 5068, + "▁src": 5069, + "}$,": 5070, + "parse": 5071, + "CD": 5072, + "▁Cour": 5073, + "▁fol": 5074, + "Entity": 5075, + "pgf": 5076, + "▁China": 5077, + "▁Sub": 5078, + "hood": 5079, + "▁fields": 5080, + "▁yes": 5081, + "rend": 5082, + "▁towards": 5083, + "▁staff": 5084, + "▁Air": 5085, + "▁station": 5086, + "atives": 5087, + "▁impact": 5088, + "вы": 5089, + "▁directly": 5090, + "issions": 5091, + "iva": 5092, + "|\\": 5093, + "Ptr": 5094, + "▁Sant": 5095, + "Pol": 5096, + "▁progress": 5097, + "itar": 5098, + "▁parts": 5099, + "▁plant": 5100, + "▁absolut": 5101, + "▁guess": 5102, + "eqref": 5103, + "▁tim": 5104, + "▁Lou": 5105, + "▁cool": 5106, + "alu": 5107, + "▁mouth": 5108, + "них": 5109, + "▁height": 5110, + "gest": 5111, + "▁Post": 5112, + "▁board": 5113, + "▁tit": 5114, + "▁hour": 5115, + "▁server": 5116, + "▁players": 5117, + "rier": 5118, + "Link": 5119, + "▁President": 5120, + "](": 5121, + "▁construct": 5122, + "handle": 5123, + "}$.": 5124, + "rying": 5125, + "▁shop": 5126, + "iana": 5127, + "exp": 5128, + "Helper": 5129, + "Offset": 5130, + "aches": 5131, + "▁connection": 5132, + "▁difference": 5133, + "service": 5134, + "▁gas": 5135, + "▁priv": 5136, + "▁univers": 5137, + "▁wish": 5138, + "Rem": 5139, + "Url": 5140, + "geb": 5141, + "So": 5142, + "ensions": 5143, + "Module": 5144, + "SIZE": 5145, + "▁prem": 5146, + "window": 5147, + "▁dies": 5148, + "del": 5149, + "▁row": 5150, + "▁average": 5151, + "xim": 5152, + "▁pu": 5153, + "anç": 5154, + "Det": 5155, + "ker": 5156, + "ya": 5157, + "▁Det": 5158, + "▁på": 5159, + 
"▁named": 5160, + "▁decision": 5161, + "win": 5162, + "▁George": 5163, + "arily": 5164, + "▁solution": 5165, + "▁multiple": 5166, + "ategy": 5167, + "▁learning": 5168, + "▁secret": 5169, + "DO": 5170, + "▁nice": 5171, + "////////////////": 5172, + "Su": 5173, + "itation": 5174, + "▁join": 5175, + "▁elements": 5176, + "▁emer": 5177, + "tilde": 5178, + "▁dep": 5179, + "▁shot": 5180, + "▁platform": 5181, + "othing": 5182, + "My": 5183, + "edia": 5184, + "oms": 5185, + "aily": 5186, + "([": 5187, + "▁dress": 5188, + "▁official": 5189, + "estern": 5190, + "▁discover": 5191, + "▁mi": 5192, + "ные": 5193, + "CA": 5194, + "oding": 5195, + "▁Found": 5196, + "▁affect": 5197, + "Vis": 5198, + "stract": 5199, + "iced": 5200, + "debug": 5201, + "▁related": 5202, + "▁spect": 5203, + "ushed": 5204, + "сько": 5205, + "▁bank": 5206, + "▁cele": 5207, + "AND": 5208, + "olf": 5209, + "ем": 5210, + "▁fill": 5211, + "▁gives": 5212, + "▁бу": 5213, + "aron": 5214, + "▁Jes": 5215, + "REG": 5216, + "▁sudd": 5217, + "dated": 5218, + "vi": 5219, + "▁gi": 5220, + "send": 5221, + "cpp": 5222, + "▁spent": 5223, + "ande": 5224, + "▁operation": 5225, + "process": 5226, + "▁inform": 5227, + "▁Free": 5228, + "yond": 5229, + "▁perhaps": 5230, + "▁surv": 5231, + "▁Loc": 5232, + "▁concl": 5233, + "▁раз": 5234, + "▁Over": 5235, + "hol": 5236, + "raz": 5237, + "Write": 5238, + "▁giving": 5239, + "rd": 5240, + "instance": 5241, + "▁released": 5242, + "▁Ro": 5243, + "RA": 5244, + "▁practice": 5245, + "▁graph": 5246, + "▁increase": 5247, + "▁figure": 5248, + "Filter": 5249, + "HECK": 5250, + "idx": 5251, + "▁glass": 5252, + "ski": 5253, + "comes": 5254, + "▁cat": 5255, + "▁cold": 5256, + "goto": 5257, + "ufact": 5258, + "▁Copyright": 5259, + "}}\\": 5260, + "▁streng": 5261, + "▁dir": 5262, + "token": 5263, + "▁occur": 5264, + "arlier": 5265, + "▁measure": 5266, + "▁sec": 5267, + "▁más": 5268, + "▁Net": 5269, + "▁argument": 5270, + "▁sou": 5271, + "▁moving": 5272, + "▁prefer": 5273, + "mask": 5274, + "<<": 5275, + "▁breath": 5276, + "▁physical": 5277, + "▁positive": 5278, + "▁sor": 5279, + "▁depart": 5280, + "▁remove": 5281, + "▁kit": 5282, + "▁meeting": 5283, + "▁Data": 5284, + "ograf": 5285, + "actions": 5286, + "▁parameters": 5287, + "▁Att": 5288, + "esch": 5289, + "▁involved": 5290, + "ät": 5291, + "LL": 5292, + "Bar": 5293, + "▁си": 5294, + "ech": 5295, + "GET": 5296, + "▁prevent": 5297, + "▁beyond": 5298, + "▁Other": 5299, + "än": 5300, + "byte": 5301, + "▁sudden": 5302, + "olve": 5303, + "▁но": 5304, + "LOG": 5305, + "unit": 5306, + "▁truth": 5307, + "rat": 5308, + "SD": 5309, + "▁eat": 5310, + "▁Mad": 5311, + "▁provides": 5312, + "▁session": 5313, + "Dele": 5314, + "▁convers": 5315, + "center": 5316, + "▁continued": 5317, + "otion": 5318, + "cache": 5319, + "display": 5320, + "▁protect": 5321, + "ams": 5322, + "▁pow": 5323, + "CTION": 5324, + "▁Mac": 5325, + "mo": 5326, + "ха": 5327, + "▁distance": 5328, + "▁Time": 5329, + "gi": 5330, + "▁sequ": 5331, + "Target": 5332, + "сле": 5333, + "Server": 5334, + "▁wide": 5335, + "close": 5336, + "▁cru": 5337, + "Ext": 5338, + "▁select": 5339, + "▁pattern": 5340, + "\"));": 5341, + "Provider": 5342, + "URL": 5343, + "▁green": 5344, + "▁waiting": 5345, + "proto": 5346, + "▁immediately": 5347, + "common": 5348, + "azione": 5349, + "river": 5350, + "▁sen": 5351, + "▁!==": 5352, + "▁February": 5353, + "urb": 5354, + "▁Sen": 5355, + "dest": 5356, + ">": 6122, + "command": 6123, + "atz": 6124, + "▁mal": 6125, + "став": 6126, + "▁Press": 6127, + "▁characters": 6128, + "▁zero": 6129, + "AGE": 
6130, + "rapper": 6131, + "▁kitchen": 6132, + "aming": 6133, + "▁restr": 6134, + "XX": 6135, + "▁College": 6136, + "▁Array": 6137, + "▁fresh": 6138, + "▁shift": 6139, + "▁specified": 6140, + "plete": 6141, + "ITE": 6142, + "▁Camp": 6143, + "rial": 6144, + "cb": 6145, + "▁TH": 6146, + "IB": 6147, + "osen": 6148, + "▁ú": 6149, + "▁params": 6150, + "ignment": 6151, + "adding": 6152, + "▁degree": 6153, + "Local": 6154, + "Oh": 6155, + "▁zur": 6156, + "▁levels": 6157, + "CS": 6158, + "finished": 6159, + "Case": 6160, + "riage": 6161, + "Vector": 6162, + "▁sea": 6163, + "antic": 6164, + "▁League": 6165, + "▁therefore": 6166, + "One": 6167, + "Return": 6168, + "Access": 6169, + "vas": 6170, + "▁ос": 6171, + "▁rat": 6172, + "Big": 6173, + "▁behavior": 6174, + "kr": 6175, + "▁undefined": 6176, + "▁Es": 6177, + "▁appeared": 6178, + "eles": 6179, + "▁WAR": 6180, + "Stat": 6181, + "▁Google": 6182, + "▁credit": 6183, + "▁File": 6184, + "anging": 6185, + "house": 6186, + "romise": 6187, + "gent": 6188, + "▁habit": 6189, + "▁society": 6190, + "▁encour": 6191, + "▁paint": 6192, + "pet": 6193, + "▁UK": 6194, + "aws": 6195, + "onom": 6196, + "Gl": 6197, + "}_{\\": 6198, + "eless": 6199, + "emy": 6200, + "▁Cong": 6201, + "▁developed": 6202, + "▁images": 6203, + "▁ö": 6204, + "▁font": 6205, + "clear": 6206, + "gin": 6207, + "▁Lord": 6208, + "▁transport": 6209, + "▁::": 6210, + "▁cup": 6211, + "ulate": 6212, + "▁During": 6213, + "priv": 6214, + "▁extrem": 6215, + "▁Di": 6216, + "▁doubt": 6217, + "Py": 6218, + "ifying": 6219, + "split": 6220, + "ego": 6221, + "github": 6222, + "▁),": 6223, + "ROM": 6224, + "▁chair": 6225, + "▁trade": 6226, + "▁nicht": 6227, + "Top": 6228, + "Store": 6229, + "▁parte": 6230, + "project": 6231, + "nia": 6232, + "▁від": 6233, + "war": 6234, + "▁Prof": 6235, + "▁caught": 6236, + "Thread": 6237, + "ства": 6238, + "author": 6239, + "▁doll": 6240, + "▁harm": 6241, + "▁Gen": 6242, + "tree": 6243, + "etime": 6244, + "cfg": 6245, + "▁guys": 6246, + "▁California": 6247, + "▁Green": 6248, + "▁movement": 6249, + "iej": 6250, + "▁statement": 6251, + "▁seeing": 6252, + "▁haven": 6253, + "vention": 6254, + "SL": 6255, + "chedul": 6256, + "iert": 6257, + "▁primary": 6258, + "▁civil": 6259, + "rian": 6260, + "▁button": 6261, + "▁lived": 6262, + "Pass": 6263, + "sor": 6264, + "▁watching": 6265, + "▁skills": 6266, + "tee": 6267, + "Level": 6268, + "▁scient": 6269, + "hs": 6270, + "▁agre": 6271, + "cat": 6272, + "▁tend": 6273, + "▁Mill": 6274, + "▁Cap": 6275, + "ORD": 6276, + "gle": 6277, + "▁сво": 6278, + "»,": 6279, + "▁ahead": 6280, + "vest": 6281, + "▁Jose": 6282, + "ischer": 6283, + "și": 6284, + "▁leaving": 6285, + "▁для": 6286, + "▁south": 6287, + "▁consum": 6288, + "Range": 6289, + "▁activities": 6290, + "Sec": 6291, + "▁sales": 6292, + "▁fix": 6293, + "▁jed": 6294, + "rum": 6295, + "vector": 6296, + "▁spot": 6297, + "▁manufact": 6298, + "кт": 6299, + "orrow": 6300, + "sign": 6301, + "▁college": 6302, + "▁driver": 6303, + "▁definitely": 6304, + "▁spend": 6305, + "mission": 6306, + "зу": 6307, + "atively": 6308, + "bi": 6309, + "Callback": 6310, + "▁particularly": 6311, + "▁hell": 6312, + "▁pool": 6313, + "PRE": 6314, + "▁clearly": 6315, + "PT": 6316, + "othes": 6317, + "▁Id": 6318, + "Location": 6319, + "▁Run": 6320, + "▁fixed": 6321, + "▁Hand": 6322, + "bal": 6323, + "double": 6324, + "Can": 6325, + "Omega": 6326, + "▁challeng": 6327, + "▁standing": 6328, + "iten": 6329, + "▁mechan": 6330, + "▁durch": 6331, + "▁dell": 6332, + "▁raised": 6333, + "▁weak": 6334, + "▁Du": 6335, + "grad": 6336, 
+ "▁scene": 6337, + "poss": 6338, + "▁ton": 6339, + "▁earth": 6340, + "ulations": 6341, + "▁strength": 6342, + "aked": 6343, + "▁remain": 6344, + "▁Bi": 6345, + "▁customer": 6346, + "range": 6347, + "▁interested": 6348, + "ONE": 6349, + "▁coff": 6350, + "require": 6351, + "▁Only": 6352, + "▁Web": 6353, + "▁farm": 6354, + "▁activity": 6355, + "▁rout": 6356, + "bling": 6357, + "SY": 6358, + "▁Richard": 6359, + "▁Ref": 6360, + "▁кон": 6361, + "▁jun": 6362, + "born": 6363, + "ijn": 6364, + "Configuration": 6365, + "uman": 6366, + "EE": 6367, + "▁married": 6368, + "▁За": 6369, + "▁fat": 6370, + "▁kid": 6371, + "▁Tur": 6372, + "▁offered": 6373, + "nic": 6374, + "▁Big": 6375, + "Gamma": 6376, + "▁Health": 6377, + "▁TR": 6378, + "▁się": 6379, + "▁construction": 6380, + "▁Church": 6381, + "▁Bet": 6382, + "bus": 6383, + "▁earn": 6384, + "rict": 6385, + "▁пра": 6386, + "▁brain": 6387, + "▁fra": 6388, + "▁Op": 6389, + "FIG": 6390, + "ema": 6391, + "▁European": 6392, + "▁Saint": 6393, + "ARE": 6394, + "uri": 6395, + "▁River": 6396, + "{}": 6397, + "▁sitting": 6398, + "▁understanding": 6399, + "▁plans": 6400, + "ropri": 6401, + "▁older": 6402, + "▁pressure": 6403, + "Impl": 6404, + "▁peace": 6405, + "Connection": 6406, + "▁fi": 6407, + "rich": 6408, + "▁shut": 6409, + "apers": 6410, + "Port": 6411, + "▁Look": 6412, + "rim": 6413, + "auth": 6414, + "auto": 6415, + "▁highly": 6416, + "▁unless": 6417, + "▁Wal": 6418, + "▁ren": 6419, + "ws": 6420, + "▁core": 6421, + "(-": 6422, + "▁clim": 6423, + "ruit": 6424, + "▁callback": 6425, + "hest": 6426, + "▁Charles": 6427, + "▁Long": 6428, + "}=": 6429, + "ър": 6430, + "▁shared": 6431, + "ulated": 6432, + "gorithm": 6433, + "▁Home": 6434, + "▁village": 6435, + "ees": 6436, + "sv": 6437, + "▁restaur": 6438, + "rey": 6439, + "▁Cast": 6440, + "▁Person": 6441, + "кий": 6442, + "▁organiz": 6443, + "▁Rad": 6444, + "ponents": 6445, + "▁werden": 6446, + "▁bow": 6447, + "sen": 6448, + "ami": 6449, + "Interface": 6450, + "▁basis": 6451, + "▁Company": 6452, + "ernel": 6453, + "itu": 6454, + "Hash": 6455, + "▁aan": 6456, + "▁х": 6457, + "▁smile": 6458, + "xml": 6459, + "▁scen": 6460, + "amm": 6461, + "tool": 6462, + "aria": 6463, + "▁accur": 6464, + "settings": 6465, + "▁Jesus": 6466, + "acement": 6467, + "power": 6468, + "(!": 6469, + "▁calls": 6470, + "▁basic": 6471, + "▁settings": 6472, + "ript": 6473, + "pool": 6474, + "ctors": 6475, + "▁Foundation": 6476, + "▁weap": 6477, + "KEY": 6478, + "foot": 6479, + "▁radio": 6480, + "▁helped": 6481, + "mann": 6482, + "▁jump": 6483, + "▁tick": 6484, + "▁growing": 6485, + "aten": 6486, + "real": 6487, + "▁increasing": 6488, + "Device": 6489, + "varepsilon": 6490, + "▁sets": 6491, + "▁advant": 6492, + "Open": 6493, + "▁reasons": 6494, + "▁supposed": 6495, + "oes": 6496, + "ede": 6497, + "teen": 6498, + "ifdef": 6499, + "▁delete": 6500, + "▁&=": 6501, + "▁Bill": 6502, + "▁aim": 6503, + "▁Ok": 6504, + "▁Av": 6505, + "reci": 6506, + "acks": 6507, + "iste": 6508, + "Properties": 6509, + "▁tmp": 6510, + "▁dei": 6511, + "PER": 6512, + "DC": 6513, + "sta": 6514, + "нии": 6515, + "▁limited": 6516, + "▁greater": 6517, + "description": 6518, + "ori": 6519, + "aints": 6520, + "▁hy": 6521, + "▁Mel": 6522, + "▁CH": 6523, + "cons": 6524, + "▁surround": 6525, + "▁Who": 6526, + "arc": 6527, + "▁telev": 6528, + "itution": 6529, + "▁equal": 6530, + "кі": 6531, + "▁Israel": 6532, + "äh": 6533, + "▁Caption": 6534, + "▁exerc": 6535, + "empor": 6536, + "▁++": 6537, + "▁lib": 6538, + "make": 6539, + "▁MA": 6540, + "copy": 6541, + "friend": 6542, + "▁кото": 
6543, + "▁damage": 6544, + "▁\\,": 6545, + "oded": 6546, + "▁none": 6547, + "▁evalu": 6548, + "ston": 6549, + ">,": 6550, + "FOR": 6551, + "▁norm": 6552, + "appe": 6553, + "Session": 6554, + "▁adult": 6555, + "▁hospital": 6556, + "▁recommend": 6557, + "property": 6558, + "stein": 6559, + "final": 6560, + "▁nu": 6561, + "second": 6562, + "▁aspect": 6563, + "\")]": 6564, + "жен": 6565, + "amento": 6566, + "▁rac": 6567, + "save": 6568, + "▁football": 6569, + "Ab": 6570, + "ungs": 6571, + "abil": 6572, + "▁Arch": 6573, + "system": 6574, + "hist": 6575, + "▁luck": 6576, + "render": 6577, + "▁sein": 6578, + "ioni": 6579, + "▁rot": 6580, + "▁corner": 6581, + "▁appropri": 6582, + "▁Software": 6583, + "▁tele": 6584, + "Delete": 6585, + "▁According": 6586, + "▁prison": 6587, + "▁lic": 6588, + "▁ми": 6589, + "term": 6590, + "sets": 6591, + "▁vel": 6592, + "▁rank": 6593, + "▁existing": 6594, + "▁Vir": 6595, + "▁trip": 6596, + "▁му": 6597, + "avax": 6598, + "▁ris": 6599, + "▁define": 6600, + "▁heat": 6601, + "car": 6602, + "▁convert": 6603, + "email": 6604, + "▁Under": 6605, + "▁Ш": 6606, + "▁Grand": 6607, + "▁exists": 6608, + "sys": 6609, + "eff": 6610, + "▁Top": 6611, + "▁č": 6612, + "▁tempor": 6613, + "▁arguments": 6614, + "▁supported": 6615, + "ensed": 6616, + "▁Francis": 6617, + "▁coord": 6618, + "▁achieve": 6619, + "▁Name": 6620, + "▁Jahr": 6621, + "▁Gi": 6622, + "she": 6623, + "▁Dev": 6624, + "▁alla": 6625, + "▁WIT": 6626, + "agment": 6627, + "custom": 6628, + "alls": 6629, + "&&": 6630, + "WE": 6631, + "▁holding": 6632, + "prototype": 6633, + "▁fing": 6634, + "▁bag": 6635, + "▁Party": 6636, + "stack": 6637, + "▁economic": 6638, + "▁Gal": 6639, + "idents": 6640, + "▁Jun": 6641, + "▁showed": 6642, + "osh": 6643, + "▁Bay": 6644, + "mail": 6645, + "▁SO": 6646, + "▁\"<": 6647, + "graphics": 6648, + "▁fu": 6649, + "click": 6650, + "▁battle": 6651, + "{{": 6652, + "▁Event": 6653, + "rior": 6654, + "chaft": 6655, + "▁favorite": 6656, + "usive": 6657, + "support": 6658, + "bm": 6659, + "Kind": 6660, + "▁safety": 6661, + "▁Ent": 6662, + "cup": 6663, + "▁Australia": 6664, + "▁destroy": 6665, + "▁organization": 6666, + "iden": 6667, + "################": 6668, + "dec": 6669, + "▁za": 6670, + "▁seven": 6671, + "arely": 6672, + "▁flag": 6673, + "Dir": 6674, + "▁Carl": 6675, + "▁doctor": 6676, + "▁variety": 6677, + "▁Lin": 6678, + "▁tom": 6679, + "^{(": 6680, + "Bo": 6681, + "antes": 6682, + "▁mine": 6683, + "▁Mit": 6684, + "▁describe": 6685, + "Args": 6686, + "LS": 6687, + "API": 6688, + "▁Luc": 6689, + "phone": 6690, + "▁science": 6691, + "▁Oper": 6692, + "Next": 6693, + "▁investig": 6694, + "▁demonstr": 6695, + "▁Govern": 6696, + "▁objects": 6697, + "▁Louis": 6698, + "▁Returns": 6699, + "▁han": 6700, + "nam": 6701, + "▁comme": 6702, + "▁presence": 6703, + "▁pel": 6704, + "▁detect": 6705, + ")=": 6706, + "▁Chinese": 6707, + "▁rich": 6708, + "▁classes": 6709, + "▁expand": 6710, + "▁Dom": 6711, + "▁Dec": 6712, + "sn": 6713, + "peed": 6714, + "▁Jim": 6715, + "should": 6716, + "▁Smith": 6717, + "▁pages": 6718, + "▁Jean": 6719, + "rics": 6720, + "▁Sund": 6721, + "ads": 6722, + "▁Their": 6723, + "unicip": 6724, + "ву": 6725, + "▁download": 6726, + "▁stress": 6727, + "▁Pet": 6728, + "menu": 6729, + "reme": 6730, + "▁compared": 6731, + "Ste": 6732, + "IND": 6733, + "container": 6734, + "▁Indian": 6735, + "oren": 6736, + "▁ses": 6737, + "▁Whe": 6738, + "▁roku": 6739, + "▁established": 6740, + "▁generally": 6741, + "▁fle": 6742, + "__(": 6743, + "=\"+": 6744, + "Var": 6745, + "▁Make": 6746, + "▁removed": 6747, + 
"zz": 6748, + "ün": 6749, + "▁mix": 6750, + "erk": 6751, + "iation": 6752, + "outer": 6753, + "SK": 6754, + "▁becomes": 6755, + "▁Hall": 6756, + "scious": 6757, + "▁watched": 6758, + "▁gather": 6759, + "▁Result": 6760, + "proof": 6761, + "pay": 6762, + "▁produced": 6763, + "▁|=": 6764, + "▁border": 6765, + "▁din": 6766, + "▁script": 6767, + "▁actions": 6768, + "▁mas": 6769, + "ща": 6770, + "ooth": 6771, + "▁Techn": 6772, + "Json": 6773, + "▁filled": 6774, + "ден": 6775, + "undle": 6776, + "сту": 6777, + "Tool": 6778, + "▁king": 6779, + "▁ven": 6780, + "stra": 6781, + "▁predict": 6782, + "▁lui": 6783, + "▁WARRAN": 6784, + "▁Fun": 6785, + "Script": 6786, + "▁powerful": 6787, + "▁lose": 6788, + "atically": 6789, + "▁daily": 6790, + "▁ring": 6791, + "▁arrived": 6792, + "Stack": 6793, + "scope": 6794, + "▁Back": 6795, + "elij": 6796, + "▁ze": 6797, + "keys": 6798, + "{\"": 6799, + "VID": 6800, + "▁license": 6801, + "what": 6802, + "▁proced": 6803, + "rant": 6804, + "estival": 6805, + "agram": 6806, + "▁LO": 6807, + "▁Henry": 6808, + "▁flags": 6809, + "Down": 6810, + "scription": 6811, + "▁families": 6812, + "isse": 6813, + "bour": 6814, + "▁Bur": 6815, + "—\"": 6816, + "▁brief": 6817, + "▁creating": 6818, + "▁clients": 6819, + "rangle": 6820, + "▁amazing": 6821, + "▁sind": 6822, + "▁covered": 6823, + "Well": 6824, + "сте": 6825, + "тор": 6826, + "▁Bas": 6827, + "total": 6828, + "▁Init": 6829, + "▁sand": 6830, + "Unit": 6831, + "▁murder": 6832, + "▁bright": 6833, + "▁trav": 6834, + "icans": 6835, + "▁attribute": 6836, + "fc": 6837, + "▁placed": 6838, + "EST": 6839, + "Vari": 6840, + "▁cos": 6841, + "▁attract": 6842, + "anel": 6843, + "}).": 6844, + "bytes": 6845, + "▁parse": 6846, + "▁belong": 6847, + "BN": 6848, + "▁Sol": 6849, + "Po": 6850, + "`,": 6851, + "▁calling": 6852, + "▁?>": 6853, + "▁iter": 6854, + "▁url": 6855, + "▁evening": 6856, + "reek": 6857, + "▁honest": 6858, + "▁director": 6859, + "RC": 6860, + "▁solid": 6861, + "▁phil": 6862, + "iene": 6863, + "FAULT": 6864, + "cope": 6865, + "▁History": 6866, + "▁Team": 6867, + "reedom": 6868, + "▁ru": 6869, + "UB": 6870, + "▁worse": 6871, + "imo": 6872, + "Mat": 6873, + "▁Mex": 6874, + "actor": 6875, + "▁vor": 6876, + "ться": 6877, + "▁experiment": 6878, + "▁Play": 6879, + "▁Another": 6880, + "▁happens": 6881, + "uan": 6882, + "▁patients": 6883, + "▁rend": 6884, + "▁Mo": 6885, + "▁Tex": 6886, + "▁wed": 6887, + "tn": 6888, + "insert": 6889, + "▁па": 6890, + "▁anti": 6891, + "Match": 6892, + "ampionship": 6893, + "▁forces": 6894, + "▁Hot": 6895, + "▁phase": 6896, + "▁template": 6897, + "stop": 6898, + "icated": 6899, + "▁managed": 6900, + "wait": 6901, + "▁*(": 6902, + "GB": 6903, + "▁appoint": 6904, + "ła": 6905, + "▁stick": 6906, + "▁FOR": 6907, + "▁Vis": 6908, + "tor": 6909, + "▁př": 6910, + "quest": 6911, + "uses": 6912, + "\");\r": 6913, + "▁suddenly": 6914, + "éc": 6915, + "ND": 6916, + "urop": 6917, + "ред": 6918, + "▁insurance": 6919, + "access": 6920, + "unfinished": 6921, + "▁tamb": 6922, + "▁sac": 6923, + "▁Court": 6924, + "▁missing": 6925, + "▁Where": 6926, + "▁Sum": 6927, + "}^{\\": 6928, + "▁sua": 6929, + "_,": 6930, + "▁thick": 6931, + "▁Trump": 6932, + "▁operations": 6933, + "FS": 6934, + "▁deux": 6935, + "dz": 6936, + "Template": 6937, + "▁\"/": 6938, + "▁odd": 6939, + "▁reality": 6940, + "▁teams": 6941, + "▁cer": 6942, + "oma": 6943, + "▁și": 6944, + "▁cloud": 6945, + "▁Department": 6946, + "Ne": 6947, + "▁requires": 6948, + "items": 6949, + "▁III": 6950, + "rightarrow": 6951, + ")->": 6952, + "▁writer": 6953, + "replace": 
6954, + "▁thr": 6955, + "jen": 6956, + "▁ot": 6957, + "▁occup": 6958, + "▁eventually": 6959, + "▁Math": 6960, + "▁conserv": 6961, + "amer": 6962, + "▁Fort": 6963, + "▁dry": 6964, + "▁sexual": 6965, + "▁costs": 6966, + "▁forms": 6967, + "▁Vict": 6968, + "PAR": 6969, + "framework": 6970, + "▁ди": 6971, + "Operation": 6972, + "зна": 6973, + "which": 6974, + "▁tight": 6975, + "Invalid": 6976, + "▁partner": 6977, + "▁пред": 6978, + "▁thank": 6979, + "▁guard": 6980, + "hem": 6981, + "Body": 6982, + "▁emot": 6983, + "IX": 6984, + "fast": 6985, + "що": 6986, + "ño": 6987, + "night": 6988, + "▁Sci": 6989, + "ника": 6990, + "▁TO": 6991, + "▁individuals": 6992, + "сси": 6993, + "}),": 6994, + "False": 6995, + "(\"%": 6996, + "▁optim": 6997, + "▁-->": 6998, + "▁factor": 6999, + "▁smaller": 7000, + "▁contain": 7001, + "spect": 7002, + "Engine": 7003, + "▁announced": 7004, + "▁Democr": 7005, + "▁rob": 7006, + "▁flat": 7007, + "osoph": 7008, + "Search": 7009, + "ahl": 7010, + "▁Exception": 7011, + "▁Ol": 7012, + "equals": 7013, + "▁unter": 7014, + "shape": 7015, + "NS": 7016, + "Obj": 7017, + "▁species": 7018, + "weight": 7019, + "you": 7020, + "▁este": 7021, + "▁View": 7022, + "▁mission": 7023, + "▁journal": 7024, + "Values": 7025, + "▁einem": 7026, + "ismo": 7027, + "▁projects": 7028, + "▁Das": 7029, + "rible": 7030, + "▁serve": 7031, + "▁opening": 7032, + "▁hur": 7033, + "▁programs": 7034, + "▁USA": 7035, + "iliar": 7036, + "idos": 7037, + "Br": 7038, + "estamp": 7039, + "▁tools": 7040, + "anner": 7041, + "RT": 7042, + "▁Start": 7043, + "▁bath": 7044, + "▁coffee": 7045, + "orter": 7046, + "internal": 7047, + "files": 7048, + "INVAL": 7049, + "ako": 7050, + "dt": 7051, + "▁Second": 7052, + "▁alloc": 7053, + "▁ended": 7054, + "acional": 7055, + "▁manager": 7056, + "▁Sun": 7057, + "agg": 7058, + "▁leader": 7059, + "olved": 7060, + "▁что": 7061, + "▁traditional": 7062, + "shot": 7063, + "rup": 7064, + "CF": 7065, + "▁Each": 7066, + "wr": 7067, + "▁Som": 7068, + "▁materials": 7069, + "▁msg": 7070, + "▁syn": 7071, + "▁produce": 7072, + "▁storage": 7073, + "subsection": 7074, + "▁Sie": 7075, + "▁IP": 7076, + "CESS": 7077, + "▁wa": 7078, + "Record": 7079, + "▁marketing": 7080, + "plet": 7081, + "Dialog": 7082, + "▁mentioned": 7083, + "▁Na": 7084, + "▁Union": 7085, + "▁API": 7086, + "▁negative": 7087, + "txt": 7088, + "▁easier": 7089, + "legal": 7090, + "Dep": 7091, + "▁novel": 7092, + "eur": 7093, + "ació": 7094, + "▁Bud": 7095, + "▁carry": 7096, + "schaft": 7097, + "▁broken": 7098, + "▁trees": 7099, + ">();": 7100, + "▁emb": 7101, + "ieder": 7102, + "▁route": 7103, + "ikel": 7104, + "▁listen": 7105, + "ashion": 7106, + "▁Mrs": 7107, + "▁equipment": 7108, + "agger": 7109, + "▁Thus": 7110, + "▁matrix": 7111, + "alla": 7112, + "▁Tour": 7113, + "▁conversation": 7114, + "Mon": 7115, + "ournal": 7116, + "▁minute": 7117, + "Am": 7118, + "Api": 7119, + "▁forget": 7120, + "Me": 7121, + "levant": 7122, + "temp": 7123, + "▁telling": 7124, + "move": 7125, + "▁independent": 7126, + "toString": 7127, + "edit": 7128, + "▁Jac": 7129, + "azz": 7130, + "react": 7131, + "▁cin": 7132, + "▁Prov": 7133, + "isted": 7134, + "▁hash": 7135, + "onna": 7136, + "iki": 7137, + "▁generated": 7138, + "Render": 7139, + "▁psych": 7140, + "nav": 7141, + "▁entr": 7142, + "пра": 7143, + "rx": 7144, + "ATH": 7145, + "▁assume": 7146, + "Tree": 7147, + "sembly": 7148, + "▁Matt": 7149, + "caption": 7150, + "▁solutions": 7151, + "▁faith": 7152, + "▁digital": 7153, + "▁excell": 7154, + "▁Version": 7155, + "Debug": 7156, + "▁жи": 7157, + "▁carried": 
7158, + "reset": 7159, + "▁slowly": 7160, + "ancing": 7161, + "▁owner": 7162, + "▁Ter": 7163, + "▁Did": 7164, + "▁gest": 7165, + "▁été": 7166, + "▁proof": 7167, + "Font": 7168, + "▁nob": 7169, + "Co": 7170, + "▁GNU": 7171, + "▁liber": 7172, + "itness": 7173, + "▁hij": 7174, + "▁vert": 7175, + "ша": 7176, + "FLAG": 7177, + "MENT": 7178, + "▁Son": 7179, + "Mult": 7180, + "▁district": 7181, + "connect": 7182, + "jection": 7183, + "lymp": 7184, + "▁realized": 7185, + "mos": 7186, + "ye": 7187, + "▁render": 7188, + "rio": 7189, + "▁interpret": 7190, + "▁slightly": 7191, + "fix": 7192, + "▁studies": 7193, + "▁rid": 7194, + "atre": 7195, + "▁benefits": 7196, + "▁Face": 7197, + "ivery": 7198, + "рия": 7199, + "document": 7200, + "▁asking": 7201, + "Last": 7202, + "arante": 7203, + "▁Martin": 7204, + "▁Ell": 7205, + "▁vector": 7206, + "▁forced": 7207, + "оло": 7208, + "PH": 7209, + "WR": 7210, + "▁Kl": 7211, + "▁sky": 7212, + "▁strategy": 7213, + "ocked": 7214, + "▁neck": 7215, + "ści": 7216, + "OUT": 7217, + ")),": 7218, + "Custom": 7219, + "▁wie": 7220, + "▁sweet": 7221, + "▁temp": 7222, + "▁foreign": 7223, + "▁hall": 7224, + "astr": 7225, + "Ass": 7226, + "MODE": 7227, + "▁maximum": 7228, + "annels": 7229, + "▁tip": 7230, + "▁seconds": 7231, + "▁stack": 7232, + "iga": 7233, + "▁raise": 7234, + "enable": 7235, + "oir": 7236, + "▁soul": 7237, + "Ke": 7238, + ")$.": 7239, + "▁Tim": 7240, + "ALSE": 7241, + "iser": 7242, + "contin": 7243, + "bel": 7244, + "▁mad": 7245, + "lichen": 7246, + "abe": 7247, + "safe": 7248, + "▁concent": 7249, + "bound": 7250, + "▁Requ": 7251, + "switch": 7252, + "▁stone": 7253, + "▁transl": 7254, + "▁vac": 7255, + "andon": 7256, + "▁Fore": 7257, + "▁sounds": 7258, + "▁Pop": 7259, + "▁HT": 7260, + "lia": 7261, + "enter": 7262, + "▁helps": 7263, + "edy": 7264, + "ствен": 7265, + "anted": 7266, + "▁Its": 7267, + "▁Step": 7268, + "Icon": 7269, + "▁EXPECT": 7270, + "ialized": 7271, + "Post": 7272, + "aze": 7273, + "▁Carol": 7274, + "▁req": 7275, + "▁critical": 7276, + "DS": 7277, + "▁seat": 7278, + "aped": 7279, + "▁upper": 7280, + "▁Sy": 7281, + "▁explain": 7282, + "▁'./": 7283, + "utils": 7284, + "possible": 7285, + "▁dont": 7286, + "Host": 7287, + "▁approxim": 7288, + "Async": 7289, + "▁grab": 7290, + "▁sources": 7291, + "▁Mos": 7292, + "▁Germany": 7293, + "▁rub": 7294, + "CHAN": 7295, + "▁rain": 7296, + "▁truly": 7297, + "▁joined": 7298, + "▁": 8276, + "agnost": 8277, + "▁proposed": 8278, + "▁Game": 8279, + "▁efforts": 8280, + "вя": 8281, + "tc": 8282, + "ск": 8283, + "▁intent": 8284, + "▁Bre": 8285, + "isc": 8286, + "▁protest": 8287, + "▁holds": 8288, + "ometry": 8289, + "▁Have": 8290, + "▁detail": 8291, + "▁WITHOUT": 8292, + "yer": 8293, + "▁Kon": 8294, + "▁noticed": 8295, + "▁requirements": 8296, + "DEBUG": 8297, + "kins": 8298, + "▁Span": 8299, + "▁cars": 8300, + "meta": 8301, + "▁kil": 8302, + "▁Bron": 8303, + "▁experienced": 8304, + "▁remind": 8305, + "ourse": 8306, + "▁Western": 8307, + "tered": 8308, + "▁devices": 8309, + "▁pictures": 8310, + "▁tut": 8311, + "\"`": 8312, + "▁impossible": 8313, + "▁rail": 8314, + "▁feels": 8315, + "icas": 8316, + "illing": 8317, + "▁accident": 8318, + "▁'@": 8319, + "________": 8320, + "▁notes": 8321, + "oman": 8322, + "Parser": 8323, + "▁discovered": 8324, + "▁Roman": 8325, + "▁budget": 8326, + "▁guide": 8327, + "king": 8328, + "▁incred": 8329, + "olar": 8330, + "enden": 8331, + "Desc": 8332, + "▁wave": 8333, + "бли": 8334, + "igt": 8335, + "▁restrict": 8336, + "▁Ret": 8337, + "▁mac": 8338, + "ур": 8339, + "BS": 8340, + "ís": 8341, 
+ "▁generation": 8342, + "dem": 8343, + "alo": 8344, + "бра": 8345, + "▁ordered": 8346, + "drop": 8347, + "▁pp": 8348, + "▁Review": 8349, + "▁literally": 8350, + "▁Sir": 8351, + "▁Yeah": 8352, + "▁density": 8353, + "riz": 8354, + "inde": 8355, + "▁gain": 8356, + "▁panel": 8357, + "jet": 8358, + "▁Times": 8359, + "▁nella": 8360, + "▁previously": 8361, + "points": 8362, + "Send": 8363, + "▁Brown": 8364, + "each": 8365, + "▁trigger": 8366, + "ometimes": 8367, + "icos": 8368, + "GR": 8369, + "Panel": 8370, + "ogen": 8371, + "▁cm": 8372, + "ructions": 8373, + "▁kiss": 8374, + "▁solo": 8375, + "▁famous": 8376, + "ran": 8377, + "про": 8378, + "▁thro": 8379, + "Graph": 8380, + "imit": 8381, + "▁Value": 8382, + "▁starts": 8383, + "ipeline": 8384, + "hd": 8385, + "TC": 8386, + "▁discussion": 8387, + "▁truck": 8388, + "aka": 8389, + "Only": 8390, + "▁Equ": 8391, + "▁kö": 8392, + "▁Bes": 8393, + "▁critic": 8394, + "▁propos": 8395, + "▁batt": 8396, + "▁Section": 8397, + "Show": 8398, + "gp": 8399, + "STATE": 8400, + "POST": 8401, + "▁Nord": 8402, + "▁innov": 8403, + "▁crim": 8404, + "axis": 8405, + "▁Turn": 8406, + "conn": 8407, + "Runtime": 8408, + "▁remaining": 8409, + "oston": 8410, + "▁Э": 8411, + "▁windows": 8412, + "▁Royal": 8413, + "▁vide": 8414, + "PP": 8415, + "chron": 8416, + "▁san": 8417, + "▁rise": 8418, + "▁delle": 8419, + "▁Dur": 8420, + "▁rapid": 8421, + "cert": 8422, + "LA": 8423, + "edge": 8424, + "▁\\]": 8425, + "▁entered": 8426, + "▁laws": 8427, + "▁photo": 8428, + "▁applications": 8429, + "▁Berlin": 8430, + "▁arrest": 8431, + "▁federal": 8432, + "▁Russia": 8433, + "▁usual": 8434, + "▁raw": 8435, + "▁più": 8436, + "être": 8437, + "JSON": 8438, + "SION": 8439, + "xture": 8440, + "istent": 8441, + "▁Power": 8442, + "Bit": 8443, + "▁capacity": 8444, + "▁cards": 8445, + "UID": 8446, + "iments": 8447, + "▁dar": 8448, + "▁Chicago": 8449, + "▁comfortable": 8450, + "tip": 8451, + "bas": 8452, + "▁mu": 8453, + "▁enemy": 8454, + "yan": 8455, + "▁фи": 8456, + "▁updated": 8457, + "ango": 8458, + "Ev": 8459, + "Effect": 8460, + "osing": 8461, + "rence": 8462, + "▁Congress": 8463, + "▁defe": 8464, + "▁ip": 8465, + "▁tout": 8466, + "▁freedom": 8467, + "▁ao": 8468, + "▁Therefore": 8469, + "Edit": 8470, + "▁Virgin": 8471, + "REE": 8472, + "argo": 8473, + "▁Dam": 8474, + "▁traffic": 8475, + "ños": 8476, + "▁alle": 8477, + "▁depth": 8478, + "Now": 8479, + "▁sides": 8480, + "▁годи": 8481, + "Descriptor": 8482, + "▁artikel": 8483, + "▁narrow": 8484, + "___": 8485, + "kw": 8486, + "uto": 8487, + "▁Facebook": 8488, + "tegr": 8489, + "boolean": 8490, + "nik": 8491, + "bd": 8492, + "Track": 8493, + "▁gran": 8494, + "reshold": 8495, + "вет": 8496, + "wrap": 8497, + "▁noise": 8498, + "igu": 8499, + "▁Bon": 8500, + "▁wy": 8501, + "linux": 8502, + "cks": 8503, + "▁fans": 8504, + "▁mach": 8505, + "▁prices": 8506, + "év": 8507, + "outs": 8508, + "standing": 8509, + "▁categ": 8510, + ";\\": 8511, + "▁decre": 8512, + "▁Saturday": 8513, + "▁menu": 8514, + "▁Nov": 8515, + "▁Yet": 8516, + "▁так": 8517, + "liche": 8518, + "▁Academ": 8519, + "▁communication": 8520, + "using": 8521, + "▁Society": 8522, + "▁nuc": 8523, + "pective": 8524, + "orial": 8525, + "▁afraid": 8526, + "▁animal": 8527, + "▁turning": 8528, + "dst": 8529, + "mathfrak": 8530, + "lers": 8531, + "▁lots": 8532, + "▁á": 8533, + "▁Tra": 8534, + "np": 8535, + "▁rose": 8536, + "▁GL": 8537, + "▁helping": 8538, + "▁winter": 8539, + "▁ком": 8540, + "Mock": 8541, + "▁investment": 8542, + "Use": 8543, + "▁Canad": 8544, + "нд": 8545, + "Copy": 8546, + "▁fly": 8547, 
+ "SER": 8548, + "▁Far": 8549, + "▁Ros": 8550, + "amil": 8551, + "▁fighting": 8552, + "▁religious": 8553, + "super": 8554, + "screen": 8555, + "▁furn": 8556, + "▁surprised": 8557, + "▁replied": 8558, + "Activity": 8559, + "▁Down": 8560, + "▁insert": 8561, + "▁Olymp": 8562, + "▁pointed": 8563, + "▁Card": 8564, + "driver": 8565, + "▁Da": 8566, + "!--": 8567, + "roud": 8568, + "undo": 8569, + "▁messages": 8570, + "▁Point": 8571, + "VM": 8572, + "▁plane": 8573, + "xc": 8574, + "▁television": 8575, + "ён": 8576, + "▁thousands": 8577, + "▁cris": 8578, + "▁delay": 8579, + "▁Next": 8580, + "▁nombre": 8581, + "▁tu": 8582, + "▁skip": 8583, + "road": 8584, + "istration": 8585, + "▁tur": 8586, + "▁Develop": 8587, + "▁Па": 8588, + "▁дру": 8589, + "▁wonderful": 8590, + ">&": 8591, + "▁Liber": 8592, + "▁scope": 8593, + "▁manage": 8594, + "▁dass": 8595, + "▁recall": 8596, + "PM": 8597, + "▁relevant": 8598, + "▁Earth": 8599, + "▁как": 8600, + "▁apr": 8601, + "▁ASS": 8602, + "ién": 8603, + "▁SH": 8604, + "oom": 8605, + "itet": 8606, + "none": 8607, + "asi": 8608, + "▁motor": 8609, + "▁Show": 8610, + "nb": 8611, + "▁factors": 8612, + "▁forest": 8613, + "▁вре": 8614, + "thm": 8615, + "▁municip": 8616, + "▁turns": 8617, + "▁Division": 8618, + "EC": 8619, + "▁disappe": 8620, + "structor": 8621, + "▁somewhere": 8622, + "▁African": 8623, + "▁Institute": 8624, + "Grid": 8625, + "▁teacher": 8626, + "uries": 8627, + "▁respectively": 8628, + "▁SD": 8629, + "▁alive": 8630, + "▁pou": 8631, + "▁Water": 8632, + "фе": 8633, + "▁changing": 8634, + "▁afternoon": 8635, + "▁orders": 8636, + "Ret": 8637, + "Pointer": 8638, + "▁sav": 8639, + "erg": 8640, + "oked": 8641, + "essions": 8642, + "▁Fire": 8643, + "aret": 8644, + "imm": 8645, + "▁desire": 8646, + "▁що": 8647, + "▁Design": 8648, + "uture": 8649, + "▁Office": 8650, + "▁cmd": 8651, + "▁eating": 8652, + "Network": 8653, + "▁rough": 8654, + "operator": 8655, + "IGN": 8656, + "▁sports": 8657, + "▁weren": 8658, + "▁noted": 8659, + "▁twice": 8660, + "III": 8661, + "▁anx": 8662, + "▁elim": 8663, + "▁ав": 8664, + "▁io": 8665, + "▁speech": 8666, + "▁condu": 8667, + "elles": 8668, + "idade": 8669, + "▁advance": 8670, + "RI": 8671, + "oca": 8672, + "/\\": 8673, + "apshot": 8674, + "▁tail": 8675, + "models": 8676, + "ogy": 8677, + "▁Jeff": 8678, + "iration": 8679, + "▁Kore": 8680, + "▁leads": 8681, + "bat": 8682, + "Adapter": 8683, + "category": 8684, + "angular": 8685, + "▁saved": 8686, + "▁uniform": 8687, + "▁né": 8688, + "▁businesses": 8689, + "Hist": 8690, + "▁ар": 8691, + "domain": 8692, + "▁Si": 8693, + "raise": 8694, + "▁warn": 8695, + "hetic": 8696, + "▁Gro": 8697, + ")).": 8698, + "}>": 8699, + "зе": 8700, + "▁Amazon": 8701, + "▁Organ": 8702, + "▁Lake": 8703, + "▁agreement": 8704, + "xa": 8705, + "▁perman": 8706, + "▁containing": 8707, + "▁strange": 8708, + "сті": 8709, + "▁stupid": 8710, + "▁speaking": 8711, + "▁Internet": 8712, + "prefix": 8713, + "esc": 8714, + "Assert": 8715, + "prote": 8716, + "▁manner": 8717, + "▁Sz": 8718, + "unte": 8719, + "iot": 8720, + "Profile": 8721, + "oven": 8722, + "▁formed": 8723, + "▁lit": 8724, + "▁economy": 8725, + "▁cz": 8726, + "wid": 8727, + "REQ": 8728, + "▁chosen": 8729, + "▁Produ": 8730, + "oster": 8731, + "stances": 8732, + "awa": 8733, + "▁Ren": 8734, + "▁confirm": 8735, + "▁Бо": 8736, + "▁billion": 8737, + "▁déc": 8738, + "ých": 8739, + "▁illustr": 8740, + "TIES": 8741, + "▁Pub": 8742, + "▁ban": 8743, + "aded": 8744, + "ahn": 8745, + "▁Cath": 8746, + "nonumber": 8747, + "▁worst": 8748, + "▁Ме": 8749, + "▁suggested": 8750, + 
"stats": 8751, + "▁cant": 8752, + "▁align": 8753, + "kappa": 8754, + "▁hen": 8755, + "▁initi": 8756, + "'])": 8757, + "BI": 8758, + "▁garden": 8759, + "▁secure": 8760, + "▁\\[": 8761, + "handler": 8762, + "elli": 8763, + "ldots": 8764, + "secut": 8765, + "▁extended": 8766, + "}-": 8767, + "anie": 8768, + "▁Find": 8769, + "▁Museum": 8770, + "▁Conne": 8771, + "yy": 8772, + "▁passion": 8773, + "akers": 8774, + "ahr": 8775, + "ologies": 8776, + "▁equation": 8777, + "▁occasion": 8778, + "Let": 8779, + "']['": 8780, + "Print": 8781, + "anes": 8782, + "iente": 8783, + "▁Today": 8784, + "LECT": 8785, + "▁Af": 8786, + ",,": 8787, + "▁Та": 8788, + "▁```": 8789, + "even": 8790, + "sin": 8791, + "urer": 8792, + "▁°": 8793, + "otimes": 8794, + "▁IO": 8795, + "▁poet": 8796, + "()));": 8797, + "▁−": 8798, + "▁adopt": 8799, + "phere": 8800, + "#[": 8801, + "▁centre": 8802, + "oves": 8803, + "▁ans": 8804, + "dp": 8805, + "▁Kir": 8806, + "▁applicable": 8807, + "fp": 8808, + "▁visual": 8809, + "▁okay": 8810, + "oro": 8811, + "▁opportunities": 8812, + "Repository": 8813, + "▁ll": 8814, + "▁Rod": 8815, + "▁shel": 8816, + "▁launch": 8817, + "▁conven": 8818, + "▁Spe": 8819, + "Amer": 8820, + "▁cette": 8821, + "Cond": 8822, + "dep": 8823, + "Own": 8824, + "▁hook": 8825, + "▁dict": 8826, + "▁Those": 8827, + "▁fellow": 8828, + "▁philosoph": 8829, + "vin": 8830, + "ferences": 8831, + "hav": 8832, + "▁adding": 8833, + "iverse": 8834, + "game": 8835, + "▁Blue": 8836, + "▁clin": 8837, + "note": 8838, + "▁Ram": 8839, + "мер": 8840, + "covery": 8841, + "ña": 8842, + "▁би": 8843, + "▁fashion": 8844, + "▁broke": 8845, + "▁'\\": 8846, + "▁reader": 8847, + "ное": 8848, + "ности": 8849, + "▁payment": 8850, + "▁Lic": 8851, + "▁lips": 8852, + "▁academ": 8853, + "▁Mot": 8854, + "ells": 8855, + "CHECK": 8856, + "▁ру": 8857, + "▁MS": 8858, + "Editor": 8859, + "▁zone": 8860, + "iture": 8861, + "▁IT": 8862, + "runtime": 8863, + "▁proceed": 8864, + "лов": 8865, + "▁Maria": 8866, + "olver": 8867, + "▁Thanks": 8868, + "▁shouldn": 8869, + "▁Joh": 8870, + "▁Model": 8871, + "▁Sov": 8872, + "!'": 8873, + "Di": 8874, + "▁cancer": 8875, + "Ident": 8876, + "▁exchange": 8877, + "iller": 8878, + "inf": 8879, + "LEN": 8880, + "(){": 8881, + "aga": 8882, + "\"],": 8883, + "uh": 8884, + "▁Ken": 8885, + "▁photos": 8886, + "▁tiny": 8887, + "▁gent": 8888, + "ül": 8889, + "▁Take": 8890, + "idel": 8891, + "outing": 8892, + "Internal": 8893, + "▁cells": 8894, + "ним": 8895, + "hard": 8896, + "▁Town": 8897, + "obe": 8898, + "plex": 8899, + "тер": 8900, + "tons": 8901, + "▁concentr": 8902, + "mock": 8903, + "vc": 8904, + "áz": 8905, + "▁Championship": 8906, + "▁бе": 8907, + "??": 8908, + "éri": 8909, + "aly": 8910, + "▁Ц": 8911, + "ierte": 8912, + "▁totally": 8913, + "▁Auf": 8914, + "▁ourselves": 8915, + "▁Self": 8916, + "Forms": 8917, + "ighter": 8918, + "▁island": 8919, + "fmt": 8920, + "▁rc": 8921, + "▁tells": 8922, + "BB": 8923, + "dit": 8924, + "▁variables": 8925, + "▁intended": 8926, + "izont": 8927, + "▁plays": 8928, + "dam": 8929, + "seq": 8930, + "▁Sup": 8931, + "▁cultural": 8932, + "▁scream": 8933, + "__,": 8934, + "cipl": 8935, + "Timeout": 8936, + "▁ж": 8937, + "orte": 8938, + "▁replaced": 8939, + "EM": 8940, + "▁abandon": 8941, + "▁Special": 8942, + "ellen": 8943, + "▁Bru": 8944, + "irmed": 8945, + "Te": 8946, + "olt": 8947, + "ju": 8948, + "Argument": 8949, + "▁neut": 8950, + "scape": 8951, + "▁Ray": 8952, + "▁Polit": 8953, + "▁crowd": 8954, + "▁Windows": 8955, + "iego": 8956, + "▁escape": 8957, + "▁Apache": 8958, + "sync": 8959, + "eben": 
8960, + "ifies": 8961, + "ether": 8962, + "Meta": 8963, + "▁biggest": 8964, + "Game": 8965, + "▁transaction": 8966, + "Env": 8967, + "▁Мо": 8968, + "▁plenty": 8969, + "▁mel": 8970, + "пре": 8971, + "▁motiv": 8972, + "▁ор": 8973, + "organ": 8974, + "▁mock": 8975, + "▁$_": 8976, + "ене": 8977, + "▁Number": 8978, + "cknow": 8979, + "▁Update": 8980, + "zero": 8981, + "▁surprise": 8982, + "cean": 8983, + "pdf": 8984, + "Global": 8985, + "▁attend": 8986, + "▁fond": 8987, + "▁understood": 8988, + "Nav": 8989, + "▁Mic": 8990, + "=$": 8991, + "oking": 8992, + "▁Stadium": 8993, + "Close": 8994, + "▁competition": 8995, + "▁soldiers": 8996, + "▁OP": 8997, + "agne": 8998, + "▁Anton": 8999, + "Main": 9000, + "ák": 9001, + "▁#[": 9002, + "▁Commit": 9003, + "pyx": 9004, + "▁east": 9005, + "▁Order": 9006, + "Float": 9007, + "▁accepted": 9008, + "▁monitor": 9009, + "▁pad": 9010, + "onic": 9011, + "▁pushed": 9012, + "▁replace": 9013, + "CRE": 9014, + "▁ride": 9015, + "found": 9016, + "=%": 9017, + "вой": 9018, + "▁matches": 9019, + "▁Lie": 9020, + "▁experiences": 9021, + "Pool": 9022, + "ups": 9023, + "AV": 9024, + "▁existence": 9025, + "▁thin": 9026, + "▁magn": 9027, + "COMP": 9028, + "home": 9029, + "▁ni": 9030, + "▁wurden": 9031, + "лав": 9032, + "▁teeth": 9033, + "▁Stan": 9034, + "appro": 9035, + "anny": 9036, + "ifts": 9037, + "▁unknown": 9038, + "▁homes": 9039, + "▁entity": 9040, + "cie": 9041, + "ление": 9042, + "iar": 9043, + "▁compliance": 9044, + "▁focused": 9045, + "uzz": 9046, + "=\\\"": 9047, + "components": 9048, + "Attr": 9049, + "allery": 9050, + "▁identify": 9051, + "Ok": 9052, + "pie": 9053, + "▁Still": 9054, + "▁offering": 9055, + "▁busy": 9056, + "ctl": 9057, + "itors": 9058, + "▁concerned": 9059, + "▁brown": 9060, + "clk": 9061, + "Selected": 9062, + "▁Block": 9063, + "▁egy": 9064, + "icing": 9065, + "▁URL": 9066, + "▁topic": 9067, + "▁Product": 9068, + "▁чи": 9069, + "▁trial": 9070, + "▁weekend": 9071, + "lu": 9072, + "▁IV": 9073, + "▁Egy": 9074, + "xC": 9075, + "▁nove": 9076, + "▁lett": 9077, + "enne": 9078, + "()).": 9079, + ".**": 9080, + "▁promise": 9081, + "election": 9082, + "Auth": 9083, + "rv": 9084, + "ril": 9085, + "▁conduct": 9086, + "▁maintain": 9087, + "▁boat": 9088, + "▁opposite": 9089, + "spin": 9090, + "webpack": 9091, + "anta": 9092, + "▁orient": 9093, + "▁suc": 9094, + "▁exercise": 9095, + "▁efficient": 9096, + "▁tradition": 9097, + "▁zw": 9098, + "▁Sud": 9099, + "going": 9100, + "▁Pier": 9101, + "inv": 9102, + "ipes": 9103, + "ensuremath": 9104, + "▁conver": 9105, + "creen": 9106, + "▁terror": 9107, + "▁Dou": 9108, + "▁invalid": 9109, + "ceived": 9110, + "▁Arab": 9111, + "▁wire": 9112, + "application": 9113, + "shift": 9114, + "Generic": 9115, + "▁Plan": 9116, + "▁Wall": 9117, + "▁directory": 9118, + "▁egg": 9119, + "▁wealth": 9120, + "random": 9121, + "attribute": 9122, + "▁hide": 9123, + "Serial": 9124, + "cam": 9125, + "▁ital": 9126, + "▁Line": 9127, + "▁CHECK": 9128, + "ployment": 9129, + "▁massive": 9130, + "▁extract": 9131, + "chain": 9132, + "Rest": 9133, + "▁Las": 9134, + "▁bear": 9135, + "▁links": 9136, + "▁newsp": 9137, + "▁FC": 9138, + "Card": 9139, + "aks": 9140, + "▁visible": 9141, + "▁Marc": 9142, + "▁Boston": 9143, + "▁reserved": 9144, + "▁roof": 9145, + "licenses": 9146, + "dc": 9147, + "▁Information": 9148, + "▁witness": 9149, + "Sk": 9150, + "*),": 9151, + "Scope": 9152, + "'];": 9153, + "▁Mir": 9154, + "uding": 9155, + "▁trend": 9156, + "rep": 9157, + "▁musical": 9158, + "▁neither": 9159, + "▁Creat": 9160, + "▁positions": 9161, + "LC": 9162, + 
"ridge": 9163, + "▁officers": 9164, + "▁violence": 9165, + "▁Tem": 9166, + "▁Sus": 9167, + "▁Way": 9168, + "After": 9169, + "acket": 9170, + "▁Sou": 9171, + "acer": 9172, + "||": 9173, + "▁remark": 9174, + "water": 9175, + "ně": 9176, + "▁Са": 9177, + "▁sed": 9178, + "Each": 9179, + "▁photograph": 9180, + "▁letters": 9181, + "▁invent": 9182, + "▁Mas": 9183, + "▁songs": 9184, + "ól": 9185, + "kind": 9186, + "▁Non": 9187, + "▁dust": 9188, + "**:": 9189, + "nabla": 9190, + ".\",": 9191, + "Lock": 9192, + "▁До": 9193, + "▁cluster": 9194, + "loss": 9195, + "▁ASSERT": 9196, + "fall": 9197, + "▁reject": 9198, + "▁Spring": 9199, + "▁wedding": 9200, + "▁grav": 9201, + "ression": 9202, + "limit": 9203, + "RES": 9204, + "]}": 9205, + "▁listed": 9206, + "▁Tele": 9207, + "hline": 9208, + "▁chief": 9209, + "MEM": 9210, + "дар": 9211, + "▁expensive": 9212, + "trace": 9213, + "▁Rog": 9214, + "▁Coll": 9215, + "▁Author": 9216, + "▁Board": 9217, + "▁Capt": 9218, + "TEXT": 9219, + "▁recon": 9220, + "esta": 9221, + "▁properly": 9222, + "▁&\\": 9223, + "leton": 9224, + "iker": 9225, + "Gu": 9226, + "▁Kom": 9227, + "oco": 9228, + "▁anymore": 9229, + "▁taste": 9230, + "▁Santa": 9231, + "gex": 9232, + "▁Secret": 9233, + "▁talent": 9234, + "▁moments": 9235, + "▁Ba": 9236, + "▁extr": 9237, + "▁Commission": 9238, + "▁modify": 9239, + "▁Figure": 9240, + "▁domin": 9241, + "▁plot": 9242, + "enger": 9243, + "utch": 9244, + "▁cities": 9245, + "▁nut": 9246, + "profile": 9247, + "▁Stat": 9248, + "▁nodes": 9249, + "▁ns": 9250, + "essages": 9251, + "impl": 9252, + "icker": 9253, + "▁examples": 9254, + "abeth": 9255, + "▁stated": 9256, + "fire": 9257, + "bul": 9258, + "▁dangerous": 9259, + "▁Pay": 9260, + "▁Gre": 9261, + "▁Monday": 9262, + "esome": 9263, + "igan": 9264, + "rund": 9265, + "prise": 9266, + "fail": 9267, + "▁Never": 9268, + "Av": 9269, + "▁linear": 9270, + "▁ul": 9271, + "WAR": 9272, + "рен": 9273, + "▁AT": 9274, + "▁dop": 9275, + "▁nou": 9276, + "Dest": 9277, + "▁claims": 9278, + "enda": 9279, + "▁crazy": 9280, + "gel": 9281, + "oggle": 9282, + "▁representation": 9283, + "inen": 9284, + "▁alternative": 9285, + "DM": 9286, + "ABILITY": 9287, + "faces": 9288, + "▁doors": 9289, + "ativ": 9290, + "Look": 9291, + "▁JSON": 9292, + "▁appearance": 9293, + "бря": 9294, + "SQL": 9295, + "▁silence": 9296, + "udo": 9297, + "▁Director": 9298, + "Statement": 9299, + "selected": 9300, + "high": 9301, + "prime": 9302, + "▁ignore": 9303, + "▁colors": 9304, + "ushing": 9305, + "▁virt": 9306, + "manager": 9307, + "▁remote": 9308, + "ło": 9309, + "small": 9310, + "▁crime": 9311, + "rb": 9312, + "▁creation": 9313, + "▁flight": 9314, + "▁Sign": 9315, + "ILE": 9316, + "▁DO": 9317, + "comment": 9318, + "▁Cost": 9319, + ".__": 9320, + "▁Cop": 9321, + "▁vom": 9322, + "▁Science": 9323, + "ления": 9324, + "oop": 9325, + "interface": 9326, + "▁WARRANTIES": 9327, + "▁Page": 9328, + "******": 9329, + "ском": 9330, + "TRUE": 9331, + "▁repeated": 9332, + "▁его": 9333, + "шо": 9334, + "▁roz": 9335, + "Pe": 9336, + "▁ISBN": 9337, + "irts": 9338, + "poses": 9339, + "})$": 9340, + "▁І": 9341, + "children": 9342, + "bles": 9343, + "ECT": 9344, + "▁iz": 9345, + "▁builder": 9346, + "▁Media": 9347, + "iat": 9348, + "▁contrast": 9349, + "”,": 9350, + "▁Link": 9351, + "▁Education": 9352, + "▁joint": 9353, + "▁external": 9354, + "▁роз": 9355, + "▁bits": 9356, + "FORM": 9357, + "erman": 9358, + "wp": 9359, + "▁Mike": 9360, + "▁Master": 9361, + "▁senior": 9362, + "▁Nav": 9363, + "▁recorded": 9364, + "eling": 9365, + "esh": 9366, + "fx": 9367, + "кан": 9368, 
+ "▁tall": 9369, + "▁Johnson": 9370, + "▁sono": 9371, + "▁anche": 9372, + "icken": 9373, + "loop": 9374, + "iciency": 9375, + "emporary": 9376, + "▁Does": 9377, + "▁relation": 9378, + "мы": 9379, + "was": 9380, + "low": 9381, + "ichte": 9382, + "▁Jones": 9383, + "▁bedroom": 9384, + "DIS": 9385, + "▁magnet": 9386, + "▁Engine": 9387, + "▁feelings": 9388, + "GC": 9389, + "▁torn": 9390, + "▁relationships": 9391, + "▁Ре": 9392, + "▁proud": 9393, + "▁twe": 9394, + "oval": 9395, + "▁waste": 9396, + "▁reduced": 9397, + "ilton": 9398, + "BP": 9399, + "▁forgot": 9400, + "▁bodies": 9401, + "▁Haw": 9402, + "lag": 9403, + "▁www": 9404, + "door": 9405, + "▁sufficient": 9406, + "▁dollars": 9407, + "Len": 9408, + "▁talked": 9409, + "▁bond": 9410, + "▁Bor": 9411, + "}}{": 9412, + "rod": 9413, + "Password": 9414, + "quare": 9415, + "▁lights": 9416, + "eren": 9417, + "▁thirty": 9418, + "NC": 9419, + "▁TODO": 9420, + "▁respond": 9421, + "ких": 9422, + "direct": 9423, + "ação": 9424, + "▁heav": 9425, + "Media": 9426, + "exit": 9427, + "License": 9428, + "`.": 9429, + "▁mixed": 9430, + "▁desk": 9431, + "▁teaching": 9432, + "▁maj": 9433, + "▁nerv": 9434, + "inations": 9435, + "typeof": 9436, + "▁coast": 9437, + "▁же": 9438, + "▁beside": 9439, + "ummy": 9440, + "Doc": 9441, + "▁schedule": 9442, + "▁recover": 9443, + "▁Further": 9444, + "▁steel": 9445, + "boot": 9446, + "▁Perhaps": 9447, + "▁съ": 9448, + "▁Os": 9449, + "rick": 9450, + "▁Ви": 9451, + "Support": 9452, + "▁(_": 9453, + "nil": 9454, + "pis": 9455, + "xpected": 9456, + "▁processing": 9457, + "Build": 9458, + "arian": 9459, + "▁icon": 9460, + "▁CA": 9461, + "wick": 9462, + "=(": 9463, + "▁algorithm": 9464, + "▁Young": 9465, + "▁Management": 9466, + "▁ancient": 9467, + "ность": 9468, + "oti": 9469, + "▁combination": 9470, + "world": 9471, + "nn": 9472, + "▁dram": 9473, + "enabled": 9474, + "Ac": 9475, + "CCESS": 9476, + "aration": 9477, + "▁blocks": 9478, + "▁Angeles": 9479, + "▁Qual": 9480, + "▁succeed": 9481, + "network": 9482, + "▁oblig": 9483, + "springframework": 9484, + "▁Tre": 9485, + "okes": 9486, + "mun": 9487, + "▁Network": 9488, + "Del": 9489, + "▁estate": 9490, + "▁liqu": 9491, + "▁pob": 9492, + "▁dad": 9493, + "▁distinct": 9494, + "▁Tit": 9495, + "▁Lear": 9496, + "ferred": 9497, + "android": 9498, + "▁subsequ": 9499, + "▁Florida": 9500, + "subset": 9501, + "▁whisper": 9502, + "Vol": 9503, + "ulous": 9504, + "▁crew": 9505, + "▁lug": 9506, + "pid": 9507, + "ocity": 9508, + "skb": 9509, + "▁tea": 9510, + "ун": 9511, + "▁honor": 9512, + "▁Ins": 9513, + "▁gew": 9514, + "Details": 9515, + "eneath": 9516, + "atar": 9517, + "▁_{": 9518, + "amen": 9519, + "▁setup": 9520, + "Transaction": 9521, + "▁blank": 9522, + "Failed": 9523, + "job": 9524, + "▁pret": 9525, + "ße": 9526, + "loor": 9527, + "ří": 9528, + "ncia": 9529, + "▁anywhere": 9530, + "▁Light": 9531, + "▁Ak": 9532, + "BD": 9533, + "▁excited": 9534, + "agers": 9535, + "▁warning": 9536, + "▁processes": 9537, + "hu": 9538, + "▁youth": 9539, + "▁dogs": 9540, + "▁oct": 9541, + "▁nine": 9542, + "Writer": 9543, + "grid": 9544, + "▁importance": 9545, + "estic": 9546, + "▁carefully": 9547, + "master": 9548, + "▁decisions": 9549, + "▁pin": 9550, + "▁crack": 9551, + "TEST": 9552, + "▁Local": 9553, + "▁Right": 9554, + "▁vast": 9555, + "▁faster": 9556, + "▁institut": 9557, + "▁annual": 9558, + "LAN": 9559, + "▁episode": 9560, + "▁XV": 9561, + "▁delivery": 9562, + "tl": 9563, + "FP": 9564, + "circ": 9565, + "▁typically": 9566, + "igo": 9567, + "▁intel": 9568, + "nat": 9569, + "xb": 9570, + "стро": 9571, + 
")-": 9572, + "▁Bal": 9573, + "▁Jos": 9574, + "▁gonna": 9575, + "▁Rest": 9576, + "jor": 9577, + "onia": 9578, + "orship": 9579, + "overy": 9580, + "LINE": 9581, + "]:": 9582, + "Queue": 9583, + "▁compare": 9584, + "▁apartment": 9585, + "▁rul": 9586, + "Dr": 9587, + "gency": 9588, + "▁obviously": 9589, + "zie": 9590, + "ycl": 9591, + "fortunately": 9592, + "▁stepped": 9593, + "▁Seg": 9594, + "▁Which": 9595, + "▁PC": 9596, + "▁ast": 9597, + "endor": 9598, + "▁permission": 9599, + "COL": 9600, + "▁TEST": 9601, + "Pay": 9602, + "ères": 9603, + "▁studied": 9604, + "▁accompl": 9605, + "role": 9606, + "Where": 9607, + "protobuf": 9608, + "metadata": 9609, + "Job": 9610, + "▁Four": 9611, + "plements": 9612, + "disable": 9613, + "▁loud": 9614, + "▁happening": 9615, + "▁Using": 9616, + "rog": 9617, + "▁depends": 9618, + "ím": 9619, + "'\\": 9620, + "▁taught": 9621, + "shared": 9622, + "▁attributes": 9623, + "▁Action": 9624, + "▁dess": 9625, + "▁houses": 9626, + "▁reset": 9627, + "▁bien": 9628, + "▁explicit": 9629, + "LOW": 9630, + "->_": 9631, + "▁PM": 9632, + "Category": 9633, + "oice": 9634, + "into": 9635, + "▁mail": 9636, + "▁authority": 9637, + "▁unable": 9638, + "filename": 9639, + "ék": 9640, + "лей": 9641, + "▁sector": 9642, + "appoint": 9643, + "▁hang": 9644, + "▁cel": 9645, + "related": 9646, + "itate": 9647, + "▁'<": 9648, + "amber": 9649, + "▁cheap": 9650, + "▁enabled": 9651, + "▁division": 9652, + "Any": 9653, + "▁hier": 9654, + "▁Head": 9655, + "ntax": 9656, + "uda": 9657, + "▁limitations": 9658, + "▁studio": 9659, + "media": 9660, + "▁circle": 9661, + "нова": 9662, + "▁laug": 9663, + "acts": 9664, + "▁Во": 9665, + "ód": 9666, + "pled": 9667, + "LOC": 9668, + "Expr": 9669, + ">:": 9670, + "▁prés": 9671, + "▁laughed": 9672, + "▁Three": 9673, + "лы": 9674, + "▁ends": 9675, + "▁fundament": 9676, + "▁inher": 9677, + "▁liv": 9678, + "bid": 9679, + "▁responsibility": 9680, + "▁checked": 9681, + "▁Pac": 9682, + "▁fault": 9683, + "▁yellow": 9684, + "▁salt": 9685, + "▁Francisco": 9686, + "▁^": 9687, + "▁ON": 9688, + "▁beauty": 9689, + "yg": 9690, + "▁Aff": 9691, + "▁Eq": 9692, + "▁magic": 9693, + "▁handler": 9694, + "xE": 9695, + "▁numerous": 9696, + "▁hole": 9697, + "▁rooms": 9698, + "cción": 9699, + "▁Arm": 9700, + "person": 9701, + "▁buildings": 9702, + "▁plate": 9703, + "bled": 9704, + "errors": 9705, + "▁Again": 9706, + "▁Default": 9707, + "▁Hard": 9708, + "tó": 9709, + "hus": 9710, + "▁dimension": 9711, + "iale": 9712, + "▁Mult": 9713, + "▁Government": 9714, + "Func": 9715, + "▁blow": 9716, + "▁rect": 9717, + "erra": 9718, + "connection": 9719, + "▁passing": 9720, + "ßen": 9721, + "phas": 9722, + "ensional": 9723, + "record": 9724, + "cohol": 9725, + "▁Harry": 9726, + "izontal": 9727, + "▁finger": 9728, + "▁younger": 9729, + "▁SC": 9730, + "operation": 9731, + "BY": 9732, + "heim": 9733, + "▁Bad": 9734, + "▁storm": 9735, + "▁Nat": 9736, + "▁buying": 9737, + "▁Sometimes": 9738, + "▁Ста": 9739, + "essed": 9740, + "▁damn": 9741, + "▁meg": 9742, + "umes": 9743, + "ünd": 9744, + "тра": 9745, + "▁silver": 9746, + "wd": 9747, + "hidden": 9748, + "ardo": 9749, + "▁communities": 9750, + "▁diet": 9751, + "otted": 9752, + "▁bat": 9753, + "ancer": 9754, + "▁fmt": 9755, + "▁Pen": 9756, + "▁til": 9757, + "Enum": 9758, + "PATH": 9759, + "▁matters": 9760, + "timeout": 9761, + "------------": 9762, + "kan": 9763, + "▁Corpor": 9764, + "=\"../../": 9765, + "▁Ale": 9766, + "hentication": 9767, + "▁complic": 9768, + "▁Security": 9769, + "OFF": 9770, + "Rad": 9771, + "apse": 9772, + "▁dance": 9773, + 
"▁permissions": 9774, + "▁warrant": 9775, + "▁lad": 9776, + "▁isol": 9777, + "dl": 9778, + "▁Au": 9779, + "yes": 9780, + "▁tv": 9781, + "▁provider": 9782, + "▁terrible": 9783, + "▁department": 9784, + "eral": 9785, + "▁implementation": 9786, + "SR": 9787, + "▁hearing": 9788, + "▁Kn": 9789, + "FR": 9790, + "tv": 9791, + "▁diss": 9792, + "FUN": 9793, + "▁durante": 9794, + "osis": 9795, + "▁tasks": 9796, + "▁Blo": 9797, + "вод": 9798, + "▁branch": 9799, + "▁politics": 9800, + "▁Elle": 9801, + "▁leadership": 9802, + "expr": 9803, + "▁techniques": 9804, + "prec": 9805, + "Sigma": 9806, + "imately": 9807, + "tk": 9808, + "achment": 9809, + "▁Enter": 9810, + "▁creative": 9811, + "▁зна": 9812, + "appy": 9813, + "unched": 9814, + "▁'',": 9815, + "onder": 9816, + "{-": 9817, + "NUM": 9818, + "▁narr": 9819, + "Memory": 9820, + "▁winning": 9821, + "▁Follow": 9822, + "*/\r": 9823, + "vision": 9824, + "resents": 9825, + "zione": 9826, + "▁latter": 9827, + "▁requests": 9828, + "▁margin": 9829, + "▁{\"": 9830, + "video": 9831, + "cn": 9832, + "▁Image": 9833, + "Tim": 9834, + "CONFIG": 9835, + "▁allowing": 9836, + "▁combined": 9837, + "PUT": 9838, + "▁instanceof": 9839, + "igin": 9840, + "▁pero": 9841, + "▁''": 9842, + "▁confidence": 9843, + "▁equivalent": 9844, + "pad": 9845, + "effect": 9846, + "RX": 9847, + "▁lang": 9848, + "strong": 9849, + "▁bridge": 9850, + "aya": 9851, + "▁treated": 9852, + "▁forth": 9853, + "SW": 9854, + "▁accounts": 9855, + "▁PO": 9856, + "▁listening": 9857, + "Route": 9858, + "()))": 9859, + "cpy": 9860, + "▁reform": 9861, + "▁gate": 9862, + "▁Walk": 9863, + "▁somehow": 9864, + "tf": 9865, + "▁layout": 9866, + "umin": 9867, + "▁considering": 9868, + "▁premi": 9869, + "▁Mom": 9870, + "athan": 9871, + "Gen": 9872, + "▁planet": 9873, + "amples": 9874, + "▁MO": 9875, + "shop": 9876, + "▁premier": 9877, + "▁simpl": 9878, + "▁segu": 9879, + "LY": 9880, + "Sum": 9881, + "▁tables": 9882, + "ska": 9883, + "▁ž": 9884, + "pd": 9885, + "▁sous": 9886, + "▁conference": 9887, + "▁Dat": 9888, + "Scroll": 9889, + "▁standards": 9890, + "▁гру": 9891, + "esse": 9892, + "▁citizens": 9893, + "▁occurred": 9894, + "▁democr": 9895, + "▁elev": 9896, + "▁Sem": 9897, + "ensus": 9898, + "headers": 9899, + "▁Chris": 9900, + "imento": 9901, + "kom": 9902, + "Cor": 9903, + "MIN": 9904, + "usher": 9905, + "Database": 9906, + "▁formal": 9907, + "igne": 9908, + "▁organizations": 9909, + "▁Ire": 9910, + "Xml": 9911, + "из": 9912, + "▁pray": 9913, + "▁bomb": 9914, + "▁mand": 9915, + "erts": 9916, + "▁clock": 9917, + "▁buck": 9918, + "вали": 9919, + "ensch": 9920, + "▁volt": 9921, + "▁films": 9922, + "▁plants": 9923, + "inode": 9924, + "Boolean": 9925, + "▁restaurant": 9926, + "ían": 9927, + "▁debut": 9928, + "pages": 9929, + "▁wordt": 9930, + "▁Ба": 9931, + "▁greatest": 9932, + "(\"/": 9933, + "▁copyright": 9934, + "▁rit": 9935, + "sizeof": 9936, + "Trace": 9937, + "uent": 9938, + "тур": 9939, + "▁ko": 9940, + ":\\": 9941, + "▁bigger": 9942, + "▁perfectly": 9943, + "tenance": 9944, + "MASK": 9945, + "ré": 9946, + "▁ett": 9947, + "▁nose": 9948, + "▁craft": 9949, + "iteral": 9950, + "▁discussed": 9951, + "▁Jewish": 9952, + "Cap": 9953, + "▁Unless": 9954, + "▁Jackson": 9955, + "Attributes": 9956, + "▁lunch": 9957, + "öl": 9958, + "atr": 9959, + "▁paying": 9960, + "Parse": 9961, + "()\r": 9962, + "lad": 9963, + "▁rare": 9964, + "▁[];": 9965, + "stone": 9966, + "▁unc": 9967, + "▁defense": 9968, + "}+": 9969, + "▁Global": 9970, + "▁Soviet": 9971, + "▁Australian": 9972, + "▁gli": 9973, + "variant": 9974, + "▁Ron": 9975, + 
"▁loan": 9976, + "Step": 9977, + "member": 9978, + "Sch": 9979, + "▁Committee": 9980, + "▁spending": 9981, + "▁Tri": 9982, + "▁Journal": 9983, + "▁sugar": 9984, + "elly": 9985, + "HTML": 9986, + "▁advent": 9987, + "wing": 9988, + "▁Whether": 9989, + "oration": 9990, + "▁NE": 9991, + "iveness": 9992, + "▁hav": 9993, + "▁conscious": 9994, + "een": 9995, + "Symbol": 9996, + "▁ку": 9997, + "Logger": 9998, + "▁Little": 9999, + "widet": 10000, + "ocation": 10001, + "pin": 10002, + "▁symmet": 10003, + "▁AD": 10004, + "▁posts": 10005, + "shal": 10006, + "▁Conf": 10007, + "▁chose": 10008, + "mal": 10009, + "ulo": 10010, + "▁Method": 10011, + "▁missed": 10012, + "Remove": 10013, + "Auto": 10014, + "VALUE": 10015, + "thlet": 10016, + "▁Force": 10017, + "pf": 10018, + "▁Я": 10019, + "late": 10020, + "▁pul": 10021, + "Pop": 10022, + "▁advanced": 10023, + "aires": 10024, + "ressed": 10025, + "AME": 10026, + "bell": 10027, + "aching": 10028, + "ić": 10029, + "echo": 10030, + "HS": 10031, + "▁funny": 10032, + "рии": 10033, + "▁eer": 10034, + "▁veget": 10035, + "▁fourth": 10036, + "cf": 10037, + "transform": 10038, + "▁grown": 10039, + "▁McC": 10040, + "site": 10041, + "▁beneath": 10042, + "▁shell": 10043, + "xd": 10044, + "Play": 10045, + "short": 10046, + "Role": 10047, + "▁religion": 10048, + "inator": 10049, + "}<": 10123, + "asp": 10124, + "ajo": 10125, + "exports": 10126, + "▁Node": 10127, + "▁jako": 10128, + "▁ya": 10129, + "▁successfully": 10130, + "▁friendly": 10131, + "buff": 10132, + "DEFAULT": 10133, + "▁pregn": 10134, + "Required": 10135, + "▁binary": 10136, + "isting": 10137, + "▁stared": 10138, + "▁circumstances": 10139, + "▁хо": 10140, + "rei": 10141, + "▁Го": 10142, + "Transform": 10143, + "cnt": 10144, + "▁Ext": 10145, + "report": 10146, + "VERSION": 10147, + "▁analy": 10148, + "▁Marg": 10149, + "▁alleg": 10150, + "builder": 10151, + "ToString": 10152, + "Layer": 10153, + "íst": 10154, + "Prop": 10155, + "▁Emp": 10156, + "}]": 10157, + "▁selling": 10158, + "▁queue": 10159, + "▁seriously": 10160, + "▁Lead": 10161, + "textit": 10162, + "testing": 10163, + "▁Пре": 10164, + "security": 10165, + "iał": 10166, + "ún": 10167, + "chip": 10168, + "▁candidate": 10169, + "▁minister": 10170, + "eria": 10171, + "▁Het": 10172, + "дин": 10173, + "▁Britain": 10174, + "▁barely": 10175, + "▁sty": 10176, + "▁Spanish": 10177, + "▁Ven": 10178, + "timer": 10179, + "ків": 10180, + "▁documents": 10181, + "('.": 10182, + "▁debug": 10183, + "▁contro": 10184, + "стоя": 10185, + "▁joy": 10186, + "Sn": 10187, + "Inv": 10188, + "▁protocol": 10189, + "▁faces": 10190, + "▁Despite": 10191, + "sed": 10192, + "Conf": 10193, + "ARG": 10194, + "▁evolution": 10195, + "▁tod": 10196, + "▁Promise": 10197, + "▁posted": 10198, + "Perm": 10199, + "bet": 10200, + "Ang": 10201, + "Just": 10202, + "▁rum": 10203, + "layer": 10204, + "▁behavi": 10205, + "ipping": 10206, + "▁dynam": 10207, + "▁scheme": 10208, + "▁proto": 10209, + ")/": 10210, + "Collections": 10211, + "riev": 10212, + "▁Click": 10213, + "▁uns": 10214, + "widetilde": 10215, + "▁remembered": 10216, + "гі": 10217, + "inates": 10218, + "▁incorpor": 10219, + "▁Description": 10220, + "▁prepare": 10221, + "▁Final": 10222, + "uation": 10223, + "▁Queen": 10224, + ">;": 10225, + "▁automatically": 10226, + "▁sharp": 10227, + "▁meat": 10228, + "ateur": 10229, + "astern": 10230, + "▁stuck": 10231, + "ASSERT": 10232, + "▁planned": 10233, + "dots": 10234, + "ookie": 10235, + "▁Histor": 10236, + "▁reviews": 10237, + "IMP": 10238, + "▁answered": 10239, + "Total": 10240, + "▁sau": 10241, 
"▁станов": 16141, + "plugins": 16142, + "▁lets": 16143, + "хід": 16144, + "▁Laura": 16145, + "нер": 16146, + "▁brut": 16147, + "▁FI": 16148, + "isons": 16149, + "▁dyn": 16150, + "icher": 16151, + "rayed": 16152, + "▁frequent": 16153, + "▁jedoch": 16154, + "▁Marine": 16155, + "strings": 16156, + "▁Util": 16157, + "▁bos": 16158, + "Mus": 16159, + "▁Portugal": 16160, + "Strategy": 16161, + "▁посе": 16162, + "▁slice": 16163, + "▁insight": 16164, + "▁widget": 16165, + "▁général": 16166, + "messages": 16167, + "▁Hu": 16168, + "▁requirement": 16169, + "Side": 16170, + "emplates": 16171, + "▁ceremony": 16172, + "▁physics": 16173, + "▁graduate": 16174, + "para": 16175, + "▁preserv": 16176, + "▁shops": 16177, + "zek": 16178, + "▁ub": 16179, + "prepare": 16180, + "▁Oil": 16181, + "▁fib": 16182, + "▁runtime": 16183, + "▁hogy": 16184, + "Warning": 16185, + "▁Convert": 16186, + "bourne": 16187, + "▁emerged": 16188, + "▁Ди": 16189, + "ighth": 16190, + "guard": 16191, + "kal": 16192, + "validation": 16193, + "ência": 16194, + "▁drinks": 16195, + "theorem": 16196, + "HR": 16197, + "iev": 16198, + "ployee": 16199, + "Usage": 16200, + "▁спе": 16201, + "dispatch": 16202, + "▁instantly": 16203, + "obi": 16204, + "▁justify": 16205, + "▁Nev": 16206, + "▁явля": 16207, + "agra": 16208, + "▁transmission": 16209, + "fly": 16210, + ";';": 17021, + "▁cousin": 17022, + "createElement": 17023, + "Could": 17024, + "▁capac": 17025, + "▁pause": 17026, + "ArrayList": 17027, + "kte": 17028, + "ordered": 17029, + "▁shaking": 17030, + "labels": 17031, + "▁reducing": 17032, + "вых": 17033, + "USED": 17034, + "▁voting": 17035, + "▁Ministry": 17036, + "▁Mig": 17037, + "▁Chen": 17038, + "▁accompany": 17039, + "ulle": 17040, + "▁ga": 17041, + "▁equipped": 17042, + "▁nun": 17043, + "Bet": 17044, + "▁licensed": 17045, + "ARCH": 17046, + "FN": 17047, + "▁engines": 17048, + "▁ster": 17049, + "▁locale": 17050, + "▁въ": 17051, + "links": 17052, + "▁Capital": 17053, + "▁alien": 17054, + "Wr": 17055, + "ръ": 17056, + "Cart": 17057, + "▁Marketing": 17058, + "▁RT": 17059, + "FileName": 17060, + "▁ti": 17061, + "iji": 17062, + "▁versus": 17063, + "live": 17064, + "Sym": 17065, + "kor": 17066, + "▁emission": 17067, + "umm": 17068, + "ycz": 17069, + "▁climbed": 17070, + "▁plusieurs": 17071, + "кри": 17072, + "yar": 17073, + "osten": 17074, + "▁usb": 17075, + "▁crossing": 17076, + "▁polynom": 17077, + "▁removal": 17078, + "▁Adams": 17079, + "▁ihre": 17080, + "anden": 17081, + "▁Benj": 17082, + "▁Phill": 17083, + "▁wounded": 17084, + "▁Castle": 17085, + "bild": 17086, + "Annotation": 17087, + "Processor": 17088, + "▁tin": 17089, + "folg": 17090, + "▁Students": 17091, + "▁Mexican": 17092, + "▁administrative": 17093, + "ILED": 17094, + "▁conqu": 17095, + "▁cheer": 17096, + "▁Ces": 17097, + "Because": 17098, + "▁Juni": 17099, + "▁encontr": 17100, + "avi": 17101, + "VI": 17102, + "aku": 17103, + "▁Ton": 17104, + "▁smoking": 17105, + "▁bay": 17106, + "works": 17107, + "ат": 17108, + "attered": 17109, + "▁Boolean": 17110, + "▁Balt": 17111, + "defer": 17112, + "pathy": 17113, + "Ah": 17114, + "▁akt": 17115, + "▁governor": 17116, + "Pad": 17117, + "▁sisters": 17118, + "Lat": 17119, + "▁revel": 17120, + "▁SY": 17121, + "itos": 17122, + "▁filters": 17123, + "Chunk": 17124, + "consum": 17125, + "▁removing": 17126, + "▁Herr": 17127, + "▁generator": 17128, + "▁Cra": 17129, + "▁farmers": 17130, + "▁Members": 17131, + "▁overcome": 17132, + "▁Cin": 17133, + "igkeit": 17134, + "criptions": 17135, + "Tests": 17136, + "▁клу": 17137, + "▁shake": 17138, + "▁yy": 
17139, + "placement": 17140, + "▁awards": 17141, + "▁episodes": 17142, + "▁Blood": 17143, + "▁bullet": 17144, + "▁viene": 17145, + "▁Financial": 17146, + "Future": 17147, + "▁rou": 17148, + "▁biologie": 17149, + "▁useState": 17150, + "iani": 17151, + "piece": 17152, + "▁speaker": 17153, + "▁refr": 17154, + "ARK": 17155, + "▁MIT": 17156, + "▁Tan": 17157, + "▁Based": 17158, + "▁cultiv": 17159, + "▁hungry": 17160, + "▁Ay": 17161, + "▁Hey": 17162, + "▁excitement": 17163, + "ibraries": 17164, + "Hit": 17165, + "▁Ende": 17166, + "NG": 17167, + "FIL": 17168, + ".\")": 17169, + "Family": 17170, + "inery": 17171, + "necess": 17172, + "velope": 17173, + "▁Bot": 17174, + "porter": 17175, + "▁climb": 17176, + "▁Eli": 17177, + "urent": 17178, + "▁mistakes": 17179, + "ában": 17180, + "marks": 17181, + "pkt": 17182, + "Library": 17183, + "sted": 17184, + "ublice": 17185, + "▁Administration": 17186, + "▁shapes": 17187, + "публи": 17188, + "God": 17189, + "innen": 17190, + "коло": 17191, + "<<<<": 17192, + "ibe": 17193, + "ês": 17194, + "▁США": 17195, + "▁Foreign": 17196, + "▁Margaret": 17197, + "▁gene": 17198, + "▁disturb": 17199, + "▁тер": 17200, + "▁onClick": 17201, + "▁Engineering": 17202, + "▁stopping": 17203, + "▁restrictions": 17204, + ",*": 17205, + "BUF": 17206, + "▁shadows": 17207, + "hci": 17208, + "▁Christians": 17209, + "▁fence": 17210, + "▁luxury": 17211, + "akh": 17212, + "coord": 17213, + "▁investigate": 17214, + "▁conventional": 17215, + "\"—": 17216, + "▁visits": 17217, + "isé": 17218, + "▁Sac": 17219, + "className": 17220, + "▁Psych": 17221, + "▁reflected": 17222, + "▁пло": 17223, + "▁Vice": 17224, + "ław": 17225, + "________________": 17226, + "▁Wolf": 17227, + "rente": 17228, + "▁Champion": 17229, + "▁simulation": 17230, + "esota": 17231, + "▁Soon": 17232, + "▁Cel": 17233, + "▁theories": 17234, + "▁STR": 17235, + "▁collective": 17236, + "▁coordinate": 17237, + "querySelector": 17238, + "emed": 17239, + "Break": 17240, + "▁gef": 17241, + "▁electricity": 17242, + "▁gathering": 17243, + "aters": 17244, + "exper": 17245, + "▁Roma": 17246, + "▁Cooper": 17247, + "SYMBOL": 17248, + "vd": 17249, + "iversary": 17250, + "aines": 17251, + "▁Grad": 17252, + "▁independence": 17253, + "woh": 17254, + "▁consequence": 17255, + "▁conversations": 17256, + "▁Rou": 17257, + "▁andere": 17258, + "▁Systems": 17259, + "гар": 17260, + "▁moist": 17261, + "flu": 17262, + "ція": 17263, + "ниш": 17264, + "▁rode": 17265, + "▁perd": 17266, + "▁szer": 17267, + "▁flood": 17268, + "▁intim": 17269, + "stderr": 17270, + "▁reflection": 17271, + "Scan": 17272, + "▁disaster": 17273, + "akespe": 17274, + "▁Invalid": 17275, + "▁humor": 17276, + "▁Friedrich": 17277, + "▁suggestions": 17278, + "uvud": 17279, + "Delay": 17280, + "brief": 17281, + "▁ис": 17282, + "glied": 17283, + "fas": 17284, + "▁Smart": 17285, + "▁medi": 17286, + "sdk": 17287, + "▁seus": 17288, + "▁Arizona": 17289, + "▁innocent": 17290, + "Warn": 17291, + "acious": 17292, + "▁Moscow": 17293, + "▁caps": 17294, + "Delegate": 17295, + "▁dramatic": 17296, + "books": 17297, + "▁shore": 17298, + "uki": 17299, + "▁Russell": 17300, + "▁correlation": 17301, + "Help": 17302, + "▁pubblic": 17303, + "zym": 17304, + "comb": 17305, + "EY": 17306, + "LENGTH": 17307, + "▁Mün": 17308, + "▁_.": 17309, + "▁ferm": 17310, + "▁Ian": 17311, + "▁Studio": 17312, + "▁affairs": 17313, + "los": 17314, + "Rules": 17315, + "running": 17316, + "▁Posted": 17317, + "Pixel": 17318, + "▁dancing": 17319, + "▁agreements": 17320, + "▁Pic": 17321, + "ancia": 17322, + "▁má": 17323, + "ationToken": 
17324, + "descriptor": 17325, + "▁Carter": 17326, + "Release": 17327, + "************": 17328, + "▁outstanding": 17329, + "changes": 17330, + "ARRAY": 17331, + "▁Barbara": 17332, + "▁nurse": 17333, + "(\r": 17334, + "▁Douglas": 17335, + "▁nucle": 17336, + "ouri": 17337, + "▁Style": 17338, + "avo": 17339, + "▁painful": 17340, + "▁slic": 17341, + "▁seinem": 17342, + "SUPPORT": 17343, + "ogene": 17344, + "▁satell": 17345, + "tagon": 17346, + "▁collapse": 17347, + "velle": 17348, + "MON": 17349, + "aughters": 17350, + "▁threatened": 17351, + "▁Illegal": 17352, + "▁desperate": 17353, + "strict": 17354, + "rus": 17355, + "ститу": 17356, + "\\\":": 17357, + "▁conflic": 17358, + "download": 17359, + "atos": 17360, + "▁Position": 17361, + ".*;": 17362, + "▁theater": 17363, + "▁pleasant": 17364, + "▁Cette": 17365, + "▁Singapore": 17366, + "heet": 17367, + "▁pir": 17368, + "▁acquis": 17369, + "▁назва": 17370, + "теля": 17371, + "▁recru": 17372, + "жения": 17373, + "ёл": 17374, + "версите": 17375, + "▁respective": 17376, + "▁tunnel": 17377, + "▁Dean": 17378, + "Du": 17379, + "▁uncle": 17380, + "▁offensive": 17381, + "colo": 17382, + "▁Unlike": 17383, + "series": 17384, + "▁Arn": 17385, + "minute": 17386, + "▁descriptor": 17387, + "▁stones": 17388, + "ICATION": 17389, + "▁Pad": 17390, + "▁iPhone": 17391, + "ei": 17392, + "▁fantasy": 17393, + "▁Korean": 17394, + "\"}": 17395, + "▁orth": 17396, + "halten": 17397, + "deep": 17398, + "▁Kay": 17399, + "requency": 17400, + "▁duties": 17401, + "awt": 17402, + "▁nearest": 17403, + "▁disorder": 17404, + "стру": 17405, + "▁Chile": 17406, + "▁seq": 17407, + "▁transportation": 17408, + "OO": 17409, + "▁Dez": 17410, + "iju": 17411, + "▁Results": 17412, + "jed": 17413, + "ivel": 17414, + "HOST": 17415, + "▁€": 17416, + "▁Î": 17417, + "▁chin": 17418, + "▁matt": 17419, + "▁voted": 17420, + "▁gehör": 17421, + "▁▁▁▁▁▁▁▁▁▁▁": 17422, + "▁sue": 17423, + "▁legacy": 17424, + "вся": 17425, + "SOURCE": 17426, + "WORK": 17427, + "itis": 17428, + "▁$|": 17429, + "▁обо": 17430, + "▁nr": 17431, + "▁Tamb": 17432, + "▁snap": 17433, + "▁impressed": 17434, + "▁deposit": 17435, + "▁divid": 17436, + "Segment": 17437, + "▁кар": 17438, + "▁Gas": 17439, + "▁crimes": 17440, + "▁insult": 17441, + "▁Hum": 17442, + "▁bounded": 17443, + "▁kicked": 17444, + "▁Му": 17445, + "▁|\\": 17446, + "added": 17447, + "Produ": 17448, + "▁./": 17449, + "▁awkward": 17450, + "▁Кра": 17451, + "▁ї": 17452, + "▁CONTR": 17453, + "▁beim": 17454, + "▁placeholder": 17455, + "spi": 17456, + "▁Bei": 17457, + "▁Pf": 17458, + "ientes": 17459, + "disk": 17460, + "blk": 17461, + "neo": 17462, + "itarian": 17463, + "▁cogn": 17464, + "▁sout": 17465, + "▁trash": 17466, + "▁Rab": 17467, + "▁decline": 17468, + "tat": 17469, + "▁combine": 17470, + "▁Tot": 17471, + "▁drops": 17472, + "Times": 17473, + "cheduler": 17474, + "▁governments": 17475, + "Tex": 17476, + "▁Used": 17477, + "зан": 17478, + "▁pd": 17479, + "мет": 17480, + "▁&=&": 17481, + "▁Nag": 17482, + "▁дол": 17483, + "▁Always": 17484, + "rtc": 17485, + "ске": 17486, + "▁performances": 17487, + "rupted": 17488, + "▁два": 17489, + "▁managers": 17490, + "▁Pitt": 17491, + "▁mystery": 17492, + "▁settle": 17493, + "ulse": 17494, + "cross": 17495, + "question": 17496, + "asha": 17497, + "seed": 17498, + "urable": 17499, + "Final": 17500, + "++++": 17501, + "inputs": 17502, + "▁backup": 17503, + "▁Learning": 17504, + "▁*,": 17505, + "logo": 17506, + "▁seinen": 17507, + "▁vulnerable": 17508, + "directory": 17509, + "ië": 17510, + "▁friendship": 17511, + "tu": 17512, + "▁Vec": 
17513, + "rifice": 17514, + "▁бра": 17515, + "▁involve": 17516, + "TON": 17517, + "▁corrid": 17518, + "separ": 17519, + "Destroy": 17520, + "▁jul": 17521, + "▁inequality": 17522, + "▁ain": 17523, + "hex": 17524, + "▁wider": 17525, + "тели": 17526, + "▁jack": 17527, + "▁quot": 17528, + "▁Glen": 17529, + "initely": 17530, + "ihood": 17531, + "▁waist": 17532, + "▁Manchester": 17533, + "regular": 17534, + "▁(&": 17535, + "▁masses": 17536, + "▁DEFAULT": 17537, + "▁chairs": 17538, + "▁Fast": 17539, + "▁citt": 17540, + "_{{\\": 17541, + "oa": 17542, + "▁$\\{": 17543, + "▁seeds": 17544, + "▁Ald": 17545, + "▁Batt": 17546, + "fab": 17547, + "▁democracy": 17548, + "DTO": 17549, + "▁Hij": 17550, + "PTR": 17551, + "Na": 17552, + "▁Harvard": 17553, + "sid": 17554, + "Pred": 17555, + "fers": 17556, + "▁spare": 17557, + "AMP": 17558, + "▁groupe": 17559, + "▁sender": 17560, + "▁Christopher": 17561, + "▁prisoners": 17562, + "▁Ker": 17563, + "▁Crist": 17564, + "▁ALL": 17565, + "rice": 17566, + "▁antes": 17567, + "natural": 17568, + "▁Susan": 17569, + "▁Juli": 17570, + "▁diab": 17571, + "ixon": 17572, + "icator": 17573, + "▁flexible": 17574, + "▁reserve": 17575, + "Contains": 17576, + "▁Hil": 17577, + "▁Isa": 17578, + "▁towns": 17579, + "GS": 17580, + "▁Trad": 17581, + "▁Lock": 17582, + "▁Grund": 17583, + "▁criticism": 17584, + "ню": 17585, + "▁că": 17586, + "▁politician": 17587, + "stable": 17588, + "Accept": 17589, + "Summary": 17590, + "▁também": 17591, + "}^{-": 17592, + "▁IM": 17593, + "idal": 17594, + "мор": 17595, + "Blue": 17596, + "GROUP": 17597, + "▁terminal": 17598, + "▁complexity": 17599, + "▁locally": 17600, + "DOWN": 17601, + "▁Near": 17602, + "Depth": 17603, + "▁pole": 17604, + "▁equality": 17605, + "Site": 17606, + "▁isinstance": 17607, + "Speed": 17608, + "ippi": 17609, + ",&": 17610, + "▁Enc": 17611, + "щен": 17612, + "▁mater": 17613, + "▁slaves": 17614, + "ACTION": 17615, + "usalem": 17616, + "▁haz": 17617, + "▁Beat": 17618, + "▁wrest": 17619, + "▁llam": 17620, + "Ins": 17621, + "мина": 17622, + "▁був": 17623, + "▁Frame": 17624, + "ushes": 17625, + "▁virtually": 17626, + "▁Perm": 17627, + "▁weights": 17628, + "▁llvm": 17629, + "▁cave": 17630, + "states": 17631, + "DMA": 17632, + "ellt": 17633, + "ifact": 17634, + "vendor": 17635, + "▁Emma": 17636, + "Locale": 17637, + "▁SET": 17638, + "▁geometry": 17639, + "Styles": 17640, + "▁Referee": 17641, + "▁weit": 17642, + "fica": 17643, + "▁ads": 17644, + "gray": 17645, + "▁Burg": 17646, + "iona": 17647, + "dagger": 17648, + "▁Januar": 17649, + "дей": 17650, + "isterschaft": 17651, + "ppo": 17652, + "oids": 17653, + "▁départ": 17654, + "Shader": 17655, + "▁constraint": 17656, + "Secret": 17657, + "▁Peters": 17658, + "▁eyeb": 17659, + "▁mesh": 17660, + "▁cookie": 17661, + "▁Pick": 17662, + "▁nick": 17663, + "bye": 17664, + "▁savings": 17665, + "Try": 17666, + "python": 17667, + "▁patri": 17668, + "▁multip": 17669, + "▁kinda": 17670, + "▁'_": 17671, + "▁Franz": 17672, + "▁cloth": 17673, + "зульта": 17674, + "▁fleet": 17675, + "▁humanity": 17676, + "resa": 17677, + "blob": 17678, + "▁TX": 17679, + "▁Buch": 17680, + "▁Lond": 17681, + "▁valley": 17682, + "▁murm": 17683, + "▁Trade": 17684, + "linewidth": 17685, + "▁especial": 17686, + "upper": 17687, + "▁hosp": 17688, + "▁tanto": 17689, + "▁oldest": 17690, + "▁Roose": 17691, + "▁hitting": 17692, + "dog": 17693, + "ovi": 17694, + "},\r": 17695, + "▁compatible": 17696, + "▁Website": 17697, + "poch": 17698, + "▁Bag": 17699, + "▁accomplish": 17700, + "Christ": 17701, + "asset": 17702, + "▁Until": 17703, + 
"▁geld": 17704, + "Listen": 17705, + "SB": 17706, + "Setup": 17707, + "icia": 17708, + "▁lum": 17709, + "▁janvier": 17710, + "PAGE": 17711, + "▁Nu": 17712, + "/\"": 17713, + "▁divorce": 17714, + "Execute": 17715, + "Depend": 17716, + "▁Scottish": 17717, + "▁Ts": 17718, + "ruppe": 17719, + "▁refuse": 17720, + "▁Oktober": 17721, + "ijk": 17722, + "▁Amy": 17723, + "▁dimin": 17724, + "▁gross": 17725, + "▁trat": 17726, + "isible": 17727, + "mixer": 17728, + "▁autres": 17729, + "▁neat": 17730, + "▁otros": 17731, + "Void": 17732, + "▁schol": 17733, + "▁Walker": 17734, + "▁tube": 17735, + "ologists": 17736, + "▁груп": 17737, + "▁haben": 17738, + "uber": 17739, + "ACTIVE": 17740, + "▁Attendance": 17741, + "▁оп": 17742, + "▁blade": 17743, + "oplus": 17744, + "▁Original": 17745, + "▁manufacturer": 17746, + "asz": 17747, + "âte": 17748, + "rer": 17749, + "▁Json": 17750, + "▁succeeded": 17751, + "uffle": 17752, + "▁backed": 17753, + "esian": 17754, + "tick": 17755, + "External": 17756, + "▁XIX": 17757, + "▁hearts": 17758, + "▁После": 17759, + "olu": 17760, + "▁лет": 17761, + "VICE": 17762, + "ário": 17763, + "▁fraud": 17764, + "edu": 17765, + "Primary": 17766, + "▁gaming": 17767, + "▁plt": 17768, + "igator": 17769, + "IES": 17770, + "Compiler": 17771, + "▁monument": 17772, + "agem": 17773, + "▁Rain": 17774, + "▁moins": 17775, + "oku": 17776, + "osex": 17777, + "▁Kansas": 17778, + "▁gepublice": 17779, + "▁Joy": 17780, + "Scene": 17781, + "▁kingdom": 17782, + "rices": 17783, + "▁juin": 17784, + "▁uncomfortable": 17785, + "▁Money": 17786, + "obb": 17787, + "expl": 17788, + "strcmp": 17789, + "▁dread": 17790, + "rition": 17791, + "▁Chi": 17792, + "▁demonstrated": 17793, + "▁vertices": 17794, + "чо": 17795, + "▁Culture": 17796, + "FX": 17797, + "Dictionary": 17798, + "▁Dru": 17799, + "trm": 17800, + "▁examine": 17801, + "▁therap": 17802, + "ième": 17803, + "мини": 17804, + "▁produces": 17805, + "▁photographs": 17806, + "▁threads": 17807, + "▁MI": 17808, + "▁extraordinary": 17809, + "ским": 17810, + "▁gepubliceerd": 17811, + "▁Poland": 17812, + "▁guaranteed": 17813, + "RG": 17814, + "osc": 17815, + "али": 17816, + "▁тех": 17817, + "errno": 17818, + "science": 17819, + "iffs": 17820, + "▁Tam": 17821, + "▁Beth": 17822, + "▁Travel": 17823, + "▁translate": 17824, + "ché": 17825, + "▁ling": 17826, + "▁belongs": 17827, + "▁electrical": 17828, + "ensk": 17829, + "▁Compet": 17830, + "cg": 17831, + "VC": 17832, + "topic": 17833, + "▁presum": 17834, + "вета": 17835, + "▁approximation": 17836, + "▁grim": 17837, + "▁Из": 17838, + "_{(": 17839, + "вин": 17840, + "ution": 17841, + "owych": 17842, + "åg": 17843, + "sterreich": 17844, + "▁characteristic": 17845, + "oming": 17846, + "▁/*!": 17847, + "▁prize": 17848, + "▁Minnesota": 17849, + "ted": 17850, + "цы": 17851, + "▁Om": 17852, + "▁indices": 17853, + "▁stem": 17854, + "regon": 17855, + "ниче": 17856, + "▁Salv": 17857, + "ése": 17858, + "▁aged": 17859, + "▁Past": 17860, + "▁internation": 17861, + "▁Vic": 17862, + "▁resume": 17863, + "akespeare": 17864, + "▁estado": 17865, + "▁abilities": 17866, + "▁brow": 17867, + "▁NFL": 17868, + "▁trends": 17869, + "▁Austin": 17870, + "▁LIMIT": 17871, + "▁Kor": 17872, + "▁folk": 17873, + "▁ward": 17874, + "▁nest": 17875, + "▁Junior": 17876, + "▁maintaining": 17877, + "Pub": 17878, + "OBJECT": 17879, + "▁bloody": 17880, + "▁sj": 17881, + "▁dtype": 17882, + "Pane": 17883, + "▁bacter": 17884, + "▁gradually": 17885, + "mr": 17886, + "Team": 17887, + "▁indicating": 17888, + "▁decrease": 17889, + "tek": 17890, + "▁Represent": 17891, + 
"▁developers": 17892, + "Guid": 17893, + "▁Diet": 17894, + "▁retr": 17895, + "Navigation": 17896, + "esi": 17897, + "▁lazy": 17898, + "Standard": 17899, + "Er": 17900, + "AW": 17901, + "▁États": 17902, + "▁assured": 17903, + "San": 17904, + "▁Andre": 17905, + "’,": 17906, + "fang": 17907, + "ération": 17908, + "▁industries": 17909, + "▁incon": 17910, + "Emit": 17911, + "▁где": 17912, + "▁retriev": 17913, + "eni": 17914, + "▁Turkey": 17915, + "izers": 17916, + "Angle": 17917, + "▁oc": 17918, + "▁palm": 17919, + "▁stan": 17920, + "льно": 17921, + "▁CSS": 17922, + "▁frances": 17923, + "▁grin": 17924, + "▁tiempo": 17925, + "▁Prix": 17926, + "]).": 17927, + "▁deput": 17928, + "▁Pin": 17929, + "▁sixt": 17930, + "▁predicted": 17931, + "azure": 17932, + "▁Motor": 17933, + "▁ihm": 17934, + "▁manus": 17935, + "apos": 17936, + "▁instruments": 17937, + "▁counts": 17938, + "▁aimed": 17939, + "profit": 17940, + "▁dok": 17941, + "обра": 17942, + "▁estud": 17943, + "iesz": 17944, + "▁piss": 17945, + "▁inaug": 17946, + "▁voters": 17947, + "▁packages": 17948, + "▁cute": 17949, + "▁fitness": 17950, + "▁leurs": 17951, + "▁sorted": 17952, + "phant": 17953, + "OPT": 17954, + "▁zip": 17955, + "season": 17956, + "emi": 17957, + "encoding": 17958, + "won": 17959, + "elect": 17960, + "▁tooth": 17961, + "▁upcoming": 17962, + "▁Graham": 17963, + "nut": 17964, + "▁Ark": 17965, + "ält": 17966, + "▁precious": 17967, + "agle": 17968, + "née": 17969, + "ница": 17970, + "aris": 17971, + "▁pile": 17972, + "cole": 17973, + "▁WITH": 17974, + "routing": 17975, + "▁***": 17976, + "Appearance": 17977, + "llvm": 17978, + "▁Oliver": 17979, + "▁PL": 17980, + "ifndef": 17981, + "etzt": 17982, + "skiego": 17983, + "▁pon": 17984, + "ARGET": 17985, + "kö": 17986, + "alled": 17987, + "▁=\\": 17988, + "sure": 17989, + "matches": 17990, + "▁temperatures": 17991, + "SEL": 17992, + "▁clone": 17993, + "▁eller": 17994, + "erna": 17995, + "▁поло": 17996, + "Management": 17997, + "company": 17998, + "▁lun": 17999, + "▁streaming": 18000, + "▁Ni": 18001, + "▁sí": 18002, + "Contact": 18003, + "▁Credit": 18004, + "▁Oak": 18005, + "▁представ": 18006, + "radius": 18007, + "cli": 18008, + "IENT": 18009, + "▁Lucy": 18010, + "▁calculation": 18011, + "▁pixel": 18012, + "▁mul": 18013, + "▁outcomes": 18014, + "▁centers": 18015, + "▁residence": 18016, + "Constraint": 18017, + "▁preserve": 18018, + "peon": 18019, + "uffix": 18020, + "▁Roberts": 18021, + "▁promot": 18022, + "?!": 18023, + "balance": 18024, + "▁courts": 18025, + "▁disg": 18026, + "PRINT": 18027, + "▁их": 18028, + "elfare": 18029, + "▁retreat": 18030, + "▁Ав": 18031, + "Cost": 18032, + "also": 18033, + "▁Für": 18034, + "▁März": 18035, + "DIO": 18036, + "▁bez": 18037, + "AUTH": 18038, + "Den": 18039, + "▁atom": 18040, + "▁roman": 18041, + "▁Pel": 18042, + "▁Roosevelt": 18043, + "▁Plant": 18044, + "Contents": 18045, + "▁Between": 18046, + "▁coupling": 18047, + "structure": 18048, + "▁Marshall": 18049, + "▁Career": 18050, + "▁railway": 18051, + "▁Bureau": 18052, + "▁possibilities": 18053, + "▁kor": 18054, + "){\r": 18055, + "mero": 18056, + "mov": 18057, + "англ": 18058, + "AIN": 18059, + "mund": 18060, + "lette": 18061, + "▁summar": 18062, + "▁describing": 18063, + "▁NAS": 18064, + "▁Emb": 18065, + "Instruction": 18066, + "liest": 18067, + "▁Sig": 18068, + "Bill": 18069, + "▁verd": 18070, + "plant": 18071, + "▁galaxies": 18072, + "\"])": 18073, + "▁PyObject": 18074, + "▁Gy": 18075, + "▁mě": 18076, + "▁organisation": 18077, + "Her": 18078, + "Sep": 18079, + "ocom": 18080, + "▁Same": 18081, + 
"▁bite": 18082, + "▁Seattle": 18083, + "зыва": 18084, + "Observer": 18085, + "’.": 18086, + "▁morph": 18087, + "urches": 18088, + "alph": 18089, + "reement": 18090, + "consin": 18091, + "^-": 18092, + "▁dann": 18093, + "translate": 18094, + "вих": 18095, + "React": 18096, + "▁cats": 18097, + "▁brew": 18098, + "▁ds": 18099, + "▁circles": 18100, + "▁drift": 18101, + "agma": 18102, + "▁Valent": 18103, + "PIN": 18104, + "ARM": 18105, + "▁surviv": 18106, + "alin": 18107, + "Pref": 18108, + "friendly": 18109, + "▁uncertainty": 18110, + "▁fd": 18111, + "▁engineer": 18112, + "Ben": 18113, + "icular": 18114, + "orest": 18115, + "▁horizontal": 18116, + "UTC": 18117, + "textrm": 18118, + "Live": 18119, + "Score": 18120, + "▁Germans": 18121, + "distance": 18122, + "uti": 18123, + "▁équ": 18124, + "▁numerical": 18125, + "▁reass": 18126, + "Activ": 18127, + "▁cod": 18128, + "bullet": 18129, + "ensing": 18130, + "▁Gem": 18131, + "▁navigation": 18132, + "addClass": 18133, + "▁simultaneously": 18134, + "вий": 18135, + "▁його": 18136, + "▁Hö": 18137, + "▁harsh": 18138, + "precated": 18139, + "ССР": 18140, + "▁Equip": 18141, + "adget": 18142, + "▁TYPE": 18143, + "▁mg": 18144, + "IGH": 18145, + "▁vin": 18146, + "▁findings": 18147, + "ivan": 18148, + "▁possession": 18149, + "▁того": 18150, + "▁parsed": 18151, + "riors": 18152, + "zeichnet": 18153, + "ников": 18154, + "Worker": 18155, + "▁enables": 18156, + "▁($\\": 18157, + "▁Copy": 18158, + "▁orientation": 18159, + "стре": 18160, + "▁Indians": 18161, + "▁Gary": 18162, + "▁Insurance": 18163, + "isan": 18164, + "Chat": 18165, + "▁comun": 18166, + "▁coron": 18167, + "ография": 18168, + "updated": 18169, + "▁Ин": 18170, + "These": 18171, + "SEC": 18172, + "▁boyfriend": 18173, + "Diagnostics": 18174, + "Hint": 18175, + "mul": 18176, + "▁inode": 18177, + "xA": 18178, + "eft": 18179, + "OPTION": 18180, + "unct": 18181, + "annon": 18182, + "ENS": 18183, + "strip": 18184, + "▁enthusi": 18185, + "▁Whit": 18186, + "▁Фи": 18187, + "aude": 18188, + "▁disagree": 18189, + "▁snapped": 18190, + "Phys": 18191, + "▁Syn": 18192, + "▁sour": 18193, + "▁Lux": 18194, + "ugar": 18195, + "tile": 18196, + "▁infection": 18197, + "▁Feb": 18198, + "▁Chem": 18199, + "dataset": 18200, + "chts": 18201, + "Dynamic": 18202, + "▁сред": 18203, + "▁queen": 18204, + "worker": 18205, + "swap": 18206, + "▁timestamp": 18207, + "▁Integr": 18208, + "▁interviews": 18209, + "such": 18210, + "▁laughter": 18211, + "prof": 18212, + "▁Bird": 18213, + "(|": 18214, + "ân": 18215, + "▁gra": 18216, + "&=": 18217, + "zens": 18218, + "getMessage": 18219, + "▁Ost": 18220, + "▁gab": 18221, + "▁mortgage": 18222, + "multicol": 18223, + "LEVEL": 18224, + "partition": 18225, + "seen": 18226, + "▁declar": 18227, + "AU": 18228, + "▁ox": 18229, + "▁ligger": 18230, + "▁Carm": 18231, + "geme": 18232, + "▁Vegas": 18233, + "▁Eug": 18234, + "orus": 18235, + "▁brick": 18236, + "▁así": 18237, + "▁Magazine": 18238, + "HasColumnType": 18239, + "VR": 18240, + "licher": 18241, + "▁Future": 18242, + "▁Jug": 18243, + "attan": 18244, + "constructor": 18245, + "VP": 18246, + "▁тур": 18247, + "чина": 18248, + "Comparator": 18249, + "▁authentic": 18250, + "▁monster": 18251, + "▁transformed": 18252, + "▁firms": 18253, + "FW": 18254, + "▁catalog": 18255, + "boards": 18256, + "▁diseases": 18257, + "▁Benjamin": 18258, + "▁horizon": 18259, + "▁Available": 18260, + "Mvc": 18261, + "Stud": 18262, + "▁lord": 18263, + "general": 18264, + "пар": 18265, + "▁cabinet": 18266, + "▁Basic": 18267, + "TestCase": 18268, + "ansk": 18269, + "▁Snow": 18270, + 
"ierten": 18271, + "▁vocal": 18272, + "Padding": 18273, + "halt": 18274, + "▁Alexand": 18275, + "▁Colomb": 18276, + "ivamente": 18277, + "▁artificial": 18278, + "▁Atlanta": 18279, + "▁mentre": 18280, + "▁estaba": 18281, + "jekt": 18282, + "▁slept": 18283, + "▁endless": 18284, + "éro": 18285, + "attery": 18286, + "uur": 18287, + "▁weakness": 18288, + "▁attempting": 18289, + "BYTE": 18290, + "▁founder": 18291, + "▁salv": 18292, + "▁Medicine": 18293, + "tid": 18294, + "▁Schwe": 18295, + "raction": 18296, + "▁¿": 18297, + "crate": 18298, + "SERVER": 18299, + "▁compound": 18300, + "▁conve": 18301, + "▁caf": 18302, + "▁handful": 18303, + "onne": 18304, + "ública": 18305, + "▁defensive": 18306, + "Alignment": 18307, + "▁préc": 18308, + "▁significance": 18309, + "élé": 18310, + "arta": 18311, + "Dam": 18312, + "▁perpet": 18313, + "▁caller": 18314, + "icients": 18315, + "cep": 18316, + "▁Multi": 18317, + "▁stolen": 18318, + "▁focusing": 18319, + "embed": 18320, + "▁bree": 18321, + "▁AB": 18322, + "▁occasions": 18323, + "sea": 18324, + "Prov": 18325, + "чение": 18326, + "▁Category": 18327, + "▁sq": 18328, + "▁Фе": 18329, + "VA": 18330, + "Diff": 18331, + "Tri": 18332, + "issement": 18333, + "▁actress": 18334, + "▁Пе": 18335, + "▁jej": 18336, + "▁twisted": 18337, + "▁Nicol": 18338, + "▁junior": 18339, + "Sound": 18340, + "▁Brasil": 18341, + "▁juice": 18342, + "▁>>>": 18343, + "▁Alb": 18344, + "▁softly": 18345, + "▁McK": 18346, + "▁Gren": 18347, + "▁italiano": 18348, + "▁creatures": 18349, + "▁residential": 18350, + "▁Instagram": 18351, + "ucks": 18352, + "▁killer": 18353, + "▁Johnny": 18354, + "▁enterprise": 18355, + "Dto": 18356, + "chestra": 18357, + "▁Tel": 18358, + "▁Activ": 18359, + "factor": 18360, + "oust": 18361, + "▁vacuum": 18362, + "рал": 18363, + "')->": 18364, + "▁Left": 18365, + "▁defect": 18366, + "▁ninete": 18367, + "fare": 18368, + "▁regret": 18369, + "▁shar": 18370, + "ctrine": 18371, + "mesh": 18372, + "city": 18373, + "icit": 18374, + "▁Fem": 18375, + "limited": 18376, + "oka": 18377, + "!\\!\\": 18378, + "Donald": 18379, + "зно": 18380, + "▁provision": 18381, + "▁discussions": 18382, + "Drag": 18383, + "▁Incl": 18384, + "Exit": 18385, + "▁Abd": 18386, + "story": 18387, + "ieve": 18388, + "▁był": 18389, + "olving": 18390, + "wohner": 18391, + "▁guidelines": 18392, + "▁straw": 18393, + "üss": 18394, + "▁було": 18395, + "▁burden": 18396, + "▁spatial": 18397, + "▁stretched": 18398, + "▁Inf": 18399, + "▁typedef": 18400, + "▁robot": 18401, + "▁Doc": 18402, + "pliers": 18403, + "wal": 18404, + "camp": 18405, + "▁diffé": 18406, + "▁McG": 18407, + "▁tel": 18408, + "arette": 18409, + "▁subsequently": 18410, + "▁honey": 18411, + "FUNC": 18412, + "▁establishment": 18413, + "tesy": 18414, + "▁który": 18415, + "▁сель": 18416, + "▁FO": 18417, + "▁Islands": 18418, + "▁mp": 18419, + "Scalar": 18420, + "▁Yan": 18421, + "cken": 18422, + "▁variation": 18423, + "ią": 18424, + "optim": 18425, + "azor": 18426, + "tuple": 18427, + "▁gravity": 18428, + "▁conclude": 18429, + "▁collections": 18430, + "ész": 18431, + "▁Liver": 18432, + "▁ethnic": 18433, + "compile": 18434, + "▁parl": 18435, + "Surface": 18436, + "{'": 18437, + "▁paragraph": 18438, + "posite": 18439, + "ítulo": 18440, + "oba": 18441, + "binary": 18442, + "rob": 18443, + "▁Pedro": 18444, + "▁fis": 18445, + "▁Grande": 18446, + "odox": 18447, + "▁posting": 18448, + "": 26345, + "olent": 26346, + "▁этого": 26347, + "▁Generic": 26348, + "▁*/,": 26349, + "▁combinations": 26350, + "▁rejo": 26351, + "спубли": 26352, + "capacity": 26353, + "▁traces": 
26354, + "▁opacity": 26355, + "▁Official": 26356, + "icion": 26357, + "▁emotionally": 26358, + "▁Joel": 26359, + "ському": 26360, + "▁legendary": 26361, + "▁pam": 26362, + "▁También": 26363, + ".<": 26364, + "iba": 26365, + "midt": 26366, + "бом": 26367, + "▁ensuite": 26368, + "Authorization": 26369, + "Pag": 26370, + "▁helmet": 26371, + "▁territo": 26372, + "secondary": 26373, + "▁segunda": 26374, + "▁Wire": 26375, + "recated": 26376, + "▁invoked": 26377, + "▁ValueError": 26378, + "▁фо": 26379, + "ALIGN": 26380, + "CURRENT": 26381, + "\\+\\_\\": 26382, + "▁compilation": 26383, + "ær": 26384, + "▁Palmar": 26385, + "▁influences": 26386, + "/:": 26387, + "Mix": 26388, + "NOP": 26389, + "econom": 26390, + "▁tucked": 26391, + "▁});\r": 26392, + "ANK": 26393, + "reject": 26394, + "▁pension": 26395, + "▁generates": 26396, + "чё": 26397, + "▁incap": 26398, + "▁clicked": 26399, + "▁fus": 26400, + "ourses": 26401, + "▁Easter": 26402, + "%;": 26403, + "zin": 26404, + "▁obligations": 26405, + "▁Tips": 26406, + "};\r": 26407, + ".\"_": 26408, + "▁BSD": 26409, + "ática": 26410, + "▁expose": 26411, + "Pars": 26412, + "▁Amanda": 26413, + "куп": 26414, + "▁guessed": 26415, + "dsi": 26416, + "▁Leip": 26417, + "Broad": 26418, + "▁Hughes": 26419, + "ié": 26420, + "▁Wahl": 26421, + "▁formerly": 26422, + "Relative": 26423, + "▁Yu": 26424, + "▁Mountains": 26425, + "▁Enum": 26426, + "▁strang": 26427, + "_-": 26428, + "recht": 26429, + "viv": 26430, + "pause": 26431, + "▁Londres": 26432, + "▁elbow": 26433, + "▁Hawaii": 26434, + "▁Casino": 26435, + "Threshold": 26436, + "Units": 26437, + "Include": 26438, + "ито": 26439, + "asury": 26440, + "▁steht": 26441, + "▁damned": 26442, + "▁packets": 26443, + "▁Werk": 26444, + "▁elevator": 26445, + "iedad": 26446, + "govern": 26447, + "▁CONTRACT": 26448, + "mals": 26449, + "▁remem": 26450, + "▁entonces": 26451, + "▁vas": 26452, + "▁sympathy": 26453, + "▁befindet": 26454, + "incing": 26455, + "DataSet": 26456, + "▁additionally": 26457, + "▁musician": 26458, + "шего": 26459, + "▁listop": 26460, + ">\")": 26461, + "Printf": 26462, + "▁Felix": 26463, + "▁carved": 26464, + "▁nicely": 26465, + "гом": 26466, + "chap": 26467, + "▁Nieder": 26468, + "▁Lav": 26469, + "▁modifications": 26470, + "moment": 26471, + "▁balcon": 26472, + "▁dependency": 26473, + "CKET": 26474, + "▁vanished": 26475, + "▁fighters": 26476, + "▁zunächst": 26477, + "ioctl": 26478, + "▁defens": 26479, + "▁Nem": 26480, + "Utility": 26481, + "▁curv": 26482, + "▁DAMAGES": 26483, + "▁Rogers": 26484, + "▁gratitude": 26485, + "▁Denmark": 26486, + "рая": 26487, + "grpc": 26488, + "▁juni": 26489, + "▁октября": 26490, + "▁immense": 26491, + "▁prevented": 26492, + "▁foam": 26493, + "▁Extra": 26494, + "aimed": 26495, + "▁Criteria": 26496, + "▁Simply": 26497, + "boxes": 26498, + "▁Legend": 26499, + "▁Players": 26500, + "▁Mercedes": 26501, + "▁Branch": 26502, + "TERN": 26503, + "omena": 26504, + "▁incorporate": 26505, + "conde": 26506, + "▁Estado": 26507, + "▁wasted": 26508, + "▁complaining": 26509, + "▁warriors": 26510, + "oter": 26511, + "▁этом": 26512, + "▁conten": 26513, + "▁machinery": 26514, + "▁technological": 26515, + "▁TD": 26516, + "▁gras": 26517, + "▁minimize": 26518, + "▁Door": 26519, + "▁bzw": 26520, + "▁prac": 26521, + "TREE": 26522, + "▁Wing": 26523, + "▁Transaction": 26524, + "▁MVT": 26525, + "▁Klein": 26526, + "commons": 26527, + "▁}{": 26528, + "▁Heritage": 26529, + "▁fade": 26530, + "рок": 26531, + "setValue": 26532, + "▁Wallace": 26533, + "MX": 26534, + "▁ACT": 26535, + "▁footage": 26536, + "▁entstand": 
26537, + "arga": 26538, + "▁nails": 26539, + "▁capitalism": 26540, + "▁Garc": 26541, + "▁suspension": 26542, + "ilis": 26543, + "▁Mov": 26544, + "uffled": 26545, + "Arc": 26546, + "▁Beautiful": 26547, + "WAY": 26548, + "Parallel": 26549, + "XXXX": 26550, + "diag": 26551, + "▁DT": 26552, + "mq": 26553, + "TextView": 26554, + "MLE": 26555, + "ennen": 26556, + "▁infected": 26557, + "▁therapist": 26558, + "INGS": 26559, + "▁cidade": 26560, + "ън": 26561, + "▁pdf": 26562, + "▁bump": 26563, + "CTX": 26564, + "▁INCLUDING": 26565, + "▁Gef": 26566, + "ENTIAL": 26567, + "▁handy": 26568, + "▁temporal": 26569, + "AtA": 26570, + "ISH": 26571, + "▁Pattern": 26572, + "▁lan": 26573, + "ependant": 26574, + "▁shining": 26575, + "idy": 26576, + "▁NT": 26577, + "▁Fran": 26578, + "▁nurses": 26579, + "▁betray": 26580, + "▁sensible": 26581, + "▁апреля": 26582, + "▁'[": 26583, + "▁thirteen": 26584, + ")}_{": 26585, + "▁Noah": 26586, + "INSERT": 26587, + "istically": 26588, + "▁Appendix": 26589, + "▁recher": 26590, + "Receiver": 26591, + "▁dernier": 26592, + "лла": 26593, + "лиза": 26594, + "▁Partido": 26595, + "▁maximal": 26596, + "snap": 26597, + "▁часть": 26598, + "STOP": 26599, + "▁ultra": 26600, + "▁développ": 26601, + "▁tegen": 26602, + "▁Чи": 26603, + "LIB": 26604, + "▁baseline": 26605, + "reload": 26606, + "▁Arbitro": 26607, + "▁kall": 26608, + "capture": 26609, + "Arm": 26610, + "quin": 26611, + "impse": 26612, + "zas": 26613, + "▁Cand": 26614, + "▁brains": 26615, + "▁hostile": 26616, + "▁marble": 26617, + "oons": 26618, + "▁Loss": 26619, + "MetaData": 26620, + "▁República": 26621, + "▁andra": 26622, + "oden": 26623, + "▁documented": 26624, + "▁Moses": 26625, + "odd": 26626, + "▁wax": 26627, + "usch": 26628, + "▁diagnosed": 26629, + "inkle": 26630, + "▁Xbox": 26631, + "▁seventy": 26632, + "cias": 26633, + "▁noviembre": 26634, + "Compute": 26635, + "});\r": 26636, + "▁Philippe": 26637, + "▁För": 26638, + "Leave": 26639, + "▁sage": 26640, + "▁unpre": 26641, + "▁Fortunately": 26642, + "▁apost": 26643, + "entities": 26644, + "▁ellos": 26645, + "authorized": 26646, + "GBT": 26647, + "▁insist": 26648, + "▁inspire": 26649, + "Mass": 26650, + "▁rôle": 26651, + "fee": 26652, + "ipart": 26653, + "цер": 26654, + "unate": 26655, + "▁CNN": 26656, + ":}": 26657, + "▁unhappy": 26658, + "▁imported": 26659, + "HIGH": 26660, + "rings": 26661, + "▁Instance": 26662, + "Bay": 26663, + "agles": 26664, + "mee": 26665, + "bery": 26666, + "▁Stories": 26667, + "▁Chase": 26668, + "▁carriage": 26669, + "▁misunder": 26670, + "▁imagin": 26671, + "pw": 26672, + "▁Meter": 26673, + "▁crowds": 26674, + "▁Fame": 26675, + "skill": 26676, + "▁comed": 26677, + "▁ranch": 26678, + "▁lacking": 26679, + "▁submar": 26680, + "iante": 26681, + "▁lanz": 26682, + "▁служ": 26683, + "-----------": 26684, + "▁obten": 26685, + "▁downstairs": 26686, + "YN": 26687, + "rotation": 26688, + "▁Jesse": 26689, + "$(\"#": 26690, + "▁puls": 26691, + "irling": 26692, + "▁Schaus": 26693, + "▁deployed": 26694, + "▁{}\",": 26695, + "▁Marvel": 26696, + "ENUM": 26697, + "▁Mathemat": 26698, + "▁nn": 26699, + "compet": 26700, + "ków": 26701, + "bil": 26702, + "Which": 26703, + "isine": 26704, + "▁rude": 26705, + "▁niveau": 26706, + "▁área": 26707, + "▁près": 26708, + "atis": 26709, + "▁[...]": 26710, + "fur": 26711, + "omm": 26712, + "packed": 26713, + "мене": 26714, + "scriptstyle": 26715, + "▁Ath": 26716, + "▁desp": 26717, + "eltemperaturen": 26718, + "▁talents": 26719, + "ocy": 26720, + "▁raises": 26721, + "LIMIT": 26722, + "▁editorial": 26723, + "▁Animal": 26724, + 
"drive": 26725, + "▁работа": 26726, + "bss": 26727, + "▁Sev": 26728, + "epoch": 26729, + "▁RC": 26730, + "UNUSED": 26731, + "▁mandatory": 26732, + "(?:": 26733, + "▁Bin": 26734, + "▁synthetic": 26735, + "▁gown": 26736, + "▁Dob": 26737, + "kap": 26738, + "▁harmon": 26739, + "▁liberty": 26740, + "▁Rice": 26741, + "▁prayers": 26742, + "▁mise": 26743, + "▁confusing": 26744, + "▁leap": 26745, + "▁arrives": 26746, + "kamp": 26747, + "▁thats": 26748, + "ACC": 26749, + "▁Parameters": 26750, + "▁одно": 26751, + "▁Bio": 26752, + "density": 26753, + "▁glimpse": 26754, + "FORE": 26755, + "▁Listen": 26756, + "Prev": 26757, + "}\\,\\": 26758, + "куль": 26759, + "▁SEC": 26760, + "▁explored": 26761, + "▁meantime": 26762, + "AIL": 26763, + "▁WP": 26764, + "▁raison": 26765, + "▁existe": 26766, + "▁lesser": 26767, + "▁Validate": 26768, + "▁caution": 26769, + "usta": 26770, + "heading": 26771, + "EFF": 26772, + ".'\"": 26773, + "▁Gilbert": 26774, + "▁limitation": 26775, + "▁retour": 26776, + "▁Commonwealth": 26777, + "▁gewann": 26778, + "▁miserable": 26779, + "▁networking": 26780, + "▁ottobre": 26781, + "▁Dise": 26782, + "edges": 26783, + "▁sede": 26784, + "вича": 26785, + "uniform": 26786, + "▁деятель": 26787, + "iros": 26788, + "▁desen": 26789, + "▁parc": 26790, + "▁Rico": 26791, + "Ns": 26792, + "guid": 26793, + "orio": 26794, + "avelength": 26795, + "▁Gle": 26796, + "inceton": 26797, + "Amaz": 26798, + "Construct": 26799, + "▁mx": 26800, + "▁Vern": 26801, + "▁Generation": 26802, + "Jack": 26803, + "romag": 26804, + "▁viagra": 26805, + "▁Peg": 26806, + "▁Updated": 26807, + "▁overlap": 26808, + "EventArgs": 26809, + "кро": 26810, + "▁*«": 26811, + "▁questioned": 26812, + "South": 26813, + "notice": 26814, + "▁permanently": 26815, + "lst": 26816, + "ficie": 26817, + "▁quella": 26818, + "▁colleges": 26819, + "▁disappointment": 26820, + "▁Luft": 26821, + "imgur": 26822, + "▁transitions": 26823, + "▁seller": 26824, + "▁июня": 26825, + "▁Og": 26826, + "▁ADD": 26827, + "▁Pays": 26828, + "COMMAND": 26829, + "grades": 26830, + "▁febbra": 26831, + "▁Cyr": 26832, + "▁febbraio": 26833, + "eti": 26834, + "▁arom": 26835, + "▁Claude": 26836, + "▁UEFA": 26837, + "▁живе": 26838, + "▁Victorian": 26839, + "keeping": 26840, + "ên": 26841, + "▁FIXME": 26842, + "itime": 26843, + "chestr": 26844, + "▁Samsung": 26845, + "▁doctrine": 26846, + "▁pear": 26847, + "▁Mediterranean": 26848, + "▁Ya": 26849, + "▁vault": 26850, + "▁Historic": 26851, + "▁sedan": 26852, + "▁heated": 26853, + "▁política": 26854, + "Proof": 26855, + ":{": 26856, + "fem": 26857, + "▁Frankfurt": 26858, + "pectives": 26859, + "MG": 26860, + "▁Eye": 26861, + "dai": 26862, + "▁reserves": 26863, + "NER": 26864, + "▁tobacco": 26865, + "▁fragments": 26866, + "icc": 26867, + "▁booth": 26868, + "▁cruise": 26869, + "▁Testament": 26870, + "cola": 26871, + "▁Leop": 26872, + "▁noon": 26873, + "▁terrified": 26874, + "vb": 26875, + "intel": 26876, + "alie": 26877, + "▁verification": 26878, + "yster": 26879, + "ADER": 26880, + "chied": 26881, + "▁datasets": 26882, + "▁зі": 26883, + "▁miem": 26884, + "ulates": 26885, + "▁uuid": 26886, + "▁Pictures": 26887, + "▁Brend": 26888, + "Billboard": 26889, + "▁stern": 26890, + "▁denom": 26891, + "▁accidents": 26892, + "сня": 26893, + "▁packing": 26894, + "ција": 26895, + "iblical": 26896, + "▁Так": 26897, + "▁whisk": 26898, + "▁luego": 26899, + "▁rectangle": 26900, + "▁hooks": 26901, + "▁neglect": 26902, + "▁sober": 26903, + "proposition": 26904, + "Multiple": 26905, + ":\",": 26906, + "▁bapt": 26907, + "Parts": 26908, + "▁Selection": 
26909, + "▁Alpha": 26910, + "weights": 26911, + "hall": 26912, + "соб": 26913, + "▁lur": 26914, + "▁época": 26915, + "▁rested": 26916, + "ambigu": 26917, + "▁tastes": 26918, + "amazonaws": 26919, + "▁confess": 26920, + "▁diciembre": 26921, + "implement": 26922, + "▁absorption": 26923, + "Hal": 26924, + "LEAN": 26925, + "▁Zach": 26926, + "▁freeze": 26927, + "LBL": 26928, + "STM": 26929, + "▁calc": 26930, + "={()": 26931, + "=*/": 26932, + "▁bt": 26933, + "Reb": 26934, + "▁Wien": 26935, + "anska": 26936, + "▁surn": 26937, + "iative": 26938, + "▁invån": 26939, + "CY": 26940, + "▁là": 26941, + "amba": 26942, + "leen": 26943, + "wahl": 26944, + "▁functioning": 26945, + "ția": 26946, + "getContext": 26947, + "gart": 26948, + "▁обе": 26949, + "Pen": 26950, + "vik": 26951, + "Slider": 26952, + "▁Accept": 26953, + "Gap": 26954, + "▁Jorge": 26955, + "SIG": 26956, + "▁вос": 26957, + "▁голо": 26958, + "▁periodo": 26959, + "шта": 26960, + "▁patches": 26961, + "кої": 26962, + "äre": 26963, + "engono": 26964, + "lista": 26965, + "horn": 26966, + "▁Complex": 26967, + "Sent": 26968, + "trfs": 26969, + "▁convex": 26970, + "Generation": 26971, + "▁місце": 26972, + "compress": 26973, + "▁Sax": 26974, + "▁uid": 26975, + "▁Lebens": 26976, + "Completion": 26977, + "\\|_{": 26978, + "insky": 26979, + "▁schon": 26980, + "▁masters": 26981, + "independ": 26982, + "neys": 26983, + "▁lied": 26984, + "▁aspir": 26985, + "чні": 26986, + "▁breakdown": 26987, + "▁Harm": 26988, + "▁designing": 26989, + "hf": 26990, + "▁Angela": 26991, + "▁confer": 26992, + "▁partido": 26993, + "▁interference": 26994, + "mao": 26995, + "▁absorbed": 26996, + "▁Vall": 26997, + "ErrorCode": 26998, + "▁Publishing": 26999, + "vano": 27000, + "BITS": 27001, + "▁deer": 27002, + "▁Campaign": 27003, + "▁graz": 27004, + "CHANGE": 27005, + "▁feder": 27006, + "iffe": 27007, + "handed": 27008, + "cq": 27009, + "umbing": 27010, + "▁unre": 27011, + "▁siendo": 27012, + "▁simpler": 27013, + "why": 27014, + "arettes": 27015, + "anst": 27016, + "▁hass": 27017, + "▁Enterprise": 27018, + "▁mois": 27019, + "▁Fo": 27020, + "▁участ": 27021, + "ffen": 27022, + "▁MODULE": 27023, + "▁activated": 27024, + "▁internacional": 27025, + "▁Mittel": 27026, + "degree": 27027, + "▁откры": 27028, + "▁&(": 27029, + "getProperty": 27030, + "isz": 27031, + "cedure": 27032, + "▁enters": 27033, + "▁Sally": 27034, + "▁Train": 27035, + "▁logged": 27036, + "▁Rav": 27037, + "▁Avoid": 27038, + "▁Kaiser": 27039, + "▁expend": 27040, + "aphor": 27041, + "▁brass": 27042, + "▁melod": 27043, + "▁attitudes": 27044, + "*\"": 27045, + "Wall": 27046, + "▁owe": 27047, + "▁bamb": 27048, + "shader": 27049, + "cester": 27050, + "▁PP": 27051, + "▁migrations": 27052, + "entric": 27053, + "▁Setup": 27054, + "▁Artist": 27055, + "hre": 27056, + "▁polite": 27057, + "ahan": 27058, + "▁luglio": 27059, + "▁predecess": 27060, + "▁SIG": 27061, + "тів": 27062, + "▁RF": 27063, + "▁Dry": 27064, + "▁maker": 27065, + "шим": 27066, + "▁Sounds": 27067, + "▁implementing": 27068, + "▁ah": 27069, + "▁gev": 27070, + "▁duplicate": 27071, + "▁Logan": 27072, + "▁Grade": 27073, + "DUCT": 27074, + "íses": 27075, + "ért": 27076, + "▁nonsense": 27077, + "backup": 27078, + "Attachment": 27079, + "▁ecc": 27080, + "▁Squadron": 27081, + "learn": 27082, + "deprecated": 27083, + "▁Aub": 27084, + "▁Gol": 27085, + "▁overl": 27086, + "SERVICE": 27087, + "▁beautifully": 27088, + "REL": 27089, + "▁Gian": 27090, + "▁Papa": 27091, + "respond": 27092, + "▁Caribbean": 27093, + "rn": 27094, + "▁худож": 27095, + "Cfg": 27096, + "rai": 27097, + 
"▁sniff": 27098, + "tto": 27099, + "ологи": 27100, + "▁rb": 27101, + "▁incidents": 27102, + "▁duck": 27103, + "▁PROVIDED": 27104, + "Sources": 27105, + "▁Chelsea": 27106, + "▁tek": 27107, + "▁налази": 27108, + "▁pilots": 27109, + "тки": 27110, + "▁traded": 27111, + "▁Beijing": 27112, + "▁Gregory": 27113, + "scalar": 27114, + "▁inclined": 27115, + "▁Kamp": 27116, + "▁Marian": 27117, + "▁fierce": 27118, + "▁theft": 27119, + "ющих": 27120, + "▁Into": 27121, + "constraint": 27122, + "parentNode": 27123, + "idental": 27124, + "▁gouvernement": 27125, + "▁SND": 27126, + "▁Ruby": 27127, + "▁monaster": 27128, + "Records": 27129, + "▁Kab": 27130, + "▁Universe": 27131, + "▁approximate": 27132, + "Water": 27133, + "▁Physical": 27134, + "appers": 27135, + "oubtedly": 27136, + "ложен": 27137, + "▁towel": 27138, + "▁siblings": 27139, + "eph": 27140, + "icios": 27141, + "рами": 27142, + "▁outrage": 27143, + "▁també": 27144, + "SRC": 27145, + "телем": 27146, + "Vi": 27147, + ".');": 27148, + "LM": 27149, + "▁mitt": 27150, + "▁weed": 27151, + "▁crops": 27152, + "iman": 27153, + "Claim": 27154, + "insula": 27155, + "▁(“": 27156, + "▁Changes": 27157, + "▁invånare": 27158, + "again": 27159, + "▁cnt": 27160, + "▁Gaz": 27161, + "▁austral": 27162, + "overlay": 27163, + "▁Mechan": 27164, + "▁slammed": 27165, + "▁trailing": 27166, + "▁Biography": 27167, + "▁appealing": 27168, + "IVER": 27169, + "▁Ave": 27170, + "▁Plot": 27171, + "voj": 27172, + "▁sung": 27173, + "▁unos": 27174, + "Effects": 27175, + "vv": 27176, + "cook": 27177, + "Buttons": 27178, + "▁transm": 27179, + "ierto": 27180, + "CONTEXT": 27181, + "▁dignity": 27182, + "aired": 27183, + "javax": 27184, + "▁Alberto": 27185, + "▁Recently": 27186, + "▁facial": 27187, + "mathop": 27188, + "ało": 27189, + "вид": 27190, + "cott": 27191, + "Variables": 27192, + "▁Ran": 27193, + "▁bunk": 27194, + "amiliar": 27195, + "CAST": 27196, + "▁frü": 27197, + "VED": 27198, + "▁NOTICE": 27199, + "▁turno": 27200, + "validator": 27201, + "▁Portuguese": 27202, + "▁questioning": 27203, + "}})": 27204, + "▁lear": 27205, + "Xamarin": 27206, + "▁disadv": 27207, + "encoded": 27208, + "▁Kot": 27209, + "rated": 27210, + "▁Theory": 27211, + "cius": 27212, + "▁Darwin": 27213, + "ђе": 27214, + "▁décl": 27215, + "▁область": 27216, + "рович": 27217, + "▁mobility": 27218, + "VF": 27219, + "▁хи": 27220, + "until": 27221, + "▁barriers": 27222, + "gif": 27223, + "▁Roh": 27224, + "▁aging": 27225, + "▁Widget": 27226, + "olk": 27227, + "▁farms": 27228, + "Checker": 27229, + "Introduction": 27230, + "смо": 27231, + "▁Russians": 27232, + "naments": 27233, + "▁Insert": 27234, + "▁Whenever": 27235, + "erset": 27236, + "itori": 27237, + "▁Dort": 27238, + "▁costume": 27239, + "▁mathematical": 27240, + "▁Bast": 27241, + "▁nominated": 27242, + "▁restoration": 27243, + "posal": 27244, + "▁unfortunate": 27245, + "Ps": 27246, + "LIN": 27247, + "▁intact": 27248, + "▁provoc": 27249, + "▁située": 27250, + "▁ноября": 27251, + "ermo": 27252, + "▁fisher": 27253, + "гля": 27254, + "▁conting": 27255, + "▁Doug": 27256, + "\"?": 27257, + "▁Eva": 27258, + "▁tops": 27259, + "▁Remote": 27260, + "▁artwork": 27261, + "▁artillery": 27262, + "quick": 27263, + "▁Arabia": 27264, + "▁SDValue": 27265, + "▁Dakota": 27266, + "iated": 27267, + "▁Optim": 27268, + "buttons": 27269, + "▁cottage": 27270, + "▁wherein": 27271, + "▁tutorial": 27272, + "▁Scre": 27273, + "▁sweep": 27274, + "▁Coffee": 27275, + "})}": 27276, + "▁музы": 27277, + "hostname": 27278, + "▁Temp": 27279, + "▁Fut": 27280, + "respect": 27281, + "ocz": 27282, + 
"▁predomin": 27283, + "Indicator": 27284, + "encial": 27285, + "UMENT": 27286, + "▁SHALL": 27287, + "▁commanded": 27288, + "▁withdrawal": 27289, + "iour": 27290, + "REGION": 27291, + "sprintf": 27292, + "▁вме": 27293, + "▁Payment": 27294, + "▁Anim": 27295, + "publish": 27296, + "▁seeks": 27297, + "ouw": 27298, + "▁GM": 27299, + "rugu": 27300, + "ustain": 27301, + "▁))": 27302, + "▁consulting": 27303, + "▁Dialog": 27304, + "▁Lars": 27305, + "▁critique": 27306, + "▁circulation": 27307, + "▁landsc": 27308, + "managed": 27309, + "▁Craft": 27310, + "▁herman": 27311, + "afi": 27312, + "amy": 27313, + "▁discour": 27314, + "<>(": 27315, + "▁Steph": 27316, + "▁tolerance": 27317, + "typename": 27318, + "ventions": 27319, + "ział": 27320, + "стов": 27321, + "▁sticking": 27322, + "ASC": 27323, + "ISO": 27324, + "▁Spencer": 27325, + "▁Didn": 27326, + "gomery": 27327, + "imiter": 27328, + "dru": 27329, + "Clause": 27330, + "▁slides": 27331, + "###": 27332, + "▁Sugar": 27333, + "HY": 27334, + "▁эти": 27335, + "▁Edwards": 27336, + "▁cents": 27337, + "oya": 27338, + "serts": 27339, + "▁Hass": 27340, + "▁ingen": 27341, + "стри": 27342, + "▁saddle": 27343, + "solid": 27344, + "▁champions": 27345, + "-)": 27346, + "▁Slov": 27347, + "▁shiny": 27348, + "▁*)&": 27349, + "▁Define": 27350, + "če": 27351, + "▁scrut": 27352, + "onden": 27353, + "'\",": 27354, + "uffs": 27355, + "▁olymp": 27356, + "idential": 27357, + "wand": 27358, + "▁annually": 27359, + "▁Arkansas": 27360, + "▁saint": 27361, + "▁gleich": 27362, + "▁perfection": 27363, + ")>": 27364, + "▁shorts": 27365, + "▁justified": 27366, + "peated": 27367, + "packages": 27368, + "driven": 27369, + "▁Liberty": 27370, + "▁stripped": 27371, + "шение": 27372, + "▁fünf": 27373, + "▁ecosystem": 27374, + "ixa": 27375, + "▁Fresh": 27376, + "vart": 27377, + "▁treats": 27378, + "▁stance": 27379, + "чёт": 27380, + "▁pity": 27381, + "adém": 27382, + "▁окон": 27383, + "▁Chand": 27384, + "rab": 27385, + "вший": 27386, + "inski": 27387, + "▁continually": 27388, + "▁Daddy": 27389, + "▁nightmare": 27390, + "icional": 27391, + "▁efect": 27392, + "ueblo": 27393, + "▁lanç": 27394, + "▁Collections": 27395, + "due": 27396, + "ampton": 27397, + "▁memcpy": 27398, + "▁**(": 27399, + "issent": 27400, + "▁Insp": 27401, + "▁Glasgow": 27402, + "▁furono": 27403, + "▁kindness": 27404, + "Bi": 27405, + "▁competed": 27406, + "▁oak": 27407, + "Large": 27408, + "▁disgu": 27409, + "▁kings": 27410, + "тами": 27411, + "▁stuffed": 27412, + "▁hilar": 27413, + "published": 27414, + "▁stressed": 27415, + "▁Peak": 27416, + "▁loader": 27417, + "Keyboard": 27418, + "▁reconstruction": 27419, + "▁vod": 27420, + "▁dun": 27421, + "▁understands": 27422, + "tenant": 27423, + "▁chaque": 27424, + "▁prejud": 27425, + "utat": 27426, + "▁uso": 27427, + "▁Heavy": 27428, + "▁cuatro": 27429, + "▁sidewalk": 27430, + "▁Bug": 27431, + "▁månaden": 27432, + "geo": 27433, + "▁united": 27434, + "▁Files": 27435, + "▁Аль": 27436, + "▁rugby": 27437, + "▁financing": 27438, + "▁comply": 27439, + "&#": 27440, + "▁rushing": 27441, + "▁fen": 27442, + "mong": 27443, + "▁spé": 27444, + "▁presenting": 27445, + "INCLUDING": 27446, + "ěl": 27447, + "zeichnung": 27448, + "Backup": 27449, + "▁petit": 27450, + "▁allerg": 27451, + "нут": 27452, + "▁worrying": 27453, + "▁mamm": 27454, + "▁operand": 27455, + ":%.*]]": 27456, + "▁realise": 27457, + "Commands": 27458, + "▁Bew": 27459, + "▁assumes": 27460, + "▁Covid": 27461, + "▁quand": 27462, + "tyard": 27463, + "▁Mono": 27464, + "linked": 27465, + "MARK": 27466, + "Esp": 27467, + "▁blessing": 
27468, + "▁eyebrows": 27469, + "▁NV": 27470, + "▁стру": 27471, + "▁modeling": 27472, + "▁greeted": 27473, + "Workspace": 27474, + "▁pedest": 27475, + "▁неза": 27476, + "lemagne": 27477, + "Statistics": 27478, + "▁aument": 27479, + "▁speeds": 27480, + "▁syndrome": 27481, + "CONNECT": 27482, + "zahl": 27483, + "verso": 27484, + "ército": 27485, + "▁astronom": 27486, + "▁aprile": 27487, + "žen": 27488, + "веро": 27489, + "draft": 27490, + "▁gioc": 27491, + "▁comport": 27492, + "▁variance": 27493, + "▁realizing": 27494, + "EDIT": 27495, + "олові": 27496, + "▁estar": 27497, + "▁sost": 27498, + "NORMAL": 27499, + "▁ó": 27500, + "▁Andr": 27501, + "ATTRIB": 27502, + "▁rede": 27503, + "▁toes": 27504, + "▁advances": 27505, + "▁Against": 27506, + "TOM": 27507, + "rss": 27508, + "MMMM": 27509, + "▁newest": 27510, + "▁VER": 27511, + "▁phrases": 27512, + "anter": 27513, + "Launch": 27514, + "▁chr": 27515, + "▁manufactured": 27516, + "$),": 27517, + "rollment": 27518, + "eston": 27519, + "▁peint": 27520, + "”)": 27521, + "endet": 27522, + "▁Hair": 27523, + "ivalent": 27524, + "▁upright": 27525, + "gren": 27526, + "anked": 27527, + "wright": 27528, + "▁mast": 27529, + "▁onChange": 27530, + "▁debris": 27531, + "▁grap": 27532, + "etry": 27533, + "▁(__": 27534, + "▁Commerce": 27535, + "BOX": 27536, + "Tax": 27537, + "▁отри": 27538, + "▁prevention": 27539, + "▁Feel": 27540, + "▁exotic": 27541, + "▁Bark": 27542, + "▁Steam": 27543, + "fon": 27544, + "olin": 27545, + "▁eliminated": 27546, + "▁bc": 27547, + "▁Cycl": 27548, + "▁$(\"#": 27549, + "▁Parl": 27550, + "manuel": 27551, + "ospher": 27552, + "WF": 27553, + "Analy": 27554, + "▁navig": 27555, + "▁renown": 27556, + "Rx": 27557, + "▁Walt": 27558, + "uffed": 27559, + "▁foster": 27560, + "$:": 27561, + "shore": 27562, + "Connector": 27563, + "фика": 27564, + "▁realization": 27565, + "Li": 27566, + "ctxt": 27567, + "ahoo": 27568, + "▁miracle": 27569, + "▁ET": 27570, + "▁GPS": 27571, + "▁Observable": 27572, + "▁hf": 27573, + "▁magnificent": 27574, + "него": 27575, + "BIN": 27576, + "▁Dorf": 27577, + "ieck": 27578, + "vee": 27579, + "▁Craw": 27580, + "/#": 27581, + "▁pci": 27582, + "ippet": 27583, + "▁Hillary": 27584, + "▁gir": 27585, + "▁rand": 27586, + "▁laying": 27587, + "▁Different": 27588, + "boys": 27589, + "virt": 27590, + "▁encryption": 27591, + "ász": 27592, + "пор": 27593, + "▁smelled": 27594, + "▁suscept": 27595, + "cluded": 27596, + "▁Carn": 27597, + "igten": 27598, + "▁Chuck": 27599, + "▁Provinc": 27600, + "▁perí": 27601, + "▁Marshal": 27602, + "мож": 27603, + "gfx": 27604, + "oshi": 27605, + "▁WHE": 27606, + "▁relaxation": 27607, + ",.": 27608, + "were": 27609, + "▁varieties": 27610, + "▁Won": 27611, + "▁gaps": 27612, + "▁stole": 27613, + "igua": 27614, + "ющие": 27615, + "▁Hampshire": 27616, + "phrase": 27617, + "▁película": 27618, + "Processing": 27619, + "▁initialization": 27620, + "oustic": 27621, + "▁Josef": 27622, + "icating": 27623, + "▁goodness": 27624, + "TES": 27625, + "▁cope": 27626, + "▁ignorance": 27627, + "▁Brist": 27628, + "▁paras": 27629, + "▁accidentally": 27630, + "▁tand": 27631, + "ittest": 27632, + "▁ули": 27633, + "▁shipped": 27634, + "▁ост": 27635, + "elseif": 27636, + "▁usize": 27637, + "horizontal": 27638, + "▁Carr": 27639, + "▁precip": 27640, + "roz": 27641, + "pathetic": 27642, + "rived": 27643, + "rok": 27644, + "▁digging": 27645, + "мом": 27646, + "▁Mull": 27647, + "▁XIII": 27648, + "▁peas": 27649, + "▁foul": 27650, + "▁travels": 27651, + "▁Ng": 27652, + "▁составе": 27653, + "Mont": 27654, + "arde": 27655, + "▁Stefan": 
27656, + "^^^^": 27657, + "▁Kiss": 27658, + "▁Ek": 27659, + "▁oktober": 27660, + "▁memorable": 27661, + "')).": 27662, + "▁Vision": 27663, + "▁Nina": 27664, + "▁Solar": 27665, + "▁highlighted": 27666, + "▁memo": 27667, + "meisterschaft": 27668, + "sidebar": 27669, + "SEE": 27670, + "▁Nevada": 27671, + "Da": 27672, + "▁drawer": 27673, + "astically": 27674, + "elde": 27675, + "scribed": 27676, + "▁priests": 27677, + "▁hommes": 27678, + "▁instructor": 27679, + "клад": 27680, + "▁spett": 27681, + "\\-": 27682, + "▁мира": 27683, + "▁Looks": 27684, + "▁sleeve": 27685, + "▁strongest": 27686, + "▁tête": 27687, + "▁Nicole": 27688, + "imper": 27689, + "нача": 27690, + "ipper": 27691, + "▁inwon": 27692, + "ilers": 27693, + "▁Deputy": 27694, + "oge": 27695, + "▁depressed": 27696, + "▁arte": 27697, + "▁combining": 27698, + "LAST": 27699, + "inted": 27700, + "▁Average": 27701, + "▁pollution": 27702, + "▁Phillips": 27703, + "▁WM": 27704, + "}}}\\": 27705, + "Added": 27706, + "▁peripher": 27707, + "Creation": 27708, + "▁italien": 27709, + "▁Choice": 27710, + "▁EXPRESS": 27711, + "▁Struct": 27712, + "ysz": 27713, + "Resize": 27714, + "ARGS": 27715, + "▁repo": 27716, + "▁чтобы": 27717, + "▁pref": 27718, + "▁earthqu": 27719, + "▁Мекси": 27720, + "▁Finale": 27721, + "▁hecho": 27722, + "requests": 27723, + "Cut": 27724, + "▁deserved": 27725, + "гово": 27726, + "▁Recent": 27727, + "▁дивизи": 27728, + "▁supportive": 27729, + "прави": 27730, + "▁irrelevant": 27731, + "'\r": 27732, + "▁ctrl": 27733, + "▁Deal": 27734, + "izada": 27735, + "uo": 27736, + "▁nort": 27737, + "geometry": 27738, + "▁Individual": 27739, + "ereg": 27740, + "▁приня": 27741, + "cref": 27742, + "══": 27743, + "▁comerc": 27744, + "=_": 27745, + "bund": 27746, + "тах": 27747, + "ilen": 27748, + "чита": 27749, + "▁corporation": 27750, + "esz": 27751, + "▁==>": 27752, + "ablish": 27753, + "Apr": 27754, + "▁ripped": 27755, + "Vars": 27756, + "stret": 27757, + "▁Francesco": 27758, + "NaN": 27759, + "▁anytime": 27760, + "▁automated": 27761, + "ostream": 27762, + "▁drawings": 27763, + "▁enhancement": 27764, + "okrat": 27765, + "▁Issue": 27766, + "вра": 27767, + "Currency": 27768, + "▁wyn": 27769, + "izarre": 27770, + "ético": 27771, + "multiple": 27772, + "▁Rate": 27773, + "▁Ich": 27774, + "▁Auss": 27775, + "▁Former": 27776, + "Curve": 27777, + "▁marvel": 27778, + "attro": 27779, + "▁сп": 27780, + "BOOL": 27781, + "сия": 27782, + "gold": 27783, + "▁Nintendo": 27784, + "▁Salvador": 27785, + "▁Solution": 27786, + "ADC": 27787, + "бора": 27788, + "▁Bennett": 27789, + "▁FR": 27790, + "▁pueden": 27791, + "patient": 27792, + "▁PG": 27793, + "▁Jin": 27794, + "▁crashed": 27795, + "▁denen": 27796, + "▁Sample": 27797, + "▁Quebec": 27798, + "itories": 27799, + "▁blinked": 27800, + "▁lion": 27801, + "▁voce": 27802, + "▁Impact": 27803, + "▁Mau": 27804, + "▁Nie": 27805, + "▁lob": 27806, + "▁две": 27807, + "orneys": 27808, + "▁coastal": 27809, + "▁sensors": 27810, + "▁XII": 27811, + "▁illusion": 27812, + "oji": 27813, + "▁INC": 27814, + "▁Duncan": 27815, + "yk": 27816, + "▁affecting": 27817, + "pul": 27818, + "▁Napoleon": 27819, + "▁акаде": 27820, + "▁compt": 27821, + "▁profitable": 27822, + "loe": 27823, + "▁deuxième": 27824, + "▁WC": 27825, + "▁viable": 27826, + "▁Drug": 27827, + "TextBox": 27828, + "▁luminos": 27829, + "auté": 27830, + "yc": 27831, + "ště": 27832, + "▁affiliates": 27833, + "ilda": 27834, + "conduct": 27835, + "▁ebenfalls": 27836, + "▁AMD": 27837, + "▁Monitor": 27838, + "▁Companies": 27839, + "▁corrected": 27840, + "äck": 27841, + "SYSTEM": 27842, 
+ "otherapy": 27843, + "▁перед": 27844, + "▁blues": 27845, + "atisf": 27846, + "although": 27847, + "rost": 27848, + "SCAN": 27849, + "▁RAM": 27850, + "ціональ": 27851, + "▁vendors": 27852, + "▁customs": 27853, + "▁activate": 27854, + "▁blogs": 27855, + "▁brace": 27856, + "▁strat": 27857, + "anje": 27858, + "щё": 27859, + "▁tide": 27860, + "▁Brigade": 27861, + "getOperand": 27862, + "▁aliment": 27863, + "▁achievements": 27864, + "▁suspicion": 27865, + "▁touchdown": 27866, + "broad": 27867, + "iore": 27868, + "Comparison": 27869, + "▁mum": 27870, + "English": 27871, + "▁Picture": 27872, + "▁Mouse": 27873, + "amd": 27874, + "▁[`": 27875, + "▁denomin": 27876, + "▁Aleks": 27877, + "▁prevents": 27878, + "ób": 27879, + "fed": 27880, + "▁Pray": 27881, + "▁shine": 27882, + "▁clutch": 27883, + "mux": 27884, + "Appro": 27885, + "▁notably": 27886, + "chio": 27887, + "nage": 27888, + "HAS": 27889, + "▁')": 27890, + "▁Miche": 27891, + "tg": 27892, + "::~": 27893, + "▁amely": 27894, + "▁rodz": 27895, + "zs": 27896, + "trait": 27897, + "▁klass": 27898, + "fö": 27899, + "▁destac": 27900, + "▁Clara": 27901, + "frequency": 27902, + "▁Git": 27903, + "▁поль": 27904, + "▁frequencies": 27905, + "▁febrero": 27906, + "▁stumbled": 27907, + "кою": 27908, + "▁Names": 27909, + "▁Flight": 27910, + "▁prey": 27911, + "▁medio": 27912, + "▁VAR": 27913, + "▁Float": 27914, + "▁Ernest": 27915, + "▁Marcatori": 27916, + "oport": 27917, + "▁cancellation": 27918, + "▁Bryan": 27919, + "————": 27920, + "Luc": 27921, + "▁libre": 27922, + "▁título": 27923, + "*>": 27924, + "▁Sandy": 27925, + "▁Marina": 27926, + "Been": 27927, + "▁wal": 27928, + "▁Kultur": 27929, + "▁explode": 27930, + "▁limiting": 27931, + "▁presumably": 27932, + "▁pb": 27933, + "▁Merc": 27934, + "▁реки": 27935, + "learning": 27936, + "Catalog": 27937, + "▁Census": 27938, + "lte": 27939, + "▁NET": 27940, + "raising": 27941, + "ське": 27942, + "staff": 27943, + "▁Quinn": 27944, + "▁memorial": 27945, + "пня": 27946, + "▁cuenta": 27947, + "▁XI": 27948, + "lbl": 27949, + "▁varies": 27950, + "▁fluctuations": 27951, + "▁долж": 27952, + "▁особи": 27953, + "▁warehouse": 27954, + "However": 27955, + "▁corrections": 27956, + "dhd": 27957, + "▁fals": 27958, + "▁controversy": 27959, + "▁curse": 27960, + "▁télé": 27961, + "řed": 27962, + "▁AU": 27963, + "▁тор": 27964, + "▁crít": 27965, + "idan": 27966, + "iliary": 27967, + "▁Panel": 27968, + "cule": 27969, + "▁Poor": 27970, + "▁BA": 27971, + "▁ignorant": 27972, + "èmes": 27973, + "▁aesthetic": 27974, + "Linked": 27975, + "getInt": 27976, + "Unicode": 27977, + "[@": 27978, + "▁Zent": 27979, + "Manifest": 27980, + "▁vars": 27981, + "PB": 27982, + "▁ву": 27983, + "▁Describe": 27984, + "▁Anything": 27985, + "oirs": 27986, + "▁socks": 27987, + "▁imped": 27988, + "▁neue": 27989, + "▁dispers": 27990, + "Collect": 27991, + "filer": 27992, + "▁Frau": 27993, + "▁Hockey": 27994, + "▁teens": 27995, + "▁Roberto": 27996, + "lauf": 27997, + "вать": 27998, + "▁ско": 27999, + "isArray": 28000, + "▁teenager": 28001, + "Built": 28002, + "▁loudly": 28003, + "Capacity": 28004, + "▁adventures": 28005, + "▁Molly": 28006, + "recogn": 28007, + "bars": 28008, + "▁Lor": 28009, + "▁può": 28010, + "▁mong": 28011, + "inement": 28012, + "Assignment": 28013, + "▁diz": 28014, + "lessness": 28015, + "▁Halloween": 28016, + "▁bitmap": 28017, + "Rom": 28018, + "нар": 28019, + "▁rebel": 28020, + "▁radial": 28021, + "measure": 28022, + "nit": 28023, + "▁Assume": 28024, + "▁assignments": 28025, + "▁Isn": 28026, + "▁altre": 28027, + "ßer": 28028, + "наль": 28029, + 
"▁flies": 28030, + "▁droit": 28031, + "▁thickness": 28032, + "▁enjo": 28033, + "▁dwell": 28034, + "▁homosexual": 28035, + "▁eval": 28036, + "$_{": 28037, + "asia": 28038, + "▁philos": 28039, + "getCurrent": 28040, + "▁veterans": 28041, + "▁Berkeley": 28042, + "▁wildlife": 28043, + "Cop": 28044, + "vern": 28045, + "▁Ú": 28046, + "tos": 28047, + "▁Led": 28048, + "▁keywords": 28049, + "▁medications": 28050, + "neum": 28051, + "▁jamais": 28052, + "▁Buc": 28053, + "▁PD": 28054, + "▁Statement": 28055, + "▁PI": 28056, + "▁Jackie": 28057, + "▁ordin": 28058, + "▁kör": 28059, + "enze": 28060, + "▁utilized": 28061, + "áct": 28062, + "azed": 28063, + "▁severely": 28064, + "▁även": 28065, + "▁libro": 28066, + "▁Eu": 28067, + "äst": 28068, + "PART": 28069, + "▁Butler": 28070, + "▁puzzle": 28071, + "Fall": 28072, + "Country": 28073, + "pfn": 28074, + "▁україн": 28075, + "▁Orchestra": 28076, + "▁alto": 28077, + "▁ancora": 28078, + "▁decomposition": 28079, + "▁م": 28080, + "▁appetite": 28081, + "adu": 28082, + "▁THAT": 28083, + "▁comenz": 28084, + "mina": 28085, + "▁initiated": 28086, + "▁Tat": 28087, + "▁sometime": 28088, + "rek": 28089, + "bread": 28090, + "▁Statistics": 28091, + "▁Cob": 28092, + "Follow": 28093, + "▁geometric": 28094, + "шла": 28095, + "▁proceedings": 28096, + "Dlg": 28097, + "seven": 28098, + "▁[-": 28099, + "▁Buffalo": 28100, + "▁blacks": 28101, + "▁sov": 28102, + "▁custody": 28103, + "▁ras": 28104, + "▁tattoo": 28105, + "öffentlicht": 28106, + "Blo": 28107, + "Austral": 28108, + "▁recuper": 28109, + "лев": 28110, + "▁bem": 28111, + "▁thou": 28112, + "oriented": 28113, + "vir": 28114, + "▁colony": 28115, + "▁Stanford": 28116, + "Absolute": 28117, + "adrat": 28118, + "▁Situ": 28119, + "▁souvent": 28120, + "EXEC": 28121, + "▁mű": 28122, + "▁apartments": 28123, + "▁случа": 28124, + "▁ano": 28125, + "WINDO": 28126, + "acci": 28127, + "▁Lau": 28128, + "court": 28129, + "▁manifold": 28130, + "▁coalition": 28131, + "▁XIV": 28132, + "Attrib": 28133, + "ascade": 28134, + "▁wheat": 28135, + "▁strengths": 28136, + "FREE": 28137, + "EMPTY": 28138, + "▁hey": 28139, + "ascular": 28140, + "▁plasma": 28141, + "▁bob": 28142, + "Separator": 28143, + "=\"${": 28144, + "▁Zag": 28145, + "▁projet": 28146, + "▁smoothly": 28147, + "SEQU": 28148, + "analy": 28149, + "attachment": 28150, + "▁ES": 28151, + "▁popped": 28152, + "ős": 28153, + "tom": 28154, + "▁són": 28155, + "▁rott": 28156, + "Utilities": 28157, + "hadoop": 28158, + "▁sotto": 28159, + "autor": 28160, + "▁Georges": 28161, + "▁který": 28162, + "▁gruppo": 28163, + "▁когда": 28164, + "▁меда": 28165, + "▁instrumental": 28166, + "▁Writer": 28167, + "▁setTimeout": 28168, + "ikk": 28169, + "▁Dopo": 28170, + "]);\r": 28171, + "▁practicing": 28172, + "▁Ronald": 28173, + "▁уби": 28174, + "▁agrees": 28175, + "▁denoted": 28176, + "ismiss": 28177, + "▁interviewed": 28178, + "templates": 28179, + "ři": 28180, + "administr": 28181, + "▁Butter": 28182, + "▁XVII": 28183, + "▁positioned": 28184, + "▁Fourth": 28185, + "▁overwhelmed": 28186, + "▁Regular": 28187, + "▁reprezent": 28188, + "кономи": 28189, + "▁expects": 28190, + "Indices": 28191, + "▁marijuana": 28192, + "▁zaj": 28193, + "▁Bren": 28194, + "▁begg": 28195, + "▁nahm": 28196, + "▁interrog": 28197, + "тие": 28198, + "▁Bun": 28199, + "▁серед": 28200, + "▁shelves": 28201, + "▁которых": 28202, + "▁Frauen": 28203, + "▁Sergeant": 28204, + "▁успе": 28205, + "matched": 28206, + "▁donne": 28207, + "▁touches": 28208, + "abort": 28209, + "▁vale": 28210, + "▁institutional": 28211, + "▁Mons": 28212, + "▁ambitious": 
28213, + "▁nonetheless": 28214, + "jd": 28215, + "пей": 28216, + "▁backpack": 28217, + "dao": 28218, + "вия": 28219, + "▁surroundings": 28220, + "|_{": 28221, + "▁gegründ": 28222, + "disp": 28223, + "▁moisture": 28224, + "▁wyd": 28225, + "▁traders": 28226, + "▁Erst": 28227, + "▁Galaxy": 28228, + "▁воло": 28229, + "▁Peru": 28230, + "▁priorities": 28231, + "▁pronounced": 28232, + "▁CBS": 28233, + "▁Palm": 28234, + "▁expans": 28235, + "▁energet": 28236, + "▁Condition": 28237, + "▁Sver": 28238, + "nested": 28239, + "▁февраля": 28240, + "hero": 28241, + "▁коло": 28242, + "▁Films": 28243, + "Bon": 28244, + "éal": 28245, + "ployed": 28246, + "trained": 28247, + "▁első": 28248, + "▁lust": 28249, + "atinum": 28250, + "oyle": 28251, + "▁Jet": 28252, + "ждения": 28253, + "▁surveys": 28254, + "bee": 28255, + "workers": 28256, + "records": 28257, + "calendar": 28258, + "bbing": 28259, + "regation": 28260, + "dashboard": 28261, + "King": 28262, + "▁vista": 28263, + "▁depicted": 28264, + "▁occurring": 28265, + "▁офи": 28266, + "▁sandwich": 28267, + "rcu": 28268, + "kern": 28269, + "▁minut": 28270, + "▁смер": 28271, + "▁td": 28272, + "solete": 28273, + "Complex": 28274, + "▁tunn": 28275, + "▁scarc": 28276, + "stead": 28277, + "▁Fail": 28278, + "▁Rs": 28279, + "▁trails": 28280, + "kem": 28281, + "▁Romans": 28282, + "ativity": 28283, + "Previous": 28284, + "▁depress": 28285, + "▁resigned": 28286, + "getDefault": 28287, + "▁Tibet": 28288, + "▁Franco": 28289, + "\")));": 28290, + "▁injection": 28291, + "removed": 28292, + "▁praised": 28293, + "▁Asc": 28294, + "erase": 28295, + "▁commissioned": 28296, + "MAIL": 28297, + "▁Boh": 28298, + "Poly": 28299, + "▁cinq": 28300, + "▁Above": 28301, + "▁Joshua": 28302, + "ZERO": 28303, + "▁summit": 28304, + "▁Urs": 28305, + "▁curl": 28306, + "▁visa": 28307, + "▁resur": 28308, + "={'": 28309, + "feat": 28310, + "▁absorb": 28311, + "▁planets": 28312, + "▁princess": 28313, + "▁Jahrhunderts": 28314, + "xp": 28315, + "▁NBC": 28316, + "▁коми": 28317, + "▁FUN": 28318, + "▁neuen": 28319, + "▁déjà": 28320, + "▁Oz": 28321, + "bben": 28322, + "VIDEO": 28323, + "▁ejempl": 28324, + "▁considers": 28325, + "atri": 28326, + "▁arrog": 28327, + "ioso": 28328, + "▁hace": 28329, + "▁contacted": 28330, + "▁unple": 28331, + "▁sponsored": 28332, + "▁trainer": 28333, + "sbi": 28334, + "▁занима": 28335, + "Criterion": 28336, + "ното": 28337, + "scheme": 28338, + "ennial": 28339, + "perform": 28340, + "▁fixing": 28341, + "▁постро": 28342, + "arb": 28343, + "EXIT": 28344, + "▁café": 28345, + "ituted": 28346, + "riages": 28347, + "Tur": 28348, + "▁haber": 28349, + "elasticsearch": 28350, + "▁ал": 28351, + "rh": 28352, + "▁voll": 28353, + "CLU": 28354, + "Mil": 28355, + "▁membres": 28356, + "▁remarked": 28357, + "вана": 28358, + "=\"_": 28359, + "Less": 28360, + "(\"\");": 28361, + "▁Yale": 28362, + "berries": 28363, + "▁releasing": 28364, + "▁imports": 28365, + "idea": 28366, + "▁(+": 28367, + "▁arqu": 28368, + "ificación": 28369, + "▁пара": 28370, + "▁Rangers": 28371, + "Mic": 28372, + "▁nederbörd": 28373, + "▁imaginary": 28374, + "▁specialists": 28375, + "▁hoof": 28376, + "Modules": 28377, + "▁sadly": 28378, + "ceil": 28379, + "TabIndex": 28380, + "ationale": 28381, + "▁Partner": 28382, + "tbody": 28383, + "▁leverage": 28384, + "DN": 28385, + "▁Prec": 28386, + "▁Sé": 28387, + "▁Mam": 28388, + "▁afin": 28389, + "isValid": 28390, + "Pse": 28391, + "▁сторо": 28392, + "▁chopped": 28393, + "▁Minor": 28394, + "▁dabei": 28395, + "David": 28396, + "ussia": 28397, + "▁деревня": 28398, + "▁Identity": 28399, 
+ "▁LGBT": 28400, + "ције": 28401, + "▁Orts": 28402, + "▁parti": 28403, + "▁Bachelor": 28404, + "uga": 28405, + "▁OPT": 28406, + "▁Seth": 28407, + "▁LIABLE": 28408, + "▁inaugur": 28409, + "▁Shanghai": 28410, + "▁relaxing": 28411, + "циона": 28412, + "\"%": 28413, + "▁obey": 28414, + "▁Airlines": 28415, + "Links": 28416, + "▁Celt": 28417, + "▁Admin": 28418, + "agation": 28419, + "▁worries": 28420, + "INTE": 28421, + "arith": 28422, + "Fatalf": 28423, + "]])": 28424, + "colm": 28425, + "▁archae": 28426, + "▁brushed": 28427, + "▁tät": 28428, + "▁structured": 28429, + "тии": 28430, + "▁homem": 28431, + "[:,": 28432, + "▁navy": 28433, + "getKey": 28434, + "powered": 28435, + "▁sucked": 28436, + "▁zomb": 28437, + "issant": 28438, + "▁Might": 28439, + "▁Pull": 28440, + "rir": 28441, + "▁пі": 28442, + "▁seas": 28443, + "▁Wrest": 28444, + "▁tense": 28445, + "▁atm": 28446, + "▁havet": 28447, + "▁pierws": 28448, + "▁tragic": 28449, + "▁Diff": 28450, + "▁confidential": 28451, + "successful": 28452, + "ęż": 28453, + "▁Chain": 28454, + "▁Kenya": 28455, + "Choice": 28456, + "ocur": 28457, + "aniu": 28458, + "▁consultant": 28459, + "▁Advis": 28460, + "Lif": 28461, + "▁Lors": 28462, + "avorite": 28463, + "▁utilizing": 28464, + "▁vintage": 28465, + "Matcher": 28466, + "▁membre": 28467, + "▁Expect": 28468, + "▁tracing": 28469, + "nog": 28470, + "▁dej": 28471, + "▁уче": 28472, + "▁loops": 28473, + "▁onclick": 28474, + "▁GPU": 28475, + "▁Albums": 28476, + "▁Archives": 28477, + "вата": 28478, + "▁stove": 28479, + "шли": 28480, + "ancies": 28481, + "▁gemeente": 28482, + "mob": 28483, + "PDF": 28484, + "eso": 28485, + "▁vég": 28486, + "Resolve": 28487, + "▁teaches": 28488, + "ложе": 28489, + "▁ство": 28490, + "▁Одна": 28491, + "▁fid": 28492, + "Something": 28493, + "▁nebo": 28494, + "▁Valentine": 28495, + "rowning": 28496, + "▁але": 28497, + "awi": 28498, + "ishi": 28499, + "▁SPI": 28500, + "▁spel": 28501, + "▁біль": 28502, + "▁participant": 28503, + "▁Ned": 28504, + "▁Gast": 28505, + "▁blond": 28506, + "▁saves": 28507, + "colored": 28508, + "▁ACTION": 28509, + "▁Politiker": 28510, + "}$)": 28511, + "▁Dum": 28512, + "dentry": 28513, + "Student": 28514, + "▁~=": 28515, + "loads": 28516, + "▁Foster": 28517, + "一个": 28518, + "▁PK": 28519, + "▁SB": 28520, + "▁Hern": 28521, + "▁Exhib": 28522, + "Listeners": 28523, + "Sun": 28524, + "plac": 28525, + "▁Bever": 28526, + "▁incluy": 28527, + "▁dc": 28528, + "argc": 28529, + "▁ged": 28530, + "спа": 28531, + "▁Formula": 28532, + "▁сем": 28533, + "▁empt": 28534, + "unregister": 28535, + "▁Queensland": 28536, + "ández": 28537, + "otive": 28538, + "▁alley": 28539, + "▁Democrat": 28540, + "▁travail": 28541, + "▁$,": 28542, + "RP": 28543, + "рое": 28544, + "personal": 28545, + "▁période": 28546, + "HOME": 28547, + "omes": 28548, + "▁recognised": 28549, + "heng": 28550, + "▁Jung": 28551, + "▁Roland": 28552, + "▁convicted": 28553, + "Locked": 28554, + "▁mari": 28555, + "▁Luxem": 28556, + "referto": 28557, + "Deleted": 28558, + "intent": 28559, + "▁Staats": 28560, + "▁області": 28561, + "ит": 28562, + "▁саве": 28563, + "▁Protocol": 28564, + "ając": 28565, + "chk": 28566, + "TypeInfo": 28567, + "▁pkt": 28568, + "▁scandal": 28569, + "▁individually": 28570, + "FMT": 28571, + "▁nj": 28572, + "abile": 28573, + "▁Rivers": 28574, + "PROPERTY": 28575, + "VB": 28576, + "wort": 28577, + "▁splitting": 28578, + "achten": 28579, + "▁ARISING": 28580, + "▁sip": 28581, + "▁fres": 28582, + "▁groom": 28583, + "Hol": 28584, + "▁canon": 28585, + "▁abruptly": 28586, + "▁afterward": 28587, + "▁Running": 
28588, + "▁ji": 28589, + "▁%,": 28590, + "▁Palestinian": 28591, + "RW": 28592, + "pgfscope": 28593, + "▁countryside": 28594, + "▁fortunate": 28595, + "▁cél": 28596, + "▁Pointer": 28597, + "ensors": 28598, + "rating": 28599, + "▁buffers": 28600, + "▁remot": 28601, + "▁PropTypes": 28602, + "▁Nah": 28603, + "altern": 28604, + "▁easiest": 28605, + "▁invas": 28606, + "▁clk": 28607, + "copyright": 28608, + "▁blanc": 28609, + "SAMP": 28610, + "▁Cohen": 28611, + "▁Shell": 28612, + "▁destroying": 28613, + "▁Zel": 28614, + "dater": 28615, + "čen": 28616, + "▁filing": 28617, + "▁integrate": 28618, + "xit": 28619, + "▁RET": 28620, + "lene": 28621, + "calls": 28622, + "▁slaughter": 28623, + "initialized": 28624, + "unches": 28625, + "▁Trace": 28626, + "efficient": 28627, + "▁Woods": 28628, + "▁longitud": 28629, + "GN": 28630, + "▁Kont": 28631, + "▁chunks": 28632, + "ách": 28633, + "▁unemployment": 28634, + "acom": 28635, + "▁slowed": 28636, + "▁outlined": 28637, + "xffff": 28638, + "▁ikke": 28639, + "▁workspace": 28640, + "Mc": 28641, + "▁kicking": 28642, + "▁embedding": 28643, + "chnitt": 28644, + "erten": 28645, + "▁Interior": 28646, + "▁Songs": 28647, + "mmc": 28648, + "▁analyzed": 28649, + "▁Coupe": 28650, + "▁favorites": 28651, + "▁tt": 28652, + "▁той": 28653, + "Routing": 28654, + "▁Silva": 28655, + "▁anderem": 28656, + "▁honom": 28657, + "▁использова": 28658, + ".\"]": 28659, + "▁Wu": 28660, + "legt": 28661, + "▁spoon": 28662, + "▁jap": 28663, + "▁Extension": 28664, + "erne": 28665, + "▁vagy": 28666, + "▁села": 28667, + "▁функ": 28668, + "▁analytics": 28669, + "▁sug": 28670, + "▁Async": 28671, + "▁peaks": 28672, + "▁Gym": 28673, + "▁lawsuit": 28674, + "<>": 28675, + "ialis": 28676, + "etric": 28677, + "faced": 28678, + "▁disrupt": 28679, + "▁få": 28680, + "Inputs": 28681, + "`);": 28682, + "▁Mend": 28683, + "gon": 28684, + "▁\",\"": 28685, + "▁nerves": 28686, + "▁doubts": 28687, + "sap": 28688, + "▁sow": 28689, + ",\\,\\": 28690, + "▁BS": 28691, + "▁Glad": 28692, + "▁aster": 28693, + "œuvre": 28694, + "▁Bangl": 28695, + "▁iPad": 28696, + "useppe": 28697, + "▁conducting": 28698, + "▁({\\": 28699, + "▁Harbor": 28700, + "psz": 28701, + "▁FIFA": 28702, + "_**": 28703, + "emor": 28704, + "▁": 28705, + "e": 28706, + "t": 28707, + "a": 28708, + "o": 28709, + "i": 28710, + "n": 28711, + "r": 28712, + "s": 28713, + "l": 28714, + "d": 28715, + "h": 28716, + "c": 28717, + "u": 28718, + "m": 28719, + "p": 28720, + "g": 28721, + "f": 28722, + ".": 28723, + "y": 28724, + ",": 28725, + "b": 28726, + "w": 28727, + "v": 28728, + "k": 28729, + "_": 28730, + ")": 28731, + "(": 28732, + "-": 28733, + "0": 28734, + "S": 28735, + "*": 28736, + "I": 28737, + "T": 28738, + "\"": 28739, + "1": 28740, + "A": 28741, + "'": 28742, + "C": 28743, + "x": 28744, + ";": 28745, + "=": 28746, + ":": 28747, + "/": 28748, + "E": 28749, + "2": 28750, + "{": 28751, + "}": 28752, + "P": 28753, + "R": 28754, + "M": 28755, + "\\": 28756, + "D": 28757, + "L": 28758, + "N": 28759, + "B": 28760, + "о": 28761, + "O": 28762, + "а": 28763, + "z": 28764, + "F": 28765, + "|": 28766, + ">": 28767, + "j": 28768, + "H": 28769, + "3": 28770, + "#": 28771, + "и": 28772, + "е": 28773, + "9": 28774, + "q": 28775, + "$": 28776, + "G": 28777, + "н": 28778, + "U": 28779, + "W": 28780, + "4": 28781, + "5": 28782, + "8": 28783, + "6": 28784, + "р": 28785, + "т": 28786, + "7": 28787, + "с": 28788, + "<": 28789, + "V": 28790, + "в": 28791, + "[": 28792, + "]": 28793, + "л": 28794, + "к": 28795, + "K": 28796, + "é": 28797, + "J": 28798, + "д": 28799, + "&": 
28800, + "\r": 28801, + "Y": 28802, + "м": 28803, + "?": 28804, + "у": 28805, + "+": 28806, + "п": 28807, + "!": 28808, + "’": 28809, + "г": 28810, + "я": 28811, + "з": 28812, + "і": 28813, + "X": 28814, + "^": 28815, + "–": 28816, + "б": 28817, + "@": 28818, + "й": 28819, + "á": 28820, + "—": 28821, + "ь": 28822, + "%": 28823, + "Q": 28824, + "ó": 28825, + "ч": 28826, + "í": 28827, + "Z": 28828, + "ы": 28829, + "ä": 28830, + "х": 28831, + "`": 28832, + "ц": 28833, + "ö": 28834, + "“": 28835, + "ж": 28836, + "ü": 28837, + "”": 28838, + "à": 28839, + "è": 28840, + "ш": 28841, + "ю": 28842, + "ł": 28843, + "С": 28844, + "~": 28845, + "ф": 28846, + "П": 28847, + "»": 28848, + "В": 28849, + "«": 28850, + "å": 28851, + "К": 28852, + "щ": 28853, + "·": 28854, + "ј": 28855, + "М": 28856, + "ç": 28857, + "А": 28858, + "Н": 28859, + "Р": 28860, + "Б": 28861, + "č": 28862, + "ú": 28863, + "ę": 28864, + "ã": 28865, + "ą": 28866, + "ă": 28867, + "Д": 28868, + "ї": 28869, + "ъ": 28870, + "ě": 28871, + "Г": 28872, + "š": 28873, + "О": 28874, + "Т": 28875, + "ê": 28876, + "ñ": 28877, + "…": 28878, + "ž": 28879, + "ß": 28880, + "ё": 28881, + "ż": 28882, + "ř": 28883, + "ś": 28884, + "Л": 28885, + "ő": 28886, + "„": 28887, + "э": 28888, + "ý": 28889, + "У": 28890, + "â": 28891, + "И": 28892, + "є": 28893, + "‘": 28894, + "î": 28895, + "З": 28896, + "Ф": 28897, + "ò": 28898, + "•": 28899, + "ć": 28900, + "É": 28901, + "°": 28902, + "ș": 28903, + "Х": 28904, + "ț": 28905, + "ô": 28906, + "Е": 28907, + "ń": 28908, + "Ч": 28909, + "Ш": 28910, + "ø": 28911, + "ù": 28912, + "ů": 28913, + "的": 28914, + "ا": 28915, + "æ": 28916, + "њ": 28917, + "љ": 28918, + "ë": 28919, + "ï": 28920, + "Э": 28921, + "£": 28922, + "−": 28923, + ",": 28924, + "õ": 28925, + "ћ": 28926, + "­": 28927, + "Ц": 28928, + "І": 28929, + "ā": 28930, + "ű": 28931, + "†": 28932, + "ل": 28933, + "ō": 28934, + "​": 28935, + "º": 28936, + "Я": 28937, + "′": 28938, + "Á": 28939, + "Ö": 28940, + "²": 28941, + "Ж": 28942, + "ì": 28943, + "。": 28944, + "数": 28945, + "×": 28946, + "ر": 28947, + "α": 28948, + "́": 28949, + "Ю": 28950, + "û": 28951, + "œ": 28952, + "ı": 28953, + "م": 28954, + "ن": 28955, + "ª": 28956, + "ź": 28957, + "ο": 28958, + "″": 28959, + "€": 28960, + "Ü": 28961, + "و": 28962, + "用": 28963, + "À": 28964, + "Č": 28965, + "Š": 28966, + "ت": 28967, + "د": 28968, + "一": 28969, + "¿": 28970, + "是": 28971, + "ي": 28972, + "ђ": 28973, + "®": 28974, + "ی": 28975, + "ν": 28976, + "đ": 28977, + "τ": 28978, + "─": 28979, + "ι": 28980, + "ε": 28981, + "→": 28982, + "ب": 28983, + "Å": 28984, + "ū": 28985, + "№": 28986, + "ş": 28987, + "不": 28988, + "џ": 28989, + "ー": 28990, + "中": 28991, + "Î": 28992, + "の": 28993, + ":": 28994, + "个": 28995, + "Й": 28996, + "ρ": 28997, + "有": 28998, + "Ä": 28999, + " ": 29000, + "ī": 29001, + "©": 29002, + "为": 29003, + "ه": 29004, + "י": 29005, + "ו": 29006, + "时": 29007, + "س": 29008, + "Ś": 29009, + "在": 29010, + "件": 29011, + "取": 29012, + "ς": 29013, + "™": 29014, + "이": 29015, + "σ": 29016, + "μ": 29017, + "定": 29018, + "文": 29019, + "据": 29020, + "置": 29021, + "Ž": 29022, + "±": 29023, + "表": 29024, + "成": 29025, + "ň": 29026, + "λ": 29027, + "¡": 29028, + "È": 29029, + "π": 29030, + "字": 29031, + "│": 29032, + "Ј": 29033, + "回": 29034, + "Є": 29035, + "到": 29036, + "行": 29037, + "§": 29038, + "½": 29039, + "ع": 29040, + "、": 29041, + "Ł": 29042, + "다": 29043, + "ン": 29044, + "κ": 29045, + "名": 29046, + "ה": 29047, + "入": 29048, + "η": 29049, + "大": 29050, + "对": 29051, + "可": 29052, + "Â": 29053, + 
"上": 29054, + "█": 29055, + "新": 29056, + "ف": 29057, + "加": 29058, + "要": 29059, + "Ż": 29060, + "下": 29061, + "分": 29062, + "值": 29063, + "ת": 29064, + "出": 29065, + "类": 29066, + "请": 29067, + "’": 29068, + "息": 29069, + "Ú": 29070, + "υ": 29071, + "获": 29072, + "示": 29073, + "以": 29074, + "ר": 29075, + "接": 29076, + "ל": 29077, + "を": 29078, + "存": 29079, + "信": 29080, + "设": 29081, + "方": 29082, + "ش": 29083, + "能": 29084, + "点": 29085, + "人": 29086, + "前": 29087, + "ğ": 29088, + "作": 29089, + "═": 29090, + "↘": 29091, + "ð": 29092, + "理": 29093, + "■": 29094, + "法": 29095, + "️": 29096, + "ˈ": 29097, + "果": 29098, + "发": 29099, + "ح": 29100, + "γ": 29101, + "ɵ": 29102, + "า": 29103, + "َ": 29104, + "了": 29105, + "户": 29106, + "Í": 29107, + "ə": 29108, + "ス": 29109, + "查": 29110, + "し": 29111, + "מ": 29112, + "单": 29113, + "ť": 29114, + "ق": 29115, + "る": 29116, + "间": 29117, + "如": 29118, + "本": 29119, + "后": 29120, + "ί": 29121, + "式": 29122, + "ト": 29123, + "Щ": 29124, + "Ó": 29125, + "す": 29126, + "א": 29127, + "生": 29128, + "动": 29129, + "ک": 29130, + "和": 29131, + "い": 29132, + "€": 29133, + "ა": 29134, + "가": 29135, + "하": 29136, + "�": 29137, + "小": 29138, + "返": 29139, + "否": 29140, + "ة": 29141, + "日": 29142, + "로": 29143, + "标": 29144, + "码": 29145, + "地": 29146, + "位": 29147, + "에": 29148, + " ": 29149, + "列": 29150, + "수": 29151, + "β": 29152, + "除": 29153, + "使": 29154, + "ש": 29155, + "ج": 29156, + "イ": 29157, + "δ": 29158, + "自": 29159, + "于": 29160, + "지": 29161, + "当": 29162, + "所": 29163, + "기": 29164, + "ი": 29165, + "ב": 29166, + "ร": 29167, + "★": 29168, + "子": 29169, + "号": 29170, + "ك": 29171, + "参": 29172, + "型": 29173, + "に": 29174, + "는": 29175, + "这": 29176, + "开": 29177, + "น": 29178, + "会": 29179, + "器": 29180, + "面": 29181, + "ル": 29182, + "图": 29183, + "度": 29184, + ")": 29185, + "(": 29186, + "의": 29187, + "内": 29188, + "을": 29189, + "最": 29190, + "”": 29191, + "化": 29192, + "建": 29193, + "니": 29194, + "量": 29195, + "😂": 29196, + "始": 29197, + "ē": 29198, + "خ": 29199, + "를": 29200, + "ά": 29201, + "过": 29202, + "³": 29203, + "´": 29204, + "组": 29205, + "功": 29206, + "‎": 29207, + "Ÿ": 29208, + "区": 29209, + "ز": 29210, + "ґ": 29211, + "ό": 29212, + "ッ": 29213, + "ω": 29214, + "Ç": 29215, + "选": 29216, + "通": 29217, + "结": 29218, + "录": 29219, + "改": 29220, + "ク": 29221, + "目": 29222, + "指": 29223, + "务": 29224, + "๐": 29225, + "输": 29226, + "た": 29227, + "อ": 29228, + "关": 29229, + "で": 29230, + "调": 29231, + "ा": 29232, + "정": 29233, + "合": 29234, + "已": 29235, + "시": 29236, + "部": 29237, + "页": 29238, + "━": 29239, + "ː": 29240, + "ま": 29241, + "我": 29242, + "求": 29243, + "市": 29244, + "次": 29245, + "נ": 29246, + "实": 29247, + "将": 29248, + "重": 29249, + "更": 29250, + "制": 29251, + "符": 29252, + "配": 29253, + "象": 29254, + "θ": 29255, + "ก": 29256, + "て": 29257, + "进": 29258, + "需": 29259, + "Đ": 29260, + "性": 29261, + "认": 29262, + "来": 29263, + "题": 29264, + "程": 29265, + "模": 29266, + "!": 29267, + "失": 29268, + "口": 29269, + "な": 29270, + "έ": 29271, + "": 29272, + "空": 29273, + "‍": 29274, + "期": 29275, + "者": 29276, + "は": 29277, + "Ђ": 29278, + "提": 29279, + "ή": 29280, + "ラ": 29281, + "한": 29282, + "态": 29283, + "复": 29284, + "ง": 29285, + "ე": 29286, + "Ø": 29287, + "리": 29288, + "修": 29289, + "‚": 29290, + "得": 29291, + "多": 29292, + "格": 29293, + "자": 29294, + "ע": 29295, + "่": 29296, + "函": 29297, + "应": 29298, + "↗": 29299, + "्": 29300, + "เ": 29301, + "正": 29302, + "注": 29303, + "스": 29304, + "서": 29305, + "リ": 29306, + "φ": 29307, 
+ "ص": 29308, + "が": 29309, + "则": 29310, + "消": 29311, + "节": 29312, + "序": 29313, + "代": 29314, + "사": 29315, + "と": 29316, + "ד": 29317, + "้": 29318, + "र": 29319, + "此": 29320, + "保": 29321, + "ア": 29322, + "ư": 29323, + "인": 29324, + "ė": 29325, + "处": 29326, + "删": 29327, + "ɛ": 29328, + "容": 29329, + "ط": 29330, + "“": 29331, + "之": 29332, + "包": 29333, + "状": 29334, + "ド": 29335, + "İ": 29336, + "体": 29337, + "同": 29338, + "事": 29339, + "🙂": 29340, + "タ": 29341, + "χ": 29342, + "ʿ": 29343, + "Ș": 29344, + "主": 29345, + "品": 29346, + "ק": 29347, + "询": 29348, + "创": 29349, + "该": 29350, + " ": 29351, + "元": 29352, + "第": 29353, + "天": 29354, + "或": 29355, + "年": 29356, + "转": 29357, + "ח": 29358, + "传": 29359, + "ţ": 29360, + "路": 29361, + "例": 29362, + "机": 29363, + "Ã": 29364, + "ď": 29365, + "高": 29366, + "相": 29367, + "โ": 29368, + "片": 29369, + "―": 29370, + "操": 29371, + "ա": 29372, + "ม": 29373, + "全": 29374, + "无": 29375, + "月": 29376, + "称": 29377, + "ั": 29378, + "就": 29379, + "™": 29380, + "明": 29381, + "计": 29382, + "你": 29383, + "败": 29384, + "密": 29385, + "解": 29386, + "れ": 29387, + "أ": 29388, + "变": 29389, + "段": 29390, + "条": 29391, + "默": 29392, + "●": 29393, + "ล": 29394, + "色": 29395, + "断": 29396, + "商": 29397, + "ם": 29398, + "か": 29399, + "里": 29400, + "系": 29401, + "编": 29402, + "错": 29403, + "트": 29404, + "只": 29405, + "县": 29406, + "ს": 29407, + "常": 29408, + "初": 29409, + "ɔ": 29410, + "Α": 29411, + "フ": 29412, + "►": 29413, + "等": 29414, + "일": 29415, + "・": 29416, + "Ō": 29417, + "情": 29418, + "现": 29419, + "Ř": 29420, + "ِ": 29421, + "さ": 29422, + "ạ": 29423, + "용": 29424, + "证": 29425, + "해": 29426, + "手": 29427, + "支": 29428, + "입": 29429, + "服": 29430, + "்": 29431, + "道": 29432, + "어": 29433, + "送": 29434, + "载": 29435, + "限": 29436, + "线": 29437, + "属": 29438, + "—": 29439, + "他": 29440, + "放": 29441, + "记": 29442, + "公": 29443, + "没": 29444, + "添": 29445, + "显": 29446, + "บ": 29447, + "ย": 29448, + "რ": 29449, + "其": 29450, + "集": 29451, + "金": 29452, + "国": 29453, + "任": 29454, + "ە": 29455, + "话": 29456, + "并": 29457, + "被": 29458, + "ύ": 29459, + "都": 29460, + "گ": 29461, + "意": 29462, + "כ": 29463, + "经": 29464, + "성": 29465, + "看": 29466, + "פ": 29467, + "址": 29468, + "ס": 29469, + "드": 29470, + "交": 29471, + "¼": 29472, + "Џ": 29473, + "完": 29474, + "Δ": 29475, + "义": 29476, + "보": 29477, + "向": 29478, + "换": 29479, + "山": 29480, + "算": 29481, + "二": 29482, + "پ": 29483, + "⁄": 29484, + "判": 29485, + "级": 29486, + "工": 29487, + "ด": 29488, + "⠀": 29489, + "家": 29490, + "レ": 29491, + "三": 29492, + "原": 29493, + "】": 29494, + "长": 29495, + "া": 29496, + "管": 29497, + "ѝ": 29498, + "क": 29499, + "学": 29500, + "ロ": 29501, + "验": 29502, + "写": 29503, + "Œ": 29504, + "从": 29505, + "【": 29506, + "收": 29507, + "ả": 29508, + "未": 29509, + "登": 29510, + "고": 29511, + "源": 29512, + "每": 29513, + "µ": 29514, + "误": 29515, + "り": 29516, + "요": 29517, + "按": 29518, + "ว": 29519, + "权": 29520, + "根": 29521, + "プ": 29522, + "串": 29523, + "ส": 29524, + "›": 29525, + "제": 29526, + "シ": 29527, + "Ş": 29528, + "确": 29529, + "好": 29530, + "统": 29531, + "效": 29532, + "网": 29533, + "\u0001": 29534, + "物": 29535, + "아": 29536, + "也": 29537, + "은": 29538, + "ệ": 29539, + "न": 29540, + "项": 29541, + "资": 29542, + "こ": 29543, + "引": 29544, + "ジ": 29545, + "ค": 29546, + "版": 29547, + "ท": 29548, + "平": 29549, + "们": 29550, + "与": 29551, + "き": 29552, + "移": 29553, + "ि": 29554, + "素": 29555, + "执": 29556, + "주": 29557, + "‐": 29558, + "Ґ": 29559, + "ี": 29560, + "板": 
29561, + "问": 29562, + "Ε": 29563, + "安": 29564, + "면": 29565, + "소": 29566, + "ต": 29567, + "ิ": 29568, + "持": 29569, + "습": 29570, + "Σ": 29571, + "ら": 29572, + "コ": 29573, + "心": 29574, + "Π": 29575, + "打": 29576, + "」": 29577, + "상": 29578, + "「": 29579, + "检": 29580, + "库": 29581, + "÷": 29582, + "으": 29583, + "测": 29584, + "ん": 29585, + "े": 29586, + "ُ": 29587, + "力": 29588, + "直": 29589, + "由": 29590, + "ى": 29591, + "试": 29592, + "必": 29593, + "端": 29594, + "ʻ": 29595, + "先": 29596, + "↑": 29597, + "命": 29598, + "도": 29599, + "전": 29600, + "ห": 29601, + "员": 29602, + "ɪ": 29603, + "있": 29604, + "比": 29605, + "ṣ": 29606, + "時": 29607, + "择": 29608, + "ذ": 29609, + "テ": 29610, + "‌": 29611, + "构": 29612, + "备": 29613, + "그": 29614, + "链": 29615, + "说": 29616, + "ლ": 29617, + "ן": 29618, + "签": 29619, + "う": 29620, + "غ": 29621, + "ế": 29622, + "ض": 29623, + "ḥ": 29624, + "启": 29625, + "력": 29626, + "ო": 29627, + "付": 29628, + "მ": 29629, + "索": 29630, + "特": 29631, + "ג": 29632, + "西": 29633, + "대": 29634, + "├": 29635, + "–": 29636, + "Ž": 29637, + "外": 29638, + "צ": 29639, + "头": 29640, + "连": 29641, + "流": 29642, + "◄": 29643, + "デ": 29644, + "カ": 29645, + "র": 29646, + "오": 29647, + "找": 29648, + "清": 29649, + "🤣": 29650, + "去": 29651, + "₹": 29652, + "경": 29653, + "グ": 29654, + "ْ": 29655, + "¢": 29656, + "因": 29657, + "": 29658, + "Κ": 29659, + "增": 29660, + "知": 29661, + "¶": 29662, + "像": 29663, + "♥": 29664, + "터": 29665, + "く": 29666, + "ậ": 29667, + "メ": 29668, + "Æ": 29669, + "省": 29670, + "स": 29671, + "म": 29672, + "❤": 29673, + "あ": 29674, + "样": 29675, + "起": 29676, + "台": 29677, + "读": 29678, + "角": 29679, + "南": 29680, + "整": 29681, + "订": 29682, + "\f": 29683, + "ט": 29684, + "マ": 29685, + "্": 29686, + "우": 29687, + "ն": 29688, + "您": 29689, + "ئ": 29690, + "基": 29691, + "水": 29692, + "생": 29693, + "‑": 29694, + "나": 29695, + "画": 29696, + "描": 29697, + "击": 29698, + "っ": 29699, + "라": 29700, + "ნ": 29701, + "ր": 29702, + "业": 29703, + "ბ": 29704, + "别": 29705, + "♦": 29706, + "ィ": 29707, + "त": 29708, + "给": 29709, + "문": 29710, + "形": 29711, + "控": 29712, + "然": 29713, + "동": 29714, + "Њ": 29715, + "⁠": 29716, + "东": 29717, + "ป": 29718, + "州": 29719, + "排": 29720, + "세": 29721, + "装": 29722, + "할": 29723, + "Ć": 29724, + "∞": 29725, + "海": 29726, + "城": 29727, + "键": 29728, + "径": 29729, + "호": 29730, + "화": 29731, + "្": 29732, + "料": 29733, + "ơ": 29734, + "ी": 29735, + "ウ": 29736, + "具": 29737, + "ブ": 29738, + "块": 29739, + "再": 29740, + "ố": 29741, + "电": 29742, + ";": 29743, + "위": 29744, + "两": 29745, + "而": 29746, + "장": 29747, + "آ": 29748, + "Ț": 29749, + "バ": 29750, + "还": 29751, + "令": 29752, + "キ": 29753, + "ّ": 29754, + "값": 29755, + "번": 29756, + "만": 29757, + "总": 29758, + "ल": 29759, + "▲": 29760, + "异": 29761, + "光": 29762, + "客": 29763, + "非": 29764, + "ị": 29765, + "": 29766, + "þ": 29767, + "設": 29768, + "述": 29769, + "합": 29770, + "?": 29771, + "✔": 29772, + "导": 29773, + "ṇ": 29774, + "부": 29775, + "˙": 29776, + "Τ": 29777, + "も": 29778, + "구": 29779, + "镇": 29780, + "작": 29781, + "░": 29782, + "步": 29783, + "ộ": 29784, + "活": 29785, + "พ": 29786, + "←": 29787, + "ǎ": 29788, + "จ": 29789, + "束": 29790, + "ـ": 29791, + "‘": 29792, + "那": 29793, + "प": 29794, + "エ": 29795, + "志": 29796, + "么": 29797, + "运": 29798, + "北": 29799, + "超": 29800, + "་": 29801, + "布": 29802, + "ώ": 29803, + "͡": 29804, + "少": 29805, + "파": 29806, + "ʃ": 29807, + "ム": 29808, + "•": 29809, + "卡": 29810, + "ন": 29811, + "Μ": 29812, + "ɑ": 29813, + "😉": 29814, + 
"辑": 29815, + "원": 29816, + "美": 29817, + "产": 29818, + "利": 29819, + "모": 29820, + "联": 29821, + "界": 29822, + "체": 29823, + "种": 29824, + "王": 29825, + "ľ": 29826, + "여": 29827, + "메": 29828, + "域": 29829, + "ვ": 29830, + "立": 29831, + "록": 29832, + "게": 29833, + "إ": 29834, + "ṭ": 29835, + "神": 29836, + "ո": 29837, + "音": 29838, + "☆": 29839, + "Ñ": 29840, + "조": 29841, + "動": 29842, + "缓": 29843, + "과": 29844, + "报": 29845, + "ʼ": 29846, + "ា": 29847, + "되": 29848, + "ե": 29849, + "视": 29850, + "ช": 29851, + "详": 29852, + "แ": 29853, + "¦": 29854, + "把": 29855, + "க": 29856, + "ি": 29857, + "출": 29858, + "비": 29859, + "边": 29860, + "框": 29861, + "व": 29862, + "サ": 29863, + "Ι": 29864, + "Ο": 29865, + "オ": 29866, + "¾": 29867, + "历": 29868, + "ŏ": 29869, + "门": 29870, + "ข": 29871, + "含": 29872, + "¬": 29873, + "周": 29874, + "填": 29875, + "待": 29876, + "ะ": 29877, + "დ": 29878, + "Ї": 29879, + "额": 29880, + "음": 29881, + "四": 29882, + "だ": 29883, + "회": 29884, + "止": 29885, + "率": 29886, + "环": 29887, + "パ": 29888, + "래": 29889, + "闭": 29890, + "̀": 29891, + "语": 29892, + "개": 29893, + "身": 29894, + "藏": 29895, + "य": 29896, + "된": 29897, + "即": 29898, + "拉": 29899, + "선": 29900, + "변": 29901, + "≥": 29902, + "ุ": 29903, + "些": 29904, + "🤷": 29905, + "せ": 29906, + "左": 29907, + "ợ": 29908, + "右": 29909, + "ể": 29910, + "내": 29911, + "ּ": 29912, + "ז": 29913, + "ে": 29914, + "告": 29915, + "ấ": 29916, + "白": 29917, + "账": 29918, + "费": 29919, + "江": 29920, + "み": 29921, + "‹": 29922, + "์": 29923, + "‡": 29924, + "造": 29925, + "但": 29926, + "十": 29927, + "它": 29928, + "ं": 29929, + "ŋ": 29930, + "ў": 29931, + "セ": 29932, + "女": 29933, + "⣿": 29934, + "ի": 29935, + "京": 29936, + "触": 29937, + "함": 29938, + "들": 29939, + "Ā": 29940, + "˜": 29941, + "石": 29942, + "よ": 29943, + "田": 29944, + "易": 29945, + "规": 29946, + "展": 29947, + "¯": 29948, + "做": 29949, + "星": 29950, + "უ": 29951, + "✓": 29952, + "თ": 29953, + "供": 29954, + "명": 29955, + "ξ": 29956, + "己": 29957, + "且": 29958, + "插": 29959, + "景": 29960, + "切": 29961, + "ไ": 29962, + "없": 29963, + "ョ": 29964, + "及": 29965, + "Ν": 29966, + "미": 29967, + "ث": 29968, + "데": 29969, + "价": 29970, + "乡": 29971, + "ह": 29972, + "チ": 29973, + "真": 29974, + "太": 29975, + "ู": 29976, + "ダ": 29977, + "局": 29978, + "♂": 29979, + "退": 29980, + "ு": 29981, + "ক": 29982, + "ி": 29983, + "何": 29984, + "😭": 29985, + "¥": 29986, + "": 29987, + "≈": 29988, + "司": 29989, + "层": 29990, + "실": 29991, + "站": 29992, + "首": 29993, + "款": 29994, + "រ": 29995, + "間": 29996, + "ָ": 29997, + "저": 29998, + "监": 29999, + "ァ": 30000, + "册": 30001, + "案": 30002, + "ो": 30003, + "反": 30004, + "听": 30005, + "族": 30006, + "析": 30007, + "ื": 30008, + "秒": 30009, + "공": 30010, + "œ": 30011, + "🚀": 30012, + "거": 30013, + "재": 30014, + "‚": 30015, + "場": 30016, + "广": 30017, + "播": 30018, + "║": 30019, + "⋅": 30020, + "技": 30021, + "贴": 30022, + "想": 30023, + "ʁ": 30024, + "ớ": 30025, + "ャ": 30026, + "중": 30027, + "》": 30028, + "速": 30029, + "频": 30030, + "队": 30031, + "ำ": 30032, + "け": 30033, + "ु": 30034, + "≤": 30035, + "↓": 30036, + "须": 30037, + "菜": 30038, + "̃": 30039, + "剪": 30040, + "버": 30041, + "ェ": 30042, + "Λ": 30043, + "细": 30044, + "選": 30045, + "द": 30046, + "¹": 30047, + "许": 30048, + "ầ": 30049, + "世": 30050, + "ュ": 30051, + "ء": 30052, + "‡": 30053, + "候": 30054, + "共": 30055, + "크": 30056, + "ธ": 30057, + "설": 30058, + "快": 30059, + "友": 30060, + "ְ": 30061, + "车": 30062, + "推": 30063, + "花": 30064, + "言": 30065, + "چ": 30066, + "至": 30067, + "開": 30068, 
+ "校": 30069, + "個": 30070, + "村": 30071, + "つ": 30072, + "▌": 30073, + "ப": 30074, + "결": 30075, + "ņ": 30076, + "优": 30077, + "ន": 30078, + "达": 30079, + "核": 30080, + "ナ": 30081, + "场": 30082, + "影": 30083, + "🏻": 30084, + "钮": 30085, + "ظ": 30086, + "Þ": 30087, + "▼": 30088, + "お": 30089, + "份": 30090, + "微": 30091, + "ờ": 30092, + "识": 30093, + "행": 30094, + "《": 30095, + "ใ": 30096, + "ọ": 30097, + "预": 30098, + "ব": 30099, + "த": 30100, + "": 30101, + "ų": 30102, + "마": 30103, + "않": 30104, + "ɡ": 30105, + "계": 30106, + "연": 30107, + "五": 30108, + "Ź": 30109, + "め": 30110, + "很": 30111, + "간": 30112, + "無": 30113, + "ប": 30114, + "社": 30115, + "Ê": 30116, + "书": 30117, + "顶": 30118, + "ტ": 30119, + "才": 30120, + "云": 30121, + "└": 30122, + "ζ": 30123, + "،": 30124, + "搜": 30125, + "신": 30126, + "유": 30127, + "‏": 30128, + "✅": 30129, + "⭐": 30130, + "照": 30131, + "短": 30132, + "川": 30133, + "後": 30134, + "范": 30135, + "民": 30136, + "治": 30137, + "章": 30138, + "ề": 30139, + "바": 30140, + "ә": 30141, + "⚭": 30142, + "河": 30143, + "论": 30144, + "え": 30145, + "Ω": 30146, + "√": 30147, + "Ă": 30148, + "Γ": 30149, + "坐": 30150, + "적": 30151, + "停": 30152, + "추": 30153, + "受": 30154, + "♀": 30155, + "ʾ": 30156, + "树": 30157, + "林": 30158, + "치": 30159, + "fi": 30160, + "▒": 30161, + "张": 30162, + "着": 30163, + "访": 30164, + "考": 30165, + "教": 30166, + "ग": 30167, + "准": 30168, + "印": 30169, + "精": 30170, + "窗": 30171, + "宝": 30172, + "ち": 30173, + "围": 30174, + "ַ": 30175, + "致": 30176, + "モ": 30177, + "때": 30178, + "随": 30179, + "储": 30180, + "况": 30181, + "邮": 30182, + "武": 30183, + "⛔": 30184, + "维": 30185, + "ү": 30186, + "跳": 30187, + "ब": 30188, + "投": 30189, + "ủ": 30190, + "표": 30191, + "반": 30192, + "英": 30193, + "ʰ": 30194, + "👍": 30195, + "ज": 30196, + "带": 30197, + "為": 30198, + "续": 30199, + "ɨ": 30200, + "처": 30201, + "₂": 30202, + "클": 30203, + "群": 30204, + "현": 30205, + "风": 30206, + "购": 30207, + "ក": 30208, + "老": 30209, + "留": 30210, + "球": 30211, + "프": 30212, + "▄": 30213, + "史": 30214, + "Љ": 30215, + "⟩": 30216, + "분": 30217, + "გ": 30218, + "店": 30219, + "审": 30220, + "료": 30221, + "목": 30222, + "略": 30223, + "관": 30224, + "ִ": 30225, + "科": 30226, + "货": 30227, + "ம": 30228, + "络": 30229, + "阳": 30230, + "Ḥ": 30231, + "資": 30232, + "若": 30233, + "স": 30234, + "ہ": 30235, + "宽": 30236, + "见": 30237, + "ズ": 30238, + "游": 30239, + "방": 30240, + "ồ": 30241, + "ɾ": 30242, + "열": 30243, + "러": 30244, + "ך": 30245, + "\u001b": 30246, + "်": 30247, + "余": 30248, + "响": 30249, + "缩": 30250, + "ட": 30251, + "评": 30252, + "允": 30253, + "离": 30254, + "🤔": 30255, + "Ё": 30256, + "ʊ": 30257, + "黑": 30258, + "马": 30259, + "⟨": 30260, + "値": 30261, + "箱": 30262, + "야": 30263, + "ម": 30264, + "Ő": 30265, + "感": 30266, + "ツ": 30267, + "ụ": 30268, + "ポ": 30269, + "확": 30270, + "声": 30271, + "战": 30272, + "ѕ": 30273, + "変": 30274, + "와": 30275, + "父": 30276, + "ベ": 30277, + "助": 30278, + "업": 30279, + "ʲ": 30280, + "ÿ": 30281, + "充": 30282, + "强": 30283, + "博": 30284, + "ミ": 30285, + "销": 30286, + "당": 30287, + "記": 30288, + "什": 30289, + "匹": 30290, + "ւ": 30291, + "そ": 30292, + "코": 30293, + "ল": 30294, + "ŭ": 30295, + "午": 30296, + "ニ": 30297, + "\u0012": 30298, + "ʒ": 30299, + "შ": 30300, + "某": 30301, + "ォ": 30302, + "足": 30303, + "타": 30304, + "Ð": 30305, + "ხ": 30306, + "름": 30307, + "木": 30308, + "楼": 30309, + "최": 30310, + "红": 30311, + "¨": 30312, + "古": 30313, + "\u0006": 30314, + "단": 30315, + "今": 30316, + "ʔ": 30317, + "ट": 30318, + "ম": 30319, + "斯": 30320, + "語": 
30321, + "Ÿ": 30322, + "🙄": 30323, + "牌": 30324, + "안": 30325, + "ស": 30326, + "颜": 30327, + "~": 30328, + "克": 30329, + "深": 30330, + "금": 30331, + "會": 30332, + "尔": 30333, + "释": 30334, + "批": 30335, + "산": 30336, + "野": 30337, + "防": 30338, + "Η": 30339, + "ө": 30340, + "ψ": 30341, + "ボ": 30342, + "š": 30343, + "各": 30344, + "진": 30345, + "追": 30346, + "句": 30347, + "警": 30348, + "Φ": 30349, + "ѣ": 30350, + "ḍ": 30351, + "词": 30352, + "男": 30353, + "글": 30354, + "식": 30355, + "隐": 30356, + "복": 30357, + "盘": 30358, + "Ì": 30359, + "申": 30360, + "议": 30361, + "ザ": 30362, + "近": 30363, + "능": 30364, + "য": 30365, + "東": 30366, + "這": 30367, + "ர": 30368, + "距": 30369, + "院": 30370, + "德": 30371, + "ǐ": 30372, + "针": 30373, + "▀": 30374, + "↔": 30375, + "房": 30376, + "青": 30377, + "政": 30378, + "😅": 30379, + "递": 30380, + "প": 30381, + "波": 30382, + "ソ": 30383, + "绑": 30384, + "ビ": 30385, + "ễ": 30386, + "포": 30387, + "\u0010": 30388, + "ử": 30389, + "등": 30390, + "환": 30391, + "士": 30392, + "ত": 30393, + "Θ": 30394, + "초": 30395, + "境": 30396, + "差": 30397, + "采": 30398, + "디": 30399, + "ĩ": 30400, + "升": 30401, + "背": 30402, + "배": 30403, + "龙": 30404, + "街": 30405, + "್": 30406, + "ṛ": 30407, + "ু": 30408, + "弹": 30409, + "魔": 30410, + "객": 30411, + "‰": 30412, + "⌁": 30413, + "ἐ": 30414, + "禁": 30415, + "ผ": 30416, + "қ": 30417, + "島": 30418, + "ா": 30419, + "♭": 30420, + "百": 30421, + "ứ": 30422, + "ネ": 30423, + "专": 30424, + "來": 30425, + "刷": 30426, + "필": 30427, + "յ": 30428, + "ắ": 30429, + "华": 30430, + "Β": 30431, + "श": 30432, + "¸": 30433, + "屏": 30434, + "死": 30435, + "遍": 30436, + "검": 30437, + "Χ": 30438, + "것": 30439, + "八": 30440, + "览": 30441, + "택": 30442, + "唯": 30443, + "∙": 30444, + "¤": 30445, + "페": 30446, + "让": 30447, + "锁": 30448, + "무": 30449, + "思": 30450, + "隔": 30451, + "Ô": 30452, + "\u0013": 30453, + "ṃ": 30454, + "ワ": 30455, + "低": 30456, + "션": 30457, + "半": 30458, + "较": 30459, + "ត": 30460, + "享": 30461, + "积": 30462, + "ˆ": 30463, + "😊": 30464, + "典": 30465, + "ǔ": 30466, + "六": 30467, + "便": 30468, + "ɐ": 30469, + "简": 30470, + "继": 30471, + "仅": 30472, + "尾": 30473, + "‹": 30474, + "வ": 30475, + "կ": 30476, + "ƒ": 30477, + "영": 30478, + "火": 30479, + "湖": 30480, + "書": 30481, + "발": 30482, + "ハ": 30483, + "循": 30484, + "术": 30485, + "結": 30486, + "ļ": 30487, + "乐": 30488, + "滤": 30489, + "종": 30490, + "ถ": 30491, + "ὶ": 30492, + "满": 30493, + "╝": 30494, + "わ": 30495, + "ど": 30496, + "็": 30497, + "형": 30498, + "國": 30499, + "ự": 30500, + "線": 30501, + "블": 30502, + "封": 30503, + "確": 30504, + "依": 30505, + "ս": 30506, + "永": 30507, + "색": 30508, + "歌": 30509, + "數": 30510, + "福": 30511, + "삭": 30512, + "実": 30513, + "레": 30514, + "ſ": 30515, + "千": 30516, + "\u000e": 30517, + "母": 30518, + "더": 30519, + "임": 30520, + "տ": 30521, + "ے": 30522, + "几": 30523, + "双": 30524, + "노": 30525, + "ณ": 30526, + "掉": 30527, + "Ρ": 30528, + "ἀ": 30529, + "標": 30530, + "長": 30531, + "档": 30532, + "태": 30533, + "ペ": 30534, + "본": 30535, + "Œ": 30536, + "底": 30537, + "终": 30538, + "請": 30539, + "კ": 30540, + "̯": 30541, + "예": 30542, + "▬": 30543, + "報": 30544, + "ピ": 30545, + "๏": 30546, + "暂": 30547, + "李": 30548, + "Υ": 30549, + "\u0005": 30550, + "\u0002": 30551, + "替": 30552, + "운": 30553, + "射": 30554, + "\u0018": 30555, + "매": 30556, + "\u0011": 30557, + "🏼": 30558, + "票": 30559, + "附": 30560, + "ノ": 30561, + "ũ": 30562, + "压": 30563, + "阿": 30564, + "Ò": 30565, + "테": 30566, + "∼": 30567, + "万": 30568, + "մ": 30569, + "후": 30570, + "普": 30571, + "截": 
30572, + "속": 30573, + "括": 30574, + "😀": 30575, + "ை": 30576, + "▶": 30577, + "까": 30578, + "ট": 30579, + "曲": 30580, + "师": 30581, + "钱": 30582, + "栏": 30583, + "Ы": 30584, + "走": 30585, + "ữ": 30586, + "‬": 30587, + "归": 30588, + "점": 30589, + "🔥": 30590, + "었": 30591, + "連": 30592, + "私": 30593, + "청": 30594, + "刘": 30595, + "免": 30596, + "": 30597, + "奖": 30598, + "見": 30599, + "ֹ": 30600, + "☺": 30601, + "ケ": 30602, + "역": 30603, + "际": 30604, + "받": 30605, + "望": 30606, + "帝": 30607, + "减": 30608, + "두": 30609, + "领": 30610, + "„": 30611, + "钟": 30612, + "ガ": 30613, + "架": 30614, + "든": 30615, + "ல": 30616, + "松": 30617, + "□": 30618, + "越": 30619, + "答": 30620, + "ɕ": 30621, + "ῦ": 30622, + "染": 30623, + "": 30624, + "质": 30625, + "顺": 30626, + "气": 30627, + "╗": 30628, + "計": 30629, + "ქ": 30630, + "亮": 30631, + "🤦": 30632, + "̂": 30633, + "ٹ": 30634, + "座": 30635, + "ˌ": 30636, + "均": 30637, + "\u000b": 30638, + "官": 30639, + "适": 30640, + "护": 30641, + "久": 30642, + "春": 30643, + "曹": 30644, + "皇": 30645, + "脚": 30646, + "池": 30647, + "延": 30648, + "키": 30649, + "품": 30650, + "現": 30651, + "檔": 30652, + "ば": 30653, + "ⴰ": 30654, + "希": 30655, + "玩": 30656, + "固": 30657, + "黄": 30658, + "": 30659, + "☽": 30660, + "银": 30661, + "\u0003": 30662, + "┃": 30663, + "👏": 30664, + "불": 30665, + "攻": 30666, + "へ": 30667, + "决": 30668, + "⊙": 30669, + "宁": 30670, + "च": 30671, + "機": 30672, + "義": 30673, + "ɲ": 30674, + "\u0015": 30675, + "했": 30676, + "ẩ": 30677, + "愛": 30678, + "矩": 30679, + "패": 30680, + "ặ": 30681, + "郎": 30682, + "Ь": 30683, + "绘": 30684, + "负": 30685, + "ổ": 30686, + "ய": 30687, + "汉": 30688, + "編": 30689, + "ێ": 30690, + "്": 30691, + "じ": 30692, + "카": 30693, + "似": 30694, + "ں": 30695, + "や": 30696, + "認": 30697, + "\u000f": 30698, + "過": 30699, + "통": 30700, + "▪": 30701, + "约": 30702, + "香": 30703, + "买": 30704, + "住": 30705, + "╚": 30706, + "😁": 30707, + "扩": 30708, + "静": 30709, + "려": 30710, + "학": 30711, + "钥": 30712, + "증": 30713, + "ỉ": 30714, + "她": 30715, + "食": 30716, + "往": 30717, + "點": 30718, + "偏": 30719, + "康": 30720, + "\u0014": 30721, + "į": 30722, + "준": 30723, + "\u0004": 30724, + "ฟ": 30725, + "♣": 30726, + "戏": 30727, + "ʂ": 30728, + "井": 30729, + "军": 30730, + "爱": 30731, + "ٱ": 30732, + "七": 30733, + "차": 30734, + "币": 30735, + "♠": 30736, + "哈": 30737, + "阅": 30738, + "介": 30739, + "观": 30740, + "區": 30741, + "˜": 30742, + "ً": 30743, + "又": 30744, + "冲": 30745, + "朝": 30746, + "姓": 30747, + "课": 30748, + "龍": 30749, + "각": 30750, + "∈": 30751, + "米": 30752, + "ƒ": 30753, + "喜": 30754, + "夜": 30755, + "团": 30756, + "⇒": 30757, + "远": 30758, + "\u001a": 30759, + "ὐ": 30760, + "承": 30761, + "ಿ": 30762, + "室": 30763, + "ʀ": 30764, + "ង": 30765, + "अ": 30766, + "罗": 30767, + "🙏": 30768, + "软": 30769, + "🟡": 30770, + "건": 30771, + "؟": 30772, + "း": 30773, + "ᴇ": 30774, + "ユ": 30775, + "토": 30776, + "策": 30777, + "̄": 30778, + "국": 30779, + "ֶ": 30780, + "协": 30781, + "营": 30782, + "関": 30783, + "吉": 30784, + "💀": 30785, + "奇": 30786, + "滚": 30787, + "轴": 30788, + "処": 30789, + "土": 30790, + "划": 30791, + "ड": 30792, + "临": 30793, + "ֵ": 30794, + "航": 30795, + "浏": 30796, + "ゴ": 30797, + "別": 30798, + "寺": 30799, + "於": 30800, + "進": 30801, + "ὸ": 30802, + "風": 30803, + "ன": 30804, + "班": 30805, + "◼": 30806, + "九": 30807, + "̥": 30808, + "號": 30809, + "류": 30810, + "础": 30811, + "般": 30812, + "︙": 30813, + "̈": 30814, + "番": 30815, + "✨": 30816, + "😎": 30817, + "ো": 30818, + "😍": 30819, + "單": 30820, + "帧": 30821, + "授": 30822, + "赋": 
30823, + "巴": 30824, + "占": 30825, + "假": 30826, + "ṅ": 30827, + "透": 30828, + "項": 30829, + "ħ": 30830, + "馬": 30831, + "🟢": 30832, + "Ľ": 30833, + "լ": 30834, + "券": 30835, + "같": 30836, + "類": 30837, + "對": 30838, + "월": 30839, + "激": 30840, + "\u0017": 30841, + "戦": 30842, + "独": 30843, + "訊": 30844, + "ិ": 30845, + "套": 30846, + "ʷ": 30847, + "跟": 30848, + "ở": 30849, + "渲": 30850, + "顯": 30851, + "降": 30852, + "ာ": 30853, + "尼": 30854, + "血": 30855, + "언": 30856, + "牛": 30857, + "將": 30858, + "ศ": 30859, + "拍": 30860, + "刻": 30861, + "ზ": 30862, + "╔": 30863, + "藤": 30864, + "్": 30865, + "ῶ": 30866, + "🟠": 30867, + "良": 30868, + "김": 30869, + "দ": 30870, + "Ṣ": 30871, + "録": 30872, + "伊": 30873, + "落": 30874, + "雄": 30875, + "雪": 30876, + "映": 30877, + "著": 30878, + "른": 30879, + "ფ": 30880, + "対": 30881, + "智": 30882, + "译": 30883, + "┬": 30884, + "抽": 30885, + "ῖ": 30886, + "酒": 30887, + "Ћ": 30888, + "股": 30889, + "់": 30890, + "순": 30891, + "직": 30892, + "भ": 30893, + "谷": 30894, + "물": 30895, + "ǒ": 30896, + "⠄": 30897, + "热": 30898, + "終": 30899, + "夹": 30900, + "干": 30901, + "彩": 30902, + "敗": 30903, + "ќ": 30904, + "♯": 30905, + "̣": 30906, + "վ": 30907, + "轮": 30908, + "阵": 30909, + "夏": 30910, + "幕": 30911, + "吧": 30912, + "港": 30913, + "益": 30914, + "儿": 30915, + "액": 30916, + "售": 30917, + "兵": 30918, + "惠": 30919, + "欢": 30920, + "›": 30921, + "零": 30922, + "學": 30923, + "ž": 30924, + "員": 30925, + "ỗ": 30926, + "玉": 30927, + "逻": 30928, + "᥀": 30929, + "吗": 30930, + "沒": 30931, + "≠": 30932, + "너": 30933, + "ச": 30934, + "\u0016": 30935, + "夫": 30936, + "წ": 30937, + "堂": 30938, + "電": 30939, + "≡": 30940, + "陆": 30941, + "져": 30942, + "研": 30943, + "荐": 30944, + "健": 30945, + "碼": 30946, + "练": 30947, + "検": 30948, + "송": 30949, + "ै": 30950, + "哪": 30951, + "圆": 30952, + "Ա": 30953, + "↩": 30954, + "托": 30955, + "̪": 30956, + "ू": 30957, + "缀": 30958, + "네": 30959, + "沙": 30960, + "兴": 30961, + "病": 30962, + "\u0007": 30963, + "ល": 30964, + "ừ": 30965, + "Ἀ": 30966, + "강": 30967, + "항": 30968, + "\u0019": 30969, + "換": 30970, + "温": 30971, + "帖": 30972, + "ទ": 30973, + "込": 30974, + "削": 30975, + "알": 30976, + "征": 30977, + "习": 30978, + "법": 30979, + "栈": 30980, + "绝": 30981, + "": 30982, + "ڕ": 30983, + "圖": 30984, + "苏": 30985, + "発": 30986, + "ု": 30987, + "町": 30988, + "互": 30989, + "়": 30990, + "ც": 30991, + "守": 30992, + "새": 30993, + "侧": 30994, + "草": 30995, + "ས": 30996, + "扫": 30997, + "‒": 30998, + "恢": 30999, + "ң": 31000, + "ण": 31001, + "ற": 31002, + "째": 31003, + "්": 31004, + "拟": 31005, + "派": 31006, + "🏽": 31007, + "呼": 31008, + "Š": 31009, + "演": 31010, + "究": 31011, + "교": 31012, + "ɣ": 31013, + "ए": 31014, + "ី": 31015, + "ף": 31016, + "富": 31017, + "駅": 31018, + "ず": 31019, + "♪": 31020, + "😆": 31021, + "접": 31022, + "ғ": 31023, + "▓": 31024, + "존": 31025, + "ಾ": 31026, + "旋": 31027, + "ゃ": 31028, + "补": 31029, + "ץ": 31030, + "門": 31031, + "ច": 31032, + "날": 31033, + "ภ": 31034, + "ག": 31035, + "傳": 31036, + "∆": 31037, + "†": 31038, + "ׁ": 31039, + "缺": 31040, + "頭": 31041, + "怪": 31042, + "組": 31043, + "별": 31044, + "Ъ": 31045, + "發": 31046, + "雷": 31047, + "ರ": 31048, + "ซ": 31049, + "び": 31050, + "翻": 31051, + "ھ": 31052, + "პ": 31053, + "題": 31054, + "居": 31055, + "집": 31056, + "🌍": 31057, + "˚": 31058, + "避": 31059, + "줄": 31060, + "ុ": 31061, + "滑": 31062, + "故": 31063, + "ญ": 31064, + "〜": 31065, + "ನ": 31066, + "양": 31067, + "완": 31068, + "ள": 31069, + "倍": 31070, + "宗": 31071, + "択": 31072, + "브": 31073, + "ɴ": 31074, + "効": 31075, 
+ "尺": 31076, + "視": 31077, + "ẽ": 31078, + "覆": 31079, + "ध": 31080, + "骨": 31081, + "달": 31082, + "ᴛ": 31083, + "蓝": 31084, + "關": 31085, + "額": 31086, + "Õ": 31087, + "∗": 31088, + "卷": 31089, + "갑": 31090, + "르": 31091, + "众": 31092, + "ᴀ": 31093, + "態": 31094, + "ٰ": 31095, + "暗": 31096, + "君": 31097, + "錯": 31098, + "ɒ": 31099, + "យ": 31100, + "ḫ": 31101, + "ῆ": 31102, + "亚": 31103, + "♡": 31104, + "割": 31105, + "鼠": 31106, + "̶": 31107, + "Ë": 31108, + "読": 31109, + "격": 31110, + "ゲ": 31111, + "眼": 31112, + "Ý": 31113, + "ژ": 31114, + "雨": 31115, + "宮": 31116, + "쪽": 31117, + "ष": 31118, + "複": 31119, + "剩": 31120, + "早": 31121, + "杂": 31122, + "焦": 31123, + "贝": 31124, + "突": 31125, + "워": 31126, + "另": 31127, + "摄": 31128, + "\b": 31129, + "‭": 31130, + "府": 31131, + "외": 31132, + "盖": 31133, + "\u001c": 31134, + "ษ": 31135, + "佛": 31136, + "概": 31137, + "與": 31138, + "經": 31139, + "-": 31140, + "һ": 31141, + "問": 31142, + "ು": 31143, + "ἰ": 31144, + "話": 31145, + "倒": 31146, + "葛": 31147, + "べ": 31148, + "ろ": 31149, + "\u001e": 31150, + "।": 31151, + "ေ": 31152, + "ᴏ": 31153, + "训": 31154, + "體": 31155, + "👌": 31156, + "內": 31157, + "က": 31158, + "企": 31159, + "약": 31160, + "찾": 31161, + "ོ": 31162, + "破": 31163, + "輸": 31164, + "림": 31165, + "塔": 31166, + "턴": 31167, + "杀": 31168, + "』": 31169, + "味": 31170, + "浮": 31171, + "┆": 31172, + "ġ": 31173, + "郡": 31174, + "┐": 31175, + "『": 31176, + "阶": 31177, + "雅": 31178, + "┈": 31179, + "园": 31180, + ".": 31181, + "吃": 31182, + "남": 31183, + " ": 31184, + "ར": 31185, + "帮": 31186, + "毛": 31187, + "耗": 31188, + "举": 31189, + "ర": 31190, + "拿": 31191, + "밀": 31192, + "ご": 31193, + "够": 31194, + "礼": 31195, + "ព": 31196, + "ね": 31197, + "‰": 31198, + "兰": 31199, + "❌": 31200, + "折": 31201, + "십": 31202, + "💎": 31203, + "業": 31204, + "诸": 31205, + "孙": 31206, + "བ": 31207, + "😳": 31208, + "種": 31209, + "Ï": 31210, + "ึ": 31211, + "⁣": 31212, + "医": 31213, + "拼": 31214, + "↵": 31215, + "⅓": 31216, + "\u001f": 31217, + "မ": 31218, + "叫": 31219, + "জ": 31220, + "予": 31221, + "寸": 31222, + "梅": 31223, + "醒": 31224, + "津": 31225, + "န": 31226, + "ి": 31227, + "厂": 31228, + "屋": 31229, + "ख": 31230, + "師": 31231, + "👀": 31232, + "ỏ": 31233, + "ヤ": 31234, + "ὰ": 31235, + "\u001d": 31236, + "◆": 31237, + "ដ": 31238, + "材": 31239, + "ホ": 31240, + "張": 31241, + "洞": 31242, + "餐": 31243, + "천": 31244, + "হ": 31245, + "達": 31246, + "們": 31247, + "斗": 31248, + "横": 31249, + "백": 31250, + "ំ": 31251, + "ۆ": 31252, + "말": 31253, + "গ": 31254, + "佳": 31255, + "랜": 31256, + "仁": 31257, + "陈": 31258, + "飞": 31259, + "极": 31260, + "": 31261, + "및": 31262, + "仓": 31263, + "⬛": 31264, + "昌": 31265, + "錢": 31266, + "殊": 31267, + "┴": 31268, + "○": 31269, + "길": 31270, + "泉": 31271, + "甲": 31272, + "활": 31273, + "ひ": 31274, + "শ": 31275, + "ን": 31276, + "Ť": 31277, + "ღ": 31278, + "皮": 31279, + "強": 31280, + "赛": 31281, + "ా": 31282, + "預": 31283, + "င": 31284, + "튼": 31285, + "플": 31286, + "ყ": 31287, + "⋆": 31288, + "ք": 31289, + "ા": 31290, + "尚": 31291, + "또": 31292, + "բ": 31293, + "┌": 31294, + "節": 31295, + "森": 31296, + "आ": 31297, + "办": 31298, + "園": 31299, + "牙": 31300, + "庆": 31301, + "隆": 31302, + "😔": 31303, + "叉": 31304, + "գ": 31305, + "피": 31306, + "ギ": 31307, + "啊": 31308, + "続": 31309, + "灵": 31310, + "ヒ": 31311, + "忽": 31312, + "ʌ": 31313, + "량": 31314, + "油": 31315, + "讯": 31316, + "ⵉ": 31317, + "릭": 31318, + "刚": 31319, + "氏": 31320, + "ိ": 31321, + "Ī": 31322, + "誤": 31323, + "齐": 31324, + "末": 31325, + "🙌": 31326, + "̞": 31327, + 
"圈": 31328, + "念": 31329, + "숫": 31330, + "毫": 31331, + "當": 31332, + "規": 31333, + "판": 31334, + "ు": 31335, + "旧": 31336, + "卖": 31337, + "ฉ": 31338, + "幸": 31339, + "署": 31340, + "근": 31341, + "ই": 31342, + "岛": 31343, + "դ": 31344, + "觉": 31345, + "害": 31346, + "毕": 31347, + "ฐ": 31348, + "威": 31349, + "育": 31350, + "呢": 31351, + "峰": 31352, + "职": 31353, + "陽": 31354, + "ි": 31355, + "亞": 31356, + "ұ": 31357, + "₃": 31358, + "따": 31359, + "施": 31360, + "泰": 31361, + "載": 31362, + "…": 31363, + "笑": 31364, + "華": 31365, + "迎": 31366, + "됩": 31367, + "豆": 31368, + "嘉": 31369, + "🤡": 31370, + "ĕ": 31371, + "庄": 31372, + "級": 31373, + "Ψ": 31374, + "ི": 31375, + "気": 31376, + "责": 31377, + "հ": 31378, + "អ": 31379, + "乱": 31380, + "休": 31381, + "約": 31382, + "ฆ": 31383, + "∑": 31384, + "察": 31385, + "온": 31386, + "😬": 31387, + "ড": 31388, + "乘": 31389, + "람": 31390, + "इ": 31391, + "Ά": 31392, + "ந": 31393, + "ើ": 31394, + "亲": 31395, + "េ": 31396, + "委": 31397, + "赤": 31398, + "됨": 31399, + "勝": 31400, + "怎": 31401, + "감": 31402, + "宋": 31403, + "調": 31404, + "짜": 31405, + "ী": 31406, + "难": 31407, + "못": 31408, + "티": 31409, + "備": 31410, + "塞": 31411, + "វ": 31412, + "险": 31413, + "旅": 31414, + "虚": 31415, + "↳": 31416, + "笔": 31417, + "馆": 31418, + "Қ": 31419, + "⚡": 31420, + "ೆ": 31421, + "※": 31422, + "唐": 31423, + "律": 31424, + "稍": 31425, + "散": 31426, + "ર": 31427, + "ヴ": 31428, + "副": 31429, + "尽": 31430, + "挂": 31431, + "県": 31432, + "⚠": 31433, + "洋": 31434, + "鬼": 31435, + "암": 31436, + "孩": 31437, + "℃": 31438, + "並": 31439, + "ց": 31440, + "ូ": 31441, + "ℓ": 31442, + "ⵏ": 31443, + "扣": 31444, + "铁": 31445, + "闻": 31446, + "ˆ": 31447, + "戳": 31448, + "む": 31449, + "秀": 31450, + "細": 31451, + "ပ": 31452, + "御": 31453, + "拖": 31454, + "좌": 31455, + "ؤ": 31456, + "绍": 31457, + "ỹ": 31458, + "참": 31459, + "향": 31460, + "Ď": 31461, + "끝": 31462, + "민": 31463, + "ძ": 31464, + "贵": 31465, + "纪": 31466, + "秋": 31467, + "ಕ": 31468, + "ӏ": 31469, + "網": 31470, + "铺": 31471, + "恋": 31472, + "fl": 31473, + "兼": 31474, + "羽": 31475, + "창": 31476, + "啟": 31477, + "弟": 31478, + "년": 31479, + "慢": 31480, + "효": 31481, + "許": 31482, + "硬": 31483, + "잘": 31484, + "템": 31485, + "્": 31486, + "න": 31487, + "術": 31488, + "ڈ": 31489, + "溪": 31490, + "": 31491, + "暴": 31492, + "混": 31493, + "夢": 31494, + "랑": 31495, + "আ": 31496, + "還": 31497, + "探": 31498, + "祖": 31499, + "织": 31500, + "軍": 31501, + "թ": 31502, + "務": 31503, + "艺": 31504, + "ད": 31505, + "ት": 31506, + "ṁ": 31507, + "應": 31508, + "擇": 31509, + "🥰": 31510, + "ķ": 31511, + "渡": 31512, + "葉": 31513, + "령": 31514, + "決": 31515, + "刀": 31516, + "從": 31517, + "變": 31518, + "올": 31519, + "💪": 31520, + "灣": 31521, + "ር": 31522, + "평": 31523, + "衣": 31524, + "😄": 31525, + "ി": 31526, + "ჩ": 31527, + "ὁ": 31528, + "ほ": 31529, + "Û": 31530, + "চ": 31531, + "ර": 31532, + "製": 31533, + "隊": 31534, + "₱": 31535, + "纳": 31536, + "赖": 31537, + "农": 31538, + "桥": 31539, + "ỳ": 31540, + "🏾": 31541, + "阻": 31542, + "ជ": 31543, + "秘": 31544, + "박": 31545, + "伤": 31546, + "稿": 31547, + "ం": 31548, + "拦": 31549, + "넣": 31550, + "💕": 31551, + "₁": 31552, + "宿": 31553, + "錄": 31554, + "镜": 31555, + "채": 31556, + "Ə": 31557, + "ང": 31558, + "⇔": 31559, + "☼": 31560, + "ུ": 31561, + "党": 31562, + "급": 31563, + "洲": 31564, + "ղ": 31565, + "說": 31566, + "ĭ": 31567, + "尝": 31568, + "담": 31569, + "फ": 31570, + "哥": 31571, + "圣": 31572, + "萨": 31573, + "😏": 31574, + "ʏ": 31575, + "ெ": 31576, + "丁": 31577, + "虎": 31578, + "권": 31579, + "善": 31580, + "岩": 31581, 
+ "커": 31582, + "◦": 31583, + "抛": 31584, + "석": 31585, + "Έ": 31586, + "宣": 31587, + "拳": 31588, + "팅": 31589, + "枚": 31590, + "洛": 31591, + "証": 31592, + "陵": 31593, + "佐": 31594, + "館": 31595, + "누": 31596, + "돌": 31597, + "₄": 31598, + "稱": 31599, + "聊": 31600, + "車": 31601, + "루": 31602, + "״": 31603, + "ಠ": 31604, + "庫": 31605, + "མ": 31606, + "統": 31607, + "련": 31608, + "़": 31609, + "ṯ": 31610, + "ക": 31611, + "旗": 31612, + "励": 31613, + "紀": 31614, + "忠": 31615, + "າ": 31616, + "杨": 31617, + "丹": 31618, + "Ù": 31619, + "ฝ": 31620, + "却": 31621, + "舞": 31622, + "轉": 31623, + "တ": 31624, + "丽": 31625, + "借": 31626, + "ා": 31627, + "ょ": 31628, + "옵": 31629, + "편": 31630, + "蒙": 31631, + "衡": 31632, + "ʋ": 31633, + "叶": 31634, + "̇": 31635, + "⬜": 31636, + "🇺": 31637, + "Հ": 31638, + "谢": 31639, + "Ą": 31640, + "ே": 31641, + "ằ": 31642, + "既": 31643, + "济": 31644, + "≯": 31645, + "準": 31646, + "답": 31647, + "ಲ": 31648, + "残": 31649, + "虑": 31650, + "̆": 31651, + "┘": 31652, + "急": 31653, + "招": 31654, + "막": 31655, + "≮": 31656, + "產": 31657, + "Ṭ": 31658, + "😢": 31659, + "垂": 31660, + "親": 31661, + "ģ": 31662, + "־": 31663, + "猫": 31664, + "ʟ": 31665, + "☃": 31666, + "✪": 31667, + "刪": 31668, + "胡": 31669, + "☉": 31670, + "晚": 31671, + "군": 31672, + "승": 31673, + "న": 31674, + "ὴ": 31675, + "曾": 31676, + "論": 31677, + "ɯ": 31678, + "త": 31679, + "戰": 31680, + "鱼": 31681, + "ǧ": 31682, + "寶": 31683, + "특": 31684, + "💯": 31685, + "崎": 31686, + "甘": 31687, + "該": 31688, + "링": 31689, + "😡": 31690, + "उ": 31691, + "ែ": 31692, + "頁": 31693, + "큰": 31694, + "➤": 31695, + "총": 31696, + "💰": 31697, + "∂": 31698, + "毁": 31699, + "聖": 31700, + "麻": 31701, + "ʐ": 31702, + "敏": 31703, + "運": 31704, + "될": 31705, + "쓰": 31706, + "ಸ": 31707, + "စ": 31708, + "✦": 31709, + "젝": 31710, + "復": 31711, + "寻": 31712, + "茶": 31713, + "ਾ": 31714, + "竹": 31715, + "遇": 31716, + "順": 31717, + "며": 31718, + "累": 31719, + "ĝ": 31720, + "ˇ": 31721, + "覧": 31722, + "এ": 31723, + "株": 31724, + "취": 31725, + "ስ": 31726, + "争": 31727, + "势": 31728, + "宇": 31729, + "橋": 31730, + "Ӏ": 31731, + "堆": 31732, + "ⵙ": 31733, + "丶": 31734, + "棋": 31735, + "肉": 31736, + "የ": 31737, + "": 31738, + "❶": 31739, + "季": 31740, + "ል": 31741, + "殿": 31742, + "優": 31743, + "試": 31744, + "첫": 31745, + "Ό": 31746, + "戶": 31747, + "ண": 31748, + "羅": 31749, + "桃": 31750, + "립": 31751, + "浪": 31752, + "脑": 31753, + "😛": 31754, + "弃": 31755, + "炮": 31756, + "轻": 31757, + "울": 31758, + "": 31759, + "ヘ": 31760, + "奥": 31761, + "💜": 31762, + "忘": 31763, + "遠": 31764, + "飛": 31765, + "魏": 31766, + "Ē": 31767, + "汇": 31768, + "央": 31769, + "逆": 31770, + "露": 31771, + "須": 31772, + "ѐ": 31773, + "ḷ": 31774, + "ದ": 31775, + "✭": 31776, + "寄": 31777, + "盟": 31778, + "财": 31779, + "際": 31780, + "ἔ": 31781, + "ǫ": 31782, + "थ": 31783, + "ാ": 31784, + "宫": 31785, + "巨": 31786, + "途": 31787, + "ʹ": 31788, + "ಗ": 31789, + "帐": 31790, + "‪": 31791, + "拒": 31792, + "药": 31793, + "🙃": 31794, + "ŕ": 31795, + "亡": 31796, + "壁": 31797, + "ም": 31798, + "參": 31799, + "😩": 31800, + "շ": 31801, + "ವ": 31802, + "ណ": 31803, + "丰": 31804, + "獲": 31805, + "莉": 31806, + "좋": 31807, + "ရ": 31808, + "₦": 31809, + "겠": 31810, + "👉": 31811, + "吴": 31812, + "岡": 31813, + "诉": 31814, + "읽": 31815, + "🥺": 31816, + "爆": 31817, + "🇸": 31818, + "ভ": 31819, + "迭": 31820, + "엔": 31821, + "ἄ": 31822, + "捷": 31823, + "納": 31824, + "邀": 31825, + "ಯ": 31826, + "爾": 31827, + "船": 31828, + "赞": 31829, + "胜": 31830, + "므": 31831, + "သ": 31832, + "構": 31833, + "磁": 31834, + "冰": 31835, 
+ "딩": 31836, + "ે": 31837, + "媒": 31838, + "繁": 31839, + "☠": 31840, + "❒": 31841, + "仪": 31842, + "렬": 31843, + "昭": 31844, + "珠": 31845, + "離": 31846, + "ན": 31847, + "ల": 31848, + "ತ": 31849, + "拷": 31850, + "粉": 31851, + "벤": 31852, + "⇽": 31853, + "乌": 31854, + "拥": 31855, + "ҳ": 31856, + "ය": 31857, + "ེ": 31858, + "仙": 31859, + "塊": 31860, + "幅": 31861, + "🎉": 31862, + "Մ": 31863, + "跨": 31864, + "ٔ": 31865, + "恩": 31866, + "损": 31867, + "养": 31868, + "奈": 31869, + "ǀ": 31870, + "严": 31871, + "卫": 31872, + "迟": 31873, + "様": 31874, + "裡": 31875, + "난": 31876, + "았": 31877, + "͜": 31878, + "Ζ": 31879, + "ਰ": 31880, + "պ": 31881, + "ং": 31882, + "丢": 31883, + "伝": 31884, + "컨": 31885, + "ව": 31886, + "ြ": 31887, + "冷": 31888, + "遗": 31889, + "銀": 31890, + "̌": 31891, + "ᴜ": 31892, + "瑞": 31893, + "ฌ": 31894, + "❍": 31895, + "ふ": 31896, + "聚": 31897, + "碎": 31898, + "衛": 31899, + "অ": 31900, + "ញ": 31901, + "퍼": 31902, + "Ս": 31903, + "ນ": 31904, + "ẓ": 31905, + "✌": 31906, + "孝": 31907, + "陳": 31908, + "히": 31909, + "ක": 31910, + "黒": 31911, + "💖": 31912, + "ḩ": 31913, + "応": 31914, + "饰": 31915, + "∪": 31916, + "宜": 31917, + "樂": 31918, + "則": 31919, + "勇": 31920, + "徐": 31921, + "ⵓ": 31922, + "權": 31923, + "鲁": 31924, + "‟": 31925, + "庭": 31926, + "苗": 31927, + "🔴": 31928, + "闲": 31929, + "독": 31930, + "ɹ": 31931, + "ҽ": 31932, + "ថ": 31933, + "宏": 31934, + "尊": 31935, + "總": 31936, + "裝": 31937, + "ම": 31938, + "▸": 31939, + "測": 31940, + "ಮ": 31941, + "አ": 31942, + "轩": 31943, + "兄": 31944, + "剑": 31945, + "ન": 31946, + "朱": 31947, + "ǝ": 31948, + "Ḩ": 31949, + "担": 31950, + "灰": 31951, + "讲": 31952, + "롤": 31953, + "︎": 31954, + "😤": 31955, + "ោ": 31956, + "애": 31957, + "였": 31958, + "질": 31959, + "振": 31960, + "灯": 31961, + "ĉ": 31962, + "ස": 31963, + "閉": 31964, + "램": 31965, + "ಂ": 31966, + "げ": 31967, + "̧": 31968, + "狂": 31969, + "融": 31970, + "仍": 31971, + "實": 31972, + "楽": 31973, + "範": 31974, + "ٌ": 31975, + "వ": 31976, + "嵌": 31977, + "摩": 31978, + "袁": 31979, + "ষ": 31980, + "乎": 31981, + "규": 31982, + "岗": 31983, + "糊": 31984, + "క": 31985, + "雲": 31986, + "심": 31987, + "ई": 31988, + "འ": 31989, + "ἡ": 31990, + "丝": 31991, + "Ħ": 31992, + "ٍ": 31993, + "ٓ": 31994, + "အ": 31995, + "執": 31996, + "벨": 31997, + "ゼ": 31998, + "梦": 31999 + }, + "merges": [ + [ + "▁", + "t" + ], + [ + "i", + "n" + ], + [ + "e", + "r" + ], + [ + "▁", + "a" + ], + [ + "h", + "e" + ], + [ + "o", + "n" + ], + [ + "r", + "e" + ], + [ + "▁", + "s" + ], + [ + "e", + "n" + ], + [ + "a", + "t" + ], + [ + "o", + "r" + ], + [ + "▁t", + "he" + ], + [ + "▁th", + "e" + ], + [ + "▁", + "the" + ], + [ + "e", + "s" + ], + [ + "▁", + "w" + ], + [ + "a", + "n" + ], + [ + "▁", + "c" + ], + [ + "i", + "s" + ], + [ + "i", + "t" + ], + [ + "o", + "u" + ], + [ + "▁", + "d" + ], + [ + "a", + "l" + ], + [ + "a", + "r" + ], + [ + "▁", + "p" + ], + [ + "▁", + "f" + ], + [ + "e", + "d" + ], + [ + "▁", + "b" + ], + [ + "in", + "g" + ], + [ + "i", + "ng" + ], + [ + "▁", + "o" + ], + [ + "▁", + "m" + ], + [ + "l", + "e" + ], + [ + "n", + "d" + ], + [ + "a", + "s" + ], + [ + "i", + "c" + ], + [ + "▁", + "h" + ], + [ + "io", + "n" + ], + [ + "i", + "on" + ], + [ + "▁i", + "n" + ], + [ + "▁", + "in" + ], + [ + "▁t", + "o" + ], + [ + "▁", + "to" + ], + [ + "e", + "t" + ], + [ + "o", + "m" + ], + [ + "e", + "l" + ], + [ + "▁o", + "f" + ], + [ + "▁", + "of" + ], + [ + "s", + "t" + ], + [ + "▁a", + "nd" + ], + [ + "▁an", + "d" + ], + [ + "▁", + "and" + ], + [ + "▁", + "l" + ], + [ + "▁t", + "h" + ], + [ + "▁", + "th" + ], + [ + 
"▁", + "n" + ], + [ + "en", + "t" + ], + [ + "e", + "nt" + ], + [ + "i", + "l" + ], + [ + "c", + "t" + ], + [ + "r", + "o" + ], + [ + "▁r", + "e" + ], + [ + "▁", + "re" + ], + [ + "i", + "d" + ], + [ + "a", + "m" + ], + [ + "▁", + "I" + ], + [ + "a", + "d" + ], + [ + "▁", + "e" + ], + [ + "▁", + "S" + ], + [ + "▁", + "g" + ], + [ + "▁", + "T" + ], + [ + "i", + "m" + ], + [ + "o", + "t" + ], + [ + "a", + "c" + ], + [ + "u", + "r" + ], + [ + "▁", + "(" + ], + [ + "i", + "g" + ], + [ + "▁", + "=" + ], + [ + "o", + "l" + ], + [ + "u", + "t" + ], + [ + "▁", + "A" + ], + [ + "s", + "e" + ], + [ + "▁", + "u" + ], + [ + "v", + "e" + ], + [ + "▁", + "C" + ], + [ + "i", + "f" + ], + [ + "o", + "w" + ], + [ + "▁", + "y" + ], + [ + "c", + "h" + ], + [ + "a", + "y" + ], + [ + "▁d", + "e" + ], + [ + "▁", + "de" + ], + [ + "▁s", + "t" + ], + [ + "▁", + "st" + ], + [ + "▁", + "|" + ], + [ + "ve", + "r" + ], + [ + "v", + "er" + ], + [ + ")", + ";" + ], + [ + "▁", + "\"" + ], + [ + "l", + "y" + ], + [ + "▁b", + "e" + ], + [ + "▁", + "be" + ], + [ + "*", + "*" + ], + [ + "▁i", + "s" + ], + [ + "▁", + "is" + ], + [ + "o", + "d" + ], + [ + "▁", + "M" + ], + [ + "at", + "ion" + ], + [ + "ati", + "on" + ], + [ + "atio", + "n" + ], + [ + "u", + "l" + ], + [ + "▁f", + "or" + ], + [ + "▁fo", + "r" + ], + [ + "▁", + "for" + ], + [ + "▁o", + "n" + ], + [ + "▁", + "on" + ], + [ + "a", + "g" + ], + [ + "c", + "e" + ], + [ + "te", + "r" + ], + [ + "t", + "er" + ], + [ + "i", + "r" + ], + [ + "t", + "h" + ], + [ + "▁", + "v" + ], + [ + "q", + "u" + ], + [ + "▁", + "B" + ], + [ + "e", + "m" + ], + [ + "▁", + "P" + ], + [ + "▁y", + "ou" + ], + [ + "▁yo", + "u" + ], + [ + "▁", + "you" + ], + [ + "▁t", + "hat" + ], + [ + "▁th", + "at" + ], + [ + "▁", + "that" + ], + [ + "u", + "n" + ], + [ + "▁", + "{" + ], + [ + "it", + "h" + ], + [ + "i", + "th" + ], + [ + "r", + "i" + ], + [ + "es", + "t" + ], + [ + "e", + "st" + ], + [ + "a", + "b" + ], + [ + "-", + "-" + ], + [ + "a", + "p" + ], + [ + "▁i", + "t" + ], + [ + "▁", + "it" + ], + [ + "▁c", + "on" + ], + [ + "▁co", + "n" + ], + [ + "▁", + "con" + ], + [ + "at", + "e" + ], + [ + "a", + "te" + ], + [ + "u", + "s" + ], + [ + "▁", + "H" + ], + [ + "u", + "m" + ], + [ + "▁", + "D" + ], + [ + "o", + "s" + ], + [ + "p", + "e" + ], + [ + "▁", + "-" + ], + [ + "▁w", + "h" + ], + [ + "▁", + "wh" + ], + [ + "▁a", + "l" + ], + [ + "▁", + "al" + ], + [ + "▁a", + "s" + ], + [ + "▁", + "as" + ], + [ + "an", + "d" + ], + [ + "a", + "nd" + ], + [ + "is", + "t" + ], + [ + "i", + "st" + ], + [ + "▁", + "L" + ], + [ + "▁", + "W" + ], + [ + "▁w", + "ith" + ], + [ + "▁", + "with" + ], + [ + "▁a", + "n" + ], + [ + "▁", + "an" + ], + [ + "er", + "e" + ], + [ + "e", + "re" + ], + [ + "▁", + "*" + ], + [ + "▁", + "R" + ], + [ + "▁h", + "e" + ], + [ + "▁", + "he" + ], + [ + "▁", + "F" + ], + [ + "o", + "c" + ], + [ + "▁w", + "as" + ], + [ + "▁wa", + "s" + ], + [ + "▁", + "was" + ], + [ + "er", + "s" + ], + [ + "e", + "rs" + ], + [ + "k", + "e" + ], + [ + "ou", + "t" + ], + [ + "o", + "ut" + ], + [ + "h", + "t" + ], + [ + "▁", + "r" + ], + [ + "es", + "s" + ], + [ + "e", + "ss" + ], + [ + "o", + "p" + ], + [ + "re", + "s" + ], + [ + "r", + "es" + ], + [ + "i", + "e" + ], + [ + "▁", + "E" + ], + [ + "▁", + "\\" + ], + [ + "▁T", + "he" + ], + [ + "▁Th", + "e" + ], + [ + "▁", + "The" + ], + [ + "en", + "d" + ], + [ + "e", + "nd" + ], + [ + "l", + "d" + ], + [ + "▁", + "N" + ], + [ + "or", + "t" + ], + [ + "o", + "rt" + ], + [ + "▁", + "G" + ], + [ + "/", + "/" + ], + [ + "▁", + "#" + ], + [ + "ou", + "r" 
+ ], + [ + "o", + "ur" + ], + [ + "t", + "e" + ], + [ + "il", + "l" + ], + [ + "i", + "ll" + ], + [ + "ai", + "n" + ], + [ + "a", + "in" + ], + [ + "▁s", + "e" + ], + [ + "▁", + "se" + ], + [ + "▁", + "$" + ], + [ + "▁p", + "ro" + ], + [ + "▁pr", + "o" + ], + [ + "▁", + "pro" + ], + [ + "or", + "e" + ], + [ + "o", + "re" + ], + [ + "▁c", + "om" + ], + [ + "▁co", + "m" + ], + [ + "▁", + "com" + ], + [ + "am", + "e" + ], + [ + "a", + "me" + ], + [ + "t", + "r" + ], + [ + "▁n", + "e" + ], + [ + "▁", + "ne" + ], + [ + "ro", + "m" + ], + [ + "r", + "om" + ], + [ + "u", + "b" + ], + [ + "▁a", + "t" + ], + [ + "▁", + "at" + ], + [ + "▁e", + "x" + ], + [ + "▁", + "ex" + ], + [ + "an", + "t" + ], + [ + "a", + "nt" + ], + [ + "u", + "e" + ], + [ + "▁o", + "r" + ], + [ + "▁", + "or" + ], + [ + "▁", + "}" + ], + [ + "ar", + "t" + ], + [ + "a", + "rt" + ], + [ + "ct", + "ion" + ], + [ + "▁", + "k" + ], + [ + "p", + "t" + ], + [ + "n", + "t" + ], + [ + "i", + "v" + ], + [ + "d", + "e" + ], + [ + "▁", + "O" + ], + [ + "p", + "l" + ], + [ + "ur", + "n" + ], + [ + "u", + "rn" + ], + [ + "ig", + "ht" + ], + [ + "igh", + "t" + ], + [ + "i", + "ght" + ], + [ + "al", + "l" + ], + [ + "a", + "ll" + ], + [ + "▁t", + "his" + ], + [ + "▁th", + "is" + ], + [ + "▁", + "this" + ], + [ + "se", + "r" + ], + [ + "s", + "er" + ], + [ + "av", + "e" + ], + [ + "a", + "ve" + ], + [ + "▁n", + "ot" + ], + [ + "▁no", + "t" + ], + [ + "▁", + "not" + ], + [ + "▁a", + "re" + ], + [ + "▁ar", + "e" + ], + [ + "▁", + "are" + ], + [ + "▁", + "j" + ], + [ + "▁l", + "e" + ], + [ + "▁", + "le" + ], + [ + "i", + "z" + ], + [ + "▁", + "'" + ], + [ + "ag", + "e" + ], + [ + "a", + "ge" + ], + [ + "me", + "nt" + ], + [ + "men", + "t" + ], + [ + "m", + "ent" + ], + [ + "▁t", + "r" + ], + [ + "▁", + "tr" + ], + [ + "ac", + "k" + ], + [ + "a", + "ck" + ], + [ + "us", + "t" + ], + [ + "u", + "st" + ], + [ + "(", + ")" + ], + [ + "-", + ">" + ], + [ + "it", + "y" + ], + [ + "i", + "ty" + ], + [ + "in", + "e" + ], + [ + "i", + "ne" + ], + [ + "ou", + "ld" + ], + [ + "oul", + "d" + ], + [ + "o", + "uld" + ], + [ + "▁", + "J" + ], + [ + "o", + "g" + ], + [ + "▁f", + "rom" + ], + [ + "▁fr", + "om" + ], + [ + "▁fro", + "m" + ], + [ + "▁", + "from" + ], + [ + "▁w", + "e" + ], + [ + "▁", + "we" + ], + [ + "el", + "l" + ], + [ + "e", + "ll" + ], + [ + "▁s", + "h" + ], + [ + "▁", + "sh" + ], + [ + "▁e", + "n" + ], + [ + "▁", + "en" + ], + [ + "ur", + "e" + ], + [ + "u", + "re" + ], + [ + "por", + "t" + ], + [ + "po", + "rt" + ], + [ + "p", + "ort" + ], + [ + "▁c", + "h" + ], + [ + "▁", + "ch" + ], + [ + "n", + "e" + ], + [ + "▁b", + "y" + ], + [ + "▁", + "by" + ], + [ + "pe", + "r" + ], + [ + "p", + "er" + ], + [ + "ar", + "d" + ], + [ + "a", + "rd" + ], + [ + "as", + "s" + ], + [ + "a", + "ss" + ], + [ + "g", + "e" + ], + [ + "a", + "k" + ], + [ + "ar", + "e" + ], + [ + "a", + "re" + ], + [ + "o", + "k" + ], + [ + "a", + "v" + ], + [ + "iv", + "e" + ], + [ + "i", + "ve" + ], + [ + "f", + "f" + ], + [ + "ie", + "s" + ], + [ + "i", + "es" + ], + [ + "at", + "h" + ], + [ + "a", + "th" + ], + [ + "tu", + "rn" + ], + [ + "t", + "urn" + ], + [ + "▁", + "U" + ], + [ + "in", + "t" + ], + [ + "i", + "nt" + ], + [ + "--", + "--" + ], + [ + "---", + "-" + ], + [ + "-", + "---" + ], + [ + "▁i", + "m" + ], + [ + "▁", + "im" + ], + [ + "os", + "t" + ], + [ + "o", + "st" + ], + [ + "ia", + "l" + ], + [ + "i", + "al" + ], + [ + "▁h", + "ave" + ], + [ + "▁ha", + "ve" + ], + [ + "▁hav", + "e" + ], + [ + "▁", + "have" + ], + [ + "in", + "d" + ], + [ + "i", + "nd" + ], + [ 
+ "i", + "p" + ], + [ + "an", + "s" + ], + [ + "a", + "ns" + ], + [ + "x", + "t" + ], + [ + "▁d", + "o" + ], + [ + "▁", + "do" + ], + [ + "c", + "l" + ], + [ + "▁i", + "f" + ], + [ + "▁", + "if" + ], + [ + "co", + "n" + ], + [ + "c", + "on" + ], + [ + "i", + "a" + ], + [ + "▁h", + "is" + ], + [ + "▁hi", + "s" + ], + [ + "▁", + "his" + ], + [ + "ul", + "t" + ], + [ + "u", + "lt" + ], + [ + "ro", + "u" + ], + [ + "r", + "ou" + ], + [ + "▁s", + "u" + ], + [ + "▁", + "su" + ], + [ + "r", + "a" + ], + [ + "▁u", + "n" + ], + [ + "▁", + "un" + ], + [ + "ab", + "le" + ], + [ + "abl", + "e" + ], + [ + "a", + "ble" + ], + [ + "▁", + "<" + ], + [ + "▁", + "K" + ], + [ + "om", + "e" + ], + [ + "o", + "me" + ], + [ + "▁q", + "u" + ], + [ + "▁", + "qu" + ], + [ + "ge", + "t" + ], + [ + "g", + "et" + ], + [ + "▁m", + "e" + ], + [ + "▁", + "me" + ], + [ + "as", + "t" + ], + [ + "a", + "st" + ], + [ + "ec", + "t" + ], + [ + "e", + "ct" + ], + [ + "▁#", + "#" + ], + [ + "▁", + "##" + ], + [ + "t", + "o" + ], + [ + "▁c", + "l" + ], + [ + "▁", + "cl" + ], + [ + "▁a", + "b" + ], + [ + "▁", + "ab" + ], + [ + "ic", + "e" + ], + [ + "i", + "ce" + ], + [ + "ir", + "e" + ], + [ + "i", + "re" + ], + [ + "be", + "r" + ], + [ + "b", + "er" + ], + [ + "on", + "e" + ], + [ + "o", + "ne" + ], + [ + "ic", + "h" + ], + [ + "i", + "ch" + ], + [ + "he", + "n" + ], + [ + "h", + "en" + ], + [ + "▁c", + "an" + ], + [ + "▁ca", + "n" + ], + [ + "▁", + "can" + ], + [ + "▁T", + "h" + ], + [ + "▁", + "Th" + ], + [ + "▁l", + "a" + ], + [ + "▁", + "la" + ], + [ + "▁a", + "ll" + ], + [ + "▁al", + "l" + ], + [ + "▁", + "all" + ], + [ + "im", + "e" + ], + [ + "i", + "me" + ], + [ + "il", + "e" + ], + [ + "i", + "le" + ], + [ + "id", + "e" + ], + [ + "i", + "de" + ], + [ + "\"", + "," + ], + [ + "▁p", + "l" + ], + [ + "▁", + "pl" + ], + [ + "▁", + "V" + ], + [ + "r", + "u" + ], + [ + "or", + "m" + ], + [ + "o", + "rm" + ], + [ + "▁h", + "ad" + ], + [ + "▁ha", + "d" + ], + [ + "▁", + "had" + ], + [ + "u", + "d" + ], + [ + "as", + "e" + ], + [ + "a", + "se" + ], + [ + "or", + "d" + ], + [ + "o", + "rd" + ], + [ + ")", + "," + ], + [ + "▁h", + "er" + ], + [ + "▁he", + "r" + ], + [ + "▁", + "her" + ], + [ + "▁I", + "n" + ], + [ + "▁", + "In" + ], + [ + "ac", + "e" + ], + [ + "a", + "ce" + ], + [ + "▁b", + "ut" + ], + [ + "▁bu", + "t" + ], + [ + "▁", + "but" + ], + [ + "at", + "a" + ], + [ + "a", + "ta" + ], + [ + ":", + ":" + ], + [ + "**", + "**" + ], + [ + "***", + "*" + ], + [ + "*", + "***" + ], + [ + "on", + "g" + ], + [ + "o", + "ng" + ], + [ + "▁", + "&" + ], + [ + ".", + "." 
+ ], + [ + "it", + "e" + ], + [ + "i", + "te" + ], + [ + "yp", + "e" + ], + [ + "y", + "pe" + ], + [ + "ac", + "t" + ], + [ + "a", + "ct" + ], + [ + "od", + "e" + ], + [ + "o", + "de" + ], + [ + "▁y", + "our" + ], + [ + "▁you", + "r" + ], + [ + "▁yo", + "ur" + ], + [ + "▁", + "your" + ], + [ + "▁o", + "ut" + ], + [ + "▁ou", + "t" + ], + [ + "▁", + "out" + ], + [ + "▁g", + "o" + ], + [ + "▁", + "go" + ], + [ + "li", + "c" + ], + [ + "l", + "ic" + ], + [ + "al", + "ly" + ], + [ + "all", + "y" + ], + [ + "▁s", + "o" + ], + [ + "▁", + "so" + ], + [ + "or", + "k" + ], + [ + "a", + "u" + ], + [ + "▁u", + "p" + ], + [ + "▁", + "up" + ], + [ + "▁", + "_" + ], + [ + "l", + "l" + ], + [ + "=", + "=" + ], + [ + "▁m", + "y" + ], + [ + "▁", + "my" + ], + [ + "p", + "p" + ], + [ + "c", + "c" + ], + [ + "▁/", + "/" + ], + [ + "▁", + "//" + ], + [ + "▁the", + "y" + ], + [ + "▁th", + "ey" + ], + [ + "▁", + "they" + ], + [ + "g", + "h" + ], + [ + "▁u", + "s" + ], + [ + "▁", + "us" + ], + [ + "i", + "b" + ], + [ + "ion", + "s" + ], + [ + "io", + "ns" + ], + [ + "i", + "ons" + ], + [ + "ac", + "h" + ], + [ + "a", + "ch" + ], + [ + "en", + "s" + ], + [ + "e", + "ns" + ], + [ + "▁a", + "r" + ], + [ + "▁", + "ar" + ], + [ + "o", + "b" + ], + [ + "el", + "f" + ], + [ + "oo", + "k" + ], + [ + "o", + "ok" + ], + [ + "at", + "ed" + ], + [ + "ate", + "d" + ], + [ + "a", + "ted" + ], + [ + "an", + "g" + ], + [ + "a", + "ng" + ], + [ + "ig", + "n" + ], + [ + "i", + "gn" + ], + [ + "▁re", + "turn" + ], + [ + "▁r", + "eturn" + ], + [ + "▁ret", + "urn" + ], + [ + "▁", + "return" + ], + [ + "▁re", + "s" + ], + [ + "▁r", + "es" + ], + [ + "▁", + "res" + ], + [ + "c", + "k" + ], + [ + "ou", + "s" + ], + [ + "o", + "us" + ], + [ + "с", + "т" + ], + [ + ")", + "." + ], + [ + "▁", + "п" + ], + [ + ".", + "\"" + ], + [ + "н", + "а" + ], + [ + "▁", + "i" + ], + [ + "ai", + "l" + ], + [ + "a", + "il" + ], + [ + "e", + "p" + ], + [ + "▁a", + "d" + ], + [ + "▁", + "ad" + ], + [ + "an", + "ce" + ], + [ + "anc", + "e" + ], + [ + "(", + "\"" + ], + [ + "▁*", + "*" + ], + [ + "▁", + "**" + ], + [ + "th", + "er" + ], + [ + "the", + "r" + ], + [ + "t", + "her" + ], + [ + "ak", + "e" + ], + [ + "a", + "ke" + ], + [ + "▁w", + "ill" + ], + [ + "▁", + "will" + ], + [ + "▁c", + "omp" + ], + [ + "▁com", + "p" + ], + [ + "▁co", + "mp" + ], + [ + "▁", + "comp" + ], + [ + "▁o", + "ne" + ], + [ + "▁on", + "e" + ], + [ + "▁", + "one" + ], + [ + "▁g", + "et" + ], + [ + "▁ge", + "t" + ], + [ + "▁", + "get" + ], + [ + "o", + "v" + ], + [ + "▁", + "Y" + ], + [ + "ar", + "y" + ], + [ + "a", + "ry" + ], + [ + "oc", + "k" + ], + [ + "o", + "ck" + ], + [ + "▁s", + "he" + ], + [ + "▁sh", + "e" + ], + [ + "▁", + "she" + ], + [ + "ch", + "e" + ], + [ + "c", + "he" + ], + [ + "f", + "t" + ], + [ + "▁n", + "ew" + ], + [ + "▁ne", + "w" + ], + [ + "▁", + "new" + ], + [ + "▁d", + "es" + ], + [ + "▁de", + "s" + ], + [ + "▁", + "des" + ], + [ + "▁l", + "i" + ], + [ + "▁", + "li" + ], + [ + "en", + "ce" + ], + [ + "enc", + "e" + ], + [ + "▁s", + "a" + ], + [ + "▁", + "sa" + ], + [ + "re", + "ss" + ], + [ + "res", + "s" + ], + [ + "r", + "ess" + ], + [ + "▁e", + "l" + ], + [ + "▁", + "el" + ], + [ + "▁u", + "nd" + ], + [ + "▁un", + "d" + ], + [ + "▁", + "und" + ], + [ + "e", + "g" + ], + [ + "fe", + "r" + ], + [ + "f", + "er" + ], + [ + "r", + "y" + ], + [ + "ea", + "r" + ], + [ + "e", + "ar" + ], + [ + "os", + "e" + ], + [ + "o", + "se" + ], + [ + "ve", + "ry" + ], + [ + "ver", + "y" + ], + [ + "v", + "ery" + ], + [ + "'", + "," + ], + [ + "▁", + "+" + ], + [ + "▁", + 
"в" + ], + [ + "▁H", + "e" + ], + [ + "▁", + "He" + ], + [ + "ub", + "lic" + ], + [ + "ubl", + "ic" + ], + [ + "u", + "blic" + ], + [ + "▁the", + "ir" + ], + [ + "iz", + "e" + ], + [ + "i", + "ze" + ], + [ + "▁w", + "ere" + ], + [ + "▁we", + "re" + ], + [ + "▁wer", + "e" + ], + [ + "▁", + "were" + ], + [ + "in", + "k" + ], + [ + "ow", + "n" + ], + [ + "o", + "wn" + ], + [ + "I", + "n" + ], + [ + "{", + "\\" + ], + [ + "▁h", + "as" + ], + [ + "▁ha", + "s" + ], + [ + "▁", + "has" + ], + [ + "▁p", + "er" + ], + [ + "▁pe", + "r" + ], + [ + "▁", + "per" + ], + [ + "▁I", + "t" + ], + [ + "▁", + "It" + ], + [ + "▁S", + "t" + ], + [ + "▁", + "St" + ], + [ + "he", + "r" + ], + [ + "h", + "er" + ], + [ + "je", + "ct" + ], + [ + "j", + "ect" + ], + [ + "р", + "а" + ], + [ + "il", + "d" + ], + [ + "i", + "ld" + ], + [ + "s", + "o" + ], + [ + "▁s", + "p" + ], + [ + "▁", + "sp" + ], + [ + "н", + "и" + ], + [ + "d", + "u" + ], + [ + "ro", + "w" + ], + [ + "r", + "ow" + ], + [ + "al", + "ue" + ], + [ + "alu", + "e" + ], + [ + "se", + "t" + ], + [ + "s", + "et" + ], + [ + "fo", + "rm" + ], + [ + "for", + "m" + ], + [ + "f", + "orm" + ], + [ + "co", + "m" + ], + [ + "c", + "om" + ], + [ + "▁m", + "an" + ], + [ + "▁ma", + "n" + ], + [ + "▁", + "man" + ], + [ + "on", + "t" + ], + [ + "o", + "nt" + ], + [ + "ul", + "l" + ], + [ + "u", + "ll" + ], + [ + "▁c", + "ont" + ], + [ + "▁con", + "t" + ], + [ + "▁co", + "nt" + ], + [ + "▁", + "cont" + ], + [ + "▁m", + "ore" + ], + [ + "▁mor", + "e" + ], + [ + "▁mo", + "re" + ], + [ + "▁", + "more" + ], + [ + "ic", + "k" + ], + [ + "i", + "ck" + ], + [ + "▁w", + "ould" + ], + [ + "▁wo", + "uld" + ], + [ + "▁e", + "v" + ], + [ + "▁", + "ev" + ], + [ + "▁ab", + "out" + ], + [ + "▁", + "about" + ], + [ + "it", + "ion" + ], + [ + "iti", + "on" + ], + [ + "▁", + "z" + ], + [ + "ou", + "nd" + ], + [ + "oun", + "d" + ], + [ + "o", + "und" + ], + [ + "re", + "e" + ], + [ + "r", + "ee" + ], + [ + "▁C", + "h" + ], + [ + "▁", + "Ch" + ], + [ + "▁wh", + "ich" + ], + [ + "▁", + "which" + ], + [ + "i", + "o" + ], + [ + "()", + ";" + ], + [ + "(", + ");" + ], + [ + "▁w", + "ho" + ], + [ + "▁wh", + "o" + ], + [ + "▁", + "who" + ], + [ + "er", + "r" + ], + [ + "e", + "rr" + ], + [ + "or", + "y" + ], + [ + "o", + "ry" + ], + [ + "ou", + "nt" + ], + [ + "oun", + "t" + ], + [ + "o", + "unt" + ], + [ + "at", + "ions" + ], + [ + "ation", + "s" + ], + [ + "ati", + "ons" + ], + [ + "atio", + "ns" + ], + [ + "▁", + "с" + ], + [ + "ri", + "ng" + ], + [ + "rin", + "g" + ], + [ + "r", + "ing" + ], + [ + "<", + "/" + ], + [ + "▁f", + "e" + ], + [ + "▁", + "fe" + ], + [ + "к", + "о" + ], + [ + "н", + "о" + ], + [ + "▁d", + "is" + ], + [ + "▁di", + "s" + ], + [ + "▁", + "dis" + ], + [ + "m", + "a" + ], + [ + "▁t", + "hem" + ], + [ + "▁the", + "m" + ], + [ + "▁th", + "em" + ], + [ + "▁a", + "ny" + ], + [ + "▁an", + "y" + ], + [ + "▁", + "any" + ], + [ + "▁n", + "o" + ], + [ + "▁", + "no" + ], + [ + "--", + "------" + ], + [ + "----", + "----" + ], + [ + "---", + "-----" + ], + [ + "-----", + "---" + ], + [ + "------", + "--" + ], + [ + "-------", + "-" + ], + [ + "-", + "-------" + ], + [ + "▁p", + "re" + ], + [ + "▁pr", + "e" + ], + [ + "▁", + "pre" + ], + [ + "▁t", + "e" + ], + [ + "▁", + "te" + ], + [ + "▁r", + "o" + ], + [ + "▁", + "ro" + ], + [ + "▁h", + "im" + ], + [ + "▁hi", + "m" + ], + [ + "▁", + "him" + ], + [ + "▁", + ":" + ], + [ + "u", + "p" + ], + [ + "▁in", + "t" + ], + [ + "▁i", + "nt" + ], + [ + "▁", + "int" + ], + [ + "▁a", + "g" + ], + [ + "▁", + "ag" + ], + [ + "S", + "t" + ], + [ + 
"ar", + "k" + ], + [ + "e", + "x" + ], + [ + "p", + "h" + ], + [ + "ie", + "nt" + ], + [ + "ien", + "t" + ], + [ + "i", + "ent" + ], + [ + "el", + "y" + ], + [ + "e", + "ly" + ], + [ + "▁p", + "r" + ], + [ + "▁", + "pr" + ], + [ + "E", + "R" + ], + [ + "▁im", + "port" + ], + [ + "▁imp", + "ort" + ], + [ + "▁", + "import" + ], + [ + "▁t", + "ime" + ], + [ + "▁tim", + "e" + ], + [ + "▁ti", + "me" + ], + [ + "▁", + "time" + ], + [ + "р", + "о" + ], + [ + "pr", + "o" + ], + [ + "p", + "ro" + ], + [ + "Us", + "er" + ], + [ + "Use", + "r" + ], + [ + "U", + "ser" + ], + [ + "l", + "o" + ], + [ + "▁", + "/" + ], + [ + "▁", + "[" + ], + [ + "or", + "s" + ], + [ + "o", + "rs" + ], + [ + "=", + "\"" + ], + [ + "▁t", + "here" + ], + [ + "▁the", + "re" + ], + [ + "▁th", + "ere" + ], + [ + "▁ther", + "e" + ], + [ + "▁", + "there" + ], + [ + "▁l", + "ike" + ], + [ + "▁li", + "ke" + ], + [ + "▁lik", + "e" + ], + [ + "▁", + "like" + ], + [ + "ol", + "d" + ], + [ + "o", + "ld" + ], + [ + "▁w", + "hen" + ], + [ + "▁wh", + "en" + ], + [ + "▁whe", + "n" + ], + [ + "▁", + "when" + ], + [ + "ve", + "rs" + ], + [ + "ver", + "s" + ], + [ + "v", + "ers" + ], + [ + "▁s", + "ome" + ], + [ + "▁so", + "me" + ], + [ + "▁som", + "e" + ], + [ + "▁", + "some" + ], + [ + "in", + "gs" + ], + [ + "ing", + "s" + ], + [ + ")", + ")" + ], + [ + "▁p", + "art" + ], + [ + "▁par", + "t" + ], + [ + "▁pa", + "rt" + ], + [ + "▁", + "part" + ], + [ + "ic", + "al" + ], + [ + "ica", + "l" + ], + [ + "i", + "cal" + ], + [ + "▁f", + "un" + ], + [ + "▁fu", + "n" + ], + [ + "▁", + "fun" + ], + [ + "▁k", + "n" + ], + [ + "▁", + "kn" + ], + [ + "ay", + "s" + ], + [ + "a", + "ys" + ], + [ + "ie", + "r" + ], + [ + "i", + "er" + ], + [ + "▁b", + "een" + ], + [ + "▁be", + "en" + ], + [ + "ov", + "e" + ], + [ + "o", + "ve" + ], + [ + "▁s", + "c" + ], + [ + "▁", + "sc" + ], + [ + "ia", + "n" + ], + [ + "i", + "an" + ], + [ + "▁o", + "ver" + ], + [ + "▁ov", + "er" + ], + [ + "▁", + "over" + ], + [ + "ie", + "l" + ], + [ + "i", + "el" + ], + [ + "▁p", + "e" + ], + [ + "▁", + "pe" + ], + [ + "ri", + "b" + ], + [ + "r", + "ib" + ], + [ + "pu", + "t" + ], + [ + "p", + "ut" + ], + [ + "e", + "c" + ], + [ + "et", + "h" + ], + [ + "e", + "th" + ], + [ + "ar", + "am" + ], + [ + "ara", + "m" + ], + [ + "a", + "ram" + ], + [ + "ap", + "p" + ], + [ + "a", + "pp" + ], + [ + "▁", + "–" + ], + [ + "▁s", + "tat" + ], + [ + "▁st", + "at" + ], + [ + "▁sta", + "t" + ], + [ + "▁", + "stat" + ], + [ + "po", + "n" + ], + [ + "p", + "on" + ], + [ + "▁w", + "hat" + ], + [ + "▁wh", + "at" + ], + [ + "▁", + "what" + ], + [ + "pt", + "ion" + ], + [ + "w", + "e" + ], + [ + "ad", + "e" + ], + [ + "a", + "de" + ], + [ + "▁w", + "ork" + ], + [ + "▁wor", + "k" + ], + [ + "▁", + "work" + ], + [ + "te", + "xt" + ], + [ + "tex", + "t" + ], + [ + "t", + "ext" + ], + [ + "▁s", + "aid" + ], + [ + "▁sa", + "id" + ], + [ + "▁#", + "##" + ], + [ + "▁##", + "#" + ], + [ + "▁", + "###" + ], + [ + "I", + "N" + ], + [ + "▁j", + "ust" + ], + [ + "▁ju", + "st" + ], + [ + "▁", + "just" + ], + [ + "ir", + "st" + ], + [ + "irs", + "t" + ], + [ + "▁in", + "to" + ], + [ + "▁int", + "o" + ], + [ + "▁", + "into" + ], + [ + "▁con", + "st" + ], + [ + "▁cons", + "t" + ], + [ + "▁", + "const" + ], + [ + "our", + "ce" + ], + [ + "t", + "t" + ], + [ + "p", + "s" + ], + [ + "p", + "r" + ], + [ + "er", + "v" + ], + [ + "e", + "rv" + ], + [ + "it", + "t" + ], + [ + "i", + "tt" + ], + [ + "u", + "g" + ], + [ + "_", + "{" + ], + [ + "en", + "ts" + ], + [ + "ent", + "s" + ], + [ + "is", + "h" + ], + [ + "i", + "sh" 
+ ], + [ + "en", + "er" + ], + [ + "ene", + "r" + ], + [ + "e", + "ner" + ], + [ + "▁in", + "ter" + ], + [ + "▁int", + "er" + ], + [ + "▁inte", + "r" + ], + [ + "▁", + "inter" + ], + [ + "pl", + "e" + ], + [ + "p", + "le" + ], + [ + "ol", + "l" + ], + [ + "o", + "ll" + ], + [ + "me", + "r" + ], + [ + "m", + "er" + ], + [ + "at", + "er" + ], + [ + "ate", + "r" + ], + [ + "a", + "ter" + ], + [ + "oo", + "l" + ], + [ + "o", + "ol" + ], + [ + "e", + "f" + ], + [ + "▁p", + "ublic" + ], + [ + "▁pub", + "lic" + ], + [ + "▁pu", + "blic" + ], + [ + "▁publi", + "c" + ], + [ + "▁", + "public" + ], + [ + "▁o", + "ther" + ], + [ + "▁ot", + "her" + ], + [ + "▁", + "other" + ], + [ + "р", + "е" + ], + [ + "▁d", + "ef" + ], + [ + "▁de", + "f" + ], + [ + "▁", + "def" + ], + [ + "▁", + "@" + ], + [ + "г", + "о" + ], + [ + "oin", + "t" + ], + [ + "oi", + "nt" + ], + [ + "o", + "int" + ], + [ + "▁o", + "ff" + ], + [ + "▁of", + "f" + ], + [ + "▁", + "off" + ], + [ + "oi", + "d" + ], + [ + "o", + "id" + ], + [ + "re", + "turn" + ], + [ + "ret", + "urn" + ], + [ + "r", + "eturn" + ], + [ + "▁s", + "et" + ], + [ + "▁se", + "t" + ], + [ + "▁", + "set" + ], + [ + "w", + "o" + ], + [ + "ft", + "er" + ], + [ + "fte", + "r" + ], + [ + "f", + "ter" + ], + [ + "s", + "h" + ], + [ + "**", + "******" + ], + [ + "****", + "****" + ], + [ + "******", + "**" + ], + [ + "▁o", + "ur" + ], + [ + "▁ou", + "r" + ], + [ + "▁", + "our" + ], + [ + "ri", + "v" + ], + [ + "r", + "iv" + ], + [ + "is", + "s" + ], + [ + "i", + "ss" + ], + [ + "▁W", + "e" + ], + [ + "▁", + "We" + ], + [ + "n", + "g" + ], + [ + "▁o", + "b" + ], + [ + "▁", + "ob" + ], + [ + "s", + "s" + ], + [ + "g", + "r" + ], + [ + "▁t", + "han" + ], + [ + "▁th", + "an" + ], + [ + "▁", + "than" + ], + [ + "pe", + "ct" + ], + [ + "pec", + "t" + ], + [ + "p", + "ect" + ], + [ + "ie", + "d" + ], + [ + "i", + "ed" + ], + [ + "s", + "c" + ], + [ + "ie", + "w" + ], + [ + "i", + "ew" + ], + [ + "de", + "r" + ], + [ + "d", + "er" + ], + [ + "ys", + "t" + ], + [ + "y", + "st" + ], + [ + "e", + "v" + ], + [ + "▁c", + "ould" + ], + [ + "▁co", + "uld" + ], + [ + "▁cou", + "ld" + ], + [ + "▁", + "could" + ], + [ + "an", + "n" + ], + [ + "a", + "nn" + ], + [ + "en", + "c" + ], + [ + "e", + "nc" + ], + [ + "O", + "N" + ], + [ + "i", + "x" + ], + [ + "an", + "c" + ], + [ + "a", + "nc" + ], + [ + "▁al", + "so" + ], + [ + "▁als", + "o" + ], + [ + "▁", + "also" + ], + [ + "re", + "at" + ], + [ + "rea", + "t" + ], + [ + "▁a", + "m" + ], + [ + "▁", + "am" + ], + [ + "▁b", + "ec" + ], + [ + "▁be", + "c" + ], + [ + "▁", + "bec" + ], + [ + "▁", + "и" + ], + [ + "ua", + "l" + ], + [ + "u", + "al" + ], + [ + "pe", + "c" + ], + [ + "p", + "ec" + ], + [ + "▁", + "." 
+ ], + [ + "▁b", + "l" + ], + [ + "▁", + "bl" + ], + [ + "le", + "ct" + ], + [ + "l", + "ect" + ], + [ + "op", + "le" + ], + [ + "opl", + "e" + ], + [ + "o", + "ple" + ], + [ + "y", + "s" + ], + [ + "▁g", + "r" + ], + [ + "▁", + "gr" + ], + [ + "ic", + "t" + ], + [ + "i", + "ct" + ], + [ + "i", + "k" + ], + [ + "tr", + "ing" + ], + [ + "tri", + "ng" + ], + [ + "t", + "ring" + ], + [ + "▁T", + "his" + ], + [ + "▁Th", + "is" + ], + [ + "▁", + "This" + ], + [ + "▁b", + "ack" + ], + [ + "▁ba", + "ck" + ], + [ + "▁", + "back" + ], + [ + "▁", + "о" + ], + [ + "▁f", + "in" + ], + [ + "▁fi", + "n" + ], + [ + "▁", + "fin" + ], + [ + "at", + "ch" + ], + [ + "Co", + "n" + ], + [ + "C", + "on" + ], + [ + "(", + "'" + ], + [ + "er", + "m" + ], + [ + "e", + "rm" + ], + [ + "▁=", + "=" + ], + [ + "▁", + "==" + ], + [ + "_", + "_" + ], + [ + "na", + "me" + ], + [ + "nam", + "e" + ], + [ + "n", + "ame" + ], + [ + ",", + "\"" + ], + [ + "▁d", + "id" + ], + [ + "▁di", + "d" + ], + [ + "▁", + "did" + ], + [ + "is", + "e" + ], + [ + "i", + "se" + ], + [ + "▁on", + "ly" + ], + [ + "▁", + "only" + ], + [ + "ru", + "ct" + ], + [ + "r", + "uct" + ], + [ + "le", + "s" + ], + [ + "l", + "es" + ], + [ + "▁t", + "hen" + ], + [ + "▁the", + "n" + ], + [ + "▁th", + "en" + ], + [ + "▁", + "then" + ], + [ + "au", + "se" + ], + [ + "aus", + "e" + ], + [ + "a", + "use" + ], + [ + "в", + "а" + ], + [ + "▁it", + "s" + ], + [ + "▁i", + "ts" + ], + [ + "▁", + "its" + ], + [ + "ri", + "t" + ], + [ + "r", + "it" + ], + [ + "▁k", + "now" + ], + [ + "▁kn", + "ow" + ], + [ + "▁", + "know" + ], + [ + "ie", + "ld" + ], + [ + "iel", + "d" + ], + [ + "i", + "eld" + ], + [ + "▁c", + "lass" + ], + [ + "▁cl", + "ass" + ], + [ + "▁clas", + "s" + ], + [ + "▁", + "class" + ], + [ + "▁", + ">" + ], + [ + "▁e", + "m" + ], + [ + "▁", + "em" + ], + [ + "▁$", + "\\" + ], + [ + "▁", + "$\\" + ], + [ + "▁y", + "ear" + ], + [ + "▁ye", + "ar" + ], + [ + "▁", + "year" + ], + [ + "w", + "n" + ], + [ + "}", + "," + ], + [ + "▁d", + "el" + ], + [ + "▁de", + "l" + ], + [ + "▁", + "del" + ], + [ + "al", + "e" + ], + [ + "a", + "le" + ], + [ + "t", + "y" + ], + [ + "fi", + "g" + ], + [ + "f", + "ig" + ], + [ + "s", + "p" + ], + [ + "he", + "d" + ], + [ + "h", + "ed" + ], + [ + "ro", + "und" + ], + [ + "rou", + "nd" + ], + [ + "r", + "ound" + ], + [ + "e", + "w" + ], + [ + "▁d", + "i" + ], + [ + "▁", + "di" + ], + [ + "▁d", + "er" + ], + [ + "▁de", + "r" + ], + [ + "▁", + "der" + ], + [ + "р", + "и" + ], + [ + "re", + "d" + ], + [ + "r", + "ed" + ], + [ + "th", + "is" + ], + [ + "t", + "his" + ], + [ + "le", + "t" + ], + [ + "l", + "et" + ], + [ + "R", + "E" + ], + [ + "a", + "x" + ], + [ + "f", + "r" + ], + [ + "ess", + "age" + ], + [ + "essa", + "ge" + ], + [ + "ou", + "gh" + ], + [ + "o", + "ugh" + ], + [ + "▁c", + "omm" + ], + [ + "▁com", + "m" + ], + [ + "▁co", + "mm" + ], + [ + "▁", + "comm" + ], + [ + "f", + "o" + ], + [ + "uc", + "h" + ], + [ + "u", + "ch" + ], + [ + "o", + "y" + ], + [ + "▁pe", + "ople" + ], + [ + "▁", + "people" + ], + [ + "yst", + "em" + ], + [ + "ys", + "tem" + ], + [ + "▁f", + "irst" + ], + [ + "▁fir", + "st" + ], + [ + "▁", + "first" + ], + [ + "▁f", + "unction" + ], + [ + "▁fun", + "ction" + ], + [ + "▁", + "function" + ], + [ + "an", + "ge" + ], + [ + "ang", + "e" + ], + [ + "▁h", + "ow" + ], + [ + "▁ho", + "w" + ], + [ + "▁", + "how" + ], + [ + "▁e", + "t" + ], + [ + "▁", + "et" + ], + [ + "a", + "h" + ], + [ + "▁l", + "ook" + ], + [ + "▁lo", + "ok" + ], + [ + "▁", + "look" + ], + [ + "т", + "о" + ], + [ + "un", + "d" + ], + [ 
+ "u", + "nd" + ], + [ + "▁u", + "nder" + ], + [ + "▁un", + "der" + ], + [ + "▁und", + "er" + ], + [ + "▁", + "under" + ], + [ + "к", + "а" + ], + [ + "▁", + "!" + ], + [ + "ra", + "y" + ], + [ + "r", + "ay" + ], + [ + "S", + "T" + ], + [ + "if", + "ic" + ], + [ + "ifi", + "c" + ], + [ + "i", + "fic" + ], + [ + "л", + "и" + ], + [ + "re", + "ad" + ], + [ + "rea", + "d" + ], + [ + "r", + "ead" + ], + [ + "▁b", + "et" + ], + [ + "▁be", + "t" + ], + [ + "▁", + "bet" + ], + [ + "io", + "us" + ], + [ + "i", + "ous" + ], + [ + "ar", + "g" + ], + [ + "a", + "rg" + ], + [ + "▁n", + "eed" + ], + [ + "▁ne", + "ed" + ], + [ + "▁", + "need" + ], + [ + "ma", + "th" + ], + [ + "mat", + "h" + ], + [ + "m", + "ath" + ], + [ + "▁н", + "а" + ], + [ + "▁", + "на" + ], + [ + "er", + "t" + ], + [ + "e", + "rt" + ], + [ + "▁o", + "p" + ], + [ + "▁", + "op" + ], + [ + "▁a", + "cc" + ], + [ + "▁ac", + "c" + ], + [ + "▁", + "acc" + ], + [ + "Pr", + "o" + ], + [ + "P", + "ro" + ], + [ + "▁e", + "st" + ], + [ + "▁es", + "t" + ], + [ + "▁", + "est" + ], + [ + "▁U", + "n" + ], + [ + "▁", + "Un" + ], + [ + "▁e", + "nt" + ], + [ + "▁en", + "t" + ], + [ + "▁", + "ent" + ], + [ + "▁re", + "c" + ], + [ + "▁r", + "ec" + ], + [ + "▁", + "rec" + ], + [ + "▁u", + "se" + ], + [ + "▁us", + "e" + ], + [ + "▁", + "use" + ], + [ + "е", + "н" + ], + [ + "▁p", + "ar" + ], + [ + "▁pa", + "r" + ], + [ + "▁", + "par" + ], + [ + "a", + "z" + ], + [ + "▁", + "д" + ], + [ + "▁W", + "h" + ], + [ + "▁", + "Wh" + ], + [ + "sel", + "f" + ], + [ + "s", + "elf" + ], + [ + "▁k", + "e" + ], + [ + "▁", + "ke" + ], + [ + "т", + "а" + ], + [ + "▁w", + "ant" + ], + [ + "▁wa", + "nt" + ], + [ + "▁", + "want" + ], + [ + "▁e", + "nd" + ], + [ + "▁en", + "d" + ], + [ + "▁", + "end" + ], + [ + "▁d", + "on" + ], + [ + "▁do", + "n" + ], + [ + "▁", + "don" + ], + [ + "e", + "k" + ], + [ + "re", + "n" + ], + [ + "r", + "en" + ], + [ + "Na", + "me" + ], + [ + "N", + "ame" + ], + [ + "▁=", + ">" + ], + [ + "▁", + "=>" + ], + [ + "▁a", + "pp" + ], + [ + "▁ap", + "p" + ], + [ + "▁", + "app" + ], + [ + "▁qu", + "e" + ], + [ + "▁q", + "ue" + ], + [ + "▁", + "que" + ], + [ + "ig", + "h" + ], + [ + "i", + "gh" + ], + [ + "▁b", + "u" + ], + [ + "▁", + "bu" + ], + [ + "eq", + "u" + ], + [ + "e", + "qu" + ], + [ + "ve", + "l" + ], + [ + "v", + "el" + ], + [ + "▁a", + "ct" + ], + [ + "▁ac", + "t" + ], + [ + "▁", + "act" + ], + [ + "cr", + "e" + ], + [ + "c", + "re" + ], + [ + "A", + "T" + ], + [ + "▁v", + "ar" + ], + [ + "▁va", + "r" + ], + [ + "▁", + "var" + ], + [ + "ce", + "ss" + ], + [ + "ces", + "s" + ], + [ + "c", + "ess" + ], + [ + "==", + "==" + ], + [ + "===", + "=" + ], + [ + "=", + "===" + ], + [ + "E", + "x" + ], + [ + "▁a", + "dd" + ], + [ + "▁ad", + "d" + ], + [ + "▁", + "add" + ], + [ + "▁m", + "od" + ], + [ + "▁mo", + "d" + ], + [ + "▁", + "mod" + ], + [ + "un", + "g" + ], + [ + "u", + "ng" + ], + [ + "▁w", + "here" + ], + [ + "▁wh", + "ere" + ], + [ + "▁whe", + "re" + ], + [ + "▁", + "where" + ], + [ + "ni", + "ng" + ], + [ + "n", + "ing" + ], + [ + "▁f", + "l" + ], + [ + "▁", + "fl" + ], + [ + "al", + "s" + ], + [ + "a", + "ls" + ], + [ + "ter", + "n" + ], + [ + "te", + "rn" + ], + [ + "t", + "ern" + ], + [ + "}", + "}" + ], + [ + "▁A", + "l" + ], + [ + "▁", + "Al" + ], + [ + "▁p", + "os" + ], + [ + "▁po", + "s" + ], + [ + "▁", + "pos" + ], + [ + "an", + "k" + ], + [ + "▁a", + "p" + ], + [ + "▁", + "ap" + ], + [ + "en", + "g" + ], + [ + "e", + "ng" + ], + [ + "▁", + "“" + ], + [ + "bl", + "e" + ], + [ + "b", + "le" + ], + [ + "▁re", + "g" + ], + [ + 
"▁r", + "eg" + ], + [ + "▁", + "reg" + ], + [ + "^", + "{" + ], + [ + "▁S", + "he" + ], + [ + "▁Sh", + "e" + ], + [ + "▁", + "She" + ], + [ + "▁*", + "/" + ], + [ + "▁", + "*/" + ], + [ + "ud", + "e" + ], + [ + "u", + "de" + ], + [ + "ad", + "d" + ], + [ + "a", + "dd" + ], + [ + "▁t", + "wo" + ], + [ + "▁tw", + "o" + ], + [ + "▁", + "two" + ], + [ + "▁c", + "ol" + ], + [ + "▁co", + "l" + ], + [ + "▁", + "col" + ], + [ + "▁s", + "m" + ], + [ + "▁", + "sm" + ], + [ + "ai", + "r" + ], + [ + "a", + "ir" + ], + [ + "▁m", + "ay" + ], + [ + "▁ma", + "y" + ], + [ + "▁", + "may" + ], + [ + "fo", + "re" + ], + [ + "for", + "e" + ], + [ + "f", + "ore" + ], + [ + "▁Y", + "ou" + ], + [ + "▁", + "You" + ], + [ + "ro", + "ugh" + ], + [ + "rou", + "gh" + ], + [ + "r", + "ough" + ], + [ + "▁c", + "he" + ], + [ + "▁ch", + "e" + ], + [ + "▁", + "che" + ], + [ + "▁a", + "tt" + ], + [ + "▁at", + "t" + ], + [ + "▁", + "att" + ], + [ + "ot", + "h" + ], + [ + "o", + "th" + ], + [ + "л", + "а" + ], + [ + "▁c", + "o" + ], + [ + "▁", + "co" + ], + [ + "at", + "es" + ], + [ + "ate", + "s" + ], + [ + "a", + "tes" + ], + [ + "▁re", + "m" + ], + [ + "▁r", + "em" + ], + [ + "▁", + "rem" + ], + [ + "oo", + "d" + ], + [ + "o", + "od" + ], + [ + "Ty", + "pe" + ], + [ + "Typ", + "e" + ], + [ + "T", + "ype" + ], + [ + "le", + "d" + ], + [ + "l", + "ed" + ], + [ + "fu", + "l" + ], + [ + "f", + "ul" + ], + [ + "▁s", + "elf" + ], + [ + "▁sel", + "f" + ], + [ + "▁", + "self" + ], + [ + "o", + "f" + ], + [ + "▁A", + "r" + ], + [ + "▁", + "Ar" + ], + [ + "qu", + "e" + ], + [ + "q", + "ue" + ], + [ + "▁e", + "very" + ], + [ + "▁ev", + "ery" + ], + [ + "▁ever", + "y" + ], + [ + "▁", + "every" + ], + [ + "re", + "f" + ], + [ + "r", + "ef" + ], + [ + "Th", + "e" + ], + [ + "T", + "he" + ], + [ + "▁A", + "nd" + ], + [ + "▁An", + "d" + ], + [ + "▁", + "And" + ], + [ + "▁re", + "l" + ], + [ + "▁r", + "el" + ], + [ + "▁", + "rel" + ], + [ + "O", + "R" + ], + [ + "I", + "d" + ], + [ + "▁e", + "ven" + ], + [ + "▁ev", + "en" + ], + [ + "▁", + "even" + ], + [ + "E", + "N" + ], + [ + "▁h", + "and" + ], + [ + "▁ha", + "nd" + ], + [ + "▁han", + "d" + ], + [ + "▁", + "hand" + ], + [ + "ai", + "t" + ], + [ + "a", + "it" + ], + [ + "▁sh", + "ould" + ], + [ + "▁", + "should" + ], + [ + "▁a", + "fter" + ], + [ + "▁af", + "ter" + ], + [ + "▁", + "after" + ], + [ + "▁d", + "if" + ], + [ + "▁di", + "f" + ], + [ + "gh", + "t" + ], + [ + "g", + "ht" + ], + [ + "if", + "e" + ], + [ + "i", + "fe" + ], + [ + "at", + "or" + ], + [ + "ato", + "r" + ], + [ + "a", + "tor" + ], + [ + "as", + "h" + ], + [ + "a", + "sh" + ], + [ + "ri", + "but" + ], + [ + "rib", + "ut" + ], + [ + "ribu", + "t" + ], + [ + "um", + "ber" + ], + [ + "umb", + "er" + ], + [ + "u", + "mber" + ], + [ + "▁s", + "ee" + ], + [ + "▁se", + "e" + ], + [ + "▁", + "see" + ], + [ + "m", + "s" + ], + [ + "▁c", + "all" + ], + [ + "▁cal", + "l" + ], + [ + "▁ca", + "ll" + ], + [ + "▁", + "call" + ], + [ + "y", + "n" + ], + [ + "d", + "d" + ], + [ + "▁e", + "s" + ], + [ + "▁", + "es" + ], + [ + "▁m", + "ake" + ], + [ + "▁ma", + "ke" + ], + [ + "▁", + "make" + ], + [ + "ot", + "her" + ], + [ + "oth", + "er" + ], + [ + "othe", + "r" + ], + [ + "o", + "ther" + ], + [ + "▁", + "—" + ], + [ + "\")", + ";" + ], + [ + "\"", + ");" + ], + [ + "st", + "r" + ], + [ + "s", + "tr" + ], + [ + "▁l", + "ong" + ], + [ + "▁lo", + "ng" + ], + [ + "▁lon", + "g" + ], + [ + "▁", + "long" + ], + [ + "le", + "ment" + ], + [ + "lem", + "ent" + ], + [ + "l", + "ement" + ], + [ + "▁w", + "or" + ], + [ + "▁wo", + "r" + ], + [ + 
"▁", + "wor" + ], + [ + "it", + "s" + ], + [ + "i", + "ts" + ], + [ + "▁I", + "f" + ], + [ + "▁", + "If" + ], + [ + "al", + "se" + ], + [ + "als", + "e" + ], + [ + "л", + "ь" + ], + [ + "wa", + "rd" + ], + [ + "war", + "d" + ], + [ + "w", + "ard" + ], + [ + "▁п", + "о" + ], + [ + "▁", + "по" + ], + [ + "va", + "l" + ], + [ + "v", + "al" + ], + [ + "on", + "s" + ], + [ + "o", + "ns" + ], + [ + "▁", + "Z" + ], + [ + "▁n", + "ow" + ], + [ + "▁no", + "w" + ], + [ + "▁", + "now" + ], + [ + "da", + "ta" + ], + [ + "dat", + "a" + ], + [ + "d", + "ata" + ], + [ + "am", + "p" + ], + [ + "a", + "mp" + ], + [ + "en", + "se" + ], + [ + "ens", + "e" + ], + [ + "▁th", + "rough" + ], + [ + "▁thr", + "ough" + ], + [ + "▁thro", + "ugh" + ], + [ + "▁", + "through" + ], + [ + "▁d", + "own" + ], + [ + "▁do", + "wn" + ], + [ + "▁dow", + "n" + ], + [ + "▁", + "down" + ], + [ + "at", + "t" + ], + [ + "a", + "tt" + ], + [ + "▁st", + "atic" + ], + [ + "▁stat", + "ic" + ], + [ + "▁", + "static" + ], + [ + "ic", + "s" + ], + [ + "i", + "cs" + ], + [ + "#", + "#" + ], + [ + "po", + "s" + ], + [ + "p", + "os" + ], + [ + "▁v", + "oid" + ], + [ + "▁vo", + "id" + ], + [ + "▁", + "void" + ], + [ + "a", + "w" + ], + [ + "ou", + "n" + ], + [ + "o", + "un" + ], + [ + "▁w", + "ay" + ], + [ + "▁wa", + "y" + ], + [ + "▁", + "way" + ], + [ + "ib", + "le" + ], + [ + "i", + "ble" + ], + [ + "ve", + "nt" + ], + [ + "ven", + "t" + ], + [ + "v", + "ent" + ], + [ + "ow", + "er" + ], + [ + "owe", + "r" + ], + [ + "o", + "wer" + ], + [ + "▁th", + "ink" + ], + [ + "▁thin", + "k" + ], + [ + "▁", + "think" + ], + [ + "t", + "s" + ], + [ + "*", + "/" + ], + [ + "▁a", + "gain" + ], + [ + "▁ag", + "ain" + ], + [ + "▁", + "again" + ], + [ + "at", + "ing" + ], + [ + "ati", + "ng" + ], + [ + "atin", + "g" + ], + [ + "a", + "ting" + ], + [ + "т", + "е" + ], + [ + "ne", + "r" + ], + [ + "n", + "er" + ], + [ + "▁m", + "ost" + ], + [ + "▁mo", + "st" + ], + [ + "▁mos", + "t" + ], + [ + "▁", + "most" + ], + [ + "li", + "ne" + ], + [ + "lin", + "e" + ], + [ + "l", + "ine" + ], + [ + "y", + "m" + ], + [ + "▁s", + "ub" + ], + [ + "▁su", + "b" + ], + [ + "▁", + "sub" + ], + [ + "er", + "son" + ], + [ + "ers", + "on" + ], + [ + "▁re", + "qu" + ], + [ + "▁r", + "equ" + ], + [ + "▁req", + "u" + ], + [ + "▁", + "requ" + ], + [ + "A", + "L" + ], + [ + "A", + "R" + ], + [ + "ab", + "el" + ], + [ + "abe", + "l" + ], + [ + "a", + "bel" + ], + [ + "on", + "d" + ], + [ + "o", + "nd" + ], + [ + "))", + ";" + ], + [ + ")", + ");" + ], + [ + "▁S", + "e" + ], + [ + "▁", + "Se" + ], + [ + "▁B", + "ut" + ], + [ + "▁Bu", + "t" + ], + [ + "▁", + "But" + ], + [ + "al", + "k" + ], + [ + "▁A", + "n" + ], + [ + "▁", + "An" + ], + [ + "ne", + "w" + ], + [ + "n", + "ew" + ], + [ + "▁b", + "ecause" + ], + [ + "▁bec", + "ause" + ], + [ + "▁", + "because" + ], + [ + "ge", + "r" + ], + [ + "g", + "er" + ], + [ + "ul", + "ar" + ], + [ + "ula", + "r" + ], + [ + "u", + "lar" + ], + [ + "ro", + "up" + ], + [ + "rou", + "p" + ], + [ + "r", + "oup" + ], + [ + "t", + "a" + ], + [ + "..", + "." + ], + [ + ".", + ".." 
+ ], + [ + "▁c", + "ons" + ], + [ + "▁con", + "s" + ], + [ + "▁co", + "ns" + ], + [ + "▁", + "cons" + ], + [ + "▁r", + "ight" + ], + [ + "▁ri", + "ght" + ], + [ + "▁rig", + "ht" + ], + [ + "▁", + "right" + ], + [ + "▁f", + "r" + ], + [ + "▁", + "fr" + ], + [ + "b", + "e" + ], + [ + "il", + "y" + ], + [ + "i", + "ly" + ], + [ + "к", + "и" + ], + [ + "▁p", + "h" + ], + [ + "▁", + "ph" + ], + [ + "ea", + "d" + ], + [ + "e", + "ad" + ], + [ + "?", + "\"" + ], + [ + "▁g", + "u" + ], + [ + "▁", + "gu" + ], + [ + "▁el", + "se" + ], + [ + "▁els", + "e" + ], + [ + "▁", + "else" + ], + [ + "▁s", + "om" + ], + [ + "▁so", + "m" + ], + [ + "▁", + "som" + ], + [ + "re", + "nt" + ], + [ + "ren", + "t" + ], + [ + "r", + "ent" + ], + [ + "c", + "o" + ], + [ + "em", + "ent" + ], + [ + "eme", + "nt" + ], + [ + "emen", + "t" + ], + [ + "e", + "ment" + ], + [ + "▁s", + "tr" + ], + [ + "▁st", + "r" + ], + [ + "▁", + "str" + ], + [ + "au", + "lt" + ], + [ + "aul", + "t" + ], + [ + "a", + "ult" + ], + [ + "▁", + "з" + ], + [ + "л", + "о" + ], + [ + "se", + "rt" + ], + [ + "ser", + "t" + ], + [ + "s", + "ert" + ], + [ + "va", + "r" + ], + [ + "v", + "ar" + ], + [ + "ty", + "pe" + ], + [ + "typ", + "e" + ], + [ + "t", + "ype" + ], + [ + "▁C", + "om" + ], + [ + "▁Co", + "m" + ], + [ + "▁", + "Com" + ], + [ + "л", + "е" + ], + [ + "in", + "s" + ], + [ + "i", + "ns" + ], + [ + "m", + "e" + ], + [ + "wa", + "y" + ], + [ + "w", + "ay" + ], + [ + "id", + "ent" + ], + [ + "ide", + "nt" + ], + [ + "iden", + "t" + ], + [ + "▁p", + "rov" + ], + [ + "▁pro", + "v" + ], + [ + "▁pr", + "ov" + ], + [ + "▁", + "prov" + ], + [ + "▁", + "м" + ], + [ + "▁tr", + "ue" + ], + [ + "▁", + "true" + ], + [ + "▁P", + "ro" + ], + [ + "▁Pr", + "o" + ], + [ + "▁", + "Pro" + ], + [ + "f", + "l" + ], + [ + "▁s", + "l" + ], + [ + "▁", + "sl" + ], + [ + "▁A", + "s" + ], + [ + "▁", + "As" + ], + [ + "}", + "\\" + ], + [ + "I", + "D" + ], + [ + "ue", + "s" + ], + [ + "u", + "es" + ], + [ + "▁in", + "st" + ], + [ + "▁ins", + "t" + ], + [ + "▁", + "inst" + ], + [ + "▁n", + "ame" + ], + [ + "▁na", + "me" + ], + [ + "▁nam", + "e" + ], + [ + "▁", + "name" + ], + [ + "o", + "x" + ], + [ + "▁", + ")" + ], + [ + "l", + "i" + ], + [ + "am", + "es" + ], + [ + "ame", + "s" + ], + [ + "a", + "mes" + ], + [ + "Re", + "s" + ], + [ + "R", + "es" + ], + [ + "▁s", + "ur" + ], + [ + "▁su", + "r" + ], + [ + "▁", + "sur" + ], + [ + "par", + "am" + ], + [ + "pa", + "ram" + ], + [ + "para", + "m" + ], + [ + "p", + "aram" + ], + [ + "▁st", + "art" + ], + [ + "▁star", + "t" + ], + [ + "▁sta", + "rt" + ], + [ + "▁", + "start" + ], + [ + "a", + "j" + ], + [ + "S", + "E" + ], + [ + "as", + "k" + ], + [ + "a", + "sk" + ], + [ + "I", + "T" + ], + [ + "St", + "ring" + ], + [ + "Str", + "ing" + ], + [ + "S", + "tring" + ], + [ + "▁a", + "ss" + ], + [ + "▁as", + "s" + ], + [ + "▁", + "ass" + ], + [ + "▁p", + "lay" + ], + [ + "▁pl", + "ay" + ], + [ + "▁", + "play" + ], + [ + "ti", + "ng" + ], + [ + "t", + "ing" + ], + [ + "to", + "n" + ], + [ + "t", + "on" + ], + [ + "▁b", + "efore" + ], + [ + "▁be", + "fore" + ], + [ + "▁bef", + "ore" + ], + [ + "▁", + "before" + ], + [ + "▁p", + "ol" + ], + [ + "▁po", + "l" + ], + [ + "▁", + "pol" + ], + [ + "ar", + "ch" + ], + [ + "arc", + "h" + ], + [ + "▁w", + "ell" + ], + [ + "▁we", + "ll" + ], + [ + "▁wel", + "l" + ], + [ + "▁", + "well" + ], + [ + "Co", + "m" + ], + [ + "C", + "om" + ], + [ + "an", + "y" + ], + [ + "a", + "ny" + ], + [ + "ol", + "og" + ], + [ + "olo", + "g" + ], + [ + "o", + "log" + ], + [ + "▁e", + "rr" + ], + [ + "▁er", + 
"r" + ], + [ + "▁", + "err" + ], + [ + "▁the", + "se" + ], + [ + "▁th", + "ese" + ], + [ + "ar", + "s" + ], + [ + "a", + "rs" + ], + [ + "e", + "b" + ], + [ + "▁b", + "r" + ], + [ + "▁", + "br" + ], + [ + "▁in", + "cl" + ], + [ + "▁inc", + "l" + ], + [ + "▁", + "incl" + ], + [ + "▁h", + "el" + ], + [ + "▁he", + "l" + ], + [ + "▁", + "hel" + ], + [ + "er", + "n" + ], + [ + "e", + "rn" + ], + [ + "od", + "y" + ], + [ + "o", + "dy" + ], + [ + "в", + "о" + ], + [ + "▁in", + "d" + ], + [ + "▁i", + "nd" + ], + [ + "▁", + "ind" + ], + [ + "--", + "--------------" + ], + [ + "----", + "------------" + ], + [ + "--------", + "--------" + ], + [ + "---", + "-------------" + ], + [ + "------------", + "----" + ], + [ + "-----", + "-----------" + ], + [ + "----------", + "------" + ], + [ + "------", + "----------" + ], + [ + "-------------", + "---" + ], + [ + "--------------", + "--" + ], + [ + "---------", + "-------" + ], + [ + "-------", + "---------" + ], + [ + "-----------", + "-----" + ], + [ + "▁d", + "ata" + ], + [ + "▁da", + "ta" + ], + [ + "▁dat", + "a" + ], + [ + "▁", + "data" + ], + [ + "▁g", + "ood" + ], + [ + "▁go", + "od" + ], + [ + "▁", + "good" + ], + [ + "L", + "E" + ], + [ + "]", + "," + ], + [ + "▁a", + "v" + ], + [ + "▁", + "av" + ], + [ + "▁a", + "c" + ], + [ + "▁", + "ac" + ], + [ + "id", + "er" + ], + [ + "ide", + "r" + ], + [ + "i", + "der" + ], + [ + "н", + "е" + ], + [ + "▁", + "Q" + ], + [ + "▁m", + "in" + ], + [ + "▁mi", + "n" + ], + [ + "▁", + "min" + ], + [ + "▁m", + "uch" + ], + [ + "▁mu", + "ch" + ], + [ + "c", + "i" + ], + [ + "el", + "s" + ], + [ + "e", + "ls" + ], + [ + "▁c", + "ur" + ], + [ + "▁cu", + "r" + ], + [ + "▁", + "cur" + ], + [ + "▁v", + "alue" + ], + [ + "▁val", + "ue" + ], + [ + "▁", + "value" + ], + [ + "er", + "y" + ], + [ + "e", + "ry" + ], + [ + "u", + "f" + ], + [ + "▁l", + "oc" + ], + [ + "▁lo", + "c" + ], + [ + "▁", + "loc" + ], + [ + "re", + "ak" + ], + [ + "rea", + "k" + ], + [ + "at", + "ive" + ], + [ + "ati", + "ve" + ], + [ + "ativ", + "e" + ], + [ + "im", + "es" + ], + [ + "ime", + "s" + ], + [ + "i", + "mes" + ], + [ + "C", + "l" + ], + [ + "▁", + "," + ], + [ + "▁s", + "er" + ], + [ + "▁se", + "r" + ], + [ + "▁", + "ser" + ], + [ + "▁d", + "ie" + ], + [ + "▁di", + "e" + ], + [ + "▁", + "die" + ], + [ + "▁tr", + "ans" + ], + [ + "▁tra", + "ns" + ], + [ + "▁", + "trans" + ], + [ + "▁res", + "ult" + ], + [ + "▁", + "result" + ], + [ + "ex", + "t" + ], + [ + "e", + "xt" + ], + [ + "▁a", + "ut" + ], + [ + "▁au", + "t" + ], + [ + "▁", + "aut" + ], + [ + "la", + "nd" + ], + [ + "lan", + "d" + ], + [ + "l", + "and" + ], + [ + "▁&", + "&" + ], + [ + "▁", + "&&" + ], + [ + "C", + "h" + ], + [ + "te", + "n" + ], + [ + "t", + "en" + ], + [ + "}", + "$" + ], + [ + "▁t", + "ype" + ], + [ + "▁typ", + "e" + ], + [ + "▁ty", + "pe" + ], + [ + "▁", + "type" + ], + [ + "con", + "d" + ], + [ + "co", + "nd" + ], + [ + "c", + "ond" + ], + [ + "ic", + "es" + ], + [ + "ice", + "s" + ], + [ + "i", + "ces" + ], + [ + "▁v", + "ery" + ], + [ + "▁ver", + "y" + ], + [ + "▁ve", + "ry" + ], + [ + "▁", + "very" + ], + [ + "▁o", + "wn" + ], + [ + "▁", + "own" + ], + [ + "▁f", + "il" + ], + [ + "▁fi", + "l" + ], + [ + "▁", + "fil" + ], + [ + "it", + "ies" + ], + [ + "iti", + "es" + ], + [ + "i", + "ties" + ], + [ + "▁p", + "rodu" + ], + [ + "▁pro", + "du" + ], + [ + "▁prod", + "u" + ], + [ + "▁", + "produ" + ], + [ + "▁re", + "ad" + ], + [ + "▁r", + "ead" + ], + [ + "▁", + "read" + ], + [ + "▁f", + "orm" + ], + [ + "▁for", + "m" + ], + [ + "▁fo", + "rm" + ], + [ + "▁", + 
"form" + ], + [ + "▁c", + "ase" + ], + [ + "▁cas", + "e" + ], + [ + "▁ca", + "se" + ], + [ + "▁", + "case" + ], + [ + "at", + "her" + ], + [ + "ath", + "er" + ], + [ + "a", + "ther" + ], + [ + "т", + "и" + ], + [ + "д", + "а" + ], + [ + "е", + "р" + ], + [ + "T", + "h" + ], + [ + "au", + "t" + ], + [ + "a", + "ut" + ], + [ + "▁s", + "pec" + ], + [ + "▁sp", + "ec" + ], + [ + "▁spe", + "c" + ], + [ + "▁", + "spec" + ], + [ + "i", + "j" + ], + [ + "b", + "l" + ], + [ + "il", + "ity" + ], + [ + "ili", + "ty" + ], + [ + "▁", + "é" + ], + [ + "▁e", + "r" + ], + [ + "▁", + "er" + ], + [ + "▁d", + "oes" + ], + [ + "▁do", + "es" + ], + [ + "▁", + "does" + ], + [ + "▁h", + "ere" + ], + [ + "▁he", + "re" + ], + [ + "▁her", + "e" + ], + [ + "▁", + "here" + ], + [ + "th", + "e" + ], + [ + "t", + "he" + ], + [ + "ur", + "es" + ], + [ + "ure", + "s" + ], + [ + "u", + "res" + ], + [ + "▁", + "%" + ], + [ + "mi", + "n" + ], + [ + "m", + "in" + ], + [ + "▁n", + "ull" + ], + [ + "▁nu", + "ll" + ], + [ + "▁", + "null" + ], + [ + "ra", + "p" + ], + [ + "r", + "ap" + ], + [ + "\"", + ")" + ], + [ + "r", + "r" + ], + [ + "Li", + "st" + ], + [ + "L", + "ist" + ], + [ + "ri", + "ght" + ], + [ + "rig", + "ht" + ], + [ + "r", + "ight" + ], + [ + "▁U", + "ser" + ], + [ + "▁Us", + "er" + ], + [ + "▁Use", + "r" + ], + [ + "▁", + "User" + ], + [ + "U", + "L" + ], + [ + "at", + "ional" + ], + [ + "ation", + "al" + ], + [ + "ati", + "onal" + ], + [ + "atio", + "nal" + ], + [ + "▁b", + "eing" + ], + [ + "▁be", + "ing" + ], + [ + "▁bei", + "ng" + ], + [ + "▁", + "being" + ], + [ + "A", + "N" + ], + [ + "s", + "k" + ], + [ + "▁c", + "ar" + ], + [ + "▁ca", + "r" + ], + [ + "▁", + "car" + ], + [ + "ol", + "e" + ], + [ + "o", + "le" + ], + [ + "▁d", + "ist" + ], + [ + "▁dis", + "t" + ], + [ + "▁di", + "st" + ], + [ + "▁", + "dist" + ], + [ + "pl", + "ic" + ], + [ + "p", + "lic" + ], + [ + "ol", + "low" + ], + [ + "oll", + "ow" + ], + [ + "▁p", + "res" + ], + [ + "▁pre", + "s" + ], + [ + "▁pr", + "es" + ], + [ + "▁", + "pres" + ], + [ + "▁s", + "uch" + ], + [ + "▁su", + "ch" + ], + [ + "▁suc", + "h" + ], + [ + "▁", + "such" + ], + [ + "re", + "am" + ], + [ + "rea", + "m" + ], + [ + "in", + "ce" + ], + [ + "inc", + "e" + ], + [ + "ga", + "n" + ], + [ + "g", + "an" + ], + [ + "▁F", + "or" + ], + [ + "▁Fo", + "r" + ], + [ + "▁", + "For" + ], + [ + "\"", + ":" + ], + [ + "so", + "n" + ], + [ + "s", + "on" + ], + [ + "riv", + "ate" + ], + [ + "▁y", + "ears" + ], + [ + "▁year", + "s" + ], + [ + "▁ye", + "ars" + ], + [ + "▁s", + "erv" + ], + [ + "▁se", + "rv" + ], + [ + "▁ser", + "v" + ], + [ + "▁", + "serv" + ], + [ + "▁m", + "ade" + ], + [ + "▁ma", + "de" + ], + [ + "▁mad", + "e" + ], + [ + "▁", + "made" + ], + [ + "de", + "f" + ], + [ + "d", + "ef" + ], + [ + ";", + "\r" + ], + [ + "▁g", + "l" + ], + [ + "▁", + "gl" + ], + [ + "▁b", + "el" + ], + [ + "▁be", + "l" + ], + [ + "▁", + "bel" + ], + [ + "▁l", + "ist" + ], + [ + "▁li", + "st" + ], + [ + "▁", + "list" + ], + [ + "▁c", + "or" + ], + [ + "▁co", + "r" + ], + [ + "▁", + "cor" + ], + [ + "▁d", + "et" + ], + [ + "▁de", + "t" + ], + [ + "▁", + "det" + ], + [ + "ce", + "ption" + ], + [ + "cept", + "ion" + ], + [ + "eg", + "in" + ], + [ + "e", + "gin" + ], + [ + "▁", + "б" + ], + [ + "▁c", + "har" + ], + [ + "▁ch", + "ar" + ], + [ + "▁cha", + "r" + ], + [ + "▁", + "char" + ], + [ + "tr", + "ans" + ], + [ + "tra", + "ns" + ], + [ + "▁f", + "am" + ], + [ + "▁fa", + "m" + ], + [ + "▁!", + "=" + ], + [ + "▁", + "!=" + ], + [ + "ou", + "se" + ], + [ + "ous", + "e" + ], + [ + "o", + "use" 
+ ], + [ + "▁d", + "ec" + ], + [ + "▁de", + "c" + ], + [ + "▁", + "dec" + ], + [ + "ic", + "a" + ], + [ + "i", + "ca" + ], + [ + "▁m", + "any" + ], + [ + "▁man", + "y" + ], + [ + "▁ma", + "ny" + ], + [ + "▁", + "many" + ], + [ + "ak", + "ing" + ], + [ + "aki", + "ng" + ], + [ + "a", + "king" + ], + [ + "▁", + "à" + ], + [ + "▁s", + "im" + ], + [ + "▁si", + "m" + ], + [ + "▁", + "sim" + ], + [ + "ag", + "es" + ], + [ + "age", + "s" + ], + [ + "a", + "ges" + ], + [ + "uf", + "f" + ], + [ + "u", + "ff" + ], + [ + "as", + "ed" + ], + [ + "ase", + "d" + ], + [ + "a", + "sed" + ], + [ + "ma", + "n" + ], + [ + "m", + "an" + ], + [ + "▁S", + "h" + ], + [ + "▁", + "Sh" + ], + [ + "ie", + "t" + ], + [ + "i", + "et" + ], + [ + "ir", + "ect" + ], + [ + "ire", + "ct" + ], + [ + "i", + "rect" + ], + [ + "▁R", + "e" + ], + [ + "▁", + "Re" + ], + [ + "▁d", + "iffer" + ], + [ + "▁dif", + "fer" + ], + [ + "▁diff", + "er" + ], + [ + "▁f", + "ind" + ], + [ + "▁fin", + "d" + ], + [ + "▁fi", + "nd" + ], + [ + "▁", + "find" + ], + [ + "eth", + "od" + ], + [ + "▁", + "\r" + ], + [ + "in", + "es" + ], + [ + "ine", + "s" + ], + [ + "i", + "nes" + ], + [ + "▁in", + "v" + ], + [ + "▁i", + "nv" + ], + [ + "▁", + "inv" + ], + [ + "▁p", + "oint" + ], + [ + "▁po", + "int" + ], + [ + "▁poi", + "nt" + ], + [ + "▁", + "point" + ], + [ + "▁The", + "y" + ], + [ + "▁Th", + "ey" + ], + [ + "▁", + "They" + ], + [ + "▁u", + "sed" + ], + [ + "▁us", + "ed" + ], + [ + "▁use", + "d" + ], + [ + "▁", + "used" + ], + [ + "ct", + "ions" + ], + [ + "ction", + "s" + ], + [ + "▁st", + "ill" + ], + [ + "i", + "ó" + ], + [ + "in", + "ed" + ], + [ + "ine", + "d" + ], + [ + "i", + "ned" + ], + [ + "▁wh", + "ile" + ], + [ + "▁", + "while" + ], + [ + "I", + "t" + ], + [ + "em", + "ber" + ], + [ + "emb", + "er" + ], + [ + "e", + "mber" + ], + [ + "▁s", + "ay" + ], + [ + "▁sa", + "y" + ], + [ + "▁", + "say" + ], + [ + "▁he", + "lp" + ], + [ + "▁hel", + "p" + ], + [ + "▁", + "help" + ], + [ + "▁c", + "re" + ], + [ + "▁cr", + "e" + ], + [ + "▁", + "cre" + ], + [ + "▁", + "x" + ], + [ + "▁T", + "r" + ], + [ + "▁", + "Tr" + ], + [ + "um", + "ent" + ], + [ + "ume", + "nt" + ], + [ + "umen", + "t" + ], + [ + "u", + "ment" + ], + [ + "▁s", + "k" + ], + [ + "▁", + "sk" + ], + [ + "ou", + "ght" + ], + [ + "ough", + "t" + ], + [ + "ual", + "ly" + ], + [ + "u", + "ally" + ], + [ + "m", + "essage" + ], + [ + "▁C", + "on" + ], + [ + "▁Co", + "n" + ], + [ + "▁", + "Con" + ], + [ + "▁m", + "on" + ], + [ + "▁mo", + "n" + ], + [ + "▁", + "mon" + ], + [ + "ar", + "ed" + ], + [ + "are", + "d" + ], + [ + "a", + "red" + ], + [ + "wor", + "k" + ], + [ + "w", + "ork" + ], + [ + ")", + ":" + ], + [ + "is", + "ter" + ], + [ + "ist", + "er" + ], + [ + "iste", + "r" + ], + [ + "i", + "ster" + ], + [ + "ar", + "n" + ], + [ + "a", + "rn" + ], + [ + "iz", + "ed" + ], + [ + "ize", + "d" + ], + [ + "i", + "zed" + ], + [ + "Dat", + "a" + ], + [ + "Da", + "ta" + ], + [ + "D", + "ata" + ], + [ + "or", + "n" + ], + [ + "o", + "rn" + ], + [ + "▁h", + "ead" + ], + [ + "▁he", + "ad" + ], + [ + "▁", + "head" + ], + [ + "D", + "E" + ], + [ + "▁L", + "e" + ], + [ + "▁", + "Le" + ], + [ + "▁p", + "erson" + ], + [ + "▁per", + "son" + ], + [ + "▁pers", + "on" + ], + [ + "▁", + "person" + ], + [ + "ment", + "s" + ], + [ + "men", + "ts" + ], + [ + "m", + "ents" + ], + [ + "eng", + "th" + ], + [ + "e", + "ngth" + ], + [ + "▁f", + "alse" + ], + [ + "▁fal", + "se" + ], + [ + "▁fals", + "e" + ], + [ + "▁", + "false" + ], + [ + "▁m", + "ed" + ], + [ + "▁me", + "d" + ], + [ + "▁", + "med" + ], + [ + 
"▁D", + "e" + ], + [ + "▁", + "De" + ], + [ + "ac", + "he" + ], + [ + "ach", + "e" + ], + [ + "a", + "che" + ], + [ + "it", + "ed" + ], + [ + "ite", + "d" + ], + [ + "i", + "ted" + ], + [ + "▁l", + "et" + ], + [ + "▁le", + "t" + ], + [ + "▁", + "let" + ], + [ + "▁s", + "how" + ], + [ + "▁sh", + "ow" + ], + [ + "▁", + "show" + ], + [ + "▁s", + "ame" + ], + [ + "▁sa", + "me" + ], + [ + "▁sam", + "e" + ], + [ + "▁", + "same" + ], + [ + "us", + "s" + ], + [ + "u", + "ss" + ], + [ + "▁g", + "ener" + ], + [ + "▁gen", + "er" + ], + [ + "▁ge", + "ner" + ], + [ + "▁gene", + "r" + ], + [ + "▁", + "gener" + ], + [ + "▁", + "у" + ], + [ + "cu", + "r" + ], + [ + "c", + "ur" + ], + [ + "▁re", + "al" + ], + [ + "▁", + "real" + ], + [ + "ce", + "d" + ], + [ + "c", + "ed" + ], + [ + "\"", + ">" + ], + [ + "st", + "ruct" + ], + [ + "str", + "uct" + ], + [ + "stru", + "ct" + ], + [ + "be", + "gin" + ], + [ + "b", + "egin" + ], + [ + "ce", + "pt" + ], + [ + "cep", + "t" + ], + [ + "▁b", + "o" + ], + [ + "▁", + "bo" + ], + [ + "ir", + "ed" + ], + [ + "ire", + "d" + ], + [ + "i", + "red" + ], + [ + "▁F", + "r" + ], + [ + "▁", + "Fr" + ], + [ + "▁st", + "ud" + ], + [ + "▁", + "stud" + ], + [ + "de", + "v" + ], + [ + "d", + "ev" + ], + [ + "A", + "r" + ], + [ + "(", + "\\" + ], + [ + "▁C", + "l" + ], + [ + "▁", + "Cl" + ], + [ + "we", + "en" + ], + [ + "w", + "een" + ], + [ + "▁t", + "oo" + ], + [ + "▁to", + "o" + ], + [ + "▁", + "too" + ], + [ + "▁t", + "est" + ], + [ + "▁te", + "st" + ], + [ + "▁", + "test" + ], + [ + "▁d", + "ay" + ], + [ + "▁da", + "y" + ], + [ + "▁", + "day" + ], + [ + "o", + "h" + ], + [ + "▁f", + "ollow" + ], + [ + "▁fol", + "low" + ], + [ + "▁", + "follow" + ], + [ + "at", + "ure" + ], + [ + "atur", + "e" + ], + [ + "atu", + "re" + ], + [ + "z", + "e" + ], + [ + "ie", + "n" + ], + [ + "i", + "en" + ], + [ + "re", + "g" + ], + [ + "r", + "eg" + ], + [ + "ce", + "s" + ], + [ + "c", + "es" + ], + [ + "ur", + "ing" + ], + [ + "uri", + "ng" + ], + [ + "u", + "ring" + ], + [ + "am", + "b" + ], + [ + "a", + "mb" + ], + [ + "in", + "a" + ], + [ + "i", + "na" + ], + [ + "cr", + "i" + ], + [ + "c", + "ri" + ], + [ + "▁e", + "d" + ], + [ + "▁", + "ed" + ], + [ + "S", + "S" + ], + [ + "uc", + "k" + ], + [ + "u", + "ck" + ], + [ + "▁/", + "*" + ], + [ + "▁", + "/*" + ], + [ + "C", + "T" + ], + [ + "▁T", + "here" + ], + [ + "▁The", + "re" + ], + [ + "▁Th", + "ere" + ], + [ + "▁Ther", + "e" + ], + [ + "▁", + "There" + ], + [ + "▁t", + "ake" + ], + [ + "▁tak", + "e" + ], + [ + "▁ta", + "ke" + ], + [ + "▁", + "take" + ], + [ + "pa", + "r" + ], + [ + "p", + "ar" + ], + [ + "ul", + "e" + ], + [ + "u", + "le" + ], + [ + "ca", + "l" + ], + [ + "c", + "al" + ], + [ + "fo", + "r" + ], + [ + "f", + "or" + ], + [ + "**", + "**************" + ], + [ + "****", + "************" + ], + [ + "********", + "********" + ], + [ + "************", + "****" + ], + [ + "**************", + "**" + ], + [ + "s", + "ource" + ], + [ + "▁th", + "ose" + ], + [ + "co", + "l" + ], + [ + "c", + "ol" + ], + [ + "▁e", + "ff" + ], + [ + "▁", + "eff" + ], + [ + "mo", + "d" + ], + [ + "m", + "od" + ], + [ + "con", + "t" + ], + [ + "co", + "nt" + ], + [ + "c", + "ont" + ], + [ + "}", + "{" + ], + [ + "▁a", + "round" + ], + [ + "▁ar", + "ound" + ], + [ + "▁", + "around" + ], + [ + "pr", + "ess" + ], + [ + "pre", + "ss" + ], + [ + "pres", + "s" + ], + [ + "p", + "ress" + ], + [ + "b", + "y" + ], + [ + "▁go", + "ing" + ], + [ + "▁", + "going" + ], + [ + "pon", + "se" + ], + [ + "pons", + "e" + ], + [ + "▁", + "С" + ], + [ + "▁l", + "ine" + ], + 
[ + "▁li", + "ne" + ], + [ + "▁lin", + "e" + ], + [ + "▁", + "line" + ], + [ + "da", + "te" + ], + [ + "dat", + "e" + ], + [ + "d", + "ate" + ], + [ + "co", + "de" + ], + [ + "cod", + "e" + ], + [ + "c", + "ode" + ], + [ + "[", + "'" + ], + [ + "▁l", + "ife" + ], + [ + "▁li", + "fe" + ], + [ + "▁lif", + "e" + ], + [ + "▁", + "life" + ], + [ + "as", + "on" + ], + [ + "a", + "son" + ], + [ + "▁u", + "sing" + ], + [ + "▁us", + "ing" + ], + [ + "▁", + "using" + ], + [ + "▁v", + "al" + ], + [ + "▁va", + "l" + ], + [ + "▁", + "val" + ], + [ + "▁d", + "u" + ], + [ + "▁", + "du" + ], + [ + "y", + "p" + ], + [ + "▁O", + "n" + ], + [ + "▁", + "On" + ], + [ + "▁f", + "ound" + ], + [ + "▁fo", + "und" + ], + [ + "▁fou", + "nd" + ], + [ + "▁", + "found" + ], + [ + "ol", + "ut" + ], + [ + "olu", + "t" + ], + [ + "'", + "]" + ], + [ + "ar", + "ent" + ], + [ + "are", + "nt" + ], + [ + "aren", + "t" + ], + [ + "a", + "rent" + ], + [ + "▁s", + "tring" + ], + [ + "▁st", + "ring" + ], + [ + "▁str", + "ing" + ], + [ + "▁stri", + "ng" + ], + [ + "▁", + "string" + ], + [ + "▁m", + "et" + ], + [ + "▁me", + "t" + ], + [ + "▁", + "met" + ], + [ + "▁w", + "r" + ], + [ + "▁", + "wr" + ], + [ + "us", + "h" + ], + [ + "u", + "sh" + ], + [ + "st", + "ring" + ], + [ + "str", + "ing" + ], + [ + "stri", + "ng" + ], + [ + "s", + "tring" + ], + [ + "si", + "ze" + ], + [ + "s", + "ize" + ], + [ + "▁v", + "er" + ], + [ + "▁ve", + "r" + ], + [ + "▁", + "ver" + ], + [ + "▁e", + "ach" + ], + [ + "▁", + "each" + ], + [ + "val", + "ue" + ], + [ + "v", + "alue" + ], + [ + "▁l", + "ast" + ], + [ + "▁la", + "st" + ], + [ + "▁las", + "t" + ], + [ + "▁", + "last" + ], + [ + "▁g", + "ot" + ], + [ + "▁go", + "t" + ], + [ + "▁", + "got" + ], + [ + "ve", + "n" + ], + [ + "v", + "en" + ], + [ + "ba", + "ck" + ], + [ + "b", + "ack" + ], + [ + "Se", + "t" + ], + [ + "S", + "et" + ], + [ + "e", + "y" + ], + [ + "ro", + "l" + ], + [ + "r", + "ol" + ], + [ + "▁c", + "r" + ], + [ + "▁", + "cr" + ], + [ + "th", + "ing" + ], + [ + "t", + "hing" + ], + [ + "re", + "t" + ], + [ + "r", + "et" + ], + [ + "é", + "s" + ], + [ + "is", + "m" + ], + [ + "i", + "sm" + ], + [ + "▁bet", + "ween" + ], + [ + "▁", + "between" + ], + [ + "O", + "b" + ], + [ + "et", + "hing" + ], + [ + "eth", + "ing" + ], + [ + "e", + "thing" + ], + [ + "m", + "p" + ], + [ + "▁l", + "o" + ], + [ + "▁", + "lo" + ], + [ + "at", + "s" + ], + [ + "a", + "ts" + ], + [ + "▁N", + "ew" + ], + [ + "▁Ne", + "w" + ], + [ + "▁", + "New" + ], + [ + "в", + "и" + ], + [ + "ad", + "o" + ], + [ + "a", + "do" + ], + [ + "de", + "x" + ], + [ + "d", + "ex" + ], + [ + "д", + "и" + ], + [ + "▁p", + "ass" + ], + [ + "▁pas", + "s" + ], + [ + "▁pa", + "ss" + ], + [ + "▁", + "pass" + ], + [ + "w", + "h" + ], + [ + "▁d", + "en" + ], + [ + "▁de", + "n" + ], + [ + "▁", + "den" + ], + [ + "Ge", + "t" + ], + [ + "G", + "et" + ], + [ + "ap", + "t" + ], + [ + "a", + "pt" + ], + [ + "▁a", + "sk" + ], + [ + "▁as", + "k" + ], + [ + "▁", + "ask" + ], + [ + "▁s", + "up" + ], + [ + "▁su", + "p" + ], + [ + "▁", + "sup" + ], + [ + "Val", + "ue" + ], + [ + "V", + "alue" + ], + [ + "н", + "ы" + ], + [ + "▁t", + "ry" + ], + [ + "▁tr", + "y" + ], + [ + "▁", + "try" + ], + [ + "lat", + "ion" + ], + [ + "l", + "ation" + ], + [ + "da", + "y" + ], + [ + "d", + "ay" + ], + [ + "ne", + "ss" + ], + [ + "nes", + "s" + ], + [ + "n", + "ess" + ], + [ + "et", + "s" + ], + [ + "e", + "ts" + ], + [ + "▁ex", + "per" + ], + [ + "▁exp", + "er" + ], + [ + "▁", + "exper" + ], + [ + "T", + "r" + ], + [ + "▁M", + "ar" + ], + [ + "▁Ma", + "r" + ], 
+ [ + "▁", + "Mar" + ], + [ + "se", + "rv" + ], + [ + "ser", + "v" + ], + [ + "s", + "erv" + ], + [ + "b", + "r" + ], + [ + "▁n", + "umber" + ], + [ + "▁num", + "ber" + ], + [ + "▁nu", + "mber" + ], + [ + "▁", + "number" + ], + [ + "in", + "al" + ], + [ + "ina", + "l" + ], + [ + "i", + "nal" + ], + [ + "ce", + "nt" + ], + [ + "cen", + "t" + ], + [ + "c", + "ent" + ], + [ + "/", + "*" + ], + [ + "no", + "t" + ], + [ + "n", + "ot" + ], + [ + "ion", + "al" + ], + [ + "io", + "nal" + ], + [ + "iona", + "l" + ], + [ + "i", + "onal" + ], + [ + "▁f", + "inal" + ], + [ + "▁fin", + "al" + ], + [ + "▁fi", + "nal" + ], + [ + "▁", + "final" + ], + [ + "'", + ")" + ], + [ + "▁r", + "un" + ], + [ + "▁ru", + "n" + ], + [ + "▁", + "run" + ], + [ + "ov", + "er" + ], + [ + "ove", + "r" + ], + [ + "o", + "ver" + ], + [ + "▁n", + "ever" + ], + [ + "▁ne", + "ver" + ], + [ + "▁", + "never" + ], + [ + "u", + "c" + ], + [ + "▁h", + "igh" + ], + [ + "▁hig", + "h" + ], + [ + "▁hi", + "gh" + ], + [ + "▁", + "high" + ], + [ + "yl", + "e" + ], + [ + "y", + "le" + ], + [ + "▁in", + "s" + ], + [ + "▁i", + "ns" + ], + [ + "▁", + "ins" + ], + [ + "▁b", + "est" + ], + [ + "▁be", + "st" + ], + [ + "▁bes", + "t" + ], + [ + "▁", + "best" + ], + [ + "it", + "tle" + ], + [ + "itt", + "le" + ], + [ + "ri", + "c" + ], + [ + "r", + "ic" + ], + [ + "▁s", + "ign" + ], + [ + "▁si", + "gn" + ], + [ + "▁sig", + "n" + ], + [ + "▁", + "sign" + ], + [ + "▁d", + "em" + ], + [ + "▁de", + "m" + ], + [ + "▁", + "dem" + ], + [ + "in", + "ess" + ], + [ + "ine", + "ss" + ], + [ + "ines", + "s" + ], + [ + "i", + "ness" + ], + [ + "g", + "y" + ], + [ + "▁w", + "ar" + ], + [ + "▁wa", + "r" + ], + [ + "▁", + "war" + ], + [ + "is", + "hed" + ], + [ + "ish", + "ed" + ], + [ + "▁g", + "iv" + ], + [ + "▁gi", + "v" + ], + [ + "ke", + "y" + ], + [ + "k", + "ey" + ], + [ + "▁", + "X" + ], + [ + "(", + "$" + ], + [ + "▁ch", + "ild" + ], + [ + "▁chi", + "ld" + ], + [ + "▁", + "child" + ], + [ + "le", + "ss" + ], + [ + "les", + "s" + ], + [ + "l", + "ess" + ], + [ + "way", + "s" + ], + [ + "wa", + "ys" + ], + [ + "w", + "ays" + ], + [ + "in", + "cl" + ], + [ + "inc", + "l" + ], + [ + "ro", + "p" + ], + [ + "r", + "op" + ], + [ + "ra", + "w" + ], + [ + "r", + "aw" + ], + [ + ":", + "//" + ], + [ + "▁", + "«" + ], + [ + "n", + "o" + ], + [ + "ind", + "ow" + ], + [ + "indo", + "w" + ], + [ + "f", + "e" + ], + [ + "ri", + "end" + ], + [ + "rie", + "nd" + ], + [ + "rien", + "d" + ], + [ + "▁l", + "es" + ], + [ + "▁le", + "s" + ], + [ + "▁", + "les" + ], + [ + "▁l", + "os" + ], + [ + "▁lo", + "s" + ], + [ + "▁", + "los" + ], + [ + "fil", + "e" + ], + [ + "fi", + "le" + ], + [ + "f", + "ile" + ], + [ + "form", + "ation" + ], + [ + "format", + "ion" + ], + [ + "cc", + "ess" + ], + [ + "c", + "cess" + ], + [ + "▁", + "В" + ], + [ + "n", + "a" + ], + [ + "▁i", + "l" + ], + [ + "▁", + "il" + ], + [ + "is", + "ion" + ], + [ + "isi", + "on" + ], + [ + "le", + "r" + ], + [ + "l", + "er" + ], + [ + "▁a", + "rt" + ], + [ + "▁ar", + "t" + ], + [ + "▁", + "art" + ], + [ + "Con", + "t" + ], + [ + "Co", + "nt" + ], + [ + "C", + "ont" + ], + [ + "▁w", + "orld" + ], + [ + "▁wor", + "ld" + ], + [ + "▁", + "world" + ], + [ + "▁t", + "urn" + ], + [ + "▁tu", + "rn" + ], + [ + "▁tur", + "n" + ], + [ + "▁", + "turn" + ], + [ + "▁re", + "ally" + ], + [ + "▁real", + "ly" + ], + [ + "▁E", + "x" + ], + [ + "▁", + "Ex" + ], + [ + "м", + "а" + ], + [ + "▁", + "П" + ], + [ + "ter", + "s" + ], + [ + "te", + "rs" + ], + [ + "t", + "ers" + ], + [ + "ar", + "get" + ], + [ + "arg", + "et" + ], + [ 
+ "arge", + "t" + ], + [ + "Er", + "r" + ], + [ + "E", + "rr" + ], + [ + "▁h", + "app" + ], + [ + "▁ha", + "pp" + ], + [ + "ti", + "me" + ], + [ + "tim", + "e" + ], + [ + "t", + "ime" + ], + [ + "▁S", + "o" + ], + [ + "▁", + "So" + ], + [ + "di", + "v" + ], + [ + "d", + "iv" + ], + [ + "▁did", + "n" + ], + [ + "▁di", + "dn" + ], + [ + "ad", + "a" + ], + [ + "a", + "da" + ], + [ + "oo", + "t" + ], + [ + "o", + "ot" + ], + [ + "}", + ")" + ], + [ + "▁s", + "ch" + ], + [ + "▁sc", + "h" + ], + [ + "▁", + "sch" + ], + [ + "▁c", + "le" + ], + [ + "▁cl", + "e" + ], + [ + "▁", + "cle" + ], + [ + "▁some", + "thing" + ], + [ + "▁som", + "ething" + ], + [ + "▁somet", + "hing" + ], + [ + "▁", + "something" + ], + [ + "()", + "." + ], + [ + "(", + ")." + ], + [ + "▁c", + "our" + ], + [ + "▁co", + "ur" + ], + [ + "▁cou", + "r" + ], + [ + "ev", + "er" + ], + [ + "eve", + "r" + ], + [ + "e", + "ver" + ], + [ + "an", + "ts" + ], + [ + "ant", + "s" + ], + [ + "▁", + "?" + ], + [ + "T", + "o" + ], + [ + "▁", + "`" + ], + [ + "tr", + "y" + ], + [ + "t", + "ry" + ], + [ + "u", + "x" + ], + [ + "ai", + "s" + ], + [ + "a", + "is" + ], + [ + "ro", + "ss" + ], + [ + "ros", + "s" + ], + [ + "r", + "oss" + ], + [ + "hi", + "p" + ], + [ + "h", + "ip" + ], + [ + "▁re", + "p" + ], + [ + "▁r", + "ep" + ], + [ + "▁", + "rep" + ], + [ + "la", + "bel" + ], + [ + "lab", + "el" + ], + [ + "l", + "abel" + ], + [ + "▁b", + "oth" + ], + [ + "▁bo", + "th" + ], + [ + "▁bot", + "h" + ], + [ + "▁", + "both" + ], + [ + "*", + "," + ], + [ + "ot", + "t" + ], + [ + "o", + "tt" + ], + [ + "м", + "и" + ], + [ + "an", + "e" + ], + [ + "a", + "ne" + ], + [ + "▁o", + "pen" + ], + [ + "▁op", + "en" + ], + [ + "▁", + "open" + ], + [ + "w", + "w" + ], + [ + "▁c", + "ome" + ], + [ + "▁com", + "e" + ], + [ + "▁co", + "me" + ], + [ + "▁", + "come" + ], + [ + "▁e", + "xt" + ], + [ + "▁ex", + "t" + ], + [ + "▁", + "ext" + ], + [ + "re", + "m" + ], + [ + "r", + "em" + ], + [ + "_{", + "\\" + ], + [ + "_", + "{\\" + ], + [ + "▁o", + "ld" + ], + [ + "▁ol", + "d" + ], + [ + "▁", + "old" + ], + [ + "ch", + "ed" + ], + [ + "che", + "d" + ], + [ + "c", + "hed" + ], + [ + ".", + "_" + ], + [ + "M", + "E" + ], + [ + "if", + "y" + ], + [ + "i", + "fy" + ], + [ + "g", + "g" + ], + [ + "Co", + "l" + ], + [ + "C", + "ol" + ], + [ + "vi", + "ew" + ], + [ + "v", + "iew" + ], + [ + "▁b", + "us" + ], + [ + "▁bu", + "s" + ], + [ + "▁", + "bus" + ], + [ + "▁m", + "ust" + ], + [ + "▁mus", + "t" + ], + [ + "▁mu", + "st" + ], + [ + "▁", + "must" + ], + [ + "▁d", + "ifferent" + ], + [ + "▁differ", + "ent" + ], + [ + "lo", + "g" + ], + [ + "l", + "og" + ], + [ + "is", + "ts" + ], + [ + "ist", + "s" + ], + [ + "i", + "sts" + ], + [ + "ro", + "ll" + ], + [ + "rol", + "l" + ], + [ + "r", + "oll" + ], + [ + "a", + "i" + ], + [ + "▁з", + "а" + ], + [ + "▁", + "за" + ], + [ + "▁s", + "ystem" + ], + [ + "▁sys", + "tem" + ], + [ + "▁syst", + "em" + ], + [ + "▁", + "system" + ], + [ + "iv", + "ers" + ], + [ + "ive", + "rs" + ], + [ + "iver", + "s" + ], + [ + "i", + "vers" + ], + [ + "at", + "us" + ], + [ + "atu", + "s" + ], + [ + "ot", + "e" + ], + [ + "o", + "te" + ], + [ + "me", + "d" + ], + [ + "m", + "ed" + ], + [ + "]", + "." 
+ ], + [ + "ak", + "es" + ], + [ + "ake", + "s" + ], + [ + "a", + "kes" + ], + [ + "R", + "O" + ], + [ + "▁c", + "ent" + ], + [ + "▁ce", + "nt" + ], + [ + "▁", + "cent" + ], + [ + "gr", + "am" + ], + [ + "gra", + "m" + ], + [ + "g", + "ram" + ], + [ + "▁p", + "rivate" + ], + [ + "▁priv", + "ate" + ], + [ + "▁", + "private" + ], + [ + "▁g", + "reat" + ], + [ + "▁gre", + "at" + ], + [ + "\"", + ";" + ], + [ + "op", + "y" + ], + [ + "o", + "py" + ], + [ + "▁fe", + "el" + ], + [ + "▁fee", + "l" + ], + [ + "▁H", + "ow" + ], + [ + "▁Ho", + "w" + ], + [ + "▁", + "How" + ], + [ + "//", + "//" + ], + [ + "///", + "/" + ], + [ + "/", + "///" + ], + [ + "I", + "C" + ], + [ + "▁d", + "r" + ], + [ + "▁", + "dr" + ], + [ + "ain", + "s" + ], + [ + "ai", + "ns" + ], + [ + "a", + "ins" + ], + [ + "lo", + "ck" + ], + [ + "loc", + "k" + ], + [ + "l", + "ock" + ], + [ + "E", + "n" + ], + [ + "▁S", + "ch" + ], + [ + "▁Sc", + "h" + ], + [ + "▁", + "Sch" + ], + [ + "▁m", + "at" + ], + [ + "▁ma", + "t" + ], + [ + "▁", + "mat" + ], + [ + "▁h", + "ome" + ], + [ + "▁hom", + "e" + ], + [ + "▁ho", + "me" + ], + [ + "▁", + "home" + ], + [ + "per", + "ty" + ], + [ + "pert", + "y" + ], + [ + "te", + "st" + ], + [ + "tes", + "t" + ], + [ + "t", + "est" + ], + [ + "lo", + "c" + ], + [ + "l", + "oc" + ], + [ + "▁w", + "om" + ], + [ + "▁wo", + "m" + ], + [ + "s", + "w" + ], + [ + "ar", + "ly" + ], + [ + "arl", + "y" + ], + [ + "▁E", + "n" + ], + [ + "▁", + "En" + ], + [ + "▁к", + "о" + ], + [ + "▁", + "ко" + ], + [ + "de", + "n" + ], + [ + "d", + "en" + ], + [ + "ст", + "а" + ], + [ + "с", + "та" + ], + [ + "▁", + "а" + ], + [ + "et", + "er" + ], + [ + "ete", + "r" + ], + [ + "e", + "ter" + ], + [ + "▁incl", + "ud" + ], + [ + "▁inclu", + "d" + ], + [ + "UL", + "L" + ], + [ + "U", + "LL" + ], + [ + "▁m", + "em" + ], + [ + "▁me", + "m" + ], + [ + "▁", + "mem" + ], + [ + "▁p", + "o" + ], + [ + "▁", + "po" + ], + [ + "▁l", + "ittle" + ], + [ + "▁lit", + "tle" + ], + [ + "▁litt", + "le" + ], + [ + "▁a", + "rg" + ], + [ + "▁ar", + "g" + ], + [ + "▁", + "arg" + ], + [ + "▁}", + "," + ], + [ + "▁", + "}," + ], + [ + "in", + "clude" + ], + [ + "incl", + "ude" + ], + [ + "et", + "a" + ], + [ + "e", + "ta" + ], + [ + "▁p", + "lace" + ], + [ + "▁pl", + "ace" + ], + [ + "▁plac", + "e" + ], + [ + "▁", + "place" + ], + [ + "id", + "th" + ], + [ + "us", + "tom" + ], + [ + "ust", + "om" + ], + [ + "▁|", + "|" + ], + [ + "▁", + "||" + ], + [ + "▁t", + "em" + ], + [ + "▁te", + "m" + ], + [ + "▁", + "tem" + ], + [ + "ri", + "ed" + ], + [ + "rie", + "d" + ], + [ + "r", + "ied" + ], + [ + "▁f", + "act" + ], + [ + "▁fac", + "t" + ], + [ + "▁fa", + "ct" + ], + [ + "▁", + "fact" + ], + [ + "ien", + "ce" + ], + [ + "i", + "ence" + ], + [ + "▁P", + "l" + ], + [ + "▁", + "Pl" + ], + [ + "op", + "t" + ], + [ + "o", + "pt" + ], + [ + "el", + "e" + ], + [ + "e", + "le" + ], + [ + "g", + "o" + ], + [ + "A", + "C" + ], + [ + "in", + "ter" + ], + [ + "int", + "er" + ], + [ + "inte", + "r" + ], + [ + "====", + "====" + ], + [ + "()", + "," + ], + [ + "(", + ")," + ], + [ + "ot", + "s" + ], + [ + "o", + "ts" + ], + [ + "ra", + "l" + ], + [ + "r", + "al" + ], + [ + "iqu", + "e" + ], + [ + "iq", + "ue" + ], + [ + "i", + "que" + ], + [ + "av", + "ing" + ], + [ + "avi", + "ng" + ], + [ + "a", + "ving" + ], + [ + "m", + "l" + ], + [ + "▁th", + "ought" + ], + [ + "▁though", + "t" + ], + [ + "▁thou", + "ght" + ], + [ + "fr", + "ac" + ], + [ + "f", + "rac" + ], + [ + "▁c", + "are" + ], + [ + "▁car", + "e" + ], + [ + "▁ca", + "re" + ], + [ + "▁", + "care" + ], + [ + 
"()", + ");" + ], + [ + "())", + ";" + ], + [ + "(", + "));" + ], + [ + "▁p", + "ut" + ], + [ + "▁pu", + "t" + ], + [ + "▁", + "put" + ], + [ + "▁m", + "ight" + ], + [ + "▁mi", + "ght" + ], + [ + "▁mig", + "ht" + ], + [ + "▁A", + "mer" + ], + [ + "▁Am", + "er" + ], + [ + "▁", + "Amer" + ], + [ + "▁(", + "!" + ], + [ + "▁", + "(!" + ], + [ + "am", + "ple" + ], + [ + "amp", + "le" + ], + [ + "al", + "th" + ], + [ + "alt", + "h" + ], + [ + "▁f", + "ew" + ], + [ + "▁fe", + "w" + ], + [ + "▁st", + "ate" + ], + [ + "▁stat", + "e" + ], + [ + "▁sta", + "te" + ], + [ + "▁", + "state" + ], + [ + "su", + "b" + ], + [ + "s", + "ub" + ], + [ + "▁O", + "r" + ], + [ + "▁", + "Or" + ], + [ + "]", + ";" + ], + [ + "▁s", + "ize" + ], + [ + "▁si", + "ze" + ], + [ + "▁", + "size" + ], + [ + "▁S", + "p" + ], + [ + "▁", + "Sp" + ], + [ + "▁with", + "out" + ], + [ + "▁", + "without" + ], + [ + "▁p", + "oss" + ], + [ + "▁pos", + "s" + ], + [ + "▁po", + "ss" + ], + [ + "▁", + "poss" + ], + [ + "e", + "q" + ], + [ + "pl", + "ay" + ], + [ + "p", + "lay" + ], + [ + "▁ex", + "pect" + ], + [ + "▁exp", + "ect" + ], + [ + "▁", + "expect" + ], + [ + "▁se", + "cond" + ], + [ + "▁sec", + "ond" + ], + [ + "▁", + "second" + ], + [ + "▁S", + "tring" + ], + [ + "▁St", + "ring" + ], + [ + "▁Str", + "ing" + ], + [ + "▁", + "String" + ], + [ + "ui", + "ld" + ], + [ + "u", + "ild" + ], + [ + "▁n", + "ext" + ], + [ + "▁ne", + "xt" + ], + [ + "▁", + "next" + ], + [ + "+", + "+" + ], + [ + "re", + "qu" + ], + [ + "req", + "u" + ], + [ + "r", + "equ" + ], + [ + "▁A", + "ll" + ], + [ + "▁Al", + "l" + ], + [ + "▁", + "All" + ], + [ + "▁m", + "en" + ], + [ + "▁me", + "n" + ], + [ + "▁", + "men" + ], + [ + "▁W", + "hen" + ], + [ + "▁Wh", + "en" + ], + [ + "▁Whe", + "n" + ], + [ + "▁", + "When" + ], + [ + "it", + "er" + ], + [ + "ite", + "r" + ], + [ + "i", + "ter" + ], + [ + "am", + "ent" + ], + [ + "ame", + "nt" + ], + [ + "amen", + "t" + ], + [ + "a", + "ment" + ], + [ + "ne", + "t" + ], + [ + "n", + "et" + ], + [ + "▁", + "К" + ], + [ + "ro", + "n" + ], + [ + "r", + "on" + ], + [ + "ain", + "t" + ], + [ + "ai", + "nt" + ], + [ + "a", + "int" + ], + [ + "▁I", + "s" + ], + [ + "▁", + "Is" + ], + [ + "в", + "е" + ], + [ + "pe", + "nd" + ], + [ + "pen", + "d" + ], + [ + "p", + "end" + ], + [ + "trans", + "lation" + ], + [ + "transl", + "ation" + ], + [ + "▁г", + "о" + ], + [ + "▁", + "го" + ], + [ + "ч", + "е" + ], + [ + "▁v", + "an" + ], + [ + "▁va", + "n" + ], + [ + "▁", + "van" + ], + [ + "▁an", + "other" + ], + [ + "▁ano", + "ther" + ], + [ + "▁re", + "t" + ], + [ + "▁r", + "et" + ], + [ + "▁", + "ret" + ], + [ + "▁L", + "a" + ], + [ + "▁", + "La" + ], + [ + "Mo", + "d" + ], + [ + "M", + "od" + ], + [ + "IO", + "N" + ], + [ + "I", + "ON" + ], + [ + "li", + "st" + ], + [ + "l", + "ist" + ], + [ + "▁p", + "ost" + ], + [ + "▁pos", + "t" + ], + [ + "▁po", + "st" + ], + [ + "▁", + "post" + ], + [ + "d", + "a" + ], + [ + "wa", + "re" + ], + [ + "war", + "e" + ], + [ + "w", + "are" + ], + [ + "▁w", + "ord" + ], + [ + "▁wor", + "d" + ], + [ + "▁wo", + "rd" + ], + [ + "▁", + "word" + ], + [ + "Err", + "or" + ], + [ + "Er", + "ror" + ], + [ + "▁se", + "em" + ], + [ + "▁see", + "m" + ], + [ + "▁cont", + "in" + ], + [ + "▁", + "contin" + ], + [ + "at", + "ic" + ], + [ + "ati", + "c" + ], + [ + "▁th", + "ree" + ], + [ + "▁thr", + "ee" + ], + [ + "▁", + "three" + ], + [ + "Ob", + "ject" + ], + [ + "Obj", + "ect" + ], + [ + "▁part", + "ic" + ], + [ + "▁parti", + "c" + ], + [ + "$", + "." 
+ ], + [ + "▁m", + "ark" + ], + [ + "▁mar", + "k" + ], + [ + "▁", + "mark" + ], + [ + "▁v", + "is" + ], + [ + "▁vi", + "s" + ], + [ + "▁", + "vis" + ], + [ + "r", + "c" + ], + [ + "▁s", + "w" + ], + [ + "▁", + "sw" + ], + [ + "pt", + "ions" + ], + [ + "ption", + "s" + ], + [ + "▁b", + "reak" + ], + [ + "▁bre", + "ak" + ], + [ + "▁", + "break" + ], + [ + "▁th", + "ings" + ], + [ + "▁thing", + "s" + ], + [ + "▁thin", + "gs" + ], + [ + "ut", + "e" + ], + [ + "u", + "te" + ], + [ + "u", + "i" + ], + [ + "▁T", + "hat" + ], + [ + "▁Th", + "at" + ], + [ + "▁", + "That" + ], + [ + "ur", + "s" + ], + [ + "u", + "rs" + ], + [ + "g", + "l" + ], + [ + "р", + "у" + ], + [ + "▁f", + "ile" + ], + [ + "▁fil", + "e" + ], + [ + "▁fi", + "le" + ], + [ + "▁", + "file" + ], + [ + "us", + "e" + ], + [ + "u", + "se" + ], + [ + "ig", + "ned" + ], + [ + "ign", + "ed" + ], + [ + "igne", + "d" + ], + [ + "par", + "t" + ], + [ + "pa", + "rt" + ], + [ + "p", + "art" + ], + [ + "U", + "n" + ], + [ + "▁e", + "qu" + ], + [ + "▁eq", + "u" + ], + [ + "▁", + "equ" + ], + [ + "(", + "&" + ], + [ + "▁l", + "ead" + ], + [ + "▁le", + "ad" + ], + [ + "r", + "m" + ], + [ + "ain", + "ed" + ], + [ + "ai", + "ned" + ], + [ + "aine", + "d" + ], + [ + "a", + "ined" + ], + [ + "▁B", + "e" + ], + [ + "▁", + "Be" + ], + [ + "pat", + "h" + ], + [ + "pa", + "th" + ], + [ + "p", + "ath" + ], + [ + "▁sm", + "all" + ], + [ + "▁", + "small" + ], + [ + "ag", + "er" + ], + [ + "age", + "r" + ], + [ + "a", + "ger" + ], + [ + "▁al", + "ways" + ], + [ + "▁", + "always" + ], + [ + "▁E", + "l" + ], + [ + "▁", + "El" + ], + [ + "▁or", + "der" + ], + [ + "▁ord", + "er" + ], + [ + "▁", + "order" + ], + [ + "▁e", + "y" + ], + [ + "▁", + "ey" + ], + [ + "▁w", + "on" + ], + [ + "▁wo", + "n" + ], + [ + "▁", + "won" + ], + [ + "ap", + "e" + ], + [ + "a", + "pe" + ], + [ + "▁l", + "eft" + ], + [ + "▁le", + "ft" + ], + [ + "▁", + "left" + ], + [ + "av", + "a" + ], + [ + "a", + "va" + ], + [ + "it", + "em" + ], + [ + "ite", + "m" + ], + [ + "i", + "tem" + ], + [ + "ho", + "r" + ], + [ + "h", + "or" + ], + [ + "▁a", + "way" + ], + [ + "▁aw", + "ay" + ], + [ + "▁", + "away" + ], + [ + "b", + "b" + ], + [ + "fu", + "n" + ], + [ + "f", + "un" + ], + [ + "▁I", + "nd" + ], + [ + "▁In", + "d" + ], + [ + "▁", + "Ind" + ], + [ + "m", + "b" + ], + [ + "▁st", + "ruct" + ], + [ + "▁str", + "uct" + ], + [ + "▁stru", + "ct" + ], + [ + "▁", + "struct" + ], + [ + "▁pro", + "cess" + ], + [ + "▁proc", + "ess" + ], + [ + "▁proces", + "s" + ], + [ + "▁", + "process" + ], + [ + "▁s", + "upport" + ], + [ + "▁sup", + "port" + ], + [ + "▁supp", + "ort" + ], + [ + "▁", + "support" + ], + [ + ");", + "\r" + ], + [ + ")", + ";\r" + ], + [ + "ió", + "n" + ], + [ + "i", + "ón" + ], + [ + "L", + "O" + ], + [ + "▁o", + "per" + ], + [ + "▁op", + "er" + ], + [ + "▁", + "oper" + ], + [ + "U", + "T" + ], + [ + "▁", + "·" + ], + [ + "P", + "E" + ], + [ + "lo", + "ad" + ], + [ + "l", + "oad" + ], + [ + "of", + "f" + ], + [ + "o", + "ff" + ], + [ + "▁N", + "o" + ], + [ + "▁", + "No" + ], + [ + "iv", + "es" + ], + [ + "ive", + "s" + ], + [ + "i", + "ves" + ], + [ + "ic", + "an" + ], + [ + "ica", + "n" + ], + [ + "i", + "can" + ], + [ + "▁v", + "e" + ], + [ + "▁", + "ve" + ], + [ + "act", + "ion" + ], + [ + "a", + "ction" + ], + [ + "'", + ";" + ], + [ + "▁v", + "o" + ], + [ + "▁", + "vo" + ], + [ + "$", + "," + ], + [ + "▁G", + "r" + ], + [ + "▁", + "Gr" + ], + [ + "pr", + "e" + ], + [ + "p", + "re" + ], + [ + "n", + "y" + ], + [ + "ain", + "ing" + ], + [ + "ai", + "ning" + ], + [ + "a", + "ining" + 
], + [ + "io", + "r" + ], + [ + "i", + "or" + ], + [ + "in", + "it" + ], + [ + "ini", + "t" + ], + [ + "i", + "nit" + ], + [ + "le", + "ction" + ], + [ + "lect", + "ion" + ], + [ + "l", + "ection" + ], + [ + "ar", + "m" + ], + [ + "a", + "rm" + ], + [ + "um", + "n" + ], + [ + "u", + "mn" + ], + [ + "ag", + "s" + ], + [ + "a", + "gs" + ], + [ + "ц", + "и" + ], + [ + "ск", + "о" + ], + [ + "с", + "ко" + ], + [ + "vers", + "ion" + ], + [ + "v", + "ersion" + ], + [ + "▁T", + "o" + ], + [ + "▁", + "To" + ], + [ + "▁re", + "f" + ], + [ + "▁r", + "ef" + ], + [ + "▁", + "ref" + ], + [ + "st", + "and" + ], + [ + "sta", + "nd" + ], + [ + "stan", + "d" + ], + [ + "▁A", + "t" + ], + [ + "▁", + "At" + ], + [ + "if", + "t" + ], + [ + "i", + "ft" + ], + [ + "▁e", + "in" + ], + [ + "fa", + "ce" + ], + [ + "fac", + "e" + ], + [ + "f", + "ace" + ], + [ + "b", + "o" + ], + [ + "if", + "ied" + ], + [ + "ifi", + "ed" + ], + [ + "ve", + "d" + ], + [ + "v", + "ed" + ], + [ + "su", + "m" + ], + [ + "s", + "um" + ], + [ + "un", + "e" + ], + [ + "u", + "ne" + ], + [ + "it", + "al" + ], + [ + "ita", + "l" + ], + [ + "i", + "tal" + ], + [ + "um", + "p" + ], + [ + "u", + "mp" + ], + [ + "com", + "m" + ], + [ + "co", + "mm" + ], + [ + "c", + "omm" + ], + [ + "▁m", + "ov" + ], + [ + "▁mo", + "v" + ], + [ + "▁", + "mov" + ], + [ + "el", + "t" + ], + [ + "e", + "lt" + ], + [ + "▁v", + "on" + ], + [ + "▁vo", + "n" + ], + [ + "vel", + "op" + ], + [ + "ct", + "or" + ], + [ + "c", + "tor" + ], + [ + "he", + "ad" + ], + [ + "h", + "ead" + ], + [ + "cl", + "e" + ], + [ + "c", + "le" + ], + [ + "▁b", + "uild" + ], + [ + "▁bu", + "ild" + ], + [ + "▁", + "build" + ], + [ + "in", + "c" + ], + [ + "i", + "nc" + ], + [ + ".", + "'" + ], + [ + "b", + "s" + ], + [ + "in", + "fo" + ], + [ + "inf", + "o" + ], + [ + "ch", + "n" + ], + [ + "c", + "hn" + ], + [ + "▁we", + "ek" + ], + [ + "▁", + "week" + ], + [ + "▁b", + "ook" + ], + [ + "▁bo", + "ok" + ], + [ + "▁", + "book" + ], + [ + "H", + "E" + ], + [ + "ba", + "r" + ], + [ + "b", + "ar" + ], + [ + "ic", + "ense" + ], + [ + "▁W", + "hat" + ], + [ + "▁Wh", + "at" + ], + [ + "▁", + "What" + ], + [ + "▁qu", + "est" + ], + [ + "▁que", + "st" + ], + [ + "▁q", + "uest" + ], + [ + "▁", + "quest" + ], + [ + "ur", + "ch" + ], + [ + "at", + "o" + ], + [ + "a", + "to" + ], + [ + "le", + "ft" + ], + [ + "l", + "eft" + ], + [ + "▁m", + "ar" + ], + [ + "▁ma", + "r" + ], + [ + "▁", + "mar" + ], + [ + "▁t", + "op" + ], + [ + "▁to", + "p" + ], + [ + "▁", + "top" + ], + [ + "F", + "F" + ], + [ + "▁f", + "riend" + ], + [ + "▁", + "friend" + ], + [ + "▁b", + "eh" + ], + [ + "▁be", + "h" + ], + [ + "▁f", + "ield" + ], + [ + "▁fi", + "eld" + ], + [ + "▁", + "field" + ], + [ + "▁again", + "st" + ], + [ + "ra", + "ct" + ], + [ + "rac", + "t" + ], + [ + "r", + "act" + ], + [ + "iz", + "ation" + ], + [ + "us", + "er" + ], + [ + "use", + "r" + ], + [ + "u", + "ser" + ], + [ + "ch", + "en" + ], + [ + "che", + "n" + ], + [ + "c", + "hen" + ], + [ + "▁ke", + "ep" + ], + [ + "▁", + "keep" + ], + [ + "A", + "D" + ], + [ + "it", + "or" + ], + [ + "ito", + "r" + ], + [ + "i", + "tor" + ], + [ + "▁n", + "on" + ], + [ + "▁no", + "n" + ], + [ + "▁", + "non" + ], + [ + "ir", + "d" + ], + [ + "i", + "rd" + ], + [ + "op", + "e" + ], + [ + "o", + "pe" + ], + [ + "▁re", + "st" + ], + [ + "▁r", + "est" + ], + [ + "▁res", + "t" + ], + [ + "▁", + "rest" + ], + [ + "▁d", + "ev" + ], + [ + "▁de", + "v" + ], + [ + "▁", + "dev" + ], + [ + "▁_", + "_" + ], + [ + "▁", + "__" + ], + [ + "▁u", + "na" + ], + [ + "▁un", + "a" + ], + [ + 
"▁", + "una" + ], + [ + "▁t", + "erm" + ], + [ + "▁te", + "rm" + ], + [ + "▁ter", + "m" + ], + [ + "▁", + "term" + ], + [ + "I", + "S" + ], + [ + "▁p", + "op" + ], + [ + "▁po", + "p" + ], + [ + "▁", + "pop" + ], + [ + "ri", + "st" + ], + [ + "ris", + "t" + ], + [ + "r", + "ist" + ], + [ + "▁s", + "ince" + ], + [ + "▁sin", + "ce" + ], + [ + "▁sinc", + "e" + ], + [ + "▁", + "since" + ], + [ + "ve", + "s" + ], + [ + "v", + "es" + ], + [ + "▁h", + "ard" + ], + [ + "▁ha", + "rd" + ], + [ + "▁har", + "d" + ], + [ + "▁", + "hard" + ], + [ + "p", + "i" + ], + [ + "ut", + "il" + ], + [ + "uti", + "l" + ], + [ + "u", + "til" + ], + [ + "▁s", + "oc" + ], + [ + "▁so", + "c" + ], + [ + "▁", + "soc" + ], + [ + "en", + "e" + ], + [ + "e", + "ne" + ], + [ + "Ex", + "ception" + ], + [ + "▁l", + "ocal" + ], + [ + "▁loc", + "al" + ], + [ + "▁lo", + "cal" + ], + [ + "▁", + "local" + ], + [ + "▁d", + "irect" + ], + [ + "▁di", + "rect" + ], + [ + "▁dire", + "ct" + ], + [ + "▁dir", + "ect" + ], + [ + "▁", + "direct" + ], + [ + "▁s", + "ure" + ], + [ + "▁su", + "re" + ], + [ + "▁sur", + "e" + ], + [ + "▁", + "sure" + ], + [ + "▁b", + "ro" + ], + [ + "▁br", + "o" + ], + [ + "▁", + "bro" + ], + [ + "▁d", + "a" + ], + [ + "▁", + "da" + ], + [ + "▁<", + "/" + ], + [ + "▁", + "" + ], + [ + "ai", + "m" + ], + [ + "a", + "im" + ], + [ + "▁s", + "ervice" + ], + [ + "▁serv", + "ice" + ], + [ + "▁", + "service" + ], + [ + "▁with", + "in" + ], + [ + "an", + "gu" + ], + [ + "ang", + "u" + ], + [ + "▁", + "Д" + ], + [ + "uf", + "fer" + ], + [ + "uff", + "er" + ], + [ + "A", + "G" + ], + [ + "▁D", + "o" + ], + [ + "▁", + "Do" + ], + [ + "▁in", + "cre" + ], + [ + "▁inc", + "re" + ], + [ + "▁under", + "stand" + ], + [ + "}", + "^" + ], + [ + "▁look", + "ed" + ], + [ + "▁lo", + "oked" + ], + [ + "ge", + "n" + ], + [ + "g", + "en" + ], + [ + "ail", + "ed" + ], + [ + "ai", + "led" + ], + [ + "a", + "iled" + ], + [ + "▁", + "е" + ], + [ + "ay", + "er" + ], + [ + "aye", + "r" + ], + [ + "a", + "yer" + ], + [ + "▁O", + "ne" + ], + [ + "▁On", + "e" + ], + [ + "▁", + "One" + ], + [ + "▁b", + "as" + ], + [ + "▁ba", + "s" + ], + [ + "▁", + "bas" + ], + [ + "▁j", + "ob" + ], + [ + "▁jo", + "b" + ], + [ + "▁", + "job" + ], + [ + "m", + "u" + ], + [ + "bu", + "t" + ], + [ + "b", + "ut" + ], + [ + "el", + "ta" + ], + [ + "elt", + "a" + ], + [ + "▁Ch", + "rist" + ], + [ + "▁Chris", + "t" + ], + [ + "▁", + "Christ" + ], + [ + "ur", + "ation" + ], + [ + "▁re", + "cord" + ], + [ + "▁rec", + "ord" + ], + [ + "▁", + "record" + ], + [ + "▁Un", + "ivers" + ], + [ + "▁", + "Univers" + ], + [ + "iv", + "id" + ], + [ + "ivi", + "d" + ], + [ + "i", + "vid" + ], + [ + "val", + "id" + ], + [ + "▁", + "Р" + ], + [ + "▁h", + "old" + ], + [ + "▁hol", + "d" + ], + [ + "▁ho", + "ld" + ], + [ + "▁", + "hold" + ], + [ + "▁t", + "able" + ], + [ + "▁tab", + "le" + ], + [ + "▁ta", + "ble" + ], + [ + "▁", + "table" + ], + [ + "on", + "es" + ], + [ + "one", + "s" + ], + [ + "o", + "nes" + ], + [ + "lin", + "k" + ], + [ + "l", + "ink" + ], + [ + "▁G", + "e" + ], + [ + "▁", + "Ge" + ], + [ + "▁of", + "fer" + ], + [ + "▁off", + "er" + ], + [ + "st", + "er" + ], + [ + "ste", + "r" + ], + [ + "s", + "ter" + ], + [ + "For", + "m" + ], + [ + "F", + "orm" + ], + [ + "=", + "{" + ], + [ + "▁н", + "е" + ], + [ + "▁", + "не" + ], + [ + "st", + "ance" + ], + [ + "stan", + "ce" + ], + [ + "▁g", + "overn" + ], + [ + "▁go", + "vern" + ], + [ + "▁gover", + "n" + ], + [ + "▁", + "govern" + ], + [ + "▁te", + "chn" + ], + [ + "▁tech", + "n" + ], + [ + "▁", + "techn" + ], + [ + "▁p", + 
"rim" + ], + [ + "▁pr", + "im" + ], + [ + "▁pri", + "m" + ], + [ + "▁", + "prim" + ], + [ + "*", + "." + ], + [ + "ch", + "o" + ], + [ + "c", + "ho" + ], + [ + "ma", + "x" + ], + [ + "m", + "ax" + ], + [ + "▁f", + "ore" + ], + [ + "▁for", + "e" + ], + [ + "▁fo", + "re" + ], + [ + "▁", + "fore" + ], + [ + "▁C", + "an" + ], + [ + "▁Ca", + "n" + ], + [ + "▁", + "Can" + ], + [ + "▁pol", + "it" + ], + [ + "▁po", + "lit" + ], + [ + "▁", + "polit" + ], + [ + "or", + "ies" + ], + [ + "ori", + "es" + ], + [ + "orie", + "s" + ], + [ + "o", + "ries" + ], + [ + "▁t", + "imes" + ], + [ + "▁time", + "s" + ], + [ + "▁tim", + "es" + ], + [ + "▁ti", + "mes" + ], + [ + "▁", + "times" + ], + [ + "▁d", + "ans" + ], + [ + "▁da", + "ns" + ], + [ + "▁dan", + "s" + ], + [ + "▁a", + "ir" + ], + [ + "▁ai", + "r" + ], + [ + "▁", + "air" + ], + [ + "▁any", + "thing" + ], + [ + "▁s", + "ever" + ], + [ + "▁se", + "ver" + ], + [ + "ac", + "y" + ], + [ + "a", + "cy" + ], + [ + "}", + "_" + ], + [ + "H", + "e" + ], + [ + "▁l", + "east" + ], + [ + "▁le", + "ast" + ], + [ + "ip", + "s" + ], + [ + "i", + "ps" + ], + [ + "EN", + "T" + ], + [ + "E", + "NT" + ], + [ + "d", + "o" + ], + [ + "▁о", + "т" + ], + [ + "▁", + "от" + ], + [ + "▁c", + "ost" + ], + [ + "▁co", + "st" + ], + [ + "▁cos", + "t" + ], + [ + "▁", + "cost" + ], + [ + ".", + "”" + ], + [ + "▁child", + "ren" + ], + [ + "▁", + "children" + ], + [ + "ab", + "ility" + ], + [ + "abil", + "ity" + ], + [ + "Bu", + "t" + ], + [ + "B", + "ut" + ], + [ + "▁p", + "ath" + ], + [ + "▁pat", + "h" + ], + [ + "▁pa", + "th" + ], + [ + "▁", + "path" + ], + [ + "res", + "ult" + ], + [ + "ac", + "ter" + ], + [ + "act", + "er" + ], + [ + "▁e", + "lement" + ], + [ + "▁el", + "ement" + ], + [ + "▁ele", + "ment" + ], + [ + "▁elem", + "ent" + ], + [ + "▁", + "element" + ], + [ + "e", + "e" + ], + [ + "▁w", + "ait" + ], + [ + "▁wa", + "it" + ], + [ + "▁", + "wait" + ], + [ + "▁m", + "oney" + ], + [ + "▁mon", + "ey" + ], + [ + "▁mo", + "ney" + ], + [ + "Ma", + "p" + ], + [ + "M", + "ap" + ], + [ + "t", + "d" + ], + [ + "oi", + "n" + ], + [ + "o", + "in" + ], + [ + "iv", + "ing" + ], + [ + "ivi", + "ng" + ], + [ + "i", + "ving" + ], + [ + "ic", + "ht" + ], + [ + "ich", + "t" + ], + [ + "i", + "cht" + ], + [ + "ic", + "y" + ], + [ + "i", + "cy" + ], + [ + "sc", + "h" + ], + [ + "s", + "ch" + ], + [ + "st", + "e" + ], + [ + "s", + "te" + ], + [ + "д", + "у" + ], + [ + "or", + "ed" + ], + [ + "ore", + "d" + ], + [ + "o", + "red" + ], + [ + "ou", + "d" + ], + [ + "o", + "ud" + ], + [ + "il", + "le" + ], + [ + "ill", + "e" + ], + [ + "i", + "lle" + ], + [ + "is", + "ed" + ], + [ + "ise", + "d" + ], + [ + "i", + "sed" + ], + [ + "pl", + "ication" + ], + [ + "plic", + "ation" + ], + [ + "▁c", + "ustom" + ], + [ + "▁cust", + "om" + ], + [ + "▁", + "custom" + ], + [ + "▁h", + "aving" + ], + [ + "▁ha", + "ving" + ], + [ + "▁hav", + "ing" + ], + [ + "pon", + "ent" + ], + [ + "po", + "nent" + ], + [ + "▁B", + "y" + ], + [ + "▁", + "By" + ], + [ + "ul", + "es" + ], + [ + "ule", + "s" + ], + [ + "u", + "les" + ], + [ + "ue", + "d" + ], + [ + "u", + "ed" + ], + [ + "at", + "ter" + ], + [ + "att", + "er" + ], + [ + "atte", + "r" + ], + [ + "An", + "d" + ], + [ + "A", + "nd" + ], + [ + "it", + "ive" + ], + [ + "iti", + "ve" + ], + [ + "De", + "f" + ], + [ + "D", + "ef" + ], + [ + "▁m", + "oment" + ], + [ + "▁mom", + "ent" + ], + [ + "▁mo", + "ment" + ], + [ + "▁", + "moment" + ], + [ + "at", + "erial" + ], + [ + "ate", + "rial" + ], + [ + "ater", + "ial" + ], + [ + "Cl", + "ass" + ], + [ + "C", + "lass" + 
], + [ + "og", + "raph" + ], + [ + "ograp", + "h" + ], + [ + "o", + "graph" + ], + [ + "ik", + "e" + ], + [ + "i", + "ke" + ], + [ + "▁l", + "arge" + ], + [ + "▁larg", + "e" + ], + [ + "▁", + "large" + ], + [ + "▁#", + "###" + ], + [ + "▁##", + "##" + ], + [ + "▁###", + "#" + ], + [ + "▁", + "####" + ], + [ + "▁e", + "ither" + ], + [ + "du", + "ct" + ], + [ + "duc", + "t" + ], + [ + "d", + "uct" + ], + [ + "▁T", + "hen" + ], + [ + "▁The", + "n" + ], + [ + "▁Th", + "en" + ], + [ + "▁", + "Then" + ], + [ + "▁G", + "u" + ], + [ + "▁", + "Gu" + ], + [ + "ole", + "an" + ], + [ + "o", + "lean" + ], + [ + "pe", + "rt" + ], + [ + "per", + "t" + ], + [ + "p", + "ert" + ], + [ + "▁G", + "et" + ], + [ + "▁Ge", + "t" + ], + [ + "▁", + "Get" + ], + [ + "▁A", + "b" + ], + [ + "▁", + "Ab" + ], + [ + "▁sh", + "ort" + ], + [ + "▁", + "short" + ], + [ + "O", + "n" + ], + [ + "im", + "ent" + ], + [ + "ime", + "nt" + ], + [ + "imen", + "t" + ], + [ + "i", + "ment" + ], + [ + "▁pro", + "ject" + ], + [ + "▁", + "project" + ], + [ + "cri", + "pt" + ], + [ + "cr", + "ipt" + ], + [ + "c", + "ript" + ], + [ + "▁incl", + "uding" + ], + [ + "▁includ", + "ing" + ], + [ + "▁inclu", + "ding" + ], + [ + "▁", + "including" + ], + [ + "ни", + "я" + ], + [ + "▁m", + "aking" + ], + [ + "▁ma", + "king" + ], + [ + "▁", + "making" + ], + [ + "▁some", + "one" + ], + [ + "▁F", + "l" + ], + [ + "▁", + "Fl" + ], + [ + "▁s", + "at" + ], + [ + "▁sa", + "t" + ], + [ + "▁", + "sat" + ], + [ + "▁comp", + "any" + ], + [ + "▁compan", + "y" + ], + [ + "▁", + "company" + ], + [ + "oc", + "us" + ], + [ + "p", + "u" + ], + [ + "▁G", + "od" + ], + [ + "▁Go", + "d" + ], + [ + "▁", + "God" + ], + [ + "if", + "ication" + ], + [ + "ific", + "ation" + ], + [ + "N", + "o" + ], + [ + "▁s", + "n" + ], + [ + "▁", + "sn" + ], + [ + "an", + "o" + ], + [ + "a", + "no" + ], + [ + "g", + "a" + ], + [ + "▁a", + "u" + ], + [ + "▁", + "au" + ], + [ + "▁c", + "ou" + ], + [ + "▁co", + "u" + ], + [ + "▁", + "cou" + ], + [ + "á", + "s" + ], + [ + "en", + "ded" + ], + [ + "end", + "ed" + ], + [ + "ende", + "d" + ], + [ + "т", + "у" + ], + [ + "ob", + "er" + ], + [ + "obe", + "r" + ], + [ + "o", + "ber" + ], + [ + "▁n", + "othing" + ], + [ + "▁not", + "hing" + ], + [ + "▁no", + "thing" + ], + [ + "▁n", + "et" + ], + [ + "▁ne", + "t" + ], + [ + "▁", + "net" + ], + [ + "▁p", + "ot" + ], + [ + "▁po", + "t" + ], + [ + "▁", + "pot" + ], + [ + "▁t", + "yp" + ], + [ + "▁ty", + "p" + ], + [ + "▁", + "typ" + ], + [ + "▁it", + "em" + ], + [ + "▁i", + "tem" + ], + [ + "▁", + "item" + ], + [ + "re", + "w" + ], + [ + "r", + "ew" + ], + [ + "At", + "t" + ], + [ + "A", + "tt" + ], + [ + "▁you", + "ng" + ], + [ + "▁yo", + "ung" + ], + [ + "}", + "\r" + ], + [ + "nd", + "er" + ], + [ + "nde", + "r" + ], + [ + "n", + "der" + ], + [ + "st", + "art" + ], + [ + "sta", + "rt" + ], + [ + "star", + "t" + ], + [ + "▁S", + "c" + ], + [ + "▁", + "Sc" + ], + [ + "*", + ")" + ], + [ + "▁e", + "nc" + ], + [ + "▁en", + "c" + ], + [ + "▁", + "enc" + ], + [ + "▁w", + "omen" + ], + [ + "▁wom", + "en" + ], + [ + "▁wo", + "men" + ], + [ + "▁look", + "ing" + ], + [ + "▁lo", + "oking" + ], + [ + "▁", + "looking" + ], + [ + "▁р", + "о" + ], + [ + "▁", + "ро" + ], + [ + "▁he", + "alth" + ], + [ + "▁heal", + "th" + ], + [ + "▁", + "health" + ], + [ + "Pat", + "h" + ], + [ + "P", + "ath" + ], + [ + "▁A", + "fter" + ], + [ + "▁Af", + "ter" + ], + [ + "▁", + "After" + ], + [ + "▁m", + "ult" + ], + [ + "▁mu", + "lt" + ], + [ + "▁mul", + "t" + ], + [ + "▁", + "mult" + ], + [ + "▁{", + "\\" + ], + [ + "▁", + 
"{\\" + ], + [ + "▁l", + "and" + ], + [ + "▁la", + "nd" + ], + [ + "▁lan", + "d" + ], + [ + "▁", + "land" + ], + [ + "or", + "ld" + ], + [ + "▁D", + "es" + ], + [ + "▁De", + "s" + ], + [ + "▁", + "Des" + ], + [ + "▁e", + "ng" + ], + [ + "▁en", + "g" + ], + [ + "▁", + "eng" + ], + [ + "in", + "put" + ], + [ + "▁P", + "ol" + ], + [ + "▁Po", + "l" + ], + [ + "▁", + "Pol" + ], + [ + "\"", + "\"" + ], + [ + "Co", + "de" + ], + [ + "C", + "ode" + ], + [ + "▁s", + "upp" + ], + [ + "▁su", + "pp" + ], + [ + "▁sup", + "p" + ], + [ + "▁", + "supp" + ], + [ + "ain", + "er" + ], + [ + "ai", + "ner" + ], + [ + "aine", + "r" + ], + [ + "a", + "iner" + ], + [ + "he", + "ck" + ], + [ + "▁m", + "or" + ], + [ + "▁mo", + "r" + ], + [ + "▁", + "mor" + ], + [ + "▁m", + "ill" + ], + [ + "▁mil", + "l" + ], + [ + "▁mi", + "ll" + ], + [ + "▁", + "mill" + ], + [ + "▁a", + "w" + ], + [ + "▁", + "aw" + ], + [ + "f", + "s" + ], + [ + "▁do", + "ing" + ], + [ + "ting", + "s" + ], + [ + "t", + "ings" + ], + [ + "ad", + "es" + ], + [ + "ade", + "s" + ], + [ + "a", + "des" + ], + [ + "▁to", + "get" + ], + [ + "▁c", + "ertain" + ], + [ + "▁cert", + "ain" + ], + [ + "▁cer", + "tain" + ], + [ + "▁t", + "ogether" + ], + [ + "▁toget", + "her" + ], + [ + "C", + "E" + ], + [ + "ide", + "o" + ], + [ + "▁Amer", + "ican" + ], + [ + "▁America", + "n" + ], + [ + "▁", + "American" + ], + [ + "on", + "y" + ], + [ + "o", + "ny" + ], + [ + "id", + "d" + ], + [ + "i", + "dd" + ], + [ + "I", + "I" + ], + [ + "ge", + "d" + ], + [ + "g", + "ed" + ], + [ + "ab", + "les" + ], + [ + "able", + "s" + ], + [ + "abl", + "es" + ], + [ + "a", + "bles" + ], + [ + "▁ide", + "nt" + ], + [ + "▁id", + "ent" + ], + [ + "▁", + "ident" + ], + [ + "io", + "d" + ], + [ + "i", + "od" + ], + [ + "▁p", + "arent" + ], + [ + "▁par", + "ent" + ], + [ + "▁pa", + "rent" + ], + [ + "▁pare", + "nt" + ], + [ + "▁", + "parent" + ], + [ + "F", + "or" + ], + [ + "amb", + "da" + ], + [ + "an", + "do" + ], + [ + "and", + "o" + ], + [ + "=", + "\\" + ], + [ + "ag", + "ed" + ], + [ + "age", + "d" + ], + [ + "a", + "ged" + ], + [ + "en", + "ding" + ], + [ + "end", + "ing" + ], + [ + "In", + "t" + ], + [ + "I", + "nt" + ], + [ + "▁poss", + "ible" + ], + [ + "▁", + "possible" + ], + [ + "▁с", + "о" + ], + [ + "▁", + "со" + ], + [ + "iv", + "ity" + ], + [ + "ivi", + "ty" + ], + [ + "nu", + "m" + ], + [ + "n", + "um" + ], + [ + "r", + "t" + ], + [ + "aj", + "or" + ], + [ + "ajo", + "r" + ], + [ + "a", + "jor" + ], + [ + "cre", + "ate" + ], + [ + "creat", + "e" + ], + [ + "c", + "reate" + ], + [ + "ri", + "de" + ], + [ + "rid", + "e" + ], + [ + "r", + "ide" + ], + [ + "▁k", + "new" + ], + [ + "▁kn", + "ew" + ], + [ + "▁kne", + "w" + ], + [ + "bi", + "t" + ], + [ + "b", + "it" + ], + [ + "it", + "ional" + ], + [ + "ition", + "al" + ], + [ + "iti", + "onal" + ], + [ + "▁l", + "ik" + ], + [ + "▁li", + "k" + ], + [ + "▁", + "lik" + ], + [ + "▁H", + "er" + ], + [ + "▁He", + "r" + ], + [ + "▁", + "Her" + ], + [ + "ens", + "ion" + ], + [ + "\"", + "." 
+ ], + [ + "ot", + "o" + ], + [ + "o", + "to" + ], + [ + "▁ex", + "ist" + ], + [ + "▁", + "exist" + ], + [ + "ak", + "en" + ], + [ + "ake", + "n" + ], + [ + "a", + "ken" + ], + [ + "▁act", + "ually" + ], + [ + "▁actual", + "ly" + ], + [ + "c", + "a" + ], + [ + "▁", + "Г" + ], + [ + "х", + "о" + ], + [ + "in", + "n" + ], + [ + "i", + "nn" + ], + [ + "Al", + "l" + ], + [ + "A", + "ll" + ], + [ + "bu", + "f" + ], + [ + "b", + "uf" + ], + [ + "▁M", + "e" + ], + [ + "▁", + "Me" + ], + [ + "▁s", + "een" + ], + [ + "▁se", + "en" + ], + [ + "▁see", + "n" + ], + [ + "▁", + "seen" + ], + [ + "op", + "s" + ], + [ + "o", + "ps" + ], + [ + "No", + "t" + ], + [ + "N", + "ot" + ], + [ + "▁cont", + "rol" + ], + [ + "▁contr", + "ol" + ], + [ + "▁contro", + "l" + ], + [ + "▁", + "control" + ], + [ + "▁res", + "pon" + ], + [ + "▁resp", + "on" + ], + [ + "▁", + "respon" + ], + [ + "}", + ";" + ], + [ + "il", + "t" + ], + [ + "i", + "lt" + ], + [ + "is", + "k" + ], + [ + "i", + "sk" + ], + [ + "▁b", + "ad" + ], + [ + "▁ba", + "d" + ], + [ + "▁", + "bad" + ], + [ + "▁o", + "ften" + ], + [ + "▁of", + "ten" + ], + [ + "▁p", + "ast" + ], + [ + "▁pas", + "t" + ], + [ + "▁pa", + "st" + ], + [ + "ap", + "er" + ], + [ + "ape", + "r" + ], + [ + "a", + "per" + ], + [ + "▁re", + "ason" + ], + [ + "▁", + "reason" + ], + [ + "et", + "ers" + ], + [ + "eter", + "s" + ], + [ + "ete", + "rs" + ], + [ + "e", + "ters" + ], + [ + "▁w", + "anted" + ], + [ + "▁want", + "ed" + ], + [ + "ur", + "a" + ], + [ + "u", + "ra" + ], + [ + "ta", + "ble" + ], + [ + "tab", + "le" + ], + [ + "t", + "able" + ], + [ + "or", + "mal" + ], + [ + "orm", + "al" + ], + [ + "wid", + "th" + ], + [ + "w", + "idth" + ], + [ + "г", + "а" + ], + [ + "pt", + "r" + ], + [ + "p", + "tr" + ], + [ + "▁d", + "est" + ], + [ + "▁de", + "st" + ], + [ + "▁des", + "t" + ], + [ + "▁", + "dest" + ], + [ + "▁de", + "sign" + ], + [ + "▁des", + "ign" + ], + [ + "▁", + "design" + ], + [ + "▁s", + "ound" + ], + [ + "▁so", + "und" + ], + [ + "▁sou", + "nd" + ], + [ + "▁", + "sound" + ], + [ + "▁p", + "lan" + ], + [ + "▁pl", + "an" + ], + [ + "▁", + "plan" + ], + [ + "▁b", + "ase" + ], + [ + "▁bas", + "e" + ], + [ + "▁ba", + "se" + ], + [ + "▁", + "base" + ], + [ + "ha", + "nd" + ], + [ + "han", + "d" + ], + [ + "h", + "and" + ], + [ + "g", + "s" + ], + [ + "▁s", + "ays" + ], + [ + "▁sa", + "ys" + ], + [ + "▁say", + "s" + ], + [ + "fun", + "ction" + ], + [ + "f", + "unction" + ], + [ + "▁t", + "ri" + ], + [ + "▁tr", + "i" + ], + [ + "▁", + "tri" + ], + [ + "m", + "t" + ], + [ + "▁in", + "vest" + ], + [ + "▁inv", + "est" + ], + [ + "▁av", + "ailable" + ], + [ + "▁", + "available" + ], + [ + "ay", + "out" + ], + [ + "a", + "yout" + ], + [ + "▁o", + "ch" + ], + [ + "▁oc", + "h" + ], + [ + "▁", + "och" + ], + [ + "▁l", + "as" + ], + [ + "▁la", + "s" + ], + [ + "▁", + "las" + ], + [ + "il", + "led" + ], + [ + "ill", + "ed" + ], + [ + "ille", + "d" + ], + [ + "V", + "al" + ], + [ + "▁", + "ф" + ], + [ + "ie", + "ty" + ], + [ + "iet", + "y" + ], + [ + "i", + "ety" + ], + [ + "mo", + "n" + ], + [ + "m", + "on" + ], + [ + "Ha", + "nd" + ], + [ + "H", + "and" + ], + [ + "F", + "r" + ], + [ + "ia", + "m" + ], + [ + "i", + "am" + ], + [ + "pa", + "ce" + ], + [ + "p", + "ace" + ], + [ + "▁O", + "b" + ], + [ + "▁", + "Ob" + ], + [ + "▁p", + "ara" + ], + [ + "▁par", + "a" + ], + [ + "▁pa", + "ra" + ], + [ + "▁", + "para" + ], + [ + "▁me", + "et" + ], + [ + "▁s", + "um" + ], + [ + "▁su", + "m" + ], + [ + "▁", + "sum" + ], + [ + "M", + "essage" + ], + [ + "ic", + "i" + ], + [ + "i", + "ci" + ], 
+ [ + "▁k", + "nown" + ], + [ + "▁kn", + "own" + ], + [ + "▁know", + "n" + ], + [ + "▁", + "known" + ], + [ + "▁g", + "en" + ], + [ + "▁ge", + "n" + ], + [ + "▁", + "gen" + ], + [ + "am", + "ma" + ], + [ + "amm", + "a" + ], + [ + "a", + "mma" + ], + [ + "ar", + "r" + ], + [ + "a", + "rr" + ], + [ + "▁t", + "re" + ], + [ + "▁tr", + "e" + ], + [ + "▁", + "tre" + ], + [ + "ok", + "e" + ], + [ + "o", + "ke" + ], + [ + "ut", + "h" + ], + [ + "u", + "th" + ], + [ + "~", + "\\" + ], + [ + "▁exper", + "ience" + ], + [ + "▁experi", + "ence" + ], + [ + "ic", + "le" + ], + [ + "icl", + "e" + ], + [ + "i", + "cle" + ], + [ + "▁I", + "l" + ], + [ + "▁", + "Il" + ], + [ + "▁s", + "ent" + ], + [ + "▁se", + "nt" + ], + [ + "▁sen", + "t" + ], + [ + "▁", + "sent" + ], + [ + "▁o", + "thers" + ], + [ + "▁other", + "s" + ], + [ + "▁", + "others" + ], + [ + "▁s", + "oft" + ], + [ + "▁so", + "ft" + ], + [ + "▁", + "soft" + ], + [ + "I", + "P" + ], + [ + "▁m", + "ax" + ], + [ + "▁ma", + "x" + ], + [ + "▁", + "max" + ], + [ + "ba", + "ll" + ], + [ + "bal", + "l" + ], + [ + "b", + "all" + ], + [ + "▁mark", + "et" + ], + [ + "▁mar", + "ket" + ], + [ + "▁", + "market" + ], + [ + "▁p", + "our" + ], + [ + "▁po", + "ur" + ], + [ + "▁pou", + "r" + ], + [ + "pr", + "ession" + ], + [ + "press", + "ion" + ], + [ + "p", + "ression" + ], + [ + "ep", + "s" + ], + [ + "e", + "ps" + ], + [ + "▁s", + "aw" + ], + [ + "▁sa", + "w" + ], + [ + "▁a", + "cross" + ], + [ + "▁ac", + "ross" + ], + [ + "▁S", + "u" + ], + [ + "▁", + "Su" + ], + [ + "O", + "ver" + ], + [ + "ни", + "е" + ], + [ + "ul", + "ation" + ], + [ + "u", + "lation" + ], + [ + "▁R", + "eg" + ], + [ + "▁Re", + "g" + ], + [ + "▁", + "Reg" + ], + [ + "▁+", + "=" + ], + [ + "▁", + "+=" + ], + [ + "bo", + "dy" + ], + [ + "b", + "ody" + ], + [ + ")", + "\\" + ], + [ + "▁pr", + "int" + ], + [ + "▁pri", + "nt" + ], + [ + "▁prin", + "t" + ], + [ + "▁", + "print" + ], + [ + "▁п", + "ри" + ], + [ + "▁пр", + "и" + ], + [ + "▁", + "при" + ], + [ + "d", + "b" + ], + [ + "our", + "ces" + ], + [ + "ource", + "s" + ], + [ + "ward", + "s" + ], + [ + "war", + "ds" + ], + [ + "w", + "ards" + ], + [ + "▁bl", + "ack" + ], + [ + "▁", + "black" + ], + [ + "с", + "о" + ], + [ + "il", + "i" + ], + [ + "i", + "li" + ], + [ + "▁E", + "d" + ], + [ + "▁", + "Ed" + ], + [ + "▁com", + "plet" + ], + [ + "▁comp", + "let" + ], + [ + "▁compl", + "et" + ], + [ + "▁s", + "ingle" + ], + [ + "▁sing", + "le" + ], + [ + "▁sin", + "gle" + ], + [ + "▁", + "single" + ], + [ + "▁I", + "N" + ], + [ + "▁", + "IN" + ], + [ + "ac", + "hed" + ], + [ + "ach", + "ed" + ], + [ + "ache", + "d" + ], + [ + "a", + "ched" + ], + [ + "b", + "t" + ], + [ + "▁c", + "ode" + ], + [ + "▁co", + "de" + ], + [ + "▁cod", + "e" + ], + [ + "▁", + "code" + ], + [ + "▁b", + "ool" + ], + [ + "▁bo", + "ol" + ], + [ + "▁", + "bool" + ], + [ + "▁a", + "rea" + ], + [ + "▁are", + "a" + ], + [ + "▁ar", + "ea" + ], + [ + "▁", + "area" + ], + [ + "▁re", + "quire" + ], + [ + "▁requ", + "ire" + ], + [ + "▁", + "require" + ], + [ + "▁pro", + "blem" + ], + [ + "▁proble", + "m" + ], + [ + "▁prob", + "lem" + ], + [ + "ac", + "ed" + ], + [ + "ace", + "d" + ], + [ + "a", + "ced" + ], + [ + "Eq", + "u" + ], + [ + "E", + "qu" + ], + [ + "▁con", + "fig" + ], + [ + "▁conf", + "ig" + ], + [ + "▁", + "config" + ], + [ + "ve", + "c" + ], + [ + "v", + "ec" + ], + [ + "ne", + "y" + ], + [ + "n", + "ey" + ], + [ + "c", + "y" + ], + [ + "A", + "l" + ], + [ + "▁acc", + "ount" + ], + [ + "▁ac", + "count" + ], + [ + "▁", + "account" + ], + [ + "ym", + "bol" + ], + [ + 
"▁s", + "te" + ], + [ + "▁st", + "e" + ], + [ + "▁", + "ste" + ], + [ + "ge", + "s" + ], + [ + "g", + "es" + ], + [ + "Ar", + "ray" + ], + [ + "Arr", + "ay" + ], + [ + "em", + "pl" + ], + [ + "emp", + "l" + ], + [ + "con", + "text" + ], + [ + "cont", + "ext" + ], + [ + "De", + "s" + ], + [ + "D", + "es" + ], + [ + "Res", + "ult" + ], + [ + "ec", + "ut" + ], + [ + "e", + "cut" + ], + [ + "▁t", + "arget" + ], + [ + "▁tar", + "get" + ], + [ + "▁", + "target" + ], + [ + "▁get", + "ting" + ], + [ + "\"", + "/>" + ], + [ + "og", + "le" + ], + [ + "o", + "gle" + ], + [ + "▁him", + "self" + ], + [ + "▁was", + "n" + ], + [ + "▁wa", + "sn" + ], + [ + "▁b", + "lock" + ], + [ + "▁bl", + "ock" + ], + [ + "▁blo", + "ck" + ], + [ + "▁", + "block" + ], + [ + "▁a", + "nt" + ], + [ + "▁an", + "t" + ], + [ + "▁", + "ant" + ], + [ + "▁Y", + "ork" + ], + [ + "▁be", + "come" + ], + [ + "▁bec", + "ome" + ], + [ + "if", + "f" + ], + [ + "i", + "ff" + ], + [ + "port", + "s" + ], + [ + "por", + "ts" + ], + [ + "p", + "orts" + ], + [ + "re", + "ate" + ], + [ + "reat", + "e" + ], + [ + "rea", + "te" + ], + [ + "=", + "'" + ], + [ + "c", + "d" + ], + [ + "loc", + "ation" + ], + [ + "l", + "ocation" + ], + [ + "е", + "т" + ], + [ + "▁a", + "ccess" + ], + [ + "▁acc", + "ess" + ], + [ + "▁ac", + "cess" + ], + [ + "▁", + "access" + ], + [ + "gr", + "ess" + ], + [ + "gre", + "ss" + ], + [ + "gres", + "s" + ], + [ + "g", + "ress" + ], + [ + "ro", + "s" + ], + [ + "r", + "os" + ], + [ + "U", + "p" + ], + [ + "▁work", + "ing" + ], + [ + "▁wor", + "king" + ], + [ + "▁", + "working" + ], + [ + "▁A", + "m" + ], + [ + "▁", + "Am" + ], + [ + "iq", + "u" + ], + [ + "i", + "qu" + ], + [ + "ce", + "r" + ], + [ + "c", + "er" + ], + [ + "▁(", + "(" + ], + [ + "▁", + "((" + ], + [ + "▁P", + "er" + ], + [ + "▁Pe", + "r" + ], + [ + "▁", + "Per" + ], + [ + "▁f", + "unc" + ], + [ + "▁fun", + "c" + ], + [ + "▁fu", + "nc" + ], + [ + "▁", + "func" + ], + [ + "▁g", + "irl" + ], + [ + "▁gi", + "rl" + ], + [ + "▁gir", + "l" + ], + [ + "▁", + "girl" + ], + [ + "▁ab", + "ove" + ], + [ + "pe", + "n" + ], + [ + "p", + "en" + ], + [ + "п", + "и" + ], + [ + "id", + "o" + ], + [ + "i", + "do" + ], + [ + "▁v", + "ersion" + ], + [ + "▁vers", + "ion" + ], + [ + "▁", + "version" + ], + [ + "T", + "Y" + ], + [ + "▁", + ";" + ], + [ + "ma", + "ry" + ], + [ + "mar", + "y" + ], + [ + "m", + "ary" + ], + [ + "ab", + "led" + ], + [ + "able", + "d" + ], + [ + "abl", + "ed" + ], + [ + "a", + "bled" + ], + [ + "an", + "nel" + ], + [ + "ann", + "el" + ], + [ + "anne", + "l" + ], + [ + "▁ex", + "ample" + ], + [ + "▁exam", + "ple" + ], + [ + "▁", + "example" + ], + [ + "▁con", + "text" + ], + [ + "▁cont", + "ext" + ], + [ + "▁", + "context" + ], + [ + "O", + "P" + ], + [ + "▁re", + "d" + ], + [ + "▁r", + "ed" + ], + [ + "▁", + "red" + ], + [ + "▁c", + "ir" + ], + [ + "▁ci", + "r" + ], + [ + "▁", + "cir" + ], + [ + "s", + "m" + ], + [ + "Lo", + "g" + ], + [ + "L", + "og" + ], + [ + "▁s", + "pace" + ], + [ + "▁sp", + "ace" + ], + [ + "▁", + "space" + ], + [ + "▁f", + "ut" + ], + [ + "▁fu", + "t" + ], + [ + "▁G", + "ener" + ], + [ + "▁Ge", + "ner" + ], + [ + "▁Gen", + "er" + ], + [ + "▁Gene", + "r" + ], + [ + "▁", + "Gener" + ], + [ + "il", + "ls" + ], + [ + "ill", + "s" + ], + [ + "▁d", + "ri" + ], + [ + "▁dr", + "i" + ], + [ + "_", + "." 
+ ], + [ + "▁f", + "elt" + ], + [ + "▁fe", + "lt" + ], + [ + "▁fel", + "t" + ], + [ + "▁o", + "ffic" + ], + [ + "▁of", + "fic" + ], + [ + "▁off", + "ic" + ], + [ + "▁=", + "==" + ], + [ + "▁==", + "=" + ], + [ + "▁", + "===" + ], + [ + "i", + "i" + ], + [ + "▁start", + "ed" + ], + [ + "▁star", + "ted" + ], + [ + "▁", + "Т" + ], + [ + "▁}", + ");" + ], + [ + "▁})", + ";" + ], + [ + "▁", + "});" + ], + [ + "j", + "s" + ], + [ + "▁fr", + "ont" + ], + [ + "▁fro", + "nt" + ], + [ + "▁", + "front" + ], + [ + "▁al", + "most" + ], + [ + "ir", + "m" + ], + [ + "i", + "rm" + ], + [ + "!", + "\"" + ], + [ + "sign", + "ed" + ], + [ + "sig", + "ned" + ], + [ + "s", + "igned" + ], + [ + "▁y", + "et" + ], + [ + "▁ye", + "t" + ], + [ + "▁t", + "rad" + ], + [ + "▁tr", + "ad" + ], + [ + "▁tra", + "d" + ], + [ + "ient", + "s" + ], + [ + "ien", + "ts" + ], + [ + "i", + "ents" + ], + [ + "am", + "a" + ], + [ + "a", + "ma" + ], + [ + "▁in", + "put" + ], + [ + "▁", + "input" + ], + [ + "li", + "m" + ], + [ + "l", + "im" + ], + [ + "п", + "а" + ], + [ + "▁к", + "а" + ], + [ + "▁", + "ка" + ], + [ + "▁c", + "amp" + ], + [ + "▁cam", + "p" + ], + [ + "▁ca", + "mp" + ], + [ + "▁", + "camp" + ], + [ + "ib", + "r" + ], + [ + "i", + "br" + ], + [ + "fe", + "ct" + ], + [ + "f", + "ect" + ], + [ + "un", + "t" + ], + [ + "u", + "nt" + ], + [ + "▁h", + "alf" + ], + [ + "▁hal", + "f" + ], + [ + "▁", + "half" + ], + [ + "▁c", + "over" + ], + [ + "▁co", + "ver" + ], + [ + "▁cov", + "er" + ], + [ + "▁", + "cover" + ], + [ + "angu", + "age" + ], + [ + "▁b", + "en" + ], + [ + "▁be", + "n" + ], + [ + "▁", + "ben" + ], + [ + "h", + "a" + ], + [ + "▁d", + "iff" + ], + [ + "▁di", + "ff" + ], + [ + "▁dif", + "f" + ], + [ + "▁", + "diff" + ], + [ + "_", + "\\" + ], + [ + "▁о", + "б" + ], + [ + "▁", + "об" + ], + [ + "]", + ")" + ], + [ + "od", + "es" + ], + [ + "ode", + "s" + ], + [ + "o", + "des" + ], + [ + "he", + "l" + ], + [ + "h", + "el" + ], + [ + "io", + "s" + ], + [ + "i", + "os" + ], + [ + "▁", + "О" + ], + [ + "▁m", + "ot" + ], + [ + "▁mo", + "t" + ], + [ + "▁", + "mot" + ], + [ + "▁s", + "ocial" + ], + [ + "▁so", + "cial" + ], + [ + "▁soc", + "ial" + ], + [ + "▁soci", + "al" + ], + [ + "▁", + "social" + ], + [ + "////", + "////" + ], + [ + "▁s", + "tre" + ], + [ + "▁st", + "re" + ], + [ + "▁str", + "e" + ], + [ + "▁", + "stre" + ], + [ + "gr", + "ound" + ], + [ + "gro", + "und" + ], + [ + "g", + "round" + ], + [ + "і", + "в" + ], + [ + "ob", + "ject" + ], + [ + "obj", + "ect" + ], + [ + "pl", + "es" + ], + [ + "ple", + "s" + ], + [ + "p", + "les" + ], + [ + "re", + "ed" + ], + [ + "ree", + "d" + ], + [ + "r", + "eed" + ], + [ + "▁e", + "en" + ], + [ + "▁", + "een" + ], + [ + "▁b", + "ased" + ], + [ + "▁bas", + "ed" + ], + [ + "▁base", + "d" + ], + [ + "▁ba", + "sed" + ], + [ + "▁", + "based" + ], + [ + "▁r", + "ange" + ], + [ + "▁ran", + "ge" + ], + [ + "▁rang", + "e" + ], + [ + "▁", + "range" + ], + [ + "A", + "n" + ], + [ + "ur", + "g" + ], + [ + "u", + "rg" + ], + [ + "▁le", + "arn" + ], + [ + "▁lear", + "n" + ], + [ + "▁", + "learn" + ], + [ + "▁e", + "xc" + ], + [ + "▁ex", + "c" + ], + [ + "▁", + "exc" + ], + [ + "▁im", + "p" + ], + [ + "▁i", + "mp" + ], + [ + "▁", + "imp" + ], + [ + "▁me", + "ans" + ], + [ + "▁mean", + "s" + ], + [ + "▁w", + "ur" + ], + [ + "en", + "ds" + ], + [ + "end", + "s" + ], + [ + "vo", + "id" + ], + [ + "v", + "oid" + ], + [ + "▁s", + "td" + ], + [ + "▁st", + "d" + ], + [ + "▁", + "std" + ], + [ + "▁part", + "icular" + ], + [ + "▁partic", + "ular" + ], + [ + "▁particul", + "ar" + ], + [ + 
"▁parti", + "cular" + ], + [ + "j", + "a" + ], + [ + "▁s", + "ource" + ], + [ + "▁sour", + "ce" + ], + [ + "▁", + "source" + ], + [ + "def", + "ault" + ], + [ + "p", + "y" + ], + [ + "▁a", + "ls" + ], + [ + "▁al", + "s" + ], + [ + "▁", + "als" + ], + [ + "sc", + "ri" + ], + [ + "scr", + "i" + ], + [ + "s", + "cri" + ], + [ + "st", + "atus" + ], + [ + "stat", + "us" + ], + [ + "▁st", + "ory" + ], + [ + "▁stor", + "y" + ], + [ + "▁sto", + "ry" + ], + [ + "▁", + "story" + ], + [ + "▁b", + "egin" + ], + [ + "▁be", + "gin" + ], + [ + "▁beg", + "in" + ], + [ + "▁", + "begin" + ], + [ + "▁pos", + "ition" + ], + [ + "▁posit", + "ion" + ], + [ + "▁", + "position" + ], + [ + "▁spec", + "ial" + ], + [ + "▁spe", + "cial" + ], + [ + "▁", + "special" + ], + [ + "ph", + "p" + ], + [ + "p", + "hp" + ], + [ + "▁b", + "ar" + ], + [ + "▁ba", + "r" + ], + [ + "▁", + "bar" + ], + [ + "▁p", + "ract" + ], + [ + "▁pr", + "act" + ], + [ + "▁pra", + "ct" + ], + [ + "▁prac", + "t" + ], + [ + "cal", + "l" + ], + [ + "ca", + "ll" + ], + [ + "c", + "all" + ], + [ + "▁d", + "as" + ], + [ + "▁da", + "s" + ], + [ + "▁", + "das" + ], + [ + "▁r", + "ad" + ], + [ + "▁ra", + "d" + ], + [ + "▁", + "rad" + ], + [ + "▁cl", + "ose" + ], + [ + "▁clos", + "e" + ], + [ + "▁clo", + "se" + ], + [ + "▁", + "close" + ], + [ + "ww", + "w" + ], + [ + "w", + "ww" + ], + [ + "ер", + "е" + ], + [ + "е", + "ре" + ], + [ + "g", + "u" + ], + [ + "▁E", + "r" + ], + [ + "▁", + "Er" + ], + [ + "▁d", + "om" + ], + [ + "▁do", + "m" + ], + [ + "▁", + "dom" + ], + [ + "A", + "M" + ], + [ + "▁b", + "ed" + ], + [ + "▁be", + "d" + ], + [ + "▁", + "bed" + ], + [ + "▁sever", + "al" + ], + [ + "au", + "l" + ], + [ + "a", + "ul" + ], + [ + "bo", + "x" + ], + [ + "b", + "ox" + ], + [ + "▁l", + "ow" + ], + [ + "▁lo", + "w" + ], + [ + "▁", + "low" + ], + [ + "pa", + "ck" + ], + [ + "p", + "ack" + ], + [ + "Re", + "g" + ], + [ + "R", + "eg" + ], + [ + "O", + "f" + ], + [ + "at", + "ures" + ], + [ + "ature", + "s" + ], + [ + "atur", + "es" + ], + [ + "atu", + "res" + ], + [ + "é", + "n" + ], + [ + "ed", + "er" + ], + [ + "ede", + "r" + ], + [ + "e", + "der" + ], + [ + "uild", + "er" + ], + [ + "ca", + "st" + ], + [ + "cas", + "t" + ], + [ + "c", + "ast" + ], + [ + "con", + "om" + ], + [ + "co", + "nom" + ], + [ + "c", + "onom" + ], + [ + "ra", + "ft" + ], + [ + "raf", + "t" + ], + [ + "r", + "aft" + ], + [ + "▁m", + "akes" + ], + [ + "▁make", + "s" + ], + [ + "▁ma", + "kes" + ], + [ + "Lo", + "c" + ], + [ + "L", + "oc" + ], + [ + "ht", + "tp" + ], + [ + "htt", + "p" + ], + [ + "h", + "ttp" + ], + [ + "▁a", + "bs" + ], + [ + "▁ab", + "s" + ], + [ + "▁", + "abs" + ], + [ + "re", + "sh" + ], + [ + "res", + "h" + ], + [ + "r", + "esh" + ], + [ + "▁W", + "ill" + ], + [ + "▁Wil", + "l" + ], + [ + "▁Wi", + "ll" + ], + [ + "▁", + "Will" + ], + [ + "bre", + "ak" + ], + [ + "b", + "reak" + ], + [ + "▁o", + "ptions" + ], + [ + "▁opt", + "ions" + ], + [ + "▁option", + "s" + ], + [ + "▁", + "options" + ], + [ + "fo", + "rt" + ], + [ + "for", + "t" + ], + [ + "f", + "ort" + ], + [ + "▁и", + "з" + ], + [ + "▁", + "из" + ], + [ + "▁a", + "nal" + ], + [ + "▁an", + "al" + ], + [ + "▁", + "anal" + ], + [ + "▁e", + "nv" + ], + [ + "▁en", + "v" + ], + [ + "▁", + "env" + ], + [ + "(", + "{" + ], + [ + "ev", + "ent" + ], + [ + "even", + "t" + ], + [ + "eve", + "nt" + ], + [ + "e", + "vent" + ], + [ + "▁p", + "age" + ], + [ + "▁pa", + "ge" + ], + [ + "▁pag", + "e" + ], + [ + "▁", + "page" + ], + [ + "ter", + "nal" + ], + [ + "tern", + "al" + ], + [ + "▁d", + "istribut" + ], + [ + 
"▁dist", + "ribut" + ], + [ + "▁f", + "ood" + ], + [ + "▁fo", + "od" + ], + [ + "▁foo", + "d" + ], + [ + "▁", + "food" + ], + [ + "che", + "ck" + ], + [ + "c", + "heck" + ], + [ + "C", + "K" + ], + [ + "▁в", + "о" + ], + [ + "▁", + "во" + ], + [ + "as", + "sert" + ], + [ + "ass", + "ert" + ], + [ + "asse", + "rt" + ], + [ + "á", + "n" + ], + [ + "ba", + "se" + ], + [ + "bas", + "e" + ], + [ + "b", + "ase" + ], + [ + "▁w", + "hole" + ], + [ + "▁wh", + "ole" + ], + [ + "▁who", + "le" + ], + [ + "ac", + "ión" + ], + [ + "ació", + "n" + ], + [ + "aci", + "ón" + ], + [ + "a", + "ción" + ], + [ + "O", + "D" + ], + [ + "▁turn", + "ed" + ], + [ + "▁tur", + "ned" + ], + [ + "ig", + "ma" + ], + [ + "▁res", + "ponse" + ], + [ + "▁respon", + "se" + ], + [ + "▁respons", + "e" + ], + [ + "▁", + "response" + ], + [ + "▁Univers", + "ity" + ], + [ + "▁d", + "iv" + ], + [ + "▁di", + "v" + ], + [ + "▁", + "div" + ], + [ + "ap", + "ter" + ], + [ + "apt", + "er" + ], + [ + "▁result", + "s" + ], + [ + "▁", + "results" + ], + [ + "▁re", + "present" + ], + [ + "▁rep", + "resent" + ], + [ + "▁every", + "thing" + ], + [ + "▁C", + "ent" + ], + [ + "▁Ce", + "nt" + ], + [ + "▁", + "Cent" + ], + [ + "ut", + "es" + ], + [ + "ute", + "s" + ], + [ + "u", + "tes" + ], + [ + "ri", + "x" + ], + [ + "r", + "ix" + ], + [ + "▁S", + "ome" + ], + [ + "▁So", + "me" + ], + [ + "▁Som", + "e" + ], + [ + "▁", + "Some" + ], + [ + "▁be", + "hind" + ], + [ + "▁beh", + "ind" + ], + [ + "▁c", + "reat" + ], + [ + "▁cre", + "at" + ], + [ + "▁", + "creat" + ], + [ + "pl", + "ace" + ], + [ + "plac", + "e" + ], + [ + "p", + "lace" + ], + [ + "s", + "u" + ], + [ + "▁P", + "art" + ], + [ + "▁Par", + "t" + ], + [ + "▁Pa", + "rt" + ], + [ + "▁", + "Part" + ], + [ + "um", + "b" + ], + [ + "u", + "mb" + ], + [ + "math", + "bb" + ], + [ + "pi", + "ng" + ], + [ + "pin", + "g" + ], + [ + "p", + "ing" + ], + [ + "▁m", + "atch" + ], + [ + "▁mat", + "ch" + ], + [ + "▁", + "match" + ], + [ + "O", + "ut" + ], + [ + "do", + "m" + ], + [ + "d", + "om" + ], + [ + "▁s", + "itu" + ], + [ + "▁sit", + "u" + ], + [ + "▁si", + "tu" + ], + [ + "d", + "r" + ], + [ + "ar", + "a" + ], + [ + "a", + "ra" + ], + [ + "▁w", + "indow" + ], + [ + "▁wind", + "ow" + ], + [ + "▁", + "window" + ], + [ + "n", + "s" + ], + [ + "lish", + "ed" + ], + [ + "l", + "ished" + ], + [ + "▁V", + "er" + ], + [ + "▁Ve", + "r" + ], + [ + "▁", + "Ver" + ], + [ + "▁m", + "essage" + ], + [ + "▁mess", + "age" + ], + [ + "▁", + "message" + ], + [ + "▁E", + "m" + ], + [ + "▁", + "Em" + ], + [ + "▁h", + "uman" + ], + [ + "▁hum", + "an" + ], + [ + "▁", + "human" + ], + [ + "per", + "ties" + ], + [ + "pert", + "ies" + ], + [ + "л", + "у" + ], + [ + "le", + "m" + ], + [ + "l", + "em" + ], + [ + "OR", + "T" + ], + [ + "O", + "RT" + ], + [ + "▁e", + "arly" + ], + [ + "▁ear", + "ly" + ], + [ + "▁qu", + "ick" + ], + [ + "▁qui", + "ck" + ], + [ + "▁", + "quick" + ], + [ + "▁т", + "а" + ], + [ + "▁", + "та" + ], + [ + "ro", + "id" + ], + [ + "r", + "oid" + ], + [ + "▁c", + "ountry" + ], + [ + "▁coun", + "try" + ], + [ + "▁count", + "ry" + ], + [ + "▁countr", + "y" + ], + [ + "▁", + "country" + ], + [ + "▁d", + "ue" + ], + [ + "▁du", + "e" + ], + [ + "▁", + "due" + ], + [ + "▁D", + "ie" + ], + [ + "▁Di", + "e" + ], + [ + "▁", + "Die" + ], + [ + "▁t", + "rying" + ], + [ + "▁tr", + "ying" + ], + [ + "▁try", + "ing" + ], + [ + "▁l", + "ive" + ], + [ + "▁li", + "ve" + ], + [ + "▁liv", + "e" + ], + [ + "▁", + "live" + ], + [ + "▁p", + "ress" + ], + [ + "▁pre", + "ss" + ], + [ + "▁pr", + "ess" + ], + [ + "▁pres", + "s" 
+ ], + [ + "▁", + "press" + ], + [ + "IN", + "T" + ], + [ + "I", + "NT" + ], + [ + "W", + "ith" + ], + [ + "ov", + "ed" + ], + [ + "ove", + "d" + ], + [ + "o", + "ved" + ], + [ + "▁spec", + "ific" + ], + [ + "▁", + "specific" + ], + [ + "▁f", + "all" + ], + [ + "▁fa", + "ll" + ], + [ + "▁fal", + "l" + ], + [ + "▁", + "fall" + ], + [ + "u", + "k" + ], + [ + "y", + "l" + ], + [ + "▁gener", + "al" + ], + [ + "▁gen", + "eral" + ], + [ + "▁gene", + "ral" + ], + [ + "▁", + "general" + ], + [ + "м", + "у" + ], + [ + "н", + "у" + ], + [ + "▁n", + "ames" + ], + [ + "▁name", + "s" + ], + [ + "▁na", + "mes" + ], + [ + "▁nam", + "es" + ], + [ + "▁", + "names" + ], + [ + "wh", + "ere" + ], + [ + "whe", + "re" + ], + [ + "w", + "here" + ], + [ + "▁The", + "se" + ], + [ + "▁Th", + "ese" + ], + [ + "▁", + "These" + ], + [ + "▁s", + "il" + ], + [ + "▁si", + "l" + ], + [ + "▁", + "sil" + ], + [ + "é", + "t" + ], + [ + "▁e", + "ner" + ], + [ + "▁en", + "er" + ], + [ + "▁", + "ener" + ], + [ + "▁N", + "ow" + ], + [ + "▁No", + "w" + ], + [ + "▁", + "Now" + ], + [ + "▁add", + "ress" + ], + [ + "▁addr", + "ess" + ], + [ + "▁", + "address" + ], + [ + "Res", + "ponse" + ], + [ + "▁M", + "r" + ], + [ + "▁", + "Mr" + ], + [ + "▁an", + "sw" + ], + [ + "▁ans", + "w" + ], + [ + "▁fil", + "m" + ], + [ + "▁fi", + "lm" + ], + [ + "▁", + "film" + ], + [ + "▁str", + "ong" + ], + [ + "▁stro", + "ng" + ], + [ + "▁", + "strong" + ], + [ + "▁b", + "ring" + ], + [ + "▁br", + "ing" + ], + [ + "▁Un", + "ited" + ], + [ + "▁Unit", + "ed" + ], + [ + "▁g", + "e" + ], + [ + "▁", + "ge" + ], + [ + "▁w", + "oman" + ], + [ + "▁wom", + "an" + ], + [ + "▁wo", + "man" + ], + [ + "▁", + "woman" + ], + [ + "Ne", + "w" + ], + [ + "N", + "ew" + ], + [ + "et", + "t" + ], + [ + "e", + "tt" + ], + [ + ".", + ")" + ], + [ + "en", + "ame" + ], + [ + "ena", + "me" + ], + [ + "e", + "name" + ], + [ + "▁A", + "N" + ], + [ + "▁", + "AN" + ], + [ + "▁de", + "scrib" + ], + [ + "▁desc", + "rib" + ], + [ + "з", + "а" + ], + [ + "is", + "ing" + ], + [ + "isi", + "ng" + ], + [ + "i", + "sing" + ], + [ + "E", + "L" + ], + [ + "q", + "l" + ], + [ + "▁f", + "ur" + ], + [ + "▁fu", + "r" + ], + [ + "▁", + "fur" + ], + [ + "y", + "ing" + ], + [ + "▁C", + "al" + ], + [ + "▁Ca", + "l" + ], + [ + "▁", + "Cal" + ], + [ + "▁D", + "r" + ], + [ + "▁", + "Dr" + ], + [ + "ER", + "R" + ], + [ + "E", + "RR" + ], + [ + "▁\\", + "\\" + ], + [ + "▁", + "\\\\" + ], + [ + "an", + "gle" + ], + [ + "ang", + "le" + ], + [ + "ur", + "ope" + ], + [ + "uro", + "pe" + ], + [ + "urop", + "e" + ], + [ + "▁c", + "ity" + ], + [ + "▁cit", + "y" + ], + [ + "▁ci", + "ty" + ], + [ + "▁", + "city" + ], + [ + "▁in", + "dex" + ], + [ + "▁ind", + "ex" + ], + [ + "▁inde", + "x" + ], + [ + "▁", + "index" + ], + [ + "▁a", + "ction" + ], + [ + "▁act", + "ion" + ], + [ + "▁", + "action" + ], + [ + "▁How", + "ever" + ], + [ + "▁", + "However" + ], + [ + "▁f", + "ig" + ], + [ + "▁fi", + "g" + ], + [ + "▁", + "fig" + ], + [ + "ia", + "s" + ], + [ + "i", + "as" + ], + [ + "▁quest", + "ion" + ], + [ + "▁", + "question" + ], + [ + "▁J", + "an" + ], + [ + "▁Ja", + "n" + ], + [ + "▁", + "Jan" + ], + [ + "▁M", + "ed" + ], + [ + "▁Me", + "d" + ], + [ + "▁", + "Med" + ], + [ + "▁C", + "ont" + ], + [ + "▁Con", + "t" + ], + [ + "▁Co", + "nt" + ], + [ + "▁", + "Cont" + ], + [ + "am", + "ed" + ], + [ + "ame", + "d" + ], + [ + "a", + "med" + ], + [ + "Cal", + "l" + ], + [ + "C", + "all" + ], + [ + "pl", + "ied" + ], + [ + "tt", + "y" + ], + [ + "t", + "ty" + ], + [ + "▁ind", + "ivid" + ], + [ + "pa", + "ge" + ], + [ + 
"pag", + "e" + ], + [ + "p", + "age" + ], + [ + "▁c", + "omb" + ], + [ + "▁com", + "b" + ], + [ + "▁co", + "mb" + ], + [ + "▁", + "comb" + ], + [ + "se", + "ction" + ], + [ + "sect", + "ion" + ], + [ + "s", + "ection" + ], + [ + "▁C", + "omm" + ], + [ + "▁Com", + "m" + ], + [ + "▁Co", + "mm" + ], + [ + "▁", + "Comm" + ], + [ + "ue", + "l" + ], + [ + "u", + "el" + ], + [ + "▁h", + "et" + ], + [ + "▁he", + "t" + ], + [ + "▁", + "het" + ], + [ + "▁B", + "ar" + ], + [ + "▁Ba", + "r" + ], + [ + "▁", + "Bar" + ], + [ + "ag", + "ement" + ], + [ + "age", + "ment" + ], + [ + "agem", + "ent" + ], + [ + "fi", + "n" + ], + [ + "f", + "in" + ], + [ + "▁m", + "ajor" + ], + [ + "▁ma", + "jor" + ], + [ + "▁maj", + "or" + ], + [ + "▁", + "major" + ], + [ + "op", + "er" + ], + [ + "ope", + "r" + ], + [ + "o", + "per" + ], + [ + "ap", + "i" + ], + [ + "a", + "pi" + ], + [ + "ro", + "om" + ], + [ + "r", + "oom" + ], + [ + "▁", + "„" + ], + [ + "▁h", + "ab" + ], + [ + "▁ha", + "b" + ], + [ + "▁", + "hab" + ], + [ + "з", + "и" + ], + [ + "▁a", + "uf" + ], + [ + "▁au", + "f" + ], + [ + "▁", + "auf" + ], + [ + "cur", + "rent" + ], + [ + "curr", + "ent" + ], + [ + "n", + "i" + ], + [ + "▁in", + "clude" + ], + [ + "▁incl", + "ude" + ], + [ + "▁includ", + "e" + ], + [ + "▁inclu", + "de" + ], + [ + "▁", + "include" + ], + [ + "▁qu", + "i" + ], + [ + "▁q", + "ui" + ], + [ + "v", + "a" + ], + [ + "U", + "E" + ], + [ + "▁ide", + "a" + ], + [ + "▁id", + "ea" + ], + [ + "▁", + "idea" + ], + [ + ",", + "'" + ], + [ + "▁requ", + "ired" + ], + [ + "▁require", + "d" + ], + [ + "▁", + "required" + ], + [ + "▁he", + "art" + ], + [ + "▁hear", + "t" + ], + [ + "▁", + "heart" + ], + [ + "ib", + "ility" + ], + [ + "ibil", + "ity" + ], + [ + "ict", + "ion" + ], + [ + "i", + "ction" + ], + [ + "Mod", + "el" + ], + [ + "Mode", + "l" + ], + [ + "Mo", + "del" + ], + [ + "wr", + "ite" + ], + [ + "writ", + "e" + ], + [ + "w", + "rite" + ], + [ + "▁cont", + "ent" + ], + [ + "▁conten", + "t" + ], + [ + "▁", + "content" + ], + [ + "▁w", + "er" + ], + [ + "▁we", + "r" + ], + [ + "▁", + "wer" + ], + [ + "▁h", + "ands" + ], + [ + "▁hand", + "s" + ], + [ + "▁han", + "ds" + ], + [ + "ze", + "n" + ], + [ + "z", + "en" + ], + [ + "ch", + "ar" + ], + [ + "cha", + "r" + ], + [ + "c", + "har" + ], + [ + "}^", + "{" + ], + [ + "}", + "^{" + ], + [ + "▁m", + "ass" + ], + [ + "▁ma", + "ss" + ], + [ + "▁mas", + "s" + ], + [ + "▁", + "mass" + ], + [ + "pl", + "y" + ], + [ + "p", + "ly" + ], + [ + "▁n", + "at" + ], + [ + "▁na", + "t" + ], + [ + "▁", + "nat" + ], + [ + "re", + "l" + ], + [ + "r", + "el" + ], + [ + "▁d", + "at" + ], + [ + "▁da", + "t" + ], + [ + "▁", + "dat" + ], + [ + "====", + "============" + ], + [ + "========", + "========" + ], + [ + "============", + "====" + ], + [ + "im", + "al" + ], + [ + "ima", + "l" + ], + [ + "i", + "mal" + ], + [ + "▁pro", + "bably" + ], + [ + "▁prob", + "ably" + ], + [ + "un", + "ch" + ], + [ + "unc", + "h" + ], + [ + "▁m", + "er" + ], + [ + "▁me", + "r" + ], + [ + "▁", + "mer" + ], + [ + "il", + "ar" + ], + [ + "ila", + "r" + ], + [ + "i", + "lar" + ], + [ + "ir", + "es" + ], + [ + "ire", + "s" + ], + [ + "i", + "res" + ], + [ + "▁w", + "atch" + ], + [ + "▁wat", + "ch" + ], + [ + "▁", + "watch" + ], + [ + "S", + "I" + ], + [ + "▁c", + "ult" + ], + [ + "▁cu", + "lt" + ], + [ + "▁cul", + "t" + ], + [ + "▁m", + "other" + ], + [ + "▁mot", + "her" + ], + [ + "▁mo", + "ther" + ], + [ + "▁", + "mother" + ], + [ + "▁govern", + "ment" + ], + [ + "or", + "ding" + ], + [ + "ord", + "ing" + ], + [ + "▁(", + ")" + ], + [ + 
"▁", + "()" + ], + [ + "▁p", + "ri" + ], + [ + "▁pr", + "i" + ], + [ + "▁l", + "ink" + ], + [ + "▁lin", + "k" + ], + [ + "▁", + "link" + ], + [ + "gr", + "oup" + ], + [ + "gro", + "up" + ], + [ + "g", + "roup" + ], + [ + "O", + "L" + ], + [ + "▁n", + "ear" + ], + [ + "▁ne", + "ar" + ], + [ + "▁S", + "er" + ], + [ + "▁Se", + "r" + ], + [ + "▁", + "Ser" + ], + [ + "Se", + "r" + ], + [ + "S", + "er" + ], + [ + "it", + "o" + ], + [ + "i", + "to" + ], + [ + "▁value", + "s" + ], + [ + "▁val", + "ues" + ], + [ + "▁", + "values" + ], + [ + "▁j", + "ava" + ], + [ + "▁ja", + "va" + ], + [ + "▁", + "java" + ], + [ + "ful", + "ly" + ], + [ + "full", + "y" + ], + [ + "f", + "ully" + ], + [ + "Co", + "unt" + ], + [ + "C", + "ount" + ], + [ + "++", + ")" + ], + [ + "▁v", + "i" + ], + [ + "▁", + "vi" + ], + [ + "▁wh", + "ite" + ], + [ + "▁", + "white" + ], + [ + "ma", + "t" + ], + [ + "m", + "at" + ], + [ + "ct", + "x" + ], + [ + "c", + "tx" + ], + [ + "▁con", + "c" + ], + [ + "▁co", + "nc" + ], + [ + "▁", + "conc" + ], + [ + "▁st", + "ay" + ], + [ + "▁sta", + "y" + ], + [ + "gi", + "ng" + ], + [ + "gin", + "g" + ], + [ + "g", + "ing" + ], + [ + "▁c", + "lear" + ], + [ + "▁cl", + "ear" + ], + [ + "▁cle", + "ar" + ], + [ + "▁", + "clear" + ], + [ + "▁c", + "opy" + ], + [ + "▁co", + "py" + ], + [ + "▁cop", + "y" + ], + [ + "▁", + "copy" + ], + [ + "sel", + "ves" + ], + [ + "▁prov", + "ide" + ], + [ + "▁w", + "ords" + ], + [ + "▁wor", + "ds" + ], + [ + "▁word", + "s" + ], + [ + "▁", + "words" + ], + [ + "com", + "p" + ], + [ + "co", + "mp" + ], + [ + "c", + "omp" + ], + [ + "ar", + "gs" + ], + [ + "arg", + "s" + ], + [ + "▁p", + "ick" + ], + [ + "▁pi", + "ck" + ], + [ + "▁pic", + "k" + ], + [ + "▁", + "pick" + ], + [ + "ul", + "y" + ], + [ + "u", + "ly" + ], + [ + "▁v", + "ari" + ], + [ + "▁var", + "i" + ], + [ + "▁va", + "ri" + ], + [ + "▁", + "vari" + ], + [ + "▁bel", + "ieve" + ], + [ + "▁belie", + "ve" + ], + [ + "▁C", + "o" + ], + [ + "▁", + "Co" + ], + [ + "Pro", + "perty" + ], + [ + "Gr", + "oup" + ], + [ + "G", + "roup" + ], + [ + "▁t", + "en" + ], + [ + "▁te", + "n" + ], + [ + "▁", + "ten" + ], + [ + "is", + "chen" + ], + [ + "isch", + "en" + ], + [ + "ische", + "n" + ], + [ + "isc", + "hen" + ], + [ + "i", + "schen" + ], + [ + "et", + "urn" + ], + [ + "e", + "turn" + ], + [ + "iv", + "al" + ], + [ + "iva", + "l" + ], + [ + "i", + "val" + ], + [ + "Sys", + "tem" + ], + [ + "S", + "ystem" + ], + [ + "C", + "L" + ], + [ + "be", + "d" + ], + [ + "b", + "ed" + ], + [ + "▁t", + "otal" + ], + [ + "▁to", + "tal" + ], + [ + "▁tot", + "al" + ], + [ + "▁", + "total" + ], + [ + "▁is", + "t" + ], + [ + "▁i", + "st" + ], + [ + "▁", + "ist" + ], + [ + "In", + "put" + ], + [ + "um", + "ents" + ], + [ + "ument", + "s" + ], + [ + "umen", + "ts" + ], + [ + "u", + "ments" + ], + [ + "Man", + "ager" + ], + [ + "ш", + "и" + ], + [ + "▁w", + "in" + ], + [ + "▁", + "win" + ], + [ + "le", + "ep" + ], + [ + "lee", + "p" + ], + [ + "P", + "I" + ], + [ + "но", + "го" + ], + [ + "н", + "ого" + ], + [ + "ru", + "ction" + ], + [ + "ruct", + "ion" + ], + [ + "r", + "uction" + ], + [ + "▁in", + "te" + ], + [ + "▁i", + "nte" + ], + [ + "▁int", + "e" + ], + [ + "▁", + "inte" + ], + [ + "Ap", + "p" + ], + [ + "A", + "pp" + ], + [ + "av", + "or" + ], + [ + "avo", + "r" + ], + [ + "a", + "vor" + ], + [ + "▁re", + "spect" + ], + [ + "▁res", + "pect" + ], + [ + "▁resp", + "ect" + ], + [ + "▁", + "respect" + ], + [ + "at", + "ors" + ], + [ + "ator", + "s" + ], + [ + "ato", + "rs" + ], + [ + "▁c", + "omo" + ], + [ + "▁com", + "o" + ], + [ 
+ "▁co", + "mo" + ], + [ + "▁c", + "ut" + ], + [ + "▁cu", + "t" + ], + [ + "▁", + "cut" + ], + [ + "F", + "A" + ], + [ + "▁s", + "us" + ], + [ + "▁su", + "s" + ], + [ + "▁A", + "pp" + ], + [ + "▁Ap", + "p" + ], + [ + "▁", + "App" + ], + [ + "re", + "ct" + ], + [ + "rec", + "t" + ], + [ + "r", + "ect" + ], + [ + "F", + "I" + ], + [ + "▁be", + "gan" + ], + [ + "▁beg", + "an" + ], + [ + "op", + "h" + ], + [ + "o", + "ph" + ], + [ + "▁s", + "ort" + ], + [ + "▁so", + "rt" + ], + [ + "▁sor", + "t" + ], + [ + "▁", + "sort" + ], + [ + "th", + "ough" + ], + [ + "ј", + "е" + ], + [ + "ic", + "ro" + ], + [ + "i", + "cro" + ], + [ + "Tr", + "ans" + ], + [ + "Tra", + "ns" + ], + [ + "л", + "і" + ], + [ + "▁In", + "st" + ], + [ + "▁Ins", + "t" + ], + [ + "▁", + "Inst" + ], + [ + "re", + "quest" + ], + [ + "requ", + "est" + ], + [ + "req", + "uest" + ], + [ + "о", + "р" + ], + [ + "▁rel", + "ations" + ], + [ + "▁relation", + "s" + ], + [ + "-", + "\\" + ], + [ + "St", + "atus" + ], + [ + "Stat", + "us" + ], + [ + "ж", + "и" + ], + [ + "▁f", + "ather" + ], + [ + "▁fa", + "ther" + ], + [ + "▁fat", + "her" + ], + [ + "▁", + "father" + ], + [ + "c", + "s" + ], + [ + "▁s", + "ex" + ], + [ + "▁se", + "x" + ], + [ + "▁", + "sex" + ], + [ + "is", + "ch" + ], + [ + "isc", + "h" + ], + [ + "i", + "sch" + ], + [ + "v", + "o" + ], + [ + "}_", + "{" + ], + [ + "}", + "_{" + ], + [ + "ave", + "n" + ], + [ + "av", + "en" + ], + [ + "a", + "ven" + ], + [ + "▁N", + "e" + ], + [ + "▁", + "Ne" + ], + [ + "AT", + "E" + ], + [ + "A", + "TE" + ], + [ + "it", + "ten" + ], + [ + "itt", + "en" + ], + [ + "itte", + "n" + ], + [ + "▁e", + "ss" + ], + [ + "▁es", + "s" + ], + [ + "▁", + "ess" + ], + [ + "T", + "H" + ], + [ + "ight", + "s" + ], + [ + "igh", + "ts" + ], + [ + "▁h", + "om" + ], + [ + "▁ho", + "m" + ], + [ + "▁", + "hom" + ], + [ + "▁t", + "oday" + ], + [ + "▁to", + "day" + ], + [ + "▁tod", + "ay" + ], + [ + "▁toda", + "y" + ], + [ + "▁z", + "u" + ], + [ + "▁", + "zu" + ], + [ + "it", + "a" + ], + [ + "i", + "ta" + ], + [ + "▁is", + "n" + ], + [ + "▁i", + "sn" + ], + [ + "▁o", + "pt" + ], + [ + "▁op", + "t" + ], + [ + "▁", + "opt" + ], + [ + "og", + "n" + ], + [ + "o", + "gn" + ], + [ + "é", + "r" + ], + [ + "▁wh", + "ether" + ], + [ + "▁whe", + "ther" + ], + [ + "ix", + "ed" + ], + [ + "ph", + "i" + ], + [ + "p", + "hi" + ], + [ + "id", + "ence" + ], + [ + "iden", + "ce" + ], + [ + "al", + "d" + ], + [ + "a", + "ld" + ], + [ + "Cl", + "ient" + ], + [ + "A", + "t" + ], + [ + "▁de", + "ath" + ], + [ + "▁L", + "et" + ], + [ + "▁Le", + "t" + ], + [ + "▁", + "Let" + ], + [ + "iu", + "s" + ], + [ + "i", + "us" + ], + [ + "г", + "и" + ], + [ + "▁р", + "е" + ], + [ + "▁", + "ре" + ], + [ + "be", + "n" + ], + [ + "b", + "en" + ], + [ + ")", + "\r" + ], + [ + "b", + "a" + ], + [ + "><", + "/" + ], + [ + ">", + "" + ], + [ + "▁", + "->" + ], + [ + "▁J", + "ust" + ], + [ + "▁Ju", + "st" + ], + [ + "▁", + "Just" + ], + [ + "Wh", + "at" + ], + [ + "W", + "hat" + ], + [ + "at", + "al" + ], + [ + "ata", + "l" + ], + [ + "a", + "tal" + ], + [ + "▁M", + "in" + ], + [ + "▁Mi", + "n" + ], + [ + "▁", + "Min" + ], + [ + "▁C", + "or" + ], + [ + "▁Co", + "r" + ], + [ + "▁", + "Cor" + ], + [ + "▁d", + "ark" + ], + [ + "▁dar", + "k" + ], + [ + "▁", + "dark" + ], + [ + "r", + "l" + ], + [ + "▁l", + "arg" + ], + [ + "▁la", + "rg" + ], + [ + "▁", + "larg" + ], + [ + "di", + "ng" + ], + [ + "d", + "ing" + ], + [ + "ó", + "n" + ], + [ + "ou", + "ch" + ], + [ + "o", + "uch" + ], + [ + "▁u", + "m" + ], + [ + "▁", + "um" + ], + [ + "▁e", + "lect" + ], 
+ [ + "▁el", + "ect" + ], + [ + "▁ele", + "ct" + ], + [ + "▁", + "elect" + ], + [ + "▁d", + "am" + ], + [ + "▁da", + "m" + ], + [ + "▁", + "dam" + ], + [ + "▁ne", + "eds" + ], + [ + "▁need", + "s" + ], + [ + "▁m", + "atter" + ], + [ + "▁mat", + "ter" + ], + [ + "▁matt", + "er" + ], + [ + "▁r", + "ather" + ], + [ + "▁rat", + "her" + ], + [ + "▁ra", + "ther" + ], + [ + "fr", + "om" + ], + [ + "f", + "rom" + ], + [ + "ra", + "m" + ], + [ + "r", + "am" + ], + [ + "▁", + "і" + ], + [ + "▁t", + "aken" + ], + [ + "▁take", + "n" + ], + [ + "▁tak", + "en" + ], + [ + "▁ta", + "ken" + ], + [ + "▁de", + "al" + ], + [ + "▁per", + "iod" + ], + [ + "▁", + "period" + ], + [ + "▁M", + "on" + ], + [ + "▁Mo", + "n" + ], + [ + "▁", + "Mon" + ], + [ + "▁", + "Л" + ], + [ + "▁A", + "ug" + ], + [ + "▁Au", + "g" + ], + [ + "▁", + "Aug" + ], + [ + "ru", + "n" + ], + [ + "r", + "un" + ], + [ + "m", + "m" + ], + [ + "el", + "le" + ], + [ + "ell", + "e" + ], + [ + "e", + "lle" + ], + [ + "▁ex", + "port" + ], + [ + "▁exp", + "ort" + ], + [ + "▁", + "export" + ], + [ + "S", + "c" + ], + [ + "vi", + "s" + ], + [ + "v", + "is" + ], + [ + "ab", + "or" + ], + [ + "a", + "bor" + ], + [ + "▁aut", + "hor" + ], + [ + "▁auth", + "or" + ], + [ + "▁", + "author" + ], + [ + "è", + "re" + ], + [ + "▁re", + "member" + ], + [ + "▁rem", + "ember" + ], + [ + "▁remem", + "ber" + ], + [ + "▁re", + "du" + ], + [ + "▁r", + "edu" + ], + [ + "▁red", + "u" + ], + [ + "▁", + "redu" + ], + [ + "▁L", + "ist" + ], + [ + "▁Li", + "st" + ], + [ + "▁Lis", + "t" + ], + [ + "▁", + "List" + ], + [ + "▁f", + "ocus" + ], + [ + "▁", + "focus" + ], + [ + "▁char", + "acter" + ], + [ + "▁", + "character" + ], + [ + "Tab", + "le" + ], + [ + "T", + "able" + ], + [ + "▁individ", + "ual" + ], + [ + "▁need", + "ed" + ], + [ + "bu", + "m" + ], + [ + "b", + "um" + ], + [ + "▁st", + "yle" + ], + [ + "▁sty", + "le" + ], + [ + "▁", + "style" + ], + [ + "in", + "ary" + ], + [ + "ina", + "ry" + ], + [ + "inar", + "y" + ], + [ + "ers", + "ion" + ], + [ + "ou", + "te" + ], + [ + "out", + "e" + ], + [ + "o", + "ute" + ], + [ + "▁P", + "e" + ], + [ + "▁", + "Pe" + ], + [ + "▁h", + "on" + ], + [ + "▁ho", + "n" + ], + [ + "▁", + "hon" + ], + [ + "mu", + "t" + ], + [ + "m", + "ut" + ], + [ + "se", + "e" + ], + [ + "s", + "ee" + ], + [ + "▁bec", + "ame" + ], + [ + "▁d", + "ire" + ], + [ + "▁di", + "re" + ], + [ + "▁dir", + "e" + ], + [ + "▁", + "dire" + ], + [ + "▁d", + "ocument" + ], + [ + "▁doc", + "ument" + ], + [ + "▁", + "document" + ], + [ + "se", + "c" + ], + [ + "s", + "ec" + ], + [ + "en", + "ing" + ], + [ + "eni", + "ng" + ], + [ + "e", + "ning" + ], + [ + "▁vis", + "it" + ], + [ + "▁", + "visit" + ], + [ + "▁f", + "ac" + ], + [ + "▁fa", + "c" + ], + [ + "▁", + "fac" + ], + [ + "t", + "x" + ], + [ + "do", + "wn" + ], + [ + "d", + "own" + ], + [ + "pl", + "it" + ], + [ + "p", + "lit" + ], + [ + "▁ph", + "ys" + ], + [ + "▁", + "phys" + ], + [ + "it", + "ting" + ], + [ + "itt", + "ing" + ], + [ + "jo", + "y" + ], + [ + "j", + "oy" + ], + [ + "▁h", + "ig" + ], + [ + "▁hi", + "g" + ], + [ + "Th", + "is" + ], + [ + "T", + "his" + ], + [ + "A", + "d" + ], + [ + "▁B", + "rit" + ], + [ + "▁Br", + "it" + ], + [ + "▁em", + "ploy" + ], + [ + "▁r", + "é" + ], + [ + "▁", + "ré" + ], + [ + "▁", + "т" + ], + [ + "l", + "ambda" + ], + [ + "▁im", + "pro" + ], + [ + "▁imp", + "ro" + ], + [ + "▁B", + "o" + ], + [ + "▁", + "Bo" + ], + [ + "id", + "ing" + ], + [ + "idi", + "ng" + ], + [ + "i", + "ding" + ], + [ + "▁on", + "line" + ], + [ + "▁", + "online" + ], + [ + "me", + "m" + ], + [ + 
"m", + "em" + ], + [ + "at", + "form" + ], + [ + "▁W", + "ar" + ], + [ + "▁Wa", + "r" + ], + [ + "▁", + "War" + ], + [ + "▁c", + "as" + ], + [ + "▁ca", + "s" + ], + [ + "▁", + "cas" + ], + [ + "as", + "ure" + ], + [ + "a", + "sure" + ], + [ + "▁p", + "ur" + ], + [ + "▁pu", + "r" + ], + [ + "▁", + "pur" + ], + [ + "me", + "di" + ], + [ + "med", + "i" + ], + [ + "m", + "edi" + ], + [ + "Di", + "s" + ], + [ + "D", + "is" + ], + [ + "▁G", + "erm" + ], + [ + "▁Ge", + "rm" + ], + [ + "▁Ger", + "m" + ], + [ + "p", + "c" + ], + [ + "с", + "а" + ], + [ + "▁friend", + "s" + ], + [ + "▁M", + "c" + ], + [ + "▁", + "Mc" + ], + [ + "D", + "I" + ], + [ + "▁pl", + "us" + ], + [ + "▁", + "plus" + ], + [ + "▁S", + "et" + ], + [ + "▁Se", + "t" + ], + [ + "▁", + "Set" + ], + [ + "idd", + "le" + ], + [ + "it", + "ut" + ], + [ + "itu", + "t" + ], + [ + "▁de", + "pend" + ], + [ + "▁dep", + "end" + ], + [ + "▁", + "depend" + ], + [ + "re", + "st" + ], + [ + "res", + "t" + ], + [ + "r", + "est" + ], + [ + "▁J", + "e" + ], + [ + "▁", + "Je" + ], + [ + "▁h", + "or" + ], + [ + "▁ho", + "r" + ], + [ + "▁", + "hor" + ], + [ + "▁ent", + "ire" + ], + [ + "Qu", + "ery" + ], + [ + "Que", + "ry" + ], + [ + "▁re", + "fer" + ], + [ + "▁ref", + "er" + ], + [ + "▁", + "refer" + ], + [ + "▁h", + "ot" + ], + [ + "▁ho", + "t" + ], + [ + "▁", + "hot" + ], + [ + "▁A", + "ust" + ], + [ + "▁Aus", + "t" + ], + [ + "▁Au", + "st" + ], + [ + "▁com", + "mon" + ], + [ + "▁comm", + "on" + ], + [ + "▁", + "common" + ], + [ + "ц", + "і" + ], + [ + "▁p", + "ull" + ], + [ + "▁pu", + "ll" + ], + [ + "▁pul", + "l" + ], + [ + "▁", + "pull" + ], + [ + "▁A", + "dd" + ], + [ + "▁Ad", + "d" + ], + [ + "▁", + "Add" + ], + [ + "▁se", + "ason" + ], + [ + "▁sea", + "son" + ], + [ + "▁seas", + "on" + ], + [ + "▁", + "season" + ], + [ + "▁in", + "vol" + ], + [ + "▁inv", + "ol" + ], + [ + "▁W", + "orld" + ], + [ + "▁Wor", + "ld" + ], + [ + "▁", + "World" + ], + [ + "cl", + "ient" + ], + [ + "cli", + "ent" + ], + [ + "no", + "w" + ], + [ + "n", + "ow" + ], + [ + "tr", + "ue" + ], + [ + "ap", + "pend" + ], + [ + "app", + "end" + ], + [ + "appe", + "nd" + ], + [ + "appen", + "d" + ], + [ + "it", + "ted" + ], + [ + "itt", + "ed" + ], + [ + "itte", + "d" + ], + [ + "em", + "pt" + ], + [ + "emp", + "t" + ], + [ + ")", + "{" + ], + [ + "//", + "/" + ], + [ + "/", + "//" + ], + [ + "▁p", + "rop" + ], + [ + "▁pro", + "p" + ], + [ + "▁pr", + "op" + ], + [ + "▁", + "prop" + ], + [ + "im", + "ate" + ], + [ + "ima", + "te" + ], + [ + "imat", + "e" + ], + [ + "i", + "mate" + ], + [ + "S", + "C" + ], + [ + "▁h", + "ours" + ], + [ + "▁hour", + "s" + ], + [ + "▁ho", + "urs" + ], + [ + "▁h", + "ope" + ], + [ + "▁hop", + "e" + ], + [ + "▁ho", + "pe" + ], + [ + "an", + "dom" + ], + [ + "and", + "om" + ], + [ + "ando", + "m" + ], + [ + "і", + "д" + ], + [ + "ist", + "ic" + ], + [ + "isti", + "c" + ], + [ + "▁pro", + "perty" + ], + [ + "▁proper", + "ty" + ], + [ + "▁", + "property" + ], + [ + "s", + "g" + ], + [ + ">", + "(" + ], + [ + "▁w", + "rite" + ], + [ + "▁wr", + "ite" + ], + [ + "▁writ", + "e" + ], + [ + "▁", + "write" + ], + [ + "mar", + "k" + ], + [ + "m", + "ark" + ], + [ + "fin", + "d" + ], + [ + "fi", + "nd" + ], + [ + "f", + "ind" + ], + [ + "▁person", + "al" + ], + [ + "▁pers", + "onal" + ], + [ + "▁persona", + "l" + ], + [ + "▁", + "personal" + ], + [ + "]", + "[" + ], + [ + "ro", + "wn" + ], + [ + "row", + "n" + ], + [ + "r", + "own" + ], + [ + "P", + "h" + ], + [ + "▁f", + "oot" + ], + [ + "▁fo", + "ot" + ], + [ + "▁foo", + "t" + ], + [ + "▁", + "foot" + ], + [ + 
"▁re", + "search" + ], + [ + "▁res", + "earch" + ], + [ + "iron", + "ment" + ], + [ + "▁n", + "om" + ], + [ + "▁no", + "m" + ], + [ + "▁", + "nom" + ], + [ + "▁in", + "stance" + ], + [ + "▁inst", + "ance" + ], + [ + "▁", + "instance" + ], + [ + "▁h", + "eld" + ], + [ + "▁he", + "ld" + ], + [ + "▁hel", + "d" + ], + [ + "▁", + "held" + ], + [ + "D", + "e" + ], + [ + "▁mem", + "bers" + ], + [ + "▁member", + "s" + ], + [ + "▁", + "members" + ], + [ + "▁f", + "ire" + ], + [ + "▁fi", + "re" + ], + [ + "▁fir", + "e" + ], + [ + "▁", + "fire" + ], + [ + "▁hist", + "ory" + ], + [ + "▁histor", + "y" + ], + [ + "▁hi", + "story" + ], + [ + "▁", + "history" + ], + [ + "▁m", + "ap" + ], + [ + "▁ma", + "p" + ], + [ + "▁", + "map" + ], + [ + "▁dis", + "cuss" + ], + [ + "▁disc", + "uss" + ], + [ + "▁e", + "spec" + ], + [ + "▁es", + "pec" + ], + [ + "▁esp", + "ec" + ], + [ + "▁", + "espec" + ], + [ + "▁t", + "aking" + ], + [ + "▁tak", + "ing" + ], + [ + "▁ta", + "king" + ], + [ + "▁s", + "ervices" + ], + [ + "▁serv", + "ices" + ], + [ + "▁service", + "s" + ], + [ + "▁", + "services" + ], + [ + "▁ind", + "ust" + ], + [ + "▁indu", + "st" + ], + [ + "▁", + "indust" + ], + [ + "ig", + "en" + ], + [ + "ige", + "n" + ], + [ + "i", + "gen" + ], + [ + "▁A", + "ss" + ], + [ + "▁As", + "s" + ], + [ + "▁", + "Ass" + ], + [ + "▁e", + "xpected" + ], + [ + "▁ex", + "pected" + ], + [ + "▁expect", + "ed" + ], + [ + "▁", + "expected" + ], + [ + "▁wur", + "de" + ], + [ + "di", + "r" + ], + [ + "d", + "ir" + ], + [ + "▁a", + "mong" + ], + [ + "▁am", + "ong" + ], + [ + "▁s", + "ugg" + ], + [ + "▁su", + "gg" + ], + [ + "▁sug", + "g" + ], + [ + "re", + "c" + ], + [ + "r", + "ec" + ], + [ + "In", + "ter" + ], + [ + "Int", + "er" + ], + [ + "bl", + "ock" + ], + [ + "blo", + "ck" + ], + [ + "b", + "lock" + ], + [ + "▁R", + "ep" + ], + [ + "▁Re", + "p" + ], + [ + "▁", + "Rep" + ], + [ + "▁p", + "ain" + ], + [ + "▁pa", + "in" + ], + [ + "▁f", + "ive" + ], + [ + "▁fi", + "ve" + ], + [ + "▁", + "five" + ], + [ + "▁f", + "und" + ], + [ + "▁fun", + "d" + ], + [ + "▁fu", + "nd" + ], + [ + "▁", + "fund" + ], + [ + "ri", + "d" + ], + [ + "r", + "id" + ], + [ + "ar", + "row" + ], + [ + "arr", + "ow" + ], + [ + "▁t", + "reat" + ], + [ + "▁tre", + "at" + ], + [ + "▁he", + "ard" + ], + [ + "▁hear", + "d" + ], + [ + "▁de", + "term" + ], + [ + "▁det", + "erm" + ], + [ + "▁deter", + "m" + ], + [ + "ic", + "ult" + ], + [ + "▁s", + "ense" + ], + [ + "▁sens", + "e" + ], + [ + "▁sen", + "se" + ], + [ + "es", + "e" + ], + [ + "e", + "se" + ], + [ + "F", + "un" + ], + [ + "▁month", + "s" + ], + [ + "▁mont", + "hs" + ], + [ + "js", + "on" + ], + [ + "j", + "son" + ], + [ + ",", + "”" + ], + [ + "T", + "I" + ], + [ + "or", + "age" + ], + [ + "ora", + "ge" + ], + [ + "o", + "rage" + ], + [ + "▁", + "У" + ], + [ + "▁every", + "one" + ], + [ + "▁c", + "los" + ], + [ + "▁cl", + "os" + ], + [ + "▁clo", + "s" + ], + [ + "▁", + "clos" + ], + [ + "ie", + "rs" + ], + [ + "ier", + "s" + ], + [ + "i", + "ers" + ], + [ + "air", + "s" + ], + [ + "ai", + "rs" + ], + [ + "a", + "irs" + ], + [ + "def", + "ine" + ], + [ + "I", + "f" + ], + [ + "os", + "p" + ], + [ + "o", + "sp" + ], + [ + "▁w", + "onder" + ], + [ + "▁won", + "der" + ], + [ + "▁wo", + "nder" + ], + [ + "N", + "A" + ], + [ + "qu", + "ery" + ], + [ + "que", + "ry" + ], + [ + "quer", + "y" + ], + [ + "p", + "g" + ], + [ + "it", + "es" + ], + [ + "ite", + "s" + ], + [ + "i", + "tes" + ], + [ + "▁m", + "aterial" + ], + [ + "▁mat", + "erial" + ], + [ + "▁mate", + "rial" + ], + [ + "▁mater", + "ial" + ], + [ + 
"▁", + "material" + ], + [ + "y", + "d" + ], + [ + "Re", + "ad" + ], + [ + "R", + "ead" + ], + [ + "ht", + "ml" + ], + [ + "h", + "tml" + ], + [ + "T", + "E" + ], + [ + "P", + "r" + ], + [ + "^{", + "\\" + ], + [ + "^", + "{\\" + ], + [ + "▁g", + "ave" + ], + [ + "▁ga", + "ve" + ], + [ + "▁I", + "S" + ], + [ + "▁", + "IS" + ], + [ + "▁s", + "uggest" + ], + [ + "▁sugg", + "est" + ], + [ + "▁sug", + "gest" + ], + [ + "Over", + "ride" + ], + [ + "ro", + "du" + ], + [ + "rod", + "u" + ], + [ + "Fr", + "om" + ], + [ + "F", + "rom" + ], + [ + "▁E", + "urope" + ], + [ + "▁Europ", + "e" + ], + [ + "▁Euro", + "pe" + ], + [ + "▁", + "Europe" + ], + [ + "P", + "O" + ], + [ + "▁s", + "oon" + ], + [ + "▁so", + "on" + ], + [ + "ho", + "st" + ], + [ + "hos", + "t" + ], + [ + "h", + "ost" + ], + [ + "▁B", + "er" + ], + [ + "▁Be", + "r" + ], + [ + "▁", + "Ber" + ], + [ + "..", + ".." + ], + [ + "...", + "." + ], + [ + ".", + "..." + ], + [ + "▁H", + "ar" + ], + [ + "▁Ha", + "r" + ], + [ + "▁", + "Har" + ], + [ + "▁e", + "nergy" + ], + [ + "▁ener", + "gy" + ], + [ + "▁energ", + "y" + ], + [ + "▁", + "energy" + ], + [ + ">", + "<" + ], + [ + "ave", + "s" + ], + [ + "av", + "es" + ], + [ + "a", + "ves" + ], + [ + "▁e", + "asy" + ], + [ + "▁eas", + "y" + ], + [ + "▁b", + "re" + ], + [ + "▁br", + "e" + ], + [ + "▁", + "bre" + ], + [ + "fr", + "ame" + ], + [ + "▁g", + "round" + ], + [ + "▁gr", + "ound" + ], + [ + "▁gro", + "und" + ], + [ + "▁", + "ground" + ], + [ + "wi", + "th" + ], + [ + "w", + "ith" + ], + [ + "▁in", + "side" + ], + [ + "▁ins", + "ide" + ], + [ + "ie", + "f" + ], + [ + "i", + "ef" + ], + [ + "▁m", + "o" + ], + [ + "▁", + "mo" + ], + [ + "p", + "m" + ], + [ + "pa", + "n" + ], + [ + "p", + "an" + ], + [ + "ig", + "r" + ], + [ + "i", + "gr" + ], + [ + "▁o", + "m" + ], + [ + "▁", + "om" + ], + [ + "ne", + "xt" + ], + [ + "nex", + "t" + ], + [ + "n", + "ext" + ], + [ + "om", + "et" + ], + [ + "ome", + "t" + ], + [ + "o", + "met" + ], + [ + "▁st", + "atus" + ], + [ + "▁stat", + "us" + ], + [ + "▁", + "status" + ], + [ + "▁}", + "\r" + ], + [ + "▁", + "}\r" + ], + [ + "▁mus", + "ic" + ], + [ + "or", + "a" + ], + [ + "o", + "ra" + ], + [ + "il", + "es" + ], + [ + "ile", + "s" + ], + [ + "i", + "les" + ], + [ + "k", + "i" + ], + [ + "▁e", + "sc" + ], + [ + "▁es", + "c" + ], + [ + "▁", + "esc" + ], + [ + "▁b", + "es" + ], + [ + "▁be", + "s" + ], + [ + "▁", + "bes" + ], + [ + "▁D", + "is" + ], + [ + "▁Di", + "s" + ], + [ + "▁", + "Dis" + ], + [ + "▁h", + "ost" + ], + [ + "▁ho", + "st" + ], + [ + "▁", + "host" + ], + [ + "▁c", + "omes" + ], + [ + "▁com", + "es" + ], + [ + "▁co", + "mes" + ], + [ + "▁come", + "s" + ], + [ + "▁", + "comes" + ], + [ + "us", + "ed" + ], + [ + "use", + "d" + ], + [ + "u", + "sed" + ], + [ + "▁f", + "uture" + ], + [ + "▁fut", + "ure" + ], + [ + "▁", + "future" + ], + [ + "lic", + "k" + ], + [ + "li", + "ck" + ], + [ + "l", + "ick" + ], + [ + "ai", + "d" + ], + [ + "a", + "id" + ], + [ + "▁com", + "pet" + ], + [ + "▁comp", + "et" + ], + [ + "▁", + "compet" + ], + [ + "▁v", + "oice" + ], + [ + "▁vo", + "ice" + ], + [ + "▁", + "voice" + ], + [ + "▁l", + "oad" + ], + [ + "▁lo", + "ad" + ], + [ + "▁", + "load" + ], + [ + "ev", + "el" + ], + [ + "eve", + "l" + ], + [ + "e", + "vel" + ], + [ + "▁n", + "eg" + ], + [ + "▁ne", + "g" + ], + [ + "▁", + "neg" + ], + [ + "▁com", + "mand" + ], + [ + "▁comm", + "and" + ], + [ + "▁", + "command" + ], + [ + "▁f", + "ür" + ], + [ + "▁p", + "ie" + ], + [ + "▁pi", + "e" + ], + [ + "▁", + "pie" + ], + [ + "▁qu", + "ite" + ], + [ + "▁qui", + "te" + 
], + [ + "▁quit", + "e" + ], + [ + "▁b", + "lo" + ], + [ + "▁bl", + "o" + ], + [ + "▁", + "blo" + ], + [ + "ag", + "n" + ], + [ + "a", + "gn" + ], + [ + "il", + "on" + ], + [ + "ilo", + "n" + ], + [ + "i", + "lon" + ], + [ + "▁cl", + "aim" + ], + [ + "▁", + "claim" + ], + [ + "▁t", + "each" + ], + [ + "▁te", + "ach" + ], + [ + "▁tea", + "ch" + ], + [ + "▁pre", + "vious" + ], + [ + "▁prev", + "ious" + ], + [ + "▁", + "previous" + ], + [ + "▁s", + "ite" + ], + [ + "▁sit", + "e" + ], + [ + "▁si", + "te" + ], + [ + "▁", + "site" + ], + [ + "co", + "lor" + ], + [ + "col", + "or" + ], + [ + "colo", + "r" + ], + [ + "at", + "tr" + ], + [ + "att", + "r" + ], + [ + "▁ac", + "cept" + ], + [ + "▁", + "accept" + ], + [ + "▁ex", + "act" + ], + [ + ")", + "}" + ], + [ + "af", + "t" + ], + [ + "a", + "ft" + ], + [ + "rol", + "ler" + ], + [ + "roll", + "er" + ], + [ + "о", + "н" + ], + [ + "o", + "o" + ], + [ + "Dat", + "e" + ], + [ + "Da", + "te" + ], + [ + "D", + "ate" + ], + [ + "▁o", + "u" + ], + [ + "▁", + "ou" + ], + [ + "s", + "y" + ], + [ + "▁pre", + "tty" + ], + [ + "▁pret", + "ty" + ], + [ + "▁im", + "age" + ], + [ + "▁imag", + "e" + ], + [ + "▁", + "image" + ], + [ + "B", + "U" + ], + [ + "▁term", + "s" + ], + [ + "▁ter", + "ms" + ], + [ + "▁s", + "earch" + ], + [ + "▁se", + "arch" + ], + [ + "▁sear", + "ch" + ], + [ + "▁", + "search" + ], + [ + "▁", + "è" + ], + [ + "▁V", + "al" + ], + [ + "▁Va", + "l" + ], + [ + "▁", + "Val" + ], + [ + "▁", + "‘" + ], + [ + "▁D", + "av" + ], + [ + "▁Da", + "v" + ], + [ + "M", + "S" + ], + [ + "sr", + "c" + ], + [ + "s", + "rc" + ], + [ + "ma", + "r" + ], + [ + "m", + "ar" + ], + [ + "in", + "cip" + ], + [ + "inc", + "ip" + ], + [ + "▁could", + "n" + ], + [ + "ad", + "os" + ], + [ + "ado", + "s" + ], + [ + "▁d", + "ro" + ], + [ + "▁dr", + "o" + ], + [ + "▁", + "dro" + ], + [ + "be", + "ta" + ], + [ + "bet", + "a" + ], + [ + "b", + "eta" + ], + [ + "im", + "um" + ], + [ + "▁min", + "utes" + ], + [ + "▁minute", + "s" + ], + [ + "▁minut", + "es" + ], + [ + "▁g", + "rand" + ], + [ + "▁gr", + "and" + ], + [ + "▁gran", + "d" + ], + [ + "▁gra", + "nd" + ], + [ + "▁", + "grand" + ], + [ + "▁", + "»" + ], + [ + "▁O", + "ur" + ], + [ + "▁", + "Our" + ], + [ + "St", + "r" + ], + [ + "S", + "tr" + ], + [ + "VE", + "R" + ], + [ + "V", + "ER" + ], + [ + "ma", + "z" + ], + [ + "m", + "az" + ], + [ + "▁or", + "iginal" + ], + [ + "▁orig", + "inal" + ], + [ + "▁origin", + "al" + ], + [ + "▁", + "original" + ], + [ + "in", + "i" + ], + [ + "i", + "ni" + ], + [ + "▁c", + "oll" + ], + [ + "▁col", + "l" + ], + [ + "▁co", + "ll" + ], + [ + "▁", + "coll" + ], + [ + "lo", + "at" + ], + [ + "▁o", + "s" + ], + [ + "▁", + "os" + ], + [ + "})", + ";" + ], + [ + "}", + ");" + ], + [ + "sum", + "mary" + ], + [ + "▁w", + "all" + ], + [ + "▁wa", + "ll" + ], + [ + "▁wal", + "l" + ], + [ + "▁", + "wall" + ], + [ + "Col", + "or" + ], + [ + "Co", + "lor" + ], + [ + "▁v", + "ers" + ], + [ + "▁ver", + "s" + ], + [ + "▁ve", + "rs" + ], + [ + "▁", + "vers" + ], + [ + "▁d", + "ella" + ], + [ + "▁de", + "lla" + ], + [ + "▁del", + "la" + ], + [ + "▁dell", + "a" + ], + [ + "▁\"", + "\"\"" + ], + [ + "▁\"\"", + "\"" + ], + [ + "▁", + "\"\"\"" + ], + [ + "math", + "bf" + ], + [ + "ze", + "r" + ], + [ + "z", + "er" + ], + [ + "au", + "r" + ], + [ + "a", + "ur" + ], + [ + "▁tr", + "ack" + ], + [ + "▁tra", + "ck" + ], + [ + "▁", + "track" + ], + [ + "▁ass", + "oci" + ], + [ + "▁", + "associ" + ], + [ + "▁s", + "uff" + ], + [ + "▁su", + "ff" + ], + [ + "▁in", + "de" + ], + [ + "▁i", + "nde" + ], + [ + "▁ind", 
+ "e" + ], + [ + "▁", + "inde" + ], + [ + "ag", + "ue" + ], + [ + "agu", + "e" + ], + [ + "a", + "gue" + ], + [ + "▁A", + "pr" + ], + [ + "▁Ap", + "r" + ], + [ + "▁", + "Apr" + ], + [ + "L", + "e" + ], + [ + "ro", + "ups" + ], + [ + "rou", + "ps" + ], + [ + "roup", + "s" + ], + [ + "bo", + "ard" + ], + [ + "b", + "oard" + ], + [ + "▁att", + "ack" + ], + [ + "▁s", + "eries" + ], + [ + "▁se", + "ries" + ], + [ + "▁ser", + "ies" + ], + [ + "▁serie", + "s" + ], + [ + "▁", + "series" + ], + [ + "▁in", + "stead" + ], + [ + "▁inst", + "ead" + ], + [ + "ha", + "m" + ], + [ + "h", + "am" + ], + [ + "bo", + "ok" + ], + [ + "b", + "ook" + ], + [ + "▁s", + "ix" + ], + [ + "▁si", + "x" + ], + [ + "▁", + "six" + ], + [ + "▁R", + "ec" + ], + [ + "▁Re", + "c" + ], + [ + "▁", + "Rec" + ], + [ + "▁c", + "oming" + ], + [ + "▁com", + "ing" + ], + [ + "▁co", + "ming" + ], + [ + "▁", + "coming" + ], + [ + "ur", + "t" + ], + [ + "u", + "rt" + ], + [ + "▁gl", + "obal" + ], + [ + "▁glob", + "al" + ], + [ + "▁glo", + "bal" + ], + [ + "▁", + "global" + ], + [ + "▁ne", + "cess" + ], + [ + "▁neces", + "s" + ], + [ + "▁", + "necess" + ], + [ + "le", + "ge" + ], + [ + "leg", + "e" + ], + [ + "Po", + "s" + ], + [ + "P", + "os" + ], + [ + "▁le", + "ave" + ], + [ + "▁", + "leave" + ], + [ + "▁p", + "od" + ], + [ + "▁po", + "d" + ], + [ + "▁", + "pod" + ], + [ + "ateg", + "ory" + ], + [ + "ategor", + "y" + ], + [ + "u", + "z" + ], + [ + "▁de", + "ep" + ], + [ + "▁", + "deep" + ], + [ + "▁k", + "m" + ], + [ + "▁", + "km" + ], + [ + "▁out", + "side" + ], + [ + "▁outs", + "ide" + ], + [ + "ha", + "s" + ], + [ + "h", + "as" + ], + [ + "opt", + "ions" + ], + [ + "option", + "s" + ], + [ + "o", + "ptions" + ], + [ + "▁S", + "m" + ], + [ + "▁", + "Sm" + ], + [ + "Su", + "b" + ], + [ + "S", + "ub" + ], + [ + "ro", + "ws" + ], + [ + "row", + "s" + ], + [ + "r", + "ows" + ], + [ + "▁в", + "и" + ], + [ + "▁", + "ви" + ], + [ + "▁St", + "ates" + ], + [ + "▁State", + "s" + ], + [ + "▁Stat", + "es" + ], + [ + "▁Sta", + "tes" + ], + [ + "▁", + "States" + ], + [ + "▁wr", + "ong" + ], + [ + "▁how", + "ever" + ], + [ + "▁s", + "em" + ], + [ + "▁se", + "m" + ], + [ + "▁", + "sem" + ], + [ + "▁c", + "atch" + ], + [ + "▁cat", + "ch" + ], + [ + "▁", + "catch" + ], + [ + "\")", + "," + ], + [ + "\"", + ")," + ], + [ + "mod", + "el" + ], + [ + "mode", + "l" + ], + [ + "mo", + "del" + ], + [ + "▁h", + "ttp" + ], + [ + "▁htt", + "p" + ], + [ + "▁", + "http" + ], + [ + "▁o", + "ption" + ], + [ + "▁opt", + "ion" + ], + [ + "▁", + "option" + ], + [ + "ri", + "e" + ], + [ + "r", + "ie" + ], + [ + "▁с", + "та" + ], + [ + "▁ст", + "а" + ], + [ + "▁", + "ста" + ], + [ + "▁ä", + "r" + ], + [ + "▁", + "är" + ], + [ + "▁en", + "joy" + ], + [ + "▁enjo", + "y" + ], + [ + "n", + "u" + ], + [ + "▁p", + "as" + ], + [ + "▁pa", + "s" + ], + [ + "▁", + "pas" + ], + [ + "▁a", + "mount" + ], + [ + "▁am", + "ount" + ], + [ + "▁", + "amount" + ], + [ + "▁res", + "pons" + ], + [ + "▁respon", + "s" + ], + [ + "▁resp", + "ons" + ], + [ + "▁", + "respons" + ], + [ + "▁In", + "tern" + ], + [ + "▁Inter", + "n" + ], + [ + "▁Int", + "ern" + ], + [ + "▁", + "Intern" + ], + [ + "▁my", + "self" + ], + [ + "▁o", + "pp" + ], + [ + "▁op", + "p" + ], + [ + "▁", + "opp" + ], + [ + "▁S", + "im" + ], + [ + "▁Si", + "m" + ], + [ + "▁", + "Sim" + ], + [ + "▁s", + "ens" + ], + [ + "▁se", + "ns" + ], + [ + "▁sen", + "s" + ], + [ + "E", + "d" + ], + [ + "▁(", + "\\" + ], + [ + "▁", + "(\\" + ], + [ + "▁stud", + "ents" + ], + [ + "▁student", + "s" + ], + [ + "но", + "в" + ], + [ + "н", + "ов" + 
], + [ + "▁point", + "s" + ], + [ + "▁", + "points" + ], + [ + "ar", + "ning" + ], + [ + "arn", + "ing" + ], + [ + "U", + "P" + ], + [ + "el", + "ling" + ], + [ + "ell", + "ing" + ], + [ + "elli", + "ng" + ], + [ + "▁c", + "annot" + ], + [ + "▁can", + "not" + ], + [ + "B", + "e" + ], + [ + "▁l", + "ength" + ], + [ + "▁le", + "ngth" + ], + [ + "▁", + "length" + ], + [ + "nu", + "ll" + ], + [ + "n", + "ull" + ], + [ + "ui", + "nt" + ], + [ + "u", + "int" + ], + [ + "wi", + "se" + ], + [ + "w", + "ise" + ], + [ + "▁d", + "ouble" + ], + [ + "▁dou", + "ble" + ], + [ + "▁doub", + "le" + ], + [ + "▁", + "double" + ], + [ + "ig", + "e" + ], + [ + "i", + "ge" + ], + [ + "is", + "ta" + ], + [ + "ist", + "a" + ], + [ + "i", + "sta" + ], + [ + "▁est", + "ab" + ], + [ + "▁es", + "tab" + ], + [ + "▁esta", + "b" + ], + [ + "an", + "ch" + ], + [ + "anc", + "h" + ], + [ + "▁a", + "go" + ], + [ + "▁ag", + "o" + ], + [ + "▁", + "ago" + ], + [ + "▁b", + "ound" + ], + [ + "▁bo", + "und" + ], + [ + "▁bou", + "nd" + ], + [ + "▁", + "bound" + ], + [ + "▁f", + "a" + ], + [ + "▁", + "fa" + ], + [ + "▁c", + "lean" + ], + [ + "▁cle", + "an" + ], + [ + "▁", + "clean" + ], + [ + "▁sim", + "ple" + ], + [ + "▁simpl", + "e" + ], + [ + "▁", + "simple" + ], + [ + "m", + "i" + ], + [ + "####", + "####" + ], + [ + "if", + "ier" + ], + [ + "ifi", + "er" + ], + [ + "▁Gener", + "al" + ], + [ + "▁Gen", + "eral" + ], + [ + "▁Gene", + "ral" + ], + [ + "▁", + "General" + ], + [ + "▁se", + "emed" + ], + [ + "▁see", + "med" + ], + [ + "▁seem", + "ed" + ], + [ + "en", + "a" + ], + [ + "e", + "na" + ], + [ + "▁a", + "ge" + ], + [ + "▁ag", + "e" + ], + [ + "▁", + "age" + ], + [ + "но", + "й" + ], + [ + "end", + "if" + ], + [ + "A", + "A" + ], + [ + "▁c", + "aus" + ], + [ + "▁ca", + "us" + ], + [ + "▁e", + "duc" + ], + [ + "▁ed", + "uc" + ], + [ + "▁", + "educ" + ], + [ + "▁c", + "ell" + ], + [ + "▁ce", + "ll" + ], + [ + "▁cel", + "l" + ], + [ + "▁", + "cell" + ], + [ + "Ge", + "ner" + ], + [ + "Gen", + "er" + ], + [ + "G", + "ener" + ], + [ + "sp", + "ace" + ], + [ + "s", + "pace" + ], + [ + "▁Y", + "our" + ], + [ + "▁You", + "r" + ], + [ + "▁", + "Your" + ], + [ + "▁be", + "aut" + ], + [ + "g", + "t" + ], + [ + "▁l", + "imit" + ], + [ + "▁li", + "mit" + ], + [ + "▁lim", + "it" + ], + [ + "▁", + "limit" + ], + [ + "▁d", + "ate" + ], + [ + "▁da", + "te" + ], + [ + "▁dat", + "e" + ], + [ + "▁", + "date" + ], + [ + "Ut", + "il" + ], + [ + "U", + "til" + ], + [ + "▁N", + "ational" + ], + [ + "▁Nat", + "ional" + ], + [ + "▁Nation", + "al" + ], + [ + "▁", + "National" + ], + [ + "ow", + "s" + ], + [ + "o", + "ws" + ], + [ + "pa", + "t" + ], + [ + "p", + "at" + ], + [ + "qu", + "ad" + ], + [ + "▁o", + "k" + ], + [ + "▁", + "ok" + ], + [ + "▁", + "И" + ], + [ + "ar", + "th" + ], + [ + "art", + "h" + ], + [ + "ha", + "t" + ], + [ + "h", + "at" + ], + [ + "▁comm", + "unity" + ], + [ + "▁commun", + "ity" + ], + [ + "ou", + "l" + ], + [ + "o", + "ul" + ], + [ + "▁e", + "conom" + ], + [ + "▁ec", + "onom" + ], + [ + "▁", + "econom" + ], + [ + "Com", + "ponent" + ], + [ + "bo", + "r" + ], + [ + "b", + "or" + ], + [ + "us", + "ion" + ], + [ + "▁be", + "low" + ], + [ + "▁bel", + "ow" + ], + [ + "ear", + "ch" + ], + [ + "e", + "arch" + ], + [ + "or", + "es" + ], + [ + "ore", + "s" + ], + [ + "o", + "res" + ], + [ + "ba", + "n" + ], + [ + "b", + "an" + ], + [ + "▁Aug", + "ust" + ], + [ + "▁fur", + "ther" + ], + [ + "sig", + "ma" + ], + [ + "s", + "igma" + ], + [ + "▁h", + "a" + ], + [ + "▁", + "ha" + ], + [ + "j", + "i" + ], + [ + "▁com", + "put" + ], + [ 
+ "▁comp", + "ut" + ], + [ + "▁", + "comput" + ], + [ + "г", + "ра" + ], + [ + "▁N", + "one" + ], + [ + "▁No", + "ne" + ], + [ + "▁Non", + "e" + ], + [ + "▁", + "None" + ], + [ + "▁t", + "er" + ], + [ + "▁te", + "r" + ], + [ + "▁", + "ter" + ], + [ + "▁any", + "one" + ], + [ + "▁t", + "ask" + ], + [ + "▁ta", + "sk" + ], + [ + "▁", + "task" + ], + [ + "en", + "te" + ], + [ + "ent", + "e" + ], + [ + "e", + "nte" + ], + [ + "pos", + "ition" + ], + [ + "pp", + "ed" + ], + [ + "ppe", + "d" + ], + [ + "p", + "ped" + ], + [ + "▁a", + "us" + ], + [ + "▁au", + "s" + ], + [ + "▁", + "aus" + ], + [ + "Att", + "ribute" + ], + [ + "Attrib", + "ute" + ], + [ + "re", + "q" + ], + [ + "r", + "eq" + ], + [ + "ad", + "dr" + ], + [ + "add", + "r" + ], + [ + "li", + "ght" + ], + [ + "lig", + "ht" + ], + [ + "l", + "ight" + ], + [ + "ш", + "е" + ], + [ + "▁a", + "rm" + ], + [ + "▁ar", + "m" + ], + [ + "▁", + "arm" + ], + [ + "co", + "ver" + ], + [ + "cov", + "er" + ], + [ + "c", + "over" + ], + [ + "up", + "port" + ], + [ + "upp", + "ort" + ], + [ + "▁G", + "l" + ], + [ + "▁", + "Gl" + ], + [ + "▁S", + "an" + ], + [ + "▁Sa", + "n" + ], + [ + "▁", + "San" + ], + [ + "▁wr", + "iting" + ], + [ + "▁writ", + "ing" + ], + [ + "▁", + "writing" + ], + [ + "▁l", + "ost" + ], + [ + "▁lo", + "st" + ], + [ + "▁los", + "t" + ], + [ + "▁M", + "ark" + ], + [ + "▁Mar", + "k" + ], + [ + "▁", + "Mark" + ], + [ + "▁g", + "re" + ], + [ + "▁gr", + "e" + ], + [ + "▁", + "gre" + ], + [ + "TY", + "PE" + ], + [ + "T", + "YPE" + ], + [ + "▁S", + "outh" + ], + [ + "▁So", + "uth" + ], + [ + "▁Sou", + "th" + ], + [ + "▁Sout", + "h" + ], + [ + "▁", + "South" + ], + [ + "▁per", + "fect" + ], + [ + "▁perf", + "ect" + ], + [ + "▁pack", + "age" + ], + [ + "▁", + "package" + ], + [ + "▁in", + "fl" + ], + [ + "▁inf", + "l" + ], + [ + "▁", + "infl" + ], + [ + "ha", + "ps" + ], + [ + "h", + "aps" + ], + [ + "▁A", + "ng" + ], + [ + "▁An", + "g" + ], + [ + "▁", + "Ang" + ], + [ + "res", + "pon" + ], + [ + "resp", + "on" + ], + [ + "ri", + "s" + ], + [ + "r", + "is" + ], + [ + "pt", + "ember" + ], + [ + "pte", + "mber" + ], + [ + "▁build", + "ing" + ], + [ + "▁", + "building" + ], + [ + "VA", + "L" + ], + [ + "V", + "AL" + ], + [ + "fr", + "ee" + ], + [ + "fre", + "e" + ], + [ + "f", + "ree" + ], + [ + "▁c", + "e" + ], + [ + "▁", + "ce" + ], + [ + "H", + "T" + ], + [ + "▁F", + "rom" + ], + [ + "▁Fr", + "om" + ], + [ + "▁Fro", + "m" + ], + [ + "▁", + "From" + ], + [ + "d", + "s" + ], + [ + "ro", + "y" + ], + [ + "r", + "oy" + ], + [ + "ach", + "ine" + ], + [ + "achi", + "ne" + ], + [ + "no", + "wn" + ], + [ + "now", + "n" + ], + [ + "n", + "own" + ], + [ + "▁sa", + "ying" + ], + [ + "▁say", + "ing" + ], + [ + "▁б", + "ы" + ], + [ + "▁", + "бы" + ], + [ + "o", + "e" + ], + [ + "Re", + "f" + ], + [ + "R", + "ef" + ], + [ + "▁net", + "work" + ], + [ + "▁", + "network" + ], + [ + "par", + "ent" + ], + [ + "pa", + "rent" + ], + [ + "pare", + "nt" + ], + [ + "paren", + "t" + ], + [ + "p", + "arent" + ], + [ + "ug", + "e" + ], + [ + "u", + "ge" + ], + [ + "▁sim", + "ilar" + ], + [ + ">", + "\r" + ], + [ + "Build", + "er" + ], + [ + "B", + "uilder" + ], + [ + "▁l", + "iving" + ], + [ + "▁li", + "ving" + ], + [ + "▁liv", + "ing" + ], + [ + "▁contin", + "ue" + ], + [ + "▁continu", + "e" + ], + [ + "▁", + "continue" + ], + [ + "an", + "ger" + ], + [ + "ang", + "er" + ], + [ + "ange", + "r" + ], + [ + "▁R", + "ed" + ], + [ + "▁Re", + "d" + ], + [ + "▁", + "Red" + ], + [ + "▁h", + "air" + ], + [ + "▁ha", + "ir" + ], + [ + "an", + "ced" + ], + [ + "ance", + "d" + 
], + [ + "anc", + "ed" + ], + [ + "ia", + "ns" + ], + [ + "ian", + "s" + ], + [ + "i", + "ans" + ], + [ + "▁d", + "ead" + ], + [ + "▁de", + "ad" + ], + [ + "▁", + "dead" + ], + [ + "▁bo", + "olean" + ], + [ + "▁", + "boolean" + ], + [ + "ic", + "ation" + ], + [ + "▁д", + "е" + ], + [ + "▁", + "де" + ], + [ + "▁cl", + "ient" + ], + [ + "▁", + "client" + ], + [ + "uc", + "t" + ], + [ + "u", + "ct" + ], + [ + "▁", + "•" + ], + [ + "S", + "P" + ], + [ + "ol", + "der" + ], + [ + "old", + "er" + ], + [ + "п", + "е" + ], + [ + "ud", + "io" + ], + [ + "udi", + "o" + ], + [ + "▁d", + "eg" + ], + [ + "▁de", + "g" + ], + [ + "▁", + "deg" + ], + [ + "as", + "ing" + ], + [ + "asi", + "ng" + ], + [ + "a", + "sing" + ], + [ + "▁st", + "ep" + ], + [ + "▁ste", + "p" + ], + [ + "▁", + "step" + ], + [ + "▁p", + "ers" + ], + [ + "▁per", + "s" + ], + [ + "▁pe", + "rs" + ], + [ + "▁", + "pers" + ], + [ + "ç", + "ão" + ], + [ + "ob", + "j" + ], + [ + "o", + "z" + ], + [ + "ul", + "a" + ], + [ + "u", + "la" + ], + [ + "▁r", + "ound" + ], + [ + "▁ro", + "und" + ], + [ + "▁rou", + "nd" + ], + [ + "▁", + "round" + ], + [ + "▁u", + "pon" + ], + [ + "▁up", + "on" + ], + [ + "▁re", + "source" + ], + [ + "▁res", + "ource" + ], + [ + "▁", + "resource" + ], + [ + "▁val", + "id" + ], + [ + "▁", + "valid" + ], + [ + "▁I", + "I" + ], + [ + "▁", + "II" + ], + [ + "bu", + "g" + ], + [ + "b", + "ug" + ], + [ + "st", + "d" + ], + [ + "s", + "td" + ], + [ + "▁a", + "ng" + ], + [ + "▁an", + "g" + ], + [ + "▁", + "ang" + ], + [ + "sp", + "an" + ], + [ + "s", + "pan" + ], + [ + "po", + "l" + ], + [ + "p", + "ol" + ], + [ + "ial", + "og" + ], + [ + "ia", + "log" + ], + [ + "▁p", + "hot" + ], + [ + "▁ph", + "ot" + ], + [ + "?", + "'" + ], + [ + "D", + "B" + ], + [ + "▁F", + "in" + ], + [ + "▁Fi", + "n" + ], + [ + "▁", + "Fin" + ], + [ + "V", + "E" + ], + [ + "E", + "m" + ], + [ + "▁c", + "am" + ], + [ + "▁ca", + "m" + ], + [ + "▁", + "cam" + ], + [ + "tar", + "get" + ], + [ + "t", + "arget" + ], + [ + "pe", + "cted" + ], + [ + "pect", + "ed" + ], + [ + "pec", + "ted" + ], + [ + "He", + "l" + ], + [ + "H", + "el" + ], + [ + "▁u", + "t" + ], + [ + "▁", + "ut" + ], + [ + "▁T", + "est" + ], + [ + "▁Te", + "st" + ], + [ + "▁Tes", + "t" + ], + [ + "▁", + "Test" + ], + [ + "▁t", + "own" + ], + [ + "▁to", + "wn" + ], + [ + "▁tow", + "n" + ], + [ + "▁", + "town" + ], + [ + "al", + "ign" + ], + [ + "ali", + "gn" + ], + [ + "▁we", + "bs" + ], + [ + "▁web", + "s" + ], + [ + "in", + "ner" + ], + [ + "inn", + "er" + ], + [ + "au", + "gh" + ], + [ + "aug", + "h" + ], + [ + "a", + "ugh" + ], + [ + "▁ex", + "cept" + ], + [ + "▁", + "except" + ], + [ + "▁init", + "ial" + ], + [ + "▁initi", + "al" + ], + [ + "▁", + "initial" + ], + [ + "en", + "ty" + ], + [ + "ent", + "y" + ], + [ + "lic", + "h" + ], + [ + "li", + "ch" + ], + [ + "l", + "ich" + ], + [ + "▁A", + "ut" + ], + [ + "▁Au", + "t" + ], + [ + "▁", + "Aut" + ], + [ + "to", + "p" + ], + [ + "t", + "op" + ], + [ + "▁f", + "ail" + ], + [ + "▁fa", + "il" + ], + [ + "▁", + "fail" + ], + [ + "on", + "a" + ], + [ + "o", + "na" + ], + [ + "▁ben", + "ef" + ], + [ + "an", + "ks" + ], + [ + "ank", + "s" + ], + [ + "is", + "che" + ], + [ + "isch", + "e" + ], + [ + "isc", + "he" + ], + [ + "i", + "sche" + ], + [ + ".", + "*" + ], + [ + "▁sign", + "ific" + ], + [ + "▁cont", + "act" + ], + [ + "▁", + "contact" + ], + [ + "Re", + "c" + ], + [ + "R", + "ec" + ], + [ + "ar", + "io" + ], + [ + "ari", + "o" + ], + [ + "a", + "rio" + ], + [ + "ot", + "tom" + ], + [ + "ott", + "om" + ], + [ + "otto", + "m" + ], + [ + 
"▁rel", + "ationship" + ], + [ + "▁relations", + "hip" + ], + [ + "▁relation", + "ship" + ], + [ + "])", + ";" + ], + [ + "]", + ");" + ], + [ + "▁Н", + "а" + ], + [ + "▁", + "На" + ], + [ + "He", + "ad" + ], + [ + "H", + "ead" + ], + [ + "form", + "at" + ], + [ + "for", + "mat" + ], + [ + "▁é", + "t" + ], + [ + "▁", + "ét" + ], + [ + "▁M", + "ore" + ], + [ + "▁Mor", + "e" + ], + [ + "▁Mo", + "re" + ], + [ + "▁", + "More" + ], + [ + "act", + "ory" + ], + [ + "actor", + "y" + ], + [ + "port", + "un" + ], + [ + "+", + "\\" + ], + [ + "▁sim", + "ply" + ], + [ + "▁simpl", + "y" + ], + [ + "▁e", + "p" + ], + [ + "▁", + "ep" + ], + [ + "▁R", + "uss" + ], + [ + "▁Ru", + "ss" + ], + [ + "▁Rus", + "s" + ], + [ + "n", + "í" + ], + [ + "u", + "a" + ], + [ + "er", + "c" + ], + [ + "e", + "rc" + ], + [ + "▁long", + "er" + ], + [ + "▁lon", + "ger" + ], + [ + "in", + "ition" + ], + [ + "init", + "ion" + ], + [ + "ect", + "or" + ], + [ + "ec", + "tor" + ], + [ + "e", + "ctor" + ], + [ + "apt", + "ion" + ], + [ + "a", + "ption" + ], + [ + "▁prof", + "ess" + ], + [ + "▁profes", + "s" + ], + [ + "▁M", + "us" + ], + [ + "▁Mu", + "s" + ], + [ + "▁", + "Mus" + ], + [ + "il", + "ities" + ], + [ + "ili", + "ties" + ], + [ + "è", + "s" + ], + [ + "▁A", + "ct" + ], + [ + "▁Ac", + "t" + ], + [ + "▁", + "Act" + ], + [ + "off", + "set" + ], + [ + "offs", + "et" + ], + [ + "▁i", + "ll" + ], + [ + "▁il", + "l" + ], + [ + "▁", + "ill" + ], + [ + "ba", + "nd" + ], + [ + "ban", + "d" + ], + [ + "b", + "and" + ], + [ + "▁A", + "g" + ], + [ + "▁", + "Ag" + ], + [ + "▁П", + "о" + ], + [ + "▁", + "По" + ], + [ + "б", + "и" + ], + [ + "cont", + "ent" + ], + [ + "ic", + "on" + ], + [ + "ico", + "n" + ], + [ + "i", + "con" + ], + [ + "▁work", + "s" + ], + [ + "▁wor", + "ks" + ], + [ + "▁", + "works" + ], + [ + "yn", + "am" + ], + [ + "yna", + "m" + ], + [ + "y", + "nam" + ], + [ + "pl", + "ement" + ], + [ + "ple", + "ment" + ], + [ + "p", + "lement" + ], + [ + "Res", + "ource" + ], + [ + "Re", + "source" + ], + [ + "Act", + "ion" + ], + [ + "A", + "ction" + ], + [ + "▁diff", + "icult" + ], + [ + "▁W", + "est" + ], + [ + "▁We", + "st" + ], + [ + "▁Wes", + "t" + ], + [ + "▁", + "West" + ], + [ + "▁v", + "ideo" + ], + [ + "▁vide", + "o" + ], + [ + "▁", + "video" + ], + [ + "▁T", + "HE" + ], + [ + "▁TH", + "E" + ], + [ + "▁", + "THE" + ], + [ + "▁de", + "cl" + ], + [ + "▁dec", + "l" + ], + [ + "▁", + "decl" + ], + [ + "on", + "don" + ], + [ + "ond", + "on" + ], + [ + "ondo", + "n" + ], + [ + "de", + "d" + ], + [ + "d", + "ed" + ], + [ + "}{", + "\\" + ], + [ + "}", + "{\\" + ], + [ + "oc", + "r" + ], + [ + "o", + "cr" + ], + [ + "▁C", + "ity" + ], + [ + "▁Cit", + "y" + ], + [ + "▁Ci", + "ty" + ], + [ + "▁", + "City" + ], + [ + "▁", + "я" + ], + [ + "ue", + "r" + ], + [ + "u", + "er" + ], + [ + "c", + "z" + ], + [ + "▁im", + "ag" + ], + [ + "▁i", + "mag" + ], + [ + "▁", + "imag" + ], + [ + "c", + "r" + ], + [ + "et", + "e" + ], + [ + "e", + "te" + ], + [ + "id", + "get" + ], + [ + "idge", + "t" + ], + [ + "▁M", + "od" + ], + [ + "▁Mo", + "d" + ], + [ + "▁", + "Mod" + ], + [ + "▁for", + "ward" + ], + [ + "▁", + "forward" + ], + [ + "▁p", + "ict" + ], + [ + "▁pi", + "ct" + ], + [ + "▁pic", + "t" + ], + [ + "or", + "ge" + ], + [ + "org", + "e" + ], + [ + "▁sub", + "ject" + ], + [ + "▁", + "subject" + ], + [ + "up", + "date" + ], + [ + "at", + "tle" + ], + [ + "att", + "le" + ], + [ + "s", + "a" + ], + [ + "▁A", + "nt" + ], + [ + "▁An", + "t" + ], + [ + "▁", + "Ant" + ], + [ + "▁r", + "unning" + ], + [ + "▁run", + "ning" + ], + [ + "▁", + 
"running" + ], + [ + "▁s", + "al" + ], + [ + "▁sa", + "l" + ], + [ + "▁", + "sal" + ], + [ + "con", + "ne" + ], + [ + "conn", + "e" + ], + [ + "c", + "onne" + ], + [ + "▁out", + "put" + ], + [ + "▁", + "output" + ], + [ + "ad", + "ata" + ], + [ + "ada", + "ta" + ], + [ + "a", + "data" + ], + [ + "M", + "L" + ], + [ + "Che", + "ck" + ], + [ + "C", + "heck" + ], + [ + "led", + "ge" + ], + [ + "l", + "edge" + ], + [ + "▁p", + "aper" + ], + [ + "▁pa", + "per" + ], + [ + "▁pap", + "er" + ], + [ + "▁", + "paper" + ], + [ + "param", + "s" + ], + [ + "par", + "ams" + ], + [ + "para", + "ms" + ], + [ + "av", + "y" + ], + [ + "a", + "vy" + ], + [ + "▁a", + "f" + ], + [ + "▁", + "af" + ], + [ + "▁e", + "ine" + ], + [ + "▁ein", + "e" + ], + [ + "▁j", + "our" + ], + [ + "▁jo", + "ur" + ], + [ + "▁jou", + "r" + ], + [ + "▁", + "jour" + ], + [ + "A", + "Y" + ], + [ + "▁it", + "self" + ], + [ + "▁its", + "elf" + ], + [ + "▁S", + "tr" + ], + [ + "▁St", + "r" + ], + [ + "▁", + "Str" + ], + [ + "st", + "yle" + ], + [ + "sty", + "le" + ], + [ + "Th", + "at" + ], + [ + "T", + "hat" + ], + [ + "▁m", + "illion" + ], + [ + "▁mill", + "ion" + ], + [ + "▁l", + "anguage" + ], + [ + "▁", + "language" + ], + [ + "O", + "S" + ], + [ + "vi", + "ng" + ], + [ + "vin", + "g" + ], + [ + "v", + "ing" + ], + [ + "▁м", + "а" + ], + [ + "▁", + "ма" + ], + [ + "▁т", + "о" + ], + [ + "▁", + "то" + ], + [ + ")", + "(" + ], + [ + "▁b", + "uy" + ], + [ + "▁bu", + "y" + ], + [ + ".", + "/" + ], + [ + "▁.", + ".." + ], + [ + "▁..", + "." + ], + [ + "▁", + "..." + ], + [ + "▁t", + "ried" + ], + [ + "▁tr", + "ied" + ], + [ + "▁tri", + "ed" + ], + [ + "▁com", + "pl" + ], + [ + "▁comp", + "l" + ], + [ + "▁act", + "iv" + ], + [ + "▁", + "activ" + ], + [ + "ap", + "ped" + ], + [ + "app", + "ed" + ], + [ + "appe", + "d" + ], + [ + "a", + "pped" + ], + [ + "But", + "ton" + ], + [ + "B", + "utton" + ], + [ + "To", + "ken" + ], + [ + "Tok", + "en" + ], + [ + "T", + "oken" + ], + [ + "▁prov", + "ided" + ], + [ + "▁provide", + "d" + ], + [ + "ib", + "er" + ], + [ + "ibe", + "r" + ], + [ + "i", + "ber" + ], + [ + "▁c", + "reated" + ], + [ + "▁cre", + "ated" + ], + [ + "▁create", + "d" + ], + [ + "▁creat", + "ed" + ], + [ + "▁", + "created" + ], + [ + "cur", + "ity" + ], + [ + "c", + "urity" + ], + [ + "En", + "d" + ], + [ + "E", + "nd" + ], + [ + "a", + "ł" + ], + [ + "us", + "ter" + ], + [ + "ust", + "er" + ], + [ + "u", + "ster" + ], + [ + "iz", + "ing" + ], + [ + "izi", + "ng" + ], + [ + "i", + "zing" + ], + [ + "om", + "b" + ], + [ + "o", + "mb" + ], + [ + "▁s", + "ich" + ], + [ + "▁si", + "ch" + ], + [ + "▁com", + "pon" + ], + [ + "▁comp", + "on" + ], + [ + "▁S", + "ee" + ], + [ + "▁Se", + "e" + ], + [ + "▁", + "See" + ], + [ + "▁u", + "int" + ], + [ + "▁ui", + "nt" + ], + [ + "▁", + "uint" + ], + [ + "▁l", + "abel" + ], + [ + "▁la", + "bel" + ], + [ + "▁lab", + "el" + ], + [ + "▁", + "label" + ], + [ + "vo", + "l" + ], + [ + "v", + "ol" + ], + [ + "ó", + "w" + ], + [ + "oc", + "ol" + ], + [ + "oco", + "l" + ], + [ + "o", + "col" + ], + [ + "▁re", + "ceived" + ], + [ + "▁rece", + "ived" + ], + [ + "▁receive", + "d" + ], + [ + "▁in", + "tern" + ], + [ + "▁int", + "ern" + ], + [ + "▁inter", + "n" + ], + [ + "▁inte", + "rn" + ], + [ + "▁", + "intern" + ], + [ + "ц", + "е" + ], + [ + "R", + "un" + ], + [ + "▁r", + "oad" + ], + [ + "▁ro", + "ad" + ], + [ + "▁", + "road" + ], + [ + "▁O", + "ct" + ], + [ + "▁", + "Oct" + ], + [ + "▁C", + "omp" + ], + [ + "▁Com", + "p" + ], + [ + "▁Co", + "mp" + ], + [ + "▁", + "Comp" + ], + [ + "▁stud", + "y" + ], + 
[ + "▁т", + "е" + ], + [ + "▁", + "те" + ], + [ + "Ac", + "t" + ], + [ + "A", + "ct" + ], + [ + "▁t", + "our" + ], + [ + "▁to", + "ur" + ], + [ + "▁tou", + "r" + ], + [ + "▁St", + "ate" + ], + [ + "▁Stat", + "e" + ], + [ + "▁Sta", + "te" + ], + [ + "▁", + "State" + ], + [ + "▁ad", + "ded" + ], + [ + "▁add", + "ed" + ], + [ + "▁", + "added" + ], + [ + "htt", + "ps" + ], + [ + "http", + "s" + ], + [ + "st", + "ream" + ], + [ + "stre", + "am" + ], + [ + "▁l", + "ower" + ], + [ + "▁lo", + "wer" + ], + [ + "▁low", + "er" + ], + [ + "▁", + "lower" + ], + [ + "▁b", + "ox" + ], + [ + "▁bo", + "x" + ], + [ + "▁", + "box" + ], + [ + "▁S", + "k" + ], + [ + "▁", + "Sk" + ], + [ + "▁them", + "selves" + ], + [ + "▁c", + "ross" + ], + [ + "▁cr", + "oss" + ], + [ + "▁cro", + "ss" + ], + [ + "▁", + "cross" + ], + [ + "▁e", + "cho" + ], + [ + "▁ec", + "ho" + ], + [ + "▁", + "echo" + ], + [ + "▁dev", + "ice" + ], + [ + "▁", + "device" + ], + [ + "pos", + "e" + ], + [ + "po", + "se" + ], + [ + "p", + "ose" + ], + [ + "▁g", + "ames" + ], + [ + "▁game", + "s" + ], + [ + "▁gam", + "es" + ], + [ + "▁ga", + "mes" + ], + [ + "P", + "L" + ], + [ + "W", + "indow" + ], + [ + "is", + "es" + ], + [ + "ise", + "s" + ], + [ + "i", + "ses" + ], + [ + "ti", + "tle" + ], + [ + "tit", + "le" + ], + [ + "t", + "itle" + ], + [ + "St", + "ream" + ], + [ + "z", + "t" + ], + [ + "▁S", + "w" + ], + [ + "▁", + "Sw" + ], + [ + "▁r", + "ole" + ], + [ + "▁ro", + "le" + ], + [ + "▁", + "role" + ], + [ + "ia", + "nt" + ], + [ + "ian", + "t" + ], + [ + "i", + "ant" + ], + [ + "k", + "u" + ], + [ + "se", + "qu" + ], + [ + "seq", + "u" + ], + [ + "s", + "equ" + ], + [ + "▁l", + "ate" + ], + [ + "▁la", + "te" + ], + [ + "▁lat", + "e" + ], + [ + "▁", + "late" + ], + [ + "▁s", + "old" + ], + [ + "▁so", + "ld" + ], + [ + "▁sol", + "d" + ], + [ + "р", + "я" + ], + [ + "Com", + "m" + ], + [ + "Co", + "mm" + ], + [ + "C", + "omm" + ], + [ + "▁en", + "tre" + ], + [ + "▁ent", + "re" + ], + [ + "▁entr", + "e" + ], + [ + "▁", + "entre" + ], + [ + "▁d", + "og" + ], + [ + "▁do", + "g" + ], + [ + "▁", + "dog" + ], + [ + "dev", + "ice" + ], + [ + "P", + "ar" + ], + [ + "▁like", + "ly" + ], + [ + "▁lik", + "ely" + ], + [ + "▁", + "likely" + ], + [ + "^{", + "-" + ], + [ + "^", + "{-" + ], + [ + "▁l", + "en" + ], + [ + "▁le", + "n" + ], + [ + "▁", + "len" + ], + [ + "▁P", + "aul" + ], + [ + "▁Pa", + "ul" + ], + [ + "▁", + "Paul" + ], + [ + "▁t", + "ool" + ], + [ + "▁to", + "ol" + ], + [ + "▁too", + "l" + ], + [ + "▁", + "tool" + ], + [ + "Of", + "f" + ], + [ + "O", + "ff" + ], + [ + "▁f", + "amil" + ], + [ + "▁fam", + "il" + ], + [ + "▁fa", + "mil" + ], + [ + "▁d", + "raw" + ], + [ + "▁dr", + "aw" + ], + [ + "▁", + "draw" + ], + [ + "ap", + "ping" + ], + [ + "app", + "ing" + ], + [ + "a", + "pping" + ], + [ + "▁ev", + "ents" + ], + [ + "▁even", + "ts" + ], + [ + "▁event", + "s" + ], + [ + "▁", + "events" + ], + [ + "cre", + "t" + ], + [ + "cr", + "et" + ], + [ + "c", + "ret" + ], + [ + "rou", + "ght" + ], + [ + "rough", + "t" + ], + [ + "r", + "ought" + ], + [ + "Cont", + "ent" + ], + [ + "▁soft", + "ware" + ], + [ + "ri", + "a" + ], + [ + "r", + "ia" + ], + [ + "ms", + "g" + ], + [ + "m", + "sg" + ], + [ + "ga", + "mma" + ], + [ + "g", + "amma" + ], + [ + "▁h", + "ear" + ], + [ + "▁he", + "ar" + ], + [ + "Op", + "er" + ], + [ + "O", + "per" + ], + [ + "▁your", + "self" + ], + [ + "▁yours", + "elf" + ], + [ + "▁l", + "iter" + ], + [ + "▁li", + "ter" + ], + [ + "▁lit", + "er" + ], + [ + "▁", + "liter" + ], + [ + "em", + "p" + ], + [ + "e", + "mp" + ], + [ + 
"▁se", + "par" + ], + [ + "▁sep", + "ar" + ], + [ + "▁", + "separ" + ], + [ + "▁", + "З" + ], + [ + "▁t", + "itle" + ], + [ + "▁tit", + "le" + ], + [ + "▁ti", + "tle" + ], + [ + "▁", + "title" + ], + [ + "M", + "ethod" + ], + [ + "math", + "rm" + ], + [ + "▁s", + "low" + ], + [ + "▁sl", + "ow" + ], + [ + "▁R", + "om" + ], + [ + "▁Ro", + "m" + ], + [ + "▁", + "Rom" + ], + [ + "!", + "!" + ], + [ + "▁t", + "ax" + ], + [ + "▁ta", + "x" + ], + [ + "▁", + "tax" + ], + [ + "ск", + "а" + ], + [ + "с", + "ка" + ], + [ + "empl", + "ate" + ], + [ + "emp", + "late" + ], + [ + "o", + "i" + ], + [ + "▁A", + "rt" + ], + [ + "▁Ar", + "t" + ], + [ + "▁", + "Art" + ], + [ + "f", + "alse" + ], + [ + "ast", + "ic" + ], + [ + "ст", + "ь" + ], + [ + "с", + "ть" + ], + [ + "oc", + "ket" + ], + [ + "ock", + "et" + ], + [ + "▁e", + "ns" + ], + [ + "▁en", + "s" + ], + [ + "▁", + "ens" + ], + [ + "T", + "O" + ], + [ + "am", + "ente" + ], + [ + "ame", + "nte" + ], + [ + "ament", + "e" + ], + [ + "amen", + "te" + ], + [ + "a", + "mente" + ], + [ + "lo", + "cal" + ], + [ + "loc", + "al" + ], + [ + "l", + "ocal" + ], + [ + "ch", + "ie" + ], + [ + "chi", + "e" + ], + [ + "▁p", + "an" + ], + [ + "▁pa", + "n" + ], + [ + "▁", + "pan" + ], + [ + "ни", + "й" + ], + [ + "ch", + "ema" + ], + [ + "che", + "ma" + ], + [ + "chem", + "a" + ], + [ + "▁N", + "orth" + ], + [ + "▁Nor", + "th" + ], + [ + "▁Nort", + "h" + ], + [ + "з", + "о" + ], + [ + "▁>", + "=" + ], + [ + "▁", + ">=" + ], + [ + "A", + "ut" + ], + [ + "▁d", + "ig" + ], + [ + "▁di", + "g" + ], + [ + "▁", + "dig" + ], + [ + "▁se", + "ems" + ], + [ + "▁see", + "ms" + ], + [ + "▁seem", + "s" + ], + [ + "▁mor", + "ning" + ], + [ + "so", + "le" + ], + [ + "sol", + "e" + ], + [ + "s", + "ole" + ], + [ + "um", + "er" + ], + [ + "ume", + "r" + ], + [ + "u", + "mer" + ], + [ + "del", + "ta" + ], + [ + "d", + "elta" + ], + [ + "it", + "é" + ], + [ + "i", + "té" + ], + [ + "ab", + "ase" + ], + [ + "aba", + "se" + ], + [ + "a", + "base" + ], + [ + "ra", + "f" + ], + [ + "r", + "af" + ], + [ + "▁ob", + "serv" + ], + [ + "▁obs", + "erv" + ], + [ + "▁", + "observ" + ], + [ + "▁E", + "st" + ], + [ + "▁Es", + "t" + ], + [ + "▁", + "Est" + ], + [ + "▁s", + "eg" + ], + [ + "▁se", + "g" + ], + [ + "▁", + "seg" + ], + [ + "▁[", + "]" + ], + [ + "▁", + "[]" + ], + [ + "▁P", + "res" + ], + [ + "▁Pr", + "es" + ], + [ + "▁Pre", + "s" + ], + [ + "▁", + "Pres" + ], + [ + "if", + "ul" + ], + [ + "i", + "ful" + ], + [ + "pu", + "sh" + ], + [ + "pus", + "h" + ], + [ + "p", + "ush" + ], + [ + "▁O", + "ff" + ], + [ + "▁Of", + "f" + ], + [ + "▁", + "Off" + ], + [ + "ip", + "e" + ], + [ + "i", + "pe" + ], + [ + "at", + "i" + ], + [ + "a", + "ti" + ], + [ + "▁d", + "im" + ], + [ + "▁di", + "m" + ], + [ + "▁", + "dim" + ], + [ + "ce", + "ed" + ], + [ + "c", + "eed" + ], + [ + "En", + "t" + ], + [ + "E", + "nt" + ], + [ + "__", + "__" + ], + [ + "___", + "_" + ], + [ + "_", + "___" + ], + [ + "en", + "try" + ], + [ + "ent", + "ry" + ], + [ + "entr", + "y" + ], + [ + "▁f", + "ight" + ], + [ + "▁fig", + "ht" + ], + [ + "▁fi", + "ght" + ], + [ + "▁c", + "red" + ], + [ + "▁cre", + "d" + ], + [ + "▁cr", + "ed" + ], + [ + "▁", + "cred" + ], + [ + "▁O", + "R" + ], + [ + "▁", + "OR" + ], + [ + "▁D", + "ep" + ], + [ + "▁De", + "p" + ], + [ + "▁", + "Dep" + ], + [ + "$", + "{" + ], + [ + "ле", + "н" + ], + [ + "л", + "ен" + ], + [ + "Creat", + "e" + ], + [ + "C", + "reate" + ], + [ + "▁Apr", + "il" + ], + [ + "▁Ap", + "ril" + ], + [ + "min", + "istr" + ], + [ + "F", + "L" + ], + [ + "▁A", + "p" + ], + [ + "▁", + 
"Ap" + ], + [ + "▁H", + "ere" + ], + [ + "▁He", + "re" + ], + [ + "▁Her", + "e" + ], + [ + "▁", + "Here" + ], + [ + "priv", + "ate" + ], + [ + "p", + "rivate" + ], + [ + "In", + "stance" + ], + [ + "Inst", + "ance" + ], + [ + "ie", + "m" + ], + [ + "i", + "em" + ], + [ + "▁off", + "ice" + ], + [ + "▁offic", + "e" + ], + [ + "▁th", + "ird" + ], + [ + "▁", + "third" + ], + [ + "▁up", + "date" + ], + [ + "▁", + "update" + ], + [ + "Lin", + "e" + ], + [ + "Li", + "ne" + ], + [ + "L", + "ine" + ], + [ + "ta", + "g" + ], + [ + "t", + "ag" + ], + [ + "▁e", + "specially" + ], + [ + "▁espec", + "ially" + ], + [ + "▁especial", + "ly" + ], + [ + "▁", + "especially" + ], + [ + "▁го", + "да" + ], + [ + "▁год", + "а" + ], + [ + "▁c", + "u" + ], + [ + "▁", + "cu" + ], + [ + "▁k", + "ill" + ], + [ + "▁kil", + "l" + ], + [ + "▁ki", + "ll" + ], + [ + "▁", + "kill" + ], + [ + "au", + "ght" + ], + [ + "augh", + "t" + ], + [ + "aug", + "ht" + ], + [ + "▁s", + "we" + ], + [ + "▁sw", + "e" + ], + [ + "Option", + "s" + ], + [ + "Opt", + "ions" + ], + [ + "O", + "ptions" + ], + [ + "I", + "M" + ], + [ + "C", + "C" + ], + [ + "▁com", + "pan" + ], + [ + "▁comp", + "an" + ], + [ + "ju", + "st" + ], + [ + "j", + "ust" + ], + [ + "▁Wh", + "ile" + ], + [ + "▁", + "While" + ], + [ + "iz", + "er" + ], + [ + "ize", + "r" + ], + [ + "i", + "zer" + ], + [ + "▁м", + "о" + ], + [ + "▁", + "мо" + ], + [ + "к", + "е" + ], + [ + "▁a", + "uto" + ], + [ + "▁aut", + "o" + ], + [ + "▁au", + "to" + ], + [ + "▁", + "auto" + ], + [ + "▁b", + "and" + ], + [ + "▁ban", + "d" + ], + [ + "▁ba", + "nd" + ], + [ + "▁", + "band" + ], + [ + "ме", + "н" + ], + [ + "м", + "ен" + ], + [ + "ique", + "s" + ], + [ + "iqu", + "es" + ], + [ + "iq", + "ues" + ], + [ + "i", + "ques" + ], + [ + "▁p", + "le" + ], + [ + "▁pl", + "e" + ], + [ + "▁", + "ple" + ], + [ + "N", + "O" + ], + [ + "▁O", + "F" + ], + [ + "▁", + "OF" + ], + [ + "▁s", + "ong" + ], + [ + "▁so", + "ng" + ], + [ + "▁son", + "g" + ], + [ + "▁A", + "cc" + ], + [ + "▁Ac", + "c" + ], + [ + "▁", + "Acc" + ], + [ + "EX", + "T" + ], + [ + "E", + "XT" + ], + [ + "en", + "sor" + ], + [ + "ens", + "or" + ], + [ + "enso", + "r" + ], + [ + "in", + "ing" + ], + [ + "ini", + "ng" + ], + [ + "i", + "ning" + ], + [ + "▁l", + "at" + ], + [ + "▁la", + "t" + ], + [ + "▁", + "lat" + ], + [ + "bi", + "g" + ], + [ + "b", + "ig" + ], + [ + "▁K", + "ing" + ], + [ + "▁Ki", + "ng" + ], + [ + "▁Kin", + "g" + ], + [ + "▁", + "King" + ], + [ + "oc", + "h" + ], + [ + "o", + "ch" + ], + [ + "s", + "i" + ], + [ + "▁H", + "ist" + ], + [ + "▁His", + "t" + ], + [ + "▁Hi", + "st" + ], + [ + "▁", + "Hist" + ], + [ + "▁qu", + "ality" + ], + [ + "▁qual", + "ity" + ], + [ + "▁", + "quality" + ], + [ + "mod", + "e" + ], + [ + "mo", + "de" + ], + [ + "m", + "ode" + ], + [ + "▁op", + "portun" + ], + [ + "▁would", + "n" + ], + [ + ":*", + "*" + ], + [ + ":", + "**" + ], + [ + "out", + "put" + ], + [ + "▁fe", + "et" + ], + [ + "▁fee", + "t" + ], + [ + "▁m", + "is" + ], + [ + "▁mi", + "s" + ], + [ + "d", + "f" + ], + [ + "ag", + "ing" + ], + [ + "agi", + "ng" + ], + [ + "a", + "ging" + ], + [ + "▁м", + "е" + ], + [ + "▁", + "ме" + ], + [ + "▁t", + "ro" + ], + [ + "▁tr", + "o" + ], + [ + "▁d", + "efined" + ], + [ + "▁def", + "ined" + ], + [ + "▁define", + "d" + ], + [ + "▁defin", + "ed" + ], + [ + "▁", + "defined" + ], + [ + "▁re", + "view" + ], + [ + "▁rev", + "iew" + ], + [ + "▁", + "review" + ], + [ + "▁F", + "il" + ], + [ + "▁Fi", + "l" + ], + [ + "▁", + "Fil" + ], + [ + ">", + ">" + ], + [ + "▁pr", + "incip" + ], + [ + "▁prin", + 
"cip" + ], + [ + "Bas", + "e" + ], + [ + "B", + "ase" + ], + [ + "di", + "ct" + ], + [ + "d", + "ict" + ], + [ + "ve", + "rage" + ], + [ + "ver", + "age" + ], + [ + "ic", + "ient" + ], + [ + "ici", + "ent" + ], + [ + "I", + "F" + ], + [ + "▁h", + "it" + ], + [ + "▁hi", + "t" + ], + [ + "▁", + "hit" + ], + [ + "Pag", + "e" + ], + [ + "P", + "age" + ], + [ + "▁p", + "erm" + ], + [ + "▁per", + "m" + ], + [ + "▁pe", + "rm" + ], + [ + "▁", + "perm" + ], + [ + "ce", + "l" + ], + [ + "c", + "el" + ], + [ + "í", + "t" + ], + [ + "▁ex", + "press" + ], + [ + "▁exp", + "ress" + ], + [ + "▁expr", + "ess" + ], + [ + "▁", + "express" + ], + [ + "▁ind", + "ic" + ], + [ + "▁Se", + "ptember" + ], + [ + "▁Sept", + "ember" + ], + [ + "im", + "age" + ], + [ + "ima", + "ge" + ], + [ + "imag", + "e" + ], + [ + "▁product", + "s" + ], + [ + "▁", + "products" + ], + [ + "▁m", + "edia" + ], + [ + "▁med", + "ia" + ], + [ + "▁medi", + "a" + ], + [ + "▁", + "media" + ], + [ + "ch", + "ange" + ], + [ + "chan", + "ge" + ], + [ + "ig", + "ger" + ], + [ + "igg", + "er" + ], + [ + "▁s", + "end" + ], + [ + "▁se", + "nd" + ], + [ + "▁sen", + "d" + ], + [ + "▁", + "send" + ], + [ + "la", + "st" + ], + [ + "las", + "t" + ], + [ + "l", + "ast" + ], + [ + "min", + "g" + ], + [ + "mi", + "ng" + ], + [ + "m", + "ing" + ], + [ + "p", + "a" + ], + [ + "ua", + "ry" + ], + [ + "uar", + "y" + ], + [ + "u", + "ary" + ], + [ + "▁spe", + "ak" + ], + [ + "ны", + "й" + ], + [ + "щ", + "е" + ], + [ + "ys", + "is" + ], + [ + "y", + "sis" + ], + [ + "ly", + "ing" + ], + [ + "l", + "ying" + ], + [ + "▁", + "ч" + ], + [ + "li", + "ke" + ], + [ + "lik", + "e" + ], + [ + "l", + "ike" + ], + [ + "р", + "ы" + ], + [ + "в", + "і" + ], + [ + "▁M", + "ich" + ], + [ + "▁Mic", + "h" + ], + [ + "▁Mi", + "ch" + ], + [ + "M", + "O" + ], + [ + "▁J", + "ah" + ], + [ + "▁Ja", + "h" + ], + [ + "ens", + "ive" + ], + [ + "▁sh", + "are" + ], + [ + "▁shar", + "e" + ], + [ + "▁sha", + "re" + ], + [ + "▁", + "share" + ], + [ + "▁develop", + "ment" + ], + [ + "C", + "P" + ], + [ + "sp", + "ec" + ], + [ + "spe", + "c" + ], + [ + "s", + "pec" + ], + [ + "▁f", + "ast" + ], + [ + "▁fa", + "st" + ], + [ + "▁", + "fast" + ], + [ + "he", + "t" + ], + [ + "h", + "et" + ], + [ + "H", + "O" + ], + [ + "▁part", + "icip" + ], + [ + "▁partic", + "ip" + ], + [ + "▁parti", + "cip" + ], + [ + "Bl", + "ock" + ], + [ + "Blo", + "ck" + ], + [ + "B", + "lock" + ], + [ + "▁vi", + "ol" + ], + [ + "▁fr", + "ame" + ], + [ + "▁fra", + "me" + ], + [ + "▁fram", + "e" + ], + [ + "▁", + "frame" + ], + [ + "▁qu", + "al" + ], + [ + "▁q", + "ual" + ], + [ + "▁", + "qual" + ], + [ + "tr", + "e" + ], + [ + "t", + "re" + ], + [ + "▁", + "Ф" + ], + [ + "▁to", + "ward" + ], + [ + "▁tow", + "ard" + ], + [ + "f", + "g" + ], + [ + "Bo", + "x" + ], + [ + "B", + "ox" + ], + [ + "Col", + "umn" + ], + [ + "▁mil", + "it" + ], + [ + "▁mi", + "lit" + ], + [ + "▁M", + "arch" + ], + [ + "▁Mar", + "ch" + ], + [ + "▁Marc", + "h" + ], + [ + "▁var", + "ious" + ], + [ + "▁vari", + "ous" + ], + [ + "pa", + "ss" + ], + [ + "pas", + "s" + ], + [ + "p", + "ass" + ], + [ + "▁P", + "ark" + ], + [ + "▁Par", + "k" + ], + [ + "▁B", + "en" + ], + [ + "▁Be", + "n" + ], + [ + "▁", + "Ben" + ], + [ + "Fr", + "ame" + ], + [ + "▁n", + "ormal" + ], + [ + "▁nor", + "mal" + ], + [ + "▁norm", + "al" + ], + [ + "▁", + "normal" + ], + [ + "op", + "en" + ], + [ + "ope", + "n" + ], + [ + "o", + "pen" + ], + [ + "p", + "x" + ], + [ + "▁ph", + "one" + ], + [ + "▁", + "phone" + ], + [ + "▁E", + "ven" + ], + [ + "▁Ev", + "en" + ], + [ + "▁Eve", + 
"n" + ], + [ + "▁", + "Even" + ], + [ + "▁m", + "a" + ], + [ + "▁", + "ma" + ], + [ + "ibr", + "ary" + ], + [ + "St", + "art" + ], + [ + "Star", + "t" + ], + [ + "id", + "den" + ], + [ + "idd", + "en" + ], + [ + "rh", + "o" + ], + [ + "r", + "ho" + ], + [ + "gr", + "aph" + ], + [ + "gra", + "ph" + ], + [ + "g", + "raph" + ], + [ + "ac", + "ing" + ], + [ + "aci", + "ng" + ], + [ + "a", + "cing" + ], + [ + "'", + "." + ], + [ + "ar", + "ter" + ], + [ + "art", + "er" + ], + [ + "arte", + "r" + ], + [ + "me", + "s" + ], + [ + "m", + "es" + ], + [ + "in", + "st" + ], + [ + "ins", + "t" + ], + [ + "▁i", + "r" + ], + [ + "▁", + "ir" + ], + [ + "act", + "ive" + ], + [ + "activ", + "e" + ], + [ + "▁f", + "em" + ], + [ + "▁fe", + "m" + ], + [ + "▁", + "fem" + ], + [ + "▁m", + "oved" + ], + [ + "▁mov", + "ed" + ], + [ + "▁move", + "d" + ], + [ + "▁mo", + "ved" + ], + [ + "▁st", + "ore" + ], + [ + "▁stor", + "e" + ], + [ + "▁sto", + "re" + ], + [ + "▁", + "store" + ], + [ + "▁p", + "rice" + ], + [ + "▁pr", + "ice" + ], + [ + "▁pri", + "ce" + ], + [ + "▁", + "price" + ], + [ + "\")", + "." + ], + [ + "\"", + ")." + ], + [ + "ber", + "g" + ], + [ + "be", + "rg" + ], + [ + "b", + "erg" + ], + [ + "▁n", + "ov" + ], + [ + "▁no", + "v" + ], + [ + "▁", + "nov" + ], + [ + "▁c", + "ard" + ], + [ + "▁car", + "d" + ], + [ + "▁ca", + "rd" + ], + [ + "▁", + "card" + ], + [ + "el", + "low" + ], + [ + "ell", + "ow" + ], + [ + "ello", + "w" + ], + [ + "▁part", + "y" + ], + [ + "▁par", + "ty" + ], + [ + "▁", + "party" + ], + [ + "▁M", + "or" + ], + [ + "▁Mo", + "r" + ], + [ + "ae", + "l" + ], + [ + "a", + "el" + ], + [ + "▁per", + "cent" + ], + [ + "▁", + "percent" + ], + [ + "▁tr", + "aining" + ], + [ + "▁tra", + "ining" + ], + [ + "▁train", + "ing" + ], + [ + "▁", + "training" + ], + [ + "▁in", + "g" + ], + [ + "▁i", + "ng" + ], + [ + "▁", + "ing" + ], + [ + "im", + "er" + ], + [ + "ime", + "r" + ], + [ + "i", + "mer" + ], + [ + "▁S", + "am" + ], + [ + "▁Sa", + "m" + ], + [ + "▁", + "Sam" + ], + [ + "Def", + "ault" + ], + [ + "▁f", + "uck" + ], + [ + "▁fu", + "ck" + ], + [ + "▁com", + "plete" + ], + [ + "▁comp", + "lete" + ], + [ + "▁complet", + "e" + ], + [ + "▁compl", + "ete" + ], + [ + "▁", + "complete" + ], + [ + "ui", + "d" + ], + [ + "u", + "id" + ], + [ + "▁det", + "ails" + ], + [ + "▁detail", + "s" + ], + [ + "▁", + "details" + ], + [ + "▁l", + "ed" + ], + [ + "▁le", + "d" + ], + [ + "▁", + "led" + ], + [ + "Po", + "int" + ], + [ + "P", + "oint" + ], + [ + "▁C", + "ount" + ], + [ + "▁Co", + "unt" + ], + [ + "▁Coun", + "t" + ], + [ + "▁Cou", + "nt" + ], + [ + "▁", + "Count" + ], + [ + "▁reg", + "ard" + ], + [ + "z", + "o" + ], + [ + "▁B", + "ro" + ], + [ + "▁Br", + "o" + ], + [ + "▁", + "Bro" + ], + [ + "▁rec", + "ogn" + ], + [ + "▁", + "recogn" + ], + [ + "▁H", + "ol" + ], + [ + "▁Ho", + "l" + ], + [ + "▁", + "Hol" + ], + [ + "U", + "M" + ], + [ + "el", + "ement" + ], + [ + "ele", + "ment" + ], + [ + "elem", + "ent" + ], + [ + "e", + "lement" + ], + [ + "Mod", + "e" + ], + [ + "Mo", + "de" + ], + [ + "M", + "ode" + ], + [ + "▁ex", + "am" + ], + [ + "▁E", + "X" + ], + [ + "▁", + "EX" + ], + [ + "Im", + "age" + ], + [ + "ver", + "se" + ], + [ + "vers", + "e" + ], + [ + "ri", + "ter" + ], + [ + "rit", + "er" + ], + [ + "rite", + "r" + ], + [ + "r", + "iter" + ], + [ + "so", + "ft" + ], + [ + "s", + "oft" + ], + [ + "▁int", + "rodu" + ], + [ + "▁intro", + "du" + ], + [ + "▁sur", + "pr" + ], + [ + "Buf", + "fer" + ], + [ + "Buff", + "er" + ], + [ + "B", + "uffer" + ], + [ + "le", + "ctor" + ], + [ + "lect", + 
"or" + ], + [ + "l", + "ector" + ], + [ + "ar", + "en" + ], + [ + "are", + "n" + ], + [ + "a", + "ren" + ], + [ + "an", + "ged" + ], + [ + "ang", + "ed" + ], + [ + "ange", + "d" + ], + [ + "▁P", + "at" + ], + [ + "▁Pa", + "t" + ], + [ + "▁", + "Pat" + ], + [ + "▁P", + "al" + ], + [ + "▁Pa", + "l" + ], + [ + "▁", + "Pal" + ], + [ + "▁con", + "tr" + ], + [ + "▁cont", + "r" + ], + [ + "▁", + "contr" + ], + [ + "Hand", + "ler" + ], + [ + "Handle", + "r" + ], + [ + "▁fe", + "atures" + ], + [ + "▁feature", + "s" + ], + [ + "▁feat", + "ures" + ], + [ + "▁", + "features" + ], + [ + "ip", + "le" + ], + [ + "i", + "ple" + ], + [ + "▁C", + "ON" + ], + [ + "▁CO", + "N" + ], + [ + "▁", + "CON" + ], + [ + "Fi", + "l" + ], + [ + "F", + "il" + ], + [ + "▁P", + "ort" + ], + [ + "▁Po", + "rt" + ], + [ + "▁Por", + "t" + ], + [ + "▁", + "Port" + ], + [ + "▁th", + "inking" + ], + [ + "▁think", + "ing" + ], + [ + "▁thin", + "king" + ], + [ + "do", + "c" + ], + [ + "d", + "oc" + ], + [ + "we", + "r" + ], + [ + "w", + "er" + ], + [ + "▁work", + "ed" + ], + [ + "▁wor", + "ked" + ], + [ + "P", + "C" + ], + [ + "c", + "m" + ], + [ + "da", + "t" + ], + [ + "d", + "at" + ], + [ + "PR", + "O" + ], + [ + "P", + "RO" + ], + [ + "▁E", + "very" + ], + [ + "▁Ev", + "ery" + ], + [ + "▁Ever", + "y" + ], + [ + "▁Eve", + "ry" + ], + [ + "▁", + "Every" + ], + [ + "▁e", + "ra" + ], + [ + "▁er", + "a" + ], + [ + "▁", + "era" + ], + [ + "▁F", + "irst" + ], + [ + "▁", + "First" + ], + [ + "g", + "n" + ], + [ + "▁im", + "medi" + ], + [ + "▁imm", + "edi" + ], + [ + "ov", + "ember" + ], + [ + "ove", + "mber" + ], + [ + "ap", + "an" + ], + [ + "apa", + "n" + ], + [ + "a", + "pan" + ], + [ + "▁ex", + "tra" + ], + [ + "▁ext", + "ra" + ], + [ + "▁extr", + "a" + ], + [ + "▁", + "extra" + ], + [ + "▁s", + "ection" + ], + [ + "▁se", + "ction" + ], + [ + "▁sect", + "ion" + ], + [ + "▁", + "section" + ], + [ + "▁J", + "une" + ], + [ + "▁Jun", + "e" + ], + [ + "▁Ju", + "ne" + ], + [ + "▁v", + "ia" + ], + [ + "▁vi", + "a" + ], + [ + "▁", + "via" + ], + [ + "▁g", + "one" + ], + [ + "▁go", + "ne" + ], + [ + "com", + "e" + ], + [ + "co", + "me" + ], + [ + "c", + "ome" + ], + [ + "▁s", + "tri" + ], + [ + "▁st", + "ri" + ], + [ + "▁str", + "i" + ], + [ + "▁", + "stri" + ], + [ + "^", + "\\" + ], + [ + "ant", + "ly" + ], + [ + "▁ar", + "ch" + ], + [ + "▁arc", + "h" + ], + [ + "▁", + "arch" + ], + [ + "S", + "ource" + ], + [ + "▁con", + "v" + ], + [ + "▁co", + "nv" + ], + [ + "▁", + "conv" + ], + [ + "▁L", + "ondon" + ], + [ + "▁Lond", + "on" + ], + [ + "▁", + "London" + ], + [ + "Num", + "ber" + ], + [ + "N", + "umber" + ], + [ + "▁quest", + "ions" + ], + [ + "▁question", + "s" + ], + [ + "an", + "did" + ], + [ + "and", + "id" + ], + [ + "▁play", + "ed" + ], + [ + "en", + "v" + ], + [ + "e", + "nv" + ], + [ + "▁Sch", + "ool" + ], + [ + "▁nat", + "ural" + ], + [ + "▁natur", + "al" + ], + [ + "▁", + "natural" + ], + [ + "ca", + "n" + ], + [ + "c", + "an" + ], + [ + "▁ne", + "ws" + ], + [ + "▁new", + "s" + ], + [ + "▁", + "news" + ], + [ + "D", + "R" + ], + [ + "▁c", + "hall" + ], + [ + "▁ch", + "all" + ], + [ + "▁cha", + "ll" + ], + [ + "▁S", + "oc" + ], + [ + "▁So", + "c" + ], + [ + "▁", + "э" + ], + [ + "▁att", + "empt" + ], + [ + "*", + "}" + ], + [ + "N", + "ull" + ], + [ + "ro", + "te" + ], + [ + "rot", + "e" + ], + [ + "r", + "ote" + ], + [ + "▁b", + "i" + ], + [ + "▁", + "bi" + ], + [ + "▁wr", + "itten" + ], + [ + "▁writ", + "ten" + ], + [ + "▁", + "written" + ], + [ + "▁bl", + "ood" + ], + [ + "▁blo", + "od" + ], + [ + "▁happ", + "ened" + ], + [ 
+ "▁happen", + "ed" + ], + [ + "▁c", + "ause" + ], + [ + "▁caus", + "e" + ], + [ + "▁ca", + "use" + ], + [ + "as", + "hing" + ], + [ + "ash", + "ing" + ], + [ + "ashi", + "ng" + ], + [ + "▁Will", + "iam" + ], + [ + "ad", + "em" + ], + [ + "ade", + "m" + ], + [ + "a", + "dem" + ], + [ + "▁b", + "rought" + ], + [ + "▁br", + "ought" + ], + [ + "▁dis", + "play" + ], + [ + "▁displ", + "ay" + ], + [ + "▁disp", + "lay" + ], + [ + "▁", + "display" + ], + [ + "im", + "a" + ], + [ + "i", + "ma" + ], + [ + "▁fin", + "ally" + ], + [ + "▁final", + "ly" + ], + [ + "ta", + "b" + ], + [ + "t", + "ab" + ], + [ + "▁return", + "ed" + ], + [ + "ны", + "х" + ], + [ + "ni", + "e" + ], + [ + "n", + "ie" + ], + [ + "▁", + "q" + ], + [ + "▁h", + "ers" + ], + [ + "▁he", + "rs" + ], + [ + "▁her", + "s" + ], + [ + "▁P", + "re" + ], + [ + "▁Pr", + "e" + ], + [ + "▁", + "Pre" + ], + [ + "▁d", + "ou" + ], + [ + "▁do", + "u" + ], + [ + "buf", + "fer" + ], + [ + "buff", + "er" + ], + [ + "b", + "uffer" + ], + [ + "▁eff", + "ort" + ], + [ + "ain", + "e" + ], + [ + "ai", + "ne" + ], + [ + "a", + "ine" + ], + [ + "x", + "y" + ], + [ + "▁his", + "tor" + ], + [ + "▁hist", + "or" + ], + [ + "en", + "u" + ], + [ + "e", + "nu" + ], + [ + "▁ar", + "riv" + ], + [ + "▁arr", + "iv" + ], + [ + "▁D", + "em" + ], + [ + "▁De", + "m" + ], + [ + "▁", + "Dem" + ], + [ + "▁f", + "avor" + ], + [ + "▁fa", + "vor" + ], + [ + "▁fav", + "or" + ], + [ + "▁hand", + "le" + ], + [ + "▁", + "handle" + ], + [ + "SE", + "T" + ], + [ + "S", + "ET" + ], + [ + "▁P", + "ublic" + ], + [ + "▁Pub", + "lic" + ], + [ + "▁Pu", + "blic" + ], + [ + "▁", + "Public" + ], + [ + "ru", + "pt" + ], + [ + "rup", + "t" + ], + [ + "r", + "upt" + ], + [ + "▁u", + "r" + ], + [ + "▁", + "ur" + ], + [ + "▁for", + "ce" + ], + [ + "▁", + "force" + ], + [ + "▁é", + "s" + ], + [ + "▁", + "és" + ], + [ + "ub", + "e" + ], + [ + "u", + "be" + ], + [ + "Pr", + "e" + ], + [ + "P", + "re" + ], + [ + "р", + "і" + ], + [ + "in", + "y" + ], + [ + "i", + "ny" + ], + [ + "th", + "eta" + ], + [ + "the", + "ta" + ], + [ + "is", + "f" + ], + [ + "i", + "sf" + ], + [ + "▁n", + "ational" + ], + [ + "▁nat", + "ional" + ], + [ + "▁nation", + "al" + ], + [ + "Equ", + "al" + ], + [ + "Eq", + "ual" + ], + [ + "E", + "qual" + ], + [ + "ren", + "ch" + ], + [ + "▁w", + "ife" + ], + [ + "▁c", + "apt" + ], + [ + "▁cap", + "t" + ], + [ + "▁ca", + "pt" + ], + [ + "▁In", + "ter" + ], + [ + "▁Int", + "er" + ], + [ + "▁", + "Inter" + ], + [ + "ta", + "u" + ], + [ + "t", + "au" + ], + [ + "▁s", + "leep" + ], + [ + "▁sle", + "ep" + ], + [ + "▁", + "sleep" + ], + [ + "../", + "../" + ], + [ + "▁iss", + "ue" + ], + [ + "▁", + "issue" + ], + [ + "▁m", + "ember" + ], + [ + "▁me", + "mber" + ], + [ + "▁mem", + "ber" + ], + [ + "▁", + "member" + ], + [ + "▁a", + "wait" + ], + [ + "▁aw", + "ait" + ], + [ + "▁", + "await" + ], + [ + "▁D", + "an" + ], + [ + "▁Da", + "n" + ], + [ + "▁", + "Dan" + ], + [ + "z", + "i" + ], + [ + "in", + "ate" + ], + [ + "ina", + "te" + ], + [ + "i", + "nate" + ], + [ + "▁s", + "ym" + ], + [ + "▁sy", + "m" + ], + [ + "▁", + "sym" + ], + [ + "ch", + "an" + ], + [ + "cha", + "n" + ], + [ + "c", + "han" + ], + [ + "▁J", + "ack" + ], + [ + "▁Jac", + "k" + ], + [ + "▁Ja", + "ck" + ], + [ + "▁", + "Jack" + ], + [ + "▁Eng", + "lish" + ], + [ + "▁", + "English" + ], + [ + "▁s", + "z" + ], + [ + "▁", + "sz" + ], + [ + "rib", + "utes" + ], + [ + "ribut", + "es" + ], + [ + "ribute", + "s" + ], + [ + "ribu", + "tes" + ], + [ + "▁i", + "gn" + ], + [ + "▁ig", + "n" + ], + [ + "▁", + "ign" + ], + [ + "á", + 
"l" + ], + [ + "▁app", + "ear" + ], + [ + "▁appe", + "ar" + ], + [ + "ra", + "d" + ], + [ + "r", + "ad" + ], + [ + "id", + "ge" + ], + [ + "▁co", + "uple" + ], + [ + "▁cou", + "ple" + ], + [ + "▁coup", + "le" + ], + [ + "▁s", + "hip" + ], + [ + "▁sh", + "ip" + ], + [ + "▁", + "ship" + ], + [ + "li", + "g" + ], + [ + "l", + "ig" + ], + [ + "we", + "b" + ], + [ + "w", + "eb" + ], + [ + "▁us", + "ually" + ], + [ + "▁usual", + "ly" + ], + [ + "▁re", + "ady" + ], + [ + "▁read", + "y" + ], + [ + "▁", + "ready" + ], + [ + "▁v", + "ill" + ], + [ + "▁vi", + "ll" + ], + [ + "▁vil", + "l" + ], + [ + "▁W", + "hy" + ], + [ + "▁Wh", + "y" + ], + [ + "▁", + "Why" + ], + [ + "eb", + "ru" + ], + [ + "e", + "bru" + ], + [ + "▁g", + "rad" + ], + [ + "▁gr", + "ad" + ], + [ + "▁gra", + "d" + ], + [ + "▁", + "grad" + ], + [ + "or", + "ds" + ], + [ + "ord", + "s" + ], + [ + "▁in", + "f" + ], + [ + "▁i", + "nf" + ], + [ + "▁", + "inf" + ], + [ + "▁l", + "oss" + ], + [ + "▁lo", + "ss" + ], + [ + "▁los", + "s" + ], + [ + "▁", + "loss" + ], + [ + "▁o", + "d" + ], + [ + "▁", + "od" + ], + [ + "▁Ph", + "il" + ], + [ + "▁", + "Phil" + ], + [ + "ser", + "ver" + ], + [ + "serv", + "er" + ], + [ + "serve", + "r" + ], + [ + "▁U", + "p" + ], + [ + "▁", + "Up" + ], + [ + "▁b", + "uff" + ], + [ + "▁bu", + "ff" + ], + [ + "▁buf", + "f" + ], + [ + "▁", + "buff" + ], + [ + "▁fil", + "ename" + ], + [ + "▁file", + "name" + ], + [ + "▁", + "filename" + ], + [ + "AB", + "LE" + ], + [ + "it", + "ing" + ], + [ + "iti", + "ng" + ], + [ + "i", + "ting" + ], + [ + "ef", + "ore" + ], + [ + "e", + "fore" + ], + [ + "()", + "->" + ], + [ + "(", + ")->" + ], + [ + "▁cond", + "itions" + ], + [ + "▁condition", + "s" + ], + [ + "▁", + "conditions" + ], + [ + "v", + "m" + ], + [ + "el", + "d" + ], + [ + "e", + "ld" + ], + [ + "it", + "z" + ], + [ + "i", + "tz" + ], + [ + "▁Tr", + "ans" + ], + [ + "▁Tra", + "ns" + ], + [ + "▁", + "Trans" + ], + [ + "▁w", + "eight" + ], + [ + "▁we", + "ight" + ], + [ + "▁weigh", + "t" + ], + [ + "▁", + "weight" + ], + [ + "▁high", + "er" + ], + [ + "▁hig", + "her" + ], + [ + "▁r", + "ate" + ], + [ + "▁rat", + "e" + ], + [ + "▁ra", + "te" + ], + [ + "▁", + "rate" + ], + [ + "▁acc", + "om" + ], + [ + "▁ac", + "com" + ], + [ + "vi", + "der" + ], + [ + "vid", + "er" + ], + [ + "v", + "ider" + ], + [ + "O", + "M" + ], + [ + "▁w", + "ays" + ], + [ + "▁way", + "s" + ], + [ + "▁wa", + "ys" + ], + [ + "▁", + "ways" + ], + [ + "com", + "ing" + ], + [ + "co", + "ming" + ], + [ + "c", + "oming" + ], + [ + "▁l", + "ock" + ], + [ + "▁loc", + "k" + ], + [ + "▁lo", + "ck" + ], + [ + "▁", + "lock" + ], + [ + "▁e", + "tc" + ], + [ + "▁et", + "c" + ], + [ + "▁", + "etc" + ], + [ + "▁a", + "vec" + ], + [ + "▁av", + "ec" + ], + [ + "▁ave", + "c" + ], + [ + "▁t", + "akes" + ], + [ + "▁take", + "s" + ], + [ + "▁tak", + "es" + ], + [ + "▁ta", + "kes" + ], + [ + "▁C", + "har" + ], + [ + "▁Ch", + "ar" + ], + [ + "▁Cha", + "r" + ], + [ + "▁", + "Char" + ], + [ + "▁N", + "ovember" + ], + [ + "▁Nov", + "ember" + ], + [ + "m", + "ethod" + ], + [ + "▁A", + "ustral" + ], + [ + "▁Aust", + "ral" + ], + [ + "▁", + "Austral" + ], + [ + "▁Amer", + "ica" + ], + [ + "▁", + "America" + ], + [ + "lo", + "ng" + ], + [ + "lon", + "g" + ], + [ + "l", + "ong" + ], + [ + "ce", + "mber" + ], + [ + "c", + "ember" + ], + [ + "▁polit", + "ical" + ], + [ + "fl", + "ow" + ], + [ + "f", + "low" + ], + [ + "▁may", + "be" + ], + [ + "▁", + "maybe" + ], + [ + "▁a", + "mb" + ], + [ + "▁am", + "b" + ], + [ + "▁", + "amb" + ], + [ + "La", + "yout" + ], + [ + "L", + 
"ayout" + ], + [ + "il", + "ed" + ], + [ + "ile", + "d" + ], + [ + "i", + "led" + ], + [ + "om", + "en" + ], + [ + "ome", + "n" + ], + [ + "o", + "men" + ], + [ + "ol", + "a" + ], + [ + "o", + "la" + ], + [ + "ic", + "ip" + ], + [ + "ici", + "p" + ], + [ + "i", + "cip" + ], + [ + "part", + "ial" + ], + [ + "Tr", + "ue" + ], + [ + "▁f", + "loor" + ], + [ + "▁fl", + "oor" + ], + [ + "▁flo", + "or" + ], + [ + "▁", + "floor" + ], + [ + "▁D", + "ef" + ], + [ + "▁De", + "f" + ], + [ + "▁", + "Def" + ], + [ + "▁conc", + "ern" + ], + [ + "▁conce", + "rn" + ], + [ + "▁concer", + "n" + ], + [ + "y", + "r" + ], + [ + "▁sh", + "ows" + ], + [ + "▁show", + "s" + ], + [ + "i", + "h" + ], + [ + "▁an", + "swer" + ], + [ + "▁answ", + "er" + ], + [ + "▁ans", + "wer" + ], + [ + "▁", + "answer" + ], + [ + "ac", + "c" + ], + [ + "a", + "cc" + ], + [ + "▁b", + "all" + ], + [ + "▁bal", + "l" + ], + [ + "▁ba", + "ll" + ], + [ + "▁", + "ball" + ], + [ + "▁R", + "ev" + ], + [ + "▁Re", + "v" + ], + [ + "▁", + "Rev" + ], + [ + "▁s", + "un" + ], + [ + "▁su", + "n" + ], + [ + "▁", + "sun" + ], + [ + "▁quick", + "ly" + ], + [ + "▁s", + "omet" + ], + [ + "▁so", + "met" + ], + [ + "▁some", + "t" + ], + [ + "▁som", + "et" + ], + [ + "ment", + "e" + ], + [ + "me", + "nte" + ], + [ + "men", + "te" + ], + [ + "m", + "ente" + ], + [ + "▁M", + "al" + ], + [ + "▁Ma", + "l" + ], + [ + "▁", + "Mal" + ], + [ + "und", + "red" + ], + [ + "▁iss", + "ues" + ], + [ + "▁issue", + "s" + ], + [ + "▁", + "issues" + ], + [ + "ec", + "ause" + ], + [ + "eca", + "use" + ], + [ + "pe", + "s" + ], + [ + "p", + "es" + ], + [ + "▁p", + "layer" + ], + [ + "▁pl", + "ayer" + ], + [ + "▁play", + "er" + ], + [ + "▁", + "player" + ], + [ + "▁par", + "ents" + ], + [ + "▁parent", + "s" + ], + [ + "▁", + "parents" + ], + [ + "▁pop", + "ular" + ], + [ + "▁popula", + "r" + ], + [ + "▁popul", + "ar" + ], + [ + "▁m", + "ode" + ], + [ + "▁mod", + "e" + ], + [ + "▁mo", + "de" + ], + [ + "▁", + "mode" + ], + [ + "▁m", + "ention" + ], + [ + "▁ment", + "ion" + ], + [ + "N", + "E" + ], + [ + "Lo", + "ad" + ], + [ + "L", + "oad" + ], + [ + "▁reg", + "ular" + ], + [ + "▁regul", + "ar" + ], + [ + "▁", + "regular" + ], + [ + "ave", + "d" + ], + [ + "av", + "ed" + ], + [ + "a", + "ved" + ], + [ + "?", + ":" + ], + [ + "ye", + "ar" + ], + [ + "y", + "ear" + ], + [ + "fun", + "c" + ], + [ + "fu", + "nc" + ], + [ + "f", + "unc" + ], + [ + "▁per", + "formance" + ], + [ + "▁perform", + "ance" + ], + [ + "▁J", + "uly" + ], + [ + "▁Jul", + "y" + ], + [ + "▁Ju", + "ly" + ], + [ + "th", + "ern" + ], + [ + "ther", + "n" + ], + [ + "the", + "rn" + ], + [ + "▁we", + "bsite" + ], + [ + "▁webs", + "ite" + ], + [ + "▁web", + "site" + ], + [ + "fo", + "rd" + ], + [ + "for", + "d" + ], + [ + "f", + "ord" + ], + [ + "P", + "R" + ], + [ + "el", + "a" + ], + [ + "e", + "la" + ], + [ + "le", + "vel" + ], + [ + "lev", + "el" + ], + [ + "l", + "evel" + ], + [ + "ui", + "t" + ], + [ + "u", + "it" + ], + [ + "fl", + "ags" + ], + [ + "flag", + "s" + ], + [ + "▁w", + "orth" + ], + [ + "▁wor", + "th" + ], + [ + "▁", + "worth" + ], + [ + "▁cor", + "respon" + ], + [ + "▁Brit", + "ish" + ], + [ + "si", + "m" + ], + [ + "s", + "im" + ], + [ + "▁al", + "one" + ], + [ + "▁", + "alone" + ], + [ + "▁h", + "ar" + ], + [ + "▁ha", + "r" + ], + [ + "▁", + "har" + ], + [ + "▁o", + "nes" + ], + [ + "▁on", + "es" + ], + [ + "▁one", + "s" + ], + [ + "▁", + "ones" + ], + [ + "ob", + "ile" + ], + [ + "obi", + "le" + ], + [ + "obil", + "e" + ], + [ + "▁d", + "ru" + ], + [ + "▁dr", + "u" + ], + [ + "▁", + "dru" + ], + [ 
+ "ch", + "i" + ], + [ + "c", + "hi" + ], + [ + "▁D", + "avid" + ], + [ + "▁Dav", + "id" + ], + [ + "▁Da", + "vid" + ], + [ + "▁", + "David" + ], + [ + "▁proble", + "ms" + ], + [ + "▁problem", + "s" + ], + [ + "▁col", + "umn" + ], + [ + "▁", + "column" + ], + [ + "()", + ";\r" + ], + [ + "();", + "\r" + ], + [ + "(", + ");\r" + ], + [ + "Z", + "E" + ], + [ + "▁re", + "lig" + ], + [ + "▁rel", + "ig" + ], + [ + "▁reli", + "g" + ], + [ + "olog", + "ical" + ], + [ + "▁reg", + "ion" + ], + [ + "▁", + "region" + ], + [ + "ad", + "y" + ], + [ + "a", + "dy" + ], + [ + "I", + "O" + ], + [ + "an", + "der" + ], + [ + "and", + "er" + ], + [ + "ande", + "r" + ], + [ + "a", + "nder" + ], + [ + "Ne", + "t" + ], + [ + "N", + "et" + ], + [ + "▁bu", + "ilt" + ], + [ + "▁", + "built" + ], + [ + "▁inst", + "all" + ], + [ + "▁", + "install" + ], + [ + "▁appro", + "ach" + ], + [ + "C", + "ur" + ], + [ + "▁f", + "ine" + ], + [ + "▁fin", + "e" + ], + [ + "▁fi", + "ne" + ], + [ + "▁talk", + "ing" + ], + [ + "▁tal", + "king" + ], + [ + "▁ch", + "anges" + ], + [ + "▁chang", + "es" + ], + [ + "▁change", + "s" + ], + [ + "▁", + "changes" + ], + [ + "St", + "yle" + ], + [ + "▁M", + "art" + ], + [ + "▁Mar", + "t" + ], + [ + "▁Ma", + "rt" + ], + [ + "▁", + "Mart" + ], + [ + "л", + "ю" + ], + [ + "res", + "ponse" + ], + [ + "respon", + "se" + ], + [ + "respons", + "e" + ], + [ + "te", + "ger" + ], + [ + "{", + "\r" + ], + [ + "ir", + "it" + ], + [ + "iri", + "t" + ], + [ + "i", + "rit" + ], + [ + "▁prote", + "cted" + ], + [ + "▁protect", + "ed" + ], + [ + "▁", + "protected" + ], + [ + "▁re", + "le" + ], + [ + "▁r", + "ele" + ], + [ + "▁rel", + "e" + ], + [ + "er", + "ship" + ], + [ + "ers", + "hip" + ], + [ + "те", + "ль" + ], + [ + "тел", + "ь" + ], + [ + "un", + "signed" + ], + [ + "uns", + "igned" + ], + [ + "ial", + "ize" + ], + [ + "▁htt", + "ps" + ], + [ + "▁http", + "s" + ], + [ + "▁", + "https" + ], + [ + "T", + "ag" + ], + [ + "▁$", + "(" + ], + [ + "▁", + "$(" + ], + [ + "mo", + "re" + ], + [ + "mor", + "e" + ], + [ + "m", + "ore" + ], + [ + "ype", + "s" + ], + [ + "yp", + "es" + ], + [ + "y", + "pes" + ], + [ + "▁st", + "ream" + ], + [ + "▁stre", + "am" + ], + [ + "▁", + "stream" + ], + [ + "et", + "ch" + ], + [ + "etc", + "h" + ], + [ + "▁eng", + "ine" + ], + [ + "▁", + "engine" + ], + [ + "K", + "E" + ], + [ + "cm", + "d" + ], + [ + "c", + "md" + ], + [ + "sc", + "ript" + ], + [ + "scri", + "pt" + ], + [ + "scr", + "ipt" + ], + [ + "s", + "cript" + ], + [ + "tt", + "p" + ], + [ + "t", + "tp" + ], + [ + "▁a", + "void" + ], + [ + "▁av", + "oid" + ], + [ + "▁t", + "err" + ], + [ + "▁te", + "rr" + ], + [ + "▁ter", + "r" + ], + [ + "▁r", + "ock" + ], + [ + "▁ro", + "ck" + ], + [ + "▁", + "rock" + ], + [ + "▁f", + "ul" + ], + [ + "▁fu", + "l" + ], + [ + "▁", + "ful" + ], + [ + "Up", + "date" + ], + [ + "▁env", + "ironment" + ], + [ + "▁environ", + "ment" + ], + [ + "▁", + "environment" + ], + [ + "▁p", + "rec" + ], + [ + "▁pre", + "c" + ], + [ + "▁pr", + "ec" + ], + [ + "▁", + "prec" + ], + [ + "▁с", + "а" + ], + [ + "▁", + "са" + ], + [ + "▁c", + "ases" + ], + [ + "▁case", + "s" + ], + [ + "▁cas", + "es" + ], + [ + "▁ca", + "ses" + ], + [ + "▁", + "cases" + ], + [ + "▁off", + "set" + ], + [ + "▁", + "offset" + ], + [ + "▁r", + "ais" + ], + [ + "▁ra", + "is" + ], + [ + "▁", + "rais" + ], + [ + "li", + "b" + ], + [ + "l", + "ib" + ], + [ + "ée", + "s" + ], + [ + "é", + "es" + ], + [ + "a", + "a" + ], + [ + "y", + "t" + ], + [ + "▁a", + "rr" + ], + [ + "▁ar", + "r" + ], + [ + "▁", + "arr" + ], + [ + "opy", + "right" 
+ ], + [ + "f", + "irst" + ], + [ + "▁u", + "til" + ], + [ + "▁ut", + "il" + ], + [ + "▁", + "util" + ], + [ + "▁fe", + "ature" + ], + [ + "▁feat", + "ure" + ], + [ + "▁", + "feature" + ], + [ + "pos", + "ed" + ], + [ + "po", + "sed" + ], + [ + "pose", + "d" + ], + [ + "p", + "osed" + ], + [ + "ff", + "ect" + ], + [ + "f", + "fect" + ], + [ + "ж", + "а" + ], + [ + "it", + "ude" + ], + [ + "itu", + "de" + ], + [ + "itud", + "e" + ], + [ + "em", + "ents" + ], + [ + "ement", + "s" + ], + [ + "emen", + "ts" + ], + [ + "e", + "ments" + ], + [ + "as", + "c" + ], + [ + "a", + "sc" + ], + [ + "ad", + "or" + ], + [ + "ado", + "r" + ], + [ + "le", + "ctions" + ], + [ + "lect", + "ions" + ], + [ + "lection", + "s" + ], + [ + "▁cl", + "ub" + ], + [ + "▁", + "club" + ], + [ + "]", + "{" + ], + [ + "▁*", + ")" + ], + [ + "▁", + "*)" + ], + [ + "ст", + "во" + ], + [ + "ств", + "о" + ], + [ + "с", + "тво" + ], + [ + "▁im", + "m" + ], + [ + "▁i", + "mm" + ], + [ + "▁", + "imm" + ], + [ + "▁for", + "mer" + ], + [ + "▁form", + "er" + ], + [ + "▁forme", + "r" + ], + [ + "▁", + "former" + ], + [ + "▁r", + "ights" + ], + [ + "▁right", + "s" + ], + [ + "▁dec", + "ided" + ], + [ + "▁decide", + "d" + ], + [ + "▁decid", + "ed" + ], + [ + "▁re", + "v" + ], + [ + "▁r", + "ev" + ], + [ + "▁", + "rev" + ], + [ + "▁m", + "ent" + ], + [ + "▁me", + "nt" + ], + [ + "▁men", + "t" + ], + [ + "▁", + "ment" + ], + [ + "an", + "i" + ], + [ + "a", + "ni" + ], + [ + "▁st", + "ru" + ], + [ + "▁str", + "u" + ], + [ + "▁", + "stru" + ], + [ + "▁att", + "ention" + ], + [ + "art", + "ment" + ], + [ + "▁I", + "tal" + ], + [ + "▁It", + "al" + ], + [ + "al", + "le" + ], + [ + "all", + "e" + ], + [ + "a", + "lle" + ], + [ + "▁b", + "is" + ], + [ + "▁bi", + "s" + ], + [ + "▁", + "bis" + ], + [ + "ge", + "ner" + ], + [ + "gen", + "er" + ], + [ + "g", + "ener" + ], + [ + "▁in", + "tegr" + ], + [ + "▁int", + "egr" + ], + [ + "▁inte", + "gr" + ], + [ + "▁", + "integr" + ], + [ + "el", + "lo" + ], + [ + "ell", + "o" + ], + [ + "ry", + "pt" + ], + [ + "▁a", + "chie" + ], + [ + "ne", + "s" + ], + [ + "n", + "es" + ], + [ + "▁s", + "tra" + ], + [ + "▁st", + "ra" + ], + [ + "▁str", + "a" + ], + [ + "▁", + "stra" + ], + [ + "s", + "b" + ], + [ + "▁t", + "ypes" + ], + [ + "▁type", + "s" + ], + [ + "▁typ", + "es" + ], + [ + "▁ty", + "pes" + ], + [ + "▁", + "types" + ], + [ + "▁R", + "E" + ], + [ + "▁", + "RE" + ], + [ + "In", + "it" + ], + [ + "I", + "nit" + ], + [ + "▁com", + "ment" + ], + [ + "▁comm", + "ent" + ], + [ + "▁comme", + "nt" + ], + [ + "▁", + "comment" + ], + [ + "▁add", + "ition" + ], + [ + "▁I", + "D" + ], + [ + "▁", + "ID" + ], + [ + "AR", + "T" + ], + [ + "A", + "RT" + ], + [ + "F", + "O" + ], + [ + "щ", + "и" + ], + [ + "Con", + "ne" + ], + [ + "Conn", + "e" + ], + [ + "C", + "onne" + ], + [ + "▁s", + "qu" + ], + [ + "▁sq", + "u" + ], + [ + "▁consider", + "ed" + ], + [ + "▁consid", + "ered" + ], + [ + "id", + "ad" + ], + [ + "ida", + "d" + ], + [ + "▁Oct", + "ober" + ], + [ + "ci", + "al" + ], + [ + "cia", + "l" + ], + [ + "c", + "ial" + ], + [ + "▁O", + "f" + ], + [ + "▁", + "Of" + ], + [ + "▁tr", + "avel" + ], + [ + "▁tra", + "vel" + ], + [ + "▁trav", + "el" + ], + [ + "▁b", + "oy" + ], + [ + "▁bo", + "y" + ], + [ + "▁", + "boy" + ], + [ + "')", + "." + ], + [ + "'", + ")." 
+ ], + [ + "u", + "y" + ], + [ + "il", + "la" + ], + [ + "ill", + "a" + ], + [ + "i", + "lla" + ], + [ + "is", + "try" + ], + [ + "ist", + "ry" + ], + [ + "istr", + "y" + ], + [ + "▁v", + "a" + ], + [ + "▁", + "va" + ], + [ + "▁C", + "he" + ], + [ + "▁Ch", + "e" + ], + [ + "▁", + "Che" + ], + [ + "ER", + "T" + ], + [ + "E", + "RT" + ], + [ + "en", + "de" + ], + [ + "end", + "e" + ], + [ + "e", + "nde" + ], + [ + "un", + "gen" + ], + [ + "ung", + "en" + ], + [ + "unge", + "n" + ], + [ + "ab", + "y" + ], + [ + "a", + "by" + ], + [ + "▁R", + "ober" + ], + [ + "▁Ro", + "ber" + ], + [ + "▁Rob", + "er" + ], + [ + "▁play", + "ing" + ], + [ + "il", + "s" + ], + [ + "i", + "ls" + ], + [ + "▁s", + "am" + ], + [ + "▁sa", + "m" + ], + [ + "▁", + "sam" + ], + [ + "▁ex", + "ecut" + ], + [ + "▁exec", + "ut" + ], + [ + "▁", + "execut" + ], + [ + "▁U", + "s" + ], + [ + "▁", + "Us" + ], + [ + "▁m", + "ut" + ], + [ + "▁mu", + "t" + ], + [ + "▁", + "mut" + ], + [ + "▁b", + "al" + ], + [ + "▁ba", + "l" + ], + [ + "▁", + "bal" + ], + [ + "as", + "se" + ], + [ + "ass", + "e" + ], + [ + "▁k", + "ids" + ], + [ + "▁kid", + "s" + ], + [ + "▁ki", + "ds" + ], + [ + "▁fin", + "anc" + ], + [ + "go", + "r" + ], + [ + "g", + "or" + ], + [ + "▁S", + "ec" + ], + [ + "▁Se", + "c" + ], + [ + "▁", + "Sec" + ], + [ + "ber", + "t" + ], + [ + "be", + "rt" + ], + [ + "b", + "ert" + ], + [ + "▁H", + "igh" + ], + [ + "▁Hig", + "h" + ], + [ + "▁Hi", + "gh" + ], + [ + "▁", + "High" + ], + [ + "▁", + "је" + ], + [ + "▁ke", + "pt" + ], + [ + "but", + "ton" + ], + [ + "b", + "utton" + ], + [ + "it", + "ory" + ], + [ + "itor", + "y" + ], + [ + "ito", + "ry" + ], + [ + "▁R", + "em" + ], + [ + "▁Re", + "m" + ], + [ + "▁", + "Rem" + ], + [ + "▁D", + "E" + ], + [ + "▁", + "DE" + ], + [ + "▁re", + "ach" + ], + [ + "▁r", + "each" + ], + [ + "▁", + "reach" + ], + [ + "▁b", + "ur" + ], + [ + "▁bu", + "r" + ], + [ + "▁", + "bur" + ], + [ + "La", + "bel" + ], + [ + "L", + "abel" + ], + [ + "á", + "t" + ], + [ + "ag", + "o" + ], + [ + "a", + "go" + ], + [ + "▁pass", + "ed" + ], + [ + "▁pas", + "sed" + ], + [ + "▁be", + "hav" + ], + [ + "▁beh", + "av" + ], + [ + "xF", + "F" + ], + [ + "x", + "FF" + ], + [ + "▁R", + "eturn" + ], + [ + "▁Re", + "turn" + ], + [ + "▁Ret", + "urn" + ], + [ + "▁", + "Return" + ], + [ + "ST", + "R" + ], + [ + "S", + "TR" + ], + [ + "▁L", + "es" + ], + [ + "▁Le", + "s" + ], + [ + "▁", + "Les" + ], + [ + "▁o", + "rd" + ], + [ + "▁or", + "d" + ], + [ + "▁", + "ord" + ], + [ + "al", + "a" + ], + [ + "a", + "la" + ], + [ + "in", + "ger" + ], + [ + "ing", + "er" + ], + [ + "inge", + "r" + ], + [ + "▁S", + "ince" + ], + [ + "▁Sin", + "ce" + ], + [ + "▁", + "Since" + ], + [ + "▁exper", + "i" + ], + [ + "▁exp", + "eri" + ], + [ + "▁s", + "hall" + ], + [ + "▁sh", + "all" + ], + [ + "▁sha", + "ll" + ], + [ + "▁", + "shall" + ], + [ + "▁s", + "tar" + ], + [ + "▁st", + "ar" + ], + [ + "▁sta", + "r" + ], + [ + "▁", + "star" + ], + [ + "no", + "n" + ], + [ + "n", + "on" + ], + [ + "▁g", + "un" + ], + [ + "▁gu", + "n" + ], + [ + "▁", + "gun" + ], + [ + "▁B", + "el" + ], + [ + "▁Be", + "l" + ], + [ + "▁", + "Bel" + ], + [ + "▁ob", + "j" + ], + [ + "▁", + "obj" + ], + [ + "ar", + "es" + ], + [ + "are", + "s" + ], + [ + "a", + "res" + ], + [ + "r", + "s" + ], + [ + "▁we", + "eks" + ], + [ + "▁week", + "s" + ], + [ + "ne", + "n" + ], + [ + "n", + "en" + ], + [ + "▁S", + "tre" + ], + [ + "▁St", + "re" + ], + [ + "▁Str", + "e" + ], + [ + "or", + "ing" + ], + [ + "ori", + "ng" + ], + [ + "o", + "ring" + ], + [ + "▁", + "î" + ], + [ + "▁ser", + 
"ious" + ], + [ + "time", + "s" + ], + [ + "ti", + "mes" + ], + [ + "tim", + "es" + ], + [ + "t", + "imes" + ], + [ + "▁H", + "ouse" + ], + [ + "▁Ho", + "use" + ], + [ + "▁Hou", + "se" + ], + [ + "▁r", + "oll" + ], + [ + "▁ro", + "ll" + ], + [ + "▁", + "roll" + ], + [ + "▁reg", + "ister" + ], + [ + "▁", + "register" + ], + [ + "▁mod", + "ule" + ], + [ + "▁mo", + "dule" + ], + [ + "▁", + "module" + ], + [ + "▁app", + "lic" + ], + [ + "▁ap", + "plic" + ], + [ + "▁appl", + "ic" + ], + [ + "I", + "R" + ], + [ + "▁c", + "ook" + ], + [ + "▁co", + "ok" + ], + [ + "▁", + "cook" + ], + [ + "au", + "x" + ], + [ + "a", + "ux" + ], + [ + "▁s", + "ave" + ], + [ + "▁sa", + "ve" + ], + [ + "▁sav", + "e" + ], + [ + "▁", + "save" + ], + [ + "▁C", + "r" + ], + [ + "▁", + "Cr" + ], + [ + ",", + "\r" + ], + [ + "▁st", + "ates" + ], + [ + "▁stat", + "es" + ], + [ + "▁state", + "s" + ], + [ + "▁sta", + "tes" + ], + [ + "▁", + "states" + ], + [ + "▁em", + "pty" + ], + [ + "▁emp", + "ty" + ], + [ + "▁empt", + "y" + ], + [ + "▁", + "empty" + ], + [ + "▁aut", + "om" + ], + [ + "▁au", + "tom" + ], + [ + "▁auto", + "m" + ], + [ + "▁", + "autom" + ], + [ + "fig", + "ure" + ], + [ + "ian", + "ce" + ], + [ + "i", + "ance" + ], + [ + "▁h", + "appy" + ], + [ + "▁happ", + "y" + ], + [ + "▁f", + "n" + ], + [ + "▁", + "fn" + ], + [ + "▁j", + "ud" + ], + [ + "▁ju", + "d" + ], + [ + "▁", + "jud" + ], + [ + "▁h", + "at" + ], + [ + "▁ha", + "t" + ], + [ + "▁", + "hat" + ], + [ + "AC", + "K" + ], + [ + "A", + "CK" + ], + [ + "▁F", + "e" + ], + [ + "▁", + "Fe" + ], + [ + "$", + "-" + ], + [ + "iv", + "il" + ], + [ + "ivi", + "l" + ], + [ + "i", + "vil" + ], + [ + "ot", + "ed" + ], + [ + "ote", + "d" + ], + [ + "o", + "ted" + ], + [ + "▁size", + "of" + ], + [ + "▁", + "sizeof" + ], + [ + "▁sit", + "uation" + ], + [ + "▁situ", + "ation" + ], + [ + "▁l", + "ives" + ], + [ + "▁li", + "ves" + ], + [ + "▁live", + "s" + ], + [ + "▁liv", + "es" + ], + [ + "▁fe", + "eling" + ], + [ + "▁feel", + "ing" + ], + [ + "▁fee", + "ling" + ], + [ + "▁r", + "isk" + ], + [ + "▁ri", + "sk" + ], + [ + "▁ris", + "k" + ], + [ + "▁Jan", + "uary" + ], + [ + "▁Januar", + "y" + ], + [ + "▁Ob", + "ject" + ], + [ + "▁", + "Object" + ], + [ + "▁re", + "comm" + ], + [ + "▁rec", + "omm" + ], + [ + "▁в", + "ы" + ], + [ + "▁", + "вы" + ], + [ + "▁pot", + "ential" + ], + [ + "ea", + "h" + ], + [ + "e", + "ah" + ], + [ + "▁com", + "plex" + ], + [ + "▁comp", + "lex" + ], + [ + "▁compl", + "ex" + ], + [ + "▁", + "complex" + ], + [ + "print", + "f" + ], + [ + "ist", + "ance" + ], + [ + "istan", + "ce" + ], + [ + "i", + "stance" + ], + [ + "ir", + "th" + ], + [ + "irt", + "h" + ], + [ + "li", + "k" + ], + [ + "l", + "ik" + ], + [ + "as", + "te" + ], + [ + "ast", + "e" + ], + [ + "a", + "ste" + ], + [ + "▁wh", + "ose" + ], + [ + "▁who", + "se" + ], + [ + "Ar", + "g" + ], + [ + "A", + "rg" + ], + [ + "▁mod", + "ern" + ], + [ + "▁mo", + "dern" + ], + [ + "▁mode", + "rn" + ], + [ + "▁moder", + "n" + ], + [ + "ion", + "es" + ], + [ + "io", + "nes" + ], + [ + "ione", + "s" + ], + [ + "i", + "ones" + ], + [ + "▁ч", + "е" + ], + [ + "▁", + "че" + ], + [ + "▁s", + "ett" + ], + [ + "▁se", + "tt" + ], + [ + "▁set", + "t" + ], + [ + "▁M", + "ag" + ], + [ + "▁Ma", + "g" + ], + [ + "▁", + "Mag" + ], + [ + "a", + "e" + ], + [ + "▁cond", + "ition" + ], + [ + "▁", + "condition" + ], + [ + "Le", + "ngth" + ], + [ + "L", + "ength" + ], + [ + "▁f", + "it" + ], + [ + "▁fi", + "t" + ], + [ + "▁", + "fit" + ], + [ + "ound", + "s" + ], + [ + "oun", + "ds" + ], + [ + "▁ch", + "anged" + ], + [ + 
"▁chang", + "ed" + ], + [ + "▁change", + "d" + ], + [ + "▁", + "changed" + ], + [ + "▁g", + "uy" + ], + [ + "▁gu", + "y" + ], + [ + "fil", + "ter" + ], + [ + "at", + "ever" + ], + [ + "ate", + "ver" + ], + [ + "é", + "d" + ], + [ + "re", + "move" + ], + [ + "rem", + "ove" + ], + [ + "▁h", + "op" + ], + [ + "▁ho", + "p" + ], + [ + "▁", + "hop" + ], + [ + "▁O", + "ut" + ], + [ + "▁", + "Out" + ], + [ + "▁R", + "ich" + ], + [ + "▁Ric", + "h" + ], + [ + "▁", + "Rich" + ], + [ + "ch", + "ild" + ], + [ + "chi", + "ld" + ], + [ + "▁in", + "cluded" + ], + [ + "▁incl", + "uded" + ], + [ + "▁includ", + "ed" + ], + [ + "▁include", + "d" + ], + [ + "▁inclu", + "ded" + ], + [ + "$", + "\\" + ], + [ + "▁T", + "om" + ], + [ + "▁To", + "m" + ], + [ + "▁", + "Tom" + ], + [ + "el", + "ine" + ], + [ + "eli", + "ne" + ], + [ + "elin", + "e" + ], + [ + "e", + "line" + ], + [ + "▁s", + "ometimes" + ], + [ + "▁some", + "times" + ], + [ + "▁somet", + "imes" + ], + [ + "▁sometime", + "s" + ], + [ + "▁dr", + "ink" + ], + [ + "▁qu", + "ant" + ], + [ + "▁", + "quant" + ], + [ + "▁p", + "lease" + ], + [ + "▁ple", + "ase" + ], + [ + "▁I", + "nt" + ], + [ + "▁In", + "t" + ], + [ + "▁", + "Int" + ], + [ + "ri", + "ef" + ], + [ + "rie", + "f" + ], + [ + "r", + "ief" + ], + [ + "▁ex", + "actly" + ], + [ + "▁exact", + "ly" + ], + [ + "ci", + "ng" + ], + [ + "cin", + "g" + ], + [ + "c", + "ing" + ], + [ + "▁all", + "owed" + ], + [ + "▁allow", + "ed" + ], + [ + "▁", + "allowed" + ], + [ + "bu", + "ild" + ], + [ + "b", + "uild" + ], + [ + "▁beaut", + "iful" + ], + [ + "▁W", + "ell" + ], + [ + "▁We", + "ll" + ], + [ + "▁Wel", + "l" + ], + [ + "▁", + "Well" + ], + [ + "▁look", + "s" + ], + [ + "▁lo", + "oks" + ], + [ + "▁", + "ü" + ], + [ + "▁ch", + "ance" + ], + [ + "▁w", + "rote" + ], + [ + "▁wr", + "ote" + ], + [ + "▁n", + "or" + ], + [ + "▁no", + "r" + ], + [ + "▁", + "nor" + ], + [ + "▁f", + "ailed" + ], + [ + "▁fa", + "iled" + ], + [ + "▁fail", + "ed" + ], + [ + "▁", + "failed" + ], + [ + "Me", + "t" + ], + [ + "M", + "et" + ], + [ + "▁p", + "rior" + ], + [ + "▁pr", + "ior" + ], + [ + "▁pri", + "or" + ], + [ + "▁h", + "undred" + ], + [ + "ско", + "й" + ], + [ + "с", + "кой" + ], + [ + "or", + "ia" + ], + [ + "ori", + "a" + ], + [ + "o", + "ria" + ], + [ + "▁c", + "y" + ], + [ + "▁", + "cy" + ], + [ + "▁w", + "eb" + ], + [ + "▁we", + "b" + ], + [ + "▁", + "web" + ], + [ + "▁m", + "ess" + ], + [ + "▁me", + "ss" + ], + [ + "▁mes", + "s" + ], + [ + "le", + "q" + ], + [ + "l", + "eq" + ], + [ + "d", + "y" + ], + [ + "te", + "x" + ], + [ + "t", + "ex" + ], + [ + "▁a", + "nim" + ], + [ + "▁an", + "im" + ], + [ + "▁", + "anim" + ], + [ + "at", + "ur" + ], + [ + "atu", + "r" + ], + [ + "▁str", + "ucture" + ], + [ + "▁struct", + "ure" + ], + [ + "▁", + "structure" + ], + [ + "opt", + "ion" + ], + [ + "o", + "ption" + ], + [ + "▁act", + "ual" + ], + [ + "▁", + "actual" + ], + [ + "▁Fr", + "anc" + ], + [ + "▁Fra", + "nc" + ], + [ + "▁Fran", + "c" + ], + [ + "en", + "ced" + ], + [ + "ence", + "d" + ], + [ + "enc", + "ed" + ], + [ + ".<", + "/" + ], + [ + ".", + "" + ], + [ + "▁", + "/>" + ], + [ + "▁p", + "roduction" + ], + [ + "▁produ", + "ction" + ], + [ + "▁product", + "ion" + ], + [ + "▁prod", + "uction" + ], + [ + "▁", + "production" + ], + [ + "ig", + "er" + ], + [ + "ige", + "r" + ], + [ + "i", + "ger" + ], + [ + "▁с", + "т" + ], + [ + "▁", + "ст" + ], + [ + "sh", + "ow" + ], + [ + "s", + "how" + ], + [ + "▁pop", + "ulation" + ], + [ + "▁popul", + "ation" + ], + [ + "▁p", + "ark" + ], + [ + "▁par", + "k" + ], + [ + "▁", + "park" 
+ ], + [ + "▁Z", + "e" + ], + [ + "▁necess", + "ary" + ], + [ + "▁", + "necessary" + ], + [ + "▁t", + "rust" + ], + [ + "▁tr", + "ust" + ], + [ + "▁sh", + "own" + ], + [ + "▁show", + "n" + ], + [ + "mod", + "ule" + ], + [ + "mo", + "dule" + ], + [ + "G", + "E" + ], + [ + "▁l", + "ay" + ], + [ + "▁la", + "y" + ], + [ + "▁", + "lay" + ], + [ + "▁ann", + "oun" + ], + [ + "▁class", + "Name" + ], + [ + "▁", + "className" + ], + [ + "▁cal", + "cul" + ], + [ + "▁calc", + "ul" + ], + [ + "Fun", + "ction" + ], + [ + "F", + "unction" + ], + [ + "▁S", + "al" + ], + [ + "▁Sa", + "l" + ], + [ + "▁", + "Sal" + ], + [ + "O", + "K" + ], + [ + "T", + "P" + ], + [ + "▁en", + "try" + ], + [ + "▁ent", + "ry" + ], + [ + "▁entr", + "y" + ], + [ + "▁", + "entry" + ], + [ + "▁St", + "ud" + ], + [ + "▁", + "Stud" + ], + [ + "▁it", + "ems" + ], + [ + "▁item", + "s" + ], + [ + "▁", + "items" + ], + [ + "▁se", + "curity" + ], + [ + "▁sec", + "urity" + ], + [ + "▁secur", + "ity" + ], + [ + "▁", + "security" + ], + [ + "En", + "try" + ], + [ + "Ent", + "ry" + ], + [ + "f", + "loat" + ], + [ + "l", + "s" + ], + [ + "ib", + "ly" + ], + [ + "▁cont", + "ribut" + ], + [ + "▁C", + "heck" + ], + [ + "▁Che", + "ck" + ], + [ + "▁", + "Check" + ], + [ + "M", + "D" + ], + [ + "▁impro", + "ve" + ], + [ + "Par", + "t" + ], + [ + "P", + "art" + ], + [ + "▁system", + "s" + ], + [ + "▁syst", + "ems" + ], + [ + "B", + "l" + ], + [ + "▁pol", + "icy" + ], + [ + "▁polic", + "y" + ], + [ + "▁", + "policy" + ], + [ + "▁s", + "creen" + ], + [ + "▁sc", + "reen" + ], + [ + "▁scr", + "een" + ], + [ + "▁", + "screen" + ], + [ + "▁A", + "ny" + ], + [ + "▁An", + "y" + ], + [ + "▁", + "Any" + ], + [ + "▁op", + "ened" + ], + [ + "▁open", + "ed" + ], + [ + "al", + "loc" + ], + [ + "all", + "oc" + ], + [ + "allo", + "c" + ], + [ + "▁De", + "cember" + ], + [ + "▁Dec", + "ember" + ], + [ + "▁", + "É" + ], + [ + "▁e", + "mail" + ], + [ + "▁em", + "ail" + ], + [ + "▁", + "email" + ], + [ + "ad", + "er" + ], + [ + "ade", + "r" + ], + [ + "a", + "der" + ], + [ + "=", + ">" + ], + [ + "▁H", + "en" + ], + [ + "▁He", + "n" + ], + [ + "▁", + "Hen" + ], + [ + "▁in", + "fo" + ], + [ + "▁inf", + "o" + ], + [ + "▁", + "info" + ], + [ + "▁f", + "loat" + ], + [ + "▁flo", + "at" + ], + [ + "▁", + "float" + ], + [ + "▁sw", + "itch" + ], + [ + "▁", + "switch" + ], + [ + "ра", + "н" + ], + [ + "р", + "ан" + ], + [ + "ur", + "ance" + ], + [ + "▁as", + "sum" + ], + [ + "▁ass", + "um" + ], + [ + "us", + "tr" + ], + [ + "ust", + "r" + ], + [ + "u", + "str" + ], + [ + "▁g", + "roups" + ], + [ + "▁group", + "s" + ], + [ + "▁gro", + "ups" + ], + [ + "▁", + "groups" + ], + [ + "▁R", + "ead" + ], + [ + "▁Re", + "ad" + ], + [ + "▁", + "Read" + ], + [ + "▁w", + "at" + ], + [ + "▁wa", + "t" + ], + [ + "S", + "p" + ], + [ + "ве", + "р" + ], + [ + "в", + "ер" + ], + [ + "RA", + "N" + ], + [ + "R", + "AN" + ], + [ + "hi", + "b" + ], + [ + "h", + "ib" + ], + [ + "AL", + "L" + ], + [ + "A", + "LL" + ], + [ + "▁h", + "us" + ], + [ + "▁", + "hus" + ], + [ + "Sp", + "ec" + ], + [ + "Spe", + "c" + ], + [ + "S", + "pec" + ], + [ + "\")", + ")" + ], + [ + "\"", + "))" + ], + [ + "▁F", + "rench" + ], + [ + "▁C", + "lass" + ], + [ + "▁Cl", + "ass" + ], + [ + "▁", + "Class" + ], + [ + "▁pres", + "ident" + ], + [ + "▁presid", + "ent" + ], + [ + "▁def", + "init" + ], + [ + "▁defin", + "it" + ], + [ + "▁N", + "or" + ], + [ + "▁No", + "r" + ], + [ + "▁T", + "hom" + ], + [ + "▁Th", + "om" + ], + [ + "ai", + "gn" + ], + [ + "a", + "ign" + ], + [ + "W", + "idth" + ], + [ + "D", + "o" + ], + [ + "▁{", + 
"@" + ], + [ + "ag", + "on" + ], + [ + "ago", + "n" + ], + [ + "a", + "gon" + ], + [ + "▁L", + "u" + ], + [ + "▁", + "Lu" + ], + [ + "▁follow", + "ed" + ], + [ + "M", + "M" + ], + [ + "as", + "ons" + ], + [ + "ason", + "s" + ], + [ + "tm", + "p" + ], + [ + "t", + "mp" + ], + [ + "▁th", + "rows" + ], + [ + "▁throw", + "s" + ], + [ + "▁thr", + "ows" + ], + [ + "▁thro", + "ws" + ], + [ + "▁", + "throws" + ], + [ + "IT", + "Y" + ], + [ + "I", + "TY" + ], + [ + "но", + "м" + ], + [ + "▁f", + "air" + ], + [ + "▁fa", + "ir" + ], + [ + "▁p", + "en" + ], + [ + "▁pe", + "n" + ], + [ + "▁", + "pen" + ], + [ + "é", + "g" + ], + [ + "▁inter", + "face" + ], + [ + "▁", + "interface" + ], + [ + "▁s", + "af" + ], + [ + "▁sa", + "f" + ], + [ + "oo", + "n" + ], + [ + "o", + "on" + ], + [ + "B", + "ack" + ], + [ + "▁s", + "peed" + ], + [ + "▁sp", + "eed" + ], + [ + "▁spe", + "ed" + ], + [ + "▁", + "speed" + ], + [ + "▁ext", + "ends" + ], + [ + "▁extend", + "s" + ], + [ + "em", + "pty" + ], + [ + "empt", + "y" + ], + [ + "emp", + "ty" + ], + [ + "▁п", + "ере" + ], + [ + "▁пер", + "е" + ], + [ + "▁пе", + "ре" + ], + [ + "▁pro", + "per" + ], + [ + "▁pr", + "oper" + ], + [ + "▁prop", + "er" + ], + [ + "▁d", + "riv" + ], + [ + "▁dr", + "iv" + ], + [ + "▁dri", + "v" + ], + [ + "ф", + "и" + ], + [ + "▁c", + "enter" + ], + [ + "▁cent", + "er" + ], + [ + "▁", + "center" + ], + [ + "he", + "ader" + ], + [ + "head", + "er" + ], + [ + "▁}", + ")" + ], + [ + "▁", + "})" + ], + [ + "w", + "a" + ], + [ + "▁m", + "iddle" + ], + [ + "▁", + "middle" + ], + [ + "▁ch", + "oose" + ], + [ + "▁cho", + "ose" + ], + [ + "▁St", + "ad" + ], + [ + "▁Sta", + "d" + ], + [ + "S", + "O" + ], + [ + "Fact", + "ory" + ], + [ + "Factor", + "y" + ], + [ + "F", + "actory" + ], + [ + "De", + "v" + ], + [ + "D", + "ev" + ], + [ + "ic", + "les" + ], + [ + "icle", + "s" + ], + [ + "icl", + "es" + ], + [ + "i", + "cles" + ], + [ + "▁ap", + "plication" + ], + [ + "▁applic", + "ation" + ], + [ + "▁appl", + "ication" + ], + [ + "▁", + "application" + ], + [ + "▁mod", + "els" + ], + [ + "▁model", + "s" + ], + [ + "▁mode", + "ls" + ], + [ + "▁", + "models" + ], + [ + "pi", + "te" + ], + [ + "pit", + "e" + ], + [ + "p", + "ite" + ], + [ + "ca", + "p" + ], + [ + "c", + "ap" + ], + [ + "x", + "i" + ], + [ + "osp", + "ital" + ], + [ + "▁d", + "ream" + ], + [ + "▁dre", + "am" + ], + [ + "EN", + "D" + ], + [ + "E", + "ND" + ], + [ + "▁con", + "tract" + ], + [ + "▁cont", + "ract" + ], + [ + "▁contr", + "act" + ], + [ + "▁contra", + "ct" + ], + [ + "▁", + "contract" + ], + [ + "icro", + "soft" + ], + [ + "▁th", + "ous" + ], + [ + "▁thou", + "s" + ], + [ + "iz", + "es" + ], + [ + "ize", + "s" + ], + [ + "i", + "zes" + ], + [ + "▁д", + "а" + ], + [ + "▁", + "да" + ], + [ + "▁C", + "O" + ], + [ + "▁", + "CO" + ], + [ + "▁d", + "irection" + ], + [ + "▁di", + "rection" + ], + [ + "▁direct", + "ion" + ], + [ + "▁dire", + "ction" + ], + [ + "▁dir", + "ection" + ], + [ + "▁", + "direction" + ], + [ + "▁`", + "`" + ], + [ + "▁", + "``" + ], + [ + "▁d", + "rive" + ], + [ + "▁dr", + "ive" + ], + [ + "▁dri", + "ve" + ], + [ + "▁driv", + "e" + ], + [ + "▁", + "drive" + ], + [ + "Ma", + "x" + ], + [ + "M", + "ax" + ], + [ + "ci", + "a" + ], + [ + "c", + "ia" + ], + [ + "▁contin", + "u" + ], + [ + "▁A", + "lex" + ], + [ + "▁Al", + "ex" + ], + [ + "▁Ale", + "x" + ], + [ + "▁", + "Alex" + ], + [ + "▁g", + "old" + ], + [ + "▁go", + "ld" + ], + [ + "▁gol", + "d" + ], + [ + "▁", + "gold" + ], + [ + "▁p", + "rep" + ], + [ + "▁pre", + "p" + ], + [ + "▁pr", + "ep" + ], + [ + "▁or", + 
"igin" + ], + [ + "▁orig", + "in" + ], + [ + "▁", + "origin" + ], + [ + "▁r", + "ap" + ], + [ + "▁ra", + "p" + ], + [ + "▁", + "rap" + ], + [ + "O", + "p" + ], + [ + "ous", + "ly" + ], + [ + "▁are", + "as" + ], + [ + "▁area", + "s" + ], + [ + "PO", + "RT" + ], + [ + "P", + "ORT" + ], + [ + "он", + "а" + ], + [ + "о", + "на" + ], + [ + "▁sa", + "fe" + ], + [ + "▁saf", + "e" + ], + [ + "▁", + "safe" + ], + [ + "▁profess", + "ional" + ], + [ + "▁profession", + "al" + ], + [ + "ap", + "ache" + ], + [ + "apa", + "che" + ], + [ + "▁t", + "emper" + ], + [ + "▁tem", + "per" + ], + [ + "▁temp", + "er" + ], + [ + "s", + "z" + ], + [ + "▁u", + "nit" + ], + [ + "▁un", + "it" + ], + [ + "▁", + "unit" + ], + [ + "▁c", + "op" + ], + [ + "▁co", + "p" + ], + [ + "▁", + "cop" + ], + [ + "eq", + "n" + ], + [ + "List", + "ener" + ], + [ + "Listen", + "er" + ], + [ + "▁for", + "mat" + ], + [ + "▁form", + "at" + ], + [ + "▁forma", + "t" + ], + [ + "▁", + "format" + ], + [ + "se", + "lect" + ], + [ + "sel", + "ect" + ], + [ + "s", + "elect" + ], + [ + "▁com", + "fort" + ], + [ + "▁", + "comfort" + ], + [ + "▁me", + "ant" + ], + [ + "▁mean", + "t" + ], + [ + "id", + "ay" + ], + [ + "ida", + "y" + ], + [ + "i", + "day" + ], + [ + "em", + "e" + ], + [ + "e", + "me" + ], + [ + "▁act", + "ive" + ], + [ + "▁activ", + "e" + ], + [ + "▁", + "active" + ], + [ + "▁n", + "ote" + ], + [ + "▁not", + "e" + ], + [ + "▁no", + "te" + ], + [ + "▁", + "note" + ], + [ + "▁M", + "il" + ], + [ + "▁Mi", + "l" + ], + [ + "▁", + "Mil" + ], + [ + "on", + "ly" + ], + [ + "▁<", + "=" + ], + [ + "▁", + "<=" + ], + [ + "▁ne", + "igh" + ], + [ + "▁nei", + "gh" + ], + [ + "a", + "o" + ], + [ + "▁bl", + "ue" + ], + [ + "▁", + "blue" + ], + [ + "▁T", + "V" + ], + [ + "▁", + "TV" + ], + [ + "Ch", + "ild" + ], + [ + "▁re", + "ached" + ], + [ + "▁reach", + "ed" + ], + [ + "Add", + "ress" + ], + [ + "Addr", + "ess" + ], + [ + "ст", + "в" + ], + [ + "▁cl", + "osed" + ], + [ + "▁close", + "d" + ], + [ + "▁clos", + "ed" + ], + [ + "▁clo", + "sed" + ], + [ + "▁", + "closed" + ], + [ + "in", + "der" + ], + [ + "ind", + "er" + ], + [ + "inde", + "r" + ], + [ + "i", + "nder" + ], + [ + "ol", + "o" + ], + [ + "o", + "lo" + ], + [ + "▁a", + "lt" + ], + [ + "▁al", + "t" + ], + [ + "▁", + "alt" + ], + [ + "▁a", + "dm" + ], + [ + "▁ad", + "m" + ], + [ + "Form", + "at" + ], + [ + "For", + "mat" + ], + [ + "U", + "I" + ], + [ + "▁H", + "am" + ], + [ + "▁Ha", + "m" + ], + [ + "▁f", + "requ" + ], + [ + "▁fr", + "equ" + ], + [ + "▁fre", + "qu" + ], + [ + "▁in", + "depend" + ], + [ + "▁inde", + "pend" + ], + [ + "▁", + "independ" + ], + [ + "▁eas", + "ily" + ], + [ + "▁L", + "and" + ], + [ + "▁La", + "nd" + ], + [ + "▁Lan", + "d" + ], + [ + "▁", + "Land" + ], + [ + "▁t", + "or" + ], + [ + "▁to", + "r" + ], + [ + "▁", + "tor" + ], + [ + "ograph", + "y" + ], + [ + "ograp", + "hy" + ], + [ + "in", + "fty" + ], + [ + "inf", + "ty" + ], + [ + "▁W", + "ork" + ], + [ + "▁Wor", + "k" + ], + [ + "▁", + "Work" + ], + [ + "iv", + "en" + ], + [ + "ive", + "n" + ], + [ + "i", + "ven" + ], + [ + "▁Count", + "y" + ], + [ + "▁Coun", + "ty" + ], + [ + "▁s", + "rc" + ], + [ + "▁", + "src" + ], + [ + "}$", + "," + ], + [ + "}", + "$," + ], + [ + "par", + "se" + ], + [ + "pars", + "e" + ], + [ + "p", + "arse" + ], + [ + "C", + "D" + ], + [ + "▁C", + "our" + ], + [ + "▁Co", + "ur" + ], + [ + "▁Cou", + "r" + ], + [ + "▁f", + "ol" + ], + [ + "▁fo", + "l" + ], + [ + "▁", + "fol" + ], + [ + "Ent", + "ity" + ], + [ + "pg", + "f" + ], + [ + "▁Ch", + "ina" + ], + [ + "▁Chi", + "na" + ], + [ + 
"▁S", + "ub" + ], + [ + "▁Su", + "b" + ], + [ + "▁", + "Sub" + ], + [ + "ho", + "od" + ], + [ + "h", + "ood" + ], + [ + "▁field", + "s" + ], + [ + "▁", + "fields" + ], + [ + "▁y", + "es" + ], + [ + "▁ye", + "s" + ], + [ + "▁", + "yes" + ], + [ + "re", + "nd" + ], + [ + "ren", + "d" + ], + [ + "r", + "end" + ], + [ + "▁to", + "wards" + ], + [ + "▁toward", + "s" + ], + [ + "▁tow", + "ards" + ], + [ + "▁st", + "aff" + ], + [ + "▁sta", + "ff" + ], + [ + "▁", + "staff" + ], + [ + "▁A", + "ir" + ], + [ + "▁", + "Air" + ], + [ + "▁st", + "ation" + ], + [ + "▁stat", + "ion" + ], + [ + "▁", + "station" + ], + [ + "at", + "ives" + ], + [ + "ative", + "s" + ], + [ + "ati", + "ves" + ], + [ + "ativ", + "es" + ], + [ + "▁imp", + "act" + ], + [ + "в", + "ы" + ], + [ + "▁direct", + "ly" + ], + [ + "iss", + "ions" + ], + [ + "ission", + "s" + ], + [ + "iv", + "a" + ], + [ + "i", + "va" + ], + [ + "|", + "\\" + ], + [ + "Pt", + "r" + ], + [ + "P", + "tr" + ], + [ + "▁S", + "ant" + ], + [ + "▁San", + "t" + ], + [ + "▁Sa", + "nt" + ], + [ + "Po", + "l" + ], + [ + "P", + "ol" + ], + [ + "▁pro", + "gress" + ], + [ + "▁", + "progress" + ], + [ + "it", + "ar" + ], + [ + "ita", + "r" + ], + [ + "i", + "tar" + ], + [ + "▁p", + "arts" + ], + [ + "▁part", + "s" + ], + [ + "▁par", + "ts" + ], + [ + "▁", + "parts" + ], + [ + "▁pl", + "ant" + ], + [ + "▁plan", + "t" + ], + [ + "▁", + "plant" + ], + [ + "▁abs", + "olut" + ], + [ + "▁gu", + "ess" + ], + [ + "eq", + "ref" + ], + [ + "▁t", + "im" + ], + [ + "▁ti", + "m" + ], + [ + "▁", + "tim" + ], + [ + "▁L", + "ou" + ], + [ + "▁Lo", + "u" + ], + [ + "▁", + "Lou" + ], + [ + "▁c", + "ool" + ], + [ + "▁co", + "ol" + ], + [ + "al", + "u" + ], + [ + "a", + "lu" + ], + [ + "▁m", + "outh" + ], + [ + "▁mo", + "uth" + ], + [ + "▁mou", + "th" + ], + [ + "▁", + "mouth" + ], + [ + "ни", + "х" + ], + [ + "▁h", + "eight" + ], + [ + "▁he", + "ight" + ], + [ + "▁", + "height" + ], + [ + "ge", + "st" + ], + [ + "ges", + "t" + ], + [ + "g", + "est" + ], + [ + "▁P", + "ost" + ], + [ + "▁Po", + "st" + ], + [ + "▁Pos", + "t" + ], + [ + "▁", + "Post" + ], + [ + "▁b", + "oard" + ], + [ + "▁bo", + "ard" + ], + [ + "▁", + "board" + ], + [ + "▁t", + "it" + ], + [ + "▁ti", + "t" + ], + [ + "▁", + "tit" + ], + [ + "▁h", + "our" + ], + [ + "▁ho", + "ur" + ], + [ + "▁", + "hour" + ], + [ + "▁ser", + "ver" + ], + [ + "▁serv", + "er" + ], + [ + "▁serve", + "r" + ], + [ + "▁", + "server" + ], + [ + "▁p", + "layers" + ], + [ + "▁play", + "ers" + ], + [ + "▁player", + "s" + ], + [ + "ri", + "er" + ], + [ + "rie", + "r" + ], + [ + "r", + "ier" + ], + [ + "Lin", + "k" + ], + [ + "L", + "ink" + ], + [ + "▁Pres", + "ident" + ], + [ + "]", + "(" + ], + [ + "▁con", + "struct" + ], + [ + "▁const", + "ruct" + ], + [ + "▁constr", + "uct" + ], + [ + "▁constru", + "ct" + ], + [ + "▁", + "construct" + ], + [ + "hand", + "le" + ], + [ + "}$", + "." + ], + [ + "}", + "$." 
+ ], + [ + "ry", + "ing" + ], + [ + "r", + "ying" + ], + [ + "▁s", + "hop" + ], + [ + "▁sh", + "op" + ], + [ + "▁", + "shop" + ], + [ + "ia", + "na" + ], + [ + "ian", + "a" + ], + [ + "i", + "ana" + ], + [ + "ex", + "p" + ], + [ + "e", + "xp" + ], + [ + "Hel", + "per" + ], + [ + "Help", + "er" + ], + [ + "Off", + "set" + ], + [ + "ac", + "hes" + ], + [ + "ach", + "es" + ], + [ + "ache", + "s" + ], + [ + "a", + "ches" + ], + [ + "▁conne", + "ction" + ], + [ + "▁connect", + "ion" + ], + [ + "▁conn", + "ection" + ], + [ + "▁", + "connection" + ], + [ + "▁d", + "ifference" + ], + [ + "▁dif", + "ference" + ], + [ + "▁differ", + "ence" + ], + [ + "serv", + "ice" + ], + [ + "s", + "ervice" + ], + [ + "▁g", + "as" + ], + [ + "▁ga", + "s" + ], + [ + "▁", + "gas" + ], + [ + "▁p", + "riv" + ], + [ + "▁pr", + "iv" + ], + [ + "▁pri", + "v" + ], + [ + "▁", + "priv" + ], + [ + "▁un", + "ivers" + ], + [ + "▁", + "univers" + ], + [ + "▁w", + "ish" + ], + [ + "▁wis", + "h" + ], + [ + "Re", + "m" + ], + [ + "R", + "em" + ], + [ + "U", + "rl" + ], + [ + "ge", + "b" + ], + [ + "g", + "eb" + ], + [ + "S", + "o" + ], + [ + "ens", + "ions" + ], + [ + "ension", + "s" + ], + [ + "Mod", + "ule" + ], + [ + "Mo", + "dule" + ], + [ + "SI", + "ZE" + ], + [ + "▁p", + "rem" + ], + [ + "▁pre", + "m" + ], + [ + "▁pr", + "em" + ], + [ + "wind", + "ow" + ], + [ + "w", + "indow" + ], + [ + "▁d", + "ies" + ], + [ + "▁di", + "es" + ], + [ + "▁die", + "s" + ], + [ + "de", + "l" + ], + [ + "d", + "el" + ], + [ + "▁r", + "ow" + ], + [ + "▁ro", + "w" + ], + [ + "▁", + "row" + ], + [ + "▁a", + "verage" + ], + [ + "▁aver", + "age" + ], + [ + "▁ave", + "rage" + ], + [ + "xi", + "m" + ], + [ + "x", + "im" + ], + [ + "▁p", + "u" + ], + [ + "▁", + "pu" + ], + [ + "an", + "ç" + ], + [ + "De", + "t" + ], + [ + "D", + "et" + ], + [ + "ke", + "r" + ], + [ + "k", + "er" + ], + [ + "y", + "a" + ], + [ + "▁D", + "et" + ], + [ + "▁De", + "t" + ], + [ + "▁", + "Det" + ], + [ + "▁p", + "å" + ], + [ + "▁n", + "amed" + ], + [ + "▁name", + "d" + ], + [ + "▁na", + "med" + ], + [ + "▁nam", + "ed" + ], + [ + "▁", + "named" + ], + [ + "▁dec", + "ision" + ], + [ + "▁decis", + "ion" + ], + [ + "wi", + "n" + ], + [ + "w", + "in" + ], + [ + "▁Ge", + "orge" + ], + [ + "▁Georg", + "e" + ], + [ + "ar", + "ily" + ], + [ + "ari", + "ly" + ], + [ + "▁s", + "olution" + ], + [ + "▁sol", + "ution" + ], + [ + "▁mult", + "iple" + ], + [ + "▁multi", + "ple" + ], + [ + "▁multip", + "le" + ], + [ + "▁", + "multiple" + ], + [ + "at", + "egy" + ], + [ + "ate", + "gy" + ], + [ + "ateg", + "y" + ], + [ + "▁le", + "arning" + ], + [ + "▁learn", + "ing" + ], + [ + "▁lear", + "ning" + ], + [ + "▁", + "learning" + ], + [ + "▁se", + "cret" + ], + [ + "▁sec", + "ret" + ], + [ + "▁secre", + "t" + ], + [ + "▁", + "secret" + ], + [ + "D", + "O" + ], + [ + "▁n", + "ice" + ], + [ + "▁ni", + "ce" + ], + [ + "▁nic", + "e" + ], + [ + "▁", + "nice" + ], + [ + "////////", + "////////" + ], + [ + "S", + "u" + ], + [ + "it", + "ation" + ], + [ + "itat", + "ion" + ], + [ + "▁j", + "oin" + ], + [ + "▁jo", + "in" + ], + [ + "▁", + "join" + ], + [ + "▁el", + "ements" + ], + [ + "▁element", + "s" + ], + [ + "▁ele", + "ments" + ], + [ + "▁elem", + "ents" + ], + [ + "▁", + "elements" + ], + [ + "▁e", + "mer" + ], + [ + "▁em", + "er" + ], + [ + "til", + "de" + ], + [ + "t", + "ilde" + ], + [ + "▁d", + "ep" + ], + [ + "▁de", + "p" + ], + [ + "▁", + "dep" + ], + [ + "▁s", + "hot" + ], + [ + "▁sh", + "ot" + ], + [ + "▁", + "shot" + ], + [ + "▁pl", + "atform" + ], + [ + "▁plat", + "form" + ], + [ + "▁", + 
"platform" + ], + [ + "ot", + "hing" + ], + [ + "oth", + "ing" + ], + [ + "o", + "thing" + ], + [ + "M", + "y" + ], + [ + "ed", + "ia" + ], + [ + "edi", + "a" + ], + [ + "om", + "s" + ], + [ + "o", + "ms" + ], + [ + "ail", + "y" + ], + [ + "ai", + "ly" + ], + [ + "a", + "ily" + ], + [ + "(", + "[" + ], + [ + "▁d", + "ress" + ], + [ + "▁dr", + "ess" + ], + [ + "▁dre", + "ss" + ], + [ + "▁off", + "icial" + ], + [ + "▁offic", + "ial" + ], + [ + "es", + "tern" + ], + [ + "est", + "ern" + ], + [ + "ester", + "n" + ], + [ + "este", + "rn" + ], + [ + "▁dis", + "cover" + ], + [ + "▁disc", + "over" + ], + [ + "▁disco", + "ver" + ], + [ + "▁m", + "i" + ], + [ + "▁", + "mi" + ], + [ + "ны", + "е" + ], + [ + "C", + "A" + ], + [ + "od", + "ing" + ], + [ + "odi", + "ng" + ], + [ + "o", + "ding" + ], + [ + "▁F", + "ound" + ], + [ + "▁Fou", + "nd" + ], + [ + "▁Fo", + "und" + ], + [ + "▁", + "Found" + ], + [ + "▁a", + "ffect" + ], + [ + "▁aff", + "ect" + ], + [ + "▁af", + "fect" + ], + [ + "Vi", + "s" + ], + [ + "V", + "is" + ], + [ + "st", + "ract" + ], + [ + "str", + "act" + ], + [ + "stra", + "ct" + ], + [ + "s", + "tract" + ], + [ + "ic", + "ed" + ], + [ + "ice", + "d" + ], + [ + "i", + "ced" + ], + [ + "de", + "bug" + ], + [ + "d", + "ebug" + ], + [ + "▁rel", + "ated" + ], + [ + "▁relate", + "d" + ], + [ + "▁", + "related" + ], + [ + "▁s", + "pect" + ], + [ + "▁sp", + "ect" + ], + [ + "▁spec", + "t" + ], + [ + "▁spe", + "ct" + ], + [ + "▁", + "spect" + ], + [ + "us", + "hed" + ], + [ + "ush", + "ed" + ], + [ + "сь", + "ко" + ], + [ + "▁b", + "ank" + ], + [ + "▁ban", + "k" + ], + [ + "▁", + "bank" + ], + [ + "▁c", + "ele" + ], + [ + "▁ce", + "le" + ], + [ + "▁cel", + "e" + ], + [ + "AN", + "D" + ], + [ + "A", + "ND" + ], + [ + "ol", + "f" + ], + [ + "е", + "м" + ], + [ + "▁f", + "ill" + ], + [ + "▁fil", + "l" + ], + [ + "▁fi", + "ll" + ], + [ + "▁", + "fill" + ], + [ + "▁g", + "ives" + ], + [ + "▁giv", + "es" + ], + [ + "▁give", + "s" + ], + [ + "▁gi", + "ves" + ], + [ + "▁б", + "у" + ], + [ + "▁", + "бу" + ], + [ + "ar", + "on" + ], + [ + "aro", + "n" + ], + [ + "a", + "ron" + ], + [ + "▁J", + "es" + ], + [ + "▁Je", + "s" + ], + [ + "RE", + "G" + ], + [ + "▁s", + "udd" + ], + [ + "▁su", + "dd" + ], + [ + "▁sud", + "d" + ], + [ + "date", + "d" + ], + [ + "da", + "ted" + ], + [ + "dat", + "ed" + ], + [ + "d", + "ated" + ], + [ + "v", + "i" + ], + [ + "▁g", + "i" + ], + [ + "▁", + "gi" + ], + [ + "se", + "nd" + ], + [ + "sen", + "d" + ], + [ + "s", + "end" + ], + [ + "cp", + "p" + ], + [ + "c", + "pp" + ], + [ + "▁s", + "pent" + ], + [ + "▁sp", + "ent" + ], + [ + "▁spe", + "nt" + ], + [ + "an", + "de" + ], + [ + "and", + "e" + ], + [ + "a", + "nde" + ], + [ + "▁oper", + "ation" + ], + [ + "▁", + "operation" + ], + [ + "pro", + "cess" + ], + [ + "proc", + "ess" + ], + [ + "▁in", + "form" + ], + [ + "▁inf", + "orm" + ], + [ + "▁info", + "rm" + ], + [ + "▁F", + "ree" + ], + [ + "▁Fr", + "ee" + ], + [ + "▁Fre", + "e" + ], + [ + "▁", + "Free" + ], + [ + "yo", + "nd" + ], + [ + "y", + "ond" + ], + [ + "▁per", + "haps" + ], + [ + "▁su", + "rv" + ], + [ + "▁sur", + "v" + ], + [ + "▁L", + "oc" + ], + [ + "▁Lo", + "c" + ], + [ + "▁", + "Loc" + ], + [ + "▁con", + "cl" + ], + [ + "▁conc", + "l" + ], + [ + "▁ра", + "з" + ], + [ + "▁", + "раз" + ], + [ + "▁O", + "ver" + ], + [ + "▁", + "Over" + ], + [ + "ho", + "l" + ], + [ + "h", + "ol" + ], + [ + "ra", + "z" + ], + [ + "r", + "az" + ], + [ + "Wr", + "ite" + ], + [ + "Writ", + "e" + ], + [ + "W", + "rite" + ], + [ + "▁g", + "iving" + ], + [ + "▁giv", + "ing" + ], + [ 
+ "▁gi", + "ving" + ], + [ + "r", + "d" + ], + [ + "in", + "stance" + ], + [ + "inst", + "ance" + ], + [ + "▁re", + "leased" + ], + [ + "▁rele", + "ased" + ], + [ + "▁release", + "d" + ], + [ + "▁R", + "o" + ], + [ + "▁", + "Ro" + ], + [ + "R", + "A" + ], + [ + "▁pract", + "ice" + ], + [ + "▁g", + "raph" + ], + [ + "▁gr", + "aph" + ], + [ + "▁gra", + "ph" + ], + [ + "▁grap", + "h" + ], + [ + "▁", + "graph" + ], + [ + "▁incre", + "ase" + ], + [ + "▁fig", + "ure" + ], + [ + "▁", + "figure" + ], + [ + "Fil", + "ter" + ], + [ + "HE", + "CK" + ], + [ + "id", + "x" + ], + [ + "i", + "dx" + ], + [ + "▁g", + "lass" + ], + [ + "▁gl", + "ass" + ], + [ + "▁", + "glass" + ], + [ + "sk", + "i" + ], + [ + "s", + "ki" + ], + [ + "com", + "es" + ], + [ + "co", + "mes" + ], + [ + "come", + "s" + ], + [ + "c", + "omes" + ], + [ + "▁c", + "at" + ], + [ + "▁ca", + "t" + ], + [ + "▁", + "cat" + ], + [ + "▁c", + "old" + ], + [ + "▁col", + "d" + ], + [ + "▁co", + "ld" + ], + [ + "go", + "to" + ], + [ + "got", + "o" + ], + [ + "g", + "oto" + ], + [ + "uf", + "act" + ], + [ + "u", + "fact" + ], + [ + "▁C", + "opyright" + ], + [ + "▁Copy", + "right" + ], + [ + "▁", + "Copyright" + ], + [ + "}}", + "\\" + ], + [ + "}", + "}\\" + ], + [ + "▁str", + "eng" + ], + [ + "▁stre", + "ng" + ], + [ + "▁d", + "ir" + ], + [ + "▁di", + "r" + ], + [ + "▁", + "dir" + ], + [ + "to", + "ken" + ], + [ + "tok", + "en" + ], + [ + "t", + "oken" + ], + [ + "▁occ", + "ur" + ], + [ + "▁oc", + "cur" + ], + [ + "arl", + "ier" + ], + [ + "▁me", + "asure" + ], + [ + "▁meas", + "ure" + ], + [ + "▁", + "measure" + ], + [ + "▁s", + "ec" + ], + [ + "▁se", + "c" + ], + [ + "▁", + "sec" + ], + [ + "▁m", + "ás" + ], + [ + "▁má", + "s" + ], + [ + "▁N", + "et" + ], + [ + "▁Ne", + "t" + ], + [ + "▁", + "Net" + ], + [ + "▁arg", + "ument" + ], + [ + "▁", + "argument" + ], + [ + "▁s", + "ou" + ], + [ + "▁so", + "u" + ], + [ + "▁m", + "oving" + ], + [ + "▁mov", + "ing" + ], + [ + "▁mo", + "ving" + ], + [ + "▁p", + "refer" + ], + [ + "▁pre", + "fer" + ], + [ + "▁pref", + "er" + ], + [ + "ma", + "sk" + ], + [ + "mas", + "k" + ], + [ + "m", + "ask" + ], + [ + "<", + "<" + ], + [ + "▁bre", + "ath" + ], + [ + "▁breat", + "h" + ], + [ + "▁phys", + "ical" + ], + [ + "▁pos", + "itive" + ], + [ + "▁posit", + "ive" + ], + [ + "▁s", + "or" + ], + [ + "▁so", + "r" + ], + [ + "▁", + "sor" + ], + [ + "▁de", + "part" + ], + [ + "▁dep", + "art" + ], + [ + "▁re", + "move" + ], + [ + "▁rem", + "ove" + ], + [ + "▁", + "remove" + ], + [ + "▁k", + "it" + ], + [ + "▁ki", + "t" + ], + [ + "▁", + "kit" + ], + [ + "▁me", + "eting" + ], + [ + "▁meet", + "ing" + ], + [ + "▁D", + "ata" + ], + [ + "▁Da", + "ta" + ], + [ + "▁Dat", + "a" + ], + [ + "▁", + "Data" + ], + [ + "og", + "raf" + ], + [ + "act", + "ions" + ], + [ + "action", + "s" + ], + [ + "a", + "ctions" + ], + [ + "▁param", + "eters" + ], + [ + "▁parameter", + "s" + ], + [ + "▁", + "parameters" + ], + [ + "▁A", + "tt" + ], + [ + "▁At", + "t" + ], + [ + "▁", + "Att" + ], + [ + "es", + "ch" + ], + [ + "esc", + "h" + ], + [ + "e", + "sch" + ], + [ + "▁inv", + "olved" + ], + [ + "▁invol", + "ved" + ], + [ + "▁involve", + "d" + ], + [ + "ä", + "t" + ], + [ + "L", + "L" + ], + [ + "B", + "ar" + ], + [ + "▁с", + "и" + ], + [ + "▁", + "си" + ], + [ + "ec", + "h" + ], + [ + "e", + "ch" + ], + [ + "GE", + "T" + ], + [ + "G", + "ET" + ], + [ + "▁pre", + "vent" + ], + [ + "▁pr", + "event" + ], + [ + "▁prev", + "ent" + ], + [ + "▁", + "prevent" + ], + [ + "▁be", + "yond" + ], + [ + "▁O", + "ther" + ], + [ + "▁Ot", + "her" + ], + [ + "▁", 
+ "Other" + ], + [ + "ä", + "n" + ], + [ + "by", + "te" + ], + [ + "▁sudd", + "en" + ], + [ + "▁sud", + "den" + ], + [ + "ol", + "ve" + ], + [ + "olv", + "e" + ], + [ + "▁н", + "о" + ], + [ + "▁", + "но" + ], + [ + "LO", + "G" + ], + [ + "L", + "OG" + ], + [ + "un", + "it" + ], + [ + "uni", + "t" + ], + [ + "u", + "nit" + ], + [ + "▁tr", + "uth" + ], + [ + "ra", + "t" + ], + [ + "r", + "at" + ], + [ + "S", + "D" + ], + [ + "▁e", + "at" + ], + [ + "▁M", + "ad" + ], + [ + "▁Ma", + "d" + ], + [ + "▁", + "Mad" + ], + [ + "▁prov", + "ides" + ], + [ + "▁provide", + "s" + ], + [ + "▁s", + "ession" + ], + [ + "▁", + "session" + ], + [ + "De", + "le" + ], + [ + "Del", + "e" + ], + [ + "D", + "ele" + ], + [ + "▁con", + "vers" + ], + [ + "▁conv", + "ers" + ], + [ + "▁conver", + "s" + ], + [ + "▁conve", + "rs" + ], + [ + "cent", + "er" + ], + [ + "cen", + "ter" + ], + [ + "c", + "enter" + ], + [ + "▁contin", + "ued" + ], + [ + "▁continue", + "d" + ], + [ + "▁continu", + "ed" + ], + [ + "ot", + "ion" + ], + [ + "oti", + "on" + ], + [ + "ca", + "che" + ], + [ + "c", + "ache" + ], + [ + "dis", + "play" + ], + [ + "disp", + "lay" + ], + [ + "▁prote", + "ct" + ], + [ + "▁prot", + "ect" + ], + [ + "am", + "s" + ], + [ + "a", + "ms" + ], + [ + "▁p", + "ow" + ], + [ + "▁po", + "w" + ], + [ + "▁", + "pow" + ], + [ + "CT", + "ION" + ], + [ + "C", + "TION" + ], + [ + "▁M", + "ac" + ], + [ + "▁Ma", + "c" + ], + [ + "▁", + "Mac" + ], + [ + "m", + "o" + ], + [ + "х", + "а" + ], + [ + "▁d", + "istance" + ], + [ + "▁di", + "stance" + ], + [ + "▁dist", + "ance" + ], + [ + "▁", + "distance" + ], + [ + "▁T", + "ime" + ], + [ + "▁Tim", + "e" + ], + [ + "▁Ti", + "me" + ], + [ + "▁", + "Time" + ], + [ + "g", + "i" + ], + [ + "▁s", + "equ" + ], + [ + "▁se", + "qu" + ], + [ + "▁seq", + "u" + ], + [ + "▁", + "sequ" + ], + [ + "T", + "arget" + ], + [ + "с", + "ле" + ], + [ + "Ser", + "ver" + ], + [ + "Serv", + "er" + ], + [ + "▁w", + "ide" + ], + [ + "▁wid", + "e" + ], + [ + "▁", + "wide" + ], + [ + "cl", + "ose" + ], + [ + "clos", + "e" + ], + [ + "▁c", + "ru" + ], + [ + "▁cr", + "u" + ], + [ + "Ex", + "t" + ], + [ + "E", + "xt" + ], + [ + "▁s", + "elect" + ], + [ + "▁se", + "lect" + ], + [ + "▁sel", + "ect" + ], + [ + "▁sele", + "ct" + ], + [ + "▁", + "select" + ], + [ + "▁pat", + "tern" + ], + [ + "▁", + "pattern" + ], + [ + "\")", + ");" + ], + [ + "\"))", + ";" + ], + [ + "\"", + "));" + ], + [ + "Pro", + "vider" + ], + [ + "Prov", + "ider" + ], + [ + "UR", + "L" + ], + [ + "U", + "RL" + ], + [ + "▁g", + "reen" + ], + [ + "▁gr", + "een" + ], + [ + "▁gre", + "en" + ], + [ + "▁", + "green" + ], + [ + "▁wait", + "ing" + ], + [ + "▁wa", + "iting" + ], + [ + "pro", + "to" + ], + [ + "pr", + "oto" + ], + [ + "prot", + "o" + ], + [ + "▁immedi", + "ately" + ], + [ + "▁immediate", + "ly" + ], + [ + "com", + "mon" + ], + [ + "comm", + "on" + ], + [ + "az", + "ione" + ], + [ + "azi", + "one" + ], + [ + "a", + "zione" + ], + [ + "ri", + "ver" + ], + [ + "riv", + "er" + ], + [ + "rive", + "r" + ], + [ + "r", + "iver" + ], + [ + "▁s", + "en" + ], + [ + "▁se", + "n" + ], + [ + "▁", + "sen" + ], + [ + "▁!", + "==" + ], + [ + "▁!=", + "=" + ], + [ + "▁Febru", + "ary" + ], + [ + "▁Februar", + "y" + ], + [ + "ur", + "b" + ], + [ + "u", + "rb" + ], + [ + "▁S", + "en" + ], + [ + "▁Se", + "n" + ], + [ + "de", + "st" + ], + [ + "des", + "t" + ], + [ + "d", + "est" + ], + [ + "<", + "?" 
+ ], + [ + "▁ed", + "ge" + ], + [ + "▁", + "edge" + ], + [ + "▁m", + "ais" + ], + [ + "▁ma", + "is" + ], + [ + "▁mai", + "s" + ], + [ + "gor", + "ith" + ], + [ + "cp", + "u" + ], + [ + "c", + "pu" + ], + [ + "▁educ", + "ation" + ], + [ + "▁associ", + "ated" + ], + [ + "▁associate", + "d" + ], + [ + "No", + "ne" + ], + [ + "Non", + "e" + ], + [ + "N", + "one" + ], + [ + "h", + "i" + ], + [ + "▁p", + "oor" + ], + [ + "▁po", + "or" + ], + [ + "se", + "m" + ], + [ + "s", + "em" + ], + [ + "▁W", + "il" + ], + [ + "▁Wi", + "l" + ], + [ + "▁b", + "ud" + ], + [ + "▁bu", + "d" + ], + [ + "▁", + "bud" + ], + [ + "▁a", + "uch" + ], + [ + "▁au", + "ch" + ], + [ + "▁", + "auch" + ], + [ + "el", + "ler" + ], + [ + "ell", + "er" + ], + [ + "elle", + "r" + ], + [ + "▁L", + "ife" + ], + [ + "▁Li", + "fe" + ], + [ + "▁", + "Life" + ], + [ + "▁f", + "iles" + ], + [ + "▁fil", + "es" + ], + [ + "▁file", + "s" + ], + [ + "▁fi", + "les" + ], + [ + "▁", + "files" + ], + [ + "▁le", + "ading" + ], + [ + "▁lead", + "ing" + ], + [ + "▁", + "leading" + ], + [ + "▁ob", + "tain" + ], + [ + "▁obt", + "ain" + ], + [ + "▁J", + "ul" + ], + [ + "▁Ju", + "l" + ], + [ + "at", + "ory" + ], + [ + "ator", + "y" + ], + [ + "ato", + "ry" + ], + [ + "г", + "у" + ], + [ + "it", + "able" + ], + [ + "ita", + "ble" + ], + [ + "i", + "table" + ], + [ + "▁on", + "to" + ], + [ + "▁ont", + "o" + ], + [ + "▁", + "onto" + ], + [ + "▁b", + "orn" + ], + [ + "▁bo", + "rn" + ], + [ + "▁bor", + "n" + ], + [ + "▁", + "born" + ], + [ + "or", + "em" + ], + [ + "ore", + "m" + ], + [ + "o", + "rem" + ], + [ + "▁Stre", + "et" + ], + [ + "▁m", + "aint" + ], + [ + "▁main", + "t" + ], + [ + "▁ma", + "int" + ], + [ + "▁mai", + "nt" + ], + [ + "Param", + "s" + ], + [ + "Par", + "ams" + ], + [ + "ri", + "p" + ], + [ + "r", + "ip" + ], + [ + "▁S", + "T" + ], + [ + "▁", + "ST" + ], + [ + "u", + "v" + ], + [ + "ma", + "in" + ], + [ + "m", + "ain" + ], + [ + "▁re", + "cent" + ], + [ + "▁rec", + "ent" + ], + [ + "▁rece", + "nt" + ], + [ + "We", + "b" + ], + [ + "W", + "eb" + ], + [ + "ov", + "a" + ], + [ + "o", + "va" + ], + [ + "ц", + "а" + ], + [ + "ais", + "e" + ], + [ + "ai", + "se" + ], + [ + "a", + "ise" + ], + [ + "yle", + "s" + ], + [ + "yl", + "es" + ], + [ + "y", + "les" + ], + [ + "▁de", + "scribed" + ], + [ + "▁desc", + "ribed" + ], + [ + "▁describ", + "ed" + ], + [ + "▁describe", + "d" + ], + [ + "▁begin", + "ning" + ], + [ + "▁D", + "ay" + ], + [ + "▁Da", + "y" + ], + [ + "▁", + "Day" + ], + [ + "▁V", + "ol" + ], + [ + "▁Vo", + "l" + ], + [ + "▁", + "Vol" + ], + [ + "▁h", + "uge" + ], + [ + "▁hug", + "e" + ], + [ + "Ha", + "s" + ], + [ + "H", + "as" + ], + [ + "an", + "cy" + ], + [ + "anc", + "y" + ], + [ + "He", + "ader" + ], + [ + "Head", + "er" + ], + [ + "▁a", + "ren" + ], + [ + "▁are", + "n" + ], + [ + "▁ar", + "en" + ], + [ + "▁", + "aren" + ], + [ + "ва", + "н" + ], + [ + "в", + "ан" + ], + [ + "▁en", + "sure" + ], + [ + "▁ens", + "ure" + ], + [ + "▁", + "ensure" + ], + [ + "▁p", + "et" + ], + [ + "▁pe", + "t" + ], + [ + "▁", + "pet" + ], + [ + "mu", + "lt" + ], + [ + "mul", + "t" + ], + [ + "m", + "ult" + ], + [ + "▁L", + "ike" + ], + [ + "▁Li", + "ke" + ], + [ + "▁", + "Like" + ], + [ + "▁man", + "agement" + ], + [ + "▁manage", + "ment" + ], + [ + "▁", + "management" + ], + [ + "P", + "S" + ], + [ + "wh", + "ile" + ], + [ + "▁back", + "ground" + ], + [ + "▁", + "background" + ], + [ + "ount", + "er" + ], + [ + "oun", + "ter" + ], + [ + "o", + "unter" + ], + [ + "bo", + "ol" + ], + [ + "b", + "ool" + ], + [ + "F", + "C" + ], + [ + "N", + "um" 
+ ], + [ + "R", + "L" + ], + [ + "▁ex", + "cl" + ], + [ + "▁exc", + "l" + ], + [ + "▁e", + "ye" + ], + [ + "▁ey", + "e" + ], + [ + "im", + "g" + ], + [ + "i", + "mg" + ], + [ + "▁r", + "om" + ], + [ + "▁ro", + "m" + ], + [ + "▁", + "rom" + ], + [ + "▁H", + "el" + ], + [ + "▁He", + "l" + ], + [ + "▁", + "Hel" + ], + [ + "Opt", + "ion" + ], + [ + "O", + "ption" + ], + [ + "▁stop", + "ped" + ], + [ + "▁sto", + "pped" + ], + [ + "▁th", + "read" + ], + [ + "▁thr", + "ead" + ], + [ + "▁", + "thread" + ], + [ + "to", + "type" + ], + [ + "tot", + "ype" + ], + [ + "t", + "otype" + ], + [ + "))", + ")" + ], + [ + ")", + "))" + ], + [ + "▁st", + "age" + ], + [ + "▁stag", + "e" + ], + [ + "▁sta", + "ge" + ], + [ + "▁", + "stage" + ], + [ + "▁ü", + "ber" + ], + [ + "▁", + "über" + ], + [ + "▁al", + "though" + ], + [ + "▁", + "although" + ], + [ + "Type", + "s" + ], + [ + "Ty", + "pes" + ], + [ + "Typ", + "es" + ], + [ + "T", + "ypes" + ], + [ + "▁O", + "h" + ], + [ + "▁", + "Oh" + ], + [ + "▁e", + "ight" + ], + [ + "▁", + "eight" + ], + [ + "▁de", + "scription" + ], + [ + "▁des", + "cription" + ], + [ + "▁", + "description" + ], + [ + "'", + "'" + ], + [ + "ö", + "n" + ], + [ + "▁sur", + "face" + ], + [ + "▁surf", + "ace" + ], + [ + "▁", + "surface" + ], + [ + "▁Intern", + "ational" + ], + [ + "▁ch", + "arg" + ], + [ + "▁char", + "g" + ], + [ + "▁cha", + "rg" + ], + [ + "▁", + "charg" + ], + [ + "▁col", + "lection" + ], + [ + "▁coll", + "ection" + ], + [ + "▁collect", + "ion" + ], + [ + "▁colle", + "ction" + ], + [ + "▁", + "collection" + ], + [ + "▁us", + "ers" + ], + [ + "▁use", + "rs" + ], + [ + "▁user", + "s" + ], + [ + "▁", + "users" + ], + [ + "▁ob", + "vious" + ], + [ + "▁cent", + "ury" + ], + [ + "▁", + "century" + ], + [ + "ic", + "ks" + ], + [ + "ick", + "s" + ], + [ + "i", + "cks" + ], + [ + "▁art", + "icle" + ], + [ + "▁artic", + "le" + ], + [ + "▁", + "article" + ], + [ + "▁\"", + "\\" + ], + [ + "▁", + "\"\\" + ], + [ + "di", + "m" + ], + [ + "d", + "im" + ], + [ + "▁s", + "in" + ], + [ + "▁si", + "n" + ], + [ + "▁", + "sin" + ], + [ + "en", + "ge" + ], + [ + "eng", + "e" + ], + [ + "Cont", + "rol" + ], + [ + "▁com", + "mit" + ], + [ + "▁comm", + "it" + ], + [ + "▁", + "commit" + ], + [ + "ens", + "ity" + ], + [ + "▁t", + "ra" + ], + [ + "▁tr", + "a" + ], + [ + "▁", + "tra" + ], + [ + "cript", + "or" + ], + [ + "▁N", + "OT" + ], + [ + "▁NO", + "T" + ], + [ + "▁", + "NOT" + ], + [ + "we", + "ll" + ], + [ + "w", + "ell" + ], + [ + "▁M", + "ichael" + ], + [ + "▁Mich", + "ael" + ], + [ + "▁n", + "od" + ], + [ + "▁no", + "d" + ], + [ + "▁", + "nod" + ], + [ + "▁m", + "ort" + ], + [ + "▁mor", + "t" + ], + [ + "▁mo", + "rt" + ], + [ + "iv", + "o" + ], + [ + "i", + "vo" + ], + [ + "is", + "ation" + ], + [ + "▁P", + "o" + ], + [ + "▁", + "Po" + ], + [ + "▁P", + "aris" + ], + [ + "▁Par", + "is" + ], + [ + "▁Pa", + "ris" + ], + [ + "▁ad", + "ministr" + ], + [ + "▁admin", + "istr" + ], + [ + "▁", + "administr" + ], + [ + "bu", + "rg" + ], + [ + "bur", + "g" + ], + [ + "b", + "urg" + ], + [ + "cd", + "ot" + ], + [ + "c", + "dot" + ], + [ + "▁mil", + "itary" + ], + [ + "▁milit", + "ary" + ], + [ + "▁militar", + "y" + ], + [ + "▁B", + "est" + ], + [ + "▁Be", + "st" + ], + [ + "▁Bes", + "t" + ], + [ + "▁", + "Best" + ], + [ + "▁К", + "а" + ], + [ + "▁", + "Ка" + ], + [ + "IN", + "E" + ], + [ + "I", + "NE" + ], + [ + "▁through", + "out" + ], + [ + "S", + "l" + ], + [ + "▁im", + "pl" + ], + [ + "▁imp", + "l" + ], + [ + "▁", + "impl" + ], + [ + "cont", + "rol" + ], + [ + "contr", + "ol" + ], + [ + "▁", + 
"Ч" + ], + [ + "▁u", + "it" + ], + [ + "▁ui", + "t" + ], + [ + "▁", + "uit" + ], + [ + "▁un", + "signed" + ], + [ + "▁uns", + "igned" + ], + [ + "▁", + "unsigned" + ], + [ + "▁M", + "ary" + ], + [ + "▁Mar", + "y" + ], + [ + "▁Ma", + "ry" + ], + [ + "Ch", + "ar" + ], + [ + "C", + "har" + ], + [ + "м", + "і" + ], + [ + "▁th", + "reat" + ], + [ + "▁c", + "ourt" + ], + [ + "▁co", + "urt" + ], + [ + "▁cour", + "t" + ], + [ + "▁cou", + "rt" + ], + [ + "▁", + "court" + ], + [ + "vi", + "lle" + ], + [ + "vil", + "le" + ], + [ + "v", + "ille" + ], + [ + "▁", + "ш" + ], + [ + "▁C", + "am" + ], + [ + "▁Ca", + "m" + ], + [ + "▁", + "Cam" + ], + [ + ".", + "\r" + ], + [ + "▁current", + "ly" + ], + [ + "▁curr", + "ently" + ], + [ + "ro", + "t" + ], + [ + "r", + "ot" + ], + [ + "▁D", + "ate" + ], + [ + "▁Da", + "te" + ], + [ + "▁Dat", + "e" + ], + [ + "▁", + "Date" + ], + [ + "▁s", + "hit" + ], + [ + "▁sh", + "it" + ], + [ + "▁", + "shit" + ], + [ + "▁$", + "{\\" + ], + [ + "▁${", + "\\" + ], + [ + "un", + "n" + ], + [ + "u", + "nn" + ], + [ + "U", + "s" + ], + [ + "▁b", + "uffer" + ], + [ + "▁buff", + "er" + ], + [ + "▁buf", + "fer" + ], + [ + "▁", + "buffer" + ], + [ + "▁s", + "ont" + ], + [ + "▁so", + "nt" + ], + [ + "▁son", + "t" + ], + [ + "▁let", + "ter" + ], + [ + "▁lett", + "er" + ], + [ + "▁", + "letter" + ], + [ + "in", + "ated" + ], + [ + "ina", + "ted" + ], + [ + "inate", + "d" + ], + [ + "Ch", + "ange" + ], + [ + "▁h", + "ref" + ], + [ + "▁hr", + "ef" + ], + [ + "▁", + "href" + ], + [ + "▁l", + "ack" + ], + [ + "▁la", + "ck" + ], + [ + "▁lac", + "k" + ], + [ + "▁o", + "il" + ], + [ + "▁C", + "ons" + ], + [ + "▁Con", + "s" + ], + [ + "▁Co", + "ns" + ], + [ + "▁", + "Cons" + ], + [ + "▁J", + "er" + ], + [ + "▁Je", + "r" + ], + [ + "BU", + "G" + ], + [ + "B", + "UG" + ], + [ + "if", + "orn" + ], + [ + "▁pro", + "perties" + ], + [ + "▁proper", + "ties" + ], + [ + "▁", + "properties" + ], + [ + "▁r", + "andom" + ], + [ + "▁ran", + "dom" + ], + [ + "▁rand", + "om" + ], + [ + "▁", + "random" + ], + [ + "▁br", + "other" + ], + [ + "▁bro", + "ther" + ], + [ + "▁p", + "iece" + ], + [ + "▁pie", + "ce" + ], + [ + "▁", + "piece" + ], + [ + "б", + "у" + ], + [ + "ist", + "ics" + ], + [ + "istic", + "s" + ], + [ + "isti", + "cs" + ], + [ + "▁techn", + "ology" + ], + [ + "gl", + "obal" + ], + [ + "glob", + "al" + ], + [ + "▁trans", + "form" + ], + [ + "▁", + "transform" + ], + [ + "er", + "d" + ], + [ + "e", + "rd" + ], + [ + "▁B", + "ecause" + ], + [ + "▁", + "Because" + ], + [ + "PE", + "CT" + ], + [ + "P", + "ECT" + ], + [ + "pr", + "et" + ], + [ + "pre", + "t" + ], + [ + "p", + "ret" + ], + [ + "▁го", + "ду" + ], + [ + "▁год", + "у" + ], + [ + "▁M", + "et" + ], + [ + "▁Me", + "t" + ], + [ + "▁", + "Met" + ], + [ + "▁p", + "sy" + ], + [ + "▁ps", + "y" + ], + [ + "▁", + "psy" + ], + [ + "▁о", + "д" + ], + [ + "▁g", + "od" + ], + [ + "▁go", + "d" + ], + [ + "▁", + "god" + ], + [ + "▁D", + "el" + ], + [ + "▁De", + "l" + ], + [ + "▁", + "Del" + ], + [ + "base", + "d" + ], + [ + "ba", + "sed" + ], + [ + "bas", + "ed" + ], + [ + "b", + "ased" + ], + [ + "▁v", + "oor" + ], + [ + "▁vo", + "or" + ], + [ + "▁C", + "all" + ], + [ + "▁Cal", + "l" + ], + [ + "▁Ca", + "ll" + ], + [ + "▁", + "Call" + ], + [ + "S", + "A" + ], + [ + "▁fil", + "ter" + ], + [ + "▁", + "filter" + ], + [ + "▁incl", + "udes" + ], + [ + "▁includ", + "es" + ], + [ + "▁include", + "s" + ], + [ + "▁inclu", + "des" + ], + [ + "▁", + "includes" + ], + [ + "olut", + "ions" + ], + [ + "olution", + "s" + ], + [ + "f", + "d" + ], + [ + "▁w", + "ind" + 
], + [ + "▁win", + "d" + ], + [ + "▁", + "wind" + ], + [ + "▁б", + "о" + ], + [ + "▁", + "бо" + ], + [ + "▁ab", + "ility" + ], + [ + "▁", + "ability" + ], + [ + "ca", + "rd" + ], + [ + "car", + "d" + ], + [ + "c", + "ard" + ], + [ + "▁n", + "umer" + ], + [ + "▁num", + "er" + ], + [ + "▁nu", + "mer" + ], + [ + "▁", + "numer" + ], + [ + "add", + "ress" + ], + [ + "addr", + "ess" + ], + [ + "▁go", + "al" + ], + [ + "ash", + "ington" + ], + [ + "ashing", + "ton" + ], + [ + "▁s", + "light" + ], + [ + "▁sl", + "ight" + ], + [ + "ab", + "a" + ], + [ + "a", + "ba" + ], + [ + "▁L", + "og" + ], + [ + "▁Lo", + "g" + ], + [ + "▁", + "Log" + ], + [ + "Set", + "tings" + ], + [ + "Setting", + "s" + ], + [ + "ad", + "ow" + ], + [ + "ado", + "w" + ], + [ + "▁p", + "i" + ], + [ + "▁", + "pi" + ], + [ + "ir", + "ing" + ], + [ + "iri", + "ng" + ], + [ + "i", + "ring" + ], + [ + "F", + "T" + ], + [ + "▁number", + "s" + ], + [ + "▁num", + "bers" + ], + [ + "con", + "f" + ], + [ + "co", + "nf" + ], + [ + "ta", + "sk" + ], + [ + "t", + "ask" + ], + [ + "▁î", + "n" + ], + [ + "т", + "ы" + ], + [ + "▁re", + "ceive" + ], + [ + "▁rece", + "ive" + ], + [ + "▁r", + "oot" + ], + [ + "▁ro", + "ot" + ], + [ + "▁", + "root" + ], + [ + "▁Ind", + "ia" + ], + [ + "pat", + "ch" + ], + [ + "p", + "atch" + ], + [ + "é", + "l" + ], + [ + "▁sum", + "mer" + ], + [ + "▁method", + "s" + ], + [ + "▁", + "methods" + ], + [ + "▁pl", + "aces" + ], + [ + "▁place", + "s" + ], + [ + "▁plac", + "es" + ], + [ + "▁М", + "а" + ], + [ + "▁", + "Ма" + ], + [ + "▁cap", + "ital" + ], + [ + "▁capit", + "al" + ], + [ + "▁ev", + "idence" + ], + [ + "▁G", + "erman" + ], + [ + "▁Germ", + "an" + ], + [ + "▁Ger", + "man" + ], + [ + "\\", + "," + ], + [ + "D", + "A" + ], + [ + "ec", + "ute" + ], + [ + "ecut", + "e" + ], + [ + "col", + "umn" + ], + [ + "▁fun", + "ctions" + ], + [ + "▁function", + "s" + ], + [ + "▁", + "functions" + ], + [ + "▁c", + "ounter" + ], + [ + "▁co", + "unter" + ], + [ + "▁coun", + "ter" + ], + [ + "▁count", + "er" + ], + [ + "▁", + "counter" + ], + [ + "▁ar", + "ms" + ], + [ + "▁arm", + "s" + ], + [ + "▁", + "arms" + ], + [ + "▁f", + "eed" + ], + [ + "▁fe", + "ed" + ], + [ + "▁fee", + "d" + ], + [ + "▁", + "feed" + ], + [ + "ve", + "y" + ], + [ + "v", + "ey" + ], + [ + "he", + "nt" + ], + [ + "hen", + "t" + ], + [ + "h", + "ent" + ], + [ + "MA", + "X" + ], + [ + "M", + "AX" + ], + [ + "▁ac", + "qu" + ], + [ + "▁app", + "ly" + ], + [ + "▁ap", + "ply" + ], + [ + "▁appl", + "y" + ], + [ + "▁", + "apply" + ], + [ + "▁hus", + "band" + ], + [ + "▁k", + "illed" + ], + [ + "▁kill", + "ed" + ], + [ + "▁kil", + "led" + ], + [ + "▁S", + "pec" + ], + [ + "▁Sp", + "ec" + ], + [ + "▁Spe", + "c" + ], + [ + "▁", + "Spec" + ], + [ + "ent", + "ity" + ], + [ + "enti", + "ty" + ], + [ + "▁e", + "arlier" + ], + [ + "▁M", + "iss" + ], + [ + "▁Mi", + "ss" + ], + [ + "▁Mis", + "s" + ], + [ + "▁", + "Miss" + ], + [ + "▁set", + "ting" + ], + [ + "▁sett", + "ing" + ], + [ + "▁", + "setting" + ], + [ + "it", + "ect" + ], + [ + "ite", + "ct" + ], + [ + "▁d", + "ed" + ], + [ + "▁de", + "d" + ], + [ + "▁", + "ded" + ], + [ + "Ro", + "w" + ], + [ + "R", + "ow" + ], + [ + "▁r", + "an" + ], + [ + "▁ra", + "n" + ], + [ + "▁", + "ran" + ], + [ + "▁Y", + "es" + ], + [ + "▁Ye", + "s" + ], + [ + "▁", + "Yes" + ], + [ + "▁fin", + "ancial" + ], + [ + "▁financ", + "ial" + ], + [ + "s", + "ession" + ], + [ + "le", + "ar" + ], + [ + "l", + "ear" + ], + [ + "is", + "hing" + ], + [ + "ish", + "ing" + ], + [ + "ishi", + "ng" + ], + [ + "▁ne", + "arly" + ], + [ + "▁near", + "ly" 
+ ], + [ + "▁d", + "ur" + ], + [ + "▁du", + "r" + ], + [ + "▁m", + "achine" + ], + [ + "▁mach", + "ine" + ], + [ + "▁", + "machine" + ], + [ + "xf", + "f" + ], + [ + "x", + "ff" + ], + [ + "br", + "o" + ], + [ + "b", + "ro" + ], + [ + "▁s", + "ymbol" + ], + [ + "▁sym", + "bol" + ], + [ + "▁", + "symbol" + ], + [ + "land", + "s" + ], + [ + "lan", + "ds" + ], + [ + "l", + "ands" + ], + [ + "Ac", + "c" + ], + [ + "A", + "cc" + ], + [ + "d", + "i" + ], + [ + "▁Rober", + "t" + ], + [ + "▁Ro", + "bert" + ], + [ + "▁Rob", + "ert" + ], + [ + "pro", + "p" + ], + [ + "pr", + "op" + ], + [ + "p", + "rop" + ], + [ + "ur", + "ity" + ], + [ + "uri", + "ty" + ], + [ + "▁#", + "####" + ], + [ + "▁##", + "###" + ], + [ + "▁###", + "##" + ], + [ + "▁####", + "#" + ], + [ + "▁walk", + "ed" + ], + [ + "▁wal", + "ked" + ], + [ + "▁intern", + "ational" + ], + [ + "▁internation", + "al" + ], + [ + "▁", + "Е" + ], + [ + "Y", + "es" + ], + [ + "▁re", + "lease" + ], + [ + "▁rele", + "ase" + ], + [ + "▁", + "release" + ], + [ + "▁start", + "ing" + ], + [ + "▁star", + "ting" + ], + [ + "st", + "atic" + ], + [ + "stat", + "ic" + ], + [ + "▁b", + "ei" + ], + [ + "▁be", + "i" + ], + [ + "al", + "low" + ], + [ + "all", + "ow" + ], + [ + "allo", + "w" + ], + [ + "▁Pe", + "ople" + ], + [ + "▁", + "People" + ], + [ + "e", + "z" + ], + [ + "▁param", + "eter" + ], + [ + "▁", + "parameter" + ], + [ + "C", + "ache" + ], + [ + "▁$", + "$" + ], + [ + "▁", + "$$" + ], + [ + "amp", + "ions" + ], + [ + "ampion", + "s" + ], + [ + "▁M", + "er" + ], + [ + "▁Me", + "r" + ], + [ + "▁", + "Mer" + ], + [ + "▁k", + "om" + ], + [ + "▁ko", + "m" + ], + [ + "▁", + "kom" + ], + [ + "le", + "ted" + ], + [ + "let", + "ed" + ], + [ + "lete", + "d" + ], + [ + "l", + "eted" + ], + [ + "oi", + "s" + ], + [ + "o", + "is" + ], + [ + "▁O", + "pen" + ], + [ + "▁Op", + "en" + ], + [ + "▁", + "Open" + ], + [ + "ty", + "pes" + ], + [ + "type", + "s" + ], + [ + "typ", + "es" + ], + [ + "t", + "ypes" + ], + [ + "▁f", + "ue" + ], + [ + "▁fu", + "e" + ], + [ + "ac", + "ters" + ], + [ + "act", + "ers" + ], + [ + "acter", + "s" + ], + [ + "▁re", + "ference" + ], + [ + "▁refer", + "ence" + ], + [ + "▁", + "reference" + ], + [ + "Equ", + "als" + ], + [ + "Equal", + "s" + ], + [ + "Eq", + "uals" + ], + [ + "▁a", + "ware" + ], + [ + "▁aw", + "are" + ], + [ + "▁", + "aware" + ], + [ + "▁h", + "ol" + ], + [ + "▁ho", + "l" + ], + [ + "▁", + "hol" + ], + [ + "▁de", + "mand" + ], + [ + "▁dem", + "and" + ], + [ + "lo", + "r" + ], + [ + "l", + "or" + ], + [ + "▁v", + "eh" + ], + [ + "▁ve", + "h" + ], + [ + "▁", + "veh" + ], + [ + "▁not", + "ice" + ], + [ + "▁", + "notice" + ], + [ + "▁com", + "ponent" + ], + [ + "▁compon", + "ent" + ], + [ + "▁", + "component" + ], + [ + "f", + "n" + ], + [ + "▁anal", + "ysis" + ], + [ + "▁analy", + "sis" + ], + [ + "▁analys", + "is" + ], + [ + "▁", + "analysis" + ], + [ + "mat", + "ch" + ], + [ + "m", + "atch" + ], + [ + "▁effect", + "ive" + ], + [ + "▁", + "effective" + ], + [ + "pro", + "duct" + ], + [ + "produ", + "ct" + ], + [ + "prod", + "uct" + ], + [ + "ни", + "к" + ], + [ + "▁le", + "gal" + ], + [ + "▁leg", + "al" + ], + [ + "▁", + "legal" + ], + [ + "е", + "й" + ], + [ + "se", + "mb" + ], + [ + "sem", + "b" + ], + [ + "s", + "emb" + ], + [ + "▁loc", + "ated" + ], + [ + "▁locate", + "d" + ], + [ + "▁с", + "у" + ], + [ + "▁", + "су" + ], + [ + "Q", + "L" + ], + [ + "in", + "ct" + ], + [ + "inc", + "t" + ], + [ + "et", + "o" + ], + [ + "e", + "to" + ], + [ + "Dr", + "aw" + ], + [ + "D", + "raw" + ], + [ + "▁sc", + "ale" + ], + [ + 
"▁scal", + "e" + ], + [ + "▁", + "scale" + ], + [ + "ро", + "в" + ], + [ + "р", + "ов" + ], + [ + "▁w", + "ants" + ], + [ + "▁want", + "s" + ], + [ + "H", + "ow" + ], + [ + "▁w", + "el" + ], + [ + "▁we", + "l" + ], + [ + "is", + "ions" + ], + [ + "ision", + "s" + ], + [ + "isi", + "ons" + ], + [ + "▁de", + "liver" + ], + [ + "▁del", + "iver" + ], + [ + "un", + "der" + ], + [ + "und", + "er" + ], + [ + "unde", + "r" + ], + [ + "u", + "nder" + ], + [ + "▁d", + "eb" + ], + [ + "▁de", + "b" + ], + [ + "▁j", + "u" + ], + [ + "▁", + "ju" + ], + [ + "val", + "ues" + ], + [ + "value", + "s" + ], + [ + "▁s", + "ister" + ], + [ + "▁si", + "ster" + ], + [ + "▁sist", + "er" + ], + [ + "ко", + "в" + ], + [ + "к", + "ов" + ], + [ + "▁C", + "reate" + ], + [ + "▁Creat", + "e" + ], + [ + "▁Cre", + "ate" + ], + [ + "▁", + "Create" + ], + [ + "▁I", + "nc" + ], + [ + "▁In", + "c" + ], + [ + "▁a", + "ux" + ], + [ + "▁au", + "x" + ], + [ + "▁", + "aux" + ], + [ + "▁Wh", + "ite" + ], + [ + "▁Whit", + "e" + ], + [ + "▁", + "White" + ], + [ + "Me", + "nu" + ], + [ + "Men", + "u" + ], + [ + "M", + "enu" + ], + [ + "au", + "d" + ], + [ + "a", + "ud" + ], + [ + "re", + "source" + ], + [ + "res", + "ource" + ], + [ + "▁c", + "ab" + ], + [ + "▁ca", + "b" + ], + [ + "▁l", + "if" + ], + [ + "▁li", + "f" + ], + [ + "▁", + "lif" + ], + [ + "▁c", + "ulture" + ], + [ + "▁cult", + "ure" + ], + [ + "ic", + "he" + ], + [ + "ich", + "e" + ], + [ + "i", + "che" + ], + [ + "▁wh", + "atever" + ], + [ + "▁what", + "ever" + ], + [ + "▁de", + "signed" + ], + [ + "▁des", + "igned" + ], + [ + "▁design", + "ed" + ], + [ + "▁re", + "pe" + ], + [ + "▁rep", + "e" + ], + [ + "▁M", + "ont" + ], + [ + "▁Mon", + "t" + ], + [ + "▁Mo", + "nt" + ], + [ + "▁", + "Mont" + ], + [ + "▁ch", + "arge" + ], + [ + "▁char", + "ge" + ], + [ + "▁charg", + "e" + ], + [ + "▁", + "charge" + ], + [ + "Name", + "s" + ], + [ + "Na", + "mes" + ], + [ + "N", + "ames" + ], + [ + "▁in", + "sp" + ], + [ + "▁ins", + "p" + ], + [ + "▁custom", + "ers" + ], + [ + "▁customer", + "s" + ], + [ + "os", + "a" + ], + [ + "o", + "sa" + ], + [ + "▁d", + "aughter" + ], + [ + "▁E", + "ast" + ], + [ + "E", + "Q" + ], + [ + "▁o", + "pin" + ], + [ + "▁op", + "in" + ], + [ + "▁F", + "re" + ], + [ + "▁Fr", + "e" + ], + [ + "▁se", + "ek" + ], + [ + "▁see", + "k" + ], + [ + "▁", + "seek" + ], + [ + "▁p", + "ush" + ], + [ + "▁pu", + "sh" + ], + [ + "▁", + "push" + ], + [ + "▁n", + "av" + ], + [ + "▁na", + "v" + ], + [ + "▁", + "nav" + ], + [ + "▁b", + "urn" + ], + [ + "▁bu", + "rn" + ], + [ + "▁bur", + "n" + ], + [ + "▁", + "burn" + ], + [ + "ar", + "den" + ], + [ + "ard", + "en" + ], + [ + "arde", + "n" + ], + [ + "ha", + "sh" + ], + [ + "has", + "h" + ], + [ + "h", + "ash" + ], + [ + "▁opportun", + "ity" + ], + [ + "▁M", + "at" + ], + [ + "▁Ma", + "t" + ], + [ + "▁", + "Mat" + ], + [ + "oy", + "al" + ], + [ + "oya", + "l" + ], + [ + "o", + "yal" + ], + [ + "▁p", + "un" + ], + [ + "▁pu", + "n" + ], + [ + "sc", + "ale" + ], + [ + "scal", + "e" + ], + [ + "yn", + "amic" + ], + [ + "ynam", + "ic" + ], + [ + "yna", + "mic" + ], + [ + "▁T", + "ype" + ], + [ + "▁Ty", + "pe" + ], + [ + "▁Typ", + "e" + ], + [ + "▁", + "Type" + ], + [ + "il", + "ing" + ], + [ + "ili", + "ng" + ], + [ + "i", + "ling" + ], + [ + "▁qu", + "ery" + ], + [ + "▁que", + "ry" + ], + [ + "▁quer", + "y" + ], + [ + "▁", + "query" + ], + [ + "▁m", + "ist" + ], + [ + "▁mis", + "t" + ], + [ + "▁mi", + "st" + ], + [ + "ro", + "r" + ], + [ + "r", + "or" + ], + [ + "for", + "ce" + ], + [ + "▁On", + "ce" + ], + [ + "▁", + "Once" + ], 
+ [ + "▁med", + "ical" + ], + [ + "▁medic", + "al" + ], + [ + "▁medi", + "cal" + ], + [ + "li", + "e" + ], + [ + "l", + "ie" + ], + [ + "▁stud", + "ent" + ], + [ + "▁", + "student" + ], + [ + "ed", + "eral" + ], + [ + "eder", + "al" + ], + [ + "ede", + "ral" + ], + [ + "▁l", + "ov" + ], + [ + "▁lo", + "v" + ], + [ + "▁", + "lov" + ], + [ + "if", + "orm" + ], + [ + "i", + "form" + ], + [ + "▁al", + "tern" + ], + [ + "▁alt", + "ern" + ], + [ + "▁alter", + "n" + ], + [ + "▁", + "altern" + ], + [ + "bi", + "n" + ], + [ + "b", + "in" + ], + [ + "od", + "er" + ], + [ + "ode", + "r" + ], + [ + "o", + "der" + ], + [ + "▁return", + "s" + ], + [ + "▁", + "returns" + ], + [ + "reg", + "ister" + ], + [ + "ut", + "s" + ], + [ + "u", + "ts" + ], + [ + "C", + "I" + ], + [ + "▁T", + "or" + ], + [ + "▁To", + "r" + ], + [ + "▁", + "Tor" + ], + [ + "C", + "R" + ], + [ + "▁L", + "os" + ], + [ + "▁Lo", + "s" + ], + [ + "▁", + "Los" + ], + [ + "am", + "ily" + ], + [ + "ami", + "ly" + ], + [ + "amil", + "y" + ], + [ + "air", + "e" + ], + [ + "ai", + "re" + ], + [ + "a", + "ire" + ], + [ + "++", + ";" + ], + [ + "Cont", + "roller" + ], + [ + "Control", + "ler" + ], + [ + "wi", + "de" + ], + [ + "wid", + "e" + ], + [ + "w", + "ide" + ], + [ + "x", + "x" + ], + [ + "row", + "ser" + ], + [ + "rows", + "er" + ], + [ + "▁B", + "ook" + ], + [ + "▁Bo", + "ok" + ], + [ + "▁", + "Book" + ], + [ + "Cont", + "ainer" + ], + [ + "pl", + "oad" + ], + [ + "plo", + "ad" + ], + [ + "p", + "load" + ], + [ + "▁E", + "v" + ], + [ + "▁", + "Ev" + ], + [ + "▁t", + "al" + ], + [ + "▁ta", + "l" + ], + [ + "▁", + "tal" + ], + [ + "▁the", + "ory" + ], + [ + "eqn", + "array" + ], + [ + "б", + "е" + ], + [ + "▁rep", + "orted" + ], + [ + "▁report", + "ed" + ], + [ + "▁me", + "aning" + ], + [ + "▁mean", + "ing" + ], + [ + "▁s", + "y" + ], + [ + "▁", + "sy" + ], + [ + "ri", + "be" + ], + [ + "rib", + "e" + ], + [ + "r", + "ibe" + ], + [ + "ic", + "ate" + ], + [ + "ica", + "te" + ], + [ + "ho", + "ld" + ], + [ + "hol", + "d" + ], + [ + "h", + "old" + ], + [ + "▁of", + "fers" + ], + [ + "▁off", + "ers" + ], + [ + "▁offer", + "s" + ], + [ + "▁t", + "empl" + ], + [ + "▁tem", + "pl" + ], + [ + "▁temp", + "l" + ], + [ + "cs", + "s" + ], + [ + "c", + "ss" + ], + [ + "▁p", + "icture" + ], + [ + "▁pict", + "ure" + ], + [ + "▁", + "picture" + ], + [ + "▁a", + "sync" + ], + [ + "▁as", + "ync" + ], + [ + "▁", + "async" + ], + [ + "▁st", + "ock" + ], + [ + "▁sto", + "ck" + ], + [ + "▁", + "stock" + ], + [ + "▁in", + "ternal" + ], + [ + "▁inter", + "nal" + ], + [ + "▁intern", + "al" + ], + [ + "▁", + "internal" + ], + [ + "t", + "i" + ], + [ + "B", + "O" + ], + [ + "V", + "er" + ], + [ + "с", + "по" + ], + [ + "▁d", + "emon" + ], + [ + "▁de", + "mon" + ], + [ + "▁dem", + "on" + ], + [ + "▁demo", + "n" + ], + [ + "▁l", + "augh" + ], + [ + "▁la", + "ugh" + ], + [ + "▁laug", + "h" + ], + [ + "▁E", + "nd" + ], + [ + "▁En", + "d" + ], + [ + "▁", + "End" + ], + [ + "▁k", + "on" + ], + [ + "▁ko", + "n" + ], + [ + "▁", + "kon" + ], + [ + "▁ide", + "as" + ], + [ + "▁idea", + "s" + ], + [ + "▁c", + "andid" + ], + [ + "▁can", + "did" + ], + [ + "▁cand", + "id" + ], + [ + "Me", + "m" + ], + [ + "M", + "em" + ], + [ + "iz", + "z" + ], + [ + "i", + "zz" + ], + [ + "re", + "fix" + ], + [ + "ref", + "ix" + ], + [ + "▁A", + "ND" + ], + [ + "▁AN", + "D" + ], + [ + "▁", + "AND" + ], + [ + "eg", + "en" + ], + [ + "e", + "gen" + ], + [ + "E", + "l" + ], + [ + "▁camp", + "aign" + ], + [ + "H", + "ttp" + ], + [ + "▁R", + "ob" + ], + [ + "▁Ro", + "b" + ], + [ + "▁", + "Rob" + ], + 
[ + "д", + "і" + ], + [ + "▁b", + "ul" + ], + [ + "▁bu", + "l" + ], + [ + "▁", + "bul" + ], + [ + "▁К", + "о" + ], + [ + "▁", + "Ко" + ], + [ + "▁count", + "ries" + ], + [ + "▁countr", + "ies" + ], + [ + "»", + "." + ], + [ + "▁ex", + "pression" + ], + [ + "▁exp", + "ression" + ], + [ + "▁express", + "ion" + ], + [ + "▁expr", + "ession" + ], + [ + "▁", + "expression" + ], + [ + "▁Eng", + "land" + ], + [ + "s", + "f" + ], + [ + "▁certain", + "ly" + ], + [ + "ag", + "en" + ], + [ + "age", + "n" + ], + [ + "a", + "gen" + ], + [ + "▁ч", + "а" + ], + [ + "▁", + "ча" + ], + [ + "▁A", + "NY" + ], + [ + "▁AN", + "Y" + ], + [ + "▁", + "ANY" + ], + [ + "▁conne", + "ct" + ], + [ + "▁conn", + "ect" + ], + [ + "▁", + "connect" + ], + [ + "F", + "E" + ], + [ + "▁and", + "roid" + ], + [ + "▁", + "android" + ], + [ + "▁G", + "old" + ], + [ + "▁Go", + "ld" + ], + [ + "▁Gol", + "d" + ], + [ + "▁", + "Gold" + ], + [ + "▁op", + "pos" + ], + [ + "▁opp", + "os" + ], + [ + "ov", + "ern" + ], + [ + "ove", + "rn" + ], + [ + "over", + "n" + ], + [ + "o", + "vern" + ], + [ + "▁Com", + "mun" + ], + [ + "▁Comm", + "un" + ], + [ + ",", + "_" + ], + [ + "as", + "ion" + ], + [ + "asi", + "on" + ], + [ + "L", + "a" + ], + [ + "▁f", + "irm" + ], + [ + "▁fi", + "rm" + ], + [ + "▁fir", + "m" + ], + [ + "▁Al", + "though" + ], + [ + "▁G", + "ood" + ], + [ + "▁Go", + "od" + ], + [ + "▁", + "Good" + ], + [ + "▁L", + "aw" + ], + [ + "▁La", + "w" + ], + [ + "er", + "ve" + ], + [ + "erv", + "e" + ], + [ + "▁b", + "rand" + ], + [ + "▁br", + "and" + ], + [ + "▁bra", + "nd" + ], + [ + "▁", + "brand" + ], + [ + "M", + "in" + ], + [ + "fil", + "l" + ], + [ + "fi", + "ll" + ], + [ + "f", + "ill" + ], + [ + "']", + "," + ], + [ + "'", + "]," + ], + [ + "▁J", + "ew" + ], + [ + "▁Je", + "w" + ], + [ + "il", + "er" + ], + [ + "ile", + "r" + ], + [ + "i", + "ler" + ], + [ + "in", + "gle" + ], + [ + "ing", + "le" + ], + [ + "it", + "hub" + ], + [ + "ith", + "ub" + ], + [ + "▁D", + "iv" + ], + [ + "▁Di", + "v" + ], + [ + "▁", + "Div" + ], + [ + "▁c", + "ert" + ], + [ + "▁ce", + "rt" + ], + [ + "▁cer", + "t" + ], + [ + "▁", + "cert" + ], + [ + "He", + "ight" + ], + [ + "H", + "eight" + ], + [ + "ra", + "el" + ], + [ + "r", + "ael" + ], + [ + "The", + "re" + ], + [ + "Th", + "ere" + ], + [ + "T", + "here" + ], + [ + "it", + "ute" + ], + [ + "itut", + "e" + ], + [ + "itu", + "te" + ], + [ + "▁a", + "maz" + ], + [ + "▁am", + "az" + ], + [ + "▁", + "amaz" + ], + [ + "lo", + "ok" + ], + [ + "l", + "ook" + ], + [ + "▁S", + "E" + ], + [ + "▁", + "SE" + ], + [ + "▁j", + "o" + ], + [ + "▁", + "jo" + ], + [ + "▁pull", + "ed" + ], + [ + "▁pul", + "led" + ], + [ + "▁re", + "sources" + ], + [ + "▁res", + "ources" + ], + [ + "▁resource", + "s" + ], + [ + "▁", + "resources" + ], + [ + "▁M", + "ax" + ], + [ + "▁Ma", + "x" + ], + [ + "▁", + "Max" + ], + [ + "▁ag", + "reed" + ], + [ + "▁agree", + "d" + ], + [ + "▁agre", + "ed" + ], + [ + "as", + "y" + ], + [ + "a", + "sy" + ], + [ + "▁treat", + "ment" + ], + [ + "\">", + "<", + "/" + ], + [ + "\"", + ">", + ">" + ], + [ + "▁", + ">>" + ], + [ + "com", + "mand" + ], + [ + "comm", + "and" + ], + [ + "at", + "z" + ], + [ + "a", + "tz" + ], + [ + "▁m", + "al" + ], + [ + "▁ma", + "l" + ], + [ + "▁", + "mal" + ], + [ + "ста", + "в" + ], + [ + "▁P", + "ress" + ], + [ + "▁Pr", + "ess" + ], + [ + "▁Pres", + "s" + ], + [ + "▁Pre", + "ss" + ], + [ + "▁", + "Press" + ], + [ + "▁char", + "acters" + ], + [ + "▁character", + "s" + ], + [ + "▁z", + "ero" + ], + [ + "▁ze", + "ro" + ], + [ + "▁", + "zero" + ], + [ + "AG", + "E" + 
], + [ + "A", + "GE" + ], + [ + "rap", + "per" + ], + [ + "▁kit", + "chen" + ], + [ + "am", + "ing" + ], + [ + "ami", + "ng" + ], + [ + "amin", + "g" + ], + [ + "a", + "ming" + ], + [ + "▁re", + "str" + ], + [ + "▁r", + "estr" + ], + [ + "▁res", + "tr" + ], + [ + "▁rest", + "r" + ], + [ + "X", + "X" + ], + [ + "▁Col", + "lege" + ], + [ + "▁Ar", + "ray" + ], + [ + "▁Arr", + "ay" + ], + [ + "▁", + "Array" + ], + [ + "▁f", + "resh" + ], + [ + "▁fr", + "esh" + ], + [ + "▁fre", + "sh" + ], + [ + "▁fres", + "h" + ], + [ + "▁sh", + "ift" + ], + [ + "▁", + "shift" + ], + [ + "▁spec", + "ified" + ], + [ + "pl", + "ete" + ], + [ + "ple", + "te" + ], + [ + "plet", + "e" + ], + [ + "p", + "lete" + ], + [ + "IT", + "E" + ], + [ + "I", + "TE" + ], + [ + "▁C", + "amp" + ], + [ + "▁Cam", + "p" + ], + [ + "▁Ca", + "mp" + ], + [ + "▁", + "Camp" + ], + [ + "ri", + "al" + ], + [ + "ria", + "l" + ], + [ + "r", + "ial" + ], + [ + "c", + "b" + ], + [ + "▁T", + "H" + ], + [ + "▁", + "TH" + ], + [ + "I", + "B" + ], + [ + "os", + "en" + ], + [ + "ose", + "n" + ], + [ + "o", + "sen" + ], + [ + "▁", + "ú" + ], + [ + "▁par", + "ams" + ], + [ + "▁param", + "s" + ], + [ + "▁para", + "ms" + ], + [ + "▁", + "params" + ], + [ + "ign", + "ment" + ], + [ + "ad", + "ding" + ], + [ + "add", + "ing" + ], + [ + "▁deg", + "ree" + ], + [ + "▁", + "degree" + ], + [ + "Loc", + "al" + ], + [ + "Lo", + "cal" + ], + [ + "L", + "ocal" + ], + [ + "O", + "h" + ], + [ + "▁z", + "ur" + ], + [ + "▁zu", + "r" + ], + [ + "▁level", + "s" + ], + [ + "▁lev", + "els" + ], + [ + "C", + "S" + ], + [ + "fin", + "ished" + ], + [ + "finish", + "ed" + ], + [ + "C", + "ase" + ], + [ + "ri", + "age" + ], + [ + "ria", + "ge" + ], + [ + "Vec", + "tor" + ], + [ + "V", + "ector" + ], + [ + "▁s", + "ea" + ], + [ + "▁se", + "a" + ], + [ + "▁", + "sea" + ], + [ + "ant", + "ic" + ], + [ + "anti", + "c" + ], + [ + "▁Le", + "ague" + ], + [ + "▁there", + "fore" + ], + [ + "▁ther", + "efore" + ], + [ + "On", + "e" + ], + [ + "O", + "ne" + ], + [ + "Re", + "turn" + ], + [ + "Ret", + "urn" + ], + [ + "R", + "eturn" + ], + [ + "Acc", + "ess" + ], + [ + "Ac", + "cess" + ], + [ + "A", + "ccess" + ], + [ + "va", + "s" + ], + [ + "v", + "as" + ], + [ + "▁о", + "с" + ], + [ + "▁r", + "at" + ], + [ + "▁ra", + "t" + ], + [ + "▁", + "rat" + ], + [ + "Bi", + "g" + ], + [ + "B", + "ig" + ], + [ + "▁be", + "havior" + ], + [ + "▁behav", + "ior" + ], + [ + "▁behavi", + "or" + ], + [ + "k", + "r" + ], + [ + "▁un", + "defined" + ], + [ + "▁und", + "efined" + ], + [ + "▁", + "undefined" + ], + [ + "▁E", + "s" + ], + [ + "▁", + "Es" + ], + [ + "▁appe", + "ared" + ], + [ + "▁appear", + "ed" + ], + [ + "el", + "es" + ], + [ + "ele", + "s" + ], + [ + "e", + "les" + ], + [ + "▁W", + "AR" + ], + [ + "▁WA", + "R" + ], + [ + "▁", + "WAR" + ], + [ + "St", + "at" + ], + [ + "S", + "tat" + ], + [ + "▁Go", + "ogle" + ], + [ + "▁", + "Google" + ], + [ + "▁c", + "redit" + ], + [ + "▁cre", + "dit" + ], + [ + "▁cr", + "edit" + ], + [ + "▁cred", + "it" + ], + [ + "▁F", + "ile" + ], + [ + "▁Fil", + "e" + ], + [ + "▁Fi", + "le" + ], + [ + "▁", + "File" + ], + [ + "an", + "ging" + ], + [ + "ang", + "ing" + ], + [ + "ho", + "use" + ], + [ + "hou", + "se" + ], + [ + "h", + "ouse" + ], + [ + "rom", + "ise" + ], + [ + "ge", + "nt" + ], + [ + "gen", + "t" + ], + [ + "g", + "ent" + ], + [ + "▁hab", + "it" + ], + [ + "▁ha", + "bit" + ], + [ + "▁soc", + "iety" + ], + [ + "▁soci", + "ety" + ], + [ + "▁societ", + "y" + ], + [ + "▁enc", + "our" + ], + [ + "▁p", + "aint" + ], + [ + "▁pain", + "t" + ], + [ + "▁pa", + 
"int" + ], + [ + "pe", + "t" + ], + [ + "p", + "et" + ], + [ + "▁U", + "K" + ], + [ + "▁", + "UK" + ], + [ + "aw", + "s" + ], + [ + "a", + "ws" + ], + [ + "on", + "om" + ], + [ + "ono", + "m" + ], + [ + "o", + "nom" + ], + [ + "G", + "l" + ], + [ + "}_", + "{\\" + ], + [ + "}_{", + "\\" + ], + [ + "}", + "_{\\" + ], + [ + "el", + "ess" + ], + [ + "ele", + "ss" + ], + [ + "eles", + "s" + ], + [ + "e", + "less" + ], + [ + "em", + "y" + ], + [ + "e", + "my" + ], + [ + "▁C", + "ong" + ], + [ + "▁Con", + "g" + ], + [ + "▁Co", + "ng" + ], + [ + "▁develop", + "ed" + ], + [ + "▁im", + "ages" + ], + [ + "▁image", + "s" + ], + [ + "▁imag", + "es" + ], + [ + "▁", + "images" + ], + [ + "▁", + "ö" + ], + [ + "▁f", + "ont" + ], + [ + "▁fo", + "nt" + ], + [ + "▁fon", + "t" + ], + [ + "▁", + "font" + ], + [ + "cl", + "ear" + ], + [ + "cle", + "ar" + ], + [ + "c", + "lear" + ], + [ + "gi", + "n" + ], + [ + "g", + "in" + ], + [ + "▁L", + "ord" + ], + [ + "▁Lo", + "rd" + ], + [ + "▁Lor", + "d" + ], + [ + "▁trans", + "port" + ], + [ + "▁", + "transport" + ], + [ + "▁:", + ":" + ], + [ + "▁", + "::" + ], + [ + "▁c", + "up" + ], + [ + "▁cu", + "p" + ], + [ + "▁", + "cup" + ], + [ + "ul", + "ate" + ], + [ + "ula", + "te" + ], + [ + "u", + "late" + ], + [ + "▁D", + "uring" + ], + [ + "▁Du", + "ring" + ], + [ + "▁Dur", + "ing" + ], + [ + "pr", + "iv" + ], + [ + "p", + "riv" + ], + [ + "▁ext", + "rem" + ], + [ + "▁extr", + "em" + ], + [ + "▁D", + "i" + ], + [ + "▁", + "Di" + ], + [ + "▁d", + "oubt" + ], + [ + "▁dou", + "bt" + ], + [ + "▁doub", + "t" + ], + [ + "P", + "y" + ], + [ + "if", + "ying" + ], + [ + "ify", + "ing" + ], + [ + "sp", + "lit" + ], + [ + "spl", + "it" + ], + [ + "s", + "plit" + ], + [ + "eg", + "o" + ], + [ + "e", + "go" + ], + [ + "git", + "hub" + ], + [ + "g", + "ithub" + ], + [ + "▁)", + "," + ], + [ + "▁", + ")," + ], + [ + "RO", + "M" + ], + [ + "R", + "OM" + ], + [ + "▁ch", + "air" + ], + [ + "▁cha", + "ir" + ], + [ + "▁", + "chair" + ], + [ + "▁t", + "rade" + ], + [ + "▁tr", + "ade" + ], + [ + "▁trad", + "e" + ], + [ + "▁tra", + "de" + ], + [ + "▁n", + "icht" + ], + [ + "▁ni", + "cht" + ], + [ + "▁nic", + "ht" + ], + [ + "To", + "p" + ], + [ + "T", + "op" + ], + [ + "St", + "ore" + ], + [ + "▁p", + "arte" + ], + [ + "▁part", + "e" + ], + [ + "▁par", + "te" + ], + [ + "pro", + "ject" + ], + [ + "ni", + "a" + ], + [ + "n", + "ia" + ], + [ + "▁в", + "ід" + ], + [ + "▁ві", + "д" + ], + [ + "wa", + "r" + ], + [ + "w", + "ar" + ], + [ + "▁Pro", + "f" + ], + [ + "▁Pr", + "of" + ], + [ + "▁c", + "aught" + ], + [ + "Th", + "read" + ], + [ + "ст", + "ва" + ], + [ + "ств", + "а" + ], + [ + "с", + "тва" + ], + [ + "aut", + "hor" + ], + [ + "auth", + "or" + ], + [ + "▁d", + "oll" + ], + [ + "▁do", + "ll" + ], + [ + "▁dol", + "l" + ], + [ + "▁h", + "arm" + ], + [ + "▁ha", + "rm" + ], + [ + "▁har", + "m" + ], + [ + "▁", + "harm" + ], + [ + "▁G", + "en" + ], + [ + "▁Ge", + "n" + ], + [ + "▁", + "Gen" + ], + [ + "tr", + "ee" + ], + [ + "tre", + "e" + ], + [ + "t", + "ree" + ], + [ + "et", + "ime" + ], + [ + "eti", + "me" + ], + [ + "e", + "time" + ], + [ + "cf", + "g" + ], + [ + "c", + "fg" + ], + [ + "▁gu", + "ys" + ], + [ + "▁guy", + "s" + ], + [ + "▁Cal", + "ifornia" + ], + [ + "▁G", + "reen" + ], + [ + "▁Gr", + "een" + ], + [ + "▁Gre", + "en" + ], + [ + "▁Gree", + "n" + ], + [ + "▁", + "Green" + ], + [ + "▁mov", + "ement" + ], + [ + "▁move", + "ment" + ], + [ + "▁mo", + "vement" + ], + [ + "ie", + "j" + ], + [ + "i", + "ej" + ], + [ + "▁stat", + "ement" + ], + [ + "▁state", + "ment" + ], + [ + "▁", + 
"statement" + ], + [ + "▁se", + "eing" + ], + [ + "▁see", + "ing" + ], + [ + "▁h", + "aven" + ], + [ + "▁have", + "n" + ], + [ + "▁ha", + "ven" + ], + [ + "▁hav", + "en" + ], + [ + "vent", + "ion" + ], + [ + "v", + "ention" + ], + [ + "S", + "L" + ], + [ + "ched", + "ul" + ], + [ + "ie", + "rt" + ], + [ + "ier", + "t" + ], + [ + "i", + "ert" + ], + [ + "▁pr", + "imary" + ], + [ + "▁prim", + "ary" + ], + [ + "▁pri", + "mary" + ], + [ + "▁prima", + "ry" + ], + [ + "▁", + "primary" + ], + [ + "▁c", + "ivil" + ], + [ + "▁ci", + "vil" + ], + [ + "▁civ", + "il" + ], + [ + "ri", + "an" + ], + [ + "ria", + "n" + ], + [ + "r", + "ian" + ], + [ + "▁b", + "utton" + ], + [ + "▁but", + "ton" + ], + [ + "▁butt", + "on" + ], + [ + "▁", + "button" + ], + [ + "▁l", + "ived" + ], + [ + "▁li", + "ved" + ], + [ + "▁live", + "d" + ], + [ + "▁liv", + "ed" + ], + [ + "P", + "ass" + ], + [ + "so", + "r" + ], + [ + "s", + "or" + ], + [ + "▁watch", + "ing" + ], + [ + "▁wat", + "ching" + ], + [ + "▁sk", + "ills" + ], + [ + "▁skill", + "s" + ], + [ + "te", + "e" + ], + [ + "t", + "ee" + ], + [ + "Le", + "vel" + ], + [ + "L", + "evel" + ], + [ + "▁sc", + "ient" + ], + [ + "h", + "s" + ], + [ + "▁a", + "gre" + ], + [ + "▁ag", + "re" + ], + [ + "ca", + "t" + ], + [ + "c", + "at" + ], + [ + "▁t", + "end" + ], + [ + "▁te", + "nd" + ], + [ + "▁ten", + "d" + ], + [ + "▁M", + "ill" + ], + [ + "▁Mil", + "l" + ], + [ + "▁Mi", + "ll" + ], + [ + "▁", + "Mill" + ], + [ + "▁C", + "ap" + ], + [ + "▁Ca", + "p" + ], + [ + "▁", + "Cap" + ], + [ + "OR", + "D" + ], + [ + "O", + "RD" + ], + [ + "gl", + "e" + ], + [ + "g", + "le" + ], + [ + "▁с", + "во" + ], + [ + "»", + "," + ], + [ + "▁a", + "head" + ], + [ + "▁ah", + "ead" + ], + [ + "ve", + "st" + ], + [ + "ves", + "t" + ], + [ + "v", + "est" + ], + [ + "▁J", + "ose" + ], + [ + "▁Jo", + "se" + ], + [ + "▁Jos", + "e" + ], + [ + "is", + "cher" + ], + [ + "isch", + "er" + ], + [ + "ische", + "r" + ], + [ + "isc", + "her" + ], + [ + "ș", + "i" + ], + [ + "▁le", + "aving" + ], + [ + "▁д", + "ля" + ], + [ + "▁s", + "outh" + ], + [ + "▁so", + "uth" + ], + [ + "▁sou", + "th" + ], + [ + "▁sout", + "h" + ], + [ + "▁con", + "sum" + ], + [ + "▁cons", + "um" + ], + [ + "▁", + "consum" + ], + [ + "R", + "ange" + ], + [ + "▁activ", + "ities" + ], + [ + "Se", + "c" + ], + [ + "S", + "ec" + ], + [ + "▁s", + "ales" + ], + [ + "▁sa", + "les" + ], + [ + "▁sal", + "es" + ], + [ + "▁sale", + "s" + ], + [ + "▁f", + "ix" + ], + [ + "▁fi", + "x" + ], + [ + "▁", + "fix" + ], + [ + "▁j", + "ed" + ], + [ + "▁je", + "d" + ], + [ + "▁", + "jed" + ], + [ + "ru", + "m" + ], + [ + "r", + "um" + ], + [ + "ve", + "ctor" + ], + [ + "vec", + "tor" + ], + [ + "v", + "ector" + ], + [ + "▁s", + "pot" + ], + [ + "▁sp", + "ot" + ], + [ + "▁spo", + "t" + ], + [ + "▁", + "spot" + ], + [ + "▁man", + "ufact" + ], + [ + "к", + "т" + ], + [ + "or", + "row" + ], + [ + "orr", + "ow" + ], + [ + "si", + "gn" + ], + [ + "sig", + "n" + ], + [ + "s", + "ign" + ], + [ + "▁col", + "lege" + ], + [ + "▁colle", + "ge" + ], + [ + "▁colleg", + "e" + ], + [ + "▁d", + "river" + ], + [ + "▁dr", + "iver" + ], + [ + "▁dri", + "ver" + ], + [ + "▁driv", + "er" + ], + [ + "▁drive", + "r" + ], + [ + "▁", + "driver" + ], + [ + "▁def", + "initely" + ], + [ + "▁definit", + "ely" + ], + [ + "▁s", + "pend" + ], + [ + "▁sp", + "end" + ], + [ + "▁spe", + "nd" + ], + [ + "miss", + "ion" + ], + [ + "m", + "ission" + ], + [ + "з", + "у" + ], + [ + "at", + "ively" + ], + [ + "ative", + "ly" + ], + [ + "ativ", + "ely" + ], + [ + "b", + "i" + ], + [ + "Call", + 
"back" + ], + [ + "▁particular", + "ly" + ], + [ + "▁particul", + "arly" + ], + [ + "▁h", + "ell" + ], + [ + "▁he", + "ll" + ], + [ + "▁hel", + "l" + ], + [ + "▁", + "hell" + ], + [ + "▁p", + "ool" + ], + [ + "▁po", + "ol" + ], + [ + "▁", + "pool" + ], + [ + "PR", + "E" + ], + [ + "P", + "RE" + ], + [ + "▁cle", + "arly" + ], + [ + "▁clear", + "ly" + ], + [ + "P", + "T" + ], + [ + "ot", + "hes" + ], + [ + "oth", + "es" + ], + [ + "othe", + "s" + ], + [ + "▁I", + "d" + ], + [ + "▁", + "Id" + ], + [ + "Loc", + "ation" + ], + [ + "L", + "ocation" + ], + [ + "▁R", + "un" + ], + [ + "▁Ru", + "n" + ], + [ + "▁", + "Run" + ], + [ + "▁f", + "ixed" + ], + [ + "▁fix", + "ed" + ], + [ + "▁", + "fixed" + ], + [ + "▁H", + "and" + ], + [ + "▁Ha", + "nd" + ], + [ + "▁Han", + "d" + ], + [ + "▁", + "Hand" + ], + [ + "ba", + "l" + ], + [ + "b", + "al" + ], + [ + "d", + "ouble" + ], + [ + "C", + "an" + ], + [ + "Om", + "ega" + ], + [ + "▁chall", + "eng" + ], + [ + "▁stand", + "ing" + ], + [ + "▁stan", + "ding" + ], + [ + "▁", + "standing" + ], + [ + "it", + "en" + ], + [ + "ite", + "n" + ], + [ + "i", + "ten" + ], + [ + "▁me", + "chan" + ], + [ + "▁d", + "urch" + ], + [ + "▁dur", + "ch" + ], + [ + "▁d", + "ell" + ], + [ + "▁de", + "ll" + ], + [ + "▁del", + "l" + ], + [ + "▁rais", + "ed" + ], + [ + "▁raise", + "d" + ], + [ + "▁ra", + "ised" + ], + [ + "▁we", + "ak" + ], + [ + "▁", + "weak" + ], + [ + "▁D", + "u" + ], + [ + "▁", + "Du" + ], + [ + "gr", + "ad" + ], + [ + "gra", + "d" + ], + [ + "g", + "rad" + ], + [ + "▁sc", + "ene" + ], + [ + "▁scen", + "e" + ], + [ + "▁", + "scene" + ], + [ + "pos", + "s" + ], + [ + "po", + "ss" + ], + [ + "p", + "oss" + ], + [ + "▁t", + "on" + ], + [ + "▁to", + "n" + ], + [ + "▁", + "ton" + ], + [ + "▁e", + "arth" + ], + [ + "▁ear", + "th" + ], + [ + "ul", + "ations" + ], + [ + "ulation", + "s" + ], + [ + "▁str", + "ength" + ], + [ + "▁stre", + "ngth" + ], + [ + "▁streng", + "th" + ], + [ + "ak", + "ed" + ], + [ + "ake", + "d" + ], + [ + "a", + "ked" + ], + [ + "▁re", + "main" + ], + [ + "▁rem", + "ain" + ], + [ + "▁B", + "i" + ], + [ + "▁", + "Bi" + ], + [ + "▁custom", + "er" + ], + [ + "▁cust", + "omer" + ], + [ + "▁", + "customer" + ], + [ + "ran", + "ge" + ], + [ + "r", + "ange" + ], + [ + "▁inter", + "ested" + ], + [ + "▁interest", + "ed" + ], + [ + "ON", + "E" + ], + [ + "O", + "NE" + ], + [ + "▁c", + "off" + ], + [ + "▁co", + "ff" + ], + [ + "re", + "quire" + ], + [ + "requ", + "ire" + ], + [ + "▁On", + "ly" + ], + [ + "▁", + "Only" + ], + [ + "▁W", + "eb" + ], + [ + "▁We", + "b" + ], + [ + "▁", + "Web" + ], + [ + "▁f", + "arm" + ], + [ + "▁far", + "m" + ], + [ + "▁fa", + "rm" + ], + [ + "▁act", + "ivity" + ], + [ + "▁activ", + "ity" + ], + [ + "▁", + "activity" + ], + [ + "▁r", + "out" + ], + [ + "▁ro", + "ut" + ], + [ + "▁rou", + "t" + ], + [ + "bl", + "ing" + ], + [ + "b", + "ling" + ], + [ + "S", + "Y" + ], + [ + "▁Rich", + "ard" + ], + [ + "▁Ric", + "hard" + ], + [ + "▁R", + "ef" + ], + [ + "▁Re", + "f" + ], + [ + "▁", + "Ref" + ], + [ + "▁ко", + "н" + ], + [ + "▁к", + "он" + ], + [ + "▁", + "кон" + ], + [ + "▁j", + "un" + ], + [ + "▁ju", + "n" + ], + [ + "bo", + "rn" + ], + [ + "bor", + "n" + ], + [ + "b", + "orn" + ], + [ + "ij", + "n" + ], + [ + "Config", + "uration" + ], + [ + "um", + "an" + ], + [ + "uma", + "n" + ], + [ + "u", + "man" + ], + [ + "E", + "E" + ], + [ + "▁mar", + "ried" + ], + [ + "▁З", + "а" + ], + [ + "▁", + "За" + ], + [ + "▁f", + "at" + ], + [ + "▁fa", + "t" + ], + [ + "▁k", + "id" + ], + [ + "▁ki", + "d" + ], + [ + "▁T", + "ur" + ], + [ + 
"▁Tu", + "r" + ], + [ + "▁", + "Tur" + ], + [ + "▁off", + "ered" + ], + [ + "▁offer", + "ed" + ], + [ + "ni", + "c" + ], + [ + "n", + "ic" + ], + [ + "▁B", + "ig" + ], + [ + "▁Bi", + "g" + ], + [ + "▁", + "Big" + ], + [ + "Ga", + "mma" + ], + [ + "G", + "amma" + ], + [ + "▁He", + "alth" + ], + [ + "▁", + "Health" + ], + [ + "▁T", + "R" + ], + [ + "▁", + "TR" + ], + [ + "▁s", + "ię" + ], + [ + "▁si", + "ę" + ], + [ + "▁const", + "ruction" + ], + [ + "▁construct", + "ion" + ], + [ + "▁constr", + "uction" + ], + [ + "▁constru", + "ction" + ], + [ + "▁", + "construction" + ], + [ + "▁Ch", + "urch" + ], + [ + "▁B", + "et" + ], + [ + "▁Be", + "t" + ], + [ + "▁", + "Bet" + ], + [ + "bu", + "s" + ], + [ + "b", + "us" + ], + [ + "▁e", + "arn" + ], + [ + "▁ear", + "n" + ], + [ + "ri", + "ct" + ], + [ + "ric", + "t" + ], + [ + "r", + "ict" + ], + [ + "▁п", + "ра" + ], + [ + "▁пр", + "а" + ], + [ + "▁", + "пра" + ], + [ + "▁br", + "ain" + ], + [ + "▁bra", + "in" + ], + [ + "▁f", + "ra" + ], + [ + "▁fr", + "a" + ], + [ + "▁O", + "p" + ], + [ + "▁", + "Op" + ], + [ + "FI", + "G" + ], + [ + "F", + "IG" + ], + [ + "em", + "a" + ], + [ + "e", + "ma" + ], + [ + "▁Europe", + "an" + ], + [ + "▁S", + "aint" + ], + [ + "▁Sa", + "int" + ], + [ + "▁", + "Saint" + ], + [ + "AR", + "E" + ], + [ + "A", + "RE" + ], + [ + "ur", + "i" + ], + [ + "u", + "ri" + ], + [ + "▁R", + "iver" + ], + [ + "{", + "}" + ], + [ + "▁s", + "itting" + ], + [ + "▁sit", + "ting" + ], + [ + "▁under", + "standing" + ], + [ + "▁understand", + "ing" + ], + [ + "▁pl", + "ans" + ], + [ + "▁plan", + "s" + ], + [ + "rop", + "ri" + ], + [ + "▁old", + "er" + ], + [ + "▁ol", + "der" + ], + [ + "▁", + "older" + ], + [ + "▁pres", + "sure" + ], + [ + "▁press", + "ure" + ], + [ + "Im", + "pl" + ], + [ + "Imp", + "l" + ], + [ + "▁pe", + "ace" + ], + [ + "Conne", + "ction" + ], + [ + "Conn", + "ection" + ], + [ + "Connect", + "ion" + ], + [ + "▁f", + "i" + ], + [ + "▁", + "fi" + ], + [ + "ri", + "ch" + ], + [ + "ric", + "h" + ], + [ + "r", + "ich" + ], + [ + "▁sh", + "ut" + ], + [ + "ap", + "ers" + ], + [ + "ape", + "rs" + ], + [ + "aper", + "s" + ], + [ + "a", + "pers" + ], + [ + "Po", + "rt" + ], + [ + "P", + "ort" + ], + [ + "▁L", + "ook" + ], + [ + "▁Lo", + "ok" + ], + [ + "▁", + "Look" + ], + [ + "ri", + "m" + ], + [ + "r", + "im" + ], + [ + "au", + "th" + ], + [ + "aut", + "h" + ], + [ + "a", + "uth" + ], + [ + "au", + "to" + ], + [ + "aut", + "o" + ], + [ + "a", + "uto" + ], + [ + "▁high", + "ly" + ], + [ + "▁un", + "less" + ], + [ + "▁W", + "al" + ], + [ + "▁Wa", + "l" + ], + [ + "▁re", + "n" + ], + [ + "▁r", + "en" + ], + [ + "▁", + "ren" + ], + [ + "w", + "s" + ], + [ + "▁c", + "ore" + ], + [ + "▁co", + "re" + ], + [ + "▁cor", + "e" + ], + [ + "▁", + "core" + ], + [ + "(", + "-" + ], + [ + "▁c", + "lim" + ], + [ + "▁cl", + "im" + ], + [ + "ru", + "it" + ], + [ + "r", + "uit" + ], + [ + "▁call", + "back" + ], + [ + "▁", + "callback" + ], + [ + "he", + "st" + ], + [ + "hes", + "t" + ], + [ + "h", + "est" + ], + [ + "▁Char", + "les" + ], + [ + "▁Charl", + "es" + ], + [ + "▁L", + "ong" + ], + [ + "▁Lo", + "ng" + ], + [ + "▁", + "Long" + ], + [ + "}", + "=" + ], + [ + "ъ", + "р" + ], + [ + "▁sh", + "ared" + ], + [ + "▁share", + "d" + ], + [ + "▁shar", + "ed" + ], + [ + "▁sha", + "red" + ], + [ + "▁", + "shared" + ], + [ + "ul", + "ated" + ], + [ + "ula", + "ted" + ], + [ + "ulate", + "d" + ], + [ + "gorith", + "m" + ], + [ + "▁H", + "ome" + ], + [ + "▁Ho", + "me" + ], + [ + "▁Hom", + "e" + ], + [ + "▁", + "Home" + ], + [ + "▁vill", + "age" + ], + [ + 
"▁vil", + "lage" + ], + [ + "ee", + "s" + ], + [ + "e", + "es" + ], + [ + "s", + "v" + ], + [ + "▁rest", + "aur" + ], + [ + "re", + "y" + ], + [ + "r", + "ey" + ], + [ + "▁C", + "ast" + ], + [ + "▁Cas", + "t" + ], + [ + "▁Ca", + "st" + ], + [ + "▁", + "Cast" + ], + [ + "▁P", + "erson" + ], + [ + "▁Per", + "son" + ], + [ + "▁Pers", + "on" + ], + [ + "▁", + "Person" + ], + [ + "ки", + "й" + ], + [ + "▁organ", + "iz" + ], + [ + "▁R", + "ad" + ], + [ + "▁Ra", + "d" + ], + [ + "▁", + "Rad" + ], + [ + "pon", + "ents" + ], + [ + "ponent", + "s" + ], + [ + "▁wer", + "den" + ], + [ + "▁werd", + "en" + ], + [ + "▁b", + "ow" + ], + [ + "▁bo", + "w" + ], + [ + "▁", + "bow" + ], + [ + "se", + "n" + ], + [ + "s", + "en" + ], + [ + "am", + "i" + ], + [ + "a", + "mi" + ], + [ + "Inter", + "face" + ], + [ + "▁b", + "asis" + ], + [ + "▁bas", + "is" + ], + [ + "▁ba", + "sis" + ], + [ + "▁Comp", + "any" + ], + [ + "▁Compan", + "y" + ], + [ + "▁", + "Company" + ], + [ + "er", + "nel" + ], + [ + "ern", + "el" + ], + [ + "erne", + "l" + ], + [ + "it", + "u" + ], + [ + "i", + "tu" + ], + [ + "Has", + "h" + ], + [ + "Ha", + "sh" + ], + [ + "H", + "ash" + ], + [ + "▁a", + "an" + ], + [ + "▁", + "х" + ], + [ + "▁s", + "mile" + ], + [ + "▁sm", + "ile" + ], + [ + "x", + "ml" + ], + [ + "▁s", + "cen" + ], + [ + "▁sc", + "en" + ], + [ + "am", + "m" + ], + [ + "a", + "mm" + ], + [ + "to", + "ol" + ], + [ + "too", + "l" + ], + [ + "t", + "ool" + ], + [ + "ar", + "ia" + ], + [ + "ari", + "a" + ], + [ + "a", + "ria" + ], + [ + "▁acc", + "ur" + ], + [ + "▁ac", + "cur" + ], + [ + "▁", + "accur" + ], + [ + "set", + "tings" + ], + [ + "setting", + "s" + ], + [ + "▁Jes", + "us" + ], + [ + "ac", + "ement" + ], + [ + "ace", + "ment" + ], + [ + "po", + "wer" + ], + [ + "pow", + "er" + ], + [ + "p", + "ower" + ], + [ + "(", + "!" 
+ ], + [ + "▁c", + "alls" + ], + [ + "▁call", + "s" + ], + [ + "▁cal", + "ls" + ], + [ + "▁", + "calls" + ], + [ + "▁bas", + "ic" + ], + [ + "▁", + "basic" + ], + [ + "▁set", + "tings" + ], + [ + "▁sett", + "ings" + ], + [ + "▁setting", + "s" + ], + [ + "▁", + "settings" + ], + [ + "ri", + "pt" + ], + [ + "rip", + "t" + ], + [ + "r", + "ipt" + ], + [ + "po", + "ol" + ], + [ + "p", + "ool" + ], + [ + "ct", + "ors" + ], + [ + "ctor", + "s" + ], + [ + "▁Found", + "ation" + ], + [ + "▁", + "Foundation" + ], + [ + "▁we", + "ap" + ], + [ + "KE", + "Y" + ], + [ + "K", + "EY" + ], + [ + "fo", + "ot" + ], + [ + "foo", + "t" + ], + [ + "f", + "oot" + ], + [ + "▁r", + "adio" + ], + [ + "▁rad", + "io" + ], + [ + "▁radi", + "o" + ], + [ + "▁", + "radio" + ], + [ + "▁hel", + "ped" + ], + [ + "▁help", + "ed" + ], + [ + "ma", + "nn" + ], + [ + "man", + "n" + ], + [ + "m", + "ann" + ], + [ + "▁j", + "ump" + ], + [ + "▁ju", + "mp" + ], + [ + "▁t", + "ick" + ], + [ + "▁ti", + "ck" + ], + [ + "▁", + "tick" + ], + [ + "▁gr", + "owing" + ], + [ + "▁grow", + "ing" + ], + [ + "▁gro", + "wing" + ], + [ + "at", + "en" + ], + [ + "ate", + "n" + ], + [ + "a", + "ten" + ], + [ + "re", + "al" + ], + [ + "rea", + "l" + ], + [ + "▁incre", + "asing" + ], + [ + "Dev", + "ice" + ], + [ + "var", + "epsilon" + ], + [ + "vare", + "psilon" + ], + [ + "▁s", + "ets" + ], + [ + "▁se", + "ts" + ], + [ + "▁set", + "s" + ], + [ + "▁", + "sets" + ], + [ + "▁adv", + "ant" + ], + [ + "Op", + "en" + ], + [ + "O", + "pen" + ], + [ + "▁re", + "asons" + ], + [ + "▁reason", + "s" + ], + [ + "▁sup", + "posed" + ], + [ + "▁supp", + "osed" + ], + [ + "▁suppose", + "d" + ], + [ + "oe", + "s" + ], + [ + "o", + "es" + ], + [ + "ed", + "e" + ], + [ + "e", + "de" + ], + [ + "te", + "en" + ], + [ + "tee", + "n" + ], + [ + "t", + "een" + ], + [ + "if", + "def" + ], + [ + "▁de", + "lete" + ], + [ + "▁del", + "ete" + ], + [ + "▁delet", + "e" + ], + [ + "▁", + "delete" + ], + [ + "▁&", + "=" + ], + [ + "▁", + "&=" + ], + [ + "▁B", + "ill" + ], + [ + "▁Bi", + "ll" + ], + [ + "▁Bil", + "l" + ], + [ + "▁", + "Bill" + ], + [ + "▁a", + "im" + ], + [ + "▁ai", + "m" + ], + [ + "▁", + "aim" + ], + [ + "▁O", + "k" + ], + [ + "▁", + "Ok" + ], + [ + "▁A", + "v" + ], + [ + "▁", + "Av" + ], + [ + "re", + "ci" + ], + [ + "rec", + "i" + ], + [ + "ac", + "ks" + ], + [ + "ack", + "s" + ], + [ + "a", + "cks" + ], + [ + "is", + "te" + ], + [ + "ist", + "e" + ], + [ + "i", + "ste" + ], + [ + "Pro", + "perties" + ], + [ + "▁t", + "mp" + ], + [ + "▁tm", + "p" + ], + [ + "▁", + "tmp" + ], + [ + "▁d", + "ei" + ], + [ + "▁de", + "i" + ], + [ + "PE", + "R" + ], + [ + "P", + "ER" + ], + [ + "D", + "C" + ], + [ + "st", + "a" + ], + [ + "s", + "ta" + ], + [ + "ни", + "и" + ], + [ + "▁lim", + "ited" + ], + [ + "▁limit", + "ed" + ], + [ + "▁", + "limited" + ], + [ + "▁great", + "er" + ], + [ + "▁gre", + "ater" + ], + [ + "de", + "scription" + ], + [ + "des", + "cription" + ], + [ + "or", + "i" + ], + [ + "o", + "ri" + ], + [ + "ain", + "ts" + ], + [ + "aint", + "s" + ], + [ + "▁h", + "y" + ], + [ + "▁", + "hy" + ], + [ + "▁M", + "el" + ], + [ + "▁Me", + "l" + ], + [ + "▁C", + "H" + ], + [ + "▁", + "CH" + ], + [ + "con", + "s" + ], + [ + "co", + "ns" + ], + [ + "c", + "ons" + ], + [ + "▁sur", + "round" + ], + [ + "▁W", + "ho" + ], + [ + "▁Wh", + "o" + ], + [ + "▁", + "Who" + ], + [ + "ar", + "c" + ], + [ + "a", + "rc" + ], + [ + "▁te", + "lev" + ], + [ + "▁tele", + "v" + ], + [ + "▁tel", + "ev" + ], + [ + "it", + "ution" + ], + [ + "itut", + "ion" + ], + [ + "▁e", + "qual" + ], + [ + 
"▁equ", + "al" + ], + [ + "▁eq", + "ual" + ], + [ + "▁", + "equal" + ], + [ + "к", + "і" + ], + [ + "▁Is", + "rael" + ], + [ + "ä", + "h" + ], + [ + "▁C", + "aption" + ], + [ + "▁Capt", + "ion" + ], + [ + "▁Ca", + "ption" + ], + [ + "▁ex", + "erc" + ], + [ + "em", + "por" + ], + [ + "emp", + "or" + ], + [ + "▁+", + "+" + ], + [ + "▁", + "++" + ], + [ + "▁l", + "ib" + ], + [ + "▁li", + "b" + ], + [ + "▁", + "lib" + ], + [ + "ma", + "ke" + ], + [ + "m", + "ake" + ], + [ + "▁M", + "A" + ], + [ + "▁", + "MA" + ], + [ + "co", + "py" + ], + [ + "cop", + "y" + ], + [ + "c", + "opy" + ], + [ + "f", + "riend" + ], + [ + "▁ко", + "то" + ], + [ + "▁", + "кото" + ], + [ + "▁dam", + "age" + ], + [ + "▁\\", + "," + ], + [ + "▁", + "\\," + ], + [ + "od", + "ed" + ], + [ + "ode", + "d" + ], + [ + "o", + "ded" + ], + [ + "▁n", + "one" + ], + [ + "▁no", + "ne" + ], + [ + "▁non", + "e" + ], + [ + "▁", + "none" + ], + [ + "▁ev", + "alu" + ], + [ + "▁eval", + "u" + ], + [ + "▁", + "evalu" + ], + [ + "st", + "on" + ], + [ + "sto", + "n" + ], + [ + "s", + "ton" + ], + [ + ">", + "," + ], + [ + "FO", + "R" + ], + [ + "F", + "OR" + ], + [ + "▁n", + "orm" + ], + [ + "▁no", + "rm" + ], + [ + "▁nor", + "m" + ], + [ + "▁", + "norm" + ], + [ + "ap", + "pe" + ], + [ + "app", + "e" + ], + [ + "a", + "ppe" + ], + [ + "S", + "ession" + ], + [ + "▁ad", + "ult" + ], + [ + "▁h", + "ospital" + ], + [ + "▁hosp", + "ital" + ], + [ + "▁recomm", + "end" + ], + [ + "pro", + "perty" + ], + [ + "ste", + "in" + ], + [ + "fin", + "al" + ], + [ + "fi", + "nal" + ], + [ + "f", + "inal" + ], + [ + "▁n", + "u" + ], + [ + "▁", + "nu" + ], + [ + "se", + "cond" + ], + [ + "sec", + "ond" + ], + [ + "▁a", + "spect" + ], + [ + "▁as", + "pect" + ], + [ + "▁asp", + "ect" + ], + [ + "\")", + "]" + ], + [ + "\"", + ")]" + ], + [ + "же", + "н" + ], + [ + "ж", + "ен" + ], + [ + "am", + "ento" + ], + [ + "ament", + "o" + ], + [ + "amen", + "to" + ], + [ + "▁r", + "ac" + ], + [ + "▁ra", + "c" + ], + [ + "▁", + "rac" + ], + [ + "sa", + "ve" + ], + [ + "s", + "ave" + ], + [ + "▁foot", + "ball" + ], + [ + "A", + "b" + ], + [ + "un", + "gs" + ], + [ + "ung", + "s" + ], + [ + "ab", + "il" + ], + [ + "abi", + "l" + ], + [ + "a", + "bil" + ], + [ + "▁Ar", + "ch" + ], + [ + "▁Arc", + "h" + ], + [ + "▁", + "Arch" + ], + [ + "sys", + "tem" + ], + [ + "s", + "ystem" + ], + [ + "hi", + "st" + ], + [ + "his", + "t" + ], + [ + "h", + "ist" + ], + [ + "▁l", + "uck" + ], + [ + "▁lu", + "ck" + ], + [ + "▁luc", + "k" + ], + [ + "re", + "nder" + ], + [ + "ren", + "der" + ], + [ + "rend", + "er" + ], + [ + "r", + "ender" + ], + [ + "▁se", + "in" + ], + [ + "▁sei", + "n" + ], + [ + "ion", + "i" + ], + [ + "io", + "ni" + ], + [ + "i", + "oni" + ], + [ + "▁r", + "ot" + ], + [ + "▁ro", + "t" + ], + [ + "▁", + "rot" + ], + [ + "▁cor", + "ner" + ], + [ + "▁corn", + "er" + ], + [ + "▁app", + "ropri" + ], + [ + "▁ap", + "propri" + ], + [ + "▁", + "appropri" + ], + [ + "▁Soft", + "ware" + ], + [ + "▁t", + "ele" + ], + [ + "▁te", + "le" + ], + [ + "▁tel", + "e" + ], + [ + "▁", + "tele" + ], + [ + "De", + "lete" + ], + [ + "Dele", + "te" + ], + [ + "Del", + "ete" + ], + [ + "▁Acc", + "ording" + ], + [ + "▁pr", + "ison" + ], + [ + "▁pri", + "son" + ], + [ + "▁", + "prison" + ], + [ + "▁l", + "ic" + ], + [ + "▁li", + "c" + ], + [ + "▁", + "lic" + ], + [ + "▁м", + "и" + ], + [ + "▁", + "ми" + ], + [ + "ter", + "m" + ], + [ + "te", + "rm" + ], + [ + "t", + "erm" + ], + [ + "se", + "ts" + ], + [ + "set", + "s" + ], + [ + "s", + "ets" + ], + [ + "▁v", + "el" + ], + [ + "▁ve", + "l" + ], + 
[ + "▁", + "vel" + ], + [ + "▁r", + "ank" + ], + [ + "▁ran", + "k" + ], + [ + "▁", + "rank" + ], + [ + "▁ex", + "isting" + ], + [ + "▁exist", + "ing" + ], + [ + "▁", + "existing" + ], + [ + "▁V", + "ir" + ], + [ + "▁Vi", + "r" + ], + [ + "▁t", + "rip" + ], + [ + "▁tr", + "ip" + ], + [ + "▁tri", + "p" + ], + [ + "▁м", + "у" + ], + [ + "▁", + "му" + ], + [ + "av", + "ax" + ], + [ + "ava", + "x" + ], + [ + "▁r", + "is" + ], + [ + "▁ri", + "s" + ], + [ + "▁", + "ris" + ], + [ + "▁def", + "ine" + ], + [ + "▁defin", + "e" + ], + [ + "▁", + "define" + ], + [ + "▁he", + "at" + ], + [ + "ca", + "r" + ], + [ + "c", + "ar" + ], + [ + "▁con", + "vert" + ], + [ + "▁conv", + "ert" + ], + [ + "▁conver", + "t" + ], + [ + "▁conve", + "rt" + ], + [ + "▁", + "convert" + ], + [ + "em", + "ail" + ], + [ + "ema", + "il" + ], + [ + "e", + "mail" + ], + [ + "▁U", + "nder" + ], + [ + "▁Un", + "der" + ], + [ + "▁Und", + "er" + ], + [ + "▁", + "Under" + ], + [ + "▁", + "Ш" + ], + [ + "▁G", + "rand" + ], + [ + "▁Gr", + "and" + ], + [ + "▁Gran", + "d" + ], + [ + "▁Gra", + "nd" + ], + [ + "▁ex", + "ists" + ], + [ + "▁exist", + "s" + ], + [ + "▁", + "exists" + ], + [ + "sy", + "s" + ], + [ + "s", + "ys" + ], + [ + "ef", + "f" + ], + [ + "e", + "ff" + ], + [ + "▁T", + "op" + ], + [ + "▁To", + "p" + ], + [ + "▁", + "Top" + ], + [ + "▁", + "č" + ], + [ + "▁t", + "empor" + ], + [ + "▁tem", + "por" + ], + [ + "▁temp", + "or" + ], + [ + "▁tempo", + "r" + ], + [ + "▁arg", + "uments" + ], + [ + "▁argument", + "s" + ], + [ + "▁", + "arguments" + ], + [ + "▁support", + "ed" + ], + [ + "▁supp", + "orted" + ], + [ + "▁", + "supported" + ], + [ + "en", + "sed" + ], + [ + "ens", + "ed" + ], + [ + "ense", + "d" + ], + [ + "▁Franc", + "is" + ], + [ + "▁co", + "ord" + ], + [ + "▁", + "coord" + ], + [ + "▁achie", + "ve" + ], + [ + "▁N", + "ame" + ], + [ + "▁Na", + "me" + ], + [ + "▁Nam", + "e" + ], + [ + "▁", + "Name" + ], + [ + "▁J", + "ahr" + ], + [ + "▁Jah", + "r" + ], + [ + "▁Ja", + "hr" + ], + [ + "▁G", + "i" + ], + [ + "sh", + "e" + ], + [ + "s", + "he" + ], + [ + "▁D", + "ev" + ], + [ + "▁De", + "v" + ], + [ + "▁", + "Dev" + ], + [ + "▁a", + "lla" + ], + [ + "▁al", + "la" + ], + [ + "▁all", + "a" + ], + [ + "▁", + "alla" + ], + [ + "▁W", + "IT" + ], + [ + "ag", + "ment" + ], + [ + "c", + "ustom" + ], + [ + "al", + "ls" + ], + [ + "all", + "s" + ], + [ + "&", + "&" + ], + [ + "W", + "E" + ], + [ + "▁h", + "olding" + ], + [ + "▁hold", + "ing" + ], + [ + "▁hol", + "ding" + ], + [ + "pro", + "totype" + ], + [ + "proto", + "type" + ], + [ + "prot", + "otype" + ], + [ + "▁f", + "ing" + ], + [ + "▁fin", + "g" + ], + [ + "▁fi", + "ng" + ], + [ + "▁b", + "ag" + ], + [ + "▁ba", + "g" + ], + [ + "▁", + "bag" + ], + [ + "▁Par", + "ty" + ], + [ + "▁Part", + "y" + ], + [ + "st", + "ack" + ], + [ + "sta", + "ck" + ], + [ + "▁econom", + "ic" + ], + [ + "▁G", + "al" + ], + [ + "▁Ga", + "l" + ], + [ + "id", + "ents" + ], + [ + "ident", + "s" + ], + [ + "iden", + "ts" + ], + [ + "▁J", + "un" + ], + [ + "▁Ju", + "n" + ], + [ + "▁sh", + "owed" + ], + [ + "▁show", + "ed" + ], + [ + "os", + "h" + ], + [ + "o", + "sh" + ], + [ + "▁B", + "ay" + ], + [ + "▁Ba", + "y" + ], + [ + "▁", + "Bay" + ], + [ + "ma", + "il" + ], + [ + "m", + "ail" + ], + [ + "▁S", + "O" + ], + [ + "▁", + "SO" + ], + [ + "▁\"", + "<" + ], + [ + "graph", + "ics" + ], + [ + "▁f", + "u" + ], + [ + "▁", + "fu" + ], + [ + "cl", + "ick" + ], + [ + "cli", + "ck" + ], + [ + "c", + "lick" + ], + [ + "▁b", + "attle" + ], + [ + "▁batt", + "le" + ], + [ + "▁bat", + "tle" + ], + [ + "{", + "{" + 
], + [ + "▁E", + "vent" + ], + [ + "▁Even", + "t" + ], + [ + "▁Ev", + "ent" + ], + [ + "▁Eve", + "nt" + ], + [ + "▁", + "Event" + ], + [ + "ri", + "or" + ], + [ + "rio", + "r" + ], + [ + "r", + "ior" + ], + [ + "ch", + "aft" + ], + [ + "cha", + "ft" + ], + [ + "▁f", + "avorite" + ], + [ + "▁favor", + "ite" + ], + [ + "us", + "ive" + ], + [ + "sup", + "port" + ], + [ + "supp", + "ort" + ], + [ + "s", + "upport" + ], + [ + "b", + "m" + ], + [ + "K", + "ind" + ], + [ + "▁saf", + "ety" + ], + [ + "▁safe", + "ty" + ], + [ + "▁E", + "nt" + ], + [ + "▁En", + "t" + ], + [ + "▁", + "Ent" + ], + [ + "cu", + "p" + ], + [ + "c", + "up" + ], + [ + "▁Austral", + "ia" + ], + [ + "▁dest", + "roy" + ], + [ + "▁destro", + "y" + ], + [ + "▁", + "destroy" + ], + [ + "▁organ", + "ization" + ], + [ + "▁organiz", + "ation" + ], + [ + "id", + "en" + ], + [ + "ide", + "n" + ], + [ + "i", + "den" + ], + [ + "########", + "########" + ], + [ + "de", + "c" + ], + [ + "d", + "ec" + ], + [ + "▁z", + "a" + ], + [ + "▁", + "za" + ], + [ + "▁s", + "even" + ], + [ + "▁se", + "ven" + ], + [ + "▁", + "seven" + ], + [ + "ar", + "ely" + ], + [ + "are", + "ly" + ], + [ + "arel", + "y" + ], + [ + "▁f", + "lag" + ], + [ + "▁fl", + "ag" + ], + [ + "▁", + "flag" + ], + [ + "Di", + "r" + ], + [ + "D", + "ir" + ], + [ + "▁C", + "arl" + ], + [ + "▁Car", + "l" + ], + [ + "▁Ca", + "rl" + ], + [ + "▁do", + "ctor" + ], + [ + "▁doc", + "tor" + ], + [ + "▁var", + "iety" + ], + [ + "▁vari", + "ety" + ], + [ + "▁L", + "in" + ], + [ + "▁Li", + "n" + ], + [ + "▁", + "Lin" + ], + [ + "▁t", + "om" + ], + [ + "▁to", + "m" + ], + [ + "▁", + "tom" + ], + [ + "^{", + "(" + ], + [ + "^", + "{(" + ], + [ + "B", + "o" + ], + [ + "an", + "tes" + ], + [ + "ant", + "es" + ], + [ + "ante", + "s" + ], + [ + "▁m", + "ine" + ], + [ + "▁min", + "e" + ], + [ + "▁mi", + "ne" + ], + [ + "▁", + "mine" + ], + [ + "▁M", + "it" + ], + [ + "▁Mi", + "t" + ], + [ + "▁de", + "scribe" + ], + [ + "▁desc", + "ribe" + ], + [ + "▁describ", + "e" + ], + [ + "Ar", + "gs" + ], + [ + "Arg", + "s" + ], + [ + "L", + "S" + ], + [ + "AP", + "I" + ], + [ + "A", + "PI" + ], + [ + "▁L", + "uc" + ], + [ + "▁Lu", + "c" + ], + [ + "▁", + "Luc" + ], + [ + "ph", + "one" + ], + [ + "▁sc", + "ience" + ], + [ + "▁", + "science" + ], + [ + "▁O", + "per" + ], + [ + "▁Op", + "er" + ], + [ + "▁", + "Oper" + ], + [ + "Ne", + "xt" + ], + [ + "N", + "ext" + ], + [ + "▁invest", + "ig" + ], + [ + "▁demon", + "str" + ], + [ + "▁G", + "overn" + ], + [ + "▁Go", + "vern" + ], + [ + "▁object", + "s" + ], + [ + "▁", + "objects" + ], + [ + "▁Lou", + "is" + ], + [ + "▁Lo", + "uis" + ], + [ + "▁Return", + "s" + ], + [ + "▁", + "Returns" + ], + [ + "▁h", + "an" + ], + [ + "▁ha", + "n" + ], + [ + "▁", + "han" + ], + [ + "na", + "m" + ], + [ + "n", + "am" + ], + [ + "▁com", + "me" + ], + [ + "▁comm", + "e" + ], + [ + "▁pres", + "ence" + ], + [ + "▁p", + "el" + ], + [ + "▁pe", + "l" + ], + [ + "▁", + "pel" + ], + [ + "▁det", + "ect" + ], + [ + "▁", + "detect" + ], + [ + ")", + "=" + ], + [ + "▁Ch", + "inese" + ], + [ + "▁r", + "ich" + ], + [ + "▁ri", + "ch" + ], + [ + "▁ric", + "h" + ], + [ + "▁", + "rich" + ], + [ + "▁class", + "es" + ], + [ + "▁classe", + "s" + ], + [ + "▁clas", + "ses" + ], + [ + "▁", + "classes" + ], + [ + "▁exp", + "and" + ], + [ + "▁", + "expand" + ], + [ + "▁D", + "om" + ], + [ + "▁Do", + "m" + ], + [ + "▁", + "Dom" + ], + [ + "▁D", + "ec" + ], + [ + "▁De", + "c" + ], + [ + "▁", + "Dec" + ], + [ + "s", + "n" + ], + [ + "pe", + "ed" + ], + [ + "p", + "eed" + ], + [ + "▁J", + "im" + ], + [ + 
"▁Ji", + "m" + ], + [ + "sh", + "ould" + ], + [ + "▁Sm", + "ith" + ], + [ + "▁p", + "ages" + ], + [ + "▁page", + "s" + ], + [ + "▁pa", + "ges" + ], + [ + "▁pag", + "es" + ], + [ + "▁", + "pages" + ], + [ + "▁Je", + "an" + ], + [ + "ri", + "cs" + ], + [ + "ric", + "s" + ], + [ + "r", + "ics" + ], + [ + "▁S", + "und" + ], + [ + "▁Su", + "nd" + ], + [ + "▁Sun", + "d" + ], + [ + "ad", + "s" + ], + [ + "a", + "ds" + ], + [ + "▁The", + "ir" + ], + [ + "un", + "icip" + ], + [ + "uni", + "cip" + ], + [ + "unic", + "ip" + ], + [ + "в", + "у" + ], + [ + "▁down", + "load" + ], + [ + "▁", + "download" + ], + [ + "▁st", + "ress" + ], + [ + "▁str", + "ess" + ], + [ + "▁stre", + "ss" + ], + [ + "▁P", + "et" + ], + [ + "▁Pe", + "t" + ], + [ + "▁", + "Pet" + ], + [ + "me", + "nu" + ], + [ + "men", + "u" + ], + [ + "m", + "enu" + ], + [ + "re", + "me" + ], + [ + "rem", + "e" + ], + [ + "r", + "eme" + ], + [ + "▁com", + "pared" + ], + [ + "▁comp", + "ared" + ], + [ + "▁compar", + "ed" + ], + [ + "▁compare", + "d" + ], + [ + "St", + "e" + ], + [ + "S", + "te" + ], + [ + "IN", + "D" + ], + [ + "I", + "ND" + ], + [ + "cont", + "ainer" + ], + [ + "▁Ind", + "ian" + ], + [ + "▁India", + "n" + ], + [ + "or", + "en" + ], + [ + "ore", + "n" + ], + [ + "o", + "ren" + ], + [ + "▁s", + "es" + ], + [ + "▁se", + "s" + ], + [ + "▁", + "ses" + ], + [ + "▁W", + "he" + ], + [ + "▁Wh", + "e" + ], + [ + "▁", + "Whe" + ], + [ + "▁r", + "oku" + ], + [ + "▁ro", + "ku" + ], + [ + "▁estab", + "lished" + ], + [ + "▁establish", + "ed" + ], + [ + "▁gener", + "ally" + ], + [ + "▁general", + "ly" + ], + [ + "▁f", + "le" + ], + [ + "▁fl", + "e" + ], + [ + "__", + "(" + ], + [ + "_", + "_(" + ], + [ + "=\"", + "+" + ], + [ + "=", + "\"+" + ], + [ + "V", + "ar" + ], + [ + "▁M", + "ake" + ], + [ + "▁Ma", + "ke" + ], + [ + "▁Mak", + "e" + ], + [ + "▁", + "Make" + ], + [ + "▁rem", + "oved" + ], + [ + "▁remove", + "d" + ], + [ + "▁", + "removed" + ], + [ + "z", + "z" + ], + [ + "ü", + "n" + ], + [ + "▁m", + "ix" + ], + [ + "▁mi", + "x" + ], + [ + "▁", + "mix" + ], + [ + "er", + "k" + ], + [ + "iat", + "ion" + ], + [ + "i", + "ation" + ], + [ + "ou", + "ter" + ], + [ + "out", + "er" + ], + [ + "oute", + "r" + ], + [ + "o", + "uter" + ], + [ + "S", + "K" + ], + [ + "▁be", + "comes" + ], + [ + "▁bec", + "omes" + ], + [ + "▁become", + "s" + ], + [ + "▁H", + "all" + ], + [ + "▁Ha", + "ll" + ], + [ + "▁Hal", + "l" + ], + [ + "sc", + "ious" + ], + [ + "▁w", + "atched" + ], + [ + "▁watch", + "ed" + ], + [ + "▁wat", + "ched" + ], + [ + "▁g", + "ather" + ], + [ + "▁ga", + "ther" + ], + [ + "▁", + "gather" + ], + [ + "▁Res", + "ult" + ], + [ + "▁", + "Result" + ], + [ + "pro", + "of" + ], + [ + "pa", + "y" + ], + [ + "p", + "ay" + ], + [ + "▁produ", + "ced" + ], + [ + "▁produce", + "d" + ], + [ + "▁prod", + "uced" + ], + [ + "▁|", + "=" + ], + [ + "▁b", + "order" + ], + [ + "▁bord", + "er" + ], + [ + "▁bor", + "der" + ], + [ + "▁", + "border" + ], + [ + "▁d", + "in" + ], + [ + "▁di", + "n" + ], + [ + "▁s", + "cript" + ], + [ + "▁sc", + "ript" + ], + [ + "▁scr", + "ipt" + ], + [ + "▁", + "script" + ], + [ + "▁a", + "ctions" + ], + [ + "▁act", + "ions" + ], + [ + "▁action", + "s" + ], + [ + "▁", + "actions" + ], + [ + "▁m", + "as" + ], + [ + "▁ma", + "s" + ], + [ + "▁", + "mas" + ], + [ + "щ", + "а" + ], + [ + "oot", + "h" + ], + [ + "oo", + "th" + ], + [ + "o", + "oth" + ], + [ + "▁Te", + "chn" + ], + [ + "▁Tech", + "n" + ], + [ + "Js", + "on" + ], + [ + "J", + "son" + ], + [ + "▁f", + "illed" + ], + [ + "▁fil", + "led" + ], + [ + "▁fill", + "ed" + ], + [ 
+ "▁", + "filled" + ], + [ + "де", + "н" + ], + [ + "д", + "ен" + ], + [ + "und", + "le" + ], + [ + "ст", + "у" + ], + [ + "с", + "ту" + ], + [ + "To", + "ol" + ], + [ + "Too", + "l" + ], + [ + "T", + "ool" + ], + [ + "▁k", + "ing" + ], + [ + "▁ki", + "ng" + ], + [ + "▁kin", + "g" + ], + [ + "▁", + "king" + ], + [ + "▁v", + "en" + ], + [ + "▁ve", + "n" + ], + [ + "▁", + "ven" + ], + [ + "st", + "ra" + ], + [ + "str", + "a" + ], + [ + "s", + "tra" + ], + [ + "▁pre", + "dict" + ], + [ + "▁pred", + "ict" + ], + [ + "▁", + "predict" + ], + [ + "▁l", + "ui" + ], + [ + "▁lu", + "i" + ], + [ + "▁WAR", + "RAN" + ], + [ + "▁F", + "un" + ], + [ + "▁Fu", + "n" + ], + [ + "▁", + "Fun" + ], + [ + "Sc", + "ript" + ], + [ + "S", + "cript" + ], + [ + "▁power", + "ful" + ], + [ + "▁l", + "ose" + ], + [ + "▁lo", + "se" + ], + [ + "▁los", + "e" + ], + [ + "at", + "ically" + ], + [ + "atic", + "ally" + ], + [ + "▁d", + "aily" + ], + [ + "▁da", + "ily" + ], + [ + "▁dai", + "ly" + ], + [ + "▁r", + "ing" + ], + [ + "▁ri", + "ng" + ], + [ + "▁", + "ring" + ], + [ + "▁ar", + "rived" + ], + [ + "▁arriv", + "ed" + ], + [ + "▁arr", + "ived" + ], + [ + "▁arrive", + "d" + ], + [ + "St", + "ack" + ], + [ + "sc", + "ope" + ], + [ + "s", + "cope" + ], + [ + "▁B", + "ack" + ], + [ + "▁Ba", + "ck" + ], + [ + "▁", + "Back" + ], + [ + "el", + "ij" + ], + [ + "eli", + "j" + ], + [ + "e", + "lij" + ], + [ + "▁z", + "e" + ], + [ + "▁", + "ze" + ], + [ + "ke", + "ys" + ], + [ + "key", + "s" + ], + [ + "{", + "\"" + ], + [ + "VI", + "D" + ], + [ + "V", + "ID" + ], + [ + "▁l", + "icense" + ], + [ + "▁lic", + "ense" + ], + [ + "▁", + "license" + ], + [ + "wh", + "at" + ], + [ + "w", + "hat" + ], + [ + "▁pro", + "ced" + ], + [ + "▁proc", + "ed" + ], + [ + "ra", + "nt" + ], + [ + "ran", + "t" + ], + [ + "r", + "ant" + ], + [ + "est", + "ival" + ], + [ + "ag", + "ram" + ], + [ + "agr", + "am" + ], + [ + "agra", + "m" + ], + [ + "a", + "gram" + ], + [ + "▁L", + "O" + ], + [ + "▁", + "LO" + ], + [ + "▁Hen", + "ry" + ], + [ + "▁fl", + "ags" + ], + [ + "▁flag", + "s" + ], + [ + "▁", + "flags" + ], + [ + "Do", + "wn" + ], + [ + "D", + "own" + ], + [ + "scri", + "ption" + ], + [ + "script", + "ion" + ], + [ + "s", + "cription" + ], + [ + "▁famil", + "ies" + ], + [ + "▁familie", + "s" + ], + [ + "is", + "se" + ], + [ + "iss", + "e" + ], + [ + "bo", + "ur" + ], + [ + "b", + "our" + ], + [ + "▁B", + "ur" + ], + [ + "▁Bu", + "r" + ], + [ + "—", + "\"" + ], + [ + "▁b", + "rief" + ], + [ + "▁br", + "ief" + ], + [ + "▁", + "brief" + ], + [ + "▁cre", + "ating" + ], + [ + "▁creat", + "ing" + ], + [ + "▁cl", + "ients" + ], + [ + "▁client", + "s" + ], + [ + "ran", + "gle" + ], + [ + "r", + "angle" + ], + [ + "▁amaz", + "ing" + ], + [ + "▁s", + "ind" + ], + [ + "▁si", + "nd" + ], + [ + "▁sin", + "d" + ], + [ + "▁cover", + "ed" + ], + [ + "▁cov", + "ered" + ], + [ + "▁", + "covered" + ], + [ + "We", + "ll" + ], + [ + "W", + "ell" + ], + [ + "ст", + "е" + ], + [ + "с", + "те" + ], + [ + "то", + "р" + ], + [ + "т", + "ор" + ], + [ + "▁B", + "as" + ], + [ + "▁Ba", + "s" + ], + [ + "▁", + "Bas" + ], + [ + "to", + "tal" + ], + [ + "tot", + "al" + ], + [ + "t", + "otal" + ], + [ + "▁I", + "nit" + ], + [ + "▁In", + "it" + ], + [ + "▁", + "Init" + ], + [ + "▁s", + "and" + ], + [ + "▁sa", + "nd" + ], + [ + "▁san", + "d" + ], + [ + "Un", + "it" + ], + [ + "U", + "nit" + ], + [ + "▁mur", + "der" + ], + [ + "▁b", + "right" + ], + [ + "▁br", + "ight" + ], + [ + "▁brig", + "ht" + ], + [ + "▁t", + "rav" + ], + [ + "▁tr", + "av" + ], + [ + "▁tra", + "v" + ], + [ + "ic", 
+ "ans" + ], + [ + "ica", + "ns" + ], + [ + "ican", + "s" + ], + [ + "▁att", + "ribute" + ], + [ + "▁attribut", + "e" + ], + [ + "▁", + "attribute" + ], + [ + "f", + "c" + ], + [ + "▁pl", + "aced" + ], + [ + "▁place", + "d" + ], + [ + "▁plac", + "ed" + ], + [ + "ES", + "T" + ], + [ + "E", + "ST" + ], + [ + "Var", + "i" + ], + [ + "V", + "ari" + ], + [ + "▁c", + "os" + ], + [ + "▁co", + "s" + ], + [ + "▁", + "cos" + ], + [ + "▁at", + "tract" + ], + [ + "▁att", + "ract" + ], + [ + "▁attr", + "act" + ], + [ + "▁attra", + "ct" + ], + [ + "an", + "el" + ], + [ + "ane", + "l" + ], + [ + "a", + "nel" + ], + [ + "})", + "." + ], + [ + "}", + ")." + ], + [ + "by", + "tes" + ], + [ + "byte", + "s" + ], + [ + "▁p", + "arse" + ], + [ + "▁par", + "se" + ], + [ + "▁", + "parse" + ], + [ + "▁be", + "long" + ], + [ + "▁bel", + "ong" + ], + [ + "B", + "N" + ], + [ + "▁S", + "ol" + ], + [ + "▁So", + "l" + ], + [ + "P", + "o" + ], + [ + "`", + "," + ], + [ + "▁c", + "alling" + ], + [ + "▁call", + "ing" + ], + [ + "▁cal", + "ling" + ], + [ + "▁?", + ">" + ], + [ + "▁", + "?>" + ], + [ + "▁it", + "er" + ], + [ + "▁i", + "ter" + ], + [ + "▁", + "iter" + ], + [ + "▁u", + "rl" + ], + [ + "▁ur", + "l" + ], + [ + "▁", + "url" + ], + [ + "▁ev", + "ening" + ], + [ + "▁even", + "ing" + ], + [ + "re", + "ek" + ], + [ + "ree", + "k" + ], + [ + "▁hon", + "est" + ], + [ + "▁direct", + "or" + ], + [ + "▁dire", + "ctor" + ], + [ + "▁dir", + "ector" + ], + [ + "R", + "C" + ], + [ + "▁s", + "olid" + ], + [ + "▁sol", + "id" + ], + [ + "▁", + "solid" + ], + [ + "▁ph", + "il" + ], + [ + "ie", + "ne" + ], + [ + "ien", + "e" + ], + [ + "i", + "ene" + ], + [ + "FA", + "ULT" + ], + [ + "co", + "pe" + ], + [ + "cop", + "e" + ], + [ + "c", + "ope" + ], + [ + "▁Hist", + "ory" + ], + [ + "▁Histor", + "y" + ], + [ + "▁Hi", + "story" + ], + [ + "▁", + "History" + ], + [ + "▁Te", + "am" + ], + [ + "▁", + "Team" + ], + [ + "ree", + "dom" + ], + [ + "reed", + "om" + ], + [ + "▁r", + "u" + ], + [ + "▁", + "ru" + ], + [ + "U", + "B" + ], + [ + "▁w", + "orse" + ], + [ + "▁wor", + "se" + ], + [ + "im", + "o" + ], + [ + "i", + "mo" + ], + [ + "Ma", + "t" + ], + [ + "M", + "at" + ], + [ + "▁M", + "ex" + ], + [ + "▁Me", + "x" + ], + [ + "ac", + "tor" + ], + [ + "act", + "or" + ], + [ + "a", + "ctor" + ], + [ + "▁v", + "or" + ], + [ + "▁vo", + "r" + ], + [ + "▁", + "vor" + ], + [ + "ть", + "ся" + ], + [ + "▁exper", + "iment" + ], + [ + "▁experi", + "ment" + ], + [ + "▁P", + "lay" + ], + [ + "▁Pl", + "ay" + ], + [ + "▁", + "Play" + ], + [ + "▁An", + "other" + ], + [ + "▁happ", + "ens" + ], + [ + "▁happen", + "s" + ], + [ + "ua", + "n" + ], + [ + "u", + "an" + ], + [ + "▁pat", + "ients" + ], + [ + "▁patient", + "s" + ], + [ + "▁re", + "nd" + ], + [ + "▁r", + "end" + ], + [ + "▁ren", + "d" + ], + [ + "▁", + "rend" + ], + [ + "▁M", + "o" + ], + [ + "▁", + "Mo" + ], + [ + "▁T", + "ex" + ], + [ + "▁Te", + "x" + ], + [ + "▁", + "Tex" + ], + [ + "▁w", + "ed" + ], + [ + "▁we", + "d" + ], + [ + "▁", + "wed" + ], + [ + "t", + "n" + ], + [ + "in", + "sert" + ], + [ + "ins", + "ert" + ], + [ + "▁п", + "а" + ], + [ + "▁", + "па" + ], + [ + "▁an", + "ti" + ], + [ + "▁ant", + "i" + ], + [ + "▁", + "anti" + ], + [ + "Mat", + "ch" + ], + [ + "M", + "atch" + ], + [ + "ampions", + "hip" + ], + [ + "ampion", + "ship" + ], + [ + "▁for", + "ces" + ], + [ + "▁force", + "s" + ], + [ + "▁H", + "ot" + ], + [ + "▁Ho", + "t" + ], + [ + "▁", + "Hot" + ], + [ + "▁ph", + "ase" + ], + [ + "▁", + "phase" + ], + [ + "▁t", + "emplate" + ], + [ + "▁templ", + "ate" + ], + [ + "▁temp", + 
"late" + ], + [ + "▁", + "template" + ], + [ + "st", + "op" + ], + [ + "sto", + "p" + ], + [ + "s", + "top" + ], + [ + "ic", + "ated" + ], + [ + "ica", + "ted" + ], + [ + "icate", + "d" + ], + [ + "▁man", + "aged" + ], + [ + "▁manage", + "d" + ], + [ + "▁", + "managed" + ], + [ + "wa", + "it" + ], + [ + "w", + "ait" + ], + [ + "▁*", + "(" + ], + [ + "▁", + "*(" + ], + [ + "G", + "B" + ], + [ + "▁app", + "oint" + ], + [ + "▁ap", + "point" + ], + [ + "▁", + "appoint" + ], + [ + "ł", + "a" + ], + [ + "▁s", + "tick" + ], + [ + "▁st", + "ick" + ], + [ + "▁", + "stick" + ], + [ + "▁F", + "OR" + ], + [ + "▁FO", + "R" + ], + [ + "▁", + "FOR" + ], + [ + "▁V", + "is" + ], + [ + "▁Vi", + "s" + ], + [ + "▁", + "Vis" + ], + [ + "to", + "r" + ], + [ + "t", + "or" + ], + [ + "▁p", + "ř" + ], + [ + "qu", + "est" + ], + [ + "que", + "st" + ], + [ + "ques", + "t" + ], + [ + "q", + "uest" + ], + [ + "us", + "es" + ], + [ + "use", + "s" + ], + [ + "u", + "ses" + ], + [ + "\");", + "\r" + ], + [ + "\")", + ";\r" + ], + [ + "\"", + ");\r" + ], + [ + "▁sudden", + "ly" + ], + [ + "▁sud", + "denly" + ], + [ + "é", + "c" + ], + [ + "N", + "D" + ], + [ + "ur", + "op" + ], + [ + "uro", + "p" + ], + [ + "u", + "rop" + ], + [ + "ре", + "д" + ], + [ + "▁ins", + "urance" + ], + [ + "ac", + "cess" + ], + [ + "acc", + "ess" + ], + [ + "a", + "ccess" + ], + [ + "un", + "finished" + ], + [ + "▁t", + "amb" + ], + [ + "▁ta", + "mb" + ], + [ + "▁tam", + "b" + ], + [ + "▁s", + "ac" + ], + [ + "▁sa", + "c" + ], + [ + "▁C", + "ourt" + ], + [ + "▁Co", + "urt" + ], + [ + "▁Cour", + "t" + ], + [ + "▁Cou", + "rt" + ], + [ + "▁miss", + "ing" + ], + [ + "▁mis", + "sing" + ], + [ + "▁", + "missing" + ], + [ + "▁W", + "here" + ], + [ + "▁Wh", + "ere" + ], + [ + "▁Whe", + "re" + ], + [ + "▁", + "Where" + ], + [ + "▁S", + "um" + ], + [ + "▁Su", + "m" + ], + [ + "▁", + "Sum" + ], + [ + "}^", + "{\\" + ], + [ + "}^{", + "\\" + ], + [ + "}", + "^{\\" + ], + [ + "▁s", + "ua" + ], + [ + "▁su", + "a" + ], + [ + "_", + "," + ], + [ + "▁th", + "ick" + ], + [ + "▁Tr", + "ump" + ], + [ + "▁Tru", + "mp" + ], + [ + "▁oper", + "ations" + ], + [ + "▁operation", + "s" + ], + [ + "▁", + "operations" + ], + [ + "F", + "S" + ], + [ + "▁de", + "ux" + ], + [ + "d", + "z" + ], + [ + "Temp", + "late" + ], + [ + "T", + "emplate" + ], + [ + "▁\"", + "/" + ], + [ + "▁o", + "dd" + ], + [ + "▁od", + "d" + ], + [ + "▁", + "odd" + ], + [ + "▁re", + "ality" + ], + [ + "▁real", + "ity" + ], + [ + "▁te", + "ams" + ], + [ + "▁team", + "s" + ], + [ + "▁tea", + "ms" + ], + [ + "▁c", + "er" + ], + [ + "▁ce", + "r" + ], + [ + "▁", + "cer" + ], + [ + "om", + "a" + ], + [ + "o", + "ma" + ], + [ + "▁", + "și" + ], + [ + "▁cl", + "oud" + ], + [ + "▁clo", + "ud" + ], + [ + "▁", + "cloud" + ], + [ + "▁Dep", + "artment" + ], + [ + "N", + "e" + ], + [ + "▁requ", + "ires" + ], + [ + "▁require", + "s" + ], + [ + "it", + "ems" + ], + [ + "ite", + "ms" + ], + [ + "item", + "s" + ], + [ + "▁I", + "II" + ], + [ + "▁II", + "I" + ], + [ + "▁", + "III" + ], + [ + "right", + "arrow" + ], + [ + ")-", + ">" + ], + [ + ")", + "->" + ], + [ + "▁w", + "riter" + ], + [ + "▁wr", + "iter" + ], + [ + "▁writ", + "er" + ], + [ + "▁write", + "r" + ], + [ + "▁", + "writer" + ], + [ + "re", + "place" + ], + [ + "rep", + "lace" + ], + [ + "▁t", + "hr" + ], + [ + "▁th", + "r" + ], + [ + "je", + "n" + ], + [ + "j", + "en" + ], + [ + "▁o", + "t" + ], + [ + "▁", + "ot" + ], + [ + "▁occ", + "up" + ], + [ + "▁oc", + "cup" + ], + [ + "▁", + "occup" + ], + [ + "▁event", + "ually" + ], + [ + "▁M", + "ath" + ], + [ + 
"▁Mat", + "h" + ], + [ + "▁Ma", + "th" + ], + [ + "▁", + "Math" + ], + [ + "▁con", + "serv" + ], + [ + "▁cons", + "erv" + ], + [ + "▁conse", + "rv" + ], + [ + "am", + "er" + ], + [ + "ame", + "r" + ], + [ + "a", + "mer" + ], + [ + "▁F", + "ort" + ], + [ + "▁For", + "t" + ], + [ + "▁Fo", + "rt" + ], + [ + "▁d", + "ry" + ], + [ + "▁dr", + "y" + ], + [ + "▁sex", + "ual" + ], + [ + "▁co", + "sts" + ], + [ + "▁cost", + "s" + ], + [ + "▁cos", + "ts" + ], + [ + "▁for", + "ms" + ], + [ + "▁form", + "s" + ], + [ + "▁", + "forms" + ], + [ + "▁V", + "ict" + ], + [ + "▁Vi", + "ct" + ], + [ + "▁Vic", + "t" + ], + [ + "PA", + "R" + ], + [ + "P", + "AR" + ], + [ + "frame", + "work" + ], + [ + "▁д", + "и" + ], + [ + "▁", + "ди" + ], + [ + "Oper", + "ation" + ], + [ + "з", + "на" + ], + [ + "wh", + "ich" + ], + [ + "▁t", + "ight" + ], + [ + "▁ti", + "ght" + ], + [ + "In", + "valid" + ], + [ + "▁part", + "ner" + ], + [ + "▁п", + "ред" + ], + [ + "▁пре", + "д" + ], + [ + "▁th", + "ank" + ], + [ + "▁than", + "k" + ], + [ + "▁gu", + "ard" + ], + [ + "▁", + "guard" + ], + [ + "he", + "m" + ], + [ + "h", + "em" + ], + [ + "Bo", + "dy" + ], + [ + "B", + "ody" + ], + [ + "▁e", + "mot" + ], + [ + "▁em", + "ot" + ], + [ + "I", + "X" + ], + [ + "fa", + "st" + ], + [ + "fas", + "t" + ], + [ + "f", + "ast" + ], + [ + "щ", + "о" + ], + [ + "ñ", + "o" + ], + [ + "ni", + "ght" + ], + [ + "n", + "ight" + ], + [ + "▁S", + "ci" + ], + [ + "▁Sc", + "i" + ], + [ + "ни", + "ка" + ], + [ + "ник", + "а" + ], + [ + "▁T", + "O" + ], + [ + "▁", + "TO" + ], + [ + "▁individ", + "uals" + ], + [ + "▁individual", + "s" + ], + [ + "сс", + "и" + ], + [ + "с", + "си" + ], + [ + "})", + "," + ], + [ + "}", + ")," + ], + [ + "F", + "alse" + ], + [ + "(\"", + "%" + ], + [ + "(", + "\"%" + ], + [ + "▁op", + "tim" + ], + [ + "▁opt", + "im" + ], + [ + "▁", + "optim" + ], + [ + "▁-", + "->" + ], + [ + "▁--", + ">" + ], + [ + "▁", + "-->" + ], + [ + "▁f", + "actor" + ], + [ + "▁fact", + "or" + ], + [ + "▁fac", + "tor" + ], + [ + "▁fa", + "ctor" + ], + [ + "▁", + "factor" + ], + [ + "▁sm", + "aller" + ], + [ + "▁small", + "er" + ], + [ + "▁con", + "tain" + ], + [ + "▁cont", + "ain" + ], + [ + "sp", + "ect" + ], + [ + "spec", + "t" + ], + [ + "spe", + "ct" + ], + [ + "s", + "pect" + ], + [ + "Eng", + "ine" + ], + [ + "▁ann", + "ounced" + ], + [ + "▁announ", + "ced" + ], + [ + "▁announce", + "d" + ], + [ + "▁Dem", + "ocr" + ], + [ + "▁r", + "ob" + ], + [ + "▁ro", + "b" + ], + [ + "▁", + "rob" + ], + [ + "▁f", + "lat" + ], + [ + "▁fl", + "at" + ], + [ + "▁", + "flat" + ], + [ + "os", + "oph" + ], + [ + "oso", + "ph" + ], + [ + "Se", + "arch" + ], + [ + "S", + "earch" + ], + [ + "ah", + "l" + ], + [ + "a", + "hl" + ], + [ + "▁Ex", + "ception" + ], + [ + "▁Except", + "ion" + ], + [ + "▁", + "Exception" + ], + [ + "▁O", + "l" + ], + [ + "equ", + "als" + ], + [ + "eq", + "uals" + ], + [ + "equal", + "s" + ], + [ + "▁un", + "ter" + ], + [ + "▁unt", + "er" + ], + [ + "▁", + "unter" + ], + [ + "sh", + "ape" + ], + [ + "sha", + "pe" + ], + [ + "N", + "S" + ], + [ + "Ob", + "j" + ], + [ + "▁spec", + "ies" + ], + [ + "▁spe", + "cies" + ], + [ + "we", + "ight" + ], + [ + "wei", + "ght" + ], + [ + "w", + "eight" + ], + [ + "yo", + "u" + ], + [ + "y", + "ou" + ], + [ + "▁e", + "ste" + ], + [ + "▁est", + "e" + ], + [ + "▁es", + "te" + ], + [ + "▁", + "este" + ], + [ + "▁V", + "iew" + ], + [ + "▁Vi", + "ew" + ], + [ + "▁", + "View" + ], + [ + "▁m", + "ission" + ], + [ + "▁miss", + "ion" + ], + [ + "▁", + "mission" + ], + [ + "▁j", + "ournal" + ], + [ + "▁jour", + 
"nal" + ], + [ + "▁", + "journal" + ], + [ + "Value", + "s" + ], + [ + "Val", + "ues" + ], + [ + "▁ein", + "em" + ], + [ + "▁eine", + "m" + ], + [ + "is", + "mo" + ], + [ + "ism", + "o" + ], + [ + "▁project", + "s" + ], + [ + "▁", + "projects" + ], + [ + "▁D", + "as" + ], + [ + "▁Da", + "s" + ], + [ + "ri", + "ble" + ], + [ + "rib", + "le" + ], + [ + "r", + "ible" + ], + [ + "▁s", + "erve" + ], + [ + "▁ser", + "ve" + ], + [ + "▁serv", + "e" + ], + [ + "▁", + "serve" + ], + [ + "▁op", + "ening" + ], + [ + "▁open", + "ing" + ], + [ + "▁h", + "ur" + ], + [ + "▁program", + "s" + ], + [ + "▁U", + "SA" + ], + [ + "▁US", + "A" + ], + [ + "▁", + "USA" + ], + [ + "il", + "iar" + ], + [ + "ili", + "ar" + ], + [ + "ilia", + "r" + ], + [ + "id", + "os" + ], + [ + "ido", + "s" + ], + [ + "B", + "r" + ], + [ + "est", + "amp" + ], + [ + "esta", + "mp" + ], + [ + "▁t", + "ools" + ], + [ + "▁to", + "ols" + ], + [ + "▁too", + "ls" + ], + [ + "▁tool", + "s" + ], + [ + "▁", + "tools" + ], + [ + "an", + "ner" + ], + [ + "ann", + "er" + ], + [ + "anne", + "r" + ], + [ + "R", + "T" + ], + [ + "▁St", + "art" + ], + [ + "▁Star", + "t" + ], + [ + "▁Sta", + "rt" + ], + [ + "▁", + "Start" + ], + [ + "▁b", + "ath" + ], + [ + "▁bat", + "h" + ], + [ + "▁ba", + "th" + ], + [ + "▁coff", + "ee" + ], + [ + "or", + "ter" + ], + [ + "ort", + "er" + ], + [ + "orte", + "r" + ], + [ + "in", + "ternal" + ], + [ + "inter", + "nal" + ], + [ + "intern", + "al" + ], + [ + "file", + "s" + ], + [ + "fil", + "es" + ], + [ + "fi", + "les" + ], + [ + "f", + "iles" + ], + [ + "IN", + "VAL" + ], + [ + "ak", + "o" + ], + [ + "a", + "ko" + ], + [ + "d", + "t" + ], + [ + "▁Se", + "cond" + ], + [ + "▁Sec", + "ond" + ], + [ + "▁", + "Second" + ], + [ + "▁al", + "loc" + ], + [ + "▁all", + "oc" + ], + [ + "▁", + "alloc" + ], + [ + "▁en", + "ded" + ], + [ + "▁end", + "ed" + ], + [ + "▁ende", + "d" + ], + [ + "▁", + "ended" + ], + [ + "ac", + "ional" + ], + [ + "aci", + "onal" + ], + [ + "acion", + "al" + ], + [ + "acio", + "nal" + ], + [ + "▁man", + "ager" + ], + [ + "▁manage", + "r" + ], + [ + "▁", + "manager" + ], + [ + "▁S", + "un" + ], + [ + "▁Su", + "n" + ], + [ + "▁", + "Sun" + ], + [ + "ag", + "g" + ], + [ + "a", + "gg" + ], + [ + "▁le", + "ader" + ], + [ + "▁lead", + "er" + ], + [ + "ol", + "ved" + ], + [ + "olve", + "d" + ], + [ + "olv", + "ed" + ], + [ + "▁ч", + "то" + ], + [ + "▁trad", + "itional" + ], + [ + "▁tradition", + "al" + ], + [ + "sh", + "ot" + ], + [ + "s", + "hot" + ], + [ + "ru", + "p" + ], + [ + "r", + "up" + ], + [ + "C", + "F" + ], + [ + "▁E", + "ach" + ], + [ + "▁", + "Each" + ], + [ + "w", + "r" + ], + [ + "▁S", + "om" + ], + [ + "▁So", + "m" + ], + [ + "▁", + "Som" + ], + [ + "▁material", + "s" + ], + [ + "▁mater", + "ials" + ], + [ + "▁m", + "sg" + ], + [ + "▁ms", + "g" + ], + [ + "▁", + "msg" + ], + [ + "▁s", + "yn" + ], + [ + "▁sy", + "n" + ], + [ + "▁", + "syn" + ], + [ + "▁produ", + "ce" + ], + [ + "▁prod", + "uce" + ], + [ + "▁st", + "orage" + ], + [ + "▁stor", + "age" + ], + [ + "▁sto", + "rage" + ], + [ + "▁", + "storage" + ], + [ + "sub", + "section" + ], + [ + "▁S", + "ie" + ], + [ + "▁Si", + "e" + ], + [ + "▁I", + "P" + ], + [ + "▁", + "IP" + ], + [ + "CE", + "SS" + ], + [ + "▁w", + "a" + ], + [ + "▁", + "wa" + ], + [ + "Re", + "cord" + ], + [ + "Rec", + "ord" + ], + [ + "▁mark", + "eting" + ], + [ + "▁market", + "ing" + ], + [ + "pl", + "et" + ], + [ + "ple", + "t" + ], + [ + "p", + "let" + ], + [ + "D", + "ialog" + ], + [ + "▁mention", + "ed" + ], + [ + "▁ment", + "ioned" + ], + [ + "▁N", + "a" + ], + [ + 
"▁", + "Na" + ], + [ + "▁Un", + "ion" + ], + [ + "▁", + "Union" + ], + [ + "▁A", + "PI" + ], + [ + "▁AP", + "I" + ], + [ + "▁", + "API" + ], + [ + "▁neg", + "ative" + ], + [ + "▁", + "negative" + ], + [ + "tx", + "t" + ], + [ + "t", + "xt" + ], + [ + "▁eas", + "ier" + ], + [ + "le", + "gal" + ], + [ + "leg", + "al" + ], + [ + "De", + "p" + ], + [ + "D", + "ep" + ], + [ + "▁no", + "vel" + ], + [ + "▁nov", + "el" + ], + [ + "▁nove", + "l" + ], + [ + "eu", + "r" + ], + [ + "e", + "ur" + ], + [ + "ac", + "ió" + ], + [ + "aci", + "ó" + ], + [ + "a", + "ció" + ], + [ + "▁B", + "ud" + ], + [ + "▁Bu", + "d" + ], + [ + "▁c", + "arry" + ], + [ + "▁car", + "ry" + ], + [ + "sch", + "aft" + ], + [ + "s", + "chaft" + ], + [ + "▁br", + "oken" + ], + [ + "▁bro", + "ken" + ], + [ + "▁broke", + "n" + ], + [ + "▁t", + "rees" + ], + [ + "▁tr", + "ees" + ], + [ + "▁tre", + "es" + ], + [ + "▁tree", + "s" + ], + [ + ">(", + ");" + ], + [ + ">()", + ";" + ], + [ + ">", + "();" + ], + [ + "▁e", + "mb" + ], + [ + "▁em", + "b" + ], + [ + "▁", + "emb" + ], + [ + "ie", + "der" + ], + [ + "ied", + "er" + ], + [ + "i", + "eder" + ], + [ + "▁r", + "oute" + ], + [ + "▁ro", + "ute" + ], + [ + "▁rout", + "e" + ], + [ + "▁rou", + "te" + ], + [ + "▁", + "route" + ], + [ + "ik", + "el" + ], + [ + "ike", + "l" + ], + [ + "i", + "kel" + ], + [ + "▁l", + "isten" + ], + [ + "▁li", + "sten" + ], + [ + "▁list", + "en" + ], + [ + "▁", + "listen" + ], + [ + "ash", + "ion" + ], + [ + "ashi", + "on" + ], + [ + "▁M", + "rs" + ], + [ + "▁Mr", + "s" + ], + [ + "▁equip", + "ment" + ], + [ + "ag", + "ger" + ], + [ + "agg", + "er" + ], + [ + "▁T", + "hus" + ], + [ + "▁Th", + "us" + ], + [ + "▁mat", + "rix" + ], + [ + "▁", + "matrix" + ], + [ + "al", + "la" + ], + [ + "all", + "a" + ], + [ + "a", + "lla" + ], + [ + "▁T", + "our" + ], + [ + "▁To", + "ur" + ], + [ + "▁con", + "versation" + ], + [ + "▁convers", + "ation" + ], + [ + "Mo", + "n" + ], + [ + "M", + "on" + ], + [ + "our", + "nal" + ], + [ + "▁min", + "ute" + ], + [ + "▁minut", + "e" + ], + [ + "▁", + "minute" + ], + [ + "A", + "m" + ], + [ + "Ap", + "i" + ], + [ + "A", + "pi" + ], + [ + "▁for", + "get" + ], + [ + "▁forg", + "et" + ], + [ + "M", + "e" + ], + [ + "lev", + "ant" + ], + [ + "te", + "mp" + ], + [ + "tem", + "p" + ], + [ + "t", + "emp" + ], + [ + "▁t", + "elling" + ], + [ + "▁tell", + "ing" + ], + [ + "▁tel", + "ling" + ], + [ + "mo", + "ve" + ], + [ + "mov", + "e" + ], + [ + "m", + "ove" + ], + [ + "▁in", + "dependent" + ], + [ + "▁independ", + "ent" + ], + [ + "to", + "String" + ], + [ + "ed", + "it" + ], + [ + "edi", + "t" + ], + [ + "e", + "dit" + ], + [ + "▁J", + "ac" + ], + [ + "▁Ja", + "c" + ], + [ + "az", + "z" + ], + [ + "a", + "zz" + ], + [ + "re", + "act" + ], + [ + "rea", + "ct" + ], + [ + "▁c", + "in" + ], + [ + "▁ci", + "n" + ], + [ + "▁", + "cin" + ], + [ + "▁P", + "rov" + ], + [ + "▁Pro", + "v" + ], + [ + "▁Pr", + "ov" + ], + [ + "▁", + "Prov" + ], + [ + "is", + "ted" + ], + [ + "ist", + "ed" + ], + [ + "iste", + "d" + ], + [ + "i", + "sted" + ], + [ + "▁h", + "ash" + ], + [ + "▁has", + "h" + ], + [ + "▁ha", + "sh" + ], + [ + "▁", + "hash" + ], + [ + "on", + "na" + ], + [ + "ik", + "i" + ], + [ + "i", + "ki" + ], + [ + "▁gener", + "ated" + ], + [ + "▁generate", + "d" + ], + [ + "▁gene", + "rated" + ], + [ + "▁", + "generated" + ], + [ + "Re", + "nder" + ], + [ + "Rend", + "er" + ], + [ + "R", + "ender" + ], + [ + "▁psy", + "ch" + ], + [ + "▁ps", + "ych" + ], + [ + "na", + "v" + ], + [ + "n", + "av" + ], + [ + "▁en", + "tr" + ], + [ + "▁ent", + "r" + ], + [ + 
"▁", + "entr" + ], + [ + "п", + "ра" + ], + [ + "r", + "x" + ], + [ + "AT", + "H" + ], + [ + "A", + "TH" + ], + [ + "▁ass", + "ume" + ], + [ + "▁assum", + "e" + ], + [ + "Tr", + "ee" + ], + [ + "T", + "ree" + ], + [ + "semb", + "ly" + ], + [ + "sembl", + "y" + ], + [ + "▁M", + "att" + ], + [ + "▁Mat", + "t" + ], + [ + "▁Ma", + "tt" + ], + [ + "ca", + "ption" + ], + [ + "c", + "aption" + ], + [ + "▁s", + "olutions" + ], + [ + "▁solution", + "s" + ], + [ + "▁fa", + "ith" + ], + [ + "▁fait", + "h" + ], + [ + "▁dig", + "ital" + ], + [ + "▁digit", + "al" + ], + [ + "▁ex", + "cell" + ], + [ + "▁exc", + "ell" + ], + [ + "▁V", + "ersion" + ], + [ + "▁Vers", + "ion" + ], + [ + "▁", + "Version" + ], + [ + "De", + "bug" + ], + [ + "D", + "ebug" + ], + [ + "▁ж", + "и" + ], + [ + "▁", + "жи" + ], + [ + "▁car", + "ried" + ], + [ + "re", + "set" + ], + [ + "res", + "et" + ], + [ + "▁slow", + "ly" + ], + [ + "an", + "cing" + ], + [ + "anc", + "ing" + ], + [ + "▁own", + "er" + ], + [ + "▁", + "owner" + ], + [ + "▁T", + "er" + ], + [ + "▁Te", + "r" + ], + [ + "▁D", + "id" + ], + [ + "▁Di", + "d" + ], + [ + "▁", + "Did" + ], + [ + "▁g", + "est" + ], + [ + "▁ge", + "st" + ], + [ + "▁ges", + "t" + ], + [ + "▁", + "gest" + ], + [ + "▁é", + "té" + ], + [ + "▁ét", + "é" + ], + [ + "▁", + "été" + ], + [ + "▁pro", + "of" + ], + [ + "▁", + "proof" + ], + [ + "F", + "ont" + ], + [ + "▁n", + "ob" + ], + [ + "▁no", + "b" + ], + [ + "▁", + "nob" + ], + [ + "C", + "o" + ], + [ + "▁G", + "NU" + ], + [ + "▁l", + "iber" + ], + [ + "▁li", + "ber" + ], + [ + "▁lib", + "er" + ], + [ + "it", + "ness" + ], + [ + "▁h", + "ij" + ], + [ + "▁hi", + "j" + ], + [ + "▁v", + "ert" + ], + [ + "▁ver", + "t" + ], + [ + "▁ve", + "rt" + ], + [ + "▁", + "vert" + ], + [ + "ш", + "а" + ], + [ + "FL", + "AG" + ], + [ + "ME", + "NT" + ], + [ + "M", + "ENT" + ], + [ + "▁S", + "on" + ], + [ + "▁So", + "n" + ], + [ + "Mu", + "lt" + ], + [ + "M", + "ult" + ], + [ + "▁d", + "istrict" + ], + [ + "▁di", + "strict" + ], + [ + "▁dist", + "rict" + ], + [ + "conne", + "ct" + ], + [ + "conn", + "ect" + ], + [ + "ject", + "ion" + ], + [ + "je", + "ction" + ], + [ + "j", + "ection" + ], + [ + "ly", + "mp" + ], + [ + "▁real", + "ized" + ], + [ + "▁realize", + "d" + ], + [ + "▁realiz", + "ed" + ], + [ + "mo", + "s" + ], + [ + "m", + "os" + ], + [ + "y", + "e" + ], + [ + "▁re", + "nder" + ], + [ + "▁r", + "ender" + ], + [ + "▁ren", + "der" + ], + [ + "▁rend", + "er" + ], + [ + "▁", + "render" + ], + [ + "ri", + "o" + ], + [ + "r", + "io" + ], + [ + "▁inter", + "pret" + ], + [ + "▁", + "interpret" + ], + [ + "▁slight", + "ly" + ], + [ + "fi", + "x" + ], + [ + "f", + "ix" + ], + [ + "▁stud", + "ies" + ], + [ + "▁r", + "id" + ], + [ + "▁ri", + "d" + ], + [ + "▁", + "rid" + ], + [ + "at", + "re" + ], + [ + "atr", + "e" + ], + [ + "a", + "tre" + ], + [ + "▁benef", + "its" + ], + [ + "▁benefit", + "s" + ], + [ + "▁F", + "ace" + ], + [ + "▁Fa", + "ce" + ], + [ + "▁Fac", + "e" + ], + [ + "▁", + "Face" + ], + [ + "iv", + "ery" + ], + [ + "ive", + "ry" + ], + [ + "iver", + "y" + ], + [ + "i", + "very" + ], + [ + "ри", + "я" + ], + [ + "doc", + "ument" + ], + [ + "d", + "ocument" + ], + [ + "▁as", + "king" + ], + [ + "▁ask", + "ing" + ], + [ + "La", + "st" + ], + [ + "L", + "ast" + ], + [ + "ar", + "ante" + ], + [ + "ara", + "nte" + ], + [ + "aran", + "te" + ], + [ + "▁Mart", + "in" + ], + [ + "▁E", + "ll" + ], + [ + "▁El", + "l" + ], + [ + "▁v", + "ector" + ], + [ + "▁ve", + "ctor" + ], + [ + "▁vec", + "tor" + ], + [ + "▁", + "vector" + ], + [ + "▁for", + "ced" + ], + [ + 
"▁force", + "d" + ], + [ + "▁", + "forced" + ], + [ + "о", + "ло" + ], + [ + "P", + "H" + ], + [ + "W", + "R" + ], + [ + "▁K", + "l" + ], + [ + "▁s", + "ky" + ], + [ + "▁sk", + "y" + ], + [ + "▁", + "sky" + ], + [ + "▁str", + "ategy" + ], + [ + "▁strateg", + "y" + ], + [ + "▁strat", + "egy" + ], + [ + "oc", + "ked" + ], + [ + "ock", + "ed" + ], + [ + "▁ne", + "ck" + ], + [ + "ś", + "ci" + ], + [ + "O", + "UT" + ], + [ + "))", + "," + ], + [ + ")", + ")," + ], + [ + "C", + "ustom" + ], + [ + "▁w", + "ie" + ], + [ + "▁", + "wie" + ], + [ + "▁s", + "weet" + ], + [ + "▁swe", + "et" + ], + [ + "▁t", + "emp" + ], + [ + "▁te", + "mp" + ], + [ + "▁tem", + "p" + ], + [ + "▁", + "temp" + ], + [ + "▁fore", + "ign" + ], + [ + "▁h", + "all" + ], + [ + "▁ha", + "ll" + ], + [ + "▁hal", + "l" + ], + [ + "▁", + "hall" + ], + [ + "as", + "tr" + ], + [ + "ast", + "r" + ], + [ + "a", + "str" + ], + [ + "As", + "s" + ], + [ + "A", + "ss" + ], + [ + "MO", + "DE" + ], + [ + "MOD", + "E" + ], + [ + "▁max", + "imum" + ], + [ + "▁maxim", + "um" + ], + [ + "an", + "nels" + ], + [ + "ann", + "els" + ], + [ + "annel", + "s" + ], + [ + "anne", + "ls" + ], + [ + "▁t", + "ip" + ], + [ + "▁ti", + "p" + ], + [ + "▁", + "tip" + ], + [ + "▁second", + "s" + ], + [ + "▁sec", + "onds" + ], + [ + "▁", + "seconds" + ], + [ + "▁st", + "ack" + ], + [ + "▁sta", + "ck" + ], + [ + "▁", + "stack" + ], + [ + "ig", + "a" + ], + [ + "i", + "ga" + ], + [ + "▁r", + "aise" + ], + [ + "▁rais", + "e" + ], + [ + "▁ra", + "ise" + ], + [ + "▁", + "raise" + ], + [ + "en", + "able" + ], + [ + "ena", + "ble" + ], + [ + "oi", + "r" + ], + [ + "o", + "ir" + ], + [ + "▁s", + "oul" + ], + [ + "▁so", + "ul" + ], + [ + "▁sou", + "l" + ], + [ + "K", + "e" + ], + [ + ")$", + "." + ], + [ + ")", + "$." + ], + [ + "▁T", + "im" + ], + [ + "▁Ti", + "m" + ], + [ + "▁", + "Tim" + ], + [ + "AL", + "SE" + ], + [ + "is", + "er" + ], + [ + "ise", + "r" + ], + [ + "i", + "ser" + ], + [ + "cont", + "in" + ], + [ + "be", + "l" + ], + [ + "b", + "el" + ], + [ + "▁m", + "ad" + ], + [ + "▁ma", + "d" + ], + [ + "▁", + "mad" + ], + [ + "lic", + "hen" + ], + [ + "li", + "chen" + ], + [ + "lich", + "en" + ], + [ + "liche", + "n" + ], + [ + "l", + "ichen" + ], + [ + "ab", + "e" + ], + [ + "a", + "be" + ], + [ + "sa", + "fe" + ], + [ + "▁con", + "cent" + ], + [ + "▁conc", + "ent" + ], + [ + "▁conce", + "nt" + ], + [ + "bo", + "und" + ], + [ + "b", + "ound" + ], + [ + "▁R", + "equ" + ], + [ + "▁Re", + "qu" + ], + [ + "▁", + "Requ" + ], + [ + "sw", + "itch" + ], + [ + "▁st", + "one" + ], + [ + "▁sto", + "ne" + ], + [ + "▁", + "stone" + ], + [ + "▁trans", + "l" + ], + [ + "▁", + "transl" + ], + [ + "▁v", + "ac" + ], + [ + "▁va", + "c" + ], + [ + "an", + "don" + ], + [ + "and", + "on" + ], + [ + "ando", + "n" + ], + [ + "▁F", + "ore" + ], + [ + "▁For", + "e" + ], + [ + "▁Fo", + "re" + ], + [ + "▁", + "Fore" + ], + [ + "▁s", + "ounds" + ], + [ + "▁sound", + "s" + ], + [ + "▁P", + "op" + ], + [ + "▁Po", + "p" + ], + [ + "▁", + "Pop" + ], + [ + "▁H", + "T" + ], + [ + "▁", + "HT" + ], + [ + "li", + "a" + ], + [ + "l", + "ia" + ], + [ + "en", + "ter" + ], + [ + "ent", + "er" + ], + [ + "ente", + "r" + ], + [ + "▁hel", + "ps" + ], + [ + "▁help", + "s" + ], + [ + "ed", + "y" + ], + [ + "e", + "dy" + ], + [ + "ст", + "вен" + ], + [ + "ств", + "ен" + ], + [ + "стве", + "н" + ], + [ + "an", + "ted" + ], + [ + "ant", + "ed" + ], + [ + "ante", + "d" + ], + [ + "▁I", + "ts" + ], + [ + "▁It", + "s" + ], + [ + "▁St", + "ep" + ], + [ + "▁Ste", + "p" + ], + [ + "▁", + "Step" + ], + [ + "I", + "con" 
+ ], + [ + "▁EX", + "PECT" + ], + [ + "▁", + "EXPECT" + ], + [ + "ial", + "ized" + ], + [ + "ialize", + "d" + ], + [ + "Pos", + "t" + ], + [ + "Po", + "st" + ], + [ + "P", + "ost" + ], + [ + "az", + "e" + ], + [ + "a", + "ze" + ], + [ + "▁Car", + "ol" + ], + [ + "▁Ca", + "rol" + ], + [ + "▁re", + "q" + ], + [ + "▁r", + "eq" + ], + [ + "▁", + "req" + ], + [ + "▁crit", + "ical" + ], + [ + "▁critic", + "al" + ], + [ + "D", + "S" + ], + [ + "▁se", + "at" + ], + [ + "▁sea", + "t" + ], + [ + "ap", + "ed" + ], + [ + "ape", + "d" + ], + [ + "a", + "ped" + ], + [ + "▁up", + "per" + ], + [ + "▁upp", + "er" + ], + [ + "▁", + "upper" + ], + [ + "▁S", + "y" + ], + [ + "▁", + "Sy" + ], + [ + "▁ex", + "plain" + ], + [ + "▁expl", + "ain" + ], + [ + "▁'", + "./" + ], + [ + "▁'.", + "/" + ], + [ + "ut", + "ils" + ], + [ + "util", + "s" + ], + [ + "uti", + "ls" + ], + [ + "poss", + "ible" + ], + [ + "▁d", + "ont" + ], + [ + "▁do", + "nt" + ], + [ + "▁don", + "t" + ], + [ + "H", + "ost" + ], + [ + "▁appro", + "xim" + ], + [ + "▁approx", + "im" + ], + [ + "As", + "ync" + ], + [ + "A", + "sync" + ], + [ + "▁g", + "rab" + ], + [ + "▁gr", + "ab" + ], + [ + "▁gra", + "b" + ], + [ + "▁s", + "ources" + ], + [ + "▁source", + "s" + ], + [ + "▁sour", + "ces" + ], + [ + "▁", + "sources" + ], + [ + "▁M", + "os" + ], + [ + "▁Mo", + "s" + ], + [ + "▁Germ", + "any" + ], + [ + "▁German", + "y" + ], + [ + "▁Ger", + "many" + ], + [ + "▁r", + "ub" + ], + [ + "▁ru", + "b" + ], + [ + "▁", + "rub" + ], + [ + "CH", + "AN" + ], + [ + "▁r", + "ain" + ], + [ + "▁ra", + "in" + ], + [ + "▁tr", + "uly" + ], + [ + "▁join", + "ed" + ], + [ + "▁jo", + "ined" + ], + [ + "▁<", + "?" + ], + [ + "▁", + "" + ], + [ + "_", + "->" + ], + [ + "ag", + "nost" + ], + [ + "agn", + "ost" + ], + [ + "▁pro", + "posed" + ], + [ + "▁prop", + "osed" + ], + [ + "▁propos", + "ed" + ], + [ + "▁propose", + "d" + ], + [ + "▁G", + "ame" + ], + [ + "▁Ga", + "me" + ], + [ + "▁Gam", + "e" + ], + [ + "▁", + "Game" + ], + [ + "▁eff", + "orts" + ], + [ + "▁effort", + "s" + ], + [ + "в", + "я" + ], + [ + "t", + "c" + ], + [ + "с", + "к" + ], + [ + "▁int", + "ent" + ], + [ + "▁inte", + "nt" + ], + [ + "▁", + "intent" + ], + [ + "▁B", + "re" + ], + [ + "▁Br", + "e" + ], + [ + "is", + "c" + ], + [ + "i", + "sc" + ], + [ + "▁pro", + "test" + ], + [ + "▁prote", + "st" + ], + [ + "▁prot", + "est" + ], + [ + "▁h", + "olds" + ], + [ + "▁hold", + "s" + ], + [ + "▁hol", + "ds" + ], + [ + "▁", + "holds" + ], + [ + "om", + "etry" + ], + [ + "ome", + "try" + ], + [ + "omet", + "ry" + ], + [ + "o", + "metry" + ], + [ + "▁H", + "ave" + ], + [ + "▁Ha", + "ve" + ], + [ + "▁Hav", + "e" + ], + [ + "▁", + "Have" + ], + [ + "▁de", + "tail" + ], + [ + "▁det", + "ail" + ], + [ + "▁", + "detail" + ], + [ + "▁WIT", + "HOUT" + ], + [ + "▁WITH", + "OUT" + ], + [ + "ye", + "r" + ], + [ + "y", + "er" + ], + [ + "▁K", + "on" + ], + [ + "▁Ko", + "n" + ], + [ + "▁not", + "iced" + ], + [ + "▁notice", + "d" + ], + [ + "▁require", + "ments" + ], + [ + "▁requirement", + "s" + ], + [ + "DE", + "BUG" + ], + [ + "ki", + "ns" + ], + [ + "kin", + "s" + ], + [ + "k", + "ins" + ], + [ + "▁S", + "pan" + ], + [ + "▁Sp", + "an" + ], + [ + "▁", + "Span" + ], + [ + "▁c", + "ars" + ], + [ + "▁car", + "s" + ], + [ + "▁ca", + "rs" + ], + [ + "me", + "ta" + ], + [ + "met", + "a" + ], + [ + "m", + "eta" + ], + [ + "▁k", + "il" + ], + [ + "▁ki", + "l" + ], + [ + "▁", + "kil" + ], + [ + "▁B", + "ron" + ], + [ + "▁Br", + "on" + ], + [ + "▁Bro", + "n" + ], + [ + "▁experience", + "d" + ], + [ + "▁experi", + "enced" + ], + [ + 
"▁re", + "mind" + ], + [ + "▁rem", + "ind" + ], + [ + "our", + "se" + ], + [ + "ours", + "e" + ], + [ + "▁W", + "estern" + ], + [ + "▁West", + "ern" + ], + [ + "▁Wes", + "tern" + ], + [ + "ter", + "ed" + ], + [ + "te", + "red" + ], + [ + "tere", + "d" + ], + [ + "t", + "ered" + ], + [ + "▁dev", + "ices" + ], + [ + "▁device", + "s" + ], + [ + "▁", + "devices" + ], + [ + "▁pict", + "ures" + ], + [ + "▁picture", + "s" + ], + [ + "▁t", + "ut" + ], + [ + "▁tu", + "t" + ], + [ + "\"", + "`" + ], + [ + "▁im", + "possible" + ], + [ + "▁r", + "ail" + ], + [ + "▁ra", + "il" + ], + [ + "▁fe", + "els" + ], + [ + "▁feel", + "s" + ], + [ + "▁fee", + "ls" + ], + [ + "ic", + "as" + ], + [ + "ica", + "s" + ], + [ + "i", + "cas" + ], + [ + "il", + "ling" + ], + [ + "ill", + "ing" + ], + [ + "▁acc", + "ident" + ], + [ + "▁'", + "@" + ], + [ + "____", + "____" + ], + [ + "▁n", + "otes" + ], + [ + "▁not", + "es" + ], + [ + "▁no", + "tes" + ], + [ + "▁note", + "s" + ], + [ + "▁", + "notes" + ], + [ + "om", + "an" + ], + [ + "oma", + "n" + ], + [ + "o", + "man" + ], + [ + "Par", + "ser" + ], + [ + "Parse", + "r" + ], + [ + "Pars", + "er" + ], + [ + "▁dis", + "covered" + ], + [ + "▁discover", + "ed" + ], + [ + "▁R", + "oman" + ], + [ + "▁Rom", + "an" + ], + [ + "▁Ro", + "man" + ], + [ + "▁Roma", + "n" + ], + [ + "▁bud", + "get" + ], + [ + "▁gu", + "ide" + ], + [ + "▁guid", + "e" + ], + [ + "ki", + "ng" + ], + [ + "kin", + "g" + ], + [ + "k", + "ing" + ], + [ + "▁in", + "cred" + ], + [ + "▁inc", + "red" + ], + [ + "▁incre", + "d" + ], + [ + "ol", + "ar" + ], + [ + "ola", + "r" + ], + [ + "o", + "lar" + ], + [ + "en", + "den" + ], + [ + "end", + "en" + ], + [ + "ende", + "n" + ], + [ + "Des", + "c" + ], + [ + "De", + "sc" + ], + [ + "D", + "esc" + ], + [ + "▁w", + "ave" + ], + [ + "▁wa", + "ve" + ], + [ + "▁", + "wave" + ], + [ + "б", + "ли" + ], + [ + "ig", + "t" + ], + [ + "i", + "gt" + ], + [ + "▁re", + "strict" + ], + [ + "▁rest", + "rict" + ], + [ + "▁restr", + "ict" + ], + [ + "▁R", + "et" + ], + [ + "▁Re", + "t" + ], + [ + "▁", + "Ret" + ], + [ + "▁m", + "ac" + ], + [ + "▁ma", + "c" + ], + [ + "▁", + "mac" + ], + [ + "у", + "р" + ], + [ + "B", + "S" + ], + [ + "í", + "s" + ], + [ + "▁gener", + "ation" + ], + [ + "de", + "m" + ], + [ + "d", + "em" + ], + [ + "al", + "o" + ], + [ + "a", + "lo" + ], + [ + "б", + "ра" + ], + [ + "▁order", + "ed" + ], + [ + "▁ord", + "ered" + ], + [ + "▁", + "ordered" + ], + [ + "dr", + "op" + ], + [ + "dro", + "p" + ], + [ + "d", + "rop" + ], + [ + "▁p", + "p" + ], + [ + "▁", + "pp" + ], + [ + "▁Re", + "view" + ], + [ + "▁Rev", + "iew" + ], + [ + "▁", + "Review" + ], + [ + "▁liter", + "ally" + ], + [ + "▁literal", + "ly" + ], + [ + "▁S", + "ir" + ], + [ + "▁Si", + "r" + ], + [ + "▁", + "Sir" + ], + [ + "▁Y", + "eah" + ], + [ + "▁Ye", + "ah" + ], + [ + "▁", + "Yeah" + ], + [ + "▁d", + "ensity" + ], + [ + "▁dens", + "ity" + ], + [ + "▁", + "density" + ], + [ + "ri", + "z" + ], + [ + "r", + "iz" + ], + [ + "in", + "de" + ], + [ + "ind", + "e" + ], + [ + "i", + "nde" + ], + [ + "▁g", + "ain" + ], + [ + "▁ga", + "in" + ], + [ + "▁", + "gain" + ], + [ + "▁p", + "anel" + ], + [ + "▁pan", + "el" + ], + [ + "▁pa", + "nel" + ], + [ + "▁", + "panel" + ], + [ + "je", + "t" + ], + [ + "j", + "et" + ], + [ + "▁T", + "imes" + ], + [ + "▁Time", + "s" + ], + [ + "▁Tim", + "es" + ], + [ + "▁Ti", + "mes" + ], + [ + "▁", + "Times" + ], + [ + "▁n", + "ella" + ], + [ + "▁ne", + "lla" + ], + [ + "▁nel", + "la" + ], + [ + "▁nell", + "a" + ], + [ + "▁pre", + "viously" + ], + [ + "▁previous", + "ly" + ], 
+ [ + "▁prev", + "iously" + ], + [ + "point", + "s" + ], + [ + "Se", + "nd" + ], + [ + "S", + "end" + ], + [ + "▁B", + "rown" + ], + [ + "▁Br", + "own" + ], + [ + "▁Bro", + "wn" + ], + [ + "▁Brow", + "n" + ], + [ + "ea", + "ch" + ], + [ + "e", + "ach" + ], + [ + "▁tr", + "igger" + ], + [ + "▁", + "trigger" + ], + [ + "ome", + "times" + ], + [ + "omet", + "imes" + ], + [ + "ic", + "os" + ], + [ + "ico", + "s" + ], + [ + "i", + "cos" + ], + [ + "G", + "R" + ], + [ + "Pane", + "l" + ], + [ + "Pan", + "el" + ], + [ + "P", + "anel" + ], + [ + "og", + "en" + ], + [ + "oge", + "n" + ], + [ + "o", + "gen" + ], + [ + "▁c", + "m" + ], + [ + "▁", + "cm" + ], + [ + "ru", + "ctions" + ], + [ + "ruct", + "ions" + ], + [ + "ruction", + "s" + ], + [ + "▁k", + "iss" + ], + [ + "▁ki", + "ss" + ], + [ + "▁s", + "olo" + ], + [ + "▁so", + "lo" + ], + [ + "▁sol", + "o" + ], + [ + "▁f", + "amous" + ], + [ + "▁fam", + "ous" + ], + [ + "ra", + "n" + ], + [ + "r", + "an" + ], + [ + "п", + "ро" + ], + [ + "▁th", + "ro" + ], + [ + "▁thr", + "o" + ], + [ + "Gr", + "aph" + ], + [ + "G", + "raph" + ], + [ + "im", + "it" + ], + [ + "imi", + "t" + ], + [ + "i", + "mit" + ], + [ + "▁V", + "alue" + ], + [ + "▁Val", + "ue" + ], + [ + "▁", + "Value" + ], + [ + "▁st", + "arts" + ], + [ + "▁start", + "s" + ], + [ + "▁star", + "ts" + ], + [ + "ip", + "eline" + ], + [ + "ipe", + "line" + ], + [ + "h", + "d" + ], + [ + "T", + "C" + ], + [ + "▁dis", + "cussion" + ], + [ + "▁discuss", + "ion" + ], + [ + "▁tr", + "uck" + ], + [ + "ak", + "a" + ], + [ + "a", + "ka" + ], + [ + "On", + "ly" + ], + [ + "▁E", + "qu" + ], + [ + "▁Eq", + "u" + ], + [ + "▁", + "Equ" + ], + [ + "▁k", + "ö" + ], + [ + "▁", + "kö" + ], + [ + "▁B", + "es" + ], + [ + "▁Be", + "s" + ], + [ + "▁crit", + "ic" + ], + [ + "▁pro", + "pos" + ], + [ + "▁prop", + "os" + ], + [ + "▁b", + "att" + ], + [ + "▁bat", + "t" + ], + [ + "▁ba", + "tt" + ], + [ + "▁S", + "ection" + ], + [ + "▁Se", + "ction" + ], + [ + "▁", + "Section" + ], + [ + "Sh", + "ow" + ], + [ + "S", + "how" + ], + [ + "g", + "p" + ], + [ + "ST", + "ATE" + ], + [ + "STAT", + "E" + ], + [ + "PO", + "ST" + ], + [ + "POS", + "T" + ], + [ + "P", + "OST" + ], + [ + "▁N", + "ord" + ], + [ + "▁No", + "rd" + ], + [ + "▁Nor", + "d" + ], + [ + "▁in", + "nov" + ], + [ + "▁inn", + "ov" + ], + [ + "▁c", + "rim" + ], + [ + "▁cr", + "im" + ], + [ + "▁cri", + "m" + ], + [ + "▁", + "crim" + ], + [ + "ax", + "is" + ], + [ + "a", + "xis" + ], + [ + "▁T", + "urn" + ], + [ + "▁Tur", + "n" + ], + [ + "▁Tu", + "rn" + ], + [ + "▁", + "Turn" + ], + [ + "con", + "n" + ], + [ + "co", + "nn" + ], + [ + "Run", + "time" + ], + [ + "▁rem", + "aining" + ], + [ + "▁remain", + "ing" + ], + [ + "os", + "ton" + ], + [ + "ost", + "on" + ], + [ + "osto", + "n" + ], + [ + "o", + "ston" + ], + [ + "▁", + "Э" + ], + [ + "▁window", + "s" + ], + [ + "▁wind", + "ows" + ], + [ + "▁", + "windows" + ], + [ + "▁R", + "oyal" + ], + [ + "▁Ro", + "yal" + ], + [ + "▁Roy", + "al" + ], + [ + "▁v", + "ide" + ], + [ + "▁vi", + "de" + ], + [ + "▁vid", + "e" + ], + [ + "P", + "P" + ], + [ + "ch", + "ron" + ], + [ + "chr", + "on" + ], + [ + "▁s", + "an" + ], + [ + "▁sa", + "n" + ], + [ + "▁", + "san" + ], + [ + "▁r", + "ise" + ], + [ + "▁ri", + "se" + ], + [ + "▁ris", + "e" + ], + [ + "▁", + "rise" + ], + [ + "▁d", + "elle" + ], + [ + "▁de", + "lle" + ], + [ + "▁del", + "le" + ], + [ + "▁dell", + "e" + ], + [ + "▁D", + "ur" + ], + [ + "▁Du", + "r" + ], + [ + "▁rap", + "id" + ], + [ + "▁ra", + "pid" + ], + [ + "ce", + "rt" + ], + [ + "cer", + "t" + ], + [ + "c", + 
"ert" + ], + [ + "L", + "A" + ], + [ + "ed", + "ge" + ], + [ + "▁\\", + "]" + ], + [ + "▁", + "\\]" + ], + [ + "▁en", + "tered" + ], + [ + "▁ent", + "ered" + ], + [ + "▁enter", + "ed" + ], + [ + "▁l", + "aws" + ], + [ + "▁la", + "ws" + ], + [ + "▁law", + "s" + ], + [ + "▁ph", + "oto" + ], + [ + "▁phot", + "o" + ], + [ + "▁", + "photo" + ], + [ + "▁ap", + "plications" + ], + [ + "▁applic", + "ations" + ], + [ + "▁application", + "s" + ], + [ + "▁appl", + "ications" + ], + [ + "▁Ber", + "lin" + ], + [ + "▁ar", + "rest" + ], + [ + "▁arr", + "est" + ], + [ + "▁f", + "ederal" + ], + [ + "▁fed", + "eral" + ], + [ + "▁feder", + "al" + ], + [ + "▁R", + "ussia" + ], + [ + "▁Russ", + "ia" + ], + [ + "▁us", + "ual" + ], + [ + "▁r", + "aw" + ], + [ + "▁ra", + "w" + ], + [ + "▁", + "raw" + ], + [ + "▁pi", + "ù" + ], + [ + "êt", + "re" + ], + [ + "ê", + "tre" + ], + [ + "JS", + "ON" + ], + [ + "J", + "SON" + ], + [ + "SI", + "ON" + ], + [ + "S", + "ION" + ], + [ + "xt", + "ure" + ], + [ + "ist", + "ent" + ], + [ + "iste", + "nt" + ], + [ + "isten", + "t" + ], + [ + "▁P", + "ower" + ], + [ + "▁Po", + "wer" + ], + [ + "▁Pow", + "er" + ], + [ + "▁", + "Power" + ], + [ + "Bi", + "t" + ], + [ + "B", + "it" + ], + [ + "▁cap", + "acity" + ], + [ + "▁capac", + "ity" + ], + [ + "▁", + "capacity" + ], + [ + "▁c", + "ards" + ], + [ + "▁car", + "ds" + ], + [ + "▁card", + "s" + ], + [ + "▁", + "cards" + ], + [ + "UI", + "D" + ], + [ + "U", + "ID" + ], + [ + "im", + "ents" + ], + [ + "iment", + "s" + ], + [ + "imen", + "ts" + ], + [ + "i", + "ments" + ], + [ + "▁d", + "ar" + ], + [ + "▁da", + "r" + ], + [ + "▁", + "dar" + ], + [ + "▁Ch", + "icago" + ], + [ + "▁comfort", + "able" + ], + [ + "ti", + "p" + ], + [ + "t", + "ip" + ], + [ + "ba", + "s" + ], + [ + "b", + "as" + ], + [ + "▁m", + "u" + ], + [ + "▁", + "mu" + ], + [ + "▁en", + "emy" + ], + [ + "▁enem", + "y" + ], + [ + "ya", + "n" + ], + [ + "y", + "an" + ], + [ + "▁ф", + "и" + ], + [ + "▁", + "фи" + ], + [ + "▁up", + "dated" + ], + [ + "▁update", + "d" + ], + [ + "▁", + "updated" + ], + [ + "an", + "go" + ], + [ + "ang", + "o" + ], + [ + "E", + "v" + ], + [ + "E", + "ffect" + ], + [ + "os", + "ing" + ], + [ + "osi", + "ng" + ], + [ + "o", + "sing" + ], + [ + "ren", + "ce" + ], + [ + "r", + "ence" + ], + [ + "▁Con", + "gress" + ], + [ + "▁Cong", + "ress" + ], + [ + "▁d", + "efe" + ], + [ + "▁de", + "fe" + ], + [ + "▁def", + "e" + ], + [ + "▁i", + "p" + ], + [ + "▁", + "ip" + ], + [ + "▁t", + "out" + ], + [ + "▁to", + "ut" + ], + [ + "▁tou", + "t" + ], + [ + "▁f", + "reedom" + ], + [ + "▁free", + "dom" + ], + [ + "▁freed", + "om" + ], + [ + "▁a", + "o" + ], + [ + "▁", + "ao" + ], + [ + "▁There", + "fore" + ], + [ + "▁Ther", + "efore" + ], + [ + "Ed", + "it" + ], + [ + "E", + "dit" + ], + [ + "▁Vir", + "gin" + ], + [ + "RE", + "E" + ], + [ + "R", + "EE" + ], + [ + "ar", + "go" + ], + [ + "arg", + "o" + ], + [ + "▁D", + "am" + ], + [ + "▁Da", + "m" + ], + [ + "▁", + "Dam" + ], + [ + "▁tra", + "ffic" + ], + [ + "▁traff", + "ic" + ], + [ + "ño", + "s" + ], + [ + "ñ", + "os" + ], + [ + "▁a", + "lle" + ], + [ + "▁al", + "le" + ], + [ + "▁all", + "e" + ], + [ + "▁", + "alle" + ], + [ + "▁dep", + "th" + ], + [ + "▁", + "depth" + ], + [ + "No", + "w" + ], + [ + "N", + "ow" + ], + [ + "▁s", + "ides" + ], + [ + "▁side", + "s" + ], + [ + "▁si", + "des" + ], + [ + "▁sid", + "es" + ], + [ + "▁го", + "ди" + ], + [ + "▁год", + "и" + ], + [ + "Des", + "criptor" + ], + [ + "▁art", + "ikel" + ], + [ + "▁n", + "arrow" + ], + [ + "▁narr", + "ow" + ], + [ + "▁nar", + "row" + ], + [ 
+ "__", + "_" + ], + [ + "_", + "__" + ], + [ + "k", + "w" + ], + [ + "ut", + "o" + ], + [ + "u", + "to" + ], + [ + "▁Face", + "book" + ], + [ + "▁Fac", + "ebook" + ], + [ + "te", + "gr" + ], + [ + "t", + "egr" + ], + [ + "bo", + "olean" + ], + [ + "ni", + "k" + ], + [ + "n", + "ik" + ], + [ + "b", + "d" + ], + [ + "Tr", + "ack" + ], + [ + "Tra", + "ck" + ], + [ + "▁g", + "ran" + ], + [ + "▁gr", + "an" + ], + [ + "▁gra", + "n" + ], + [ + "res", + "hold" + ], + [ + "resh", + "old" + ], + [ + "ве", + "т" + ], + [ + "в", + "ет" + ], + [ + "wr", + "ap" + ], + [ + "w", + "rap" + ], + [ + "▁n", + "oise" + ], + [ + "▁no", + "ise" + ], + [ + "ig", + "u" + ], + [ + "i", + "gu" + ], + [ + "▁B", + "on" + ], + [ + "▁Bo", + "n" + ], + [ + "▁", + "Bon" + ], + [ + "▁w", + "y" + ], + [ + "▁", + "wy" + ], + [ + "lin", + "ux" + ], + [ + "ck", + "s" + ], + [ + "c", + "ks" + ], + [ + "▁f", + "ans" + ], + [ + "▁fa", + "ns" + ], + [ + "▁fan", + "s" + ], + [ + "▁m", + "ach" + ], + [ + "▁ma", + "ch" + ], + [ + "▁mac", + "h" + ], + [ + "▁p", + "rices" + ], + [ + "▁pr", + "ices" + ], + [ + "▁pri", + "ces" + ], + [ + "▁price", + "s" + ], + [ + "é", + "v" + ], + [ + "ou", + "ts" + ], + [ + "out", + "s" + ], + [ + "o", + "uts" + ], + [ + "stand", + "ing" + ], + [ + "stan", + "ding" + ], + [ + "▁c", + "ateg" + ], + [ + "▁cat", + "eg" + ], + [ + ";", + "\\" + ], + [ + "▁de", + "cre" + ], + [ + "▁dec", + "re" + ], + [ + "▁S", + "aturday" + ], + [ + "▁m", + "enu" + ], + [ + "▁me", + "nu" + ], + [ + "▁men", + "u" + ], + [ + "▁", + "menu" + ], + [ + "▁N", + "ov" + ], + [ + "▁No", + "v" + ], + [ + "▁Y", + "et" + ], + [ + "▁Ye", + "t" + ], + [ + "▁та", + "к" + ], + [ + "lic", + "he" + ], + [ + "li", + "che" + ], + [ + "lich", + "e" + ], + [ + "l", + "iche" + ], + [ + "▁Ac", + "adem" + ], + [ + "▁commun", + "ication" + ], + [ + "us", + "ing" + ], + [ + "u", + "sing" + ], + [ + "▁Soc", + "iety" + ], + [ + "▁Soci", + "ety" + ], + [ + "▁n", + "uc" + ], + [ + "▁nu", + "c" + ], + [ + "pect", + "ive" + ], + [ + "or", + "ial" + ], + [ + "oria", + "l" + ], + [ + "ori", + "al" + ], + [ + "o", + "rial" + ], + [ + "▁af", + "raid" + ], + [ + "▁an", + "imal" + ], + [ + "▁anim", + "al" + ], + [ + "▁turn", + "ing" + ], + [ + "▁tur", + "ning" + ], + [ + "ds", + "t" + ], + [ + "d", + "st" + ], + [ + "math", + "frak" + ], + [ + "le", + "rs" + ], + [ + "ler", + "s" + ], + [ + "l", + "ers" + ], + [ + "▁l", + "ots" + ], + [ + "▁lo", + "ts" + ], + [ + "▁lot", + "s" + ], + [ + "▁", + "á" + ], + [ + "▁T", + "ra" + ], + [ + "▁Tr", + "a" + ], + [ + "▁", + "Tra" + ], + [ + "n", + "p" + ], + [ + "▁r", + "ose" + ], + [ + "▁ro", + "se" + ], + [ + "▁", + "rose" + ], + [ + "▁G", + "L" + ], + [ + "▁", + "GL" + ], + [ + "▁hel", + "ping" + ], + [ + "▁help", + "ing" + ], + [ + "▁w", + "inter" + ], + [ + "▁win", + "ter" + ], + [ + "▁ко", + "м" + ], + [ + "▁", + "ком" + ], + [ + "Mo", + "ck" + ], + [ + "M", + "ock" + ], + [ + "▁invest", + "ment" + ], + [ + "Us", + "e" + ], + [ + "U", + "se" + ], + [ + "▁Can", + "ad" + ], + [ + "н", + "д" + ], + [ + "Co", + "py" + ], + [ + "Cop", + "y" + ], + [ + "C", + "opy" + ], + [ + "▁f", + "ly" + ], + [ + "▁fl", + "y" + ], + [ + "▁", + "fly" + ], + [ + "SE", + "R" + ], + [ + "S", + "ER" + ], + [ + "▁F", + "ar" + ], + [ + "▁Fa", + "r" + ], + [ + "▁R", + "os" + ], + [ + "▁Ro", + "s" + ], + [ + "am", + "il" + ], + [ + "ami", + "l" + ], + [ + "a", + "mil" + ], + [ + "▁fight", + "ing" + ], + [ + "▁rel", + "igious" + ], + [ + "▁relig", + "ious" + ], + [ + "su", + "per" + ], + [ + "sup", + "er" + ], + [ + "s", + "uper" + ], + [ + 
"sc", + "reen" + ], + [ + "scr", + "een" + ], + [ + "s", + "creen" + ], + [ + "▁f", + "urn" + ], + [ + "▁fur", + "n" + ], + [ + "▁fu", + "rn" + ], + [ + "▁surpr", + "ised" + ], + [ + "▁surprise", + "d" + ], + [ + "▁re", + "plied" + ], + [ + "▁repl", + "ied" + ], + [ + "Act", + "ivity" + ], + [ + "Activ", + "ity" + ], + [ + "▁D", + "own" + ], + [ + "▁Do", + "wn" + ], + [ + "▁Dow", + "n" + ], + [ + "▁", + "Down" + ], + [ + "▁in", + "sert" + ], + [ + "▁ins", + "ert" + ], + [ + "▁", + "insert" + ], + [ + "▁O", + "lymp" + ], + [ + "▁point", + "ed" + ], + [ + "▁po", + "inted" + ], + [ + "▁C", + "ard" + ], + [ + "▁Car", + "d" + ], + [ + "▁Ca", + "rd" + ], + [ + "▁", + "Card" + ], + [ + "dr", + "iver" + ], + [ + "drive", + "r" + ], + [ + "d", + "river" + ], + [ + "▁D", + "a" + ], + [ + "▁", + "Da" + ], + [ + "!", + "--" + ], + [ + "ro", + "ud" + ], + [ + "rou", + "d" + ], + [ + "r", + "oud" + ], + [ + "un", + "do" + ], + [ + "und", + "o" + ], + [ + "▁m", + "essages" + ], + [ + "▁message", + "s" + ], + [ + "▁mess", + "ages" + ], + [ + "▁", + "messages" + ], + [ + "▁P", + "oint" + ], + [ + "▁Po", + "int" + ], + [ + "▁", + "Point" + ], + [ + "V", + "M" + ], + [ + "▁p", + "lane" + ], + [ + "▁pl", + "ane" + ], + [ + "▁plan", + "e" + ], + [ + "▁", + "plane" + ], + [ + "x", + "c" + ], + [ + "▁telev", + "ision" + ], + [ + "▁tele", + "vision" + ], + [ + "▁televis", + "ion" + ], + [ + "ё", + "н" + ], + [ + "▁thous", + "ands" + ], + [ + "▁thousand", + "s" + ], + [ + "▁c", + "ris" + ], + [ + "▁cr", + "is" + ], + [ + "▁cri", + "s" + ], + [ + "▁de", + "lay" + ], + [ + "▁del", + "ay" + ], + [ + "▁", + "delay" + ], + [ + "▁N", + "ext" + ], + [ + "▁Ne", + "xt" + ], + [ + "▁", + "Next" + ], + [ + "▁no", + "mbre" + ], + [ + "▁nom", + "bre" + ], + [ + "▁t", + "u" + ], + [ + "▁", + "tu" + ], + [ + "▁sk", + "ip" + ], + [ + "▁ski", + "p" + ], + [ + "▁", + "skip" + ], + [ + "ro", + "ad" + ], + [ + "r", + "oad" + ], + [ + "istr", + "ation" + ], + [ + "▁t", + "ur" + ], + [ + "▁tu", + "r" + ], + [ + "▁De", + "velop" + ], + [ + "▁Devel", + "op" + ], + [ + "▁П", + "а" + ], + [ + "▁д", + "ру" + ], + [ + "▁др", + "у" + ], + [ + "▁wonder", + "ful" + ], + [ + ">", + "&" + ], + [ + "▁L", + "iber" + ], + [ + "▁Li", + "ber" + ], + [ + "▁Lib", + "er" + ], + [ + "▁s", + "cope" + ], + [ + "▁sc", + "ope" + ], + [ + "▁", + "scope" + ], + [ + "▁man", + "age" + ], + [ + "▁ma", + "nage" + ], + [ + "▁d", + "ass" + ], + [ + "▁da", + "ss" + ], + [ + "▁das", + "s" + ], + [ + "▁re", + "call" + ], + [ + "▁rec", + "all" + ], + [ + "P", + "M" + ], + [ + "▁re", + "levant" + ], + [ + "▁relev", + "ant" + ], + [ + "▁E", + "arth" + ], + [ + "▁ка", + "к" + ], + [ + "▁a", + "pr" + ], + [ + "▁ap", + "r" + ], + [ + "▁A", + "SS" + ], + [ + "▁AS", + "S" + ], + [ + "▁", + "ASS" + ], + [ + "ié", + "n" + ], + [ + "i", + "én" + ], + [ + "▁S", + "H" + ], + [ + "▁", + "SH" + ], + [ + "oo", + "m" + ], + [ + "o", + "om" + ], + [ + "it", + "et" + ], + [ + "ite", + "t" + ], + [ + "no", + "ne" + ], + [ + "non", + "e" + ], + [ + "n", + "one" + ], + [ + "as", + "i" + ], + [ + "a", + "si" + ], + [ + "▁mot", + "or" + ], + [ + "▁mo", + "tor" + ], + [ + "▁S", + "how" + ], + [ + "▁Sh", + "ow" + ], + [ + "▁", + "Show" + ], + [ + "n", + "b" + ], + [ + "▁fact", + "ors" + ], + [ + "▁fa", + "ctors" + ], + [ + "▁factor", + "s" + ], + [ + "▁f", + "orest" + ], + [ + "▁for", + "est" + ], + [ + "▁fore", + "st" + ], + [ + "▁fo", + "rest" + ], + [ + "▁в", + "ре" + ], + [ + "th", + "m" + ], + [ + "t", + "hm" + ], + [ + "▁m", + "unicip" + ], + [ + "▁turn", + "s" + ], + [ + "▁tur", + "ns" + 
], + [ + "▁Div", + "ision" + ], + [ + "▁Di", + "vision" + ], + [ + "E", + "C" + ], + [ + "▁dis", + "appe" + ], + [ + "struct", + "or" + ], + [ + "stru", + "ctor" + ], + [ + "▁some", + "where" + ], + [ + "▁Afr", + "ican" + ], + [ + "▁Africa", + "n" + ], + [ + "▁Inst", + "itute" + ], + [ + "▁Institut", + "e" + ], + [ + "Gr", + "id" + ], + [ + "G", + "rid" + ], + [ + "▁te", + "acher" + ], + [ + "▁teach", + "er" + ], + [ + "▁tea", + "cher" + ], + [ + "ur", + "ies" + ], + [ + "uri", + "es" + ], + [ + "u", + "ries" + ], + [ + "▁respect", + "ively" + ], + [ + "▁respective", + "ly" + ], + [ + "▁S", + "D" + ], + [ + "▁", + "SD" + ], + [ + "▁a", + "live" + ], + [ + "▁al", + "ive" + ], + [ + "▁ali", + "ve" + ], + [ + "▁p", + "ou" + ], + [ + "▁po", + "u" + ], + [ + "▁W", + "ater" + ], + [ + "▁Wat", + "er" + ], + [ + "▁Wa", + "ter" + ], + [ + "▁", + "Water" + ], + [ + "ф", + "е" + ], + [ + "▁ch", + "anging" + ], + [ + "▁chang", + "ing" + ], + [ + "▁", + "changing" + ], + [ + "▁after", + "noon" + ], + [ + "▁or", + "ders" + ], + [ + "▁order", + "s" + ], + [ + "▁ord", + "ers" + ], + [ + "▁", + "orders" + ], + [ + "Re", + "t" + ], + [ + "R", + "et" + ], + [ + "Point", + "er" + ], + [ + "Po", + "inter" + ], + [ + "▁s", + "av" + ], + [ + "▁sa", + "v" + ], + [ + "er", + "g" + ], + [ + "e", + "rg" + ], + [ + "ok", + "ed" + ], + [ + "oke", + "d" + ], + [ + "o", + "ked" + ], + [ + "ess", + "ions" + ], + [ + "ession", + "s" + ], + [ + "▁F", + "ire" + ], + [ + "▁Fi", + "re" + ], + [ + "▁", + "Fire" + ], + [ + "ar", + "et" + ], + [ + "are", + "t" + ], + [ + "a", + "ret" + ], + [ + "im", + "m" + ], + [ + "i", + "mm" + ], + [ + "▁des", + "ire" + ], + [ + "▁", + "що" + ], + [ + "▁De", + "sign" + ], + [ + "▁Des", + "ign" + ], + [ + "▁", + "Design" + ], + [ + "ut", + "ure" + ], + [ + "▁Off", + "ice" + ], + [ + "▁c", + "md" + ], + [ + "▁cm", + "d" + ], + [ + "▁", + "cmd" + ], + [ + "▁e", + "ating" + ], + [ + "▁eat", + "ing" + ], + [ + "Net", + "work" + ], + [ + "▁r", + "ough" + ], + [ + "▁ro", + "ugh" + ], + [ + "▁rou", + "gh" + ], + [ + "▁", + "rough" + ], + [ + "oper", + "ator" + ], + [ + "IG", + "N" + ], + [ + "I", + "GN" + ], + [ + "▁s", + "ports" + ], + [ + "▁sp", + "orts" + ], + [ + "▁sport", + "s" + ], + [ + "▁w", + "eren" + ], + [ + "▁we", + "ren" + ], + [ + "▁were", + "n" + ], + [ + "▁wer", + "en" + ], + [ + "▁n", + "oted" + ], + [ + "▁not", + "ed" + ], + [ + "▁no", + "ted" + ], + [ + "▁note", + "d" + ], + [ + "▁tw", + "ice" + ], + [ + "II", + "I" + ], + [ + "I", + "II" + ], + [ + "▁a", + "nx" + ], + [ + "▁an", + "x" + ], + [ + "▁e", + "lim" + ], + [ + "▁el", + "im" + ], + [ + "▁а", + "в" + ], + [ + "▁i", + "o" + ], + [ + "▁", + "io" + ], + [ + "▁spe", + "ech" + ], + [ + "▁con", + "du" + ], + [ + "▁cond", + "u" + ], + [ + "el", + "les" + ], + [ + "ell", + "es" + ], + [ + "elle", + "s" + ], + [ + "id", + "ade" + ], + [ + "ida", + "de" + ], + [ + "idad", + "e" + ], + [ + "▁adv", + "ance" + ], + [ + "R", + "I" + ], + [ + "oc", + "a" + ], + [ + "o", + "ca" + ], + [ + "/", + "\\" + ], + [ + "ap", + "shot" + ], + [ + "aps", + "hot" + ], + [ + "▁t", + "ail" + ], + [ + "▁ta", + "il" + ], + [ + "▁", + "tail" + ], + [ + "mod", + "els" + ], + [ + "model", + "s" + ], + [ + "mode", + "ls" + ], + [ + "og", + "y" + ], + [ + "o", + "gy" + ], + [ + "▁J", + "eff" + ], + [ + "▁Je", + "ff" + ], + [ + "ir", + "ation" + ], + [ + "irat", + "ion" + ], + [ + "▁K", + "ore" + ], + [ + "▁Ko", + "re" + ], + [ + "▁Kor", + "e" + ], + [ + "▁le", + "ads" + ], + [ + "▁lead", + "s" + ], + [ + "ba", + "t" + ], + [ + "b", + "at" + ], + [ + "Ad", + 
"apter" + ], + [ + "c", + "ategory" + ], + [ + "ang", + "ular" + ], + [ + "angu", + "lar" + ], + [ + "▁s", + "aved" + ], + [ + "▁sa", + "ved" + ], + [ + "▁save", + "d" + ], + [ + "▁sav", + "ed" + ], + [ + "▁", + "saved" + ], + [ + "▁un", + "iform" + ], + [ + "▁", + "uniform" + ], + [ + "▁n", + "é" + ], + [ + "▁", + "né" + ], + [ + "▁business", + "es" + ], + [ + "His", + "t" + ], + [ + "Hi", + "st" + ], + [ + "H", + "ist" + ], + [ + "▁а", + "р" + ], + [ + "▁", + "ар" + ], + [ + "do", + "main" + ], + [ + "dom", + "ain" + ], + [ + "▁S", + "i" + ], + [ + "▁", + "Si" + ], + [ + "ra", + "ise" + ], + [ + "rais", + "e" + ], + [ + "rai", + "se" + ], + [ + "r", + "aise" + ], + [ + "▁w", + "arn" + ], + [ + "▁war", + "n" + ], + [ + "▁wa", + "rn" + ], + [ + "▁", + "warn" + ], + [ + "het", + "ic" + ], + [ + "h", + "etic" + ], + [ + "▁G", + "ro" + ], + [ + "▁Gr", + "o" + ], + [ + "))", + "." + ], + [ + ")", + ")." + ], + [ + "}", + ">" + ], + [ + "з", + "е" + ], + [ + "▁Amaz", + "on" + ], + [ + "▁Or", + "gan" + ], + [ + "▁", + "Organ" + ], + [ + "▁L", + "ake" + ], + [ + "▁La", + "ke" + ], + [ + "▁ag", + "reement" + ], + [ + "▁agree", + "ment" + ], + [ + "▁agre", + "ement" + ], + [ + "x", + "a" + ], + [ + "▁p", + "erman" + ], + [ + "▁per", + "man" + ], + [ + "▁perm", + "an" + ], + [ + "▁cont", + "aining" + ], + [ + "▁contain", + "ing" + ], + [ + "▁st", + "range" + ], + [ + "▁str", + "ange" + ], + [ + "▁strang", + "e" + ], + [ + "ст", + "і" + ], + [ + "с", + "ті" + ], + [ + "▁st", + "upid" + ], + [ + "▁spe", + "aking" + ], + [ + "▁speak", + "ing" + ], + [ + "▁Intern", + "et" + ], + [ + "▁Inter", + "net" + ], + [ + "pre", + "fix" + ], + [ + "pref", + "ix" + ], + [ + "p", + "refix" + ], + [ + "es", + "c" + ], + [ + "e", + "sc" + ], + [ + "As", + "sert" + ], + [ + "Ass", + "ert" + ], + [ + "pro", + "te" + ], + [ + "pr", + "ote" + ], + [ + "prot", + "e" + ], + [ + "p", + "rote" + ], + [ + "▁m", + "anner" + ], + [ + "▁man", + "ner" + ], + [ + "▁S", + "z" + ], + [ + "un", + "te" + ], + [ + "unt", + "e" + ], + [ + "u", + "nte" + ], + [ + "io", + "t" + ], + [ + "i", + "ot" + ], + [ + "Pro", + "file" + ], + [ + "ov", + "en" + ], + [ + "ove", + "n" + ], + [ + "o", + "ven" + ], + [ + "▁for", + "med" + ], + [ + "▁form", + "ed" + ], + [ + "▁forme", + "d" + ], + [ + "▁", + "formed" + ], + [ + "▁l", + "it" + ], + [ + "▁li", + "t" + ], + [ + "▁", + "lit" + ], + [ + "▁econom", + "y" + ], + [ + "▁ec", + "onomy" + ], + [ + "▁c", + "z" + ], + [ + "▁", + "cz" + ], + [ + "wi", + "d" + ], + [ + "w", + "id" + ], + [ + "RE", + "Q" + ], + [ + "R", + "EQ" + ], + [ + "▁ch", + "osen" + ], + [ + "▁cho", + "sen" + ], + [ + "▁chose", + "n" + ], + [ + "▁P", + "rodu" + ], + [ + "▁Pro", + "du" + ], + [ + "▁", + "Produ" + ], + [ + "os", + "ter" + ], + [ + "ost", + "er" + ], + [ + "o", + "ster" + ], + [ + "st", + "ances" + ], + [ + "stance", + "s" + ], + [ + "stan", + "ces" + ], + [ + "aw", + "a" + ], + [ + "a", + "wa" + ], + [ + "▁R", + "en" + ], + [ + "▁Re", + "n" + ], + [ + "▁conf", + "irm" + ], + [ + "▁", + "confirm" + ], + [ + "▁Б", + "о" + ], + [ + "▁b", + "illion" + ], + [ + "▁bill", + "ion" + ], + [ + "▁d", + "éc" + ], + [ + "▁dé", + "c" + ], + [ + "ý", + "ch" + ], + [ + "▁ill", + "ustr" + ], + [ + "TI", + "ES" + ], + [ + "T", + "IES" + ], + [ + "▁P", + "ub" + ], + [ + "▁Pu", + "b" + ], + [ + "▁", + "Pub" + ], + [ + "▁b", + "an" + ], + [ + "▁ba", + "n" + ], + [ + "▁", + "ban" + ], + [ + "ad", + "ed" + ], + [ + "ade", + "d" + ], + [ + "a", + "ded" + ], + [ + "ah", + "n" + ], + [ + "a", + "hn" + ], + [ + "▁C", + "ath" + ], + [ + "▁Cat", 
+ "h" + ], + [ + "▁Ca", + "th" + ], + [ + "no", + "number" + ], + [ + "non", + "umber" + ], + [ + "▁wor", + "st" + ], + [ + "▁М", + "е" + ], + [ + "▁sugg", + "ested" + ], + [ + "▁suggest", + "ed" + ], + [ + "st", + "ats" + ], + [ + "stat", + "s" + ], + [ + "sta", + "ts" + ], + [ + "▁c", + "ant" + ], + [ + "▁can", + "t" + ], + [ + "▁ca", + "nt" + ], + [ + "▁al", + "ign" + ], + [ + "▁ali", + "gn" + ], + [ + "▁", + "align" + ], + [ + "kap", + "pa" + ], + [ + "k", + "appa" + ], + [ + "▁h", + "en" + ], + [ + "▁he", + "n" + ], + [ + "▁", + "hen" + ], + [ + "▁in", + "iti" + ], + [ + "▁init", + "i" + ], + [ + "']", + ")" + ], + [ + "'", + "])" + ], + [ + "B", + "I" + ], + [ + "▁g", + "arden" + ], + [ + "▁gar", + "den" + ], + [ + "▁gard", + "en" + ], + [ + "▁sec", + "ure" + ], + [ + "▁secur", + "e" + ], + [ + "▁", + "secure" + ], + [ + "▁\\", + "[" + ], + [ + "▁", + "\\[" + ], + [ + "hand", + "ler" + ], + [ + "handle", + "r" + ], + [ + "el", + "li" + ], + [ + "ell", + "i" + ], + [ + "e", + "lli" + ], + [ + "ld", + "ots" + ], + [ + "l", + "dots" + ], + [ + "se", + "cut" + ], + [ + "sec", + "ut" + ], + [ + "s", + "ecut" + ], + [ + "▁ext", + "ended" + ], + [ + "▁extend", + "ed" + ], + [ + "}", + "-" + ], + [ + "an", + "ie" + ], + [ + "ani", + "e" + ], + [ + "a", + "nie" + ], + [ + "▁F", + "ind" + ], + [ + "▁Fin", + "d" + ], + [ + "▁Fi", + "nd" + ], + [ + "▁", + "Find" + ], + [ + "▁M", + "useum" + ], + [ + "▁Muse", + "um" + ], + [ + "▁C", + "onne" + ], + [ + "▁Con", + "ne" + ], + [ + "▁", + "Conne" + ], + [ + "y", + "y" + ], + [ + "▁pass", + "ion" + ], + [ + "ak", + "ers" + ], + [ + "ake", + "rs" + ], + [ + "aker", + "s" + ], + [ + "a", + "kers" + ], + [ + "ah", + "r" + ], + [ + "a", + "hr" + ], + [ + "olog", + "ies" + ], + [ + "ologie", + "s" + ], + [ + "▁equ", + "ation" + ], + [ + "▁eq", + "uation" + ], + [ + "▁", + "equation" + ], + [ + "▁occ", + "asion" + ], + [ + "▁occas", + "ion" + ], + [ + "Le", + "t" + ], + [ + "L", + "et" + ], + [ + "']", + "['" + ], + [ + "'][", + "'" + ], + [ + "'", + "]['" + ], + [ + "Pr", + "int" + ], + [ + "an", + "es" + ], + [ + "ane", + "s" + ], + [ + "a", + "nes" + ], + [ + "ie", + "nte" + ], + [ + "ient", + "e" + ], + [ + "ien", + "te" + ], + [ + "i", + "ente" + ], + [ + "▁T", + "oday" + ], + [ + "▁To", + "day" + ], + [ + "▁Tod", + "ay" + ], + [ + "LE", + "CT" + ], + [ + "L", + "ECT" + ], + [ + "▁A", + "f" + ], + [ + "▁", + "Af" + ], + [ + ",", + "," + ], + [ + "▁Т", + "а" + ], + [ + "▁`", + "``" + ], + [ + "▁``", + "`" + ], + [ + "ev", + "en" + ], + [ + "eve", + "n" + ], + [ + "e", + "ven" + ], + [ + "si", + "n" + ], + [ + "s", + "in" + ], + [ + "ur", + "er" + ], + [ + "ure", + "r" + ], + [ + "u", + "rer" + ], + [ + "▁", + "°" + ], + [ + "ot", + "imes" + ], + [ + "oti", + "mes" + ], + [ + "o", + "times" + ], + [ + "▁I", + "O" + ], + [ + "▁", + "IO" + ], + [ + "▁po", + "et" + ], + [ + "()", + "));" + ], + [ + "())", + ");" + ], + [ + "()))", + ";" + ], + [ + "(", + ")));" + ], + [ + "▁", + "−" + ], + [ + "▁ad", + "opt" + ], + [ + "ph", + "ere" + ], + [ + "pher", + "e" + ], + [ + "p", + "here" + ], + [ + "#", + "[" + ], + [ + "▁c", + "entre" + ], + [ + "▁cent", + "re" + ], + [ + "ov", + "es" + ], + [ + "ove", + "s" + ], + [ + "o", + "ves" + ], + [ + "▁a", + "ns" + ], + [ + "▁an", + "s" + ], + [ + "▁", + "ans" + ], + [ + "d", + "p" + ], + [ + "▁K", + "ir" + ], + [ + "▁Ki", + "r" + ], + [ + "▁applic", + "able" + ], + [ + "f", + "p" + ], + [ + "▁vis", + "ual" + ], + [ + "▁ok", + "ay" + ], + [ + "or", + "o" + ], + [ + "o", + "ro" + ], + [ + "▁opportun", + "ities" + ], + [ 
+ "Re", + "pository" + ], + [ + "Rep", + "ository" + ], + [ + "▁l", + "l" + ], + [ + "▁", + "ll" + ], + [ + "▁R", + "od" + ], + [ + "▁Ro", + "d" + ], + [ + "▁s", + "hel" + ], + [ + "▁sh", + "el" + ], + [ + "▁she", + "l" + ], + [ + "▁la", + "unch" + ], + [ + "▁con", + "ven" + ], + [ + "▁conv", + "en" + ], + [ + "▁conve", + "n" + ], + [ + "▁S", + "pe" + ], + [ + "▁Sp", + "e" + ], + [ + "▁", + "Spe" + ], + [ + "Am", + "er" + ], + [ + "A", + "mer" + ], + [ + "▁c", + "ette" + ], + [ + "▁cet", + "te" + ], + [ + "Con", + "d" + ], + [ + "Co", + "nd" + ], + [ + "C", + "ond" + ], + [ + "de", + "p" + ], + [ + "d", + "ep" + ], + [ + "O", + "wn" + ], + [ + "▁h", + "ook" + ], + [ + "▁ho", + "ok" + ], + [ + "▁", + "hook" + ], + [ + "▁d", + "ict" + ], + [ + "▁di", + "ct" + ], + [ + "▁dic", + "t" + ], + [ + "▁", + "dict" + ], + [ + "▁Th", + "ose" + ], + [ + "▁f", + "ellow" + ], + [ + "▁fell", + "ow" + ], + [ + "▁fel", + "low" + ], + [ + "▁phil", + "osoph" + ], + [ + "▁philos", + "oph" + ], + [ + "vi", + "n" + ], + [ + "v", + "in" + ], + [ + "fer", + "ences" + ], + [ + "ference", + "s" + ], + [ + "ha", + "v" + ], + [ + "h", + "av" + ], + [ + "▁ad", + "ding" + ], + [ + "▁add", + "ing" + ], + [ + "▁", + "adding" + ], + [ + "ivers", + "e" + ], + [ + "iver", + "se" + ], + [ + "i", + "verse" + ], + [ + "ga", + "me" + ], + [ + "g", + "ame" + ], + [ + "▁Bl", + "ue" + ], + [ + "▁", + "Blue" + ], + [ + "▁c", + "lin" + ], + [ + "▁cl", + "in" + ], + [ + "not", + "e" + ], + [ + "no", + "te" + ], + [ + "n", + "ote" + ], + [ + "▁R", + "am" + ], + [ + "▁Ra", + "m" + ], + [ + "ме", + "р" + ], + [ + "м", + "ер" + ], + [ + "co", + "very" + ], + [ + "cover", + "y" + ], + [ + "cov", + "ery" + ], + [ + "c", + "overy" + ], + [ + "ñ", + "a" + ], + [ + "▁б", + "и" + ], + [ + "▁", + "би" + ], + [ + "▁f", + "ashion" + ], + [ + "▁b", + "roke" + ], + [ + "▁br", + "oke" + ], + [ + "▁bro", + "ke" + ], + [ + "▁'", + "\\" + ], + [ + "▁", + "'\\" + ], + [ + "▁re", + "ader" + ], + [ + "▁read", + "er" + ], + [ + "▁", + "reader" + ], + [ + "но", + "е" + ], + [ + "но", + "сти" + ], + [ + "ност", + "и" + ], + [ + "▁pay", + "ment" + ], + [ + "▁", + "payment" + ], + [ + "▁L", + "ic" + ], + [ + "▁Li", + "c" + ], + [ + "▁l", + "ips" + ], + [ + "▁li", + "ps" + ], + [ + "▁lip", + "s" + ], + [ + "▁ac", + "adem" + ], + [ + "▁M", + "ot" + ], + [ + "▁Mo", + "t" + ], + [ + "el", + "ls" + ], + [ + "ell", + "s" + ], + [ + "C", + "HECK" + ], + [ + "▁р", + "у" + ], + [ + "▁", + "ру" + ], + [ + "▁M", + "S" + ], + [ + "▁", + "MS" + ], + [ + "Ed", + "itor" + ], + [ + "Edit", + "or" + ], + [ + "▁z", + "one" + ], + [ + "▁zo", + "ne" + ], + [ + "▁", + "zone" + ], + [ + "it", + "ure" + ], + [ + "itu", + "re" + ], + [ + "▁I", + "T" + ], + [ + "▁", + "IT" + ], + [ + "run", + "time" + ], + [ + "▁pro", + "ceed" + ], + [ + "▁proc", + "eed" + ], + [ + "ло", + "в" + ], + [ + "л", + "ов" + ], + [ + "▁M", + "aria" + ], + [ + "▁Mar", + "ia" + ], + [ + "▁Ma", + "ria" + ], + [ + "ol", + "ver" + ], + [ + "olve", + "r" + ], + [ + "olv", + "er" + ], + [ + "▁Th", + "anks" + ], + [ + "▁Thank", + "s" + ], + [ + "▁", + "Thanks" + ], + [ + "▁should", + "n" + ], + [ + "▁J", + "oh" + ], + [ + "▁Jo", + "h" + ], + [ + "▁Mod", + "el" + ], + [ + "▁Mo", + "del" + ], + [ + "▁Mode", + "l" + ], + [ + "▁", + "Model" + ], + [ + "▁S", + "ov" + ], + [ + "▁So", + "v" + ], + [ + "!", + "'" + ], + [ + "D", + "i" + ], + [ + "▁c", + "ancer" + ], + [ + "▁can", + "cer" + ], + [ + "Id", + "ent" + ], + [ + "▁ex", + "change" + ], + [ + "il", + "ler" + ], + [ + "ill", + "er" + ], + [ + "ille", + "r" + ], + [ + 
"in", + "f" + ], + [ + "i", + "nf" + ], + [ + "LE", + "N" + ], + [ + "L", + "EN" + ], + [ + "()", + "{" + ], + [ + "(", + "){" + ], + [ + "ag", + "a" + ], + [ + "a", + "ga" + ], + [ + "\"]", + "," + ], + [ + "\"", + "]," + ], + [ + "u", + "h" + ], + [ + "▁K", + "en" + ], + [ + "▁Ke", + "n" + ], + [ + "▁ph", + "otos" + ], + [ + "▁phot", + "os" + ], + [ + "▁photo", + "s" + ], + [ + "▁t", + "iny" + ], + [ + "▁ti", + "ny" + ], + [ + "▁tin", + "y" + ], + [ + "▁", + "tiny" + ], + [ + "▁g", + "ent" + ], + [ + "▁gen", + "t" + ], + [ + "▁ge", + "nt" + ], + [ + "▁", + "gent" + ], + [ + "ü", + "l" + ], + [ + "▁T", + "ake" + ], + [ + "▁Ta", + "ke" + ], + [ + "▁Tak", + "e" + ], + [ + "▁", + "Take" + ], + [ + "id", + "el" + ], + [ + "ide", + "l" + ], + [ + "i", + "del" + ], + [ + "ou", + "ting" + ], + [ + "out", + "ing" + ], + [ + "In", + "ternal" + ], + [ + "Inter", + "nal" + ], + [ + "Intern", + "al" + ], + [ + "▁c", + "ells" + ], + [ + "▁cell", + "s" + ], + [ + "▁cel", + "ls" + ], + [ + "ни", + "м" + ], + [ + "н", + "им" + ], + [ + "ha", + "rd" + ], + [ + "har", + "d" + ], + [ + "h", + "ard" + ], + [ + "▁T", + "own" + ], + [ + "▁To", + "wn" + ], + [ + "▁Tow", + "n" + ], + [ + "ob", + "e" + ], + [ + "o", + "be" + ], + [ + "pl", + "ex" + ], + [ + "ple", + "x" + ], + [ + "p", + "lex" + ], + [ + "те", + "р" + ], + [ + "т", + "ер" + ], + [ + "to", + "ns" + ], + [ + "ton", + "s" + ], + [ + "t", + "ons" + ], + [ + "▁conc", + "entr" + ], + [ + "▁concent", + "r" + ], + [ + "mo", + "ck" + ], + [ + "m", + "ock" + ], + [ + "v", + "c" + ], + [ + "á", + "z" + ], + [ + "▁Ch", + "ampionship" + ], + [ + "▁Champion", + "ship" + ], + [ + "▁Champions", + "hip" + ], + [ + "▁б", + "е" + ], + [ + "▁", + "бе" + ], + [ + "?", + "?" + ], + [ + "ér", + "i" + ], + [ + "é", + "ri" + ], + [ + "al", + "y" + ], + [ + "a", + "ly" + ], + [ + "▁", + "Ц" + ], + [ + "ier", + "te" + ], + [ + "iert", + "e" + ], + [ + "▁tot", + "ally" + ], + [ + "▁total", + "ly" + ], + [ + "▁A", + "uf" + ], + [ + "▁Au", + "f" + ], + [ + "▁our", + "selves" + ], + [ + "▁S", + "elf" + ], + [ + "▁Sel", + "f" + ], + [ + "▁", + "Self" + ], + [ + "Form", + "s" + ], + [ + "For", + "ms" + ], + [ + "ight", + "er" + ], + [ + "igh", + "ter" + ], + [ + "▁is", + "land" + ], + [ + "fm", + "t" + ], + [ + "f", + "mt" + ], + [ + "▁r", + "c" + ], + [ + "▁", + "rc" + ], + [ + "▁t", + "ells" + ], + [ + "▁tell", + "s" + ], + [ + "▁tel", + "ls" + ], + [ + "B", + "B" + ], + [ + "di", + "t" + ], + [ + "d", + "it" + ], + [ + "▁vari", + "ables" + ], + [ + "▁variable", + "s" + ], + [ + "▁", + "variables" + ], + [ + "▁int", + "ended" + ], + [ + "▁intend", + "ed" + ], + [ + "iz", + "ont" + ], + [ + "izon", + "t" + ], + [ + "izo", + "nt" + ], + [ + "▁pl", + "ays" + ], + [ + "▁play", + "s" + ], + [ + "da", + "m" + ], + [ + "d", + "am" + ], + [ + "se", + "q" + ], + [ + "s", + "eq" + ], + [ + "▁S", + "up" + ], + [ + "▁Su", + "p" + ], + [ + "▁", + "Sup" + ], + [ + "▁c", + "ultural" + ], + [ + "▁cult", + "ural" + ], + [ + "▁sc", + "ream" + ], + [ + "__", + "," + ], + [ + "_", + "_," + ], + [ + "ci", + "pl" + ], + [ + "cip", + "l" + ], + [ + "Time", + "out" + ], + [ + "▁", + "ж" + ], + [ + "or", + "te" + ], + [ + "ort", + "e" + ], + [ + "▁repl", + "aced" + ], + [ + "▁replace", + "d" + ], + [ + "E", + "M" + ], + [ + "▁ab", + "andon" + ], + [ + "▁Spec", + "ial" + ], + [ + "▁Spe", + "cial" + ], + [ + "▁", + "Special" + ], + [ + "el", + "len" + ], + [ + "ell", + "en" + ], + [ + "elle", + "n" + ], + [ + "▁B", + "ru" + ], + [ + "▁Br", + "u" + ], + [ + "ir", + "med" + ], + [ + "irm", + "ed" + ], + [ 
+ "T", + "e" + ], + [ + "ol", + "t" + ], + [ + "o", + "lt" + ], + [ + "j", + "u" + ], + [ + "Arg", + "ument" + ], + [ + "▁ne", + "ut" + ], + [ + "▁neu", + "t" + ], + [ + "▁", + "neut" + ], + [ + "sc", + "ape" + ], + [ + "▁R", + "ay" + ], + [ + "▁Ra", + "y" + ], + [ + "▁", + "Ray" + ], + [ + "▁Pol", + "it" + ], + [ + "▁Po", + "lit" + ], + [ + "▁crow", + "d" + ], + [ + "▁cro", + "wd" + ], + [ + "▁Window", + "s" + ], + [ + "▁Wind", + "ows" + ], + [ + "▁", + "Windows" + ], + [ + "ie", + "go" + ], + [ + "ieg", + "o" + ], + [ + "i", + "ego" + ], + [ + "▁e", + "scape" + ], + [ + "▁esc", + "ape" + ], + [ + "▁", + "escape" + ], + [ + "▁Ap", + "ache" + ], + [ + "sy", + "nc" + ], + [ + "syn", + "c" + ], + [ + "s", + "ync" + ], + [ + "eb", + "en" + ], + [ + "e", + "ben" + ], + [ + "if", + "ies" + ], + [ + "ifi", + "es" + ], + [ + "et", + "her" + ], + [ + "eth", + "er" + ], + [ + "ethe", + "r" + ], + [ + "e", + "ther" + ], + [ + "Met", + "a" + ], + [ + "Me", + "ta" + ], + [ + "M", + "eta" + ], + [ + "▁big", + "gest" + ], + [ + "Ga", + "me" + ], + [ + "G", + "ame" + ], + [ + "▁trans", + "action" + ], + [ + "▁", + "transaction" + ], + [ + "En", + "v" + ], + [ + "E", + "nv" + ], + [ + "▁М", + "о" + ], + [ + "▁pl", + "enty" + ], + [ + "▁m", + "el" + ], + [ + "▁me", + "l" + ], + [ + "▁", + "mel" + ], + [ + "п", + "ре" + ], + [ + "▁mot", + "iv" + ], + [ + "▁о", + "р" + ], + [ + "▁", + "ор" + ], + [ + "or", + "gan" + ], + [ + "org", + "an" + ], + [ + "▁m", + "ock" + ], + [ + "▁mo", + "ck" + ], + [ + "▁", + "mock" + ], + [ + "▁$", + "_" + ], + [ + "▁", + "$_" + ], + [ + "ен", + "е" + ], + [ + "е", + "не" + ], + [ + "▁N", + "umber" + ], + [ + "▁Num", + "ber" + ], + [ + "▁Nu", + "mber" + ], + [ + "▁", + "Number" + ], + [ + "ck", + "now" + ], + [ + "c", + "know" + ], + [ + "▁Up", + "date" + ], + [ + "▁", + "Update" + ], + [ + "ze", + "ro" + ], + [ + "zer", + "o" + ], + [ + "z", + "ero" + ], + [ + "▁sur", + "prise" + ], + [ + "▁surpr", + "ise" + ], + [ + "ce", + "an" + ], + [ + "pd", + "f" + ], + [ + "p", + "df" + ], + [ + "Gl", + "obal" + ], + [ + "▁att", + "end" + ], + [ + "▁f", + "ond" + ], + [ + "▁fo", + "nd" + ], + [ + "▁fon", + "d" + ], + [ + "▁under", + "stood" + ], + [ + "Na", + "v" + ], + [ + "N", + "av" + ], + [ + "▁M", + "ic" + ], + [ + "▁Mi", + "c" + ], + [ + "▁", + "Mic" + ], + [ + "=", + "$" + ], + [ + "ok", + "ing" + ], + [ + "oki", + "ng" + ], + [ + "o", + "king" + ], + [ + "▁Stad", + "ium" + ], + [ + "Cl", + "ose" + ], + [ + "▁compet", + "ition" + ], + [ + "▁sold", + "iers" + ], + [ + "▁soldier", + "s" + ], + [ + "▁O", + "P" + ], + [ + "▁", + "OP" + ], + [ + "ag", + "ne" + ], + [ + "agn", + "e" + ], + [ + "▁An", + "ton" + ], + [ + "▁Ant", + "on" + ], + [ + "Ma", + "in" + ], + [ + "M", + "ain" + ], + [ + "á", + "k" + ], + [ + "▁#", + "[" + ], + [ + "▁", + "#[" + ], + [ + "▁Com", + "mit" + ], + [ + "▁Comm", + "it" + ], + [ + "▁", + "Commit" + ], + [ + "py", + "x" + ], + [ + "▁e", + "ast" + ], + [ + "▁eas", + "t" + ], + [ + "▁", + "east" + ], + [ + "▁Or", + "der" + ], + [ + "▁Ord", + "er" + ], + [ + "▁", + "Order" + ], + [ + "F", + "loat" + ], + [ + "▁accept", + "ed" + ], + [ + "▁mon", + "itor" + ], + [ + "▁", + "monitor" + ], + [ + "▁p", + "ad" + ], + [ + "▁pa", + "d" + ], + [ + "▁", + "pad" + ], + [ + "on", + "ic" + ], + [ + "oni", + "c" + ], + [ + "o", + "nic" + ], + [ + "▁p", + "ushed" + ], + [ + "▁push", + "ed" + ], + [ + "▁re", + "place" + ], + [ + "▁rep", + "lace" + ], + [ + "▁repl", + "ace" + ], + [ + "▁", + "replace" + ], + [ + "CR", + "E" + ], + [ + "C", + "RE" + ], + [ + "▁r", + "ide" + ], 
+ [ + "▁ri", + "de" + ], + [ + "▁rid", + "e" + ], + [ + "▁", + "ride" + ], + [ + "fo", + "und" + ], + [ + "f", + "ound" + ], + [ + "=", + "%" + ], + [ + "во", + "й" + ], + [ + "▁mat", + "ches" + ], + [ + "▁match", + "es" + ], + [ + "▁", + "matches" + ], + [ + "▁L", + "ie" + ], + [ + "▁Li", + "e" + ], + [ + "▁exper", + "iences" + ], + [ + "▁experience", + "s" + ], + [ + "▁experi", + "ences" + ], + [ + "Po", + "ol" + ], + [ + "P", + "ool" + ], + [ + "up", + "s" + ], + [ + "u", + "ps" + ], + [ + "A", + "V" + ], + [ + "▁ex", + "istence" + ], + [ + "▁exist", + "ence" + ], + [ + "▁t", + "hin" + ], + [ + "▁th", + "in" + ], + [ + "▁m", + "agn" + ], + [ + "▁mag", + "n" + ], + [ + "▁ma", + "gn" + ], + [ + "CO", + "MP" + ], + [ + "COM", + "P" + ], + [ + "ho", + "me" + ], + [ + "hom", + "e" + ], + [ + "h", + "ome" + ], + [ + "▁n", + "i" + ], + [ + "▁", + "ni" + ], + [ + "▁wur", + "den" + ], + [ + "▁wurde", + "n" + ], + [ + "ла", + "в" + ], + [ + "▁te", + "eth" + ], + [ + "▁S", + "tan" + ], + [ + "▁St", + "an" + ], + [ + "▁Sta", + "n" + ], + [ + "ap", + "pro" + ], + [ + "app", + "ro" + ], + [ + "an", + "ny" + ], + [ + "ann", + "y" + ], + [ + "if", + "ts" + ], + [ + "ift", + "s" + ], + [ + "▁un", + "known" + ], + [ + "▁", + "unknown" + ], + [ + "▁h", + "omes" + ], + [ + "▁home", + "s" + ], + [ + "▁hom", + "es" + ], + [ + "▁ho", + "mes" + ], + [ + "▁ent", + "ity" + ], + [ + "▁", + "entity" + ], + [ + "ci", + "e" + ], + [ + "c", + "ie" + ], + [ + "ле", + "ние" + ], + [ + "ia", + "r" + ], + [ + "i", + "ar" + ], + [ + "▁compl", + "iance" + ], + [ + "▁focus", + "ed" + ], + [ + "uz", + "z" + ], + [ + "u", + "zz" + ], + [ + "=\\", + "\"" + ], + [ + "=", + "\\\"" + ], + [ + "com", + "ponents" + ], + [ + "component", + "s" + ], + [ + "Att", + "r" + ], + [ + "At", + "tr" + ], + [ + "all", + "ery" + ], + [ + "alle", + "ry" + ], + [ + "aller", + "y" + ], + [ + "▁ident", + "ify" + ], + [ + "O", + "k" + ], + [ + "pi", + "e" + ], + [ + "p", + "ie" + ], + [ + "▁St", + "ill" + ], + [ + "▁off", + "ering" + ], + [ + "▁offer", + "ing" + ], + [ + "▁bu", + "sy" + ], + [ + "▁bus", + "y" + ], + [ + "ct", + "l" + ], + [ + "c", + "tl" + ], + [ + "it", + "ors" + ], + [ + "itor", + "s" + ], + [ + "ito", + "rs" + ], + [ + "▁concern", + "ed" + ], + [ + "▁concer", + "ned" + ], + [ + "▁b", + "rown" + ], + [ + "▁br", + "own" + ], + [ + "▁bro", + "wn" + ], + [ + "▁brow", + "n" + ], + [ + "cl", + "k" + ], + [ + "Se", + "lected" + ], + [ + "Select", + "ed" + ], + [ + "▁B", + "lock" + ], + [ + "▁Bl", + "ock" + ], + [ + "▁Blo", + "ck" + ], + [ + "▁", + "Block" + ], + [ + "▁e", + "gy" + ], + [ + "▁eg", + "y" + ], + [ + "▁", + "egy" + ], + [ + "ic", + "ing" + ], + [ + "ici", + "ng" + ], + [ + "i", + "cing" + ], + [ + "▁U", + "RL" + ], + [ + "▁", + "URL" + ], + [ + "▁t", + "opic" + ], + [ + "▁to", + "pic" + ], + [ + "▁top", + "ic" + ], + [ + "▁", + "topic" + ], + [ + "▁Pro", + "duct" + ], + [ + "▁Produ", + "ct" + ], + [ + "▁", + "Product" + ], + [ + "▁ч", + "и" + ], + [ + "▁", + "чи" + ], + [ + "▁t", + "rial" + ], + [ + "▁tr", + "ial" + ], + [ + "▁tri", + "al" + ], + [ + "▁week", + "end" + ], + [ + "l", + "u" + ], + [ + "▁I", + "V" + ], + [ + "▁", + "IV" + ], + [ + "▁E", + "gy" + ], + [ + "▁Eg", + "y" + ], + [ + "x", + "C" + ], + [ + "▁n", + "ove" + ], + [ + "▁no", + "ve" + ], + [ + "▁nov", + "e" + ], + [ + "▁l", + "ett" + ], + [ + "▁le", + "tt" + ], + [ + "▁let", + "t" + ], + [ + "▁", + "lett" + ], + [ + "en", + "ne" + ], + [ + "enn", + "e" + ], + [ + "()", + ")." + ], + [ + "())", + "." + ], + [ + "(", + "))." 
+ ], + [ + ".*", + "*" + ], + [ + ".", + "**" + ], + [ + "▁p", + "romise" + ], + [ + "▁prom", + "ise" + ], + [ + "el", + "ection" + ], + [ + "ele", + "ction" + ], + [ + "elect", + "ion" + ], + [ + "e", + "lection" + ], + [ + "Aut", + "h" + ], + [ + "A", + "uth" + ], + [ + "r", + "v" + ], + [ + "ri", + "l" + ], + [ + "r", + "il" + ], + [ + "▁con", + "duct" + ], + [ + "▁cond", + "uct" + ], + [ + "▁condu", + "ct" + ], + [ + "▁", + "conduct" + ], + [ + "▁main", + "tain" + ], + [ + "▁maint", + "ain" + ], + [ + "▁bo", + "at" + ], + [ + "▁", + "boat" + ], + [ + "▁op", + "posite" + ], + [ + "▁oppos", + "ite" + ], + [ + "sp", + "in" + ], + [ + "spi", + "n" + ], + [ + "s", + "pin" + ], + [ + "web", + "pack" + ], + [ + "an", + "ta" + ], + [ + "ant", + "a" + ], + [ + "▁o", + "rient" + ], + [ + "▁or", + "ient" + ], + [ + "▁", + "orient" + ], + [ + "▁s", + "uc" + ], + [ + "▁su", + "c" + ], + [ + "▁ex", + "ercise" + ], + [ + "▁exerc", + "ise" + ], + [ + "▁eff", + "icient" + ], + [ + "▁", + "efficient" + ], + [ + "▁trad", + "ition" + ], + [ + "▁z", + "w" + ], + [ + "▁", + "zw" + ], + [ + "▁S", + "ud" + ], + [ + "▁Su", + "d" + ], + [ + "go", + "ing" + ], + [ + "▁P", + "ier" + ], + [ + "▁Pi", + "er" + ], + [ + "in", + "v" + ], + [ + "i", + "nv" + ], + [ + "ip", + "es" + ], + [ + "ipe", + "s" + ], + [ + "i", + "pes" + ], + [ + "ensure", + "math" + ], + [ + "▁con", + "ver" + ], + [ + "▁conv", + "er" + ], + [ + "▁conve", + "r" + ], + [ + "cre", + "en" + ], + [ + "cr", + "een" + ], + [ + "c", + "reen" + ], + [ + "▁t", + "error" + ], + [ + "▁ter", + "ror" + ], + [ + "▁terr", + "or" + ], + [ + "▁D", + "ou" + ], + [ + "▁Do", + "u" + ], + [ + "▁in", + "valid" + ], + [ + "▁", + "invalid" + ], + [ + "ce", + "ived" + ], + [ + "ceive", + "d" + ], + [ + "▁A", + "rab" + ], + [ + "▁Ar", + "ab" + ], + [ + "▁w", + "ire" + ], + [ + "▁wir", + "e" + ], + [ + "▁", + "wire" + ], + [ + "ap", + "plication" + ], + [ + "sh", + "ift" + ], + [ + "Gener", + "ic" + ], + [ + "▁P", + "lan" + ], + [ + "▁Pl", + "an" + ], + [ + "▁", + "Plan" + ], + [ + "▁W", + "all" + ], + [ + "▁Wal", + "l" + ], + [ + "▁Wa", + "ll" + ], + [ + "▁", + "Wall" + ], + [ + "▁direct", + "ory" + ], + [ + "▁director", + "y" + ], + [ + "▁", + "directory" + ], + [ + "▁e", + "gg" + ], + [ + "▁eg", + "g" + ], + [ + "▁we", + "alth" + ], + [ + "▁", + "wealth" + ], + [ + "ran", + "dom" + ], + [ + "rand", + "om" + ], + [ + "r", + "andom" + ], + [ + "att", + "ribute" + ], + [ + "▁h", + "ide" + ], + [ + "▁hi", + "de" + ], + [ + "▁hid", + "e" + ], + [ + "▁", + "hide" + ], + [ + "Se", + "rial" + ], + [ + "Ser", + "ial" + ], + [ + "S", + "erial" + ], + [ + "ca", + "m" + ], + [ + "c", + "am" + ], + [ + "▁it", + "al" + ], + [ + "▁i", + "tal" + ], + [ + "▁", + "ital" + ], + [ + "▁L", + "ine" + ], + [ + "▁Lin", + "e" + ], + [ + "▁Li", + "ne" + ], + [ + "▁", + "Line" + ], + [ + "▁C", + "HECK" + ], + [ + "▁", + "CHECK" + ], + [ + "ploy", + "ment" + ], + [ + "▁mass", + "ive" + ], + [ + "▁ex", + "tract" + ], + [ + "▁ext", + "ract" + ], + [ + "▁extra", + "ct" + ], + [ + "▁extr", + "act" + ], + [ + "▁", + "extract" + ], + [ + "ch", + "ain" + ], + [ + "cha", + "in" + ], + [ + "Res", + "t" + ], + [ + "Re", + "st" + ], + [ + "R", + "est" + ], + [ + "▁L", + "as" + ], + [ + "▁La", + "s" + ], + [ + "▁b", + "ear" + ], + [ + "▁be", + "ar" + ], + [ + "▁", + "bear" + ], + [ + "▁l", + "inks" + ], + [ + "▁link", + "s" + ], + [ + "▁lin", + "ks" + ], + [ + "▁", + "links" + ], + [ + "▁new", + "sp" + ], + [ + "▁news", + "p" + ], + [ + "▁F", + "C" + ], + [ + "▁", + "FC" + ], + [ + "Car", + "d" + ], + [ + 
"C", + "ard" + ], + [ + "ak", + "s" + ], + [ + "a", + "ks" + ], + [ + "▁v", + "isible" + ], + [ + "▁vis", + "ible" + ], + [ + "▁", + "visible" + ], + [ + "▁M", + "arc" + ], + [ + "▁Mar", + "c" + ], + [ + "▁Ma", + "rc" + ], + [ + "▁B", + "oston" + ], + [ + "▁Bo", + "ston" + ], + [ + "▁Bos", + "ton" + ], + [ + "▁res", + "erved" + ], + [ + "▁reserv", + "ed" + ], + [ + "▁reserve", + "d" + ], + [ + "▁ro", + "of" + ], + [ + "lic", + "enses" + ], + [ + "license", + "s" + ], + [ + "d", + "c" + ], + [ + "▁In", + "formation" + ], + [ + "▁", + "Information" + ], + [ + "▁w", + "itness" + ], + [ + "S", + "k" + ], + [ + "*)", + "," + ], + [ + "*", + ")," + ], + [ + "Sc", + "ope" + ], + [ + "S", + "cope" + ], + [ + "']", + ";" + ], + [ + "'", + "];" + ], + [ + "▁M", + "ir" + ], + [ + "▁Mi", + "r" + ], + [ + "▁", + "Mir" + ], + [ + "ud", + "ing" + ], + [ + "udi", + "ng" + ], + [ + "u", + "ding" + ], + [ + "▁t", + "rend" + ], + [ + "▁tr", + "end" + ], + [ + "▁tre", + "nd" + ], + [ + "▁tren", + "d" + ], + [ + "re", + "p" + ], + [ + "r", + "ep" + ], + [ + "▁mus", + "ical" + ], + [ + "▁music", + "al" + ], + [ + "▁ne", + "ither" + ], + [ + "▁nei", + "ther" + ], + [ + "▁C", + "reat" + ], + [ + "▁Cre", + "at" + ], + [ + "▁", + "Creat" + ], + [ + "▁pos", + "itions" + ], + [ + "▁position", + "s" + ], + [ + "▁posit", + "ions" + ], + [ + "L", + "C" + ], + [ + "rid", + "ge" + ], + [ + "r", + "idge" + ], + [ + "▁offic", + "ers" + ], + [ + "▁office", + "rs" + ], + [ + "▁officer", + "s" + ], + [ + "▁vi", + "olence" + ], + [ + "▁viol", + "ence" + ], + [ + "▁T", + "em" + ], + [ + "▁Te", + "m" + ], + [ + "▁S", + "us" + ], + [ + "▁Su", + "s" + ], + [ + "▁W", + "ay" + ], + [ + "▁Wa", + "y" + ], + [ + "Af", + "ter" + ], + [ + "A", + "fter" + ], + [ + "ac", + "ket" + ], + [ + "ack", + "et" + ], + [ + "▁S", + "ou" + ], + [ + "▁So", + "u" + ], + [ + "ac", + "er" + ], + [ + "ace", + "r" + ], + [ + "a", + "cer" + ], + [ + "|", + "|" + ], + [ + "▁re", + "mark" + ], + [ + "▁r", + "emark" + ], + [ + "▁rem", + "ark" + ], + [ + "▁", + "remark" + ], + [ + "wa", + "ter" + ], + [ + "w", + "ater" + ], + [ + "n", + "ě" + ], + [ + "▁С", + "а" + ], + [ + "▁s", + "ed" + ], + [ + "▁se", + "d" + ], + [ + "▁", + "sed" + ], + [ + "E", + "ach" + ], + [ + "▁phot", + "ograph" + ], + [ + "▁photo", + "graph" + ], + [ + "▁let", + "ters" + ], + [ + "▁letter", + "s" + ], + [ + "▁lett", + "ers" + ], + [ + "▁in", + "vent" + ], + [ + "▁inv", + "ent" + ], + [ + "▁M", + "as" + ], + [ + "▁Ma", + "s" + ], + [ + "▁s", + "ongs" + ], + [ + "▁son", + "gs" + ], + [ + "▁song", + "s" + ], + [ + "ó", + "l" + ], + [ + "ki", + "nd" + ], + [ + "kin", + "d" + ], + [ + "k", + "ind" + ], + [ + "▁N", + "on" + ], + [ + "▁No", + "n" + ], + [ + "▁", + "Non" + ], + [ + "▁d", + "ust" + ], + [ + "▁du", + "st" + ], + [ + "**", + ":" + ], + [ + "*", + "*:" + ], + [ + "nab", + "la" + ], + [ + ".\"", + "," + ], + [ + ".", + "\"," + ], + [ + "Loc", + "k" + ], + [ + "Lo", + "ck" + ], + [ + "L", + "ock" + ], + [ + "▁Д", + "о" + ], + [ + "▁cl", + "uster" + ], + [ + "▁", + "cluster" + ], + [ + "lo", + "ss" + ], + [ + "los", + "s" + ], + [ + "l", + "oss" + ], + [ + "▁ASS", + "ERT" + ], + [ + "▁", + "ASSERT" + ], + [ + "fa", + "ll" + ], + [ + "f", + "all" + ], + [ + "▁re", + "ject" + ], + [ + "▁", + "reject" + ], + [ + "▁Sp", + "ring" + ], + [ + "▁Spr", + "ing" + ], + [ + "▁", + "Spring" + ], + [ + "▁wed", + "ding" + ], + [ + "▁g", + "rav" + ], + [ + "▁gr", + "av" + ], + [ + "▁gra", + "v" + ], + [ + "▁", + "grav" + ], + [ + "ress", + "ion" + ], + [ + "r", + "ession" + ], + [ + "li", + "mit" + 
], + [ + "lim", + "it" + ], + [ + "l", + "imit" + ], + [ + "RE", + "S" + ], + [ + "R", + "ES" + ], + [ + "]", + "}" + ], + [ + "▁l", + "isted" + ], + [ + "▁li", + "sted" + ], + [ + "▁list", + "ed" + ], + [ + "▁", + "listed" + ], + [ + "▁T", + "ele" + ], + [ + "▁Te", + "le" + ], + [ + "▁Tel", + "e" + ], + [ + "▁", + "Tele" + ], + [ + "hl", + "ine" + ], + [ + "h", + "line" + ], + [ + "▁ch", + "ief" + ], + [ + "▁chi", + "ef" + ], + [ + "ME", + "M" + ], + [ + "M", + "EM" + ], + [ + "да", + "р" + ], + [ + "д", + "ар" + ], + [ + "▁exp", + "ensive" + ], + [ + "tr", + "ace" + ], + [ + "tra", + "ce" + ], + [ + "▁R", + "og" + ], + [ + "▁Ro", + "g" + ], + [ + "▁C", + "oll" + ], + [ + "▁Col", + "l" + ], + [ + "▁Co", + "ll" + ], + [ + "▁", + "Coll" + ], + [ + "▁Aut", + "hor" + ], + [ + "▁Auth", + "or" + ], + [ + "▁", + "Author" + ], + [ + "▁B", + "oard" + ], + [ + "▁Bo", + "ard" + ], + [ + "▁", + "Board" + ], + [ + "▁C", + "apt" + ], + [ + "▁Cap", + "t" + ], + [ + "▁Ca", + "pt" + ], + [ + "▁", + "Capt" + ], + [ + "TE", + "XT" + ], + [ + "T", + "EXT" + ], + [ + "▁re", + "con" + ], + [ + "▁rec", + "on" + ], + [ + "es", + "ta" + ], + [ + "est", + "a" + ], + [ + "e", + "sta" + ], + [ + "▁proper", + "ly" + ], + [ + "▁&", + "\\" + ], + [ + "▁", + "&\\" + ], + [ + "le", + "ton" + ], + [ + "let", + "on" + ], + [ + "l", + "eton" + ], + [ + "ik", + "er" + ], + [ + "ike", + "r" + ], + [ + "i", + "ker" + ], + [ + "G", + "u" + ], + [ + "▁K", + "om" + ], + [ + "▁Ko", + "m" + ], + [ + "oc", + "o" + ], + [ + "o", + "co" + ], + [ + "▁any", + "more" + ], + [ + "▁t", + "aste" + ], + [ + "▁ta", + "ste" + ], + [ + "▁tast", + "e" + ], + [ + "▁S", + "anta" + ], + [ + "▁San", + "ta" + ], + [ + "▁Sant", + "a" + ], + [ + "ge", + "x" + ], + [ + "g", + "ex" + ], + [ + "▁Se", + "cret" + ], + [ + "▁Sec", + "ret" + ], + [ + "▁", + "Secret" + ], + [ + "▁tal", + "ent" + ], + [ + "▁tale", + "nt" + ], + [ + "▁mom", + "ents" + ], + [ + "▁moment", + "s" + ], + [ + "▁mo", + "ments" + ], + [ + "▁B", + "a" + ], + [ + "▁ex", + "tr" + ], + [ + "▁ext", + "r" + ], + [ + "▁", + "extr" + ], + [ + "▁Com", + "mission" + ], + [ + "▁Comm", + "ission" + ], + [ + "▁mod", + "ify" + ], + [ + "▁Fig", + "ure" + ], + [ + "▁", + "Figure" + ], + [ + "▁d", + "omin" + ], + [ + "▁do", + "min" + ], + [ + "▁dom", + "in" + ], + [ + "▁", + "domin" + ], + [ + "▁p", + "lot" + ], + [ + "▁pl", + "ot" + ], + [ + "▁", + "plot" + ], + [ + "en", + "ger" + ], + [ + "eng", + "er" + ], + [ + "enge", + "r" + ], + [ + "ut", + "ch" + ], + [ + "▁c", + "ities" + ], + [ + "▁cit", + "ies" + ], + [ + "▁ci", + "ties" + ], + [ + "▁n", + "ut" + ], + [ + "▁nu", + "t" + ], + [ + "▁", + "nut" + ], + [ + "pro", + "file" + ], + [ + "prof", + "ile" + ], + [ + "▁S", + "tat" + ], + [ + "▁St", + "at" + ], + [ + "▁Sta", + "t" + ], + [ + "▁", + "Stat" + ], + [ + "▁n", + "odes" + ], + [ + "▁no", + "des" + ], + [ + "▁node", + "s" + ], + [ + "▁nod", + "es" + ], + [ + "▁", + "nodes" + ], + [ + "▁n", + "s" + ], + [ + "▁", + "ns" + ], + [ + "ess", + "ages" + ], + [ + "essage", + "s" + ], + [ + "essa", + "ges" + ], + [ + "im", + "pl" + ], + [ + "imp", + "l" + ], + [ + "ic", + "ker" + ], + [ + "ick", + "er" + ], + [ + "i", + "cker" + ], + [ + "▁ex", + "amples" + ], + [ + "▁example", + "s" + ], + [ + "▁exam", + "ples" + ], + [ + "ab", + "eth" + ], + [ + "abe", + "th" + ], + [ + "abet", + "h" + ], + [ + "▁st", + "ated" + ], + [ + "▁stat", + "ed" + ], + [ + "▁state", + "d" + ], + [ + "▁sta", + "ted" + ], + [ + "fi", + "re" + ], + [ + "f", + "ire" + ], + [ + "bu", + "l" + ], + [ + "b", + "ul" + ], + [ + 
"▁danger", + "ous" + ], + [ + "▁P", + "ay" + ], + [ + "▁Pa", + "y" + ], + [ + "▁", + "Pay" + ], + [ + "▁G", + "re" + ], + [ + "▁Gr", + "e" + ], + [ + "▁", + "Gre" + ], + [ + "▁Mon", + "day" + ], + [ + "▁Mond", + "ay" + ], + [ + "es", + "ome" + ], + [ + "eso", + "me" + ], + [ + "e", + "some" + ], + [ + "ig", + "an" + ], + [ + "iga", + "n" + ], + [ + "i", + "gan" + ], + [ + "ru", + "nd" + ], + [ + "run", + "d" + ], + [ + "r", + "und" + ], + [ + "pr", + "ise" + ], + [ + "p", + "rise" + ], + [ + "fa", + "il" + ], + [ + "f", + "ail" + ], + [ + "▁N", + "ever" + ], + [ + "▁Ne", + "ver" + ], + [ + "▁Nev", + "er" + ], + [ + "▁", + "Never" + ], + [ + "A", + "v" + ], + [ + "▁line", + "ar" + ], + [ + "▁lin", + "ear" + ], + [ + "▁", + "linear" + ], + [ + "▁u", + "l" + ], + [ + "▁", + "ul" + ], + [ + "WA", + "R" + ], + [ + "W", + "AR" + ], + [ + "ре", + "н" + ], + [ + "р", + "ен" + ], + [ + "▁A", + "T" + ], + [ + "▁", + "AT" + ], + [ + "▁d", + "op" + ], + [ + "▁do", + "p" + ], + [ + "▁n", + "ou" + ], + [ + "▁no", + "u" + ], + [ + "Des", + "t" + ], + [ + "De", + "st" + ], + [ + "D", + "est" + ], + [ + "▁claim", + "s" + ], + [ + "en", + "da" + ], + [ + "end", + "a" + ], + [ + "▁c", + "razy" + ], + [ + "▁cr", + "azy" + ], + [ + "ge", + "l" + ], + [ + "g", + "el" + ], + [ + "og", + "gle" + ], + [ + "ogg", + "le" + ], + [ + "▁rep", + "resentation" + ], + [ + "▁represent", + "ation" + ], + [ + "in", + "en" + ], + [ + "ine", + "n" + ], + [ + "i", + "nen" + ], + [ + "▁altern", + "ative" + ], + [ + "▁alter", + "native" + ], + [ + "D", + "M" + ], + [ + "AB", + "ILITY" + ], + [ + "face", + "s" + ], + [ + "fa", + "ces" + ], + [ + "fac", + "es" + ], + [ + "f", + "aces" + ], + [ + "▁do", + "ors" + ], + [ + "▁door", + "s" + ], + [ + "▁", + "doors" + ], + [ + "at", + "iv" + ], + [ + "ati", + "v" + ], + [ + "Lo", + "ok" + ], + [ + "L", + "ook" + ], + [ + "▁J", + "SON" + ], + [ + "▁JS", + "ON" + ], + [ + "▁", + "JSON" + ], + [ + "▁appe", + "arance" + ], + [ + "▁appear", + "ance" + ], + [ + "б", + "ря" + ], + [ + "S", + "QL" + ], + [ + "▁sil", + "ence" + ], + [ + "ud", + "o" + ], + [ + "u", + "do" + ], + [ + "▁Direct", + "or" + ], + [ + "▁Dire", + "ctor" + ], + [ + "▁Dir", + "ector" + ], + [ + "State", + "ment" + ], + [ + "Stat", + "ement" + ], + [ + "se", + "lected" + ], + [ + "select", + "ed" + ], + [ + "hi", + "gh" + ], + [ + "h", + "igh" + ], + [ + "pr", + "ime" + ], + [ + "prim", + "e" + ], + [ + "▁ign", + "ore" + ], + [ + "▁ignor", + "e" + ], + [ + "▁", + "ignore" + ], + [ + "▁col", + "ors" + ], + [ + "▁color", + "s" + ], + [ + "▁", + "colors" + ], + [ + "us", + "hing" + ], + [ + "ush", + "ing" + ], + [ + "▁v", + "irt" + ], + [ + "▁vi", + "rt" + ], + [ + "▁vir", + "t" + ], + [ + "▁", + "virt" + ], + [ + "man", + "ager" + ], + [ + "▁rem", + "ote" + ], + [ + "▁remot", + "e" + ], + [ + "▁", + "remote" + ], + [ + "ł", + "o" + ], + [ + "sm", + "all" + ], + [ + "▁cr", + "ime" + ], + [ + "▁crim", + "e" + ], + [ + "▁cri", + "me" + ], + [ + "r", + "b" + ], + [ + "▁c", + "reation" + ], + [ + "▁cre", + "ation" + ], + [ + "▁creat", + "ion" + ], + [ + "▁f", + "light" + ], + [ + "▁fl", + "ight" + ], + [ + "▁S", + "ign" + ], + [ + "▁Si", + "gn" + ], + [ + "▁Sig", + "n" + ], + [ + "▁", + "Sign" + ], + [ + "IL", + "E" + ], + [ + "I", + "LE" + ], + [ + "▁D", + "O" + ], + [ + "▁", + "DO" + ], + [ + "com", + "ment" + ], + [ + "comm", + "ent" + ], + [ + "▁C", + "ost" + ], + [ + "▁Co", + "st" + ], + [ + "▁Cos", + "t" + ], + [ + "▁", + "Cost" + ], + [ + "._", + "_" + ], + [ + ".", + "__" + ], + [ + "▁C", + "op" + ], + [ + "▁Co", + "p" + 
], + [ + "▁", + "Cop" + ], + [ + "▁v", + "om" + ], + [ + "▁vo", + "m" + ], + [ + "▁Sc", + "ience" + ], + [ + "▁Sci", + "ence" + ], + [ + "ле", + "ния" + ], + [ + "oo", + "p" + ], + [ + "o", + "op" + ], + [ + "inter", + "face" + ], + [ + "▁WARRAN", + "TIES" + ], + [ + "▁P", + "age" + ], + [ + "▁Pa", + "ge" + ], + [ + "▁", + "Page" + ], + [ + "**", + "****" + ], + [ + "****", + "**" + ], + [ + "***", + "***" + ], + [ + "ско", + "м" + ], + [ + "с", + "ком" + ], + [ + "TR", + "UE" + ], + [ + "▁re", + "peated" + ], + [ + "▁repe", + "ated" + ], + [ + "▁repeat", + "ed" + ], + [ + "▁е", + "го" + ], + [ + "ш", + "о" + ], + [ + "▁r", + "oz" + ], + [ + "▁ro", + "z" + ], + [ + "▁", + "roz" + ], + [ + "P", + "e" + ], + [ + "▁IS", + "BN" + ], + [ + "ir", + "ts" + ], + [ + "irt", + "s" + ], + [ + "pos", + "es" + ], + [ + "po", + "ses" + ], + [ + "pose", + "s" + ], + [ + "p", + "oses" + ], + [ + "})", + "$" + ], + [ + "}", + ")$" + ], + [ + "▁", + "І" + ], + [ + "child", + "ren" + ], + [ + "ble", + "s" + ], + [ + "bl", + "es" + ], + [ + "b", + "les" + ], + [ + "EC", + "T" + ], + [ + "E", + "CT" + ], + [ + "▁i", + "z" + ], + [ + "▁", + "iz" + ], + [ + "▁b", + "uilder" + ], + [ + "▁build", + "er" + ], + [ + "▁", + "builder" + ], + [ + "▁M", + "edia" + ], + [ + "▁Med", + "ia" + ], + [ + "▁", + "Media" + ], + [ + "ia", + "t" + ], + [ + "i", + "at" + ], + [ + "▁contr", + "ast" + ], + [ + "▁contra", + "st" + ], + [ + "”", + "," + ], + [ + "▁L", + "ink" + ], + [ + "▁Lin", + "k" + ], + [ + "▁", + "Link" + ], + [ + "▁Educ", + "ation" + ], + [ + "▁j", + "oint" + ], + [ + "▁join", + "t" + ], + [ + "▁jo", + "int" + ], + [ + "▁", + "joint" + ], + [ + "▁ex", + "ternal" + ], + [ + "▁extern", + "al" + ], + [ + "▁", + "external" + ], + [ + "▁ро", + "з" + ], + [ + "▁b", + "its" + ], + [ + "▁bit", + "s" + ], + [ + "▁bi", + "ts" + ], + [ + "▁", + "bits" + ], + [ + "FO", + "RM" + ], + [ + "FOR", + "M" + ], + [ + "F", + "ORM" + ], + [ + "er", + "man" + ], + [ + "erm", + "an" + ], + [ + "w", + "p" + ], + [ + "▁M", + "ike" + ], + [ + "▁Mi", + "ke" + ], + [ + "▁Mik", + "e" + ], + [ + "▁M", + "aster" + ], + [ + "▁Ma", + "ster" + ], + [ + "▁Mas", + "ter" + ], + [ + "▁", + "Master" + ], + [ + "▁sen", + "ior" + ], + [ + "▁N", + "av" + ], + [ + "▁Na", + "v" + ], + [ + "▁", + "Nav" + ], + [ + "▁record", + "ed" + ], + [ + "el", + "ing" + ], + [ + "eli", + "ng" + ], + [ + "elin", + "g" + ], + [ + "e", + "ling" + ], + [ + "es", + "h" + ], + [ + "e", + "sh" + ], + [ + "f", + "x" + ], + [ + "ка", + "н" + ], + [ + "к", + "ан" + ], + [ + "▁t", + "all" + ], + [ + "▁tal", + "l" + ], + [ + "▁ta", + "ll" + ], + [ + "▁John", + "son" + ], + [ + "▁s", + "ono" + ], + [ + "▁so", + "no" + ], + [ + "▁son", + "o" + ], + [ + "▁an", + "che" + ], + [ + "▁anc", + "he" + ], + [ + "▁anch", + "e" + ], + [ + "▁", + "anche" + ], + [ + "ic", + "ken" + ], + [ + "ick", + "en" + ], + [ + "i", + "cken" + ], + [ + "lo", + "op" + ], + [ + "l", + "oop" + ], + [ + "ici", + "ency" + ], + [ + "empor", + "ary" + ], + [ + "▁D", + "oes" + ], + [ + "▁Do", + "es" + ], + [ + "▁", + "Does" + ], + [ + "▁re", + "lation" + ], + [ + "▁rel", + "ation" + ], + [ + "▁", + "relation" + ], + [ + "м", + "ы" + ], + [ + "wa", + "s" + ], + [ + "w", + "as" + ], + [ + "lo", + "w" + ], + [ + "l", + "ow" + ], + [ + "ich", + "te" + ], + [ + "icht", + "e" + ], + [ + "i", + "chte" + ], + [ + "▁J", + "ones" + ], + [ + "▁Jo", + "nes" + ], + [ + "▁Jon", + "es" + ], + [ + "▁bed", + "room" + ], + [ + "DI", + "S" + ], + [ + "D", + "IS" + ], + [ + "▁mag", + "net" + ], + [ + "▁magn", + "et" + ], + [ + "▁Eng", 
+ "ine" + ], + [ + "▁", + "Engine" + ], + [ + "▁feel", + "ings" + ], + [ + "▁feeling", + "s" + ], + [ + "▁fee", + "lings" + ], + [ + "G", + "C" + ], + [ + "▁t", + "orn" + ], + [ + "▁to", + "rn" + ], + [ + "▁tor", + "n" + ], + [ + "▁relationship", + "s" + ], + [ + "▁relation", + "ships" + ], + [ + "▁Р", + "е" + ], + [ + "▁p", + "roud" + ], + [ + "▁pro", + "ud" + ], + [ + "▁pr", + "oud" + ], + [ + "▁t", + "we" + ], + [ + "▁tw", + "e" + ], + [ + "ov", + "al" + ], + [ + "ova", + "l" + ], + [ + "o", + "val" + ], + [ + "▁w", + "aste" + ], + [ + "▁was", + "te" + ], + [ + "▁wa", + "ste" + ], + [ + "▁red", + "uced" + ], + [ + "▁redu", + "ced" + ], + [ + "▁reduce", + "d" + ], + [ + "il", + "ton" + ], + [ + "ilt", + "on" + ], + [ + "B", + "P" + ], + [ + "▁for", + "got" + ], + [ + "▁forg", + "ot" + ], + [ + "▁bod", + "ies" + ], + [ + "▁H", + "aw" + ], + [ + "▁Ha", + "w" + ], + [ + "la", + "g" + ], + [ + "l", + "ag" + ], + [ + "▁w", + "ww" + ], + [ + "▁", + "www" + ], + [ + "do", + "or" + ], + [ + "d", + "oor" + ], + [ + "▁s", + "ufficient" + ], + [ + "▁suff", + "icient" + ], + [ + "▁doll", + "ars" + ], + [ + "▁dollar", + "s" + ], + [ + "Le", + "n" + ], + [ + "L", + "en" + ], + [ + "▁talk", + "ed" + ], + [ + "▁tal", + "ked" + ], + [ + "▁b", + "ond" + ], + [ + "▁bo", + "nd" + ], + [ + "▁bon", + "d" + ], + [ + "▁B", + "or" + ], + [ + "▁Bo", + "r" + ], + [ + "}}", + "{" + ], + [ + "}", + "}{" + ], + [ + "ro", + "d" + ], + [ + "r", + "od" + ], + [ + "Pass", + "word" + ], + [ + "qu", + "are" + ], + [ + "▁l", + "ights" + ], + [ + "▁light", + "s" + ], + [ + "▁", + "lights" + ], + [ + "er", + "en" + ], + [ + "ere", + "n" + ], + [ + "e", + "ren" + ], + [ + "▁th", + "irty" + ], + [ + "N", + "C" + ], + [ + "▁T", + "ODO" + ], + [ + "▁TO", + "DO" + ], + [ + "▁res", + "pond" + ], + [ + "▁respon", + "d" + ], + [ + "▁resp", + "ond" + ], + [ + "▁", + "respond" + ], + [ + "ки", + "х" + ], + [ + "dir", + "ect" + ], + [ + "di", + "rect" + ], + [ + "dire", + "ct" + ], + [ + "d", + "irect" + ], + [ + "a", + "ção" + ], + [ + "▁he", + "av" + ], + [ + "Med", + "ia" + ], + [ + "M", + "edia" + ], + [ + "ex", + "it" + ], + [ + "e", + "xit" + ], + [ + "L", + "icense" + ], + [ + "`", + "." 
+ ], + [ + "▁m", + "ixed" + ], + [ + "▁mix", + "ed" + ], + [ + "▁d", + "esk" + ], + [ + "▁de", + "sk" + ], + [ + "▁des", + "k" + ], + [ + "▁te", + "aching" + ], + [ + "▁teach", + "ing" + ], + [ + "▁tea", + "ching" + ], + [ + "▁m", + "aj" + ], + [ + "▁ma", + "j" + ], + [ + "▁n", + "erv" + ], + [ + "▁ne", + "rv" + ], + [ + "▁ner", + "v" + ], + [ + "in", + "ations" + ], + [ + "ination", + "s" + ], + [ + "type", + "of" + ], + [ + "▁co", + "ast" + ], + [ + "▁ж", + "е" + ], + [ + "▁", + "же" + ], + [ + "▁be", + "side" + ], + [ + "▁bes", + "ide" + ], + [ + "um", + "my" + ], + [ + "umm", + "y" + ], + [ + "Do", + "c" + ], + [ + "D", + "oc" + ], + [ + "▁sche", + "dule" + ], + [ + "▁schedul", + "e" + ], + [ + "▁sched", + "ule" + ], + [ + "▁", + "schedule" + ], + [ + "▁re", + "cover" + ], + [ + "▁rec", + "over" + ], + [ + "▁Fur", + "ther" + ], + [ + "▁ste", + "el" + ], + [ + "bo", + "ot" + ], + [ + "b", + "oot" + ], + [ + "▁Per", + "haps" + ], + [ + "▁с", + "ъ" + ], + [ + "▁O", + "s" + ], + [ + "▁", + "Os" + ], + [ + "ri", + "ck" + ], + [ + "ric", + "k" + ], + [ + "r", + "ick" + ], + [ + "▁В", + "и" + ], + [ + "Supp", + "ort" + ], + [ + "Sup", + "port" + ], + [ + "S", + "upport" + ], + [ + "▁(", + "_" + ], + [ + "▁", + "(_" + ], + [ + "ni", + "l" + ], + [ + "n", + "il" + ], + [ + "pi", + "s" + ], + [ + "p", + "is" + ], + [ + "x", + "pected" + ], + [ + "▁process", + "ing" + ], + [ + "▁proces", + "sing" + ], + [ + "▁", + "processing" + ], + [ + "Bu", + "ild" + ], + [ + "B", + "uild" + ], + [ + "ar", + "ian" + ], + [ + "ari", + "an" + ], + [ + "aria", + "n" + ], + [ + "a", + "rian" + ], + [ + "▁i", + "con" + ], + [ + "▁ic", + "on" + ], + [ + "▁", + "icon" + ], + [ + "▁C", + "A" + ], + [ + "▁", + "CA" + ], + [ + "wi", + "ck" + ], + [ + "w", + "ick" + ], + [ + "=", + "(" + ], + [ + "▁al", + "gorithm" + ], + [ + "▁", + "algorithm" + ], + [ + "▁You", + "ng" + ], + [ + "▁Man", + "agement" + ], + [ + "▁", + "Management" + ], + [ + "▁anc", + "ient" + ], + [ + "▁anci", + "ent" + ], + [ + "но", + "сть" + ], + [ + "ност", + "ь" + ], + [ + "ot", + "i" + ], + [ + "o", + "ti" + ], + [ + "▁comb", + "ination" + ], + [ + "wor", + "ld" + ], + [ + "w", + "orld" + ], + [ + "n", + "n" + ], + [ + "▁d", + "ram" + ], + [ + "▁dr", + "am" + ], + [ + "en", + "abled" + ], + [ + "ena", + "bled" + ], + [ + "enable", + "d" + ], + [ + "A", + "c" + ], + [ + "C", + "CESS" + ], + [ + "ar", + "ation" + ], + [ + "▁bl", + "ocks" + ], + [ + "▁block", + "s" + ], + [ + "▁blo", + "cks" + ], + [ + "▁", + "blocks" + ], + [ + "▁Ang", + "eles" + ], + [ + "▁Angel", + "es" + ], + [ + "▁Q", + "ual" + ], + [ + "▁Qu", + "al" + ], + [ + "▁", + "Qual" + ], + [ + "▁suc", + "ceed" + ], + [ + "▁succ", + "eed" + ], + [ + "net", + "work" + ], + [ + "▁ob", + "lig" + ], + [ + "spring", + "framework" + ], + [ + "▁T", + "re" + ], + [ + "▁Tr", + "e" + ], + [ + "ok", + "es" + ], + [ + "oke", + "s" + ], + [ + "o", + "kes" + ], + [ + "mu", + "n" + ], + [ + "m", + "un" + ], + [ + "▁Net", + "work" + ], + [ + "▁", + "Network" + ], + [ + "De", + "l" + ], + [ + "D", + "el" + ], + [ + "▁e", + "state" + ], + [ + "▁est", + "ate" + ], + [ + "▁esta", + "te" + ], + [ + "▁l", + "iqu" + ], + [ + "▁li", + "qu" + ], + [ + "▁p", + "ob" + ], + [ + "▁po", + "b" + ], + [ + "▁d", + "ad" + ], + [ + "▁da", + "d" + ], + [ + "▁dist", + "inct" + ], + [ + "▁T", + "it" + ], + [ + "▁Ti", + "t" + ], + [ + "▁L", + "ear" + ], + [ + "▁Le", + "ar" + ], + [ + "fer", + "red" + ], + [ + "and", + "roid" + ], + [ + "andro", + "id" + ], + [ + "▁sub", + "sequ" + ], + [ + "▁subs", + "equ" + ], + [ + "▁Flor", 
+ "ida" + ], + [ + "sub", + "set" + ], + [ + "▁whis", + "per" + ], + [ + "Vo", + "l" + ], + [ + "V", + "ol" + ], + [ + "ul", + "ous" + ], + [ + "ulo", + "us" + ], + [ + "▁c", + "rew" + ], + [ + "▁cre", + "w" + ], + [ + "▁cr", + "ew" + ], + [ + "▁l", + "ug" + ], + [ + "▁lu", + "g" + ], + [ + "pi", + "d" + ], + [ + "p", + "id" + ], + [ + "oc", + "ity" + ], + [ + "oci", + "ty" + ], + [ + "o", + "city" + ], + [ + "sk", + "b" + ], + [ + "s", + "kb" + ], + [ + "▁t", + "ea" + ], + [ + "▁te", + "a" + ], + [ + "у", + "н" + ], + [ + "▁hon", + "or" + ], + [ + "▁ho", + "nor" + ], + [ + "▁I", + "ns" + ], + [ + "▁In", + "s" + ], + [ + "▁", + "Ins" + ], + [ + "▁g", + "ew" + ], + [ + "▁ge", + "w" + ], + [ + "▁", + "gew" + ], + [ + "Det", + "ails" + ], + [ + "Detail", + "s" + ], + [ + "ene", + "ath" + ], + [ + "e", + "neath" + ], + [ + "at", + "ar" + ], + [ + "ata", + "r" + ], + [ + "a", + "tar" + ], + [ + "▁_", + "{" + ], + [ + "▁", + "_{" + ], + [ + "am", + "en" + ], + [ + "ame", + "n" + ], + [ + "a", + "men" + ], + [ + "▁set", + "up" + ], + [ + "▁", + "setup" + ], + [ + "Trans", + "action" + ], + [ + "▁bl", + "ank" + ], + [ + "▁", + "blank" + ], + [ + "Fail", + "ed" + ], + [ + "F", + "ailed" + ], + [ + "jo", + "b" + ], + [ + "j", + "ob" + ], + [ + "▁p", + "ret" + ], + [ + "▁pre", + "t" + ], + [ + "▁pr", + "et" + ], + [ + "▁", + "pret" + ], + [ + "ß", + "e" + ], + [ + "lo", + "or" + ], + [ + "l", + "oor" + ], + [ + "ř", + "í" + ], + [ + "nc", + "ia" + ], + [ + "n", + "cia" + ], + [ + "▁any", + "where" + ], + [ + "▁L", + "ight" + ], + [ + "▁Li", + "ght" + ], + [ + "▁", + "Light" + ], + [ + "▁A", + "k" + ], + [ + "B", + "D" + ], + [ + "▁exc", + "ited" + ], + [ + "▁excit", + "ed" + ], + [ + "ag", + "ers" + ], + [ + "age", + "rs" + ], + [ + "ager", + "s" + ], + [ + "a", + "gers" + ], + [ + "▁w", + "arning" + ], + [ + "▁war", + "ning" + ], + [ + "▁warn", + "ing" + ], + [ + "▁", + "warning" + ], + [ + "▁process", + "es" + ], + [ + "▁proces", + "ses" + ], + [ + "h", + "u" + ], + [ + "▁y", + "outh" + ], + [ + "▁you", + "th" + ], + [ + "▁yo", + "uth" + ], + [ + "▁d", + "ogs" + ], + [ + "▁do", + "gs" + ], + [ + "▁dog", + "s" + ], + [ + "▁o", + "ct" + ], + [ + "▁oc", + "t" + ], + [ + "▁", + "oct" + ], + [ + "▁n", + "ine" + ], + [ + "▁ni", + "ne" + ], + [ + "▁nin", + "e" + ], + [ + "Write", + "r" + ], + [ + "Wr", + "iter" + ], + [ + "Writ", + "er" + ], + [ + "W", + "riter" + ], + [ + "gr", + "id" + ], + [ + "g", + "rid" + ], + [ + "▁import", + "ance" + ], + [ + "est", + "ic" + ], + [ + "▁care", + "fully" + ], + [ + "▁careful", + "ly" + ], + [ + "ma", + "ster" + ], + [ + "mas", + "ter" + ], + [ + "m", + "aster" + ], + [ + "▁dec", + "isions" + ], + [ + "▁decision", + "s" + ], + [ + "▁decis", + "ions" + ], + [ + "▁p", + "in" + ], + [ + "▁pi", + "n" + ], + [ + "▁", + "pin" + ], + [ + "▁cr", + "ack" + ], + [ + "TE", + "ST" + ], + [ + "TES", + "T" + ], + [ + "T", + "EST" + ], + [ + "▁L", + "ocal" + ], + [ + "▁Loc", + "al" + ], + [ + "▁Lo", + "cal" + ], + [ + "▁", + "Local" + ], + [ + "▁R", + "ight" + ], + [ + "▁", + "Right" + ], + [ + "▁v", + "ast" + ], + [ + "▁va", + "st" + ], + [ + "▁vas", + "t" + ], + [ + "▁f", + "aster" + ], + [ + "▁fa", + "ster" + ], + [ + "▁fast", + "er" + ], + [ + "▁inst", + "itut" + ], + [ + "▁ann", + "ual" + ], + [ + "LA", + "N" + ], + [ + "L", + "AN" + ], + [ + "▁e", + "pisode" + ], + [ + "▁epis", + "ode" + ], + [ + "▁X", + "V" + ], + [ + "▁del", + "ivery" + ], + [ + "▁deliver", + "y" + ], + [ + "t", + "l" + ], + [ + "F", + "P" + ], + [ + "ci", + "rc" + ], + [ + "cir", + "c" + ], + [ + "▁typ", + 
"ically" + ], + [ + "▁typical", + "ly" + ], + [ + "ig", + "o" + ], + [ + "i", + "go" + ], + [ + "▁int", + "el" + ], + [ + "▁inte", + "l" + ], + [ + "▁", + "intel" + ], + [ + "na", + "t" + ], + [ + "n", + "at" + ], + [ + "x", + "b" + ], + [ + "ст", + "ро" + ], + [ + "с", + "тро" + ], + [ + ")", + "-" + ], + [ + "▁B", + "al" + ], + [ + "▁Ba", + "l" + ], + [ + "▁", + "Bal" + ], + [ + "▁J", + "os" + ], + [ + "▁Jo", + "s" + ], + [ + "▁g", + "onna" + ], + [ + "▁R", + "est" + ], + [ + "▁Re", + "st" + ], + [ + "▁Res", + "t" + ], + [ + "▁", + "Rest" + ], + [ + "jo", + "r" + ], + [ + "j", + "or" + ], + [ + "on", + "ia" + ], + [ + "oni", + "a" + ], + [ + "o", + "nia" + ], + [ + "or", + "ship" + ], + [ + "ors", + "hip" + ], + [ + "ov", + "ery" + ], + [ + "ove", + "ry" + ], + [ + "over", + "y" + ], + [ + "o", + "very" + ], + [ + "LI", + "NE" + ], + [ + "LIN", + "E" + ], + [ + "L", + "INE" + ], + [ + "]", + ":" + ], + [ + "Que", + "ue" + ], + [ + "▁com", + "pare" + ], + [ + "▁comp", + "are" + ], + [ + "▁compar", + "e" + ], + [ + "▁", + "compare" + ], + [ + "▁ap", + "artment" + ], + [ + "▁apart", + "ment" + ], + [ + "▁r", + "ul" + ], + [ + "▁ru", + "l" + ], + [ + "D", + "r" + ], + [ + "gen", + "cy" + ], + [ + "g", + "ency" + ], + [ + "▁ob", + "viously" + ], + [ + "▁obvious", + "ly" + ], + [ + "zi", + "e" + ], + [ + "z", + "ie" + ], + [ + "yc", + "l" + ], + [ + "y", + "cl" + ], + [ + "fort", + "unately" + ], + [ + "fortun", + "ately" + ], + [ + "fortunate", + "ly" + ], + [ + "▁ste", + "pped" + ], + [ + "▁step", + "ped" + ], + [ + "▁S", + "eg" + ], + [ + "▁Se", + "g" + ], + [ + "▁", + "Seg" + ], + [ + "▁Wh", + "ich" + ], + [ + "▁", + "Which" + ], + [ + "▁P", + "C" + ], + [ + "▁", + "PC" + ], + [ + "▁a", + "st" + ], + [ + "▁as", + "t" + ], + [ + "▁", + "ast" + ], + [ + "end", + "or" + ], + [ + "endo", + "r" + ], + [ + "▁per", + "mission" + ], + [ + "▁perm", + "ission" + ], + [ + "▁", + "permission" + ], + [ + "CO", + "L" + ], + [ + "C", + "OL" + ], + [ + "▁T", + "EST" + ], + [ + "▁TE", + "ST" + ], + [ + "▁", + "TEST" + ], + [ + "P", + "ay" + ], + [ + "ère", + "s" + ], + [ + "è", + "res" + ], + [ + "▁stud", + "ied" + ], + [ + "▁accom", + "pl" + ], + [ + "▁accomp", + "l" + ], + [ + "ro", + "le" + ], + [ + "rol", + "e" + ], + [ + "r", + "ole" + ], + [ + "Wh", + "ere" + ], + [ + "Whe", + "re" + ], + [ + "W", + "here" + ], + [ + "proto", + "buf" + ], + [ + "met", + "adata" + ], + [ + "meta", + "data" + ], + [ + "Jo", + "b" + ], + [ + "J", + "ob" + ], + [ + "▁F", + "our" + ], + [ + "▁Fou", + "r" + ], + [ + "▁Fo", + "ur" + ], + [ + "pl", + "ements" + ], + [ + "ple", + "ments" + ], + [ + "plement", + "s" + ], + [ + "dis", + "able" + ], + [ + "▁l", + "oud" + ], + [ + "▁lo", + "ud" + ], + [ + "▁lou", + "d" + ], + [ + "▁happ", + "ening" + ], + [ + "▁happen", + "ing" + ], + [ + "▁U", + "sing" + ], + [ + "▁Us", + "ing" + ], + [ + "▁", + "Using" + ], + [ + "ro", + "g" + ], + [ + "r", + "og" + ], + [ + "▁depend", + "s" + ], + [ + "▁dep", + "ends" + ], + [ + "í", + "m" + ], + [ + "'", + "\\" + ], + [ + "▁t", + "aught" + ], + [ + "sh", + "ared" + ], + [ + "sha", + "red" + ], + [ + "share", + "d" + ], + [ + "▁att", + "ributes" + ], + [ + "▁attribute", + "s" + ], + [ + "▁attribut", + "es" + ], + [ + "▁", + "attributes" + ], + [ + "▁A", + "ction" + ], + [ + "▁Act", + "ion" + ], + [ + "▁", + "Action" + ], + [ + "▁d", + "ess" + ], + [ + "▁de", + "ss" + ], + [ + "▁des", + "s" + ], + [ + "▁", + "dess" + ], + [ + "▁h", + "ouses" + ], + [ + "▁house", + "s" + ], + [ + "▁hous", + "es" + ], + [ + "▁ho", + "uses" + ], + [ + "▁re", + 
"set" + ], + [ + "▁res", + "et" + ], + [ + "▁", + "reset" + ], + [ + "▁b", + "ien" + ], + [ + "▁bi", + "en" + ], + [ + "▁ex", + "plicit" + ], + [ + "▁expl", + "icit" + ], + [ + "LO", + "W" + ], + [ + "->", + "_" + ], + [ + "▁P", + "M" + ], + [ + "▁", + "PM" + ], + [ + "C", + "ategory" + ], + [ + "oi", + "ce" + ], + [ + "o", + "ice" + ], + [ + "in", + "to" + ], + [ + "int", + "o" + ], + [ + "▁m", + "ail" + ], + [ + "▁ma", + "il" + ], + [ + "▁mai", + "l" + ], + [ + "▁", + "mail" + ], + [ + "▁author", + "ity" + ], + [ + "▁un", + "able" + ], + [ + "▁una", + "ble" + ], + [ + "file", + "name" + ], + [ + "fil", + "ename" + ], + [ + "é", + "k" + ], + [ + "ле", + "й" + ], + [ + "л", + "ей" + ], + [ + "▁s", + "ector" + ], + [ + "▁se", + "ctor" + ], + [ + "▁sec", + "tor" + ], + [ + "▁sect", + "or" + ], + [ + "ap", + "point" + ], + [ + "app", + "oint" + ], + [ + "▁h", + "ang" + ], + [ + "▁ha", + "ng" + ], + [ + "▁han", + "g" + ], + [ + "▁", + "hang" + ], + [ + "▁c", + "el" + ], + [ + "▁ce", + "l" + ], + [ + "▁", + "cel" + ], + [ + "rel", + "ated" + ], + [ + "it", + "ate" + ], + [ + "ita", + "te" + ], + [ + "itat", + "e" + ], + [ + "▁'", + "<" + ], + [ + "am", + "ber" + ], + [ + "amb", + "er" + ], + [ + "a", + "mber" + ], + [ + "▁c", + "heap" + ], + [ + "▁che", + "ap" + ], + [ + "▁en", + "abled" + ], + [ + "▁enable", + "d" + ], + [ + "▁", + "enabled" + ], + [ + "▁di", + "vision" + ], + [ + "▁div", + "ision" + ], + [ + "▁divis", + "ion" + ], + [ + "An", + "y" + ], + [ + "A", + "ny" + ], + [ + "▁h", + "ier" + ], + [ + "▁hi", + "er" + ], + [ + "▁H", + "ead" + ], + [ + "▁He", + "ad" + ], + [ + "▁", + "Head" + ], + [ + "nt", + "ax" + ], + [ + "n", + "tax" + ], + [ + "ud", + "a" + ], + [ + "u", + "da" + ], + [ + "▁lim", + "itations" + ], + [ + "▁limit", + "ations" + ], + [ + "▁limitation", + "s" + ], + [ + "▁st", + "udio" + ], + [ + "▁stud", + "io" + ], + [ + "med", + "ia" + ], + [ + "medi", + "a" + ], + [ + "m", + "edia" + ], + [ + "▁cir", + "cle" + ], + [ + "▁circ", + "le" + ], + [ + "▁", + "circle" + ], + [ + "но", + "ва" + ], + [ + "нов", + "а" + ], + [ + "▁l", + "aug" + ], + [ + "▁la", + "ug" + ], + [ + "ac", + "ts" + ], + [ + "act", + "s" + ], + [ + "▁В", + "о" + ], + [ + "ó", + "d" + ], + [ + "pl", + "ed" + ], + [ + "ple", + "d" + ], + [ + "p", + "led" + ], + [ + "LO", + "C" + ], + [ + "L", + "OC" + ], + [ + "Ex", + "pr" + ], + [ + "Exp", + "r" + ], + [ + ">", + ":" + ], + [ + "▁pr", + "és" + ], + [ + "▁pré", + "s" + ], + [ + "▁", + "prés" + ], + [ + "▁laugh", + "ed" + ], + [ + "▁laug", + "hed" + ], + [ + "▁Th", + "ree" + ], + [ + "▁", + "Three" + ], + [ + "л", + "ы" + ], + [ + "▁en", + "ds" + ], + [ + "▁end", + "s" + ], + [ + "▁", + "ends" + ], + [ + "▁fund", + "ament" + ], + [ + "▁in", + "her" + ], + [ + "▁", + "inher" + ], + [ + "▁l", + "iv" + ], + [ + "▁li", + "v" + ], + [ + "▁", + "liv" + ], + [ + "bi", + "d" + ], + [ + "b", + "id" + ], + [ + "▁respons", + "ibility" + ], + [ + "▁check", + "ed" + ], + [ + "▁", + "checked" + ], + [ + "▁P", + "ac" + ], + [ + "▁Pa", + "c" + ], + [ + "▁f", + "ault" + ], + [ + "▁fa", + "ult" + ], + [ + "▁y", + "ellow" + ], + [ + "▁s", + "alt" + ], + [ + "▁sa", + "lt" + ], + [ + "▁sal", + "t" + ], + [ + "▁Franc", + "isco" + ], + [ + "▁Francis", + "co" + ], + [ + "▁", + "^" + ], + [ + "▁O", + "N" + ], + [ + "▁", + "ON" + ], + [ + "▁beaut", + "y" + ], + [ + "y", + "g" + ], + [ + "▁A", + "ff" + ], + [ + "▁Af", + "f" + ], + [ + "▁", + "Aff" + ], + [ + "▁E", + "q" + ], + [ + "▁", + "Eq" + ], + [ + "▁mag", + "ic" + ], + [ + "▁hand", + "ler" + ], + [ + "▁handle", + "r" + ], + 
[ + "▁", + "handler" + ], + [ + "x", + "E" + ], + [ + "▁numer", + "ous" + ], + [ + "▁numero", + "us" + ], + [ + "▁h", + "ole" + ], + [ + "▁hol", + "e" + ], + [ + "▁ho", + "le" + ], + [ + "▁", + "hole" + ], + [ + "▁ro", + "oms" + ], + [ + "▁room", + "s" + ], + [ + "▁", + "rooms" + ], + [ + "cc", + "ión" + ], + [ + "cció", + "n" + ], + [ + "c", + "ción" + ], + [ + "▁A", + "rm" + ], + [ + "▁Ar", + "m" + ], + [ + "▁", + "Arm" + ], + [ + "per", + "son" + ], + [ + "pers", + "on" + ], + [ + "p", + "erson" + ], + [ + "▁build", + "ings" + ], + [ + "▁building", + "s" + ], + [ + "▁p", + "late" + ], + [ + "▁pl", + "ate" + ], + [ + "▁plat", + "e" + ], + [ + "ble", + "d" + ], + [ + "bl", + "ed" + ], + [ + "b", + "led" + ], + [ + "er", + "rors" + ], + [ + "err", + "ors" + ], + [ + "error", + "s" + ], + [ + "▁A", + "gain" + ], + [ + "▁Ag", + "ain" + ], + [ + "▁Def", + "ault" + ], + [ + "▁", + "Default" + ], + [ + "▁H", + "ard" + ], + [ + "▁Har", + "d" + ], + [ + "▁Ha", + "rd" + ], + [ + "▁", + "Hard" + ], + [ + "t", + "ó" + ], + [ + "hu", + "s" + ], + [ + "h", + "us" + ], + [ + "▁dim", + "ension" + ], + [ + "ial", + "e" + ], + [ + "ia", + "le" + ], + [ + "i", + "ale" + ], + [ + "▁M", + "ult" + ], + [ + "▁Mu", + "lt" + ], + [ + "▁Mul", + "t" + ], + [ + "▁", + "Mult" + ], + [ + "▁Govern", + "ment" + ], + [ + "Fun", + "c" + ], + [ + "F", + "unc" + ], + [ + "▁b", + "low" + ], + [ + "▁bl", + "ow" + ], + [ + "▁blo", + "w" + ], + [ + "▁re", + "ct" + ], + [ + "▁r", + "ect" + ], + [ + "▁rec", + "t" + ], + [ + "▁", + "rect" + ], + [ + "er", + "ra" + ], + [ + "err", + "a" + ], + [ + "conne", + "ction" + ], + [ + "connect", + "ion" + ], + [ + "conn", + "ection" + ], + [ + "▁pass", + "ing" + ], + [ + "▁pas", + "sing" + ], + [ + "ße", + "n" + ], + [ + "ß", + "en" + ], + [ + "ph", + "as" + ], + [ + "pha", + "s" + ], + [ + "p", + "has" + ], + [ + "ens", + "ional" + ], + [ + "ension", + "al" + ], + [ + "re", + "cord" + ], + [ + "rec", + "ord" + ], + [ + "co", + "hol" + ], + [ + "▁H", + "arry" + ], + [ + "▁Har", + "ry" + ], + [ + "▁Harr", + "y" + ], + [ + "izont", + "al" + ], + [ + "izon", + "tal" + ], + [ + "▁f", + "inger" + ], + [ + "▁fin", + "ger" + ], + [ + "▁fing", + "er" + ], + [ + "▁young", + "er" + ], + [ + "▁S", + "C" + ], + [ + "▁", + "SC" + ], + [ + "oper", + "ation" + ], + [ + "B", + "Y" + ], + [ + "he", + "im" + ], + [ + "▁B", + "ad" + ], + [ + "▁Ba", + "d" + ], + [ + "▁", + "Bad" + ], + [ + "▁st", + "orm" + ], + [ + "▁stor", + "m" + ], + [ + "▁sto", + "rm" + ], + [ + "▁", + "storm" + ], + [ + "▁N", + "at" + ], + [ + "▁Na", + "t" + ], + [ + "▁bu", + "ying" + ], + [ + "▁buy", + "ing" + ], + [ + "▁S", + "ometimes" + ], + [ + "▁Some", + "times" + ], + [ + "▁С", + "та" + ], + [ + "es", + "sed" + ], + [ + "ess", + "ed" + ], + [ + "esse", + "d" + ], + [ + "▁da", + "mn" + ], + [ + "▁dam", + "n" + ], + [ + "▁m", + "eg" + ], + [ + "▁me", + "g" + ], + [ + "um", + "es" + ], + [ + "ume", + "s" + ], + [ + "u", + "mes" + ], + [ + "ün", + "d" + ], + [ + "ü", + "nd" + ], + [ + "т", + "ра" + ], + [ + "▁sil", + "ver" + ], + [ + "w", + "d" + ], + [ + "hid", + "den" + ], + [ + "h", + "idden" + ], + [ + "ar", + "do" + ], + [ + "ard", + "o" + ], + [ + "▁commun", + "ities" + ], + [ + "▁d", + "iet" + ], + [ + "▁di", + "et" + ], + [ + "▁die", + "t" + ], + [ + "ot", + "ted" + ], + [ + "ott", + "ed" + ], + [ + "otte", + "d" + ], + [ + "▁b", + "at" + ], + [ + "▁ba", + "t" + ], + [ + "▁", + "bat" + ], + [ + "an", + "cer" + ], + [ + "ance", + "r" + ], + [ + "anc", + "er" + ], + [ + "▁f", + "mt" + ], + [ + "▁", + "fmt" + ], + [ + "▁P", + 
"en" + ], + [ + "▁Pe", + "n" + ], + [ + "▁", + "Pen" + ], + [ + "▁t", + "il" + ], + [ + "▁ti", + "l" + ], + [ + "▁", + "til" + ], + [ + "En", + "um" + ], + [ + "E", + "num" + ], + [ + "PA", + "TH" + ], + [ + "P", + "ATH" + ], + [ + "▁mat", + "ters" + ], + [ + "▁matter", + "s" + ], + [ + "▁matt", + "ers" + ], + [ + "time", + "out" + ], + [ + "--", + "----------" + ], + [ + "----", + "--------" + ], + [ + "--------", + "----" + ], + [ + "---", + "---------" + ], + [ + "-----", + "-------" + ], + [ + "----------", + "--" + ], + [ + "------", + "------" + ], + [ + "---------", + "---" + ], + [ + "-------", + "-----" + ], + [ + "-----------", + "-" + ], + [ + "-", + "-----------" + ], + [ + "ka", + "n" + ], + [ + "k", + "an" + ], + [ + "▁Cor", + "por" + ], + [ + "=\"", + "../../" + ], + [ + "=\"../", + "../" + ], + [ + "▁A", + "le" + ], + [ + "▁Al", + "e" + ], + [ + "hent", + "ication" + ], + [ + "hentic", + "ation" + ], + [ + "▁com", + "plic" + ], + [ + "▁comp", + "lic" + ], + [ + "▁compl", + "ic" + ], + [ + "▁Se", + "curity" + ], + [ + "▁Sec", + "urity" + ], + [ + "▁", + "Security" + ], + [ + "OF", + "F" + ], + [ + "O", + "FF" + ], + [ + "R", + "ad" + ], + [ + "ap", + "se" + ], + [ + "aps", + "e" + ], + [ + "a", + "pse" + ], + [ + "▁d", + "ance" + ], + [ + "▁dan", + "ce" + ], + [ + "▁perm", + "issions" + ], + [ + "▁permission", + "s" + ], + [ + "▁war", + "rant" + ], + [ + "▁l", + "ad" + ], + [ + "▁la", + "d" + ], + [ + "▁", + "lad" + ], + [ + "▁is", + "ol" + ], + [ + "▁i", + "sol" + ], + [ + "d", + "l" + ], + [ + "▁A", + "u" + ], + [ + "ye", + "s" + ], + [ + "y", + "es" + ], + [ + "▁t", + "v" + ], + [ + "▁", + "tv" + ], + [ + "▁pro", + "vider" + ], + [ + "▁prov", + "ider" + ], + [ + "▁provide", + "r" + ], + [ + "▁", + "provider" + ], + [ + "▁ter", + "rible" + ], + [ + "▁terr", + "ible" + ], + [ + "▁dep", + "artment" + ], + [ + "▁depart", + "ment" + ], + [ + "er", + "al" + ], + [ + "era", + "l" + ], + [ + "e", + "ral" + ], + [ + "▁implement", + "ation" + ], + [ + "S", + "R" + ], + [ + "▁h", + "earing" + ], + [ + "▁he", + "aring" + ], + [ + "▁hear", + "ing" + ], + [ + "▁K", + "n" + ], + [ + "F", + "R" + ], + [ + "t", + "v" + ], + [ + "▁d", + "iss" + ], + [ + "▁dis", + "s" + ], + [ + "▁di", + "ss" + ], + [ + "F", + "UN" + ], + [ + "▁dur", + "ante" + ], + [ + "▁durant", + "e" + ], + [ + "os", + "is" + ], + [ + "osi", + "s" + ], + [ + "o", + "sis" + ], + [ + "▁task", + "s" + ], + [ + "▁", + "tasks" + ], + [ + "▁B", + "lo" + ], + [ + "▁Bl", + "o" + ], + [ + "▁", + "Blo" + ], + [ + "во", + "д" + ], + [ + "▁br", + "anch" + ], + [ + "▁", + "branch" + ], + [ + "▁polit", + "ics" + ], + [ + "▁E", + "lle" + ], + [ + "▁El", + "le" + ], + [ + "▁Ell", + "e" + ], + [ + "▁lead", + "ership" + ], + [ + "▁leader", + "ship" + ], + [ + "▁leaders", + "hip" + ], + [ + "ex", + "pr" + ], + [ + "exp", + "r" + ], + [ + "▁techn", + "iques" + ], + [ + "▁technique", + "s" + ], + [ + "pr", + "ec" + ], + [ + "pre", + "c" + ], + [ + "p", + "rec" + ], + [ + "Sig", + "ma" + ], + [ + "S", + "igma" + ], + [ + "im", + "ately" + ], + [ + "imate", + "ly" + ], + [ + "imat", + "ely" + ], + [ + "t", + "k" + ], + [ + "ach", + "ment" + ], + [ + "▁En", + "ter" + ], + [ + "▁Ent", + "er" + ], + [ + "▁", + "Enter" + ], + [ + "▁cre", + "ative" + ], + [ + "▁creat", + "ive" + ], + [ + "▁з", + "на" + ], + [ + "▁", + "зна" + ], + [ + "ap", + "py" + ], + [ + "app", + "y" + ], + [ + "un", + "ched" + ], + [ + "unch", + "ed" + ], + [ + "unc", + "hed" + ], + [ + "▁'", + "'," + ], + [ + "▁''", + "," + ], + [ + "on", + "der" + ], + [ + "ond", + "er" + ], 
+ [ + "onde", + "r" + ], + [ + "o", + "nder" + ], + [ + "{", + "-" + ], + [ + "NU", + "M" + ], + [ + "N", + "UM" + ], + [ + "▁n", + "arr" + ], + [ + "▁na", + "rr" + ], + [ + "▁nar", + "r" + ], + [ + "Mem", + "ory" + ], + [ + "▁win", + "ning" + ], + [ + "▁", + "winning" + ], + [ + "▁F", + "ollow" + ], + [ + "▁Fol", + "low" + ], + [ + "▁", + "Follow" + ], + [ + "*/", + "\r" + ], + [ + "vis", + "ion" + ], + [ + "v", + "ision" + ], + [ + "res", + "ents" + ], + [ + "resent", + "s" + ], + [ + "zi", + "one" + ], + [ + "z", + "ione" + ], + [ + "▁l", + "atter" + ], + [ + "▁lat", + "ter" + ], + [ + "▁requ", + "ests" + ], + [ + "▁request", + "s" + ], + [ + "▁", + "requests" + ], + [ + "▁m", + "argin" + ], + [ + "▁mar", + "gin" + ], + [ + "▁marg", + "in" + ], + [ + "▁", + "margin" + ], + [ + "▁{", + "\"" + ], + [ + "▁", + "{\"" + ], + [ + "v", + "ideo" + ], + [ + "c", + "n" + ], + [ + "▁Im", + "age" + ], + [ + "▁", + "Image" + ], + [ + "T", + "im" + ], + [ + "CON", + "FIG" + ], + [ + "CONF", + "IG" + ], + [ + "▁all", + "owing" + ], + [ + "▁allow", + "ing" + ], + [ + "▁comb", + "ined" + ], + [ + "▁combine", + "d" + ], + [ + "PU", + "T" + ], + [ + "P", + "UT" + ], + [ + "▁instance", + "of" + ], + [ + "ig", + "in" + ], + [ + "igi", + "n" + ], + [ + "i", + "gin" + ], + [ + "▁p", + "ero" + ], + [ + "▁per", + "o" + ], + [ + "▁pe", + "ro" + ], + [ + "▁'", + "'" + ], + [ + "▁", + "''" + ], + [ + "▁conf", + "idence" + ], + [ + "▁equ", + "ivalent" + ], + [ + "▁equival", + "ent" + ], + [ + "pa", + "d" + ], + [ + "p", + "ad" + ], + [ + "ef", + "fect" + ], + [ + "eff", + "ect" + ], + [ + "e", + "ffect" + ], + [ + "R", + "X" + ], + [ + "▁l", + "ang" + ], + [ + "▁la", + "ng" + ], + [ + "▁lan", + "g" + ], + [ + "▁", + "lang" + ], + [ + "str", + "ong" + ], + [ + "▁b", + "ridge" + ], + [ + "▁br", + "idge" + ], + [ + "▁", + "bridge" + ], + [ + "ay", + "a" + ], + [ + "a", + "ya" + ], + [ + "▁t", + "reated" + ], + [ + "▁tre", + "ated" + ], + [ + "▁treat", + "ed" + ], + [ + "▁f", + "orth" + ], + [ + "▁for", + "th" + ], + [ + "▁fort", + "h" + ], + [ + "S", + "W" + ], + [ + "▁account", + "s" + ], + [ + "▁P", + "O" + ], + [ + "▁", + "PO" + ], + [ + "▁list", + "ening" + ], + [ + "▁listen", + "ing" + ], + [ + "Ro", + "ute" + ], + [ + "R", + "oute" + ], + [ + "()", + "))" + ], + [ + "())", + ")" + ], + [ + "(", + ")))" + ], + [ + "cp", + "y" + ], + [ + "c", + "py" + ], + [ + "▁re", + "form" + ], + [ + "▁ref", + "orm" + ], + [ + "▁g", + "ate" + ], + [ + "▁ga", + "te" + ], + [ + "▁", + "gate" + ], + [ + "▁W", + "alk" + ], + [ + "▁Wal", + "k" + ], + [ + "▁", + "Walk" + ], + [ + "▁some", + "how" + ], + [ + "t", + "f" + ], + [ + "▁l", + "ayout" + ], + [ + "▁la", + "yout" + ], + [ + "▁lay", + "out" + ], + [ + "▁", + "layout" + ], + [ + "um", + "in" + ], + [ + "umi", + "n" + ], + [ + "u", + "min" + ], + [ + "▁consider", + "ing" + ], + [ + "▁consid", + "ering" + ], + [ + "▁pre", + "mi" + ], + [ + "▁pr", + "emi" + ], + [ + "▁prem", + "i" + ], + [ + "▁M", + "om" + ], + [ + "▁Mo", + "m" + ], + [ + "at", + "han" + ], + [ + "ath", + "an" + ], + [ + "a", + "than" + ], + [ + "Ge", + "n" + ], + [ + "G", + "en" + ], + [ + "▁plan", + "et" + ], + [ + "▁plane", + "t" + ], + [ + "am", + "ples" + ], + [ + "amp", + "les" + ], + [ + "ample", + "s" + ], + [ + "▁M", + "O" + ], + [ + "▁", + "MO" + ], + [ + "sh", + "op" + ], + [ + "s", + "hop" + ], + [ + "▁prem", + "ier" + ], + [ + "▁premi", + "er" + ], + [ + "▁s", + "impl" + ], + [ + "▁sim", + "pl" + ], + [ + "▁s", + "egu" + ], + [ + "▁se", + "gu" + ], + [ + "▁seg", + "u" + ], + [ + "L", + "Y" + ], + [ + 
"Su", + "m" + ], + [ + "S", + "um" + ], + [ + "▁t", + "ables" + ], + [ + "▁table", + "s" + ], + [ + "▁tab", + "les" + ], + [ + "▁ta", + "bles" + ], + [ + "▁", + "tables" + ], + [ + "sk", + "a" + ], + [ + "s", + "ka" + ], + [ + "▁", + "ž" + ], + [ + "p", + "d" + ], + [ + "▁s", + "ous" + ], + [ + "▁so", + "us" + ], + [ + "▁sou", + "s" + ], + [ + "▁con", + "ference" + ], + [ + "▁confer", + "ence" + ], + [ + "▁D", + "at" + ], + [ + "▁Da", + "t" + ], + [ + "▁", + "Dat" + ], + [ + "Sc", + "roll" + ], + [ + "▁stand", + "ards" + ], + [ + "▁standard", + "s" + ], + [ + "▁г", + "ру" + ], + [ + "es", + "se" + ], + [ + "ess", + "e" + ], + [ + "▁citiz", + "ens" + ], + [ + "▁citizen", + "s" + ], + [ + "▁occur", + "red" + ], + [ + "▁dem", + "ocr" + ], + [ + "▁demo", + "cr" + ], + [ + "▁e", + "lev" + ], + [ + "▁el", + "ev" + ], + [ + "▁ele", + "v" + ], + [ + "▁S", + "em" + ], + [ + "▁Se", + "m" + ], + [ + "▁", + "Sem" + ], + [ + "ens", + "us" + ], + [ + "he", + "aders" + ], + [ + "head", + "ers" + ], + [ + "header", + "s" + ], + [ + "▁Ch", + "ris" + ], + [ + "im", + "ento" + ], + [ + "iment", + "o" + ], + [ + "imen", + "to" + ], + [ + "ko", + "m" + ], + [ + "k", + "om" + ], + [ + "Co", + "r" + ], + [ + "C", + "or" + ], + [ + "MI", + "N" + ], + [ + "M", + "IN" + ], + [ + "us", + "her" + ], + [ + "ush", + "er" + ], + [ + "Data", + "base" + ], + [ + "Dat", + "abase" + ], + [ + "▁f", + "ormal" + ], + [ + "▁for", + "mal" + ], + [ + "▁form", + "al" + ], + [ + "▁forma", + "l" + ], + [ + "ig", + "ne" + ], + [ + "ign", + "e" + ], + [ + "▁organ", + "izations" + ], + [ + "▁organiz", + "ations" + ], + [ + "▁organization", + "s" + ], + [ + "▁I", + "re" + ], + [ + "▁Ir", + "e" + ], + [ + "X", + "ml" + ], + [ + "и", + "з" + ], + [ + "▁p", + "ray" + ], + [ + "▁pr", + "ay" + ], + [ + "▁pra", + "y" + ], + [ + "▁b", + "omb" + ], + [ + "▁bo", + "mb" + ], + [ + "▁bom", + "b" + ], + [ + "▁m", + "and" + ], + [ + "▁man", + "d" + ], + [ + "▁ma", + "nd" + ], + [ + "▁", + "mand" + ], + [ + "er", + "ts" + ], + [ + "ert", + "s" + ], + [ + "▁c", + "lock" + ], + [ + "▁cl", + "ock" + ], + [ + "▁clo", + "ck" + ], + [ + "▁", + "clock" + ], + [ + "▁b", + "uck" + ], + [ + "▁bu", + "ck" + ], + [ + "ва", + "ли" + ], + [ + "вал", + "и" + ], + [ + "в", + "али" + ], + [ + "en", + "sch" + ], + [ + "ens", + "ch" + ], + [ + "▁v", + "olt" + ], + [ + "▁vo", + "lt" + ], + [ + "▁vol", + "t" + ], + [ + "▁", + "volt" + ], + [ + "▁fil", + "ms" + ], + [ + "▁film", + "s" + ], + [ + "▁pl", + "ants" + ], + [ + "▁plan", + "ts" + ], + [ + "▁plant", + "s" + ], + [ + "in", + "ode" + ], + [ + "ino", + "de" + ], + [ + "i", + "node" + ], + [ + "Bo", + "olean" + ], + [ + "▁restaur", + "ant" + ], + [ + "ía", + "n" + ], + [ + "í", + "an" + ], + [ + "▁de", + "but" + ], + [ + "▁deb", + "ut" + ], + [ + "page", + "s" + ], + [ + "pa", + "ges" + ], + [ + "pag", + "es" + ], + [ + "p", + "ages" + ], + [ + "▁wor", + "dt" + ], + [ + "▁word", + "t" + ], + [ + "▁Б", + "а" + ], + [ + "▁great", + "est" + ], + [ + "(\"", + "/" + ], + [ + "▁c", + "opyright" + ], + [ + "▁copy", + "right" + ], + [ + "▁", + "copyright" + ], + [ + "▁r", + "it" + ], + [ + "▁ri", + "t" + ], + [ + "▁", + "rit" + ], + [ + "size", + "of" + ], + [ + "Tr", + "ace" + ], + [ + "Tra", + "ce" + ], + [ + "ue", + "nt" + ], + [ + "uen", + "t" + ], + [ + "u", + "ent" + ], + [ + "ту", + "р" + ], + [ + "т", + "ур" + ], + [ + "▁k", + "o" + ], + [ + "▁", + "ko" + ], + [ + ":", + "\\" + ], + [ + "▁b", + "igger" + ], + [ + "▁big", + "ger" + ], + [ + "▁perfect", + "ly" + ], + [ + "ten", + "ance" + ], + [ + "MA", + "SK" + ], + [ 
+ "M", + "ASK" + ], + [ + "r", + "é" + ], + [ + "▁e", + "tt" + ], + [ + "▁et", + "t" + ], + [ + "▁", + "ett" + ], + [ + "▁n", + "ose" + ], + [ + "▁no", + "se" + ], + [ + "▁nos", + "e" + ], + [ + "▁c", + "raft" + ], + [ + "▁cr", + "aft" + ], + [ + "▁", + "craft" + ], + [ + "it", + "eral" + ], + [ + "ite", + "ral" + ], + [ + "iter", + "al" + ], + [ + "▁discuss", + "ed" + ], + [ + "▁Jew", + "ish" + ], + [ + "C", + "ap" + ], + [ + "▁Un", + "less" + ], + [ + "▁Jack", + "son" + ], + [ + "Att", + "ributes" + ], + [ + "Attribute", + "s" + ], + [ + "Attrib", + "utes" + ], + [ + "▁l", + "unch" + ], + [ + "▁lun", + "ch" + ], + [ + "ö", + "l" + ], + [ + "at", + "r" + ], + [ + "a", + "tr" + ], + [ + "▁pay", + "ing" + ], + [ + "▁pa", + "ying" + ], + [ + "Par", + "se" + ], + [ + "Pars", + "e" + ], + [ + "P", + "arse" + ], + [ + "()", + "\r" + ], + [ + "(", + ")\r" + ], + [ + "la", + "d" + ], + [ + "l", + "ad" + ], + [ + "▁r", + "are" + ], + [ + "▁ra", + "re" + ], + [ + "▁[", + "];" + ], + [ + "▁[]", + ";" + ], + [ + "▁", + "[];" + ], + [ + "st", + "one" + ], + [ + "ston", + "e" + ], + [ + "sto", + "ne" + ], + [ + "▁u", + "nc" + ], + [ + "▁un", + "c" + ], + [ + "▁", + "unc" + ], + [ + "▁def", + "ense" + ], + [ + "▁defens", + "e" + ], + [ + "}", + "+" + ], + [ + "▁Gl", + "obal" + ], + [ + "▁", + "Global" + ], + [ + "▁Sov", + "iet" + ], + [ + "▁Austral", + "ian" + ], + [ + "▁Australia", + "n" + ], + [ + "▁g", + "li" + ], + [ + "▁gl", + "i" + ], + [ + "var", + "iant" + ], + [ + "vari", + "ant" + ], + [ + "▁R", + "on" + ], + [ + "▁Ro", + "n" + ], + [ + "▁lo", + "an" + ], + [ + "St", + "ep" + ], + [ + "Ste", + "p" + ], + [ + "me", + "mber" + ], + [ + "mem", + "ber" + ], + [ + "m", + "ember" + ], + [ + "Sc", + "h" + ], + [ + "S", + "ch" + ], + [ + "▁Commit", + "tee" + ], + [ + "▁s", + "pending" + ], + [ + "▁sp", + "ending" + ], + [ + "▁spend", + "ing" + ], + [ + "▁T", + "ri" + ], + [ + "▁Tr", + "i" + ], + [ + "▁", + "Tri" + ], + [ + "▁J", + "ournal" + ], + [ + "▁Jour", + "nal" + ], + [ + "▁", + "Journal" + ], + [ + "▁s", + "ugar" + ], + [ + "▁su", + "gar" + ], + [ + "▁sug", + "ar" + ], + [ + "el", + "ly" + ], + [ + "ell", + "y" + ], + [ + "HT", + "ML" + ], + [ + "▁ad", + "vent" + ], + [ + "▁adv", + "ent" + ], + [ + "win", + "g" + ], + [ + "wi", + "ng" + ], + [ + "w", + "ing" + ], + [ + "▁Wh", + "ether" + ], + [ + "▁Whe", + "ther" + ], + [ + "or", + "ation" + ], + [ + "▁N", + "E" + ], + [ + "▁", + "NE" + ], + [ + "iv", + "eness" + ], + [ + "ive", + "ness" + ], + [ + "iven", + "ess" + ], + [ + "▁h", + "av" + ], + [ + "▁ha", + "v" + ], + [ + "▁", + "hav" + ], + [ + "▁con", + "scious" + ], + [ + "▁", + "conscious" + ], + [ + "ee", + "n" + ], + [ + "e", + "en" + ], + [ + "Sym", + "bol" + ], + [ + "S", + "ymbol" + ], + [ + "▁к", + "у" + ], + [ + "▁", + "ку" + ], + [ + "Log", + "ger" + ], + [ + "▁L", + "ittle" + ], + [ + "▁Lit", + "tle" + ], + [ + "wide", + "t" + ], + [ + "wi", + "det" + ], + [ + "wid", + "et" + ], + [ + "oc", + "ation" + ], + [ + "pi", + "n" + ], + [ + "p", + "in" + ], + [ + "▁sym", + "met" + ], + [ + "▁A", + "D" + ], + [ + "▁", + "AD" + ], + [ + "▁pos", + "ts" + ], + [ + "▁po", + "sts" + ], + [ + "▁post", + "s" + ], + [ + "▁", + "posts" + ], + [ + "sh", + "al" + ], + [ + "sha", + "l" + ], + [ + "s", + "hal" + ], + [ + "▁Con", + "f" + ], + [ + "▁Co", + "nf" + ], + [ + "▁", + "Conf" + ], + [ + "▁ch", + "ose" + ], + [ + "▁cho", + "se" + ], + [ + "ma", + "l" + ], + [ + "m", + "al" + ], + [ + "ul", + "o" + ], + [ + "u", + "lo" + ], + [ + "▁M", + "ethod" + ], + [ + "▁", + "Method" + ], + [ + "▁miss", + 
"ed" + ], + [ + "▁mis", + "sed" + ], + [ + "Re", + "move" + ], + [ + "Rem", + "ove" + ], + [ + "Aut", + "o" + ], + [ + "A", + "uto" + ], + [ + "VAL", + "UE" + ], + [ + "th", + "let" + ], + [ + "▁For", + "ce" + ], + [ + "▁", + "Force" + ], + [ + "p", + "f" + ], + [ + "▁", + "Я" + ], + [ + "la", + "te" + ], + [ + "lat", + "e" + ], + [ + "l", + "ate" + ], + [ + "▁p", + "ul" + ], + [ + "▁pu", + "l" + ], + [ + "▁", + "pul" + ], + [ + "Po", + "p" + ], + [ + "P", + "op" + ], + [ + "▁adv", + "anced" + ], + [ + "▁advance", + "d" + ], + [ + "air", + "es" + ], + [ + "ai", + "res" + ], + [ + "aire", + "s" + ], + [ + "a", + "ires" + ], + [ + "res", + "sed" + ], + [ + "ress", + "ed" + ], + [ + "resse", + "d" + ], + [ + "r", + "essed" + ], + [ + "AM", + "E" + ], + [ + "A", + "ME" + ], + [ + "be", + "ll" + ], + [ + "bel", + "l" + ], + [ + "b", + "ell" + ], + [ + "ac", + "hing" + ], + [ + "ach", + "ing" + ], + [ + "achi", + "ng" + ], + [ + "a", + "ching" + ], + [ + "i", + "ć" + ], + [ + "ec", + "ho" + ], + [ + "ech", + "o" + ], + [ + "e", + "cho" + ], + [ + "H", + "S" + ], + [ + "▁fun", + "ny" + ], + [ + "ри", + "и" + ], + [ + "▁e", + "er" + ], + [ + "▁ve", + "get" + ], + [ + "▁four", + "th" + ], + [ + "c", + "f" + ], + [ + "trans", + "form" + ], + [ + "▁g", + "rown" + ], + [ + "▁gr", + "own" + ], + [ + "▁grow", + "n" + ], + [ + "▁gro", + "wn" + ], + [ + "▁Mc", + "C" + ], + [ + "si", + "te" + ], + [ + "s", + "ite" + ], + [ + "▁b", + "eneath" + ], + [ + "▁be", + "neath" + ], + [ + "▁s", + "hell" + ], + [ + "▁sh", + "ell" + ], + [ + "▁she", + "ll" + ], + [ + "▁shel", + "l" + ], + [ + "▁", + "shell" + ], + [ + "x", + "d" + ], + [ + "Pl", + "ay" + ], + [ + "P", + "lay" + ], + [ + "sh", + "ort" + ], + [ + "Ro", + "le" + ], + [ + "R", + "ole" + ], + [ + "▁relig", + "ion" + ], + [ + "in", + "ator" + ], + [ + "ina", + "tor" + ], + [ + "}", + "", + "<" + ], + [ + "\"", + "><" + ], + [ + "as", + "p" + ], + [ + "a", + "sp" + ], + [ + "aj", + "o" + ], + [ + "a", + "jo" + ], + [ + "ex", + "ports" + ], + [ + "exp", + "orts" + ], + [ + "export", + "s" + ], + [ + "▁N", + "ode" + ], + [ + "▁No", + "de" + ], + [ + "▁", + "Node" + ], + [ + "▁j", + "ako" + ], + [ + "▁ja", + "ko" + ], + [ + "▁jak", + "o" + ], + [ + "▁y", + "a" + ], + [ + "▁", + "ya" + ], + [ + "▁success", + "fully" + ], + [ + "▁successful", + "ly" + ], + [ + "▁friend", + "ly" + ], + [ + "▁", + "friendly" + ], + [ + "buf", + "f" + ], + [ + "bu", + "ff" + ], + [ + "b", + "uff" + ], + [ + "DE", + "FAULT" + ], + [ + "▁pre", + "gn" + ], + [ + "▁preg", + "n" + ], + [ + "Requ", + "ired" + ], + [ + "Require", + "d" + ], + [ + "▁b", + "inary" + ], + [ + "▁bin", + "ary" + ], + [ + "▁", + "binary" + ], + [ + "is", + "ting" + ], + [ + "ist", + "ing" + ], + [ + "isti", + "ng" + ], + [ + "▁st", + "ared" + ], + [ + "▁star", + "ed" + ], + [ + "▁stare", + "d" + ], + [ + "▁sta", + "red" + ], + [ + "▁circum", + "stances" + ], + [ + "▁х", + "о" + ], + [ + "▁", + "хо" + ], + [ + "re", + "i" + ], + [ + "r", + "ei" + ], + [ + "▁Г", + "о" + ], + [ + "Trans", + "form" + ], + [ + "cn", + "t" + ], + [ + "c", + "nt" + ], + [ + "▁E", + "xt" + ], + [ + "▁Ex", + "t" + ], + [ + "▁", + "Ext" + ], + [ + "re", + "port" + ], + [ + "rep", + "ort" + ], + [ + "repo", + "rt" + ], + [ + "VER", + "SION" + ], + [ + "▁an", + "aly" + ], + [ + "▁anal", + "y" + ], + [ + "▁", + "analy" + ], + [ + "▁M", + "arg" + ], + [ + "▁Mar", + "g" + ], + [ + "▁Ma", + "rg" + ], + [ + "▁al", + "leg" + ], + [ + "▁all", + "eg" + ], + [ + "▁alle", + "g" + ], + [ + "build", + "er" + ], + [ + "b", + "uilder" + ], + [ + "To", + 
"String" + ], + [ + "La", + "yer" + ], + [ + "L", + "ayer" + ], + [ + "ís", + "t" + ], + [ + "í", + "st" + ], + [ + "Pro", + "p" + ], + [ + "Pr", + "op" + ], + [ + "P", + "rop" + ], + [ + "▁E", + "mp" + ], + [ + "▁Em", + "p" + ], + [ + "▁", + "Emp" + ], + [ + "}", + "]" + ], + [ + "▁s", + "elling" + ], + [ + "▁sell", + "ing" + ], + [ + "▁sel", + "ling" + ], + [ + "▁", + "selling" + ], + [ + "▁que", + "ue" + ], + [ + "▁", + "queue" + ], + [ + "▁ser", + "iously" + ], + [ + "▁serious", + "ly" + ], + [ + "▁L", + "ead" + ], + [ + "▁Le", + "ad" + ], + [ + "▁", + "Lead" + ], + [ + "text", + "it" + ], + [ + "tex", + "tit" + ], + [ + "test", + "ing" + ], + [ + "tes", + "ting" + ], + [ + "▁П", + "ре" + ], + [ + "se", + "curity" + ], + [ + "sec", + "urity" + ], + [ + "ia", + "ł" + ], + [ + "i", + "ał" + ], + [ + "ú", + "n" + ], + [ + "ch", + "ip" + ], + [ + "chi", + "p" + ], + [ + "c", + "hip" + ], + [ + "▁c", + "andidate" + ], + [ + "▁candid", + "ate" + ], + [ + "▁min", + "ister" + ], + [ + "▁mini", + "ster" + ], + [ + "▁minist", + "er" + ], + [ + "▁", + "minister" + ], + [ + "er", + "ia" + ], + [ + "eri", + "a" + ], + [ + "e", + "ria" + ], + [ + "▁H", + "et" + ], + [ + "▁He", + "t" + ], + [ + "ди", + "н" + ], + [ + "д", + "ин" + ], + [ + "▁Brit", + "ain" + ], + [ + "▁b", + "arely" + ], + [ + "▁bar", + "ely" + ], + [ + "▁bare", + "ly" + ], + [ + "▁s", + "ty" + ], + [ + "▁st", + "y" + ], + [ + "▁", + "sty" + ], + [ + "▁Span", + "ish" + ], + [ + "▁V", + "en" + ], + [ + "▁Ve", + "n" + ], + [ + "time", + "r" + ], + [ + "ti", + "mer" + ], + [ + "tim", + "er" + ], + [ + "t", + "imer" + ], + [ + "кі", + "в" + ], + [ + "к", + "ів" + ], + [ + "▁document", + "s" + ], + [ + "▁doc", + "uments" + ], + [ + "('", + "." + ], + [ + "(", + "'." + ], + [ + "▁d", + "ebug" + ], + [ + "▁de", + "bug" + ], + [ + "▁deb", + "ug" + ], + [ + "▁", + "debug" + ], + [ + "▁cont", + "ro" + ], + [ + "▁contr", + "o" + ], + [ + "сто", + "я" + ], + [ + "▁j", + "oy" + ], + [ + "▁jo", + "y" + ], + [ + "▁", + "joy" + ], + [ + "S", + "n" + ], + [ + "In", + "v" + ], + [ + "I", + "nv" + ], + [ + "▁pro", + "tocol" + ], + [ + "▁proto", + "col" + ], + [ + "▁prot", + "ocol" + ], + [ + "▁", + "protocol" + ], + [ + "▁f", + "aces" + ], + [ + "▁face", + "s" + ], + [ + "▁fac", + "es" + ], + [ + "▁fa", + "ces" + ], + [ + "▁", + "faces" + ], + [ + "▁Des", + "pite" + ], + [ + "se", + "d" + ], + [ + "s", + "ed" + ], + [ + "Con", + "f" + ], + [ + "Co", + "nf" + ], + [ + "AR", + "G" + ], + [ + "A", + "RG" + ], + [ + "▁e", + "volution" + ], + [ + "▁ev", + "olution" + ], + [ + "▁t", + "od" + ], + [ + "▁to", + "d" + ], + [ + "▁P", + "romise" + ], + [ + "▁Prom", + "ise" + ], + [ + "▁", + "Promise" + ], + [ + "▁pos", + "ted" + ], + [ + "▁po", + "sted" + ], + [ + "▁post", + "ed" + ], + [ + "Per", + "m" + ], + [ + "Pe", + "rm" + ], + [ + "P", + "erm" + ], + [ + "be", + "t" + ], + [ + "b", + "et" + ], + [ + "An", + "g" + ], + [ + "A", + "ng" + ], + [ + "J", + "ust" + ], + [ + "▁r", + "um" + ], + [ + "▁ru", + "m" + ], + [ + "▁", + "rum" + ], + [ + "la", + "yer" + ], + [ + "lay", + "er" + ], + [ + "l", + "ayer" + ], + [ + "▁beh", + "avi" + ], + [ + "▁behav", + "i" + ], + [ + "ip", + "ping" + ], + [ + "ipp", + "ing" + ], + [ + "ippi", + "ng" + ], + [ + "i", + "pping" + ], + [ + "▁d", + "ynam" + ], + [ + "▁dy", + "nam" + ], + [ + "▁dyn", + "am" + ], + [ + "▁sch", + "eme" + ], + [ + "▁sche", + "me" + ], + [ + "▁", + "scheme" + ], + [ + "▁pro", + "to" + ], + [ + "▁pr", + "oto" + ], + [ + "▁prot", + "o" + ], + [ + "▁", + "proto" + ], + [ + ")", + "/" + ], + [ + "Col", + 
"lections" + ], + [ + "Collection", + "s" + ], + [ + "Collect", + "ions" + ], + [ + "ri", + "ev" + ], + [ + "rie", + "v" + ], + [ + "r", + "iev" + ], + [ + "▁C", + "lick" + ], + [ + "▁Cl", + "ick" + ], + [ + "▁", + "Click" + ], + [ + "▁u", + "ns" + ], + [ + "▁un", + "s" + ], + [ + "▁", + "uns" + ], + [ + "wide", + "tilde" + ], + [ + "widet", + "ilde" + ], + [ + "▁remember", + "ed" + ], + [ + "г", + "і" + ], + [ + "in", + "ates" + ], + [ + "ina", + "tes" + ], + [ + "inate", + "s" + ], + [ + "▁incor", + "por" + ], + [ + "▁De", + "scription" + ], + [ + "▁Des", + "cription" + ], + [ + "▁", + "Description" + ], + [ + "▁pre", + "pare" + ], + [ + "▁prep", + "are" + ], + [ + "▁prepar", + "e" + ], + [ + "▁", + "prepare" + ], + [ + "▁F", + "inal" + ], + [ + "▁Fin", + "al" + ], + [ + "▁Fi", + "nal" + ], + [ + "▁", + "Final" + ], + [ + "u", + "ation" + ], + [ + "▁Qu", + "een" + ], + [ + "▁Que", + "en" + ], + [ + ">", + ";" + ], + [ + "▁autom", + "atically" + ], + [ + "▁automatic", + "ally" + ], + [ + "▁sh", + "arp" + ], + [ + "▁shar", + "p" + ], + [ + "▁sha", + "rp" + ], + [ + "▁me", + "at" + ], + [ + "at", + "eur" + ], + [ + "ate", + "ur" + ], + [ + "as", + "tern" + ], + [ + "ast", + "ern" + ], + [ + "aster", + "n" + ], + [ + "aste", + "rn" + ], + [ + "▁st", + "uck" + ], + [ + "ASS", + "ERT" + ], + [ + "▁pl", + "anned" + ], + [ + "▁plan", + "ned" + ], + [ + "do", + "ts" + ], + [ + "dot", + "s" + ], + [ + "d", + "ots" + ], + [ + "ook", + "ie" + ], + [ + "oo", + "kie" + ], + [ + "▁His", + "tor" + ], + [ + "▁Hist", + "or" + ], + [ + "▁re", + "views" + ], + [ + "▁review", + "s" + ], + [ + "IM", + "P" + ], + [ + "I", + "MP" + ], + [ + "▁answ", + "ered" + ], + [ + "▁answer", + "ed" + ], + [ + "To", + "tal" + ], + [ + "T", + "otal" + ], + [ + "▁s", + "au" + ], + [ + "▁sa", + "u" + ], + [ + "▁Me", + "xico" + ], + [ + "▁Mex", + "ico" + ], + [ + "contin", + "ue" + ], + [ + "▁App", + "le" + ], + [ + "▁Ap", + "ple" + ], + [ + "like", + "ly" + ], + [ + "lik", + "ely" + ], + [ + "з", + "ва" + ], + [ + "us", + "ers" + ], + [ + "use", + "rs" + ], + [ + "user", + "s" + ], + [ + "▁ident", + "ified" + ], + [ + "▁L", + "ev" + ], + [ + "▁Le", + "v" + ], + [ + "▁m", + "ol" + ], + [ + "▁mo", + "l" + ], + [ + "▁Is", + "lam" + ], + [ + "▁com", + "mitted" + ], + [ + "▁comm", + "itted" + ], + [ + "▁commit", + "ted" + ], + [ + "wr", + "it" + ], + [ + "w", + "rit" + ], + [ + "бе", + "р" + ], + [ + "б", + "ер" + ], + [ + "ri", + "ft" + ], + [ + "rif", + "t" + ], + [ + "r", + "ift" + ], + [ + "▁inter", + "rupt" + ], + [ + "▁", + "interrupt" + ], + [ + "▁read", + "only" + ], + [ + "sch", + "ema" + ], + [ + "sche", + "ma" + ], + [ + "s", + "chema" + ], + [ + "S", + "m" + ], + [ + "D", + "ouble" + ], + [ + "az", + "a" + ], + [ + "a", + "za" + ], + [ + "▁H", + "al" + ], + [ + "▁Ha", + "l" + ], + [ + "▁", + "Hal" + ], + [ + "Mo", + "ve" + ], + [ + "M", + "ove" + ], + [ + "▁S", + "eries" + ], + [ + "▁Se", + "ries" + ], + [ + "▁Ser", + "ies" + ], + [ + "▁Serie", + "s" + ], + [ + "▁", + "Series" + ], + [ + "in", + "line" + ], + [ + "▁кото", + "ры" + ], + [ + "so", + "c" + ], + [ + "s", + "oc" + ], + [ + "▁t", + "ent" + ], + [ + "▁te", + "nt" + ], + [ + "▁ten", + "t" + ], + [ + "▁a", + "mer" + ], + [ + "▁am", + "er" + ], + [ + "▁", + "amer" + ], + [ + "ak", + "i" + ], + [ + "a", + "ki" + ], + [ + "▁l", + "ady" + ], + [ + "▁la", + "dy" + ], + [ + "▁lad", + "y" + ], + [ + "▁t", + "ired" + ], + [ + "▁ti", + "red" + ], + [ + "▁tire", + "d" + ], + [ + "▁tir", + "ed" + ], + [ + "if", + "i" + ], + [ + "i", + "fi" + ], + [ + "▁m", + "ême" + ], + [ 
+ "▁", + "même" + ], + [ + "ou", + "ver" + ], + [ + "▁a", + "side" + ], + [ + "▁as", + "ide" + ], + [ + "Di", + "d" + ], + [ + "D", + "id" + ], + [ + "',", + "\r" + ], + [ + "'", + ",\r" + ], + [ + "▁br", + "inging" + ], + [ + "▁bring", + "ing" + ], + [ + "Draw", + "ing" + ], + [ + "ar", + "o" + ], + [ + "a", + "ro" + ], + [ + "▁R", + "h" + ], + [ + "▁N", + "az" + ], + [ + "▁Na", + "z" + ], + [ + "es", + "so" + ], + [ + "ess", + "o" + ], + [ + "▁re", + "action" + ], + [ + "▁react", + "ion" + ], + [ + "mit", + "ted" + ], + [ + "mitt", + "ed" + ], + [ + "m", + "itted" + ], + [ + "▁abs", + "olute" + ], + [ + "▁absolut", + "e" + ], + [ + "▁", + "absolute" + ], + [ + "ha", + "ust" + ], + [ + "haus", + "t" + ], + [ + "((", + ")" + ], + [ + "(", + "()" + ], + [ + "▁T", + "ask" + ], + [ + "▁Ta", + "sk" + ], + [ + "▁", + "Task" + ], + [ + "ER", + "S" + ], + [ + "E", + "RS" + ], + [ + "▁^", + "{" + ], + [ + "▁", + "^{" + ], + [ + "V", + "D" + ], + [ + "▁t", + "one" + ], + [ + "▁to", + "ne" + ], + [ + "▁ton", + "e" + ], + [ + "dis", + "t" + ], + [ + "di", + "st" + ], + [ + "d", + "ist" + ], + [ + "v", + "s" + ], + [ + "▁whe", + "el" + ], + [ + "▁", + "wheel" + ], + [ + "▁administr", + "ation" + ], + [ + "▁admin", + "istration" + ], + [ + "▁inter", + "ests" + ], + [ + "▁interest", + "s" + ], + [ + "▁point", + "er" + ], + [ + "▁po", + "inter" + ], + [ + "▁", + "pointer" + ], + [ + "▁en", + "counter" + ], + [ + "▁enc", + "ounter" + ], + [ + "ave", + "r" + ], + [ + "av", + "er" + ], + [ + "a", + "ver" + ], + [ + "▁n", + "ord" + ], + [ + "▁no", + "rd" + ], + [ + "▁nor", + "d" + ], + [ + "ke", + "t" + ], + [ + "k", + "et" + ], + [ + "▁b", + "each" + ], + [ + "▁be", + "ach" + ], + [ + "▁enjoy", + "ed" + ], + [ + "cont", + "ains" + ], + [ + "▁app", + "end" + ], + [ + "▁ap", + "pend" + ], + [ + "▁appe", + "nd" + ], + [ + "▁", + "append" + ], + [ + "W", + "ait" + ], + [ + "▁s", + "quad" + ], + [ + "▁squ", + "ad" + ], + [ + "ze", + "l" + ], + [ + "z", + "el" + ], + [ + "▁med", + "ium" + ], + [ + "▁medi", + "um" + ], + [ + "▁", + "medium" + ], + [ + "▁s", + "ending" + ], + [ + "▁send", + "ing" + ], + [ + "▁sen", + "ding" + ], + [ + "▁L", + "ady" + ], + [ + "▁La", + "dy" + ], + [ + "▁Lad", + "y" + ], + [ + "ç", + "ões" + ], + [ + "▁dest", + "ination" + ], + [ + "▁destin", + "ation" + ], + [ + "▁", + "destination" + ], + [ + "ny", + "ch" + ], + [ + "n", + "ych" + ], + [ + "▁conf", + "lict" + ], + [ + "▁conflic", + "t" + ], + [ + "▁L", + "y" + ], + [ + "▁v", + "ul" + ], + [ + "▁vu", + "l" + ], + [ + "▁bas", + "ically" + ], + [ + "▁basic", + "ally" + ], + [ + "re", + "ated" + ], + [ + "reat", + "ed" + ], + [ + "reate", + "d" + ], + [ + "rea", + "ted" + ], + [ + "bl", + "ack" + ], + [ + "ug", + "ins" + ], + [ + "ugin", + "s" + ], + [ + "▁cal", + "m" + ], + [ + "▁ca", + "lm" + ], + [ + "ér", + "ie" + ], + [ + "éri", + "e" + ], + [ + "é", + "rie" + ], + [ + "ha", + "r" + ], + [ + "h", + "ar" + ], + [ + "ла", + "н" + ], + [ + "л", + "ан" + ], + [ + "▁С", + "е" + ], + [ + "w", + "atch" + ], + [ + "▁P", + "ut" + ], + [ + "▁Pu", + "t" + ], + [ + "▁", + "Put" + ], + [ + "▁d", + "ump" + ], + [ + "▁du", + "mp" + ], + [ + "▁", + "dump" + ], + [ + "ac", + "her" + ], + [ + "ach", + "er" + ], + [ + "ache", + "r" + ], + [ + "a", + "cher" + ], + [ + "sc", + "roll" + ], + [ + "scr", + "oll" + ], + [ + "▁cl", + "aimed" + ], + [ + "▁claim", + "ed" + ], + [ + "▁", + "claimed" + ], + [ + "▁Cont", + "rol" + ], + [ + "▁", + "Control" + ], + [ + "▁bl", + "ind" + ], + [ + "en", + "ti" + ], + [ + "ent", + "i" + ], + [ + "▁Ke", + "ep" + ], + 
[ + "▁", + "Keep" + ], + [ + "▁Develop", + "ment" + ], + [ + "im", + "ages" + ], + [ + "image", + "s" + ], + [ + "ima", + "ges" + ], + [ + "imag", + "es" + ], + [ + "▁t", + "ough" + ], + [ + "▁to", + "ugh" + ], + [ + "▁tou", + "gh" + ], + [ + "ge", + "bra" + ], + [ + "geb", + "ra" + ], + [ + "▁se", + "pt" + ], + [ + "▁sep", + "t" + ], + [ + "he", + "w" + ], + [ + "h", + "ew" + ], + [ + "▁s", + "kill" + ], + [ + "▁sk", + "ill" + ], + [ + "▁ski", + "ll" + ], + [ + "▁", + "skill" + ], + [ + "▁T", + "ay" + ], + [ + "▁Ta", + "y" + ], + [ + "▁k", + "tó" + ], + [ + "ow", + "ner" + ], + [ + "own", + "er" + ], + [ + "par", + "e" + ], + [ + "pa", + "re" + ], + [ + "p", + "are" + ], + [ + "▁f", + "ee" + ], + [ + "▁fe", + "e" + ], + [ + "▁", + "fee" + ], + [ + "▁contin", + "ues" + ], + [ + "▁continue", + "s" + ], + [ + "▁continu", + "es" + ], + [ + "▁k", + "an" + ], + [ + "▁ka", + "n" + ], + [ + "▁", + "kan" + ], + [ + "be", + "s" + ], + [ + "b", + "es" + ], + [ + "▁c", + "ha" + ], + [ + "▁ch", + "a" + ], + [ + "▁", + "cha" + ], + [ + "ov", + "o" + ], + [ + "o", + "vo" + ], + [ + "▁N", + "ight" + ], + [ + "▁Ni", + "ght" + ], + [ + "ict", + "ure" + ], + [ + "sh", + "ire" + ], + [ + "s", + "hire" + ], + [ + "▁es", + "say" + ], + [ + "▁ess", + "ay" + ], + [ + "▁sup", + "pose" + ], + [ + "▁supp", + "ose" + ], + [ + "et", + "ic" + ], + [ + "eti", + "c" + ], + [ + "Ar", + "t" + ], + [ + "A", + "rt" + ], + [ + "ac", + "on" + ], + [ + "aco", + "n" + ], + [ + "a", + "con" + ], + [ + "ll", + "a" + ], + [ + "l", + "la" + ], + [ + "word", + "s" + ], + [ + "wor", + "ds" + ], + [ + "w", + "ords" + ], + [ + "▁compar", + "ison" + ], + [ + "▁B", + "E" + ], + [ + "▁", + "BE" + ], + [ + "▁challeng", + "es" + ], + [ + "▁challenge", + "s" + ], + [ + "▁o", + "l" + ], + [ + "▁", + "ol" + ], + [ + "cite", + "p" + ], + [ + "cit", + "ep" + ], + [ + "▁F", + "oot" + ], + [ + "▁Fo", + "ot" + ], + [ + "▁", + "Foot" + ], + [ + "▁S", + "uch" + ], + [ + "▁Su", + "ch" + ], + [ + "▁", + "Such" + ], + [ + "▁p", + "apers" + ], + [ + "▁paper", + "s" + ], + [ + "▁pa", + "pers" + ], + [ + "▁pap", + "ers" + ], + [ + "act", + "iv" + ], + [ + "qu", + "er" + ], + [ + "que", + "r" + ], + [ + "q", + "uer" + ], + [ + "т", + "я" + ], + [ + "▁Т", + "о" + ], + [ + "сь", + "кий" + ], + [ + "th", + "ur" + ], + [ + "do", + "ne" + ], + [ + "don", + "e" + ], + [ + "d", + "one" + ], + [ + "▁sh", + "ock" + ], + [ + "▁ded", + "icated" + ], + [ + "▁dedic", + "ated" + ], + [ + "▁cor", + "respond" + ], + [ + "▁correspon", + "d" + ], + [ + "Se", + "cond" + ], + [ + "Sec", + "ond" + ], + [ + "▁b", + "ull" + ], + [ + "▁bu", + "ll" + ], + [ + "▁bul", + "l" + ], + [ + "li", + "fe" + ], + [ + "lif", + "e" + ], + [ + "l", + "ife" + ], + [ + "ind", + "ent" + ], + [ + "inde", + "nt" + ], + [ + "inden", + "t" + ], + [ + "▁fig", + "ures" + ], + [ + "▁figure", + "s" + ], + [ + "▁And", + "rew" + ], + [ + "▁Andre", + "w" + ], + [ + "▁Andr", + "ew" + ], + [ + "is", + "p" + ], + [ + "i", + "sp" + ], + [ + "▁fav", + "our" + ], + [ + "зд", + "а" + ], + [ + "з", + "да" + ], + [ + "▁E", + "lect" + ], + [ + "▁El", + "ect" + ], + [ + "▁Ele", + "ct" + ], + [ + "F", + "ull" + ], + [ + "▁near", + "by" + ], + [ + "▁Reg", + "ister" + ], + [ + "▁", + "Register" + ], + [ + "Sc", + "ale" + ], + [ + "Scal", + "e" + ], + [ + "ic", + "ations" + ], + [ + "ication", + "s" + ], + [ + "и", + "н" + ], + [ + "▁A", + "M" + ], + [ + "▁", + "AM" + ], + [ + "pa", + "ir" + ], + [ + "p", + "air" + ], + [ + "▁pers", + "pective" + ], + [ + "▁n", + "os" + ], + [ + "▁no", + "s" + ], + [ + "▁", + "nos" + ], + 
[ + "ap", + "a" + ], + [ + "a", + "pa" + ], + [ + "ost", + "ał" + ], + [ + "osta", + "ł" + ], + [ + "▁P", + "ers" + ], + [ + "▁Per", + "s" + ], + [ + "▁Pe", + "rs" + ], + [ + "▁", + "Pers" + ], + [ + "ic", + "er" + ], + [ + "ice", + "r" + ], + [ + "i", + "cer" + ], + [ + "▁pl", + "astic" + ], + [ + "до", + "в" + ], + [ + "д", + "ов" + ], + [ + "ci", + "ples" + ], + [ + "cipl", + "es" + ], + [ + "cip", + "les" + ], + [ + "z", + "ą" + ], + [ + "cl", + "os" + ], + [ + "c", + "los" + ], + [ + "▁у", + "ча" + ], + [ + "▁", + "Á" + ], + [ + "pl", + "ugin" + ], + [ + "plug", + "in" + ], + [ + "▁an", + "gle" + ], + [ + "▁ang", + "le" + ], + [ + "▁angl", + "e" + ], + [ + "▁", + "angle" + ], + [ + "▁com", + "mission" + ], + [ + "▁comm", + "ission" + ], + [ + "▁fun", + "ds" + ], + [ + "▁fund", + "s" + ], + [ + "▁in", + "du" + ], + [ + "▁ind", + "u" + ], + [ + "▁d", + "rawn" + ], + [ + "▁dr", + "awn" + ], + [ + "▁draw", + "n" + ], + [ + "á", + "m" + ], + [ + "▁develop", + "ing" + ], + [ + "▁seg", + "ment" + ], + [ + "▁", + "segment" + ], + [ + "is", + "me" + ], + [ + "ism", + "e" + ], + [ + "sc", + "r" + ], + [ + "s", + "cr" + ], + [ + "▁l", + "ies" + ], + [ + "▁li", + "es" + ], + [ + "▁lie", + "s" + ], + [ + "▁I", + "L" + ], + [ + "▁", + "IL" + ], + [ + "▁a", + "pi" + ], + [ + "▁ap", + "i" + ], + [ + "▁", + "api" + ], + [ + "Ext", + "ension" + ], + [ + "▁s", + "cal" + ], + [ + "▁sc", + "al" + ], + [ + "▁", + "scal" + ], + [ + "inst", + "all" + ], + [ + "▁We", + "ek" + ], + [ + "▁", + "Week" + ], + [ + "▁gen", + "tle" + ], + [ + "▁gent", + "le" + ], + [ + "▁Canad", + "ian" + ], + [ + "▁d", + "ialog" + ], + [ + "▁dial", + "og" + ], + [ + "▁dia", + "log" + ], + [ + "▁", + "dialog" + ], + [ + "▁art", + "icles" + ], + [ + "▁article", + "s" + ], + [ + "▁artic", + "les" + ], + [ + "The", + "me" + ], + [ + "Th", + "eme" + ], + [ + "S", + "M" + ], + [ + "▁B", + "ul" + ], + [ + "▁Bu", + "l" + ], + [ + "▁", + "Bul" + ], + [ + "▁l", + "eur" + ], + [ + "▁le", + "ur" + ], + [ + "▁s", + "tom" + ], + [ + "▁st", + "om" + ], + [ + "▁sto", + "m" + ], + [ + "Pl", + "ugin" + ], + [ + "▁по", + "сле" + ], + [ + "▁пос", + "ле" + ], + [ + "▁st", + "ead" + ], + [ + "▁ste", + "ad" + ], + [ + "▁", + "stead" + ], + [ + "▁", + "ś" + ], + [ + "ip", + "her" + ], + [ + "iph", + "er" + ], + [ + "i", + "pher" + ], + [ + "▁pr", + "ze" + ], + [ + "▁prz", + "e" + ], + [ + "▁d", + "raft" + ], + [ + "▁dr", + "aft" + ], + [ + "▁", + "draft" + ], + [ + "bot", + "tom" + ], + [ + "b", + "ottom" + ], + [ + "▁{", + "};" + ], + [ + "▁{}", + ";" + ], + [ + "▁stay", + "ed" + ], + [ + "fe", + "ature" + ], + [ + "feat", + "ure" + ], + [ + "▁v", + "ot" + ], + [ + "▁vo", + "t" + ], + [ + "▁fab", + "ric" + ], + [ + "ç", + "a" + ], + [ + "('", + "#" + ], + [ + "re", + "a" + ], + [ + "r", + "ea" + ], + [ + "▁re", + "put" + ], + [ + "▁rep", + "ut" + ], + [ + "▁C", + "ir" + ], + [ + "▁Ci", + "r" + ], + [ + "▁", + "Cir" + ], + [ + "▁A", + "L" + ], + [ + "▁", + "AL" + ], + [ + "▁assert", + "Equals" + ], + [ + "▁", + "assertEquals" + ], + [ + "result", + "s" + ], + [ + "▁C", + "ross" + ], + [ + "▁Cr", + "oss" + ], + [ + "▁Cro", + "ss" + ], + [ + "▁", + "Cross" + ], + [ + "urs", + "day" + ], + [ + "▁a", + "udio" + ], + [ + "▁aud", + "io" + ], + [ + "▁", + "audio" + ], + [ + "▁g", + "ap" + ], + [ + "▁ga", + "p" + ], + [ + "▁stre", + "ets" + ], + [ + "▁street", + "s" + ], + [ + "▁scient", + "ific" + ], + [ + "pl", + "atform" + ], + [ + "▁a", + "uss" + ], + [ + "▁au", + "ss" + ], + [ + "▁aus", + "s" + ], + [ + "▁C", + "ro" + ], + [ + "▁Cr", + "o" + ], + [ + 
"▁part", + "ial" + ], + [ + "▁parti", + "al" + ], + [ + "▁", + "partial" + ], + [ + "un", + "c" + ], + [ + "u", + "nc" + ], + [ + "▁cho", + "ices" + ], + [ + "▁choice", + "s" + ], + [ + "▁и", + "ли" + ], + [ + "pr", + "ed" + ], + [ + "pre", + "d" + ], + [ + "p", + "red" + ], + [ + "▁he", + "ads" + ], + [ + "▁head", + "s" + ], + [ + "▁", + "heads" + ], + [ + "ter", + "day" + ], + [ + "▁N", + "ick" + ], + [ + "▁Nic", + "k" + ], + [ + "▁Ni", + "ck" + ], + [ + "▁we", + "ird" + ], + [ + "as", + "ant" + ], + [ + "asa", + "nt" + ], + [ + "▁represent", + "ed" + ], + [ + "▁п", + "и" + ], + [ + "▁", + "пи" + ], + [ + "D", + "P" + ], + [ + "or", + "ders" + ], + [ + "ord", + "ers" + ], + [ + "order", + "s" + ], + [ + "cl", + "ock" + ], + [ + "c", + "lock" + ], + [ + "▁H", + "o" + ], + [ + "ar", + "ters" + ], + [ + "art", + "ers" + ], + [ + "arter", + "s" + ], + [ + "arte", + "rs" + ], + [ + "C", + "md" + ], + [ + "og", + "a" + ], + [ + "o", + "ga" + ], + [ + "Key", + "s" + ], + [ + "Ke", + "ys" + ], + [ + "Re", + "port" + ], + [ + "Rep", + "ort" + ], + [ + "Repo", + "rt" + ], + [ + "▁V", + "ill" + ], + [ + "▁Vi", + "ll" + ], + [ + "▁Vil", + "l" + ], + [ + "▁M", + "u" + ], + [ + "▁", + "Mu" + ], + [ + "▁own", + "ed" + ], + [ + "▁", + "owned" + ], + [ + "SU", + "CCESS" + ], + [ + "▁type", + "of" + ], + [ + "▁", + "typeof" + ], + [ + "hd", + "r" + ], + [ + "h", + "dr" + ], + [ + "ua", + "ble" + ], + [ + "u", + "able" + ], + [ + "▁neighbor", + "hood" + ], + [ + "▁A", + "P" + ], + [ + "▁", + "AP" + ], + [ + "▁result", + "ing" + ], + [ + "▁sh", + "adow" + ], + [ + "▁", + "shadow" + ], + [ + "STR", + "ING" + ], + [ + "▁video", + "s" + ], + [ + "▁vide", + "os" + ], + [ + "ле", + "ння" + ], + [ + "лен", + "ня" + ], + [ + "ex", + "pect" + ], + [ + "exp", + "ect" + ], + [ + "▁Val", + "ley" + ], + [ + "▁Vall", + "ey" + ], + [ + "▁g", + "oto" + ], + [ + "▁go", + "to" + ], + [ + "▁got", + "o" + ], + [ + "▁", + "goto" + ], + [ + "▁S", + "her" + ], + [ + "▁She", + "r" + ], + [ + "▁Sh", + "er" + ], + [ + "fr", + "astr" + ], + [ + "▁oper", + "ating" + ], + [ + "▁opera", + "ting" + ], + [ + "▁э", + "то" + ], + [ + "▁License", + "d" + ], + [ + "▁Lic", + "ensed" + ], + [ + "Var", + "iable" + ], + [ + "Vari", + "able" + ], + [ + "▁P", + "R" + ], + [ + "▁", + "PR" + ], + [ + "▁H", + "ans" + ], + [ + "▁Ha", + "ns" + ], + [ + "▁Han", + "s" + ], + [ + "cl", + "one" + ], + [ + "▁G", + "esch" + ], + [ + "▁Ge", + "sch" + ], + [ + "▁Ges", + "ch" + ], + [ + "▁B", + "and" + ], + [ + "▁Ba", + "nd" + ], + [ + "▁Ban", + "d" + ], + [ + "▁", + "Band" + ], + [ + "...", + "....." + ], + [ + "....", + "...." + ], + [ + ".....", + "..." 
+ ], + [ + "ui", + "ng" + ], + [ + "u", + "ing" + ], + [ + "▁hundred", + "s" + ], + [ + "▁о", + "к" + ], + [ + "▁emot", + "ional" + ], + [ + "▁emotion", + "al" + ], + [ + "▁Ind", + "ust" + ], + [ + ")", + "+" + ], + [ + "▁Egy", + "pt" + ], + [ + "▁fr", + "anç" + ], + [ + "▁", + "š" + ], + [ + "▁f", + "asc" + ], + [ + "▁fa", + "sc" + ], + [ + "on", + "to" + ], + [ + "ont", + "o" + ], + [ + "▁A", + "dam" + ], + [ + "▁Ad", + "am" + ], + [ + "▁l", + "aid" + ], + [ + "▁la", + "id" + ], + [ + "▁r", + "ig" + ], + [ + "▁ri", + "g" + ], + [ + "▁", + "rig" + ], + [ + "▁det", + "ailed" + ], + [ + "▁detail", + "ed" + ], + [ + "▁im", + "plements" + ], + [ + "▁implement", + "s" + ], + [ + "▁impl", + "ements" + ], + [ + "▁univers", + "ity" + ], + [ + "▁H", + "y" + ], + [ + "▁", + "Hy" + ], + [ + "▁g", + "rid" + ], + [ + "▁gr", + "id" + ], + [ + "▁gri", + "d" + ], + [ + "▁", + "grid" + ], + [ + "▁reg", + "ions" + ], + [ + "▁region", + "s" + ], + [ + "St", + "op" + ], + [ + "S", + "top" + ], + [ + "▁s", + "lot" + ], + [ + "▁sl", + "ot" + ], + [ + "▁", + "slot" + ], + [ + "▁ang", + "ry" + ], + [ + "▁-", + "=" + ], + [ + "▁wait", + "ed" + ], + [ + "▁wa", + "ited" + ], + [ + "Ver", + "t" + ], + [ + "V", + "ert" + ], + [ + "\":", + "\"" + ], + [ + "\"", + ":\"" + ], + [ + "▁e", + "lem" + ], + [ + "▁el", + "em" + ], + [ + "▁ele", + "m" + ], + [ + "▁", + "elem" + ], + [ + "▁r", + "ég" + ], + [ + "▁ré", + "g" + ], + [ + "ow", + "ed" + ], + [ + "owe", + "d" + ], + [ + "o", + "wed" + ], + [ + "Mem", + "ber" + ], + [ + "Me", + "mber" + ], + [ + "M", + "ember" + ], + [ + "▁r", + "atio" + ], + [ + "▁rat", + "io" + ], + [ + "▁", + "ratio" + ], + [ + "is", + "en" + ], + [ + "ise", + "n" + ], + [ + "i", + "sen" + ], + [ + "▁L", + "em" + ], + [ + "▁Le", + "m" + ], + [ + "ge", + "ry" + ], + [ + "ger", + "y" + ], + [ + "g", + "ery" + ], + [ + "▁c", + "ream" + ], + [ + "▁cre", + "am" + ], + [ + "▁ét", + "ait" + ], + [ + "▁", + "était" + ], + [ + "▁g", + "eb" + ], + [ + "▁ge", + "b" + ], + [ + "▁", + "geb" + ], + [ + "un", + "ique" + ], + [ + "uni", + "que" + ], + [ + "▁D", + "eb" + ], + [ + "▁De", + "b" + ], + [ + "▁f", + "actory" + ], + [ + "▁fact", + "ory" + ], + [ + "▁factor", + "y" + ], + [ + "▁", + "factory" + ], + [ + "ż", + "e" + ], + [ + "d", + "ialog" + ], + [ + "▁Con", + "fig" + ], + [ + "▁Conf", + "ig" + ], + [ + "▁", + "Config" + ], + [ + "Sy", + "nc" + ], + [ + "S", + "ync" + ], + [ + "an", + "gers" + ], + [ + "ang", + "ers" + ], + [ + "ange", + "rs" + ], + [ + "anger", + "s" + ], + [ + "▁gover", + "ning" + ], + [ + "▁govern", + "ing" + ], + [ + "▁H", + "un" + ], + [ + "▁Hu", + "n" + ], + [ + "Sp", + "ace" + ], + [ + "S", + "pace" + ], + [ + "▁j", + "est" + ], + [ + "▁je", + "st" + ], + [ + "ic", + "ious" + ], + [ + "ici", + "ous" + ], + [ + "icio", + "us" + ], + [ + "▁em", + "phas" + ], + [ + "▁emp", + "has" + ], + [ + "um", + "ps" + ], + [ + "ump", + "s" + ], + [ + "▁E", + "sp" + ], + [ + "▁Es", + "p" + ], + [ + "▁", + "Esp" + ], + [ + "▁s", + "ul" + ], + [ + "▁su", + "l" + ], + [ + "▁histor", + "ical" + ], + [ + "▁historic", + "al" + ], + [ + "ij", + "a" + ], + [ + "i", + "ja" + ], + [ + "▁l", + "ying" + ], + [ + "▁ly", + "ing" + ], + [ + "▁", + "lying" + ], + [ + "▁St", + "eve" + ], + [ + "▁Ste", + "ve" + ], + [ + "▁me", + "asures" + ], + [ + "▁measure", + "s" + ], + [ + "▁meas", + "ures" + ], + [ + "os", + "to" + ], + [ + "ost", + "o" + ], + [ + "o", + "sto" + ], + [ + "?", + "”" + ], + [ + "▁p", + "ocket" + ], + [ + "▁poc", + "ket" + ], + [ + "▁S", + "at" + ], + [ + "▁Sa", + "t" + ], + [ + "▁p", + "itch" + 
+  [... machine-generated BPE merge list for the bundled tokenizer continues here: several thousand subword merge pairs, e.g. ["▁pit", "ch"], ["▁hum", "ans"], ["▁In", "teger"], one pair per entry ...]
+ [ + "▁part", + "nership" + ], + [ + "▁partner", + "ship" + ], + [ + "▁partners", + "hip" + ], + [ + "▁fr", + "ån" + ], + [ + "ul", + "ty" + ], + [ + "ult", + "y" + ], + [ + "Po", + "d" + ], + [ + "P", + "od" + ], + [ + "CA", + "LL" + ], + [ + "CAL", + "L" + ], + [ + "C", + "ALL" + ], + [ + "mod", + "al" + ], + [ + "mo", + "dal" + ], + [ + "si", + "g" + ], + [ + "s", + "ig" + ], + [ + "it", + "zer" + ], + [ + "itz", + "er" + ], + [ + "it", + "el" + ], + [ + "ite", + "l" + ], + [ + "▁convin", + "ced" + ], + [ + "▁convince", + "d" + ], + [ + "ab", + "l" + ], + [ + "a", + "bl" + ], + [ + "ст", + "ве" + ], + [ + "ств", + "е" + ], + [ + "▁c", + "ot" + ], + [ + "▁co", + "t" + ], + [ + "▁re", + "peat" + ], + [ + "▁repe", + "at" + ], + [ + "▁", + "repeat" + ], + [ + "▁l", + "ists" + ], + [ + "▁li", + "sts" + ], + [ + "▁list", + "s" + ], + [ + "▁", + "lists" + ], + [ + "so", + "und" + ], + [ + "s", + "ound" + ], + [ + "▁r", + "oyal" + ], + [ + "▁ro", + "yal" + ], + [ + "▁gr", + "ace" + ], + [ + "▁gra", + "ce" + ], + [ + "▁o", + "raz" + ], + [ + "▁or", + "az" + ], + [ + "Not", + "ification" + ], + [ + "pr", + "ite" + ], + [ + "prit", + "e" + ], + [ + "p", + "rite" + ], + [ + "▁arriv", + "al" + ], + [ + "▁arr", + "ival" + ], + [ + "an", + "cell" + ], + [ + "ance", + "ll" + ], + [ + "anc", + "ell" + ], + [ + "ancel", + "l" + ], + [ + "hent", + "ic" + ], + [ + "de", + "code" + ], + [ + "dec", + "ode" + ], + [ + "▁fant", + "astic" + ], + [ + "pro", + "gress" + ], + [ + "pro", + "xy" + ], + [ + "pr", + "oxy" + ], + [ + "z", + "ő" + ], + [ + "ke", + "l" + ], + [ + "k", + "el" + ], + [ + "▁conven", + "ient" + ], + [ + "aqu", + "e" + ], + [ + "a", + "que" + ], + [ + "ri", + "et" + ], + [ + "rie", + "t" + ], + [ + "r", + "iet" + ], + [ + "▁Dig", + "ital" + ], + [ + "io", + "rs" + ], + [ + "ior", + "s" + ], + [ + "i", + "ors" + ], + [ + "▁B", + "udd" + ], + [ + "▁Bud", + "d" + ], + [ + "▁Bu", + "dd" + ], + [ + "and", + "ra" + ], + [ + "ad", + "dy" + ], + [ + "add", + "y" + ], + [ + "▁o", + "vers" + ], + [ + "▁over", + "s" + ], + [ + "▁ov", + "ers" + ], + [ + "▁consum", + "ers" + ], + [ + "▁consumer", + "s" + ], + [ + "▁consume", + "rs" + ], + [ + "p", + "n" + ], + [ + "mo", + "use" + ], + [ + "m", + "ouse" + ], + [ + "▁B", + "C" + ], + [ + "▁", + "BC" + ], + [ + "de", + "g" + ], + [ + "d", + "eg" + ], + [ + "pe", + "rm" + ], + [ + "per", + "m" + ], + [ + "p", + "erm" + ], + [ + "it", + "és" + ], + [ + "ité", + "s" + ], + [ + "▁и", + "спо" + ], + [ + "▁ис", + "по" + ], + [ + "he", + "ast" + ], + [ + "h", + "east" + ], + [ + "ho", + "ur" + ], + [ + "hou", + "r" + ], + [ + "h", + "our" + ], + [ + "PAR", + "AM" + ], + [ + "con", + "scious" + ], + [ + "▁w", + "ing" + ], + [ + "▁win", + "g" + ], + [ + "▁", + "wing" + ], + [ + "▁atmos", + "phere" + ], + [ + "▁g", + "ig" + ], + [ + "▁gi", + "g" + ], + [ + "▁con", + "tre" + ], + [ + "▁cont", + "re" + ], + [ + "▁contr", + "e" + ], + [ + "▁dr", + "ama" + ], + [ + "▁dram", + "a" + ], + [ + "я", + "т" + ], + [ + "▁Fr", + "ont" + ], + [ + "▁Fro", + "nt" + ], + [ + "▁", + "Front" + ], + [ + "▁philosoph", + "y" + ], + [ + "▁H", + "art" + ], + [ + "▁Har", + "t" + ], + [ + "▁Ha", + "rt" + ], + [ + "▁n", + "urs" + ], + [ + "▁nu", + "rs" + ], + [ + "▁nur", + "s" + ], + [ + "ur", + "as" + ], + [ + "ura", + "s" + ], + [ + "u", + "ras" + ], + [ + "▁T", + "ru" + ], + [ + "▁Tr", + "u" + ], + [ + "▁s", + "ud" + ], + [ + "▁su", + "d" + ], + [ + "▁per", + "forming" + ], + [ + "▁perform", + "ing" + ], + [ + "п", + "ы" + ], + [ + "▁conf", + "used" + ], + [ + "▁che", + "cks" + ], + [ + 
"▁check", + "s" + ], + [ + "am", + "t" + ], + [ + "a", + "mt" + ], + [ + "Ma", + "ke" + ], + [ + "M", + "ake" + ], + [ + "▁R", + "O" + ], + [ + "▁", + "RO" + ], + [ + "▁d", + "f" + ], + [ + "▁", + "df" + ], + [ + "iz", + "ations" + ], + [ + "ization", + "s" + ], + [ + "▁deg", + "li" + ], + [ + "▁architect", + "ure" + ], + [ + "Render", + "er" + ], + [ + "▁Л", + "а" + ], + [ + "▁p", + "tr" + ], + [ + "▁pt", + "r" + ], + [ + "▁", + "ptr" + ], + [ + "▁die", + "ser" + ], + [ + "▁dies", + "er" + ], + [ + "▁diese", + "r" + ], + [ + "sub", + "mit" + ], + [ + "▁top", + "ics" + ], + [ + "▁topic", + "s" + ], + [ + "▁princip", + "les" + ], + [ + "▁prin", + "ciples" + ], + [ + "▁principle", + "s" + ], + [ + "var", + "s" + ], + [ + "va", + "rs" + ], + [ + "v", + "ars" + ], + [ + "so", + "ck" + ], + [ + "soc", + "k" + ], + [ + "s", + "ock" + ], + [ + "▁ton", + "gue" + ], + [ + "▁tong", + "ue" + ], + [ + "▁percent", + "age" + ], + [ + "▁S", + "S" + ], + [ + "▁", + "SS" + ], + [ + "▁d", + "ol" + ], + [ + "▁do", + "l" + ], + [ + "▁r", + "ice" + ], + [ + "▁ri", + "ce" + ], + [ + "▁ric", + "e" + ], + [ + "▁", + "rice" + ], + [ + "í", + "o" + ], + [ + "▁E", + "astern" + ], + [ + "▁East", + "ern" + ], + [ + "▁Easter", + "n" + ], + [ + "▁recogn", + "ition" + ], + [ + "▁E", + "rn" + ], + [ + "▁Er", + "n" + ], + [ + "▁U", + "t" + ], + [ + "▁", + "Ut" + ], + [ + "▁c", + "aut" + ], + [ + "▁ca", + "ut" + ], + [ + "▁Cl", + "oud" + ], + [ + "▁", + "Cloud" + ], + [ + "▁con", + "version" + ], + [ + "▁conv", + "ersion" + ], + [ + "▁convers", + "ion" + ], + [ + "▁Oh", + "io" + ], + [ + "▁M", + "E" + ], + [ + "▁", + "ME" + ], + [ + "▁sur", + "ely" + ], + [ + "▁sure", + "ly" + ], + [ + "▁g", + "ard" + ], + [ + "▁gar", + "d" + ], + [ + "▁ga", + "rd" + ], + [ + "pu", + "is" + ], + [ + "p", + "uis" + ], + [ + "▁u", + "rg" + ], + [ + "▁ur", + "g" + ], + [ + "▁", + "urg" + ], + [ + "im", + "i" + ], + [ + "i", + "mi" + ], + [ + "▁abs", + "ence" + ], + [ + "▁w", + "inner" + ], + [ + "▁win", + "ner" + ], + [ + "L", + "anguage" + ], + [ + "▁HT", + "TP" + ], + [ + "▁", + "HTTP" + ], + [ + "w", + "t" + ], + [ + "▁trans", + "lation" + ], + [ + "▁transl", + "ation" + ], + [ + "▁", + "translation" + ], + [ + "с", + "с" + ], + [ + "▁K", + "ind" + ], + [ + "▁Ki", + "nd" + ], + [ + "▁Kin", + "d" + ], + [ + "▁", + "Kind" + ], + [ + "Tw", + "o" + ], + [ + "T", + "wo" + ], + [ + "▁Re", + "volution" + ], + [ + "▁Rev", + "olution" + ], + [ + "In", + "sert" + ], + [ + "Ins", + "ert" + ], + [ + "Ev", + "ery" + ], + [ + "E", + "very" + ], + [ + "or", + "ient" + ], + [ + "ori", + "ent" + ], + [ + "orie", + "nt" + ], + [ + "o", + "rient" + ], + [ + "▁т", + "ра" + ], + [ + "▁", + "тра" + ], + [ + "▁emot", + "ions" + ], + [ + "▁emotion", + "s" + ], + [ + "det", + "ails" + ], + [ + "detail", + "s" + ], + [ + "▁f", + "lu" + ], + [ + "▁fl", + "u" + ], + [ + "▁", + "flu" + ], + [ + "▁oper", + "ate" + ], + [ + "▁opera", + "te" + ], + [ + "A", + "g" + ], + [ + "un", + "ning" + ], + [ + "unn", + "ing" + ], + [ + "▁part", + "ie" + ], + [ + "▁parti", + "e" + ], + [ + "tr", + "i" + ], + [ + "t", + "ri" + ], + [ + "▁gold", + "en" + ], + [ + "▁gol", + "den" + ], + [ + "▁Б", + "и" + ], + [ + "▁found", + "ation" + ], + [ + "is", + "ten" + ], + [ + "ist", + "en" + ], + [ + "iste", + "n" + ], + [ + "i", + "sten" + ], + [ + "▁Car", + "los" + ], + [ + "▁Carl", + "os" + ], + [ + "▁Carlo", + "s" + ], + [ + "Child", + "ren" + ], + [ + "▁neigh", + "b" + ], + [ + "▁C", + "art" + ], + [ + "▁Car", + "t" + ], + [ + "▁Ca", + "rt" + ], + [ + "▁", + "Cart" + ], + [ + "Be", + "gin" 
+ ], + [ + "B", + "egin" + ], + [ + "г", + "да" + ], + [ + "▁s", + "cheduled" + ], + [ + "▁schedule", + "d" + ], + [ + "▁schedul", + "ed" + ], + [ + "'", + ">" + ], + [ + "▁observ", + "ations" + ], + [ + "▁observation", + "s" + ], + [ + "▁produ", + "cer" + ], + [ + "▁produce", + "r" + ], + [ + "ath", + "ers" + ], + [ + "ather", + "s" + ], + [ + "a", + "thers" + ], + [ + "но", + "му" + ], + [ + "ном", + "у" + ], + [ + "▁expect", + "ations" + ], + [ + "▁expectation", + "s" + ], + [ + "os", + "o" + ], + [ + "o", + "so" + ], + [ + "z", + "h" + ], + [ + "mu", + "table" + ], + [ + "mut", + "able" + ], + [ + "▁wr", + "ites" + ], + [ + "▁writ", + "es" + ], + [ + "▁write", + "s" + ], + [ + "▁p", + "ushing" + ], + [ + "▁push", + "ing" + ], + [ + "▁se", + "ats" + ], + [ + "▁sea", + "ts" + ], + [ + "▁seat", + "s" + ], + [ + "▁br", + "east" + ], + [ + "▁bre", + "ast" + ], + [ + "ap", + "ing" + ], + [ + "api", + "ng" + ], + [ + "a", + "ping" + ], + [ + "▁Sim", + "ple" + ], + [ + "▁", + "Simple" + ], + [ + "▁s", + "ocket" + ], + [ + "▁soc", + "ket" + ], + [ + "▁sock", + "et" + ], + [ + "▁", + "socket" + ], + [ + "▁sl", + "ave" + ], + [ + "▁sla", + "ve" + ], + [ + "▁", + "slave" + ], + [ + "il", + "ey" + ], + [ + "ile", + "y" + ], + [ + "i", + "ley" + ], + [ + "▁ass", + "istant" + ], + [ + "▁assist", + "ant" + ], + [ + "▁t", + "rim" + ], + [ + "▁tr", + "im" + ], + [ + "▁tri", + "m" + ], + [ + "▁", + "trim" + ], + [ + "▁land", + "scape" + ], + [ + "▁landsc", + "ape" + ], + [ + "▁associ", + "ation" + ], + [ + "qu", + "ant" + ], + [ + "▁Pal", + "est" + ], + [ + "▁swe", + "at" + ], + [ + "en", + "gers" + ], + [ + "eng", + "ers" + ], + [ + "enge", + "rs" + ], + [ + "enger", + "s" + ], + [ + "?", + "_" + ], + [ + "é", + "p" + ], + [ + ">", + "." + ], + [ + "▁c", + "urious" + ], + [ + "▁cur", + "ious" + ], + [ + "▁Com", + "ponent" + ], + [ + "▁", + "Component" + ], + [ + "▁re", + "placement" + ], + [ + "▁repl", + "acement" + ], + [ + "▁replace", + "ment" + ], + [ + "ра", + "ль" + ], + [ + "рал", + "ь" + ], + [ + "р", + "аль" + ], + [ + "▁Tr", + "ack" + ], + [ + "▁Tra", + "ck" + ], + [ + "▁", + "Track" + ], + [ + "▁Re", + "move" + ], + [ + "▁Rem", + "ove" + ], + [ + "▁", + "Remove" + ], + [ + "▁S", + "ize" + ], + [ + "▁Si", + "ze" + ], + [ + "▁", + "Size" + ], + [ + "pe", + "ror" + ], + [ + "per", + "or" + ], + [ + "▁cal", + "culate" + ], + [ + "▁calcul", + "ate" + ], + [ + "▁calc", + "ulate" + ], + [ + "▁s", + "essions" + ], + [ + "▁session", + "s" + ], + [ + "▁type", + "d" + ], + [ + "▁typ", + "ed" + ], + [ + "▁ty", + "ped" + ], + [ + "▁sub", + "mit" + ], + [ + "▁subm", + "it" + ], + [ + "▁", + "submit" + ], + [ + "!!", + "!" + ], + [ + "!", + "!!" 
+ ], + [ + "▁part", + "ition" + ], + [ + "▁", + "partition" + ], + [ + "ed", + "ing" + ], + [ + "edi", + "ng" + ], + [ + "e", + "ding" + ], + [ + "--", + "---" + ], + [ + "----", + "-" + ], + [ + "---", + "--" + ], + [ + "-", + "----" + ], + [ + "az", + "ioni" + ], + [ + "azi", + "oni" + ], + [ + "lie", + "ß" + ], + [ + "on", + "al" + ], + [ + "ona", + "l" + ], + [ + "o", + "nal" + ], + [ + "▁sh", + "ru" + ], + [ + "▁shr", + "u" + ], + [ + "▁RE", + "G" + ], + [ + "▁", + "REG" + ], + [ + "▁F", + "ac" + ], + [ + "▁Fa", + "c" + ], + [ + "▁", + "Fac" + ], + [ + "config", + "uration" + ], + [ + "▁бы", + "ло" + ], + [ + "▁был", + "о" + ], + [ + "▁A", + "mong" + ], + [ + "▁Am", + "ong" + ], + [ + "__", + ");" + ], + [ + "__)", + ";" + ], + [ + "_", + "_);" + ], + [ + "▁Ser", + "ver" + ], + [ + "▁Serv", + "er" + ], + [ + "▁", + "Server" + ], + [ + "▁L", + "OG" + ], + [ + "▁LO", + "G" + ], + [ + "▁", + "LOG" + ], + [ + "▁c", + "and" + ], + [ + "▁can", + "d" + ], + [ + "▁ca", + "nd" + ], + [ + "']", + ");" + ], + [ + "'])", + ";" + ], + [ + "'", + "]);" + ], + [ + "go", + "v" + ], + [ + "g", + "ov" + ], + [ + "▁S", + "ix" + ], + [ + "▁Si", + "x" + ], + [ + "un", + "defined" + ], + [ + "und", + "efined" + ], + [ + "undef", + "ined" + ], + [ + "▁t", + "y" + ], + [ + "▁", + "ty" + ], + [ + "as", + "a" + ], + [ + "a", + "sa" + ], + [ + "▁part", + "icles" + ], + [ + "▁partic", + "les" + ], + [ + "▁particle", + "s" + ], + [ + "▁parti", + "cles" + ], + [ + "▁ф", + "ор" + ], + [ + "▁фо", + "р" + ], + [ + "▁", + "фор" + ], + [ + "`", + "`" + ], + [ + "T", + "ube" + ], + [ + "el", + "and" + ], + [ + "ela", + "nd" + ], + [ + "e", + "land" + ], + [ + "fo", + "ld" + ], + [ + "fol", + "d" + ], + [ + "f", + "old" + ], + [ + "og", + "o" + ], + [ + "o", + "go" + ], + [ + "▁appro", + "aches" + ], + [ + "▁approach", + "es" + ], + [ + "on", + "da" + ], + [ + "ond", + "a" + ], + [ + "ag", + "r" + ], + [ + "a", + "gr" + ], + [ + ",", + "$" + ], + [ + "▁{", + "{" + ], + [ + "▁", + "{{" + ], + [ + "▁Mod", + "ern" + ], + [ + "▁Mo", + "dern" + ], + [ + "▁Mode", + "rn" + ], + [ + "▁W", + "inter" + ], + [ + "▁Win", + "ter" + ], + [ + "av", + "ailable" + ], + [ + "▁L", + "ud" + ], + [ + "▁Lu", + "d" + ], + [ + "▁c", + "asa" + ], + [ + "▁cas", + "a" + ], + [ + "▁ca", + "sa" + ], + [ + "▁C", + "ould" + ], + [ + "▁Co", + "uld" + ], + [ + "▁Cou", + "ld" + ], + [ + "▁", + "Could" + ], + [ + "▁fif", + "teen" + ], + [ + "▁pot", + "entially" + ], + [ + "▁potential", + "ly" + ], + [ + "^", + "^" + ], + [ + "▁se", + "it" + ], + [ + "▁sei", + "t" + ], + [ + "An", + "imation" + ], + [ + "Anim", + "ation" + ], + [ + "ко", + "го" + ], + [ + "к", + "ого" + ], + [ + "Z", + "one" + ], + [ + "el", + "if" + ], + [ + "eli", + "f" + ], + [ + "e", + "lif" + ], + [ + "▁acknow", + "led" + ], + [ + "▁own", + "ership" + ], + [ + "▁owner", + "ship" + ], + [ + "▁owners", + "hip" + ], + [ + "▁describ", + "es" + ], + [ + "▁describe", + "s" + ], + [ + "▁re", + "verse" + ], + [ + "▁revers", + "e" + ], + [ + "▁rever", + "se" + ], + [ + "▁", + "reverse" + ], + [ + "▁con", + "test" + ], + [ + "▁cont", + "est" + ], + [ + "▁sc", + "ored" + ], + [ + "▁score", + "d" + ], + [ + "▁op", + "posed" + ], + [ + "▁opp", + "osed" + ], + [ + "▁oppos", + "ed" + ], + [ + "fl", + "ex" + ], + [ + "f", + "lex" + ], + [ + "kr", + "e" + ], + [ + "k", + "re" + ], + [ + "▁mer", + "ge" + ], + [ + "▁", + "merge" + ], + [ + "▁cover", + "ing" + ], + [ + "▁cov", + "ering" + ], + [ + "▁hon", + "estly" + ], + [ + "▁honest", + "ly" + ], + [ + "▁M", + "ess" + ], + [ + "▁Me", + "ss" + ], + [ + 
"▁Mes", + "s" + ], + [ + "▁r", + "arely" + ], + [ + "▁rare", + "ly" + ], + [ + "▁incred", + "ible" + ], + [ + "it", + "age" + ], + [ + "ita", + "ge" + ], + [ + "▁vict", + "ims" + ], + [ + "▁victim", + "s" + ], + [ + "ны", + "ми" + ], + [ + "ным", + "и" + ], + [ + "w", + "l" + ], + [ + "iz", + "za" + ], + [ + "izz", + "a" + ], + [ + "i", + "zza" + ], + [ + "d", + "n" + ], + [ + "on", + "de" + ], + [ + "ond", + "e" + ], + [ + "o", + "nde" + ], + [ + "▁pr", + "zy" + ], + [ + "▁prz", + "y" + ], + [ + "▁HT", + "ML" + ], + [ + "▁", + "HTML" + ], + [ + "▁pay", + "load" + ], + [ + "▁", + "payload" + ], + [ + "Bu", + "s" + ], + [ + "B", + "us" + ], + [ + "us", + "b" + ], + [ + "u", + "sb" + ], + [ + "F", + "n" + ], + [ + "▁display", + "ed" + ], + [ + "▁o", + "cean" + ], + [ + "▁A", + "venue" + ], + [ + "▁Av", + "enue" + ], + [ + "ac", + "ion" + ], + [ + "aci", + "on" + ], + [ + "acio", + "n" + ], + [ + "gh", + "an" + ], + [ + "g", + "han" + ], + [ + "met", + "ric" + ], + [ + "m", + "etric" + ], + [ + "ie", + "ties" + ], + [ + "iet", + "ies" + ], + [ + "▁attract", + "ive" + ], + [ + "▁attr", + "active" + ], + [ + "▁f", + "ö" + ], + [ + "▁", + "fö" + ], + [ + "C", + "reat" + ], + [ + "ver", + "ter" + ], + [ + "vert", + "er" + ], + [ + "▁Al", + "ice" + ], + [ + "▁Ali", + "ce" + ], + [ + "по", + "л" + ], + [ + "▁f", + "raction" + ], + [ + "▁fr", + "action" + ], + [ + "▁fra", + "ction" + ], + [ + "▁fract", + "ion" + ], + [ + "▁behav", + "iour" + ], + [ + "▁behavi", + "our" + ], + [ + "▁Jer", + "sey" + ], + [ + "▁re", + "venue" + ], + [ + "▁rev", + "enue" + ], + [ + "▁reven", + "ue" + ], + [ + "▁t", + "res" + ], + [ + "▁tr", + "es" + ], + [ + "▁tre", + "s" + ], + [ + "▁", + "tres" + ], + [ + "IL", + "D" + ], + [ + "I", + "LD" + ], + [ + "▁É", + "t" + ], + [ + "▁s", + "ync" + ], + [ + "▁sy", + "nc" + ], + [ + "▁syn", + "c" + ], + [ + "▁", + "sync" + ], + [ + "wi", + "ch" + ], + [ + "w", + "ich" + ], + [ + "▁anc", + "est" + ], + [ + "ъ", + "т" + ], + [ + "om", + "o" + ], + [ + "o", + "mo" + ], + [ + "▁I", + "de" + ], + [ + "▁Id", + "e" + ], + [ + "▁g", + "ained" + ], + [ + "▁gain", + "ed" + ], + [ + "▁ga", + "ined" + ], + [ + "▁moment", + "um" + ], + [ + "▁K", + "o" + ], + [ + "ie", + "u" + ], + [ + "i", + "eu" + ], + [ + "ie", + "lt" + ], + [ + "iel", + "t" + ], + [ + "i", + "elt" + ], + [ + "▁bon", + "us" + ], + [ + "▁te", + "xture" + ], + [ + "▁text", + "ure" + ], + [ + "▁", + "texture" + ], + [ + "Mod", + "al" + ], + [ + "Mo", + "dal" + ], + [ + "NE", + "XT" + ], + [ + "N", + "EXT" + ], + [ + "▁годи", + "не" + ], + [ + "▁l", + "anguages" + ], + [ + "▁language", + "s" + ], + [ + "v", + "t" + ], + [ + "▁represent", + "ing" + ], + [ + "▁D", + "ream" + ], + [ + "▁Dre", + "am" + ], + [ + "cur", + "r" + ], + [ + "cu", + "rr" + ], + [ + "qu", + "al" + ], + [ + "q", + "ual" + ], + [ + "▁j", + "s" + ], + [ + "▁", + "js" + ], + [ + "bu", + "rn" + ], + [ + "bur", + "n" + ], + [ + "b", + "urn" + ], + [ + "▁contribut", + "ions" + ], + [ + "▁contribution", + "s" + ], + [ + "▁r", + "ic" + ], + [ + "▁ri", + "c" + ], + [ + "▁", + "ric" + ], + [ + "}-", + "\\" + ], + [ + "}", + "-\\" + ], + [ + "={", + "{" + ], + [ + "=", + "{{" + ], + [ + "ca", + "rt" + ], + [ + "car", + "t" + ], + [ + "c", + "art" + ], + [ + "F", + "B" + ], + [ + "ju", + "d" + ], + [ + "j", + "ud" + ], + [ + "es", + "p" + ], + [ + "e", + "sp" + ], + [ + "▁elect", + "ron" + ], + [ + "▁electro", + "n" + ], + [ + "▁e", + "ll" + ], + [ + "▁el", + "l" + ], + [ + "▁", + "ell" + ], + [ + "▁Run", + "time" + ], + [ + "▁", + "Runtime" + ], + [ + "ac", + "hel" + 
], + [ + "ach", + "el" + ], + [ + "ache", + "l" + ], + [ + "a", + "chel" + ], + [ + "\\", + "_" + ], + [ + "we", + "ek" + ], + [ + "pack", + "et" + ], + [ + "p", + "acket" + ], + [ + "▁Secret", + "ary" + ], + [ + "▁Jahr", + "hund" + ], + [ + "▁th", + "reshold" + ], + [ + "▁", + "threshold" + ], + [ + "ba", + "ge" + ], + [ + "bag", + "e" + ], + [ + "b", + "age" + ], + [ + "▁con", + "cer" + ], + [ + "▁conc", + "er" + ], + [ + "▁conce", + "r" + ], + [ + "▁b", + "one" + ], + [ + "▁bo", + "ne" + ], + [ + "▁bon", + "e" + ], + [ + "▁", + "bone" + ], + [ + "▁Holly", + "wood" + ], + [ + "Cur", + "sor" + ], + [ + "C", + "ursor" + ], + [ + "▁aw", + "arded" + ], + [ + "▁award", + "ed" + ], + [ + "▁sum", + "mary" + ], + [ + "▁summar", + "y" + ], + [ + "▁", + "summary" + ], + [ + "ag", + "gio" + ], + [ + "agg", + "io" + ], + [ + "aggi", + "o" + ], + [ + "▁st", + "ell" + ], + [ + "▁ste", + "ll" + ], + [ + "▁", + "stell" + ], + [ + "▁f", + "lesh" + ], + [ + "▁fl", + "esh" + ], + [ + "▁fle", + "sh" + ], + [ + "P", + "air" + ], + [ + "▁A", + "ge" + ], + [ + "▁Ag", + "e" + ], + [ + "ing", + "ton" + ], + [ + "▁'", + "." + ], + [ + "▁", + "'." + ], + [ + "as", + "er" + ], + [ + "ase", + "r" + ], + [ + "a", + "ser" + ], + [ + "ко", + "ва" + ], + [ + "ков", + "а" + ], + [ + "▁qu", + "art" + ], + [ + "▁q", + "uart" + ], + [ + "▁quar", + "t" + ], + [ + "ry", + "ption" + ], + [ + "rypt", + "ion" + ], + [ + "All", + "oc" + ], + [ + "Al", + "loc" + ], + [ + "ft", + "en" + ], + [ + "fte", + "n" + ], + [ + "f", + "ten" + ], + [ + "Oper", + "and" + ], + [ + "▁ind", + "icated" + ], + [ + "▁indic", + "ated" + ], + [ + "▁indicate", + "d" + ], + [ + "($", + "_" + ], + [ + "(", + "$_" + ], + [ + "get", + "String" + ], + [ + "▁list", + "ener" + ], + [ + "▁listen", + "er" + ], + [ + "▁", + "listener" + ], + [ + "sp", + "ir" + ], + [ + "spi", + "r" + ], + [ + ")", + "_" + ], + [ + "ve", + "ns" + ], + [ + "ven", + "s" + ], + [ + "v", + "ens" + ], + [ + "▁food", + "s" + ], + [ + "▁foo", + "ds" + ], + [ + "an", + "za" + ], + [ + "anz", + "a" + ], + [ + "te", + "il" + ], + [ + "DE", + "SC" + ], + [ + "▁n", + "otion" + ], + [ + "▁not", + "ion" + ], + [ + "▁em", + "ployment" + ], + [ + "▁employ", + "ment" + ], + [ + "▁s", + "wing" + ], + [ + "▁sw", + "ing" + ], + [ + "▁", + "swing" + ], + [ + "nb", + "sp" + ], + [ + "▁p", + "ounds" + ], + [ + "▁pound", + "s" + ], + [ + "to", + "ols" + ], + [ + "tool", + "s" + ], + [ + "too", + "ls" + ], + [ + "t", + "ools" + ], + [ + "▁particip", + "ate" + ], + [ + "▁T", + "ax" + ], + [ + "▁Ta", + "x" + ], + [ + "▁", + "Tax" + ], + [ + "▁с", + "кла" + ], + [ + "ap", + "ol" + ], + [ + "a", + "pol" + ], + [ + "▁f", + "ost" + ], + [ + "▁fo", + "st" + ], + [ + "▁fos", + "t" + ], + [ + "com", + "pat" + ], + [ + "comp", + "at" + ], + [ + "▁public", + "ation" + ], + [ + "▁rapid", + "ly" + ], + [ + "▁W", + "is" + ], + [ + "▁Wi", + "s" + ], + [ + "Event", + "Listener" + ], + [ + "▁prem", + "ière" + ], + [ + "▁premi", + "ère" + ], + [ + "us", + "o" + ], + [ + "u", + "so" + ], + [ + "ext", + "end" + ], + [ + "▁M", + "ERCHANTABILITY" + ], + [ + "UT", + "F" + ], + [ + "U", + "TF" + ], + [ + "▁exper", + "iments" + ], + [ + "▁experi", + "ments" + ], + [ + "▁experiment", + "s" + ], + [ + "sin", + "gle" + ], + [ + "sing", + "le" + ], + [ + "s", + "ingle" + ], + [ + "z", + "k" + ], + [ + "▁n", + "aj" + ], + [ + "▁na", + "j" + ], + [ + "}}", + "}" + ], + [ + "}", + "}}" + ], + [ + "Li", + "n" + ], + [ + "L", + "in" + ], + [ + "▁inter", + "act" + ], + [ + "▁inte", + "ract" + ], + [ + "▁c", + "ms" + ], + [ + "▁cm", + "s" 
+ ], + [ + "▁Ro", + "ger" + ], + [ + "▁Rog", + "er" + ], + [ + "▁Р", + "у" + ], + [ + ">", + "'" + ], + [ + "com", + "mit" + ], + [ + "comm", + "it" + ], + [ + "ло", + "сь" + ], + [ + "▁out", + "come" + ], + [ + "▁h", + "its" + ], + [ + "▁hit", + "s" + ], + [ + "▁hi", + "ts" + ], + [ + "▁и", + "м" + ], + [ + "▁", + "им" + ], + [ + "▁s", + "park" + ], + [ + "▁sp", + "ark" + ], + [ + "con", + "sole" + ], + [ + "cons", + "ole" + ], + [ + "▁ver", + "w" + ], + [ + "▁ve", + "rw" + ], + [ + "▁ка", + "то" + ], + [ + "agnost", + "ics" + ], + [ + "agnostic", + "s" + ], + [ + "▁s", + "oci" + ], + [ + "▁so", + "ci" + ], + [ + "▁soc", + "i" + ], + [ + "▁d", + "ining" + ], + [ + "▁di", + "ning" + ], + [ + "▁din", + "ing" + ], + [ + "▁t", + "ech" + ], + [ + "▁te", + "ch" + ], + [ + "▁", + "tech" + ], + [ + "š", + "t" + ], + [ + "fo", + "lio" + ], + [ + "fol", + "io" + ], + [ + "ult", + "ane" + ], + [ + "ultan", + "e" + ], + [ + "кт", + "ор" + ], + [ + "кто", + "р" + ], + [ + "к", + "тор" + ], + [ + "▁B", + "rand" + ], + [ + "▁Br", + "and" + ], + [ + "▁Bra", + "nd" + ], + [ + "Jo", + "in" + ], + [ + "J", + "oin" + ], + [ + "▁и", + "ю" + ], + [ + "▁p", + "ros" + ], + [ + "▁pro", + "s" + ], + [ + "▁pr", + "os" + ], + [ + "▁pos", + "it" + ], + [ + "Pub", + "lic" + ], + [ + "P", + "ublic" + ], + [ + "AspNet", + "Core" + ], + [ + "▁S", + "hop" + ], + [ + "▁Sh", + "op" + ], + [ + "▁", + "Shop" + ], + [ + "▁co", + "inc" + ], + [ + "▁coin", + "c" + ], + [ + "ни", + "ем" + ], + [ + "ние", + "м" + ], + [ + "▁re", + "ferences" + ], + [ + "▁refer", + "ences" + ], + [ + "▁reference", + "s" + ], + [ + "ab", + "out" + ], + [ + "name", + "space" + ], + [ + "names", + "pace" + ], + [ + "D", + "L" + ], + [ + "▁I", + "R" + ], + [ + "▁", + "IR" + ], + [ + "▁c", + "ada" + ], + [ + "▁ca", + "da" + ], + [ + "▁cad", + "a" + ], + [ + "▁Jord", + "an" + ], + [ + "▁g", + "ep" + ], + [ + "▁ge", + "p" + ], + [ + "▁b", + "ron" + ], + [ + "▁br", + "on" + ], + [ + "▁bro", + "n" + ], + [ + "andid", + "ate" + ], + [ + "EX", + "PECT" + ], + [ + "EXP", + "ECT" + ], + [ + "am", + "o" + ], + [ + "a", + "mo" + ], + [ + "▁De", + "utsch" + ], + [ + "au", + "c" + ], + [ + "a", + "uc" + ], + [ + "▁ра", + "йо" + ], + [ + "▁рай", + "о" + ], + [ + "▁L", + "abor" + ], + [ + "▁La", + "bor" + ], + [ + "▁Lab", + "or" + ], + [ + "▁surround", + "ed" + ], + [ + "т", + "ро" + ], + [ + "▁n", + "ome" + ], + [ + "▁no", + "me" + ], + [ + "▁nom", + "e" + ], + [ + "▁under", + "lying" + ], + [ + "▁educ", + "ational" + ], + [ + "▁education", + "al" + ], + [ + "R", + "IGHT" + ], + [ + "CO", + "UNT" + ], + [ + "in", + "ch" + ], + [ + "inc", + "h" + ], + [ + "Ty", + "p" + ], + [ + "T", + "yp" + ], + [ + "um", + "ph" + ], + [ + "ump", + "h" + ], + [ + "fo", + "ur" + ], + [ + "f", + "our" + ], + [ + "Control", + "s" + ], + [ + "▁c", + "p" + ], + [ + "▁", + "cp" + ], + [ + "co", + "st" + ], + [ + "cos", + "t" + ], + [ + "c", + "ost" + ], + [ + "▁mechan", + "ism" + ], + [ + "en", + "ess" + ], + [ + "ene", + "ss" + ], + [ + "enes", + "s" + ], + [ + "e", + "ness" + ], + [ + "é", + "qu" + ], + [ + "▁acqu", + "ired" + ], + [ + "▁acquire", + "d" + ], + [ + "▁f", + "alls" + ], + [ + "▁fall", + "s" + ], + [ + "▁fal", + "ls" + ], + [ + "▁", + "falls" + ], + [ + "▁H", + "ou" + ], + [ + "▁Ho", + "u" + ], + [ + "▁L", + "E" + ], + [ + "▁", + "LE" + ], + [ + "for", + "Each" + ], + [ + "▁ver", + "tex" + ], + [ + "▁vert", + "ex" + ], + [ + "▁", + "vertex" + ], + [ + "▁I", + "F" + ], + [ + "▁", + "IF" + ], + [ + "cur", + "s" + ], + [ + "cu", + "rs" + ], + [ + "c", + "urs" + ], + [ + "'", + 
"=>" + ], + [ + "те", + "ри" + ], + [ + "тер", + "и" + ], + [ + "▁S", + "A" + ], + [ + "▁", + "SA" + ], + [ + "ri", + "ers" + ], + [ + "rie", + "rs" + ], + [ + "rier", + "s" + ], + [ + "r", + "iers" + ], + [ + "▁u", + "w" + ], + [ + "▁", + "uw" + ], + [ + "▁m", + "arks" + ], + [ + "▁mark", + "s" + ], + [ + "▁mar", + "ks" + ], + [ + "▁", + "marks" + ], + [ + "▁en", + "erg" + ], + [ + "▁ener", + "g" + ], + [ + "ho", + "f" + ], + [ + "h", + "of" + ], + [ + "ylv", + "ania" + ], + [ + "▁Al", + "len" + ], + [ + "▁All", + "en" + ], + [ + "um", + "py" + ], + [ + "ump", + "y" + ], + [ + "о", + "го" + ], + [ + "ст", + "ву" + ], + [ + "ств", + "у" + ], + [ + "vo", + "ice" + ], + [ + "v", + "oice" + ], + [ + "▁en", + "gage" + ], + [ + "▁eng", + "age" + ], + [ + "▁m", + "ant" + ], + [ + "▁man", + "t" + ], + [ + "▁ma", + "nt" + ], + [ + "or", + "se" + ], + [ + "ors", + "e" + ], + [ + "==", + "=" + ], + [ + "=", + "==" + ], + [ + "▁impro", + "vement" + ], + [ + "▁improve", + "ment" + ], + [ + "Op", + "t" + ], + [ + "O", + "pt" + ], + [ + "▁arr", + "ested" + ], + [ + "▁arrest", + "ed" + ], + [ + "ти", + "я" + ], + [ + "▁с", + "ле" + ], + [ + "▁", + "сле" + ], + [ + "it", + "ched" + ], + [ + "itch", + "ed" + ], + [ + "soc", + "ket" + ], + [ + "sock", + "et" + ], + [ + "s", + "ocket" + ], + [ + "▁c", + "ycl" + ], + [ + "▁cy", + "cl" + ], + [ + "▁", + "cycl" + ], + [ + "▁S", + "M" + ], + [ + "▁", + "SM" + ], + [ + "▁S", + "ex" + ], + [ + "▁Se", + "x" + ], + [ + "▁neut", + "ral" + ], + [ + "▁neutr", + "al" + ], + [ + "ва", + "в" + ], + [ + "▁J", + "ess" + ], + [ + "▁Je", + "ss" + ], + [ + "▁Jes", + "s" + ], + [ + "▁d", + "ip" + ], + [ + "▁di", + "p" + ], + [ + "▁op", + "position" + ], + [ + "▁oppos", + "ition" + ], + [ + "▁b", + "orrow" + ], + [ + "▁bor", + "row" + ], + [ + "с", + "пе" + ], + [ + "▁av", + "ant" + ], + [ + "ко", + "ла" + ], + [ + "▁t", + "a" + ], + [ + "▁", + "ta" + ], + [ + "An", + "im" + ], + [ + "A", + "nim" + ], + [ + "▁G", + "all" + ], + [ + "▁Gal", + "l" + ], + [ + "▁Ga", + "ll" + ], + [ + "rg", + "b" + ], + [ + "r", + "gb" + ], + [ + "▁gu", + "ilty" + ], + [ + "▁guilt", + "y" + ], + [ + "▁bu", + "ried" + ], + [ + "▁bur", + "ied" + ], + [ + "▁g", + "y" + ], + [ + "▁", + "gy" + ], + [ + "Init", + "ial" + ], + [ + "▁acc", + "omp" + ], + [ + "▁ac", + "comp" + ], + [ + "▁accom", + "p" + ], + [ + "▁breath", + "ing" + ], + [ + "▁breat", + "hing" + ], + [ + "ber", + "ry" + ], + [ + "b", + "erry" + ], + [ + "GR", + "O" + ], + [ + "G", + "RO" + ], + [ + "▁subsequ", + "ent" + ], + [ + "rou", + "pe" + ], + [ + "roup", + "e" + ], + [ + "ul", + "pt" + ], + [ + "ulp", + "t" + ], + [ + "t", + "b" + ], + [ + "▁", + "ä" + ], + [ + "P", + "i" + ], + [ + "arg", + "v" + ], + [ + "▁M", + "ust" + ], + [ + "▁Mus", + "t" + ], + [ + "▁Mu", + "st" + ], + [ + "▁", + "Must" + ], + [ + ":", + "'" + ], + [ + "sv", + "g" + ], + [ + "ou", + "p" + ], + [ + "o", + "up" + ], + [ + "▁prec", + "isely" + ], + [ + "▁precise", + "ly" + ], + [ + "▁T", + "a" + ], + [ + "re", + "na" + ], + [ + "ren", + "a" + ], + [ + "r", + "ena" + ], + [ + "▁f", + "older" + ], + [ + "▁fol", + "der" + ], + [ + "▁fold", + "er" + ], + [ + "▁", + "folder" + ], + [ + "▁Ch", + "annel" + ], + [ + "▁", + "Channel" + ], + [ + "▁re", + "vol" + ], + [ + "▁rev", + "ol" + ], + [ + "M", + "iss" + ], + [ + "ло", + "м" + ], + [ + "red", + "dit" + ], + [ + "adel", + "ph" + ], + [ + "▁dis", + "crim" + ], + [ + "▁disc", + "rim" + ], + [ + "▁a", + "ve" + ], + [ + "▁av", + "e" + ], + [ + "▁", + "ave" + ], + [ + "pl", + "eted" + ], + [ + "ple", + "ted" + ], + [ + 
"plete", + "d" + ], + [ + "plet", + "ed" + ], + [ + "p", + "leted" + ], + [ + "▁g", + "ently" + ], + [ + "▁gent", + "ly" + ], + [ + "FF", + "FF" + ], + [ + "ro", + "py" + ], + [ + "rop", + "y" + ], + [ + "r", + "opy" + ], + [ + "▁d", + "ial" + ], + [ + "▁di", + "al" + ], + [ + "▁dia", + "l" + ], + [ + "Not", + "Found" + ], + [ + "▁\"", + "[" + ], + [ + "Hom", + "e" + ], + [ + "H", + "ome" + ], + [ + "on", + "te" + ], + [ + "ont", + "e" + ], + [ + "o", + "nte" + ], + [ + "▁re", + "lie" + ], + [ + "▁rel", + "ie" + ], + [ + "▁reli", + "e" + ], + [ + "▁Con", + "text" + ], + [ + "▁Cont", + "ext" + ], + [ + "▁", + "Context" + ], + [ + "▁st", + "ats" + ], + [ + "▁stat", + "s" + ], + [ + "▁sta", + "ts" + ], + [ + "▁", + "stats" + ], + [ + "▁E", + "nergy" + ], + [ + "oun", + "ced" + ], + [ + "ounce", + "d" + ], + [ + "▁gr", + "ave" + ], + [ + "▁grav", + "e" + ], + [ + "▁gra", + "ve" + ], + [ + "▁re", + "cip" + ], + [ + "▁rec", + "ip" + ], + [ + "ли", + "н" + ], + [ + "л", + "ин" + ], + [ + "bl", + "og" + ], + [ + "blo", + "g" + ], + [ + "b", + "log" + ], + [ + "▁na", + "am" + ], + [ + "▁w", + "o" + ], + [ + "▁", + "wo" + ], + [ + "▁direct", + "ions" + ], + [ + "▁dire", + "ctions" + ], + [ + "▁direction", + "s" + ], + [ + "▁Lin", + "coln" + ], + [ + "!", + ")" + ], + [ + "un", + "ci" + ], + [ + "unc", + "i" + ], + [ + "ne", + "q" + ], + [ + "n", + "eq" + ], + [ + "Tag", + "s" + ], + [ + "T", + "ags" + ], + [ + "▁t", + "um" + ], + [ + "▁tu", + "m" + ], + [ + "▁s", + "aving" + ], + [ + "▁sa", + "ving" + ], + [ + "▁sav", + "ing" + ], + [ + "ail", + "le" + ], + [ + "ai", + "lle" + ], + [ + "a", + "ille" + ], + [ + "item", + "ize" + ], + [ + "▁F", + "amil" + ], + [ + "▁Fa", + "mil" + ], + [ + "ms", + "m" + ], + [ + "m", + "sm" + ], + [ + "ne", + "ws" + ], + [ + "new", + "s" + ], + [ + "FF", + "ER" + ], + [ + "F", + "FER" + ], + [ + "▁D", + "ead" + ], + [ + "▁De", + "ad" + ], + [ + "▁", + "Dead" + ], + [ + "▁terr", + "itory" + ], + [ + "▁territor", + "y" + ], + [ + "▁territo", + "ry" + ], + [ + "▁K", + "at" + ], + [ + "▁Ka", + "t" + ], + [ + "oc", + "ker" + ], + [ + "ock", + "er" + ], + [ + "o", + "cker" + ], + [ + "in", + "teger" + ], + [ + "inte", + "ger" + ], + [ + "▁s", + "ne" + ], + [ + "▁sn", + "e" + ], + [ + "▁f", + "ails" + ], + [ + "▁fa", + "ils" + ], + [ + "▁fail", + "s" + ], + [ + "▁franç", + "ais" + ], + [ + "▁int", + "roduction" + ], + [ + "▁introdu", + "ction" + ], + [ + "▁G", + "rant" + ], + [ + "▁Gr", + "ant" + ], + [ + "▁Gran", + "t" + ], + [ + "▁Gra", + "nt" + ], + [ + "ycl", + "e" + ], + [ + "yc", + "le" + ], + [ + "y", + "cle" + ], + [ + "']", + "." + ], + [ + "'", + "]." 
+ ], + [ + "▁v", + "ier" + ], + [ + "▁vi", + "er" + ], + [ + "▁vie", + "r" + ], + [ + "▁", + "vier" + ], + [ + "nat", + "ive" + ], + [ + "n", + "ative" + ], + [ + "▁K", + "le" + ], + [ + "▁Kl", + "e" + ], + [ + "qu", + "ote" + ], + [ + "quot", + "e" + ], + [ + "User", + "s" + ], + [ + "Us", + "ers" + ], + [ + "Use", + "rs" + ], + [ + "▁ad", + "vis" + ], + [ + "▁adv", + "is" + ], + [ + "▁g", + "ym" + ], + [ + "▁gy", + "m" + ], + [ + "▁prote", + "in" + ], + [ + "ا", + "ل" + ], + [ + "▁M", + "ai" + ], + [ + "▁Ma", + "i" + ], + [ + "▁prov", + "iders" + ], + [ + "▁provide", + "rs" + ], + [ + "▁provider", + "s" + ], + [ + "▁so", + "il" + ], + [ + "gu", + "i" + ], + [ + "g", + "ui" + ], + [ + "▁N", + "ation" + ], + [ + "▁Nat", + "ion" + ], + [ + "re", + "ation" + ], + [ + "reat", + "ion" + ], + [ + "▁T", + "ab" + ], + [ + "▁Ta", + "b" + ], + [ + "▁", + "Tab" + ], + [ + "en", + "sis" + ], + [ + "ens", + "is" + ], + [ + "in", + "as" + ], + [ + "ina", + "s" + ], + [ + "i", + "nas" + ], + [ + "▁Scot", + "land" + ], + [ + "▁dis", + "patch" + ], + [ + "▁disp", + "atch" + ], + [ + "▁", + "dispatch" + ], + [ + "un", + "ion" + ], + [ + "uni", + "on" + ], + [ + "▁b", + "ere" + ], + [ + "▁be", + "re" + ], + [ + "▁ber", + "e" + ], + [ + "▁", + "bere" + ], + [ + "▁P", + "ow" + ], + [ + "▁Po", + "w" + ], + [ + "▁H", + "ig" + ], + [ + "▁Hi", + "g" + ], + [ + "▁stud", + "ying" + ], + [ + "▁study", + "ing" + ], + [ + "RE", + "F" + ], + [ + "R", + "EF" + ], + [ + "SS", + "L" + ], + [ + "S", + "SL" + ], + [ + "▁f", + "right" + ], + [ + "▁fr", + "ight" + ], + [ + "▁S", + "ORT" + ], + [ + "▁SO", + "RT" + ], + [ + "▁com", + "pr" + ], + [ + "▁comp", + "r" + ], + [ + "▁Mad", + "rid" + ], + [ + "row", + "ned" + ], + [ + "rown", + "ed" + ], + [ + "r", + "owned" + ], + [ + "op", + "es" + ], + [ + "ope", + "s" + ], + [ + "o", + "pes" + ], + [ + "pd", + "ev" + ], + [ + "p", + "dev" + ], + [ + "▁w", + "ash" + ], + [ + "▁was", + "h" + ], + [ + "▁wa", + "sh" + ], + [ + "▁'", + "../../" + ], + [ + "▁'../", + "../" + ], + [ + "}}", + "_" + ], + [ + "}", + "}_" + ], + [ + "▁acc", + "um" + ], + [ + "rol", + "ling" + ], + [ + "roll", + "ing" + ], + [ + "▁N", + "C" + ], + [ + "▁", + "NC" + ], + [ + "▁f", + "iction" + ], + [ + "▁fi", + "ction" + ], + [ + "▁fict", + "ion" + ], + [ + "ip", + "t" + ], + [ + "i", + "pt" + ], + [ + "conne", + "cted" + ], + [ + "connect", + "ed" + ], + [ + "lim", + "its" + ], + [ + "limit", + "s" + ], + [ + "▁l", + "ap" + ], + [ + "▁la", + "p" + ], + [ + "▁", + "lap" + ], + [ + "▁where", + "as" + ], + [ + "pro", + "m" + ], + [ + "pr", + "om" + ], + [ + "p", + "rom" + ], + [ + "▁appoint", + "ment" + ], + [ + "Pro", + "gram" + ], + [ + "Pr", + "ogram" + ], + [ + "▁П", + "ер" + ], + [ + "▁Пе", + "р" + ], + [ + "na", + "h" + ], + [ + "n", + "ah" + ], + [ + "Valid", + "ation" + ], + [ + "ic", + "ons" + ], + [ + "ico", + "ns" + ], + [ + "icon", + "s" + ], + [ + "i", + "cons" + ], + [ + "äl", + "l" + ], + [ + "ä", + "ll" + ], + [ + "▁rad", + "ical" + ], + [ + "▁radi", + "cal" + ], + [ + "▁ex", + "clusive" + ], + [ + "▁excl", + "usive" + ], + [ + "▁exclus", + "ive" + ], + [ + "em", + "ony" + ], + [ + "emon", + "y" + ], + [ + "▁challeng", + "ing" + ], + [ + "▁m", + "s" + ], + [ + "▁", + "ms" + ], + [ + "▁P", + "rivate" + ], + [ + "▁Priv", + "ate" + ], + [ + "▁", + "Private" + ], + [ + "▁v", + "ida" + ], + [ + "▁vi", + "da" + ], + [ + "▁vid", + "a" + ], + [ + "▁дру", + "ги" + ], + [ + "▁camp", + "us" + ], + [ + "▁cam", + "pus" + ], + [ + "form", + "s" + ], + [ + "for", + "ms" + ], + [ + "д", + "но" + ], + [ + "pl", + 
"aat" + ], + [ + "bs", + "t" + ], + [ + "b", + "st" + ], + [ + "AT", + "ED" + ], + [ + "ATE", + "D" + ], + [ + "▁Ab", + "stract" + ], + [ + "▁Abs", + "tract" + ], + [ + "▁", + "Abstract" + ], + [ + "▁int", + "ense" + ], + [ + "▁intens", + "e" + ], + [ + "▁L", + "td" + ], + [ + "▁contro", + "vers" + ], + [ + "ó", + "g" + ], + [ + "▁s", + "ă" + ], + [ + "▁land", + "ing" + ], + [ + "▁lan", + "ding" + ], + [ + "!", + "=" + ], + [ + "▁sc", + "enes" + ], + [ + "▁scene", + "s" + ], + [ + "▁scen", + "es" + ], + [ + "▁Ch", + "ap" + ], + [ + "▁Cha", + "p" + ], + [ + "▁sp", + "oken" + ], + [ + "▁spoke", + "n" + ], + [ + "▁spo", + "ken" + ], + [ + "cre", + "d" + ], + [ + "cr", + "ed" + ], + [ + "c", + "red" + ], + [ + "▁p", + "ride" + ], + [ + "▁pr", + "ide" + ], + [ + "▁pri", + "de" + ], + [ + "qu", + "et" + ], + [ + "que", + "t" + ], + [ + "▁m", + "eter" + ], + [ + "▁me", + "ter" + ], + [ + "▁met", + "er" + ], + [ + "▁de", + "utsch" + ], + [ + "uu", + "m" + ], + [ + "u", + "um" + ], + [ + "▁b", + "less" + ], + [ + "▁bl", + "ess" + ], + [ + "▁ble", + "ss" + ], + [ + "▁H", + "ann" + ], + [ + "▁Ha", + "nn" + ], + [ + "▁Han", + "n" + ], + [ + "▁input", + "s" + ], + [ + "▁", + "inputs" + ], + [ + "▁R", + "ow" + ], + [ + "▁Ro", + "w" + ], + [ + "▁", + "Row" + ], + [ + "▁with", + "draw" + ], + [ + "▁withd", + "raw" + ], + [ + "P", + "al" + ], + [ + "ac", + "les" + ], + [ + "acle", + "s" + ], + [ + "acl", + "es" + ], + [ + "a", + "cles" + ], + [ + "as", + "sets" + ], + [ + "ass", + "ets" + ], + [ + "asse", + "ts" + ], + [ + "asset", + "s" + ], + [ + "▁v", + "l" + ], + [ + "▁", + "vl" + ], + [ + "ве", + "де" + ], + [ + "вед", + "е" + ], + [ + "▁G", + "ot" + ], + [ + "▁Go", + "t" + ], + [ + "▁air", + "port" + ], + [ + "win", + "d" + ], + [ + "wi", + "nd" + ], + [ + "w", + "ind" + ], + [ + "▁Columb", + "ia" + ], + [ + "▁ch", + "ocolate" + ], + [ + "▁h", + "ö" + ], + [ + "▁", + "hö" + ], + [ + "▁al", + "arm" + ], + [ + "FT", + "WARE" + ], + [ + "▁J", + "ay" + ], + [ + "▁Ja", + "y" + ], + [ + "▁s", + "ake" + ], + [ + "▁sa", + "ke" + ], + [ + "▁reg", + "istration" + ], + [ + "▁registr", + "ation" + ], + [ + "vi", + "d" + ], + [ + "v", + "id" + ], + [ + "▁l", + "ake" + ], + [ + "▁la", + "ke" + ], + [ + "▁user", + "name" + ], + [ + "▁", + "username" + ], + [ + "▁h", + "ack" + ], + [ + "▁ha", + "ck" + ], + [ + "index", + "Of" + ], + [ + "c", + "x" + ], + [ + "▁f", + "estival" + ], + [ + "▁fest", + "ival" + ], + [ + "▁club", + "s" + ], + [ + "case", + "s" + ], + [ + "ca", + "ses" + ], + [ + "cas", + "es" + ], + [ + "c", + "ases" + ], + [ + "CT", + "RL" + ], + [ + "];", + "\r" + ], + [ + "]", + ";\r" + ], + [ + "▁A", + "ud" + ], + [ + "▁Au", + "d" + ], + [ + "▁", + "Aud" + ], + [ + "▁prim", + "era" + ], + [ + "▁prime", + "ra" + ], + [ + "▁primer", + "a" + ], + [ + "ва", + "т" + ], + [ + "в", + "ат" + ], + [ + "▁brill", + "iant" + ], + [ + "ut", + "her" + ], + [ + "uth", + "er" + ], + [ + "u", + "ther" + ], + [ + "▁difficult", + "y" + ], + [ + "it", + "als" + ], + [ + "ital", + "s" + ], + [ + "ita", + "ls" + ], + [ + "▁sc", + "ores" + ], + [ + "▁score", + "s" + ], + [ + "▁pol", + "ít" + ], + [ + "data", + "base" + ], + [ + "dat", + "abase" + ], + [ + "as", + "ka" + ], + [ + "ask", + "a" + ], + [ + "a", + "ska" + ], + [ + "▁##", + "####" + ], + [ + "▁###", + "###" + ], + [ + "▁####", + "##" + ], + [ + "▁#####", + "#" + ], + [ + "▁a", + "cid" + ], + [ + "▁ac", + "id" + ], + [ + "at", + "on" + ], + [ + "ato", + "n" + ], + [ + "a", + "ton" + ], + [ + "at", + "omic" + ], + [ + "ato", + "mic" + ], + [ + "atom", + "ic" + ], + 
[ + "fr", + "eq" + ], + [ + "fre", + "q" + ], + [ + "f", + "req" + ], + [ + "▁WARRAN", + "TY" + ], + [ + "▁report", + "ing" + ], + [ + ".)", + "," + ], + [ + ".", + ")," + ], + [ + "▁n", + "ights" + ], + [ + "▁night", + "s" + ], + [ + "▁program", + "me" + ], + [ + ")}", + "{" + ], + [ + ")", + "}{" + ], + [ + "xi", + "c" + ], + [ + "x", + "ic" + ], + [ + "▁s", + "po" + ], + [ + "▁sp", + "o" + ], + [ + "line", + "d" + ], + [ + "li", + "ned" + ], + [ + "lin", + "ed" + ], + [ + "l", + "ined" + ], + [ + "qu", + "arters" + ], + [ + "er", + "ee" + ], + [ + "ere", + "e" + ], + [ + "e", + "ree" + ], + [ + "mer", + "s" + ], + [ + "me", + "rs" + ], + [ + "m", + "ers" + ], + [ + "▁s", + "erves" + ], + [ + "▁ser", + "ves" + ], + [ + "▁serv", + "es" + ], + [ + "▁serve", + "s" + ], + [ + "co", + "w" + ], + [ + "c", + "ow" + ], + [ + "ль", + "ко" + ], + [ + "en", + "so" + ], + [ + "ens", + "o" + ], + [ + "▁env", + "iron" + ], + [ + "▁", + "environ" + ], + [ + "Li", + "ke" + ], + [ + "L", + "ike" + ], + [ + "an", + "che" + ], + [ + "anc", + "he" + ], + [ + "anch", + "e" + ], + [ + "▁cr", + "ash" + ], + [ + "▁K", + "ap" + ], + [ + "▁Ka", + "p" + ], + [ + "no", + "indent" + ], + [ + "Con", + "n" + ], + [ + "Co", + "nn" + ], + [ + "▁ав", + "то" + ], + [ + "▁in", + "frastructure" + ], + [ + "IM", + "E" + ], + [ + "I", + "ME" + ], + [ + "▁R", + "oom" + ], + [ + "▁Ro", + "om" + ], + [ + "▁", + "Room" + ], + [ + "ne", + "ed" + ], + [ + "n", + "eed" + ], + [ + "or", + "er" + ], + [ + "ore", + "r" + ], + [ + "o", + "rer" + ], + [ + "▁D", + "est" + ], + [ + "▁De", + "st" + ], + [ + "▁Des", + "t" + ], + [ + "▁", + "Dest" + ], + [ + "▁D", + "omin" + ], + [ + "▁Do", + "min" + ], + [ + "▁Dom", + "in" + ], + [ + "ather", + "ine" + ], + [ + "▁Syd", + "ney" + ], + [ + "▁g", + "auge" + ], + [ + "▁gau", + "ge" + ], + [ + "▁ga", + "uge" + ], + [ + "▁j", + "et" + ], + [ + "▁je", + "t" + ], + [ + "▁", + "jet" + ], + [ + "b", + "ably" + ], + [ + "▁comm", + "only" + ], + [ + "▁common", + "ly" + ], + [ + "▁st", + "ations" + ], + [ + "▁stat", + "ions" + ], + [ + "▁station", + "s" + ], + [ + "ia", + "h" + ], + [ + "i", + "ah" + ], + [ + "n", + "l" + ], + [ + "ж", + "у" + ], + [ + "et", + "en" + ], + [ + "ete", + "n" + ], + [ + "e", + "ten" + ], + [ + "_", + ")" + ], + [ + "ia", + "c" + ], + [ + "i", + "ac" + ], + [ + "am", + "os" + ], + [ + "amo", + "s" + ], + [ + "a", + "mos" + ], + [ + "ne", + "ment" + ], + [ + "nem", + "ent" + ], + [ + "n", + "ement" + ], + [ + "ko", + "n" + ], + [ + "k", + "on" + ], + [ + "Inter", + "val" + ], + [ + "▁cab", + "in" + ], + [ + "▁ca", + "bin" + ], + [ + "▁e", + "g" + ], + [ + "▁", + "eg" + ], + [ + "▁sh", + "ots" + ], + [ + "▁shot", + "s" + ], + [ + "▁", + "shots" + ], + [ + "▁A", + "rea" + ], + [ + "▁Ar", + "ea" + ], + [ + "▁Are", + "a" + ], + [ + "▁", + "Area" + ], + [ + "sm", + "ith" + ], + [ + "param", + "eter" + ], + [ + "'", + "}" + ], + [ + "▁h", + "em" + ], + [ + "▁he", + "m" + ], + [ + "▁", + "hem" + ], + [ + "▁s", + "inging" + ], + [ + "▁sing", + "ing" + ], + [ + "▁sin", + "ging" + ], + [ + "▁access", + "ible" + ], + [ + "▁P", + "rin" + ], + [ + "▁Pr", + "in" + ], + [ + "▁Pri", + "n" + ], + [ + "opt", + "ional" + ], + [ + "option", + "al" + ], + [ + "an", + "cial" + ], + [ + "anc", + "ial" + ], + [ + "ancia", + "l" + ], + [ + "sh", + "ips" + ], + [ + "ship", + "s" + ], + [ + "▁can", + "vas" + ], + [ + "▁", + "canvas" + ], + [ + "sp", + "e" + ], + [ + "s", + "pe" + ], + [ + "▁address", + "es" + ], + [ + "▁x", + "ml" + ], + [ + "▁", + "xml" + ], + [ + "▁'", + "\"" + ], + [ + "▁", + "'\"" + 
], + [ + "▁k", + "ar" + ], + [ + "▁ka", + "r" + ], + [ + "▁", + "kar" + ], + [ + "ö", + "ff" + ], + [ + "▁a", + "ges" + ], + [ + "▁ag", + "es" + ], + [ + "▁age", + "s" + ], + [ + "▁", + "ages" + ], + [ + "ё", + "р" + ], + [ + "zi", + "ng" + ], + [ + "zin", + "g" + ], + [ + "z", + "ing" + ], + [ + "▁ö", + "ver" + ], + [ + "▁C", + "lean" + ], + [ + "▁Cle", + "an" + ], + [ + "▁", + "Clean" + ], + [ + "▁Sil", + "ver" + ], + [ + "▁о", + "со" + ], + [ + "▁ос", + "о" + ], + [ + "he", + "alth" + ], + [ + "Al", + "i" + ], + [ + "A", + "li" + ], + [ + "▁t", + "s" + ], + [ + "▁", + "ts" + ], + [ + "at", + "ern" + ], + [ + "ate", + "rn" + ], + [ + "ater", + "n" + ], + [ + "a", + "tern" + ], + [ + "▁cho", + "osing" + ], + [ + "▁bur", + "ned" + ], + [ + "▁burn", + "ed" + ], + [ + "br", + "id" + ], + [ + "b", + "rid" + ], + [ + "ro", + "oms" + ], + [ + "room", + "s" + ], + [ + "öt", + "t" + ], + [ + "ö", + "tt" + ], + [ + "K", + "ERN" + ], + [ + "▁d", + "ish" + ], + [ + "▁dis", + "h" + ], + [ + "▁di", + "sh" + ], + [ + "S", + "a" + ], + [ + "De", + "tail" + ], + [ + "Det", + "ail" + ], + [ + "▁H", + "ind" + ], + [ + "▁Hi", + "nd" + ], + [ + "▁D", + "ans" + ], + [ + "▁Dan", + "s" + ], + [ + "▁Da", + "ns" + ], + [ + "i", + "ę" + ], + [ + "▁J", + "ahren" + ], + [ + "▁Jah", + "ren" + ], + [ + "▁Jahr", + "en" + ], + [ + "▁Jahre", + "n" + ], + [ + "▁Ja", + "hren" + ], + [ + "ext", + "ension" + ], + [ + "al", + "las" + ], + [ + "all", + "as" + ], + [ + "alla", + "s" + ], + [ + "▁B", + "illy" + ], + [ + "▁Bill", + "y" + ], + [ + "▁Bil", + "ly" + ], + [ + "us", + "ammen" + ], + [ + "it", + "ud" + ], + [ + "itu", + "d" + ], + [ + "ge", + "on" + ], + [ + "geo", + "n" + ], + [ + "Te", + "mp" + ], + [ + "T", + "emp" + ], + [ + "Le", + "g" + ], + [ + "L", + "eg" + ], + [ + "itt", + "el" + ], + [ + "itte", + "l" + ], + [ + "add", + "le" + ], + [ + "▁mus", + "cle" + ], + [ + "▁sc", + "ared" + ], + [ + "▁scar", + "ed" + ], + [ + "ss", + "on" + ], + [ + "s", + "son" + ], + [ + "▁de", + "note" + ], + [ + "▁den", + "ote" + ], + [ + "ie", + "urs" + ], + [ + "ieu", + "rs" + ], + [ + "ieur", + "s" + ], + [ + "i", + "eurs" + ], + [ + "▁o", + "range" + ], + [ + "▁or", + "ange" + ], + [ + "▁h", + "ub" + ], + [ + "▁", + "hub" + ], + [ + "▁re", + "b" + ], + [ + "▁r", + "eb" + ], + [ + "▁", + "reb" + ], + [ + "ed", + "i" + ], + [ + "e", + "di" + ], + [ + "▁vo", + "ices" + ], + [ + "▁voice", + "s" + ], + [ + "F", + "older" + ], + [ + "▁s", + "uspend" + ], + [ + "▁sus", + "pend" + ], + [ + "▁susp", + "end" + ], + [ + "▁", + "suspend" + ], + [ + "▁He", + "art" + ], + [ + "▁sc", + "rap" + ], + [ + "▁scr", + "ap" + ], + [ + "▁a", + "ggreg" + ], + [ + "▁ag", + "greg" + ], + [ + "▁Gu", + "ide" + ], + [ + "trans", + "action" + ], + [ + "▁r", + "iding" + ], + [ + "▁ri", + "ding" + ], + [ + "▁rid", + "ing" + ], + [ + "▁v", + "á" + ], + [ + "▁", + "vá" + ], + [ + "▁b", + "reed" + ], + [ + "▁br", + "eed" + ], + [ + "▁bre", + "ed" + ], + [ + "▁bree", + "d" + ], + [ + "▁con", + "cert" + ], + [ + "▁conc", + "ert" + ], + [ + "▁conce", + "rt" + ], + [ + "▁concer", + "t" + ], + [ + "appro", + "x" + ], + [ + "▁ch", + "ances" + ], + [ + "▁chance", + "s" + ], + [ + "To", + "k" + ], + [ + "T", + "ok" + ], + [ + "E", + "q" + ], + [ + "par", + "ts" + ], + [ + "part", + "s" + ], + [ + "p", + "arts" + ], + [ + "▁sch", + "olar" + ], + [ + "▁schol", + "ar" + ], + [ + "of", + "fs" + ], + [ + "off", + "s" + ], + [ + "fl", + "ush" + ], + [ + "flu", + "sh" + ], + [ + "!", + "”" + ], + [ + "▁lo", + "gin" + ], + [ + "▁log", + "in" + ], + [ + "▁", + "login" + ], + [ 
+ "▁so", + "ort" + ], + [ + "▁M", + "and" + ], + [ + "▁Man", + "d" + ], + [ + "▁Ma", + "nd" + ], + [ + "▁function", + "al" + ], + [ + "▁B", + "ou" + ], + [ + "▁Bo", + "u" + ], + [ + "▁subject", + "s" + ], + [ + "my", + "s" + ], + [ + "m", + "ys" + ], + [ + "▁extra", + "ord" + ], + [ + "▁Build", + "ing" + ], + [ + "ik", + "t" + ], + [ + "i", + "kt" + ], + [ + "B", + "ad" + ], + [ + "ia", + "mi" + ], + [ + "iam", + "i" + ], + [ + "i", + "ami" + ], + [ + "Dr", + "iver" + ], + [ + "D", + "river" + ], + [ + "êt", + "e" + ], + [ + "ê", + "te" + ], + [ + "▁k", + "v" + ], + [ + "▁", + "kv" + ], + [ + "▁t", + "imer" + ], + [ + "▁time", + "r" + ], + [ + "▁tim", + "er" + ], + [ + "▁ti", + "mer" + ], + [ + "▁", + "timer" + ], + [ + "ition", + "ally" + ], + [ + "itional", + "ly" + ], + [ + "▁a", + "thlet" + ], + [ + "▁ath", + "let" + ], + [ + "▁\"", + ");" + ], + [ + "▁\")", + ";" + ], + [ + "▁", + "\");" + ], + [ + "w", + "y" + ], + [ + "CF", + "G" + ], + [ + "▁he", + "aven" + ], + [ + "▁heav", + "en" + ], + [ + "о", + "в" + ], + [ + "▁exper", + "imental" + ], + [ + "▁experiment", + "al" + ], + [ + "▁b", + "ounds" + ], + [ + "▁bound", + "s" + ], + [ + "▁", + "bounds" + ], + [ + "IC", + "K" + ], + [ + "I", + "CK" + ], + [ + "▁ex", + "cit" + ], + [ + "▁exc", + "it" + ], + [ + "▁qu", + "it" + ], + [ + "▁qui", + "t" + ], + [ + "▁q", + "uit" + ], + [ + "▁univers", + "al" + ], + [ + "д", + "ь" + ], + [ + "▁S", + "P" + ], + [ + "▁", + "SP" + ], + [ + "▁st", + "ub" + ], + [ + "▁", + "stub" + ], + [ + "▁k", + "le" + ], + [ + "▁kl", + "e" + ], + [ + "▁", + "kle" + ], + [ + "▁B", + "art" + ], + [ + "▁Bar", + "t" + ], + [ + "▁Ba", + "rt" + ], + [ + "▁\"", + "@" + ], + [ + "pe", + "l" + ], + [ + "p", + "el" + ], + [ + "▁(", + "!(" + ], + [ + "▁(!", + "(" + ], + [ + "▁se", + "lector" + ], + [ + "▁select", + "or" + ], + [ + "▁sel", + "ector" + ], + [ + "▁sele", + "ctor" + ], + [ + "▁", + "selector" + ], + [ + "E", + "B" + ], + [ + "▁c", + "oc" + ], + [ + "▁co", + "c" + ], + [ + "et", + "ed" + ], + [ + "ete", + "d" + ], + [ + "e", + "ted" + ], + [ + "ют", + "ь" + ], + [ + "ю", + "ть" + ], + [ + "▁poss", + "ess" + ], + [ + "▁R", + "ick" + ], + [ + "▁Ric", + "k" + ], + [ + "▁unus", + "ual" + ], + [ + "ter", + "min" + ], + [ + "term", + "in" + ], + [ + "▁b", + "ags" + ], + [ + "▁bag", + "s" + ], + [ + "▁ba", + "gs" + ], + [ + "▁lo", + "ading" + ], + [ + "▁load", + "ing" + ], + [ + "▁", + "loading" + ], + [ + "▁t", + "f" + ], + [ + "▁", + "tf" + ], + [ + "▁)", + "\r" + ], + [ + "▁", + ")\r" + ], + [ + "pro", + "vider" + ], + [ + "prov", + "ider" + ], + [ + "plet", + "ion" + ], + [ + "▁c", + "ursor" + ], + [ + "▁cur", + "sor" + ], + [ + "▁", + "cursor" + ], + [ + "▁pa", + "used" + ], + [ + "▁paus", + "ed" + ], + [ + "▁pause", + "d" + ], + [ + "и", + "м" + ], + [ + "▁coun", + "sel" + ], + [ + "]", + "<" + ], + [ + "ze", + "ch" + ], + [ + "zec", + "h" + ], + [ + "z", + "ech" + ], + [ + "▁t", + "ie" + ], + [ + "▁ti", + "e" + ], + [ + "▁M", + "oon" + ], + [ + "▁Mo", + "on" + ], + [ + "▁ar", + "med" + ], + [ + "▁arm", + "ed" + ], + [ + "▁", + "armed" + ], + [ + "▁ob", + "serve" + ], + [ + "▁observ", + "e" + ], + [ + "▁obs", + "erve" + ], + [ + "▁per", + "met" + ], + [ + "▁perm", + "et" + ], + [ + "▁J", + "ob" + ], + [ + "▁Jo", + "b" + ], + [ + "▁", + "Job" + ], + [ + "fö", + "r" + ], + [ + "f", + "ör" + ], + [ + "arg", + "ument" + ], + [ + "▁egg", + "s" + ], + [ + "▁eg", + "gs" + ], + [ + "ás", + "t" + ], + [ + "á", + "st" + ], + [ + "▁incred", + "ibly" + ], + [ + "wer", + "ken" + ], + [ + "werk", + "en" + ], + [ + "iz", + 
"ard" + ], + [ + "izar", + "d" + ], + [ + "iza", + "rd" + ], + [ + "▁p", + "ainted" + ], + [ + "▁pain", + "ted" + ], + [ + "▁pa", + "inted" + ], + [ + "▁paint", + "ed" + ], + [ + "▁Viet", + "nam" + ], + [ + "▁vi", + "olent" + ], + [ + "▁viol", + "ent" + ], + [ + "Es", + "t" + ], + [ + "E", + "st" + ], + [ + "ier", + "ra" + ], + [ + "i", + "erra" + ], + [ + "re", + "ader" + ], + [ + "read", + "er" + ], + [ + "rea", + "der" + ], + [ + "we", + "ise" + ], + [ + "wei", + "se" + ], + [ + "▁J", + "osh" + ], + [ + "▁Jo", + "sh" + ], + [ + "▁Jos", + "h" + ], + [ + "▁H", + "im" + ], + [ + "▁Hi", + "m" + ], + [ + "as", + "hes" + ], + [ + "ash", + "es" + ], + [ + "or", + "igin" + ], + [ + "orig", + "in" + ], + [ + "ori", + "gin" + ], + [ + "▁sp", + "ir" + ], + [ + "▁", + "spir" + ], + [ + "▁T", + "ree" + ], + [ + "▁Tr", + "ee" + ], + [ + "▁Tre", + "e" + ], + [ + "▁", + "Tree" + ], + [ + "▁n", + "iet" + ], + [ + "▁nie", + "t" + ], + [ + "▁ni", + "et" + ], + [ + "WI", + "N" + ], + [ + "W", + "IN" + ], + [ + "mar", + "gin" + ], + [ + "m", + "argin" + ], + [ + "▁inv", + "olves" + ], + [ + "▁invol", + "ves" + ], + [ + "▁involve", + "s" + ], + [ + "▁organ", + "is" + ], + [ + "▁N", + "acional" + ], + [ + "bar", + "a" + ], + [ + "ba", + "ra" + ], + [ + "b", + "ara" + ], + [ + "▁de", + "puis" + ], + [ + "▁dep", + "uis" + ], + [ + "pi", + "o" + ], + [ + "p", + "io" + ], + [ + "fe", + "atures" + ], + [ + "feature", + "s" + ], + [ + "feat", + "ures" + ], + [ + "st", + "ru" + ], + [ + "str", + "u" + ], + [ + "▁Dis", + "ney" + ], + [ + "▁restaur", + "ants" + ], + [ + "▁restaurant", + "s" + ], + [ + "Mil", + "l" + ], + [ + "M", + "ill" + ], + [ + "))", + "\r" + ], + [ + ")", + ")\r" + ], + [ + "с", + "ла" + ], + [ + "rem", + "ote" + ], + [ + "▁Th", + "ird" + ], + [ + "▁base", + "ball" + ], + [ + "▁al", + "gun" + ], + [ + "▁alg", + "un" + ], + [ + "]", + "$" + ], + [ + "▁em", + "ployed" + ], + [ + "▁employ", + "ed" + ], + [ + "po", + "t" + ], + [ + "p", + "ot" + ], + [ + "▁Un", + "ityEngine" + ], + [ + "▁", + "UnityEngine" + ], + [ + "▁integr", + "ation" + ], + [ + "▁risk", + "s" + ], + [ + "▁ris", + "ks" + ], + [ + "▁st", + "ro" + ], + [ + "▁str", + "o" + ], + [ + "▁ag", + "osto" + ], + [ + "▁ago", + "sto" + ], + [ + "incl", + "uding" + ], + [ + "▁M", + "ind" + ], + [ + "▁Min", + "d" + ], + [ + "▁Mi", + "nd" + ], + [ + "▁st", + "roke" + ], + [ + "▁str", + "oke" + ], + [ + "▁stro", + "ke" + ], + [ + "▁", + "stroke" + ], + [ + "▁de", + "als" + ], + [ + "▁deal", + "s" + ], + [ + "aj", + "ax" + ], + [ + "aja", + "x" + ], + [ + "a", + "jax" + ], + [ + "ё", + "т" + ], + [ + "▁\\", + "|" + ], + [ + "▁", + "\\|" + ], + [ + "ta", + "r" + ], + [ + "t", + "ar" + ], + [ + "adelph", + "ia" + ], + [ + "▁s", + "ab" + ], + [ + "▁sa", + "b" + ], + [ + "pu", + "r" + ], + [ + "p", + "ur" + ], + [ + "▁sc", + "rew" + ], + [ + "▁scr", + "ew" + ], + [ + "▁in", + "ev" + ], + [ + "▁\\", + ";" + ], + [ + "▁Don", + "ald" + ], + [ + "▁", + "Donald" + ], + [ + "ö", + "d" + ], + [ + "cc", + "a" + ], + [ + "c", + "ca" + ], + [ + "es", + "is" + ], + [ + "esi", + "s" + ], + [ + "e", + "sis" + ], + [ + "▁separ", + "ated" + ], + [ + "▁separate", + "d" + ], + [ + "DB", + "G" + ], + [ + "D", + "BG" + ], + [ + "ag", + "ent" + ], + [ + "age", + "nt" + ], + [ + "agen", + "t" + ], + [ + "a", + "gent" + ], + [ + "▁p", + "acked" + ], + [ + "▁pack", + "ed" + ], + [ + "▁pac", + "ked" + ], + [ + "▁", + "packed" + ], + [ + "н", + "ня" + ], + [ + "in", + "tern" + ], + [ + "int", + "ern" + ], + [ + "inter", + "n" + ], + [ + "inte", + "rn" + ], + [ + "▁M", + "onte" 
+ ], + [ + "▁Mon", + "te" + ], + [ + "▁Mont", + "e" + ], + [ + "▁Mo", + "nte" + ], + [ + "▁prov", + "ince" + ], + [ + "▁provinc", + "e" + ], + [ + "▁provin", + "ce" + ], + [ + "▁exp", + "anded" + ], + [ + "▁expand", + "ed" + ], + [ + "▁appro", + "ached" + ], + [ + "▁approach", + "ed" + ], + [ + "▁E", + "p" + ], + [ + "CL", + "K" + ], + [ + "▁o", + "re" + ], + [ + "▁or", + "e" + ], + [ + "▁", + "ore" + ], + [ + "B", + "atch" + ], + [ + "▁impress", + "ive" + ], + [ + "R", + "M" + ], + [ + "▁L", + "ocation" + ], + [ + "▁Loc", + "ation" + ], + [ + "▁", + "Location" + ], + [ + "▁sh", + "ame" + ], + [ + "▁sha", + "me" + ], + [ + "wrap", + "per" + ], + [ + "w", + "rapper" + ], + [ + "un", + "wrap" + ], + [ + "pe", + "er" + ], + [ + "Bit", + "s" + ], + [ + "Bi", + "ts" + ], + [ + "B", + "its" + ], + [ + "▁S", + "N" + ], + [ + "▁", + "SN" + ], + [ + "sc", + "ar" + ], + [ + "s", + "car" + ], + [ + "Com", + "e" + ], + [ + "Co", + "me" + ], + [ + "C", + "ome" + ], + [ + "▁coun", + "cil" + ], + [ + "▁shout", + "ed" + ], + [ + "ma", + "king" + ], + [ + "m", + "aking" + ], + [ + "▁M", + "aur" + ], + [ + "▁Ma", + "ur" + ], + [ + "▁Mau", + "r" + ], + [ + "▁w", + "is" + ], + [ + "LE", + "TE" + ], + [ + "LET", + "E" + ], + [ + "▁f", + "s" + ], + [ + "▁", + "fs" + ], + [ + "▁d", + "z" + ], + [ + "▁", + "dz" + ], + [ + "un", + "que" + ], + [ + "ue", + "go" + ], + [ + "u", + "ego" + ], + [ + "R", + "andom" + ], + [ + "H", + "tml" + ], + [ + "ze", + "m" + ], + [ + "z", + "em" + ], + [ + "▁D", + "utch" + ], + [ + "▁Gold", + "en" + ], + [ + "▁Gol", + "den" + ], + [ + "▁T", + "ar" + ], + [ + "▁Ta", + "r" + ], + [ + "▁H", + "erm" + ], + [ + "▁He", + "rm" + ], + [ + "▁Her", + "m" + ], + [ + "▁str", + "etch" + ], + [ + "▁stret", + "ch" + ], + [ + "var", + "d" + ], + [ + "va", + "rd" + ], + [ + "v", + "ard" + ], + [ + "▁t", + "ries" + ], + [ + "▁tr", + "ies" + ], + [ + "▁tri", + "es" + ], + [ + "W", + "I" + ], + [ + "▁disappe", + "ared" + ], + [ + "▁disappear", + "ed" + ], + [ + "▁cr", + "usher" + ], + [ + "▁crush", + "er" + ], + [ + "▁K", + "an" + ], + [ + "▁Ka", + "n" + ], + [ + "Ma", + "g" + ], + [ + "M", + "ag" + ], + [ + "ø", + "r" + ], + [ + "▁Cam", + "bridge" + ], + [ + "▁Camb", + "ridge" + ], + [ + "▁do", + "po" + ], + [ + "▁dop", + "o" + ], + [ + "at", + "ura" + ], + [ + "atur", + "a" + ], + [ + "atu", + "ra" + ], + [ + "he", + "art" + ], + [ + "▁Sp", + "iel" + ], + [ + "/*", + "*\r" + ], + [ + "/**", + "\r" + ], + [ + "Dir", + "ection" + ], + [ + "Direct", + "ion" + ], + [ + "Di", + "rection" + ], + [ + "D", + "irection" + ], + [ + "at", + "ting" + ], + [ + "att", + "ing" + ], + [ + "atti", + "ng" + ], + [ + "wi", + "g" + ], + [ + "w", + "ig" + ], + [ + "▁c", + "odes" + ], + [ + "▁co", + "des" + ], + [ + "▁code", + "s" + ], + [ + "▁cod", + "es" + ], + [ + "▁", + "codes" + ], + [ + "▁pow", + "der" + ], + [ + "al", + "ert" + ], + [ + "ale", + "rt" + ], + [ + "aler", + "t" + ], + [ + "sem", + "bl" + ], + [ + "semb", + "l" + ], + [ + "▁y", + "e" + ], + [ + "▁", + "ye" + ], + [ + "St", + "ar" + ], + [ + "S", + "tar" + ], + [ + "▁ro", + "ots" + ], + [ + "▁root", + "s" + ], + [ + "▁H", + "oll" + ], + [ + "▁Hol", + "l" + ], + [ + "▁Ho", + "ll" + ], + [ + "Re", + "le" + ], + [ + "Rel", + "e" + ], + [ + "R", + "ele" + ], + [ + "▁const", + "itu" + ], + [ + "n", + "c" + ], + [ + "“", + "." 
+ ], + [ + "re", + "ference" + ], + [ + "refer", + "ence" + ], + [ + "if", + "icial" + ], + [ + "ific", + "ial" + ], + [ + "ifi", + "cial" + ], + [ + "clos", + "ure" + ], + [ + "▁fig", + "ured" + ], + [ + "▁figure", + "d" + ], + [ + "▁assum", + "ption" + ], + [ + "getElement", + "ById" + ], + [ + "▁A", + "G" + ], + [ + "▁", + "AG" + ], + [ + "os", + "es" + ], + [ + "ose", + "s" + ], + [ + "o", + "ses" + ], + [ + "▁_", + "\"" + ], + [ + "ep", + "per" + ], + [ + "ob", + "re" + ], + [ + "o", + "bre" + ], + [ + "en", + "umerate" + ], + [ + "о", + "графи" + ], + [ + "▁less", + "ons" + ], + [ + "▁lesson", + "s" + ], + [ + "▁qual", + "ified" + ], + [ + "Per", + "son" + ], + [ + "Pers", + "on" + ], + [ + "P", + "erson" + ], + [ + "an", + "se" + ], + [ + "ans", + "e" + ], + [ + "▁M", + "ort" + ], + [ + "▁Mor", + "t" + ], + [ + "▁Mo", + "rt" + ], + [ + "s", + "ylvania" + ], + [ + "▁c", + "ré" + ], + [ + "▁cr", + "é" + ], + [ + "Bind", + "ing" + ], + [ + "Bin", + "ding" + ], + [ + "B", + "inding" + ], + [ + "і", + "с" + ], + [ + "▁V", + "ari" + ], + [ + "▁Var", + "i" + ], + [ + "▁Va", + "ri" + ], + [ + "▁", + "Vari" + ], + [ + "▁re", + "minded" + ], + [ + "▁remind", + "ed" + ], + [ + "▁members", + "hip" + ], + [ + "▁member", + "ship" + ], + [ + "ip", + "er" + ], + [ + "ipe", + "r" + ], + [ + "i", + "per" + ], + [ + "zt", + "e" + ], + [ + "z", + "te" + ], + [ + "▁c", + "ref" + ], + [ + "▁cre", + "f" + ], + [ + "▁cr", + "ef" + ], + [ + "▁", + "cref" + ], + [ + "▁P", + "A" + ], + [ + "▁", + "PA" + ], + [ + "plaat", + "st" + ], + [ + "▁Env", + "ironment" + ], + [ + "▁", + "Environment" + ], + [ + "bo", + "y" + ], + [ + "b", + "oy" + ], + [ + "▁ph", + "rase" + ], + [ + "▁phr", + "ase" + ], + [ + "▁", + "phrase" + ], + [ + "riv", + "ial" + ], + [ + "ra", + "g" + ], + [ + "r", + "ag" + ], + [ + "во", + "ди" + ], + [ + "вод", + "и" + ], + [ + "▁p", + "se" + ], + [ + "▁ps", + "e" + ], + [ + "▁", + "pse" + ], + [ + "▁tour", + "nament" + ], + [ + ")}", + "," + ], + [ + ")", + "}," + ], + [ + "▁S", + "ound" + ], + [ + "▁So", + "und" + ], + [ + "▁Sou", + "nd" + ], + [ + "▁", + "Sound" + ], + [ + "▁V", + "el" + ], + [ + "▁Ve", + "l" + ], + [ + "▁", + "Vel" + ], + [ + "▁B", + "erg" + ], + [ + "▁Be", + "rg" + ], + [ + "▁Ber", + "g" + ], + [ + "el", + "son" + ], + [ + "els", + "on" + ], + [ + "▁ref", + "uge" + ], + [ + "▁else", + "where" + ], + [ + "qu", + "ality" + ], + [ + "qual", + "ity" + ], + [ + "▁abandon", + "ed" + ], + [ + "▁F", + "lo" + ], + [ + "▁Fl", + "o" + ], + [ + "ib", + "il" + ], + [ + "i", + "bil" + ], + [ + "UA", + "L" + ], + [ + "U", + "AL" + ], + [ + "▁Pl", + "atz" + ], + [ + "▁d", + "elta" + ], + [ + "▁del", + "ta" + ], + [ + "▁", + "delta" + ], + [ + "▁B", + "uy" + ], + [ + "▁Bu", + "y" + ], + [ + "ri", + "ère" + ], + [ + "r", + "ière" + ], + [ + "▁fl", + "our" + ], + [ + "▁flo", + "ur" + ], + [ + "▁laugh", + "ing" + ], + [ + "▁laug", + "hing" + ], + [ + "▁Look", + "ing" + ], + [ + "▁Lo", + "oking" + ], + [ + "Ag", + "ent" + ], + [ + "A", + "gent" + ], + [ + "▁w", + "x" + ], + [ + "▁", + "wx" + ], + [ + "▁W", + "ales" + ], + [ + "▁Wal", + "es" + ], + [ + "▁Wa", + "les" + ], + [ + "C", + "tx" + ], + [ + "▁c", + "ake" + ], + [ + "▁ca", + "ke" + ], + [ + "▁c", + "rate" + ], + [ + "▁cr", + "ate" + ], + [ + "▁", + "crate" + ], + [ + "▁к", + "ла" + ], + [ + "▁", + "кла" + ], + [ + "an", + "ga" + ], + [ + "ang", + "a" + ], + [ + "Z", + "ero" + ], + [ + "▁amount", + "s" + ], + [ + "Tr", + "a" + ], + [ + "T", + "ra" + ], + [ + "om", + "etric" + ], + [ + "omet", + "ric" + ], + [ + "o", + "metric" + ], + [ 
+ "▁con", + "straints" + ], + [ + "▁constr", + "aints" + ], + [ + "▁constraint", + "s" + ], + [ + "▁tem", + "ple" + ], + [ + "▁templ", + "e" + ], + [ + "▁temp", + "le" + ], + [ + "▁install", + "ation" + ], + [ + "st", + "roke" + ], + [ + "str", + "oke" + ], + [ + "▁N", + "eder" + ], + [ + "▁Ne", + "der" + ], + [ + "▁Ned", + "er" + ], + [ + "ț", + "i" + ], + [ + "▁I", + "bid" + ], + [ + "▁o", + "bs" + ], + [ + "▁ob", + "s" + ], + [ + "▁", + "obs" + ], + [ + "ent", + "ries" + ], + [ + "entr", + "ies" + ], + [ + "▁j", + "usqu" + ], + [ + "OR", + "M" + ], + [ + "O", + "RM" + ], + [ + "▁S", + "ky" + ], + [ + "▁Sk", + "y" + ], + [ + "ik", + "es" + ], + [ + "ike", + "s" + ], + [ + "i", + "kes" + ], + [ + "na", + "k" + ], + [ + "n", + "ak" + ], + [ + "▁m", + "odes" + ], + [ + "▁mod", + "es" + ], + [ + "▁mo", + "des" + ], + [ + "▁mode", + "s" + ], + [ + "▁Hit", + "ler" + ], + [ + "▁b", + "elt" + ], + [ + "▁be", + "lt" + ], + [ + "▁bel", + "t" + ], + [ + "▁point", + "ing" + ], + [ + "▁B", + "an" + ], + [ + "▁Ba", + "n" + ], + [ + "ign", + "ore" + ], + [ + "▁per", + "su" + ], + [ + "▁pers", + "u" + ], + [ + "▁Bes", + "ides" + ], + [ + "yn", + "om" + ], + [ + "y", + "nom" + ], + [ + "▁leg", + "is" + ], + [ + "▁C", + "PU" + ], + [ + "▁CP", + "U" + ], + [ + "▁", + "CPU" + ], + [ + "an", + "ded" + ], + [ + "and", + "ed" + ], + [ + "ande", + "d" + ], + [ + "ui", + "s" + ], + [ + "u", + "is" + ], + [ + "bs", + "ite" + ], + [ + "b", + "site" + ], + [ + "▁E", + "uro" + ], + [ + "▁Eu", + "ro" + ], + [ + "▁ut", + "ter" + ], + [ + "▁", + "utter" + ], + [ + "e", + "clipse" + ], + [ + "▁ir", + "re" + ], + [ + "▁irr", + "e" + ], + [ + "▁D", + "ocument" + ], + [ + "▁Doc", + "ument" + ], + [ + "▁", + "Document" + ], + [ + "▁Mean", + "while" + ], + [ + "▁famil", + "ie" + ], + [ + "ver", + "ify" + ], + [ + "▁J", + "ason" + ], + [ + "▁Ja", + "son" + ], + [ + "▁O", + "rt" + ], + [ + "▁Or", + "t" + ], + [ + "▁ci", + "udad" + ], + [ + "▁techn", + "ologies" + ], + [ + "▁ча", + "сти" + ], + [ + "▁част", + "и" + ], + [ + "▁час", + "ти" + ], + [ + "ni", + "ca" + ], + [ + "nic", + "a" + ], + [ + "n", + "ica" + ], + [ + "can", + "cel" + ], + [ + "c", + "ancel" + ], + [ + "V", + "irtual" + ], + [ + "▁ev", + "ident" + ], + [ + "am", + "an" + ], + [ + "ama", + "n" + ], + [ + "a", + "man" + ], + [ + "▁Sup", + "reme" + ], + [ + "at", + "oes" + ], + [ + "ato", + "es" + ], + [ + "▁ste", + "ady" + ], + [ + "▁stead", + "y" + ], + [ + "▁month", + "ly" + ], + [ + "▁SO", + "FTWARE" + ], + [ + "Di", + "e" + ], + [ + "D", + "ie" + ], + [ + "▁app", + "lying" + ], + [ + "▁apply", + "ing" + ], + [ + "▁appl", + "ying" + ], + [ + "Di", + "g" + ], + [ + "D", + "ig" + ], + [ + "vi", + "er" + ], + [ + "v", + "ier" + ], + [ + "▁го", + "ро" + ], + [ + "▁W", + "H" + ], + [ + "▁", + "WH" + ], + [ + "▁min", + "ds" + ], + [ + "▁mind", + "s" + ], + [ + "▁k", + "am" + ], + [ + "▁ka", + "m" + ], + [ + "▁expert", + "ise" + ], + [ + "▁not", + "ification" + ], + [ + "▁", + "notification" + ], + [ + ".", + "-" + ], + [ + "▁del", + "iber" + ], + [ + "▁H", + "E" + ], + [ + "▁", + "HE" + ], + [ + "▁res", + "ist" + ], + [ + "ou", + "tes" + ], + [ + "out", + "es" + ], + [ + "oute", + "s" + ], + [ + "o", + "utes" + ], + [ + "▁How", + "ard" + ], + [ + "▁Ho", + "ward" + ], + [ + "spec", + "ial" + ], + [ + "spe", + "cial" + ], + [ + "▁p", + "resentation" + ], + [ + "▁present", + "ation" + ], + [ + "▁You", + "Tube" + ], + [ + "mi", + "r" + ], + [ + "m", + "ir" + ], + [ + "▁r", + "ust" + ], + [ + "▁ru", + "st" + ], + [ + "▁rus", + "t" + ], + [ + "▁", + "rust" + ], + [ + 
"▁n", + "ations" + ], + [ + "▁nat", + "ions" + ], + [ + "▁nation", + "s" + ], + [ + "▁G", + "ets" + ], + [ + "▁Ge", + "ts" + ], + [ + "▁Get", + "s" + ], + [ + "▁respon", + "ses" + ], + [ + "▁response", + "s" + ], + [ + "▁respons", + "es" + ], + [ + "ar", + "ded" + ], + [ + "ard", + "ed" + ], + [ + "arde", + "d" + ], + [ + "im", + "mer" + ], + [ + "imm", + "er" + ], + [ + "▁reve", + "al" + ], + [ + "▁M", + "eg" + ], + [ + "▁Me", + "g" + ], + [ + "▁tod", + "os" + ], + [ + "▁todo", + "s" + ], + [ + "▁a", + "de" + ], + [ + "▁ad", + "e" + ], + [ + "▁", + "ade" + ], + [ + "ateg", + "ories" + ], + [ + "ategor", + "ies" + ], + [ + "▁pay", + "ments" + ], + [ + "▁payment", + "s" + ], + [ + "ô", + "t" + ], + [ + "En", + "umer" + ], + [ + "Enum", + "er" + ], + [ + "E", + "numer" + ], + [ + "▁platform", + "s" + ], + [ + "▁plat", + "forms" + ], + [ + "▁life", + "time" + ], + [ + "▁lif", + "etime" + ], + [ + "Com", + "plete" + ], + [ + "Comp", + "lete" + ], + [ + "Qu", + "est" + ], + [ + "Que", + "st" + ], + [ + "Q", + "uest" + ], + [ + "en", + "ders" + ], + [ + "end", + "ers" + ], + [ + "ender", + "s" + ], + [ + "ende", + "rs" + ], + [ + "▁c", + "um" + ], + [ + "▁cu", + "m" + ], + [ + "pl", + "er" + ], + [ + "ple", + "r" + ], + [ + "p", + "ler" + ], + [ + "▁app", + "l" + ], + [ + "▁ap", + "pl" + ], + [ + "äh", + "rend" + ], + [ + "ähr", + "end" + ], + [ + "з", + "ь" + ], + [ + "en", + "ez" + ], + [ + "ene", + "z" + ], + [ + "e", + "nez" + ], + [ + "over", + "ty" + ], + [ + "yn", + "chron" + ], + [ + "▁arg", + "ued" + ], + [ + "▁argue", + "d" + ], + [ + "▁K", + "ath" + ], + [ + "▁Kat", + "h" + ], + [ + "▁Ka", + "th" + ], + [ + "▁s", + "ynchron" + ], + [ + "▁syn", + "chron" + ], + [ + "▁B", + "uilder" + ], + [ + "▁Build", + "er" + ], + [ + "▁", + "Builder" + ], + [ + "B", + "order" + ], + [ + "Pl", + "an" + ], + [ + "P", + "lan" + ], + [ + "ri", + "eb" + ], + [ + "rie", + "b" + ], + [ + "r", + "ieb" + ], + [ + "n", + "m" + ], + [ + "FOR", + "MAT" + ], + [ + "FORM", + "AT" + ], + [ + "us", + "k" + ], + [ + "u", + "sk" + ], + [ + "▁j", + "umped" + ], + [ + "▁jump", + "ed" + ], + [ + "ch", + "arg" + ], + [ + "char", + "g" + ], + [ + "cha", + "rg" + ], + [ + "▁cont", + "ribute" + ], + [ + "▁contribut", + "e" + ], + [ + "Me", + "sh" + ], + [ + "M", + "esh" + ], + [ + "Un", + "ivers" + ], + [ + "re", + "ll" + ], + [ + "rel", + "l" + ], + [ + "r", + "ell" + ], + [ + "▁p", + "olar" + ], + [ + "▁pol", + "ar" + ], + [ + "▁po", + "lar" + ], + [ + "▁tr", + "ois" + ], + [ + "▁tro", + "is" + ], + [ + "ic", + "io" + ], + [ + "ici", + "o" + ], + [ + "i", + "cio" + ], + [ + "Group", + "s" + ], + [ + "G", + "roups" + ], + [ + "▁(", + "%" + ], + [ + "▁", + "(%" + ], + [ + "Lo", + "op" + ], + [ + "L", + "oop" + ], + [ + "▁g", + "az" + ], + [ + "▁ga", + "z" + ], + [ + "db", + "g" + ], + [ + "d", + "bg" + ], + [ + "LA", + "Y" + ], + [ + "L", + "AY" + ], + [ + "Jo", + "hn" + ], + [ + "J", + "ohn" + ], + [ + "bl", + "ocks" + ], + [ + "block", + "s" + ], + [ + "blo", + "cks" + ], + [ + "▁l", + "ung" + ], + [ + "▁lu", + "ng" + ], + [ + "▁lun", + "g" + ], + [ + "▁", + "lung" + ], + [ + "▁k", + "ön" + ], + [ + "▁kö", + "n" + ], + [ + "th", + "rough" + ], + [ + "▁fif", + "th" + ], + [ + "lish", + "er" + ], + [ + "l", + "isher" + ], + [ + "▁inv", + "olving" + ], + [ + "▁invol", + "ving" + ], + [ + "▁De", + "ep" + ], + [ + "▁", + "Deep" + ], + [ + "▁обла", + "сти" + ], + [ + "▁s", + "ull" + ], + [ + "▁su", + "ll" + ], + [ + "▁sul", + "l" + ], + [ + "Ex", + "port" + ], + [ + "Exp", + "ort" + ], + [ + "▁K", + "ate" + ], + [ + "▁Kat", + 
"e" + ], + [ + "▁Ka", + "te" + ], + [ + "per", + "iod" + ], + [ + "ch", + "arge" + ], + [ + "char", + "ge" + ], + [ + "charg", + "e" + ], + [ + "G", + "T" + ], + [ + "\">", + "\r" + ], + [ + "\"", + ">\r" + ], + [ + "ти", + "н" + ], + [ + "т", + "ин" + ], + [ + "▁O", + "tt" + ], + [ + "▁Ot", + "t" + ], + [ + "▁inter", + "actions" + ], + [ + "▁interaction", + "s" + ], + [ + "▁interact", + "ions" + ], + [ + "▁Tor", + "onto" + ], + [ + "TR", + "ACE" + ], + [ + "TRA", + "CE" + ], + [ + "▁d", + "ifer" + ], + [ + "▁di", + "fer" + ], + [ + "▁dif", + "er" + ], + [ + "▁lib", + "eral" + ], + [ + "▁liber", + "al" + ], + [ + "▁p", + "article" + ], + [ + "▁part", + "icle" + ], + [ + "▁partic", + "le" + ], + [ + "▁parti", + "cle" + ], + [ + "▁sur", + "ve" + ], + [ + "▁surv", + "e" + ], + [ + "al", + "ous" + ], + [ + "alo", + "us" + ], + [ + "re", + "ason" + ], + [ + "rea", + "son" + ], + [ + "▁de", + "pression" + ], + [ + "▁dep", + "ression" + ], + [ + "▁depress", + "ion" + ], + [ + "а", + "л" + ], + [ + "▁f", + "lower" + ], + [ + "▁fl", + "ower" + ], + [ + "▁flo", + "wer" + ], + [ + "▁flow", + "er" + ], + [ + "▁wa", + "ar" + ], + [ + "▁h", + "ade" + ], + [ + "▁had", + "e" + ], + [ + "▁ha", + "de" + ], + [ + "▁cent", + "uries" + ], + [ + "ut", + "y" + ], + [ + "u", + "ty" + ], + [ + "par", + "ty" + ], + [ + "part", + "y" + ], + [ + "▁appro", + "val" + ], + [ + "gener", + "ate" + ], + [ + "▁B", + "arn" + ], + [ + "▁Bar", + "n" + ], + [ + "▁Ba", + "rn" + ], + [ + "▁m", + "arg" + ], + [ + "▁mar", + "g" + ], + [ + "▁ma", + "rg" + ], + [ + "▁m", + "onde" + ], + [ + "▁mon", + "de" + ], + [ + "▁mo", + "nde" + ], + [ + "▁mond", + "e" + ], + [ + "▁o", + "ok" + ], + [ + "▁", + "ook" + ], + [ + "▁Cl", + "ark" + ], + [ + "▁Clar", + "k" + ], + [ + "▁the", + "oret" + ], + [ + "vious", + "ly" + ], + [ + "vi", + "ously" + ], + [ + "v", + "iously" + ], + [ + "?", + ")" + ], + [ + "▁R", + "ud" + ], + [ + "▁Ru", + "d" + ], + [ + "st", + "mt" + ], + [ + "in", + "ction" + ], + [ + "inct", + "ion" + ], + [ + "▁t", + "un" + ], + [ + "▁tu", + "n" + ], + [ + "▁ro", + "ads" + ], + [ + "▁road", + "s" + ], + [ + "▁rot", + "ation" + ], + [ + "▁", + "rotation" + ], + [ + "pp", + "en" + ], + [ + "ppe", + "n" + ], + [ + "p", + "pen" + ], + [ + "sen", + "sor" + ], + [ + "s", + "ensor" + ], + [ + "▁K", + "ol" + ], + [ + "▁Ko", + "l" + ], + [ + "id", + "elines" + ], + [ + "ide", + "lines" + ], + [ + "idel", + "ines" + ], + [ + "▁", + "є" + ], + [ + "▁com", + "posed" + ], + [ + "▁comp", + "osed" + ], + [ + "▁compos", + "ed" + ], + [ + "▁v", + "irus" + ], + [ + "▁vi", + "rus" + ], + [ + "▁vir", + "us" + ], + [ + "'", + "$" + ], + [ + "S", + "N" + ], + [ + "▁V", + "on" + ], + [ + "▁Vo", + "n" + ], + [ + "mon", + "t" + ], + [ + "mo", + "nt" + ], + [ + "m", + "ont" + ], + [ + "la", + "r" + ], + [ + "l", + "ar" + ], + [ + "▁opin", + "ions" + ], + [ + "▁opinion", + "s" + ], + [ + "uct", + "ion" + ], + [ + "u", + "ction" + ], + [ + "ru", + "pal" + ], + [ + "rup", + "al" + ], + [ + "under", + "line" + ], + [ + "▁hor", + "ror" + ], + [ + "Mus", + "t" + ], + [ + "Mu", + "st" + ], + [ + "M", + "ust" + ], + [ + "ot", + "to" + ], + [ + "ott", + "o" + ], + [ + "o", + "tto" + ], + [ + "Sh", + "ould" + ], + [ + "▁stat", + "ist" + ], + [ + "▁g", + "em" + ], + [ + "▁ge", + "m" + ], + [ + "▁", + "gem" + ], + [ + "▁se", + "cre" + ], + [ + "▁sec", + "re" + ], + [ + "▁st", + "rip" + ], + [ + "▁str", + "ip" + ], + [ + "▁stri", + "p" + ], + [ + "▁", + "strip" + ], + [ + "▁d", + "irt" + ], + [ + "▁di", + "rt" + ], + [ + "▁dir", + "t" + ], + [ + "ama", + "zon" + 
], + [ + "amaz", + "on" + ], + [ + "▁R", + "ound" + ], + [ + "▁Ro", + "und" + ], + [ + "▁Rou", + "nd" + ], + [ + "▁", + "Round" + ], + [ + "▁dis", + "covery" + ], + [ + "▁disc", + "overy" + ], + [ + "▁discover", + "y" + ], + [ + "▁disco", + "very" + ], + [ + "▁G", + "O" + ], + [ + "▁", + "GO" + ], + [ + "▁substant", + "ial" + ], + [ + "ib", + "t" + ], + [ + "i", + "bt" + ], + [ + "▁dem", + "ands" + ], + [ + "▁demand", + "s" + ], + [ + "▁every", + "day" + ], + [ + "▁b", + "esch" + ], + [ + "▁be", + "sch" + ], + [ + "▁bes", + "ch" + ], + [ + "▁B", + "ridge" + ], + [ + "▁Br", + "idge" + ], + [ + "▁H", + "D" + ], + [ + "▁", + "HD" + ], + [ + "▁D", + "ol" + ], + [ + "▁Do", + "l" + ], + [ + "▁t", + "rès" + ], + [ + "▁tr", + "ès" + ], + [ + "an", + "ni" + ], + [ + "ann", + "i" + ], + [ + "ro", + "it" + ], + [ + "()", + ");\r" + ], + [ + "());", + "\r" + ], + [ + "())", + ";\r" + ], + [ + "(", + "));\r" + ], + [ + "fa", + "r" + ], + [ + "f", + "ar" + ], + [ + "tim", + "estamp" + ], + [ + "▁bul", + "k" + ], + [ + "Bl", + "ack" + ], + [ + "▁g", + "an" + ], + [ + "▁ga", + "n" + ], + [ + "▁", + "gan" + ], + [ + "set", + "ting" + ], + [ + "ret", + "val" + ], + [ + "ва", + "не" + ], + [ + "ван", + "е" + ], + [ + "nu", + "ng" + ], + [ + "n", + "ung" + ], + [ + "▁talk", + "s" + ], + [ + "▁tal", + "ks" + ], + [ + "▁scient", + "ists" + ], + [ + "▁scientist", + "s" + ], + [ + "▁v", + "ig" + ], + [ + "▁vi", + "g" + ], + [ + "▁quant", + "ity" + ], + [ + "▁G", + "ard" + ], + [ + "▁Gar", + "d" + ], + [ + "▁Ga", + "rd" + ], + [ + "▁mov", + "ements" + ], + [ + "▁move", + "ments" + ], + [ + "▁movement", + "s" + ], + [ + "äh", + "r" + ], + [ + "ä", + "hr" + ], + [ + "ling", + "s" + ], + [ + "lin", + "gs" + ], + [ + "l", + "ings" + ], + [ + "▁Т", + "е" + ], + [ + "te", + "am" + ], + [ + "ri", + "to" + ], + [ + "rit", + "o" + ], + [ + "r", + "ito" + ], + [ + "▁as", + "sembly" + ], + [ + "▁", + "assembly" + ], + [ + "il", + "st" + ], + [ + "ils", + "t" + ], + [ + "i", + "lst" + ], + [ + "▁happ", + "iness" + ], + [ + "▁le", + "af" + ], + [ + "▁", + "leaf" + ], + [ + "▁ass", + "essment" + ], + [ + "▁assess", + "ment" + ], + [ + "Co", + "ord" + ], + [ + "ir", + "s" + ], + [ + "i", + "rs" + ], + [ + "sa", + "m" + ], + [ + "s", + "am" + ], + [ + "▁att", + "orney" + ], + [ + "▁g", + "eme" + ], + [ + "▁ge", + "me" + ], + [ + "▁gem", + "e" + ], + [ + "▁", + "geme" + ], + [ + "ID", + "E" + ], + [ + "I", + "DE" + ], + [ + "▁V", + "ere" + ], + [ + "▁Ver", + "e" + ], + [ + "▁Ve", + "re" + ], + [ + "▁Anth", + "ony" + ], + [ + "am", + "iento" + ], + [ + "ami", + "ento" + ], + [ + "▁A", + "st" + ], + [ + "▁As", + "t" + ], + [ + "▁cir", + "cul" + ], + [ + "▁circ", + "ul" + ], + [ + "▁Fr", + "ances" + ], + [ + "▁Franc", + "es" + ], + [ + "▁France", + "s" + ], + [ + "▁Fran", + "ces" + ], + [ + "▁p", + "ent" + ], + [ + "▁pe", + "nt" + ], + [ + "▁pen", + "t" + ], + [ + "▁", + "pent" + ], + [ + "▁m", + "ate" + ], + [ + "▁mat", + "e" + ], + [ + "▁ma", + "te" + ], + [ + "▁", + "mate" + ], + [ + "▁Trans", + "port" + ], + [ + "▁", + "Transport" + ], + [ + "ow", + "o" + ], + [ + "o", + "wo" + ], + [ + "ч", + "у" + ], + [ + "is", + "tes" + ], + [ + "ist", + "es" + ], + [ + "iste", + "s" + ], + [ + "TR", + "AN" + ], + [ + "TRA", + "N" + ], + [ + "T", + "RAN" + ], + [ + "IM", + "PORT" + ], + [ + "IMP", + "ORT" + ], + [ + "▁B", + "reak" + ], + [ + "▁Bre", + "ak" + ], + [ + "▁", + "Break" + ], + [ + "▁s", + "ons" + ], + [ + "▁so", + "ns" + ], + [ + "▁son", + "s" + ], + [ + "▁invest", + "ors" + ], + [ + "▁Phil", + "ipp" + ], + [ + "▁Philip", + "p" + 
], + [ + "TH", + "OD" + ], + [ + "▁pan", + "ic" + ], + [ + "▁pa", + "nic" + ], + [ + "▁", + "panic" + ], + [ + "▁:", + ")" + ], + [ + "▁d", + "etection" + ], + [ + "▁det", + "ection" + ], + [ + "▁detect", + "ion" + ], + [ + "▁sim", + "ultane" + ], + [ + "nt", + "e" + ], + [ + "n", + "te" + ], + [ + "▁list", + "ened" + ], + [ + "▁listen", + "ed" + ], + [ + "к", + "ре" + ], + [ + "▁B", + "rig" + ], + [ + "▁Br", + "ig" + ], + [ + "Option", + "al" + ], + [ + "Opt", + "ional" + ], + [ + "▁a", + "bund" + ], + [ + "▁ab", + "und" + ], + [ + "▁c", + "riteria" + ], + [ + "▁crit", + "eria" + ], + [ + "▁c", + "hip" + ], + [ + "▁ch", + "ip" + ], + [ + "▁chi", + "p" + ], + [ + "▁", + "chip" + ], + [ + "▁ок", + "ру" + ], + [ + "▁Con", + "stant" + ], + [ + "▁Const", + "ant" + ], + [ + "▁", + "Constant" + ], + [ + "▁m", + "ining" + ], + [ + "▁min", + "ing" + ], + [ + "▁mi", + "ning" + ], + [ + "▁mini", + "ng" + ], + [ + "та", + "л" + ], + [ + "т", + "ал" + ], + [ + "ma", + "tes" + ], + [ + "mat", + "es" + ], + [ + "mate", + "s" + ], + [ + "m", + "ates" + ], + [ + "▁w", + "orship" + ], + [ + "▁wor", + "ship" + ], + [ + "ro", + "uter" + ], + [ + "rou", + "ter" + ], + [ + "route", + "r" + ], + [ + "r", + "outer" + ], + [ + "C", + "N" + ], + [ + "▁M", + "atch" + ], + [ + "▁Mat", + "ch" + ], + [ + "▁", + "Match" + ], + [ + "▁C", + "ole" + ], + [ + "▁Col", + "e" + ], + [ + "▁Co", + "le" + ], + [ + "▁down", + "t" + ], + [ + "▁dow", + "nt" + ], + [ + "▁h", + "oles" + ], + [ + "▁hol", + "es" + ], + [ + "▁ho", + "les" + ], + [ + "▁hole", + "s" + ], + [ + "▁gr", + "ateful" + ], + [ + "RES", + "ULT" + ], + [ + "▁Europ", + "a" + ], + [ + "▁Euro", + "pa" + ], + [ + "▁con", + "sent" + ], + [ + "▁cons", + "ent" + ], + [ + "▁conse", + "nt" + ], + [ + "l", + "ä" + ], + [ + "op", + "ter" + ], + [ + "opt", + "er" + ], + [ + "▁colle", + "agues" + ], + [ + "or", + "ous" + ], + [ + "oro", + "us" + ], + [ + "o", + "rous" + ], + [ + "▁enem", + "ies" + ], + [ + "ha", + "ng" + ], + [ + "han", + "g" + ], + [ + "h", + "ang" + ], + [ + "act", + "ual" + ], + [ + "Object", + "s" + ], + [ + "▁я", + "к" + ], + [ + "▁fl", + "uid" + ], + [ + "▁flu", + "id" + ], + [ + "fix", + "ed" + ], + [ + "f", + "ixed" + ], + [ + "▁G", + "raph" + ], + [ + "▁Gr", + "aph" + ], + [ + "▁Gra", + "ph" + ], + [ + "▁", + "Graph" + ], + [ + "▁scr", + "atch" + ], + [ + "ce", + "rs" + ], + [ + "cer", + "s" + ], + [ + "c", + "ers" + ], + [ + "ri", + "bu" + ], + [ + "rib", + "u" + ], + [ + "▁valid", + "ation" + ], + [ + "▁", + "validation" + ], + [ + "▁com", + "pletion" + ], + [ + "▁complet", + "ion" + ], + [ + "▁B", + "egin" + ], + [ + "▁Be", + "gin" + ], + [ + "▁Beg", + "in" + ], + [ + "▁", + "Begin" + ], + [ + "end", + "point" + ], + [ + "ri", + "ent" + ], + [ + "rie", + "nt" + ], + [ + "rien", + "t" + ], + [ + "r", + "ient" + ], + [ + "C", + "M" + ], + [ + "▁S", + "ite" + ], + [ + "▁Si", + "te" + ], + [ + "▁", + "Site" + ], + [ + "▁expl", + "ains" + ], + [ + "▁explain", + "s" + ], + [ + "tr", + "es" + ], + [ + "tre", + "s" + ], + [ + "t", + "res" + ], + [ + "▁any", + "body" + ], + [ + "fo", + "reach" + ], + [ + "fore", + "ach" + ], + [ + "for", + "each" + ], + [ + "lo", + "n" + ], + [ + "l", + "on" + ], + [ + "Ch", + "ain" + ], + [ + "▁B", + "uff" + ], + [ + "▁Bu", + "ff" + ], + [ + "▁", + "Buff" + ], + [ + "oc", + "al" + ], + [ + "oca", + "l" + ], + [ + "o", + "cal" + ], + [ + "▁M", + "organ" + ], + [ + "▁Mor", + "gan" + ], + [ + "▁s", + "ang" + ], + [ + "▁sa", + "ng" + ], + [ + "▁san", + "g" + ], + [ + "▁pass", + "es" + ], + [ + "▁pas", + "ses" + ], + [ + "@", + 
"@" + ], + [ + "ij", + "d" + ], + [ + "i", + "jd" + ], + [ + "W", + "ord" + ], + [ + "▁H", + "ung" + ], + [ + "▁Hun", + "g" + ], + [ + "▁Hu", + "ng" + ], + [ + "▁F", + "er" + ], + [ + "▁Fe", + "r" + ], + [ + "▁v", + "ý" + ], + [ + "ba", + "st" + ], + [ + "bas", + "t" + ], + [ + "b", + "ast" + ], + [ + "▁enter", + "tainment" + ], + [ + "▁entertain", + "ment" + ], + [ + "hi", + "n" + ], + [ + "h", + "in" + ], + [ + "▁g", + "rat" + ], + [ + "▁gr", + "at" + ], + [ + "▁gra", + "t" + ], + [ + "▁M", + "ember" + ], + [ + "▁Me", + "mber" + ], + [ + "▁Mem", + "ber" + ], + [ + "▁", + "Member" + ], + [ + "▁M", + "inn" + ], + [ + "▁Min", + "n" + ], + [ + "▁Mi", + "nn" + ], + [ + "▁pr", + "inted" + ], + [ + "▁print", + "ed" + ], + [ + "▁prin", + "ted" + ], + [ + "▁Frank", + "lin" + ], + [ + "▁I", + "mp" + ], + [ + "▁Im", + "p" + ], + [ + "▁", + "Imp" + ], + [ + "M", + "achine" + ], + [ + "column", + "s" + ], + [ + "▁de", + "leted" + ], + [ + "▁del", + "eted" + ], + [ + "▁delete", + "d" + ], + [ + "▁delet", + "ed" + ], + [ + "▁", + "deleted" + ], + [ + "▁manufact", + "uring" + ], + [ + "▁re", + "ly" + ], + [ + "▁r", + "ely" + ], + [ + "▁rel", + "y" + ], + [ + "▁con", + "se" + ], + [ + "▁cons", + "e" + ], + [ + "▁f", + "ishing" + ], + [ + "▁fish", + "ing" + ], + [ + "▁fis", + "hing" + ], + [ + "bl", + "o" + ], + [ + "b", + "lo" + ], + [ + "-", + "$" + ], + [ + "▁.", + "\"" + ], + [ + "▁", + ".\"" + ], + [ + "▁clin", + "ical" + ], + [ + "▁clinic", + "al" + ], + [ + "▁Stud", + "ies" + ], + [ + "▁Б", + "у" + ], + [ + "def", + "inition" + ], + [ + "▁evalu", + "ation" + ], + [ + "▁eval", + "uation" + ], + [ + "▁att", + "acked" + ], + [ + "▁attack", + "ed" + ], + [ + "▁fro", + "zen" + ], + [ + "ze", + "nt" + ], + [ + "zen", + "t" + ], + [ + "z", + "ent" + ], + [ + "▁ú", + "lt" + ], + [ + "▁r", + "ational" + ], + [ + "▁rat", + "ional" + ], + [ + "▁ratio", + "nal" + ], + [ + "ot", + "he" + ], + [ + "oth", + "e" + ], + [ + "o", + "the" + ], + [ + "Can", + "cel" + ], + [ + "C", + "ancel" + ], + [ + "hi", + "story" + ], + [ + "hist", + "ory" + ], + [ + "set", + "Text" + ], + [ + "▁a", + "lc" + ], + [ + "▁al", + "c" + ], + [ + "▁h", + "ydro" + ], + [ + "▁hy", + "dro" + ], + [ + "▁hyd", + "ro" + ], + [ + "▁The", + "atre" + ], + [ + "▁M", + "aterial" + ], + [ + "▁Mat", + "erial" + ], + [ + "▁", + "Material" + ], + [ + "IO", + "Exception" + ], + [ + "****", + "**/" + ], + [ + "******", + "/" + ], + [ + "sp", + "l" + ], + [ + "s", + "pl" + ], + [ + "NO", + "DE" + ], + [ + "att", + "rs" + ], + [ + "attr", + "s" + ], + [ + "▁m", + "ie" + ], + [ + "▁mi", + "e" + ], + [ + "▁off", + "ices" + ], + [ + "▁offic", + "es" + ], + [ + "▁office", + "s" + ], + [ + "r", + "ó" + ], + [ + "▁j", + "am" + ], + [ + "▁ja", + "m" + ], + [ + "▁Id", + "ent" + ], + [ + "▁Ide", + "nt" + ], + [ + "▁", + "Ident" + ], + [ + "v", + "é" + ], + [ + "Set", + "ting" + ], + [ + "▁Sever", + "al" + ], + [ + "▁Sev", + "eral" + ], + [ + "▁dec", + "ay" + ], + [ + "And", + "roid" + ], + [ + "▁S", + "ave" + ], + [ + "▁Sa", + "ve" + ], + [ + "▁Sav", + "e" + ], + [ + "▁", + "Save" + ], + [ + "un", + "ted" + ], + [ + "unt", + "ed" + ], + [ + "unte", + "d" + ], + [ + "▁Mount", + "ain" + ], + [ + "us", + "c" + ], + [ + "u", + "sc" + ], + [ + "▁mar", + "zo" + ], + [ + "▁a", + "sleep" + ], + [ + "▁as", + "leep" + ], + [ + "▁sold", + "ier" + ], + [ + "▁D", + "ouble" + ], + [ + "▁Dou", + "ble" + ], + [ + "▁", + "Double" + ], + [ + "P", + "K" + ], + [ + "▁cont", + "rad" + ], + [ + "▁contr", + "ad" + ], + [ + "▁contra", + "d" + ], + [ + "▁w", + "ins" + ], + [ + "▁win", + 
"s" + ], + [ + "ce", + "iver" + ], + [ + "ceive", + "r" + ], + [ + "▁se", + "asons" + ], + [ + "▁season", + "s" + ], + [ + "▁seas", + "ons" + ], + [ + "▁C", + "hall" + ], + [ + "▁Ch", + "all" + ], + [ + "▁Cha", + "ll" + ], + [ + "▁health", + "care" + ], + [ + "ła", + "d" + ], + [ + "ł", + "ad" + ], + [ + "о", + "т" + ], + [ + "▁F", + "ive" + ], + [ + "▁Fi", + "ve" + ], + [ + "▁H", + "ell" + ], + [ + "▁He", + "ll" + ], + [ + "▁Hel", + "l" + ], + [ + "▁world", + "wide" + ], + [ + "▁'", + "," + ], + [ + "▁", + "'," + ], + [ + "я", + "н" + ], + [ + "ma", + "de" + ], + [ + "mad", + "e" + ], + [ + "m", + "ade" + ], + [ + "▁respon", + "ded" + ], + [ + "▁respond", + "ed" + ], + [ + "▁a", + "y" + ], + [ + "▁", + "ay" + ], + [ + "▁proced", + "ures" + ], + [ + "▁procedure", + "s" + ], + [ + "те", + "ра" + ], + [ + "тер", + "а" + ], + [ + "▁cle", + "ared" + ], + [ + "▁clear", + "ed" + ], + [ + "\"]", + "." + ], + [ + "\"", + "]." + ], + [ + "▁T", + "arget" + ], + [ + "▁Tar", + "get" + ], + [ + "▁", + "Target" + ], + [ + "▁S", + "ide" + ], + [ + "▁Si", + "de" + ], + [ + "▁Sid", + "e" + ], + [ + "▁", + "Side" + ], + [ + "om", + "in" + ], + [ + "omi", + "n" + ], + [ + "o", + "min" + ], + [ + "▁de", + "ploy" + ], + [ + "▁T", + "ell" + ], + [ + "▁Te", + "ll" + ], + [ + "▁Tel", + "l" + ], + [ + "▁", + "Tell" + ], + [ + "▁on", + "going" + ], + [ + "fl", + "oor" + ], + [ + "f", + "loor" + ], + [ + "▁b", + "ones" + ], + [ + "▁bo", + "nes" + ], + [ + "▁bon", + "es" + ], + [ + "▁bone", + "s" + ], + [ + "▁De", + "lete" + ], + [ + "▁Del", + "ete" + ], + [ + "▁", + "Delete" + ], + [ + "▁shru", + "gged" + ], + [ + "O", + "ur" + ], + [ + "De", + "r" + ], + [ + "D", + "er" + ], + [ + "▁init", + "ialize" + ], + [ + "▁initial", + "ize" + ], + [ + "▁", + "initialize" + ], + [ + "▁T", + "ed" + ], + [ + "▁Te", + "d" + ], + [ + "MA", + "GE" + ], + [ + "MAG", + "E" + ], + [ + "M", + "AGE" + ], + [ + "▁h", + "ire" + ], + [ + "▁hi", + "re" + ], + [ + "▁", + "hire" + ], + [ + "▁tr", + "acking" + ], + [ + "▁track", + "ing" + ], + [ + "▁a", + "sh" + ], + [ + "▁as", + "h" + ], + [ + "▁", + "ash" + ], + [ + "▁ce", + "iling" + ], + [ + "ка", + "х" + ], + [ + "et", + "ti" + ], + [ + "ett", + "i" + ], + [ + "e", + "tti" + ], + [ + "▁cour", + "age" + ], + [ + "▁cou", + "rage" + ], + [ + "ensch", + "app" + ], + [ + "ют", + "ся" + ], + [ + "ю", + "тся" + ], + [ + "Mo", + "re" + ], + [ + "M", + "ore" + ], + [ + "▁fol", + "g" + ], + [ + "▁fo", + "lg" + ], + [ + "▁", + "folg" + ], + [ + "▁Gr", + "ace" + ], + [ + "▁Gra", + "ce" + ], + [ + "▁K", + "elly" + ], + [ + "▁Kel", + "ly" + ], + [ + "▁re", + "ven" + ], + [ + "▁r", + "even" + ], + [ + "▁rev", + "en" + ], + [ + "▁reve", + "n" + ], + [ + "▁A", + "li" + ], + [ + "▁Al", + "i" + ], + [ + "▁", + "Ali" + ], + [ + "▁d", + "isp" + ], + [ + "▁dis", + "p" + ], + [ + "▁di", + "sp" + ], + [ + "▁", + "disp" + ], + [ + "▁de", + "feat" + ], + [ + "▁defe", + "at" + ], + [ + "▁cre", + "ature" + ], + [ + "▁creat", + "ure" + ], + [ + "▁Kenn", + "edy" + ], + [ + "▁D", + "iego" + ], + [ + "▁Die", + "go" + ], + [ + "▁Di", + "ego" + ], + [ + "EM", + "P" + ], + [ + "E", + "MP" + ], + [ + "▁s", + "team" + ], + [ + "▁ste", + "am" + ], + [ + "end", + "ance" + ], + [ + "ri", + "g" + ], + [ + "r", + "ig" + ], + [ + "▁ign", + "or" + ], + [ + "▁ig", + "nor" + ], + [ + "em", + "en" + ], + [ + "eme", + "n" + ], + [ + "e", + "men" + ], + [ + "▁G", + "ru" + ], + [ + "▁Gr", + "u" + ], + [ + "▁pro", + "posal" + ], + [ + "▁propos", + "al" + ], + [ + "▁we", + "iter" + ], + [ + "▁weit", + "er" + ], + [ + "▁", + "лі" + ], + [ 
+ "ib", + "les" + ], + [ + "ible", + "s" + ], + [ + "i", + "bles" + ], + [ + "▁consider", + "ation" + ], + [ + "▁belie", + "ves" + ], + [ + "▁believe", + "s" + ], + [ + "▁S", + "oph" + ], + [ + "▁So", + "ph" + ], + [ + "“", + "," + ], + [ + "▁Matt", + "hew" + ], + [ + "▁circ", + "uit" + ], + [ + "▁s", + "inger" + ], + [ + "▁sing", + "er" + ], + [ + "▁sin", + "ger" + ], + [ + "▁S", + "quare" + ], + [ + "ç", + "o" + ], + [ + "Ed", + "ge" + ], + [ + "▁a", + "str" + ], + [ + "▁as", + "tr" + ], + [ + "▁ast", + "r" + ], + [ + "▁", + "astr" + ], + [ + "▁represent", + "ative" + ], + [ + "▁comprehens", + "ive" + ], + [ + "li", + "ga" + ], + [ + "lig", + "a" + ], + [ + "l", + "iga" + ], + [ + "▁m", + "ere" + ], + [ + "▁me", + "re" + ], + [ + "▁mer", + "e" + ], + [ + "tb", + "l" + ], + [ + "t", + "bl" + ], + [ + "▁contin", + "uing" + ], + [ + "▁continu", + "ing" + ], + [ + "ograph", + "er" + ], + [ + "ograp", + "her" + ], + [ + "LE", + "D" + ], + [ + "L", + "ED" + ], + [ + "▁/*", + "**/" + ], + [ + "▁/**", + "*/" + ], + [ + "▁s", + "ear" + ], + [ + "▁se", + "ar" + ], + [ + "▁sea", + "r" + ], + [ + "▁enorm", + "ous" + ], + [ + "iz", + "i" + ], + [ + "i", + "zi" + ], + [ + "Di", + "t" + ], + [ + "D", + "it" + ], + [ + "th", + "ere" + ], + [ + "ther", + "e" + ], + [ + "the", + "re" + ], + [ + "t", + "here" + ], + [ + "і", + "н" + ], + [ + "си", + "те" + ], + [ + "▁gu", + "erra" + ], + [ + "▁end", + "point" + ], + [ + "▁", + "endpoint" + ], + [ + "▁le", + "sson" + ], + [ + "▁les", + "son" + ], + [ + "▁less", + "on" + ], + [ + "zo", + "n" + ], + [ + "z", + "on" + ], + [ + "var", + "iable" + ], + [ + "vari", + "able" + ], + [ + "и", + "с" + ], + [ + "▁research", + "ers" + ], + [ + "▁attempt", + "ed" + ], + [ + "▁e", + "nf" + ], + [ + "▁en", + "f" + ], + [ + "ту", + "ра" + ], + [ + "тур", + "а" + ], + [ + "▁de", + "fin" + ], + [ + "▁def", + "in" + ], + [ + "ве", + "ст" + ], + [ + "▁aw", + "ful" + ], + [ + "▁lo", + "west" + ], + [ + "▁low", + "est" + ], + [ + "ru", + "les" + ], + [ + "rule", + "s" + ], + [ + "r", + "ules" + ], + [ + "▁un", + "like" + ], + [ + "inter", + "val" + ], + [ + "▁produ", + "cing" + ], + [ + "▁K", + "am" + ], + [ + "▁Ka", + "m" + ], + [ + "▁I", + "MP" + ], + [ + "▁IM", + "P" + ], + [ + "▁", + "IMP" + ], + [ + "Gener", + "al" + ], + [ + "Gen", + "eral" + ], + [ + "▁f", + "aire" + ], + [ + "▁fa", + "ire" + ], + [ + "▁fair", + "e" + ], + [ + "▁max", + "im" + ], + [ + "▁ma", + "xim" + ], + [ + "as", + "semb" + ], + [ + "ass", + "emb" + ], + [ + "asse", + "mb" + ], + [ + "assem", + "b" + ], + [ + "ac", + "ent" + ], + [ + "ace", + "nt" + ], + [ + "a", + "cent" + ], + [ + "?", + ">" + ], + [ + "pl", + "ica" + ], + [ + "plic", + "a" + ], + [ + "p", + "lica" + ], + [ + "▁r", + "am" + ], + [ + "▁ra", + "m" + ], + [ + "▁", + "ram" + ], + [ + "ma", + "te" + ], + [ + "mat", + "e" + ], + [ + "m", + "ate" + ], + [ + "ц", + "у" + ], + [ + "m", + "n" + ], + [ + "▁H", + "i" + ], + [ + "▁", + "Hi" + ], + [ + "▁st", + "ages" + ], + [ + "▁stage", + "s" + ], + [ + "▁stag", + "es" + ], + [ + "▁sta", + "ges" + ], + [ + "▁Ed", + "itor" + ], + [ + "▁Edit", + "or" + ], + [ + "▁", + "Editor" + ], + [ + "▁t", + "ang" + ], + [ + "▁tan", + "g" + ], + [ + "▁ta", + "ng" + ], + [ + "R", + "D" + ], + [ + "▁i", + "ch" + ], + [ + "▁ic", + "h" + ], + [ + "▁", + "ich" + ], + [ + "▁depend", + "ent" + ], + [ + "▁dep", + "endent" + ], + [ + "▁", + "dependent" + ], + [ + "li", + "fer" + ], + [ + "life", + "r" + ], + [ + "lif", + "er" + ], + [ + "l", + "ifer" + ], + [ + "as", + "cript" + ], + [ + "asc", + "ript" + ], + [ + 
"a", + "script" + ], + [ + "▁expos", + "ure" + ], + [ + "ре", + "з" + ], + [ + "▁m", + "art" + ], + [ + "▁mar", + "t" + ], + [ + "▁ma", + "rt" + ], + [ + "▁", + "mart" + ], + [ + "▁Bar", + "cel" + ], + [ + "xs", + "pace" + ], + [ + "x", + "space" + ], + [ + "SE", + "SSION" + ], + [ + "▁p", + "rest" + ], + [ + "▁pre", + "st" + ], + [ + "▁pr", + "est" + ], + [ + "▁pres", + "t" + ], + [ + "UR", + "CE" + ], + [ + "-", + "." + ], + [ + "▁се", + "ло" + ], + [ + "ha", + "ve" + ], + [ + "hav", + "e" + ], + [ + "h", + "ave" + ], + [ + "▁observ", + "ation" + ], + [ + "▁obs", + "ervation" + ], + [ + "▁comm", + "ands" + ], + [ + "▁command", + "s" + ], + [ + "▁", + "commands" + ], + [ + "▁e", + "ager" + ], + [ + "▁out", + "door" + ], + [ + "▁DE", + "BUG" + ], + [ + "▁", + "DEBUG" + ], + [ + "▁h", + "r" + ], + [ + "▁", + "hr" + ], + [ + "A", + "X" + ], + [ + "▁p", + "uzz" + ], + [ + "▁pu", + "zz" + ], + [ + "bl", + "ank" + ], + [ + "бу", + "р" + ], + [ + "б", + "ур" + ], + [ + "▁k", + "ennis" + ], + [ + "▁reg", + "arded" + ], + [ + "▁regard", + "ed" + ], + [ + "▁}", + ")," + ], + [ + "▁})", + "," + ], + [ + "▁", + "})," + ], + [ + "vol", + "ume" + ], + [ + "▁про", + "из" + ], + [ + "▁Tr", + "aining" + ], + [ + "▁Tra", + "ining" + ], + [ + "▁Train", + "ing" + ], + [ + "a", + "ñ" + ], + [ + "▁f", + "ois" + ], + [ + "▁foi", + "s" + ], + [ + "▁fo", + "is" + ], + [ + "▁т", + "ри" + ], + [ + "▁", + "три" + ], + [ + "в", + "ня" + ], + [ + "▁opt", + "imal" + ], + [ + "▁optim", + "al" + ], + [ + "▁sub", + "scription" + ], + [ + "▁subs", + "cription" + ], + [ + "▁", + "subscription" + ], + [ + "br", + "idge" + ], + [ + "brid", + "ge" + ], + [ + "b", + "ridge" + ], + [ + "im", + "ental" + ], + [ + "iment", + "al" + ], + [ + "imen", + "tal" + ], + [ + "▁Th", + "ink" + ], + [ + "▁\"", + ";" + ], + [ + "▁", + "\";" + ], + [ + "▁leg", + "isl" + ], + [ + "▁legis", + "l" + ], + [ + "▁H", + "op" + ], + [ + "▁Ho", + "p" + ], + [ + "▁br", + "anches" + ], + [ + "▁branch", + "es" + ], + [ + "▁V", + "eg" + ], + [ + "▁Ve", + "g" + ], + [ + "▁s", + "print" + ], + [ + "▁spr", + "int" + ], + [ + "▁fl", + "ux" + ], + [ + "▁flu", + "x" + ], + [ + "▁Fr", + "eder" + ], + [ + "▁Fre", + "der" + ], + [ + "▁Fred", + "er" + ], + [ + "si", + "s" + ], + [ + "s", + "is" + ], + [ + "not", + "ify" + ], + [ + "▁Ф", + "ран" + ], + [ + "so", + "m" + ], + [ + "s", + "om" + ], + [ + "ny", + "m" + ], + [ + "n", + "ym" + ], + [ + "▁R", + "é" + ], + [ + "le", + "tt" + ], + [ + "let", + "t" + ], + [ + "l", + "ett" + ], + [ + "ing", + "ham" + ], + [ + "▁F", + "arm" + ], + [ + "▁Far", + "m" + ], + [ + "▁Fa", + "rm" + ], + [ + "DO", + "M" + ], + [ + "D", + "OM" + ], + [ + "▁sh", + "ield" + ], + [ + "He", + "re" + ], + [ + "Her", + "e" + ], + [ + "H", + "ere" + ], + [ + "▁T", + "reat" + ], + [ + "▁Tre", + "at" + ], + [ + "▁Lu", + "ke" + ], + [ + "▁un", + "safe" + ], + [ + "an", + "ton" + ], + [ + "ant", + "on" + ], + [ + "anto", + "n" + ], + [ + "▁Im", + "per" + ], + [ + "▁Imp", + "er" + ], + [ + "▁tele", + "phone" + ], + [ + "▁un", + "lock" + ], + [ + "▁", + "unlock" + ], + [ + "Own", + "er" + ], + [ + "col", + "lection" + ], + [ + "coll", + "ection" + ], + [ + "collect", + "ion" + ], + [ + "▁s", + "nd" + ], + [ + "▁sn", + "d" + ], + [ + "▁", + "snd" + ], + [ + "▁su", + "iv" + ], + [ + "▁ent", + "ering" + ], + [ + "▁enter", + "ing" + ], + [ + "ше", + "н" + ], + [ + "ш", + "ен" + ], + [ + "▁L", + "abel" + ], + [ + "▁La", + "bel" + ], + [ + "▁Lab", + "el" + ], + [ + "▁", + "Label" + ], + [ + "se", + "lector" + ], + [ + "sel", + "ector" + ], + [ + "select", 
+ "or" + ], + [ + "▁G", + "ET" + ], + [ + "▁", + "GET" + ], + [ + "▁qu", + "ando" + ], + [ + "▁quand", + "o" + ], + [ + "▁f", + "ed" + ], + [ + "▁fe", + "d" + ], + [ + "▁", + "fed" + ], + [ + "j", + "Query" + ], + [ + "Or", + "igin" + ], + [ + "▁A", + "lan" + ], + [ + "▁Al", + "an" + ], + [ + "math", + "scr" + ], + [ + "▁pregn", + "ant" + ], + [ + "▁preg", + "nant" + ], + [ + "Ex", + "pect" + ], + [ + "Exp", + "ect" + ], + [ + "re", + "sources" + ], + [ + "res", + "ources" + ], + [ + "resource", + "s" + ], + [ + "▁er", + "sten" + ], + [ + "▁erst", + "en" + ], + [ + "▁ers", + "ten" + ], + [ + "▁erste", + "n" + ], + [ + "al", + "ia" + ], + [ + "ali", + "a" + ], + [ + "a", + "lia" + ], + [ + "▁ret", + "ired" + ], + [ + "▁retire", + "d" + ], + [ + "û", + "t" + ], + [ + "Cr", + "ed" + ], + [ + "C", + "red" + ], + [ + "▁m", + "éd" + ], + [ + "▁mé", + "d" + ], + [ + "▁e", + "rh" + ], + [ + "▁er", + "h" + ], + [ + "Frame", + "work" + ], + [ + "Sl", + "ot" + ], + [ + "S", + "lot" + ], + [ + "d", + "uration" + ], + [ + "sa", + "l" + ], + [ + "s", + "al" + ], + [ + "▁com", + "position" + ], + [ + "▁compos", + "ition" + ], + [ + "art", + "icle" + ], + [ + "gp", + "u" + ], + [ + "g", + "pu" + ], + [ + "▁per", + "mitted" + ], + [ + "▁perm", + "itted" + ], + [ + "▁permit", + "ted" + ], + [ + "▁F", + "ont" + ], + [ + "▁Fo", + "nt" + ], + [ + "▁", + "Font" + ], + [ + "▁M", + "uch" + ], + [ + "▁Mu", + "ch" + ], + [ + "▁p", + "ending" + ], + [ + "▁pen", + "ding" + ], + [ + "▁", + "pending" + ], + [ + "▁ag", + "encies" + ], + [ + "Column", + "s" + ], + [ + "▁k", + "lik" + ], + [ + "▁kl", + "ik" + ], + [ + "▁r", + "ating" + ], + [ + "▁rat", + "ing" + ], + [ + "▁ra", + "ting" + ], + [ + "▁", + "rating" + ], + [ + "min", + "d" + ], + [ + "mi", + "nd" + ], + [ + "m", + "ind" + ], + [ + "▁Penn", + "sylvania" + ], + [ + "J", + "ava" + ], + [ + "ab", + "stract" + ], + [ + "abs", + "tract" + ], + [ + "▁d", + "umb" + ], + [ + "▁du", + "mb" + ], + [ + "▁V", + "I" + ], + [ + "▁", + "VI" + ], + [ + "us", + "a" + ], + [ + "u", + "sa" + ], + [ + "Rem", + "ote" + ], + [ + "▁YO", + "U" + ], + [ + "▁C", + "reek" + ], + [ + "▁Cre", + "ek" + ], + [ + "ма", + "ти" + ], + [ + "мат", + "и" + ], + [ + "Bot", + "tom" + ], + [ + "B", + "ottom" + ], + [ + "▁roll", + "ing" + ], + [ + "▁", + "rolling" + ], + [ + "▁b", + "undle" + ], + [ + "▁bund", + "le" + ], + [ + "▁", + "bundle" + ], + [ + "▁g", + "olf" + ], + [ + "▁gol", + "f" + ], + [ + "gp", + "io" + ], + [ + "g", + "pio" + ], + [ + "▁Ch", + "air" + ], + [ + "▁Cha", + "ir" + ], + [ + "▁c", + "ls" + ], + [ + "▁cl", + "s" + ], + [ + "▁", + "cls" + ], + [ + "$", + "}" + ], + [ + "▁Par", + "liament" + ], + [ + "f", + "ühr" + ], + [ + "Man", + "y" + ], + [ + "Ma", + "ny" + ], + [ + "M", + "any" + ], + [ + "▁S", + "ep" + ], + [ + "▁Se", + "p" + ], + [ + "▁", + "Sep" + ], + [ + "▁bad", + "ly" + ], + [ + "ig", + "i" + ], + [ + "i", + "gi" + ], + [ + "▁Geme", + "inde" + ], + [ + "Il", + "l" + ], + [ + "I", + "ll" + ], + [ + "▁А", + "н" + ], + [ + "ua", + "rt" + ], + [ + "uar", + "t" + ], + [ + "u", + "art" + ], + [ + "it", + "empty" + ], + [ + "item", + "pty" + ], + [ + "▁N", + "iger" + ], + [ + "▁Ni", + "ger" + ], + [ + "▁im", + "migr" + ], + [ + "▁imm", + "igr" + ], + [ + "Su", + "per" + ], + [ + "Sup", + "er" + ], + [ + "S", + "uper" + ], + [ + "v", + "á" + ], + [ + "ist", + "ribute" + ], + [ + "istribut", + "e" + ], + [ + "Hel", + "pers" + ], + [ + "Helper", + "s" + ], + [ + "Help", + "ers" + ], + [ + "▁w", + "aters" + ], + [ + "▁water", + "s" + ], + [ + "▁wat", + "ers" + ], + [ + 
"▁wa", + "ters" + ], + [ + "▁join", + "ing" + ], + [ + "▁jo", + "ining" + ], + [ + "om", + "itempty" + ], + [ + "▁Other", + "wise" + ], + [ + "▁H", + "ost" + ], + [ + "▁Ho", + "st" + ], + [ + "▁", + "Host" + ], + [ + "▁re", + "dd" + ], + [ + "▁red", + "d" + ], + [ + "▁d", + "y" + ], + [ + "▁", + "dy" + ], + [ + "▁con", + "verted" + ], + [ + "▁convert", + "ed" + ], + [ + "▁conver", + "ted" + ], + [ + "▁pr", + "ayer" + ], + [ + "▁pray", + "er" + ], + [ + "▁pra", + "yer" + ], + [ + "▁У", + "краї" + ], + [ + "▁Укра", + "ї" + ], + [ + "▁e", + "lections" + ], + [ + "▁elect", + "ions" + ], + [ + "▁ele", + "ctions" + ], + [ + "▁election", + "s" + ], + [ + "re", + "b" + ], + [ + "r", + "eb" + ], + [ + "er", + "ie" + ], + [ + "eri", + "e" + ], + [ + "e", + "rie" + ], + [ + "▁с", + "вя" + ], + [ + "Ab", + "s" + ], + [ + "A", + "bs" + ], + [ + "ie", + "mbre" + ], + [ + "iem", + "bre" + ], + [ + "i", + "embre" + ], + [ + "hol", + "ders" + ], + [ + "hold", + "ers" + ], + [ + "holder", + "s" + ], + [ + "▁R", + "ol" + ], + [ + "▁Ro", + "l" + ], + [ + "ut", + "schen" + ], + [ + "uts", + "chen" + ], + [ + "utsch", + "en" + ], + [ + "utsche", + "n" + ], + [ + "▁G", + "h" + ], + [ + "ter", + "y" + ], + [ + "te", + "ry" + ], + [ + "t", + "ery" + ], + [ + "ан", + "г" + ], + [ + "а", + "нг" + ], + [ + "▁narr", + "ative" + ], + [ + "min", + "us" + ], + [ + "m", + "inus" + ], + [ + "▁I", + "ron" + ], + [ + "▁Ir", + "on" + ], + [ + "=\"", + "#" + ], + [ + "▁w", + "and" + ], + [ + "▁wa", + "nd" + ], + [ + "▁", + "wand" + ], + [ + "▁w", + "ished" + ], + [ + "▁wish", + "ed" + ], + [ + "▁wis", + "hed" + ], + [ + "ic", + "ode" + ], + [ + "ico", + "de" + ], + [ + "i", + "code" + ], + [ + "or", + "r" + ], + [ + "o", + "rr" + ], + [ + "[", + "[" + ], + [ + "▁detect", + "ed" + ], + [ + "▁municip", + "al" + ], + [ + "▁P", + "our" + ], + [ + "▁Po", + "ur" + ], + [ + "▁S", + "erv" + ], + [ + "▁Se", + "rv" + ], + [ + "▁Ser", + "v" + ], + [ + "▁", + "Serv" + ], + [ + "cite", + "t" + ], + [ + "cit", + "et" + ], + [ + "c", + "itet" + ], + [ + "▁g", + "rey" + ], + [ + "▁gr", + "ey" + ], + [ + "▁gre", + "y" + ], + [ + "▁R", + "ap" + ], + [ + "▁Ra", + "p" + ], + [ + "▁v", + "oy" + ], + [ + "▁vo", + "y" + ], + [ + "▁l", + "leg" + ], + [ + "▁ll", + "eg" + ], + [ + "▁cur", + "rency" + ], + [ + "▁curr", + "ency" + ], + [ + "▁", + "currency" + ], + [ + "▁S", + "cript" + ], + [ + "▁Sc", + "ript" + ], + [ + "▁", + "Script" + ], + [ + "str", + "ument" + ], + [ + "stru", + "ment" + ], + [ + "▁expect", + "ing" + ], + [ + "▁t", + "ickets" + ], + [ + "▁tick", + "ets" + ], + [ + "▁ticket", + "s" + ], + [ + "▁b", + "ucket" + ], + [ + "▁buck", + "et" + ], + [ + "▁", + "bucket" + ], + [ + "eg", + "r" + ], + [ + "e", + "gr" + ], + [ + "▁j", + "acket" + ], + [ + "▁jack", + "et" + ], + [ + "dr", + "v" + ], + [ + "d", + "rv" + ], + [ + "▁lo", + "ans" + ], + [ + "▁loan", + "s" + ], + [ + "▁k", + "ann" + ], + [ + "▁kan", + "n" + ], + [ + "▁ka", + "nn" + ], + [ + "▁integr", + "al" + ], + [ + "▁character", + "istics" + ], + [ + "▁characteristic", + "s" + ], + [ + "(\"", + "." + ], + [ + "(", + "\"." 
+ ], + [ + "▁man", + "ual" + ], + [ + "▁d", + "ynamics" + ], + [ + "▁dynam", + "ics" + ], + [ + "▁dynamic", + "s" + ], + [ + ":", + "*" + ], + [ + "sh", + "a" + ], + [ + "s", + "ha" + ], + [ + "re", + "ens" + ], + [ + "ree", + "ns" + ], + [ + "reen", + "s" + ], + [ + "on", + "ical" + ], + [ + "oni", + "cal" + ], + [ + "onic", + "al" + ], + [ + "▁to", + "ile" + ], + [ + "añ", + "a" + ], + [ + "a", + "ña" + ], + [ + "▁d", + "istant" + ], + [ + "▁di", + "stant" + ], + [ + "▁dist", + "ant" + ], + [ + "▁hand", + "led" + ], + [ + "▁handle", + "d" + ], + [ + "Bo", + "ol" + ], + [ + "B", + "ool" + ], + [ + "▁pe", + "nal" + ], + [ + "▁pen", + "al" + ], + [ + "▁Th", + "ings" + ], + [ + "▁prom", + "inent" + ], + [ + "▁ex", + "ped" + ], + [ + "▁exp", + "ed" + ], + [ + "▁He", + "lp" + ], + [ + "▁Hel", + "p" + ], + [ + "▁", + "Help" + ], + [ + "▁a", + "sp" + ], + [ + "▁as", + "p" + ], + [ + "▁", + "asp" + ], + [ + "la", + "p" + ], + [ + "l", + "ap" + ], + [ + "▁A", + "uth" + ], + [ + "▁Aut", + "h" + ], + [ + "▁Au", + "th" + ], + [ + "▁", + "Auth" + ], + [ + "Bas", + "ic" + ], + [ + "ach", + "uset" + ], + [ + "▁B", + "ild" + ], + [ + "▁Bi", + "ld" + ], + [ + "▁Bil", + "d" + ], + [ + "▁ent", + "itled" + ], + [ + "▁j", + "ag" + ], + [ + "▁ja", + "g" + ], + [ + "▁reject", + "ed" + ], + [ + "▁m", + "emor" + ], + [ + "▁me", + "mor" + ], + [ + "▁mem", + "or" + ], + [ + "▁memo", + "r" + ], + [ + "or", + "ts" + ], + [ + "ort", + "s" + ], + [ + "▁ap", + "plies" + ], + [ + "▁appl", + "ies" + ], + [ + "▁L", + "anguage" + ], + [ + "▁", + "Language" + ], + [ + "spec", + "ific" + ], + [ + "achuset", + "ts" + ], + [ + "HA", + "ND" + ], + [ + "H", + "AND" + ], + [ + "▁R", + "oute" + ], + [ + "▁Ro", + "ute" + ], + [ + "▁Rou", + "te" + ], + [ + "▁", + "Route" + ], + [ + "mark", + "et" + ], + [ + "mar", + "ket" + ], + [ + "▁K", + "y" + ], + [ + "▁p", + "ose" + ], + [ + "▁pos", + "e" + ], + [ + "▁po", + "se" + ], + [ + "▁", + "pose" + ], + [ + "AC", + "HE" + ], + [ + "ACH", + "E" + ], + [ + "po", + "ll" + ], + [ + "pol", + "l" + ], + [ + "p", + "oll" + ], + [ + "▁r", + "ocks" + ], + [ + "▁ro", + "cks" + ], + [ + "▁rock", + "s" + ], + [ + "bo", + "ne" + ], + [ + "bon", + "e" + ], + [ + "b", + "one" + ], + [ + "▁D", + "IS" + ], + [ + "▁DI", + "S" + ], + [ + "▁", + "DIS" + ], + [ + "W", + "atch" + ], + [ + "▁sm", + "iling" + ], + [ + "ри", + "о" + ], + [ + "Mon", + "th" + ], + [ + "Mont", + "h" + ], + [ + "▁e", + "fter" + ], + [ + "con", + "struct" + ], + [ + "const", + "ruct" + ], + [ + "▁b", + "ands" + ], + [ + "▁band", + "s" + ], + [ + "▁ban", + "ds" + ], + [ + "▁", + "bands" + ], + [ + "▁collabor", + "ation" + ], + [ + "ни", + "ми" + ], + [ + "ним", + "и" + ], + [ + "gl", + "as" + ], + [ + "g", + "las" + ], + [ + "▁v", + "y" + ], + [ + "▁", + "vy" + ], + [ + "▁eng", + "agement" + ], + [ + "▁engage", + "ment" + ], + [ + "__", + ")" + ], + [ + "_", + "_)" + ], + [ + "▁w", + "ings" + ], + [ + "▁win", + "gs" + ], + [ + "▁wing", + "s" + ], + [ + "ки", + "м" + ], + [ + "к", + "им" + ], + [ + "net", + "je" + ], + [ + "at", + "iva" + ], + [ + "ati", + "va" + ], + [ + "ativ", + "a" + ], + [ + "▁Du", + "ke" + ], + [ + "ле", + "е" + ], + [ + "▁With", + "in" + ], + [ + "▁d", + "ove" + ], + [ + "▁do", + "ve" + ], + [ + "▁c", + "b" + ], + [ + "▁", + "cb" + ], + [ + "ye", + "rs" + ], + [ + "yer", + "s" + ], + [ + "y", + "ers" + ], + [ + "po", + "w" + ], + [ + "p", + "ow" + ], + [ + "[", + "(" + ], + [ + "▁evalu", + "ate" + ], + [ + "▁eval", + "uate" + ], + [ + "Point", + "s" + ], + [ + "▁р", + "і" + ], + [ + "▁", + "рі" + ], + [ + "od", 
+ "igd" + ], + [ + "odi", + "gd" + ], + [ + "on", + "omy" + ], + [ + "ono", + "my" + ], + [ + "onom", + "y" + ], + [ + "▁Ill", + "inois" + ], + [ + "▁T", + "yp" + ], + [ + "▁Ty", + "p" + ], + [ + "▁", + "Typ" + ], + [ + "▁coord", + "inates" + ], + [ + "▁coordinate", + "s" + ], + [ + "pis", + "ode" + ], + [ + "uck", + "ed" + ], + [ + "uc", + "ked" + ], + [ + "▁f", + "lav" + ], + [ + "▁fl", + "av" + ], + [ + "▁br", + "ands" + ], + [ + "▁brand", + "s" + ], + [ + "▁cal", + "endar" + ], + [ + "▁", + "calendar" + ], + [ + "Li", + "b" + ], + [ + "L", + "ib" + ], + [ + "▁uit", + "gen" + ], + [ + "▁t", + "ale" + ], + [ + "▁tal", + "e" + ], + [ + "▁ta", + "le" + ], + [ + "▁brief", + "ly" + ], + [ + "▁m", + "ic" + ], + [ + "▁mi", + "c" + ], + [ + "▁", + "mic" + ], + [ + "RE", + "SS" + ], + [ + "RES", + "S" + ], + [ + "▁sp", + "äter" + ], + [ + "▁integr", + "ated" + ], + [ + "▁integrate", + "d" + ], + [ + "▁cook", + "ies" + ], + [ + "▁cookie", + "s" + ], + [ + "▁uitgen", + "odigd" + ], + [ + "▁P", + "riv" + ], + [ + "▁Pr", + "iv" + ], + [ + "▁Pri", + "v" + ], + [ + "▁", + "Priv" + ], + [ + "▁phen", + "omen" + ], + [ + "▁vo", + "egen" + ], + [ + "Su", + "pp" + ], + [ + "Sup", + "p" + ], + [ + "S", + "upp" + ], + [ + "▁re", + "fers" + ], + [ + "▁ref", + "ers" + ], + [ + "▁refer", + "s" + ], + [ + "па", + "д" + ], + [ + "▁Cl", + "inton" + ], + [ + "▁Clin", + "ton" + ], + [ + "▁ass", + "ignment" + ], + [ + "▁assign", + "ment" + ], + [ + "in", + "als" + ], + [ + "ina", + "ls" + ], + [ + "inal", + "s" + ], + [ + "i", + "nals" + ], + [ + "▁a", + "sym" + ], + [ + "▁as", + "ym" + ], + [ + "cy", + "cle" + ], + [ + "cycl", + "e" + ], + [ + "c", + "ycle" + ], + [ + "▁And", + "erson" + ], + [ + "▁Anders", + "on" + ], + [ + "▁b", + "inding" + ], + [ + "▁bin", + "ding" + ], + [ + "▁bind", + "ing" + ], + [ + "▁", + "binding" + ], + [ + "ri", + "que" + ], + [ + "r", + "ique" + ], + [ + "hi", + "nd" + ], + [ + "hin", + "d" + ], + [ + "h", + "ind" + ], + [ + "▁be", + "half" + ], + [ + "▁beh", + "alf" + ], + [ + "▁F", + "le" + ], + [ + "▁Fl", + "e" + ], + [ + "▁break", + "s" + ], + [ + "▁bre", + "aks" + ], + [ + "▁so", + "ap" + ], + [ + "▁", + "soap" + ], + [ + "ва", + "р" + ], + [ + "в", + "ар" + ], + [ + "▁v", + "ä" + ], + [ + "▁", + "vä" + ], + [ + "▁c", + "rying" + ], + [ + "▁cr", + "ying" + ], + [ + "▁cry", + "ing" + ], + [ + "▁", + "→" + ], + [ + "▁m", + "sm" + ], + [ + "▁ms", + "m" + ], + [ + "▁", + "msm" + ], + [ + "▁bo", + "ots" + ], + [ + "▁boot", + "s" + ], + [ + "ow", + "ing" + ], + [ + "owi", + "ng" + ], + [ + "o", + "wing" + ], + [ + "▁b", + "ell" + ], + [ + "▁be", + "ll" + ], + [ + "▁bel", + "l" + ], + [ + "▁", + "bell" + ], + [ + "su", + "ite" + ], + [ + "suit", + "e" + ], + [ + "▁Bund", + "es" + ], + [ + "▁Bun", + "des" + ], + [ + "Y", + "ear" + ], + [ + "nd", + "ef" + ], + [ + "nde", + "f" + ], + [ + "n", + "def" + ], + [ + "O", + "ther" + ], + [ + "▁go", + "ogle" + ], + [ + "▁goog", + "le" + ], + [ + "▁", + "google" + ], + [ + "EN", + "CE" + ], + [ + "ENC", + "E" + ], + [ + "WE", + "R" + ], + [ + "W", + "ER" + ], + [ + "Le", + "s" + ], + [ + "L", + "es" + ], + [ + "Sh", + "ared" + ], + [ + "Share", + "d" + ], + [ + "▁E", + "D" + ], + [ + "▁", + "ED" + ], + [ + "IF", + "T" + ], + [ + "I", + "FT" + ], + [ + "▁flo", + "ating" + ], + [ + "▁float", + "ing" + ], + [ + "ý", + "m" + ], + [ + "{}", + "," + ], + [ + "{", + "}," + ], + [ + "Bin", + "ary" + ], + [ + "B", + "inary" + ], + [ + "▁ro", + "ce" + ], + [ + "ra", + "j" + ], + [ + "r", + "aj" + ], + [ + "▁be", + "werken" + ], + [ + "B", + "F" + ], + [ + 
"▁H", + "ur" + ], + [ + "▁Hu", + "r" + ], + [ + "ce", + "n" + ], + [ + "c", + "en" + ], + [ + "▁e", + "re" + ], + [ + "▁er", + "e" + ], + [ + "▁", + "ere" + ], + [ + "▁c", + "amb" + ], + [ + "▁cam", + "b" + ], + [ + "▁ca", + "mb" + ], + [ + "▁Pak", + "istan" + ], + [ + "▁great", + "ly" + ], + [ + "▁log", + "ging" + ], + [ + "▁", + "logging" + ], + [ + "/", + "." + ], + [ + "Ten", + "sor" + ], + [ + "T", + "ensor" + ], + [ + "▁op", + "ens" + ], + [ + "▁open", + "s" + ], + [ + "▁", + "opens" + ], + [ + "▁R", + "io" + ], + [ + "▁klik", + "ken" + ], + [ + "▁sc", + "ulpt" + ], + [ + "ap", + "ore" + ], + [ + "apor", + "e" + ], + [ + "w", + "x" + ], + [ + "▁N", + "ich" + ], + [ + "▁Nic", + "h" + ], + [ + "▁Ni", + "ch" + ], + [ + "na", + "n" + ], + [ + "n", + "an" + ], + [ + "▁inj", + "ured" + ], + [ + "com", + "pare" + ], + [ + "comp", + "are" + ], + [ + "compar", + "e" + ], + [ + "th", + "a" + ], + [ + "t", + "ha" + ], + [ + "Sam", + "ple" + ], + [ + "S", + "ample" + ], + [ + "Sh", + "ell" + ], + [ + "She", + "ll" + ], + [ + "S", + "hell" + ], + [ + "▁comm", + "ander" + ], + [ + "▁command", + "er" + ], + [ + "▁re", + "ceiver" + ], + [ + "▁rece", + "iver" + ], + [ + "▁receive", + "r" + ], + [ + "▁h", + "opes" + ], + [ + "▁hope", + "s" + ], + [ + "▁hop", + "es" + ], + [ + "▁ho", + "pes" + ], + [ + "▁b", + "yl" + ], + [ + "▁by", + "l" + ], + [ + "▁pro", + "xy" + ], + [ + "▁pr", + "oxy" + ], + [ + "▁", + "proxy" + ], + [ + "▁g", + "all" + ], + [ + "▁gal", + "l" + ], + [ + "▁ga", + "ll" + ], + [ + "get", + "Id" + ], + [ + "▁B", + "ab" + ], + [ + "▁Ba", + "b" + ], + [ + "fe", + "ld" + ], + [ + "fel", + "d" + ], + [ + "f", + "eld" + ], + [ + "▁\"", + "_" + ], + [ + "▁H", + "ab" + ], + [ + "▁Ha", + "b" + ], + [ + "sim", + "ple" + ], + [ + "▁execut", + "ed" + ], + [ + "▁execute", + "d" + ], + [ + "▁a", + "te" + ], + [ + "▁at", + "e" + ], + [ + "▁", + "ate" + ], + [ + "▁an", + "imation" + ], + [ + "▁anim", + "ation" + ], + [ + "▁", + "animation" + ], + [ + "▁in", + "hab" + ], + [ + "▁бо", + "ль" + ], + [ + "▁r", + "outer" + ], + [ + "▁ro", + "uter" + ], + [ + "▁rout", + "er" + ], + [ + "▁route", + "r" + ], + [ + "▁rou", + "ter" + ], + [ + "▁", + "router" + ], + [ + "▁gl", + "ob" + ], + [ + "▁glo", + "b" + ], + [ + "▁", + "glob" + ], + [ + "Ge", + "plaatst" + ], + [ + "▁begin", + "netje" + ], + [ + "▁K", + "ur" + ], + [ + "▁Ku", + "r" + ], + [ + "▁Х", + "а" + ], + [ + "al", + "igned" + ], + [ + "align", + "ed" + ], + [ + "▁cert", + "ificate" + ], + [ + "▁", + "Å" + ], + [ + ".)", + "." + ], + [ + ".", + ")." 
+ ], + [ + "▁s", + "oll" + ], + [ + "▁so", + "ll" + ], + [ + "▁sol", + "l" + ], + [ + "▁Im", + "port" + ], + [ + "▁Imp", + "ort" + ], + [ + "▁", + "Import" + ], + [ + "ре", + "ди" + ], + [ + "ред", + "и" + ], + [ + "р", + "еди" + ], + [ + "▁pand", + "emic" + ], + [ + "▁n", + "ic" + ], + [ + "▁ni", + "c" + ], + [ + "▁", + "nic" + ], + [ + "v", + "ä" + ], + [ + "▁G", + "ree" + ], + [ + "▁Gr", + "ee" + ], + [ + "▁Gre", + "e" + ], + [ + "▁S", + "ay" + ], + [ + "▁Sa", + "y" + ], + [ + "▁д", + "і" + ], + [ + "▁", + "ді" + ], + [ + "▁N", + "um" + ], + [ + "▁Nu", + "m" + ], + [ + "▁", + "Num" + ], + [ + "▁rough", + "ly" + ], + [ + "▁des", + "pués" + ], + [ + "▁", + "​" + ], + [ + "▁spec", + "ify" + ], + [ + "Map", + "per" + ], + [ + "lic", + "ht" + ], + [ + "li", + "cht" + ], + [ + "lich", + "t" + ], + [ + "l", + "icht" + ], + [ + "▁th", + "umb" + ], + [ + "▁", + "thumb" + ], + [ + "wi", + "e" + ], + [ + "w", + "ie" + ], + [ + "▁un", + "likely" + ], + [ + "▁unlike", + "ly" + ], + [ + "▁", + "unlikely" + ], + [ + "▁E", + "dd" + ], + [ + "▁Ed", + "d" + ], + [ + "He", + "y" + ], + [ + "H", + "ey" + ], + [ + "▁O", + "pt" + ], + [ + "▁Op", + "t" + ], + [ + "▁", + "Opt" + ], + [ + "B", + "LOCK" + ], + [ + "во", + "р" + ], + [ + "в", + "ор" + ], + [ + "▁", + "×" + ], + [ + "▁b", + "a" + ], + [ + "▁", + "ba" + ], + [ + "▁period", + "s" + ], + [ + "▁title", + "s" + ], + [ + "▁tit", + "les" + ], + [ + "Me", + "d" + ], + [ + "M", + "ed" + ], + [ + "▁f", + "on" + ], + [ + "▁fo", + "n" + ], + [ + "▁", + "fon" + ], + [ + "▁b", + "ast" + ], + [ + "▁bas", + "t" + ], + [ + "▁ba", + "st" + ], + [ + "▁", + "bast" + ], + [ + "▁F", + "orest" + ], + [ + "▁For", + "est" + ], + [ + "▁Fore", + "st" + ], + [ + "▁Fo", + "rest" + ], + [ + "▁", + "№" + ], + [ + "on", + "ds" + ], + [ + "ond", + "s" + ], + [ + "▁f", + "al" + ], + [ + "▁fa", + "l" + ], + [ + "▁g", + "esch" + ], + [ + "▁ge", + "sch" + ], + [ + "▁ges", + "ch" + ], + [ + "▁", + "gesch" + ], + [ + "dir", + "ection" + ], + [ + "di", + "rection" + ], + [ + "direct", + "ion" + ], + [ + "dire", + "ction" + ], + [ + "d", + "irection" + ], + [ + "IF", + "Y" + ], + [ + "▁L", + "A" + ], + [ + "▁", + "LA" + ], + [ + "▁(", + "((" + ], + [ + "▁((", + "(" + ], + [ + "▁", + "(((" + ], + [ + "GT", + "H" + ], + [ + "G", + "TH" + ], + [ + "it", + "udes" + ], + [ + "itude", + "s" + ], + [ + "itu", + "des" + ], + [ + "itud", + "es" + ], + [ + "▁dest", + "ruction" + ], + [ + "▁destruct", + "ion" + ], + [ + "▁J", + "a" + ], + [ + "▁s", + "take" + ], + [ + "▁st", + "ake" + ], + [ + "▁sta", + "ke" + ], + [ + "iffer", + "ent" + ], + [ + "iffe", + "rent" + ], + [ + "▁ident", + "ical" + ], + [ + "▁f", + "og" + ], + [ + "▁fo", + "g" + ], + [ + "▁R", + "eb" + ], + [ + "▁Re", + "b" + ], + [ + "▁", + "Reb" + ], + [ + "ски", + "е" + ], + [ + "сту", + "п" + ], + [ + "ja", + "x" + ], + [ + "j", + "ax" + ], + [ + "▁M", + "ars" + ], + [ + "▁Mar", + "s" + ], + [ + "▁Ma", + "rs" + ], + [ + "▁hist", + "oric" + ], + [ + "▁histor", + "ic" + ], + [ + "▁V", + "o" + ], + [ + "▁", + "Vo" + ], + [ + "▁entre", + "pre" + ], + [ + "▁t", + "ension" + ], + [ + "▁tens", + "ion" + ], + [ + "▁W", + "HERE" + ], + [ + "▁WH", + "ERE" + ], + [ + "▁WHE", + "RE" + ], + [ + "▁Phil", + "adelphia" + ], + [ + "Count", + "er" + ], + [ + "Co", + "unter" + ], + [ + "C", + "ounter" + ], + [ + "▁fr", + "ames" + ], + [ + "▁frame", + "s" + ], + [ + "▁fra", + "mes" + ], + [ + "▁fram", + "es" + ], + [ + "▁", + "frames" + ], + [ + "▁m", + "uy" + ], + [ + "▁mu", + "y" + ], + [ + "e", + "j" + ], + [ + "ö", + "t" + ], + [ + "e", + "u" + ], 
+ [ + "▁че", + "лове" + ], + [ + "PRO", + "C" + ], + [ + "PR", + "OC" + ], + [ + "▁res", + "olved" + ], + [ + "▁resolve", + "d" + ], + [ + "▁", + "resolved" + ], + [ + "▁t", + "ape" + ], + [ + "▁tap", + "e" + ], + [ + "▁ta", + "pe" + ], + [ + "ци", + "он" + ], + [ + "▁sing", + "ular" + ], + [ + "▁person", + "nel" + ], + [ + "▁M", + "un" + ], + [ + "▁Mu", + "n" + ], + [ + "▁O", + "cc" + ], + [ + "▁", + "Occ" + ], + [ + "▁scal", + "ar" + ], + [ + "▁", + "scalar" + ], + [ + "de", + "ss" + ], + [ + "des", + "s" + ], + [ + "d", + "ess" + ], + [ + "▁c", + "able" + ], + [ + "▁cab", + "le" + ], + [ + "▁ca", + "ble" + ], + [ + "be", + "ing" + ], + [ + "b", + "eing" + ], + [ + "▁J", + "enn" + ], + [ + "▁Je", + "nn" + ], + [ + "▁Jen", + "n" + ], + [ + "▁er", + "st" + ], + [ + "▁ers", + "t" + ], + [ + "▁", + "erst" + ], + [ + "Action", + "s" + ], + [ + "Act", + "ions" + ], + [ + "A", + "ctions" + ], + [ + "Env", + "ironment" + ], + [ + "vi", + "a" + ], + [ + "v", + "ia" + ], + [ + "▁strugg", + "ling" + ], + [ + "▁D", + "VD" + ], + [ + "wh", + "e" + ], + [ + "w", + "he" + ], + [ + "▁throw", + "ing" + ], + [ + "▁thr", + "owing" + ], + [ + "▁thro", + "wing" + ], + [ + "Bound", + "s" + ], + [ + "B", + "ounds" + ], + [ + "▁M", + "D" + ], + [ + "▁", + "MD" + ], + [ + "▁\"", + "../" + ], + [ + "▁\".", + "./" + ], + [ + "▁satisf", + "y" + ], + [ + "▁Color", + "ado" + ], + [ + "▁Act", + "ive" + ], + [ + "▁Activ", + "e" + ], + [ + "▁", + "Active" + ], + [ + "Task", + "s" + ], + [ + "<>(", + ");" + ], + [ + "<>", + "();" + ], + [ + "<", + ">();" + ], + [ + "▁sl", + "ipped" + ], + [ + "▁slip", + "ped" + ], + [ + "▁po", + "ison" + ], + [ + "▁poi", + "son" + ], + [ + "z", + "b" + ], + [ + "Dis", + "patch" + ], + [ + "war", + "ning" + ], + [ + "warn", + "ing" + ], + [ + "w", + "arning" + ], + [ + "▁ult", + "imate" + ], + [ + "p", + "icture" + ], + [ + "ex", + "pression" + ], + [ + "exp", + "ression" + ], + [ + "expr", + "ession" + ], + [ + "express", + "ion" + ], + [ + "▁T", + "alk" + ], + [ + "▁Tal", + "k" + ], + [ + "▁f", + "lick" + ], + [ + "▁fl", + "ick" + ], + [ + "▁rais", + "ing" + ], + [ + "▁ra", + "ising" + ], + [ + "▁", + "raising" + ], + [ + "▁trans", + "actions" + ], + [ + "▁transaction", + "s" + ], + [ + "▁gl", + "ance" + ], + [ + "▁g", + "ri" + ], + [ + "▁gr", + "i" + ], + [ + "▁п", + "рез" + ], + [ + "▁пре", + "з" + ], + [ + "se", + "lection" + ], + [ + "sel", + "ection" + ], + [ + "select", + "ion" + ], + [ + "s", + "election" + ], + [ + "њ", + "а" + ], + [ + "en", + "dl" + ], + [ + "end", + "l" + ], + [ + "▁A", + "bb" + ], + [ + "▁Ab", + "b" + ], + [ + "▁b", + "old" + ], + [ + "▁bo", + "ld" + ], + [ + "▁bol", + "d" + ], + [ + "▁", + "bold" + ], + [ + "▁maint", + "ained" + ], + [ + "▁maintain", + "ed" + ], + [ + "Ex", + "ists" + ], + [ + "▁encour", + "aged" + ], + [ + "▁encourage", + "d" + ], + [ + "Qu", + "al" + ], + [ + "Q", + "ual" + ], + [ + "▁ess", + "ere" + ], + [ + "▁h", + "ired" + ], + [ + "▁hi", + "red" + ], + [ + "▁hire", + "d" + ], + [ + "let", + "ter" + ], + [ + "lett", + "er" + ], + [ + "lette", + "r" + ], + [ + "it", + "ches" + ], + [ + "itch", + "es" + ], + [ + "oth", + "ers" + ], + [ + "other", + "s" + ], + [ + "othe", + "rs" + ], + [ + "o", + "thers" + ], + [ + "▁w", + "oj" + ], + [ + "▁wo", + "j" + ], + [ + "▁inj", + "uries" + ], + [ + "▁d", + "il" + ], + [ + "▁di", + "l" + ], + [ + "ex", + "ecut" + ], + [ + "exec", + "ut" + ], + [ + "▁Ste", + "el" + ], + [ + "▁G", + "arden" + ], + [ + "▁Gar", + "den" + ], + [ + "▁Gard", + "en" + ], + [ + "з", + "я" + ], + [ + "\\,", + "\\" + ], + [ 
+ "\\", + ",\\" + ], + [ + "▁An", + "gel" + ], + [ + "▁Ang", + "el" + ], + [ + "pr", + "im" + ], + [ + "p", + "rim" + ], + [ + ">:", + "]<" + ], + [ + "g", + "b" + ], + [ + "pe", + "at" + ], + [ + "in", + "te" + ], + [ + "int", + "e" + ], + [ + "i", + "nte" + ], + [ + "▁ap", + "olog" + ], + [ + "▁reg", + "ulations" + ], + [ + "▁regul", + "ations" + ], + [ + "▁regulation", + "s" + ], + [ + "S", + "rc" + ], + [ + "k", + "h" + ], + [ + "Up", + "load" + ], + [ + "U", + "pload" + ], + [ + "ma", + "pping" + ], + [ + "map", + "ping" + ], + [ + "m", + "apping" + ], + [ + "▁p", + "resents" + ], + [ + "▁pres", + "ents" + ], + [ + "▁present", + "s" + ], + [ + "▁po", + "etry" + ], + [ + "▁poet", + "ry" + ], + [ + "▁st", + "ops" + ], + [ + "▁stop", + "s" + ], + [ + "▁sto", + "ps" + ], + [ + "▁T", + "ol" + ], + [ + "▁To", + "l" + ], + [ + "▁t", + "ower" + ], + [ + "▁to", + "wer" + ], + [ + "▁tow", + "er" + ], + [ + "▁O", + "UT" + ], + [ + "▁", + "OUT" + ], + [ + "Th", + "ank" + ], + [ + "Than", + "k" + ], + [ + "▁organ", + "ic" + ], + [ + "▁d", + "rei" + ], + [ + "▁dr", + "ei" + ], + [ + "▁dre", + "i" + ], + [ + "▁p", + "ound" + ], + [ + "▁po", + "und" + ], + [ + "▁pou", + "nd" + ], + [ + "cent", + "ury" + ], + [ + "▁mod", + "ules" + ], + [ + "▁module", + "s" + ], + [ + "▁", + "modules" + ], + [ + "▁д", + "ере" + ], + [ + "▁де", + "ре" + ], + [ + "▁w", + "orn" + ], + [ + "▁wor", + "n" + ], + [ + "▁wo", + "rn" + ], + [ + "▁par", + "ad" + ], + [ + "▁para", + "d" + ], + [ + "▁pa", + "rad" + ], + [ + "▁C", + "os" + ], + [ + "▁Co", + "s" + ], + [ + "fi", + "c" + ], + [ + "f", + "ic" + ], + [ + "▁бе", + "з" + ], + [ + "▁Jim", + "my" + ], + [ + "▁l", + "ands" + ], + [ + "▁land", + "s" + ], + [ + "▁lan", + "ds" + ], + [ + "▁", + "lands" + ], + [ + "▁min", + "ist" + ], + [ + "▁mini", + "st" + ], + [ + "vs", + "pace" + ], + [ + "v", + "space" + ], + [ + "▁light", + "ing" + ], + [ + "▁n", + "aked" + ], + [ + "▁na", + "ked" + ], + [ + "▁design", + "er" + ], + [ + "▁St", + "ream" + ], + [ + "▁Stre", + "am" + ], + [ + "▁", + "Stream" + ], + [ + "TM", + "P" + ], + [ + "T", + "MP" + ], + [ + "Cent", + "er" + ], + [ + "C", + "enter" + ], + [ + "resent", + "ation" + ], + [ + "ON", + "T" + ], + [ + "O", + "NT" + ], + [ + "▁e", + "rs" + ], + [ + "▁er", + "s" + ], + [ + "▁", + "ers" + ], + [ + "▁measure", + "ment" + ], + [ + "▁mus", + "cles" + ], + [ + "▁muscle", + "s" + ], + [ + "▁I", + "gn" + ], + [ + "▁", + "Ign" + ], + [ + "▁C", + "OM" + ], + [ + "▁CO", + "M" + ], + [ + "▁", + "COM" + ], + [ + "▁f", + "ru" + ], + [ + "▁fr", + "u" + ], + [ + "▁gen", + "re" + ], + [ + "▁al", + "pha" + ], + [ + "▁", + "alpha" + ], + [ + "▁ret", + "irement" + ], + [ + "▁retire", + "ment" + ], + [ + "▁G", + "on" + ], + [ + "▁Go", + "n" + ], + [ + "ő", + "l" + ], + [ + "cont", + "ents" + ], + [ + "content", + "s" + ], + [ + "▁he", + "aling" + ], + [ + "▁heal", + "ing" + ], + [ + "▁s", + "ido" + ], + [ + "▁si", + "do" + ], + [ + "▁sid", + "o" + ], + [ + "incip", + "al" + ], + [ + "Per", + "mission" + ], + [ + "Perm", + "ission" + ], + [ + "ра", + "к" + ], + [ + "▁G", + "ordon" + ], + [ + "▁Gor", + "don" + ], + [ + "▁R", + "ank" + ], + [ + "▁Ran", + "k" + ], + [ + "▁", + "Rank" + ], + [ + "▁Aut", + "om" + ], + [ + "▁Au", + "tom" + ], + [ + "▁Auto", + "m" + ], + [ + "▁", + "Autom" + ], + [ + "Con", + "structor" + ], + [ + "Construct", + "or" + ], + [ + "wi", + "ki" + ], + [ + "wik", + "i" + ], + [ + "w", + "iki" + ], + [ + "▁concern", + "ing" + ], + [ + "▁concer", + "ning" + ], + [ + "riz", + "ona" + ], + [ + "▁var", + "iant" + ], + [ + "▁vari", 
+ "ant" + ], + [ + "▁", + "variant" + ], + [ + "▁arr", + "anged" + ], + [ + "▁arrang", + "ed" + ], + [ + "▁arrange", + "d" + ], + [ + "▁S", + "pr" + ], + [ + "▁Sp", + "r" + ], + [ + "BP", + "ACK" + ], + [ + "B", + "PACK" + ], + [ + "Tim", + "estamp" + ], + [ + "re", + "store" + ], + [ + "rest", + "ore" + ], + [ + "aw", + "are" + ], + [ + "awa", + "re" + ], + [ + "a", + "ware" + ], + [ + "▁Ob", + "serv" + ], + [ + "▁", + "Observ" + ], + [ + "▁S", + "V" + ], + [ + "▁", + "SV" + ], + [ + "ip", + "p" + ], + [ + "i", + "pp" + ], + [ + "▁Execut", + "ive" + ], + [ + "▁col", + "leg" + ], + [ + "▁coll", + "eg" + ], + [ + "▁colle", + "g" + ], + [ + "▁explicit", + "ly" + ], + [ + "wr", + "itten" + ], + [ + "writ", + "ten" + ], + [ + "▁K", + "ön" + ], + [ + "▁Kö", + "n" + ], + [ + "ir", + "us" + ], + [ + "i", + "rus" + ], + [ + "▁H", + "old" + ], + [ + "▁Hol", + "d" + ], + [ + "▁Ho", + "ld" + ], + [ + "▁P", + "ract" + ], + [ + "▁Pr", + "act" + ], + [ + "▁Pra", + "ct" + ], + [ + "Char", + "acter" + ], + [ + "▁red", + "istribute" + ], + [ + "uer", + "to" + ], + [ + "▁Stud", + "ent" + ], + [ + "▁", + "Student" + ], + [ + "▁el", + "der" + ], + [ + "▁D", + "op" + ], + [ + "▁Do", + "p" + ], + [ + "v", + "p" + ], + [ + "▁H", + "ub" + ], + [ + "▁Hu", + "b" + ], + [ + "▁", + "Hub" + ], + [ + "▁gr", + "ounds" + ], + [ + "▁ground", + "s" + ], + [ + "▁R", + "y" + ], + [ + "▁sign", + "als" + ], + [ + "▁sig", + "nals" + ], + [ + "▁signal", + "s" + ], + [ + "▁g", + "ifts" + ], + [ + "▁gift", + "s" + ], + [ + "▁streng", + "then" + ], + [ + "▁strength", + "en" + ], + [ + "▁L", + "yn" + ], + [ + "▁Ly", + "n" + ], + [ + "com", + "mun" + ], + [ + "comm", + "un" + ], + [ + "▁на", + "й" + ], + [ + "▁fin", + "ance" + ], + [ + "▁financ", + "e" + ], + [ + "no", + "c" + ], + [ + "n", + "oc" + ], + [ + "he", + "lm" + ], + [ + "hel", + "m" + ], + [ + "h", + "elm" + ], + [ + "▁c", + "uts" + ], + [ + "▁cut", + "s" + ], + [ + "▁cu", + "ts" + ], + [ + "▁advent", + "ure" + ], + [ + "▁R", + "ic" + ], + [ + "▁intellect", + "ual" + ], + [ + "▁Out", + "put" + ], + [ + "▁", + "Output" + ], + [ + "▁aw", + "k" + ], + [ + "▁", + "awk" + ], + [ + "▁concentr", + "ation" + ], + [ + "▁guid", + "ance" + ], + [ + "Buf", + "f" + ], + [ + "Bu", + "ff" + ], + [ + "B", + "uff" + ], + [ + "▁f", + "illing" + ], + [ + "▁fil", + "ling" + ], + [ + "▁fill", + "ing" + ], + [ + "▁reg", + "ul" + ], + [ + "▁del", + "icious" + ], + [ + "([", + "]" + ], + [ + "(", + "[]" + ], + [ + "ши", + "х" + ], + [ + "▁t", + "ons" + ], + [ + "▁to", + "ns" + ], + [ + "▁ton", + "s" + ], + [ + "▁", + "tons" + ], + [ + "act", + "ivity" + ], + [ + "activ", + "ity" + ], + [ + "G", + "P" + ], + [ + "LO", + "B" + ], + [ + "L", + "OB" + ], + [ + "st", + "adt" + ], + [ + "sta", + "dt" + ], + [ + "stad", + "t" + ], + [ + "ta", + "l" + ], + [ + "t", + "al" + ], + [ + "▁im", + "g" + ], + [ + "▁i", + "mg" + ], + [ + "▁", + "img" + ], + [ + "▁r", + "ush" + ], + [ + "▁ru", + "sh" + ], + [ + "▁rus", + "h" + ], + [ + "att", + "ice" + ], + [ + "atti", + "ce" + ], + [ + "▁p", + "ok" + ], + [ + "▁po", + "k" + ], + [ + "st", + "eps" + ], + [ + "ste", + "ps" + ], + [ + "step", + "s" + ], + [ + "▁l", + "id" + ], + [ + "▁li", + "d" + ], + [ + "▁D", + "NA" + ], + [ + "B", + "rowser" + ], + [ + "▁lad", + "ies" + ], + [ + "▁an", + "nées" + ], + [ + "▁ann", + "ées" + ], + [ + "▁resc", + "ue" + ], + [ + "av", + "ity" + ], + [ + "avi", + "ty" + ], + [ + "ro", + "ck" + ], + [ + "roc", + "k" + ], + [ + "r", + "ock" + ], + [ + "▁glass", + "es" + ], + [ + "▁B", + "ey" + ], + [ + "▁Be", + "y" + ], + [ + ")}", + 
"$" + ], + [ + ")", + "}$" + ], + [ + "de", + "tail" + ], + [ + "det", + "ail" + ], + [ + "▁d", + "és" + ], + [ + "▁dé", + "s" + ], + [ + "ta", + "x" + ], + [ + "t", + "ax" + ], + [ + "▁favour", + "ite" + ], + [ + "▁prec", + "ision" + ], + [ + "▁con", + "oc" + ], + [ + "▁co", + "noc" + ], + [ + "M", + "s" + ], + [ + "▁N", + "ative" + ], + [ + "▁Nat", + "ive" + ], + [ + "▁", + "Native" + ], + [ + "▁P", + "il" + ], + [ + "▁Pi", + "l" + ], + [ + "Input", + "Stream" + ], + [ + "or", + "p" + ], + [ + "o", + "rp" + ], + [ + "▁P", + "ap" + ], + [ + "▁Pa", + "p" + ], + [ + "▁p", + "icking" + ], + [ + "▁pick", + "ing" + ], + [ + "▁pic", + "king" + ], + [ + "ip", + "h" + ], + [ + "i", + "ph" + ], + [ + "Load", + "ing" + ], + [ + "Lo", + "ading" + ], + [ + "▁pr", + "iest" + ], + [ + "▁pri", + "est" + ], + [ + "H", + "ook" + ], + [ + "▁p", + "ist" + ], + [ + "▁pi", + "st" + ], + [ + "▁U", + "ne" + ], + [ + "▁Un", + "e" + ], + [ + "▁", + "Une" + ], + [ + "%", + "," + ], + [ + "▁b", + "il" + ], + [ + "▁bi", + "l" + ], + [ + "▁", + "bil" + ], + [ + "▁conserv", + "ative" + ], + [ + "ev", + "al" + ], + [ + "eva", + "l" + ], + [ + "e", + "val" + ], + [ + "ik", + "ing" + ], + [ + "iki", + "ng" + ], + [ + "i", + "king" + ], + [ + "'}", + "," + ], + [ + "'", + "}," + ], + [ + "▁sa", + "uce" + ], + [ + "▁sau", + "ce" + ], + [ + "▁D", + "ue" + ], + [ + "▁Du", + "e" + ], + [ + "as", + "sen" + ], + [ + "ass", + "en" + ], + [ + "asse", + "n" + ], + [ + "▁occasion", + "ally" + ], + [ + "▁occasional", + "ly" + ], + [ + "▁Д", + "ж" + ], + [ + "un", + "known" + ], + [ + "unk", + "nown" + ], + [ + "DE", + "D" + ], + [ + "D", + "ED" + ], + [ + "▁d", + "rum" + ], + [ + "▁dr", + "um" + ], + [ + "▁dru", + "m" + ], + [ + "▁d", + "ub" + ], + [ + "▁du", + "b" + ], + [ + "AT", + "URE" + ], + [ + "us", + "age" + ], + [ + "usa", + "ge" + ], + [ + "get", + "Type" + ], + [ + "re", + "ply" + ], + [ + "rep", + "ly" + ], + [ + "▁strateg", + "ic" + ], + [ + "▁k", + "ap" + ], + [ + "▁ka", + "p" + ], + [ + "▁", + "kap" + ], + [ + "de", + "sign" + ], + [ + "des", + "ign" + ], + [ + "date", + "time" + ], + [ + "dat", + "etime" + ], + [ + "▁P", + "rim" + ], + [ + "▁Pr", + "im" + ], + [ + "▁Pri", + "m" + ], + [ + "▁", + "Prim" + ], + [ + "Ma", + "ster" + ], + [ + "M", + "aster" + ], + [ + "▁Cor", + "ps" + ], + [ + "▁consider", + "able" + ], + [ + "▁T", + "u" + ], + [ + "▁", + "ла" + ], + [ + "▁t", + "ous" + ], + [ + "▁to", + "us" + ], + [ + "▁tou", + "s" + ], + [ + "▁c", + "lar" + ], + [ + "▁cl", + "ar" + ], + [ + "▁po", + "em" + ], + [ + "al", + "bum" + ], + [ + "]", + "*" + ], + [ + "lo", + "aded" + ], + [ + "load", + "ed" + ], + [ + "▁travel", + "ing" + ], + [ + "▁trav", + "eling" + ], + [ + "вы", + "е" + ], + [ + "▁F", + "err" + ], + [ + "▁Fe", + "rr" + ], + [ + "▁Fer", + "r" + ], + [ + "▁p", + "harm" + ], + [ + "▁ph", + "arm" + ], + [ + "ab", + "i" + ], + [ + "a", + "bi" + ], + [ + "▁}", + "\\" + ], + [ + "▁", + "}\\" + ], + [ + "col", + "lect" + ], + [ + "coll", + "ect" + ], + [ + "▁B", + "our" + ], + [ + "▁Bo", + "ur" + ], + [ + "▁Bou", + "r" + ], + [ + "O", + "C" + ], + [ + "▁measure", + "ments" + ], + [ + "▁measurement", + "s" + ], + [ + "▁Profess", + "ional" + ], + [ + "▁s", + "ensor" + ], + [ + "▁sens", + "or" + ], + [ + "▁sen", + "sor" + ], + [ + "▁", + "sensor" + ], + [ + "ut", + "sche" + ], + [ + "uts", + "che" + ], + [ + "utsch", + "e" + ], + [ + "▁dem", + "anded" + ], + [ + "▁demand", + "ed" + ], + [ + "▁accompan", + "ied" + ], + [ + "▁p", + "rend" + ], + [ + "▁pre", + "nd" + ], + [ + "▁pr", + "end" + ], + [ + "▁enc", + 
"oding" + ], + [ + "▁", + "encoding" + ], + [ + "▁Gesch", + "ichte" + ], + [ + "▁m", + "ig" + ], + [ + "▁mi", + "g" + ], + [ + "▁G", + "ib" + ], + [ + "▁Gi", + "b" + ], + [ + "▁Re", + "ich" + ], + [ + "▁m", + "yster" + ], + [ + "▁my", + "ster" + ], + [ + "▁myst", + "er" + ], + [ + "▁M", + "ock" + ], + [ + "▁Mo", + "ck" + ], + [ + "▁", + "Mock" + ], + [ + "▁phys", + "ically" + ], + [ + "▁physical", + "ly" + ], + [ + "▁B", + "au" + ], + [ + "▁Ba", + "u" + ], + [ + "▁S", + "ingle" + ], + [ + "▁Sing", + "le" + ], + [ + "▁Sin", + "gle" + ], + [ + "▁", + "Single" + ], + [ + "▁man", + "aging" + ], + [ + "▁K", + "il" + ], + [ + "▁Ki", + "l" + ], + [ + "▁Tem", + "ple" + ], + [ + "▁Temp", + "le" + ], + [ + "▁l", + "ev" + ], + [ + "▁le", + "v" + ], + [ + "▁", + "lev" + ], + [ + "▁l", + "í" + ], + [ + "CP", + "U" + ], + [ + "C", + "PU" + ], + [ + "▁Prem", + "ier" + ], + [ + "▁G", + "ive" + ], + [ + "▁Gi", + "ve" + ], + [ + "ir", + "i" + ], + [ + "i", + "ri" + ], + [ + "N", + "V" + ], + [ + "▁A", + "I" + ], + [ + "▁", + "AI" + ], + [ + "▁f", + "p" + ], + [ + "▁", + "fp" + ], + [ + "лекс", + "анд" + ], + [ + "▁t", + "ant" + ], + [ + "▁tan", + "t" + ], + [ + "▁ta", + "nt" + ], + [ + "▁f", + "ot" + ], + [ + "▁fo", + "t" + ], + [ + "Null", + "able" + ], + [ + "▁gu", + "ards" + ], + [ + "▁guard", + "s" + ], + [ + "On", + "ce" + ], + [ + "▁ch", + "amber" + ], + [ + "▁cha", + "mber" + ], + [ + "fil", + "m" + ], + [ + "fi", + "lm" + ], + [ + "▁b", + "ias" + ], + [ + "▁bi", + "as" + ], + [ + "▁", + "bias" + ], + [ + "▁T", + "ai" + ], + [ + "▁Ta", + "i" + ], + [ + "ins", + "ic" + ], + [ + "insi", + "c" + ], + [ + "▁m", + "l" + ], + [ + "▁", + "ml" + ], + [ + "▁K", + "a" + ], + [ + "ва", + "л" + ], + [ + "в", + "ал" + ], + [ + "▁S", + "ER" + ], + [ + "▁SE", + "R" + ], + [ + "▁", + "SER" + ], + [ + "▁Some", + "one" + ], + [ + "}}", + "_{" + ], + [ + "}}_", + "{" + ], + [ + "}", + "}_{" + ], + [ + "Fix", + "ed" + ], + [ + "F", + "ixed" + ], + [ + "▁b", + "ent" + ], + [ + "▁be", + "nt" + ], + [ + "▁ben", + "t" + ], + [ + "▁pro", + "hib" + ], + [ + "▁b", + "id" + ], + [ + "▁bi", + "d" + ], + [ + "▁", + "bid" + ], + [ + "▁fe", + "wer" + ], + [ + "▁few", + "er" + ], + [ + "к", + "ры" + ], + [ + "▁l", + "ugar" + ], + [ + "▁lug", + "ar" + ], + [ + "▁lu", + "gar" + ], + [ + "▁de", + "serve" + ], + [ + "▁des", + "erve" + ], + [ + "ss", + "l" + ], + [ + "s", + "sl" + ], + [ + "▁c", + "fg" + ], + [ + "▁cf", + "g" + ], + [ + "▁", + "cfg" + ], + [ + "re", + "ck" + ], + [ + "rec", + "k" + ], + [ + "▁st", + "ability" + ], + [ + "▁stabil", + "ity" + ], + [ + "▁stab", + "ility" + ], + [ + "re", + "size" + ], + [ + "res", + "ize" + ], + [ + "▁assert", + "That" + ], + [ + "Tr", + "igger" + ], + [ + "▁ста", + "нов" + ], + [ + "▁стан", + "ов" + ], + [ + "▁", + "станов" + ], + [ + "pl", + "ugins" + ], + [ + "plugin", + "s" + ], + [ + "plug", + "ins" + ], + [ + "▁l", + "ets" + ], + [ + "▁le", + "ts" + ], + [ + "▁let", + "s" + ], + [ + "▁", + "lets" + ], + [ + "хі", + "д" + ], + [ + "х", + "ід" + ], + [ + "▁La", + "ura" + ], + [ + "▁Lau", + "ra" + ], + [ + "не", + "р" + ], + [ + "н", + "ер" + ], + [ + "▁br", + "ut" + ], + [ + "▁bru", + "t" + ], + [ + "▁F", + "I" + ], + [ + "▁", + "FI" + ], + [ + "is", + "ons" + ], + [ + "ison", + "s" + ], + [ + "iso", + "ns" + ], + [ + "▁d", + "yn" + ], + [ + "▁dy", + "n" + ], + [ + "▁", + "dyn" + ], + [ + "ic", + "her" + ], + [ + "ich", + "er" + ], + [ + "iche", + "r" + ], + [ + "i", + "cher" + ], + [ + "ray", + "ed" + ], + [ + "▁frequ", + "ent" + ], + [ + "▁jed", + "och" + ], + [ + "▁Mar", + "ine" + ], 
+ [ + "st", + "rings" + ], + [ + "str", + "ings" + ], + [ + "string", + "s" + ], + [ + "▁U", + "til" + ], + [ + "▁Ut", + "il" + ], + [ + "▁", + "Util" + ], + [ + "▁b", + "os" + ], + [ + "▁bo", + "s" + ], + [ + "Mu", + "s" + ], + [ + "M", + "us" + ], + [ + "▁Portug", + "al" + ], + [ + "Str", + "ategy" + ], + [ + "▁по", + "се" + ], + [ + "▁пос", + "е" + ], + [ + "▁sl", + "ice" + ], + [ + "▁slic", + "e" + ], + [ + "▁", + "slice" + ], + [ + "▁ins", + "ight" + ], + [ + "▁w", + "idget" + ], + [ + "▁wid", + "get" + ], + [ + "▁", + "widget" + ], + [ + "▁gén", + "éral" + ], + [ + "message", + "s" + ], + [ + "m", + "essages" + ], + [ + "▁H", + "u" + ], + [ + "▁requ", + "irement" + ], + [ + "▁require", + "ment" + ], + [ + "Si", + "de" + ], + [ + "S", + "ide" + ], + [ + "empl", + "ates" + ], + [ + "emplate", + "s" + ], + [ + "▁cer", + "emony" + ], + [ + "▁ceremon", + "y" + ], + [ + "▁phys", + "ics" + ], + [ + "▁grad", + "uate" + ], + [ + "▁gradu", + "ate" + ], + [ + "▁", + "graduate" + ], + [ + "par", + "a" + ], + [ + "pa", + "ra" + ], + [ + "p", + "ara" + ], + [ + "▁pre", + "serv" + ], + [ + "▁pres", + "erv" + ], + [ + "▁sh", + "ops" + ], + [ + "▁shop", + "s" + ], + [ + "▁", + "shops" + ], + [ + "ze", + "k" + ], + [ + "z", + "ek" + ], + [ + "▁u", + "b" + ], + [ + "▁", + "ub" + ], + [ + "pre", + "pare" + ], + [ + "▁O", + "il" + ], + [ + "▁f", + "ib" + ], + [ + "▁fi", + "b" + ], + [ + "▁run", + "time" + ], + [ + "▁", + "runtime" + ], + [ + "▁h", + "ogy" + ], + [ + "▁ho", + "gy" + ], + [ + "Warn", + "ing" + ], + [ + "War", + "ning" + ], + [ + "W", + "arning" + ], + [ + "▁Con", + "vert" + ], + [ + "▁", + "Convert" + ], + [ + "bour", + "ne" + ], + [ + "▁emer", + "ged" + ], + [ + "▁emerge", + "d" + ], + [ + "▁Д", + "и" + ], + [ + "ight", + "h" + ], + [ + "igh", + "th" + ], + [ + "gu", + "ard" + ], + [ + "ka", + "l" + ], + [ + "k", + "al" + ], + [ + "valid", + "ation" + ], + [ + "ên", + "cia" + ], + [ + "ê", + "ncia" + ], + [ + "▁dr", + "inks" + ], + [ + "▁drink", + "s" + ], + [ + "the", + "orem" + ], + [ + "H", + "R" + ], + [ + "ie", + "v" + ], + [ + "i", + "ev" + ], + [ + "ploy", + "ee" + ], + [ + "Us", + "age" + ], + [ + "▁с", + "пе" + ], + [ + "▁сп", + "е" + ], + [ + "▁", + "спе" + ], + [ + "dis", + "patch" + ], + [ + "disp", + "atch" + ], + [ + "▁inst", + "antly" + ], + [ + "▁instant", + "ly" + ], + [ + "ob", + "i" + ], + [ + "o", + "bi" + ], + [ + "▁just", + "ify" + ], + [ + "▁N", + "ev" + ], + [ + "▁Ne", + "v" + ], + [ + "▁я", + "вля" + ], + [ + "ag", + "ra" + ], + [ + "agr", + "a" + ], + [ + "a", + "gra" + ], + [ + "▁trans", + "mission" + ], + [ + "▁transm", + "ission" + ], + [ + "fl", + "y" + ], + [ + "f", + "ly" + ], + [ + ";", + "'", + ";" + ], + [ + ">", + "';" + ], + [ + "▁cou", + "sin" + ], + [ + "▁cous", + "in" + ], + [ + "create", + "Element" + ], + [ + "Co", + "uld" + ], + [ + "C", + "ould" + ], + [ + "▁cap", + "ac" + ], + [ + "▁p", + "ause" + ], + [ + "▁pa", + "use" + ], + [ + "▁paus", + "e" + ], + [ + "▁", + "pause" + ], + [ + "Array", + "List" + ], + [ + "kt", + "e" + ], + [ + "k", + "te" + ], + [ + "ord", + "ered" + ], + [ + "order", + "ed" + ], + [ + "▁sh", + "aking" + ], + [ + "▁sha", + "king" + ], + [ + "label", + "s" + ], + [ + "lab", + "els" + ], + [ + "▁redu", + "cing" + ], + [ + "вы", + "х" + ], + [ + "US", + "ED" + ], + [ + "USE", + "D" + ], + [ + "U", + "SED" + ], + [ + "▁v", + "oting" + ], + [ + "▁vo", + "ting" + ], + [ + "▁vot", + "ing" + ], + [ + "▁Min", + "istry" + ], + [ + "▁M", + "ig" + ], + [ + "▁Mi", + "g" + ], + [ + "▁C", + "hen" + ], + [ + "▁Ch", + "en" + ], + [ + 
"▁Che", + "n" + ], + [ + "▁ac", + "company" + ], + [ + "▁accompan", + "y" + ], + [ + "▁accomp", + "any" + ], + [ + "ul", + "le" + ], + [ + "ull", + "e" + ], + [ + "u", + "lle" + ], + [ + "▁g", + "a" + ], + [ + "▁", + "ga" + ], + [ + "▁equ", + "ipped" + ], + [ + "▁equip", + "ped" + ], + [ + "▁n", + "un" + ], + [ + "▁nu", + "n" + ], + [ + "Be", + "t" + ], + [ + "B", + "et" + ], + [ + "▁lic", + "ensed" + ], + [ + "▁license", + "d" + ], + [ + "AR", + "CH" + ], + [ + "F", + "N" + ], + [ + "▁eng", + "ines" + ], + [ + "▁engine", + "s" + ], + [ + "▁s", + "ter" + ], + [ + "▁st", + "er" + ], + [ + "▁ste", + "r" + ], + [ + "▁", + "ster" + ], + [ + "▁loc", + "ale" + ], + [ + "▁local", + "e" + ], + [ + "▁", + "locale" + ], + [ + "▁в", + "ъ" + ], + [ + "lin", + "ks" + ], + [ + "link", + "s" + ], + [ + "l", + "inks" + ], + [ + "▁Cap", + "ital" + ], + [ + "▁al", + "ien" + ], + [ + "▁ali", + "en" + ], + [ + "W", + "r" + ], + [ + "р", + "ъ" + ], + [ + "Car", + "t" + ], + [ + "C", + "art" + ], + [ + "▁Mark", + "eting" + ], + [ + "▁Market", + "ing" + ], + [ + "▁R", + "T" + ], + [ + "▁", + "RT" + ], + [ + "File", + "Name" + ], + [ + "▁t", + "i" + ], + [ + "▁", + "ti" + ], + [ + "ij", + "i" + ], + [ + "i", + "ji" + ], + [ + "▁vers", + "us" + ], + [ + "li", + "ve" + ], + [ + "liv", + "e" + ], + [ + "l", + "ive" + ], + [ + "Sy", + "m" + ], + [ + "S", + "ym" + ], + [ + "ko", + "r" + ], + [ + "k", + "or" + ], + [ + "▁e", + "mission" + ], + [ + "▁em", + "ission" + ], + [ + "um", + "m" + ], + [ + "u", + "mm" + ], + [ + "yc", + "z" + ], + [ + "y", + "cz" + ], + [ + "▁clim", + "bed" + ], + [ + "▁climb", + "ed" + ], + [ + "▁plus", + "ieurs" + ], + [ + "к", + "ри" + ], + [ + "ya", + "r" + ], + [ + "y", + "ar" + ], + [ + "os", + "ten" + ], + [ + "ost", + "en" + ], + [ + "o", + "sten" + ], + [ + "▁u", + "sb" + ], + [ + "▁us", + "b" + ], + [ + "▁", + "usb" + ], + [ + "▁cross", + "ing" + ], + [ + "▁pol", + "ynom" + ], + [ + "▁poly", + "nom" + ], + [ + "▁rem", + "oval" + ], + [ + "▁Ad", + "ams" + ], + [ + "▁Adam", + "s" + ], + [ + "▁i", + "hre" + ], + [ + "▁ih", + "re" + ], + [ + "▁ihr", + "e" + ], + [ + "an", + "den" + ], + [ + "and", + "en" + ], + [ + "ande", + "n" + ], + [ + "▁Ben", + "j" + ], + [ + "▁P", + "hill" + ], + [ + "▁Ph", + "ill" + ], + [ + "▁Phil", + "l" + ], + [ + "▁wound", + "ed" + ], + [ + "▁Cast", + "le" + ], + [ + "▁Cas", + "tle" + ], + [ + "bi", + "ld" + ], + [ + "bil", + "d" + ], + [ + "b", + "ild" + ], + [ + "An", + "notation" + ], + [ + "Process", + "or" + ], + [ + "▁t", + "in" + ], + [ + "▁ti", + "n" + ], + [ + "fo", + "lg" + ], + [ + "fol", + "g" + ], + [ + "▁Stud", + "ents" + ], + [ + "▁Student", + "s" + ], + [ + "▁Mex", + "ican" + ], + [ + "▁administr", + "ative" + ], + [ + "IL", + "ED" + ], + [ + "ILE", + "D" + ], + [ + "I", + "LED" + ], + [ + "▁con", + "qu" + ], + [ + "▁che", + "er" + ], + [ + "▁C", + "es" + ], + [ + "▁Ce", + "s" + ], + [ + "B", + "ecause" + ], + [ + "▁J", + "uni" + ], + [ + "▁Jun", + "i" + ], + [ + "▁Ju", + "ni" + ], + [ + "▁en", + "contr" + ], + [ + "av", + "i" + ], + [ + "a", + "vi" + ], + [ + "V", + "I" + ], + [ + "ak", + "u" + ], + [ + "a", + "ku" + ], + [ + "▁T", + "on" + ], + [ + "▁To", + "n" + ], + [ + "▁sm", + "oking" + ], + [ + "▁b", + "ay" + ], + [ + "▁ba", + "y" + ], + [ + "work", + "s" + ], + [ + "wor", + "ks" + ], + [ + "а", + "т" + ], + [ + "at", + "tered" + ], + [ + "att", + "ered" + ], + [ + "atter", + "ed" + ], + [ + "atte", + "red" + ], + [ + "▁Bo", + "olean" + ], + [ + "▁", + "Boolean" + ], + [ + "▁B", + "alt" + ], + [ + "▁Ba", + "lt" + ], + [ + "▁Bal", + "t" + 
], + [ + "de", + "fer" + ], + [ + "def", + "er" + ], + [ + "path", + "y" + ], + [ + "pat", + "hy" + ], + [ + "pa", + "thy" + ], + [ + "A", + "h" + ], + [ + "▁a", + "kt" + ], + [ + "▁ak", + "t" + ], + [ + "▁", + "akt" + ], + [ + "▁gover", + "nor" + ], + [ + "▁govern", + "or" + ], + [ + "P", + "ad" + ], + [ + "▁si", + "sters" + ], + [ + "▁sister", + "s" + ], + [ + "▁sist", + "ers" + ], + [ + "La", + "t" + ], + [ + "L", + "at" + ], + [ + "▁re", + "vel" + ], + [ + "▁r", + "evel" + ], + [ + "▁rev", + "el" + ], + [ + "▁reve", + "l" + ], + [ + "▁S", + "Y" + ], + [ + "▁", + "SY" + ], + [ + "it", + "os" + ], + [ + "ito", + "s" + ], + [ + "i", + "tos" + ], + [ + "▁fil", + "ters" + ], + [ + "▁filter", + "s" + ], + [ + "▁", + "filters" + ], + [ + "Ch", + "unk" + ], + [ + "con", + "sum" + ], + [ + "cons", + "um" + ], + [ + "▁rem", + "oving" + ], + [ + "▁H", + "err" + ], + [ + "▁He", + "rr" + ], + [ + "▁Her", + "r" + ], + [ + "▁gener", + "ator" + ], + [ + "▁", + "generator" + ], + [ + "▁C", + "ra" + ], + [ + "▁Cr", + "a" + ], + [ + "▁far", + "mers" + ], + [ + "▁farm", + "ers" + ], + [ + "▁farmer", + "s" + ], + [ + "▁Mem", + "bers" + ], + [ + "▁Member", + "s" + ], + [ + "▁", + "Members" + ], + [ + "▁over", + "come" + ], + [ + "▁C", + "in" + ], + [ + "▁Ci", + "n" + ], + [ + "ig", + "keit" + ], + [ + "cri", + "ptions" + ], + [ + "cription", + "s" + ], + [ + "cript", + "ions" + ], + [ + "Test", + "s" + ], + [ + "Te", + "sts" + ], + [ + "T", + "ests" + ], + [ + "▁к", + "лу" + ], + [ + "▁sh", + "ake" + ], + [ + "▁sha", + "ke" + ], + [ + "▁", + "shake" + ], + [ + "▁y", + "y" + ], + [ + "▁", + "yy" + ], + [ + "pl", + "acement" + ], + [ + "place", + "ment" + ], + [ + "plac", + "ement" + ], + [ + "▁a", + "wards" + ], + [ + "▁aw", + "ards" + ], + [ + "▁award", + "s" + ], + [ + "▁epis", + "odes" + ], + [ + "▁episode", + "s" + ], + [ + "▁Bl", + "ood" + ], + [ + "▁Blo", + "od" + ], + [ + "▁bul", + "let" + ], + [ + "▁bull", + "et" + ], + [ + "▁", + "bullet" + ], + [ + "▁v", + "iene" + ], + [ + "▁vi", + "ene" + ], + [ + "▁vie", + "ne" + ], + [ + "▁Fin", + "ancial" + ], + [ + "F", + "uture" + ], + [ + "▁r", + "ou" + ], + [ + "▁ro", + "u" + ], + [ + "▁", + "rou" + ], + [ + "▁bi", + "ologie" + ], + [ + "▁use", + "State" + ], + [ + "ia", + "ni" + ], + [ + "ian", + "i" + ], + [ + "i", + "ani" + ], + [ + "pie", + "ce" + ], + [ + "p", + "iece" + ], + [ + "▁spe", + "aker" + ], + [ + "▁speak", + "er" + ], + [ + "▁re", + "fr" + ], + [ + "▁ref", + "r" + ], + [ + "AR", + "K" + ], + [ + "▁M", + "IT" + ], + [ + "▁MI", + "T" + ], + [ + "▁", + "MIT" + ], + [ + "▁T", + "an" + ], + [ + "▁Ta", + "n" + ], + [ + "▁B", + "ased" + ], + [ + "▁Bas", + "ed" + ], + [ + "▁Base", + "d" + ], + [ + "▁Ba", + "sed" + ], + [ + "▁", + "Based" + ], + [ + "▁cult", + "iv" + ], + [ + "▁hung", + "ry" + ], + [ + "▁A", + "y" + ], + [ + "▁H", + "ey" + ], + [ + "▁He", + "y" + ], + [ + "▁", + "Hey" + ], + [ + "▁excit", + "ement" + ], + [ + "ibr", + "aries" + ], + [ + "Hi", + "t" + ], + [ + "H", + "it" + ], + [ + "▁E", + "nde" + ], + [ + "▁En", + "de" + ], + [ + "▁End", + "e" + ], + [ + "N", + "G" + ], + [ + "FI", + "L" + ], + [ + "F", + "IL" + ], + [ + ".\"", + ")" + ], + [ + ".", + "\")" + ], + [ + "F", + "amily" + ], + [ + "in", + "ery" + ], + [ + "ine", + "ry" + ], + [ + "iner", + "y" + ], + [ + "ne", + "cess" + ], + [ + "ve", + "lope" + ], + [ + "vel", + "ope" + ], + [ + "velop", + "e" + ], + [ + "▁B", + "ot" + ], + [ + "▁Bo", + "t" + ], + [ + "▁", + "Bot" + ], + [ + "port", + "er" + ], + [ + "por", + "ter" + ], + [ + "porte", + "r" + ], + [ + "p", + "orter" 
+ ], + [ + "▁cl", + "imb" + ], + [ + "▁clim", + "b" + ], + [ + "▁E", + "li" + ], + [ + "▁El", + "i" + ], + [ + "ur", + "ent" + ], + [ + "ure", + "nt" + ], + [ + "uren", + "t" + ], + [ + "u", + "rent" + ], + [ + "▁mist", + "akes" + ], + [ + "▁mistake", + "s" + ], + [ + "áb", + "an" + ], + [ + "á", + "ban" + ], + [ + "mark", + "s" + ], + [ + "mar", + "ks" + ], + [ + "m", + "arks" + ], + [ + "pk", + "t" + ], + [ + "p", + "kt" + ], + [ + "L", + "ibrary" + ], + [ + "st", + "ed" + ], + [ + "ste", + "d" + ], + [ + "s", + "ted" + ], + [ + "ublic", + "e" + ], + [ + "ubl", + "ice" + ], + [ + "▁Administr", + "ation" + ], + [ + "▁Admin", + "istration" + ], + [ + "▁sh", + "apes" + ], + [ + "▁shape", + "s" + ], + [ + "▁sha", + "pes" + ], + [ + "пу", + "бли" + ], + [ + "Go", + "d" + ], + [ + "G", + "od" + ], + [ + "in", + "nen" + ], + [ + "inn", + "en" + ], + [ + "ко", + "ло" + ], + [ + "к", + "оло" + ], + [ + "<<", + "<<" + ], + [ + "ib", + "e" + ], + [ + "i", + "be" + ], + [ + "ê", + "s" + ], + [ + "▁С", + "ША" + ], + [ + "▁Fore", + "ign" + ], + [ + "▁", + "Foreign" + ], + [ + "▁Marg", + "aret" + ], + [ + "▁g", + "ene" + ], + [ + "▁gen", + "e" + ], + [ + "▁ge", + "ne" + ], + [ + "▁dist", + "urb" + ], + [ + "▁т", + "ер" + ], + [ + "▁те", + "р" + ], + [ + "▁", + "тер" + ], + [ + "▁on", + "Click" + ], + [ + "▁Engine", + "ering" + ], + [ + "▁stop", + "ping" + ], + [ + "▁sto", + "pping" + ], + [ + "▁restr", + "ictions" + ], + [ + "▁restrict", + "ions" + ], + [ + "▁restriction", + "s" + ], + [ + ",", + "*" + ], + [ + "BU", + "F" + ], + [ + "▁sh", + "adows" + ], + [ + "▁shadow", + "s" + ], + [ + "hc", + "i" + ], + [ + "h", + "ci" + ], + [ + "▁Christ", + "ians" + ], + [ + "▁Christian", + "s" + ], + [ + "▁f", + "ence" + ], + [ + "▁fen", + "ce" + ], + [ + "▁lux", + "ury" + ], + [ + "ak", + "h" + ], + [ + "a", + "kh" + ], + [ + "co", + "ord" + ], + [ + "▁invest", + "igate" + ], + [ + "▁investig", + "ate" + ], + [ + "▁convent", + "ional" + ], + [ + "▁convention", + "al" + ], + [ + "\"", + "—" + ], + [ + "▁vis", + "its" + ], + [ + "▁visit", + "s" + ], + [ + "is", + "é" + ], + [ + "▁S", + "ac" + ], + [ + "▁Sa", + "c" + ], + [ + "class", + "Name" + ], + [ + "▁Psy", + "ch" + ], + [ + "▁ref", + "lected" + ], + [ + "▁reflect", + "ed" + ], + [ + "▁п", + "ло" + ], + [ + "▁", + "пло" + ], + [ + "▁V", + "ice" + ], + [ + "▁Vi", + "ce" + ], + [ + "▁Vic", + "e" + ], + [ + "ła", + "w" + ], + [ + "ł", + "aw" + ], + [ + "________", + "________" + ], + [ + "▁W", + "olf" + ], + [ + "▁Wol", + "f" + ], + [ + "re", + "nte" + ], + [ + "ren", + "te" + ], + [ + "rent", + "e" + ], + [ + "r", + "ente" + ], + [ + "▁Ch", + "ampion" + ], + [ + "▁sim", + "ulation" + ], + [ + "es", + "ota" + ], + [ + "eso", + "ta" + ], + [ + "▁S", + "oon" + ], + [ + "▁So", + "on" + ], + [ + "▁C", + "el" + ], + [ + "▁Ce", + "l" + ], + [ + "▁the", + "ories" + ], + [ + "▁S", + "TR" + ], + [ + "▁ST", + "R" + ], + [ + "▁", + "STR" + ], + [ + "▁collect", + "ive" + ], + [ + "▁coord", + "inate" + ], + [ + "query", + "Selector" + ], + [ + "em", + "ed" + ], + [ + "eme", + "d" + ], + [ + "e", + "med" + ], + [ + "B", + "reak" + ], + [ + "▁g", + "ef" + ], + [ + "▁ge", + "f" + ], + [ + "▁electric", + "ity" + ], + [ + "▁gather", + "ing" + ], + [ + "at", + "ers" + ], + [ + "ate", + "rs" + ], + [ + "ater", + "s" + ], + [ + "a", + "ters" + ], + [ + "ex", + "per" + ], + [ + "exp", + "er" + ], + [ + "▁R", + "oma" + ], + [ + "▁Rom", + "a" + ], + [ + "▁Ro", + "ma" + ], + [ + "▁Co", + "oper" + ], + [ + "SY", + "MBOL" + ], + [ + "v", + "d" + ], + [ + "ivers", + "ary" + ], + [ + "ain", 
+ "es" + ], + [ + "ai", + "nes" + ], + [ + "aine", + "s" + ], + [ + "a", + "ines" + ], + [ + "▁G", + "rad" + ], + [ + "▁Gr", + "ad" + ], + [ + "▁Gra", + "d" + ], + [ + "▁", + "Grad" + ], + [ + "▁independ", + "ence" + ], + [ + "wo", + "h" + ], + [ + "w", + "oh" + ], + [ + "▁con", + "sequence" + ], + [ + "▁consequ", + "ence" + ], + [ + "▁convers", + "ations" + ], + [ + "▁conversation", + "s" + ], + [ + "▁R", + "ou" + ], + [ + "▁Ro", + "u" + ], + [ + "▁and", + "ere" + ], + [ + "▁ander", + "e" + ], + [ + "▁System", + "s" + ], + [ + "га", + "р" + ], + [ + "г", + "ар" + ], + [ + "▁mo", + "ist" + ], + [ + "▁mois", + "t" + ], + [ + "fl", + "u" + ], + [ + "f", + "lu" + ], + [ + "ці", + "я" + ], + [ + "ни", + "ш" + ], + [ + "▁r", + "ode" + ], + [ + "▁ro", + "de" + ], + [ + "▁rod", + "e" + ], + [ + "▁p", + "erd" + ], + [ + "▁per", + "d" + ], + [ + "▁pe", + "rd" + ], + [ + "▁s", + "zer" + ], + [ + "▁sz", + "er" + ], + [ + "▁fl", + "ood" + ], + [ + "▁flo", + "od" + ], + [ + "▁in", + "tim" + ], + [ + "▁int", + "im" + ], + [ + "std", + "err" + ], + [ + "▁ref", + "lection" + ], + [ + "▁reflect", + "ion" + ], + [ + "Sc", + "an" + ], + [ + "S", + "can" + ], + [ + "▁dis", + "aster" + ], + [ + "ake", + "spe" + ], + [ + "akes", + "pe" + ], + [ + "▁In", + "valid" + ], + [ + "▁", + "Invalid" + ], + [ + "▁hum", + "or" + ], + [ + "▁Fried", + "rich" + ], + [ + "▁suggest", + "ions" + ], + [ + "▁suggestion", + "s" + ], + [ + "uv", + "ud" + ], + [ + "De", + "lay" + ], + [ + "Del", + "ay" + ], + [ + "br", + "ief" + ], + [ + "b", + "rief" + ], + [ + "▁и", + "с" + ], + [ + "▁", + "ис" + ], + [ + "gl", + "ied" + ], + [ + "fa", + "s" + ], + [ + "f", + "as" + ], + [ + "▁S", + "mart" + ], + [ + "▁Sm", + "art" + ], + [ + "▁m", + "edi" + ], + [ + "▁me", + "di" + ], + [ + "▁med", + "i" + ], + [ + "▁", + "medi" + ], + [ + "sd", + "k" + ], + [ + "s", + "dk" + ], + [ + "▁se", + "us" + ], + [ + "▁seu", + "s" + ], + [ + "▁A", + "rizona" + ], + [ + "▁innoc", + "ent" + ], + [ + "War", + "n" + ], + [ + "W", + "arn" + ], + [ + "ac", + "ious" + ], + [ + "aci", + "ous" + ], + [ + "acio", + "us" + ], + [ + "▁Mos", + "cow" + ], + [ + "▁c", + "aps" + ], + [ + "▁cap", + "s" + ], + [ + "▁ca", + "ps" + ], + [ + "▁", + "caps" + ], + [ + "Dele", + "gate" + ], + [ + "▁dram", + "atic" + ], + [ + "bo", + "oks" + ], + [ + "book", + "s" + ], + [ + "▁sh", + "ore" + ], + [ + "▁", + "shore" + ], + [ + "uk", + "i" + ], + [ + "u", + "ki" + ], + [ + "▁Russ", + "ell" + ], + [ + "▁cor", + "relation" + ], + [ + "▁corre", + "lation" + ], + [ + "▁correl", + "ation" + ], + [ + "He", + "lp" + ], + [ + "Hel", + "p" + ], + [ + "▁pub", + "blic" + ], + [ + "zy", + "m" + ], + [ + "z", + "ym" + ], + [ + "com", + "b" + ], + [ + "co", + "mb" + ], + [ + "c", + "omb" + ], + [ + "E", + "Y" + ], + [ + "LEN", + "GTH" + ], + [ + "▁M", + "ün" + ], + [ + "▁_", + "." + ], + [ + "▁", + "_." 
+ ], + [ + "▁f", + "erm" + ], + [ + "▁fe", + "rm" + ], + [ + "▁fer", + "m" + ], + [ + "▁I", + "an" + ], + [ + "▁St", + "udio" + ], + [ + "▁Stud", + "io" + ], + [ + "▁aff", + "airs" + ], + [ + "▁affair", + "s" + ], + [ + "lo", + "s" + ], + [ + "l", + "os" + ], + [ + "Rule", + "s" + ], + [ + "R", + "ules" + ], + [ + "run", + "ning" + ], + [ + "r", + "unning" + ], + [ + "▁Post", + "ed" + ], + [ + "▁Po", + "sted" + ], + [ + "▁Pos", + "ted" + ], + [ + "P", + "ixel" + ], + [ + "▁d", + "ancing" + ], + [ + "▁dan", + "cing" + ], + [ + "▁agree", + "ments" + ], + [ + "▁agre", + "ements" + ], + [ + "▁agreement", + "s" + ], + [ + "▁P", + "ic" + ], + [ + "▁Pi", + "c" + ], + [ + "an", + "cia" + ], + [ + "anc", + "ia" + ], + [ + "a", + "ncia" + ], + [ + "▁m", + "á" + ], + [ + "ation", + "Token" + ], + [ + "des", + "criptor" + ], + [ + "▁C", + "arter" + ], + [ + "▁Car", + "ter" + ], + [ + "▁Cart", + "er" + ], + [ + "Re", + "lease" + ], + [ + "Rele", + "ase" + ], + [ + "****", + "********" + ], + [ + "********", + "****" + ], + [ + "******", + "******" + ], + [ + "▁out", + "standing" + ], + [ + "ch", + "anges" + ], + [ + "change", + "s" + ], + [ + "chan", + "ges" + ], + [ + "AR", + "RAY" + ], + [ + "▁Bar", + "bara" + ], + [ + "▁Barb", + "ara" + ], + [ + "▁nur", + "se" + ], + [ + "▁nurs", + "e" + ], + [ + "(", + "\r" + ], + [ + "▁Dou", + "glas" + ], + [ + "▁Doug", + "las" + ], + [ + "▁nu", + "cle" + ], + [ + "▁nuc", + "le" + ], + [ + "ou", + "ri" + ], + [ + "our", + "i" + ], + [ + "o", + "uri" + ], + [ + "▁St", + "yle" + ], + [ + "▁", + "Style" + ], + [ + "av", + "o" + ], + [ + "a", + "vo" + ], + [ + "▁pain", + "ful" + ], + [ + "▁s", + "lic" + ], + [ + "▁sl", + "ic" + ], + [ + "▁sein", + "em" + ], + [ + "▁seine", + "m" + ], + [ + "▁sei", + "nem" + ], + [ + "SUP", + "PORT" + ], + [ + "og", + "ene" + ], + [ + "ogen", + "e" + ], + [ + "oge", + "ne" + ], + [ + "▁sat", + "ell" + ], + [ + "ta", + "gon" + ], + [ + "tag", + "on" + ], + [ + "t", + "agon" + ], + [ + "▁coll", + "apse" + ], + [ + "▁", + "collapse" + ], + [ + "ve", + "lle" + ], + [ + "vel", + "le" + ], + [ + "v", + "elle" + ], + [ + "MO", + "N" + ], + [ + "M", + "ON" + ], + [ + "augh", + "ters" + ], + [ + "aught", + "ers" + ], + [ + "aughter", + "s" + ], + [ + "▁threat", + "ened" + ], + [ + "▁Il", + "legal" + ], + [ + "▁desper", + "ate" + ], + [ + "st", + "rict" + ], + [ + "str", + "ict" + ], + [ + "stri", + "ct" + ], + [ + "ru", + "s" + ], + [ + "r", + "us" + ], + [ + "сти", + "ту" + ], + [ + "\\\"", + ":" + ], + [ + "\\", + "\":" + ], + [ + "▁conf", + "lic" + ], + [ + "down", + "load" + ], + [ + "at", + "os" + ], + [ + "ato", + "s" + ], + [ + "a", + "tos" + ], + [ + "▁Pos", + "ition" + ], + [ + "▁", + "Position" + ], + [ + ".*", + ";" + ], + [ + ".", + "*;" + ], + [ + "▁the", + "ater" + ], + [ + "▁ple", + "asant" + ], + [ + "▁C", + "ette" + ], + [ + "▁Sing", + "apore" + ], + [ + "he", + "et" + ], + [ + "▁p", + "ir" + ], + [ + "▁pi", + "r" + ], + [ + "▁ac", + "quis" + ], + [ + "▁acqu", + "is" + ], + [ + "▁на", + "зва" + ], + [ + "те", + "ля" + ], + [ + "тел", + "я" + ], + [ + "▁rec", + "ru" + ], + [ + "же", + "ния" + ], + [ + "ё", + "л" + ], + [ + "вер", + "сите" + ], + [ + "▁res", + "pective" + ], + [ + "▁respect", + "ive" + ], + [ + "▁t", + "unnel" + ], + [ + "▁tun", + "nel" + ], + [ + "▁tunn", + "el" + ], + [ + "▁De", + "an" + ], + [ + "D", + "u" + ], + [ + "▁un", + "cle" + ], + [ + "▁unc", + "le" + ], + [ + "▁off", + "ensive" + ], + [ + "co", + "lo" + ], + [ + "col", + "o" + ], + [ + "c", + "olo" + ], + [ + "▁Un", + "like" + ], + [ + "se", + "ries" 
+ ], + [ + "ser", + "ies" + ], + [ + "serie", + "s" + ], + [ + "s", + "eries" + ], + [ + "▁A", + "rn" + ], + [ + "▁Ar", + "n" + ], + [ + "min", + "ute" + ], + [ + "▁des", + "criptor" + ], + [ + "▁", + "descriptor" + ], + [ + "▁st", + "ones" + ], + [ + "▁stone", + "s" + ], + [ + "▁sto", + "nes" + ], + [ + "IC", + "ATION" + ], + [ + "▁P", + "ad" + ], + [ + "▁Pa", + "d" + ], + [ + "▁", + "Pad" + ], + [ + "▁i", + "Phone" + ], + [ + "e", + "i" + ], + [ + "▁fant", + "asy" + ], + [ + "▁Kore", + "an" + ], + [ + "▁Korea", + "n" + ], + [ + "\"", + "}" + ], + [ + "▁or", + "th" + ], + [ + "▁", + "orth" + ], + [ + "hal", + "ten" + ], + [ + "halt", + "en" + ], + [ + "de", + "ep" + ], + [ + "▁K", + "ay" + ], + [ + "▁Ka", + "y" + ], + [ + "requ", + "ency" + ], + [ + "▁du", + "ties" + ], + [ + "▁dut", + "ies" + ], + [ + "aw", + "t" + ], + [ + "a", + "wt" + ], + [ + "▁ne", + "arest" + ], + [ + "▁near", + "est" + ], + [ + "▁dis", + "order" + ], + [ + "ст", + "ру" + ], + [ + "▁Ch", + "ile" + ], + [ + "▁Chi", + "le" + ], + [ + "▁s", + "eq" + ], + [ + "▁se", + "q" + ], + [ + "▁", + "seq" + ], + [ + "▁transport", + "ation" + ], + [ + "O", + "O" + ], + [ + "▁D", + "ez" + ], + [ + "▁De", + "z" + ], + [ + "ij", + "u" + ], + [ + "i", + "ju" + ], + [ + "▁Result", + "s" + ], + [ + "▁", + "Results" + ], + [ + "je", + "d" + ], + [ + "j", + "ed" + ], + [ + "iv", + "el" + ], + [ + "ive", + "l" + ], + [ + "i", + "vel" + ], + [ + "HO", + "ST" + ], + [ + "H", + "OST" + ], + [ + "▁", + "€" + ], + [ + "▁", + "Î" + ], + [ + "▁c", + "hin" + ], + [ + "▁ch", + "in" + ], + [ + "▁chi", + "n" + ], + [ + "▁m", + "att" + ], + [ + "▁mat", + "t" + ], + [ + "▁ma", + "tt" + ], + [ + "▁v", + "oted" + ], + [ + "▁vo", + "ted" + ], + [ + "▁vote", + "d" + ], + [ + "▁vot", + "ed" + ], + [ + "▁ge", + "hör" + ], + [ + "▁s", + "ue" + ], + [ + "▁su", + "e" + ], + [ + "▁leg", + "acy" + ], + [ + "в", + "ся" + ], + [ + "SO", + "URCE" + ], + [ + "W", + "ORK" + ], + [ + "it", + "is" + ], + [ + "iti", + "s" + ], + [ + "▁$", + "|" + ], + [ + "▁о", + "бо" + ], + [ + "▁об", + "о" + ], + [ + "▁n", + "r" + ], + [ + "▁", + "nr" + ], + [ + "▁T", + "amb" + ], + [ + "▁Ta", + "mb" + ], + [ + "▁Tam", + "b" + ], + [ + "▁sn", + "ap" + ], + [ + "▁", + "snap" + ], + [ + "▁im", + "pressed" + ], + [ + "▁imp", + "ressed" + ], + [ + "▁impress", + "ed" + ], + [ + "▁depos", + "it" + ], + [ + "▁d", + "ivid" + ], + [ + "▁di", + "vid" + ], + [ + "▁div", + "id" + ], + [ + "Seg", + "ment" + ], + [ + "▁к", + "ар" + ], + [ + "▁ка", + "р" + ], + [ + "▁", + "кар" + ], + [ + "▁G", + "as" + ], + [ + "▁Ga", + "s" + ], + [ + "▁cr", + "imes" + ], + [ + "▁crim", + "es" + ], + [ + "▁crime", + "s" + ], + [ + "▁cri", + "mes" + ], + [ + "▁ins", + "ult" + ], + [ + "▁H", + "um" + ], + [ + "▁Hu", + "m" + ], + [ + "▁bound", + "ed" + ], + [ + "▁k", + "icked" + ], + [ + "▁kick", + "ed" + ], + [ + "▁М", + "у" + ], + [ + "▁|", + "\\" + ], + [ + "▁", + "|\\" + ], + [ + "ad", + "ded" + ], + [ + "add", + "ed" + ], + [ + "Pro", + "du" + ], + [ + "P", + "rodu" + ], + [ + "▁.", + "/" + ], + [ + "▁", + "./" + ], + [ + "▁awk", + "ward" + ], + [ + "▁К", + "ра" + ], + [ + "▁", + "ї" + ], + [ + "▁CON", + "TR" + ], + [ + "▁be", + "im" + ], + [ + "▁bei", + "m" + ], + [ + "▁place", + "holder" + ], + [ + "▁", + "placeholder" + ], + [ + "sp", + "i" + ], + [ + "s", + "pi" + ], + [ + "▁B", + "ei" + ], + [ + "▁Be", + "i" + ], + [ + "▁P", + "f" + ], + [ + "ient", + "es" + ], + [ + "ien", + "tes" + ], + [ + "iente", + "s" + ], + [ + "i", + "entes" + ], + [ + "dis", + "k" + ], + [ + "di", + "sk" + ], + [ + "d", + "isk" + ], 
+ [ + "bl", + "k" + ], + [ + "ne", + "o" + ], + [ + "it", + "arian" + ], + [ + "ita", + "rian" + ], + [ + "itar", + "ian" + ], + [ + "▁c", + "ogn" + ], + [ + "▁co", + "gn" + ], + [ + "▁s", + "out" + ], + [ + "▁so", + "ut" + ], + [ + "▁sou", + "t" + ], + [ + "▁tr", + "ash" + ], + [ + "▁tra", + "sh" + ], + [ + "▁tras", + "h" + ], + [ + "▁R", + "ab" + ], + [ + "▁Ra", + "b" + ], + [ + "▁dec", + "line" + ], + [ + "▁decl", + "ine" + ], + [ + "ta", + "t" + ], + [ + "t", + "at" + ], + [ + "▁comb", + "ine" + ], + [ + "▁T", + "ot" + ], + [ + "▁To", + "t" + ], + [ + "▁dr", + "ops" + ], + [ + "▁dro", + "ps" + ], + [ + "▁drop", + "s" + ], + [ + "Time", + "s" + ], + [ + "Tim", + "es" + ], + [ + "T", + "imes" + ], + [ + "ched", + "uler" + ], + [ + "chedul", + "er" + ], + [ + "▁govern", + "ments" + ], + [ + "▁government", + "s" + ], + [ + "Te", + "x" + ], + [ + "T", + "ex" + ], + [ + "▁U", + "sed" + ], + [ + "▁Us", + "ed" + ], + [ + "▁Use", + "d" + ], + [ + "▁", + "Used" + ], + [ + "за", + "н" + ], + [ + "з", + "ан" + ], + [ + "▁p", + "d" + ], + [ + "▁", + "pd" + ], + [ + "ме", + "т" + ], + [ + "м", + "ет" + ], + [ + "▁&=", + "&" + ], + [ + "▁N", + "ag" + ], + [ + "▁Na", + "g" + ], + [ + "▁до", + "л" + ], + [ + "▁", + "дол" + ], + [ + "▁Al", + "ways" + ], + [ + "rt", + "c" + ], + [ + "r", + "tc" + ], + [ + "ск", + "е" + ], + [ + "с", + "ке" + ], + [ + "▁perform", + "ances" + ], + [ + "▁performance", + "s" + ], + [ + "rupt", + "ed" + ], + [ + "rup", + "ted" + ], + [ + "▁д", + "ва" + ], + [ + "▁man", + "agers" + ], + [ + "▁manager", + "s" + ], + [ + "▁manage", + "rs" + ], + [ + "▁P", + "itt" + ], + [ + "▁Pi", + "tt" + ], + [ + "▁myst", + "ery" + ], + [ + "▁myster", + "y" + ], + [ + "▁set", + "tle" + ], + [ + "▁sett", + "le" + ], + [ + "ul", + "se" + ], + [ + "uls", + "e" + ], + [ + "cr", + "oss" + ], + [ + "cro", + "ss" + ], + [ + "c", + "ross" + ], + [ + "quest", + "ion" + ], + [ + "as", + "ha" + ], + [ + "ash", + "a" + ], + [ + "a", + "sha" + ], + [ + "se", + "ed" + ], + [ + "see", + "d" + ], + [ + "s", + "eed" + ], + [ + "ur", + "able" + ], + [ + "ura", + "ble" + ], + [ + "Fin", + "al" + ], + [ + "Fi", + "nal" + ], + [ + "F", + "inal" + ], + [ + "++", + "++" + ], + [ + "input", + "s" + ], + [ + "▁back", + "up" + ], + [ + "▁", + "backup" + ], + [ + "▁Le", + "arning" + ], + [ + "▁Lear", + "ning" + ], + [ + "▁Learn", + "ing" + ], + [ + "▁*", + "," + ], + [ + "▁", + "*," + ], + [ + "lo", + "go" + ], + [ + "log", + "o" + ], + [ + "l", + "ogo" + ], + [ + "▁se", + "inen" + ], + [ + "▁sein", + "en" + ], + [ + "▁seine", + "n" + ], + [ + "▁sei", + "nen" + ], + [ + "▁vulner", + "able" + ], + [ + "direct", + "ory" + ], + [ + "i", + "ë" + ], + [ + "▁friend", + "ship" + ], + [ + "▁friends", + "hip" + ], + [ + "t", + "u" + ], + [ + "▁V", + "ec" + ], + [ + "▁Ve", + "c" + ], + [ + "▁", + "Vec" + ], + [ + "rif", + "ice" + ], + [ + "rific", + "e" + ], + [ + "▁б", + "ра" + ], + [ + "▁", + "бра" + ], + [ + "▁inv", + "olve" + ], + [ + "▁invol", + "ve" + ], + [ + "TO", + "N" + ], + [ + "T", + "ON" + ], + [ + "▁cor", + "rid" + ], + [ + "se", + "par" + ], + [ + "sep", + "ar" + ], + [ + "Dest", + "roy" + ], + [ + "▁j", + "ul" + ], + [ + "▁ju", + "l" + ], + [ + "▁inequ", + "ality" + ], + [ + "▁a", + "in" + ], + [ + "▁ai", + "n" + ], + [ + "▁", + "ain" + ], + [ + "he", + "x" + ], + [ + "h", + "ex" + ], + [ + "▁w", + "ider" + ], + [ + "▁wide", + "r" + ], + [ + "▁wid", + "er" + ], + [ + "те", + "ли" + ], + [ + "тел", + "и" + ], + [ + "▁j", + "ack" + ], + [ + "▁ja", + "ck" + ], + [ + "▁", + "jack" + ], + [ + "▁qu", + "ot" + ], + [ + 
"▁", + "quot" + ], + [ + "▁G", + "len" + ], + [ + "▁Gl", + "en" + ], + [ + "▁Gle", + "n" + ], + [ + "init", + "ely" + ], + [ + "ih", + "ood" + ], + [ + "i", + "hood" + ], + [ + "▁wa", + "ist" + ], + [ + "▁Man", + "chester" + ], + [ + "reg", + "ular" + ], + [ + "▁(", + "&" + ], + [ + "▁", + "(&" + ], + [ + "▁mass", + "es" + ], + [ + "▁mas", + "ses" + ], + [ + "▁DE", + "FAULT" + ], + [ + "▁", + "DEFAULT" + ], + [ + "▁ch", + "airs" + ], + [ + "▁chair", + "s" + ], + [ + "▁cha", + "irs" + ], + [ + "▁F", + "ast" + ], + [ + "▁Fa", + "st" + ], + [ + "▁", + "Fast" + ], + [ + "▁c", + "itt" + ], + [ + "▁cit", + "t" + ], + [ + "▁ci", + "tt" + ], + [ + "_{", + "{\\" + ], + [ + "_", + "{{\\" + ], + [ + "o", + "a" + ], + [ + "▁$", + "\\{" + ], + [ + "▁$\\", + "{" + ], + [ + "▁se", + "eds" + ], + [ + "▁see", + "ds" + ], + [ + "▁seed", + "s" + ], + [ + "▁A", + "ld" + ], + [ + "▁Al", + "d" + ], + [ + "▁B", + "att" + ], + [ + "▁Ba", + "tt" + ], + [ + "▁Bat", + "t" + ], + [ + "fa", + "b" + ], + [ + "f", + "ab" + ], + [ + "▁democr", + "acy" + ], + [ + "DT", + "O" + ], + [ + "D", + "TO" + ], + [ + "▁H", + "ij" + ], + [ + "▁Hi", + "j" + ], + [ + "PT", + "R" + ], + [ + "P", + "TR" + ], + [ + "N", + "a" + ], + [ + "▁Har", + "vard" + ], + [ + "si", + "d" + ], + [ + "s", + "id" + ], + [ + "Pr", + "ed" + ], + [ + "Pre", + "d" + ], + [ + "P", + "red" + ], + [ + "fer", + "s" + ], + [ + "fe", + "rs" + ], + [ + "f", + "ers" + ], + [ + "▁s", + "pare" + ], + [ + "▁sp", + "are" + ], + [ + "AM", + "P" + ], + [ + "A", + "MP" + ], + [ + "▁g", + "roupe" + ], + [ + "▁group", + "e" + ], + [ + "▁s", + "ender" + ], + [ + "▁se", + "nder" + ], + [ + "▁send", + "er" + ], + [ + "▁sen", + "der" + ], + [ + "▁", + "sender" + ], + [ + "▁Christ", + "opher" + ], + [ + "▁prison", + "ers" + ], + [ + "▁prisoner", + "s" + ], + [ + "▁K", + "er" + ], + [ + "▁Ke", + "r" + ], + [ + "▁C", + "rist" + ], + [ + "▁Cr", + "ist" + ], + [ + "▁Cris", + "t" + ], + [ + "▁A", + "LL" + ], + [ + "▁AL", + "L" + ], + [ + "▁", + "ALL" + ], + [ + "ri", + "ce" + ], + [ + "ric", + "e" + ], + [ + "r", + "ice" + ], + [ + "▁an", + "tes" + ], + [ + "▁ant", + "es" + ], + [ + "▁ante", + "s" + ], + [ + "▁", + "antes" + ], + [ + "nat", + "ural" + ], + [ + "▁Su", + "san" + ], + [ + "▁Sus", + "an" + ], + [ + "▁J", + "uli" + ], + [ + "▁Jul", + "i" + ], + [ + "▁Ju", + "li" + ], + [ + "▁di", + "ab" + ], + [ + "▁dia", + "b" + ], + [ + "ix", + "on" + ], + [ + "ic", + "ator" + ], + [ + "ica", + "tor" + ], + [ + "▁flex", + "ible" + ], + [ + "▁re", + "serve" + ], + [ + "▁res", + "erve" + ], + [ + "▁reserv", + "e" + ], + [ + "Cont", + "ains" + ], + [ + "▁H", + "il" + ], + [ + "▁Hi", + "l" + ], + [ + "▁I", + "sa" + ], + [ + "▁Is", + "a" + ], + [ + "▁town", + "s" + ], + [ + "▁tow", + "ns" + ], + [ + "G", + "S" + ], + [ + "▁T", + "rad" + ], + [ + "▁Tr", + "ad" + ], + [ + "▁Tra", + "d" + ], + [ + "▁L", + "ock" + ], + [ + "▁Loc", + "k" + ], + [ + "▁Lo", + "ck" + ], + [ + "▁", + "Lock" + ], + [ + "▁G", + "rund" + ], + [ + "▁Gr", + "und" + ], + [ + "▁Gru", + "nd" + ], + [ + "▁crit", + "icism" + ], + [ + "▁critic", + "ism" + ], + [ + "н", + "ю" + ], + [ + "▁c", + "ă" + ], + [ + "▁polit", + "ician" + ], + [ + "st", + "able" + ], + [ + "sta", + "ble" + ], + [ + "s", + "table" + ], + [ + "Ac", + "cept" + ], + [ + "Sum", + "mary" + ], + [ + "▁tamb", + "ém" + ], + [ + "▁també", + "m" + ], + [ + "}^", + "{-" + ], + [ + "}^{", + "-" + ], + [ + "}", + "^{-" + ], + [ + "▁I", + "M" + ], + [ + "▁", + "IM" + ], + [ + "id", + "al" + ], + [ + "ida", + "l" + ], + [ + "i", + "dal" + ], + [ + "мо", + "р" + ], + 
[ + "м", + "ор" + ], + [ + "Bl", + "ue" + ], + [ + "GRO", + "UP" + ], + [ + "▁term", + "inal" + ], + [ + "▁termin", + "al" + ], + [ + "▁complex", + "ity" + ], + [ + "▁loc", + "ally" + ], + [ + "▁local", + "ly" + ], + [ + "DO", + "WN" + ], + [ + "▁N", + "ear" + ], + [ + "▁Ne", + "ar" + ], + [ + "Dep", + "th" + ], + [ + "▁p", + "ole" + ], + [ + "▁pol", + "e" + ], + [ + "▁po", + "le" + ], + [ + "▁e", + "quality" + ], + [ + "▁equ", + "ality" + ], + [ + "▁equal", + "ity" + ], + [ + "Si", + "te" + ], + [ + "S", + "ite" + ], + [ + "▁is", + "instance" + ], + [ + "Sp", + "eed" + ], + [ + "Spe", + "ed" + ], + [ + "S", + "peed" + ], + [ + "ip", + "pi" + ], + [ + "ipp", + "i" + ], + [ + ",", + "&" + ], + [ + "▁E", + "nc" + ], + [ + "▁En", + "c" + ], + [ + "▁", + "Enc" + ], + [ + "ще", + "н" + ], + [ + "щ", + "ен" + ], + [ + "▁m", + "ater" + ], + [ + "▁mat", + "er" + ], + [ + "▁ma", + "ter" + ], + [ + "▁mate", + "r" + ], + [ + "▁sl", + "aves" + ], + [ + "▁slave", + "s" + ], + [ + "▁sla", + "ves" + ], + [ + "AC", + "TION" + ], + [ + "ACT", + "ION" + ], + [ + "A", + "CTION" + ], + [ + "usal", + "em" + ], + [ + "usa", + "lem" + ], + [ + "▁h", + "az" + ], + [ + "▁ha", + "z" + ], + [ + "▁Be", + "at" + ], + [ + "▁w", + "rest" + ], + [ + "▁wr", + "est" + ], + [ + "▁l", + "lam" + ], + [ + "▁ll", + "am" + ], + [ + "In", + "s" + ], + [ + "I", + "ns" + ], + [ + "ми", + "на" + ], + [ + "▁бу", + "в" + ], + [ + "▁Fr", + "ame" + ], + [ + "▁Fra", + "me" + ], + [ + "▁", + "Frame" + ], + [ + "us", + "hes" + ], + [ + "ush", + "es" + ], + [ + "▁virtual", + "ly" + ], + [ + "▁virt", + "ually" + ], + [ + "▁P", + "erm" + ], + [ + "▁Per", + "m" + ], + [ + "▁Pe", + "rm" + ], + [ + "▁", + "Perm" + ], + [ + "▁we", + "ights" + ], + [ + "▁weight", + "s" + ], + [ + "▁weigh", + "ts" + ], + [ + "▁", + "weights" + ], + [ + "▁ll", + "vm" + ], + [ + "▁", + "llvm" + ], + [ + "▁c", + "ave" + ], + [ + "▁ca", + "ve" + ], + [ + "▁cav", + "e" + ], + [ + "st", + "ates" + ], + [ + "state", + "s" + ], + [ + "stat", + "es" + ], + [ + "sta", + "tes" + ], + [ + "DM", + "A" + ], + [ + "D", + "MA" + ], + [ + "el", + "lt" + ], + [ + "ell", + "t" + ], + [ + "if", + "act" + ], + [ + "ifa", + "ct" + ], + [ + "i", + "fact" + ], + [ + "v", + "endor" + ], + [ + "▁E", + "mma" + ], + [ + "▁Em", + "ma" + ], + [ + "Loc", + "ale" + ], + [ + "Local", + "e" + ], + [ + "▁S", + "ET" + ], + [ + "▁SE", + "T" + ], + [ + "▁", + "SET" + ], + [ + "▁ge", + "ometry" + ], + [ + "▁", + "geometry" + ], + [ + "St", + "yles" + ], + [ + "Style", + "s" + ], + [ + "▁Ref", + "eree" + ], + [ + "▁Refer", + "ee" + ], + [ + "▁we", + "it" + ], + [ + "fi", + "ca" + ], + [ + "fic", + "a" + ], + [ + "f", + "ica" + ], + [ + "▁a", + "ds" + ], + [ + "▁ad", + "s" + ], + [ + "▁", + "ads" + ], + [ + "gr", + "ay" + ], + [ + "gra", + "y" + ], + [ + "g", + "ray" + ], + [ + "▁B", + "urg" + ], + [ + "▁Bur", + "g" + ], + [ + "▁Bu", + "rg" + ], + [ + "ion", + "a" + ], + [ + "io", + "na" + ], + [ + "i", + "ona" + ], + [ + "dag", + "ger" + ], + [ + "d", + "agger" + ], + [ + "▁Jan", + "uar" + ], + [ + "де", + "й" + ], + [ + "д", + "ей" + ], + [ + "ister", + "schaft" + ], + [ + "pp", + "o" + ], + [ + "p", + "po" + ], + [ + "oid", + "s" + ], + [ + "oi", + "ds" + ], + [ + "o", + "ids" + ], + [ + "▁dé", + "part" + ], + [ + "Sh", + "ader" + ], + [ + "▁con", + "straint" + ], + [ + "▁constr", + "aint" + ], + [ + "▁", + "constraint" + ], + [ + "Se", + "cret" + ], + [ + "Sec", + "ret" + ], + [ + "▁P", + "eters" + ], + [ + "▁Pe", + "ters" + ], + [ + "▁Peter", + "s" + ], + [ + "▁Pet", + "ers" + ], + [ + "▁Pete", + 
"rs" + ], + [ + "▁ey", + "eb" + ], + [ + "▁eye", + "b" + ], + [ + "▁m", + "esh" + ], + [ + "▁me", + "sh" + ], + [ + "▁mes", + "h" + ], + [ + "▁", + "mesh" + ], + [ + "▁c", + "ookie" + ], + [ + "▁cook", + "ie" + ], + [ + "▁", + "cookie" + ], + [ + "▁P", + "ick" + ], + [ + "▁Pic", + "k" + ], + [ + "▁Pi", + "ck" + ], + [ + "▁n", + "ick" + ], + [ + "▁ni", + "ck" + ], + [ + "▁nic", + "k" + ], + [ + "▁", + "nick" + ], + [ + "by", + "e" + ], + [ + "b", + "ye" + ], + [ + "▁sav", + "ings" + ], + [ + "▁saving", + "s" + ], + [ + "Tr", + "y" + ], + [ + "T", + "ry" + ], + [ + "py", + "thon" + ], + [ + "▁p", + "atri" + ], + [ + "▁pat", + "ri" + ], + [ + "▁pa", + "tri" + ], + [ + "▁mult", + "ip" + ], + [ + "▁multi", + "p" + ], + [ + "▁mul", + "tip" + ], + [ + "▁", + "multip" + ], + [ + "▁k", + "inda" + ], + [ + "▁kind", + "a" + ], + [ + "▁kin", + "da" + ], + [ + "▁'", + "_" + ], + [ + "▁", + "'_" + ], + [ + "▁Fr", + "anz" + ], + [ + "▁Fran", + "z" + ], + [ + "▁cl", + "oth" + ], + [ + "▁clo", + "th" + ], + [ + "зу", + "льта" + ], + [ + "▁fle", + "et" + ], + [ + "▁human", + "ity" + ], + [ + "re", + "sa" + ], + [ + "res", + "a" + ], + [ + "r", + "esa" + ], + [ + "bl", + "ob" + ], + [ + "blo", + "b" + ], + [ + "▁T", + "X" + ], + [ + "▁", + "TX" + ], + [ + "▁B", + "uch" + ], + [ + "▁Bu", + "ch" + ], + [ + "▁Buc", + "h" + ], + [ + "▁L", + "ond" + ], + [ + "▁Lo", + "nd" + ], + [ + "▁val", + "ley" + ], + [ + "▁m", + "urm" + ], + [ + "▁mur", + "m" + ], + [ + "▁mu", + "rm" + ], + [ + "▁T", + "rade" + ], + [ + "▁Tr", + "ade" + ], + [ + "▁Tra", + "de" + ], + [ + "▁Trad", + "e" + ], + [ + "line", + "width" + ], + [ + "▁e", + "special" + ], + [ + "▁espec", + "ial" + ], + [ + "up", + "per" + ], + [ + "upp", + "er" + ], + [ + "▁h", + "osp" + ], + [ + "▁ho", + "sp" + ], + [ + "▁t", + "anto" + ], + [ + "▁tan", + "to" + ], + [ + "▁tant", + "o" + ], + [ + "▁old", + "est" + ], + [ + "▁ol", + "dest" + ], + [ + "▁R", + "oose" + ], + [ + "▁Ro", + "ose" + ], + [ + "▁h", + "itting" + ], + [ + "▁hit", + "ting" + ], + [ + "do", + "g" + ], + [ + "d", + "og" + ], + [ + "ov", + "i" + ], + [ + "o", + "vi" + ], + [ + "},", + "\r" + ], + [ + "}", + ",\r" + ], + [ + "▁compat", + "ible" + ], + [ + "▁", + "compatible" + ], + [ + "▁We", + "bsite" + ], + [ + "▁Web", + "site" + ], + [ + "po", + "ch" + ], + [ + "p", + "och" + ], + [ + "▁B", + "ag" + ], + [ + "▁Ba", + "g" + ], + [ + "▁", + "Bag" + ], + [ + "▁accompl", + "ish" + ], + [ + "▁accomp", + "lish" + ], + [ + "Ch", + "rist" + ], + [ + "as", + "set" + ], + [ + "ass", + "et" + ], + [ + "asse", + "t" + ], + [ + "▁U", + "ntil" + ], + [ + "▁Un", + "til" + ], + [ + "▁", + "Until" + ], + [ + "▁g", + "eld" + ], + [ + "▁ge", + "ld" + ], + [ + "▁gel", + "d" + ], + [ + "List", + "en" + ], + [ + "Li", + "sten" + ], + [ + "L", + "isten" + ], + [ + "S", + "B" + ], + [ + "Set", + "up" + ], + [ + "ic", + "ia" + ], + [ + "ici", + "a" + ], + [ + "i", + "cia" + ], + [ + "▁l", + "um" + ], + [ + "▁lu", + "m" + ], + [ + "▁jan", + "vier" + ], + [ + "PA", + "GE" + ], + [ + "P", + "AGE" + ], + [ + "▁N", + "u" + ], + [ + "/", + "\"" + ], + [ + "▁divor", + "ce" + ], + [ + "Ex", + "ecute" + ], + [ + "Execut", + "e" + ], + [ + "Exec", + "ute" + ], + [ + "De", + "pend" + ], + [ + "Dep", + "end" + ], + [ + "▁Scott", + "ish" + ], + [ + "▁T", + "s" + ], + [ + "ru", + "ppe" + ], + [ + "rup", + "pe" + ], + [ + "▁ref", + "use" + ], + [ + "▁Ok", + "tober" + ], + [ + "ij", + "k" + ], + [ + "i", + "jk" + ], + [ + "▁A", + "my" + ], + [ + "▁Am", + "y" + ], + [ + "▁di", + "min" + ], + [ + "▁dim", + "in" + ], + [ + "▁g", + "ross" 
+ ], + [ + "▁gr", + "oss" + ], + [ + "▁gro", + "ss" + ], + [ + "▁t", + "rat" + ], + [ + "▁tr", + "at" + ], + [ + "▁tra", + "t" + ], + [ + "is", + "ible" + ], + [ + "isi", + "ble" + ], + [ + "mix", + "er" + ], + [ + "m", + "ixer" + ], + [ + "▁aut", + "res" + ], + [ + "▁au", + "tres" + ], + [ + "▁autre", + "s" + ], + [ + "▁", + "autres" + ], + [ + "▁ne", + "at" + ], + [ + "▁ot", + "ros" + ], + [ + "▁otro", + "s" + ], + [ + "Vo", + "id" + ], + [ + "V", + "oid" + ], + [ + "▁sc", + "hol" + ], + [ + "▁sch", + "ol" + ], + [ + "▁Wal", + "ker" + ], + [ + "▁Walk", + "er" + ], + [ + "▁t", + "ube" + ], + [ + "▁tu", + "be" + ], + [ + "▁tub", + "e" + ], + [ + "olog", + "ists" + ], + [ + "ologist", + "s" + ], + [ + "▁г", + "руп" + ], + [ + "▁гру", + "п" + ], + [ + "▁h", + "aben" + ], + [ + "▁hab", + "en" + ], + [ + "▁ha", + "ben" + ], + [ + "ub", + "er" + ], + [ + "ube", + "r" + ], + [ + "u", + "ber" + ], + [ + "ACT", + "IVE" + ], + [ + "▁Att", + "endance" + ], + [ + "▁о", + "п" + ], + [ + "▁bl", + "ade" + ], + [ + "opl", + "us" + ], + [ + "o", + "plus" + ], + [ + "▁Or", + "iginal" + ], + [ + "▁Origin", + "al" + ], + [ + "▁", + "Original" + ], + [ + "▁manufact", + "urer" + ], + [ + "as", + "z" + ], + [ + "a", + "sz" + ], + [ + "ât", + "e" + ], + [ + "â", + "te" + ], + [ + "re", + "r" + ], + [ + "r", + "er" + ], + [ + "▁J", + "son" + ], + [ + "▁", + "Json" + ], + [ + "▁succeed", + "ed" + ], + [ + "uff", + "le" + ], + [ + "▁b", + "acked" + ], + [ + "▁back", + "ed" + ], + [ + "es", + "ian" + ], + [ + "esi", + "an" + ], + [ + "ti", + "ck" + ], + [ + "t", + "ick" + ], + [ + "Ex", + "ternal" + ], + [ + "▁X", + "IX" + ], + [ + "▁XI", + "X" + ], + [ + "▁he", + "arts" + ], + [ + "▁heart", + "s" + ], + [ + "▁hear", + "ts" + ], + [ + "▁По", + "сле" + ], + [ + "ol", + "u" + ], + [ + "o", + "lu" + ], + [ + "▁ле", + "т" + ], + [ + "▁", + "лет" + ], + [ + "VI", + "CE" + ], + [ + "V", + "ICE" + ], + [ + "ár", + "io" + ], + [ + "á", + "rio" + ], + [ + "▁fr", + "aud" + ], + [ + "▁fra", + "ud" + ], + [ + "ed", + "u" + ], + [ + "e", + "du" + ], + [ + "Pr", + "imary" + ], + [ + "Prim", + "ary" + ], + [ + "▁g", + "aming" + ], + [ + "▁gam", + "ing" + ], + [ + "▁ga", + "ming" + ], + [ + "▁p", + "lt" + ], + [ + "▁pl", + "t" + ], + [ + "ig", + "ator" + ], + [ + "iga", + "tor" + ], + [ + "IE", + "S" + ], + [ + "I", + "ES" + ], + [ + "Comp", + "iler" + ], + [ + "▁mon", + "ument" + ], + [ + "ag", + "em" + ], + [ + "age", + "m" + ], + [ + "a", + "gem" + ], + [ + "▁R", + "ain" + ], + [ + "▁Ra", + "in" + ], + [ + "▁mo", + "ins" + ], + [ + "ok", + "u" + ], + [ + "o", + "ku" + ], + [ + "os", + "ex" + ], + [ + "ose", + "x" + ], + [ + "o", + "sex" + ], + [ + "▁K", + "ansas" + ], + [ + "▁gep", + "ublice" + ], + [ + "▁J", + "oy" + ], + [ + "▁Jo", + "y" + ], + [ + "Sc", + "ene" + ], + [ + "▁king", + "dom" + ], + [ + "ri", + "ces" + ], + [ + "ric", + "es" + ], + [ + "rice", + "s" + ], + [ + "r", + "ices" + ], + [ + "▁ju", + "in" + ], + [ + "▁uncomfort", + "able" + ], + [ + "▁M", + "oney" + ], + [ + "▁Mon", + "ey" + ], + [ + "▁Mo", + "ney" + ], + [ + "ob", + "b" + ], + [ + "o", + "bb" + ], + [ + "ex", + "pl" + ], + [ + "exp", + "l" + ], + [ + "str", + "cmp" + ], + [ + "▁d", + "read" + ], + [ + "▁dr", + "ead" + ], + [ + "▁dre", + "ad" + ], + [ + "rit", + "ion" + ], + [ + "r", + "ition" + ], + [ + "▁C", + "hi" + ], + [ + "▁Ch", + "i" + ], + [ + "▁demonstr", + "ated" + ], + [ + "▁demonstrate", + "d" + ], + [ + "▁vert", + "ices" + ], + [ + "ч", + "о" + ], + [ + "▁C", + "ulture" + ], + [ + "▁", + "Culture" + ], + [ + "F", + "X" + ], + [ + "D", + 
"ictionary" + ], + [ + "▁D", + "ru" + ], + [ + "▁Dr", + "u" + ], + [ + "tr", + "m" + ], + [ + "t", + "rm" + ], + [ + "▁ex", + "amine" + ], + [ + "▁exam", + "ine" + ], + [ + "▁the", + "rap" + ], + [ + "▁ther", + "ap" + ], + [ + "i", + "ème" + ], + [ + "ми", + "ни" + ], + [ + "▁produ", + "ces" + ], + [ + "▁produce", + "s" + ], + [ + "▁photograph", + "s" + ], + [ + "▁thread", + "s" + ], + [ + "▁", + "threads" + ], + [ + "▁M", + "I" + ], + [ + "▁", + "MI" + ], + [ + "▁extraord", + "inary" + ], + [ + "ски", + "м" + ], + [ + "ск", + "им" + ], + [ + "с", + "ким" + ], + [ + "▁gepublice", + "erd" + ], + [ + "▁Pol", + "and" + ], + [ + "▁Po", + "land" + ], + [ + "▁guarante", + "ed" + ], + [ + "▁guarantee", + "d" + ], + [ + "R", + "G" + ], + [ + "os", + "c" + ], + [ + "o", + "sc" + ], + [ + "ал", + "и" + ], + [ + "а", + "ли" + ], + [ + "▁те", + "х" + ], + [ + "err", + "no" + ], + [ + "sc", + "ience" + ], + [ + "if", + "fs" + ], + [ + "iff", + "s" + ], + [ + "▁T", + "am" + ], + [ + "▁Ta", + "m" + ], + [ + "▁B", + "eth" + ], + [ + "▁Be", + "th" + ], + [ + "▁Bet", + "h" + ], + [ + "▁Tr", + "avel" + ], + [ + "▁Tra", + "vel" + ], + [ + "▁trans", + "late" + ], + [ + "▁transl", + "ate" + ], + [ + "▁", + "translate" + ], + [ + "ch", + "é" + ], + [ + "▁l", + "ing" + ], + [ + "▁li", + "ng" + ], + [ + "▁lin", + "g" + ], + [ + "▁", + "ling" + ], + [ + "▁bel", + "ongs" + ], + [ + "▁belong", + "s" + ], + [ + "▁elect", + "rical" + ], + [ + "▁electric", + "al" + ], + [ + "en", + "sk" + ], + [ + "ens", + "k" + ], + [ + "▁Com", + "pet" + ], + [ + "▁Comp", + "et" + ], + [ + "c", + "g" + ], + [ + "V", + "C" + ], + [ + "to", + "pic" + ], + [ + "top", + "ic" + ], + [ + "t", + "opic" + ], + [ + "▁pre", + "sum" + ], + [ + "▁pres", + "um" + ], + [ + "ве", + "та" + ], + [ + "вет", + "а" + ], + [ + "▁approxim", + "ation" + ], + [ + "▁approx", + "imation" + ], + [ + "▁g", + "rim" + ], + [ + "▁gr", + "im" + ], + [ + "▁gri", + "m" + ], + [ + "▁И", + "з" + ], + [ + "_{", + "(" + ], + [ + "_", + "{(" + ], + [ + "ви", + "н" + ], + [ + "в", + "ин" + ], + [ + "ut", + "ion" + ], + [ + "uti", + "on" + ], + [ + "ow", + "ych" + ], + [ + "owy", + "ch" + ], + [ + "å", + "g" + ], + [ + "ster", + "reich" + ], + [ + "▁character", + "istic" + ], + [ + "om", + "ing" + ], + [ + "omin", + "g" + ], + [ + "omi", + "ng" + ], + [ + "o", + "ming" + ], + [ + "▁/*", + "!" + ], + [ + "▁", + "/*!" 
+ ], + [ + "▁pr", + "ize" + ], + [ + "▁pri", + "ze" + ], + [ + "▁Minn", + "esota" + ], + [ + "te", + "d" + ], + [ + "t", + "ed" + ], + [ + "ц", + "ы" + ], + [ + "▁O", + "m" + ], + [ + "▁", + "Om" + ], + [ + "▁ind", + "ices" + ], + [ + "▁indic", + "es" + ], + [ + "▁", + "indices" + ], + [ + "▁s", + "tem" + ], + [ + "▁st", + "em" + ], + [ + "▁ste", + "m" + ], + [ + "re", + "gon" + ], + [ + "reg", + "on" + ], + [ + "ни", + "че" + ], + [ + "▁Sal", + "v" + ], + [ + "▁Sa", + "lv" + ], + [ + "és", + "e" + ], + [ + "é", + "se" + ], + [ + "▁a", + "ged" + ], + [ + "▁ag", + "ed" + ], + [ + "▁age", + "d" + ], + [ + "▁", + "aged" + ], + [ + "▁P", + "ast" + ], + [ + "▁Pa", + "st" + ], + [ + "▁Pas", + "t" + ], + [ + "▁intern", + "ation" + ], + [ + "▁V", + "ic" + ], + [ + "▁Vi", + "c" + ], + [ + "▁res", + "ume" + ], + [ + "▁", + "resume" + ], + [ + "akespe", + "are" + ], + [ + "▁est", + "ado" + ], + [ + "▁esta", + "do" + ], + [ + "▁estad", + "o" + ], + [ + "▁ab", + "ilities" + ], + [ + "▁", + "abilities" + ], + [ + "▁b", + "row" + ], + [ + "▁br", + "ow" + ], + [ + "▁bro", + "w" + ], + [ + "▁N", + "FL" + ], + [ + "▁tr", + "ends" + ], + [ + "▁trend", + "s" + ], + [ + "▁tren", + "ds" + ], + [ + "▁Aust", + "in" + ], + [ + "▁L", + "IMIT" + ], + [ + "▁LI", + "MIT" + ], + [ + "▁", + "LIMIT" + ], + [ + "▁K", + "or" + ], + [ + "▁Ko", + "r" + ], + [ + "▁f", + "olk" + ], + [ + "▁fol", + "k" + ], + [ + "▁", + "folk" + ], + [ + "▁w", + "ard" + ], + [ + "▁war", + "d" + ], + [ + "▁wa", + "rd" + ], + [ + "▁", + "ward" + ], + [ + "▁n", + "est" + ], + [ + "▁ne", + "st" + ], + [ + "▁Jun", + "ior" + ], + [ + "▁Juni", + "or" + ], + [ + "▁maint", + "aining" + ], + [ + "▁maintain", + "ing" + ], + [ + "P", + "ub" + ], + [ + "OB", + "JECT" + ], + [ + "▁blo", + "ody" + ], + [ + "▁blood", + "y" + ], + [ + "▁s", + "j" + ], + [ + "▁d", + "type" + ], + [ + "▁dt", + "ype" + ], + [ + "▁", + "dtype" + ], + [ + "Pan", + "e" + ], + [ + "P", + "ane" + ], + [ + "▁b", + "acter" + ], + [ + "▁grad", + "ually" + ], + [ + "▁gradu", + "ally" + ], + [ + "m", + "r" + ], + [ + "Te", + "am" + ], + [ + "▁ind", + "icating" + ], + [ + "▁indic", + "ating" + ], + [ + "▁decre", + "ase" + ], + [ + "te", + "k" + ], + [ + "t", + "ek" + ], + [ + "▁Re", + "present" + ], + [ + "▁Rep", + "resent" + ], + [ + "▁develop", + "ers" + ], + [ + "▁developer", + "s" + ], + [ + "Gu", + "id" + ], + [ + "Gui", + "d" + ], + [ + "G", + "uid" + ], + [ + "▁D", + "iet" + ], + [ + "▁Die", + "t" + ], + [ + "▁Di", + "et" + ], + [ + "▁re", + "tr" + ], + [ + "▁r", + "etr" + ], + [ + "▁ret", + "r" + ], + [ + "Nav", + "igation" + ], + [ + "es", + "i" + ], + [ + "e", + "si" + ], + [ + "▁l", + "azy" + ], + [ + "▁la", + "zy" + ], + [ + "Stand", + "ard" + ], + [ + "E", + "r" + ], + [ + "A", + "W" + ], + [ + "▁Ét", + "ats" + ], + [ + "▁ass", + "ured" + ], + [ + "▁assure", + "d" + ], + [ + "Sa", + "n" + ], + [ + "S", + "an" + ], + [ + "▁And", + "re" + ], + [ + "▁Andr", + "e" + ], + [ + "’", + "," + ], + [ + "fa", + "ng" + ], + [ + "fan", + "g" + ], + [ + "f", + "ang" + ], + [ + "ér", + "ation" + ], + [ + "▁indust", + "ries" + ], + [ + "▁in", + "con" + ], + [ + "▁inc", + "on" + ], + [ + "Em", + "it" + ], + [ + "E", + "mit" + ], + [ + "▁г", + "де" + ], + [ + "▁ret", + "riev" + ], + [ + "▁retr", + "iev" + ], + [ + "en", + "i" + ], + [ + "e", + "ni" + ], + [ + "▁Tur", + "key" + ], + [ + "▁Turk", + "ey" + ], + [ + "iz", + "ers" + ], + [ + "ize", + "rs" + ], + [ + "izer", + "s" + ], + [ + "An", + "gle" + ], + [ + "Ang", + "le" + ], + [ + "▁o", + "c" + ], + [ + "▁", + "oc" + ], + [ + "▁pal", + "m" + 
], + [ + "▁pa", + "lm" + ], + [ + "▁s", + "tan" + ], + [ + "▁st", + "an" + ], + [ + "▁sta", + "n" + ], + [ + "▁", + "stan" + ], + [ + "ль", + "но" + ], + [ + "▁C", + "SS" + ], + [ + "▁CS", + "S" + ], + [ + "▁", + "CSS" + ], + [ + "▁fr", + "ances" + ], + [ + "▁franc", + "es" + ], + [ + "▁g", + "rin" + ], + [ + "▁gr", + "in" + ], + [ + "▁gri", + "n" + ], + [ + "▁tiem", + "po" + ], + [ + "▁P", + "rix" + ], + [ + "▁Pr", + "ix" + ], + [ + "▁Pri", + "x" + ], + [ + "])", + "." + ], + [ + "]", + ")." + ], + [ + "▁de", + "put" + ], + [ + "▁dep", + "ut" + ], + [ + "▁P", + "in" + ], + [ + "▁Pi", + "n" + ], + [ + "▁", + "Pin" + ], + [ + "▁si", + "xt" + ], + [ + "▁six", + "t" + ], + [ + "▁predict", + "ed" + ], + [ + "▁pred", + "icted" + ], + [ + "az", + "ure" + ], + [ + "azu", + "re" + ], + [ + "▁Mo", + "tor" + ], + [ + "▁Mot", + "or" + ], + [ + "▁i", + "hm" + ], + [ + "▁ih", + "m" + ], + [ + "▁man", + "us" + ], + [ + "ap", + "os" + ], + [ + "a", + "pos" + ], + [ + "▁instr", + "uments" + ], + [ + "▁instrument", + "s" + ], + [ + "▁co", + "unts" + ], + [ + "▁coun", + "ts" + ], + [ + "▁count", + "s" + ], + [ + "▁aim", + "ed" + ], + [ + "▁ai", + "med" + ], + [ + "▁", + "aimed" + ], + [ + "pro", + "fit" + ], + [ + "prof", + "it" + ], + [ + "▁d", + "ok" + ], + [ + "▁do", + "k" + ], + [ + "об", + "ра" + ], + [ + "о", + "бра" + ], + [ + "▁e", + "stud" + ], + [ + "▁est", + "ud" + ], + [ + "ie", + "sz" + ], + [ + "ies", + "z" + ], + [ + "i", + "esz" + ], + [ + "▁p", + "iss" + ], + [ + "▁pi", + "ss" + ], + [ + "▁in", + "aug" + ], + [ + "▁vo", + "ters" + ], + [ + "▁vote", + "rs" + ], + [ + "▁vot", + "ers" + ], + [ + "▁pack", + "ages" + ], + [ + "▁package", + "s" + ], + [ + "▁", + "packages" + ], + [ + "▁c", + "ute" + ], + [ + "▁cut", + "e" + ], + [ + "▁cu", + "te" + ], + [ + "▁f", + "itness" + ], + [ + "▁fit", + "ness" + ], + [ + "▁l", + "eurs" + ], + [ + "▁le", + "urs" + ], + [ + "▁leur", + "s" + ], + [ + "▁s", + "orted" + ], + [ + "▁sort", + "ed" + ], + [ + "▁sor", + "ted" + ], + [ + "ph", + "ant" + ], + [ + "pha", + "nt" + ], + [ + "phan", + "t" + ], + [ + "OP", + "T" + ], + [ + "O", + "PT" + ], + [ + "▁z", + "ip" + ], + [ + "▁", + "zip" + ], + [ + "se", + "ason" + ], + [ + "sea", + "son" + ], + [ + "em", + "i" + ], + [ + "e", + "mi" + ], + [ + "enc", + "oding" + ], + [ + "wo", + "n" + ], + [ + "w", + "on" + ], + [ + "el", + "ect" + ], + [ + "ele", + "ct" + ], + [ + "e", + "lect" + ], + [ + "▁t", + "ooth" + ], + [ + "▁to", + "oth" + ], + [ + "▁too", + "th" + ], + [ + "▁up", + "coming" + ], + [ + "▁G", + "raham" + ], + [ + "▁Gra", + "ham" + ], + [ + "nu", + "t" + ], + [ + "n", + "ut" + ], + [ + "▁Ar", + "k" + ], + [ + "äl", + "t" + ], + [ + "ä", + "lt" + ], + [ + "▁prec", + "ious" + ], + [ + "ag", + "le" + ], + [ + "a", + "gle" + ], + [ + "né", + "e" + ], + [ + "n", + "ée" + ], + [ + "ни", + "ца" + ], + [ + "ниц", + "а" + ], + [ + "ar", + "is" + ], + [ + "ari", + "s" + ], + [ + "a", + "ris" + ], + [ + "▁p", + "ile" + ], + [ + "▁pi", + "le" + ], + [ + "▁pil", + "e" + ], + [ + "co", + "le" + ], + [ + "col", + "e" + ], + [ + "c", + "ole" + ], + [ + "▁W", + "ITH" + ], + [ + "▁WIT", + "H" + ], + [ + "▁", + "WITH" + ], + [ + "rou", + "ting" + ], + [ + "r", + "outing" + ], + [ + "▁*", + "**" + ], + [ + "▁**", + "*" + ], + [ + "▁", + "***" + ], + [ + "Appe", + "arance" + ], + [ + "ll", + "vm" + ], + [ + "▁O", + "liver" + ], + [ + "▁Ol", + "iver" + ], + [ + "▁P", + "L" + ], + [ + "▁", + "PL" + ], + [ + "if", + "ndef" + ], + [ + "et", + "zt" + ], + [ + "etz", + "t" + ], + [ + "sk", + "iego" + ], + [ + "ski", + "ego" + ], 
+ [ + "▁p", + "on" + ], + [ + "▁po", + "n" + ], + [ + "▁", + "pon" + ], + [ + "AR", + "GET" + ], + [ + "ARG", + "ET" + ], + [ + "k", + "ö" + ], + [ + "al", + "led" + ], + [ + "all", + "ed" + ], + [ + "alle", + "d" + ], + [ + "▁=", + "\\" + ], + [ + "▁", + "=\\" + ], + [ + "su", + "re" + ], + [ + "sur", + "e" + ], + [ + "s", + "ure" + ], + [ + "mat", + "ches" + ], + [ + "match", + "es" + ], + [ + "▁temper", + "atures" + ], + [ + "▁temperature", + "s" + ], + [ + "SE", + "L" + ], + [ + "S", + "EL" + ], + [ + "▁cl", + "one" + ], + [ + "▁clo", + "ne" + ], + [ + "▁", + "clone" + ], + [ + "▁el", + "ler" + ], + [ + "▁elle", + "r" + ], + [ + "▁ell", + "er" + ], + [ + "▁", + "eller" + ], + [ + "er", + "na" + ], + [ + "ern", + "a" + ], + [ + "▁п", + "оло" + ], + [ + "▁по", + "ло" + ], + [ + "▁пол", + "о" + ], + [ + "Man", + "agement" + ], + [ + "comp", + "any" + ], + [ + "▁l", + "un" + ], + [ + "▁lu", + "n" + ], + [ + "▁stre", + "aming" + ], + [ + "▁stream", + "ing" + ], + [ + "▁N", + "i" + ], + [ + "▁s", + "í" + ], + [ + "Cont", + "act" + ], + [ + "▁C", + "redit" + ], + [ + "▁Cr", + "edit" + ], + [ + "▁Cre", + "dit" + ], + [ + "▁O", + "ak" + ], + [ + "▁пред", + "став" + ], + [ + "rad", + "ius" + ], + [ + "cl", + "i" + ], + [ + "c", + "li" + ], + [ + "IE", + "NT" + ], + [ + "I", + "ENT" + ], + [ + "▁Lu", + "cy" + ], + [ + "▁Luc", + "y" + ], + [ + "▁calcul", + "ation" + ], + [ + "▁calc", + "ulation" + ], + [ + "▁p", + "ixel" + ], + [ + "▁", + "pixel" + ], + [ + "▁m", + "ul" + ], + [ + "▁mu", + "l" + ], + [ + "▁", + "mul" + ], + [ + "▁out", + "comes" + ], + [ + "▁outcome", + "s" + ], + [ + "▁cent", + "ers" + ], + [ + "▁center", + "s" + ], + [ + "▁res", + "idence" + ], + [ + "▁resid", + "ence" + ], + [ + "Con", + "straint" + ], + [ + "▁pre", + "serve" + ], + [ + "▁pres", + "erve" + ], + [ + "▁preserv", + "e" + ], + [ + "pe", + "on" + ], + [ + "uf", + "fix" + ], + [ + "uff", + "ix" + ], + [ + "▁Rober", + "ts" + ], + [ + "▁Robert", + "s" + ], + [ + "▁Rob", + "erts" + ], + [ + "▁pro", + "mot" + ], + [ + "▁pr", + "omot" + ], + [ + "▁prom", + "ot" + ], + [ + "?", + "!" 
+ ], + [ + "bal", + "ance" + ], + [ + "▁cour", + "ts" + ], + [ + "▁court", + "s" + ], + [ + "▁dis", + "g" + ], + [ + "▁di", + "sg" + ], + [ + "PR", + "INT" + ], + [ + "PRI", + "NT" + ], + [ + "▁и", + "х" + ], + [ + "el", + "fare" + ], + [ + "elf", + "are" + ], + [ + "▁ret", + "reat" + ], + [ + "▁А", + "в" + ], + [ + "Co", + "st" + ], + [ + "C", + "ost" + ], + [ + "al", + "so" + ], + [ + "als", + "o" + ], + [ + "▁F", + "ür" + ], + [ + "▁Mär", + "z" + ], + [ + "DI", + "O" + ], + [ + "D", + "IO" + ], + [ + "▁b", + "ez" + ], + [ + "▁be", + "z" + ], + [ + "▁", + "bez" + ], + [ + "AUT", + "H" + ], + [ + "AU", + "TH" + ], + [ + "De", + "n" + ], + [ + "D", + "en" + ], + [ + "▁a", + "tom" + ], + [ + "▁at", + "om" + ], + [ + "▁", + "atom" + ], + [ + "▁r", + "oman" + ], + [ + "▁ro", + "man" + ], + [ + "▁rom", + "an" + ], + [ + "▁P", + "el" + ], + [ + "▁Pe", + "l" + ], + [ + "▁Roose", + "velt" + ], + [ + "▁Pl", + "ant" + ], + [ + "▁Plan", + "t" + ], + [ + "Cont", + "ents" + ], + [ + "Content", + "s" + ], + [ + "▁Bet", + "ween" + ], + [ + "▁cou", + "pling" + ], + [ + "▁coup", + "ling" + ], + [ + "str", + "ucture" + ], + [ + "struct", + "ure" + ], + [ + "▁Mar", + "shall" + ], + [ + "▁Mars", + "hall" + ], + [ + "▁Marshal", + "l" + ], + [ + "▁Care", + "er" + ], + [ + "▁rail", + "way" + ], + [ + "▁B", + "ureau" + ], + [ + "▁Bur", + "eau" + ], + [ + "▁poss", + "ibilities" + ], + [ + "▁k", + "or" + ], + [ + "▁ko", + "r" + ], + [ + "▁", + "kor" + ], + [ + "){", + "\r" + ], + [ + ")", + "{\r" + ], + [ + "mer", + "o" + ], + [ + "me", + "ro" + ], + [ + "m", + "ero" + ], + [ + "mo", + "v" + ], + [ + "m", + "ov" + ], + [ + "анг", + "л" + ], + [ + "AI", + "N" + ], + [ + "A", + "IN" + ], + [ + "mu", + "nd" + ], + [ + "mun", + "d" + ], + [ + "m", + "und" + ], + [ + "let", + "te" + ], + [ + "lett", + "e" + ], + [ + "l", + "ette" + ], + [ + "▁sum", + "mar" + ], + [ + "▁describ", + "ing" + ], + [ + "▁N", + "AS" + ], + [ + "▁NA", + "S" + ], + [ + "▁E", + "mb" + ], + [ + "▁Em", + "b" + ], + [ + "▁", + "Emb" + ], + [ + "Inst", + "ruction" + ], + [ + "li", + "est" + ], + [ + "lie", + "st" + ], + [ + "l", + "iest" + ], + [ + "▁S", + "ig" + ], + [ + "▁Si", + "g" + ], + [ + "▁", + "Sig" + ], + [ + "Bi", + "ll" + ], + [ + "B", + "ill" + ], + [ + "▁v", + "erd" + ], + [ + "▁ver", + "d" + ], + [ + "▁ve", + "rd" + ], + [ + "pl", + "ant" + ], + [ + "plan", + "t" + ], + [ + "▁galax", + "ies" + ], + [ + "\"]", + ")" + ], + [ + "\"", + "])" + ], + [ + "▁Py", + "Object" + ], + [ + "▁", + "PyObject" + ], + [ + "▁G", + "y" + ], + [ + "▁m", + "ě" + ], + [ + "▁organ", + "isation" + ], + [ + "▁organis", + "ation" + ], + [ + "He", + "r" + ], + [ + "H", + "er" + ], + [ + "Se", + "p" + ], + [ + "S", + "ep" + ], + [ + "oc", + "om" + ], + [ + "oco", + "m" + ], + [ + "o", + "com" + ], + [ + "▁S", + "ame" + ], + [ + "▁Sam", + "e" + ], + [ + "▁Sa", + "me" + ], + [ + "▁", + "Same" + ], + [ + "▁b", + "ite" + ], + [ + "▁bit", + "e" + ], + [ + "▁bi", + "te" + ], + [ + "▁Se", + "attle" + ], + [ + "зы", + "ва" + ], + [ + "Ob", + "server" + ], + [ + "Observ", + "er" + ], + [ + "’", + "." 
+ ], + [ + "▁m", + "orph" + ], + [ + "▁mor", + "ph" + ], + [ + "ur", + "ches" + ], + [ + "urch", + "es" + ], + [ + "al", + "ph" + ], + [ + "re", + "ement" + ], + [ + "ree", + "ment" + ], + [ + "con", + "sin" + ], + [ + "cons", + "in" + ], + [ + "^", + "-" + ], + [ + "▁d", + "ann" + ], + [ + "▁da", + "nn" + ], + [ + "▁dan", + "n" + ], + [ + "trans", + "late" + ], + [ + "transl", + "ate" + ], + [ + "ви", + "х" + ], + [ + "Re", + "act" + ], + [ + "▁c", + "ats" + ], + [ + "▁cat", + "s" + ], + [ + "▁ca", + "ts" + ], + [ + "▁b", + "rew" + ], + [ + "▁br", + "ew" + ], + [ + "▁bre", + "w" + ], + [ + "▁", + "brew" + ], + [ + "▁d", + "s" + ], + [ + "▁", + "ds" + ], + [ + "▁cir", + "cles" + ], + [ + "▁circ", + "les" + ], + [ + "▁circle", + "s" + ], + [ + "▁d", + "rift" + ], + [ + "▁dr", + "ift" + ], + [ + "▁dri", + "ft" + ], + [ + "ag", + "ma" + ], + [ + "▁Val", + "ent" + ], + [ + "PI", + "N" + ], + [ + "P", + "IN" + ], + [ + "AR", + "M" + ], + [ + "A", + "RM" + ], + [ + "▁sur", + "viv" + ], + [ + "▁surv", + "iv" + ], + [ + "al", + "in" + ], + [ + "ali", + "n" + ], + [ + "a", + "lin" + ], + [ + "Pr", + "ef" + ], + [ + "Pre", + "f" + ], + [ + "P", + "ref" + ], + [ + "friend", + "ly" + ], + [ + "▁uncertain", + "ty" + ], + [ + "▁f", + "d" + ], + [ + "▁", + "fd" + ], + [ + "▁engine", + "er" + ], + [ + "Be", + "n" + ], + [ + "B", + "en" + ], + [ + "ic", + "ular" + ], + [ + "i", + "cular" + ], + [ + "or", + "est" + ], + [ + "ore", + "st" + ], + [ + "ores", + "t" + ], + [ + "o", + "rest" + ], + [ + "▁hor", + "izontal" + ], + [ + "▁horizon", + "tal" + ], + [ + "▁", + "horizontal" + ], + [ + "UT", + "C" + ], + [ + "U", + "TC" + ], + [ + "text", + "rm" + ], + [ + "tex", + "trm" + ], + [ + "Li", + "ve" + ], + [ + "L", + "ive" + ], + [ + "Sc", + "ore" + ], + [ + "S", + "core" + ], + [ + "▁Germ", + "ans" + ], + [ + "▁German", + "s" + ], + [ + "▁Ger", + "mans" + ], + [ + "di", + "stance" + ], + [ + "dist", + "ance" + ], + [ + "d", + "istance" + ], + [ + "ut", + "i" + ], + [ + "u", + "ti" + ], + [ + "▁é", + "qu" + ], + [ + "▁", + "équ" + ], + [ + "▁numer", + "ical" + ], + [ + "▁re", + "ass" + ], + [ + "Act", + "iv" + ], + [ + "▁c", + "od" + ], + [ + "▁co", + "d" + ], + [ + "▁", + "cod" + ], + [ + "bul", + "let" + ], + [ + "en", + "sing" + ], + [ + "ens", + "ing" + ], + [ + "▁G", + "em" + ], + [ + "▁Ge", + "m" + ], + [ + "▁nav", + "igation" + ], + [ + "▁navig", + "ation" + ], + [ + "▁", + "navigation" + ], + [ + "add", + "Class" + ], + [ + "▁simultane", + "ously" + ], + [ + "ви", + "й" + ], + [ + "▁йо", + "го" + ], + [ + "▁й", + "ого" + ], + [ + "▁H", + "ö" + ], + [ + "▁har", + "sh" + ], + [ + "prec", + "ated" + ], + [ + "p", + "recated" + ], + [ + "С", + "СР" + ], + [ + "▁Equ", + "ip" + ], + [ + "ad", + "get" + ], + [ + "▁T", + "YPE" + ], + [ + "▁", + "TYPE" + ], + [ + "▁m", + "g" + ], + [ + "▁", + "mg" + ], + [ + "IG", + "H" + ], + [ + "▁v", + "in" + ], + [ + "▁vi", + "n" + ], + [ + "▁", + "vin" + ], + [ + "▁fin", + "dings" + ], + [ + "▁find", + "ings" + ], + [ + "▁finding", + "s" + ], + [ + "iv", + "an" + ], + [ + "iva", + "n" + ], + [ + "i", + "van" + ], + [ + "▁pos", + "session" + ], + [ + "▁poss", + "ession" + ], + [ + "▁possess", + "ion" + ], + [ + "▁т", + "ого" + ], + [ + "▁то", + "го" + ], + [ + "▁", + "того" + ], + [ + "▁par", + "sed" + ], + [ + "▁parse", + "d" + ], + [ + "▁", + "parsed" + ], + [ + "ri", + "ors" + ], + [ + "rior", + "s" + ], + [ + "rio", + "rs" + ], + [ + "r", + "iors" + ], + [ + "zeich", + "net" + ], + [ + "ни", + "ков" + ], + [ + "ник", + "ов" + ], + [ + "Work", + "er" + ], + [ + "▁en", + 
"ables" + ], + [ + "▁enable", + "s" + ], + [ + "▁(", + "$\\" + ], + [ + "▁($", + "\\" + ], + [ + "▁C", + "opy" + ], + [ + "▁Co", + "py" + ], + [ + "▁Cop", + "y" + ], + [ + "▁", + "Copy" + ], + [ + "▁orient", + "ation" + ], + [ + "ст", + "ре" + ], + [ + "с", + "тре" + ], + [ + "▁Ind", + "ians" + ], + [ + "▁India", + "ns" + ], + [ + "▁Indian", + "s" + ], + [ + "▁G", + "ary" + ], + [ + "▁Gar", + "y" + ], + [ + "▁Ga", + "ry" + ], + [ + "▁Ins", + "urance" + ], + [ + "is", + "an" + ], + [ + "isa", + "n" + ], + [ + "i", + "san" + ], + [ + "Ch", + "at" + ], + [ + "C", + "hat" + ], + [ + "▁com", + "un" + ], + [ + "▁co", + "mun" + ], + [ + "▁co", + "ron" + ], + [ + "▁cor", + "on" + ], + [ + "ографи", + "я" + ], + [ + "up", + "dated" + ], + [ + "update", + "d" + ], + [ + "▁И", + "н" + ], + [ + "The", + "se" + ], + [ + "Th", + "ese" + ], + [ + "SE", + "C" + ], + [ + "S", + "EC" + ], + [ + "▁boy", + "friend" + ], + [ + "Di", + "agnostics" + ], + [ + "Hi", + "nt" + ], + [ + "H", + "int" + ], + [ + "mu", + "l" + ], + [ + "m", + "ul" + ], + [ + "▁in", + "ode" + ], + [ + "▁i", + "node" + ], + [ + "▁", + "inode" + ], + [ + "x", + "A" + ], + [ + "ef", + "t" + ], + [ + "e", + "ft" + ], + [ + "OP", + "TION" + ], + [ + "OPT", + "ION" + ], + [ + "un", + "ct" + ], + [ + "unc", + "t" + ], + [ + "an", + "non" + ], + [ + "ann", + "on" + ], + [ + "anno", + "n" + ], + [ + "EN", + "S" + ], + [ + "E", + "NS" + ], + [ + "st", + "rip" + ], + [ + "str", + "ip" + ], + [ + "stri", + "p" + ], + [ + "▁enthus", + "i" + ], + [ + "▁W", + "hit" + ], + [ + "▁Wh", + "it" + ], + [ + "▁Ф", + "и" + ], + [ + "au", + "de" + ], + [ + "aud", + "e" + ], + [ + "a", + "ude" + ], + [ + "▁disag", + "ree" + ], + [ + "▁sn", + "apped" + ], + [ + "▁snap", + "ped" + ], + [ + "Ph", + "ys" + ], + [ + "▁S", + "yn" + ], + [ + "▁Sy", + "n" + ], + [ + "▁s", + "our" + ], + [ + "▁so", + "ur" + ], + [ + "▁sou", + "r" + ], + [ + "▁L", + "ux" + ], + [ + "▁Lu", + "x" + ], + [ + "ug", + "ar" + ], + [ + "uga", + "r" + ], + [ + "u", + "gar" + ], + [ + "til", + "e" + ], + [ + "ti", + "le" + ], + [ + "t", + "ile" + ], + [ + "▁in", + "fection" + ], + [ + "▁inf", + "ection" + ], + [ + "▁infect", + "ion" + ], + [ + "▁F", + "eb" + ], + [ + "▁Fe", + "b" + ], + [ + "▁C", + "hem" + ], + [ + "▁Ch", + "em" + ], + [ + "▁Che", + "m" + ], + [ + "data", + "set" + ], + [ + "dat", + "aset" + ], + [ + "ch", + "ts" + ], + [ + "cht", + "s" + ], + [ + "D", + "ynamic" + ], + [ + "▁с", + "ред" + ], + [ + "▁qu", + "een" + ], + [ + "▁que", + "en" + ], + [ + "work", + "er" + ], + [ + "wor", + "ker" + ], + [ + "sw", + "ap" + ], + [ + "▁tim", + "estamp" + ], + [ + "▁", + "timestamp" + ], + [ + "▁In", + "tegr" + ], + [ + "▁Int", + "egr" + ], + [ + "▁", + "Integr" + ], + [ + "▁inter", + "views" + ], + [ + "▁interview", + "s" + ], + [ + "su", + "ch" + ], + [ + "s", + "uch" + ], + [ + "▁l", + "aughter" + ], + [ + "▁laugh", + "ter" + ], + [ + "pro", + "f" + ], + [ + "pr", + "of" + ], + [ + "▁B", + "ird" + ], + [ + "▁Bi", + "rd" + ], + [ + "▁Bir", + "d" + ], + [ + "(", + "|" + ], + [ + "â", + "n" + ], + [ + "▁g", + "ra" + ], + [ + "▁gr", + "a" + ], + [ + "▁", + "gra" + ], + [ + "&", + "=" + ], + [ + "ze", + "ns" + ], + [ + "zen", + "s" + ], + [ + "z", + "ens" + ], + [ + "get", + "Message" + ], + [ + "▁O", + "st" + ], + [ + "▁Os", + "t" + ], + [ + "▁g", + "ab" + ], + [ + "▁ga", + "b" + ], + [ + "▁mort", + "gage" + ], + [ + "mult", + "icol" + ], + [ + "multi", + "col" + ], + [ + "LE", + "VEL" + ], + [ + "part", + "ition" + ], + [ + "se", + "en" + ], + [ + "see", + "n" + ], + [ + "s", + "een" + ], + 
[ + "▁dec", + "lar" + ], + [ + "▁decl", + "ar" + ], + [ + "A", + "U" + ], + [ + "▁o", + "x" + ], + [ + "▁", + "ox" + ], + [ + "▁l", + "igger" + ], + [ + "▁lig", + "ger" + ], + [ + "▁C", + "arm" + ], + [ + "▁Car", + "m" + ], + [ + "▁Ca", + "rm" + ], + [ + "ge", + "me" + ], + [ + "gem", + "e" + ], + [ + "g", + "eme" + ], + [ + "▁Ve", + "gas" + ], + [ + "▁Veg", + "as" + ], + [ + "▁E", + "ug" + ], + [ + "▁Eu", + "g" + ], + [ + "or", + "us" + ], + [ + "o", + "rus" + ], + [ + "▁b", + "rick" + ], + [ + "▁br", + "ick" + ], + [ + "▁as", + "í" + ], + [ + "▁Mag", + "azine" + ], + [ + "HasColumn", + "Type" + ], + [ + "V", + "R" + ], + [ + "lic", + "her" + ], + [ + "li", + "cher" + ], + [ + "lich", + "er" + ], + [ + "liche", + "r" + ], + [ + "l", + "icher" + ], + [ + "▁F", + "uture" + ], + [ + "▁Fut", + "ure" + ], + [ + "▁", + "Future" + ], + [ + "▁J", + "ug" + ], + [ + "▁Ju", + "g" + ], + [ + "at", + "tan" + ], + [ + "att", + "an" + ], + [ + "atta", + "n" + ], + [ + "con", + "structor" + ], + [ + "construct", + "or" + ], + [ + "V", + "P" + ], + [ + "▁т", + "ур" + ], + [ + "▁ту", + "р" + ], + [ + "▁", + "тур" + ], + [ + "чи", + "на" + ], + [ + "чин", + "а" + ], + [ + "Comp", + "arator" + ], + [ + "Compar", + "ator" + ], + [ + "▁aut", + "hentic" + ], + [ + "▁mon", + "ster" + ], + [ + "▁trans", + "formed" + ], + [ + "▁transform", + "ed" + ], + [ + "▁firm", + "s" + ], + [ + "▁fir", + "ms" + ], + [ + "F", + "W" + ], + [ + "▁c", + "atalog" + ], + [ + "▁catal", + "og" + ], + [ + "▁", + "catalog" + ], + [ + "bo", + "ards" + ], + [ + "board", + "s" + ], + [ + "▁dise", + "ases" + ], + [ + "▁disease", + "s" + ], + [ + "▁Benj", + "amin" + ], + [ + "▁hor", + "izon" + ], + [ + "▁Av", + "ailable" + ], + [ + "▁", + "Available" + ], + [ + "M", + "vc" + ], + [ + "St", + "ud" + ], + [ + "▁l", + "ord" + ], + [ + "▁lo", + "rd" + ], + [ + "▁", + "lord" + ], + [ + "gen", + "eral" + ], + [ + "gener", + "al" + ], + [ + "па", + "р" + ], + [ + "п", + "ар" + ], + [ + "▁cab", + "inet" + ], + [ + "▁cabin", + "et" + ], + [ + "▁Bas", + "ic" + ], + [ + "▁", + "Basic" + ], + [ + "Test", + "Case" + ], + [ + "an", + "sk" + ], + [ + "ans", + "k" + ], + [ + "▁S", + "now" + ], + [ + "▁Sn", + "ow" + ], + [ + "ier", + "ten" + ], + [ + "iert", + "en" + ], + [ + "ierte", + "n" + ], + [ + "i", + "erten" + ], + [ + "▁v", + "ocal" + ], + [ + "▁vo", + "cal" + ], + [ + "▁voc", + "al" + ], + [ + "Pad", + "ding" + ], + [ + "P", + "adding" + ], + [ + "ha", + "lt" + ], + [ + "hal", + "t" + ], + [ + "h", + "alt" + ], + [ + "▁Alex", + "and" + ], + [ + "▁Col", + "omb" + ], + [ + "iv", + "amente" + ], + [ + "iva", + "mente" + ], + [ + "▁art", + "ificial" + ], + [ + "▁Atl", + "anta" + ], + [ + "▁m", + "entre" + ], + [ + "▁men", + "tre" + ], + [ + "▁ment", + "re" + ], + [ + "▁est", + "aba" + ], + [ + "▁estab", + "a" + ], + [ + "▁esta", + "ba" + ], + [ + "je", + "kt" + ], + [ + "jek", + "t" + ], + [ + "j", + "ekt" + ], + [ + "▁sle", + "pt" + ], + [ + "▁end", + "less" + ], + [ + "▁endl", + "ess" + ], + [ + "ér", + "o" + ], + [ + "é", + "ro" + ], + [ + "at", + "tery" + ], + [ + "att", + "ery" + ], + [ + "atter", + "y" + ], + [ + "atte", + "ry" + ], + [ + "uu", + "r" + ], + [ + "u", + "ur" + ], + [ + "▁weak", + "ness" + ], + [ + "▁attempt", + "ing" + ], + [ + "BY", + "TE" + ], + [ + "▁found", + "er" + ], + [ + "▁fo", + "under" + ], + [ + "▁fou", + "nder" + ], + [ + "▁sa", + "lv" + ], + [ + "▁sal", + "v" + ], + [ + "▁Medic", + "ine" + ], + [ + "ti", + "d" + ], + [ + "t", + "id" + ], + [ + "▁Sch", + "we" + ], + [ + "▁Schw", + "e" + ], + [ + "ra", + "ction" + ], + 
[ + "ract", + "ion" + ], + [ + "r", + "action" + ], + [ + "▁", + "¿" + ], + [ + "cr", + "ate" + ], + [ + "c", + "rate" + ], + [ + "SER", + "VER" + ], + [ + "▁comp", + "ound" + ], + [ + "▁con", + "ve" + ], + [ + "▁conv", + "e" + ], + [ + "▁c", + "af" + ], + [ + "▁ca", + "f" + ], + [ + "▁hand", + "ful" + ], + [ + "on", + "ne" + ], + [ + "úblic", + "a" + ], + [ + "▁def", + "ensive" + ], + [ + "▁defens", + "ive" + ], + [ + "Al", + "ignment" + ], + [ + "Align", + "ment" + ], + [ + "▁pr", + "éc" + ], + [ + "▁pré", + "c" + ], + [ + "▁signific", + "ance" + ], + [ + "él", + "é" + ], + [ + "é", + "lé" + ], + [ + "ar", + "ta" + ], + [ + "art", + "a" + ], + [ + "Da", + "m" + ], + [ + "D", + "am" + ], + [ + "▁per", + "pet" + ], + [ + "▁c", + "aller" + ], + [ + "▁call", + "er" + ], + [ + "▁cal", + "ler" + ], + [ + "ic", + "ients" + ], + [ + "ici", + "ents" + ], + [ + "icient", + "s" + ], + [ + "ce", + "p" + ], + [ + "c", + "ep" + ], + [ + "▁Mult", + "i" + ], + [ + "▁Mul", + "ti" + ], + [ + "▁", + "Multi" + ], + [ + "▁st", + "olen" + ], + [ + "▁sto", + "len" + ], + [ + "▁stole", + "n" + ], + [ + "▁focus", + "ing" + ], + [ + "em", + "bed" + ], + [ + "emb", + "ed" + ], + [ + "▁b", + "ree" + ], + [ + "▁br", + "ee" + ], + [ + "▁bre", + "e" + ], + [ + "▁A", + "B" + ], + [ + "▁", + "AB" + ], + [ + "▁occasion", + "s" + ], + [ + "▁occas", + "ions" + ], + [ + "se", + "a" + ], + [ + "s", + "ea" + ], + [ + "Pro", + "v" + ], + [ + "Pr", + "ov" + ], + [ + "P", + "rov" + ], + [ + "че", + "ние" + ], + [ + "▁C", + "ategory" + ], + [ + "▁", + "Category" + ], + [ + "▁s", + "q" + ], + [ + "▁", + "sq" + ], + [ + "▁Ф", + "е" + ], + [ + "V", + "A" + ], + [ + "Di", + "ff" + ], + [ + "D", + "iff" + ], + [ + "Tr", + "i" + ], + [ + "T", + "ri" + ], + [ + "iss", + "ement" + ], + [ + "isse", + "ment" + ], + [ + "▁act", + "ress" + ], + [ + "▁П", + "е" + ], + [ + "▁j", + "ej" + ], + [ + "▁je", + "j" + ], + [ + "▁tw", + "isted" + ], + [ + "▁twist", + "ed" + ], + [ + "▁N", + "icol" + ], + [ + "▁Nic", + "ol" + ], + [ + "▁Ni", + "col" + ], + [ + "▁jun", + "ior" + ], + [ + "▁junio", + "r" + ], + [ + "▁juni", + "or" + ], + [ + "So", + "und" + ], + [ + "S", + "ound" + ], + [ + "▁Bra", + "sil" + ], + [ + "▁Bras", + "il" + ], + [ + "▁ju", + "ice" + ], + [ + "▁>", + ">>" + ], + [ + "▁>>", + ">" + ], + [ + "▁", + ">>>" + ], + [ + "▁A", + "lb" + ], + [ + "▁Al", + "b" + ], + [ + "▁soft", + "ly" + ], + [ + "▁Mc", + "K" + ], + [ + "▁G", + "ren" + ], + [ + "▁Gr", + "en" + ], + [ + "▁Gre", + "n" + ], + [ + "▁ital", + "iano" + ], + [ + "▁cre", + "atures" + ], + [ + "▁creat", + "ures" + ], + [ + "▁creature", + "s" + ], + [ + "▁res", + "idential" + ], + [ + "▁resident", + "ial" + ], + [ + "▁resid", + "ential" + ], + [ + "▁Inst", + "agram" + ], + [ + "uck", + "s" + ], + [ + "uc", + "ks" + ], + [ + "u", + "cks" + ], + [ + "▁k", + "iller" + ], + [ + "▁kill", + "er" + ], + [ + "▁kil", + "ler" + ], + [ + "▁John", + "ny" + ], + [ + "▁enter", + "prise" + ], + [ + "D", + "to" + ], + [ + "ch", + "estra" + ], + [ + "che", + "stra" + ], + [ + "ches", + "tra" + ], + [ + "chestr", + "a" + ], + [ + "▁T", + "el" + ], + [ + "▁Te", + "l" + ], + [ + "▁Act", + "iv" + ], + [ + "▁", + "Activ" + ], + [ + "fa", + "ctor" + ], + [ + "fac", + "tor" + ], + [ + "fact", + "or" + ], + [ + "f", + "actor" + ], + [ + "ou", + "st" + ], + [ + "ous", + "t" + ], + [ + "o", + "ust" + ], + [ + "▁vac", + "uum" + ], + [ + "ра", + "л" + ], + [ + "р", + "ал" + ], + [ + "')", + "->" + ], + [ + "'", + ")->" + ], + [ + "▁L", + "eft" + ], + [ + "▁Le", + "ft" + ], + [ + "▁", + "Left" + ], + [ + 
"▁de", + "fect" + ], + [ + "▁def", + "ect" + ], + [ + "▁defe", + "ct" + ], + [ + "▁nine", + "te" + ], + [ + "▁nin", + "ete" + ], + [ + "fa", + "re" + ], + [ + "far", + "e" + ], + [ + "f", + "are" + ], + [ + "▁reg", + "ret" + ], + [ + "▁s", + "har" + ], + [ + "▁sh", + "ar" + ], + [ + "▁sha", + "r" + ], + [ + "ctr", + "ine" + ], + [ + "me", + "sh" + ], + [ + "mes", + "h" + ], + [ + "m", + "esh" + ], + [ + "ci", + "ty" + ], + [ + "cit", + "y" + ], + [ + "c", + "ity" + ], + [ + "ic", + "it" + ], + [ + "ici", + "t" + ], + [ + "i", + "cit" + ], + [ + "▁F", + "em" + ], + [ + "▁Fe", + "m" + ], + [ + "lim", + "ited" + ], + [ + "limit", + "ed" + ], + [ + "ok", + "a" + ], + [ + "o", + "ka" + ], + [ + "!\\", + "!\\" + ], + [ + "Don", + "ald" + ], + [ + "з", + "но" + ], + [ + "▁pro", + "vision" + ], + [ + "▁prov", + "ision" + ], + [ + "▁discuss", + "ions" + ], + [ + "▁discussion", + "s" + ], + [ + "Dr", + "ag" + ], + [ + "D", + "rag" + ], + [ + "▁In", + "cl" + ], + [ + "▁Inc", + "l" + ], + [ + "Ex", + "it" + ], + [ + "E", + "xit" + ], + [ + "▁A", + "bd" + ], + [ + "▁Ab", + "d" + ], + [ + "st", + "ory" + ], + [ + "sto", + "ry" + ], + [ + "ie", + "ve" + ], + [ + "iev", + "e" + ], + [ + "i", + "eve" + ], + [ + "▁by", + "ł" + ], + [ + "ol", + "ving" + ], + [ + "olv", + "ing" + ], + [ + "woh", + "ner" + ], + [ + "▁gu", + "idelines" + ], + [ + "▁guide", + "lines" + ], + [ + "▁guid", + "elines" + ], + [ + "▁st", + "raw" + ], + [ + "▁str", + "aw" + ], + [ + "▁stra", + "w" + ], + [ + "ü", + "ss" + ], + [ + "▁бу", + "ло" + ], + [ + "▁bur", + "den" + ], + [ + "▁spat", + "ial" + ], + [ + "▁stret", + "ched" + ], + [ + "▁stretch", + "ed" + ], + [ + "▁I", + "nf" + ], + [ + "▁In", + "f" + ], + [ + "▁", + "Inf" + ], + [ + "▁type", + "def" + ], + [ + "▁typed", + "ef" + ], + [ + "▁ro", + "bot" + ], + [ + "▁rob", + "ot" + ], + [ + "▁D", + "oc" + ], + [ + "▁Do", + "c" + ], + [ + "▁", + "Doc" + ], + [ + "pl", + "iers" + ], + [ + "plier", + "s" + ], + [ + "wa", + "l" + ], + [ + "w", + "al" + ], + [ + "ca", + "mp" + ], + [ + "cam", + "p" + ], + [ + "c", + "amp" + ], + [ + "▁dif", + "fé" + ], + [ + "▁diff", + "é" + ], + [ + "▁Mc", + "G" + ], + [ + "▁t", + "el" + ], + [ + "▁te", + "l" + ], + [ + "ar", + "ette" + ], + [ + "aret", + "te" + ], + [ + "▁sub", + "sequently" + ], + [ + "▁subsequ", + "ently" + ], + [ + "▁subsequent", + "ly" + ], + [ + "▁h", + "oney" + ], + [ + "▁hon", + "ey" + ], + [ + "▁ho", + "ney" + ], + [ + "FUN", + "C" + ], + [ + "▁establish", + "ment" + ], + [ + "te", + "sy" + ], + [ + "tes", + "y" + ], + [ + "▁któ", + "ry" + ], + [ + "▁се", + "ль" + ], + [ + "▁F", + "O" + ], + [ + "▁", + "FO" + ], + [ + "▁Is", + "lands" + ], + [ + "▁Island", + "s" + ], + [ + "▁m", + "p" + ], + [ + "▁", + "mp" + ], + [ + "Scal", + "ar" + ], + [ + "▁Y", + "an" + ], + [ + "▁Ya", + "n" + ], + [ + "ck", + "en" + ], + [ + "cke", + "n" + ], + [ + "c", + "ken" + ], + [ + "▁var", + "iation" + ], + [ + "▁vari", + "ation" + ], + [ + "i", + "ą" + ], + [ + "op", + "tim" + ], + [ + "opt", + "im" + ], + [ + "az", + "or" + ], + [ + "tu", + "ple" + ], + [ + "t", + "uple" + ], + [ + "▁gr", + "avity" + ], + [ + "▁grav", + "ity" + ], + [ + "▁con", + "clude" + ], + [ + "▁concl", + "ude" + ], + [ + "▁col", + "lections" + ], + [ + "▁collection", + "s" + ], + [ + "▁collect", + "ions" + ], + [ + "▁colle", + "ctions" + ], + [ + "és", + "z" + ], + [ + "é", + "sz" + ], + [ + "▁L", + "iver" + ], + [ + "▁Li", + "ver" + ], + [ + "▁Live", + "r" + ], + [ + "▁Liv", + "er" + ], + [ + "▁eth", + "nic" + ], + [ + "comp", + "ile" + ], + [ + "▁p", + "arl" + ], + [ + 
"▁par", + "l" + ], + [ + "▁pa", + "rl" + ], + [ + "Sur", + "face" + ], + [ + "{", + "'" + ], + [ + "▁par", + "agraph" + ], + [ + "▁para", + "graph" + ], + [ + "▁", + "paragraph" + ], + [ + "pos", + "ite" + ], + [ + "po", + "site" + ], + [ + "ít", + "ulo" + ], + [ + "ob", + "a" + ], + [ + "o", + "ba" + ], + [ + "bin", + "ary" + ], + [ + "b", + "inary" + ], + [ + "ro", + "b" + ], + [ + "r", + "ob" + ], + [ + "▁Pe", + "dro" + ], + [ + "▁Ped", + "ro" + ], + [ + "▁f", + "is" + ], + [ + "▁fi", + "s" + ], + [ + "▁Gr", + "ande" + ], + [ + "▁Grand", + "e" + ], + [ + "▁Gran", + "de" + ], + [ + "▁Gra", + "nde" + ], + [ + "od", + "ox" + ], + [ + "odo", + "x" + ], + [ + "▁pos", + "ting" + ], + [ + "▁post", + "ing" + ], + [ + "<", + "!--" + ], + [ + "▁rac", + "ial" + ], + [ + "▁ra", + "cial" + ], + [ + "CO", + "M" + ], + [ + "C", + "OM" + ], + [ + "ё", + "м" + ], + [ + "▁A", + "UT" + ], + [ + "▁AU", + "T" + ], + [ + "▁", + "AUT" + ], + [ + "▁d", + "ishes" + ], + [ + "▁dis", + "hes" + ], + [ + "▁dish", + "es" + ], + [ + "assert", + "True" + ], + [ + "▁G", + "row" + ], + [ + "▁Gr", + "ow" + ], + [ + "▁Gro", + "w" + ], + [ + "▁sl", + "id" + ], + [ + "▁ju", + "illet" + ], + [ + "сс", + "о" + ], + [ + "с", + "со" + ], + [ + "Run", + "ner" + ], + [ + "Sa", + "l" + ], + [ + "S", + "al" + ], + [ + "Sa", + "me" + ], + [ + "Sam", + "e" + ], + [ + "S", + "ame" + ], + [ + "▁Stud", + "y" + ], + [ + "▁Col", + "onel" + ], + [ + "▁J", + "oin" + ], + [ + "▁Jo", + "in" + ], + [ + "▁", + "Join" + ], + [ + "ar", + "ms" + ], + [ + "arm", + "s" + ], + [ + "▁l", + "y" + ], + [ + "▁", + "ly" + ], + [ + "▁co", + "oper" + ], + [ + "▁cur", + "ves" + ], + [ + "▁curve", + "s" + ], + [ + "▁curv", + "es" + ], + [ + "He", + "alth" + ], + [ + "▁M", + "OD" + ], + [ + "▁MO", + "D" + ], + [ + "▁", + "MOD" + ], + [ + "▁pr", + "imo" + ], + [ + "▁prim", + "o" + ], + [ + "▁pri", + "mo" + ], + [ + "ock", + "ets" + ], + [ + "ocket", + "s" + ], + [ + "multi", + "column" + ], + [ + "multicol", + "umn" + ], + [ + "▁С", + "ан" + ], + [ + "▁Са", + "н" + ], + [ + "▁H", + "unter" + ], + [ + "▁Hun", + "ter" + ], + [ + "▁Hunt", + "er" + ], + [ + "Custom", + "er" + ], + [ + "ot", + "hy" + ], + [ + "oth", + "y" + ], + [ + "o", + "thy" + ], + [ + "Des", + "ign" + ], + [ + "De", + "sign" + ], + [ + "ma", + "ss" + ], + [ + "mas", + "s" + ], + [ + "m", + "ass" + ], + [ + "▁fam", + "ille" + ], + [ + "▁famil", + "le" + ], + [ + "▁fue", + "ron" + ], + [ + "▁fu", + "eron" + ], + [ + "▁fuer", + "on" + ], + [ + "ä", + "m" + ], + [ + "▁head", + "quarters" + ], + [ + "▁d", + "ign" + ], + [ + "▁di", + "gn" + ], + [ + "▁dig", + "n" + ], + [ + "▁Ro", + "bin" + ], + [ + "▁Rob", + "in" + ], + [ + "▁me", + "ets" + ], + [ + "▁meet", + "s" + ], + [ + "▁so", + "it" + ], + [ + "па", + "да" + ], + [ + "пад", + "а" + ], + [ + ")\"", + ");" + ], + [ + ")", + "\");" + ], + [ + "▁w", + "rapper" + ], + [ + "▁wrap", + "per" + ], + [ + "▁", + "wrapper" + ], + [ + "▁theoret", + "ical" + ], + [ + "▁u", + "d" + ], + [ + "▁", + "ud" + ], + [ + "pl", + "icity" + ], + [ + "plic", + "ity" + ], + [ + "plicit", + "y" + ], + [ + "▁w", + "p" + ], + [ + "▁", + "wp" + ], + [ + "▁испо", + "ль" + ], + [ + "▁c", + "amps" + ], + [ + "▁camp", + "s" + ], + [ + "▁cam", + "ps" + ], + [ + "▁A", + "gency" + ], + [ + "▁Ag", + "ency" + ], + [ + "g", + "c" + ], + [ + "hu", + "m" + ], + [ + "h", + "um" + ], + [ + "AT", + "T" + ], + [ + "A", + "TT" + ], + [ + "B", + "tn" + ], + [ + "C", + "ent" + ], + [ + "▁H", + "elen" + ], + [ + "▁He", + "len" + ], + [ + "▁Hel", + "en" + ], + [ + "▁am", + "plit" + ], + [ + 
"▁ampl", + "it" + ], + [ + "▁Mem", + "orial" + ], + [ + "und", + "ial" + ], + [ + "SH", + "IFT" + ], + [ + "wi", + "k" + ], + [ + "w", + "ik" + ], + [ + "▁Lie", + "utenant" + ], + [ + "VAL", + "ID" + ], + [ + "▁B", + "ath" + ], + [ + "▁Ba", + "th" + ], + [ + "▁Bat", + "h" + ], + [ + "▁Jeff", + "erson" + ], + [ + "▁C", + "ut" + ], + [ + "▁Cu", + "t" + ], + [ + "▁", + "Cut" + ], + [ + "▁ser", + "vers" + ], + [ + "▁serv", + "ers" + ], + [ + "▁server", + "s" + ], + [ + "▁serve", + "rs" + ], + [ + "▁", + "servers" + ], + [ + "ly", + "ph" + ], + [ + "▁CO", + "PY" + ], + [ + "▁COP", + "Y" + ], + [ + "▁comput", + "ers" + ], + [ + "▁computer", + "s" + ], + [ + "▁compute", + "rs" + ], + [ + "const", + "ruction" + ], + [ + "construct", + "ion" + ], + [ + "▁P", + "DF" + ], + [ + "▁PD", + "F" + ], + [ + "▁", + "PDF" + ], + [ + "▁pro", + "tagon" + ], + [ + "▁prot", + "agon" + ], + [ + "▁fore", + "head" + ], + [ + "custom", + "er" + ], + [ + "Un", + "is" + ], + [ + "U", + "nis" + ], + [ + "▁sign", + "ing" + ], + [ + "▁sig", + "ning" + ], + [ + ".", + "’" + ], + [ + "F", + "etch" + ], + [ + "▁S", + "core" + ], + [ + "▁Sc", + "ore" + ], + [ + "▁", + "Score" + ], + [ + "hu", + "man" + ], + [ + "hum", + "an" + ], + [ + "h", + "uman" + ], + [ + "▁down", + "town" + ], + [ + "▁downt", + "own" + ], + [ + "In", + "tern" + ], + [ + "Int", + "ern" + ], + [ + "Inter", + "n" + ], + [ + "▁bes", + "ides" + ], + [ + "▁beside", + "s" + ], + [ + "▁д", + "во" + ], + [ + "▁пра", + "ви" + ], + [ + "▁", + "прави" + ], + [ + "▁c", + "c" + ], + [ + "▁", + "cc" + ], + [ + "▁D", + "ebug" + ], + [ + "▁De", + "bug" + ], + [ + "▁Deb", + "ug" + ], + [ + "▁", + "Debug" + ], + [ + "▁Cl", + "ose" + ], + [ + "▁", + "Close" + ], + [ + "el", + "ihood" + ], + [ + "eli", + "hood" + ], + [ + "▁al", + "gorithms" + ], + [ + "▁algorithm", + "s" + ], + [ + "▁H", + "amb" + ], + [ + "▁Ham", + "b" + ], + [ + "▁Ha", + "mb" + ], + [ + "ч", + "на" + ], + [ + "▁c", + "ust" + ], + [ + "▁cu", + "st" + ], + [ + "▁mo", + "unted" + ], + [ + "▁mount", + "ed" + ], + [ + "par", + "en" + ], + [ + "pa", + "ren" + ], + [ + "pare", + "n" + ], + [ + "p", + "aren" + ], + [ + "▁isol", + "ated" + ], + [ + "▁A", + "gr" + ], + [ + "▁Ag", + "r" + ], + [ + "▁or", + "bit" + ], + [ + "▁orb", + "it" + ], + [ + "print", + "k" + ], + [ + "▁t", + "urb" + ], + [ + "▁tu", + "rb" + ], + [ + "▁tur", + "b" + ], + [ + "▁gru", + "po" + ], + [ + "ми", + "и" + ], + [ + "\"\"", + "\"" + ], + [ + "\"", + "\"\"" + ], + [ + "▁h", + "ills" + ], + [ + "▁hill", + "s" + ], + [ + "ря", + "д" + ], + [ + "▁B", + "od" + ], + [ + "▁Bo", + "d" + ], + [ + "▁об", + "ще" + ], + [ + "est", + "one" + ], + [ + "esto", + "ne" + ], + [ + "eston", + "e" + ], + [ + "e", + "stone" + ], + [ + "▁satisf", + "ying" + ], + [ + "▁satisfy", + "ing" + ], + [ + "▁I", + "van" + ], + [ + "▁Iv", + "an" + ], + [ + "▁associ", + "ate" + ], + [ + "name", + "d" + ], + [ + "na", + "med" + ], + [ + "nam", + "ed" + ], + [ + "n", + "amed" + ], + [ + "oc", + "cup" + ], + [ + "occ", + "up" + ], + [ + "GP", + "IO" + ], + [ + "G", + "PIO" + ], + [ + "hi", + "t" + ], + [ + "h", + "it" + ], + [ + "▁dis", + "tract" + ], + [ + "▁di", + "stract" + ], + [ + "▁dist", + "ract" + ], + [ + "▁bar", + "rel" + ], + [ + "▁barr", + "el" + ], + [ + "▁in", + "variant" + ], + [ + "di", + "d" + ], + [ + "d", + "id" + ], + [ + "▁l", + "ieu" + ], + [ + "▁li", + "eu" + ], + [ + "▁lie", + "u" + ], + [ + "sc", + "ene" + ], + [ + "UN", + "K" + ], + [ + "▁Ont", + "ario" + ], + [ + "▁M", + "ission" + ], + [ + "▁Miss", + "ion" + ], + [ + "zi", + "al" + ], + [ + 
"z", + "ial" + ], + [ + "▁comp", + "ete" + ], + [ + "▁compet", + "e" + ], + [ + "▁cou", + "ples" + ], + [ + "▁couple", + "s" + ], + [ + "▁coup", + "les" + ], + [ + "SH", + "A" + ], + [ + "S", + "HA" + ], + [ + "▁s", + "ei" + ], + [ + "▁se", + "i" + ], + [ + "▁m", + "igration" + ], + [ + "▁migr", + "ation" + ], + [ + "ac", + "ked" + ], + [ + "ack", + "ed" + ], + [ + "▁b", + "arn" + ], + [ + "▁bar", + "n" + ], + [ + "▁ba", + "rn" + ], + [ + "hal", + "f" + ], + [ + "h", + "alf" + ], + [ + "▁neigh", + "bour" + ], + [ + "▁neighb", + "our" + ], + [ + "ft", + "e" + ], + [ + "f", + "te" + ], + [ + "▁od", + "ds" + ], + [ + "▁odd", + "s" + ], + [ + "▁optim", + "ization" + ], + [ + "▁I", + "C" + ], + [ + "▁", + "IC" + ], + [ + "▁H", + "end" + ], + [ + "▁He", + "nd" + ], + [ + "▁Hen", + "d" + ], + [ + "pay", + "ment" + ], + [ + "M", + "r" + ], + [ + "')", + ":" + ], + [ + "'", + "):" + ], + [ + "vo", + "ir" + ], + [ + "v", + "oir" + ], + [ + "▁R", + "ange" + ], + [ + "▁Rang", + "e" + ], + [ + "▁Ran", + "ge" + ], + [ + "▁", + "Range" + ], + [ + "▁polit", + "icians" + ], + [ + "▁politician", + "s" + ], + [ + "▁K", + "han" + ], + [ + "▁Kh", + "an" + ], + [ + "▁shel", + "ter" + ], + [ + "▁tim", + "ing" + ], + [ + "▁ti", + "ming" + ], + [ + "Create", + "d" + ], + [ + "Creat", + "ed" + ], + [ + "C", + "reated" + ], + [ + "▁sept", + "embre" + ], + [ + "li", + "t" + ], + [ + "l", + "it" + ], + [ + "▁S", + "hel" + ], + [ + "▁She", + "l" + ], + [ + "▁Sh", + "el" + ], + [ + "▁c", + "ouch" + ], + [ + "▁co", + "uch" + ], + [ + "▁cou", + "ch" + ], + [ + "▁d", + "är" + ], + [ + "ult", + "ur" + ], + [ + "▁G", + "iov" + ], + [ + "▁Gi", + "ov" + ], + [ + "ô", + "le" + ], + [ + "RE", + "AM" + ], + [ + "▁O", + "cean" + ], + [ + "▁M", + "B" + ], + [ + "▁", + "MB" + ], + [ + "▁lie", + "gt" + ], + [ + "▁o", + "v" + ], + [ + "▁", + "ov" + ], + [ + "▁car", + "pet" + ], + [ + "та", + "р" + ], + [ + "т", + "ар" + ], + [ + "▁го", + "дина" + ], + [ + "▁годи", + "на" + ], + [ + "▁S", + "ão" + ], + [ + "▁о", + "тно" + ], + [ + "▁от", + "но" + ], + [ + "ab", + "ling" + ], + [ + "abl", + "ing" + ], + [ + "a", + "bling" + ], + [ + "in", + "th" + ], + [ + "int", + "h" + ], + [ + "▁purs", + "ue" + ], + [ + "▁Const", + "itution" + ], + [ + "an", + "j" + ], + [ + "▁F", + "BI" + ], + [ + "▁ar", + "row" + ], + [ + "▁arr", + "ow" + ], + [ + "▁", + "arrow" + ], + [ + "ph", + "ones" + ], + [ + "phone", + "s" + ], + [ + "▁kn", + "ocked" + ], + [ + "▁knock", + "ed" + ], + [ + "▁de", + "com" + ], + [ + "▁dec", + "om" + ], + [ + "ie", + "k" + ], + [ + "i", + "ek" + ], + [ + "ь", + "е" + ], + [ + "St", + "rip" + ], + [ + "Str", + "ip" + ], + [ + "▁V", + "enez" + ], + [ + "▁Ven", + "ez" + ], + [ + "▁Ve", + "nez" + ], + [ + "▁p", + "upp" + ], + [ + "▁pu", + "pp" + ], + [ + "▁pup", + "p" + ], + [ + "bi", + "an" + ], + [ + "bia", + "n" + ], + [ + "b", + "ian" + ], + [ + "▁cot", + "ton" + ], + [ + "h", + "p" + ], + [ + "▁the", + "atre" + ], + [ + "▁accept", + "able" + ], + [ + "cuss", + "ion" + ], + [ + "▁r", + "ounds" + ], + [ + "▁round", + "s" + ], + [ + "▁act", + "ively" + ], + [ + "▁activ", + "ely" + ], + [ + "▁active", + "ly" + ], + [ + "▁among", + "st" + ], + [ + "▁a", + "bc" + ], + [ + "▁ab", + "c" + ], + [ + "▁", + "abc" + ], + [ + "F", + "M" + ], + [ + "Pop", + "up" + ], + [ + "▁divers", + "ity" + ], + [ + "us", + "z" + ], + [ + "u", + "sz" + ], + [ + "▁employ", + "er" + ], + [ + "spec", + "ially" + ], + [ + "special", + "ly" + ], + [ + "▁sus", + "pected" + ], + [ + "▁suspect", + "ed" + ], + [ + "▁c", + "rypt" + ], + [ + "▁cry", + "pt" + ], + [ 
+ "▁O", + "scar" + ], + [ + "▁Os", + "car" + ], + [ + "no", + "r" + ], + [ + "n", + "or" + ], + [ + "▁bab", + "ies" + ], + [ + "во", + "м" + ], + [ + "▁m", + "undo" + ], + [ + "▁li", + "bert" + ], + [ + "▁lib", + "ert" + ], + [ + "▁liber", + "t" + ], + [ + "S", + "G" + ], + [ + "ah", + "ren" + ], + [ + "ahr", + "en" + ], + [ + "a", + "hren" + ], + [ + "▁magn", + "itude" + ], + [ + "T", + "M" + ], + [ + "'", + "+" + ], + [ + "▁об", + "ъ" + ], + [ + "▁G", + "ust" + ], + [ + "▁Gu", + "st" + ], + [ + "▁gr", + "ain" + ], + [ + "▁gra", + "in" + ], + [ + "мен", + "т" + ], + [ + "м", + "ент" + ], + [ + "to", + "Equal" + ], + [ + "▁m", + "os" + ], + [ + "▁mo", + "s" + ], + [ + "▁", + "mos" + ], + [ + "▁consist", + "ently" + ], + [ + "▁consistent", + "ly" + ], + [ + "х", + "у" + ], + [ + "▁domin", + "ant" + ], + [ + "Con", + "verter" + ], + [ + "Convert", + "er" + ], + [ + "at", + "able" + ], + [ + "ata", + "ble" + ], + [ + "a", + "table" + ], + [ + "▁J", + "ag" + ], + [ + "▁Ja", + "g" + ], + [ + "scri", + "ptions" + ], + [ + "script", + "ions" + ], + [ + "scription", + "s" + ], + [ + "s", + "criptions" + ], + [ + "x", + "B" + ], + [ + "▁", + "©" + ], + [ + "fol", + "der" + ], + [ + "fold", + "er" + ], + [ + "f", + "older" + ], + [ + "▁sub", + "stance" + ], + [ + "▁subst", + "ance" + ], + [ + "▁по", + "с" + ], + [ + "L", + "o" + ], + [ + "BU", + "S" + ], + [ + "B", + "US" + ], + [ + "bas", + "ic" + ], + [ + "us", + "sen" + ], + [ + "uss", + "en" + ], + [ + "▁co", + "ins" + ], + [ + "▁coin", + "s" + ], + [ + ":", + "-" + ], + [ + "▁N", + "elson" + ], + [ + "▁Nel", + "son" + ], + [ + "In", + "ner" + ], + [ + "ograf", + "ía" + ], + [ + "▁ex", + "empl" + ], + [ + "▁exem", + "pl" + ], + [ + "ch", + "g" + ], + [ + "▁sy", + "nd" + ], + [ + "▁syn", + "d" + ], + [ + "dyn", + "amic" + ], + [ + "d", + "ynamic" + ], + [ + "ver", + "ted" + ], + [ + "vert", + "ed" + ], + [ + "▁EV", + "ENT" + ], + [ + "▁", + "EVENT" + ], + [ + "se", + "ek" + ], + [ + "see", + "k" + ], + [ + "av", + "ier" + ], + [ + "avi", + "er" + ], + [ + "a", + "vier" + ], + [ + "▁p", + "rot" + ], + [ + "▁pro", + "t" + ], + [ + "▁pr", + "ot" + ], + [ + "▁", + "prot" + ], + [ + "--", + "----" + ], + [ + "----", + "--" + ], + [ + "---", + "---" + ], + [ + "-----", + "-" + ], + [ + "-", + "-----" + ], + [ + "▁con", + "vention" + ], + [ + "▁conv", + "ention" + ], + [ + "▁convent", + "ion" + ], + [ + "▁станов", + "ника" + ], + [ + "gl", + "ing" + ], + [ + "g", + "ling" + ], + [ + "hor", + "a" + ], + [ + "ho", + "ra" + ], + [ + "h", + "ora" + ], + [ + "ши", + "й" + ], + [ + "▁wh", + "ilst" + ], + [ + "ser", + "ialize" + ], + [ + "serial", + "ize" + ], + [ + "s", + "erialize" + ], + [ + "▁R", + "ing" + ], + [ + "([", + "'" + ], + [ + "(", + "['" + ], + [ + "▁c", + "her" + ], + [ + "▁ch", + "er" + ], + [ + "▁che", + "r" + ], + [ + "▁", + "cher" + ], + [ + "сь", + "кі" + ], + [ + "▁D", + "anny" + ], + [ + "▁Dan", + "ny" + ], + [ + "▁re", + "aches" + ], + [ + "▁reach", + "es" + ], + [ + "▁el", + "igible" + ], + [ + "▁P", + "arent" + ], + [ + "▁Par", + "ent" + ], + [ + "▁Pa", + "rent" + ], + [ + "▁", + "Parent" + ], + [ + "▁came", + "ras" + ], + [ + "▁cam", + "eras" + ], + [ + "▁camera", + "s" + ], + [ + "▁discipl", + "ine" + ], + [ + "▁s", + "illy" + ], + [ + "▁sil", + "ly" + ], + [ + "re", + "ts" + ], + [ + "ret", + "s" + ], + [ + "r", + "ets" + ], + [ + "yt", + "ics" + ], + [ + "▁Reg", + "ional" + ], + [ + "▁Region", + "al" + ], + [ + "▁B", + "aby" + ], + [ + "▁Ba", + "by" + ], + [ + "▁Bab", + "y" + ], + [ + "te", + "le" + ], + [ + "t", + "ele" + ], + [ 
+ "WAR", + "NING" + ], + [ + "WARN", + "ING" + ], + [ + "su", + "pp" + ], + [ + "sup", + "p" + ], + [ + "s", + "upp" + ], + [ + "▁refer", + "ring" + ], + [ + "▁mer", + "ch" + ], + [ + "▁merc", + "h" + ], + [ + "ol", + "ves" + ], + [ + "olve", + "s" + ], + [ + "olv", + "es" + ], + [ + "em", + "et" + ], + [ + "eme", + "t" + ], + [ + "e", + "met" + ], + [ + "ck", + "e" + ], + [ + "c", + "ke" + ], + [ + "▁M", + "unicip" + ], + [ + "▁Mun", + "icip" + ], + [ + "Wh", + "ite" + ], + [ + "▁", + "Ś" + ], + [ + "ri", + "os" + ], + [ + "rio", + "s" + ], + [ + "r", + "ios" + ], + [ + "log", + "ging" + ], + [ + "▁d", + "x" + ], + [ + "▁", + "dx" + ], + [ + "▁su", + "sp" + ], + [ + "▁sus", + "p" + ], + [ + "ex", + "ternal" + ], + [ + "▁Liber", + "al" + ], + [ + "▁Lib", + "eral" + ], + [ + "▁Init", + "ialize" + ], + [ + "▁Initial", + "ize" + ], + [ + "▁", + "Initialize" + ], + [ + "▁exhib", + "ition" + ], + [ + "▁exhibit", + "ion" + ], + [ + "▁ext", + "ensions" + ], + [ + "▁extension", + "s" + ], + [ + "▁extens", + "ions" + ], + [ + "▁", + "extensions" + ], + [ + "ke", + "eper" + ], + [ + "keep", + "er" + ], + [ + "kee", + "per" + ], + [ + "SY", + "S" + ], + [ + "▁J", + "ake" + ], + [ + "▁Ja", + "ke" + ], + [ + "▁Jak", + "e" + ], + [ + "fo", + "oter" + ], + [ + "foot", + "er" + ], + [ + "foo", + "ter" + ], + [ + "▁ph", + "ones" + ], + [ + "▁phone", + "s" + ], + [ + "▁", + "phones" + ], + [ + "▁real", + "m" + ], + [ + "▁contribut", + "ed" + ], + [ + "▁contribute", + "d" + ], + [ + "ME", + "SS" + ], + [ + "▁For", + "mat" + ], + [ + "▁Form", + "at" + ], + [ + "▁", + "Format" + ], + [ + "Per", + "iod" + ], + [ + "▁h", + "id" + ], + [ + "▁hi", + "d" + ], + [ + "▁", + "hid" + ], + [ + "▁me", + "tres" + ], + [ + "▁met", + "res" + ], + [ + "▁D", + "im" + ], + [ + "▁Di", + "m" + ], + [ + "▁", + "Dim" + ], + [ + "ache", + "lor" + ], + [ + "achel", + "or" + ], + [ + "▁T", + "ak" + ], + [ + "▁Ta", + "k" + ], + [ + "▁ве", + "ли" + ], + [ + "▁g", + "ram" + ], + [ + "▁gr", + "am" + ], + [ + "▁gra", + "m" + ], + [ + "▁", + "gram" + ], + [ + "▁M", + "Y" + ], + [ + "▁", + "MY" + ], + [ + "on", + "ders" + ], + [ + "ond", + "ers" + ], + [ + "onder", + "s" + ], + [ + "onde", + "rs" + ], + [ + "';", + "\r" + ], + [ + "'", + ";\r" + ], + [ + "▁F", + "ro" + ], + [ + "▁Fr", + "o" + ], + [ + "▁advant", + "ages" + ], + [ + "▁advantage", + "s" + ], + [ + "io", + "v" + ], + [ + "i", + "ov" + ], + [ + "▁she", + "ets" + ], + [ + "▁sheet", + "s" + ], + [ + "ce", + "mbre" + ], + [ + "c", + "embre" + ], + [ + "ž", + "e" + ], + [ + "]", + "\r" + ], + [ + "▁D", + "J" + ], + [ + "subset", + "eq" + ], + [ + "UP", + "DATE" + ], + [ + "▁b", + "locked" + ], + [ + "▁bl", + "ocked" + ], + [ + "▁block", + "ed" + ], + [ + "▁pan", + "els" + ], + [ + "▁pa", + "nels" + ], + [ + "▁panel", + "s" + ], + [ + "E", + "A" + ], + [ + "nd", + "e" + ], + [ + "n", + "de" + ], + [ + "ê", + "t" + ], + [ + "Bu", + "l" + ], + [ + "B", + "ul" + ], + [ + "▁m", + "eters" + ], + [ + "▁me", + "ters" + ], + [ + "▁met", + "ers" + ], + [ + "▁meter", + "s" + ], + [ + "jo", + "ur" + ], + [ + "j", + "our" + ], + [ + "▁rap", + "port" + ], + [ + "▁rapp", + "ort" + ], + [ + "▁J", + "ak" + ], + [ + "▁Ja", + "k" + ], + [ + "▁V", + "AL" + ], + [ + "▁VA", + "L" + ], + [ + "▁", + "VAL" + ], + [ + "▁p", + "up" + ], + [ + "▁pu", + "p" + ], + [ + "▁k", + "a" + ], + [ + "▁", + "ka" + ], + [ + "for", + "ced" + ], + [ + "force", + "d" + ], + [ + "▁ав", + "гу" + ], + [ + "ener", + "gy" + ], + [ + "e", + "nergy" + ], + [ + "▁V", + "a" + ], + [ + "not", + "es" + ], + [ + "no", + "tes" + ], + [ 
+ "note", + "s" + ], + [ + "n", + "otes" + ], + [ + "▁relax", + "ed" + ], + [ + "C", + "r" + ], + [ + "id", + "ding" + ], + [ + "idd", + "ing" + ], + [ + "▁def", + "ines" + ], + [ + "▁define", + "s" + ], + [ + "▁defin", + "es" + ], + [ + "▁kiss", + "ed" + ], + [ + "▁inv", + "asion" + ], + [ + "▁invas", + "ion" + ], + [ + "▁sc", + "reens" + ], + [ + "▁screen", + "s" + ], + [ + "C", + "trl" + ], + [ + "▁pass", + "engers" + ], + [ + "▁passenger", + "s" + ], + [ + "▁Х", + "о" + ], + [ + "ation", + "ship" + ], + [ + "ations", + "hip" + ], + [ + "per", + "cent" + ], + [ + "\\", + "}" + ], + [ + "▁be", + "ating" + ], + [ + "▁beat", + "ing" + ], + [ + "life", + "ray" + ], + [ + "lifer", + "ay" + ], + [ + "▁V", + "M" + ], + [ + "▁", + "VM" + ], + [ + "▁Gab", + "riel" + ], + [ + "▁g", + "allery" + ], + [ + "▁gall", + "ery" + ], + [ + "▁Л", + "о" + ], + [ + "iv", + "ot" + ], + [ + "ivo", + "t" + ], + [ + "▁r", + "ental" + ], + [ + "▁ren", + "tal" + ], + [ + "▁rent", + "al" + ], + [ + "▁sh", + "ocked" + ], + [ + "▁shock", + "ed" + ], + [ + "▁Ste", + "in" + ], + [ + "▁B", + "h" + ], + [ + "▁", + "ло" + ], + [ + "Un", + "e" + ], + [ + "U", + "ne" + ], + [ + "ге", + "н" + ], + [ + "г", + "ен" + ], + [ + "▁kom", + "mun" + ], + [ + "an", + "ka" + ], + [ + "ank", + "a" + ], + [ + "▁C", + "ape" + ], + [ + "▁Cap", + "e" + ], + [ + "▁Ca", + "pe" + ], + [ + "Re", + "ady" + ], + [ + "Read", + "y" + ], + [ + "▁к", + "ри" + ], + [ + "▁", + "кри" + ], + [ + "tr", + "ag" + ], + [ + "tra", + "g" + ], + [ + "t", + "rag" + ], + [ + "Al", + "ign" + ], + [ + "Ali", + "gn" + ], + [ + "▁host", + "ed" + ], + [ + "▁ho", + "sted" + ], + [ + "▁\\", + "(" + ], + [ + "▁S", + "ession" + ], + [ + "▁", + "Session" + ], + [ + "ys", + "k" + ], + [ + "y", + "sk" + ], + [ + "Pen", + "ding" + ], + [ + "P", + "ending" + ], + [ + "ellig", + "ence" + ], + [ + "elli", + "gence" + ], + [ + "▁Never", + "theless" + ], + [ + "bit", + "ro" + ], + [ + "bitr", + "o" + ], + [ + "ho", + "lm" + ], + [ + "hol", + "m" + ], + [ + "quir", + "y" + ], + [ + "▁mechan", + "ical" + ], + [ + "▁D", + "é" + ], + [ + "an", + "eous" + ], + [ + "ane", + "ous" + ], + [ + "▁psych", + "ological" + ], + [ + "▁a", + "broad" + ], + [ + "▁ab", + "road" + ], + [ + "▁a", + "voir" + ], + [ + "▁av", + "oir" + ], + [ + "▁separ", + "ation" + ], + [ + "▁sep", + "aration" + ], + [ + "▁Haw", + "ai" + ], + [ + "iej", + "sc" + ], + [ + "▁N", + "ether" + ], + [ + "▁Ne", + "ther" + ], + [ + "▁Net", + "her" + ], + [ + "▁sub", + "tle" + ], + [ + "bi", + "rd" + ], + [ + "b", + "ird" + ], + [ + "▁mark", + "er" + ], + [ + "▁mar", + "ker" + ], + [ + "▁", + "marker" + ], + [ + "▁со", + "зда" + ], + [ + "ва", + "ла" + ], + [ + "вал", + "а" + ], + [ + "▁Work", + "ing" + ], + [ + "▁Wor", + "king" + ], + [ + "▁h", + "over" + ], + [ + "▁ho", + "ver" + ], + [ + "▁", + "hover" + ], + [ + "%%%%", + "%%%%" + ], + [ + "▁м", + "ат" + ], + [ + "▁ма", + "т" + ], + [ + "▁", + "мат" + ], + [ + "▁s", + "oup" + ], + [ + "▁so", + "up" + ], + [ + "▁sou", + "p" + ], + [ + "Al", + "ert" + ], + [ + "ch", + "r" + ], + [ + "c", + "hr" + ], + [ + "▁P", + "CI" + ], + [ + "▁PC", + "I" + ], + [ + "▁", + "PCI" + ], + [ + "▁m", + "ús" + ], + [ + "ient", + "ras" + ], + [ + "ien", + "tras" + ], + [ + "▁St", + "orage" + ], + [ + "▁Sto", + "rage" + ], + [ + "▁", + "Storage" + ], + [ + "▁av", + "ailability" + ], + [ + "▁op", + "era" + ], + [ + "▁oper", + "a" + ], + [ + "▁P", + "roduction" + ], + [ + "▁Produ", + "ction" + ], + [ + "▁Product", + "ion" + ], + [ + "ia", + "ne" + ], + [ + "ian", + "e" + ], + [ + "i", + "ane" + ], 
+ [ + "▁Bet", + "ter" + ], + [ + "▁B", + "utton" + ], + [ + "▁But", + "ton" + ], + [ + "▁", + "Button" + ], + [ + "▁Pe", + "ace" + ], + [ + "▁Mor", + "ris" + ], + [ + "▁s", + "ib" + ], + [ + "▁si", + "b" + ], + [ + "▁f", + "iber" + ], + [ + "▁fi", + "ber" + ], + [ + "▁fib", + "er" + ], + [ + "Int", + "ent" + ], + [ + "▁D", + "esc" + ], + [ + "▁De", + "sc" + ], + [ + "▁Des", + "c" + ], + [ + "▁", + "Desc" + ], + [ + "ning", + "en" + ], + [ + "n", + "ingen" + ], + [ + "ze", + "j" + ], + [ + "z", + "ej" + ], + [ + "av", + "an" + ], + [ + "ava", + "n" + ], + [ + "a", + "van" + ], + [ + "cover", + "ed" + ], + [ + "cov", + "ered" + ], + [ + "▁s", + "yst" + ], + [ + "▁sy", + "st" + ], + [ + "▁sys", + "t" + ], + [ + "_", + "+" + ], + [ + "▁орга", + "ни" + ], + [ + "▁Re", + "lig" + ], + [ + "▁Rel", + "ig" + ], + [ + "ци", + "аль" + ], + [ + "▁s", + "pite" + ], + [ + "▁sp", + "ite" + ], + [ + "▁re", + "prés" + ], + [ + "▁~", + "~" + ], + [ + "▁", + "~~" + ], + [ + "▁to", + "xic" + ], + [ + "▁a", + "pro" + ], + [ + "▁ap", + "ro" + ], + [ + "▁apr", + "o" + ], + [ + "X", + "Y" + ], + [ + "▁tr", + "ips" + ], + [ + "▁tri", + "ps" + ], + [ + "▁trip", + "s" + ], + [ + "▁pl", + "aats" + ], + [ + "▁con", + "vey" + ], + [ + "▁conv", + "ey" + ], + [ + "▁conve", + "y" + ], + [ + "Pr", + "im" + ], + [ + "P", + "rim" + ], + [ + "▁о", + "ста" + ], + [ + "▁ос", + "та" + ], + [ + "▁ост", + "а" + ], + [ + "ok", + "o" + ], + [ + "o", + "ko" + ], + [ + "▁l", + "obby" + ], + [ + "▁lob", + "by" + ], + [ + "▁recommend", + "ations" + ], + [ + "▁recommendation", + "s" + ], + [ + "SP", + "ACE" + ], + [ + "▁overwhel", + "ming" + ], + [ + "ennes", + "see" + ], + [ + "▁ac", + "quire" + ], + [ + "▁acqu", + "ire" + ], + [ + "w", + "m" + ], + [ + "LOB", + "AL" + ], + [ + "▁D", + "EF" + ], + [ + "▁DE", + "F" + ], + [ + "▁", + "DEF" + ], + [ + "je", + "r" + ], + [ + "j", + "er" + ], + [ + "▁re", + "cur" + ], + [ + "▁rec", + "ur" + ], + [ + "om", + "men" + ], + [ + "omm", + "en" + ], + [ + "▁j", + "og" + ], + [ + "▁jo", + "g" + ], + [ + "▁n", + "ast" + ], + [ + "▁na", + "st" + ], + [ + "▁nas", + "t" + ], + [ + "▁L", + "P" + ], + [ + "▁", + "LP" + ], + [ + "jo", + "n" + ], + [ + "j", + "on" + ], + [ + "▁w", + "ishes" + ], + [ + "▁wish", + "es" + ], + [ + "▁wis", + "hes" + ], + [ + "▁N", + "ancy" + ], + [ + "▁support", + "ers" + ], + [ + "▁supp", + "orters" + ], + [ + "^{", + "-\\" + ], + [ + "^{-", + "\\" + ], + [ + "▁T", + "rib" + ], + [ + "▁Tr", + "ib" + ], + [ + "▁Tri", + "b" + ], + [ + "▁", + "Ä" + ], + [ + "▁disappoint", + "ed" + ], + [ + "▁у", + "ни" + ], + [ + "x", + "D" + ], + [ + "li", + "nt" + ], + [ + "lin", + "t" + ], + [ + "l", + "int" + ], + [ + "I", + "p" + ], + [ + "▁Islam", + "ic" + ], + [ + "än", + "de" + ], + [ + "änd", + "e" + ], + [ + "ä", + "nde" + ], + [ + "end", + "ment" + ], + [ + "dt", + "ype" + ], + [ + "d", + "type" + ], + [ + "▁di", + "gest" + ], + [ + "▁dig", + "est" + ], + [ + "▁Set", + "tings" + ], + [ + "▁Setting", + "s" + ], + [ + "▁", + "Settings" + ], + [ + "ér", + "a" + ], + [ + "é", + "ra" + ], + [ + "▁aggress", + "ive" + ], + [ + "▁intellig", + "ent" + ], + [ + "eder", + "börd" + ], + [ + "ster", + "dam" + ], + [ + "pc", + "i" + ], + [ + "p", + "ci" + ], + [ + "▁over", + "flow" + ], + [ + "▁", + "overflow" + ], + [ + "im", + "b" + ], + [ + "i", + "mb" + ], + [ + "re", + "ach" + ], + [ + "rea", + "ch" + ], + [ + "r", + "each" + ], + [ + "cept", + "or" + ], + [ + "cep", + "tor" + ], + [ + "▁yield", + "s" + ], + [ + "▁Se", + "bast" + ], + [ + "▁ut", + "ility" + ], + [ + "▁util", + "ity" + ], + [ + 
"▁р", + "и" + ], + [ + "▁", + "ри" + ], + [ + "▁fac", + "ulty" + ], + [ + "▁In", + "ternal" + ], + [ + "▁Intern", + "al" + ], + [ + "▁Inter", + "nal" + ], + [ + "▁", + "Internal" + ], + [ + "▁attract", + "ed" + ], + [ + "▁attra", + "cted" + ], + [ + "рі", + "в" + ], + [ + "р", + "ів" + ], + [ + "▁mix", + "ing" + ], + [ + "▁R", + "uth" + ], + [ + "▁Ru", + "th" + ], + [ + "▁esc", + "aped" + ], + [ + "▁escape", + "d" + ], + [ + "▁E", + "asy" + ], + [ + "▁dr", + "ain" + ], + [ + "▁r", + "ings" + ], + [ + "▁ring", + "s" + ], + [ + "▁", + "rings" + ], + [ + "qu", + "ire" + ], + [ + "quir", + "e" + ], + [ + "Av", + "ailable" + ], + [ + "▁ц", + "и" + ], + [ + "▁", + "ци" + ], + [ + "▁conv", + "ince" + ], + [ + "▁convin", + "ce" + ], + [ + "or", + "sch" + ], + [ + "ors", + "ch" + ], + [ + "ут", + "бо" + ], + [ + "CP", + "P" + ], + [ + "C", + "PP" + ], + [ + "ra", + "ge" + ], + [ + "rag", + "e" + ], + [ + "r", + "age" + ], + [ + "ч", + "і" + ], + [ + "▁p", + "rod" + ], + [ + "▁pro", + "d" + ], + [ + "▁pr", + "od" + ], + [ + "▁", + "prod" + ], + [ + "▁p", + "ig" + ], + [ + "▁pi", + "g" + ], + [ + "▁C", + "atal" + ], + [ + "▁Cat", + "al" + ], + [ + "▁Ca", + "tal" + ], + [ + "▁al", + "ias" + ], + [ + "▁ali", + "as" + ], + [ + "▁", + "alias" + ], + [ + "▁че", + "мпи" + ], + [ + "▁чем", + "пи" + ], + [ + "Pl", + "ace" + ], + [ + "P", + "lace" + ], + [ + "▁g", + "orge" + ], + [ + "▁depend", + "ence" + ], + [ + "▁cr", + "uel" + ], + [ + "▁cru", + "el" + ], + [ + "▁ther", + "mal" + ], + [ + "▁therm", + "al" + ], + [ + "ut", + "down" + ], + [ + "ref", + "resh" + ], + [ + "▁re", + "sort" + ], + [ + "▁res", + "ort" + ], + [ + "▁S", + "HA" + ], + [ + "▁SH", + "A" + ], + [ + "▁", + "SHA" + ], + [ + "ти", + "й" + ], + [ + "fo", + "od" + ], + [ + "foo", + "d" + ], + [ + "f", + "ood" + ], + [ + "▁N", + "ad" + ], + [ + "▁Na", + "d" + ], + [ + "▁pregn", + "ancy" + ], + [ + "▁pro", + "jection" + ], + [ + "▁project", + "ion" + ], + [ + "▁pa", + "ís" + ], + [ + "▁полу", + "чи" + ], + [ + "▁the", + "mes" + ], + [ + "▁them", + "es" + ], + [ + "▁theme", + "s" + ], + [ + "▁fun", + "eral" + ], + [ + "▁cas", + "o" + ], + [ + "▁ca", + "so" + ], + [ + "ле", + "кт" + ], + [ + "лек", + "т" + ], + [ + "Ex", + "tra" + ], + [ + "Ext", + "ra" + ], + [ + "▁t", + "issue" + ], + [ + "▁dr", + "agon" + ], + [ + "▁drag", + "on" + ], + [ + "▁l", + "ig" + ], + [ + "▁li", + "g" + ], + [ + "▁", + "lig" + ], + [ + "▁n", + "ei" + ], + [ + "▁ne", + "i" + ], + [ + "▁com", + "edy" + ], + [ + "▁come", + "dy" + ], + [ + "▁comed", + "y" + ], + [ + "те", + "м" + ], + [ + "т", + "ем" + ], + [ + "сла", + "в" + ], + [ + "с", + "лав" + ], + [ + "▁pass", + "enger" + ], + [ + "Cl", + "one" + ], + [ + "i", + "ção" + ], + [ + "yg", + "on" + ], + [ + "y", + "gon" + ], + [ + "▁H", + "alf" + ], + [ + "▁Hal", + "f" + ], + [ + "▁la", + "bour" + ], + [ + "▁lab", + "our" + ], + [ + "▁vill", + "ages" + ], + [ + "▁village", + "s" + ], + [ + "▁ві", + "й" + ], + [ + "▁О", + "т" + ], + [ + "▁L", + "isa" + ], + [ + "▁Li", + "sa" + ], + [ + "▁Lis", + "a" + ], + [ + "_", + "[" + ], + [ + "ba", + "g" + ], + [ + "b", + "ag" + ], + [ + "▁d", + "iver" + ], + [ + "▁di", + "ver" + ], + [ + "▁div", + "er" + ], + [ + "▁dive", + "r" + ], + [ + "▁M", + "L" + ], + [ + "▁", + "ML" + ], + [ + "▁transl", + "ated" + ], + [ + "▁translate", + "d" + ], + [ + "▁per", + "ò" + ], + [ + "ab", + "ama" + ], + [ + "aba", + "ma" + ], + [ + "▁cas", + "tle" + ], + [ + "▁cast", + "le" + ], + [ + "▁", + "castle" + ], + [ + "*", + "\\" + ], + [ + "▁reg", + "ia" + ], + [ + "!!", + "!!" 
+ ], + [ + "!!!", + "!" + ], + [ + "!", + "!!!" + ], + [ + "*>", + "(" + ], + [ + "*", + ">(" + ], + [ + "▁Work", + "s" + ], + [ + "▁Wor", + "ks" + ], + [ + "▁N", + "ature" + ], + [ + "▁Nat", + "ure" + ], + [ + "▁Natur", + "e" + ], + [ + "NE", + "L" + ], + [ + "N", + "EL" + ], + [ + "▁P", + "om" + ], + [ + "▁Po", + "m" + ], + [ + "tt", + "a" + ], + [ + "t", + "ta" + ], + [ + "▁Jam", + "ie" + ], + [ + "▁p", + "unch" + ], + [ + "▁pun", + "ch" + ], + [ + "tain", + "ment" + ], + [ + "▁K", + "rieg" + ], + [ + "▁Kr", + "ieg" + ], + [ + "▁restr", + "icted" + ], + [ + "▁restrict", + "ed" + ], + [ + "mob", + "ile" + ], + [ + "m", + "obile" + ], + [ + "▁grand", + "mother" + ], + [ + "Arg", + "uments" + ], + [ + "Argument", + "s" + ], + [ + "▁s", + "inc" + ], + [ + "▁si", + "nc" + ], + [ + "▁sin", + "c" + ], + [ + "▁Mon", + "th" + ], + [ + "▁Mont", + "h" + ], + [ + "▁", + "Month" + ], + [ + "esc", + "ape" + ], + [ + "e", + "scape" + ], + [ + "▁opt", + "ical" + ], + [ + "▁L", + "ane" + ], + [ + "▁La", + "ne" + ], + [ + "▁Lan", + "e" + ], + [ + "▁Deutsch", + "land" + ], + [ + "▁S", + "aison" + ], + [ + "▁Sa", + "ison" + ], + [ + "▁V", + "irtual" + ], + [ + "▁", + "Virtual" + ], + [ + "pe", + "z" + ], + [ + "p", + "ez" + ], + [ + "In", + "line" + ], + [ + "ow", + "any" + ], + [ + "owa", + "ny" + ], + [ + "rad", + "io" + ], + [ + "r", + "adio" + ], + [ + "ö", + "ß" + ], + [ + "▁O", + "thers" + ], + [ + "▁Other", + "s" + ], + [ + "MA", + "IN" + ], + [ + "M", + "AIN" + ], + [ + "sc", + "al" + ], + [ + "s", + "cal" + ], + [ + "▁D", + "allas" + ], + [ + "▁Dal", + "las" + ], + [ + "▁an", + "chor" + ], + [ + "▁anc", + "hor" + ], + [ + "▁anch", + "or" + ], + [ + "▁", + "anchor" + ], + [ + "en", + "cias" + ], + [ + "enc", + "ias" + ], + [ + "encia", + "s" + ], + [ + "enci", + "as" + ], + [ + "▁re", + "porter" + ], + [ + "▁rep", + "orter" + ], + [ + "▁report", + "er" + ], + [ + "▁veget", + "ables" + ], + [ + "▁enforce", + "ment" + ], + [ + "▁Wis", + "consin" + ], + [ + "▁con", + "dem" + ], + [ + "▁cond", + "em" + ], + [ + "▁e", + "b" + ], + [ + "▁", + "eb" + ], + [ + "▁s", + "its" + ], + [ + "▁sit", + "s" + ], + [ + "▁si", + "ts" + ], + [ + "▁calcul", + "ations" + ], + [ + "▁calculation", + "s" + ], + [ + "▁calc", + "ulations" + ], + [ + "▁\"", + "--" + ], + [ + "▁\"-", + "-" + ], + [ + "ue", + "lle" + ], + [ + "uel", + "le" + ], + [ + "u", + "elle" + ], + [ + "▁tip", + "o" + ], + [ + "▁ti", + "po" + ], + [ + "▁P", + "AR" + ], + [ + "▁PA", + "R" + ], + [ + "▁", + "PAR" + ], + [ + "co", + "rd" + ], + [ + "cor", + "d" + ], + [ + "c", + "ord" + ], + [ + "▁ро", + "ків" + ], + [ + "ph", + "an" + ], + [ + "pha", + "n" + ], + [ + "p", + "han" + ], + [ + "▁kon", + "nte" + ], + [ + "▁z", + "ap" + ], + [ + "▁za", + "p" + ], + [ + "wr", + "iting" + ], + [ + "writ", + "ing" + ], + [ + "en", + "gu" + ], + [ + "eng", + "u" + ], + [ + "▁pert", + "urb" + ], + [ + "Fac", + "e" + ], + [ + "F", + "ace" + ], + [ + "ag", + "og" + ], + [ + "ago", + "g" + ], + [ + "▁De", + "cl" + ], + [ + "▁Dec", + "l" + ], + [ + "▁", + "Decl" + ], + [ + "est", + "ly" + ], + [ + "▁War", + "ren" + ], + [ + "▁H", + "ills" + ], + [ + "▁Hill", + "s" + ], + [ + "▁Hil", + "ls" + ], + [ + "▁ref", + "resh" + ], + [ + "▁refr", + "esh" + ], + [ + "▁refres", + "h" + ], + [ + "▁", + "refresh" + ], + [ + "▁fl", + "ip" + ], + [ + "io", + "p" + ], + [ + "i", + "op" + ], + [ + "▁key", + "board" + ], + [ + "is", + "to" + ], + [ + "ist", + "o" + ], + [ + "i", + "sto" + ], + [ + "▁prom", + "oted" + ], + [ + "▁promote", + "d" + ], + [ + "▁promot", + "ed" + ], + [ + 
"back", + "s" + ], + [ + "ba", + "cks" + ], + [ + "b", + "acks" + ], + [ + "Enc", + "oding" + ], + [ + "▁", + "ال" + ], + [ + "▁g", + "min" + ], + [ + "ро", + "б" + ], + [ + "р", + "об" + ], + [ + "▁follow", + "ers" + ], + [ + "▁p", + "epper" + ], + [ + "um", + "ble" + ], + [ + "umb", + "le" + ], + [ + "▁sp", + "ray" + ], + [ + "▁spr", + "ay" + ], + [ + "▁dr", + "ives" + ], + [ + "▁dri", + "ves" + ], + [ + "▁driv", + "es" + ], + [ + "▁drive", + "s" + ], + [ + "P", + "ush" + ], + [ + "cook", + "ie" + ], + [ + "c", + "ookie" + ], + [ + "▁gel", + "dig" + ], + [ + "▁geld", + "ig" + ], + [ + "ig", + "ung" + ], + [ + "igu", + "ng" + ], + [ + "vis", + "it" + ], + [ + "▁at", + "omic" + ], + [ + "▁atom", + "ic" + ], + [ + "▁", + "atomic" + ], + [ + "▁A", + "thlet" + ], + [ + "▁Ath", + "let" + ], + [ + "▁Or", + "igin" + ], + [ + "▁Ori", + "gin" + ], + [ + "▁", + "Origin" + ], + [ + "▁H", + "appy" + ], + [ + "▁G", + "ra" + ], + [ + "▁Gr", + "a" + ], + [ + "▁att", + "ribut" + ], + [ + "▁п", + "ов" + ], + [ + "▁по", + "в" + ], + [ + "▁", + "пов" + ], + [ + "▁n", + "ost" + ], + [ + "▁no", + "st" + ], + [ + "▁nos", + "t" + ], + [ + "▁", + "nost" + ], + [ + "ur", + "u" + ], + [ + "u", + "ru" + ], + [ + "▁Ne", + "ither" + ], + [ + "▁ma", + "ar" + ], + [ + "ject", + "ions" + ], + [ + "je", + "ctions" + ], + [ + "jection", + "s" + ], + [ + "▁re", + "nov" + ], + [ + "▁ren", + "ov" + ], + [ + "fin", + "ity" + ], + [ + "f", + "inity" + ], + [ + "gener", + "ic" + ], + [ + "init", + "ialize" + ], + [ + "initial", + "ize" + ], + [ + "pgf", + "set" + ], + [ + "▁hyp", + "othes" + ], + [ + "▁ma", + "cro" + ], + [ + "▁mac", + "ro" + ], + [ + "ma", + "ps" + ], + [ + "map", + "s" + ], + [ + "m", + "aps" + ], + [ + "▁f", + "are" + ], + [ + "▁far", + "e" + ], + [ + "▁fa", + "re" + ], + [ + "▁", + "fare" + ], + [ + "Be", + "st" + ], + [ + "B", + "est" + ], + [ + "uch", + "t" + ], + [ + "uc", + "ht" + ], + [ + "u", + "cht" + ], + [ + "co", + "d" + ], + [ + "c", + "od" + ], + [ + "▁h", + "orm" + ], + [ + "▁hor", + "m" + ], + [ + "▁ho", + "rm" + ], + [ + "▁P", + "oll" + ], + [ + "▁Pol", + "l" + ], + [ + "▁Po", + "ll" + ], + [ + "▁host", + "ing" + ], + [ + "▁Re", + "ading" + ], + [ + "▁Read", + "ing" + ], + [ + "Cert", + "ificate" + ], + [ + "▁и", + "ма" + ], + [ + "▁им", + "а" + ], + [ + "▁C", + "ov" + ], + [ + "▁Co", + "v" + ], + [ + "▁P", + "red" + ], + [ + "▁Pr", + "ed" + ], + [ + "▁Pre", + "d" + ], + [ + "▁", + "Pred" + ], + [ + "re", + "direct" + ], + [ + "red", + "irect" + ], + [ + "▁l", + "attice" + ], + [ + "▁port", + "folio" + ], + [ + "▁o", + "ven" + ], + [ + "▁ov", + "en" + ], + [ + "▁", + "oven" + ], + [ + "ie", + "len" + ], + [ + "iel", + "en" + ], + [ + "iele", + "n" + ], + [ + "i", + "elen" + ], + [ + "sub", + "scribe" + ], + [ + "foot", + "note" + ], + [ + "но", + "ю" + ], + [ + "▁mom", + "ento" + ], + [ + "▁moment", + "o" + ], + [ + "▁d", + "ich" + ], + [ + "▁di", + "ch" + ], + [ + "▁dic", + "h" + ], + [ + "▁ent", + "ert" + ], + [ + "▁enter", + "t" + ], + [ + "▁g", + "é" + ], + [ + "▁connect", + "ing" + ], + [ + "▁n", + "acional" + ], + [ + "▁o", + "tt" + ], + [ + "▁ot", + "t" + ], + [ + "▁", + "ott" + ], + [ + "ні", + "в" + ], + [ + "н", + "ів" + ], + [ + "▁rac", + "ist" + ], + [ + "▁penal", + "ty" + ], + [ + "ül", + "t" + ], + [ + "ü", + "lt" + ], + [ + "▁Israel", + "i" + ], + [ + "▁(", + "†" + ], + [ + "▁desc", + "end" + ], + [ + "▁ос", + "іб" + ], + [ + "▁b", + "elly" + ], + [ + "▁bel", + "ly" + ], + [ + "▁bell", + "y" + ], + [ + "ні", + "сть" + ], + [ + "▁encounter", + "ed" + ], + [ + "T", + "ip" + ], + 
[ + "▁gu", + "ilt" + ], + [ + "▁d", + "amp" + ], + [ + "▁da", + "mp" + ], + [ + "▁dam", + "p" + ], + [ + "ze", + "ug" + ], + [ + "▁Mem", + "ory" + ], + [ + "▁", + "Memory" + ], + [ + "Check", + "ed" + ], + [ + "▁Sh", + "akespeare" + ], + [ + "hi", + "ll" + ], + [ + "h", + "ill" + ], + [ + "▁w", + "oke" + ], + [ + "▁wo", + "ke" + ], + [ + "▁sal", + "ary" + ], + [ + "eth", + "eless" + ], + [ + "ethe", + "less" + ], + [ + "e", + "theless" + ], + [ + "▁Т", + "и" + ], + [ + "er", + "de" + ], + [ + "erd", + "e" + ], + [ + "▁He", + "in" + ], + [ + "▁g", + "it" + ], + [ + "▁gi", + "t" + ], + [ + "▁", + "git" + ], + [ + "=\"", + "\"" + ], + [ + "=", + "\"\"" + ], + [ + "ül", + "l" + ], + [ + "ü", + "ll" + ], + [ + "ge", + "ben" + ], + [ + "geb", + "en" + ], + [ + "g", + "eben" + ], + [ + "Pr", + "es" + ], + [ + "Pre", + "s" + ], + [ + "P", + "res" + ], + [ + "ie", + "val" + ], + [ + "iev", + "al" + ], + [ + "i", + "eval" + ], + [ + "mark", + "er" + ], + [ + "mar", + "ker" + ], + [ + "▁д", + "ан" + ], + [ + "▁да", + "н" + ], + [ + "▁", + "дан" + ], + [ + "▁oct", + "obre" + ], + [ + "RO", + "L" + ], + [ + "R", + "OL" + ], + [ + "▁jan", + "u" + ], + [ + "▁ja", + "nu" + ], + [ + "▁)", + ":" + ], + [ + "▁", + "):" + ], + [ + "br", + "anch" + ], + [ + "▁J", + "erry" + ], + [ + "▁Jer", + "ry" + ], + [ + "ke", + "hr" + ], + [ + "▁contr", + "acts" + ], + [ + "▁contract", + "s" + ], + [ + "▁aff", + "air" + ], + [ + "▁Росси", + "и" + ], + [ + "ja", + "ck" + ], + [ + "j", + "ack" + ], + [ + "AN", + "G" + ], + [ + "A", + "NG" + ], + [ + "▁dro", + "pping" + ], + [ + "▁drop", + "ping" + ], + [ + "▁d", + "ic" + ], + [ + "▁di", + "c" + ], + [ + "sch", + "ool" + ], + [ + "▁Fin", + "land" + ], + [ + "▁d", + "ort" + ], + [ + "▁do", + "rt" + ], + [ + "▁K", + "ings" + ], + [ + "▁King", + "s" + ], + [ + "▁Kin", + "gs" + ], + [ + "▁Arg", + "ument" + ], + [ + "▁", + "Argument" + ], + [ + "▁Sim", + "ilarly" + ], + [ + "▁Similar", + "ly" + ], + [ + "▁V", + "erm" + ], + [ + "▁Ver", + "m" + ], + [ + "▁Ve", + "rm" + ], + [ + "▁pret", + "end" + ], + [ + "!", + "_" + ], + [ + "łu", + "g" + ], + [ + "ł", + "ug" + ], + [ + "же", + "ння" + ], + [ + "жен", + "ня" + ], + [ + "da", + "ting" + ], + [ + "dat", + "ing" + ], + [ + "d", + "ating" + ], + [ + "cs", + "v" + ], + [ + "c", + "sv" + ], + [ + "▁dialog", + "ue" + ], + [ + "▁dial", + "ogue" + ], + [ + "STR", + "U" + ], + [ + "▁public", + "ly" + ], + [ + "wed", + "ge" + ], + [ + "w", + "edge" + ], + [ + "▁H", + "och" + ], + [ + "▁Ho", + "ch" + ], + [ + "▁spe", + "aks" + ], + [ + "▁speak", + "s" + ], + [ + "▁compens", + "ation" + ], + [ + "an", + "ca" + ], + [ + "anc", + "a" + ], + [ + "text", + "tt" + ], + [ + "▁Fil", + "ter" + ], + [ + "▁", + "Filter" + ], + [ + "▁part", + "ly" + ], + [ + "▁us", + "eless" + ], + [ + "▁use", + "less" + ], + [ + "▁г", + "у" + ], + [ + "▁", + "гу" + ], + [ + "▁d", + "eter" + ], + [ + "▁de", + "ter" + ], + [ + "▁det", + "er" + ], + [ + "IE", + "W" + ], + [ + "▁con", + "secut" + ], + [ + "▁cons", + "ecut" + ], + [ + "▁conse", + "cut" + ], + [ + "▁h", + "oly" + ], + [ + "▁hol", + "y" + ], + [ + "▁ho", + "ly" + ], + [ + "▁grad", + "uated" + ], + [ + "▁gradu", + "ated" + ], + [ + "▁graduate", + "d" + ], + [ + "an", + "dal" + ], + [ + "and", + "al" + ], + [ + "anda", + "l" + ], + [ + "ți", + "e" + ], + [ + "ț", + "ie" + ], + [ + "▁W", + "ant" + ], + [ + "▁Wa", + "nt" + ], + [ + "▁Aust", + "ria" + ], + [ + "or", + "den" + ], + [ + "ord", + "en" + ], + [ + "fr", + "ag" + ], + [ + "f", + "rag" + ], + [ + "▁f", + "oo" + ], + [ + "▁fo", + "o" + ], + [ + "▁", + 
"foo" + ], + [ + "cl", + "aimed" + ], + [ + "claim", + "ed" + ], + [ + "во", + "е" + ], + [ + "▁not", + "able" + ], + [ + "▁no", + "table" + ], + [ + "▁journal", + "ist" + ], + [ + "▁M", + "ail" + ], + [ + "▁Ma", + "il" + ], + [ + "▁Mai", + "l" + ], + [ + "▁", + "Mail" + ], + [ + "!(", + "\"" + ], + [ + "!", + "(\"" + ], + [ + "ps", + "e" + ], + [ + "p", + "se" + ], + [ + "▁C", + "lay" + ], + [ + "▁Cl", + "ay" + ], + [ + "iv", + "i" + ], + [ + "i", + "vi" + ], + [ + "▁sc", + "ales" + ], + [ + "▁scale", + "s" + ], + [ + "▁scal", + "es" + ], + [ + "▁er", + "ste" + ], + [ + "▁erst", + "e" + ], + [ + "▁ers", + "te" + ], + [ + "Data", + "Type" + ], + [ + "▁D", + "iam" + ], + [ + "▁Di", + "am" + ], + [ + "í", + "r" + ], + [ + "loc", + "ale" + ], + [ + "local", + "e" + ], + [ + "▁rel", + "uct" + ], + [ + "ien", + "st" + ], + [ + "iens", + "t" + ], + [ + "ast", + "ro" + ], + [ + "astr", + "o" + ], + [ + "act", + "ly" + ], + [ + "я", + "х" + ], + [ + "▁Vill", + "age" + ], + [ + "▁Villa", + "ge" + ], + [ + "▁Vil", + "lage" + ], + [ + "▁d", + "aughters" + ], + [ + "▁daughter", + "s" + ], + [ + "▁manufact", + "urers" + ], + [ + "▁manufacturer", + "s" + ], + [ + "▁print", + "ing" + ], + [ + "▁prin", + "ting" + ], + [ + "ч", + "ка" + ], + [ + "Nd", + "Ex" + ], + [ + "Ch", + "anges" + ], + [ + "Change", + "s" + ], + [ + "▁/", + "******/" + ], + [ + "ver", + "tex" + ], + [ + "vert", + "ex" + ], + [ + "▁b", + "rows" + ], + [ + "▁br", + "ows" + ], + [ + "▁bro", + "ws" + ], + [ + "▁brow", + "s" + ], + [ + "▁K", + "ö" + ], + [ + "not", + "ations" + ], + [ + "notation", + "s" + ], + [ + "▁i", + "ls" + ], + [ + "▁il", + "s" + ], + [ + "▁", + "ils" + ], + [ + "at", + "el" + ], + [ + "ate", + "l" + ], + [ + "C", + "ir" + ], + [ + "▁meaning", + "ful" + ], + [ + "q", + "a" + ], + [ + "▁C", + "old" + ], + [ + "▁Col", + "d" + ], + [ + "▁Co", + "ld" + ], + [ + "ue", + "to" + ], + [ + "u", + "eto" + ], + [ + "you", + "r" + ], + [ + "yo", + "ur" + ], + [ + "y", + "our" + ], + [ + "m", + "f" + ], + [ + "мо", + "в" + ], + [ + "м", + "ов" + ], + [ + "▁Ü", + "ber" + ], + [ + "▁fam", + "ilia" + ], + [ + "▁famil", + "ia" + ], + [ + "▁ste", + "ep" + ], + [ + "▁pres", + "idential" + ], + [ + "▁president", + "ial" + ], + [ + "▁presid", + "ential" + ], + [ + "▁z", + "á" + ], + [ + "▁", + "zá" + ], + [ + "▁w", + "ars" + ], + [ + "▁war", + "s" + ], + [ + "▁wa", + "rs" + ], + [ + "▁C", + "re" + ], + [ + "▁Cr", + "e" + ], + [ + "▁after", + "wards" + ], + [ + "▁afterward", + "s" + ], + [ + "ha", + "lb" + ], + [ + "hal", + "b" + ], + [ + "▁strugg", + "led" + ], + [ + "▁struggle", + "d" + ], + [ + "Ch", + "art" + ], + [ + "Char", + "t" + ], + [ + "User", + "Id" + ], + [ + "ac", + "ular" + ], + [ + "a", + "cular" + ], + [ + "iv", + "ia" + ], + [ + "ivi", + "a" + ], + [ + "i", + "via" + ], + [ + "▁u", + "gly" + ], + [ + "▁K", + "unst" + ], + [ + "E", + "s" + ], + [ + "▁Q", + "String" + ], + [ + "▁C", + "ow" + ], + [ + "▁Co", + "w" + ], + [ + "Rad", + "ius" + ], + [ + "▁Gr", + "iff" + ], + [ + "▁V", + "as" + ], + [ + "▁Va", + "s" + ], + [ + "HA", + "L" + ], + [ + "H", + "AL" + ], + [ + "Mod", + "ified" + ], + [ + "ra", + "le" + ], + [ + "ral", + "e" + ], + [ + "r", + "ale" + ], + [ + "mem", + "cpy" + ], + [ + "▁в", + "клю" + ], + [ + "▁r", + "s" + ], + [ + "▁", + "rs" + ], + [ + "▁h", + "alt" + ], + [ + "▁ha", + "lt" + ], + [ + "▁hal", + "t" + ], + [ + "▁", + "halt" + ], + [ + "▁Miss", + "iss" + ], + [ + "▁h", + "uvud" + ], + [ + "ec", + "a" + ], + [ + "e", + "ca" + ], + [ + "▁Jahrhund", + "ert" + ], + [ + "E", + "urope" + ], + [ + "Sign", 
+ "ature" + ], + [ + "▁grand", + "father" + ], + [ + "▁O", + "regon" + ], + [ + "gu", + "e" + ], + [ + "g", + "ue" + ], + [ + "xy", + "gen" + ], + [ + "fr", + "ames" + ], + [ + "frame", + "s" + ], + [ + "▁hab", + "its" + ], + [ + "▁ha", + "bits" + ], + [ + "▁habit", + "s" + ], + [ + "Support", + "ed" + ], + [ + "Supp", + "orted" + ], + [ + "▁low", + "ered" + ], + [ + "▁lower", + "ed" + ], + [ + "▁rad", + "iation" + ], + [ + "▁radi", + "ation" + ], + [ + "ab", + "en" + ], + [ + "abe", + "n" + ], + [ + "a", + "ben" + ], + [ + "▁Pro", + "gress" + ], + [ + "▁", + "Progress" + ], + [ + "▁C", + "osta" + ], + [ + "▁Co", + "sta" + ], + [ + "▁Cost", + "a" + ], + [ + "▁Cos", + "ta" + ], + [ + "▁dev", + "oted" + ], + [ + "▁gest", + "ure" + ], + [ + "▁Dez", + "ember" + ], + [ + "▁qu", + "oted" + ], + [ + "▁quote", + "d" + ], + [ + "▁quot", + "ed" + ], + [ + "▁difficult", + "ies" + ], + [ + "т", + "ре" + ], + [ + "▁sustain", + "able" + ], + [ + "▁d", + "ense" + ], + [ + "▁den", + "se" + ], + [ + "▁dens", + "e" + ], + [ + "▁ih", + "rer" + ], + [ + "▁ihr", + "er" + ], + [ + "▁ihre", + "r" + ], + [ + "▁firm", + "ly" + ], + [ + "â", + "t" + ], + [ + "om", + "ent" + ], + [ + "ome", + "nt" + ], + [ + "omen", + "t" + ], + [ + "o", + "ment" + ], + [ + "▁c", + "out" + ], + [ + "▁co", + "ut" + ], + [ + "▁cou", + "t" + ], + [ + "▁", + "cout" + ], + [ + "▁p", + "oi" + ], + [ + "▁po", + "i" + ], + [ + "d", + "jango" + ], + [ + "▁pro", + "found" + ], + [ + "▁prof", + "ound" + ], + [ + "▁Wil", + "helm" + ], + [ + "▁fl", + "ush" + ], + [ + "▁flu", + "sh" + ], + [ + "▁", + "flush" + ], + [ + "▁av", + "ril" + ], + [ + "LA", + "B" + ], + [ + "L", + "AB" + ], + [ + "▁B", + "row" + ], + [ + "▁Br", + "ow" + ], + [ + "▁Bro", + "w" + ], + [ + "▁pro", + "pose" + ], + [ + "▁prop", + "ose" + ], + [ + "▁propos", + "e" + ], + [ + "▁r", + "anks" + ], + [ + "▁ran", + "ks" + ], + [ + "▁rank", + "s" + ], + [ + "WI", + "D" + ], + [ + "W", + "ID" + ], + [ + "▁mut", + "ual" + ], + [ + "▁text", + "s" + ], + [ + "▁tex", + "ts" + ], + [ + "▁S", + "ale" + ], + [ + "▁Sal", + "e" + ], + [ + "▁Sa", + "le" + ], + [ + "▁qu", + "asi" + ], + [ + "▁n", + "og" + ], + [ + "▁no", + "g" + ], + [ + "▁", + "nog" + ], + [ + "▁nouve", + "au" + ], + [ + "▁c", + "v" + ], + [ + "▁", + "cv" + ], + [ + "▁no", + "ble" + ], + [ + "▁nob", + "le" + ], + [ + "▁dé", + "cembre" + ], + [ + "▁déc", + "embre" + ], + [ + "▁cl", + "ever" + ], + [ + "▁cle", + "ver" + ], + [ + "▁P", + "ir" + ], + [ + "▁Pi", + "r" + ], + [ + "▁graph", + "ics" + ], + [ + "▁graphic", + "s" + ], + [ + "▁", + "graphics" + ], + [ + "▁G", + "R" + ], + [ + "▁", + "GR" + ], + [ + "че", + "ской" + ], + [ + "▁s", + "ag" + ], + [ + "▁sa", + "g" + ], + [ + "ict", + "ions" + ], + [ + "iction", + "s" + ], + [ + "i", + "ctions" + ], + [ + "na", + "nt" + ], + [ + "nan", + "t" + ], + [ + "n", + "ant" + ], + [ + "▁th", + "é" + ], + [ + "C", + "G" + ], + [ + "▁Jac", + "ques" + ], + [ + "W", + "M" + ], + [ + "▁F", + "inn" + ], + [ + "▁Fin", + "n" + ], + [ + "▁Fi", + "nn" + ], + [ + "▁dev", + "ast" + ], + [ + "зо", + "м" + ], + [ + "хо", + "в" + ], + [ + "х", + "ов" + ], + [ + "▁En", + "tre" + ], + [ + "▁Ent", + "re" + ], + [ + ".", + ";" + ], + [ + "▁fl", + "uct" + ], + [ + "▁flu", + "ct" + ], + [ + "▁Sc", + "iences" + ], + [ + "▁Sci", + "ences" + ], + [ + "▁Science", + "s" + ], + [ + "▁т", + "у" + ], + [ + "▁", + "ту" + ], + [ + "path", + "s" + ], + [ + "pat", + "hs" + ], + [ + "▁sh", + "orter" + ], + [ + "▁short", + "er" + ], + [ + "▁suggest", + "ion" + ], + [ + "ER", + "Y" + ], + [ + "▁D", + "ire" + ], + [ + 
"▁Di", + "re" + ], + [ + "▁Dir", + "e" + ], + [ + "at", + "eurs" + ], + [ + "ate", + "urs" + ], + [ + "ateur", + "s" + ], + [ + "▁round", + "ed" + ], + [ + "▁t", + "art" + ], + [ + "▁tar", + "t" + ], + [ + "▁ta", + "rt" + ], + [ + "ю", + "ще" + ], + [ + "up", + "er" + ], + [ + "u", + "per" + ], + [ + "▁secret", + "s" + ], + [ + "▁sec", + "rets" + ], + [ + "▁secre", + "ts" + ], + [ + "▁compan", + "ion" + ], + [ + "▁K", + "EY" + ], + [ + "▁", + "KEY" + ], + [ + "T", + "ile" + ], + [ + "▁B", + "ibli" + ], + [ + "x", + "s" + ], + [ + "▁ang", + "ular" + ], + [ + "▁", + "angular" + ], + [ + "pa", + "g" + ], + [ + "p", + "ag" + ], + [ + "er", + "ness" + ], + [ + "ern", + "ess" + ], + [ + "erne", + "ss" + ], + [ + "▁S", + "orry" + ], + [ + "▁Sor", + "ry" + ], + [ + "▁", + "Sorry" + ], + [ + "▁pre", + "diction" + ], + [ + "▁predict", + "ion" + ], + [ + "▁pred", + "iction" + ], + [ + "▁M", + "aking" + ], + [ + "▁Ma", + "king" + ], + [ + "▁Mak", + "ing" + ], + [ + "на", + "род" + ], + [ + "ol", + "are" + ], + [ + "ola", + "re" + ], + [ + "olar", + "e" + ], + [ + "rp", + "c" + ], + [ + "r", + "pc" + ], + [ + "▁t", + "ens" + ], + [ + "▁te", + "ns" + ], + [ + "▁ten", + "s" + ], + [ + "en", + "as" + ], + [ + "ena", + "s" + ], + [ + "e", + "nas" + ], + [ + "▁Re", + "ally" + ], + [ + "▁Real", + "ly" + ], + [ + "H", + "I" + ], + [ + "port", + "al" + ], + [ + "por", + "tal" + ], + [ + "▁for", + "me" + ], + [ + "▁form", + "e" + ], + [ + "gan", + "g" + ], + [ + "ga", + "ng" + ], + [ + "g", + "ang" + ], + [ + "▁l", + "ane" + ], + [ + "▁la", + "ne" + ], + [ + "▁lan", + "e" + ], + [ + "▁", + "lane" + ], + [ + "▁s", + "tag" + ], + [ + "▁st", + "ag" + ], + [ + "▁sta", + "g" + ], + [ + "▁Mar", + "x" + ], + [ + "▁Ma", + "rx" + ], + [ + "▁L", + "LC" + ], + [ + "▁LL", + "C" + ], + [ + "▁d", + "are" + ], + [ + "▁da", + "re" + ], + [ + "▁dar", + "e" + ], + [ + "▁Olymp", + "ic" + ], + [ + "▁p", + "ant" + ], + [ + "▁pan", + "t" + ], + [ + "▁pa", + "nt" + ], + [ + "build", + "ing" + ], + [ + ";", + ";" + ], + [ + "▁c", + "ops" + ], + [ + "▁co", + "ps" + ], + [ + "▁cop", + "s" + ], + [ + "▁r", + "ushed" + ], + [ + "▁rush", + "ed" + ], + [ + "▁rus", + "hed" + ], + [ + "▁L", + "ot" + ], + [ + "▁Lo", + "t" + ], + [ + "▁init", + "iative" + ], + [ + "▁initi", + "ative" + ], + [ + "▁inv", + "ite" + ], + [ + "▁Saf", + "ety" + ], + [ + "▁Safe", + "ty" + ], + [ + "FA", + "ILED" + ], + [ + "FAIL", + "ED" + ], + [ + "▁habit", + "ants" + ], + [ + "en", + "sen" + ], + [ + "ens", + "en" + ], + [ + "ense", + "n" + ], + [ + "▁l", + "ég" + ], + [ + "▁W", + "elcome" + ], + [ + "▁Wel", + "come" + ], + [ + "Valid", + "ate" + ], + [ + "▁qu", + "atre" + ], + [ + "▁G", + "ray" + ], + [ + "▁Gr", + "ay" + ], + [ + "▁Gra", + "y" + ], + [ + "▁E", + "ve" + ], + [ + "▁Ev", + "e" + ], + [ + "▁C", + "omb" + ], + [ + "▁Com", + "b" + ], + [ + "▁Co", + "mb" + ], + [ + "▁", + "Comb" + ], + [ + "▁p", + "endant" + ], + [ + "a", + "qu" + ], + [ + "con", + "figure" + ], + [ + "config", + "ure" + ], + [ + "▁A", + "dm" + ], + [ + "▁Ad", + "m" + ], + [ + "▁rif", + "le" + ], + [ + "▁Exper", + "ience" + ], + [ + "Decl", + "aration" + ], + [ + "▁å", + "r" + ], + [ + "▁", + "år" + ], + [ + "ill", + "ery" + ], + [ + "ille", + "ry" + ], + [ + "iller", + "y" + ], + [ + "os", + "pel" + ], + [ + "osp", + "el" + ], + [ + "▁A", + "rena" + ], + [ + "▁Ar", + "ena" + ], + [ + "▁Are", + "na" + ], + [ + "▁bo", + "ards" + ], + [ + "▁board", + "s" + ], + [ + "▁", + "boards" + ], + [ + "▁pur", + "ple" + ], + [ + "▁p", + "ills" + ], + [ + "▁pil", + "ls" + ], + [ + "▁pill", + "s" + ], + 
[ + "ueto", + "oth" + ], + [ + "li", + "que" + ], + [ + "l", + "ique" + ], + [ + "▁pop", + "ulations" + ], + [ + "▁population", + "s" + ], + [ + "▁popul", + "ations" + ], + [ + "▁acc", + "ent" + ], + [ + "▁ac", + "cent" + ], + [ + "▁r", + "anges" + ], + [ + "▁range", + "s" + ], + [ + "▁ran", + "ges" + ], + [ + "▁rang", + "es" + ], + [ + "▁Anal", + "ysis" + ], + [ + "▁", + "Analysis" + ], + [ + "▁d", + "ictionary" + ], + [ + "▁Dr", + "agon" + ], + [ + "▁Drag", + "on" + ], + [ + "re", + "ction" + ], + [ + "rect", + "ion" + ], + [ + "r", + "ection" + ], + [ + "▁vis", + "itor" + ], + [ + "▁visit", + "or" + ], + [ + "seg", + "ment" + ], + [ + "▁д", + "р" + ], + [ + "▁F", + "uck" + ], + [ + "▁Fu", + "ck" + ], + [ + "д", + "ж" + ], + [ + "▁ident", + "ification" + ], + [ + "Class", + "Name" + ], + [ + "boot", + "strap" + ], + [ + "▁sur", + "faces" + ], + [ + "▁surface", + "s" + ], + [ + "▁surf", + "aces" + ], + [ + "▁scream", + "ing" + ], + [ + "кт", + "у" + ], + [ + "к", + "ту" + ], + [ + "pl", + "ain" + ], + [ + "sh", + "adow" + ], + [ + "incl", + "udes" + ], + [ + "include", + "s" + ], + [ + "▁j", + "azz" + ], + [ + "▁ja", + "zz" + ], + [ + "▁á", + "l" + ], + [ + "▁", + "ál" + ], + [ + "ri", + "ka" + ], + [ + "rik", + "a" + ], + [ + "r", + "ika" + ], + [ + "ho", + "p" + ], + [ + "h", + "op" + ], + [ + "▁i", + "on" + ], + [ + "▁io", + "n" + ], + [ + "▁", + "ion" + ], + [ + "vr", + "e" + ], + [ + "v", + "re" + ], + [ + "▁newsp", + "apers" + ], + [ + "▁newspaper", + "s" + ], + [ + "▁i", + "hn" + ], + [ + "▁ih", + "n" + ], + [ + "▁P", + "arse" + ], + [ + "▁Par", + "se" + ], + [ + "▁Pars", + "e" + ], + [ + "▁", + "Parse" + ], + [ + "П", + "о" + ], + [ + "▁strict", + "ly" + ], + [ + "▁re", + "covered" + ], + [ + "▁recover", + "ed" + ], + [ + "▁U", + "na" + ], + [ + "▁Un", + "a" + ], + [ + "▁err", + "e" + ], + [ + "▁er", + "re" + ], + [ + "▁", + "erre" + ], + [ + "iss", + "ues" + ], + [ + "issue", + "s" + ], + [ + "▁exp", + "ense" + ], + [ + "че", + "ния" + ], + [ + "▁do", + "nc" + ], + [ + "▁don", + "c" + ], + [ + "Bi", + "n" + ], + [ + "B", + "in" + ], + [ + "▁Com", + "ment" + ], + [ + "▁Comm", + "ent" + ], + [ + "▁", + "Comment" + ], + [ + "▁sac", + "rifice" + ], + [ + "▁sacrific", + "e" + ], + [ + "T", + "uple" + ], + [ + "()", + "[" + ], + [ + "(", + ")[" + ], + [ + "▁tra", + "vers" + ], + [ + "▁trav", + "ers" + ], + [ + "Im", + "p" + ], + [ + "I", + "mp" + ], + [ + "J", + "e" + ], + [ + "▁Lin", + "ux" + ], + [ + "▁е", + "ё" + ], + [ + "▁P", + "i" + ], + [ + "▁", + "Pi" + ], + [ + "▁cur", + "ios" + ], + [ + "▁cu", + "rios" + ], + [ + "▁r", + "age" + ], + [ + "▁rag", + "e" + ], + [ + "▁ra", + "ge" + ], + [ + "▁", + "rage" + ], + [ + "▁e", + "scal" + ], + [ + "▁es", + "cal" + ], + [ + "▁esc", + "al" + ], + [ + "▁al", + "ignment" + ], + [ + "▁align", + "ment" + ], + [ + "▁pent", + "ru" + ], + [ + "▁cur", + "r" + ], + [ + "▁cu", + "rr" + ], + [ + "▁", + "curr" + ], + [ + "▁b", + "este" + ], + [ + "▁be", + "ste" + ], + [ + "▁best", + "e" + ], + [ + "▁bes", + "te" + ], + [ + "[]", + "," + ], + [ + "[", + "]," + ], + [ + "▁//", + "!" 
+ ], + [ + "H", + "ub" + ], + [ + "Vis", + "ibility" + ], + [ + "▁A", + "sk" + ], + [ + "▁As", + "k" + ], + [ + "ab", + "ul" + ], + [ + "a", + "bul" + ], + [ + "co", + "lon" + ], + [ + "col", + "on" + ], + [ + "colo", + "n" + ], + [ + "▁D", + "ays" + ], + [ + "▁Day", + "s" + ], + [ + "▁Da", + "ys" + ], + [ + "Aut", + "hentication" + ], + [ + "ві", + "т" + ], + [ + "▁l", + "od" + ], + [ + "▁lo", + "d" + ], + [ + "xF", + "C" + ], + [ + "x", + "FC" + ], + [ + "Look", + "up" + ], + [ + "js", + "ce" + ], + [ + "Al", + "pha" + ], + [ + "▁harm", + "ony" + ], + [ + "▁harmon", + "y" + ], + [ + "▁W", + "ard" + ], + [ + "▁War", + "d" + ], + [ + "▁Wa", + "rd" + ], + [ + "trans", + "fer" + ], + [ + "▁H", + "orn" + ], + [ + "▁Hor", + "n" + ], + [ + "▁Ho", + "rn" + ], + [ + "▁s", + "d" + ], + [ + "▁", + "sd" + ], + [ + "so", + "ap" + ], + [ + "▁z", + "ich" + ], + [ + "▁Con", + "sole" + ], + [ + "▁Cons", + "ole" + ], + [ + "▁", + "Console" + ], + [ + "▁ко", + "ли" + ], + [ + "▁Ph", + "one" + ], + [ + "▁", + "Phone" + ], + [ + "pa", + "per" + ], + [ + "p", + "aper" + ], + [ + "й", + "н" + ], + [ + "▁z", + "m" + ], + [ + "▁", + "zm" + ], + [ + "Do", + "ne" + ], + [ + "Don", + "e" + ], + [ + "D", + "one" + ], + [ + "ph", + "ase" + ], + [ + "pha", + "se" + ], + [ + "phas", + "e" + ], + [ + "▁Jul", + "ia" + ], + [ + "▁Ju", + "lia" + ], + [ + "▁Juli", + "a" + ], + [ + "▁ed", + "ited" + ], + [ + "▁edit", + "ed" + ], + [ + "af", + "fe" + ], + [ + "aff", + "e" + ], + [ + "Sy", + "ntax" + ], + [ + "yl", + "l" + ], + [ + "y", + "ll" + ], + [ + "▁Lu", + "cas" + ], + [ + "▁Luc", + "as" + ], + [ + "▁and", + "eren" + ], + [ + "▁andere", + "n" + ], + [ + "▁ander", + "en" + ], + [ + "[", + "<" + ], + [ + "▁Data", + "base" + ], + [ + "▁Dat", + "abase" + ], + [ + "▁", + "Database" + ], + [ + "▁spect", + "ral" + ], + [ + "▁spectra", + "l" + ], + [ + "ass", + "ador" + ], + [ + "ска", + "та" + ], + [ + "с", + "ката" + ], + [ + "▁import", + "ante" + ], + [ + "▁important", + "e" + ], + [ + "▁х", + "а" + ], + [ + "▁", + "ха" + ], + [ + "t", + "z" + ], + [ + "▁s", + "tere" + ], + [ + "▁st", + "ere" + ], + [ + "▁ste", + "re" + ], + [ + "▁ster", + "e" + ], + [ + "▁m", + "elt" + ], + [ + "▁me", + "lt" + ], + [ + "▁mel", + "t" + ], + [ + "▁C", + "row" + ], + [ + "▁Cr", + "ow" + ], + [ + "▁Cro", + "w" + ], + [ + "ш", + "ка" + ], + [ + "it", + "utes" + ], + [ + "itut", + "es" + ], + [ + "itute", + "s" + ], + [ + "itu", + "tes" + ], + [ + "▁satisf", + "ies" + ], + [ + "▁L", + "iga" + ], + [ + "▁Li", + "ga" + ], + [ + "▁t", + "omb" + ], + [ + "▁to", + "mb" + ], + [ + "▁tom", + "b" + ], + [ + "▁f", + "ühr" + ], + [ + "▁", + "führ" + ], + [ + "▁sol", + "ely" + ], + [ + "▁sole", + "ly" + ], + [ + "▁E", + "ither" + ], + [ + "▁t", + "ennis" + ], + [ + "▁ten", + "nis" + ], + [ + "▁s", + "igh" + ], + [ + "▁si", + "gh" + ], + [ + "▁sig", + "h" + ], + [ + "ser", + "de" + ], + [ + "s", + "erde" + ], + [ + "ub", + "a" + ], + [ + "u", + "ba" + ], + [ + "ę", + "d" + ], + [ + "le", + "z" + ], + [ + "l", + "ez" + ], + [ + "Fac", + "t" + ], + [ + "F", + "act" + ], + [ + "▁sque", + "ez" + ], + [ + "▁Thom", + "pson" + ], + [ + "▁N", + "L" + ], + [ + "▁", + "NL" + ], + [ + "▁P", + "ara" + ], + [ + "▁Par", + "a" + ], + [ + "▁Pa", + "ra" + ], + [ + "▁?", + "?" + ], + [ + "▁", + "??" 
+ ], + [ + "▁fin", + "ishing" + ], + [ + "▁finish", + "ing" + ], + [ + "She", + "et" + ], + [ + "S", + "heet" + ], + [ + "LIN", + "K" + ], + [ + "L", + "INK" + ], + [ + "▁б", + "ро" + ], + [ + "▁", + "бро" + ], + [ + "▁l", + "over" + ], + [ + "▁lo", + "ver" + ], + [ + "▁love", + "r" + ], + [ + "▁lov", + "er" + ], + [ + "m", + "achine" + ], + [ + "▁L", + "esser" + ], + [ + "▁Les", + "ser" + ], + [ + "▁Less", + "er" + ], + [ + "pon", + "d" + ], + [ + "po", + "nd" + ], + [ + "p", + "ond" + ], + [ + "▁pain", + "tings" + ], + [ + "▁paint", + "ings" + ], + [ + "▁painting", + "s" + ], + [ + "▁assum", + "ptions" + ], + [ + "▁assumption", + "s" + ], + [ + "▁mod", + "ification" + ], + [ + "fr", + "e" + ], + [ + "f", + "re" + ], + [ + "▁U", + "lt" + ], + [ + "▁Ul", + "t" + ], + [ + "▁A", + "F" + ], + [ + "▁", + "AF" + ], + [ + "R", + "V" + ], + [ + "bin", + "ding" + ], + [ + "bind", + "ing" + ], + [ + "b", + "inding" + ], + [ + "▁toile", + "t" + ], + [ + "ra", + "r" + ], + [ + "r", + "ar" + ], + [ + "▁an", + "ge" + ], + [ + "▁ang", + "e" + ], + [ + "▁", + "ange" + ], + [ + "▁she", + "ep" + ], + [ + "PRO", + "TO" + ], + [ + "act", + "ic" + ], + [ + "a", + "ctic" + ], + [ + "▁S", + "peed" + ], + [ + "▁Sp", + "eed" + ], + [ + "▁Spe", + "ed" + ], + [ + "▁", + "Speed" + ], + [ + "▁I", + "ce" + ], + [ + "gn", + "u" + ], + [ + "g", + "nu" + ], + [ + "ow", + "ned" + ], + [ + "own", + "ed" + ], + [ + "Sub", + "scription" + ], + [ + "yr", + "ics" + ], + [ + "y", + "rics" + ], + [ + "▁back", + "ward" + ], + [ + ">\"", + "." + ], + [ + ">", + "\"." + ], + [ + "pi", + "t" + ], + [ + "p", + "it" + ], + [ + "▁real", + "istic" + ], + [ + "öff", + "ent" + ], + [ + "az", + "i" + ], + [ + "a", + "zi" + ], + [ + "DE", + "R" + ], + [ + "D", + "ER" + ], + [ + "b", + "ucket" + ], + [ + "én", + "y" + ], + [ + "é", + "ny" + ], + [ + "xF", + "E" + ], + [ + "x", + "FE" + ], + [ + "▁f", + "ancy" + ], + [ + "▁fan", + "cy" + ], + [ + "ex", + "cept" + ], + [ + "▁S", + "ul" + ], + [ + "▁Su", + "l" + ], + [ + "▁l", + "aser" + ], + [ + "▁la", + "ser" + ], + [ + "▁las", + "er" + ], + [ + "Mon", + "itor" + ], + [ + "▁c", + "omic" + ], + [ + "▁com", + "ic" + ], + [ + "▁co", + "mic" + ], + [ + "▁Arch", + "itect" + ], + [ + "▁ex", + "pr" + ], + [ + "▁exp", + "r" + ], + [ + "▁", + "expr" + ], + [ + "ount", + "ers" + ], + [ + "oun", + "ters" + ], + [ + "ounter", + "s" + ], + [ + "▁Mel", + "bourne" + ], + [ + "com", + "plex" + ], + [ + "comp", + "lex" + ], + [ + "'.", + "$" + ], + [ + "'", + ".$" + ], + [ + "om", + "ot" + ], + [ + "omo", + "t" + ], + [ + "o", + "mot" + ], + [ + "▁M", + "enu" + ], + [ + "▁Me", + "nu" + ], + [ + "▁Men", + "u" + ], + [ + "▁", + "Menu" + ], + [ + "astic", + "search" + ], + [ + "▁ed", + "iting" + ], + [ + "▁edit", + "ing" + ], + [ + "Pre", + "sent" + ], + [ + "Pres", + "ent" + ], + [ + "P", + "resent" + ], + [ + "op", + "les" + ], + [ + "ople", + "s" + ], + [ + "opl", + "es" + ], + [ + "o", + "ples" + ], + [ + "è", + "ncia" + ], + [ + "▁в", + "то" + ], + [ + "gl", + "ise" + ], + [ + "she", + "et" + ], + [ + "s", + "heet" + ], + [ + "▁he", + "lic" + ], + [ + "▁hel", + "ic" + ], + [ + "▁str", + "anger" + ], + [ + "▁strange", + "r" + ], + [ + "▁strang", + "er" + ], + [ + "▁ex", + "ec" + ], + [ + "▁", + "exec" + ], + [ + "FE", + "R" + ], + [ + "F", + "ER" + ], + [ + "in", + "ian" + ], + [ + "ini", + "an" + ], + [ + "SET", + "TING" + ], + [ + "▁M", + "ix" + ], + [ + "▁Mi", + "x" + ], + [ + "▁", + "Mix" + ], + [ + "▁com", + "plain" + ], + [ + "▁compl", + "ain" + ], + [ + "▁in", + "crement" + ], + [ + "▁incre", + "ment" 
+ ], + [ + "CS", + "S" + ], + [ + "C", + "SS" + ], + [ + "mm", + "a" + ], + [ + "m", + "ma" + ], + [ + "sl", + "ide" + ], + [ + "▁про", + "тив" + ], + [ + "▁проти", + "в" + ], + [ + "▁Lim", + "ited" + ], + [ + "Con", + "sole" + ], + [ + "Cons", + "ole" + ], + [ + "▁eng", + "aging" + ], + [ + "ul", + "er" + ], + [ + "ule", + "r" + ], + [ + "u", + "ler" + ], + [ + "▁O", + "ptions" + ], + [ + "▁Option", + "s" + ], + [ + "▁Opt", + "ions" + ], + [ + "▁", + "Options" + ], + [ + "▁l", + "ens" + ], + [ + "▁le", + "ns" + ], + [ + "▁len", + "s" + ], + [ + "Ma", + "il" + ], + [ + "M", + "ail" + ], + [ + "▁bar", + "rier" + ], + [ + "▁barr", + "ier" + ], + [ + "trans", + "port" + ], + [ + "▁c", + "ups" + ], + [ + "▁cu", + "ps" + ], + [ + "▁cup", + "s" + ], + [ + "it", + "err" + ], + [ + "ite", + "rr" + ], + [ + "iter", + "r" + ], + [ + "▁const", + "ants" + ], + [ + "▁constant", + "s" + ], + [ + "▁", + "constants" + ], + [ + "▁T", + "ech" + ], + [ + "▁Te", + "ch" + ], + [ + "iz", + "io" + ], + [ + "izi", + "o" + ], + [ + "сту", + "па" + ], + [ + "ступ", + "а" + ], + [ + "▁Sw", + "eden" + ], + [ + "at", + "hon" + ], + [ + "ath", + "on" + ], + [ + "a", + "thon" + ], + [ + "▁M", + "agn" + ], + [ + "▁Mag", + "n" + ], + [ + "▁Ma", + "gn" + ], + [ + "trans", + "ition" + ], + [ + "де", + "ла" + ], + [ + "es", + "k" + ], + [ + "e", + "sk" + ], + [ + "So", + "ft" + ], + [ + "S", + "oft" + ], + [ + "fun", + "ctions" + ], + [ + "function", + "s" + ], + [ + "ne", + "a" + ], + [ + "n", + "ea" + ], + [ + "Im", + "plement" + ], + [ + "Impl", + "ement" + ], + [ + "Imp", + "lement" + ], + [ + "ev", + "ery" + ], + [ + "ever", + "y" + ], + [ + "eve", + "ry" + ], + [ + "e", + "very" + ], + [ + "▁Man", + "ufact" + ], + [ + "▁improve", + "ments" + ], + [ + "▁improvement", + "s" + ], + [ + "▁Ind", + "iana" + ], + [ + "▁India", + "na" + ], + [ + "▁Indian", + "a" + ], + [ + "▁host", + "s" + ], + [ + "▁ho", + "sts" + ], + [ + "C", + "V" + ], + [ + "We", + "st" + ], + [ + "W", + "est" + ], + [ + "to", + "wn" + ], + [ + "t", + "own" + ], + [ + "can", + "vas" + ], + [ + "▁ш", + "ко" + ], + [ + "▁Col", + "umn" + ], + [ + "▁", + "Column" + ], + [ + "▁Par", + "ker" + ], + [ + "▁Park", + "er" + ], + [ + "▁es", + "pa" + ], + [ + "▁esp", + "a" + ], + [ + "▁Pub", + "lish" + ], + [ + "▁которы", + "й" + ], + [ + "av", + "is" + ], + [ + "avi", + "s" + ], + [ + "a", + "vis" + ], + [ + "▁Z", + "w" + ], + [ + "▁emphas", + "is" + ], + [ + "ol", + "v" + ], + [ + "o", + "lv" + ], + [ + "▁re", + "curs" + ], + [ + "▁rec", + "urs" + ], + [ + "▁recur", + "s" + ], + [ + "it", + "aire" + ], + [ + "ita", + "ire" + ], + [ + "▁B", + "ishop" + ], + [ + "▁Bi", + "shop" + ], + [ + "▁Bis", + "hop" + ], + [ + "ne", + "ro" + ], + [ + "ner", + "o" + ], + [ + "n", + "ero" + ], + [ + "▁d", + "eny" + ], + [ + "▁de", + "ny" + ], + [ + "▁den", + "y" + ], + [ + "▁do", + "ub" + ], + [ + "▁dou", + "b" + ], + [ + "peon", + "ato" + ], + [ + "▁C", + "ourse" + ], + [ + "▁Cour", + "se" + ], + [ + "▁Que", + "ens" + ], + [ + "▁Queen", + "s" + ], + [ + "▁bl", + "ur" + ], + [ + "el", + "ed" + ], + [ + "ele", + "d" + ], + [ + "e", + "led" + ], + [ + "iz", + "o" + ], + [ + "i", + "zo" + ], + [ + "▁dé", + "but" + ], + [ + "▁Mod", + "ule" + ], + [ + "▁Mo", + "dule" + ], + [ + "▁", + "Module" + ], + [ + "▁anx", + "ious" + ], + [ + "▁st", + "are" + ], + [ + "▁star", + "e" + ], + [ + "▁sta", + "re" + ], + [ + "▁Pro", + "position" + ], + [ + "▁K", + "u" + ], + [ + "▁i", + "c" + ], + [ + "▁", + "ic" + ], + [ + "Per", + "cent" + ], + [ + "Qu", + "ant" + ], + [ + "▁И", + "сто" + ], + [ + 
"▁h", + "ex" + ], + [ + "▁he", + "x" + ], + [ + "▁", + "hex" + ], + [ + "ass", + "oci" + ], + [ + "asso", + "ci" + ], + [ + "▁arrang", + "ement" + ], + [ + "▁arrange", + "ment" + ], + [ + "▁bo", + "ats" + ], + [ + "▁boat", + "s" + ], + [ + "Un", + "d" + ], + [ + "U", + "nd" + ], + [ + "▁sl", + "ots" + ], + [ + "▁slot", + "s" + ], + [ + "се", + "н" + ], + [ + "с", + "ен" + ], + [ + "necess", + "ary" + ], + [ + "▁app", + "earing" + ], + [ + "▁appe", + "aring" + ], + [ + "▁appear", + "ing" + ], + [ + "▁R", + "ule" + ], + [ + "▁Ru", + "le" + ], + [ + "▁", + "Rule" + ], + [ + "▁G", + "T" + ], + [ + "▁", + "GT" + ], + [ + "For", + "ce" + ], + [ + "et", + "to" + ], + [ + "ett", + "o" + ], + [ + "e", + "tto" + ], + [ + "ze", + "nia" + ], + [ + "zen", + "ia" + ], + [ + "▁o", + "uts" + ], + [ + "▁out", + "s" + ], + [ + "▁ou", + "ts" + ], + [ + "▁", + "outs" + ], + [ + "▁vari", + "ations" + ], + [ + "▁variation", + "s" + ], + [ + "▁wh", + "ites" + ], + [ + "▁white", + "s" + ], + [ + "▁g", + "lo" + ], + [ + "▁gl", + "o" + ], + [ + "▁B", + "R" + ], + [ + "▁", + "BR" + ], + [ + "ic", + "ky" + ], + [ + "ick", + "y" + ], + [ + "▁j", + "ury" + ], + [ + "▁ju", + "ry" + ], + [ + "▁jur", + "y" + ], + [ + "▁treat", + "ments" + ], + [ + "▁treatment", + "s" + ], + [ + "▁The", + "ater" + ], + [ + "kn", + "ow" + ], + [ + "k", + "now" + ], + [ + "▁pro", + "files" + ], + [ + "▁prof", + "iles" + ], + [ + "▁profile", + "s" + ], + [ + "▁con", + "spir" + ], + [ + "▁class", + "room" + ], + [ + "▁B", + "ass" + ], + [ + "▁Bas", + "s" + ], + [ + "▁Ba", + "ss" + ], + [ + "▁law", + "yers" + ], + [ + "▁lawyer", + "s" + ], + [ + "v", + "ue" + ], + [ + "▁A", + "rc" + ], + [ + "▁Ar", + "c" + ], + [ + "▁", + "Arc" + ], + [ + "▁s", + "la" + ], + [ + "▁sl", + "a" + ], + [ + "▁att", + "ending" + ], + [ + "▁attend", + "ing" + ], + [ + "n", + "x" + ], + [ + "m", + "x" + ], + [ + "TO", + "P" + ], + [ + "T", + "OP" + ], + [ + "▁b", + "ored" + ], + [ + "▁bo", + "red" + ], + [ + "▁bore", + "d" + ], + [ + "▁bor", + "ed" + ], + [ + "pre", + "vious" + ], + [ + "prev", + "ious" + ], + [ + "r", + "w" + ], + [ + "pt", + "ic" + ], + [ + "љ", + "у" + ], + [ + "▁app", + "ar" + ], + [ + "▁ap", + "par" + ], + [ + "▁P", + "ont" + ], + [ + "▁Po", + "nt" + ], + [ + ":", + "_" + ], + [ + "ii", + "i" + ], + [ + "i", + "ii" + ], + [ + "▁j", + "erk" + ], + [ + "▁jer", + "k" + ], + [ + "hed", + "ral" + ], + [ + "сс", + "а" + ], + [ + "с", + "са" + ], + [ + "▁Pr", + "ize" + ], + [ + "▁Pri", + "ze" + ], + [ + "▁Р", + "и" + ], + [ + "б", + "ре" + ], + [ + "▁hand", + "les" + ], + [ + "▁handle", + "s" + ], + [ + "▁j", + "ak" + ], + [ + "▁ja", + "k" + ], + [ + "▁Afghan", + "istan" + ], + [ + "▁b", + "oring" + ], + [ + "▁bo", + "ring" + ], + [ + "▁bor", + "ing" + ], + [ + "if", + "ik" + ], + [ + "ifi", + "k" + ], + [ + "▁sh", + "ade" + ], + [ + "▁sha", + "de" + ], + [ + "air", + "o" + ], + [ + "ai", + "ro" + ], + [ + "a", + "iro" + ], + [ + "od", + "ay" + ], + [ + "oda", + "y" + ], + [ + "o", + "day" + ], + [ + "▁pl", + "ates" + ], + [ + "▁plate", + "s" + ], + [ + "▁plat", + "es" + ], + [ + "▁Championship", + "s" + ], + [ + "▁Champion", + "ships" + ], + [ + "▁che", + "eks" + ], + [ + "▁cheek", + "s" + ], + [ + "ri", + "ke" + ], + [ + "rik", + "e" + ], + [ + "r", + "ike" + ], + [ + "▁kön", + "nen" + ], + [ + "▁app", + "le" + ], + [ + "▁ap", + "ple" + ], + [ + "▁appl", + "e" + ], + [ + "▁", + "apple" + ], + [ + "▁Ed", + "die" + ], + [ + "▁Edd", + "ie" + ], + [ + "▁s", + "od" + ], + [ + "▁so", + "d" + ], + [ + "▁tr", + "ains" + ], + [ + "▁tra", + "ins" + ], + [ + 
"▁train", + "s" + ], + [ + "pan", + "ic" + ], + [ + "pa", + "nic" + ], + [ + "▁Ad", + "vent" + ], + [ + "▁Adv", + "ent" + ], + [ + "ub", + "re" + ], + [ + "u", + "bre" + ], + [ + "▁d", + "å" + ], + [ + "▁S", + "ymbol" + ], + [ + "▁Sym", + "bol" + ], + [ + "▁", + "Symbol" + ], + [ + "▁с", + "те" + ], + [ + "▁ст", + "е" + ], + [ + "▁", + "сте" + ], + [ + "Sa", + "m" + ], + [ + "S", + "am" + ], + [ + "inher", + "it" + ], + [ + "cam", + "era" + ], + [ + "▁c", + "ours" + ], + [ + "▁co", + "urs" + ], + [ + "▁cour", + "s" + ], + [ + "▁cou", + "rs" + ], + [ + "▁make", + "up" + ], + [ + "re", + "gex" + ], + [ + "reg", + "ex" + ], + [ + "▁U", + "E" + ], + [ + "▁", + "UE" + ], + [ + "▁Det", + "roit" + ], + [ + "▁W", + "eight" + ], + [ + "▁We", + "ight" + ], + [ + "▁", + "Weight" + ], + [ + "▁P", + "iet" + ], + [ + "▁Pi", + "et" + ], + [ + "▁a", + "ria" + ], + [ + "▁ar", + "ia" + ], + [ + "▁", + "aria" + ], + [ + "DI", + "RECT" + ], + [ + "DIR", + "ECT" + ], + [ + "ace", + "ae" + ], + [ + "▁In", + "fo" + ], + [ + "▁Inf", + "o" + ], + [ + "▁", + "Info" + ], + [ + "an", + "ya" + ], + [ + "any", + "a" + ], + [ + "back", + "end" + ], + [ + "▁T", + "ennessee" + ], + [ + "pi", + "cker" + ], + [ + "pic", + "ker" + ], + [ + "pick", + "er" + ], + [ + "p", + "icker" + ], + [ + "▁Le", + "o" + ], + [ + "▁P", + "oss" + ], + [ + "▁Po", + "ss" + ], + [ + "▁Pos", + "s" + ], + [ + "pr", + "ises" + ], + [ + "prise", + "s" + ], + [ + "▁m", + "ature" + ], + [ + "▁mat", + "ure" + ], + [ + "сь", + "ких" + ], + [ + "▁F", + "ant" + ], + [ + "▁Fa", + "nt" + ], + [ + "Re", + "ason" + ], + [ + "▁m", + "oy" + ], + [ + "▁mo", + "y" + ], + [ + "▁B", + "aker" + ], + [ + "▁Ba", + "ker" + ], + [ + "▁Bak", + "er" + ], + [ + "▁sub", + "set" + ], + [ + "▁subs", + "et" + ], + [ + "▁", + "subset" + ], + [ + "▁Stan", + "ley" + ], + [ + "▁el", + "even" + ], + [ + "▁ele", + "ven" + ], + [ + "▁elev", + "en" + ], + [ + "ol", + "ate" + ], + [ + "ola", + "te" + ], + [ + "o", + "late" + ], + [ + "▁fort", + "une" + ], + [ + "Status", + "Code" + ], + [ + "▁ent", + "ities" + ], + [ + "▁", + "entities" + ], + [ + "▁Ok", + "ay" + ], + [ + "▁", + "Okay" + ], + [ + "ц", + "о" + ], + [ + "an", + "os" + ], + [ + "ano", + "s" + ], + [ + "a", + "nos" + ], + [ + "rel", + "ative" + ], + [ + "▁order", + "ing" + ], + [ + "▁ord", + "ering" + ], + [ + "▁No", + "body" + ], + [ + "▁Nob", + "ody" + ], + [ + "▁str", + "len" + ], + [ + "▁", + "strlen" + ], + [ + "▁r", + "ope" + ], + [ + "▁ro", + "pe" + ], + [ + "▁cig", + "arette" + ], + [ + "hol", + "ds" + ], + [ + "hold", + "s" + ], + [ + "h", + "olds" + ], + [ + "ir", + "able" + ], + [ + "ira", + "ble" + ], + [ + "value", + "Of" + ], + [ + "St", + "ub" + ], + [ + "▁phot", + "ography" + ], + [ + "▁photograph", + "y" + ], + [ + "es", + "tra" + ], + [ + "est", + "ra" + ], + [ + "estr", + "a" + ], + [ + "e", + "stra" + ], + [ + "▁cult", + "ures" + ], + [ + "▁culture", + "s" + ], + [ + "▁decl", + "aration" + ], + [ + "▁declar", + "ation" + ], + [ + "mer", + "cial" + ], + [ + "LI", + "ED" + ], + [ + "au", + "te" + ], + [ + "aut", + "e" + ], + [ + "a", + "ute" + ], + [ + "al", + "ter" + ], + [ + "alt", + "er" + ], + [ + "Sub", + "mit" + ], + [ + "▁Mag", + "ic" + ], + [ + "▁r", + "hythm" + ], + [ + "Pay", + "ment" + ], + [ + "ni", + "h" + ], + [ + "n", + "ih" + ], + [ + "▁inter", + "section" + ], + [ + "l", + "é" + ], + [ + "EN", + "TRY" + ], + [ + "/", + ")" + ], + [ + "▁m", + "og" + ], + [ + "▁mo", + "g" + ], + [ + "ru", + "st" + ], + [ + "rus", + "t" + ], + [ + "r", + "ust" + ], + [ + "▁threat", + "s" + ], + [ + "▁Mil", 
+ "itary" + ], + [ + "ap", + "or" + ], + [ + "a", + "por" + ], + [ + "▁s", + "igu" + ], + [ + "▁si", + "gu" + ], + [ + "▁sig", + "u" + ], + [ + "set", + "minus" + ], + [ + "▁I", + "ng" + ], + [ + "▁In", + "g" + ], + [ + "st", + "ation" + ], + [ + "stat", + "ion" + ], + [ + "T", + "ake" + ], + [ + "▁s", + "hed" + ], + [ + "▁sh", + "ed" + ], + [ + "▁she", + "d" + ], + [ + "▁Fr", + "ancia" + ], + [ + "▁Franc", + "ia" + ], + [ + "▁Fra", + "ncia" + ], + [ + "▁Fran", + "cia" + ], + [ + "pos", + "ts" + ], + [ + "po", + "sts" + ], + [ + "post", + "s" + ], + [ + "Mar", + "ker" + ], + [ + "Mark", + "er" + ], + [ + "Lower", + "Case" + ], + [ + "▁be", + "find" + ], + [ + "▁bef", + "ind" + ], + [ + "▁C", + "zech" + ], + [ + "▁Cz", + "ech" + ], + [ + "ícul", + "a" + ], + [ + "▁Per", + "formance" + ], + [ + "▁W", + "es" + ], + [ + "▁We", + "s" + ], + [ + "▁L", + "arry" + ], + [ + "▁Lar", + "ry" + ], + [ + "▁o", + "st" + ], + [ + "▁os", + "t" + ], + [ + "▁", + "ost" + ], + [ + "▁em", + "ails" + ], + [ + "▁email", + "s" + ], + [ + "▁Re", + "lease" + ], + [ + "▁", + "Release" + ], + [ + "▁ad", + "apter" + ], + [ + "▁adapt", + "er" + ], + [ + "▁", + "adapter" + ], + [ + "▁pad", + "re" + ], + [ + "ac", + "io" + ], + [ + "aci", + "o" + ], + [ + "a", + "cio" + ], + [ + "▁з", + "ем" + ], + [ + "▁gen", + "etic" + ], + [ + "▁ge", + "netic" + ], + [ + "▁U", + "nd" + ], + [ + "▁Un", + "d" + ], + [ + "▁", + "Und" + ], + [ + "▁accept", + "ance" + ], + [ + "да", + "н" + ], + [ + "д", + "ан" + ], + [ + "▁Girl", + "s" + ], + [ + "▁Gir", + "ls" + ], + [ + "comp", + "iler" + ], + [ + "compile", + "r" + ], + [ + "su", + "n" + ], + [ + "s", + "un" + ], + [ + "▁whe", + "els" + ], + [ + "▁wheel", + "s" + ], + [ + "▁thorough", + "ly" + ], + [ + "gr", + "und" + ], + [ + "gru", + "nd" + ], + [ + "g", + "rund" + ], + [ + "un", + "ction" + ], + [ + "unct", + "ion" + ], + [ + "▁e", + "lla" + ], + [ + "▁el", + "la" + ], + [ + "▁ell", + "a" + ], + [ + "▁", + "ella" + ], + [ + "X", + "FF" + ], + [ + "ug", + "s" + ], + [ + "u", + "gs" + ], + [ + "ient", + "os" + ], + [ + "ien", + "tos" + ], + [ + "iento", + "s" + ], + [ + "▁D", + "M" + ], + [ + "▁", + "DM" + ], + [ + "▁polit", + "ique" + ], + [ + "▁campaign", + "s" + ], + [ + "▁Tok", + "yo" + ], + [ + "▁album", + "s" + ], + [ + "KERN", + "EL" + ], + [ + "pd", + "ata" + ], + [ + "p", + "data" + ], + [ + "▁lap", + "top" + ], + [ + "▁lapt", + "op" + ], + [ + "▁v", + "ál" + ], + [ + "▁vá", + "l" + ], + [ + "▁f", + "ou" + ], + [ + "▁fo", + "u" + ], + [ + "or", + "b" + ], + [ + "o", + "rb" + ], + [ + "▁T", + "ower" + ], + [ + "▁To", + "wer" + ], + [ + "▁Tow", + "er" + ], + [ + "▁Get", + "ting" + ], + [ + "▁cor", + "ners" + ], + [ + "▁corner", + "s" + ], + [ + "▁corn", + "ers" + ], + [ + "pl", + "ess" + ], + [ + "ple", + "ss" + ], + [ + "ples", + "s" + ], + [ + "p", + "less" + ], + [ + "▁special", + "ist" + ], + [ + "▁i", + "v" + ], + [ + "▁", + "iv" + ], + [ + "Ui", + "nt" + ], + [ + "U", + "int" + ], + [ + "▁name", + "ly" + ], + [ + "▁nam", + "ely" + ], + [ + "▁sc", + "aling" + ], + [ + "▁scal", + "ing" + ], + [ + "Ext", + "ensions" + ], + [ + "Extension", + "s" + ], + [ + "▁cent", + "ro" + ], + [ + "omorph", + "ism" + ], + [ + "▁dé", + "f" + ], + [ + "),", + "\\" + ], + [ + ")", + ",\\" + ], + [ + "▁contr", + "ary" + ], + [ + "▁contra", + "ry" + ], + [ + "▁str", + "iking" + ], + [ + "▁stri", + "king" + ], + [ + "▁B", + "ere" + ], + [ + "▁Be", + "re" + ], + [ + "▁Ber", + "e" + ], + [ + "▁fore", + "cast" + ], + [ + "▁z", + "ones" + ], + [ + "▁zone", + "s" + ], + [ + "▁zo", + "nes" + ], + [ + 
"sm", + "art" + ], + [ + "s", + "mart" + ], + [ + "as", + "hi" + ], + [ + "ash", + "i" + ], + [ + "ri", + "n" + ], + [ + "r", + "in" + ], + [ + "NE", + "W" + ], + [ + "▁sim", + "ulations" + ], + [ + "▁simulation", + "s" + ], + [ + "▁R", + "ather" + ], + [ + "▁Ra", + "ther" + ], + [ + "▁Rat", + "her" + ], + [ + "▁Writ", + "ing" + ], + [ + "▁Wr", + "iting" + ], + [ + "▁$", + "[" + ], + [ + "▁as", + "sh" + ], + [ + "▁ass", + "h" + ], + [ + "▁f", + "ailing" + ], + [ + "▁fa", + "iling" + ], + [ + "▁fail", + "ing" + ], + [ + "▁man", + "if" + ], + [ + "▁B", + "og" + ], + [ + "▁Bo", + "g" + ], + [ + "▁D", + "ir" + ], + [ + "▁Di", + "r" + ], + [ + "▁", + "Dir" + ], + [ + "▁influ", + "enced" + ], + [ + "▁influence", + "d" + ], + [ + "conf", + "irm" + ], + [ + "▁we", + "igh" + ], + [ + "▁in", + "ventory" + ], + [ + "▁invent", + "ory" + ], + [ + "▁a", + "pare" + ], + [ + "▁ap", + "are" + ], + [ + "▁e", + "u" + ], + [ + "▁", + "eu" + ], + [ + "char", + "acter" + ], + [ + "io", + "m" + ], + [ + "i", + "om" + ], + [ + "▁o", + "rb" + ], + [ + "▁or", + "b" + ], + [ + "▁", + "orb" + ], + [ + "dev", + "ices" + ], + [ + "device", + "s" + ], + [ + "▁L", + "ED" + ], + [ + "▁LE", + "D" + ], + [ + "▁", + "LED" + ], + [ + "▁prop", + "ortion" + ], + [ + "▁proport", + "ion" + ], + [ + "▁Hon", + "or" + ], + [ + "▁Ho", + "nor" + ], + [ + "▁appro", + "aching" + ], + [ + "▁approach", + "ing" + ], + [ + "de", + "leg" + ], + [ + "del", + "eg" + ], + [ + "▁B", + "B" + ], + [ + "▁", + "BB" + ], + [ + "hel", + "pers" + ], + [ + "help", + "ers" + ], + [ + "helper", + "s" + ], + [ + "re", + "pository" + ], + [ + "rep", + "ository" + ], + [ + "▁б", + "ере" + ], + [ + "▁бе", + "ре" + ], + [ + "▁inhab", + "it" + ], + [ + "▁s", + "ão" + ], + [ + "▁travel", + "ed" + ], + [ + "▁trav", + "eled" + ], + [ + "ne", + "x" + ], + [ + "n", + "ex" + ], + [ + "▁C", + "lin" + ], + [ + "▁Cl", + "in" + ], + [ + "CE", + "PT" + ], + [ + "▁off", + "ense" + ], + [ + "▁in", + "cent" + ], + [ + "▁inc", + "ent" + ], + [ + "ID", + "S" + ], + [ + "I", + "DS" + ], + [ + "▁coeff", + "icients" + ], + [ + "▁coefficient", + "s" + ], + [ + "▁l", + "p" + ], + [ + "▁", + "lp" + ], + [ + "чно", + "го" + ], + [ + "ч", + "ного" + ], + [ + "▁c", + "d" + ], + [ + "▁", + "cd" + ], + [ + "mu", + "st" + ], + [ + "mus", + "t" + ], + [ + "m", + "ust" + ], + [ + "▁soon", + "er" + ], + [ + "ez", + "e" + ], + [ + "e", + "ze" + ], + [ + "C", + "at" + ], + [ + "ma", + "ker" + ], + [ + "make", + "r" + ], + [ + "m", + "aker" + ], + [ + "▁r", + "anked" + ], + [ + "▁ran", + "ked" + ], + [ + "▁rank", + "ed" + ], + [ + "ful", + "ness" + ], + [ + "▁part", + "ially" + ], + [ + "▁partial", + "ly" + ], + [ + "▁parti", + "ally" + ], + [ + "Pro", + "m" + ], + [ + "Pr", + "om" + ], + [ + "P", + "rom" + ], + [ + "▁ф", + "он" + ], + [ + "▁фо", + "н" + ], + [ + "▁Pro", + "bably" + ], + [ + "▁c", + "ached" + ], + [ + "▁cache", + "d" + ], + [ + "▁ca", + "ched" + ], + [ + "▁", + "cached" + ], + [ + "▁bal", + "anced" + ], + [ + "▁balance", + "d" + ], + [ + "ah", + "oma" + ], + [ + "aho", + "ma" + ], + [ + "▁Mur", + "ray" + ], + [ + "▁a", + "li" + ], + [ + "▁al", + "i" + ], + [ + "▁", + "ali" + ], + [ + "iv", + "os" + ], + [ + "ivo", + "s" + ], + [ + "▁b", + "ark" + ], + [ + "▁bar", + "k" + ], + [ + "IT", + "EM" + ], + [ + "ITE", + "M" + ], + [ + "▁Kir", + "che" + ], + [ + "▁alloc", + "ated" + ], + [ + "▁allocate", + "d" + ], + [ + "Al", + "t" + ], + [ + "A", + "lt" + ], + [ + "▁am", + "éric" + ], + [ + "íl", + "ia" + ], + [ + "í", + "lia" + ], + [ + "▁c", + "ens" + ], + [ + "▁ce", + "ns" + ], + [ 
+ "▁lic", + "enses" + ], + [ + "▁license", + "s" + ], + [ + "▁", + "licenses" + ], + [ + "ac", + "z" + ], + [ + "a", + "cz" + ], + [ + "▁G", + "ate" + ], + [ + "▁Ga", + "te" + ], + [ + "▁", + "Gate" + ], + [ + "▁B", + "L" + ], + [ + "▁", + "BL" + ], + [ + "▁re", + "public" + ], + [ + "▁rep", + "ublic" + ], + [ + "RO", + "W" + ], + [ + "▁состав", + "ля" + ], + [ + "▁соста", + "вля" + ], + [ + "▁Fil", + "ip" + ], + [ + "▁Ind", + "ivid" + ], + [ + "▁tr", + "ials" + ], + [ + "▁tri", + "als" + ], + [ + "▁trial", + "s" + ], + [ + "/*", + "!" + ], + [ + "▁G", + "P" + ], + [ + "▁", + "GP" + ], + [ + "ni", + "ka" + ], + [ + "nik", + "a" + ], + [ + "n", + "ika" + ], + [ + "▁ex", + "em" + ], + [ + "▁ad", + "vers" + ], + [ + "▁adv", + "ers" + ], + [ + "um", + "ped" + ], + [ + "ump", + "ed" + ], + [ + "▁Dev", + "ice" + ], + [ + "▁", + "Device" + ], + [ + "wa", + "ke" + ], + [ + "w", + "ake" + ], + [ + "Ex", + "ec" + ], + [ + "ar", + "ding" + ], + [ + "ard", + "ing" + ], + [ + "ardi", + "ng" + ], + [ + "▁pobl", + "ación" + ], + [ + "▁k", + "een" + ], + [ + "▁ke", + "en" + ], + [ + "▁b", + "itch" + ], + [ + "▁bit", + "ch" + ], + [ + "▁embed", + "ded" + ], + [ + "▁B", + "ond" + ], + [ + "▁Bo", + "nd" + ], + [ + "▁Bon", + "d" + ], + [ + "ri", + "des" + ], + [ + "ride", + "s" + ], + [ + "rid", + "es" + ], + [ + "r", + "ides" + ], + [ + "▁W", + "oman" + ], + [ + "▁Wo", + "man" + ], + [ + ".", + "[" + ], + [ + "ér", + "é" + ], + [ + "é", + "ré" + ], + [ + "▁Hash", + "Map" + ], + [ + "▁", + "HashMap" + ], + [ + "▁co", + "unting" + ], + [ + "▁coun", + "ting" + ], + [ + "▁count", + "ing" + ], + [ + "▁Init", + "ial" + ], + [ + "▁", + "Initial" + ], + [ + "▁ver", + "se" + ], + [ + "▁vers", + "e" + ], + [ + "▁", + "verse" + ], + [ + "▁Vere", + "in" + ], + [ + ">\"", + "," + ], + [ + ">", + "\"," + ], + [ + "▁an", + "th" + ], + [ + "▁ant", + "h" + ], + [ + "▁", + "anth" + ], + [ + "ci", + "d" + ], + [ + "c", + "id" + ], + [ + "▁h", + "unt" + ], + [ + "▁hun", + "t" + ], + [ + "на", + "л" + ], + [ + "н", + "ал" + ], + [ + "ci", + "es" + ], + [ + "cie", + "s" + ], + [ + "c", + "ies" + ], + [ + "Pi", + "n" + ], + [ + "P", + "in" + ], + [ + "▁#", + "!" 
+ ], + [ + "ва", + "я" + ], + [ + "sn", + "d" + ], + [ + "s", + "nd" + ], + [ + "▁u", + "k" + ], + [ + "▁", + "uk" + ], + [ + "▁sw", + "ift" + ], + [ + "▁tempor", + "ada" + ], + [ + "▁environment", + "s" + ], + [ + "▁environ", + "ments" + ], + [ + "claim", + "er" + ], + [ + "eme", + "tery" + ], + [ + "emet", + "ery" + ], + [ + "j", + "är" + ], + [ + "▁ча", + "ст" + ], + [ + "▁час", + "т" + ], + [ + "Trans", + "port" + ], + [ + "▁A", + "rr" + ], + [ + "▁Ar", + "r" + ], + [ + "▁", + "Arr" + ], + [ + "▁P", + "aper" + ], + [ + "▁Pa", + "per" + ], + [ + "▁Pap", + "er" + ], + [ + "▁b", + "ew" + ], + [ + "▁be", + "w" + ], + [ + "▁", + "bew" + ], + [ + "▁har", + "vest" + ], + [ + "▁-", + "----" + ], + [ + "▁--", + "---" + ], + [ + "▁---", + "--" + ], + [ + "▁", + "-----" + ], + [ + "product", + "s" + ], + [ + "ле", + "т" + ], + [ + "л", + "ет" + ], + [ + "ident", + "ifier" + ], + [ + "RO", + "OT" + ], + [ + "▁M", + "ak" + ], + [ + "▁Ma", + "k" + ], + [ + "▁App", + "ro" + ], + [ + "▁Ap", + "pro" + ], + [ + "▁", + "Appro" + ], + [ + "ie", + "ri" + ], + [ + "ier", + "i" + ], + [ + "i", + "eri" + ], + [ + "▁F", + "ly" + ], + [ + "▁Fl", + "y" + ], + [ + "▁is", + "set" + ], + [ + "▁iss", + "et" + ], + [ + "▁", + "isset" + ], + [ + "▁determ", + "ination" + ], + [ + "▁determin", + "ation" + ], + [ + "Ge", + "ometry" + ], + [ + "▁emer", + "ging" + ], + [ + "sub", + "scription" + ], + [ + "ol", + "y" + ], + [ + "o", + "ly" + ], + [ + "▁R", + "ace" + ], + [ + "▁Ra", + "ce" + ], + [ + "▁B", + "ah" + ], + [ + "▁Ba", + "h" + ], + [ + "▁Config", + "uration" + ], + [ + "▁", + "Configuration" + ], + [ + "▁Inter", + "est" + ], + [ + "ско", + "в" + ], + [ + "ск", + "ов" + ], + [ + "с", + "ков" + ], + [ + "ist", + "rz" + ], + [ + "istr", + "z" + ], + [ + "▁S", + "han" + ], + [ + "▁Sh", + "an" + ], + [ + "▁Sha", + "n" + ], + [ + "▁P", + "ain" + ], + [ + "▁Pa", + "in" + ], + [ + "CON", + "NE" + ], + [ + "ma", + "jor" + ], + [ + "m", + "ajor" + ], + [ + "▁St", + "ay" + ], + [ + "▁Sta", + "y" + ], + [ + "▁bron", + "ze" + ], + [ + "▁f", + "itting" + ], + [ + "▁fit", + "ting" + ], + [ + "▁J", + "ar" + ], + [ + "▁Ja", + "r" + ], + [ + "mg", + "r" + ], + [ + "m", + "gr" + ], + [ + "▁S", + "har" + ], + [ + "▁Sh", + "ar" + ], + [ + "▁Sha", + "r" + ], + [ + "FL", + "O" + ], + [ + "F", + "LO" + ], + [ + "ut", + "er" + ], + [ + "ute", + "r" + ], + [ + "u", + "ter" + ], + [ + "с", + "ы" + ], + [ + "▁cont", + "acts" + ], + [ + "▁contact", + "s" + ], + [ + "▁f", + "iring" + ], + [ + "▁fi", + "ring" + ], + [ + "▁fir", + "ing" + ], + [ + "на", + "н" + ], + [ + "н", + "ан" + ], + [ + "▁prof", + "es" + ], + [ + "sk", + "é" + ], + [ + "s", + "ké" + ], + [ + "▁rule", + "d" + ], + [ + "▁ru", + "led" + ], + [ + "▁rul", + "ed" + ], + [ + "=\"", + "/" + ], + [ + "an", + "dro" + ], + [ + "and", + "ro" + ], + [ + "▁ens", + "uring" + ], + [ + "iz", + "en" + ], + [ + "ize", + "n" + ], + [ + "i", + "zen" + ], + [ + "▁че", + "рез" + ], + [ + "ise", + "cond" + ], + [ + "i", + "second" + ], + [ + "ob", + "il" + ], + [ + "obi", + "l" + ], + [ + "o", + "bil" + ], + [ + "▁re", + "ck" + ], + [ + "▁rec", + "k" + ], + [ + "▁", + "reck" + ], + [ + ")}", + "(" + ], + [ + ")", + "}(" + ], + [ + "bit", + "map" + ], + [ + "▁B", + "run" + ], + [ + "▁Br", + "un" + ], + [ + "▁Bru", + "n" + ], + [ + "▁Jer", + "usalem" + ], + [ + "▁W", + "o" + ], + [ + "▁Republic", + "ans" + ], + [ + "▁Republican", + "s" + ], + [ + "mat", + "ic" + ], + [ + "m", + "atic" + ], + [ + "▁E", + "arl" + ], + [ + "▁d", + "ock" + ], + [ + "▁do", + "ck" + ], + [ + "▁doc", + "k" + ], + [ + 
"▁M", + "all" + ], + [ + "▁Mal", + "l" + ], + [ + "▁Ma", + "ll" + ], + [ + "k", + "k" + ], + [ + "▁", + "Й" + ], + [ + "▁C", + "OL" + ], + [ + "▁CO", + "L" + ], + [ + "▁", + "COL" + ], + [ + "▁lat", + "ach" + ], + [ + "UI", + "nt" + ], + [ + "U", + "Int" + ], + [ + "ци", + "ональ" + ], + [ + "цион", + "аль" + ], + [ + "циона", + "ль" + ], + [ + "▁seg", + "ments" + ], + [ + "▁segment", + "s" + ], + [ + "▁re", + "fund" + ], + [ + "▁ref", + "und" + ], + [ + "fa", + "c" + ], + [ + "f", + "ac" + ], + [ + "▁Art", + "icle" + ], + [ + "▁B", + "orn" + ], + [ + "▁Bo", + "rn" + ], + [ + "▁Bor", + "n" + ], + [ + "²", + "." + ], + [ + "br", + "and" + ], + [ + "bra", + "nd" + ], + [ + "b", + "rand" + ], + [ + "{$", + "\\" + ], + [ + "{", + "$\\" + ], + [ + "▁s", + "s" + ], + [ + "▁", + "ss" + ], + [ + "▁Re", + "sources" + ], + [ + "▁Res", + "ources" + ], + [ + "▁Resource", + "s" + ], + [ + "▁", + "Resources" + ], + [ + "▁re", + "cycl" + ], + [ + "▁rec", + "ycl" + ], + [ + "▁$", + "$\\" + ], + [ + "▁$$", + "\\" + ], + [ + "▁Conne", + "ction" + ], + [ + "▁Connect", + "ion" + ], + [ + "▁", + "Connection" + ], + [ + "▁imp", + "erial" + ], + [ + "▁imper", + "ial" + ], + [ + "▁pract", + "ically" + ], + [ + "▁practical", + "ly" + ], + [ + "▁–", + "," + ], + [ + "▁Dis", + "play" + ], + [ + "▁", + "Display" + ], + [ + "ier", + "no" + ], + [ + "mo", + "uth" + ], + [ + "m", + "outh" + ], + [ + "ed", + "es" + ], + [ + "ede", + "s" + ], + [ + "e", + "des" + ], + [ + "ba", + "hn" + ], + [ + "b", + "ahn" + ], + [ + "▁C", + "atherine" + ], + [ + "▁high", + "way" + ], + [ + "un", + "ting" + ], + [ + "unt", + "ing" + ], + [ + "▁Any", + "way" + ], + [ + "Sp", + "ell" + ], + [ + "Spe", + "ll" + ], + [ + "▁L", + "iste" + ], + [ + "▁List", + "e" + ], + [ + "▁Li", + "ste" + ], + [ + "▁Lis", + "te" + ], + [ + "▁ret", + "rieve" + ], + [ + "▁retr", + "ieve" + ], + [ + "▁retriev", + "e" + ], + [ + "▁z", + "d" + ], + [ + "▁", + "zd" + ], + [ + "stra", + "ße" + ], + [ + "▁dom", + "inated" + ], + [ + "▁domin", + "ated" + ], + [ + "to", + "uch" + ], + [ + "t", + "ouch" + ], + [ + "▁m", + "b" + ], + [ + "▁", + "mb" + ], + [ + "LO", + "NG" + ], + [ + "L", + "ONG" + ], + [ + "as", + "ures" + ], + [ + "asure", + "s" + ], + [ + "TL", + "S" + ], + [ + "T", + "LS" + ], + [ + "▁accompl", + "ished" + ], + [ + "▁accomp", + "lished" + ], + [ + "▁accomplish", + "ed" + ], + [ + "▁f", + "ears" + ], + [ + "▁fe", + "ars" + ], + [ + "▁fear", + "s" + ], + [ + "▁seem", + "ingly" + ], + [ + "▁d", + "ag" + ], + [ + "▁da", + "g" + ], + [ + "▁", + "dag" + ], + [ + "▁b", + "ureau" + ], + [ + "▁bur", + "eau" + ], + [ + "▁Gro", + "ß" + ], + [ + "▁accord", + "ance" + ], + [ + ".", + "]" + ], + [ + "ou", + "x" + ], + [ + "o", + "ux" + ], + [ + "▁col", + "onial" + ], + [ + "▁colon", + "ial" + ], + [ + "▁compass", + "ion" + ], + [ + "th", + "umb" + ], + [ + "▁s", + "wo" + ], + [ + "▁sw", + "o" + ], + [ + "on", + "line" + ], + [ + "▁J", + "i" + ], + [ + "▁work", + "shop" + ], + [ + "▁works", + "hop" + ], + [ + "▁l", + "ub" + ], + [ + "▁lu", + "b" + ], + [ + "év", + "rier" + ], + [ + "ш", + "і" + ], + [ + ">\"", + ";" + ], + [ + ">", + "\";" + ], + [ + "▁gener", + "ous" + ], + [ + "▁gene", + "rous" + ], + [ + "ro", + "us" + ], + [ + "rou", + "s" + ], + [ + "r", + "ous" + ], + [ + "av", + "id" + ], + [ + "avi", + "d" + ], + [ + "a", + "vid" + ], + [ + "igen", + "ous" + ], + [ + "▁R", + "aw" + ], + [ + "▁Ra", + "w" + ], + [ + "▁", + "Raw" + ], + [ + "▁sw", + "ap" + ], + [ + "▁", + "swap" + ], + [ + "h", + "c" + ], + [ + "java", + "script" + ], + [ + "jav", + 
"ascript" + ], + [ + "Fact", + "or" + ], + [ + "Fac", + "tor" + ], + [ + "F", + "actor" + ], + [ + "▁gar", + "bage" + ], + [ + "▁M", + "icro" + ], + [ + "▁Mic", + "ro" + ], + [ + "▁Mi", + "cro" + ], + [ + "co", + "u" + ], + [ + "c", + "ou" + ], + [ + "ü", + "ber" + ], + [ + "▁f", + "atal" + ], + [ + "▁fa", + "tal" + ], + [ + "▁fat", + "al" + ], + [ + "▁trans", + "parent" + ], + [ + "▁b", + "earing" + ], + [ + "▁be", + "aring" + ], + [ + "▁bear", + "ing" + ], + [ + "▁celebr", + "ated" + ], + [ + "▁celebrate", + "d" + ], + [ + "VI", + "S" + ], + [ + "V", + "IS" + ], + [ + "▁B", + "M" + ], + [ + "▁", + "BM" + ], + [ + "▁pr", + "ince" + ], + [ + "▁prin", + "ce" + ], + [ + "to", + "l" + ], + [ + "t", + "ol" + ], + [ + "▁'", + "", + "" + ], + [ + "\\", + "\">" + ], + [ + "▁du", + "rant" + ], + [ + "▁dur", + "ant" + ], + [ + "▁vent", + "ure" + ], + [ + "▁F", + "itz" + ], + [ + "▁Fi", + "tz" + ], + [ + "▁C", + "BD" + ], + [ + "▁CB", + "D" + ], + [ + "▁b", + "acking" + ], + [ + "▁back", + "ing" + ], + [ + "▁w", + "are" + ], + [ + "▁war", + "e" + ], + [ + "▁wa", + "re" + ], + [ + "▁", + "ware" + ], + [ + "ev", + "e" + ], + [ + "e", + "ve" + ], + [ + "O", + "G" + ], + [ + "ed", + "ish" + ], + [ + "edi", + "sh" + ], + [ + "▁Giov", + "anni" + ], + [ + "▁Sh", + "are" + ], + [ + "▁Shar", + "e" + ], + [ + "▁Sha", + "re" + ], + [ + "▁", + "Share" + ], + [ + "▁rec", + "ipes" + ], + [ + "▁recipe", + "s" + ], + [ + "▁recip", + "es" + ], + [ + "big", + "g" + ], + [ + "bi", + "gg" + ], + [ + "b", + "igg" + ], + [ + "▁minor", + "ity" + ], + [ + "▁n", + "ar" + ], + [ + "▁na", + "r" + ], + [ + "▁", + "nar" + ], + [ + "oll", + "ary" + ], + [ + "ollar", + "y" + ], + [ + "▁F", + "E" + ], + [ + "▁", + "FE" + ], + [ + "sh", + "irt" + ], + [ + "▁redu", + "ces" + ], + [ + "▁reduce", + "s" + ], + [ + "Ch", + "e" + ], + [ + "C", + "he" + ], + [ + "▁NOT", + "E" + ], + [ + "▁NO", + "TE" + ], + [ + "j", + "query" + ], + [ + "▁F", + "low" + ], + [ + "▁Fl", + "ow" + ], + [ + "▁Flo", + "w" + ], + [ + "▁", + "Flow" + ], + [ + "task", + "s" + ], + [ + "pr", + "event" + ], + [ + "pre", + "vent" + ], + [ + "prev", + "ent" + ], + [ + "▁со", + "вет" + ], + [ + "▁сов", + "ет" + ], + [ + "it", + "as" + ], + [ + "ita", + "s" + ], + [ + "▁exam", + "ined" + ], + [ + "▁examine", + "d" + ], + [ + "ho", + "n" + ], + [ + "h", + "on" + ], + [ + "▁M", + "ine" + ], + [ + "▁Min", + "e" + ], + [ + "▁Mi", + "ne" + ], + [ + "▁grad", + "ient" + ], + [ + "▁V", + "ien" + ], + [ + "▁Vi", + "en" + ], + [ + "▁b", + "eds" + ], + [ + "▁be", + "ds" + ], + [ + "▁bed", + "s" + ], + [ + "ET", + "H" + ], + [ + "E", + "TH" + ], + [ + "fl", + "at" + ], + [ + "f", + "lat" + ], + [ + "an", + "son" + ], + [ + "ans", + "on" + ], + [ + "▁in", + "tu" + ], + [ + "▁int", + "u" + ], + [ + "▁fl", + "ows" + ], + [ + "▁flo", + "ws" + ], + [ + "▁flow", + "s" + ], + [ + "но", + "к" + ], + [ + "▁E", + "ine" + ], + [ + "▁Ein", + "e" + ], + [ + "ро", + "ди" + ], + [ + "род", + "и" + ], + [ + "▁ко", + "р" + ], + [ + "▁к", + "ор" + ], + [ + "▁", + "кор" + ], + [ + "▁aff", + "ection" + ], + [ + "▁af", + "fection" + ], + [ + "▁affect", + "ion" + ], + [ + "▁p", + "orts" + ], + [ + "▁por", + "ts" + ], + [ + "▁port", + "s" + ], + [ + "▁", + "ports" + ], + [ + "__", + "." + ], + [ + "_", + "_." 
+ ], + [ + "re", + "po" + ], + [ + "rep", + "o" + ], + [ + "ail", + "and" + ], + [ + "ai", + "land" + ], + [ + "▁по", + "да" + ], + [ + "▁под", + "а" + ], + [ + "int", + "age" + ], + [ + "inta", + "ge" + ], + [ + "▁Prote", + "ction" + ], + [ + "î", + "t" + ], + [ + "▁[", + "{" + ], + [ + "▁l", + "amp" + ], + [ + "▁la", + "mp" + ], + [ + "▁benef", + "icial" + ], + [ + "ка", + "де" + ], + [ + "▁Станов", + "ништво" + ], + [ + "▁l", + "ined" + ], + [ + "▁li", + "ned" + ], + [ + "▁line", + "d" + ], + [ + "▁lin", + "ed" + ], + [ + "▁", + "lined" + ], + [ + "▁Ex", + "change" + ], + [ + "▁f", + "itted" + ], + [ + "▁fit", + "ted" + ], + [ + "▁v", + "erk" + ], + [ + "▁ver", + "k" + ], + [ + "▁focus", + "es" + ], + [ + "vo", + "d" + ], + [ + "v", + "od" + ], + [ + "▁Car", + "lo" + ], + [ + "▁Carl", + "o" + ], + [ + "▁ра", + "спо" + ], + [ + "▁рас", + "по" + ], + [ + "ain", + "ted" + ], + [ + "aint", + "ed" + ], + [ + "ainte", + "d" + ], + [ + "a", + "inted" + ], + [ + "▁r", + "ape" + ], + [ + "▁rap", + "e" + ], + [ + "▁ra", + "pe" + ], + [ + "▁t", + "ogg" + ], + [ + "▁to", + "gg" + ], + [ + "ac", + "ker" + ], + [ + "ack", + "er" + ], + [ + "a", + "cker" + ], + [ + "T", + "w" + ], + [ + "ra", + "h" + ], + [ + "r", + "ah" + ], + [ + "trans", + "l" + ], + [ + "▁je", + "alous" + ], + [ + "▁re", + "pository" + ], + [ + "▁rep", + "ository" + ], + [ + "▁", + "repository" + ], + [ + "re", + "marks" + ], + [ + "rem", + "arks" + ], + [ + "remark", + "s" + ], + [ + "▁i", + "e" + ], + [ + "▁", + "ie" + ], + [ + "í", + "d" + ], + [ + "▁sk", + "ull" + ], + [ + "ra", + "c" + ], + [ + "r", + "ac" + ], + [ + "()", + "]" + ], + [ + "(", + ")]" + ], + [ + "ri", + "en" + ], + [ + "rie", + "n" + ], + [ + "r", + "ien" + ], + [ + "?", + "(" + ], + [ + "▁K", + "ids" + ], + [ + "▁Ki", + "ds" + ], + [ + "▁Kid", + "s" + ], + [ + "▁sw", + "itched" + ], + [ + "▁switch", + "ed" + ], + [ + "▁G", + "ew" + ], + [ + "▁Ge", + "w" + ], + [ + "▁be", + "ef" + ], + [ + "▁appear", + "ances" + ], + [ + "▁appearance", + "s" + ], + [ + "▁Coll", + "ins" + ], + [ + "▁V", + "illa" + ], + [ + "▁Vill", + "a" + ], + [ + "▁Vi", + "lla" + ], + [ + "▁Vil", + "la" + ], + [ + "▁z", + "ona" + ], + [ + "▁zo", + "na" + ], + [ + "▁n", + "eu" + ], + [ + "▁ne", + "u" + ], + [ + "те", + "льно" + ], + [ + "тель", + "но" + ], + [ + "▁х", + "удо" + ], + [ + "▁oper", + "ational" + ], + [ + "▁operation", + "al" + ], + [ + "ON", + "LY" + ], + [ + "▁h", + "ockey" + ], + [ + "▁ś", + "wi" + ], + [ + "ö", + "k" + ], + [ + "Sl", + "ice" + ], + [ + "Ref", + "resh" + ], + [ + "▁n", + "uts" + ], + [ + "▁nu", + "ts" + ], + [ + "▁nut", + "s" + ], + [ + "sa", + "y" + ], + [ + "s", + "ay" + ], + [ + "▁ста", + "нови" + ], + [ + "▁станов", + "и" + ], + [ + "х", + "е" + ], + [ + "▁le", + "aning" + ], + [ + "▁lean", + "ing" + ], + [ + "▁H", + "aus" + ], + [ + "▁Ha", + "us" + ], + [ + "▁o", + "ral" + ], + [ + "▁or", + "al" + ], + [ + "▁", + "oral" + ], + [ + "▁", + "Ž" + ], + [ + "▁Sup", + "pose" + ], + [ + "▁Supp", + "ose" + ], + [ + "▁ess", + "ence" + ], + [ + "EN", + "TER" + ], + [ + "ENT", + "ER" + ], + [ + "B", + "ucket" + ], + [ + "▁C", + "ant" + ], + [ + "▁Can", + "t" + ], + [ + "▁Ca", + "nt" + ], + [ + "▁N", + "azi" + ], + [ + "▁Na", + "zi" + ], + [ + "▁Naz", + "i" + ], + [ + "ш", + "ти" + ], + [ + "▁Vol", + "ume" + ], + [ + "▁", + "Volume" + ], + [ + "▁wor", + "thy" + ], + [ + "▁worth", + "y" + ], + [ + "▁", + "worthy" + ], + [ + "B", + "u" + ], + [ + "Ent", + "ries" + ], + [ + "on", + "ie" + ], + [ + "oni", + "e" + ], + [ + "o", + "nie" + ], + [ + "▁h", + "ood" + ], + [ + 
"▁ho", + "od" + ], + [ + "▁", + "hood" + ], + [ + "▁emp", + "ire" + ], + [ + "▁dé", + "velop" + ], + [ + "▁p", + "robe" + ], + [ + "▁pro", + "be" + ], + [ + "▁pr", + "obe" + ], + [ + "▁prob", + "e" + ], + [ + "▁", + "probe" + ], + [ + "▁K", + "night" + ], + [ + "▁Kn", + "ight" + ], + [ + "▁peace", + "ful" + ], + [ + "hu", + "b" + ], + [ + "h", + "ub" + ], + [ + "▁ál", + "bum" + ], + [ + "su", + "it" + ], + [ + "s", + "uit" + ], + [ + "▁sil", + "k" + ], + [ + "+", + "=" + ], + [ + "▁p", + "ione" + ], + [ + "▁pi", + "one" + ], + [ + "'", + "\"" + ], + [ + "ка", + "ми" + ], + [ + "▁N", + "ull" + ], + [ + "▁Nu", + "ll" + ], + [ + "▁", + "Null" + ], + [ + "Label", + "s" + ], + [ + "au", + "tres" + ], + [ + "aut", + "res" + ], + [ + "autre", + "s" + ], + [ + "to", + "LowerCase" + ], + [ + "▁b", + "uzz" + ], + [ + "▁bu", + "zz" + ], + [ + "▁w", + "ashed" + ], + [ + "▁was", + "hed" + ], + [ + "▁wash", + "ed" + ], + [ + "'", + "*" + ], + [ + "itzer", + "land" + ], + [ + "▁r", + "amp" + ], + [ + "▁ra", + "mp" + ], + [ + "▁ram", + "p" + ], + [ + "▁к", + "ни" + ], + [ + "▁k", + "un" + ], + [ + "col", + "ors" + ], + [ + "color", + "s" + ], + [ + "colo", + "rs" + ], + [ + "▁vacc", + "ine" + ], + [ + "an", + "imation" + ], + [ + "anim", + "ation" + ], + [ + "▁Just", + "in" + ], + [ + "mem", + "set" + ], + [ + "▁c", + "ensus" + ], + [ + "▁cens", + "us" + ], + [ + "in", + "fl" + ], + [ + "inf", + "l" + ], + [ + "▁statist", + "ical" + ], + [ + "▁trop", + "ical" + ], + [ + "Dis", + "abled" + ], + [ + "Disable", + "d" + ], + [ + "\r", + "\r" + ], + [ + "▁Cra", + "ig" + ], + [ + "Page", + "s" + ], + [ + "Pag", + "es" + ], + [ + "P", + "ages" + ], + [ + "▁mag", + "az" + ], + [ + "▁comput", + "ing" + ], + [ + "▁flo", + "ors" + ], + [ + "▁floor", + "s" + ], + [ + "oin", + "e" + ], + [ + "oi", + "ne" + ], + [ + "o", + "ine" + ], + [ + "▁tit", + "olo" + ], + [ + "▁an", + "ci" + ], + [ + "▁anc", + "i" + ], + [ + "▁Indust", + "ry" + ], + [ + "▁г", + "лав" + ], + [ + "▁гла", + "в" + ], + [ + "Bo", + "ot" + ], + [ + "B", + "oot" + ], + [ + "Cl", + "ip" + ], + [ + "▁d", + "v" + ], + [ + "▁", + "dv" + ], + [ + "▁met", + "all" + ], + [ + "▁metal", + "l" + ], + [ + "▁meta", + "ll" + ], + [ + "▁Is", + "abel" + ], + [ + "▁Isa", + "bel" + ], + [ + "▁look", + "up" + ], + [ + "▁", + "lookup" + ], + [ + "▁ц", + "ер" + ], + [ + "▁це", + "р" + ], + [ + "▁", + "цер" + ], + [ + "▁car", + "ries" + ], + [ + "f", + "u" + ], + [ + "tp", + "l" + ], + [ + "t", + "pl" + ], + [ + "pe", + "rp" + ], + [ + "per", + "p" + ], + [ + "▁St", + "orm" + ], + [ + "▁Sto", + "rm" + ], + [ + "eh", + "icle" + ], + [ + "▁S", + "even" + ], + [ + "▁Se", + "ven" + ], + [ + "▁Sev", + "en" + ], + [ + "љ", + "а" + ], + [ + "▁l", + "ut" + ], + [ + "▁lu", + "t" + ], + [ + "th", + "reshold" + ], + [ + "▁d", + "ull" + ], + [ + "▁du", + "ll" + ], + [ + "▁E", + "ND" + ], + [ + "▁EN", + "D" + ], + [ + "▁", + "END" + ], + [ + "▁O", + "tto" + ], + [ + "▁Ot", + "to" + ], + [ + "▁Ott", + "o" + ], + [ + "▁there", + "by" + ], + [ + "TE", + "MP" + ], + [ + "T", + "EMP" + ], + [ + "▁S", + "cal" + ], + [ + "▁Sc", + "al" + ], + [ + "▁", + "Scal" + ], + [ + "Com", + "put" + ], + [ + "Comp", + "ut" + ], + [ + "ip", + "v" + ], + [ + "i", + "pv" + ], + [ + "▁ins", + "ane" + ], + [ + "▁myster", + "ious" + ], + [ + "▁M", + "is" + ], + [ + "▁Mi", + "s" + ], + [ + "uch", + "ar" + ], + [ + "uc", + "har" + ], + [ + "u", + "char" + ], + [ + "as", + "ma" + ], + [ + "asm", + "a" + ], + [ + "au", + "ch" + ], + [ + "auc", + "h" + ], + [ + "a", + "uch" + ], + [ + "ne", + "tt" + ], + [ + "net", 
+ "t" + ], + [ + "n", + "ett" + ], + [ + "El", + "em" + ], + [ + "E", + "lem" + ], + [ + "de", + "rive" + ], + [ + "der", + "ive" + ], + [ + "▁murder", + "ed" + ], + [ + "ak", + "ten" + ], + [ + "akt", + "en" + ], + [ + "akte", + "n" + ], + [ + "ро", + "ван" + ], + [ + "ров", + "ан" + ], + [ + "рова", + "н" + ], + [ + "▁a", + "nos" + ], + [ + "▁an", + "os" + ], + [ + "▁ano", + "s" + ], + [ + "▁", + "anos" + ], + [ + "}}", + "^" + ], + [ + "}", + "}^" + ], + [ + "▁F", + "uß" + ], + [ + "▁Fu", + "ß" + ], + [ + "▁S", + "ister" + ], + [ + "▁Si", + "ster" + ], + [ + "▁volunte", + "er" + ], + [ + "::", + "_" + ], + [ + ":", + ":_" + ], + [ + "er", + "ta" + ], + [ + "ert", + "a" + ], + [ + "▁бо", + "лее" + ], + [ + "og", + "rá" + ], + [ + "▁Im", + "Gui" + ], + [ + "sa", + "me" + ], + [ + "sam", + "e" + ], + [ + "s", + "ame" + ], + [ + "Sh", + "adow" + ], + [ + "▁re", + "actions" + ], + [ + "▁reaction", + "s" + ], + [ + "▁react", + "ions" + ], + [ + "▁purch", + "asing" + ], + [ + "PRE", + "FIX" + ], + [ + "▁emb", + "od" + ], + [ + "со", + "м" + ], + [ + "▁alt", + "ogether" + ], + [ + "▁prom", + "oting" + ], + [ + "▁promot", + "ing" + ], + [ + "U", + "V" + ], + [ + "▁ind", + "uced" + ], + [ + "▁indu", + "ced" + ], + [ + "▁eer", + "ste" + ], + [ + "▁eerst", + "e" + ], + [ + "Li", + "fe" + ], + [ + "Lif", + "e" + ], + [ + "L", + "ife" + ], + [ + "hd", + "d" + ], + [ + "h", + "dd" + ], + [ + "ní", + "ch" + ], + [ + "▁c", + "hill" + ], + [ + "▁ch", + "ill" + ], + [ + "▁chi", + "ll" + ], + [ + "RG", + "B" + ], + [ + "R", + "GB" + ], + [ + "red", + "uce" + ], + [ + "redu", + "ce" + ], + [ + "FR", + "OM" + ], + [ + "F", + "ROM" + ], + [ + "dir", + "name" + ], + [ + "▁t", + "une" + ], + [ + "▁tu", + "ne" + ], + [ + "▁tun", + "e" + ], + [ + "▁r", + "ay" + ], + [ + "▁ra", + "y" + ], + [ + "▁", + "ray" + ], + [ + "T", + "D" + ], + [ + "▁к", + "ъ" + ], + [ + "▁Febru", + "ar" + ], + [ + "▁suspend", + "ed" + ], + [ + "▁susp", + "ended" + ], + [ + "▁u", + "pp" + ], + [ + "▁up", + "p" + ], + [ + "▁", + "upp" + ], + [ + "er", + "i" + ], + [ + "e", + "ri" + ], + [ + "pr", + "eter" + ], + [ + "pre", + "ter" + ], + [ + "pret", + "er" + ], + [ + "▁E", + "R" + ], + [ + "▁", + "ER" + ], + [ + "то", + "н" + ], + [ + "т", + "он" + ], + [ + "▁c", + "atal" + ], + [ + "▁cat", + "al" + ], + [ + "▁ca", + "tal" + ], + [ + "▁h", + "iring" + ], + [ + "▁hi", + "ring" + ], + [ + "▁п", + "ів" + ], + [ + "▁пі", + "в" + ], + [ + "▁Olymp", + "ics" + ], + [ + "▁Olympic", + "s" + ], + [ + "da", + "le" + ], + [ + "dal", + "e" + ], + [ + "d", + "ale" + ], + [ + "::", + "{" + ], + [ + ":", + ":{" + ], + [ + "▁expl", + "oring" + ], + [ + "▁explo", + "ring" + ], + [ + "▁с", + "тал" + ], + [ + "▁ста", + "л" + ], + [ + "▁ст", + "ал" + ], + [ + "▁univers", + "ities" + ], + [ + "Class", + "es" + ], + [ + "▁ча", + "с" + ], + [ + "▁C", + "ool" + ], + [ + "▁Co", + "ol" + ], + [ + "▁S", + "ony" + ], + [ + "▁So", + "ny" + ], + [ + "▁Son", + "y" + ], + [ + "th", + "al" + ], + [ + "tha", + "l" + ], + [ + "t", + "hal" + ], + [ + "▁es", + "crit" + ], + [ + "▁esc", + "rit" + ], + [ + "▁cor", + "ruption" + ], + [ + "▁corrupt", + "ion" + ], + [ + "az", + "ar" + ], + [ + "aza", + "r" + ], + [ + "▁N", + "eb" + ], + [ + "▁Ne", + "b" + ], + [ + "▁Py", + "thon" + ], + [ + "▁c", + "him" + ], + [ + "▁ch", + "im" + ], + [ + "▁chi", + "m" + ], + [ + "▁cap", + "ability" + ], + [ + "cy", + "cl" + ], + [ + "c", + "ycl" + ], + [ + "▁re", + "try" + ], + [ + "▁r", + "etry" + ], + [ + "▁ret", + "ry" + ], + [ + "▁retr", + "y" + ], + [ + "▁", + "retry" + ], + [ + "++", + "]" + 
], + [ + "▁t", + "oy" + ], + [ + "▁to", + "y" + ], + [ + "▁T", + "erry" + ], + [ + "▁Ter", + "ry" + ], + [ + "▁Terr", + "y" + ], + [ + "View", + "ById" + ], + [ + "▁v", + "ine" + ], + [ + "▁vi", + "ne" + ], + [ + "▁vin", + "e" + ], + [ + "▁Kit", + "chen" + ], + [ + "▁B", + "iden" + ], + [ + "▁Bi", + "den" + ], + [ + "Back", + "end" + ], + [ + "gl", + "ich" + ], + [ + "g", + "lich" + ], + [ + "re", + "lation" + ], + [ + "rel", + "ation" + ], + [ + "▁rat", + "ings" + ], + [ + "▁ra", + "tings" + ], + [ + "▁rating", + "s" + ], + [ + "Execut", + "or" + ], + [ + "ibr", + "ation" + ], + [ + ">(", + ")" + ], + [ + ">", + "()" + ], + [ + "▁he", + "al" + ], + [ + "if", + "iable" + ], + [ + "ifi", + "able" + ], + [ + "par", + "k" + ], + [ + "p", + "ark" + ], + [ + "▁P", + "ete" + ], + [ + "▁Pe", + "te" + ], + [ + "▁Pet", + "e" + ], + [ + "▁tr", + "aged" + ], + [ + "▁tra", + "ged" + ], + [ + "▁trag", + "ed" + ], + [ + "▁ch", + "uck" + ], + [ + "▁wire", + "less" + ], + [ + "▁wir", + "eless" + ], + [ + "Re", + "place" + ], + [ + "Rep", + "lace" + ], + [ + "IR", + "Q" + ], + [ + "▁се", + "зо" + ], + [ + "i", + "ß" + ], + [ + "▁j", + "unto" + ], + [ + "▁jun", + "to" + ], + [ + "Lo", + "w" + ], + [ + "L", + "ow" + ], + [ + "▁s", + "id" + ], + [ + "▁si", + "d" + ], + [ + "▁", + "sid" + ], + [ + "Tag", + "Helpers" + ], + [ + "TagHelper", + "s" + ], + [ + "▁comp", + "aring" + ], + [ + "▁compar", + "ing" + ], + [ + "▁c", + "elle" + ], + [ + "▁cell", + "e" + ], + [ + "▁ce", + "lle" + ], + [ + "▁cel", + "le" + ], + [ + "▁obt", + "aining" + ], + [ + "▁obtain", + "ing" + ], + [ + "▁qu", + "ar" + ], + [ + "▁q", + "uar" + ], + [ + "Br", + "o" + ], + [ + "B", + "ro" + ], + [ + "▁E", + "C" + ], + [ + "▁", + "EC" + ], + [ + "in", + "ea" + ], + [ + "ine", + "a" + ], + [ + "i", + "nea" + ], + [ + "▁F", + "ue" + ], + [ + "▁Fu", + "e" + ], + [ + "▁Prince", + "ss" + ], + [ + "▁Prin", + "cess" + ], + [ + "ij", + "o" + ], + [ + "i", + "jo" + ], + [ + "ge", + "ns" + ], + [ + "gen", + "s" + ], + [ + "g", + "ens" + ], + [ + "PO", + "L" + ], + [ + "P", + "OL" + ], + [ + "è", + "tres" + ], + [ + "▁h", + "ind" + ], + [ + "▁hi", + "nd" + ], + [ + "▁", + "hind" + ], + [ + "Var", + "iant" + ], + [ + "Vari", + "ant" + ], + [ + "▁rece", + "ives" + ], + [ + "▁receive", + "s" + ], + [ + "go", + "d" + ], + [ + "g", + "od" + ], + [ + "ik", + "en" + ], + [ + "ike", + "n" + ], + [ + "i", + "ken" + ], + [ + "na", + "il" + ], + [ + "n", + "ail" + ], + [ + "▁amer", + "ican" + ], + [ + "▁", + "american" + ], + [ + "br", + "as" + ], + [ + "bra", + "s" + ], + [ + "b", + "ras" + ], + [ + "('", + "\\" + ], + [ + "(", + "'\\" + ], + [ + "ie", + "ce" + ], + [ + "if", + "ference" + ], + [ + "iffer", + "ence" + ], + [ + "iffe", + "rence" + ], + [ + "▁b", + "ubble" + ], + [ + "▁bub", + "ble" + ], + [ + "▁B", + "ear" + ], + [ + "▁Be", + "ar" + ], + [ + "un", + "ivers" + ], + [ + "uni", + "vers" + ], + [ + "▁demand", + "ing" + ], + [ + "sa", + "ved" + ], + [ + "save", + "d" + ], + [ + "s", + "aved" + ], + [ + "▁cred", + "entials" + ], + [ + "MS", + "M" + ], + [ + "M", + "SM" + ], + [ + "▁struct", + "ural" + ], + [ + "Con", + "s" + ], + [ + "Co", + "ns" + ], + [ + "C", + "ons" + ], + [ + "▁Way", + "ne" + ], + [ + "▁blank", + "et" + ], + [ + "▁re", + "pet" + ], + [ + "▁rep", + "et" + ], + [ + "▁repe", + "t" + ], + [ + "Ne", + "g" + ], + [ + "N", + "eg" + ], + [ + "▁exclusive", + "ly" + ], + [ + "▁exclus", + "ively" + ], + [ + "IF", + "I" + ], + [ + "I", + "FI" + ], + [ + "бур", + "г" + ], + [ + "▁arg", + "uing" + ], + [ + "▁Re", + "pub" + ], + [ + "▁Rep", + 
"ub" + ], + [ + "▁f", + "rowned" + ], + [ + "▁fr", + "owned" + ], + [ + "Met", + "ric" + ], + [ + "M", + "etric" + ], + [ + "sk", + "im" + ], + [ + "ski", + "m" + ], + [ + "s", + "kim" + ], + [ + "▁П", + "ет" + ], + [ + "▁Пе", + "т" + ], + [ + "▁rele", + "ases" + ], + [ + "▁release", + "s" + ], + [ + "▁t", + "ast" + ], + [ + "▁ta", + "st" + ], + [ + "▁p", + "reference" + ], + [ + "▁pre", + "ference" + ], + [ + "▁prefer", + "ence" + ], + [ + "▁S", + "üd" + ], + [ + "▁Sü", + "d" + ], + [ + "oc", + "c" + ], + [ + "o", + "cc" + ], + [ + "▁r", + "x" + ], + [ + "▁", + "rx" + ], + [ + "activ", + "ate" + ], + [ + "cl", + "am" + ], + [ + "c", + "lam" + ], + [ + "▁фи", + "ль" + ], + [ + "▁Sud", + "denly" + ], + [ + "▁cr", + "ushing" + ], + [ + "▁crush", + "ing" + ], + [ + "▁L", + "ower" + ], + [ + "▁Lo", + "wer" + ], + [ + "▁Low", + "er" + ], + [ + "▁", + "Lower" + ], + [ + "ei", + "ng" + ], + [ + "e", + "ing" + ], + [ + "wa", + "lt" + ], + [ + "wal", + "t" + ], + [ + "w", + "alt" + ], + [ + "▁Г", + "ер" + ], + [ + "▁Ге", + "р" + ], + [ + "▁m", + "ö" + ], + [ + "ри", + "сто" + ], + [ + "la", + "gen" + ], + [ + "lag", + "en" + ], + [ + "lage", + "n" + ], + [ + "l", + "agen" + ], + [ + "▁co", + "aching" + ], + [ + "▁coach", + "ing" + ], + [ + "ight", + "ers" + ], + [ + "igh", + "ters" + ], + [ + "ighter", + "s" + ], + [ + "▁bas", + "ement" + ], + [ + "▁base", + "ment" + ], + [ + "▁F", + "IX" + ], + [ + "▁FI", + "X" + ], + [ + "▁", + "FIX" + ], + [ + "Te", + "le" + ], + [ + "T", + "ele" + ], + [ + "With", + "out" + ], + [ + "▁Com", + "mons" + ], + [ + "▁Comm", + "ons" + ], + [ + "▁Common", + "s" + ], + [ + "ul", + "ly" + ], + [ + "ull", + "y" + ], + [ + "h", + "box" + ], + [ + "fl", + "ash" + ], + [ + "▁por", + "tal" + ], + [ + "▁port", + "al" + ], + [ + "▁", + "portal" + ], + [ + "ot", + "ype" + ], + [ + "o", + "type" + ], + [ + "▁S", + "or" + ], + [ + "▁So", + "r" + ], + [ + "▁trou", + "bles" + ], + [ + "▁trouble", + "s" + ], + [ + "ar", + "si" + ], + [ + "ars", + "i" + ], + [ + "▁с", + "тан" + ], + [ + "▁ста", + "н" + ], + [ + "▁ст", + "ан" + ], + [ + "▁", + "стан" + ], + [ + "CA", + "M" + ], + [ + "C", + "AM" + ], + [ + "▁de", + "notes" + ], + [ + "▁den", + "otes" + ], + [ + "▁denote", + "s" + ], + [ + "LA", + "NG" + ], + [ + "LAN", + "G" + ], + [ + "L", + "ANG" + ], + [ + "▁Be", + "yond" + ], + [ + "▁Bey", + "ond" + ], + [ + "▁Bo", + "wl" + ], + [ + "▁Bow", + "l" + ], + [ + "▁import", + "antly" + ], + [ + "▁important", + "ly" + ], + [ + "▁W", + "R" + ], + [ + "▁", + "WR" + ], + [ + "▁rel", + "ating" + ], + [ + "▁a", + "nder" + ], + [ + "▁and", + "er" + ], + [ + "▁an", + "der" + ], + [ + "▁", + "ander" + ], + [ + "▁gr", + "inned" + ], + [ + "▁grin", + "ned" + ], + [ + "▁D", + "ak" + ], + [ + "▁Da", + "k" + ], + [ + "▁Brook", + "lyn" + ], + [ + "▁d", + "p" + ], + [ + "▁", + "dp" + ], + [ + "▁P", + "oly" + ], + [ + "▁Pol", + "y" + ], + [ + "▁Po", + "ly" + ], + [ + "▁", + "Poly" + ], + [ + "▁Sch", + "ul" + ], + [ + "▁B", + "uffer" + ], + [ + "▁Buff", + "er" + ], + [ + "▁", + "Buffer" + ], + [ + "▁h", + "older" + ], + [ + "▁hold", + "er" + ], + [ + "▁hol", + "der" + ], + [ + "▁", + "holder" + ], + [ + "IC", + "AL" + ], + [ + "I", + "CAL" + ], + [ + "▁tra", + "iler" + ], + [ + "▁trail", + "er" + ], + [ + "er", + "ek" + ], + [ + "ere", + "k" + ], + [ + "e", + "rek" + ], + [ + "▁n", + "ě" + ], + [ + "▁", + "ně" + ], + [ + "sh", + "aped" + ], + [ + "shape", + "d" + ], + [ + "sha", + "ped" + ], + [ + ":", + "`" + ], + [ + "▁de", + "code" + ], + [ + "▁dec", + "ode" + ], + [ + "▁", + "decode" + ], + [ + 
"▁co", + "unted" + ], + [ + "▁coun", + "ted" + ], + [ + "▁count", + "ed" + ], + [ + "▁v", + "amp" + ], + [ + "▁va", + "mp" + ], + [ + "▁re", + "late" + ], + [ + "▁rel", + "ate" + ], + [ + "▁M", + "ason" + ], + [ + "▁Ma", + "son" + ], + [ + "▁Mas", + "on" + ], + [ + "▁t", + "itled" + ], + [ + "▁title", + "d" + ], + [ + "▁tit", + "led" + ], + [ + "▁Kent", + "ucky" + ], + [ + "▁particip", + "ated" + ], + [ + "▁participate", + "d" + ], + [ + "▁Jenn", + "ifer" + ], + [ + "▁mat", + "rices" + ], + [ + "Cal", + "endar" + ], + [ + "st", + "s" + ], + [ + "s", + "ts" + ], + [ + "Ass", + "oci" + ], + [ + "▁f", + "orum" + ], + [ + "▁for", + "um" + ], + [ + "▁fo", + "rum" + ], + [ + "▁s", + "phere" + ], + [ + "▁sp", + "here" + ], + [ + "▁spher", + "e" + ], + [ + "▁S", + "EO" + ], + [ + "▁SE", + "O" + ], + [ + "pop", + "up" + ], + [ + "▁Current", + "ly" + ], + [ + "CL", + "E" + ], + [ + "C", + "LE" + ], + [ + "▁vol", + "unt" + ], + [ + "▁stell", + "ar" + ], + [ + "for", + "all" + ], + [ + "Is", + "s" + ], + [ + "I", + "ss" + ], + [ + "im", + "et" + ], + [ + "ime", + "t" + ], + [ + "i", + "met" + ], + [ + "q", + "p" + ], + [ + "la", + "test" + ], + [ + "lat", + "est" + ], + [ + "late", + "st" + ], + [ + "▁config", + "ured" + ], + [ + "▁configure", + "d" + ], + [ + "ab", + "ol" + ], + [ + "a", + "bol" + ], + [ + "ig", + "ent" + ], + [ + "igen", + "t" + ], + [ + "ige", + "nt" + ], + [ + "i", + "gent" + ], + [ + "▁delay", + "ed" + ], + [ + "ff", + "ic" + ], + [ + "f", + "fic" + ], + [ + "▁g", + "ing" + ], + [ + "▁gi", + "ng" + ], + [ + "▁", + "ging" + ], + [ + "▁s", + "cent" + ], + [ + "▁sc", + "ent" + ], + [ + "▁scen", + "t" + ], + [ + "▁disg", + "ust" + ], + [ + "▁disgu", + "st" + ], + [ + "he", + "sis" + ], + [ + "hes", + "is" + ], + [ + "h", + "esis" + ], + [ + "im", + "en" + ], + [ + "ime", + "n" + ], + [ + "i", + "men" + ], + [ + "▁re", + "ign" + ], + [ + "▁П", + "и" + ], + [ + "ul", + "as" + ], + [ + "ula", + "s" + ], + [ + "u", + "las" + ], + [ + "um", + "ing" + ], + [ + "umin", + "g" + ], + [ + "umi", + "ng" + ], + [ + "u", + "ming" + ], + [ + "in", + "nings" + ], + [ + "inn", + "ings" + ], + [ + "Re", + "nd" + ], + [ + "R", + "end" + ], + [ + "id", + "ity" + ], + [ + "idi", + "ty" + ], + [ + "▁do", + "zens" + ], + [ + "▁dozen", + "s" + ], + [ + "wa", + "rf" + ], + [ + "war", + "f" + ], + [ + "▁Del", + "hi" + ], + [ + "▁bi", + "ological" + ], + [ + "▁corrid", + "or" + ], + [ + "Vis", + "ual" + ], + [ + "▁I", + "z" + ], + [ + "▁s", + "uits" + ], + [ + "▁su", + "its" + ], + [ + "▁suit", + "s" + ], + [ + "Py", + "Object" + ], + [ + "ia", + "go" + ], + [ + "i", + "ago" + ], + [ + "▁div", + "ide" + ], + [ + "▁divid", + "e" + ], + [ + "pe", + "nt" + ], + [ + "pen", + "t" + ], + [ + "p", + "ent" + ], + [ + "hel", + "lo" + ], + [ + "hell", + "o" + ], + [ + "h", + "ello" + ], + [ + "▁b", + "eta" + ], + [ + "▁be", + "ta" + ], + [ + "▁bet", + "a" + ], + [ + "▁", + "beta" + ], + [ + "▁ex", + "terior" + ], + [ + "▁fin", + "est" + ], + [ + "▁fine", + "st" + ], + [ + "▁B", + "ir" + ], + [ + "▁Bi", + "r" + ], + [ + "▁f", + "reed" + ], + [ + "▁fr", + "eed" + ], + [ + "▁free", + "d" + ], + [ + "▁fre", + "ed" + ], + [ + "▁K", + "el" + ], + [ + "▁Ke", + "l" + ], + [ + "Se", + "m" + ], + [ + "S", + "em" + ], + [ + "▁fr", + "uits" + ], + [ + "▁fruit", + "s" + ], + [ + "▁fru", + "its" + ], + [ + "▁serv", + "ants" + ], + [ + "▁servant", + "s" + ], + [ + "▁pub", + "lisher" + ], + [ + "▁publish", + "er" + ], + [ + "▁cop", + "per" + ], + [ + "ol", + "ation" + ], + [ + "o", + "lation" + ], + [ + "se", + "p" + ], + [ + "s", + 
"ep" + ], + [ + "▁chair", + "man" + ], + [ + "ti", + "k" + ], + [ + "t", + "ik" + ], + [ + "▁m", + "others" + ], + [ + "▁mother", + "s" + ], + [ + "▁mo", + "thers" + ], + [ + "A", + "ug" + ], + [ + "▁je", + "ans" + ], + [ + "[]", + ")" + ], + [ + "[", + "])" + ], + [ + "▁D", + "ATA" + ], + [ + "▁DA", + "TA" + ], + [ + "▁", + "DATA" + ], + [ + "▁reve", + "als" + ], + [ + "▁reveal", + "s" + ], + [ + "▁un", + "conscious" + ], + [ + "▁h", + "acer" + ], + [ + "▁ha", + "cer" + ], + [ + "▁hace", + "r" + ], + [ + "ric", + "ulum" + ], + [ + "▁T", + "ogether" + ], + [ + "▁ш", + "та" + ], + [ + "▁", + "шта" + ], + [ + "or", + "sz" + ], + [ + "ors", + "z" + ], + [ + "▁c", + "anal" + ], + [ + "▁can", + "al" + ], + [ + "▁ca", + "nal" + ], + [ + "ös", + "t" + ], + [ + "ö", + "st" + ], + [ + "▁equ", + "als" + ], + [ + "▁equal", + "s" + ], + [ + "▁eq", + "uals" + ], + [ + "▁", + "equals" + ], + [ + "▁по", + "мо" + ], + [ + "▁al", + "location" + ], + [ + "▁all", + "ocation" + ], + [ + "▁alloc", + "ation" + ], + [ + "st", + "änd" + ], + [ + "▁ч", + "ер" + ], + [ + "▁че", + "р" + ], + [ + "ac", + "king" + ], + [ + "ack", + "ing" + ], + [ + "▁motiv", + "ation" + ], + [ + "со", + "н" + ], + [ + "с", + "он" + ], + [ + "▁R", + "ole" + ], + [ + "▁Ro", + "le" + ], + [ + "▁Rol", + "e" + ], + [ + "▁", + "Role" + ], + [ + "App", + "ly" + ], + [ + "Ap", + "ply" + ], + [ + "ig", + "es" + ], + [ + "ige", + "s" + ], + [ + "i", + "ges" + ], + [ + "*", + "{" + ], + [ + "▁f", + "ires" + ], + [ + "▁fire", + "s" + ], + [ + "▁fi", + "res" + ], + [ + "▁fir", + "es" + ], + [ + "Us", + "ed" + ], + [ + "Use", + "d" + ], + [ + "U", + "sed" + ], + [ + "▁he", + "ute" + ], + [ + "sk", + "iej" + ], + [ + "ski", + "ej" + ], + [ + "▁Or", + "leans" + ], + [ + "yl", + "an" + ], + [ + "y", + "lan" + ], + [ + "▁warm", + "th" + ], + [ + "▁w", + "elfare" + ], + [ + "▁wel", + "fare" + ], + [ + "je", + "m" + ], + [ + "j", + "em" + ], + [ + "▁си", + "сте" + ], + [ + "be", + "z" + ], + [ + "b", + "ez" + ], + [ + "ř", + "e" + ], + [ + "ke", + "e" + ], + [ + "k", + "ee" + ], + [ + "▁segu", + "ito" + ], + [ + "un", + "ge" + ], + [ + "ung", + "e" + ], + [ + "▁y", + "oga" + ], + [ + "▁yo", + "ga" + ], + [ + "▁d", + "ug" + ], + [ + "▁du", + "g" + ], + [ + "▁rest", + "ored" + ], + [ + "▁restore", + "d" + ], + [ + "Dr", + "oid" + ], + [ + "D", + "roid" + ], + [ + "▁P", + "ent" + ], + [ + "▁Pe", + "nt" + ], + [ + "▁Pen", + "t" + ], + [ + "▁ran", + "king" + ], + [ + "▁rank", + "ing" + ], + [ + "mo", + "r" + ], + [ + "m", + "or" + ], + [ + ".~", + "(\\" + ], + [ + "ograph", + "ical" + ], + [ + "ographic", + "al" + ], + [ + "▁p", + "ian" + ], + [ + "▁pi", + "an" + ], + [ + "▁g", + "ates" + ], + [ + "▁gate", + "s" + ], + [ + "▁ga", + "tes" + ], + [ + "▁с", + "ти" + ], + [ + "▁ст", + "и" + ], + [ + "▁", + "сти" + ], + [ + "s", + "quare" + ], + [ + "▁im", + "plicit" + ], + [ + "▁impl", + "icit" + ], + [ + "▁G", + "ram" + ], + [ + "▁Gr", + "am" + ], + [ + "▁Gra", + "m" + ], + [ + "▁Apr", + "ès" + ], + [ + "▁Ap", + "rès" + ], + [ + "▁Ass", + "istant" + ], + [ + "▁p", + "ac" + ], + [ + "▁pa", + "c" + ], + [ + "▁P", + "ope" + ], + [ + "▁Po", + "pe" + ], + [ + "▁Pop", + "e" + ], + [ + "г", + "ре" + ], + [ + "▁sc", + "attering" + ], + [ + "▁scatter", + "ing" + ], + [ + "стра", + "тив" + ], + [ + "▁all", + "ocate" + ], + [ + "▁alloc", + "ate" + ], + [ + "▁Man", + "hattan" + ], + [ + "▁а", + "нг" + ], + [ + "▁ан", + "г" + ], + [ + "▁", + "анг" + ], + [ + "▁inter", + "rupted" + ], + [ + "▁interrupt", + "ed" + ], + [ + "ér", + "ieur" + ], + [ + "éri", + "eur" + ], + [ + 
"érie", + "ur" + ], + [ + "数", + "据" + ], + [ + "Sign", + "al" + ], + [ + "Sig", + "nal" + ], + [ + "▁Con", + "tract" + ], + [ + "▁Cont", + "ract" + ], + [ + "▁", + "Contract" + ], + [ + "ór", + "ia" + ], + [ + "ó", + "ria" + ], + [ + "WI", + "TH" + ], + [ + "W", + "ITH" + ], + [ + "хо", + "дя" + ], + [ + "ход", + "я" + ], + [ + "Ag", + "greg" + ], + [ + "A", + "ggreg" + ], + [ + "cul", + "es" + ], + [ + "cu", + "les" + ], + [ + "cule", + "s" + ], + [ + "c", + "ules" + ], + [ + "J", + "an" + ], + [ + "▁s", + "to" + ], + [ + "▁st", + "o" + ], + [ + "▁", + "sto" + ], + [ + "▁G", + "PIO" + ], + [ + "▁GP", + "IO" + ], + [ + "▁", + "GPIO" + ], + [ + "▁ident", + "ifying" + ], + [ + "▁identify", + "ing" + ], + [ + "▁p", + "id" + ], + [ + "▁pi", + "d" + ], + [ + "▁", + "pid" + ], + [ + "ę", + "p" + ], + [ + "▁di", + "git" + ], + [ + "▁dig", + "it" + ], + [ + "el", + "ia" + ], + [ + "eli", + "a" + ], + [ + "e", + "lia" + ], + [ + "inv", + "oke" + ], + [ + "▁Fl", + "oren" + ], + [ + "▁Flor", + "en" + ], + [ + "▁Flo", + "ren" + ], + [ + "▁sh", + "allow" + ], + [ + "▁shall", + "ow" + ], + [ + "get", + "Class" + ], + [ + "getC", + "lass" + ], + [ + "▁advert", + "is" + ], + [ + "ем", + "ы" + ], + [ + "е", + "мы" + ], + [ + "▁H", + "R" + ], + [ + "▁", + "HR" + ], + [ + "ym", + "an" + ], + [ + "y", + "man" + ], + [ + "▁C", + "E" + ], + [ + "▁", + "CE" + ], + [ + "▁sec", + "ured" + ], + [ + "▁secure", + "d" + ], + [ + "▁secur", + "ed" + ], + [ + "▁rel", + "atives" + ], + [ + "▁relative", + "s" + ], + [ + "▁relativ", + "es" + ], + [ + "▁s", + "ob" + ], + [ + "▁so", + "b" + ], + [ + "▁s", + "tab" + ], + [ + "▁st", + "ab" + ], + [ + "▁sta", + "b" + ], + [ + "Trans", + "ition" + ], + [ + "▁w", + "en" + ], + [ + "▁we", + "n" + ], + [ + "▁", + "wen" + ], + [ + "sh", + "ops" + ], + [ + "shop", + "s" + ], + [ + "▁k", + "ont" + ], + [ + "▁kon", + "t" + ], + [ + "▁ko", + "nt" + ], + [ + "▁h", + "acia" + ], + [ + "▁ha", + "cia" + ], + [ + "H", + "y" + ], + [ + "в", + "ри" + ], + [ + "sh", + "ell" + ], + [ + "she", + "ll" + ], + [ + "s", + "hell" + ], + [ + "▁ant", + "ib" + ], + [ + "▁anti", + "b" + ], + [ + "env", + "ironment" + ], + [ + "environ", + "ment" + ], + [ + "um", + "bs" + ], + [ + "umb", + "s" + ], + [ + "Tr", + "acker" + ], + [ + "Track", + "er" + ], + [ + "Tra", + "cker" + ], + [ + "en", + "tr" + ], + [ + "ent", + "r" + ], + [ + "▁Polit", + "ical" + ], + [ + "ex", + "tract" + ], + [ + "ext", + "ract" + ], + [ + "extra", + "ct" + ], + [ + "extr", + "act" + ], + [ + "=\"", + "{{" + ], + [ + "▁m", + "erc" + ], + [ + "▁me", + "rc" + ], + [ + "▁mer", + "c" + ], + [ + "▁p", + "oc" + ], + [ + "▁po", + "c" + ], + [ + "▁Re", + "set" + ], + [ + "▁Res", + "et" + ], + [ + "▁", + "Reset" + ], + [ + "▁pur", + "ely" + ], + [ + "▁pure", + "ly" + ], + [ + "▁M", + "ul" + ], + [ + "▁Mu", + "l" + ], + [ + "▁gorge", + "ous" + ], + [ + "▁Î", + "n" + ], + [ + "ri", + "ven" + ], + [ + "riv", + "en" + ], + [ + "rive", + "n" + ], + [ + "r", + "iven" + ], + [ + "▁rom", + "ance" + ], + [ + "▁roman", + "ce" + ], + [ + "▁d", + "av" + ], + [ + "▁da", + "v" + ], + [ + "че", + "ского" + ], + [ + "ér", + "ica" + ], + [ + "éri", + "ca" + ], + [ + "éric", + "a" + ], + [ + "▁tra", + "ject" + ], + [ + "▁a", + "rise" + ], + [ + "▁ar", + "ise" + ], + [ + "▁sw", + "ung" + ], + [ + "▁p", + "ockets" + ], + [ + "▁pocket", + "s" + ], + [ + "▁trad", + "itions" + ], + [ + "▁tradition", + "s" + ], + [ + "▁re", + "ver" + ], + [ + "▁r", + "ever" + ], + [ + "▁rev", + "er" + ], + [ + "▁reve", + "r" + ], + [ + ">>", + ">" + ], + [ + ">", + ">>" + ], + [ + 
"▁n", + "d" + ], + [ + "▁", + "nd" + ], + [ + "▁di", + "vis" + ], + [ + "▁div", + "is" + ], + [ + "▁bel", + "oved" + ], + [ + "▁quant", + "ities" + ], + [ + "▁é", + "d" + ], + [ + "▁", + "éd" + ], + [ + "ien", + "do" + ], + [ + "i", + "endo" + ], + [ + "▁tal", + "ented" + ], + [ + "▁talent", + "ed" + ], + [ + "▁C", + "ad" + ], + [ + "▁Ca", + "d" + ], + [ + "▁В", + "ла" + ], + [ + "▁imm", + "igration" + ], + [ + "▁immigr", + "ation" + ], + [ + "▁ju", + "ris" + ], + [ + "▁jur", + "is" + ], + [ + "▁a", + "er" + ], + [ + "▁e", + "aten" + ], + [ + "▁eat", + "en" + ], + [ + "▁m", + "iejsc" + ], + [ + "▁sum", + "mon" + ], + [ + "pe", + "ople" + ], + [ + "▁g", + "ains" + ], + [ + "▁gain", + "s" + ], + [ + "▁ga", + "ins" + ], + [ + "▁пра", + "во" + ], + [ + "▁restr", + "iction" + ], + [ + "▁restrict", + "ion" + ], + [ + "st", + "ub" + ], + [ + "▁b", + "out" + ], + [ + "▁bo", + "ut" + ], + [ + "▁bou", + "t" + ], + [ + "▁slave", + "ry" + ], + [ + "▁sla", + "very" + ], + [ + "▁comput", + "ation" + ], + [ + "▁ar", + "mor" + ], + [ + "▁arm", + "or" + ], + [ + "▁e", + "k" + ], + [ + "▁", + "ek" + ], + [ + "▁Muslim", + "s" + ], + [ + "▁co", + "operation" + ], + [ + "▁cooper", + "ation" + ], + [ + "▁enh", + "anced" + ], + [ + "▁enhance", + "d" + ], + [ + "os", + "lav" + ], + [ + "▁ab", + "rupt" + ], + [ + "▁pod", + "cast" + ], + [ + "▁hospital", + "s" + ], + [ + "▁hosp", + "itals" + ], + [ + "нь", + "о" + ], + [ + "▁hot", + "els" + ], + [ + "▁hotel", + "s" + ], + [ + "▁Wik", + "ipedia" + ], + [ + "▁ж", + "ен" + ], + [ + "▁же", + "н" + ], + [ + "▁", + "жен" + ], + [ + "G", + "LOBAL" + ], + [ + "▁Commun", + "ist" + ], + [ + "an", + "gles" + ], + [ + "ang", + "les" + ], + [ + "angle", + "s" + ], + [ + "▁t", + "high" + ], + [ + "▁th", + "igh" + ], + [ + "▁K", + "irk" + ], + [ + "▁Kir", + "k" + ], + [ + "▁t", + "ends" + ], + [ + "▁ten", + "ds" + ], + [ + "▁tend", + "s" + ], + [ + "▁M", + "ode" + ], + [ + "▁Mod", + "e" + ], + [ + "▁Mo", + "de" + ], + [ + "▁", + "Mode" + ], + [ + "▁N", + "atur" + ], + [ + "▁Nat", + "ur" + ], + [ + "▁de", + "let" + ], + [ + "▁del", + "et" + ], + [ + "▁po", + "pul" + ], + [ + "▁pop", + "ul" + ], + [ + "▁Ch", + "amber" + ], + [ + "▁Cha", + "mber" + ], + [ + "▁Conserv", + "ative" + ], + [ + "kr", + "ieg" + ], + [ + "k", + "rieg" + ], + [ + "▁Class", + "ic" + ], + [ + "▁die", + "sem" + ], + [ + "▁dies", + "em" + ], + [ + "▁diese", + "m" + ], + [ + "▁em", + "power" + ], + [ + "▁emp", + "ower" + ], + [ + "▁M", + "es" + ], + [ + "▁Me", + "s" + ], + [ + "▁de", + "alt" + ], + [ + "▁deal", + "t" + ], + [ + "▁e", + "stad" + ], + [ + "▁est", + "ad" + ], + [ + "▁esta", + "d" + ], + [ + "▁Se", + "it" + ], + [ + "▁cred", + "its" + ], + [ + "▁credit", + "s" + ], + [ + "sub", + "subsection" + ], + [ + "Inv", + "oke" + ], + [ + "▁phys", + "ician" + ], + [ + "це", + "в" + ], + [ + "ц", + "ев" + ], + [ + "ás", + "a" + ], + [ + "á", + "sa" + ], + [ + "▁g", + "ob" + ], + [ + "▁go", + "b" + ], + [ + "▁R", + "ug" + ], + [ + "▁Ru", + "g" + ], + [ + "▁м", + "іс" + ], + [ + "▁мі", + "с" + ], + [ + "sh", + "aller" + ], + [ + "shal", + "ler" + ], + [ + "shall", + "er" + ], + [ + "▁k", + "ol" + ], + [ + "▁ko", + "l" + ], + [ + "▁", + "kol" + ], + [ + "▁c", + "ared" + ], + [ + "▁car", + "ed" + ], + [ + "▁care", + "d" + ], + [ + "▁ca", + "red" + ], + [ + "▁of", + "icial" + ], + [ + "no", + "s" + ], + [ + "n", + "os" + ], + [ + "▁j", + "el" + ], + [ + "▁je", + "l" + ], + [ + "▁", + "jel" + ], + [ + "null", + "able" + ], + [ + "GU", + "I" + ], + [ + "G", + "UI" + ], + [ + "▁r", + "app" + ], + [ + "▁rap", + "p" + 
], + [ + "▁ra", + "pp" + ], + [ + "▁An", + "nie" + ], + [ + "▁Ann", + "ie" + ], + [ + "▁st", + "ocks" + ], + [ + "▁stock", + "s" + ], + [ + "▁sto", + "cks" + ], + [ + "▁develop", + "er" + ], + [ + "▁pl", + "acement" + ], + [ + "▁place", + "ment" + ], + [ + "▁plac", + "ement" + ], + [ + "▁", + "placement" + ], + [ + "(\"", + "<" + ], + [ + "▁l", + "avor" + ], + [ + "▁la", + "vor" + ], + [ + "▁lav", + "or" + ], + [ + "▁acc", + "us" + ], + [ + "Mar", + "t" + ], + [ + "Ma", + "rt" + ], + [ + "M", + "art" + ], + [ + "amer", + "ikan" + ], + [ + "▁sk", + "etch" + ], + [ + "▁sent", + "iment" + ], + [ + "▁а", + "мерикан" + ], + [ + "An", + "chor" + ], + [ + "Mer", + "ge" + ], + [ + "Pe", + "ople" + ], + [ + "▁rend", + "ered" + ], + [ + "▁render", + "ed" + ], + [ + "▁la", + "und" + ], + [ + "▁n", + "ons" + ], + [ + "▁no", + "ns" + ], + [ + "▁non", + "s" + ], + [ + "▁bl", + "ew" + ], + [ + "▁ble", + "w" + ], + [ + "k", + "b" + ], + [ + "ate", + "gor" + ], + [ + "ateg", + "or" + ], + [ + "▁franç", + "aise" + ], + [ + "▁français", + "e" + ], + [ + "KE", + "N" + ], + [ + "K", + "EN" + ], + [ + "method", + "s" + ], + [ + "▁Part", + "icip" + ], + [ + "nost", + "i" + ], + [ + "nos", + "ti" + ], + [ + "n", + "osti" + ], + [ + "▁com", + "merce" + ], + [ + "▁commer", + "ce" + ], + [ + "▁", + "commerce" + ], + [ + "▁до", + "ма" + ], + [ + "▁d", + "re" + ], + [ + "▁dr", + "e" + ], + [ + "▁t", + "win" + ], + [ + "▁tw", + "in" + ], + [ + "▁ded", + "ic" + ], + [ + "▁U", + "TC" + ], + [ + "▁", + "UTC" + ], + [ + "We", + "ek" + ], + [ + "▁differ", + "ential" + ], + [ + "▁different", + "ial" + ], + [ + "л", + "ё" + ], + [ + "▁Ch", + "oose" + ], + [ + "▁Cho", + "ose" + ], + [ + "▁\"", + "(" + ], + [ + "▁то", + "м" + ], + [ + "▁", + "том" + ], + [ + "▁про", + "фе" + ], + [ + "em", + "ark" + ], + [ + "e", + "mark" + ], + [ + "▁fe", + "ared" + ], + [ + "▁fear", + "ed" + ], + [ + "sk", + "o" + ], + [ + "s", + "ko" + ], + [ + "Br", + "anch" + ], + [ + "▁in", + "vention" + ], + [ + "▁inv", + "ention" + ], + [ + "▁invent", + "ion" + ], + [ + "er", + "mine" + ], + [ + "erm", + "ine" + ], + [ + "▁car", + "act" + ], + [ + "▁ca", + "ract" + ], + [ + "ро", + "го" + ], + [ + "р", + "ого" + ], + [ + "lo", + "yd" + ], + [ + "▁ку", + "ль" + ], + [ + "▁", + "куль" + ], + [ + "▁del", + "icate" + ], + [ + "Or", + "gan" + ], + [ + "▁Im", + "pro" + ], + [ + "▁Imp", + "ro" + ], + [ + "▁r", + "ip" + ], + [ + "▁ri", + "p" + ], + [ + "▁", + "rip" + ], + [ + "Up", + "dated" + ], + [ + "Update", + "d" + ], + [ + "ul", + "ent" + ], + [ + "ule", + "nt" + ], + [ + "▁o", + "bra" + ], + [ + "▁ob", + "ra" + ], + [ + "s", + "uspend" + ], + [ + "Line", + "s" + ], + [ + "Lin", + "es" + ], + [ + "Li", + "nes" + ], + [ + "L", + "ines" + ], + [ + "▁b", + "anda" + ], + [ + "▁band", + "a" + ], + [ + "▁ban", + "da" + ], + [ + "ot", + "ta" + ], + [ + "ott", + "a" + ], + [ + "o", + "tta" + ], + [ + "▁k", + "ole" + ], + [ + "▁ko", + "le" + ], + [ + "▁kol", + "e" + ], + [ + "il", + "io" + ], + [ + "ili", + "o" + ], + [ + "i", + "lio" + ], + [ + "▁output", + "s" + ], + [ + "▁", + "outputs" + ], + [ + "est", + "ro" + ], + [ + "estr", + "o" + ], + [ + "AAAA", + "AAAA" + ], + [ + "R", + "UN" + ], + [ + "ne", + "nt" + ], + [ + "nen", + "t" + ], + [ + "n", + "ent" + ], + [ + "▁d", + "ated" + ], + [ + "▁da", + "ted" + ], + [ + "▁dat", + "ed" + ], + [ + "▁date", + "d" + ], + [ + "▁", + "dated" + ], + [ + "▁s", + "py" + ], + [ + "▁sp", + "y" + ], + [ + "▁c", + "rap" + ], + [ + "▁cr", + "ap" + ], + [ + "▁in", + "coming" + ], + [ + "▁inc", + "oming" + ], + [ + "▁ф", + "ев" 
+ ], + [ + "▁фе", + "в" + ], + [ + "PH", + "Y" + ], + [ + "P", + "HY" + ], + [ + "▁O", + "range" + ], + [ + "▁Or", + "ange" + ], + [ + "▁ob", + "server" + ], + [ + "▁observ", + "er" + ], + [ + "▁observe", + "r" + ], + [ + "▁up", + "stairs" + ], + [ + "ion", + "ed" + ], + [ + "io", + "ned" + ], + [ + "ione", + "d" + ], + [ + "i", + "oned" + ], + [ + "▁a", + "tr" + ], + [ + "▁at", + "r" + ], + [ + "▁", + "atr" + ], + [ + "igh", + "bor" + ], + [ + "▁expect", + "ation" + ], + [ + "Hi", + "s" + ], + [ + "H", + "is" + ], + [ + "im", + "edia" + ], + [ + "i", + "media" + ], + [ + "com", + "put" + ], + [ + "comp", + "ut" + ], + [ + "▁arg", + "v" + ], + [ + "▁", + "argv" + ], + [ + "▁ear", + "liest" + ], + [ + "та", + "ли" + ], + [ + "тал", + "и" + ], + [ + "т", + "али" + ], + [ + "мо", + "н" + ], + [ + "м", + "он" + ], + [ + "ol", + "len" + ], + [ + "oll", + "en" + ], + [ + "ra", + "ke" + ], + [ + "r", + "ake" + ], + [ + "▁pat", + "ience" + ], + [ + "ходи", + "т" + ], + [ + "ход", + "ит" + ], + [ + "▁де", + "ка" + ], + [ + "▁bu", + "yers" + ], + [ + "▁buy", + "ers" + ], + [ + "▁buyer", + "s" + ], + [ + "▁Conne", + "ct" + ], + [ + "▁", + "Connect" + ], + [ + "▁Univers", + "al" + ], + [ + "▁adjust", + "ed" + ], + [ + "▁adj", + "usted" + ], + [ + "im", + "eq" + ], + [ + "ime", + "q" + ], + [ + "el", + "lers" + ], + [ + "ell", + "ers" + ], + [ + "elle", + "rs" + ], + [ + "eller", + "s" + ], + [ + "▁ru", + "in" + ], + [ + "▁Cr", + "usher" + ], + [ + "▁Freder", + "ick" + ], + [ + "ott", + "age" + ], + [ + "otta", + "ge" + ], + [ + "▁com", + "prom" + ], + [ + "▁comp", + "rom" + ], + [ + "▁compr", + "om" + ], + [ + "ia", + "sm" + ], + [ + "ias", + "m" + ], + [ + "i", + "asm" + ], + [ + "wa", + "ve" + ], + [ + "w", + "ave" + ], + [ + "▁encour", + "aging" + ], + [ + "▁be", + "ans" + ], + [ + "▁bean", + "s" + ], + [ + "▁", + "beans" + ], + [ + "▁per", + "ceived" + ], + [ + "…", + "]" + ], + [ + "▁gl", + "obe" + ], + [ + "▁glob", + "e" + ], + [ + "▁glo", + "be" + ], + [ + "▁S", + "F" + ], + [ + "▁", + "SF" + ], + [ + "he", + "rent" + ], + [ + "her", + "ent" + ], + [ + "here", + "nt" + ], + [ + "▁a", + "like" + ], + [ + "▁al", + "ike" + ], + [ + "▁ali", + "ke" + ], + [ + "▁hur", + "ried" + ], + [ + "qu", + "el" + ], + [ + "que", + "l" + ], + [ + "q", + "uel" + ], + [ + "▁mus", + "icians" + ], + [ + "▁music", + "ians" + ], + [ + "▁musician", + "s" + ], + [ + "ar", + "z" + ], + [ + "a", + "rz" + ], + [ + "по", + "в" + ], + [ + "п", + "ов" + ], + [ + "drop", + "down" + ], + [ + "ac", + "l" + ], + [ + "a", + "cl" + ], + [ + "pre", + "view" + ], + [ + "prev", + "iew" + ], + [ + "p", + "review" + ], + [ + "▁under", + "neath" + ], + [ + "ze", + "ś" + ], + [ + "▁fem", + "ales" + ], + [ + "▁female", + "s" + ], + [ + "list", + "ener" + ], + [ + "listen", + "er" + ], + [ + "▁C", + "AN" + ], + [ + "▁CA", + "N" + ], + [ + "▁", + "CAN" + ], + [ + "▁T", + "ow" + ], + [ + "▁To", + "w" + ], + [ + "▁pe", + "ers" + ], + [ + "▁peer", + "s" + ], + [ + "tl", + "s" + ], + [ + "t", + "ls" + ], + [ + "at", + "ra" + ], + [ + "atr", + "a" + ], + [ + "a", + "tra" + ], + [ + "se", + "nder" + ], + [ + "send", + "er" + ], + [ + "sen", + "der" + ], + [ + "s", + "ender" + ], + [ + "TIME", + "OUT" + ], + [ + "fu", + "rt" + ], + [ + "fur", + "t" + ], + [ + "f", + "urt" + ], + [ + "▁Gu", + "erra" + ], + [ + "{}", + ")" + ], + [ + "{", + "})" + ], + [ + "▁D", + "urch" + ], + [ + "▁Dur", + "ch" + ], + [ + "▁s", + "ki" + ], + [ + "▁sk", + "i" + ], + [ + "▁", + "ski" + ], + [ + "il", + "las" + ], + [ + "ill", + "as" + ], + [ + "illa", + "s" + ], + [ 
+ "▁S", + "of" + ], + [ + "▁So", + "f" + ], + [ + "▁Organ", + "ization" + ], + [ + "▁C", + "leveland" + ], + [ + "▁b", + "utt" + ], + [ + "▁but", + "t" + ], + [ + "▁bu", + "tt" + ], + [ + "▁sim", + "ilarly" + ], + [ + "▁similar", + "ly" + ], + [ + "▁assert", + "True" + ], + [ + "▁", + "assertTrue" + ], + [ + "▁inev", + "itable" + ], + [ + "ne", + "ll" + ], + [ + "nel", + "l" + ], + [ + "n", + "ell" + ], + [ + "▁R", + "af" + ], + [ + "▁Ra", + "f" + ], + [ + "DIS", + "ABLE" + ], + [ + "am", + "ine" + ], + [ + "ami", + "ne" + ], + [ + "amin", + "e" + ], + [ + "a", + "mine" + ], + [ + "▁Com", + "plete" + ], + [ + "▁Comp", + "lete" + ], + [ + "▁", + "Complete" + ], + [ + "▁be", + "iden" + ], + [ + "▁bei", + "den" + ], + [ + "▁Chall", + "enge" + ], + [ + "Rad", + "io" + ], + [ + "R", + "adio" + ], + [ + "▁Not", + "ice" + ], + [ + "He", + "x" + ], + [ + "H", + "ex" + ], + [ + "▁C", + "uba" + ], + [ + "▁Cub", + "a" + ], + [ + "▁Cu", + "ba" + ], + [ + "▁aug", + "ust" + ], + [ + "▁Philipp", + "ines" + ], + [ + "Mar", + "gin" + ], + [ + "M", + "argin" + ], + [ + "ja", + "l" + ], + [ + "j", + "al" + ], + [ + "gener", + "ator" + ], + [ + "▁t", + "atto" + ], + [ + "▁ta", + "tto" + ], + [ + "▁H", + "em" + ], + [ + "▁He", + "m" + ], + [ + "▁S", + "alt" + ], + [ + "▁Sal", + "t" + ], + [ + "▁Sa", + "lt" + ], + [ + "un", + "ately" + ], + [ + "unate", + "ly" + ], + [ + "▁terr", + "ain" + ], + [ + "▁terra", + "in" + ], + [ + ",\\", + "," + ], + [ + ",", + "\\," + ], + [ + "гра", + "д" + ], + [ + "▁c", + "rop" + ], + [ + "▁cr", + "op" + ], + [ + "▁cro", + "p" + ], + [ + "Name", + "d" + ], + [ + "Na", + "med" + ], + [ + "N", + "amed" + ], + [ + "▁W", + "onder" + ], + [ + "▁Wo", + "nder" + ], + [ + "▁Won", + "der" + ], + [ + "es", + "sen" + ], + [ + "ess", + "en" + ], + [ + "esse", + "n" + ], + [ + "▁f", + "ist" + ], + [ + "▁fi", + "st" + ], + [ + "▁fis", + "t" + ], + [ + "▁z", + "oom" + ], + [ + "▁zo", + "om" + ], + [ + "▁", + "zoom" + ], + [ + "пе", + "н" + ], + [ + "п", + "ен" + ], + [ + "▁ru", + "ling" + ], + [ + "▁rul", + "ing" + ], + [ + "un", + "likely" + ], + [ + "as", + "sy" + ], + [ + "ass", + "y" + ], + [ + "or", + "ent" + ], + [ + "ore", + "nt" + ], + [ + "oren", + "t" + ], + [ + "o", + "rent" + ], + [ + "▁g", + "ibt" + ], + [ + "▁gi", + "bt" + ], + [ + "▁A", + "w" + ], + [ + "sim", + "eq" + ], + [ + "s", + "imeq" + ], + [ + "▁r", + "aid" + ], + [ + "▁ra", + "id" + ], + [ + "▁", + "raid" + ], + [ + "▁Com", + "par" + ], + [ + "▁Comp", + "ar" + ], + [ + "▁", + "Compar" + ], + [ + "▁free", + "ly" + ], + [ + "▁fre", + "ely" + ], + [ + "▁esp", + "añ" + ], + [ + "▁espa", + "ñ" + ], + [ + "▁py", + "thon" + ], + [ + "▁", + "python" + ], + [ + "▁diagn", + "osis" + ], + [ + "▁ch", + "ips" + ], + [ + "▁chip", + "s" + ], + [ + "▁chi", + "ps" + ], + [ + "R", + "azor" + ], + [ + "▁V", + "ert" + ], + [ + "▁Ver", + "t" + ], + [ + "▁Ve", + "rt" + ], + [ + "▁", + "Vert" + ], + [ + "For", + "ward" + ], + [ + "▁P", + "é" + ], + [ + "▁compar", + "able" + ], + [ + "▁anal", + "ys" + ], + [ + "▁analy", + "s" + ], + [ + "St", + "d" + ], + [ + "S", + "td" + ], + [ + "▁Franç", + "ois" + ], + [ + "▁c", + "ó" + ], + [ + "jo", + "s" + ], + [ + "j", + "os" + ], + [ + "▁p", + "eg" + ], + [ + "▁pe", + "g" + ], + [ + "▁", + "peg" + ], + [ + "CON", + "ST" + ], + [ + "cl", + "usive" + ], + [ + "▁voy", + "age" + ], + [ + "▁Sch", + "l" + ], + [ + "▁Sc", + "hl" + ], + [ + "Group", + "Layout" + ], + [ + "oi", + "se" + ], + [ + "ois", + "e" + ], + [ + "o", + "ise" + ], + [ + "сс", + "е" + ], + [ + "с", + "се" + ], + [ + "▁cr", + "ush" + ], + 
[ + "▁cru", + "sh" + ], + [ + "▁Die", + "se" + ], + [ + "▁Di", + "ese" + ], + [ + "▁Dies", + "e" + ], + [ + "▁be", + "kan" + ], + [ + "▁bek", + "an" + ], + [ + "ci", + "t" + ], + [ + "c", + "it" + ], + [ + "▁Ein", + "wohner" + ], + [ + "▁L", + "an" + ], + [ + "▁La", + "n" + ], + [ + "▁dress", + "ing" + ], + [ + "▁s", + "olved" + ], + [ + "▁sol", + "ved" + ], + [ + "▁solve", + "d" + ], + [ + "М", + "а" + ], + [ + "▁C", + "hel" + ], + [ + "▁Ch", + "el" + ], + [ + "▁Che", + "l" + ], + [ + "par", + "ed" + ], + [ + "pa", + "red" + ], + [ + "pare", + "d" + ], + [ + "p", + "ared" + ], + [ + "▁se", + "aled" + ], + [ + "▁sea", + "led" + ], + [ + "▁seal", + "ed" + ], + [ + "})", + ")" + ], + [ + "}", + "))" + ], + [ + "anc", + "ouver" + ], + [ + "se", + "h" + ], + [ + "s", + "eh" + ], + [ + "ta", + "bles" + ], + [ + "table", + "s" + ], + [ + "tab", + "les" + ], + [ + "t", + "ables" + ], + [ + "▁red", + "dit" + ], + [ + "▁redd", + "it" + ], + [ + "▁", + "reddit" + ], + [ + "▁m", + "our" + ], + [ + "▁mo", + "ur" + ], + [ + "▁mou", + "r" + ], + [ + "▁clean", + "up" + ], + [ + "▁", + "cleanup" + ], + [ + "ov", + "ić" + ], + [ + "ovi", + "ć" + ], + [ + "▁Ur", + "ban" + ], + [ + "oc", + "t" + ], + [ + "o", + "ct" + ], + [ + "то", + "ра" + ], + [ + "тор", + "а" + ], + [ + "▁Le", + "gal" + ], + [ + "▁Leg", + "al" + ], + [ + "▁J", + "ur" + ], + [ + "▁Ju", + "r" + ], + [ + "▁N", + "as" + ], + [ + "▁Na", + "s" + ], + [ + "C", + "ity" + ], + [ + "▁un", + "fortunately" + ], + [ + "▁unfortunate", + "ly" + ], + [ + "▁P", + "ER" + ], + [ + "▁PE", + "R" + ], + [ + "▁", + "PER" + ], + [ + "ma", + "kers" + ], + [ + "make", + "rs" + ], + [ + "maker", + "s" + ], + [ + "m", + "akers" + ], + [ + "▁sig", + "lo" + ], + [ + "▁k", + "in" + ], + [ + "▁ki", + "n" + ], + [ + "▁", + "kin" + ], + [ + "co", + "des" + ], + [ + "code", + "s" + ], + [ + "cod", + "es" + ], + [ + "c", + "odes" + ], + [ + "ля", + "р" + ], + [ + "NI", + "NG" + ], + [ + "N", + "ING" + ], + [ + "▁C", + "ec" + ], + [ + "▁Ce", + "c" + ], + [ + "▁C", + "T" + ], + [ + "▁", + "CT" + ], + [ + "▁R", + "acing" + ], + [ + "▁Ra", + "cing" + ], + [ + "da", + "n" + ], + [ + "d", + "an" + ], + [ + "▁He", + "rz" + ], + [ + "▁Her", + "z" + ], + [ + "▁gen", + "ius" + ], + [ + "▁e", + "urop" + ], + [ + "▁eu", + "rop" + ], + [ + "serv", + "let" + ], + [ + "ow", + "ego" + ], + [ + "owe", + "go" + ], + [ + "▁Im", + "agine" + ], + [ + "▁Imp", + "erial" + ], + [ + "▁Imper", + "ial" + ], + [ + "Re", + "gex" + ], + [ + "Reg", + "ex" + ], + [ + "c", + "é" + ], + [ + "HE", + "D" + ], + [ + "H", + "ED" + ], + [ + "det", + "ect" + ], + [ + "з", + "ни" + ], + [ + "io", + "c" + ], + [ + "i", + "oc" + ], + [ + "Anal", + "ysis" + ], + [ + "Analy", + "sis" + ], + [ + "▁*", + "=" + ], + [ + "▁f", + "ever" + ], + [ + "▁fe", + "ver" + ], + [ + "▁Ob", + "viously" + ], + [ + "F", + "oot" + ], + [ + "Line", + "ar" + ], + [ + "Lin", + "ear" + ], + [ + "▁p", + "ró" + ], + [ + "▁pr", + "ó" + ], + [ + "▁satell", + "ite" + ], + [ + "▁B", + "eng" + ], + [ + "▁Be", + "ng" + ], + [ + "▁Ben", + "g" + ], + [ + "bound", + "s" + ], + [ + "b", + "ounds" + ], + [ + "▁J", + "azz" + ], + [ + "▁Ja", + "zz" + ], + [ + "▁C", + "urt" + ], + [ + "▁Cur", + "t" + ], + [ + "▁Cu", + "rt" + ], + [ + "▁поли", + "ти" + ], + [ + "▁b", + "ild" + ], + [ + "▁bi", + "ld" + ], + [ + "▁bil", + "d" + ], + [ + "▁", + "bild" + ], + [ + "▁\"", + "\");" + ], + [ + "▁\"\"", + ");" + ], + [ + "▁\"\")", + ";" + ], + [ + "▁document", + "ary" + ], + [ + "▁gr", + "asp" + ], + [ + "▁gra", + "sp" + ], + [ + "▁gras", + "p" + ], + [ + "▁d", + 
"la" + ], + [ + "▁dl", + "a" + ], + [ + "TR", + "A" + ], + [ + "T", + "RA" + ], + [ + "▁read", + "ily" + ], + [ + "To", + "r" + ], + [ + "T", + "or" + ], + [ + "C", + "ACHE" + ], + [ + "▁Const", + "ruction" + ], + [ + "▁Construct", + "ion" + ], + [ + "▁d", + "ía" + ], + [ + "да", + "т" + ], + [ + "д", + "ат" + ], + [ + "▁G", + "rey" + ], + [ + "▁Gr", + "ey" + ], + [ + "▁Gre", + "y" + ], + [ + "run", + "ner" + ], + [ + "le", + "ading" + ], + [ + "▁co", + "oked" + ], + [ + "▁cook", + "ed" + ], + [ + "ro", + "log" + ], + [ + "rol", + "og" + ], + [ + "r", + "olog" + ], + [ + "▁annoy", + "ing" + ], + [ + "DE", + "LETE" + ], + [ + "amer", + "ican" + ], + [ + "▁Niger", + "ia" + ], + [ + "▁d", + "ai" + ], + [ + "▁da", + "i" + ], + [ + "▁", + "dai" + ], + [ + "▁sac", + "rific" + ], + [ + "▁serv", + "ant" + ], + [ + "▁s", + "kb" + ], + [ + "▁sk", + "b" + ], + [ + "▁", + "skb" + ], + [ + "▁b", + "arg" + ], + [ + "▁bar", + "g" + ], + [ + "▁ba", + "rg" + ], + [ + "pix", + "el" + ], + [ + "p", + "ixel" + ], + [ + "In", + "ject" + ], + [ + "ca", + "ched" + ], + [ + "cache", + "d" + ], + [ + "c", + "ached" + ], + [ + "▁cou", + "pled" + ], + [ + "▁couple", + "d" + ], + [ + "▁coup", + "led" + ], + [ + "un", + "gle" + ], + [ + "ung", + "le" + ], + [ + "pro", + "b" + ], + [ + "pr", + "ob" + ], + [ + "p", + "rob" + ], + [ + ">{", + "@" + ], + [ + "ла", + "го" + ], + [ + "default", + "s" + ], + [ + "▁por", + "trait" + ], + [ + "▁port", + "rait" + ], + [ + "▁d", + "ental" + ], + [ + "▁den", + "tal" + ], + [ + "▁dent", + "al" + ], + [ + "▁d", + "estro" + ], + [ + "▁dest", + "ro" + ], + [ + "▁r", + "ue" + ], + [ + "▁ru", + "e" + ], + [ + "▁hy", + "brid" + ], + [ + "▁", + "й" + ], + [ + "▁CO", + "MP" + ], + [ + "▁COM", + "P" + ], + [ + "▁", + "COMP" + ], + [ + "▁B", + "ent" + ], + [ + "▁Be", + "nt" + ], + [ + "▁Ben", + "t" + ], + [ + "Com", + "pare" + ], + [ + "Comp", + "are" + ], + [ + "Compar", + "e" + ], + [ + "bo", + "th" + ], + [ + "bot", + "h" + ], + [ + "b", + "oth" + ], + [ + "kl", + "ahoma" + ], + [ + "ais", + "er" + ], + [ + "ai", + "ser" + ], + [ + "aise", + "r" + ], + [ + "a", + "iser" + ], + [ + "Su", + "re" + ], + [ + "Sur", + "e" + ], + [ + "S", + "ure" + ], + [ + "▁s", + "olving" + ], + [ + "▁sol", + "ving" + ], + [ + "▁l", + "ista" + ], + [ + "▁li", + "sta" + ], + [ + "▁list", + "a" + ], + [ + "▁", + "lista" + ], + [ + "▁у", + "чи" + ], + [ + "▁Ev", + "ans" + ], + [ + "▁Eva", + "ns" + ], + [ + "▁f", + "usion" + ], + [ + "▁fus", + "ion" + ], + [ + "▁compl", + "aint" + ], + [ + "▁complain", + "t" + ], + [ + "H", + "P" + ], + [ + "He", + "ap" + ], + [ + "al", + "ways" + ], + [ + "M", + "gr" + ], + [ + "▁appro", + "x" + ], + [ + "▁", + "approx" + ], + [ + "display", + "style" + ], + [ + "lo", + "rd" + ], + [ + "lor", + "d" + ], + [ + "l", + "ord" + ], + [ + "in", + "sn" + ], + [ + "ins", + "n" + ], + [ + "▁Fe", + "ature" + ], + [ + "▁", + "Feature" + ], + [ + "RP", + "C" + ], + [ + "R", + "PC" + ], + [ + "▁v", + "et" + ], + [ + "▁ve", + "t" + ], + [ + "▁", + "vet" + ], + [ + "К", + "а" + ], + [ + "▁kil", + "omet" + ], + [ + "▁kilom", + "et" + ], + [ + "▁deliver", + "ing" + ], + [ + "▁const", + "itution" + ], + [ + "sh", + "ine" + ], + [ + "ле", + "к" + ], + [ + "▁го", + "род" + ], + [ + "▁горо", + "д" + ], + [ + "▁prob", + "able" + ], + [ + "▁run", + "ner" + ], + [ + "▁", + "runner" + ], + [ + "hr", + "en" + ], + [ + "hre", + "n" + ], + [ + "h", + "ren" + ], + [ + "▁N", + "ep" + ], + [ + "▁Ne", + "p" + ], + [ + "▁over", + "night" + ], + [ + "pr", + "ead" + ], + [ + "pre", + "ad" + ], + [ + "p", + 
"read" + ], + [ + "л", + "та" + ], + [ + "фор", + "ма" + ], + [ + "CL", + "O" + ], + [ + "C", + "LO" + ], + [ + "ie", + "sa" + ], + [ + "ies", + "a" + ], + [ + "i", + "esa" + ], + [ + "▁object", + "ives" + ], + [ + "▁objective", + "s" + ], + [ + "con", + "tract" + ], + [ + "cont", + "ract" + ], + [ + "contr", + "act" + ], + [ + "EX", + "P" + ], + [ + "▁col", + "ours" + ], + [ + "▁colour", + "s" + ], + [ + "xi", + "co" + ], + [ + "xic", + "o" + ], + [ + "x", + "ico" + ], + [ + "C", + "lean" + ], + [ + "▁light", + "ly" + ], + [ + "▁scen", + "arios" + ], + [ + "▁scenario", + "s" + ], + [ + "▁qu", + "arters" + ], + [ + "▁quarter", + "s" + ], + [ + "▁quart", + "ers" + ], + [ + "▁quar", + "ters" + ], + [ + "▁", + "quarters" + ], + [ + "▁D", + "ear" + ], + [ + "▁De", + "ar" + ], + [ + "▁l", + "uc" + ], + [ + "▁lu", + "c" + ], + [ + "▁app", + "et" + ], + [ + "▁ap", + "pet" + ], + [ + "▁appe", + "t" + ], + [ + "▁de", + "port" + ], + [ + "▁dep", + "ort" + ], + [ + "Sa", + "fe" + ], + [ + "▁me", + "nos" + ], + [ + "▁men", + "os" + ], + [ + "▁Paul", + "o" + ], + [ + "▁Pa", + "ulo" + ], + [ + "CI", + "AL" + ], + [ + "C", + "IAL" + ], + [ + "ці", + "в" + ], + [ + "ц", + "ів" + ], + [ + "▁R", + "oc" + ], + [ + "▁Ro", + "c" + ], + [ + "▁c", + "aring" + ], + [ + "▁car", + "ing" + ], + [ + "▁ca", + "ring" + ], + [ + "▁elect", + "ro" + ], + [ + "▁de", + "cember" + ], + [ + "▁dec", + "ember" + ], + [ + "▁dece", + "mber" + ], + [ + "▁Phil", + "osoph" + ], + [ + "▁col", + "ored" + ], + [ + "▁color", + "ed" + ], + [ + "▁", + "colored" + ], + [ + "it", + "sch" + ], + [ + "its", + "ch" + ], + [ + "ropol", + "itan" + ], + [ + "os", + "ti" + ], + [ + "ost", + "i" + ], + [ + "▁N", + "ut" + ], + [ + "▁Nu", + "t" + ], + [ + "▁consecut", + "ive" + ], + [ + "Pe", + "er" + ], + [ + "ar", + "ness" + ], + [ + "arn", + "ess" + ], + [ + "▁ż", + "e" + ], + [ + "▁", + "że" + ], + [ + "▁A", + "round" + ], + [ + "▁Ar", + "ound" + ], + [ + "af", + "ka" + ], + [ + "▁d", + "io" + ], + [ + "▁di", + "o" + ], + [ + "ci", + "p" + ], + [ + "c", + "ip" + ], + [ + "▁to", + "ys" + ], + [ + "▁toy", + "s" + ], + [ + "cr", + "o" + ], + [ + "c", + "ro" + ], + [ + "▁m", + "iser" + ], + [ + "▁mis", + "er" + ], + [ + "▁mi", + "ser" + ], + [ + "▁mise", + "r" + ], + [ + "check", + "box" + ], + [ + "▁F", + "isher" + ], + [ + "▁Fish", + "er" + ], + [ + "▁gover", + "ned" + ], + [ + "▁govern", + "ed" + ], + [ + "▁h", + "á" + ], + [ + "▁En", + "able" + ], + [ + "▁", + "Enable" + ], + [ + "▁t", + "rivial" + ], + [ + "▁occup", + "ation" + ], + [ + "ro", + "rs" + ], + [ + "ror", + "s" + ], + [ + "r", + "ors" + ], + [ + "▁l", + "av" + ], + [ + "▁la", + "v" + ], + [ + "▁", + "lav" + ], + [ + "▁m", + "ou" + ], + [ + "▁mo", + "u" + ], + [ + "▁b", + "ord" + ], + [ + "▁bo", + "rd" + ], + [ + "▁bor", + "d" + ], + [ + "ли", + "ч" + ], + [ + "Ro", + "om" + ], + [ + "R", + "oom" + ], + [ + "')", + "\r" + ], + [ + "'", + ")\r" + ], + [ + "▁art", + "ic" + ], + [ + "▁m", + "ientras" + ], + [ + "ch", + "air" + ], + [ + "cha", + "ir" + ], + [ + "uation", + "s" + ], + [ + "u", + "ations" + ], + [ + "▁comm", + "ented" + ], + [ + "▁comment", + "ed" + ], + [ + "▁trigger", + "ed" + ], + [ + "Can", + "not" + ], + [ + "C", + "annot" + ], + [ + "▁Marc", + "us" + ], + [ + "▁p", + "unct" + ], + [ + "▁pun", + "ct" + ], + [ + "▁achie", + "vement" + ], + [ + "▁achieve", + "ment" + ], + [ + "е", + "ди" + ], + [ + "ext", + "ensions" + ], + [ + "extension", + "s" + ], + [ + "ad", + "ers" + ], + [ + "ade", + "rs" + ], + [ + "ader", + "s" + ], + [ + "a", + "ders" + ], + [ + "jo", + "urs" + 
], + [ + "jour", + "s" + ], + [ + "j", + "ours" + ], + [ + "ir", + "lines" + ], + [ + "irl", + "ines" + ], + [ + "▁со", + "стоя" + ], + [ + "V", + "IEW" + ], + [ + "▁Nap", + "ole" + ], + [ + "Conf", + "irm" + ], + [ + "▁por", + "que" + ], + [ + "........", + "........" + ], + [ + "▁LI", + "ABILITY" + ], + [ + "Wall", + "et" + ], + [ + "W", + "allet" + ], + [ + "Sub", + "ject" + ], + [ + "al", + "gorithm" + ], + [ + "▁tr", + "iple" + ], + [ + "▁tri", + "ple" + ], + [ + "▁trip", + "le" + ], + [ + "ru", + "b" + ], + [ + "r", + "ub" + ], + [ + "▁se", + "cur" + ], + [ + "▁sec", + "ur" + ], + [ + "▁hand", + "some" + ], + [ + "▁hands", + "ome" + ], + [ + "▁d", + "od" + ], + [ + "▁do", + "d" + ], + [ + "r", + "ès" + ], + [ + "ac", + "ja" + ], + [ + "ch", + "od" + ], + [ + "cho", + "d" + ], + [ + "н", + "ва" + ], + [ + "es", + "ar" + ], + [ + "esa", + "r" + ], + [ + "an", + "chor" + ], + [ + "anc", + "hor" + ], + [ + "anch", + "or" + ], + [ + "▁Soph", + "ie" + ], + [ + "▁Украї", + "ни" + ], + [ + "Up", + "per" + ], + [ + "am", + "ous" + ], + [ + "amo", + "us" + ], + [ + "Fe", + "atures" + ], + [ + "Feature", + "s" + ], + [ + "▁б", + "ли" + ], + [ + "▁", + "бли" + ], + [ + "Supp", + "ress" + ], + [ + "Sup", + "press" + ], + [ + "▁kil", + "om" + ], + [ + "▁Z", + "u" + ], + [ + "▁belong", + "ed" + ], + [ + "▁Red", + "dit" + ], + [ + "▁pro", + "ces" + ], + [ + "▁proc", + "es" + ], + [ + "▁с", + "тар" + ], + [ + "▁ста", + "р" + ], + [ + "▁ст", + "ар" + ], + [ + "▁F", + "est" + ], + [ + "▁Fe", + "st" + ], + [ + "/", + "%" + ], + [ + "▁P", + "am" + ], + [ + "▁Pa", + "m" + ], + [ + "st", + "orm" + ], + [ + "sto", + "rm" + ], + [ + "W", + "W" + ], + [ + "P", + "aul" + ], + [ + "▁t", + "ales" + ], + [ + "▁tal", + "es" + ], + [ + "▁ta", + "les" + ], + [ + "▁tale", + "s" + ], + [ + "▁рай", + "она" + ], + [ + "▁райо", + "на" + ], + [ + "▁район", + "а" + ], + [ + "▁spread", + "ing" + ], + [ + "▁s", + "ched" + ], + [ + "▁sc", + "hed" + ], + [ + "▁sch", + "ed" + ], + [ + "▁sche", + "d" + ], + [ + "▁", + "sched" + ], + [ + "le", + "ased" + ], + [ + "lease", + "d" + ], + [ + "Non", + "Null" + ], + [ + "▁High", + "way" + ], + [ + "▁Re", + "serve" + ], + [ + "▁Res", + "erve" + ], + [ + "▁c", + "ater" + ], + [ + "▁cat", + "er" + ], + [ + "▁ca", + "ter" + ], + [ + "▁t", + "ire" + ], + [ + "▁ti", + "re" + ], + [ + "▁tir", + "e" + ], + [ + "▁por", + "ch" + ], + [ + "qu", + "ier" + ], + [ + "US", + "A" + ], + [ + "U", + "SA" + ], + [ + "▁Sw", + "iss" + ], + [ + "▁", + "È" + ], + [ + "▁br", + "ave" + ], + [ + "▁bra", + "ve" + ], + [ + "▁explos", + "ion" + ], + [ + "l", + "r" + ], + [ + "▁class", + "ified" + ], + [ + "Ab", + "out" + ], + [ + "▁P", + "ict" + ], + [ + "▁Pic", + "t" + ], + [ + "▁Pi", + "ct" + ], + [ + "▁Dub", + "lin" + ], + [ + "▁separ", + "ately" + ], + [ + "▁separate", + "ly" + ], + [ + "▁bank", + "ing" + ], + [ + "▁ban", + "king" + ], + [ + "▁Christian", + "ity" + ], + [ + "mi", + "gr" + ], + [ + "m", + "igr" + ], + [ + "Ro", + "b" + ], + [ + "R", + "ob" + ], + [ + "се", + "р" + ], + [ + "с", + "ер" + ], + [ + "▁el", + "f" + ], + [ + "▁", + "elf" + ], + [ + "▁employ", + "ers" + ], + [ + "▁employer", + "s" + ], + [ + "▁S", + "low" + ], + [ + "▁Sl", + "ow" + ], + [ + "▁j", + "uli" + ], + [ + "▁ju", + "li" + ], + [ + "▁jul", + "i" + ], + [ + "west", + "ern" + ], + [ + "w", + "estern" + ], + [ + "▁anal", + "yst" + ], + [ + "▁analy", + "st" + ], + [ + "▁analys", + "t" + ], + [ + "ob", + "serv" + ], + [ + "obs", + "erv" + ], + [ + "▁N", + "ice" + ], + [ + "▁Nic", + "e" + ], + [ + "▁Ni", + "ce" + ], + [ + "▁G", + 
"C" + ], + [ + "▁", + "GC" + ], + [ + "▁Let", + "ter" + ], + [ + "▁ha", + "rass" + ], + [ + "▁har", + "ass" + ], + [ + "User", + "name" + ], + [ + "▁A", + "unt" + ], + [ + "▁Au", + "nt" + ], + [ + "▁с", + "ент" + ], + [ + "Su", + "p" + ], + [ + "S", + "up" + ], + [ + "IC", + "ES" + ], + [ + "ICE", + "S" + ], + [ + "RE", + "NT" + ], + [ + "R", + "ENT" + ], + [ + "rat", + "io" + ], + [ + "r", + "atio" + ], + [ + "▁Мо", + "ск" + ], + [ + "▁an", + "gles" + ], + [ + "▁ang", + "les" + ], + [ + "▁angle", + "s" + ], + [ + "▁angl", + "es" + ], + [ + "▁", + "angles" + ], + [ + "▁l", + "lev" + ], + [ + "▁ll", + "ev" + ], + [ + "_", + "*" + ], + [ + "▁n", + "it" + ], + [ + "▁ni", + "t" + ], + [ + "▁", + "nit" + ], + [ + "▁w", + "reck" + ], + [ + "▁pat", + "rol" + ], + [ + "▁loyal", + "ty" + ], + [ + "▁n", + "ationale" + ], + [ + "▁nat", + "ionale" + ], + [ + "▁national", + "e" + ], + [ + "▁nation", + "ale" + ], + [ + "go", + "m" + ], + [ + "g", + "om" + ], + [ + "}$", + "-" + ], + [ + "}", + "$-" + ], + [ + "▁dis", + "pute" + ], + [ + "▁disput", + "e" + ], + [ + "▁disp", + "ute" + ], + [ + "▁r", + "us" + ], + [ + "▁ru", + "s" + ], + [ + "▁", + "rus" + ], + [ + "▁П", + "рез" + ], + [ + "▁Пре", + "з" + ], + [ + "▁Indust", + "rial" + ], + [ + "▁dem", + "ocratic" + ], + [ + "▁democr", + "atic" + ], + [ + "b", + "w" + ], + [ + "li", + "mp" + ], + [ + "lim", + "p" + ], + [ + "l", + "imp" + ], + [ + "ur", + "bed" + ], + [ + "urb", + "ed" + ], + [ + "▁mie", + "jsce" + ], + [ + "▁miejsc", + "e" + ], + [ + "ру", + "д" + ], + [ + "▁t", + "ex" + ], + [ + "▁te", + "x" + ], + [ + "▁", + "tex" + ], + [ + "▁develop", + "ments" + ], + [ + "▁development", + "s" + ], + [ + "▁B", + "right" + ], + [ + "▁Br", + "ight" + ], + [ + "▁Brig", + "ht" + ], + [ + "▁var", + "ying" + ], + [ + "▁va", + "rying" + ], + [ + "▁vary", + "ing" + ], + [ + "fa", + "ct" + ], + [ + "fac", + "t" + ], + [ + "f", + "act" + ], + [ + "▁Port", + "al" + ], + [ + "▁Por", + "tal" + ], + [ + "as", + "is" + ], + [ + "asi", + "s" + ], + [ + "a", + "sis" + ], + [ + "▁горо", + "да" + ], + [ + "▁город", + "а" + ], + [ + "▁cre", + "ativity" + ], + [ + "▁creat", + "ivity" + ], + [ + "))", + "))" + ], + [ + ")))", + ")" + ], + [ + ")", + ")))" + ], + [ + ".\"", + ";" + ], + [ + ".", + "\";" + ], + [ + "ie", + "ux" + ], + [ + "ieu", + "x" + ], + [ + "▁prov", + "isions" + ], + [ + "▁provision", + "s" + ], + [ + "uv", + "e" + ], + [ + "u", + "ve" + ], + [ + "La", + "ng" + ], + [ + "L", + "ang" + ], + [ + "miss", + "ing" + ], + [ + "ра", + "т" + ], + [ + "р", + "ат" + ], + [ + "ph", + "ony" + ], + [ + "▁out", + "line" + ], + [ + "pa", + "s" + ], + [ + "p", + "as" + ], + [ + "el", + "m" + ], + [ + "e", + "lm" + ], + [ + "mon", + "itor" + ], + [ + "TC", + "P" + ], + [ + "T", + "CP" + ], + [ + "ka", + "t" + ], + [ + "k", + "at" + ], + [ + "uc", + "ed" + ], + [ + "uce", + "d" + ], + [ + "u", + "ced" + ], + [ + "\\\"", + "," + ], + [ + "\\", + "\"," + ], + [ + "yn", + "a" + ], + [ + "y", + "na" + ], + [ + "ра", + "бо" + ], + [ + "раб", + "о" + ], + [ + "oc", + "ate" + ], + [ + "oca", + "te" + ], + [ + "▁c", + "ares" + ], + [ + "▁car", + "es" + ], + [ + "▁care", + "s" + ], + [ + "▁ca", + "res" + ], + [ + "▁f", + "ins" + ], + [ + "▁fin", + "s" + ], + [ + "▁fi", + "ns" + ], + [ + "▁he", + "ap" + ], + [ + "▁", + "heap" + ], + [ + "▁small", + "est" + ], + [ + "äch", + "st" + ], + [ + "▁I", + "X" + ], + [ + "▁", + "IX" + ], + [ + "re", + "cv" + ], + [ + "rec", + "v" + ], + [ + "key", + "word" + ], + [ + "▁at", + "tra" + ], + [ + "▁att", + "ra" + ], + [ + "▁attr", + "a" + ], + 
[ + "▁sel", + "bst" + ], + [ + "Un", + "expected" + ], + [ + "Une", + "xpected" + ], + [ + "Sm", + "all" + ], + [ + "▁насе", + "ље" + ], + [ + "▁H", + "us" + ], + [ + "▁Hu", + "s" + ], + [ + "Enc", + "oder" + ], + [ + "Encode", + "r" + ], + [ + "▁un", + "set" + ], + [ + "▁uns", + "et" + ], + [ + "▁home", + "less" + ], + [ + "▁hom", + "eless" + ], + [ + "▁Johann", + "es" + ], + [ + "▁U", + "RI" + ], + [ + "▁", + "URI" + ], + [ + "ant", + "age" + ], + [ + "anta", + "ge" + ], + [ + "▁in", + "hib" + ], + [ + "▁appreci", + "ated" + ], + [ + "▁appreciate", + "d" + ], + [ + "ie", + "lte" + ], + [ + "iel", + "te" + ], + [ + "ielt", + "e" + ], + [ + "i", + "elte" + ], + [ + "▁st", + "ays" + ], + [ + "▁stay", + "s" + ], + [ + "▁sta", + "ys" + ], + [ + "▁alle", + "ged" + ], + [ + "▁alleg", + "ed" + ], + [ + "▁c", + "oding" + ], + [ + "▁co", + "ding" + ], + [ + "▁cod", + "ing" + ], + [ + "▁tv", + "å" + ], + [ + "pipe", + "line" + ], + [ + "p", + "ipeline" + ], + [ + "▁W", + "or" + ], + [ + "▁Wo", + "r" + ], + [ + "File", + "Path" + ], + [ + "▁accept", + "ing" + ], + [ + "▁Ex", + "cell" + ], + [ + "▁L", + "uther" + ], + [ + "▁Lu", + "ther" + ], + [ + "▁Friend", + "s" + ], + [ + "▁c", + "urt" + ], + [ + "▁cur", + "t" + ], + [ + "▁cu", + "rt" + ], + [ + "▁'", + "$" + ], + [ + "▁", + "'$" + ], + [ + "▁tight", + "ly" + ], + [ + "▁cz", + "ę" + ], + [ + "▁un", + "necessary" + ], + [ + "▁F", + "ed" + ], + [ + "▁Fe", + "d" + ], + [ + "▁А", + "нд" + ], + [ + "▁Ан", + "д" + ], + [ + "▁H", + "P" + ], + [ + "▁", + "HP" + ], + [ + "▁String", + "Builder" + ], + [ + "en", + "burg" + ], + [ + "'", + "(" + ], + [ + "vm", + "a" + ], + [ + "v", + "ma" + ], + [ + "▁Ab", + "raham" + ], + [ + "W", + "L" + ], + [ + "▁Re", + "ference" + ], + [ + "▁Refer", + "ence" + ], + [ + "▁", + "Reference" + ], + [ + "J", + "o" + ], + [ + "Bl", + "ob" + ], + [ + "Blo", + "b" + ], + [ + "▁H", + "ugh" + ], + [ + "▁Hug", + "h" + ], + [ + "▁Hu", + "gh" + ], + [ + "▁Bul", + "gar" + ], + [ + "MESS", + "AGE" + ], + [ + "з", + "во" + ], + [ + "▁avoid", + "ed" + ], + [ + "▁po", + "ems" + ], + [ + "▁poem", + "s" + ], + [ + "▁с", + "ы" + ], + [ + "▁", + "сы" + ], + [ + "▁O", + "pp" + ], + [ + "▁Op", + "p" + ], + [ + "av", + "irus" + ], + [ + "avi", + "rus" + ], + [ + "Pre", + "view" + ], + [ + "Prev", + "iew" + ], + [ + "P", + "review" + ], + [ + "▁k", + "er" + ], + [ + "▁ke", + "r" + ], + [ + "▁", + "ker" + ], + [ + "ue", + "va" + ], + [ + "u", + "eva" + ], + [ + "fl", + "ix" + ], + [ + "▁char", + "ging" + ], + [ + "▁charg", + "ing" + ], + [ + "▁motiv", + "ated" + ], + [ + "▁O", + "rd" + ], + [ + "▁Or", + "d" + ], + [ + "▁", + "Ord" + ], + [ + "▁av", + "eva" + ], + [ + "▁ave", + "va" + ], + [ + "x", + "l" + ], + [ + "▁flex", + "ibility" + ], + [ + "ag", + "na" + ], + [ + "agn", + "a" + ], + [ + "▁rac", + "ism" + ], + [ + "d", + "h" + ], + [ + "▁b", + "aking" + ], + [ + "▁ba", + "king" + ], + [ + "F", + "riend" + ], + [ + "ble", + "r" + ], + [ + "bl", + "er" + ], + [ + "b", + "ler" + ], + [ + "▁Log", + "ger" + ], + [ + "▁", + "Logger" + ], + [ + "Te", + "n" + ], + [ + "T", + "en" + ], + [ + "nav", + "igation" + ], + [ + "▁att", + "achment" + ], + [ + "▁attach", + "ment" + ], + [ + "▁", + "attachment" + ], + [ + "▁b", + "ajo" + ], + [ + "▁ba", + "jo" + ], + [ + "▁pr", + "icing" + ], + [ + "▁pri", + "cing" + ], + [ + "▁T", + "ip" + ], + [ + "▁Ti", + "p" + ], + [ + "▁", + "Tip" + ], + [ + "da", + "r" + ], + [ + "d", + "ar" + ], + [ + "G", + "G" + ], + [ + "To", + "ols" + ], + [ + "Tool", + "s" + ], + [ + "Too", + "ls" + ], + [ + "T", + "ools" + ], + [ 
+ "vol", + "ution" + ], + [ + "v", + "olution" + ], + [ + "am", + "as" + ], + [ + "ama", + "s" + ], + [ + "a", + "mas" + ], + [ + "▁b", + "ibli" + ], + [ + "▁adapt", + "ed" + ], + [ + "ox", + "y" + ], + [ + "o", + "xy" + ], + [ + "▁F", + "reedom" + ], + [ + "▁Free", + "dom" + ], + [ + "ri", + "co" + ], + [ + "ric", + "o" + ], + [ + "r", + "ico" + ], + [ + "▁coll", + "apsed" + ], + [ + "▁collapse", + "d" + ], + [ + "z", + "m" + ], + [ + "pl", + "o" + ], + [ + "p", + "lo" + ], + [ + "▁c", + "ô" + ], + [ + "▁r", + "t" + ], + [ + "▁", + "rt" + ], + [ + "än", + "ger" + ], + [ + "äng", + "er" + ], + [ + "änge", + "r" + ], + [ + "▁D", + "R" + ], + [ + "▁", + "DR" + ], + [ + "▁Bit", + "coin" + ], + [ + "go", + "w" + ], + [ + "g", + "ow" + ], + [ + "▁ch", + "ez" + ], + [ + "▁che", + "z" + ], + [ + "▁", + "chez" + ], + [ + "▁ot", + "ro" + ], + [ + "▁te", + "il" + ], + [ + "▁", + "teil" + ], + [ + "ла", + "га" + ], + [ + "▁St", + "ars" + ], + [ + "▁Star", + "s" + ], + [ + "▁Sta", + "rs" + ], + [ + "▁invest", + "ing" + ], + [ + "▁a", + "board" + ], + [ + "▁ab", + "oard" + ], + [ + "▁f", + "lights" + ], + [ + "▁fl", + "ights" + ], + [ + "▁flight", + "s" + ], + [ + "▁genu", + "inely" + ], + [ + "▁genuine", + "ly" + ], + [ + "▁prom", + "ising" + ], + [ + "Rot", + "ation" + ], + [ + "O", + "cc" + ], + [ + "▁su", + "oi" + ], + [ + "▁suo", + "i" + ], + [ + "string", + "ify" + ], + [ + "ac", + "ies" + ], + [ + "aci", + "es" + ], + [ + "a", + "cies" + ], + [ + "▁G", + "round" + ], + [ + "▁Gr", + "ound" + ], + [ + "▁Gro", + "und" + ], + [ + "▁sequ", + "ences" + ], + [ + "▁sequence", + "s" + ], + [ + "▁c", + "ure" + ], + [ + "▁cur", + "e" + ], + [ + "▁cu", + "re" + ], + [ + "out", + "ine" + ], + [ + "▁!", + "!" + ], + [ + "▁", + "!!" + ], + [ + "▁G", + "ay" + ], + [ + "▁Ga", + "y" + ], + [ + "▁garden", + "s" + ], + [ + "▁gard", + "ens" + ], + [ + "▁G", + "las" + ], + [ + "▁Gl", + "as" + ], + [ + "▁Tai", + "wan" + ], + [ + "reg", + "istry" + ], + [ + "▁#", + "{" + ], + [ + "▁", + "#{" + ], + [ + "▁ins", + "pection" + ], + [ + "▁insp", + "ection" + ], + [ + "▁inspect", + "ion" + ], + [ + "Te", + "ll" + ], + [ + "T", + "ell" + ], + [ + "▁`", + "${" + ], + [ + "p", + "matrix" + ], + [ + "▁reg", + "ulation" + ], + [ + "▁regul", + "ation" + ], + [ + "fin", + "ish" + ], + [ + "▁Ed", + "ge" + ], + [ + "▁", + "Edge" + ], + [ + "Sp", + "rite" + ], + [ + "S", + "prite" + ], + [ + "▁Conf", + "eder" + ], + [ + "▁immigr", + "ants" + ], + [ + "▁elder", + "ly" + ], + [ + "um", + "ed" + ], + [ + "ume", + "d" + ], + [ + "u", + "med" + ], + [ + "▁Quest", + "ion" + ], + [ + "▁", + "Question" + ], + [ + "Gate", + "way" + ], + [ + "fo", + "ny" + ], + [ + "fon", + "y" + ], + [ + "f", + "ony" + ], + [ + "ît", + "re" + ], + [ + "î", + "tre" + ], + [ + "▁co", + "sm" + ], + [ + "▁cos", + "m" + ], + [ + "Ro", + "und" + ], + [ + "R", + "ound" + ], + [ + "▁ign", + "oring" + ], + [ + "▁ignor", + "ing" + ], + [ + "▁K", + "i" + ], + [ + "▁sens", + "itivity" + ], + [ + "âte", + "au" + ], + [ + "ât", + "eau" + ], + [ + "▁engine", + "ers" + ], + [ + "▁engineer", + "s" + ], + [ + "▁cor", + "rel" + ], + [ + "▁corre", + "l" + ], + [ + "ir", + "teen" + ], + [ + "irt", + "een" + ], + [ + "▁Sw", + "itzerland" + ], + [ + "▁inher", + "it" + ], + [ + "▁", + "inherit" + ], + [ + "wo", + "r" + ], + [ + "w", + "or" + ], + [ + "▁mid", + "night" + ], + [ + "▁P", + "un" + ], + [ + "▁Pu", + "n" + ], + [ + "ak", + "te" + ], + [ + "akt", + "e" + ], + [ + "a", + "kte" + ], + [ + "Dis", + "able" + ], + [ + "▁es", + "per" + ], + [ + "▁esp", + "er" + ], + [ + "▁not", 
+ "ation" + ], + [ + "▁", + "notation" + ], + [ + "▁Univers", + "idad" + ], + [ + "so", + "l" + ], + [ + "s", + "ol" + ], + [ + "de", + "rn" + ], + [ + "der", + "n" + ], + [ + "d", + "ern" + ], + [ + "in", + "ge" + ], + [ + "ing", + "e" + ], + [ + "▁inv", + "itation" + ], + [ + ")}", + "}" + ], + [ + ")", + "}}" + ], + [ + "▁", + "â" + ], + [ + "▁ess", + "ays" + ], + [ + "▁essay", + "s" + ], + [ + "ar", + "med" + ], + [ + "arm", + "ed" + ], + [ + "ch", + "sel" + ], + [ + "chs", + "el" + ], + [ + "▁не", + "го" + ], + [ + "▁", + "него" + ], + [ + "▁confirm", + "ation" + ], + [ + "un", + "ity" + ], + [ + "unit", + "y" + ], + [ + "uni", + "ty" + ], + [ + "▁Br", + "other" + ], + [ + "▁Bro", + "ther" + ], + [ + "▁", + "Є" + ], + [ + "ni", + "ce" + ], + [ + "nic", + "e" + ], + [ + "n", + "ice" + ], + [ + "▁S", + "ue" + ], + [ + "▁Su", + "e" + ], + [ + "▁t", + "ray" + ], + [ + "▁tr", + "ay" + ], + [ + "▁tra", + "y" + ], + [ + "ро", + "и" + ], + [ + "C", + "ookie" + ], + [ + "▁Feder", + "ation" + ], + [ + "IC", + "T" + ], + [ + "I", + "CT" + ], + [ + "▁p", + "éri" + ], + [ + "stud", + "ent" + ], + [ + "▁V", + "ent" + ], + [ + "▁Ven", + "t" + ], + [ + "▁Ve", + "nt" + ], + [ + "K", + "K" + ], + [ + "ST", + "EM" + ], + [ + "aw", + "k" + ], + [ + "▁re", + "un" + ], + [ + "▁pe", + "oples" + ], + [ + "▁people", + "s" + ], + [ + "io", + "res" + ], + [ + "ior", + "es" + ], + [ + "iore", + "s" + ], + [ + "i", + "ores" + ], + [ + "ou", + "bt" + ], + [ + "▁St", + "age" + ], + [ + "▁Sta", + "ge" + ], + [ + "▁", + "Stage" + ], + [ + "▁c", + "harm" + ], + [ + "▁ch", + "arm" + ], + [ + "▁char", + "m" + ], + [ + "▁cha", + "rm" + ], + [ + "ie", + "ur" + ], + [ + "ieu", + "r" + ], + [ + "i", + "eur" + ], + [ + "▁util", + "ize" + ], + [ + "▁utiliz", + "e" + ], + [ + "▁d", + "istribute" + ], + [ + "▁dist", + "ribute" + ], + [ + "▁distribut", + "e" + ], + [ + "▁g", + "otta" + ], + [ + "▁go", + "tta" + ], + [ + "▁got", + "ta" + ], + [ + "▁block", + "ing" + ], + [ + "H", + "ot" + ], + [ + "br", + "ew" + ], + [ + "bre", + "w" + ], + [ + "b", + "rew" + ], + [ + "▁b", + "onds" + ], + [ + "▁bon", + "ds" + ], + [ + "▁bond", + "s" + ], + [ + "le", + "af" + ], + [ + "Pro", + "te" + ], + [ + "Pr", + "ote" + ], + [ + "P", + "rote" + ], + [ + "▁d", + "ice" + ], + [ + "▁di", + "ce" + ], + [ + "▁dic", + "e" + ], + [ + "▁Nor", + "man" + ], + [ + "▁Norm", + "an" + ], + [ + "▁о", + "кт" + ], + [ + "▁ок", + "т" + ], + [ + "▁in", + "spir" + ], + [ + "▁insp", + "ir" + ], + [ + "Pr", + "iv" + ], + [ + "P", + "riv" + ], + [ + "▁P", + "uerto" + ], + [ + "▁то", + "ва" + ], + [ + "RS", + "T" + ], + [ + "R", + "ST" + ], + [ + "▁s", + "f" + ], + [ + "▁", + "sf" + ], + [ + "▁qu", + "ale" + ], + [ + "▁qual", + "e" + ], + [ + "ni", + "ck" + ], + [ + "nic", + "k" + ], + [ + "n", + "ick" + ], + [ + "▁sup", + "press" + ], + [ + "▁supp", + "ress" + ], + [ + "ча", + "т" + ], + [ + "ч", + "ат" + ], + [ + "▁H", + "ello" + ], + [ + "▁Hel", + "lo" + ], + [ + "▁Hell", + "o" + ], + [ + "▁", + "Hello" + ], + [ + "▁crow", + "ded" + ], + [ + "▁crowd", + "ed" + ], + [ + "hba", + "r" + ], + [ + "h", + "bar" + ], + [ + "▁lo", + "ads" + ], + [ + "▁load", + "s" + ], + [ + "▁", + "loads" + ], + [ + "▁cor", + "rection" + ], + [ + "▁correct", + "ion" + ], + [ + "▁corre", + "ction" + ], + [ + "ad", + "just" + ], + [ + "adj", + "ust" + ], + [ + "▁E", + "state" + ], + [ + "▁Est", + "ate" + ], + [ + "▁Esta", + "te" + ], + [ + "text", + "sc" + ], + [ + "▁cool", + "ing" + ], + [ + "iv", + "eau" + ], + [ + "ive", + "au" + ], + [ + "▁bet", + "ting" + ], + [ + "====", + 
"========" + ], + [ + "========", + "====" + ], + [ + "re", + "mark" + ], + [ + "rem", + "ark" + ], + [ + "r", + "emark" + ], + [ + "▁im", + "plications" + ], + [ + "▁impl", + "ications" + ], + [ + "▁p", + "oz" + ], + [ + "▁po", + "z" + ], + [ + "ün", + "g" + ], + [ + "ü", + "ng" + ], + [ + "▁reg", + "ards" + ], + [ + "▁regard", + "s" + ], + [ + "▁a", + "mid" + ], + [ + "▁am", + "id" + ], + [ + "▁habit", + "antes" + ], + [ + "G", + "I" + ], + [ + "▁F", + "ou" + ], + [ + "▁Fo", + "u" + ], + [ + "▁j", + "ar" + ], + [ + "▁ja", + "r" + ], + [ + "▁", + "jar" + ], + [ + "▁requ", + "iring" + ], + [ + "▁D", + "rupal" + ], + [ + "▁Dru", + "pal" + ], + [ + "▁li", + "ability" + ], + [ + "cz", + "as" + ], + [ + "c", + "zas" + ], + [ + "▁l", + "yrics" + ], + [ + "▁ly", + "rics" + ], + [ + "▁N", + "ort" + ], + [ + "▁No", + "rt" + ], + [ + "▁Nor", + "t" + ], + [ + "si", + "l" + ], + [ + "s", + "il" + ], + [ + "▁M", + "ey" + ], + [ + "▁Me", + "y" + ], + [ + "UN", + "IT" + ], + [ + "ва", + "ния" + ], + [ + "f", + "uture" + ], + [ + "hi", + "r" + ], + [ + "h", + "ir" + ], + [ + "CA", + "L" + ], + [ + "C", + "AL" + ], + [ + "LAB", + "EL" + ], + [ + "▁S", + "weet" + ], + [ + "▁stat", + "ue" + ], + [ + "bor", + "ne" + ], + [ + "born", + "e" + ], + [ + "b", + "orne" + ], + [ + "Not", + "ify" + ], + [ + "▁her", + "itage" + ], + [ + "▁d", + "orm" + ], + [ + "▁do", + "rm" + ], + [ + "▁l", + "ever" + ], + [ + "▁le", + "ver" + ], + [ + "▁lev", + "er" + ], + [ + "▁mut", + "tered" + ], + [ + "}", + "&" + ], + [ + "▁inter", + "mediate" + ], + [ + "▁Wat", + "son" + ], + [ + "▁view", + "ing" + ], + [ + "▁vie", + "wing" + ], + [ + "kt", + "or" + ], + [ + "k", + "tor" + ], + [ + "enti", + "eth" + ], + [ + "xx", + "x" + ], + [ + "x", + "xx" + ], + [ + "at", + "u" + ], + [ + "a", + "tu" + ], + [ + "▁Inst", + "all" + ], + [ + "▁", + "Install" + ], + [ + "Cont", + "in" + ], + [ + "▁t", + "oute" + ], + [ + "▁to", + "ute" + ], + [ + "▁tou", + "te" + ], + [ + "▁tout", + "e" + ], + [ + "▁P", + "T" + ], + [ + "▁", + "PT" + ], + [ + "▁u", + "ri" + ], + [ + "▁ur", + "i" + ], + [ + "▁", + "uri" + ], + [ + "Call", + "ed" + ], + [ + "Cal", + "led" + ], + [ + "C", + "alled" + ], + [ + "▁O", + "FF" + ], + [ + "▁OF", + "F" + ], + [ + "▁", + "OFF" + ], + [ + "ig", + "lia" + ], + [ + "ic", + "hi" + ], + [ + "ich", + "i" + ], + [ + "i", + "chi" + ], + [ + "с", + "ни" + ], + [ + "V", + "o" + ], + [ + "▁exhib", + "it" + ], + [ + "▁asym", + "pt" + ], + [ + "▁G", + "ulf" + ], + [ + "л", + "ли" + ], + [ + "do", + "min" + ], + [ + "dom", + "in" + ], + [ + "d", + "omin" + ], + [ + "▁départ", + "ement" + ], + [ + "mi", + "l" + ], + [ + "m", + "il" + ], + [ + "▁B", + "ez" + ], + [ + "▁Be", + "z" + ], + [ + "▁l", + "ately" + ], + [ + "▁late", + "ly" + ], + [ + "▁lat", + "ely" + ], + [ + "▁def", + "ining" + ], + [ + "▁defin", + "ing" + ], + [ + "▁E", + "L" + ], + [ + "▁", + "EL" + ], + [ + "omorph", + "ic" + ], + [ + "▁f", + "ebru" + ], + [ + "▁fe", + "bru" + ], + [ + "▁febr", + "u" + ], + [ + "IS", + "TER" + ], + [ + "IST", + "ER" + ], + [ + "I", + "STER" + ], + [ + "res", + "olved" + ], + [ + "resolve", + "d" + ], + [ + "те", + "й" + ], + [ + "т", + "ей" + ], + [ + "▁S", + "pect" + ], + [ + "▁Sp", + "ect" + ], + [ + "▁Spec", + "t" + ], + [ + "▁Spe", + "ct" + ], + [ + "▁sem", + "pre" + ], + [ + "▁Se", + "pt" + ], + [ + "▁Sep", + "t" + ], + [ + "▁cl", + "earing" + ], + [ + "▁cle", + "aring" + ], + [ + "▁clear", + "ing" + ], + [ + "▁diam", + "eter" + ], + [ + "in", + "do" + ], + [ + "ind", + "o" + ], + [ + "▁soc", + "cer" + ], + [ + "▁D", + "CHECK" + ], 
+ [ + "▁DC", + "HECK" + ], + [ + "vo", + "te" + ], + [ + "v", + "ote" + ], + [ + "▁n", + "omin" + ], + [ + "▁no", + "min" + ], + [ + "▁nom", + "in" + ], + [ + "Type", + "d" + ], + [ + "Ty", + "ped" + ], + [ + "Typ", + "ed" + ], + [ + "Miss", + "ing" + ], + [ + "W", + "as" + ], + [ + "▁Cent", + "ury" + ], + [ + "▁direct", + "ors" + ], + [ + "▁dire", + "ctors" + ], + [ + "▁director", + "s" + ], + [ + "▁mode", + "rate" + ], + [ + "▁moder", + "ate" + ], + [ + "▁Ill", + "uminate" + ], + [ + "▁", + "Illuminate" + ], + [ + "▁челове", + "к" + ], + [ + "▁B", + "apt" + ], + [ + "▁Ba", + "pt" + ], + [ + "▁Qu", + "ant" + ], + [ + "▁", + "Quant" + ], + [ + "▁tre", + "ating" + ], + [ + "▁treat", + "ing" + ], + [ + "ag", + "i" + ], + [ + "a", + "gi" + ], + [ + "Si", + "l" + ], + [ + "S", + "il" + ], + [ + "ring", + "e" + ], + [ + "rin", + "ge" + ], + [ + "r", + "inge" + ], + [ + "ł", + "ą" + ], + [ + "el", + "lan" + ], + [ + "ell", + "an" + ], + [ + "ella", + "n" + ], + [ + "▁f", + "ino" + ], + [ + "▁fin", + "o" + ], + [ + "▁fi", + "no" + ], + [ + "Capt", + "ure" + ], + [ + "C", + "apture" + ], + [ + "▁S", + "ic" + ], + [ + "▁Si", + "c" + ], + [ + "▁st", + "amp" + ], + [ + "▁sta", + "mp" + ], + [ + "▁stam", + "p" + ], + [ + "▁B", + "uen" + ], + [ + "▁Bu", + "en" + ], + [ + "▁seg", + "undo" + ], + [ + "▁in", + "verse" + ], + [ + "▁d", + "up" + ], + [ + "▁du", + "p" + ], + [ + "▁", + "dup" + ], + [ + "▁br", + "oker" + ], + [ + "▁bro", + "ker" + ], + [ + "▁broke", + "r" + ], + [ + "▁search", + "ed" + ], + [ + "▁sear", + "ched" + ], + [ + "be", + "ans" + ], + [ + "bean", + "s" + ], + [ + "▁A", + "BC" + ], + [ + "▁AB", + "C" + ], + [ + "is", + "ha" + ], + [ + "ish", + "a" + ], + [ + "i", + "sha" + ], + [ + "▁Lin", + "ked" + ], + [ + "▁Link", + "ed" + ], + [ + "▁", + "Linked" + ], + [ + "▁Nich", + "olas" + ], + [ + "▁Sw", + "edish" + ], + [ + "he", + "mal" + ], + [ + "hem", + "al" + ], + [ + "▁E", + "M" + ], + [ + "▁", + "EM" + ], + [ + "▁j", + "ego" + ], + [ + "▁je", + "go" + ], + [ + "че", + "ский" + ], + [ + "чески", + "й" + ], + [ + "lo", + "t" + ], + [ + "l", + "ot" + ], + [ + "▁dis", + "cret" + ], + [ + "▁disc", + "ret" + ], + [ + "▁discre", + "t" + ], + [ + "▁E", + "g" + ], + [ + "pi", + "ck" + ], + [ + "pic", + "k" + ], + [ + "p", + "ick" + ], + [ + "am", + "on" + ], + [ + "amo", + "n" + ], + [ + "a", + "mon" + ], + [ + "▁Rail", + "way" + ], + [ + "ка", + "р" + ], + [ + "к", + "ар" + ], + [ + "▁nav", + "igate" + ], + [ + "▁navig", + "ate" + ], + [ + "▁Comm", + "ander" + ], + [ + "▁Command", + "er" + ], + [ + "▁disappe", + "ar" + ], + [ + "▁con", + "gress" + ], + [ + "▁congr", + "ess" + ], + [ + "▁graph", + "ic" + ], + [ + "sp", + "r" + ], + [ + "s", + "pr" + ], + [ + "FLO", + "AT" + ], + [ + "▁S", + "erial" + ], + [ + "▁Se", + "rial" + ], + [ + "▁Ser", + "ial" + ], + [ + "▁", + "Serial" + ], + [ + "▁я", + "нва" + ], + [ + "so", + "cial" + ], + [ + "soc", + "ial" + ], + [ + "s", + "ocial" + ], + [ + "bu", + "ch" + ], + [ + "b", + "uch" + ], + [ + "▁se", + "al" + ], + [ + "▁sea", + "l" + ], + [ + "▁c", + "ement" + ], + [ + "▁ce", + "ment" + ], + [ + "▁Y", + "e" + ], + [ + "ot", + "ti" + ], + [ + "ott", + "i" + ], + [ + "o", + "tti" + ], + [ + "▁The", + "od" + ], + [ + "remove", + "Class" + ], + [ + "▁Jul", + "ie" + ], + [ + "▁Ju", + "lie" + ], + [ + "▁Juli", + "e" + ], + [ + "▁gr", + "öß" + ], + [ + "ST", + "REAM" + ], + [ + "▁G", + "B" + ], + [ + "▁", + "GB" + ], + [ + "▁Ben", + "ef" + ], + [ + "▁Mat", + "rix" + ], + [ + "▁", + "Matrix" + ], + [ + "▁ke", + "ine" + ], + [ + "▁cont", + "inent" + ], + [ + 
"▁contin", + "ent" + ], + [ + "▁ja", + "ar" + ], + [ + "DA", + "I" + ], + [ + "D", + "AI" + ], + [ + "▁S", + "equ" + ], + [ + "▁Se", + "qu" + ], + [ + "▁", + "Sequ" + ], + [ + "kre", + "is" + ], + [ + "▁c", + "rown" + ], + [ + "▁cr", + "own" + ], + [ + "▁crow", + "n" + ], + [ + "▁cro", + "wn" + ], + [ + "Init", + "ialize" + ], + [ + "Initial", + "ize" + ], + [ + "ax", + "y" + ], + [ + "a", + "xy" + ], + [ + "▁C", + "IA" + ], + [ + "▁int", + "end" + ], + [ + "▁inte", + "nd" + ], + [ + "▁b", + "ub" + ], + [ + "▁bu", + "b" + ], + [ + "▁mask", + "s" + ], + [ + "▁mas", + "ks" + ], + [ + "▁sit", + "uated" + ], + [ + "▁situ", + "ated" + ], + [ + "▁E", + "du" + ], + [ + "▁Ed", + "u" + ], + [ + "▁particip", + "ating" + ], + [ + "ше", + "й" + ], + [ + "ш", + "ей" + ], + [ + "_{", + "-" + ], + [ + "_", + "{-" + ], + [ + "▁Tele", + "vision" + ], + [ + "▁pre", + "ferences" + ], + [ + "▁prefer", + "ences" + ], + [ + "▁preference", + "s" + ], + [ + "▁D", + "rop" + ], + [ + "▁Dr", + "op" + ], + [ + "▁", + "Drop" + ], + [ + "re", + "view" + ], + [ + "rev", + "iew" + ], + [ + "▁vi", + "olation" + ], + [ + "▁viol", + "ation" + ], + [ + "▁ch", + "rist" + ], + [ + "▁chr", + "ist" + ], + [ + "q", + "q" + ], + [ + "▁M", + "yst" + ], + [ + "▁My", + "st" + ], + [ + "comm", + "ands" + ], + [ + "command", + "s" + ], + [ + "▁prim", + "itive" + ], + [ + "ill", + "ance" + ], + [ + "▁r", + "anging" + ], + [ + "▁ran", + "ging" + ], + [ + "▁rang", + "ing" + ], + [ + "▁Adv", + "anced" + ], + [ + ")", + "&" + ], + [ + "▁О", + "б" + ], + [ + "▁sub", + "str" + ], + [ + "▁subst", + "r" + ], + [ + "▁subs", + "tr" + ], + [ + "▁", + "substr" + ], + [ + "▁clos", + "ure" + ], + [ + "▁clo", + "sure" + ], + [ + "▁", + "closure" + ], + [ + "tw", + "itter" + ], + [ + "ne", + "z" + ], + [ + "n", + "ez" + ], + [ + "▁pr", + "zed" + ], + [ + "▁prz", + "ed" + ], + [ + "▁prze", + "d" + ], + [ + "▁mer", + "ged" + ], + [ + "▁merge", + "d" + ], + [ + "ur", + "os" + ], + [ + "uro", + "s" + ], + [ + "u", + "ros" + ], + [ + "▁j", + "er" + ], + [ + "▁je", + "r" + ], + [ + "▁", + "jer" + ], + [ + "▁_", + "(" + ], + [ + "▁", + "_(" + ], + [ + "ar", + "an" + ], + [ + "ara", + "n" + ], + [ + "a", + "ran" + ], + [ + "▁P", + "atri" + ], + [ + "▁Pat", + "ri" + ], + [ + "▁Pa", + "tri" + ], + [ + "▁T", + "un" + ], + [ + "▁Tu", + "n" + ], + [ + "U", + "K" + ], + [ + "il", + "iation" + ], + [ + "ili", + "ation" + ], + [ + "▁Ke", + "ith" + ], + [ + "Own", + "Property" + ], + [ + "op", + "sis" + ], + [ + "ops", + "is" + ], + [ + "Ma", + "d" + ], + [ + "M", + "ad" + ], + [ + "▁def", + "ence" + ], + [ + "A", + "ir" + ], + [ + "=$", + "{" + ], + [ + "=", + "${" + ], + [ + "cript", + "ors" + ], + [ + "criptor", + "s" + ], + [ + "So", + "m" + ], + [ + "S", + "om" + ], + [ + "▁", + "±" + ], + [ + "▁HA", + "VE" + ], + [ + "~~~~", + "~~~~" + ], + [ + "▁be", + "aten" + ], + [ + "▁beat", + "en" + ], + [ + "▁int", + "imate" + ], + [ + "▁intim", + "ate" + ], + [ + "op", + "ic" + ], + [ + "o", + "pic" + ], + [ + "▁p", + "řed" + ], + [ + "▁př", + "ed" + ], + [ + "Sh", + "op" + ], + [ + "S", + "hop" + ], + [ + "Table", + "s" + ], + [ + "Tab", + "les" + ], + [ + "T", + "ables" + ], + [ + "▁S", + "I" + ], + [ + "▁", + "SI" + ], + [ + "re", + "name" + ], + [ + "ren", + "ame" + ], + [ + "rena", + "me" + ], + [ + "r", + "ename" + ], + [ + "▁product", + "ive" + ], + [ + "rib", + "ly" + ], + [ + "r", + "ibly" + ], + [ + "▁L", + "uck" + ], + [ + "▁Lu", + "ck" + ], + [ + "▁Luc", + "k" + ], + [ + "▁kl", + "ub" + ], + [ + "}}", + "^{" + ], + [ + "}}^", + "{" + ], + [ + "}", + "}^{" + ], 
+ [ + "▁F", + "ish" + ], + [ + "▁Fi", + "sh" + ], + [ + "PR", + "I" + ], + [ + "P", + "RI" + ], + [ + "en", + "ario" + ], + [ + "ena", + "rio" + ], + [ + "▁pse", + "ud" + ], + [ + "Or", + "d" + ], + [ + "O", + "rd" + ], + [ + "▁quel", + "ques" + ], + [ + "▁D", + "od" + ], + [ + "▁Do", + "d" + ], + [ + "▁p", + "unto" + ], + [ + "▁pun", + "to" + ], + [ + "▁punt", + "o" + ], + [ + "se", + "nal" + ], + [ + "sen", + "al" + ], + [ + "▁Br", + "others" + ], + [ + "▁Bro", + "thers" + ], + [ + "▁Brother", + "s" + ], + [ + "▁diab", + "etes" + ], + [ + "P", + "aint" + ], + [ + "▁person", + "as" + ], + [ + "▁persona", + "s" + ], + [ + "в", + "ър" + ], + [ + "▁n", + "ep" + ], + [ + "▁ne", + "p" + ], + [ + "▁El", + "len" + ], + [ + "▁Ell", + "en" + ], + [ + "▁Elle", + "n" + ], + [ + "▁h", + "ä" + ], + [ + "cr", + "tc" + ], + [ + "c", + "rtc" + ], + [ + "▁frustr", + "ation" + ], + [ + ".", + "^{[" + ], + [ + "▁s", + "printf" + ], + [ + "▁sprint", + "f" + ], + [ + "▁", + "sprintf" + ], + [ + "+", + "-" + ], + [ + "En", + "code" + ], + [ + "Enc", + "ode" + ], + [ + "▁насе", + "лення" + ], + [ + "Draw", + "able" + ], + [ + "▁b", + "ore" + ], + [ + "▁bo", + "re" + ], + [ + "▁bor", + "e" + ], + [ + "▁E", + "ld" + ], + [ + "▁El", + "d" + ], + [ + "те", + "т" + ], + [ + "т", + "ет" + ], + [ + "T", + "ick" + ], + [ + "ar", + "ator" + ], + [ + "ara", + "tor" + ], + [ + "▁Fin", + "ance" + ], + [ + "▁agric", + "ultural" + ], + [ + ")^", + "{-" + ], + [ + ")^{", + "-" + ], + [ + ")", + "^{-" + ], + [ + "may", + "be" + ], + [ + "Sche", + "dule" + ], + [ + "▁[", + "…]" + ], + [ + "et", + "ection" + ], + [ + "ete", + "ction" + ], + [ + "ль", + "ного" + ], + [ + "льно", + "го" + ], + [ + "▁he", + "els" + ], + [ + "▁En", + "joy" + ], + [ + "Sy", + "s" + ], + [ + "S", + "ys" + ], + [ + "orsz", + "ág" + ], + [ + "CONT", + "ROL" + ], + [ + "cc", + "cc" + ], + [ + "▁D", + "ictionary" + ], + [ + "▁", + "Dictionary" + ], + [ + "Ne", + "ed" + ], + [ + "N", + "eed" + ], + [ + "▁He", + "aven" + ], + [ + "▁vess", + "els" + ], + [ + "▁vessel", + "s" + ], + [ + "ec", + "ycle" + ], + [ + "e", + "cycle" + ], + [ + "ti", + "es" + ], + [ + "t", + "ies" + ], + [ + "▁e", + "nde" + ], + [ + "▁en", + "de" + ], + [ + "▁end", + "e" + ], + [ + "▁", + "ende" + ], + [ + "SI", + "NG" + ], + [ + "S", + "ING" + ], + [ + "De", + "scribe" + ], + [ + "Desc", + "ribe" + ], + [ + "▁Pub", + "lished" + ], + [ + "▁Publish", + "ed" + ], + [ + "▁win", + "ds" + ], + [ + "▁wind", + "s" + ], + [ + "neh", + "men" + ], + [ + "▁D", + "ES" + ], + [ + "▁DE", + "S" + ], + [ + "Hor", + "izontal" + ], + [ + "▁L", + "ost" + ], + [ + "▁Los", + "t" + ], + [ + "▁Lo", + "st" + ], + [ + "--", + "-----------" + ], + [ + "----", + "---------" + ], + [ + "--------", + "-----" + ], + [ + "---", + "----------" + ], + [ + "------------", + "-" + ], + [ + "-----", + "--------" + ], + [ + "----------", + "---" + ], + [ + "------", + "-------" + ], + [ + "---------", + "----" + ], + [ + "-------", + "------" + ], + [ + "-----------", + "--" + ], + [ + "-", + "------------" + ], + [ + "▁p", + "x" + ], + [ + "▁", + "px" + ], + [ + "}(", + "{\\" + ], + [ + "}", + "({\\" + ], + [ + "▁Hein", + "rich" + ], + [ + "oms", + "nitt" + ], + [ + "ho", + "s" + ], + [ + "h", + "os" + ], + [ + "Ro", + "ll" + ], + [ + "R", + "oll" + ], + [ + "tor", + "ch" + ], + [ + "▁equ", + "ity" + ], + [ + "▁eq", + "uity" + ], + [ + "▁collect", + "ing" + ], + [ + "▁l", + "ifting" + ], + [ + "▁lif", + "ting" + ], + [ + "▁lift", + "ing" + ], + [ + "sub", + "figure" + ], + [ + "Ne", + "ver" + ], + [ + "N", + "ever" + 
], + [ + "▁L", + "ength" + ], + [ + "▁Le", + "ngth" + ], + [ + "▁", + "Length" + ], + [ + "▁w", + "inners" + ], + [ + "▁win", + "ners" + ], + [ + "▁winner", + "s" + ], + [ + "▁U", + "SD" + ], + [ + "▁US", + "D" + ], + [ + "▁st", + "esso" + ], + [ + "▁а", + "бо" + ], + [ + "▁al", + "tri" + ], + [ + "▁alt", + "ri" + ], + [ + "▁produ", + "cers" + ], + [ + "▁produce", + "rs" + ], + [ + "▁producer", + "s" + ], + [ + "mon", + "s" + ], + [ + "mo", + "ns" + ], + [ + "m", + "ons" + ], + [ + "▁Pop", + "ular" + ], + [ + "Com", + "b" + ], + [ + "Co", + "mb" + ], + [ + "C", + "omb" + ], + [ + "ab", + "lo" + ], + [ + "abl", + "o" + ], + [ + "a", + "blo" + ], + [ + "RE", + "SET" + ], + [ + "RES", + "ET" + ], + [ + "т", + "ва" + ], + [ + "Over", + "lay" + ], + [ + "▁id", + "iot" + ], + [ + "▁idi", + "ot" + ], + [ + "ex", + "ist" + ], + [ + "Be", + "havior" + ], + [ + "UB", + "LE" + ], + [ + "ier", + "re" + ], + [ + "i", + "erre" + ], + [ + "mine", + "craft" + ], + [ + "▁f", + "os" + ], + [ + "▁fo", + "s" + ], + [ + "▁encuent", + "ra" + ], + [ + "▁scream", + "ed" + ], + [ + "▁polynom", + "ial" + ], + [ + "▁c", + "one" + ], + [ + "▁con", + "e" + ], + [ + "▁co", + "ne" + ], + [ + "▁c", + "ited" + ], + [ + "▁cit", + "ed" + ], + [ + "▁ci", + "ted" + ], + [ + "▁president", + "e" + ], + [ + "▁presid", + "ente" + ], + [ + "▁re", + "sign" + ], + [ + "▁res", + "ign" + ], + [ + "▁y", + "elled" + ], + [ + "▁i", + "k" + ], + [ + "▁", + "ik" + ], + [ + "Pl", + "us" + ], + [ + "▁Ми", + "ха" + ], + [ + "▁The", + "me" + ], + [ + "▁Th", + "eme" + ], + [ + "▁", + "Theme" + ], + [ + "▁re", + "li" + ], + [ + "▁r", + "eli" + ], + [ + "▁rel", + "i" + ], + [ + "ne", + "m" + ], + [ + "n", + "em" + ], + [ + "▁a", + "men" + ], + [ + "▁am", + "en" + ], + [ + "▁", + "amen" + ], + [ + "▁", + "Ј" + ], + [ + "Th", + "anks" + ], + [ + "Thank", + "s" + ], + [ + "Than", + "ks" + ], + [ + "▁al", + "umin" + ], + [ + "▁sh", + "elf" + ], + [ + "▁shel", + "f" + ], + [ + "!\"", + ");" + ], + [ + "!", + "\");" + ], + [ + "append", + "Child" + ], + [ + "▁l", + "ogs" + ], + [ + "▁lo", + "gs" + ], + [ + "▁log", + "s" + ], + [ + "▁", + "logs" + ], + [ + "▁re", + "gex" + ], + [ + "▁reg", + "ex" + ], + [ + "▁", + "regex" + ], + [ + "▁p", + "unk" + ], + [ + "▁pun", + "k" + ], + [ + "CO", + "RE" + ], + [ + "▁b", + "orders" + ], + [ + "▁border", + "s" + ], + [ + "▁bord", + "ers" + ], + [ + "▁bor", + "ders" + ], + [ + "▁Requ", + "ired" + ], + [ + "▁", + "Required" + ], + [ + "▁f", + "law" + ], + [ + "▁fl", + "aw" + ], + [ + "▁cin", + "ema" + ], + [ + "▁v", + "í" + ], + [ + "▁", + "ví" + ], + [ + "▁ab", + "ortion" + ], + [ + "▁abort", + "ion" + ], + [ + "jour", + "nal" + ], + [ + "j", + "ournal" + ], + [ + "in", + "itions" + ], + [ + "init", + "ions" + ], + [ + "inition", + "s" + ], + [ + "state", + "ment" + ], + [ + "stat", + "ement" + ], + [ + "▁o", + "urs" + ], + [ + "▁our", + "s" + ], + [ + "▁ou", + "rs" + ], + [ + "▁", + "ours" + ], + [ + "ó", + "t" + ], + [ + "▁Tur", + "ner" + ], + [ + "▁Turn", + "er" + ], + [ + "in", + "us" + ], + [ + "ev", + "es" + ], + [ + "eve", + "s" + ], + [ + "e", + "ves" + ], + [ + "▁magazine", + "s" + ], + [ + "▁magaz", + "ines" + ], + [ + "…", + "…" + ], + [ + "la", + "ce" + ], + [ + "l", + "ace" + ], + [ + "sl", + "ider" + ], + [ + "slide", + "r" + ], + [ + "▁l", + "ocate" + ], + [ + "▁loc", + "ate" + ], + [ + "▁des", + "arroll" + ], + [ + "P", + "an" + ], + [ + "To", + "m" + ], + [ + "T", + "om" + ], + [ + "▁Land", + "es" + ], + [ + "▁Lan", + "des" + ], + [ + "ol", + "ia" + ], + [ + "oli", + "a" + ], + [ + "o", + "lia" + 
], + [ + "▁u", + "nm" + ], + [ + "▁un", + "m" + ], + [ + "▁Sen", + "ator" + ], + [ + "▁ad", + "minister" + ], + [ + "▁admin", + "ister" + ], + [ + "▁ко", + "ји" + ], + [ + "▁'", + "{" + ], + [ + "▁)", + "{" + ], + [ + "▁", + "){" + ], + [ + "▁G", + "olf" + ], + [ + "▁Gol", + "f" + ], + [ + "▁g", + "ele" + ], + [ + "▁ge", + "le" + ], + [ + "▁gel", + "e" + ], + [ + "▁d", + "rank" + ], + [ + "▁dr", + "ank" + ], + [ + "pos", + "ing" + ], + [ + "po", + "sing" + ], + [ + "p", + "osing" + ], + [ + "▁en", + "semble" + ], + [ + "he", + "ap" + ], + [ + "sign", + "ature" + ], + [ + "то", + "й" + ], + [ + "ци", + "й" + ], + [ + "scri", + "ber" + ], + [ + "scr", + "iber" + ], + [ + "scribe", + "r" + ], + [ + "scrib", + "er" + ], + [ + "▁ch", + "amp" + ], + [ + "▁cha", + "mp" + ], + [ + "ni", + "o" + ], + [ + "n", + "io" + ], + [ + "la", + "yers" + ], + [ + "lay", + "ers" + ], + [ + "layer", + "s" + ], + [ + "▁tr", + "ump" + ], + [ + "▁mod", + "al" + ], + [ + "▁mo", + "dal" + ], + [ + "▁", + "modal" + ], + [ + "on", + "ces" + ], + [ + "once", + "s" + ], + [ + "че", + "ння" + ], + [ + "чен", + "ня" + ], + [ + "▁C", + "ort" + ], + [ + "▁Co", + "rt" + ], + [ + "▁Cor", + "t" + ], + [ + "▁sun", + "light" + ], + [ + "▁M", + "use" + ], + [ + "▁Mus", + "e" + ], + [ + "▁Mu", + "se" + ], + [ + "ém", + "ent" + ], + [ + "é", + "ment" + ], + [ + "▁curios", + "ity" + ], + [ + "▁v", + "r" + ], + [ + "▁", + "vr" + ], + [ + "O", + "ct" + ], + [ + "yl", + "on" + ], + [ + "y", + "lon" + ], + [ + "▁rel", + "ativ" + ], + [ + "st", + "y" + ], + [ + "s", + "ty" + ], + [ + "]", + "/" + ], + [ + "az", + "u" + ], + [ + "a", + "zu" + ], + [ + "▁U", + "SS" + ], + [ + "▁US", + "S" + ], + [ + "▁person", + "a" + ], + [ + "▁pers", + "ona" + ], + [ + "Me", + "n" + ], + [ + "M", + "en" + ], + [ + "▁w", + "ides" + ], + [ + "▁wide", + "s" + ], + [ + "▁wid", + "es" + ], + [ + "▁K", + "as" + ], + [ + "▁Ka", + "s" + ], + [ + "ic", + "ies" + ], + [ + "ici", + "es" + ], + [ + "i", + "cies" + ], + [ + "▁C", + "off" + ], + [ + "▁Co", + "ff" + ], + [ + "▁con", + "solid" + ], + [ + "▁cons", + "olid" + ], + [ + "▁inter", + "active" + ], + [ + "▁interact", + "ive" + ], + [ + "op", + "ing" + ], + [ + "o", + "ping" + ], + [ + "La", + "nd" + ], + [ + "L", + "and" + ], + [ + "▁energ", + "ies" + ], + [ + "▁independ", + "ently" + ], + [ + "▁independent", + "ly" + ], + [ + "inner", + "HTML" + ], + [ + "Requ", + "ire" + ], + [ + "Re", + "quire" + ], + [ + "▁abs", + "urd" + ], + [ + "▁IN", + "FO" + ], + [ + "▁", + "INFO" + ], + [ + "▁b", + "und" + ], + [ + "▁bu", + "nd" + ], + [ + "▁", + "bund" + ], + [ + "anz", + "ös" + ], + [ + "▁G", + "ent" + ], + [ + "▁Ge", + "nt" + ], + [ + "▁Gen", + "t" + ], + [ + "▁scholar", + "s" + ], + [ + "▁schol", + "ars" + ], + [ + "▁C", + "reated" + ], + [ + "▁Create", + "d" + ], + [ + "▁Creat", + "ed" + ], + [ + "▁Cre", + "ated" + ], + [ + "▁", + "Created" + ], + [ + "▁mar", + "ine" + ], + [ + "▁mari", + "ne" + ], + [ + "..", + ".'" + ], + [ + "...", + "'" + ], + [ + "EN", + "V" + ], + [ + "E", + "NV" + ], + [ + "ach", + "te" + ], + [ + "acht", + "e" + ], + [ + "a", + "chte" + ], + [ + "am", + "ents" + ], + [ + "ament", + "s" + ], + [ + "amen", + "ts" + ], + [ + "a", + "ments" + ], + [ + "▁tr", + "ucks" + ], + [ + "▁truck", + "s" + ], + [ + "▁re", + "wards" + ], + [ + "▁reward", + "s" + ], + [ + "og", + "s" + ], + [ + "o", + "gs" + ], + [ + "Gr", + "een" + ], + [ + "Gre", + "en" + ], + [ + "G", + "reen" + ], + [ + "▁n", + "ä" + ], + [ + "▁inher", + "ited" + ], + [ + "▁inherit", + "ed" + ], + [ + "im", + "ated" + ], + [ + 
"imate", + "d" + ], + [ + "ima", + "ted" + ], + [ + "imat", + "ed" + ], + [ + "▁F", + "REE" + ], + [ + "▁FR", + "EE" + ], + [ + "▁", + "FREE" + ], + [ + "▁ext", + "ens" + ], + [ + "da", + "g" + ], + [ + "d", + "ag" + ], + [ + "▁g", + "low" + ], + [ + "▁gl", + "ow" + ], + [ + "▁glo", + "w" + ], + [ + "ar", + "di" + ], + [ + "ard", + "i" + ], + [ + "N", + "F" + ], + [ + "▁evalu", + "ated" + ], + [ + "▁evaluate", + "d" + ], + [ + "▁eval", + "uated" + ], + [ + "▁o", + "ps" + ], + [ + "▁op", + "s" + ], + [ + "▁", + "ops" + ], + [ + "▁cle", + "aned" + ], + [ + "▁clean", + "ed" + ], + [ + "▁Prov", + "ince" + ], + [ + "▁Provinc", + "e" + ], + [ + "ha", + "bil" + ], + [ + "hab", + "il" + ], + [ + "h", + "abil" + ], + [ + "гра", + "фі" + ], + [ + "▁T", + "CP" + ], + [ + "▁", + "TCP" + ], + [ + "▁я", + "кі" + ], + [ + "▁як", + "і" + ], + [ + "▁de", + "ce" + ], + [ + "▁dec", + "e" + ], + [ + "▁cont", + "empl" + ], + [ + "▁acquis", + "ition" + ], + [ + "})", + "$." + ], + [ + "})$", + "." + ], + [ + "}", + ")$." + ], + [ + "=\"", + "-" + ], + [ + "▁se", + "ctors" + ], + [ + "▁sector", + "s" + ], + [ + "▁sect", + "ors" + ], + [ + "::", + "<" + ], + [ + "u", + "ß" + ], + [ + "▁trab", + "aj" + ], + [ + "th", + "an" + ], + [ + "tha", + "n" + ], + [ + "t", + "han" + ], + [ + "▁S", + "ta" + ], + [ + "▁St", + "a" + ], + [ + "Mem", + "bers" + ], + [ + "Member", + "s" + ], + [ + "▁r", + "v" + ], + [ + "▁", + "rv" + ], + [ + ")^", + "{\\" + ], + [ + ")^{", + "\\" + ], + [ + ")", + "^{\\" + ], + [ + "mit", + "t" + ], + [ + "mi", + "tt" + ], + [ + "m", + "itt" + ], + [ + "▁W", + "ang" + ], + [ + "▁Wa", + "ng" + ], + [ + "▁W", + "end" + ], + [ + "▁We", + "nd" + ], + [ + "▁G", + "lass" + ], + [ + "▁Gl", + "ass" + ], + [ + "▁Glas", + "s" + ], + [ + "▁t", + "xt" + ], + [ + "▁tx", + "t" + ], + [ + "▁", + "txt" + ], + [ + "▁Cam", + "eron" + ], + [ + "ie", + "ls" + ], + [ + "iel", + "s" + ], + [ + "i", + "els" + ], + [ + "▁im", + "mer" + ], + [ + "▁imm", + "er" + ], + [ + "▁", + "immer" + ], + [ + "▁насе", + "ления" + ], + [ + "..", + ".", + "/" + ], + [ + "▁ро", + "ди" + ], + [ + "▁", + "роди" + ], + [ + "▁sophistic", + "ated" + ], + [ + "▁R", + "he" + ], + [ + "▁Rh", + "e" + ], + [ + "us", + "sy" + ], + [ + "uss", + "y" + ], + [ + "▁Sy", + "ria" + ], + [ + "▁Car", + "oline" + ], + [ + "▁Carol", + "ine" + ], + [ + "riter", + "ion" + ], + [ + "ér", + "c" + ], + [ + "é", + "rc" + ], + [ + "Lo", + "ve" + ], + [ + "L", + "ove" + ], + [ + "▁cy", + "cles" + ], + [ + "▁cycle", + "s" + ], + [ + "▁cycl", + "es" + ], + [ + "▁Ter", + "ms" + ], + [ + "▁Term", + "s" + ], + [ + "▁med", + "ieval" + ], + [ + "▁medi", + "eval" + ], + [ + "ь", + "я" + ], + [ + "▁m", + "issions" + ], + [ + "▁miss", + "ions" + ], + [ + "▁mission", + "s" + ], + [ + "Har", + "d" + ], + [ + "Ha", + "rd" + ], + [ + "H", + "ard" + ], + [ + "▁rég", + "ion" + ], + [ + "▁Ph", + "oenix" + ], + [ + "De", + "ep" + ], + [ + "▁sam", + "pling" + ], + [ + "▁dismiss", + "ed" + ], + [ + "prop", + "ri" + ], + [ + "p", + "ropri" + ], + [ + "▁jud", + "ges" + ], + [ + "▁judge", + "s" + ], + [ + "▁judg", + "es" + ], + [ + "ał", + "a" + ], + [ + "a", + "ła" + ], + [ + "ul", + "os" + ], + [ + "ulo", + "s" + ], + [ + "u", + "los" + ], + [ + "▁L", + "ion" + ], + [ + "▁Li", + "on" + ], + [ + "▁loc", + "als" + ], + [ + "▁local", + "s" + ], + [ + "neg", + "ative" + ], + [ + "ogen", + "eous" + ], + [ + "ogene", + "ous" + ], + [ + "▁A", + "pi" + ], + [ + "▁Ap", + "i" + ], + [ + "▁", + "Api" + ], + [ + "▁d", + "ici" + ], + [ + "▁di", + "ci" + ], + [ + "▁dic", + "i" + ], + [ + "▁а", + 
"пре" + ], + [ + "▁author", + "ized" + ], + [ + "▁", + "authorized" + ], + [ + "ze", + "rw" + ], + [ + "zer", + "w" + ], + [ + "▁p", + "g" + ], + [ + "▁", + "pg" + ], + [ + "▁A", + "WS" + ], + [ + "▁key", + "word" + ], + [ + "▁", + "keyword" + ], + [ + "▁entrepre", + "neur" + ], + [ + "▁п", + "рое" + ], + [ + "▁про", + "е" + ], + [ + "▁V", + "ancouver" + ], + [ + "it", + "ating" + ], + [ + "ita", + "ting" + ], + [ + "itat", + "ing" + ], + [ + "F", + "ast" + ], + [ + "▁acknowled", + "ged" + ], + [ + "▁acknowledge", + "d" + ], + [ + "▁tour", + "ist" + ], + [ + "▁tou", + "rist" + ], + [ + "▁G", + "rid" + ], + [ + "▁Gr", + "id" + ], + [ + "▁", + "Grid" + ], + [ + "▁En", + "try" + ], + [ + "▁Ent", + "ry" + ], + [ + "▁", + "Entry" + ], + [ + "▁g", + "ebru" + ], + [ + "▁ge", + "bru" + ], + [ + "▁geb", + "ru" + ], + [ + "sa", + "t" + ], + [ + "s", + "at" + ], + [ + "ber", + "ger" + ], + [ + "berg", + "er" + ], + [ + "▁T", + "F" + ], + [ + "▁", + "TF" + ], + [ + "▁m", + "t" + ], + [ + "▁", + "mt" + ], + [ + "▁Mar", + "cel" + ], + [ + "▁Marc", + "el" + ], + [ + "▁Tw", + "enty" + ], + [ + "▁", + "”" + ], + [ + "{}", + "{" + ], + [ + "{", + "}{" + ], + [ + "hi", + "nt" + ], + [ + "hin", + "t" + ], + [ + "h", + "int" + ], + [ + "▁an", + "onymous" + ], + [ + "Cam", + "p" + ], + [ + "C", + "amp" + ], + [ + "▁**", + "_" + ], + [ + "By", + "Comparator" + ], + [ + "U", + "C" + ], + [ + "▁t", + "ö" + ], + [ + "Event", + "Handler" + ], + [ + "▁t", + "ours" + ], + [ + "▁to", + "urs" + ], + [ + "▁tour", + "s" + ], + [ + "▁tou", + "rs" + ], + [ + "▁lon", + "ely" + ], + [ + "▁Sum", + "mary" + ], + [ + "▁", + "Summary" + ], + [ + "st", + "ick" + ], + [ + "s", + "tick" + ], + [ + "All", + "owed" + ], + [ + "Allow", + "ed" + ], + [ + "лі", + "в" + ], + [ + "л", + "ів" + ], + [ + "▁B", + "rew" + ], + [ + "▁Br", + "ew" + ], + [ + "▁Bre", + "w" + ], + [ + "AME", + "TER" + ], + [ + "▁review", + "ed" + ], + [ + "ir", + "at" + ], + [ + "ira", + "t" + ], + [ + "i", + "rat" + ], + [ + "▁n", + "erve" + ], + [ + "▁nerv", + "e" + ], + [ + "▁ner", + "ve" + ], + [ + "▁L", + "inda" + ], + [ + "▁Lin", + "da" + ], + [ + "▁Lind", + "a" + ], + [ + "▁dec", + "is" + ], + [ + "▁sp", + "okes" + ], + [ + "▁spoke", + "s" + ], + [ + "▁spo", + "kes" + ], + [ + "▁qu", + "ed" + ], + [ + "▁que", + "d" + ], + [ + "▁q", + "ued" + ], + [ + "▁F", + "T" + ], + [ + "▁", + "FT" + ], + [ + "▁в", + "ін" + ], + [ + "▁ві", + "н" + ], + [ + "ou", + "sing" + ], + [ + "ous", + "ing" + ], + [ + "o", + "using" + ], + [ + "▁L", + "arge" + ], + [ + "▁Lar", + "ge" + ], + [ + "▁", + "Large" + ], + [ + "▁op", + "ponents" + ], + [ + "▁oppon", + "ents" + ], + [ + "▁opponent", + "s" + ], + [ + "▁D", + "isc" + ], + [ + "▁Dis", + "c" + ], + [ + "▁Di", + "sc" + ], + [ + "Found", + "ation" + ], + [ + "EQ", + "UAL" + ], + [ + "og", + "g" + ], + [ + "o", + "gg" + ], + [ + "Re", + "try" + ], + [ + "Ret", + "ry" + ], + [ + "R", + "etry" + ], + [ + "CHAN", + "NEL" + ], + [ + "▁Е", + "вро" + ], + [ + "▁%", + "." + ], + [ + "▁", + "%." 
+ ], + [ + "▁i", + "i" + ], + [ + "▁", + "ii" + ], + [ + "de", + "ad" + ], + [ + "d", + "ead" + ], + [ + "▁M", + "ale" + ], + [ + "▁Mal", + "e" + ], + [ + "▁Ma", + "le" + ], + [ + "Com", + "pleted" + ], + [ + "Comp", + "leted" + ], + [ + "Complete", + "d" + ], + [ + "ty", + "p" + ], + [ + "t", + "yp" + ], + [ + "▁Ty", + "ler" + ], + [ + "Dis", + "k" + ], + [ + "Di", + "sk" + ], + [ + "D", + "isk" + ], + [ + "Hi", + "de" + ], + [ + "H", + "ide" + ], + [ + "iju", + "ana" + ], + [ + "▁public", + "ations" + ], + [ + "▁publication", + "s" + ], + [ + "fo", + "x" + ], + [ + "f", + "ox" + ], + [ + "vis", + "ed" + ], + [ + "vi", + "sed" + ], + [ + "v", + "ised" + ], + [ + "Fore", + "ign" + ], + [ + "Write", + "Line" + ], + [ + "де", + "ра" + ], + [ + "дер", + "а" + ], + [ + "▁remain", + "der" + ], + [ + "Pi", + "cker" + ], + [ + "P", + "icker" + ], + [ + "we", + "alth" + ], + [ + "▁G", + "or" + ], + [ + "▁Go", + "r" + ], + [ + "sequ", + "ently" + ], + [ + "▁coll", + "ision" + ], + [ + "▁Harr", + "ison" + ], + [ + "▁Harris", + "on" + ], + [ + "▁work", + "place" + ], + [ + "▁N", + "ormal" + ], + [ + "▁Nor", + "mal" + ], + [ + "▁Norm", + "al" + ], + [ + "▁", + "Normal" + ], + [ + "▁B", + "irth" + ], + [ + "▁Bir", + "th" + ], + [ + "▁cons", + "ume" + ], + [ + "▁consum", + "e" + ], + [ + "Sh", + "ift" + ], + [ + "▁avoid", + "ing" + ], + [ + "▁C", + "ha" + ], + [ + "▁Ch", + "a" + ], + [ + "▁An", + "ti" + ], + [ + "▁Ant", + "i" + ], + [ + "▁ch", + "arts" + ], + [ + "▁char", + "ts" + ], + [ + "▁chart", + "s" + ], + [ + "▁P", + "av" + ], + [ + "▁Pa", + "v" + ], + [ + "ст", + "вом" + ], + [ + "ство", + "м" + ], + [ + "ual", + "mente" + ], + [ + "an", + "ed" + ], + [ + "ane", + "d" + ], + [ + "a", + "ned" + ], + [ + "▁A", + "uch" + ], + [ + "▁Au", + "ch" + ], + [ + "rd", + "ev" + ], + [ + "r", + "dev" + ], + [ + "▁she", + "er" + ], + [ + "▁an", + "gl" + ], + [ + "▁ang", + "l" + ], + [ + "sub", + "str" + ], + [ + "Gener", + "ate" + ], + [ + ">", + "=" + ], + [ + "▁B", + "ev" + ], + [ + "▁Be", + "v" + ], + [ + "▁ч", + "ем" + ], + [ + "▁че", + "м" + ], + [ + "▁camp", + "o" + ], + [ + "▁cam", + "po" + ], + [ + "▁lect", + "ure" + ], + [ + "hy", + "per" + ], + [ + "▁Balt", + "imore" + ], + [ + "mi", + "x" + ], + [ + "m", + "ix" + ], + [ + "ke", + "iten" + ], + [ + "keit", + "en" + ], + [ + "▁ра", + "ди" + ], + [ + "▁l", + "asted" + ], + [ + "▁la", + "sted" + ], + [ + "▁last", + "ed" + ], + [ + "▁las", + "ted" + ], + [ + "▁discrim", + "ination" + ], + [ + "ig", + "te" + ], + [ + "igt", + "e" + ], + [ + "ok", + "al" + ], + [ + "oka", + "l" + ], + [ + "o", + "kal" + ], + [ + "Ph", + "ase" + ], + [ + "▁T", + "itel" + ], + [ + "▁Tit", + "el" + ], + [ + "▁Fif", + "th" + ], + [ + "▁di", + "agnostic" + ], + [ + "su", + "ng" + ], + [ + "sun", + "g" + ], + [ + "s", + "ung" + ], + [ + "▁giorn", + "ata" + ], + [ + "os", + "ta" + ], + [ + "ost", + "a" + ], + [ + "o", + "sta" + ], + [ + "is", + "co" + ], + [ + "isc", + "o" + ], + [ + "▁S", + "ara" + ], + [ + "▁Sa", + "ra" + ], + [ + "▁Sar", + "a" + ], + [ + "m", + "v" + ], + [ + "▁el", + "ő" + ], + [ + "▁R", + "osen" + ], + [ + "▁Ro", + "sen" + ], + [ + "▁Ros", + "en" + ], + [ + "▁Rose", + "n" + ], + [ + "▁E", + "SP" + ], + [ + "▁ES", + "P" + ], + [ + "ph", + "er" + ], + [ + "p", + "her" + ], + [ + "▁a", + "j" + ], + [ + "▁", + "aj" + ], + [ + "Path", + "s" + ], + [ + "Pat", + "hs" + ], + [ + "▁R", + "alph" + ], + [ + "▁ž", + "e" + ], + [ + "▁", + "že" + ], + [ + "ре", + "в" + ], + [ + "р", + "ев" + ], + [ + "▁о", + "коло" + ], + [ + "▁ок", + "оло" + ], + [ + "▁Ag", + "reement" 
+ ], + [ + "▁Word", + "Press" + ], + [ + "an", + "try" + ], + [ + "ant", + "ry" + ], + [ + "▁p", + "icks" + ], + [ + "▁pick", + "s" + ], + [ + "▁pi", + "cks" + ], + [ + "▁pic", + "ks" + ], + [ + "▁N", + "ur" + ], + [ + "▁Nu", + "r" + ], + [ + "chedul", + "ed" + ], + [ + "ki", + "e" + ], + [ + "k", + "ie" + ], + [ + "▁represent", + "ations" + ], + [ + "▁representation", + "s" + ], + [ + "++", + "){" + ], + [ + "++)", + "{" + ], + [ + "ess", + "ment" + ], + [ + "▁count", + "less" + ], + [ + "Block", + "s" + ], + [ + "Bl", + "ocks" + ], + [ + "Blo", + "cks" + ], + [ + "ym", + "e" + ], + [ + "y", + "me" + ], + [ + "▁c", + "lo" + ], + [ + "▁cl", + "o" + ], + [ + "▁B", + "ened" + ], + [ + "▁Be", + "ned" + ], + [ + "▁Ben", + "ed" + ], + [ + "ch", + "ars" + ], + [ + "char", + "s" + ], + [ + "cha", + "rs" + ], + [ + "▁A", + "gent" + ], + [ + "▁Ag", + "ent" + ], + [ + "▁Age", + "nt" + ], + [ + "▁", + "Agent" + ], + [ + "▁hist", + "oria" + ], + [ + "▁histor", + "ia" + ], + [ + "▁F", + "loor" + ], + [ + "▁Fl", + "oor" + ], + [ + "▁Flo", + "or" + ], + [ + "▁ten", + "ía" + ], + [ + "▁long", + "est" + ], + [ + "▁lon", + "gest" + ], + [ + "fr", + "ica" + ], + [ + "▁b", + "ef" + ], + [ + "▁be", + "f" + ], + [ + "▁mechan", + "isms" + ], + [ + "▁mechanism", + "s" + ], + [ + "ла", + "зи" + ], + [ + "▁h", + "eter" + ], + [ + "▁he", + "ter" + ], + [ + "▁het", + "er" + ], + [ + "▁athlet", + "es" + ], + [ + "▁period", + "ic" + ], + [ + "▁V", + "otes" + ], + [ + "▁Vo", + "tes" + ], + [ + "ри", + "сти" + ], + [ + "▁n", + "á" + ], + [ + "▁", + "ná" + ], + [ + "▁m", + "aid" + ], + [ + "▁ma", + "id" + ], + [ + "▁mai", + "d" + ], + [ + "▁s", + "wear" + ], + [ + "▁sw", + "ear" + ], + [ + "▁swe", + "ar" + ], + [ + "▁wip", + "ed" + ], + [ + "▁graph", + "s" + ], + [ + "▁grap", + "hs" + ], + [ + "▁t", + "hesis" + ], + [ + "▁the", + "sis" + ], + [ + "▁th", + "esis" + ], + [ + "▁sens", + "ation" + ], + [ + "pers", + "istence" + ], + [ + "▁V", + "il" + ], + [ + "▁Vi", + "l" + ], + [ + "ac", + "s" + ], + [ + "a", + "cs" + ], + [ + "▁de", + "el" + ], + [ + "sc", + "rib" + ], + [ + "scri", + "b" + ], + [ + "scr", + "ib" + ], + [ + "ie", + "ro" + ], + [ + "ier", + "o" + ], + [ + "i", + "ero" + ], + [ + "▁dis", + "cre" + ], + [ + "▁disc", + "re" + ], + [ + "air", + "y" + ], + [ + "ai", + "ry" + ], + [ + "Data", + "Source" + ], + [ + "q", + "t" + ], + [ + "ic", + "iones" + ], + [ + "ici", + "ones" + ], + [ + "icio", + "nes" + ], + [ + "icion", + "es" + ], + [ + "▁res", + "pected" + ], + [ + "▁respect", + "ed" + ], + [ + "▁f", + "ram" + ], + [ + "▁fr", + "am" + ], + [ + "▁fra", + "m" + ], + [ + "▁spec", + "ialized" + ], + [ + "▁special", + "ized" + ], + [ + "▁prés", + "ent" + ], + [ + "▁pré", + "sent" + ], + [ + "Tur", + "n" + ], + [ + "T", + "urn" + ], + [ + "▁compl", + "aints" + ], + [ + "▁complain", + "ts" + ], + [ + "▁complaint", + "s" + ], + [ + "(\"", + "," + ], + [ + "(", + "\"," + ], + [ + "▁Rel", + "ated" + ], + [ + "▁Set", + "ting" + ], + [ + "▁", + "Setting" + ], + [ + "р", + "ю" + ], + [ + "▁s", + "ą" + ], + [ + "▁P", + "le" + ], + [ + "▁Pl", + "e" + ], + [ + "▁d", + "isse" + ], + [ + "▁dis", + "se" + ], + [ + "▁diss", + "e" + ], + [ + "ca", + "ps" + ], + [ + "cap", + "s" + ], + [ + "c", + "aps" + ], + [ + "▁C", + "ash" + ], + [ + "▁Cas", + "h" + ], + [ + "▁Ca", + "sh" + ], + [ + "▁cons", + "umed" + ], + [ + "▁consum", + "ed" + ], + [ + "▁consume", + "d" + ], + [ + "▁l", + "b" + ], + [ + "▁", + "lb" + ], + [ + "Ad", + "just" + ], + [ + "Ser", + "ialize" + ], + [ + "Serial", + "ize" + ], + [ + "S", + "erialize" + ], + [ + 
"is", + "y" + ], + [ + "i", + "sy" + ], + [ + "▁pat", + "ent" + ], + [ + "▁vis", + "ibility" + ], + [ + "▁S", + "ach" + ], + [ + "▁Sa", + "ch" + ], + [ + "▁Sac", + "h" + ], + [ + "ün", + "st" + ], + [ + "▁cy", + "ber" + ], + [ + "▁Bl", + "ake" + ], + [ + "▁Bl", + "oom" + ], + [ + "▁Blo", + "om" + ], + [ + "▁Sh", + "ah" + ], + [ + "▁Sha", + "h" + ], + [ + "PO", + "WER" + ], + [ + "▁in", + "clusion" + ], + [ + "▁incl", + "usion" + ], + [ + "se", + "rie" + ], + [ + "ser", + "ie" + ], + [ + "s", + "erie" + ], + [ + "▁man", + "era" + ], + [ + "sec", + "onds" + ], + [ + "second", + "s" + ], + [ + "is", + "ches" + ], + [ + "isch", + "es" + ], + [ + "ische", + "s" + ], + [ + "isc", + "hes" + ], + [ + "▁C", + "andidate" + ], + [ + "W", + "D" + ], + [ + "op", + "ath" + ], + [ + "o", + "path" + ], + [ + "▁про", + "гра" + ], + [ + "▁efficient", + "ly" + ], + [ + "ap", + "ps" + ], + [ + "app", + "s" + ], + [ + "tool", + "bar" + ], + [ + "we", + "nd" + ], + [ + "wen", + "d" + ], + [ + "w", + "end" + ], + [ + "▁Ne", + "il" + ], + [ + "▁form", + "ats" + ], + [ + "▁format", + "s" + ], + [ + "▁forma", + "ts" + ], + [ + "▁T", + "emplate" + ], + [ + "▁Temp", + "late" + ], + [ + "▁", + "Template" + ], + [ + "▁min", + "istry" + ], + [ + "▁minist", + "ry" + ], + [ + "▁Char", + "acter" + ], + [ + "▁", + "Character" + ], + [ + "Un", + "iform" + ], + [ + "▁fon", + "ction" + ], + [ + "не", + "м" + ], + [ + "н", + "ем" + ], + [ + "Wh", + "ile" + ], + [ + "к", + "ва" + ], + [ + "рі", + "я" + ], + [ + "▁D", + "L" + ], + [ + "▁", + "DL" + ], + [ + "▁L", + "ayout" + ], + [ + "▁La", + "yout" + ], + [ + "▁Lay", + "out" + ], + [ + "▁", + "Layout" + ], + [ + "не", + "ние" + ], + [ + "▁c", + "aval" + ], + [ + "▁ca", + "val" + ], + [ + "▁cav", + "al" + ], + [ + "▁H", + "ob" + ], + [ + "▁Ho", + "b" + ], + [ + "SP", + "I" + ], + [ + "S", + "PI" + ], + [ + "▁h", + "ely" + ], + [ + "▁he", + "ly" + ], + [ + "▁hel", + "y" + ], + [ + "Dest", + "ination" + ], + [ + "),", + "\r" + ], + [ + ")", + ",\r" + ], + [ + "▁i", + "OS" + ], + [ + "▁ad", + "mission" + ], + [ + "▁adm", + "ission" + ], + [ + "▁c", + "ss" + ], + [ + "▁cs", + "s" + ], + [ + "▁", + "css" + ], + [ + "user", + "Id" + ], + [ + "um", + "bling" + ], + [ + "umb", + "ling" + ], + [ + "▁bo", + "oking" + ], + [ + "▁book", + "ing" + ], + [ + "▁COPY", + "RIGHT" + ], + [ + "▁b", + "land" + ], + [ + "▁bl", + "and" + ], + [ + "output", + "s" + ], + [ + "▁sub", + "mission" + ], + [ + "▁subm", + "ission" + ], + [ + "ti", + "t" + ], + [ + "t", + "it" + ], + [ + "fe", + "ctions" + ], + [ + "fect", + "ions" + ], + [ + "fection", + "s" + ], + [ + "fr", + "agment" + ], + [ + "frag", + "ment" + ], + [ + "▁fa", + "ç" + ], + [ + "▁Through", + "out" + ], + [ + "▁distingu", + "ished" + ], + [ + "▁distinguish", + "ed" + ], + [ + "▁ar", + "range" + ], + [ + "▁arr", + "ange" + ], + [ + "▁arrang", + "e" + ], + [ + "ume", + "ric" + ], + [ + "umer", + "ic" + ], + [ + "xf", + "e" + ], + [ + "x", + "fe" + ], + [ + "ip", + "age" + ], + [ + "ipa", + "ge" + ], + [ + "i", + "page" + ], + [ + "ер", + "жа" + ], + [ + "▁C", + "ars" + ], + [ + "▁Car", + "s" + ], + [ + "▁Ca", + "rs" + ], + [ + "▁P", + "AGE" + ], + [ + "▁PA", + "GE" + ], + [ + "▁", + "PAGE" + ], + [ + "▁a", + "unque" + ], + [ + "▁insert", + "ed" + ], + [ + "smith", + "y" + ], + [ + "AL", + "LOC" + ], + [ + "ALL", + "OC" + ], + [ + "RE", + "C" + ], + [ + "R", + "EC" + ], + [ + "▁B", + "ak" + ], + [ + "▁Ba", + "k" + ], + [ + "▁Str", + "ong" + ], + [ + "ac", + "hen" + ], + [ + "ach", + "en" + ], + [ + "ache", + "n" + ], + [ + "a", + "chen" + ], + 
[ + "▁Spec", + "ific" + ], + [ + "w", + "q" + ], + [ + "▁Д", + "у" + ], + [ + "MO", + "VE" + ], + [ + "▁mús", + "ica" + ], + [ + "▁C", + "ris" + ], + [ + "▁Cr", + "is" + ], + [ + "ea", + "u" + ], + [ + "e", + "au" + ], + [ + "▁F", + "orum" + ], + [ + "▁For", + "um" + ], + [ + "▁Fo", + "rum" + ], + [ + "li", + "sted" + ], + [ + "list", + "ed" + ], + [ + "l", + "isted" + ], + [ + ")\\", + "\\" + ], + [ + ")", + "\\\\" + ], + [ + "▁X", + "VI" + ], + [ + "▁XV", + "I" + ], + [ + "▁м", + "оло" + ], + [ + "▁мо", + "ло" + ], + [ + "/", + "$" + ], + [ + "Be", + "r" + ], + [ + "B", + "er" + ], + [ + "▁tact", + "ics" + ], + [ + "Form", + "atter" + ], + [ + "Format", + "ter" + ], + [ + "op", + "ens" + ], + [ + "ope", + "ns" + ], + [ + "open", + "s" + ], + [ + "▁r", + "h" + ], + [ + "▁", + "rh" + ], + [ + "▁t", + "ram" + ], + [ + "▁tr", + "am" + ], + [ + "▁tra", + "m" + ], + [ + "V", + "L" + ], + [ + "▁Pro", + "file" + ], + [ + "▁Prof", + "ile" + ], + [ + "▁", + "Profile" + ], + [ + "▁par", + "ish" + ], + [ + "▁Ray", + "mond" + ], + [ + "▁cont", + "empor" + ], + [ + "▁Pl", + "anning" + ], + [ + "▁Plan", + "ning" + ], + [ + "▁Ч", + "е" + ], + [ + "▁A", + "RM" + ], + [ + "▁AR", + "M" + ], + [ + "▁", + "ARM" + ], + [ + "▁des", + "ires" + ], + [ + "▁desire", + "s" + ], + [ + "k", + "v" + ], + [ + "O", + "s" + ], + [ + "▁m", + "iner" + ], + [ + "▁min", + "er" + ], + [ + "▁mi", + "ner" + ], + [ + "▁mine", + "r" + ], + [ + "▁qual", + "ify" + ], + [ + "ik", + "u" + ], + [ + "i", + "ku" + ], + [ + "▁der", + "ni" + ], + [ + "ol", + "óg" + ], + [ + "▁K", + "id" + ], + [ + "▁Ki", + "d" + ], + [ + "ane", + "an" + ], + [ + "▁Hol", + "land" + ], + [ + "▁Holl", + "and" + ], + [ + "Aut", + "om" + ], + [ + "Auto", + "m" + ], + [ + "▁Hamilton", + "ian" + ], + [ + "St", + "ation" + ], + [ + "Stat", + "ion" + ], + [ + "js", + "p" + ], + [ + "j", + "sp" + ], + [ + "▁YO", + "UR" + ], + [ + "▁YOU", + "R" + ], + [ + "▁Th", + "ailand" + ], + [ + "effect", + "ive" + ], + [ + "п", + "ло" + ], + [ + "▁relie", + "ved" + ], + [ + "▁O", + "klahoma" + ], + [ + "▁Jul", + "ian" + ], + [ + "▁Juli", + "an" + ], + [ + "▁Julia", + "n" + ], + [ + "▁ind", + "ent" + ], + [ + "▁inde", + "nt" + ], + [ + "▁", + "indent" + ], + [ + "if", + "r" + ], + [ + "i", + "fr" + ], + [ + "пре", + "де" + ], + [ + "▁fl", + "ame" + ], + [ + "on", + "io" + ], + [ + "oni", + "o" + ], + [ + "o", + "nio" + ], + [ + "As", + "sign" + ], + [ + "Ass", + "ign" + ], + [ + "▁sh", + "ifts" + ], + [ + "▁shift", + "s" + ], + [ + "▁car", + "acter" + ], + [ + "▁caract", + "er" + ], + [ + "if", + "icates" + ], + [ + "ific", + "ates" + ], + [ + "ificate", + "s" + ], + [ + "ifica", + "tes" + ], + [ + "X", + "R" + ], + [ + "▁G", + "FP" + ], + [ + "▁GF", + "P" + ], + [ + "FE", + "ATURE" + ], + [ + "▁M", + "aine" + ], + [ + "▁Ma", + "ine" + ], + [ + "▁Main", + "e" + ], + [ + "▁Mai", + "ne" + ], + [ + "▁f", + "rank" + ], + [ + "▁fr", + "ank" + ], + [ + "▁al", + "igned" + ], + [ + "▁align", + "ed" + ], + [ + "▁", + "aligned" + ], + [ + "▁p", + "ří" + ], + [ + "▁př", + "í" + ], + [ + "Code", + "Attribute" + ], + [ + "▁M", + "AC" + ], + [ + "▁MA", + "C" + ], + [ + "▁", + "MAC" + ], + [ + "▁R", + "oot" + ], + [ + "▁Ro", + "ot" + ], + [ + "▁", + "Root" + ], + [ + "▁F", + "M" + ], + [ + "▁", + "FM" + ], + [ + "erv", + "ation" + ], + [ + "с", + "лі" + ], + [ + "▁s", + "hy" + ], + [ + "▁sh", + "y" + ], + [ + "▁partic", + "ul" + ], + [ + "▁parti", + "cul" + ], + [ + "pl", + "atz" + ], + [ + "▁hypothes", + "is" + ], + [ + "at", + "hol" + ], + [ + "ath", + "ol" + ], + [ + "s", + "With" + ], + [ 
+ "J", + "s" + ], + [ + "$", + "^{-" + ], + [ + "▁#!", + "/" + ], + [ + "▁l", + "emon" + ], + [ + "▁le", + "mon" + ], + [ + "▁a", + "bol" + ], + [ + "▁ab", + "ol" + ], + [ + "▁", + "abol" + ], + [ + "▁Mil", + "an" + ], + [ + "▁Mi", + "lan" + ], + [ + "an", + "ten" + ], + [ + "ant", + "en" + ], + [ + "ante", + "n" + ], + [ + "a", + "nten" + ], + [ + "▁s", + "ia" + ], + [ + "▁si", + "a" + ], + [ + "ri", + "as" + ], + [ + "ria", + "s" + ], + [ + "r", + "ias" + ], + [ + "▁con", + "sid" + ], + [ + "▁cons", + "id" + ], + [ + "as", + "so" + ], + [ + "ass", + "o" + ], + [ + "ain", + "ers" + ], + [ + "ai", + "ners" + ], + [ + "ainer", + "s" + ], + [ + "aine", + "rs" + ], + [ + "▁cir", + "ca" + ], + [ + "▁circ", + "a" + ], + [ + "re", + "try" + ], + [ + "ret", + "ry" + ], + [ + "r", + "etry" + ], + [ + "▁nue", + "vo" + ], + [ + "const", + "ants" + ], + [ + "constant", + "s" + ], + [ + "▁Med", + "iterr" + ], + [ + "▁Turk", + "ish" + ], + [ + "ion", + "en" + ], + [ + "io", + "nen" + ], + [ + "ione", + "n" + ], + [ + "i", + "onen" + ], + [ + "c", + "rypto" + ], + [ + "▁ev", + "olved" + ], + [ + "▁\"", + "", + "?" + ], + [ + "▁p", + "úblic" + ], + [ + "▁comp", + "rend" + ], + [ + "▁compre", + "nd" + ], + [ + "▁compr", + "end" + ], + [ + "al", + "lo" + ], + [ + "all", + "o" + ], + [ + "zo", + "om" + ], + [ + "z", + "oom" + ], + [ + "▁dat", + "etime" + ], + [ + "▁date", + "time" + ], + [ + "▁", + "datetime" + ], + [ + "▁mond", + "iale" + ], + [ + "ма", + "т" + ], + [ + "м", + "ат" + ], + [ + "▁M", + "ask" + ], + [ + "▁Ma", + "sk" + ], + [ + "▁Mas", + "k" + ], + [ + "▁", + "Mask" + ], + [ + "▁p", + "row" + ], + [ + "▁pro", + "w" + ], + [ + "▁pr", + "ow" + ], + [ + "▁belong", + "ing" + ], + [ + "+", + "'" + ], + [ + "OUT", + "PUT" + ], + [ + "▁G", + "rab" + ], + [ + "▁Gr", + "ab" + ], + [ + "▁Gra", + "b" + ], + [ + "M", + "ir" + ], + [ + "▁accommod", + "ate" + ], + [ + "▁$", + "('#" + ], + [ + "▁", + "$('#" + ], + [ + "▁Lou", + "ise" + ], + [ + "▁Louis", + "e" + ], + [ + "▁da", + "mit" + ], + [ + "▁dam", + "it" + ], + [ + "}'", + "," + ], + [ + "}", + "'," + ], + [ + "scri", + "pts" + ], + [ + "script", + "s" + ], + [ + "sn", + "apshot" + ], + [ + "snap", + "shot" + ], + [ + "▁sh", + "itty" + ], + [ + "▁shit", + "ty" + ], + [ + "▁y", + "o" + ], + [ + "▁", + "yo" + ], + [ + "▁belie", + "ving" + ], + [ + "▁inhabit", + "ants" + ], + [ + "W", + "P" + ], + [ + "▁Colomb", + "ia" + ], + [ + "li", + "sts" + ], + [ + "list", + "s" + ], + [ + "l", + "ists" + ], + [ + "▁Mur", + "phy" + ], + [ + "Data", + "set" + ], + [ + "Dat", + "aset" + ], + [ + "▁(!", + "$" + ], + [ + "▁tremend", + "ous" + ], + [ + "▁se", + "ñ" + ], + [ + "▁S", + "ed" + ], + [ + "▁Se", + "d" + ], + [ + "▁sw", + "allowed" + ], + [ + "▁swallow", + "ed" + ], + [ + "om", + "p" + ], + [ + "o", + "mp" + ], + [ + "▁L", + "ate" + ], + [ + "▁La", + "te" + ], + [ + "▁Lat", + "e" + ], + [ + "▁an", + "ys" + ], + [ + "▁any", + "s" + ], + [ + "▁dead", + "ly" + ], + [ + "fol", + "low" + ], + [ + "f", + "ollow" + ], + [ + "▁A", + "nc" + ], + [ + "▁An", + "c" + ], + [ + "▁h", + "w" + ], + [ + "▁", + "hw" + ], + [ + "wik", + "ipedia" + ], + [ + "ic", + "ts" + ], + [ + "ict", + "s" + ], + [ + "▁Al", + "aska" + ], + [ + "▁sc", + "ary" + ], + [ + "▁scar", + "y" + ], + [ + "▁second", + "o" + ], + [ + "▁sec", + "ondo" + ], + [ + "▁her", + "oes" + ], + [ + "▁hero", + "es" + ], + [ + "▁veter", + "an" + ], + [ + "▁behav", + "iors" + ], + [ + "▁behavior", + "s" + ], + [ + "▁behavi", + "ors" + ], + [ + "-", + "%" + ], + [ + "▁E", + "z" + ], + [ + "▁с", + "і" + ], + [ + "▁", + 
"сі" + ], + [ + "tik", + "z" + ], + [ + "▁spect", + "acular" + ], + [ + "▁Ch", + "ron" + ], + [ + "▁(", + "@" + ], + [ + "▁", + "(@" + ], + [ + "▁de", + "mo" + ], + [ + "▁dem", + "o" + ], + [ + "▁", + "demo" + ], + [ + "▁ser", + "ialized" + ], + [ + "▁serial", + "ized" + ], + [ + "▁In", + "depend" + ], + [ + "▁Indep", + "end" + ], + [ + "BU", + "ILD" + ], + [ + "fail", + "ure" + ], + [ + "▁P", + "ORT" + ], + [ + "▁PO", + "RT" + ], + [ + "▁", + "PORT" + ], + [ + "ю", + "чи" + ], + [ + "▁med", + "itation" + ], + [ + "sample", + "s" + ], + [ + "sam", + "ples" + ], + [ + "s", + "amples" + ], + [ + "i", + "ão" + ], + [ + "▁Ни", + "кола" + ], + [ + "▁я", + "зы" + ], + [ + "▁Tr", + "uth" + ], + [ + "▁Tru", + "th" + ], + [ + "▁co", + "efficient" + ], + [ + "▁coeff", + "icient" + ], + [ + "sl", + "ug" + ], + [ + "▁XV", + "III" + ], + [ + "▁XVI", + "II" + ], + [ + "▁XVII", + "I" + ], + [ + "ia", + "o" + ], + [ + "i", + "ao" + ], + [ + "de", + "ck" + ], + [ + "dec", + "k" + ], + [ + "▁раз", + "ви" + ], + [ + "▁ad", + "oles" + ], + [ + "ar", + "ius" + ], + [ + "ari", + "us" + ], + [ + "▁H", + "az" + ], + [ + "▁Ha", + "z" + ], + [ + "▁Pro", + "test" + ], + [ + "▁Prote", + "st" + ], + [ + "ra", + "de" + ], + [ + "rad", + "e" + ], + [ + "r", + "ade" + ], + [ + "не", + "ния" + ], + [ + "▁cl", + "ause" + ], + [ + "conne", + "ctor" + ], + [ + "connect", + "or" + ], + [ + "conn", + "ector" + ], + [ + "RA", + "TE" + ], + [ + "R", + "ATE" + ], + [ + "ц", + "ю" + ], + [ + "▁Conne", + "cticut" + ], + [ + "V", + "S" + ], + [ + "abul", + "ary" + ], + [ + "HO", + "W" + ], + [ + "▁d", + "elen" + ], + [ + "▁de", + "len" + ], + [ + "▁del", + "en" + ], + [ + "▁su", + "ited" + ], + [ + "▁suit", + "ed" + ], + [ + "▁suite", + "d" + ], + [ + "▁Sur", + "vey" + ], + [ + "ze", + "c" + ], + [ + "z", + "ec" + ], + [ + "ți", + "i" + ], + [ + "ț", + "ii" + ], + [ + "▁b", + "acks" + ], + [ + "▁back", + "s" + ], + [ + "▁ba", + "cks" + ], + [ + "▁", + "backs" + ], + [ + "com", + "merce" + ], + [ + "▁And", + "rea" + ], + [ + "▁Andre", + "a" + ], + [ + "▁Andr", + "ea" + ], + [ + "▁propag", + "anda" + ], + [ + "iz", + "ioni" + ], + [ + "izi", + "oni" + ], + [ + "izio", + "ni" + ], + [ + "▁B", + "il" + ], + [ + "▁Bi", + "l" + ], + [ + "▁In", + "nov" + ], + [ + "▁Inn", + "ov" + ], + [ + "▁forg", + "ive" + ], + [ + "▁oper", + "ates" + ], + [ + "▁operate", + "s" + ], + [ + "▁opera", + "tes" + ], + [ + "ч", + "ний" + ], + [ + "▁l", + "ingu" + ], + [ + "▁lin", + "gu" + ], + [ + "▁ling", + "u" + ], + [ + "▁c", + "ollar" + ], + [ + "▁col", + "lar" + ], + [ + "▁coll", + "ar" + ], + [ + "до", + "л" + ], + [ + "сі", + "й" + ], + [ + "zt", + "en" + ], + [ + "zte", + "n" + ], + [ + "z", + "ten" + ], + [ + "im", + "at" + ], + [ + "ima", + "t" + ], + [ + "i", + "mat" + ], + [ + "▁sh", + "oe" + ], + [ + "ge", + "nder" + ], + [ + "gen", + "der" + ], + [ + "g", + "ender" + ], + [ + "▁leg", + "ally" + ], + [ + "▁legal", + "ly" + ], + [ + "RO", + "P" + ], + [ + "R", + "OP" + ], + [ + "▁S", + "leep" + ], + [ + "deleg", + "ate" + ], + [ + "ID", + "s" + ], + [ + "▁build", + "s" + ], + [ + "▁qu", + "er" + ], + [ + "▁que", + "r" + ], + [ + "▁q", + "uer" + ], + [ + "▁", + "quer" + ], + [ + "uls", + "ion" + ], + [ + ".", + "“" + ], + [ + "к", + "ло" + ], + [ + "ri", + "se" + ], + [ + "ris", + "e" + ], + [ + "r", + "ise" + ], + [ + "th", + "ink" + ], + [ + "К", + "о" + ], + [ + "▁bacter", + "ia" + ], + [ + "▁magn", + "ific" + ], + [ + "▁prison", + "er" + ], + [ + "Cl", + "ock" + ], + [ + "C", + "lock" + ], + [ + "R", + "B" + ], + [ + "ú", + "t" + ], + [ + "▁L", + 
"iz" + ], + [ + "▁Li", + "z" + ], + [ + "gr", + "a" + ], + [ + "g", + "ra" + ], + [ + "▁And", + "ré" + ], + [ + "▁Andr", + "é" + ], + [ + "▁D", + "ennis" + ], + [ + "▁Den", + "nis" + ], + [ + "▁sur", + "ge" + ], + [ + "▁surg", + "e" + ], + [ + "ex", + "isting" + ], + [ + "exist", + "ing" + ], + [ + "▁W", + "ald" + ], + [ + "▁Wal", + "d" + ], + [ + "▁Wa", + "ld" + ], + [ + "▁S", + "chema" + ], + [ + "▁Sch", + "ema" + ], + [ + "▁Sche", + "ma" + ], + [ + "▁", + "Schema" + ], + [ + "▁war", + "nings" + ], + [ + "▁warn", + "ings" + ], + [ + "▁warning", + "s" + ], + [ + "▁qu", + "adr" + ], + [ + "▁quad", + "r" + ], + [ + "at", + "te" + ], + [ + "att", + "e" + ], + [ + "▁E", + "ins" + ], + [ + "▁Ein", + "s" + ], + [ + "▁ad", + "option" + ], + [ + "▁adopt", + "ion" + ], + [ + "▁w", + "anna" + ], + [ + "▁de", + "rive" + ], + [ + "▁der", + "ive" + ], + [ + "▁deriv", + "e" + ], + [ + "▁", + "derive" + ], + [ + "▁a", + "rena" + ], + [ + "▁are", + "na" + ], + [ + "▁ar", + "ena" + ], + [ + "▁aren", + "a" + ], + [ + "▁Den", + "ver" + ], + [ + "▁F", + "i" + ], + [ + "▁", + "Fi" + ], + [ + "▁Jess", + "ica" + ], + [ + "acy", + "j" + ], + [ + "R", + "atio" + ], + [ + "▁которы", + "е" + ], + [ + "▁Act", + "ivity" + ], + [ + "▁Activ", + "ity" + ], + [ + "▁", + "Activity" + ], + [ + "em", + "u" + ], + [ + "e", + "mu" + ], + [ + "▁St", + "alin" + ], + [ + "▁Sta", + "lin" + ], + [ + "ag", + "gi" + ], + [ + "agg", + "i" + ], + [ + "a", + "ggi" + ], + [ + "▁f", + "ün" + ], + [ + "▁f", + "ils" + ], + [ + "▁fil", + "s" + ], + [ + "▁fi", + "ls" + ], + [ + "aj", + "u" + ], + [ + "a", + "ju" + ], + [ + "card", + "s" + ], + [ + "car", + "ds" + ], + [ + "c", + "ards" + ], + [ + "▁att", + "raction" + ], + [ + "▁attract", + "ion" + ], + [ + "▁attr", + "action" + ], + [ + "▁attra", + "ction" + ], + [ + "od", + "ot" + ], + [ + "odo", + "t" + ], + [ + "o", + "dot" + ], + [ + "F", + "at" + ], + [ + "▁H", + "aven" + ], + [ + "▁Ha", + "ven" + ], + [ + "▁Have", + "n" + ], + [ + "▁Hav", + "en" + ], + [ + "▁nine", + "teenth" + ], + [ + "▁ninete", + "enth" + ], + [ + "▁*", + "*\"" + ], + [ + "▁**", + "\"" + ], + [ + "▁m", + "aggio" + ], + [ + "▁mag", + "gio" + ], + [ + "ma", + "ny" + ], + [ + "man", + "y" + ], + [ + "m", + "any" + ], + [ + "win", + "ning" + ], + [ + "▁G", + "A" + ], + [ + "▁", + "GA" + ], + [ + "▁d", + "ummy" + ], + [ + "▁", + "dummy" + ], + [ + "Un", + "able" + ], + [ + "en", + "ci" + ], + [ + "enc", + "i" + ], + [ + "ère", + "nt" + ], + [ + "è", + "rent" + ], + [ + "Im", + "g" + ], + [ + "I", + "mg" + ], + [ + "▁t", + "ob" + ], + [ + "▁to", + "b" + ], + [ + "DI", + "P" + ], + [ + "D", + "IP" + ], + [ + "S", + "ince" + ], + [ + "▁Sa", + "fe" + ], + [ + "▁Saf", + "e" + ], + [ + "▁", + "Safe" + ], + [ + "Gu", + "ard" + ], + [ + "is", + "ure" + ], + [ + "i", + "sure" + ], + [ + "port", + "e" + ], + [ + "por", + "te" + ], + [ + "p", + "orte" + ], + [ + "▁stad", + "ium" + ], + [ + "in", + "di" + ], + [ + "ind", + "i" + ], + [ + "▁App", + "arently" + ], + [ + "ug", + "no" + ], + [ + "▁w", + "olf" + ], + [ + "▁ne", + "ces" + ], + [ + "▁overse", + "as" + ], + [ + "of", + "s" + ], + [ + "o", + "fs" + ], + [ + "ar", + "el" + ], + [ + "are", + "l" + ], + [ + "a", + "rel" + ], + [ + "▁F", + "ine" + ], + [ + "▁Fin", + "e" + ], + [ + "▁Fi", + "ne" + ], + [ + "▁cor", + "rupt" + ], + [ + "▁n", + "ovember" + ], + [ + "▁nov", + "ember" + ], + [ + "▁nove", + "mber" + ], + [ + "▁interpret", + "ed" + ], + [ + "ib", + "ile" + ], + [ + "ibil", + "e" + ], + [ + "▁w", + "ages" + ], + [ + "▁wa", + "ges" + ], + [ + "▁wage", + "s" + ], + [ + 
"▁Pre", + "tty" + ], + [ + "▁Her", + "bert" + ], + [ + "▁reg", + "istr" + ], + [ + "вы", + "м" + ], + [ + "an", + "swer" + ], + [ + "ans", + "wer" + ], + [ + "▁m", + "orte" + ], + [ + "▁mor", + "te" + ], + [ + "▁mort", + "e" + ], + [ + "▁com", + "posite" + ], + [ + "▁compos", + "ite" + ], + [ + "Tool", + "bar" + ], + [ + "▁iter", + "ator" + ], + [ + "▁", + "iterator" + ], + [ + "ant", + "ine" + ], + [ + "anti", + "ne" + ], + [ + "▁init", + "ialized" + ], + [ + "▁initial", + "ized" + ], + [ + "▁initialize", + "d" + ], + [ + "▁", + "initialized" + ], + [ + "▁poor", + "ly" + ], + [ + "Access", + "or" + ], + [ + "▁Han", + "nah" + ], + [ + "▁Hann", + "ah" + ], + [ + "▁то", + "лько" + ], + [ + "ol", + "an" + ], + [ + "ola", + "n" + ], + [ + "o", + "lan" + ], + [ + "▁o", + "tto" + ], + [ + "▁ot", + "to" + ], + [ + "▁ott", + "o" + ], + [ + "▁", + "otto" + ], + [ + "▁str", + "ikes" + ], + [ + "▁stri", + "kes" + ], + [ + "▁strike", + "s" + ], + [ + "▁conflict", + "s" + ], + [ + "▁conflic", + "ts" + ], + [ + "▁s", + "urg" + ], + [ + "▁su", + "rg" + ], + [ + "▁sur", + "g" + ], + [ + "▁histor", + "ian" + ], + [ + "▁historia", + "n" + ], + [ + "wo", + "man" + ], + [ + "w", + "oman" + ], + [ + "▁l", + "ibraries" + ], + [ + "be", + "w" + ], + [ + "b", + "ew" + ], + [ + ")-", + "-(" + ], + [ + ")--", + "(" + ], + [ + "ga", + "ther" + ], + [ + "g", + "ather" + ], + [ + "▁L", + "ip" + ], + [ + "▁Li", + "p" + ], + [ + "▁f", + "ict" + ], + [ + "▁fi", + "ct" + ], + [ + "FIL", + "TER" + ], + [ + "@", + "{" + ], + [ + "▁bl", + "essed" + ], + [ + "▁bless", + "ed" + ], + [ + "et", + "ics" + ], + [ + "etic", + "s" + ], + [ + "eti", + "cs" + ], + [ + "▁f", + "ork" + ], + [ + "▁for", + "k" + ], + [ + "▁Me", + "tal" + ], + [ + "▁Met", + "al" + ], + [ + "▁Meta", + "l" + ], + [ + "po", + "lation" + ], + [ + "pol", + "ation" + ], + [ + "p", + "olation" + ], + [ + "▁negoti", + "ations" + ], + [ + "▁gen", + "us" + ], + [ + "▁genu", + "s" + ], + [ + "▁cont", + "rolling" + ], + [ + "▁control", + "ling" + ], + [ + "VER", + "T" + ], + [ + "VE", + "RT" + ], + [ + "V", + "ERT" + ], + [ + "▁P", + "erry" + ], + [ + "▁Per", + "ry" + ], + [ + "▁S", + "PD" + ], + [ + "▁SP", + "D" + ], + [ + "CA", + "SE" + ], + [ + "C", + "ASE" + ], + [ + "т", + "вер" + ], + [ + "▁C", + "rown" + ], + [ + "▁Cr", + "own" + ], + [ + "▁Cro", + "wn" + ], + [ + "▁Crow", + "n" + ], + [ + "▁ind", + "ul" + ], + [ + "▁indu", + "l" + ], + [ + "▁e", + "hemal" + ], + [ + "▁ampl", + "itude" + ], + [ + "▁amplit", + "ude" + ], + [ + "▁B", + "ach" + ], + [ + "▁Ba", + "ch" + ], + [ + "▁phot", + "ographer" + ], + [ + "▁photograph", + "er" + ], + [ + "n", + "ý" + ], + [ + "▁inv", + "ested" + ], + [ + "▁invest", + "ed" + ], + [ + "▁P", + "arte" + ], + [ + "▁Par", + "te" + ], + [ + "▁Part", + "e" + ], + [ + "▁pro", + "long" + ], + [ + "C", + "U" + ], + [ + "icht", + "et" + ], + [ + "ichte", + "t" + ], + [ + "res", + "ume" + ], + [ + "▁c", + "arb" + ], + [ + "▁car", + "b" + ], + [ + "▁ca", + "rb" + ], + [ + "ur", + "st" + ], + [ + "urs", + "t" + ], + [ + "▁N", + "ixon" + ], + [ + "▁n", + "eur" + ], + [ + "▁ne", + "ur" + ], + [ + "▁neu", + "r" + ], + [ + "▁", + "neur" + ], + [ + "▁corpor", + "ations" + ], + [ + "▁corporation", + "s" + ], + [ + "Op", + "s" + ], + [ + "O", + "ps" + ], + [ + "u", + "u" + ], + [ + "l", + "m" + ], + [ + "ap", + "ple" + ], + [ + "app", + "le" + ], + [ + "ch", + "te" + ], + [ + "cht", + "e" + ], + [ + "▁deliber", + "ately" + ], + [ + "ber", + "e" + ], + [ + "be", + "re" + ], + [ + "b", + "ere" + ], + [ + "▁fe", + "br" + ], + [ + "▁provinc", + "ia" + 
], + [ + "▁provin", + "cia" + ], + [ + "Over", + "flow" + ], + [ + "▁E", + "ight" + ], + [ + "▁ind", + "ication" + ], + [ + "▁indic", + "ation" + ], + [ + "▁pist", + "ol" + ], + [ + "▁к", + "ре" + ], + [ + "▁", + "кре" + ], + [ + "oc", + "ial" + ], + [ + "oci", + "al" + ], + [ + "o", + "cial" + ], + [ + "▁r", + "und" + ], + [ + "▁run", + "d" + ], + [ + "▁ru", + "nd" + ], + [ + "▁", + "rund" + ], + [ + "▁se", + "hr" + ], + [ + "ok", + "at" + ], + [ + "oka", + "t" + ], + [ + "o", + "kat" + ], + [ + "ül", + "et" + ], + [ + "ü", + "let" + ], + [ + "▁He", + "at" + ], + [ + "Н", + "а" + ], + [ + "▁о", + "дин" + ], + [ + "▁од", + "ин" + ], + [ + "IC", + "S" + ], + [ + "I", + "CS" + ], + [ + "ay", + "e" + ], + [ + "a", + "ye" + ], + [ + "▁eight", + "een" + ], + [ + "▁t", + "ug" + ], + [ + "▁tu", + "g" + ], + [ + "LO", + "T" + ], + [ + "L", + "OT" + ], + [ + "▁L", + "ar" + ], + [ + "▁La", + "r" + ], + [ + "ning", + "s" + ], + [ + "n", + "ings" + ], + [ + "▁T", + "odd" + ], + [ + "▁To", + "dd" + ], + [ + "▁Tod", + "d" + ], + [ + "▁organis", + "ations" + ], + [ + "▁organisation", + "s" + ], + [ + "▁g", + "enes" + ], + [ + "▁gen", + "es" + ], + [ + "▁ge", + "nes" + ], + [ + "▁gene", + "s" + ], + [ + "B", + "ag" + ], + [ + "Ke", + "ep" + ], + [ + "^{", + "+" + ], + [ + "Base", + "d" + ], + [ + "Bas", + "ed" + ], + [ + "B", + "ased" + ], + [ + "sk", + "in" + ], + [ + "ski", + "n" + ], + [ + "s", + "kin" + ], + [ + "▁to", + "das" + ], + [ + "▁tod", + "as" + ], + [ + "▁toda", + "s" + ], + [ + "▁illustr", + "ated" + ], + [ + "▁c", + "f" + ], + [ + "▁", + "cf" + ], + [ + "▁ar", + "riving" + ], + [ + "▁arriv", + "ing" + ], + [ + "▁arr", + "iving" + ], + [ + "▁excess", + "ive" + ], + [ + "▁tra", + "its" + ], + [ + "▁trait", + "s" + ], + [ + "▁s", + "ank" + ], + [ + "▁san", + "k" + ], + [ + "▁Att", + "ribute" + ], + [ + "▁", + "Attribute" + ], + [ + "▁G", + "D" + ], + [ + "▁", + "GD" + ], + [ + "com", + "par" + ], + [ + "comp", + "ar" + ], + [ + "▁dent", + "ro" + ], + [ + "br", + "is" + ], + [ + "b", + "ris" + ], + [ + "▁at", + "oms" + ], + [ + "▁atom", + "s" + ], + [ + "fr", + "ed" + ], + [ + "fre", + "d" + ], + [ + "f", + "red" + ], + [ + "▁E", + "val" + ], + [ + "▁Ev", + "al" + ], + [ + "▁Eva", + "l" + ], + [ + "▁", + "Eval" + ], + [ + "▁di", + "stances" + ], + [ + "▁dist", + "ances" + ], + [ + "▁distance", + "s" + ], + [ + "st", + "aw" + ], + [ + "sta", + "w" + ], + [ + "краї", + "н" + ], + [ + "vari", + "ables" + ], + [ + "variable", + "s" + ], + [ + "l", + "c" + ], + [ + "на", + "ли" + ], + [ + "нал", + "и" + ], + [ + "н", + "али" + ], + [ + "▁чемпи", + "она" + ], + [ + "wi", + "j" + ], + [ + "w", + "ij" + ], + [ + "▁Sim", + "ilar" + ], + [ + "je", + "k" + ], + [ + "j", + "ek" + ], + [ + "Pe", + "t" + ], + [ + "P", + "et" + ], + [ + "=\"", + "$" + ], + [ + "ко", + "то" + ], + [ + "▁R", + "ang" + ], + [ + "▁Ra", + "ng" + ], + [ + "▁Ran", + "g" + ], + [ + "ion", + "ato" + ], + [ + "iona", + "to" + ], + [ + "▁bek", + "annt" + ], + [ + "▁bekan", + "nt" + ], + [ + "!", + "*" + ], + [ + "Li", + "m" + ], + [ + "L", + "im" + ], + [ + "▁concl", + "usions" + ], + [ + "▁conclusion", + "s" + ], + [ + "ain", + "te" + ], + [ + "ai", + "nte" + ], + [ + "aint", + "e" + ], + [ + "a", + "inte" + ], + [ + "-", + "," + ], + [ + "▁g", + "ł" + ], + [ + "▁pass", + "ive" + ], + [ + "▁Ga", + "ussian" + ], + [ + "▁stag", + "ione" + ], + [ + "ME", + "DI" + ], + [ + "MED", + "I" + ], + [ + "it", + "ol" + ], + [ + "ito", + "l" + ], + [ + "i", + "tol" + ], + [ + "▁Jer", + "emy" + ], + [ + "View", + "s" + ], + [ + "class", + "List" + 
], + [ + "▁desper", + "ately" + ], + [ + "▁desperate", + "ly" + ], + [ + "▁ver", + "l" + ], + [ + "▁ve", + "rl" + ], + [ + "br", + "ace" + ], + [ + "bra", + "ce" + ], + [ + "N", + "P" + ], + [ + "▁c", + "ob" + ], + [ + "▁co", + "b" + ], + [ + "▁A", + "rist" + ], + [ + "▁Ar", + "ist" + ], + [ + "▁Ari", + "st" + ], + [ + "da", + "p" + ], + [ + "d", + "ap" + ], + [ + "Fil", + "ters" + ], + [ + "Filter", + "s" + ], + [ + "'=>", + "'" + ], + [ + "ul", + "tan" + ], + [ + "ult", + "an" + ], + [ + "▁F", + "actory" + ], + [ + "▁", + "Factory" + ], + [ + "è", + "le" + ], + [ + "▁l", + "asting" + ], + [ + "▁last", + "ing" + ], + [ + "▁las", + "ting" + ], + [ + "▁element", + "ary" + ], + [ + "▁C", + "M" + ], + [ + "▁", + "CM" + ], + [ + "▁Louis", + "iana" + ], + [ + "▁p", + "ov" + ], + [ + "▁po", + "v" + ], + [ + "PC", + "I" + ], + [ + "P", + "CI" + ], + [ + "è", + "de" + ], + [ + "▁P", + "ink" + ], + [ + "▁Pin", + "k" + ], + [ + "▁Br", + "uno" + ], + [ + "▁Bru", + "no" + ], + [ + "▁Brun", + "o" + ], + [ + "▁Y", + "ellow" + ], + [ + "▁ev", + "angel" + ], + [ + "▁lik", + "elihood" + ], + [ + "WID", + "TH" + ], + [ + "▁$", + "-" + ], + [ + "▁", + "$-" + ], + [ + "ni", + "co" + ], + [ + "nic", + "o" + ], + [ + "n", + "ico" + ], + [ + "hu", + "i" + ], + [ + "h", + "ui" + ], + [ + "ak", + "ter" + ], + [ + "akt", + "er" + ], + [ + "akte", + "r" + ], + [ + "ne", + "urs" + ], + [ + "neur", + "s" + ], + [ + "n", + "eurs" + ], + [ + "▁bre", + "eze" + ], + [ + "▁bree", + "ze" + ], + [ + "▁со", + "ста" + ], + [ + "▁He", + "ader" + ], + [ + "▁Head", + "er" + ], + [ + "▁", + "Header" + ], + [ + "om", + "rå" + ], + [ + "▁D", + "ylan" + ], + [ + "▁Dy", + "lan" + ], + [ + "▁Bi", + "ographie" + ], + [ + "▁Univers", + "ität" + ], + [ + "on", + "so" + ], + [ + "ons", + "o" + ], + [ + "HAND", + "LE" + ], + [ + "J", + "ournal" + ], + [ + "ea", + "st" + ], + [ + "e", + "ast" + ], + [ + "▁sup", + "pliers" + ], + [ + "▁supplier", + "s" + ], + [ + "▁table", + "t" + ], + [ + "▁tab", + "let" + ], + [ + "LI", + "C" + ], + [ + "L", + "IC" + ], + [ + "PER", + "TY" + ], + [ + "ї", + "в" + ], + [ + "▁z", + "aw" + ], + [ + "▁za", + "w" + ], + [ + "▁su", + "bm" + ], + [ + "▁sub", + "m" + ], + [ + "▁Fern", + "ando" + ], + [ + "▁nou", + "velle" + ], + [ + "▁nouve", + "lle" + ], + [ + "▁Point", + "s" + ], + [ + "▁", + "Points" + ], + [ + "▁str", + "angers" + ], + [ + "▁strange", + "rs" + ], + [ + "▁stranger", + "s" + ], + [ + "▁strang", + "ers" + ], + [ + "Component", + "Model" + ], + [ + "ist", + "ro" + ], + [ + "istr", + "o" + ], + [ + "au", + "rus" + ], + [ + "aur", + "us" + ], + [ + "▁san", + "ct" + ], + [ + "▁о", + "дна" + ], + [ + "▁од", + "на" + ], + [ + "▁В", + "ы" + ], + [ + "▁о", + "на" + ], + [ + "▁он", + "а" + ], + [ + "▁", + "она" + ], + [ + "vert", + "ical" + ], + [ + "Sp", + "ring" + ], + [ + "▁Har", + "old" + ], + [ + "▁Back", + "ground" + ], + [ + "▁", + "Background" + ], + [ + "Bal", + "ance" + ], + [ + "Key", + "word" + ], + [ + "~$", + "\\" + ], + [ + "~", + "$\\" + ], + [ + "mal", + "loc" + ], + [ + "m", + "alloc" + ], + [ + "ORM", + "AL" + ], + [ + "Sk", + "ip" + ], + [ + "▁Mu", + "ham" + ], + [ + "▁back", + "wards" + ], + [ + "▁backward", + "s" + ], + [ + "c", + "ów" + ], + [ + "по", + "зи" + ], + [ + "▁back", + "end" + ], + [ + "▁", + "backend" + ], + [ + "▁de", + "emed" + ], + [ + "▁accur", + "ately" + ], + [ + "▁accurate", + "ly" + ], + [ + "▁trans", + "c" + ], + [ + "▁Broad", + "way" + ], + [ + "▁g", + "rud" + ], + [ + "▁gr", + "ud" + ], + [ + "▁gru", + "d" + ], + [ + "▁N", + "amen" + ], + [ + "▁Name", + "n" + 
], + [ + "▁Na", + "men" + ], + [ + "▁Nam", + "en" + ], + [ + "▁sh", + "ifting" + ], + [ + "▁shift", + "ing" + ], + [ + "▁ment", + "ally" + ], + [ + "▁mental", + "ly" + ], + [ + "▁cal", + "ories" + ], + [ + "▁cons", + "ensus" + ], + [ + "Perm", + "issions" + ], + [ + "Permission", + "s" + ], + [ + "▁ob", + "jet" + ], + [ + "▁obj", + "et" + ], + [ + "▁elabor", + "ate" + ], + [ + "at", + "ts" + ], + [ + "att", + "s" + ], + [ + "▁sn", + "ake" + ], + [ + "▁ref", + "res" + ], + [ + "▁refr", + "es" + ], + [ + "ar", + "u" + ], + [ + "a", + "ru" + ], + [ + "▁reflect", + "s" + ], + [ + "oun", + "ge" + ], + [ + "o", + "unge" + ], + [ + "R", + "ank" + ], + [ + "▁K", + "urt" + ], + [ + "▁Kur", + "t" + ], + [ + "▁Ku", + "rt" + ], + [ + "▁p", + "ied" + ], + [ + "▁pie", + "d" + ], + [ + "▁pi", + "ed" + ], + [ + "▁exped", + "ition" + ], + [ + "V", + "el" + ], + [ + "▁O", + "wen" + ], + [ + "Le", + "ad" + ], + [ + "L", + "ead" + ], + [ + "▁utter", + "ly" + ], + [ + "▁Ar", + "be" + ], + [ + "▁bre", + "asts" + ], + [ + "▁breast", + "s" + ], + [ + "IP", + "S" + ], + [ + "I", + "PS" + ], + [ + "▁hung", + "er" + ], + [ + "▁hun", + "ger" + ], + [ + "at", + "em" + ], + [ + "ate", + "m" + ], + [ + "a", + "tem" + ], + [ + "▁vers", + "chied" + ], + [ + "▁versch", + "ied" + ], + [ + "▁Cam", + "era" + ], + [ + "▁", + "Camera" + ], + [ + "▁Mün", + "chen" + ], + [ + "iv", + "als" + ], + [ + "ival", + "s" + ], + [ + "iva", + "ls" + ], + [ + "i", + "vals" + ], + [ + "▁sp", + "raw" + ], + [ + "▁spr", + "aw" + ], + [ + "▁S", + "ü" + ], + [ + "▁Was", + "ser" + ], + [ + "▁mechan", + "ics" + ], + [ + "Load", + "ed" + ], + [ + "Lo", + "aded" + ], + [ + "db", + "c" + ], + [ + "d", + "bc" + ], + [ + "▁re", + "marks" + ], + [ + "▁rem", + "arks" + ], + [ + "▁remark", + "s" + ], + [ + "▁", + "remarks" + ], + [ + "▁}", + ")." + ], + [ + "▁})", + "." + ], + [ + "▁", + "})." 
+ ], + [ + "▁pain", + "ter" + ], + [ + "▁pa", + "inter" + ], + [ + "▁paint", + "er" + ], + [ + "▁h", + "aut" + ], + [ + "▁ha", + "ut" + ], + [ + "Mar", + "shal" + ], + [ + "IS", + "D" + ], + [ + "I", + "SD" + ], + [ + "▁ve", + "loc" + ], + [ + "▁vel", + "oc" + ], + [ + "▁In", + "cre" + ], + [ + "▁Inc", + "re" + ], + [ + "W", + "ar" + ], + [ + "▁ру", + "с" + ], + [ + "▁com", + "pte" + ], + [ + "▁comp", + "te" + ], + [ + "▁compt", + "e" + ], + [ + "ü", + "g" + ], + [ + "▁Def", + "inition" + ], + [ + "▁", + "Definition" + ], + [ + "▁G", + "am" + ], + [ + "▁Ga", + "m" + ], + [ + "▁H", + "ir" + ], + [ + "▁Hi", + "r" + ], + [ + "▁witness", + "ed" + ], + [ + "▁g", + "ren" + ], + [ + "▁gr", + "en" + ], + [ + "▁gre", + "n" + ], + [ + "▁", + "gren" + ], + [ + "▁hur", + "ry" + ], + [ + "ch", + "et" + ], + [ + "che", + "t" + ], + [ + "c", + "het" + ], + [ + "re", + "verse" + ], + [ + "G", + "F" + ], + [ + "▁Qu", + "arter" + ], + [ + "п", + "ла" + ], + [ + "▁s", + "ar" + ], + [ + "▁sa", + "r" + ], + [ + "sb", + "urg" + ], + [ + "sbur", + "g" + ], + [ + "s", + "burg" + ], + [ + "▁D", + "it" + ], + [ + "▁Di", + "t" + ], + [ + "▁", + "Dit" + ], + [ + "▁Arn", + "old" + ], + [ + "j", + "k" + ], + [ + "▁l", + "ambda" + ], + [ + "▁", + "lambda" + ], + [ + "è", + "ge" + ], + [ + "▁o", + "z" + ], + [ + "▁", + "oz" + ], + [ + "▁h", + "ans" + ], + [ + "▁ha", + "ns" + ], + [ + "▁han", + "s" + ], + [ + "▁answ", + "ering" + ], + [ + "▁answer", + "ing" + ], + [ + "▁o", + "live" + ], + [ + "▁ol", + "ive" + ], + [ + "▁sp", + "ont" + ], + [ + "▁spo", + "nt" + ], + [ + "▁inter", + "vals" + ], + [ + "▁interval", + "s" + ], + [ + ">", + "@" + ], + [ + "▁т", + "ран" + ], + [ + "▁тра", + "н" + ], + [ + "▁F", + "ocus" + ], + [ + "▁", + "Focus" + ], + [ + "ч", + "них" + ], + [ + "▁д", + "ви" + ], + [ + "▁tri", + "angle" + ], + [ + "▁r", + "ally" + ], + [ + "▁P", + "unk" + ], + [ + "▁Pun", + "k" + ], + [ + "▁G", + "and" + ], + [ + "▁Ga", + "nd" + ], + [ + "se", + "ctions" + ], + [ + "section", + "s" + ], + [ + "sect", + "ions" + ], + [ + "сси", + "й" + ], + [ + "AC", + "CESS" + ], + [ + "A", + "CCESS" + ], + [ + "ha", + "rm" + ], + [ + "har", + "m" + ], + [ + "h", + "arm" + ], + [ + "▁Sk", + "ip" + ], + [ + "▁", + "Skip" + ], + [ + "▁D", + "river" + ], + [ + "▁Dr", + "iver" + ], + [ + "▁Drive", + "r" + ], + [ + "▁", + "Driver" + ], + [ + "▁Sant", + "iago" + ], + [ + "it", + "ung" + ], + [ + "itu", + "ng" + ], + [ + "▁B", + "arr" + ], + [ + "▁Bar", + "r" + ], + [ + "▁Ba", + "rr" + ], + [ + "process", + "or" + ], + [ + "▁real", + "ised" + ], + [ + "▁realise", + "d" + ], + [ + "ą", + "z" + ], + [ + "le", + "ave" + ], + [ + "▁C", + "omo" + ], + [ + "▁Com", + "o" + ], + [ + "▁Co", + "mo" + ], + [ + "▁Re", + "views" + ], + [ + "▁Review", + "s" + ], + [ + "▁и", + "зда" + ], + [ + "▁из", + "да" + ], + [ + "▁earn", + "ings" + ], + [ + "▁ear", + "nings" + ], + [ + "▁earning", + "s" + ], + [ + "▁S", + "creen" + ], + [ + "▁Sc", + "reen" + ], + [ + "▁Scre", + "en" + ], + [ + "▁", + "Screen" + ], + [ + "gr", + "and" + ], + [ + "gra", + "nd" + ], + [ + "g", + "rand" + ], + [ + "▁ap", + "ril" + ], + [ + "▁apr", + "il" + ], + [ + "▁sil", + "ently" + ], + [ + "▁silent", + "ly" + ], + [ + "ed", + "o" + ], + [ + "e", + "do" + ], + [ + "ue", + "st" + ], + [ + "ues", + "t" + ], + [ + "u", + "est" + ], + [ + "oo", + "oo" + ], + [ + "▁Исто", + "рия" + ], + [ + "ра", + "з" + ], + [ + "MAGE", + "S" + ], + [ + "MAG", + "ES" + ], + [ + "▁Sing", + "h" + ], + [ + "▁Sin", + "gh" + ], + [ + "▁Per", + "fect" + ], + [ + "▁revolution", + "ary" + ], + [ + "▁н", + 
"і" + ], + [ + "▁", + "ні" + ], + [ + "▁Sch", + "ools" + ], + [ + "▁School", + "s" + ], + [ + "R", + "ich" + ], + [ + "▁ch", + "rom" + ], + [ + "▁chr", + "om" + ], + [ + "▁an", + "terior" + ], + [ + "▁ante", + "rior" + ], + [ + "▁Indones", + "ia" + ], + [ + "Con", + "straints" + ], + [ + "Constraint", + "s" + ], + [ + "▁\"", + "__" + ], + [ + "▁\"_", + "_" + ], + [ + "▁six", + "teen" + ], + [ + "▁sixt", + "een" + ], + [ + "ér", + "e" + ], + [ + "é", + "re" + ], + [ + "мен", + "та" + ], + [ + "мент", + "а" + ], + [ + "N", + "il" + ], + [ + "je", + "l" + ], + [ + "j", + "el" + ], + [ + "че", + "ские" + ], + [ + "чески", + "е" + ], + [ + "▁thr", + "one" + ], + [ + "▁thro", + "ne" + ], + [ + "▁aud", + "iences" + ], + [ + "▁audience", + "s" + ], + [ + "▁i", + "hren" + ], + [ + "▁ih", + "ren" + ], + [ + "▁ihr", + "en" + ], + [ + "▁ihre", + "n" + ], + [ + "ра", + "б" + ], + [ + "Qu", + "ick" + ], + [ + "in", + "burgh" + ], + [ + "fi", + "co" + ], + [ + "fic", + "o" + ], + [ + "f", + "ico" + ], + [ + "▁kid", + "n" + ], + [ + "▁ki", + "dn" + ], + [ + "irm", + "ingham" + ], + [ + "is", + "le" + ], + [ + "isl", + "e" + ], + [ + "iz", + "ación" + ], + [ + "iza", + "ción" + ], + [ + "▁Ch", + "ampions" + ], + [ + "▁Champion", + "s" + ], + [ + "▁вы", + "со" + ], + [ + "ol", + "er" + ], + [ + "ole", + "r" + ], + [ + "o", + "ler" + ], + [ + "▁z", + "ak" + ], + [ + "▁za", + "k" + ], + [ + "▁p", + "lat" + ], + [ + "▁pl", + "at" + ], + [ + "▁V", + "III" + ], + [ + "▁VI", + "II" + ], + [ + "▁VII", + "I" + ], + [ + "at", + "ique" + ], + [ + "ati", + "que" + ], + [ + "li", + "ter" + ], + [ + "lit", + "er" + ], + [ + "l", + "iter" + ], + [ + "▁P", + "rest" + ], + [ + "▁Pr", + "est" + ], + [ + "▁Pres", + "t" + ], + [ + "▁Pre", + "st" + ], + [ + "in", + "is" + ], + [ + "ini", + "s" + ], + [ + "i", + "nis" + ], + [ + "▁scient", + "ist" + ], + [ + "▁m", + "ån" + ], + [ + "▁må", + "n" + ], + [ + "ke", + "ley" + ], + [ + "kel", + "ey" + ], + [ + "▁h", + "yd" + ], + [ + "▁hy", + "d" + ], + [ + "grad", + "uate" + ], + [ + "of", + "t" + ], + [ + "o", + "ft" + ], + [ + "▁N", + "GC" + ], + [ + "on", + "gs" + ], + [ + "ong", + "s" + ], + [ + "▁t", + "ier" + ], + [ + "▁tie", + "r" + ], + [ + "▁ti", + "er" + ], + [ + "▁Sh", + "aw" + ], + [ + "▁Sha", + "w" + ], + [ + "un", + "ächst" + ], + [ + "▁establish", + "ing" + ], + [ + "▁ind", + "icator" + ], + [ + "▁indic", + "ator" + ], + [ + "▁Par", + "ad" + ], + [ + "▁Pa", + "rad" + ], + [ + "▁Para", + "d" + ], + [ + "▁Tr", + "ail" + ], + [ + "▁Tra", + "il" + ], + [ + "UM", + "N" + ], + [ + "▁sp", + "ine" + ], + [ + "▁spin", + "e" + ], + [ + "▁Vis", + "ual" + ], + [ + "▁", + "Visual" + ], + [ + "::", + "$" + ], + [ + "▁t", + "eles" + ], + [ + "▁te", + "les" + ], + [ + "▁tele", + "s" + ], + [ + "▁tel", + "es" + ], + [ + "OP", + "ER" + ], + [ + "O", + "PER" + ], + [ + "▁pack", + "aging" + ], + [ + "to", + "ire" + ], + [ + "t", + "oire" + ], + [ + "▁не", + "ско" + ], + [ + "▁product", + "ivity" + ], + [ + "A", + "f" + ], + [ + "ні", + "ї" + ], + [ + "▁de", + "gener" + ], + [ + "▁deg", + "ener" + ], + [ + "br", + "it" + ], + [ + "b", + "rit" + ], + [ + "U", + "i" + ], + [ + "▁Y", + "am" + ], + [ + "▁Ya", + "m" + ], + [ + "▁d", + "ough" + ], + [ + "▁do", + "ugh" + ], + [ + "▁dou", + "gh" + ], + [ + "os", + "ph" + ], + [ + "osp", + "h" + ], + [ + "▁cl", + "ue" + ], + [ + "▁ре", + "ги" + ], + [ + "▁me", + "ille" + ], + [ + "▁tend", + "ency" + ], + [ + "▁re", + "lay" + ], + [ + "▁rel", + "ay" + ], + [ + "▁design", + "ers" + ], + [ + "▁designer", + "s" + ], + [ + "▁Т", + "у" + ], + [ + "Sh", 
+ "are" + ], + [ + "▁b", + "icy" + ], + [ + "▁bi", + "cy" + ], + [ + "▁M", + "asters" + ], + [ + "▁Ma", + "sters" + ], + [ + "▁Mas", + "ters" + ], + [ + "▁Master", + "s" + ], + [ + "▁м", + "но" + ], + [ + "▁altern", + "atives" + ], + [ + "▁alternative", + "s" + ], + [ + "ет", + "о" + ], + [ + "е", + "то" + ], + [ + "▁coun", + "tr" + ], + [ + "▁count", + "r" + ], + [ + "▁W", + "ow" + ], + [ + "▁Wo", + "w" + ], + [ + "LO", + "CAL" + ], + [ + "LOC", + "AL" + ], + [ + "en", + "ue" + ], + [ + "enu", + "e" + ], + [ + "▁s", + "lim" + ], + [ + "▁sl", + "im" + ], + [ + "к", + "ви" + ], + [ + "▁t", + "ir" + ], + [ + "▁ti", + "r" + ], + [ + "▁do", + "it" + ], + [ + "lic", + "a" + ], + [ + "li", + "ca" + ], + [ + "l", + "ica" + ], + [ + "ci", + "pe" + ], + [ + "cip", + "e" + ], + [ + "c", + "ipe" + ], + [ + "iz", + "ia" + ], + [ + "izi", + "a" + ], + [ + "▁A", + "ires" + ], + [ + "▁Air", + "es" + ], + [ + "▁F", + "alls" + ], + [ + "▁Fall", + "s" + ], + [ + "▁Fal", + "ls" + ], + [ + "▁concent", + "rate" + ], + [ + "▁concentr", + "ate" + ], + [ + "▁ne", + "gl" + ], + [ + "▁neg", + "l" + ], + [ + "▁Re", + "in" + ], + [ + "?", + "," + ], + [ + "▁G", + "ott" + ], + [ + "▁Go", + "tt" + ], + [ + "▁Got", + "t" + ], + [ + "▁Ver", + "ify" + ], + [ + "▁", + "Verify" + ], + [ + "▁Stud", + "ios" + ], + [ + "▁Studio", + "s" + ], + [ + "$", + "('#" + ], + [ + "ow", + "ym" + ], + [ + "owy", + "m" + ], + [ + "я", + "в" + ], + [ + "Prim", + "itive" + ], + [ + "▁tax", + "i" + ], + [ + "▁ta", + "xi" + ], + [ + "▁Com", + "mercial" + ], + [ + "▁Ч", + "ер" + ], + [ + "▁Че", + "р" + ], + [ + "place", + "holder" + ], + [ + "se", + "au" + ], + [ + "sea", + "u" + ], + [ + "s", + "eau" + ], + [ + "cor", + "rect" + ], + [ + "he", + "imer" + ], + [ + "heim", + "er" + ], + [ + "▁H", + "of" + ], + [ + "▁Ho", + "f" + ], + [ + "▁d", + "ia" + ], + [ + "▁di", + "a" + ], + [ + "▁i", + "rr" + ], + [ + "▁ir", + "r" + ], + [ + "▁ur", + "ged" + ], + [ + "▁urg", + "ed" + ], + [ + "▁urge", + "d" + ], + [ + "▁a", + "nom" + ], + [ + "▁an", + "om" + ], + [ + "▁ano", + "m" + ], + [ + "▁t", + "arde" + ], + [ + "▁tar", + "de" + ], + [ + "▁tard", + "e" + ], + [ + "ur", + "m" + ], + [ + "u", + "rm" + ], + [ + "▁se", + "ized" + ], + [ + "▁sei", + "zed" + ], + [ + "▁seiz", + "ed" + ], + [ + "DO", + "T" + ], + [ + "D", + "OT" + ], + [ + "op", + "acity" + ], + [ + "St", + "rings" + ], + [ + "String", + "s" + ], + [ + "Str", + "ings" + ], + [ + "▁dec", + "iding" + ], + [ + "▁decid", + "ing" + ], + [ + "▁listen", + "ers" + ], + [ + "▁listener", + "s" + ], + [ + "ár", + "a" + ], + [ + "á", + "ra" + ], + [ + "▁pl", + "anted" + ], + [ + "▁plan", + "ted" + ], + [ + "▁plant", + "ed" + ], + [ + "▁é", + "taient" + ], + [ + "▁ét", + "aient" + ], + [ + "Z", + "oom" + ], + [ + "st", + "ví" + ], + [ + "ng", + "th" + ], + [ + "ä", + "ude" + ], + [ + "▁C", + "av" + ], + [ + "▁Ca", + "v" + ], + [ + "▁v", + "endor" + ], + [ + "▁vend", + "or" + ], + [ + "▁", + "vendor" + ], + [ + "▁", + "ż" + ], + [ + "▁meas", + "uring" + ], + [ + "▁necess", + "ity" + ], + [ + "▁r", + "ivers" + ], + [ + "▁ri", + "vers" + ], + [ + "▁river", + "s" + ], + [ + "▁riv", + "ers" + ], + [ + "▁labor", + "atory" + ], + [ + "▁E", + "ff" + ], + [ + "▁reprodu", + "ce" + ], + [ + "▁S", + "ak" + ], + [ + "▁Sa", + "k" + ], + [ + "▁not", + "ebook" + ], + [ + "▁note", + "book" + ], + [ + "▁reason", + "ably" + ], + [ + "isecond", + "s" + ], + [ + "i", + "seconds" + ], + [ + "▁Part", + "ial" + ], + [ + "▁", + "Partial" + ], + [ + "GUI", + "D" + ], + [ + "GU", + "ID" + ], + [ + "G", + "UID" + ], + [ + "▁Per", 
+ "iod" + ], + [ + "▁", + "Period" + ], + [ + "▁reve", + "aling" + ], + [ + "▁reveal", + "ing" + ], + [ + "▁conv", + "iction" + ], + [ + "▁", + "н" + ], + [ + "▁бу", + "ли" + ], + [ + "▁altern", + "ate" + ], + [ + "▁alter", + "nate" + ], + [ + "cc", + "iones" + ], + [ + "▁N", + "AT" + ], + [ + "▁NA", + "T" + ], + [ + "▁can", + "onical" + ], + [ + "▁canon", + "ical" + ], + [ + "mo", + "z" + ], + [ + "m", + "oz" + ], + [ + "▁Mé", + "xico" + ], + [ + "M", + "o" + ], + [ + "▁ш", + "а" + ], + [ + "▁", + "ша" + ], + [ + "lim", + "inary" + ], + [ + "f", + "é" + ], + [ + "чно", + "й" + ], + [ + "ч", + "ной" + ], + [ + "▁Ham", + "burg" + ], + [ + "▁Hamb", + "urg" + ], + [ + "▁influ", + "ential" + ], + [ + "▁b", + "olt" + ], + [ + "▁bo", + "lt" + ], + [ + "▁bol", + "t" + ], + [ + "az", + "zo" + ], + [ + "azz", + "o" + ], + [ + "PH", + "P" + ], + [ + "P", + "HP" + ], + [ + "▁Sa", + "udi" + ], + [ + "▁Saud", + "i" + ], + [ + "▁Sau", + "di" + ], + [ + "▁r", + "m" + ], + [ + "▁", + "rm" + ], + [ + "▁cer", + "ca" + ], + [ + "▁decor", + "ated" + ], + [ + "▁st", + "aat" + ], + [ + "▁sta", + "at" + ], + [ + "Lo", + "u" + ], + [ + "L", + "ou" + ], + [ + "▁compet", + "itors" + ], + [ + "во", + "ї" + ], + [ + "▁diam", + "ond" + ], + [ + "▁dia", + "mond" + ], + [ + "▁m", + "obil" + ], + [ + "▁mo", + "bil" + ], + [ + "▁mob", + "il" + ], + [ + "Click", + "Listener" + ], + [ + "set", + "State" + ], + [ + "▁s", + "üd" + ], + [ + ";", + "\"" + ], + [ + "œ", + "ur" + ], + [ + "▁Lud", + "wig" + ], + [ + "▁clin", + "ic" + ], + [ + "▁e", + "go" + ], + [ + "▁eg", + "o" + ], + [ + "▁", + "ego" + ], + [ + "Thread", + "ing" + ], + [ + "▁f", + "ract" + ], + [ + "▁fr", + "act" + ], + [ + "▁fra", + "ct" + ], + [ + "Ref", + "lection" + ], + [ + "oss", + "ip" + ], + [ + "\"]", + "[\"" + ], + [ + "▁L", + "ov" + ], + [ + "▁Lo", + "v" + ], + [ + "Ex", + "press" + ], + [ + "Exp", + "ress" + ], + [ + "Expr", + "ess" + ], + [ + "д", + "ри" + ], + [ + "if", + "acts" + ], + [ + "ifact", + "s" + ], + [ + "▁O", + "ften" + ], + [ + "▁Of", + "ten" + ], + [ + "▁", + "лу" + ], + [ + "▁p", + "ets" + ], + [ + "▁pe", + "ts" + ], + [ + "▁pet", + "s" + ], + [ + "▁address", + "ing" + ], + [ + "▁m", + "ens" + ], + [ + "▁me", + "ns" + ], + [ + "▁men", + "s" + ], + [ + "▁ED", + "IT" + ], + [ + "▁", + "EDIT" + ], + [ + "ud", + "der" + ], + [ + "udd", + "er" + ], + [ + "Vert", + "ical" + ], + [ + "ка", + "та" + ], + [ + "Cap", + "t" + ], + [ + "C", + "apt" + ], + [ + "verb", + "ose" + ], + [ + "▁вой", + "ны" + ], + [ + "UNK", + "NOWN" + ], + [ + "un", + "its" + ], + [ + "unit", + "s" + ], + [ + "uni", + "ts" + ], + [ + "per", + "mission" + ], + [ + "perm", + "ission" + ], + [ + "[", + "_" + ], + [ + "▁er", + "sch" + ], + [ + "▁ers", + "ch" + ], + [ + "▁comm", + "unes" + ], + [ + "▁commun", + "es" + ], + [ + "▁commune", + "s" + ], + [ + "Un", + "ityEngine" + ], + [ + "▁com", + "mut" + ], + [ + "▁comm", + "ut" + ], + [ + "kl", + "ass" + ], + [ + "k", + "lass" + ], + [ + "▁volt", + "age" + ], + [ + "▁volta", + "ge" + ], + [ + "re", + "zent" + ], + [ + "rez", + "ent" + ], + [ + "pe", + "rf" + ], + [ + "per", + "f" + ], + [ + "DR", + "V" + ], + [ + "D", + "RV" + ], + [ + "▁f", + "ame" + ], + [ + "▁fam", + "e" + ], + [ + "▁fa", + "me" + ], + [ + "▁S", + "pot" + ], + [ + "▁Sp", + "ot" + ], + [ + "▁Л", + "ю" + ], + [ + "▁c", + "asting" + ], + [ + "▁cas", + "ting" + ], + [ + "▁cast", + "ing" + ], + [ + "hi", + "m" + ], + [ + "h", + "im" + ], + [ + "▁en", + "gl" + ], + [ + "▁eng", + "l" + ], + [ + "▁int", + "ro" + ], + [ + "▁intr", + "o" + ], + [ + "▁Г", + "у" + 
], + [ + "Comp", + "any" + ], + [ + "some", + "thing" + ], + [ + "som", + "ething" + ], + [ + "▁cl", + "icking" + ], + [ + "▁click", + "ing" + ], + [ + "жи", + "ва" + ], + [ + "▁fl", + "ames" + ], + [ + "▁flame", + "s" + ], + [ + "▁random", + "ly" + ], + [ + "ex", + "tr" + ], + [ + "ext", + "r" + ], + [ + "Equal", + "To" + ], + [ + "an", + "ners" + ], + [ + "ann", + "ers" + ], + [ + "anner", + "s" + ], + [ + "anne", + "rs" + ], + [ + "▁p", + "arks" + ], + [ + "▁par", + "ks" + ], + [ + "▁park", + "s" + ], + [ + "▁murm", + "ured" + ], + [ + "ми", + "я" + ], + [ + "▁reason", + "ing" + ], + [ + "сле", + "д" + ], + [ + "▁n", + "er" + ], + [ + "▁ne", + "r" + ], + [ + "▁", + "ner" + ], + [ + "▁é", + "c" + ], + [ + "▁", + "éc" + ], + [ + "ow", + "ners" + ], + [ + "own", + "ers" + ], + [ + "owner", + "s" + ], + [ + "▁Д", + "же" + ], + [ + "▁Дж", + "е" + ], + [ + "▁me", + "er" + ], + [ + "▁typ", + "ing" + ], + [ + "▁ty", + "ping" + ], + [ + "▁happ", + "ily" + ], + [ + "..", + "..." + ], + [ + "...", + ".." + ], + [ + "....", + "." + ], + [ + ".", + "...." + ], + [ + "▁Ч", + "а" + ], + [ + "be", + "cca" + ], + [ + "bec", + "ca" + ], + [ + "▁P", + "apers" + ], + [ + "▁Pa", + "pers" + ], + [ + "▁Pap", + "ers" + ], + [ + "▁Paper", + "s" + ], + [ + "▁Or", + "acle" + ], + [ + "▁equ", + "ilibrium" + ], + [ + "man", + "agement" + ], + [ + "Li", + "te" + ], + [ + "L", + "ite" + ], + [ + "▁des", + "ktop" + ], + [ + "▁desk", + "top" + ], + [ + "ă", + "r" + ], + [ + "▁G", + "ill" + ], + [ + "▁Gi", + "ll" + ], + [ + "▁Gil", + "l" + ], + [ + "do", + "rf" + ], + [ + "d", + "orf" + ], + [ + "ig", + "g" + ], + [ + "i", + "gg" + ], + [ + "▁qu", + "esta" + ], + [ + "▁que", + "sta" + ], + [ + "▁quest", + "a" + ], + [ + "▁", + "questa" + ], + [ + "Warning", + "s" + ], + [ + "Warn", + "ings" + ], + [ + "War", + "nings" + ], + [ + "over", + "flow" + ], + [ + "▁V", + "T" + ], + [ + "▁", + "VT" + ], + [ + "▁cons", + "isted" + ], + [ + "▁consist", + "ed" + ], + [ + "▁A", + "bu" + ], + [ + "▁Ab", + "u" + ], + [ + "v", + "scale" + ], + [ + "J", + "O" + ], + [ + "ah", + "o" + ], + [ + "a", + "ho" + ], + [ + "▁T", + "ensor" + ], + [ + "▁Ten", + "sor" + ], + [ + "▁", + "Tensor" + ], + [ + "▁hes", + "itated" + ], + [ + "▁w", + "enn" + ], + [ + "▁we", + "nn" + ], + [ + "▁wen", + "n" + ], + [ + "map", + "sto" + ], + [ + "maps", + "to" + ], + [ + "▁controvers", + "ial" + ], + [ + "M", + "F" + ], + [ + "▁l", + "ac" + ], + [ + "▁la", + "c" + ], + [ + "▁an", + "ch" + ], + [ + "▁anc", + "h" + ], + [ + "▁", + "anch" + ], + [ + "▁A", + "A" + ], + [ + "▁", + "AA" + ], + [ + "it", + "ta" + ], + [ + "itt", + "a" + ], + [ + "i", + "tta" + ], + [ + "ul", + "in" + ], + [ + "uli", + "n" + ], + [ + "u", + "lin" + ], + [ + "▁c", + "ler" + ], + [ + "▁cl", + "er" + ], + [ + "▁cle", + "r" + ], + [ + "▁D", + "iana" + ], + [ + "▁Di", + "ana" + ], + [ + "▁Fre", + "ud" + ], + [ + "▁challeng", + "ed" + ], + [ + "▁challenge", + "d" + ], + [ + "лё", + "н" + ], + [ + "л", + "ён" + ], + [ + "▁se", + "ated" + ], + [ + "▁sea", + "ted" + ], + [ + "▁seat", + "ed" + ], + [ + "▁sm", + "iles" + ], + [ + "▁smile", + "s" + ], + [ + "▁cr", + "acked" + ], + [ + "▁crack", + "ed" + ], + [ + "▁а", + "ктив" + ], + [ + "ско", + "ј" + ], + [ + "dict", + "ion" + ], + [ + "di", + "ction" + ], + [ + "d", + "iction" + ], + [ + "ex", + "press" + ], + [ + "exp", + "ress" + ], + [ + "expr", + "ess" + ], + [ + "▁im", + "posed" + ], + [ + "▁imp", + "osed" + ], + [ + "▁pro", + "tests" + ], + [ + "▁prote", + "sts" + ], + [ + "▁protest", + "s" + ], + [ + "▁prot", + "ests" + ], + [ + "▁w", 
+ "ounds" + ], + [ + "▁wound", + "s" + ], + [ + "C", + "ulture" + ], + [ + "N", + "Y" + ], + [ + "prevent", + "Default" + ], + [ + "ad", + "io" + ], + [ + "adi", + "o" + ], + [ + "▁NE", + "W" + ], + [ + "▁", + "NEW" + ], + [ + "B", + "attle" + ], + [ + "▁se", + "colo" + ], + [ + "▁sec", + "olo" + ], + [ + "▁A", + "x" + ], + [ + "▁found", + "ing" + ], + [ + "(\"", + "-" + ], + [ + "▁ret", + "ro" + ], + [ + "▁retr", + "o" + ], + [ + "▁pot", + "atoes" + ], + [ + "import", + "ant" + ], + [ + "ie", + "me" + ], + [ + "iem", + "e" + ], + [ + "i", + "eme" + ], + [ + "ys", + "ide" + ], + [ + "y", + "side" + ], + [ + "d", + "ummy" + ], + [ + "▁t", + "ilt" + ], + [ + "▁til", + "t" + ], + [ + "▁ti", + "lt" + ], + [ + "▁R", + "ules" + ], + [ + "▁Ru", + "les" + ], + [ + "▁Rule", + "s" + ], + [ + "▁", + "Rules" + ], + [ + "▁un", + "ters" + ], + [ + "▁unt", + "ers" + ], + [ + "▁unter", + "s" + ], + [ + "A", + "ud" + ], + [ + "V", + "ENDOR" + ], + [ + "ud", + "ge" + ], + [ + "un", + "al" + ], + [ + "una", + "l" + ], + [ + "u", + "nal" + ], + [ + "▁Ad", + "ult" + ], + [ + "▁im", + "pat" + ], + [ + "▁imp", + "at" + ], + [ + "▁rep", + "airs" + ], + [ + "▁repair", + "s" + ], + [ + "▁F", + "erd" + ], + [ + "▁Fe", + "rd" + ], + [ + "▁Fer", + "d" + ], + [ + "▁Az", + "ure" + ], + [ + "▁", + "Azure" + ], + [ + "))", + ":" + ], + [ + ")", + "):" + ], + [ + "▁pag", + "ina" + ], + [ + "▁E", + "pisode" + ], + [ + "File", + "name" + ], + [ + "Fil", + "ename" + ], + [ + "▁j", + "á" + ], + [ + "▁oblig", + "ation" + ], + [ + "ig", + "hed" + ], + [ + "igh", + "ed" + ], + [ + "▁pers", + "istent" + ], + [ + "Mus", + "ic" + ], + [ + "▁C", + "ele" + ], + [ + "▁Ce", + "le" + ], + [ + "▁Cel", + "e" + ], + [ + "▁r", + "y" + ], + [ + "▁", + "ry" + ], + [ + "▁cert", + "ification" + ], + [ + "ul", + "d" + ], + [ + "u", + "ld" + ], + [ + "▁T", + "L" + ], + [ + "▁", + "TL" + ], + [ + "▁sk", + "irt" + ], + [ + "▁ski", + "rt" + ], + [ + "▁M", + "ini" + ], + [ + "▁Min", + "i" + ], + [ + "▁Mi", + "ni" + ], + [ + "▁B", + "ring" + ], + [ + "▁Br", + "ing" + ], + [ + "><", + "?" 
+ ], + [ + ">", + "", + "%" + ], + [ + "▁P", + "and" + ], + [ + "▁Pan", + "d" + ], + [ + "▁Pa", + "nd" + ], + [ + "▁S", + "UB" + ], + [ + "▁SU", + "B" + ], + [ + "▁", + "SUB" + ], + [ + "▁compan", + "ions" + ], + [ + "▁companion", + "s" + ], + [ + "▁RE", + "AD" + ], + [ + "▁", + "READ" + ], + [ + "▁S", + "olutions" + ], + [ + "▁Solution", + "s" + ], + [ + "▁acc", + "essed" + ], + [ + "▁access", + "ed" + ], + [ + "▁p", + "osto" + ], + [ + "▁pos", + "to" + ], + [ + "▁po", + "sto" + ], + [ + "▁post", + "o" + ], + [ + "▁pur", + "suit" + ], + [ + "▁purs", + "uit" + ], + [ + "ow", + "i" + ], + [ + "o", + "wi" + ], + [ + "▁gro", + "cery" + ], + [ + "Sp", + "e" + ], + [ + "S", + "pe" + ], + [ + "haus", + "en" + ], + [ + "▁normal", + "ized" + ], + [ + "▁tra", + "uma" + ], + [ + "gg", + "i" + ], + [ + "g", + "gi" + ], + [ + "ie", + "nia" + ], + [ + "ien", + "ia" + ], + [ + "▁aut", + "umn" + ], + [ + "▁so", + "vere" + ], + [ + "▁sov", + "ere" + ], + [ + "▁Men", + "schen" + ], + [ + "▁Mens", + "chen" + ], + [ + "▁D", + "AG" + ], + [ + "▁DA", + "G" + ], + [ + "▁S", + "ort" + ], + [ + "▁So", + "rt" + ], + [ + "▁Sor", + "t" + ], + [ + "▁", + "Sort" + ], + [ + "|", + "---" + ], + [ + "▁l", + "iver" + ], + [ + "▁li", + "ver" + ], + [ + "▁live", + "r" + ], + [ + "▁liv", + "er" + ], + [ + "▁", + "liver" + ], + [ + "env", + "iron" + ], + [ + "DE", + "CL" + ], + [ + "DEC", + "L" + ], + [ + "▁ма", + "й" + ], + [ + "▁N", + "ak" + ], + [ + "▁Na", + "k" + ], + [ + "bet", + "ween" + ], + [ + "▁gentle", + "man" + ], + [ + "in", + "ging" + ], + [ + "ing", + "ing" + ], + [ + "▁su", + "bur" + ], + [ + "▁sub", + "ur" + ], + [ + "ST", + "O" + ], + [ + "S", + "TO" + ], + [ + "ace", + "ut" + ], + [ + "\\", + "!" + ], + [ + "▁Fuß", + "ball" + ], + [ + "na", + "r" + ], + [ + "n", + "ar" + ], + [ + "▁b", + "og" + ], + [ + "▁bo", + "g" + ], + [ + "Token", + "s" + ], + [ + "Tok", + "ens" + ], + [ + "▁cer", + "emon" + ], + [ + "▁cere", + "mon" + ], + [ + "DA", + "Y" + ], + [ + "D", + "AY" + ], + [ + "▁out", + "fit" + ], + [ + "▁agric", + "ulture" + ], + [ + "ди", + "и" + ], + [ + "▁N", + "in" + ], + [ + "▁Ni", + "n" + ], + [ + "▁Sp", + "rings" + ], + [ + "▁Spring", + "s" + ], + [ + "▁Spr", + "ings" + ], + [ + "▁Co", + "ach" + ], + [ + "▁d", + "jango" + ], + [ + "▁", + "django" + ], + [ + "▁C", + "rim" + ], + [ + "▁Cr", + "im" + ], + [ + "▁te", + "cn" + ], + [ + "Th", + "ree" + ], + [ + "em", + "os" + ], + [ + "e", + "mos" + ], + [ + "▁be", + "an" + ], + [ + "▁", + "bean" + ], + [ + "pi", + "eler" + ], + [ + "pie", + "ler" + ], + [ + "p", + "ieler" + ], + [ + "ri", + "tz" + ], + [ + "rit", + "z" + ], + [ + "r", + "itz" + ], + [ + "ta", + "bs" + ], + [ + "tab", + "s" + ], + [ + "t", + "abs" + ], + [ + "▁Pro", + "blem" + ], + [ + "in", + "and" + ], + [ + "ina", + "nd" + ], + [ + "oc", + "on" + ], + [ + "oco", + "n" + ], + [ + "o", + "con" + ], + [ + "њ", + "и" + ], + [ + "▁bu", + "yer" + ], + [ + "▁buy", + "er" + ], + [ + "us", + "ement" + ], + [ + "use", + "ment" + ], + [ + "▁b", + "or" + ], + [ + "▁bo", + "r" + ], + [ + "▁", + "bor" + ], + [ + "▁sett", + "embre" + ], + [ + "pp", + "e" + ], + [ + "p", + "pe" + ], + [ + "▁D", + "eg" + ], + [ + "▁De", + "g" + ], + [ + "▁W", + "a" + ], + [ + "▁w", + "ives" + ], + [ + "▁fr", + "anzös" + ], + [ + "▁mar", + "ca" + ], + [ + "▁des", + "cent" + ], + [ + "▁desc", + "ent" + ], + [ + "▁S", + "ha" + ], + [ + "▁Sh", + "a" + ], + [ + "ver", + "ts" + ], + [ + "vert", + "s" + ], + [ + "v", + "erts" + ], + [ + "▁Sh", + "adow" + ], + [ + "▁", + "Shadow" + ], + [ + "▁Hug", + "o" + ], + [ + "▁Hu", + 
"go" + ], + [ + "▁A", + "ppe" + ], + [ + "▁App", + "e" + ], + [ + "▁Ap", + "pe" + ], + [ + "▁", + "Appe" + ], + [ + "▁L", + "ac" + ], + [ + "▁La", + "c" + ], + [ + "al", + "len" + ], + [ + "all", + "en" + ], + [ + "alle", + "n" + ], + [ + "os", + "ity" + ], + [ + "osi", + "ty" + ], + [ + "▁consult", + "ation" + ], + [ + "▁T", + "i" + ], + [ + "▁er", + "ano" + ], + [ + "▁era", + "no" + ], + [ + "▁eran", + "o" + ], + [ + "▁lo", + "vers" + ], + [ + "▁love", + "rs" + ], + [ + "▁lov", + "ers" + ], + [ + "▁lover", + "s" + ], + [ + "▁уни", + "версите" + ], + [ + "▁virt", + "ue" + ], + [ + "▁view", + "ers" + ], + [ + "M", + "u" + ], + [ + "c", + "ategories" + ], + [ + "▁о", + "пера" + ], + [ + "▁over", + "look" + ], + [ + "▁overl", + "ook" + ], + [ + "▁тер", + "рито" + ], + [ + "▁Oper", + "ations" + ], + [ + "▁Operation", + "s" + ], + [ + "▁", + "Operations" + ], + [ + "è", + "ve" + ], + [ + "-", + "(" + ], + [ + "▁", + "Ż" + ], + [ + "je", + "v" + ], + [ + "j", + "ev" + ], + [ + "▁c", + "rist" + ], + [ + "▁cr", + "ist" + ], + [ + "▁cris", + "t" + ], + [ + "▁cri", + "st" + ], + [ + "▁мар", + "та" + ], + [ + "▁pro", + "vin" + ], + [ + "▁prov", + "in" + ], + [ + "product", + "ion" + ], + [ + "produ", + "ction" + ], + [ + "prod", + "uction" + ], + [ + "p", + "roduction" + ], + [ + "▁T", + "all" + ], + [ + "▁Tal", + "l" + ], + [ + "▁Ta", + "ll" + ], + [ + "Requ", + "ests" + ], + [ + "Request", + "s" + ], + [ + "▁t", + "iles" + ], + [ + "▁til", + "es" + ], + [ + "▁tile", + "s" + ], + [ + "▁ti", + "les" + ], + [ + "ref", + "lect" + ], + [ + "▁ar", + "gc" + ], + [ + "▁arg", + "c" + ], + [ + "▁", + "argc" + ], + [ + "▁t", + "emplates" + ], + [ + "▁templ", + "ates" + ], + [ + "▁template", + "s" + ], + [ + "▁", + "templates" + ], + [ + "AR", + "B" + ], + [ + "A", + "RB" + ], + [ + "▁weiter", + "e" + ], + [ + "▁weit", + "ere" + ], + [ + ")?", + ";" + ], + [ + ")", + "?;" + ], + [ + "▁t", + "oll" + ], + [ + "▁to", + "ll" + ], + [ + "▁correspond", + "ence" + ], + [ + "$", + ";" + ], + [ + "L", + "T" + ], + [ + "▁t", + "am" + ], + [ + "▁ta", + "m" + ], + [ + "de", + "cess" + ], + [ + "dec", + "ess" + ], + [ + "built", + "in" + ], + [ + "da", + "sh" + ], + [ + "das", + "h" + ], + [ + "d", + "ash" + ], + [ + "ze", + "nie" + ], + [ + "zen", + "ie" + ], + [ + "▁mole", + "cular" + ], + [ + "▁chem", + "icals" + ], + [ + "▁chemical", + "s" + ], + [ + "▁rend", + "ering" + ], + [ + "▁render", + "ing" + ], + [ + "▁Sing", + "les" + ], + [ + "▁Sin", + "gles" + ], + [ + "▁Single", + "s" + ], + [ + "Init", + "ialized" + ], + [ + "Initial", + "ized" + ], + [ + "Initialize", + "d" + ], + [ + "▁Mar", + "tha" + ], + [ + "▁Mart", + "ha" + ], + [ + "ri", + "ere" + ], + [ + "rie", + "re" + ], + [ + "rier", + "e" + ], + [ + "r", + "iere" + ], + [ + "par", + "agraph" + ], + [ + "para", + "graph" + ], + [ + "as", + "ters" + ], + [ + "ast", + "ers" + ], + [ + "aster", + "s" + ], + [ + "aste", + "rs" + ], + [ + "a", + "sters" + ], + [ + "▁dec", + "ides" + ], + [ + "▁decide", + "s" + ], + [ + "▁decid", + "es" + ], + [ + "▁Flor", + "ence" + ], + [ + "▁Flo", + "rence" + ], + [ + "▁Floren", + "ce" + ], + [ + "▁And", + "ers" + ], + [ + "▁An", + "ders" + ], + [ + "мо", + "й" + ], + [ + "▁a", + "pt" + ], + [ + "▁ap", + "t" + ], + [ + "▁", + "apt" + ], + [ + "▁affili", + "ate" + ], + [ + "ch", + "el" + ], + [ + "che", + "l" + ], + [ + "c", + "hel" + ], + [ + "▁re", + "vision" + ], + [ + "▁rev", + "ision" + ], + [ + "Pat", + "ch" + ], + [ + "P", + "atch" + ], + [ + "▁fi", + "scal" + ], + [ + "▁fis", + "cal" + ], + [ + "wi", + "ę" + ], + [ + "w", 
+ "ię" + ], + [ + "N", + "ational" + ], + [ + "▁depend", + "encies" + ], + [ + "TRAN", + "S" + ], + [ + "TRA", + "NS" + ], + [ + "▁r", + "ack" + ], + [ + "▁rac", + "k" + ], + [ + "▁ra", + "ck" + ], + [ + "sel", + "ling" + ], + [ + "s", + "elling" + ], + [ + "na", + "issance" + ], + [ + "c", + "atalog" + ], + [ + "Sh", + "ip" + ], + [ + "S", + "hip" + ], + [ + "IM", + "AGE" + ], + [ + "I", + "MAGE" + ], + [ + "']", + "[" + ], + [ + "'", + "][" + ], + [ + "▁p", + "rv" + ], + [ + "▁pr", + "v" + ], + [ + "▁F", + "en" + ], + [ + "▁Fe", + "n" + ], + [ + "▁rad", + "ar" + ], + [ + "▁ra", + "dar" + ], + [ + "cond", + "itions" + ], + [ + "condition", + "s" + ], + [ + "▁Quest", + "ions" + ], + [ + "▁Question", + "s" + ], + [ + "▁v", + "ivid" + ], + [ + "▁vi", + "vid" + ], + [ + "▁viv", + "id" + ], + [ + "op", + "f" + ], + [ + "o", + "pf" + ], + [ + "FA", + "CE" + ], + [ + "F", + "ACE" + ], + [ + "ry", + "s" + ], + [ + "r", + "ys" + ], + [ + "Ex", + "tract" + ], + [ + "Ext", + "ract" + ], + [ + "Extra", + "ct" + ], + [ + "il", + "ians" + ], + [ + "ili", + "ans" + ], + [ + "ilia", + "ns" + ], + [ + "pl", + "ug" + ], + [ + "▁a", + "té" + ], + [ + "▁at", + "é" + ], + [ + "и", + "л" + ], + [ + "▁like", + "wise" + ], + [ + "▁L", + "il" + ], + [ + "▁Li", + "l" + ], + [ + "▁Cam", + "peonato" + ], + [ + "AUT", + "O" + ], + [ + "AU", + "TO" + ], + [ + "▁M", + "eta" + ], + [ + "▁Me", + "ta" + ], + [ + "▁Met", + "a" + ], + [ + "▁", + "Meta" + ], + [ + "re", + "no" + ], + [ + "ren", + "o" + ], + [ + "r", + "eno" + ], + [ + "▁Trans", + "fer" + ], + [ + "▁", + "Transfer" + ], + [ + "▁Mich", + "elle" + ], + [ + "▁Michel", + "le" + ], + [ + "▁Miche", + "lle" + ], + [ + "bi", + "s" + ], + [ + "b", + "is" + ], + [ + "ń", + "st" + ], + [ + "зо", + "н" + ], + [ + "з", + "он" + ], + [ + "▁C", + "ultural" + ], + [ + "com", + "pass" + ], + [ + "comp", + "ass" + ], + [ + "▁my", + "sql" + ], + [ + "▁", + "mysql" + ], + [ + "▁cancel", + "led" + ], + [ + "▁cancell", + "ed" + ], + [ + "▁", + "’" + ], + [ + "to", + "o" + ], + [ + "t", + "oo" + ], + [ + "▁re", + "bell" + ], + [ + "▁reb", + "ell" + ], + [ + "▁rebel", + "l" + ], + [ + "ég", + "e" + ], + [ + "é", + "ge" + ], + [ + "os", + "z" + ], + [ + "o", + "sz" + ], + [ + "▁com", + "poser" + ], + [ + "▁comp", + "oser" + ], + [ + "▁compos", + "er" + ], + [ + "}\"", + ")" + ], + [ + "}", + "\")" + ], + [ + "▁des", + "erves" + ], + [ + "▁deserve", + "s" + ], + [ + "▁oh", + "ne" + ], + [ + "▁J", + "ed" + ], + [ + "▁Je", + "d" + ], + [ + "K", + "ernel" + ], + [ + "▁pract", + "ition" + ], + [ + "▁in", + "door" + ], + [ + "▁ind", + "oor" + ], + [ + "▁config", + "urations" + ], + [ + "▁configuration", + "s" + ], + [ + "▁m", + "eth" + ], + [ + "▁me", + "th" + ], + [ + "▁met", + "h" + ], + [ + "+", + "(" + ], + [ + "Quest", + "ion" + ], + [ + "▁bl", + "own" + ], + [ + "▁blo", + "wn" + ], + [ + "▁blow", + "n" + ], + [ + ")", + "'" + ], + [ + "▁Ar", + "gs" + ], + [ + "▁Arg", + "s" + ], + [ + "▁", + "Args" + ], + [ + "F", + "ake" + ], + [ + "▁d", + "even" + ], + [ + "▁de", + "ven" + ], + [ + "▁dev", + "en" + ], + [ + "istrz", + "ost" + ], + [ + "na", + "io" + ], + [ + "▁\"", + "{" + ], + [ + "▁L", + "it" + ], + [ + "▁Li", + "t" + ], + [ + "com", + "ed" + ], + [ + "co", + "med" + ], + [ + "come", + "d" + ], + [ + "c", + "omed" + ], + [ + "▁st", + "am" + ], + [ + "▁sta", + "m" + ], + [ + "▁pl", + "ugins" + ], + [ + "▁plugin", + "s" + ], + [ + "▁plug", + "ins" + ], + [ + "▁", + "plugins" + ], + [ + "▁travel", + "ling" + ], + [ + "▁trav", + "elling" + ], + [ + "na", + "ire" + ], + [ + "n", + 
"aire" + ], + [ + "▁aut", + "onom" + ], + [ + "▁auto", + "nom" + ], + [ + "STRU", + "CT" + ], + [ + "n", + "h" + ], + [ + "né", + "es" + ], + [ + "née", + "s" + ], + [ + "n", + "ées" + ], + [ + "▁consider", + "ably" + ], + [ + "ко", + "р" + ], + [ + "к", + "ор" + ], + [ + "B", + "G" + ], + [ + "▁lad", + "der" + ], + [ + "▁h", + "ast" + ], + [ + "▁has", + "t" + ], + [ + "▁ha", + "st" + ], + [ + "iz", + "ado" + ], + [ + "iza", + "do" + ], + [ + "▁s", + "ele" + ], + [ + "▁se", + "le" + ], + [ + "▁sel", + "e" + ], + [ + "▁W", + "ere" + ], + [ + "▁We", + "re" + ], + [ + "▁Wer", + "e" + ], + [ + "ar", + "don" + ], + [ + "ard", + "on" + ], + [ + "ardo", + "n" + ], + [ + "B", + "ank" + ], + [ + "bund", + "le" + ], + [ + "b", + "undle" + ], + [ + "▁anticip", + "ated" + ], + [ + "▁C", + "ot" + ], + [ + "▁Co", + "t" + ], + [ + "▁else", + "if" + ], + [ + "▁", + "elseif" + ], + [ + "▁Bl", + "ues" + ], + [ + "▁Blue", + "s" + ], + [ + "▁fil", + "tered" + ], + [ + "▁filter", + "ed" + ], + [ + "▁a", + "uction" + ], + [ + "▁au", + "ction" + ], + [ + "ed", + "uc" + ], + [ + "edu", + "c" + ], + [ + "e", + "duc" + ], + [ + "▁Ex", + "pression" + ], + [ + "▁Express", + "ion" + ], + [ + "▁Exp", + "ression" + ], + [ + "▁", + "Expression" + ], + [ + "in", + "x" + ], + [ + "i", + "nx" + ], + [ + "▁s", + "ucks" + ], + [ + "▁su", + "cks" + ], + [ + "▁suc", + "ks" + ], + [ + "▁suck", + "s" + ], + [ + "▁ма", + "я" + ], + [ + "EL", + "L" + ], + [ + "E", + "LL" + ], + [ + "ющи", + "й" + ], + [ + "▁Hud", + "son" + ], + [ + "it", + "ä" + ], + [ + "на", + "ми" + ], + [ + "▁fem", + "me" + ], + [ + "in", + "ho" + ], + [ + "▁e", + "vt" + ], + [ + "▁ev", + "t" + ], + [ + "▁", + "evt" + ], + [ + "istribut", + "ions" + ], + [ + "istribution", + "s" + ], + [ + "▁r", + "uss" + ], + [ + "▁ru", + "ss" + ], + [ + "▁rus", + "s" + ], + [ + "▁pet", + "ition" + ], + [ + "▁petit", + "ion" + ], + [ + "▁г", + "ла" + ], + [ + "▁", + "гла" + ], + [ + "Si", + "g" + ], + [ + "S", + "ig" + ], + [ + "▁T", + "ut" + ], + [ + "▁Tu", + "t" + ], + [ + "Part", + "ial" + ], + [ + "Ent", + "ities" + ], + [ + "▁b", + "ears" + ], + [ + "▁be", + "ars" + ], + [ + "▁bear", + "s" + ], + [ + "▁h", + "ollow" + ], + [ + "▁hol", + "low" + ], + [ + "__", + "[\"" + ], + [ + "▁R", + "is" + ], + [ + "ț", + "ă" + ], + [ + "dim", + "s" + ], + [ + "di", + "ms" + ], + [ + "d", + "ims" + ], + [ + "▁compl", + "ained" + ], + [ + "▁complain", + "ed" + ], + [ + "▁m", + "apped" + ], + [ + "▁map", + "ped" + ], + [ + "▁ma", + "pped" + ], + [ + "▁авгу", + "ста" + ], + [ + "▁initi", + "atives" + ], + [ + "▁initiative", + "s" + ], + [ + "▁own", + "s" + ], + [ + "ch", + "ez" + ], + [ + "che", + "z" + ], + [ + "▁dis", + "pon" + ], + [ + "▁disp", + "on" + ], + [ + "▁m", + "ush" + ], + [ + "▁mus", + "h" + ], + [ + "▁mu", + "sh" + ], + [ + "q", + "s" + ], + [ + "▁er", + "folg" + ], + [ + "▁Nor", + "weg" + ], + [ + "▁c", + "et" + ], + [ + "▁ce", + "t" + ], + [ + "im", + "ag" + ], + [ + "ima", + "g" + ], + [ + "i", + "mag" + ], + [ + "▁исто", + "ри" + ], + [ + "▁ни", + "х" + ], + [ + "▁", + "них" + ], + [ + "Un", + "til" + ], + [ + "U", + "ntil" + ], + [ + "▁s", + "talk" + ], + [ + "▁st", + "alk" + ], + [ + "▁П", + "ра" + ], + [ + "uv", + "o" + ], + [ + "u", + "vo" + ], + [ + "ie", + "rz" + ], + [ + "ier", + "z" + ], + [ + "ri", + "eben" + ], + [ + "rie", + "ben" + ], + [ + "rieb", + "en" + ], + [ + "X", + "T" + ], + [ + "ic", + "als" + ], + [ + "ical", + "s" + ], + [ + "ica", + "ls" + ], + [ + "std", + "out" + ], + [ + "▁extra", + "cted" + ], + [ + "▁extract", + "ed" + ], + [ + "▁Im", + 
"ages" + ], + [ + "▁Image", + "s" + ], + [ + "▁", + "Images" + ], + [ + "un", + "def" + ], + [ + "und", + "ef" + ], + [ + "unde", + "f" + ], + [ + "u", + "ndef" + ], + [ + "▁L", + "é" + ], + [ + "▁accommod", + "ation" + ], + [ + "▁T", + "ouch" + ], + [ + "▁To", + "uch" + ], + [ + "▁", + "Touch" + ], + [ + "▁intent", + "ions" + ], + [ + "▁intention", + "s" + ], + [ + "▁concent", + "rated" + ], + [ + "▁concentr", + "ated" + ], + [ + "▁concentrate", + "d" + ], + [ + "▁Насе", + "ление" + ], + [ + "▁ut", + "ilis" + ], + [ + "▁util", + "is" + ], + [ + "▁сле", + "д" + ], + [ + "▁", + "след" + ], + [ + "li", + "f" + ], + [ + "l", + "if" + ], + [ + "▁comp", + "ris" + ], + [ + "▁compr", + "is" + ], + [ + "▁с", + "бор" + ], + [ + "med", + "ium" + ], + [ + "medi", + "um" + ], + [ + "St", + "ates" + ], + [ + "State", + "s" + ], + [ + "Stat", + "es" + ], + [ + "▁Би", + "ография" + ], + [ + "▁Fa", + "ith" + ], + [ + "U", + "A" + ], + [ + "ADD", + "RESS" + ], + [ + "▁r", + "ated" + ], + [ + "▁rate", + "d" + ], + [ + "▁rat", + "ed" + ], + [ + "▁ra", + "ted" + ], + [ + "▁", + "rated" + ], + [ + "▁R", + "ena" + ], + [ + "▁Re", + "na" + ], + [ + "▁Ren", + "a" + ], + [ + "▁C", + "ache" + ], + [ + "▁Ca", + "che" + ], + [ + "▁", + "Cache" + ], + [ + "▁pe", + "que" + ], + [ + "▁un", + "used" + ], + [ + "▁unus", + "ed" + ], + [ + "▁", + "unused" + ], + [ + "ni", + "m" + ], + [ + "n", + "im" + ], + [ + "ol", + "ding" + ], + [ + "old", + "ing" + ], + [ + "▁N", + "r" + ], + [ + "R", + "ay" + ], + [ + "ur", + "ls" + ], + [ + "url", + "s" + ], + [ + "▁em", + "issions" + ], + [ + "▁emission", + "s" + ], + [ + "I", + "r" + ], + [ + "▁m", + "å" + ], + [ + "be", + "ar" + ], + [ + "b", + "ear" + ], + [ + "▁L", + "ub" + ], + [ + "▁Lu", + "b" + ], + [ + "▁Out", + "side" + ], + [ + "min", + "ded" + ], + [ + "mind", + "ed" + ], + [ + "▁PRO", + "VID" + ], + [ + "▁s", + "ó" + ], + [ + "▁civil", + "ian" + ], + [ + "Find", + "er" + ], + [ + "Fin", + "der" + ], + [ + "Fi", + "nder" + ], + [ + "F", + "inder" + ], + [ + "▁achie", + "ving" + ], + [ + "mod", + "ified" + ], + [ + "la", + "ne" + ], + [ + "lan", + "e" + ], + [ + "l", + "ane" + ], + [ + "Se", + "nder" + ], + [ + "Send", + "er" + ], + [ + "S", + "ender" + ], + [ + "▁Cr", + "ime" + ], + [ + "▁Crim", + "e" + ], + [ + "REQ", + "UI" + ], + [ + "▁open", + "ly" + ], + [ + "▁Belg", + "ium" + ], + [ + "ic", + "ity" + ], + [ + "ici", + "ty" + ], + [ + "icit", + "y" + ], + [ + "i", + "city" + ], + [ + "▁M", + "az" + ], + [ + "▁Ma", + "z" + ], + [ + "▁st", + "agger" + ], + [ + "▁stag", + "ger" + ], + [ + "}}", + "$," + ], + [ + "}}$", + "," + ], + [ + "}", + "}$," + ], + [ + "na", + "te" + ], + [ + "nat", + "e" + ], + [ + "n", + "ate" + ], + [ + "''", + "'" + ], + [ + "'", + "''" + ], + [ + "▁Ge", + "off" + ], + [ + "ll", + "i" + ], + [ + "l", + "li" + ], + [ + "Su", + "ite" + ], + [ + "▁D", + "istribution" + ], + [ + "▁я", + "кий" + ], + [ + "Com", + "bo" + ], + [ + "Comb", + "o" + ], + [ + "ho", + "oks" + ], + [ + "hook", + "s" + ], + [ + "▁F", + "ight" + ], + [ + "▁Fig", + "ht" + ], + [ + "▁Fi", + "ght" + ], + [ + "Set", + "s" + ], + [ + "Se", + "ts" + ], + [ + "S", + "ets" + ], + [ + "▁m", + "k" + ], + [ + "▁", + "mk" + ], + [ + "▁gu", + "ides" + ], + [ + "▁guide", + "s" + ], + [ + "▁guid", + "es" + ], + [ + "▁princip", + "ale" + ], + [ + "▁principal", + "e" + ], + [ + "Pre", + "ferences" + ], + [ + "ti", + "ny" + ], + [ + "t", + "iny" + ], + [ + "ap", + "pen" + ], + [ + "app", + "en" + ], + [ + "appe", + "n" + ], + [ + "a", + "ppen" + ], + [ + "▁ru", + "ined" + ], + [ + "▁ruin", + 
"ed" + ], + [ + "▁sl", + "iding" + ], + [ + "▁slid", + "ing" + ], + [ + "▁Z", + "en" + ], + [ + "▁Ze", + "n" + ], + [ + "▁oct", + "ubre" + ], + [ + "pos", + "er" + ], + [ + "po", + "ser" + ], + [ + "pose", + "r" + ], + [ + "p", + "oser" + ], + [ + "▁F", + "lag" + ], + [ + "▁Fl", + "ag" + ], + [ + "▁", + "Flag" + ], + [ + "▁b", + "oom" + ], + [ + "▁bo", + "om" + ], + [ + "▁Det", + "ect" + ], + [ + "▁activ", + "ation" + ], + [ + "▁обра", + "зова" + ], + [ + "▁entertain", + "ing" + ], + [ + "▁entert", + "aining" + ], + [ + "▁protect", + "ive" + ], + [ + "ál", + "l" + ], + [ + "á", + "ll" + ], + [ + "▁Fl", + "ash" + ], + [ + "▁mid", + "st" + ], + [ + "▁mi", + "dst" + ], + [ + "ствен", + "ной" + ], + [ + "▁Ph", + "D" + ], + [ + "ij", + "ing" + ], + [ + "iji", + "ng" + ], + [ + "cl", + "ub" + ], + [ + "get", + "C" + ], + [ + "▁tro", + "uve" + ], + [ + "▁trou", + "ve" + ], + [ + "am", + "bers" + ], + [ + "amb", + "ers" + ], + [ + "amber", + "s" + ], + [ + "▁g", + "reed" + ], + [ + "▁gr", + "eed" + ], + [ + "▁gre", + "ed" + ], + [ + "am", + "arin" + ], + [ + "ama", + "rin" + ], + [ + "amar", + "in" + ], + [ + "▁suspic", + "ious" + ], + [ + "▁susp", + "icious" + ], + [ + "▁dep", + "uty" + ], + [ + "▁deput", + "y" + ], + [ + "as", + "per" + ], + [ + "asp", + "er" + ], + [ + "▁fun", + "ded" + ], + [ + "▁fund", + "ed" + ], + [ + "al", + "one" + ], + [ + "alo", + "ne" + ], + [ + "▁t", + "ract" + ], + [ + "▁tr", + "act" + ], + [ + "▁tra", + "ct" + ], + [ + "▁", + "tract" + ], + [ + "▁R", + "ating" + ], + [ + "▁Ra", + "ting" + ], + [ + "▁Rat", + "ing" + ], + [ + "ad", + "ays" + ], + [ + "ada", + "ys" + ], + [ + "a", + "days" + ], + [ + "▁st", + "att" + ], + [ + "▁stat", + "t" + ], + [ + "▁sta", + "tt" + ], + [ + "▁Priv", + "acy" + ], + [ + "▁_", + "_(" + ], + [ + "▁__", + "(" + ], + [ + "▁", + "__(" + ], + [ + "▁f", + "ights" + ], + [ + "▁fight", + "s" + ], + [ + "á", + "j" + ], + [ + "\\", + "]" + ], + [ + "ag", + "h" + ], + [ + "a", + "gh" + ], + [ + "or", + "na" + ], + [ + "orn", + "a" + ], + [ + "▁Diam", + "ond" + ], + [ + "▁pro", + "totype" + ], + [ + "▁proto", + "type" + ], + [ + "▁prot", + "otype" + ], + [ + "▁", + "prototype" + ], + [ + "▁Str", + "ateg" + ], + [ + "ha", + "do" + ], + [ + "had", + "o" + ], + [ + "h", + "ado" + ], + [ + "▁l", + "ungs" + ], + [ + "▁lung", + "s" + ], + [ + "▁lun", + "gs" + ], + [ + "Pro", + "totype" + ], + [ + "Proto", + "type" + ], + [ + "ließ", + "lich" + ], + [ + "▁d", + "ive" + ], + [ + "▁di", + "ve" + ], + [ + "▁div", + "e" + ], + [ + "co", + "v" + ], + [ + "c", + "ov" + ], + [ + "▁M", + "ist" + ], + [ + "▁Mi", + "st" + ], + [ + "▁Mis", + "t" + ], + [ + "▁T", + "ypes" + ], + [ + "▁Type", + "s" + ], + [ + "▁Ty", + "pes" + ], + [ + "▁Typ", + "es" + ], + [ + "▁", + "Types" + ], + [ + "▁di", + "agonal" + ], + [ + "▁p", + "review" + ], + [ + "▁pre", + "view" + ], + [ + "▁prev", + "iew" + ], + [ + "▁", + "preview" + ], + [ + "▁Cont", + "ainer" + ], + [ + "▁", + "Container" + ], + [ + "DESC", + "RIP" + ], + [ + "▁brit", + "ann" + ], + [ + "▁C", + "ord" + ], + [ + "▁Co", + "rd" + ], + [ + "▁Cor", + "d" + ], + [ + "ak", + "ov" + ], + [ + "ako", + "v" + ], + [ + "a", + "kov" + ], + [ + "▁far", + "ming" + ], + [ + "▁farm", + "ing" + ], + [ + "▁p", + "ère" + ], + [ + "▁k", + "ills" + ], + [ + "▁kill", + "s" + ], + [ + "▁kil", + "ls" + ], + [ + "▁Car", + "ib" + ], + [ + "▁Ca", + "rib" + ], + [ + "ћ", + "и" + ], + [ + "▁А", + "л" + ], + [ + "?", + ";" + ], + [ + "▁пи", + "са" + ], + [ + "▁", + "писа" + ], + [ + "▁En", + "sure" + ], + [ + "par", + "sed" + ], + [ + "parse", + 
"d" + ], + [ + "pars", + "ed" + ], + [ + "än", + "ge" + ], + [ + "äng", + "e" + ], + [ + "▁D", + "elta" + ], + [ + "▁Del", + "ta" + ], + [ + "▁", + "Delta" + ], + [ + "▁g", + "aining" + ], + [ + "▁gain", + "ing" + ], + [ + "▁ga", + "ining" + ], + [ + "▁n", + "oting" + ], + [ + "▁not", + "ing" + ], + [ + "▁no", + "ting" + ], + [ + "▁B", + "arb" + ], + [ + "▁Bar", + "b" + ], + [ + "▁Ba", + "rb" + ], + [ + "▁фев", + "ра" + ], + [ + "▁фе", + "вра" + ], + [ + "Em", + "p" + ], + [ + "E", + "mp" + ], + [ + "▁{", + "})" + ], + [ + "▁{}", + ")" + ], + [ + "▁", + "{})" + ], + [ + "▁sy", + "ntax" + ], + [ + "▁syn", + "tax" + ], + [ + "▁synt", + "ax" + ], + [ + "W", + "alk" + ], + [ + "▁P", + "ere" + ], + [ + "▁Per", + "e" + ], + [ + "▁Pe", + "re" + ], + [ + "Is", + "Null" + ], + [ + "▁U", + "V" + ], + [ + "▁", + "UV" + ], + [ + "▁ret", + "val" + ], + [ + "▁", + "retval" + ], + [ + "▁sim", + "plicity" + ], + [ + "▁simpl", + "icity" + ], + [ + "▁rein", + "force" + ], + [ + "Lin", + "q" + ], + [ + "▁diff", + "usion" + ], + [ + "▁dis", + "orders" + ], + [ + "▁disorder", + "s" + ], + [ + "ât", + "re" + ], + [ + "â", + "tre" + ], + [ + "ui", + "ty" + ], + [ + "uit", + "y" + ], + [ + "u", + "ity" + ], + [ + "▁hel", + "pless" + ], + [ + "▁help", + "less" + ], + [ + "Me", + "asure" + ], + [ + "▁com", + "pression" + ], + [ + "▁comp", + "ression" + ], + [ + "▁compr", + "ession" + ], + [ + "▁compress", + "ion" + ], + [ + "▁Co", + "al" + ], + [ + "olut", + "ely" + ], + [ + "olute", + "ly" + ], + [ + "og", + "ue" + ], + [ + "o", + "gue" + ], + [ + "▁up", + "ward" + ], + [ + "▁Block", + "ly" + ], + [ + "▁b", + "ride" + ], + [ + "▁br", + "ide" + ], + [ + "parse", + "Int" + ], + [ + "▁is", + "olation" + ], + [ + "▁isol", + "ation" + ], + [ + "▁regul", + "atory" + ], + [ + "ș", + "ti" + ], + [ + "ric", + "ane" + ], + [ + "м", + "б" + ], + [ + "▁с", + "ло" + ], + [ + "▁", + "сло" + ], + [ + "▁sa", + "lad" + ], + [ + "▁sal", + "ad" + ], + [ + "we", + "i" + ], + [ + "w", + "ei" + ], + [ + "▁B", + "asket" + ], + [ + "▁Bas", + "ket" + ], + [ + "▁M", + "ON" + ], + [ + "▁MO", + "N" + ], + [ + "▁", + "MON" + ], + [ + "\">", + "&" + ], + [ + "\"", + ">&" + ], + [ + "do", + "ors" + ], + [ + "door", + "s" + ], + [ + "▁K", + "ill" + ], + [ + "▁Kil", + "l" + ], + [ + "▁Ki", + "ll" + ], + [ + "▁conspir", + "acy" + ], + [ + "▁M", + "iles" + ], + [ + "▁Mil", + "es" + ], + [ + "▁Mi", + "les" + ], + [ + "wa", + "nt" + ], + [ + "wan", + "t" + ], + [ + "w", + "ant" + ], + [ + "Mod", + "ifier" + ], + [ + "▁batter", + "ies" + ], + [ + "▁batt", + "eries" + ], + [ + "iv", + "as" + ], + [ + "iva", + "s" + ], + [ + "i", + "vas" + ], + [ + "▁att", + "endance" + ], + [ + "▁attend", + "ance" + ], + [ + "▁AUT", + "H" + ], + [ + "▁AU", + "TH" + ], + [ + "▁", + "AUTH" + ], + [ + "▁с", + "ві" + ], + [ + "..", + ".," + ], + [ + "...", + "," + ], + [ + "▁aggreg", + "ate" + ], + [ + "▁de", + "struct" + ], + [ + "▁dest", + "ruct" + ], + [ + "▁four", + "teen" + ], + [ + "▁м", + "ет" + ], + [ + "▁ме", + "т" + ], + [ + "▁", + "мет" + ], + [ + "▁both", + "ered" + ], + [ + "▁bother", + "ed" + ], + [ + "el", + "te" + ], + [ + "elt", + "e" + ], + [ + "e", + "lte" + ], + [ + "▁m", + "ism" + ], + [ + "▁mis", + "m" + ], + [ + "▁mi", + "sm" + ], + [ + "▁res", + "ting" + ], + [ + "▁rest", + "ing" + ], + [ + "▁P", + "ars" + ], + [ + "▁Par", + "s" + ], + [ + "▁Pa", + "rs" + ], + [ + "▁", + "Pars" + ], + [ + "▁id", + "le" + ], + [ + "▁", + "idle" + ], + [ + "▁d", + "eren" + ], + [ + "▁de", + "ren" + ], + [ + "▁der", + "en" + ], + [ + "▁dere", + "n" + ], + [ + "▁di", + 
"ary" + ], + [ + "▁dia", + "ry" + ], + [ + "▁v", + "ague" + ], + [ + "▁va", + "gue" + ], + [ + "▁vag", + "ue" + ], + [ + "▁margin", + "al" + ], + [ + "▁marg", + "inal" + ], + [ + "Wr", + "it" + ], + [ + "W", + "rit" + ], + [ + "Bo", + "t" + ], + [ + "B", + "ot" + ], + [ + "▁Met", + "ro" + ], + [ + "▁e", + "arning" + ], + [ + "▁earn", + "ing" + ], + [ + "▁ear", + "ning" + ], + [ + "hist", + "oire" + ], + [ + "his", + "toire" + ], + [ + "▁end", + "orse" + ], + [ + "▁be", + "ard" + ], + [ + "▁bear", + "d" + ], + [ + "▁Chair", + "man" + ], + [ + "ie", + "b" + ], + [ + "i", + "eb" + ], + [ + "▁neut", + "r" + ], + [ + "▁neu", + "tr" + ], + [ + "▁am", + "bit" + ], + [ + "▁amb", + "it" + ], + [ + "▁Leon", + "ard" + ], + [ + "ban", + "ds" + ], + [ + "band", + "s" + ], + [ + "b", + "ands" + ], + [ + "▁D", + "ale" + ], + [ + "▁Da", + "le" + ], + [ + "▁Dal", + "e" + ], + [ + "▁ver", + "ified" + ], + [ + "Al", + "gorithm" + ], + [ + "Enumer", + "able" + ], + [ + "op", + "code" + ], + [ + "cast", + "le" + ], + [ + "cas", + "tle" + ], + [ + "š", + "e" + ], + [ + "▁Venez", + "uela" + ], + [ + "▁de", + "scriptions" + ], + [ + "▁des", + "criptions" + ], + [ + "▁description", + "s" + ], + [ + "▁value", + "d" + ], + [ + "▁val", + "ued" + ], + [ + "▁chapter", + "s" + ], + [ + "▁chap", + "ters" + ], + [ + "▁I", + "ls" + ], + [ + "▁Il", + "s" + ], + [ + "▁cl", + "arity" + ], + [ + "▁clar", + "ity" + ], + [ + "▁tour", + "ists" + ], + [ + "▁tourist", + "s" + ], + [ + "Da", + "n" + ], + [ + "D", + "an" + ], + [ + "▁t", + "ribe" + ], + [ + "▁tr", + "ibe" + ], + [ + "▁tri", + "be" + ], + [ + "▁trib", + "e" + ], + [ + "▁г", + "и" + ], + [ + "▁", + "ги" + ], + [ + "fol", + "k" + ], + [ + "f", + "olk" + ], + [ + "ac", + "cur" + ], + [ + "acc", + "ur" + ], + [ + "▁St", + "ack" + ], + [ + "▁Sta", + "ck" + ], + [ + "▁", + "Stack" + ], + [ + "▁adv", + "ocate" + ], + [ + "▁advoc", + "ate" + ], + [ + "▁G", + "ene" + ], + [ + "▁Ge", + "ne" + ], + [ + "▁Gen", + "e" + ], + [ + "Im", + "ages" + ], + [ + "Image", + "s" + ], + [ + "▁rig", + "id" + ], + [ + "▁con", + "greg" + ], + [ + "▁congr", + "eg" + ], + [ + "▁start", + "up" + ], + [ + "▁dead", + "line" + ], + [ + "co", + "uld" + ], + [ + "cou", + "ld" + ], + [ + "c", + "ould" + ], + [ + "▁beg", + "ann" + ], + [ + "▁began", + "n" + ], + [ + "▁cal", + "ci" + ], + [ + "▁calc", + "i" + ], + [ + "▁Cir", + "cle" + ], + [ + "▁Circ", + "le" + ], + [ + "▁in", + "cons" + ], + [ + "▁inc", + "ons" + ], + [ + "▁incon", + "s" + ], + [ + "aaaa", + "aaaa" + ], + [ + "▁rub", + "bed" + ], + [ + "ape", + "ut" + ], + [ + "ua", + "rio" + ], + [ + "uar", + "io" + ], + [ + "u", + "ario" + ], + [ + "worth", + "y" + ], + [ + "wor", + "thy" + ], + [ + "wort", + "hy" + ], + [ + "▁уча", + "сти" + ], + [ + "▁участ", + "и" + ], + [ + "▁fam", + "ília" + ], + [ + "▁synchron", + "ized" + ], + [ + "▁unf", + "air" + ], + [ + "rs", + "p" + ], + [ + "r", + "sp" + ], + [ + "▁soc", + "ieties" + ], + [ + "▁societ", + "ies" + ], + [ + "bo", + "at" + ], + [ + "gr", + "o" + ], + [ + "g", + "ro" + ], + [ + "▁k", + "at" + ], + [ + "▁ka", + "t" + ], + [ + "▁", + "kat" + ], + [ + "▁p", + "oker" + ], + [ + "▁po", + "ker" + ], + [ + "▁pok", + "er" + ], + [ + "▁l", + "ocks" + ], + [ + "▁loc", + "ks" + ], + [ + "▁lo", + "cks" + ], + [ + "▁lock", + "s" + ], + [ + "▁G", + "F" + ], + [ + "▁", + "GF" + ], + [ + "▁re", + "conc" + ], + [ + "▁recon", + "c" + ], + [ + "▁Maur", + "ice" + ], + [ + "▁Mau", + "rice" + ], + [ + "__(", + "/*!" 
+ ], + [ + "▁ble", + "eding" + ], + [ + "äs", + "ident" + ], + [ + "▁по", + "след" + ], + [ + "▁после", + "д" + ], + [ + "▁deriv", + "ative" + ], + [ + "ша", + "я" + ], + [ + "cc", + "ió" + ], + [ + "c", + "ció" + ], + [ + "▁cr", + "ushed" + ], + [ + "▁crush", + "ed" + ], + [ + "▁tempor", + "arily" + ], + [ + "▁co", + "aches" + ], + [ + "▁coach", + "es" + ], + [ + "▁Mo", + "vement" + ], + [ + "▁Move", + "ment" + ], + [ + "▁Mov", + "ement" + ], + [ + "}}", + "$." + ], + [ + "}}$", + "." + ], + [ + "}", + "}$." + ], + [ + "▁K", + "yle" + ], + [ + "▁Ky", + "le" + ], + [ + "▁S", + "ohn" + ], + [ + "▁So", + "hn" + ], + [ + "▁cre", + "ator" + ], + [ + "▁creat", + "or" + ], + [ + "ind", + "ust" + ], + [ + "▁E", + "rik" + ], + [ + "▁Er", + "ik" + ], + [ + "▁se", + "iz" + ], + [ + "▁sei", + "z" + ], + [ + "▁dim", + "ensional" + ], + [ + "▁dimension", + "al" + ], + [ + "▁", + "dimensional" + ], + [ + "▁I", + "st" + ], + [ + "▁Is", + "t" + ], + [ + "▁pre", + "val" + ], + [ + "▁pr", + "eval" + ], + [ + "▁prev", + "al" + ], + [ + "he", + "ads" + ], + [ + "head", + "s" + ], + [ + "▁про", + "ти" + ], + [ + "▁determ", + "ines" + ], + [ + "▁determine", + "s" + ], + [ + "▁determin", + "es" + ], + [ + "eg", + "y" + ], + [ + "e", + "gy" + ], + [ + "▁U", + "INT" + ], + [ + "▁UI", + "NT" + ], + [ + "▁", + "UINT" + ], + [ + "▁V", + "olk" + ], + [ + "▁Vol", + "k" + ], + [ + "pa", + "wn" + ], + [ + "p", + "awn" + ], + [ + "Ph", + "oto" + ], + [ + "▁C", + "olin" + ], + [ + "▁Col", + "in" + ], + [ + "▁Co", + "lin" + ], + [ + "ap", + "propri" + ], + [ + "app", + "ropri" + ], + [ + "ort", + "ion" + ], + [ + "st", + "eller" + ], + [ + "stell", + "er" + ], + [ + "É", + "tat" + ], + [ + "▁im", + "ply" + ], + [ + "▁imp", + "ly" + ], + [ + "▁impl", + "y" + ], + [ + "▁t", + "outes" + ], + [ + "▁to", + "utes" + ], + [ + "▁tou", + "tes" + ], + [ + "▁tout", + "es" + ], + [ + "▁toute", + "s" + ], + [ + "VO", + "L" + ], + [ + "V", + "OL" + ], + [ + "an", + "ing" + ], + [ + "ani", + "ng" + ], + [ + "a", + "ning" + ], + [ + "Tool", + "tip" + ], + [ + "ig", + "ious" + ], + [ + "igi", + "ous" + ], + [ + "▁e", + "ternal" + ], + [ + "▁etern", + "al" + ], + [ + "▁P", + "oz" + ], + [ + "▁Po", + "z" + ], + [ + "▁bank", + "rupt" + ], + [ + "▁fail", + "ures" + ], + [ + "▁failure", + "s" + ], + [ + "uer", + "te" + ], + [ + "▁вре", + "ме" + ], + [ + "zu", + "ng" + ], + [ + "z", + "ung" + ], + [ + "▁t", + "cp" + ], + [ + "▁tc", + "p" + ], + [ + "▁", + "tcp" + ], + [ + "▁cont", + "ainers" + ], + [ + "▁contain", + "ers" + ], + [ + "▁container", + "s" + ], + [ + "ou", + "sel" + ], + [ + "ous", + "el" + ], + [ + "ouse", + "l" + ], + [ + "▁H", + "IV" + ], + [ + "▁con", + "ced" + ], + [ + "▁conc", + "ed" + ], + [ + "▁conce", + "d" + ], + [ + "▁sept", + "iembre" + ], + [ + "gi", + "rl" + ], + [ + "g", + "irl" + ], + [ + "▁C", + "ho" + ], + [ + "▁Ch", + "o" + ], + [ + "▁f", + "az" + ], + [ + "▁fa", + "z" + ], + [ + "▁Up", + "per" + ], + [ + "▁", + "Upper" + ], + [ + "▁For", + "ces" + ], + [ + "▁Force", + "s" + ], + [ + "äh", + "lt" + ], + [ + "in", + "ject" + ], + [ + "Re", + "ceived" + ], + [ + "MA", + "T" + ], + [ + "M", + "AT" + ], + [ + "ag", + "lia" + ], + [ + "ów", + "nie" + ], + [ + "ówn", + "ie" + ], + [ + "/", + "'" + ], + [ + "▁p", + "ip" + ], + [ + "▁pi", + "p" + ], + [ + "▁G", + "est" + ], + [ + "▁Ge", + "st" + ], + [ + "▁Ges", + "t" + ], + [ + "▁l", + "ado" + ], + [ + "▁la", + "do" + ], + [ + "▁lad", + "o" + ], + [ + "▁compat", + "ibility" + ], + [ + "▁m", + "are" + ], + [ + "▁mar", + "e" + ], + [ + "▁ma", + "re" + ], + [ + "▁", + "mare" 
+ ], + [ + "▁Cle", + "arly" + ], + [ + "▁Clear", + "ly" + ], + [ + "vers", + "ation" + ], + [ + "Ver", + "s" + ], + [ + "V", + "ers" + ], + [ + "▁ch", + "ick" + ], + [ + "▁chi", + "ck" + ], + [ + "▁organ", + "ize" + ], + [ + "▁organiz", + "e" + ], + [ + "▁econom", + "ics" + ], + [ + "▁economic", + "s" + ], + [ + "▁ancest", + "ors" + ], + [ + "ME", + "D" + ], + [ + "M", + "ED" + ], + [ + "▁sc", + "rub" + ], + [ + "▁scr", + "ub" + ], + [ + "▁label", + "ed" + ], + [ + "▁lab", + "eled" + ], + [ + "▁п", + "р" + ], + [ + "▁S", + "uz" + ], + [ + "▁Su", + "z" + ], + [ + "▁A", + "str" + ], + [ + "▁As", + "tr" + ], + [ + "▁Ast", + "r" + ], + [ + "allow", + "een" + ], + [ + "allo", + "ween" + ], + [ + "rh", + "s" + ], + [ + "r", + "hs" + ], + [ + "as", + "ci" + ], + [ + "asc", + "i" + ], + [ + "▁C", + "ancer" + ], + [ + "▁Can", + "cer" + ], + [ + "▁H", + "unt" + ], + [ + "▁Hun", + "t" + ], + [ + "▁Hu", + "nt" + ], + [ + "▁switch", + "ing" + ], + [ + "▁R", + "idge" + ], + [ + "Se", + "q" + ], + [ + "S", + "eq" + ], + [ + "▁gi", + "ugno" + ], + [ + "bus", + "iness" + ], + [ + "▁char", + "ming" + ], + [ + "▁charm", + "ing" + ], + [ + "▁I", + "o" + ], + [ + "▁", + "Io" + ], + [ + "▁prés", + "ident" + ], + [ + "ek", + "ing" + ], + [ + "e", + "king" + ], + [ + "í", + "l" + ], + [ + "en", + "h" + ], + [ + "e", + "nh" + ], + [ + "pr", + "it" + ], + [ + "p", + "rit" + ], + [ + "erc", + "ise" + ], + [ + "án", + "ak" + ], + [ + "á", + "nak" + ], + [ + "▁х", + "ра" + ], + [ + "▁", + "хра" + ], + [ + "▁b", + "ugs" + ], + [ + "▁bu", + "gs" + ], + [ + "▁bug", + "s" + ], + [ + "▁жи", + "во" + ], + [ + "▁light", + "ning" + ], + [ + "▁never", + "theless" + ], + [ + "▁length", + "s" + ], + [ + "G", + "U" + ], + [ + "H", + "idden" + ], + [ + "Act", + "or" + ], + [ + "Ac", + "tor" + ], + [ + "A", + "ctor" + ], + [ + "To", + "pic" + ], + [ + "Top", + "ic" + ], + [ + "T", + "opic" + ], + [ + "▁H", + "orse" + ], + [ + "▁Hor", + "se" + ], + [ + "ћ", + "е" + ], + [ + "el", + "ines" + ], + [ + "eline", + "s" + ], + [ + "eli", + "nes" + ], + [ + "elin", + "es" + ], + [ + "e", + "lines" + ], + [ + "▁trag", + "edy" + ], + [ + "▁traged", + "y" + ], + [ + "int", + "endo" + ], + [ + "▁abund", + "ance" + ], + [ + "▁ev", + "ac" + ], + [ + "it", + "ably" + ], + [ + "+\\", + "_\\" + ], + [ + "▁rec", + "ib" + ], + [ + "ua", + "ted" + ], + [ + "uate", + "d" + ], + [ + "u", + "ated" + ], + [ + "рі", + "ї" + ], + [ + "▁fool", + "ish" + ], + [ + "▁foo", + "lish" + ], + [ + "▁t", + "m" + ], + [ + "▁", + "tm" + ], + [ + "▁des", + "pair" + ], + [ + "▁desp", + "air" + ], + [ + "TO", + "KEN" + ], + [ + "▁comp", + "romise" + ], + [ + "▁comprom", + "ise" + ], + [ + "▁Person", + "en" + ], + [ + "▁Pers", + "onen" + ], + [ + "▁investig", + "ated" + ], + [ + "▁investigate", + "d" + ], + [ + "▁ex", + "clude" + ], + [ + "▁excl", + "ude" + ], + [ + "▁telev", + "is" + ], + [ + "▁tele", + "vis" + ], + [ + "▁pull", + "s" + ], + [ + "▁pul", + "ls" + ], + [ + "▁according", + "ly" + ], + [ + "▁accord", + "ingly" + ], + [ + "▁f", + "ő" + ], + [ + "▁Le", + "ave" + ], + [ + "▁", + "Leave" + ], + [ + "oper", + "ations" + ], + [ + "operation", + "s" + ], + [ + "cri", + "m" + ], + [ + "cr", + "im" + ], + [ + "c", + "rim" + ], + [ + "▁r", + "hs" + ], + [ + "▁rh", + "s" + ], + [ + "▁", + "rhs" + ], + [ + "▁form", + "ally" + ], + [ + "▁formal", + "ly" + ], + [ + "▁L", + "ily" + ], + [ + "▁Li", + "ly" + ], + [ + "▁Lil", + "y" + ], + [ + "▁Com", + "ments" + ], + [ + "▁Comm", + "ents" + ], + [ + "▁Comment", + "s" + ], + [ + "▁se", + "ptember" + ], + [ + "▁sept", + "ember" 
+ ], + [ + "ie", + "fs" + ], + [ + "ief", + "s" + ], + [ + "▁tre", + "asure" + ], + [ + "Http", + "Servlet" + ], + [ + "ді", + "в" + ], + [ + "д", + "ів" + ], + [ + "▁dis", + "claimer" + ], + [ + "lu", + "ss" + ], + [ + "l", + "uss" + ], + [ + "▁ка", + "о" + ], + [ + "ro", + "gen" + ], + [ + "rog", + "en" + ], + [ + "r", + "ogen" + ], + [ + "▁Start", + "ing" + ], + [ + "▁Star", + "ting" + ], + [ + "▁d", + "ém" + ], + [ + "▁dé", + "m" + ], + [ + "▁select", + "ing" + ], + [ + "▁", + "↘" + ], + [ + "▁О", + "н" + ], + [ + "▁Pract", + "ice" + ], + [ + "▁p", + "orte" + ], + [ + "▁por", + "te" + ], + [ + "▁port", + "e" + ], + [ + "▁", + "porte" + ], + [ + "▁as", + "sure" + ], + [ + "▁ass", + "ure" + ], + [ + "▁frustr", + "ated" + ], + [ + "S", + "ink" + ], + [ + "▁A", + "ri" + ], + [ + "▁Ar", + "i" + ], + [ + "▁esc", + "ort" + ], + [ + "ais", + "es" + ], + [ + "ai", + "ses" + ], + [ + "aise", + "s" + ], + [ + "a", + "ises" + ], + [ + "▁b", + "ush" + ], + [ + "▁bu", + "sh" + ], + [ + "▁bus", + "h" + ], + [ + "▁Se", + "ine" + ], + [ + "▁F", + "ill" + ], + [ + "▁Fil", + "l" + ], + [ + "▁Fi", + "ll" + ], + [ + "▁", + "Fill" + ], + [ + "▁S", + "ull" + ], + [ + "▁Su", + "ll" + ], + [ + "▁Sul", + "l" + ], + [ + "Do", + "t" + ], + [ + "D", + "ot" + ], + [ + "vi", + "l" + ], + [ + "v", + "il" + ], + [ + "un", + "ing" + ], + [ + "uni", + "ng" + ], + [ + "u", + "ning" + ], + [ + "Render", + "ing" + ], + [ + "Rend", + "ering" + ], + [ + "sh", + "ake" + ], + [ + "sha", + "ke" + ], + [ + "пи", + "си" + ], + [ + "пис", + "и" + ], + [ + "pt", + "e" + ], + [ + "p", + "te" + ], + [ + "▁b", + "end" + ], + [ + "▁be", + "nd" + ], + [ + "▁ben", + "d" + ], + [ + "▁jewel", + "ry" + ], + [ + "▁Stock", + "holm" + ], + [ + "▁Hon", + "estly" + ], + [ + "!", + "[" + ], + [ + "▁array", + "s" + ], + [ + "▁arr", + "ays" + ], + [ + "▁War", + "ner" + ], + [ + "▁sh", + "aft" + ], + [ + "▁sha", + "ft" + ], + [ + "▁C", + "ann" + ], + [ + "▁Can", + "n" + ], + [ + "▁Ca", + "nn" + ], + [ + "▁Pitt", + "sburgh" + ], + [ + "ir", + "ical" + ], + [ + "iri", + "cal" + ], + [ + "i", + "rical" + ], + [ + "au", + "tre" + ], + [ + "aut", + "re" + ], + [ + "▁R", + "ück" + ], + [ + "▁gen", + "naio" + ], + [ + "▁Ш", + "а" + ], + [ + "an", + "nte" + ], + [ + "ann", + "te" + ], + [ + "annt", + "e" + ], + [ + "ps", + "hire" + ], + [ + "p", + "shire" + ], + [ + "но", + "логи" + ], + [ + "н", + "ологи" + ], + [ + "ét", + "a" + ], + [ + "é", + "ta" + ], + [ + "▁pr", + "inter" + ], + [ + "▁print", + "er" + ], + [ + "▁prin", + "ter" + ], + [ + "▁dam", + "ages" + ], + [ + "▁damage", + "s" + ], + [ + "▁Isa", + "ac" + ], + [ + "▁Famil", + "ie" + ], + [ + "Code", + "s" + ], + [ + "Co", + "des" + ], + [ + "C", + "odes" + ], + [ + "th", + "rift" + ], + [ + "no", + "b" + ], + [ + "n", + "ob" + ], + [ + "▁c", + "av" + ], + [ + "▁ca", + "v" + ], + [ + "▁techn", + "ically" + ], + [ + "▁technical", + "ly" + ], + [ + "▁I", + "mm" + ], + [ + "▁Im", + "m" + ], + [ + "▁tr", + "icks" + ], + [ + "▁tri", + "cks" + ], + [ + "▁trick", + "s" + ], + [ + "EA", + "R" + ], + [ + "E", + "AR" + ], + [ + "▁Sub", + "ject" + ], + [ + "▁", + "Subject" + ], + [ + "▁ne", + "eding" + ], + [ + "▁need", + "ing" + ], + [ + "▁G", + "ir" + ], + [ + "▁Gi", + "r" + ], + [ + "Bo", + "ard" + ], + [ + "B", + "oard" + ], + [ + "▁re", + "he" + ], + [ + "▁rem", + "inder" + ], + [ + "▁remind", + "er" + ], + [ + "▁sh", + "iver" + ], + [ + "K", + "it" + ], + [ + "▁strugg", + "les" + ], + [ + "▁struggle", + "s" + ], + [ + "▁gen", + "om" + ], + [ + "▁ge", + "nom" + ], + [ + "im", + "il" + ], + [ + "imi", + 
"l" + ], + [ + "i", + "mil" + ], + [ + "Reg", + "istration" + ], + [ + "▁gl", + "oves" + ], + [ + "▁glo", + "ves" + ], + [ + "▁Z", + "ur" + ], + [ + "▁Zu", + "r" + ], + [ + "▁B", + "eg" + ], + [ + "▁Be", + "g" + ], + [ + "▁in", + "clusive" + ], + [ + "▁incl", + "usive" + ], + [ + "/", + "," + ], + [ + "og", + "an" + ], + [ + "oga", + "n" + ], + [ + "o", + "gan" + ], + [ + "po", + "que" + ], + [ + "cont", + "rib" + ], + [ + "contr", + "ib" + ], + [ + "ши", + "н" + ], + [ + "ш", + "ин" + ], + [ + "▁M", + "ama" + ], + [ + "▁Ma", + "ma" + ], + [ + "▁Mam", + "a" + ], + [ + "print", + "s" + ], + [ + "▁re", + "named" + ], + [ + "▁ren", + "amed" + ], + [ + "ють", + "ся" + ], + [ + "ю", + "ться" + ], + [ + "net", + "dev" + ], + [ + "▁comp", + "ile" + ], + [ + "▁", + "compile" + ], + [ + "▁", + "§" + ], + [ + "M", + "UL" + ], + [ + "▁dr", + "aws" + ], + [ + "▁draw", + "s" + ], + [ + "co", + "ck" + ], + [ + "c", + "ock" + ], + [ + "▁сво", + "и" + ], + [ + "▁M", + "um" + ], + [ + "▁Mu", + "m" + ], + [ + "sp", + "ieler" + ], + [ + "spi", + "eler" + ], + [ + "s", + "pieler" + ], + [ + "▁n", + "ail" + ], + [ + "▁na", + "il" + ], + [ + "▁", + "nail" + ], + [ + "▁trans", + "it" + ], + [ + "▁S", + "aw" + ], + [ + "▁Sa", + "w" + ], + [ + "▁com", + "press" + ], + [ + "▁comp", + "ress" + ], + [ + "▁compre", + "ss" + ], + [ + "▁compr", + "ess" + ], + [ + "▁", + "compress" + ], + [ + "▁purch", + "ases" + ], + [ + "▁purchase", + "s" + ], + [ + "▁per", + "forms" + ], + [ + "▁perform", + "s" + ], + [ + "▁dem", + "ol" + ], + [ + "▁demo", + "l" + ], + [ + "▁comm", + "ence" + ], + [ + "▁C", + "B" + ], + [ + "▁", + "CB" + ], + [ + "▁A", + "ber" + ], + [ + "▁Ab", + "er" + ], + [ + "▁c", + "ush" + ], + [ + "▁cu", + "sh" + ], + [ + "▁ком", + "п" + ], + [ + "▁ру", + "ко" + ], + [ + "▁Muham", + "mad" + ], + [ + "▁Net", + "flix" + ], + [ + "▁Environment", + "al" + ], + [ + "No", + "rm" + ], + [ + "N", + "orm" + ], + [ + "▁w", + "ir" + ], + [ + "null", + "ptr" + ], + [ + "▁refuge", + "es" + ], + [ + "до", + "н" + ], + [ + "д", + "он" + ], + [ + "▁B", + "irmingham" + ], + [ + "New", + "s" + ], + [ + "Ne", + "ws" + ], + [ + "▁В", + "се" + ], + [ + "Or", + "ient" + ], + [ + "O", + "rient" + ], + [ + "As", + "sembly" + ], + [ + "▁introdu", + "cing" + ], + [ + "fin", + "der" + ], + [ + "find", + "er" + ], + [ + "fi", + "nder" + ], + [ + "f", + "inder" + ], + [ + "▁scholar", + "ship" + ], + [ + "▁scholars", + "hip" + ], + [ + "▁ос", + "нова" + ], + [ + "▁основ", + "а" + ], + [ + "if", + "a" + ], + [ + "i", + "fa" + ], + [ + "Si", + "ng" + ], + [ + "S", + "ing" + ], + [ + "ib", + "lic" + ], + [ + "ibli", + "c" + ], + [ + "i", + "blic" + ], + [ + "istribut", + "ed" + ], + [ + "istribute", + "d" + ], + [ + "▁depart", + "ments" + ], + [ + "▁department", + "s" + ], + [ + "CR", + "EF" + ], + [ + "CRE", + "F" + ], + [ + "C", + "REF" + ], + [ + "▁Malays", + "ia" + ], + [ + "CO", + "NF" + ], + [ + "CON", + "F" + ], + [ + "▁Cl", + "aud" + ], + [ + "▁Bu", + "ilt" + ], + [ + "▁", + "Built" + ], + [ + "RAN", + "GE" + ], + [ + "Re", + "direct" + ], + [ + "Red", + "irect" + ], + [ + "LE", + "ASE" + ], + [ + "--", + "-------" + ], + [ + "----", + "-----" + ], + [ + "--------", + "-" + ], + [ + "---", + "------" + ], + [ + "-----", + "----" + ], + [ + "------", + "---" + ], + [ + "-------", + "--" + ], + [ + "-", + "--------" + ], + [ + "▁П", + "у" + ], + [ + "▁n", + "umpy" + ], + [ + "▁num", + "py" + ], + [ + "▁project", + "ed" + ], + [ + "▁remind", + "s" + ], + [ + "▁-", + "*-" + ], + [ + "ib", + "ling" + ], + [ + "ibli", + "ng" + ], + [ + "i", + 
"bling" + ], + [ + "▁s", + "lower" + ], + [ + "▁sl", + "ower" + ], + [ + "▁slow", + "er" + ], + [ + "op", + "p" + ], + [ + "o", + "pp" + ], + [ + "ro", + "pic" + ], + [ + "rop", + "ic" + ], + [ + "r", + "opic" + ], + [ + "▁Mont", + "real" + ], + [ + "▁detect", + "ive" + ], + [ + "TH", + "READ" + ], + [ + "▁qu", + "é" + ], + [ + "▁R", + "osa" + ], + [ + "▁Ro", + "sa" + ], + [ + "▁Ros", + "a" + ], + [ + "▁seven", + "th" + ], + [ + "▁sevent", + "h" + ], + [ + "Col", + "ors" + ], + [ + "Color", + "s" + ], + [ + "de", + "mo" + ], + [ + "dem", + "o" + ], + [ + "▁E", + "sta" + ], + [ + "▁Est", + "a" + ], + [ + "▁Es", + "ta" + ], + [ + "ff", + "f" + ], + [ + "f", + "ff" + ], + [ + "ick", + "ets" + ], + [ + "icket", + "s" + ], + [ + "Gr", + "e" + ], + [ + "G", + "re" + ], + [ + "á", + "b" + ], + [ + "bo", + "ost" + ], + [ + "▁Go", + "ing" + ], + [ + "▁Su", + "ite" + ], + [ + "▁", + "Suite" + ], + [ + "▁adapt", + "ation" + ], + [ + "▁j", + "ours" + ], + [ + "▁jour", + "s" + ], + [ + "▁jo", + "urs" + ], + [ + "▁jou", + "rs" + ], + [ + "▁", + "jours" + ], + [ + "▁Or", + "th" + ], + [ + "▁Ort", + "h" + ], + [ + "х", + "і" + ], + [ + "Fig", + "ure" + ], + [ + "▁su", + "pers" + ], + [ + "▁sup", + "ers" + ], + [ + "▁super", + "s" + ], + [ + "▁access", + "ories" + ], + [ + "we", + "ak" + ], + [ + "▁dist", + "ress" + ], + [ + "fr", + "ied" + ], + [ + "f", + "ried" + ], + [ + "▁go", + "og" + ], + [ + "ка", + "з" + ], + [ + "▁far", + "mer" + ], + [ + "▁farm", + "er" + ], + [ + "it", + "ational" + ], + [ + "itation", + "al" + ], + [ + "itat", + "ional" + ], + [ + "Go", + "ld" + ], + [ + "G", + "old" + ], + [ + "▁ass", + "hole" + ], + [ + "▁assh", + "ole" + ], + [ + "▁Cont", + "roller" + ], + [ + "▁Control", + "ler" + ], + [ + "▁", + "Controller" + ], + [ + "▁ар", + "хи" + ], + [ + "To", + "o" + ], + [ + "T", + "oo" + ], + [ + "▁mol", + "to" + ], + [ + "▁p", + "ropri" + ], + [ + "▁prop", + "ri" + ], + [ + "▁", + "propri" + ], + [ + "▁al", + "go" + ], + [ + "▁alg", + "o" + ], + [ + "Af", + "f" + ], + [ + "A", + "ff" + ], + [ + "re", + "sc" + ], + [ + "res", + "c" + ], + [ + "r", + "esc" + ], + [ + "▁D", + "y" + ], + [ + "▁con", + "gr" + ], + [ + "▁T", + "es" + ], + [ + "▁Te", + "s" + ], + [ + "▁W", + "IN" + ], + [ + "▁", + "WIN" + ], + [ + "de", + "serialize" + ], + [ + "des", + "erialize" + ], + [ + "sy", + "n" + ], + [ + "s", + "yn" + ], + [ + "▁chem", + "istry" + ], + [ + "m", + "iddle" + ], + [ + "▁an", + "imated" + ], + [ + "▁anim", + "ated" + ], + [ + "▁K", + "um" + ], + [ + "▁Ku", + "m" + ], + [ + "file", + "Name" + ], + [ + "Amer", + "ica" + ], + [ + "▁dr", + "ums" + ], + [ + "▁dru", + "ms" + ], + [ + "▁drum", + "s" + ], + [ + "▁program", + "a" + ], + [ + "▁n", + "ej" + ], + [ + "▁ne", + "j" + ], + [ + "▁", + "nej" + ], + [ + "Read", + "Only" + ], + [ + "▁Б", + "ра" + ], + [ + "--", + "-----" + ], + [ + "----", + "---" + ], + [ + "---", + "----" + ], + [ + "-----", + "--" + ], + [ + "------", + "-" + ], + [ + "-", + "------" + ], + [ + "Mut", + "ex" + ], + [ + "Mu", + "tex" + ], + [ + "un", + "ned" + ], + [ + "unn", + "ed" + ], + [ + "ynam", + "ics" + ], + [ + "ynamic", + "s" + ], + [ + "co", + "system" + ], + [ + "cos", + "ystem" + ], + [ + "▁R", + "ect" + ], + [ + "▁Re", + "ct" + ], + [ + "▁Rec", + "t" + ], + [ + "▁", + "Rect" + ], + [ + "▁an", + "ime" + ], + [ + "▁anim", + "e" + ], + [ + "▁I", + "BM" + ], + [ + "▁need", + "le" + ], + [ + "es", + "ser" + ], + [ + "ess", + "er" + ], + [ + "esse", + "r" + ], + [ + "▁incl", + "u" + ], + [ + "▁inc", + "lu" + ], + [ + "Le", + "an" + ], + [ + "tr", + 
"aining" + ], + [ + "tra", + "ining" + ], + [ + "train", + "ing" + ], + [ + "▁b", + "our" + ], + [ + "▁bo", + "ur" + ], + [ + "▁bou", + "r" + ], + [ + "▁", + "bour" + ], + [ + "ab", + "ases" + ], + [ + "abase", + "s" + ], + [ + "aba", + "ses" + ], + [ + "▁tak", + "że" + ], + [ + "wa", + "rz" + ], + [ + "war", + "z" + ], + [ + "w", + "arz" + ], + [ + "▁ste", + "pping" + ], + [ + "▁step", + "ping" + ], + [ + "▁T", + "IME" + ], + [ + "▁", + "TIME" + ], + [ + "▁Ein", + "stein" + ], + [ + "▁Log", + "in" + ], + [ + "▁Lo", + "gin" + ], + [ + "▁", + "Login" + ], + [ + "pon", + "ential" + ], + [ + "ponent", + "ial" + ], + [ + "De", + "ad" + ], + [ + "D", + "ead" + ], + [ + "in", + "str" + ], + [ + "ins", + "tr" + ], + [ + "inst", + "r" + ], + [ + "▁ne", + "ural" + ], + [ + "▁neu", + "ral" + ], + [ + "▁neur", + "al" + ], + [ + "▁ub", + "ic" + ], + [ + "▁Init", + "ialized" + ], + [ + "▁Initialize", + "d" + ], + [ + "▁Initial", + "ized" + ], + [ + "▁", + "Initialized" + ], + [ + "▁facil", + "itate" + ], + [ + "G", + "D" + ], + [ + "}{", + "(" + ], + [ + "}", + "{(" + ], + [ + "D", + "ark" + ], + [ + "▁n", + "ag" + ], + [ + "▁na", + "g" + ], + [ + "min", + "ipage" + ], + [ + "Size", + "s" + ], + [ + "Si", + "zes" + ], + [ + "S", + "izes" + ], + [ + "▁w", + "orm" + ], + [ + "▁wor", + "m" + ], + [ + "▁wo", + "rm" + ], + [ + "bi", + "as" + ], + [ + "bia", + "s" + ], + [ + "b", + "ias" + ], + [ + "Su", + "ch" + ], + [ + "S", + "uch" + ], + [ + "wick", + "lung" + ], + [ + "▁sp", + "ouse" + ], + [ + "▁spo", + "use" + ], + [ + "▁surviv", + "ors" + ], + [ + "er", + "st" + ], + [ + "ers", + "t" + ], + [ + "at", + "ype" + ], + [ + "aty", + "pe" + ], + [ + "a", + "type" + ], + [ + "})", + "$," + ], + [ + "})$", + "," + ], + [ + "}", + ")$," + ], + [ + "▁n", + "l" + ], + [ + "▁", + "nl" + ], + [ + "▁cogn", + "itive" + ], + [ + "▁o", + "nde" + ], + [ + "▁on", + "de" + ], + [ + "▁", + "onde" + ], + [ + "▁en", + "abling" + ], + [ + "▁soc", + "iet" + ], + [ + "▁soci", + "et" + ], + [ + "▁c", + "lan" + ], + [ + "▁cl", + "an" + ], + [ + "▁ex", + "cluded" + ], + [ + "▁excl", + "uded" + ], + [ + "▁exclude", + "d" + ], + [ + "▁th", + "under" + ], + [ + "▁ent", + "ropy" + ], + [ + "▁entr", + "opy" + ], + [ + "▁fast", + "est" + ], + [ + "RE", + "EN" + ], + [ + "REE", + "N" + ], + [ + "▁Vien", + "na" + ], + [ + "▁fl", + "owing" + ], + [ + "▁flo", + "wing" + ], + [ + "▁flow", + "ing" + ], + [ + "▁aff", + "irm" + ], + [ + "al", + "om" + ], + [ + "alo", + "m" + ], + [ + "▁h", + "ips" + ], + [ + "▁hi", + "ps" + ], + [ + "▁hip", + "s" + ], + [ + "▁can", + "nab" + ], + [ + "▁st", + "icks" + ], + [ + "▁stick", + "s" + ], + [ + "▁cur", + "riculum" + ], + [ + "▁ret", + "ained" + ], + [ + "▁retain", + "ed" + ], + [ + "▁ext", + "ending" + ], + [ + "▁extend", + "ing" + ], + [ + "ó", + "z" + ], + [ + "he", + "aded" + ], + [ + "head", + "ed" + ], + [ + "ex", + "c" + ], + [ + "e", + "xc" + ], + [ + "▁je", + "ho" + ], + [ + "▁for", + "ests" + ], + [ + "▁fore", + "sts" + ], + [ + "▁forest", + "s" + ], + [ + "ma", + "nia" + ], + [ + "man", + "ia" + ], + [ + "m", + "ania" + ], + [ + "▁C", + "anal" + ], + [ + "▁Can", + "al" + ], + [ + "▁Ca", + "nal" + ], + [ + "▁S", + "out" + ], + [ + "▁So", + "ut" + ], + [ + "▁Sou", + "t" + ], + [ + "▁B", + "ahn" + ], + [ + "▁Ba", + "hn" + ], + [ + "▁Bah", + "n" + ], + [ + "▁T", + "EXT" + ], + [ + "▁TE", + "XT" + ], + [ + "▁", + "TEXT" + ], + [ + "▁др", + "жа" + ], + [ + "▁User", + "s" + ], + [ + "▁Us", + "ers" + ], + [ + "▁Use", + "rs" + ], + [ + "▁", + "Users" + ], + [ + "▁G", + "EN" + ], + [ + "▁", + "GEN" + 
], + [ + "sl", + "ash" + ], + [ + "ben", + "falls" + ], + [ + "Text", + "Field" + ], + [ + "▁r", + "av" + ], + [ + "▁ra", + "v" + ], + [ + "▁", + "rav" + ], + [ + "▁continu", + "ously" + ], + [ + "▁continuous", + "ly" + ], + [ + "IT", + "ER" + ], + [ + "ITE", + "R" + ], + [ + "I", + "TER" + ], + [ + "▁Jen", + "ny" + ], + [ + "▁Jenn", + "y" + ], + [ + "ch", + "os" + ], + [ + "cho", + "s" + ], + [ + "c", + "hos" + ], + [ + "▁am", + "big" + ], + [ + "▁amb", + "ig" + ], + [ + "▁ж", + "ур" + ], + [ + "Aut", + "ow" + ], + [ + "Auto", + "w" + ], + [ + "▁V", + "eter" + ], + [ + "▁Ve", + "ter" + ], + [ + "▁dest", + "in" + ], + [ + "H", + "om" + ], + [ + "au", + "ge" + ], + [ + "aug", + "e" + ], + [ + "a", + "uge" + ], + [ + "▁com", + "mod" + ], + [ + "▁comm", + "od" + ], + [ + "▁gar", + "lic" + ], + [ + "<", + "=" + ], + [ + "▁dram", + "atically" + ], + [ + "▁dramatic", + "ally" + ], + [ + "CA", + "N" + ], + [ + "C", + "AN" + ], + [ + "an", + "cers" + ], + [ + "ance", + "rs" + ], + [ + "anc", + "ers" + ], + [ + "ancer", + "s" + ], + [ + "()", + "}" + ], + [ + "(", + ")}" + ], + [ + "gh", + "ai" + ], + [ + "▁tw", + "ee" + ], + [ + "▁twe", + "e" + ], + [ + "▁сент", + "ября" + ], + [ + "GP", + "U" + ], + [ + "G", + "PU" + ], + [ + "▁B", + "omb" + ], + [ + "▁Bo", + "mb" + ], + [ + "▁young", + "est" + ], + [ + "▁c", + "age" + ], + [ + "▁ca", + "ge" + ], + [ + "ok", + "s" + ], + [ + "o", + "ks" + ], + [ + "ic", + "hes" + ], + [ + "ich", + "es" + ], + [ + "iche", + "s" + ], + [ + "i", + "ches" + ], + [ + "▁T", + "ests" + ], + [ + "▁Te", + "sts" + ], + [ + "▁Test", + "s" + ], + [ + "▁Tes", + "ts" + ], + [ + "▁", + "Tests" + ], + [ + "sk", + "ý" + ], + [ + "cur", + "y" + ], + [ + "cu", + "ry" + ], + [ + "c", + "ury" + ], + [ + "na", + "ls" + ], + [ + "nal", + "s" + ], + [ + "n", + "als" + ], + [ + "ț", + "a" + ], + [ + "▁V", + "oice" + ], + [ + "▁Vo", + "ice" + ], + [ + "Depend", + "ency" + ], + [ + "v", + "f" + ], + [ + "e", + "ous" + ], + [ + "▁Z", + "a" + ], + [ + "▁am", + "ateur" + ], + [ + "▁G", + "host" + ], + [ + "▁Gh", + "ost" + ], + [ + "▁dis", + "ability" + ], + [ + "▁Вла", + "ди" + ], + [ + "▁rev", + "enge" + ], + [ + "▁reven", + "ge" + ], + [ + "Trans", + "lation" + ], + [ + "▁cour", + "tesy" + ], + [ + "ски", + "я" + ], + [ + "▁bl", + "ob" + ], + [ + "▁blo", + "b" + ], + [ + "▁", + "blob" + ], + [ + "ä", + "ß" + ], + [ + "ó", + "j" + ], + [ + "▁print", + "s" + ], + [ + "▁prin", + "ts" + ], + [ + "▁", + "prints" + ], + [ + "▁pro", + "ves" + ], + [ + "▁pr", + "oves" + ], + [ + "▁prov", + "es" + ], + [ + "▁prove", + "s" + ], + [ + ">?", + "[<" + ], + [ + "▁ut", + "ils" + ], + [ + "▁util", + "s" + ], + [ + "▁", + "utils" + ], + [ + "ty", + "pen" + ], + [ + "type", + "n" + ], + [ + "typ", + "en" + ], + [ + "▁t", + "erra" + ], + [ + "▁ter", + "ra" + ], + [ + "▁terr", + "a" + ], + [ + "▁", + "terra" + ], + [ + "▁min", + "eral" + ], + [ + "▁mine", + "ral" + ], + [ + "▁miner", + "al" + ], + [ + "▁war", + "rior" + ], + [ + "▁ме", + "ст" + ], + [ + "▁D", + "S" + ], + [ + "▁", + "DS" + ], + [ + "Em", + "b" + ], + [ + "E", + "mb" + ], + [ + "get", + "Data" + ], + [ + "ли", + "чи" + ], + [ + "лич", + "и" + ], + [ + "▁sa", + "fer" + ], + [ + "▁saf", + "er" + ], + [ + "▁safe", + "r" + ], + [ + "▁com", + "une" + ], + [ + "▁comun", + "e" + ], + [ + "▁hier", + "archy" + ], + [ + "Cred", + "entials" + ], + [ + "res", + "se" + ], + [ + "ress", + "e" + ], + [ + "r", + "esse" + ], + [ + "gr", + "av" + ], + [ + "gra", + "v" + ], + [ + "g", + "rav" + ], + [ + "lo", + "gs" + ], + [ + "log", + "s" + ], + [ + "l", + "ogs" 
+ ], + [ + "br", + "os" + ], + [ + "bro", + "s" + ], + [ + "b", + "ros" + ], + [ + "BUT", + "TON" + ], + [ + "lit", + "eral" + ], + [ + "liter", + "al" + ], + [ + "l", + "iteral" + ], + [ + "▁S", + "r" + ], + [ + "an", + "tal" + ], + [ + "ant", + "al" + ], + [ + "anta", + "l" + ], + [ + "▁mer", + "cy" + ], + [ + "▁merc", + "y" + ], + [ + "DA", + "P" + ], + [ + "D", + "AP" + ], + [ + "▁Mag", + "gie" + ], + [ + "▁sust", + "ained" + ], + [ + "▁sustain", + "ed" + ], + [ + "N", + "M" + ], + [ + "Re", + "view" + ], + [ + "Rev", + "iew" + ], + [ + "▁Buen", + "os" + ], + [ + "▁de", + "aler" + ], + [ + "▁deal", + "er" + ], + [ + "en", + "es" + ], + [ + "ene", + "s" + ], + [ + "e", + "nes" + ], + [ + "▁file", + "Name" + ], + [ + "▁", + "fileName" + ], + [ + "bb", + "ra" + ], + [ + "b", + "bra" + ], + [ + "ро", + "ма" + ], + [ + "ром", + "а" + ], + [ + "Inst", + "all" + ], + [ + "▁Mor", + "ning" + ], + [ + "LE", + "T" + ], + [ + "L", + "ET" + ], + [ + "ip", + "a" + ], + [ + "i", + "pa" + ], + [ + "G", + "a" + ], + [ + "го", + "в" + ], + [ + "г", + "ов" + ], + [ + "▁Sche", + "dule" + ], + [ + "▁", + "Schedule" + ], + [ + "▁rep", + "orters" + ], + [ + "▁report", + "ers" + ], + [ + "▁reporter", + "s" + ], + [ + "▁pecul", + "iar" + ], + [ + "▁sup", + "plier" + ], + [ + ")$", + "-" + ], + [ + ")", + "$-" + ], + [ + "ë", + "l" + ], + [ + "▁roll", + "s" + ], + [ + "▁né", + "cess" + ], + [ + "▁p", + "reg" + ], + [ + "▁pre", + "g" + ], + [ + "▁pr", + "eg" + ], + [ + "▁Re", + "yn" + ], + [ + "▁sur", + "render" + ], + [ + "▁contribut", + "ing" + ], + [ + ")+", + "\\" + ], + [ + ")", + "+\\" + ], + [ + "PRO", + "P" + ], + [ + "PR", + "OP" + ], + [ + "P", + "ROP" + ], + [ + "▁dec", + "imal" + ], + [ + "▁Town", + "ship" + ], + [ + "gr", + "p" + ], + [ + "g", + "rp" + ], + [ + "▁terror", + "ist" + ], + [ + "pt", + "o" + ], + [ + "p", + "to" + ], + [ + "on", + "en" + ], + [ + "one", + "n" + ], + [ + "o", + "nen" + ], + [ + "▁Polit", + "ics" + ], + [ + "▁Pe", + "arl" + ], + [ + "▁Pear", + "l" + ], + [ + "▁pil", + "low" + ], + [ + "▁pill", + "ow" + ], + [ + "▁gr", + "ades" + ], + [ + "▁grad", + "es" + ], + [ + "▁grade", + "s" + ], + [ + "▁gra", + "des" + ], + [ + "▁", + "grades" + ], + [ + "TH", + "E" + ], + [ + "T", + "HE" + ], + [ + "▁num", + "ero" + ], + [ + "▁numer", + "o" + ], + [ + "▁nu", + "mero" + ], + [ + "i", + "NdEx" + ], + [ + "M", + "igration" + ], + [ + "PE", + "ND" + ], + [ + "P", + "END" + ], + [ + "ph", + "oto" + ], + [ + "▁cent", + "ered" + ], + [ + "▁center", + "ed" + ], + [ + "▁r", + "het" + ], + [ + "▁rh", + "et" + ], + [ + "egr", + "ünd" + ], + [ + "▁laund", + "ry" + ], + [ + "get", + "Node" + ], + [ + "▁est", + "imation" + ], + [ + "▁estim", + "ation" + ], + [ + "▁I", + "v" + ], + [ + "▁wh", + "oles" + ], + [ + "▁who", + "les" + ], + [ + "▁whole", + "s" + ], + [ + "ше", + "ния" + ], + [ + "▁const", + "itutional" + ], + [ + "▁constitution", + "al" + ], + [ + "am", + "ination" + ], + [ + "amin", + "ation" + ], + [ + "▁Municip", + "al" + ], + [ + "ad", + "t" + ], + [ + "a", + "dt" + ], + [ + "th", + "y" + ], + [ + "t", + "hy" + ], + [ + "▁pub", + "li" + ], + [ + "▁di", + "cembre" + ], + [ + "▁dic", + "embre" + ], + [ + "▁dice", + "mbre" + ], + [ + "`", + ")" + ], + [ + "▁Ch", + "rome" + ], + [ + "ef", + "e" + ], + [ + "e", + "fe" + ], + [ + "con", + "g" + ], + [ + "co", + "ng" + ], + [ + "c", + "ong" + ], + [ + "bre", + "aking" + ], + [ + "break", + "ing" + ], + [ + "at", + "ched" + ], + [ + "atch", + "ed" + ], + [ + "es", + "tr" + ], + [ + "est", + "r" + ], + [ + "e", + "str" + ], + [ + "▁i", + 
"di" + ], + [ + "▁id", + "i" + ], + [ + "▁", + "idi" + ], + [ + "VER", + "Y" + ], + [ + "V", + "ERY" + ], + [ + "▁app", + "el" + ], + [ + "▁ap", + "pel" + ], + [ + "▁appe", + "l" + ], + [ + "▁Techn", + "ical" + ], + [ + "tc", + "x" + ], + [ + "t", + "cx" + ], + [ + "DO", + "UBLE" + ], + [ + "se", + "k" + ], + [ + "s", + "ek" + ], + [ + "hu", + "ng" + ], + [ + "h", + "ung" + ], + [ + "▁A", + "ur" + ], + [ + "▁Au", + "r" + ], + [ + "coll", + "apse" + ], + [ + "▁adv", + "ise" + ], + [ + "▁advis", + "e" + ], + [ + "▁Pr", + "imary" + ], + [ + "▁Pri", + "mary" + ], + [ + "▁Prim", + "ary" + ], + [ + "▁", + "Primary" + ], + [ + "ia", + "z" + ], + [ + "i", + "az" + ], + [ + "▁a", + "nten" + ], + [ + "▁an", + "ten" + ], + [ + "▁ant", + "en" + ], + [ + "▁ante", + "n" + ], + [ + "▁", + "anten" + ], + [ + "▁bro", + "ader" + ], + [ + "▁broad", + "er" + ], + [ + "▁ju", + "nio" + ], + [ + "▁jun", + "io" + ], + [ + "▁juni", + "o" + ], + [ + "▁w", + "ool" + ], + [ + "▁wo", + "ol" + ], + [ + "▁hat", + "red" + ], + [ + "▁ex", + "agger" + ], + [ + "Con", + "v" + ], + [ + "Co", + "nv" + ], + [ + "kt", + "ur" + ], + [ + "▁em", + "peror" + ], + [ + "▁Pack", + "age" + ], + [ + "▁", + "Package" + ], + [ + "TD", + "M" + ], + [ + "T", + "DM" + ], + [ + "\\{", + "\\" + ], + [ + "\\", + "{\\" + ], + [ + "whe", + "el" + ], + [ + "▁fe", + "as" + ], + [ + "▁js", + "ou" + ], + [ + "" + ], + [ + "<", + "?>" + ], + [ + "INST", + "ANCE" + ], + [ + "▁ch", + "ant" + ], + [ + "▁cha", + "nt" + ], + [ + "▁", + "chant" + ], + [ + "▁Re", + "fer" + ], + [ + "▁Ref", + "er" + ], + [ + "▁S", + "hir" + ], + [ + "▁Sh", + "ir" + ], + [ + "▁ве", + "ка" + ], + [ + "▁Me", + "eting" + ], + [ + "▁Meet", + "ing" + ], + [ + "▁n", + "v" + ], + [ + "▁", + "nv" + ], + [ + "▁associ", + "ations" + ], + [ + "▁association", + "s" + ], + [ + "it", + "ations" + ], + [ + "itation", + "s" + ], + [ + "itat", + "ions" + ], + [ + "or", + "um" + ], + [ + "o", + "rum" + ], + [ + "▁t", + "ires" + ], + [ + "▁ti", + "res" + ], + [ + "▁tire", + "s" + ], + [ + "▁tir", + "es" + ], + [ + "▁d", + "ash" + ], + [ + "▁da", + "sh" + ], + [ + "▁das", + "h" + ], + [ + "▁", + "dash" + ], + [ + "▁}", + "));" + ], + [ + "▁})", + ");" + ], + [ + "ar", + "to" + ], + [ + "art", + "o" + ], + [ + "▁Ed", + "inburgh" + ], + [ + "W", + "T" + ], + [ + "▁inv", + "ented" + ], + [ + "▁invent", + "ed" + ], + [ + "ve", + "h" + ], + [ + "v", + "eh" + ], + [ + "▁Hind", + "u" + ], + [ + "▁Насе", + "лення" + ], + [ + "▁ur", + "gent" + ], + [ + "▁urg", + "ent" + ], + [ + "▁urge", + "nt" + ], + [ + "text", + "color" + ], + [ + "we", + "rp" + ], + [ + "wer", + "p" + ], + [ + "▁det", + "ector" + ], + [ + "▁detect", + "or" + ], + [ + "▁al", + "tered" + ], + [ + "▁alt", + "ered" + ], + [ + "▁alter", + "ed" + ], + [ + "▁t", + "b" + ], + [ + "▁", + "tb" + ], + [ + "▁N", + "aval" + ], + [ + "▁Na", + "val" + ], + [ + "▁Nav", + "al" + ], + [ + "▁mem", + "br" + ], + [ + "style", + "sheet" + ], + [ + "styles", + "heet" + ], + [ + "un", + "ts" + ], + [ + "unt", + "s" + ], + [ + "▁nut", + "rition" + ], + [ + "▁S", + "ylv" + ], + [ + "▁Sy", + "lv" + ], + [ + "▁e", + "numer" + ], + [ + "▁en", + "umer" + ], + [ + "▁enum", + "er" + ], + [ + "▁m", + "ines" + ], + [ + "▁min", + "es" + ], + [ + "▁mi", + "nes" + ], + [ + "▁mine", + "s" + ], + [ + "▁l", + "itter" + ], + [ + "▁lit", + "ter" + ], + [ + "▁litt", + "er" + ], + [ + "ž", + "í" + ], + [ + "con", + "current" + ], + [ + "▁sw", + "allow" + ], + [ + "Si", + "r" + ], + [ + "S", + "ir" + ], + [ + "tal", + "k" + ], + [ + "t", + "alk" + ], + [ + "▁de", + "utschen" + 
], + [ + "▁deutsch", + "en" + ], + [ + "re", + "peat" + ], + [ + "▁dom", + "ains" + ], + [ + "▁domain", + "s" + ], + [ + "▁Mc", + "Donald" + ], + [ + "▁cand", + "le" + ], + [ + "▁pl", + "ural" + ], + [ + "▁sharp", + "ly" + ], + [ + "▁shar", + "ply" + ], + [ + "orig", + "ine" + ], + [ + "origin", + "e" + ], + [ + "▁c", + "andy" + ], + [ + "▁can", + "dy" + ], + [ + "▁cand", + "y" + ], + [ + "▁kilomet", + "res" + ], + [ + "▁power", + "ed" + ], + [ + "▁pow", + "ered" + ], + [ + "▁", + "powered" + ], + [ + "▁s", + "ep" + ], + [ + "▁se", + "p" + ], + [ + "▁", + "sep" + ], + [ + "▁S", + "oci" + ], + [ + "▁So", + "ci" + ], + [ + "▁Soc", + "i" + ], + [ + "▁Ber", + "nie" + ], + [ + "▁Bern", + "ie" + ], + [ + "GE", + "NER" + ], + [ + "GEN", + "ER" + ], + [ + "Ex", + "per" + ], + [ + "Exp", + "er" + ], + [ + "▁Al", + "low" + ], + [ + "▁All", + "ow" + ], + [ + "▁", + "Allow" + ], + [ + "▁Ern", + "st" + ], + [ + "▁Re", + "becca" + ], + [ + "▁Cont", + "ribut" + ], + [ + "ro", + "utes" + ], + [ + "rou", + "tes" + ], + [ + "route", + "s" + ], + [ + "r", + "outes" + ], + [ + "▁s", + "uffix" + ], + [ + "▁suff", + "ix" + ], + [ + "▁ju", + "lio" + ], + [ + "▁jul", + "io" + ], + [ + "▁juli", + "o" + ], + [ + "▁provinc", + "ial" + ], + [ + "▁provincia", + "l" + ], + [ + "▁provin", + "cial" + ], + [ + "▁appreci", + "ation" + ], + [ + "Us", + "ing" + ], + [ + "U", + "sing" + ], + [ + "abs", + "olute" + ], + [ + "▁cr", + "icket" + ], + [ + "W", + "ould" + ], + [ + "▁Equip", + "ment" + ], + [ + "▁tort", + "ure" + ], + [ + "на", + "х" + ], + [ + "ut", + "ton" + ], + [ + "utt", + "on" + ], + [ + "че", + "ство" + ], + [ + "▁out", + "break" + ], + [ + "▁prevent", + "ing" + ], + [ + "▁mad", + "re" + ], + [ + "▁ret", + "ire" + ], + [ + "end", + "region" + ], + [ + "▁f", + "ais" + ], + [ + "▁fa", + "is" + ], + [ + "▁remember", + "ing" + ], + [ + "▁Al", + "ban" + ], + [ + "▁Alb", + "an" + ], + [ + "▁a", + "rist" + ], + [ + "▁ar", + "ist" + ], + [ + "▁work", + "out" + ], + [ + "▁u", + "z" + ], + [ + "▁", + "uz" + ], + [ + "as", + "to" + ], + [ + "ast", + "o" + ], + [ + "a", + "sto" + ], + [ + "fort", + "unate" + ], + [ + "fortun", + "ate" + ], + [ + "▁p", + "aste" + ], + [ + "▁past", + "e" + ], + [ + "▁pas", + "te" + ], + [ + "▁pa", + "ste" + ], + [ + "▁M", + "R" + ], + [ + "▁", + "MR" + ], + [ + "▁o", + "tra" + ], + [ + "▁ot", + "ra" + ], + [ + "S", + "v" + ], + [ + "an", + "gen" + ], + [ + "ang", + "en" + ], + [ + "ange", + "n" + ], + [ + "▁S", + "ierra" + ], + [ + "▁Si", + "erra" + ], + [ + "▁n", + "au" + ], + [ + "▁na", + "u" + ], + [ + "▁s", + "era" + ], + [ + "▁se", + "ra" + ], + [ + "▁ser", + "a" + ], + [ + "$", + "~" + ], + [ + "▁cos", + "ì" + ], + [ + ")(", + "(" + ], + [ + ")", + "((" + ], + [ + "▁propos", + "als" + ], + [ + "▁proposal", + "s" + ], + [ + "it", + "te" + ], + [ + "itt", + "e" + ], + [ + "▁P", + "ero" + ], + [ + "▁Per", + "o" + ], + [ + "▁Pe", + "ro" + ], + [ + "▁te", + "nant" + ], + [ + "▁ten", + "ant" + ], + [ + "▁", + "tenant" + ], + [ + "Y", + "P" + ], + [ + "▁Param", + "eter" + ], + [ + "▁", + "Parameter" + ], + [ + "sp", + "ell" + ], + [ + "spe", + "ll" + ], + [ + "▁e", + "merge" + ], + [ + "▁emer", + "ge" + ], + [ + "▁g", + "ek" + ], + [ + "▁ge", + "k" + ], + [ + "ol", + "ence" + ], + [ + "olen", + "ce" + ], + [ + "ot", + "os" + ], + [ + "oto", + "s" + ], + [ + "o", + "tos" + ], + [ + "▁witness", + "es" + ], + [ + "▁watch", + "es" + ], + [ + "▁wat", + "ches" + ], + [ + "▁A", + "ch" + ], + [ + "▁Ac", + "h" + ], + [ + "Cr", + "oss" + ], + [ + "C", + "ross" + ], + [ + "▁янва", + "ря" + ], + [ + 
";", + "}" + ], + [ + "▁O", + "NE" + ], + [ + "▁ON", + "E" + ], + [ + "▁", + "ONE" + ], + [ + "▁care", + "ers" + ], + [ + "▁career", + "s" + ], + [ + "▁faith", + "ful" + ], + [ + "▁J", + "our" + ], + [ + "▁Jo", + "ur" + ], + [ + "▁Gener", + "ate" + ], + [ + "▁Gene", + "rate" + ], + [ + "▁", + "Generate" + ], + [ + "▁ию", + "ля" + ], + [ + "▁recommend", + "ation" + ], + [ + "w", + "b" + ], + [ + "sk", + "ich" + ], + [ + "ski", + "ch" + ], + [ + "bold", + "math" + ], + [ + "▁orig", + "ins" + ], + [ + "▁origin", + "s" + ], + [ + "▁spin", + "ning" + ], + [ + "▁//", + "\r" + ], + [ + "▁bomb", + "s" + ], + [ + "▁bom", + "bs" + ], + [ + "min", + "ister" + ], + [ + "I", + "o" + ], + [ + "öl", + "ker" + ], + [ + "Autow", + "ired" + ], + [ + "um", + "per" + ], + [ + "ump", + "er" + ], + [ + "ich", + "ael" + ], + [ + "▁contribut", + "ors" + ], + [ + "▁contributor", + "s" + ], + [ + "▁n", + "asty" + ], + [ + "▁na", + "sty" + ], + [ + "▁nas", + "ty" + ], + [ + "▁nast", + "y" + ], + [ + "▁d", + "rap" + ], + [ + "▁dr", + "ap" + ], + [ + "▁Bud", + "apest" + ], + [ + "ur", + "ious" + ], + [ + "uri", + "ous" + ], + [ + "hi", + "d" + ], + [ + "h", + "id" + ], + [ + "▁wel", + "comed" + ], + [ + "▁welcome", + "d" + ], + [ + "▁w", + "agon" + ], + [ + "▁wa", + "gon" + ], + [ + "▁Ва", + "си" + ], + [ + "▁embarrass", + "ed" + ], + [ + "▁Har", + "vey" + ], + [ + "Lo", + "s" + ], + [ + "L", + "os" + ], + [ + "▁S", + "ter" + ], + [ + "▁St", + "er" + ], + [ + "▁Ste", + "r" + ], + [ + "▁enjoy", + "able" + ], + [ + "ör", + "t" + ], + [ + "ö", + "rt" + ], + [ + "Mill", + "is" + ], + [ + "--", + ")" + ], + [ + "-", + "-)" + ], + [ + "▁d", + "ashed" + ], + [ + "▁das", + "hed" + ], + [ + "▁dash", + "ed" + ], + [ + "\">", + "<", + "?" + ], + [ + "\"", + ">'", + "," + ], + [ + ">", + "'," + ], + [ + "▁all", + "iance" + ], + [ + "ic", + "ism" + ], + [ + "ici", + "sm" + ], + [ + "▁NAS", + "A" + ], + [ + "▁NA", + "SA" + ], + [ + "▁p", + "ode" + ], + [ + "▁po", + "de" + ], + [ + "▁pod", + "e" + ], + [ + "č", + "ní" + ], + [ + "▁respon", + "ding" + ], + [ + "▁respond", + "ing" + ], + [ + "▁bl", + "owing" + ], + [ + "▁blo", + "wing" + ], + [ + "▁blow", + "ing" + ], + [ + "ic", + "ké" + ], + [ + "ick", + "é" + ], + [ + "ва", + "но" + ], + [ + "ван", + "о" + ], + [ + "▁H", + "off" + ], + [ + "▁Ho", + "ff" + ], + [ + "▁Hof", + "f" + ], + [ + "MB", + "ER" + ], + [ + "M", + "BER" + ], + [ + "▁civil", + "ization" + ], + [ + "ar", + "ía" + ], + [ + "a", + "ría" + ], + [ + "Un", + "lock" + ], + [ + "ge", + "ts" + ], + [ + "get", + "s" + ], + [ + "g", + "ets" + ], + [ + "no", + "d" + ], + [ + "n", + "od" + ], + [ + "▁S", + "TE" + ], + [ + "▁ST", + "E" + ], + [ + "▁con", + "science" + ], + [ + "PE", + "G" + ], + [ + "ch", + "anging" + ], + [ + "chan", + "ging" + ], + [ + "▁Rich", + "mond" + ], + [ + "ling", + "ton" + ], + [ + "l", + "ington" + ], + [ + "ocr", + "atic" + ], + [ + "▁trav", + "és" + ], + [ + "▁ф", + "ран" + ], + [ + "▁up", + "dating" + ], + [ + "process", + "ing" + ], + [ + "Al", + "ex" + ], + [ + "A", + "lex" + ], + [ + "▁mil", + "itar" + ], + [ + "▁milit", + "ar" + ], + [ + "▁pse", + "udo" + ], + [ + "▁pseud", + "o" + ], + [ + "str", + "len" + ], + [ + "▁be", + "have" + ], + [ + "▁beh", + "ave" + ], + [ + "▁behav", + "e" + ], + [ + "▁distinct", + "ive" + ], + [ + "▁E", + "c" + ], + [ + "▁c", + "x" + ], + [ + "▁", + "cx" + ], + [ + "▁journal", + "ists" + ], + [ + "▁journalist", + "s" + ], + [ + "vo", + "lt" + ], + [ + "vol", + "t" + ], + [ + "v", + "olt" + ], + [ + "▁sp", + "un" + ], + [ + "▁d", + "urable" + ], + [ + "▁dur", + 
"able" + ], + [ + "▁pro", + "position" + ], + [ + "▁propos", + "ition" + ], + [ + "▁", + "proposition" + ], + [ + "thread", + "s" + ], + [ + "▁tw", + "entieth" + ], + [ + "▁ф", + "і" + ], + [ + "▁", + "фі" + ], + [ + "en", + "son" + ], + [ + "ens", + "on" + ], + [ + "enso", + "n" + ], + [ + "▁self", + "ish" + ], + [ + "▁sel", + "fish" + ], + [ + "ar", + "ium" + ], + [ + "ari", + "um" + ], + [ + "a", + "rium" + ], + [ + "▁de", + "cid" + ], + [ + "▁dec", + "id" + ], + [ + "▁ха", + "рак" + ], + [ + "▁psy", + "chiat" + ], + [ + "▁psych", + "iat" + ], + [ + "g", + "d" + ], + [ + "Z", + "Z" + ], + [ + "ug", + "u" + ], + [ + "u", + "gu" + ], + [ + "▁i", + "ds" + ], + [ + "▁id", + "s" + ], + [ + "▁", + "ids" + ], + [ + "Man", + "aged" + ], + [ + "▁Leg", + "isl" + ], + [ + "ancell", + "ationToken" + ], + [ + "▁gr", + "ants" + ], + [ + "▁gran", + "ts" + ], + [ + "▁grant", + "s" + ], + [ + "▁lie", + "utenant" + ], + [ + "▁lieu", + "tenant" + ], + [ + "▁Fle", + "et" + ], + [ + "**", + "/" + ], + [ + "*", + "*/" + ], + [ + "▁T", + "ig" + ], + [ + "▁Ti", + "g" + ], + [ + "▁accept", + "s" + ], + [ + "▁system", + "atic" + ], + [ + ",", + "{\\" + ], + [ + "▁У", + "кра" + ], + [ + "▁aus", + "ge" + ], + [ + "▁dial", + "ect" + ], + [ + "▁dia", + "lect" + ], + [ + "st", + "ri" + ], + [ + "str", + "i" + ], + [ + "s", + "tri" + ], + [ + "er", + "me" + ], + [ + "erm", + "e" + ], + [ + "▁B", + "esch" + ], + [ + "▁Be", + "sch" + ], + [ + "▁Bes", + "ch" + ], + [ + "lo", + "ve" + ], + [ + "lov", + "e" + ], + [ + "l", + "ove" + ], + [ + "S", + "ensor" + ], + [ + "▁B", + "IT" + ], + [ + "▁", + "BIT" + ], + [ + "▁т", + "ру" + ], + [ + "▁mist", + "aken" + ], + [ + "▁mistake", + "n" + ], + [ + "p", + "v" + ], + [ + "▁u", + "tf" + ], + [ + "▁ut", + "f" + ], + [ + "▁", + "utf" + ], + [ + "▁[", + "\\" + ], + [ + "▁", + "[\\" + ], + [ + "▁Geb", + "iet" + ], + [ + "▁Mann", + "schaft" + ], + [ + "PAR", + "AMETER" + ], + [ + "▁u", + "rb" + ], + [ + "▁ur", + "b" + ], + [ + "▁", + "urb" + ], + [ + "▁R", + "eed" + ], + [ + "▁Re", + "ed" + ], + [ + "▁c", + "ough" + ], + [ + "▁co", + "ugh" + ], + [ + "▁cou", + "gh" + ], + [ + "wa", + "ld" + ], + [ + "wal", + "d" + ], + [ + "w", + "ald" + ], + [ + "▁L", + "amb" + ], + [ + "▁La", + "mb" + ], + [ + "▁Lam", + "b" + ], + [ + "▁surv", + "iving" + ], + [ + "▁surviv", + "ing" + ], + [ + "▁s", + "way" + ], + [ + "▁sw", + "ay" + ], + [ + "▁с", + "ве" + ], + [ + "WI", + "SE" + ], + [ + "ä", + "ger" + ], + [ + "f", + "y" + ], + [ + "sk", + "e" + ], + [ + "s", + "ke" + ], + [ + "▁s", + "og" + ], + [ + "▁so", + "g" + ], + [ + "▁Im", + "plement" + ], + [ + "▁Imp", + "lement" + ], + [ + "▁", + "Implement" + ], + [ + "获", + "取" + ], + [ + "▁T", + "ools" + ], + [ + "▁To", + "ols" + ], + [ + "▁Tool", + "s" + ], + [ + "▁Too", + "ls" + ], + [ + "▁", + "Tools" + ], + [ + "▁ne", + "wer" + ], + [ + "▁new", + "er" + ], + [ + "▁exempl", + "e" + ], + [ + "▁exem", + "ple" + ], + [ + "▁l", + "itt" + ], + [ + "▁li", + "tt" + ], + [ + "▁lit", + "t" + ], + [ + "▁вы", + "пу" + ], + [ + "▁у", + "прав" + ], + [ + "Em", + "itter" + ], + [ + "Emit", + "ter" + ], + [ + "IS", + "ING" + ], + [ + "I", + "SING" + ], + [ + "▁органи", + "за" + ], + [ + "▁М", + "і" + ], + [ + "▁Ex", + "amples" + ], + [ + "▁Example", + "s" + ], + [ + "▁I", + "con" + ], + [ + "▁", + "Icon" + ], + [ + "Get", + "ter" + ], + [ + "▁L", + "ay" + ], + [ + "▁La", + "y" + ], + [ + "▁Col", + "lect" + ], + [ + "▁Coll", + "ect" + ], + [ + "▁", + "Collect" + ], + [ + "Sa", + "int" + ], + [ + "S", + "aint" + ], + [ + "or", + "able" + ], + [ + "ora", + "ble" + 
], + [ + "▁f", + "ick" + ], + [ + "▁fi", + "ck" + ], + [ + "ik", + "h" + ], + [ + "i", + "kh" + ], + [ + "sl", + "ave" + ], + [ + "▁c", + "lay" + ], + [ + "▁cl", + "ay" + ], + [ + "▁W", + "A" + ], + [ + "▁", + "WA" + ], + [ + "Re", + "po" + ], + [ + "Rep", + "o" + ], + [ + "▁Java", + "Script" + ], + [ + "it", + "r" + ], + [ + "i", + "tr" + ], + [ + "pa", + "id" + ], + [ + "p", + "aid" + ], + [ + "▁home", + "work" + ], + [ + "M", + "iddleware" + ], + [ + "▁r", + "éal" + ], + [ + "▁ré", + "al" + ], + [ + "▁при", + "зна" + ], + [ + "ê", + "m" + ], + [ + "ès", + "e" + ], + [ + "è", + "se" + ], + [ + "▁W", + "ells" + ], + [ + "▁Well", + "s" + ], + [ + "▁Wel", + "ls" + ], + [ + "▁e", + "nero" + ], + [ + "▁en", + "ero" + ], + [ + "▁ener", + "o" + ], + [ + "emperature", + "n" + ], + [ + "▁N", + "aj" + ], + [ + "▁Na", + "j" + ], + [ + "▁Re", + "agan" + ], + [ + "▁comp", + "elling" + ], + [ + "▁tri", + "bes" + ], + [ + "▁trib", + "es" + ], + [ + "▁tribe", + "s" + ], + [ + "▁to", + "String" + ], + [ + "▁", + "toString" + ], + [ + "pace", + "s" + ], + [ + "pa", + "ces" + ], + [ + "p", + "aces" + ], + [ + "▁harm", + "ful" + ], + [ + "▁Con", + "se" + ], + [ + "▁Cons", + "e" + ], + [ + "od", + "io" + ], + [ + "odi", + "o" + ], + [ + "▁m", + "im" + ], + [ + "▁mi", + "m" + ], + [ + "get", + "Item" + ], + [ + "▁script", + "s" + ], + [ + "▁", + "scripts" + ], + [ + "ra", + "is" + ], + [ + "rai", + "s" + ], + [ + "r", + "ais" + ], + [ + "▁Ph", + "ase" + ], + [ + "▁", + "Phase" + ], + [ + "▁An", + "swer" + ], + [ + "▁$", + "|\\" + ], + [ + "▁$|", + "\\" + ], + [ + "▁as", + "sembled" + ], + [ + "el", + "in" + ], + [ + "eli", + "n" + ], + [ + "e", + "lin" + ], + [ + "ph", + "abet" + ], + [ + "pha", + "bet" + ], + [ + "▁to", + "ast" + ], + [ + "▁tut", + "ti" + ], + [ + "▁tu", + "tti" + ], + [ + "▁be", + "zeichnet" + ], + [ + "Gre", + "at" + ], + [ + "G", + "reat" + ], + [ + "et", + "tes" + ], + [ + "ett", + "es" + ], + [ + "ette", + "s" + ], + [ + "e", + "ttes" + ], + [ + "▁дека", + "бря" + ], + [ + "F", + "ULL" + ], + [ + "▁re", + "gener" + ], + [ + "▁reg", + "ener" + ], + [ + "▁któ", + "re" + ], + [ + "го", + "р" + ], + [ + "г", + "ор" + ], + [ + "is", + "ce" + ], + [ + "isc", + "e" + ], + [ + "▁t", + "oda" + ], + [ + "▁to", + "da" + ], + [ + "▁tod", + "a" + ], + [ + "▁eth", + "ical" + ], + [ + "i", + "q" + ], + [ + "P", + "t" + ], + [ + "ar", + "in" + ], + [ + "ari", + "n" + ], + [ + "a", + "rin" + ], + [ + "ig", + "os" + ], + [ + "igo", + "s" + ], + [ + "i", + "gos" + ], + [ + "▁work", + "shops" + ], + [ + "▁workshop", + "s" + ], + [ + "▁R", + "oche" + ], + [ + "▁Ro", + "che" + ], + [ + "▁Roc", + "he" + ], + [ + "Get", + "String" + ], + [ + "мини", + "стратив" + ], + [ + "m", + "ême" + ], + [ + "▁D", + "aw" + ], + [ + "▁Da", + "w" + ], + [ + "ar", + "ians" + ], + [ + "ari", + "ans" + ], + [ + "aria", + "ns" + ], + [ + "arian", + "s" + ], + [ + "▁imp", + "acts" + ], + [ + "▁impact", + "s" + ], + [ + "▁por", + "table" + ], + [ + "▁port", + "able" + ], + [ + ")-", + "\\" + ], + [ + ")", + "-\\" + ], + [ + "sh", + "ots" + ], + [ + "shot", + "s" + ], + [ + "▁re", + "lev" + ], + [ + "▁rel", + "ev" + ], + [ + "▁rele", + "v" + ], + [ + "PR", + "IV" + ], + [ + "PRI", + "V" + ], + [ + "▁бу", + "ла" + ], + [ + "ard", + "less" + ], + [ + "ul", + "ously" + ], + [ + "ulous", + "ly" + ], + [ + "--", + ">" + ], + [ + "-", + "->" + ], + [ + "ol", + "ent" + ], + [ + "ole", + "nt" + ], + [ + "olen", + "t" + ], + [ + "▁э", + "того" + ], + [ + "▁это", + "го" + ], + [ + "▁Gener", + "ic" + ], + [ + "▁Gene", + "ric" + ], + [ + "▁", 
+ "Generic" + ], + [ + "▁*", + "/," + ], + [ + "▁*/", + "," + ], + [ + "▁comb", + "inations" + ], + [ + "▁combination", + "s" + ], + [ + "▁re", + "jo" + ], + [ + "с", + "публи" + ], + [ + "cap", + "acity" + ], + [ + "▁tr", + "aces" + ], + [ + "▁tra", + "ces" + ], + [ + "▁trace", + "s" + ], + [ + "▁op", + "acity" + ], + [ + "▁", + "opacity" + ], + [ + "▁Off", + "icial" + ], + [ + "ic", + "ion" + ], + [ + "ici", + "on" + ], + [ + "icio", + "n" + ], + [ + "▁emotional", + "ly" + ], + [ + "▁emotion", + "ally" + ], + [ + "▁Jo", + "el" + ], + [ + "▁Joe", + "l" + ], + [ + "сько", + "му" + ], + [ + "▁legend", + "ary" + ], + [ + "▁p", + "am" + ], + [ + "▁pa", + "m" + ], + [ + "▁Tamb", + "ién" + ], + [ + ".", + "<" + ], + [ + "ib", + "a" + ], + [ + "i", + "ba" + ], + [ + "mi", + "dt" + ], + [ + "mid", + "t" + ], + [ + "бо", + "м" + ], + [ + "▁en", + "suite" + ], + [ + "Author", + "ization" + ], + [ + "P", + "ag" + ], + [ + "▁hel", + "met" + ], + [ + "▁ter", + "rito" + ], + [ + "▁terr", + "ito" + ], + [ + "second", + "ary" + ], + [ + "▁seg", + "unda" + ], + [ + "▁W", + "ire" + ], + [ + "▁Wi", + "re" + ], + [ + "rec", + "ated" + ], + [ + "▁inv", + "oked" + ], + [ + "▁invoke", + "d" + ], + [ + "▁Value", + "Error" + ], + [ + "▁ф", + "о" + ], + [ + "▁", + "фо" + ], + [ + "AL", + "IGN" + ], + [ + "CUR", + "RENT" + ], + [ + "\\", + "+\\_\\" + ], + [ + "▁comp", + "ilation" + ], + [ + "æ", + "r" + ], + [ + "▁Pal", + "mar" + ], + [ + "▁Palm", + "ar" + ], + [ + "▁influ", + "ences" + ], + [ + "▁influence", + "s" + ], + [ + "/", + ":" + ], + [ + "M", + "ix" + ], + [ + "NO", + "P" + ], + [ + "N", + "OP" + ], + [ + "ec", + "onom" + ], + [ + "e", + "conom" + ], + [ + "▁t", + "ucked" + ], + [ + "▁}", + ");\r" + ], + [ + "▁});", + "\r" + ], + [ + "▁})", + ";\r" + ], + [ + "▁", + "});\r" + ], + [ + "AN", + "K" + ], + [ + "re", + "ject" + ], + [ + "▁p", + "ension" + ], + [ + "▁pens", + "ion" + ], + [ + "▁gener", + "ates" + ], + [ + "▁generate", + "s" + ], + [ + "ч", + "ё" + ], + [ + "▁in", + "cap" + ], + [ + "▁inc", + "ap" + ], + [ + "▁cl", + "icked" + ], + [ + "▁click", + "ed" + ], + [ + "▁f", + "us" + ], + [ + "▁fu", + "s" + ], + [ + "our", + "ses" + ], + [ + "ours", + "es" + ], + [ + "ourse", + "s" + ], + [ + "▁E", + "aster" + ], + [ + "▁East", + "er" + ], + [ + "%", + ";" + ], + [ + "zi", + "n" + ], + [ + "z", + "in" + ], + [ + "▁oblig", + "ations" + ], + [ + "▁obligation", + "s" + ], + [ + "▁T", + "ips" + ], + [ + "▁Tip", + "s" + ], + [ + "▁Ti", + "ps" + ], + [ + "};", + "\r" + ], + [ + "}", + ";\r" + ], + [ + ".\"", + "_" + ], + [ + "▁B", + "SD" + ], + [ + "▁BS", + "D" + ], + [ + "át", + "ica" + ], + [ + "▁ex", + "pose" + ], + [ + "▁exp", + "ose" + ], + [ + "▁expos", + "e" + ], + [ + "Par", + "s" + ], + [ + "P", + "ars" + ], + [ + "▁Am", + "anda" + ], + [ + "ку", + "п" + ], + [ + "▁gu", + "essed" + ], + [ + "▁guess", + "ed" + ], + [ + "ds", + "i" + ], + [ + "d", + "si" + ], + [ + "▁Le", + "ip" + ], + [ + "Br", + "oad" + ], + [ + "Bro", + "ad" + ], + [ + "B", + "road" + ], + [ + "▁Hug", + "hes" + ], + [ + "▁Hugh", + "es" + ], + [ + "i", + "é" + ], + [ + "▁W", + "ahl" + ], + [ + "▁Wa", + "hl" + ], + [ + "▁former", + "ly" + ], + [ + "Rel", + "ative" + ], + [ + "▁Y", + "u" + ], + [ + "▁Mount", + "ains" + ], + [ + "▁Mountain", + "s" + ], + [ + "▁E", + "num" + ], + [ + "▁En", + "um" + ], + [ + "▁", + "Enum" + ], + [ + "▁str", + "ang" + ], + [ + "▁stra", + "ng" + ], + [ + "_", + "-" + ], + [ + "re", + "cht" + ], + [ + "rec", + "ht" + ], + [ + "vi", + "v" + ], + [ + "v", + "iv" + ], + [ + "pa", + "use" + ], + [ + "p", + 
"ause" + ], + [ + "▁Lond", + "res" + ], + [ + "▁el", + "bow" + ], + [ + "▁Hawai", + "i" + ], + [ + "▁Cas", + "ino" + ], + [ + "Th", + "reshold" + ], + [ + "Un", + "its" + ], + [ + "Unit", + "s" + ], + [ + "In", + "clude" + ], + [ + "ит", + "о" + ], + [ + "и", + "то" + ], + [ + "as", + "ury" + ], + [ + "▁ste", + "ht" + ], + [ + "▁dam", + "ned" + ], + [ + "▁damn", + "ed" + ], + [ + "▁pack", + "ets" + ], + [ + "▁packet", + "s" + ], + [ + "▁W", + "erk" + ], + [ + "▁Wer", + "k" + ], + [ + "▁elev", + "ator" + ], + [ + "ied", + "ad" + ], + [ + "go", + "vern" + ], + [ + "gov", + "ern" + ], + [ + "g", + "overn" + ], + [ + "▁CONTR", + "ACT" + ], + [ + "ma", + "ls" + ], + [ + "mal", + "s" + ], + [ + "m", + "als" + ], + [ + "▁re", + "mem" + ], + [ + "▁rem", + "em" + ], + [ + "▁ent", + "onces" + ], + [ + "▁v", + "as" + ], + [ + "▁va", + "s" + ], + [ + "▁", + "vas" + ], + [ + "▁sym", + "pathy" + ], + [ + "▁befind", + "et" + ], + [ + "in", + "cing" + ], + [ + "inc", + "ing" + ], + [ + "Data", + "Set" + ], + [ + "▁add", + "itionally" + ], + [ + "▁addition", + "ally" + ], + [ + "▁additional", + "ly" + ], + [ + "▁mus", + "ician" + ], + [ + "▁music", + "ian" + ], + [ + "ше", + "го" + ], + [ + "▁li", + "stop" + ], + [ + "▁list", + "op" + ], + [ + ">\"", + ")" + ], + [ + ">", + "\")" + ], + [ + "Print", + "f" + ], + [ + "▁Fel", + "ix" + ], + [ + "▁car", + "ved" + ], + [ + "▁nice", + "ly" + ], + [ + "▁nic", + "ely" + ], + [ + "го", + "м" + ], + [ + "ch", + "ap" + ], + [ + "cha", + "p" + ], + [ + "▁N", + "ieder" + ], + [ + "▁Ni", + "eder" + ], + [ + "▁Nie", + "der" + ], + [ + "▁L", + "av" + ], + [ + "▁La", + "v" + ], + [ + "▁mod", + "ifications" + ], + [ + "▁modification", + "s" + ], + [ + "mo", + "ment" + ], + [ + "m", + "oment" + ], + [ + "▁bal", + "con" + ], + [ + "▁depend", + "ency" + ], + [ + "CK", + "ET" + ], + [ + "▁van", + "ished" + ], + [ + "▁f", + "ighters" + ], + [ + "▁fight", + "ers" + ], + [ + "▁fighter", + "s" + ], + [ + "▁z", + "unächst" + ], + [ + "io", + "ctl" + ], + [ + "ioc", + "tl" + ], + [ + "▁def", + "ens" + ], + [ + "▁defe", + "ns" + ], + [ + "▁N", + "em" + ], + [ + "▁Ne", + "m" + ], + [ + "Util", + "ity" + ], + [ + "Ut", + "ility" + ], + [ + "▁cur", + "v" + ], + [ + "▁cu", + "rv" + ], + [ + "▁DA", + "MAGES" + ], + [ + "▁Ro", + "gers" + ], + [ + "▁Rog", + "ers" + ], + [ + "▁Roger", + "s" + ], + [ + "▁grat", + "itude" + ], + [ + "▁Den", + "mark" + ], + [ + "ра", + "я" + ], + [ + "gr", + "pc" + ], + [ + "grp", + "c" + ], + [ + "g", + "rpc" + ], + [ + "▁j", + "uni" + ], + [ + "▁ju", + "ni" + ], + [ + "▁jun", + "i" + ], + [ + "▁окт", + "ября" + ], + [ + "▁imm", + "ense" + ], + [ + "▁prevent", + "ed" + ], + [ + "▁prev", + "ented" + ], + [ + "▁fo", + "am" + ], + [ + "▁Ex", + "tra" + ], + [ + "▁Ext", + "ra" + ], + [ + "▁", + "Extra" + ], + [ + "ai", + "med" + ], + [ + "aim", + "ed" + ], + [ + "▁C", + "riteria" + ], + [ + "▁Crit", + "eria" + ], + [ + "▁", + "Criteria" + ], + [ + "▁Sim", + "ply" + ], + [ + "box", + "es" + ], + [ + "▁Leg", + "end" + ], + [ + "▁P", + "layers" + ], + [ + "▁Play", + "ers" + ], + [ + "▁Player", + "s" + ], + [ + "▁Mer", + "cedes" + ], + [ + "▁Merc", + "edes" + ], + [ + "▁Br", + "anch" + ], + [ + "▁", + "Branch" + ], + [ + "TER", + "N" + ], + [ + "T", + "ERN" + ], + [ + "om", + "ena" + ], + [ + "ome", + "na" + ], + [ + "omen", + "a" + ], + [ + "▁incorpor", + "ate" + ], + [ + "con", + "de" + ], + [ + "co", + "nde" + ], + [ + "cond", + "e" + ], + [ + "c", + "onde" + ], + [ + "▁Est", + "ado" + ], + [ + "▁Esta", + "do" + ], + [ + "▁w", + "asted" + ], + [ + "▁was", + "ted" + 
], + [ + "▁wa", + "sted" + ], + [ + "▁waste", + "d" + ], + [ + "▁compl", + "aining" + ], + [ + "▁complain", + "ing" + ], + [ + "▁war", + "riors" + ], + [ + "▁warrior", + "s" + ], + [ + "ot", + "er" + ], + [ + "ote", + "r" + ], + [ + "o", + "ter" + ], + [ + "▁э", + "том" + ], + [ + "▁это", + "м" + ], + [ + "▁con", + "ten" + ], + [ + "▁cont", + "en" + ], + [ + "▁co", + "nten" + ], + [ + "▁machine", + "ry" + ], + [ + "▁mach", + "inery" + ], + [ + "▁techn", + "ological" + ], + [ + "▁T", + "D" + ], + [ + "▁", + "TD" + ], + [ + "▁g", + "ras" + ], + [ + "▁gr", + "as" + ], + [ + "▁gra", + "s" + ], + [ + "▁minim", + "ize" + ], + [ + "▁D", + "oor" + ], + [ + "▁Do", + "or" + ], + [ + "▁b", + "zw" + ], + [ + "▁p", + "rac" + ], + [ + "▁pr", + "ac" + ], + [ + "▁pra", + "c" + ], + [ + "TR", + "EE" + ], + [ + "T", + "REE" + ], + [ + "▁W", + "ing" + ], + [ + "▁Win", + "g" + ], + [ + "▁Wi", + "ng" + ], + [ + "▁Trans", + "action" + ], + [ + "▁", + "Transaction" + ], + [ + "▁M", + "VT" + ], + [ + "▁Kle", + "in" + ], + [ + "com", + "mons" + ], + [ + "comm", + "ons" + ], + [ + "common", + "s" + ], + [ + "▁}", + "{" + ], + [ + "▁", + "}{" + ], + [ + "▁Her", + "itage" + ], + [ + "▁f", + "ade" + ], + [ + "▁fa", + "de" + ], + [ + "ро", + "к" + ], + [ + "set", + "Value" + ], + [ + "▁Wal", + "lace" + ], + [ + "▁Wall", + "ace" + ], + [ + "M", + "X" + ], + [ + "▁A", + "CT" + ], + [ + "▁AC", + "T" + ], + [ + "▁", + "ACT" + ], + [ + "▁foot", + "age" + ], + [ + "▁ent", + "stand" + ], + [ + "ar", + "ga" + ], + [ + "arg", + "a" + ], + [ + "▁n", + "ails" + ], + [ + "▁na", + "ils" + ], + [ + "▁nail", + "s" + ], + [ + "▁capital", + "ism" + ], + [ + "▁G", + "arc" + ], + [ + "▁Gar", + "c" + ], + [ + "▁Ga", + "rc" + ], + [ + "▁susp", + "ension" + ], + [ + "il", + "is" + ], + [ + "ili", + "s" + ], + [ + "▁M", + "ov" + ], + [ + "▁Mo", + "v" + ], + [ + "uff", + "led" + ], + [ + "uffle", + "d" + ], + [ + "Ar", + "c" + ], + [ + "A", + "rc" + ], + [ + "▁Beaut", + "iful" + ], + [ + "WA", + "Y" + ], + [ + "W", + "AY" + ], + [ + "Par", + "allel" + ], + [ + "XX", + "XX" + ], + [ + "di", + "ag" + ], + [ + "▁D", + "T" + ], + [ + "▁", + "DT" + ], + [ + "m", + "q" + ], + [ + "Text", + "View" + ], + [ + "ML", + "E" + ], + [ + "M", + "LE" + ], + [ + "en", + "nen" + ], + [ + "enn", + "en" + ], + [ + "enne", + "n" + ], + [ + "▁infect", + "ed" + ], + [ + "▁therap", + "ist" + ], + [ + "IN", + "GS" + ], + [ + "ING", + "S" + ], + [ + "▁c", + "idade" + ], + [ + "ъ", + "н" + ], + [ + "▁p", + "df" + ], + [ + "▁pd", + "f" + ], + [ + "▁", + "pdf" + ], + [ + "▁b", + "ump" + ], + [ + "▁bu", + "mp" + ], + [ + "CT", + "X" + ], + [ + "C", + "TX" + ], + [ + "▁IN", + "CLUDING" + ], + [ + "▁", + "INCLUDING" + ], + [ + "▁G", + "ef" + ], + [ + "▁Ge", + "f" + ], + [ + "ENT", + "IAL" + ], + [ + "▁h", + "andy" + ], + [ + "▁hand", + "y" + ], + [ + "▁han", + "dy" + ], + [ + "▁tempor", + "al" + ], + [ + "▁temp", + "oral" + ], + [ + "▁tempo", + "ral" + ], + [ + "At", + "A" + ], + [ + "IS", + "H" + ], + [ + "I", + "SH" + ], + [ + "▁Pat", + "tern" + ], + [ + "▁", + "Pattern" + ], + [ + "▁l", + "an" + ], + [ + "▁la", + "n" + ], + [ + "▁", + "lan" + ], + [ + "ep", + "endant" + ], + [ + "▁sh", + "ining" + ], + [ + "id", + "y" + ], + [ + "i", + "dy" + ], + [ + "▁N", + "T" + ], + [ + "▁", + "NT" + ], + [ + "▁F", + "ran" + ], + [ + "▁Fr", + "an" + ], + [ + "▁Fra", + "n" + ], + [ + "▁nur", + "ses" + ], + [ + "▁nurs", + "es" + ], + [ + "▁nurse", + "s" + ], + [ + "▁bet", + "ray" + ], + [ + "▁sens", + "ible" + ], + [ + "▁апре", + "ля" + ], + [ + "▁'", + "[" + ], + [ + "▁th", + 
"irteen" + ], + [ + ")}", + "_{" + ], + [ + ")", + "}_{" + ], + [ + "▁No", + "ah" + ], + [ + "INS", + "ERT" + ], + [ + "ist", + "ically" + ], + [ + "istic", + "ally" + ], + [ + "▁Append", + "ix" + ], + [ + "▁re", + "cher" + ], + [ + "▁rec", + "her" + ], + [ + "Re", + "ceiver" + ], + [ + "▁der", + "nier" + ], + [ + "▁derni", + "er" + ], + [ + "л", + "ла" + ], + [ + "ли", + "за" + ], + [ + "▁Part", + "ido" + ], + [ + "▁max", + "imal" + ], + [ + "▁maxim", + "al" + ], + [ + "sn", + "ap" + ], + [ + "▁ча", + "сть" + ], + [ + "▁част", + "ь" + ], + [ + "▁час", + "ть" + ], + [ + "ST", + "OP" + ], + [ + "STO", + "P" + ], + [ + "S", + "TOP" + ], + [ + "▁ult", + "ra" + ], + [ + "▁ul", + "tra" + ], + [ + "▁dévelop", + "p" + ], + [ + "▁t", + "egen" + ], + [ + "▁te", + "gen" + ], + [ + "▁Ч", + "и" + ], + [ + "LI", + "B" + ], + [ + "L", + "IB" + ], + [ + "▁bas", + "eline" + ], + [ + "▁base", + "line" + ], + [ + "re", + "load" + ], + [ + "rel", + "oad" + ], + [ + "▁Ar", + "bitro" + ], + [ + "▁k", + "all" + ], + [ + "▁ka", + "ll" + ], + [ + "c", + "apture" + ], + [ + "Ar", + "m" + ], + [ + "A", + "rm" + ], + [ + "qu", + "in" + ], + [ + "im", + "pse" + ], + [ + "imp", + "se" + ], + [ + "za", + "s" + ], + [ + "z", + "as" + ], + [ + "▁C", + "and" + ], + [ + "▁Can", + "d" + ], + [ + "▁Ca", + "nd" + ], + [ + "▁br", + "ains" + ], + [ + "▁brain", + "s" + ], + [ + "▁bra", + "ins" + ], + [ + "▁host", + "ile" + ], + [ + "▁mar", + "ble" + ], + [ + "oo", + "ns" + ], + [ + "oon", + "s" + ], + [ + "o", + "ons" + ], + [ + "▁L", + "oss" + ], + [ + "▁Los", + "s" + ], + [ + "▁Lo", + "ss" + ], + [ + "Meta", + "Data" + ], + [ + "▁Rep", + "ública" + ], + [ + "▁and", + "ra" + ], + [ + "▁", + "andra" + ], + [ + "od", + "en" + ], + [ + "ode", + "n" + ], + [ + "o", + "den" + ], + [ + "▁document", + "ed" + ], + [ + "▁M", + "oses" + ], + [ + "▁Mo", + "ses" + ], + [ + "▁Mos", + "es" + ], + [ + "od", + "d" + ], + [ + "o", + "dd" + ], + [ + "▁w", + "ax" + ], + [ + "▁wa", + "x" + ], + [ + "us", + "ch" + ], + [ + "usc", + "h" + ], + [ + "u", + "sch" + ], + [ + "▁diagn", + "osed" + ], + [ + "in", + "kle" + ], + [ + "ink", + "le" + ], + [ + "▁X", + "box" + ], + [ + "▁seven", + "ty" + ], + [ + "▁sevent", + "y" + ], + [ + "ci", + "as" + ], + [ + "cia", + "s" + ], + [ + "c", + "ias" + ], + [ + "▁nov", + "iembre" + ], + [ + "Com", + "pute" + ], + [ + "Comp", + "ute" + ], + [ + "Comput", + "e" + ], + [ + "})", + ";\r" + ], + [ + "});", + "\r" + ], + [ + "}", + ");\r" + ], + [ + "▁Philip", + "pe" + ], + [ + "▁Philipp", + "e" + ], + [ + "▁F", + "ör" + ], + [ + "Le", + "ave" + ], + [ + "▁s", + "age" + ], + [ + "▁sa", + "ge" + ], + [ + "▁sag", + "e" + ], + [ + "▁un", + "pre" + ], + [ + "▁Fort", + "unately" + ], + [ + "▁a", + "post" + ], + [ + "▁ap", + "ost" + ], + [ + "ent", + "ities" + ], + [ + "enti", + "ties" + ], + [ + "▁el", + "los" + ], + [ + "▁ell", + "os" + ], + [ + "author", + "ized" + ], + [ + "GB", + "T" + ], + [ + "G", + "BT" + ], + [ + "▁ins", + "ist" + ], + [ + "▁insp", + "ire" + ], + [ + "▁inspir", + "e" + ], + [ + "Ma", + "ss" + ], + [ + "M", + "ass" + ], + [ + "▁r", + "ôle" + ], + [ + "fe", + "e" + ], + [ + "f", + "ee" + ], + [ + "ip", + "art" + ], + [ + "ipa", + "rt" + ], + [ + "i", + "part" + ], + [ + "це", + "р" + ], + [ + "ц", + "ер" + ], + [ + "un", + "ate" + ], + [ + "una", + "te" + ], + [ + "u", + "nate" + ], + [ + "▁C", + "NN" + ], + [ + ":", + "}" + ], + [ + "▁unh", + "appy" + ], + [ + "▁import", + "ed" + ], + [ + "▁imp", + "orted" + ], + [ + "H", + "IGH" + ], + [ + "ring", + "s" + ], + [ + "rin", + "gs" + ], + [ + "r", + 
"ings" + ], + [ + "▁In", + "stance" + ], + [ + "▁Inst", + "ance" + ], + [ + "▁", + "Instance" + ], + [ + "B", + "ay" + ], + [ + "ag", + "les" + ], + [ + "agle", + "s" + ], + [ + "a", + "gles" + ], + [ + "me", + "e" + ], + [ + "m", + "ee" + ], + [ + "ber", + "y" + ], + [ + "be", + "ry" + ], + [ + "b", + "ery" + ], + [ + "▁St", + "ories" + ], + [ + "▁Sto", + "ries" + ], + [ + "▁Ch", + "ase" + ], + [ + "▁Cha", + "se" + ], + [ + "▁car", + "riage" + ], + [ + "▁mis", + "under" + ], + [ + "▁imag", + "in" + ], + [ + "p", + "w" + ], + [ + "▁M", + "eter" + ], + [ + "▁Me", + "ter" + ], + [ + "▁Met", + "er" + ], + [ + "▁crow", + "ds" + ], + [ + "▁crowd", + "s" + ], + [ + "▁F", + "ame" + ], + [ + "▁Fa", + "me" + ], + [ + "sk", + "ill" + ], + [ + "ski", + "ll" + ], + [ + "s", + "kill" + ], + [ + "▁c", + "omed" + ], + [ + "▁com", + "ed" + ], + [ + "▁co", + "med" + ], + [ + "▁come", + "d" + ], + [ + "▁", + "comed" + ], + [ + "▁r", + "anch" + ], + [ + "▁ran", + "ch" + ], + [ + "▁l", + "acking" + ], + [ + "▁lack", + "ing" + ], + [ + "▁lac", + "king" + ], + [ + "▁sub", + "mar" + ], + [ + "▁subm", + "ar" + ], + [ + "ia", + "nte" + ], + [ + "ian", + "te" + ], + [ + "iant", + "e" + ], + [ + "i", + "ante" + ], + [ + "▁l", + "anz" + ], + [ + "▁lan", + "z" + ], + [ + "▁слу", + "ж" + ], + [ + "--", + "---------" + ], + [ + "----", + "-------" + ], + [ + "--------", + "---" + ], + [ + "---", + "--------" + ], + [ + "-----", + "------" + ], + [ + "----------", + "-" + ], + [ + "------", + "-----" + ], + [ + "---------", + "--" + ], + [ + "-------", + "----" + ], + [ + "-", + "----------" + ], + [ + "▁ob", + "ten" + ], + [ + "▁obt", + "en" + ], + [ + "▁down", + "stairs" + ], + [ + "Y", + "N" + ], + [ + "rot", + "ation" + ], + [ + "▁J", + "esse" + ], + [ + "▁Jes", + "se" + ], + [ + "▁Jess", + "e" + ], + [ + "$", + "(\"#" + ], + [ + "▁p", + "uls" + ], + [ + "▁pu", + "ls" + ], + [ + "▁pul", + "s" + ], + [ + "ir", + "ling" + ], + [ + "irl", + "ing" + ], + [ + "▁Sch", + "aus" + ], + [ + "▁Sc", + "haus" + ], + [ + "▁de", + "ployed" + ], + [ + "▁deploy", + "ed" + ], + [ + "▁{", + "}\"," + ], + [ + "▁{}", + "\"," + ], + [ + "▁Mar", + "vel" + ], + [ + "EN", + "UM" + ], + [ + "E", + "NUM" + ], + [ + "▁Mat", + "hemat" + ], + [ + "▁Math", + "emat" + ], + [ + "▁n", + "n" + ], + [ + "▁", + "nn" + ], + [ + "com", + "pet" + ], + [ + "comp", + "et" + ], + [ + "k", + "ów" + ], + [ + "bi", + "l" + ], + [ + "b", + "il" + ], + [ + "Wh", + "ich" + ], + [ + "is", + "ine" + ], + [ + "isi", + "ne" + ], + [ + "▁r", + "ude" + ], + [ + "▁ru", + "de" + ], + [ + "▁n", + "iveau" + ], + [ + "▁á", + "rea" + ], + [ + "▁p", + "rès" + ], + [ + "▁pr", + "ès" + ], + [ + "at", + "is" + ], + [ + "ati", + "s" + ], + [ + "▁[...", + "]" + ], + [ + "fu", + "r" + ], + [ + "f", + "ur" + ], + [ + "om", + "m" + ], + [ + "o", + "mm" + ], + [ + "pack", + "ed" + ], + [ + "p", + "acked" + ], + [ + "ме", + "не" + ], + [ + "мен", + "е" + ], + [ + "м", + "ене" + ], + [ + "script", + "style" + ], + [ + "▁A", + "th" + ], + [ + "▁At", + "h" + ], + [ + "▁d", + "esp" + ], + [ + "▁de", + "sp" + ], + [ + "▁des", + "p" + ], + [ + "elt", + "emperaturen" + ], + [ + "▁tal", + "ents" + ], + [ + "▁talent", + "s" + ], + [ + "oc", + "y" + ], + [ + "o", + "cy" + ], + [ + "▁r", + "aises" + ], + [ + "▁rais", + "es" + ], + [ + "▁raise", + "s" + ], + [ + "▁ra", + "ises" + ], + [ + "LI", + "MIT" + ], + [ + "L", + "IMIT" + ], + [ + "▁editor", + "ial" + ], + [ + "▁edit", + "orial" + ], + [ + "▁An", + "imal" + ], + [ + "▁Anim", + "al" + ], + [ + "dr", + "ive" + ], + [ + "d", + "rive" + ], + [ + 
"▁рабо", + "та" + ], + [ + "bs", + "s" + ], + [ + "b", + "ss" + ], + [ + "▁S", + "ev" + ], + [ + "▁Se", + "v" + ], + [ + "ep", + "och" + ], + [ + "e", + "poch" + ], + [ + "▁R", + "C" + ], + [ + "▁", + "RC" + ], + [ + "UN", + "USED" + ], + [ + "▁mand", + "atory" + ], + [ + "(", + "?:" + ], + [ + "▁B", + "in" + ], + [ + "▁Bi", + "n" + ], + [ + "▁", + "Bin" + ], + [ + "▁synt", + "hetic" + ], + [ + "▁g", + "own" + ], + [ + "▁go", + "wn" + ], + [ + "▁D", + "ob" + ], + [ + "▁Do", + "b" + ], + [ + "ka", + "p" + ], + [ + "k", + "ap" + ], + [ + "▁har", + "mon" + ], + [ + "▁harm", + "on" + ], + [ + "▁liber", + "ty" + ], + [ + "▁libert", + "y" + ], + [ + "▁R", + "ice" + ], + [ + "▁Ric", + "e" + ], + [ + "▁pray", + "ers" + ], + [ + "▁pra", + "yers" + ], + [ + "▁prayer", + "s" + ], + [ + "▁m", + "ise" + ], + [ + "▁mis", + "e" + ], + [ + "▁mi", + "se" + ], + [ + "▁conf", + "using" + ], + [ + "▁le", + "ap" + ], + [ + "▁arriv", + "es" + ], + [ + "▁arr", + "ives" + ], + [ + "▁arrive", + "s" + ], + [ + "ka", + "mp" + ], + [ + "k", + "amp" + ], + [ + "▁th", + "ats" + ], + [ + "▁that", + "s" + ], + [ + "AC", + "C" + ], + [ + "A", + "CC" + ], + [ + "▁Param", + "eters" + ], + [ + "▁Parameter", + "s" + ], + [ + "▁", + "Parameters" + ], + [ + "▁о", + "дно" + ], + [ + "▁од", + "но" + ], + [ + "▁B", + "io" + ], + [ + "▁Bi", + "o" + ], + [ + "d", + "ensity" + ], + [ + "▁gl", + "impse" + ], + [ + "FO", + "RE" + ], + [ + "FOR", + "E" + ], + [ + "▁L", + "isten" + ], + [ + "▁List", + "en" + ], + [ + "▁Li", + "sten" + ], + [ + "▁Liste", + "n" + ], + [ + "▁Lis", + "ten" + ], + [ + "▁", + "Listen" + ], + [ + "Pr", + "ev" + ], + [ + "Pre", + "v" + ], + [ + "P", + "rev" + ], + [ + "}\\", + ",\\" + ], + [ + "}\\,", + "\\" + ], + [ + "}", + "\\,\\" + ], + [ + "ку", + "ль" + ], + [ + "▁S", + "EC" + ], + [ + "▁SE", + "C" + ], + [ + "▁", + "SEC" + ], + [ + "▁expl", + "ored" + ], + [ + "▁explore", + "d" + ], + [ + "▁explo", + "red" + ], + [ + "▁mean", + "time" + ], + [ + "▁meant", + "ime" + ], + [ + "AI", + "L" + ], + [ + "A", + "IL" + ], + [ + "▁W", + "P" + ], + [ + "▁", + "WP" + ], + [ + "▁r", + "aison" + ], + [ + "▁rais", + "on" + ], + [ + "▁ra", + "ison" + ], + [ + "▁ex", + "iste" + ], + [ + "▁exist", + "e" + ], + [ + "▁l", + "esser" + ], + [ + "▁les", + "ser" + ], + [ + "▁less", + "er" + ], + [ + "▁Valid", + "ate" + ], + [ + "▁", + "Validate" + ], + [ + "▁ca", + "ution" + ], + [ + "▁caut", + "ion" + ], + [ + "us", + "ta" + ], + [ + "ust", + "a" + ], + [ + "u", + "sta" + ], + [ + "he", + "ading" + ], + [ + "head", + "ing" + ], + [ + "EF", + "F" + ], + [ + "E", + "FF" + ], + [ + ".'", + "\"" + ], + [ + ".", + "'\"" + ], + [ + "▁Gil", + "bert" + ], + [ + "▁lim", + "itation" + ], + [ + "▁limit", + "ation" + ], + [ + "▁ret", + "our" + ], + [ + "▁Common", + "wealth" + ], + [ + "▁gew", + "ann" + ], + [ + "▁miser", + "able" + ], + [ + "▁net", + "working" + ], + [ + "▁network", + "ing" + ], + [ + "▁ott", + "obre" + ], + [ + "▁otto", + "bre" + ], + [ + "▁D", + "ise" + ], + [ + "▁Dis", + "e" + ], + [ + "▁Di", + "se" + ], + [ + "ed", + "ges" + ], + [ + "edge", + "s" + ], + [ + "▁s", + "ede" + ], + [ + "▁se", + "de" + ], + [ + "▁sed", + "e" + ], + [ + "ви", + "ча" + ], + [ + "вич", + "а" + ], + [ + "un", + "iform" + ], + [ + "uni", + "form" + ], + [ + "▁дея", + "тель" + ], + [ + "ir", + "os" + ], + [ + "iro", + "s" + ], + [ + "i", + "ros" + ], + [ + "▁d", + "esen" + ], + [ + "▁de", + "sen" + ], + [ + "▁des", + "en" + ], + [ + "▁p", + "arc" + ], + [ + "▁par", + "c" + ], + [ + "▁pa", + "rc" + ], + [ + "▁R", + "ico" + ], + [ + "▁Ric", + "o" 
+ ], + [ + "N", + "s" + ], + [ + "gu", + "id" + ], + [ + "gui", + "d" + ], + [ + "g", + "uid" + ], + [ + "or", + "io" + ], + [ + "ori", + "o" + ], + [ + "o", + "rio" + ], + [ + "ave", + "length" + ], + [ + "avel", + "ength" + ], + [ + "▁G", + "le" + ], + [ + "▁Gl", + "e" + ], + [ + "ince", + "ton" + ], + [ + "inc", + "eton" + ], + [ + "Am", + "az" + ], + [ + "A", + "maz" + ], + [ + "Con", + "struct" + ], + [ + "Const", + "ruct" + ], + [ + "▁m", + "x" + ], + [ + "▁", + "mx" + ], + [ + "▁V", + "ern" + ], + [ + "▁Ver", + "n" + ], + [ + "▁Ve", + "rn" + ], + [ + "▁Gener", + "ation" + ], + [ + "▁", + "Generation" + ], + [ + "J", + "ack" + ], + [ + "ro", + "mag" + ], + [ + "rom", + "ag" + ], + [ + "▁vi", + "agra" + ], + [ + "▁via", + "gra" + ], + [ + "▁P", + "eg" + ], + [ + "▁Pe", + "g" + ], + [ + "▁Up", + "dated" + ], + [ + "▁Update", + "d" + ], + [ + "▁", + "Updated" + ], + [ + "▁over", + "lap" + ], + [ + "▁overl", + "ap" + ], + [ + "Event", + "Args" + ], + [ + "к", + "ро" + ], + [ + "▁*", + "«" + ], + [ + "▁quest", + "ioned" + ], + [ + "▁question", + "ed" + ], + [ + "So", + "uth" + ], + [ + "S", + "outh" + ], + [ + "not", + "ice" + ], + [ + "▁perman", + "ently" + ], + [ + "▁permanent", + "ly" + ], + [ + "ls", + "t" + ], + [ + "l", + "st" + ], + [ + "fi", + "cie" + ], + [ + "fic", + "ie" + ], + [ + "▁qu", + "ella" + ], + [ + "▁que", + "lla" + ], + [ + "▁quel", + "la" + ], + [ + "▁college", + "s" + ], + [ + "▁colle", + "ges" + ], + [ + "▁colleg", + "es" + ], + [ + "▁disappoint", + "ment" + ], + [ + "▁Lu", + "ft" + ], + [ + "img", + "ur" + ], + [ + "▁trans", + "itions" + ], + [ + "▁transition", + "s" + ], + [ + "▁transit", + "ions" + ], + [ + "▁s", + "eller" + ], + [ + "▁sell", + "er" + ], + [ + "▁sel", + "ler" + ], + [ + "▁ию", + "ня" + ], + [ + "▁O", + "g" + ], + [ + "▁A", + "DD" + ], + [ + "▁AD", + "D" + ], + [ + "▁", + "ADD" + ], + [ + "▁P", + "ays" + ], + [ + "▁Pa", + "ys" + ], + [ + "▁Pay", + "s" + ], + [ + "COMM", + "AND" + ], + [ + "gr", + "ades" + ], + [ + "grad", + "es" + ], + [ + "grade", + "s" + ], + [ + "gra", + "des" + ], + [ + "▁fe", + "bbra" + ], + [ + "▁C", + "yr" + ], + [ + "▁Cy", + "r" + ], + [ + "▁febbra", + "io" + ], + [ + "et", + "i" + ], + [ + "e", + "ti" + ], + [ + "▁a", + "rom" + ], + [ + "▁ar", + "om" + ], + [ + "▁Cl", + "aude" + ], + [ + "▁Claud", + "e" + ], + [ + "▁UE", + "FA" + ], + [ + "▁жи", + "ве" + ], + [ + "▁Victor", + "ian" + ], + [ + "▁Victoria", + "n" + ], + [ + "ke", + "eping" + ], + [ + "keep", + "ing" + ], + [ + "kee", + "ping" + ], + [ + "ê", + "n" + ], + [ + "▁FIX", + "ME" + ], + [ + "it", + "ime" + ], + [ + "iti", + "me" + ], + [ + "i", + "time" + ], + [ + "ch", + "estr" + ], + [ + "che", + "str" + ], + [ + "ches", + "tr" + ], + [ + "▁Sam", + "sung" + ], + [ + "▁do", + "ctrine" + ], + [ + "▁p", + "ear" + ], + [ + "▁pe", + "ar" + ], + [ + "▁Mediterr", + "anean" + ], + [ + "▁Y", + "a" + ], + [ + "▁v", + "ault" + ], + [ + "▁va", + "ult" + ], + [ + "▁Hist", + "oric" + ], + [ + "▁Histor", + "ic" + ], + [ + "▁se", + "dan" + ], + [ + "▁sed", + "an" + ], + [ + "▁he", + "ated" + ], + [ + "▁heat", + "ed" + ], + [ + "▁polít", + "ica" + ], + [ + "Pro", + "of" + ], + [ + ":", + "{" + ], + [ + "fe", + "m" + ], + [ + "f", + "em" + ], + [ + "▁Frank", + "furt" + ], + [ + "pect", + "ives" + ], + [ + "pective", + "s" + ], + [ + "M", + "G" + ], + [ + "▁E", + "ye" + ], + [ + "da", + "i" + ], + [ + "d", + "ai" + ], + [ + "▁res", + "erves" + ], + [ + "▁reserv", + "es" + ], + [ + "▁reserve", + "s" + ], + [ + "NE", + "R" + ], + [ + "N", + "ER" + ], + [ + "▁tob", + "acco" + ], + 
[ + "▁frag", + "ments" + ], + [ + "▁fragment", + "s" + ], + [ + "ic", + "c" + ], + [ + "i", + "cc" + ], + [ + "▁b", + "ooth" + ], + [ + "▁bo", + "oth" + ], + [ + "▁boot", + "h" + ], + [ + "▁cru", + "ise" + ], + [ + "▁Test", + "ament" + ], + [ + "co", + "la" + ], + [ + "col", + "a" + ], + [ + "c", + "ola" + ], + [ + "▁Le", + "op" + ], + [ + "▁Leo", + "p" + ], + [ + "▁n", + "oon" + ], + [ + "▁no", + "on" + ], + [ + "▁", + "noon" + ], + [ + "▁terr", + "ified" + ], + [ + "v", + "b" + ], + [ + "int", + "el" + ], + [ + "inte", + "l" + ], + [ + "al", + "ie" + ], + [ + "ali", + "e" + ], + [ + "a", + "lie" + ], + [ + "▁ver", + "ification" + ], + [ + "yst", + "er" + ], + [ + "ys", + "ter" + ], + [ + "y", + "ster" + ], + [ + "AD", + "ER" + ], + [ + "A", + "DER" + ], + [ + "ch", + "ied" + ], + [ + "chie", + "d" + ], + [ + "chi", + "ed" + ], + [ + "▁data", + "sets" + ], + [ + "▁dat", + "asets" + ], + [ + "▁dataset", + "s" + ], + [ + "▁з", + "і" + ], + [ + "▁", + "зі" + ], + [ + "▁m", + "iem" + ], + [ + "▁mi", + "em" + ], + [ + "▁mie", + "m" + ], + [ + "ul", + "ates" + ], + [ + "ula", + "tes" + ], + [ + "ulate", + "s" + ], + [ + "▁u", + "uid" + ], + [ + "▁", + "uuid" + ], + [ + "▁Pict", + "ures" + ], + [ + "▁Picture", + "s" + ], + [ + "▁B", + "rend" + ], + [ + "▁Br", + "end" + ], + [ + "▁Bre", + "nd" + ], + [ + "▁Bren", + "d" + ], + [ + "Bill", + "board" + ], + [ + "▁s", + "tern" + ], + [ + "▁st", + "ern" + ], + [ + "▁ste", + "rn" + ], + [ + "▁ster", + "n" + ], + [ + "▁de", + "nom" + ], + [ + "▁den", + "om" + ], + [ + "▁acc", + "idents" + ], + [ + "▁accident", + "s" + ], + [ + "с", + "ня" + ], + [ + "▁p", + "acking" + ], + [ + "▁pack", + "ing" + ], + [ + "▁pac", + "king" + ], + [ + "ци", + "ја" + ], + [ + "ibli", + "cal" + ], + [ + "iblic", + "al" + ], + [ + "▁Та", + "к" + ], + [ + "▁wh", + "isk" + ], + [ + "▁whis", + "k" + ], + [ + "▁l", + "uego" + ], + [ + "▁lu", + "ego" + ], + [ + "▁rect", + "angle" + ], + [ + "▁ho", + "oks" + ], + [ + "▁hook", + "s" + ], + [ + "▁", + "hooks" + ], + [ + "▁neg", + "lect" + ], + [ + "▁negl", + "ect" + ], + [ + "▁s", + "ober" + ], + [ + "▁so", + "ber" + ], + [ + "▁sob", + "er" + ], + [ + "pro", + "position" + ], + [ + "Mult", + "iple" + ], + [ + "Multi", + "ple" + ], + [ + ":\"", + "," + ], + [ + ":", + "\"," + ], + [ + "▁b", + "apt" + ], + [ + "▁ba", + "pt" + ], + [ + "Par", + "ts" + ], + [ + "Part", + "s" + ], + [ + "P", + "arts" + ], + [ + "▁S", + "election" + ], + [ + "▁Se", + "lection" + ], + [ + "▁Sel", + "ection" + ], + [ + "▁Select", + "ion" + ], + [ + "▁", + "Selection" + ], + [ + "▁Al", + "pha" + ], + [ + "▁", + "Alpha" + ], + [ + "we", + "ights" + ], + [ + "weight", + "s" + ], + [ + "ha", + "ll" + ], + [ + "hal", + "l" + ], + [ + "h", + "all" + ], + [ + "со", + "б" + ], + [ + "с", + "об" + ], + [ + "▁l", + "ur" + ], + [ + "▁lu", + "r" + ], + [ + "▁ép", + "oca" + ], + [ + "▁re", + "sted" + ], + [ + "▁r", + "ested" + ], + [ + "▁res", + "ted" + ], + [ + "▁rest", + "ed" + ], + [ + "▁reste", + "d" + ], + [ + "amb", + "igu" + ], + [ + "▁taste", + "s" + ], + [ + "▁tast", + "es" + ], + [ + "amazon", + "aws" + ], + [ + "▁conf", + "ess" + ], + [ + "▁dic", + "iembre" + ], + [ + "▁dici", + "embre" + ], + [ + "im", + "plement" + ], + [ + "impl", + "ement" + ], + [ + "imp", + "lement" + ], + [ + "▁absor", + "ption" + ], + [ + "Ha", + "l" + ], + [ + "H", + "al" + ], + [ + "LE", + "AN" + ], + [ + "▁Z", + "ach" + ], + [ + "▁Za", + "ch" + ], + [ + "▁free", + "ze" + ], + [ + "▁fre", + "eze" + ], + [ + "L", + "BL" + ], + [ + "ST", + "M" + ], + [ + "S", + "TM" + ], + [ + "▁cal", 
+ "c" + ], + [ + "▁ca", + "lc" + ], + [ + "▁", + "calc" + ], + [ + "={", + "()" + ], + [ + "=", + "*/" + ], + [ + "▁b", + "t" + ], + [ + "▁", + "bt" + ], + [ + "Re", + "b" + ], + [ + "R", + "eb" + ], + [ + "▁W", + "ien" + ], + [ + "▁Wi", + "en" + ], + [ + "an", + "ska" + ], + [ + "ans", + "ka" + ], + [ + "ansk", + "a" + ], + [ + "▁s", + "urn" + ], + [ + "▁su", + "rn" + ], + [ + "▁sur", + "n" + ], + [ + "iat", + "ive" + ], + [ + "i", + "ative" + ], + [ + "▁inv", + "ån" + ], + [ + "C", + "Y" + ], + [ + "▁l", + "à" + ], + [ + "am", + "ba" + ], + [ + "amb", + "a" + ], + [ + "le", + "en" + ], + [ + "lee", + "n" + ], + [ + "l", + "een" + ], + [ + "wa", + "hl" + ], + [ + "w", + "ahl" + ], + [ + "▁function", + "ing" + ], + [ + "ți", + "a" + ], + [ + "ț", + "ia" + ], + [ + "get", + "Context" + ], + [ + "ga", + "rt" + ], + [ + "gar", + "t" + ], + [ + "g", + "art" + ], + [ + "▁о", + "бе" + ], + [ + "▁об", + "е" + ], + [ + "Pe", + "n" + ], + [ + "P", + "en" + ], + [ + "vi", + "k" + ], + [ + "v", + "ik" + ], + [ + "Sl", + "ider" + ], + [ + "▁Ac", + "cept" + ], + [ + "▁", + "Accept" + ], + [ + "Ga", + "p" + ], + [ + "G", + "ap" + ], + [ + "▁J", + "orge" + ], + [ + "SI", + "G" + ], + [ + "S", + "IG" + ], + [ + "▁во", + "с" + ], + [ + "▁го", + "ло" + ], + [ + "▁г", + "оло" + ], + [ + "▁period", + "o" + ], + [ + "ш", + "та" + ], + [ + "▁pat", + "ches" + ], + [ + "▁patch", + "es" + ], + [ + "ко", + "ї" + ], + [ + "är", + "e" + ], + [ + "ä", + "re" + ], + [ + "eng", + "ono" + ], + [ + "li", + "sta" + ], + [ + "list", + "a" + ], + [ + "l", + "ista" + ], + [ + "hor", + "n" + ], + [ + "ho", + "rn" + ], + [ + "h", + "orn" + ], + [ + "▁Com", + "plex" + ], + [ + "▁Comp", + "lex" + ], + [ + "▁", + "Complex" + ], + [ + "Se", + "nt" + ], + [ + "S", + "ent" + ], + [ + "tr", + "fs" + ], + [ + "▁conv", + "ex" + ], + [ + "▁conve", + "x" + ], + [ + "Gener", + "ation" + ], + [ + "▁міс", + "це" + ], + [ + "com", + "press" + ], + [ + "comp", + "ress" + ], + [ + "▁S", + "ax" + ], + [ + "▁Sa", + "x" + ], + [ + "▁u", + "id" + ], + [ + "▁ui", + "d" + ], + [ + "▁", + "uid" + ], + [ + "▁Leb", + "ens" + ], + [ + "▁Leben", + "s" + ], + [ + "Com", + "pletion" + ], + [ + "\\|", + "_{" + ], + [ + "\\", + "|_{" + ], + [ + "in", + "sky" + ], + [ + "ins", + "ky" + ], + [ + "▁sc", + "hon" + ], + [ + "▁sch", + "on" + ], + [ + "▁m", + "asters" + ], + [ + "▁ma", + "sters" + ], + [ + "▁master", + "s" + ], + [ + "▁mas", + "ters" + ], + [ + "▁mast", + "ers" + ], + [ + "in", + "depend" + ], + [ + "inde", + "pend" + ], + [ + "ne", + "ys" + ], + [ + "ney", + "s" + ], + [ + "▁l", + "ied" + ], + [ + "▁li", + "ed" + ], + [ + "▁lie", + "d" + ], + [ + "▁a", + "spir" + ], + [ + "▁asp", + "ir" + ], + [ + "ч", + "ні" + ], + [ + "▁break", + "down" + ], + [ + "▁H", + "arm" + ], + [ + "▁Har", + "m" + ], + [ + "▁Ha", + "rm" + ], + [ + "▁design", + "ing" + ], + [ + "h", + "f" + ], + [ + "▁Ang", + "ela" + ], + [ + "▁Angel", + "a" + ], + [ + "▁con", + "fer" + ], + [ + "▁conf", + "er" + ], + [ + "▁part", + "ido" + ], + [ + "▁parti", + "do" + ], + [ + "▁inter", + "ference" + ], + [ + "ma", + "o" + ], + [ + "m", + "ao" + ], + [ + "▁absor", + "bed" + ], + [ + "▁absorb", + "ed" + ], + [ + "▁V", + "all" + ], + [ + "▁Val", + "l" + ], + [ + "▁Va", + "ll" + ], + [ + "Error", + "Code" + ], + [ + "▁Publish", + "ing" + ], + [ + "va", + "no" + ], + [ + "van", + "o" + ], + [ + "v", + "ano" + ], + [ + "BIT", + "S" + ], + [ + "BI", + "TS" + ], + [ + "B", + "ITS" + ], + [ + "▁de", + "er" + ], + [ + "▁Camp", + "aign" + ], + [ + "▁g", + "raz" + ], + [ + "▁gr", + "az" + ], + [ + 
"▁gra", + "z" + ], + [ + "CHAN", + "GE" + ], + [ + "▁f", + "eder" + ], + [ + "▁fe", + "der" + ], + [ + "▁fed", + "er" + ], + [ + "if", + "fe" + ], + [ + "iff", + "e" + ], + [ + "hand", + "ed" + ], + [ + "han", + "ded" + ], + [ + "h", + "anded" + ], + [ + "c", + "q" + ], + [ + "um", + "bing" + ], + [ + "umb", + "ing" + ], + [ + "▁un", + "re" + ], + [ + "▁s", + "iendo" + ], + [ + "▁si", + "endo" + ], + [ + "▁sim", + "pler" + ], + [ + "▁simple", + "r" + ], + [ + "▁simpl", + "er" + ], + [ + "wh", + "y" + ], + [ + "w", + "hy" + ], + [ + "ar", + "ettes" + ], + [ + "are", + "ttes" + ], + [ + "aret", + "tes" + ], + [ + "arette", + "s" + ], + [ + "an", + "st" + ], + [ + "ans", + "t" + ], + [ + "▁h", + "ass" + ], + [ + "▁has", + "s" + ], + [ + "▁ha", + "ss" + ], + [ + "▁Enter", + "prise" + ], + [ + "▁m", + "ois" + ], + [ + "▁mo", + "is" + ], + [ + "▁F", + "o" + ], + [ + "▁уча", + "ст" + ], + [ + "ff", + "en" + ], + [ + "f", + "fen" + ], + [ + "▁MOD", + "ULE" + ], + [ + "▁", + "MODULE" + ], + [ + "▁activ", + "ated" + ], + [ + "▁activate", + "d" + ], + [ + "▁intern", + "acional" + ], + [ + "▁M", + "ittel" + ], + [ + "deg", + "ree" + ], + [ + "▁от", + "кры" + ], + [ + "▁&", + "(" + ], + [ + "get", + "Property" + ], + [ + "is", + "z" + ], + [ + "i", + "sz" + ], + [ + "ced", + "ure" + ], + [ + "▁en", + "ters" + ], + [ + "▁ent", + "ers" + ], + [ + "▁enter", + "s" + ], + [ + "▁S", + "ally" + ], + [ + "▁Sal", + "ly" + ], + [ + "▁Tr", + "ain" + ], + [ + "▁Tra", + "in" + ], + [ + "▁lo", + "gged" + ], + [ + "▁log", + "ged" + ], + [ + "▁R", + "av" + ], + [ + "▁Ra", + "v" + ], + [ + "▁A", + "void" + ], + [ + "▁Av", + "oid" + ], + [ + "▁K", + "aiser" + ], + [ + "▁Ka", + "iser" + ], + [ + "▁ex", + "pend" + ], + [ + "▁exp", + "end" + ], + [ + "ap", + "hor" + ], + [ + "aph", + "or" + ], + [ + "▁b", + "rass" + ], + [ + "▁br", + "ass" + ], + [ + "▁bra", + "ss" + ], + [ + "▁bras", + "s" + ], + [ + "▁mel", + "od" + ], + [ + "▁att", + "itudes" + ], + [ + "▁attitude", + "s" + ], + [ + "*", + "\"" + ], + [ + "W", + "all" + ], + [ + "▁o", + "we" + ], + [ + "▁", + "owe" + ], + [ + "▁b", + "amb" + ], + [ + "▁ba", + "mb" + ], + [ + "sh", + "ader" + ], + [ + "sha", + "der" + ], + [ + "ce", + "ster" + ], + [ + "ces", + "ter" + ], + [ + "c", + "ester" + ], + [ + "▁P", + "P" + ], + [ + "▁", + "PP" + ], + [ + "▁migr", + "ations" + ], + [ + "▁migration", + "s" + ], + [ + "ent", + "ric" + ], + [ + "entr", + "ic" + ], + [ + "▁Set", + "up" + ], + [ + "▁", + "Setup" + ], + [ + "▁Art", + "ist" + ], + [ + "hr", + "e" + ], + [ + "h", + "re" + ], + [ + "▁pol", + "ite" + ], + [ + "▁polit", + "e" + ], + [ + "ah", + "an" + ], + [ + "aha", + "n" + ], + [ + "a", + "han" + ], + [ + "▁lug", + "lio" + ], + [ + "▁pre", + "decess" + ], + [ + "▁S", + "IG" + ], + [ + "▁SI", + "G" + ], + [ + "▁", + "SIG" + ], + [ + "ті", + "в" + ], + [ + "т", + "ів" + ], + [ + "▁R", + "F" + ], + [ + "▁", + "RF" + ], + [ + "▁D", + "ry" + ], + [ + "▁Dr", + "y" + ], + [ + "▁m", + "aker" + ], + [ + "▁make", + "r" + ], + [ + "▁ma", + "ker" + ], + [ + "▁", + "maker" + ], + [ + "ши", + "м" + ], + [ + "ш", + "им" + ], + [ + "▁S", + "ounds" + ], + [ + "▁Sound", + "s" + ], + [ + "▁implement", + "ing" + ], + [ + "▁a", + "h" + ], + [ + "▁", + "ah" + ], + [ + "▁g", + "ev" + ], + [ + "▁ge", + "v" + ], + [ + "▁du", + "plicate" + ], + [ + "▁L", + "ogan" + ], + [ + "▁Log", + "an" + ], + [ + "▁Lo", + "gan" + ], + [ + "▁G", + "rade" + ], + [ + "▁Gr", + "ade" + ], + [ + "▁Grad", + "e" + ], + [ + "▁Gra", + "de" + ], + [ + "DU", + "CT" + ], + [ + "ís", + "es" + ], + [ + "í", + "ses" + ], + [ 
+ "ér", + "t" + ], + [ + "é", + "rt" + ], + [ + "▁nons", + "ense" + ], + [ + "back", + "up" + ], + [ + "Att", + "achment" + ], + [ + "▁e", + "cc" + ], + [ + "▁ec", + "c" + ], + [ + "▁Squad", + "ron" + ], + [ + "le", + "arn" + ], + [ + "lear", + "n" + ], + [ + "de", + "precated" + ], + [ + "dep", + "recated" + ], + [ + "▁A", + "ub" + ], + [ + "▁Au", + "b" + ], + [ + "▁G", + "ol" + ], + [ + "▁Go", + "l" + ], + [ + "▁over", + "l" + ], + [ + "SER", + "VICE" + ], + [ + "▁beautiful", + "ly" + ], + [ + "RE", + "L" + ], + [ + "R", + "EL" + ], + [ + "▁G", + "ian" + ], + [ + "▁Gi", + "an" + ], + [ + "▁P", + "apa" + ], + [ + "▁Pa", + "pa" + ], + [ + "▁Pap", + "a" + ], + [ + "res", + "pond" + ], + [ + "respon", + "d" + ], + [ + "resp", + "ond" + ], + [ + "▁Carib", + "bean" + ], + [ + "r", + "n" + ], + [ + "▁худо", + "ж" + ], + [ + "C", + "fg" + ], + [ + "ra", + "i" + ], + [ + "r", + "ai" + ], + [ + "▁sn", + "iff" + ], + [ + "tt", + "o" + ], + [ + "t", + "to" + ], + [ + "оло", + "ги" + ], + [ + "о", + "логи" + ], + [ + "▁r", + "b" + ], + [ + "▁", + "rb" + ], + [ + "▁inc", + "idents" + ], + [ + "▁incident", + "s" + ], + [ + "▁d", + "uck" + ], + [ + "▁du", + "ck" + ], + [ + "▁PROVID", + "ED" + ], + [ + "Source", + "s" + ], + [ + "S", + "ources" + ], + [ + "▁Chel", + "sea" + ], + [ + "▁t", + "ek" + ], + [ + "▁te", + "k" + ], + [ + "▁", + "tek" + ], + [ + "▁на", + "лази" + ], + [ + "▁pil", + "ots" + ], + [ + "▁pilot", + "s" + ], + [ + "т", + "ки" + ], + [ + "▁tr", + "aded" + ], + [ + "▁trad", + "ed" + ], + [ + "▁tra", + "ded" + ], + [ + "▁trade", + "d" + ], + [ + "▁Be", + "ijing" + ], + [ + "▁Greg", + "ory" + ], + [ + "scal", + "ar" + ], + [ + "▁incl", + "ined" + ], + [ + "▁inc", + "lined" + ], + [ + "▁K", + "amp" + ], + [ + "▁Kam", + "p" + ], + [ + "▁Ka", + "mp" + ], + [ + "▁M", + "arian" + ], + [ + "▁Mar", + "ian" + ], + [ + "▁Ma", + "rian" + ], + [ + "▁Maria", + "n" + ], + [ + "▁fier", + "ce" + ], + [ + "▁the", + "ft" + ], + [ + "▁th", + "eft" + ], + [ + "ющи", + "х" + ], + [ + "▁In", + "to" + ], + [ + "▁Int", + "o" + ], + [ + "▁", + "Into" + ], + [ + "con", + "straint" + ], + [ + "parent", + "Node" + ], + [ + "id", + "ental" + ], + [ + "ident", + "al" + ], + [ + "iden", + "tal" + ], + [ + "▁gouver", + "nement" + ], + [ + "▁S", + "ND" + ], + [ + "▁SN", + "D" + ], + [ + "▁Rub", + "y" + ], + [ + "▁Ru", + "by" + ], + [ + "▁mon", + "aster" + ], + [ + "Rec", + "ords" + ], + [ + "Record", + "s" + ], + [ + "▁K", + "ab" + ], + [ + "▁Ka", + "b" + ], + [ + "▁Un", + "iverse" + ], + [ + "▁Univers", + "e" + ], + [ + "▁approxim", + "ate" + ], + [ + "▁approx", + "imate" + ], + [ + "W", + "ater" + ], + [ + "▁Phys", + "ical" + ], + [ + "ap", + "pers" + ], + [ + "app", + "ers" + ], + [ + "appe", + "rs" + ], + [ + "oubt", + "edly" + ], + [ + "ло", + "жен" + ], + [ + "ложе", + "н" + ], + [ + "▁tow", + "el" + ], + [ + "▁sib", + "lings" + ], + [ + "ep", + "h" + ], + [ + "e", + "ph" + ], + [ + "ic", + "ios" + ], + [ + "ici", + "os" + ], + [ + "icio", + "s" + ], + [ + "ра", + "ми" + ], + [ + "▁out", + "rage" + ], + [ + "▁tamb", + "é" + ], + [ + "SR", + "C" + ], + [ + "S", + "RC" + ], + [ + "те", + "лем" + ], + [ + "тел", + "ем" + ], + [ + "V", + "i" + ], + [ + ".'", + ");" + ], + [ + ".", + "');" + ], + [ + "L", + "M" + ], + [ + "▁m", + "itt" + ], + [ + "▁mit", + "t" + ], + [ + "▁mi", + "tt" + ], + [ + "▁", + "mitt" + ], + [ + "▁w", + "eed" + ], + [ + "▁we", + "ed" + ], + [ + "▁cr", + "ops" + ], + [ + "▁cro", + "ps" + ], + [ + "▁crop", + "s" + ], + [ + "im", + "an" + ], + [ + "ima", + "n" + ], + [ + "i", + "man" + ], + [ + 
"Cl", + "aim" + ], + [ + "ins", + "ula" + ], + [ + "▁(", + "“" + ], + [ + "▁Ch", + "anges" + ], + [ + "▁Change", + "s" + ], + [ + "▁", + "Changes" + ], + [ + "▁invån", + "are" + ], + [ + "ag", + "ain" + ], + [ + "aga", + "in" + ], + [ + "a", + "gain" + ], + [ + "▁c", + "nt" + ], + [ + "▁", + "cnt" + ], + [ + "▁G", + "az" + ], + [ + "▁Ga", + "z" + ], + [ + "▁a", + "ustral" + ], + [ + "over", + "lay" + ], + [ + "▁Me", + "chan" + ], + [ + "▁sl", + "ammed" + ], + [ + "▁tr", + "ailing" + ], + [ + "▁tra", + "iling" + ], + [ + "▁trail", + "ing" + ], + [ + "▁Bi", + "ography" + ], + [ + "▁appe", + "aling" + ], + [ + "▁appeal", + "ing" + ], + [ + "IV", + "ER" + ], + [ + "IVE", + "R" + ], + [ + "I", + "VER" + ], + [ + "▁A", + "ve" + ], + [ + "▁Av", + "e" + ], + [ + "▁P", + "lot" + ], + [ + "▁Pl", + "ot" + ], + [ + "vo", + "j" + ], + [ + "v", + "oj" + ], + [ + "▁s", + "ung" + ], + [ + "▁su", + "ng" + ], + [ + "▁sun", + "g" + ], + [ + "▁", + "sung" + ], + [ + "▁u", + "nos" + ], + [ + "▁un", + "os" + ], + [ + "▁uno", + "s" + ], + [ + "Effect", + "s" + ], + [ + "v", + "v" + ], + [ + "co", + "ok" + ], + [ + "c", + "ook" + ], + [ + "But", + "tons" + ], + [ + "Button", + "s" + ], + [ + "▁trans", + "m" + ], + [ + "ier", + "to" + ], + [ + "iert", + "o" + ], + [ + "CON", + "TEXT" + ], + [ + "CONT", + "EXT" + ], + [ + "▁dign", + "ity" + ], + [ + "air", + "ed" + ], + [ + "ai", + "red" + ], + [ + "aire", + "d" + ], + [ + "a", + "ired" + ], + [ + "java", + "x" + ], + [ + "jav", + "ax" + ], + [ + "j", + "avax" + ], + [ + "▁Albert", + "o" + ], + [ + "▁Alber", + "to" + ], + [ + "▁Rec", + "ently" + ], + [ + "▁Recent", + "ly" + ], + [ + "▁fac", + "ial" + ], + [ + "▁fa", + "cial" + ], + [ + "math", + "op" + ], + [ + "mat", + "hop" + ], + [ + "ał", + "o" + ], + [ + "a", + "ło" + ], + [ + "ви", + "д" + ], + [ + "co", + "tt" + ], + [ + "c", + "ott" + ], + [ + "Vari", + "ables" + ], + [ + "Variable", + "s" + ], + [ + "▁R", + "an" + ], + [ + "▁Ra", + "n" + ], + [ + "▁b", + "unk" + ], + [ + "am", + "iliar" + ], + [ + "amil", + "iar" + ], + [ + "CA", + "ST" + ], + [ + "C", + "AST" + ], + [ + "▁fr", + "ü" + ], + [ + "VE", + "D" + ], + [ + "V", + "ED" + ], + [ + "▁NOT", + "ICE" + ], + [ + "▁turn", + "o" + ], + [ + "▁tur", + "no" + ], + [ + "valid", + "ator" + ], + [ + "▁Portug", + "uese" + ], + [ + "▁question", + "ing" + ], + [ + "}}", + ")" + ], + [ + "}", + "})" + ], + [ + "▁l", + "ear" + ], + [ + "▁le", + "ar" + ], + [ + "▁", + "lear" + ], + [ + "X", + "amarin" + ], + [ + "▁dis", + "adv" + ], + [ + "enc", + "oded" + ], + [ + "encode", + "d" + ], + [ + "▁K", + "ot" + ], + [ + "▁Ko", + "t" + ], + [ + "ra", + "ted" + ], + [ + "rat", + "ed" + ], + [ + "rate", + "d" + ], + [ + "r", + "ated" + ], + [ + "▁The", + "ory" + ], + [ + "ci", + "us" + ], + [ + "c", + "ius" + ], + [ + "▁Dar", + "win" + ], + [ + "ђ", + "е" + ], + [ + "▁dé", + "cl" + ], + [ + "▁déc", + "l" + ], + [ + "▁обла", + "сть" + ], + [ + "ро", + "вич" + ], + [ + "▁mob", + "ility" + ], + [ + "▁mobil", + "ity" + ], + [ + "V", + "F" + ], + [ + "▁х", + "и" + ], + [ + "▁", + "хи" + ], + [ + "un", + "til" + ], + [ + "unt", + "il" + ], + [ + "u", + "ntil" + ], + [ + "▁bar", + "riers" + ], + [ + "▁barrier", + "s" + ], + [ + "▁barr", + "iers" + ], + [ + "gi", + "f" + ], + [ + "g", + "if" + ], + [ + "▁R", + "oh" + ], + [ + "▁Ro", + "h" + ], + [ + "▁a", + "ging" + ], + [ + "▁ag", + "ing" + ], + [ + "▁", + "aging" + ], + [ + "▁W", + "idget" + ], + [ + "▁", + "Widget" + ], + [ + "ol", + "k" + ], + [ + "▁f", + "arms" + ], + [ + "▁far", + "ms" + ], + [ + "▁farm", + "s" + ], + [ + 
"Check", + "er" + ], + [ + "Che", + "cker" + ], + [ + "Int", + "roduction" + ], + [ + "с", + "мо" + ], + [ + "▁Russ", + "ians" + ], + [ + "▁Russian", + "s" + ], + [ + "▁Russia", + "ns" + ], + [ + "na", + "ments" + ], + [ + "nam", + "ents" + ], + [ + "nament", + "s" + ], + [ + "n", + "aments" + ], + [ + "▁In", + "sert" + ], + [ + "▁Ins", + "ert" + ], + [ + "▁", + "Insert" + ], + [ + "▁When", + "ever" + ], + [ + "▁Whe", + "never" + ], + [ + "er", + "set" + ], + [ + "ers", + "et" + ], + [ + "it", + "ori" + ], + [ + "itor", + "i" + ], + [ + "ito", + "ri" + ], + [ + "▁D", + "ort" + ], + [ + "▁Do", + "rt" + ], + [ + "▁Dor", + "t" + ], + [ + "▁cost", + "ume" + ], + [ + "▁mathemat", + "ical" + ], + [ + "▁B", + "ast" + ], + [ + "▁Bas", + "t" + ], + [ + "▁Ba", + "st" + ], + [ + "▁nom", + "inated" + ], + [ + "▁nomin", + "ated" + ], + [ + "▁rest", + "oration" + ], + [ + "pos", + "al" + ], + [ + "po", + "sal" + ], + [ + "▁un", + "fortunate" + ], + [ + "P", + "s" + ], + [ + "LI", + "N" + ], + [ + "L", + "IN" + ], + [ + "▁int", + "act" + ], + [ + "▁prov", + "oc" + ], + [ + "▁situ", + "ée" + ], + [ + "▁но", + "ября" + ], + [ + "er", + "mo" + ], + [ + "erm", + "o" + ], + [ + "▁f", + "isher" + ], + [ + "▁fish", + "er" + ], + [ + "▁fis", + "her" + ], + [ + "г", + "ля" + ], + [ + "▁con", + "ting" + ], + [ + "▁cont", + "ing" + ], + [ + "▁contin", + "g" + ], + [ + "▁Do", + "ug" + ], + [ + "▁Dou", + "g" + ], + [ + "\"", + "?" + ], + [ + "▁E", + "va" + ], + [ + "▁Ev", + "a" + ], + [ + "▁t", + "ops" + ], + [ + "▁to", + "ps" + ], + [ + "▁top", + "s" + ], + [ + "▁Rem", + "ote" + ], + [ + "▁", + "Remote" + ], + [ + "▁art", + "work" + ], + [ + "▁art", + "illery" + ], + [ + "qu", + "ick" + ], + [ + "▁Arab", + "ia" + ], + [ + "▁SD", + "Value" + ], + [ + "▁Dak", + "ota" + ], + [ + "ia", + "ted" + ], + [ + "iat", + "ed" + ], + [ + "iate", + "d" + ], + [ + "i", + "ated" + ], + [ + "▁Op", + "tim" + ], + [ + "▁Opt", + "im" + ], + [ + "but", + "tons" + ], + [ + "button", + "s" + ], + [ + "▁c", + "ottage" + ], + [ + "▁where", + "in" + ], + [ + "▁tut", + "orial" + ], + [ + "▁S", + "cre" + ], + [ + "▁Sc", + "re" + ], + [ + "▁swe", + "ep" + ], + [ + "▁Coff", + "ee" + ], + [ + "})", + "}" + ], + [ + "}", + ")}" + ], + [ + "▁му", + "зы" + ], + [ + "host", + "name" + ], + [ + "▁T", + "emp" + ], + [ + "▁Te", + "mp" + ], + [ + "▁Tem", + "p" + ], + [ + "▁", + "Temp" + ], + [ + "▁F", + "ut" + ], + [ + "▁Fu", + "t" + ], + [ + "re", + "spect" + ], + [ + "res", + "pect" + ], + [ + "resp", + "ect" + ], + [ + "oc", + "z" + ], + [ + "o", + "cz" + ], + [ + "▁pre", + "domin" + ], + [ + "▁pred", + "omin" + ], + [ + "Ind", + "icator" + ], + [ + "en", + "cial" + ], + [ + "enc", + "ial" + ], + [ + "encia", + "l" + ], + [ + "enci", + "al" + ], + [ + "UM", + "ENT" + ], + [ + "U", + "MENT" + ], + [ + "▁SH", + "ALL" + ], + [ + "▁SHA", + "LL" + ], + [ + "▁comm", + "anded" + ], + [ + "▁command", + "ed" + ], + [ + "▁withdraw", + "al" + ], + [ + "io", + "ur" + ], + [ + "i", + "our" + ], + [ + "REG", + "ION" + ], + [ + "s", + "printf" + ], + [ + "▁в", + "ме" + ], + [ + "▁Pay", + "ment" + ], + [ + "▁", + "Payment" + ], + [ + "▁A", + "nim" + ], + [ + "▁An", + "im" + ], + [ + "▁", + "Anim" + ], + [ + "pub", + "lish" + ], + [ + "▁se", + "eks" + ], + [ + "▁see", + "ks" + ], + [ + "▁seek", + "s" + ], + [ + "ou", + "w" + ], + [ + "o", + "uw" + ], + [ + "▁G", + "M" + ], + [ + "▁", + "GM" + ], + [ + "ru", + "gu" + ], + [ + "rug", + "u" + ], + [ + "r", + "ugu" + ], + [ + "us", + "tain" + ], + [ + "ust", + "ain" + ], + [ + "usta", + "in" + ], + [ + "▁)", + ")" + ], + 
[ + "▁", + "))" + ], + [ + "▁consult", + "ing" + ], + [ + "▁D", + "ialog" + ], + [ + "▁", + "Dialog" + ], + [ + "▁L", + "ars" + ], + [ + "▁La", + "rs" + ], + [ + "▁Lar", + "s" + ], + [ + "▁crit", + "ique" + ], + [ + "▁circ", + "ulation" + ], + [ + "▁circul", + "ation" + ], + [ + "▁land", + "sc" + ], + [ + "▁lands", + "c" + ], + [ + "man", + "aged" + ], + [ + "▁C", + "raft" + ], + [ + "▁Cr", + "aft" + ], + [ + "▁Cra", + "ft" + ], + [ + "▁h", + "erman" + ], + [ + "▁her", + "man" + ], + [ + "af", + "i" + ], + [ + "a", + "fi" + ], + [ + "am", + "y" + ], + [ + "a", + "my" + ], + [ + "▁disc", + "our" + ], + [ + "▁disco", + "ur" + ], + [ + "<>", + "(" + ], + [ + "<", + ">(" + ], + [ + "▁St", + "eph" + ], + [ + "▁Ste", + "ph" + ], + [ + "▁Step", + "h" + ], + [ + "▁toler", + "ance" + ], + [ + "type", + "name" + ], + [ + "typ", + "ename" + ], + [ + "typen", + "ame" + ], + [ + "vent", + "ions" + ], + [ + "vention", + "s" + ], + [ + "zi", + "ał" + ], + [ + "z", + "iał" + ], + [ + "ст", + "ов" + ], + [ + "сто", + "в" + ], + [ + "с", + "тов" + ], + [ + "▁st", + "icking" + ], + [ + "▁stick", + "ing" + ], + [ + "AS", + "C" + ], + [ + "A", + "SC" + ], + [ + "IS", + "O" + ], + [ + "I", + "SO" + ], + [ + "▁Sp", + "encer" + ], + [ + "▁Di", + "dn" + ], + [ + "▁Did", + "n" + ], + [ + "gom", + "ery" + ], + [ + "im", + "iter" + ], + [ + "imit", + "er" + ], + [ + "imi", + "ter" + ], + [ + "dr", + "u" + ], + [ + "d", + "ru" + ], + [ + "Cl", + "ause" + ], + [ + "▁sl", + "ides" + ], + [ + "▁slide", + "s" + ], + [ + "▁slid", + "es" + ], + [ + "##", + "#" + ], + [ + "#", + "##" + ], + [ + "▁S", + "ugar" + ], + [ + "▁Su", + "gar" + ], + [ + "H", + "Y" + ], + [ + "▁э", + "ти" + ], + [ + "▁Ed", + "wards" + ], + [ + "▁Edward", + "s" + ], + [ + "▁c", + "ents" + ], + [ + "▁cent", + "s" + ], + [ + "oy", + "a" + ], + [ + "o", + "ya" + ], + [ + "ser", + "ts" + ], + [ + "sert", + "s" + ], + [ + "s", + "erts" + ], + [ + "▁H", + "ass" + ], + [ + "▁Ha", + "ss" + ], + [ + "▁Has", + "s" + ], + [ + "▁in", + "gen" + ], + [ + "▁ing", + "en" + ], + [ + "▁", + "ingen" + ], + [ + "ст", + "ри" + ], + [ + "с", + "три" + ], + [ + "▁s", + "addle" + ], + [ + "sol", + "id" + ], + [ + "s", + "olid" + ], + [ + "▁ch", + "ampions" + ], + [ + "▁champion", + "s" + ], + [ + "▁champ", + "ions" + ], + [ + "-", + ")" + ], + [ + "▁S", + "lov" + ], + [ + "▁Sl", + "ov" + ], + [ + "▁sh", + "iny" + ], + [ + "▁*", + ")&" + ], + [ + "▁*)", + "&" + ], + [ + "▁Def", + "ine" + ], + [ + "č", + "e" + ], + [ + "▁scr", + "ut" + ], + [ + "on", + "den" + ], + [ + "ond", + "en" + ], + [ + "onde", + "n" + ], + [ + "'\"", + "," + ], + [ + "'", + "\"," + ], + [ + "uf", + "fs" + ], + [ + "uff", + "s" + ], + [ + "▁o", + "lymp" + ], + [ + "id", + "ential" + ], + [ + "ident", + "ial" + ], + [ + "wa", + "nd" + ], + [ + "wan", + "d" + ], + [ + "w", + "and" + ], + [ + "▁ann", + "ually" + ], + [ + "▁annual", + "ly" + ], + [ + "▁Ark", + "ansas" + ], + [ + "▁s", + "aint" + ], + [ + "▁sa", + "int" + ], + [ + "▁gle", + "ich" + ], + [ + "▁per", + "fection" + ], + [ + "▁perfect", + "ion" + ], + [ + "▁perf", + "ection" + ], + [ + ")", + ">" + ], + [ + "▁sh", + "orts" + ], + [ + "▁short", + "s" + ], + [ + "▁just", + "ified" + ], + [ + "pe", + "ated" + ], + [ + "peat", + "ed" + ], + [ + "pack", + "ages" + ], + [ + "package", + "s" + ], + [ + "dr", + "iven" + ], + [ + "drive", + "n" + ], + [ + "d", + "riven" + ], + [ + "▁Liber", + "ty" + ], + [ + "▁str", + "ipped" + ], + [ + "▁stri", + "pped" + ], + [ + "▁strip", + "ped" + ], + [ + "ше", + "ние" + ], + [ + "▁fün", + "f" + ], + [ + "▁e", + 
"cosystem" + ], + [ + "ix", + "a" + ], + [ + "i", + "xa" + ], + [ + "▁F", + "resh" + ], + [ + "▁Fr", + "esh" + ], + [ + "▁Fre", + "sh" + ], + [ + "var", + "t" + ], + [ + "va", + "rt" + ], + [ + "v", + "art" + ], + [ + "▁tre", + "ats" + ], + [ + "▁treat", + "s" + ], + [ + "▁st", + "ance" + ], + [ + "▁stan", + "ce" + ], + [ + "▁", + "stance" + ], + [ + "чё", + "т" + ], + [ + "ч", + "ёт" + ], + [ + "▁p", + "ity" + ], + [ + "▁pi", + "ty" + ], + [ + "▁pit", + "y" + ], + [ + "ad", + "ém" + ], + [ + "▁о", + "кон" + ], + [ + "▁ок", + "он" + ], + [ + "▁C", + "hand" + ], + [ + "▁Ch", + "and" + ], + [ + "▁Cha", + "nd" + ], + [ + "ra", + "b" + ], + [ + "r", + "ab" + ], + [ + "вши", + "й" + ], + [ + "в", + "ший" + ], + [ + "in", + "ski" + ], + [ + "ins", + "ki" + ], + [ + "▁contin", + "ually" + ], + [ + "▁continu", + "ally" + ], + [ + "▁D", + "addy" + ], + [ + "▁Dad", + "dy" + ], + [ + "▁night", + "mare" + ], + [ + "ic", + "ional" + ], + [ + "ici", + "onal" + ], + [ + "icio", + "nal" + ], + [ + "icion", + "al" + ], + [ + "▁e", + "fect" + ], + [ + "ue", + "blo" + ], + [ + "▁l", + "anç" + ], + [ + "▁lan", + "ç" + ], + [ + "▁Col", + "lections" + ], + [ + "▁Collection", + "s" + ], + [ + "▁Collect", + "ions" + ], + [ + "▁", + "Collections" + ], + [ + "du", + "e" + ], + [ + "d", + "ue" + ], + [ + "am", + "pton" + ], + [ + "amp", + "ton" + ], + [ + "▁mem", + "cpy" + ], + [ + "▁", + "memcpy" + ], + [ + "▁*", + "*(" + ], + [ + "▁**", + "(" + ], + [ + "is", + "sent" + ], + [ + "iss", + "ent" + ], + [ + "isse", + "nt" + ], + [ + "issen", + "t" + ], + [ + "▁In", + "sp" + ], + [ + "▁Ins", + "p" + ], + [ + "▁Glas", + "gow" + ], + [ + "▁fur", + "ono" + ], + [ + "▁kind", + "ness" + ], + [ + "B", + "i" + ], + [ + "▁comp", + "eted" + ], + [ + "▁compet", + "ed" + ], + [ + "▁compete", + "d" + ], + [ + "▁o", + "ak" + ], + [ + "L", + "arge" + ], + [ + "▁dis", + "gu" + ], + [ + "▁disg", + "u" + ], + [ + "▁k", + "ings" + ], + [ + "▁king", + "s" + ], + [ + "▁kin", + "gs" + ], + [ + "та", + "ми" + ], + [ + "▁st", + "uffed" + ], + [ + "▁stuff", + "ed" + ], + [ + "▁h", + "ilar" + ], + [ + "▁hi", + "lar" + ], + [ + "pub", + "lished" + ], + [ + "publish", + "ed" + ], + [ + "▁st", + "ressed" + ], + [ + "▁str", + "essed" + ], + [ + "▁stress", + "ed" + ], + [ + "▁Pe", + "ak" + ], + [ + "▁lo", + "ader" + ], + [ + "▁load", + "er" + ], + [ + "▁", + "loader" + ], + [ + "Key", + "board" + ], + [ + "▁re", + "construction" + ], + [ + "▁v", + "od" + ], + [ + "▁vo", + "d" + ], + [ + "▁", + "vod" + ], + [ + "▁d", + "un" + ], + [ + "▁du", + "n" + ], + [ + "▁understand", + "s" + ], + [ + "te", + "nant" + ], + [ + "ten", + "ant" + ], + [ + "▁ch", + "aque" + ], + [ + "▁cha", + "que" + ], + [ + "▁pre", + "jud" + ], + [ + "ut", + "at" + ], + [ + "uta", + "t" + ], + [ + "u", + "tat" + ], + [ + "▁u", + "so" + ], + [ + "▁us", + "o" + ], + [ + "▁", + "uso" + ], + [ + "▁He", + "avy" + ], + [ + "▁cu", + "atro" + ], + [ + "▁side", + "walk" + ], + [ + "▁B", + "ug" + ], + [ + "▁Bu", + "g" + ], + [ + "▁mån", + "aden" + ], + [ + "ge", + "o" + ], + [ + "▁un", + "ited" + ], + [ + "▁unit", + "ed" + ], + [ + "▁F", + "iles" + ], + [ + "▁Fil", + "es" + ], + [ + "▁File", + "s" + ], + [ + "▁Fi", + "les" + ], + [ + "▁", + "Files" + ], + [ + "▁А", + "ль" + ], + [ + "▁Ал", + "ь" + ], + [ + "▁rug", + "by" + ], + [ + "▁fin", + "ancing" + ], + [ + "▁financ", + "ing" + ], + [ + "▁com", + "ply" + ], + [ + "▁comp", + "ly" + ], + [ + "▁compl", + "y" + ], + [ + "&", + "#" + ], + [ + "▁r", + "ushing" + ], + [ + "▁rush", + "ing" + ], + [ + "▁rus", + "hing" + ], + [ + "▁f", + "en" + 
], + [ + "▁fe", + "n" + ], + [ + "▁", + "fen" + ], + [ + "mon", + "g" + ], + [ + "mo", + "ng" + ], + [ + "m", + "ong" + ], + [ + "▁sp", + "é" + ], + [ + "▁present", + "ing" + ], + [ + "IN", + "CLUDING" + ], + [ + "ě", + "l" + ], + [ + "zeich", + "nung" + ], + [ + "Back", + "up" + ], + [ + "▁pe", + "tit" + ], + [ + "▁pet", + "it" + ], + [ + "▁all", + "erg" + ], + [ + "▁alle", + "rg" + ], + [ + "▁aller", + "g" + ], + [ + "ну", + "т" + ], + [ + "н", + "ут" + ], + [ + "▁wor", + "rying" + ], + [ + "▁worry", + "ing" + ], + [ + "▁m", + "amm" + ], + [ + "▁ma", + "mm" + ], + [ + "▁oper", + "and" + ], + [ + "▁opera", + "nd" + ], + [ + ":%.*", + "]]" + ], + [ + "▁real", + "ise" + ], + [ + "Comm", + "ands" + ], + [ + "Command", + "s" + ], + [ + "▁B", + "ew" + ], + [ + "▁Be", + "w" + ], + [ + "▁ass", + "umes" + ], + [ + "▁assum", + "es" + ], + [ + "▁assume", + "s" + ], + [ + "▁Co", + "vid" + ], + [ + "▁Cov", + "id" + ], + [ + "▁qu", + "and" + ], + [ + "ty", + "ard" + ], + [ + "t", + "yard" + ], + [ + "▁M", + "ono" + ], + [ + "▁Mon", + "o" + ], + [ + "▁Mo", + "no" + ], + [ + "lin", + "ked" + ], + [ + "link", + "ed" + ], + [ + "M", + "ARK" + ], + [ + "Es", + "p" + ], + [ + "E", + "sp" + ], + [ + "▁bless", + "ing" + ], + [ + "▁eyeb", + "rows" + ], + [ + "▁N", + "V" + ], + [ + "▁", + "NV" + ], + [ + "▁ст", + "ру" + ], + [ + "▁", + "стру" + ], + [ + "▁mod", + "eling" + ], + [ + "▁model", + "ing" + ], + [ + "▁mode", + "ling" + ], + [ + "▁gre", + "eted" + ], + [ + "Work", + "space" + ], + [ + "▁pe", + "dest" + ], + [ + "▁ped", + "est" + ], + [ + "▁не", + "за" + ], + [ + "lem", + "agne" + ], + [ + "Stat", + "istics" + ], + [ + "▁a", + "ument" + ], + [ + "▁au", + "ment" + ], + [ + "▁spe", + "eds" + ], + [ + "▁speed", + "s" + ], + [ + "▁synd", + "rome" + ], + [ + "CONNE", + "CT" + ], + [ + "za", + "hl" + ], + [ + "z", + "ahl" + ], + [ + "ver", + "so" + ], + [ + "vers", + "o" + ], + [ + "érc", + "ito" + ], + [ + "▁astr", + "onom" + ], + [ + "▁ap", + "rile" + ], + [ + "▁apr", + "ile" + ], + [ + "▁april", + "e" + ], + [ + "že", + "n" + ], + [ + "ž", + "en" + ], + [ + "ве", + "ро" + ], + [ + "вер", + "о" + ], + [ + "dr", + "aft" + ], + [ + "d", + "raft" + ], + [ + "▁g", + "ioc" + ], + [ + "▁gi", + "oc" + ], + [ + "▁com", + "port" + ], + [ + "▁comp", + "ort" + ], + [ + "▁var", + "iance" + ], + [ + "▁vari", + "ance" + ], + [ + "▁real", + "izing" + ], + [ + "▁realiz", + "ing" + ], + [ + "ED", + "IT" + ], + [ + "оло", + "ві" + ], + [ + "▁e", + "star" + ], + [ + "▁est", + "ar" + ], + [ + "▁es", + "tar" + ], + [ + "▁esta", + "r" + ], + [ + "▁s", + "ost" + ], + [ + "▁so", + "st" + ], + [ + "N", + "ORMAL" + ], + [ + "▁", + "ó" + ], + [ + "▁And", + "r" + ], + [ + "▁An", + "dr" + ], + [ + "ATTR", + "IB" + ], + [ + "▁re", + "de" + ], + [ + "▁r", + "ede" + ], + [ + "▁red", + "e" + ], + [ + "▁t", + "oes" + ], + [ + "▁to", + "es" + ], + [ + "▁toe", + "s" + ], + [ + "▁adv", + "ances" + ], + [ + "▁advance", + "s" + ], + [ + "▁Again", + "st" + ], + [ + "TO", + "M" + ], + [ + "T", + "OM" + ], + [ + "rs", + "s" + ], + [ + "r", + "ss" + ], + [ + "MM", + "MM" + ], + [ + "▁ne", + "west" + ], + [ + "▁new", + "est" + ], + [ + "▁V", + "ER" + ], + [ + "▁", + "VER" + ], + [ + "▁phrase", + "s" + ], + [ + "▁phr", + "ases" + ], + [ + "an", + "ter" + ], + [ + "ant", + "er" + ], + [ + "ante", + "r" + ], + [ + "La", + "unch" + ], + [ + "▁c", + "hr" + ], + [ + "▁ch", + "r" + ], + [ + "▁", + "chr" + ], + [ + "▁manufact", + "ured" + ], + [ + "$)", + "," + ], + [ + "$", + ")," + ], + [ + "roll", + "ment" + ], + [ + "es", + "ton" + ], + [ + "est", + 
"on" + ], + [ + "esto", + "n" + ], + [ + "e", + "ston" + ], + [ + "▁pe", + "int" + ], + [ + "”", + ")" + ], + [ + "en", + "det" + ], + [ + "end", + "et" + ], + [ + "ende", + "t" + ], + [ + "▁H", + "air" + ], + [ + "▁Ha", + "ir" + ], + [ + "ival", + "ent" + ], + [ + "▁up", + "right" + ], + [ + "gr", + "en" + ], + [ + "gre", + "n" + ], + [ + "g", + "ren" + ], + [ + "an", + "ked" + ], + [ + "ank", + "ed" + ], + [ + "wr", + "ight" + ], + [ + "w", + "right" + ], + [ + "▁m", + "ast" + ], + [ + "▁ma", + "st" + ], + [ + "▁mas", + "t" + ], + [ + "▁on", + "Change" + ], + [ + "▁de", + "bris" + ], + [ + "▁deb", + "ris" + ], + [ + "▁g", + "rap" + ], + [ + "▁gr", + "ap" + ], + [ + "▁gra", + "p" + ], + [ + "et", + "ry" + ], + [ + "etr", + "y" + ], + [ + "e", + "try" + ], + [ + "▁(", + "__" + ], + [ + "▁(_", + "_" + ], + [ + "▁", + "(__" + ], + [ + "▁Com", + "merce" + ], + [ + "BO", + "X" + ], + [ + "T", + "ax" + ], + [ + "▁о", + "три" + ], + [ + "▁от", + "ри" + ], + [ + "▁pre", + "vention" + ], + [ + "▁prevent", + "ion" + ], + [ + "▁prev", + "ention" + ], + [ + "▁Fe", + "el" + ], + [ + "▁ex", + "otic" + ], + [ + "▁B", + "ark" + ], + [ + "▁Bar", + "k" + ], + [ + "▁S", + "team" + ], + [ + "▁Ste", + "am" + ], + [ + "fo", + "n" + ], + [ + "f", + "on" + ], + [ + "ol", + "in" + ], + [ + "oli", + "n" + ], + [ + "o", + "lin" + ], + [ + "▁elim", + "inated" + ], + [ + "▁eliminate", + "d" + ], + [ + "▁b", + "c" + ], + [ + "▁", + "bc" + ], + [ + "▁C", + "ycl" + ], + [ + "▁Cy", + "cl" + ], + [ + "▁$", + "(\"#" + ], + [ + "▁", + "$(\"#" + ], + [ + "▁P", + "arl" + ], + [ + "▁Par", + "l" + ], + [ + "▁Pa", + "rl" + ], + [ + "man", + "uel" + ], + [ + "os", + "pher" + ], + [ + "osp", + "her" + ], + [ + "osph", + "er" + ], + [ + "W", + "F" + ], + [ + "An", + "aly" + ], + [ + "Anal", + "y" + ], + [ + "▁nav", + "ig" + ], + [ + "▁re", + "nown" + ], + [ + "▁ren", + "own" + ], + [ + "R", + "x" + ], + [ + "▁W", + "alt" + ], + [ + "▁Wal", + "t" + ], + [ + "▁Wa", + "lt" + ], + [ + "uf", + "fed" + ], + [ + "uff", + "ed" + ], + [ + "▁f", + "oster" + ], + [ + "▁fo", + "ster" + ], + [ + "▁fost", + "er" + ], + [ + "▁fos", + "ter" + ], + [ + "$", + ":" + ], + [ + "sh", + "ore" + ], + [ + "Conne", + "ctor" + ], + [ + "Conn", + "ector" + ], + [ + "Connect", + "or" + ], + [ + "фи", + "ка" + ], + [ + "▁real", + "ization" + ], + [ + "▁realiz", + "ation" + ], + [ + "L", + "i" + ], + [ + "ct", + "xt" + ], + [ + "ctx", + "t" + ], + [ + "c", + "txt" + ], + [ + "ah", + "oo" + ], + [ + "aho", + "o" + ], + [ + "▁mir", + "acle" + ], + [ + "▁E", + "T" + ], + [ + "▁", + "ET" + ], + [ + "▁G", + "PS" + ], + [ + "▁GP", + "S" + ], + [ + "▁Observ", + "able" + ], + [ + "▁h", + "f" + ], + [ + "▁", + "hf" + ], + [ + "▁magnific", + "ent" + ], + [ + "не", + "го" + ], + [ + "BI", + "N" + ], + [ + "B", + "IN" + ], + [ + "▁D", + "orf" + ], + [ + "▁Do", + "rf" + ], + [ + "▁Dor", + "f" + ], + [ + "ie", + "ck" + ], + [ + "ve", + "e" + ], + [ + "v", + "ee" + ], + [ + "▁C", + "raw" + ], + [ + "▁Cr", + "aw" + ], + [ + "▁Cra", + "w" + ], + [ + "/", + "#" + ], + [ + "▁p", + "ci" + ], + [ + "▁pc", + "i" + ], + [ + "▁", + "pci" + ], + [ + "ip", + "pet" + ], + [ + "ipp", + "et" + ], + [ + "▁Hill", + "ary" + ], + [ + "▁g", + "ir" + ], + [ + "▁gi", + "r" + ], + [ + "▁r", + "and" + ], + [ + "▁ran", + "d" + ], + [ + "▁ra", + "nd" + ], + [ + "▁", + "rand" + ], + [ + "▁la", + "ying" + ], + [ + "▁lay", + "ing" + ], + [ + "▁D", + "ifferent" + ], + [ + "bo", + "ys" + ], + [ + "boy", + "s" + ], + [ + "vi", + "rt" + ], + [ + "vir", + "t" + ], + [ + "v", + "irt" + ], + [ + "▁enc", + 
"ryption" + ], + [ + "ás", + "z" + ], + [ + "á", + "sz" + ], + [ + "по", + "р" + ], + [ + "п", + "ор" + ], + [ + "▁sm", + "elled" + ], + [ + "▁smell", + "ed" + ], + [ + "▁sus", + "cept" + ], + [ + "cl", + "uded" + ], + [ + "clude", + "d" + ], + [ + "▁C", + "arn" + ], + [ + "▁Car", + "n" + ], + [ + "▁Ca", + "rn" + ], + [ + "ig", + "ten" + ], + [ + "igt", + "en" + ], + [ + "igte", + "n" + ], + [ + "▁Ch", + "uck" + ], + [ + "▁Prov", + "inc" + ], + [ + "▁per", + "í" + ], + [ + "▁Mar", + "shal" + ], + [ + "▁Mars", + "hal" + ], + [ + "▁", + "Marshal" + ], + [ + "мо", + "ж" + ], + [ + "g", + "fx" + ], + [ + "os", + "hi" + ], + [ + "osh", + "i" + ], + [ + "▁W", + "HE" + ], + [ + "▁WH", + "E" + ], + [ + "▁relax", + "ation" + ], + [ + ",", + "." + ], + [ + "we", + "re" + ], + [ + "wer", + "e" + ], + [ + "w", + "ere" + ], + [ + "▁var", + "ieties" + ], + [ + "▁W", + "on" + ], + [ + "▁Wo", + "n" + ], + [ + "▁g", + "aps" + ], + [ + "▁gap", + "s" + ], + [ + "▁ga", + "ps" + ], + [ + "▁st", + "ole" + ], + [ + "▁sto", + "le" + ], + [ + "ig", + "ua" + ], + [ + "igu", + "a" + ], + [ + "ющи", + "е" + ], + [ + "▁Ham", + "pshire" + ], + [ + "ph", + "rase" + ], + [ + "▁pel", + "ícula" + ], + [ + "Process", + "ing" + ], + [ + "▁initial", + "ization" + ], + [ + "oust", + "ic" + ], + [ + "▁Jose", + "f" + ], + [ + "▁Jos", + "ef" + ], + [ + "ic", + "ating" + ], + [ + "ica", + "ting" + ], + [ + "▁good", + "ness" + ], + [ + "TE", + "S" + ], + [ + "T", + "ES" + ], + [ + "▁c", + "ope" + ], + [ + "▁co", + "pe" + ], + [ + "▁cop", + "e" + ], + [ + "▁", + "cope" + ], + [ + "▁ignor", + "ance" + ], + [ + "▁B", + "rist" + ], + [ + "▁Br", + "ist" + ], + [ + "▁par", + "as" + ], + [ + "▁para", + "s" + ], + [ + "▁pa", + "ras" + ], + [ + "▁accident", + "ally" + ], + [ + "▁t", + "and" + ], + [ + "▁tan", + "d" + ], + [ + "▁ta", + "nd" + ], + [ + "it", + "test" + ], + [ + "itt", + "est" + ], + [ + "itte", + "st" + ], + [ + "▁у", + "ли" + ], + [ + "▁sh", + "ipped" + ], + [ + "▁ship", + "ped" + ], + [ + "▁о", + "ст" + ], + [ + "▁ос", + "т" + ], + [ + "else", + "if" + ], + [ + "▁u", + "size" + ], + [ + "▁us", + "ize" + ], + [ + "hor", + "izontal" + ], + [ + "▁C", + "arr" + ], + [ + "▁Car", + "r" + ], + [ + "▁Ca", + "rr" + ], + [ + "▁pre", + "cip" + ], + [ + "▁prec", + "ip" + ], + [ + "ro", + "z" + ], + [ + "r", + "oz" + ], + [ + "path", + "etic" + ], + [ + "pat", + "hetic" + ], + [ + "ri", + "ved" + ], + [ + "riv", + "ed" + ], + [ + "rive", + "d" + ], + [ + "r", + "ived" + ], + [ + "ro", + "k" + ], + [ + "r", + "ok" + ], + [ + "▁dig", + "ging" + ], + [ + "мо", + "м" + ], + [ + "▁M", + "ull" + ], + [ + "▁Mu", + "ll" + ], + [ + "▁Mul", + "l" + ], + [ + "▁X", + "III" + ], + [ + "▁XII", + "I" + ], + [ + "▁XI", + "II" + ], + [ + "▁pe", + "as" + ], + [ + "▁f", + "oul" + ], + [ + "▁fo", + "ul" + ], + [ + "▁fou", + "l" + ], + [ + "▁travel", + "s" + ], + [ + "▁trav", + "els" + ], + [ + "▁N", + "g" + ], + [ + "▁состав", + "е" + ], + [ + "▁соста", + "ве" + ], + [ + "Mon", + "t" + ], + [ + "Mo", + "nt" + ], + [ + "M", + "ont" + ], + [ + "ar", + "de" + ], + [ + "ard", + "e" + ], + [ + "▁Ste", + "fan" + ], + [ + "^^", + "^^" + ], + [ + "▁K", + "iss" + ], + [ + "▁Ki", + "ss" + ], + [ + "▁E", + "k" + ], + [ + "▁ok", + "tober" + ], + [ + "▁mem", + "orable" + ], + [ + "▁memor", + "able" + ], + [ + "')", + ")." + ], + [ + "'))", + "." + ], + [ + "'", + "))." 
+ ], + [ + "▁V", + "ision" + ], + [ + "▁Vis", + "ion" + ], + [ + "▁N", + "ina" + ], + [ + "▁Ni", + "na" + ], + [ + "▁Nin", + "a" + ], + [ + "▁S", + "olar" + ], + [ + "▁So", + "lar" + ], + [ + "▁Sol", + "ar" + ], + [ + "▁highlight", + "ed" + ], + [ + "▁me", + "mo" + ], + [ + "▁mem", + "o" + ], + [ + "me", + "isterschaft" + ], + [ + "side", + "bar" + ], + [ + "SE", + "E" + ], + [ + "S", + "EE" + ], + [ + "▁Nev", + "ada" + ], + [ + "D", + "a" + ], + [ + "▁draw", + "er" + ], + [ + "ast", + "ically" + ], + [ + "astic", + "ally" + ], + [ + "el", + "de" + ], + [ + "eld", + "e" + ], + [ + "sc", + "ribed" + ], + [ + "scri", + "bed" + ], + [ + "scribe", + "d" + ], + [ + "scrib", + "ed" + ], + [ + "▁pri", + "ests" + ], + [ + "▁priest", + "s" + ], + [ + "▁hom", + "mes" + ], + [ + "▁homme", + "s" + ], + [ + "▁in", + "structor" + ], + [ + "▁instruct", + "or" + ], + [ + "кла", + "д" + ], + [ + "▁sp", + "ett" + ], + [ + "▁spe", + "tt" + ], + [ + "\\", + "-" + ], + [ + "▁ми", + "ра" + ], + [ + "▁", + "мира" + ], + [ + "▁Look", + "s" + ], + [ + "▁Lo", + "oks" + ], + [ + "▁sle", + "eve" + ], + [ + "▁strong", + "est" + ], + [ + "▁t", + "ête" + ], + [ + "▁Nic", + "ole" + ], + [ + "▁Ni", + "cole" + ], + [ + "▁Nicol", + "e" + ], + [ + "im", + "per" + ], + [ + "imp", + "er" + ], + [ + "на", + "ча" + ], + [ + "ip", + "per" + ], + [ + "ipp", + "er" + ], + [ + "▁in", + "won" + ], + [ + "il", + "ers" + ], + [ + "ile", + "rs" + ], + [ + "iler", + "s" + ], + [ + "i", + "lers" + ], + [ + "▁Dep", + "uty" + ], + [ + "og", + "e" + ], + [ + "o", + "ge" + ], + [ + "▁de", + "pressed" + ], + [ + "▁dep", + "ressed" + ], + [ + "▁depress", + "ed" + ], + [ + "▁ar", + "te" + ], + [ + "▁art", + "e" + ], + [ + "▁", + "arte" + ], + [ + "▁comb", + "ining" + ], + [ + "LA", + "ST" + ], + [ + "L", + "AST" + ], + [ + "in", + "ted" + ], + [ + "int", + "ed" + ], + [ + "inte", + "d" + ], + [ + "▁A", + "verage" + ], + [ + "▁Ave", + "rage" + ], + [ + "▁poll", + "ution" + ], + [ + "▁Phill", + "ips" + ], + [ + "▁W", + "M" + ], + [ + "▁", + "WM" + ], + [ + "}}", + "}\\" + ], + [ + "}}}", + "\\" + ], + [ + "}", + "}}\\" + ], + [ + "Add", + "ed" + ], + [ + "Ad", + "ded" + ], + [ + "▁per", + "ipher" + ], + [ + "Creat", + "ion" + ], + [ + "C", + "reation" + ], + [ + "▁ital", + "ien" + ], + [ + "▁Ch", + "oice" + ], + [ + "▁Cho", + "ice" + ], + [ + "▁", + "Choice" + ], + [ + "▁EX", + "PRESS" + ], + [ + "▁St", + "ruct" + ], + [ + "▁Str", + "uct" + ], + [ + "▁", + "Struct" + ], + [ + "ys", + "z" + ], + [ + "y", + "sz" + ], + [ + "Res", + "ize" + ], + [ + "Re", + "size" + ], + [ + "AR", + "GS" + ], + [ + "ARG", + "S" + ], + [ + "▁re", + "po" + ], + [ + "▁rep", + "o" + ], + [ + "▁", + "repo" + ], + [ + "▁что", + "бы" + ], + [ + "▁p", + "ref" + ], + [ + "▁pre", + "f" + ], + [ + "▁pr", + "ef" + ], + [ + "▁", + "pref" + ], + [ + "▁earth", + "qu" + ], + [ + "▁Ме", + "кси" + ], + [ + "▁F", + "inale" + ], + [ + "▁Fin", + "ale" + ], + [ + "▁Final", + "e" + ], + [ + "▁h", + "echo" + ], + [ + "▁he", + "cho" + ], + [ + "requ", + "ests" + ], + [ + "request", + "s" + ], + [ + "C", + "ut" + ], + [ + "▁des", + "erved" + ], + [ + "▁deserve", + "d" + ], + [ + "го", + "во" + ], + [ + "гов", + "о" + ], + [ + "▁Re", + "cent" + ], + [ + "▁Rec", + "ent" + ], + [ + "▁ди", + "визи" + ], + [ + "▁support", + "ive" + ], + [ + "пра", + "ви" + ], + [ + "прав", + "и" + ], + [ + "▁irre", + "levant" + ], + [ + "'", + "\r" + ], + [ + "▁c", + "trl" + ], + [ + "▁", + "ctrl" + ], + [ + "▁De", + "al" + ], + [ + "iz", + "ada" + ], + [ + "iza", + "da" + ], + [ + "u", + "o" + ], + [ + "▁n", + 
"ort" + ], + [ + "▁no", + "rt" + ], + [ + "▁nor", + "t" + ], + [ + "ge", + "ometry" + ], + [ + "geo", + "metry" + ], + [ + "▁Individ", + "ual" + ], + [ + "er", + "eg" + ], + [ + "ere", + "g" + ], + [ + "e", + "reg" + ], + [ + "▁при", + "ня" + ], + [ + "cre", + "f" + ], + [ + "cr", + "ef" + ], + [ + "c", + "ref" + ], + [ + "═", + "═" + ], + [ + "▁com", + "erc" + ], + [ + "▁come", + "rc" + ], + [ + "=", + "_" + ], + [ + "bu", + "nd" + ], + [ + "b", + "und" + ], + [ + "та", + "х" + ], + [ + "il", + "en" + ], + [ + "ile", + "n" + ], + [ + "i", + "len" + ], + [ + "чи", + "та" + ], + [ + "▁corpor", + "ation" + ], + [ + "es", + "z" + ], + [ + "e", + "sz" + ], + [ + "▁=", + "=>" + ], + [ + "▁==", + ">" + ], + [ + "ab", + "lish" + ], + [ + "abl", + "ish" + ], + [ + "Ap", + "r" + ], + [ + "A", + "pr" + ], + [ + "▁r", + "ipped" + ], + [ + "▁ri", + "pped" + ], + [ + "▁rip", + "ped" + ], + [ + "Var", + "s" + ], + [ + "V", + "ars" + ], + [ + "st", + "ret" + ], + [ + "str", + "et" + ], + [ + "stre", + "t" + ], + [ + "▁Frances", + "co" + ], + [ + "Na", + "N" + ], + [ + "▁any", + "time" + ], + [ + "▁autom", + "ated" + ], + [ + "ost", + "ream" + ], + [ + "o", + "stream" + ], + [ + "▁draw", + "ings" + ], + [ + "▁drawing", + "s" + ], + [ + "▁enhance", + "ment" + ], + [ + "ok", + "rat" + ], + [ + "▁Iss", + "ue" + ], + [ + "в", + "ра" + ], + [ + "Cur", + "rency" + ], + [ + "▁w", + "yn" + ], + [ + "▁wy", + "n" + ], + [ + "izar", + "re" + ], + [ + "ét", + "ico" + ], + [ + "mult", + "iple" + ], + [ + "multi", + "ple" + ], + [ + "multip", + "le" + ], + [ + "▁R", + "ate" + ], + [ + "▁Ra", + "te" + ], + [ + "▁Rat", + "e" + ], + [ + "▁", + "Rate" + ], + [ + "▁I", + "ch" + ], + [ + "▁A", + "uss" + ], + [ + "▁Aus", + "s" + ], + [ + "▁Au", + "ss" + ], + [ + "▁For", + "mer" + ], + [ + "▁Form", + "er" + ], + [ + "Cur", + "ve" + ], + [ + "▁mar", + "vel" + ], + [ + "att", + "ro" + ], + [ + "attr", + "o" + ], + [ + "▁с", + "п" + ], + [ + "BO", + "OL" + ], + [ + "си", + "я" + ], + [ + "go", + "ld" + ], + [ + "g", + "old" + ], + [ + "▁N", + "intendo" + ], + [ + "▁Salv", + "ador" + ], + [ + "▁S", + "olution" + ], + [ + "▁Sol", + "ution" + ], + [ + "AD", + "C" + ], + [ + "A", + "DC" + ], + [ + "бо", + "ра" + ], + [ + "бор", + "а" + ], + [ + "▁Ben", + "nett" + ], + [ + "▁F", + "R" + ], + [ + "▁", + "FR" + ], + [ + "▁pu", + "eden" + ], + [ + "▁pued", + "en" + ], + [ + "▁puede", + "n" + ], + [ + "pat", + "ient" + ], + [ + "▁P", + "G" + ], + [ + "▁", + "PG" + ], + [ + "▁J", + "in" + ], + [ + "▁Ji", + "n" + ], + [ + "▁cr", + "ashed" + ], + [ + "▁crash", + "ed" + ], + [ + "▁d", + "enen" + ], + [ + "▁de", + "nen" + ], + [ + "▁den", + "en" + ], + [ + "▁S", + "ample" + ], + [ + "▁Sam", + "ple" + ], + [ + "▁", + "Sample" + ], + [ + "▁Que", + "bec" + ], + [ + "it", + "ories" + ], + [ + "itor", + "ies" + ], + [ + "ito", + "ries" + ], + [ + "itori", + "es" + ], + [ + "▁b", + "linked" + ], + [ + "▁blink", + "ed" + ], + [ + "▁l", + "ion" + ], + [ + "▁li", + "on" + ], + [ + "▁vo", + "ce" + ], + [ + "▁voc", + "e" + ], + [ + "▁Imp", + "act" + ], + [ + "▁M", + "au" + ], + [ + "▁Ma", + "u" + ], + [ + "▁N", + "ie" + ], + [ + "▁Ni", + "e" + ], + [ + "▁l", + "ob" + ], + [ + "▁lo", + "b" + ], + [ + "▁д", + "ве" + ], + [ + "or", + "neys" + ], + [ + "orney", + "s" + ], + [ + "orne", + "ys" + ], + [ + "▁coast", + "al" + ], + [ + "▁s", + "ensors" + ], + [ + "▁sens", + "ors" + ], + [ + "▁sensor", + "s" + ], + [ + "▁X", + "II" + ], + [ + "▁XI", + "I" + ], + [ + "▁ill", + "usion" + ], + [ + "oj", + "i" + ], + [ + "o", + "ji" + ], + [ + "▁I", + "NC" + ], + [ + 
"▁IN", + "C" + ], + [ + "▁Dun", + "can" + ], + [ + "y", + "k" + ], + [ + "▁affect", + "ing" + ], + [ + "pu", + "l" + ], + [ + "p", + "ul" + ], + [ + "▁Napole", + "on" + ], + [ + "▁а", + "каде" + ], + [ + "▁com", + "pt" + ], + [ + "▁comp", + "t" + ], + [ + "▁prof", + "itable" + ], + [ + "▁profit", + "able" + ], + [ + "lo", + "e" + ], + [ + "l", + "oe" + ], + [ + "▁deux", + "ième" + ], + [ + "▁W", + "C" + ], + [ + "▁", + "WC" + ], + [ + "▁v", + "iable" + ], + [ + "▁vi", + "able" + ], + [ + "▁via", + "ble" + ], + [ + "▁D", + "rug" + ], + [ + "▁Dr", + "ug" + ], + [ + "▁Dru", + "g" + ], + [ + "Text", + "Box" + ], + [ + "▁lum", + "inos" + ], + [ + "au", + "té" + ], + [ + "aut", + "é" + ], + [ + "y", + "c" + ], + [ + "št", + "ě" + ], + [ + "▁affili", + "ates" + ], + [ + "▁affiliate", + "s" + ], + [ + "il", + "da" + ], + [ + "ild", + "a" + ], + [ + "con", + "duct" + ], + [ + "cond", + "uct" + ], + [ + "▁e", + "benfalls" + ], + [ + "▁A", + "MD" + ], + [ + "▁AM", + "D" + ], + [ + "▁Mon", + "itor" + ], + [ + "▁", + "Monitor" + ], + [ + "▁Compan", + "ies" + ], + [ + "▁correct", + "ed" + ], + [ + "▁corre", + "cted" + ], + [ + "ä", + "ck" + ], + [ + "SY", + "STEM" + ], + [ + "other", + "apy" + ], + [ + "▁п", + "еред" + ], + [ + "▁пере", + "д" + ], + [ + "▁пе", + "ред" + ], + [ + "▁bl", + "ues" + ], + [ + "▁blue", + "s" + ], + [ + "at", + "isf" + ], + [ + "ati", + "sf" + ], + [ + "atis", + "f" + ], + [ + "al", + "though" + ], + [ + "alth", + "ough" + ], + [ + "ro", + "st" + ], + [ + "ros", + "t" + ], + [ + "r", + "ost" + ], + [ + "SC", + "AN" + ], + [ + "S", + "CAN" + ], + [ + "▁R", + "AM" + ], + [ + "ці", + "ональ" + ], + [ + "▁vend", + "ors" + ], + [ + "▁vendor", + "s" + ], + [ + "▁custom", + "s" + ], + [ + "▁cust", + "oms" + ], + [ + "▁activ", + "ate" + ], + [ + "▁", + "activate" + ], + [ + "▁b", + "logs" + ], + [ + "▁bl", + "ogs" + ], + [ + "▁blo", + "gs" + ], + [ + "▁blog", + "s" + ], + [ + "▁br", + "ace" + ], + [ + "▁bra", + "ce" + ], + [ + "▁", + "brace" + ], + [ + "▁st", + "rat" + ], + [ + "▁str", + "at" + ], + [ + "▁stra", + "t" + ], + [ + "an", + "je" + ], + [ + "anj", + "e" + ], + [ + "щ", + "ё" + ], + [ + "▁t", + "ide" + ], + [ + "▁tid", + "e" + ], + [ + "▁ti", + "de" + ], + [ + "▁Brig", + "ade" + ], + [ + "get", + "Operand" + ], + [ + "▁al", + "iment" + ], + [ + "▁ali", + "ment" + ], + [ + "▁achieve", + "ments" + ], + [ + "▁achievement", + "s" + ], + [ + "▁suspic", + "ion" + ], + [ + "▁susp", + "icion" + ], + [ + "▁touch", + "down" + ], + [ + "br", + "oad" + ], + [ + "bro", + "ad" + ], + [ + "b", + "road" + ], + [ + "io", + "re" + ], + [ + "ior", + "e" + ], + [ + "i", + "ore" + ], + [ + "Compar", + "ison" + ], + [ + "▁m", + "um" + ], + [ + "▁mu", + "m" + ], + [ + "Eng", + "lish" + ], + [ + "▁P", + "icture" + ], + [ + "▁Pict", + "ure" + ], + [ + "▁M", + "ouse" + ], + [ + "▁Mo", + "use" + ], + [ + "▁", + "Mouse" + ], + [ + "am", + "d" + ], + [ + "a", + "md" + ], + [ + "▁[", + "`" + ], + [ + "▁den", + "omin" + ], + [ + "▁denom", + "in" + ], + [ + "▁Al", + "eks" + ], + [ + "▁Ale", + "ks" + ], + [ + "▁pr", + "events" + ], + [ + "▁prevent", + "s" + ], + [ + "▁prev", + "ents" + ], + [ + "ó", + "b" + ], + [ + "fe", + "d" + ], + [ + "f", + "ed" + ], + [ + "▁P", + "ray" + ], + [ + "▁Pr", + "ay" + ], + [ + "▁Pra", + "y" + ], + [ + "▁sh", + "ine" + ], + [ + "▁", + "shine" + ], + [ + "▁cl", + "utch" + ], + [ + "mu", + "x" + ], + [ + "m", + "ux" + ], + [ + "App", + "ro" + ], + [ + "Ap", + "pro" + ], + [ + "▁not", + "ably" + ], + [ + "ch", + "io" + ], + [ + "chi", + "o" + ], + [ + "na", + "ge" + ], + [ + 
"n", + "age" + ], + [ + "HA", + "S" + ], + [ + "H", + "AS" + ], + [ + "▁'", + ")" + ], + [ + "▁", + "')" + ], + [ + "▁M", + "iche" + ], + [ + "▁Mich", + "e" + ], + [ + "▁Mic", + "he" + ], + [ + "▁Mi", + "che" + ], + [ + "t", + "g" + ], + [ + "::", + "~" + ], + [ + "▁am", + "ely" + ], + [ + "▁ro", + "dz" + ], + [ + "▁rod", + "z" + ], + [ + "z", + "s" + ], + [ + "tr", + "ait" + ], + [ + "tra", + "it" + ], + [ + "t", + "rait" + ], + [ + "▁k", + "lass" + ], + [ + "▁kl", + "ass" + ], + [ + "▁", + "klass" + ], + [ + "f", + "ö" + ], + [ + "▁dest", + "ac" + ], + [ + "▁Cl", + "ara" + ], + [ + "▁Clar", + "a" + ], + [ + "f", + "requency" + ], + [ + "▁G", + "it" + ], + [ + "▁Gi", + "t" + ], + [ + "▁по", + "ль" + ], + [ + "▁пол", + "ь" + ], + [ + "▁frequ", + "encies" + ], + [ + "▁febr", + "ero" + ], + [ + "▁st", + "umbled" + ], + [ + "ко", + "ю" + ], + [ + "▁N", + "ames" + ], + [ + "▁Name", + "s" + ], + [ + "▁Na", + "mes" + ], + [ + "▁Nam", + "es" + ], + [ + "▁", + "Names" + ], + [ + "▁F", + "light" + ], + [ + "▁Fl", + "ight" + ], + [ + "▁p", + "rey" + ], + [ + "▁pre", + "y" + ], + [ + "▁pr", + "ey" + ], + [ + "▁med", + "io" + ], + [ + "▁medi", + "o" + ], + [ + "▁V", + "AR" + ], + [ + "▁VA", + "R" + ], + [ + "▁", + "VAR" + ], + [ + "▁F", + "loat" + ], + [ + "▁Flo", + "at" + ], + [ + "▁", + "Float" + ], + [ + "▁Ern", + "est" + ], + [ + "▁Marc", + "atori" + ], + [ + "op", + "ort" + ], + [ + "o", + "port" + ], + [ + "▁cancel", + "lation" + ], + [ + "▁cancell", + "ation" + ], + [ + "▁Br", + "yan" + ], + [ + "▁Bry", + "an" + ], + [ + "——", + "——" + ], + [ + "Lu", + "c" + ], + [ + "L", + "uc" + ], + [ + "▁li", + "bre" + ], + [ + "▁lib", + "re" + ], + [ + "▁t", + "ítulo" + ], + [ + "*", + ">" + ], + [ + "▁S", + "andy" + ], + [ + "▁San", + "dy" + ], + [ + "▁Sand", + "y" + ], + [ + "▁Mar", + "ina" + ], + [ + "Be", + "en" + ], + [ + "B", + "een" + ], + [ + "▁w", + "al" + ], + [ + "▁wa", + "l" + ], + [ + "▁", + "wal" + ], + [ + "▁K", + "ultur" + ], + [ + "▁expl", + "ode" + ], + [ + "▁explo", + "de" + ], + [ + "▁lim", + "iting" + ], + [ + "▁limit", + "ing" + ], + [ + "▁presum", + "ably" + ], + [ + "▁p", + "b" + ], + [ + "▁", + "pb" + ], + [ + "▁M", + "erc" + ], + [ + "▁Me", + "rc" + ], + [ + "▁Mer", + "c" + ], + [ + "▁ре", + "ки" + ], + [ + "le", + "arning" + ], + [ + "lear", + "ning" + ], + [ + "learn", + "ing" + ], + [ + "C", + "atalog" + ], + [ + "▁C", + "ensus" + ], + [ + "lt", + "e" + ], + [ + "l", + "te" + ], + [ + "▁N", + "ET" + ], + [ + "▁NE", + "T" + ], + [ + "▁", + "NET" + ], + [ + "ra", + "ising" + ], + [ + "rais", + "ing" + ], + [ + "rai", + "sing" + ], + [ + "сь", + "ке" + ], + [ + "st", + "aff" + ], + [ + "sta", + "ff" + ], + [ + "▁Qu", + "inn" + ], + [ + "▁mem", + "orial" + ], + [ + "▁memor", + "ial" + ], + [ + "▁memo", + "rial" + ], + [ + "п", + "ня" + ], + [ + "▁cu", + "enta" + ], + [ + "▁X", + "I" + ], + [ + "lb", + "l" + ], + [ + "l", + "bl" + ], + [ + "▁v", + "aries" + ], + [ + "▁var", + "ies" + ], + [ + "▁vari", + "es" + ], + [ + "▁va", + "ries" + ], + [ + "▁fluct", + "uations" + ], + [ + "▁дол", + "ж" + ], + [ + "▁осо", + "би" + ], + [ + "▁ware", + "house" + ], + [ + "How", + "ever" + ], + [ + "▁correct", + "ions" + ], + [ + "▁corre", + "ctions" + ], + [ + "▁correction", + "s" + ], + [ + "dh", + "d" + ], + [ + "d", + "hd" + ], + [ + "▁f", + "als" + ], + [ + "▁fa", + "ls" + ], + [ + "▁fal", + "s" + ], + [ + "▁controvers", + "y" + ], + [ + "▁cur", + "se" + ], + [ + "▁t", + "élé" + ], + [ + "▁té", + "lé" + ], + [ + "ře", + "d" + ], + [ + "ř", + "ed" + ], + [ + "▁A", + "U" + ], + [ + "▁", + "AU" 
+ ], + [ + "▁т", + "ор" + ], + [ + "▁то", + "р" + ], + [ + "▁", + "тор" + ], + [ + "▁cr", + "ít" + ], + [ + "id", + "an" + ], + [ + "ida", + "n" + ], + [ + "i", + "dan" + ], + [ + "ili", + "ary" + ], + [ + "iliar", + "y" + ], + [ + "ilia", + "ry" + ], + [ + "▁P", + "anel" + ], + [ + "▁Pan", + "el" + ], + [ + "▁Pa", + "nel" + ], + [ + "▁", + "Panel" + ], + [ + "cul", + "e" + ], + [ + "cu", + "le" + ], + [ + "c", + "ule" + ], + [ + "▁P", + "oor" + ], + [ + "▁Po", + "or" + ], + [ + "▁B", + "A" + ], + [ + "▁", + "BA" + ], + [ + "▁ignor", + "ant" + ], + [ + "ème", + "s" + ], + [ + "è", + "mes" + ], + [ + "▁aest", + "hetic" + ], + [ + "Link", + "ed" + ], + [ + "Lin", + "ked" + ], + [ + "get", + "Int" + ], + [ + "Un", + "icode" + ], + [ + "[", + "@" + ], + [ + "▁Z", + "ent" + ], + [ + "▁Ze", + "nt" + ], + [ + "▁Zen", + "t" + ], + [ + "Man", + "ifest" + ], + [ + "▁v", + "ars" + ], + [ + "▁var", + "s" + ], + [ + "▁va", + "rs" + ], + [ + "▁", + "vars" + ], + [ + "P", + "B" + ], + [ + "▁в", + "у" + ], + [ + "▁", + "ву" + ], + [ + "▁De", + "scribe" + ], + [ + "▁Desc", + "ribe" + ], + [ + "▁", + "Describe" + ], + [ + "▁Any", + "thing" + ], + [ + "oi", + "rs" + ], + [ + "oir", + "s" + ], + [ + "o", + "irs" + ], + [ + "▁s", + "ocks" + ], + [ + "▁so", + "cks" + ], + [ + "▁soc", + "ks" + ], + [ + "▁sock", + "s" + ], + [ + "▁im", + "ped" + ], + [ + "▁imp", + "ed" + ], + [ + "▁ne", + "ue" + ], + [ + "▁neu", + "e" + ], + [ + "▁dis", + "pers" + ], + [ + "▁disp", + "ers" + ], + [ + "Col", + "lect" + ], + [ + "Coll", + "ect" + ], + [ + "file", + "r" + ], + [ + "fil", + "er" + ], + [ + "fi", + "ler" + ], + [ + "f", + "iler" + ], + [ + "▁Fr", + "au" + ], + [ + "▁Fra", + "u" + ], + [ + "▁H", + "ockey" + ], + [ + "▁te", + "ens" + ], + [ + "▁teen", + "s" + ], + [ + "▁Rober", + "to" + ], + [ + "▁Robert", + "o" + ], + [ + "la", + "uf" + ], + [ + "l", + "auf" + ], + [ + "ва", + "ть" + ], + [ + "ват", + "ь" + ], + [ + "▁с", + "ко" + ], + [ + "▁", + "ско" + ], + [ + "is", + "Array" + ], + [ + "▁teen", + "ager" + ], + [ + "Bu", + "ilt" + ], + [ + "▁loud", + "ly" + ], + [ + "Cap", + "acity" + ], + [ + "▁advent", + "ures" + ], + [ + "▁adventure", + "s" + ], + [ + "▁M", + "olly" + ], + [ + "▁Mol", + "ly" + ], + [ + "rec", + "ogn" + ], + [ + "bar", + "s" + ], + [ + "ba", + "rs" + ], + [ + "b", + "ars" + ], + [ + "▁L", + "or" + ], + [ + "▁Lo", + "r" + ], + [ + "▁pu", + "ò" + ], + [ + "▁m", + "ong" + ], + [ + "▁mon", + "g" + ], + [ + "▁mo", + "ng" + ], + [ + "▁", + "mong" + ], + [ + "in", + "ement" + ], + [ + "ine", + "ment" + ], + [ + "i", + "nement" + ], + [ + "Ass", + "ignment" + ], + [ + "Assign", + "ment" + ], + [ + "▁d", + "iz" + ], + [ + "▁di", + "z" + ], + [ + "less", + "ness" + ], + [ + "▁H", + "alloween" + ], + [ + "▁bit", + "map" + ], + [ + "▁", + "bitmap" + ], + [ + "Ro", + "m" + ], + [ + "R", + "om" + ], + [ + "на", + "р" + ], + [ + "н", + "ар" + ], + [ + "▁re", + "bel" + ], + [ + "▁reb", + "el" + ], + [ + "▁rad", + "ial" + ], + [ + "▁radi", + "al" + ], + [ + "me", + "asure" + ], + [ + "ni", + "t" + ], + [ + "n", + "it" + ], + [ + "▁Ass", + "ume" + ], + [ + "▁assign", + "ments" + ], + [ + "▁assignment", + "s" + ], + [ + "▁I", + "sn" + ], + [ + "▁Is", + "n" + ], + [ + "▁al", + "tre" + ], + [ + "▁alt", + "re" + ], + [ + "ße", + "r" + ], + [ + "ß", + "er" + ], + [ + "на", + "ль" + ], + [ + "нал", + "ь" + ], + [ + "н", + "аль" + ], + [ + "▁fl", + "ies" + ], + [ + "▁d", + "roit" + ], + [ + "▁dro", + "it" + ], + [ + "▁thick", + "ness" + ], + [ + "▁en", + "jo" + ], + [ + "▁d", + "well" + ], + [ + "▁dw", + "ell" + ], + [ + 
"▁hom", + "osexual" + ], + [ + "▁e", + "val" + ], + [ + "▁ev", + "al" + ], + [ + "▁", + "eval" + ], + [ + "$_", + "{" + ], + [ + "$", + "_{" + ], + [ + "as", + "ia" + ], + [ + "asi", + "a" + ], + [ + "▁phil", + "os" + ], + [ + "get", + "Current" + ], + [ + "▁veter", + "ans" + ], + [ + "▁veteran", + "s" + ], + [ + "▁Ber", + "keley" + ], + [ + "▁wild", + "life" + ], + [ + "Co", + "p" + ], + [ + "C", + "op" + ], + [ + "ve", + "rn" + ], + [ + "ver", + "n" + ], + [ + "v", + "ern" + ], + [ + "▁", + "Ú" + ], + [ + "to", + "s" + ], + [ + "t", + "os" + ], + [ + "▁L", + "ed" + ], + [ + "▁Le", + "d" + ], + [ + "▁key", + "words" + ], + [ + "▁keyword", + "s" + ], + [ + "▁med", + "ications" + ], + [ + "▁medic", + "ations" + ], + [ + "▁medication", + "s" + ], + [ + "ne", + "um" + ], + [ + "▁jam", + "ais" + ], + [ + "▁B", + "uc" + ], + [ + "▁Bu", + "c" + ], + [ + "▁P", + "D" + ], + [ + "▁", + "PD" + ], + [ + "▁State", + "ment" + ], + [ + "▁Stat", + "ement" + ], + [ + "▁", + "Statement" + ], + [ + "▁P", + "I" + ], + [ + "▁", + "PI" + ], + [ + "▁Jack", + "ie" + ], + [ + "▁Jac", + "kie" + ], + [ + "▁ord", + "in" + ], + [ + "▁k", + "ör" + ], + [ + "▁kö", + "r" + ], + [ + "en", + "ze" + ], + [ + "enz", + "e" + ], + [ + "▁util", + "ized" + ], + [ + "▁utiliz", + "ed" + ], + [ + "▁utilize", + "d" + ], + [ + "á", + "ct" + ], + [ + "az", + "ed" + ], + [ + "aze", + "d" + ], + [ + "a", + "zed" + ], + [ + "▁sever", + "ely" + ], + [ + "▁severe", + "ly" + ], + [ + "▁ä", + "ven" + ], + [ + "▁li", + "bro" + ], + [ + "▁lib", + "ro" + ], + [ + "▁E", + "u" + ], + [ + "äs", + "t" + ], + [ + "ä", + "st" + ], + [ + "PAR", + "T" + ], + [ + "PA", + "RT" + ], + [ + "P", + "ART" + ], + [ + "▁But", + "ler" + ], + [ + "▁puzz", + "le" + ], + [ + "F", + "all" + ], + [ + "Count", + "ry" + ], + [ + "C", + "ountry" + ], + [ + "pf", + "n" + ], + [ + "p", + "fn" + ], + [ + "▁у", + "країн" + ], + [ + "▁Or", + "chestra" + ], + [ + "▁al", + "to" + ], + [ + "▁alt", + "o" + ], + [ + "▁anc", + "ora" + ], + [ + "▁decom", + "position" + ], + [ + "▁", + "م" + ], + [ + "▁appet", + "ite" + ], + [ + "ad", + "u" + ], + [ + "a", + "du" + ], + [ + "▁TH", + "AT" + ], + [ + "▁com", + "enz" + ], + [ + "min", + "a" + ], + [ + "mi", + "na" + ], + [ + "m", + "ina" + ], + [ + "▁init", + "iated" + ], + [ + "▁initi", + "ated" + ], + [ + "▁T", + "at" + ], + [ + "▁Ta", + "t" + ], + [ + "▁some", + "time" + ], + [ + "▁som", + "etime" + ], + [ + "▁somet", + "ime" + ], + [ + "re", + "k" + ], + [ + "r", + "ek" + ], + [ + "br", + "ead" + ], + [ + "bre", + "ad" + ], + [ + "b", + "read" + ], + [ + "▁Stat", + "istics" + ], + [ + "▁", + "Statistics" + ], + [ + "▁C", + "ob" + ], + [ + "▁Co", + "b" + ], + [ + "F", + "ollow" + ], + [ + "▁ge", + "ometric" + ], + [ + "ш", + "ла" + ], + [ + "▁proceed", + "ings" + ], + [ + "D", + "lg" + ], + [ + "se", + "ven" + ], + [ + "s", + "even" + ], + [ + "▁[", + "-" + ], + [ + "▁", + "[-" + ], + [ + "▁Buff", + "alo" + ], + [ + "▁bl", + "acks" + ], + [ + "▁black", + "s" + ], + [ + "▁s", + "ov" + ], + [ + "▁so", + "v" + ], + [ + "▁cust", + "ody" + ], + [ + "▁r", + "as" + ], + [ + "▁ra", + "s" + ], + [ + "▁", + "ras" + ], + [ + "▁tatto", + "o" + ], + [ + "öffent", + "licht" + ], + [ + "Bl", + "o" + ], + [ + "B", + "lo" + ], + [ + "A", + "ustral" + ], + [ + "▁rec", + "uper" + ], + [ + "ле", + "в" + ], + [ + "л", + "ев" + ], + [ + "▁b", + "em" + ], + [ + "▁be", + "m" + ], + [ + "▁t", + "hou" + ], + [ + "▁th", + "ou" + ], + [ + "ori", + "ented" + ], + [ + "orient", + "ed" + ], + [ + "vi", + "r" + ], + [ + "v", + "ir" + ], + [ + "▁col", + "ony" + ], 
+ [ + "▁colon", + "y" + ], + [ + "▁Stan", + "ford" + ], + [ + "Abs", + "olute" + ], + [ + "ad", + "rat" + ], + [ + "adr", + "at" + ], + [ + "▁S", + "itu" + ], + [ + "▁Si", + "tu" + ], + [ + "▁sou", + "vent" + ], + [ + "EX", + "EC" + ], + [ + "▁m", + "ű" + ], + [ + "▁apart", + "ments" + ], + [ + "▁apartment", + "s" + ], + [ + "▁слу", + "ча" + ], + [ + "▁a", + "no" + ], + [ + "▁an", + "o" + ], + [ + "▁", + "ano" + ], + [ + "WIN", + "DO" + ], + [ + "ac", + "ci" + ], + [ + "acc", + "i" + ], + [ + "▁L", + "au" + ], + [ + "▁La", + "u" + ], + [ + "co", + "urt" + ], + [ + "cou", + "rt" + ], + [ + "c", + "ourt" + ], + [ + "▁manif", + "old" + ], + [ + "▁coal", + "ition" + ], + [ + "▁X", + "IV" + ], + [ + "▁XI", + "V" + ], + [ + "Att", + "rib" + ], + [ + "Attr", + "ib" + ], + [ + "asc", + "ade" + ], + [ + "▁whe", + "at" + ], + [ + "▁strength", + "s" + ], + [ + "FR", + "EE" + ], + [ + "F", + "REE" + ], + [ + "EMP", + "TY" + ], + [ + "▁h", + "ey" + ], + [ + "▁he", + "y" + ], + [ + "as", + "cular" + ], + [ + "asc", + "ular" + ], + [ + "▁pl", + "asma" + ], + [ + "▁b", + "ob" + ], + [ + "▁bo", + "b" + ], + [ + "Sep", + "arator" + ], + [ + "=\"", + "${" + ], + [ + "=\"$", + "{" + ], + [ + "▁Z", + "ag" + ], + [ + "▁Za", + "g" + ], + [ + "▁pro", + "jet" + ], + [ + "▁smooth", + "ly" + ], + [ + "SE", + "QU" + ], + [ + "an", + "aly" + ], + [ + "ana", + "ly" + ], + [ + "anal", + "y" + ], + [ + "att", + "achment" + ], + [ + "attach", + "ment" + ], + [ + "▁E", + "S" + ], + [ + "▁", + "ES" + ], + [ + "▁po", + "pped" + ], + [ + "▁pop", + "ped" + ], + [ + "ő", + "s" + ], + [ + "to", + "m" + ], + [ + "t", + "om" + ], + [ + "▁s", + "ón" + ], + [ + "▁só", + "n" + ], + [ + "▁r", + "ott" + ], + [ + "▁ro", + "tt" + ], + [ + "▁rot", + "t" + ], + [ + "▁", + "rott" + ], + [ + "Util", + "ities" + ], + [ + "Ut", + "ilities" + ], + [ + "had", + "oop" + ], + [ + "hado", + "op" + ], + [ + "▁s", + "otto" + ], + [ + "▁so", + "tto" + ], + [ + "au", + "tor" + ], + [ + "aut", + "or" + ], + [ + "auto", + "r" + ], + [ + "▁George", + "s" + ], + [ + "▁Georg", + "es" + ], + [ + "▁kter", + "ý" + ], + [ + "▁gru", + "ppo" + ], + [ + "▁ко", + "гда" + ], + [ + "▁ме", + "да" + ], + [ + "▁instrument", + "al" + ], + [ + "▁W", + "riter" + ], + [ + "▁Write", + "r" + ], + [ + "▁Writ", + "er" + ], + [ + "▁Wr", + "iter" + ], + [ + "▁", + "Writer" + ], + [ + "▁set", + "Timeout" + ], + [ + "ik", + "k" + ], + [ + "i", + "kk" + ], + [ + "▁Do", + "po" + ], + [ + "▁Dop", + "o" + ], + [ + "])", + ";\r" + ], + [ + "]);", + "\r" + ], + [ + "]", + ");\r" + ], + [ + "▁pract", + "icing" + ], + [ + "▁Ron", + "ald" + ], + [ + "▁у", + "би" + ], + [ + "▁ag", + "rees" + ], + [ + "▁agree", + "s" + ], + [ + "▁agre", + "es" + ], + [ + "▁den", + "oted" + ], + [ + "▁denote", + "d" + ], + [ + "is", + "miss" + ], + [ + "ism", + "iss" + ], + [ + "▁interview", + "ed" + ], + [ + "template", + "s" + ], + [ + "t", + "emplates" + ], + [ + "ř", + "i" + ], + [ + "ad", + "ministr" + ], + [ + "admin", + "istr" + ], + [ + "▁B", + "utter" + ], + [ + "▁But", + "ter" + ], + [ + "▁XV", + "II" + ], + [ + "▁XVI", + "I" + ], + [ + "▁position", + "ed" + ], + [ + "▁posit", + "ioned" + ], + [ + "▁Four", + "th" + ], + [ + "▁overwhel", + "med" + ], + [ + "▁Reg", + "ular" + ], + [ + "▁rep", + "rezent" + ], + [ + "коно", + "ми" + ], + [ + "▁expect", + "s" + ], + [ + "Ind", + "ices" + ], + [ + "▁mar", + "ijuana" + ], + [ + "▁z", + "aj" + ], + [ + "▁za", + "j" + ], + [ + "▁B", + "ren" + ], + [ + "▁Br", + "en" + ], + [ + "▁Bre", + "n" + ], + [ + "▁be", + "gg" + ], + [ + "▁beg", + "g" + ], + [ + "▁na", + 
"hm" + ], + [ + "▁nah", + "m" + ], + [ + "▁inter", + "rog" + ], + [ + "ти", + "е" + ], + [ + "▁B", + "un" + ], + [ + "▁Bu", + "n" + ], + [ + "▁с", + "еред" + ], + [ + "▁се", + "ред" + ], + [ + "▁shel", + "ves" + ], + [ + "▁которы", + "х" + ], + [ + "▁Fra", + "uen" + ], + [ + "▁Frau", + "en" + ], + [ + "▁Serge", + "ant" + ], + [ + "▁у", + "спе" + ], + [ + "mat", + "ched" + ], + [ + "match", + "ed" + ], + [ + "m", + "atched" + ], + [ + "▁d", + "onne" + ], + [ + "▁don", + "ne" + ], + [ + "▁touch", + "es" + ], + [ + "▁tou", + "ches" + ], + [ + "ab", + "ort" + ], + [ + "abor", + "t" + ], + [ + "▁v", + "ale" + ], + [ + "▁val", + "e" + ], + [ + "▁va", + "le" + ], + [ + "▁inst", + "itutional" + ], + [ + "▁institut", + "ional" + ], + [ + "▁institution", + "al" + ], + [ + "▁M", + "ons" + ], + [ + "▁Mon", + "s" + ], + [ + "▁Mo", + "ns" + ], + [ + "▁ambit", + "ious" + ], + [ + "▁non", + "etheless" + ], + [ + "▁none", + "theless" + ], + [ + "j", + "d" + ], + [ + "пе", + "й" + ], + [ + "п", + "ей" + ], + [ + "▁back", + "pack" + ], + [ + "da", + "o" + ], + [ + "d", + "ao" + ], + [ + "ви", + "я" + ], + [ + "▁surround", + "ings" + ], + [ + "▁surrounding", + "s" + ], + [ + "|", + "_{" + ], + [ + "▁g", + "egründ" + ], + [ + "dis", + "p" + ], + [ + "di", + "sp" + ], + [ + "d", + "isp" + ], + [ + "▁moist", + "ure" + ], + [ + "▁w", + "yd" + ], + [ + "▁wy", + "d" + ], + [ + "▁tr", + "aders" + ], + [ + "▁trad", + "ers" + ], + [ + "▁tra", + "ders" + ], + [ + "▁trade", + "rs" + ], + [ + "▁Er", + "st" + ], + [ + "▁Gal", + "axy" + ], + [ + "▁в", + "оло" + ], + [ + "▁во", + "ло" + ], + [ + "▁Per", + "u" + ], + [ + "▁Pe", + "ru" + ], + [ + "▁prior", + "ities" + ], + [ + "▁pron", + "ounced" + ], + [ + "▁C", + "BS" + ], + [ + "▁CB", + "S" + ], + [ + "▁Pal", + "m" + ], + [ + "▁Pa", + "lm" + ], + [ + "▁exp", + "ans" + ], + [ + "▁ener", + "get" + ], + [ + "▁energ", + "et" + ], + [ + "▁Cond", + "ition" + ], + [ + "▁", + "Condition" + ], + [ + "▁S", + "ver" + ], + [ + "▁Sv", + "er" + ], + [ + "ne", + "sted" + ], + [ + "nes", + "ted" + ], + [ + "n", + "ested" + ], + [ + "▁февра", + "ля" + ], + [ + "he", + "ro" + ], + [ + "her", + "o" + ], + [ + "h", + "ero" + ], + [ + "▁ко", + "ло" + ], + [ + "▁к", + "оло" + ], + [ + "▁", + "коло" + ], + [ + "▁Fil", + "ms" + ], + [ + "▁Film", + "s" + ], + [ + "Bo", + "n" + ], + [ + "B", + "on" + ], + [ + "é", + "al" + ], + [ + "ploy", + "ed" + ], + [ + "tr", + "ained" + ], + [ + "tra", + "ined" + ], + [ + "train", + "ed" + ], + [ + "▁els", + "ő" + ], + [ + "▁l", + "ust" + ], + [ + "▁lu", + "st" + ], + [ + "ati", + "num" + ], + [ + "atin", + "um" + ], + [ + "oy", + "le" + ], + [ + "o", + "yle" + ], + [ + "▁J", + "et" + ], + [ + "▁Je", + "t" + ], + [ + "жде", + "ния" + ], + [ + "▁survey", + "s" + ], + [ + "▁surve", + "ys" + ], + [ + "be", + "e" + ], + [ + "b", + "ee" + ], + [ + "work", + "ers" + ], + [ + "worker", + "s" + ], + [ + "wor", + "kers" + ], + [ + "rec", + "ords" + ], + [ + "record", + "s" + ], + [ + "cal", + "endar" + ], + [ + "bb", + "ing" + ], + [ + "b", + "bing" + ], + [ + "reg", + "ation" + ], + [ + "dash", + "board" + ], + [ + "d", + "ashboard" + ], + [ + "K", + "ing" + ], + [ + "▁v", + "ista" + ], + [ + "▁vis", + "ta" + ], + [ + "▁vi", + "sta" + ], + [ + "▁dep", + "icted" + ], + [ + "▁occur", + "ring" + ], + [ + "▁о", + "фи" + ], + [ + "▁sand", + "wich" + ], + [ + "rc", + "u" + ], + [ + "r", + "cu" + ], + [ + "ke", + "rn" + ], + [ + "ker", + "n" + ], + [ + "k", + "ern" + ], + [ + "▁min", + "ut" + ], + [ + "▁mi", + "nut" + ], + [ + "▁с", + "мер" + ], + [ + "▁t", + "d" + ], + [ + 
"▁", + "td" + ], + [ + "so", + "lete" + ], + [ + "sole", + "te" + ], + [ + "sol", + "ete" + ], + [ + "Com", + "plex" + ], + [ + "Comp", + "lex" + ], + [ + "▁t", + "unn" + ], + [ + "▁tu", + "nn" + ], + [ + "▁tun", + "n" + ], + [ + "▁sc", + "arc" + ], + [ + "▁scar", + "c" + ], + [ + "st", + "ead" + ], + [ + "ste", + "ad" + ], + [ + "▁F", + "ail" + ], + [ + "▁Fa", + "il" + ], + [ + "▁", + "Fail" + ], + [ + "▁R", + "s" + ], + [ + "▁tr", + "ails" + ], + [ + "▁tra", + "ils" + ], + [ + "▁trail", + "s" + ], + [ + "ke", + "m" + ], + [ + "k", + "em" + ], + [ + "▁Rom", + "ans" + ], + [ + "▁Ro", + "mans" + ], + [ + "▁Roman", + "s" + ], + [ + "▁Roma", + "ns" + ], + [ + "at", + "ivity" + ], + [ + "ativ", + "ity" + ], + [ + "Pre", + "vious" + ], + [ + "Prev", + "ious" + ], + [ + "▁de", + "press" + ], + [ + "▁dep", + "ress" + ], + [ + "▁re", + "signed" + ], + [ + "▁res", + "igned" + ], + [ + "▁resign", + "ed" + ], + [ + "get", + "Default" + ], + [ + "▁Tib", + "et" + ], + [ + "▁Ti", + "bet" + ], + [ + "▁Fr", + "anco" + ], + [ + "▁Franc", + "o" + ], + [ + "▁Fran", + "co" + ], + [ + "\")", + "));" + ], + [ + "\"))", + ");" + ], + [ + "\"", + ")));" + ], + [ + "▁in", + "jection" + ], + [ + "▁inj", + "ection" + ], + [ + "▁inject", + "ion" + ], + [ + "rem", + "oved" + ], + [ + "remove", + "d" + ], + [ + "▁pra", + "ised" + ], + [ + "▁praise", + "d" + ], + [ + "▁A", + "sc" + ], + [ + "▁As", + "c" + ], + [ + "er", + "ase" + ], + [ + "era", + "se" + ], + [ + "eras", + "e" + ], + [ + "e", + "rase" + ], + [ + "▁commission", + "ed" + ], + [ + "MA", + "IL" + ], + [ + "M", + "AIL" + ], + [ + "▁B", + "oh" + ], + [ + "▁Bo", + "h" + ], + [ + "Pol", + "y" + ], + [ + "Po", + "ly" + ], + [ + "P", + "oly" + ], + [ + "▁cin", + "q" + ], + [ + "▁Ab", + "ove" + ], + [ + "▁Josh", + "ua" + ], + [ + "ZE", + "RO" + ], + [ + "Z", + "ERO" + ], + [ + "▁sum", + "mit" + ], + [ + "▁U", + "rs" + ], + [ + "▁Ur", + "s" + ], + [ + "▁c", + "url" + ], + [ + "▁cur", + "l" + ], + [ + "▁cu", + "rl" + ], + [ + "▁v", + "isa" + ], + [ + "▁vis", + "a" + ], + [ + "▁vi", + "sa" + ], + [ + "▁re", + "sur" + ], + [ + "▁res", + "ur" + ], + [ + "={", + "'" + ], + [ + "=", + "{'" + ], + [ + "fe", + "at" + ], + [ + "▁abs", + "orb" + ], + [ + "▁absor", + "b" + ], + [ + "▁plan", + "ets" + ], + [ + "▁plane", + "ts" + ], + [ + "▁planet", + "s" + ], + [ + "▁prin", + "cess" + ], + [ + "▁prince", + "ss" + ], + [ + "▁Jahrhund", + "erts" + ], + [ + "▁Jahrhundert", + "s" + ], + [ + "x", + "p" + ], + [ + "▁N", + "BC" + ], + [ + "▁ко", + "ми" + ], + [ + "▁ком", + "и" + ], + [ + "▁F", + "UN" + ], + [ + "▁", + "FUN" + ], + [ + "▁ne", + "uen" + ], + [ + "▁neu", + "en" + ], + [ + "▁neue", + "n" + ], + [ + "▁dé", + "jà" + ], + [ + "▁O", + "z" + ], + [ + "bb", + "en" + ], + [ + "b", + "ben" + ], + [ + "VID", + "EO" + ], + [ + "▁ej", + "empl" + ], + [ + "▁cons", + "iders" + ], + [ + "▁consider", + "s" + ], + [ + "▁consid", + "ers" + ], + [ + "at", + "ri" + ], + [ + "atr", + "i" + ], + [ + "a", + "tri" + ], + [ + "▁ar", + "rog" + ], + [ + "▁arr", + "og" + ], + [ + "io", + "so" + ], + [ + "ios", + "o" + ], + [ + "i", + "oso" + ], + [ + "▁h", + "ace" + ], + [ + "▁ha", + "ce" + ], + [ + "▁contact", + "ed" + ], + [ + "▁un", + "ple" + ], + [ + "▁spons", + "ored" + ], + [ + "▁tr", + "ainer" + ], + [ + "▁tra", + "iner" + ], + [ + "▁train", + "er" + ], + [ + "sb", + "i" + ], + [ + "s", + "bi" + ], + [ + "▁за", + "нима" + ], + [ + "C", + "riterion" + ], + [ + "но", + "то" + ], + [ + "sch", + "eme" + ], + [ + "sche", + "me" + ], + [ + "enn", + "ial" + ], + [ + "per", + "form" + ], + [ + 
"perf", + "orm" + ], + [ + "▁fix", + "ing" + ], + [ + "▁по", + "стро" + ], + [ + "▁пос", + "тро" + ], + [ + "ar", + "b" + ], + [ + "a", + "rb" + ], + [ + "EX", + "IT" + ], + [ + "▁ca", + "fé" + ], + [ + "▁caf", + "é" + ], + [ + "itut", + "ed" + ], + [ + "itute", + "d" + ], + [ + "itu", + "ted" + ], + [ + "ri", + "ages" + ], + [ + "ria", + "ges" + ], + [ + "riage", + "s" + ], + [ + "T", + "ur" + ], + [ + "▁hab", + "er" + ], + [ + "▁ha", + "ber" + ], + [ + "el", + "asticsearch" + ], + [ + "▁а", + "л" + ], + [ + "▁", + "ал" + ], + [ + "r", + "h" + ], + [ + "▁v", + "oll" + ], + [ + "▁vo", + "ll" + ], + [ + "▁vol", + "l" + ], + [ + "CL", + "U" + ], + [ + "M", + "il" + ], + [ + "▁mem", + "bres" + ], + [ + "▁membr", + "es" + ], + [ + "▁membre", + "s" + ], + [ + "▁remark", + "ed" + ], + [ + "ва", + "на" + ], + [ + "ван", + "а" + ], + [ + "в", + "ана" + ], + [ + "=\"", + "_" + ], + [ + "Le", + "ss" + ], + [ + "Les", + "s" + ], + [ + "L", + "ess" + ], + [ + "(\"", + "\");" + ], + [ + "▁Y", + "ale" + ], + [ + "▁Ya", + "le" + ], + [ + "ber", + "ries" + ], + [ + "▁rele", + "asing" + ], + [ + "▁im", + "ports" + ], + [ + "▁import", + "s" + ], + [ + "▁imp", + "orts" + ], + [ + "id", + "ea" + ], + [ + "ide", + "a" + ], + [ + "▁(", + "+" + ], + [ + "▁ar", + "qu" + ], + [ + "ific", + "ación" + ], + [ + "ifica", + "ción" + ], + [ + "▁па", + "ра" + ], + [ + "▁пар", + "а" + ], + [ + "▁R", + "angers" + ], + [ + "▁Range", + "rs" + ], + [ + "▁Rang", + "ers" + ], + [ + "▁Ran", + "gers" + ], + [ + "M", + "ic" + ], + [ + "▁n", + "ederbörd" + ], + [ + "▁imag", + "inary" + ], + [ + "▁imagin", + "ary" + ], + [ + "▁special", + "ists" + ], + [ + "▁specialist", + "s" + ], + [ + "▁ho", + "of" + ], + [ + "Mod", + "ules" + ], + [ + "Module", + "s" + ], + [ + "▁sad", + "ly" + ], + [ + "ce", + "il" + ], + [ + "Tab", + "Index" + ], + [ + "at", + "ionale" + ], + [ + "ation", + "ale" + ], + [ + "ational", + "e" + ], + [ + "▁Part", + "ner" + ], + [ + "tb", + "ody" + ], + [ + "t", + "body" + ], + [ + "▁le", + "verage" + ], + [ + "▁lever", + "age" + ], + [ + "D", + "N" + ], + [ + "▁P", + "rec" + ], + [ + "▁Pr", + "ec" + ], + [ + "▁Pre", + "c" + ], + [ + "▁S", + "é" + ], + [ + "▁M", + "am" + ], + [ + "▁Ma", + "m" + ], + [ + "▁a", + "fin" + ], + [ + "▁af", + "in" + ], + [ + "is", + "Valid" + ], + [ + "Ps", + "e" + ], + [ + "P", + "se" + ], + [ + "▁сто", + "ро" + ], + [ + "▁cho", + "pped" + ], + [ + "▁chop", + "ped" + ], + [ + "▁Min", + "or" + ], + [ + "▁Mi", + "nor" + ], + [ + "▁d", + "abei" + ], + [ + "Da", + "vid" + ], + [ + "D", + "avid" + ], + [ + "uss", + "ia" + ], + [ + "▁дере", + "вня" + ], + [ + "▁Id", + "entity" + ], + [ + "▁Ident", + "ity" + ], + [ + "▁", + "Identity" + ], + [ + "▁L", + "GBT" + ], + [ + "ци", + "је" + ], + [ + "▁Or", + "ts" + ], + [ + "▁Ort", + "s" + ], + [ + "▁part", + "i" + ], + [ + "▁par", + "ti" + ], + [ + "▁B", + "achelor" + ], + [ + "ug", + "a" + ], + [ + "u", + "ga" + ], + [ + "▁O", + "PT" + ], + [ + "▁OP", + "T" + ], + [ + "▁", + "OPT" + ], + [ + "▁S", + "eth" + ], + [ + "▁Se", + "th" + ], + [ + "▁Set", + "h" + ], + [ + "▁LI", + "ABLE" + ], + [ + "▁inaug", + "ur" + ], + [ + "▁Shan", + "ghai" + ], + [ + "▁relax", + "ing" + ], + [ + "ци", + "она" + ], + [ + "цион", + "а" + ], + [ + "\"", + "%" + ], + [ + "▁o", + "bey" + ], + [ + "▁ob", + "ey" + ], + [ + "▁A", + "irlines" + ], + [ + "▁Air", + "lines" + ], + [ + "Link", + "s" + ], + [ + "Lin", + "ks" + ], + [ + "L", + "inks" + ], + [ + "▁C", + "elt" + ], + [ + "▁Ce", + "lt" + ], + [ + "▁Cel", + "t" + ], + [ + "▁Ad", + "min" + ], + [ + "▁Adm", + "in" + ], 
+ [ + "▁", + "Admin" + ], + [ + "ag", + "ation" + ], + [ + "▁wor", + "ries" + ], + [ + "IN", + "TE" + ], + [ + "INT", + "E" + ], + [ + "ar", + "ith" + ], + [ + "ari", + "th" + ], + [ + "Fat", + "alf" + ], + [ + "]]", + ")" + ], + [ + "]", + "])" + ], + [ + "co", + "lm" + ], + [ + "col", + "m" + ], + [ + "▁arch", + "ae" + ], + [ + "▁br", + "ushed" + ], + [ + "▁brush", + "ed" + ], + [ + "▁t", + "ät" + ], + [ + "▁struct", + "ured" + ], + [ + "▁structure", + "d" + ], + [ + "ти", + "и" + ], + [ + "▁home", + "m" + ], + [ + "▁hom", + "em" + ], + [ + "▁ho", + "mem" + ], + [ + "[:", + "," + ], + [ + "▁n", + "avy" + ], + [ + "▁na", + "vy" + ], + [ + "▁nav", + "y" + ], + [ + "get", + "Key" + ], + [ + "power", + "ed" + ], + [ + "pow", + "ered" + ], + [ + "▁s", + "ucked" + ], + [ + "▁suc", + "ked" + ], + [ + "▁suck", + "ed" + ], + [ + "▁z", + "omb" + ], + [ + "▁zo", + "mb" + ], + [ + "iss", + "ant" + ], + [ + "issa", + "nt" + ], + [ + "▁M", + "ight" + ], + [ + "▁Mi", + "ght" + ], + [ + "▁Mig", + "ht" + ], + [ + "▁P", + "ull" + ], + [ + "▁Pu", + "ll" + ], + [ + "▁Pul", + "l" + ], + [ + "ri", + "r" + ], + [ + "r", + "ir" + ], + [ + "▁п", + "і" + ], + [ + "▁", + "пі" + ], + [ + "▁se", + "as" + ], + [ + "▁sea", + "s" + ], + [ + "▁W", + "rest" + ], + [ + "▁Wr", + "est" + ], + [ + "▁t", + "ense" + ], + [ + "▁ten", + "se" + ], + [ + "▁tens", + "e" + ], + [ + "▁a", + "tm" + ], + [ + "▁at", + "m" + ], + [ + "▁have", + "t" + ], + [ + "▁ha", + "vet" + ], + [ + "▁hav", + "et" + ], + [ + "▁pier", + "ws" + ], + [ + "▁trag", + "ic" + ], + [ + "▁D", + "iff" + ], + [ + "▁Di", + "ff" + ], + [ + "▁", + "Diff" + ], + [ + "▁conf", + "idential" + ], + [ + "▁confident", + "ial" + ], + [ + "success", + "ful" + ], + [ + "ę", + "ż" + ], + [ + "▁Ch", + "ain" + ], + [ + "▁Cha", + "in" + ], + [ + "▁", + "Chain" + ], + [ + "▁Ken", + "ya" + ], + [ + "Ch", + "oice" + ], + [ + "oc", + "ur" + ], + [ + "o", + "cur" + ], + [ + "an", + "iu" + ], + [ + "ani", + "u" + ], + [ + "▁consult", + "ant" + ], + [ + "▁Ad", + "vis" + ], + [ + "▁Adv", + "is" + ], + [ + "Li", + "f" + ], + [ + "L", + "if" + ], + [ + "▁L", + "ors" + ], + [ + "▁Lo", + "rs" + ], + [ + "▁Lor", + "s" + ], + [ + "avor", + "ite" + ], + [ + "avo", + "rite" + ], + [ + "▁util", + "izing" + ], + [ + "▁utiliz", + "ing" + ], + [ + "▁v", + "intage" + ], + [ + "Mat", + "cher" + ], + [ + "Match", + "er" + ], + [ + "▁m", + "embre" + ], + [ + "▁me", + "mbre" + ], + [ + "▁mem", + "bre" + ], + [ + "▁membr", + "e" + ], + [ + "▁Ex", + "pect" + ], + [ + "▁Exp", + "ect" + ], + [ + "▁", + "Expect" + ], + [ + "▁tr", + "acing" + ], + [ + "▁tra", + "cing" + ], + [ + "no", + "g" + ], + [ + "n", + "og" + ], + [ + "▁d", + "ej" + ], + [ + "▁de", + "j" + ], + [ + "▁у", + "че" + ], + [ + "▁lo", + "ops" + ], + [ + "▁loop", + "s" + ], + [ + "▁on", + "click" + ], + [ + "▁G", + "PU" + ], + [ + "▁GP", + "U" + ], + [ + "▁", + "GPU" + ], + [ + "▁Album", + "s" + ], + [ + "▁Alb", + "ums" + ], + [ + "▁Arch", + "ives" + ], + [ + "ва", + "та" + ], + [ + "ват", + "а" + ], + [ + "▁st", + "ove" + ], + [ + "▁sto", + "ve" + ], + [ + "ш", + "ли" + ], + [ + "an", + "cies" + ], + [ + "anc", + "ies" + ], + [ + "▁geme", + "ente" + ], + [ + "mo", + "b" + ], + [ + "m", + "ob" + ], + [ + "PD", + "F" + ], + [ + "P", + "DF" + ], + [ + "es", + "o" + ], + [ + "e", + "so" + ], + [ + "▁v", + "ég" + ], + [ + "▁vé", + "g" + ], + [ + "Res", + "olve" + ], + [ + "▁te", + "aches" + ], + [ + "▁teach", + "es" + ], + [ + "▁tea", + "ches" + ], + [ + "ло", + "же" + ], + [ + "▁с", + "тво" + ], + [ + "▁ст", + "во" + ], + [ + "▁", + "ство" + ], + 
[ + "▁О", + "дна" + ], + [ + "▁f", + "id" + ], + [ + "▁fi", + "d" + ], + [ + "Some", + "thing" + ], + [ + "Som", + "ething" + ], + [ + "▁ne", + "bo" + ], + [ + "▁Valent", + "ine" + ], + [ + "row", + "ning" + ], + [ + "rown", + "ing" + ], + [ + "▁а", + "ле" + ], + [ + "▁ал", + "е" + ], + [ + "aw", + "i" + ], + [ + "a", + "wi" + ], + [ + "is", + "hi" + ], + [ + "ish", + "i" + ], + [ + "▁S", + "PI" + ], + [ + "▁SP", + "I" + ], + [ + "▁", + "SPI" + ], + [ + "▁s", + "pel" + ], + [ + "▁sp", + "el" + ], + [ + "▁spe", + "l" + ], + [ + "▁б", + "іль" + ], + [ + "▁бі", + "ль" + ], + [ + "▁particip", + "ant" + ], + [ + "▁N", + "ed" + ], + [ + "▁Ne", + "d" + ], + [ + "▁G", + "ast" + ], + [ + "▁Ga", + "st" + ], + [ + "▁Gas", + "t" + ], + [ + "▁bl", + "ond" + ], + [ + "▁blo", + "nd" + ], + [ + "▁s", + "aves" + ], + [ + "▁sa", + "ves" + ], + [ + "▁save", + "s" + ], + [ + "▁sav", + "es" + ], + [ + "col", + "ored" + ], + [ + "color", + "ed" + ], + [ + "colo", + "red" + ], + [ + "▁A", + "CTION" + ], + [ + "▁AC", + "TION" + ], + [ + "▁ACT", + "ION" + ], + [ + "▁", + "ACTION" + ], + [ + "▁Polit", + "iker" + ], + [ + "}$", + ")" + ], + [ + "}", + "$)" + ], + [ + "▁D", + "um" + ], + [ + "▁Du", + "m" + ], + [ + "den", + "try" + ], + [ + "d", + "entry" + ], + [ + "Stud", + "ent" + ], + [ + "▁~", + "=" + ], + [ + "lo", + "ads" + ], + [ + "load", + "s" + ], + [ + "▁F", + "oster" + ], + [ + "▁Fo", + "ster" + ], + [ + "一", + "个" + ], + [ + "▁P", + "K" + ], + [ + "▁", + "PK" + ], + [ + "▁S", + "B" + ], + [ + "▁", + "SB" + ], + [ + "▁H", + "ern" + ], + [ + "▁He", + "rn" + ], + [ + "▁Her", + "n" + ], + [ + "▁Ex", + "hib" + ], + [ + "Listener", + "s" + ], + [ + "Listen", + "ers" + ], + [ + "Su", + "n" + ], + [ + "S", + "un" + ], + [ + "pl", + "ac" + ], + [ + "▁B", + "ever" + ], + [ + "▁Be", + "ver" + ], + [ + "▁Bev", + "er" + ], + [ + "▁incl", + "uy" + ], + [ + "▁inclu", + "y" + ], + [ + "▁d", + "c" + ], + [ + "▁", + "dc" + ], + [ + "ar", + "gc" + ], + [ + "arg", + "c" + ], + [ + "▁g", + "ed" + ], + [ + "▁ge", + "d" + ], + [ + "▁", + "ged" + ], + [ + "с", + "па" + ], + [ + "▁Form", + "ula" + ], + [ + "▁с", + "ем" + ], + [ + "▁се", + "м" + ], + [ + "▁em", + "pt" + ], + [ + "▁emp", + "t" + ], + [ + "▁", + "empt" + ], + [ + "un", + "register" + ], + [ + "▁Queens", + "land" + ], + [ + "ánd", + "ez" + ], + [ + "ot", + "ive" + ], + [ + "oti", + "ve" + ], + [ + "▁al", + "ley" + ], + [ + "▁all", + "ey" + ], + [ + "▁alle", + "y" + ], + [ + "▁Democr", + "at" + ], + [ + "▁trav", + "ail" + ], + [ + "▁$", + "," + ], + [ + "▁", + "$," + ], + [ + "R", + "P" + ], + [ + "ро", + "е" + ], + [ + "pers", + "onal" + ], + [ + "person", + "al" + ], + [ + "▁péri", + "ode" + ], + [ + "HO", + "ME" + ], + [ + "om", + "es" + ], + [ + "ome", + "s" + ], + [ + "o", + "mes" + ], + [ + "▁recogn", + "ised" + ], + [ + "he", + "ng" + ], + [ + "hen", + "g" + ], + [ + "h", + "eng" + ], + [ + "▁J", + "ung" + ], + [ + "▁Jun", + "g" + ], + [ + "▁Ju", + "ng" + ], + [ + "▁Ro", + "land" + ], + [ + "▁Rol", + "and" + ], + [ + "▁conv", + "icted" + ], + [ + "Loc", + "ked" + ], + [ + "Lock", + "ed" + ], + [ + "L", + "ocked" + ], + [ + "▁m", + "ari" + ], + [ + "▁mar", + "i" + ], + [ + "▁ma", + "ri" + ], + [ + "▁Lux", + "em" + ], + [ + "refer", + "to" + ], + [ + "De", + "leted" + ], + [ + "Dele", + "ted" + ], + [ + "Delete", + "d" + ], + [ + "Del", + "eted" + ], + [ + "int", + "ent" + ], + [ + "inte", + "nt" + ], + [ + "▁St", + "aats" + ], + [ + "▁Sta", + "ats" + ], + [ + "▁обла", + "сті" + ], + [ + "и", + "т" + ], + [ + "▁са", + "ве" + ], + [ + "▁Pro", + "tocol" + ], + [ 
+ "▁", + "Protocol" + ], + [ + "ają", + "c" + ], + [ + "ch", + "k" + ], + [ + "Type", + "Info" + ], + [ + "▁p", + "kt" + ], + [ + "▁", + "pkt" + ], + [ + "▁sc", + "andal" + ], + [ + "▁scan", + "dal" + ], + [ + "▁individ", + "ually" + ], + [ + "▁individual", + "ly" + ], + [ + "FM", + "T" + ], + [ + "F", + "MT" + ], + [ + "▁n", + "j" + ], + [ + "ab", + "ile" + ], + [ + "abil", + "e" + ], + [ + "abi", + "le" + ], + [ + "▁R", + "ivers" + ], + [ + "▁River", + "s" + ], + [ + "PRO", + "PERTY" + ], + [ + "V", + "B" + ], + [ + "wo", + "rt" + ], + [ + "wor", + "t" + ], + [ + "w", + "ort" + ], + [ + "▁split", + "ting" + ], + [ + "▁spl", + "itting" + ], + [ + "ach", + "ten" + ], + [ + "acht", + "en" + ], + [ + "achte", + "n" + ], + [ + "a", + "chten" + ], + [ + "▁AR", + "ISING" + ], + [ + "▁s", + "ip" + ], + [ + "▁si", + "p" + ], + [ + "▁f", + "res" + ], + [ + "▁fr", + "es" + ], + [ + "▁fre", + "s" + ], + [ + "▁g", + "room" + ], + [ + "▁gr", + "oom" + ], + [ + "▁gro", + "om" + ], + [ + "H", + "ol" + ], + [ + "▁c", + "anon" + ], + [ + "▁can", + "on" + ], + [ + "▁ca", + "non" + ], + [ + "▁abrupt", + "ly" + ], + [ + "▁after", + "ward" + ], + [ + "▁R", + "unning" + ], + [ + "▁Run", + "ning" + ], + [ + "▁", + "Running" + ], + [ + "▁j", + "i" + ], + [ + "▁", + "ji" + ], + [ + "▁%", + "," + ], + [ + "▁", + "%," + ], + [ + "▁Palest", + "inian" + ], + [ + "R", + "W" + ], + [ + "pgf", + "scope" + ], + [ + "▁country", + "side" + ], + [ + "▁countr", + "yside" + ], + [ + "▁fort", + "unate" + ], + [ + "▁", + "fortunate" + ], + [ + "▁c", + "él" + ], + [ + "▁Po", + "inter" + ], + [ + "▁Point", + "er" + ], + [ + "▁", + "Pointer" + ], + [ + "ens", + "ors" + ], + [ + "ensor", + "s" + ], + [ + "enso", + "rs" + ], + [ + "ra", + "ting" + ], + [ + "rat", + "ing" + ], + [ + "r", + "ating" + ], + [ + "▁buff", + "ers" + ], + [ + "▁buffer", + "s" + ], + [ + "▁buf", + "fers" + ], + [ + "▁re", + "mot" + ], + [ + "▁rem", + "ot" + ], + [ + "▁Prop", + "Types" + ], + [ + "▁N", + "ah" + ], + [ + "▁Na", + "h" + ], + [ + "al", + "tern" + ], + [ + "alt", + "ern" + ], + [ + "alter", + "n" + ], + [ + "▁eas", + "iest" + ], + [ + "▁in", + "vas" + ], + [ + "▁inv", + "as" + ], + [ + "▁cl", + "k" + ], + [ + "▁", + "clk" + ], + [ + "copy", + "right" + ], + [ + "c", + "opyright" + ], + [ + "▁bl", + "anc" + ], + [ + "SA", + "MP" + ], + [ + "S", + "AMP" + ], + [ + "▁Co", + "hen" + ], + [ + "▁S", + "hell" + ], + [ + "▁She", + "ll" + ], + [ + "▁Sh", + "ell" + ], + [ + "▁Shel", + "l" + ], + [ + "▁", + "Shell" + ], + [ + "▁destroy", + "ing" + ], + [ + "▁destro", + "ying" + ], + [ + "▁Z", + "el" + ], + [ + "▁Ze", + "l" + ], + [ + "date", + "r" + ], + [ + "da", + "ter" + ], + [ + "dat", + "er" + ], + [ + "d", + "ater" + ], + [ + "če", + "n" + ], + [ + "č", + "en" + ], + [ + "▁f", + "iling" + ], + [ + "▁fil", + "ing" + ], + [ + "▁fi", + "ling" + ], + [ + "▁integr", + "ate" + ], + [ + "xi", + "t" + ], + [ + "x", + "it" + ], + [ + "▁R", + "ET" + ], + [ + "▁RE", + "T" + ], + [ + "▁", + "RET" + ], + [ + "le", + "ne" + ], + [ + "len", + "e" + ], + [ + "l", + "ene" + ], + [ + "cal", + "ls" + ], + [ + "call", + "s" + ], + [ + "c", + "alls" + ], + [ + "▁sl", + "aughter" + ], + [ + "init", + "ialized" + ], + [ + "initial", + "ized" + ], + [ + "initialize", + "d" + ], + [ + "un", + "ches" + ], + [ + "unch", + "es" + ], + [ + "unc", + "hes" + ], + [ + "▁Tr", + "ace" + ], + [ + "▁Tra", + "ce" + ], + [ + "▁", + "Trace" + ], + [ + "eff", + "icient" + ], + [ + "▁Wood", + "s" + ], + [ + "▁long", + "itud" + ], + [ + "G", + "N" + ], + [ + "▁K", + "ont" + ], + [ + "▁Kon", + 
"t" + ], + [ + "▁Ko", + "nt" + ], + [ + "▁chunk", + "s" + ], + [ + "á", + "ch" + ], + [ + "▁unem", + "ployment" + ], + [ + "ac", + "om" + ], + [ + "aco", + "m" + ], + [ + "a", + "com" + ], + [ + "▁sl", + "owed" + ], + [ + "▁slow", + "ed" + ], + [ + "▁out", + "lined" + ], + [ + "▁outline", + "d" + ], + [ + "xff", + "ff" + ], + [ + "xf", + "fff" + ], + [ + "x", + "ffff" + ], + [ + "▁ik", + "ke" + ], + [ + "▁work", + "space" + ], + [ + "▁works", + "pace" + ], + [ + "M", + "c" + ], + [ + "▁k", + "icking" + ], + [ + "▁kick", + "ing" + ], + [ + "▁embed", + "ding" + ], + [ + "ch", + "nitt" + ], + [ + "chn", + "itt" + ], + [ + "er", + "ten" + ], + [ + "ert", + "en" + ], + [ + "▁In", + "terior" + ], + [ + "▁Inter", + "ior" + ], + [ + "▁S", + "ongs" + ], + [ + "▁Son", + "gs" + ], + [ + "▁Song", + "s" + ], + [ + "mm", + "c" + ], + [ + "m", + "mc" + ], + [ + "▁analy", + "zed" + ], + [ + "▁analyze", + "d" + ], + [ + "▁Cou", + "pe" + ], + [ + "▁favor", + "ites" + ], + [ + "▁favorite", + "s" + ], + [ + "▁t", + "t" + ], + [ + "▁", + "tt" + ], + [ + "▁то", + "й" + ], + [ + "▁", + "той" + ], + [ + "R", + "outing" + ], + [ + "▁Sil", + "va" + ], + [ + "▁andere", + "m" + ], + [ + "▁ander", + "em" + ], + [ + "▁h", + "onom" + ], + [ + "▁hon", + "om" + ], + [ + "▁ho", + "nom" + ], + [ + "▁исполь", + "зова" + ], + [ + ".\"", + "]" + ], + [ + ".", + "\"]" + ], + [ + "▁W", + "u" + ], + [ + "le", + "gt" + ], + [ + "leg", + "t" + ], + [ + "▁s", + "poon" + ], + [ + "▁sp", + "oon" + ], + [ + "▁spo", + "on" + ], + [ + "▁j", + "ap" + ], + [ + "▁ja", + "p" + ], + [ + "▁Ext", + "ension" + ], + [ + "▁", + "Extension" + ], + [ + "er", + "ne" + ], + [ + "ern", + "e" + ], + [ + "▁v", + "agy" + ], + [ + "▁va", + "gy" + ], + [ + "▁vag", + "y" + ], + [ + "▁се", + "ла" + ], + [ + "▁ф", + "унк" + ], + [ + "▁anal", + "ytics" + ], + [ + "▁analyt", + "ics" + ], + [ + "▁s", + "ug" + ], + [ + "▁su", + "g" + ], + [ + "▁A", + "sync" + ], + [ + "▁As", + "ync" + ], + [ + "▁", + "Async" + ], + [ + "▁pe", + "aks" + ], + [ + "▁peak", + "s" + ], + [ + "▁G", + "ym" + ], + [ + "▁Gy", + "m" + ], + [ + "▁law", + "suit" + ], + [ + "▁laws", + "uit" + ], + [ + "<", + ">" + ], + [ + "ial", + "is" + ], + [ + "i", + "alis" + ], + [ + "et", + "ric" + ], + [ + "etr", + "ic" + ], + [ + "face", + "d" + ], + [ + "fa", + "ced" + ], + [ + "fac", + "ed" + ], + [ + "f", + "aced" + ], + [ + "▁dis", + "rupt" + ], + [ + "▁f", + "å" + ], + [ + "Input", + "s" + ], + [ + "`)", + ";" + ], + [ + "`", + ");" + ], + [ + "▁M", + "end" + ], + [ + "▁Me", + "nd" + ], + [ + "▁Men", + "d" + ], + [ + "go", + "n" + ], + [ + "g", + "on" + ], + [ + "▁\"", + ",\"" + ], + [ + "▁\",", + "\"" + ], + [ + "▁", + "\",\"" + ], + [ + "▁n", + "erves" + ], + [ + "▁nerv", + "es" + ], + [ + "▁nerve", + "s" + ], + [ + "▁ner", + "ves" + ], + [ + "▁doubt", + "s" + ], + [ + "▁doub", + "ts" + ], + [ + "sa", + "p" + ], + [ + "s", + "ap" + ], + [ + "▁s", + "ow" + ], + [ + "▁so", + "w" + ], + [ + ",\\", + ",\\" + ], + [ + ",\\,", + "\\" + ], + [ + ",", + "\\,\\" + ], + [ + "▁B", + "S" + ], + [ + "▁", + "BS" + ], + [ + "▁G", + "lad" + ], + [ + "▁Gl", + "ad" + ], + [ + "▁a", + "ster" + ], + [ + "▁as", + "ter" + ], + [ + "▁ast", + "er" + ], + [ + "▁", + "aster" + ], + [ + "œuv", + "re" + ], + [ + "▁Bang", + "l" + ], + [ + "▁Ban", + "gl" + ], + [ + "▁i", + "Pad" + ], + [ + "use", + "ppe" + ], + [ + "▁conduct", + "ing" + ], + [ + "▁(", + "{\\" + ], + [ + "▁({", + "\\" + ], + [ + "▁", + "({\\" + ], + [ + "▁Har", + "bor" + ], + [ + "ps", + "z" + ], + [ + "p", + "sz" + ], + [ + "▁FI", + "FA" + ], + [ + "_*", + 
"*" + ], + [ + "_", + "**" + ], + [ + "em", + "or" + ], + [ + "e", + "mor" + ], + [ + "▁", + "▁" + ], + [ + "▁▁", + "▁▁" + ], + [ + "▁▁▁", + "▁" + ], + [ + "▁", + "▁▁▁" + ], + [ + "▁▁", + "▁▁▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁▁▁" + ], + [ + "▁▁▁▁▁", + "▁▁▁" + ], + [ + "▁▁▁▁▁▁", + "▁▁" + ], + [ + "▁▁▁", + "▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁", + "▁" + ], + [ + "▁", + "▁▁▁▁▁▁▁" + ], + [ + "▁▁", + "▁▁▁" + ], + [ + "▁▁▁▁", + "▁" + ], + [ + "▁▁▁", + "▁▁" + ], + [ + "▁", + "▁▁▁▁" + ], + [ + "▁▁", + "▁▁▁▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁", + "▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁", + "▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁", + "▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁▁", + "▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁▁▁", + "▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁", + "▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁▁▁▁", + "▁▁" + ], + [ + "▁▁▁", + "▁▁▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁", + "▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁", + "▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁", + "▁▁▁▁▁" + ], + [ + "▁▁", + "▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁" + ], + [ + "▁▁▁▁▁", + "▁" + ], + [ + "▁▁▁", + "▁▁▁" + ], + [ + "▁", + "▁▁▁▁▁" + ], + [ + "▁▁", + "▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁", + "▁▁▁▁" + ], + [ + "▁▁▁▁▁", + "▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁", + "▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁", + "▁▁" + ], + [ + "▁▁▁", + "▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁", + "▁▁▁" + ], + [ + "▁▁▁▁▁▁▁", + "▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁", + "▁" + ], + [ + "▁", + "▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁", + "▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁", + "▁▁▁▁▁" + ], + [ + "▁▁▁▁▁", + "▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁", + "▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁▁", + "▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁", + "▁▁▁" + ], + [ + "▁▁▁", + "▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁", + "▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁", + "▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁", + "▁▁" + ], + [ + "▁", + "▁▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁", + "▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁", + "▁▁" + ], + [ + "▁▁▁▁▁", + "▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁", + "▁▁▁▁" + ], + [ + "▁▁▁", + "▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁", + "▁" + ], + [ + "▁▁▁▁▁▁▁", + "▁▁▁" + ], + [ + "▁", + "▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁", + "▁▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁", + "▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁", + "▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁", + "▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁▁", + "▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁▁▁", + "▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁", + "▁▁▁▁" + ], + [ + "▁▁▁", + "▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁", + "▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁", + "▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁▁", + "▁▁▁" + ], + [ + "▁", + "▁▁▁▁▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁", + "▁" + ], + [ + "▁", + "▁▁" + ], + [ + "▁▁", + "▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁", + "▁" + ], + [ + "▁▁▁▁▁", + "▁▁▁▁" + ], + [ + "▁▁▁▁▁▁", + "▁▁▁" + ], + [ + "▁▁▁", + "▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁", + "▁▁" + ], + [ + "▁", + "▁▁▁▁▁▁▁▁" + ], + [ + "▁▁", + "▁▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁▁" + ], + [ + "▁▁▁▁▁", + "▁▁" + ], + [ + "▁▁▁▁▁▁", + "▁" + ], + [ + "▁▁▁", + "▁▁▁▁" + ], + [ + "▁", + "▁▁▁▁▁▁" + ], + [ + "▁▁", + "▁▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁", + "▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁", + "▁▁▁" + ], + [ + "▁▁▁▁▁", + "▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁", + "▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁▁", + "▁" + ], + [ + "▁▁▁", + "▁▁▁▁▁▁▁▁" + ], + [ + "▁▁▁▁▁▁▁▁▁", + "▁▁" + ], + [ + "▁▁▁▁▁▁▁", + "▁▁▁▁" + ], + [ + "▁", + "▁▁▁▁▁▁▁▁▁▁" + ] + ] + } +} \ No newline at end of file diff --git a/tokenizer_config.json b/tokenizer_config.json new file mode 100644 index 0000000000000000000000000000000000000000..79c3f05ed38de2af9af5843fef6b7a91bfbb2b26 --- /dev/null +++ b/tokenizer_config.json @@ -0,0 +1,44 @@ +{ + 
"add_bos_token": true, + "add_eos_token": false, + "add_prefix_space": null, + "added_tokens_decoder": { + "0": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "1": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "2": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + } + }, + "additional_special_tokens": [], + "bos_token": "", + "clean_up_tokenization_spaces": false, + "eos_token": "", + "extra_special_tokens": {}, + "legacy": false, + "model_max_length": 1000000000000000019884624838656, + "pad_token": "", + "sp_model_kwargs": {}, + "spaces_between_special_tokens": false, + "tokenizer_class": "LlamaTokenizer", + "unk_token": "", + "use_default_system_prompt": false +} diff --git a/train.sh b/train.sh new file mode 100644 index 0000000000000000000000000000000000000000..c1973fd372c0405eeb5bcfc06a614d87746cc5dc --- /dev/null +++ b/train.sh @@ -0,0 +1,191 @@ +args=$@ +for arg in $args; do + eval "$arg" +done + +echo "model: ${model:=mistralai/Mistral-7B-v0.1}" +echo "tokenizer: ${tokenizer:=mistralai/Mistral-7B-v0.1}" +echo "project: ${project:=fla}" +echo "type: ${type:=gla}" +echo "data: ${data:=}" +echo "name: ${name:=}" +echo "cache: ${cache:=}" +echo "seed: ${seed:=42}" +echo "context: ${context:=2048}" +echo "steps: ${steps:=0}" +echo "save: ${save:=2048}" +echo "limit: ${limit:=16}" +echo "preprocessing: ${preprocessing:=32}" +echo "workers: ${workers:=32}" +echo "logging: ${logging:=32}" +echo "config: ${config:=configs/deepspeed.yaml}" +echo "push: ${push:=False}" + +echo "lr: ${lr:=3e-4}" +echo "scheduler: ${scheduler:=cosine_with_min_lr}" +echo "epochs: ${epochs:=1}" +echo "optim: ${optim:=adamw_torch_fused}" +echo "decay: ${decay:=0.01}" +echo "beta1: ${beta1:=0.9}" +echo "beta2: ${beta2:=0.95}" +echo "norm: ${norm:=1.0}" +echo "batch: ${batch:=32}" +echo "update: ${update:=4}" +echo "warmup: ${warmup:=512}" +echo "path: ${path:=}" +echo "checkpoint: ${checkpoint:=}" +echo "node: ${node:=}" +echo "rank: ${rank:=}" +echo "ip: ${ip:=}" +echo "port: ${port:=}" +echo "nodes: ${nodes:=1}" + +params="--model_name_or_path $model \ + --tokenizer $tokenizer \ + --use_fast_tokenizer \ + --do_train \ + --dataset $data \ + --context_length $context \ + --streaming \ + --preprocessing_num_workers $preprocessing \ + --dataloader_num_workers $workers \ + --dataloader_prefetch_factor 2 \ + --ignore_data_skip \ + --output_dir $path \ + --overwrite_output_dir \ + --logging_steps $logging \ + --include_num_input_tokens_seen \ + --save_steps $save \ + --save_total_limit $limit \ + --learning_rate $lr \ + --lr_scheduler_type $scheduler \ + --warmup_steps $warmup \ + --optim $optim \ + --weight_decay $decay \ + --adam_beta1=$beta1 \ + --adam_beta2=$beta2 \ + --max_grad_norm $norm \ + --num_train_epochs $epochs \ + --per_device_train_batch_size $batch \ + --gradient_accumulation_steps $update \ + --seed $seed \ + --logging_steps $logging \ + --push_to_hub $push \ + --bf16" + +if [ $steps -gt 0 ]; then + params+=" --max_steps $steps" +fi + +if [ "$name" != "" ]; then + params+=" --dataset_name $name" +fi +if [ "$cache" != "" ]; then + params+=" --cache_dir $cache" +fi +if [ "$checkpoint" != "" ]; then + params+=" --resume_from_checkpoint $checkpoint" +fi +if [ "$WANDB_DISABLED" != "true" ]; then + params+=" --report_to wandb \ + --run_name $type.$(basename $path)" +else + 
params+=" --report_to none" +fi + +NUM_GPUS=$(nvidia-smi --list-gpus | wc -l) +echo "Launching training..." +accelerate_params="" +if [ "$rank" != "" ]; then + accelerate_params+=" --machine_rank $rank \ + --num_processes $((nodes * $NUM_GPUS)) \ + --num_machines $nodes \ + --main_process_ip $ip \ + --main_process_port $port \ + --same_network" +fi + +if [[ $config == *"deepspeed"* ]]; then +cat < "configs/ds_config.json" +{ + "train_batch_size": "auto", + "train_micro_batch_size_per_gpu": "auto", + "gradient_accumulation_steps": "auto", + "gradient_clipping": "auto", + "zero_allow_untested_optimizer": true, + "bf16": { + "enabled": true + }, + "zero_optimization": { + "stage": 2, + "allgather_partitions": true, + "allgather_bucket_size": 5e8, + "reduce_scatter": true, + "reduce_bucket_size": 5e8, + "overlap_comm": false, + "contiguous_gradients": true + } +} +EOF +cat < $config +compute_environment: LOCAL_MACHINE +distributed_type: DEEPSPEED +deepspeed_config: + deepspeed_config_file: configs/ds_config.json + zero3_init_flag: true +machine_rank: 0 +main_training_function: main +num_machines: 1 +num_processes: $NUM_GPUS +use_cpu: false +EOF +fi +if [[ $config == *"fsdp"* ]]; then +cat < $config +compute_environment: LOCAL_MACHINE +distributed_type: FSDP +fsdp_config: + fsdp_auto_wrap_policy: TRANSFORMER_BASED_WRAP + fsdp_backward_prefetch: BACKWARD_PRE + fsdp_forward_prefetch: false + fsdp_cpu_ram_efficient_loading: true + fsdp_offload_params: false + fsdp_sharding_strategy: HYBRID_SHARD_ZERO2 + fsdp_state_dict_type: SHARDED_STATE_DICT + fsdp_sync_module_states: true + fsdp_use_orig_params: true +machine_rank: 0 +main_training_function: main +mixed_precision: bf16 +num_machines: $nodes +num_processes: $((nodes * $NUM_GPUS)) +rdzv_backend: static +same_network: true +tpu_env: [] +tpu_use_cluster: false +tpu_use_sudo: false +use_cpu: false +EOF +fi + +cat $config + +set -x +mkdir -p $path +cp * $path +cp -r configs $path +cp -r flame $path +cp -r ../fla $path + +export TRANSFORMERS_OFFLINE=1 +export HF_DATASETS_OFFLINE=1 +if [ "$date" == "" ]; then + date=$(date +%Y%m%d%H%M) +fi +export WANDB_RESUME=allow +export WANDB_NAME="$type.$(basename $path)" +export WANDB_PROJECT=$project +export WANDB_RUN_ID="$WANDB_NAME-$date" +accelerate launch $accelerate_params --config_file $config run.py $params + +echo "RUNNING DONE!" 
diff --git a/trainer_log.jsonl b/trainer_log.jsonl new file mode 100644 index 0000000000000000000000000000000000000000..bd0398a07fabe38267dd97577ec0e6cda8031fa0 --- /dev/null +++ b/trainer_log.jsonl @@ -0,0 +1,158 @@ +{"current_steps": 32, "total_steps": 20000, "loss": 10.177, "eval_loss": null, "predict_loss": null, "learning_rate": 4.7999999999999994e-05, "epoch": 0.0018226348464999715, "percentage": 0.16} +{"current_steps": 64, "total_steps": 20000, "loss": 9.5999, "eval_loss": null, "predict_loss": null, "learning_rate": 9.599999999999999e-05, "epoch": 0.003645269692999943, "percentage": 0.32} +{"current_steps": 96, "total_steps": 20000, "loss": 8.7965, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00014399999999999998, "epoch": 0.005467904539499914, "percentage": 0.48} +{"current_steps": 128, "total_steps": 20000, "loss": 7.8745, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00019199999999999998, "epoch": 0.007290539385999886, "percentage": 0.64} +{"current_steps": 160, "total_steps": 20000, "loss": 7.1177, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00023999999999999998, "epoch": 0.009113174232499858, "percentage": 0.8} +{"current_steps": 192, "total_steps": 20000, "loss": 6.7502, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028799999999999995, "epoch": 0.010935809078999829, "percentage": 0.96} +{"current_steps": 224, "total_steps": 20000, "loss": 6.521, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002999990211974405, "epoch": 0.012758443925499801, "percentage": 1.12} +{"current_steps": 256, "total_steps": 20000, "loss": 6.2142, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029999467099246324, "epoch": 0.014581078771999772, "percentage": 1.28} +{"current_steps": 288, "total_steps": 20000, "loss": 6.0513, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029998684074125604, "epoch": 0.016403713618499745, "percentage": 1.44} +{"current_steps": 320, "total_steps": 20000, "loss": 6.0916, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029997553064567596, "epoch": 0.018226348464999716, "percentage": 1.6} +{"current_steps": 352, "total_steps": 20000, "loss": 5.7273, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002999607409972875, "epoch": 0.020048983311499687, "percentage": 1.76} +{"current_steps": 384, "total_steps": 20000, "loss": 5.8725, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002999424721773551, "epoch": 0.021871618157999657, "percentage": 1.92} +{"current_steps": 416, "total_steps": 20000, "loss": 5.5581, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029992072465683314, "epoch": 0.02369425300449963, "percentage": 2.08} +{"current_steps": 448, "total_steps": 20000, "loss": 5.5562, "eval_loss": null, "predict_loss": null, "learning_rate": 0.000299895498996354, "epoch": 0.025516887850999603, "percentage": 2.24} +{"current_steps": 480, "total_steps": 20000, "loss": 5.5714, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002998667958462134, "epoch": 0.027339522697499574, "percentage": 2.4} +{"current_steps": 512, "total_steps": 20000, "loss": 5.5991, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029983461594635383, "epoch": 0.029162157543999544, "percentage": 2.56} +{"current_steps": 544, "total_steps": 20000, "loss": 5.4036, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029979896012634534, "epoch": 0.030984792390499515, "percentage": 2.72} +{"current_steps": 576, 
"total_steps": 20000, "loss": 5.4731, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002997598293053643, "epoch": 0.03280742723699949, "percentage": 2.88} +{"current_steps": 608, "total_steps": 20000, "loss": 5.1043, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002997172244921695, "epoch": 0.03463006208349946, "percentage": 3.04} +{"current_steps": 640, "total_steps": 20000, "loss": 5.1599, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002996711467850762, "epoch": 0.03645269692999943, "percentage": 3.2} +{"current_steps": 672, "total_steps": 20000, "loss": 5.1653, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029962159737192815, "epoch": 0.0382753317764994, "percentage": 3.36} +{"current_steps": 704, "total_steps": 20000, "loss": 5.3389, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029956857753006636, "epoch": 0.04009796662299937, "percentage": 3.52} +{"current_steps": 736, "total_steps": 20000, "loss": 5.1364, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029951208862629666, "epoch": 0.041920601469499344, "percentage": 3.68} +{"current_steps": 768, "total_steps": 20000, "loss": 5.1161, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029945213211685426, "epoch": 0.043743236315999315, "percentage": 3.84} +{"current_steps": 800, "total_steps": 20000, "loss": 5.0832, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002993887095473664, "epoch": 0.045565871162499286, "percentage": 4.0} +{"current_steps": 832, "total_steps": 20000, "loss": 5.1319, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002993218225528122, "epoch": 0.04738850600899926, "percentage": 4.16} +{"current_steps": 864, "total_steps": 20000, "loss": 5.1314, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029925147285748055, "epoch": 0.049211140855499234, "percentage": 4.32} +{"current_steps": 896, "total_steps": 20000, "loss": 5.0525, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002991776622749261, "epoch": 0.051033775701999205, "percentage": 4.48} +{"current_steps": 928, "total_steps": 20000, "loss": 4.9283, "eval_loss": null, "predict_loss": null, "learning_rate": 0.000299100392707922, "epoch": 0.052856410548499176, "percentage": 4.64} +{"current_steps": 960, "total_steps": 20000, "loss": 4.8773, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029901966614841115, "epoch": 0.05467904539499915, "percentage": 4.8} +{"current_steps": 992, "total_steps": 20000, "loss": 4.8235, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002989354846774545, "epoch": 0.05650168024149912, "percentage": 4.96} +{"current_steps": 1024, "total_steps": 20000, "loss": 4.9179, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002988478504651778, "epoch": 0.05832431508799909, "percentage": 5.12} +{"current_steps": 1056, "total_steps": 20000, "loss": 4.8357, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002987567657707157, "epoch": 0.06014694993449906, "percentage": 5.28} +{"current_steps": 1088, "total_steps": 20000, "loss": 4.7351, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029866223294215287, "epoch": 0.06196958478099903, "percentage": 5.44} +{"current_steps": 1120, "total_steps": 20000, "loss": 4.6935, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002985642544164642, "epoch": 0.06379221962749901, "percentage": 5.6} +{"current_steps": 1152, "total_steps": 20000, "loss": 4.7992, "eval_loss": null, 
"predict_loss": null, "learning_rate": 0.0002984628327194516, "epoch": 0.06561485447399898, "percentage": 5.76} +{"current_steps": 1184, "total_steps": 20000, "loss": 4.6467, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029835797046567897, "epoch": 0.06743748932049895, "percentage": 5.92} +{"current_steps": 1216, "total_steps": 20000, "loss": 4.7015, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029824967035840485, "epoch": 0.06926012416699892, "percentage": 6.08} +{"current_steps": 1248, "total_steps": 20000, "loss": 4.8905, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002981379351895126, "epoch": 0.07108275901349889, "percentage": 6.24} +{"current_steps": 1280, "total_steps": 20000, "loss": 4.7762, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002980227678394385, "epoch": 0.07290539385999886, "percentage": 6.4} +{"current_steps": 1312, "total_steps": 20000, "loss": 4.7081, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002979041712770977, "epoch": 0.07472802870649883, "percentage": 6.56} +{"current_steps": 1344, "total_steps": 20000, "loss": 4.8298, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002977821485598071, "epoch": 0.0765506635529988, "percentage": 6.72} +{"current_steps": 1376, "total_steps": 20000, "loss": 4.8116, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029765670283320725, "epoch": 0.07837329839949878, "percentage": 6.88} +{"current_steps": 1408, "total_steps": 20000, "loss": 4.7258, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029752783733118086, "epoch": 0.08019593324599875, "percentage": 7.04} +{"current_steps": 1440, "total_steps": 20000, "loss": 4.6646, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029739555537576946, "epoch": 0.08201856809249872, "percentage": 7.2} +{"current_steps": 1472, "total_steps": 20000, "loss": 4.6206, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002972598603770878, "epoch": 0.08384120293899869, "percentage": 7.36} +{"current_steps": 1504, "total_steps": 20000, "loss": 4.5949, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002971207558332359, "epoch": 0.08566383778549866, "percentage": 7.52} +{"current_steps": 1536, "total_steps": 20000, "loss": 4.5162, "eval_loss": null, "predict_loss": null, "learning_rate": 0.000296978245330209, "epoch": 0.08748647263199863, "percentage": 7.68} +{"current_steps": 1568, "total_steps": 20000, "loss": 4.697, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029683233254180504, "epoch": 0.0893091074784986, "percentage": 7.84} +{"current_steps": 1600, "total_steps": 20000, "loss": 4.8891, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002966830212295299, "epoch": 0.09113174232499857, "percentage": 8.0} +{"current_steps": 1632, "total_steps": 20000, "loss": 4.5456, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029653031524250043, "epoch": 0.09295437717149854, "percentage": 8.16} +{"current_steps": 1664, "total_steps": 20000, "loss": 4.4857, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002963742185173454, "epoch": 0.09477701201799851, "percentage": 8.32} +{"current_steps": 1696, "total_steps": 20000, "loss": 4.4269, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029621473507810374, "epoch": 0.09659964686449848, "percentage": 8.48} +{"current_steps": 1728, "total_steps": 20000, "loss": 4.5971, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002960518690361209, 
"epoch": 0.09842228171099847, "percentage": 8.64} +{"current_steps": 1760, "total_steps": 20000, "loss": 4.5318, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002958856245899433, "epoch": 0.10024491655749844, "percentage": 8.8} +{"current_steps": 1792, "total_steps": 20000, "loss": 4.6019, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002957160060252092, "epoch": 0.10206755140399841, "percentage": 8.96} +{"current_steps": 1824, "total_steps": 20000, "loss": 4.5667, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029554301771453904, "epoch": 0.10389018625049838, "percentage": 9.12} +{"current_steps": 1856, "total_steps": 20000, "loss": 4.4254, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029536666411742233, "epoch": 0.10571282109699835, "percentage": 9.28} +{"current_steps": 1888, "total_steps": 20000, "loss": 4.5096, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002951869497801027, "epoch": 0.10753545594349832, "percentage": 9.44} +{"current_steps": 1920, "total_steps": 20000, "loss": 4.3266, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029500387933546095, "epoch": 0.1093580907899983, "percentage": 9.6} +{"current_steps": 1952, "total_steps": 20000, "loss": 4.2293, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029481745750289517, "epoch": 0.11118072563649826, "percentage": 9.76} +{"current_steps": 1984, "total_steps": 20000, "loss": 4.3786, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029462768908819953, "epoch": 0.11300336048299824, "percentage": 9.92} +{"current_steps": 2016, "total_steps": 20000, "loss": 4.5566, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002944345789834402, "epoch": 0.1148259953294982, "percentage": 10.08} +{"current_steps": 2048, "total_steps": 20000, "loss": 4.3809, "eval_loss": null, "predict_loss": null, "learning_rate": 0.000294238132166829, "epoch": 0.11664863017599818, "percentage": 10.24} +{"current_steps": 2080, "total_steps": 20000, "loss": 4.3308, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029403835370259574, "epoch": 0.11847126502249815, "percentage": 10.4} +{"current_steps": 2112, "total_steps": 20000, "loss": 4.2377, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029383524874085685, "epoch": 0.12029389986899812, "percentage": 10.56} +{"current_steps": 2144, "total_steps": 20000, "loss": 4.3365, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002936288225174832, "epoch": 0.12211653471549809, "percentage": 10.72} +{"current_steps": 2176, "total_steps": 20000, "loss": 4.4236, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029341908035396475, "epoch": 0.12393916956199806, "percentage": 10.88} +{"current_steps": 2208, "total_steps": 20000, "loss": 4.4022, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002932060276572737, "epoch": 0.12576180440849805, "percentage": 11.04} +{"current_steps": 2240, "total_steps": 20000, "loss": 4.3397, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002929896699197249, "epoch": 0.12758443925499802, "percentage": 11.2} +{"current_steps": 2272, "total_steps": 20000, "loss": 4.3137, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029277001271883426, "epoch": 0.129407074101498, "percentage": 11.36} +{"current_steps": 2304, "total_steps": 20000, "loss": 4.3309, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002925470617171751, "epoch": 0.13122970894799796, "percentage": 11.52} 
+{"current_steps": 2336, "total_steps": 20000, "loss": 4.2041, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002923208226622319, "epoch": 0.13305234379449793, "percentage": 11.68} +{"current_steps": 2368, "total_steps": 20000, "loss": 4.0959, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029209130138625256, "epoch": 0.1348749786409979, "percentage": 11.84} +{"current_steps": 2400, "total_steps": 20000, "loss": 4.2376, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029185850380609757, "epoch": 0.13669761348749787, "percentage": 12.0} +{"current_steps": 2432, "total_steps": 20000, "loss": 4.0678, "eval_loss": null, "predict_loss": null, "learning_rate": 0.000291622435923088, "epoch": 0.13852024833399784, "percentage": 12.16} +{"current_steps": 2464, "total_steps": 20000, "loss": 4.2285, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002913831038228502, "epoch": 0.1403428831804978, "percentage": 12.32} +{"current_steps": 2496, "total_steps": 20000, "loss": 4.1923, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029114051367515944, "epoch": 0.14216551802699778, "percentage": 12.48} +{"current_steps": 2528, "total_steps": 20000, "loss": 4.1845, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002908946717337805, "epoch": 0.14398815287349775, "percentage": 12.64} +{"current_steps": 2560, "total_steps": 20000, "loss": 4.2681, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029064558433630674, "epoch": 0.14581078771999773, "percentage": 12.8} +{"current_steps": 2592, "total_steps": 20000, "loss": 4.1535, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00029039325790399656, "epoch": 0.1476334225664977, "percentage": 12.96} +{"current_steps": 2624, "total_steps": 20000, "loss": 4.1535, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002901376989416077, "epoch": 0.14945605741299767, "percentage": 13.12} +{"current_steps": 2656, "total_steps": 20000, "loss": 4.3691, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028987891403723, "epoch": 0.15127869225949764, "percentage": 13.28} +{"current_steps": 2688, "total_steps": 20000, "loss": 4.1536, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002896169098621151, "epoch": 0.1531013271059976, "percentage": 13.44} +{"current_steps": 2720, "total_steps": 20000, "loss": 4.1119, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028935169317050475, "epoch": 0.15492396195249758, "percentage": 13.6} +{"current_steps": 2752, "total_steps": 20000, "loss": 4.0028, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002890832707994566, "epoch": 0.15674659679899755, "percentage": 13.76} +{"current_steps": 2784, "total_steps": 20000, "loss": 4.2726, "eval_loss": null, "predict_loss": null, "learning_rate": 0.000288811649668668, "epoch": 0.15856923164549752, "percentage": 13.92} +{"current_steps": 2816, "total_steps": 20000, "loss": 3.8537, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028853683678029755, "epoch": 0.1603918664919975, "percentage": 14.08} +{"current_steps": 2848, "total_steps": 20000, "loss": 3.9781, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028825883921878437, "epoch": 0.16221450133849746, "percentage": 14.24} +{"current_steps": 2880, "total_steps": 20000, "loss": 4.2012, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028797766415066613, "epoch": 0.16403713618499743, "percentage": 14.4} +{"current_steps": 2912, "total_steps": 20000, 
"loss": 4.2413, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028769331882439364, "epoch": 0.1658597710314974, "percentage": 14.56} +{"current_steps": 2944, "total_steps": 20000, "loss": 4.1706, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028740581057014417, "epoch": 0.16768240587799738, "percentage": 14.72} +{"current_steps": 2976, "total_steps": 20000, "loss": 3.9725, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002871151467996327, "epoch": 0.16950504072449735, "percentage": 14.88} +{"current_steps": 3008, "total_steps": 20000, "loss": 4.05, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028682133500592056, "epoch": 0.17132767557099732, "percentage": 15.04} +{"current_steps": 3040, "total_steps": 20000, "loss": 4.0179, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028652438276322256, "epoch": 0.1731503104174973, "percentage": 15.2} +{"current_steps": 3072, "total_steps": 20000, "loss": 4.1675, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002862242977267114, "epoch": 0.17497294526399726, "percentage": 15.36} +{"current_steps": 3104, "total_steps": 20000, "loss": 4.1781, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002859210876323207, "epoch": 0.17679558011049723, "percentage": 15.52} +{"current_steps": 3136, "total_steps": 20000, "loss": 4.1132, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028561476029654524, "epoch": 0.1786182149569972, "percentage": 15.68} +{"current_steps": 3168, "total_steps": 20000, "loss": 4.0046, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002853053236162396, "epoch": 0.18044084980349717, "percentage": 15.84} +{"current_steps": 3200, "total_steps": 20000, "loss": 4.0622, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028499278556841465, "epoch": 0.18226348464999714, "percentage": 16.0} +{"current_steps": 3232, "total_steps": 20000, "loss": 4.043, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002846771542100318, "epoch": 0.1840861194964971, "percentage": 16.16} +{"current_steps": 3264, "total_steps": 20000, "loss": 3.9258, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002843584376777953, "epoch": 0.18590875434299708, "percentage": 16.32} +{"current_steps": 3296, "total_steps": 20000, "loss": 4.036, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002840366441879425, "epoch": 0.18773138918949706, "percentage": 16.48} +{"current_steps": 3328, "total_steps": 20000, "loss": 3.9786, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028371178203603236, "epoch": 0.18955402403599703, "percentage": 16.64} +{"current_steps": 3360, "total_steps": 20000, "loss": 3.8724, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002833838595967309, "epoch": 0.191376658882497, "percentage": 16.8} +{"current_steps": 3392, "total_steps": 20000, "loss": 3.9703, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028305288532359585, "epoch": 0.19319929372899697, "percentage": 16.96} +{"current_steps": 3424, "total_steps": 20000, "loss": 3.9662, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002827188677488587, "epoch": 0.19502192857549694, "percentage": 17.12} +{"current_steps": 3456, "total_steps": 20000, "loss": 4.0462, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028238181548320457, "epoch": 0.19684456342199694, "percentage": 17.28} +{"current_steps": 3488, "total_steps": 20000, "loss": 4.0952, "eval_loss": null, 
"predict_loss": null, "learning_rate": 0.0002820417372155502, "epoch": 0.1986671982684969, "percentage": 17.44} +{"current_steps": 3520, "total_steps": 20000, "loss": 3.9087, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002816986417128201, "epoch": 0.20048983311499688, "percentage": 17.6} +{"current_steps": 3552, "total_steps": 20000, "loss": 4.0715, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028135253781972063, "epoch": 0.20231246796149685, "percentage": 17.76} +{"current_steps": 3584, "total_steps": 20000, "loss": 4.0222, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028100343445851164, "epoch": 0.20413510280799682, "percentage": 17.92} +{"current_steps": 3616, "total_steps": 20000, "loss": 4.0121, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00028065134062877685, "epoch": 0.2059577376544968, "percentage": 18.08} +{"current_steps": 3648, "total_steps": 20000, "loss": 3.8264, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002802962654071917, "epoch": 0.20778037250099676, "percentage": 18.24} +{"current_steps": 3680, "total_steps": 20000, "loss": 3.866, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027993821794728915, "epoch": 0.20960300734749673, "percentage": 18.4} +{"current_steps": 3712, "total_steps": 20000, "loss": 3.9637, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027957720747922405, "epoch": 0.2114256421939967, "percentage": 18.56} +{"current_steps": 3744, "total_steps": 20000, "loss": 3.9347, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002792132433095351, "epoch": 0.21324827704049668, "percentage": 18.72} +{"current_steps": 3776, "total_steps": 20000, "loss": 3.7413, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002788463348209047, "epoch": 0.21507091188699665, "percentage": 18.88} +{"current_steps": 3808, "total_steps": 20000, "loss": 3.9798, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027847649147191736, "epoch": 0.21689354673349662, "percentage": 19.04} +{"current_steps": 3840, "total_steps": 20000, "loss": 3.9606, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027810372279681576, "epoch": 0.2187161815799966, "percentage": 19.2} +{"current_steps": 3872, "total_steps": 20000, "loss": 4.0733, "eval_loss": null, "predict_loss": null, "learning_rate": 0.000277728038405255, "epoch": 0.22053881642649656, "percentage": 19.36} +{"current_steps": 3904, "total_steps": 20000, "loss": 3.9827, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002773494479820547, "epoch": 0.22236145127299653, "percentage": 19.52} +{"current_steps": 3936, "total_steps": 20000, "loss": 3.9574, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027696796128694965, "epoch": 0.2241840861194965, "percentage": 19.68} +{"current_steps": 3968, "total_steps": 20000, "loss": 3.9315, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002765835881543379, "epoch": 0.22600672096599647, "percentage": 19.84} +{"current_steps": 4000, "total_steps": 20000, "loss": 3.9058, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002761963384930274, "epoch": 0.22782935581249644, "percentage": 20.0} +{"current_steps": 4032, "total_steps": 20000, "loss": 4.0618, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027580622228598055, "epoch": 0.2296519906589964, "percentage": 20.16} +{"current_steps": 4064, "total_steps": 20000, "loss": 3.8876, "eval_loss": null, "predict_loss": null, "learning_rate": 
0.0002754132495900569, "epoch": 0.23147462550549638, "percentage": 20.32} +{"current_steps": 4096, "total_steps": 20000, "loss": 3.8588, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027501743053575365, "epoch": 0.23329726035199636, "percentage": 20.48} +{"current_steps": 4128, "total_steps": 20000, "loss": 3.9312, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027461877532694476, "epoch": 0.23511989519849633, "percentage": 20.64} +{"current_steps": 4160, "total_steps": 20000, "loss": 4.0081, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027421729424061787, "epoch": 0.2369425300449963, "percentage": 20.8} +{"current_steps": 4192, "total_steps": 20000, "loss": 4.0165, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002738129976266092, "epoch": 0.23876516489149627, "percentage": 20.96} +{"current_steps": 4224, "total_steps": 20000, "loss": 3.8829, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027340589590733687, "epoch": 0.24058779973799624, "percentage": 21.12} +{"current_steps": 4256, "total_steps": 20000, "loss": 3.9222, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002729959995775323, "epoch": 0.2424104345844962, "percentage": 21.28} +{"current_steps": 4288, "total_steps": 20000, "loss": 3.9952, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027258331920396926, "epoch": 0.24423306943099618, "percentage": 21.44} +{"current_steps": 4320, "total_steps": 20000, "loss": 3.9364, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00027216786542519225, "epoch": 0.24605570427749615, "percentage": 21.6} +{"current_steps": 4352, "total_steps": 20000, "loss": 4.0225, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002717496489512413, "epoch": 0.24787833912399612, "percentage": 21.76} +{"current_steps": 4384, "total_steps": 20000, "loss": 3.9018, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002713286805633767, "epoch": 0.2497009739704961, "percentage": 21.92} +{"current_steps": 4416, "total_steps": 20000, "loss": 3.6686, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002709049711138003, "epoch": 0.2515236088169961, "percentage": 22.08} +{"current_steps": 4448, "total_steps": 20000, "loss": 3.778, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002704785315253767, "epoch": 0.25334624366349606, "percentage": 22.24} +{"current_steps": 4480, "total_steps": 20000, "loss": 4.031, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002700493727913506, "epoch": 0.25516887850999603, "percentage": 22.4} +{"current_steps": 4512, "total_steps": 20000, "loss": 3.7777, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002696175059750642, "epoch": 0.256991513356496, "percentage": 22.56} +{"current_steps": 4544, "total_steps": 20000, "loss": 3.8406, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026918294220967175, "epoch": 0.258814148202996, "percentage": 22.72} +{"current_steps": 4576, "total_steps": 20000, "loss": 3.9092, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026874569269785245, "epoch": 0.26063678304949595, "percentage": 22.88} +{"current_steps": 4608, "total_steps": 20000, "loss": 3.8948, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026830576871152167, "epoch": 0.2624594178959959, "percentage": 23.04} +{"current_steps": 4640, "total_steps": 20000, "loss": 3.8556, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026786318159154054, "epoch": 
0.2642820527424959, "percentage": 23.2} +{"current_steps": 4672, "total_steps": 20000, "loss": 3.7144, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002674179427474234, "epoch": 0.26610468758899586, "percentage": 23.36} +{"current_steps": 4704, "total_steps": 20000, "loss": 3.9987, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002669700636570438, "epoch": 0.26792732243549583, "percentage": 23.52} +{"current_steps": 4736, "total_steps": 20000, "loss": 3.8219, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002665195558663385, "epoch": 0.2697499572819958, "percentage": 23.68} +{"current_steps": 4768, "total_steps": 20000, "loss": 3.7573, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026606643098900997, "epoch": 0.27157259212849577, "percentage": 23.84} +{"current_steps": 4800, "total_steps": 20000, "loss": 3.9833, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002656107007062269, "epoch": 0.27339522697499574, "percentage": 24.0} +{"current_steps": 4832, "total_steps": 20000, "loss": 3.7396, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026515237676632295, "epoch": 0.2752178618214957, "percentage": 24.16} +{"current_steps": 4864, "total_steps": 20000, "loss": 3.9386, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002646914709844943, "epoch": 0.2770404966679957, "percentage": 24.32} +{"current_steps": 4896, "total_steps": 20000, "loss": 3.8202, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002642279952424945, "epoch": 0.27886313151449565, "percentage": 24.48} +{"current_steps": 4928, "total_steps": 20000, "loss": 3.7296, "eval_loss": null, "predict_loss": null, "learning_rate": 0.0002637619614883287, "epoch": 0.2806857663609956, "percentage": 24.64} +{"current_steps": 4960, "total_steps": 20000, "loss": 3.7928, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026329338173594516, "epoch": 0.2825084012074956, "percentage": 24.8} +{"current_steps": 4992, "total_steps": 20000, "loss": 3.7949, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026282226806492595, "epoch": 0.28433103605399557, "percentage": 24.96} +{"current_steps": 5024, "total_steps": 20000, "loss": 3.9344, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026234863262017535, "epoch": 0.28615367090049554, "percentage": 25.12} +{"current_steps": 5056, "total_steps": 20000, "loss": 4.0105, "eval_loss": null, "predict_loss": null, "learning_rate": 0.00026187248761160676, "epoch": 0.2879763057469955, "percentage": 25.28} diff --git a/training_args.bin b/training_args.bin new file mode 100644 index 0000000000000000000000000000000000000000..452a671eabb37184ab0f26f9feb4581ae82d4241 --- /dev/null +++ b/training_args.bin @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:82fc21ae7266a339008724c824c2a7f2caf20e4c733521d4c4161c114a54bd86 +size 7224