{"cells":[{"metadata":{},"cell_type":"markdown","source":"# Project Description"},{"metadata":{},"cell_type":"markdown","source":"## Overall objective"},{"metadata":{},"cell_type":"markdown","source":"In this notebook, I want to use two state of the art Natural Language Processing (NLP) techniques which have sort of revolutionalized the area of NLP in Deep Learning.\n\nThese techniques are as follows:\n\n1. BERT (Deep Bidirectional Transformers for Language Understanding)\n2. Fastai ULMFiT (Universal Language Model Fine-tuning for Text Classification)\n\nBoth these techniques are very advanced and very recent NLP techniques (BERT was introduced by Google in 2018). Both of them incorporate the methods of Transfer Learning which is quite cool and are pre-trained on large corpuses of Wikipedia articles. I wanted to compare the overall performance of these two techniques.\n\nI really like using Fastai for my deep learning projects and can't thank enough for this amazing community and our mentors - Jeremy & Rachael for creating few wonderful courses on the matters pertaining to Deep Learning. Therefore one of my aims to work on this project was to **integrate BERT with Fastai**. This means power of BERT combined with the simplicity of Fastai. It was not an easy task especially implementing Discriminative Learning Rate technique of Fastai in BERT modelling. \n\nIn my project, below article helped me in understanding few of these integration techniques and I would like to extend my gratidue to the writer of this article:\n\n[https://mlexplained.com/2019/05/13/a-tutorial-to-fine-tuning-bert-with-fast-ai/](http://)\n\n"},{"metadata":{},"cell_type":"markdown","source":"## Data"},{"metadata":{},"cell_type":"markdown","source":"In this project, we will use Jigsaw's Toxic Comments dataset which has categorized each text item into 6 classes -\n\n1. Toxic\n2. Severe Toxic\n3. Obscene\n4. Threat\n5. Insult\n6. 
Identity Hate\n\nThis is a **multi-label text classification challenge**."},{"metadata":{},"cell_type":"markdown","source":"# Importing Libraries & Data Preparation"},{"metadata":{"_uuid":"8f2839f25d086af736a60e9eeb907d3b93b6e0e5","_cell_guid":"b1076dfc-b9ad-4769-8c92-a6c4dae69d19","trusted":true},"cell_type":"code","source":"import numpy as np\nimport pandas as pd\n\nfrom pathlib import Path\nfrom typing import *\n\nimport torch\nimport torch.optim as optim\n\nimport gc\ngc.collect()","execution_count":1,"outputs":[{"output_type":"execute_result","execution_count":1,"data":{"text/plain":"0"},"metadata":{}}]},{"metadata":{},"cell_type":"markdown","source":"In this section, we will import the Fastai libraries and a few other important libraries for our task."},{"metadata":{"trusted":true},"cell_type":"code","source":"!pip install pretrainedmodels\n\n%reload_ext autoreload\n%autoreload 2\n%matplotlib inline\n\n!pip install fastai==1.0.52\nimport fastai\n\nfrom fastai import *\nfrom fastai.vision import *\nfrom fastai.text import *\n\nfrom torchvision.models import *\nimport pretrainedmodels\n\nfrom utils import *\nimport sys\n\nfrom fastai.callbacks.tracker import EarlyStoppingCallback\nfrom fastai.callbacks.tracker import SaveModelCallback","execution_count":2,"outputs":[{"output_type":"stream","text":"Successfully built pretrainedmodels\nSuccessfully installed pretrainedmodels-0.7.4\nSuccessfully installed fastai-1.0.52\n","name":"stdout"}]},{"metadata":{},"cell_type":"markdown","source":"Let's import Huggingface's \"pytorch-pretrained-bert\" package (since renamed to pytorch-transformers):\n\n[https://github.com/huggingface/pytorch-transformers](https://github.com/huggingface/pytorch-transformers)\n\nThis is a brilliant repository of several state-of-the-art, already pre-trained NLP models."},{"metadata":{"trusted":true},"cell_type":"code","source":"%%bash\npip install pytorch-pretrained-bert","execution_count":3,"outputs":[{"output_type":"stream","text":"Requirement already satisfied: pytorch-pretrained-bert in /opt/conda/lib/python3.6/site-packages (0.6.2)\n","name":"stdout"}]},{"metadata":{},"cell_type":"markdown","source":"BERT has several flavours when it comes to tokenization. 
For our modelling purposes, we will use the most common and standard variant, \"bert-base-uncased\".\n\nWe will name it bert_tok."},{"metadata":{"_cell_guid":"79c7e3d0-c299-4dcb-8224-4455121ee9b0","_uuid":"d629ff2d2480ee46fbb7e2d37f6b5fab8052498a","trusted":true},"cell_type":"code","source":"from pytorch_pretrained_bert import BertTokenizer\nbert_tok = BertTokenizer.from_pretrained(\n    \"bert-base-uncased\",\n)","execution_count":4,"outputs":[{"output_type":"stream","text":"100%|██████████| 231508/231508 [00:00<00:00, 6000104.62B/s]\n","name":"stderr"}]},{"metadata":{},"cell_type":"markdown","source":"As mentioned in the article linked in the first section, we will change Fastai's tokenizer to incorporate BertTokenizer. One important thing to note is that each sequence must begin with a [CLS] token and end with a [SEP] token, which is a requirement of BERT."},{"metadata":{"trusted":true},"cell_type":"code","source":"class FastAiBertTokenizer(BaseTokenizer):\n    \"\"\"Wrapper around BertTokenizer to be compatible with fast.ai\"\"\"\n    def __init__(self, tokenizer: BertTokenizer, max_seq_len: int=128, **kwargs):\n        self._pretrained_tokenizer = tokenizer\n        self.max_seq_len = max_seq_len\n\n    def __call__(self, *args, **kwargs):\n        return self\n\n    def tokenizer(self, t:str) -> List[str]:\n        \"\"\"Limits the maximum sequence length and adds the [CLS]/[SEP] special tokens\"\"\"\n        return [\"[CLS]\"] + self._pretrained_tokenizer.tokenize(t)[:self.max_seq_len - 2] + [\"[SEP]\"]","execution_count":5,"outputs":[]},{"metadata":{},"cell_type":"markdown","source":"Before we move further, let's have a look at the data we will be working with.\n\nWe will split the train data into two parts: train and validation. 
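The truncation-and-wrapping rule used by the tokenizer wrapper above can be sketched in isolation. In this minimal sketch, `ToyTokenizer` is a hypothetical whitespace stand-in for the real `BertTokenizer` (which does WordPiece splitting, so actual sub-tokens would differ); only the `[CLS]`/`[SEP]` wrapping and the `max_seq_len - 2` truncation mirror the wrapper:

```python
from typing import List

class ToyTokenizer:
    """Hypothetical stand-in for BertTokenizer: plain whitespace splitting."""
    def tokenize(self, t: str) -> List[str]:
        return t.lower().split()

def bert_style_tokenize(tokenizer, t: str, max_seq_len: int = 8) -> List[str]:
    # Reserve two slots for the special tokens, truncate the rest --
    # the same rule as in FastAiBertTokenizer.tokenizer above.
    return ["[CLS]"] + tokenizer.tokenize(t)[: max_seq_len - 2] + ["[SEP]"]

toks = bert_style_tokenize(ToyTokenizer(), "This comment is not toxic at all , honestly speaking")
print(toks)  # starts with [CLS], ends with [SEP], never longer than max_seq_len
```

Whatever the input length, every sequence comes out bounded by `max_seq_len` and bracketed by the two special tokens BERT expects.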
However, for the purpose of this project, we will not be using Test Data"},{"metadata":{"trusted":true},"cell_type":"code","source":"from sklearn.model_selection import train_test_split","execution_count":6,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"DATA_ROOT = Path(\"..\") / \"input\"\n\ntrain, test = [pd.read_csv(DATA_ROOT / fname) for fname in [\"train.csv\", \"test.csv\"]]\ntrain, val = train_test_split(train, shuffle=True, test_size=0.2, random_state=42)","execution_count":7,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"train.head()","execution_count":8,"outputs":[{"output_type":"execute_result","execution_count":8,"data":{"text/plain":"                      id      ...      identity_hate\n140030  ed56f082116dcbd0      ...                  0\n159124  f8e3cd98b63bf401      ...                  0\n60006   a09e1bcf10631f9a      ...                  0\n65432   af0ee0066c607eb8      ...                  0\n154979  b734772b1a807e09      ...                  
0\n\n[5 rows x 8 columns]","text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>id</th>\n      <th>comment_text</th>\n      <th>toxic</th>\n      <th>severe_toxic</th>\n      <th>obscene</th>\n      <th>threat</th>\n      <th>insult</th>\n      <th>identity_hate</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>140030</th>\n      <td>ed56f082116dcbd0</td>\n      <td>Grandma Terri Should Burn in Trash \\nGrandma T...</td>\n      <td>1</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>159124</th>\n      <td>f8e3cd98b63bf401</td>\n      <td>, 9 May 2009 (UTC)\\nIt would be easiest if you...</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>60006</th>\n      <td>a09e1bcf10631f9a</td>\n      <td>\"\\n\\nThe Objectivity of this Discussion is dou...</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>65432</th>\n      <td>af0ee0066c607eb8</td>\n      <td>Shelly Shock\\nShelly Shock is. . .( )</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>154979</th>\n      <td>b734772b1a807e09</td>\n      <td>I do not care. 
Refer to Ong Teng Cheong talk p...</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n  </tbody>\n</table>\n</div>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"test.head()","execution_count":9,"outputs":[{"output_type":"execute_result","execution_count":9,"data":{"text/plain":"                 id                                       comment_text\n0  00001cee341fdb12  Yo bitch Ja Rule is more succesful then you'll...\n1  0000247867823ef7  == From RfC == \\n\\n The title is fine as it is...\n2  00013b17ad220c46  \" \\n\\n == Sources == \\n\\n * Zawe Ashton on Lap...\n3  00017563c3f7919a  :If you have a look back at the source, the in...\n4  00017695ad8997eb          I don't anonymously edit articles at all.","text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>id</th>\n      <th>comment_text</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>0</th>\n      <td>00001cee341fdb12</td>\n      <td>Yo bitch Ja Rule is more succesful then you'll...</td>\n    </tr>\n    <tr>\n      <th>1</th>\n      <td>0000247867823ef7</td>\n      <td>== From RfC == \\n\\n The title is fine as it is...</td>\n    </tr>\n    <tr>\n      <th>2</th>\n      <td>00013b17ad220c46</td>\n      <td>\" \\n\\n == Sources == \\n\\n * Zawe Ashton on Lap...</td>\n    </tr>\n    <tr>\n      <th>3</th>\n      <td>00017563c3f7919a</td>\n      <td>:If you have a look back at the source, the in...</td>\n    </tr>\n    <tr>\n      <th>4</th>\n      <td>00017695ad8997eb</td>\n      <td>I don't anonymously edit articles at all.</td>\n    </tr>\n  
</tbody>\n</table>\n</div>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"val.head()","execution_count":10,"outputs":[{"output_type":"execute_result","execution_count":10,"data":{"text/plain":"                      id      ...      identity_hate\n119105  7ca72b5b9c688e9e      ...                  0\n131631  c03f72fd8f8bf54f      ...                  0\n125326  9e5b8e8fc1ff2e84      ...                  0\n111256  5332799e706665a6      ...                  0\n83590   dfa7d8f0b4366680      ...                  0\n\n[5 rows x 8 columns]","text/html":"<div>\n<style scoped>\n    .dataframe tbody tr th:only-of-type {\n        vertical-align: middle;\n    }\n\n    .dataframe tbody tr th {\n        vertical-align: top;\n    }\n\n    .dataframe thead th {\n        text-align: right;\n    }\n</style>\n<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th></th>\n      <th>id</th>\n      <th>comment_text</th>\n      <th>toxic</th>\n      <th>severe_toxic</th>\n      <th>obscene</th>\n      <th>threat</th>\n      <th>insult</th>\n      <th>identity_hate</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <th>119105</th>\n      <td>7ca72b5b9c688e9e</td>\n      <td>Geez, are you forgetful!  
We've already discus...</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>131631</th>\n      <td>c03f72fd8f8bf54f</td>\n      <td>Carioca RFA \\n\\nThanks for your support on my ...</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>125326</th>\n      <td>9e5b8e8fc1ff2e84</td>\n      <td>\"\\n\\n Birthday \\n\\nNo worries, It's what I do ...</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>111256</th>\n      <td>5332799e706665a6</td>\n      <td>Pseudoscience category? \\n\\nI'm assuming that ...</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n    <tr>\n      <th>83590</th>\n      <td>dfa7d8f0b4366680</td>\n      <td>(and if such phrase exists, it would be provid...</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n      <td>0</td>\n    </tr>\n  </tbody>\n</table>\n</div>"},"metadata":{}}]},{"metadata":{},"cell_type":"markdown","source":"In following code snippets, we need to wrap BERT vocab and BERT tokenizer with Fastai modules"},{"metadata":{"trusted":true},"cell_type":"code","source":"fastai_bert_vocab = Vocab(list(bert_tok.vocab.keys()))","execution_count":11,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"fastai_tokenizer = Tokenizer(tok_func=FastAiBertTokenizer(bert_tok, max_seq_len=256), pre_rules=[], post_rules=[])","execution_count":12,"outputs":[]},{"metadata":{},"cell_type":"markdown","source":"Now, we can create our Databunch. Important thing to note here is to use BERT Tokenizer, BERT Vocab. 
We also need to set include_bos and include_eos to False, since Fastai otherwise inserts its own default start/end tokens"},{"metadata":{"trusted":true},"cell_type":"code","source":"label_cols = [\"toxic\", \"severe_toxic\", \"obscene\", \"threat\", \"insult\", \"identity_hate\"]\n\ndatabunch_1 = TextDataBunch.from_df(\".\", train, val, \n                  tokenizer=fastai_tokenizer,\n                  vocab=fastai_bert_vocab,\n                  include_bos=False,\n                  include_eos=False,\n                  text_cols=\"comment_text\",\n                  label_cols=label_cols,\n                  bs=32,\n                  collate_fn=partial(pad_collate, pad_first=False, pad_idx=0),\n             )","execution_count":13,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"markdown","source":"Alternatively, we can pass our own list of Preprocessors to the databunch (this is effectively what is happening behind the scenes)"},{"metadata":{"trusted":true},"cell_type":"code","source":"class BertTokenizeProcessor(TokenizeProcessor):\n    def __init__(self, tokenizer):\n        super().__init__(tokenizer=tokenizer, include_bos=False, include_eos=False)\n\nclass BertNumericalizeProcessor(NumericalizeProcessor):\n    def __init__(self, *args, **kwargs):\n        super().__init__(*args, vocab=Vocab(list(bert_tok.vocab.keys())), **kwargs)\n\ndef get_bert_processor(tokenizer:Tokenizer=None, vocab:Vocab=None):\n    \"\"\"\n    Constructing preprocessors for BERT\n    We remove sos/eos tokens since we add them ourselves in the tokenizer.\n    We also use a custom vocabulary to match the numericalization with the original BERT model.\n    \"\"\"\n    return [BertTokenizeProcessor(tokenizer=tokenizer),\n            NumericalizeProcessor(vocab=vocab)]","execution_count":14,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"class BertDataBunch(TextDataBunch):\n    @classmethod\n    def from_df(cls, path:PathOrStr, train_df:DataFrame, valid_df:DataFrame, 
test_df:Optional[DataFrame]=None,\n                tokenizer:Tokenizer=None, vocab:Vocab=None, classes:Collection[str]=None, text_cols:IntsOrStrs=1,\n                label_cols:IntsOrStrs=0, label_delim:str=None, **kwargs) -> DataBunch:\n        \"Create a `TextDataBunch` from DataFrames.\"\n        p_kwargs, kwargs = split_kwargs_by_func(kwargs, get_bert_processor)\n        # use our custom processors while taking tokenizer and vocab as kwargs\n        processor = get_bert_processor(tokenizer=tokenizer, vocab=vocab, **p_kwargs)\n        if classes is None and is_listy(label_cols) and len(label_cols) > 1: classes = label_cols\n        src = ItemLists(path, TextList.from_df(train_df, path, cols=text_cols, processor=processor),\n                        TextList.from_df(valid_df, path, cols=text_cols, processor=processor))\n        src = src.label_for_lm() if cls==TextLMDataBunch else src.label_from_df(cols=label_cols, classes=classes)\n        if test_df is not None: src.add_test(TextList.from_df(test_df, path, cols=text_cols))\n        return src.databunch(**kwargs)","execution_count":15,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"# this will produce a virtually identical databunch to the code above\ndatabunch_2 = BertDataBunch.from_df(\".\", train_df=train, valid_df=val,\n                  tokenizer=fastai_tokenizer,\n                  vocab=fastai_bert_vocab,\n                  text_cols=\"comment_text\",\n                  label_cols=label_cols,\n                  bs=32,\n                  collate_fn=partial(pad_collate, pad_first=False, pad_idx=0),\n             )","execution_count":16,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"path=Path('../input/')","execution_count":17,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"databunch_2.show_batch()","execution_count":18,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table 
border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th>text</th>\n      <th>target</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>[CLS] \" plato not \" \" part of the soc ##ratic entourage \" \" or \" \" inner circle \" \" ? at the beginning of the plato and socrates section of the article , someone had said , \" \" plato made himself seem as though he were part of the soc ##ratic entourage but never says so explicitly . . . . in the apology , plato distances</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>[CLS] \" image : s _ via - cy ##rix _ iii - 600 ##m ##h ##z _ ( 133 ##x ##4 . 5 _ 2 . 0 ##v ) . jp ##g listed for del ##eti ##on dear up ##load ##er : the media file you uploaded as image : s _ via - cy ##rix _ iii - 600 ##m ##h ##z _ ( 133 ##x ##4 .</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>[CLS] hey jeff . . . you seem to have missed the point entirely . wikipedia ' s talk pages are for discussing the article and how it needs to be changed . given that fire ##fly is proving to be a failure at the box office by the well accepted formula of movie cost &gt; box office take , i am suggesting we start formula ##ting a way to</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>[CLS] \" hi , boon ##dock , welcome to wikipedia ! i hope you like this place — i sure do — and want to stay . before getting too in - depth , you may want to read about the five pillars of wikipedia and simplified rules ##et . if you need help on how to title new articles check out the naming conventions , and for help on</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>[CLS] polar ##is , wi ##ki does not prohibit the use of po ##v sources , that is after ##all how we get all our info anyway . what wi ##ki wants is to present po ##v material in np ##ov fashion which means citing the source and informing readers that this is the opinion or belief of the source . 
these sources have a different po ##v than most</td>\n      <td></td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"databunch_1.show_batch()","execution_count":19,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th>text</th>\n      <th>target</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>[CLS] \" af ##d nomination of harvey l . bass i ' ve nominated harvey l . bass , an article you created , for del ##eti ##on . we appreciate your contributions , but in this particular case i do not feel that harvey l . bass sat ##is ##fies wikipedia ' s criteria for inclusion ; i have explained why in the nomination space ( see also \"</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>[CLS] hi scorpion , . i ' m a bit aggravated as i provided a valid source from canada ' s walk of fame saying that ryan reynolds will not be inducted on october 1 , 2011 like it states in your edit . the way that you put is that we should ignore the source and just merge the columns on the night of october 1 , but according</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>[CLS] can you cite a rep ##utable scholarly source ( not other christian groups , which would of course reject this group , just as they often reject each other ) that states an assessment that pc is not a form of christianity . i do not think this fact is disputed unless there is a fringe view of what is means to be a christian , which exclude ##s</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>[CLS] \" welcome hello , and welcome to wikipedia ! thank you for your contributions . i hope you like the place and decide to stay . 
here are some pages you might like to see : the five pillars of wikipedia how to edit a page help pages tutor ##ial how to write a great article manual of style you are welcome to continue editing articles without logging in</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>[CLS] \" your donation will fund wi ##kia , inc . , which is not a charity . your non - profit donation will ultimately line the for - profit pockets of jimmy wales , amazon , google , the be ##sse ##mer partners , and other corporate ben ##ef ##icia ##ries . how ? wikipedia is a commercial traffic engine . as of december 2008 , there are over</td>\n      <td>toxic</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{},"cell_type":"markdown","source":"Both databunch_1 and databunch_2 can be used for modelling. In this project, we will use databunch_1, which is easier to create and use."},{"metadata":{},"cell_type":"markdown","source":"# BERT Model"},{"metadata":{"trusted":true},"cell_type":"code","source":"from pytorch_pretrained_bert.modeling import BertConfig, BertForSequenceClassification, BertForNextSentencePrediction, BertForMaskedLM\nbert_model_class = BertForSequenceClassification.from_pretrained('bert-base-uncased', num_labels=6)","execution_count":20,"outputs":[{"output_type":"stream","text":"100%|██████████| 407873900/407873900 [00:09<00:00, 42286350.63B/s]\n","name":"stderr"}]},{"metadata":{},"cell_type":"markdown","source":"The loss function to be used is Binary Cross Entropy with Logits (nn.BCEWithLogitsLoss), which is appropriate for multi-label targets"},{"metadata":{"trusted":true},"cell_type":"code","source":"loss_func = nn.BCEWithLogitsLoss()","execution_count":21,"outputs":[]},{"metadata":{},"cell_type":"markdown","source":"Considering this is a multi-label classification problem, we can't use simple accuracy as a metric here. 
Instead, we will use `accuracy_thresh` with a threshold of 0.25 as our metric."},{"metadata":{"trusted":true},"cell_type":"code","source":"acc_02 = partial(accuracy_thresh, thresh=0.25)","execution_count":22,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"model = bert_model_class","execution_count":23,"outputs":[]},{"metadata":{},"cell_type":"markdown","source":"Now, let's create the learner"},{"metadata":{"trusted":true},"cell_type":"code","source":"from fastai.callbacks import *\n\nlearner = Learner(\n    databunch_1, model,\n    loss_func=loss_func, model_dir='/temp/model', metrics=acc_02,\n)","execution_count":24,"outputs":[]},{"metadata":{},"cell_type":"markdown","source":"The function below splits the model into groups, which lets us apply discriminative learning, i.e. different learning rates and weight decays for different parts of the model."},{"metadata":{"trusted":true},"cell_type":"code","source":"def bert_clas_split(model) -> List[nn.Module]:\n    # Group BERT's sub-modules: embeddings, three equal chunks of\n    # encoder layers, the pooler, and the classification head.\n    bert = model.bert\n    embedder = bert.embeddings\n    pooler = bert.pooler\n    encoder = bert.encoder\n    classifier = [model.dropout, model.classifier]\n    n = len(encoder.layer)//3\n    print(n)\n    groups = [[embedder], list(encoder.layer[:n]), list(encoder.layer[n:2*n]), list(encoder.layer[2*n:]), [pooler], classifier]\n    return groups","execution_count":25,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"x = bert_clas_split(model)","execution_count":26,"outputs":[{"output_type":"stream","text":"4\n","name":"stdout"}]},{"metadata":{},"cell_type":"markdown","source":"Now, let's split the learner using these layer groups"},{"metadata":{"trusted":true},"cell_type":"code","source":"learner.split([x[0], x[1], x[2], x[3], x[5]])","execution_count":27,"outputs":[{"output_type":"execute_result","execution_count":27,"data":{"text/plain":"Learner(data=TextClasDataBunch;\n\nTrain: LabelList (127656 items)\nx: TextList\n[CLS] 
grandma terri should burn in trash grandma terri is trash . i hate grandma terri . f % % k her to hell ! 71 . 74 . 76 . 40 [SEP],[CLS] , 9 may 2009 ( utc ) it would be easiest if you were to admit to being a member of the involved portuguese lodge , and then there would be no requirement to acknowledge whether you had a previous account ( carlos bot ##el ##ho did not have a good record ) or not and i would then remove the sock ##pu ##ppet template as irrelevant . w ##p : co ##i permits people to edit those articles , such as ms ##ja ##pan does , but just means you have to be more careful in ensuring that references back your edit ##s and that np ##ov is upheld . 20 : 29 [SEP],[CLS] \" the object ##ivity of this discussion is doubtful ( non - existent ) ( 1 ) as indicated earlier , the section on marxist leaders ’ views is misleading : ( a ) it lays un ##war ##rant ##ed and excessive emphasis on tr ##ots ##ky , creating the misleading impression that other prominent marxist ##s ( marx , eng ##els , lenin ) did not advocate and / or practiced terrorism ; ( b ) it lays un ##war ##rant ##ed and excessive emphasis on the theoretical “ rejection of individual terrorism ” , creating the misleading impression that this is the main ( only ) marxist position on terrorism . 
( 2 ) the discussion is not being properly monitored : ( a ) no disc ##ern ##ible attempt is being made to establish and maintain an acceptable degree of object ##ivity ; ( b ) important and relevant scholarly works such as the international encyclopedia of terrorism are being ignored or illicit ##ly excluded from the discussion ; ( c ) though the only logical way to remedy the b ##lat ##ant im ##balance in the above section is to include quotes by / on other leaders who are known to have endorsed and practiced terrorism all attempts to do so have been systematically blocked with imp ##uni ##ty by the ap ##ologists for marxist terrorism who have done their best to sabotage and wreck both the article and the discussion . ( 3 ) among the tactics deployed by [SEP],[CLS] shelly shock shelly shock is . . . ( ) [SEP],[CLS] i do not care . refer to on ##g ten ##g che ##ong talk page . is la go ##ut ##te de pl ##ui ##e writing a biography or writing the history of trade unions . she is making use of the dead to push her agenda again . right before elections too . how timely . 202 . 156 . 13 . 232 [SEP]\ny: MultiCategoryList\ntoxic,,,,\nPath: .;\n\nValid: LabelList (31915 items)\nx: TextList\n[CLS] gee ##z , are you forget ##ful ! we ' ve already discussed why marx was not an anarchist , i . e . he wanted to use a state to mold his ' socialist man . ' er ##go , he is a stat ##ist - the opposite of an anarchist . i know a guy who says that , when he gets old and his teeth fall out , he ' ll quit eating meat . would you call him a vegetarian ? [SEP],[CLS] car ##io ##ca rf ##a thanks for your support on my request for ad ##mins ##hip . the final outcome was ( 31 / 4 / 1 ) , so i am now an administrator . if you have any comments or concerns on my actions as an administrator , please let me know . thank you ! [SEP],[CLS] \" birthday no worries , it ' s what i do ; ) enjoy ur day | talk | e \" [SEP],[CLS] pseudo ##sc ##ience category ? 
i ' m assuming that this article is in the pseudo ##sc ##ience category because of its association with creation ##ism . however , there are modern , scientific ##ally - accepted variants of cat ##ast ##rop ##hism that have nothing to do with creation ##ism — and they ' re even mentioned in the article ! i think the connection to pseudo ##sc ##ience needs to be clarified , or the article made more general and less creation ##ism - specific and the category tag removed entirely . [SEP],[CLS] ( and if such phrase exists , it would be provided by search engine even if mentioned page is not available as a whole ) [SEP]\ny: MultiCategoryList\n,,,,\nPath: .;\n\nTest: None, model=BertForSequenceClassification(\n  (bert): BertModel(\n    (embeddings): BertEmbeddings(\n      (word_embeddings): Embedding(30522, 768, padding_idx=0)\n      (position_embeddings): Embedding(512, 768)\n      (token_type_embeddings): Embedding(2, 768)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n    (encoder): BertEncoder(\n      (layer): ModuleList(\n        (0): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): 
Dropout(p=0.1)\n          )\n        )\n        (1): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (2): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (3): BertLayer(\n   
       (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (4): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (5): BertLayer(\n          (attention): BertAttention(\n            (self): 
BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (6): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (7): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, 
out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (8): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (9): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): 
Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (10): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (11): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n            
  (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n      )\n    )\n    (pooler): BertPooler(\n      (dense): Linear(in_features=768, out_features=768, bias=True)\n      (activation): Tanh()\n    )\n  )\n  (dropout): Dropout(p=0.1)\n  (classifier): Linear(in_features=768, out_features=6, bias=True)\n), opt_func=functools.partial(<class 'torch.optim.adam.Adam'>, betas=(0.9, 0.99)), loss_func=BCEWithLogitsLoss(), metrics=[functools.partial(<function accuracy_thresh at 0x7f79cc28dae8>, thresh=0.25)], true_wd=True, bn_wd=True, wd=0.01, train_bn=True, path=PosixPath('.'), model_dir='/temp/model', callback_fns=[functools.partial(<class 'fastai.basic_train.Recorder'>, add_time=True, silent=False)], callbacks=[], layer_groups=[Sequential(\n  (0): BertEmbeddings(\n    (word_embeddings): Embedding(30522, 768, padding_idx=0)\n    (position_embeddings): Embedding(512, 768)\n    (token_type_embeddings): Embedding(2, 768)\n    (LayerNorm): BertLayerNorm()\n    (dropout): Dropout(p=0.1)\n  )\n), Sequential(\n  (0): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n    
  (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (1): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (2): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      
(dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (3): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n), Sequential(\n  (0): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (1): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): 
Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (2): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n), Sequential(\n  (0): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        
(dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (1): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (2): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n), 
Sequential(\n  (0): Dropout(p=0.1)\n  (1): Linear(in_features=768, out_features=6, bias=True)\n)], add_time=True, silent=False)"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learner.lr_find()","execution_count":28,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":""},"metadata":{}},{"output_type":"stream","text":"LR Finder is complete, type {learner_name}.recorder.plot() to see the graph.\n","name":"stdout"}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learner.recorder.plot()","execution_count":29,"outputs":[{"output_type":"display_data","data":{"text/plain":"<Figure size 432x288 with 1 Axes>","image/png":"iVBORw0KGgoAAAANSUhEUgAAAYsAAAEKCAYAAADjDHn2AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xl4VNX5wPHvm8lOFgIJe1gNS9glgor7BqhF0dZCa9W60EW06s+22lbbYq1W26q02Iq7uOBat9pSUXBBtrAqSyCELSAQEkgC2ZP398fc4BgTJkBuZsn7eZ55mHvm3LnvYZK8c+659xxRVYwxxpgjiQh0AMYYY4KfJQtjjDF+WbIwxhjjlyULY4wxflmyMMYY45clC2OMMX5ZsjDGGOOXJQtjjDF+WbIwxhjjV2SgA2gpqamp2rt370CHYYwxIWX58uX7VDXNX72wSRa9e/cmOzs70GEYY0xIEZFtzalnp6GMMcb4ZcnCGGOMX64mCxEZLyI5IpIrInc08npPEZkvIitFZI2IXOjz2p3OfjkiMs7NOI0xxhyZa2MWIuIBZgLnA/nAMhF5W1XX+VT7DfCKqv5DRDKB94DezvPJwGCgGzBPRPqraq1b8RpjjGmamz2L0UCuquapahUwB7ikQR0FkpznycAu5/klwBxVrVTVLUCu837GGGMCwM1k0R3Y4bOd75T5+h1wpYjk4+1V3HQU+xpjjGklbiYLaaSs4bJ8U4BnVLUHcCEwW0QimrkvIjJVRLJFJLugoOC4AzbGGNM4N5NFPpDus92Dr04z1bsOeAVAVRcBsUBqM/dFVWepapaqZqWl+b2nxBhjws7ry/N5edl214/jZrJYBmSISB8RicY7YP12gzrbgXMBRGQQ3mRR4NSbLCIxItIHyACWuhirMcaEpCc/3cKbK7/xXbrFuZYsVLUGmAbMBdbjvepprYhMF5GJTrX/A24QkdXAS8A16rUWb49jHfBf4Ea7EsoYY76uuKya9btLOLlvR9eP5ep0H6r6Ht6Ba9+yu32erwPGNrHvvcC9bsZnjDGhbNnWIlRhTN8Orh/L7uA2xpgQtTivkOjICEakt3f9WJYsjDEmRC3ZUsTI9PbERnlcP5YlC2OMCUElFdWs3VXcKuMVYMnCGGNCUvbWIupaabwCLFkYY0xIWpxXRLQnghN7prTK8SxZGGNMCFqSV8iIVhqvAEsWxhgTckorqvliV0mrnYICSxbGGBNysrftp7ZOW21wGyxZGGNMyFmSV0S
UR1ptvAIsWRhjTMhZnFfI8B7tiYtunfEKsGRhjDEh5VBlDZ/vLG7V8QqwZGGMMSElEOMVYMnCGGNCypK8QiIjhFG9Wm+8AixZGGNMSFmcV8iwHsnER7s6afg3WLIwxpgQsaekgpU7DnB6RuuvDGrJwhhjQsTbq3ahCpeO7N7qx3Y1WYjIeBHJEZFcEbmjkdcfEpFVzmOjiBzwea3W57WGy7EaY0yb86+VOxmR3p4+qe1a/diunfQSEQ8wEzgfyAeWicjbzup4AKjqrT71bwJG+rxFuaqOcCs+Y4wJJTm7S1n3ZQm/nzg4IMd3s2cxGshV1TxVrQLmAJccof4UvOtwtypVpaLalvc2xgS3f63ciSdCuHhY14Ac383h9O7ADp/tfGBMYxVFpBfQB/jQpzhWRLKBGuB+VX3TjSCLy6sZMf19oj0RJMZGOo8oYiIjiIgQPCJ4IoR2MR5SE2JIS/Q+OiXG0jU5lu7t42gfH4WIuBGeMcZQV6e8tWonZ/ZPo2NCTEBicDNZNPbXU5uoOxl4TVV9v+L3VNVdItIX+FBEPlfVzV87gMhUYCpAz549jynISE8EPx83gNKKGkorqg//W1VbR02tUlNXR0WNsre0gqVbithfVv2N94iL8tC1vTd5dEmK8/6bHEun+sSSFEtqQjQRItTUqvPedURHRtAuOpKICEs0xpimLdlSxJfFFdx54aCAxeBmssgH0n22ewC7mqg7GbjRt0BVdzn/5onIArzjGZsb1JkFzALIyspqKhEdUUJMJDeefUKz61fX1lF4sIo9JRV8WVzOzgMV7DpQzq4D5ewuqeCzzfvYW1pJbV3zwhHxxpAYE0mMMy+9qnffmEgPPVLiSO8QT4+UOHqkeP+13owxbcubK3eSEBPJ+YM6BywGN5PFMiBDRPoAO/EmhO81rCQiA4AUYJFPWQpQpqqVIpIKjAUecDHWZovyRNDF6TkMT2/faJ3aOmXfwUr2llRScLCCvSWV7DtYiaq3JxPlESIjhKraOqcn431U1tQiIgjeJHKospb8/WUs2VLEwcqarx0jPtpDeko8p2WkMm5wF0b1SsFjPRRjwk5FdS3vff4l44d0adWJAxtyLVmoao2ITAPmAh7gKVVdKyLTgWxVrb8cdgowR+u/TnsNAh4TkTq8g/D3+15FFew8EULnpFg6J8UCycf9fqrKgbJq8veXs/NAGTsPVLBzfzm5BQeZvWgbT366hY7tojl3UCfOHtCJU/ulkhwfdfwNMcYE3Icb9lJaWcOkANxb4Uu+/jc6dGVlZWl2dnagw2h1Bytr+CingP+t2+39oaqoIUJgaPdkTstI5ZyBnRiZnmLjIsaEqBuey2ZN/gE+u+NcV84eiMhyVc3yV691JxcxLS4hJpKLhnXlomFdqa6tY/WOA3yyaR+f5u7jnx/lMXP+ZjolxjBucBcmDOnC6D4diPTYjfvGhIKSimoW5OzlmlN7B/w0syWLMBLliSCrdweyenfg1vP7U1xezfwNe/nvF7t5dfkOZi/eRqfEGK7ISmfy6HR6pMQHOmRjzBEs3LSP6lpl3OAugQ7FkkU4S46L4tKR3bl0ZHfKqrynq15dns/MBbnMXJDLWf3TmDK6J2cP7ESU9TaMCToLcgpIio1kRBMX07QmSxZtRHx0JBOGdmXC0K7sPFDOy0u3M2fZDqbOXk5aYgyXndid74xK54ROCYEO1RiD98KWjzYWcHpGWlCcOrZk0QZ1bx/HbRcM4KZzM/gop4BXsnfw5CdbeOyjPLJ6pfDDsX0YN7hzUPyAGtNW5ewpZXdJBWf2b/3pyBtjyaINi/JEcF5mZ87L7ExBaSX/WpnPC0u2c+OLK+iREse1Y/twxUnpJMTYj4kxrW1BTgEAZw4IjmRhl86ar6mtU+at38PjH+eRvW0/ibGRXDysK98a3o0xfToG/IoMY9qKKbMWs7+siv/ecoarx7FLZ80x8UQI4wZ3YdzgLqzcvp/nFm3jrVW7eGnpDjonxXD
tf/TZ5xTgalU9x9keCdwCZAJ9gbGNHKcdsFhVhwMfAzf4HP8R5/h+5xBy5iI6F+8d9AAVwCRVPRHv+il/cZLVHcBmVR2hqj8XkQuADGA0MAIYJSJn+DueMY2xiQRNW3QekOkzk2iSiCQCycCzIpKBd9bQKJ993ldV37UOlqpqPoCIrMI7p9CnDY5TxVcTLy4Hzneen8JX62u8CPy5iTjjfN57OfC+Uy7AH50//HV4exydG9n/Auex0tlOwJs8Pm7ieMY0yZKFaYsigFNUtdy3UET+BsxX1UnO+f8FPi8favAelT7Pa2n8d6lavxoUbKrOkZSr6ggRScabdG4EZuBdtyINGKWq1SKyFYhtZH8B7lPVx47yuMZ8g52GMm3R//Cu+wCAiNRPQZ0M7HSeX+Pi8RfjPf0FMNlfZVUtxrtk6u0iEoU3zr1Oojgb6OVULQUSfXadC1zrrMuAiHQXkU4t1AbTxliyMOEuXkTyfR634f3Dm+UM+q7DO608wAPAfSKyEPC4GNMtwG0ishToChT720FVV+Kd+XQy3gWBskQkG28vY4NTpxBY6Fxq+6Cq/g/vaa5FIvI58BpfTybGNJtdOmtMK3NW7CtXVRWRycAUVb3E337GBJKNWRjT+kYBf3euYDpAgJapNeZoWM/CGGOMXzZmYYwxxi9LFsYYY/yyZGGMMcYvSxbGGGP8smRhjDHGL0sWxhhj/Pp/yfI45zGI2G0AAAAASUVORK5CYII=\n"},"metadata":{"needs_background":"light"}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learner.fit_one_cycle(2, max_lr=slice(1e-5, 5e-4), moms=(0.8,0.7), pct_start=0.2, wd =(1e-7, 1e-5, 1e-4, 1e-3, 1e-2))","execution_count":30,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: left;\">\n      <th>epoch</th>\n      <th>train_loss</th>\n      <th>valid_loss</th>\n      <th>accuracy_thresh</th>\n      <th>time</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>0</td>\n      <td>0.046906</td>\n      <td>0.042532</td>\n      <td>0.983065</td>\n      <td>24:54</td>\n    </tr>\n    <tr>\n      <td>1</td>\n      <td>0.032987</td>\n      <td>0.036107</td>\n      <td>0.982066</td>\n      <td>29:35</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learner.save('head')\nlearner.load('head')","execution_count":31,"outputs":[{"output_type":"execute_result","execution_count":31,"data":{"text/plain":"Learner(data=TextClasDataBunch;\n\nTrain: LabelList (127656 items)\nx: TextList\n[CLS] grandma terri should burn in trash 
grandma terri is trash . i hate grandma terri . f % % k her to hell ! 71 . 74 . 76 . 40 [SEP],[CLS] , 9 may 2009 ( utc ) it would be easiest if you were to admit to being a member of the involved portuguese lodge , and then there would be no requirement to acknowledge whether you had a previous account ( carlos bot ##el ##ho did not have a good record ) or not and i would then remove the sock ##pu ##ppet template as irrelevant . w ##p : co ##i permits people to edit those articles , such as ms ##ja ##pan does , but just means you have to be more careful in ensuring that references back your edit ##s and that np ##ov is upheld . 20 : 29 [SEP],[CLS] \" the object ##ivity of this discussion is doubtful ( non - existent ) ( 1 ) as indicated earlier , the section on marxist leaders ’ views is misleading : ( a ) it lays un ##war ##rant ##ed and excessive emphasis on tr ##ots ##ky , creating the misleading impression that other prominent marxist ##s ( marx , eng ##els , lenin ) did not advocate and / or practiced terrorism ; ( b ) it lays un ##war ##rant ##ed and excessive emphasis on the theoretical “ rejection of individual terrorism ” , creating the misleading impression that this is the main ( only ) marxist position on terrorism . ( 2 ) the discussion is not being properly monitored : ( a ) no disc ##ern ##ible attempt is being made to establish and maintain an acceptable degree of object ##ivity ; ( b ) important and relevant scholarly works such as the international encyclopedia of terrorism are being ignored or illicit ##ly excluded from the discussion ; ( c ) though the only logical way to remedy the b ##lat ##ant im ##balance in the above section is to include quotes by / on other leaders who are known to have endorsed and practiced terrorism all attempts to do so have been systematically blocked with imp ##uni ##ty by the ap ##ologists for marxist terrorism who have done their best to sabotage and wreck both the article and the discussion . 
( 3 ) among the tactics deployed by [SEP],[CLS] shelly shock shelly shock is . . . ( ) [SEP],[CLS] i do not care . refer to on ##g ten ##g che ##ong talk page . is la go ##ut ##te de pl ##ui ##e writing a biography or writing the history of trade unions . she is making use of the dead to push her agenda again . right before elections too . how timely . 202 . 156 . 13 . 232 [SEP]\ny: MultiCategoryList\ntoxic,,,,\nPath: .;\n\nValid: LabelList (31915 items)\nx: TextList\n[CLS] gee ##z , are you forget ##ful ! we ' ve already discussed why marx was not an anarchist , i . e . he wanted to use a state to mold his ' socialist man . ' er ##go , he is a stat ##ist - the opposite of an anarchist . i know a guy who says that , when he gets old and his teeth fall out , he ' ll quit eating meat . would you call him a vegetarian ? [SEP],[CLS] car ##io ##ca rf ##a thanks for your support on my request for ad ##mins ##hip . the final outcome was ( 31 / 4 / 1 ) , so i am now an administrator . if you have any comments or concerns on my actions as an administrator , please let me know . thank you ! [SEP],[CLS] \" birthday no worries , it ' s what i do ; ) enjoy ur day | talk | e \" [SEP],[CLS] pseudo ##sc ##ience category ? i ' m assuming that this article is in the pseudo ##sc ##ience category because of its association with creation ##ism . however , there are modern , scientific ##ally - accepted variants of cat ##ast ##rop ##hism that have nothing to do with creation ##ism — and they ' re even mentioned in the article ! i think the connection to pseudo ##sc ##ience needs to be clarified , or the article made more general and less creation ##ism - specific and the category tag removed entirely . 
[SEP],[CLS] ( and if such phrase exists , it would be provided by search engine even if mentioned page is not available as a whole ) [SEP]\ny: MultiCategoryList\n,,,,\nPath: .;\n\nTest: None, model=BertForSequenceClassification(\n  (bert): BertModel(\n    (embeddings): BertEmbeddings(\n      (word_embeddings): Embedding(30522, 768, padding_idx=0)\n      (position_embeddings): Embedding(512, 768)\n      (token_type_embeddings): Embedding(2, 768)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n    (encoder): BertEncoder(\n      (layer): ModuleList(\n        (0): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (1): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): 
Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        ...\n        (11): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          
)\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n      )\n    )\n    (pooler): BertPooler(\n      (dense): Linear(in_features=768, out_features=768, bias=True)\n      (activation): Tanh()\n    )\n  )\n  (dropout): Dropout(p=0.1)\n  (classifier): Linear(in_features=768, out_features=6, bias=True)\n), opt_func=functools.partial(<class 'torch.optim.adam.Adam'>, betas=(0.9, 0.99)), loss_func=BCEWithLogitsLoss(), metrics=[functools.partial(<function accuracy_thresh at 0x7f79cc28dae8>, thresh=0.25)], true_wd=True, bn_wd=True, wd=0.01, train_bn=True, path=PosixPath('.'), model_dir='/temp/model', callback_fns=[functools.partial(<class 'fastai.basic_train.Recorder'>, add_time=True, silent=False)], callbacks=[], layer_groups=[Sequential(\n  (0): BertEmbeddings(\n    (word_embeddings): Embedding(30522, 768, padding_idx=0)\n    (position_embeddings): Embedding(512, 768)\n    (token_type_embeddings): Embedding(2, 768)\n    (LayerNorm): BertLayerNorm()\n    (dropout): Dropout(p=0.1)\n  )\n), Sequential(\n  (0): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  
)\n  (1): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (2): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (3): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      
)\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n), Sequential(\n  (0): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (1): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): 
BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (2): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n), Sequential(\n  (0): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (1): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n       
 (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (2): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n), Sequential(\n  (0): Dropout(p=0.1)\n  (1): Linear(in_features=768, out_features=6, bias=True)\n)], add_time=True, silent=None)"},"metadata":{}}]},{"metadata":{},"cell_type":"markdown","source":"Now, we will unfreeze the last two layer groups and train the model again. With `freeze_to(-2)`, only these two groups are updated, each with its own learning rate from the `slice` (discriminative learning rates) and its own weight decay from the `wd` tuple."},{"metadata":{"trusted":true},"cell_type":"code","source":"learner.freeze_to(-2)\nlearner.fit_one_cycle(2, max_lr=slice(1e-5, 5e-4), moms=(0.8, 0.7), pct_start=0.2, wd=(1e-7, 1e-5, 1e-4, 1e-3, 
1e-2))","execution_count":32,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: left;\">\n      <th>epoch</th>\n      <th>train_loss</th>\n      <th>valid_loss</th>\n      <th>accuracy_thresh</th>\n      <th>time</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>0</td>\n      <td>0.033311</td>\n      <td>0.038518</td>\n      <td>0.980818</td>\n      <td>18:30</td>\n    </tr>\n    <tr>\n      <td>1</td>\n      <td>0.029888</td>\n      <td>0.037146</td>\n      <td>0.982213</td>\n      <td>17:40</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learner.save('head-2')\nlearner.load('head-2')","execution_count":33,"outputs":[{"output_type":"execute_result","execution_count":33,"data":{"text/plain":"Learner(data=TextClasDataBunch;\n\nTrain: LabelList (127656 items)\nx: TextList\n[CLS] grandma terri should burn in trash grandma terri is trash . i hate grandma terri . f % % k her to hell ! 71 . 74 . 76 . 40 [SEP],[CLS] , 9 may 2009 ( utc ) it would be easiest if you were to admit to being a member of the involved portuguese lodge , and then there would be no requirement to acknowledge whether you had a previous account ( carlos bot ##el ##ho did not have a good record ) or not and i would then remove the sock ##pu ##ppet template as irrelevant . w ##p : co ##i permits people to edit those articles , such as ms ##ja ##pan does , but just means you have to be more careful in ensuring that references back your edit ##s and that np ##ov is upheld . 
20 : 29 [SEP],[CLS] \" the object ##ivity of this discussion is doubtful ( non - existent ) ( 1 ) as indicated earlier , the section on marxist leaders ’ views is misleading : ( a ) it lays un ##war ##rant ##ed and excessive emphasis on tr ##ots ##ky , creating the misleading impression that other prominent marxist ##s ( marx , eng ##els , lenin ) did not advocate and / or practiced terrorism ; ( b ) it lays un ##war ##rant ##ed and excessive emphasis on the theoretical “ rejection of individual terrorism ” , creating the misleading impression that this is the main ( only ) marxist position on terrorism . ( 2 ) the discussion is not being properly monitored : ( a ) no disc ##ern ##ible attempt is being made to establish and maintain an acceptable degree of object ##ivity ; ( b ) important and relevant scholarly works such as the international encyclopedia of terrorism are being ignored or illicit ##ly excluded from the discussion ; ( c ) though the only logical way to remedy the b ##lat ##ant im ##balance in the above section is to include quotes by / on other leaders who are known to have endorsed and practiced terrorism all attempts to do so have been systematically blocked with imp ##uni ##ty by the ap ##ologists for marxist terrorism who have done their best to sabotage and wreck both the article and the discussion . ( 3 ) among the tactics deployed by [SEP],[CLS] shelly shock shelly shock is . . . ( ) [SEP],[CLS] i do not care . refer to on ##g ten ##g che ##ong talk page . is la go ##ut ##te de pl ##ui ##e writing a biography or writing the history of trade unions . she is making use of the dead to push her agenda again . right before elections too . how timely . 202 . 156 . 13 . 232 [SEP]\ny: MultiCategoryList\ntoxic,,,,\nPath: .;\n\nValid: LabelList (31915 items)\nx: TextList\n[CLS] gee ##z , are you forget ##ful ! we ' ve already discussed why marx was not an anarchist , i . e . he wanted to use a state to mold his ' socialist man . 
' er ##go , he is a stat ##ist - the opposite of an anarchist . i know a guy who says that , when he gets old and his teeth fall out , he ' ll quit eating meat . would you call him a vegetarian ? [SEP],[CLS] car ##io ##ca rf ##a thanks for your support on my request for ad ##mins ##hip . the final outcome was ( 31 / 4 / 1 ) , so i am now an administrator . if you have any comments or concerns on my actions as an administrator , please let me know . thank you ! [SEP],[CLS] \" birthday no worries , it ' s what i do ; ) enjoy ur day | talk | e \" [SEP],[CLS] pseudo ##sc ##ience category ? i ' m assuming that this article is in the pseudo ##sc ##ience category because of its association with creation ##ism . however , there are modern , scientific ##ally - accepted variants of cat ##ast ##rop ##hism that have nothing to do with creation ##ism — and they ' re even mentioned in the article ! i think the connection to pseudo ##sc ##ience needs to be clarified , or the article made more general and less creation ##ism - specific and the category tag removed entirely . 
[SEP],[CLS] ( and if such phrase exists , it would be provided by search engine even if mentioned page is not available as a whole ) [SEP]\ny: MultiCategoryList\n,,,,\nPath: .;\n\nTest: None, model=BertForSequenceClassification(\n  (bert): BertModel(\n    (embeddings): BertEmbeddings(\n      (word_embeddings): Embedding(30522, 768, padding_idx=0)\n      (position_embeddings): Embedding(512, 768)\n      (token_type_embeddings): Embedding(2, 768)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n    (encoder): BertEncoder(\n      (layer): ModuleList(\n        (0): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (1): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): 
Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (2): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (3): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              
(LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (4): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (5): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): 
Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (6): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (7): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          
(intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (8): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (9): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): 
Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (10): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          )\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n        (11): BertLayer(\n          (attention): BertAttention(\n            (self): BertSelfAttention(\n              (query): Linear(in_features=768, out_features=768, bias=True)\n              (key): Linear(in_features=768, out_features=768, bias=True)\n              (value): Linear(in_features=768, out_features=768, bias=True)\n              (dropout): Dropout(p=0.1)\n            )\n            (output): BertSelfOutput(\n              (dense): Linear(in_features=768, out_features=768, bias=True)\n              (LayerNorm): BertLayerNorm()\n              (dropout): Dropout(p=0.1)\n            )\n          )\n          (intermediate): BertIntermediate(\n            (dense): Linear(in_features=768, out_features=3072, bias=True)\n          
)\n          (output): BertOutput(\n            (dense): Linear(in_features=3072, out_features=768, bias=True)\n            (LayerNorm): BertLayerNorm()\n            (dropout): Dropout(p=0.1)\n          )\n        )\n      )\n    )\n    (pooler): BertPooler(\n      (dense): Linear(in_features=768, out_features=768, bias=True)\n      (activation): Tanh()\n    )\n  )\n  (dropout): Dropout(p=0.1)\n  (classifier): Linear(in_features=768, out_features=6, bias=True)\n), opt_func=functools.partial(<class 'torch.optim.adam.Adam'>, betas=(0.9, 0.99)), loss_func=BCEWithLogitsLoss(), metrics=[functools.partial(<function accuracy_thresh at 0x7f79cc28dae8>, thresh=0.25)], true_wd=True, bn_wd=True, wd=0.01, train_bn=True, path=PosixPath('.'), model_dir='/temp/model', callback_fns=[functools.partial(<class 'fastai.basic_train.Recorder'>, add_time=True, silent=False)], callbacks=[], layer_groups=[Sequential(\n  (0): BertEmbeddings(\n    (word_embeddings): Embedding(30522, 768, padding_idx=0)\n    (position_embeddings): Embedding(512, 768)\n    (token_type_embeddings): Embedding(2, 768)\n    (LayerNorm): BertLayerNorm()\n    (dropout): Dropout(p=0.1)\n  )\n), Sequential(\n  (0): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  
)\n  (1): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (2): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (3): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      
)\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n), Sequential(\n  (0): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (1): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): 
BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (2): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n), Sequential(\n  (0): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (1): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n       
 (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n  (2): BertLayer(\n    (attention): BertAttention(\n      (self): BertSelfAttention(\n        (query): Linear(in_features=768, out_features=768, bias=True)\n        (key): Linear(in_features=768, out_features=768, bias=True)\n        (value): Linear(in_features=768, out_features=768, bias=True)\n        (dropout): Dropout(p=0.1)\n      )\n      (output): BertSelfOutput(\n        (dense): Linear(in_features=768, out_features=768, bias=True)\n        (LayerNorm): BertLayerNorm()\n        (dropout): Dropout(p=0.1)\n      )\n    )\n    (intermediate): BertIntermediate(\n      (dense): Linear(in_features=768, out_features=3072, bias=True)\n    )\n    (output): BertOutput(\n      (dense): Linear(in_features=3072, out_features=768, bias=True)\n      (LayerNorm): BertLayerNorm()\n      (dropout): Dropout(p=0.1)\n    )\n  )\n), Sequential(\n  (0): Dropout(p=0.1)\n  (1): Linear(in_features=768, out_features=6, bias=True)\n)], add_time=True, silent=None)"},"metadata":{}}]},{"metadata":{},"cell_type":"markdown","source":"We will now unfreeze the entire model and train it"},{"metadata":{"trusted":true},"cell_type":"code","source":"learner.unfreeze()\nlearner.lr_find()\nlearner.recorder.plot(suggestion=True)","execution_count":34,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML 
object>","text/html":""},"metadata":{}},{"output_type":"stream","text":"LR Finder is complete, type {learner_name}.recorder.plot() to see the graph.\nMin numerical gradient: 1.91E-06\n","name":"stdout"},{"output_type":"display_data","data":{"text/plain":"<Figure size 432x288 with 1 Axes>","image/png":"iVBORw0KGgoAAAANSUhEUgAAAZIAAAEKCAYAAAA4t9PUAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xl8VdW5//HPkxGSMCYBgTAPyqCiRByxWuvYKlptxdah1ZZOXqvWerXetmpve+vtr9erV6vVOk+oOGHV2lbUOiJhBkUNYUrCkBAIGcn0/P44GzzGhASSk3NO8n2/XufFOXuvvfezN8l5stbaey1zd0RERPZXQrQDEBGR+KZEIiIiHaJEIiIiHaJEIiIiHaJEIiIiHaJEIiIiHaJEIiIiHaJEIiIiHaJEIiIiHZIU7QC6QlZWlo8aNSraYYiIxJVFixaVunt2W+V6RCIZNWoUeXl50Q5DRCSumNn69pRT05aIiHSIEomIiHSIEomIiHSIEomIiHSIEomIiHSIEomIiHSIEomIiHSIEomISDeUt66MO+Z/StWuhogfS4lERKQb+sdHW7j9tXySEyP/Na9EIiLSDa0q2smEAzJISVIiERGRfeTurCwuZ8rQfl1yPCUSEZFupri8lh3V9UwepkQiIiL7YWVROQCTh/btkuMpkYiIdDOrispJMJh4gBKJiIjsh1XFOxk3KIPeKYldcryIJhIzO83MPjazfDO7roX1qWb2ZLB+gZmNClt3iJm9Z2arzGyFmfUKlk8LPueb2e1mZpE8BxGReNOVHe0QwURiZonAncDpwCTgAjOb1KzYZcB2dx8H3ArcEmybBDwK/NDdJwMnAPXBNncBs4Hxweu0SJ2DiEi82VpRy5adu5jURf0jENkayXQg390L3L0OmAPMbFZmJvBQ8H4ucFJQwzgFWO7uywDcfZu7N5rZEKCvu7/n7g48DJwdwXMQEYkrq4p3AjCli+7YgsgmkmHAxrDPhcGyFsu4ewNQDmQCEwA3s1fNbLGZXRtWvrCNfQJgZrPNLM/M8kpKSjp8MiIi8eDDIJF0ZY0kknO2t9R34e0skwQcBxwBVAOvmdkiYGc79hla6H4PcA9Abm5ui2VERLqblUXljMxMo2+v5C47ZiRrJIXA8LDPOUBxa2WCfpF+QFmw/E13L3X3auBl4PBgeU4b+xQR6bG6uqMdIptIFgLjzWy0maUAs4B5zcrMAy4J3p8HzA/6Pl4FDjGztCDBfAn40N03ARVmdlTQl3Ix8EIEz0FEJG6UV9ezsayGycO6rlkLIti05e4NZnY5oaSQCNzv7qvM7GYgz93nAfcBj5hZPqGayKxg2+1m9j+EkpEDL7v7S8GufwQ8CPQGXgleIiI93qpNu59o79oaSST7SHD3lwk1S4Uv+1XY+1rgG61s+yihW4CbL88DpnRupCIi8W9VUagbuauGRtlNT7aLiHQTK4vLGdKvF1kZqV16XCUSEZFuYlXxzi5v1gIlEhGRbqG6roE1JZVd3qwFSiQiIt3CR5t24t61T7TvpkQiItINfDY0imokIiKyH1YWlTMwPYUD+vbq8mMrkYiIdAMri3YyeWhfojGzhhKJiEic29XQyCdbKqLSPwJKJCIice/TLZU0NHlU7tgCJRIRkbi3sig0NEpXD9a4mxKJiEicW1lcTp/UJEYMT
IvK8ZVIRETi3KrinUwa2peEhK7vaAclEhGRuNbQ2MRHm6IzNMpuSiQiInGsoLSK2vqmqDyIuJsSiYhIHNvT0R6lW39BiUREJK6tKt5JalICY7LSoxZDRBOJmZ1mZh+bWb6ZXdfC+lQzezJYv8DMRgXLR5lZjZktDV53h23zRrDP3esGRfIcRERi2cqiciYO6UtSYvTqBRGbIdHMEoE7gZOBQmChmc1z9w/Dil0GbHf3cWY2C7gFOD9Yt8bdp7ay+28HMyWKiPRYTU3Oh8U7mXnY0KjGEckUNh3Id/cCd68D5gAzm5WZCTwUvJ8LnGTRGChGRCQOFe2ooWJXAxOHRK+jHSKbSIYBG8M+FwbLWizj7g1AOZAZrBttZkvM7E0zm9FsuweCZq1fKvGISE9VUFoFwNjsjKjGEclE0tIXvLezzCZghLsfBlwNPG5mu1Put939YGBG8LqoxYObzTazPDPLKykp2a8TEBGJZWtLKgEYkx29jnaIbCIpBIaHfc4BilsrY2ZJQD+gzN13ufs2AHdfBKwBJgSfi4J/K4DHCTWhfYG73+Puue6em52d3WknJSISKwpKq+iTmkR2RmpU44hkIlkIjDez0WaWAswC5jUrMw+4JHh/HjDf3d3MsoPOesxsDDAeKDCzJDPLCpYnA18DVkbwHEREYlZBSRWjs9OjMgdJuIjdteXuDWZ2OfAqkAjc7+6rzOxmIM/d5wH3AY+YWT5QRijZABwP3GxmDUAj8EN3LzOzdODVIIkkAv8E7o3UOYiIxLK1pVUcMWpAtMOIXCIBcPeXgZebLftV2Pta4BstbPcM8EwLy6uAaZ0fqYhIfKmpa6RoRw3nZw9vu3CE6cl2EZE4tDa4Y2t0FJ9o302JREQkDu1OJNG+YwuUSERE4lJBcOuvaiQiIrJfCkqrGNqvF2kpEe3qbhclEhGROFRQGrr1NxYokYiIxBl3p6CkkjFZ0R0aZTclEhGROFNaWUdFbUNMdLSDEomISNyJpVt/QYlERCTu7L5jK9qj/u6mRCIiEmcKSqtISUpgaP/e0Q4FUCIREYk7BSVVjMpMIzEhNqZjUiIREYkzBaWxc8cWKJGIiMSV+sYmNmyrjpk7tkCJREQkrhRur6GhyWPmji1QIhERiSsFe6bXVdOWiIjsh4KS0DMkY3tK05aZnWZmH5tZvpld18L6VDN7Mli/wMxGBctHmVmNmS0NXneHbTPNzFYE29xu0Z5jUkSkCxWUVjEwPYX+aSnRDmWPiCWSYM71O4HTgUnABWY2qVmxy4Dt7j4OuBW4JWzdGnefGrx+GLb8LmA2oXncxwOnReocRERiTUFJZUz1j0BkayTTgXx3L3D3OmAOMLNZmZnAQ8H7ucBJe6thmNkQoK+7v+fuDjwMnN35oYuIxKaC0irG9KBEMgzYGPa5MFjWYhl3bwDKgcxg3WgzW2Jmb5rZjLDyhW3sU0SkW6qoraekYldMdbQDRHJGlJZqFt7OMpuAEe6+zcymAc+b2eR27jO0Y7PZhJrAGDFiRLuDFhGJVbE2WONukayRFALDwz7nAMWtlTGzJKAfUObuu9x9G4C7LwLWABOC8jlt7JNgu3vcPdfdc7OzszvhdEREoisW79iCyCaShcB4MxttZinALGBeszLzgEuC9+cB893dzSw76KzHzMYQ6lQvcPdNQIWZHRX0pVwMvBDBcxARiRkFJZUkGIzITIt2KJ8TsaYtd28ws8uBV4FE4H53X2VmNwN57j4PuA94xMzygTJCyQbgeOBmM2sAGoEfuntZsO5HwINAb+CV4CUi0u0VlFaRMyCN1KTEaIfyORGdNd7dXwZebrbsV2Hva4FvtLDdM8AzrewzD5jSuZGKiMS+gpKqmBpjazc92S4iEgeampy1pVUxNervbkokIiJxYEtFLTX1jaqRiIjI/tl9x1asPYwISiQiInEhFkf93U2JREQkDhSUVpGWksjgvqnRDuULlEhEROJAQUkVo7PSicUBz5VIRETiQ
EFpZUw2a4ESiYhIzNvV0Ejh9pqY7GgHJRIRkZi3fls17sTkrb+gRCIiEvP23LEVgw8jghKJiEjMK9g9fLxqJCIisj/yt1YyqE8qGakRHR5xvymRiIjEsJVF5by4rJjjxmVFO5RWKZGIiMSo6roGrpizhMz0VH75tUnRDqdVsVlPEhERbn7xQ9aWVvHY945kQHpKtMNplWokIiIx6OUVm5izcCM/PmEsx4yN3WYtUCIREYk5RTtquO6Z5Uwd3p8rvzIh2uG0KaKJxMxOM7OPzSzfzK5rYX2qmT0ZrF9gZqOarR9hZpVmdk3YsnVmtsLMlppZXiTjFxHpao1NzlVzltLkcPusw0hOjP2/9yMWoZklAncCpwOTgAvMrHlv0WXAdncfB9wK3NJs/a20PCf7ie4+1d1zOzlsEZGouvP1fD5YV8Zvzp7MiMy0aIfTLpFMddOBfHcvcPc6YA4ws1mZmcBDwfu5wEkWDG1pZmcDBcCqCMYoIhIzFq0v47bXPuWcw4ZxzmE50Q6n3SKZSIYBG8M+FwbLWizj7g1AOZBpZunAvwM3tbBfB/5uZovMbHZrBzez2WaWZ2Z5JSUlHTgNEZHIamxyFq3fzhVPLGVY/97cPHNytEPaJ5G8/belQfO9nWVuAm5198oWxt4/1t2LzWwQ8A8zW+3u//rCTtzvAe4ByM3NbX5cEZGoqq5r4O1PS/nnR1uYv3orpZV19E5O5PHvH0mfXsnRDm+fRDKRFALDwz7nAMWtlCk0sySgH1AGHAmcZ2b/DfQHmsys1t3vcPdiAHffambPEWpC+0IiERGJRfNXb+Gx9zfwdn4puxqa6JOaxAkHDeIrEwdxwoRB9EuLryQCkU0kC4HxZjYaKAJmAd9qVmYecAnwHnAeMN/dHZixu4CZ3QhUuvsdQZNXgrtXBO9PAW6O4DmIiHSa0spdzH54Edl9Urlg+ghOnjSYI0YNJCUp9u/M2pt2JRIzGwsUuvsuMzsBOAR42N13tLaNuzeY2eXAq0AicL+7rzKzm4E8d58H3Ac8Ymb5hGois9oIZTDwXNDclQQ87u5/a885iIhE2/NLimhoch6+dDrjB/eJdjidxkIVgDYKmS0FcoFRhBLDPOBAdz8jotF1ktzcXM/L0yMnIhI97s7pt71FanIiL/zk2GiH0y5mtqg9j1m0tz7VFNxVdQ7wv+5+FTCkIwGKiPQkq4p3snpzBedNi5/beturvYmk3swuINSf8ddgWfz1CImIRMncRYWkJCVw1iFDox1Kp2tvIvkucDTwW3dfG3SgPxq5sEREuo9dDY08v7SIUyYNjsu7strSrs52d/8QuALAzAYAfdz995EMTESku5j/0VZ2VNd3y2YtaGeNxMzeMLO+ZjYQWAY8YGb/E9nQRES6h7mLChncN5UZ47OjHUpEtLdpq5+77wS+Djzg7tOAr0QuLBGR7mFrRS1vfFLC1w/PITGhpcE84l97E0mSmQ0Bvslnne0iItKG55cU0djk3bZZC9qfSG4m9PzIGndfaGZjgE8jF5aISPxzd+YuKuTwEf0Zm50R7XAipl2JxN2fdvdD3P1HwecCdz83sqGJiMS3FUXlfLKlkvOmDW+7cBxrb2d7jpk9Z2ZbzWyLmT1jZt23niYi0gmeziskNSmBrx3avZ/fbm/T1gOEhkUZSmgOkReDZSIi0oLa+kbmLSvmtCkH0DfOhoXfV+1NJNnu/oC7NwSvB4HueR+biEgneO2jrZTXdN9nR8K1N5GUmtmFZpYYvC4EtkUyMBGRePb0oo0M6deLY8ZmRTuUiGtvIrmU0K2/m4FNhOYO+W6kghIRiWdbdtbyr09KOLcbPzsSrr13bW1w97PcPdvdB7n72YQeThQRkWaeWVxIk8O5PaBZC9pfI2nJ1Z0WhYhIN9HQ2MRj72/gqDEDGZ2VHu1wukRHEkmb9TUzO83MPjazfDO7roX1qWb2ZLB+gZmNarZ+hJlVmtk17d2niEg0/fOjrRTtqOE7x4yOd
ihdpiOJZK9TK5pZInAncDowCbjAzCY1K3YZsN3dxwG3Arc0W38r8Mo+7lNEJGoefHctw/r35uRJg6MdSpfZayIxswoz29nCq4LQMyV7Mx3ID56CrwPmADOblZkJPBS8nwucZMGE7GZ2NlAArNrHfYqIRMVHm3byfkEZlxwzskd0su+210Ti7n3cvW8Lrz7u3tZcJsOAjWGfC4NlLZYJpvItBzLNLB34d+Cm/diniEhUPPjOOnonJ3J+7ohoh9KlOtK01ZaW0nHz5rDWytwE3Orulfuxz1BBs9lmlmdmeSUlJW0GKyLSEWVVdTy/tIhzDh/WLWdB3Jt2zZC4nwqB8JHKcoDiVsoUmlkS0A8oA44EzjOz/wb6A01mVgssasc+AXD3e4B7AHJzc/fanyMi0lFzFm5gV0MT3z1mVLRD6XKRTCQLgfHB/O5FwCzgW83KzAMuAd4j9JDjfHd3YMbuAmZ2I1Dp7ncEyaatfYqIdKn6xiYeeW89x43LYvzgPtEOp8tFrGkr6PO4nNA8Jh8BT7n7KjO72czOCordR6hPJJ/Qcyl7vZ23tX1G6hxERNrj76u2sKm8lu/0wNoIgIUqAN1bbm6u5+XlRTsMEemmvnH3u2zZuYvXrzmhW92tZWaL3D23rXKR7GwXEen2VhaVs3Dddi4+umfd8htOiUREpAMefHcdaSmJfPOI7j0L4t4okYiI7KfSyl3MW1rMedNyuv3kVXujRCIisp+eWLCBusYmLj56VLRDiSolEhGR/VDf2MQj76/n+AnZjBuUEe1wokqJRERkP7y7ZhtbK3Zx4ZE9aziUliiRiIjsh9dXb6VXcgLHT8iOdihRp0QiIrKP3J3XP97KMWOz6JWcGO1wok6JRERkHxWUVrF+WzUnHqjaCCiRiIjss9dXbwXgxIMGRTmS2KBEIiKyj17/eCsTBmeQMyAt2qHEBCUSEZF9ULmrgQ/WlnHigaqN7KZEIiKyD97+tIT6RlezVhglEhGRfTB/9Vb69Epi2sgB0Q4lZiiRiIi0U+i23xKOn5BNcqK+PnfTlRARaadVxTspqdil/pFmlEhERNpp/uqtmMEJen7kcyKaSMzsNDP72MzyzewL0+iaWaqZPRmsX2Bmo4Ll081safBaZmbnhG2zzsxWBOs07aGIdJnXP97KITn9ycpIjXYoMSViicTMEoE7gdOBScAFZjapWbHLgO3uPg64FbglWL4SyHX3qcBpwJ/NLClsuxPdfWp7poAUEekM2yp3sXTjDr6sZq0viGSNZDqQ7+4F7l4HzAFmNiszE3goeD8XOMnMzN2r3b0hWN4L6P4Ty4tITHvzkxLc4cSD1KzVXCQTyTBgY9jnwmBZi2WCxFEOZAKY2ZFmtgpYAfwwLLE48HczW2Rms1s7uJnNNrM8M8srKSnplBMSkZ5r/uqtZGWkMmVov2iHEnMimUishWXNaxatlnH3Be4+GTgCuN7MegXrj3X3wwk1mf3EzI5v6eDufo+757p7bna2/oIQkf3X0NjEvz4p4cQDs0lIaOlrq2eLZCIpBIaHfc4BilsrE/SB9APKwgu4+0dAFTAl+Fwc/LsVeI5QE5qISMQs3rCDnbUNepq9FZFMJAuB8WY22sxSgFnAvGZl5gGXBO/PA+a7uwfbJAGY2UjgQGCdmaWbWZ9geTpwCqGOeRGRiJm/eitJCcZx47OiHUpMSmq7yP5x9wYzuxx4FUgE7nf3VWZ2M5Dn7vOA+4BHzCyfUE1kVrD5ccB1ZlYPNAE/dvdSMxsDPGdmu2N/3N3/FqlzEBGB0LDxR4waSN9eydEOJSZFLJEAuPvLwMvNlv0q7H0t8I0WtnsEeKSF5QXAoZ0fqYhIy4p21PDxlgpuOGNitEOJWXqyXUTiUm19Y5cc57NJrHTTTmuUSEQk7jy2YD2TfvU3zv/ze8xbVsyuhsgklcYmZ97SYoYP7M3Y7IyIHKM7iGjTlohIZ5vzwQZueG4l00YOYFN5LVc8sYTM9BS+kTucb
00fwYjMzpm10N35xbMr+GBdGf959hSCvllpgRKJiMSNp/I2cv1zKzjxwGzuvmgayQkJvJVfyuML1nPvWwXc/eYaZozPYubUYUwe2pex2RmkJO17w4u787uXP+LJvI1c8eVxXHjUyAicTfehRCIiceGZRYX8+zPLOW5cFnddOI3UpEQAvjQhmy9NyGZzeS1PLtzInIUbuObpZQAkJRhjstM56IC+HHhAHyYO6cMxY7PolZy412P96Y013PvWWi45eiRXnTwh4ucW78y9+w9jlZub63l5GihYJF49v6SIq55ayjFjM7nvkiP2mggam5w1JZV8tGknH2+u4OPNFazeXEHRjhoABvdN5YqTxvPN3OEtTk71yPvr+eXzKzl76lD+55tTe/ST7Ga2qD2D4yqRiEhMe3FZMT+ds4TpowfywHem0ztl77WJ1pTX1LNkw3bumJ9P3vrtjMpM46qTJ3DmIUP3JIsXlhZx5ZNLOemgQdx14bQePwuiEkkYJRKR+PTKik1c/sQSpo0YwIOXHkFaSsdb492d+au38odXP2b15gomDunLtaceSJM7P3hkEdNGDuChS6e32fzVEyiRhFEiEYk/+Vsr+OrtbzNlWD8eunQ6Gamd26Xb1OS8uLyYP/79EzaUVWMGU4b24/HvH0kfPcEOtD+RqLNdRGJOfWMTVz25jLSURO668PBOTyIACQnGzKnDOOPgITy5cCPvF2zjprMmK4nsByUSEYk5//fap6woKufuCw9nUJ9ebW/QAcmJCVx41Ejd4tsBPbsnSURizuIN27nj9XzOPTyH06YMiXY40g5KJCISM6p2NXD1k0sZ0q83vz5rUrTDkXZS05aIxIzfvvwR68uqeeL7R2nI9jiiGomIxIT5q7fw+IINzJ4xhqPGZEY7HNkHqpGISKdobHIefm8dH23aycjMdEZlpjMyM42RmWlt3gm1rXIX185dwUEH9OHqUzQkSbyJaCIxs9OA2wjNkPgXd/99s/WpwMPANGAbcL67rzOz6cA9u4sBN7r7c+3Zp4i0X11DE2tLqxidlb5fgxvutrGsmp89tYwP1pXRPy2ZHdX1n1uflZHCiIFpjMxMZ8TAtNArM42RA9PI7pPK9c+uYGdNPY9cNn3PGFoSPyKWSMwsEbgTOBkoBBaa2Tx3/zCs2GXAdncfZ2azgFuA8wnNw54bTNc7BFhmZi8C3o59ikg71NY3csG977Nkww5SkhKYMrQvhw7vz9Th/Tls+ACGD+zd5tDp7s5zS4r49QurcOCP3ziUrx8+jOq6RtZvq2b9tirW7fm3ig/WlvH80iLCn4NOTUpgV0MTvzjjICYO6RvZk5aIiGSNZDqQH0yPi5nNAWYC4V/6M4Ebg/dzgTvMzNy9OqxML0IJpL37FJE2NDU5Vz25lKUbd3DNKRMor6ln6cYdPPHBBh54Zx0AmekpTB89kBnjs5kxPovhAz8/z8eO6jpueH4lLy3fRO7IAdx6/tQ9ZdJTk5g0tC+Thn4xMexqaKRoew0byqrZWFbN+m3VpCQlcNlxYyJ+3hIZkUwkw4CNYZ8LgSNbKxPUPsqBTKDUzI4E7gdGAhcF69uzTxFpwy2vruaVlZu54YyJfP/4z77A6xub+HhzBcsKd7B4/Q7eXVPKKys3AzA6K50Z47M4blwWSYnGL55dSWnlLn5+6oH88EtjSWznKLmpSYmMyc5gjGYc7DYimUha+qlqPrBXq2XcfQEw2cwmAg+Z2Svt3Gdox2azgdkAI0aMaG/MIt3e4ws28Oc3C7jwqBF8b8boz61LTkxgyrB+TBnWj28fORJ3Z01JFW99WsJbn5Yyd1EhD7+3HoCx2ence/GxHJzTLxqnITEkkomkEBge9jkHKG6lTKGZJQH9gLLwAu7+kZlVAVPauc/d291D0GGfm5vb/UemFGmHNz8p4ZcvrORLE7K58czJbfaBmBnjBmUwblAG3z12NHUNTSzesJ0N26o589Ch+z2ku3QvkXyOZCEw3sxGm
1kKMAuY16zMPOCS4P15wHx392CbJAAzGwkcCKxr5z5FpAWrN+/kJ48tZvygDO741mEk7cdcGylJCRw1JpNvHjFcSUT2iFiNJOjTuBx4ldCtuve7+yozuxnIc/d5wH3AI2aWT6gmMivY/DjgOjOrB5qAH7t7KUBL+4zUOYh0F1t31nLpAwtJT03kge8eoRFupVNpPhKRbq66roHz//w+a0oqeeoHRzNlmPo0pH00H4mI0NTkXPP0MlYWl3PvRblKIhIRGmtLpBu7ff6nvLxiM9effhBfmTQ42uFIN6VEEgN2VNfx5MIN1Dc2RTsU6UZeWr6J//3np5x7eA7fn6GH/SRylEiizD3U9PDvz6zgyjlLaVAykU6wsqicnz29lMNH9Od3X5/S5m2+Ih2hRBJlzy4u4p8fbeWYsZm8tGIT185dTlNT978BQiJna0Utsx/OY2BaCndfNE2DIErEKZFE0ebyWm58cRVHjBrAI5cdydUnT+DZJUXc8PwKesLddNHU2OSU19R3u+u8q6GRHz6yiO3V9dxzcW7E5zsXAd21FTXuznXPLqe+sYk/nHcoiQnGv315HLX1jfzpjTWkJiXy6zMnqUkiAgpKKvnew3kUlFSRkZrEsP69GTagN0P792JY/zSG9u9FkzvVdY3UBK/q+tC/A9NT+N6M0aSlxN6vjrtz/bMrWLxhB3/69uG6Q0u6TOz9NsSQRevLGJmZTlZG6j5t19TkJLQxgN3Tiwp54+MSbjxzEqOy0oHQcBQ/P/VAauubuP+dtfRKTuTfTztQyaQTvf1pKT9+bBFJiQlcc8oESivrKNpRQ9H2Ghat3055TX2L2yUnGr2TE9lZ28DzS4u4fdZhMfNFXV5Tz4rCcv62ahPPLi7iyq+M54yDh0Q7LOlBlEha0dDYxBVPLKWitp7rz5jI+bnD20wOywt38Ot5qyjeUcNvzz641dsti3fU8JsXP+TI0QO5+OhRn1tnZvzyaxPZ1dDI3W+uoVdyAld+RTPGdYaH31vHTS9+yLjsDP5ySe4XhkUHqNzVwObyGhLMSEtJondKImkpiSQHw4m8m1/KVU8t5Zw/vcPPTz2Q7x03ps2fi87k7qwq3sniDdtZunEHyzbuYE1J1Z71503L4Yovj++yeERAT7bvVf7WSm54bgUL1paRO3IAvz3nYA48oM8Xym2r3MUfXv2YJ/M2kpmeysD0ZD7ZUsnZU4fy6zMnMyA9ZU9Zd+fi+z9g0frt/O2nxzMi84tfZhCq1Vz7zHLmLirk56ceyE9OHLfP8UtIfWMTN724ikff38BXJg7if2cdRkbq/v8Ntb2qjuueXc6rq7Zw3Lgs/vjNQxnct2v6Iv7nH59w+2ufApCVkcrU4f2ZOrwfU4cP4OCcfvTrraFPpPO098l2JZI2uDvPLC7ity99SEVtA98pjaqtAAAQpUlEQVSbMYafnjSe3imJNDQ28fgHG/h/r35MVV0j3z1mFFd8ZTy9khK58/V87nw9n/5pKfz2nCmcOvkAAJ74YAPXP7uC35w9hYuOGrnXYzc2OVc/tZQXlhYz+/gxXH/6QWrm2kc7quv48WOLeXfNNn7wpTFce+pB7Z43Y2/cnSc+2MjNf11F7+REbjn3EE4J/o8jZUHBNmbd+z5nHTqUa087iKH9eunnQSJKiSRMZ4y1tb2qjv965SOeyiskZ0BvfnD8GB5bsIHVmys4dlwmN545mfGDP19bWVVczs+fXs6Hm3Zy1qFD+f6MMcy65z2mjujPI5ce2a4mkcYm56YXV/Hwe+s59/Acbjn3YJLWrYU//hEefRQqKyEjAy68EH72Mxg7tkPn2R00NTmfbK3gg7Vl3P/2Wop31PK7rx/MedNyOv1Y+VsrufLJJaws2smpkwdz6bGjmT56YKd/wZfX1HPGbW+RnGi8dMUM0jtQoxJpLyWSMJ05aOOCgm3c8PxK8rdWMrRfL/7ja5M4fcoBrX5x1Dc2cfcba7h9/qfUNzrpKYm8etXx5Axou
UmrJe7O7a/lc+s/P+HK+nx+evcvsPp6qA/rGE5ODr3mzoXTT+/oacaVhsYmVhXv5IO1ZSxYW8bCdWV7Os1HDEzj1vMPZdrIgRE7fl1DE3e+ns9D761jR3U9Bx3Qh+8cM4qZU4d12lDrVzyxhJdWbOKZHx3D1OH9O2WfIm1RIgnT2aP/1jU08XZ+CUePyWr3F8XqzTv5/SurOW9aDl87ZOh+Hfe5uf/i1G+dQlr9rtYLpaXB8uU9pmbyTn4pVz+1lC07Q9dkdFY600cNZPro0CtnQO8ua/6pqWvkhaVFPPjuOlZvrqB/WjLnHzGcC48c2aE4nl9SxJVPLuWaUyZwuTrSpQspkYTpNsPI//jHNN17LwkNDa2XSU6G2bPhjju6Lq4oqG9s4tZ/fMJdb65hbHYGPz1pPEeOHsigLur03ht354O1ZTz47jr+/uEWGpuclKQEMtNTGJieQmZG6p73U4b15axDh7Xab7OxrJozbnuLg4b0Yc7sozulf0ekvZRIwnSbRNK3L1RUtK9ceXnk44mSjWXVXDFnCUs27GDWEcP51ZmTYvIBQQjd6v3Kys1s3VnLtqo6yqrq2FZVx7bKXWyrrKOmvpEDB/fhF1+dyJcmZH9u24bGJmbd8z4fb67glStn7FNzqEhniIn5SMzsNOA2QrMZ/sXdf99sfSrwMDAN2Aac7+7rzOxk4PdAClAH/Nzd5wfbvAEMAWqC3Zzi7lsjeR4xo7Kyc8vFob8uL+b6Z1YA8H8XHMaZh+5fM2FXGdq/N5cdN7rFde7OKys38/tXVnPJ/R8wY3wWvzhjIhOH9AXgT2+sIW/9dm6bNVVJRGJaxBKJmSUCdwInA4XAQjOb5+4fhhW7DNju7uPMbBZwC3A+UAqc6e7FZjaF0NS6w8K2+7a7d4Mqxj7KyGhfjSQjI/KxdLGaukZuenEVcxZuZOrw/vzfBYe1+EBhPDEzzjh4CCdNHMSj72/g9tc+5Yzb3+Ib03I4aeJgbnvtU86eOpSZU4e1vTORKIrkoI3TgXx3L3D3OmAOMLNZmZnAQ8H7ucBJZmbuvsTdi4Plq4BeQe2lZ7vwwlAfyN4kJ8NFF3VNPF3k0y0VnHXH2zyZt5EfnTCWp394dNwnkXCpSYlcdtxo/vXzE/necaN5fkkxP3hkEQf07cXNZ0+JdngibYpk09YwYGPY50LgyNbKuHuDmZUDmYRqJLudCyxx9/BblR4ws0bgGeA/vSd09EDoOZGHHvr8bb/NeHIydtVVXRhUZD27uJAbnltJWkoiD186nRnjs9veKE71S0vmhq9O4qKjRnH/O2s5b1oOfXvpSXWJfZGskbR0e0nzL/y9ljGzyYSau34Qtv7b7n4wMCN4tfjnt5nNNrM8M8srKSnZp8Bj1tixoedE0tK+UDNpSkqiOjmVmy+5iYphI6IUYOepqWvk2rnLuPqpZRyc04+XfzqjWyeRcCMy07jxrMkxMyikSFsimUgKgeFhn3OA4tbKmFkS0A8oCz7nAM8BF7v7mt0buHtR8G8F8DihJrQvcPd73D3X3XOzs7vRF9Dpp4eeE5k9O3R3VkIC9O1Lwg9+wOJ5b/JI/0lc9mAe1XV7uUU4xuVvreTsO9/h6UWFXH7iOB7/3pFdNpaViOy7SCaShcB4MxttZinALGBeszLzgEuC9+cB893dzaw/8BJwvbu/s7uwmSWZWVbwPhn4GrAygucQm8aODT0nUl4OjY2hf++4g+NOO5L/nTWVvPVlfO+hPHbWtt4EFqteWFrEWXe8TUnlLh787nSuOfVAkhI1/5pILIvYb6i7NwCXE7rj6iPgKXdfZWY3m9lZQbH7gEwzyweuBq4Lll8OjAN+aWZLg9cgIBV41cyWA0uBIuDeSJ1DPPraIUP54zcP5YO1ZXz9T++yfltV2xvFgLqGJn75/Ep+OmcpU4b24+UrZnzhuQoRiU16ILGbendNKT96dDFmcNe3p3H02Mxoh9SqLTtr+dGji1i8YQezjx/DtaqFi
MSE9j6QqN/WbuqYsVm88JNjyUxP4aL7FvD4gg37va/6xiZq6ho7MbrPLCjYxldvf5vVmyu481uH84szJiqJiMSZ2BxXQjrFqKx0nvvJsfzb40v4xXMr+GRLBf/x1fZ/UdfWN/L4gg3c9eYaymvqOXniYL5++DCOn5C9Z8bA/eXuPPDOOn778keMHJjG498/kgmDvzhpmIjEPiWSbq5vr2TuuySX/3plNfe9vZY1JZXccu4hDO3fu9VtausbeWzBBu5+cw0lFbs4ekwmEwZn8OLyTby0YhNZGSmcdegwzp02jMlDP3+L6q6GRrZV1rGtso6K2npSk0NT1aalJNI7OXHPaMn/8fxKXlhazMmTBvPHbx6q5yVE4pj6SHqQJxdu4D+eX0l9o3NA314cktOPQ4f359Cc/hyc04/UpITPJZBjxmaGRtUdE+pfqWto4s1PSnhmUSGvrd5CfaMzYXAGfXsls62qjtLKXVTUtu+2YzP42ckT+PEJ47p0znMRaT+N/htGieQz+Vsr+NcnpSwv3MGywnLWln52V1daSiLVdY1fSCAt2V5Vx1+XF/Pyis2YsWdo9KyMz4ZJz+iVxK6GJmrrGqmua6S6vnHP+6PGDNzr/kUk+pRIwiiRtK68up7lRTtYtnEHhdtr+PrhOUwfHbnZBEUkfsTEMPIS+/qlJTNjfHaPGX5ERDqf7rMUEZEOUSIREZEOUSIREZEOUSIREZEOUSIREZEOUSIREZEOUSIREZEOUSIREZEO6RFPtptZCbC+ldX9gPIOLG+pXPiyLKC03cHun9Zi7cxt2yq3t/VtXaPWljX/rGvZM67lvmy3v9dyX5ZH+1pG82dypLu3/bSyu/foF3BPR5a3VC58GZAXrXPozG3bKre39W1do/ZeW13LnnEt92W7/b2W+7I82tcyFn4m23qpaQte7ODylsq1tm2kdOR47d22rXJ7W9/ea9SeaxtpupadZ3+PuS/b7e+13Jfl0b6WsfAzuVc9omkrmswsz9sx6Jm0Tdey8+hadh5dS3W2d4V7oh1AN6Jr2Xl0LTtPj7+WqpGIiEiHqEYiIiIdokSyD8zsfjPbamYr92PbaWa2wszyzex2M7Owdf9mZh+b2Soz++/OjTo2ReJamtmNZlZkZkuD1xmdH3nsidTPZbD+GjNzM8vqvIhjU4R+Jn9jZsuDn8e/m9nQzo88+pRI9s2DwGn7ue1dwGxgfPA6DcDMTgRmAoe4+2Tg/3U8zLjwIJ18LQO3uvvU4PVyx0KMGw8SgWtpZsOBk4ENHYwvXjxI51/HP7j7Ie4+Ffgr8KuOBhmLlEj2gbv/CygLX2ZmY83sb2a2yMzeMrODmm9nZkOAvu7+noc6pR4Gzg5W/wj4vbvvCo6xNbJnERsidC17pAhey1uBa4Ee0ZEaievo7jvDiqbTTa+lEknH3QP8m7tPA64B/tRCmWFAYdjnwmAZwARghpktMLM3zeyIiEYb2zp6LQEuD5oS7jezAZELNeZ16Fqa2VlAkbsvi3SgMa7DP5Nm9lsz2wh8m25aI9Gc7R1gZhnAMcDTYU3LqS0VbWHZ7r9MkoABwFHAEcBTZjbGe9jtdJ10Le8CfhN8/g3wR+DSzo009nX0WppZGnADcEpkIowPnfQzibvfANxgZtcDlwO/7uRQo06JpGMSgB1B++ceZpYILAo+ziP0BZcTViQHKA7eFwLPBonjAzNrIjR2T0kkA49BHb6W7r4lbLt7CbVJ90QdvZZjgdHAsuALNAdYbGbT3X1zhGOPJZ3x+x3uceAlumEiUdNWBwTtn2vN7BsAFnKouzeGdfj+yt03ARVmdlRwN8fFwAvBbp4HvhxsPwFIIfKD6cWczriWQVv1bucA+3z3TXfQ0Wvp7ivcfZC7j3L3UYT+2Dm8hyWRzvqZHB+2y7OA1V19Hl2iIwN19bQX8ASwCagn9Mt1GaG/3P4GLAM+BH7Vyra5hL7Y1gB38NnDoCnAo8G6xcCXo32ecXwtHwFWAMsJ/
aU4JNrnGa/XslmZdUBWtM8zHq8j8EywfDmh8ayGRfs8I/HSk+0iItIhatoSEZEOUSIREZEOUSIREZEOUSIREZEOUSIREZEOUSKRHsnMKrv4eH8xs0mdtK/GYDTZlWb2opn1b6N8fzP7cWccW6Qluv1XeiQzq3T3jE7cX5K7N3TW/to41p7Yzewh4BN3/+1eyo8C/uruU7oiPul5VCMRCZhZtpk9Y2YLg9exwfLpZvaumS0J/j0wWP4dM3vazF4E/m5mJ5jZG2Y218xWm9ljwZPOBMtzg/eVwUB+y8zsfTMbHCwfG3xeaGY3t7PW9B6fDbSYYWavmdliC82NMTMo83tgbFCL+UNQ9ufBcZab2U2deBmlB1IiEfnMbYTmMzkCOBf4S7B8NXC8ux9GaPTW34VtczRwibt/Ofh8GHAlMAkYAxzbwnHSgffd/VDgX8D3w45/W3D8lsZq+pxgzKeTCD3FD1ALnOPuhwMnAn8MEtl1wBoPDenxczM7hdCcGdOBqcA0Mzu+reOJtEaDNop85ivApLCRXvuaWR+gH/BQMG6SA8lh2/zD3cPnsPjA3QsBzGwpMAp4u9lx6vhsQMlFhCaPglBS2j0fyOO0PslZ77B9LwL+ESw34HdBUmgiVFMZ3ML2pwSvJcHnDEKJ5V+tHE9kr5RIRD6TABzt7jXhC83s/4DX3f2coL/hjbDVVc32sSvsfSMt/47V+2edk62V2Zsad59qZv0IJaSfALcTmu8iG5jm7vVmtg7o1cL2BvyXu/95H48r0iI1bYl85u+E5osAwMx2Dx/eDygK3n8ngsd/n1CTGsCstgq7ezlwBXCNmSUTinNrkEROBEYGRSuAPmGbvgpcGsy3gZkNM7NBnXQO0gMpkUhPlWZmhWGvqwl9KecGHdAfAj8Myv438F9m9g6QGMGYrgSuNrMPgCFAeVsbuPsSQiPTzgIeIxR/HqHayeqgzDbgneB24T+4+98JNZ29Z2YrgLl8PtGI7BPd/isSIyw0M2GNu7uZzQIucPeZbW0nEm3qIxGJHdOAO4I7rXbQA6cJlvikGomIiHSI+khERKRDlEhERKRDlEhERKRDlEhERKRDlEhERKRDlEhERKRD/j/1pAr7m588tAAAAABJRU5ErkJggg==\n"},"metadata":{"needs_background":"light"}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learner.fit_one_cycle(2, slice(5e-6, 5e-5), moms=(0.8,0.7), pct_start=0.2, wd =(1e-7, 1e-5, 1e-4, 1e-3, 1e-2))","execution_count":35,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: left;\">\n      <th>epoch</th>\n      <th>train_loss</th>\n      <th>valid_loss</th>\n      <th>accuracy_thresh</th>\n      <th>time</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>0</td>\n      <td>0.031048</td>\n      <td>0.037301</td>\n      <td>0.980604</td>\n      <td>25:57</td>\n    </tr>\n    <tr>\n      <td>1</td>\n      <td>0.028199</td>\n      <td>0.038580</td>\n      <td>0.982683</td>\n      
<td>28:36</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{},"cell_type":"markdown","source":"We will now test our model's predictive power"},{"metadata":{"trusted":true},"cell_type":"code","source":"text = 'you are so sweet'\nlearner.predict(text)","execution_count":36,"outputs":[{"output_type":"execute_result","execution_count":36,"data":{"text/plain":"(MultiCategory ,\n tensor([0., 0., 0., 0., 0., 0.]),\n tensor([1.1777e-04, 2.4234e-05, 1.0298e-04, 9.0122e-06, 3.9043e-05, 1.2749e-05]))"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"text = 'you are pathetic piece of shit'\nlearner.predict(text)","execution_count":37,"outputs":[{"output_type":"execute_result","execution_count":37,"data":{"text/plain":"(MultiCategory toxic;obscene;insult,\n tensor([1., 0., 1., 0., 1., 0.]),\n tensor([9.9793e-01, 1.6227e-01, 9.8385e-01, 7.9275e-05, 9.7258e-01, 7.4961e-03]))"},"metadata":{}}]},{"metadata":{},"cell_type":"markdown","source":"This is awesome!\n\nWith just a few epochs, we are able to reach an accuracy of around 98% on this multi-label classification task.\n\nNow, let's see how Fastai ULMFiT fares on the same task"},{"metadata":{},"cell_type":"markdown","source":"# Fastai - ULMFiT"},{"metadata":{},"cell_type":"markdown","source":"This will have two parts:\n\n1. Training the Language Model\n2. Training the Classifier Model"},{"metadata":{},"cell_type":"markdown","source":"## Language Model\n"},{"metadata":{},"cell_type":"markdown","source":"An important thing to remember about the Language Model is that we train it without labels. 
The basic objective of training the language model is to predict the next word in a sequence of text."},{"metadata":{"trusted":true},"cell_type":"code","source":"src_lm = ItemLists(path, TextList.from_df(train, path=\".\", cols = \"comment_text\"), \n                   TextList.from_df(val, path=\".\", cols = 'comment_text'))","execution_count":38,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"data_lm = src_lm.label_for_lm().databunch(bs=32)","execution_count":39,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"data_lm.show_batch()","execution_count":40,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th>idx</th>\n      <th>text</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>0</td>\n      <td>there would be no requirement to acknowledge whether you had a previous account ( xxmaj carlos xxmaj xxunk did not have a good record ) or not and i would then remove the sockpuppet template as irrelevant . xxup wp : xxup coi permits people to edit those articles , such as msjapan does , but just means you have to be more careful in ensuring that references back your</td>\n    </tr>\n    <tr>\n      <td>1</td>\n      <td>xxmaj so it 's an obsession . xxmaj what 's your point ? xxmaj what are your obsessions doing in article space ? : that is not a good reason to revert one 's edits . 
xxmaj additionally , the deaths of xxmaj xxunk 's companions are not listed alphabetically , but also not listed by who was born first , which is why i switched the positioning of the</td>\n    </tr>\n    <tr>\n      <td>2</td>\n      <td>wiki raid threads and ruining fun , dick faggot \\n  thanks for watching wiki raid threads and ruining fun , dick faggot \\n  thanks for watching wiki raid threads and ruining fun , dick faggot \\n  thanks for watching wiki raid threads and ruining fun , dick faggot \\n  thanks for watching wiki raid threads and ruining fun , dick faggot \\n  thanks for watching</td>\n    </tr>\n    <tr>\n      <td>3</td>\n      <td>xxmaj not every hex code has an exact name . xxmaj it 's probably just called \" \" blue \" \" . xxunk \" xxbos i 'm satisfied with your conclusions ( and also that , even though you do n't say anything clearly , i can see what might be your opinion about this xxup ip ) and i will no longer make sockpuppetry accusations in edit summaries .</td>\n    </tr>\n    <tr>\n      <td>4</td>\n      <td>\" moving parts \" \" : xxmaj there are devices that contain keys and switches and xxunk , all of them sporting moving ( or movable ) parts and the devices are regarded solid - state nevertheless . 
217.237.149.206 \" xxbos i have explained the importance and have described exactly what xxmaj willard xxmaj wonky xxmaj candy - xxmaj hand xxmaj candy is , and request that you keep it</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn = language_model_learner(data_lm, AWD_LSTM, drop_mult=0.3, model_dir=\"/temp/model\")","execution_count":41,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.lr_find()\nlearn.recorder.plot(suggestion=True)","execution_count":42,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":""},"metadata":{}},{"output_type":"stream","text":"LR Finder is complete, type {learner_name}.recorder.plot() to see the graph.\nMin numerical gradient: 1.32E-02\n","name":"stdout"},{"output_type":"display_data","data":{"text/plain":"<Figure size 432x288 with 1 Axes>","image/png":"iVBORw0KGgoAAAANSUhEUgAAAYwAAAEKCAYAAAAB0GKPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3XuYnHV99/H3d8/nQ5JNsuRAIIRAQANhARGfKEWQ2CqlpRUqFlGb4oFWW33qc3ld1kcve6LWaqmkPAhUBWwF02KFJGirKOcNJBDIgSQkZJPsIXucPc/ufJ8/5t5lssxuJsnec9h8Xtc1V2Z+9++393eGYb/7u3+H29wdERGRY8nLdAAiIpIblDBERCQlShgiIpISJQwREUmJEoaIiKRECUNERFKihCEiIilRwhARkZQoYYiISEoKMh3AdJozZ44vWbIk02GIiOSMzZs3H3H3ulTqzqiEsWTJEhobGzMdhohIzjCz/anW1SUpERFJSWgJw8yWm9mWhEePmX12Qp0Pm9lLweMpM1uZcGyfmb0ctFW3QUQkw0K7JOXuO4ELAMwsHzgIrJ9Q7XXg3e7eaWZrgLuASxOOX+HuR8KKUUREUpeuMYwrgT3uftS1Mnd/KuHlM8DCNMUjIiLHKV1jGDcADx6jzseBxxJeO7DJzDab2drQIhMRkZSE3sMwsyLgg8D/maLOFcQTxrsSii9390NmNhd43Mx2uPsTSdquBdYCLF68eFpjFxGRN6Wjh7EGeMHdW5IdNLO3A3cD17p7+1i5ux8K/m0lPvZxSbL27n6Xuze4e0NdXUpTiUVE5ASkI2HcyCSXo8xsMfBj4CPuviuhvNzMKseeA1cD29IQq4hITvnZqy2s++WetJwr1IRhZmXAVcSTwljZrWZ2a/Dyy8Bs4DsTps/OA35tZluB54CfuvuGMGMVEclFG19p5r4n96XlXKGOYbh7P/GEkFi2LuH5J4BPJGm3F1g5sVxERI7W2R+lpqwwLefSSm8RkRzWPTBMbVlRWs6lhCEi
ksPUwxARkZR09Q9Tox6GiIhMxd3p6o9Sqx6GiIhMJTI0wkjMNYYhIiJT6+6PAlCtHoaIiEyls38YQD0MERGZWmfQw9AYhoiITKkr6GFolpSIiEyps2/skpR6GCIiMoWugWDQu1QJQ0REptDVH6WypICC/PT8KlfCEBHJUZ396dtHCpQwRERyVmcaV3mDEoaISM7qTuM+UqCEISKSs9K5Uy2EmDDMbHlwF72xR4+ZfXZCHTOzb5vZbjN7ycxWJRy72cxeCx43hxWniEiuSvcYRmh33HP3ncAFAGaWDxwE1k+otgZYFjwuBe4ELjWzWcBfAg2AA5vN7BF37wwrXhGRXDIyGiMyODIzehgTXAnscff9E8qvBb7ncc8ANWZWD7wPeNzdO4Ik8ThwTZpiFRHJemNrMGbiLKkbgAeTlC8ADiS8bgrKJisXERHiazCAmdXDMLMi4IPAj5IdTlLmU5Qn+/lrzazRzBrb2tpOPFARkRyS7n2kID09jDXAC+7ekuRYE7Ao4fVC4NAU5W/h7ne5e4O7N9TV1U1TyCIi2S3dO9VCehLGjSS/HAXwCPCHwWypdwDd7n4Y2AhcbWa1ZlYLXB2UiYgI6b8XBoQ4SwrAzMqAq4A/Tii7FcDd1wGPAu8HdgP9wC3BsQ4z+xrwfNDsq+7eEWasIiK55M1LUunrYYSaMNy9H5g9oWxdwnMHPj1J23uAe8KMT0QkV3X1RynIMyqKQ/01fhSt9BYRyUFjq7zNks0RCocShohIDupK8z5SoIQhIpKT4tuCpG/8ApQwRERyUld/VD0MERE5tq7+KDVpujXrGCUMEZEc1Nk/TG25ehgiIjKFgeFRhkZiaV2DAUoYIiI5JxOrvEEJQ0Qk54wlDI1hiIjIlLrHtzZXD0NERKYwvlNtuXoYIiIyBY1hiIhISsZ2qq3WGIaIiEylsz9KaWE+JYX5aT2vEoaISI7p6o+mfR8pUMIQEck5mdipFsK/414NcDdwPuDAx9z96YTjXwA+nBDLuUBdcMe9fUAEGAVG3L0hzFhFRHJFfFuQ9Pcwwr5V07eADe5+vZkVAWWJB939duB2ADP7APC5CbdivcLdj4Qco4hITunqj1JfU5r284aWMMysClgNfBTA3YeB4Sma3Ag8GFY8IiIzRddA+neqhXDHMM4E2oB7zexFM7vbzMqTVTSzMuAa4OGEYgc2mdlmM1s72UnMbK2ZNZpZY1tb23TGLyKSdWIxp6t/OO1rMCDchFEArALudPcLgT7gi5PU/QDw5ITLUZe7+ypgDfBpM1udrKG73+XuDe7eUFdXN43hi4hkn8jgCDEn7TvVQrgJowlocvdng9cPEU8gydzAhMtR7n4o+LcVWA9cElKcIiI5I1OrvCHEhOHuzcABM1seFF0JvDqxnplVA+8G/jOhrNzMKseeA1cD28KKVUQkV4zvVJuBHkbYs6RuA+4PZkjtBW4xs1sB3H1dUOc6YJO79yW0mwesN7OxGB9w9w0hxyoikvVaI0MAzK0sSfu5Q00Y7r4FmLh+Yt2EOvcB900o2wusDDM2EZFcNJ4wqorTfm6t9BYRySGtPYPkGcxO8/28QQlDRCSntPYMMbuimIL89P/6VsIQEckhLZFB5mXgchQoYYiI5JTWnqGMDHiDEoaISE5pVQ9DRESOJToao71vmDr1MEREZCpHeodwRz0MERGZWmtP5hbtgRKGiEjOaOkZBNTDEBGRY8jktiCghCEikjNaewYxgzkV6V/lDUoYIiI5ozUyxOzyzKzyBiUMEZGc0dKTuTUYoIQhIpIzWiNDzKvKzPgFKGGIiOSMlp4h5laqhyEiIlMYGY3R3jfE3JnawzCzGjN7yMx2mNl2M7tswvH3mFm3mW0JHl9OOHaNme00s91m9sUw4xQRyXZHeodxJ6M9jLBv0fotYIO7Xx/cprUsSZ1fuftvJRaYWT7wz8BVQBPwvJk94u5vuSe4iMipoDUytmhvBvYwzKwKWA18F8Ddh929K8Xm
lwC73X2vuw8DPwSuDSdSEZHs1zK+LcjMHMM4E2gD7jWzF83sbjMrT1LvMjPbamaPmdl5QdkC4EBCnaag7C3MbK2ZNZpZY1tb27S+ARGRbDGjexjEL3etAu509wuBPmDiWMQLwOnuvhL4J+A/gnJL8vM82Unc/S53b3D3hrq6uumJXEQky7T0DGV0lTeEmzCagCZ3fzZ4/RDxBDLO3XvcvTd4/ihQaGZzgraLEqouBA6FGKuISFZriwxmdJU3hJgw3L0ZOGBmy4OiK4GjBq3NbL6ZWfD8kiCeduB5YJmZnREMlt8APBJWrCIi2S7TazAg/FlStwH3B7/09wK3mNmtAO6+Drge+KSZjQADwA3u7sCImX0G2AjkA/e4+yshxyoikrUyeWvWMaEmDHffAjRMKF6XcPwO4I5J2j4KPBpedCIiuaOlZ4jz6qszGoNWeouIZLmR0RjtvUMZ72EoYYiIZLn2vmFiDnUZnFILShgiIllv7F7e8zI86K2EISKS5cbu5Z3JjQdBCUNEJOu1jK/yVg9DRESm0Dq+ylsJQ0REptAaGWR2eRGFGVzlDUoYIiJZr7VniLrKzI5fgBKGiEjWa8mCVd6ghCEikvUOdw1SX60ehoiITKFvaIT2vmEW1ia7YWl6KWGIiGSxg10DACysLc1wJEoYIiJZ7UBHPwCLZuVID8PMlppZcfD8PWb2J2ZWE25oIiLS1Jl7PYyHgVEzOwv4LnAG8EBoUYmICBDvYRQX5FGX4UV7kHrCiLn7CHAd8I/u/jmg/liNzKzGzB4ysx1mtt3MLptw/MNm9lLweMrMViYc22dmL5vZFjNrPJ43JSIyUzR1DrCwtpTg5qQZleoNlKJmdiNwM/CBoKwwhXbfAja4+/XBXfcmXoR7HXi3u3ea2RrgLuDShONXuPuRFGMUEZlxDnT2Z8X4BaTew7gFuAz4uru/bmZnAD+YqoGZVQGriV/Cwt2H3b0rsY67P+XuncHLZ4CFxxO8iMhM19Q5wKIsmFILKSYMd3/V3f/E3R80s1qg0t3/5hjNzgTagHvN7EUzu9vMyqeo/3HgscTTApvMbLOZrU0lThGRmaRnMEr3QDQrBrwh9VlSvzCzKjObBWwlngT+4RjNCoBVwJ3ufiHQB3xxkp9/BfGE8RcJxZe7+ypgDfBpM1s9Sdu1ZtZoZo1tbW2pvB0RkZzQ1BGfIZVrl6Sq3b0H+B3gXne/CHjvMdo0AU3u/mzw+iHiCeQoZvZ24G7gWndvHyt390PBv63AeuCSZCdx97vcvcHdG+rq6lJ8OyIi2e9AZ3wNRk71MIACM6sHfh/4r1QauHszcMDMlgdFVwKvJtYxs8XAj4GPuPuuhPJyM6scew5cDWxLMVYRkRlhbA1GtoxhpDpL6qvARuBJd3/ezM4EXkuh3W3A/cEMqb3ALWZ2K4C7rwO+DMwGvhNMGRtx9wZgHrA+KCsAHnD3Dam/LRGR3Hego5/yonxqylKZlBq+lBKGu/8I+FHC673A76bQbgvQMKF4XcLxTwCfSNJuL7ByYrmIyKmkqXOARbPKsmINBqQ+6L3QzNabWauZtZjZw2amKbAiIiFq6uzPmvELSH0M417gEeA0YAHwk6BMRERC4O7BKu/sGL+A1BNGnbvf6+4jweM+QFOSRERC0tUfpXdoJCd7GEfM7CYzyw8eNwHtx2wlIiInZHyGVJaswYDUE8bHiE+pbQYOA9cT3y5ERERCkG1rMCD1rUHecPcPunudu891998mvohPRERC0DSeMHKvh5HMn01bFCIicpQDHQNUlRRQXZodazDg5BJGdkwMFhGZgZqyaFvzMSeTMHzaohARkaMcCG6clE2mXOltZhGSJwYDsuudiIjMEPE1GP285+zsWr0wZcJw98p0BSIiInFHeocZjMayrodxMpekREQkBGNTamfSGIaIiIRgbNFeNk2pBSUMEZGsc6Aj+xbtgRKGiEjW2dUSYUFNKeXFqd6yKD2UMEREsszO5gjL52ff
nKNQE4aZ1ZjZQ2a2w8y2m9llE46bmX3bzHab2Utmtirh2M1m9lrwuDnMOEVEskV0NMbetj7Onpd9CSPs/s63gA3ufn1wm9aJIzhrgGXB41LgTuBSM5sF/CXxu/U5sNnMHnH3zpDjFRHJqP3tfQyPxlg+vyLTobxFaD0MM6sCVgPfBXD3YXfvmlDtWuB7HvcMUGNm9cD7gMfdvSNIEo8D14QVq4hIttjZ3AuQlT2MMC9JnQm0Afea2YtmdreZlU+oswA4kPC6KSibrFxEZEbb2RIhz2Bp3SnUwyB+uWsVcKe7Xwj0AV+cUCfZBoY+RflbmNlaM2s0s8a2traTiVdEJON2NUdYMqecksL8TIfyFmEmjCagyd2fDV4/RDyBTKyzKOH1QuDQFOVv4e53uXuDuzfU1WXXvisiIsdrV0uE5Vl4OQpCTBju3gwcMLPlQdGVwKsTqj0C/GEwW+odQLe7HwY2AlebWa2Z1QJXB2UiIjPWYHSUfe3ZOUMKwp8ldRtwfzBDai9wi5ndCuDu64BHgfcDu4F+gtu+unuHmX0NeD74OV91946QYxURyajdrb3EnKxcgwEhJwx330J8amyidQnHHfj0JG3vAe4JLzoRkeyyqyUCZOcMKdBKbxGRrLGzJUJRfh5LZmfXpoNjlDBERLLEruYIS+dWUJCfnb+aszMqEZFT0K6WXpbPy771F2OUMEREskBkMMrBrgHOztIBb1DCEBHJCrta4luCZOsaDFDCEBHJCtk+QwqUMEREssLO5gjlRfksqMmuu+wlUsIQEckCu1oiLJtXSV5esq30soMShohIFsjmPaTGKGGIiGRYa2SQI73DWT1DCpQwREQy7oX98ZuJXrCoJsORTE0JQ0Qkwxr3dVJUkMf5C6oyHcqUlDBERDKscX8nKxdWU1yQfTdNSqSEISKSQYPRUV451M1Fp8/KdCjHpIQhIpJBWw90ER11Ll5Sm+lQjkkJQ0QkgxqDAe+LTs/+hBHqDZTMbB8QAUaBEXdvmHD8C8CHE2I5F6gL7rg3ZVsRkZlg8/5OzppbQU1ZUaZDOaawb9EKcIW7H0l2wN1vB24HMLMPAJ+bcCvWSduKiOS6WMzZvL+TNefPz3QoKcmmS1I3Ag9mOggRkXTZ3dZL90A0Jy5HQfgJw4FNZrbZzNZOVsnMyoBrgIdPoO1aM2s0s8a2trZpC1xEJGyN++LjFw1Lsn+GFIR/Sepydz9kZnOBx81sh7s/kaTeB4AnJ1yOSqmtu98F3AXQ0NDgYbwJEZEwNO7vYHZ5Udbew3uiUHsY7n4o+LcVWA9cMknVG5hwOeo42oqI5KTN+zu56PRazLJ3h9pEoSUMMys3s8qx58DVwLYk9aqBdwP/ebxtRURyVVtkiP3t/TTkwPqLMWFekpoHrA8yZwHwgLtvMLNbAdx9XVDvOmCTu/cdq22IsYqIpNXm/fEr8LkyfgEhJgx33wusTFK+bsLr+4D7UmkrIjJTNO7rpLggj/NPq850KCnLpmm1IiKnjF/vPsKFi2soKsidX8O5E6mIyAyxt62XHc0Rrl6RGwv2xihhiIik2WPbmgG4JkdWeI9RwhARSbMN25q5YFENp9WUZjqU46KEISKSRgc6+nn5YHfO7B+VSAlDRCSNNr4Svxy15vz6DEdy/JQwRETS6LFtzayor2JxjmwHkkgJQ0QkTVp6BnNqO/OJlDBERNJk/HLU25QwRERkCo++fJhlcys4a25lpkM5IUoYIiJp0N47xHOvd+Ts5ShQwhARSYufvnyYmMM1OTg7aowShohIyNydB559g/MXVLHitKpMh3PClDBEREK2tambHc0Rbrh4caZDOSlKGCIiIfvhc29QWpjPtReclulQTkqoCcPM9pnZy2a2xcwakxx/j5l1B8e3mNmXE45dY2Y7zWy3mX0xzDhFRMLSOzTCI1sP8YGV9VSWFGY6nJMS5h33xlzh7kemOP4rd/+txAIzywf+GbgKaAKeN7NH3P3VEOMUEZl2
j2w5RP/wKDdektuXoyB7L0ldAux2973uPgz8ELg2wzGJiBy3B597g3PmV3LBoppMh3LSwk4YDmwys81mtnaSOpeZ2VYze8zMzgvKFgAHEuo0BWUiIjlj28FuXj7YzY2XLMbMMh3OSQv7ktTl7n7IzOYCj5vZDnd/IuH4C8Dp7t5rZu8H/gNYBiT7ZD3ZCYJEtBZg8eLc7/KJyMzx4HNvUFyQx29fMDP+3g21h+Huh4J/W4H1xC81JR7vcffe4PmjQKGZzSHeo1iUUHUhcGiSc9zl7g3u3lBXVxfCuxAROX7tvUP8x4sH+c231VNdltuD3WNCSxhmVm5mlWPPgauBbRPqzLegn2ZmlwTxtAPPA8vM7AwzKwJuAB4JK1YRken2T/+9m8GRGJ+64qxMhzJtwrwkNQ9YH+SDAuABd99gZrcCuPs64Hrgk2Y2AgwAN7i7AyNm9hlgI5AP3OPur4QYq4jItNl3pI8fPLOfD128iLPmVmQ6nGkTWsJw973AyiTl6xKe3wHcMUn7R4FHw4pvzGjM+fn2Fna39bK7tZc9rb0c7BrAzCjKz6Mg36ivLuGdS+dw+VlzWLmwGoDmnkGaOgc41DVAc88grT1DtPQMclpNKb/59nouXFQzPsjl7hzoGKA1MkhNWSE1ZUXUlBaSn2eMxpxRd/LMKMzP1klrInI8bt+4k6KCPD773mWZDmVapWMdRlbLM/jcv22hb3iUeVXFnDW3gqtWxHeTjI7GiI7G2NPWyzd/tot/eHwXpYX5DI/GGI0dPQZfWVxAXVUxP9/eynd//ToLakp59/I6DnUNsPVAF5390SnjMIMzZpdz7mlVrKivYmldBQtrS1lQU0pNWeGMmGEhcip48Y1OfvryYf70ymXMrSzJdDjT6pRPGGbGjz91OfU1JVRNsQqzo2+Yp/e08/y+DiqKC1hQW8rC2lJOqyllflUJ5cXxj7J7IMrPXm3hpy8f5scvNHH6rHKuWjGPlYtqWFBTSvdAlK7+KJ39w8Qc8s3Iz4OhkRg7myO81NTFT186fNS5y4vyufTM2Vy1Yh5Xnjt3/EvYNzRCS88gZsas8iKqSgqUWEQyyN3560d3MKeimLWrz8x0ONPO4kMGM0NDQ4M3Nr5lB5Kc0zMYZf+Rfg529XOwa5DXj/Tyi51tNHUOALBoVimdfVF6h0aOaleYH08c86tKqK8uZX51CQtrS1k2r5Kz51Uwv6okYwklOhqjqz9K39AIhQV5lBTkUVyYT3FBHgV5pkQnM8KmV5pZ+/3NfP268/nwpadnOpyUmNlmd29Ipe4p38PIRlUlhbxtYTVvC8ZLIP6Xy86WCI+/0sLOlgh1lcXMqyphXlUx7vEeUHvfMO29QxzuHmRPWy9P7j5CJCGpVBQXUFteSFF+HsUF+ZQV5Y//nLlVxRhGayQ+HnOkd4iCfKOsqIDyonzKiwuYVV7ErPIiasuKGBoZjY/bRAZpiwzRNzRK//AI/cOjDI/EGHWPj8/EnMjgyFuSWyIzKC6Ix1RXWUx9dQn11SXMryphTmUxcyrij6V15cyuKA71sxc5UZHBKF955BWWza3gQw2Ljt0gBylh5Agz45z5VZwz//j20m/vHWJ3ay+vtcYH9XsGogyNxBgaGaV3aIRdLRF+vfsIkcH4L/SK4gLmVsV/QQ9GY7T39tM/PEpkMErXQJSJHdLaskLqKoupKC6grKiA2RXFFBfkkZ9n5JuRl2dUlhRQU1pEbXkhFcUFREdjDI3EGIyOMhSNMTwaY3gkxkB0lLbIEIe6B9nZ3EZb79Bbznf67DJWLa7lgkU1nDW3giVzyqmvKiEvL/weirurJyST+tsNOzjcM8jDn3wnBTN0AosSxgw3u6KY2RXFXHrm7Cnr9Q+P4M74WEwyozGnZyBKR/8wRfl51FUWU1KYP90hjxsZjdHRP8yRyDBtvUNsP9zDi2908uvdR1j/4sHxesUFeSyaVTbeK5lfXUJtWRGVJQVUlhRSXpxPzCHmTizm
9A2P0tk3TEffMF39w/Qm9I6GRkaJORDU7xsepWcgSvdAlL7hEUoK4r2typIC5lUV03D6LBqW1LLq9Nopx8BkZntqzxF+8MwbfOJdZ7BqcW2mwwmNxjAk57g7LT1D7D3Sy74j/bx+pJcDHfHpzc3dg7RGBoml+LWuLi0MekfxS3TFBfmYQZ4ZZlBWlE9VaeF4vcHoKL1Do/QNjbCvvY9XDvUwGnPM4KLFtbzvvPm877z5LJ5dFtr7j8WctuDS4+zyIhbUlKalhyXJ9Q+PcM0//oo8g8f+dDWlReH9ERUGjWHIjGZmzK+O9yTeufStx0dGY/QOjRAZHKFnMEr/8Ch5QRLIM6O0KJ9Z5fG1MCd76aBvaIQtB7p4dm87j29v5euPbufrj25naV059dWl4+M+i2aVsaI+PmU62TYRvUMj8WTXM0jPYJTBaPwSXd/QCK2R+Bqflp5BDncPcrhrkOHR2HjbsqJ8zppbwVlzKzhnfmVw6bKSusri1C+h7dkD3/gG/OAH0NsLFRVw003w538OS5N8yDLu7zfu4o2Ofv5t7TtyLlkcL/UwRKbRG+39bHylmWdf76C9byg+GaF3+KhB//rqEgrz8xiNOdHRGP3Do1NOCiguyGNeVfxy27zqEhbUlLKgNj6duy0yxGutEV5r6WVXS4TWyNB4u1nlRZxbXxlPVKdVcdHiWSyaVfrWJPLYY3D99RCNxh9jCgvjj4cegjVrpu0zmkn+e0cLH//XRj7yjtP56rXnZzqcE3I8PQwlDJE0aI0Msv1whO2He9jVEiEWcwry41OKSwrz4z2mqhLmVZVQXVpIaVE+pYX5lBblH9f6mo6+YXY097CzOX6u7Ycj7GyJMDwS75HUV5fwjjNnc9HptZxbX8U5fa2UX7wK+vsn/6FlZfDSS+ppTPBSUxcf+pdnWDq3nH9be9mU43/ZTAlDRMaNjMbY3dbL86938MzrHTy7t50jvcMAfG3Td7hh60YKY6OT/4DCQli7Fu5IuovPKemN9n5+584nKSnM58efemdOr+hWwhCRSbk7TZ0D7GyOsPripRT19x27UVUVdHeHH1wO6Ogb5nfvfIrO/mEe/uQ7WVqX25sLatBbRCZlZiyaVcaiWWUwMMWlqAQeiTAUHQ11GnUuaIsM8bH7nudQ1wD3f+LSnE8Wx0sJQ+RUVlEBkcgxq/UWlnLhX27k7HmVXHLGLFafPYdLz5ids9ftT8Sulgi33Ps8HX3D3HnTKhqWzMp0SGl36vzXFpG3uukmuPvuo2dHTeCFhbRf9/v80eoz2Xqgiwefe4P7ntpHYb6xanEtV62Yx/vOmx/vscxQT+xq49P3v0BpUT7//seXHbVtz6lEYxgip7I9e+Dtbz+uWVKD0VEa93Xyq91t/GJHGztb4j2UFfVVvGd5HasWx1e+zyovSsc7CJW7c8+T+/irR7ezbG4F93z0Yk6rKc10WNMqawa9zWwfEAFGgZGJQZnZh4G/CF72Ap90962ptE1GCUPkBJzkOox9R/rY9GozG19pYcuBrvF7xZw5p5zffHs9v9+wKCd7H939Ub7w0FY2vdrCVSvm8c0PXUDFDLwEl20Jo8Hdj0xy/J3AdnfvNLM1wFfc/dJU2iajhCFygvbsgW9+E77//TdXen/kI/C5zx3X+ouB4VFeaupi8xudPL2nnV/vPoI7XH7WbK5duYC3LazmrLkVWX93yRff6OQzD7xIS88gX1xzDh9/1xkzduPJnEkYE+rWAtvcfcHxth2jhCGSXQ52DfBQYxP/3niAg13x+7kU5edx9vwKLlkym/eumMvFS2ZlTQJ5qamLdb/cw2PbmjmtupQ7/uBCLpzBmwlCdiWM14FOwIF/cfe7pqj7eeAcd//E8bYdo4Qhkp1iMWdPWy+vHu7h1cM9vHKwh+f2dTA8EqOqpIDVZ9dxwaIaVtRXcW59FbVpHP8YjI7yy11tfO/pfTy5u53KkgJuesfp3Lp6adJ9v2aabEoYp7n7ITObCzwO
3ObuTySpdwXwHeBd7t5+nG3XAmsBFi9efNH+/ftDez8iMn36h0f41WtH+Pn2Fp7YdYTmnsHxY0vs+X8sAAAJpUlEQVRml/Hus+t49/I6LjtzzrRu6jcYHaWpc4AdzT1s2NbM/+xopW94lLmVxXz8XWfwB5cupvIU2qo+axLGUScy+wrQ6+5/P6H87cB6YI277zqethOphyGSu9p7h9h+OMIrh7p59vUOnt7TzkB0dPwS1rK5lZw1t4Iz5pRTVVJIRUkBFcUFVJcWUluWfOfh7v4oL7zRSeP+Dl7Y38Wett6jNmicXV7E1efNZ83587ls6eysuTSWTlmRMMysHMhz90jw/HHgq+6+IaHOYuC/gT9096eOp20yShgiM8f49N3X2tjeHGF3S4RD3YOT1q8pK6SmtJCRmDMYjTEUHR2/RXF+nnHeaVUsn1fJwtoyFs0qZcmcclYurCH/FL+XSLZsDTIPWB/MLCgAHnD3DWZ2K4C7rwO+DMwGvhPUG5s+m7RtiLGKSJYpKcznXcvm8K5lc8bLIoNRDnQM0Ds0Qt9Q/H4n3QNR2nuDOygORCnMM4oL8ykpzGNORTEXLq7hgkU1lBXNvCmx6aaFeyIip7Dj6WGcehfsRETkhChhiIhISpQwREQkJUoYIiKSEiUMERFJiRKGiIikRAlDRERSooQhIiIpmVEL98ysG3gtyaFqoDvF12PPk5XNAVLebn2Sc6V6PFl5spgme34yMU8VV6rx5UrMycpz8fuRSsyJz/X9SP34TP9+LHP31O456+4z5gHclUr5VK/Hnk9S1jhdMR1vzJPFdKz4TyTmE407F2OeKd+PVGLO9Get70f2fz+O9Zhpl6R+kmL5VK9/MkXZdMZ0rOPJyieL6Vjxn4gTiTsXY05Wnovfj1RiTnyu70fqx0+l78eUZtQlqbCZWaOnuOdKtlDM6ZOLcSvm9MnVuBPNtB5G2I55178spJjTJxfjVszpk6txj1MPQ0REUqIehoiIpOSUTRhmdo+ZtZrZthNoe5GZvWxmu83s2xbc6Sk4dpuZ7TSzV8zs77I9ZjP7ipkdNLMtweP92R5zwvHPm5mb2ZzJfsaJCumz/pqZvRR8zpvM7LQciPl2M9sRxL3ezGpyIObfC/7/i5nZtI0ZnEysk/y8m83steBxc0L5lN/7jDqR6Wkz4QGsBlYB206g7XPAZYABjxG/HznAFcDPgOLg9dwciPkrwOdz6XMOji0CNgL7gTm5EDdQlVDnT4B1ORDz1UBB8Pxvgb/NgZjPBZYDvwAaMh1rEMeSCWWzgL3Bv7XB89qp3lc2PE7ZHoa7PwF0JJaZ2VIz22Bmm83sV2Z2zsR2ZlZP/H/8pz3+X/d7wG8Hhz8J/I27DwXnaM2BmEMVYszfBP43EMogXBhxu3tPQtXy6Y49pJg3uftIUPUZYGEOxLzd3XdOZ5wnE+sk3gc87u4d7t4JPA5ck8n/V1NxyiaMSdwF3ObuFwGfB76TpM4CoCnhdVNQBnA28L/M7Fkz+6WZXRxqtHEnGzPAZ4JLDveYWW14oY47qZjN7IPAQXffGnagE5z0Z21mXzezA8CHid/TPmzT8f0Y8zHif/GGbTpjDlsqsSazADiQ8Hos/mx5X0nprugBM6sA3gn8KOGSYXGyqknKxv5SLCDevXwHcDHw72Z2ZvCXwrSbppjvBL4WvP4a8A3ivxhCcbIxm1kZ8CXil0rSZpo+a9z9S8CXzOz/AJ8B/nKaQ30zkGmKOfhZXwJGgPunM8a3BDKNMYdtqljN7BbgT4Oys4BHzWwYeN3dr2Py+DP+vqaihPGmPKDL3S9ILDSzfGBz8PIR4r9gE7vlC4FDwfMm4MdBgnjOzGLE949py9aY3b0lod3/A/4rpFjHnGzMS4EzgK3B/6QLgRfM7BJ3b87iuCd6APgpISYMpinmYED2t4Arw/rjJ8F0f85hShorgLvfC9wLYGa/AD7q7vsSqjQB70l4vZD4WEcTmX9fk8v0IEomH8AS
EgawgKeA3wueG7ByknbPE+9FjA1KvT8ovxX4avD8bOJdTsvymOsT6nwO+GG2f84T6uwjhEHvkD7rZQl1bgMeyoGYrwFeBerC+IzD/H4wzYPeJxorkw96v078ikRt8HxWqt/7TD0yHkDG3jg8CBwGosSz+seJ/+W6Adga/E/y5UnaNgDbgD3AHby5ALII+EFw7AXgN3Ig5u8DLwMvEf/LrT7bY55QZx/hzJIK47N+OCh/ifj+PQtyIObdxP/w2RI8pntmVxgxXxf8rCGgBdiYyVhJkjCC8o8Fn+9u4Jbj+d5n6qGV3iIikhLNkhIRkZQoYYiISEqUMEREJCVKGCIikhIlDBERSYkShsxoZtab5vPdbWYrpulnjVp8Z9ttZvaTY+0Ua2Y1Zvap6Ti3SDKaViszmpn1unvFNP68An9zM75QJcZuZv8K7HL3r09RfwnwX+5+fjrik1OPehhyyjGzOjN72MyeDx6XB+WXmNlTZvZi8O/yoPyjZvYjM/sJsMnM3mNmvzCzhyx+r4j7x+5ZEJQ3BM97g80Gt5rZM2Y2LyhfGrx+3sy+mmIv6Gne3Hyxwsx+bmYvWPy+CdcGdf4GWBr0Sm4P6n4hOM9LZvZ/p/FjlFOQEoacir4FfNPdLwZ+F7g7KN8BrHb3C4nvJPtXCW0uA252998IXl8IfBZYAZwJXJ7kPOXAM+6+EngC+KOE838rOP8x9wkK9lG6kvhKfIBB4Dp3X0X8HizfCBLWF4E97n6Bu3/BzK4GlgGXABcAF5nZ6mOdT2Qy2nxQTkXvBVYk7DBaZWaVQDXwr2a2jPgOoYUJbR5398R7ITzn7k0AZraF+B5Dv55wnmHe3MxxM3BV8Pwy3rzHwQPA308SZ2nCz95M/J4JEN9j6K+CX/4x4j2PeUnaXx08XgxeVxBPIE9Mcj6RKSlhyKkoD7jM3QcSC83sn4D/cffrgvGAXyQc7pvwM4YSno+S/P+lqL85SDhZnakMuPsFZlZNPPF8Gvg28Xtp1AEXuXvUzPYBJUnaG/DX7v4vx3lekaR0SUpORZuI34sCADMb2566GjgYPP9oiOd/hvilMIAbjlXZ3buJ39L182ZWSDzO1iBZXAGcHlSNAJUJTTcCHwvu24CZLTCzudP0HuQUpIQhM12ZmTUlPP6M+C/fhmAg+FXi29ID/B3w12b2JJAfYkyfBf7MzJ4D6oHuYzVw9xeJ74h6A/GbGDWYWSPx3saOoE478GQwDfd2d99E/JLX02b2MvAQRycUkeOiabUiaRbcNXDA3d3MbgBudPdrj9VOJNM0hiGSfhcBdwQzm7oI8Za4ItNJPQwREUmJxjBERCQlShgiIpISJQwREUmJEoaIiKRECUNERFKihCEiIin5/7d5gITqKI7vAAAAAElFTkSuQmCC\n"},"metadata":{"needs_background":"light"}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.fit_one_cycle(1, max_lr=slice(5e-4, 5e-3), moms=(0.8, 0.7), pct_start=0.2, wd =(1e-7, 1e-5, 1e-4, 1e-3))","execution_count":43,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: left;\">\n      <th>epoch</th>\n      <th>train_loss</th>\n      <th>valid_loss</th>\n      <th>accuracy</th>\n      <th>time</th>\n    </tr>\n  </thead>\n  
<tbody>\n    <tr>\n      <td>0</td>\n      <td>4.183220</td>\n      <td>3.954616</td>\n      <td>0.318500</td>\n      <td>11:19</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.save('fit_head')\nlearn.load('fit_head')","execution_count":44,"outputs":[{"output_type":"execute_result","execution_count":44,"data":{"text/plain":"LanguageLearner(data=TextLMDataBunch;\n\nTrain: LabelList (127656 items)\nx: LMTextList\nxxbos xxmaj grandma xxmaj terri xxmaj should xxmaj burn in xxmaj trash \n  xxmaj grandma xxmaj terri is trash . i hate xxmaj grandma xxmaj terri . xxup xxunk her to xxup hell ! 71.74.76.40,xxbos , 9 xxmaj may 2009 ( xxup utc ) \n  xxmaj it would be easiest if you were to admit to being a member of the involved xxmaj portuguese xxmaj lodge , and then there would be no requirement to acknowledge whether you had a previous account ( xxmaj carlos xxmaj xxunk did not have a good record ) or not and i would then remove the sockpuppet template as irrelevant . xxup wp : xxup coi permits people to edit those articles , such as msjapan does , but just means you have to be more careful in ensuring that references back your edits and that xxup npov is upheld . 20:29,xxbos \" \n \n  xxmaj the xxmaj objectivity of this xxmaj discussion is doubtful ( non - existent ) \n \n  ( 1 ) xxmaj as indicated earlier , the section on xxmaj marxist leaders ’ views is misleading : \n \n  ( a ) it lays unwarranted and excessive emphasis on xxmaj trotsky , creating the misleading impression that other prominent xxmaj marxists ( xxmaj marx , xxmaj engels , xxmaj lenin ) did not advocate and / or practiced terrorism ; \n \n  ( b ) it lays unwarranted and excessive emphasis on the theoretical “ rejection of individual terrorism ” , creating the misleading impression that this is the main ( only ) xxmaj marxist position on terrorism . 
\n \n  ( 2 ) xxmaj the discussion is not being properly monitored : \n \n  ( a ) no discernible attempt is being made to establish and maintain an acceptable degree of objectivity ; \n \n  ( b ) important and relevant scholarly works such as the xxmaj international xxmaj encyclopedia of xxmaj terrorism are being ignored or xxunk excluded from the discussion ; \n \n  ( c ) though the only logical way to remedy the blatant imbalance in the above section is to include quotes by / on other leaders who are known to have endorsed and practiced terrorism all attempts to do so have been systematically blocked with impunity by the apologists for xxmaj marxist terrorism who have done their best to sabotage and wreck both the article and the discussion . \n \n  ( 3 ) xxmaj among the tactics deployed by the apologist wreckers and xxunk the following may be identified as representative examples : \n \n  ( a ) it is claimed that xxmaj marx and xxmaj engels did not advocate terrorism despite the fact that scholarly works like the xxmaj international xxmaj encyclopedia of xxmaj terrorism show that they did , and xxmaj marx himself was known as “ xxmaj the xxmaj red xxmaj terror xxmaj doctor ” ; \n \n  ( b ) it is claimed that xxmaj marx and xxmaj engels were not involved in terrorist activities despite the fact that numerous sources from xxmaj the xxmaj neue xxmaj xxunk xxmaj zeitung to xxmaj isaiah xxmaj berlin and xxmaj francis xxmaj xxunk state otherwise ; \n \n  ( c ) it is claimed that xxmaj lenin does not refer to terror in xxmaj the xxmaj proletarian xxmaj revolution and the xxmaj renegade xxup k. xxmaj xxunk and other works / statements despite the fact that xxmaj robert xxmaj service , xxup iet , and other scholarly and reliable sources state that he does ; \n \n  ( d ) it is claimed that the xxmaj russian word ‘ ’ xxunk ’ ’ does not mean “ terror ” when : \n \n  i. the xxmaj oxford xxmaj russian xxmaj dictionary says that it does ; \n \n  ii . 
it is evident from the context that this is the case ; \n \n  iii . any educated xxmaj russian speaker can confirm that xxunk may mean “ terror ” depending on the context ; \n \n  ( e ) it is claimed that xxmaj marxism is “ scientific ” when in fact : \n \n  i. xxmaj marx was not a scientist ; \n \n  ii . xxmaj marx ’s background was philosophy and law , not science ; \n \n  iii . xxmaj marxism is not recognized as a science by the academic world ; \n \n  iv . virtually every one of xxmaj marx ’s predictions turned out to be wrong , as became increasingly apparent during his lifetime and xxunk so after his death ( xxup r. xxmaj pipes , xxmaj communism : a xxmaj brief xxmaj history , 2001 , p. 15 ) from which it follows that xxmaj marxism does not qualify as a scientific system by any accepted standards ; \n \n  v. the evidence indicates that xxmaj marxism is closer to a religious sect than to science proper ; \n \n  ( f ) apologist literature is being quoted in a fraudulent attempt to whitewash xxmaj marxist terrorism , in effect turning the discussion into an advertisement for terrorism ; \n \n  ( g ) it is claimed that xxmaj marxist terrorism is not rooted in the xxmaj marxist theory of class struggle even though there are numerous sources showing that it is ( please note that it is immaterial whether terrorism had already been justified in terms of a theory of class prior to xxmaj marx , the point being that it was advocated / practiced on the basis of xxmaj marxist class - struggle theories xxup by xxup marxists ) : \n \n  “ xxmaj karl xxmaj marx felt that terror was a necessary part of a revolutionary strategy ” ( xxmaj peter xxmaj xxunk , “ xxmaj theories of xxmaj terror in xxmaj urban xxmaj xxunk ” , xxup iet , p. 
138 ) ; \n \n  “ xxmaj revolutionary terrorism has its roots in a political ideology , from the xxmaj marxist - xxmaj leninist thinking of the xxmaj left , to the fascists found on the xxmaj right ” ( xxmaj xxunk xxmaj gal - xxmaj or , \" \" xxmaj revolutionary xxmaj terrorism \" \" , xxup iet , p. 203 ) ; \n \n  “ … perhaps the most important key to xxmaj stalin ’s motivation lies in the realm of ideology . xxmaj the xxunk of xxmaj soviet communist ideology in the 1920s and 1930s was class struggle – the xxunk antagonism between mutually incompatible economic interest groups ” ( xxmaj geoffrey xxmaj robert , xxmaj stalins xxmaj wars , 2006 , pp . 17 - 18 ) ; \n \n  this fact is supported not only by reliable academic sources , but by elementary logic : \n \n  “ xxmaj in 1907 xxmaj xxunk published in the magazine ‘ ’ xxmaj neue xxmaj zeit ( xxmaj vol . xxup xxv 2 , p. 164 ) extracts from a letter by xxmaj marx to xxmaj xxunk dated xxmaj march 5 , 1852 . xxmaj in this letter , among other things , is the following noteworthy observation : … class struggle necessarily leads to the dictatorship of the proletariat … ”,xxbos xxmaj shelly xxmaj shock \n  xxmaj shelly xxmaj shock is . . . ( ),xxbos i do not care . xxmaj refer to xxmaj ong xxmaj teng xxmaj cheong talk page . xxmaj is xxmaj la goutte de pluie writing a biography or writing the history of trade unions . xxmaj she is making use of the dead to push her agenda again . xxmaj right before elections too . xxmaj how timely . xxunk\ny: LMLabelList\n,,,,\nPath: .;\n\nValid: LabelList (31915 items)\nx: LMTextList\nxxbos xxmaj geez , are you xxunk ! xxmaj we 've already discussed why xxmaj marx was not an anarchist , i.e. he wanted to use a xxmaj state to mold his ' socialist man . ' xxmaj ergo , he is a statist - the opposite of an anarchist . i know a guy who says that , when he gets old and his teeth fall out , he 'll quit eating meat . 
xxmaj would you call him a vegetarian ?,xxbos xxmaj xxunk xxup rfa \n \n  xxmaj thanks for your support on my request for adminship . \n \n  xxmaj the final outcome was ( 31 / 4 / 1 ) , so i am now an administrator . xxmaj if you have any comments or concerns on my actions as an administrator , please let me know . xxmaj thank you !,xxbos \" \n \n  xxmaj birthday \n \n  xxmaj no worries , xxmaj it 's what i do ; ) xxmaj enjoy ur xxunk \",xxbos xxmaj pseudoscience category ? \n \n  i 'm assuming that this article is in the pseudoscience category because of its association with creationism . xxmaj however , there are modern , scientifically - accepted variants of xxunk that have nothing to do with creationism — and they 're even mentioned in the article ! i think the connection to pseudoscience needs to be clarified , or the article made more general and less creationism - specific and the category tag removed entirely .,xxbos ( and if such phrase exists , it would be provided by search engine even if mentioned page is not available as a whole )\ny: LMLabelList\n,,,,\nPath: .;\n\nTest: None, model=SequentialRNN(\n  (0): AWD_LSTM(\n    (encoder): Embedding(57520, 400, padding_idx=1)\n    (encoder_dp): EmbeddingDropout(\n      (emb): Embedding(57520, 400, padding_idx=1)\n    )\n    (rnns): ModuleList(\n      (0): WeightDropout(\n        (module): LSTM(400, 1150, batch_first=True)\n      )\n      (1): WeightDropout(\n        (module): LSTM(1150, 1150, batch_first=True)\n      )\n      (2): WeightDropout(\n        (module): LSTM(1150, 400, batch_first=True)\n      )\n    )\n    (input_dp): RNNDropout()\n    (hidden_dps): ModuleList(\n      (0): RNNDropout()\n      (1): RNNDropout()\n      (2): RNNDropout()\n    )\n  )\n  (1): LinearDecoder(\n    (decoder): Linear(in_features=400, out_features=57520, bias=True)\n    (output_dp): RNNDropout()\n  )\n), opt_func=functools.partial(<class 'torch.optim.adam.Adam'>, betas=(0.9, 0.99)), loss_func=FlattenedLoss of 
CrossEntropyLoss(), metrics=[<function accuracy at 0x7f79cc28da60>], true_wd=True, bn_wd=True, wd=0.01, train_bn=True, path=PosixPath('../input'), model_dir='/temp/model', callback_fns=[functools.partial(<class 'fastai.basic_train.Recorder'>, add_time=True, silent=False)], callbacks=[RNNTrainer\nlearn: LanguageLearner(data=TextLMDataBunch;\n\nTrain: LabelList (127656 items)\nx: LMTextList\nxxbos xxmaj grandma xxmaj terri xxmaj should xxmaj burn in xxmaj trash \n  xxmaj grandma xxmaj terri is trash . i hate xxmaj grandma xxmaj terri . xxup xxunk her to xxup hell ! 71.74.76.40,xxbos , 9 xxmaj may 2009 ( xxup utc ) \n  xxmaj it would be easiest if you were to admit to being a member of the involved xxmaj portuguese xxmaj lodge , and then there would be no requirement to acknowledge whether you had a previous account ( xxmaj carlos xxmaj xxunk did not have a good record ) or not and i would then remove the sockpuppet template as irrelevant . xxup wp : xxup coi permits people to edit those articles , such as msjapan does , but just means you have to be more careful in ensuring that references back your edits and that xxup npov is upheld . 20:29,xxbos \" \n \n  xxmaj the xxmaj objectivity of this xxmaj discussion is doubtful ( non - existent ) \n \n  ( 1 ) xxmaj as indicated earlier , the section on xxmaj marxist leaders ’ views is misleading : \n \n  ( a ) it lays unwarranted and excessive emphasis on xxmaj trotsky , creating the misleading impression that other prominent xxmaj marxists ( xxmaj marx , xxmaj engels , xxmaj lenin ) did not advocate and / or practiced terrorism ; \n \n  ( b ) it lays unwarranted and excessive emphasis on the theoretical “ rejection of individual terrorism ” , creating the misleading impression that this is the main ( only ) xxmaj marxist position on terrorism . 
CrossEntropyLoss(), metrics=[<function accuracy at 0x7f79cc28da60>], true_wd=True, bn_wd=True, wd=0.01, train_bn=True, path=PosixPath('../input'), model_dir='/temp/model', callback_fns=[functools.partial(<class 'fastai.basic_train.Recorder'>, add_time=True, silent=False)], callbacks=[...], layer_groups=[Sequential(\n  (0): WeightDropout(\n    (module): LSTM(400, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 400, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): Embedding(57520, 400, padding_idx=1)\n  (1): EmbeddingDropout(\n    (emb): Embedding(57520, 400, padding_idx=1)\n  )\n  (2): LinearDecoder(\n    (decoder): Linear(in_features=400, out_features=57520, bias=True)\n    (output_dp): RNNDropout()\n  )\n)], add_time=True, silent=None)\nalpha: 2.0\nbeta: 1.0], layer_groups=[Sequential(\n  (0): WeightDropout(\n    (module): LSTM(400, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 400, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): Embedding(57520, 400, padding_idx=1)\n  (1): EmbeddingDropout(\n    (emb): Embedding(57520, 400, padding_idx=1)\n  )\n  (2): LinearDecoder(\n    (decoder): Linear(in_features=400, out_features=57520, bias=True)\n    (output_dp): RNNDropout()\n  )\n)], add_time=True, silent=None)"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.unfreeze()\nlearn.lr_find()\nlearn.recorder.plot(suggestion=True)","execution_count":45,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":""},"metadata":{}},{"output_type":"stream","text":"LR Finder is complete, type 
{learner_name}.recorder.plot() to see the graph.\nMin numerical gradient: 6.31E-07\n","name":"stdout"},{"output_type":"display_data","data":{"text/plain":"<Figure size 432x288 with 1 Axes>","image/png":"iVBORw0KGgoAAAANSUhEUgAAAYUAAAEKCAYAAAD9xUlFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3XmUHXWd9/H3t/v2vqWT7oSshLAKiixNJDI6IIjCgzAMqNFBER0jrqOO+ujR4+PDjOMu4nCGDA+LoIICioMKCKiI47B1NsAAWQgk6Wyd7nSn97t9nz+qurk0vSXpukv353VOna5b9auqb9+k7/f+lvqVuTsiIiIARbkOQERE8oeSgoiIDFFSEBGRIUoKIiIyRElBRESGKCmIiMgQJQURERmipCAiIkMiTQpm9hkz+6uZPWNmt5tZ+bD9ZWb2czPbZGaPm9niKOMREZGxxaI6sZnNBz4FHO/ufWZ2B7Ac+FFGsQ8B+9z9KDNbDnwLePdY521oaPDFixdHE7SIyBS1atWqve7eOF65yJJCxvkrzCwBVAI7hu2/CPhauH4XcK2ZmY8x98bixYtpbm6OIlYRkSnLzF6aSLnImo/cvQX4LrAV2Al0uvsDw4rNB7aF5ZNAJzArqphERGRskSUFM6snqAkcAcwDqszssuHFRjj0VbUEM1thZs1m1tza2jr5wYqICBBtR/M5wBZ3b3X3BPBL4I3DymwHFgKYWQyoA9qHn8jdr3f3Jndvamwct0lMREQOUpRJYStwuplVmpkBZwPPDitzD3B5uH4p8Iex+hNERCRaUfYpPE7QebwaeDq81vVmdpWZXRgWuxGYZWabgM8CX4wqHhERGZ8V2hfzpqYm1+gjEZEDY2ar3L1pvHK6o1lERIYoKYiIFIBrHtrInzdGP/pSSUFEJM+l0s41v9/AE1teNThz0ikpiIjkuY7eOGmHWVWlkV9LSUFEJM+198QBmFVdFvm1lBRERPLc3u4wKaimICIibT0DgGoKIiICtA3WFKpVUxARmfbaugcwg/pKJQURkWlvb0+cmZWlFBeNNLH05FJSEBHJc+3dcWZmoZMZlBRERPJeW89AVvoTQElBRCTvtXXHszLyCJQURETy3t7uARrUfCQiIvFkmv39SdUUREQE9vVm7x4FUFIQEclre7vDu5kLvfnIzI41s7UZy34z+/SwMmeaWWdGma9GFY+ISCF6+W7m7DQfxaI6sbs/D5wEYGbFQAtw9whF/+zuF0QVh4hIIRua96jQawrDnA1sdveXsnQ9EZEpIds1hWwlheXA7aPsW2Zm68zsPjM7IUvxiIgUhLaeOCXFRm15ZA07rxB5UjCzUuBC4M4Rdq8GDnf31wP/DvxqlHOsMLNmM2tubY3+GaUiIvmirXuAmVWlmEU/7xFkp6ZwHrDa3XcP3+Hu+929O1y/Fygxs4YRyl3v7k3u3tTY2Bh9xCIieaKtO86squw0HUF2ksJ7GKXpyMwOszD9mdnSMJ62LMQkIlIQ9vbEs3aPAkQ4+gjAzCqBtwIfydh2JYC7rwQuBT5qZkmgD1ju7h5lTCIihaSte4AlDVVZu16kScHde4FZw7atzFi/Frg2yhhERApZe088a8NRQXc0i4jkrd54kt54KmvDUUFJQUQkbw3do6CagoiItPVkdzI8UFIQEclbbYOT4an5SEREhmoKaj4SEZGX5z1SUhARmfbaugeoKCmmsjQ78x6BkoKISN5qy/LdzKCkICKSt/Z2D2S1kxmUFERE8lZ7T5yGLHYyg5KCiEjea
utW85GIiADuTluPmo9ERATY358kkfKs3qMASgoiInnp5buZlRRERKa9l+9mVvORiMi0l4u7mSHCpGBmx5rZ2oxlv5l9elgZM7MfmtkmM3vKzE6JKh4RkULS1hM0HzVkuaM5snun3f154CQAMysGWoC7hxU7Dzg6XN4AXBf+FBGZ1gZrCvWVU6SmMMzZwGZ3f2nY9ouAWz3wGDDDzOZmKSYRkbzV1j1AbXmM0lh2W/mzdbXlwO0jbJ8PbMt4vT3cJiIyrbV2D9BQk92mI8hCUjCzUuBC4M6Rdo+wzUc4xwozazaz5tbW1skOUUQk77R09DN/RkXWr5uNmsJ5wGp33z3Cvu3AwozXC4Adwwu5+/Xu3uTuTY2NjRGFKSKSP3Z29DGvbmomhfcwctMRwD3A+8NRSKcDne6+MwsxiYjkrYFkij1dA8ydUZ71a0f65AYzqwTeCnwkY9uVAO6+ErgXOB/YBPQCV0QZj4hIIdjdGQxHnZeD5qNIk4K79wKzhm1bmbHuwMejjEFEpNDs6OwDmLJ9CiIicgB2dARJYW5d9puPlBRERPLMYFLIRfORkoKISJ5p6ehnVlUp5SXFWb+2koKISJ7Z2dmXk1oCKCmIiOSdHR19OelPACUFEZG84u607FNNQURECB7D2RNP5WQ4KigpiIjklZ3hPQq5uJsZlBRERPJKLoejgpKCiEheaenoB3JzNzMoKYiI5JUdHX2UFBuNWX4M5yAlBRGRPLKzo485teUUFY30uJnoKSmIiOSRHR39OetPACUFEZG80tLRl7P+BFBSEBHJG6m0s2t/f87uZgYlBRGRvNHaNUAq7Wo+EhGRoOkIcjccFSJOCmY2w8zuMrPnzOxZM1s2bP+ZZtZpZmvD5atRxiMiks9yfeMaRPw4TuAa4H53v9TMSoHKEcr82d0viDgOEZG8N/TEtRxNcQERJgUzqwXeDHwAwN3jQDyq64mIFLqdnf3UlMWoLS/JWQxRNh8tAVqBm81sjZndYGZVI5RbZmbrzOw+MzshwnhERPJaS0fupsweFGVSiAGnANe5+8lAD/DFYWVWA4e7++uBfwd+NdKJzGyFmTWbWXNra2uEIYuI5M6Ojr6cNh1BtElhO7Dd3R8PX99FkCSGuPt+d+8O1+8FSsysYfiJ3P16d29y96bGxsYIQxYRyZ0dU7mm4O67gG1mdmy46WxgfWYZMzvMzCxcXxrG0xZVTCIi+aovnmJfbyKnw1Eh+tFHnwR+Go48egG4wsyuBHD3lcClwEfNLAn0Acvd3SOOSUQk7+zoHByOmtvmo0iTgruvBZqGbV6Zsf9a4NooYxARKQRDw1HrpmjzkYiITNwLrT0AHNEw0iDN7FFSEBHJAxt2d1FTHmN2TW4erjNISUFEJA9s3NPNMXNqCMfe5IySgohIjrk7G3d3cfTs6lyHoqQgIpJrbT1x9vUmOHpOTa5DUVIQEcm1Dbu7ADhmjmoKIiLT3qY93QAcPVs1BRGRaW9w5NGc2tyOPAIlBRGRnNuwOz9GHoGSgohIzm3a050XI49ASUFEJKf2dg/Q3hPPi5FHoKQgIpJTG3cPdjKrpiAiMu1t3DM4HFU1BRGRaW/j7u68GXkESgoiIjm1IZzeIh9GHsEEk4KZHWlmZeH6mWb2KTObEW1oIiJT3+BEePliojWFXwApMzsKuBE4ArgtsqhERKaBtnDk0VF50skME08KaXdPAhcDP3D3zwBzxzvIzGaY2V1m9pyZPWtmy4btNzP7oZltMrOnzOyUA/8VREQK04Zw5FE+1RQm+jjOhJm9B7gceEe4rWQCx10D3O/ul4bPaa4ctv884OhweQNwXfhTRGTK25RnI49g4jWFK4BlwNfdfYuZHQH8ZKwDzKwWeDNBcxPuHnf3jmHFLgJu9cBjwAwzG7cGIiIyFWzY3U1NWf6MPIIJJgV3X+/un3L3282sHqhx92+Oc9gSoBW42czWmNkNZjb84aPzgW0Zr7eH217BzFaYWbOZNbe2t
k4kZBGRvLdxTxdHz8mfkUcw8dFHD5tZrZnNBNYRfNB/f5zDYsApwHXufjLQA3xx+KlHOM5ftcH9endvcvemxsbGiYQsIpLXgqetdefFdNmZJtp8VOfu+4G/B25291OBc8Y5Zjuw3d0fD1/fRZAkhpdZmPF6AbBjgjGJiBSsHZ39tPXEOX5eba5DeYWJJoVY2Nb/LuA3EznA3XcB28zs2HDT2cD6YcXuAd4fjkI6Heh0950TjElEpGCt2boPgFMW1ec4klea6Oijq4DfAX9x9yfNbAmwcQLHfRL4aTjy6AXgCjO7EsDdVwL3AucDm4Begg5tEZEpb+3WDspiRRw3N7+ajyaUFNz9TuDOjNcvAJdM4Li1QNOwzSsz9jvw8QlFKiIyhazZ1sHr5tdRUpxfsw1NtKN5gZndbWZ7zGy3mf3CzBZEHZyIyFQUT6Z5uqWTkxbm32xBE01RNxO0/88jGDL663CbiIgcoOd27SeeTHNynvUnwMSTQqO73+zuyXD5EaCxoSIiB2HN1uA+3pMXFW5NYa+ZXWZmxeFyGdAWZWAiIlPVmq37mF1Txty68lyH8ioTTQofJBiOugvYCVyKRgqJiByUtds6OHnRjLy6k3nQRKe52OruF7p7o7vPdve/I7iRTUREDkB7T5wX23rzsj8BDu3Ja5+dtChERKaJdduC/oR8HHkEh5YU8q/eIyKS59Zs3UeRwYkL6nIdyogOJSm8auI6EREZ25ptHRx7WC2VpROdUCK7xozKzLoY+cPfgIpIIhIRmaLSaWfttg7e8fp5uQ5lVGMmBXfPr0k5REQK2At7u+nqT3JynvYnwKE1H4mIyAFYncc3rQ1SUhARyZI1W/dRUx5jSUN1rkMZlZKCiEgWuDt/fK6VM45soKgofwdvKimIiGTBX3fsZ9f+fs45fk6uQxmTkoKISBY8uH43RQZnHZvfc4lGOlDWzF4EuoAUkHT3pmH7zwT+C9gSbvqlu18VZUwiIrnw4PrdnHp4PbOqy3IdypiycffEWe6+d4z9f3b3C7IQh4hITrR09LF+536+dN5xuQ5lXGo+EhGJ2O+f3Q2Q9/0JEH1ScOABM1tlZitGKbPMzNaZ2X1mdkLE8YiIZN2D63ezpKGKIxvzdyjqoKibj85w9x1mNht40Myec/dHMvavBg53924zOx/4FXD08JOECWUFwKJFiyIOWURk8nT1J3jshTauOOOIXIcyIZHWFNx9R/hzD3A3sHTY/v3u3h2u3wuUmFnDCOe53t2b3L2psTG/e+5FRDI9smEviZRzzmvyv+kIIkwKZlZlZjWD68C5wDPDyhxm4aOHzGxpGI8e8ykiU8ZDz+6mvrKEU/J4aotMUTYfzQHuDj/zY8Bt7n6/mV0J4O4rCR7r+VEzSwJ9wHJ315TcIjIlJFNp/vDcHs5+zWxixYUxrieypODuLwCvH2H7yoz1a4Fro4pBRCSXHt/STmdfgnMLYNTRoMJIXSIiBejHj77EjMoS/vaY2bkOZcKUFEREIrCtvZcH1u/ivUsXUVFanOtwJkxJQUQkArc++iJmxvuWHZ7rUA6IkoKIyCTrGUjysye3cf7r5jK3rrCeXKykICIyyX6xejtd/UmuOGNxrkM5YEoKIiKTKJ12bv7Li5y0cAanLKrPdTgHTElBRGQS/WlDK1v29vDBvymMaS2GU1IQEZlEN/1lC4fVlnPeaw/LdSgHRUlBRGSSrHqpnT9v3Mvlb1xMSYHcwTxcYUYtIpJn3J1v3vccs2vKuPyNhTUMNZOSgojIJHjo2T08+eI+Pn3OMVSWZuOhltFQUhAROUTJVJpv3/8cSxqqeFfTglyHc0iUFEREDtEvV7ewcU83X3j7sQUzG+poCjt6EZEc60+k+P6DGzhp4QzedkJhjjjKpKQgInIIbvzvLeza388XzzuO8PkxBU1JQUTkID2/q4trfr+Rt50wh9OXzMp1OJNCSUFE5CAMJFN8+udrqSmL8fWLX5frcCZNpEnBzF40s
6fNbK2ZNY+w38zsh2a2ycyeMrNTooxHRGSy/OChjTy7cz/fvOREGqrLch3OpMnGYNqz3H3vKPvOA44OlzcA14U/RUTy1pMvtvOff9rM8tMW8tYCetTmROS6+egi4FYPPAbMMLO5OY5JRGRUXf0JPnvHWubXV/CVC47PdTiTLuqk4MADZrbKzFaMsH8+sC3j9fZw2yuY2Qozazaz5tbW1ohCFREZWzKV5hO3rWFHRz/ff9dJVJcV7p3Lo4k6KZzh7qcQNBN93MzePGz/SOO3/FUb3K939yZ3b2psbIwiThGRMbk7X/nVM/xpQyv/+nev5bTFM3MdUiQiTQruviP8uQe4G1g6rMh2YGHG6wXAjihjEhE5GP/x8GZ+9uQ2PnHWUbxn6aJchxOZyJKCmVWZWc3gOnAu8MywYvcA7w9HIZ0OdLr7zqhiEhE5GHev2c53fvc8F588n38+95hchxOpKBvE5gB3h3f4xYDb3P1+M7sSwN1XAvcC5wObgF7gigjjERE5YLc/sZWv/OoZTl8yk29dcuKUuGt5LJElBXd/AXj9CNtXZqw78PGoYhAROVjptPPdB57nPx7ezJnHNnLte0+hNJbrAZvRm3pd5yIih2ggmeJzdz7Fr9ft4L1vWMRVF55Q8LOfTpSSgohIhs6+BCtubebxLe3877cfx5V/u2TKNxllUlIQEQnt7Ozj8pueYMveHq5ZfhIXnfSq26amPCUFERFgw+4uLr/pCbr6k9xyxVLeeFRDrkPKCSUFEZn21mzdx+U3PUF5STF3fGQZx8+rzXVIOaOkICLT2rptHbz/xieYWV3KTz70BhbOrMx1SDmlpCAi09YzLZ2878bHmVFVwu0fPp15MypyHVLOTY8xViIiw6zfsZ/LbnycmnIlhExKCiIy7Tz2QhvvveExKkqKuf3Dp7Ogfno3GWVSUhCRaeVnT2zlshseZ1ZVKT9fsYxFs5QQMqlPQUSmhVTa+fpvn+Wmv2zhzcc0cu17T6a2vCTXYeUdJQURmfLiyTSfvH01v/vrbq44YzFfPv8102baigOlpCAiU9pAMsXHfrKa3z+3h//zjuO54owjch1SXlNSEJEpqz+R4qM/WcUfnw+elnbZ6YfnOqS8p6QgIlNSfyLFih+v4pENrXzj7183pZ+WNpmUFERkSvrX367nkQ2tfPuSE3nXaQvHP0CALAxJNbNiM1tjZr8ZYd8HzKzVzNaGyz9GHY+ITH0P/HUXP3lsKx9+0xFKCAcoGzWFfwKeBUabYern7v6JLMQhItPArs5+vvCLp3jt/Fo+/7bjch1OwYm0pmBmC4D/BdwQ5XVERCC4F+EzP19LPJnmh8tPnhaPz5xsUb9jPwC+AKTHKHOJmT1lZneZmep5InLQVv5pM4++0MbXLjyBJY3VuQ6nIEWWFMzsAmCPu68ao9ivgcXufiLwEHDLKOdaYWbNZtbc2toaQbQiUuj+tKGV7z+4gQtOnMs7T12Q63AKlrl7NCc2+wbwPiAJlBP0KfzS3S8bpXwx0O7udWOdt6mpyZubmyc7XBEpYM/v6uLS6/6HBTMruevKZVSVaWDlcGa2yt2bxisXWU3B3b/k7gvcfTGwHPjD8IRgZnMzXl5I0CEtIjJhrV0DfPBHT1JRWsyNlzcpIRyirL97ZnYV0Ozu9wCfMrMLCWoT7cAHsh2PiBSu/kSKD9/aTHtPnDs+skzPRJgEkTUfRUXNRyIC0D2Q5OM/Xc0jG1tZedmpvO2Ew3IdUl6baPOR6lkiUnBaOvr40I+eZOOebr5x8euUECaRkoKIFJS12zr4x1uaGUim+NEVp/GmoxtzHdKUoqQgIgXjnnU7+Pyd65hdW8btH34DR8+pyXVIU46SgojkvVTa+e4Dz3Pdw5s5bXE91112Kg3VZbkOa0pSUhCRvNbZl+CffraGh59v5b1vWMTX3nGCpq+IkJKCiOStZ3fu5+M/Xc3W9l49JCdLlBREJO+4O3c0b+Or//VX6ipKuO3Dp7P0i
Jm5DmtaUFIQkbzSG0/ylbuf4ZdrWviboxq4+t0n0Vij/oNsUVIQkbyx6qV9fO7OdbzY1sNnzjmGT7zlKIqLLNdhTStKCiKScwPJFD94aCP/+afNzK2r4LZ/PJ1lR87KdVjT0tTvwt+8GT72MaithaKi4OfHPhZsF5Gc++uOTi669i9c9/Bm3nnqQu7/9JuUEHJoatcU7rsPLr0UEolgAejqghtugFtugbvugvPOy22MItNUIpXmuoc388Pfb6S+qpSbPtDEW46bk+uwpr2pmxQ2bw4SQm/vq/cNJolLL4WnnoIjj8x+fCLT2IbdXfzzHet4uqWTi06ax9fecQL1VaW5DkuYyknhe997uXYwmkQCrr4arr02OzGJTGOJVJo/PLeH2x7fyiMbW6mvLOW6fziF8143d/yDJWum7tTZtbVBU9FEynV2HnpgIjKil9p6uKN5G3c2b2dP1wCH1ZbzrtMW8v5lh2uqiizS1Nnd3RMq5l1deNopinDY20Ayxd7uOHu7BmjrGaC9J0FHb5z2njjxZJqK0mLKS4opixURy4gj5cEt/h29cfb1JuhPpCgpNmJFRcSKjYFkmt6BJD0DKQZSaWZVlTK7pozZNWXUVpRgZhhQZFBfVcrsmnIOqytnZlUp/YkUXf1JugeS7OuN09o1MLSYQXlJMeWxYmLFRl88RfdAkt54kngyTVGRUWRGkUFfIs3+vgRd/QkSKWfRrEqOaqzmqNnVHD+vliUNVZhpSOF00z2Q5KH1u/n5k9t49IU2igzOPHY27126iDOPbSRWPPXHuBSqqZsUqqsnVFPoLqmg6av3s3hWFbNry5hZVUp9ZSm15TESaSeeTBNPpkm7U1JcFHwoh/+h02kn7U46rGwZYAY98RQ7O/rY2dnPzs5+OvtGbsYqLjJKi4voT6YYrcJmBnUVJdRXllIWKyKZdpKpNImUU1ZSRHVZjMrSYmpLYuzq7Ofplk7augeGYjpQ1WUxDOhPpkikfCiG6tIYlWXFlMaKSKeDO07TDhWlxdSUx6gtL6G4yHimpZN7n9459Ps0VJey9IiZLF08k1MOr+e4w2onZd4ad2dfb4KdnX30xYNYk+k0A4k0HUOJNM7+viQ9A0m6BoKfVWUxZteU0VhTxqyqUkpjRRQXBf+uFSXFNIb7GqrLKC8pPuQ4pwt3Z1t7H398fg8PPbubx19oJ55Ks3BmBZ879xguPXUhh9WV5zpMmYDIk4KZFQPNQIu7XzBsXxlwK3Aq0Aa8291fnJQLX3ZZMMpojH6FdCzGjndcwvuXHc6Wvb3s7R5ga3sv7T1xuvqTlBQHH9plJcUYQZtoMu0kUmkAiswoLgq+jQM44OEH5dy6chbUV3La4pnMrimjIfygmVVdyqyqUmaEicfMcHfiqTT98SD5DCoyo7o8dsA37yRTaXoTQaJxd1JpZ19vnF2dA+ze38++3jgVpcVUl8WoKY9RV1HC7JpyGqrLqCgtfsV5kmmnLFZ0QN/2+xMpNrd28/T2Tp7Y0s7jW9q59+ldAJTGinjtvFpOXDCDY+bUcNTsoFZRVVbMrs5+dnT0s2t/Hx29iaEP867+JN39Sbr6E3T1J2nribOzs4/+RHrMOIqLjJry4HesKo1RVRajvaeXVS/to70nPu7vUVseY3ZtOY3Vwb9ffWUJMypKqAv/7arLgnNWlcWYWVXKrOpSaspi06Jm1NEbZ/2O/TzV0snql/axZlsHrV0DACxpqOLyNx7OOa+Zw2mLZ0ZaC5fJF3mfgpl9FmgCakdICh8DTnT3K81sOXCxu797rPNNuE9h82Y48cSRRx8NqqwcdfSRu0+LP+5saenoY83Wfazd2sHabR08s6Nz3A91gLJY0VDyqikvoaY8Rn1lKXPrypk7o4K5deVUl8WIFRslxUWUFhcxo7KEGZXBB/RoH0iJVJp9vXGSqSBpJtNOd3+Svd0D7OnqZ8/+AfZ2D9DaHTSp7e2O09Ebp7MvMWYtrLS4iPqqEmrKS6gqi
1FTFqMsVkTmIbOqSlk4s5IF9RUsnFnJwvpKZteU5eWHZ38ixQutPWzc08XG3d08v7uL9Tv209LRN1TmiIYqTl44g5MXzeCMoxpY0lidw4hlNBPtU4g0KZjZAuAW4OvAZ0dICr8Dvubuj5pZDNgFNPoYQR3QM5pHuk8BoKQkWHSfQs6k005LRx+bWrvZvKeb3niKuXXlzJtREfR7VJZSVRbLuymS02mnqz/J/v4E3QNBn0x3f9Av09Ydp60nTnvPQLgvRXd/goFkGjMwjLQ7rV0D7Am/VQ8qLS5ifn0FC+orWDSzksNnVbJoZiUzq8qoKgtqddVlMWorSiiZQHu8u5NIBbXaZMrpiQcxdvQmglpYPEl/IkV/IkVfPM1AMkV/IvjZ0ZugpaOPlo6+oW//ENS8Fs+q5Ph5dZwwr5bj59by2vl1zNRQ0oKQLx3NPwC+AIz2eKT5wDYAd0+aWScwC9g7KVc/77ygJnD11fDjHwedz9XV8L73wWc+o/sTcqioyIJvyTMrOevY2bkOZ8KKioy6yhLqKksO6Tz9iRQtHX1sa+9l+76+cOllW3svv316Jx29ozd7VpcFTX7lJUWkHdJhE2E8mQ4+6MN+sAMRK7KhwQ61FSXMn1HBWcc2Mm9GBUsaqzlmTjVHNFRRFlM/y1QXWVIwswuAPe6+yszOHK3YCNteVUswsxXACoBFixYdWCBHHhnch6B7ESSPlJcUc2RjNUeO0tTS2ZdgW3svnX1BP0pPWCsJRqMl6OwLRqMVFRnFFvQ/lcaKgg/2kiLKYsWUhk1qJcVFVJQWU19ZQl1FKfVVJVSVxigvKQ5GvsWKNBpIhkRZUzgDuNDMzgfKgVoz+4m7X5ZRZjuwENgeNh/VAe3DT+Tu1wPXQ9B8FGHMInmhrqKEuvl1uQ5DpqHIvh64+5fcfYG7LwaWA38YlhAA7gEuD9cvDcvoQ19EJEeyfp+CmV0FNLv7PcCNwI/NbBNBDWF5tuMREZGXZSUpuPvDwMPh+lcztvcD78xGDCIiMj71LomIyBAlBRERGaKkICIiQ5QURERkiJKCiIgMKbiH7JhZK9ABDH8yTt0428Zbz9zWwIFPtTHS9Seyf7LiPpiYx4prvP3Dt4/1WnGPH9d4+w8m7pG2Ke7x9x/I32Tm68mKO6rPkqPdffw7It294Bbg+gPdNt76sG3NkxHTRPZPVtwHE/Nkxj3Wa8Wdm7hH2aa4x9l/IH+TUcSdjc+SsZZCbT769UFsG299pOMPNaaJ7J8qcY/1WnGPfr2J7j+YuEf7XQ7GdIr7QP4mM19PVtzZ+CwZVcE1H2WDmTX7BKZ5xgXYAAAHJElEQVSYzSeFGDMo7mxT3NlViHEXak0hatfnOoCDUIgxg+LONsWdXQUXt2oKIiIyRDUFEREZMqWTgpndZGZ7zOyZgzj2VDN72sw2mdkPLeOBzWb2STN73sz+ambfntyoo4nbzL5mZi1mtjZczi+EuDP2f87M3MwaJi/ioXNH8X7/i5k9Fb7XD5jZvAKJ+ztm9lwY+91mNqNA4n5n+PeYNrNJa8M/lFhHOd/lZrYxXC7P2D7m//+sOphhXoWyAG8GTgGeOYhjnwCWETwd7j7gvHD7WcBDQFn4enaBxP014HOF9n6H+xYCvwNeAhoKIW6gNqPMp4CVBRL3uUAsXP8W8K0Cifs1wLEEszE35TrWMI7Fw7bNBF4If9aH6/Vj/V65WKZ0TcHdH2HYk9zM7Egzu9/MVpnZn83suOHHmdlcgj/qRz34F7sV+Ltw90eBb7r7QHiNPQUSd+QijPtqgmd9R9IBFkXc7r4/o2hVFLFHFPcD7p4Miz4GLCiQuJ919+fzJdZRvA140N3b3X0f8CDw9lz/3Q43pZPCKK4HPunupwKfA/5jhDLzCR4VOmh7uA3gGOBNZva4mf3JzE6LNNqXHWrcAJ8ImwVuMrP66EJ9hUOK28wuBFrcfV3UgQ5zyO+3mX3dzLYB/wB8leyYjP8ngz5I8K01G
yYz7qhNJNaRzAe2ZbwejD9ffi8gB09eyyUzqwbeCNyZ0WRXNlLREbYNftOLEVT9TgdOA+4wsyVhho/EJMV9HfAv4et/Ab5H8EcfmUON28wqgS8TNGlkzSS937j7l4Evm9mXgE8A/2eSQ31lMJMUd3iuLwNJ4KeTGeNIJjPuqI0Vq5ldAfxTuO0o4F4ziwNb3P1iRo8/579XpmmVFAhqRh3uflLmRjMrBlaFL+8h+ADNrDYvAHaE69uBX4ZJ4AkzSxPMb9Kaz3G7++6M4/4f8JsI4x10qHEfCRwBrAv/ABcAq81sqbvvyuO4h7sN+C0RJwUmKe6wA/QC4Owov+xkmOz3O0ojxgrg7jcDNwOY2cPAB9z9xYwi24EzM14vIOh72E7uf6+X5aozI1sLsJiMTiLgf4B3husGvH6U454kqA0MdvycH26/ErgqXD+GoDpoBRD33IwynwF+Vgjv97AyLxJBR3NE7/fRGWU+CdxVIHG/HVgPNEYRb9T/T5jkjuaDjZXRO5q3ELQ01IfrMyfye2VzyclFs/bLwe3ATiBBkI0/RPDN835gXfif/6ujHNsEPANsBq7l5Rv9SoGfhPtWA28pkLh/DDwNPEXwrWtuIcQ9rMyLRDP6KIr3+xfh9qcI5pyZXyBxbyL4orM2XKIYNRVF3BeH5xoAdgO/y2WsjJAUwu0fDN/jTcAVB/L/P1uL7mgWEZEh03H0kYiIjEJJQUREhigpiIjIECUFEREZoqQgIiJDlBRkSjCz7ixf7wYzO36SzpWyYDbVZ8zs1+PNTGpmM8zsY5NxbZHhNCRVpgQz63b36kk8X8xfnhguUpmxm9ktwAZ3//oY5RcDv3H312YjPpleVFOQKcvMGs3sF2b2ZLicEW5famb/Y2Zrwp/Hhts/YGZ3mtmvgQfM7Ewze9jM7rLgGQM/HZznPtzeFK53h5PfrTOzx8xsTrj9yPD1k2Z21QRrM4/y8mSA1Wb2ezNbbcFc+xeFZb4JHBnWLr4Tlv18eJ2nzOz/TuLbKNOMkoJMZdcAV7v7acAlwA3h9ueAN7v7yQSzl/5bxjHLgMvd/S3h65OBTwPHA0uAM0a4ThXwmLu/HngE+HDG9a8Jrz/uXDbhXD9nE9xxDtAPXOzupxA8x+N7YVL6IrDZ3U9y98+b2bnA0cBS4CTgVDN783jXExnJdJsQT6aXc4DjM2azrDWzGqAOuMXMjiaYjbIk45gH3T1z/vwn3H07gJmtJZgH57+HXSfOyxMMrgLeGq4v4+V58W8DvjtKnBUZ515FMM8+BPPg/Fv4AZ8mqEHMGeH4c8NlTfi6miBJPDLK9URGpaQgU1kRsMzd+zI3mtm/A39094vD9vmHM3b3DDvHQMZ6ipH/ZhL+cufcaGXG0ufuJ5lZHUFy+TjwQ4LnMDQCp7p7wsxeBMpHON6Ab7j7fx7gdUVeRc1HMpU9QPAcAwDMbHC64zqgJVz/QITXf4yg2Qpg+XiF3b2T4NGdnzOzEoI494QJ4Szg8LBoF1CTcejvgA+Gc/1jZvPNbPYk/Q4yzSgpyFRRaWbbM5bPEnzANoWdr+sJpj0H+DbwDTP7C1AcYUyfBj5rZk8Ac4HO8Q5w9zUEs28uJ3jATZOZNRPUGp4Ly7QBfwmHsH7H3R8gaJ561MyeBu7ilUlDZMI0JFUkIuGT4/rc3c1sOfAed79ovONEckl9CiLRORW4Nhwx1EHEjz8VmQyqKYiIyBD1KYiIyBAlBRERGaKkICIiQ5QURERkiJKCiIgMUVIQEZEh/x9pgCf3+woshAAAAABJRU5ErkJggg==\n"},"metadata":{"needs_background":"light"}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.fit_one_cycle(10, max_lr = slice(1e-4, 1e-3), moms=(0.8, 0.7), pct_start=0.2, wd =(1e-7, 
1e-5, 1e-4,  1e-2))","execution_count":46,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: left;\">\n      <th>epoch</th>\n      <th>train_loss</th>\n      <th>valid_loss</th>\n      <th>accuracy</th>\n      <th>time</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>0</td>\n      <td>3.793664</td>\n      <td>3.787050</td>\n      <td>0.336729</td>\n      <td>12:44</td>\n    </tr>\n    <tr>\n      <td>1</td>\n      <td>3.745116</td>\n      <td>3.693457</td>\n      <td>0.351849</td>\n      <td>12:44</td>\n    </tr>\n    <tr>\n      <td>2</td>\n      <td>3.606362</td>\n      <td>3.643294</td>\n      <td>0.359943</td>\n      <td>12:45</td>\n    </tr>\n    <tr>\n      <td>3</td>\n      <td>3.598700</td>\n      <td>3.610803</td>\n      <td>0.364181</td>\n      <td>13:01</td>\n    </tr>\n    <tr>\n      <td>4</td>\n      <td>3.553363</td>\n      <td>3.590075</td>\n      <td>0.367450</td>\n      <td>12:54</td>\n    </tr>\n    <tr>\n      <td>5</td>\n      <td>3.545811</td>\n      <td>3.575722</td>\n      <td>0.369594</td>\n      <td>12:54</td>\n    </tr>\n    <tr>\n      <td>6</td>\n      <td>3.452046</td>\n      <td>3.567434</td>\n      <td>0.370651</td>\n      <td>12:55</td>\n    </tr>\n    <tr>\n      <td>7</td>\n      <td>3.460718</td>\n      <td>3.562857</td>\n      <td>0.371756</td>\n      <td>12:54</td>\n    </tr>\n    <tr>\n      <td>8</td>\n      <td>3.403893</td>\n      <td>3.560997</td>\n      <td>0.372018</td>\n      <td>12:55</td>\n    </tr>\n    <tr>\n      <td>9</td>\n      <td>3.412446</td>\n      <td>3.560777</td>\n      <td>0.372056</td>\n      <td>12:55</td>\n    </tr>\n  
</tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.save('fine-tuned')\nlearn.load('fine-tuned')\nlearn.save_encoder('fine-tuned')","execution_count":47,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"TEXT = \"He is a piece of\"\nN_WORDS = 10\nN_SENTENCES = 2","execution_count":48,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"print(\"\\n\".join(learn.predict(TEXT, N_WORDS, temperature=0.75) for _ in range(N_SENTENCES)))","execution_count":49,"outputs":[{"output_type":"stream","text":"He is a piece of shit . He is a gay . He\nHe is a piece of shit . He is not a Nazi .\n","name":"stdout"}]},{"metadata":{},"cell_type":"markdown","source":"# Classification Model"},{"metadata":{"trusted":true},"cell_type":"code","source":"src_clas = ItemLists(path, TextList.from_df(train, path=\".\", cols=\"comment_text\", vocab=data_lm.vocab),\n                    TextList.from_df(val, path=\".\", cols=\"comment_text\", vocab=data_lm.vocab))","execution_count":50,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"data_clas = src_clas.label_from_df(cols=label_cols).databunch(bs=32)","execution_count":51,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"data_clas.show_batch()","execution_count":52,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: right;\">\n      <th>text</th>\n      <th>target</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>xxbos xxmaj take that ! 
\\n \\n  xxup in xxup the xxup ass xxup in xxup the xxup ass xxup in xxup the xxup ass xxup in xxup the xxup ass xxup in xxup the xxup ass xxup in xxup the xxup ass xxup in xxup the xxup ass xxup in xxup the xxup ass xxup in xxup the xxup ass xxup in xxup the xxup ass xxup in</td>\n      <td>toxic;severe_toxic;obscene</td>\n    </tr>\n    <tr>\n      <td>xxbos \" \\n \\n  xxmaj from xxmaj norway ; xxmaj denmark ; xxmaj iceland ; xxmaj scotland ; xxunk etc . ) ; xxmaj wales ; xxmaj ireland ; xxmaj basques &amp; xxmaj xxunk data ) \\n \\n  by xxmaj gunnar xxmaj thompson \\n \\n  \" \" xxmaj on xxunk 's map , the northwestern continent is called \" \" xxmaj xxunk . \" \" xxmaj this</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>xxbos \" \\n \\n  xxmaj definitely his views contradict the current principles of xxmaj wikipedia . i do n't think his intention is to comply with them . xxmaj if no one expresses dissatisfaction and advocates change , there would be no improvement . \\n \\n  xxmaj on the other hand there are users who are abusing the system covertly , like user xxmaj zoe and user xxmaj</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>xxbos \" xxmaj february 2009 ( xxup utc ) \\n \\n  xxmaj well , hope the floor did n't hurt . xxmaj but your characterization of my attitude and what i respect and do n't is to say the least dubious . xxmaj did it occur to you that i did n't attend your schools , have different experiences , different parents , and can think for myself ?</td>\n      <td></td>\n    </tr>\n    <tr>\n      <td>xxbos \" again , little of that is relevant , whether true or not , including the ridiculous allegation of a \" \" tantrum \" \" ( whatever the size ) . personal talk is to be avoided on article talk pages , so i wo n't comment further on your \" \" degree \" \" issues . 
i encourage you though , if you 've time , to consider</td>\n      <td></td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn = text_classifier_learner(data_clas, AWD_LSTM, drop_mult=0.5, model_dir='/temp/model', metrics=acc_02, loss_func=loss_func)\nlearn.load_encoder('fine-tuned')","execution_count":53,"outputs":[]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.lr_find()\nlearn.recorder.plot(suggestion=True)","execution_count":54,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":""},"metadata":{}},{"output_type":"stream","text":"LR Finder is complete, type {learner_name}.recorder.plot() to see the graph.\nMin numerical gradient: 4.79E-02\n","name":"stdout"},{"output_type":"display_data","data":{"text/plain":"<Figure size 432x288 with 1 Axes>","image/png":"iVBORw0KGgoAAAANSUhEUgAAAZIAAAEKCAYAAAA4t9PUAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xl8nWWd///X55yTtUnaplm6L7TpilhoyiKILZuFcaAqIqBfQUf5jQ648xWc+eoM6rh9HRwHnJ+IIi6IUgcBBQpIEYS2NIVC97206ZqmS5Km2T/fP87dcpqeLG1y52R5Px+P+9FzX/d13+dznyb5nPu6rvu6zd0RERE5XZFUByAiIn2bEomIiHSJEomIiHSJEomIiHSJEomIiHSJEomIiHSJEomIiHSJEomIiHSJEomIiHRJLNUB9ISCggIfP358qsMQEelTli9fvt/dCzuqNyASyfjx4ykrK0t1GCIifYqZvdWZemraEhGRLlEiERGRLlEiERGRLlEiERGRLlEiERGRLlEiERGRLlEiERGRLhkQ95GcrkdfL6emromLSgoZPywbM+twn5r6JrbtP8KW/Udwd6aPyGNCwSBiUeVsEemflEja8cQbu3l+3T4ARg3J4qJJBYwryGZQeoys9Cjp0Qi7Dh9l2/4jbKusZdv+I+yrrj/pOBmxCFNH5DFycCY5GTFyMmPkZqaRn53GsJwMhuWkU5Sbweih2WSmRU873gNHGnhpYwX7quo5o3AQEwtzGJOfTTSSPAG6Ow3NLcQikTbriIh0RImkHT+7qZRtlbX8bWMFL23cz5OrdlNd13RSvYKcDCYUZHPx5EImFAxiYuEgJhTk4DhrdlWxelcVa3ZVsWlfDTX1TdTUNVHT0IT7ye9ZnJfBuPxB5GWlcaS+KV6/vgkDstKjZKdHyU6PMSQ7jaHZ6QzNTqe5pYWXNu1nxY5DJx0zPRqhMDeDvKw0cjNj5GTEqDrayL7qeiqq6zna2AxANGKkRyOkRY1YNJ5YYhGjxZ2m5njCaWlxhg/OZFJRzvGlpCiXSUU5JyXAhqaWIEkZ0YgRNa
OpxWlsbqGxuYWmFicrLUpWWpSIkphIn2ae7K9ZP1NaWurdMUWKu1Pf1MKR+iZqG5qpb2pm+OAscjJOPR83tziHahs4cKSB/TUN7KuuY3tlLW8dqGV7ZS1VdY3kZaaRkxljUEYMd+doQzNHGuLvfai2kYO1DVTXNWEG7xw9hDlTCpkzpYhx+dls2X+EzRU1bK6oYX91A1V1jVQdbaS6rom8rBhFuZkU5mYwNDuN5hZoaG6moamFxmanucVpanGamluImJEWM2KRCBEzyg/Wsqmihrcqa2luif/sRAzGDRtEQU46lUca2F9dT1WShNuW7PQog7PSGD00izH52YwZmk1WepSqo41U1cVjzkqLkj8onfxB6QzJTgegpcVpdidikJMRT5S5mTGGZqdTmJvBoNP4fxGRt5nZcncv7bCeEknf1tjcQlOzk5V++k1ip6OhqYW3Ko+wYW8NG/ZWs3FfNZU1DRTkZFCQk05BTgaZaVGaWpzmlvgVSCxipMcipAVXPHWNzRypb+ZIfRMHahsoP3iUHQdq2VNVhzvEInb8Sqq2oZmDRxpoaun8z2tWWpSC3HSy02Lxq6KIkRY1CnIyGD44k+K8TPIHpRONGBEzIkZwJRY5Xjc7PUZeVozBWWnkZqQRSejqSotGutQUKdLbdTaR6CtbH5cWjZCKv2XpsQglxbmUFOfyd4zo1mPXNzXT1Oxkp0dPGODg7lTVNXG4thGASCT+h7+5xampb6K6ronqukYOHGlkf0286W5/TT31jfFE1uIeJMBalm49wOGjjV2K0wzG5WczdXgeU0fkMqU4lwmFgxg/bJASjAwoSiTS62TEoiRrlTIzBmelMTgrrVve52hDMwdrG2hxxx1a/O1mvWNXekcamuJNbEebqKo7MfHU1DexYW8163ZXs3DNnuP9U2YwcnAWI4dkHm9CLM7LZGLhIKYMz2XM0Gz1C0m/okQiA1ZWepSs9KxuOdbRhmY2V9Swdf8Rtu4/wpaKGvZU1bF2TxUvbqinuv7tPqPs9CiTinIYPTSLEYOzGDkkixGDMynIyaAwN940mJvZPclSpCeEmkjMbB7wn0AUuN/dv9Nq+93A3GA1Gyhy9yHBtpuAfwm2fdPdHwzKZwG/ALKAJ4HP+UDo6JFeLSs9ypmjBnPmqMFJt9fUN7FxbzXr91Szbk81mytqWL+nmkXrKo6PnEs0oWAQV8wo5r0zhjNz9BBdwUivFlpnu5lFgQ3A5UA5sAy4wd3XtFH/NuBsd/+EmeUDZUAp4MByYJa7HzSzV4HPAUuIJ5IfuftT7cXSnzvbpW9zdw7VNrKnqu54v86eqjoWb65k8eZKmlqcotwMSscPZcbIwUwfmcfU4bkMyUonMy3SqZtkRU5Xb+hsPxfY5O5bgoAeBq4BkiYS4Abg68Hr9wLPuvuBYN9ngXlm9gKQ5+6Lg/JfAvOBdhOJSG9lZgwdlM7QQeknlH9mziQO1zbyl3V7+cu6fazaeZgnV+45oU40YuRkxBg5JIuLJxfwnsmFlI7LJz2mWRSkZ4WZSEYBOxLWy4HzklU0s3HABOD5dvYdFSzlScqTHfMW4BaAsWPHnnr0Iik2ODuND5wzmg+cMxqAqrrG4ze2Vtc1UVPfSE1dE+v3VvOzl7byk79uIScjxtljh8Sb2UYOZsbIvHZnNxDpDmEmkmQ/uW21o10PLHD3Y43Fbe3b6WO6+33AfRBv2mo/VJHeLy8zjfPPGMb5Zww7aVtNfROvbNrPXzdU8Pr2Q9z/0hYam+M/9rGIMXJIFqOHZjG5OJdbL5lEQU5GT4cv/ViYiaQcGJOwPhrY1Ubd64F/arXvnFb7vhCUj+7kMUUGjJyMGFfMGM4VM4YD8XtxNu6tYfWuw7xVWcuO4GbPh5Zu588rd/OfH57JuyYVpDhq6S/CTCTLgBIzmwDsJJ4sbmxdycymAEOBxQnFC4F/N7OhwfoVwJ3ufsDMqs3sfGAp8DHgv0I8B5E+KSOWfBTZ2t1V3PrQa3
zkZ0u5de4kPndpiWamli4L7SfI3ZuAW4knhbXA7919tZndZWZXJ1S9AXg4cQhv0Mn+DeLJaBlw17GOd+DTwP3AJmAz6mgX6bRpI/J44raLuPac0fzX85u4+p6Xuf+lLew4UJvq0KQP01xbIgPUYyt28pO/bmHN7ioAzhyVx8UlhZw1eggzxwxh+ODMFEcoqdYbhv+KSC92zcxRXDNzFG9VHuHpVXt4evUefvLiluOzOhflZnDlmcP56PnjKCnOTXG00pvpikREjqtrbGb1rireLD/Esm0HeG7NPhqaWzh3Qj4fPX8cV505XH0qA4imkU+gRCJyeipr6nlkeTkPLd3O9gO1TCrK4ctXTOG9M4p1V/0AoESSQIlEpGtaWpynV+/h/z6zni0VR3jnmCF89cqpnJfknhbpPzqbSHSNKiIdikSMq94xgmc+fzHf/eA72FdVx/U/XcKPX9jEQPgyKu1TIhGRTotFI3x49lie/9Ic3nfWSL739Hpu++3rHG04eQZjGTiUSETklGWlR/nR9TP5yryp/Hnlbj74369QflD3ogxUSiQiclrMjE/PmcjPb5rNjoO1fODHr7B+T3Wqw5IUUCIRkS6ZO7WIP3z6XQBc95PFvL79YIojkp6mRCIiXTa5OJc/fPpdDM5K4yP3L+XlTftTHZL0ICUSEekWY/KzWfCPFzBmaDYff2AZL6zfl+qQpIcokYhItynKy+R3/9/5TCrK4baHXmfjXvWZDARKJCLSrYZkp3P/TaVkpEX55C/LOHikIdUhSciUSESk240cksVPPzaL3Yfr+PRvltPY3JLqkCRESiQiEoqzxw7lex88iyVbDvD1x1frDvh+TIlEREIz/+xRfGbORB5aup2Fq/ekOhwJSaiJxMzmmdl6M9tkZne0Uec6M1tjZqvN7KGgbK6ZrUhY6sxsfrDtF2a2NWHbzDDPQUS65ouXT2bq8Fy++ee11DVqKpX+KLREYmZR4F7gSmA6cIOZTW9VpwS4E7jQ3WcAnwdw90XuPtPdZwKXALXAMwm73n5su7uvCOscRKTrYtEIX3vfdMoPHuVnf9ua6nAkBGFekZwLbHL3Le7eADwMXNOqzqeAe939IIC7Jxt4fi3wlLtrIh+RPupdkwqYN2M49y7axJ7DdakOR7pZmIlkFLAjYb08KEs0GZhsZi+b2RIzm5fkONcDv21V9i0ze9PM7jazjO4LWUTC8tWrptHU4nzv6XWpDkW6WZiJJNnj01oP24gBJcAc4AbgfjMbcvwAZiOAdwALE/a5E5gKzAbyga8kfXOzW8yszMzKKioqTvccRKSbjB2WzScvmsD/vL6T1zQfV78SZiIpB8YkrI8GdiWp85i7N7r7VmA98cRyzHXAo+7eeKzA3Xd7XD3wAPEmtJO4+33uXurupYWFhd1wOiLSVZ+ZO4mi3Az+7Yk1Gg7cj4SZSJYBJWY2wczSiTdRPd6qzh+BuQBmVkC8qWtLwvYbaNWsFVylYPEHRs8HVoUSvYh0u5yMGF+4fDJv7DjE4i2VqQ5HukloicTdm4BbiTdLrQV+7+6rzewuM7s6qLYQqDSzNcAi4qOxKgHMbDzxK5q/tjr0b8xsJbASKAC+GdY5iEj3e//Zo8gflM7P/7Yt1aFIN7GBcHlZWlrqZWVlqQ5DRAL/8cx6/mvRJhZ9aQ7jCwalOhxpg5ktd/fSjurpznYR6XEfvWAcsYjxwMu6r6Q/UCIRkR5XlJvJ379zJI8sL+fw0caOd5BeTYlERFLiExdOoLahmd8t257qUKSLlEhEJCXOHDWY8ybk8+Arb9Gkaeb7NCUSEUmZf7hoAjsPHWXh6r2pDkW6QIlERFLm0mnFjBuWzS8Xb0t1KNIFSiQikjLRiHHtOaNZuvUAOw8dTXU4cpqUSEQkpa6ZGZ/L9fEVrWdQkr5CiUREUmrssGzOGTuEx1bsTHUocpqUSEQk5a6ZOYp1e6pZt6cq1aHIaVAiEZGU+7uzRhCNGH98Xc1bfZESiY
ikXEFOBu8uKeCJN3bR0tL/5//rb5RIRKRXmD9zFDsPHaXsLT30qq9RIhGRXuHy6cVkpUX5ozrd+xwlEhHpFQZlxLhiRjFPrtxNQ5OmTOlLlEhEpNeYP3MUh2ob+euGilSHIqcg1ERiZvPMbL2ZbTKzO9qoc52ZrTGz1Wb2UEJ5s5mtCJbHE8onmNlSM9toZr8LHuMrIv3ARSUFDM5K4+lVe1IdipyC0BKJmUWBe4ErgenADWY2vVWdEuBO4EJ3nwF8PmHzUXefGSxXJ5R/F7jb3UuAg8A/hHUOItKz0qIR5kwp5IX1+2jW6K0+I8wrknOBTe6+xd0bgIeBa1rV+RRwr7sfBHD3fe0d0MwMuARYEBQ9CMzv1qhFJKUumVpE5ZEG3ig/lOpQpJPCTCSjgB0J6+VBWaLJwGQze9nMlpjZvIRtmWZWFpQfSxbDgEPu3tTOMUWkD3vP5EKiEeP5te1+r5ReJMxEYknKWl+rxoASYA5wA3C/mQ0Jto0NHjp/I/BDM5vYyWPG39zsliARlVVUqONOpK8Ykp3OrHFD+cs6JZK+IsxEUg6MSVgfDbSe/6AceMzdG919K7CeeGLB3XcF/24BXgDOBvYDQ8ws1s4xCfa7z91L3b20sLCwe85IRHrEpVOLWLu7il2aWr5PCDORLANKglFW6cD1wOOt6vwRmAtgZgXEm7q2mNlQM8tIKL8QWOPuDiwCrg32vwl4LMRzEJEUuHRaEQCL1uuqpC8ILZEE/Ri3AguBtcDv3X21md1lZsdGYS0EKs1sDfEEcbu7VwLTgDIzeyMo/467rwn2+QrwRTPbRLzP5GdhnYOIpMbEwhzG5Gepn6SPiHVc5fS5+5PAk63Kvpbw2oEvBktinVeAd7RxzC3ER4SJSD9lZlw6tZjfvrqdow3NZKVHUx2StEN3totIr3TJ1CLqm1pYvGV/qkORDiiRiEivdN4Z+WSnR/mLmrd6PSUSEemVMmJR3l1SwPPr9hFvBZfeSolERHqtS6cWs/twHWt26xG8vZkSiYj0WnOmxu8BW6SbE3s1JRIR6bWKcjM5a/Rg3eXeyymRiEivdsnUIlbsOERlTX2qQ5E2KJGISK926dRi3OGF9Zozr7dSIhGRXm3GyDyKcjN4Xs1bvZYSiYj0apGIMXdKES9uqNCz3HspJRIR6fUumVZEdX0TZdsOpDoUSUKJRER6vYsmFZAejah5q5dSIhGRXm9Q+Vv8+JWf8YVrZ0MkAnl58JnPwObNqQ5NUCIRkd7uqafgrLOY+7fHGFRfC+5QXQ333w9nnRXfLimlRCIivdfmzXDttVBbS7Sp6cRtjY1QWxvfriuTlFIiEZHe6wc/iCeM9jQ2wt1390w8kpQSiYj0Xr/+decSya9+1TPxSFKhJhIzm2dm681sk5nd0Uad68xsjZmtNrOHgrKZZrY4KHvTzD6cUP8XZrbVzFYEy8wwz0FEUqimpnvrSShCe9SumUWBe4HLgXJgmZk9nvDsdcysBLgTuNDdD5pZUbCpFviYu280s5HAcjNb6O6Hgu23u/uCsGIXkV4iJyfesd6ZepIyYV6RnAtscvct7t4APAxc06rOp4B73f0ggLvvC/7d4O4bg9e7gH1AYYixikhv9NGPQlpa+3XS0uB//a+eiUeSCjORjAJ2JKyXB2WJJgOTzexlM1tiZvNaH8TMzgXSgcRhGd8KmrzuNrOMZG9uZreYWZmZlVVUaLI3kT7pS1/qXCL5whd6Jh5JKsxEYknKWj8vMwaUAHOAG4D7zWzI8QOYjQB+BXzc3Y9NsnMnMBWYDeQDX0n25u5+n7uXuntpYaEuZkT6pIkTYcECyM4+KaG0xGLx8gUL4vUkZcJMJOXAmIT10cCuJHUec/dGd98KrCeeWDCzPODPwL+4+5JjO7j7bo+rBx4g3oQmIv3VlVfCm2/CLbdAXh4eiVCdkc1r8z4UL7/yylRHOOCFmUiWASVmNsHM0oHrgcdb1fkjMBfAzAqIN3VtCeo/Cv
zS3R9J3CG4SsHMDJgPrArxHESkN5g4Ee65Bw4fxpqb+cQPn+Ub8/5JVyK9RKcSiZlNPNYXYWZzzOyziU1Qybh7E3ArsBBYC/ze3Veb2V1mdnVQbSFQaWZrgEXER2NVAtcBFwM3Jxnm+xszWwmsBAqAb57SGYtIn3f+GcNYtfMw1XUd3GMiPcLcW3dbJKlktgIoBcYT/+P/ODDF3a8KNbpuUlpa6mVlZakOQ0S6ySub9nPj/Ut54ObZzJ1a1PEOclrMbLm7l3ZUr7NNWy3BFcb7gR+6+xeAEV0JUETkdJ0zbijp0QiLt1SmOhSh84mk0cxuAG4C/hSUdTAmT0QkHJlpUWaOHcISJZJeobOJ5OPABcC33H2rmU0Afh1eWCIi7TvWT1KlfpKU61Qicfc17v5Zd/+tmQ0Fct39OyHHJiLSpgvOGEaLw7KtevxuqnV21NYLZpZnZvnAG8ADZvYf4YYmItK2s8cOIT0WYfFmNW+lWmebtga7exXwAeABd58FXBZeWCIi7ctMizJr7FD+tml/qkMZ8DqbSGLBjYDX8XZnu4hISs2dWsi6PdXsPHQ01aEMaJ1NJHcRv39ks7svM7MzgI3hhSUi0rFLpxUD8PzavSmOZGDrbGf7I+5+lrt/Oljf4u4fDDc0EZH2nVEwiPHDsnlu7b5UhzKgdbazfbSZPWpm+8xsr5n9wcxGhx2ciEh7zIxLpxWzeHMlR+qbUh3OgNXZpq0HiE+LMpL4M0WeCMpERFLq0mlFNDS3qNM9hTqbSArd/QF3bwqWX6AnFopILzB7fD65mTH+on6SlOlsItlvZh81s2iwfBTQ4G0RSbm0aIT3TC7k+XUVtLR0PAmtdL/OJpJPEB/6uwfYDVxLfNoUEZGUu2xaMftr6nlz5+FUhzIgdXbU1nZ3v9rdC929yN3nE785UUQk5eZMKSQaMTVvpUhXnpD4xW6LQkSkC4ZkpzNr3FANA06RriQS67CC2TwzW29mm8zsjjbqXGdma8xstZk9lFB+k5ltDJabEspnmdnK4Jg/Ch65KyID3KVTi1i7u0p3uadAVxJJu71aZhYF7gWuBKYDN5jZ9FZ1SoA7gQvdfQbw+aA8H/g6cB5wLvD1YNZhgP8GbgFKgmVeF85BRPoJ3eWeOu0mEjOrNrOqJEs18XtK2nMusCm4C74BeBi4plWdTwH3uvtBAHc/dl36XuBZdz8QbHsWmBfM95Xn7os9/ozgXwLzT+WERaR/mlg4iHHDslm0viLVoQw47SYSd89197wkS667xzo49ihgR8J6eVCWaDIw2cxeNrMlZjavg31HBa/bO6aIDEBmxtwpRbyyeT91jc2pDmdA6UrTVkeS9V20bg6LEW+emgPcANxvZkPa2bczx4y/udktZlZmZmUVFfqGIjIQzJlSSF1jC0v1sKseFWYiKQfGJKyPBnYlqfOYuze6+1ZgPfHE0ta+5cHr9o4JgLvf5+6l7l5aWKib8EUGgvPPGEZmWoRF6zR6qyeFmUiWASVmNsHM0oHric/XleiPwFwAMysg3tS1hfiU9VeY2dCgk/0KYKG77waqzez8YLTWx4DHQjwHEelDMtOiXHDGMF5Yr0TSk0JLJO7eBNxKPCmsBX7v7qvN7C4zuzqothCoNLM1wCLgdnevdPcDwDeIJ6NlwF1BGcCngfuBTcBm4KmwzkFE+p65U4vYVlnL1v1HUh3KgGHxwU/9W2lpqZeVlaU6DBHpAdsra7n4+4v4+t9P5+MXTkh1OH2amS1399KO6oXZtCUi0uPGDstmYuEgDQPuQUokItLvzJ1SxJItldQ26GFXPUGJRET6nTlTimhoamHxZj3toicokYhIvzN7wlCy06Ms0uitHqFEIiL9TkYsyoWTCli0roKBMKAo1ZRIRKRfmjuliJ2HjrJ2d3WqQ+n3lEhEpF+6YkYxeZkxvvbYKpr1CN5QKZGISL9UkJPBv10zg7K3DvLTl7YkraNmr+6hRCIi/db8maO48szh/MczG1
i3p+p4+db9R5h/78t86pfLUxhd/6FEIiL9lpnxzflnkpcV4wu/e4OGphYeW7GT9/3oJVbsOMRza/eyvbI21WH2eUokItKvDcvJ4NsfOIu1u6u4+p6/8bmHVzB1RB5/+PQFmMGjr+9MdYh9nhKJiPR7l08v5kOzRrN+bzWfmTORh285n1nj8rngjGH8z+vl6ivpIiUSERkQvv2Bd/Di7XP53/OmkhaN/+l7/9mjeKuylte2H0pxdH2bEomIDAixaIQx+dknlF35jhFkpkX4n9fK29hLOkOJREQGrJyMGO+dMZw/vbmb+iY95/10KZGIyID2/rNHcfhoI4vWadr50xVqIjGzeWa23sw2mdkdSbbfbGYVZrYiWD4ZlM9NKFthZnVmNj/Y9gsz25qwbWaY5yAi/dtFkwooyMlQ81YXxMI6sJlFgXuBy4FyYJmZPe7ua1pV/Z2735pY4O6LgJnBcfKJP1b3mYQqt7v7grBiF5GBIxaNMH/mSB5cvI2DRxoYOig91SH1OWFekZwLbHL3Le7eADwMXHMax7kWeMrdddeQiITi/eeMorHZ+dObu1IdSp8UZiIZBexIWC8Pylr7oJm9aWYLzGxMku3XA79tVfatYJ+7zSyjm+IVkQFq+og8JhXl8OTKPakOpU8KM5FYkrLWd/08AYx397OA54AHTziA2QjgHcDChOI7ganAbCAf+ErSNze7xczKzKysokKdaCLSNjNj3ozhLN1ayYEjDakOp88JM5GUA4lXGKOBE64b3b3S3euD1Z8Cs1od4zrgUXdvTNhnt8fVAw8Qb0I7ibvf5+6l7l5aWFjYxVMRkf5u3pnDaXF4bu3eVIfS54SZSJYBJWY2wczSiTdRPZ5YIbjiOOZqYG2rY9xAq2atY/uYmQHzgVXdHLeIDEAzRuYxakgWC1epeetUhTZqy92bzOxW4s1SUeDn7r7azO4Cytz9ceCzZnY10AQcAG4+tr+ZjSd+RfPXVof+jZkVEm86WwH8Y1jnICIDh5nx3hnD+fWSt6ipbyInI7Q/j/2ODYTJykpLS72srCzVYYhIL/fq1gNc95PF3HPj2bzvrJGpDiflzGy5u5d2VE93touIBGaNG8qwQek8reatU6JEIiISiEaMK2YUs2jdPuoaNfdWZymRiIgkeO+M4RxpaOaVzftTHUqfoUQiIpLgXRMLyM2IqXnrFCiRiIgkSI9FuGRaEc+t3UdTc0uqw+kTlEhERFqZN2M4B440sGzbwVSH0icokYiItHLx5ELSoxHd5d5JSiQiIq0MyojxrknDeG7tXgbCvXZdpUQiIpLEZdOKeauyls0VNakOpddTIhERSeLSaUUAPLtmX4oj6f2USEREkhgxOIszR+Wpn6QTlEhERNpw2bRiXtt+kMqa+o4rD2BKJCIibbhsWjHu8Pw6NW+1R4lERKQNM0bmMWJwppq3OqBEIiLSBjPj0mlFvLhhvyZxbIcSiYhIOy6bVszRxmYWb6k8rf3rm5p5bXv/vkNeiUREpB0XTBzGoPQoz605veatB1/Zxgd+/Aoryw93c2S9R6iJxMzmmdl6M9tkZnck2X6zmVWY2Ypg+WTCtuaE8scTyieY2VIz22hmvwueBy8iEoqMWJSLJxfy3Nq9tLSc+l3uf1kb76j/1ZJt3RxZ7xFaIjGzKHAvcCUwHbjBzKYnqfo7d58ZLPcnlB9NKL86ofy7wN3uXgIcBP4hrHMQEQG4+p0j2VtVz8PLdpzSftV1jSx/6yAZsQiPrdjFodqGkCJMrTCvSM4FNrn7FndvAB4GrunKAc3MgEuABUHRg8D8LkUpItKBeWcO57wJ+Xz36XWndE/Jy5v209TifPWqadQ3tfBIWXmIUaZOmIlkFJCYvsuDstY+aGZvmtkCMxuTUJ5pZmVmtsTMjiWLYcAhd2/q4JiY2S3B/mUVFRVdPBURGcjMjG/OP5Mj9U18+6l1nd7vhfUV5GbEuPG8scweP5RfL33rtJrHerswE4klKWv9CT
4BjHf3s4DniF9hHDPW3UuBG4EfmtnETh6KYir7AAAQXklEQVQzXuh+n7uXuntpYWHhqUcvIpKgpDiXT118BguWl/Pq1gMd1nd3XlhfwUUlBaRFI3z0/HG8VVnLixv73xfbMBNJOZB4hTEa2JVYwd0r3f3YdeJPgVkJ23YF/24BXgDOBvYDQ8ws1tYxRUTCctslkxg1JIv/88dVNHbw9MR1e6rZU1XHnCnxL7JXnjmCgpx0frX4rZ4ItUeFmUiWASXBKKt04Hrg8cQKZjYiYfVqYG1QPtTMMoLXBcCFwBqPPxhgEXBtsM9NwGMhnoOIyHHZ6TH+9eoZrN9bzc//trXdui+sj195vGdyfBbh9FiE62eP5fn1+9hxoDb0WHtSaIkk6Me4FVhIPEH83t1Xm9ldZnZsFNZnzWy1mb0BfBa4OSifBpQF5YuA77j7mmDbV4Avmtkm4n0mPwvrHEREWrt8ejGXTSvmB89uYNXOtu8NeWH9PqYOz2X44MzjZTeeNxYDfrN0ew9E2nNsIDz9q7S01MvKylIdhoj0E5U19fzdj/5GRlqEJ267iLzMtBO2V9c1cvZdz/LJd5/BHVdOPWHbp35ZxuvbD7L0q5cRjSTr9u09zGx50FfdLt3ZLiJyioblZHDPjWdTfvAotz/yxkmP4z027HfulJMH+lwzcyT7axpYtq3jDvu+QolEROQ0lI7P5455U1m4ei8/f3nbCduODfs9Z9zQk/abO6WIjFiEp1bu7qFIwxfruIqIiCTzyXdPYNm2A3z7ybXUNTYzYnAmw3IyThj229qgjBhzphTy1Ko9fP3vZxDp5c1bnaFEIiJymsyM73/onXz4J4v5/sL1J2y7ZGpRm/td9Y4RLFy9l9e2H6R0fH7YYYZOiUREpAsGZ6Xx1OfeTVVdEweONHDgSD1H6pt518Rhbe5zydQi0qMRnly5p91EsmZXFd95eh333Hj2SR36vYn6SEREusjMGJyVxoSCQcwal8/FkwuJJWnWOiY3M42LJxfw1Krd7U6Z8tgbO3lxQwWPrejd910rkYiIpMCVZ45g9+E63ig/1Gadsm3xB2L9btmp33ey40AtDy3dzsEj4c84rEQiIpICl00rJi1qPLVqT9LtdY3NvFl+iMLcDFbtrGr35sdkFm+u5KuPruTw0cbuCLddSiQiIikwODuNCycV8OTK3SfdhwLwxo5DNDY7d8ybSnoswu/LTu1ZKBv2VpMRizAmP7u7Qm6TEomISIpcdeYIyg8eZdXOqpO2lb0Vb9a6ZGoRV505nEdf30ldY3Onj71xXw0TC3N65O55JRIRkRS5fHox0Yjx5yQ3Jy7bdoDJxTkMHZTOh2ePpbquiadWdf4mxo17q5lcnNOd4bZJiUREJEWGDkrn4pIC/vj6TpoTRm81tzjLt719j8n5Z+Qzblg2D7/aueat6rpGdh2uo6Q4N5S4W1MiERFJoWtnjWFPVR0vb9p/vGz9nmqq65uYPT4+xYqZcV3pGJZuPcDW/Uc6POamfTUAlBTpikREpN+7dFoRg7PSWLD87ee5H5vQcXbCzYrXzhpNNGKd6nTfuDdIJLoiERHp/zLTolwzcyQLV+85PlR32bYDjBicyaghWcfrFedlMndKEY+UldPQ1P7TGTfui4/YGtsDI7ZAiUREJOWunTWa+qYW/vTmLtydZdsOUDo+H7MTR1x95Lyx7K+p59k1e9s93oa9PTdiC0JOJGY2z8zWm9kmM7sjyfabzazCzFYEyyeD8plmtjh4euKbZvbhhH1+YWZbE/aZGeY5iIiE7R2jBjOlOJcFy8spP3iUvVX1nDv+5CnoL55cyKghWfx6SfvPfd+0r4aSHhqxBSEmEjOLAvcCVwLTgRvMbHqSqr9z95nBcn9QVgt8zN1nAPOAH5rZkIR9bk/YZ0VY5yAi0hPMjGtnjeb17Yf47avx6VCSTeYYjRg3njeWxVsqj3eot1ZT38TOQ0eZ3EP9IxDuFcm5wCZ33+LuDcDDwD
Wd2dHdN7j7xuD1LmAfcPKjxkRE+on5Z48iGjF++tIWcjNjTGkjEXx49hjSosZDbTz3fePeaqDnRmxBuIlkFJA4vKA8KGvtg0Hz1QIzG9N6o5mdC6QDmxOKvxXsc7eZZXRr1CIiKVCYm8HcKYU0Njul44a2+cCrgpwM5p05ggXLd3C04eQ73Tfu69kRWxBuIkn2KbSeUOYJYLy7nwU8Bzx4wgHMRgC/Aj7u7seGKdwJTAVmA/nAV5K+udktZlZmZmUVFRWnfxYiIj3k2lmjgeTNWok+ct5YquqaeOLNk6eX37i3mvQeHLEF4SaSciDxCmM0cMJZu3ulu9cHqz8FZh3bZmZ5wJ+Bf3H3JQn77Pa4euAB4k1oJ3H3+9y91N1LCwvVKiYivd+l04r50uWTua70pMaZE5w3IZ+Sohx+k6R5qyfn2DomzESyDCgxswlmlg5cDzyeWCG44jjmamBtUJ4OPAr80t0fSbaPxcfFzQdWhXYGIiI9KC0a4bZLSyjMbb/F3sz4yHljeWPHoZOml9+4t6bH5tg6JrRE4u5NwK3AQuIJ4vfuvtrM7jKzq4Nqnw2G+L4BfBa4OSi/DrgYuDnJMN/fmNlKYCVQAHwzrHMQEemtPjBrNFlpUX760pbjZakYsQUhP7Pd3Z8EnmxV9rWE13cS7/Novd+vgV+3ccxLujlMEZE+Jy8zjY9fOJ4fv7CZj5w3jnMn5B8fEjypB0dsge5sFxHps269ZBKjhmTxf/64isbmFjYEQ397+opEiUREpI/KTo/x9b+fzvq91fzi5W1s2lfT4yO2IOSmLRERCdfl04u5dGoRdz+3gQkFg3p8xBboikREpE8zM/716hk0tzird1X1+IgtUCIREenzxuRnc+vcSUDPTo1yjJq2RET6gVvecwY19U1c/c5kM1GFS4lERKQfyIhFufOqaSl5bzVtiYhIlyiRiIhIlyiRiIhIlyiRiIhIlyiRiIhIlyiRiIhIlyiRiIhIlyiRiIhIl5h768eo9z9mdhjYmGTTYOBwJ9eTvT72bwGw/zTDa/2end2u2Adm7B3F3V4dxa7YT7VOibsP7vAd3L3fL8B9nSlvbz3Z64R/y7o7NsWu2E8nbsWu2MOMva1loDRtPdHJ8vbWk71u67inoqNjKPaTXw/k2Duzv2I/+bViP706nXrvAdG0FTYzK3P30lTHcToUe2oo9tRQ7OEYKFckYbsv1QF0gWJPDcWeGoo9BLoiERGRLtEViYiIdIkSSStm9nMz22dmq05j31lmttLMNpnZj8zMErbdZmbrzWy1mX2ve6M+/h7dHruZ/auZ7TSzFcFyVfdHHt7nHmz/spm5mRV0X8QnHD+Mz/0bZvZm8Jk/Y2Yjuz/y0GL/vpmtC+J/1MyGdH/kocX+oeB3tMXMur0/oisxt3G8m8xsY7DclFDe7u9Etzud4WT9eQEuBs4BVp3Gvq8CFwAGPAVcGZTPBZ4DMoL1oj4U+78CX+6Ln3uwbQywEHgLKOgrsQN5CXU+C/z/fSj2K4BY8Pq7wHf7UOzTgCnAC0Bpb4k5iGd8q7J8YEvw79Dg9dD2zi+sRVckrbj7i8CBxDIzm2hmT5vZcjN7ycymtt7PzEYQ/+Vf7PH/yV8C84PNnwa+4+71wXvs60Ox94gQY78b+N9AaJ2BYcTu7lUJVQeFFX9IsT/j7k1B1SXA6D4U+1p3Xx9GvF2JuQ3vBZ519wPufhB4FpiXit9nJZLOuQ+4zd1nAV8GfpykziigPGG9PCgDmAy828yWmtlfzWx2qNGeqKuxA9waNFP83MyGhhfqSboUu5ldDex09zfCDjSJLn/uZvYtM9sBfAT4WoixttYdPzPHfIL4N+Ke0p2x95TOxJzMKGBHwvqx8+jx89Mz2ztgZjnAu4BHEpoZM5JVTVJ27FtkjPil5/nAbOD3ZnZG8G0hNN0U+38D3wjWvwH8gPgfh1B1NXYzywb+mXgzS4/qps8dd/9n4J/N7E7gVuDr3RzqyQF1U+
zBsf4ZaAJ+050xtqU7Y+8p7cVsZh8HPheUTQKeNLMGYKu7v5+2z6PHz0+JpGMR4JC7z0wsNLMosDxYfZz4H9zES/jRwK7gdTnwP0HieNXMWojPm1MRZuB0Q+zuvjdhv58Cfwoz4ARdjX0iMAF4I/gFHQ28ZmbnuvueXh57aw8Bf6YHEgndFHvQ8fs+4NKwvzAl6O7PvSckjRnA3R8AHgAwsxeAm919W0KVcmBOwvpo4n0p5fT0+YXZAdNXF2A8CZ1hwCvAh4LXBryzjf2WEb/qONbBdVVQ/o/AXcHrycQvR62PxD4ioc4XgIf7yufeqs42QupsD+lzL0mocxuwoA/FPg9YAxSGFXPYPzOE1Nl+ujHTdmf7VuKtHUOD1/mdOb9uP6ew/6P72gL8FtgNNBLP7P9A/Jvt08AbwS/I19rYtxRYBWwG7uHtGz7TgV8H214DLulDsf8KWAm8Sfzb3Ii+EnurOtsIb9RWGJ/7H4LyN4nPdzSqD8W+ifiXpRXBEtaIszBif39wrHpgL7CwN8RMkkQSlH8i+Lw3AR8/ld+J7lx0Z7uIiHSJRm2JiEiXKJGIiEiXKJGIiEiXKJGIiEiXKJGIiEiXKJHIgGRmNT38fveb2fRuOlazxWcFXmVmT3Q0u66ZDTGzz3THe4sko+G/MiCZWY2753Tj8WL+9kSFoUqM3cweBDa4+7faqT8e+JO7n9kT8cnAoysSkYCZFZrZH8xsWbBcGJSfa2avmNnrwb9TgvKbzewRM3sCeMbM5pjZC2a2wOLP4/jNsedABOWlweuaYELGN8xsiZkVB+UTg/VlZnZXJ6+aFvP2JJU5ZvYXM3vN4s+iuCao8x1gYnAV8/2g7u3B+7xpZv/WjR+jDEBKJCJv+0/gbnefDXwQuD8oXwdc7O5nE5+F998T9rkAuMndLwnWzwY+D0wHzgAuTPI+g4Al7v5O4EXgUwnv/5/B+3c4N1Iwh9SlxGccAKgD3u/u5xB/Bs4PgkR2B7DZ3We6++1mdgVQApwLzARmmdnFHb2fSFs0aaPI2y4DpifMwppnZrnAYOBBMyshPotqWsI+z7p74vMlXnX3cgAzW0F8XqW/tXqfBt6e/HI5cHnw+gLefm7EQ8D/bSPOrIRjLyf+HAqIz6v070FSaCF+pVKcZP8rguX1YD2HeGJ5sY33E2mXEonI2yLABe5+NLHQzP4LWOTu7w/6G15I2Hyk1THqE143k/x3rNHf7pxsq057jrr7TDMbTDwh/RPwI+LPLSkEZrl7o5ltAzKT7G/At939J6f4viJJqWlL5G3PEH/uBwBmdmxq78HAzuD1zSG+/xLiTWoA13dU2d0PE38M75fNLI14nPuCJDIXGBdUrQZyE3ZdCHwieBYGZjbKzIq66RxkAFIikYEq28zKE5YvEv+jXBp0QK8hPv0/wPeAb5vZy0A0xJg+D3zRzF4FRgCHO9rB3V8nPmvs9cQfIFVqZmXEr07WBXUqgZeD4cLfd/dniDedLTazlcACTkw0IqdEw39FeongqY5H3d3N7HrgBne/pqP9RFJNfSQivccs4J5gpNUheuCRxiLdQVckIiLSJeojERGRLlEiERGRLlEiERGRLlEiERGRLlEiERGRLlEiERGRLvl/3zUG6vrIYuEAAAAASUVORK5CYII=\n"},"metadata":{"needs_background":"light"}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.fit_one_cycle(2, max_lr=slice(1e-3, 1e-2), moms=(0.8, 0.7), pct_start=0.2, wd =(1e-7, 1e-5, 1e-4, 1e-3, 1e-2))","execution_count":58,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML 
object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: left;\">\n      <th>epoch</th>\n      <th>train_loss</th>\n      <th>valid_loss</th>\n      <th>accuracy_thresh</th>\n      <th>time</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>0</td>\n      <td>0.083538</td>\n      <td>0.066938</td>\n      <td>0.968161</td>\n      <td>05:18</td>\n    </tr>\n    <tr>\n      <td>1</td>\n      <td>0.072916</td>\n      <td>0.065506</td>\n      <td>0.970056</td>\n      <td>04:26</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.save('first-head')\nlearn.load('first-head')","execution_count":59,"outputs":[{"output_type":"execute_result","execution_count":59,"data":{"text/plain":"RNNLearner(data=TextClasDataBunch;\n\nTrain: LabelList (127656 items)\nx: TextList\nxxbos xxmaj grandma xxmaj terri xxmaj should xxmaj burn in xxmaj trash \n  xxmaj grandma xxmaj terri is trash . i hate xxmaj grandma xxmaj terri . xxup xxunk her to xxup hell ! 71.74.76.40,xxbos , 9 xxmaj may 2009 ( xxup utc ) \n  xxmaj it would be easiest if you were to admit to being a member of the involved xxmaj portuguese xxmaj lodge , and then there would be no requirement to acknowledge whether you had a previous account ( xxmaj carlos xxmaj xxunk did not have a good record ) or not and i would then remove the sockpuppet template as irrelevant . xxup wp : xxup coi permits people to edit those articles , such as msjapan does , but just means you have to be more careful in ensuring that references back your edits and that xxup npov is upheld . 
[... verbose RNNLearner repr truncated ...]"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.freeze_to(-2)\nlearn.fit_one_cycle(2, slice(5e-2/(2.6**4),5e-2), moms=(0.8,0.7), pct_start=0.2, wd =(1e-7, 1e-5, 1e-4, 1e-3, 1e-2))","execution_count":60,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: left;\">\n      <th>epoch</th>\n      <th>train_loss</th>\n      <th>valid_loss</th>\n      <th>accuracy_thresh</th>\n      <th>time</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>0</td>\n      <td>0.067746</td>\n      <td>0.889934</td>\n      <td>0.975555</td>\n      <td>05:52</td>\n    </tr>\n    <tr>\n      <td>1</td>\n      <td>0.058668</td>\n      <td>1.353945</td>\n      <td>0.971554</td>\n      <td>05:02</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.save('second')\nlearn.load('second')","execution_count":61,"outputs":[{"output_type":"execute_result","execution_count":61,"data":{"text/plain":"RNNLearner(data=TextClasDataBunch;\n\nTrain: LabelList (127656 items)\nx: TextList\nxxbos xxmaj grandma xxmaj 
[...]
15 ) from which it follows that xxmaj marxism does not qualify as a scientific system by any accepted standards ; \n \n  v. the evidence indicates that xxmaj marxism is closer to a religious sect than to science proper ; \n \n  ( f ) apologist literature is being quoted in a fraudulent attempt to whitewash xxmaj marxist terrorism , in effect turning the discussion into an advertisement for terrorism ; \n \n  ( g ) it is claimed that xxmaj marxist terrorism is not rooted in the xxmaj marxist theory of class struggle even though there are numerous sources showing that it is ( please note that it is immaterial whether terrorism had already been justified in terms of a theory of class prior to xxmaj marx , the point being that it was advocated / practiced on the basis of xxmaj marxist class - struggle theories xxup by xxup marxists ) : \n \n  “ xxmaj karl xxmaj marx felt that terror was a necessary part of a revolutionary strategy ” ( xxmaj peter xxmaj xxunk , “ xxmaj theories of xxmaj terror in xxmaj urban xxmaj xxunk ” , xxup iet , p. 138 ) ; \n \n  “ xxmaj revolutionary terrorism has its roots in a political ideology , from the xxmaj marxist - xxmaj leninist thinking of the xxmaj left , to the fascists found on the xxmaj right ” ( xxmaj xxunk xxmaj gal - xxmaj or , \" \" xxmaj revolutionary xxmaj terrorism \" \" , xxup iet , p. 203 ) ; \n \n  “ … perhaps the most important key to xxmaj stalin ’s motivation lies in the realm of ideology . xxmaj the xxunk of xxmaj soviet communist ideology in the 1920s and 1930s was class struggle – the xxunk antagonism between mutually incompatible economic interest groups ” ( xxmaj geoffrey xxmaj robert , xxmaj stalins xxmaj wars , 2006 , pp . 17 - 18 ) ; \n \n  this fact is supported not only by reliable academic sources , but by elementary logic : \n \n  “ xxmaj in 1907 xxmaj xxunk published in the magazine ‘ ’ xxmaj neue xxmaj zeit ( xxmaj vol . xxup xxv 2 , p. 
164 ) extracts from a letter by xxmaj marx to xxmaj xxunk dated xxmaj march 5 , 1852 . xxmaj in this letter , among other things , is the following noteworthy observation : … class struggle necessarily leads to the dictatorship of the proletariat … ”,xxbos xxmaj shelly xxmaj shock \n  xxmaj shelly xxmaj shock is . . . ( ),xxbos i do not care . xxmaj refer to xxmaj ong xxmaj teng xxmaj cheong talk page . xxmaj is xxmaj la goutte de pluie writing a biography or writing the history of trade unions . xxmaj she is making use of the dead to push her agenda again . xxmaj right before elections too . xxmaj how timely . xxunk\ny: MultiCategoryList\ntoxic,,,,\nPath: .;\n\nValid: LabelList (31915 items)\nx: TextList\nxxbos xxmaj geez , are you xxunk ! xxmaj we 've already discussed why xxmaj marx was not an anarchist , i.e. he wanted to use a xxmaj state to mold his ' socialist man . ' xxmaj ergo , he is a statist - the opposite of an anarchist . i know a guy who says that , when he gets old and his teeth fall out , he 'll quit eating meat . xxmaj would you call him a vegetarian ?,xxbos xxmaj xxunk xxup rfa \n \n  xxmaj thanks for your support on my request for adminship . \n \n  xxmaj the final outcome was ( 31 / 4 / 1 ) , so i am now an administrator . xxmaj if you have any comments or concerns on my actions as an administrator , please let me know . xxmaj thank you !,xxbos \" \n \n  xxmaj birthday \n \n  xxmaj no worries , xxmaj it 's what i do ; ) xxmaj enjoy ur xxunk \",xxbos xxmaj pseudoscience category ? \n \n  i 'm assuming that this article is in the pseudoscience category because of its association with creationism . xxmaj however , there are modern , scientifically - accepted variants of xxunk that have nothing to do with creationism — and they 're even mentioned in the article ! 
i think the connection to pseudoscience needs to be clarified , or the article made more general and less creationism - specific and the category tag removed entirely .,xxbos ( and if such phrase exists , it would be provided by search engine even if mentioned page is not available as a whole )\ny: MultiCategoryList\n,,,,\nPath: .;\n\nTest: None, model=SequentialRNN(\n  (0): MultiBatchEncoder(\n    (module): AWD_LSTM(\n      (encoder): Embedding(57520, 400, padding_idx=1)\n      (encoder_dp): EmbeddingDropout(\n        (emb): Embedding(57520, 400, padding_idx=1)\n      )\n      (rnns): ModuleList(\n        (0): WeightDropout(\n          (module): LSTM(400, 1150, batch_first=True)\n        )\n        (1): WeightDropout(\n          (module): LSTM(1150, 1150, batch_first=True)\n        )\n        (2): WeightDropout(\n          (module): LSTM(1150, 400, batch_first=True)\n        )\n      )\n      (input_dp): RNNDropout()\n      (hidden_dps): ModuleList(\n        (0): RNNDropout()\n        (1): RNNDropout()\n        (2): RNNDropout()\n      )\n    )\n  )\n  (1): PoolingLinearClassifier(\n    (layers): Sequential(\n      (0): BatchNorm1d(1200, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (1): Dropout(p=0.2)\n      (2): Linear(in_features=1200, out_features=50, bias=True)\n      (3): ReLU(inplace)\n      (4): BatchNorm1d(50, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (5): Dropout(p=0.1)\n      (6): Linear(in_features=50, out_features=6, bias=True)\n    )\n  )\n), opt_func=functools.partial(<class 'torch.optim.adam.Adam'>, betas=(0.9, 0.99)), loss_func=BCEWithLogitsLoss(), metrics=[functools.partial(<function accuracy_thresh at 0x7f79cc28dae8>, thresh=0.25)], true_wd=True, bn_wd=True, wd=0.01, train_bn=True, path=PosixPath('../input'), model_dir='/temp/model', callback_fns=[functools.partial(<class 'fastai.basic_train.Recorder'>, add_time=True, silent=False)], callbacks=[...], layer_groups=[Sequential(\n  (0): 
Embedding(57520, 400, padding_idx=1)\n  (1): EmbeddingDropout(\n    (emb): Embedding(57520, 400, padding_idx=1)\n  )\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(400, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 400, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): PoolingLinearClassifier(\n    (layers): Sequential(\n      (0): BatchNorm1d(1200, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (1): Dropout(p=0.2)\n      (2): Linear(in_features=1200, out_features=50, bias=True)\n      (3): ReLU(inplace)\n      (4): BatchNorm1d(50, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (5): Dropout(p=0.1)\n      (6): Linear(in_features=50, out_features=6, bias=True)\n    )\n  )\n)], add_time=True, silent=None)\nalpha: 2.0\nbeta: 1.0], layer_groups=[Sequential(\n  (0): Embedding(57520, 400, padding_idx=1)\n  (1): EmbeddingDropout(\n    (emb): Embedding(57520, 400, padding_idx=1)\n  )\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(400, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 400, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): PoolingLinearClassifier(\n    (layers): Sequential(\n      (0): BatchNorm1d(1200, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (1): Dropout(p=0.2)\n      (2): Linear(in_features=1200, out_features=50, bias=True)\n      (3): ReLU(inplace)\n      (4): BatchNorm1d(50, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (5): Dropout(p=0.1)\n      (6): Linear(in_features=50, out_features=6, bias=True)\n    )\n  )\n)], add_time=True, 
silent=None)"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.freeze_to(-3)\nlearn.fit_one_cycle(2, slice(5e-2/(2.6**4),5e-2), moms=(0.8,0.7), pct_start=0.2, wd =(1e-7, 1e-5, 1e-4, 1e-3, 1e-2))","execution_count":62,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: left;\">\n      <th>epoch</th>\n      <th>train_loss</th>\n      <th>valid_loss</th>\n      <th>accuracy_thresh</th>\n      <th>time</th>\n    </tr>\n  </thead>\n  <tbody>\n    <tr>\n      <td>0</td>\n      <td>0.078076</td>\n      <td>15.911341</td>\n      <td>0.956583</td>\n      <td>08:46</td>\n    </tr>\n    <tr>\n      <td>1</td>\n      <td>0.063021</td>\n      <td>0.530898</td>\n      <td>0.965607</td>\n      <td>08:56</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.save('third')\nlearn.load('third')","execution_count":63,"outputs":[{"output_type":"execute_result","execution_count":63,"data":{"text/plain":"RNNLearner(data=TextClasDataBunch;\n\nTrain: LabelList (127656 items)\nx: TextList\nxxbos xxmaj grandma xxmaj terri xxmaj should xxmaj burn in xxmaj trash \n  xxmaj grandma xxmaj terri is trash . i hate xxmaj grandma xxmaj terri . xxup xxunk her to xxup hell ! 71.74.76.40,xxbos , 9 xxmaj may 2009 ( xxup utc ) \n  xxmaj it would be easiest if you were to admit to being a member of the involved xxmaj portuguese xxmaj lodge , and then there would be no requirement to acknowledge whether you had a previous account ( xxmaj carlos xxmaj xxunk did not have a good record ) or not and i would then remove the sockpuppet template as irrelevant . xxup wp : xxup coi permits people to edit those articles , such as msjapan does , but just means you have to be more careful in ensuring that references back your edits and that xxup npov is upheld . 
20:29,xxbos \" \n \n  xxmaj the xxmaj objectivity of this xxmaj discussion is doubtful ( non - existent ) \n \n  ( 1 ) xxmaj as indicated earlier , the section on xxmaj marxist leaders ’ views is misleading : \n \n  ( a ) it lays unwarranted and excessive emphasis on xxmaj trotsky , creating the misleading impression that other prominent xxmaj marxists ( xxmaj marx , xxmaj engels , xxmaj lenin ) did not advocate and / or practiced terrorism ; \n \n  ( b ) it lays unwarranted and excessive emphasis on the theoretical “ rejection of individual terrorism ” , creating the misleading impression that this is the main ( only ) xxmaj marxist position on terrorism . \n \n  ( 2 ) xxmaj the discussion is not being properly monitored : \n \n  ( a ) no discernible attempt is being made to establish and maintain an acceptable degree of objectivity ; \n \n  ( b ) important and relevant scholarly works such as the xxmaj international xxmaj encyclopedia of xxmaj terrorism are being ignored or xxunk excluded from the discussion ; \n \n  ( c ) though the only logical way to remedy the blatant imbalance in the above section is to include quotes by / on other leaders who are known to have endorsed and practiced terrorism all attempts to do so have been systematically blocked with impunity by the apologists for xxmaj marxist terrorism who have done their best to sabotage and wreck both the article and the discussion . 
\n \n  ( 3 ) xxmaj among the tactics deployed by the apologist wreckers and xxunk the following may be identified as representative examples : \n \n  ( a ) it is claimed that xxmaj marx and xxmaj engels did not advocate terrorism despite the fact that scholarly works like the xxmaj international xxmaj encyclopedia of xxmaj terrorism show that they did , and xxmaj marx himself was known as “ xxmaj the xxmaj red xxmaj terror xxmaj doctor ” ; \n \n  ( b ) it is claimed that xxmaj marx and xxmaj engels were not involved in terrorist activities despite the fact that numerous sources from xxmaj the xxmaj neue xxmaj xxunk xxmaj zeitung to xxmaj isaiah xxmaj berlin and xxmaj francis xxmaj xxunk state otherwise ; \n \n  ( c ) it is claimed that xxmaj lenin does not refer to terror in xxmaj the xxmaj proletarian xxmaj revolution and the xxmaj renegade xxup k. xxmaj xxunk and other works / statements despite the fact that xxmaj robert xxmaj service , xxup iet , and other scholarly and reliable sources state that he does ; \n \n  ( d ) it is claimed that the xxmaj russian word ‘ ’ xxunk ’ ’ does not mean “ terror ” when : \n \n  i. the xxmaj oxford xxmaj russian xxmaj dictionary says that it does ; \n \n  ii . it is evident from the context that this is the case ; \n \n  iii . any educated xxmaj russian speaker can confirm that xxunk may mean “ terror ” depending on the context ; \n \n  ( e ) it is claimed that xxmaj marxism is “ scientific ” when in fact : \n \n  i. xxmaj marx was not a scientist ; \n \n  ii . xxmaj marx ’s background was philosophy and law , not science ; \n \n  iii . xxmaj marxism is not recognized as a science by the academic world ; \n \n  iv . virtually every one of xxmaj marx ’s predictions turned out to be wrong , as became increasingly apparent during his lifetime and xxunk so after his death ( xxup r. xxmaj pipes , xxmaj communism : a xxmaj brief xxmaj history , 2001 , p. 
15 ) from which it follows that xxmaj marxism does not qualify as a scientific system by any accepted standards ; \n \n  v. the evidence indicates that xxmaj marxism is closer to a religious sect than to science proper ; \n \n  ( f ) apologist literature is being quoted in a fraudulent attempt to whitewash xxmaj marxist terrorism , in effect turning the discussion into an advertisement for terrorism ; \n \n  ( g ) it is claimed that xxmaj marxist terrorism is not rooted in the xxmaj marxist theory of class struggle even though there are numerous sources showing that it is ( please note that it is immaterial whether terrorism had already been justified in terms of a theory of class prior to xxmaj marx , the point being that it was advocated / practiced on the basis of xxmaj marxist class - struggle theories xxup by xxup marxists ) : \n \n  “ xxmaj karl xxmaj marx felt that terror was a necessary part of a revolutionary strategy ” ( xxmaj peter xxmaj xxunk , “ xxmaj theories of xxmaj terror in xxmaj urban xxmaj xxunk ” , xxup iet , p. 138 ) ; \n \n  “ xxmaj revolutionary terrorism has its roots in a political ideology , from the xxmaj marxist - xxmaj leninist thinking of the xxmaj left , to the fascists found on the xxmaj right ” ( xxmaj xxunk xxmaj gal - xxmaj or , \" \" xxmaj revolutionary xxmaj terrorism \" \" , xxup iet , p. 203 ) ; \n \n  “ … perhaps the most important key to xxmaj stalin ’s motivation lies in the realm of ideology . xxmaj the xxunk of xxmaj soviet communist ideology in the 1920s and 1930s was class struggle – the xxunk antagonism between mutually incompatible economic interest groups ” ( xxmaj geoffrey xxmaj robert , xxmaj stalins xxmaj wars , 2006 , pp . 17 - 18 ) ; \n \n  this fact is supported not only by reliable academic sources , but by elementary logic : \n \n  “ xxmaj in 1907 xxmaj xxunk published in the magazine ‘ ’ xxmaj neue xxmaj zeit ( xxmaj vol . xxup xxv 2 , p. 
164 ) extracts from a letter by xxmaj marx to xxmaj xxunk dated xxmaj march 5 , 1852 . xxmaj in this letter , among other things , is the following noteworthy observation : … class struggle necessarily leads to the dictatorship of the proletariat … ”,xxbos xxmaj shelly xxmaj shock \n  xxmaj shelly xxmaj shock is . . . ( ),xxbos i do not care . xxmaj refer to xxmaj ong xxmaj teng xxmaj cheong talk page . xxmaj is xxmaj la goutte de pluie writing a biography or writing the history of trade unions . xxmaj she is making use of the dead to push her agenda again . xxmaj right before elections too . xxmaj how timely . xxunk\ny: MultiCategoryList\ntoxic,,,,\nPath: .;\n\nValid: LabelList (31915 items)\nx: TextList\nxxbos xxmaj geez , are you xxunk ! xxmaj we 've already discussed why xxmaj marx was not an anarchist , i.e. he wanted to use a xxmaj state to mold his ' socialist man . ' xxmaj ergo , he is a statist - the opposite of an anarchist . i know a guy who says that , when he gets old and his teeth fall out , he 'll quit eating meat . xxmaj would you call him a vegetarian ?,xxbos xxmaj xxunk xxup rfa \n \n  xxmaj thanks for your support on my request for adminship . \n \n  xxmaj the final outcome was ( 31 / 4 / 1 ) , so i am now an administrator . xxmaj if you have any comments or concerns on my actions as an administrator , please let me know . xxmaj thank you !,xxbos \" \n \n  xxmaj birthday \n \n  xxmaj no worries , xxmaj it 's what i do ; ) xxmaj enjoy ur xxunk \",xxbos xxmaj pseudoscience category ? \n \n  i 'm assuming that this article is in the pseudoscience category because of its association with creationism . xxmaj however , there are modern , scientifically - accepted variants of xxunk that have nothing to do with creationism — and they 're even mentioned in the article ! 
i think the connection to pseudoscience needs to be clarified , or the article made more general and less creationism - specific and the category tag removed entirely .,xxbos ( and if such phrase exists , it would be provided by search engine even if mentioned page is not available as a whole )\ny: MultiCategoryList\n,,,,\nPath: .;\n\nTest: None, model=SequentialRNN(\n  (0): MultiBatchEncoder(\n    (module): AWD_LSTM(\n      (encoder): Embedding(57520, 400, padding_idx=1)\n      (encoder_dp): EmbeddingDropout(\n        (emb): Embedding(57520, 400, padding_idx=1)\n      )\n      (rnns): ModuleList(\n        (0): WeightDropout(\n          (module): LSTM(400, 1150, batch_first=True)\n        )\n        (1): WeightDropout(\n          (module): LSTM(1150, 1150, batch_first=True)\n        )\n        (2): WeightDropout(\n          (module): LSTM(1150, 400, batch_first=True)\n        )\n      )\n      (input_dp): RNNDropout()\n      (hidden_dps): ModuleList(\n        (0): RNNDropout()\n        (1): RNNDropout()\n        (2): RNNDropout()\n      )\n    )\n  )\n  (1): PoolingLinearClassifier(\n    (layers): Sequential(\n      (0): BatchNorm1d(1200, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (1): Dropout(p=0.2)\n      (2): Linear(in_features=1200, out_features=50, bias=True)\n      (3): ReLU(inplace)\n      (4): BatchNorm1d(50, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (5): Dropout(p=0.1)\n      (6): Linear(in_features=50, out_features=6, bias=True)\n    )\n  )\n), opt_func=functools.partial(<class 'torch.optim.adam.Adam'>, betas=(0.9, 0.99)), loss_func=BCEWithLogitsLoss(), metrics=[functools.partial(<function accuracy_thresh at 0x7f79cc28dae8>, thresh=0.25)], true_wd=True, bn_wd=True, wd=0.01, train_bn=True, path=PosixPath('../input'), model_dir='/temp/model', callback_fns=[functools.partial(<class 'fastai.basic_train.Recorder'>, add_time=True, silent=False)], callbacks=[RNNTrainer\nlearn: 
RNNLearner(data=TextClasDataBunch;\n\nTrain: LabelList (127656 items)\nx: TextList\nxxbos xxmaj grandma xxmaj terri xxmaj should xxmaj burn in xxmaj trash \n  xxmaj grandma xxmaj terri is trash . i hate xxmaj grandma xxmaj terri . xxup xxunk her to xxup hell ! 71.74.76.40,xxbos , 9 xxmaj may 2009 ( xxup utc ) \n  xxmaj it would be easiest if you were to admit to being a member of the involved xxmaj portuguese xxmaj lodge , and then there would be no requirement to acknowledge whether you had a previous account ( xxmaj carlos xxmaj xxunk did not have a good record ) or not and i would then remove the sockpuppet template as irrelevant . xxup wp : xxup coi permits people to edit those articles , such as msjapan does , but just means you have to be more careful in ensuring that references back your edits and that xxup npov is upheld . 20:29,xxbos \" \n \n  xxmaj the xxmaj objectivity of this xxmaj discussion is doubtful ( non - existent ) \n \n  ( 1 ) xxmaj as indicated earlier , the section on xxmaj marxist leaders ’ views is misleading : \n \n  ( a ) it lays unwarranted and excessive emphasis on xxmaj trotsky , creating the misleading impression that other prominent xxmaj marxists ( xxmaj marx , xxmaj engels , xxmaj lenin ) did not advocate and / or practiced terrorism ; \n \n  ( b ) it lays unwarranted and excessive emphasis on the theoretical “ rejection of individual terrorism ” , creating the misleading impression that this is the main ( only ) xxmaj marxist position on terrorism . 
\n \n  ( 2 ) xxmaj the discussion is not being properly monitored : \n \n  ( a ) no discernible attempt is being made to establish and maintain an acceptable degree of objectivity ; \n \n  ( b ) important and relevant scholarly works such as the xxmaj international xxmaj encyclopedia of xxmaj terrorism are being ignored or xxunk excluded from the discussion ; \n \n  ( c ) though the only logical way to remedy the blatant imbalance in the above section is to include quotes by / on other leaders who are known to have endorsed and practiced terrorism all attempts to do so have been systematically blocked with impunity by the apologists for xxmaj marxist terrorism who have done their best to sabotage and wreck both the article and the discussion . \n \n  ( 3 ) xxmaj among the tactics deployed by the apologist wreckers and xxunk the following may be identified as representative examples : \n \n  ( a ) it is claimed that xxmaj marx and xxmaj engels did not advocate terrorism despite the fact that scholarly works like the xxmaj international xxmaj encyclopedia of xxmaj terrorism show that they did , and xxmaj marx himself was known as “ xxmaj the xxmaj red xxmaj terror xxmaj doctor ” ; \n \n  ( b ) it is claimed that xxmaj marx and xxmaj engels were not involved in terrorist activities despite the fact that numerous sources from xxmaj the xxmaj neue xxmaj xxunk xxmaj zeitung to xxmaj isaiah xxmaj berlin and xxmaj francis xxmaj xxunk state otherwise ; \n \n  ( c ) it is claimed that xxmaj lenin does not refer to terror in xxmaj the xxmaj proletarian xxmaj revolution and the xxmaj renegade xxup k. xxmaj xxunk and other works / statements despite the fact that xxmaj robert xxmaj service , xxup iet , and other scholarly and reliable sources state that he does ; \n \n  ( d ) it is claimed that the xxmaj russian word ‘ ’ xxunk ’ ’ does not mean “ terror ” when : \n \n  i. the xxmaj oxford xxmaj russian xxmaj dictionary says that it does ; \n \n  ii . 
it is evident from the context that this is the case ; \n \n  iii . any educated xxmaj russian speaker can confirm that xxunk may mean “ terror ” depending on the context ; \n \n  ( e ) it is claimed that xxmaj marxism is “ scientific ” when in fact : \n \n  i. xxmaj marx was not a scientist ; \n \n  ii . xxmaj marx ’s background was philosophy and law , not science ; \n \n  iii . xxmaj marxism is not recognized as a science by the academic world ; \n \n  iv . virtually every one of xxmaj marx ’s predictions turned out to be wrong , as became increasingly apparent during his lifetime and xxunk so after his death ( xxup r. xxmaj pipes , xxmaj communism : a xxmaj brief xxmaj history , 2001 , p. 15 ) from which it follows that xxmaj marxism does not qualify as a scientific system by any accepted standards ; \n \n  v. the evidence indicates that xxmaj marxism is closer to a religious sect than to science proper ; \n \n  ( f ) apologist literature is being quoted in a fraudulent attempt to whitewash xxmaj marxist terrorism , in effect turning the discussion into an advertisement for terrorism ; \n \n  ( g ) it is claimed that xxmaj marxist terrorism is not rooted in the xxmaj marxist theory of class struggle even though there are numerous sources showing that it is ( please note that it is immaterial whether terrorism had already been justified in terms of a theory of class prior to xxmaj marx , the point being that it was advocated / practiced on the basis of xxmaj marxist class - struggle theories xxup by xxup marxists ) : \n \n  “ xxmaj karl xxmaj marx felt that terror was a necessary part of a revolutionary strategy ” ( xxmaj peter xxmaj xxunk , “ xxmaj theories of xxmaj terror in xxmaj urban xxmaj xxunk ” , xxup iet , p. 
138 ) ; \n \n  “ xxmaj revolutionary terrorism has its roots in a political ideology , from the xxmaj marxist - xxmaj leninist thinking of the xxmaj left , to the fascists found on the xxmaj right ” ( xxmaj xxunk xxmaj gal - xxmaj or , \" \" xxmaj revolutionary xxmaj terrorism \" \" , xxup iet , p. 203 ) ; \n \n  “ … perhaps the most important key to xxmaj stalin ’s motivation lies in the realm of ideology . xxmaj the xxunk of xxmaj soviet communist ideology in the 1920s and 1930s was class struggle – the xxunk antagonism between mutually incompatible economic interest groups ” ( xxmaj geoffrey xxmaj robert , xxmaj stalins xxmaj wars , 2006 , pp . 17 - 18 ) ; \n \n  this fact is supported not only by reliable academic sources , but by elementary logic : \n \n  “ xxmaj in 1907 xxmaj xxunk published in the magazine ‘ ’ xxmaj neue xxmaj zeit ( xxmaj vol . xxup xxv 2 , p. 164 ) extracts from a letter by xxmaj marx to xxmaj xxunk dated xxmaj march 5 , 1852 . xxmaj in this letter , among other things , is the following noteworthy observation : … class struggle necessarily leads to the dictatorship of the proletariat … ”,xxbos xxmaj shelly xxmaj shock \n  xxmaj shelly xxmaj shock is . . . ( ),xxbos i do not care . xxmaj refer to xxmaj ong xxmaj teng xxmaj cheong talk page . xxmaj is xxmaj la goutte de pluie writing a biography or writing the history of trade unions . xxmaj she is making use of the dead to push her agenda again . xxmaj right before elections too . xxmaj how timely . xxunk\ny: MultiCategoryList\ntoxic,,,,\nPath: .;\n\nValid: LabelList (31915 items)\nx: TextList\nxxbos xxmaj geez , are you xxunk ! xxmaj we 've already discussed why xxmaj marx was not an anarchist , i.e. he wanted to use a xxmaj state to mold his ' socialist man . ' xxmaj ergo , he is a statist - the opposite of an anarchist . i know a guy who says that , when he gets old and his teeth fall out , he 'll quit eating meat . 
xxmaj would you call him a vegetarian ?,xxbos xxmaj xxunk xxup rfa \n \n  xxmaj thanks for your support on my request for adminship . \n \n  xxmaj the final outcome was ( 31 / 4 / 1 ) , so i am now an administrator . xxmaj if you have any comments or concerns on my actions as an administrator , please let me know . xxmaj thank you !,xxbos \" \n \n  xxmaj birthday \n \n  xxmaj no worries , xxmaj it 's what i do ; ) xxmaj enjoy ur xxunk \",xxbos xxmaj pseudoscience category ? \n \n  i 'm assuming that this article is in the pseudoscience category because of its association with creationism . xxmaj however , there are modern , scientifically - accepted variants of xxunk that have nothing to do with creationism — and they 're even mentioned in the article ! i think the connection to pseudoscience needs to be clarified , or the article made more general and less creationism - specific and the category tag removed entirely .,xxbos ( and if such phrase exists , it would be provided by search engine even if mentioned page is not available as a whole )\ny: MultiCategoryList\n,,,,\nPath: .;\n\nTest: None, model=SequentialRNN(\n  (0): MultiBatchEncoder(\n    (module): AWD_LSTM(\n      (encoder): Embedding(57520, 400, padding_idx=1)\n      (encoder_dp): EmbeddingDropout(\n        (emb): Embedding(57520, 400, padding_idx=1)\n      )\n      (rnns): ModuleList(\n        (0): WeightDropout(\n          (module): LSTM(400, 1150, batch_first=True)\n        )\n        (1): WeightDropout(\n          (module): LSTM(1150, 1150, batch_first=True)\n        )\n        (2): WeightDropout(\n          (module): LSTM(1150, 400, batch_first=True)\n        )\n      )\n      (input_dp): RNNDropout()\n      (hidden_dps): ModuleList(\n        (0): RNNDropout()\n        (1): RNNDropout()\n        (2): RNNDropout()\n      )\n    )\n  )\n  (1): PoolingLinearClassifier(\n    (layers): Sequential(\n      (0): BatchNorm1d(1200, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (1): 
Dropout(p=0.2)\n      (2): Linear(in_features=1200, out_features=50, bias=True)\n      (3): ReLU(inplace)\n      (4): BatchNorm1d(50, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (5): Dropout(p=0.1)\n      (6): Linear(in_features=50, out_features=6, bias=True)\n    )\n  )\n), opt_func=functools.partial(<class 'torch.optim.adam.Adam'>, betas=(0.9, 0.99)), loss_func=BCEWithLogitsLoss(), metrics=[functools.partial(<function accuracy_thresh at 0x7f79cc28dae8>, thresh=0.25)], true_wd=True, bn_wd=True, wd=0.01, train_bn=True, path=PosixPath('../input'), model_dir='/temp/model', callback_fns=[functools.partial(<class 'fastai.basic_train.Recorder'>, add_time=True, silent=False)], callbacks=[...], layer_groups=[Sequential(\n  (0): Embedding(57520, 400, padding_idx=1)\n  (1): EmbeddingDropout(\n    (emb): Embedding(57520, 400, padding_idx=1)\n  )\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(400, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 400, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): PoolingLinearClassifier(\n    (layers): Sequential(\n      (0): BatchNorm1d(1200, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (1): Dropout(p=0.2)\n      (2): Linear(in_features=1200, out_features=50, bias=True)\n      (3): ReLU(inplace)\n      (4): BatchNorm1d(50, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (5): Dropout(p=0.1)\n      (6): Linear(in_features=50, out_features=6, bias=True)\n    )\n  )\n)], add_time=True, silent=None)\nalpha: 2.0\nbeta: 1.0], layer_groups=[Sequential(\n  (0): Embedding(57520, 400, padding_idx=1)\n  (1): EmbeddingDropout(\n    (emb): Embedding(57520, 400, padding_idx=1)\n  )\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(400, 1150, batch_first=True)\n  
)\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 1150, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): WeightDropout(\n    (module): LSTM(1150, 400, batch_first=True)\n  )\n  (1): RNNDropout()\n), Sequential(\n  (0): PoolingLinearClassifier(\n    (layers): Sequential(\n      (0): BatchNorm1d(1200, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (1): Dropout(p=0.2)\n      (2): Linear(in_features=1200, out_features=50, bias=True)\n      (3): ReLU(inplace)\n      (4): BatchNorm1d(50, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)\n      (5): Dropout(p=0.1)\n      (6): Linear(in_features=50, out_features=6, bias=True)\n    )\n  )\n)], add_time=True, silent=None)"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.unfreeze()\nlearn.lr_find()\nlearn.recorder.plot(suggestion=True)","execution_count":64,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":""},"metadata":{}},{"output_type":"stream","text":"LR Finder is complete, type {learner_name}.recorder.plot() to see the graph.\nMin numerical gradient: 6.31E-07\n","name":"stdout"},{"output_type":"display_data","data":{"text/plain":"<Figure size 432x288 with 1 
Axes>"},"metadata":{"needs_background":"light"}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.fit_one_cycle(2, slice(1e-4/(2.6**4),1e-4), moms=(0.8,0.7), pct_start=0.2, wd =(1e-7, 1e-5, 1e-4, 1e-3, 1e-2))","execution_count":65,"outputs":[{"output_type":"display_data","data":{"text/plain":"<IPython.core.display.HTML object>","text/html":"<table border=\"1\" class=\"dataframe\">\n  <thead>\n    <tr style=\"text-align: left;\">\n      <th>epoch</th>\n      <th>train_loss</th>\n      <th>valid_loss</th>\n      <th>accuracy_thresh</th>\n      <th>time</th>\n    </tr>\n  
</thead>\n  <tbody>\n    <tr>\n      <td>0</td>\n      <td>0.058955</td>\n      <td>0.455071</td>\n      <td>0.971549</td>\n      <td>10:35</td>\n    </tr>\n    <tr>\n      <td>1</td>\n      <td>0.053523</td>\n      <td>0.454931</td>\n      <td>0.972103</td>\n      <td>10:12</td>\n    </tr>\n  </tbody>\n</table>"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.predict('she is so sweet')","execution_count":66,"outputs":[{"output_type":"execute_result","execution_count":66,"data":{"text/plain":"(MultiCategory ,\n tensor([0., 0., 0., 0., 0., 0.]),\n tensor([0.4081, 0.0108, 0.0850, 0.0199, 0.1322, 0.0290]))"},"metadata":{}}]},{"metadata":{"trusted":true},"cell_type":"code","source":"learn.predict('you are son of a bitch ')","execution_count":67,"outputs":[{"output_type":"execute_result","execution_count":67,"data":{"text/plain":"(MultiCategory toxic;severe_toxic;obscene;insult,\n tensor([1., 1., 1., 0., 1., 0.]),\n tensor([0.9990, 0.7699, 0.9998, 0.0061, 0.9787, 0.1059]))"},"metadata":{}}]}],"metadata":{"kernelspec":{"display_name":"Python 3","language":"python","name":"python3"},"language_info":{"name":"python","version":"3.6.4","mimetype":"text/x-python","codemirror_mode":{"name":"ipython","version":3},"pygments_lexer":"ipython3","nbconvert_exporter":"python","file_extension":".py"}},"nbformat":4,"nbformat_minor":1}