{
  "nbformat": 4,
  "nbformat_minor": 0,
  "metadata": {
    "colab": {
      "name": "modified.ipynb",
      "provenance": [],
      "collapsed_sections": [],
      "toc_visible": true
    },
    "kernelspec": {
      "name": "python3",
      "display_name": "Python 3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.7.3"
    },
    "accelerator": "GPU"
  },
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "weBEWAcEoJJd",
        "colab_type": "text"
      },
      "source": [
        "# Entity Relation Extraction using R-BERT\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "8D3AWBohiIUw",
        "colab_type": "text"
      },
      "source": [
        "In this notebook, entity relations are extracted from medical data about diseases.\n",
        "BERT is customized for the task using methods from the following paper:\n",
        "\n",
        "Enriching Pre-trained Language Model with Entity Information for Relation Classification, https://arxiv.org/abs/1905.08284\n",
        "\n",
        "The implementation is adapted from https://github.com/wang-h/bert-relation-classification\n",
        "\n",
        "The resulting R-BERT model is fine-tuned on our data and then used to predict which relations hold between the entities in our test set.\n",
        "\n",
        "Data: https://www.kaggle.com/kmader/figure-eight-medical-sentence-summary"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "RFrk6sia_lUO"
      },
      "source": [
        "Table of contents\n",
        "\n",
        "1. Install dependencies, import modules and load helper functions.\n",
        "2. Read the data as features.\n",
        "3. Convert features to tensors.\n",
        "4. Change the BERT class for relation extraction.\n",
        "5. Prepare and load the model.\n",
        "6. Train the model!\n",
        "7. Load the trained model.\n",
        "8. Evaluate!"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "gFV0nYifAVSp",
        "outputId": "ee68bf9d-0d0a-4fc0-c438-e2bf275d92a2",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 33
        }
      },
      "source": [
        "# Mount Google Drive to access the data:\n",
        "from google.colab import drive\n",
        "drive.mount('/content/gdrive')"
      ],
      "execution_count": 98,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Drive already mounted at /content/gdrive; to attempt to forcibly remount, call drive.mount(\"/content/gdrive\", force_remount=True).\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "xW2lQw7Ueodx",
        "colab_type": "text"
      },
      "source": [
        "# 1. Install dependencies, import modules and load helper functions\n"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "wl6msvLX_kDS",
        "outputId": "d85ffe96-e0e7-4d56-acce-362c0ffbe3cb",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 590
        }
      },
      "source": [
        "!pip install pytorch-transformers"
      ],
      "execution_count": 8,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Collecting pytorch-transformers\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/a3/b7/d3d18008a67e0b968d1ab93ad444fc05699403fa662f634b2f2c318a508b/pytorch_transformers-1.2.0-py3-none-any.whl (176kB)\n",
            "\u001b[?25hRequirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from pytorch-transformers) (2.21.0)\n",
            "Requirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (from pytorch-transformers) (4.28.1)\n",
            "Collecting sacremoses\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/a6/b4/7a41d630547a4afd58143597d5a49e07bfd4c42914d8335b2a5657efc14b/sacremoses-0.0.38.tar.gz (860kB)\n",
            "\u001b[K     |████████████████████████████████| 870kB 9.7MB/s \n",
            "\u001b[?25hCollecting sentencepiece\n",
            "\u001b[?25l  Downloading https://files.pythonhosted.org/packages/74/f4/2d5214cbf13d06e7cb2c20d84115ca25b53ea76fa1f0ade0e3c9749de214/sentencepiece-0.1.85-cp36-cp36m-manylinux1_x86_64.whl (1.0MB)\n",
            "\u001b[K     |████████████████████████████████| 1.0MB 19.5MB/s \n",
            "\u001b[?25hRequirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from pytorch-transformers) (1.17.5)\n",
            "Requirement already satisfied: regex in /usr/local/lib/python3.6/dist-packages (from pytorch-transformers) (2019.12.20)\n",
            "Requirement already satisfied: boto3 in /usr/local/lib/python3.6/dist-packages (from pytorch-transformers) (1.11.15)\n",
            "Requirement already satisfied: torch>=1.0.0 in /usr/local/lib/python3.6/dist-packages (from pytorch-transformers) (1.4.0)\n",
            "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->pytorch-transformers) (3.0.4)\n",
            "Requirement already satisfied: idna<2.9,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->pytorch-transformers) (2.8)\n",
            "Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests->pytorch-transformers) (2019.11.28)\n",
            "Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests->pytorch-transformers) (1.24.3)\n",
            "Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from sacremoses->pytorch-transformers) (1.12.0)\n",
            "Requirement already satisfied: click in /usr/local/lib/python3.6/dist-packages (from sacremoses->pytorch-transformers) (7.0)\n",
            "Requirement already satisfied: joblib in /usr/local/lib/python3.6/dist-packages (from sacremoses->pytorch-transformers) (0.14.1)\n",
            "Requirement already satisfied: jmespath<1.0.0,>=0.7.1 in /usr/local/lib/python3.6/dist-packages (from boto3->pytorch-transformers) (0.9.4)\n",
            "Requirement already satisfied: s3transfer<0.4.0,>=0.3.0 in /usr/local/lib/python3.6/dist-packages (from boto3->pytorch-transformers) (0.3.3)\n",
            "Requirement already satisfied: botocore<1.15.0,>=1.14.15 in /usr/local/lib/python3.6/dist-packages (from boto3->pytorch-transformers) (1.14.15)\n",
            "Requirement already satisfied: docutils<0.16,>=0.10 in /usr/local/lib/python3.6/dist-packages (from botocore<1.15.0,>=1.14.15->boto3->pytorch-transformers) (0.15.2)\n",
            "Requirement already satisfied: python-dateutil<3.0.0,>=2.1 in /usr/local/lib/python3.6/dist-packages (from botocore<1.15.0,>=1.14.15->boto3->pytorch-transformers) (2.6.1)\n",
            "Building wheels for collected packages: sacremoses\n",
            "  Building wheel for sacremoses (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
            "  Created wheel for sacremoses: filename=sacremoses-0.0.38-cp36-none-any.whl size=884628 sha256=3227afdc69a992e5a50024e00361e86f8f43110980854bf8818194c2a488d367\n",
            "  Stored in directory: /root/.cache/pip/wheels/6d/ec/1a/21b8912e35e02741306f35f66c785f3afe94de754a0eaf1422\n",
            "Successfully built sacremoses\n",
            "Installing collected packages: sacremoses, sentencepiece, pytorch-transformers\n",
            "Successfully installed pytorch-transformers-1.2.0 sacremoses-0.0.38 sentencepiece-0.1.85\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "BcFPYsPF_kWM",
        "colab": {}
      },
      "source": [
        "# Classes for storing individual sentences:\n",
        "\n",
        "class InputExample(object):\n",
        "    \"\"\"A single training/test example for simple sequence classification.\"\"\"\n",
        "\n",
        "    def __init__(self, guid, text_a, text_b=None, label=None):\n",
        "        \"\"\"Constructs an InputExample.\n",
        "\n",
        "        Args:\n",
        "            guid: Unique id for the example.\n",
        "            text_a: string. The untokenized text of the first sequence. For single\n",
        "            sequence tasks, only this sequence must be specified.\n",
        "            text_b: (Optional) string. The untokenized text of the second sequence.\n",
        "            Only must be specified for sequence pair tasks.\n",
        "            label: (Optional) string. The label of the example. This should be\n",
        "            specified for train and dev examples, but not for test examples.\n",
        "        \"\"\"\n",
        "        self.guid = guid\n",
        "        self.text_a = text_a\n",
        "        self.text_b = text_b\n",
        "        self.label = label\n",
        "\n",
        "class InputFeatures(object):\n",
        "    \"\"\"A single set of features of data.\"\"\"\n",
        "\n",
        "    def __init__(self,\n",
        "                 input_ids,\n",
        "                 input_mask,\n",
        "                 e11_p, e12_p, e21_p, e22_p,\n",
        "                 e1_mask, e2_mask,\n",
        "                 segment_ids,\n",
        "                 label_id):\n",
        "        self.input_ids = input_ids\n",
        "        self.input_mask = input_mask\n",
        "        self.segment_ids = segment_ids\n",
        "        self.label_id = label_id\n",
        "\n",
        "        # add entity positions and entity masks for BERT\n",
        "        self.e11_p = e11_p\n",
        "        self.e12_p = e12_p\n",
        "        self.e21_p = e21_p\n",
        "        self.e22_p = e22_p\n",
        "        self.e1_mask = e1_mask\n",
        "        self.e2_mask = e2_mask\n",
        "        \n",
        "    def print_contents(self):\n",
        "        print(self.input_ids,self.input_mask,self.segment_ids, self.label_id,\n",
        "        self.e11_p,self.e12_p,self.e21_p,\n",
        "        self.e22_p,self.e1_mask, self.e2_mask)"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "Ri0281T7AVD3",
        "colab": {}
      },
      "source": [
        "# Functions for reading in the data:\n",
        "\n",
        "import csv\n",
        "import os\n",
        "import logging\n",
        "\n",
        "logger = logging.getLogger(__name__)\n",
        "\n",
        "def read_tsv(input_file, quotechar=None):\n",
        "    \"\"\"Reads a tab-separated values file.\"\"\"\n",
        "    with open(input_file, \"r\", encoding=\"utf-8-sig\") as f:\n",
        "        reader = csv.reader(f, delimiter=\"\\t\", quotechar=quotechar)\n",
        "        return list(reader)\n",
        "\n",
        "def create_examples(lines, set_type):\n",
        "    \"\"\"Creates examples for the training and test sets.\n",
        "  \n",
        "    $AZATHIOPRINE$ is an immunosuppressive drug that is used to treat #RHEUMATOID ARTHRITIS#\t8\ttreats2\ttreats1\t2\n",
        "    \n",
        "    $ denotes the first entity, # denotes the second entity, 8 denotes the relation type and 2 denotes the direction.\n",
        "    \"\"\"\n",
        "    examples = []\n",
        "    for (i, line) in enumerate(lines):\n",
        "\n",
        "        guid = \"%s-%s\" % (set_type, i)\n",
        "        logger.info(line)\n",
        "        text_a = line[1]\n",
        "        text_b = None\n",
        "        label = line[2]\n",
        "        examples.append(\n",
        "            InputExample(guid=guid, text_a=text_a, text_b=text_b, label=label))\n",
        "    return examples\n",
        "\n",
        "def get_train_examples(data_dir):\n",
        "    logger.info(\"LOOKING AT {}\".format(\n",
        "        os.path.join(data_dir, \"train.tsv\")))\n",
        "    return create_examples(\n",
        "        read_tsv(os.path.join(data_dir, \"train.tsv\")), \"train\")\n",
        "    \n",
        "\n",
        "def get_test_examples(data_dir):\n",
        "    return create_examples(\n",
        "        read_tsv(os.path.join(data_dir, \"test.tsv\")), \"test\")"
      ],
      "execution_count": 0,
      "outputs": []
    },
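    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a sketch of how `create_examples` indexes each parsed row: only the column positions (`line[1]` for the text, `line[2]` for the label) come from the code above; the row values and the leading id column here are hypothetical.\n",
        "\n",
        "```python\n",
        "# Hypothetical parsed TSV row (values invented for illustration):\n",
        "line = ['0',\n",
        "        '$AZATHIOPRINE$ is used to treat #RHEUMATOID ARTHRITIS#',\n",
        "        '8', 'treats2', 'treats1', '2']\n",
        "text_a = line[1]\n",
        "label = line[2]\n",
        "print(label)  # 8\n",
        "```"
      ]
    },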
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "EzO2a_dkeod-",
        "colab_type": "text"
      },
      "source": [
        "# 2. Read in the data and convert to features"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "oHu_IJdsAtul",
        "outputId": "3768b49f-ba78-484f-d170-ad99ffa0fc72",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 33
        }
      },
      "source": [
        "from pytorch_transformers import WEIGHTS_NAME, BertConfig, BertTokenizer\n",
        "\n",
        "# Configuration parameters:\n",
        "use_entity_indicator=True\n",
        "max_seq_len=176\n",
        "\n",
        "tokenizer = BertTokenizer.from_pretrained(\n",
        "        'bert-base-uncased', do_lower_case=True)\n",
        "\n",
        "n_labels = 18\n",
        "labels = [str(i) for i in range(n_labels)]\n"
      ],
      "execution_count": 31,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "100%|██████████| 231508/231508 [00:00<00:00, 2746269.35B/s]\n"
          ],
          "name": "stderr"
        }
      ]
    },
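    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "For reference, these label strings map back to integer ids via the `label_map` dictionary built inside `convert_examples_to_features` below; a minimal sketch:\n",
        "\n",
        "```python\n",
        "# Same construction as in convert_examples_to_features:\n",
        "labels = [str(i) for i in range(18)]\n",
        "label_map = {label: i for i, label in enumerate(labels)}\n",
        "print(label_map['8'])  # 8\n",
        "```"
      ]
    },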
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "Z2x4LEfqAz4c",
        "colab": {}
      },
      "source": [
        "# Convert the input examples to features in the form required by BERT:\n",
        "def convert_examples_to_features(examples, label_list, max_seq_len,\n",
        "                                 tokenizer,\n",
        "                                 cls_token='[CLS]',\n",
        "                                 cls_token_segment_id=1,\n",
        "                                 sep_token='[SEP]',\n",
        "                                 pad_token=0,\n",
        "                                 pad_token_segment_id=0,\n",
        "                                 sequence_a_segment_id=0,\n",
        "                                 sequence_b_segment_id=1,\n",
        "                                 mask_padding_with_zero=True):\n",
        "    '''In: sentences with entities marked by $...$ and #...#.\n",
        "       Out: each sentence represented as an InputFeatures object.'''\n",
        "\n",
        "    label_map = {label: i for i, label in enumerate(label_list)}\n",
        "\n",
        "    features = []\n",
        "    for (ex_index, example) in enumerate(examples):\n",
        "        if ex_index % 10000 == 0:\n",
        "            logger.info(\"Writing example %d of %d\" % (ex_index, len(examples)))\n",
        "\n",
        "        tokens_a = tokenizer.tokenize(example.text_a)\n",
        "        \n",
        "        # convert the entity information to features as well\n",
        "        l = len(tokens_a)\n",
        "\n",
        "        # the start position of entity1 (+1 accounts for the [CLS] token prepended below):\n",
        "        e11_p = tokens_a.index(\"#\") + 1\n",
        "        # the end position of entity1:\n",
        "        e12_p = l - tokens_a[::-1].index(\"#\") + 1\n",
        "        # the start position of entity2:\n",
        "        e21_p = tokens_a.index(\"$\") + 1\n",
        "        # the end position of entity2:\n",
        "        e22_p = l - tokens_a[::-1].index(\"$\") + 1\n",
        "\n",
        "        tokens_b = None\n",
        "\n",
        "        if example.text_b:\n",
        "            tokens_b = tokenizer.tokenize(example.text_b)\n",
        "            # Modifies `tokens_a` and `tokens_b` in place so that the total\n",
        "            # length is less than the specified length.\n",
        "            # Account for [CLS], [SEP], [SEP] with \"- 3\".\n",
        "            special_tokens_count = 3\n",
        "            _truncate_seq_pair(tokens_a, tokens_b,\n",
        "                               max_seq_len - special_tokens_count)\n",
        "        else:\n",
        "            # Account for [CLS] and [SEP] with \"- 2\".\n",
        "            special_tokens_count = 2\n",
        "            if len(tokens_a) > max_seq_len - special_tokens_count:\n",
        "                tokens_a = tokens_a[:(max_seq_len - special_tokens_count)]\n",
        "\n",
        "        # The convention in BERT is:\n",
        "        # (a) For sequence pairs:\n",
        "        #  tokens:   [CLS] is this jack ##son ##ville ? [SEP] no it is not . [SEP]\n",
        "        #  type_ids:   0   0  0    0    0     0       0   0   1  1  1  1   1   1\n",
        "        # (b) For single sequences:\n",
        "        #  tokens:   [CLS] the dog is hairy . [SEP]\n",
        "        #  type_ids:   0   0   0   0  0     0   0\n",
        "        #\n",
        "        # Where \"type_ids\" are used to indicate whether this is the first\n",
        "        # sequence or the second sequence. The embedding vectors for `type=0` and\n",
        "        # `type=1` were learned during pre-training and are added to the wordpiece\n",
        "        # embedding vector (and position vector). This is not *strictly* necessary\n",
        "        # since the [SEP] token unambiguously separates the sequences, but it makes\n",
        "        # it easier for the model to learn the concept of sequences.\n",
        "        #\n",
        "        # For classification tasks, the first vector (corresponding to [CLS]) is\n",
        "        # used as the \"sentence vector\". Note that this only makes sense because\n",
        "        # the entire model is fine-tuned.\n",
        "        tokens = tokens_a + [sep_token]\n",
        "        segment_ids = [sequence_a_segment_id] * len(tokens)\n",
        "\n",
        "        if tokens_b:\n",
        "            tokens += tokens_b + [sep_token]\n",
        "            segment_ids += [sequence_b_segment_id] * (len(tokens_b) + 1)\n",
        "\n",
        "        tokens = [cls_token] + tokens\n",
        "        segment_ids = [cls_token_segment_id] + segment_ids\n",
        "\n",
        "        input_ids = tokenizer.convert_tokens_to_ids(tokens)\n",
        "\n",
        "        # The mask has 1 for real tokens and 0 for padding tokens. Only real\n",
        "        # tokens are attended to.\n",
        "        input_mask = [1 if mask_padding_with_zero else 0] * len(input_ids)\n",
        "\n",
        "        # Zero-pad up to the sequence length.\n",
        "        padding_length = max_seq_len - len(input_ids)\n",
        "        input_ids = input_ids + ([pad_token] * padding_length)\n",
        "        input_mask = input_mask + \\\n",
        "                     ([0 if mask_padding_with_zero else 1] * padding_length)\n",
        "        segment_ids = segment_ids + \\\n",
        "                      ([pad_token_segment_id] * padding_length)\n",
        "\n",
        "        # add attention masks for the entities as well\n",
        "        e1_mask = [0] * len(input_mask)\n",
        "        e2_mask = [0] * len(input_mask)\n",
        "\n",
        "        for i in range(e11_p, e12_p):\n",
        "            e1_mask[i] = 1\n",
        "        for i in range(e21_p, e22_p):\n",
        "            e2_mask[i] = 1\n",
        "\n",
        "        assert len(input_ids) == max_seq_len\n",
        "        assert len(input_mask) == max_seq_len\n",
        "        assert len(segment_ids) == max_seq_len\n",
        "\n",
        "        label_id = int(example.label)\n",
        "\n",
        "        if ex_index < 5:\n",
        "            logger.info(\"*** Example ***\")\n",
        "            logger.info(\"guid: %s\" % (example.guid))\n",
        "            logger.info(\"tokens: %s\" % \" \".join(\n",
        "                [str(x) for x in tokens]))\n",
        "            logger.info(\"input_ids: %s\" %\n",
        "                        \" \".join([str(x) for x in input_ids]))\n",
        "            logger.info(\"input_mask: %s\" %\n",
        "                        \" \".join([str(x) for x in input_mask]))\n",
        "            if use_entity_indicator:\n",
        "                logger.info(\"e11_p: %s\" % e11_p)\n",
        "                logger.info(\"e12_p: %s\" % e12_p)\n",
        "                logger.info(\"e21_p: %s\" % e21_p)\n",
        "                logger.info(\"e22_p: %s\" % e22_p)\n",
        "                logger.info(\"e1_mask: %s\" %\n",
        "                            \" \".join([str(x) for x in e1_mask]))\n",
        "                logger.info(\"e2_mask: %s\" %\n",
        "                            \" \".join([str(x) for x in e2_mask]))\n",
        "            logger.info(\"segment_ids: %s\" %\n",
        "                        \" \".join([str(x) for x in segment_ids]))\n",
        "            logger.info(\"label: %s (id = %d)\" % (example.label, label_id))\n",
        "\n",
        "        features.append( InputFeatures(input_ids=input_ids,input_mask=input_mask,e11_p=e11_p,e12_p=e12_p, e21_p=e21_p, e22_p=e22_p,\n",
        "                          e1_mask=e1_mask,e2_mask=e2_mask, segment_ids=segment_ids,label_id=label_id))\n",
        "    return features"
      ],
      "execution_count": 0,
      "outputs": []
    },
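    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "To make the index arithmetic above concrete, here is a toy token list (the wordpieces are hypothetical). The `+ 1` shifts every position past the `[CLS]` token prepended later, so the masks built from these positions cover the entity tokens together with their surrounding markers.\n",
        "\n",
        "```python\n",
        "# Toy wordpiece list with '#' around entity1 and '$' around entity2:\n",
        "tokens_a = ['#', 'aza', '##thio', '##prine', '#', 'treats', '$', 'arthritis', '$']\n",
        "l = len(tokens_a)\n",
        "e11_p = tokens_a.index('#') + 1            # 1: first '#', shifted past [CLS]\n",
        "e12_p = l - tokens_a[::-1].index('#') + 1  # 6: one past the closing '#'\n",
        "e21_p = tokens_a.index('$') + 1            # 7\n",
        "e22_p = l - tokens_a[::-1].index('$') + 1  # 10\n",
        "print(e11_p, e12_p, e21_p, e22_p)  # 1 6 7 10\n",
        "```"
      ]
    },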
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "GtvdYGP0_km6",
        "colab": {}
      },
      "source": [
        "import os\n",
        "\n",
        "# Get the training data from the data folder, hosted on google drive:\n",
        "data_folder = '/content/gdrive/My Drive/Colab Notebooks/data/'\n",
        "examples = get_train_examples(data_folder)\n",
        "features = convert_examples_to_features(\n",
        "    examples, labels, max_seq_len, tokenizer)"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text",
        "id": "D5bSF9YdpxVh"
      },
      "source": [
        "Convert the features to tensors and build a tensor dataset"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "2zF2wIkNI1eO",
        "colab": {}
      },
      "source": [
        "import torch \n",
        "from torch.utils.data import DataLoader, RandomSampler, SequentialSampler,TensorDataset\n",
        "\n",
        "all_input_ids = torch.tensor(\n",
        "        [f.input_ids for f in features], dtype=torch.long)\n",
        "all_input_mask = torch.tensor(\n",
        "    [f.input_mask for f in features], dtype=torch.long)\n",
        "all_segment_ids = torch.tensor(\n",
        "    [f.segment_ids for f in features], dtype=torch.long)\n",
        "\n",
        "# also build masks for the entities:\n",
        "all_e1_mask = torch.tensor(\n",
        "    [f.e1_mask for f in features], dtype=torch.long)\n",
        "all_e2_mask = torch.tensor(\n",
        "    [f.e2_mask for f in features], dtype=torch.long) \n",
        "\n",
        "all_label_ids = torch.tensor(\n",
        "        [f.label_id for f in features], dtype=torch.long)\n",
        "\n",
        "dataset = TensorDataset(all_input_ids, all_input_mask,\n",
        "                            all_segment_ids, all_label_ids, all_e1_mask, all_e2_mask)"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "ocsdh3vD1bow",
        "colab_type": "text"
      },
      "source": [
        "# 3. Preparing the model"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "Rfyz43huJHaz",
        "colab": {}
      },
      "source": [
        "# Configuration parameters:\n",
        "\n",
        "# batch size (low to save memory):\n",
        "per_gpu_train_batch_size = 4\n",
        "n_gpu = torch.cuda.device_count()\n",
        "\n",
        "# the base BERT model (smaller, to save memory)\n",
        "pretrained_model_name='bert-base-uncased'\n",
        "\n",
        "# parameters for gradient descent:\n",
        "max_steps=-1\n",
        "gradient_accumulation_steps=1 \n",
        "\n",
        "# Number of training epochs:\n",
        "num_train_epochs=5.0\n",
        "\n",
        "# Name of task for Bert:\n",
        "task_name = 'semeval'\n",
        "\n",
        "# hyperparameter for regularization\n",
        "l2_reg_lambda=5e-3\n",
        "local_rank=-1\n",
        "no_cuda=False\n",
        "\n",
        "train_batch_size = per_gpu_train_batch_size * \\\n",
        "        max(1, n_gpu)\n",
        "\n",
        "# For sampling during the training:\n",
        "train_sampler = RandomSampler(dataset)\n",
        "train_dataloader = DataLoader(\n",
        "        dataset, sampler=train_sampler, batch_size=train_batch_size)\n",
        "\n",
        "# total number of steps for training:\n",
        "t_total = len(train_dataloader) // gradient_accumulation_steps * num_train_epochs"
      ],
      "execution_count": 0,
      "outputs": []
    },
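    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "A worked example of the step count above, with a hypothetical training-set size:\n",
        "\n",
        "```python\n",
        "# 4000 examples at batch size 4 (one GPU) -> 1000 batches per epoch;\n",
        "# no gradient accumulation, 5.0 epochs -> 5000.0 optimization steps.\n",
        "num_batches = 4000 // 4\n",
        "t_total = num_batches // 1 * 5.0\n",
        "print(t_total)  # 5000.0\n",
        "```"
      ]
    },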
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "D_UwqeLDhig7",
        "colab_type": "text"
      },
      "source": [
        "# 4. Load the Bert customized for relation extraction (R-Bert)"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "ZkOIYVQ-JTzL",
        "colab": {}
      },
      "source": [
        "import torch.nn as nn\n",
        "import torch.nn.functional as F\n",
        "from pytorch_transformers import (BertModel, BertPreTrainedModel, BertTokenizer)\n",
        "from torch.nn import MSELoss, CrossEntropyLoss\n",
        "\n",
        "def l2_loss(parameters):\n",
        "  '''Half the squared Euclidean norm of all trainable parameters.'''\n",
        "  # Sum the tensors directly; wrapping them in torch.tensor() would\n",
        "  # detach them from the graph and break the gradient flow.\n",
        "  return sum(torch.sum(p ** 2) / 2 for p in parameters if p.requires_grad)\n",
        "\n",
        "\n",
        "# BERT sequence-classification class, customized for relation extraction (R-BERT)\n",
        "class BertForSequenceClassification(BertPreTrainedModel):\n",
        "    \"\"\"\n",
        "        **labels**: (`optional`) ``torch.LongTensor`` of shape ``(batch_size,)``:\n",
        "            Labels for computing the sequence classification/regression loss.\n",
        "            Indices should be in ``[0, ..., config.num_labels - 1]``.\n",
        "            If ``config.num_labels == 1`` a regression loss is computed (Mean-Square loss),\n",
        "            If ``config.num_labels > 1`` a classification loss is computed (Cross-Entropy).\n",
        "\n",
        "    Outputs: `Tuple` comprising various elements depending on the configuration (config) and inputs:\n",
        "        **loss**: (`optional`, returned when ``labels`` is provided) ``torch.FloatTensor`` of shape ``(1,)``:\n",
        "            Classification (or regression if config.num_labels==1) loss.\n",
        "        **logits**: ``torch.FloatTensor`` of shape ``(batch_size, config.num_labels)``\n",
        "            Classification (or regression if config.num_labels==1) scores (before SoftMax).\n",
        "        **hidden_states**: (`optional`, returned when ``config.output_hidden_states=True``)\n",
        "            list of ``torch.FloatTensor`` (one for the output of each layer + the output of the embeddings)\n",
        "            of shape ``(batch_size, sequence_length, hidden_size)``:\n",
        "            Hidden-states of the model at the output of each layer plus the initial embedding outputs.\n",
        "        **attentions**: (`optional`, returned when ``config.output_attentions=True``)\n",
        "            list of ``torch.FloatTensor`` (one for each layer) of shape ``(batch_size, num_heads, sequence_length, sequence_length)``:\n",
        "            Attentions weights after the attention softmax, used to compute the weighted average in the self-attention heads.\n",
        "\n",
        "    Examples::\n",
        "\n",
        "        tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')\n",
        "        model = BertForSequenceClassification.from_pretrained(\n",
        "            'bert-base-uncased')\n",
        "        input_ids = torch.tensor(tokenizer.encode(\n",
        "            \"Hello, my dog is cute\")).unsqueeze(0)  # Batch size 1\n",
        "        labels = torch.tensor([1]).unsqueeze(0)  # Batch size 1\n",
        "        outputs = model(input_ids, labels=labels)\n",
        "        loss, logits = outputs[:2]\n",
        "\n",
        "    \"\"\"\n",
        "\n",
        "    def __init__(self, config):\n",
        "        super(BertForSequenceClassification, self).__init__(config)\n",
        "        self.num_labels = config.num_labels\n",
        "        self.l2_reg_lambda = config.l2_reg_lambda\n",
        "        self.bert = BertModel(config)\n",
        "        self.latent_entity_typing = config.latent_entity_typing\n",
        "        self.dropout = nn.Dropout(config.hidden_dropout_prob)\n",
        "        classifier_size = config.hidden_size*3\n",
        "        self.classifier = nn.Linear(\n",
        "            classifier_size, self.config.num_labels)\n",
        "        self.latent_size = config.hidden_size\n",
        "        self.latent_type = nn.Parameter(torch.FloatTensor(\n",
        "            3, config.hidden_size), requires_grad=True)\n",
        "\n",
        "        self.init_weights()\n",
        "\n",
        "    # Customized forward step for relation extraction.\n",
        "    # Performs the extra steps described in the paper:\n",
        "    # Enriching Pre-trained Language Model with Entity Information for Relation Classification https://arxiv.org/abs/1905.08284.\n",
        "\n",
        "    def forward(self, input_ids, token_type_ids=None, attention_mask=None, e1_mask=None, e2_mask=None, labels=None,\n",
        "                position_ids=None, head_mask=None):\n",
        "\n",
        "        outputs = self.bert(input_ids, position_ids=position_ids, token_type_ids=token_type_ids,\n",
        "                            attention_mask=attention_mask, head_mask=head_mask)\n",
        "        pooled_output = outputs[1]\n",
        "        sequence_output = outputs[0]\n",
        "\n",
        "        def extract_entity(sequence_output, e_mask):\n",
        "            # Multiply the 0/1 token mask with the hidden states, pooling the\n",
        "            # hidden vectors of the entity's tokens into a single vector.\n",
        "            extended_e_mask = e_mask.unsqueeze(1)\n",
        "            extended_e_mask = torch.bmm(\n",
        "                extended_e_mask.float(), sequence_output).squeeze(1)\n",
        "            return extended_e_mask.float()\n",
        "\n",
        "        e1_h = extract_entity(sequence_output, e1_mask)\n",
        "        e2_h = extract_entity(sequence_output, e2_mask)\n",
        "        context = self.dropout(pooled_output)\n",
        "        pooled_output = torch.cat([context, e1_h, e2_h], dim=-1)\n",
        "\n",
        "        # Extra classification layer on top of BERT, in order to do relation extraction:\n",
        "        logits = self.classifier(pooled_output)\n",
        "\n",
        "        # Add hidden states and attentions to the outputs tuple:\n",
        "        outputs = (logits,) + outputs[2:]\n",
        "\n",
        "        device = logits.get_device()\n",
        "\n",
        "        # L2 regularization term over all model parameters:\n",
        "        l2 = l2_loss(self.parameters())\n",
        "\n",
        "        if device >= 0:\n",
        "            l2 = l2.to(device)\n",
        "        loss = l2 * self.l2_reg_lambda\n",
        "        if labels is not None:\n",
        "\n",
        "            # Transform logits into probabilities between 0 and 1:\n",
        "            probabilities = F.softmax(logits, dim=-1)\n",
        "            log_probs = F.log_softmax(logits, dim=-1)\n",
        "\n",
        "            # One-hot encode the labels:\n",
        "            one_hot_labels = F.one_hot(labels, num_classes=self.num_labels)\n",
        "            if device >= 0:\n",
        "                one_hot_labels = one_hot_labels.to(device)\n",
        "\n",
        "            # Cross-entropy over the non-'Other' classes; the min picks out\n",
        "            # the gold label's log-probability (or 0 when the gold label is 'Other'):\n",
        "            dist = one_hot_labels[:, 1:].float() * log_probs[:, 1:]\n",
        "            example_loss_except_other, _ = dist.min(dim=-1)\n",
        "            per_example_loss = - example_loss_except_other.mean()\n",
        "\n",
        "            # Penalize the highest-scoring wrong class (excluding 'Other'):\n",
        "            rc_probabilities = probabilities - probabilities * one_hot_labels.float()\n",
        "            second_pre, _ = rc_probabilities[:, 1:].max(dim=-1)\n",
        "            rc_loss = - (1 - second_pre).log().mean()\n",
        "\n",
        "            loss += per_example_loss + 5 * rc_loss\n",
        "\n",
        "            outputs = (loss,) + outputs\n",
        "\n",
        "        return outputs  # (loss), logits, (hidden_states), (attentions)"
      ],
      "execution_count": 0,
      "outputs": []
    },
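    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "aToyMaskDemo"
      },
      "source": [
        "# Sketch (not part of the original pipeline): how the entity masks used in\n",
        "# the custom forward step pool token vectors. With a 0/1 mask, the batched\n",
        "# matrix product sums the hidden states of the entity's tokens.\n",
        "import torch\n",
        "\n",
        "seq = torch.tensor([[[1., 0.], [2., 0.], [3., 0.], [4., 0.]]])  # (1, 4, 2)\n",
        "e_mask = torch.tensor([[0., 1., 1., 0.]])  # entity spans tokens 1-2\n",
        "pooled = torch.bmm(e_mask.unsqueeze(1), seq).squeeze(1)\n",
        "print(pooled)  # tensor([[5., 0.]]) -- sum of token vectors 1 and 2"
      ],
      "execution_count": 0,
      "outputs": []
    },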
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "7hOXeBlmJyUL",
        "outputId": "4692488e-7115-4a35-be08-b938aefa63a5",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 50
        }
      },
      "source": [
        "# Build the configuration object for the model:\n",
        "bertconfig = BertConfig.from_pretrained(\n",
        "        pretrained_model_name, num_labels=n_labels, finetuning_task=task_name)\n",
        "\n",
        "bertconfig.l2_reg_lambda = l2_reg_lambda\n",
        "bertconfig.latent_entity_typing = False\n",
        "bertconfig.num_classes = n_labels\n",
        "\n",
        "# Load the model:\n",
        "model = BertForSequenceClassification.from_pretrained(\n",
        "        pretrained_model_name, config=bertconfig)"
      ],
      "execution_count": 41,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "100%|██████████| 361/361 [00:00<00:00, 101158.72B/s]\n",
            "100%|██████████| 440473133/440473133 [00:06<00:00, 70873575.69B/s]\n"
          ],
          "name": "stderr"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "_-8S8HXOhxy9",
        "colab_type": "text"
      },
      "source": [
        "# 5. Get ready for training"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "7V6XGt3nK8aR",
        "colab": {}
      },
      "source": [
        "# Prepare optimizer and schedule (linear warmup and decay)\n",
        "\n",
        "from pytorch_transformers import AdamW, WarmupLinearSchedule\n",
        "\n",
        "# Hyperparameters for the optimizer:\n",
        "max_grad_norm = 1.0\n",
        "learning_rate = 2e-5\n",
        "adam_epsilon = 1e-8\n",
        "warmup_steps = 0\n",
        "weight_decay = 0.9\n",
        "\n",
        "\n",
        "no_decay = ['bias', 'LayerNorm.weight']\n",
        "optimizer_grouped_parameters = [\n",
        "    {'params': [p for n, p in model.named_parameters()\n",
        "                if not any(nd in n for nd in no_decay)], 'weight_decay': weight_decay},\n",
        "    {'params': [p for n, p in model.named_parameters()\n",
        "                if any(nd in n for nd in no_decay)], 'weight_decay': 0.0}\n",
        "]\n",
        "\n",
        "# Load optimizer and scheduler:\n",
        "optimizer = AdamW(optimizer_grouped_parameters,\n",
        "                  lr=learning_rate, eps=adam_epsilon)\n",
        "scheduler = WarmupLinearSchedule(\n",
        "    optimizer, warmup_steps=warmup_steps, t_total=t_total)\n",
        "\n",
        "# Parallelize in case we have multiple GPUs:\n",
        "if n_gpu > 1:\n",
        "    model = torch.nn.DataParallel(model)"
      ],
      "execution_count": 0,
      "outputs": []
    },
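    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "aNoDecayDemo"
      },
      "source": [
        "# Sketch (purely illustrative, using real BERT parameter names): the\n",
        "# grouping above routes any parameter whose name contains 'bias' or\n",
        "# 'LayerNorm.weight' into the group with weight_decay=0.0.\n",
        "names = ['bert.encoder.layer.0.attention.self.query.weight',\n",
        "         'bert.encoder.layer.0.attention.self.query.bias',\n",
        "         'bert.encoder.layer.0.output.LayerNorm.weight']\n",
        "no_decay = ['bias', 'LayerNorm.weight']\n",
        "decayed = [n for n in names if not any(nd in n for nd in no_decay)]\n",
        "print(decayed)  # only the query weight gets weight decay"
      ],
      "execution_count": 0,
      "outputs": []
    },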
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "v1O5uk_vK8kx",
        "outputId": "b6a4c51a-dd96-4120-bd3b-5951267a51d1",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 33
        }
      },
      "source": [
        "# Prepare for training:\n",
        "from tqdm import tqdm, trange\n",
        "import random\n",
        "import numpy as np\n",
        "\n",
        "# Random seed for reproducibility:\n",
        "def set_seed(seed):\n",
        "    random.seed(seed)\n",
        "    np.random.seed(seed)\n",
        "    torch.manual_seed(seed)\n",
        "    torch.cuda.manual_seed_all(seed)\n",
        "\n",
        "global_step = 0\n",
        "tr_loss, logging_loss = 0.0, 0.0\n",
        "model.zero_grad()\n",
        "train_iterator = trange(int(num_train_epochs),\n",
        "                        desc=\"Epoch\", disable=local_rank not in [-1, 0])\n",
        "\n"
      ],
      "execution_count": 48,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "\rEpoch:   0%|          | 0/5 [00:00<?, ?it/s]"
          ],
          "name": "stderr"
        }
      ]
    },
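    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "aSeedCheck00"
      },
      "source": [
        "# Sketch (optional check): set_seed makes torch's RNG reproducible, so two\n",
        "# draws after reseeding with the same value are identical.\n",
        "set_seed(123456)\n",
        "a = torch.rand(2)\n",
        "set_seed(123456)\n",
        "b = torch.rand(2)\n",
        "print(torch.equal(a, b))  # True\n"
      ],
      "execution_count": 0,
      "outputs": []
    },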
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "sUAKtXGzLWx4",
        "outputId": "2bdf29be-613d-4ec8-f999-e4a382ce01df",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "# Put the model on the device:\n",
        "device = torch.device(\"cuda\" if torch.cuda.is_available() and not no_cuda else \"cpu\")\n",
        "model.to(device)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "BertForSequenceClassification(\n",
              "  (bert): BertModel(\n",
              "    (embeddings): BertEmbeddings(\n",
              "      (word_embeddings): Embedding(30522, 768, padding_idx=0)\n",
              "      (position_embeddings): Embedding(512, 768)\n",
              "      (token_type_embeddings): Embedding(2, 768)\n",
              "      (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "      (dropout): Dropout(p=0.1, inplace=False)\n",
              "    )\n",
              "    (encoder): BertEncoder(\n",
              "      (layer): ModuleList(\n",
              "        (0): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (1): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (2): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (3): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (4): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (5): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (6): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (7): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (8): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (9): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (10): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (11): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "      )\n",
              "    )\n",
              "    (pooler): BertPooler(\n",
              "      (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "      (activation): Tanh()\n",
              "    )\n",
              "  )\n",
              "  (dropout): Dropout(p=0.1, inplace=False)\n",
              "  (classifier): Linear(in_features=2304, out_features=18, bias=True)\n",
              ")"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 39
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "DZf9v1dOh2aD",
        "colab_type": "text"
      },
      "source": [
        "# 6. Train!"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "aAsevK9qLh1A",
        "outputId": "eb241fd9-78c7-4343-832d-8dc05c5f493a",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "# Loop over the training set for several epochs and backpropagate.\n",
        "\n",
        "# Collect the loss values:\n",
        "loss_values = []\n",
        "\n",
        "seed = 123456\n",
        "set_seed(seed)\n",
        "\n",
        "for _ in train_iterator:\n",
        "    epoch_iterator = tqdm(train_dataloader, desc=\"Iteration\",\n",
        "                          disable=local_rank not in [-1, 0])\n",
        "    \n",
        "    # For each epoch,  split into batches and train!\n",
        "\n",
        "    for step, batch in enumerate(epoch_iterator):\n",
        "        model.train()\n",
        "        batch = tuple(t.to(device) for t in batch)\n",
        "        inputs = {'input_ids':      batch[0],\n",
        "                  'attention_mask': batch[1],\n",
        "                  'token_type_ids': batch[2],\n",
        "                  'labels':      batch[3],\n",
        "                  'e1_mask': batch[4],\n",
        "                  'e2_mask': batch[5],\n",
        "                  }\n",
        "\n",
        "        outputs = model(**inputs)\n",
        "        # Model outputs are always tuples in pytorch-transformers;\n",
        "        # the loss comes first when labels are given.\n",
        "        loss = outputs[0]\n",
        "\n",
        "        if n_gpu > 1:\n",
        "            # mean() to average on multi-GPU parallel training\n",
        "            loss = loss.mean()\n",
        "        if gradient_accumulation_steps > 1:\n",
        "            loss = loss / gradient_accumulation_steps\n",
        "\n",
        "        # Collect the loss value; item() detaches it, so the computation\n",
        "        # graph is not kept alive across steps:\n",
        "        loss_values.append(loss.item())\n",
        "        \n",
        "        # Back propagate\n",
        "        loss.backward()\n",
        "        torch.nn.utils.clip_grad_norm_(\n",
        "            model.parameters(), max_grad_norm)\n",
        "\n",
        "        tr_loss += loss.item()\n",
        "        if (step + 1) % gradient_accumulation_steps == 0:\n",
        "\n",
        "            # Take a step! \n",
        "            optimizer.step()\n",
        "            scheduler.step()              \n",
        "            # Update learning rate schedule\n",
        "            model.zero_grad()\n",
        "            global_step += 1\n",
        "\n",
        "        if max_steps > 0 and global_step > max_steps:\n",
        "            # We're done!\n",
        "            epoch_iterator.close()\n",
        "            break\n",
        "    if max_steps > 0 and global_step > max_steps:\n",
        "        # We're done!\n",
        "        train_iterator.close()\n",
        "        break"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "\n",
            "Iteration:   0%|          | 0/403 [00:00<?, ?it/s]\u001b[A\n",
            "Iteration: 100%|██████████| 403/403 [01:47<00:00,  3.86it/s]\u001b[A\n",
            "Epoch:  20%|██        | 1/5 [02:07<08:30, 127.74s/it]\n",
            "Iteration:   0%|          | 0/403 [00:00<?, ?it/s]\u001b[A\n",
            "Iteration:  94%|█████████▍| 380/403 [01:38<00:05,  3.86it/s]\u001b[A\n",
            "Iteration:  95%|█████████▍| 381/403 [01:38<00:05,  3.86it/s]\u001b[A\n",
            "Iteration:  95%|█████████▍| 382/403 [01:39<00:05,  3.86it/s]\u001b[A\n",
            "Iteration:  95%|█████████▌| 383/403 [01:39<00:05,  3.86it/s]\u001b[A\n",
            "Iteration:  95%|█████████▌| 384/403 [01:39<00:04,  3.86it/s]\u001b[A\n",
            "Iteration:  96%|█████████▌| 385/403 [01:39<00:04,  3.87it/s]\u001b[A\n",
            "Iteration:  96%|█████████▌| 386/403 [01:40<00:04,  3.87it/s]\u001b[A\n",
            "Iteration:  96%|█████████▌| 387/403 [01:40<00:04,  3.86it/s]\u001b[A\n",
            "Iteration:  96%|█████████▋| 388/403 [01:40<00:03,  3.85it/s]\u001b[A\n",
            "Iteration:  97%|█████████▋| 389/403 [01:41<00:03,  3.86it/s]\u001b[A\n",
            "Iteration:  97%|█████████▋| 390/403 [01:41<00:03,  3.86it/s]\u001b[A\n",
            "Iteration:  97%|█████████▋| 391/403 [01:41<00:03,  3.85it/s]\u001b[A\n",
            "Iteration:  97%|█████████▋| 392/403 [01:41<00:02,  3.85it/s]\u001b[A\n",
            "Iteration:  98%|█████████▊| 393/403 [01:42<00:02,  3.85it/s]\u001b[A\n",
            "Iteration:  98%|█████████▊| 394/403 [01:42<00:02,  3.85it/s]\u001b[A\n",
            "Iteration:  98%|█████████▊| 395/403 [01:42<00:02,  3.84it/s]\u001b[A\n",
            "Iteration:  98%|█████████▊| 396/403 [01:42<00:01,  3.84it/s]\u001b[A\n",
            "Iteration:  99%|█████████▊| 397/403 [01:43<00:01,  3.84it/s]\u001b[A\n",
            "Iteration:  99%|█████████▉| 398/403 [01:43<00:01,  3.84it/s]\u001b[A\n",
            "Iteration:  99%|█████████▉| 399/403 [01:43<00:01,  3.85it/s]\u001b[A\n",
            "Iteration:  99%|█████████▉| 400/403 [01:43<00:00,  3.84it/s]\u001b[A\n",
            "Iteration: 100%|█████████▉| 401/403 [01:44<00:00,  3.85it/s]\u001b[A\n",
            "Iteration: 100%|█████████▉| 402/403 [01:44<00:00,  3.85it/s]\u001b[A\n",
            "Iteration: 100%|██████████| 403/403 [01:44<00:00,  3.85it/s]\u001b[A\n",
            "Epoch:  40%|████      | 2/5 [03:52<06:02, 120.82s/it]\n",
            "Iteration:   0%|          | 0/403 [00:00<?, ?it/s]\u001b[A\n",
            "Iteration:   0%|          | 1/403 [00:00<01:43,  3.89it/s]\u001b[A\n",
            "Iteration:   0%|          | 2/403 [00:00<01:42,  3.89it/s]\u001b[A\n",
            "Iteration:   1%|          | 3/403 [00:00<01:43,  3.88it/s]\u001b[A\n",
            "Iteration:   1%|          | 4/403 [00:01<01:43,  3.86it/s]\u001b[A\n",
            "Iteration:   1%|          | 5/403 [00:01<01:43,  3.86it/s]\u001b[A\n",
            "Iteration:   1%|▏         | 6/403 [00:01<01:42,  3.86it/s]\u001b[A\n",
            "Iteration:   2%|▏         | 7/403 [00:01<01:42,  3.85it/s]\u001b[A\n",
            "Iteration:   2%|▏         | 8/403 [00:02<01:42,  3.84it/s]\u001b[A\n",
            "Iteration:   2%|▏         | 9/403 [00:02<01:42,  3.84it/s]\u001b[A\n",
            "Iteration:   2%|▏         | 10/403 [00:02<01:42,  3.84it/s]\u001b[A\n",
            "Iteration:   3%|▎         | 11/403 [00:02<01:42,  3.83it/s]\u001b[A\n",
            "Iteration:   3%|▎         | 12/403 [00:03<01:42,  3.81it/s]\u001b[A\n",
            "Iteration:   3%|▎         | 13/403 [00:03<01:42,  3.81it/s]\u001b[A\n",
            "Iteration:   3%|▎         | 14/403 [00:03<01:41,  3.82it/s]\u001b[A\n",
            "Iteration:   4%|▎         | 15/403 [00:03<01:41,  3.83it/s]\u001b[A\n",
            "Iteration:   4%|▍         | 16/403 [00:04<01:40,  3.84it/s]\u001b[A\n",
            "Iteration:   4%|▍         | 17/403 [00:04<01:40,  3.85it/s]\u001b[A\n",
            "Iteration:   4%|▍         | 18/403 [00:04<01:40,  3.85it/s]\u001b[A\n",
            "Iteration:   5%|▍         | 19/403 [00:04<01:39,  3.85it/s]\u001b[A\n",
            "Iteration:   5%|▍         | 20/403 [00:05<01:39,  3.85it/s]\u001b[A\n",
            "Iteration:   5%|▌         | 21/403 [00:05<01:39,  3.85it/s]\u001b[A\n",
            "Iteration:   5%|▌         | 22/403 [00:05<01:38,  3.86it/s]\u001b[A\n",
            "Iteration:   6%|▌         | 23/403 [00:05<01:38,  3.85it/s]\u001b[A\n",
            "Iteration:   6%|▌         | 24/403 [00:06<01:38,  3.85it/s]\u001b[A\n",
            "Iteration:   6%|▌         | 25/403 [00:06<01:38,  3.85it/s]\u001b[A\n",
            "Iteration:   6%|▋         | 26/403 [00:06<01:37,  3.85it/s]\u001b[A\n",
            "Iteration:   7%|▋         | 27/403 [00:07<01:37,  3.84it/s]\u001b[A\n",
            "Iteration:   7%|▋         | 28/403 [00:07<01:37,  3.86it/s]\u001b[A\n",
            "Iteration:   7%|▋         | 29/403 [00:07<01:36,  3.86it/s]\u001b[A\n",
            "Iteration:   7%|▋         | 30/403 [00:07<01:36,  3.86it/s]\u001b[A\n",
            "Iteration:   8%|▊         | 31/403 [00:08<01:36,  3.86it/s]\u001b[A\n",
            "Iteration:   8%|▊         | 32/403 [00:08<01:35,  3.87it/s]\u001b[A\n",
            "Iteration:   8%|▊         | 33/403 [00:08<01:35,  3.87it/s]\u001b[A\n",
            "Iteration:   8%|▊         | 34/403 [00:08<01:35,  3.87it/s]\u001b[A\n",
            "Iteration:   9%|▊         | 35/403 [00:09<01:35,  3.87it/s]\u001b[A\n",
            "Iteration:   9%|▉         | 36/403 [00:09<01:34,  3.87it/s]\u001b[A\n",
            "Iteration:   9%|▉         | 37/403 [00:09<01:34,  3.86it/s]\u001b[A\n",
            "Iteration:   9%|▉         | 38/403 [00:09<01:34,  3.86it/s]\u001b[A\n",
            "Iteration:  10%|▉         | 39/403 [00:10<01:34,  3.85it/s]\u001b[A\n",
            "Iteration:  10%|▉         | 40/403 [00:10<01:34,  3.86it/s]\u001b[A\n",
            "Iteration:  10%|█         | 41/403 [00:10<01:33,  3.87it/s]\u001b[A\n",
            "Iteration:  10%|█         | 42/403 [00:10<01:33,  3.87it/s]\u001b[A\n",
            "Iteration:  11%|█         | 43/403 [00:11<01:32,  3.87it/s]\u001b[A\n",
            "Iteration:  11%|█         | 44/403 [00:11<01:32,  3.87it/s]\u001b[A\n",
            "Iteration:  11%|█         | 45/403 [00:11<01:32,  3.87it/s]\u001b[A\n",
            "Iteration:  11%|█▏        | 46/403 [00:11<01:32,  3.86it/s]\u001b[A\n",
            "Iteration:  12%|█▏        | 47/403 [00:12<01:32,  3.86it/s]\u001b[A\n",
            "Iteration:  12%|█▏        | 48/403 [00:12<01:31,  3.87it/s]\u001b[A\n",
            "Iteration:  12%|█▏        | 49/403 [00:12<01:31,  3.86it/s]\u001b[A\n",
            "Iteration:  12%|█▏        | 50/403 [00:12<01:31,  3.86it/s]\u001b[A\n",
            "Iteration:  13%|█▎        | 51/403 [00:13<01:31,  3.86it/s]\u001b[A\n",
            "Iteration:  13%|█▎        | 52/403 [00:13<01:30,  3.87it/s]\u001b[A\n",
            "Iteration:  13%|█▎        | 53/403 [00:13<01:30,  3.86it/s]\u001b[A\n",
            "Iteration:  13%|█▎        | 54/403 [00:14<01:30,  3.85it/s]\u001b[A\n",
            "Iteration:  14%|█▎        | 55/403 [00:14<01:30,  3.85it/s]\u001b[A\n",
            "Iteration:  14%|█▍        | 56/403 [00:14<01:29,  3.86it/s]\u001b[A\n",
            "Iteration:  14%|█▍        | 57/403 [00:14<01:29,  3.86it/s]\u001b[A\n",
            "Iteration:  14%|█▍        | 58/403 [00:15<01:29,  3.86it/s]\u001b[A\n",
            "Iteration:  15%|█▍        | 59/403 [00:15<01:28,  3.87it/s]\u001b[A\n",
            "Iteration:  15%|█▍        | 60/403 [00:15<01:28,  3.88it/s]\u001b[A\n",
            "Iteration:  15%|█▌        | 61/403 [00:15<01:28,  3.87it/s]\u001b[A\n",
            "Iteration:  15%|█▌        | 62/403 [00:16<01:28,  3.87it/s]\u001b[A\n",
            "Iteration:  16%|█▌        | 63/403 [00:16<01:27,  3.87it/s]\u001b[A\n",
            "Iteration:  16%|█▌        | 64/403 [00:16<01:27,  3.88it/s]\u001b[A\n",
            "Iteration:  16%|█▌        | 65/403 [00:16<01:27,  3.87it/s]\u001b[A\n",
            "Iteration:  16%|█▋        | 66/403 [00:17<01:26,  3.87it/s]\u001b[A\n",
            "Iteration:  17%|█▋        | 67/403 [00:17<01:26,  3.88it/s]\u001b[A\n",
            "Iteration:  17%|█▋        | 68/403 [00:17<01:26,  3.88it/s]\u001b[A\n",
            "Iteration:  17%|█▋        | 69/403 [00:17<01:26,  3.88it/s]\u001b[A\n",
            "Iteration:  17%|█▋        | 70/403 [00:18<01:26,  3.87it/s]\u001b[A\n",
            "Iteration:  18%|█▊        | 71/403 [00:18<01:25,  3.87it/s]\u001b[A\n",
            "Iteration:  18%|█▊        | 72/403 [00:18<01:25,  3.87it/s]\u001b[A\n",
            "Iteration:  18%|█▊        | 73/403 [00:18<01:25,  3.86it/s]\u001b[A\n",
            "Iteration:  18%|█▊        | 74/403 [00:19<01:25,  3.85it/s]\u001b[A\n",
            "Iteration:  19%|█▊        | 75/403 [00:19<01:25,  3.86it/s]\u001b[A\n",
            "Iteration:  19%|█▉        | 76/403 [00:19<01:24,  3.87it/s]\u001b[A\n",
            "Iteration:  19%|█▉        | 77/403 [00:19<01:24,  3.88it/s]\u001b[A\n",
            "Iteration:  19%|█▉        | 78/403 [00:20<01:24,  3.87it/s]\u001b[A\n",
            "Iteration:  20%|█▉        | 79/403 [00:20<01:23,  3.87it/s]\u001b[A\n",
            "Iteration:  20%|█▉        | 80/403 [00:20<01:23,  3.87it/s]\u001b[A\n",
            "Iteration:  20%|██        | 81/403 [00:20<01:23,  3.87it/s]\u001b[A\n",
            "Iteration:  20%|██        | 82/403 [00:21<01:23,  3.86it/s]\u001b[A\n",
            "Iteration:  21%|██        | 83/403 [00:21<01:22,  3.87it/s]\u001b[A\n",
            "Iteration:  21%|██        | 84/403 [00:21<01:22,  3.88it/s]\u001b[A\n",
            "Iteration:  21%|██        | 85/403 [00:22<01:21,  3.88it/s]\u001b[A\n",
            "Iteration:  21%|██▏       | 86/403 [00:22<01:21,  3.88it/s]\u001b[A\n",
            "Iteration:  22%|██▏       | 87/403 [00:22<01:21,  3.87it/s]\u001b[A\n",
            "Iteration:  22%|██▏       | 88/403 [00:22<01:21,  3.87it/s]\u001b[A\n",
            "Iteration:  22%|██▏       | 89/403 [00:23<01:21,  3.87it/s]\u001b[A\n",
            "Iteration:  22%|██▏       | 90/403 [00:23<01:21,  3.86it/s]\u001b[A\n",
            "Iteration:  23%|██▎       | 91/403 [00:23<01:20,  3.86it/s]\u001b[A\n",
            "Iteration:  23%|██▎       | 92/403 [00:23<01:20,  3.86it/s]\u001b[A\n",
            "Iteration:  23%|██▎       | 93/403 [00:24<01:20,  3.86it/s]\u001b[A\n",
            "Iteration:  23%|██▎       | 94/403 [00:24<01:20,  3.86it/s]\u001b[A\n",
            "Iteration:  24%|██▎       | 95/403 [00:24<01:19,  3.87it/s]\u001b[A\n",
            "Iteration:  24%|██▍       | 96/403 [00:24<01:19,  3.85it/s]\u001b[A\n",
            "Iteration:  24%|██▍       | 97/403 [00:25<01:19,  3.86it/s]\u001b[A\n",
            "Iteration:  24%|██▍       | 98/403 [00:25<01:19,  3.85it/s]\u001b[A\n",
            "Iteration:  25%|██▍       | 99/403 [00:25<01:19,  3.84it/s]\u001b[A\n",
            "Iteration:  25%|██▍       | 100/403 [00:25<01:19,  3.83it/s]\u001b[A\n",
            "Iteration:  25%|██▌       | 101/403 [00:26<01:19,  3.82it/s]\u001b[A\n",
            "Iteration:  25%|██▌       | 102/403 [00:26<01:18,  3.82it/s]\u001b[A\n",
            "Iteration:  26%|██▌       | 103/403 [00:26<01:18,  3.83it/s]\u001b[A\n",
            "Iteration:  26%|██▌       | 104/403 [00:26<01:17,  3.84it/s]\u001b[A\n",
            "Iteration:  26%|██▌       | 105/403 [00:27<01:17,  3.85it/s]\u001b[A\n",
            "Iteration:  26%|██▋       | 106/403 [00:27<01:17,  3.84it/s]\u001b[A\n",
            "Iteration:  27%|██▋       | 107/403 [00:27<01:16,  3.85it/s]\u001b[A\n",
            "Iteration:  27%|██▋       | 108/403 [00:27<01:16,  3.85it/s]\u001b[A\n",
            "Iteration:  27%|██▋       | 109/403 [00:28<01:16,  3.86it/s]\u001b[A\n",
            "Iteration:  27%|██▋       | 110/403 [00:28<01:15,  3.86it/s]\u001b[A\n",
            "Iteration:  28%|██▊       | 111/403 [00:28<01:15,  3.86it/s]\u001b[A\n",
            "Iteration:  28%|██▊       | 112/403 [00:29<01:15,  3.85it/s]\u001b[A\n",
            "Iteration:  28%|██▊       | 113/403 [00:29<01:15,  3.84it/s]\u001b[A\n",
            "Iteration:  28%|██▊       | 114/403 [00:29<01:15,  3.82it/s]\u001b[A\n",
            "Iteration:  29%|██▊       | 115/403 [00:29<01:15,  3.82it/s]\u001b[A\n",
            "Iteration:  29%|██▉       | 116/403 [00:30<01:14,  3.83it/s]\u001b[A\n",
            "Iteration:  29%|██▉       | 117/403 [00:30<01:14,  3.84it/s]\u001b[A\n",
            "Iteration:  29%|██▉       | 118/403 [00:30<01:14,  3.84it/s]\u001b[A\n",
            "Iteration:  30%|██▉       | 119/403 [00:30<01:13,  3.84it/s]\u001b[A\n",
            "Iteration:  30%|██▉       | 120/403 [00:31<01:13,  3.85it/s]\u001b[A\n",
            "Iteration:  30%|███       | 121/403 [00:31<01:13,  3.85it/s]\u001b[A\n",
            "Iteration:  30%|███       | 122/403 [00:31<01:12,  3.85it/s]\u001b[A\n",
            "Iteration:  31%|███       | 123/403 [00:31<01:12,  3.85it/s]\u001b[A\n",
            "Iteration:  31%|███       | 124/403 [00:32<01:12,  3.85it/s]\u001b[A\n",
            "Iteration:  31%|███       | 125/403 [00:32<01:12,  3.86it/s]\u001b[A\n",
            "Iteration:  31%|███▏      | 126/403 [00:32<01:11,  3.86it/s]\u001b[A\n",
            "Iteration:  32%|███▏      | 127/403 [00:32<01:11,  3.86it/s]\u001b[A\n",
            "Iteration:  32%|███▏      | 128/403 [00:33<01:11,  3.87it/s]\u001b[A\n",
            "Iteration:  32%|███▏      | 129/403 [00:33<01:10,  3.87it/s]\u001b[A\n",
            "Iteration:  32%|███▏      | 130/403 [00:33<01:10,  3.87it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 131/403 [00:33<01:10,  3.86it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 132/403 [00:34<01:10,  3.87it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 133/403 [00:34<01:09,  3.87it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 134/403 [00:34<01:09,  3.87it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 135/403 [00:35<01:09,  3.86it/s]\u001b[A\n",
            "Iteration:  34%|███▎      | 136/403 [00:35<01:09,  3.86it/s]\u001b[A\n",
            "Iteration:  34%|███▍      | 137/403 [00:35<01:08,  3.88it/s]\u001b[A\n",
            "Iteration:  34%|███▍      | 138/403 [00:35<01:08,  3.88it/s]\u001b[A\n",
            "Iteration:  34%|███▍      | 139/403 [00:36<01:08,  3.87it/s]\u001b[A\n",
            "Iteration:  35%|███▍      | 140/403 [00:36<01:07,  3.87it/s]\u001b[A\n",
            "Iteration:  35%|███▍      | 141/403 [00:36<01:07,  3.88it/s]\u001b[A\n",
            "Iteration:  35%|███▌      | 142/403 [00:36<01:07,  3.88it/s]\u001b[A\n",
            "Iteration:  35%|███▌      | 143/403 [00:37<01:07,  3.87it/s]\u001b[A\n",
            "Iteration:  36%|███▌      | 144/403 [00:37<01:06,  3.88it/s]\u001b[A\n",
            "Iteration:  36%|███▌      | 145/403 [00:37<01:06,  3.89it/s]\u001b[A\n",
            "Iteration:  36%|███▌      | 146/403 [00:37<01:06,  3.89it/s]\u001b[A\n",
            "Iteration:  36%|███▋      | 147/403 [00:38<01:06,  3.87it/s]\u001b[A\n",
            "Iteration:  37%|███▋      | 148/403 [00:38<01:05,  3.87it/s]\u001b[A\n",
            "Iteration:  37%|███▋      | 149/403 [00:38<01:05,  3.88it/s]\u001b[A\n",
            "Iteration:  37%|███▋      | 150/403 [00:38<01:05,  3.88it/s]\u001b[A\n",
            "Iteration:  37%|███▋      | 151/403 [00:39<01:04,  3.89it/s]\u001b[A\n",
            "Iteration:  38%|███▊      | 152/403 [00:39<01:04,  3.89it/s]\u001b[A\n",
            "Iteration:  38%|███▊      | 153/403 [00:39<01:04,  3.89it/s]\u001b[A\n",
            "Iteration:  38%|███▊      | 154/403 [00:39<01:03,  3.89it/s]\u001b[A\n",
            "Iteration:  38%|███▊      | 155/403 [00:40<01:03,  3.89it/s]\u001b[A\n",
            "Iteration:  39%|███▊      | 156/403 [00:40<01:03,  3.89it/s]\u001b[A\n",
            "Iteration:  39%|███▉      | 157/403 [00:40<01:03,  3.88it/s]\u001b[A\n",
            "Iteration:  39%|███▉      | 158/403 [00:40<01:03,  3.88it/s]\u001b[A\n",
            "Iteration:  39%|███▉      | 159/403 [00:41<01:02,  3.88it/s]\u001b[A\n",
            "Iteration:  40%|███▉      | 160/403 [00:41<01:02,  3.87it/s]\u001b[A\n",
            "Iteration:  40%|███▉      | 161/403 [00:41<01:02,  3.87it/s]\u001b[A\n",
            "Iteration:  40%|████      | 162/403 [00:41<01:02,  3.87it/s]\u001b[A\n",
            "Iteration:  40%|████      | 163/403 [00:42<01:02,  3.86it/s]\u001b[A\n",
            "Iteration:  41%|████      | 164/403 [00:42<01:01,  3.87it/s]\u001b[A\n",
            "Iteration:  41%|████      | 165/403 [00:42<01:01,  3.86it/s]\u001b[A\n",
            "Iteration:  41%|████      | 166/403 [00:42<01:01,  3.86it/s]\u001b[A\n",
            "Iteration:  41%|████▏     | 167/403 [00:43<01:00,  3.87it/s]\u001b[A\n",
            "Iteration:  42%|████▏     | 168/403 [00:43<01:00,  3.86it/s]\u001b[A\n",
            "Iteration:  42%|████▏     | 169/403 [00:43<01:00,  3.86it/s]\u001b[A\n",
            "Iteration:  42%|████▏     | 170/403 [00:44<01:00,  3.86it/s]\u001b[A\n",
            "Iteration:  42%|████▏     | 171/403 [00:44<01:00,  3.86it/s]\u001b[A\n",
            "Iteration:  43%|████▎     | 172/403 [00:44<00:59,  3.86it/s]\u001b[A\n",
            "Iteration:  43%|████▎     | 173/403 [00:44<00:59,  3.86it/s]\u001b[A\n",
            "Iteration:  43%|████▎     | 174/403 [00:45<00:59,  3.86it/s]\u001b[A\n",
            "Iteration:  43%|████▎     | 175/403 [00:45<00:58,  3.87it/s]\u001b[A\n",
            "Iteration:  44%|████▎     | 176/403 [00:45<00:58,  3.87it/s]\u001b[A\n",
            "Iteration:  44%|████▍     | 177/403 [00:45<00:58,  3.86it/s]\u001b[A\n",
            "Iteration:  44%|████▍     | 178/403 [00:46<00:58,  3.86it/s]\u001b[A\n",
            "Iteration:  44%|████▍     | 179/403 [00:46<00:57,  3.87it/s]\u001b[A\n",
            "Iteration:  45%|████▍     | 180/403 [00:46<00:57,  3.87it/s]\u001b[A\n",
            "Iteration:  45%|████▍     | 181/403 [00:46<00:57,  3.86it/s]\u001b[A\n",
            "Iteration:  45%|████▌     | 182/403 [00:47<00:57,  3.85it/s]\u001b[A\n",
            "Iteration:  45%|████▌     | 183/403 [00:47<00:57,  3.84it/s]\u001b[A\n",
            "Iteration:  46%|████▌     | 184/403 [00:47<00:57,  3.83it/s]\u001b[A\n",
            "Iteration:  46%|████▌     | 185/403 [00:47<00:57,  3.81it/s]\u001b[A\n",
            "Iteration:  46%|████▌     | 186/403 [00:48<00:56,  3.81it/s]\u001b[A\n",
            "Iteration:  46%|████▋     | 187/403 [00:48<00:56,  3.82it/s]\u001b[A\n",
            "Iteration:  47%|████▋     | 188/403 [00:48<00:56,  3.82it/s]\u001b[A\n",
            "Iteration:  47%|████▋     | 189/403 [00:48<00:55,  3.82it/s]\u001b[A\n",
            "Iteration:  47%|████▋     | 190/403 [00:49<00:55,  3.83it/s]\u001b[A\n",
            "Iteration:  47%|████▋     | 191/403 [00:49<00:55,  3.84it/s]\u001b[A\n",
            "Iteration:  48%|████▊     | 192/403 [00:49<00:54,  3.85it/s]\u001b[A\n",
            "Iteration:  48%|████▊     | 193/403 [00:50<00:54,  3.86it/s]\u001b[A\n",
            "Iteration:  48%|████▊     | 194/403 [00:50<00:54,  3.86it/s]\u001b[A\n",
            "Iteration:  48%|████▊     | 195/403 [00:50<00:53,  3.88it/s]\u001b[A\n",
            "Iteration:  49%|████▊     | 196/403 [00:50<00:53,  3.87it/s]\u001b[A\n",
            "Iteration:  49%|████▉     | 197/403 [00:51<00:53,  3.87it/s]\u001b[A\n",
            "Iteration:  49%|████▉     | 198/403 [00:51<00:52,  3.87it/s]\u001b[A\n",
            "Iteration:  49%|████▉     | 199/403 [00:51<00:52,  3.88it/s]\u001b[A\n",
            "Iteration:  50%|████▉     | 200/403 [00:51<00:52,  3.86it/s]\u001b[A\n",
            "Iteration:  50%|████▉     | 201/403 [00:52<00:52,  3.86it/s]\u001b[A\n",
            "Iteration:  50%|█████     | 202/403 [00:52<00:52,  3.86it/s]\u001b[A\n",
            "Iteration:  50%|█████     | 203/403 [00:52<00:51,  3.87it/s]\u001b[A\n",
            "Iteration:  51%|█████     | 204/403 [00:52<00:51,  3.87it/s]\u001b[A\n",
            "Iteration:  51%|█████     | 205/403 [00:53<00:51,  3.86it/s]\u001b[A\n",
            "Iteration:  51%|█████     | 206/403 [00:53<00:50,  3.87it/s]\u001b[A\n",
            "Iteration:  51%|█████▏    | 207/403 [00:53<00:50,  3.88it/s]\u001b[A\n",
            "Iteration:  52%|█████▏    | 208/403 [00:53<00:50,  3.87it/s]\u001b[A\n",
            "Iteration:  52%|█████▏    | 209/403 [00:54<00:50,  3.86it/s]\u001b[A\n",
            "Iteration:  52%|█████▏    | 210/403 [00:54<00:49,  3.87it/s]\u001b[A\n",
            "Iteration:  52%|█████▏    | 211/403 [00:54<00:49,  3.88it/s]\u001b[A\n",
            "Iteration:  53%|█████▎    | 212/403 [00:54<00:49,  3.87it/s]\u001b[A\n",
            "Iteration:  53%|█████▎    | 213/403 [00:55<00:49,  3.87it/s]\u001b[A\n",
            "Iteration:  53%|█████▎    | 214/403 [00:55<00:48,  3.88it/s]\u001b[A\n",
            "Iteration:  53%|█████▎    | 215/403 [00:55<00:48,  3.88it/s]\u001b[A\n",
            "Iteration:  54%|█████▎    | 216/403 [00:55<00:48,  3.87it/s]\u001b[A\n",
            "Iteration:  54%|█████▍    | 217/403 [00:56<00:48,  3.86it/s]\u001b[A\n",
            "Iteration:  54%|█████▍    | 218/403 [00:56<00:47,  3.87it/s]\u001b[A\n",
            "Iteration:  54%|█████▍    | 219/403 [00:56<00:47,  3.87it/s]\u001b[A\n",
            "Iteration:  55%|█████▍    | 220/403 [00:56<00:47,  3.86it/s]\u001b[A\n",
            "Iteration:  55%|█████▍    | 221/403 [00:57<00:47,  3.86it/s]\u001b[A\n",
            "Iteration:  55%|█████▌    | 222/403 [00:57<00:46,  3.87it/s]\u001b[A\n",
            "Iteration:  55%|█████▌    | 223/403 [00:57<00:46,  3.88it/s]\u001b[A\n",
            "Iteration:  56%|█████▌    | 224/403 [00:58<00:46,  3.87it/s]\u001b[A\n",
            "Iteration:  56%|█████▌    | 225/403 [00:58<00:45,  3.87it/s]\u001b[A\n",
            "Iteration:  56%|█████▌    | 226/403 [00:58<00:45,  3.87it/s]\u001b[A\n",
            "Iteration:  56%|█████▋    | 227/403 [00:58<00:45,  3.87it/s]\u001b[A\n",
            "Iteration:  57%|█████▋    | 228/403 [00:59<00:45,  3.87it/s]\u001b[A\n",
            "Iteration:  57%|█████▋    | 229/403 [00:59<00:45,  3.87it/s]\u001b[A\n",
            "Iteration:  57%|█████▋    | 230/403 [00:59<00:44,  3.86it/s]\u001b[A\n",
            "Iteration:  57%|█████▋    | 231/403 [00:59<00:44,  3.86it/s]\u001b[A\n",
            "Iteration:  58%|█████▊    | 232/403 [01:00<00:44,  3.87it/s]\u001b[A\n",
            "Iteration:  58%|█████▊    | 233/403 [01:00<00:43,  3.87it/s]\u001b[A\n",
            "Iteration:  58%|█████▊    | 234/403 [01:00<00:43,  3.87it/s]\u001b[A\n",
            "Iteration:  58%|█████▊    | 235/403 [01:00<00:43,  3.87it/s]\u001b[A\n",
            "Iteration:  59%|█████▊    | 236/403 [01:01<00:43,  3.86it/s]\u001b[A\n",
            "Iteration:  59%|█████▉    | 237/403 [01:01<00:42,  3.87it/s]\u001b[A\n",
            "Iteration:  59%|█████▉    | 238/403 [01:01<00:42,  3.86it/s]\u001b[A\n",
            "Iteration:  59%|█████▉    | 239/403 [01:01<00:42,  3.85it/s]\u001b[A\n",
            "Iteration:  60%|█████▉    | 240/403 [01:02<00:42,  3.85it/s]\u001b[A\n",
            "Iteration:  60%|█████▉    | 241/403 [01:02<00:42,  3.85it/s]\u001b[A\n",
            "Iteration:  60%|██████    | 242/403 [01:02<00:41,  3.84it/s]\u001b[A\n",
            "Iteration:  60%|██████    | 243/403 [01:02<00:41,  3.84it/s]\u001b[A\n",
            "Iteration:  61%|██████    | 244/403 [01:03<00:41,  3.84it/s]\u001b[A\n",
            "Iteration:  61%|██████    | 245/403 [01:03<00:41,  3.84it/s]\u001b[A\n",
            "Iteration:  61%|██████    | 246/403 [01:03<00:40,  3.85it/s]\u001b[A\n",
            "Iteration:  61%|██████▏   | 247/403 [01:03<00:40,  3.85it/s]\u001b[A\n",
            "Iteration:  62%|██████▏   | 248/403 [01:04<00:40,  3.86it/s]\u001b[A\n",
            "Iteration:  62%|██████▏   | 249/403 [01:04<00:39,  3.87it/s]\u001b[A\n",
            "Iteration:  62%|██████▏   | 250/403 [01:04<00:39,  3.87it/s]\u001b[A\n",
            "Iteration:  62%|██████▏   | 251/403 [01:05<00:39,  3.87it/s]\u001b[A\n",
            "Iteration:  63%|██████▎   | 252/403 [01:05<00:39,  3.87it/s]\u001b[A\n",
            "Iteration:  63%|██████▎   | 253/403 [01:05<00:38,  3.87it/s]\u001b[A\n",
            "Iteration:  63%|██████▎   | 254/403 [01:05<00:38,  3.87it/s]\u001b[A\n",
            "Iteration:  63%|██████▎   | 255/403 [01:06<00:38,  3.86it/s]\u001b[A\n",
            "Iteration:  64%|██████▎   | 256/403 [01:06<00:38,  3.85it/s]\u001b[A\n",
            "Iteration:  64%|██████▍   | 257/403 [01:06<00:37,  3.86it/s]\u001b[A\n",
            "Iteration:  64%|██████▍   | 258/403 [01:06<00:37,  3.86it/s]\u001b[A\n",
            "Iteration:  64%|██████▍   | 259/403 [01:07<00:37,  3.86it/s]\u001b[A\n",
            "Iteration:  65%|██████▍   | 260/403 [01:07<00:37,  3.86it/s]\u001b[A\n",
            "Iteration:  65%|██████▍   | 261/403 [01:07<00:36,  3.87it/s]\u001b[A\n",
            "Iteration:  65%|██████▌   | 262/403 [01:07<00:36,  3.87it/s]\u001b[A\n",
            "Iteration:  65%|██████▌   | 263/403 [01:08<00:36,  3.87it/s]\u001b[A\n",
            "Iteration:  66%|██████▌   | 264/403 [01:08<00:35,  3.88it/s]\u001b[A\n",
            "Iteration:  66%|██████▌   | 265/403 [01:08<00:35,  3.87it/s]\u001b[A\n",
            "Iteration:  66%|██████▌   | 266/403 [01:08<00:35,  3.87it/s]\u001b[A\n",
            "Iteration:  66%|██████▋   | 267/403 [01:09<00:35,  3.88it/s]\u001b[A\n",
            "Iteration:  67%|██████▋   | 268/403 [01:09<00:34,  3.87it/s]\u001b[A\n",
            "Iteration:  67%|██████▋   | 269/403 [01:09<00:34,  3.87it/s]\u001b[A\n",
            "Iteration:  67%|██████▋   | 270/403 [01:09<00:34,  3.87it/s]\u001b[A\n",
            "Iteration:  67%|██████▋   | 271/403 [01:10<00:34,  3.87it/s]\u001b[A\n",
            "Iteration:  67%|██████▋   | 272/403 [01:10<00:33,  3.87it/s]\u001b[A\n",
            "Iteration:  68%|██████▊   | 273/403 [01:10<00:33,  3.88it/s]\u001b[A\n",
            "Iteration:  68%|██████▊   | 274/403 [01:10<00:33,  3.87it/s]\u001b[A\n",
            "Iteration:  68%|██████▊   | 275/403 [01:11<00:32,  3.88it/s]\u001b[A\n",
            "Iteration:  68%|██████▊   | 276/403 [01:11<00:32,  3.88it/s]\u001b[A\n",
            "Iteration:  69%|██████▊   | 277/403 [01:11<00:32,  3.88it/s]\u001b[A\n",
            "Iteration:  69%|██████▉   | 278/403 [01:11<00:32,  3.87it/s]\u001b[A\n",
            "Iteration:  69%|██████▉   | 279/403 [01:12<00:32,  3.87it/s]\u001b[A\n",
            "Iteration:  69%|██████▉   | 280/403 [01:12<00:31,  3.88it/s]\u001b[A\n",
            "Iteration:  70%|██████▉   | 281/403 [01:12<00:31,  3.88it/s]\u001b[A\n",
            "Iteration:  70%|██████▉   | 282/403 [01:13<00:31,  3.88it/s]\u001b[A\n",
            "Iteration:  70%|███████   | 283/403 [01:13<00:30,  3.87it/s]\u001b[A\n",
            "Iteration:  70%|███████   | 284/403 [01:13<00:30,  3.87it/s]\u001b[A\n",
            "Iteration:  71%|███████   | 285/403 [01:13<00:30,  3.86it/s]\u001b[A\n",
            "Iteration:  71%|███████   | 286/403 [01:14<00:30,  3.86it/s]\u001b[A\n",
            "Iteration:  71%|███████   | 287/403 [01:14<00:30,  3.86it/s]\u001b[A\n",
            "Iteration:  71%|███████▏  | 288/403 [01:14<00:29,  3.86it/s]\u001b[A\n",
            "Iteration:  72%|███████▏  | 289/403 [01:14<00:29,  3.86it/s]\u001b[A\n",
            "Iteration:  72%|███████▏  | 290/403 [01:15<00:29,  3.85it/s]\u001b[A\n",
            "Iteration:  72%|███████▏  | 291/403 [01:15<00:29,  3.85it/s]\u001b[A\n",
            "Iteration:  72%|███████▏  | 292/403 [01:15<00:28,  3.85it/s]\u001b[A\n",
            "Iteration:  73%|███████▎  | 293/403 [01:15<00:28,  3.85it/s]\u001b[A\n",
            "Iteration:  73%|███████▎  | 294/403 [01:16<00:28,  3.85it/s]\u001b[A\n",
            "Iteration:  73%|███████▎  | 295/403 [01:16<00:28,  3.86it/s]\u001b[A\n",
            "Iteration:  73%|███████▎  | 296/403 [01:16<00:27,  3.87it/s]\u001b[A\n",
            "Iteration:  74%|███████▎  | 297/403 [01:16<00:27,  3.88it/s]\u001b[A\n",
            "Iteration:  74%|███████▍  | 298/403 [01:17<00:27,  3.88it/s]\u001b[A\n",
            "Iteration:  74%|███████▍  | 299/403 [01:17<00:26,  3.88it/s]\u001b[A\n",
            "Iteration:  74%|███████▍  | 300/403 [01:17<00:26,  3.87it/s]\u001b[A\n",
            "Iteration: 100%|██████████| 403/403 [01:44<00:00,  3.86it/s]\u001b[A\n",
            "Epoch:  60%|██████    | 3/5 [05:36<03:51, 115.89s/it]\n",
            "Iteration:   0%|          | 0/403 [00:00<?, ?it/s]\u001b[A\n",
            "Iteration:  79%|███████▉  | 318/403 [01:22<00:21,  3.87it/s]\u001b[A\n",
            "Iteration:  79%|███████▉  | 319/403 [01:22<00:21,  3.88it/s]\u001b[A\n",
            "Iteration:  79%|███████▉  | 320/403 [01:22<00:21,  3.87it/s]\u001b[A\n",
            "Iteration:  80%|███████▉  | 321/403 [01:23<00:21,  3.86it/s]\u001b[A\n",
            "Iteration:  80%|███████▉  | 322/403 [01:23<00:20,  3.86it/s]\u001b[A\n",
            "Iteration:  80%|████████  | 323/403 [01:23<00:20,  3.85it/s]\u001b[A\n",
            "Iteration:  80%|████████  | 324/403 [01:23<00:20,  3.85it/s]\u001b[A\n",
            "Iteration:  81%|████████  | 325/403 [01:24<00:20,  3.86it/s]\u001b[A\n",
            "Iteration:  81%|████████  | 326/403 [01:24<00:19,  3.86it/s]\u001b[A\n",
            "Iteration:  81%|████████  | 327/403 [01:24<00:19,  3.85it/s]\u001b[A\n",
            "Iteration:  81%|████████▏ | 328/403 [01:24<00:19,  3.86it/s]\u001b[A\n",
            "Iteration:  82%|████████▏ | 329/403 [01:25<00:19,  3.86it/s]\u001b[A\n",
            "Iteration:  82%|████████▏ | 330/403 [01:25<00:18,  3.87it/s]\u001b[A\n",
            "Iteration:  82%|████████▏ | 331/403 [01:25<00:18,  3.86it/s]\u001b[A\n",
            "Iteration:  82%|████████▏ | 332/403 [01:25<00:18,  3.86it/s]\u001b[A\n",
            "Iteration:  83%|████████▎ | 333/403 [01:26<00:18,  3.87it/s]\u001b[A\n",
            "Iteration:  83%|████████▎ | 334/403 [01:26<00:17,  3.87it/s]\u001b[A\n",
            "Iteration:  83%|████████▎ | 335/403 [01:26<00:17,  3.87it/s]\u001b[A\n",
            "Iteration:  83%|████████▎ | 336/403 [01:26<00:17,  3.88it/s]\u001b[A\n",
            "Iteration:  84%|████████▎ | 337/403 [01:27<00:17,  3.88it/s]\u001b[A\n",
            "Iteration:  84%|████████▍ | 338/403 [01:27<00:16,  3.89it/s]\u001b[A\n",
            "Iteration:  84%|████████▍ | 339/403 [01:27<00:16,  3.89it/s]\u001b[A\n",
            "Iteration:  84%|████████▍ | 340/403 [01:28<00:16,  3.88it/s]\u001b[A\n",
            "Iteration:  85%|████████▍ | 341/403 [01:28<00:16,  3.87it/s]\u001b[A\n",
            "Iteration:  85%|████████▍ | 342/403 [01:28<00:15,  3.87it/s]\u001b[A\n",
            "Iteration:  85%|████████▌ | 343/403 [01:28<00:15,  3.87it/s]\u001b[A\n",
            "Iteration:  85%|████████▌ | 344/403 [01:29<00:15,  3.87it/s]\u001b[A\n",
            "Iteration:  86%|████████▌ | 345/403 [01:29<00:15,  3.87it/s]\u001b[A\n",
            "Iteration:  86%|████████▌ | 346/403 [01:29<00:14,  3.86it/s]\u001b[A\n",
            "Iteration:  86%|████████▌ | 347/403 [01:29<00:14,  3.86it/s]\u001b[A\n",
            "Iteration:  86%|████████▋ | 348/403 [01:30<00:14,  3.86it/s]\u001b[A\n",
            "Iteration:  87%|████████▋ | 349/403 [01:30<00:13,  3.86it/s]\u001b[A\n",
            "Iteration:  87%|████████▋ | 350/403 [01:30<00:13,  3.86it/s]\u001b[A\n",
            "Iteration:  87%|████████▋ | 351/403 [01:30<00:13,  3.87it/s]\u001b[A\n",
            "Iteration:  87%|████████▋ | 352/403 [01:31<00:13,  3.88it/s]\u001b[A\n",
            "Iteration:  88%|████████▊ | 353/403 [01:31<00:12,  3.88it/s]\u001b[A\n",
            "Iteration:  88%|████████▊ | 354/403 [01:31<00:12,  3.87it/s]\u001b[A\n",
            "Iteration:  88%|████████▊ | 355/403 [01:31<00:12,  3.87it/s]\u001b[A\n",
            "Iteration:  88%|████████▊ | 356/403 [01:32<00:12,  3.88it/s]\u001b[A\n",
            "Iteration:  89%|████████▊ | 357/403 [01:32<00:11,  3.88it/s]\u001b[A\n",
            "Iteration:  89%|████████▉ | 358/403 [01:32<00:11,  3.88it/s]\u001b[A\n",
            "Iteration:  89%|████████▉ | 359/403 [01:32<00:11,  3.88it/s]\u001b[A\n",
            "Iteration:  89%|████████▉ | 360/403 [01:33<00:11,  3.89it/s]\u001b[A\n",
            "Iteration:  90%|████████▉ | 361/403 [01:33<00:10,  3.89it/s]\u001b[A\n",
            "Iteration:  90%|████████▉ | 362/403 [01:33<00:10,  3.88it/s]\u001b[A\n",
            "Iteration:  90%|█████████ | 363/403 [01:33<00:10,  3.88it/s]\u001b[A\n",
            "Iteration:  90%|█████████ | 364/403 [01:34<00:10,  3.87it/s]\u001b[A\n",
            "Iteration:  91%|█████████ | 365/403 [01:34<00:09,  3.86it/s]\u001b[A\n",
            "Iteration:  91%|█████████ | 366/403 [01:34<00:09,  3.87it/s]\u001b[A\n",
            "Iteration:  91%|█████████ | 367/403 [01:34<00:09,  3.86it/s]\u001b[A\n",
            "Iteration:  91%|█████████▏| 368/403 [01:35<00:09,  3.86it/s]\u001b[A\n",
            "Iteration:  92%|█████████▏| 369/403 [01:35<00:08,  3.86it/s]\u001b[A\n",
            "Iteration:  92%|█████████▏| 370/403 [01:35<00:08,  3.86it/s]\u001b[A\n",
            "Iteration:  92%|█████████▏| 371/403 [01:36<00:08,  3.86it/s]\u001b[A\n",
            "Iteration:  92%|█████████▏| 372/403 [01:36<00:08,  3.86it/s]\u001b[A\n",
            "Iteration:  93%|█████████▎| 373/403 [01:36<00:07,  3.86it/s]\u001b[A\n",
            "Iteration:  93%|█████████▎| 374/403 [01:36<00:07,  3.88it/s]\u001b[A\n",
            "Iteration:  93%|█████████▎| 375/403 [01:37<00:07,  3.88it/s]\u001b[A\n",
            "Iteration:  93%|█████████▎| 376/403 [01:37<00:06,  3.89it/s]\u001b[A\n",
            "Iteration:  94%|█████████▎| 377/403 [01:37<00:06,  3.88it/s]\u001b[A\n",
            "Iteration:  94%|█████████▍| 378/403 [01:37<00:06,  3.88it/s]\u001b[A\n",
            "Iteration:  94%|█████████▍| 379/403 [01:38<00:06,  3.88it/s]\u001b[A\n",
            "Iteration:  94%|█████████▍| 380/403 [01:38<00:05,  3.89it/s]\u001b[A\n",
            "Iteration:  95%|█████████▍| 381/403 [01:38<00:05,  3.88it/s]\u001b[A\n",
            "Iteration:  95%|█████████▍| 382/403 [01:38<00:05,  3.88it/s]\u001b[A\n",
            "Iteration:  95%|█████████▌| 383/403 [01:39<00:05,  3.87it/s]\u001b[A\n",
            "Iteration:  95%|█████████▌| 384/403 [01:39<00:04,  3.87it/s]\u001b[A\n",
            "Iteration:  96%|█████████▌| 385/403 [01:39<00:04,  3.86it/s]\u001b[A\n",
            "Iteration:  96%|█████████▌| 386/403 [01:39<00:04,  3.86it/s]\u001b[A\n",
            "Iteration:  96%|█████████▌| 387/403 [01:40<00:04,  3.86it/s]\u001b[A\n",
            "Iteration:  96%|█████████▋| 388/403 [01:40<00:03,  3.86it/s]\u001b[A\n",
            "Iteration:  97%|█████████▋| 389/403 [01:40<00:03,  3.85it/s]\u001b[A\n",
            "Iteration:  97%|█████████▋| 390/403 [01:40<00:03,  3.84it/s]\u001b[A\n",
            "Iteration:  97%|█████████▋| 391/403 [01:41<00:03,  3.85it/s]\u001b[A\n",
            "Iteration:  97%|█████████▋| 392/403 [01:41<00:02,  3.84it/s]\u001b[A\n",
            "Iteration:  98%|█████████▊| 393/403 [01:41<00:02,  3.82it/s]\u001b[A\n",
            "Iteration:  98%|█████████▊| 394/403 [01:41<00:02,  3.84it/s]\u001b[A\n",
            "Iteration:  98%|█████████▊| 395/403 [01:42<00:02,  3.84it/s]\u001b[A\n",
            "Iteration:  98%|█████████▊| 396/403 [01:42<00:01,  3.85it/s]\u001b[A\n",
            "Iteration:  99%|█████████▊| 397/403 [01:42<00:01,  3.84it/s]\u001b[A\n",
            "Iteration:  99%|█████████▉| 398/403 [01:43<00:01,  3.85it/s]\u001b[A\n",
            "Iteration:  99%|█████████▉| 399/403 [01:43<00:01,  3.86it/s]\u001b[A\n",
            "Iteration:  99%|█████████▉| 400/403 [01:43<00:00,  3.86it/s]\u001b[A\n",
            "Iteration: 100%|█████████▉| 401/403 [01:43<00:00,  3.86it/s]\u001b[A\n",
            "Iteration: 100%|█████████▉| 402/403 [01:44<00:00,  3.87it/s]\u001b[A\n",
            "Iteration: 100%|██████████| 403/403 [01:44<00:00,  3.86it/s]\u001b[A\n",
            "Epoch:  80%|████████  | 4/5 [07:21<01:52, 112.42s/it]\n",
            "Iteration:   0%|          | 0/403 [00:00<?, ?it/s]\u001b[A\n",
            "Iteration:   0%|          | 1/403 [00:00<01:43,  3.88it/s]\u001b[A\n",
            "Iteration:   0%|          | 2/403 [00:00<01:43,  3.86it/s]\u001b[A\n",
            "Iteration:   1%|          | 3/403 [00:00<01:43,  3.86it/s]\u001b[A\n",
            "Iteration:   1%|          | 4/403 [00:01<01:43,  3.85it/s]\u001b[A\n",
            "Iteration:   1%|          | 5/403 [00:01<01:43,  3.85it/s]\u001b[A\n",
            "Iteration:   1%|▏         | 6/403 [00:01<01:43,  3.85it/s]\u001b[A\n",
            "Iteration:   2%|▏         | 7/403 [00:01<01:42,  3.85it/s]\u001b[A\n",
            "Iteration:   2%|▏         | 8/403 [00:02<01:42,  3.86it/s]\u001b[A\n",
            "Iteration:   2%|▏         | 9/403 [00:02<01:42,  3.86it/s]\u001b[A\n",
            "Iteration:   2%|▏         | 10/403 [00:02<01:41,  3.87it/s]\u001b[A\n",
            "Iteration:   3%|▎         | 11/403 [00:02<01:41,  3.86it/s]\u001b[A\n",
            "Iteration:   3%|▎         | 12/403 [00:03<01:41,  3.85it/s]\u001b[A\n",
            "Iteration:   3%|▎         | 13/403 [00:03<01:41,  3.86it/s]\u001b[A\n",
            "Iteration:   3%|▎         | 14/403 [00:03<01:40,  3.87it/s]\u001b[A\n",
            "Iteration:   4%|▎         | 15/403 [00:03<01:40,  3.85it/s]\u001b[A\n",
            "Iteration:   4%|▍         | 16/403 [00:04<01:40,  3.85it/s]\u001b[A\n",
            "Iteration:   4%|▍         | 17/403 [00:04<01:40,  3.86it/s]\u001b[A\n",
            "Iteration:   4%|▍         | 18/403 [00:04<01:39,  3.87it/s]\u001b[A\n",
            "Iteration:   5%|▍         | 19/403 [00:04<01:38,  3.88it/s]\u001b[A\n",
            "Iteration:   5%|▍         | 20/403 [00:05<01:38,  3.87it/s]\u001b[A\n",
            "Iteration:   5%|▌         | 21/403 [00:05<01:38,  3.88it/s]\u001b[A\n",
            "Iteration:   5%|▌         | 22/403 [00:05<01:38,  3.88it/s]\u001b[A\n",
            "Iteration:   6%|▌         | 23/403 [00:05<01:37,  3.88it/s]\u001b[A\n",
            "Iteration:   6%|▌         | 24/403 [00:06<01:37,  3.87it/s]\u001b[A\n",
            "Iteration:   6%|▌         | 25/403 [00:06<01:37,  3.87it/s]\u001b[A\n",
            "Iteration:   6%|▋         | 26/403 [00:06<01:37,  3.88it/s]\u001b[A\n",
            "Iteration:   7%|▋         | 27/403 [00:06<01:37,  3.87it/s]\u001b[A\n",
            "Iteration:   7%|▋         | 28/403 [00:07<01:37,  3.85it/s]\u001b[A\n",
            "Iteration:   7%|▋         | 29/403 [00:07<01:37,  3.85it/s]\u001b[A\n",
            "Iteration:   7%|▋         | 30/403 [00:07<01:36,  3.86it/s]\u001b[A\n",
            "Iteration:   8%|▊         | 31/403 [00:08<01:36,  3.87it/s]\u001b[A\n",
            "Iteration:   8%|▊         | 32/403 [00:08<01:35,  3.87it/s]\u001b[A\n",
            "Iteration:   8%|▊         | 33/403 [00:08<01:35,  3.87it/s]\u001b[A\n",
            "Iteration:   8%|▊         | 34/403 [00:08<01:35,  3.86it/s]\u001b[A\n",
            "Iteration:   9%|▊         | 35/403 [00:09<01:35,  3.86it/s]\u001b[A\n",
            "Iteration:   9%|▉         | 36/403 [00:09<01:35,  3.86it/s]\u001b[A\n",
            "Iteration:   9%|▉         | 37/403 [00:09<01:34,  3.86it/s]\u001b[A\n",
            "Iteration:   9%|▉         | 38/403 [00:09<01:34,  3.86it/s]\u001b[A\n",
            "Iteration:  10%|▉         | 39/403 [00:10<01:34,  3.87it/s]\u001b[A\n",
            "Iteration:  10%|▉         | 40/403 [00:10<01:33,  3.87it/s]\u001b[A\n",
            "Iteration:  10%|█         | 41/403 [00:10<01:33,  3.86it/s]\u001b[A\n",
            "Iteration:  10%|█         | 42/403 [00:10<01:33,  3.87it/s]\u001b[A\n",
            "Iteration:  11%|█         | 43/403 [00:11<01:33,  3.87it/s]\u001b[A\n",
            "Iteration:  11%|█         | 44/403 [00:11<01:32,  3.87it/s]\u001b[A\n",
            "Iteration:  11%|█         | 45/403 [00:11<01:32,  3.87it/s]\u001b[A\n",
            "Iteration:  11%|█▏        | 46/403 [00:11<01:32,  3.88it/s]\u001b[A\n",
            "Iteration:  12%|█▏        | 47/403 [00:12<01:32,  3.87it/s]\u001b[A\n",
            "Iteration:  12%|█▏        | 48/403 [00:12<01:31,  3.87it/s]\u001b[A\n",
            "Iteration:  12%|█▏        | 49/403 [00:12<01:31,  3.88it/s]\u001b[A\n",
            "Iteration:  12%|█▏        | 50/403 [00:12<01:31,  3.88it/s]\u001b[A\n",
            "Iteration:  13%|█▎        | 51/403 [00:13<01:30,  3.89it/s]\u001b[A\n",
            "Iteration:  13%|█▎        | 52/403 [00:13<01:30,  3.89it/s]\u001b[A\n",
            "Iteration:  13%|█▎        | 53/403 [00:13<01:30,  3.89it/s]\u001b[A\n",
            "Iteration:  13%|█▎        | 54/403 [00:13<01:29,  3.89it/s]\u001b[A\n",
            "Iteration:  14%|█▎        | 55/403 [00:14<01:29,  3.89it/s]\u001b[A\n",
            "Iteration:  14%|█▍        | 56/403 [00:14<01:29,  3.88it/s]\u001b[A\n",
            "Iteration:  14%|█▍        | 57/403 [00:14<01:29,  3.88it/s]\u001b[A\n",
            "Iteration:  14%|█▍        | 58/403 [00:14<01:29,  3.87it/s]\u001b[A\n",
            "Iteration:  15%|█▍        | 59/403 [00:15<01:29,  3.85it/s]\u001b[A\n",
            "Iteration:  15%|█▍        | 60/403 [00:15<01:28,  3.87it/s]\u001b[A\n",
            "Iteration:  15%|█▌        | 61/403 [00:15<01:28,  3.88it/s]\u001b[A\n",
            "Iteration:  15%|█▌        | 62/403 [00:16<01:27,  3.88it/s]\u001b[A\n",
            "Iteration:  16%|█▌        | 63/403 [00:16<01:27,  3.87it/s]\u001b[A\n",
            "Iteration:  16%|█▌        | 64/403 [00:16<01:27,  3.87it/s]\u001b[A\n",
            "Iteration:  16%|█▌        | 65/403 [00:16<01:27,  3.88it/s]\u001b[A\n",
            "Iteration:  16%|█▋        | 66/403 [00:17<01:26,  3.88it/s]\u001b[A\n",
            "Iteration:  17%|█▋        | 67/403 [00:17<01:26,  3.89it/s]\u001b[A\n",
            "Iteration:  17%|█▋        | 68/403 [00:17<01:26,  3.89it/s]\u001b[A\n",
            "Iteration:  17%|█▋        | 69/403 [00:17<01:26,  3.88it/s]\u001b[A\n",
            "Iteration:  17%|█▋        | 70/403 [00:18<01:25,  3.87it/s]\u001b[A\n",
            "Iteration:  18%|█▊        | 71/403 [00:18<01:25,  3.87it/s]\u001b[A\n",
            "Iteration:  18%|█▊        | 72/403 [00:18<01:25,  3.87it/s]\u001b[A\n",
            "Iteration:  18%|█▊        | 73/403 [00:18<01:24,  3.88it/s]\u001b[A\n",
            "Iteration:  18%|█▊        | 74/403 [00:19<01:24,  3.88it/s]\u001b[A\n",
            "Iteration:  19%|█▊        | 75/403 [00:19<01:24,  3.88it/s]\u001b[A\n",
            "Iteration:  19%|█▉        | 76/403 [00:19<01:24,  3.89it/s]\u001b[A\n",
            "Iteration:  19%|█▉        | 77/403 [00:19<01:23,  3.89it/s]\u001b[A\n",
            "Iteration:  19%|█▉        | 78/403 [00:20<01:23,  3.88it/s]\u001b[A\n",
            "Iteration:  20%|█▉        | 79/403 [00:20<01:23,  3.87it/s]\u001b[A\n",
            "Iteration:  20%|█▉        | 80/403 [00:20<01:23,  3.87it/s]\u001b[A\n",
            "Iteration:  20%|██        | 81/403 [00:20<01:23,  3.87it/s]\u001b[A\n",
            "Iteration:  20%|██        | 82/403 [00:21<01:23,  3.86it/s]\u001b[A\n",
            "Iteration:  21%|██        | 83/403 [00:21<01:23,  3.85it/s]\u001b[A\n",
            "Iteration:  21%|██        | 84/403 [00:21<01:23,  3.84it/s]\u001b[A\n",
            "Iteration:  21%|██        | 85/403 [00:21<01:22,  3.84it/s]\u001b[A\n",
            "Iteration:  21%|██▏       | 86/403 [00:22<01:22,  3.85it/s]\u001b[A\n",
            "Iteration:  22%|██▏       | 87/403 [00:22<01:21,  3.86it/s]\u001b[A\n",
            "Iteration:  22%|██▏       | 88/403 [00:22<01:21,  3.86it/s]\u001b[A\n",
            "Iteration:  22%|██▏       | 89/403 [00:23<01:21,  3.86it/s]\u001b[A\n",
            "Iteration:  22%|██▏       | 90/403 [00:23<01:20,  3.87it/s]\u001b[A\n",
            "Iteration:  23%|██▎       | 91/403 [00:23<01:20,  3.87it/s]\u001b[A\n",
            "Iteration:  23%|██▎       | 92/403 [00:23<01:20,  3.88it/s]\u001b[A\n",
            "Iteration:  23%|██▎       | 93/403 [00:24<01:19,  3.88it/s]\u001b[A\n",
            "Iteration:  23%|██▎       | 94/403 [00:24<01:19,  3.87it/s]\u001b[A\n",
            "Iteration:  24%|██▎       | 95/403 [00:24<01:19,  3.87it/s]\u001b[A\n",
            "Iteration:  24%|██▍       | 96/403 [00:24<01:19,  3.88it/s]\u001b[A\n",
            "Iteration:  24%|██▍       | 97/403 [00:25<01:18,  3.87it/s]\u001b[A\n",
            "Iteration:  24%|██▍       | 98/403 [00:25<01:19,  3.85it/s]\u001b[A\n",
            "Iteration:  25%|██▍       | 99/403 [00:25<01:18,  3.86it/s]\u001b[A\n",
            "Iteration:  25%|██▍       | 100/403 [00:25<01:18,  3.86it/s]\u001b[A\n",
            "Iteration:  25%|██▌       | 101/403 [00:26<01:18,  3.86it/s]\u001b[A\n",
            "Iteration:  25%|██▌       | 102/403 [00:26<01:18,  3.86it/s]\u001b[A\n",
            "Iteration:  26%|██▌       | 103/403 [00:26<01:17,  3.85it/s]\u001b[A\n",
            "Iteration:  26%|██▌       | 104/403 [00:26<01:17,  3.87it/s]\u001b[A\n",
            "Iteration:  26%|██▌       | 105/403 [00:27<01:16,  3.88it/s]\u001b[A\n",
            "Iteration:  26%|██▋       | 106/403 [00:27<01:16,  3.87it/s]\u001b[A\n",
            "Iteration:  27%|██▋       | 107/403 [00:27<01:16,  3.87it/s]\u001b[A\n",
            "Iteration:  27%|██▋       | 108/403 [00:27<01:16,  3.88it/s]\u001b[A\n",
            "Iteration:  27%|██▋       | 109/403 [00:28<01:16,  3.87it/s]\u001b[A\n",
            "Iteration:  27%|██▋       | 110/403 [00:28<01:15,  3.86it/s]\u001b[A\n",
            "Iteration:  28%|██▊       | 111/403 [00:28<01:15,  3.87it/s]\u001b[A\n",
            "Iteration:  28%|██▊       | 112/403 [00:28<01:15,  3.88it/s]\u001b[A\n",
            "Iteration:  28%|██▊       | 113/403 [00:29<01:15,  3.86it/s]\u001b[A\n",
            "Iteration:  28%|██▊       | 114/403 [00:29<01:14,  3.86it/s]\u001b[A\n",
            "Iteration:  29%|██▊       | 115/403 [00:29<01:14,  3.86it/s]\u001b[A\n",
            "Iteration:  29%|██▉       | 116/403 [00:29<01:14,  3.86it/s]\u001b[A\n",
            "Iteration:  29%|██▉       | 117/403 [00:30<01:14,  3.85it/s]\u001b[A\n",
            "Iteration:  29%|██▉       | 118/403 [00:30<01:13,  3.86it/s]\u001b[A\n",
            "Iteration:  30%|██▉       | 119/403 [00:30<01:13,  3.86it/s]\u001b[A\n",
            "Iteration:  30%|██▉       | 120/403 [00:31<01:13,  3.86it/s]\u001b[A\n",
            "Iteration:  30%|███       | 121/403 [00:31<01:13,  3.86it/s]\u001b[A\n",
            "Iteration:  30%|███       | 122/403 [00:31<01:12,  3.86it/s]\u001b[A\n",
            "Iteration:  31%|███       | 123/403 [00:31<01:12,  3.87it/s]\u001b[A\n",
            "Iteration:  31%|███       | 124/403 [00:32<01:12,  3.87it/s]\u001b[A\n",
            "Iteration:  31%|███       | 125/403 [00:32<01:11,  3.87it/s]\u001b[A\n",
            "Iteration:  31%|███▏      | 126/403 [00:32<01:11,  3.88it/s]\u001b[A\n",
            "Iteration:  32%|███▏      | 127/403 [00:32<01:11,  3.86it/s]\u001b[A\n",
            "Iteration:  32%|███▏      | 128/403 [00:33<01:11,  3.86it/s]\u001b[A\n",
            "Iteration:  32%|███▏      | 129/403 [00:33<01:11,  3.86it/s]\u001b[A\n",
            "Iteration:  32%|███▏      | 130/403 [00:33<01:11,  3.84it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 131/403 [00:33<01:10,  3.84it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 132/403 [00:34<01:10,  3.84it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 133/403 [00:34<01:10,  3.84it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 134/403 [00:34<01:09,  3.86it/s]\u001b[A\n",
            "Iteration:  33%|███▎      | 135/403 [00:34<01:09,  3.87it/s]\u001b[A\n",
            "Iteration:  34%|███▎      | 136/403 [00:35<01:09,  3.87it/s]\u001b[A\n",
            "Iteration:  34%|███▍      | 137/403 [00:35<01:08,  3.86it/s]\u001b[A\n",
            "Iteration:  34%|███▍      | 138/403 [00:35<01:08,  3.87it/s]\u001b[A\n",
            "Iteration:  34%|███▍      | 139/403 [00:35<01:08,  3.88it/s]\u001b[A\n",
            "Iteration:  35%|███▍      | 140/403 [00:36<01:07,  3.88it/s]\u001b[A\n",
            "Iteration:  35%|███▍      | 141/403 [00:36<01:07,  3.88it/s]\u001b[A\n",
            "Iteration:  35%|███▌      | 142/403 [00:36<01:07,  3.87it/s]\u001b[A\n",
            "Iteration:  35%|███▌      | 143/403 [00:36<01:07,  3.87it/s]\u001b[A\n",
            "Iteration:  36%|███▌      | 144/403 [00:37<01:06,  3.87it/s]\u001b[A\n",
            "Iteration:  36%|███▌      | 145/403 [00:37<01:06,  3.87it/s]\u001b[A\n",
            "Iteration:  36%|███▌      | 146/403 [00:37<01:06,  3.87it/s]\u001b[A\n",
            "Iteration:  36%|███▋      | 147/403 [00:38<01:06,  3.86it/s]\u001b[A\n",
            "Iteration:  37%|███▋      | 148/403 [00:38<01:06,  3.86it/s]\u001b[A\n",
            "Iteration:  37%|███▋      | 149/403 [00:38<01:05,  3.87it/s]\u001b[A\n",
            "Iteration:  37%|███▋      | 150/403 [00:38<01:05,  3.87it/s]\u001b[A\n",
            "Iteration:  37%|███▋      | 151/403 [00:39<01:05,  3.87it/s]\u001b[A\n",
            "Iteration:  38%|███▊      | 152/403 [00:39<01:04,  3.87it/s]\u001b[A\n",
            "Iteration:  38%|███▊      | 153/403 [00:39<01:04,  3.86it/s]\u001b[A\n",
            "Iteration:  38%|███▊      | 154/403 [00:39<01:04,  3.88it/s]\u001b[A\n",
            "Iteration:  38%|███▊      | 155/403 [00:40<01:04,  3.87it/s]\u001b[A\n",
            "Iteration:  39%|███▊      | 156/403 [00:40<01:03,  3.87it/s]\u001b[A\n",
            "Iteration:  39%|███▉      | 157/403 [00:40<01:03,  3.87it/s]\u001b[A\n",
            "Iteration:  39%|███▉      | 158/403 [00:40<01:03,  3.87it/s]\u001b[A\n",
            "Iteration:  39%|███▉      | 159/403 [00:41<01:03,  3.86it/s]\u001b[A\n",
            "Iteration:  40%|███▉      | 160/403 [00:41<01:02,  3.87it/s]\u001b[A\n",
            "Iteration:  40%|███▉      | 161/403 [00:41<01:02,  3.87it/s]\u001b[A\n",
            "Iteration:  40%|████      | 162/403 [00:41<01:02,  3.88it/s]\u001b[A\n",
            "Iteration:  40%|████      | 163/403 [00:42<01:02,  3.87it/s]\u001b[A\n",
            "Iteration:  41%|████      | 164/403 [00:42<01:01,  3.87it/s]\u001b[A\n",
            "Iteration:  41%|████      | 165/403 [00:42<01:01,  3.88it/s]\u001b[A\n",
            "Iteration:  41%|████      | 166/403 [00:42<01:01,  3.88it/s]\u001b[A\n",
            "Iteration:  41%|████▏     | 167/403 [00:43<01:00,  3.87it/s]\u001b[A\n",
            "Iteration:  42%|████▏     | 168/403 [00:43<01:00,  3.87it/s]\u001b[A\n",
            "Iteration:  42%|████▏     | 169/403 [00:43<01:00,  3.87it/s]\u001b[A\n",
            "Iteration:  42%|████▏     | 170/403 [00:43<01:00,  3.88it/s]\u001b[A\n",
            "Iteration:  42%|████▏     | 171/403 [00:44<00:59,  3.87it/s]\u001b[A\n",
            "Iteration:  43%|████▎     | 172/403 [00:44<00:59,  3.87it/s]\u001b[A\n",
            "Iteration:  43%|████▎     | 173/403 [00:44<00:59,  3.87it/s]\u001b[A\n",
            "Iteration:  43%|████▎     | 174/403 [00:44<00:59,  3.86it/s]\u001b[A\n",
            "Iteration:  43%|████▎     | 175/403 [00:45<00:59,  3.85it/s]\u001b[A\n",
            "Iteration:  44%|████▎     | 176/403 [00:45<00:58,  3.86it/s]\u001b[A\n",
            "Iteration:  44%|████▍     | 177/403 [00:45<00:58,  3.87it/s]\u001b[A\n",
            "Iteration:  44%|████▍     | 178/403 [00:46<00:58,  3.87it/s]\u001b[A\n",
            "Iteration:  44%|████▍     | 179/403 [00:46<00:57,  3.86it/s]\u001b[A\n",
            "Iteration:  45%|████▍     | 180/403 [00:46<00:57,  3.87it/s]\u001b[A\n",
            "Iteration:  45%|████▍     | 181/403 [00:46<00:57,  3.89it/s]\u001b[A\n",
            "Iteration:  45%|████▌     | 182/403 [00:47<00:56,  3.88it/s]\u001b[A\n",
            "Iteration:  45%|████▌     | 183/403 [00:47<00:56,  3.87it/s]\u001b[A\n",
            "Iteration:  46%|████▌     | 184/403 [00:47<00:56,  3.88it/s]\u001b[A\n",
            "Iteration:  46%|████▌     | 185/403 [00:47<00:56,  3.87it/s]\u001b[A\n",
            "Iteration:  46%|████▌     | 186/403 [00:48<00:55,  3.88it/s]\u001b[A\n",
            "Iteration:  46%|████▋     | 187/403 [00:48<00:55,  3.88it/s]\u001b[A\n",
            "Iteration:  47%|████▋     | 188/403 [00:48<00:55,  3.89it/s]\u001b[A\n",
            "Iteration:  47%|████▋     | 189/403 [00:48<00:55,  3.87it/s]\u001b[A\n",
            "Iteration:  47%|████▋     | 190/403 [00:49<00:54,  3.87it/s]\u001b[A\n",
            "Iteration:  47%|████▋     | 191/403 [00:49<00:54,  3.87it/s]\u001b[A\n",
            "Iteration:  48%|████▊     | 192/403 [00:49<00:54,  3.88it/s]\u001b[A\n",
            "Iteration:  48%|████▊     | 193/403 [00:49<00:54,  3.88it/s]\u001b[A\n",
            "Iteration:  48%|████▊     | 194/403 [00:50<00:53,  3.88it/s]\u001b[A\n",
            "Iteration:  48%|████▊     | 195/403 [00:50<00:53,  3.88it/s]\u001b[A\n",
            "Iteration:  49%|████▊     | 196/403 [00:50<00:53,  3.89it/s]\u001b[A\n",
            "Iteration:  49%|████▉     | 197/403 [00:50<00:53,  3.88it/s]\u001b[A\n",
            "Iteration:  49%|████▉     | 198/403 [00:51<00:52,  3.88it/s]\u001b[A\n",
            "Iteration:  49%|████▉     | 199/403 [00:51<00:52,  3.88it/s]\u001b[A\n",
            "Iteration:  50%|████▉     | 200/403 [00:51<00:52,  3.89it/s]\u001b[A\n",
            "Iteration:  50%|████▉     | 201/403 [00:51<00:51,  3.88it/s]\u001b[A\n",
            "Iteration:  50%|█████     | 202/403 [00:52<00:51,  3.88it/s]\u001b[A\n",
            "Iteration:  50%|█████     | 203/403 [00:52<00:51,  3.88it/s]\u001b[A\n",
            "Iteration:  51%|█████     | 204/403 [00:52<00:51,  3.88it/s]\u001b[A\n",
            "Iteration:  51%|█████     | 205/403 [00:52<00:51,  3.87it/s]\u001b[A\n",
            "Iteration:  51%|█████     | 206/403 [00:53<00:50,  3.87it/s]\u001b[A\n",
            "Iteration:  51%|█████▏    | 207/403 [00:53<00:50,  3.87it/s]\u001b[A\n",
            "Iteration:  52%|█████▏    | 208/403 [00:53<00:50,  3.88it/s]\u001b[A\n",
            "Iteration:  52%|█████▏    | 209/403 [00:54<00:50,  3.87it/s]\u001b[A\n",
            "Iteration:  52%|█████▏    | 210/403 [00:54<00:49,  3.87it/s]\u001b[A\n",
            "Iteration:  52%|█████▏    | 211/403 [00:54<00:49,  3.87it/s]\u001b[A\n",
            "Iteration:  53%|█████▎    | 212/403 [00:54<00:49,  3.86it/s]\u001b[A\n",
            "Iteration:  53%|█████▎    | 213/403 [00:55<00:49,  3.85it/s]\u001b[A\n",
            "Iteration:  53%|█████▎    | 214/403 [00:55<00:49,  3.83it/s]\u001b[A\n",
            "Iteration:  53%|█████▎    | 215/403 [00:55<00:48,  3.84it/s]\u001b[A\n",
            "Iteration:  54%|█████▎    | 216/403 [00:55<00:48,  3.85it/s]\u001b[A\n",
            "Iteration:  54%|█████▍    | 217/403 [00:56<00:48,  3.86it/s]\u001b[A\n",
            "Iteration:  54%|█████▍    | 218/403 [00:56<00:48,  3.85it/s]\u001b[A\n",
            "Iteration:  54%|█████▍    | 219/403 [00:56<00:47,  3.85it/s]\u001b[A\n",
            "Iteration:  55%|█████▍    | 220/403 [00:56<00:47,  3.86it/s]\u001b[A\n",
            "Iteration:  55%|█████▍    | 221/403 [00:57<00:47,  3.83it/s]\u001b[A\n",
            "Iteration:  55%|█████▌    | 222/403 [00:57<00:47,  3.81it/s]\u001b[A\n",
            "Iteration:  55%|█████▌    | 223/403 [00:57<00:47,  3.81it/s]\u001b[A\n",
            "Iteration:  56%|█████▌    | 224/403 [00:57<00:46,  3.82it/s]\u001b[A\n",
            "Iteration:  56%|█████▌    | 225/403 [00:58<00:46,  3.82it/s]\u001b[A\n",
            "Iteration:  56%|█████▌    | 226/403 [00:58<00:46,  3.83it/s]\u001b[A\n",
            "Iteration:  56%|█████▋    | 227/403 [00:58<00:45,  3.84it/s]\u001b[A\n",
            "Iteration:  57%|█████▋    | 228/403 [00:58<00:45,  3.84it/s]\u001b[A\n",
            "Iteration:  57%|█████▋    | 229/403 [00:59<00:45,  3.85it/s]\u001b[A\n",
            "Iteration:  57%|█████▋    | 230/403 [00:59<00:45,  3.84it/s]\u001b[A\n",
            "Iteration:  57%|█████▋    | 231/403 [00:59<00:44,  3.86it/s]\u001b[A\n",
            "Iteration:  58%|█████▊    | 232/403 [01:00<00:44,  3.86it/s]\u001b[A\n",
            "Iteration:  58%|█████▊    | 233/403 [01:00<00:44,  3.85it/s]\u001b[A\n",
            "Iteration:  58%|█████▊    | 234/403 [01:00<00:44,  3.84it/s]\u001b[A\n",
            "Iteration:  58%|█████▊    | 235/403 [01:00<00:43,  3.84it/s]\u001b[A\n",
            "Iteration:  59%|█████▊    | 236/403 [01:01<00:43,  3.83it/s]\u001b[A\n",
            "Iteration:  59%|█████▉    | 237/403 [01:01<00:43,  3.82it/s]\u001b[A\n",
            "Iteration:  59%|█████▉    | 238/403 [01:01<00:43,  3.83it/s]\u001b[A\n",
            "Iteration: 100%|██████████| 403/403 [01:44<00:00,  3.86it/s]\u001b[A\n",
            "Epoch: 100%|██████████| 5/5 [09:05<00:00, 109.98s/it]\n"
          ],
          "name": "stderr"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "InClOPvSh6C-",
        "colab_type": "text"
      },
      "source": [
        "# 7. Save / Load model"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "HkYNtXZFsgF4",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# Save the trained model's weights (state_dict only, not the architecture):\n",
        "torch.save(model.state_dict(), '/content/gdrive/My Drive/Colab Notebooks/data/das_model_train2')"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "KrWDYmSYMMqf",
        "colab": {}
      },
      "source": [
        "# Load a state_dict saved from a model trained on 8 GPUs with nn.DataParallel,\n",
        "# whose parameter keys therefore carry a 'module.' prefix:\n",
        "state_dict = torch.load('/content/gdrive/My Drive/Colab Notebooks/data/das_model_train')\n",
        "\n",
        "# Build a new OrderedDict with the 'module.' prefix stripped,\n",
        "# so the keys match an unwrapped model:\n",
        "from collections import OrderedDict\n",
        "new_state_dict = OrderedDict()\n",
        "for k, v in state_dict.items():\n",
        "    name = k[7:] if k.startswith('module.') else k  # strip 'module.'\n",
        "    new_state_dict[name] = v"
      ],
      "execution_count": 0,
      "outputs": []
    },
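    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Stripping the `module.` prefix is needed because `nn.DataParallel` wraps the model and registers it as `self.module`, so every saved parameter key starts with `module.`. A minimal sketch of the same fix on a hypothetical stand-in module (not the R-BERT model itself):\n",
        "\n",
        "```python\n",
        "import torch\n",
        "import torch.nn as nn\n",
        "from collections import OrderedDict\n",
        "\n",
        "net = nn.Linear(4, 2)  # hypothetical stand-in for the trained model\n",
        "wrapped = nn.DataParallel(net)\n",
        "\n",
        "# DataParallel prefixes every parameter key with 'module.':\n",
        "print(list(wrapped.state_dict().keys()))  # ['module.weight', 'module.bias']\n",
        "\n",
        "# Strip the prefix so an unwrapped module can load the weights:\n",
        "fixed = OrderedDict(\n",
        "    (k[len('module.'):] if k.startswith('module.') else k, v)\n",
        "    for k, v in wrapped.state_dict().items())\n",
        "net.load_state_dict(fixed)\n",
        "```"
      ]
    },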
    {
      "cell_type": "code",
      "metadata": {
        "id": "dnSVUXX7vrQN",
        "colab_type": "code",
        "outputId": "1ccdd085-3c4f-441e-cfac-fefca2936817",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "device = torch.device(\"cuda\" if torch.cuda.is_available() and not no_cuda else \"cpu\")\n",
        "\n",
        "\n",
        "# Rebuild the architecture and load the cleaned state_dict into it:\n",
        "model = BertForSequenceClassification.from_pretrained(pretrained_model_name, config=bertconfig)\n",
        "model.load_state_dict(new_state_dict)\n",
        "model.to(device)"
      ],
      "execution_count": 44,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "BertForSequenceClassification(\n",
              "  (bert): BertModel(\n",
              "    (embeddings): BertEmbeddings(\n",
              "      (word_embeddings): Embedding(30522, 768, padding_idx=0)\n",
              "      (position_embeddings): Embedding(512, 768)\n",
              "      (token_type_embeddings): Embedding(2, 768)\n",
              "      (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "      (dropout): Dropout(p=0.1, inplace=False)\n",
              "    )\n",
              "    (encoder): BertEncoder(\n",
              "      (layer): ModuleList(\n",
              "        (0): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (1): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (2): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (3): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (4): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (5): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (6): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (7): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (8): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (9): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (10): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "        (11): BertLayer(\n",
              "          (attention): BertAttention(\n",
              "            (self): BertSelfAttention(\n",
              "              (query): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (key): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (value): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "            (output): BertSelfOutput(\n",
              "              (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "              (dropout): Dropout(p=0.1, inplace=False)\n",
              "            )\n",
              "          )\n",
              "          (intermediate): BertIntermediate(\n",
              "            (dense): Linear(in_features=768, out_features=3072, bias=True)\n",
              "          )\n",
              "          (output): BertOutput(\n",
              "            (dense): Linear(in_features=3072, out_features=768, bias=True)\n",
              "            (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)\n",
              "            (dropout): Dropout(p=0.1, inplace=False)\n",
              "          )\n",
              "        )\n",
              "      )\n",
              "    )\n",
              "    (pooler): BertPooler(\n",
              "      (dense): Linear(in_features=768, out_features=768, bias=True)\n",
              "      (activation): Tanh()\n",
              "    )\n",
              "  )\n",
              "  (dropout): Dropout(p=0.1, inplace=False)\n",
              "  (classifier): Linear(in_features=2304, out_features=18, bias=True)\n",
              ")"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 44
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "uDJa8ZJBh_PR",
        "colab_type": "text"
      },
      "source": [
        "# 8. Evaluate!"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "b2rLtC2FVZpz",
        "colab": {}
      },
      "source": [
        "# Metrics for evaluation (accuracy, F1 score), adapted from the official SemEval Task 8 scoring script\n",
        "def acc_and_f1(preds, labels, average='macro'):\n",
        "    acc = simple_accuracy(preds, labels)\n",
        "    f1 = f1_score(y_true=labels, y_pred=preds, average=average)\n",
        "    return {\"acc\": acc,\n",
        "            \"f1\": f1,\n",
        "            \"acc_and_f1\": (acc + f1) / 2}\n",
        "\n",
        "def compute_metrics(task_name, preds, labels):\n",
        "    # task_name is accepted for interface compatibility but is not used here.\n",
        "    assert len(preds) == len(labels)\n",
        "    return acc_and_f1(preds, labels)\n",
        "\n",
        "def simple_accuracy(preds, labels):\n",
        "    return (preds == labels).mean()"
      ],
      "execution_count": 0,
      "outputs": []
    },
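    {
      "cell_type": "markdown",
      "metadata": {
        "id": "metricSanityMd",
        "colab_type": "text"
      },
      "source": [
        "A quick sanity check of the metric helpers on toy arrays (illustrative only; the label values below are made up):"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "metricSanityCode",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "from sklearn.metrics import f1_score\n",
        "import numpy as np\n",
        "\n",
        "# Toy example: 3 of the 4 predictions match the labels, so acc = 0.75.\n",
        "toy_preds = np.array([0, 1, 1, 2])\n",
        "toy_labels = np.array([0, 1, 2, 2])\n",
        "acc_and_f1(toy_preds, toy_labels)"
      ],
      "execution_count": 0,
      "outputs": []
    },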
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "-cAZ3OCmVZmh",
        "colab": {}
      },
      "source": [
        "# Evaluation\n",
        "\n",
        "def evaluate(model, tokenizer, prefix=\"\"):\n",
        "    '''\n",
        "    Reads the test set, runs the model on it, writes the predicted labels\n",
        "    to disk, and returns the metrics, predictions, and gold labels.\n",
        "    '''\n",
        "\n",
        "    # Task name, passed through to compute_metrics:\n",
        "    eval_task = task_name\n",
        "\n",
        "    # Save the evaluation metrics into results:\n",
        "    results = {}\n",
        "\n",
        "    # Load the test set and convert to features and to tensors:\n",
        "    examples = get_test_examples('/content/gdrive/My Drive/Colab Notebooks/data/')\n",
        "    features = convert_examples_to_features(\n",
        "        examples, labels, max_seq_len, tokenizer, \"classification\", use_entity_indicator)\n",
        "\n",
        "    all_input_ids = torch.tensor(\n",
        "            [f.input_ids for f in features], dtype=torch.long)\n",
        "    all_input_mask = torch.tensor(\n",
        "        [f.input_mask for f in features], dtype=torch.long)\n",
        "    all_segment_ids = torch.tensor(\n",
        "        [f.segment_ids for f in features], dtype=torch.long)\n",
        "    all_e1_mask = torch.tensor(\n",
        "        [f.e1_mask for f in features], dtype=torch.long)  # add e1 mask\n",
        "    all_e2_mask = torch.tensor(\n",
        "        [f.e2_mask for f in features], dtype=torch.long)  # add e2 mask\n",
        "\n",
        "    all_label_ids = torch.tensor(\n",
        "        [f.label_id for f in features], dtype=torch.long)\n",
        "\n",
        "    eval_dataset = TensorDataset(all_input_ids, all_input_mask,all_segment_ids, all_label_ids, all_e1_mask, all_e2_mask)\n",
        "\n",
        "    # Total eval batch size = per-GPU batch size x number of GPUs:\n",
        "    eval_batch_size = per_gpu_eval_batch_size * max(1, n_gpu)\n",
        "\n",
        "    # Sequential sampling keeps predictions aligned with the test set order:\n",
        "    eval_sampler = SequentialSampler(eval_dataset)\n",
        "    eval_dataloader = DataLoader(\n",
        "        eval_dataset, sampler=eval_sampler, batch_size=eval_batch_size)\n",
        "\n",
        "    # Evaluate:\n",
        "    logger.info(\"***** Running evaluation {} *****\".format(prefix))\n",
        "    logger.info(\"  Num examples = %d\", len(eval_dataset))\n",
        "    logger.info(\"  Batch size = %d\", eval_batch_size)\n",
        "    eval_loss = 0.0\n",
        "    nb_eval_steps = 0\n",
        "    preds = None\n",
        "    out_label_ids = None\n",
        "\n",
        "    # Loop through the test set, batch by batch:\n",
        "\n",
        "    for batch in tqdm(eval_dataloader, desc=\"Evaluating\"):\n",
        "        model.eval()\n",
        "        batch = tuple(t.to(device) for t in batch)\n",
        "\n",
        "        with torch.no_grad():\n",
        "            inputs = {'input_ids':      batch[0],\n",
        "                      'attention_mask': batch[1],\n",
        "                      'token_type_ids': batch[2],\n",
        "                      'labels':      batch[3],\n",
        "                      'e1_mask': batch[4],\n",
        "                      'e2_mask': batch[5],\n",
        "                      }\n",
        "            outputs = model(**inputs)\n",
        "            tmp_eval_loss, logits = outputs[:2]\n",
        "\n",
        "            eval_loss += tmp_eval_loss.mean().item()\n",
        "        nb_eval_steps += 1\n",
        "\n",
        "        # Extract the predictions from the model's output:\n",
        "        if preds is None:\n",
        "            preds = logits.detach().cpu().numpy()\n",
        "            out_label_ids = inputs['labels'].detach().cpu().numpy()\n",
        "        else:\n",
        "            preds = np.append(preds, logits.detach().cpu().numpy(), axis=0)\n",
        "            out_label_ids = np.append(\n",
        "                out_label_ids, inputs['labels'].detach().cpu().numpy(), axis=0)\n",
        "            \n",
        "    # Average the loss over batches and take the argmax class as the prediction:\n",
        "    eval_loss = eval_loss / nb_eval_steps\n",
        "    preds = np.argmax(preds, axis=1)\n",
        "\n",
        "\n",
        "    result = compute_metrics(eval_task, preds, out_label_ids)\n",
        "    results.update(result)\n",
        "\n",
        "    logger.info(\"***** Eval results {} *****\".format(prefix))\n",
        "    for key in sorted(result.keys()):\n",
        "        logger.info(\"  %s = %s\", key, str(result[key]))\n",
        "    \n",
        "    # Write the predicted relation labels to file, one line per test example:\n",
        "    output_eval_file = \"/content/gdrive/My Drive/Colab Notebooks/data/eval/results2.txt\"\n",
        "    with open(output_eval_file, \"w\") as writer:\n",
        "        for i in range(len(preds)):\n",
        "            writer.write(\"%d\\t%s\\n\" % (i + 8001, str(RELATION_LABELS[preds[i]])))\n",
        "                \n",
        "    return result, preds, out_label_ids"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "id": "wE8DJN7UVcF5",
        "outputId": "29a1122b-c4fb-4f15-adc6-94e96e32b53d",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "import numpy as np\n",
        "from scipy.stats import pearsonr, spearmanr\n",
        "from sklearn.metrics import matthews_corrcoef, f1_score\n",
        "\n",
        "RELATION_LABELS = ['causes1-causes2(e1,e2)',\n",
        "                   'causes2-causes1(e2,e1)',\n",
        "                   'contraindicates1-contraindicates2(e1,e2)',\n",
        "                   'contraindicates2-contraindicates1(e2,e1)',\n",
        "                   'location1-location2(e1,e2)',\n",
        "                   'location2-location1(e2,e1)',\n",
        "                   'treats1-treats2(e1,e2)',\n",
        "                   'treats2-treats1(e2,e1)',\n",
        "                   'diagnosed by1-diagnosed by2(e1,e2)',\n",
        "                   'diagnosed by2-diagnosed by1(e2,e1)']\n",
        "\n",
        "per_gpu_eval_batch_size = 4\n",
        "\n",
        "result = evaluate(model, tokenizer)\n",
        "result"
      ],
      "execution_count": 96,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "\n",
            "Evaluating:   0%|          | 0/147 [00:00<?, ?it/s]\u001b[A\n",
            "Evaluating: 100%|██████████| 147/147 [00:07<00:00, 18.54it/s]\u001b[A"
          ],
          "name": "stderr"
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "({'acc': 0.5316239316239316,\n",
              "  'acc_and_f1': 0.3817510035286296,\n",
              "  'f1': 0.23187807543332759},\n",
              " array([2, 0, 0, 2, 7, 8, 8, 2, 0, 8, 0, 1, 0, 8, 2, 1, 2, 7, 1, 2, 8, 0,\n",
              "        1, 2, 6, 7, 2, 1, 8, 0, 0, 7, 0, 2, 2, 0, 7, 0, 0, 8, 0, 1, 0, 1,\n",
              "        8, 0, 8, 2, 0, 0, 7, 8, 2, 8, 0, 8, 0, 0, 1, 0, 0, 1, 0, 1, 1, 1,\n",
              "        0, 1, 8, 7, 1, 0, 1, 8, 8, 0, 8, 7, 0, 7, 0, 0, 0, 2, 8, 2, 1, 0,\n",
              "        1, 1, 1, 0, 0, 0, 8, 1, 0, 1, 0, 0, 8, 1, 1, 0, 0, 7, 0, 1, 1, 1,\n",
              "        0, 1, 0, 2, 7, 0, 7, 0, 5, 7, 7, 2, 1, 8, 8, 2, 0, 8, 0, 1, 7, 8,\n",
              "        0, 0, 7, 8, 0, 1, 0, 7, 8, 0, 0, 7, 0, 0, 1, 0, 0, 5, 0, 8, 8, 8,\n",
              "        1, 0, 7, 1, 8, 8, 0, 0, 8, 7, 2, 1, 1, 0, 1, 9, 5, 2, 8, 2, 0, 8,\n",
              "        1, 2, 1, 1, 1, 2, 1, 0, 2, 8, 8, 7, 7, 8, 1, 0, 0, 2, 1, 8, 6, 0,\n",
              "        7, 7, 0, 8, 8, 8, 2, 2, 0, 2, 7, 7, 2, 0, 2, 7, 2, 0, 0, 0, 8, 1,\n",
              "        1, 1, 0, 0, 2, 2, 2, 1, 1, 1, 1, 8, 2, 0, 2, 0, 0, 5, 2, 1, 2, 1,\n",
              "        1, 1, 8, 8, 0, 8, 7, 7, 7, 7, 7, 8, 8, 8, 8, 1, 0, 8, 2, 7, 8, 7,\n",
              "        0, 2, 7, 8, 0, 8, 0, 1, 0, 0, 1, 1, 0, 7, 0, 7, 2, 2, 0, 2, 2, 1,\n",
              "        2, 1, 1, 0, 2, 2, 2, 1, 7, 0, 2, 0, 0, 2, 1, 1, 1, 2, 0, 2, 2, 7,\n",
              "        2, 0, 2, 1, 2, 2, 0, 2, 1, 0, 0, 1, 0, 1, 2, 0, 7, 2, 0, 7, 0, 1,\n",
              "        0, 0, 0, 0, 1, 2, 8, 1, 2, 2, 1, 1, 7, 2, 2, 0, 8, 2, 8, 2, 2, 1,\n",
              "        2, 0, 1, 8, 7, 8, 0, 0, 2, 1, 1, 2, 0, 1, 0, 8, 8, 7, 0, 2, 0, 2,\n",
              "        2, 2, 2, 2, 0, 1, 0, 8, 2, 1, 1, 2, 8, 0, 1, 1, 2, 7, 2, 2, 5, 2,\n",
              "        7, 1, 7, 0, 1, 8, 0, 1, 0, 7, 1, 1, 8, 2, 0, 2, 0, 1, 1, 0, 8, 2,\n",
              "        2, 1, 2, 1, 7, 0, 7, 0, 0, 1, 0, 5, 7, 0, 2, 8, 8, 0, 8, 7, 8, 0,\n",
              "        7, 0, 8, 8, 8, 8, 8, 7, 0, 0, 8, 1, 1, 8, 8, 8, 8, 7, 8, 8, 8, 7,\n",
              "        7, 8, 8, 8, 0, 0, 0, 8, 8, 8, 8, 7, 0, 8, 8, 8, 8, 7, 8, 8, 8, 7,\n",
              "        8, 8, 8, 8, 7, 7, 7, 0, 8, 8, 8, 8, 8, 8, 8, 8, 0, 0, 8, 8, 7, 8,\n",
              "        7, 8, 8, 8, 8, 8, 8, 8, 7, 8, 8, 8, 7, 8, 7, 8, 8, 8, 8, 8, 7, 8,\n",
              "        8, 8, 8, 8, 7, 7, 8, 8, 8, 0, 7, 0, 8, 8, 2, 8, 8, 8, 8, 8, 8, 0,\n",
              "        8, 8, 8, 8, 8, 8, 8, 8, 7, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 0, 8, 8,\n",
              "        8, 8, 8, 8, 1, 8, 8, 8, 8, 7, 8, 7, 8]),\n",
              " array([ 6, 10,  9,  2,  7,  8,  2,  2,  1,  8,  2, 10,  1,  2,  2,  2,  2,\n",
              "         7,  1,  1,  8,  2,  9,  1,  1,  1,  2,  1,  8,  1,  5,  1,  6,  1,\n",
              "        10,  2,  7,  9,  1,  8,  6,  1,  1,  2,  2,  2,  8,  6,  2,  1,  7,\n",
              "         7,  2,  4,  2,  8,  6,  8,  1,  7,  7,  1, 10,  1,  1,  2,  2,  1,\n",
              "         8,  7, 10,  6,  7,  8,  8,  6,  2,  1, 10,  7, 10,  1,  7,  2,  8,\n",
              "         1,  2,  1,  5,  1,  5,  8,  1,  1,  8,  1,  6,  6,  5,  6,  8,  9,\n",
              "         5,  2,  6,  7,  9,  5,  1,  1,  1,  2,  1,  2,  7,  8,  9,  6,  9,\n",
              "         7,  7, 10,  9,  8, 10,  2,  1,  8,  6,  1,  7,  8,  7,  6,  7,  8,\n",
              "         9,  9,  2,  7,  8,  1,  2,  7,  6,  2,  1,  1,  5,  6,  1,  8, 10,\n",
              "        10,  5,  5,  8,  1,  8,  8,  3,  2, 10,  7,  1,  1,  1,  2,  1, 10,\n",
              "         1,  6,  8,  2,  1,  8,  9,  6,  7,  5,  1,  2,  1,  1,  6, 10,  1,\n",
              "         7,  9,  8,  1,  1,  1,  2,  5,  8,  2,  2,  7,  7,  2,  8, 10,  8,\n",
              "         2,  2,  1,  2,  7,  7,  6,  9,  2,  1,  2,  2,  1,  1,  1,  1,  1,\n",
              "         2,  2,  2,  2,  2,  2,  1,  1,  2,  1,  2,  2,  1,  2, 10,  1,  9,\n",
              "         2, 10,  2,  5,  1,  1,  8,  8,  2, 10,  7,  9,  7,  7,  7, 10,  8,\n",
              "         8,  8,  3,  7,  8,  2,  7,  8,  7,  1,  2,  4,  8,  1,  2,  2,  2,\n",
              "         1,  2,  1,  1,  1,  1,  1,  1,  2,  2,  1,  2,  2,  1,  1,  1,  1,\n",
              "         1,  2,  1,  2,  1,  2,  1,  1,  2,  1,  2,  2,  1,  1,  2,  1,  1,\n",
              "         2,  1,  2,  1,  2,  1,  2,  2,  2,  2,  1,  2,  1,  2,  1,  1,  2,\n",
              "         1,  1,  1,  1,  1,  1,  1,  2,  1,  1,  1,  1,  1,  1,  2,  1,  2,\n",
              "         1,  1,  1,  2,  2,  1,  2,  2,  1,  1,  2,  2,  1,  2,  1,  2,  1,\n",
              "         2,  1,  2,  2,  1,  2,  2,  1,  1,  2,  2,  2,  1,  1,  1,  1,  2,\n",
              "         2,  2,  2,  1,  1,  1,  2,  2,  1,  1,  1,  2,  2,  2,  2,  1,  2,\n",
              "         1,  2,  2,  2,  2,  1,  1,  1,  1,  2,  2,  1,  1,  2,  1,  1,  1,\n",
              "         2,  2,  2,  2,  2,  1,  1,  1,  2,  2,  2,  2,  2,  1,  1,  1,  1,\n",
              "         1,  2,  1,  2,  1,  1,  1,  2,  8,  8,  8,  8,  7,  8,  7,  7,  7,\n",
              "         8,  8,  8,  8,  8,  7,  7,  7,  8,  8,  7,  8,  8,  8,  8,  7,  8,\n",
              "         8,  8,  7,  7,  8,  8,  8,  7,  7,  7,  8,  8,  8,  8,  7,  7,  8,\n",
              "         8,  8,  8,  7,  8,  8,  8,  7,  8,  8,  8,  8,  7,  7,  7,  7,  8,\n",
              "         8,  8,  8,  8,  8,  8,  8,  8,  7,  8,  8,  7,  8,  7,  8,  8,  8,\n",
              "         8,  8,  8,  8,  8,  8,  8,  8,  7,  8,  7,  8,  8,  8,  8,  8,  7,\n",
              "         8,  8,  8,  8,  8,  7,  7,  8,  8,  8,  7,  7,  7,  8,  7,  8,  8,\n",
              "         8,  8,  8,  8,  8,  7,  8,  8,  8,  8,  8,  8,  8,  8,  7,  8,  8,\n",
              "         8,  8,  8,  7,  8,  8,  8,  8,  7,  8,  8,  8,  8,  8,  8,  7,  8,\n",
              "         8,  8,  8,  7,  8,  7,  8]))"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 96
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "EmpSmesEuv2z",
        "colab_type": "text"
      },
      "source": [
        "# The model predicts the correct relation for over half of the test examples!\n",
        "\n",
        "Evaluation results:\n",
        "\n",
        "accuracy: 0.532\n",
        "\n",
        "F1 score (macro average): 0.232\n",
        "\n",
        "The macro-averaged F1 is far below the accuracy, which suggests the model does well on the frequent relation classes but poorly on the rare ones."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "lDHd-4Qy69En",
        "colab_type": "text"
      },
      "source": [
        "Check the individual predictions by running through the test file sentence by sentence:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "aA8kuVyjsEF8",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# Dict mapping each relation label to its tab-separated form in the test file:\n",
        "\n",
        "RELATIONZ = {'causes1-causes2(e1,e2)' : '1\tcauses1\tcauses2\t1',\n",
        "'causes2-causes1(e2,e1)' : '2\tcauses2\tcauses1\t2',\n",
        "'contraindicates1-contraindicates2(e1,e2)' : '3\tcontraindicates1\tcontraindicates2\t1',\n",
        "'contraindicates2-contraindicates1(e2,e1)' : '4\tcontraindicates2\tcontraindicates1\t2',\n",
        "'location1-location2(e1,e2)' : '5\tlocation1\tlocation2\t1',\n",
        "'location2-location1(e2,e1)' : '6\tlocation2\tlocation1\t2',\n",
        "'treats1-treats2(e1,e2)' : '7\ttreats1\ttreats2\t1',\n",
        "'treats2-treats1(e2,e1)' : '8\ttreats2\ttreats1\t2', \n",
        "'diagnosed by1-diagnosed by2(e1,e2)': '9\tdiagnosed by1\tdiagnosed by2\t1',\n",
        "'diagnosed by2-diagnosed by1(e2,e1)' : '10\tdiagnosed by2\tdiagnosed by1\t2'}"
      ],
      "execution_count": 0,
      "outputs": []
    },
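    {
      "cell_type": "markdown",
      "metadata": {
        "id": "relationzExampleMd",
        "colab_type": "text"
      },
      "source": [
        "For illustration, each value in RELATIONZ is the tab-separated tail of a test-file line: the relation id, the two relation mentions, and the direction:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "relationzExampleCode",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# Illustrative lookup on the dict defined above:\n",
        "RELATIONZ['treats1-treats2(e1,e2)'].split('\\t')  # ['7', 'treats1', 'treats2', '1']"
      ],
      "execution_count": 0,
      "outputs": []
    },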
    {
      "cell_type": "code",
      "metadata": {
        "id": "V6PU7S5tx8as",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "predictions = []\n",
        "with open('/content/gdrive/My Drive/Colab Notebooks/data/eval/results2.txt') as f:\n",
        "  for l in f.readlines():\n",
        "    predictions.append(l.split('\t')[1].strip())"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "o05A1uLvtsdf",
        "colab_type": "code",
        "outputId": "2b459fb3-6953-48c1-8ce6-fe6ffb51ade8",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "# Print the predictions that match the gold relation in the test file:\n",
        "\n",
        "with open('/content/gdrive/My Drive/Colab Notebooks/data/test.tsv') as f:\n",
        "  correct = set()\n",
        "  for i, l in enumerate(f.readlines()):\n",
        "    if RELATIONZ[predictions[i]] in l[-30:]:\n",
        "        print(predictions[i])\n",
        "        print(l[6:])\n",
        "        correct.add((l, predictions[i]))"
      ],
      "execution_count": 13,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "causes1-causes2(e1,e2)\n",
            "therapeutic results of Lp TAE (transcatheter arterial embolization in the presence or absence of Gelfoam particles preceded by the infusion of a mixture of lipiodol and an anticancer drug via the proper hepatic artery) or DSM TAE (transcatheter arterial embolization with degradable starch microspheres and the arterial injection of anticancer drugs via the hepatic artery) combined with $HYPERTHERMIA$ were evaluated in 30 patients with #HEPATOCELLULAR CARCINOMA# (HCC), 5 subjects with hepatic cholangiocarcinoma, and 22 patients with metastatic liver carcinoma.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "1 yr old woman with gallbladder stones, diabetes, weight loss, $DIARRHEA$ and steatorrhea, #IMMUNOHISTOCHEMICAL DIAGNOSIS OF SOMATOSTATINOMA# (liver biopsy) and high plasma values of somatostatin was studied by somatostatin receptor scintigraphy.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "ANTAVIRUS# PULMONARY SYNDROME$ (HPS) is a viral infection from a new strain of #HANTAVIRUS#\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ase of a 73 years old man with macronodular liver cirrhosis, ascites, $JAUNDICE$ and a #PRIMARY HEPATOCELLULAR CARCINOMA# is presented.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "1, TWINKLE and POLG genes affect mtDNA stability and are involved in autosomal dominant PEO, while mutations in POLG are responsible for numerous clinical presentations, including autosomal recessive PEO, sensory ataxic neuropathy, dysarthria and ophthalmoparesis (SANDO), $SPINO CEREBELLAR ATAXIA$ and epilepsy (SCAE) or #ALPERS SYNDROME#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "addition to the characterization of the degree of inflammation and fibrosis, the evaluation of a $LIVER BIOPSY SPECIMEN$ from a patient with #CHRONIC HEPATITIS# includes a description of all of the findings present (e.g., steatosis, bile duct changes, granulomas, type of inflammatory cells, presence of iron, and Mallory hyaline) or a notation of their absence.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "te idiopathic thrombocytopenic purpura is the most common cause of thrombocytopenia in childhood, and diagnosis of $IDIOPATHIC THROMBOCYTOPENIC PURPURA$ is made clinically based on the exclusion of other causes of #THROMBOCYTOPENIA#.\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "s results in the amplification of any subsequent $ALLERGIC REACTION$ contributing to the #CHRONIC ALLERGIC STATE#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "UCTOSE INTOLERANCE$ is caused by a #DEFICIT OF THE LIVER# aldolase B enzyme.\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "dictive model for $PAIN$ recurrence after posterior fossa surgery for #TRIGEMINAL NEURALGIA#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "e postpartum eclampsia without the classical $PRE ECLAMPTIC SIGNS OEDEMA PROTEINURIA$ and hypertension is a rarely noticed #COMPLICATION OF PREGNANCY#\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "determine if termination of hemodynamically tolerated, sustained ventricular tachycardia during intravenous infusion of procainamide predicts the success of procainamide therapy in preventing induction of $TACHYCARDIA$ 15 patients with inducible, #SUSTAINED VENTRICULAR TACHYCARDIA# in the setting of chronic coronary artery disease were studied.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "$CARIES$ protective effect of the #FETAL FLUORIDE# however, has been substantiated by pertinent research.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            " principal differences between these vaccines are the transmission of live vaccine viruses from recipieits to their contacts and the occurrence of occasional cases of $PARALYTIC POLIOMYELITIS$ associated with use of #LIVE POLIOVIRUS VACCINE#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "UBERCULOUS INVOLVEMENT OF PITUITARY$ is extremely rare and is usually not suspected while dealing with #PITUITARY ADENOMAS# even in patients with history of systemic tuberculosis.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "LUSTER HEADACHE$ the most severe of #PRIMARY HEADACHE CONDITIONS# for functional and social impairment it provokes, has been recently the object of a great amount of clinical, physiopathological, surgical and functional neuroradiological studies aimed to uncover the real mechanisms which underlie its disabling manifestations.\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            " report a boy $WITH MARKED CLINICAL KYPHOSIS IN WHOM THE$ diagnosis of #MPS II# was proved by demonstrating a severe deficiency of serum and leucocyte iduronate sulphate sulphatase and an accelerated incorporation of radiosulphate into his cultured fibroblast glycosaminoglycans, which could not be corrected by the product of other typed reference #MPS II# cells.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "e M3 subtype of AML, also known as acute promyelocytic leukemia, is almost universally treated with the drug ATRA (all trans retinoic acid) in addition to induction chemotherapy.[36][37][38] Care must be taken to prevent disseminated $INTRAVASCULAR COAGULATION (DIC$ complicating the treatment of #APL# when the promyelocytes release the contents of their granules into the peripheral circulation.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "wever, radiotherapy alone could not resolve the problem of $PATHOLOGICAL FRACTURE$ which is an important complication of #SPINAL METASTASES#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "e activities of nucleoside diphosphatase in various rat $ASCITES$ cells of #HEPATOMA# and fetal and neonatal rat liver were much lower than that of normal adult rat liver.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "location2-location1(e2,e1)\n",
            "is study expands the geographical map of the distribution of $BCL 2 GENE$ rearrangement in #FOLLICULAR LYMPHOMA# patients in the Middle East region.\t6\tlocation2\tlocation1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "radioimmunoassay for neuron specific enolase (NSE), a marker of $NEUROENDOCRINE DIFFERENTIATION$ has been evaluated in #SMALL CELL LUNG CANCER (SCLC)#.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "treats2-treats1(e2,e1)\n",
            "Adherence to a gluten free diet in patients with celiac disease can lead to the correction of $IRON DEFICIENCY ANEMIA$ and the replacement of iron stores, and it can prevent the recurrence of #IRON# deficiency.\t8\ttreats2\ttreats1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "e $HYPERSENSITIVITY$ proved to be due to an #ALLERGY# to a reaction product, and the simultaneous presence of the preservatives 1,3,5 trihydroxyethylhexahydrotriazine and thymol was found to be necessary for the occurrence of a positive patch test reaction.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ECONIUM PERITONITIS$ is a chemical peritonitis which occurs following #BOWEL PERFORATION# during fetal life.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "e $SPIROCHETE BORRELIA BURGDORFERI$ is the causative agent of #LYME DISEASE# the leading vector borne illness in the United States.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "e former is consistent with the concept of an agent with an independent genome while the latter is consistent with the concept that 'strain of agent' is another expression of the involvement of $PRION PROTEIN$ in the pathogenesis of #TRANSMISSIBLE SPONGIFORM ENCEPHALOPATHY#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ch comparison gave the same conclusion: CMPV is the closest known virus to $VARIOLA VIRUS$ the cause of #SMALLPOX#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "V-specific DNA was demonstrated in 15 lymphocyte preparations of nine patients with $VARICELLA$ and in one with disseminated zoster out of five patients with #ZOSTER#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "e aim of the study was to assess the therapeutic efficacy of loratadine on $PRURITUS$ in patients with atopic dermatitis, considering the patients' sensation of #ITCH#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "abetic neuropathy is common in patients with $DIABETES MELLITUS$, and 7.5% of diabetics experience pain from #DIABETIC NEUROPATHY#\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "patient presented with a $HYPERSENSITIVITY REACTION$ to the aromatic anticonvulsants which evolved into #STEVENS-JOHNSON SYNDROME# and was complicated by the presence of adult respiratory distress syndrome\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "lergen specific immunotherapy has been shown to be effective in rigorous double-blind placebo-controlled clinical trials in both children and adults A recent WHO position paper stated that immunotherapy is an effective treatment for patients with allergic rhinitis/conjunctivitis, $ALLERGIC ASTHMA$ and #ALLERGIC REACTIONS# from stinging insects and is thought to be more effective in children than in adults\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "e material investigated consisted of two spleens resected at gastrectomy and one resected because of $SPLENOMEGALY$ in a case of #HAIRY CELL LEUKAEMIA#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ARIOLA MAJOR VIRUS$ the causative agent of #SMALLPOX# encodes the dual specificity H1 phosphatase.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            ",   116  Nocardiosis  Tetracyclines are alternative to co trimoxazole for treatment of $NOCARDIOSIS$   +    caused by  #NOCARDIA#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "rch Hemoglobinuria  $MARCH HEMOGLOBINURIA$ a disorder that somewhat resembles #MICROANGIOPATHIC HEMOLYSIS# usually occurs in young persons after prolonged marching or running or playing on bongo drums.\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "uctuations in cognition, parkinsonian symptoms, $WELL FORMED VISUAL HALLUCINATIONS$ and relative preservation of short term memory suggest #LEWY BODY DEMENTIA# rather than Alzheimer's disease (see  Table 213 5.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ONADOTROPHIN INDEPENDENT FAMILIAL SEXUAL PRECOCITY$ with premature Leydig and germinal cell maturation (familial testotoxicosis): effects of a potent #LUTEINIZING HORMONE RELEASING FACTOR AGONIST AND MEDROXYPROGESTERONE# acetate therapy in four cases.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ISTERIA INFECTIONS$  Treatment of infections caused by  #L MONOCYTOGENES#  (e.g., infections during pregnancy, granulomatosis infantiseptica, sepsis, meningitis, endocarditis, foodborne infections.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "YOCLONUS$ (a jerky reaction, particularly prominent after a startle) is an important physical finding, but its absence does not rule out #CJD#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "VERDOSE$ can result from inhalation, injection, or absorption of the #DRUG# from the gastrointestinal tract ('body packing')\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "PERTENSION  Although mild or moderate hypertension does not usually cause headache, $SEVERE HYPERTENSION$ from the following conditions can cause headache: acute pressor response to exogenous agents, #PHEOCHROMOCYTOMA# malignant hypertension, and preeclampsia and eclampsia.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "IN RESULTS  Pregabalin, ≥ 300 mg/d, reduced pain in patients with postherpetic neuralgia, $PAIN$ful diabetic neuropathy, and #FIBROMYALGIA#; 600 mg/d reduced pain in patients with central neuropathic pain (Table.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            " all of these pathway defects, $HEMOLYTIC ANEMIA$ occurs only in homozygotes, and the exact mechanism of #HEMOLYSIS# is unknown.\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ETASTATIC BASAL CELL CARCINOMA$ to the bone in #GORLIN SYNDROME#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ONJUGATED HYPERBILIRUBINEMIA$ due to impaired excretion can result from #DUBIN JOHNSON SYNDROME#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "  Viral infections (e.g., $CYTOMEGALOVIRUS [CMV] INFECTIONS$ herpes simplex, #HERPES ZOSTER# reported more frequently in cardiac transplant recipients receiving mycophenolate than in those receiving azathioprine.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "edictors of $EPILEPSY$ in children who have experienced #FEBRILE SEIZURES#\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "  Trichostrongyliasis  Treatment of $TRICHOSTRONGYLIASIS$   +    caused by  #TRICHOSTRONGYLUS#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ROGRESSIVE MULTIFOCAL LEUKOENCEPHALOPATHY$ and other disorders caused by #JC VIRUS#: clinical features and pathogenesis.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "2 ,   321  Data are limited regarding safety of repeated use of co trimoxazole in pediatric patients   186 ,   a  GI Infections  Treatment of $TRAVELERS' DIARRHEA$ caused by susceptible #ENTEROTOXIGENIC  ESCHERICHIA COLI#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "nditions that affect the $LUNG$ diffusely, such as #EMPHYSEMA# and pulmonary fibrosis, decrease both DLCO and alveolar ventilation (V A.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "EONATAL HYPOGLYCEMIA$ low blood glucose in the first month of life, occurs in about half of children with #BWS[#9] Most of these hypoglycemic newborns are asymptomatic and have a normal blood glucose level within days.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ONGENITAL INFECTIONS$ that can cause MR include #RUBELLA VIRUS# cytomegalovirus,  Toxoplasma gondii ,  Treponema pallidum , and HIV.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "IOLOGY: Many diseases and conditions can inflame the membranous covering of the heart, including infections (bacterial, tubercular, viral, fungal); collagen vascular diseases (e.g., rheumatic fever, rheumatoid arthritis, or $SYSTEMIC LUPUS ERYTHEMATOSUS$; drugs (hydralazine, #PROCAINAMIDE# isoniazid, minoxidil); myocardial infarction; cancer; renal failure; cardiac surgery; or trauma.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "ese disorders include inflammatory arteriopathies such as collagen vascular diseases, Takayasu disease, and neurovascular syphilis, as well as noninflammatory arteriopathies such as arterial dissection, fibromuscular dysplasia, moyamoya disease, CADASIL ($CEREBRAL AUTOSOMAL DOMINANT ARTERIOPATHY WITH SUBCORTICAL INFARCTS$) and #LEUKOENCEPHALOPATHY# radiation vasculopathy, and vasospasm after SAH.\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "mpromise  Treat $VIRAL INFECTIONS$ that may cause vasculitis (HIV, hepatitis, cytomegalovirus, parvovirus B19, herpes zoster, #EPSTEIN BARR VIRUS# with antiviral agents, if possible.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "0  $CLOSTRIDIUM INFECTIONS$  Treatment of infections caused by  #CLOSTRIDIUM#\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "OTULISM$ caused by production of #BOTULINUM TOXIN# in the colon following ingestion of spores of  Clostridium botulinum.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "nsitivity Reactions  $HYPERSENSITIVITY REACTIONS$  Serious cutaneous reactions, including #STEVENS JOHNSON SYNDROME#  1 ,   2 ,   93 ,   99   erythema multiforme.\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "evalence of $TOXIC SHOCK SYNDROME$ #TOXIN 1 PRODUCING STAPHYLOCOCCUS AUREUS# and the presence of antibodies to this superantigen in menstruating women.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "Investigations of the prevalence of mites in the personal environment of 37 cases of $NORMAL SCABIES$ revealed live #MITES# from dust samples taken from bedroom floors, overstuffed chairs, and couches in all 37 cases ( 11.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "idence: • $BONE$ affected by #PAGET'S DISEASE# can enlarge, and this enlargement of deformity is a presenting symptom in 20% of patients.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "  He Q, Viljanen MK, Arvilommi H, Aittanen B, Mertsola J. $WHOOPING COUGH$ caused by  #BORDETELLA PERTUSSIS#  and  Bordetella parapertussis  in an immunized population.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            " $MEN II$ is suspected, #MEDULLARY THYROID CANCER# should be considered, and pheochromocytoma must be excluded before the patient goes to surgery.\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "e manifestations of $LEPROSY$ depend on the infected person's immune response to the causative agent,  #M LEPRAE#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "AGR SYNDROME$ is the combination of Wilms' tumor (with  WT1  deletion), aniridia, GU malformations (eg, renal hypoplasia, cystic disease, #HYPOSPADIAS CRYPTORCHIDISM AND MENTAL RETARDATION#\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "APILLEDEMA$ is a late sign of #INCREASED INTRACRANIAL PRESSURE#; initial absence is not reassuring.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "ERATOACANTHOMAS$ and sebaceous hyperplasia are also seen in patients with #MTS#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "it] $POLYCYSTIC DISEASE OF THE #KIDNEYS#$ Main article: Polycystic Disease of the Kidneys  Additional possible cause of nephropathy is due to the formation of cysts or pockets containing fluid within the #KIDNEYS#\t2\tcauses2\tcauses1\t2\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "OSTER$     Reactivation of #VARICELLA VIRUS# years after the initial infection with chickenpox.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "IRECT #DRUG# TOXICITY$  Although antimicrobials can damage virtually all human organ systems, the potential for toxicity varies widely from drug to #DRUG#.\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "PIDEMIC PLEURODYNIA$ hand foot and mouth disease, herpangina, and poliomyelitis are caused almost exclusively by #ENTEROVIRUSES#\t1\tcauses1\tcauses2\t1\n",
            "\n",
            "treats2-treats1(e2,e1)\n",
            "e 5 HT level in PPP was significantly increased in patients with coeliac disease in whom the disease was untreated or treated with gluten free diet for less than a year (p less than 0.01) but also compared with the patients with $COELIAC DISEASE$ treated with a #GLUTEN FREE DIET# for more than a year (p less than 0.01.\t8\ttreats2\ttreats1\t2\n",
            "\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "w-qiJBgdv2Yt",
        "colab_type": "text"
      },
      "source": [
        "It seems that the model almost exclusively catches the causal relationships: of the correct predictions, all but three are 'causes' relations!"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "BO5GXt-gx2jA",
        "colab_type": "code",
        "outputId": "20eef5e5-7104-42d3-cd69-d01053885b93",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 82
        }
      },
      "source": [
        "from collections import Counter\n",
        "\n",
        "Counter([x[1] for x in correct])"
      ],
      "execution_count": 14,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "Counter({'causes1-causes2(e1,e2)': 56,\n",
              "         'causes2-causes1(e2,e1)': 16,\n",
              "         'location2-location1(e2,e1)': 1,\n",
              "         'treats2-treats1(e2,e1)': 2})"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 14
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "YXmfwsoxIlSm",
        "colab_type": "text"
      },
      "source": [
        "What was the distribution of relationships in the training data?\n",
        "\n"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "T6keHWR_J_yT",
        "colab_type": "code",
        "outputId": "83d99330-dc2c-4eeb-b065-2cf54642099c",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 180
        }
      },
      "source": [
        "train_data = read_tsv('/content/gdrive/My Drive/Colab Notebooks/data/train.tsv')\n",
        "\n",
        "Counter([(x[3],x[4]) for x in train_data])"
      ],
      "execution_count": 15,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "Counter({('causes1', 'causes2'): 419,\n",
              "         ('causes2', 'causes1'): 432,\n",
              "         ('contraindicates1', 'contraindicates2'): 4,\n",
              "         ('contraindicates2', 'contraindicates1'): 3,\n",
              "         ('diagnosed by1', 'diagnosed by2'): 29,\n",
              "         ('diagnosed by2', 'diagnosed by1'): 34,\n",
              "         ('location1', 'location2'): 36,\n",
              "         ('location2', 'location1'): 31,\n",
              "         ('treats1', 'treats2'): 215,\n",
              "         ('treats2', 'treats1'): 409})"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 15
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "A9gD0Vtl8Ak-",
        "colab_type": "text"
      },
      "source": [
        "What about the distribution of relationships in the test data?"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "nVRH-rxS8GOP",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 180
        },
        "outputId": "8f7ad19a-ba8c-4423-b84e-a0bd863eab62"
      },
      "source": [
        "test_data = read_tsv('/content/gdrive/My Drive/Colab Notebooks/data/test.tsv')\n",
        "\n",
        "Counter([(x[3],x[4]) for x in test_data])"
      ],
      "execution_count": 16,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "Counter({('causes1', 'causes2'): 157,\n",
              "         ('causes2', 'causes1'): 133,\n",
              "         ('contraindicates1', 'contraindicates2'): 2,\n",
              "         ('contraindicates2', 'contraindicates1'): 2,\n",
              "         ('diagnosed by1', 'diagnosed by2'): 15,\n",
              "         ('diagnosed by2', 'diagnosed by1'): 19,\n",
              "         ('location1', 'location2'): 12,\n",
              "         ('location2', 'location1'): 20,\n",
              "         ('treats1', 'treats2'): 75,\n",
              "         ('treats2', 'treats1'): 150})"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 16
        }
      ]
    },
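    {
      "cell_type": "markdown",
      "metadata": {
        "id": "perRelationRecallMd",
        "colab_type": "text"
      },
      "source": [
        "Beyond eyeballing the Counter outputs, per-relation recall (correct predictions divided by gold occurrences of each relation) summarizes how well each relation type is recovered. The cell below is a minimal, self-contained sketch on toy labels; `gold` and `pred` are placeholders, but the same helper could be applied to the notebook's `predictions` against the gold labels parsed from `test.tsv`."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "perRelationRecallCode",
        "colab_type": "code"
      },
      "source": [
        "# Sketch: per-relation recall from parallel gold and predicted label lists.\n",
        "# 'gold' and 'pred' below are toy placeholders, not the notebook's real variables.\n",
        "from collections import Counter\n",
        "\n",
        "def per_class_recall(gold, pred):\n",
        "    # fraction of gold instances of each label that were predicted exactly\n",
        "    correct = Counter(g for g, p in zip(gold, pred) if g == p)\n",
        "    total = Counter(gold)\n",
        "    return {label: correct[label] / total[label] for label in total}\n",
        "\n",
        "per_class_recall(['causes1-causes2(e1,e2)', 'treats2-treats1(e2,e1)', 'treats2-treats1(e2,e1)'],\n",
        "                 ['causes1-causes2(e1,e2)', 'causes1-causes2(e1,e2)', 'treats2-treats1(e2,e1)'])"
      ],
      "execution_count": null,
      "outputs": []
    },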
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "mrwsWfTMN9XJ",
        "colab_type": "text"
      },
      "source": [
        "The model identifies 'treats' correctly only twice, even though 'treats' relations are almost as abundant as 'causes' in both the training and test sets.\n",
        "\n",
        "What are the 'treats' cases classified as?"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "Nj2OppQ5OKMq",
        "colab_type": "code",
        "outputId": "769acb7e-e9dd-46fd-bc37-d664d5236f46",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "treats = []\n",
        "with open('/content/gdrive/My Drive/Colab Notebooks/data/test.tsv') as f:\n",
        "  for i, l in enumerate(f.readlines()):\n",
        "    if 'treats' in l[-30:]:\n",
        "      # the gold label is a \"treats\" relation\n",
        "      treats.append(predictions[i])\n",
        "      if predictions[i][:7] in ('treats1', 'treats2'):\n",
        "        # the model also predicts a treats relation\n",
        "        print(l)\n",
        "\n",
        "Counter(treats)"
      ],
      "execution_count": 17,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "4\tClonidine, oxymetazoline, tetrahydozoline, brimonidine, tizanidine; barbiturates; opioids; benzodiazepines  Give naloxone for suspected $OPIOID OVERDOSE$; consider #FLUMAZENIL# for benzodiazepine overdose Cholinergic (pinpoint pupils; variable HR; sweaty skin; abdominal cramps and diarrhea)  Organophosphate and carbamate insecticides; chemical warfare nerve agents  Give atropine and pralidoxime; obtain measurements of serum and RBC cholinesterase activity Anticholinergic (agitation; delirium; dilated pupils; tachycardia; decreased peristalsis; dry, flushed skin)  Atropine and related drugs; antihistamines; carbamazepine; phenothiazines; tricyclic antidepressants  Obtain immediate ECG.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "17\tWoscoff A, Carabeli S. Treatment of $TINEA PEDIS$ with #SULCONAZOLE NITRATE 1% CREAM# or miconazole nitrate 2% cream.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "37\tThe study demonstrated that vasopressin is similar to epinephrine for OOH CA due to $VENTRICULAR FIBRILLATION$ or pulseless electrical activity, and superior to epinephrine for the initial treatment of asystolic arrest; it also demonstrated that the combination of vasopressin and #EPINEPHRINE# is superior to epinephrine alone in the treatment of refractory, out of hospital cardiac arrest.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "51\tIn a comparison group of 70 dogs without knwon exposure to $TUBERCULOSIS$ two positive responses to #USDA TUBERCULIN# were demonstrated and none to PPD.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "70\tThus, tolcapone is a useful option in patients with $FLUCTUATING PARKINSON'S DISEASE$ who are receiving #LEVODOPA/DDCI# and are not responding to, or are not candidates for, other adjunctive treatments.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "80\tCONCLUSION  In mild asthma, patients with the Gly/Gly genotype had better asthma control with albuterol than when it was withdrawn, and patients with the Arg/Arg genotype had worse $ASTHMA$ control with #ALBUTEROL# than when it was withdrawn.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "106\tEpisodes of prolonged $APNOEA$ disappeared in all infants after administration of #CAFFEINE# and in 11 infants all pneumogram abnormalities resolved.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "115\tThe response of urethral pressure to administration of an alpha stimulant was compared between a group of eight patients with $CHRONIC NEUROGENIC BLADDERS$ as evidenced by positive denervation supersensitivity to #PARASYMPATHOMIMETIC BETHANECHOL CHLORIDE# and a group of ten control patients.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "120\tEach patient had a physical examination performed by his or her primary care provider, was given a standardized questionnaire that focused on symptoms of $THYROID DISEASE$ and underwent a venipuncture for #TOTAL THYROXINE# triiodothyronine resin uptake and thyrotropin (TSH) concentration.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "122\tA 51 year old woman with $OVERT CONGESTIVE HEART FAILURE$ with pleural and pericardial effusion was treated with #FUROSEMIDE# and nifedipine, leading to improvement in her condition and a decrease in effusions.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "132\tNine patients on maintenance haemodialysis with frequent $MUSCLE CRAMPS$ were given 320 mg #QUININE SULPHATE# or placebo (in an identical gelatin capsule) at the beginning of each dialysis for a period of 12 weeks.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "136\tBecause montelukast is efficacious for patients with $ASTHMA$ selected patients with mild asthma and rhinitis may be good candidates for #MONTELUKAST#.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "141\tTransdermal scopolamine provides significant $MOTION SICKNESS$ protection, similar in extent to that provided by oral #SCOPOLAMINE# or dimenhydrinate.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "145\tCONCLUSIONS  In older patients with $MAJOR DEPRESSION$ who responded to treatment, long term maintenance treatment with #PAROXETINE# prevented recurrence of depression.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "158\t? Adherence to a gluten free diet in patients with celiac disease can lead to the correction of $IRON DEFICIENCY ANEMIA$ and the replacement of iron stores, and it can prevent the recurrence of #IRON# deficiency.\t8\ttreats2\ttreats1\t2\n",
            "\n",
            "165\tThromboembolism did not occur in patients who were in $ATRIAL FIBRILLATION$ and receiving #WARFARIN ANTICOAGULATION#.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "190\tA 52 year old man with $MYASTHENIA GRAVIS$ and normal liver function was treated with #NEOSTIGMINE# prednisolone and azathioprine.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "201\t$SALMONELLA ISOLATES$ were found to be susceptible to chloramphenicol, kanamycin, #AMPICILLIN# and tetracycline.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "202\tEighty eight patients with $PARKINSON'S DISEASE$ were treated with #LEVODOPA#\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "211\tOlsen EA, DeLong ER, Weiner MS. Long term follow up of men with $MALE PATTERN BALDNESS$ treated with #TOPICAL MINOXIDIL#\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "212\tSeventy six patients with previously untreated advanced $HODGKIN'S DISEASE$ have been treated with the MOPP/ABV (mechlorethamine, #VINCRISTINE# procarbazine, prednisone/doxorubicin, bleomycin, vincristine) hybrid program.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "251\tProphylaxis for $GVHD$ consisted of cyclosporine and #METHYLPREDNISOLONE#\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "253\tIn order to evaluate the preventive effect of estrogen and vitamin D3 on $POSTMENOPAUSAL BONE LOSS$, either #ESTROGEN# (Premarin 0.625 mg/day) or vitamin D3 (Onealfa 1.0 micrograms/day) was administered to postmenopausal women\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "254\tVery high plasma androgen levels or evidence of $HYPERCORTISOLISM$, which is not normally suppressible by #DEXAMETHASONE#, should lead to the search for a tumor or Cushing's syndrome\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "255\tAssociation of suicide attempts with $ACNE$ and treatment with #ISOTRETINOIN#: retrospective Swedish cohort study\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "265\t2]  Ridgway EC, McCammon JA, Benotti J, Maloof F. Acute metabolic responses in $MYXEDEMA$ to large doses of #INTRAVENOUS L THYROXINE#\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "267\tThis paper will present the experience of the Department of Clinical Haematology and Oncology in the treatment of $OSTEOSARCOMA$ with high dose #METHOTREXATE# and summarize other groups' experiences with various chemotherapy regimes.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "441\t24  $CLOSTRIDIUM INFECTIONS$  Treatment of infections caused by  #CLOSTRIDIUM PERFRINGENS#    +   ; alternative to penicillin G for those with penicillin hypersensitivity or for polymicrobial infections.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "444\tLalli C, Ciofetta M, Del Sindaco P et al. Long term intensive treatment of $TYPE 1 DIABETES$ with the short #ACTING INSULIN ANALOG LISPRO# in variable combination with NPH insulin at mealtime.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "451\tPatients with otitis media received approximately 40 mg/kg/day of cefaclor or amoxicillin trihydrate for ten days to three weeks; patients with $PHARYNGITIS$ received 20 mg/kg/day of #CEFACLOR# or penicillin V potassium for ten days.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "461\tThere was a significant alleviation of $PAIN$ in the #CHONDROITIN SULFATE TREATED# groups compared to placebo.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "465\tThe same group in Italy found that the combination of imatinib and sirolimus caused a response in several patients whose $[TUMORS]$ progressed on #[IMATINIB]# alone.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "466\tThe response of $PSORIASIS$ to #BETAMETHASONE VALERATE# and clobetasol propionate: a 6 month controlled study.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "477\tNegrier S, Caty A, Lesimple T et al. Treatment of patients with $METASTATIC RENAL CARCINOMA$ with a combination of subcutaneous interleukin 2 and #INTERFERON ALFA# with or without fluorouracil.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "483\tTreatment of $REFRACTORY DIARRHOEA$ in AIDS with acetorphan and #OCTREOTIDE#: a randomized crossover study.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "487\tSharkey Mathis PK, Velez J, Fetchick R et al$HISTOPLASMOSIS$ in the acquired immunodeficiency syndrome (AIDS): treatment with itraconazole and #FLUCONAZOLE#\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "492\tIn other studies by the same team, in patients with $HOMOZYGOUS FAMILIAL HYPERCHOLESTEROLEMIA$ the addition of #EZETIMIBE# to 40 mg/d of either simvastatin or atorvastatin increased effectiveness almost 4 times more than increasing the statin dosage to 80 mg/d  (2).\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "493\tOral combination chemotherapy with Pep-C for [$MANTLE CELL LYMPHOMA$] : Daily prednisone, [#ETOPOSIDE#], procarbazine and cyclophosphamide.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "494\tb  Class:  Antihistamines 56:22.08; First Generation Antihistamines 4:04; A04AD (ATC)  Synonyms:  dimenhy DRINATE ; Anti-nauseant; Anti-nausee; Dimenhydrinate; Dinate; Gravergol; Gravol; Nauseatol; Travel  Uses  Motion Sickness  Used principally in the prevention and treatment of nausea, $VOMITING$, and/or vertigo associated with motion sickness, although scopolamine, #PROMETHAZINE#, or meclizine may be more effective\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "508\tIn 75% of supraventricular parossistic tachycardia and in 75% of parossistic atrial fibrillation, arrhythmia was interrupted within few minutes from drug injection; in 90% with premature ventricular contractions (PVC) and in 100% of $VENTRICULAR TACHYCARDIA$ #DISOPYRAMIDE# was capable to interrupt the arrhythmias.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "510\tHowever, in patients with $VARIANT ANGINA$, a small dose of #ERGONOVINE# produced a percentage of change in diameter of 39.8 +/- 15.3% at the site of spastic occlusion included by a larger dose of ergonovine, compared with that of 7.0 +/- 11.9% in the remaining non-spastic coronary arteries (p less than 0.05\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "518\tThe 5 HT level in PPP was significantly increased in patients with coeliac disease in whom the disease was untreated or treated with gluten free diet for less than a year (p less than 0.01) but also compared with the patients with $COELIAC DISEASE$ treated with a #GLUTEN FREE DIET# for more than a year (p less than 0.01.\t8\ttreats2\ttreats1\t2\n",
            "\n",
            "522\t$DYSRHYTHMIA$ should be treated with atropine, #ANTIARRHYTHMIC AGENTS#, and a temporary pacemaker, in order to avoid lethal results\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "524\t423  Blastomycosis  >     Treatment of $BLASTOMYCOSIS$  IV:  Conventional #AMPHOTERICIN# B: 0.5 1 mg/kg daily.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "530\tAfter an outbreak of hepatitis in Washington, D.C. in 1970 among a group of persons taking isoniazid to prevent $TUBERCULOSIS$ an #ISONIAZID# surveillance study was conducted among 13,838 persons in 21 participating health departments.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "536\t• Methimazole and carbimazole are used during $PREGNANCY$ in many countries where #PROPYLTHIOURACIL# is not commercially available ( 25.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "537\tIn addition, her past medical history is significant for $HYPERTENSION$ for which she is being treated with #HYDROCHLOROTHIAZIDE#; hyperlipidemia, for which she is taking a HMG CoA reductase inhibitor; and gestational diabetes when she was pregnant with her second child 6 years ago, for which she required insulin therapy.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "542\tUnless severe hepatocellular damage is present, $HYPOPROTHROMBINEMIA$ usually subsides after use of #PHYTONADIONE# (vitamin K 1 ) 5 to 10 mg sc once/day for 2 to 3 days.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "562\tIn 38 patients with $PARKINSON'S SYNDROME$ Madopar preparation was used (#L DOPA WITH PERIPHERAL DECARBOXYLASE# inhibitor) in 33 cases as the main drug and in 5 cases as an addition to L dopa.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "585\tLansoprazole provided more effective and faster relief for heartburn than did omeprazole in erosive esophagitis  Keywords: Esophagitis $HEARTBURN$ Lansoprazole #OMEPRAZOLE#  ACP Journal Club.\t7\ttreats1\ttreats2\t1\n",
            "\n",
            "587\tResistance to methotrexate was developed by continuous exposure of P388 murine $LEUKEMIA$ cells in vitro to increasing concentrations of #METHOTREXATE# up to 1 X 10( 7) M. Once established, the resistance to methotrexate was stable.\t7\ttreats1\ttreats2\t1\n",
            "\n"
          ],
          "name": "stdout"
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "Counter({'causes1-causes2(e1,e2)': 24,\n",
              "         'causes2-causes1(e2,e1)': 5,\n",
              "         'contraindicates1-contraindicates2(e1,e2)': 1,\n",
              "         'diagnosed by1-diagnosed by2(e1,e2)': 144,\n",
              "         'treats2-treats1(e2,e1)': 51})"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 17
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "2c2TSNq9P0QT",
        "colab_type": "text"
      },
      "source": [
        "Many of the 'treats' examples are misclassified as the relatively rare class 'diagnosed by'.\n",
        "\n",
        "Most of the remaining 'treats' examples are classified with the wrong direction, e.g. as if the disease treated the drug rather than the other way around. The model might perform better on undirected relations.\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "cWG8lQ_O5Oed",
        "colab_type": "text"
      },
      "source": [
        "# Confusion matrices, precision and recall for the 10 relations"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "SEeUmwcFTGjI",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "from sklearn.metrics import multilabel_confusion_matrix\n",
        "\n",
        "# Collect the gold labels from the test set:\n",
        "truths = []\n",
        "with open('/content/gdrive/My Drive/Colab Notebooks/data/test.tsv') as f:\n",
        "    for line in f:\n",
        "      for k, v in RELATIONZ.items():\n",
        "        if v in line:\n",
        "          truths.append(k)\n",
        "          break  # take at most one label per line"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "Nj5KWCPoTuM7",
        "colab_type": "code",
        "outputId": "a5a3db70-f248-4479-e2ba-317616637d3a",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 505
        }
      },
      "source": [
        "confusion = multilabel_confusion_matrix(truths,predictions)\n",
        "\n",
        "for i,c in enumerate(confusion):\n",
        "  print(sorted(list(RELATIONZ.keys()))[i])\n",
        "  print(c)"
      ],
      "execution_count": 19,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "causes1-causes2(e1,e2)\n",
            "[[342  86]\n",
            " [101  56]]\n",
            "causes2-causes1(e2,e1)\n",
            "[[371  81]\n",
            " [117  16]]\n",
            "contraindicates1-contraindicates2(e1,e2)\n",
            "[[494  89]\n",
            " [  2   0]]\n",
            "contraindicates2-contraindicates1(e2,e1)\n",
            "[[583   0]\n",
            " [  2   0]]\n",
            "diagnosed by1-diagnosed by2(e1,e2)\n",
            "[[397 173]\n",
            " [ 15   0]]\n",
            "diagnosed by2-diagnosed by1(e2,e1)\n",
            "[[566   0]\n",
            " [ 18   1]]\n",
            "location1-location2(e1,e2)\n",
            "[[573   0]\n",
            " [ 12   0]]\n",
            "location2-location1(e2,e1)\n",
            "[[560   5]\n",
            " [ 19   1]]\n",
            "treats1-treats2(e1,e2)\n",
            "[[508   2]\n",
            " [ 75   0]]\n",
            "treats2-treats1(e2,e1)\n",
            "[[362  73]\n",
            " [148   2]]\n"
          ],
          "name": "stdout"
        }
      ]
    },
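    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text"
      },
      "source": [
        "As a sanity check, per-relation precision and recall can be read directly off the multilabel confusion matrices, whose layout is `[[TN, FP], [FN, TP]]`. This is a sketch assuming the `confusion` array and `RELATIONZ` dictionary from the cells above."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# Each matrix c is [[TN, FP], [FN, TP]] for one relation, in the same\n",
        "# sorted label order used when printing the matrices above.\n",
        "for name, c in zip(sorted(RELATIONZ.keys()), confusion):\n",
        "  tn, fp = c[0]\n",
        "  fn, tp = c[1]\n",
        "  prec = tp / (tp + fp) if (tp + fp) else 0.0\n",
        "  rec = tp / (tp + fn) if (tp + fn) else 0.0\n",
        "  print(name, 'precision=%.4f' % prec, 'recall=%.4f' % rec)"
      ],
      "execution_count": 0,
      "outputs": []
    },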
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Yjv1zIhkYP2y",
        "colab_type": "text"
      },
      "source": [
        "The confusion matrices show that classification accuracy for each individual relation is poor.\n",
        "\n",
        "What are the precision and recall for the 10 classes/relations?"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "xmE5B1iRVMns",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 70
        },
        "outputId": "10316018-4724-45be-9e46-386db4b87993"
      },
      "source": [
        "from sklearn.metrics import precision_recall_fscore_support\n",
        "\n",
        "precision,recall,_,_ = precision_recall_fscore_support(truths,predictions)"
      ],
      "execution_count": 26,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "/usr/local/lib/python3.6/dist-packages/sklearn/metrics/_classification.py:1272: UndefinedMetricWarning: Precision and F-score are ill-defined and being set to 0.0 in labels with no predicted samples. Use `zero_division` parameter to control this behavior.\n",
            "  _warn_prf(average, modifier, msg_start, len(result))\n"
          ],
          "name": "stderr"
        }
      ]
    },
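    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text"
      },
      "source": [
        "The `UndefinedMetricWarning` above is raised because some relations are never predicted, making their precision 0/0. A sketch of how to make that explicit and silence the warning, using the `zero_division` parameter available in scikit-learn >= 0.22:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# zero_division=0 reports 0.0 (without a warning) for labels that\n",
        "# were never predicted, matching the behavior seen above.\n",
        "precision, recall, _, _ = precision_recall_fscore_support(\n",
        "    truths, predictions, zero_division=0)"
      ],
      "execution_count": 0,
      "outputs": []
    },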
    {
      "cell_type": "code",
      "metadata": {
        "id": "1_F-xEUCvQPh",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 537
        },
        "outputId": "a42a6fa0-7560-4cd2-c3e1-16e3567baf80"
      },
      "source": [
        "\n",
        "print('Precision    ', 'Recall')\n",
        "print()\n",
        "for i,p in enumerate(precision):\n",
        "  print(sorted(list(RELATIONZ.keys()))[i])\n",
        "  print('%.4f' % p, '      ', '%.4f' % recall[i])\n",
        "  print()"
      ],
      "execution_count": 95,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Precision     Recall\n",
            "\n",
            "causes1-causes2(e1,e2)\n",
            "0.3944        0.3567\n",
            "\n",
            "causes2-causes1(e2,e1)\n",
            "0.1649        0.1203\n",
            "\n",
            "contraindicates1-contraindicates2(e1,e2)\n",
            "0.0000        0.0000\n",
            "\n",
            "contraindicates2-contraindicates1(e2,e1)\n",
            "0.0000        0.0000\n",
            "\n",
            "diagnosed by1-diagnosed by2(e1,e2)\n",
            "0.0000        0.0000\n",
            "\n",
            "diagnosed by2-diagnosed by1(e2,e1)\n",
            "1.0000        0.0526\n",
            "\n",
            "location1-location2(e1,e2)\n",
            "0.0000        0.0000\n",
            "\n",
            "location2-location1(e2,e1)\n",
            "0.1667        0.0500\n",
            "\n",
            "treats1-treats2(e1,e2)\n",
            "0.0000        0.0000\n",
            "\n",
            "treats2-treats1(e2,e1)\n",
            "0.0267        0.0133\n",
            "\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "R1RAPotRxgTc",
        "colab_type": "text"
      },
      "source": [
        "## Each relation has poor precision and recall, but the overall accuracy is 53%.\n",
        "\n",
        "The weak per-relation results are unsurprising: directed relation classification is a hard task, and the data set is small and domain-specific."
      ]
    },
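    {
      "cell_type": "markdown",
      "metadata": {
        "colab_type": "text"
      },
      "source": [
        "A sketch of how the overall accuracy figure can be checked, assuming the `truths` and `predictions` lists built above are aligned line-for-line:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# Fraction of test examples whose predicted relation matches the gold label.\n",
        "accuracy = sum(t == p for t, p in zip(truths, predictions)) / len(truths)\n",
        "print('Accuracy: %.2f' % accuracy)"
      ],
      "execution_count": 0,
      "outputs": []
    },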
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "pnF8KpEH7NyD",
        "colab_type": "text"
      },
      "source": [
        "# References\n",
        "\n",
        "J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, “BERT.” https://github.com/google-research/bert, 2018.\n",
        "\n",
        "T. Wolf, L. Debut, V. Sanh, J. Chaumond, C. Delangue, A. Moi, P. Cistac, T. Rault, R. Louf, M. Funtowicz, and J. Brew, “HuggingFace's Transformers.” https://github.com/huggingface/transformers, 2019.\n",
        "\n",
        "H. Wang, “bert-relation-classification.” https://github.com/wang-h/bert-relation-classification, 2019."
      ]
    }
  ]
}