{
  "nbformat": 4,
  "nbformat_minor": 0,
  "metadata": {
    "colab": {
      "name": "05. Ensemble Learning.ipynb",
      "provenance": [],
      "collapsed_sections": [],
      "toc_visible": true,
      "include_colab_link": true
    },
    "kernelspec": {
      "name": "python3",
      "display_name": "Python 3"
    }
  },
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "view-in-github",
        "colab_type": "text"
      },
      "source": [
        "<a href=\"https://colab.research.google.com/github/Eldave93/Seizure-Detection-Tutorials/blob/master/05.%20Ensemble%20Learning.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Ec1FIPYtlPu3",
        "colab_type": "text"
      },
      "source": [
        "# Classification Tutorial #04\n",
        "\n",
        "# Ensemble Learning\n",
        "\n",
        "by [David Luke Elliott](https://www.lancaster.ac.uk/psychology/about-us/people/david-elliott)\n",
        "/ [GitHub](https://github.com/Eldave93) \n",
        "\n",
        "Ensemble methods aim to improve the generalizability of a model by combining the predictions of several estimators<sup>1,2</sup>. There are two general approaches to achieving this: averaging and boosting.\n",
        "\n",
        "**Notes**\n",
        "- Watch [this video] for more information on some of these methods\n",
        "\n",
        "---\n",
        "1. Raschka, Sebastian, and Vahid Mirjalili. Python Machine Learning, 2nd Ed. Packt Publishing, 2017.\n",
        "2. https://scikit-learn.org/stable/modules/ensemble.html\n",
        "[this video]: https://youtu.be/wPqtzj5VZus"
      ]
    },
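    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a minimal sketch of the two approaches (on synthetic data with default hyperparameters, not this tutorial's pipeline), the cell below compares an averaging ensemble (bagging) with a boosting ensemble (AdaBoost) built from decision trees:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Illustrative only: synthetic data, default hyperparameters.\n",
        "from sklearn.datasets import make_classification\n",
        "from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier\n",
        "from sklearn.model_selection import cross_val_score\n",
        "from sklearn.tree import DecisionTreeClassifier\n",
        "\n",
        "X, y = make_classification(n_samples=500, random_state=0)\n",
        "\n",
        "# Averaging: independent trees on bootstrap samples, predictions combined.\n",
        "bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50,\n",
        "                            random_state=0)\n",
        "# Boosting: trees fit sequentially, each focusing on earlier mistakes.\n",
        "boosting = AdaBoostClassifier(n_estimators=50, random_state=0)\n",
        "\n",
        "for name, clf in [('bagging', bagging), ('boosting', boosting)]:\n",
        "    print(name, cross_val_score(clf, X, y, cv=5).mean().round(3))"
      ],
      "execution_count": 0,
      "outputs": []
    },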
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "KpsXUoTn9AXp",
        "colab_type": "text"
      },
      "source": [
        "# Environment Set-up\n",
        "\n",
        "First let's install the packages we need."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "x32YWiDF9DR0",
        "colab_type": "code",
        "outputId": "f04bfe5f-b5d8-4b4a-f435-78b6eba30483",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 341
        }
      },
      "source": [
        "!pip install h5py tables kaggle mne\n",
        "!pip install --upgrade imbalanced-learn"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Requirement already satisfied: h5py in /usr/local/lib/python3.6/dist-packages (2.8.0)\n",
            "Requirement already satisfied: tables in /usr/local/lib/python3.6/dist-packages (3.4.4)\n",
            "Requirement already satisfied: kaggle in /usr/local/lib/python3.6/dist-packages (1.5.6)\n",
            "Requirement already satisfied: mne in /usr/local/lib/python3.6/dist-packages (0.19.2)\n",
            "Requirement already satisfied: numpy>=1.7 in /usr/local/lib/python3.6/dist-packages (from h5py) (1.17.5)\n",
            "Requirement already satisfied: six in /usr/local/lib/python3.6/dist-packages (from h5py) (1.12.0)\n",
            "Requirement already satisfied: numexpr>=2.5.2 in /usr/local/lib/python3.6/dist-packages (from tables) (2.7.1)\n",
            "Requirement already satisfied: urllib3<1.25,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from kaggle) (1.24.3)\n",
            "Requirement already satisfied: certifi in /usr/local/lib/python3.6/dist-packages (from kaggle) (2019.11.28)\n",
            "Requirement already satisfied: python-slugify in /usr/local/lib/python3.6/dist-packages (from kaggle) (4.0.0)\n",
            "Requirement already satisfied: python-dateutil in /usr/local/lib/python3.6/dist-packages (from kaggle) (2.6.1)\n",
            "Requirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (from kaggle) (2.21.0)\n",
            "Requirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (from kaggle) (4.28.1)\n",
            "Requirement already satisfied: scipy>=0.17.1 in /usr/local/lib/python3.6/dist-packages (from mne) (1.4.1)\n",
            "Requirement already satisfied: text-unidecode>=1.3 in /usr/local/lib/python3.6/dist-packages (from python-slugify->kaggle) (1.3)\n",
            "Requirement already satisfied: idna<2.9,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests->kaggle) (2.8)\n",
            "Requirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests->kaggle) (3.0.4)\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "FJZrWSLo9SJy",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "import os                         # for file locations\n",
        "import matplotlib.pyplot as plt   # for plotting\n",
        "import numpy as np                # arrays\n",
        "import pickle                     # saving python objects\n",
        "import pandas as pd               # dataframes\n",
        "import tables\n",
        "\n",
        "# colours for printing outputs\n",
        "class color:\n",
        "   PURPLE = '\\033[95m'\n",
        "   CYAN = '\\033[96m'\n",
        "   DARKCYAN = '\\033[36m'\n",
        "   BLUE = '\\033[94m'\n",
        "   GREEN = '\\033[92m'\n",
        "   YELLOW = '\\033[93m'\n",
        "   RED = '\\033[91m'\n",
        "   BOLD = '\\033[1m'\n",
        "   UNDERLINE = '\\033[4m'\n",
        "   END = '\\033[0m'"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "WuqA7o17_wcY",
        "colab_type": "text"
      },
      "source": [
        "We are going to use a different dataset from the Epileptologie one we have used so far, because the Epileptologie data has only 1 channel, is quite small, and was already giving good results without the need for ensemble learning.\n",
        "\n",
        "Instead we will use the data from a Kaggle competition originally run in 2014<sup>1</sup> (see the first notebook for more information).\n",
        "\n",
        "---\n",
        "1. https://www.kaggle.com/c/seizure-detection"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "4_r9IMrB_viA",
        "colab_type": "code",
        "outputId": "52bec4f2-91c0-4827-9f28-d323345a54dc",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 93
        }
      },
      "source": [
        "import gdown\n",
        "FILE_PATH = 'UPennMayo_features.hdf5'\n",
        "\n",
        "if not os.path.exists(FILE_PATH):\n",
        "    gdown.download('https://drive.google.com/uc?id=1-0Y0eKW9hIeOdKlYbi9UAojERKbQzi5d', \n",
        "                './'+FILE_PATH, quiet=False)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Downloading...\n",
            "From: https://drive.google.com/uc?id=1-0Y0eKW9hIeOdKlYbi9UAojERKbQzi5d\n",
            "To: /content/UPennMayo_features.hdf5\n",
            "47.9MB [00:00, 123MB/s] \n"
          ],
          "name": "stderr"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "jr67PrD5nO6a",
        "colab_type": "text"
      },
      "source": [
        "## Load Data\n",
        "\n",
        "First let's get our workspace ready and then load in the data, which I have saved in an HDF5 file.\n",
        "\n",
        "HDF5 is a hierarchical file format designed for storing large numerical arrays: groups act like folders, arrays act like files within them, and individual nodes can be read from disk without loading the whole file into memory."
      ]
    },
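    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a hypothetical sketch of how such a file is built (using a throwaway file name and random data, not the tutorial's actual features), the cell below writes and reads back an extendable array with PyTables:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Illustrative only: 'demo.hdf5' and 'Patient_0' are made-up names.\n",
        "import numpy as np\n",
        "import tables\n",
        "\n",
        "with tables.open_file('demo.hdf5', mode='w', title='Demo Features') as h5:\n",
        "    group = h5.create_group('/', 'Patient_0', 'Participant Data')\n",
        "    # An EArray is extendable along its first (0-sized) dimension, so\n",
        "    # feature rows can be appended without rewriting the file.\n",
        "    arr = h5.create_earray(group, 'Data_x', tables.Float64Atom(),\n",
        "                           shape=(0, 4), title='Feature Array')\n",
        "    arr.append(np.random.rand(10, 4))\n",
        "\n",
        "with tables.open_file('demo.hdf5', mode='r') as h5:\n",
        "    demo_x = h5.get_node('/Patient_0/Data_x')[:]  # reads just this node\n",
        "print(demo_x.shape)  # (10, 4)"
      ],
      "execution_count": 0,
      "outputs": []
    },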
    {
      "cell_type": "code",
      "metadata": {
        "id": "D3j6sFdMlJGH",
        "colab_type": "code",
        "outputId": "0b0d14ca-6187-46dc-c588-0b22680348f5",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 437
        }
      },
      "source": [
        "# load features dataframe\n",
        "h5file = tables.open_file(FILE_PATH, mode=\"r+\")\n",
        "h5file"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "File(filename=UPennMayo_features.hdf5, title='Upenn Features', mode='r+', root_uep='/', filters=Filters(complevel=0, shuffle=False, bitshuffle=False, fletcher32=False, least_significant_digit=None))\n",
              "/ (RootGroup) 'Upenn Features'\n",
              "/Patient_2 (Group) 'Participant Data'\n",
              "/Patient_2/Data_x (EArray(7035, 848)) 'Feature Array'\n",
              "  atom := Float64Atom(shape=(), dflt=0.0)\n",
              "  maindim := 0\n",
              "  flavor := 'numpy'\n",
              "  byteorder := 'little'\n",
              "  chunkshape := (9, 848)\n",
              "/Patient_2/Data_x_Feat_Names (Array(848,)) 'Names of Each Feature'\n",
              "  atom := StringAtom(itemsize=29, shape=(), dflt=b'')\n",
              "  maindim := 0\n",
              "  flavor := 'numpy'\n",
              "  byteorder := 'irrelevant'\n",
              "  chunkshape := None\n",
              "/Patient_2/Data_y (EArray(7035, 1)) 'Events Array'\n",
              "  atom := Int64Atom(shape=(), dflt=0)\n",
              "  maindim := 0\n",
              "  flavor := 'numpy'\n",
              "  byteorder := 'little'\n",
              "  chunkshape := (8192, 1)"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 47
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "yhvtTn667sXF",
        "colab_type": "text"
      },
      "source": [
        "If we look in this HDF5 file we see there is data from only 1 of the possible 8 human patients. That's because for this tutorial we'll use Patient 2 as an example, to keep training fast compared to using the full set.\n",
        "\n",
        "**NOTE**\n",
        "- If you want the full set of human data, I have made a version called *UPennMayo_all_features.hdf5*"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "lprgN2k6Nf7I",
        "colab_type": "code",
        "outputId": "0209d133-f342-4f15-f7a4-7ab055fe5ca2",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 36
        }
      },
      "source": [
        "import h5py\n",
        "with h5py.File(FILE_PATH, 'r') as f:\n",
        "  key_list = list(f.keys())\n",
        "display(key_list)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "['Patient_2']"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "KufKcXT5Nmrl",
        "colab_type": "text"
      },
      "source": [
        "If you want more detail on the features than is given below, I recommend looking at the tutorial associated with the feature extraction.\n",
        "\n",
        "The dataset has multiple channels (different for each participant) and has previously been segmented into multiple 1-second segments. Each segment is associated with a class and has features extracted from it. Briefly, these are:\n",
        "\n",
        "**Welch**\n",
        "\n",
        "The Welch method estimates spectral density by splitting the data into overlapping segments, windowing each one, applying a discrete Fourier transform to calculate a periodogram for it, and then squaring and averaging the periodograms to get a power measure.\n",
        "- *power_delta*: Average power between 0.1 Hz and 4 Hz\n",
        "- *power_theta*: Average power between 4 Hz and 8 Hz\n",
        "- *power_alpha*: Average power between 8 Hz and 12 Hz\n",
        "- *power_beta*: Average power between 12 Hz and 30 Hz\n",
        "- *power_gamma*: Average power between 30 Hz and 70 Hz\n",
        "\n",
        "**Discrete Wavelet Transform**\n",
        "\n",
        "Several scaled and time-shifted versions of an oscillatory kernel (the wavelet) are moved across the signal, dividing the data into frequency sub-bands which are each analysed with respect to their scale.\n",
        "- *LSWT*: The log-sum energy of the subband coefficients\n",
        "- *mean*: Average power of the wavelet coefficients in each sub-band\n",
        "- *mean_abs*: Mean of the absolute values of the coefficients in each sub-band\n",
        "- *std*: Standard deviation of the coefficients in each sub-band\n",
        "- *Ratio*: Ratio of the absolute mean values of adjacent sub-bands\n",
        "\n",
        "**Correlation and Eigenvalues<sup>2,3</sup>**\n",
        "\n",
        "The log10 of the frequency magnitudes, calculated using a Fast Fourier Transform, was computed for each EEG channel in each dataset. The FFT output was normalised across frequencies (1-47 Hz) before the matrix of correlation coefficients between channels was calculated, and eigenvalues were then computed from that correlation matrix. Both the correlation coefficients and the eigenvalues were obtained in the time domain as well as the frequency domain.\n",
        "\n",
        "---\n",
        "1. https://www.kaggle.com/c/seizure-detection\n",
        "2. https://github.com/MichaelHills/seizure-detection\n",
        "3. K Schindler, H Leung, CE Elger, K Lehnertz (2007) Assessing seizure dynamics by analysing the correlation structure of multichannel intracranial EEG. Brain 130(Pt1):65-77"
      ]
    },
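    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As an illustrative sketch of the two main feature families described above (on synthetic data with an assumed sampling rate, not the competition's exact extraction code):"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Illustrative only: fs=400 and the 8-channel random segment are assumptions.\n",
        "import numpy as np\n",
        "from scipy.signal import welch\n",
        "\n",
        "fs = 400\n",
        "rng = np.random.default_rng(0)\n",
        "segment = rng.standard_normal((8, fs))  # 8 channels x 1 second\n",
        "\n",
        "# Welch band power: average the PSD estimate within each frequency band.\n",
        "freqs, psd = welch(segment, fs=fs, nperseg=fs // 2)\n",
        "bands = {'delta': (0.1, 4), 'theta': (4, 8), 'alpha': (8, 12),\n",
        "         'beta': (12, 30), 'gamma': (30, 70)}\n",
        "band_power = {name: psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)\n",
        "              for name, (lo, hi) in bands.items()}\n",
        "\n",
        "# Time-domain correlation structure: channel-by-channel correlation\n",
        "# matrix and its sorted eigenvalues.\n",
        "corr = np.corrcoef(segment)\n",
        "eigvals = np.sort(np.linalg.eigvalsh(corr))\n",
        "print(band_power['delta'].shape, corr.shape, eigvals.shape)"
      ],
      "execution_count": 0,
      "outputs": []
    },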
    {
      "cell_type": "code",
      "metadata": {
        "id": "rfA7C4TYNnsM",
        "colab_type": "code",
        "outputId": "91273209-5df3-4f6b-b151-ce40afd26a2b",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 287
        }
      },
      "source": [
        "data_x = h5file.get_node('/Patient_2/Data_x')\n",
        "data_y = h5file.get_node('/Patient_2/Data_y')\n",
        "data_x_labels = h5file.get_node('/Patient_2/Data_x_Feat_Names')\n",
        "\n",
        "part_feature_df = pd.DataFrame(data_x[:], columns = data_x_labels[:].astype(str))\n",
        "part_feature_df['class'] = data_y[:]\n",
        "part_feature_df = part_feature_df.set_index('class')\n",
        "part_feature_df.head()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>LMacro_01|2_4Hz</th>\n",
              "      <th>LMacro_01|4_8Hz</th>\n",
              "      <th>LMacro_01|8_12Hz</th>\n",
              "      <th>LMacro_01|12_30Hz</th>\n",
              "      <th>LMacro_01|30_70Hz</th>\n",
              "      <th>LMacro_01|Ratio_3_12/2_30Hz</th>\n",
              "      <th>LMacro_01|D6_mean</th>\n",
              "      <th>LMacro_01|D5_mean</th>\n",
              "      <th>LMacro_01|D4_mean</th>\n",
              "      <th>LMacro_01|D3_mean</th>\n",
              "      <th>LMacro_01|D2_mean</th>\n",
              "      <th>LMacro_01|D1_mean</th>\n",
              "      <th>LMacro_01|D6_std</th>\n",
              "      <th>LMacro_01|D5_std</th>\n",
              "      <th>LMacro_01|D4_std</th>\n",
              "      <th>LMacro_01|D3_std</th>\n",
              "      <th>LMacro_01|D2_std</th>\n",
              "      <th>LMacro_01|D1_std</th>\n",
              "      <th>LMacro_01|D6_ratio</th>\n",
              "      <th>LMacro_01|D5_ratio</th>\n",
              "      <th>LMacro_01|D4_ratio</th>\n",
              "      <th>LMacro_01|D3_ratio</th>\n",
              "      <th>LMacro_01|D2_ratio</th>\n",
              "      <th>LMacro_01|D1_ratio</th>\n",
              "      <th>LMacro_01|D6_mean_abs</th>\n",
              "      <th>LMacro_01|D5_mean_abs</th>\n",
              "      <th>LMacro_01|D4_mean_abs</th>\n",
              "      <th>LMacro_01|D3_mean_abs</th>\n",
              "      <th>LMacro_01|D2_mean_abs</th>\n",
              "      <th>LMacro_01|D1_mean_abs</th>\n",
              "      <th>LMacro_01|D6_LSWT</th>\n",
              "      <th>LMacro_01|D5_LSWT</th>\n",
              "      <th>LMacro_01|D4_LSWT</th>\n",
              "      <th>LMacro_01|D3_LSWT</th>\n",
              "      <th>LMacro_01|D2_LSWT</th>\n",
              "      <th>LMacro_01|D1_LSWT</th>\n",
              "      <th>LMacro_02|2_4Hz</th>\n",
              "      <th>LMacro_02|4_8Hz</th>\n",
              "      <th>LMacro_02|8_12Hz</th>\n",
              "      <th>LMacro_02|12_30Hz</th>\n",
              "      <th>...</th>\n",
              "      <th>LMacro_07_RMacro_05|time_corr</th>\n",
              "      <th>LMacro_07_RMacro_06|time_corr</th>\n",
              "      <th>LMacro_07_RMacro_07|time_corr</th>\n",
              "      <th>LMacro_07_RMacro_08|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_01|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_02|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_03|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_04|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_05|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_06|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_07|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_02|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_03|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_04|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_05|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_03|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_04|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_05|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_04|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_05|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_04_RMacro_05|time_corr</th>\n",
              "      <th>RMacro_04_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_04_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_04_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_05_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_05_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_05_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_06_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_06_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_07_RMacro_08|time_corr</th>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>class</th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>227.771604</td>\n",
              "      <td>121.729376</td>\n",
              "      <td>104.748192</td>\n",
              "      <td>12.558394</td>\n",
              "      <td>0.647189</td>\n",
              "      <td>0.236310</td>\n",
              "      <td>-1.464481</td>\n",
              "      <td>-0.824742</td>\n",
              "      <td>0.013091</td>\n",
              "      <td>0.079837</td>\n",
              "      <td>0.007579</td>\n",
              "      <td>-0.003982</td>\n",
              "      <td>32.788547</td>\n",
              "      <td>7.787165</td>\n",
              "      <td>3.174328</td>\n",
              "      <td>2.255994</td>\n",
              "      <td>1.501909</td>\n",
              "      <td>0.434303</td>\n",
              "      <td>3.754051</td>\n",
              "      <td>0.478331</td>\n",
              "      <td>0.654988</td>\n",
              "      <td>0.968202</td>\n",
              "      <td>1.110025</td>\n",
              "      <td>0.291264</td>\n",
              "      <td>22.104887</td>\n",
              "      <td>5.888276</td>\n",
              "      <td>2.515215</td>\n",
              "      <td>1.791908</td>\n",
              "      <td>1.186303</td>\n",
              "      <td>0.345527</td>\n",
              "      <td>2.393530</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>4.938845</td>\n",
              "      <td>5.224725</td>\n",
              "      <td>4.976350</td>\n",
              "      <td>4.832041</td>\n",
              "      <td>1005.364843</td>\n",
              "      <td>313.338129</td>\n",
              "      <td>313.795808</td>\n",
              "      <td>41.072159</td>\n",
              "      <td>...</td>\n",
              "      <td>0.105581</td>\n",
              "      <td>0.083218</td>\n",
              "      <td>0.037208</td>\n",
              "      <td>-0.074456</td>\n",
              "      <td>-0.232014</td>\n",
              "      <td>-0.110795</td>\n",
              "      <td>0.064210</td>\n",
              "      <td>0.231193</td>\n",
              "      <td>0.054408</td>\n",
              "      <td>0.080716</td>\n",
              "      <td>0.008344</td>\n",
              "      <td>-0.092507</td>\n",
              "      <td>0.478147</td>\n",
              "      <td>0.070836</td>\n",
              "      <td>0.276334</td>\n",
              "      <td>0.409964</td>\n",
              "      <td>0.089740</td>\n",
              "      <td>0.011427</td>\n",
              "      <td>-0.045528</td>\n",
              "      <td>0.659306</td>\n",
              "      <td>0.402787</td>\n",
              "      <td>-0.120908</td>\n",
              "      <td>-0.099517</td>\n",
              "      <td>-0.098398</td>\n",
              "      <td>-0.134968</td>\n",
              "      <td>0.671873</td>\n",
              "      <td>-0.177492</td>\n",
              "      <td>0.043530</td>\n",
              "      <td>0.109859</td>\n",
              "      <td>0.070146</td>\n",
              "      <td>0.273753</td>\n",
              "      <td>0.363735</td>\n",
              "      <td>0.389202</td>\n",
              "      <td>0.297462</td>\n",
              "      <td>0.789810</td>\n",
              "      <td>0.652605</td>\n",
              "      <td>0.519944</td>\n",
              "      <td>0.953851</td>\n",
              "      <td>0.796902</td>\n",
              "      <td>0.888274</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>162.957202</td>\n",
              "      <td>77.334727</td>\n",
              "      <td>39.298664</td>\n",
              "      <td>159.289935</td>\n",
              "      <td>17.511810</td>\n",
              "      <td>0.521757</td>\n",
              "      <td>-0.524147</td>\n",
              "      <td>-2.195927</td>\n",
              "      <td>0.170249</td>\n",
              "      <td>0.077665</td>\n",
              "      <td>-0.009410</td>\n",
              "      <td>-0.004780</td>\n",
              "      <td>136.208688</td>\n",
              "      <td>30.836280</td>\n",
              "      <td>12.684444</td>\n",
              "      <td>5.863654</td>\n",
              "      <td>2.171036</td>\n",
              "      <td>0.478162</td>\n",
              "      <td>4.499467</td>\n",
              "      <td>0.411784</td>\n",
              "      <td>0.612139</td>\n",
              "      <td>0.787675</td>\n",
              "      <td>0.749612</td>\n",
              "      <td>0.226207</td>\n",
              "      <td>110.524209</td>\n",
              "      <td>24.563843</td>\n",
              "      <td>8.780154</td>\n",
              "      <td>4.122931</td>\n",
              "      <td>1.688453</td>\n",
              "      <td>0.381939</td>\n",
              "      <td>5.750614</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>6.024042</td>\n",
              "      <td>6.011127</td>\n",
              "      <td>5.849690</td>\n",
              "      <td>5.849246</td>\n",
              "      <td>177.183887</td>\n",
              "      <td>64.238290</td>\n",
              "      <td>44.758678</td>\n",
              "      <td>78.214008</td>\n",
              "      <td>...</td>\n",
              "      <td>0.368963</td>\n",
              "      <td>0.345036</td>\n",
              "      <td>0.190677</td>\n",
              "      <td>0.175533</td>\n",
              "      <td>0.143469</td>\n",
              "      <td>-0.377845</td>\n",
              "      <td>-0.132094</td>\n",
              "      <td>0.177265</td>\n",
              "      <td>0.364337</td>\n",
              "      <td>0.315025</td>\n",
              "      <td>0.180897</td>\n",
              "      <td>0.179192</td>\n",
              "      <td>0.197950</td>\n",
              "      <td>-0.024201</td>\n",
              "      <td>0.489842</td>\n",
              "      <td>0.500227</td>\n",
              "      <td>0.441391</td>\n",
              "      <td>0.083896</td>\n",
              "      <td>0.067550</td>\n",
              "      <td>0.448453</td>\n",
              "      <td>-0.143189</td>\n",
              "      <td>-0.236526</td>\n",
              "      <td>-0.318908</td>\n",
              "      <td>-0.218642</td>\n",
              "      <td>-0.218159</td>\n",
              "      <td>0.349326</td>\n",
              "      <td>0.072042</td>\n",
              "      <td>-0.015888</td>\n",
              "      <td>-0.035920</td>\n",
              "      <td>-0.016688</td>\n",
              "      <td>0.402350</td>\n",
              "      <td>0.379938</td>\n",
              "      <td>0.046323</td>\n",
              "      <td>-0.029894</td>\n",
              "      <td>0.765875</td>\n",
              "      <td>0.426919</td>\n",
              "      <td>0.379509</td>\n",
              "      <td>0.808733</td>\n",
              "      <td>0.732159</td>\n",
              "      <td>0.911896</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>7.328949</td>\n",
              "      <td>42.812358</td>\n",
              "      <td>74.461760</td>\n",
              "      <td>98.214038</td>\n",
              "      <td>8.702826</td>\n",
              "      <td>0.631694</td>\n",
              "      <td>4.117210</td>\n",
              "      <td>-0.597011</td>\n",
              "      <td>-0.399529</td>\n",
              "      <td>0.090689</td>\n",
              "      <td>0.002929</td>\n",
              "      <td>-0.009800</td>\n",
              "      <td>111.980787</td>\n",
              "      <td>21.057629</td>\n",
              "      <td>5.753481</td>\n",
              "      <td>2.588111</td>\n",
              "      <td>1.388032</td>\n",
              "      <td>0.401990</td>\n",
              "      <td>5.619444</td>\n",
              "      <td>0.338374</td>\n",
              "      <td>0.516542</td>\n",
              "      <td>0.705551</td>\n",
              "      <td>0.947942</td>\n",
              "      <td>0.290217</td>\n",
              "      <td>89.182818</td>\n",
              "      <td>15.870400</td>\n",
              "      <td>4.621165</td>\n",
              "      <td>2.022304</td>\n",
              "      <td>1.111390</td>\n",
              "      <td>0.322544</td>\n",
              "      <td>6.170474</td>\n",
              "      <td>3.438399</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>5.223997</td>\n",
              "      <td>4.883751</td>\n",
              "      <td>4.643624</td>\n",
              "      <td>149.110048</td>\n",
              "      <td>61.553056</td>\n",
              "      <td>46.734910</td>\n",
              "      <td>66.373068</td>\n",
              "      <td>...</td>\n",
              "      <td>0.311119</td>\n",
              "      <td>0.386066</td>\n",
              "      <td>0.331398</td>\n",
              "      <td>0.259048</td>\n",
              "      <td>0.173879</td>\n",
              "      <td>-0.389145</td>\n",
              "      <td>-0.417331</td>\n",
              "      <td>-0.206554</td>\n",
              "      <td>0.398575</td>\n",
              "      <td>0.383888</td>\n",
              "      <td>0.302225</td>\n",
              "      <td>0.245279</td>\n",
              "      <td>0.328624</td>\n",
              "      <td>-0.087730</td>\n",
              "      <td>-0.137598</td>\n",
              "      <td>0.057621</td>\n",
              "      <td>0.133883</td>\n",
              "      <td>0.179161</td>\n",
              "      <td>0.105338</td>\n",
              "      <td>0.679629</td>\n",
              "      <td>0.318373</td>\n",
              "      <td>-0.071765</td>\n",
              "      <td>0.050092</td>\n",
              "      <td>0.125776</td>\n",
              "      <td>0.103813</td>\n",
              "      <td>0.793943</td>\n",
              "      <td>-0.154397</td>\n",
              "      <td>-0.134264</td>\n",
              "      <td>-0.094342</td>\n",
              "      <td>-0.099292</td>\n",
              "      <td>-0.020415</td>\n",
              "      <td>-0.103571</td>\n",
              "      <td>-0.071538</td>\n",
              "      <td>-0.048586</td>\n",
              "      <td>0.807652</td>\n",
              "      <td>0.638780</td>\n",
              "      <td>0.563594</td>\n",
              "      <td>0.936139</td>\n",
              "      <td>0.864730</td>\n",
              "      <td>0.954777</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>213.279844</td>\n",
              "      <td>85.966538</td>\n",
              "      <td>48.585487</td>\n",
              "      <td>241.390643</td>\n",
              "      <td>19.657826</td>\n",
              "      <td>1.773919</td>\n",
              "      <td>1.079959</td>\n",
              "      <td>0.292291</td>\n",
              "      <td>0.022378</td>\n",
              "      <td>0.049902</td>\n",
              "      <td>0.007782</td>\n",
              "      <td>-0.001951</td>\n",
              "      <td>141.495700</td>\n",
              "      <td>31.239320</td>\n",
              "      <td>10.276078</td>\n",
              "      <td>3.798760</td>\n",
              "      <td>1.597864</td>\n",
              "      <td>0.410597</td>\n",
              "      <td>4.289572</td>\n",
              "      <td>0.434734</td>\n",
              "      <td>0.559942</td>\n",
              "      <td>0.612397</td>\n",
              "      <td>0.812651</td>\n",
              "      <td>0.262716</td>\n",
              "      <td>107.301414</td>\n",
              "      <td>25.014481</td>\n",
              "      <td>7.778018</td>\n",
              "      <td>2.767019</td>\n",
              "      <td>1.258672</td>\n",
              "      <td>0.330673</td>\n",
              "      <td>4.581694</td>\n",
              "      <td>3.980179</td>\n",
              "      <td>2.566618</td>\n",
              "      <td>3.620904</td>\n",
              "      <td>2.750477</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>13.380336</td>\n",
              "      <td>22.280386</td>\n",
              "      <td>30.907487</td>\n",
              "      <td>54.117322</td>\n",
              "      <td>...</td>\n",
              "      <td>0.508899</td>\n",
              "      <td>0.487559</td>\n",
              "      <td>0.451324</td>\n",
              "      <td>0.291205</td>\n",
              "      <td>0.631330</td>\n",
              "      <td>0.165049</td>\n",
              "      <td>0.137329</td>\n",
              "      <td>0.253372</td>\n",
              "      <td>0.345892</td>\n",
              "      <td>0.273578</td>\n",
              "      <td>0.275772</td>\n",
              "      <td>0.305362</td>\n",
              "      <td>0.327336</td>\n",
              "      <td>0.106497</td>\n",
              "      <td>0.246135</td>\n",
              "      <td>0.124531</td>\n",
              "      <td>0.178851</td>\n",
              "      <td>0.210248</td>\n",
              "      <td>0.245501</td>\n",
              "      <td>0.856986</td>\n",
              "      <td>0.816305</td>\n",
              "      <td>-0.364717</td>\n",
              "      <td>-0.262091</td>\n",
              "      <td>-0.121623</td>\n",
              "      <td>0.073687</td>\n",
              "      <td>0.930112</td>\n",
              "      <td>-0.270618</td>\n",
              "      <td>-0.205250</td>\n",
              "      <td>-0.070795</td>\n",
              "      <td>0.150490</td>\n",
              "      <td>-0.194032</td>\n",
              "      <td>-0.129580</td>\n",
              "      <td>0.018119</td>\n",
              "      <td>0.232352</td>\n",
              "      <td>0.842868</td>\n",
              "      <td>0.673148</td>\n",
              "      <td>0.445052</td>\n",
              "      <td>0.857085</td>\n",
              "      <td>0.559804</td>\n",
              "      <td>0.861035</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>44.163924</td>\n",
              "      <td>27.442250</td>\n",
              "      <td>42.136054</td>\n",
              "      <td>177.165579</td>\n",
              "      <td>15.002302</td>\n",
              "      <td>1.353723</td>\n",
              "      <td>-1.712907</td>\n",
              "      <td>1.431398</td>\n",
              "      <td>-0.246839</td>\n",
              "      <td>-0.003639</td>\n",
              "      <td>0.003989</td>\n",
              "      <td>0.000278</td>\n",
              "      <td>135.107508</td>\n",
              "      <td>25.263878</td>\n",
              "      <td>9.126206</td>\n",
              "      <td>3.616183</td>\n",
              "      <td>1.551521</td>\n",
              "      <td>0.393050</td>\n",
              "      <td>5.453788</td>\n",
              "      <td>0.344536</td>\n",
              "      <td>0.616948</td>\n",
              "      <td>0.668856</td>\n",
              "      <td>0.807527</td>\n",
              "      <td>0.258416</td>\n",
              "      <td>108.026608</td>\n",
              "      <td>19.807629</td>\n",
              "      <td>6.955007</td>\n",
              "      <td>2.738858</td>\n",
              "      <td>1.234676</td>\n",
              "      <td>0.319060</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>5.939947</td>\n",
              "      <td>4.217378</td>\n",
              "      <td>4.971899</td>\n",
              "      <td>5.021267</td>\n",
              "      <td>4.992415</td>\n",
              "      <td>35.455547</td>\n",
              "      <td>11.237863</td>\n",
              "      <td>21.877706</td>\n",
              "      <td>64.766595</td>\n",
              "      <td>...</td>\n",
              "      <td>0.281274</td>\n",
              "      <td>0.194032</td>\n",
              "      <td>0.023224</td>\n",
              "      <td>-0.131616</td>\n",
              "      <td>-0.150468</td>\n",
              "      <td>-0.344999</td>\n",
              "      <td>0.076006</td>\n",
              "      <td>-0.046960</td>\n",
              "      <td>0.353895</td>\n",
              "      <td>0.238852</td>\n",
              "      <td>-0.039857</td>\n",
              "      <td>-0.218380</td>\n",
              "      <td>0.733632</td>\n",
              "      <td>0.000826</td>\n",
              "      <td>0.041887</td>\n",
              "      <td>-0.016991</td>\n",
              "      <td>0.104665</td>\n",
              "      <td>0.030016</td>\n",
              "      <td>-0.014210</td>\n",
              "      <td>0.353321</td>\n",
              "      <td>0.319825</td>\n",
              "      <td>-0.243991</td>\n",
              "      <td>0.073423</td>\n",
              "      <td>0.100328</td>\n",
              "      <td>0.067567</td>\n",
              "      <td>0.692748</td>\n",
              "      <td>0.107828</td>\n",
              "      <td>0.202009</td>\n",
              "      <td>0.136237</td>\n",
              "      <td>0.080719</td>\n",
              "      <td>0.099375</td>\n",
              "      <td>0.115953</td>\n",
              "      <td>0.158055</td>\n",
              "      <td>0.105109</td>\n",
              "      <td>0.680998</td>\n",
              "      <td>0.362963</td>\n",
              "      <td>0.242693</td>\n",
              "      <td>0.834629</td>\n",
              "      <td>0.653866</td>\n",
              "      <td>0.911919</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "<p>5 rows × 848 columns</p>\n",
              "</div>"
            ],
            "text/plain": [
              "       LMacro_01|2_4Hz  ...  RMacro_07_RMacro_08|time_corr\n",
              "class                   ...                               \n",
              "1           227.771604  ...                       0.888274\n",
              "1           162.957202  ...                       0.911896\n",
              "1             7.328949  ...                       0.954777\n",
              "1           213.279844  ...                       0.861035\n",
              "1            44.163924  ...                       0.911919\n",
              "\n",
              "[5 rows x 848 columns]"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 7
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "yMsqqJfdoSYL",
        "colab_type": "text"
      },
      "source": [
        "## Data Preparation\n",
        "\n",
        "We'll remove the data that was used for the competition's test set (class 2), since we don't know its true labels. This leaves just our known ictal (1) and inter-ictal (0) data.\n",
        "\n",
        "**Note**\n",
        "- I noticed in a paper that the authors contacted the Kaggle organisers for the test labels:\n",
        "\n",
        "    \"*Dr Benjamin H. Brinkmann support from Mayo Systems Electrophysiology Lab for providing information on some unlabeled datasets*\"\n",
        "    \n",
        "    so if you really want the labels, contacting them could be an option."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "-af8-SU-NvvW",
        "colab_type": "code",
        "outputId": "eb4f4088-2d8d-43c4-b43c-47e047c67d87",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 93
        }
      },
      "source": [
        "part_feature_df.index.value_counts()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "2    3894\n",
              "0    2990\n",
              "1     151\n",
              "Name: class, dtype: int64"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 8
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "byoLlXZRNxZX",
        "colab_type": "code",
        "outputId": "9e152dfb-6704-4848-c187-77f37f0bbce7",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 287
        }
      },
      "source": [
        "reduced_features = part_feature_df.loc[[0,1]]\n",
        "reduced_features.head()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>LMacro_01|2_4Hz</th>\n",
              "      <th>LMacro_01|4_8Hz</th>\n",
              "      <th>LMacro_01|8_12Hz</th>\n",
              "      <th>LMacro_01|12_30Hz</th>\n",
              "      <th>LMacro_01|30_70Hz</th>\n",
              "      <th>LMacro_01|Ratio_3_12/2_30Hz</th>\n",
              "      <th>LMacro_01|D6_mean</th>\n",
              "      <th>LMacro_01|D5_mean</th>\n",
              "      <th>LMacro_01|D4_mean</th>\n",
              "      <th>LMacro_01|D3_mean</th>\n",
              "      <th>LMacro_01|D2_mean</th>\n",
              "      <th>LMacro_01|D1_mean</th>\n",
              "      <th>LMacro_01|D6_std</th>\n",
              "      <th>LMacro_01|D5_std</th>\n",
              "      <th>LMacro_01|D4_std</th>\n",
              "      <th>LMacro_01|D3_std</th>\n",
              "      <th>LMacro_01|D2_std</th>\n",
              "      <th>LMacro_01|D1_std</th>\n",
              "      <th>LMacro_01|D6_ratio</th>\n",
              "      <th>LMacro_01|D5_ratio</th>\n",
              "      <th>LMacro_01|D4_ratio</th>\n",
              "      <th>LMacro_01|D3_ratio</th>\n",
              "      <th>LMacro_01|D2_ratio</th>\n",
              "      <th>LMacro_01|D1_ratio</th>\n",
              "      <th>LMacro_01|D6_mean_abs</th>\n",
              "      <th>LMacro_01|D5_mean_abs</th>\n",
              "      <th>LMacro_01|D4_mean_abs</th>\n",
              "      <th>LMacro_01|D3_mean_abs</th>\n",
              "      <th>LMacro_01|D2_mean_abs</th>\n",
              "      <th>LMacro_01|D1_mean_abs</th>\n",
              "      <th>LMacro_01|D6_LSWT</th>\n",
              "      <th>LMacro_01|D5_LSWT</th>\n",
              "      <th>LMacro_01|D4_LSWT</th>\n",
              "      <th>LMacro_01|D3_LSWT</th>\n",
              "      <th>LMacro_01|D2_LSWT</th>\n",
              "      <th>LMacro_01|D1_LSWT</th>\n",
              "      <th>LMacro_02|2_4Hz</th>\n",
              "      <th>LMacro_02|4_8Hz</th>\n",
              "      <th>LMacro_02|8_12Hz</th>\n",
              "      <th>LMacro_02|12_30Hz</th>\n",
              "      <th>...</th>\n",
              "      <th>LMacro_07_RMacro_05|time_corr</th>\n",
              "      <th>LMacro_07_RMacro_06|time_corr</th>\n",
              "      <th>LMacro_07_RMacro_07|time_corr</th>\n",
              "      <th>LMacro_07_RMacro_08|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_01|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_02|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_03|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_04|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_05|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_06|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_07|time_corr</th>\n",
              "      <th>LMacro_08_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_02|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_03|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_04|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_05|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_01_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_03|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_04|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_05|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_02_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_04|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_05|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_03_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_04_RMacro_05|time_corr</th>\n",
              "      <th>RMacro_04_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_04_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_04_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_05_RMacro_06|time_corr</th>\n",
              "      <th>RMacro_05_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_05_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_06_RMacro_07|time_corr</th>\n",
              "      <th>RMacro_06_RMacro_08|time_corr</th>\n",
              "      <th>RMacro_07_RMacro_08|time_corr</th>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>class</th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>114.541010</td>\n",
              "      <td>26.387061</td>\n",
              "      <td>11.821119</td>\n",
              "      <td>1.357175</td>\n",
              "      <td>0.127231</td>\n",
              "      <td>0.080726</td>\n",
              "      <td>-0.128602</td>\n",
              "      <td>0.118216</td>\n",
              "      <td>-0.101168</td>\n",
              "      <td>-0.033279</td>\n",
              "      <td>-0.004297</td>\n",
              "      <td>0.001694</td>\n",
              "      <td>13.557896</td>\n",
              "      <td>4.324083</td>\n",
              "      <td>2.012277</td>\n",
              "      <td>1.401388</td>\n",
              "      <td>1.110230</td>\n",
              "      <td>0.385098</td>\n",
              "      <td>3.004006</td>\n",
              "      <td>0.579224</td>\n",
              "      <td>0.684463</td>\n",
              "      <td>0.893648</td>\n",
              "      <td>1.246923</td>\n",
              "      <td>0.350324</td>\n",
              "      <td>10.624583</td>\n",
              "      <td>3.536804</td>\n",
              "      <td>1.587640</td>\n",
              "      <td>1.102274</td>\n",
              "      <td>0.879269</td>\n",
              "      <td>0.308029</td>\n",
              "      <td>3.106438</td>\n",
              "      <td>3.961608</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>2.507438</td>\n",
              "      <td>3.327911</td>\n",
              "      <td>3.624668</td>\n",
              "      <td>81.093381</td>\n",
              "      <td>21.245074</td>\n",
              "      <td>12.022613</td>\n",
              "      <td>1.716688</td>\n",
              "      <td>...</td>\n",
              "      <td>0.142421</td>\n",
              "      <td>0.445292</td>\n",
              "      <td>0.375804</td>\n",
              "      <td>0.048456</td>\n",
              "      <td>-0.287851</td>\n",
              "      <td>-0.255292</td>\n",
              "      <td>-0.189169</td>\n",
              "      <td>-0.323072</td>\n",
              "      <td>0.083437</td>\n",
              "      <td>0.353000</td>\n",
              "      <td>0.313551</td>\n",
              "      <td>-0.003411</td>\n",
              "      <td>-0.235130</td>\n",
              "      <td>-0.170569</td>\n",
              "      <td>-0.220663</td>\n",
              "      <td>0.649620</td>\n",
              "      <td>-0.020600</td>\n",
              "      <td>-0.345087</td>\n",
              "      <td>0.100752</td>\n",
              "      <td>0.799281</td>\n",
              "      <td>-0.212111</td>\n",
              "      <td>-0.533714</td>\n",
              "      <td>-0.237637</td>\n",
              "      <td>-0.073423</td>\n",
              "      <td>-0.217264</td>\n",
              "      <td>-0.243490</td>\n",
              "      <td>-0.387402</td>\n",
              "      <td>-0.151831</td>\n",
              "      <td>-0.017957</td>\n",
              "      <td>-0.169271</td>\n",
              "      <td>-0.303116</td>\n",
              "      <td>-0.431413</td>\n",
              "      <td>-0.376480</td>\n",
              "      <td>-0.411506</td>\n",
              "      <td>0.544665</td>\n",
              "      <td>0.179166</td>\n",
              "      <td>0.469152</td>\n",
              "      <td>0.890269</td>\n",
              "      <td>0.827349</td>\n",
              "      <td>0.822477</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>27.574621</td>\n",
              "      <td>43.245174</td>\n",
              "      <td>18.910574</td>\n",
              "      <td>0.791160</td>\n",
              "      <td>0.113143</td>\n",
              "      <td>0.028620</td>\n",
              "      <td>-0.566416</td>\n",
              "      <td>-0.064466</td>\n",
              "      <td>-0.082230</td>\n",
              "      <td>0.007399</td>\n",
              "      <td>-0.009481</td>\n",
              "      <td>-0.002008</td>\n",
              "      <td>12.721882</td>\n",
              "      <td>4.671603</td>\n",
              "      <td>2.081970</td>\n",
              "      <td>1.429230</td>\n",
              "      <td>1.163085</td>\n",
              "      <td>0.392771</td>\n",
              "      <td>2.740875</td>\n",
              "      <td>0.625647</td>\n",
              "      <td>0.695557</td>\n",
              "      <td>0.874557</td>\n",
              "      <td>1.282507</td>\n",
              "      <td>0.339322</td>\n",
              "      <td>10.070413</td>\n",
              "      <td>3.674160</td>\n",
              "      <td>1.674747</td>\n",
              "      <td>1.141394</td>\n",
              "      <td>0.935474</td>\n",
              "      <td>0.317427</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>3.654221</td>\n",
              "      <td>3.131752</td>\n",
              "      <td>3.985539</td>\n",
              "      <td>3.617549</td>\n",
              "      <td>3.786875</td>\n",
              "      <td>34.090448</td>\n",
              "      <td>55.587796</td>\n",
              "      <td>23.758051</td>\n",
              "      <td>1.081294</td>\n",
              "      <td>...</td>\n",
              "      <td>0.059382</td>\n",
              "      <td>-0.075373</td>\n",
              "      <td>-0.096533</td>\n",
              "      <td>-0.022078</td>\n",
              "      <td>-0.268351</td>\n",
              "      <td>-0.628743</td>\n",
              "      <td>-0.249155</td>\n",
              "      <td>-0.291405</td>\n",
              "      <td>0.037981</td>\n",
              "      <td>-0.046959</td>\n",
              "      <td>-0.073897</td>\n",
              "      <td>-0.044187</td>\n",
              "      <td>0.380488</td>\n",
              "      <td>0.106047</td>\n",
              "      <td>-0.186159</td>\n",
              "      <td>0.161660</td>\n",
              "      <td>0.005139</td>\n",
              "      <td>-0.108882</td>\n",
              "      <td>-0.105291</td>\n",
              "      <td>0.488385</td>\n",
              "      <td>-0.218292</td>\n",
              "      <td>0.083636</td>\n",
              "      <td>0.271466</td>\n",
              "      <td>0.280665</td>\n",
              "      <td>0.136437</td>\n",
              "      <td>-0.233991</td>\n",
              "      <td>0.132623</td>\n",
              "      <td>0.174909</td>\n",
              "      <td>0.162932</td>\n",
              "      <td>0.175522</td>\n",
              "      <td>-0.427137</td>\n",
              "      <td>-0.388867</td>\n",
              "      <td>-0.327155</td>\n",
              "      <td>-0.370580</td>\n",
              "      <td>0.750489</td>\n",
              "      <td>0.593814</td>\n",
              "      <td>0.665086</td>\n",
              "      <td>0.956724</td>\n",
              "      <td>0.870345</td>\n",
              "      <td>0.912751</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>58.207500</td>\n",
              "      <td>18.142876</td>\n",
              "      <td>8.238157</td>\n",
              "      <td>1.537487</td>\n",
              "      <td>0.341276</td>\n",
              "      <td>0.179073</td>\n",
              "      <td>0.500630</td>\n",
              "      <td>-0.156088</td>\n",
              "      <td>0.016892</td>\n",
              "      <td>0.061672</td>\n",
              "      <td>-0.002595</td>\n",
              "      <td>-0.005260</td>\n",
              "      <td>27.352470</td>\n",
              "      <td>9.916045</td>\n",
              "      <td>4.451683</td>\n",
              "      <td>2.263063</td>\n",
              "      <td>1.171889</td>\n",
              "      <td>0.389263</td>\n",
              "      <td>2.593647</td>\n",
              "      <td>0.653650</td>\n",
              "      <td>0.756241</td>\n",
              "      <td>0.787538</td>\n",
              "      <td>0.911714</td>\n",
              "      <td>0.328884</td>\n",
              "      <td>19.718376</td>\n",
              "      <td>7.602567</td>\n",
              "      <td>3.543520</td>\n",
              "      <td>1.768838</td>\n",
              "      <td>0.948547</td>\n",
              "      <td>0.311962</td>\n",
              "      <td>4.234047</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>3.460437</td>\n",
              "      <td>4.179866</td>\n",
              "      <td>3.143538</td>\n",
              "      <td>2.585988</td>\n",
              "      <td>42.881090</td>\n",
              "      <td>47.020922</td>\n",
              "      <td>15.018225</td>\n",
              "      <td>2.147292</td>\n",
              "      <td>...</td>\n",
              "      <td>0.037790</td>\n",
              "      <td>-0.264089</td>\n",
              "      <td>-0.315883</td>\n",
              "      <td>-0.230196</td>\n",
              "      <td>-0.278415</td>\n",
              "      <td>-0.227573</td>\n",
              "      <td>-0.160157</td>\n",
              "      <td>-0.065920</td>\n",
              "      <td>0.006010</td>\n",
              "      <td>-0.207061</td>\n",
              "      <td>-0.212722</td>\n",
              "      <td>-0.233025</td>\n",
              "      <td>0.122168</td>\n",
              "      <td>-0.235360</td>\n",
              "      <td>0.204504</td>\n",
              "      <td>0.361533</td>\n",
              "      <td>0.260996</td>\n",
              "      <td>0.051503</td>\n",
              "      <td>0.198854</td>\n",
              "      <td>0.775103</td>\n",
              "      <td>0.176252</td>\n",
              "      <td>-0.659213</td>\n",
              "      <td>-0.431697</td>\n",
              "      <td>-0.272163</td>\n",
              "      <td>-0.257657</td>\n",
              "      <td>0.415948</td>\n",
              "      <td>-0.748055</td>\n",
              "      <td>-0.519936</td>\n",
              "      <td>-0.280743</td>\n",
              "      <td>-0.337609</td>\n",
              "      <td>0.121037</td>\n",
              "      <td>0.126031</td>\n",
              "      <td>0.092073</td>\n",
              "      <td>-0.025342</td>\n",
              "      <td>0.779517</td>\n",
              "      <td>0.483535</td>\n",
              "      <td>0.452129</td>\n",
              "      <td>0.887166</td>\n",
              "      <td>0.724896</td>\n",
              "      <td>0.718923</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>142.960698</td>\n",
              "      <td>11.285728</td>\n",
              "      <td>13.005110</td>\n",
              "      <td>1.014406</td>\n",
              "      <td>0.283393</td>\n",
              "      <td>0.118613</td>\n",
              "      <td>-0.823890</td>\n",
              "      <td>0.296830</td>\n",
              "      <td>0.030066</td>\n",
              "      <td>-0.039112</td>\n",
              "      <td>-0.008291</td>\n",
              "      <td>0.005120</td>\n",
              "      <td>22.598741</td>\n",
              "      <td>10.054916</td>\n",
              "      <td>4.598996</td>\n",
              "      <td>2.500677</td>\n",
              "      <td>1.230197</td>\n",
              "      <td>0.382808</td>\n",
              "      <td>2.139269</td>\n",
              "      <td>0.763976</td>\n",
              "      <td>0.762331</td>\n",
              "      <td>0.842739</td>\n",
              "      <td>0.867206</td>\n",
              "      <td>0.310805</td>\n",
              "      <td>16.561888</td>\n",
              "      <td>7.741845</td>\n",
              "      <td>3.705356</td>\n",
              "      <td>1.979278</td>\n",
              "      <td>0.991894</td>\n",
              "      <td>0.308285</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>4.782595</td>\n",
              "      <td>4.389769</td>\n",
              "      <td>3.836238</td>\n",
              "      <td>4.104714</td>\n",
              "      <td>4.428988</td>\n",
              "      <td>106.109413</td>\n",
              "      <td>21.434679</td>\n",
              "      <td>13.119434</td>\n",
              "      <td>1.797604</td>\n",
              "      <td>...</td>\n",
              "      <td>0.129069</td>\n",
              "      <td>0.146538</td>\n",
              "      <td>-0.032067</td>\n",
              "      <td>0.051480</td>\n",
              "      <td>-0.267513</td>\n",
              "      <td>-0.420282</td>\n",
              "      <td>-0.408088</td>\n",
              "      <td>-0.023221</td>\n",
              "      <td>0.046785</td>\n",
              "      <td>0.107707</td>\n",
              "      <td>-0.137226</td>\n",
              "      <td>0.023916</td>\n",
              "      <td>0.337851</td>\n",
              "      <td>0.200658</td>\n",
              "      <td>0.161783</td>\n",
              "      <td>0.258978</td>\n",
              "      <td>0.162670</td>\n",
              "      <td>0.186067</td>\n",
              "      <td>0.197950</td>\n",
              "      <td>0.695472</td>\n",
              "      <td>-0.013475</td>\n",
              "      <td>-0.333610</td>\n",
              "      <td>-0.415209</td>\n",
              "      <td>-0.302744</td>\n",
              "      <td>-0.304885</td>\n",
              "      <td>0.260058</td>\n",
              "      <td>-0.375359</td>\n",
              "      <td>-0.471712</td>\n",
              "      <td>-0.358132</td>\n",
              "      <td>-0.349325</td>\n",
              "      <td>0.119059</td>\n",
              "      <td>0.116303</td>\n",
              "      <td>0.282626</td>\n",
              "      <td>0.113695</td>\n",
              "      <td>0.737446</td>\n",
              "      <td>0.529281</td>\n",
              "      <td>0.288499</td>\n",
              "      <td>0.719412</td>\n",
              "      <td>0.648070</td>\n",
              "      <td>0.786587</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>221.105230</td>\n",
              "      <td>39.302649</td>\n",
              "      <td>28.439736</td>\n",
              "      <td>3.143172</td>\n",
              "      <td>0.317358</td>\n",
              "      <td>0.072439</td>\n",
              "      <td>0.119001</td>\n",
              "      <td>0.103982</td>\n",
              "      <td>-0.235290</td>\n",
              "      <td>-0.002702</td>\n",
              "      <td>0.000093</td>\n",
              "      <td>-0.008104</td>\n",
              "      <td>23.294614</td>\n",
              "      <td>9.924683</td>\n",
              "      <td>4.444632</td>\n",
              "      <td>2.492482</td>\n",
              "      <td>1.286292</td>\n",
              "      <td>0.385283</td>\n",
              "      <td>2.393258</td>\n",
              "      <td>0.702662</td>\n",
              "      <td>0.726574</td>\n",
              "      <td>0.844753</td>\n",
              "      <td>0.918288</td>\n",
              "      <td>0.305973</td>\n",
              "      <td>18.411967</td>\n",
              "      <td>7.693265</td>\n",
              "      <td>3.485531</td>\n",
              "      <td>1.901163</td>\n",
              "      <td>1.015581</td>\n",
              "      <td>0.310740</td>\n",
              "      <td>4.456353</td>\n",
              "      <td>4.532672</td>\n",
              "      <td>0.000000</td>\n",
              "      <td>4.308820</td>\n",
              "      <td>4.333021</td>\n",
              "      <td>4.021278</td>\n",
              "      <td>375.854179</td>\n",
              "      <td>44.382299</td>\n",
              "      <td>38.994918</td>\n",
              "      <td>3.095420</td>\n",
              "      <td>...</td>\n",
              "      <td>0.026899</td>\n",
              "      <td>-0.220666</td>\n",
              "      <td>-0.086244</td>\n",
              "      <td>-0.175303</td>\n",
              "      <td>-0.123123</td>\n",
              "      <td>-0.373268</td>\n",
              "      <td>-0.317954</td>\n",
              "      <td>-0.112358</td>\n",
              "      <td>-0.033773</td>\n",
              "      <td>-0.303254</td>\n",
              "      <td>-0.221227</td>\n",
              "      <td>-0.248652</td>\n",
              "      <td>-0.060182</td>\n",
              "      <td>0.059163</td>\n",
              "      <td>-0.112238</td>\n",
              "      <td>0.172122</td>\n",
              "      <td>0.056972</td>\n",
              "      <td>-0.128083</td>\n",
              "      <td>0.007895</td>\n",
              "      <td>0.531248</td>\n",
              "      <td>0.014922</td>\n",
              "      <td>-0.283171</td>\n",
              "      <td>-0.161568</td>\n",
              "      <td>-0.212833</td>\n",
              "      <td>-0.294075</td>\n",
              "      <td>0.497757</td>\n",
              "      <td>-0.304521</td>\n",
              "      <td>-0.125158</td>\n",
              "      <td>-0.338306</td>\n",
              "      <td>-0.098428</td>\n",
              "      <td>-0.272180</td>\n",
              "      <td>0.054720</td>\n",
              "      <td>-0.089779</td>\n",
              "      <td>0.077794</td>\n",
              "      <td>0.721609</td>\n",
              "      <td>0.470620</td>\n",
              "      <td>0.344036</td>\n",
              "      <td>0.810304</td>\n",
              "      <td>0.774646</td>\n",
              "      <td>0.820247</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "<p>5 rows × 848 columns</p>\n",
              "</div>"
            ],
            "text/plain": [
              "       LMacro_01|2_4Hz  ...  RMacro_07_RMacro_08|time_corr\n",
              "class                   ...                               \n",
              "0           114.541010  ...                       0.822477\n",
              "0            27.574621  ...                       0.912751\n",
              "0            58.207500  ...                       0.718923\n",
              "0           142.960698  ...                       0.786587\n",
              "0           221.105230  ...                       0.820247\n",
              "\n",
              "[5 rows x 848 columns]"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 9
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "0yj0MCPoNzvH",
        "colab_type": "code",
        "outputId": "c043989e-8032-4fa7-e7e1-99a48e09dc5b",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 55
        }
      },
      "source": [
        "from sklearn.model_selection import train_test_split\n",
        "\n",
        "TEST_SIZE = 0.1\n",
        "RANDOM_STATE = 0\n",
        "\n",
        "# turn to numpy array\n",
        "data_x = reduced_features.values\n",
        "\n",
        "# create condition(group) array\n",
        "data_y = reduced_features.index.values\n",
        "    \n",
        "X_train, X_test, y_train, y_test = train_test_split(data_x, data_y, \n",
        "                                                    test_size=TEST_SIZE,\n",
        "                                                    random_state=RANDOM_STATE)\n",
        "X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, \n",
        "                                                    test_size=TEST_SIZE,\n",
        "                                                    random_state=RANDOM_STATE)\n",
        "\n",
        "print(X_train.shape, X_val.shape, X_test.shape)\n",
        "print(y_train.shape, y_val.shape, y_test.shape)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "(2543, 848) (283, 848) (315, 848)\n",
            "(2543,) (283,) (315,)\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "UjhbnX8pN3HW",
        "colab_type": "text"
      },
      "source": [
        "# Averaging Methods\n",
        "Averaging methods build several separate estimators and then, as their name suggests, average their predictions. By reducing the variance, these tend to perform better than any single base estimator<sup>1</sup>.\n",
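        "\n",
        "The variance reduction behind averaging can be illustrated with a quick simulation (a minimal sketch, not part of the original analysis; it only assumes NumPy is available):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "rng = np.random.default_rng(0)\n",
        "# 10,000 noisy 'predictions' of a true value of 0 from one estimator...\n",
        "single = rng.normal(0.0, 1.0, size=10_000)\n",
        "# ...versus the average of 10 independent estimators per prediction.\n",
        "ensemble = rng.normal(0.0, 1.0, size=(10_000, 10)).mean(axis=1)\n",
        "\n",
        "# Averaging k independent estimators divides the variance by roughly k.\n",
        "print(single.var(), ensemble.var())\n",
        "```\n",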
        "\n",
        "---\n",
        "\n",
        "1. https://scikit-learn.org/stable/modules/ensemble.html"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "IEm1gMUbN3xO",
        "colab_type": "text"
      },
      "source": [
        "## Bagging\n",
        "A bagging classifier is an ensemble of base classifiers, each fit on a random subset of the dataset. Their predictions are then pooled or aggregated to form a final prediction. This reduces the variance of an estimator and can be a simple way to reduce overfitting. Bagging works best with complex models, as opposed to boosting, which works best with weak models<sup>1</sup>.\n",
        "\n",
        "Specifically, when sampling is performed with replacement the method is called bagging<sup>2</sup>, and when it is performed without replacement it is called pasting<sup>3</sup>. Both bagging and pasting allow training instances to be sampled several times across multiple predictors, but only bagging allows the same instance to be sampled several times for the same predictor<sup>4</sup>.\n",
        "\n",
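        "The difference between the two sampling schemes can be sketched directly (an illustrative snippet, not part of the original notebook; it only assumes NumPy is available):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "rng = np.random.default_rng(0)\n",
        "n = 8\n",
        "# Bagging: draw n training indices WITH replacement, so the same\n",
        "# instance may be drawn several times for one predictor.\n",
        "bag_idx = rng.choice(n, size=n, replace=True)\n",
        "# Pasting: draw WITHOUT replacement, so each instance appears at most once.\n",
        "paste_idx = rng.choice(n, size=n, replace=False)\n",
        "print(sorted(bag_idx), sorted(paste_idx))\n",
        "```\n",
        "\n",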
        "We can do this with any classifier, so let's start with a support vector machine.\n",
        "\n",
        "**NOTE**\n",
        "- If we wanted to use pasting we would just set *bootstrap=False*.\n",
        "\n",
        "---\n",
        "\n",
        "1. https://scikit-learn.org/stable/modules/ensemble.html\n",
        "2. Breiman, L. (1996). Bagging predictors. Machine learning, 24(2), 123-140.\n",
        "3. Breiman, L. (1999). Pasting small votes for classification in large databases and on-line. Machine learning, 36(1-2), 85-103.\n",
        "4. Géron, A. (2017). Hands-on machine learning with Scikit-Learn and TensorFlow: concepts, tools, and techniques to build intelligent systems. \" O'Reilly Media, Inc.\"."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "lcsWLbOON8a5",
        "colab_type": "code",
        "outputId": "bf41c0ff-ca4f-483e-a81f-64171c1021cd",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 55
        }
      },
      "source": [
        "%%time\n",
        "\n",
        "from sklearn.pipeline import Pipeline\n",
        "from sklearn.preprocessing import StandardScaler\n",
        "from sklearn.svm import SVC\n",
        "from sklearn.ensemble import BaggingClassifier\n",
        "from sklearn.decomposition import PCA\n",
        "from sklearn.metrics import classification_report\n",
        "\n",
        "pipe_svc = Pipeline([('scl', StandardScaler()),\n",
        "                     ('pca', PCA(n_components=0.8, random_state = RANDOM_STATE)),\n",
        "                     ('clf', SVC(kernel='rbf', random_state=RANDOM_STATE))])\n",
        "\n",
        "bag = BaggingClassifier(base_estimator=pipe_svc, \n",
        "                        n_estimators=10, \n",
        "                        max_samples=0.5, \n",
        "                        max_features=0.5, \n",
        "                        bootstrap=True, \n",
        "                        bootstrap_features=True, \n",
        "                        oob_score=True, \n",
        "                        warm_start=False,\n",
        "                        n_jobs=-1, \n",
        "                        random_state=RANDOM_STATE)\n",
        "bag.fit(X_train, y_train)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "CPU times: user 733 ms, sys: 491 ms, total: 1.22 s\n",
            "Wall time: 3.31 s\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "mGhcz25eVcqX",
        "colab_type": "text"
      },
      "source": [
        "As we can see, performance is okay but recall is particularly poor. It is likely the model is not complex enough, or each model in the ensemble is too similar to the others to add value (we'll look at solving this soon)."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "d09pwIZFVbR2",
        "colab_type": "code",
        "outputId": "bb8e9010-22d2-456f-b968-79f60803cf63",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 175
        }
      },
      "source": [
        "y_pred = bag.predict(X_val)\n",
        "\n",
        "display(pd.DataFrame(classification_report(y_val, y_pred , output_dict =True)))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>0.985294</td>\n",
              "      <td>0.818182</td>\n",
              "      <td>0.978799</td>\n",
              "      <td>0.901738</td>\n",
              "      <td>0.977618</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.992593</td>\n",
              "      <td>0.692308</td>\n",
              "      <td>0.978799</td>\n",
              "      <td>0.842450</td>\n",
              "      <td>0.978799</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.988930</td>\n",
              "      <td>0.750000</td>\n",
              "      <td>0.978799</td>\n",
              "      <td>0.869465</td>\n",
              "      <td>0.977954</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.978799</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    0.985294   0.818182  0.978799    0.901738      0.977618\n",
              "recall       0.992593   0.692308  0.978799    0.842450      0.978799\n",
              "f1-score     0.988930   0.750000  0.978799    0.869465      0.977954\n",
              "support    270.000000  13.000000  0.978799  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "UIYGfabe23JU",
        "colab_type": "text"
      },
      "source": [
        "An additional way to get a performance metric on a validation set is to set `oob_score=True`.\n",
        "\n",
        "Bagging, by default, trains each estimator on only a sample of the training data, leaving the remaining training instances as out-of-bag (oob) instances. Since these are not seen during training, we can evaluate on them without a separate validation set, using the oob score.\n",
        "\n",
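        "The size of the oob set can be estimated with a short simulation (a sketch assuming NumPy; a bootstrap of size n leaves roughly 1 - 1/e, about 37%, of instances out-of-bag):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "rng = np.random.default_rng(0)\n",
        "n = 1000\n",
        "# One bootstrap sample: n draws with replacement from n instances.\n",
        "in_bag = np.unique(rng.choice(n, size=n, replace=True))\n",
        "oob_fraction = 1 - len(in_bag) / n\n",
        "print(oob_fraction)  # close to 1 - 1/e = 0.368...\n",
        "```\n",
        "\n",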
        "This estimates that the model is likely to get an accuracy of about 0.99 on the test/validation set, which is pretty close to what we actually got above."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "iAhQUdYA28sk",
        "colab_type": "code",
        "outputId": "8a95e355-f148-4bfb-c2a3-d36167f15169",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 36
        }
      },
      "source": [
        "bag.oob_score_"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "0.9897758552890287"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 13
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "XxZQztWUbr-9",
        "colab_type": "text"
      },
      "source": [
        "Maybe we think that the problem with our previous model was that we didn't fully account for the imbalance in the data. In that case we could use a balanced bagging classifier, but again this performs poorly."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "r01R7T_Sa4n_",
        "colab_type": "code",
        "outputId": "cbccb46f-014d-444f-c831-be1ab0c33701",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 233
        }
      },
      "source": [
        "from imblearn.ensemble import BalancedBaggingClassifier\n",
        "\n",
        "bal_bag = BalancedBaggingClassifier(base_estimator=pipe_svc, \n",
        "                                n_estimators=10, \n",
        "                                max_samples=0.5, \n",
        "                                max_features=0.5, \n",
        "                                bootstrap=True, \n",
        "                                bootstrap_features=True, \n",
        "                                oob_score=True, \n",
        "                                warm_start=False, \n",
        "                                sampling_strategy ='majority', \n",
        "                                replacement=True, \n",
        "                                n_jobs=-1, \n",
        "                                random_state=RANDOM_STATE)\n",
        "bal_bag.fit(X_train, y_train)\n",
        "\n",
        "y_pred = bal_bag.predict(X_val)\n",
        "\n",
        "display(pd.DataFrame(classification_report(y_val, y_pred , output_dict =True)))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "/usr/local/lib/python3.6/dist-packages/joblib/externals/loky/process_executor.py:706: UserWarning: A worker stopped while some jobs were given to the executor. This can be caused by a too short worker timeout or by a memory leak.\n",
            "  \"timeout or by a memory leak.\", UserWarning\n"
          ],
          "name": "stderr"
        },
        {
          "output_type": "display_data",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.590909</td>\n",
              "      <td>0.968198</td>\n",
              "      <td>0.795455</td>\n",
              "      <td>0.981208</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.966667</td>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.968198</td>\n",
              "      <td>0.983333</td>\n",
              "      <td>0.968198</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.983051</td>\n",
              "      <td>0.742857</td>\n",
              "      <td>0.968198</td>\n",
              "      <td>0.862954</td>\n",
              "      <td>0.972017</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.968198</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    1.000000   0.590909  0.968198    0.795455      0.981208\n",
              "recall       0.966667   1.000000  0.968198    0.983333      0.968198\n",
              "f1-score     0.983051   0.742857  0.968198    0.862954      0.972017\n",
              "support    270.000000  13.000000  0.968198  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "AKeNEXn6udmy",
        "colab_type": "text"
      },
      "source": [
        "So instead let's look at a more complex model that aims to ensure each classifier in the ensemble makes different errors."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "lHlRpGOF5BV2",
        "colab_type": "text"
      },
      "source": [
        "## Random Forests\n",
        "\n",
        "Random forests are essentially bagged tree classifiers. However, rather than using the bagging method above, we can use one of the inbuilt methods scikit-learn provides specifically for fitting an ensemble of trees. A random forest is just a fancier version of bagging, where multiple decision trees are averaged together to build a robust model that is less susceptible to overfitting.\n",
        "\n",
        "The random forest algorithm can be summarized in four steps<sup>1</sup>:\n",
        "\n",
        "> 1. *Draw a random bootstrap sample of size n (randomly choose n samples from the training set with replacement).*\n",
        "> 2. *Grow a decision tree from the bootstrap sample. At each node:*\n",
        ">\n",
        ">    *a. Randomly select d features without replacement.*\n",
        ">    \n",
        ">    *b. Split the node using the feature that provides the best split according to the objective function, for instance, maximizing the information gain.*\n",
        ">\n",
        ">3. *Repeat the steps 1-2 k times.*\n",
        ">4. *Aggregate the prediction by each tree to assign the class label by majority vote.*\n",
        "\n",
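        "As a toy illustration of steps 1-4, the algorithm can be sketched from scratch (the synthetic `make_classification` dataset and all settings below are purely illustrative, not the seizure features used in this notebook):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "from sklearn.datasets import make_classification\n",
        "from sklearn.tree import DecisionTreeClassifier\n",
        "\n",
        "rng = np.random.default_rng(0)\n",
        "X, y = make_classification(n_samples=200, n_features=20, random_state=0)\n",
        "k, d = 25, int(np.sqrt(X.shape[1]))  # k trees, d features per split\n",
        "\n",
        "votes = []\n",
        "for _ in range(k):\n",
        "    # Step 1: draw a bootstrap sample of size n.\n",
        "    idx = rng.choice(len(X), size=len(X), replace=True)\n",
        "    # Step 2: grow a tree; max_features=d randomly restricts the\n",
        "    # candidate features considered at each node (steps 2a-2b).\n",
        "    tree = DecisionTreeClassifier(max_features=d, random_state=0)\n",
        "    tree.fit(X[idx], y[idx])\n",
        "    votes.append(tree.predict(X))\n",
        "\n",
        "# Step 4: aggregate by majority vote (k is odd, so no ties).\n",
        "majority = (np.mean(votes, axis=0) > 0.5).astype(int)\n",
        "print((majority == y).mean())  # training-set accuracy of the mini-forest\n",
        "```\n",
        "\n",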
        "Instead of using a majority vote, as was done in the original publication<sup>2</sup>, scikit-learn's RandomForestClassifier averages the probabilistic predictions.\n",
        "\n",
        "**NOTES**\n",
        "- We cannot use graphviz on the whole forest as we did for the trees in the supervised learning notebook, as each tree is built differently.\n",
        "\n",
        "---\n",
        "1. Raschka, Sebastian, and Vahid Mirjalili. Python Machine Learning, 2nd Ed. Packt Publishing, 2017.\n",
        "2. L. Breiman, “Random Forests”, Machine Learning, 45(1), 5-32, 2001"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "Ynqo_JfPXkvL",
        "colab_type": "code",
        "outputId": "e86edf4f-c68b-4fc7-d46c-ed0b4e223258",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 175
        }
      },
      "source": [
        "from sklearn.ensemble import RandomForestClassifier\n",
        "\n",
        "forest = RandomForestClassifier(criterion='gini',\n",
        "                                n_estimators=1000,\n",
        "                                max_features = 'sqrt',\n",
        "                                class_weight = 'balanced',\n",
        "                                random_state=RANDOM_STATE,\n",
        "                                n_jobs=-1)\n",
        "forest.fit(X_train, y_train)\n",
        "\n",
        "y_pred = forest.predict(X_val)\n",
        "\n",
        "display(pd.DataFrame(classification_report(y_val, y_pred , output_dict = True)))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>0.996296</td>\n",
              "      <td>0.923077</td>\n",
              "      <td>0.992933</td>\n",
              "      <td>0.959687</td>\n",
              "      <td>0.992933</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.996296</td>\n",
              "      <td>0.923077</td>\n",
              "      <td>0.992933</td>\n",
              "      <td>0.959687</td>\n",
              "      <td>0.992933</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.996296</td>\n",
              "      <td>0.923077</td>\n",
              "      <td>0.992933</td>\n",
              "      <td>0.959687</td>\n",
              "      <td>0.992933</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.992933</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    0.996296   0.923077  0.992933    0.959687      0.992933\n",
              "recall       0.996296   0.923077  0.992933    0.959687      0.992933\n",
              "f1-score     0.996296   0.923077  0.992933    0.959687      0.992933\n",
              "support    270.000000  13.000000  0.992933  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "80GjqqNn3EzV",
        "colab_type": "text"
      },
      "source": [
        "Let's look at how a decision boundary created by a bagged tree could generalise better than a single tree (as we did in the supervised learning notebook)."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "VJGbW4VF3G5s",
        "colab_type": "code",
        "outputId": "31e09e22-2bda-4014-ded3-bb0789f131d0",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 265
        }
      },
      "source": [
        "from mlxtend.plotting import category_scatter\n",
        "\n",
        "x_axis_label = 'LMacro_03|D4_ratio'\n",
        "y_axis_label =  'LMacro_03|D2_ratio'\n",
        "\n",
        "reduced_features_reset = reduced_features.reset_index()\n",
        "\n",
        "fig = category_scatter(x=x_axis_label, y=y_axis_label, label_col='class', \n",
        "                       data=reduced_features_reset, legend_loc='upper left')\n",
        "\n",
        "feature_list = list(reduced_features.columns)\n",
        "\n",
        "two_features_data = reduced_features.iloc[:,[feature_list.index(x_axis_label),feature_list.index(y_axis_label)]]"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD4CAYAAAD8Zh1EAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjMsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+AADFEAAAgAElEQVR4nOy9f3ib9X33+/pKsuzYkvM7TmI7DSSQ\nAIEkJsTQtAktW4OBJ+66PgTatduDWLfT0F1rdj3dTp+ddevOnqtnu+puK7BdHeLp6FlLc9YW+aGY\nwLKS0BQEIb8aQqAO0NhO4iROgiXH1s/v+eOrr+5bsmRJtvwz39d16bIl3brv25L1/n7uz08hpcRg\nMBgM0x/HZJ+AwWAwGMqDEXSDwWCYIRhBNxgMhhmCEXSDwWCYIRhBNxgMhhmCa7IOvGDBArl8+fLJ\nOrzBYDBMS954440LUsqFuZ6bNEFfvnw5Bw4cmKzDGwwGw7RECPHrfM8Zl4vBYDDMEIygGwwGwwzB\nCLrBYDDMECbNh56LWCxGd3c3Q0NDk30qI1JVVUVDQwMVFRWTfSoGg8GQpqCgCyGeBO4Dzkkp1+TZ\n5k7g74EK4IKUcstoTqa7uxuv18vy5csRQoxmF+OOlJK+vj66u7u55pprJvt0DAaDIU0xLpfvAnfn\ne1IIMQd4HNgmpbwJ+K+jPZmhoSHmz58/ZcUcQAjB/Pnzp/xVhMFguPooKOhSyn3AxRE2+QzwYynl\nqdT258ZyQlNZzDXT4RwNBsPVRzmCotcDc4UQLwkh3hBCfD7fhkKILwghDgghDpw/f74MhzYYDAaD\nphyC7gJuBe4FtgL/lxDi+lwbSim/I6XcIKXcsHBhzkKnKcHzzz/PqlWrWLlyJd/4xjcKbt/SAk1N\n1q2lZQJO0mAwGLIoR5ZLN9AnpRwABoQQ+4C1wDtl2PeEk0gk2LFjBy+++CINDQ3cdtttbNu2jRtv\nvDHva3p7ob7eut/TMwEnajAYDFmUw0IPAB8RQriEENVAM/BWGfY7Kbz22musXLmSa6+9FrfbzQMP\nPEAgEJjs0zIYDIaCFJO2+APgTmCBEKIb+BoqPREp5T9LKd8SQjwPHAWSwBNSymPjd8oWLS3KOtbU\n1UFHx9j22dPTQ2NjY/p+Q0MDwWBwbDs1GAyGCaCgoEspHyxim78D/q4sZ1QCU8XVUVeXeey6usk5\nD4PBcHUzpSpFpwL19fV0dXWl73d3d1NvXzVyMNarAoPBYCgHppdLFrfddhu/+tWveO+994hGozz9\n9NNs27Ztsk/LYDAYCjKtLfTxcHW4XC4effRRtm7dSiKR4KGHHuKmm24a+44NBoNhnJnWgj5ero57\n7rmHe+65Z3x2bjAYDOOEcbkYDAbDDMEIusFgMMwQjKAbDAbDDMEIusFgMMwQjKAbDAbDDMEIusFg\nMMwQjKBn8dBDD7Fo0SLWrMk5bc9gMBimLEbQs/i93/s9nn/++ck+DYPBYCiZaS3ooUgIKSWghjeH\nIqEx73Pz5s3MmzdvzPsxGAyGiWbaCnooEmLn7p34D/mRUuI/5Gfn7p1lEXWDwWCYjkzb0n+P20Nz\nQzOBtwME3lYDKFpXteJxeyb5zAwGg2FymLYWuhAC33pfxmO+9T6EEJN0RgaDwTC5TFtB124WO9r9\nYjAYDFcj01bQw9Ewwe4grataaX+gndZVrQS7g4Sj4THt98EHH+SOO+7g7bffpqGhAb/fX/hFBoPB\nMAUQk2XRbtiwQR44cCDjsbfeeosbbrih6H2EIiE8bg9CCKSUhKNhvJXecp9qTko9V4PBYCgHQog3\npJQbcj03bYOiQIZ4CyEmTMwNBoNhKjJtXS4Gg8FgyGTKCfp0CGpOh3M0GAxXHwUFXQjxpBDinBDi\nWIHtbhNCxIUQnx7tyVRVVdHX
1zelBVNKSV9fH1VVVZN9KgaDwZBBMT707wKPAk/l20AI4QT+H+CF\nsZxMQ0MD3d3dnD9/fiy7GXeqqqpoaGiY7NMwGAyGDAoKupRynxBieYHNvgT8CLhtLCdTUVHBNddc\nM5ZdGAwGw1XLmH3oQoh64LeAfypi2y8IIQ4IIQ5MdSvcYDAYphvlCIr+PfCnUspkoQ2llN+RUm6Q\nUm5YuHBhGQ5tMBgMBk058tA3AE+neqgsAO4RQsSllM+UYd8Gg8FgKJIxC7qUMu30FkJ8F3jWiLnB\nYDBMPAUFXQjxA+BOYIEQohv4GlABIKX853E9O4PBYDAUTTFZLg8WuzMp5e+N6WwMBoPBMGqmXKWo\nwWAwGEaHEXSDwWCYIRhBNxgMhhmCEfSriFAklO6TI6U0A7UNhhmGEfSrhFAkxM7dO9Nj+vyH/Ozc\nvdOIusEwg5jWAy4MxeNxe2huaCbwdoDA2wEAWle14nF7JvnMDAZDuTAW+lWCEALfel/GY771PlIV\nvgaDYQZgBP0qQbtZ7Gj3i8FgmBkYQb9KCEfDBLuDtK5qpf2BdlpXtRLsDhKOhif71AwGQ5kQk2Wh\nbdiwQR44cGBSjn21EoqE8Lg9CCGQUhKOhs1gbYNhmiGEeENKuSHXcyYoOs1paYHeXut+XR10dOTe\n1i7eQggj5gbDDMMIepkpRWDLQW8v1Ndb93t6xu9YBoNhamMEvcwYgTUYDJOFEfQJYqyW+0Rb/gaD\nYfphBH2CGKvlnv36vXuhqQk6O+HECXC5YOVKJfTlwCwgBsP0wwj6GMkWvq6uzOdzCWwwCOGwEmS9\nTaliGY8rgdci39MDBw+Wtg/IL9zGdWQwTD+MoI+RbOHr7FS3eNy639KS+ZpIBCoqMsV4sjDCbTDM\nHIygjxJt2Z44ASdPQmUlNDcrIa+oAG8qI3BoSG1XV2eJZSwGnhJaqLS0WK4VUO4VVwmfnHGfGAxX\nB0bQR4m2bE+ehKoqJdwjYRfQpib12mBQWeuxmHosn9D29sKWLdb9nh617d691pWAy6WEO9/rS7XC\n7QuQvm8wGKY2RtDHSGUl9PVBMgnPPad+Dg7ClSvgcCixz0aLZTisrHmPRwluKe6Ojg5rYdBkvz7f\nVUSuc7Hf1/s3GAzTCyPoY6S5Gfbts+4PDUE0qoQ9kVAiWlcHv3lviAtnPAgEEsmCJWFW480Q5KNH\nobrauh+Pw5o1+QW5s1M9rsnu4lDMVYQRboNh5mAEHcuS1cFMewpgLsHL9mlHIjB/vvo5Z44Szs2b\nlUtk5Uo43RfizYadzPI2s6XWR9cCP4cIUn++DbDK75NJy/cOcP68EuRjx9S+QyHYswdqatQ5hMPg\ndKptEwn1s7ZW/Q16UTl6VD0+MABut/p7pLQWjpoadRyDwTD9MYJOpiXr9SpBHskFku3T3rNnuKUO\nVmqhxMM7vc0MrQjw+nw1XKL6vVaWzPekj9HZqYT24kV1P5FQ9zs61O9CqJvHAw0N6hwctl6Z2jof\nGFCPa4HXP+3bLVhg3Q+ZgUUGw4yhoKALIZ4E7gPOSSnX5Hj+s8CfAgIIAf+HlPJIuU90osnOFe/q\ngsZG9fuJE9Ddbbk/XC4l/lIqgXS5lHUeiWiRFwz1++BDAc6dAyQQ8NHF8OESOsip0YIspbr19akF\nZ+XK3Nsnk8NdL0KoxwYH1f1z56znHGVsoGzPpunsVD/1eZrMGoNh/CnGQv8u8CjwVJ7n3wO2SCkv\nCSFagO8AzXm2nRDKkaanc8W7uy13x6VL6jktjh0dymURj6tj2At7XC4lrv39ABKaUsMltNg2+eGg\nD3KI+kgkk+pcjhxRv+ciW9DzdUjWQl+IYt9PezaN9u1PhVx7g+FqoaCgSyn3CSGWj/D8L2x3XwUa\nxn5aY6PUND2d6WG3sONx5d6IRFRAMRxWAgjKbx2NWmIYj1ul+Hp/GWLrDkN9EN5uVSLe5Ff3j2
2H\n6Oha2OYT81LQ5z9SyiSMLu1xYEDtX7uhzGAkg2H8KbcP3QfktYWFEF8AvgCwbNmyMh+6OHKV6jc2\nDg+Idnbm9ouDCnxeuKAEf/Nmtc3AgApAZvusASXau9sg6gGEEvUxiDnukLUvpFowRrsv4NAhtVhV\nV6u/f9MmS9xDkRASdSyJJOEIYw/k2rFn3Wi/f1UVXL5s5dqDcb8YDONF2TyoQoiPoQT9T/NtI6X8\njpRyg5Ryw8KFC8t16JLQ1qa+DQyon0KogKgQmW6CWEz5rJ1OJfjxuBJz7U7Zt09Z74lEAas56sVy\nr4ixifnWnSkXTsqVs3WnenwMSGn9/XrBC0VC7Ny9k75lfiSSrgV+3mzcScJV3LGkVO9dPK589fo9\nty+oBoOhfJTFQhdC3AI8AbRIKfvKsc+xMNYqx2AwswJTSpUPfuSIEryaGu0bVwuCFvIJcStEPdDT\nDKsC6gbKlRMtoZdAHs6fz0yb9Lg9NDc0E1ge4OWIOtbsbpWdk01Li9X2ANQCWFNjXcEYDIbxZ8yC\nLoRYBvwY+JyU8p2xn9LYyXc539JiFemAKtTJhb15Vn298o/r7aNRKzVQFw9NLCmXjRZzGFVwNRdS\nqquNEye0X13w3HM+vvqUOtbAAERf8LF7UOBwqO2FUO+F263eG70g9PdbefNDQ2phLLbVQbGYHjUG\nQybFpC3+ALgTWCCE6Aa+BlQASCn/GfgLYD7wuFBRw3i+AaaTTW+vFegEJV41NZkBUV2QI6Uq5RdC\nifbRo+oxbY3rLBYoPlukPNgyZjSjzJjJRTKp3oOTJ+FXnRL/IX86MHxlACo2+hna66OiQhCNqoVP\nW+bRqFWNKoTKd29uVkLe16dcVTqfvrt7zKdqOkUaDFkUk+XyYIHnHwYeLtsZjTP20vlcPcSbmpSY\n6V4sdtHW2S2Qmf89oRkc45AxY0d3cQyFQLrD/Nm3gwy+1Yr7XR+u6/zEGoLg3k40dSz9foDlYgGr\n2ArUzz17lOjr3jbZbQiMtW0wjB1TKZqDykplvUP+QKdObZxwyp0xk0UiYVndRLxE/3cbsT4PFxBw\n3gfu3MfSsYQ9e6yUTu07z+fagswGYhUVVr8aY20bDKVzVQl6scFSna6oG1ppsbIL+KSIuSZDUMeQ\nMZMDKTOt7sEPijuWjiVEItYi2N+vrnK0a6uy0rLMYzH1/o/UQKyQ1W5a/BoMmVxVgl7MJby9yOjC\nBfVYOYp48lLmnPJyYw/6ardTrsVMxxG0+0nfX7BAuW82bcotzjo3PReFfOTGJWMwZFLGTh4zg44O\n5VfftElZlB6PlcUB1k8x9vjjuOWUj/mc0v0JJLLCOhcp81+ZaCHP/pmN7lK5f7815Bos6z0WswZ4\nGAyG0riqLPRS6O21gnhXrlhWunZHlCUQOo455aNCLzA9zZkB191to75q6OtT79Xever9vHJFtRrW\nHS07O5WAN6QaRtjdKiNZ7waDYThG0Iskl2Wqc69Hz/jllI+KcVhgtMtmcHB4t8dgULVZyM400mj3\nl70tQzny10eDycIxTAeMoI+AdgPkczOMPTA6vjnlpTP6BWbWLPVeZV+5OBzqluu96utT/d/tlrhd\nKO2W+njmmxcj1ibn3TAdMD70FKFICJlSIykl85eEaGiAFSuUWOXC5bIEa1Q+dXtO+Q/a1c/6oHp8\nIkn7zSU0PQHCpr7av5/vpW71/rhc1ntgfy+ys4Ps6Erb7m6V5XLypHLNtLSM9Q8aTkuLWhj0zX6M\n3t6JOQeDYbwxFjpWE6rmhmZ86334D/lZ/gdBfry1DW+ll6Ym1cfFnsUBSqi0eHk8Kj0vmVQiN2dO\n5iCJnBTIKZ+QClS73/zY/XDLU+rA//YsrNlVsGgpHldXMps2wQsvqMeKPWf992WPwNu7FxYutAaK\nHDlijdLTLhy9iNbUZHaHzEchC1tXw9q3NximG0bQsZpQff
WpAH/yL6kmVGda+a//5OH5lFAIocTa\nLrIej1WEtGKFyt4YHFTb6VFyBRkhp1wfx16xWnay/eYxD7z1KXUeRRQtSaneg44O1X53cNBa5AoJ\ne3aKoyYSsYZtV1bmX9iSSRVk1eI7nn5uk/NumA4YQQeEENx/4/38SSRAVZWaTJT8+f3suyJoalI9\n091uJTR6XqcQSshjMatcvqYms7AmLyXmno+vlZ7lN0+64ODDpP3mBbJbhFCFVy0tmS4oKD7GkP33\n6ffvyhW175oaa2CG/TXZQj9aP3ddnTXwG3JXtpoAqGE6YAQd6B/qZ/n/uJeQo5+BvmUkak4Rb72X\nmsBPqa+vBdSXfv9+qxGVFnH7sGhQbgItKnrAcwajSA0cX0EfW2A2mVRW+QsvWC4pu6iPBT0oW7uy\n7A3R7HR2knaL2Vsc29+3kSzsjo7c1r3BMN0wgo7yoX/QD9IFiUQSSCBlppjlstAK5Unb+6anyXZx\nCKh6v5WhEVIDx9WXXqZmXzp2EI0Ot8xLPf/sqlMt4tn71dtcuaIWUT15Sbtiksni3TDGAjfMBK56\nQQ9FQvzl3r9EHL8frnkearuh6gP4f3cza5ayzo8eVf5hTU3N8ECexm4JxmLKms/sm25zcQhwOeGu\neT5erhVpt0J2mt+4WuhlbPalR/PZG5clk6R7pxcbB8j+exctskb+VVaqdEfIjGHkI9sNY5/9Ciaf\n3DCzuGoEPZ+l5nF72Fi/Ef+6RxGeswDI0GK4/jnq5cP09AgSCSXOGh3w7OrK9L3W1GQWydiPeeSI\nFjSbi0OqhMCjTj/RmI9580Q620I3qRpm4Y8HxTb7KuD711k92cVWOnCqC4uysVvwuaz5ixfVexcO\nW+4XUH51u7/b4bDcMvPmqR4y2cTjJp/cMHOZloIeioTwuD0IIZBSEo6G8VaObFHmC5gJIdh+03b+\noOqvEQOLEf3LcJ3cRnL5a+z7/gN4K73DfMLa2rb7y+371Ngtv5aWlJ/ZHUY2BBHvtFLzjo8PfdLP\n2blBXLO2Ew5708M3KiutIKwWqUnt8DgObQE02cHObLKtfbAGZUQiagHZs8cazgHWlKR8lHt6ksEw\nFZh2gp4rZzzYHaQtlTNe6r5qKmrYdXwX8xI3Ek0KRA3MXgjXD30zvT8pMy300bhAdGfB+novcdpw\nLvFwOimQx3zQux1H3IujQomMlKq3yYkTSrTmzFH70G6HwcHM85kQxtAWQAdOR4vdhaPdN9qFE4sp\nd9iWLZkivWKFFdi0L7Ra5O155/X1mduMNv3RtAcwTDbTTtDTg4vfDhB4WwlL66pWPO7S+o0kXGph\nWLt4LYfPHObeLXVUOCtYt3gdR86+RtvWBzK2H60f2/4lP3FCVSQ2N1sLj0AQC3szilpWrFCum5YW\n5fO1j3UDK20yfU4T0oJ36vSd0XGJ6mr1Xmn0hCQ9A1a/73ZhbWlR4q0XxFwpiqNNfzTtAQyTzbQT\ndCEEvvW+tJgD+Nb7EAVq77PT1pbUWQtDPBHH5XDRurqVh9Y9xEBsIMPa93qtfGstojo/PfsY2di/\n5N3dygesz0MPeMiHTqfT6ZL2FL40xbhCyiL4k9d3JnuwiN1K7+pSn4t98He2nzxXIBQytzEYZgLT\nTtClVIOL7fgP+QuK+vBLX4GUamFwOdXboPeR7bpZuVJ9+e1TjLQY5OsUmAs9Wi07cHrihLVYCKHu\nt7Soc7ZcNWr7YFBl2KQtdO0KWa1cIUKAPGFzhXhOw8f+Anpuh4MPwW2Pw+Ijpfu+x3mWabEIAWvX\nZg7I6O7OHPyd7Tu3Cz6ovPVNm0zlp2HmMe0EPRwNE+wO0rqqNcOHvv2m7SX50EtZGLR1P9Jl+mjR\nVrjuF673n89y18JluYCUK8R5U0AV4oBlNbtDSsydMVj1DGz8NlRdgv1fKb0l7jjPMi0WISx3VFPT\n8Fmke/cqAX/22czXhU
KqP4z+fSTf9mjL/E17AMNkM+0E3VvppW1rWzrLxbfeV7KYQ2kLQ742rsVQ\nzJc82wqHkf2vDoca7QYwOCSJrvFTWZP6u8KQ1K6QqEdZ5qsCUP96aoPF8PoXGZWbZBxnmRZLMmk1\n7qqvVzUCsZj6u3NW5qbQC+Dly1ZmC+QOXI42kFns60zw1DBeCDmuVSv52bBhgzxw4MCkHFtTKP0x\n+4vX1WV1AISxfRGzj72uOczSJeBMehAIunskC5aE6TvjzRjwEI+r26xZyhJNuJQPvepCM55f+Qhf\n5yeyIAgvtCGHvEAS/nA9pHLs6bkN3v7k5A7SGCd0EzN7VsxIOJ2qunXLluGf41hEt9Brcy3epbju\nDFc3Qog3pJQbcj1X0EIXQjwJ3Aeck1KuyfG8AP4BuAe4AvyelHJa/HvaxTuX7zw7awGsL57+0o5k\n6WWjRTwcDfPl3V9m3eJ17LhtB/5Dfi7c9jLdQ4Kavo8w/5SP2Bo/eweCVJ1qQwgvFRXKytyyJTNr\nxuPxMrS/jSqnh0hEUHHMR0XFdoTby2BSElv7uHKzhBfDB8tgcB7Uvzop7pLxRrcwLhYhlLsml3tr\nLBkrJtvFMFkU43L5LvAo8FSe51uA61K3ZuCfUj9nNKV+aXX+/Cv/3ozr2EOcWh/jyTl/yx8//lNq\nZrmoPbONJbVwzfYAoDJ43vmHVqqcHoQztY9Qbktv3z5vepuqSkE4rIS6yhsmvvQI8hdfgde+CE1P\nKjH/2dentpinsnJcLkE8XnxWjq6qzddiILsNcTyuXnP0aO4smJEwbhPDVKSgoEsp9wkhlo+wSSvw\nlFS+m1eFEHOEEEuklGfKdI7TAt3xT5P9Bdf587tqAlRtCeBCIiJzIemiqgrWJh6m5xjQ0p5+jfOo\nD+HNdIssXKhcLaCybY4dU4VHQ0NWQU1Xlwqexq94qXypjUjIgyxDIHOiB27Ei6lILSElM5/QJxJW\nY697782/OGfXFHg8Vu57KVZ4rpYRo2WkhcUsOlcf5QiK1gP2jOzu1GPDBF0I8QXgCwDLli0rw6HH\nl1KyFgr1CNEB3D8hgEQy5O4ieU69Bxf74D8uPgGzoMH2Grnez+AxHyLl63a5lJh7U3qle8w0NKgF\nJRpVQqFnezqdUFPhJQp4vBAOC+RIYl5AHCck3FJKRWqZ2xFIqVJTY7HhIltXBy++mNkBcnBQba9f\nqykk2IVaRpTCSFeKxvVz9TGhWS5Syu8A3wEVFJ3IY4+GUlLbRuobAplpkpIEMecluLIOnvlfJDc8\nSbzxZZAiI+vm8Nog82Lbcca96WPqKkhQlnkopPz6TU0q17qqSglSPK4E/8oVa3uPJ3fDKmBce7WU\nRgkVqWNoR5AP3XY3GrUC4J2d6qYrdO0MDVndNHVPmHIKtsFQCuUQ9B7AlvtBQ+qxGU222Bfqja7T\nJBsHWqkI+ji//DF6EofBPYA47MN5fDvJJPh+bEvH/NbwFEp7G99CCJE5PaimRgUNc43TK0ocJ6TF\nQCkVqeVvR2DPjjlyRP20u2qyr1J05arDMbwnTDENwIJB9ZmUElw3GPJRDkFvBx4RQjyNCoZ+cLX5\nz0OREIvqPJzqDSPiNUjXAEvqPIQiVhpkOn/+v+hUxR1Uzw2TxEsSlDAm4J571Bc6V9YNKFG2W9nx\nuFVgo2dxahGXUlmaQigL0uVSN52rnSlOBcRxoiz4kipSx7cdQTH92z0edVXU12e5awYHlU/ejl3s\n9dVdZ6cSc4dDXV2NpmBtJLegKXS6+igmbfEHwJ3AAiFEN/A1oAJASvnPwHOolMVOVNrifxuvk51M\n8gWYdPbKfX+1lsNnDxNLxKhwVLBuyTp27j6S0QUyO02yIullIG4JsMOReQx9THse+sqVmVacznTp\n7lZCogVco11B8fgI7hagoDiOg3sjJ6VUpI4g/iLmHXefv9OZ2aP9ypXhFnyuQLL9s9Nu
MrCasJXC\nSNa8sfSvPorJcnmwwPMS2FG2M5qi5Asw6eyVZ048Q1d/F2fDZ1nsWUzvQC+tq0fuArlp0/CSfzv7\n91tNwRwOJdbZl/V29MJgb7GbbWXaxV6LzcKF0BcOkxjRMp7AbovFVqSOIP7jHaBxOOCWW1TKo3Zl\n2Z/Lft+DQRW8ti/S4bA1Qs/hIKPj5kiY7BVDPqZd6f9Uw979sbG2kbPhsyyrXQaicBfIfCX/+gsb\nDisrsJjqRz0MAyxhiMetFgEXL1odGzV6+74+SCQKWcaT121xRPKIv8OhFrbxmviUTCoxz+71nt16\nQL/ffX3KAte9Z7LdZqA+62LcIrmMCyPyBoAyzGa/utHZK1JKuvpV9uap/lMgST8+EtrPqW+6pW59\nfWZQs9DrtTVu314IJSI6EwNUqbvLpbZzu9WoNrdbiUnaElevxpW0iaXdvfGDdvWzPqgen4Ikk+M/\nvk9PTUoklHCPtODqIHQ0ai24eq13udTvW7ao+01N1q2lpbhz0f8z+jZSW2bDzMVY6EWSL8B0JnSG\nV7te5e6Vd3P47GGuxK5Q7apm9YLVvNr9Kttv2p4u988V5NTdFvUXUF+Oa0HXQqxHq+3Zoy7xsy0y\nl0tZpDowqnOpN29Wz+u2u7pjpBDWsGur4jQzzTGDKdJtcUIoMpvHLuAOh1oU7Y9pl5b+WVWlrrou\nXx6+HyGsz14LO5iUR0NpGEEvklyXr6FIiK+99DXWLVnHFzd8kccPPM6hM4f4yqav8Lf7/5Z1S9bh\ncXsKjsnLvoTWRSnz5imBDoVUMZH2sYfDlu9dt43t7FQ+Wk32+LWGBiX0drHQgx9OnFDBuYGBTFEa\nluUxBbotjjtjyObJvprSF2f6px52rRfn7Fmq3d3Wez9SymMu48JY5AYwgj4ihfyS9nF4u0/uBtQ4\nvJXzVnJ74+3DHi92TJ7Lpb6wWqA7Oy1Le98+K4iqh22Ayn7J1bHP/jfE40oodLm6rm7VAyJ0dale\nOAYGikvdm1GMMpvHLtT5GMkl43Zbve5h5JmnuYwLPVpPY1IUr06MoI9AodLpkcbhjWZMniZbnLOL\nlvR0I13VqDMoCv0NR49a+dJgCYi9H0n2cU+eHH9f9NRihGyeEVwx2p8OxS2CQgy/GtJFX9nDVIop\nPiq202e+VtGGmYER9DEgpeTm3/XTZevVcXOHn6PffYgnDz+Zse1IY/IKFYDYn9dfdJfLEg6dQTHa\nasNcl/cHDqj95ovpFttzfPqRJ5vn2P2w9U/yumK0r7wYtE89l/BXV1utHOrr1WfT16eey1d8VOhK\nUtdKNDc0ZwxzyecCNExfTJbLGAhHw5x1Blk+1MpHu9tZPtTKWWeQs+GzvPzrl9l2/TbaH2hn2/Xb\nePnXLxOO5s4I6ejIFPHe3u2nVd8AACAASURBVMzsho4O9SU/eFD5wN1udTleXQ3z56sv+ZYthTMc\nampU4HTzZnWrnhOiu0eqfGgk1XNDdHcr3/pIYg4zVczJn80DlivmwW3qZ09zhismn0hn43RaQWmw\n8taFUFddTU3KzbZnjyXmLpf6zO1uGU2hDBe7a3Db09sIvB2guaG5aBegYfpgLPQRKGQ5eyu91L/Z\nRmOdmjLUeMEHvdvxVjLMEi/kbink3sm2wgYH1dQindVi943n+xtiMSXoAHFHiMidO+lLNCNPqElH\n1KtJRyLL4tTCnn1/RjJSNk+Bwqpi3henUy3EUipXVnW11T0TVNaRFmZQol5RUXzRUS7G6gI0TB+M\noGdRaoGGM+61ZW4LnHEvHrfkI8s+QuDtAO3vqP7m+YKi+ng600RnrWRjF/zubrhwwXrO4chtuUHm\nuetj9fSAxINjVjORjQHinw4g4yhrdMiTt8qyaCG3+5rdKQd8tJbxa+hVZnJm84yusMrpVNlKQ0Mq\nNqLRxoHLZQW2dWxExzi0e6Wy0tomFis94FnKQHTD
9Ma4XLI43Rdiab2kvh6W1ktO943YAIW6OjjV\nq1wXPT2wqE4FnHzrfRnb5fvyaKHWVlg+YbbT3Ky21a6TefPUF91enJQLu2tHIBCHfFy+ZEu3S4mT\n3RK3i7jLBYsWWb/nosKTSvtr8isx/+x98Jl71e9NfvWce+T3dEoyysKqREJZ3WC5zXSaYVOTEnE9\nsES/11VV1v+Cy6UC3itWqFuu+af6KmzvXmXR62Er2m1nH4je/kA7rataCXYH87oADdMXY6HbCEVC\n9Ny0E2im8YKPrgV+ehYECUXyB492PTM84PTl519l3ZJ1GdsVsoi0FaaFuRQrrLnZylApNOtULyAS\nyVuz/CSTMG8+nDsHjg1+nEd9yKRIt4UFyy+cTFol6/ms9eSQPe3vGaj8QD3x6c+CZHwaek0EeVwx\njniqW2YBtGiD+gx0qqh+b+3P261xLeD6c+3thdpa9by2+Ovq1GdfW2ultJ48qYQdbJ0+3bbWzDdt\nH3EgumkdMD0xgm7D4/ZQfamZs9cEODtH+Rur3xs5f9wecAq8HSAYBPeprfxk9mFqLrVyY8THp/9G\nZRXk+hLZi3pABTxzoYdb6GBkNKru27/UUPyUmlcPholsCCLfbKX/LR/c4CdZHyR5dDukBmp4bH+2\n9vnq8+7tHd4iFiARt6f9CfigkQyXxGT3fhkLOVwxepZrIZJJS4jjccv61lc6eoF0ODL/F3SA3P65\nnjypfur7ukBM9/5JJKyfVlFSaQPRTYXq9MQIug0hBDdGfLwyZAWP1kdG9jNmB5wiEdgY20HiYhhn\n0sPp3twWEagvkS7X119ojyd3R8V8jbzseeP2BQKU1Z8vPz0W9jL/DTVvdMtdgmef9eE8sR2R9EIq\nJdEeiAuF1P7377cWFYdDZdmAlY2RTNp9zRJmp6YT9i9TFvpUaOhVRrKbcY2E/jeqqBjeCTPXfjwe\nZcnryuF8aYvxuNrO3k9G/8z+XzKW+MzGCLoNKSWf/hs/FW9bj7X+vh8p84t6roBT1wI/jResWaD5\nhlWAFQDVgbBcAdFi6e3NzIjI1V/b3syLkJeqlEAIIZjvsc7x3Lnc+xfCysoYHFRCri15Icj0NR+7\nX/nQpUT85N+QN+waYVjFzGdgoLQFANRin6syONd2xTAaS9wsAtMHI+g27MEjewFGLus632s+9EM/\nlxqCLL24PbNb4QSRLyMiVzMv3U4gH3bx0K6BgQGVK51d4Zi2Nm1teIUQyH97FoCFc2o5V6aGXrmG\nRkwHtJjbz9/e5ld3wdQtBOwibY+x6M9Ci7E9QG3ft7NIdxCMnKJr3DHTByPoNooJHhV6zY0RH2de\n206vbbBzPuyjyLTfdM8e9QVt/mgIKTNLtevqvAX7ddgt/J4ey5LK/lJ2dg4fcn3hwvCGUXYffW+v\n1e9FCJAVKj1xwQLBuXMSURVGDnnTgu10QjyqHMcqzbI8Db2mo5jbsZ+/LvR67jlrUdSirKuCPQVi\nyJWV1hAUh8MaZu12Fx9gNxb3zMAIehbZY+KKKY22b/N8hwCKE63sMXKaU70hlv/BTvyHMku1dz0z\ncql2LitL+73DYTX0WAiV5pirmVf2eezda/3e2wtdKXd4PJ4S81RXQtntQ9zqx7U8iHy+jfiAOkdt\nkbrdMGeOSt8rZkRbLpxOy1+fyx00XenvzxRzLd46/qEXUj0YQ8dcIPOzqquz4hsVFWqB3rQpf4qj\n/b6dXO4Vw/TBCPoUxBHPzJyB4ro15rKympqsRlC6xDyfv7WzUzXw0gKbTKpgm93qb2xMFTb1eZCp\n9MTzqwMgobKnFeH0MGuRcg9s3gzPPqtcCHYRdjjUJKXLlwt3KMxFrhFv0xn735J9VZRr0dfxlmyy\nC5f0a7OD2S6XWjAaG60UV719LveKGTY9fTCCPoEUG1wSlF6qXaibnhbBZNLyrWefj75s177XaHS4\n+NfVKWvRIQTJ
wz7k6oD1mkM+wmHBQKprYDCY/70YGlJBvni8OHFOJFQ/8Zkm5tmEQlY6qNerPqOO\nDrXY6uwl3ZXRjl2Ig0H1GelahM5OFffQr4lE1Gd9++3W60fyixt3zPTBCPoEki+4lG0BLaorrVQ7\nXze9v7rzr5AsAQSz50hkRZhLZ73pczlxQl3CNzcrEdCCD5mCYe/GCMrC+/hdkq4Ffs7OsazxI5f9\niJ/5mFWlXpwvI0MHUisrSxPnGdsQLA+hELzwwvAxdNpNpf9nurqUQGvB1/EY+wIgZWaNQ6nZNhqT\n8TK1MYI+Bcj+QoQiYXbuLj7bRhc3/ej4jwicCICAT1z7Cf7sP/6M2JrNJDt9XL7GT3JpkGRHGx6P\nl/p69YXXFvjFi9b+tGA4nUrEYzHLf1tfrxaChCPMpZogiy+34jjio/X3/bwyN4jbs52hsDfncAy9\nX5dLNaXKDgYWwu1WQjRaMZqOJJPw4otwyy1WO13d8+XECSvDxZ6uqgvA9DhB7ULL597SLXqrq9X/\nw5EjqhPkypVqsaittRbTSETFMuw99A1Th6IEXQhxN/APgBN4Qkr5jaznlwH/CsxJbfNnUsrnynyu\nVwXaddK2tY2aipp0X5hcYq6tpc5OQdRxP1Hf16mUc/Eml/Ffrqvk/Q/ep3Hrea5psfzwj73voaF+\n+HGzs1s0W7YMv7KIRmHvi16GZBtdQ6oU/ksf8RGR23ElvCSTuUU33bWxKoREvc5dKYkSJjlYOJA8\nGn/7TCCRUOJ99Gjm+5pIqPdEDwe3u2KKGTC+Z4/6OTSkPhuvV92GhlTfGN2X/eRJq/YgEik+590w\n8RRsziWEcAKPAS3AjcCDQogbszb7c2CXlHI98ADweLlPdCagXSv5mmhp14n/kB+P28PjBx7ny7u/\nTDgaxuP2EIpkNrVKC62QVKz7ISIyl2T1WS7Neo3HDjzG79z8Ozhttem+9VaxEyiXh+4do8XW7VY3\nh0NZhbkup91uuOsu8Lq9uFwClwtqvQI55CUWy29BO53gmBXCcd8jhK57guc6JEM3PEHyE49Mz4Zd\nE0g8Pvx9TSatIdQ66G2fY6rjJdno9Ma77kp9jt7C4m+YHhRjoW8EOqWU7wIIIZ4GWoHjtm0kkOpU\nwWzgdDlPcqZQyNdo7wvzo+M/4s3zb/Kx5R+jpqJm5CkzFWHida9RcegRajcGCIkuLg1eIpKIZPjc\n/Yf8LKrz0dOjHtNpcR0d6rJ6YAASrpBVFIQkFBme/56v06IWBYfDukS3pyUqAZJEZr0Htx0muSoA\ntV0QmQ15m/aSEntp5bBXhkDOvCHVbnf+q5BZs6zS/uxUTyHIaKamg9rV1Zb/XG9nH8Khs2XsjcEK\nIYRlBED+wjXjW58cihH0eqDLdr8byC5Q/0vgBSHEl4Aa4DfKcnZXGfa+MC6Hi7mz5tI32EfrD1uB\n/KmLIualZv83CX/ohwgEFYONrJq/iF1v7uLzaz/Pw00PpxeE/++Z3H74TZvgpVdCJH5jJ84zzcRf\n9/Gm2881fxikvq+NpXXejBQ6O/G4NdEehqfhgTVl6aObvRy9+DnO3PDnJLxnQYD8j0eGiXM6m8Ud\ngpZHYN57cORzADibniJxfgV0fHtGifpILiX7XFf9nupFs6JCCb5uDVBZqYR8xQrlqhkczHyd5sqV\nTMtcd9J0uSyhrqtTWTL6uerq3Pntppp0alCuoOiDwHellN8UQtwBfE8IsUZKmRHuEkJ8AfgCwLJl\ny8p06JlDRl8YActql3Gq/xSNtY1psbdb3NqFIyUMhAXJxa9R+V4r6yM+Pv37fn5+6udsv2l7UVWv\nHR1wd4uH4/FmPrglQPTaAC4n1NNKY52H07Yv6LFjcPhwSiDcIcBDMpkanFwZhsjwY0Sj6rZ7Nyy+\nJ/M54bDscz0rtbJSbZ+IeqDrI7DkMHz8zwFIhBdD16bp2YY3D15vpjVdLE6nes
/sQ6Xt7ZSbmlSQ\n0+6KsROPWyMMc1nUxsqeXhTjOesBGm33G1KP2fEBuwCklK8AVcCC7B1JKb8jpdwgpdywcOHC0Z3x\nDMbeFyawPcC8WfO4NHiJhFTOU/8hP9L2jdSzRvv7YfCyl77vt/Hrn/h4vkPw7//Dx6H/+W3u/HAt\nTU1wzz0jV722tMC5XsH8U2owhxAwZy4ZTcY0uj2rszqEaFHDLFrukXg2+6m4bydUhqze3NpdAiAk\n8VmnudT4FFWJxTjOboTwYuTNT6V96Hq2psejrEHVe/zhVBveFB8sU4/NkI6NYGWulILLpbJRVq9W\ngrxihbLQ7UMuurqsQSXZw0rmzVPv8+rVRrhnCsVY6K8D1wkhrkEJ+QPAZ7K2OQXcBXxXCHEDStDP\nl/NErwbsfWHC0TAVjgq+sukr7LhtR1GNwuyPn+sVLKu37he6BO7tVROauhb4qapS1iJYnSOzZ2c6\nHCDjHjjdTGJVgNfrAlQD115u5R2Hh82b1dzS59kJ3c1qcMY6P8mGn+OVy/j69jt5uOlhnjj4BPu7\n9vPtfxJsuWP4ZfuhQxKanrDa8ALMPqUem0GiPppiqXjc8n/nayMBVpfHZNJy00hpNfsqR+WnqSad\nGghZhFkghLgH+HtUSuKTUsq/EUJ8HTggpWxPZb38C+BBmWNfkVK+MNI+N2zYIA8cODDmP2AmU6j6\ncyQK9U7PtX1dY4g3G3cyd6CZd3/kY+7H/VyZG6T+zTaWzrd86A6H1dJVIon99jZqZ6v71wbb6e4S\nNDaq5w4LP+KGgLX9W61ceOl+vJXe9N/1ifvC9J3xcuJEys2SsDI2ZEWWD905BDftgksr4Pl/hIow\nhFXx1LSZWVpGHA5laTemLmDsxWKgPnddeATWlKRZs9R9exsAjQloTm2EEG9IKTfkeq4oH3oqp/y5\nrMf+wvb7cWDTWE7SMJzRNAobC66kl5u62nAmPThXCt74iY9wdPgVgc5ikUhYn1nR2rfMz611vlST\nMsGChT4u3RhIW4azT1lifneL5ExfmF8d81JRkdqnra2saiXrhY5HSbtttu6EN++HI78La5+CjY8q\nS/2VndD0JNS/Cj/7OoSXjut7NVnY+9HbB5jYpxll54mfz7pWzl7s9+wxAc2ZgqkUnaGUeglsbW+1\n/c23iPzmbyoXTcIVpuemIBd/2Ur4sHKp9NcH6X5pOy0tXp57TvKN3X4CqYEh8USc8Mfu5YmDKvPm\neKUfNgZxvdtGldM6TjisrE5INfVKW9wSem5X4+1WvqDun2mC5n+Etd9TvvXBefCxv4Dd3yqvpe4O\nWfNEJ+BKwJ7CqEfXhUJqwbtyRV3F9PerCs/KShWkFsJy3XR0qDqCcrs+THri1KYol8t4YFwuM4NQ\nJMSiuR7iMYFwKKFLXPGydi3sfSWzx8wTB5/ge0e/R427BpfDxb59sHyolfd+rPq/6N4vethD7t4t\nEh7cZt39wTPwh03gOavu99wGb3+y5DF3Onko59fBbbUK5qBPjdGrD6YGeRQW9bE2FPN6ydlKAVRw\nuqZGvWe6V0syqVxiuvQ/m9razF494TDcY8s8Gsk9l/1a+6xZw8QwZpeLYfoxFv97KXgrvWnZFFkD\nLLKHfzzc9DDbb9rOZ3/y2fQ2jRd8nKkUw6bx5BZA+7zS1P1PPqQyYLSgz+6Cgw8xmmBpXtsm6lFi\nviqQGn6NGrFXZNrkaMRcFwFpX3cudE8cPWdUH0f3u2lqUtkuMHyYeLYIP/usOlY0qvapaw2yLfB4\n3HL5gJWfbpgamILfGYi9hYDObd+5e+ew1gEtLeqL29QEazeGuLtFKZqUcti2I6Fzx/VNCCttbssd\nXm69VaQ7Bu46viv9uspKOOr0U98gWbFCpd7198P69cpiHFaObp9X+oN26Lxb+cwjtcoyDy+Gqktw\n2+OMWHmag5EvVEXK4rcxzoOu9fkMDirRzL
UoZL8/8bgVh5BS+cWFULf6enWz90XX6L4vd92lrPwt\nW6zt7e4Vw9THCPoMxN5CYNvT2wi8HaC5oXlYlamu7qtrDHFp406OV468AORj0yYlCtoy1H5eLQpa\nGOx59u0PtDP/0lYG5r5K0hVGIok7hh9PX967XKoi1vWfbYjDKTF9fQd8/39DrBre/iSuJw/B/q/A\nkiNK/MtG9pUBqfvj567U/XRG6rHidKr3OlzCn6onGsHwnurF4HKpdEd9y9cGwjA5mI9jBmJvIaAZ\naUCGM+lh7kAz7y8JsO3p3BOSRnLhdHTkzpzIJjvP/pzjMEvEOhrrPHQt8HOIIKFIW7p3jHY7aB+0\nwwE3X6+O2dUFjY2CI0euR774LeSQh7gW+XIELO1BUHcIGl9WVwZ2H3oZBl5n43RaXRRHwuGwqjt1\nNagW12h05IVAu2jyofvfg3KD6SEboBZvM6Ju6mIEfYZgF9xkMsnjBzIbXo40IEMgaLzg4/2G3AtA\nvgEaORuF2cglDOBVw6bxEHLfTnh1gPMrdwNQ/Z5aRPKN0ssumDl4UGddeDl8WHccVDNdx9RpNzsI\numYXIODY/aTdL2USc90pUTfXqqmx2uDW1KgrnVzBYY9nZHeIHoChXTc648luUdvbATgcapt4XA30\n1v8mTmfmcUxGy9TGuFxmANk+88cPPM6jrz3K1hVbaX+gndZVrQS7g4SjmdfmOlWxu0dy1OnPsNzs\nbQaKdeHYcbkyXQFagHp71UzSd08K4q/7iMbg0gdKsXTbgdP9p9PHLuTP1+0PqqpUquOcOSO/V0W1\nibUHQR/cpn52fcQm4OXr9Kg7KIK6CunvJ91Pvr8//5Sm/n41qs7hsFIW9TASh0MNodBtIfr71e8H\nDyoLOxxW2Sq1tbBwoYpZxOPq+TVrVNBzwQJ1yw7KhiKhoj8bw8Rj0hZnANrvbXexbF2xlS9u+CIO\nh6NglksoEuLGL+9EnG5m/ikffcv8yKVBjn/LssCllGx72koXbH+gPcPaz5WfvH//8BS3lStV8UuF\nt5/LS39IYtWPYeGbVInZrDj/R8TFEO8vfIwF7z3Cwvd2EFvj5wNPkAvfbyMW9qb7f0tpTdUB5XJY\nkOoeZB9InU1trRK0YUHGYXnmIfhtKxuHH7QzXkHQfCma2amUOsCpA896rFwsVlomzUhVxHqghZ5+\nZB92UcqVWrH56iavvXRM2uIMJ5fPfMdtO9KCW6jK1FvpZd6BNhrrPIh6QX3CR9eB7RliXmjGaa4v\nYXV17hQ3WRHi4sYvIee+B0d+BwYWMnT9s7yz5OvEztxEZXczsWW7Ob1sN0NDMPu0csVclJm91u0D\nlUFVRBYK9OXMmR6WZ/6EKlSK1IBMfUWa/OOW2ZIr68TuDnG71TY1Nern4KA12CIWU7/nSzO0o8Xz\nxAkl2rozYzaVldY8WHuvF/uV2lefChCJwOwzrWz+Sw+Ls4470tBq+zkWartrBL80jMtlBpBPcEu5\n+nLGvemuigKBM26r3MzKTsnnwskmV0ZEXR3Ernjg1EeVYK5uh1kXEeEGZkfX4KKCio7/lbGf+ad8\n3N4s8HiUW0W7TbTFqtPu7r0X1q4dOXc7J9kulht+nBra+Sllmb/dqoKgZc2cschehLI/Nns6qMdj\nvaaiwtq2mDRDLZ56H+Hw8MlZdXWqncCKFaRTSbWAasMBVGykqgpuSfhoqBcjHjcSUec6mlRIfc4m\njbI4jIU+A7ALbjFDpUslu0CoUG91zcqVw62vjg5oaRG8tNdH/MYAEkmipgtH/4fod0A8IQnf9d8Y\n7AOnAxJJeNPt5+Q+HwNhVYCk3QvZc1CDQSVGpafSpQKdumgo6YLv/zTlJy9vELQU7JOeQAVIdVxC\nysxMmGAwt7VdKiNZv7kMh1zdOA2Th7HQZwBacLULxLfeVzADJZtC8051Qy0ovlFYvn1KJIlb9CCP\nBMy6hB
iah/snARzv3g31QVzvb8Wxqx3XyVacy4JQEU67GHIhBAwlQ0gkmzapoxQ/pzRHnvmaXbY7\n4zfuzuWyRsaB8vEvWqQe02X22qLWLXD1grVwobV9dkOulhb1XHW1utXWWlWj2rrWVnOxVq/dcLg2\n2M7iy61cqgmScAy/crF/9rFY/lTJQv93htIwQVFDTsazdcDajSHeWa581hXHfIRWP4ar4TDOfV+n\nIroY99yz3H7zYrp7B3DEPSxuDONKqmP39Chh0g2q0tj84CLVKKzofitj7NVSCg6HEmj7QIt8Ac1F\ni1SANztoqu/rhUC/DzrtUeeKZwc3dVD6xAkl5tnTjYpB/1/cc4/gbK8k6QrjjHuL8t1rSvGDGx/6\ncEYKihpBNwxjtHnnxdLUBL98J0R8wMoqEbPP4Lrra9xQ20zjBR9dC/z0oHqxL6sbPqijvl4NOR4a\nUq4Hpytl9a+yAsPpQqBi3AHj2E3RHuScNUsJcjE9UHSRUTaLFqlccVCZPRcvqt+rq2HzZus9ypet\nUmqv/LFQiiAb8S4Ok+VylTFW69qezRB4O0A8GedTqz+Fx+0pm7WeHFSvV14cAaElOM428/6iQLrA\nqXGglSXzPRmZD11dymVw4oQ1rEHtIcsPDqVlpUSt85Eyv4tFt7V1Oq1AZaGUQbvNZHeN6JRD+yDt\nZFLd1z+tc1K/OxyWtS2lEnZ9/P5+eO45KwNGCEvQweqv09mp3j+XS1nspbg5ShXdUoZHm0HTY8f4\n0GcYxTbmGgl7NkM8GefNc2+mnyt1f7kKUerqMtPypASZFIhDPiorlZW5eTP88l/VoAxdFHPwoBJE\nneHh8SiBq62FufPK02+l0AWrfYRbsb1Q7D7yZFKlHUL+8n79nuQK/iaT1vQhGF4oZV8QEgl1JaD7\nqIMSzC1bVCOulSvVe1qKFWyyTqY2xkKfYWRb1zC8L0sh7NkMTuFk7qy5PPr6o7S/3U7wNah6X+Ue\nC0a20PK5bnY900bdXC+xmBJIZYFKIjf6ifYpVwrAzR1+fvmvVq57S4vK8tACqcfUVVbCUMLWiXEc\n+63Y/d75e7arc5w3z7LIcxYzkSnWulOibgcgxPD9O53Kkq+oICPjJxu92Nh7ous8cFBZMeFwcfnr\nE41uGRGLpUYjTqFzK5aJal+djRH0GUapjblykZ0G+cTBJ/jrfX/NUs9SIhEXGxM+RL3aX77L4pYW\nONvroW9ZM7uWBPhqZYDmZmtxqamxfL9SAu4wcmkQ3mrFfcrH0nv9vDUQZF3z9nROfGenlXOuFwNQ\nU3uSSS+80AaRlB98jKmGdjeHnYULLTdHtnXscimr2ONRVjBYi9NI6GCmdn/ogKYu7sk+J53Xrdsc\n6F7yg4PDq0vtgm3HnhsO4+feKGVylt42HLauwOrrp5/rZbxjUCNhBH2GUUxVZyHseecAQ/Ehblp4\nEy6n+nd5f+FjLD+/I12IlIveXmhIVZ2+XhVIi5M+Dz3n0io199K3pw2iHqLVqlnYOy9vZ9lHrS/A\niROWz1oLlp7YEw7DgtleLl7UQp/fD55PrO3ke97eViDDZeIOMXeOh3BIsGKlJO5QmTmVlXrByb2/\nWbOGTxZqaVGtDLKpqLBcKdnVnDU1VgWpHS3Ye/eqnydOqJ+RiOr3UiqljjYsxbLW2+ZqxDadKMdV\n8mgxPvQZxmirOrPReednQmd47PXHWFSziMD2AM7oPN5f9CgR15mC+5BIuhaMXMFaV6cEaWgI5JAX\np8OqVhU2QQ4GLQHLFkftzrh8OfNxPZJt2HkV41Z3h7D87wVy2lNpj6GVfpwuSWyNn9fn7eSlX4TS\nhUB2P7qdSCRz0Ii2qGtqUjGCWdZ5xOKSZEUIh2N4Nef581YzrVmzrDiDJh63fOd33ZW/7L8Quhma\nvk03V8hEYI9BaUq9Sh4txkKfYYy2qjMfS7xLeGTjIzzf+TytP2zF6ZXM
eecRLvx6SdqHno+EI8yl\nmiCLL7dysv1+jpz5Ia/MDfLH7fcjEFz3IXVOLpcKgtp9pz09mRWfkYjlmgAV6HM4lFBqSzk7yFio\np3gGGf3P+6Hlj/Be/ggc8hFd4yeyIAjP58lLT7UOiN4YYGh5gBNA8petyIuZFpl9tqh91F6u7I6V\nK9XgkTcbd3L+kMrXv+ZTfi7VBJn7WhsHX8s8j1AkxLlzVn74iXfDuKU3bwVpNJrZsz4ez/SxT6bf\nutSrgKlGOa6SR4sR9BmIXbyLrerMhxCCHbftYPdJ1bP89mZB+7d2FPzHrKuD3i4vc8+0IZFE7/wT\n5tRu5MaL32Tv6l0klgSpi7XhSnrp7IRTvSHqGzwIBIvqJAlHmM5Ob1p0tItAi9PevUqEsiskR8Ww\n/uc/hLknuTKnF+fyAFWzYNGZVi5XeRhMWr57HbxMJgU1nT5mbQ5wsS+1z0O+VC67lduedFm57bpb\n5EhXC86kh0u/bGZoeYChDwU4FoeKQ63cOD9zobD7bM/2+kiu9cPKIPI/2+jrU+9hJKL8+doyd7uV\npa7Zs2fqNMma7lb/eLfiGImiBF0IcTfwD4ATeEJK+Y0c29wP/CXq+vCIlPIzZTxPwyQxGmvD/uV3\nxr0sqpNcTjTTOydAYzzIDQAAHO1JREFU75x24rNRJf3zlDBdszrExj/PDCL92beDbPqYEvy4I8RL\nL3pobhaqbYAjzMqV6othL57p61P+ZbslXBS5hkAf/TyJVe0khOonU7Hfx5UBkVHoY/Uyl1xZ5cfe\nE0xueAzqDkPP7cOqT0XMW9DtU1cH+/YKwmEfPBgAAS4n3DDo4/n9me+73Wf7bnOAqiq4nlYa7/Dw\nn3uUcOurH92Qq9R+N+ORIz5TC4nKfZVcCgV96EIIJ/AY0ALcCDwohLgxa5vrgP8T2CSlvAn443E4\nV8MkMBqffHau8rlekR5eoak45ksHVR3x4QM0qi8140x6iDuU20Gu96cHcbw+byfzl6h89nBYZZ1c\nuGBVVZYk5sDwIdBKbd2V6WfpX+Gnwi1ZuBDuu88aEHHffeCYFSaxJEj/a63Ev9dO/MRWJeZn18Gq\nZ+Az96iFoqcZop5h/nR7P5NTvSEW1Uk6OuDalUkqP/IYbrdVdNS3bHgXzVw+28YLvoygdXOzcmut\nXq1837qP/GQyk3PaR9P7qBwUs05vBDqllO8CCCGeBlqB47Ztfh94TEp5CUBKOcKIAcN0QOfReiu9\nfPMT3wQYtbUhUYFCHdSUQHSNn+4eJTqL6wT//j98vLLcSrVM/qeP16oEQxEPsTXNxK4N8KYzwKxZ\nqgf3hTOqB3dNjXJ95O1zXlQ5f1ZRkkjA2u/hfvdzRPc+THKDn+TSILHEdobOe9m3zwrE7tuXqnp9\nvo1owqOOsTgl5q9/EW79F6i6BOduSleu6nFzlZUquKmtUrvrREofF5Y/TrTxUWqOPcKsYzsIrfRz\n5fogZ0L3sMS7JJ3jHIqE2HVcNRPT2S9HnX7mn/LhcuW+isr2U9fUTG+/tUFRjKDXA122+91Adpjl\negAhxH6UW+YvpZTPZ+9ICPEF4AsAy5YtG835GiaA7DzaXcd3ZeTRlmptdP46TOXaILPPtDL/lI/Y\nGj93/GmQtq1qYZBS8qHf8meUqX9ws5/4IR/uCkHFMR/J6wLEE1YPblEv0sHD+nqr5D1tvLpDcPdO\n6FbNuipvV4FNmSuw6VZFSeKdVuQbPsStfmTjywy9sR0QJA/4wG3ltA8NWW6dqqpU2mHSS3WNmpUa\nPn07FTcHiG74F6g5C+HFgDNjSIbLlen/z5W3H6mD6l8+Am98kYsDgsR5H86T93ALX2Nxoplf/qty\nT/381M+RUqauoux5z9u5/5PenEJdqmuj1EDlTHWnTHXKFRR1AdcBdwINwD4hxM1SyoxEMinld4Dv\ngGrOVaZjG8rMWPNos7/8Iubltott
OJNqIlL3MR9t/6LEvKUFTveFOd0QpOK1Vjy/UkVFlxYHcbq3\nI6WH2JrhPbhP/1T1R49EVM72MDdL1APdyi8uVwUYEsCJVoh6MnzsTie4nV4i/9HGfK+HD9yC2CEf\n4s3tJCIp4U8HNwEkMUcYR8ISd3tLW4FAvuEjesOPlGUeXgzfOcQtn3+St64JwjvbSVzxUlWVOQ0o\nV95+ZSUMHthBIi5IJsHpEMxKLmEpzbxfE2Db09Znc/+N96cv8+1XUeUS0VL3U4zPfbpns0xFihH0\nHqDRdr8h9ZidbiAopYwB7wkh3kEJ/OtlOUvDhDLWatPsL39TE+n2t6BET1v5vb2wrN7Le/vaqHJ6\niESsoiK39OL2hog1BHH8qhX3IR9XbvPz1sIg8dB2RNRL0hXCmfRAUoCQqm96zKsabB1SzbqEQ+Vk\nxw778NYKNm9W56G7DLa0wN69XiJDqaCqELiFF6pgMDG8ta7rmiB3xdrY+6J3WLql06XcNyLpQp6/\nCSFdyHVP0nDhId57ZTuzqr0MOZQ/e+9e9fc3NamCn65uydJ7/UgRB1w0N8OR95/gRrmdV16qpaoK\nhobU+6MbmOX6bOw+26lsKU+V85hJFFNY9DpwnRDiGiGEG3gAaM/a5hmUdY4QYgHKBfNuGc/TMIGU\nY6RdITKadiFxV0JkSBCLwekeQUXSy+AgXDzjJdLextArPhJxQfx1H1UvtSGiXpzVSmzlulQTrvV+\n2LqTytoQ8xdIGrf5cblsBTbr/cgCzboSCXWLRFIukagHcboZsTo1nm51ALqb+fnPPAwOwk9/qgKy\n4bDyQ58+H6ZieZDK91vxtj9H5XutiIYg3b0DEPWmG2Xt3auabJ08CUePqirPCx+EOXr555y/HGbo\n6Da2Xb+NnoXfY0/llwhFQ1y8COGBwsVadsoVeMzVZM0w9Sgo6FLKOPAIsBt4C9glpXxTCPF1IYQe\nA78b6BNCHAd+Bvx3KWVf7j0apjqlZrboL3tLC6xvkqzdGKKpSVmHMHwqzfwlto6QqWrS6t/ayYfv\nDGVkYcyapaofF9Z6qfUK1qyBG1YLttzhxeOBOdUq3VCuCiA+sw1xg8okWX2th0gyTJcMkjjeSvS7\n7USPtiKXBglxhpf2Snp6YFGdEqbeXmsqkMbqdijUwAwJpPrI1J70cd1KQW2t6u2ycKF6fWPqOnZ1\n1ze5a56PzZthy4L7uflMG0de89Lfr/zt/f3q7/N4lA8+bVxHvThe+EeVMrmina9+L0DySg3J9z6K\nHPIQj0PSFaaHII0DY6sELoXRdPA0k4gymagF0Qy4MOSk2G5x9gDqYw+ropZLNUFu6mqjt8ubc3CC\nFoXA24G0u0IHTBfXidTcUWXFautaZ4SAsjT16/r7JXxmW3rfjh+284l7w+x/ycNALEx1RQ0D0QGI\nepjbeIaa+74G3c38+idW8PC1/9saovHss0BliAqpipyisST1n3yc2Id2p49R+Z4613dPiowc+ApP\niJrf2kn4nWYqfulj7sf9XJkbZP25Nl78aeZ7Z/WwId1/xuVSHRoHhyQi9Tf196MGVctUiqcDPn53\niBeeLa6TX75hFqV0A7R/XhpdNDMR5ezTnXI36xppwIXp5WLISbF5tJlFLds4OyfA3AGVQ54Pe960\nzo/+9U98HDoo0n7Vjg6VM/3hO0N8dLOkuVm5ZuYvCdHTo8T92hWSyg/7WbRITfFxOaHiw49xrPHL\nxNb4qanwcO2nnsR17048c8PccfMS5g4088ESK9+9uaEZR9x2ru4QfGIniVv8JJISbnucC9c+ysIP\ntnJbp5qjeWVuUFV9AskKNcdUSnDJGhaLtczeGMDx2W2s3R7gG19q5oVnc78XOsVQV57G49B3URK+\nznKpuJzguFXlwLtcyqrvO6M+G23l6UyhXFZfLku5VIt7MnuTzATs3xH7/914NOsygj6DmAw/ZzFF\n
LdkU66OfvyTE6/N2ctSpiopia/ws/4Od7H0lxMGDsO+VMAvXBql8r5Wle9sR77TC4sPMvrKO+IoA\ng61qgXGeaYaYsrjVhHoL33p1VaAFT8SVz9xxQwDx4Dacq3az4L1H+ND5L6ZfX/9mG6ff8xKKhrhw\n607ON/hJuvphnZ/LNa+TJKYCmxLuv/H+tPjaP5v5S0LpBlu6Da/DofrfyKVBIkdbif5rO/JEK8ml\nagFJJq1hy/lE+TfvDbF2Y4j1TZKmJrUI6vdLN9MqVWAmIqYyk5nIBdEI+gyhHJOKRoP9y24vaunu\nkXn9psX66F941sM3vtTM2u0BGv77Nq5pyRQeb6WX499qS1v3H5/rY9X736Li8A4kqoBpaAhcv/SB\nFOlKU/sEev8hP889Z52rp0ZQedyHBKprlAtkzZUdnOlx0NMDe38R5t0THgYGoLomSU3oFhZu+RHs\nWIO841tc8vyCUPUxhrxv8v7l97n3+/fS80HPsM/GvjCtXasqTu+5B2qrvHh/0cZvzPPh9agc/Kq9\nbdS4vBm9bPKJcu95yaWNO0mu9bO0XnK8cvj/QakCM9oOnlM5kDqR5zaRC6IR9BnCRF7W2bF/2Zve\nbmf2mUyXRC50rwstIr71vpz+RLvwBIOqKvOxh33ceqtIB1ztrqHnOwSHgx52POGnpUX54T/8kTht\nP3uCDz6Q7P1FiI/97sv8z88PFyadDbJ5i+T67X5qatQCFYnA8Up/2t1TuW0n1/62H2d1P5Hfvo/B\nFbuQUiIckivu9xh0XCAm+nHLOQghEEKU/NmImBeBoLlZBYHvvMPLihXKzaTdJvlE2Rn3MnegmbNz\nAry+chsfLBl+rFIFptjPy85kGRjFMNHnVq6W1sVggqIzCCkl2562AoTtD7RPiJ9TB9huvVWwtF41\nz3IlvaOaJm/Pm9YtA65pCbBvnwogLr7cSuMFH6d7BHtfGR7YA9IBqPtvvJ/7fnAfUkp++pmfsuv4\nLn5+6uf8493/SG1VbUYwUAcPde8YupuZfyozyHu2y0PfMj+RawL09QGzT5F09+GaFUUmBNI1iOvK\nMmLuXmouN1NZCSe/8f30sfJ9Nva/+cQJ5SfXlni+IKYu9//qU4F0xensM61c+k8fW7bA6yvVsYaG\n4PLj7YSj4fTr+4f6+VLHl/johz46bhN1pnIgdTLOrZwj6UxQ9CpgMv2cGQFUREYRUanY86YXN4Y5\n61SWzbVBFZC8VBMk4QiTcOW2soC0Nemt9PK5Wz6Hx+3hsz/5LIG3A3xk2UfSX6RcwV5X0stNXW3M\nP+VL+8xv6lJdHwVWkzGnAwgtRUqB88oSKhN1uAaWk6w+owY5O7ro64M1D+xia0sS/yE/8WQ8/Xk8\ncfAJ+odUAxr70IgtWyxLPF8Q8+bf9XPNjj/iTx99mcuvtuL+UTvLh1qhIYisCHHUqfrm6GrTx15/\njC/v/nL69buO70IIkfbvF2Nxl8pUDqROxrlNVLMu0w99hjCZPZjHC1fSS/2bbfjWe3g8Ja5LL27H\nlfTiiMu87QnsX8yHmx6m/R2rDi7fFzezDN3L4jrYvx9OnhSAev+SUhUvXbmgtpK1p3HEZrP5tsW8\n1fcW/cc3UR1bx/legUtUIM+sg1uC9LxxDy//+mUGogN87pbPAfDUkafY37Wfb7d8O+PzyVU9KWVm\nK4auGkgeb4U37yd+yctFBPzUx4Y7tnN6BWx8KEhzg/V/8GrXq6xbsi79+mBQDfre8nVvakiJoKOj\nvP8jEzHkYbRW72QOoBhvjKDPECazB7NmPHpzOONehFD7Ot1jieviOqs9gc5J7/47H48j0uXtpXxx\n7UKqXSADA+q+EKlJSRVhKq8Nci2tLLn4/7d378FR3dcBx79HDySMVpi3bCEVNzW2WRoEA4rdYRiY\nzoBNHSkNGTBJ7dhAMxOHukNoZto/8jL2tPUMSVvjtENlXNMmDi
RkQB4wJDPBxY6NJWIMQVjYBDDi\nIYJtgR5FT07/uHtXKyFpr7Tavaur85lhRivd3T37G+7Z3z2/x11J1bS/oL1N+NnKn7H9+Ha+e/w9\nwhee5vCvC8gNtdDWlEd4QjNX2kI8/9Dz7KjZ4Xy5qDPmsbB44YBjHLEJa03JGna9v4usDOeUlffW\nMjZXaI1cY7e3OVdGmZ30+H+wctbK6D4v+0/vp0u7PN/oOxHJ7mAkMrc7iJ0fl9XQzS0Gs6houOqC\nLq97j8TWQf/3jU7G5mRRcK2c6R+v4cKVFo5VhYZ80rv1dLdu39rqzJW/eBG6spoomuZMgezIaOTS\nZThe5dTIlz7czCeXQ9TWOgui3LsDuXVwL2Mcbps2tzez4cAGSgpKeHL+kzyx5wkOnjtIeGqYt97M\nov14OaHTa7l+zdmONyfHmbcf217u5y8tLAVgS/UWGm40cO3DMHfdWBGdXjqUsQ4vkvH/w5VoHTyZ\nsSWb1dCNZ15nACRrpoDXmxC7vaxln1lG9o1Csjsn0jDuMB9N+REXw04cQ5mdEU9mZyg6xz77Zj5Z\nnfmAUxf91d5QtA7ee1aKlzGO2DYdlz2Ojq4OnvvNcyz7n2UcPHeQJTOWsHf1XopayumY9g6tXc3k\n5jq35nO3THjtte4peXlj8igtLGVnzU42HdoEwLcXfZvxl74YHYtIpmTWjROtg/t1A4pks5KL6cHr\n1rmJbrGbKDdZj8sex3MN22i9fTdKF23jr3Db2fIec9Vdgz1x3Xn17k2r3RKS21PP6MyjYNqtPby+\nvoSa2uJf5vduU1VlwtgJ5GblEp4S5qXyl8jIyOB3L69l6cOr+KS4+7O4sfW+KgE49ckp7pl0D7lZ\nuaybt45dbXC56hGudIZ6PNdPg+0xD6acNpJ744NlJRdzC6/TH/2aJtnbgw8pb8/ojuOBc5Xsf+3W\nOLye2PHKPkMt5Xh5f7dNVZW6xjqK84txF916KSnEliLc1wCir5MuUwdjDaU9vT5nuPdRSQdWcjGe\neZ3+mC7LwVWVLz37IosWEf33pWdvLWU0tjbyzQPfjE4XHKhEFK/sM9RFXPEu81WVF6pfQFXp0i4a\nbjQQygmxZ9WeARejxK56BGe7ASD6GusXrKdydWp2ZvSi9ypNVR10e3otp/m14M4vltADbrBLnL2u\nakvl6rfBxvvGR29EP2djayPr963n5WMvs+DOBWyp3sLsf5/NjhM7hnxiJ2se8+Wmy2yp2sKksZPY\nu3ovC4sW8ttLv6W+ub7fhNV7LKPi3QoefuVhOm92kpWRRXhquEeMfvdM+xp72fjLjdEvIZeX9vRS\nB0/n+fDJYCWXAEtmaWAwxyVbbByNrY08tf8pFhYvZO3ctVS8W8FL773E+evnmTR2EvUt9WRnZCMI\nNU/WkJ+bP+j3cxPRrvd3kSmZiAhlM8tYFV41pNeLfd3Nb2/m9bOvIxmC3lQW37WYjQ9s7DcB9Z7t\n0dnVSXN7M4/NeYx189YlrcSQ6Bzw2NkpZTOdclnseoHhKg2l84rVobKSyyiVrNLAYI8bbr2vOtz3\nd2NaWLww+pkrP6jk8ZLHmZg7kbPXznKt9RrtXe1MvG0iO2p2DKlE1NzeHF0oVH5POWUzy9h+bDtP\n7X8qoVk+ze3N1H5cS11TnVP/bqqj9uPaAa96evdAszKz2Pvlvaybty5pq0ATmeHUV495VXgVVRer\nknK1ly5XkqliCT3Agni5GS+Z3PKZ1dmOoDC/kI6bHeRk5lCQV8A35n+DqotVQzqxQznOQqFHP/so\nlR9UUnmq0tNCoXjcqYr1zfVUX6qmvrmejq4OxmWP6/c5fY1l7Dy5M/pzMr5oE6lL9xfv5qWbh3V6\nqSsZU1fTmSX0AEuXgUuvvNT73WSy6+Quyl5xkklpYWmP58V+5tbOVl48+iKTx05mUfEipoem8+mN\nT2nramPz0s1DPrHzc/NZN2
+d80CcnnF/U+a8jmG0dLSQnZFNQV4BpXeWUpBXQHZGNi0dLf0+x48e\naCIdhf7idXemdF9/OBNuUOec98USeoCNpMtNr5fx7nL2mqs1nG88j3vP542/3EhTW1OPz/zjv/wx\nLR0tnGk4w4effsi0vGlMum0ShaFCjlw6EjcBDZSMB7tQyEtpIm9MHiV3lESnGBbnF1NyR8mwzPYY\nTol0FEZbjznVLKEH2Eg5eZramhiXPY7PTf8cu2t3s/wny/u9jFdVdtTsYMLYCdQ311N1qYot1Vso\nLSwlb0weoZwQ31/8fdaUrCGUE+Krc77K5LGTOXPtDAd+f4DG9kbWzl17y6ZYfcU0UDL28mWZNyaP\nOQVz2F27m7KflrG7djdzCub0m6Cb25s5Vn+M8nsjr3lvOcfqj8X9Ak51DzTRjsJo6jGnms1yMb6K\nnYmzpmQNc7fOpeFGA+GpYfZ9eV+fJQx3f5I9p/ZQ11hHw40GTnz9BPm5+bfM7Kl4t4JNhzbR2tmK\niLDgzgW8uvrVuL1zL7Mj4s30aGprYsOBDVRfqqYov4i6xjoW3LmAHy774YALZtJh5lA8IyXOILJZ\nLiZtRQfYavcwd+tc6pvrmTB2ApmS2edlfCgnxOalmwGnd1eUX0R4apidJ3dG9y9xB+w+/8rneb7q\neeeOQpEkXNdYR8W7FXHLA17qxPF6mkMZ5BwpvdeREudoYwnd+MpNnJ03O2m40UBBXgFHv3aUL9z7\nhX4v40UkOs3t1dWvsuK+FdFjYxNxl3bR0NrAjNtn8MySZ3hmyTOMzxnPm+ffjFseGI4B5aEMchqT\nCE+bc4nIg8C/AplAhar+Uz/HrQB+DixQVaunmLjcxJmVmUV4SpisjCy2vbeNNSVr+t2feqC932MT\ncVZGFrOnzmb5nyyP9q5XhVd56lEOx57Z7iDnlZYrngc5jUlE3B66iGQCLwAPAbOA1SIyq4/jQsDf\nAu8Md5AmuGIT576v7KP8XmeAraWjZcDE2d8lf+8BuxX3reD4leO0dLQgIuTn5ntKyMMxoDzUQU5j\nhiruoKiIPAB8T1WXRR7/A4Cq/mOv4/4F+BXwLeDv4vXQbVDUuIZ7gC2dBuzSKRYTDIkOihYCdTGP\nL0R+F/sG84AiVd0bJ5CvicgRETly9epVD29tRoPhHmBLpwG7dIrFBF/Cg6IikgH8ANgY71hV3aqq\n81V1/pQpUxJ9a2OMMTG8JPSLQFHM4+mR37lCwGzgdRE5B9wPVIpIn5cExhhjksNLQq8G7haRu0Rk\nDPAIEN3nUlWvq+pkVZ2hqjOAw0CZzXIxxpjUipvQVbUTWA8cAN4HdqpqjYg8LSJlAz/bGDOcBnvD\nEjO6eKqhq+o+VZ2pqp9R1Wcjv/uOqlb2cexi650bM/wS2YfcjA6eFhYZY/wXu62Bu8dM+T3ltlDJ\nRNnSf2NGiCDesMQML0voxowQI+2GJSb1LKEbM0KMpBuWGH/YfujGjCC2lYAZaOm/DYoaM4LEJm/b\nSsD0ZiUXY4wJCEvoxhgTEJbQjTEmICyhG2NMQFhCN8aYgLCEbowxAWEJ3RhjAsK3hUUichX4yJc3\nd0wGPvbx/dONtUc3a4tu1hY9pUN7/JGq9nnLN98Sut9E5Eh/q61GI2uPbtYW3awtekr39rCSizHG\nBIQldGOMCYjRnNC3+h1AmrH26GZt0c3aoqe0bo9RW0M3xpigGc09dGOMCRRL6MYYExCBT+gi8qCI\nnBKR0yLy9wMct0JEVETSdkpSory0hYisFJGTIlIjIj9JdYypFK89RKRYRA6KyFEROS4iy/2IMxVE\nZJuI/EFETvTzdxGRf4u01XERmZfqGFPFQ1t8JdIGvxORt0RkTqpj7JeqBvYfkAn8HvhjYAxwDJjV\nx3Eh4BBwGJjvd9x+tQVwN3AUmBB5PNXvuH1uj63A1yM/zwLO+R13EttjETAPONHP35cDrwEC
3A+8\n43fMPrbFn8WcIw+lU1sEvYdeCpxW1TOq2g78FCjv47hNwD8DrakMLsW8tMVfAy+oagOAqv4hxTGm\nkpf2UCA/8vN44FIK40spVT0EfDrAIeXAdnUcBm4XkTtSE11qxWsLVX3LPUdwOoHTUxKYB0FP6IVA\nXczjC5HfRUUuHYtUdW8qA/NB3LYAZgIzReQ3InJYRB5MWXSp56U9vgf8lYhcAPYBf5Oa0NKSl/Ya\njdbiXLmkhVF9T1ERyQB+ADzucyjpIgun7LIYp9dxSET+VFWv+RqVf1YD/6Wqm0XkAeC/RWS2qt70\nOzDjPxFZgpPQF/odiyvoPfSLQFHM4+mR37lCwGzgdRE5h1MbrAzowGi8tgCn11Wpqh2qehb4ACfB\nB5GX9lgL7ARQ1beBXJzNmUYjL+01aojIZ4EKoFxVP/E7HlfQE3o1cLeI3CUiY4BHgEr3j6p6XVUn\nq+oMVZ2BUw8rU9Uj/oSbVAO2RcRunN45IjIZpwRzJpVBppCX9jgP/DmAiNyHk9CvpjTK9FEJPBaZ\n7XI/cF1VL/sdlB9EpBj4BfCoqn7gdzyxAl1yUdVOEVkPHMCZ1bBNVWtE5GngiKr2PoEDy2NbHACW\nishJoAv4Vjr1PoaTx/bYCPyniGzAGSB9XCNTG4JGRF7B+TKfHBkz+C6QDaCq/4EzhrAcOA38H/CE\nP5Emn4e2+A4wCfiRiAB0aprswGhL/40xJiCCXnIxxphRwxK6McYEhCV0Y4wJCEvoxhgTEJbQjTEm\nICyhG2NMQFhCN8aYgPh/OzklM6wsqU0AAAAASUVORK5CYII=\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "RB-TXAkPLt4d",
        "colab_type": "code",
        "outputId": "a1fcbf2c-b875-4d84-e844-d1151a48c842",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 581
        }
      },
      "source": [
        "from mlxtend.plotting import plot_decision_regions\n",
        "from sklearn.tree import DecisionTreeClassifier\n",
        "from sklearn.ensemble import RandomForestClassifier\n",
        "\n",
        "# Compare the decision boundary of a single tree vs. the random forest\n",
        "tree = DecisionTreeClassifier(criterion='gini',\n",
        "                              class_weight='balanced',\n",
        "                              random_state=RANDOM_STATE)\n",
        "\n",
        "tree_dict = {'Tree':tree, 'Forest':forest}\n",
        "\n",
        "for classifier_name in tree_dict:\n",
        "\n",
        "  tree_dict[classifier_name].fit(two_features_data.values, reduced_features_reset['class'].values)\n",
        "\n",
        "  plot_decision_regions(two_features_data.values,\n",
        "                        reduced_features_reset['class'].values,\n",
        "                        clf=tree_dict[classifier_name])\n",
        "\n",
        "  plt.xlabel(x_axis_label)\n",
        "  plt.ylabel(y_axis_label)\n",
        "  \n",
        "  print(color.BOLD+color.UNDERLINE+classifier_name+color.END)\n",
        "  plt.show()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "\u001b[1m\u001b[4mTree\u001b[0m\n"
          ],
          "name": "stdout"
        },
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYsAAAEHCAYAAABfkmooAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjMsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+AADFEAAAgAElEQVR4nOzdd3wVZdbA8d+ZmdvSGyT03hS7i3Xt\nHQQU7P1d131dO6y77rq2LbrvunZdXbtYKDZEsGDDggWQBZGOIJKQQnpuvzPzvH/MTQQEE4QYxOf7\n+eRD7tyZuSc3ISdPO48opdA0TdO072N0dACapmnazk8nC03TNK1VOllomqZprdLJQtM0TWuVThaa\npmlaq6yODqA9rKpqUtXhZEeHoe0kls2ZxZn75BEMBH7Q9U99uJpBhwzfwVFtv9LZLzL2l7t1dBja\nrqT3obK1p3bJZLGuNsaqqnBHh6HtJD6bPZsxhV0IZgZ/0PUfvL+AZK9Dd3BU22/JB7MY2y/R0WFo\nu5LeW/85191QmqZpWqs6NFmISA8ReU9ElojIYhG5agvnHCEiDSKyIP1xY0fEqmma9nPW0d1QNjBe\nKTVfRLKBz0XkLaXUks3O+1ApNaID4tM0TdPo4GShlCoHytOfN4nIUqAbsHmy0DRN+0lwESJmAY4V\nBLY6XtyBFKYdJ9OpxaDt5Z46umXRQkR6A/sAn23h6YNEZCGwHvidUmrxFq6/BLgEYNwttzP4iFPb\nL1hN07StiJgF+LLyyBIH2QlzhVKQUEEiYch2atp83U6RLEQkC3gRuFop1bjZ0/OBXkqpsIicBEwF\nBmx+D6XUw8DDAO8tq1J6NpSmaR3BsYI7baIAEIEADnErCE7br+vw2VAi4sNLFM8qpV7a/HmlVKNS\nKpz+/DXAJyJFP3KYmqZpbSQ7baJo5sW3bUF29GwoAR4Dliql7tzKOSXp8xCRYXgxt73tpGmapm23\nju6GOgQ4D1gkIgvSx/4E9ARQSj0EjAUuFREbiAFnKr0Jh6Zp2la98eHnXHXboziOw8Vjj+O6X4/d\n7nt29Gyoj2ilLaSUuh+4/8eJSNM07afNcRwu+9t/eOvRv9C9uJBfnDGekUcOY7f+Pbfrvh3dstA0\nTfvZGnbu9VQ3xL5zvCg3xJxn/v6D7jln0Ur69+xC3x4lAJx54i955d3PdLLQNE37qapuiLH7b+76\nzvHF/7nmB9+zrLKGHiXfzgHqXlLEZ18s/8H3a9bhs6E0TdO0nZ9OFpqmabuQbsWFrKuobnlcWlFN\nt86F231fnSw0TdN2Ib8YOoCVa9ezprSCZDLFpNc/ZOSRB2z3ffWYhaZp2i7Eskzuv/43HP/rm3Fc\nl/855Rh2H7B9g9ugk4WmaVqHKcoNbXEwuyg3tF33Penw/Tnp8P236x6b08lC0zStg/zQ6bEdQY9Z\naJqmaa3SyULTNE1rlU4WmqZpWqt0stA0TdNapZOFpmma1iqdLDRN03Yx/3P9PXQ+9DyGjrx8h91T\nJwtN07RdzIWnHM0bD9+8Q++pk4WmaVoHq65rZMzlf6GmvnGH3O+w/YdSkJu1Q+7VTCcLTdO0Djbh\npTepK1vFUy++2dGhbJVOFpqmaR2ouq6R6W+9x4OnFjP9rfd2WOtiR9PJQtM0rQNNeOlNRvQTBhUH\nGdFPdtrWhU4WmqZpHaS5VXH+fjkAnL9fzk7butDJQtM0rYM0tyqKsryarkVZ1g5pXZz1u9s56Kzf\ns/zrMrofeRGPvThzu2PVVWc1TdM6yKw5C1lfnuC5ReWbHO9avZBxvzrtB9934r+u3d7QvkMnC03T\ntA4y7T9/6+gQ2kx3Q2mapmmt6tBkISI9ROQ9EVkiIotF5KotnCMicq+IrBKRL0Rk346IVdM0rW0U\nSnV0DN/Pi2/bguzoloUNjFdK
7QYcCFwmIrttds6JwID0xyXAgz9uiJqmaW1n2nESytxpE4ZSkFAm\nph3fpus6dMxCKVUOlKc/bxKRpUA3YMlGp40CJiilFPCpiOSJSJf0tZqmaTuVTKeWSBjiVhCQjg5n\nCxSm3USmU7tNV+00A9wi0hvYB/hss6e6Aes2elyaPqaThaZpOx0DRbZTA05HR7JjdXQ3FAAikgW8\nCFytlPpBq1FE5BIRmSci86ZPmbBjA9Q0TfuZ6/CWhYj48BLFs0qpl7ZwShnQY6PH3dPHNqGUehh4\nGOC9ZVVqVVW4HaLVNE37eero2VACPAYsVUrduZXTpgHnp2dFHQg06PEKTdO0H1dHtywOAc4DFonI\ngvSxPwE9AZRSDwGvAScBq4AocFEHxKlpmvaz1tGzoT6ilekC6VlQl/04EWmapmlbslMMcGuapmk7\nN50sNE3TtFbpZKFpmqa1SicLTdM0rVU6WWiapmmt0slC0zRNa5VOFpqmaVqrdLLQNE3TWqWThaZp\nmtYqnSw0TdO0VulkoWmaprVKJwtN0zStVTpZaJqmaa3SyULTNE1rlU4WmqZpWqt0stA0TdNapZOF\npmma1qpt2ilPRIqBX6QfzlFKVe34kDRN07SdTZtbFiJyOjAHOA04HfhMRMa2V2CapmnazmNbWhbX\nA79obk2ISCfgbeCF9ghM0zRN23lsS7IwNut2qkGPeWg7idsuP4twuOk7x7Oysjnq8EM6ICJN27Vs\nS7J4Q0TeBCamH58BvLbjQ9K0bRcON9H34vu+c3z1o1d0QDSatutpc7JQSl0rImOA5j/THlZKvdw+\nYWmapmk7k22aDaWUehF4sZ1i0TRN03ZSrSYLEflIKXWoiDQBauOnAKWUymm36DRN07SdQqsD1Eqp\nQ9P/Ziulcjb6yN7eRCEij4tIlYh8uZXnjxCRBhFZkP64cXteT9M0Tfth2twNJSJPK6XOa+3YNnoS\nuB+Y8D3nfKiUGrEdr6H9DGRlZW9xMDsrK7sDotG0Xc+2jFnsvvEDEbGA/bbnxZVSH4hI7+25h6YB\n/PH+iVt97rPn7/8RI9G0XVOr3VAi8sf0eMWeItKY/mgCKoFX2j1COEhEForI6yKy+9ZOEpFLRGSe\niMybPuX7Giqapmnatmq1ZaGUug24TURuU0r98UeIaWPzgV5KqbCInARMBQZs6USl1MPAwwDvLatS\nq6rCP16UmqZpu7htWWfxRxHJx/tlHdzo+AftEVj63o0bff6aiPxbRIqUUtXt9Zqapmnad23LAPfF\nwFVAd2ABcCDwCXBU+4QGIlICVCqllIgMw+s2q2mv19M0TdO2bFsGuK/CK0/+qVLqSBEZDNy6PS8u\nIhOBI4AiESkFbgJ8AEqph4CxwKUiYgMx4EyllNrK7TRN07R2si3JIq6UiosIIhJQSi0TkUHb8+JK\nqbNaef5+vKm1mqZpWgfalmRRKiJ5eIPMb4lIHbC2fcLSNE3TdibbMsB9SvrTm0XkPSAXeKNdotI0\nTdN2Km1KFiJiAouVUoMBlFLvt2tUmqZp2k6lTZsXKaUcYLmI9GzneDStw1TXhxlz3UPUNEQ6OhRN\n2+lsy053+cBiEXlHRKY1f7RXYJr2Y5sw42PqKtbx1PTZHR2Kpu10tmWA+4Z2i0LTOlh1fZjp78/l\nwVOLuHT6XC4YcQiFuZkdHZam7TS2ZYD7e8cpROQTpdRB2x+Spv0wW9uHW8WbuHPktxVph136ANVN\niU3OiTU1cObAJIM65zOif5ynps9m3DnHedfrlT2atm075bUi2PopmtZ+Nt6Hu2LdahzHAaD82evo\ndto/iEajuK5CDMHbu+tbWWaKupjFbyevJZpyefnN17jx2Y8wxMAMZjJr7pffW9lW03Z1OzJZ6L+/\ntA5x2+VnUbpmJY6rqL/74pbjbqwJ5dqAkAzkga3wZeRRct4dbJwsoh89wRnyDv+ztwUIYppk+u
M8\ntdzEDhWgghlbbLFo2s/JjkwWmtYhwuEmXDHpfPoNiOlrOV7z2j04TRswQtkgBiIGbqyRqudvovNp\nt7Sc51YsZ0pNgilfxjEz8kAECBEq6k7uabey/smrIaj/q2g/bzvyf4C0foqmbb/mloQSbzKfm+5u\nEgRJH1PKxU1GySrqQuaJ12KGsrEbKgChevodmySV4LFXUz3tnziReoqH34zXSBYM0T/SmtZsRyaL\n7dleVdO2avOB67rqKiSUixnMwo1HkFgjynWonnEnAGIFEMuHFa2ml6+clc9egx3IbXnOjdRTOelP\nuMkYuF6icSL1ANTMfIDC4y7z/vRJj2wnw3XU1CX47Yhhm8RliEG3Xn30WIb2s9BqshCRHsDtQDfg\ndeB2pVQq/dxUpdRoAKXUl+0ZqPbz1TxwvfjR8TjxqNeSiNTjxhrpfMbfwLHBtPAV9gClKH/yStxk\nlC6Fufz15ByumBHFHn4zZiibiqfHo1wHu269d20z1wGlqH79HqyCrqQ2fEO81PuRVk4KDBPXsRHD\nbLnEUS7frF7JbZefpROGtstrS8viceBF4FPgV8D7InKyUqoG6NWewWnaxpx4lK4X3k20YrX3i33G\nnenxBcCxSdWsA9fBidTjc+OM3rOAXnkmo/q7PDf/ZTIOvaDlXhLIJPbBo2QccyU10//V0spwY41U\nPPsH3HgEN1qH+mY+iFB85q3UTL+TkvO/fU3v9WzWvXCLThjaLq8tyaJTem8JgCtE5FzgAxEZiZ4B\npf2Iko0bWPfEVSjHBsCNNYJjYxV0AxGqptzo/dIXKPSnGNUril0fZ1RvePGlNyhfvRAV8zZfNCMb\n6OkrZ+Wk3+FaWV4SUAq7tgxESJSvpO7N++jR9AUrHR/V0/4PN9ZE+YRrMPwhb4DcsVEoFEI43LTV\ndR5ZWdk6kWg/eW1JFj4RCSql4gBKqWdEpAJ4E9BLXLUfj5h0ueBu3JS3oK7y2T9AultIDAs3GaPL\n+XcR+eAxxmTMpVOPTqAgr2YdowcZvBA8GLPPgbipBP737+CvJ+dy6cQ1bCD725V3poWvoDvxtV/g\nNxQ3Du/J+JlxkoddihHIwCroRsXT43EaN4BpIa6DcmxqKsupqSyn+Iy/YFo+TNOkpEdfAFY/esUW\nvxxN+ylpS22oR4EDNj6glHobOA3Q4xRah1EbNWyVk2r53K1YxpQv4xz/SDnHP1rOiU838fziFIkF\nr+LGI6jS/zJ6oEHvwiCjB1lYyYbv3rtqBf0KDPqXZDJ6iB9VscRLDpbfe165m5zf7dLHsXKKsPK7\nESjq2bIgUNN2Fa22LJRSd23l+H+BY3d4RJq2maysbFY/egVKOSg72XLc8IeomvxnjFAO4HVLpWpL\nCR17jXee6f14107+M5YdZXABrFv+Hlk1Szj1lEyUk+SUQQZTl3rjFLhe95YCipPr+M8pQVRdGWP2\n7MYrL7xLouvQrcZY8cy1uJE67IZKHATl2lSsW01Jj77Ub6jg+gtHANBQU42bTjSiXPI6lbR8jbqr\nStuZtXU/iyOBK4DmbVSXAvcrpWa1U1ya1qL5l+jlJx+I21iJY6e87qVDz6Hxlb+Qc8zvCfTYnfIn\nrsDKLUYsP6madd7sKLzB7EKjkb8M78XlL3zCcUOyKcjOABT5mT5GD3F4YmkNRRc/ipg+Ih88xmi1\nniElAUrrU+SFDEYNFCaumk0qmIUTqady4p8Qw0Slu6GcpmpAqHz2D5jZhenIFeVWEMdVWEf8FoBC\nFGJ4azwqp9zQUp5Ed1VpO7u2TJ0djrcP9l+AW/BmoO8LPC4ilyulXmvfEDXNk1tYRLfeA6hYt5pE\nPEZq2XsMyE6y/OWbUbndcSL12HXrvZOV8mYrAUa4iuwsl7yQxcj+LpO+aGD6Cq8V4YSbQAyMVJN3\nrWFiL3mLqaR4fVUjylVEja8RfwYqcw2+wh6IYWCEcuhy4b
2UT7iGLuffRaq2FF9Bd8qfvJKS8+9E\nTB/KTlLx9HgAfAXdEMuPm0pg+AItMWraT0VbWhbXAqOVUgs3OrZAROYB9wE7XbLYULGe9WsrOjoM\nbQfzWyYr/n0J9TVVKKXoZEa57oQA175jo079MxUTrqFy8reV9MUwUMol30jQKdPklU9XM3qAxcuL\nY1Q0+bz+JjeImV2Eq5rISc+GyjWTvHxmDle8Hmf8ARZXzExSlcpHwhEan7oa5To4kToqJozD8Ie8\n8RLXaRk3sWtKW2JwInUApOorsXI7eweVC7ItW8loWsdrS7Io2SxRAKCU+kJEitshpu3WbcNsnNIv\nOjoMbQcLJGpJpmxwXbJ778G+zjzeWeNwyiCDx569BjMjN/3LWbByi1F2AonUEEnBv47PYtxbCUYd\n2JMxwxqZ7BxJ/ZKPUSiKTrqamtfuAdPCWTSds/ewGNzZx4iBNm+tTpFt2VQ3VWGbIcRIlxNxHApH\njAPAbqj0WgkiYJj4OvX2Pldqo0V837YiXDuF6FIi2k9MW5LF9+0xuVPuP7msMsqXX+sqobuCWDzO\nOx98ytGHHUR1Y4yi/U5gwwcvQel8ivsLg4sMDuhm8coaB2est0q76vmbSNWUgkDQjlKYIdh2in07\nu4z+9xJM0yDuTCQlQQJunA3TbkclwmyYehslqTKOOzjIiso4B3U3uHt2ii45BgFxUNmFFI0YD45N\n5ZQbqZ5+x7eBug5GMAuAysnXoxIxAJTrDWZXT/snRiib4jP+7h13UijXYd6tY/HndCIVrm0ZBNeD\n3drOqC3Jot9Wtk8VoO8OjmeH6HvMBbh7junoMLQd4N2JD9Ip8AENWX0J5C+l01G/omz2y5w/VLhs\nfz95QQERRvVJ8NynEwnsPYKC435L5cQ/YWbkk234KQlGMIp6M/ZQeGV9DaUbwhg+P347ysA8l7Le\n+2D1OYDoW3dzwl6dKeiaR6puPbbjUpQp3HBYgCvfSBBuqgHHpubNBxDDwI3WbzLu4DRVI5Yfp6ma\nbv/7OAB2bVl62Z5QOeVGKiZ4rRGvdLrCyMincMQ4vS5D2+m1JVmM+p7n/rWjAtmRls39gC+XLuvo\nMLTtFItG+HTak5y7V5Bnpj1JPG5QM/91xE7w9ELF1KUpoilFl2yDupjC8c+FvYa3FAe0Uk0c2bOJ\nT79JoerLyLKEEV2iPL5BkVI+Cv0pbjwyk3Eff07YdQnGN/DiwhCvLkvixqI4ho9zhvjo2ynEmGGF\nPDrXa0grO9FSV8pX0L0l3vInr6Ro+Dgqp9zodU0BmD78Bd1azlHut6vPrdwS7IYKql75J06sidJ0\n4lHK4bLh+5PXqUS3MrSdRlvWWXzvdqrbS0QeB0YAVUqp70xkF69z9x7gJCAKXKiUmv999zxtgOKk\nTL0o6qfukalzOGd34eJ9gyg7ycPz4vTs259yQ3HOHj6uPCDAI/MTfFbmcnRfP0+vlJa1FQDB3ALe\nK3dw7SZOmRRB+TOIh5NYmGTtdgijjXfonW9xXG/F5Dmv07fA5JuED475PXVvP0TXDIdTDwxiZoc4\ndfcYL84pp+rV21GJSEtCUo6NmFZLgX5fp94A3hRe0+d1NzWf412BE21ERFB2AhAKT77WGzNx7Zax\nDidSTzhuU79h5VbLiBjRGu66SO9krP042jJ19sY23muWUuqDHxDDk3hTcyds5fkTgQHpjwOAB9ls\nRfnmigpyKXILfkAo2s6iuj7Mx/MXM+X0XIqyLK4+NJdn5n2D69jkBw1O281H3FaMGODj0fkxVkSC\nSFYBbipB09yXcZMx6srWYiibkgyHogyTrxoi9M0zWNFgkrF+LqceF0S5Nm4yiiibkweFmLw4Svmy\nd7HidZzc3yY/2A1lJ8lO1TB6sMkTy8IkEapfu9srWhhtxMzIxRepxBE/5U9eiVIuNf85n+yRN2AE\nM70SIvle66Jw+Diqpt
zk7beRrlOF66DsOF0uvK8lWaRqS8no3Iu195/fUnV3461iAdY/dQ19z72T\nouwAcx68rCO+TdrPSFu6oda28V71PyQApdQHItL7e04ZBUxQSingUxHJE5EuSqnyH/J62k/DhBkf\nM6K/QVGW9yNalGVxyhAfTz9yLecMDVCUHSBpO2SHYGCRyeImwU64uNPvwAhXIb4AYljkug1kBSyu\nPSTA9e8lufGIAFe/keDY4kZy/T7qYy4froxQGBJ2K4Kzdje4Z847+GzFy18qJi1cRcpR+CwDKzOP\nUFFX8o68FF9Bd5STYv3DvyZ/6C/punoa60oOJ/Pwi4nMephu37xBRflirH1HYzdUtkyrrZpyU/or\nlJbB74rn/ojhC2w2MO6VQ3cdh5rydfQFHMchUNSz5T0y/CF2//UdLH5k/I/2fdF+vtrSDfXUjxHI\n9+gGrNvocWn62CbJQkQuAS4B+M9fruKSo/v9aAFqO96s+StYX5XguUVVGx0NkBWIM7syyOzXoaw6\niu0oIkkXw2zCdQXLjhIykoQTSZSYBDPg5EF+VtQ4nL27QZ884ZTdAkxcCq+uiJG0HbplwxvnhCjO\nNBnW02Li4jBVST8PntWdX0+poJsvwspoNnkn/5Wswh7eYr/01FelXIJfvcstw4u5/NXPSNaeSEbZ\nHG443M9Vb06mfNEs3JjXheR1R5mb7ImhXBcxLdxUAru2bKM9NgQrtzN2bRmVU25g3q1jN12bIQZu\nIsbsR24ir52/F5oGbeuGEryigQp4ATgK76/9ZcBDavOKah1EKfUw8DBA0/ypqrpiaQdHpG2Px2+4\nsNVz9rv0QRriBqF9DyU48FBy/SGM6deTY9dQ0xinIeFgIJQ3JiltcHlqdIiGpGLMAd2ZVhamws4m\nULuKo/oYNCaExoSLGC5H9TZ4eVmcHL9LkRXlxsP9XPl6I04i4tWcEmnZO8PnxBnVL4NeucLoAcIz\nb9/DaYNM+hQEGLN/Ic817kXjmkXelq1n/Z2a1++lywV3t3wNqQ1fg2lRNen6Tb841/Yq2zYTg+Iz\nb8VX6A2oJ6u/oWn+dJIVyyG4A95wTWtFW7qhHgA6A368JBEApgHD8WpFXdVu0XnKgB4bPe6ePrZV\nTy0I88WXje0alNbx1lfV4LiKnO5DUW4KZ8mHHNIpwUfLEzwxMsRvZ8Q5d+8QC8pTHNvPoku2gUQU\nyogyaqDBZPsXOKV+XvimgkmLawAwM3KQuMOATkFmLGni7D389M8XRg2yeLFsEdJ1cMsMKLupmkJ/\nilP39Cr1H90/yCsLVnHq7j1RyQhj9sxi6uQ51IZjiBiIGDjhWhqmXEfwkAu8dRlKUTPjbq+FYRjU\nTPe2hlUoxLAoPP4ymrusKp77A76iXpScdWuHvN/az1tbksUvlVJ7iIgPqAC6KKWSIjIR+N5ZSTvI\nNOByEZmEN7Dd0Np4xe4HH0eg/8E/QmhaR5o69RXCcZus3nsSXreEwKp3KLfC7F1s0D3HYNRgi4fm\nRInZMLcM7p/jjRsoI4GEckiF5hA85ipAUTvxT4CQ3/cwzs6ez5g9c/jTS18xcWwGyk4yZojF9Hdn\nkug6tGXxXXLR65w+yKIgOwTAR2vqOXsPizxfEjupyAsoRg80efiTJLaZQe3b/8HvxumjyihdM5dE\n9Royjr4c5doUn/FXb/V3uvhh8/awVkF3zOxCSs75Pyqe/h12bSnrn7zKa+E0VmwyLVfT2lNbkoUN\noJRKichcpVQy/dgWke3ugkonnSOAIhEpBW4CfOnXeAiv9tRJwCq8qbMXbe9raj99t11+FnXVVUgw\nm1QygVr3Xw7v7vLBsgSX7uenIa5YWeMS8huk9jwFY9AhBEr6U3rfOZgZhV6najRBeNo/Aa98h5gW\ndtmXPFVbwTOfbeDc3aAu4qBcRX4QTu7WyBPT/4ET9Gba+RvW8rzh8MLSpYAiZTsETHj8
8zJcQGSl\nd57lR2V3IlVbSokvyZ8P8vO/r75Ez2xh1eQ/gC9rq1+nWD4EoXlurplZQMkFdxL/bBLMm0hTYyVF\nxT23er2m7ShtSRYVIpKllAorpU5oPigiJUDye65rE6XUWa08rwA9L1DbRDjchC+rANuxaXr1VlQi\nwozqBs7azcIw4J01NnEHdisS3l/wCpHF72JmFSCWj06jrkMpt2VGk9O4gYrnrvO2U61Zz8AclzX1\nDhO/FCZ+mQKlMDO9YeRgdj62mAjQJCGiGHS/dALKtQkqhZg+UtVrqXvtbkrO8RKR1VBJfNo/yR96\nOGMy5lBYnEXnrCi3HJfPZVNrqEgkQLmI+Foq5QI4kXoqnroGCYRajinXJrFuMYFV73Bgb4vZ65JU\n1kfpe+6dLefoqbRae2jLbKgTt/JUE95iOk3rMH47Ro/GhayIZJDh2sxcpaiKKAozhHtPCHLFG3EM\nBCMz/9uFdMpFOSmSlSuJfDkrvdYiRsagQ7C++YzTh/p4eFUnyqMWXS68h9IHzqd4+M0ggrF4JgWr\nZpIXFJYpRUpCpGrTVWZdp2Wb100plOuQUfopp47JZsZibyykb3aS0UN8PLHUazmYOZ02uUoMg5IL\n7gQFlZP+1LIuo3HGP7loSJxkyhtLeXzRBmwrA5RLMKeQsuoNev2FtsO1dfOjXOAEvCmr4A0wv6mU\nqtr6VR1n+cwnWbhgQUeHobWjeF0FqZRLyHA4+7ASHphdz/GDijDjDcz+Jsm5e/g4so/Jr/b28dgi\nITxwP3yDjqJ62v9h5RZj15ZhZHdCoTCC2d6K6rp1DCoUBGFUP4dH54Vb1kdUz7gTEAoSpXTKcvnz\nYQEunZGgwt5CT6zyakGJ5U/v7Cf4UmFG7+W1ED5eVc/EsRmgFKcONpm6PIITj2A3NJfVl3Ria+5+\nUrjJGJ3P+BtutAH/h/dw2oFFvLmkkbKqegI+k8CQI4gsfo9Av2GY0UbIz2XZ529w2RNzf9D7mykJ\n/nnhoT/oWm3X1Japs+fjjSPM5NtZSEcCt4rILUqpra287jAxK4+YL7+jw9DakcLEVCmG5CbYK6eJ\nkb0TPDcvnC4VDitqFH94O0E0BY0xG1W2mNCwM1FA0/TbCOw7BqugG/lHeENg4UUzKZJGbh/Ti6zo\nesKhPF6cX40Ta8LMLqTgmN8Qf/sehpSEOK6Xwz7d/Jwy2OaJlf5vZzC5DuCV6kCE8ieuaIk2kAwz\naU4Tz3zicu6efuqiXisnPwijB/l4bOotuNklLV+fE6lHOTZl/74gfW8XnBTO6k8ZPdhHfobFmfvl\nk6qLkR1M8cTS91BK0bDwLYxgFhvqcgnlF7P3eTf/oPd39oTbftB12q6rLS2L64H9lFKbrNAWkXzg\nM7ZepqPD7H3UaLKGHtPRYS+II/4AACAASURBVGjtaPobb9HZTHDX2FzyMy1Osb7m1RX1HDskn37Z\nSS7bz2RNvUPKNQgFHJ5YXkH46fGYTRX0NNaxfOqXuLndUK6DL1xB0EkwakBn8vwudhTyMyxGD/bx\nxLPX4LpCcvHb5MbLifkMztkzm/ygcNpuFq+WGbhjbsFM7wOu7CRlD/0PIgZFI38PrkPNG/eTyO5G\nwjAJNHzDpMUOkxZHQSRdWFCwEOKxxpZV3cpJ7we+8WMRnIplTFkTZcpCr6ihE40gIhh2CoxMSJcR\niUbqiYq3FW23Xn10MUJtu7UlWXjt4O9yaSmfpmk/Lp8T46RudWREIySiUBBwOXuoj1dWRphrChO/\niBNOKMIpMLOLyCgswdhrJKmZ/+KaQzO5/q0IVa6DlWhgQE6S0kbF5LkbmDS3Ov0KiwEwnASuZGCu\nfIc9uhkc0sOkKMOgOuryz9kJjutuMfXTifh3P64lNuU4iGFSM/1Obzwk1phemS3gpFqKHVZN/jOI\n0PmMvxJK7x2u7CRi+Sl/6mpwHbpcdF86Af0KFASP
ugyMb//bVk++gc5n/JXYxD9hZhfS5fy70jGk\npwk3VRF++972/4Zou7y2JIu/A/NFZCbflt3oCRwL/LW9Atse69Z8xVdrSls/UfvJys3NY9rXcaZ9\n7T2ONkVx7BS2WAQys7GjjYCQsCHn6N9hIUTfuZ/uwRRvLrc5qF8OM8OdCVZVct4eFg/Ms4mlFDUZ\nPel64b04sUYiM+8ma7+xBFd/QsGq6cxb7/L5eod7Pqsj5Si6ZMFH3zRhdV2Nr1NvkjVlhJ+/FiXQ\n+Yy/4Svs4e3D/ezv8aXLhNS8+UC62uxmGyMFs+h82i3e8fQ2rW6scaOuLKh952E6n3YLYvk3eicU\nVm56w0rXbSlD0syxbeqqq7j+whG63Lm2XdpUGyq9+dHxfDvAPQv4o1Kqrh1j+8H2Si2kc2JRR4eh\ntaMjLj7sO8euum8aDVHvL+p6lYGEcrGbqklOvwPlpCh067jscD//npfitL1c3lqxkIO7u/QrtAj5\nbHL8QmPCm3GUWjyT3olVfLN8FplVC7j16CAPzkvy22Eh/nd6lJRr0yXHxGcKZd29yvrhKdcyICvK\n8joDHDs9uP2tmtfvwWna8G39p3RXE6aP6mn/B4DdUAWolplPkm5FWLnF2HXrqXh6fMsxSNeWakke\n3+0AEMPCl1VA34vv05sqadulTbOh0klh0vedIyKfKKV2iuL6ew3uy15FqY4OQ/uRLT9gcMvnfc+9\nk91/fQezH7mJZDyG0bieX+3r46J9/ERT8HVjnE4Bm5uPCLJHscn0FTafrnPIVU0kNqxNFwfszOUv\nzOb4Idl8XOqQdOGT6gwKM5M0RR3mljq8dEYm57/yEqWLPyLPCbNbZz8V4SThmXeTedRvkUBGugR5\nEpVKYGTmp+s7SUsyad4oyW6sAtfGyu+KEcrxtnA1TEQMNrzyD+/Y8HEte2YgUPHUuO95R3Qvsbbj\nGK2f0ma6nJm20yjKDrD4kfHkEabQlyTPTHB8X4uval0O6m7y/KIEJ/U36ZdvAEKXLOH4/hZ9cxWp\njx5j1EChd2GQkf1dXlpQx/NfJhh/UIDn5mwgGk9wfH8Lvwl9C32cOsQiWxIc1cdiQIHBbkUGfZyv\nSS6e6XURGSZi+gC81dgKnGgDTa/+HSe20aZGzS2NZulE8Z1anSLYdetJ1ZTiRGqpeOqalqeUnWyZ\nruut77BJNtW2y3us/by0qWXRRlsaBNe0DrHxYrQ7n52Jsy5Jl9w4fQtMkimbkN9g+ooUs772fkFX\nhl0y/UJ1VJGbsYTRh2WiUiHG/qIL077awLHdFb1792BA0Rr2K1Y4rsvunU0mLYpzymCLV5bWctvR\nGfhN4fWVSS7a28cd8z7CTV2AEcigYsI4nEg9ZmYeykmRWvQavZNfsX7JW5h9D8SJ1FM9/Q7cWCNm\nZgFGKBtfQXdvwZ9je9NxgaoXbsa1k1jZRS0LAJv3vrAbqyl76FeIYaSPe+XPm7dy1bTtsSOThabt\nlGbNX8GyNU088nGSnKCQTDkkbBdDhHDK66qJ2aAMA0yTkQOgOJgiXreeLISRveLEUrChbB3xRJKR\nA4NMX+HSN0+499MEJ/a3OXsPHwHTKw+yXxeTskbFqL4pHn/qctxQgbdIz7TAMEnWlpH8/CXGn9ad\n6995n0T3vVpiVa5Lyfl3bvoFmBZWdhHF53jjGmUP/Yqi4ePAtBAxsPK7AlA+4RqcJq96rrKT+Dr3\npeSsW1n/+OU/wrus7ep2ZLLQHaTaTmnaHZczcvz9rK/ypsX6A9A9C8qqG/EV9SGxYR0H9cngy/oA\nVqCAF5eu5K3VLrUxG1cJIVOREzTIy7AZNdhHn3wD24VPSl165AjPL7ERgYc/T5LpEyaMDpEfEmpj\nimlrHZxTb8DKLqT8iSvxRSppmnwtu+fb/Hd5GSO6CI+/dBOIn6KRv6dy4p++LR8CXteUaaE2b7iL\ntJRKb54BJUi6
RMi91DxyIXakjvWPX44TqWferaeBUlw2fH/yOn27+E/PkNLaapuShYgUA79IP5yz\nWbmP83ZYVJq2g02747t/XQ+79AFWr1lLv5wUJQGHchXhq8o6Tt/DR1kTJGyLT9fZHNnfxwfrFI8t\nsAkains+TYJS+Cy4eJ8A//goQZccg5hrMaI/HNTTwhJIOIrRfRJMXDgd69AL8Dkx+mfFqWiyeXB4\nJle8Hud3BweZujTBhlA3JL0TXtXkP7fE2Dy9dpNd8sBbqb4VqSUzGZivqNjtCIL7n+ZtsCSCr6Ab\n5RPGYR3xW0zL+6+/btKf9bRarU3anCxE5HTgdrxpswLcJyLXKqVeAFBKfdkuEWpaO3n6urEMv+L/\neGx0Fr+ZFubgXj5qljucdnAfXvmigVOHGFz2QiWG4W3N+sTSAEkRzu4XZlGlQ588YfpKm4Al3Hx4\ngItfjfHyUuHtr74dIwgnFYnYS9QufJsSM8xNx/i4aZZLp0zh5CFBPqnNZsywQianhqGUtwFSt0uf\nbLk+teFrxPKz4dXbvSmyzUlCttyQV8oluPo9bjgiyPhP3sPd48T0uYpUfQVurJGa1+7+dltY19HT\narU22ZaWxfXAL5pbEyLSCXgbb6tVTfvJue6BFzh7qEVJpkEkqfhgTZKTB/rJlShKuRQYcUYONHl0\nfpL/jAoydWkTDSmLl5amcFzFiP5+vqi0GVRo8PySFGcN9TFxuUWMDCL1tVii8FkGVnF/8osHMiZz\nLv0LIpy5u497P0syYlCA86dWk/DnYWctwuzvFe5rXn3dzGre4Gjj1oRSm3ZXAU6kFp8TY1TfEL3z\nTEb1c3juk+fw7X4sgtdtZWTmU3L+XS2D4KUPXNB+b7C2S9mWZGFs1u1Uw46deqtpP5rq+jBLVnzN\n70b6+NfsKNkBYWW5y4tLUry8rJyoLcz4wsV1XSwTPlqbZNQgkyeXh4gkXH69p+LIPhb3zk3xi64G\n4RSMGeLj1Yo8ylNZBKSOY/v7mb3Ogf1OJzjnEU4ZIeRl+jhvXx9nTm5k5EE9Gb1/I5PtI8jcd3TL\neouWsh/p6bapqjU4TTWUPXgh4M1+qnrhlnThQlp+8btOisIgjNm3AMIRTt0zi1emLsTefywq3uTN\ninKdTZKMch0WPzqeENumuj7Mb/7xDA//8TwKczO375uh/SRsS7J4Q0TeBJo7Ns/A28VO035yJsz4\nmN8cXMCgnpnc8N46Hh2VzdFPNsDQ4XQ98TKW33MekXgtIQsO62nx/OIkCV8uoaLupMq+5OVlMPnL\nFMVZBnsUG8wpcykICSN7x3lifh1798xhv4E5dMqqZtKiVxk9xE+eFaEu6o1B7NfVZPSDS7FMg2hq\nIk2fTCJuhBDlfrsBUnNXk2mlt1b9J3bdem92VE5nKp4ej9NUjcJCDIMANqMGQGbYSwb5GRajBwqT\nlszE7HuQVzrdsPAV9kCkeee9PJx4FILbNtdlwoyPqatYx1PTZzPunONav0D7yWvzT4hS6loRORVo\nLnL/sFLq5fYJS9Pa16z5K1hfleD+j+sZ1R8c5bJ7J4OP579OonwlTqyJwZ3gkB4+GuKKUYN9TDGO\noGnNIgotRXGmwbJqhe0q/jk7SUFIOOGZKK6RJOA6/O3kvuRnWGzIbWTyi8uY0pTDlIUKFY8CoJSC\nTv2xjroc96WbGJjnUFp8GI1ff9lS68muLcOJN5H49DmUncRuqMRNxknMmUg0GsONNSKWHyOUgxgW\nVsNanl/i8MJSJ13RdgUiBjH3JaJzX8PMzMMIZnkL91reiW2fxFhdH2b6+3N58NQiLp0+lwtGHKJb\nFz8Dbd38yATeVkodCbzUviFpWvubdsflVNeHOf3393D9CdkUZVn8z8GNLJ4Z49Bjj2PutCeIpVKc\nPNDHw/OTDO0kBJd+ihmJMWavXL6pjrB3iUlBbgZz1kbZv4tBwjV4ZqnF2CE2+R
kWCOSFDM4+oDOT\n7SOwow2YS14jZAnrI4Kqb6Dhxb+QYzdwdN98np77GiqzN5XP/gGFwgnXEEo10i9fWB42qZ72T69K\nbkYTKxp8FJ1zF0q5WOmKtamadVRO+hNWVhE9LrqHbx69nJLz/oVVV0bytXsoHDEeK68kPfbRXExa\nkQrXklXUb5P3xwxl89gb/8VvfXfnv9c/WkDXYIJ5pRZdgwmuuOtFTjx07x/l+6a1r/MO3vpzba0N\n5YiIKyK5SqmGHRWYpnWkCTM+ZkR/g6Is77/BGfvk8EVFijmfv8rQQpfdC0x65xv867gQt30Y54ii\nel6vSzK8bwZXLnWoiytSTgP3nZTJn9+NEnWEEHFeXgIzSku9vSWiDSCNRN3n8btJMoOK3x0c4A9v\nJ0g4ccxQNoU+oW92ki6ZkCJKzgWPoOwklQ9dQLbPZZ8uPr6qTZKKN2K6Ca49tit/eLUCCWahYo2b\nzIxqnn5rJxMtYxkA4g9RNfnPmJkF3opv00I5NoGsXEJZoe9Mmx126qWsL1v7nfcs0ljPgtVvcsdJ\nXckLWZxTkMv41yrpdcoJZGTntse3SdtJbEtHZRhYJCJvAZHmg0qpK3d4VJr2I2juinpu0aa7A1c3\nVhKJxpmzVjFtuY1hQG1UEbcV+SGDhrjinuHZfLouyTf1DmXJACMG2Ty/xOb+ESEunxEnFijCKOpD\nr5qPKe83mkzXod/qSfgNYVmN4tg+Jp9XxcCAO47LZNrSGPt3M6lfXUV83WKsvGLMVJi+JYIpsGex\nQUOiDkOEisoNjB4oTJ77PIE9TvSq0KZnSinlYtatpezJq3BjTd4+GIJXHgQoHDEOwzAJFnUnUf0N\n3XoP2OK0WcM0KenZ9zvH3534IKcN9bN/r+yWY6cNTbDii0856qxLd+S3R9vJbEuyeAndBaXtQra0\nUA+8Pvlf/vpWzh6UYtQgHyllMGNFkhXVNq+vtPnjO3Fc5S2Yc1w4ay+HYUUW73+dIi9oMKjIpD62\niqULl2MXmjhznsUnLk6OcNF+fiyBvL4+Fr6RZESfBD2yLPYpNuiRZ1CSKfz7nXtxBxxKpg+GdbXo\nmWfwzmqb4/pZzPzKZvdOBvt09TP93c9JDDmG6NSbyTjmSnAd/G6SAflQ3ntfco68hPVPXg1A1wvv\npuzZP3pdWVZ6llW4lkRRZ7Kysrf4PmzJyv/O5r9VcSZ/UYrtONTX1ZOfn0duxWydLHZxor5nJegm\nJ4pkAnGllJN+bAIBpVS0HeP7YVbMVFQv7+gotJ+ALU0BvfPZmUyYOhNLFI1xRX3cIW5DyFT4TOGm\nkX14Z1k90XCYLypSjN0zi/N2c5nwRYq6GCzZ4PDHQ/2c81KMPnlCXRwGFhoM62by0TcOGT4YNcji\n/rkp6mIKV4FlQMgHmT7hmwYXyzDYs1g4pIdJhk9A4Lw9fdz3WYL317pURVzycjJY2hAiVzXSmNUL\no6AH5spZPHZqNuM/DGGMuYvKKTcCXrIAWloTAKsfvYK/Pzn9B7937058kLXvPEGvoy/SiWIX8evD\n+m51xsO2JItPgWOUUuH04yxgplLqe4ZEOohOFlob3fnsTKa/9T4jjj2ccecc1zLoPeV0b9C7Omwz\n9PY1GBl5DCto4L3VNkkl+ESRFxSO6Wvx4pIUrkoPHYiwd4lJeaNDTgAMERZVuZjy7dBCy9CyguwA\nxG3olSuUNiqGdjZYVuMiQIZPcF2FCxzYzaJ/ocFhPU3GPh+jZy6sDwvZfuHuk7K48rUIdSkfI/s5\nhPwmX1akWFjjQwJZqHgjypcBIrjJGFkZ3qqKYMDHGSdvea/6WDzBex98wlGHHUQwGNji82+++RbD\nB/qYsSLFCccfu8XztJ+Wex95dqvJYlu6oYLNiQJAKRUWkYztigwQkROAewATeFQp9Y/Nnr8Qr8xI\nWfrQ/UqpR7/vnvOXrmb9qmXbG5q2i2uMxH
hq+mwuG5bJA9Nn06W4E+/MWULf7BRz1iUAb/vTThmK\nPkVxjuodxA0GmRXrj+HPYM/kbPbu6mNDzODDr5NcvK+fCQsT5Pih1hK6ZXuJonsO5AYMki70yjU4\nbWiAmV/Z/Lfc5h9H+7nh3Th1ccVeJQYLKlxyg0JjXOEPQkwJ+5SYjBxs8eE3Dr/oGWRYtwQl+dls\nWG2TGbDpVFzM8f3W8vbqFFcelMPBg4v5YHE5F7+Xjb/fgQS/egt78En0+OUYFj8yntXPbLph0tZa\nV18Fmhick2DcOYd/572789mZDDk4k3GH5XLnBw2wlfO0Xce2JIuIiOyrlJoPICL7AbHtefF0V9YD\nePt5lwJzRWSaUmrJZqdOVkq1uc7ykuA+LM7svD2haT8Dn89+if17ZdCzSx7796rniQVJ1n0dIVpr\n8saaCE1NYTIzM3EiLpcOM8i0HI7rFmP2u3MwBI4/PEBOyOD4vsKXlbC6zuGXvUzeW2MzpJPJR+sc\nsvyCKGFptYtlQIYFj8yLs6rW5eg+JnuXmGQHDPymoj6u6JMnKDHon68YUGgxbYXD7G9sPllnE0vB\n22saaYy71Ccj2AmbvCyT6UuaCPoMTtnNJCu9w2qnLIsjCqt5Y+7L3Do8kytff4G6xR9SnPfdv+82\nX2DX2jqK5uennO6NdZy/byanT9HrLXZ125IsrgaeF5H1eC3pErxV3NtjGLBKKbUaQEQmAaOAzZPF\nNunSoxcRf8F2hqbtyprqa9mw7BPuOr0zhZk+ehT4OXfKJ1x860SycvN5d9JDfPPuk9idhrC/PY8D\n++fhS9STCuSxx8pGhhYbDOtu488vQdWXUZcM8ex/YzQmFT1zhN07CSlbWLJB0SUbhnYSqiKKA7ob\n1MZgTZ0i5SqOnRDmf/f38fwSh7I6RY9c4d/DA/x2RpyXlyaxDOicAesaIWRBUabBrUf6uHhanEw/\nCIrnPqtEgHASHpsXJzPT2641kXQYUmgwcr+urIo0QLe9OH/4wYy57qGWVsTmiWHEL/fmzOsf5pRB\nwqDOAUb0j39nlfbmU46LsixG9Df0au5d3Las4J4rIoOBQelDy5VS27vRdTdg3UaPS4EDtnDeGBE5\nDFgBXKOUWreFc1pUznqCtQvnb2do2q7ss4XL6KyqeeEzr2c1knCpKK3nlb9dwL67D+Cz12dx8iAf\nz83/lMUpl2cWlhHyQSwVwVYw72t44lOFYX2NpVIELAO/KVimycE9hYAF/QtNllTbJGwYWGiyvNbm\nsQUOSsEenU2qIpByhTe/cskOCCkXApbBK8tscoKCPyw0JBSH9bIIpxxMQ7GmJsW9cxwMA04a4OPt\n1TYlmQbL6gxMnx9XJfjV6ScSjSeY+vosfnVQFgDDBwU5+uG3qG6IbNKKaP7F35wYxt01GYnVgpMD\nbLnVsLUpx10rV+hksQvb1s2PBgG74e23va+IoJSasOPD2sSrwESlVEJEfgM8BRy1+UkicglwCcB/\n/nIVt15wWDuHpf2UjRz/BeuTFq+u8MqJ1zbF6JKlaKzZQM+s/lx5cBbjDsulT2GA2z9o4PTBiov3\nDfLo/DhTVgYJZedSXttEUZeuxMqWYWTmE26o5tw9TH61j4VpCM8vTjK0s0Fpg8vgTiYFGQYPf2Hg\nw2F1bYqQT/j7UQGufCOOz/AGvW8+IkBBSDiyj8UFU2OM3c3i94cEeHJBigkLknTNNli6weV/9vbz\nlyMD3D83SVnYoDFYSJ9z/sbKZ29m3Kh9Oe7Ku7nswEwuPsBbKDdjaZiSkM2Lb33MtIu6trQiNu5O\nGj4oyP0frObJU0LcPCvCbw91tthq2NqUY23Xti2zoW4CjsBLFq8BJwIfKaXG/uAXFzkIuFkpdXz6\n8R8BlFK3beV8E6hVSn3/UlE9G0rbBs0zoB4ckcElr0RwleLls3MpyrIob0jxywfW8trFPRjYOUB1\n2Ob0KU
08f/vVnHjdk1Q3JVhfVYNpxxFcMn0KSxSOq/Cbwr0nBrnitTh2+nE4WIzv2HFEX76B0we5\n5AWFD9barKhxOW8vPxft7UOJiU9cHpvvDbD/el8/9XHFBa/EuO6QAH/7MMGMszMYUGhSFRXOeinB\n3r0L+Ch0BI3LPubyE4dy17OvEwoGyMkMYjsujU1hjultEPLBHWN6c+/sJj6szuGXRY2MO8z773Tn\nrBqq6xo5aw8fjy9I8coqg4Jsb+ZU185FOkn8HBx8xQ6ZDTUW2Av4r1LqovSuec9sZ2hzgQEi0gdv\nttOZwNkbnyAiXZRS5emHI4Gl2/mamraJjbtiDu9Wx6JKh6KsQgDEjnH2UIvpS8KM6xzY5C/tOQ9e\nRnV9mOOuuJtMFCtqHLKNOPedGOQ30+OM2c1H/wKDsbv5mLosyYkDfLyyopqaF/7M4DwbV1l8vM7h\nweEhTp0S5bkvkzzzRZLMgIXrOoRMGNLJQARyg8Lpu/mYsjjFKYN9KAWrahxsF/bt7PLcnArEfInu\nPbox/f25vP2/Pbl0epTnb7+ap6bPJrx6LrNW1HPfiQGq68Ocv28m/7n/a9aUhnhuUaIloTx9SpBg\nwM/1J5SwMJ0U9aC1BtuWLGJKKVdEbBHJAaqAHtvz4kopW0QuB97Emzr7uFJqsYj8BZinlJoGXCki\nIwEbqAUu3J7X1LSNbT6zZ3h/eOa/Mfa+twLLNKiqa8JxXFzqeW6x03Jdc//8gy/O4v/bu/Mwq+r7\njuPvzzDMDA7bIIpEUdxQETdwiVtjqlFjjGi0LvWJYONjGovGmtqa+ESNeZpHm6e2T4sbJlpibcRq\nVeISFwSlcWNUEFERYl1AFhHDvsyM3/5xzuBluHfOZYaZO3P5vJ5nnjn3nN859/vjDvd7z/md+/1p\n3XKO278Pf/x0Hd/aryfDdqygd5UYPaySCPjbo3vy8LsN/O9HG7lw1EAenbOK8afXMu7xdZx9QDWj\ndqvhokObuPO1BkbsJOYtD9Y3BLU9xaLVwZmT1rJqA+xUK1asDz5bBw+908BOtZWs3hhQWcUxBw9l\n8j+P45b7noaFr20ag7j1v6cy7dVZfH3XBk7ft5JhAyt5f/ka9u7fm+8fMwB2HcVVF568ab9jh395\n0u5Ba8u1NcmiXlJ/4C7gNZJaUS+1N4CIeIIW82JExHU5yz8Gftze5zHLp+WdPUcN24Vxx6/Y9Cba\nmmV/Ws1Dz7zE+NN6cc2za+hdBZeMquLRuU2cuX8lA2tFXY34Sh9x4UE9mThrI/e8upy/HNGTuhrx\ntd0rWLa2kbc/beCIwRXc2+MLbju9L2fdv5r+NRW8cHE/BtZWMGfJRi54cA23f7uWul7JVYK7XtuQ\nXCbq12vTJaJ8t7Qef+eL/NWo3kyfv55PVjbwX29tZOX6gMoF9K2t2ZT0PGhtWbbmbqjL0sU7JP0e\n6BsRb3ZMWGadoz1vkrc/NI0ThzRw1G470K9qPYfv04M9+4tpHzTy+qJG7nmjgUG9k0KATQFrG6B3\nNfz0tCE0bljLpUfA5U9uYMCAnXhp5gK+e0gVO/aC0/bpwazFTSxd08TStU00fhGcMLSS70xaw5Cd\n+6fPXsmIvTcfR8h3S2s1G7nr1ZX0ra2B6hoA+lZvOQbh8QjLkpksJI1sbVvzl/TMuqO2vkk2n1Xc\nd0YVPSvERyuaeHdZ8PyHjUAFjV8EFx9axXVfq6a6p9jYCGdOWstBuyTjHvM+X48EhwyCw/71Q6rV\nxH3f2YFVG77ggoN68si7DXz7gQbq0gHmxqZKamoaeGr81QXHEPIlvsqa3uzvwWnbBoo5s6gH3gKW\npY9zR8uDPLexmpW73zz+It/Yowmpkrc/bWDn2go++FMTi1dDRUWwoQnumdnAf8xsoKIiqQO1vjGY\ntfgLpn7S/GZeCVQSsYqzDqyirrYHDQF9anpw1vAqpi7agfp7rwe+rGHV
2hiCE4J1pGKSxVUkd0Kt\nA+4HHs6tEWW2PUo+xdcw9ZN0hfpSVwcH7pzMG/HJ0mVb7FPo9tMzfjSe6UuWMX2zkbtKdt8luSPL\n05haV7A137PYi+TW1tHAh8AvImJmB8bWdv6ehZWR5juVNhXtK2Lw3axNWvmeRUWhDS2l9ZseBZ4m\nqek0rP2RmVlrms8qLhqZnElcNLKWx56fwWcr1mTsabZtZSYLSXtJ+omkV4CfAbOAAyLigQ6Pzmw7\n11rRPrPOVMyYxXzgTZKzipXA7sAPlM7kEhG3dFh0Zts5f//BuopiksWNJHc9AfRusa24AQ8zaxPf\n4WRdRWayiIgbCm2TdOU2jcbMzLqkoge4C7gqu4mZmXV37U0WBW+zMjOz8tHeZOExCzOz7UAxtaFW\nkT8pCOi1zSMyM7Mup5gB7j6dEYiZmXVd7b0MZWZm2wEnCzMzy+RkYWZmmZwszMwsk5OFmZllcrIw\nM7NMThZmZpbJycLMzDI5WZiZWSYnCzMzy1TyZCHpVElzJc2XdE2e7dWSJqXbX5E0tPOjNDPbvpU0\nWUjqAdwKfBMYDlwgaXiLZt8DPo+IfYB/AW7u3CjNzKzUZxZHAvMj4v2I2AjcD4xu0WY0MDFdfhA4\nUc0TgJuZWacodbLYAKLXhgAACkRJREFUFfg45/GCdF3eNhHRCKwAdmx5IEmXSqqXVD9h0hMdFK6Z\n2fYps0R5dxERE4AJALz3dLBsbmkDMjMrI6U+s1gIDMl5vFu6Lm8bSZVAP+CzTonOzMyA0ieLGcC+\nkvaUVAWcD0xu0WYyMCZdPgd4LiI8nauZWScq6WWoiGiUNA54CugB3B0RcyTdCNRHxGTg18C9kuYD\ny0kSipmZdSKV5Yd0j1mYmW29Yy4veKdpqS9DmZlZN+BkYWZmmZwszMwsk5OFmZllcrIwM7NMThZm\nZpbJycLMzDI5WZiZWSYnCzMzy+RkYWZmmZwszMwsk5OFmZllcrIwM7NMThZmZpbJycLMzDI5WZiZ\nWSYnCzMzy+RkYWZmmZwszMwsk5OFmZllcrIwM7NMThZmZpbJycLMzDI5WZiZWSYnCzMzy1SyZCFp\ngKRnJM1Lf9cVaNckaWb6M7mz4zQzs9KeWVwDTImIfYEp6eN81kXEoenPGZ0XnpmZNStlshgNTEyX\nJwJnljAWMzNrRSmTxaCIWJQuLwYGFWhXI6le0suSCiYUSZem7eonTHpimwdrZrY9q+zIg0t6Ftgl\nz6Zrcx9EREiKAofZIyIWStoLeE7S7Ij4Y8tGETEBmADAe08Hy+a2L3gzM9ukQ5NFRJxUaJukJZIG\nR8QiSYOBpQWOsTD9/b6kacBhwBbJwszMOk4pL0NNBsaky2OAR1s2kFQnqTpdHggcC7zdaRGamRlQ\n2mRxE/ANSfOAk9LHSDpc0q/SNgcA9ZJmAVOBmyLCycLMrJMpotBQQTfmMQszs613zOUqtMnf4DYz\ns0xOFmZmlsnJwszMMnXorbMlU1kNVbWljsLMrGyU5wD3NiDp0vSLfmXN/Swv7mf56Gp99GWowi4t\ndQCdxP0sL+5n+ehSfXSyMDOzTE4WZmaWycmisC5zrbCDuZ/lxf0sH12qjx7gNjOzTD6zMDOzTE4W\nZmaWyckiJWmApGckzUt/1xVo1yRpZvozubPjbCtJp0qaK2m+pC3mO5dULWlSuv0VSUM7P8r2K6Kf\nYyV9mvMaXlKKONtD0t2Slkp6q8B2Sfq39N/gTUkjOzvGbaGIfp4gaUXOa3ldZ8fYXpKGSJoq6W1J\ncyT9ME+brvF6RoR/knGbfwKuSZevAW4u0G51qWNtQ996kEwYtRdQBcwChrdocxlwR7p8PjCp1HF3\nUD/HAuNLHWs7+/lnwEjgrQLbTwOeBAR8FXil1DF3UD9PAB4rdZzt7ONgYGS63Ad4L8/fbJd4PX1m\n8aXRwMR0eSJQcL7vbuhIYH5EvB8R
G4H7SfqbK7f/DwInSipYrriLKqaf3V5EvAAsb6XJaOA3kXgZ\n6J/ORtmtFNHPbi8iFkXE6+nyKuAdYNcWzbrE6+lk8aVBEbEoXV4MDCrQrkZSvaSXJXWXhLIr8HHO\n4wVs+Qe5qU1ENAIrgB07Jbptp5h+Apydns4/KGlI54TWqYr9dygHR0uaJelJSQeWOpj2SC/9Hga8\n0mJTl3g9y7OQYAGSngV2ybPp2twHERGSCt1TvEdELJS0F/CcpNkR4TnBu4/fAb+NiA2Svk9yNvXn\nJY7J2uZ1kv+PqyWdBjwC7FvimNpEUm/gIeDKiFhZ6njy2a6SRUScVGibpCWSBkfEovQUb2mBYyxM\nf78vaRrJJ4GuniwWArmfoHdL1+Vrs0BSJdAP+KxzwttmMvsZEbl9+hXJWFW5Keb17vZy31Qj4glJ\nt0kaGBHLShnX1pLUkyRR3BcR/5OnSZd4PX0Z6kuTgTHp8hjg0ZYNJNVJqk6XBwLHAt1hTvAZwL6S\n9pRURTKA3fJOrtz+nwM8F+noWjeS2c8W13rPILlGXG4mAxeld9F8FViRc4m1bEjapXlcTdKRJO9n\n3eoDThr/r4F3IuKWAs26xOu5XZ1ZZLgJeEDS94APgXMBJB0O/HVEXAIcANwp6QuSP8ybIqLLJ4uI\naJQ0DniK5I6huyNijqQbgfqImEzyB3uvpPkkg4rnly7itimyn1dIOgNoJOnn2JIF3EaSfktyJ9BA\nSQuA64GeABFxB/AEyR0084G1wMWlibR9iujnOcAPJDUC64Dzu+EHnGOB7wKzJc1M1/0E2B261uvp\nch9mZpbJl6HMzCyTk4WZmWVysjAzs0xOFmZmlsnJwszMMjlZmJlZJicL61Ykrc6z7gZJIWmfnHVX\npusO79wIQdIYJaXu50kak7P+92kdozmS7pDUI10/VtIN6XJz2e03lJRaf0HS6Xme4+xt1T9J/SVd\nlvP4K5IebO9xrbw4WVi5mM3mXyT8C2BOew+alj7ZmvYDSL48dhRJFdzr9eXcKOdGxCHACGCnNMZ8\npkfEYRGxH3AFMF7SiTnP0Qf4IVsWnGtrP/qTlKgHICI+iYhzij22bR+cLKxcPEJajlzS3iRVczfV\nCJJ0e1oteI6kn+WsP0LSi+kn/lcl9Uk/6U+W9BwwJS2z8EtJb0maLem8VuI4BXgmIpZHxOfAM8Cp\nsFkto0qS+TYyvxEbETOBG4FxOat/DtwMrG9t3zz96C1piqTX0340l2+/CdhbyQRCv5Q0VOmEQ5Jq\nJN2Ttn9D0tezYrby5HIfVi5WAh9LGkGSNCaxeVmEayNieXrpZ4qkg4F303bnRcQMSX1JykZAMunO\nwek+ZwOHAocAA4EZkl4oUJ+n1XLSkp4iOeN4kmTekGK8Dlyd7j8SGBIRj0u6uoh9c/tRCZwVESvT\n2mYvK5nt8RpgREQcmj7H0Jz9/4akEPNBkvYHnpY0LCJaTVRWfnxmYeXkfpJLUWcCD7fYdq6k14E3\ngAOB4cB+wKKImAHJJ/90Lg9Izw7S5eNIypo3RcQS4HngiLYEGBGnkMyOVk3xpdGbi+VVALcAP9qK\np8zth4BfSHoTeJYkiRWat6XZccB/prG/S1I3bdhWPL+VCScLKyePkRRl+yi3fLWkPYG/A06MiIOB\nx4GajGOtaWMMxZRJX09S1bjYWfwOI6mO24dkvGOapA9IpticnDHInduPC0nGSkalZxFLyP53MAOc\nLKyMRMRa4B+Af2yxqS/Jm+YKSYOAb6br5wKDJR0BycBxgYHg6cB5knpI2olkbuhXC4TxFHCyknL2\ndcDJwFPpeMHg9HkqgW+RXAZrVXq57KfArRGxIiIGRsTQiBgKvAycERH1WcdJ9QOWRkRDOvawR7p+\nFUkiymc6SZJB0jCSaqhzi3w+KyMes7DuZoe0XHWzzeYAiIj7W+4QEbMkvUHy5vwx8Id0/cZ0sPrf\n
JfUiGa/IN0HWw8DRwCySQem/j4jF+YJLxwZ+TjK3BsCN6bpBJGcB1SQf0qYCdxTo4/FpvDuQTMJ1\nRURMKdB2a9wH/E7SbKCeNFlFxGeS/pAOaj8J3Jqzz23A7ek+jcDYiNiwDWKxbsYlys1KTNJYYGhE\n3FDiUMwK8mUoMzPL5MtQZm0g6SDg3harN0TEUW043Ezgg3bEcgrJ9y5y/V9EnNXWY5q15MtQZmaW\nyZehzMwsk5OFmZllcrIwM7NMThZmZpbp/wE+uyz8wGLd5QAAAABJRU5ErkJggg==\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "stream",
          "text": [
            "\u001b[1m\u001b[4mForest\u001b[0m\n"
          ],
          "name": "stdout"
        },
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYsAAAEHCAYAAABfkmooAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjMsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+AADFEAAAgAElEQVR4nOzdd5gV5fnw8e89M6ds77D0DiIiisYe\nxV5AQbBhAZMY87O3mFhiTWKKiS36aowVC1IURIodOwg2RHoTWNhl2b6nn5l53j/m7FIEd3EXF/H5\nXNdecGbmzLk57J57n3Y/opRC0zRN076P0dYBaJqmaXs+nSw0TdO0JulkoWmapjVJJwtN0zStSTpZ\naJqmaU2y2jqA3WFleb2qCCXaOgxN07SflMN6FsjOzu2VyWJ9VZSV5aG2DkPTNO0n5bCeBTs9p7uh\nNE3TtCa1abIQkS4iMltEFovIIhG5ZgfXDBGRWhH5KvV1e1vEqmma9nPW1t1QNnCDUuoLEckCPheR\nt5RSi7e77kOl1LA2iE/TNE2jjZOFUqoUKE39vV5ElgCdgO2ThaZp2k+CoMjxuQRNENnpeHGbUUoR\nc6A2aaBofnxt3bJoJCLdgQOBT3dw+nARWQBsBH6vlFq0g+dfClwKcP1d97LPkJG7L1hN07SdyPG5\n5GYEccWCPTBZoBRBZUM4Rk3SbPbT9ohkISKZwMvAtUqpuu1OfwF0U0qFROQ0YCrQZ/t7KKUeBx4H\nmL20XOnZUJqmtYWgyZ6bKABEcLEImkCy+U9r89lQIuLDSxQvKKVe2f68UqpOKRVK/X0m4BORwh85\nTE3TtGYRkT03UTQQ2eUusraeDSXAk8ASpdR9O7mmOHUdInIIXsyVP16UmqZpWlt3Qx0JXAQsFJGv\nUsduAboCKKUeA84CLhMRG4gC5ym9CYemadpOffbRuzz6j9txHYdTRp7PuZdc1eJ7tvVsqI/g+4fj\nlVIPAw//OBFpmqb9tDmOwyN/vYV7Hp9AYXEHrj7vVA479iS69erXovu2dctC0zTtZ+uaMWdSW7f9\nnB7Iyc7mwXFTftA9ly38kg5du9OhSzcAjjl1OHNmv6GThaZp2k9VbV0dfS79bsfJisev/MH3rCwv\no6i4U+PjwvYdWPb1lz/4fg3afDaUpmmatufTyULTNG0vUtCumM1lGxofV2wqpaB9cYvvq5OFpmna\nXqTffgewce0aykrWkUwmeH/Wqxw25OQW31ePWWiapu1FTMvi8lvu4db/G43rOJx05nl0792ywW3Q\nyULTNK3N5GRn73AwOyc7u0X3PeTo4znk6ONbdI/t6WShaZrWRn7o9Ni2oMcsNE3TtCbpZKFpmqY1\nSScLTdM0rUk6WWiapmlN0slC0zRNa5JOFpqmaXuZ+267jnOP2Y/fnTmk1e6pk4Wmadpe5sTh5/CX\nR19s1XvqZKFpmtbGaqsr+evVF1JXU9Uq9xt48OFk5eS1yr0a6GShaZrWxt6d+gLuxgW8M+X5tg5l\np3Sy0DRNa0O11ZV8+dZkHhjZmS/fmtxqrYvWppOFpmlaG3p36guc3hv6tE/j9N7ssa0LnSw0TdPa\nSEOr4vyDcgA4/6CcPbZ1oZOFpmlaG2loVRRk+gDvz9ZoXfztD5dx3YXDKPl2FRceP5jXX2n5zChd\ndVbTNK2NLJz3IR+Wxhj/dck2x3M3f8iZv7r6B9/35n8+2tLQvkMnC03TtDZy+6OT2jqEZtPdUJqm\naVqT2jRZiEgXEZktIotFZJGIXLODa0REHhKRlSLytYgMbotYNU3TmkMpBUq1dRjfTykvzl3Q1i0L\nG7hBKbUvcBhwhYjsu901pwJ9Ul+XAq3fGadpmtZKYg4Yyt5zE4ZSGMom5uza09p0zEIpVQqUpv5e\nLyJLgE7A4q0uGw6MU14anCsi
uSLSIfVcTdO0PUpt0oBwjKAJItLW4XyHUoqYk4pzF+wxA9wi0h04\nEPh0u1OdgPVbPS5JHdPJQtO0PY5CqEmakGzrSFpXW3dDASAimcDLwLVKqbofeI9LReQzEfls+sRx\nrRugpmnaz1ybtyxExIeXKF5QSr2yg0s2AF22etw5dWwbSqnHgccBZi8tVyvLQ7shWk3TtJ+ntp4N\nJcCTwBKl1H07uWwaMCY1K+owoFaPV2iapv242rplcSRwEbBQRL5KHbsF6AqglHoMmAmcBqwEIsCv\n2iBOTdO0n7W2ng31EfC90wVSs6Cu+HEi0jRN03Zkjxjg1jRN0/ZsOllomqZpTdLJQtM0TWuSThaa\npmlak3Sy0DRN05qkk4WmaZrWJJ0sNE3TtCbpZKFpmqY1SScLTdM0rUk6WWiapmlN0slC0zRNa5JO\nFpqmaVqTdLLQNE3TmqSThaZpmtYknSw0TdO0JulkoWmapjVJJwtN0zStSbu0U56ItAd+kXo4TylV\n3vohaZqmaXuaZrcsROQcYB5wNnAO8KmInLW7AtM0TdP2HLvSsrgV+EVDa0JEioC3gcm7IzBN0zRt\nz7ErycLYrtupEj3moe0h/nblaEKh+u8cz8zM4uaHx7dBRJq2d9mVZPG6iLwBNPzknQvMbP2QNG3X\nhUL19LzkP985vvqJq9ogGk3b+zQ7WSilbhSRUcCRqUOPK6Wm7J6wNE3TtD3JLs2GUkq9DLy8m2LR\nNE3T9lBNJgsR+UgpdZSI1ANq61OAUkpl77boNE3TtD1CkwPUSqmjUn9mKaWyt/rKammiEJGnRKRc\nRL7ZyfkhIlIrIl+lvm5vyetpmqZpP0yzu6FE5Dml1EVNHdtFzwAPA+O+55oPlVLDWvAa2s9AZmbW\nDgezMzOz2iAaTdv77MqYxYCtH4iIBRzUkhdXSn0gIt1bcg9NA/T0WE3bzZrshhKRm1PjFfuLSF3q\nqx7YBLy62yOEw0VkgYjMEpEBO7tIRC4Vkc9E5LPpE7+voaJpmqbtKlFKNX0VICJ/U0rd3OoBeC2L\n6Uqp/XZwLhtwlVIhETkNeFAp1aepe85eWq5WlodaO1RN07S92m+P7ik7O7cr6yxuFpE8oA8Q3Or4\nBy0L73tfs26rv88Ukf8nIoVKqYrd9Zqapmnad+3KAPclwDVAZ+Ar4DBgDnDc7gkNRKQY2KSUUiJy\nCF63WeXuej3t58tOJvh0/L8JiE0iadNzyHmIabB55Vfsd/y5bR2eprW5XRngvgavPPlcpdSxIrIP\ncE9LXlxExgNDgEIRKQHuAHwASqnHgLOAy0TEBqLAeaq5/WaatgtqKjZxZAeH3546mETS5tIXZ2KY\nJlZ9KehkoWm7lCxiSqmYiCAiAaXUUhHp15IXV0qNbuL8w3hTazVtt2vorBXZ0m1rGDvtwtW0n5Vd\nSRYlIpILTAXeEpFqYO3uCUvTNE3bk+zKAPeZqb/eKSKzgRzg9d0SlaZpmrZHaVayEBETWKSU2gdA\nKfX+bo1K0zRN26M0a/MipZQDLBORrrs5Hk1rM/WRGKNueozK2nBbh6Jpe5xdGbPIAxaJyDyg8adJ\nKXVGq0elaW1gzoJlVJet57kZnwDd2zocTduj7EqyuG23RaFpbSxUV8PCpauYemEhv3vtMzocXExa\nRkZbh6Vpe4xdGeD+3nEKEZmjlDq85SFp2g/T3H24d3RdvGYThxZE6dcuwGm9okxetoj9Bh+CXtaj\naZ5d2imvCcGmL9G03WfrfbjL1q/GcRwA1r/0J269eBg1m8twbBtME2/OhkcpRQdfmHbpwuUT1lIb\nc1m3ppLla0txk3GmfDTsOwlH035uWjNZ6F/BtDbxtytHU7JmBY6rqHngksbjbrQe5dqAUBuO4SiF\nmVVA8UX/ZssSPIh89DTD5R1+fYAFCGKa5KbFeHZZGDctnyh+2EGLRdN+TlozWWhamwiF6nHFpN
05\ntyGmr/F45cwHceo3Y6RlgRiIGLjROson3UG7s+9qvM4tW8bEyjgTv4lhpueCCJBGWmFncs6+h43P\nXAtB/aOi/by15k+Aroug/SgaWhJKvJnfbqq7SRAkdUwpFzcRIbOwAxmn3oiZloVdWwYIFdP/vU1S\nCZ54LRXT/okTrqH90DvxGsmCIfpbWtMaNHs/iyZvJLKfUmqHe2n/2PR+FnuX7QekqyvKkWAWRjAT\nNxbGjdahXAczIxcAsQKI5UMqVtE3J8mKSBZ2IKfxnFNfgZXXATcRBddLNE64BgArtz0FJ13h/eqj\nIL1DL759eCwqGcfw+beJyxCDTt166LEMba/Rov0sRKQLcC/QCZgF3KuUSqbOTVVKjQDYUxKFtvdp\nGLhe9MQNOLGI15II1+BG62h37l/AscG08BV0AaUofeZq3ESEDgU5/Pn0bK6aEcEeeidmWhZlz92A\nch3s6o3ecxu4DihFxawHsfI7YldtRKGIlK5COUkwTFzHRowtA+OOclm3egV/u3K0ThjaXq853VBP\nAS8Dc4HfAO+LyOlKqUqg2+4MTtO25sQidLz4ASJlq70P9hn3pcYXAMcmWbkeXAcnXIPPjTFi/3y6\n5ZoM7+3y4hdTSD9qbOO9JJBB9IMnSD/haiqn/6uxleFG6yh74Y9e8jBMMrJzQIT2591D5fT7KB6z\n5TW917NZP/kunTC0vV5zkkVRam8JgKtE5ELgAxE5Az0DSvsRJeo2s/7pa1CODYAbrQPHxsrvBCKU\nT7zd+9AXKPAnGd4tgl0TY3h3ePmV1yldvQAV9TZfNMOb6eorZcVLv8e1Mr0koBR21QYvGShF1fjf\n05VvWeZYVEz7B260ntJx12H407wBcsdGoVAIoVB9s9d5aNpPUXOShU9EgkqpGIBS6nkRKQPeAPQS\nV+3HIyYdxj6Am4wDsOmFP0KqW0gMCzcRpcOY+wl/8CSj0udT1KUIFORWrmdEP4PJwSMwexyGm4zj\nf//f/Pn0HC4bv4bNZEHD2J1p4cvvTHz9NxQEHO4+oytXzYyQOPoyjEA6Vn4nyp67AaduM5gW4joo\nx6ZyUymVm0ppf+7dmJYP0zQp7tITgNVPXNUmb5emtabmFBJ8Ajh06wNKqbeBswE9TqG1GbVVw1Y5\nyca/u2VLmfhNjJP/V8rJT5Ry6nP1TFqUJP7Va7ixMKrkS0b0NeheEGREPwsrUfudeyeXvceIfibd\ni9IZ0d+PKlvsJQfLG+RWyt3m+k6XPYWVXYiV14lAYdfGBYGatrdosmWhlLp/J8e/BE5s9Yg0bTuZ\nmVmsfuIqlHJQdqLxuOFPo3zCnzDSsgGvWypZVULaidd515net3fVhD9h2RH2yYf1y2aTWbmYkWdm\noJwEZ/YzmLqkYZzC695SQPvEekafGcCpKmHU/p14dfK7xDvut9MYy56/ETdcjV27CQdBuTZl61dT\n3KUnNZvLuPXiYQDUVlbgphKNKJfcouLGf6PuqtL2ZM3dz+JY4CqgYRvVJcDDSqn3dlNcmtao4UP0\nytMPw63bhGMnve6loy6g7tW7yT7hDwS6DKD06auwctojlp9k5XpvdhTeYHaBUcfdQ7tx5eQ5nNQ/\ni/ysdECRl+FjRH+Hp5dUUnjJE4jpI/zBk4xQG+lfHKCkJklumsHwvsL4lR+TDGbihGvYNP4WxDBR\nqW4op74CEDa98EfMrIJU5IpSK4jjKqwhlwNQgEIMb43Hpom3NZYn0V1V2p6uOVNnh+Ltg303cBfe\nDPTBwFMicqVSaubuDVHTPDkFhXTq3oey9auJx6Ikl86mT1aCZVPuROV0xgnXYFdv9C5WyputBBih\ncrIyXXLTLM7o7fLS17VMX+61IpxQPYiBkaz3nmuY2IvfYipJZq2sQ7mKiPEt4k9HZazBV9AFMQyM\ntGw6XPwQpeOuo8OY+0lWleDL70zpM1dTPOY+xPSh7ARlz9
0AgC+/E2L5cZNxDF+gMUZN+6loTsvi\nRmCEUmrBVse+EpHPgP8Ae1yy2Fy2kY1ry9o6DK0VFXft1dgdVblpI67rUGSEuemUADe+Y6NG/omy\ncdexacKWSvpiGCjlkm8mKMoweHXuakb0sZiyKEpZvc/rb3KDmFmFuKqe7NRsqBwzwZTzsrlqVowb\nDrW46s0E5ck8JBSm7tlrUa6DE66mbNz1GP40b7zEdRrHTezKksYYnHA1AMmaTVg57byDygVp1r5j\nmrbHaE6yKN4uUQCglPpaRNrvhpha7Jtpj/LeBx+1dRhaKwnH4hxx7pZuGqXAF0hjcGGUd9Y4nNnP\n4MkXrsNMz0l9OAtWTnuUHceM1dAuKPxnaBZXzIox/LCujDqkjgnOsdQs/gSFovC0a6mc+SCYFs7C\n6Zw/0GKfdj6G9bV5a3WSLMumor4c20xDjFQ5EcehYNj1ANi1m7ygRMAw8RV1b5x+u2UR35ZWhGsn\nEV1KRPuJabLch4h8rpQ6aFfPtaXX5q9UyzdUtXUYWisI1VUz9ZG7uejWB/n7tWPoecl/mH/PWeQY\nUf51rLCi0uW0Pj4umGHinPUQZloW5ZPuIFlZAgIZToix+xv85kAfT36Z5IWvk5imQcwxCEuQgBvD\nzuyAioeQQAbFyQ2MGxEkL82gJiGcOyFEh2yD1TVCLKcbhcNuAMdm08TbtxqbAFzHKz+SiGIE0lHx\nKAB2fQUAZkYuRloW7c/9K+DN3trw2G8QAX92EclQFXmFXstDD3ZrbaVF5T6AXiIybQfHBej5g6Pa\njRLxGLGo3kd5bzD/9cnIpsW8P+mJxmM+J8agIpt26X4WlSt6FxgM7xHnxbnjCRwwjPyTLmfT+Fsw\n0/PIS8/irKOyMNItzjrK4dWNlZRsDmH4/PjtCH1zXTZ0PxCrx6FE3nqAUwa1I79jLsnqjdiOS2GG\ncNvRAa5+PU6ovhIcm8o3HkEMAzdSs824g1NfgVh+nPoKOv3fUwDYVRtSy/aETRNvp2yc1xrxSqcr\njPQ8CoZdr9dlaHu85iSL4d9z7l+tFUhryl//Lh2Wft3WYWgtVB+JsuqDGVx8QIDH33qRcNSi8otZ\nkIzz+UbFze/EWFureGuNS0XYxfHPh0FDG4sDWsl6hhWHyYyGSEYhExjWIcJTmxVJ5aPAn+T2YzO4\n/pPPCbkuwdhmXl6QxmtLE7jRCI7h44L+PnoWpTHqkAKemO/9AqLseGNdKV9+58Z4S5+5msKh17Np\n4u1e1xSA6cOf36nxGuVuWX1u5RRj15ZR/uo/caL1lKQSj1IOVww9mNyiYt3K0PYYzVln8b3bqbaU\niDwFDAPKlVLfmcguXufug8BpQAS4WCn1xffd89iD9uHYbrpP+Kfuvhfe5LcHp3P90Tn4rFr+/mGE\nrj17sw7FRYN83HtiGqePDzO4o0XcFp5bIY1rKwCCOfm8vNph8mKvArHypxMLJbAwydz3SEYY79A9\nz+Kk7ooJ82bRq8BkbcwHJ/yB6rcfo2O6w8jDgphZaYwcEOXleaWUv3YvKh5uTEjKsRHTaizQ7yvq\nDuBN4TV9KCe55RrvGTiROkQEZccBoeD0G70xE9duHOtwwjWEYjY1m1foMiLaHqE5U2dvb+a93lNK\nffADYngGb2ruuJ2cPxXok/o6FHiU7VaUa3ufipoQ09+fz8RzsgA4pW+AO17fxLIPZpDhE4KmsLzS\noXOWMGGRQ8zMRDLzt7qDonZzOT43RnHAJidosKo2TM9cg+W1Jukb5zPypCDKtXETETIsh78cn8l1\nr0coXfouVqya03vb5AU7oewEWclKRuxj8vTSEAmEipkPeEULI3WY6Tn4wptwxE/pM1ejlEvlf8eQ\ndcZtGMEMr4RInte6KBh6PeUT7/D220jVqcJ1UHaMDhf/pzFZJKtKSG/XjbUPj2msurv1VrGwZbtY\nnTS0H0NzuqHWNvNeNT
8kAKXUByLS/XsuGQ6MU95I/FwRyRWRDkqp0h/yetpPw7gZnzCst0Fhpvct\nGk4oju3h58PFsxkxIEhGUHAwCfgMTu9n8fQSwY671E/7J75wOa5rIEoRlARFmRbXHOrn1tkJbh8S\n4NrX45zYvo4cv4+aqMuHK8JcfaifIzsLowcYPDjvHXy2Yso3ipcWrCTpKHyWgZWRS1phR3KPvQxf\nfmeUk2Tj478lb79f0nH1NNYXH0PGMZcQfu9xOq17nbLSRViDR2DXbmqcVls+8Y7Uv1BQrreSu+zF\nmzF8ATZNuLVxYFy5Xjl013GoLF1PT8BxHAKFXRvfI19mPj0v+Y8e49B+FM3phnr2xwjke3QC1m/1\nuCR1bJtkISKXApcC/Pfua7j0+F4/WoBa63vvi+VsLI/z4sJyAMIxm3ong6Df5sPyNJJrQ/zviwi1\nUYcMv4PhJHBdwedE6ZEZZ2W1wnYFw684vqeP5ZUO5w8w6JErnLlvgPFL4LXlURK2Q6csuGCgBUox\nZnA64xeFKE/4eXR0Z347sYxOvjArIlnknv5nMgu6eIv9UlNflXIJrnqXu4a258rXPiVRdSrpG+Zx\n2zF+rnljAqUL38ONel1IXneUuc2eGMp1EdPCTcaxqzZstceGYOW0w67awKaJt/HZPWdtuzZDDFCK\nRU/cQNqP8j+i/dw1pxtK8IoGKmAycBzeb/tLgcfU9hXV2ohS6nHgcQCWv6moWNa2AWktMu3fV27z\neP6StbxY2Zd9DzuBmopNpM37L/dNfJ+KzbXkXPU8/vJ15PjTMKbfCnYleYEYtXFFYYbBufsK172e\n5JnhQWoTilGHdmbahhBldhaBqpUc18OgKgpVURcxXI7rbjBlaYxsv0uhFeH2Y/xcPasOJx72ak6J\nNO6d4XNiDO+VTrccYUQf4fm3H+TsfiY98gOMOriAF+sGUbdmobdl6+i/UjnrITqMfaDx35Xc/C2Y\nFuUv3brtG+DaXmXbBmLQ/rx78BV4A+qJinUIQvXM+/T+4NqPojnfZY8A7QA/XpIIANOAoXi1oq7Z\nbdF5NgBdtnrcOXVM+xlZuLaKYLG3NarPH2BlaTWbq2px4hFvMyQUzuIPObIozkfL4jx9RhqXz4iR\nHTCYuNhhWF+LDlkGElYoI8LwvgYT7F/glPiZvK6MlxZVAmCmZyMxhz5FQWYsruf8gX565wnD+1m8\nvGEh0nGfxhlQdn0FBf4kI/f3KvUf3zvIq1+tZOSArqhEmFH7ZzJ1wjyqQlFEDEQMnFAVtRNvInjk\nWIxgJihF5YwHvBaGYVA5/T7Aq6grhkXByVfQ0GVV9uIf8RV2o3j0PT/6+69pzUkWv1RKDRQRH1AG\ndFBKJURkPPC9s5JayTTgShF5CW9gu1aPV/z8vPbZOo67uj8AaZnZrKhIUJCbRcyB9OKehNYvJrDy\nHUqtEAe0N+icbXBGP5P/fWGzusrGbwr/+sSrWKuMOJKWTTJtHsETrgEUVeNvAYS8nkdzftYXjNo/\nm1teWcX4s9JRdoJR/S2mv/sm8Y77eR/yQGLhLM7pZ5Gf5XUEfbSmhvMHWuT6EtgJRW5AMaKvyeNz\nEthmOlVv/xe/G6OH2kDJmvnEK9aQfvyVKNem/bl/9lZ/p4ofNmwPa+V3xswqoPiCf1D23O+xq0rY\n+Mw1KDuBP1aFymxPM+uBalqLNOe7zAZQSiVFZL5SKpF6bItIi7ugUklnCFAoIiXAHYAv9RqP4dWe\nOg1YiTd19ldN3TMWT2BH4i0NTduDXHJcb97+eh77HnYcddUVLPjyC8rrk7iOQzIRR63/kmM6u3yw\nNM7YQT5qY4oVlS6mAdbIf2CkZeHL70zJfy7ATC/wOlUjcULT/gl45TvEtLA3fMOzVWU8/+lmLtwX\nqsMOylXkBeH0TnU8Pf3vOEFv1pW/di2TDIfJS5YAiqTtEDDhqc834AIiK7zrLD8qq4hk
VQnFvgR/\nOtzP/732Cl2zhJUT/gi+zJ3+u8XyIQgNc3PNjHyKx95H7NOXKF42keW1pWS2G7gb33lN8zQnWZSJ\nSKZSKqSUOqXhoIgUA4nveV6zKKVGN3FeAVfsyj0fn1POgoUbWxSXtmcxffkcOPQoAHLyi6iJeIvb\nJJhJ/Wv3oOJhZlTUcukBFgs3uSSdJIsrFF2zhXVT7yBupGNk5CGWj6LhN6GU2zijyanbTNmLN4EI\n8cqN9M12WVPjMP4bYfw3SVAKMyMXgGBWHraYCFAvaUQw6HzZOJRrE1QKMX0kK9ZSPfMBii/wEpFV\nu4nYtH+St98xjEqfR0H7TNplRrjrpDyumFpJWTwOykXE11gpF8AJ11D27HVIYMsQtnJt4usXEVj5\nDrcdE+CaWTHqaqsb98sAvf5C2z2aMxvq1J2cqsdbTLfHceJRnOh3FzFpP112PI6dTGL5/CilUoX7\nDHx2lC51C1geTifdtXlliaK0XrFgk3BEF5M3Vtn4DJtkZj6k9u5WykU5SZIV33otjNTMpsLTf0/g\no/9w25H13DA3n9KIRYeLH6TkkTG0H3oniGAsepP8lW+SGxSWKkVS0khWparMuk7jNq/bUijXIb1k\nLiNHZTFjkTcW0jMrwYj+Pp5e4rUczOyibZ4lhkHx2PtAwaaXbmlcl1E345/8qn+M7rk+hvezeGrh\nemwrA5SDP7uI6opVev2F1uqau/lRDnAK3pRV8AaY31BKle+uwFri6hN6ogYlm75Q+8kor67nrudv\nxwqkE4/FyMkMUuME8MerOf+wYh75uIaT+xVixmr5eF2CCwf6uGSwn6EvRjCDaazsfAhm719SMe0f\nWDntsas2YOVtKcMhhoGzao633WquxfBeDk98FmpcH1Ex4z5AyI+XUJTp8qejA1w2I06ZvYOeWOXV\nghLLn9rZT/AlQ4wY5LUQPllZw/iz0kEpRu5jMnVZGCcWxq5tKKsvqRXiDd1PCjcRpd25f8GN1OL/\n8EHOPqwQX7rFqAEbmb4xiBp1P+UTb6fjxQ8Qr1hHp+599PoLrVU1Z+rsGLxxhDfZMgvpWOAeEblL\nKbWzlddt5qHZ61mwYEVbh6G1toyOxAEyoboujHLr6J0XZ1C2cEb3OC9+Fkq1OGB5peKO9+LccLif\nG96OknS/InDQKBRQP/1vBAaP2ubWSrmkb5zPqPMKkUiUUQfk8vIXFTjResysAvJP+B2xtx+kf3Ea\nJ3VzOLCTnzP3sXl6hX/LDCbXAbxSHYhQ+nTDh7UikAjx0rx6np/jcuH+fqoj3krsvCCM6Ofjyal3\n4WYVN8bjhGtQjs2G/zc2dW8XnCTO6rmM2MdHXrq3LiQnoDi9cx1PP3MFtq349mFvllVNZo5ef6G1\nquaUKF8GHKqUqtnueB7wqS/jz/8AACAASURBVFKq726M7weZvbRcrSwPtXUY2m50w6gjkXgds/6v\nF3kZFhvXfcuvX67hxP659MpKcMVBJmtqHJKuwdMLHJ5eFsT2ZWPUl9EvJ8myWh9uTieU6+ALlSFO\nnF8f0Y5LDs/HrinDV9CF/769iqcX+0m4Qnbvg8la8zYdsgwmnptFXlD4vCTGmLcycUfdj5naB1zZ\nCTY89mvEML0Fdq5D5esPe3WgDJNA7TrSGn5FE0kVFhSijhCzshtXdatUl1lDTSnl2BSffw+xdx/B\nH61orHbrRGoRESI2RI2MxgV/ynVBwDJ9dOrWQ3dHac3S0hLlXjv4u1way6dp2o/LdBMcWJjgzc9W\nM2LfIPkBl/P38/HqijDzTWH81zFCcUUoCWZWIekFxcTS22HXrOfyw9K5e3aEctfBitfSJztBSZ1i\nwvzNvDS/IvUKiwAwnDiupGOueIeBnQyO7GJSmG5QEXH558dxTupsMXXuePwDTmqMTTkOYphUTr/P\nqz0VrUutzBZwko3FDssn/AlEaHfun0lL7R2u7ARi
+Sl99lpwHTr86j+pBPQbUBA87gowtvzYVky4\njXbn/pno+FswswroMOb+VAxe95mqLyf09kO7/z9E2+s1J1n8FfhCRN5kS9mNrsCJwJ93V2AtsX7N\nKlatKWn6Qu0nw/IH6NZ3S1HinKxMltfGWPilwfg1AUI1IZxkkqT4SMvOJRHxNr+KOpA/5kkCIoT/\nezEFQZe3lkcZObiQF2r7kr7uYy7a3+KJL21qY1CR1oWOFz+EE60j/OYDZB50FsHVc8hfOZ3PNrp8\nvtHhwU+rSTqKDpnw0bp6rI6r8RV1J1G5gdCkG1EC7c79C76CLt4+3C/8AV+qTEjlG4+kqs3S2Iqo\nmPZPjGAm7c6+yzue2qbVjdZt1ZUFVe88Truz70Is/1bvjMLKSW1Y6bqNg/UNHNumuqJcD3hrLdas\n2lCpzY9OZssA93vAzUqp6t0Y2w82KLmAdvGFbR2G1oo2bAwz660XyC5oTywS5sbRx3HxCQN4+/OV\nvLoug8PPuYK/XTma+vo66kM1hOI+JJCBa9cSeeQilOOQpeqZcmE6oyZFuapHHPOrj+mTGyc/zSJo\nQm421Ia9GUfJRW/SPb6SdcveI6P8K+45PsijnyW4/JA0/m96BEc5DCy28JmKDZ29JBaaeCN9MiMs\nqzbAsVOD21tUznoQp37zlvpPqa4mTB8V0/4BgF1bDqjGmU+SakVYOe2xqzdS9twNjccgVVuqMXl8\ntwNADEsXHNRaRbNmQ6WSwkvfd42IzFFKHd4qUbXQoH16MqhQz4ba24yxHRzHRQQCfh8Aw48awDuP\nv8eXz93FKYfuQzQS5dfHdOWieyax76//wZyn/kwiHsWoK+WyA33s197kql/4mLqwHifm0i3H4sL9\nfeQE4OpZcXKMeuKb1xL7fAo3ntWOm177mJP7Z/FJiUPChTkV6RRkJMgyXT7f4PDYsDTGvPoKJYs+\nIscOceNRGdz4Rpi6Nx8g47jLkUB6qgR5ApWMY2Tkpeo7SWMyadgoya4rB9fGyuuIkZbtbeFqmIgY\nbH71796xodc37pmBQNmz13/PO6Z7ibXWYzR9SbMFW/FemraN6roIgy//L8ffPpXDrn+OT5esazy3\nqSZCRPmJKD8qLYdHPywnFEvw4f1XYtduQiI1YMdwXOGtVTYvLLT5qtShT77QPkP4w1txvixzOaOf\nRc8cRfKjJzm4k0XvojTO6O3yylfVTPomzg2HB3hx3mYisTiX/8KPz1T0LPAxsr9FlsQ5tofFQR1M\nBhSZ9HC+JbHoTa+LyDAR00tugoDyBqbrX/vrtuuBGloaDVKJ4ju1OkWwqzeSrCzBCVdR9ux1jaeU\nnWicruut77BJ1Ov96LWWa82iMt8/rUrTWiAvO51HrjyNyroIpiEc1GfLGolAYReOHHMT4PXR33nB\nEO589i0CaekAvDv+UTa88zijBigO7GjhAmOnRNkcUTy/MEk0qeicbRBKKCoiipz0xdw6MoMACc76\nRQemrdrMiZ0V3bt3oU/hGvbJc/nLB3GKMoSXFsY4cx+LV5dU8Y8T0vGbLhEb/nxsgCve+gg3ORYj\nkE7ZuOtxwjWYGbnegsCFM+meWMXGxW9h9jwMJ1xDxfR/40brMDPyG8uTJKtKwLG96bhA+eQ7ce0E\nVlZh4wLAhr0v7LoKNjz2G8QwUse98ucNW7lqWkvoCmTaT8bR+3ff4fGMWDmfj7u78fGIoSfxzaQt\n28PPfeMN6mtijJnikh0UEklvjcPmsMJRgmkIa2pc/JYBpo/TeyumfhMmZodwlDA4zwHXYPOG9cTi\nCUbvF+SkXsKCMpeH5sY5tbfN+QN9BEyvPMjwfl4LY3jPOE89eyVuWr63SM+0wDBJVG0g8fkr3HB2\nZ259533inQc1xqpcl+Ix9237DzQtrKxC2l/gjWtseOw3FA69HkwLEQMrryMApeOuw6n3qucqO4Gv\nXU+KR9/Dxqe2
LfeuaT9Ek+ssmn0jkS+VUge2ys1aSu9noW3njBseZmN5xTbHNlTUYRT2IL55Pbec\nkM9fZ4dQ2cX4KpejMDBE4SohlnTplG1wWl8/hWmKCwb6uH9OnNdXOWT7YUmFN45iGZDhE8aNSCMv\nTaiKKsa+buGMeggrq4DSp6/GF90MTpIBeTa/7O4n5ghPfa1IiJ/2o//GpvG30H7rEuSODaZFxfR/\nN9aa2vDYb2h/7p/xFXbzrknNgCp79jqccBXFYx+i8n8XY2d1QAwTJ1yDpLZrNQwht2jL4j89Q0rb\nWkvXWTQSkfbAL1IP521X7uOiHxCbpv0ott9MCeCQyx5h9Zq19MpOsvjbCgblucwrXc3ZA/yYpgnK\nZfKiOKf18fHJBsWTX9kEDcWDcxOgFD4L7j0hjd9Mi1KcZRB1LYb1hsO7WlgCcUcxokec8QumYx01\nFp8TpXdmjLJ6m0eHZnDVrBi/PyLI1CVxNqd1QlI74ZVP+FNjjA3Ta7fZJQ8aF+XtSHLxm/TNU5Tt\nO4TgwWd7GyyJ4MvvROm467GGXI5peT/6eh9vrbmanSxE5BzgXrxpswL8R0RuVEpNBlBKfbNbItS0\nJvxlwlwqI96HZzKZ4PdnDKR7cX6Tz3vuprMYetU/eHJEJkOfr+c3BwVYVWNz3i978NiHFVw0yMf7\n327Gb8Hp/fw8vSRAQoSL+4YJWlCUDvu1MyjKMLjzmACXvBZlyhLh7VVbxghCCUU8+gpVC96m2Axx\nxwk+7njPpShDOL1/kDlVWYw6pIAJyUNQytsAqdNlzzQ+P7n5W8Tys/m1e70psg1JQnb8C6BSLsHV\ns7ltSJAb5szGHXhq6lpFsqYMN1pH5cwHtmwL6zp6Wq3WLLvSsrgV+EVDa0JEioC38bZa1bQ2syyU\nwdFjvQHuqvKNzPzsSS4fls8fnpzNp8tKefHG0+hUlPud5930yGTO38+iOMMgbru8syrB+QP95BtR\nDuuezubqas7oa/LEFwn+OzzI1CX11CYtJixKkuGD54YHeWR+ktH7+eiVbzB6Px/jl1lESceN1ODD\nJakEq31v8tr3ZVTGfHrnhzlvgI+HPk0wrF+AMVMriPtzsTMXYvb2SrA3rL5uYOWnBvO3bk0otaXa\nbYoTrsLnRBneM43uuSbDezm8OOdFfANORBB8+Z0xMvIoHnN/4yB4ySNjW+u/QdvL7UqyMLbrdqqk\ndafeatoPUlGymq9efxGA2srNjOzlzYL67ckDGdyziA4F2d99Tk2Ixcu/5fdn+PjXxxEMERaUOSyr\nTPDkFxsJJaB9uotSCsuEj9YmGN7P5JllaYTiLqMHKEwD5pQ4vHJOkLgjjOrv47WyXEqTmRRZ9Tx0\nSiZXzwzDQecQnPc/zhwm5Gb4uGiwj/Mm1HHG4V0ZcXAdE+whZAwe0bjeorHsR2q6bbJ8DU59JRse\nvRjwZj+VT74rVbiQxg9+10lSEIRRg/MhFGbk/pm8OnUB9sFnoWL13qwo19kmySjXYdETN+xy0cH6\nmipeuvdGRv/hX2Tm5O3is7Wfol1JFq+LyBtAQ8fmuXi72Glam5p04ylU1oUBMI0COrfzPrz6dC6k\nT+fCHT5n3IxP+N0R+fTrmsFts9fz3Kgszp5QR7JoHzK77otaMZdI7UbSLMXRXS0mLUoQ9+WQVtiZ\n5IZvmLIUJnyT5PyBfiqi4LiK/DThjO4xnv6imlGHF9OnTx4j+q/gpYWvMaK/n1wrTHXEG4M4qKPJ\niEeXYJkGkeR46ue8RMxIQ5S7ZQOkhq4m00ptrfpP7OqN3uyo7HaUPXcDTn0FCgsxDALYDO8DGSEv\nGeSlW4zoK7y0+E3Mnod7pdMNC19BF2/AGzAzcnFiEQju2sTI+bMmYG1ayLyZL3Hc6Mt26bnaT1Oz\nv0OUUjeKyEjgqNShx5VSU3ZPWJrWfJnpATLTA7v0nPe+WM7G8jgPf1LD8N7QIQ
vO2tfHxMVLUXYt\nZjxEv0I4sou3RevwfXxMNIZQv2YhBZaiXYbB+lrFG6uSjP8mQX6aUBsD10gQcB1G7Z8FCkb29zPh\n5aVMrM9m4gKFikUAvA2cinpjHXcl7it30DfXoaT90dR9+01jrSe7agNOrJ743BdRdgK7dhNuIkZ8\n3ngikShutA6x/Bhp2YhhYdWuZdJih8lLnFRF2+WIGETdV4jMn4mZkYsRzPQW7jW+Ezse+3Bdl9rK\nHW9XE66rYeE7k7jnhHbc8vYk+h56HBlZObv0/mt7qp47PdPczY9M4G2l1LHAK60Ulaa1mWn/vpKK\nmhDn/OFBbj0li+ygyXvrv8U1TSxfgIxgPdEknN7Xx2vLk5zcy2LaO3Mxw1FGDcqhLhThlN4+kmaQ\neWsjHNzBIO4aPL/E4qz+trffhEBumsH5h7Zjgj0EJ1pH5orXvV32qkDFXepn3Ee+U8fNx3bg91Nn\nURvoxqYX/ohC4YQqSUvW0StPWBYyqZj2T69Kbno9y2t9FF5wP0q5WKmKtcnK9Wx66RaszEK6/OpB\n1j1xJcUX/QuregOJmQ9SMOwGrNzi1NhHQzFpRTJURWZhr23en/lTHmc/fxk+67s9zXPnLqR/ei2l\nFQn6p0dZ+OyfOP4wvQ/4XmHUETs91dzaUI6IuCKSo5SqbbXANK0NjZvxCcN6GxRmej8GH1/ZhX+/\nX8vnNS7pMZsB+Sbd8wwSDsxaaTOsUy2TquMM65nO799wmLfRJpJI8J/TMrhqZpiII6QRY8pimFFS\nAsrFjdSC1BFxJ5FpJCnKxttlb2accEY6/uI+FK0sIVJfx1n7mDyzJkL22P+h7ASb//cb2qUrbh8S\n5IoZMaIBH5mmxd2nd+HySRuQYCYqWrfNzKiG6bd2It44lgEg/jTKJ/wJMyPfW/FtWijHJpCZQ1pm\n2nemzWbmt2P1t6vxWdtuExuNxfl80QrOGhCgpNaha57F5EUrKO7clbSgH23vtSsdlSFgoYi8BYQb\nDiqlrm71qDTtR9DQFfXiwm27WyrqNhGOxJi3VjFtmU152MVVYArkBg265JgM7x/AFEXSUXTrmMew\nfkkmLbZ5eFgaV86IEQ0UYhb1pGvFx5T2GkGG69Bv7QRO6WVyYEc/I/vZTFu5FEJreeC0DNoFHY7r\n6ueVpeXE1i/Cym1PplvHxQf56J1n0K9AWFm1kqQDEo4yoq8wYf4kAgNP9arQpmZKKeViVq9lwzPX\n4EbrvX0wBK88CFAw7HoMwyRY2Pl7t18dcOxIYOR3jr87/lEuPuIbrvzllrGgzJwKlmf25kA9drFX\n25Vk8Qq6C0rbA322bANryrzaSZYpnH5YP6zUb8TL15XTp0tR44Du1rZfqOc4LtM/XUZVbYS/PzmZ\nC/exGd7PxwsLbaYuTXBcd4OJi2zOnRzBVYpQzKFnnsFJfesZ1d/i/W+T5AYN+hWa1ERXUrJ0BRkF\nBs68F/CJSzzH4PyBPnL9DmfvazFzZZxhPeLk+y1MUXTINLlwoMX/e+ch3D5H0S4dfnWAj4qoIm7D\nzUf5+dcnCXyiGLlvkOnvfk68/wlEpt5J+glXg+vgdxP0yYPS7oPJPvZSNj5zLQAdL36ADS/c7HVl\nWalZVqEq4oXtyMzM+s57Ew3Xs+rLj76z+O+zd6bxYW01T86pxnFdQqEwWZkZZORNo13nbi3/z9Ta\n1tE7Lw3T7HIfIpIBxJRSTuqxCQSUUpHWiLFV6XIfPytn3PsuB5x+CQBVm0o4MPopvznlQCZ+sIT/\nvr+OW4f24rjBvXf43IqaEL/7+/M8fvNFvPXFGmYnB7Dqq7ksnj2RjlkGdTHF+lqbdJ8QNL0en5nX\nDOS5ueUsW19FXczh4K7pXLSvy7ivk1RHYfFmh5uP8nP9mzHapQvr6xR9CwxO7GVx9aEBDLyChSc9\nH6E6qnCVVyokzeeVC1lX62IZBucOMLlgfx
/vf+uAQI9c4ZF5CfoWmKytVfToXMjktdkEar+lLqMb\nRn4XZPlsTuvr5/W1Fsa+JxNaNBuA7ANOAcCO1JGX5605qfx8FmeNOnOH70uyZhPXnrrPd7qhtvbc\nzDl8MPczjj7sYC46bY/YnUBroUFj/94q5T7eAU7A644CSAPeBHY+IqJpP4KBxQFq5njrLJKxGL88\n3duMqDaapE+hn4E92u/0ueNmfEJ12Xqenf4xI48/mJnj32TZxx/w3OiOHN83g4qQTY+/rsI20zii\na4JP1jkc+o+FuK4iaMKpfSz+92k9/52bGjoQ4YBik9+/GSM3ALYLFRFFddRh3gaHv34Qbxxadl3I\nTxdCCUWHTKGkzvuzKF2I2opZK21mrEhiiPDM8DTmlDh0yDKYtjyJUvDRujL8Zhl/H5rJre+sprp6\nPb86wM///SKdwvkRxi17j4z89rjVJcRWzEEMk2SoGqPQW3fSv2sRT/960A7fl62TaEFOxg7Pf7Fo\nGc+Nbs9l05dx88Wn7vA6be+xK8kiqJRqSBQopUIikt7SAETkFOBBwASeUEr9fbvzF+OVGdmQOvSw\nUuqJlr6utvf460VH7vD4b0/en99+z/MqakJMf38+j44s5LLp8xk77EgOaCcccGQmx/f1PvgKMy0G\ndzDYv4ufIzuaLK5xqEvrj5mew1GJjzmim4+wneTDbxNcMtjPc1/HGdjOAKU4oL3Be2sdBhQJRRkG\nmyOKDpkGZ+8XYNFml2lLE4zd32TKUptwUnFaH5N31ji4ylvtalqQdGF4P4tQUnFSbx9vr3HomiP0\nK87mrdU2BQGbnl07cHKvtby9OskFg7IZ2LM950ZLeWtzOv5e+xFcVYq9zxF0+eUoFv3vBlY/f/13\n3oftE8PWSfT6C07a/q1rnBzQr12AYb1jO71O23vsygrssIgMbnggIgcB0Za8eKor6xHgVGBfYLSI\n7LuDSycopQ5IfelEobWKbT/wDJ6d/jHvfbGcFxfGOeChMtrfuYr97t/IonKHunCMl7+Jcekgwb/x\nS3yr3uPkngY5aQYn9xTy0mB1tUPfAoMP19o8dEqQBZtcjuxicnxPH8sqHNbVupzW20QpxaxlcbL9\nivKwIjcoRJPw4ToHvwm2K/TIM+iQZVIXFyYuTjJqQpShL4T5ZL3N6mrF0k1h7HiYXJ/N9MX1BH0G\nZ+7rJzM1Iako02JIQQV186dw++GKmrmT+fqxaynM+u56lK0TA2ybRKe/P5/K2vA21zecHzPYSyxj\nBmfs8Dpt77IryeJaYJKIfCgiHwETgJYWyj8EWKmUWq2USuBt3Tq8hffUtCbt7APv6dt/xWfP3cmY\noUcxoDhIn24d6Fdokhbw0ynLYEj/AgZ2yeLcg3I4pLPF/r2KOaKLya8PSmNxucPSCkVhOtw3N07U\nVjz/dZIpS5PkpRl0zxGG9vPRJVtI98FNR/pZXe3ywMl+ijOF+jgohNcvTMN2YUWlQ7Zf0S5dKEj3\nVonvV+zjsaFBVmy2CZggKF78dBOvLokz6Zs4Jz1bx8GPlHPRdMXba4VzBwY446COXH10AdcMG8TM\nv41l1E2PNX6wb58Ylq8r56SrHuDYrmyTRLe2/ZTjwkxrh9dpe5ddWcE9X0T2AfqlDi1TSrV0o+tO\nwPqtHpcAh+7gulEicjSwHLhOKbV+B9doWrNt/4EHUFNdw/+bPJvLRg1p/AA946lvqQwrVtdEcFyX\nZxeUELEVn30rPD1XYVjfYqkkAcvAbwqZAYMbj7B4caFNfpqggHhScWB7kzdW2wx6LIyrhK7Z8NC8\nJOtqXa5/M0FWQAhaMKSbybSlNtlBwR8SauOKo7uZfLjOId2vWFOZ5KF5DoYBp/Xx8fZqm+IMg6XV\nBtnZ2TjRMJ89d2fjgsM/nujNdBraL8jxj79FRW14m+6l7buTrr9/AhKtAscb1xgzOINzJnpddA1d\nVDubct
xx03LdFbUX29Wd8vrhdRcFgcEiglJqXOuHtY3XgPFKqbiI/A54Fjhu+4tE5FLgUoD/3n0N\nlx7fa/tLNK3R9h94VfVRci2bl9/9nIygv/ED9HdH5HPvB7Wcs4/iov0D/OOjCB+WpZOenUtpVT2F\nHToS3bAUIyOPUG0FFw406ZhlcvkvTF5blqAmpiipddm/2CRgCROWCRYOZfUOdXHhr8cFuPr1GD4D\nXAVjD/CTnyYc28Ni7NQoZ+1r8YcjAzzzVZJxXyXomGWwZLPLrw/wc/exAR6en2BDyKAuWECPC/7C\nihe8RHHS1Q8wet8tyXDGkhDFaTYvv/UJ037Vkcumz2fYLw9g+vvzmXjOloTy8AereebMNO58L8zl\nRznbtBoaEsGO9gbR9n67MnX2DmAIXrKYiTfO8JFS6qwf/OIihwN3KqVOTj2+GUAp9bedXG8CVUqp\n7y9Eo6fOarug4bfwR4elc+mrYVylmHJ+DoWZFqW1SX75yFpmXtKFvu0CVIRszplYz6R7r+XUm56h\noj7OxvJKTDuG4JLhU1iicFyF3xQeOjXIVTNj2KnHoWB7fCdeT2TKbZzTzyU3KHyw1mZ5pctFg/z8\n6gAfSkx84vLkF3EAfjvYT01MMfbVKDcdGeAvH8aZcX46fQpMyiPC6FfiHNA9n4/ShlC39BOuPHU/\n7n9hFmnBANkZQWzHpa4+xAndDdJ88O9R3Xno43o+rMjml4V1XH+09+N033uVVFTXMXqgj6e+SvLq\nSoP8LK8ebcd2hTpJ/BwccVWrTJ09CxgEfKmU+lVq17znWxjafKCPiPTAm+10HnD+1heISAelVGnq\n4RnAkha+pqZtY+uumGP+f3t3HmVFeeZx/Pvrvr1gswsCQRQ3oogbbolmHbOoMSDRMSaeCJnkmA2N\nYyYzJk4SY05yzOSMk5NgUJPoEKMRo1E7qMEN1IlRaBFEjCghGkEWAWWRrZdn/qhqvLR9qaab7tt9\n+X3O6dN1q96q+7zc5j5V71v1vsPfYOGqRgb13hcANWzhs2NyzHh+E5ftV7XTmfacqV9LzuIv/ik1\nBC+ubaRP2VZ+fkY1X5qxlXNGV3DowDLOHV3B3S9s54zDKrjnxTWsveM/Obx/A02R44lXG5n6iV58\n6vbN3Prcdn777HZqqnI0NTXSqxyOGFyGBP2qxXmjK7h9UT0TDq8gApasbaShCcbu18Stc1ai8j+w\n/4jhzHh0Lg99+QC+MmMzv//JpUyb8Wc2LZ3L7Bff5OdnVLHmzU1cOLaG66e8zN+X9eLWhdt2JJSb\nJ1RTXVXJFacPZUGaFH1LrMHuJYstEdEkqUFSX2A1MKIjbx4RDZImAzNJbp29MSIWSboKqIuIWuAS\nSeOABmAdMKkj72mWr7mDd0dTzKHw22e2cOzPVpIrL2P1GxtpbGyiiTe5dVHjjv2a2+en3jkbbVnH\n+w7vw99e38In3l3BqH3L6F0pxo/KEQH/+t4K7nqhnv/7x3YuOH4Q9yzayJSzaph87xbOOaKK4/ev\n5sJjG7n+6XrGDBYvrQu21gc1FWLFpuDs6ZvZuA0G14j1W4O1W+DOv9YzuCbHpu0BuUpOOXoktf89\nmWtueQCWP72jD+La389i9pwFfHh4PWcdlmPUoBxL173FIf1786VTBsLw47nsgo/t2O/U0W9ftLds\nfrK92+4kizpJ/YFfAk+TPJz3l44GEBH30WJejIj4bt7yt4BvdfR9zFrTsqP75FFDmfz+9Tu+RHdl\nzZubuPPBvzDlzF5c/tBb9K6ELx5fyT2LGzn78ByDasSAavGuPuKCoyqYtmA7N81Zx2fHVDCgWnzw\ngDLWbG7g+dfrOXFYGTeXN/GLs/oy4bZN9K8u47HP92NQTRmLVm3nM3e8xdRP1jCgV9JK8MuntyXN\nRP167Wgiapn4Lhxbw/uvf4J/Ob43jy/Zymsb6rn1ue1s2BqQW0bfmuod
Sc+d1pZld+6G+mq6eJ2k\nPwF9I+LZzgnLrGt05Ety6p2zOW1EPSfvvw/9KrdywqHlHNRfzH65gXkrGrjpmXqG9BblgsaAzfXQ\nuwq+c+YIGrZt5qIT4eL7tzFw4GD+Mn8Znzumkn17wZmHlrNgZSOr32pk9eZGGpqCD43M8anpbzFi\nv+bpYXOMOWTnfoTWbmmtYju/nLOBvjXVUFUNQN+qd/ZBuD/CsmQmi/wH8VrbFhHz9mxIZl2nvV+S\nzVcVt4yrpKJM/GN98ozFo680AGU0NAWfP7aS736wiqoKsb0Bzp6+maOGJv0eL72xFQmOGQLH/fQV\nqtTILZ/ah43bmvjMURXc/UI9n7y9ngFpB3NDY47q6npmTvlmwT6E1hJfrro3h7tz2vaAtlxZ1AHP\nAWvS1/m95UErt7Galbrf3PsEHz2wESnH86/Xs19NGS+/2cjKTVBWFmxrhJvm1/O/8+spS0b/YGtD\nsGBlE7Nea/4yzwE5IjYy4chKBtSUUx/Qp7qcCaMrmbViH+pu/h4A19zyADMefHSXfQhOCNaZ2pIs\nLiO5E2oLyRPWd+WPEWW2N0rO4quZ9Vq6Qn0ZMACO3C+Z5+G11WvesU+h20/HfWMKj69aw+M79dzl\nOGBockdWa2NY+Q4lBWPGyQAAC9FJREFU62q785zFwSS3to4HXgF+FBHzOzG29vNzFlZCmu9UuuwD\n/bjmsbZ1vpu1yy6es2jz2FARsRS4h2RY8pOAUR2PzMx2xYP2WXeRmSwkHSzp25KeAr4PLACOiIjb\nOz06s72cB+2z7qItfRZLgGdJrio2AAcAX2mepjIirum06Mz2cn7+wbqLtiSLq0juegLo3WJb2zo8\nzKxdfIeTdReZySIiriy0TdKlezQaMzPrlnZn8qPWXJZdxMzMerqOJouCt1mZmVnp6GiycJ+Fmdle\noC1jQ22k9aQgoNcej8jMzLqdtnRw9+mKQMzMrPvqaDOUmZntBZwszMwsk5OFmZllcrIwM7NMThZm\nZpbJycLMzDI5WZiZWSYnCzMzy+RkYWZmmZwszMwsU9GThaTTJS2WtETS5a1sr5I0Pd3+lKSRXR+l\nmdnerajJQlI5cC1wBjAa+Iyk0S2KfQF4IyIOBf4H+HHXRmlmZsW+sjgJWBIRSyNiO3AbML5FmfHA\ntHT5DuA0NU8AbmZmXaLYyWI48Gre62XpulbLREQDsB7Yt+WBJF0kqU5S3Q3T7+ukcM3M9k6ZQ5T3\nFBFxA3ADAC8+EKxZXNyAzMxKSLGvLJYDI/Je75+ua7WMpBzQD1jbJdGZmRlQ/GQxFzhM0kGSKoHz\ngdoWZWqBienyucAjEeHpXM3MulBRm6EiokHSZGAmUA7cGBGLJF0F1EVELfBr4GZJS4B1JAnFzMy6\nkEryJN19FmZmu++UiwveaVrsZigzM+sBnCzMzCyTk4WZmWVysjAzs0xOFmZmlsnJwszMMjlZmJlZ\nJicLMzPL5GRhZmaZnCzMzCyTk4WZmWVysjAzs0xOFmZmlsnJwszMMjlZmJlZJicLMzPL5GRhZmaZ\nnCzMzCyTk4WZmWVysjAzs0xOFmZmlsnJwszMMjlZmJlZJicLMzPL5GRhZmaZipYsJA2U9KCkl9Lf\nAwqUa5Q0P/2p7eo4zcysuFcWlwMPR8RhwMPp69ZsiYhj059xXReemZk1K2ayGA9MS5enAWcXMRYz\nM9uFYiaLIRGxIl1eCQwpUK5aUp2kJyUVTCiSLkrL1d0w/b49HqyZ2d4s15kHl/QQMLSVTVfkv4iI\nkBQFDnNgRCyXdDDwiKSFEfG3loUi4gbgBgBefCBYs7hjwZuZ2Q6dmiwi4iOFtklaJWlYRKyQNAxY\nXeAYy9PfSyXNBo4D3pEszMys8xSzGaoWmJguTwTuaVlA0gBJVenyIOBU4Pkui9DMzIDiJourgY9K\negn4SPoaSSdI+lVa5gigTtICYBZw
dUQ4WZiZdTFFFOoq6MHcZ2FmtvtOuViFNvkJbjMzy+RkYWZm\nmZwszMwsU6feOls0uSqorCl2FGZmJaM0O7j3AEkXpQ/6lTTXs7S4nqWju9XRzVCFXVTsALqI61la\nXM/S0a3q6GRhZmaZnCzMzCyTk0Vh3aatsJO5nqXF9Swd3aqO7uA2M7NMvrIwM7NMThZmZpbJySIl\naaCkByW9lP4eUKBco6T56U9tV8fZXpJOl7RY0hJJ75jvXFKVpOnp9qckjez6KDuuDfWcJOn1vM/w\ni8WIsyMk3ShptaTnCmyXpJ+l/wbPShrb1THuCW2o54ckrc/7LL/b1TF2lKQRkmZJel7SIklfb6VM\n9/g8I8I/Sb/NfwGXp8uXAz8uUG5TsWNtR93KSSaMOhioBBYAo1uU+SpwXbp8PjC92HF3Uj0nAVOK\nHWsH6/kBYCzwXIHtZwL3AwLeAzxV7Jg7qZ4fAmYUO84O1nEYMDZd7gO82MrfbLf4PH1l8bbxwLR0\neRpQcL7vHugkYElELI2I7cBtJPXNl1//O4DTJBUcrribaks9e7yIeAxYt4si44HfROJJoH86G2WP\n0oZ69ngRsSIi5qXLG4G/AsNbFOsWn6eTxduGRMSKdHklMKRAuWpJdZKelNRTEspw4NW818t45x/k\njjIR0QCsB/btkuj2nLbUE+Cc9HL+Dkkjuia0LtXWf4dS8F5JCyTdL+nIYgfTEWnT73HAUy02dYvP\nszQHEixA0kPA0FY2XZH/IiJCUqF7ig+MiOWSDgYekbQwIjwneM/xR+B3EbFN0pdIrqb+qcgxWfvM\nI/n/uEnSmcDdwGFFjqldJPUG7gQujYgNxY6nNXtVsoiIjxTaJmmVpGERsSK9xFtd4BjL099LJc0m\nORPo7sliOZB/Br1/uq61Mssk5YB+wNquCW+PyaxnROTX6VckfVWlpi2fd4+X/6UaEfdJ+oWkQRGx\npphx7S5JFSSJ4paI+EMrRbrF5+lmqLfVAhPT5YnAPS0LSBogqSpdHgScCvSEOcHnAodJOkhSJUkH\ndss7ufLrfy7wSKS9az1IZj1btPWOI2kjLjW1wIXpXTTvAdbnNbGWDElDm/vVJJ1E8n3Wo05w0vh/\nDfw1Iq4pUKxbfJ571ZVFhquB2yV9AXgFOA9A0gnAlyPii8ARwPWSmkj+MK+OiG6fLCKiQdJkYCbJ\nHUM3RsQiSVcBdRFRS/IHe7OkJSSdiucXL+L2aWM9L5E0DmggqeekogXcTpJ+R3In0CBJy4DvARUA\nEXEdcB/JHTRLgM3A54sTace0oZ7nAl+R1ABsAc7vgSc4pwKfAxZKmp+u+zZwAHSvz9PDfZiZWSY3\nQ5mZWSYnCzMzy+RkYWZmmZwszMwsk5OFmZllcrIwM7NMThbWo0ja1Mq6KyWFpEPz1l2arjuhayME\nSROVDHX/kqSJeev/lI5jtEjSdZLK0/WTJF2ZLjcPu/2MkqHWH5N0Vivvcc6eqp+k/pK+mvf6XZLu\n6OhxrbQ4WVipWMjODxL+M7CoowdNhz7ZnfIDSR4eO5lkFNzv6e25Uc6LiGOAMcDgNMbWPB4Rx0XE\nu4FLgCmSTst7jz7A13nngHPtrUd/kiHqAYiI1yLi3LYe2/YOThZWKu4mHY5c0iEko+buGCNI0tR0\ntOBFkr6ft/5ESU+kZ/xzJPVJz/RrJT0CPJwOs/ATSc9JWijp07uI4+PAgxGxLiLeAB4EToedxjLK\nkcy3kflEbETMB64CJuet/gHwY2DrrvZtpR69JT0saV5aj+bh268GDlEygdBPJI1UOuGQpGpJN6Xl\nn5H04ayYrTR5uA8rFRuAVyWNIUka09l5WIQrImJd2vTzsKSjgRfScp+OiLmS+pIMGwHJpDtHp/uc\nAxwLHAMMAuZKeqzA+Dy7HE5a0kySK477SeYNaYt5wDfT/ccCIyLiXknfbMO++fXIARMiYkM6ttmT\n
SmZ7vBwYExHHpu8xMm//r5EMxHyUpMOBBySNiohdJiorPb6ysFJyG0lT1NnAXS22nSdpHvAMcCQw\nGng3sCIi5kJy5p/O5QHp1UG6/D6SYc0bI2IV8ChwYnsCjIiPk8yOVkXbh0ZvHiyvDLgG+MZuvGV+\nPQT8SNKzwEMkSazQvC3N3gf8No39BZJx00btxvtbiXCysFIyg2RQtn/kD18t6SDg34DTIuJo4F6g\nOuNYb7UzhrYMk76VZFTjts7idxzJ6Lh9SPo7Zkt6mWSKzdqMTu78elxA0ldyfHoVsYrsfwczwMnC\nSkhEbAb+A/hhi019Sb4010saApyRrl8MDJN0IiQdxwU6gh8HPi2pXNJgkrmh5xQIYybwMSXD2Q8A\nPgbMTPsLhqXvkwM+QdIMtktpc9l3gGsjYn1EDIqIkRExEngSGBcRdVnHSfUDVkdEfdr3cGC6fiNJ\nImrN4yRJBkmjSEZDXdzG97MS4j4L62n2SYerbrbTHAARcVvLHSJigaRnSL6cXwX+nK7fnnZW/1xS\nL5L+itYmyLoLeC+wgKRT+t8jYmVrwaV9Az8gmVsD4Kp03RCSq4AqkpO0WcB1Ber4/jTefUgm4bok\nIh4uUHZ33AL8UdJCoI40WUXEWkl/Tju17weuzdvnF8DUdJ8GYFJEbNsDsVgP4yHKzYpM0iRgZERc\nWeRQzApyM5SZmWVyM5RZO0g6Cri5xeptEXFyOw43H3i5A7F8nOS5i3x/j4gJ7T2mWUtuhjIzs0xu\nhjIzs0xOFmZmlsnJwszMMjlZmJlZpv8H1gkLTwQbRQIAAAAASUVORK5CYII=\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "SdHxjiiTMD7Y",
        "colab_type": "text"
      },
      "source": [
        "As well as random forests, extremely randomized trees<sup>1</sup> (Extra-Trees) can also be used. These work similarly to random forests except...\n",
        "\n",
        "\"*...instead of looking for the most discriminative thresholds, thresholds are drawn at random for each candidate feature and the best of these randomly-generated thresholds is picked as the splitting rule. This usually allows to reduce the variance of the model a bit more, at the expense of a slightly greater increase in bias*\"<sup>2</sup>.\n",
        "\n",
        "In this case it appears we are better served by the random forest.\n",
        "\n",
        "---\n",
        "1. Geurts, P., Ernst, D., & Wehenkel, L. (2006). Extremely randomized trees. Machine learning, 63(1), 3-42.\n",
        "2. https://scikit-learn.org/stable/modules/ensemble.html"
      ]
    },
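    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "To make the quoted difference concrete, here is a minimal sketch (using a made-up feature vector, independent of our EEG data) contrasting the exhaustive threshold search a random forest performs with the single random draw Extra-Trees uses per candidate feature:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "import numpy as np\n",
        "\n",
        "rng = np.random.default_rng(0)\n",
        "feature = np.sort(rng.uniform(0, 1, 20))  # hypothetical candidate feature values\n",
        "\n",
        "# Random forest: evaluate every midpoint between sorted values as a threshold\n",
        "exhaustive = (feature[:-1] + feature[1:]) / 2\n",
        "\n",
        "# Extra-Trees: draw a single random threshold for this feature instead;\n",
        "# the best of these random draws across features becomes the splitting rule\n",
        "random_threshold = rng.uniform(feature.min(), feature.max())\n",
        "\n",
        "print(len(exhaustive), 'candidate thresholds vs 1 random draw:', random_threshold)"
      ],
      "execution_count": 0,
      "outputs": []
    },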
    {
      "cell_type": "code",
      "metadata": {
        "id": "rChMSlnQ5COe",
        "colab_type": "code",
        "outputId": "a1d9e043-c3aa-4431-ada2-f72e2d34a53b",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 175
        }
      },
      "source": [
        "from sklearn.ensemble import ExtraTreesClassifier\n",
        "\n",
        "# create a forest classifier\n",
        "ETSC = ExtraTreesClassifier(criterion='gini',\n",
        "                            n_estimators=1000,\n",
        "                            max_features = 'sqrt',\n",
        "                            class_weight = 'balanced',\n",
        "                            random_state=RANDOM_STATE,\n",
        "                            n_jobs=-1)\n",
        "\n",
        "ETSC.fit(X_train, y_train)\n",
        "\n",
        "y_pred = ETSC.predict(X_val)\n",
        "\n",
        "display(pd.DataFrame(classification_report(y_val, y_pred, output_dict = True)))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>0.996283</td>\n",
              "      <td>0.857143</td>\n",
              "      <td>0.989399</td>\n",
              "      <td>0.926713</td>\n",
              "      <td>0.989891</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.992593</td>\n",
              "      <td>0.923077</td>\n",
              "      <td>0.989399</td>\n",
              "      <td>0.957835</td>\n",
              "      <td>0.989399</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.994434</td>\n",
              "      <td>0.888889</td>\n",
              "      <td>0.989399</td>\n",
              "      <td>0.941662</td>\n",
              "      <td>0.989586</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.989399</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    0.996283   0.857143  0.989399    0.926713      0.989891\n",
              "recall       0.992593   0.923077  0.989399    0.957835      0.989399\n",
              "f1-score     0.994434   0.888889  0.989399    0.941662      0.989586\n",
              "support    270.000000  13.000000  0.989399  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "lE3i5YyZYQ9b",
        "colab_type": "text"
      },
      "source": [
        "It is also worth noting that we have been dealing with the class imbalance in this data by using `class_weight = 'balanced'`, which assigns more importance to getting the ictal predictions correct. We could, however, also undersample using a balanced random forest. Which approach performs better generally depends on the amount of training data: with a small dataset like ours, class weighting tends to do better (as seen below), whereas on very large datasets undersampling will likely work better."
      ]
    },
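    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "To see what `class_weight = 'balanced'` actually does, here is a minimal sketch (using a hypothetical label vector with the same 270/13 split as the validation report above) of scikit-learn's weighting formula, `n_samples / (n_classes * class_counts)`:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "import numpy as np\n",
        "\n",
        "# Hypothetical labels: 270 interictal (0) vs 13 ictal (1) samples\n",
        "y = np.array([0] * 270 + [1] * 13)\n",
        "\n",
        "# 'balanced' weights each class inversely to its frequency\n",
        "classes, counts = np.unique(y, return_counts=True)\n",
        "weights = len(y) / (len(classes) * counts)\n",
        "\n",
        "print(dict(zip(classes, weights)))  # the rare ictal class gets ~21x the weight"
      ],
      "execution_count": 0,
      "outputs": []
    },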
    {
      "cell_type": "code",
      "metadata": {
        "id": "I47oe_3nYjLd",
        "colab_type": "code",
        "outputId": "0bf9d1ff-41f9-4b3b-ec76-f350e4327bd1",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 175
        }
      },
      "source": [
        "from imblearn.ensemble import BalancedRandomForestClassifier\n",
        "\n",
        "bal_forest = BalancedRandomForestClassifier(criterion='gini',\n",
        "                                            n_estimators=1000,\n",
        "                                            max_features = 'sqrt',\n",
        "                                            random_state=RANDOM_STATE,\n",
        "                                            n_jobs=-1\n",
        "                                            )\n",
        "bal_forest.fit(X_train, y_train)\n",
        "\n",
        "y_pred = bal_forest.predict(X_val)\n",
        "\n",
        "display(pd.DataFrame(classification_report(y_val, y_pred , output_dict = True)))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.520000</td>\n",
              "      <td>0.957597</td>\n",
              "      <td>0.760000</td>\n",
              "      <td>0.977951</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.955556</td>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.957597</td>\n",
              "      <td>0.977778</td>\n",
              "      <td>0.957597</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.977273</td>\n",
              "      <td>0.684211</td>\n",
              "      <td>0.957597</td>\n",
              "      <td>0.830742</td>\n",
              "      <td>0.963811</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.957597</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    1.000000   0.520000  0.957597    0.760000      0.977951\n",
              "recall       0.955556   1.000000  0.957597    0.977778      0.957597\n",
              "f1-score     0.977273   0.684211  0.957597    0.830742      0.963811\n",
              "support    270.000000  13.000000  0.957597  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Un_i8to47Bvo",
        "colab_type": "text"
      },
      "source": [
        "## Majority Voting\n",
        "\n",
        "As we have seen from bagging, a group of classifiers don't have to all be descision trees. Indeed Scikitlearn has a VotingClassifier where multipule classification pipelines can be combined to create an even better classifier that aggregates predictions. This aggregation can be done by simply selecting the class label that has been predicted by the majority of the classifiers (more than 50% of votes) for 'hard voting'. Majority vote refers to binary class decisions but can be generalized to a multi-class setting using 'plurality voting'. Particular classifiers return the probability of a predicted class label via the predict_proba method and this can be used for 'soft voting' instead of class labels<sup>1</sup>.\n",
        "\n",
        "Ensemble methods work best when the predictors are as independent as possible, so one way of achiving this is to get diverse classifiers. This increases the chance they each make different types of errors which in combination will improve the overall accuracy<sup>2</sup>.\n",
        "\n",
        "As can be seen below, the soft majority voter scores better than the hard voting method, and better than most of the other methods individually when all are on their default settings. Soft voting often achieves higher performance than hard voting because highly confident votes are given more weight<sup>2</sup>.\n",
        "\n",
        "**NOTE**\n",
        "- with some hyper-parameter optimisation it's likely we could increase the performance of the soft majority vote.\n",
        "\n",
        "---\n",
        "1. Raschka, Sebastian, and Vahid Mirjalili. Python Machine Learning, 2nd Ed. Packt Publishing, 2017\n",
        "2. Géron, A. (2017). Hands-on machine learning with Scikit-Learn and TensorFlow: concepts, tools, and techniques to build intelligent systems. \" O'Reilly Media, Inc.\"."
      ]
    },
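The difference between the two aggregation rules can be sketched with hypothetical probabilities (illustrative numbers, not output from the classifiers trained below):

```python
# Minimal sketch of hard vs soft voting for one sample, using
# hypothetical P(class 1) values from three binary classifiers.
import numpy as np

proba = np.array([0.9, 0.45, 0.45])  # one confident, two uncertain

# Hard voting: each classifier casts one vote for its predicted label
votes = (proba > 0.5).astype(int)        # array([1, 0, 0])
hard_pred = np.bincount(votes).argmax()  # majority label -> 0

# Soft voting: average the probabilities, then threshold
soft_pred = int(proba.mean() > 0.5)      # mean 0.6 -> 1

print(hard_pred, soft_pred)  # 0 1
```

The confident classifier is outvoted under hard voting but dominates under soft voting, which is why confident votes carry more weight.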
    {
      "cell_type": "code",
      "metadata": {
        "id": "voiasorB7hDn",
        "colab_type": "code",
        "outputId": "b69da767-92c8-4091-f980-25577047d534",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 474
        }
      },
      "source": [
        "%%time\n",
        "from sklearn.preprocessing import StandardScaler\n",
        "from sklearn.ensemble import VotingClassifier\n",
        "from sklearn.pipeline import Pipeline\n",
        "from sklearn.linear_model import LogisticRegression\n",
        "from sklearn.svm import SVC\n",
        "from sklearn.tree import DecisionTreeClassifier\n",
        "from sklearn.model_selection import cross_val_score\n",
        "from imblearn.under_sampling import NeighbourhoodCleaningRule\n",
        "from sklearn.decomposition import PCA\n",
        "import timeit\n",
        "from sklearn.model_selection import StratifiedKFold\n",
        "from sklearn.metrics import confusion_matrix, precision_score, recall_score, f1_score, accuracy_score, make_scorer\n",
        "\n",
        "clf1 = Pipeline([('scl', StandardScaler()),\n",
        "                 ('clf', SVC(kernel='rbf', \n",
        "                             gamma='auto',\n",
        "                             random_state=RANDOM_STATE, \n",
        "                             probability = True))])\n",
        "\n",
        "clf2 = Pipeline([('scl', StandardScaler()),\n",
        "                 ('clf', LogisticRegression(solver='liblinear',\n",
        "                                            random_state=RANDOM_STATE))\n",
        "])\n",
        "\n",
        "clf3 = DecisionTreeClassifier(random_state=RANDOM_STATE)\n",
        "\n",
        "clf_labels = ['SVM', # Support Vector Machine\n",
        "              'LR', # LogisticRegression\n",
        "              'DT'] # Decision Tree\n",
        "\n",
        "# Majority Rule Voting\n",
        "hard_mv_clf = VotingClassifier(estimators=[(clf_labels[0],clf1),\n",
        "                                           (clf_labels[1],clf2),\n",
        "                                           (clf_labels[2],clf3)],\n",
        "                               voting='hard')\n",
        "\n",
        "soft_mv_clf = VotingClassifier(estimators=[(clf_labels[0],clf1),\n",
        "                                           (clf_labels[1],clf2),\n",
        "                                           (clf_labels[2],clf3)],\n",
        "                               voting='soft')\n",
        "\n",
        "clf_labels += ['Hard Majority Voting', 'Soft Majority Voting']\n",
        "all_clf = [clf1, clf2, clf3, hard_mv_clf, soft_mv_clf]\n",
        "\n",
        "print(color.BOLD+color.UNDERLINE+'Validation Scores\\n'+color.END)\n",
        "for clf, label in zip(all_clf, clf_labels):\n",
        "    start = timeit.default_timer() # TIME STUFF\n",
        "    \n",
        "    clf.fit(X_train, y_train)\n",
        "\n",
        "    y_pred = clf.predict(X_val)\n",
        "    scores = f1_score(y_val, y_pred)\n",
        "    print(color.BOLD+label+color.END)\n",
        "    print(\"Score: %0.3f\"\n",
        "          % scores)\n",
        "    # TIME STUFF\n",
        "    stop = timeit.default_timer()\n",
        "    print(\"Run time:\", np.round((stop-start)/60,2),\"minutes\")\n",
        "    print()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "\u001b[1m\u001b[4mValidation Scores\n",
            "\u001b[0m\n",
            "\u001b[1mSVM\u001b[0m\n",
            "Score: 0.889\n",
            "Run time: 0.12 minutes\n",
            "\n",
            "\u001b[1mLR\u001b[0m\n",
            "Score: 0.897\n",
            "Run time: 0.02 minutes\n",
            "\n",
            "\u001b[1mDT\u001b[0m\n",
            "Score: 0.727\n",
            "Run time: 0.03 minutes\n",
            "\n",
            "\u001b[1mHard Majority Voting\u001b[0m\n",
            "Score: 0.857\n",
            "Run time: 0.18 minutes\n",
            "\n",
            "\u001b[1mSoft Majority Voting\u001b[0m\n",
            "Score: 0.897\n",
            "Run time: 0.18 minutes\n",
            "\n",
            "CPU times: user 31.5 s, sys: 236 ms, total: 31.7 s\n",
            "Wall time: 31.6 s\n"
          ],
          "name": "stdout"
        }
      ]
    },
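VotingClassifier also accepts a `weights` parameter that scales each member's contribution to the vote; a minimal sketch on synthetic data (illustrative weights, not tuned for this tutorial):

```python
# Minimal sketch (synthetic data, hypothetical weights): weighting the
# members of a soft-voting ensemble via VotingClassifier's `weights`.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

weighted_clf = VotingClassifier(
    estimators=[('lr', LogisticRegression(solver='liblinear', random_state=0)),
                ('dt', DecisionTreeClassifier(random_state=0))],
    voting='soft',
    weights=[2, 1])  # trust the logistic regression twice as much

weighted_clf.fit(X, y)
print(weighted_clf.score(X, y))  # training accuracy
```

With soft voting the weights rescale each member's predicted probabilities before averaging, so a well-calibrated member can be favoured without discarding the others.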
    {
      "cell_type": "code",
      "metadata": {
        "id": "ZIRQU_s17yq_",
        "colab_type": "code",
        "outputId": "2e3abff2-ad05-4b6a-8e8c-7840a7abb4b7",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 317
        }
      },
      "source": [
        "%%time\n",
        "from sklearn.metrics import roc_curve\n",
        "from sklearn.metrics import auc\n",
        "\n",
        "# remove the hard voting classifier because it doesn't have predict_proba\n",
        "del clf_labels[3], all_clf[3]\n",
        "\n",
        "colors = ['black', 'orange', 'blue', 'green']\n",
        "linestyles = [':', '--', '-.', '-']\n",
        "for clf, label, clr, ls \\\n",
        "        in zip(all_clf,\n",
        "               clf_labels, colors, linestyles):\n",
        "\n",
        "    # assuming the label of the positive class is 1\n",
        "    y_pred = clf.fit(X_train,\n",
        "                     y_train).predict_proba(X_test)[:, 1]\n",
        "    fpr, tpr, thresholds = roc_curve(y_true=y_test,\n",
        "                                     y_score=y_pred)\n",
        "    roc_auc = auc(x=fpr, y=tpr)\n",
        "    plt.plot(fpr, tpr,\n",
        "             color=clr,\n",
        "             linestyle=ls,\n",
        "             label='%s (auc = %0.2f)' % (label, roc_auc))\n",
        "\n",
        "plt.legend(loc='lower right')\n",
        "plt.plot([0, 1], [0, 1],\n",
        "         linestyle='--',\n",
        "         color='gray',\n",
        "         linewidth=2)\n",
        "\n",
        "plt.xlim([-0.1, 1.1])\n",
        "plt.ylim([-0.1, 1.1])\n",
        "plt.grid(alpha=0.5)\n",
        "plt.xlabel('False positive rate (FPR)')\n",
        "plt.ylabel('True positive rate (TPR)')\n",
        "\n",
        "#plt.savefig(os.path.join(IMAGE_DIR, 'Pipeline_Rocs.png'), dpi=300)\n",
        "plt.show()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAEGCAYAAABo25JHAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjMsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+AADFEAAAgAElEQVR4nOzdeXxU1f34/9eZyb4nhJAVQiAgISQY\ngmwFgqwqxLphbW2ltdJarbZ+/Fb7a3/Wtp9+PrRaW1u1FluVViv4rQJhFUFAREQWAyUgWwhk3xOy\nzX6+f9xkyDoZIJOZJOf5eMyD3HvuzH1fAuc999yzCCkliqIoytClc3cAiqIoinupRKAoijLEqUSg\nKIoyxKlEoCiKMsSpRKAoijLEebk7gKsVGRkpExMT++18JpMJHx+ffjtff1PXN3AN5msDdX197ciR\nI1VSyuHdlQ24RJCYmMjhw4f77XwFBQX0Z+Lpb+r6Bq7BfG2grq+vCSEu9lSmmoYURVGGOJUIFEVR\nhjiVCBRFUYY4lQgURVGGOJUIFEVRhjiVCBRFUYY4lQgURVGGOJUIFEVRhjiVCBRFUYY4lQgURVGG\nOJUIFEVRhjiVCBRFUYY4lQgURVGGOJclAiHE60KICiHEiR7KhRDiT0KIc0KI40KIDFfFoiiKovTM\nlXcEbwJLHJTfAiS3vlYCf3FhLIqiKEoPXLYegZTyYyFEooNDbgf+IaWUwGdCiDAhRIyUstRVMSlX\n75VXXmHTpk1s27YNgBdffJFdu3aRk5MDwPPPP8+BAwd47733AFi1ahW5ubmsXbsWgF//+tecPn2a\nt956C4BnnnmGwsJC3njjDQB++tOfUl1dzerVqwF48sknaWlp4eWXX2b1kdX8av2vABg7diwAZ8+e\nRafTMWbMGADOnDmDl7cXSaOTAPjy9Jf4+frZ53k/ceIEwcHBjBo1CoCTJ08SFBTEyJEjAcjLyyMk\nJISEhAT78WFhYcTHxwNw/D/HiRoWTnRkEAD5+ecJCw0jYtgw8Aoi93gesdGRREUEIqUkPz+fiIgI\nwsPDsekCOX7iJAmxUQwL88dqtVJQUEBkZCShoaGY8Sfv5JeMio8mPMQXi8XMxYuXiIqKIjg4GKPN\nj1Nfnmb0yBhCg3wwmUwUFhYSPWIEgUFB1DVJCi5eYmxiLEEB3hgMBoqLi4mJiSEgIIBGkzfnzp1j\nXFI8AX56mpubKS0tJS4uDj8/Py4b9OTn53PD2AT8fHQ0NTZSVl5OQkICPj6+1DZJLl68SMq4kfh4\nCRoaGqioqGDUqJF4eftSfdlCYWEhqTeMwksH9fX1VFVVkZiYiN7Lh4paAyUlJaRNGI1O2KitraWm\npoakpCSEzpuy6ibKysqYPDEJpJWa6mrq6utIShoDOm8uFldTf7metBtGgbRRVVVFY2MDiYmjQedN\nYWktly9fZmJyHCCpqKigxdDCqJGjQOfDxeIqmpubmTAmGoCysjLMZrP2u9b5kl9YhsVsZlxiFAAl\npSVImyQuLg70fpy7UIwQkjEJkQAUFxcjdILYmFjQ+3Pm/CW8vfWMjosAoLCwEG9vb6Kjo0EfwKkz\n+QQG+DEyJgyAi5cu4usTQGjoCHysw0iOH8nria9f33/QPuLOhWnigMJ220Wt+7okAiHESrS7BuLi\n4igoKOiP+ACorq7ut3O5Q2/XV11dTUtLi/3vvKamhubm5h63a2traWpq6rDd2Nho366rq+uwXV9f\nT0NDQ4dtg8FAQUEBrx96nUp9JRGmCAwGAwAWqwWd1HXYFkLYt61WKxaLxb4tpeywbbVZHW93er/N\nakNYm6AuH4CkCIAmqCvGFHADNpsNvbUR6s4jgDHD2soLMflPwGazobPWQ91Z9G3lsgnqwOyXor3f\nUgN1lXi1lVsvQB2YfCdq5eZq
qKvEp63ckg91YBXjtc83VYKpCr+2ctM5MIFJr71fZywFQw0BbeWG\nM2AQGEVrfIZiaK4lsK28+Utkixcm21hsNhui+RLY6gkGgocBjaeQwgeTJVErb7wAtgZCgdBhQEMe\nUueP2RyHzWaDhrNgayJcQPgwoP4/2PRBmM1R2Gw2ZP2XCFsLEXqIiADqjmPzCsVqDcFmtSHrTiKk\niUgviAzTyq1e4VgswVhtVmT9CYS0EuUD+LSWe0disfhhtVqh7j+AJNoP8APqarH6jMBq0beWHwcg\n1r/1H31dNRafWKxWG3phs5fHBbaVV2HxTcBiteCtt2Kr+Q9mqzcxAVa89FaMlXUUN03AZNYTZDNS\nU1TI2bJkTJZJhOmDmNFyA8O9AygOu9ivdZkjQvtC7qIP1+4INkspU7sp2wysklJ+0rq9C3hKSulw\n+bHMzEypVijrOz1d34IFCwDYuXNnP0d0RdabWQDsWbHnmj+jT35/plqoP9l1f2gK+ISDoQoaTnct\nD5sE3iHQUg6N57qWh08Gr0BoLoGmC13LI6aA3g+aCqH5UpfigoYoEpOSobEAWoq7vj9yBggdNJwH\nQ1mnQgHDZ2o/Xj4DxspOxV4QOU37uf5LMHX6wqDzhWGZ2s91eWCu61iuD4CIG7Wfa4+DpaFjuVcw\nhKdpP9d8AdbmjuXeYRTUBWq/u+rDYDN2LPcZBqE3aD9XHQRp6VjuGwUhydrPlZ8Cneo5/xgISgJp\ng6oDdGb1jcfiOwpfbzO154+Qsz2C8kofyiq8Kav0oawqmLJKP8rKJLW1AoC//O48319RRu5/Arlx\n/mQ2bIDbb2vi8O5zvPBqLCMTD+MffASEFW8vf+YvyGLatJu6nNtVhBBHpJSZ3ZW5846gGEhotx3f\nuk/xAPfee6+7Q/AMhkqo2APD54D/iO6P8YvUXj3xH9HzewECYrVXTwITtFdnTQXan0GJ2qsnwWO0\nV09CxgHjei5vq3B7EjbRcXlbhd+TtoTRWV2B9uewbuuuK9oSVk9aE56UUFcHZWXaq7wcysp0lJXN\noqwM5s6Fb38b6ushIhyeew6eeMKbKqaz4jHto4KCYMQIiI6GlBS4+WZh354zZwwMH8OkuWAwgK8v\nQCCZC9P510LYulVy6JCV9PR0Fi1aREVFheO4+5E7E0EO8KgQYi0wDahXzwc8x0MPPeTuEDzD5S/h\nk+Vw807HlbniNk1NWqVuscC41nz229/CqFHwta9p+8eM0Sp/k6nr+729tYp89GhtOyQEfv5zmD5d\n205MhHPntAQQFNR7PHq99rJYLNTX1zNs2DAA5s+fz/jx4+3PtzyJyxKBEOIdIAuIFEIUAb8AvAGk\nlK8CW4FbgXNAM/BtV8WiKMrAYjJBba2Otla9deu0yrjt23z7V2OjdkxWFuzerf28Zg3MmKElAi8v\nuPVWrYKPjr7yavsmHx4OQlw5txDwy19e2fb21hLJ1SgsLCQnJweLxcLDDz+Mj48Pvr6+HpkEwLW9\nhu7rpVwCj7jq/Mr1ycrKAmDPnj1ujUMZPKxWqK7uvjIvL9eabv71L+3Y7GwoLR3BsWPa9m9/C198\noVXabRV5ZmbHir19HXviBOjadY7/Sz91TjeZTHz00UccPHgQgGHDhtHQ0GC/K/BU7mwaUjzYihUr\n3B2CMgC0tbtr7e0we7bWLLJ+PXz0Efz5z9pxDz6ofUu3Wrt+RkAAxMRA+2f6Dz8MFy9eBoYD8MEH\n2jd6rd29dzo3zJmQn5/Ppk2bqKurQwjBrFmzmDt3Ll5enl/Nen6EiluoRDC0GQxQUgIjR2pNK/v3\nw44d3X+bb9/uXlamNbmcPAk5OfDii1qlfPPNEBvbfdNMd+3ut98OBQVNtCWC4cP757qv1c6dO9m/\nfz8A0dHRZGdnExMT4+aonKcSgdIts9kMgLe3t5sjcbPwdFh0AEImuDuS62YyQUVF514zHbdfeg
nS\n0mDtWq0HTX6+9hB1/3749a8hKupKBX7DDR0r9OhoCA3VzvWzn2mvNt/4hnuuub/ExcWh1+uZO3cu\nM2fORK/Xuzukq6ISgdKthQsXAuoZAd4hEDnd3VH0yGaDqqorlXpystbEcv48PPMMPPkk3HgjvP8+\n3HVX958RFnalIm/7dj97NrzxhtYmD/D44/DEE9rdgQKNjY1cvHiRiRO1rrMTJkzgscceIyQkxM2R\nXRv1a1W69d3vftfdIXiGljIo2QoxSxz39e9jViucPXulki4v15pZ2n+DLy6Op7q6Y7v7iy/CY49p\nXSY/+wwqW8eJTZqk9YTp3DQzYgT4+XU9/5gxHR++Ots2P9hJKTl+/Djbt2/HZDIxbNgwbUoJGLBJ\nAFQiUHpw//33uzsEz9BwFg4+qI0juM5E0NzctTmmczPNsmVak4rRCBMmwP/8D/z0p9r2c89dqcTj\n4iA5uYXk5OAOlfv48dq5xo/X7graJCdrdwjKtaurq2PLli2cO6eNEh8zZgx+3WXRAUglAqVbzc3a\nkP+AgAA3R+LZzOaO7e6+vtA6Owff+542wOm//kv71h4crDXltCeE9iC0rSKP0OYvIyBAa6efPFnb\nTkjQkkH73jAFBdUkJga7/iKHOCklhw4dYteuXZhMJvz8/Fi8eDHp6emI9gMQBjCVCJRu3XrrrcDQ\nfkZw7hxcLgikbaGMVau0/untv8l3nrNvxowriaC8HCJbZ57Q6+GFF7SHqe2bZoYP77ndvf0sH0J0\nHPSk9J9du3bZewSlpKRwyy23EOTMEOMBRCUCpVsPP/ywu0Pocw0NWpdIR71mAI4c0f584gm4lD+O\n3P9P2/7wQygo0CrwceNgzpyuvWbi4q6cb8OGjud//HGXX6LiAlOnTuXUqVMsWLCACRMGfu+x7qhE\noHRroEw619LStUL/znfAxwf+9jd4440RtH6ZY+VKrbmlPS+vKxV5TAy0LkMAwC9+AZaKc1Crbe/a\n1T/XpLhXaWkphw8fZunSpQghCA0N5ZFHHkHnjlFq/UQlAqVb9fX1AIS2dQx3A5NJ+xbf0gL+/rBp\nE7z1VsdK//Llru+77TatTb2tOcVi0Sr8lSth6dKOTTMRET2PQp0yBbAkQ+MJCBzl0mtV3M9isbB3\n717279+PlJLY2FimTJkCMKiTAKhEoPTg9ttvB/r2GUFzszZAqaceM22v9evhK1/Rpi44dQouXtQG\nL5WUwLFjWiU+eXLHrpCdu0WCNq3B/PnleHklAjBv3jUE7RXY+zTLyoB36dIlcnJy7As13XTTTUya\nNMnNUfUflQiUbj322GO9HiOl9o28c0U+e7Y2iOnUKbj/fu0h6dy52hQFd9zR8TP8/a9U4MnJWgJo\nG8QUHq59Tusqk3zve9qrXzUXw6V/Q8IdEDiyn0+uuJrRaGTXrl0cOnQIgMjISLKzs+1Llw4VKhEo\nXdhscOedd2I2aw88x43TFuEoKoIf/rBjpd+6omMHv/udVoEHBWkVfFuvmGnT4N13Oz5gDQ7uuTeM\nt7f28vfvvrxfNObD0R9BWKpKBIPQsWPHOHToEDqdjlmzZjFnzpwBMUlcXxt6VzyESalN5du+SebM\nmQiamzs20dx/Pzz7bBVmM9x5ZyS/+IX24NTHRxukNGKE9s298+RhnfvCJyTAli1Xzh8TA/fc455r\nV5Q2Ukp7///MzExKS0uZNm2afYTwUKQSwQBms0FNTdemmbAwrX0cYNEi7dv8H/+obc+cqQ1MahMc\nHERMjFaBp6drx2dlwd133w3A0aN7GNn6RTgqCo4f77/rU5S+dvLkSXbv3s23vvUtgoOD0el09udh\nQ5lKBB5GSu2hamCgtr17N9TWwp13atsPPwyHDl35Bm+xdP2M6dOvJIK0tCvzvAuhNfWEhV2ZZ6ai\n4lK3i7t7e/8XcGVkq6IMZA0NDWzbto1Tp04BcPjwYeZdU+
+BwUklgn526hScOeO410xAgDajJMAr\nr2ijWdsSgdWqVeBpaT33mgluN+vA8893PP+SJc7FuWzZsuu/WEVxMyklubm57NixA4PBgI+PDwsW\nLCAzM9PdoXkUlQiug8Wize7Y3SRi//M/2rf63/5Wa5YpKdG+ka9aBf/4x5XPiIy8UoHPmqVV8jEx\n2p2BEPCnP2kPTNusXt0/11ZWVgYwpNtNARg2FW6/CH5R7o5EuUp1dXVs2rSJ/Px8AMaOHcvSpUvd\nOjbGU6lE4MD9f7yfT6oOoNeHERPvz4m8EwQFxFNXF4bJrMNs6X6QiV4nObC6nKrKS+i84tE/EMqs\n1d6cPHWSmHHRTHnOC53OTFVVEcMiwgkMDKJRF86mU6cYExtJsL8X7//ZRGlZGZGRkQQEBNJsC+bM\nmTMkJw4n0FeP0WSkvLycqOHD8fMPpNESwLlz5xg/Ogp/Hx0GQwsVlZWMGDECX78ALht9yc/PZ8KY\nEfh6CZpbmqmqqtI+PzCE2ma9Nr/62BF46wWXCi8BMDJhJOi8wbd1zVVjJdg6rTeo9wWf1j6fhgqQ\nnWZW0/uBT1hreevitO15+YN363/OljL77tzaS0wOHwlVByFyGtjMcOGfXf/CwydDRAZYWuDiOx2K\ngqqqIOwWCJsE5staV9DOImdA6AQwVkPRxq7lUXO0a1AGlKamJi5cuIC/vz9Llixh0qRJg2aSuL6m\nEoEDWy5uoSWoBWtxNOERWmO8r64FPz2E+Jrw8bry8gpPJr/gPMmxgmBdBZhhZBjAWQDMYhoAIT7V\n+NuqwNZabmuGBiBUK/ex1UJDDd60llsK4bIOgrQRjj6WajDV4ttWbr4EVm/wSwPA21IBxnr82sqN\nBWDxA29tjhQvUym0NBDQ/vObAkFok897GYvB2tQaO9BwWquk2xJB4wWwtnT8i/KJuJIIGs+DzdSx\n3Hf4lUTQcBZkp0TiF30lETSctu+e7AVf5zRcXNuaCEzalNCdpf7/WiIwX+5SHgkQLLREYKjo/v1T\nX9ESQdOl7stn/BOCx3bdr3ichoYGglvbRuPi4rjjjjtISkoisO2hm9ItlQgcSE9PR1YeIGdmEiGL\nXkfovbXKxlTX9WD/WNB5gakezPVdywPiQejAVAvmhm7KW+dEMNaApbFToYDA1gEuhiqwNncq1mmf\nD2Co7FpRC68rc+m3lIPtSrehwqJCEkYmgX/r+qotZV0rcp0v+LcO120uAdnpCbXe70rTSXNx14pe\nHwB+rdNwNhUCne8IAq8kmqZLdOHd+tBD768103Qpb10QxDeyS3lhUSEJSa0jRANHdf/+tiQWOrH7\n8rbYFI9ltVrZv38/H3/8McuXL2fcuHEAQ2p08PVQiaAXUgqCgvVaEgCt0vF2sBKRT6j26rE8/ErF\n0x3fCO3Vk7YKtcfyXlb5bqvQW1l9bVeSAIB/L88EelucJSDOcXlgLyM2HQ3aEjrH5Tp9l3Krr+3K\n70vn7fj9eh81aGwAKi0tZePGjZS3Th9bVFRkTwSKc1QicKCkpISLF25k5b5v8/evuDsaRVHaM5vN\n7N27l08//RQpJWFhYSxbtoykpCR3hzbgqETgQFV1FTYZic5f9RhRFE9SWVnJ2rVrqampAWD69OnM\nmzcPHx8fN0c2MKlE4EDapDQ+/VQiwgbnYhSKMlCFhIRgsVgYPnw42dnZxLdfSEK5aioR9ELS83z1\niqL0n/PnzzNy5Ei8vb3x9fXlm9/8JmFhYUNykri+pqo4B4qKisBmQVTscXcoijJkNTc3s379et56\n6y12795t3x8ZGamSQB9xKhEIIUKEEOOFEFfVpUIIsUQIcVoIcU4I8XQ35SOFELuFEF8IIY4LIW69\nms93tbo6rZuoTnQzoY+iKC4lpSQvL4+XX36Z48eP4+XlZR8joPStHtOpECIYeBj4OhAEVAF+Qohh\nwCfAK1LKfQ7erwdeBh
YCRcAhIUSOlPJku8N+DrwrpfyLECIF2AokXt8l9Z3U1FT2f2Lpcb58RVFc\no6Ghgd27d1NYWAjAqFGjyM7OJiLCQddq5Zo5uq9aD7wNzJdSVrftFELogKnAN4UQyVLK13t4/03A\nOSllfuv71gK3A+0TgQTaOuWHAiXXdBUuphO23g9SFKVP1NXV8eqrr2I0GvH19WXhwoVkZGSo6SFc\nqMdEIKVc0MN+G3Cw9eVIHFDYbrsImNbpmGeBHUKIHwKBQLfnFEKsBFaCNmy8oKCgl1P3jQsXLiBl\nAlarqd/O2d/a1mgdrAbz9Q3Wa5NSEhUVhdFoZM6cOQQGBnLxYjcjvgc4T/r9XfWTFiHEGOBJKeXD\nfXD++4A3pZS/F0LMAP4phEhtTTZ2UsrVwGqAzMxM2d38+a7Q3NxMaGAl8zKCup2zf7AYzNcGg/v6\nBsO12Ww2Dh48SFJSEiNGaCPf4+PjKSoqYvTo0W6OzrU85ffX48NiIUSqEGKrECJXCPGsEGKEEGId\nsA/Id+Kzi4H28wnEt+5r70HgXQAp5QHAj9Z5wjzBxIkTSU4J5+5HstwdiqIMShUVFbz++uvs2LGD\nnJwcZOvMtN7e3qopqB85uiP4W+vrALAEyAXeAcZIKVscvK/NISBZCDEaLQF8De3Bc3uXgPnAm0KI\nCWiJoPKqrsDFpE1itdjQe6metorSV6xWK5988gkff/wxNpuN4OBg5syZoyp/N3GUCPyklH9r/TlP\nCPGolPIJZz9YSmkRQjwKfADogdellHlCiF8Bh6WUOcB/Aa8JIX6M9uB4hZSdJ6t3n0uXLnGxIJ6f\nfrqN3711m7vDUZRBobi4mJycHCoqKgCYMmUKCxYswM9PrfngLg4TgRBiEtCWog3tt6WUvS5jLqXc\nitYltP2+Z9r9fBKYdbVB95fGxkbih11k8cgTgEoEinK9jEYj//znPzEajYSHh5Odne0x7eRDmaNE\nUAW80sO2BOa4KihPkZKSgqw8wPzMk70frChKr3x9fZk/fz61tbXMmzcP7/brsCpu46j7qJp4GTCY\nfalrCCCs90MVRenEYDCwc+dOoqOj7QvGT5061c1RKZ05Glk8BvgtMBb4D/ATKWVpfwXmCS5evEhB\nwY08d/oWfrPI3dEoysBy5swZNm/eTENDA/7+/qSnp6s7AA/lqGnoDbReQr8AsoE/A3f3R1CeorlZ\nWxJSFzrGzZEoysDR1NTE9u3bOXHiBKANAs3OzlZJwIM5SgQhUsq/tP6cJ4Q42h8BeZIJEyZQUQEi\nbKK7Q1EUjyel5MSJE2zfvp3m5ma8vLy4+eabmTZtGjo1l7tHu5peQ/5X22tosBDSBKiVjxTFESkl\nBw4coLm5mdGjR7Ns2TLCwx2sz614DEeJoJIh3mvoQkEBkIiudAtwh5ujURTPI6XEbDbj4+ODTqcj\nOzubkpISbrzxRjU4bABxlAiekFIe6rdIPJDRYARACI8Z46YoHqOmpoZNmzYREBDAPffcA0B0dDTR\n0dFujky5Wo4SwV+BjP4KxBONHz+e8nLQqUSgKHY2m43PPvuM3bt3Y7FYCAgIoKGhQS0aM4A5SgTq\nvq6VuiNQFE15eTk5OTmUlGhLh6SlpbF48WICAgLcHJlyPRwlgtFCiPd7KpRS3umCeDxK/oULwGh0\nOpUIFGXv3r32SeJCQkJYunQpycnJ7g5L6QO9PSx+ub8C8UQWs5nI0GLmzFVtnorS0tKCzWYjMzOT\nBQsW4Ovr6+6QlD7iKBE0SCl39VskHmj8+HEYDAZm3THb3aEoSr8zmUzU1dURFRUFwM0330xKSgoj\nR450c2RKX3M0yqPQQdmQYTJYuFxZ4+4wFKVfXbhwgVdffZV//etfGI1a7zkfHx+VBAYpR4ngfx29\nUQgRJIRI6eN4PMrZcxf44lgQr/3qA3eHoij9wmAwkJOTwz/+8Q9qa2vx8/OjqanJ3WEp
Luaoaejr\nQojngG3AEbRnBn5ok9DNa/3zSZdH6EbSZmXMiDMsTv0P2vLKijJ4ffnll2zZsoXGxkb0ej1z5sxh\n1qxZ6PV6d4emuJijaagfE0JEAvcA3wRigBbgFLBGSrmnXyJ0o3HjxiIrD5A6pvNSy4oyuGzfvp2D\nBw8C2sLx2dnZDB8+3M1RKf3F0R0BUsoq4C+tryFHSrjcEkxZdSiq35AymI0ePZqjR48yf/58pk6d\nqiaJG2LUb9uBs2cvkHshlf/7kVpIQxlc6uvrOXbsmH17/PjxPP7442qm0CHK4R2BotFFTnF3CIrS\nJ6SUHD58mJ07d2I2mxk+fDixsbEABAYGujk6xV1UInAgMXE0paUgwgZ15yhliKiuriYnJ4dLly4B\ncMMNN6j5gRTAiUQghPAHfgSMklJ+XwgxFkiWUm5zeXQeQmepA7VqsTJA2Ww2Dhw4wJ49e7BYLAQG\nBnLrrbcyYcIENVW0Ajh3R/A62prFbYvZlwD/F61b6aCWn6/NNSSK1wPfdnc4inJNdu7cyYEDBwBI\nT09n8eLF+Pv7uzkqxZM4kwiSpZT3CSHuAZBSNosh8jWi7aGZmoZaGcimT59Ofn4+CxYsYOzYse4O\nR/FAznQPMAkh/NBWJUMIMRowuTQqDzFq1CgAhkbaUwaLwsJC1q9fj81mAyAkJITvfe97KgkoPXLm\njuDXwHYgXgixBpgLfNelUXkYtR6BMhCYTCZ27drF559/DsDIkSOZMkXr8TZEbuKVa9RrIpBSbhNC\nHAZmoi1W83+klBUuj8wDnM/PB5JU05Di8c6fP8/mzZupq6tDCMGsWbNIT093d1jKAOFMr6EdUspF\nwMZu9vX23iXAi4Ae+JuUclU3xywHnkVrejompfy68+G7lq+PjhHDCvnK0jR3h6Io3WppaWHHjh3k\n5uYC2prB2dnZxMTEuDkyZSDpMREIIXzQJpkbIYQI5srSlSFAr3PRCiH0aAvbLASKgENCiBwp5cl2\nxyQDPwVmSSlrhRBR13wlLjBmTCIGg4HkGTe5OxRF6daJEyfIzc1Fr9eTlZXFjBkz1CRxylVzdEfw\nCPAEEAXkcSURXAZedeKzbwLOSSnzAYQQa4HbgZPtjnkIeFlKWQvgaU1ONhs01ZuoKiggMjHR3eEo\nCqCNC2jr0TZlyhQqKyu56aabiIyMdHNkykDlaPbRPwB/EEL8SEr5x2v47Dg6Lm5TBEzrdMw4ACHE\nfrTmo2ellNs7f5AQYiWwEiAuLo6CgoJrCOfqnThxjtrasWx8dQPzv//Vfjlnf6uurnZ3CC41mK5P\nSkl+fj65ubksWbIEg8EAQEpKCo2NjTQ2Nro5wr41mH533fGk63PmYfEfhRA3ACloTUVt+//VR+dP\nBrKAeOBjIcQkKWVdpxhWA7emW3kAACAASURBVKsBMjMzZWI/fTsPCvImJjCPW6amEzuI7wj66+/T\nXQbD9dXV1bF582bOnz8PQFVVFaNGjRoU1+aIur7+4czD4p8Di4AbgA+AxcAnQG+JoBhIaLcd37qv\nvSLgoJTSDFwQQpxBSwyHnIrexZKSRiErDxA7vK73gxXFBaSUHDp0iF27dmEymfDz82PJkiWkpaVx\n8eJFd4enDBLOjCO4F5gMHJVSflMIEQO86cT7DgHJrQPQioGvAZ17BG1AW/rrjdZFcMYB+U7G7nJW\nK9Q0RFBcGU6cu4NRhpzq6mo2btxIYaHWwpqSksItt9xCUFCQmyNTBhtnEkGLlNIqhLC09h4qA0b1\n9iYppUUI8SjaXYQeeF1KmSeE+BVwWEqZ01q2SAhxErCijVHwmIazkyfzqakZz6e+Y7kn293RKEON\nyWSiqKiIoKAg+yRxiuIKziSCL4QQYWiTzx1G6zX0uTMfLqXcCmzttO+Zdj9LtJ5JTzgbcH/y89Me\niehiex0yoSh9ora2lvDwcABiYmK45557SExMVJPE
KS7lMBG0Ti73bOvD25eFEB8AIVLKo/0SnZvF\nxMRSUgIidJy7Q1EGOYvFwt69e9m/fz/Lly/nhhtuAFB3AUq/6G3NYimE+BBIbd0+1y9ReRhdyyWc\nGEOnKNfk0qVL5OTk2LsTlpeX2xOBovQHZ5qGcoUQN0opv3B5NB4mv3WuIVH0PtraPIrSd4xGI7t2\n7eLQIa2TXGRkJNnZ2SQkJPTyTkXpW84kghvRpoc4DzShjTCWUsoMl0bmAfwDAqitBZ1OTTqn9K3y\n8nLeeecd6uvr0el0zJo1izlz5uDlpVaPVfqfM//qhmx/megR0ZQUq2molb7X/oFwdnY20dHRbo5I\nGcqcGVl8vj8C8WRqGmrlekkpOX36NElJSfj4+ODj48O3vvUtwsLC7PMGKYq7qH+BDmjrEagVypTr\n09DQwLvvvsu6dev46KOP7PsjIiJUElA8gmqQdCA8zJcg30tkLlvs7lCUAUhKSW5uLjt27MBgMODj\n46NmCFU8klOJQAgRj7aI/W4hhC/gJaVscm1o7jdqVBwGg4HhyanuDkUZYGpra9m8eXNrzzMYO3Ys\nS5cuJTQ01M2RKUpXzkw69x3gUSAUGIM2vcQrwALXhuZ+ZjPUVhqoOH2MqPFq2T/FOXV1dfzlL3/B\nbDbj7+/PkiVLmDRpklo3WPFYztwRPIa2yMxBACnlGU9bScxVTpzI5/LlJP6zYzvzVSJQnBQWFkZy\ncjI6nY4lS5YQGBjo7pAUxSFnEoFBSmlq+zbTugTlkPhqExnpS1LEF9w0Yby7Q1E8mNVqZf/+/SQn\nJ9vXCr7zzjvVkpHKgOFMl4X9QoifAH5CiHnAOmCza8PyDAkJcYQEGAgONLg7FMVDlZSU8Nprr7F7\n9242bdqENo8iKgkoA4ozdwQ/QVsm8kvgcbSpo//qyqA8hdEIlTXRaj0CpQuz2cyePXs4cOAAUkrC\nw8NZsGCBeg6gDEjOJILbgL9JKf/i6mA8TV5ePg0NSeQXD1eJQLG7ePEiOTk51NTUIIRg+vTpzJs3\nDx8fH3eHpijXxJlEcA/wZyHER2jNQh9KKa2uDcszBIeE0tAAIumb7g5F8RAGg4F33nkHo9HI8OHD\nyc7OJj4+3t1hKcp1cWaKiW+2jh24Dfg28FchxDYp5fddHp2bRQ4bRkkx6ILVFNRDnZQSIQR+fn4s\nWrSIy5cvM3v2bPUsQBkUnBpQJqU0CiE2Ai1oy04uBwZ9Imh97oeoOw6kuTUWxT2am5v54IMPiI2N\nZdq0aQBkZAz6iXeVIcaZAWUL0RawXwB8AvyDrovQD0pt6xHoStajEsHQIqUkLy+Pbdu20dzczLlz\n58jIyMDb29vdoSlKn3PmjmAl2rOBH0opW1wcj0cJDQujqUnNPjrUNDQ0sGXLFk6fPg1AYmIiy5Yt\nU0lAGbSceUZwT38E4okiwiPUegRDiJSSL774gh07dmA0GvH19WXhwoVkZGSobqHKoNZjIhBC7JVS\nzhVC1ALta8K2FcoiXB6dh1ArlA0NUkqOHj2K0Whk3Lhx3HbbbYSEhLg7LEVxOUd3BPNa/xyy8+bm\n518ARiNQiWCwstlsmEwm/Pz80Ol0ZGdnU15eTmpqqroLUIaMHqeYkFLaWn/8u5TS2v4F/L1/wnOv\nuLgAEhOKuGHpQ+4ORXGBiooKXn/9dTZs2GCfGiIqKkrNFKoMOc48LO7QXaZ10rmprgnHs8TGjsBg\nMOAfqcYRDCZWq5V9+/axb98+bDYbwcHBNDU1ERQU5O7QFMUtHD0jeAp4GggWQtS07UZ7XjAk7gia\nmyVVpS1Un9zLsJS57g5H6QPFxcXk5ORQUVEBwJQpU1iwYAF+fn5ujkxR3MfRHcHvgN8D/4uWEAAY\nKtNLAOTlXaS5OZGyLz5QiWCAk1Kya9cuPv30U/skccuWLWP06NHuDk1R3M5RIhgrpTwrhPgnMLFt\nZ1vbqZTyeG8f
LoRYAryINhr5b1LKVT0cdxfwb2CqlPKw8+G7VkJCAKHWzxk/UlUWA50QAptNe+w1\nY8YM5s2bp8YFKEorR4ngaeBB4OVuyiQwx9EHtz5LeBlYCBQBh4QQOVLKk52OC0ab3vrgVcTdL6Kj\no5CV5/HysvV+sOJxDAYD1dXVJCYmAjBv3jxSU1OJjY11b2CK4mF6TARSygdb/5x9jZ99E3BOSpkP\nIIRYC9wOnOx03K+B3wL/5xrP4zKXL9uoqhhJVV3Q0O1DO0CdPn2aLVu2YLVaSU1NxdfXF29vb5UE\nFKUbzsw1dCfa1NMNQoingQzgN1LKY728NQ4obLddBEzr9NkZQIKUcosQosdEIIRYiTbVBXFxcRQU\nFPQWdp84deoSBkMiReWCxn46Z3+rrq52dwh9ymAw8Pnnn3PhwgVAWz/47Nmzg7JH0GD73XWmrq//\nONN99Fkp5ftCiJnArWgPkP8KTL+eEwshdMALwIrejpVSrgZWA2RmZsq2W31Xi4iIoKQEgjKfJTFx\n8N4T9NffpytJKTlx4gTbtm2jpaUFLy8vbr75ZkaMGEFSUpK7w3OZwfC7c0RdX/9wJhG09RJaCvxV\nSrlRCPGsE+8rBhLabce37msTDKQCe1ofQEcDOUKIbE95YBwSEkJJCegCBm8SGCy2bNnCkSNHABg9\nejTLli0jPDy83+4eFWUgcyYRlAohXgZuAaYIIXxwbtH7Q0CyEGI0WgL4Gu2mr5ZS1tNu+gohxB7g\nSU9JAgBWqw3QIco/gqSb3R2O4sD48eM5ceIEixYt4sYbb1QjgxXlKjhToS8H9gK3Silr0Srvpx2/\nBaSUFuBRtMXuTwHvSinzhBC/EkJkX0fM/aaoqAgAXelmN0eidFZTU2O/AwBITk7mRz/6kZopVFGu\ngTPTUDcKIfKALCFEFrBPSrnNmQ+XUm4Ftnba90wPx2Y585n9KTw8nJYWNQ21J7HZbHz22Wfs3r0b\nq9XKiBEj7GsGq9HBinJtnOk19CjwA2BD6653hRAvSylfcWlkHiAoKBgA9QXTM5SXl5OTk0NJSQkA\naWlpREQMmdnQFcVlnF2h7CYpZSOAEOJ/gE+BQZ8IrFYroFfrEbiZxWJh3759fPLJJ9hsNkJCQli6\ndCnJycnuDk1RBgVnEoEATO22za37Bj3tm2eCWo/AzXbt2sVnn30GQGZmJgsWLMDX19fNUSnK4OFM\nIvgncFAI8R5aAvgqsMalUXmI0aMDMBtLGLFwo7tDGdJmzpxJYWEhCxcuZNSoUe4OR1EGHWceFv+u\ntWvnV9DmGPq+lPKQqwPzBMOHD8NgMKD3D3V3KENKfn4+R44c4a677kKn0xEcHMyDDz6oegMpios4\nc0cAYACMgK31zyGhutpMTXkLTV9uIPCGr7o7nEHPYDCwY8cOvvjiCwDGjBlDRkYGgEoCiuJCvY4j\nEEL8DHgHiEEbHfwvIcRPXR2YJzh3vpySyjBaLnzo7lAGvS+//JKXX36ZL774Ar1ez7x580hPT3d3\nWIoyJDhzR/At4EYpZTOAEOI3wBdoC9YMasljAwg1f0pkWKK7Qxm0Ghsb2b59O3l5eQDEx8eTnZ3N\n8OHD3RyZogwdTk0x0ek4r9Z9g15ERASy0pnB18q1+vLLL8nLy8Pb25v58+czdepUdDr1d64o/cmZ\nRFAD5AkhPkB7WLwIbZGZFwCklE+4MD63Ki83U1s6hhajN/7uDmYQsVgseHlp//SmTJlCTU0NU6dO\nJTw83M2RKcrQ5Ewi2NL6avOZi2LxOAUFlRgMsVgseneHMihIKTl8+DAff/wxDz74IGFhYQghWLRo\nkbtDU5QhzZnuo3/vj0A8UXh4OKWlIKb+yd2hDHjV1dXk5ORw6dIlAP7zn/8we/a1Ln6nKEpfcrb7\n6JDk7681COm81SjWa2Wz2fj000/Zs2cPVquVwMBAbr31VlJSUtwdmqIorVQicM
BsNgPeiItrYcLX\n3B3OgFNRUcGGDRsoLdX6FqSnp7N48WJ7glUUxTM4nQiEEL5SyiEzmAygsrIKiEFX8aFKBNfAZrNR\nXl5OaGgoS5cuZezYse4OSVGUbjgzDfVNwN+BUGCkECId+K6U8oeuDs7dwsLCKSsDobozOq2ystI+\nBiA6Opp7772XUaNGqUniFMWDOVPD/QltveJqACnlMWCeK4PyFG0Lneh0qtdQb0wmE9u2beOVV17h\n5MmT9v3jxo1TSUBRPJwzTUM6KeXFTnO9WHs6eDCxPyPQq0cpjpw/f55NmzZRX1+PEILa2lqXns9s\nNlNUVITBYOj1WIvFwqlTp1waj7sM5msDdX3Xys/Pj/j4eLy9vZ1+jzM1XGFr85AUQuiBHwJnrjHG\nAaWqugqIRuhUIuhOS0sLO3bsIDc3F9CagrKzs4mJiXHpeYuKiggODiYxMbHXyeiMRuOgvSMZzNcG\n6vquhZSS6upqioqKGD16tNPvc6aGexiteWgkUA7sbN036KVMCMRorEJ30x/dHYrHKSsr4+2336ax\nsRG9Xs/cuXOZOXMmer3rm9EMBoNTSUBRhhohBMOGDaOysvKq3ufMgLIKYEh2mQkJCXGq+WEoioiI\nwMvLi4SEBLKzs4mMjOzX86skoCjdu5b/G870GnoNuq7VKKVcedVnG2CKioxcrm6Gwg2QMLTXI5BS\ncuLECfvDXx8fH1asWEFISIiqlBVlgHOm19BOYFfraz8QxRBZnKa4uJrLTQFQd8zdobhVXV0db7/9\nNu+//z67du2y7w8NDR2ySeA3v/kNEydOJC0tjcmTJ3Pw4EF++ctf8tOfdlyqIzc3lwkTJgCQmJjY\nZVqNyZMnk5qa2u05SktLWbp0qWsu4BrU1NSwcOFCkpOTWbhwYY+dAp566ilSU1NJTU1l3bp19v2z\nZ89m8uTJTJ48mdjYWL76Ve3L1ebNm3nmmWf65RqU7vWaCKSU69q91gB3AlNcH5r7paQEMX3sQRBD\n82GxlJKDBw/yyiuvcP78efz8/IiLi3N3WG534MABNm/ezNGjRzl+/Dg7d+4kISGB++67r0PFB7B2\n7Vruu+8++3ZDQwOFhYUAvfYYeeGFF3jooYf6/gKu0apVq5g/fz5nz55l/vz5rFq1qssxW7Zs4ejR\no+Tm5nLw4EGef/55Ll++DMC+ffvIzc0lNzeXGTNmcOeddwJw2223sWnTJpqbm/v1epQrrmWk1Ghg\nRF8H4omCg4K0H3TOd8MaLKqqqnjjjTfYvn07ZrOZlJQUHnnkEY9cNSwrK4s333wT0LqWZmVl8dZb\nbwHQ3NxMVlaWvYKur68nKyuL999/H9CuMysri02bNgHaQ/DelJaWEhkZae/xERkZSWxsLOPGjSM8\nPJyDBw/aj3333Xc7JILly5fbY3nnnXc6lHX23nvvsWTJEgAKCgqYPXs2GRkZZGRk8OmnnwKwZ8+e\nDncNjz76qP3v4tChQ8ycOZP09HRuuukmGhoaer02RzZu3MgDDzwAwAMPPMCGDRu6HHPy5EnmzJmD\nl5cXgYGBpKWlsX379g7HXL58mY8++sh+RyCEICsri82bN19XfMq1c2apylohRE3rqw74EBgSS1UW\nXDRyonDikLsjqK2t5dVXX6WwsJCgoCCWL1/OPffcQ1BbYhziFi1aRGFhIePGjeMHP/gBe/futZfd\nd999rF27FoDPPvuMiIgIkpOT7eV33XWXPQlt2rSJZcuWdXuOCxcuEB4ebk82UVFRfPjhhxw9epR1\n69bx2GOPOYzRZDJx77338uKLL3Ls2DF27tzZZY6nhoYGe1NN51f7QYFtysvL7V2Do6OjKS8v73JM\neno627dvp7m5maqqKnbv3m2/A2qzYcMG5s+fT0hIiH1fZmYm+/btc3hNius4rOGE1gCcDhS37rJJ\nKbs8OB6syssaMJmGgVeAu0PpV+Hh4UyYMA
EvLy8WLVrk8ZPE7dmzx/6zt7d3h+2AgIAO26GhoR22\nIyMjO2xHR0f3er6goCCOHDnCvn372L17N/feey+rVq1ixYoV3HvvvcycOZPf//73XZqFAIYNG0Z4\neDhr165lwoQJBAR0/2+rtLS0w3KdZrOZRx99lNzcXPR6PWfOOB7Kc/r0aWJiYpg6dSpAh0q3TXBw\nsH0MyNUSQnT7fGjRokX2O5Hhw4czY8aMLl2K33nnHb773e922BcVFUVJSck1xaJcP4eJQEophRBb\npZTdP83qhRBiCfAioAf+JqVc1an8CeC7gAWoBL4jpbx4LedyhZCwcCordZD8fXeH4lJms5m9e/eS\nkpJCbGwsAHfccYdaMtIBvV5PVlYWWVlZTJo0iTVr1rBixQoSEhIYPXo0e/fu5b333uPAgQNd3nvv\nvffyyCOP2JtwuuPv79+h6/If/vAHRowYwbFjx7DZbPbpT7y8vLDZbPbjrqa7c0NDQ49rQvzrX//q\nMlX4iBEjKC0tJSYmhtLSUqKiorp9789+9jN+9rOfAfD1r3+dcePG2cuqqqr4/PPPWb9+fYf3GAwG\nj//CMZg50+aRK4S4UUr5xdV8cOso5JeBhUAR2vKWOVLK9vecXwCZUspmIcTDwO+Ae6/mPK7k4+3N\nYO8UU15ezubNm6murub8+fOsXLkSIYRKAg6cPn0anU5nb/LJzc1l1KhR9vL77ruPH//4xyQlJREf\nH9/l/XfccQelpaUsXry4x2/B48aNo6CgwL5dX19PfHw8Op2ONWvWYLVqs7yMGjWKkydPYjQaaWlp\nYdeuXXzlK19h/PjxlJaWcujQIaZOnUpDQwP+/v72JULh6u8IsrOzWbNmDU8//TRr1qzh9ttv73KM\n1Wqlrq6OYcOGcfz4cY4fP95hBbp///vfLF261J7I2pw5c6bH3lOK6/X4v10Ie8P4jWiV+GkhxFEh\nxBdCiKNOfPZNwDkpZb6U0gSsBTr8y5FS7pZStnUV+Azo+r/GjUxGA0JaoXxv7wcPMEajka1bt7J9\n+3aqq6uJjIzk1ltvHbLdQa9GY2MjDzzwACkpKaSlpXHy5EmeffZZe/k999xDXl5ejw+Cg4ODeeqp\np/Dx8enxHIGBgYwZM4Zz584B8IMf/IA1a9aQnp7Ol19+SWBgIAAJCQksX76c1NRUli9fzo033giA\nj48P69at44c//CHp6eksXLjwugdHPv3003z44YckJyezc+dOnn76aQAOHz5sb+oxm83Mnj2blJQU\nVq5cyVtvvdUh+XTXXAawe/dubrvttuuKT7l2oqcmfyHEUSllhhBiTHflUsrzDj9YiLuBJVLK77Zu\nfxOYJqV8tIfjXwLKpJT/3U3ZSmAlQFxc3JRPPvnE0an7zIRVt2I2R1D6/btoGn5Hv5yzPxQXF3Pg\nwAGampoQQjBp0iTS0tL6ZXqIvtDQ0NChucERi8XSoSIaSDZu3MjRo0f55S9/2W35QL629srLy3ng\ngQe69C4aLNfXE1de35kzZwgODu6wb/To0UeklJndHe8oCgG9V/h9QQhxP5AJzO2uXEq5GlgNkJmZ\nKRMTE10dEqA9YKuphuFRsQwf1T/ndDWDwcC6deswGAzExMSQmZlJRkaGu8O6KqdOnbqqyboG6sRl\ny5cv5/Llyw7jH6jX1l55eTl/+MMfur2WwXB9jrjq+ry8vLiaetJRIhje+jC3W1LKF3r57GIgod12\nPFd6H9kJIRYAPwPmetoKaF5eehBywHcfbbvrE0Lg5+fH4sWLaWpqYsaMGfbF5BXP1Ll3zWDU1rNJ\ncR9HNZweCKL1zuAaHAKShRCj0RLA14Cvtz9ACHEj8Fe0JqSKazyPy5hN2noEA3lAWUNDA1u3bmXk\nyJHMmDED0KY1UBRFaeMoEZRKKX91rR8spbQIIR4FPkBLKq9LKfOEEL8CDkspc4Dn0JLN/219SHlJ\nSpl9re
fsa83N9fh4BYNXoLtDuWpSSnJzc/nggw8wGo0UFhaSmZl5VYtVKIoyNPT6jOB6SCm3Als7\n7Xum3c8LrvccrpSSGorRaITo+e4O5arU1tayefNm8vPzARg7dixLly5VSUBRlG45SgQDq/ZzgYCA\ngAHVn95ms/H555/z0UcfYTab8ff3Z8mSJUyaNEl1C1UUpUc91nJSypr+DMQTnT9zmfOnmqC+67wr\nniovLw+z2czEiRN55JFHSEtLU0nABbqbd+nZZ58lLi6OyZMnk5KSwjvvvNPj+//4xz/yj3/8w5Uh\nXpXt27czfvx4xo4d2+2sogAXL15k/vz5pKWlkZWVRVFRkX1/RkYGkydPZuLEibz66qv29yxYsMDl\na1grfUBKOaBeU6ZMkf0l9NExctijqVJWftZv57xaFotFNjc327crKirkqVOnnH7/hQsXXBCVa508\nedLpYw0Gg0tiCAwM7LLvF7/4hXzuueeklFKeOXNGBgcHS5PJ1OU4s9ksJ02aJM1m83XF0FfXZrFY\nZFJSkjx//rw0Go0yLS1N5uXldTnu7rvvlm+++aaUUspdu3bJ+++/X0oppdFotMfS0NAgR40aJYuL\ni6WUUr755pvyv//7v68pLlf97jyFK6+vu/8jaM9mu61XB067hxukjNOTmnACPHTx+pKSEl577TXW\nr19v7yI6fPhwbrjhBjdH1s92ZnV9nXlFK7M0d1+e/6ZWbqjqWtYHkpOTCQgI6Pbb8EcffURGRoZ9\nMNFrr73G1KlTSU9P56677rLPy79ixQr+/e9/29/X/i7kt7/9LVOmTCE9Pd0+wvdaff7554wdO5ak\npCR8fHz42te+xsaNG7scd/LkSW6++WYA5s2bZz/Gx8fH3h/eaDR2mPsoOzvb4Z2R4hlUInDAx6f1\n4arwrIesZrOZDz/8kL/97W+Ul5dTVVVFU1OTu8NS2jl69CjJycndTsy2f/9+pky5srbTnXfeyaFD\nhzh27BgTJkzg73//u8PP3rZtGxs3bmTfvn0cO3aMn/zkJ12Oefvtt7udXvruu+/ucmxxcTEJCVeG\n/MTHx1Nc3GXID+np6fYptNevX09DQwPV1dUAFBYWkpaWRkJCAk899ZR98sLw8HCMRqP9OMUzeeZX\nXQ9x+rwXxuZJHnVHUFBQwKZNm6ipqUEIwfTp05k3b57DeWsGvQV7ei7zCnBc7hfpuPwq/eEPf+CN\nN97gzJkz9sVuOistLbUvXwlw4sQJfv7zn1NXV0djYyOLFy92eI6dO3fy7W9/2z6FdURERJdjvvGN\nb/CNb3zjOq6kq+eff96+8M2cOXOIi4uzT0uSkJDA8ePHKSkp4atf/Sp33303I0Zo61e1TTE9bNiw\nPo1H6TueU8N5oMsNVqyWQNC7fz0CKSXbt2/n888/B7QmoOzs7G5nt1Tc58c//jFPPvkkOTk5PPjg\ng/YlPtvrPMX0ihUr2LBhA+np6bz55pv29RHaTzFts9kwmUxOx/H222/z3HPPddk/duzYDs1NAHFx\ncR0WjykqKup2SdLY2Fj7HUFjYyPvvfceYWFhXY5JTU1l37599rsPNcW051NNQw4EBIUhdT4QlOju\nUBBC4OPjg06nY+7cuaxcuVIlAQ+WnZ1NZmYma9as6VI2YcIE+6yioI3+jomJwWw28/bbb9v3JyYm\ncuTIEQBycnIwm80ALFy4kDfeeMP+LKGmpmsHv2984xv29YHbvzonAdCmeDh79iwXLlzAZDKxdu1a\nsrO7juusqqqyJ6b//d//5Tvf+Q6gJY6WlhZAG8PyySefMH78eED7AlNWVnZV894o/U8lAgd0Oj1C\n576ul83NzR3aaufOncv3vvc9srKyBvWsjANBc3Mz8fHx9tcLL3SdeuuZZ57hhRde6PDwFOCWW27h\n448/tm//+te/Ztq0acyaNavDg/6HHnqIvXv3kp6ezoEDB+xTTy9ZsoTs
7GxmzpzJ5MmTef7556/r\nWry8vHjppZdYvHgxEyZMYPny5UycONF+DTk5OYC2Etz48eMZN24c5eXl9sVnTp06xbRp00hPT2fu\n3Lk8+eSTTJo0CYAjR44wffp09e/Vw/U4DbWnyszMlIcPH+6Xc0X8aCotLXpa/rQFfPuvfVNKSV5e\nHtu2bUOn0/HII490aV7oKwUFBQPu29qpU6c6tLE7YjQaPXIGyzvuuIPf/e53HdYzvlqeem3tPf74\n42RnZzN//tWPTx0I13c9XHl93f0fEUJc0zTUQ57JaEDgD/2YLC9fvszWrVs5ffo0oDUPmEwmlyUC\nxT1WrVpFaWnpdSWCgSA1NfWakoDSv1QicMDPz4+WZvql15CUkqNHj/Lhhx/avyksXLiQjIwMNTJ4\nEBo/fry9HX0we+ihh9wdguIElQgcEEJo6xH0wzTUGzdu5NixY4C2Xu1tt91GSEiIy8+rKIqiEoED\nFosV0PXLwjSTJk3i7Nmz3HLLLUycOFHdBSiK0m9UInBA2hrx89a75I6goqKCCxcuMG3aNADGjBnD\n448/PrQHhimK4hYqETgwMT1SW49A9F0vW6vVyr59+9i3bx82m43Y2Fj78H6VBBRFcQc1jsABb2/v\nPu3/XFxczOrVq9m7DyTG1wAAGotJREFUdy82m40pU6Z0OxeN4vn0er192uX09HR+//vfY7PZ+OCD\nD+zz+gQFBTF+/HgmT57Mt771rS6fUVpaytKlS90QffdqampYuHAhycnJLFy4sMfpo5966ilSU1NJ\nTU1l3bp19v0PPvgg6enppKWlcffdd9PY2AjASy+9xOuvv94v16BcG5UIHDh5vJZzJ69/Mjez2cyO\nHTv4+9//TkVFBeHh4TzwwAMsXbp0UPeTHsz8/f3Jzc0lLy+PDz/8kG3btvHLX/6SxYsX20fxZmZm\n8vbbb5Obm9vt2gMvvPCCR/WqWbVqFfPnz+fs2bPMnz+/23UJtmzZwtGjR8nNzeXgwYM8//zzXL58\nGdDmWTp27BjHjx9n5MiRvPTSSwB85zvf4c9//nO/XotydVQicEDYGvDzbr7uz9m1axcHDhwAYMaM\nGTz88MMDbhCXJ8vK6vm1cKE3WVnQfvBtVha8+ab2c1VV1/dcraioKFavXs1LL73E1QzQfO+991iy\nZAmgDeybPXs2GRkZZGRk8OmnnwLaaN72dw1tk74BHD58mJkzZ5Kens5NN91EQ0PD1Qffzsb/1965\nx1VVpf///XAwUcfUxJSykLw1llwU+ybpZI06zjijL38mqOmgo+OYSZPdxnlZjU5+v2p56WKjfhsH\nkewrNqYhqY0alWU6oKIZ5YWkNBjSGvESo1ye3x97czzgAQ4C5wis9+vFi73XXmvt59kHzrNu+7Pe\nfpvY2FgAYmNj2bhx4xV5MjMz+clPfoK/vz8tWrQgNDSUrVu3AjhXuakqBQUFzgUPzZs3p1OnTk6d\nLMO1h5kjqITuIQXIxdM1rqd///7k5eUxcOBAt2JehvrPbbfdRnFxMd9++61TdbMyjh8/Tps2bZw9\nwhtvvJFt27YREBDA0aNHGTNmDJW9QX/p0iXGjRvHunXr6NOnD2fPnr1C2O3cuXP079/fbfk33niD\nHj16lEnLy8sjKCgIgA4dOpCXl3dFubCwMObMmcPjjz/ODz/8QGpqapl6Jk6cyObNm+nRoweLFi1y\npkdGRrJz507uuuuuKp6MwReYQFAJfiLoVSzjPHz4MPv27SM6OhqHw0GLFi2cLS1D7WOLdbrl4sXC\nK4bfXPMHBlZevq7Izc2lXbt2zvPCwkKmT59ORkYGDoeDI0eOVFr+8OHDdOjQgT59+gC4feekZcuW\nZGRkXJV9IuJ2CfPgwYNJS0sjKiqKdu3a0bdvX6cUNUB8fDzFxcXExcWRlJTExIkTASvQffHFF1dl\ni6HuMYGgEj49diMlJe2qzmhz4cIF
tm7dyqFDhwA4cOAAvXr1qivzDNcQX375JQ6Hw+PJ//JS1EuW\nLKF9+/YcOHCAkpISp6SIqxQ1UKZMVVS3R9C+fXtyc3MJCgoiNze3Ql9mzZrlFJwbO3Ys3bp1K3Pd\n4XAwevRonn/+eWcgMFLU1zYmEFRCwSVBcFSZT1U5dOgQW7ZsoaCggCZNmnD//fcTHh7uBSsNvubU\nqVNMnTqV6dOne/wiYLdu3cjOznae5+fn07FjR/z8/EhISKC4uBiA4OBgMjMzuXjxIgUFBezYsYN+\n/frRvXt3/vWvf5GWlkafPn04d+4czZo1K7PKrbo9gmHDhpGQkMDMmTNJSEhg+PDhV+QpLi7mzJkz\ntG3bloMHD3Lw4EEGDx6MqpKVlUWXLl1QVZKTk8soqR45coR77rnHY1sM3sUEgkq4rtn1FBdVPvmX\nn5/PO++8w9GjRwEICQnhV7/6FW3atPGGiQYfUVBQQHh4OIWFhfj7+zN+/Hgee+wxj8u3aNGCzp07\nc+zYMbp06cK0adMYOXIkq1evZsiQIU7J6VtuuYXo6GjuvPNOQkJCiIiIAKx3Tl5//XXi4uIoKCig\nWbNmbN++vcy+xtVl5syZREdHs3LlSoKDg1m3bh1gTUovX76cv/71rxQWFjp7Gddffz2vv/66s9cS\nGxvL2bNnUVXCwsJYtmyZs+6PP/6Y2bNnX7VthrrFyFBXQpsZAygqLuHcyx9WmGfv3r2kpKTQtGlT\nBg8eTERERL2ShzAy1L5jw4YN7N27l7lz515V+WvZN1f279/P4sWLSUxMrFa5+uLf1WJkqOsJxZfO\nI24CZWFhIU2aWLITvXr14uzZs0RGRtKyZUtvm2iox4wYMaJRbOp++vRpnnvuOV+bYagEEwgqoaSk\nuMyLFiUlJezevZuPP/6YyZMn06ZNG0SE++67z2c2Guo3kydP9rUJdc6gQYN8bYKhCkwgqAR/hz9g\nrdjIy8sjOTmZnJwcwOp6RUVF+dA6g8FgqB3qNBCIyBDgJcAB/FVV55e73hRYDfQGvgNiVDW7Lm2q\nDqrgL36kpqby0UcfUVJSwvXXX88vf/nLBr+zlMFgaDzUWSAQEQfwKjAIOAmkiUiyqma6ZJsE/FtV\nu4jIaGABEFNXNlWXdo6W/LxVhHOj8cjISAYOHNigJ7AMBkPjoy57BHcBx1T1SwARWQsMB1wDwXBg\ntn38d2CpiIheI0uZxHGWNo4W3HDDDQwbNozg4GBfm2QwGAy1Tl0GgpuBEy7nJ4H/qiiPqhaJSD7Q\nFigj8CMiU4ApADfffHOZF3Hqkv59Q8n5dw6//umvUVWv3deb1MdVK0VFRdY+ER7mrQvmz59PUlIS\nDocDPz8/li5dWqmOzkcffURcXBxNmjRh9erVZGRkMHr06CvyZWdnc/vtt/OHP/yBOXPmANaqm06d\nOjF58mRefPFFZ97yvqWkpPD555/z5JNPeuyHa5nk5GS6du3q8dLcxMRE/vGPf5RZFnr69GnCw8PJ\nysqqsOf8yiuvMGnSJJo3bw7A8OHDSUhIoHXr1mXyXc1nl5uby7Rp09iwYUO1y9YF33//PePGjeOr\nr74iODiYNWvWON8xcvVv1qxZbNmyBYA//vGPjBo1CrAWE+zcuZNWrVoB8NprrxEWFsbmzZtJS0vj\nT3/6k9v7FhUVVe/7SlXr5Ad4AGteoPR8PLC0XJ5DQEeX8ywgsLJ6e/furd7k+PHjXr2ft6mP/mVm\nZnqc9z//+U+t33/Xrl169913O+s+deqUfvPNN5WW+d3vfqeJiYmqqpqamqpDhw51m+/48eMaEhKi\n4eHhzrS//OUvGhYWpg8//HCZvDX1rbCwsMx5bGysvvnmmx6Xz8/P17Zt2+qFCxecacuWLdOJEydW\nWi44OFhPnTpVZf1X498TTzyhGzdurHa5uuLJJ5/UefPmqarqvHnz9KmnnnJeK/UvJSVFBw4cqIWF\n
hXr+/HmNjIzU/Px8Va34MykpKdHw8PAyz94Vd/8jQLpW8L1alz2Cb4BbXM472mnu8pwUEX+gFdak\nscHgEY9ufZSMf1Uso1BSUoKfX/XU1sM7hPPikBcrvJ6bm0tgYKCzxRsYGOi8tmPHDp544gmKioro\n06cPy5YtIzExkXXr1vHuu++yZcsWsrKy+PzzzwkPDyc2NpYZM2aUqb958+b8+Mc/Jj09ncjISJKS\nkoiOjnauWNu0aRNz587l4sWLBAYGsmbNGtq3b8+qVatIT09n6dKlZGdn85vf/IbTp0/Trl074uPj\nufXWW5kwYQIBAQHs37+fe+65h9DQUNLT0xk7dizJycl88MEHzJ07l/Xr1zNq1Cj27dsHwNGjR4mJ\niXGeg/Vm8b333sumTZuIibGm9tauXevUIXL3LFasWEFOTg733XcfgYGBpKam0qlTJ9LT0zl//jw/\n//nP6devH7t27SIoKIhNmzbRrFkz0tLSmDRpEn5+fgwaNIgtW7Y4Nb1cWb9+vfMFvezsbMaPH8+F\nC9aeIkuXLiUqKor333+fhQsXkpKSAljS3pGRkUyYMIG0tDR+//vfc+HCBZo2bcqOHTtq9H7Q22+/\nzfu2qmFsbCwDBgxgwYIFZfK4Snv7+/s7pb2jo6MrrFdEGDBgACkpKZXm85S63I8gDegqIiEich0w\nGkgulycZKJXlfAB4z45cBsM1y+DBgzlx4gTdunVj2rRpfPDBB4AlrDZhwgSSkpL49NNPKSoqYtmy\nZUyePJlhw4bxwgsvsGbNGubPn0///v3JyMi4IgiUMnr0aNauXcuJEydwOBzcdNNNzmv9+vVj9+7d\n7NmzxynuVp64uDhiY2M5ePAgDz74II888ojz2smTJ9m1axeLFy92pkVFRTltzMjIoHPnzrRq1cqp\nVRQfH+8UkHNlzJgxrF27FoCcnByOHDnC/fffX+GzeOSRR7jppptITU0lNTX1ivqOHj3Kww8/zGef\nfUbr1q1Zv349YMlbr1ixwqnO6o6KpL337dtHUlJSmWfgjkuXLhETE8NLL73EgQMH2L59u1tp79Id\n6Mr/ZGZmXlGnp9LeW7du5YcffuD06dOkpqZy4sTlUfVZs2YRGhrKjBkzygyJlkp71wZ11iNQa8x/\nOvAu1vLRv6nqZyLyZ6wuSjKwEkgUkWPA91jBwmDwmMpa7lA3r/H/6Ec/Yu/evezcuZPU1FRiYmKY\nP38+ERERhISEONU4Y2NjefXVV3n00UerfY8hQ4bwzDPP0L59e2dru5STJ08SExNDTk4OhYWFhISE\nXFH+k08+4a233gJg/PjxPPXUU85ro0aNqvDL1JXJkycTHx/P4sWLSUpKcruxzNChQ5k2bRpnz55l\n3bp1jBw5EofDwaFDh67qWYSEhDjFGiMiIsjOzubMmTOcO3eOvn37ApbiaWlr3pXakPYOCgq6pqS9\n582bR4cOHbh06RJTpkxhwYIFPPvss4AV6Ep7iTWlTncoU9XNqtpNVTur6n/bac/aQQBV/Y+qjlLV\nLqp6l9orjAyGax2Hw8GAAQOYM2cOS5cudbZca4vrrruO3r17s2jRIh544IEy1+Li4pg+fTp79+5l\nxYoV1ZKmBpyCdlUxcuRItmzZQkpKCr1796Zt27ZX5GnWrBlDhgxhw4YNrF27ljFjxlTLlvK4Bm2H\nw1GtCePKpL3T09O5dOkSUHNp7+r0CEqlvYEqpb0zMjLYtm0bquoMoEFBQYgITZs2ZeLEiWWCcW1K\ne5utKg2GanL48GGn2ixARkYGwcHBdO/enezsbI4dOwZYq2ruvffeK8q3bNnSo20lH3/8cRYsWMAN\nN9xQJj0/P9+5011CQoLbslFRUc4hmzVr1lS4L0FldgUEBPCzn/2Mhx56yO2wUCljxoxh8eLF5OXl\nOVvtlT0LT/0vpXXr1rRs2ZI9e/YAOP0qjztp76CgIPz8/EhMTH
Qr7X3mzBl27NjhtDk3N5e0tDTA\n+tIvH4hKewTufsrv7wCXpb2BSqW9S1fvuUp7A84goqps3LiRO++801nuyJEjZc5rggkEBkM1OX/+\nPLGxsfTo0YPQ0FAyMzOZPXs2AQEBxMfHM2rUKHr27Imfnx9Tp069onxoaCgOh4OwsDCWLFlS4X3u\nuOMOtzvbzZ49m1GjRtG3b98yE9WAc+jhlVdeIT4+ntDQUBITE3nppZeq9Gv06NG88MILREREkJWV\nBcCDDz6In5+f84vJHYMGDSInJ4eYmBjn/St7FlOmTGHIkCHV0uhauXIlv/3tbwkPD+fChQvO5ZSu\nuEp7A0ybNo2EhATCwsL44osv3Ep7R0dHl5H2TkpKIi4ujrCwMAYNGlTt3lZ5Zs6cybZt2+jatSvb\nt29n5syZgCXtXfo8SqW9e/TowZQpU5zS3mA9/549e9KzZ09Onz7N008/7aw7NTWVoUOH1si+UowM\ndRXUR5nm6lAf/WsoMtQ1pbxvixYt4uzZs873D2qDhQsXkp+f7xP1UFf/zp8/79xrYf78+eTm5roN\nbjWV9vYmNfnbzMvLY+zYsc7eTHmMDLXB0AhZvnw5q1atck4Q1wYjRowgKyuL9957r9bqvFreeecd\n5s2bR1FREcHBwaxatcptvsYi7f3111+zaNGiWqvP9AiqoD62mKtDffTP9AgsGrJvYPyrCdXtEZg5\nAkO9pL41YAwGb3E1/xsmEBjqHQEBAXz33XcmGBgM5VBVvvvuOwICAqpVzswRGOodHTt25OTJk5w6\ndarKvEVFRc4VGA2NhuwbGP+uloCAADp27FitMg33KRsaLE2aNHH7Nq076uMciKc0ZN/A+OdNzNCQ\nwWAwNHJMIDAYDIZGjgkEBoPB0Mipd+8RiMgp4Csv3jKQcjumNTCMf/WXhuwbGP9qm2BVbefuQr0L\nBN5GRNIregmjIWD8q780ZN/A+OdNzNCQwWAwNHJMIDAYDIZGjgkEVfO/vjagjjH+1V8asm9g/PMa\nZo7AYDAYGjmmR2AwGAyNHBMIDAaDoZFjAoGNiAwRkcMickxEZrq53lREkuzre0Skk/etvDo88O0x\nEckUkYMiskNEgn1h59VSlX8u+UaKiIrINbFkz1M88U9Eou3P8DMRecPbNtYED/4+bxWRVBHZb/+N\n/sIXdl4NIvI3EflWRA5VcF1E5GXb94Mi0svbNgKWbGlj/wEcQBZwG3AdcADoUS7PNGC5fTwaSPK1\n3bXo231Ac/v4ofrim6f+2flaAh8Cu4FIX9tdy59fV2A/0MY+v9HXdteyf/8LPGQf9wCyfW13Nfz7\nCdALOFTB9V8AWwAB7gb2+MJO0yOwuAs4pqpfquolYC0wvFye4UCCffx34KdSulP3tU2Vvqlqqqr+\nYJ/uBqqnYetbPPnsAJ4DFgA1243c+3ji32+BV1X13wCq+q2XbawJnvinwPX2cSsgx4v21QhV/RD4\nvpIsw4HVarEbaC0iQd6x7jImEFjcDJxwOT9pp7nNo6pFQD7Q1ivW1QxPfHNlElYLpb5QpX92d/sW\nVX3Hm4bVEp58ft2AbiLysYjsFpEhXrOu5nji32xgnIicBDYDcd4xzStU9/+zTjD7ERiciMg4IBK4\n19e21BYi4gcsBib42JS6xB9reGgAVm/uQxHpqapnfGpV7TEGWKWqi0SkL5AoIneqaomvDWsomB6B\nxTfALS7nHe00t3lExB+ri/qdV6yrGZ74hogMBGYBw1T1opdsqw2q8q8lcCfwvohkY43DJtejCWNP\nPr+TQLKqFqrqceAIVmCoD3ji3yRgHYCqfgIEYAm2NQQ8+v+sa0wgsEgDuopIiIhchzUZnFwuTzIQ\nax8/ALyn9mzPNU6VvolIBLACKwjUp/FlqMI/Vc1X1UBV7aSqnbDmQIaparpvzK02nvxtbsTqDSAi\ngVhDRV9608ga4Il/XwM/BR
CRH2MFgqr3Ka0fJAO/tlcP3Q3kq2qut40wQ0NYY/4iMh14F2sVw99U\n9TMR+TOQrqrJwEqsLukxrMmf0b6z2HM89O0F4EfAm/b899eqOsxnRlcDD/2rt3jo37vAYBHJBIqB\nJ1W1PvRWPfXvceA1EZmBNXE8oZ40whCR/8MK0oH2HMefgCYAqroca87jF8Ax4Adgok/srCfP02Aw\nGAx1hBkaMhgMhkaOCQQGg8HQyDGBwGAwGBo5JhAYDAZDI8cEAoPBYGjkmEBgqFNEpFhEMlx+OlWS\nt1NFKo3eRkQiReRl+3iAiES5XJsqIr/2oi3hV6O4KSJBIpJiHw8QkXyXz2G7nT5bRL6x0w6JyDA3\n6ZkiMsal3oUicn9t+WfwPeY9AkNdU6Cq4b42orrYL5yVvnQ2ADgP7LKvLa/t+4mIv61h5Y5wLOmP\nzdWs9jHgNZfznar6Szf5lqjqQvtlrZ0icmO59K7AXhH5u6oWAq/Y9b5XTXsM1yimR2DwOnbLf6eI\n7LN/otzkuUNE/mm3SA/aX0aIyDiX9BUi4nBTNltEnheRT+28XVzu+55c3nfhVjt9lN0aPiAiH9pp\nA0Qkxe7BTAVm2Pfsb7eWnxCR20Xkn+X8+tQ+7i0iH4jIXhF5V9woSorIKhFZLiJ7gOdF5C4R+UQs\n3f1dItLdftv2z0CMff8YEWkhls79P+287tRWAUYCWz39XFT1c6CIcvINqnoU62WnNvb5V0BbEeng\nad2GaxsTCAx1TTOX4YgNdtq3wCBV7QXEAC+7KTcVeMnuTUQCJ+0Wawxwj51eDDxYwX3zVbUnsBR4\n0U57BUhQ1VBgjct9nwV+pqphQJk3qlU1G1iO1ToOV9WdLte+AK4TkRA7KQZIEpEm9r0eUNXewN+A\n/67Azo5AlKo+BnwB9FfVCNum/7GlmZ/F2iMiXFWTsDSh3lPVu7D2knhBRFq4Vmrb9O9yulH9XT6L\nWeUNEZH/AkooJ98glnrr0XLyI/uAeyrwyVDPMENDhrrG3dBQE2CpiJR+mXdzU+4TYJaIdATeUtWj\nIvJToDeQJpYURjOsoOKO/3P5vcQ+7gv8P/s4EXjePv4YWCUi64C3quMclhhaDDDf/h0DdMcSuttm\n2+kAKtKPeVNVi+3jVkCC3ftRbCkCNwwGhonIE/Z5AHAr8LlLniCu1OOpaGhohljKs+eAGFVV2+4Z\nIjIR6/P5Vbky3wI3VWCfoZ5hAoHBF8wA8oAwrF7pFZvFqOob9pDJUGCziPwOaxenBFX9owf30AqO\nr8yoOtVuDQ/FGgvv7ZkbACRhaTS9ZVWlR0WkJ/CZqvb1oPwFl+PngFRVHWEPSb1fQRkBRqrq4Urq\nLcAKEJ6wRFUXVpRuTyCvFJHOqlr6WQXY9zA0AMzQkMEXtAJybT358Vgt5jKIyG3Al6r6MvA2EArs\nAB4oncwUkRuk4v2VY1x+f2If7+KyWOCDwE67ns6qukdVn8VqRbvKAoPVUm7p7iaqmoXVq3kGKygA\nHAbaiaWdj4g0EZE7KrDTlVZcliCeUMn93wXixG62i6UeW54jQCcP7lkltvBbOpfVd8HqJVwTK7wM\nNccEAoMv+AsQKyIHgNsp2youJRo4JCIZWMMsq1U1E3ga+IeIHAS2YQ2BuKONnef3WD0QsHa2mmin\nj7evgTXG/qlYS1d3Ye2b68omYETpZLGbeyUB47ismX8JS6p8ge1jBnDFhLgbngfmich+yvbWU4Ee\npZPFWD2HJsBBEfnMPi+Dql4AskonymuBPwOPiYifPQfShcurqgz1HKM+amhwiLUBTaSqnva1Lb5E\nREYAvVX16Tqot5eqPlOb9Rp8h5kjMBgaKKq6QUTqYl9tf2BRHdRr8BGmR2AwGAyNHDNHYDAYDI0c\nEwgMBoOhkWMCgcFgMDRyTCAwGAyGRo4JBAaDwdDI+f+5hJRSZ+1w0QAAAABJRU5ErkJggg==
\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "stream",
          "text": [
            "CPU times: user 21.4 s, sys: 209 ms, total: 21.6 s\n",
            "Wall time: 21.5 s\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "1n1l_NO0bJOZ",
        "colab_type": "text"
      },
      "source": [
        "# Boosting Methods\n",
        "\n",
        "Boosting methods build an ensemble of weak estimators sequentially, with each new estimator attempting to correct the errors of its predecessor<sup>1</sup>. The weak learners often have only a slight performance advantage over random guessing, but by focusing on the training samples that are hard to classify, the performance of the ensemble as a whole is improved<sup>2</sup>.\n",
        "\n",
        "Compared to bagging models, boosting can reduce both bias and variance, but algorithms such as AdaBoost are also known to overfit the training data (high variance)<sup>2</sup>.\n",
        "\n",
        "---\n",
        "1. Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn and TensorFlow. O'Reilly Media, 2017.\n",
        "2. Raschka, Sebastian, and Vahid Mirjalili. Python Machine Learning, 2nd Ed. Packt Publishing, 2017."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "EDergkH6bNv6",
        "colab_type": "text"
      },
      "source": [
        "## AdaBoost\n",
        "\n",
        "In AdaBoost<sup>1</sup>, a first base classifier is trained and makes predictions on the training set. The weights of the misclassified training instances are then increased, a second classifier is trained with these new weights and makes predictions, and so on until all predictors are trained. The ensemble then makes predictions as in bagging, except that each predictor is weighted according to its accuracy on the weighted training set<sup>2</sup>.\n",
        "\n",
        "**NOTES**\n",
        "- SVMs are generally not good predictors for AdaBoost, as they are slow and unstable in this context<sup>2</sup>.\n",
        "- Boosting in this way tends to be slower than bagging, because each tree depends on its predecessor and so the trees must be grown sequentially rather than in parallel. I have therefore reduced the number of estimators for this example.\n",
        "\n",
        "---\n",
        "1. Freund, Y., & Schapire, R. E. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of computer and system sciences, 55(1), 119-139.\n",
        "2. Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn and TensorFlow. O'Reilly Media, 2017."
      ]
    },
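    {
      "cell_type": "markdown",
      "metadata": {
        "id": "adaboost_update_sketch",
        "colab_type": "text"
      },
      "source": [
        "The weighting scheme above can be sketched concretely (a sketch, following the notation in Géron (2017)). Each predictor $j$ is assigned a weight $\\alpha_j$ based on its weighted error rate $r_j$ on the training set,\n",
        "\n",
        "$$\\alpha_j = \\eta \\log \\frac{1 - r_j}{r_j},$$\n",
        "\n",
        "where $\\eta$ is the learning rate. Before the next predictor is trained, the weight of every misclassified instance $i$ is boosted,\n",
        "\n",
        "$$w^{(i)} \\leftarrow w^{(i)} \\exp(\\alpha_j) \\quad \\text{if instance } i \\text{ was misclassified,}$$\n",
        "\n",
        "and left unchanged otherwise, after which all instance weights are renormalized to sum to 1."
      ]
    },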
    {
      "cell_type": "code",
      "metadata": {
        "id": "5zJLkLiebOky",
        "colab_type": "code",
        "outputId": "0ab826b1-9dbe-484f-aa65-98a6710bb2ee",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 213
        }
      },
      "source": [
        "%%time\n",
        "from sklearn.tree import DecisionTreeClassifier\n",
        "from sklearn.ensemble import AdaBoostClassifier\n",
        "\n",
        "# a depth-1 decision tree (a 'decision stump') is used as the weak learner\n",
        "tree = DecisionTreeClassifier(criterion='gini',\n",
        "                              max_depth=1,\n",
        "                              random_state=RANDOM_STATE)\n",
        "\n",
        "ada = AdaBoostClassifier(base_estimator=tree,\n",
        "                         n_estimators=100, \n",
        "                         learning_rate=0.1,\n",
        "                         random_state=RANDOM_STATE)\n",
        "\n",
        "ada.fit(X_train, y_train)\n",
        "\n",
        "y_pred = ada.predict(X_val)\n",
        "\n",
        "display(pd.DataFrame(classification_report(y_val, y_pred , output_dict = True)))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.928571</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>0.964286</td>\n",
              "      <td>0.996719</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.996296</td>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>0.998148</td>\n",
              "      <td>0.996466</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.998145</td>\n",
              "      <td>0.962963</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>0.980554</td>\n",
              "      <td>0.996529</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    1.000000   0.928571  0.996466    0.964286      0.996719\n",
              "recall       0.996296   1.000000  0.996466    0.998148      0.996466\n",
              "f1-score     0.998145   0.962963  0.996466    0.980554      0.996529\n",
              "support    270.000000  13.000000  0.996466  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "stream",
          "text": [
            "CPU times: user 29.2 s, sys: 9.28 ms, total: 29.2 s\n",
            "Wall time: 29.3 s\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "fN351ZFgcQCk",
        "colab_type": "text"
      },
      "source": [
        "## Gradient Boosting\n",
        "Gradient Boosting works similarly to AdaBoost in that it sequentially adds predictors to an ensemble, each correcting its predecessor. However, instead of reweighting the training instances, it fits each new model to the residual errors of the previous one<sup>1</sup>.\n",
        "\n",
        "Let's run through a quick example given in Géron (2017)<sup>1</sup>.\n",
        "\n",
        "---\n",
        "1. Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn and TensorFlow. O'Reilly Media, 2017."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "Htzo4xq8cT4X",
        "colab_type": "code",
        "outputId": "9e7972f1-cecd-4dc5-eab1-5a164ab64991",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 175
        }
      },
      "source": [
        "from sklearn.tree import DecisionTreeClassifier\n",
        "from sklearn.metrics import classification_report\n",
        "\n",
        "# Géron's example uses regression trees on continuous residuals; here the\n",
        "# target is 0/1, so classification trees are fit to the integer residuals\n",
        "tree_reg1 = DecisionTreeClassifier(max_depth=2, random_state=RANDOM_STATE)\n",
        "tree_reg1.fit(X_train, y_train)\n",
        "\n",
        "y_train2 = y_train - tree_reg1.predict(X_train)\n",
        "tree_reg2 = DecisionTreeClassifier(max_depth=2, random_state=RANDOM_STATE)\n",
        "tree_reg2.fit(X_train, y_train2)\n",
        "\n",
        "y_train3 = y_train2 - tree_reg2.predict(X_train)\n",
        "tree_reg3 = DecisionTreeClassifier(max_depth=2, random_state=RANDOM_STATE)\n",
        "tree_reg3.fit(X_train, y_train3)\n",
        "\n",
        "y_pred = sum(tree.predict(X_val) for tree in (tree_reg1, tree_reg2, tree_reg3))\n",
        "\n",
        "pd.DataFrame(classification_report(y_val, y_pred , output_dict =True))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>0.996269</td>\n",
              "      <td>0.800000</td>\n",
              "      <td>0.985866</td>\n",
              "      <td>0.898134</td>\n",
              "      <td>0.987253</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.988889</td>\n",
              "      <td>0.923077</td>\n",
              "      <td>0.985866</td>\n",
              "      <td>0.955983</td>\n",
              "      <td>0.985866</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.992565</td>\n",
              "      <td>0.857143</td>\n",
              "      <td>0.985866</td>\n",
              "      <td>0.924854</td>\n",
              "      <td>0.986344</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.985866</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    0.996269   0.800000  0.985866    0.898134      0.987253\n",
              "recall       0.988889   0.923077  0.985866    0.955983      0.985866\n",
              "f1-score     0.992565   0.857143  0.985866    0.924854      0.986344\n",
              "support    270.000000  13.000000  0.985866  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 26
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "7-Vyk3cceaF7",
        "colab_type": "text"
      },
      "source": [
        "A simpler way is to use the GradientBoostingClassifier. The learning_rate hyperparameter scales the contribution of each tree: if it is set low, more trees are needed, but the ensemble will typically generalize better (a regularization technique known as shrinkage)<sup>1</sup>.\n",
        "\n",
        "---\n",
        "1. Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn and TensorFlow. O'Reilly Media, 2017."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "QYs9gS24eavV",
        "colab_type": "code",
        "outputId": "1d447140-71c2-43cb-8f3a-024c716954d5",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 175
        }
      },
      "source": [
        "from sklearn.ensemble import GradientBoostingClassifier\n",
        "\n",
        "GBC = GradientBoostingClassifier(max_depth = 2, n_estimators=3, \n",
        "                                 learning_rate=1.0, random_state=RANDOM_STATE)\n",
        "GBC.fit(X_train, y_train)\n",
        "y_pred = GBC.predict(X_val)\n",
        "pd.DataFrame(classification_report(y_val, y_pred , output_dict = True))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>0.996269</td>\n",
              "      <td>0.800000</td>\n",
              "      <td>0.985866</td>\n",
              "      <td>0.898134</td>\n",
              "      <td>0.987253</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.988889</td>\n",
              "      <td>0.923077</td>\n",
              "      <td>0.985866</td>\n",
              "      <td>0.955983</td>\n",
              "      <td>0.985866</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.992565</td>\n",
              "      <td>0.857143</td>\n",
              "      <td>0.985866</td>\n",
              "      <td>0.924854</td>\n",
              "      <td>0.986344</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.985866</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    0.996269   0.800000  0.985866    0.898134      0.987253\n",
              "recall       0.988889   0.923077  0.985866    0.955983      0.985866\n",
              "f1-score     0.992565   0.857143  0.985866    0.924854      0.986344\n",
              "support    270.000000  13.000000  0.985866  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 27
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "NUwBkBtEelyy",
        "colab_type": "text"
      },
      "source": [
        "Early stopping can be used to find the optimal number of trees using the staged_predict() method, which returns an iterator over the predictions made by the ensemble at each stage of training<sup>1</sup>.\n",
        "\n",
        "---\n",
        "1. Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn and TensorFlow. O'Reilly Media, 2017."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "3VOWWk07eqPy",
        "colab_type": "code",
        "outputId": "2f1c1187-ff83-47d6-d8b0-be390473c791",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 207
        }
      },
      "source": [
        "from sklearn.metrics import accuracy_score\n",
        "\n",
        "GBC = GradientBoostingClassifier(max_depth=2, n_estimators=30,\n",
        "                                 random_state=RANDOM_STATE)\n",
        "GBC.fit(X_train, y_train)\n",
        "\n",
        "scores = [accuracy_score(y_val, y_pred)\n",
        "          for y_pred in GBC.staged_predict(X_val)]\n",
        "# staged_predict() yields predictions after 1, 2, ... trees, so add 1\n",
        "# to the index of the best stage to get the number of estimators\n",
        "bst_n_estimators = np.argmax(scores) + 1\n",
        "\n",
        "GBC_best = GradientBoostingClassifier(max_depth = 2, \n",
        "                                      n_estimators = bst_n_estimators,\n",
        "                                      random_state = RANDOM_STATE)\n",
        "GBC_best.fit(X_train, y_train)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "GradientBoostingClassifier(ccp_alpha=0.0, criterion='friedman_mse', init=None,\n",
              "                           learning_rate=0.1, loss='deviance', max_depth=2,\n",
              "                           max_features=None, max_leaf_nodes=None,\n",
              "                           min_impurity_decrease=0.0, min_impurity_split=None,\n",
              "                           min_samples_leaf=1, min_samples_split=2,\n",
              "                           min_weight_fraction_leaf=0.0, n_estimators=9,\n",
              "                           n_iter_no_change=None, presort='deprecated',\n",
              "                           random_state=0, subsample=1.0, tol=0.0001,\n",
              "                           validation_fraction=0.1, verbose=0,\n",
              "                           warm_start=False)"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 28
        }
      ]
    },
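    {
      "cell_type": "markdown",
      "metadata": {
        "id": "warm_start_sketch_text",
        "colab_type": "text"
      },
      "source": [
        "Rather than training the full ensemble and picking the best stage afterwards, training can also be stopped as soon as the validation score stops improving. The sketch below is adapted from the regression example in Géron (2017), with the validation error replaced by validation accuracy. Setting warm_start=True makes fit() keep the existing trees when called again, so trees can be added one at a time; here training stops once the validation score has not improved for five consecutive trees."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "warm_start_sketch_code",
        "colab_type": "code"
      },
      "source": [
        "GBC_inc = GradientBoostingClassifier(max_depth=2, warm_start=True,\n",
        "                                     random_state=RANDOM_STATE)\n",
        "\n",
        "best_val_score = 0\n",
        "rounds_without_improvement = 0\n",
        "for n_estimators in range(1, 31):\n",
        "    GBC_inc.n_estimators = n_estimators\n",
        "    GBC_inc.fit(X_train, y_train)  # adds one tree, keeps the rest\n",
        "    val_score = accuracy_score(y_val, GBC_inc.predict(X_val))\n",
        "    if val_score > best_val_score:\n",
        "        best_val_score = val_score\n",
        "        rounds_without_improvement = 0\n",
        "    else:\n",
        "        rounds_without_improvement += 1\n",
        "        if rounds_without_improvement == 5:\n",
        "            break  # early stopping\n",
        "\n",
        "print(GBC_inc.n_estimators, best_val_score)"
      ],
      "execution_count": 0,
      "outputs": []
    },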
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "2BAiwc7De_Iz",
        "colab_type": "text"
      },
      "source": [
        "**Notes**\n",
        "- Adapted from https://github.com/ageron/handson-ml/blob/master/07_ensemble_learning_and_random_forests.ipynb"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "ULoTBxvje6Pq",
        "colab_type": "code",
        "outputId": "f3a00446-8748-4476-a2df-748f5394a10c",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 296
        }
      },
      "source": [
        "worst_score = np.min(scores)\n",
        "best_score = np.max(scores)\n",
        "\n",
        "plt.plot(scores, \"b.-\")\n",
        "plt.plot([0, 30], [best_score, best_score], \"k--\")\n",
        "plt.plot(bst_n_estimators, best_score, \"ko\")\n",
        "plt.plot([bst_n_estimators, bst_n_estimators], [worst_score, best_score], \"k--\")\n",
        "plt.text(bst_n_estimators+1.5, best_score-0.005, \"Best\", ha=\"center\", fontsize=14)\n",
        "plt.xlabel(\"Number of trees\")\n",
        "plt.title(\"Accuracy\", fontsize=14)\n",
        "\n",
        "plt.show()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYAAAAEXCAYAAACkpJNEAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjMsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+AADFEAAAgAElEQVR4nO3de7xVZb3v8c+XhSAIXhdejhDo9spJ\nJUVfYRcQzdStoWKCqUm5wzI6mrqPUdYxxSjFnbtzwiOaokgBhzLBjaAQ2E4owc3CG0KEGiAo6AJF\nCAV+548xFk7WhTWByRrz8n2/XvM15xzPM8b4PWvC/M3nGWM8QxGBmZlVnlZZB2BmZtlwAjAzq1BO\nAGZmFcoJwMysQjkBmJlVKCcAM7MK5QRgZlahnACs7Eg6WdIWSc9mHYtZMXMCsHL0L8BI4JOSjs8y\nEEltsty/2Y44AVhZkdQO+AowCpgIXF2v/L9JGivpHUkbJNVIOiOn/DxJf5G0Ma0zWdLeadnrkm6q\nt71Zkv5PzvvXJd0q6UFJa4Gx6fKfSlqUbvd1SXfWbbe5fUv6kaSXGmnrs5J+sdt/NKtYTgBWbi4B\n3oiIF4ExwFcl7QUgaR/gGaAbcCFwAnBb3YqSzgEmAU8DpwBnpPV39v/JDcCrQE/g++myD4CvA8cD\n1wIDgR/kue8HgeMknZZT/1jgdOBXOxmb2TbyXEBWTiTNAp6IiBGSBLwG3BQREyV9A/g34IiIWNPI\nus8CyyJiYBPbfh34PxExot7+XoqIITl1XoyIC5qJ85tpXEflue8ngOUR8c30/c+AMyOi5472Y7Yj\n7gFY2ZB0FPBZ4NcAkfy6GcvHw0CfAl5o7Ms/p3xGAUKZ10hsl0j6k6RVktYDPwc+sRP7vh8YKKmd\npCrgSvzr33ZT66wDMCugfwGqgL8nP/4BEICkLgXY/ta67eXYq5F6H+S+kfRpYBzwY+C7wFrgS8CI\nhqs26T+ADUB/YB2wP2miM9tV7gFYWZDUGrgKGAr0yHmcBLwAfA2YD5woqbqJzcwHztzBblYDh+Xs\nc2/guDzC+wywIiJuj4i5EfFXoOvO7DsiNgOjSY4jfB34XUSsy2PfZk1yD8DKxT8D1cD9EfFOboGk\nccA3gU8C3wMel/Q9YEW67P2ImAncAUyWtITk17WAs4H7ImID8Afg65ImkSSDH5Df/6HFwOGSLgfm\nAF8ELqtXp7l9AzwA3EzSEzk7r7+K2Q64B2Dl4mpgZv0v/9T/Iznz5zNAb2A5MBl4iWRYJgAiYgpw\nEXAuyS/yZ0jOxtmabmc4SRJ4HHgK+FNab4ciYjJwF3APSW/kC8CP6tVpbt9ExNJ0+d+BWc3t16w5\nPgvIrIRIegUYGxF3ZB2LlT4PAZmVAEmdSK5x6Abcl200Vi6cAMxKw9vAGuCaHZzGarZTPARkZlah\nfBDYzKxCldQQUHV1dXTr1i3rMMzMSsrzzz+/JiI61V9eUgmgW7duzJvX4Cp7MzPbAUlvNLbcQ0Bm\nZhXKCcDMrEI5AZiZVSgnADOzCuUEYGZWoZwArGDGjh1Lt27daNWqFd26dWPs2LFZh2RmO1B16623\nZh1D3kaNGnXr4MGDsw7DGjF27FgGDx7MmjXJLAXr1q1j6tSpdOvWjRNPPLFB/Tlz4NFHoXVr6NLM\nrVryrVsu9UohRv9tWq5eIfz4xz9eeeutt46qv7ykrgPYHX369Gmw7NJLL+Xaa69lw4YNnHfeeQ3K\nBw0axKBBg1izZg2XXHJJg/JvfetbDBgwgGXLlnHllVc2KL/xxhu54IILWLRoEddcc02D8ltuuYWz\nzjqLmpoarr/++gblP/nJTzj99NOZPXs23//+9xuU33PPPfTo0YPp06czbNiwBuX33Xcfxx57LJMn\nT+buu+9uUD5mzBi6dOnC+PHj
uffeexuUT5w4kerqakaPHs3o0aMblE+ZMoX27dszcuRIbrjhBjZt\n2rRd+YYNG/j2t3/A6NEr+etfn9i2fNMmeOutdsCTVFXBCSfczrvvbn83xHbtDuLLX/4ty5fD6NFD\niZgDwKGHQtu20LFjZy666FEApk27nr//vYZVqz5e/5hjjmHAgOTf+xNPDObddxdv23dSrwetW9/D\nlVfC3LlX8P77y7fb/7779mLhwuFs2QLQn0MOeYe2bT8u79btTD7/+R+m8Z1LxMbt4jv66PPp1esm\nAB55pE/OfhOf/vSlfOEL1/LRRxv4zW+2/7e3aRO8/fYgIgbRqtUaOnW6ZLt9A5xyyrfYb78BPPLI\nMrZsuXK7fSfbv5FjjrmANWsW8dhj12y370MPhTPPvIUjjzyLVatqeOqp6+v9baCq6id89aunEzGb\nmTO3/7e3aROsXn0PW7f2QJrOwQcPaxDfeefdxz/+cSwPPzyZrVvvbhBfv35j2G+/Lrz88nhmz763\nQXyXXz6R9u2rWbBgNAsWjN5u30ndKbRu3Z6ePUeycuUE6uvbdxZjxsDmzSOAJ7bbd+vW7fjKV54E\n4Iknbmf+/I//7R16KOy/f/JvD2DGjKGsWDFnu7+N1JlBgx6lc+fk395bb9VsF99bbx0DjKKqCo48\ncjCbNi3eLrZDDunBf//v96TxXUGrVss56STYd9+kvFevXgwfPhyA/v3788477zBr1qwGbSyEikkA\ntmfV//Kvs27d35k+ven1Nm+G+U3MqD9sGNSfqir3i+Lll5PnxqazWrw4Wb+p8rp9P/RQ07E1tV+A\nN96AP/6x6fjeeANmzGh6/3/+M/zlL03HVmfLlob7rtv+jmJ84w2QGt/+qlUwdmzT5XX7zedvE9F4\nfI38ntiu3i9+seP47r57x/FB8vn9+c+Nl9WPvX6MTf3bWLUqeezo305Esv184lu8uOHyN96A5577\n+P3WrbB27ccJoEVFRMk8TjnllLDi1LVr1yC5scp2j65duzaoO3t2RLt2EVVVyfPs2U1vN9+65VKv\nFGL036Zl21wIwLxo5Ds1r9lAJZ0D/DvJDbcfiIif1ivvCjwIdALeBa6IiOVp2Z0kt+trBTwNXBcR\nIWkWyf1VN6abOTsi3t5RHD179gxPBVGcxo4dy6BBg9m8ecO2Ze3bt2fUqFFcfvnlDerPmQOzZkGf\nPtCr1463nW/dcqlXCjH6b9Ny9QpB0vMR0bNBQWNZIfdB8qX/N+BIoA2wAOher87/A65KX/cFxqSv\nTweeTbdRRXI/1D5p2SygZ3P7z324B1C83nsvol27R6NVq7bbfvk/+uijWYdlZtF0DyCfYwCnAUsi\nuR9p3Q22+wGv5NTpDtyQvp4J/L4uvwB7p4lDwF7AW3ns00rMAw/Axo2X86lP3c+++7LHDlqZWeHk\ncx3A4cCynPfL02W5FgAXp68vAjpKOiiSUzdmAivTx7SIWJiz3kOSaiT9UJJ2qQWWuY8+gn/7N+jd\nO6MDWWa2Swp1IdhNQG9J84HewApgi6SjgOOBziRJo6+kz6XrXB4RJwCfSx8Nz6MEJA2WNE/SvNWr\nVxcoXCukceNg+XK4+easIzGznZHPENAKIPcyhc7psm0i4k3SHoCkDkD/iFgr6RvAnyNifVr2JNAL\n+M+IWJGu+76kX5MMNT1Sf+cRMQoYBclB4J1rnu1pEXDnnfDJT8I558CRR/p+5WalIp8ewFzgaElH\nSGoDDAQm5VaQVC2pbltDSc4IAvg7Sc+gtaS9SHoHC9P31em6ewHnAy/tfnOspT35JLz0EvzP/5mc\nF33sscdy7LHHZh2WmeWh2QQQEZuBIcA0YCEwISJelnSbpC+l1foAiyQtBg4B7kiXTyQ5g+hFkuME\nCyJiMtAWmCbpBaCGpEdxf8FaZS3mzjuTy9gHDkzeT548mcmTJ2cblJnlJa/rAIqFrwMoLn/5C3
z6\n0/Dzn0PdTBZ1U274LCCz4tHUdQCeDdR22Z13wgEHwL/8S9aRmNmucAKwXbJ4MTz2GFx7LXTokHU0\nZrYrnABsl9x9N7RpA9/5TtaRmNmucgKwnbZqFTz8MHzta3DIIVlHY2a7ytNB2077xS/gww/hxhsb\nlo0ZM6blAzKzXeIEYDvl/fdh5Ejo3x+OOqpheZc9fWsjMysYDwHZTrn/fli3LrnwqzHjx49n/Pjx\nLRuUme0S9wAsbx9+mJzzf8YZcOqpjdepu7XkgAEDWjAyM9sVTgCWt9/8Jpn07X5fs21WFjwEZHl5\n9tlk2OfII+GLX8w6GjMrBCcAa9acOdC3L7z9dtIDaOpG3GZWWpwArFmzZiXj/wBbtiTvmzNo0CAk\nbXtUV1dz/vnn8+qrrxYkptdffx1JeG4os13nBGDNqq5Onlu1Sq7+Ted7a9TEiROZOHEiAGeddRYr\nV65k5cqVPPXUU2zcuJGLLrpozwdsZnlxArBmPfkkdOwIP/whzJgBvXo1Xbe6uprqNGO0bduWQw89\nlEMPPZSTTz6Z7373u7z66qts3LgRgBUrVjBw4EAOOOAADjjgAP75n/+Zv/71r9u2tWzZMvr168eB\nBx5I+/btOe644xg3bhwARxxxBACnnnoqkrbNQmpm+fNZQLZDixbB738PP/gB3Hpr8/VHjx7d6PL3\n33+f8ePHc8IJJ9CuXTs2bNjAGWecwemnn84zzzxDmzZtGDFiBGeddRYLFy6kffv2XHvttfzjH/9g\n5syZ7LvvvixatGjb9p577jlOO+00pk6dykknnUSbNm0K02CzCuIEYDt0993Qtm3+k77VJYBu3box\ndepUOqRThX7wwQd06dKFKVOmADBu3DgigoceeghJANx3330cfPDBPPHEE1x66aW88cYb9O/fn5NO\nOgn4+Fc/QKdOnQA46KCDOPTQQwvRVLOK4yEga9LKlR9P+nbwwTu//uc//3lqamqoqanhueee48wz\nz+Tss89m2bJlPP/887z22mt07NiRDh060KFDB/bbbz9qa2v529/+BsB1113HsGHD6NWrF7fccgvP\nP/98gVtoVtncA7Am/eIXsHkz3HDDrq3fvn17jsqZMOiBBx5gv/32Y9SoUWzdupUePXpsG9PPdeCB\nBwJw9dVX88UvfpEpU6Ywffp0Tj/9dIYOHcqt+YxFmVmz3AOwRr33Htx7b9OTvu0KSbRq1YoNGzZw\n8skns2TJEqqrqznqqKO2e9QlAIDOnTszePBgJkyYwG233caoUaMAto35b9mypTDBmVUgJwBrVHOT\nvuVj06ZNrFq1ilWrVrFw4UK+853vsH79ei644AIuv/xyDjnkEPr168czzzzDa6+9xh//+EduvPHG\nbWcCXXfddUydOpWlS5dSU1PD1KlT6d69OwAHH3ww7dq1Y9q0abz11lusW7euEM02qywRUTKPU045\nJWzP27Qp4vDDI/r23fl1P/jgg/jggw/iqquuCmDbo2PHjnHqqafGxIkTt9VdtWpVDBo0KDp16hRt\n2rSJbt26xde+9rVYvXp1REQMGTIkjjrqqGjbtm1UV1fHgAEDYvny5dvWv//++6NLly7RqlWr6N27\n9+4226xsAfOike9UJWWloWfPnuErP/e80aOTA79Tp3reH7NyIOn5iOhZf7mHgGw7W7fCXXfBSSfB\n2Wfv/PojR45k5MiRhQ/MzAourwQg6RxJiyQtkfS9Rsq7Spoh6QVJsyR1zim7U9LLkhZK+oXSk74l\nnSLpxXSb25ZbtqZMgVdeScb+d+UTmTBhAhMmTCh8YGZWcM0mAElVwC+Bc4HuwGWSuterNgJ4JCJO\nBG4Dhqfrng58BjgR+CRwKtA7Xede4BvA0enjnN1tjO2+n/0MunaFL38560jMbE/LpwdwGrAkIpZG\nxIfAOKBfvTrdgT+kr2fmlAewN9AGaAvsBbwl6TBg34j4c3
qA4hHgwt1qie222bPhT39Kzvvfa6+s\nozGzPS2fBHA4sCzn/fJ0Wa4FwMXp64uAjpIOiog5JAlhZfqYFhEL0/WXN7NNACQNljRP0rzVq1fn\nEa7tqrvuggMPhKuvzjoSM2sJhToIfBPQW9J8kiGeFcAWSUcBxwOdSb7g+0r63M5sOCJGRUTPiOhZ\nN/+LFd6rr8Ljj8OQIbDPPllHY2YtIZ+pIFYAXXLed06XbRMRb5L2ACR1APpHxFpJ3wD+HBHr07In\ngV7AmHQ7TW7TWtaIEcmkb0OG7N52ZuVztxgzKwr59ADmAkdLOkJSG2AgMCm3gqRqSXXbGgo8mL7+\nO0nPoLWkvUh6BwsjYiXwnqRPp2f/fBV4vADtsV3w5pswZgx8/evgTpZZ5Wg2AUTEZmAIMA1YCEyI\niJcl3SbpS2m1PsAiSYuBQ4A70uUTgb8BL5IcJ1gQEZPTsmuBB4AlaZ0nC9Ii22l1k77deOPub2vE\niBGMGDFi9zdkZnucrwQuEnPmJPfa7dNnx3fcKnS9p5+GCy6Az34Wpk/flci3V3dnLg8FmRWPpq4E\n9nTQReDZZ5Mv6i1boKoK+vWDxu5xsmpVcqC2kPUeeyy5+vdPf0qSxo6ShZmVFyeAIvC73yVDMJA8\nP/kktGvXsN7GjYWvt3Xrx/VmzXICMKsknguoCBx3XPLcqlXyRT19OqxZ0/AxfXpSXlVV+Hpt2iS9\nEDOrHO4BFIHO6QmxgwfDV7/a9K/wXr1gxozmx/YLXW9ntGusq2FmRckHgYvAr38Nl18OCxd+3Bsw\nMysUTwddxGprk+cDDsg2DjOrLE4ARaAuAey/f7ZxFMLtt9/O7bffnnUYZpYHJ4AiUFubHIxt2zbr\nSHbfjBkzmDFjRtZhmFkenACKQG2th3/MrOU5ARSBtWudAMys5TkBFAH3AMwsC74OoAjU1sInPpF1\nFIVx0EEHZR2CmeXJCaAI1NbCSSdlHUVh/Pa3v806BDPLk4eAioCHgMwsC04AGdu8Gd5/v3wSwNCh\nQxk6dGjWYZhZHjwElLG1a5PncrgIDGDOnDlZh2BmeXIPIGOeBsLMsuIEkDEnADPLihNAxuqGgJwA\nzKyl+RhAxsqtB9C57uYGZlb0nAAyVm4J4NFHH806BDPLk4eAMlZuCcDMSkdeCUDSOZIWSVoi6XuN\nlHeVNEPSC5JmSeqcLj9DUk3O4x+SLkzLRkt6LaesR2GbVhpqa5NpoMvlTorXX389119/fdZhmFke\nmh0CklQF/BL4ArAcmCtpUkS8klNtBPBIRDwsqS8wHLgyImYCPdLtHAgsAZ7KWe9fI2JiYZpSmsrt\nKuCampqsQzCzPOXTAzgNWBIRSyPiQ2Ac0K9ene7AH9LXMxspB7gEeDIiNuxqsOWo3BKAmZWOfBLA\n4cCynPfL02W5FgAXp68vAjpKqj8t5EDgN/WW3ZEOG/1cUqP3w5I0WNI8SfNWr16dR7ilpba2fK4C\nNrPSUqiDwDcBvSXNB3oDK4AtdYWSDgNOAKblrDMUOA44FTgQuLmxDUfEqIjoGRE9O3XqVKBwi4d7\nAGaWlXxOA10BdMl53zldtk1EvEnaA5DUAegfEWtzqlwKPBYRH+WsszJ9uUnSQyRJpOKsXQvdu2cd\nReEcc8wxWYdgZnnKJwHMBY6WdATJF/9A4Cu5FSRVA+9GxFaSX/YP1tvGZeny3HUOi4iVkgRcCLy0\na00obeXWAxg1alTWIZhZnpodAoqIzcAQkuGbhcCEiHhZ0m2SvpRW6wMskrQYOAS4o259Sd1IehDP\n1Nv0WEkvAi8C1cCw3WpJCdq6FdatK68EYGalI68rgSNiCjCl3rIf5byeCDR6OmdEvE7Dg8ZERN+d\nCbQcrVsHEeWVAAYPHgy4J2BWCjwVRIbK8SrgxYsXZx2CmeXJU0FkqBwTgJmVDieADDkBmFmWnAAy\n5ARgZlnyMYAM1SWAcr
oSuEePipzTz6wkOQFkqBx7APfcc0/WIZhZnjwElKG1a6F1a9hnn6wjMbNK\n5ASQobqrgKWsIymcK664giuuuCLrMMwsDx4CylC5TQMBsHz58qxDMLM8uQeQoXJMAGZWOpwAMuQE\nYGZZcgLIkBOAmWXJxwAyVI4JoFevXlmHYGZ5cgLIyNatyWmg5ZYAhg8fnnUIZpYnDwFl5P33kyRQ\nTlcBm1lpcQLIyNr0hpnl1gPo378//fv3zzoMM8uDh4AyUo7TQAC88847WYdgZnlyDyAj5ZoAzKx0\nOAFkxAnAzLLmBJARJwAzy5qPAWSkXBPAmWeemXUIZpYnJ4CM1NZCVRV07Jh1JIX1wx/+MOsQzCxP\neQ0BSTpH0iJJSyR9r5HyrpJmSHpB0ixJndPlZ0iqyXn8Q9KFadkRkv6SbnO8pDaFbVpxq61NrgEo\np6mgzay0NJsAJFUBvwTOBboDl0nqXq/aCOCRiDgRuA0YDhARMyOiR0T0APoCG4Cn0nV+Bvw8Io4C\naoGrC9CeklGXAMrNueeey7nnnpt1GGaWh3x6AKcBSyJiaUR8CIwD+tWr0x34Q/p6ZiPlAJcAT0bE\nBkkiSQgT07KHgQt3NvhSVo7TQABs3LiRjRs3Zh2GmeUhnwRwOLAs5/3ydFmuBcDF6euLgI6SDqpX\nZyDwm/T1QcDaiNi8g20CIGmwpHmS5q1evTqPcEtDOU4EZ2alpVCngd4E9JY0H+gNrAC21BVKOgw4\nAZi2sxuOiFER0TMienbq1KlA4WbPCcDMspbPWUArgC457zuny7aJiDdJewCSOgD9I2JtTpVLgcci\n4qP0/TvA/pJap72ABtssd04AZpa1fBLAXOBoSUeQfEkPBL6SW0FSNfBuRGwFhgIP1tvGZelyACIi\nJM0kOS4wDrgKeHxXG1FqIso3AZx//vlZh2BmeWo2AUTEZklDSIZvqoAHI+JlSbcB8yJiEtAHGC4p\ngD8C365bX1I3kh7EM/U2fTMwTtIwYD7wq91uTYn44APYvLk8E8BNN92UdQhmlqe8LgSLiCnAlHrL\nfpTzeiIfn9FTf93XaeQAb0QsJTnDqOKU61XAZlZaPBdQBso5AfTp04c+ffpkHYaZ5cEJIAPlnADM\nrHQ4AWSg7m5g5XglsJmVDieADLgHYGbFwAkgA04AZlYMPB10Bmprk1lA99sv60gK79JLL806BDPL\nkxNABmprky//VmXY/7r22muzDsHM8lSGX0HFr1yvAgbYsGEDGzZsyDoMM8uDewAZKOcEcN555wEw\na9asbAMxs2a5B5CBck4AZlY6nAAy4ARgZsXACSAD5Xo3MDMrLU4AGSjX+wGbWWnxQeAWtnEjbNpU\nvj2AQYMGZR2CmeXJCaCFlftVwE4AZqXDQ0AtrNwTwJo1a1izZk3WYZhZHtwDaGHlngAuueQSwNcB\nmJUC9wBaWLknADMrHU4ALcwJwMyKhRNAC3MCMLNi4QTQwuruBlaOU0GbWWnxQeAWVlsLHTtC6zL9\ny3/rW9/KOgQzy1NeX0OSzgH+HagCHoiIn9Yr7wo8CHQC3gWuiIjladkngAeALkAA50XE65JGA72B\ndelmBkVEzW63qMiV+zxAAwYMyDoEM8tTs0NAkqqAXwLnAt2ByyR1r1dtBPBIRJwI3AYMzyl7BLgr\nIo4HTgPezin714jokT7K/ssfyj8BLFu2jGXLlmUdhpnlIZ8ewGnAkohYCiBpHNAPeCWnTnfghvT1\nTOD3ad3uQOuIeBogItYXKO6SVe4J4MorrwR8HYBZKcjnIPDhQO5PuuXpslwLgIvT1xcBHSUdBBwD\nrJX0O0nzJd2V9ijq3CHpBUk/l9S2sZ1LGixpnqR5q1evzqtRxazcE4CZlY5CnQV0E9Bb0nyScf0V\nwBaSHsbn0vJTgSOBQek6Q4Hj0uUHAjc3tuGIGBURPSOiZ6dOnQoUbnacAMysWOSTAFaQ
HMCt0zld\ntk1EvBkRF0fEp4AfpMvWkvQWaiJiaURsJhkaOjktXxmJTcBDJENNZc8JwMyKRT4JYC5wtKQjJLUB\nBgKTcitIqpZUt62hJGcE1a27v6S6n+59SY8dSDosfRZwIfDS7jSkFGzalEwH7QRgZsWg2YPAEbFZ\n0hBgGslpoA9GxMuSbgPmRcQkoA8wXFIAfwS+na67RdJNwIz0i/554P5002PTxCCgBvhmYZtWfOou\nAivnBHDjjTdmHYKZ5Smv6wAiYgowpd6yH+W8nghMbGLdp4ETG1ned6ciLQOVMA3EBRdckHUIZpYn\nTwXRguoSQDnfDnLRokUsWrQo6zDMLA9lOiFBcaqEHsA111wD+DoAs1LgHkALqoQEYGalwwmgBTkB\nmFkxcQJoQU4AZlZMnABaUG0t7LMP7LVX1pGYmfkgcIuqhKuAb7nllqxDMLM8OQG0oLVryz8BnHXW\nWVmHYGZ58hBQC6qEHkBNTQ01NRVxawezkuceQAuqrYUjjsg6ij3r+uuvB3wdgFkpcA+gBdXWlvdV\nwGZWWpwAWlAlDAGZWelwAmghH30E69c7AZhZ8XACaCGVMBW0mZUWHwRuIZVyFfBPfvKTrEMwszw5\nAbSQSkkAp59+etYhmFmePATUQiplCGj27NnMnj076zDMLA/uAbSQSukBfP/73wd8HYBZKXAPoIVU\nSgIws9LhBNBCKuF2kGZWWpwAWkhtLey9d/IwMysGTgAtxFcBm1mxyesgsKRzgH8HqoAHIuKn9cq7\nAg8CnYB3gSsiYnla9gngAaALEMB5EfG6pCOAccBBwPPAlRHxYUFaVYQqJQHcc889WYdgZnlqtgcg\nqQr4JXAu0B24TFL3etVGAI9ExInAbcDwnLJHgLsi4njgNODtdPnPgJ9HxFFALXD17jSk2FVKAujR\nowc9evTIOgwzy0M+Q0CnAUsiYmn6C30c0K9ene7AH9LXM+vK00TROiKeBoiI9RGxQZKAvsDEdJ2H\ngQt3qyVFrlISwPTp05k+fXrWYZhZHvJJAIcDy3LeL0+X5VoAXJy+vgjoKOkg4BhgraTfSZov6a60\nR3EQsDYiNu9gmwBIGixpnqR5q1evzq9VRahSEsCwYcMYNmxY1mGYWR4KdRD4JqC3pPlAb2AFsIXk\nGMPn0vJTgSOBQTuz4YgYFRE9I6Jnp06dChRuy6uE20GaWWnJJwGsIDmAW6dzumybiHgzIi6OiE8B\nP0iXrSX5ZV+TDh9tBn4PnAy8A+wvqXVT2ywnW7bAunVOAGZWXPJJAHOBoyUdIakNMBCYlFtBUrWk\num0NJTkjqG7d/SXV/XTvC7wSEUFyrOCSdPlVwOO73ozitm5d8uwEYGbFpNkEkP5yHwJMAxYCEyLi\nZUm3SfpSWq0PsEjSYuAQ4NqZdeIAAAtbSURBVI503S0kwz8zJL0ICLg/Xedm4AZJS0iOCfyqYK0q\nMr4K2MyKUV7XAUTEFGBKvWU/ynk9kY/P6Km/7tPAiY0sX0pyhlHZq6R5gO67776sQzCzPHk20BZQ\nSQng2GOPzToEM8uTp4JoAZWUACZPnszkyZOzDsPM8uAeQAuopARw9913A3DBBRdkHImZNcc9gBZQ\nSQnAzEqHE0ALWLsW2rSBdu2yjsTM7GNOAC2gbhoIKetIzMw+5gTQAiplHiAzKy0+CNwCKikBjBkz\nJusQzCxPTgAtoLYWDj446yhaRpcuXZqvZGZFwUNALaCSegDjx49n/PjxWYdhZnlwD6AFVFICuPfe\newEYMGBAxpGYWXPcA9jDtm71vQDMrDg5Aexh770HEU4AZlZ8nAD2sLVrk2cnADMrNk4Ae5ingTCz\nYuWDwHtYpSWAiRMbvS2EmRUhJ4A9rNISQHV1ddYhmFmePAS0h1VaAhg9ejSjR4/OOgwzy4MTwB5W\nafcDdgIwKx1OAHtYbS1UVUGHDllHYma2PSeAPcxT
QZtZsXIC2MMqaRoIMysteSUASedIWiRpiaTv\nNVLeVdIMSS9ImiWpc07ZFkk16WNSzvLRkl7LKetRmCYVF08DYWbFqtnTQCVVAb8EvgAsB+ZKmhQR\nr+RUGwE8EhEPS+oLDAeuTMs2RkRTX+7/GhFlfeJ4pfUApkyZknUIZpanfHoApwFLImJpRHwIjAP6\n1avTHfhD+npmI+UVq9ISQPv27Wnfvn3WYZhZHvJJAIcDy3LeL0+X5VoAXJy+vgjoKOmg9P3ekuZJ\n+rOkC+utd0c6bPRzSW0b27mkwen681avXp1HuMWl0hLAyJEjGTlyZNZhmFkeCnUQ+Cagt6T5QG9g\nBbAlLesaET2BrwD3SPqndPlQ4DjgVOBA4ObGNhwRoyKiZ0T07NSpU4HCbRkRlZcAJkyYwIQJE7IO\nw8zykE8CWAHk3uevc7psm4h4MyIujohPAT9Il61Nn1ekz0uBWcCn0vcrI7EJeIhkqKmsrF8PW7ZU\nVgIws9KRTwKYCxwt6QhJbYCBwKTcCpKqJdVtayjwYLr8gLqhHUnVwGeAV9L3h6XPAi4EXtr95hSX\nSrsK2MxKS7MJICI2A0OAacBCYEJEvCzpNklfSqv1ARZJWgwcAtyRLj8emCdpAcnB4Z/mnD00VtKL\nwItANTCsQG1qYM4cGD48eW7JerNmJc9vv513qGZmLUYRkXUMeevZs2fMmzdvp9aZMwc+97lkKEaC\nrl2hsZNUNmyAN95Ixu0LXa9tW5g5E3r12qnQS1KfPn0AmFWX/cwsc5KeT4/Fbqfsp4OeNSu5Ly8k\nX8YdOsBxxzWs9+qrSfmeqLd5cxJHJSQAf/GblY6yTwB9+sDee8OHH0KbNjBqVONfxHPmwJln7rl6\n6Q9jM7OiUfZDQJB8Gc+alXwJ7+hXeFb1zMz2pKaGgCoiAZiZVbKmEoBnAzUzq1BOAGZmFcoJwMys\nQjkBmJlVKCcAM7MK5QRgZlahSuo0UEmrgTd2cfVqYE0Bw8lSubSlXNoBbkuxKpe27G47ukZEg/n0\nSyoB7A5J8xo7D7YUlUtbyqUd4LYUq3Jpy55qh4eAzMwqlBOAmVmFqqQEMCrrAAqoXNpSLu0At6VY\nlUtb9kg7KuYYgJmZba+SegBmZpbDCcDMrEJVRAKQdI6kRZKWSPpe1vHsKkmvS3pRUo2kkpoXW9KD\nkt6W9FLOsgMlPS3pr+nzAVnGmK8m2nKrpBXpZ1Mj6bwsY8yHpC6SZkp6RdLLkq5Ll5fc57KDtpTi\n57K3pOckLUjb8uN0+RGS/pJ+j42X1Ga391XuxwAkVQGLgS8Ay4G5wGU5N6cvGZJeB3pGRMld2CLp\n88B64JGI+GS67E7g3Yj4aZqYD4iIm7OMMx9NtOVWYH1EjMgytp0h6TDgsIj4L0kdgeeBC4FBlNjn\nsoO2XErpfS4C9omI9ZL2Av4EXAfcAPwuIsZJ+r/Agoi4d3f2VQk9gNOAJRGxNCI+BMYB/TKOqeJE\nxB+Bd+st7gc8nL5+mOQ/bNFroi0lJyJWRsR/pa/fBxYCh1OCn8sO2lJyIrE+fbtX+gigLzAxXV6Q\nz6USEsDhwLKc98sp0X8YJP8InpL0vKTBWQdTAIdExMr09SrgkCyDKYAhkl5Ih4iKftgkl6RuwKeA\nv1Din0u9tkAJfi6SqiTVAG8DTwN/A9ZGxOa0SkG+xyohAZSTz0bEycC5wLfToYiyEMlYZCmPR94L\n/BPQA1gJ3J1tOPmT1AH4LXB9RLyXW1Zqn0sjbSnJzyUitkRED6AzySjGcXtiP5WQAFYAXXLed06X\nlZyIWJE+vw08RvIPo5S9lY7d1o3hvp1xPLssIt5K/9NuBe6nRD6bdIz5t8DYiPhdurgkP5fG2lKq\nn0udiFgLzAR6AftLap0WFeR7rBISwFzg6PQIehtgIDAp45h2mqR90oNbSNoHOBt4acdrFb1JwFXp\n66uAxzOMZbfU
fWGmLqIEPpv0YOOvgIUR8W85RSX3uTTVlhL9XDpJ2j993Y7kBJaFJIngkrRaQT6X\nsj8LCCA99eseoAp4MCLuyDiknSbpSJJf/QCtgV+XUjsk/QboQzKt7VvA/wJ+D0wAPkEyzfelEVH0\nB1ebaEsfkmGGAF4HrskZRy9Kkj4L/CfwIrA1Xfx9krHzkvpcdtCWyyi9z+VEkoO8VSQ/0idExG3p\nd8A44EBgPnBFRGzarX1VQgIwM7OGKmEIyMzMGuEEYGZWoZwAzMwqlBOAmVmFcgIwM6tQTgBWMiSF\npLtz3t+UTsJWiG2PlnRJ8zV3ez9flrRQ0sx6y7tJ+sqe3r9ZLicAKyWbgIslVWcdSK6cqzPzcTXw\njYg4o97ybkCjCWAnt2+WNycAKyWbSe6N+t36BfV/wUtanz73kfSMpMclLZX0U0mXp/Otvyjpn3I2\nc5akeZIWSzo/Xb9K0l2S5qYTil2Ts93/lDQJaDC1uKTL0u2/JOln6bIfAZ8FfiXprnqr/BT4XDpn\n/XclDZI0SdIfgBnp+v+aE8ePc/Z1RdqeGkn3pTFXpX+Tl9I4GvzNzPzLwkrNL4EX0nsJ5Osk4HiS\nKZyXAg9ExGlKbhryHeD6tF43krli/gmYKeko4KvAuog4VVJb4FlJT6X1TwY+GRGv5e5M0n8Dfgac\nAtSSzOB6YXo1Z1/gpoiof0Of76XL6xLPoHT7J0bEu5LOBo5O4xMwKZ0McDUwAPhMRHwkaSRwOfAy\ncHjO/Qr234m/l1UIJwArKRHxnqRHgP8BbMxztbl1l/9L+htQ9wX+IpA7FDMhnTTsr5KWkszAeDZw\nYk7vYj+SL+IPgefqf/mnTgVmRcTqdJ9jgc+TTH2xM57OmYLh7PQxP33fIY3jRJJEMzeZDod2JJO3\nTQaOlPS/gf/IabPZNk4AVoruAf4LeChn2WbSIU1JrYDc2+XlzpeyNef9Vrb/P1B/XpQg+bX9nYiY\nllsgqQ/wwa6Fn7fc7QsYHhH31YvjO8DDETG0/sqSTgK+CHyT5M5YX9+DsVoJ8jEAKznpr+IJJAdU\n67xO8ksY4Eskd1HaWV+W1Co9LnAksAiYBnwrnWoYSceks7HuyHNAb0nVSm5JehnwTDPrvA903EH5\nNODr6Xz3SDpc0sEkxwcuSV/X3c+3a3qgvFVE/Ba4hWQ4yWw77gFYqbobGJLz/n7gcUkLgKns2q/z\nv5N8ee8LfDMi/iHpAZJjA/+VTjm8mmZuxRcRK5XcS3cmyS/3/4iI5qbufQHYksY/muTYQe42n5J0\nPDAnHepZTzIb5CuSbiE5ztAK+Aj4Nsnw2EPpMoAGPQQzzwZqZlahPARkZlahnADMzCqUE4CZWYVy\nAjAzq1BOAGZmFcoJwMysQjkBmJlVqP8PW+k2i6YDQwoAAAAASUVORK5CYII=\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "OQBedRT3faPV",
        "colab_type": "text"
      },
      "source": [
        "Rather than training a large number of trees and then looking back to find the best number, you can set *warm_start=True* so that the existing trees are kept when *fit* is called, allowing incremental training<sup>1</sup>.\n",
        "\n",
        "We can also use the *subsample* hyperparameter to specify the fraction of training instances used to train each tree (Stochastic Gradient Boosting). This trades higher bias for lower variance and also speeds up training.\n",
        "\n",
        "---\n",
        "1. Géron, Aurélien. Hands-On Machine Learning with Scikit-Learn and TensorFlow. O'Reilly Media, 2017."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "EOHPyEzifdgA",
        "colab_type": "code",
        "outputId": "b955cada-c0e5-4511-91f5-f32d1d2ed0dc",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 55
        }
      },
      "source": [
        "gbrt = GradientBoostingClassifier(max_depth=2, warm_start=True, subsample=0.5, random_state=RANDOM_STATE)\n",
        "\n",
        "max_val_score = 0.0\n",
        "score_not_going_up = 0\n",
        "for n_estimators in range(1, 120):\n",
        "    gbrt.n_estimators = n_estimators\n",
        "    gbrt.fit(X_train, y_train)\n",
        "    y_pred = gbrt.predict(X_val)\n",
        "    val_score = accuracy_score(y_val, y_pred)\n",
        "    if val_score > max_val_score:\n",
        "        max_val_score = val_score\n",
        "        score_not_going_up = 0\n",
        "    else:\n",
        "        score_not_going_up += 1\n",
        "        if score_not_going_up == 5:\n",
        "            break  # early stopping\n",
        "            \n",
        "print(\"Number of estimators:\", gbrt.n_estimators)\n",
        "print(\"Maximum Accuracy:\", max_val_score)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Number of estimators: 10\n",
            "Maximum Accuracy: 0.9858657243816255\n"
          ],
          "name": "stdout"
        }
      ]
    },
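    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "An alternative to the incremental *warm_start* loop above is to fit one large model and then replay the validation score after each tree with *staged_predict*. The sketch below uses synthetic data so it runs stand-alone; *X_tr*, *X_va*, etc. are stand-ins for this notebook's own train/validation splits.\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "from sklearn.datasets import make_classification\n",
        "from sklearn.ensemble import GradientBoostingClassifier\n",
        "from sklearn.metrics import accuracy_score\n",
        "from sklearn.model_selection import train_test_split\n",
        "\n",
        "X, y = make_classification(n_samples=500, random_state=0)\n",
        "X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)\n",
        "\n",
        "gbrt = GradientBoostingClassifier(max_depth=2, n_estimators=120, random_state=0)\n",
        "gbrt.fit(X_tr, y_tr)\n",
        "\n",
        "# staged_predict yields predictions after 1, 2, ..., n_estimators trees\n",
        "scores = [accuracy_score(y_va, y_pred) for y_pred in gbrt.staged_predict(X_va)]\n",
        "best_n = int(np.argmax(scores)) + 1\n",
        "print('Best number of trees:', best_n)\n",
        "```"
      ]
    },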
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "rz6JILNbfsRB",
        "colab_type": "text"
      },
      "source": [
        "### XGBoost\n",
        "\n",
        "Two very effective, efficient, and parallelizable algorithms that are popular at the time of writing are XGBoost<sup>2</sup> and LightGBM<sup>3</sup>. Both improve upon the basic gradient boosted decision tree (GBDT) algorithm in a number of ways.\n",
        "\n",
        "Both algorithms...\n",
        "- ...can grow trees leaf-wise, so that each split is made in the leaf that reduces the loss the most, rather than level-wise, which maintains a more balanced tree whose number of splits generally grows with each level. Although leaf-wise growth is more prone to overfitting, it is more flexible and well suited to large datasets.\n",
        "- ...provide efficient methods for finding the best split for each leaf, such as histogram-based bin sampling and ignoring sparse inputs. Histogram-based methods reduce the number of candidate splits a model evaluates by grouping feature values into a set of bins before building each tree.\n",
        "\n",
        "For specifics see https://xgboost.readthedocs.io/en/latest/tutorials/model.html\n",
        "\n",
        "As can be seen below, XGBoost achieves similar or better performance than AdaBoost but is much faster to train.\n",
        "\n",
        "**TODO**\n",
        "- improve this and the LGBM sections as these are really good models. Until I have written this up look at the following links below for more info:\n",
        "  - https://medium.com/@am.sharma/lgbm-on-colab-with-gpu-c1c09e83f2af\n",
        "  - https://devblogs.nvidia.com/gradient-boosting-decision-trees-xgboost-cuda/\n",
        "  - http://mlexplained.com/2018/01/05/lightgbm-and-xgboost-explained/\n",
        "  - https://machinelearningmastery.com/gentle-introduction-xgboost-applied-machine-learning/\n",
        "  - https://www.kaggle.com/stuarthallows/using-xgboost-with-scikit-learn\n",
        "  - https://www.datacamp.com/community/tutorials/xgboost-in-python\n",
        "    \n",
        "---\n",
        "1. https://machinelearningmastery.com/gentle-introduction-xgboost-applied-machine-learning/\n",
        "2. Chen, Tianqi, and Carlos Guestrin. XGBoost: A Scalable Tree Boosting System. KDD, 2016.\n",
        "3. Ke, Guolin, et al. LightGBM: A Highly Efficient Gradient Boosting Decision Tree. NIPS, 2017."
      ]
    },
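    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The histogram-based binning described above is also available natively in recent versions of scikit-learn as *HistGradientBoostingClassifier*, which is modelled on LightGBM. A minimal sketch on synthetic data (stand-ins for this notebook's own splits):\n",
        "\n",
        "```python\n",
        "from sklearn.datasets import make_classification\n",
        "from sklearn.ensemble import HistGradientBoostingClassifier\n",
        "from sklearn.model_selection import train_test_split\n",
        "\n",
        "X, y = make_classification(n_samples=500, random_state=0)\n",
        "X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)\n",
        "\n",
        "# continuous features are grouped into at most max_bins bins before each\n",
        "# tree is grown, so split finding scales with the number of bins\n",
        "hgb = HistGradientBoostingClassifier(max_iter=100, max_bins=255, random_state=0)\n",
        "hgb.fit(X_tr, y_tr)\n",
        "print('Validation accuracy:', hgb.score(X_va, y_va))\n",
        "```"
      ]
    },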
    {
      "cell_type": "code",
      "metadata": {
        "id": "EELSbEsugAWZ",
        "colab_type": "code",
        "outputId": "5c9b6036-1ac8-46dd-e9f6-9b61d2e04451",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 36
        }
      },
      "source": [
        "from xgboost import XGBClassifier\n",
        "\n",
        "%timeit XGBClassifier(n_estimators=10, random_state=RANDOM_STATE).fit(X_train, y_train)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "1 loop, best of 3: 1.03 s per loop\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "3Fj4pPdtgKVc",
        "colab_type": "code",
        "outputId": "272d9e6f-9d13-422e-f596-0de6f1fac52f",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 36
        }
      },
      "source": [
        "%timeit AdaBoostClassifier(n_estimators=10, random_state=RANDOM_STATE).fit(X_train, y_train)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "1 loop, best of 3: 2.95 s per loop\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "5q4_t_MkgC8_",
        "colab_type": "code",
        "outputId": "7653e8c4-9e79-4433-f30d-1b51f8d9ed15",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 36
        }
      },
      "source": [
        "%timeit GradientBoostingClassifier(n_estimators=10, random_state=RANDOM_STATE).fit(X_train, y_train)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "1 loop, best of 3: 7.49 s per loop\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "ryPAdDN8ftmW",
        "colab_type": "code",
        "outputId": "16afd642-5af1-47b4-e60f-3b51ef471aa3",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 175
        }
      },
      "source": [
        "XGmodel = XGBClassifier(max_depth=3,\n",
        "                        learning_rate=0.1,\n",
        "                        n_estimators=500,\n",
        "                        booster='gbtree',\n",
        "                        n_jobs=-1,\n",
        "                        random_state=RANDOM_STATE)\n",
        "\n",
        "XGmodel.fit(X_train, y_train)\n",
        "\n",
        "y_pred = XGmodel.predict(X_val)\n",
        "\n",
        "display(pd.DataFrame(classification_report(y_val, y_pred, output_dict=True)))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.928571</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>0.964286</td>\n",
              "      <td>0.996719</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.996296</td>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>0.998148</td>\n",
              "      <td>0.996466</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.998145</td>\n",
              "      <td>0.962963</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>0.980554</td>\n",
              "      <td>0.996529</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    1.000000   0.928571  0.996466    0.964286      0.996719\n",
              "recall       0.996296   1.000000  0.996466    0.998148      0.996466\n",
              "f1-score     0.998145   0.962963  0.996466    0.980554      0.996529\n",
              "support    270.000000  13.000000  0.996466  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "MNucdBnlgjIo",
        "colab_type": "text"
      },
      "source": [
        "### LightGBM\n",
        "\n",
        "In addition to the methods it shares with XGBoost, LightGBM provides further ways of finding the best split for each leaf, including gradient-based data subsampling and exclusive feature bundling.\n",
        "\n",
        "Gradient-based one-side sampling (GOSS), available in LightGBM, concentrates on the data points with larger gradients rather than on points that contribute less to training. To ensure that ignoring small gradients does not bias the sampling, points with small gradients are randomly sampled, and these samples are given increased weight when assessing their contribution to the change in loss.\n",
        "\n",
        "**TODO**\n",
        "- add about the GPU enabled versions\n",
        "\n",
        "**NOTE**\n",
        "- `neg_bagging_fraction` cannot be combined with gradient-based one-side sampling (bagging is disabled when `boosting_type='goss'`)\n",
        "\n",
        "https://lightgbm.readthedocs.io/en/latest/index.html"
      ]
    },
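    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "To make the sampling scheme concrete, here is a toy numpy sketch of the GOSS idea (an illustration, not LightGBM's implementation): keep the fraction *a* of points with the largest |gradient|, randomly sample a fraction *b* of the remainder, and upweight the sampled small-gradient points by (1 - a) / b so that weighted gradient sums stay approximately unbiased.\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "rng = np.random.default_rng(0)\n",
        "grads = rng.normal(size=1000)      # pretend per-sample gradients\n",
        "a, b = 0.2, 0.1                    # keep-rate for large gradients, sample-rate for the rest\n",
        "\n",
        "order = np.argsort(-np.abs(grads)) # indices sorted by |gradient|, descending\n",
        "n_top = int(a * len(grads))\n",
        "top_idx = order[:n_top]            # always keep the large-gradient points\n",
        "\n",
        "rest = order[n_top:]\n",
        "other_idx = rng.choice(rest, size=int(b * len(grads)), replace=False)\n",
        "\n",
        "idx = np.concatenate([top_idx, other_idx])\n",
        "weights = np.ones(len(idx))\n",
        "weights[n_top:] *= (1 - a) / b     # upweight the sampled small-gradient points\n",
        "\n",
        "# the weighted gradient mass on the sample approximates the full-data mass\n",
        "approx = np.sum(np.abs(grads[idx]) * weights)\n",
        "full = np.sum(np.abs(grads))\n",
        "print('full-data gradient mass:', full, ' GOSS estimate:', approx)\n",
        "```"
      ]
    },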
    {
      "cell_type": "code",
      "metadata": {
        "id": "_dXpfAI9pkcd",
        "colab_type": "code",
        "outputId": "447165da-2210-4576-dd4c-543f8621430c",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 36
        }
      },
      "source": [
        "from lightgbm import LGBMClassifier\n",
        "\n",
        "%timeit LGBMClassifier(n_estimators=10, random_state=RANDOM_STATE).fit(X_train, y_train)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "1 loop, best of 3: 2 s per loop\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "icAkNV3NlIT8",
        "colab_type": "code",
        "outputId": "bc43a9e8-362a-47dc-8c5b-bdf8a87f0e82",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 175
        }
      },
      "source": [
        "LGBM_model = LGBMClassifier(boosting_type='gbdt',\n",
        "                            max_depth=3,\n",
        "                            n_estimators=500,\n",
        "                            random_state=RANDOM_STATE,\n",
        "                            neg_bagging_fraction=0.5,  # subsample the majority (interictal) class\n",
        "                            bagging_freq=1,            # bagging fractions only take effect when bagging_freq > 0\n",
        "                            n_jobs=-1,\n",
        "                            bagging_seed=RANDOM_STATE)\n",
        "\n",
        "LGBM_model.fit(X_train, y_train)\n",
        "\n",
        "y_pred = LGBM_model.predict(X_val)\n",
        "\n",
        "display(pd.DataFrame(classification_report(y_val, y_pred, output_dict=True)))"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>0</th>\n",
              "      <th>1</th>\n",
              "      <th>accuracy</th>\n",
              "      <th>macro avg</th>\n",
              "      <th>weighted avg</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>precision</th>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.928571</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>0.964286</td>\n",
              "      <td>0.996719</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>recall</th>\n",
              "      <td>0.996296</td>\n",
              "      <td>1.000000</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>0.998148</td>\n",
              "      <td>0.996466</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>f1-score</th>\n",
              "      <td>0.998145</td>\n",
              "      <td>0.962963</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>0.980554</td>\n",
              "      <td>0.996529</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>support</th>\n",
              "      <td>270.000000</td>\n",
              "      <td>13.000000</td>\n",
              "      <td>0.996466</td>\n",
              "      <td>283.000000</td>\n",
              "      <td>283.000000</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                    0          1  accuracy   macro avg  weighted avg\n",
              "precision    1.000000   0.928571  0.996466    0.964286      0.996719\n",
              "recall       0.996296   1.000000  0.996466    0.998148      0.996466\n",
              "f1-score     0.998145   0.962963  0.996466    0.980554      0.996529\n",
              "support    270.000000  13.000000  0.996466  283.000000    283.000000"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "HuBtOyK0pBFy",
        "colab_type": "text"
      },
      "source": [
        "Another package of note is CatBoost<sup>1,2</sup>, which improves the handling of categorical features and has been shown to be quicker on some datasets than XGBoost and LightGBM. However, although gradient boosted trees are becoming known in both industry and academia as consistently strong classifiers, the interpretability of the final model is still worse than that of simpler models such as decision trees.\n",
        "\n",
        "---\n",
        "1. Dorogush, Anna Veronika, Vasily Ershov, and Andrey Gulin. CatBoost: Gradient Boosting with Categorical Features Support. 2018.\n",
        "2. Prokhorenkova, Liudmila, et al. CatBoost: Unbiased Boosting with Categorical Features. NeurIPS, 2018."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "9shVAE4xmuz6",
        "colab_type": "text"
      },
      "source": [
        "# Exercises\n",
        "\n",
        "Below are a few suggested exercises that may help improve your skills.\n",
        "\n",
        "1. Compare boosting and pasting models.\n",
        "\n",
        "2. Weight classifiers in terms of importance. Weight the classifiers from best to worst based on the outcomes of the other notebooks to see if this improves performance. This can be done in the VotingClassifier via the `weights` parameter, which takes a list of weights (e.g. [1, 2, 3]).\n",
        "\n",
        "3. Load machine learning instances from previous notebooks and put them into a voting classifier.\n",
        "\n",
        "4. Look for optimal hyperparameters using optimisation techniques.\n",
        "\n",
        "5. Test the models on all patients in a patient-specific manner (patient-independent testing is tricky because the channels differ for each patient)."
      ]
    },
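    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a starting point for exercise 2, the VotingClassifier takes a `weights` argument that scales each estimator's vote. A minimal sketch on synthetic data (the estimators here are placeholders for the models fitted in the other notebooks):\n",
        "\n",
        "```python\n",
        "from sklearn.datasets import make_classification\n",
        "from sklearn.ensemble import RandomForestClassifier, VotingClassifier\n",
        "from sklearn.linear_model import LogisticRegression\n",
        "from sklearn.tree import DecisionTreeClassifier\n",
        "\n",
        "X, y = make_classification(n_samples=300, random_state=0)\n",
        "\n",
        "vote = VotingClassifier(\n",
        "    estimators=[('lr', LogisticRegression(max_iter=1000)),\n",
        "                ('rf', RandomForestClassifier(random_state=0)),\n",
        "                ('dt', DecisionTreeClassifier(random_state=0))],\n",
        "    voting='soft',\n",
        "    weights=[2, 2, 1])  # the decision tree's vote counts half as much\n",
        "vote.fit(X, y)\n",
        "print('Training accuracy:', vote.score(X, y))\n",
        "```"
      ]
    },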
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "YvFvkeckGBou",
        "colab_type": "text"
      },
      "source": [
        "# Extra"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Z-dKEYZV6n8f",
        "colab_type": "text"
      },
      "source": [
        "## Plot Misclassified\n",
        "\n",
        "Let's have a look at the samples that were misclassified. You will need the full data downloaded for this to work (rather than just the feature DataFrame); because that takes a while, I've made this an extra section.\n",
        "\n",
        "**TODO**\n",
        "- get this code working again"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "daTZPSU-6-tO",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "import mne\n",
        "from scipy.io import loadmat\n",
        "\n",
        "\n",
        "def mat_to_mne(file_path):\n",
        "    mat = loadmat(file_path)  # load mat-file\n",
        "    \n",
        "    data = mat['data']  # variable in mat file\n",
        "    channels = mat['channels']  # dtypes of structures are \"unsized objects\"\n",
        "    freq = mat['freq'][0]\n",
        "\n",
        "    channels_list = []\n",
        "    for channel_array in channels[0][0]:\n",
        "        channels_list.append(channel_array[0])\n",
        "\n",
        "    df = pd.DataFrame(data,\n",
        "                      index=channels_list)\n",
        "\n",
        "    df = df.T\n",
        "    \n",
        "    info = mne.create_info(ch_names=list(df.columns), \n",
        "                               sfreq=freq, \n",
        "                               ch_types=['eeg']*df.shape[-1])\n",
        "    \n",
        "    df = df.apply(lambda x: x*1e-6) # data needs to be in volts rather than in microvolts (which it currently is)\n",
        "    # transpose the data\n",
        "    data_T = df.transpose()\n",
        "    # create raw mne object\n",
        "    raw = mne.io.RawArray(data_T, info)\n",
        "    \n",
        "    return raw\n",
        "\n",
        "\n",
        "def plot_misclassified(predicted, y_test, test_index_list, file_list, plot_kwargs):\n",
        "\n",
        "    for index, class_no in enumerate(predicted):\n",
        "        if class_no != y_test[index]:\n",
        "            file_path = file_list[test_index_list[index]]\n",
        "            \n",
        "            raw = mat_to_mne(file_path)\n",
        "\n",
        "            raw.plot(**plot_kwargs)\n",
        "            \n",
        "\n",
        "plot_kwargs = {\n",
        "    #'scalings': dict(eeg=20e-5),  # zooms the plot out\n",
        "    'highpass': 0.5,               # filters out low frequencies\n",
        "    'lowpass': 40.,                # filters out high frequencies\n",
        "    }\n",
        "\n",
        "index_train, index_test = train_test_split(list(range(0, len(data_x))), test_size=TEST_SIZE, random_state=RANDOM_STATE)\n",
        "            \n",
        "plot_misclassified(forest_predicted, y_test, index_test, file_list(os.path.join(dir_file_list[1], '*')), plot_kwargs)"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "-EX0KJVwmvoI",
        "colab_type": "text"
      },
      "source": [
        "# License\n",
        "\n",
        "<a rel=\"license\" href=\"http://creativecommons.org/licenses/by-nc/4.0/\"><img alt=\"Creative Commons License\" style=\"border-width:0\" src=\"https://i.creativecommons.org/l/by-nc/4.0/88x31.png\" /></a><br />This work is licensed under a <a rel=\"license\" href=\"http://creativecommons.org/licenses/by-nc/4.0/\">Creative Commons Attribution-NonCommercial 4.0 International License</a>."
      ]
    }
  ]
}