{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Metal Furnace Challenge: Weekend"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**Manufacturing an alloy is not a simple process. Many complicated factors are involved in making a perfect alloy, from the temperatures at which the constituent metals are melted, to the presence of impurities, to the cooling temperature set to cool the alloy down. Even minor changes in any of these factors can affect the quality, or grade, of the alloy produced.**"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<img src=\"https://www.machinehack.com/wp-content/uploads/2020/04/MetalFurnace-01-1536x864.jpg\" width=800 height=500>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Given are 28 distinguishing factors in the manufacturing of an alloy. Your objective as a data scientist is to build a machine learning model that predicts the grade of the product from these factors.\n",
    "\n",
    "You are provided with 28 anonymized factors (f0 to f27) that influence the making of a perfect alloy, which is used for various applications depending on the grade/quality of the obtained product.\n",
    "\n",
    "**Data Description**  \n",
    "The unzipped folder will have the following files.\n",
    "\n",
    "- Train.csv – 620 observations.\n",
    "- Test.csv – 266 observations.\n",
    "- Sample Submission – Sample format for the submission.\n",
    "- Target Variable: grade"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Import Libraries"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [],
   "source": [
    "import pandas as pd\n",
    "import numpy as np\n",
    "\n",
    "import matplotlib.pyplot as plt\n",
    "import seaborn as sns\n",
    "\n",
    "# Show all rows and columns when printing DataFrames; don't truncate any\n",
    "pd.set_option('display.max_rows', None)\n",
    "pd.set_option('display.max_columns', None)\n",
    "\n",
    "import warnings\n",
    "warnings.simplefilter('ignore')\n",
    "\n",
    "from sklearn.model_selection import train_test_split,cross_val_predict\n",
    "from sklearn.metrics import log_loss\n",
    "\n",
    "from sklearn.linear_model import LogisticRegression\n",
    "from sklearn.ensemble import AdaBoostClassifier,RandomForestClassifier,GradientBoostingClassifier,BaggingClassifier\n",
    "from sklearn.neighbors import KNeighborsClassifier\n",
    "\n",
    "import xgboost as xgb\n",
    "import lightgbm as lgb"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Import Dataset"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## PART 1"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [],
   "source": [
    "train = pd.read_csv(\"data/Train.csv\")\n",
    "test = pd.read_csv(\"data/Test.csv\")\n",
    "sample = pd.read_excel(\"data/Sample_Submission.xlsx\")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "((620, 29), (266, 28))"
      ]
     },
     "execution_count": 4,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train.shape,test.shape"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>f0</th>\n",
       "      <th>f1</th>\n",
       "      <th>f2</th>\n",
       "      <th>f3</th>\n",
       "      <th>f4</th>\n",
       "      <th>f5</th>\n",
       "      <th>f6</th>\n",
       "      <th>f7</th>\n",
       "      <th>f8</th>\n",
       "      <th>f9</th>\n",
       "      <th>f10</th>\n",
       "      <th>f11</th>\n",
       "      <th>f12</th>\n",
       "      <th>f13</th>\n",
       "      <th>f14</th>\n",
       "      <th>f15</th>\n",
       "      <th>f16</th>\n",
       "      <th>f17</th>\n",
       "      <th>f18</th>\n",
       "      <th>f19</th>\n",
       "      <th>f20</th>\n",
       "      <th>f21</th>\n",
       "      <th>f22</th>\n",
       "      <th>f23</th>\n",
       "      <th>f24</th>\n",
       "      <th>f25</th>\n",
       "      <th>f26</th>\n",
       "      <th>f27</th>\n",
       "      <th>grade</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <td>0</td>\n",
       "      <td>1.848564</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>-0.461423</td>\n",
       "      <td>0.409400</td>\n",
       "      <td>1.305455</td>\n",
       "      <td>2.329398</td>\n",
       "      <td>0.370965</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.395874</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.548623</td>\n",
       "      <td>0.472101</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>0.182574</td>\n",
       "      <td>0.085505</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>-1.080663</td>\n",
       "      <td>0.443257</td>\n",
       "      <td>-0.406121</td>\n",
       "      <td>-0.687687</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>3.727218</td>\n",
       "      <td>0.102129</td>\n",
       "      <td>2</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>1</td>\n",
       "      <td>-0.825098</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>3.032397</td>\n",
       "      <td>-2.442599</td>\n",
       "      <td>1.305455</td>\n",
       "      <td>-0.276144</td>\n",
       "      <td>0.370965</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.395874</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.548623</td>\n",
       "      <td>0.472101</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>0.182574</td>\n",
       "      <td>0.085505</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>-1.080663</td>\n",
       "      <td>-0.232546</td>\n",
       "      <td>-0.406366</td>\n",
       "      <td>-0.687687</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>-0.232472</td>\n",
       "      <td>0.102129</td>\n",
       "      <td>4</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>2</td>\n",
       "      <td>1.848564</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>-0.461423</td>\n",
       "      <td>0.409400</td>\n",
       "      <td>1.305455</td>\n",
       "      <td>2.329398</td>\n",
       "      <td>0.370965</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.395874</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.548623</td>\n",
       "      <td>0.472101</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>0.182574</td>\n",
       "      <td>0.085505</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>0.925358</td>\n",
       "      <td>1.459782</td>\n",
       "      <td>1.221876</td>\n",
       "      <td>1.877777</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>-0.232472</td>\n",
       "      <td>0.102129</td>\n",
       "      <td>2</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>3</td>\n",
       "      <td>0.511733</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>-0.461423</td>\n",
       "      <td>0.409400</td>\n",
       "      <td>-0.525726</td>\n",
       "      <td>-0.276144</td>\n",
       "      <td>0.370965</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.395874</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>-1.999287</td>\n",
       "      <td>-2.118189</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>-3.237512</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>0.182574</td>\n",
       "      <td>0.085505</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>0.925358</td>\n",
       "      <td>-0.008030</td>\n",
       "      <td>-0.406366</td>\n",
       "      <td>1.504523</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>-0.232472</td>\n",
       "      <td>0.102129</td>\n",
       "      <td>2</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>4</td>\n",
       "      <td>-0.825098</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>-0.461423</td>\n",
       "      <td>0.409400</td>\n",
       "      <td>-0.525726</td>\n",
       "      <td>-0.276144</td>\n",
       "      <td>0.370965</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.0</td>\n",
       "      <td>-2.526055</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>-1.999287</td>\n",
       "      <td>-2.118189</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>0.182574</td>\n",
       "      <td>0.085505</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>0.925358</td>\n",
       "      <td>-0.573268</td>\n",
       "      <td>-1.164793</td>\n",
       "      <td>1.877777</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>-0.232472</td>\n",
       "      <td>0.102129</td>\n",
       "      <td>2</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "         f0       f1        f2        f3        f4        f5        f6  \\\n",
       "0  1.848564 -0.26425 -0.461423  0.409400  1.305455  2.329398  0.370965   \n",
       "1 -0.825098 -0.26425  3.032397 -2.442599  1.305455 -0.276144  0.370965   \n",
       "2  1.848564 -0.26425 -0.461423  0.409400  1.305455  2.329398  0.370965   \n",
       "3  0.511733 -0.26425 -0.461423  0.409400 -0.525726 -0.276144  0.370965   \n",
       "4 -0.825098 -0.26425 -0.461423  0.409400 -0.525726 -0.276144  0.370965   \n",
       "\n",
       "         f7        f8   f9       f10       f11       f12       f13       f14  \\\n",
       "0  0.090167  0.107958  0.0  0.395874  0.308879  0.548623  0.472101  0.172917   \n",
       "1  0.090167  0.107958  0.0  0.395874  0.308879  0.548623  0.472101  0.172917   \n",
       "2  0.090167  0.107958  0.0  0.395874  0.308879  0.548623  0.472101  0.172917   \n",
       "3  0.090167  0.107958  0.0  0.395874  0.308879 -1.999287 -2.118189  0.172917   \n",
       "4  0.090167  0.107958  0.0 -2.526055  0.308879 -1.999287 -2.118189  0.172917   \n",
       "\n",
       "        f15       f16       f17       f18       f19       f20       f21  \\\n",
       "0  0.098853  0.308879  0.040193  0.182574  0.085505  0.233285 -1.080663   \n",
       "1  0.098853  0.308879  0.040193  0.182574  0.085505  0.233285 -1.080663   \n",
       "2  0.098853  0.308879  0.040193  0.182574  0.085505  0.233285  0.925358   \n",
       "3  0.098853 -3.237512  0.040193  0.182574  0.085505  0.233285  0.925358   \n",
       "4  0.098853  0.308879  0.040193  0.182574  0.085505  0.233285  0.925358   \n",
       "\n",
       "        f22       f23       f24       f25       f26       f27  grade  \n",
       "0  0.443257 -0.406121 -0.687687  0.271886  3.727218  0.102129      2  \n",
       "1 -0.232546 -0.406366 -0.687687  0.271886 -0.232472  0.102129      4  \n",
       "2  1.459782  1.221876  1.877777  0.271886 -0.232472  0.102129      2  \n",
       "3 -0.008030 -0.406366  1.504523  0.271886 -0.232472  0.102129      2  \n",
       "4 -0.573268 -1.164793  1.877777  0.271886 -0.232472  0.102129      2  "
      ]
     },
     "execution_count": 5,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>f0</th>\n",
       "      <th>f1</th>\n",
       "      <th>f2</th>\n",
       "      <th>f3</th>\n",
       "      <th>f4</th>\n",
       "      <th>f5</th>\n",
       "      <th>f6</th>\n",
       "      <th>f7</th>\n",
       "      <th>f8</th>\n",
       "      <th>f9</th>\n",
       "      <th>f10</th>\n",
       "      <th>f11</th>\n",
       "      <th>f12</th>\n",
       "      <th>f13</th>\n",
       "      <th>f14</th>\n",
       "      <th>f15</th>\n",
       "      <th>f16</th>\n",
       "      <th>f17</th>\n",
       "      <th>f18</th>\n",
       "      <th>f19</th>\n",
       "      <th>f20</th>\n",
       "      <th>f21</th>\n",
       "      <th>f22</th>\n",
       "      <th>f23</th>\n",
       "      <th>f24</th>\n",
       "      <th>f25</th>\n",
       "      <th>f26</th>\n",
       "      <th>f27</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <td>0</td>\n",
       "      <td>-0.837812</td>\n",
       "      <td>-0.273636</td>\n",
       "      <td>1.276580</td>\n",
       "      <td>0.463262</td>\n",
       "      <td>-0.585142</td>\n",
       "      <td>-0.24287</td>\n",
       "      <td>0.349804</td>\n",
       "      <td>0.12356</td>\n",
       "      <td>0.166795</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.445195</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>-2.139737</td>\n",
       "      <td>-2.527625</td>\n",
       "      <td>0.17609</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.285133</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.197642</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>0.886135</td>\n",
       "      <td>-0.568935</td>\n",
       "      <td>1.100428</td>\n",
       "      <td>-0.244589</td>\n",
       "      <td>0.229718</td>\n",
       "      <td>-0.217109</td>\n",
       "      <td>0.087039</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>1</td>\n",
       "      <td>2.078087</td>\n",
       "      <td>-0.273636</td>\n",
       "      <td>-0.496119</td>\n",
       "      <td>0.463262</td>\n",
       "      <td>-2.438092</td>\n",
       "      <td>-0.24287</td>\n",
       "      <td>0.349804</td>\n",
       "      <td>0.12356</td>\n",
       "      <td>0.166795</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.445195</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>0.513736</td>\n",
       "      <td>0.395628</td>\n",
       "      <td>0.17609</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.285133</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>-5.059644</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>0.886135</td>\n",
       "      <td>0.504299</td>\n",
       "      <td>-0.434268</td>\n",
       "      <td>-0.244040</td>\n",
       "      <td>0.229718</td>\n",
       "      <td>-0.217109</td>\n",
       "      <td>0.087039</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>2</td>\n",
       "      <td>-0.837812</td>\n",
       "      <td>-0.273636</td>\n",
       "      <td>1.276580</td>\n",
       "      <td>0.463262</td>\n",
       "      <td>-0.585142</td>\n",
       "      <td>-0.24287</td>\n",
       "      <td>0.349804</td>\n",
       "      <td>0.12356</td>\n",
       "      <td>0.166795</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.445195</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>0.513736</td>\n",
       "      <td>0.395628</td>\n",
       "      <td>0.17609</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.285133</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.197642</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>-1.128496</td>\n",
       "      <td>-0.568935</td>\n",
       "      <td>-0.434268</td>\n",
       "      <td>-0.662763</td>\n",
       "      <td>0.229718</td>\n",
       "      <td>-0.217109</td>\n",
       "      <td>0.087039</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>3</td>\n",
       "      <td>-0.837812</td>\n",
       "      <td>-0.273636</td>\n",
       "      <td>-0.496119</td>\n",
       "      <td>0.463262</td>\n",
       "      <td>1.267808</td>\n",
       "      <td>-0.24287</td>\n",
       "      <td>-2.858743</td>\n",
       "      <td>0.12356</td>\n",
       "      <td>0.166795</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.445195</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>0.513736</td>\n",
       "      <td>0.395628</td>\n",
       "      <td>0.17609</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.285133</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>-5.059644</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>-1.128496</td>\n",
       "      <td>-0.449819</td>\n",
       "      <td>-1.918647</td>\n",
       "      <td>-0.662763</td>\n",
       "      <td>0.229718</td>\n",
       "      <td>-0.217109</td>\n",
       "      <td>0.087039</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>4</td>\n",
       "      <td>-0.837812</td>\n",
       "      <td>-0.273636</td>\n",
       "      <td>-0.496119</td>\n",
       "      <td>0.463262</td>\n",
       "      <td>-0.585142</td>\n",
       "      <td>-0.24287</td>\n",
       "      <td>-2.858743</td>\n",
       "      <td>0.12356</td>\n",
       "      <td>0.166795</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.445195</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>0.513736</td>\n",
       "      <td>0.395628</td>\n",
       "      <td>0.17609</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.285133</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.197642</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>-1.128496</td>\n",
       "      <td>-0.568935</td>\n",
       "      <td>-0.434268</td>\n",
       "      <td>-0.662763</td>\n",
       "      <td>0.229718</td>\n",
       "      <td>-0.217109</td>\n",
       "      <td>0.087039</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "         f0        f1        f2        f3        f4       f5        f6  \\\n",
       "0 -0.837812 -0.273636  1.276580  0.463262 -0.585142 -0.24287  0.349804   \n",
       "1  2.078087 -0.273636 -0.496119  0.463262 -2.438092 -0.24287  0.349804   \n",
       "2 -0.837812 -0.273636  1.276580  0.463262 -0.585142 -0.24287  0.349804   \n",
       "3 -0.837812 -0.273636 -0.496119  0.463262  1.267808 -0.24287 -2.858743   \n",
       "4 -0.837812 -0.273636 -0.496119  0.463262 -0.585142 -0.24287 -2.858743   \n",
       "\n",
       "        f7        f8       f9       f10      f11       f12       f13      f14  \\\n",
       "0  0.12356  0.166795  0.06143  0.445195  0.27735 -2.139737 -2.527625  0.17609   \n",
       "1  0.12356  0.166795  0.06143  0.445195  0.27735  0.513736  0.395628  0.17609   \n",
       "2  0.12356  0.166795  0.06143  0.445195  0.27735  0.513736  0.395628  0.17609   \n",
       "3  0.12356  0.166795  0.06143  0.445195  0.27735  0.513736  0.395628  0.17609   \n",
       "4  0.12356  0.166795  0.06143  0.445195  0.27735  0.513736  0.395628  0.17609   \n",
       "\n",
       "       f15       f16      f17       f18      f19      f20       f21       f22  \\\n",
       "0  0.06143  0.285133  0.06143  0.197642  0.06143  0.27735  0.886135 -0.568935   \n",
       "1  0.06143  0.285133  0.06143 -5.059644  0.06143  0.27735  0.886135  0.504299   \n",
       "2  0.06143  0.285133  0.06143  0.197642  0.06143  0.27735 -1.128496 -0.568935   \n",
       "3  0.06143  0.285133  0.06143 -5.059644  0.06143  0.27735 -1.128496 -0.449819   \n",
       "4  0.06143  0.285133  0.06143  0.197642  0.06143  0.27735 -1.128496 -0.568935   \n",
       "\n",
       "        f23       f24       f25       f26       f27  \n",
       "0  1.100428 -0.244589  0.229718 -0.217109  0.087039  \n",
       "1 -0.434268 -0.244040  0.229718 -0.217109  0.087039  \n",
       "2 -0.434268 -0.662763  0.229718 -0.217109  0.087039  \n",
       "3 -1.918647 -0.662763  0.229718 -0.217109  0.087039  \n",
       "4 -0.434268 -0.662763  0.229718 -0.217109  0.087039  "
      ]
     },
     "execution_count": 6,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "test.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "2    472\n",
       "1     68\n",
       "3     47\n",
       "4     27\n",
       "0      6\n",
       "Name: grade, dtype: int64"
      ]
     },
     "execution_count": 7,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train.grade.value_counts()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "f0        8\n",
       "f1       10\n",
       "f2        7\n",
       "f3        2\n",
       "f4        3\n",
       "f5        8\n",
       "f6        2\n",
       "f7        2\n",
       "f8        3\n",
       "f9        1\n",
       "f10       2\n",
       "f11       2\n",
       "f12       3\n",
       "f13       2\n",
       "f14       2\n",
       "f15       2\n",
       "f16       2\n",
       "f17       2\n",
       "f18       2\n",
       "f19       4\n",
       "f20       2\n",
       "f21       2\n",
       "f22      49\n",
       "f23      63\n",
       "f24      24\n",
       "f25       3\n",
       "f26       3\n",
       "f27       3\n",
       "grade     5\n",
       "dtype: int64"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train.nunique()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "f0      8\n",
       "f1      7\n",
       "f2      6\n",
       "f3      2\n",
       "f4      3\n",
       "f5      6\n",
       "f6      2\n",
       "f7      2\n",
       "f8      3\n",
       "f9      2\n",
       "f10     2\n",
       "f11     2\n",
       "f12     3\n",
       "f13     2\n",
       "f14     2\n",
       "f15     2\n",
       "f16     2\n",
       "f17     2\n",
       "f18     2\n",
       "f19     2\n",
       "f20     2\n",
       "f21     2\n",
       "f22    37\n",
       "f23    41\n",
       "f24    14\n",
       "f25     3\n",
       "f26     3\n",
       "f27     2\n",
       "dtype: int64"
      ]
     },
     "execution_count": 9,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "test.nunique()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Metric"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<img src=\"https://www.machinehack.com/wp-content/uploads/2020/02/MULTI_CLASS_LOGLOSS_FORM.png\" height=400>"
   ]
  },
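  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For reference, the standard multi-class log loss shown in the image above, written out:\n",
    "\n",
    "$$\\text{LogLoss} = -\\frac{1}{N}\\sum_{i=1}^{N}\\sum_{c=1}^{M} y_{ic}\\,\\log(p_{ic})$$\n",
    "\n",
    "where $N$ is the number of samples, $M$ is the number of classes, $y_{ic}$ is 1 if sample $i$ belongs to class $c$ (0 otherwise), and $p_{ic}$ is the predicted probability of that class."
   ]
  },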
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [],
   "source": [
    "def metric(y,y0):\n",
    "    # Multi-class logarithmic loss (the competition metric)\n",
    "    return log_loss(y,y0)"
   ]
  },
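  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick sanity check of the metric wrapper on dummy probabilities (the values below are illustrative only, not from the dataset): perfectly confident correct predictions should yield a loss near zero, since scikit-learn clips probabilities before taking logs."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative only: three samples, three classes, fully confident correct predictions.\n",
    "metric([0, 1, 2], [[1, 0, 0], [0, 1, 0], [0, 0, 1]])"
   ]
  },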
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Features"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**Removing the f9 feature: the training data contains only one unique value for it, so it carries no information, and it showed no feature importance.**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "27"
      ]
     },
     "execution_count": 11,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "features = list(set(train.columns)-set(['grade','f9']))\n",
    "target = 'grade'\n",
    "len(features)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Cross Validation"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**Since the public leaderboard was scored on only 30% of the test data, it was difficult to match the local validation score.**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [],
   "source": [
    "def cross_valid(model,train,features,target,cv=3):\n",
    "    # Out-of-fold class probabilities from k-fold CV, scored with the multi-class log loss\n",
    "    results = cross_val_predict(model, train[features], train[target], method=\"predict_proba\", cv=cv)\n",
    "    return metric(train[target],results)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**Baseline models are evaluated with cross-validation to find the best-performing ones.**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "LGBMClassifier 0.1614617219167179\n",
      "XGBClassifier 0.15135431579699563\n",
      "GradientBoostingClassifier 0.19288880185032714\n",
      "LogisticRegression 0.36364542223702856\n",
      "RandomForestClassifier 0.1988283464155193\n",
      "AdaBoostClassifier 1.3349991672743202\n"
     ]
    }
   ],
   "source": [
    "models = [lgb.LGBMClassifier(), xgb.XGBClassifier(), GradientBoostingClassifier(), LogisticRegression(),\n",
    "          RandomForestClassifier(), AdaBoostClassifier()]\n",
    "\n",
    "for model in models:\n",
    "    error = cross_valid(model, train, features, target, cv=10)\n",
    "    print(str(model).split(\"(\")[0], error)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Modelling: XGBoost"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [],
   "source": [
    "def xgb_model(train, features, target, plot=True):\n",
    "    evals_result = {}\n",
    "    trainX, validX, trainY, validY = train_test_split(train[features], train[target], test_size=0.2, random_state=13)\n",
    "    print(\"XGB Model\")\n",
    "    \n",
    "    dtrain = xgb.DMatrix(trainX, label=trainY)\n",
    "    dvalid = xgb.DMatrix(validX, label=validY)\n",
    "    watchlist = [(dtrain, 'train'), (dvalid, 'valid')]\n",
    "    \n",
    "    MAX_ROUNDS=2000\n",
    "    early_stopping_rounds=100\n",
    "    params = {\n",
    "        'booster': 'gbtree',\n",
    "        'objective': 'multi:softprob',\n",
    "        'eval_metric': 'mlogloss',\n",
    "        'learning_rate': 0.01,\n",
    "        'num_round': MAX_ROUNDS,  # redundant: num_boost_round passed to xgb.train controls rounds (hence the XGBoost warning below)\n",
    "        'max_depth': 8,\n",
    "        'seed': 25,\n",
    "        'nthread': -1,\n",
    "        'num_class':5\n",
    "    }\n",
    "    \n",
    "    model = xgb.train(\n",
    "        params,\n",
    "        dtrain,\n",
    "        evals=watchlist,\n",
    "        num_boost_round=MAX_ROUNDS,\n",
    "        early_stopping_rounds=early_stopping_rounds,\n",
    "        verbose_eval=50\n",
    "        #feval=metric_xgb\n",
    "    \n",
    "    )\n",
    "    \n",
    "    print(\"Best Iteration :: {} \\n\".format(model.best_iteration))\n",
    "    \n",
    "    \n",
    "    if plot:\n",
    "        # Plot feature importances\n",
    "        fig, ax = plt.subplots(figsize=(24, 24))\n",
    "        xgb.plot_importance(model, height=0.4, ax=ax)\n",
    "\n",
    "    return model"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "XGB Model\n",
      "[01:16:56] WARNING: /workspace/src/learner.cc:328: \n",
      "Parameters: { num_round } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-mlogloss:1.58837\tvalid-mlogloss:1.58974\n",
      "Multiple eval metrics have been passed: 'valid-mlogloss' will be used for early stopping.\n",
      "\n",
      "Will train until valid-mlogloss hasn't improved in 100 rounds.\n",
      "[50]\ttrain-mlogloss:0.91360\tvalid-mlogloss:0.95750\n",
      "[100]\ttrain-mlogloss:0.58307\tvalid-mlogloss:0.65255\n",
      "[150]\ttrain-mlogloss:0.39094\tvalid-mlogloss:0.48186\n",
      "[200]\ttrain-mlogloss:0.27371\tvalid-mlogloss:0.37873\n",
      "[250]\ttrain-mlogloss:0.20020\tvalid-mlogloss:0.31611\n",
      "[300]\ttrain-mlogloss:0.15344\tvalid-mlogloss:0.27949\n",
      "[350]\ttrain-mlogloss:0.12128\tvalid-mlogloss:0.26036\n",
      "[400]\ttrain-mlogloss:0.09965\tvalid-mlogloss:0.24850\n",
      "[450]\ttrain-mlogloss:0.08434\tvalid-mlogloss:0.24160\n",
      "[500]\ttrain-mlogloss:0.07264\tvalid-mlogloss:0.23812\n",
      "[550]\ttrain-mlogloss:0.06386\tvalid-mlogloss:0.23592\n",
      "[600]\ttrain-mlogloss:0.05717\tvalid-mlogloss:0.23340\n",
      "[650]\ttrain-mlogloss:0.05208\tvalid-mlogloss:0.23237\n",
      "[700]\ttrain-mlogloss:0.04817\tvalid-mlogloss:0.23050\n",
      "[750]\ttrain-mlogloss:0.04530\tvalid-mlogloss:0.22985\n",
      "[800]\ttrain-mlogloss:0.04257\tvalid-mlogloss:0.22979\n",
      "[850]\ttrain-mlogloss:0.04023\tvalid-mlogloss:0.23141\n",
      "Stopping. Best iteration:\n",
      "[775]\ttrain-mlogloss:0.04399\tvalid-mlogloss:0.22903\n",
      "\n",
      "Best Iteration :: 775 \n",
      "\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAABW4AAAVWCAYAAAANHSDeAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nOzde5ieVX0v/O9KJoQgUBoCNgEEYwRimGRI0FBFncgbEQJ1s7FUpTSIlo3YIjaIUFqw7VaCNCW8oqRQfEWs4BZqOUrLBgY1buRgQpCkUIVoOGxQNioBhEyy3j9mmD1AOIjM3DfM53Ndz8U86z791vDjIvnOmvWUWmsAAAAAAGiPUU0XAAAAAADA0wluAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AACMSKWUJaWUv266DgAA2JhSa226BgAAXkFKKauTvDbJ+kHDO9da7/st7tmd5Ku11u1/u+pemUopX05yT631r5quBQCAdrDiFgCAl+KAWuvmg14vObR9OZRSOpp8/m+jlDK66RoAAGgfwS0AAC+bUsqepZTvlVJ+UUq5tX8l7VPHPlRKWVVKeaSUclcp5b/1j78mybeSTCqlrO1/TSqlfLmU8t8HXd9dSrln0PvVpZRPlVJWJHm0lNLRf93FpZSflVLuLqUc/Ty1Dtz/qXuXUo4rpTxYSrm/lPJfSin7lVLuLKX8n1LKXw669tOllItKKV/vn88PSikzBh2fWkrp6f8+3F5K+YNnPPesUsqVpZRHk3w4ySFJjuuf+2X95x1fSvlx//1XllIOHHSPw0op3y2l/H0p5eH+ue476Pj4Usr/V0q5r//4vw46tn8pZXl/bd8rpUx/0f+CAQAYNoJbAABeFqWU7ZJckeS/Jxmf5NgkF5dStuk/5cEk+yfZMsmHkpxeSplZa300yb5J7nsJK3g/kGRekq2SbEhyWZJbk2yXZO8kx5RS9nmR9/q9JJv2X3tSknOS/HGSWUnenuSkUsrkQee/N8k3+uf6tST/WkoZU0oZ01/HvyfZNsmfJ/nnUsoug679YJLPJNkiyVeS/HOSz/XP/YD+c37c/9zfSfI3Sb5aSpk46B6zk9yRZEKSzyU5t5RS+o+dn2SzJNP6azg9SUopM5N8Kcl/S7J1kn9McmkpZeyL/B4BADBMBLcAALwU/9q/YvMXg1Zz/nGSK2utV9ZaN9Rar05yc5L9kqTWekWt9ce1z/XpCzbf/lvW8f/WWtfUWh9P8uYk29Ra/7bW+mSt9a70ha/vf5H3WpfkM7XWdUkuTF8gekat9ZFa6+1Jbk8yeHXqLbXWi/rP/4f0hb579r82T7Kwv45rk1yevpD5KZfUWpf2f59+vbFiaq3fqLXe13/O15P8Z5K3DDrlJ7XWc2qt65Ocl2Riktf2h7v7Jjmy1vpwrXVd//c7Sf40yT/WWr9fa11faz0vyRP9NQMA0CKv2L3AAABo1H+ptf7PZ4ztmOQPSykHDBobk+S6JOn/Vf6Tk+ycvgUEmyW57besY80znj+plPKLQWOjk3znRd7rof4QNEke7//nA4OOP56+QPZZz661bujfxmHSU8dqrRsGnfuT9K3k3VjdG1VK+ZMkf5Fkp/6hzdMXJj/lfw96/mP9i203T98K4P9Ta314I7fdMcn8UsqfDxrbZFDdAAC0hOAWAICXy5ok59da//SZB/p/Ff/iJH+SvtWm6/pX6j71q/11I/d7NH3h7lN+byPnDL5uTZK7a61vfCnFvwQ7PPVFKWVUku2TPLXFww6llFGDwtvXJblz0LXPnO/T3pdSdkzfauG9k/yvWuv6Usry/N/v1/NZk2R8KWWrWusvNnLsM7XWz7yI+wAA0CBbJQAA8HL5apIDSin7lFJGl1I27f/Qr+3Tt6pzbJKfJentX3377kHXPpBk61LK7wwaW55kv/4P2vq9JMe8wPNv
TPKr/g8sG9dfw26llDe/bDN8ulmllP9aSunor+2JJDck+X76Qufj+ve87U5yQPq2X3guDyQZvH/ua9IX5v4s6ftgtyS7vZiiaq33p+/D3r5YSvnd/hre0X/4nCRHllJmlz6vKaXMK6Vs8SLnDADAMBHcAgDwsqi1rknfB3b9ZfoCxzVJPplkVK31kSRHJ/kfSR5O34dzXTro2v9IckGSu/r3zZ2Uvg/YujXJ6vTth/v1F3j++vQFpF1J7k7y8yT/lL4P9xoKlyT5o/TN59Ak/7V/P9knk/xB+vaZ/XmSLyb5k/45Ppdzk7zpqT2Da60rkyxK8r/SF+p2Jln6G9R2aPr27P2P9H0o3DFJUmu9OX373J7ZX/ePkhz2G9wXAIBhUmrd2G+lAQAAz6WU8ukkU2qtf9x0LQAAvDpZcQsAAAAA0DKCWwAAAACAlrFVAgAAAABAy1hxCwAAAADQMoJbAAAAAICW6Wi6gJdiq622qlOmTGm6DEaoRx99NK95zWuaLoMRSv/RJP1H0/QgTdJ/NEn/0ST9R5NGSv/dcsstP6+1bvPM8VdkcPva1742N998c9NlMEL19PSku7u76TIYofQfTdJ/NE0P0iT9R5P0H03SfzRppPRfKeUnGxu3VQIAAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAAAAA0DKCWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAAAAA0DKCWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYptdama/iNvW7ylDrq4DOaLoMRakFnbxbd1tF0GYxQ+o8m6T+apgdpkv6jSfqPJum/Zq1eOK/pEhrV09OT7u7upssYcqWUW2qtezxz3IpbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAALTUHXfcka6uroHXlltumcWLF+fTn/50tttuu4HxK6+8Mkny5JNP5kMf+lA6OzszY8aM9PT0DNzrggsuSGdnZ6ZPn573vOc9+fnPf97QrHgxhiy4LaUcXUpZVUqppZQV/a/vlVJm9B/foZRyXf85t5dSPj5UtQAAAADAK9Euu+yS5cuXZ/ny5bnllluy2Wab5cADD0ySfOITnxg4tt9++yVJzjnnnCTJbbfdlquvvjoLFizIhg0b0tvbm49//OO57rrrsmLFikyfPj1nnnlmY/PihQ3litujkuyX5G1J3llrnZ7k75Kc3X+8N8mCWuvUJHsm+Vgp5U1DWA8AAAAAvGJdc801ecMb3pAdd9zxOc9ZuXJl9t577yTJtttum6222io333xzaq2ptebRRx9NrTW/+tWvMmnSpOEqnZdgSILbUsqSJJOTXJpkdq314f5DNyTZPklqrffXWn/Q//UjSVYl2W4o6gEAAACAV7oLL7wwH/jABwben3nmmZk+fXoOP/zwPPxwX/w2Y8aMXHLJJent7c3dd9+dW265JWvWrMmYMWNy1llnpbOzM5MmTcrKlSvz4Q9/uKmp8CIMSXBbaz0yyX1J5tRaTx906MNJvvXM80spOyXZPcn3h6IeAAAAAHgle/LJJ3PppZfmD//wD5MkH/3oR/PjH/84y5cvz8SJE7NgwYIkyeGHH57tt98+e+yx
R4455pi89a1vTUdHR9atW5ezzjory5Yty3333Zfp06fnlFNOaXJKvICO4XpQKWVO+oLbvZ4xvnmSi5McU2v91fNcf0SSI5JkwoRtclJn7xBWC8/tteOSBfqPhug/mqT/aJoepEn6jybpP5qk/5o1+IPFvvvd7+b1r399Vq1alVWrVj3tvM7Oznzta18bOP+9731v3vve9yZJ/uzP/iwPP/xwzj333Dz88MNZs2ZN1qxZkze+8Y254IILstdeT4vqWmXt2rVP+x6MNMMS3JZSpif5pyT71lofGjQ+Jn2h7T/XWv/l+e5Raz07/fvjvm7ylLrotmHLnOFpFnT2Rv/RFP1Hk/QfTdODNEn/0ST9R5P0X7NWH9I98PWSJUty1FFHpbu7b+z+++/PxIkTkySnn356Zs+ene7u7jz22GOpteY1r3lNrr766owfPz6HHXZY7rvvvvzN3/xNpk2blm222SbXXHNN3va2tw3cr416enpaXd9QG/L/8kopr0vyL0kOrbXeOWi8JDk3yapa6z8MdR0AAAAA8Er02GOP5eqrr84//uM/Dowdd9xxWb58eUop2WmnnQaOPfjgg9lnn30yatSobLfddjn//POTJJMmTcrJJ5+cd7zjHRkzZkx23HHHfPnLX25iOrxIw/Ejk5OSbJ3ki31ZbXprrXskeVuSQ5PcVkpZ3n/uX9ZarxyGmgAAAADgFWGzzTbLQw899LSxpwLZZ9ppp51yxx13bPTYkUcemSOPPPJlr4+hMWTBba11p/4vP9L/eubx7yYpQ/V8AAAAAIBXqlFNFwAAAAAAwNMJbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGU6mi7gpRg3ZnTuWDiv6TIYoXp6erL6kO6my2CE0n80Sf/RND1Ik/QfTdJ/NEn/QXOsuAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAAAAA0DKCWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAtI7gFAAAAAGiZjqYLeCkeX7c+Ox1/RdNlMEIt6OzNYfqPhug/mqT/aJoe5OWyeuG8pksAAHhBVtwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAAIxY69evz+677579998/SXLttddm5syZ2W233TJ//vz09vYmSa6++upMnz4906dPz1vf+tbceuutSZJf//rXectb3pIZM2Zk2rRpOfnkkxubCwDw6jKkwW0p5ehSyqpSSi2lrOh/fa+UMqP/+KallBtLKbeWUm4vpfzNUNYDAAAw2BlnnJGpU6cmSTZs2JD58+fnwgsvzA9/+MPsuOOOOe+885IkEydOzPXXX58VK1bkr//6r3PEEUckScaOHZtrr702t956a5YvX56rrroqN9xwQ2PzAQBePYZ6xe1RSfZL8rYk76y1Tk/yd0nO7j/+RJJ31VpnJOlK8p5Syp5DXBMAAEDuueeeXHHFFfnIRz6SJHnooYcyduzY7LzzzkmSuXPn5uKLL06S7Lbbbvnd3/3dJMmee+6Ze+65J0lSSsnmm2+eJFm3bl3WrVuXUspwTwUAeBUasuC2lLIkyeQklyaZXWt9uP/QDUm2T5LaZ23/+Jj+Vx2qmgAAAJ5yzDHH5HOf+1xGjer7a9GECROybt263HzzzUmSiy66KGvWrHnWdeeee2723Xff
gffr169PV1dXtt1228ydOzezZ88engkAAK9qQxbc1lqPTHJfkjm11tMHHfpwkm899aaUMrqUsjzJg0murrV+f6hqAgAASJLLL7882267bWbNmjUwVkrJhRdemE984hN5y1veki222CIdHR1Pu+66667Lueeem1NPPXVgbPTo0Vm+fHnuueee3HjjjfnhD384bPMAAF69Sq1Dt8C1lLI6yR611p/3v5+T5ItJ9qq1PvSMc7dK8s0kf15rfdafdEopRyQ5IkkmTNhm1kmLzxmyuuH5vHZc8sDjTVfBSKX/aJL+o2l6kJdL53a/k3POOSf//u//ntGjR+fJJ5/MY489lre//e058cQTB8676aabcsUVV+TTn/501q5dmwceeCAnnXRSFi5cmB122GGj9z7vvPOy6aab5o/+6I+GazqMAGvXrh3YkgOGm/6jSSOl/+bMmXNLrXWPZ44PW3BbSpmevmB231rrnc9x/slJHq21/v3z3fd1k6fUUQef8bLXCy/Ggs7eLLqt44VPhCGg/2iS/qNpepCXy+qF8572vqenJ3//93+fyy+/PA8++GC23XbbPPHEE9lvv/1y4okn5l3vele+/vWv58QTT8xXvvKVvPWtbx249mc/+1nGjBmTrbbaKo8//nje/e5351Of+lT233//4Z4Wr2I9PT3p7u5uugxGKP1Hk0ZK/5VSNhrcDsuffEspr0vyL0kOHRzallK2SbKu1vqLUsq4JP9PklOf4zYAAABD6rTTTsvll1+eDRs25KMf/Wje9a53JUm+8pWv5KGHHspRRx2VJOno6MjNN9+c+++/P/Pnz8/69euzYcOGHHzwwUJbAOBlMVxLFk5KsnWSL/Z/wmpvf4o8Mcl5pZTR6dtv93/UWi8fppoAAADS3d09sJrntNNOy2mnnfascz75yU9udMXP9OnTs2zZsiGuEAAYiYY0uK217tT/5Uf6X888viLJ7kNZAwAAAADAK82opgsAAAAAAODpBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyHU0X8FKMGzM6dyyc13QZjFA9PT1ZfUh302UwQuk/mqT/aJoeBABgJLHiFgAAAACgZQS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAAAAA0DKCWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGU6mi7gpXh83frsdPwVTZfBCLWgszeH6T8aov8YaqsXzmu6BAAAAGLFLQAAAABA6whuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAYsGbNmsyZMydTp07NtGnTcsYZZyRJbr311nzsYx9LZ2dnDjjggPzqV79KkqxevTrjxo1LV1dXurq6cuSRRyZJHnvsscybNy+77rprpk2bluOPP76xOQEAALwSDWlwW0o5upSyqpRSSykr+l/fK6XMeMZ5o0spy0oplw9lPQDA8+vo6MiiRYuyatWq3HDDDfnCF76QlStX5iMf+Uj+9E//NLfddlsOPPDAnHbaaQPXvOENb8jy5cuzfPnyLFmyZGD82GOPzX/8x39k2bJlWbp0ab71rW81MSUAAIBXpKFecXtUkv2SvC3JO2ut05P8XZKzn3Hex5OsGuJaAIAXMHHixMycOTNJssUWW2Tq1Km59957c8cdd2TGjL6fu86dOzcXX3zx895ns802y5w5c5Ikm2yySWbO
nJl77rlnaIsHAAB4FRmy4LaUsiTJ5CSXJplda324/9ANSbYfdN72SeYl+aehqgUA+M2tXr06y5Yty+zZs7Pbbrtl6dKlSZJvfOMbWbNmzcB5d999d3bfffe8853vzHe+851n3ecXv/hFLrvssuy9997DVjsAAMAr3ZAFt7XWI5Pcl2ROrfX0QYc+nGTw70ouTnJckg1DVQsA8JtZu3ZtDjrooCxevDhbbrllvvSlL+WSSy7JrFmz8sgjj2STTTZJ0rdC96c//WmWLVuWf/iHf8gHP/jBgf1vk6S3tzcf+MAHcvTRR2fy5MlNTQcAAOAVp2M4H1ZKmZO+4Hav/vf7J3mw1npLKaX7Ba49IskRSTJhwjY5qbN3iKuFjXvtuGSB/qMh+o+h1tPTk97e3pxwwgmZPXt2xo8fn56eniTJySefnM033zxr1qzJtttuOzA+2NZbb50LLrggu+yyS5Lk1FNPHfjwso2dD7+JtWvX6iMao/9okv6jSfqPJo30/iu11qG7eSmrk+xRa/15KWV6km8m2bfWemf/8VOSHJqkN8mmSbZM8i+11j9+vvu+bvKUOurgM4asbng+Czp7s+i2Yf2ZBwzQfwy1u0/ZL/Pnz8/48eOzePHigfEHH3wwK1euzDve8Y4cdthh6e7uzuGHH56f/exnGT9+fEaPHp277rorb3/723Pbbbdl/Pjx+au/+qusWrUq3/jGNzJq1FBvq89I0NPTk+7u7qbLYITSfzRJ/9Ek/UeTRkr/lVJuqbXu8czxYflbVCnldUn+JcmhT4W2SVJrPaHWun2tdack709y7QuFtgDA0Fm6dGnOP//8XHvttenq6kpXV1euvPLKXHDBBTn00EOz6667ZtKkSfnQhz6UJPn2t7+d6dOnZ8aMGXnf+96XJUuWZPz48bnnnnvymc98JitXrszMmTPT1dWVf/on29kDAAC8WMO1bOukJFsn+WIpJUl6N5YiAwDN2muvvfJcv40zY8aMZ/20+6CDDspBBx30rHO3337757wPAAAAL2xIg9v+lbRJ8pH+1/Od25OkZyjrAQAAAAB4JbDhHAAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAt09F0AS/FuDGjc8fCeU2XwQjV09OT1Yd0N10GI5T+AwAAgJHBilsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAAAAA0DKCWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICW6Wi6gJfi8XXrs9PxVzRdBiPUgs7eHKb/aIj+47msXjiv6RIAAAB4GVlxCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BYBXiTVr1mTOnDmZOnVqpk2bljPOOCNJsnz58uy5557p6urKHnvskRtvvPFp1910000ZPXp0LrrooiTJT37yk8yaNStdXV2ZNm1alixZMuxzAQAAGOk6mnhoKeXoJB9N8oMkDyXZL8ljSQ6rtf6giZoA4JWuo6MjixYtysyZM/PII49k1qxZmTt3bo477ricfPLJ2XfffXPllVfmuOOOS09PT5Jk/fr1+dSnPpV99tln4D4TJ07M9773vYwdOzZr167Nbrvtlj/4gz9oaFYAAAAjU1Mrbo9KX1j7z0ne2P86IslZDdUDAK94EydOzMyZM5MkW2yx
RaZOnZp77703pZT86le/SpL88pe/zKRJkwau+fznP5+DDjoo22677cDYJptskrFjxyZJnnjiiWzYsGEYZwEAAEDSwIrbUsqSJJOTXJpk5/Stsq1JbiilbFVKmVhrvX+46wKAV5PVq1dn2bJlmT17dhYvXpx99tknxx57bDZs2JDvfe97SZJ777033/zmN3Pttdfmpptuetr1a9asybx58/KjH/0op512WiZNmpQ777yziakAAACMSMO+4rbWemSS+5LMSXJ1kjWDDt+TZLvhrgkAXk3Wrl2bgw46KIsXL86WW26Zs846K6effnrWrFmT008/PR/+8IeTJMccc0xOPfXUjB49+ln32GGHHbJixYr86Ec/ynnnnZcHHnhguKcBAAAwopW+xa7D/NBSVifZI8l5SU6ptX63f/yaJMfVWm/ZyDVHpG87hUyYsM2skxafM3wFwyCvHZc88HjTVTBS6T+eS+d2v5Mk6e3tzQknnJA3v/nNOfjgg5Mk+++/fy677LKUUlJrzf77758rrrgiH/jAB/LUnwN++ctfZtNNN82CBQuy1157Pe3ep556avbcc8/MmjUrm2+++fBODAZZu3atHqQx+o8m6T+apP9o0kjpvzlz5txSa93jmeONfDjZIPck2WHQ++3Ttxr3WWqtZyc5O0leN3lKXXRb06UzUi3o7I3+oyn6j+ey+pDu1Fozf/78vO1tb8vixYsHju2www4ppaS7uzvXXHNNdt1113R3d+f++//vzkSHHXZY9t9//7zvfe/LPffck6233jrjxo3Lww8/nB//+Mf53Oc+l4ceeijd3d0NzA769PT06EEao/9okv6jSfqPJo30/mv6b/+XJvmzUsqFSWYn+aX9bQHgpVm6dGnOP//8dHZ2pqurK0ny2c9+Nuecc04+/vGPp7e3N5tuumnOPvvs573PqlWrsmDBgoEVuscee2w6OzvT09MzDLMAAAAgaT64vTLJfkl+lOSxJB9qthwAeOXaa6+98lxbIN1yy7N2IXqaL3/5ywNfz507NytWrHg5SwMAAOA31EhwW2vdadDbjzVRAwAAAABAW41qugAAAAAAAJ5OcAsAAAAA0DKCWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAM4c9WQAACAASURBVAAAAC3T0XQBL8W4MaNzx8J5TZfBCNXT05PVh3Q3XQYjlP4DAACAkcGKWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAAAAA0DKCWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJbpaLqAl+Lxdeuz0/FXNF0GI9SCzt4cpv9oiP5rp9UL5zVdAgAAAK8yVtwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAL4M1a9Zkzpw5mTp1aqZNm5Yzzjhj4NjnP//57LLLLpk2bVqOO+64JMmNN96Yrq6udHV1ZcaMGfnmN785cP5OO+2Uzs7OdHV1ZY899hj2uQAAANC8jiYeWko5OslHk/yg1npIKeXNSW5I8ke11ouaqAkAfhsdHR1ZtGhRZs6cmUceeSSzZs3K3Llz88ADD+SSSy7JihUrMnbs2Dz44INJkt122y0333xzOjo6cv/992fGjBk54IAD0tHR97/m6667LhMmTGhySgAAADSokeA2yVFJ9q213l1KGZ3k1CT/1lAt
APBbmzhxYiZOnJgk2WKLLTJ16tTce++9Oeecc3L88cdn7NixSZJtt902SbLZZpsNXPvrX/86pZThLxoAAIDWGvatEkopS5JMTnJpKeUTSf48ycVJHhzuWgBgKKxevTrLli3L7Nmzc+edd+Y73/lOZs+enXe+85256aabBs77/ve/n2nTpqWzszNLliwZWG1bSsm73/3uzJo1K2effXZT0wAAAKBBw77ittZ6ZCnlPUnmJBmb5GtJ3pXkzcNdCwC83NauXZuDDjooixcvzpZbbpne3t48/PDDueGGG3LTTTfl4IMPzl133ZVSSmbPnp3bb789q1atyvz587Pvvvtm0003zdKlSzNp0qQ8+OCDmTt3bnbddde84x3vaHpqAAAADKOmtkp4yuIkn6q1rn+hXxEtpRyR5IgkmTBhm5zU2TsM5cGzvXZcskD/0RD91049PT1Jkt7e3pxwwgmZPXt2xo8fn56enmy22WaZPHlyrr/++iTJk08+mUsuuSRbbbXV0+6xbt26nHfeedlll12SJHfeeWeSZPfdd88FF1yQDRs2DN+EnsPatWsH5gpN0IM0Sf/RJP1Hk/QfTRrp/VdqrcP/0FJWJ9kjyU1JnkpsJyR5LMkRtdZ/fb7rXzd5Sh118BnPdwoMmQWdvVl0W9M/82Ck0n/ttHrhvNRaM3/+/IwfPz6LFy8eOLZkyZLcd999+du//dvceeed2XvvvfPTn/40q1evzg477JCOjo785Cc/ye///u9nxYoVGTduXDZs2JAtttgijz76aObOnZuTTjop73nPexqcYZ+enp50d3c3XQYjmB6kSfqPJuk/mqT/aNJI6b9Syi211j2eOd7o3/5rra9/6utSypeTXP5CoS0AtNHSpUtz/vnnp7OzM11dXUmSz372szn88MNz+OGHZ7fddssmm2yS8847L6WUfPe7383ChQszZsyYjBo1Kl/84hczYcKE3HXXXTnwwAOT9K3g/eAHP9iK0BYAAIDhZdkWALwM9tprrzzXb7F89atffdbYoYcemkMPPfRZ45MnT86tt976stcHAADAK0sjwW2tdaeNjB02/JUAAAAAALTPqKYLAAAAAADg6QS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMh1NF/BSjBszOncsnNd0GYxQPT09WX1Id9NlMELpPwAAABgZrLgFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAAAAA0DKCWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABomY6mC3gpHl+3Pjsdf0XTZTBCLejszWH6j4Y03X+rF85r7NkAAAAwklhxCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAL+RNWvWZM6cOZk6dWqmTZuWM844I0nyyU9+MrvuumumT5+eAw88ML/4xS8GrjnllFMyZcqU7LLLLvm3f/u3gfGrrroqu+yyS6ZMmZKFCxcO+1wAAACgrRoJbkspR5dSVpVS7i2l/LKUsrz/dVIT9QDw4nV0dGTRokVZtWpVbrjhhnzhC1/IypUrM3fu3Pzwhz/MihUrsvPOO+eUU05JkqxcuTIXXnhhbr/99lx11VU56qijsn79+qxfvz4f+9jH8q1vfSsrV67MBRdckJUrVzY8OwAAAGiHjoaee1SSfZPsmOTY
Wuv+DdUBwG9o4sSJmThxYpJkiy22yNSpU3Pvvffm3e9+98A5e+65Zy666KIkySWXXJL3v//9GTt2bF7/+tdnypQpufHGG5MkU6ZMyeTJk5Mk73//+3PJJZfkTW960zDPCAAAANpn2FfcllKWJJmc5NIkuw/38wF4+axevTrLli3L7Nmznzb+pS99Kfvuu2+S5N57780OO+wwcGz77bfPvffe+5zjAAAAQAPBba31yCT3JZmTZFmS3y+l3FpK+VYpZdpw1wPAS7N27docdNBBWbx4cbbccsuB8c985jPp6OjIIYcckiSptT7r2lLKc44DAAAAzW2V8JQfJNmx1rq2lLJfkn9N8saNnVhKOSLJEUkyYcI2Oamzd/iqhEFeOy5ZoP9oSNP919PTkyTp7e3NCSeckNmzZ2f8+PED41dddVUuu+yyLFq0KNdff32S5Mknn8z111+f7bffPkmyYsWKzJw5M0ly6623Dlz77W9/+2nPoH3Wrl3r3w+N0oM0Sf/RJP1Hk/QfTRrp/Vc2tuJpyB9ayuoke9Raf/5ixp/pdZOn1FEHnzF0BcLzWNDZm0W3Nf0zD0aqpvtv9cJ5qbVm/vz5GT9+fBYvXjxw7Kqrrspf/MVf5Prrr88222wzMH777bfngx/8YG688cbcd9992XvvvfOf//mfqbVm5513zjXXXJPtttsub37zm/O1r30t06b55Yu26unpSXd3d9NlMILpQZqk/2iS/qNJ+o8mjZT+K6XcUmvd45njjaZPpZTfS/JArbWWUt6Svq0bHmqyJgCe39KlS3P++eens7MzXV1dSZLPfvazOfroo/PEE09k7ty5Sfo+oGzJkiWZNm1aDj744LzpTW9KR0dHvvCFL2T06NFJkjPPPDP77LNP1q9fn8MPP1xoCwAAAP2aXjb4viQfLaX0Jnk8yftrE0uAAXjR9tprr43uT7vffvs95zUnnnhiTjzxxI1e83zXAQAAwEjVSHBba92p/8sz+18AAAAAAPQb1XQBAAAAAAA8neAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABapqPpAl6KcWNG546F85ougxGqp6cnqw/pbroMRij9BwAAACODFbcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAACgZQS3AAAAAAAtI7gFAAAAAGgZwS0AAAAAQMsIbgEAAAAAWkZwCwAAAADQMoJbAAAAAICWEdwCAAAAALSM4BYAAAAAoGUEtwAAAAAALSO4BQAAAABoGcEtAAAAAEDLCG4BAAAAAFpGcAsAAAAA0DKCWwAAAACAlhHcAgAAAAC0jOAWAAAAAKBlBLcAAAAAAC0juAUAAAAAaBnBLQAAAABAywhuAQAAAABaRnALAAAAANAyglsAAAAAgJYR3AIAAAAAtIzgFgAAAADg/2fv/qOuLuu80b8vQEsUMkUKJUTUlF+Kaaknx26HISnGMw9RJjHPYI3HY0yjTvbDsuMz63Sm7jEtmvKM4Th56iltyF/M1OPkYtqZPpZiYaaI9NSdPxq1UMZAJwSu8wfEAkElY9/fL96v11p7sfe1r733e3d/+ue9Lr+7ZRS3AAAAAAAtM6TpAC/G08+sz9jzv9F0DAao8yavy+nmj4b8LvPX1zujy2kAAACAbnHiFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxC/AS9uCDD+akk07K+PHjM3HixHz2s59Nkjz++OOZNm1aDj300EybNi1PPPHE5td0Op1MmTIlEydOzJve9KYkyfLlyzNlypTNt+HDh2f+/PmNfCcAAAAYCBopbkspZ5dSlpVSainlR5tu/7OUcmQTeQBeqoYMGZJLLrkky5Yty/e+971ceumluffee9Pb25upU6dmxYoVmTp1anp7e5Mkq1atyrx587Jo0aLcc889
WbhwYZLksMMOy9KlS7N06dLceeedGTp0aGbOnNnkVwMAAICXtKZO3M5L8tYkb0zyplrrEUk+nmRBQ3kAXpJGjRqV173udUmSYcOGZfz48Xn44Ydzww03ZO7cuUmSuXPn5vrrr0+SfPWrX83b3va2jBkzJkkycuTIbd5z8eLFOfjgg3PggQf207cAAACAgaffi9tSymVJxiVZlOTYWutv//vc7yUZ3d95AAaKvr6+/PCHP8yxxx6bRx99NKNGjUqysdx97LHHkiT3339/nnjiifT09OToo4/Ol770pW3e5+qrr87s2bP7NTsAAAAMNEP6+wNrrWeVUqYnOanW+qstnvrzJP+jv/MADASrV6/OrFmzMn/+/AwfPvw5961bty533nlnFi9enKeffjrHH398jjvuuLz2ta9NkqxduzaLFi3KJz/5yf6KDgAAAANSvxe321NKOSkbi9sTnmfPmUnOTJIRI/bLhZPX9VM62Nqr9kjOM3805HeZv06nk2RjGfuRj3wkxx57bPbZZ590Op0MHz4811xzTfbdd9+sXLkyw4YNS6fTydq1a3P44YfnjjvuSJIceuih+epXv5qenp4kyS233JKDDjooy5Yty7Jly7rxFWmx1atXb54raIIZpEnmjyaZP5pk/mjSQJ+/xovbUsoRSf4hyVtqrSufa1+tdUE2XQN3zLhD6iV3Nx6dAeq8yeti/mjK7zJ/fXN6UmvN3Llz88Y3vjHz58/f/Nw73/nOrFixIrNmzUpvb29OO+209PT05FWvelXe97735YQTTsjatWvzwAMP5KKLLsqkSZOSJJdddlnmzZu3uchlYOl0Ov72NMoM0iTzR5PMH00yfzRpoM9fo+1TKWVMkmuT/Nda6/1NZgF4Kbr11lvz5S9/OZMnT86UKVOSJJ/4xCdy/vnn59RTT80VV1yRMWPGZOHChUmS8ePHZ/r06TniiCMyaNCgnHHGGZtL26eeeio33XRTvvCFLzT2fQAAAGCgaPrY4IVJ9k3y/5ZSkmRdrfWYZiMBvHSccMIJqbVu97nFixdvd/2DH/xgPvjBD26zPnTo0Kxc+Zz/YQQAAACwEzVS3NZax266e8amGwAAAAAAmwxqOgAAAAAAAFtT3AIAAAAAtIziFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS0AAAAAQMsobgEAAAAAWmZI0wFejD12G5zlvTOajsEA1el00jenp+kYDFDmDwAAAAYGJ24BAAAAAFpGcQsAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3AIAAAAAtIziFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxCwAAAADQMopbAAAAAICWUdwCAAAAALSM4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpGcQsAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaZkjTAV6Mp59Zn7Hnf6PpGAxQ501el9N3wvz19c7YCWkAAAAAeCly4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpGcQsNes973pORI0dm0qRJm9cWLlyYiRMnZtCgQVmyZMnm9ZUrV+akk07KXnvtlfe9731bvc8FF1yQ17zmNdlrr736LTsAAAAA3dO14raUcnYpZVkp5ZpSym2llN+UUj7wrD17l1K+Xkq5b9Pe47uVB9ro9NNPz4033rjV2qRJk3LttdfmxBNP3Gr95S9/eT7+8Y/n4osv3uZ9TjnllNx+++1dzQoAAABA/xnSxfeel+QtSdYkOTDJf9nOns8mubHW+vZSyu5JhnYxD7TOiSeemL6+vq3Wxo8f
v929e+65Z0444YT85Cc/2ea54447rhvxAAAAAGhIV07cllIuSzIuyaIkc2qtdyR55ll7hic5MckVSVJrXVtrXdWNPAAAAAAAu5KuFLe11rOS/CLJSbXWzzzHtnFJfpnki6WUH5ZS/qGUsmc38gAAAAAA7Eq6eamEHfns1yX5y1rr90spn01yfpL/a3ubSylnJjkzSUaM2C8XTl7Xb0FhS6/aIzlvJ8xfp9NJkjzyyCNZs2bN5se/tWrVqtx5551ZvXr1Vuv33XdfHn744W32J8n69eu3u85Lx+rVq/2NaYz5o2lmkCaZP5pk/miS+aNJA33+mixuH0ryUK31+5sefz0bi9vtqrUuSLIgScaMO6RecneT0RnIzpu8Ljtj/vrm9Gz8t68ve+65Z3p6erZ6fu+9987RRx+dY445ZuvX9fVl9erV2+xPksGDB293nZeOTqfjb0xjzB9NM4M0yfzRJPNHk8wfTRro89eVSyXsiFrrI0keLKUctmlpapJ7m8oDTZg9e3aOP/74LF++PKNHj84VV1yR6667LqNHj85tt92WGTNm5OSTT968f+zYsXn/+9+fK6+8MqNHj8699278v8yHPvShjB49Ok899VRGjx6dv/7rv27oGwEAAACwM3T92Gop5dVJliQZnmRDKeXcJBNqrU8m+cskXyml7J7kp0ne3e080CZXXXXVdtdnzpy53fW+vr7trl900UW56KKLdlYsAAAAABrWteK21jp2i4ejn2PP0iTHbO85AAAAAICBqrFLJQAAAAAAsH2KWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaJkhTQd4MfbYbXCW985oOgYDVKfTSd+cnqZjAAAAAPAS5sQtAAAAAEDLKG4BAAAAAFpGcQsAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3AIAAAAAtIziFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxCwAAAADQMopbAAAAAICWUdwCAAAAALSM4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpGcQsAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAywxpOsCL8fQz6zP2/G80HYMB6rzJ63L67zl/fb0zdlIaAAAAAF6KnLgFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxCwAAAADQMopbAAAAAICWUdwCAAAAALSM4hYa8p73vCcjR47MpEmTNq8tXLgwEydOzKBBg7JkyZKt9n/yk5/MIYccksMOOyz/+q//miRZvnx5pkyZsvk2fPjwzJ8/v1+/BwAAAAA7XyPFbSnl7FLKslLKV0opPaWUpaWUe0op32kiDzTh9NNPz4033rjV2qRJk3LttdfmxBNP3Gr93nvvzdVXX5177rknN954Y+bNm5f169fnsMMOy9KlS7N06dLceeedGTp0aGbOnNmfXwMAAACALhjS0OfOS/KWJE8k+Z9JptdaHyiljGwoD/S7E088MX19fVutjR8/frt7b7jhhpx22ml52cteloMOOiiHHHJIbr/99hx//PGb9yxevDgHH3xwDjzwwG7GBgAAAKAf9PuJ21LKZUnGJVmU5C+SXFtrfSBJaq2P9Xce2BU8/PDDec1rXrP58ejRo/Pwww9vtefqq6/O7Nmz+zsaAAAAAF3Q78VtrfWsJL9IclKS/ZK8spTSKaXcWUr5s/7OA7uCWus2a6WUzffXrl2bRYsW5R3veEd/xgIAAACgS5q6VMKWn390kqlJ9khyWynle7XW+5+9sZRyZpIzk2TEiP1y4eR1/RoUfutVeyTn/Z7z1+l0kiSPPPJI1qxZs/nxb61atSp3
3nlnVq9enWRjMfud73wno0ePTpL86Ec/yute97rNr7vlllty0EEHZdmyZVm2bNnvlY12W7169TbzAv3F/NE0M0iTzB9NMn80yfzRpIE+f00Xtw8l+VWtdU2SNaWUm5McmWSb4rbWuiDJgiQZM+6QesndTUdnoDpv8rr8vvPXN6dn4799fdlzzz3T09Oz1fN77713jj766BxzzDFJkv322y/vete78vnPfz6/+MUvsnLlypx11lkZPHhwkuSyyy7LvHnztnkfXno6nY6/M40xfzTNDNIk80eTzB9NMn80aaDPX79fKuFZbkjyB6WUIaWUoUmOTeK4IAPC7Nmzc/zxx2f58uUZPXp0rrjiilx33XUZPXp0brvttsyYMSMnn3xykmTixIk59dRTM2HChEyfPj2XXnrp5tL2qaeeyk033ZS3ve1tTX4dAAAAAHaiRo+t1lqXlVJuTPKjJBuS/EOt9cdNZoL+ctVVV213febMmdtdv+CCC3LBBRdssz506NCsXLlyp2YDAAAAoFmNFLe11rFb3P9Ukk81kQMAAAAAoI2avlQCAAAAAADPorgFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxCwAAAADQMopbAAAAAICWGdJ0gBdjj90GZ3nvjKZjMEB1Op30zelpOgYAAAAAL2FO3AIAAAAAtIziFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxCwAAAADQMopbAAAAAICWUdwCAAAAALSM4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpGcQsAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3AIAAAAAtIziFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxCwAAAADQMopbAAAAAICWUdwCAAAAALTMkKYDvBhPP7M+Y8//RtMxGID6emc0HQEAAACAAcCJWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLbwIn/3sZzNp0qRMnDgx8+fPT5LcddddOf744zN58uSccsopefLJJ5MkX/nKVzJlypTNt0GDBmXp0qVNxgcAAACg5bpa3JZSzi6lLCulXFNKua2U8ptSygeetWd6KWV5KeUnpZTzu5kHdoaf/exnufzyy3P77bfnrrvuyr/8y79kxYoVOeOMM9Lb25u77747M2fOzKc+9akkyZw5c7J06dIsXbo0X/7ylzN27NhMmTKl4W8BAAAAQJt1+8TtvCRvTfLeJGcnuXjLJ0spg5NcmuQtSSYkmV1KmdDlTPB7+fnPf57jjjsuQ4cOzZAhQ/KmN70p1113XZYvX54TTzwxSTJt2rRcc80127z2qquuyuzZs/s7MgAAAAC7mK4Vt6WUy5KMS7IoyZxa6x1JnnnWtjck+Umt9ae11rVJrk7yJ93KBDvDQQcdlJtvvjkrV67MU089lW9+85t58MEHM2nSpCxatChJsnDhwjz44IPbvPZrX/ua4hYAAACAF9S14rbWelaSXyQ5qdb6mefYdkCSLduthzatQWsdeOCB+fCHP5xp06Zl+vTpOfLIIzNkyJD84z/+Yy699NIcffTR+fWvf53dd999q9d9//vfz9ChQzNp0qSGkgMAAACwqxjS8OeX7azV7W4s5cwkZybJiBH75cLJ67qZC7ar0+lk9erVOfjgg/PpT386SXL55Zfn5S9/eR555JF89KMfTZI8+OCDGTlyZDqdzubXXnrppTn22GO3WoPf1erVq80QjTF/NM0M0iTzR5PMH00yfzRpoM9f08XtQ0les8Xj0dl4SncbtdYFSRYkyZhxh9RL7m46OgNR35yedDqdTJgwISNHjswDDzyQO++8M7fddlue
eeaZjBw5Mhs2bMjpp5+eD37wg+np6UmSbNiwIX/6p3+am2++OePGjWv2S7BL63Q6m+cK+pv5o2lmJJBfugAAIABJREFUkCaZP5pk/miS+aNJA33+uv3jZC/kjiSHllIOKqXsnuS0bLwmLrTarFmzMmHChJxyyim59NJL88pXvjJXXXVVXvva1+bwww/P/vvvn3e/+92b9998880ZPXq00hYAAACAHdIvx1ZLKa9OsiTJ8CQbSinnJplQa32ylPK+JP+aZHCSf6y13tMfmeD38d3vfnebtXPOOSfnnHPOdvf39PTke9/7XrdjAQAAAPAS0dXittY6douHo59jzzeTfLObOQAAAAAAdiVNXyoBAAAAAIBnUdwCAAAAALSM4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpmSNMBXow9dhuc5b0zmo4BAAAAANAVTtwCAAAAALSM4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpGcQsAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3AIAAAAAtIziFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxCwAAAADQMopbAAAAAICWUdwCAAAAALSM4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpGcQsAAAAA0DKKWwAAAACAllHcAgAAAAC0zJCmA7wYTz+zPmPP/0bTMdjJ+npnNB0BAAAAAFrBiVsAAAAAgJZR3AIAAAAAtIziFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS2tsnz58kyZMmXzbfjw4Zk/f/7m5y+++OKUUvKrX/0qSXLffffl+OOPz8te9rJcfPHFTcUGAAAAgJ1qSDffvJRydpL3Jrk3yf5JXpfkglrrxVvs+askZySpSe5O8u5a6392Mxftddhhh2Xp0qVJkvXr1+eAAw7IzJkzkyQPPvhgbrrppowZM2bz/n322Sd/93d/l+uvv76RvAAAAADQDd0+cTsvyVuzsbw9O8lWRyJLKQdsWj+m1jopyeAkp3U5E7uIxYsX5+CDD86BBx6YJPmrv/qrXHTRRSmlbN4zcuTIvP71r89uu+3WVEwAAAAA2Om6VtyWUi5LMi7JoiRzaq13JHlmO1uHJNmjlDIkydAkv+hWJnYtV199dWbPnp0kWbRoUQ444IAceeSRDacCAAAAgO7r2qUSaq1nlVKmJzmp1vqr59jzcCnl4iQPJHk6ybdqrd/qViZ2HWvXrs2iRYvyyU9+Mk899VT+5m/+Jt/6ltEAAAAAYGDo6jVuX0gp5ZVJ/iTJQUlWJVlYSvnTWut/387eM5OcmSQjRuyXCyev69esdF+n09l8/5ZbbslBBx2UZcuW5ac//Wnuv//+HHbYYUmSX/7yl5k4cWL+/u//Pvvss0+SpK+vL3vsscdW79Etq1ev7pfPge0xfzTJ/NE0M0iTzB9NMn80yfzRpIE+f40Wt0n+KMnPaq2/TJJSyrVJ/rck2xS3tdYFSRYkyZhxh9RL7m46Ojtb35yezfcvu+yyzJs3Lz09Penp6cl73vOezc+NHTs2S5YsyYgRIzavdTqd7LXXXunp6Um3dTqdfvkc2B7zR5PMH00zgzTJ/NEk80eTzB9NGujz13T7+UCS40opQ7PxUglTkyxpNhJNe+qpp3LTTTflC1/4wgvufeSRR3LMMcfkySefzKBBgzJ//vzce++9GT58eD8kBQAAAIDu6JfitpTy6mwsZIcn2VBKOTfJhFrr90spX0/ygyTrkvwwm07VMnANHTo0K1eufM7n+/r6Nt9/9atfnYceeqgfUgEAAABA/+lqcVtrHbvFw9HPsee/Jflv3cwBAAAAALAr
GdR0AAAAAAAAtqa4BQAAAABoGcUtAAAAAEDL/M7FbSnllaWUI7oRBgAAAACAHSxuSymdUsrwUso+Se5K8sVSyqe7Gw0AAAAAYGDa0RO3r6i1PpnkbUm+WGs9OskfdS8WAAAAAMDANWRH95VSRiU5NckFXcyzQ/bYbXCW985oOgYAAAAAQFfs6Inb/zvJvyb5X7XWO0op45Ks6F4sAAAAAICBa4dO3NZaFyZZuMXjnyaZ1a1QAAAAAAAD2Y7+ONlrSymLSyk/3vT4iFLKx7obDQAAAABgYNrRSyVcnuQjSZ5Jklrrj5Kc1q1QAAAAAAAD2Y4Wt0Nrrbc/a23dzg4DAAAAAMCOF7e/KqUcnKQmSSnl7Un+vWupAAAAAAAGsB36cbIkf5FkQZLDSykPJ/lZkjldSwUAAAAAMIC9YHFbShmU5Jha6x+VUvZMMqjW+uvuRwMAAAAAGJhe8FIJtdYNSd636f4apS0AAAAAQHft6DVubyqlfKCU8ppSyj6/vXU1GQAAAADAALWj17h9z6Z//2KLtZpk3M6NAwAAAADADhW3tdaDuh0EAAAAAICNdqi4LaX82fbWa61f2rlxAAAAAADY0UslvH6L+y9PMjXJD5IobgEAAAAAdrIdvVTCX275uJTyiiRf7koiAAAAAIABbtCLfN1TSQ7dmUEAAAAAANhoR69x+89J6qaHg5JMSLKwW6EAAAAAAAayHb3G7cVb3F+X5Oe11oe6kAcAAAAAYMDb0UslvLXW+p1Nt1trrQ+VUv62q8kAAAAAAAaoHS1up21n7S07MwgAAAAAABs976USSinvTTIvybhSyo+2eGpYklu7GQwAAAAAYKB6oWvcfjXJ/0jyySTnb7H+61rr411LBQAAAAAwgD1vcVtr/Y8k/5FkdpKUUkYmeXmSvUope9VaH+h+RAAAAACAgWWHrnFbSjmllLIiyc+SfCdJXzaexAUAAAAAYCfb0R8n+3+SHJfk/lrrQUmmxjVuAQAAAAC6YkeL22dqrSuTDCqlDKq1fjvJlC7mAgAAAAAYsF7ox8l+a1UpZa8k303ylVLKY0nWdS8WAAAAAMDAtaMnbv8kyVNJzk1yY5L/leSUboUCAAAAABjIdujEba11TSnlwCSH1lr/v1LK0CSDuxsNAAAAAGBg2qETt6WU/yPJ15N8YdPSAUmu71YoAAAAAICBbEcvlfAXSd6Y5MkkqbWuSDKyW6EAAAAAAAayHS1uf1NrXfvbB6WUIUlqdyIBAAAAAAxsO1rcfqeU8tEke5RSpiVZmOSfuxcLAAAAAGDg2qEfJ0tyfpI/T3J3kv8zyTeT/EO3Qr2Qp59Zn7Hnf6OpjydJX++MpiMAAAAAwEvW8xa3pZQxtdYHaq0bkly+6QYAAAAAQBe90KUSrv/tnVLKNV3OAgAAAABAXri4LVvcH9fNIAAAAAAAbPRCxW19jvsAAAAAAHTJC/042ZGllCez8eTtHpvuZ9PjWmsd3tV0AAAAAAAD0PMWt7XWwf0VBAAAAACAjV7oUgnwvFatWpW3v/3tOfzwwzN+/PjcdtttefzxxzNt2rQceuihmTZtWp544okkyRNPPJGZM2fmiCOOyBve8Ib8+Mc/bjg9AAAAALRT14rbUsrZpZRlpZRrSim3lVJ+U0r5wBbPv6aU8u1Ne+4ppZzTrSx0zznnnJPp06fnvvvuy1133ZXx48ent7c3U6dOzYoVKzJ16tT09vYmST7xiU9kypQp+dGPfpQvfelLOeccf3IAAAAA2J5unridl+StSd6b5OwkFz/r+XVJzqu1jk9yXJK/KKVM6GIedrInn3wyN998c/78z/88SbL77rtn7733zg033JC5c+cmSebOnZvrr78+SXLvvfdm6tSpSZLDDz88fX19efTRR5sJDwAAAAAt1pXitpRyWZJxSRYlmVNrvSPJM1vuqbX+e631B5vu/zrJsiQHdCMP3fHTn/40++23X9797nfnqKOOyhlnnJE1a9bk0UcfzahR
o5Iko0aNymOPPZYkOfLII3PttdcmSW6//fb8/Oc/z0MPPdRYfgAAAABoq64Ut7XWs5L8IslJtdbPvND+UsrYJEcl+X438tAd69atyw9+8IO8973vzQ9/+MPsueeemy+LsD3nn39+nnjiiUyZMiWf+9znctRRR2XIkOf9fTwAAAAAGJAab81KKXsluSbJubXWJ59n35lJzkySESP2y4WT1/VTQran0+nk8ccfz4gRI/L000+n0+nk4IMPzle/+tUMHz4811xzTfbdd9+sXLkyw4YNS6fTSbLx0glz585NrTWzZ8/OQw89tPnHy3YVq1ev3vx9oL+ZP5pk/miaGaRJ5o8mmT+aZP5o0kCfv0aL21LKbtlY2n6l1nrt8+2ttS5IsiBJxow7pF5yd+Od84DWN6cnSfKZz3wmo0aNymGHHZZOp5M/+IM/SJKsWLEis2bNSm9vb0477bT09PRk1apVGTp0aHbfffdcfvnlefOb35wZM2Y0+C1enE6nk56enqZjMECZP5pk/miaGaRJ5o8mmT+aZP5o0kCfv8baz1JKSXJFkmW11k83lYPfz+c+97nMmTMna9euzbhx4/LFL34xGzZsyKmnnporrrgiY8aMycKFC5Mky5Yty5/92Z9l8ODBmTBhQq644oqG0wMAAABAO3W9uC2lvDrJkiTDk2wopZybZEKSI5L81yR3l1KWbtr+0VrrN7udiZ1nypQpWbJkyTbrixcv3mbt+OOPz4oVK/ojFgAAAADs0rpW3NZax27xcPR2ttySpHTr8wEAAAAAdlWDmg4AAAAAAMDWFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyQ5oO8GLssdvgLO+d0XQMAAAAAICucOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3AIAAAAAtIziFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxCwAAAADQMopbAAAAAICWUdwCAAAAALSM4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpGcQsAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3AIAAAAAtIziFgAAAACgZYY0HeDFePqZ9Rl7/jeajjEg9PXOaDoCAAAAAAw4TtwCAAAAALSM4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpmSNMB2DWMHTs2w4YNy+DBgzNkyJAsWbIk73znO7N8+fIkyapVq7L33ntn6dKluemmm3L++edn7dq12X333fOpT30qf/iHf9jwNwAAAACAXUcjxW0p5ewk701yb5L9k7wuyQW11oubyMOO+fa3v50RI0Zsfvy1r31t8/3zzjsvr3jFK5IkI0aMyD//8z9n//33z49//OOcfPLJefjhh/s9LwAAAADsqpo6cTsvyVuSrElyYJL/0lAOdoJaa/7pn/4p//Zv/5YkOeqoozY/N3HixPznf/5nfvOb3+RlL3tZUxEBAAAAYJfS79e4LaVclmRckkVJ5tRa70jyTH/n4HdTSsmb3/zmHH300VmwYMFWz333u9/Nq171qhx66KHbvO6aa67JUUcdpbQFAAAAgN9Bv5+4rbWeVUqZnuSkWuuv+vvzeXFuvfXW7L///nnssccybdq0HH744TnxxBOTJFdddVVmz569zWvuueeefPjDH863vvWt/o4LAAAAALu0Umvt/w8tpS/JMb8tbkspf51k9fNd47aUcmaSM5NkxIj9jr5w/uX9kJTJB7xim7Urr7wye+yxR975zndm/fr1ecc73pEvfOEL2W+//Tbv+eUvf5n3v//9+dCHPpTJkyf3
Z+SuW716dfbaa6+mYzBAmT+aZP5omhmkSeaPJpk/mmT+aNJAmb+TTjrpzlrrMc9eb+oat7+zWuuCJAuSZMy4Q+old+8y0XdpfXN6smbNmmzYsCHDhg3LmjVr8tGPfjQXXnhhenp6cuONN2by5Ml5xzvesfk1q1atypve9KbMnz8/s2bNajB9d3Q6nfT09DQdgwHK/NEk80fTzCBNMn80yfzRJPNHkwb6/PX7NW7Z9Tz66KM54YQTcuSRR+YNb3hDZsyYkenTpydJrr766m0uk/D5z38+P/nJT/Lxj388U6ZMyZQpU/LYY481ER0AAAAAdkmNHlstpbw6yZIkw5NsKKWcm2RCrfXJJnOxtXHjxuWuu+7a7nNXXnnlNmsf+9jH8rGPfazLqQAAAADgpauR4rbWOnaLh6ObyAAAAAAA0FYulQAAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaJkhTQd4MfbYbXCW985oOgYAAAAAQFc4cQsAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3AIAAAAAtIziFgAAAACgZRS3AAAAAAAto7gFAAAAAGgZxS0AAAAAQMsobgEAAAAAWkZxCwAAAADQMopbAAAAAICWUdwCAAAAALSM4hYAAAAAoGUUtwAAAAAALaO4BQAAAABoGcUtAAAAAEDLKG4BAAAAAFpGcQsAAAAA0DKKWwAAAACAllHcAgAAAAC0jOIWAAAAAKBlFLcAAAAAAC2juAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyQ5oO8GI8/cz6jD3/G03H2GX19c5oOgIAAAAA8DycuAUAAAAAaBnFLQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3A5gY8eOzeTJkzNlypQcc8wxSZLHH38806ZNy6GHHppp06bliSee2Oo1d9xxRwYPHpyvf/3rTUQGAAAAgAGhq8VtKeXsUsqyUso1pZTbSim/KaV8YDv7BpdSflhK+Zdu5mFb3/72t7N06dIsWbIkSdLb25upU6dmxYoVmTp1anp7ezfvXb9+fT784Q/n5JNPbiouAAAAAAwI3T5xOy/JW5O8N8nZSS5+jn3nJFnW5SzsgBtuuCFz585NksydOzfXX3/95uc+97nPZdasWRk5cmRT8QAAAABgQOhacVtKuSzJuCSLksyptd6R5Jnt7BudZEaSf+hWFravlJI3v/nNOfroo7NgwYIkyaOPPppRo0YlSUaNGpXHHnssSfLwww/nuuuuy1lnndVYXgAAAAAYKIZ0641rrWeVUqYnOanW+qvn2To/yYeSDOtWFrbv1ltvzf7775/HHnss06ZNy+GHH/6ce88999z87d/+bQYPHtyPCQEAAABgYOpacbsjSil/nOSxWuudpZSeF9h7ZpIzk2TEiP1y4eR1/ZDwpanT6Wy+f//99ydJjjrqqFx11VUZPnx4rrnmmuy7775ZuXJlhg0blk6nk1tuuSXf/e53kyT/8R//kRtuuCH33XdfTjjhhCa+QqNWr1691f+G0J/MH00yfzTNDNIk80eTzB9NMn80aaDPX6PFbZI3JvnfSylvTfLyJMNLKf+91vqnz95Ya12QZEGSjBl3SL3k7qaj77r65vRkzZo12bBhQ4YNG5Y1a9bkox/9aC688MLstddeWbFiRWbNmpXe3t6cdtpp6enpyb//+79vfv3pp5+eP/7jP87b3/72Br9FczqdTnp6epqOwQBl/miS+aNpZpAmmT+aZP5okvmjSQN9/hptP2utH0nykSTZdOL2A9srbdn5Hn300cycOTNJsm7durzrXe/K9OnT8/rXvz6nnnpqrrjiiowZMyYLFy5sOCkAAAAADDz9UtyWUl6dZEmS4Uk2lFLOTTKh1vpk
f3w+2xo3blzuuuuubdb33XffLF68+Hlfe+WVV3YpFQAAAACQdLm4rbWO3eLh6BfY20nS6WIcAAAAAIBdwqCmAwAAAAAAsDXFLQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3AIAAAAAtIziFgAAAACgZYY0HeDF2GO3wVneO6PpGAAAAAAAXeHELQAAAABAyyhuAQAAAABaRnELAAAAANAyilsAAAAAgJZR3AIAAAAAtIziFgAAAACgZRS3AAAAAAD/P3v3Hq1XXd/7/vNLVoKRcDk1xEYQs7NRQHJZQkraIeCK7njCRVtFQcxhR5FSoYpy3K1YhxHdfzRbZAAHLxC1kmOVVLEWhtAcO6RrY90ilyYCVgMWwwG5RBCGJAZZCb/9B4s0IQExrmfNX7JerzHWYD3zmc8zv2vw/es95phpjHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDF9XQ+wMzYObc70c6/peoyeWLv0+K5HAAAAAAA65o5bAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcNugxx9/PEceeWTmzJmTww47LB/96Ee3ef+9731vJk+evOX1Oeeck/7+/vT39+cVr3hF9t1339EeGQAAAAAYQX29/PJSytlJzkzyb0lekuTwJB+utX5y+P0XJLk+yR7Ds1xZa/3os3zdmLHHHnvkuuuuy+TJkzM0NJSjjjoqxx57bP7wD/8wN998cx599NFtzr/wwgu3/H7JJZdk1apVoz0yAAAAADCCen3H7VlJjstT8fbsJJ98xvu/TvLaWuucJP1JFpZS/rDHMzWvlLLljtqhoaEMDQ2llJLNmzfnL/7iL/KJT3ziWT97xRVX5JRTThmtUQEAAACAHuhZuC2lXJpkRpKrkyyqtd6UZGjrc+pT1g+/nDD8U3s1065k8+bN6e/vz9SpU7NgwYLMmzcvn/rUp/LGN74x06ZN2+Fn7r777vz0pz/Na1/72lGeFgAAAAAYST17VEKt9d2llIVJ5tdaH3q280op45PckuSgJJ+utX6/VzPtSsaPH5/Vq1fn0UcfzZve9KZcf/31+drXvpbBwcFn/cyKFSvylre8JePHjx+9QQEAAACAEVdq7d0NrqWUtUnmPh1uSynnJVn/9DNun3Huvkm+keS9tdbbd/D+GUnOSJIpU/Y7YslFn+vZ3F2atf8+2x1bvnx5kuSqq67KxIkTkyTr1q3LtGnT8uUvf3nLeX/6p3+a973vfZk5c+boDDtGrV+/fpt/HA5Gk/2jS/aPrtlBumT/6JL9o0v2jy6Nlf2bP3/+LbXWuc883tN/nOy3UWt9tJQymGRhku3Cba11WZJlSXLgjIPqBbc1M/qIWrtoID//+c8zYcKE7Lvvvtm4cWM+8pGP5IMf/GC++MUvbjlv8uTJ+dnPfrbl9Zo1azI0NJQ///M/Tymli9HHjMHBwQwMDHQ9BmOU/aNL9o+u2UG6ZP/okv2jS/aPLo31/eu0fpZS9ksyNBxtJyX5L0n+R5czteD+++/P4sWLs3nz5jz55JM56aSTcsIJJzznZ6644oq87W1vE20BAAAAYDcwKuG2lPL7SW5OsneSJ0sp70/yyiTTkiwffs7tuCRfrbV+czRmatns2bOzatWq5zxn/fr127w+77zzejgRAAAAADCaehpua63Tt3p5wA5OuTXJq3o5AwAAAADArmZc1wMAAAAAALAt4RYAAAAAoDHCLQAAAABA
Y4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGtPX9QA7Y9KE8Vmz9PiuxwAAAAAA6Al33AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0pq/rAXbGxqHNmX7uNT37/rVLj+/ZdwMAAAAA/CbuuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW6fxWmnnZapU6dm5syZW46dd9552X///dPf35/+/v5ce+21SZIbb7xxy7E5c+bkG9/4RldjAwAAAAC7gZ6G21LK2aWUH5VSainl1uGf/1VKmbPVOQtLKWtKKT8ppZzby3l+G+94xzuycuXK7Y6fc845Wb16dVavXp3jjjsuSTJz5szcfPPNWb16dVauXJk/+7M/y6ZNm0Z7ZAAAAABgN9HX4+8/K8mxSaYl+VGt9ZFSyrFJliWZV0oZn+TTSRYkuTfJTaWUq2ut/9bjuX6jY445JmvXrn1e577whS/c8vvjjz+eUkqPpgIAAAAAxoKe3XFbSrk0yYwkVyeZV2t9ZPitG5IcMPz7kUl+Umu9q9b6RJIVSf64VzONhE996lOZPXt2TjvttDzyyCNbjn//+9/PYYcdllmzZuXSSy9NX1+vmzgAAAAAsLvqWbittb47yX1J5tdaL9zqrXcl+cfh3/dPcs9W7907fKxJZ555Zv5z2k1aAAAgAElEQVT93/89q1evzrRp0/KBD3xgy3vz5s3LD3/4w9x0003567/+6zz++OMdTgoAAAAA7MpG9bbQUsr8PBVuj3r60A5Oq8/y2TOSnJEkU6bslyWzevcM2cHBwSTJAw88kA0bNmx5vbVZs2blK1/5yg7fGxoayvLly3PwwQf3bEa6s379+h3+f4fRYP/okv2ja3aQLtk/umT/6JL9o0tjff9GLdyWUmYn+XySY2utDw8fvjfJS7c67YA8dZfudmqty/LUs3Fz4IyD6gW39W70tYsGnvrv2rXZc889MzDw1Ov7778/06ZNS5JceOGFmTdvXgYGBvLTn/40L33pS9PX15e77747Dz74YE488cRMmTKlZzPSncHBwS07AaPN/tEl+0fX7CBdsn90yf7RJftHl8b6/o1KuC2lHJjk75OcWmu9Y6u3bkry8lLKf0rysyRvS/L20ZjpNznllFMyODiYhx56KAcccEA+9rGPZXBwMKtXr04pJdOnT89ll12WJPmXf/mXLF26NBMmTMi4cePymc98RrQFAAAAAHbaaN1xuyTJi5J8ppSSJJtqrXNrrZtKKe9J8v8lGZ/kb2qtPxylmZ7TFVdcsd2xd73rXTs899RTT82pp57a65EAAAAAgDGip+G21jp9+NfTh392dM61Sa7t5RwAAAAAALuScV0PAAAAAADAtoRbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGhMX9cD7IxJE8ZnzdLjux4DAAAAAKAn3HELAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAA
QGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0Ji+rgfYGRuHNmf6udfs9OfXLj1+BKcBAAAAABhZ7rgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMaM6XB72mmnZerUqZk5c+aWY7/4xS+yYMGCvPzlL8+CBQvyyCOPJEkGBwezzz77pL+/P/39/fn4xz/e1dgAAAAAwG6uk3BbSjm7lPKjUsqGUsrq4Z/bSymbSym/N1pzvOMd78jKlSu3ObZ06dK87nWvy5133pnXve51Wbp06Zb3jj766KxevTqrV6/OkiVLRmtMAAAAAGCM6eqO27OSHFdr3bPW2l9r7U/yoST/s9b6i9Ea4phjjsnv/d62nfiqq67K4sWLkySLFy/OP/zDP4zWOAAAAAAASToIt6WUS5PMSHJ1KeWcrd46JckVoz3PMz344IOZNm1akmTatGlZt27dlve+973vZc6cOTn22GPzwx/+sKsRAQAAAIDdXN9oX7DW+u5SysIk82utDyVJKeWFSRYmec9oz/N8HX744bn77rszefLkXHvttfmTP/mT3HnnnV2PBQAAAADshkY93D6LNyT57nM9JqGUckaSM5JkypT9smTWpp2+2ODg4JbfH3jggWzYsGHLsb333jtf//rX86IXvSgPP/xw9tprr23OT5IXvvCFeeyxx3LVVVdln3322ek52DWtX79+u52A0WL/6JL9o2t2kC7ZP7pk/+iS/aNLY33/Wgm3b8tveExCrXVZkmVJcuCMg+oFt+386GsXDfzH72vXZs8998zAwFPHTj755Nx555058cQTs3Tp0rztbW/LwMBAHnjggbz4xS9OKSU33nhjJk6cmDe+8Y0ppez0HOyaBgcHt+wLjDb7R5fsH12zg3TJ/tEl+0eX7B9dGuv713m4LaXsk+Q1Sf6v0b72KaecksHBwTz00EM54IAD8rGPfSznnntuTjrppHzhC1/IgQcemK997WtJkiuvvDKf/exn09fXl0mTJmXFihWiLQAAAADQE52H2yRvSvKtWuuG0b7wFVfs+Cbfb3/729sde8973pP3vKfZR/ACAAAAALuRTsJtrXX6Vr9fnuTyLuYAAAAAAGjRuK4HAAAAAABgW8ItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaExf1wPsjEkTxmfN0uO7HgMAAAAAoCfccQsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQmL6uB9gZG4c2Z/q51zznOWuXHj9K0wAAAAAAjCx33AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY3b7cDt9+vTMmjUr/f39mTt37pbjl1xySQ4++OAcdthh+cu//MsOJwQAAAAA2FZfr764lHJ2kjOTHJLktuHD65OcWWv9wfA5a5M8lmRzkk211rk7+Krf2T//8z9nypQp27y+6qqrcuutt2aPPfbIunXrenFZAAAAAICd0rNwm+SsJMcmmZbkR7XWR0opxyZZlmTeVufNr7U+
1MM5tvPZz3425557bvbYY48kydSpU0fz8gAAAAAAz6knj0oopVyaZEaSq5PMq7U+MvzWDUkO6MU1n2OWvP71r88RRxyRZcuWJUnuuOOOfOc738m8efPymte8JjfddNNojgQAAAAA8Jx6csdtrfXdpZSF2f5u2ncl+cetT03yrVJKTXJZrXXZSM/y3e9+Ny95yUuybt26LFiwIIccckg2bdqURx55JDfccENuuummnHTSSbnrrrtSShnpywMAAAAA/NZKrbU3X/zU82vnPh1uSynzk3wmyVG11oeHj72k1npfKWVqkn9K8t5a6/XP8n1nJDkjSaZM2e+IJRd97jmvP2v/fbY7dvnll2fSpEm55ZZb8va3vz39/f1JkkWLFuXTn/509t133537YxlT1q9fn8mTJ3c9BmOU/aNL9o+u2UG6ZP/okv2jS/aPLo2V/Zs/f/4tO/q3v3r5jNstSimzk3w+ybFPR9skqbXeN/zfdaWUbyQ5MskOw+3w3bjLkuTAGQfVC2577tHXLhrIhg0b8uSTT2avvfbKhg0b8ld/9VdZsmRJ5syZk/vuuy8DAwO54447Mm7cuPzxH/+xO255XgYHBzMwMND1GIxR9o8u2T+6Zgfpkv2jS/aPLtk/ujTW96/n4baUcmCSv09yaq31jq2O75lkXK31seHfX5/k4yN57QcffDBvetObkiSbNm3K29/+9ixcuDBPPPFETjvttMycOTMTJ07M8uXLRVsAAAAAoBmjccftkiQvSvKZ4Ti6afjW3xcn+cbwsb4kX6m1rhzJC8+YMSM/+MEPtjs+ceLE/O3f/u1IXgoAAAAAYMT0LNzWWqcP/3r68M8z378ryZxeXR8AAAAAYFc1rusBAAAAAADYlnALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGtPX9QA7Y9KE8Vmz9PiuxwAAAAAA6Al33AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0pq/rAXbGxqHNmX7uNTt8b+3S40d5GgAAAACAkeWOWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABozG4dbjdv3pxXvepVOeGEE5Ik1113XQ4//PDMnDkzixcvzqZNmzqeEAAAAABgez0Lt6WUs0spPyql1FLKrcM//6uUMmerc/YtpVxZSvnx8Ll/NJIzXHzxxTn00EOTJE8++WQWL16cFStW5Pbbb8/LXvayLF++fCQvBwAAAAAwInp5x+1ZSY5L8uokr6m1zk7y35Ms2+qci5OsrLUekmROkh+N1MXvvffeXHPNNTn99NOTJA8//HD22GOPvOIVr0iSLFiwIF//+tdH6nIAAAAAACOmJ+G2lHJpkhlJrk4yr9b6yPBbNyQ5YPicvZMck+QLSVJrfaLW+uhIzfD+978/n/jEJzJu3FN/4pQpUzI0NJSbb745SXLllVfmnnvuGanLAQAAAACMmJ6E21rru5Pcl2R+rfXCrd56V5J/HP59RpKfJ/liKWVVKeXzpZQ9R+L63/zmNzN16tQcccQRW46VUrJixYqcc845OfLII7PXXnulr69vJC4HAAAAADCiSq21N19cytokc2utDw2/np/kM0mOqrU+XEqZm6fuwH11rfX7pZSLk/yy1vqRZ/m+M5Kc
kSRTpux3xJKLPrfD687af5987nOfy7e+9a2MHz8+TzzxRH71q1/l6KOPzoc//OEt591000255pprct55543Y38zYsH79+kyePLnrMRij7B9dsn90zQ7SJftHl+wfXbJ/dGms7N/8+fNvqbXOfebxUQm3pZTZSb6R5Nha6x3D7/9+khtqrdOHXx+d5Nxa6/G/6bsPnHFQHXfSxTt8b+3SbT8+ODiYT37yk/nmN7+ZdevWZerUqfn1r3+d4447Lh/+8Ifz2te+9nf4KxmLBgcHMzAw0PUYjFH2jy7ZP7pmB+mS/aNL9o8u2T+6NFb2r5Syw3Dby3+c7OkLH5jk75Oc+nS0TZJa6wNJ7imlHDx86HVJ/q2Xs5x//vk59NBDM3v27LzhDW8QbQEAAACAJo3GQ16XJHlRks+UUpJk01YF+b1JvlxKmZjkriTvHOmLDwwMbCnz559/fs4///yRvgQAAAAAwIjqWbh9+hEISU4f/tnROauTbHcbMAAAAADAWNbzRyUAAAAAAPDbEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDF9XQ+wMyZNGJ81S4/vegwAAAAAgJ5wxy0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY/q6HmBnbBzanOnnXrPNsbVLj+9oGgAAAACAkeWOWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANCY3SrcPv744znyyCMzZ86cHHbYYfnoRz+aJFm0aFEOPvjgzJw5M6eddlqGhoY6nhQAAAAA4Nn1LNyWUs4upfyolPL1Usr3Sim/LqX8t2ec8zellHWllNtH4pp77LFHrrvuuvzgBz/I6tWrs3Llytxwww1ZtGhRfvzjH+e2227Lxo0b8/nPf34kLgcAAAAA0BN9Pfzus5Icm2RDkpcl+ZMdnHN5kk8l+X9H4oKllEyePDlJMjQ0lKGhoZRSctxxx20558gjj8y99947EpcDAAAAAOiJntxxW0q5NMmMJFcnWVRrvSnJds8nqLVen+QXI3ntzZs3p7+/P1OnTs2CBQsyb968Le8NDQ3lS1/6UhYuXDiSlwQAAAAAGFE9Cbe11ncnuS/J/Frrhb24xrMZP358Vq9enXvvvTc33nhjbr/9P57CcNZZZ+WYY47J0UcfPZojAQAAAAD8VkqttTdfXMraJHNrrQ8Nvz4vyfpa6yefcd70JN+stc78Dd93RpIzkmTKlP2OWHLR57Z5f9b++2z3meXLl+cFL3hBTj755Cxfvjx33nlnPv7xj2fcuN3q32RjlK1fv37LIzlgtNk/umT/6JodpEv2jy7ZP7pk/+jSWNm/+fPn31JrnfvM4718xu2IqrUuS7IsSQ6ccVC94LZtR1+7aCA///nPM2HChOy7777ZuHFjPvKRj+SDH/xgfvKTn2TNmjX59re/nUmTJnUxPruRwcHBDAwMdD0GY5T9o0v2j67ZQbpk/+iS/aNL9o8ujfX922XC7fNx//33Z/Hixdm8eXOefPLJnHTSSTnhhBPS19eXl73sZfmjP/qjJMmb3/zmLFmypONpAQAAAAB2rOfhtpTy+0luTrJ3kidLKe9P8spa6y9LKVckGUgypZRyb5KP1lq/sLPXmj17dlatWrXd8U2bNu3sVwIA
AAAAjLqehdta6/StXh7wLOec0qvrAwAAAADsqvwrXQAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxfV0PsDMmTRifNUuP73oMAAAAAICecMctAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGP6uh5gZ2wc2pzp516z5fXapcd3OA0AAAAAwMhyxy0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANGa3Cbf33HNP5s+fn0MPPTSHHXZYLr744iTJySefnP7+/vT392f69Onp7+/veFIAAAAAgOfW18VFSylnJzkzye8nuSfJk0k2JXl/rfVfduY7+/r6csEFF+Twww/PY489liOOOCILFizI3/3d32055wMf+ED22WefEfgLAAAAAAB6p5Nwm+SsJMcm+XmSDbXWWkqZneSrSQ7ZmS+cNm1apk2bliTZa6+9cuihh+ZnP/tZXvnKVyZJaq356le/muuuu25E/gAAAAAAgF4Z9UcllFIuTTIjydVJ/rTWWoff2jNJfdYP/hbWrl2bVatWZd68eVuOfec738mLX/zivPzlLx+JSwAAAAAA9Myo33Fba313KWVhkvm11odKKW9K8tdJpiY5/nf9/vXr1+fEE0/MRRddlL333nvL8SuuuCKnnHLK7/r1AAAAAAA9V/7jhtdRvGgpa5PMrbU+tNWxY5IsqbX+l2f5zBlJzkiSKVP2O2LJRZ/b8t6s/Z96bu2mTZvyoQ99KH/wB3+Qk046acv7mzdvzlvf+tZcdtll2W+//XrwFzGWrF+/PpMnT+56DMYo+0eX7B9ds4N0yf7RJftHl+wfXRor+zd//vxbaq1zn3m8q2fcbqfWen0p5T+XUqZsHXS3en9ZkmVJcuCMg+oFt/3H6GsXDaTWmsWLF+fVr351Lrroom0+u3LlysyaNStvfetbe/xXMBYMDg5mYGCg6zEYo+wfXbJ/dM0O0iX7R5fsH12yf3RprO/fqD/jdmullINKKWX498OTTEzy8M5813e/+9186UtfynXXXZf+/v709/fn2muvTZKsWLHCYxIAAAAAgF1G13fcnpjkv5ZShpJsTHJy3clnNxx11FF5to9efvnlOz0gAAAAAMBo6yTc1lqnD//6P4Z/AAAAAAAY1umjEgAAAAAA2J5wCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABrT1/UAO2PShPFZs/T4rscAAAAAAOgJd9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANGaXDLcbhzZ3PQIAAAAAQM/skuEWAAAAAGB3JtwC
AAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxu3S4ffTRR/OWt7wlhxxySA499NB873vf63okAAAAAIDfWV8XFy2lnJ3kzCQ/Hp7hwOH/frLW+sXn+z3ve9/7snDhwlx55ZV54okn8qtf/ao3AwMAAAAAjKJOwm2Ss5Icm+SUJPvUWt9QStkvyZpSypdrrU/8pi/45S9/meuvvz6XX355kmTixImZOHFiL2cGAAAAABgVo/6ohFLKpUlmJLk6SU2yVymlJJmc5BdJNj2f77nrrruy33775Z3vfGde9apX5fTTT8+GDRt6NjcAAAAAwGgZ9XBba313kvuSzE/yqSSHDr++Lcn7aq1PPp/v2bRpU/71X/81Z555ZlatWpU999wzS5cu7dncAAAAAACjpdRaR/+ipaxNMjfJQJJXJ/m/k/znJP+UZE6t9Zc7+MwZSc5IkilT9jvisssuzVlnnZUVK1YkSW699dZ85StfEW/pufXr12fy5Mldj8EYZf/okv2ja3aQLtk/umT/6JL9o0tjZf/mz59/S6117jOPd/WM26e9M8nS+lQ9/kkp5adJDkly4zNPrLUuS7IsSQ6ccVB985vfnAsvvDDTpk3LwQcfnMHBwRx99NEZGBgYzfkZgwYHB+0ZnbF/dMn+0TU7SJfsH12yf3TJ/tGlsb5/XYfb/z/J65J8p5Ty4iQHJ7nr+X74kksuyaJFi/LEE09kxowZ+eIXv9irOQEAAAAARk3X4fa/J7m8lHJbkpLkg7XWh57vh/v7+3PzzTf3bDgAAAAAgC50Em5rrdO3evn6LmYAAAAAAGjVuK4HAAAAAABgW8ItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRmlwy3kyaM73oEAAAAAICe2SXDLQAAAADA7ky4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGOEWwAAAACAxgi3AAAAAACNEW4BAAAAABoj3AIAAAAANEa4BQAAAABojHALAAAAANAY4RYAAAAAoDHCLQAAAABAY4RbAAAAAIDGCLcAAAAAAI0RbgEAAAAAGiPcAgAAAAA0RrgFAAAAAGiMcAsAAAAA0BjhFgAAAACgMcItAAAAAEBjhFsAAAAAgMYItwAAAAAAjRFuAQAAAAAaI9wCAAAAADRGuAUAAAAAaIxwCwAAAADQGOEWAAAAAKAxwi0AAAAAQGNKrbXrGX5rpZTHkqzpeg7GrClJHup6CMYs+0eX7B9ds4N0yf7RJftHl+wfXRor+/eyWut+zzzY18UkI2BNrXVu10MwNpVSbrZ/dMX+0SX7R9fsIF2yf3TJ/tEl+0eXxvr+eVQCAAAAAEBjhFsAAAAAgMbsquF2WdcDMKbZP7pk/+iS/aNrdpAu2T+6ZP/okv2jS2N6/3bJf5wMAAAAAGB3tqvecQsAAAAAsNvapcJtKWVhKWVNKeUnpZRzu56H3UMp5W9KKetKKbdvdez3Sin/VEq5c/i//8fw8VJK+X+Gd/DWUsrhW31m8fD5d5ZSFnfxt7DrKaW8tJTyz6WUH5VSflhKed/wcTvIqCilvKCUcmMp5QfDO/ix4eP/qZTy/eF9+rtSysTh43sMv/7J8PvTt/quDw0fX1NK+T+7+YvY1ZRSxpdSVpX/3d69xthR1nEc//7ooqFFqSASpEYwVgQNlItYBWstgiAGasSkSIQYEohBFGO8vgFviQYj+kJJkItAEIIVkKChENA0MYKVUrlqAhShgrSmUOQSoPj3xTwLy3aLL8YAAAeCSURBVHbbBmhnz+5+P8nmzDzz5OxM8svMc/7n
zDPJtW3d7Kk3SR5IckeSFUn+2tq8BqsXSWYmWZzk720s+AHzpz4k2bOd94b/nkhyuvlTn5J8uX3+uDPJZe1ziePAUSZM4TbJNOBnwJHA3sBxSfYe373SJPFL4IhRbd8Abqyq2cCNbR26/M1ufycD50A3wAfOAN4PHAScMXyRkzZjPfCVqtoLmAuc2s5tZlB9eRZYUFX7AnOAI5LMBX4InN0y+BhwUut/EvBYVb0TOLv1o+V2EfAeunPqz9u1W9qcLwH3jFg3e+rbR6pqTlUd2Na9BqsvPwWuq6p3A/vSnQvNn7a6qvpHO+/NAQ4AngauwvypJ0l2A74IHFhV7wWm0Y3nHAeOMmEKt3QngXur6v6qeg64HDhmnPdJk0BVLQXWjmo+BrioLV8ELBzRfnF1bgZmJtkV+BhwQ1WtrarHgBvYsBgsbaCqHqmq5W35v3QD9t0wg+pJy9KTbXXb9lfAAmBxax+dweFsLgYOTZLWfnlVPVtVK4F76a7d0kYlmQUcBZzX1oPZ0/jzGqytLskbgXnA+QBV9VxVPY75U/8OBe6rqn9i/tSvIWC7JEPAdOARHAduYCIVbncDHhqxvqq1SVvDLlX1CHSFNeAtrX1jOTSfes3a7R77AbdgBtWjdLeqrwBW0w247wMer6r1rcvIPL2YtbZ9HbATZlCvzk+ArwH/a+s7YfbUrwKuT3JrkpNbm9dg9eEdwBrgwnTTxZyXZAbmT/1bBFzWls2felFV/wJ+BDxIV7BdB9yK48ANTKTCbcZoq973QlPdxnJoPvWaJNke+A1welU9samuY7SZQb0mVfVCu1VuFt031HuN1a29mkFtEUk+AayuqltHNo/R1expazq4qvanuw341CTzNtHXDGpLGgL2B86pqv2Ap3jptvSxmD9tcW3+0KOBX2+u6xht5k+vWptS4xhgD+CtwAy6a/FoU34cOJEKt6uAt41YnwU8PE77osnv0XbrB+11dWvfWA7Np161JNvSFW0vraorW7MZVO/aLZp/pJtveWa7bQlenqcXs9a270A33YwZ1Ct1MHB0kgfopsBaQPcLXLOn3lTVw+11Nd38jgfhNVj9WAWsqqpb2vpiukKu+VOfjgSWV9Wjbd38qS8fBVZW1Zqqeh64EvggjgM3MJEKt8uA2e0Jc6+j+zn/NeO8T5q8rgGGn4h5IvDbEe0ntKdqzgXWtVtIlgCHJ3lT++bo8NYmbVKbl+d84J6q+vGITWZQvUiyc5KZbXk7ukHUPcAfgGNbt9EZHM7mscBNVVWtfVF74usedA+v+Es/R6GJqKq+WVWzqmp3unHdTVV1PGZPPUkyI8kbhpfprp134jVYPaiqfwMPJdmzNR0K3I35U7+O46VpEsD8qT8PAnOTTG+fiYfPgY4DRxnafJfBUFXrk3yB7iQwDbigqu4a593SJJDkMmA+8OYkq+ieivkD4IokJ9GdUD7duv8e+DjdhNdPA58DqKq1Sb5L9wUDwHeqavQDz6SxHAx8FrijzTEK8C3MoPqzK3BRe/rqNsAVVXVtkruBy5N8D7iN9vCU9npJknvpvuVeBFBVdyW5gm7AtR44tape6PlYNDl8HbOnfuwCXNV9XmQI+FVVXZdkGV6D1Y/TgEvbD5Pup8vUNpg/9SDJdOAw4JQRzX4GUS+q6pYki4HldOO324Bzgd/hOPBl0hWoJUmSJEmSJEmDYiJNlSBJkiRJkiRJU4KFW0mSJEmSJEkaMBZuJUmSJEmSJGnAWLiVJEmSJEmSpAFj4VaSJEmSJEmSBszQeO+AJEmStKUleQG4Y0TTwqp6YJx2R5IkSXrFUlXjvQ+SJEnSFpXkyaravsf/N1RV6/v6f5IkSZr8nCpBkiRJU06SXZMsTbIiyZ1JPtTaj0iyPMnfktzY2nZMcnWS25PcnGSf1n5mknOTXA9cnGRakrOSLGt9TxnHQ5QkSdIE51QJkiRJmoy2S7KiLa+sqk+O2v4ZYElVfT/JNGB6kp2BXwDzqmplkh1b
328Dt1XVwiQLgIuBOW3bAcAhVfVMkpOBdVX1viSvB/6U5PqqWrk1D1SSJEmTk4VbSZIkTUbPVNWcTWxfBlyQZFvg6qpakWQ+sHS40FpVa1vfQ4BPtbabkuyUZIe27ZqqeqYtHw7sk+TYtr4DMBuwcCtJkqRXzMKtJEmSppyqWppkHnAUcEmSs4DHgbEeAJGx3qK9PjWq32lVtWSL7qwkSZKmJOe4lSRJ0pST5O3A6qr6BXA+sD/wZ+DDSfZofYanSlgKHN/a5gP/qaonxnjbJcDn2694SfKuJDO26oFIkiRp0vIXt5IkSZqK5gNfTfI88CRwQlWtafPUXplkG2A1cBhwJnBhktuBp4ETN/Ke5wG7A8uTBFgDLNyaByFJkqTJK1Vj3Q0mSZIkSZIkSRovTpUgSZIkSZIkSQPGwq0kSZIkSZIkDRgLt5IkSZIkSZI0YCzcSpIkSZIkSdKAsXArSZIkSZIkSQPGwq0kSZIkSZIkDRgLt5IkSZIkSZI0YCzcSpIkSZIkSdKA+T9gM3B3fQU/8wAAAABJRU5ErkJggg==\n",
      "text/plain": [
       "<Figure size 1728x1728 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "xgb_model(train, features, target, plot=True)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {},
   "outputs": [],
   "source": [
    "xgb1 = xgb.XGBClassifier(\n",
    "    booster='gbtree',\n",
    "    objective='multi:softprob',\n",
    "    learning_rate=0.1,\n",
    "    n_estimators=775,        # sklearn wrapper takes n_estimators; num_round is silently ignored\n",
    "    max_depth=8,\n",
    "    random_state=25,         # preferred over the deprecated seed alias\n",
    "    n_jobs=-1,               # preferred over the deprecated nthread alias\n",
    "    eval_metric='mlogloss',\n",
    "    num_class=5\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Train and Validation data splits"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {},
   "outputs": [],
   "source": [
    "trainX, validX, trainY, validY = train_test_split(train[features], \n",
    "                                                  train[target], test_size=0.2,stratify=train[target], random_state=13)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "0.1450790688486567"
      ]
     },
     "execution_count": 18,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "model = xgb1\n",
    "cross_valid(model,train,features,target,cv=10)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Prediction on test dataset"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Validation Score: 0.14190392327990622\n"
     ]
    }
   ],
   "source": [
    "model = xgb1\n",
    "model.fit(trainX[features],trainY)\n",
    "y_pred_valid = model.predict_proba(validX[features])\n",
    "print(\"Validation Score:\",metric(validY,y_pred_valid))\n",
    "y_pred_test = model.predict_proba(test[features])\n",
    "result = pd.DataFrame(y_pred_test)\n",
    "#result.to_excel(\"xgb_boost_solution1.xlsx\",index=False)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>0</th>\n",
       "      <th>1</th>\n",
       "      <th>2</th>\n",
       "      <th>3</th>\n",
       "      <th>4</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <td>0</td>\n",
       "      <td>0.000374</td>\n",
       "      <td>0.000774</td>\n",
       "      <td>0.998009</td>\n",
       "      <td>0.000383</td>\n",
       "      <td>0.000461</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>1</td>\n",
       "      <td>0.000420</td>\n",
       "      <td>0.077199</td>\n",
       "      <td>0.003564</td>\n",
       "      <td>0.918483</td>\n",
       "      <td>0.000334</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>2</td>\n",
       "      <td>0.000319</td>\n",
       "      <td>0.000330</td>\n",
       "      <td>0.998785</td>\n",
       "      <td>0.000262</td>\n",
       "      <td>0.000304</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>3</td>\n",
       "      <td>0.001292</td>\n",
       "      <td>0.001324</td>\n",
       "      <td>0.022292</td>\n",
       "      <td>0.973273</td>\n",
       "      <td>0.001819</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>4</td>\n",
       "      <td>0.000533</td>\n",
       "      <td>0.000553</td>\n",
       "      <td>0.990465</td>\n",
       "      <td>0.007940</td>\n",
       "      <td>0.000509</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "          0         1         2         3         4\n",
       "0  0.000374  0.000774  0.998009  0.000383  0.000461\n",
       "1  0.000420  0.077199  0.003564  0.918483  0.000334\n",
       "2  0.000319  0.000330  0.998785  0.000262  0.000304\n",
       "3  0.001292  0.001324  0.022292  0.973273  0.001819\n",
       "4  0.000533  0.000553  0.990465  0.007940  0.000509"
      ]
     },
     "execution_count": 20,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "result.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {},
   "outputs": [],
   "source": [
    "## Public score\n",
    "## 0.16467"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## PART-2"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**Mapping test feature values to train feature values**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>f0</th>\n",
       "      <th>f1</th>\n",
       "      <th>f2</th>\n",
       "      <th>f3</th>\n",
       "      <th>f4</th>\n",
       "      <th>f5</th>\n",
       "      <th>f6</th>\n",
       "      <th>f7</th>\n",
       "      <th>f8</th>\n",
       "      <th>f9</th>\n",
       "      <th>f10</th>\n",
       "      <th>f11</th>\n",
       "      <th>f12</th>\n",
       "      <th>f13</th>\n",
       "      <th>f14</th>\n",
       "      <th>f15</th>\n",
       "      <th>f16</th>\n",
       "      <th>f17</th>\n",
       "      <th>f18</th>\n",
       "      <th>f19</th>\n",
       "      <th>f20</th>\n",
       "      <th>f21</th>\n",
       "      <th>f22</th>\n",
       "      <th>f23</th>\n",
       "      <th>f24</th>\n",
       "      <th>f25</th>\n",
       "      <th>f26</th>\n",
       "      <th>f27</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <td>0</td>\n",
       "      <td>-0.837812</td>\n",
       "      <td>-0.273636</td>\n",
       "      <td>1.276580</td>\n",
       "      <td>0.463262</td>\n",
       "      <td>-0.585142</td>\n",
       "      <td>-0.24287</td>\n",
       "      <td>0.349804</td>\n",
       "      <td>0.12356</td>\n",
       "      <td>0.166795</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.445195</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>-2.139737</td>\n",
       "      <td>-2.527625</td>\n",
       "      <td>0.17609</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.285133</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.197642</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>0.886135</td>\n",
       "      <td>-0.568935</td>\n",
       "      <td>1.100428</td>\n",
       "      <td>-0.244589</td>\n",
       "      <td>0.229718</td>\n",
       "      <td>-0.217109</td>\n",
       "      <td>0.087039</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>1</td>\n",
       "      <td>2.078087</td>\n",
       "      <td>-0.273636</td>\n",
       "      <td>-0.496119</td>\n",
       "      <td>0.463262</td>\n",
       "      <td>-2.438092</td>\n",
       "      <td>-0.24287</td>\n",
       "      <td>0.349804</td>\n",
       "      <td>0.12356</td>\n",
       "      <td>0.166795</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.445195</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>0.513736</td>\n",
       "      <td>0.395628</td>\n",
       "      <td>0.17609</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.285133</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>-5.059644</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.27735</td>\n",
       "      <td>0.886135</td>\n",
       "      <td>0.504299</td>\n",
       "      <td>-0.434268</td>\n",
       "      <td>-0.244040</td>\n",
       "      <td>0.229718</td>\n",
       "      <td>-0.217109</td>\n",
       "      <td>0.087039</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "         f0        f1        f2        f3        f4       f5        f6  \\\n",
       "0 -0.837812 -0.273636  1.276580  0.463262 -0.585142 -0.24287  0.349804   \n",
       "1  2.078087 -0.273636 -0.496119  0.463262 -2.438092 -0.24287  0.349804   \n",
       "\n",
       "        f7        f8       f9       f10      f11       f12       f13      f14  \\\n",
       "0  0.12356  0.166795  0.06143  0.445195  0.27735 -2.139737 -2.527625  0.17609   \n",
       "1  0.12356  0.166795  0.06143  0.445195  0.27735  0.513736  0.395628  0.17609   \n",
       "\n",
       "       f15       f16      f17       f18      f19      f20       f21       f22  \\\n",
       "0  0.06143  0.285133  0.06143  0.197642  0.06143  0.27735  0.886135 -0.568935   \n",
       "1  0.06143  0.285133  0.06143 -5.059644  0.06143  0.27735  0.886135  0.504299   \n",
       "\n",
       "        f23       f24       f25       f26       f27  \n",
       "0  1.100428 -0.244589  0.229718 -0.217109  0.087039  \n",
       "1 -0.434268 -0.244040  0.229718 -0.217109  0.087039  "
      ]
     },
     "execution_count": 22,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "test.head(2)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**Trick Part** "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {},
   "outputs": [],
   "source": [
    "tr_f0 = pd.DataFrame(train['f0'].unique()).sort_values(by=0).reset_index(drop=True)\n",
    "te_f0 = pd.DataFrame(test['f0'].unique()).sort_values(by=0).reset_index(drop=True)\n",
    "d = pd.concat([tr_f0,te_f0],axis=1)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>0</th>\n",
       "      <th>0</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <td>0</td>\n",
       "      <td>-0.825098</td>\n",
       "      <td>-0.837812</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>1</td>\n",
       "      <td>-0.379487</td>\n",
       "      <td>-0.421255</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>2</td>\n",
       "      <td>0.066123</td>\n",
       "      <td>-0.004698</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>3</td>\n",
       "      <td>0.511733</td>\n",
       "      <td>0.411859</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>4</td>\n",
       "      <td>0.957343</td>\n",
       "      <td>0.828416</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>5</td>\n",
       "      <td>1.402954</td>\n",
       "      <td>1.244973</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>6</td>\n",
       "      <td>1.848564</td>\n",
       "      <td>1.661530</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>7</td>\n",
       "      <td>2.294174</td>\n",
       "      <td>2.078087</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "          0         0\n",
       "0 -0.825098 -0.837812\n",
       "1 -0.379487 -0.421255\n",
       "2  0.066123 -0.004698\n",
       "3  0.511733  0.411859\n",
       "4  0.957343  0.828416\n",
       "5  1.402954  1.244973\n",
       "6  1.848564  1.661530\n",
       "7  2.294174  2.078087"
      ]
     },
     "execution_count": 24,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "d"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**On inspection, the sorted unique values of each feature in train and test appear to be linearly related.  \n",
    "The relation can be modelled by the equation:  \n",
    "y = ax + c**  "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**train_value = a1*test_value + c1**"
   ]
  },
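The claimed affine relation can be sanity-checked numerically on the `f0` values shown above. The two lists below are copied from the printed table, so agreement is only expected up to the printed rounding:

```python
# Sorted unique f0 values from train (tr) and test (te), copied from the table above.
tr = [-0.825098, -0.379487, 0.066123, 0.511733, 0.957343, 1.402954, 1.848564, 2.294174]
te = [-0.837812, -0.421255, -0.004698, 0.411859, 0.828416, 1.244973, 1.661530, 2.078087]

# Solve train_value = a1*test_value + c1 from the first two pairs of values.
a1 = (tr[1] - tr[0]) / (te[1] - te[0])
c1 = tr[0] - te[0] * a1

# Every mapped test value should land on its corresponding train value.
mapped = [a1 * v + c1 for v in te]
assert all(abs(m - t) < 1e-4 for m, t in zip(mapped, tr))
```

Since fitting two unknowns from two points reproduces all eight pairs, the affine model holds for this feature.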
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Find a1 and c1 for a feature by solving the linear system given by its\n",
    "# first two sorted unique values in train and test.\n",
    "def calculate_transform(i,train,test):\n",
    "    tr = pd.DataFrame(train[i].unique()).sort_values(by=0).reset_index(drop=True)\n",
    "    te = pd.DataFrame(test[i].unique()).sort_values(by=0).reset_index(drop=True)\n",
    "\n",
    "    a1 = (tr[0][1]-tr[0][0])/(te[0][1]-te[0][0])\n",
    "    c1 = (tr[0][0]) - (te[0][0])*a1\n",
    "    return [a1,c1]"
   ]
  },
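As a self-contained check of the transform logic, here is a minimal sketch on hypothetical toy data (the `train_demo`/`test_demo` frames and the exact shift `train = 3*test - 1` are made up for illustration; the function is restated so the snippet runs on its own):

```python
import pandas as pd

def calculate_transform(i, train, test):
    # Same logic as above: solve train = a1*test + c1 from the first two
    # sorted unique values of feature i.
    tr = pd.DataFrame(train[i].unique()).sort_values(by=0).reset_index(drop=True)
    te = pd.DataFrame(test[i].unique()).sort_values(by=0).reset_index(drop=True)
    a1 = (tr[0][1] - tr[0][0]) / (te[0][1] - te[0][0])
    c1 = tr[0][0] - te[0][0] * a1
    return [a1, c1]

# Toy data where the test column is a known affine shift of the train column.
train_demo = pd.DataFrame({'x': [2.0, 5.0, 8.0, 11.0]})
test_demo = pd.DataFrame({'x': [1.0, 2.0, 3.0, 4.0]})   # train = 3*test - 1

a1, c1 = calculate_transform('x', train_demo, test_demo)
assert abs(a1 - 3.0) < 1e-9 and abs(c1 + 1.0) < 1e-9
```

The recovered slope and intercept match the known shift, which is exactly what the loop below relies on for each feature.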
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {},
   "outputs": [],
   "source": [
    "for i in list(set(features)-set(['f1','f22','f23'])):\n",
    "    l = calculate_transform(i,train,test)\n",
    "    test[i] = l[0]*test[i]+l[1]"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**The f1, f22 and f23 features have unequal numbers of unique values in train and test, so their transforms are calculated separately**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {},
   "outputs": [],
   "source": [
    "# f1, f22 and f23 have different unique-value counts in train vs test, so the\n",
    "# pair of adjacent unique values used to solve for a1 and c1 is picked\n",
    "# manually for each feature (indices found by inspection).\n",
    "for col, (tr_idx, te_idx) in {'f1': (8, 5), 'f22': (47, 35), 'f23': (61, 39)}.items():\n",
    "    tr = pd.DataFrame(train[col].unique()).sort_values(by=0).reset_index(drop=True)\n",
    "    te = pd.DataFrame(test[col].unique()).sort_values(by=0).reset_index(drop=True)\n",
    "    a1 = (tr[0][tr_idx+1]-tr[0][tr_idx])/(te[0][te_idx+1]-te[0][te_idx])\n",
    "    c1 = tr[0][tr_idx] - te[0][te_idx]*a1\n",
    "    test[col] = a1*test[col] + c1"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 28,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>f0</th>\n",
       "      <th>f1</th>\n",
       "      <th>f2</th>\n",
       "      <th>f3</th>\n",
       "      <th>f4</th>\n",
       "      <th>f5</th>\n",
       "      <th>f6</th>\n",
       "      <th>f7</th>\n",
       "      <th>f8</th>\n",
       "      <th>f9</th>\n",
       "      <th>f10</th>\n",
       "      <th>f11</th>\n",
       "      <th>f12</th>\n",
       "      <th>f13</th>\n",
       "      <th>f14</th>\n",
       "      <th>f15</th>\n",
       "      <th>f16</th>\n",
       "      <th>f17</th>\n",
       "      <th>f18</th>\n",
       "      <th>f19</th>\n",
       "      <th>f20</th>\n",
       "      <th>f21</th>\n",
       "      <th>f22</th>\n",
       "      <th>f23</th>\n",
       "      <th>f24</th>\n",
       "      <th>f25</th>\n",
       "      <th>f26</th>\n",
       "      <th>f27</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <td>0</td>\n",
       "      <td>-0.825098</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>1.388246</td>\n",
       "      <td>0.4094</td>\n",
       "      <td>-0.525726</td>\n",
       "      <td>-0.276144</td>\n",
       "      <td>0.370965</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.395874</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>-1.999287</td>\n",
       "      <td>-2.118189</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>0.182574</td>\n",
       "      <td>-8.750026</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>0.925358</td>\n",
       "      <td>-0.573268</td>\n",
       "      <td>1.087230</td>\n",
       "      <td>-0.287622</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>-0.232472</td>\n",
       "      <td>-7.812837</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>1</td>\n",
       "      <td>2.294174</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>-0.461423</td>\n",
       "      <td>0.4094</td>\n",
       "      <td>-2.356907</td>\n",
       "      <td>-0.276144</td>\n",
       "      <td>0.370965</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.395874</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.548623</td>\n",
       "      <td>0.472101</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>-5.477226</td>\n",
       "      <td>-8.750026</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>0.925358</td>\n",
       "      <td>0.443257</td>\n",
       "      <td>-0.406121</td>\n",
       "      <td>-0.287096</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>-0.232472</td>\n",
       "      <td>-7.812837</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>2</td>\n",
       "      <td>-0.825098</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>1.388246</td>\n",
       "      <td>0.4094</td>\n",
       "      <td>-0.525726</td>\n",
       "      <td>-0.276144</td>\n",
       "      <td>0.370965</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.395874</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.548623</td>\n",
       "      <td>0.472101</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>0.182574</td>\n",
       "      <td>-8.750026</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>-1.080663</td>\n",
       "      <td>-0.573268</td>\n",
       "      <td>-0.406121</td>\n",
       "      <td>-0.687687</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>-0.232472</td>\n",
       "      <td>-7.812837</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>3</td>\n",
       "      <td>-0.825098</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>-0.461423</td>\n",
       "      <td>0.4094</td>\n",
       "      <td>1.305455</td>\n",
       "      <td>-0.276144</td>\n",
       "      <td>-2.695676</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.395874</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.548623</td>\n",
       "      <td>0.472101</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>-5.477226</td>\n",
       "      <td>-8.750026</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>-1.080663</td>\n",
       "      <td>-0.460446</td>\n",
       "      <td>-1.850510</td>\n",
       "      <td>-0.687687</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>-0.232472</td>\n",
       "      <td>-7.812837</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>4</td>\n",
       "      <td>-0.825098</td>\n",
       "      <td>-0.26425</td>\n",
       "      <td>-0.461423</td>\n",
       "      <td>0.4094</td>\n",
       "      <td>-0.525726</td>\n",
       "      <td>-0.276144</td>\n",
       "      <td>-2.695676</td>\n",
       "      <td>0.090167</td>\n",
       "      <td>0.107958</td>\n",
       "      <td>0.06143</td>\n",
       "      <td>0.395874</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.548623</td>\n",
       "      <td>0.472101</td>\n",
       "      <td>0.172917</td>\n",
       "      <td>0.098853</td>\n",
       "      <td>0.308879</td>\n",
       "      <td>0.040193</td>\n",
       "      <td>0.182574</td>\n",
       "      <td>-8.750026</td>\n",
       "      <td>0.233285</td>\n",
       "      <td>-1.080663</td>\n",
       "      <td>-0.573268</td>\n",
       "      <td>-0.406121</td>\n",
       "      <td>-0.687687</td>\n",
       "      <td>0.271886</td>\n",
       "      <td>-0.232472</td>\n",
       "      <td>-7.812837</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "         f0       f1        f2      f3        f4        f5        f6  \\\n",
       "0 -0.825098 -0.26425  1.388246  0.4094 -0.525726 -0.276144  0.370965   \n",
       "1  2.294174 -0.26425 -0.461423  0.4094 -2.356907 -0.276144  0.370965   \n",
       "2 -0.825098 -0.26425  1.388246  0.4094 -0.525726 -0.276144  0.370965   \n",
       "3 -0.825098 -0.26425 -0.461423  0.4094  1.305455 -0.276144 -2.695676   \n",
       "4 -0.825098 -0.26425 -0.461423  0.4094 -0.525726 -0.276144 -2.695676   \n",
       "\n",
       "         f7        f8       f9       f10       f11       f12       f13  \\\n",
       "0  0.090167  0.107958  0.06143  0.395874  0.308879 -1.999287 -2.118189   \n",
       "1  0.090167  0.107958  0.06143  0.395874  0.308879  0.548623  0.472101   \n",
       "2  0.090167  0.107958  0.06143  0.395874  0.308879  0.548623  0.472101   \n",
       "3  0.090167  0.107958  0.06143  0.395874  0.308879  0.548623  0.472101   \n",
       "4  0.090167  0.107958  0.06143  0.395874  0.308879  0.548623  0.472101   \n",
       "\n",
       "        f14       f15       f16       f17       f18       f19       f20  \\\n",
       "0  0.172917  0.098853  0.308879  0.040193  0.182574 -8.750026  0.233285   \n",
       "1  0.172917  0.098853  0.308879  0.040193 -5.477226 -8.750026  0.233285   \n",
       "2  0.172917  0.098853  0.308879  0.040193  0.182574 -8.750026  0.233285   \n",
       "3  0.172917  0.098853  0.308879  0.040193 -5.477226 -8.750026  0.233285   \n",
       "4  0.172917  0.098853  0.308879  0.040193  0.182574 -8.750026  0.233285   \n",
       "\n",
       "        f21       f22       f23       f24       f25       f26       f27  \n",
       "0  0.925358 -0.573268  1.087230 -0.287622  0.271886 -0.232472 -7.812837  \n",
       "1  0.925358  0.443257 -0.406121 -0.287096  0.271886 -0.232472 -7.812837  \n",
       "2 -1.080663 -0.573268 -0.406121 -0.687687  0.271886 -0.232472 -7.812837  \n",
       "3 -1.080663 -0.460446 -1.850510 -0.687687  0.271886 -0.232472 -7.812837  \n",
       "4 -1.080663 -0.573268 -0.406121 -0.687687  0.271886 -0.232472 -7.812837  "
      ]
     },
     "execution_count": 28,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "test.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {},
   "outputs": [],
   "source": [
    "#xgb_model(train,features,target,plot=False)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 30,
   "metadata": {},
   "outputs": [],
   "source": [
     "# sklearn-API parameter names (n_estimators/random_state/n_jobs) replace\n",
     "# the native-API num_round/seed/nthread, which XGBClassifier does not use.\n",
     "xgb1 = xgb.XGBClassifier(\n",
     "    booster='gbtree',\n",
     "    objective='multi:softprob',\n",
     "    learning_rate=0.1,\n",
     "    n_estimators=775,\n",
     "    max_depth=8,\n",
     "    random_state=25,\n",
     "    n_jobs=-1,\n",
     "    eval_metric='mlogloss',\n",
     "    num_class=5\n",
     ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "0.1450790688486567"
      ]
     },
     "execution_count": 31,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "model = xgb1\n",
    "cross_valid(model,train,features,target,cv=10)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 32,
   "metadata": {},
   "outputs": [],
   "source": [
     "trainX, validX, trainY, validY = train_test_split(\n",
     "    train[features], train[target], test_size=0.2,\n",
     "    stratify=train[target], random_state=13)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 33,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Validation Score: 0.14190392327990622\n"
     ]
    }
   ],
   "source": [
     "model = xgb1\n",
     "model.fit(trainX, trainY)\n",
     "y_pred_valid = model.predict_proba(validX)\n",
     "print(\"Validation Score:\", metric(validY, y_pred_valid))\n",
     "y_pred_test = model.predict_proba(test[features])\n",
     "result = pd.DataFrame(y_pred_test)\n",
     "#result.to_excel(\"xgb_boost_trick_part.xlsx\", index=False)\n",
     "## Public leaderboard score: 0.08903"
   ]
  },
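  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The `metric` helper used above is assumed to compute the multi-class log loss (matching the `mlogloss` eval metric set on the model). A minimal, self-contained sketch of that computation on toy data:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Hypothetical sketch of the metric: multi-class log loss on toy data.\n",
    "# Assumes the competition metric matches sklearn's log_loss over 5 classes.\n",
    "from sklearn.metrics import log_loss\n",
    "import numpy as np\n",
    "\n",
    "toy_true = [2, 3]\n",
    "toy_prob = np.array([[0.1, 0.1, 0.6, 0.1, 0.1],\n",
    "                     [0.1, 0.1, 0.1, 0.6, 0.1]])\n",
    "log_loss(toy_true, toy_prob, labels=[0, 1, 2, 3, 4])"
   ]
  },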
  {
   "cell_type": "code",
   "execution_count": 34,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>0</th>\n",
       "      <th>1</th>\n",
       "      <th>2</th>\n",
       "      <th>3</th>\n",
       "      <th>4</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <td>0</td>\n",
       "      <td>0.000316</td>\n",
       "      <td>0.000517</td>\n",
       "      <td>0.998454</td>\n",
       "      <td>0.000324</td>\n",
       "      <td>0.000390</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>1</td>\n",
       "      <td>0.000443</td>\n",
       "      <td>0.012755</td>\n",
       "      <td>0.003776</td>\n",
       "      <td>0.982674</td>\n",
       "      <td>0.000352</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>2</td>\n",
       "      <td>0.000309</td>\n",
       "      <td>0.000348</td>\n",
       "      <td>0.998826</td>\n",
       "      <td>0.000219</td>\n",
       "      <td>0.000298</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>3</td>\n",
       "      <td>0.001354</td>\n",
       "      <td>0.001308</td>\n",
       "      <td>0.034240</td>\n",
       "      <td>0.961424</td>\n",
       "      <td>0.001674</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>4</td>\n",
       "      <td>0.000493</td>\n",
       "      <td>0.000555</td>\n",
       "      <td>0.995900</td>\n",
       "      <td>0.002577</td>\n",
       "      <td>0.000475</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "          0         1         2         3         4\n",
       "0  0.000316  0.000517  0.998454  0.000324  0.000390\n",
       "1  0.000443  0.012755  0.003776  0.982674  0.000352\n",
       "2  0.000309  0.000348  0.998826  0.000219  0.000298\n",
       "3  0.001354  0.001308  0.034240  0.961424  0.001674\n",
       "4  0.000493  0.000555  0.995900  0.002577  0.000475"
      ]
     },
     "execution_count": 34,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "result.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {},
   "outputs": [],
   "source": [
     "# Final submission: achieved the highest private score and a 0.08903 public score\n",
     "#result.to_excel(\"xgb_boost_trick_part.xlsx\", index=False)"
   ]
  },
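  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "If hard class labels are needed instead of probabilities, they can be recovered from `result` with a row-wise argmax (the mapping of columns 0–4 to the five grade classes is assumed here):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Hypothetical: collapse the 5 class probabilities to a single predicted grade.\n",
    "# Assumes result's columns 0..4 correspond to the grade classes in order.\n",
    "predicted_grade = result.values.argmax(axis=1)\n",
    "predicted_grade[:5]"
   ]
  },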
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.4"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
