{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 3.1 学习目标¶\n",
    "#### （1）学习特征预处理、缺失值、异常值处理、数据分桶等特征处理方法\n",
    "#### （2）学习特征交互、编码、选择的相应方法\n",
    "#### （3）完成相应学习打卡任务，两个选做的作业不做强制性要求，供学有余力同学自己探索"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 3.2 内容介绍\n",
    "```数据预处理：\n",
    "缺失值的填充\n",
    "时间格式处理\n",
    "对象类型特征转换到数值\n",
    "异常值处理：\n",
    "基于3segama原则\n",
    "基于箱型图\n",
    "数据分箱\n",
    "固定宽度分箱\n",
    "分位数分箱\n",
    "离散数值型数据分箱\n",
    "连续数值型数据分箱\n",
    "卡方分箱（选做作业）\n",
    "特征交互\n",
    "特征和特征之间组合\n",
    "特征和特征之间衍生\n",
    "其他特征衍生的尝试（选做作业）\n",
    "特征编码\n",
    "one-hot编码\n",
    "label-encode编码\n",
    "特征选择\n",
    "1 Filter\n",
    "2 Wrapper （RFE）\n",
    "3 Embedded\n",
    "```"
   ]
  },
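  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The two basic binning methods listed above map directly onto pandas: `pd.cut` does fixed-width binning and `pd.qcut` does quantile binning. A minimal sketch on synthetic data (not the competition features):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import pandas as pd\n",
    "\n",
    "rng = np.random.RandomState(0)\n",
    "amount = pd.Series(rng.exponential(scale=10000, size=1000))\n",
    "\n",
    "# Fixed-width binning: 5 equal-width intervals over the value range\n",
    "fixed_bins = pd.cut(amount, bins=5, labels=False)\n",
    "\n",
    "# Quantile binning: each of the 5 bins holds roughly the same number of samples\n",
    "quantile_bins = pd.qcut(amount, q=5, labels=False)\n",
    "\n",
    "print(fixed_bins.value_counts().sort_index())\n",
    "print(quantile_bins.value_counts().sort_index())"
   ]
  },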
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [],
   "source": [
    "import pandas as pd\n",
    "import numpy as np\n",
    "import matplotlib.pyplot as plt\n",
    "import seaborn as sns\n",
    "import datetime\n",
    "from tqdm import tqdm\n",
    "from sklearn.preprocessing import LabelEncoder\n",
    "from sklearn.feature_selection import SelectKBest\n",
    "from sklearn.feature_selection import chi2\n",
    "from sklearn.preprocessing import MinMaxScaler\n",
    "# import xgboost as xgb\n",
    "import lightgbm as lgb\n",
    "# from catboost import CatBoostRegressor\n",
    "import warnings\n",
    "from sklearn.model_selection import StratifiedKFold, KFold, train_test_split\n",
    "from sklearn.metrics import accuracy_score, f1_score, roc_auc_score, log_loss\n",
    "warnings.filterwarnings('ignore')\n",
    "import re\n",
    "from sklearn.preprocessing import MinMaxScaler\n",
    "from sklearn import metrics\n",
    "from sklearn.metrics import roc_auc_score\n",
    "\n",
    "DATA_PATH = './data/'"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "X.shape: (800000, 124)\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>term</th>\n",
       "      <th>homeOwnership</th>\n",
       "      <th>verificationStatus</th>\n",
       "      <th>regionCode</th>\n",
       "      <th>applicationType</th>\n",
       "      <th>grade_B</th>\n",
       "      <th>grade_C</th>\n",
       "      <th>grade_D</th>\n",
       "      <th>grade_E</th>\n",
       "      <th>grade_F</th>\n",
       "      <th>...</th>\n",
       "      <th>n5</th>\n",
       "      <th>n6</th>\n",
       "      <th>n7</th>\n",
       "      <th>n8</th>\n",
       "      <th>n9</th>\n",
       "      <th>n10</th>\n",
       "      <th>n11</th>\n",
       "      <th>n12</th>\n",
       "      <th>n13</th>\n",
       "      <th>n14</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>5</td>\n",
       "      <td>2</td>\n",
       "      <td>2</td>\n",
       "      <td>32</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.407211</td>\n",
       "      <td>0.265279</td>\n",
       "      <td>0.185348</td>\n",
       "      <td>0.294771</td>\n",
       "      <td>0.133573</td>\n",
       "      <td>0.252447</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.264447</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>5</td>\n",
       "      <td>0</td>\n",
       "      <td>2</td>\n",
       "      <td>18</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.316720</td>\n",
       "      <td>0.232119</td>\n",
       "      <td>0.324358</td>\n",
       "      <td>0.321569</td>\n",
       "      <td>0.333932</td>\n",
       "      <td>0.468829</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.264447</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>5</td>\n",
       "      <td>0</td>\n",
       "      <td>2</td>\n",
       "      <td>14</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.696357</td>\n",
       "      <td>0.185348</td>\n",
       "      <td>0.107190</td>\n",
       "      <td>0.200359</td>\n",
       "      <td>0.396702</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.528894</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>3</td>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "      <td>11</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.723931</td>\n",
       "      <td>0.132639</td>\n",
       "      <td>0.324358</td>\n",
       "      <td>0.535948</td>\n",
       "      <td>0.400718</td>\n",
       "      <td>0.324574</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.132223</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>3</td>\n",
       "      <td>1</td>\n",
       "      <td>2</td>\n",
       "      <td>21</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.180983</td>\n",
       "      <td>0.298439</td>\n",
       "      <td>0.463369</td>\n",
       "      <td>0.375163</td>\n",
       "      <td>0.467504</td>\n",
       "      <td>0.432766</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.528894</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>5 rows × 124 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "   term  homeOwnership  verificationStatus  regionCode  applicationType  \\\n",
       "0     5              2                   2          32                0   \n",
       "1     5              0                   2          18                0   \n",
       "2     5              0                   2          14                0   \n",
       "3     3              1                   1          11                0   \n",
       "4     3              1                   2          21                0   \n",
       "\n",
       "   grade_B  grade_C  grade_D  grade_E  grade_F  ...        n5        n6  \\\n",
       "0        0        0        0        1        0  ...  0.407211  0.265279   \n",
       "1        0        0        1        0        0  ...  0.316720  0.232119   \n",
       "2        0        0        1        0        0  ...  0.000000  0.696357   \n",
       "3        0        0        0        0        0  ...  0.723931  0.132639   \n",
       "4        0        1        0        0        0  ...  0.180983  0.298439   \n",
       "\n",
       "         n7        n8        n9       n10  n11  n12  n13       n14  \n",
       "0  0.185348  0.294771  0.133573  0.252447  0.0  0.0  0.0  0.264447  \n",
       "1  0.324358  0.321569  0.333932  0.468829  0.0  0.0  0.0  0.264447  \n",
       "2  0.185348  0.107190  0.200359  0.396702  0.0  0.0  0.0  0.528894  \n",
       "3  0.324358  0.535948  0.400718  0.324574  0.0  0.0  0.0  0.132223  \n",
       "4  0.463369  0.375163  0.467504  0.432766  0.0  0.0  0.0  0.528894  \n",
       "\n",
       "[5 rows x 124 columns]"
      ]
     },
     "execution_count": 2,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "class Preprocessor:\n",
    "    \n",
    "    def __init__(self):\n",
    "        self.training = True\n",
    "        self.fea_to_remove = ['id','policyCode', 'issueDate', 'earliesCreditLine']\n",
    "        self.belong_to_cate = ['employmentTitle', 'verificationStatus', 'applicationType', 'homeOwnership', 'purpose', 'regionCode', 'term']\n",
    "        self.label = 'isDefault'\n",
    "        self.numerical_fea = []\n",
    "        self.category_fea = []\n",
    "        self.title_dict = {}\n",
    "        self.bounds = {}\n",
    "        \n",
    "    def get_labels(self, data):\n",
    "        labels = data[self.label].values\n",
    "        del data[self.label]\n",
    "        return data, labels\n",
    "    \n",
    "    def remove_fea(self, data):\n",
    "        columns = data.columns.tolist()\n",
    "        fea_to_keep = list(filter(lambda f:f not in self.fea_to_remove, columns))\n",
    "        data = data[fea_to_keep].copy()\n",
    "        return data\n",
    "    \n",
    "    def replace_emp_length(self, x):\n",
    "        if x == \"< 1 year\":\n",
    "            return 0\n",
    "        elif pd.isnull(x):\n",
    "            return -1\n",
    "        elif isinstance(x, int):\n",
    "            return x\n",
    "        elif isinstance(x, float):\n",
    "            return int(x)\n",
    "        else:\n",
    "            return int(re.search(\"\\d+\", x).group())\n",
    "    \n",
    "    def employment_length(self, data):\n",
    "        data['employmentLength'] = data['employmentLength'].apply(self.replace_emp_length)\n",
    "        return data\n",
    "    \n",
    "    def reduce_employment_title(self, data):\n",
    "        if self.training:\n",
    "            titles = data['employmentTitle'].value_counts().reset_index()\n",
    "            title_list = titles.head(45)['index'].tolist()\n",
    "            title_to_remove = titles['index'][~titles['index'].isin(title_list)].tolist()\n",
    "            self.title_dict = dict(zip(title_list, [str(i) for i in range(45)]))\n",
    "            self.title_dict['default'] = 45\n",
    "        \n",
    "        data['employmentTitle'] = data['employmentTitle'].apply(lambda x:self.title_dict[x] if x in self.title_dict else self.title_dict['default'])\n",
    "        return data\n",
    "    \n",
    "    def adjust_to_category(self, data):\n",
    "        for fea in self.numerical_fea:\n",
    "            if fea in self.belong_to_cate:\n",
    "                data[fea] = data[fea].astype(int)\n",
    "                self.numerical_fea.remove(fea)\n",
    "                self.category_fea.append(fea)\n",
    "        return data\n",
    "\n",
    "    def get_fea_fields(self, data):\n",
    "        self.columns = data.columns.tolist()\n",
    "        self.numerical_fea = data.select_dtypes(exclude=['object']).columns.tolist()\n",
    "        self.category_fea = list(filter(lambda col: col not in self.numerical_fea, self.columns))\n",
    "        \n",
    "    def fillna(self, data):\n",
    "        if self.training:\n",
    "        # 中位数填充数值型\n",
    "            self.numerical_median = dict(zip(self.numerical_fea, data[self.numerical_fea].median()))\n",
    "            self.category_mode = dict(zip(self.category_fea, data[self.category_fea].mode()))\n",
    "        # 中位数填充数值型\n",
    "            data[self.numerical_fea] = data[self.numerical_fea].fillna(self.numerical_median)\n",
    "        # 按照众数填充类别型特征\n",
    "            data[self.category_fea] = data[self.category_fea].fillna(self.category_mode)\n",
    "        else:\n",
    "            data[self.numerical_fea] = data[self.numerical_fea].fillna(self.numerical_median)\n",
    "            data[self.category_fea] = data[self.category_fea].fillna(self.category_mode)\n",
    "        return data\n",
    "    \n",
    "    def find_outliers_by_3segama(self, data, fea):\n",
    "        data_std = np.std(data[fea])\n",
    "        data_mean = np.mean(data[fea])\n",
    "        outliers_cut_off = data_std * 3\n",
    "        lower_rule = data_mean - outliers_cut_off\n",
    "        upper_rule = data_mean + outliers_cut_off\n",
    "        return lower_rule, upper_rule\n",
    "    \n",
    "    def remove_outliers(self, data):\n",
    "        if self.training:\n",
    "            for fea in self.numerical_fea:\n",
    "                lower_rule,upper_rule = self.find_outliers_by_3segama(data, fea)\n",
    "                self.bounds[fea] = [lower_rule,upper_rule]\n",
    "                data.loc[data[fea] < lower_rule, fea] = lower_rule\n",
    "                data.loc[data[fea] > upper_rule, fea] = upper_rule\n",
    "        else:\n",
    "            for fea in self.numerical_fea:\n",
    "                lower_rule = self.bounds[fea][0]\n",
    "                upper_rule = self.bounds[fea][1]\n",
    "                data.loc[data[fea] < lower_rule, fea] = lower_rule\n",
    "                data.loc[data[fea] > upper_rule, fea] = upper_rule\n",
    "        return data\n",
    "    \n",
    "    def get_dummies(self, data):\n",
    "        df_dummy = pd.get_dummies(data[self.category_fea], drop_first=True)\n",
    "        return df_dummy\n",
    "    \n",
    "    def min_max_scaler(self, data):\n",
    "        min_max_maps = {}\n",
    "        for fea in self.numerical_fea:\n",
    "            fitter = MinMaxScaler()\n",
    "            data[fea] = fitter.fit_transform(data[fea].values.reshape(-1,1))\n",
    "            min_max_maps[fea] = fitter \n",
    "        return data[self.numerical_fea].copy(), min_max_maps\n",
    "    \n",
    "    def _proc(self, data):\n",
    "        if self.training:\n",
    "            data, y = self.get_labels(data)\n",
    "        data = self.remove_fea(data)\n",
    "        data = self.employment_length(data)\n",
    "        data = self.reduce_employment_title(data)\n",
    "        if self.training:\n",
    "            self.get_fea_fields(data)\n",
    "        data = self.adjust_to_category(data)\n",
    "        data = self.fillna(data)\n",
    "        data = self.remove_outliers(data)\n",
    "        df_num, self.min_max_maps = self.min_max_scaler(data)\n",
    "        df_dummy = self.get_dummies(data)\n",
    "        X = pd.concat([df_dummy, df_num], axis=1)\n",
    "        self.training = False\n",
    "        return X, y\n",
    "    \n",
    "    def fit_transform(self, data):\n",
    "        X, y = self._proc(data)\n",
    "        return X, y\n",
    "    \n",
    "    def transform(self, data):\n",
    "        data = self.remove_fea(data)\n",
    "        data = self.employment_length(data)\n",
    "        data = self.reduce_employment_title(data)\n",
    "        data = self.adjust_to_category(data)\n",
    "        data = self.fillna(data)\n",
    "        data = self.remove_outliers(data)\n",
    "        df_num, self.min_max_maps = self.min_max_scaler(data)\n",
    "        df_dummy = self.get_dummies(data)\n",
    "        X = pd.concat([df_dummy, df_num], axis=1)\n",
    "        return X\n",
    "\n",
    "    \n",
    "data_train = pd.read_csv(DATA_PATH + 'train.csv')\n",
    "processor = Preprocessor()\n",
    "X, y = processor.fit_transform(data_train)\n",
    "print(\"X.shape:\", X.shape)\n",
    "X.head()"
   ]
  },
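  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The `remove_outliers` step caps each numerical feature at mean ± 3*std. The same clipping can be written compactly with `Series.clip`; a standalone sketch on synthetic data (not the competition features):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import pandas as pd\n",
    "\n",
    "def clip_3sigma(s):\n",
    "    # Cap values outside mean +/- 3*std (population std, matching np.std)\n",
    "    mean, std = s.mean(), s.std(ddof=0)\n",
    "    return s.clip(lower=mean - 3 * std, upper=mean + 3 * std)\n",
    "\n",
    "rng = np.random.RandomState(0)\n",
    "s = pd.Series(np.append(rng.normal(0, 1, 999), 50.0))  # one injected outlier\n",
    "print(s.max(), '->', clip_3sigma(s).max())"
   ]
  },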
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [],
   "source": [
    "data_test_a = pd.read_csv(DATA_PATH + 'testA.csv')\n",
    "X_t = processor.transform(data_test_a)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(200000, 124)\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>term</th>\n",
       "      <th>homeOwnership</th>\n",
       "      <th>verificationStatus</th>\n",
       "      <th>regionCode</th>\n",
       "      <th>applicationType</th>\n",
       "      <th>purpose</th>\n",
       "      <th>grade_B</th>\n",
       "      <th>grade_C</th>\n",
       "      <th>grade_D</th>\n",
       "      <th>grade_E</th>\n",
       "      <th>...</th>\n",
       "      <th>n5</th>\n",
       "      <th>n6</th>\n",
       "      <th>n7</th>\n",
       "      <th>n8</th>\n",
       "      <th>n9</th>\n",
       "      <th>n10</th>\n",
       "      <th>n11</th>\n",
       "      <th>n12</th>\n",
       "      <th>n13</th>\n",
       "      <th>n14</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>3</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>21</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.361966</td>\n",
       "      <td>0.132639</td>\n",
       "      <td>0.695054</td>\n",
       "      <td>0.482353</td>\n",
       "      <td>0.400718</td>\n",
       "      <td>0.613084</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.635206</td>\n",
       "      <td>0.396670</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>5</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>8</td>\n",
       "      <td>0</td>\n",
       "      <td>2</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.045246</td>\n",
       "      <td>0.099480</td>\n",
       "      <td>0.139011</td>\n",
       "      <td>0.214379</td>\n",
       "      <td>0.200359</td>\n",
       "      <td>0.180319</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>0.264447</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>3</td>\n",
       "      <td>1</td>\n",
       "      <td>2</td>\n",
       "      <td>20</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.045246</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>0.231685</td>\n",
       "      <td>0.133987</td>\n",
       "      <td>0.267145</td>\n",
       "      <td>0.432766</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.925564</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>5</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>11</td>\n",
       "      <td>0</td>\n",
       "      <td>4</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.316720</td>\n",
       "      <td>0.066320</td>\n",
       "      <td>0.370695</td>\n",
       "      <td>0.348366</td>\n",
       "      <td>0.133573</td>\n",
       "      <td>0.360638</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.396670</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>3</td>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "      <td>8</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0.497703</td>\n",
       "      <td>0.099480</td>\n",
       "      <td>0.741390</td>\n",
       "      <td>0.455556</td>\n",
       "      <td>0.734650</td>\n",
       "      <td>0.685212</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.132223</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>5 rows × 124 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "   term  homeOwnership  verificationStatus  regionCode  applicationType  \\\n",
       "0     3              0                   0          21                0   \n",
       "1     5              0                   0           8                0   \n",
       "2     3              1                   2          20                0   \n",
       "3     5              0                   1          11                0   \n",
       "4     3              1                   1           8                0   \n",
       "\n",
       "   purpose  grade_B  grade_C  grade_D  grade_E  ...        n5        n6  \\\n",
       "0        0        1        0        0        0  ...  0.361966  0.132639   \n",
       "1        2        0        1        0        0  ...  0.045246  0.099480   \n",
       "2        0        0        0        1        0  ...  0.045246  1.000000   \n",
       "3        4        0        1        0        0  ...  0.316720  0.066320   \n",
       "4        0        0        0        1        0  ...  0.497703  0.099480   \n",
       "\n",
       "         n7        n8        n9       n10  n11  n12       n13       n14  \n",
       "0  0.695054  0.482353  0.400718  0.613084  0.0  0.0  0.635206  0.396670  \n",
       "1  0.139011  0.214379  0.200359  0.180319  0.0  0.0  1.000000  0.264447  \n",
       "2  0.231685  0.133987  0.267145  0.432766  0.0  0.0  0.000000  0.925564  \n",
       "3  0.370695  0.348366  0.133573  0.360638  0.0  0.0  0.000000  0.396670  \n",
       "4  0.741390  0.455556  0.734650  0.685212  0.0  0.0  0.000000  0.132223  \n",
       "\n",
       "[5 rows x 124 columns]"
      ]
     },
     "execution_count": 4,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "print(X_t.shape)\n",
    "X_t.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1)"
   ]
  },
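  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "With roughly 20% positives, a plain random split can shift the class ratio between train and test. `train_test_split` accepts `stratify=` to preserve it; a sketch on synthetic labels (not the competition data):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.model_selection import train_test_split\n",
    "\n",
    "rng = np.random.RandomState(0)\n",
    "X_demo = rng.rand(1000, 3)\n",
    "y_demo = (rng.rand(1000) < 0.2).astype(int)  # ~20% positive class\n",
    "\n",
    "X_tr, X_te, y_tr, y_te = train_test_split(\n",
    "    X_demo, y_demo, test_size=0.1, stratify=y_demo, random_state=42)\n",
    "\n",
    "# Positive rate is (nearly) identical in both splits\n",
    "print(y_tr.mean(), y_te.mean())"
   ]
  },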
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(720000, 124)"
      ]
     },
     "execution_count": 6,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "X_train.shape"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "LogisticRegression()"
      ]
     },
     "execution_count": 7,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "from sklearn.linear_model import LogisticRegression\n",
    "from sklearn.metrics import classification_report\n",
    "\n",
    "lr = LogisticRegression()\n",
    "lr.fit(X_train, y_train)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.81      0.98      0.89     63957\n",
      "           1       0.54      0.07      0.13     16043\n",
      "\n",
      "    accuracy                           0.80     80000\n",
      "   macro avg       0.67      0.53      0.51     80000\n",
      "weighted avg       0.75      0.80      0.74     80000\n",
      "\n"
     ]
    }
   ],
   "source": [
    "y_pred = lr.predict(X_test)\n",
    "r = classification_report(y_test, y_pred)\n",
    "print(r)"
   ]
  },
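  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The competition metric is AUC, which needs probability scores rather than the hard labels from `predict`. A self-contained sketch of the evaluation pattern on synthetic data (stand-ins for the `lr`/`X_test`/`y_test` above):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.datasets import make_classification\n",
    "from sklearn.linear_model import LogisticRegression\n",
    "from sklearn.metrics import roc_auc_score\n",
    "from sklearn.model_selection import train_test_split\n",
    "\n",
    "X_s, y_s = make_classification(n_samples=2000, weights=[0.8], random_state=0)\n",
    "X_tr, X_te, y_tr, y_te = train_test_split(X_s, y_s, test_size=0.25, random_state=0)\n",
    "\n",
    "clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)\n",
    "\n",
    "# AUC uses the positive-class probability, not predicted labels\n",
    "proba = clf.predict_proba(X_te)[:, 1]\n",
    "print('AUC:', roc_auc_score(y_te, proba))"
   ]
  },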
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[LightGBM] [Warning] num_threads is set with n_jobs=12, nthread=28 will be ignored. Current value: num_threads=12\n",
      "[LightGBM] [Warning] Unknown parameter: silent\n",
      "[1]\ttraining's auc: 0.698932\tvalid_1's auc: 0.698528\n",
      "Training until validation scores don't improve for 20 rounds\n",
      "[2]\ttraining's auc: 0.701916\tvalid_1's auc: 0.701565\n",
      "[3]\ttraining's auc: 0.703779\tvalid_1's auc: 0.703347\n",
      "[4]\ttraining's auc: 0.706866\tvalid_1's auc: 0.706472\n",
      "[5]\ttraining's auc: 0.707753\tvalid_1's auc: 0.707291\n",
      "[6]\ttraining's auc: 0.708483\tvalid_1's auc: 0.707733\n",
      "[7]\ttraining's auc: 0.70903\tvalid_1's auc: 0.708162\n",
      "[8]\ttraining's auc: 0.709398\tvalid_1's auc: 0.708472\n",
      "[9]\ttraining's auc: 0.71007\tvalid_1's auc: 0.70921\n",
      "[10]\ttraining's auc: 0.710708\tvalid_1's auc: 0.709781\n",
      "[11]\ttraining's auc: 0.711331\tvalid_1's auc: 0.710477\n",
      "[12]\ttraining's auc: 0.711728\tvalid_1's auc: 0.710749\n",
      "[13]\ttraining's auc: 0.712393\tvalid_1's auc: 0.711397\n",
      "[14]\ttraining's auc: 0.712812\tvalid_1's auc: 0.711806\n",
      "[15]\ttraining's auc: 0.713126\tvalid_1's auc: 0.712033\n",
      "[16]\ttraining's auc: 0.713761\tvalid_1's auc: 0.712648\n",
      "[17]\ttraining's auc: 0.714186\tvalid_1's auc: 0.713095\n",
      "[18]\ttraining's auc: 0.714803\tvalid_1's auc: 0.713738\n",
      "[19]\ttraining's auc: 0.715335\tvalid_1's auc: 0.714265\n",
      "[20]\ttraining's auc: 0.715627\tvalid_1's auc: 0.714559\n",
      "[21]\ttraining's auc: 0.716105\tvalid_1's auc: 0.714959\n",
      "[22]\ttraining's auc: 0.716487\tvalid_1's auc: 0.715276\n",
      "[23]\ttraining's auc: 0.716926\tvalid_1's auc: 0.715697\n",
      "[24]\ttraining's auc: 0.717328\tvalid_1's auc: 0.71609\n",
      "[25]\ttraining's auc: 0.717734\tvalid_1's auc: 0.716464\n",
      "[26]\ttraining's auc: 0.718063\tvalid_1's auc: 0.716799\n",
      "[27]\ttraining's auc: 0.718417\tvalid_1's auc: 0.71705\n",
      "[28]\ttraining's auc: 0.718749\tvalid_1's auc: 0.717345\n",
      "[29]\ttraining's auc: 0.719088\tvalid_1's auc: 0.717595\n",
      "[30]\ttraining's auc: 0.719375\tvalid_1's auc: 0.717801\n",
      "[31]\ttraining's auc: 0.719668\tvalid_1's auc: 0.718032\n",
      "[32]\ttraining's auc: 0.719898\tvalid_1's auc: 0.718218\n",
      "[33]\ttraining's auc: 0.720223\tvalid_1's auc: 0.718395\n",
      "[34]\ttraining's auc: 0.720565\tvalid_1's auc: 0.718709\n",
      "[35]\ttraining's auc: 0.72083\tvalid_1's auc: 0.718877\n",
      "[36]\ttraining's auc: 0.721097\tvalid_1's auc: 0.719078\n",
      "[37]\ttraining's auc: 0.721368\tvalid_1's auc: 0.719307\n",
      "[38]\ttraining's auc: 0.721629\tvalid_1's auc: 0.71948\n",
      "[39]\ttraining's auc: 0.721867\tvalid_1's auc: 0.719653\n",
      "[40]\ttraining's auc: 0.722092\tvalid_1's auc: 0.719792\n",
      "[41]\ttraining's auc: 0.722306\tvalid_1's auc: 0.719922\n",
      "[42]\ttraining's auc: 0.722548\tvalid_1's auc: 0.7201\n",
      "[43]\ttraining's auc: 0.722772\tvalid_1's auc: 0.720233\n",
      "[44]\ttraining's auc: 0.722962\tvalid_1's auc: 0.720393\n",
      "[45]\ttraining's auc: 0.723219\tvalid_1's auc: 0.72059\n",
      "[46]\ttraining's auc: 0.723621\tvalid_1's auc: 0.720936\n",
      "[47]\ttraining's auc: 0.723812\tvalid_1's auc: 0.721079\n",
      "[48]\ttraining's auc: 0.724186\tvalid_1's auc: 0.721352\n",
      "[49]\ttraining's auc: 0.724375\tvalid_1's auc: 0.721495\n",
      "[50]\ttraining's auc: 0.724564\tvalid_1's auc: 0.721602\n",
      "[51]\ttraining's auc: 0.724714\tvalid_1's auc: 0.721686\n",
      "[52]\ttraining's auc: 0.724921\tvalid_1's auc: 0.72187\n",
      "[53]\ttraining's auc: 0.725107\tvalid_1's auc: 0.722051\n",
      "[54]\ttraining's auc: 0.725297\tvalid_1's auc: 0.722193\n",
      "[55]\ttraining's auc: 0.725483\tvalid_1's auc: 0.722402\n",
      "[56]\ttraining's auc: 0.725663\tvalid_1's auc: 0.722525\n",
      "[57]\ttraining's auc: 0.725813\tvalid_1's auc: 0.7226\n",
      "[58]\ttraining's auc: 0.725999\tvalid_1's auc: 0.722759\n",
      "[59]\ttraining's auc: 0.726168\tvalid_1's auc: 0.722889\n",
      "[60]\ttraining's auc: 0.726363\tvalid_1's auc: 0.723063\n",
      "[61]\ttraining's auc: 0.726519\tvalid_1's auc: 0.723179\n",
      "[62]\ttraining's auc: 0.726669\tvalid_1's auc: 0.723331\n",
      "[63]\ttraining's auc: 0.726832\tvalid_1's auc: 0.723487\n",
      "[64]\ttraining's auc: 0.72697\tvalid_1's auc: 0.723593\n",
      "[65]\ttraining's auc: 0.727127\tvalid_1's auc: 0.723689\n",
      "[66]\ttraining's auc: 0.727294\tvalid_1's auc: 0.723823\n",
      "[67]\ttraining's auc: 0.727582\tvalid_1's auc: 0.724007\n",
      "[68]\ttraining's auc: 0.727709\tvalid_1's auc: 0.724088\n",
      "[69]\ttraining's auc: 0.727873\tvalid_1's auc: 0.724238\n",
      "[70]\ttraining's auc: 0.728004\tvalid_1's auc: 0.724354\n",
      "[71]\ttraining's auc: 0.728131\tvalid_1's auc: 0.724432\n",
      "[72]\ttraining's auc: 0.728405\tvalid_1's auc: 0.724647\n",
      "[73]\ttraining's auc: 0.728533\tvalid_1's auc: 0.724686\n",
      "[74]\ttraining's auc: 0.728675\tvalid_1's auc: 0.724799\n",
      "[75]\ttraining's auc: 0.728828\tvalid_1's auc: 0.724901\n",
      "[76]\ttraining's auc: 0.728972\tvalid_1's auc: 0.72499\n",
      "[77]\ttraining's auc: 0.72909\tvalid_1's auc: 0.725088\n",
      "[78]\ttraining's auc: 0.729233\tvalid_1's auc: 0.725194\n",
      "[79]\ttraining's auc: 0.729382\tvalid_1's auc: 0.725323\n",
      "[80]\ttraining's auc: 0.729507\tvalid_1's auc: 0.725411\n",
      "[81]\ttraining's auc: 0.729671\tvalid_1's auc: 0.725511\n",
      "[82]\ttraining's auc: 0.729798\tvalid_1's auc: 0.725567\n",
      "[83]\ttraining's auc: 0.72994\tvalid_1's auc: 0.725637\n",
      "[84]\ttraining's auc: 0.730094\tvalid_1's auc: 0.725741\n",
      "[85]\ttraining's auc: 0.730197\tvalid_1's auc: 0.725835\n",
      "[86]\ttraining's auc: 0.730316\tvalid_1's auc: 0.725889\n",
      "[87]\ttraining's auc: 0.730416\tvalid_1's auc: 0.72597\n",
      "[88]\ttraining's auc: 0.730562\tvalid_1's auc: 0.726059\n",
      "[89]\ttraining's auc: 0.730697\tvalid_1's auc: 0.726142\n",
      "[90]\ttraining's auc: 0.73081\tvalid_1's auc: 0.726192\n",
      "[91]\ttraining's auc: 0.730989\tvalid_1's auc: 0.726266\n",
      "[92]\ttraining's auc: 0.731148\tvalid_1's auc: 0.726378\n",
      "[93]\ttraining's auc: 0.731269\tvalid_1's auc: 0.726396\n",
      "[94]\ttraining's auc: 0.731371\tvalid_1's auc: 0.726454\n",
      "[95]\ttraining's auc: 0.731495\tvalid_1's auc: 0.726546\n",
      "[96]\ttraining's auc: 0.731596\tvalid_1's auc: 0.726626\n",
      "[97]\ttraining's auc: 0.731708\tvalid_1's auc: 0.726673\n",
      "[98]\ttraining's auc: 0.731791\tvalid_1's auc: 0.726703\n",
      "[99]\ttraining's auc: 0.731883\tvalid_1's auc: 0.726751\n",
      "[100]\ttraining's auc: 0.731993\tvalid_1's auc: 0.72677\n",
      "[101]\ttraining's auc: 0.73209\tvalid_1's auc: 0.726822\n",
      "[102]\ttraining's auc: 0.7322\tvalid_1's auc: 0.726874\n",
      "[103]\ttraining's auc: 0.732288\tvalid_1's auc: 0.7269\n",
      "[104]\ttraining's auc: 0.732414\tvalid_1's auc: 0.726968\n",
      "[105]\ttraining's auc: 0.732486\tvalid_1's auc: 0.726995\n",
      "[106]\ttraining's auc: 0.732585\tvalid_1's auc: 0.727065\n",
      "[107]\ttraining's auc: 0.732699\tvalid_1's auc: 0.72711\n",
      "[108]\ttraining's auc: 0.732783\tvalid_1's auc: 0.727124\n",
      "[109]\ttraining's auc: 0.732871\tvalid_1's auc: 0.72719\n",
      "[110]\ttraining's auc: 0.732983\tvalid_1's auc: 0.727259\n",
      "[111]\ttraining's auc: 0.733146\tvalid_1's auc: 0.727383\n",
      "[112]\ttraining's auc: 0.73323\tvalid_1's auc: 0.727441\n",
      "[113]\ttraining's auc: 0.733317\tvalid_1's auc: 0.727471\n",
      "[114]\ttraining's auc: 0.733439\tvalid_1's auc: 0.727538\n",
      "[115]\ttraining's auc: 0.733529\tvalid_1's auc: 0.727586\n",
      "[116]\ttraining's auc: 0.733643\tvalid_1's auc: 0.727652\n",
      "[117]\ttraining's auc: 0.733737\tvalid_1's auc: 0.727683\n",
      "[118]\ttraining's auc: 0.733832\tvalid_1's auc: 0.727708\n",
      "[119]\ttraining's auc: 0.733915\tvalid_1's auc: 0.727714\n",
      "[120]\ttraining's auc: 0.733984\tvalid_1's auc: 0.727704\n",
      "[121]\ttraining's auc: 0.734156\tvalid_1's auc: 0.727881\n",
      "[122]\ttraining's auc: 0.734261\tvalid_1's auc: 0.727939\n",
      "[123]\ttraining's auc: 0.734362\tvalid_1's auc: 0.727942\n",
      "[124]\ttraining's auc: 0.73446\tvalid_1's auc: 0.727988\n",
      "[125]\ttraining's auc: 0.734555\tvalid_1's auc: 0.728049\n",
      "[126]\ttraining's auc: 0.734658\tvalid_1's auc: 0.728095\n",
      "[127]\ttraining's auc: 0.734729\tvalid_1's auc: 0.728113\n",
      "[128]\ttraining's auc: 0.734826\tvalid_1's auc: 0.728127\n",
      "[129]\ttraining's auc: 0.734916\tvalid_1's auc: 0.728193\n",
      "[130]\ttraining's auc: 0.734983\tvalid_1's auc: 0.728207\n",
      "[131]\ttraining's auc: 0.735061\tvalid_1's auc: 0.72825\n",
      "[132]\ttraining's auc: 0.735155\tvalid_1's auc: 0.728288\n",
      "[133]\ttraining's auc: 0.735219\tvalid_1's auc: 0.728308\n",
      "[134]\ttraining's auc: 0.735277\tvalid_1's auc: 0.728336\n",
      "[135]\ttraining's auc: 0.735375\tvalid_1's auc: 0.728334\n",
      "[136]\ttraining's auc: 0.735456\tvalid_1's auc: 0.728354\n",
      "[137]\ttraining's auc: 0.735535\tvalid_1's auc: 0.728399\n",
      "[138]\ttraining's auc: 0.7356\tvalid_1's auc: 0.728414\n",
      "[139]\ttraining's auc: 0.735693\tvalid_1's auc: 0.728461\n",
      "[140]\ttraining's auc: 0.735768\tvalid_1's auc: 0.728498\n",
      "[141]\ttraining's auc: 0.735845\tvalid_1's auc: 0.728505\n",
      "[142]\ttraining's auc: 0.73593\tvalid_1's auc: 0.728516\n",
      "[143]\ttraining's auc: 0.736003\tvalid_1's auc: 0.728511\n",
      "[144]\ttraining's auc: 0.736064\tvalid_1's auc: 0.728514\n",
      "[145]\ttraining's auc: 0.736179\tvalid_1's auc: 0.728508\n",
      "[146]\ttraining's auc: 0.736256\tvalid_1's auc: 0.728561\n",
      "[147]\ttraining's auc: 0.736332\tvalid_1's auc: 0.728585\n",
      "[148]\ttraining's auc: 0.736476\tvalid_1's auc: 0.728657\n",
      "[149]\ttraining's auc: 0.73655\tvalid_1's auc: 0.728678\n",
      "[150]\ttraining's auc: 0.736623\tvalid_1's auc: 0.728676\n",
      "[151]\ttraining's auc: 0.736704\tvalid_1's auc: 0.728725\n",
      "[152]\ttraining's auc: 0.736792\tvalid_1's auc: 0.728775\n",
      "[153]\ttraining's auc: 0.736856\tvalid_1's auc: 0.728814\n",
      "[154]\ttraining's auc: 0.736919\tvalid_1's auc: 0.728829\n",
      "[155]\ttraining's auc: 0.736987\tvalid_1's auc: 0.728865\n",
      "[156]\ttraining's auc: 0.737052\tvalid_1's auc: 0.728886\n",
      "[157]\ttraining's auc: 0.737148\tvalid_1's auc: 0.728911\n",
      "[158]\ttraining's auc: 0.737227\tvalid_1's auc: 0.728907\n",
      "[159]\ttraining's auc: 0.7373\tvalid_1's auc: 0.728923\n",
      "[160]\ttraining's auc: 0.737402\tvalid_1's auc: 0.728951\n",
      "[161]\ttraining's auc: 0.737493\tvalid_1's auc: 0.729022\n",
      "[162]\ttraining's auc: 0.737557\tvalid_1's auc: 0.729032\n",
      "[163]\ttraining's auc: 0.737607\tvalid_1's auc: 0.729047\n",
      "[164]\ttraining's auc: 0.737673\tvalid_1's auc: 0.729068\n",
      "[165]\ttraining's auc: 0.737746\tvalid_1's auc: 0.72908\n",
      "[166]\ttraining's auc: 0.737813\tvalid_1's auc: 0.729095\n",
      "[167]\ttraining's auc: 0.737916\tvalid_1's auc: 0.729166\n",
      "[168]\ttraining's auc: 0.73801\tvalid_1's auc: 0.729248\n",
      "[169]\ttraining's auc: 0.738089\tvalid_1's auc: 0.729274\n",
      "[170]\ttraining's auc: 0.738166\tvalid_1's auc: 0.7293\n",
      "[171]\ttraining's auc: 0.738254\tvalid_1's auc: 0.72931\n",
      "[172]\ttraining's auc: 0.738314\tvalid_1's auc: 0.729326\n",
      "[173]\ttraining's auc: 0.738369\tvalid_1's auc: 0.729359\n",
      "[174]\ttraining's auc: 0.738454\tvalid_1's auc: 0.729389\n",
      "[175]\ttraining's auc: 0.738507\tvalid_1's auc: 0.729396\n",
      "[176]\ttraining's auc: 0.738555\tvalid_1's auc: 0.729382\n",
      "[177]\ttraining's auc: 0.738611\tvalid_1's auc: 0.729371\n",
      "[178]\ttraining's auc: 0.738685\tvalid_1's auc: 0.729374\n",
      "[179]\ttraining's auc: 0.738732\tvalid_1's auc: 0.729368\n",
      "[180]\ttraining's auc: 0.738829\tvalid_1's auc: 0.72939\n",
      "[181]\ttraining's auc: 0.738906\tvalid_1's auc: 0.729411\n",
      "[182]\ttraining's auc: 0.73897\tvalid_1's auc: 0.729403\n",
      "[183]\ttraining's auc: 0.739025\tvalid_1's auc: 0.729393\n",
      "[184]\ttraining's auc: 0.739106\tvalid_1's auc: 0.729386\n",
      "[185]\ttraining's auc: 0.73917\tvalid_1's auc: 0.729388\n",
      "[186]\ttraining's auc: 0.73923\tvalid_1's auc: 0.729367\n",
      "[187]\ttraining's auc: 0.739281\tvalid_1's auc: 0.729395\n",
      "[188]\ttraining's auc: 0.739365\tvalid_1's auc: 0.729449\n",
      "[189]\ttraining's auc: 0.739407\tvalid_1's auc: 0.729459\n",
      "[190]\ttraining's auc: 0.739452\tvalid_1's auc: 0.729478\n",
      "[191]\ttraining's auc: 0.739554\tvalid_1's auc: 0.729536\n",
      "[192]\ttraining's auc: 0.739601\tvalid_1's auc: 0.729548\n",
      "[193]\ttraining's auc: 0.739669\tvalid_1's auc: 0.729529\n",
      "[194]\ttraining's auc: 0.739729\tvalid_1's auc: 0.729531\n",
      "[195]\ttraining's auc: 0.739789\tvalid_1's auc: 0.72952\n",
      "[196]\ttraining's auc: 0.739848\tvalid_1's auc: 0.729533\n",
      "[197]\ttraining's auc: 0.739923\tvalid_1's auc: 0.729541\n",
      "[198]\ttraining's auc: 0.739978\tvalid_1's auc: 0.729563\n",
      "[199]\ttraining's auc: 0.740044\tvalid_1's auc: 0.729575\n",
      "[200]\ttraining's auc: 0.740122\tvalid_1's auc: 0.729573\n",
      "[201]\ttraining's auc: 0.740192\tvalid_1's auc: 0.729589\n",
      "[202]\ttraining's auc: 0.740242\tvalid_1's auc: 0.729592\n",
      "[203]\ttraining's auc: 0.740308\tvalid_1's auc: 0.729602\n",
      "[204]\ttraining's auc: 0.740372\tvalid_1's auc: 0.729627\n",
      "[205]\ttraining's auc: 0.740427\tvalid_1's auc: 0.729623\n",
      "[206]\ttraining's auc: 0.740481\tvalid_1's auc: 0.729625\n",
      "[207]\ttraining's auc: 0.740528\tvalid_1's auc: 0.729654\n",
      "[208]\ttraining's auc: 0.740589\tvalid_1's auc: 0.729685\n",
      "[209]\ttraining's auc: 0.740666\tvalid_1's auc: 0.72971\n",
      "[210]\ttraining's auc: 0.740736\tvalid_1's auc: 0.729719\n",
      "[211]\ttraining's auc: 0.740778\tvalid_1's auc: 0.729724\n",
      "[212]\ttraining's auc: 0.740844\tvalid_1's auc: 0.729718\n",
      "[213]\ttraining's auc: 0.740906\tvalid_1's auc: 0.729729\n",
      "[214]\ttraining's auc: 0.740955\tvalid_1's auc: 0.729722\n",
      "[215]\ttraining's auc: 0.741039\tvalid_1's auc: 0.729765\n",
      "[216]\ttraining's auc: 0.741101\tvalid_1's auc: 0.729761\n",
      "[217]\ttraining's auc: 0.741158\tvalid_1's auc: 0.729785\n",
      "[218]\ttraining's auc: 0.741216\tvalid_1's auc: 0.729765\n",
      "[219]\ttraining's auc: 0.741284\tvalid_1's auc: 0.729739\n",
      "[220]\ttraining's auc: 0.741339\tvalid_1's auc: 0.729758\n",
      "[221]\ttraining's auc: 0.741421\tvalid_1's auc: 0.729784\n",
      "[222]\ttraining's auc: 0.741475\tvalid_1's auc: 0.729775\n",
      "[223]\ttraining's auc: 0.741528\tvalid_1's auc: 0.729773\n",
      "[224]\ttraining's auc: 0.741592\tvalid_1's auc: 0.729791\n",
      "[225]\ttraining's auc: 0.741652\tvalid_1's auc: 0.729788\n",
      "[226]\ttraining's auc: 0.741702\tvalid_1's auc: 0.729791\n",
      "[227]\ttraining's auc: 0.741751\tvalid_1's auc: 0.729797\n",
      "[228]\ttraining's auc: 0.741798\tvalid_1's auc: 0.729798\n",
      "[229]\ttraining's auc: 0.741841\tvalid_1's auc: 0.729799\n",
      "[230]\ttraining's auc: 0.741926\tvalid_1's auc: 0.729781\n",
      "[231]\ttraining's auc: 0.741991\tvalid_1's auc: 0.729788\n",
      "[232]\ttraining's auc: 0.742057\tvalid_1's auc: 0.729806\n",
      "[233]\ttraining's auc: 0.742125\tvalid_1's auc: 0.729821\n",
      "[234]\ttraining's auc: 0.74218\tvalid_1's auc: 0.729814\n",
      "[235]\ttraining's auc: 0.742254\tvalid_1's auc: 0.729834\n",
      "[236]\ttraining's auc: 0.742316\tvalid_1's auc: 0.729855\n",
      "[237]\ttraining's auc: 0.74238\tvalid_1's auc: 0.729866\n",
      "[238]\ttraining's auc: 0.742452\tvalid_1's auc: 0.729873\n",
      "[239]\ttraining's auc: 0.742512\tvalid_1's auc: 0.729897\n",
      "[240]\ttraining's auc: 0.742563\tvalid_1's auc: 0.729887\n",
      "[241]\ttraining's auc: 0.742641\tvalid_1's auc: 0.7299\n",
      "[242]\ttraining's auc: 0.742708\tvalid_1's auc: 0.729905\n",
      "[243]\ttraining's auc: 0.742767\tvalid_1's auc: 0.729914\n",
      "[244]\ttraining's auc: 0.742831\tvalid_1's auc: 0.729907\n",
      "[245]\ttraining's auc: 0.742888\tvalid_1's auc: 0.729917\n",
      "[246]\ttraining's auc: 0.742966\tvalid_1's auc: 0.729906\n",
      "[247]\ttraining's auc: 0.743032\tvalid_1's auc: 0.729906\n",
      "[248]\ttraining's auc: 0.743068\tvalid_1's auc: 0.729907\n",
      "[249]\ttraining's auc: 0.743127\tvalid_1's auc: 0.729921\n",
      "[250]\ttraining's auc: 0.743189\tvalid_1's auc: 0.729947\n",
      "[251]\ttraining's auc: 0.743236\tvalid_1's auc: 0.729955\n",
      "[252]\ttraining's auc: 0.74329\tvalid_1's auc: 0.729975\n",
      "[253]\ttraining's auc: 0.743338\tvalid_1's auc: 0.729978\n",
      "[254]\ttraining's auc: 0.743405\tvalid_1's auc: 0.729972\n",
      "[255]\ttraining's auc: 0.743485\tvalid_1's auc: 0.729981\n",
      "[256]\ttraining's auc: 0.74353\tvalid_1's auc: 0.729977\n",
      "[257]\ttraining's auc: 0.743595\tvalid_1's auc: 0.729965\n",
      "[258]\ttraining's auc: 0.743642\tvalid_1's auc: 0.729951\n",
      "[259]\ttraining's auc: 0.743725\tvalid_1's auc: 0.729964\n",
      "[260]\ttraining's auc: 0.743789\tvalid_1's auc: 0.729998\n",
      "[261]\ttraining's auc: 0.743822\tvalid_1's auc: 0.729995\n",
      "[262]\ttraining's auc: 0.743891\tvalid_1's auc: 0.730008\n",
      "[263]\ttraining's auc: 0.743937\tvalid_1's auc: 0.73\n",
      "[264]\ttraining's auc: 0.743968\tvalid_1's auc: 0.729978\n",
      "[265]\ttraining's auc: 0.744019\tvalid_1's auc: 0.729995\n",
      "[266]\ttraining's auc: 0.744086\tvalid_1's auc: 0.730007\n",
      "[267]\ttraining's auc: 0.744136\tvalid_1's auc: 0.730016\n",
      "[268]\ttraining's auc: 0.744191\tvalid_1's auc: 0.730011\n",
      "[269]\ttraining's auc: 0.744267\tvalid_1's auc: 0.730045\n",
      "[270]\ttraining's auc: 0.744325\tvalid_1's auc: 0.730033\n",
      "[271]\ttraining's auc: 0.74438\tvalid_1's auc: 0.730039\n",
      "[272]\ttraining's auc: 0.744432\tvalid_1's auc: 0.730035\n",
      "[273]\ttraining's auc: 0.744491\tvalid_1's auc: 0.730018\n",
      "[274]\ttraining's auc: 0.744554\tvalid_1's auc: 0.730015\n",
      "[275]\ttraining's auc: 0.744625\tvalid_1's auc: 0.73002\n",
      "[276]\ttraining's auc: 0.744691\tvalid_1's auc: 0.730023\n",
      "[277]\ttraining's auc: 0.744747\tvalid_1's auc: 0.730057\n",
      "[278]\ttraining's auc: 0.744808\tvalid_1's auc: 0.73006\n",
      "[279]\ttraining's auc: 0.744866\tvalid_1's auc: 0.730073\n",
      "[280]\ttraining's auc: 0.744926\tvalid_1's auc: 0.730072\n",
      "[281]\ttraining's auc: 0.744973\tvalid_1's auc: 0.73006\n",
      "[282]\ttraining's auc: 0.745024\tvalid_1's auc: 0.73005\n",
      "[283]\ttraining's auc: 0.745078\tvalid_1's auc: 0.730028\n",
      "[284]\ttraining's auc: 0.745143\tvalid_1's auc: 0.730046\n",
      "[285]\ttraining's auc: 0.7452\tvalid_1's auc: 0.730067\n",
      "[286]\ttraining's auc: 0.745268\tvalid_1's auc: 0.730098\n",
      "[287]\ttraining's auc: 0.745318\tvalid_1's auc: 0.730095\n",
      "[288]\ttraining's auc: 0.745352\tvalid_1's auc: 0.730115\n",
      "[289]\ttraining's auc: 0.745403\tvalid_1's auc: 0.730133\n",
      "[290]\ttraining's auc: 0.745458\tvalid_1's auc: 0.730153\n",
      "[291]\ttraining's auc: 0.745523\tvalid_1's auc: 0.730164\n",
      "[292]\ttraining's auc: 0.745561\tvalid_1's auc: 0.730175\n",
      "[293]\ttraining's auc: 0.745597\tvalid_1's auc: 0.730186\n",
      "[294]\ttraining's auc: 0.745639\tvalid_1's auc: 0.730198\n",
      "[295]\ttraining's auc: 0.745681\tvalid_1's auc: 0.730191\n",
      "[296]\ttraining's auc: 0.745749\tvalid_1's auc: 0.730207\n",
      "[297]\ttraining's auc: 0.745815\tvalid_1's auc: 0.730233\n",
      "[298]\ttraining's auc: 0.745867\tvalid_1's auc: 0.730241\n",
      "[299]\ttraining's auc: 0.745913\tvalid_1's auc: 0.730237\n",
      "[300]\ttraining's auc: 0.745974\tvalid_1's auc: 0.73022\n",
      "[301]\ttraining's auc: 0.74603\tvalid_1's auc: 0.730255\n",
      "[302]\ttraining's auc: 0.746085\tvalid_1's auc: 0.730255\n",
      "[303]\ttraining's auc: 0.746163\tvalid_1's auc: 0.730273\n",
      "[304]\ttraining's auc: 0.746243\tvalid_1's auc: 0.730319\n",
      "[305]\ttraining's auc: 0.746287\tvalid_1's auc: 0.730321\n",
      "[306]\ttraining's auc: 0.746351\tvalid_1's auc: 0.73031\n",
      "[307]\ttraining's auc: 0.746397\tvalid_1's auc: 0.730301\n",
      "[308]\ttraining's auc: 0.746445\tvalid_1's auc: 0.730318\n",
      "[309]\ttraining's auc: 0.746505\tvalid_1's auc: 0.730337\n",
      "[310]\ttraining's auc: 0.746555\tvalid_1's auc: 0.730344\n",
      "[311]\ttraining's auc: 0.74658\tvalid_1's auc: 0.730345\n",
      "[312]\ttraining's auc: 0.746642\tvalid_1's auc: 0.730366\n",
      "[313]\ttraining's auc: 0.746667\tvalid_1's auc: 0.730374\n",
      "[314]\ttraining's auc: 0.746715\tvalid_1's auc: 0.730366\n",
      "[315]\ttraining's auc: 0.746745\tvalid_1's auc: 0.730377\n",
      "[316]\ttraining's auc: 0.746803\tvalid_1's auc: 0.730434\n",
      "[317]\ttraining's auc: 0.746878\tvalid_1's auc: 0.730446\n",
      "[318]\ttraining's auc: 0.746921\tvalid_1's auc: 0.730448\n",
      "[319]\ttraining's auc: 0.746971\tvalid_1's auc: 0.730444\n",
      "[320]\ttraining's auc: 0.747005\tvalid_1's auc: 0.730437\n",
      "[321]\ttraining's auc: 0.747064\tvalid_1's auc: 0.730417\n",
      "[322]\ttraining's auc: 0.747121\tvalid_1's auc: 0.730461\n",
      "[323]\ttraining's auc: 0.747189\tvalid_1's auc: 0.730492\n",
      "[324]\ttraining's auc: 0.747236\tvalid_1's auc: 0.730505\n",
      "[325]\ttraining's auc: 0.747312\tvalid_1's auc: 0.730514\n",
      "[326]\ttraining's auc: 0.747371\tvalid_1's auc: 0.73051\n",
      "[327]\ttraining's auc: 0.747437\tvalid_1's auc: 0.730552\n",
      "[328]\ttraining's auc: 0.747496\tvalid_1's auc: 0.730567\n",
      "[329]\ttraining's auc: 0.747565\tvalid_1's auc: 0.730569\n",
      "[330]\ttraining's auc: 0.74763\tvalid_1's auc: 0.730572\n",
      "[331]\ttraining's auc: 0.747682\tvalid_1's auc: 0.730571\n",
      "[332]\ttraining's auc: 0.747748\tvalid_1's auc: 0.730569\n",
      "[333]\ttraining's auc: 0.747806\tvalid_1's auc: 0.730573\n",
      "[334]\ttraining's auc: 0.747854\tvalid_1's auc: 0.730589\n",
      "[335]\ttraining's auc: 0.747904\tvalid_1's auc: 0.730598\n",
      "[336]\ttraining's auc: 0.74795\tvalid_1's auc: 0.730604\n",
      "[337]\ttraining's auc: 0.747979\tvalid_1's auc: 0.730594\n",
      "[338]\ttraining's auc: 0.748007\tvalid_1's auc: 0.730592\n",
      "[339]\ttraining's auc: 0.748048\tvalid_1's auc: 0.730597\n",
      "[340]\ttraining's auc: 0.748094\tvalid_1's auc: 0.730614\n",
      "[341]\ttraining's auc: 0.748151\tvalid_1's auc: 0.730604\n",
      "[342]\ttraining's auc: 0.748203\tvalid_1's auc: 0.7306\n",
      "[343]\ttraining's auc: 0.748257\tvalid_1's auc: 0.730587\n",
      "[344]\ttraining's auc: 0.748306\tvalid_1's auc: 0.730575\n",
      "[345]\ttraining's auc: 0.748381\tvalid_1's auc: 0.730599\n",
      "[346]\ttraining's auc: 0.748424\tvalid_1's auc: 0.730597\n",
      "[347]\ttraining's auc: 0.748475\tvalid_1's auc: 0.730601\n",
      "[348]\ttraining's auc: 0.748548\tvalid_1's auc: 0.730614\n",
      "[349]\ttraining's auc: 0.748603\tvalid_1's auc: 0.730638\n",
      "[350]\ttraining's auc: 0.748653\tvalid_1's auc: 0.730621\n",
      "[351]\ttraining's auc: 0.748719\tvalid_1's auc: 0.730644\n",
      "[352]\ttraining's auc: 0.748772\tvalid_1's auc: 0.730639\n",
      "[353]\ttraining's auc: 0.748831\tvalid_1's auc: 0.730649\n",
      "[354]\ttraining's auc: 0.748885\tvalid_1's auc: 0.73067\n",
      "[355]\ttraining's auc: 0.748957\tvalid_1's auc: 0.730684\n",
      "[356]\ttraining's auc: 0.748981\tvalid_1's auc: 0.730692\n",
      "[357]\ttraining's auc: 0.749022\tvalid_1's auc: 0.730697\n",
      "[358]\ttraining's auc: 0.749091\tvalid_1's auc: 0.730722\n",
      "[359]\ttraining's auc: 0.749129\tvalid_1's auc: 0.730702\n",
      "[360]\ttraining's auc: 0.749172\tvalid_1's auc: 0.730682\n",
      "[361]\ttraining's auc: 0.749223\tvalid_1's auc: 0.730671\n",
      "[362]\ttraining's auc: 0.749322\tvalid_1's auc: 0.73072\n",
      "[363]\ttraining's auc: 0.749385\tvalid_1's auc: 0.730715\n",
      "[364]\ttraining's auc: 0.749418\tvalid_1's auc: 0.730729\n",
      "[365]\ttraining's auc: 0.749469\tvalid_1's auc: 0.730731\n",
      "[366]\ttraining's auc: 0.749525\tvalid_1's auc: 0.730771\n",
      "[367]\ttraining's auc: 0.749573\tvalid_1's auc: 0.730776\n",
      "[368]\ttraining's auc: 0.749618\tvalid_1's auc: 0.730804\n",
      "[369]\ttraining's auc: 0.749663\tvalid_1's auc: 0.730806\n",
      "[370]\ttraining's auc: 0.749719\tvalid_1's auc: 0.730802\n",
      "[371]\ttraining's auc: 0.749772\tvalid_1's auc: 0.730797\n",
      "[372]\ttraining's auc: 0.749806\tvalid_1's auc: 0.7308\n",
      "[373]\ttraining's auc: 0.749903\tvalid_1's auc: 0.730861\n",
      "[374]\ttraining's auc: 0.749941\tvalid_1's auc: 0.730842\n",
      "[375]\ttraining's auc: 0.750022\tvalid_1's auc: 0.730894\n",
      "[376]\ttraining's auc: 0.750073\tvalid_1's auc: 0.730895\n",
      "[377]\ttraining's auc: 0.750123\tvalid_1's auc: 0.730879\n",
      "[378]\ttraining's auc: 0.750177\tvalid_1's auc: 0.730894\n",
      "[379]\ttraining's auc: 0.750229\tvalid_1's auc: 0.730892\n",
      "[380]\ttraining's auc: 0.750271\tvalid_1's auc: 0.730908\n",
      "[381]\ttraining's auc: 0.750312\tvalid_1's auc: 0.730925\n",
      "[382]\ttraining's auc: 0.750369\tvalid_1's auc: 0.730956\n",
      "[383]\ttraining's auc: 0.750386\tvalid_1's auc: 0.730954\n",
      "[384]\ttraining's auc: 0.750437\tvalid_1's auc: 0.730967\n",
      "[385]\ttraining's auc: 0.750497\tvalid_1's auc: 0.730987\n",
      "[386]\ttraining's auc: 0.75056\tvalid_1's auc: 0.731009\n",
      "[387]\ttraining's auc: 0.750611\tvalid_1's auc: 0.730983\n",
      "[388]\ttraining's auc: 0.750671\tvalid_1's auc: 0.730981\n",
      "[389]\ttraining's auc: 0.750722\tvalid_1's auc: 0.73097\n",
      "[390]\ttraining's auc: 0.750756\tvalid_1's auc: 0.730981\n",
      "[391]\ttraining's auc: 0.750807\tvalid_1's auc: 0.730992\n",
      "[392]\ttraining's auc: 0.750865\tvalid_1's auc: 0.730992\n",
      "[393]\ttraining's auc: 0.750902\tvalid_1's auc: 0.731006\n",
      "[394]\ttraining's auc: 0.750945\tvalid_1's auc: 0.730999\n",
      "[395]\ttraining's auc: 0.751015\tvalid_1's auc: 0.730998\n",
      "[396]\ttraining's auc: 0.75106\tvalid_1's auc: 0.731007\n",
      "[397]\ttraining's auc: 0.751157\tvalid_1's auc: 0.731049\n",
      "[398]\ttraining's auc: 0.7512\tvalid_1's auc: 0.73104\n",
      "[399]\ttraining's auc: 0.7513\tvalid_1's auc: 0.731075\n",
      "[400]\ttraining's auc: 0.751347\tvalid_1's auc: 0.731089\n",
      "[401]\ttraining's auc: 0.751424\tvalid_1's auc: 0.731109\n",
      "[402]\ttraining's auc: 0.751475\tvalid_1's auc: 0.731079\n",
      "[403]\ttraining's auc: 0.751535\tvalid_1's auc: 0.731092\n",
      "[404]\ttraining's auc: 0.751582\tvalid_1's auc: 0.731074\n",
      "[405]\ttraining's auc: 0.751627\tvalid_1's auc: 0.731092\n",
      "[406]\ttraining's auc: 0.751671\tvalid_1's auc: 0.731113\n",
      "[407]\ttraining's auc: 0.751724\tvalid_1's auc: 0.73113\n",
      "[408]\ttraining's auc: 0.75177\tvalid_1's auc: 0.731117\n",
      "[409]\ttraining's auc: 0.751803\tvalid_1's auc: 0.731109\n",
      "[410]\ttraining's auc: 0.75185\tvalid_1's auc: 0.731109\n",
      "[411]\ttraining's auc: 0.751875\tvalid_1's auc: 0.731107\n",
      "[412]\ttraining's auc: 0.751915\tvalid_1's auc: 0.73112\n",
      "[413]\ttraining's auc: 0.751972\tvalid_1's auc: 0.731116\n",
      "[414]\ttraining's auc: 0.752021\tvalid_1's auc: 0.73114\n",
      "[415]\ttraining's auc: 0.75205\tvalid_1's auc: 0.731142\n",
      "[416]\ttraining's auc: 0.752088\tvalid_1's auc: 0.731129\n",
      "[417]\ttraining's auc: 0.752127\tvalid_1's auc: 0.73113\n",
      "[418]\ttraining's auc: 0.75216\tvalid_1's auc: 0.731135\n",
      "[419]\ttraining's auc: 0.752222\tvalid_1's auc: 0.73115\n",
      "[420]\ttraining's auc: 0.75228\tvalid_1's auc: 0.731161\n",
      "[421]\ttraining's auc: 0.752328\tvalid_1's auc: 0.731164\n",
      "[422]\ttraining's auc: 0.752377\tvalid_1's auc: 0.731172\n",
      "[423]\ttraining's auc: 0.752423\tvalid_1's auc: 0.731206\n",
      "[424]\ttraining's auc: 0.752468\tvalid_1's auc: 0.73122\n",
      "[425]\ttraining's auc: 0.752507\tvalid_1's auc: 0.731217\n",
      "[426]\ttraining's auc: 0.752557\tvalid_1's auc: 0.731214\n",
      "[427]\ttraining's auc: 0.752603\tvalid_1's auc: 0.731188\n",
      "[428]\ttraining's auc: 0.752639\tvalid_1's auc: 0.731188\n",
      "[429]\ttraining's auc: 0.752678\tvalid_1's auc: 0.731166\n",
      "[430]\ttraining's auc: 0.752717\tvalid_1's auc: 0.73117\n",
      "[431]\ttraining's auc: 0.752764\tvalid_1's auc: 0.731178\n",
      "[432]\ttraining's auc: 0.752817\tvalid_1's auc: 0.731166\n",
      "[433]\ttraining's auc: 0.752881\tvalid_1's auc: 0.73118\n",
      "[434]\ttraining's auc: 0.752933\tvalid_1's auc: 0.73118\n",
      "[435]\ttraining's auc: 0.752971\tvalid_1's auc: 0.731179\n",
      "[436]\ttraining's auc: 0.753011\tvalid_1's auc: 0.731166\n",
      "[437]\ttraining's auc: 0.753061\tvalid_1's auc: 0.731168\n",
      "[438]\ttraining's auc: 0.75308\tvalid_1's auc: 0.731168\n",
      "[439]\ttraining's auc: 0.753116\tvalid_1's auc: 0.731177\n",
      "[440]\ttraining's auc: 0.753148\tvalid_1's auc: 0.731181\n",
      "[441]\ttraining's auc: 0.753205\tvalid_1's auc: 0.731193\n",
      "[442]\ttraining's auc: 0.753245\tvalid_1's auc: 0.731187\n",
      "[443]\ttraining's auc: 0.753305\tvalid_1's auc: 0.731165\n",
      "[444]\ttraining's auc: 0.753351\tvalid_1's auc: 0.731142\n",
      "Early stopping, best iteration is:\n",
      "[424]\ttraining's auc: 0.752468\tvalid_1's auc: 0.73122\n"
     ]
    }
   ],
   "source": [
    "import lightgbm as lgb\n",
    "\n",
    "\n",
    "train_matrix = lgb.Dataset(X_train, label=y_train)\n",
    "valid_matrix = lgb.Dataset(X_test, label=y_test)\n",
    "\n",
    "params = {\n",
    "    'boosting_type': 'gbdt',\n",
    "    'objective': 'binary',\n",
    "    'metric': 'auc',\n",
    "    'min_child_weight': 5,\n",
    "    'num_leaves': 2 ** 5,\n",
    "    'lambda_l2': 10,\n",
    "    'feature_fraction': 0.8,\n",
    "    'bagging_fraction': 0.8,\n",
    "    'bagging_freq': 4,\n",
    "    'learning_rate': 0.1,\n",
    "    'seed': 2020,\n",
    "    'n_jobs': 12,  # 'nthread' is an alias of n_jobs; 'silent' is not a lgb.train param\n",
    "    'verbose': -1,\n",
    "}\n",
    "\n",
    "model = lgb.train(params, train_matrix, num_boost_round=500, valid_sets=[train_matrix, valid_matrix],early_stopping_rounds=20)\n",
    "\n",
    "# val_pred = model.predict(val_x, num_iteration=model.best_iteration)\n",
    "# test_pred = model.predict(test_x, num_iteration=model.best_iteration)"
   ]
  },
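  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick follow-up sketch (an illustrative addition, not part of the original run): after training, the Booster's `feature_importance` shows which engineered features the model actually used, a useful sanity check after feature engineering. The `importance_type='split'` choice here is an assumption; `'gain'` is a common alternative."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Rank features by how often they are used for splits (sketch; 'gain' also works).\n",
    "feat_imp = pd.DataFrame({\n",
    "    'feature': model.feature_name(),\n",
    "    'importance': model.feature_importance(importance_type='split')\n",
    "}).sort_values('importance', ascending=False)\n",
    "print(feat_imp.head(10))"
   ]
  },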
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [],
   "source": [
    "def predict(probs, thresh=0.1):\n",
    "    return 1 * (probs >= thresh)\n",
    "\n",
    "y_pred = model.predict(X_test, num_iteration=model.best_iteration)\n",
    "threshs = [0.1*i for i in range(11)]\n",
    "preds = []\n",
    "for thresh in threshs:\n",
    "    pred = predict(y_pred, thresh=thresh)\n",
    "    preds.append(pred)\n",
    "\n",
    "    \n",
    "preds = np.array(preds)\n",
    "\n",
    "\n",
    "# r2 = classification_report(y_test, y_pred)\n",
    "# print(r2)"
   ]
  },
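  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A sketch building on the threshold sweep above (an illustrative addition, not from the original run): score each candidate cutoff with `f1_score` (already imported from `sklearn.metrics` in the setup cell) and keep the best one. F1 is an assumption here; any metric suited to the class imbalance could be substituted."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Score each thresholded prediction against y_test and pick the best cutoff.\n",
    "f1s = [f1_score(y_test, p) for p in preds]\n",
    "best_idx = int(np.argmax(f1s))\n",
    "print('best thresh=%.1f, f1=%.4f' % (threshs[best_idx], f1s[best_idx]))"
   ]
  },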
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [],
   "source": [
    "# y_pred = model.predict(X_test, num_iteration=model.best_iteration)\n",
    "\n",
    "fpr, tpr, threshhold = metrics.roc_curve(y_test, y_pred)\n",
    "roc_auc = metrics.auc(fpr, tpr)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfkAAAHwCAYAAACluRYsAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjMuNCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8QVMy6AAAACXBIWXMAAAsTAAALEwEAmpwYAABWeUlEQVR4nO3deZzN5fvH8ddNJEvJ0kZFUUII0aKsac9SEe0q2vdv+74XfaPoJxVtSmkT2bL3LcqSEIpKjCVC9sHM3L8/rqMmjXHGzJn7LO/n4zGPmXPmzJm30zTX3Pfnvq/bee8RERGR5FMkdAARERGJDRV5ERGRJKUiLyIikqRU5EVERJKUiryIiEiSUpEXERFJUiryIiIiSUpFXiSFOecWOee2OOc2OudWOOfecM6Vzvb5k5xz45xzG5xz65xzQ51zNXd6jn2dcz2dc4sjz7MwcrtC4f+LRCQ7FXkROdd7XxqoBxwH3AvgnDsRGA0MAQ4BqgLfA185546IPKY4MBaoBZwB7AucBKwGGhXqv0JE/sWp451I6nLOLQKu9t6Pidx+DqjlvT/bOfclMNt7f/1OXzMCWOW9v8w5dzXwJHCk935jIccXkd3QSF5EAHDOVQbOBBY650piI/LBOTz0A+C0yMetgJEq8CLxSUVeRD51zm0AlgArgYeBctjvh+U5PH45sON6e/ldPEZE4oCKvIi09d6XAZoBNbACvhbIAg7O4fEHA39EPl69i8eISBxQkRcRALz3E4E3gB7e+03AZODCHB7aAVtsBzAGON05V6pQQopInqjIi0h2PYHTnHP1gHuAy51zNzvnyjjn9nfOPQGcCDwaefzb2DT/R865Gs65Is658s65+5xzZ4X4B4jI31TkReQv3vtVwFvAg977/wGnA+2x6+6/YVvsmnjvF0QevxVbfDcf+AJYD3yLTfl/U+j/ABH5B22hExERSVIayYuIiCSpmBV551x/59xK59ycXXzeOedejLTAnOWcqx+rLCIiIqkoliP5N7A2l7tyJlA98tYV+L8YZhEREUk5MSvy3vtJwJpcHtIGeMubKUBZ55z224qIiBSQkNfkK2Fbb3ZIi9wnIiIiBWCvgN/b5XBfjkv9nXNdsSl9SpUq1aBGjRqxzCUiIhKVjAzYtg0yM+2995CebrfT02FjHk51KFoUnIPixSErC8rstYVKm34iyxVlVtbWP7z3FfOaL2SRTwMOzXa7MrAspwd67/sB/QAaNmzop02bFvt0IiKS8rZvh2XLYMECGD8eVqywt++/h6VLc//ahg3h6KOt8NeubUW/enUoVcrejjwSypWD0qWhWLGdvnjrVjjqKChTEcaNwx1zzG97kj9kkf8MuNE5NwhoDKzz3uugCxERKXTp6TBlCsyeDYsXw2efwZIlsGXLvx9bujTUrw8nnABHHAG1akHZsnDwwbDffnDIIVCmTD4D7b03vPEGVK5sfxnsoZgVeefce9iBFxWcc2nYyVbFALz3fYHhwFnAQmAzcGWssoiIiAAsXw6//QZffWUj9PHj4bvv/v24o46CY4+10fihh0KdOnbfkUfalHrMfPMNzJ0LV14JzZvn++liVuS9951283kP3BCr7y8iIvLnn/DllzBgAHzyyb8/X7363wW8bVsr7NWqQcmShZ0U+N//4MwzbUqgUycoUSLfTxlyur7AbN++nbS0NNLT00NHkUJQokQJKleuTLF/XcQSkVTmPfzyC4wbBxMn2vT7zz/b54oVgxo14Pjj4fTToV49u120aNDIfxs/Hs45x6YNxo4tkAIPSVLk09LSKFOmDFWqVMHFdB5FQvPes3r1atLS0qhatWroOCIS2J9/wgcfwPvv2yz3ihV/f65mTbj9divs550XaHQejdGjoU0bu8A/diwcdFCBPXVSFPn09HQV+BThnKN8+fKsWrUqdBQRKWTbtsH8+TBihE3Br1wJU6f+/flDD4W77oJWreDUU23tWkKYPduW
4X/xBVTM8y65XCVFkQdU4FOI/luLpIb162HOHKt9I0bATz/B2rV/f/7ww+HWW+Hss22NWtxMvUdrwwZbhn/HHXDDDQU2RZ+dTqErAM2aNWPUqFH/uK9nz55cf/31uX7Nrvb7r1q1imLFivHKK6/84/7SpUv/4/Ybb7zBjTfe+Nftt956i9q1a1OrVi1q1qxJjx498vpP+ZeRI0dy9NFHU61aNZ555pkcH9O9e3fq1atHvXr1qF27NkWLFmXNmjWkp6fTqFEj6tatS61atXj44Yf/+prBgwdTq1YtihQp8o/X4YsvvqBBgwYce+yxNGjQgHHjxuX73yAiiWHTJrue/txzdr18//3h5JPhkUds0fmxx8Irr8APP1izmUWL4IUXbOSecAV+8GCbnp81y27HoMADdo0zkd4aNGjgdzZ37tx/3VeY+vbt66+44op/3Ne4cWM/adKkXX5N06ZN/dSpU3P8XJ8+fXyTJk1806ZN/3F/qVKl/nF7wIAB/oYbbvDeez98+HB/3HHH+aVLl3rvvd+yZYvv169fXv8p/5CRkeGPOOII//PPP/utW7f6OnXq+B9++CHXr/nss8988+bNvffeZ2Vl+Q0bNnjvvd+2bZtv1KiRnzx5svfe/pvNnz//X6/DjBkz/vo3zJ492x9yyCE5fp/Q/81FJH8yM71fscL7zz/3/t57va9Xz/u99vLels95X6OG95dd5v1HH3n/00+h0xawd97xvkgR75s08X79+qi+BJjm96BmJs10fUgXXHABDzzwAFu3bmXvvfdm0aJFLFu2jCZNmnDdddcxdepUtmzZwgUXXMCjjz662+d77733eP755+ncuTNLly6lUqXdt/R/+umn6dGjB4cccghgK9CvueaafP27vv32W6pVq8YRRxwBwEUXXcSQIUOoWbNmrtk7dbLdk865v2Yftm/fzvbt2/+aaj/mmGNy/Prjjjvur49r1apFenr6X6+riCS2336D7t1h2jSYN8+m43eoWtUWl196qXWHO+qocDljasAAuOoqaNYMhg611ncxlHRF/tZbYebMgn3OevWgZ89df758+fI0atSIkSNH0qZNGwYNGkTHjh1xzvHkk09Srlw5MjMzadmyJbNmzaJOnTq7fK4lS5awYsUKGjVqRIcOHXj//fe5/fbbd5txzpw5NGjQYLePGzhwIN27d//X/dWqVePDDz/8x31Lly7l0EP/7jxcuXJlvvnmm10+9+bNmxk5ciS9e/f+677MzEwaNGjAwoULueGGG2jcuPFuM+7w0Ucfcdxxx6nAiySwX3+FQYOsg9yUKXafc1bQW7Wy5jInnADly4fNWShGjYIuXeC00+DTTwtluX/SFflQOnXqxKBBg/4q8v379wfggw8+oF+/fmRkZLB8+XLmzp2ba5EfNGgQHTp0AGzkfNVVV+Va5PO6CO3iiy/m4osvjuqxNkMU/fcbOnQoJ598MuXKlfvrvqJFizJz5kz+/PNP2rVrx5w5c6hdu/Zuv/cPP/zA3XffzejRo6PKKiLxYdUqGDbMVr2PGwc//mj3H3003HabrS878siwGYNp3hyeespeiFhdg99J0hX53EbcsdS2bVtuv/12ZsyYwZYtW6hfvz6//vorPXr0YOrUqey///5cccUVu23Y89577/H7778zcOBAAJYtW8aCBQuoXr06++yzD9u2baN48eIArFmzhgoVKgA2tT19+nRatGiR6/PnZSRfuXJlliz5+zTgtLS0vy4H5GTQoEF/TdXvrGzZsjRr1oyRI0futsinpaXRrl073nrrLY5M2d8GIokhM9O2eQ8ebIe2zJjx9+eaNYOLL7btbE2bBosY3oABNnVRsSLce2/hfu89uZAf8i0eF97tcOGFF/q6dev6hx9+2Hvv/cyZM32dOnV8ZmamX7FihT/ggAP8gAEDvPc5L7ybP3++P+qoo/5x30MPPeQfe+wx7733559/vn/99de9995v3rzZN27c2E+cONF77/3nn3/u
GzRo4JcvX+699z49Pd336tUrX/+e7du3+6pVq/pffvnlr4V3c+bMyfGxf/75p99///39xo0b/7pv5cqVfu3atX/lbdKkiR86dOg/vm7n12Ht2rW+Tp06/sMPP8w1W7z8NxdJRWPGeN+2rfc1a3pfvPjfi+XKlbP7hw2Lej1Z8nvySXtx7r03X0+DFt6F16lTJ9q3b8+gQYMAqFu3Lscddxy1atXiiCOO4OSTT87169977z3atWv3j/vOP/98LrroIh588EF69epFt27dePHFF/Hec9lll3HqqacCcNZZZ/H777/TqlUrvPc45+jSpUu+/j177bUXvXv35vTTTyczM5MuXbpQq1YtAPr27QvAtddeC8Ann3xC69atKZVtEcny5cu5/PLLyczMJCsriw4dOnDOOef89fibbrqJVatWcfbZZ1OvXj1GjRpF7969WbhwIY8//jiPP/44AKNHj+aAAw7I179FRPbcunV2TX3yZJgwwRbNgV1HP+ssa9Z24YUxX0OWWLyHRx+1t4svhsceCxLD+Ryuu8aznM6Tnzdv3i5Xa0ty0n9zkdhavdr2p3/44T9bxZ50ErRrB9dfH8dtYkPzHu6/H55+Gq64Al57Ld8b+Z1z0733DfP6dRrJi4gIAFu32jX1u+6yA9EAKlSwvu8XXQQXXGAHvchubNgAH30EXbvC//0fFAnXd05FXkQkhS1dagPN0aNh+nQr9PvuCx06wLXXFsiR5qkjK8ve9t3Xrm3sv3+MD5/fPRV5EZEUs3kzvPce9O79d1+RMmXgmmugcWM70jwl9q0XpKws+6to40Z45x3ItpU4pKQp8jsWm0nyS7R1JCLxYuFCm0EeP95uV6gAN99sC+d2s/tWcpOZCVdfDW+8AffdF3z0nl1SFPkSJUqwevVqypcvr0Kf5Ly38+RLFFIjCZFksGwZvPmm1R+wU9u6dLFr7XslRRUIKCMDLr8c3n3XVtI/+KCKfEGrXLkyaWlpOmM8RZQoUYLKlSuHjiES96ZMgYcftuvtYF3nBgyAE08MmyupdO1qBf6ppwq/0U0UkqLIFytWjKpVq4aOISISXFaWXW/v3t060IH1iH/ySTj++LgaZCaHK66AunXhlltCJ8lRUhR5EZFU99NP8PbbdhjMwoVQqRI8/jh062bdVKUApafDyJHQtq317I00JYtHKvIiIgnIexupDxtmo/Ydx7aWLWur5rt10/X2mNi82Yr7mDHwww8Q50259CMgIpJANm2CgQOtiO9Qv76dXnr++dCwoabkY2bjRjj3XJg4Efr3j/sCDyryIiIJYcYMeOgh+Pxzu33ggXDZZba3vXr1sNlSwvr11qh/8mTbB9+5c+hEUVGRFxGJU1u3wuuvwwsv2HX24sXhqqts1N6mTaEdSS4AI0bAN9/YoocLLwydJmoq8iIicebXX63lef/+dlBMpUp23smNN8JBB4VOl2K8t+sfHTvatZAjjwydKE/Cdc0XEZG/ZGRYw5oTT4QjjrDFdIceCp98AkuWwBNPqMAXulWr4JRT4Ouv7XaCFXjQSF5EJKj0dHj1VXjgAbvsW6EC3Hmn9VjRtfaAVqyAli3hl19swV2CUpEXEQlg5kzo2dNG7wBHHQWvvGKzwlodH9jSpdbMPy0Nhg9P6KP4VORFRArRzz9b99PBg+32uefCxRfbWq6Ax47LDr//Dk2bwsqVMGoUNGkSOlG+qMiLiMRYVpYtyu7d23Zgge1pf/ppTcnHnfLloVmzv8/dTXAq8iIiMfLnnzYd36cPLFhg9919t9WPBFzDldwWLoRSpeDgg+G110KnKTAq8iIiBWzgQNtW/dFHtrBu773hv/+1LnUlS4ZOJ/8yb54tsqteHSZMSKpFESryIiIFID3drrPfeOPffeQ7dYKrr7bLusWLh80nuzBnjhV45+Dll5OqwIOKvIjIHvMe/vc/6NvXrrlnZVnjmiuvhGeftRG8xLHvvrP2gXvvDePGwdFHh05U4FTkRUTyaP58
WzQ3erRtp4a/V8m3bw/FioXNJ1Hw3s6AL1nSCny1aqETxYSKvIhIlKZOhQcftJ1VYHXhpZfsrJJy5cJmkzxyzq6vbNkCVaqEThMz2pUpIrIb69dbk5pGjWzQd/HF1l9+wQK7Bq8Cn0C+/BIuvxy2b7ej/JK4wING8iIiuZozx0bqs2fD9dfDI49AxYqhU8keGTsWzjsPDjvM9jemwH9IFXkRkRysXWsr4z/9FPbZx963aRM6leyxUaOgbVu7xjJmTEoUeNB0vYjIP6xaBddeayfAffyxbX9bsEAFPqF9/rmN4GvUgPHjbZo+RajIi4hgB43dey/Urm2nwp19NkyaBBMnWhM0SWAVKtiRsWPH2scpRNP1IpLS5s2znvIDBthC60qVbO/7iSeGTib5Nn++jd4bN4Yvvki6RjfR0EheRFLSwIF2ibZmTWt0dtppNpO7ZIkKfFJ45x2oVcu6FEFKFnjQSF5EUsxvv8Gjj9rIHWw31bXXwgknhM0lBah/f1s12by5dSlKYSryIpIS0tLg1lvt0Biwo14HDlTr2aTzyiv2V1vr1n9vjUhhmq4XkaSVlQVvvQVnnWWr5T/6yA6N+ekn+PBDFfikM3cuXHedrZocMiTlCzxoJC8iSeqzz6x5zdKldvuWW+z3fxKeQSI71KwJw4ZBq1Y69i9CI3kRSSpr1thArk0b2LABeva0Y2B79lSBT1rPPWerJsGmbVTg/6IiLyJJwXt44w3bMTViBNx1l62Uv+UWTcsnLe/h4Yfh7rv/XkUv/6DpehFJaBs32u/3l16CWbOsmc2QIdoGl/S8t+5Fzz4LXbrYPkj5FxV5EUlYs2dbMd+0yRbWvfIKXHmlznNPet7DHXfACy/YSvo+faCIJqZzoldFRBLOL7/YNuhGjSAzE3r1gkWLoGtXFfiU4D388QfcfLON4FXgd0kjeRFJGN7D66/DNdfY7Q4d4Omn4YgjwuaSQpKVBatX2wlyAwZYcU/RTnbRUpEXkYQwYQK0b29HwNata9fha9QInUoKTWYmXHWVHSwwYwbsu2/oRAlBcxwiEtc2bfq7QynYDO1336nAp5SMDLj0UnjzTetDrAIfNY3kRSQuzZ9ve9sHDYJ16+CSS+DFF2H//UMnk0K1bRt07mztCp95xrbLSdRU5EUkrqxebQfIvPSS3T7wQLv82q5d2FwSyAMPWIH/73/htttCp0k4KvIiEje++so61a1eDccdB++/D9Wrh04lQd11F9SrZ6N5yTNdkxeR4BYsgGbNoEkTW0E/dqytrVKBT1GbN8Mjj8DWrVChggp8PqjIi0gwmZnw1FNw1FG2aPq66+wgsRYtQieTYDZutP7zjz8OX34ZOk3C03S9iBS6zEzo3h1eew1+/tlG8a++CtWqhU4mQa1bZwX+m2/gnXfsNDnJFxV5ESlUaWlw4YUwZQocfrgV96uuUk+TlLd2LZx+uu2PfP99OP/80ImSgoq8iBSKzZut1fjzz8Off1qf+WuuUXGXiLQ0WLzYVtKfd17oNElDRV5EYm7mTGjbFn77DcqWhW+/hYYNA4eS+LBpE5QqBccea9duSpUKnSipaOGdiMTMtm12WFj9+va7/LPPbHucCrwAsHw5HH88PPec3VaBL3AayYtITAwcaMd9L1kCZ5xhB8scckjoVBI30tJsG8WyZdC4ceg0SUsjeREpUD/+aN3pLrnERu+DBsHw4Srwks1vv0HTprBiBYwaZR9LTKjIi0iB2LTJFtLVqAGffgp33glLl0LHjlpcJ9ls2WKnDa1eDWPGwMknh06U1DRdLyL54r1taX7ySRvFd+sG99wDVaqETiZxaZ997HCCWrVssYbElIq8iOyxjRttj/sHH1gL2kGDbOQu8i/z5tl1+NNOs2NjpVCoyIvIHpk61Qr6r7/aCvqnn4ZixUKnkrg0eza0bAmlS9sZwsWLh06UMnRNXkTyJDMTbrgBGjWypjZffAE9eqjAyy58951dgy9eHEaOVIEvZCry
IhK1deugdWt4+WVo3x7mzFF7ccnFt9/aNrlSpWDiRDuJSAqViryIRGXUKDjsMBg3Du6/Hz78UNviZDcGDYL994dJk+DII0OnSUkq8iKSq3nz4KKLrKHNfvtZk5snntC2OMlFZqa979HDTpQ7/PCweVKYiryI5Cgjw470PvZYOxTskktser5z59DJJK6NHWs/NL/9BkWKQMWKoROlNK2uF5F/WbHCjoP93/9szVT//tr3LlEYOdLaHVavbvvhJbiYjuSdc2c45350zi10zt2Tw+f3c84Ndc5975z7wTl3ZSzziMju/fwznHiinff+6qs2MFOBl90aOhTatLGWh+PGwQEHhE4kxLDIO+eKAn2AM4GaQCfnXM2dHnYDMNd7XxdoBjzvnNP+CpFAxo+HunXh999h2DC4+mpde5cojBlj2y3q1rUCX6FC6EQSEcuRfCNgoff+F+/9NmAQ0Ganx3igjHPOAaWBNUBGDDOJSA5WrLCtcS1aQMmS8PXXcPrpoVNJwmjY0FoffvGFraaXuBHLIl8JWJLtdlrkvux6A8cAy4DZwC3e+6wYZhKRnUycaCd9fvEFXHCBraavVy90KkkII0fagTNly0Lfvrb9QuJKLIt8TpN8fqfbpwMzgUOAekBv59y+/3oi57o656Y556atWrWqoHOKpKwRI2wE770V+cGDoXz50KkkIbz+Opx1lvUzlrgVyyKfBhya7XZlbMSe3ZXAx94sBH4Fauz8RN77ft77ht77hhW1HUOkQAwYAGefDZUqwfTp6lwnefDyy7Zg4/TT4d57Q6eRXMSyyE8FqjvnqkYW010EfLbTYxYDLQGccwcCRwO/xDCTSMpLT7fmNl26WJfR8eO1lVnyoGdPO7zg3HPh00+1VS7OxazIe+8zgBuBUcA84APv/Q/OuWudc9dGHvY4cJJzbjYwFrjbe/9HrDKJpLpJk6BmTWtuc8UV8P33akYmebB6NTz5JJx/vvU13nvv0IlkN2LaDMd7PxwYvtN9fbN9vAxoHcsMImJdRnv2hLvugjJlrMh36BA6lSQU723BxpQpdoiBjh1MCOp4J5Lk1q+39VFffWXdRkePhoMOCp1KEob38NBD9vHjj+ugmQSj3vUiSWzqVDjwQCvwzzxjR3urwEvUvIe777YTiVassNuSUDSSF0lSU6bYvvf0dGtI1rJl6ESSULyH226DXr3g+uvhpZfU/jABaSQvkoTGjYNTT4VNm2wUrwIveXbzzVbgb70Veve2E+Uk4WgkL5JkRo2yg8DKlbPp+YMPDp1IEtKJJ0Lp0vDUUxrBJzAVeZEk8eeftnr+1Vdt8fPo0SrwkkcZGTBzpvWi79w5dBopAJp/EUkCX38NRxxhBb59e5gxA44+OnQqSSjbt8Oll8LJJ9t5w5IUNJIXSWDeW2Hv1s1uf/qpHektkifbtkGnTvDxx/Dcc9oml0RU5EUS1Lhxtvh51iyoXRuGDoUqVUKnkoSzdStceKH9APXsCbfcEjqRFCBN14skoCFD7GyQtDTo3t0uo6rAyx55800r8C+/rAKfhDSSF0kwb79tfecPPxwmT7ZmNyJ77Jpr7ECDJk1CJ5EY0EheJEFkZFjzscsug6pVrdmNCrzskQ0b7Br8woW2PU4FPmmpyIskgG3boG1bWxPVrh1MmwYHHBA6lSSkdevsWs/gwbagQ5KairxInEtPh6uugs8/hwcesAXQZcuGTiUJae1aOO00O9Tg/fdtv6UkNV2TF4ljixdD48Z2Nsgtt9ghYCJ7ZPVqaNUK5s61vxTPPTd0IikEGsmLxKlJk+D4423w9eqrtrtJZI8VL269jocMUYFPIRrJi8SZrCx47TW49lprSztpEjRqFDqVJKwVK6wHfZkydhyh+tCnFBV5kTiycaMtsBs7FurWtYY35cqFTiUJKy0NWrSAGjXgs89U4FOQputF4sSSJbaTadw4eOYZmD5dBV7yYdEiO2/499/h3ntDp5FANJIXiQMTJtjWuM2bbWfT
+eeHTiQJ7eefbQS/fr1N0R9/fOhEEoiKvEhggwbBJZfYqH3KFDjuuNCJJKF5DxddBJs22bSQfqBSmoq8SEAffWTHdtepAyNG6Px3KQDOWT/6zEw49tjQaSQwXZMXCWTaNOssethhMHq0Crzk06xZ8OijNpKvWVMFXgAVeZFCl5kJffvaZdJy5azAq0Wt5MuMGdC8ue29/OOP0GkkjqjIixQi76FrV7juOhtoffstHHVU6FSS0L75xhbZlSljTRUqVgydSOKIirxIIbrySujf31bSf/edTdWL7LGvvrJe9OXLw8SJdjyhSDYq8iKFwHu46y5bD3XqqbbgrmjR0Kkk4f3+Oxx6qI3gDz88dBqJQyryIjG2YQN06wbdu8N556mzqBSAHdfd27eHmTOhUqWgcSR+qciLxNDUqVCtmh0wc/318MknUKxY6FSS0IYPt2n5UaPstn6gJBcq8iIx8sYbdrCM93YWfJ8+UET/x0l+DBlihxscdRQ0bBg6jSQANcMRiYEpU+Cqq+Doo+36e61aoRNJwvvwQ2us0KABjBwJZcuGTiQJQEVepIBNnmy95w86CMaPV5MbKQAzZ1qr2hNOsOn6ffcNnUgShCYPRQrQu+/CSSfBmjUwdKgKvBSQunXhpZdsBK8CL3mgIi9SADZvhptvhosvtiY3CxdC/fqhU0nCe/NNmDfPtmNcdx2ULh06kSQYFXmRfPr9978HWiefDP/7H1SuHDqVJLw+feCKK+C550InkQSmIi+SD1Om2PT8woXw2WdW4DWbKvn2wgtw443Qpo0ddCCyh7TwTmQPff21jdzLlIEvvoBWrUInkqTwzDNw771wwQW2yEP74CUfVORF9sDkyXDmmXYWyKRJUKNG6ESSFDIyrMlNp07w1luwl35FS/7oJ0gkD7Ky4NZb7fp7xYowYYIKvBQA72HrVihRwjon7b23DjeQAqFr8iJ5cM89VuDbtYMffoCaNUMnkoS34/Sili1tm0bJkirwUmBU5EWi9PbbdsjMmWdaFzsd2y355r1NDfXoAfXq2UhepACpyItE4Ysv4LLLoE4d+PhjnSInBSAry/a+v/gi3HYb9O6tww2kwOknSmQ3vvrKRu9VqlgXOw22pEDcdx+88opdA3r+ef3lKDGhhXciuVi1ykbwBx5oK+oPOih0IkkaV18NFSrAHXeowEvMaCQvsgvr1tkIfvFiOzZWBV7ybft2GDDArsVXqwZ33qkCLzGlIi+Sg/Hj7UTPmTPhgw/gtNNCJ5KEt20bdOwIXbrY3kuRQqAiL7KTF16AFi1g9WprVduuXehEkvDS06F9e/jkE+jVC5o3D51IUoSuyYtEeA9PPAEPPQStW8P770PZsqFTScLbvNn+Uhw92vrQd+sWOpGkEBV5kYju3a3A16tn2+RKlQqdSJLCjBnW+7h/f7jyytBpJMWoyIsAL78Md98Np5xil0u1XVnyLSvLfpCaNIGff4ZDDgmdSFKQfpVJSvMenn4abrjBTpQbNUoFXgrAn3/CqafCe+/ZbRV4CUS/ziRlZWbCVVdZT5K2be2S6T77hE4lCW/NGjt3+Ntv9QMlwWm6XlJSVpY1uXn3XTj/fFtkpzNBJN9WrbL9lvPn20r6s88OnUhSnEbykpIuucQK/BVXwODBKvBSADZutK1xP/5oey9V4CUOaCQvKeeZZ+xSabdu8H//p4ZjUkBKlYILL7TVmy1ahE4jAoDz3ofOkCcNGzb006ZNCx1DEtT06XD88dauduhQLbKTArBkiS20O/bY0EkkiTnnpnvvG+b16zSSl5SxZYv1JClb1g7/UoGXfFu0yKbo99oL5s2z9yJxRD+RkhIyMuw6/JIltoq+cuXQiSThLVxo0/IbN9oPlQq8xCH9VErSy8yEzp2ti91jj+mwGSkAP/5oBX7rVhg3ztokisQhFXlJat7bOqjJk+GWW+DBB0MnkqTw5JM2PTRhAtSuHTqNyC6pyEtSe/JJK/D33msfixSIV16BZcvg
yCNDJxHJlZYeSVLyHh55xEbuZ55pp8tpq5zky7RpcPrpsG6ddbJTgZcEoJG8JJ1t2+zo7s8/t4XPH36olfSST1OmWIEvV862y+23X+hEIlHRrz5JOvfcYwX+7rthzBgoWTJ0Iklo//ufrdasWBEmToTDDw+dSCRqKvKSVF54AXr2hE6drLOdRvCSL19+aSP4SpWswB92WOhEInmiX4GSFLyHG2+E22+3I2P79QudSJLCoYdC06a2ir5SpdBpRPJMRV4S3rp1dg2+Tx+bVR0zBkqXDp1KEtp339lRhVWqwPDhcNBBoROJ7BEVeUlov/0GzZrBsGHW6GbkSNh779CpJKF9+ik0bgzdu4dOIpJvWl0vCWv2bCvwa9bYsbGdOoVOJAlv8GBrj9iggR1TKJLgVOQlIS1a9Hd72ilTbOAlki8DB8Jll8FJJ9n2jH33DZ1IJN9U5CXh/P47nHCCvZ80SQVeCsDvv0PXrrbI7rPPtKhDkoaKvCSU9evtuNg1a2DUKOtLL5JvBx5oKzbr1lVjBUkqKvKSMDZssKI+axa89Ra0bh06kSS83r2hVCm48ko48cTQaUQKnFbXS0LIyoIrrrAC368fXHpp6ESS8J5/Hm66ybZmeB86jUhMqMhLQnj8cTsP/pFH4JprQqeRhPfUU3DnnXDhhTBokE4vkqSlIi9x7+ef4emn4bzz4KGHQqeRhPfII3D//XDxxbb3slix0IlEYkZFXuJaRoZ1s8vKgmef1YBLCsBee9m1nzfftI9Fkph+wiVuLVhg25Z3LLSrUSN0IklY3sOSJXbAzAMP2G39xSgpIKYjeefcGc65H51zC51z9+ziMc2cczOdcz845ybGMo8kjh0j+Bkz7Hq8FtrJHsvKgptvhnr1YPFiu08FXlJEzEbyzrmiQB/gNCANmOqc+8x7PzfbY8oCLwNneO8XO+cOiFUeSSw33QRz5lgTss6dQ6eRhJWVBddeC6++CnfcYafKiaSQWI7kGwELvfe/eO+3AYOANjs9pjPwsfd+MYD3fmUM80iCeO016NvXGpCpH73sscxM6NLFCvx999mBMxrBS4qJZZGvBCzJdjstcl92RwH7O+cmOOemO+cui2EeSQDTp9sWuerVoVcv/U6WfOjd2xbXPfooPPGEfpgkJUU9Xe+cK+W935SH587p/6idO07sBTQAWgL7AJOdc1O89z/t9L27Al0BDjvssDxEkETivW1d3n9/mDwZSpQInUgS2rXX2jnwHTuGTiISzG5H8s65k5xzc4F5kdt1nXMvR/HcaUD2C2CVgWU5PGak936T9/4PYBJQd+cn8t7389439N43rFixYhTfWhJRr14wYQL85z9QvnzoNJKQtm61H6DVq2HvvVXgJeVFM13/AnA6sBrAe/89cGoUXzcVqO6cq+qcKw5cBHy202OGAKc45/ZyzpUEGhP5Y0JSy48/2ii+VSu4++7QaSQhpafblowePeCLL0KnEYkLUU3Xe++XuH9ez8qM4msynHM3AqOAokB/7/0PzrlrI5/v672f55wbCcwCsoDXvPdz8vqPkMSWlgZnnQVlysAbb0ARtWiSvNq8Gdq2tZPkXnkFLroodCKRuBBNkV/inDsJ8JER+c1EOdr23g8Hhu90X9+dbncHukcXV5LN6tVwzjmwdCmMHAmVdl6aKbI7GzfCuefCxInQv791sxMRILrp+muBG7CV8WlAPeD6GGaSFLF0KZx2mu2Hf/ddaNYsdCJJSBs2wPLl8M47KvAiO4lmJH+09/7i7Hc4504GvopNJEkFmzfDmWfC7Nl2CFj79qETScJZv97Ogj/4YOt9XLx46EQicSeakfxLUd4nErWHHrIC/8YbWgAte2DNGmjRwrbJgQq8yC7sciTvnDsROAmo6Jy7Pdun9sUW0onskfffh+efh/PPh8svD51GEs6qVbYN48cf4bHHQqcRiWu5TdcXB0pHHlMm2/3rgQtiGUqS17p1tvC5Vi1rXyuSJytWQMuW8Ouv
MHSoLeoQkV3aZZH33k8EJjrn3vDe/1aImSRJZWb+3Yu+e3coWzZoHEk0WVlw9tnw228wfLhWaopEIZqFd5udc92BWsBfjUa99y1ilkqSjvdwww0wYgTcc48tuhPJkyJF7K/D4sWhSZPQaUQSQjQL7wYC84GqwKPAIqybnUjUPvzQepRccw089VToNJJQfv3VDpoBW2ynAi8StWiKfHnv/evAdu/9RO99F+CEGOeSJPLrr3DddXayXJ8+OgxM8mDBAjj1VDsLfs2a0GlEEk400/XbI++XO+fOxg6ZqRy7SJJMtm2zVfRr1sDYsVCsWOhEkjDmzbNFdtu32w9PuXKhE4kknGiK/BPOuf2AO7D98fsCt8YylCSP226D776z/fB1/3W+oMguzJljBd45O5qwVq3QiUQS0m6LvPd+WOTDdUBz+KvjnUiuxo6Fl1+Grl21H17yaOJE2GsvGDcOjj46dBqRhOW89zl/wrmiQAesZ/1I7/0c59w5wH3APt774wov5t8aNmzop02bFuJbSx4sXgw1a8K++8LcudouJ1HautXOgQf480/94IhEOOeme+8b5vXrclt49zpwNVAeeNE5NwDoATwXqsBLYli71k6WS0+3Y731e1qiMnmyrc6cGtm8ox8ckXzLbbq+IVDHe5/lnCsB/AFU896vKJxokqjuvBN++MEOBdOlVInKpEnW6Oagg+xNRApEbiP5bd77LADvfTrwkwq87M4779iR3tdd93d3O5FcjR1r3ZEqV7Zr8YceGjqRSNLIbSRfwzk3K/KxA46M3HaA997XiXk6SSiDBtlx3rVrQ48eodNIQpg+3a7tVKsGY8bAgQeGTiSSVHIr8scUWgpJeN98A5deatvkRo+GEiV2/zUi1KkDN98M//kPVKgQOo1I0sntgBodSiNRGT8ezjrLBmFDhkD58qETSdwbMQLq17cfmmefDZ1GJGlF09ZWZJfS06FDB9sqN3q0XVYVydX778O558K994ZOIpL0oul4J7JLzz0Hf/xhJ3/WrBk6jcS9t9+2hRsnnwy9eoVOI5L0ohrJO+f2cc6p7ZT8w8SJ8Oijtm7qjDNCp5G417+/tT5s1sym68uUCZ1IJOnttsg7584FZgIjI7frOec+i3EuiXPp6XDttdavZOBAnSwnu7F1q225aN0ahg2DUqVCJxJJCdFM1z8CNAImAHjvZzrnqsQuksQ776FLF5g/3xba7btv6EQS17y3VrXjx8N++2nrhUghima6PsN7vy7mSSRhPPssvPce3H47nHde6DQS17p3t65IGRm2kl4FXqRQRVPk5zjnOgNFnXPVnXMvAV/HOJfEqbQ0eOYZaNFCDW9kN558Eu66yz7exUFYIhJb0RT5m4BawFbgXezI2VtjmEnilPe2MHrrVltVr+vwkiPv4eGH4YEHrEPSO+9AsWKhU4mkpGiuyR/tvb8fuD/WYSS+9e5tbcafeAIaNAidRuLWY4/ZW5cu0K8fFC0aOpFIyoqmyP/XOXcwMBgY5L3/IcaZJA59+aWdLlenDtxzT+g0EtdatYJ16+x6ThH12xIJabf/B3rvmwPNgFVAP+fcbOfcA7EOJvFj+3a4+mprVzt0qAZmkoOsLJvmAWt089//qsCLxIGo/i/03q/w3r8IXIvtmX8olqEkvrz4Ivz0E/TsCYcdFjqNxJ2sLOjWzUbwX30VOo2IZBNNM5xjnHOPOOfmAL2xlfXqUJ4i5s+H++6zjnYXXhg6jcSdzEy79v7aa3D//XDSSaETiUg20VyTHwC8B7T23i+LcR6JI+vXQ/v2Nj3ft69W08tOMjLgssusacJjj8GDD4ZOJCI72W2R996fUBhBJP5cfbWN5IcPh8MPD51G4s6YMVbgn3kG7r47dBoRycEui7xz7gPvfQfn3GwgeycLB3jvfZ2Yp5NgRo6EwYPhppt0+IzswhlnwIwZcNxxoZOIyC44v4tOVM65g733y51zOY7hvPe/xTTZLjRs2NBPmzYtxLdO
GStXQu3a1mb8u++gdOnQiSRubNliU/Q33wynnBI6jUjKcM5N9943zOvX7XLhnfd+eeTD6733v2V/A67f06AS/7p1gz//hA8+UIGXbDZvtsMKPvoIFiwInUZEohDNFrrTcrjvzIIOIvFh/Hj49FNrfKNZWPnLxo1w1lkwbhwMGGAr6kUk7uV2Tf46bMR+hHNuVrZPlQG0GTYJbd5sJ8sddJBtmxMBrMCfcQZMmWJ96Dt1Cp1IRKKU2+r6d4ERwNNA9kamG7z3a2KaSoLo3Bm+/94WTGuaXv6yzz5wxBFw661wwQWh04hIHuRW5L33fpFz7oadP+GcK6dCn1xefRWGDLHDwzp2DJ1G4sLq1ZCeDpUqwVtvhU4jIntgdyP5c4Dp2Ba67K1QPHBEDHNJIVq+3AZpTZuqn4lErFxpbWqLFIHp03VggUiC2mWR996fE3lftfDiSGFbv94usW7bZmeK6He5sHw5tGwJixbpRCKRBBdN7/qTnXOlIh9f4pz7r3NOx5QkAe/h/PNh4kQ7hKZ+/dCJJLi0NJvSWbwYRoywYi8iCSuaLXT/B2x2ztUF7gJ+A96OaSopFD16WGfSRx+F664LnUbiwk03wYoVMGqUFXsRSWjRHFCT4b33zrk2QC/v/evOuctjHUxia+VKOzr2+ON1HV6yeeUVWLIEGjQInURECkA0I/kNzrl7gUuBz51zRYFisY0lsXbttXbp9emndbpcyvvpJ/uB2L4dDjhABV4kiURT5DsCW4Eu3vsVQCWge0xTSUwtWWLrqa6/XpdcU97cuTYt//HHdh1eRJLKbot8pLAPBPZzzp0DpHvvtWk2QXlvR8gWKQK33RY6jQQ1ezY0a2YfT5gARx4ZMo2IxEA0q+s7AN8CFwIdgG+cc2p7laB69oTRo22xnX6np7DvvoPmzaF4cdteUbNm6EQiEgPRLLy7Hzjee78SwDlXERgDfBjLYFLwFi+GJ56ARo3g7rtDp5GgMjKsk93HH+uvPZEkFk2RL7KjwEesJrpr+RJnHnkENm2C11/XYruUtXSpFffjj7fRfBH9ryySzKL5P3ykc26Uc+4K59wVwOfA8NjGkoL27bd2Qmi3blC7dug0EsSkSVCjhh1UACrwIilgtyN57/1/nHPtgSZY//p+3vtPYp5MCszvv0OHDlCmDDzwQOg0EsTYsXDuuVClCpxzTug0IlJIcjtPvjrQAzgSmA3c6b1fWljBpGB4D+edB7/9Bp9/DhUrhk4khW7kSGjXDqpXtxaHBxwQOpGIFJLc5uv6A8OA87GT6F4qlERSoN54w6bqn38ezjordBopdEuXWoE/5hgYP14FXiTF5DZdX8Z7H7l4x4/OuRmFEUgKzk8/WU/6Bg2sJbmkoEqV4O23revR/vuHTiMihSy3Il/COXccf58jv0/22957Ff04lp5uI/fixWHwYCimRsSp5f33oUIFK+4XqK2FSKrKrcgvB/6b7faKbLc90CJWoST/+vaFn3+GDz6AqlVDp5FC9dZbcOWV0Lo1tGih/ZIiKWyXRd5737wwg0jB2bLFVtE3bapBXMp5/XW45hrrZvfhhyrwIilOG2WT0DvvWNObu+7S7/iU8vLLdjDB6afDsGFQqlToRCISmIp8ktm2DXr0sCn6M88MnUYKjfcwbZrthf/0U9hnn9CJRCQORNPWVhJInz62qn7AAI3iU8b69bDvvtbJLjPTVluKiBDdKXTOOXeJc+6hyO3DnHONYh9N8mrFCrj3XjjxRLj88tBppFA8/jjUqwcrV0LRoirwIvIP0UzXvwycCHSK3N4A9IlZItljt90GW7fapVmN4pOc9/Dgg/DQQ9CkCZQvHzqRiMShaKbrG3vv6zvnvgPw3q91zmm4EGeGD4dBg2yxXb16odNITHlvZwV37w5XXQWvvGKjeBGRnUQzkt/unCuK7Y3fcZ58VkxTSZ5s22bboqtXt8GdJLlevazAX3cd9OunAi8iuxTNSP5F4BPgAOfck8AFgM4y
iyO33GKXZPv1g9KlQ6eRmLvsMltgd/vtui4jIrmK5qjZgc656UBLrKVtW+/9vJgnk6h8/LF1t+vaFdq0CZ1GYiYz07ZOdO0K5crBHXeETiQiCWC3Rd45dxiwGRia/T7v/eJYBpPd++MPuyRbuza88ELoNBIzGRnQpYsdNFO+PFx8cehEIpIgopmu/xy7Hu+AEkBV4EegVgxzSRS6dYONG+Hdd6FkydBpJCa2b7fp+UGDbLucCryI5EE00/XHZr/tnKsPdItZIonKoEE2Vf/II3Dssbt9uCSibdugUyf7D/3cc/Cf/4ROJCIJJs8d77z3M5xzx8cijERnzhy7NFu9ujW/kSS1aBFMmAA9e9rqShGRPIrmmvzt2W4WAeoDq2KWSHLl/d+Lqz/4QA3OktL27bDXXnDUUdajWI1uRGQPRbNPvky2t72xa/Raxx3IwIHw3Xfw0ktqepOUNm2CM86AJ56w2yrwIpIPuY7kI01wSnvvdTEwDqxda1ujjz0WrrgidBopcBs2wNlnw1dfWXcjEZF82mWRd87t5b3PiCy0kzjw+OOwapW1sC2iQ4KTy7p1djbwt9/adomOHUMnEpEkkNtI/lvs+vtM59xnwGBg045Peu8/jnE2yebXX6F3b2jfHho2DJ1GClRmpk3RT59uCy3atw+dSESSRDSr68sBq4EW/L1f3gO7LfLOuTOAXkBR4DXv/TO7eNzxwBSgo/f+w+iip5bHHrNa8PzzoZNIgSta1PrQ778/nHtu6DQikkRyK/IHRFbWz+Hv4r6D390TR67n9wFOA9KAqc65z7z3c3N43LPAqDxmTxnLl8Obb1oflCpVQqeRArNype2HbNHCtkyIiBSw3Ip8UaA0/yzuO+y2yAONgIXe+18AnHODsFX5c3d63E3AR4D23u/CrbfaNfgHdCxQ8li+HFq2hN9/t2sx++4bOpGIJKHcivxy7/1j+XjuSsCSbLfTgMbZH+CcqwS0wy4FqMjnYMgQu0x7551w9NGh00iBSEuz0fvy5fD55yrwIhIzua3Rzu8ZltHMAPQE7vbeZ+b6RM51dc5Nc85NW7UqdfrwbN9u/elr1rRr8pIEFi2CU0+1EfyoUfaxiEiM5DaSb5nP504DDs12uzKwbKfHNAQGOTsTuwJwlnMuw3v/afYHee/7Af0AGjZsGM2lgqTwxBNWC15/HfbZJ3QaKRD9+1vDgzFj4HhNXolIbDnvY1MznXN7AT9hfywsBaYCnb33P+zi8W8Aw3a3ur5hw4Z+2rRpBZw2/ixaZEfINmsGQ4eCy++8ioTlvf1HzMqC336DqlVDJxKRBOKcm+69z/MG6pi1VPHeZwA3Yqvm5wEfeO9/cM5d65y7NlbfN1lcf73VhOefV4FPeHPnQuPGtsCuSBEVeBEpNHk+hS4vvPfDgeE73dd3F4+9IpZZEsmQITBiBDz0kBbbJbxZs6BVK9sLn54eOo2IpBg1R40zGzbYKL5KFbjnntBpJF9mzIDmze2owIkT4ZhjQicSkRQT05G85N0zz8CyZbbwWovtEtjMmbZNrmxZGDcOjjgidCIRSUEayceRxYuhe3do3dreJIFVrWr96CdOVIEXkWBU5OPIHXdYf/q+Oa5akIQwbRps3gz77QeDBsHhh4dOJCIpTEU+TixYAB9+CFddpcXXCWvMGGtuc+edoZOIiAAq8nEhKwtuu80WYKs/fYIaPhzOOQeqVYNHHgmdRkQEUJGPC6+/bi3Mn3gCDjssdBrJsyFDoG1bqFULxo+HAw4InUhEBIhhx7tYSbaOd9u3w7HH2sfz5qnxTcLZsgWqV4dKlWxLRNmyoROJSBLa04532kIX2MCB8OOPdl68CnwC2mcfuxZ/yCE6TU5E4o6m6wPavNmuwVerBpdcEjqN5Mmbb8L991tP+ho1VOBFJC5pJB/Q55/D0qXw7rvW0lwSxKuv2hnALVva9ZbixUMnEhHJkUpLQK++amu02rYNnUSi1qcPdO1qjW6GDlWBF5G4
piIfyBdf2NuNN6p9bcLo2dP+g7VpA598AiVKhE4kIpIrFfkAtm+3wWDlynD77aHTSNQOPhg6doTBg2HvvUOnERHZLRX5APr3h0WL7DCaUqVCp5FceQ/z59vHHTvCe+9BsWJhM4mIRElFvpCtXQt33QX161vNkDjmPTz4INSpY8fGgvY5ikhC0er6QvbSS7B+vb3fS69+/PLe/hrr0QOuuQbq1QudSEQkzzSSL0Rz5sDTT8Npp8FJJ4VOI7vkPdx6qxX4G26wYwG1x1FEEpB+cxWiBx6wlfSvvRY6ieTq44/hxRft1KCXXlKBF5GEpQnjQrJggZ1jcuutOoQm7rVvb/+xzj1X1+BFJKFpiFJInn3WjpK99dbQSSRHGRn2H+enn6ywn3eeCryIJDyN5AvB/PnW6vyqq+Dww0OnkX/Zvt0OD/jgAzjySDjqqNCJREQKhIp8IfjPf2wl/UMPhU4i/7JtG1x0kXWw69EDbropdCIRkQKjIh9jkybBsGFwzz125LjEkfR0uOACOynoxRdV4EUk6ajIx9gjj0D58nDvvaGTyL9kZlrTgr597VQ5EZEkoyIfQ0uW2Ej+1lt13Hhc2bQJsrKgTBkYP95WRIqIJCEV+Rh6+WWrJVdfHTqJ/GXDBjj7bFskMXasCryIJDUV+RhZuxaef95OJa1RI3QaAeDPP+HMM2HqVBg4UFvkRCTpqcjHyAsv2M6se+4JnUQAWLMGTj8dvv/ejopt1y50IhGRmFORj4Ht22HAAGjSBBo3Dp1GALj0Upg1y1rWnnNO6DQiIoVCRT4G+vSBtDR48snQSeQvzz9vKyFPOy10EhGRQqO2tgVs+3a4/37rT3/JJaHTpLhly6yfsPe2MEIFXkRSjEbyBeyZZ2DzZrjvPh1eFtSSJdCiBaxYYQ1vjjwydCIRkUKnMlSA1qyB//4XTj5ZvVWC+vVXOPVUWLkSRo9WgReRlKWRfAHq3992afXqFTpJClu40EbwGzfaPviGDUMnEhEJRkW+gHgPr79uNaVBg9BpUtiPP9qxsePGQb16odOIiASlIl9AXn3VjpR9443QSVLUpk1QqpR1s1u4EEqWDJ1IRCQ4XZMvABkZ8PTTNoq/9NLQaVLQ999DtWrw6ad2WwVeRARQkS8QQ4bAokVw221aUV/opk2D5s2tF32tWqHTiIjEFZWkAjBwIJQrBx06hE6SYqZMgZYtYb/97Li/6tVDJxIRiSsq8vk0f77NEl92mQ0mpZAsWmTNbSpWhIkToWrV0IlEROKOinw+vfyyray/+ebQSVLM4YfDww9bgT/ssNBpRETiksae+fDHH7aa/uyzNZAsNGPGwMEH2/X3O+8MnUZEJK5pJJ8PvXrBhg02oJRC8Pnn9hfVHXeETiIikhBU5PfQpk3Qt6+t+zr++NBpUsCnn9oZ8MceC+++GzqNiEhCUJHfQ2+/bdP1GlQWgsGD4cILoX59m64vVy50IhGRhKAiv4feew+OOQbOOCN0kiTnPbz2Gpxwgh02U7Zs6EQiIglDC+/2wPr1MHUqdOkCzoVOk8QyMmxf4scfW7EvXTp0IhGRhKKR/B546y3YsgU6dgydJIn162fHxW7YYD3pVeBFRPJMRT6PsrKgRw+oUQOaNAmdJkn17g3dutm192LFQqcREUlYKvJ59Omn8Ntv1vxGU/Ux8N//wk03Qdu2Nk1fokToRCIiCUtFPo8GDbJDaDp3Dp0kCb38sm1XuPBC+OADKF48dCIRkYSmIp8HaWm2m+u66+xMFClgp58Ot95q++A1TS8ikm8q8nnQr5+9v/76sDmSivd2DSQrC448El54QSf9iIgUEBX5KGVm2nqwpk2hZs3QaZKE99Z/vl07myIREZECpSFTlIYNg7Vr4aqrQidJEllZcMst9pfTTTdBhw6hE4mIJB2N5KP02mtwyCFw0UWhkySBrCxb2NC7ty2069VLWxVERGJART4Ka9faSP6CC7QerEDMnm1n9N53H3TvrgIvIhIj
mq6Pwptv2vvzzgubI+F5bwW9bl2YNQuOOkoFXkQkhjSS342sLFvwffzx0KJF6DQJbPt26NQJ3nnHbh99tAq8iEiMqcjvxuTJsHgxXH21atIe27rVGty8/z6sXBk6jYhIytB0/W707Wvvzz47bI6ElZ4O558Pw4fDSy/BjTeGTiQikjJU5HOxaBEMHGij+EqVQqdJQNu320KGMWPglVega9fQiUREUoqKfC569LD3998fNkfCKlYMTjrJGv1fcUXoNCIiKUdFfhfS021Vffv2UKVK6DQJZv16WLIEatWCRx4JnUZEJGVp4d0uDB8OGzfCZZeFTpJg/vwTWreGVq1g06bQaUREUppG8rvw2mtQoQKccUboJAlk9Wor8LNnWy/6UqVCJxIRSWkayedg+XIYNQouvlhHmkdt5UprJPDDD3aqXJs2oROJiKQ8jeRz8M471gTnmmtCJ0kgTzwBCxbA0KFw2mmh04iICBrJ52jwYOu8WqtW6CQJ5LnnYNIkFXgRkTiiIr+TL7+EqVN18mlUFi+2TnZr10KJEtCwYehEIiKSjabrd/Luu/b+8svD5oh7v/4KzZvbavpFi2D//UMnEhGRnWgkn016Orz3nu2NV4e7XCxYAKeeChs2wNixcNxxoROJiEgONJLPZsoUWLfOVtXLLsyfb6vot2+HceNs8YKIiMQljeSzGTTIOrG2bBk6SRwrWdJaAE6YoAIvIhLnNJKP2LzZrsdfcAHst1/oNHHol1/g8MPhsMPgq6907q6ISALQSD7i66/tEvNFF4VOEoemTbOV8w88YLdV4EVEEoKKfMRHH8E++9iCcclm8mS7frHfftCtW+g0IiKSBzEt8s65M5xzPzrnFjrn7snh8xc752ZF3r52zgW5yLt9OwwbBk2bQpkyIRLEqS+/tF70BxxgjW50HJ+ISEKJWZF3zhUF+gBnAjWBTs65mjs97Fegqfe+DvA40C9WeXLz9tuQlqYjz/9h40bbS1i5MkycCIceGjqRiIjkUSwX3jUCFnrvfwFwzg0C2gBzdzzAe/91tsdPASrHMM8u9e1rNezCC0N89zhVujR8+CHUqAEHHhg6jYiI7IFYTtdXApZku50WuW9XrgJGxDBPjhYssDa2XbtCEa1QsOsWr79uHzdtqgIvIpLAYlnWclqC7XN8oHPNsSJ/9y4+39U5N805N23VqlUFGBGeecYWi2tVPfDJJzZF368fZGSETiMiIvkUyyKfBmS/kFsZWLbzg5xzdYDXgDbe+9U5PZH3vp/3vqH3vmHFihULLODGjTBwILRtC9WqFdjTJqb337frFQ0bwujRsJdaKIiIJLpYFvmpQHXnXFXnXHHgIuCz7A9wzh0GfAxc6r3/KYZZcvT++7B1K9x0U2F/5zjzzjvQuTOcdBKMGqVuQCIiSSJmwzXvfYZz7kZgFFAU6O+9/8E5d23k832Bh4DywMvOGqxkeO8L7bzS//s/qF4dmjUrrO8Yp9LS7EX47DMoVSp0GhERKSDO+xwvk8ethg0b+mnTpuX7eX74AWrXhueeg//8pwCCJaI//oAKFezjjAxN0YuIxCnn3PQ9GQSn7Hryjz+29506hc0RzIsv2jTG/Pl2WwVeRCTppGyR//JLqFnTer2knO7d4ZZb7MjYI44InUZERGIkJYv8tm3WpbV169BJAnjySbjrLujY0c7WLV48dCIREYmRlCzyX3xhq+pT7jCaQYPsJLlLL7UV9cWKhU4kIiIxlJJF/o03oGTJFBzJt28PL70EAwboGryISApIuSK/bZu1ZD/9dChRInSaQuA99OhhK+mLF4cbb4SiRUOnEhGRQpByRX7IEHufEnvjs7Ks089//gNvvhk6jYiIFLKUm7MdNsy2hl93XegkMZaVBd26wWuvWZG//fbQiUREpJCl3Ej+m2+gceMkX3OWmQlduliBf+ABePZZO4VHRERSSkoV+V9+gR9/hFNOCZ0kxtauha+/hsceg8cfV4EXEUlRKTVd//LL
9r5t26AxYmf7divoFSrA9OlQpkzoRCIiElBKjeTffRdOPRWOPjp0khjYuhUuuACuvNJW1KvAi4ikvJQp8suWwfLlUKdO6CQxsGULtGtnp8ideKKm50VEBEih6frRo+395ZeHzVHgNm+GNm1g7Fh49VW4+urQiUREJE6kTJH/5BMoXx7q1QudpIB17Ajjxlkbv8suC51GRETiSEpM12/cCCNHQocOSdjN9c47YeBAFXgREfmXZCt5ORo82NrZtm8fOkkBWbvWrj907AhNm4ZOIyIicSolRvIff2znxrdsGTpJAVi92v4hl18OixeHTiMiInEs6Yv86tV2tOzZZyfBovOVK+183LlzbZHBYYeFTiQiInEs6afr/+//bAt5166hk+TT8uU2gl+0yBrwt2oVOpGIiMS5pC7y3sN778EJJ0D9+qHT5NOoUTY9P2KErsOLiEhUknq6/qefbGa7Y8fQSfIhK8veX3EFLFigAi8iIlFL6iL/6af2vk2boDH23C+/QN26MHmy3T744LB5REQkoST1dP3EiVC9OlStGjrJHliwwBbZbdkCe+8dOo2IiCSgpB3Jr1v396r6hDNvnp2ks20bjB+fBAsKREQkhKQdyU+cCBkZCVjkf/3VrrsXLQoTJkDNmqETiYhIgkrakfy4cVCyJDRpEjpJHlWuDBdeaH+lqMCLiEg+JO1Ifvp0O1a2RInQSaI0fToccogtruvTJ3QaERFJAkk5kl+3Dr7+OoEuZX/9tS2yS/iOPSIiEk+SssiPGmXby9u2DZ0kChMnQuvWcNBB1p5PRESkgCRlkf/iC5umP/XU0El2Y+xYOPNM60E/caJdjxcRESkgSVfkMzNh+HAbHMf19vKsLLj7bqhWzVbRq9GNiIgUsKRbeDdvHixblgBb54oUsYNm9toLKlQInUZERJJQ0o3khwyx982bh82xSx99BJ072yb+gw5SgRcRkZhJuiI/dqxtnatePXSSHAwaZKflLFpk7WpFRERiKKmKfHo6TJsGjRqFTpKDt96Ciy+Gk0+25f9lyoROJCIiSS6pivy0abBhA5x1VugkO3nzTTsqtlkzWxWoAi8iIoUgqYr8rFn2vl69oDH+7eij4YILbKFdqVKh04iISIpIqiI/caJ1hq1SJXSSiBkz7P0JJ8AHH8A++4TNIyIiKSWpivw339glb+dCJwGeew4aNLDRu4iISABJU+RXroTffrNBc3CPP26Nbi66CM44I3QaERFJUUlT5P/3P3tft27AEN7Dgw/CQw/BpZfCO+9YsxsREZEAkqbIT5hg9TTo+fHffgtPPAFXXQUDBkDRogHDiIhIqkuaYeaXX1qBD9qvvnFjGD/eTsYpkjR/P4mISIJKikq0bh3MnAmnnBLgm2dlwe2329J+sL3wKvAiIhIHkmIkP3y4vW/RopC/cWYmdOsGr78OpUtD06aFHEBERGTXkqLIjxgB++5r2+cKTUYGdOkCb79ti+0efbQQv7mIiMjuJUWR//57uxxerFghfcPt2231/Pvv23a5Bx4opG8sIiISvYS/ePznnzB3rvWdKTRFithfFM89pwIvIiJxK+FH8uPH28z5aacVwjfbuhXWrIGDD7ZT5eKitZ6IiEjOEn4k//XXtj8+5p3utmyBNm1s9Xx6ugq8iIjEvYQfyc+aBcceCyVLxvCbbNoE551n0wavvgolSsTwm4mIiBSMhB/J//gjVK8ew2+wYQOceaa11HvzTetmJyIikgASusivWWOH0tSpE8Nvcvvtdk3g3XdtRb2IiEiCSOjp+nfesffNmsXwmzz9NJx/vk6TExGRhJPQI/nZs+39SScV8BP/8QfceSds2wYVKqjAi4hIQkroIj9jhnWSLdCF7r//Ds2bQ58+1mVHREQkQSVskfce5s2D444rwCddvtzm/n/+GYYNg+OPL8AnFxERKVwJe01+/nzbul6zZgE9YVqanXCzfDmMHGnHxYqIiCSwhC3y06fb+8aNC+gJV660jnaj
R8OJJxbQk4qIiISTsEX+u+/s/VFH5fOJ1q6F/feH+vVhwQIoXjzf2UREROJBwl6T/+UXKFMmn83nfvwRateGnj3ttgq8iIgkkYQt8suXWzvbPTZ3ri3Nz8iAVq0KLJeIiEi8SMgiv3GjXZPf40vns2bZKvoiRaxdbe3aBZhOREQkPiRkkV+40AbgjRrtwRevWwctW8Lee8PEiXDMMQWeT0REJB4k5MK7HSvra9TYgy/ebz/o1cvOpj3iiALNJSIiEk8SssiPGwdly+Zxj/xXX9mRsa1bQ+fOsYomIiISNxKyyM+ZY1P1e0WbfsIEOOccO5O2VSu7Fi8iIpLkErLaLVyYh1H8mDFw1llw+OEwYoQKvIiIpIyEq3jbtsHmzTYo360RI2wEX60ajB8PBx0U83wiIiLxIuGK/Nat9j6qTndDh0KtWlbgDzggprlERETiTcJdk9+2zd4femguD9q61bbI9e5ti+3KlCmUbCIiIvEk4Uby6elQtGguu9/efdea26Sl2fV3FXgREUlRCVnkjzwSihXL4ZNvvgmXXAKVKtkeOxERkRSWcEV+yxa7zP4vr74KV15p3eyGD4fSpQs9m4iISDxJuCK/bZstlv+HDz6Arl3hjDNssV3JkkGyiYiIxJOEK/Le57AT7rTT4K674JNP8nn2rIiISPJIuCIPULly5INBg+wi/f77w7PP2op6ERERAWJc5J1zZzjnfnTOLXTO3ZPD551z7sXI52c55+pH87xVqgCPPQadOkGfPgUdW0REJCnErMg754oCfYAzgZpAJ+fczs1ozwSqR966Av8XzXMf9dYD8PDDcPnlcOutBRdaREQkicRyJN8IWOi9/8V7vw0YBLTZ6TFtgLe8mQKUdc4dnNuTViKNsn2ehGuugf79bdO8iIiI/Essi3wlYEm222mR+/L6mH+owB/462+Avn112IyIiEguYtnW1uVwn9+Dx+Cc64pN5wNsLfJynzm8rGvxMVQB+CN0iBSg1zn29BrHnl7jwnH0nnxRLIt8GpC9w3xlYNkePAbvfT+gH4Bzbpr3vmHBRpXs9BoXDr3OsafXOPb0GhcO59y0Pfm6WM53TwWqO+eqOueKAxcBn+30mM+AyyKr7E8A1nnvl8cwk4iISMqI2Ujee5/hnLsRGAUUBfp7739wzl0b+XxfYDhwFrAQ2AxcGas8IiIiqSamR81674djhTz7fX2zfeyBG/L4tP0KIJrkTq9x4dDrHHt6jWNPr3Hh2KPX2VmdFRERkWSjPWgiIiJJKm6LfKxa4srfoniNL468trOcc1875+qGyJnIdvcaZ3vc8c65TOfcBYWZL1lE8zo755o552Y6535wzk0s7IyJLorfF/s554Y6576PvMZaY5VHzrn+zrmVzrk5u/h83uue9z7u3rCFej8DRwDFge+Bmjs95ixgBLbX/gTgm9C5E+ktytf4JGD/yMdn6jUu+Nc42+PGYetXLgidO9HeovxZLgvMBQ6L3D4gdO5EeovyNb4PeDbycUVgDVA8dPZEegNOBeoDc3bx+TzXvXgdycekJa78w25fY+/91977tZGbU7A+BhK9aH6OAW4CPgJWFma4JBLN69wZ+Nh7vxjAe6/XOm+ieY09UMY554DSWJHPKNyYic17Pwl73XYlz3UvXot8TFriyj/k9fW7CvsLUqK329fYOVcJaAf0RfZUND/LRwH7O+cmOOemO+cuK7R0ySGa17g3cAzW0Gw2cIv3Pqtw4qWMPNe9mG6hy4cCa4kruxT16+eca44V+SYxTZR8onmNewJ3e+8zbQAkeyCa13kvoAHQEtgHmOycm+K9/ynW4ZJENK/x6cBMoAVwJPCFc+5L7/36GGdLJXmue/Fa5AusJa7sUlSvn3OuDvAacKb3fnUhZUsW0bzGDYFBkQJfATjLOZfhvf+0UBImh2h/X/zhvd8EbHLOTQLqAiry0YnmNb4SeMbbxeOFzrlfgRrAt4UTMSXkue7F63S9WuLG3m5fY+fc
YcDHwKUa8eyR3b7G3vuq3vsq3vsqwIfA9SrweRbN74shwCnOub2ccyWBxsC8Qs6ZyKJ5jRdjMyU45w7EDlT5pVBTJr881724HMl7tcSNuShf44eA8sDLkZFmhtdBFFGL8jWWfIrmdfbez3POjQRmAVnAa977HLcpyb9F+bP8OPCGc242Nq18t/dep9PlgXPuPaAZUME5lwY8DBSDPa976ngnIiKSpOJ1ul5ERETySUVeREQkSanIi4iIJCkVeRERkSSlIi8iIpKkVORFAoicODcz21uVXB67sQC+3xvOuV8j32uGc+7EPXiO15xzNSMf37fT577Ob8bI8+x4XeZETjQru5vH13POnVUQ31skGWkLnUgAzrmN3vvSBf3YXJ7jDWCY9/5D51xroIf3vk4+ni/fmXb3vM65N4GfvPdP5vL4K4CG3vsbCzqLSDLQSF4kDjjnSjvnxkZG2bOdc/86rc45d7BzblK2ke4pkftbO+cmR752sHNud8V3ElAt8rW3R55rjnPu1sh9pZxzn0fOBZ/jnOsYuX+Cc66hc+4ZYJ9IjoGRz22MvH8/+8g6MoNwvnOuqHOuu3NuqrNzsLtF8bJMJnL4hnOukXPua+fcd5H3R0c6rz0GdIxk6RjJ3j/yfb7L6XUUSSVx2fFOJAXs45ybGfn4V+BCoJ33fr1zrgIwxTn3mf/nVFtnYJT3/knnXFGgZOSxDwCtvPebnHN3A7djxW9XzgVmO+caYB2zGmMdyr5xzk3Ezgxf5r0/G8A5t1/2L/be3+Ocu9F7Xy+H5x4EdASGR4pwS+A67ICjdd77451zewNfOedGe+9/zSlg5N/XEng9ctd84NRI57VWwFPe+/Odcw+RbSTvnHsKGOe97xKZ6v/WOTcm0rNeJOWoyIuEsSV7kXTOFQOecs6dirVdrQQcCKzI9jVTgf6Rx37qvZ/pnGsK1MSKJkBxbASck+7OuQeAVVjRbQl8sqMAOuc+Bk4BRgI9nHPPYlP8X+bh3zUCeDFSyM8AJnnvt0QuEdRxzl0Qedx+QHXsD5zsdvzxUwWYDnyR7fFvOueqY6duFdvF928NnOecuzNyuwRwGOpTLylKRV4kPlwMVAQaeO+3O+cWYQXqL977SZE/As4G3nbOdQfWAl947ztF8T3+473/cMeNyIj4X7z3P0VG+WcBT0dG3LnNDGT/2nTn3ATs2NGOwHs7vh1wk/d+1G6eYov3vl5k9mAYcAPwItYXfbz3vl1kkeKEXXy9A8733v8YTV6RZKdr8iLxYT9gZaTANwcO3/kBzrnDI495FZvGrg9MAU52zu24xl7SOXdUlN9zEtA28jWlgHbAl865Q4DN3vt3gB6R77Oz7ZEZhZwMwi4DnIIdaELk/XU7vsY5d1Tke+bIe78OuBm4M/I1+wFLI5++IttDNwBlst0eBdzkItMazrnjdvU9RFKBirxIfBgINHTOTcNG9fNzeEwzYKZz7jvgfKCX934VVvTec87Nwop+jWi+ofd+BvAGdt73N9jJbN8Bx2LXsmcC9wNP5PDl/YBZOxbe7WQ0cCowxnu/LXLfa8BcYIZzbg7wCruZSYxk+R471vQ5bFbhK+wUtB3GAzV3LLzDRvzFItnmRG6LpCxtoRMREUlSGsmLiIgkKRV5ERGRJKUiLyIikqRU5EVERJKUiryIiEiSUpEXERFJUiryIiIiSUpFXkREJEn9P8oTao4AvRRWAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 576x576 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "plt.figure(figsize=(8, 8))\n",
    "plt.plot(fpr, tpr, 'b', label='Val AUC = %0.4f' % roc_auc)\n",
    "plt.ylim(0, 1)\n",
    "plt.xlim(0, 1)\n",
    "plt.legend(loc='best')\n",
    "plt.title('ROC')\n",
    "plt.ylabel('True Positive Rate')\n",
    "plt.xlabel('False Positive Rate')\n",
    "# draw the diagonal reference line (a random classifier)\n",
    "plt.plot([0, 1], [0, 1], 'r--')\n",
    "plt.show()"
   ]
  },
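  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The `fpr`, `tpr` and `roc_auc` plotted above are produced earlier in the notebook. As a hedged, self-contained sketch of how they are typically obtained with scikit-learn (the names `y_val` and `val_pred` are assumptions, not the notebook's own variables):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from sklearn.metrics import roc_curve, auc\n",
    "\n",
    "# y_val: true 0/1 labels, val_pred: predicted probabilities (assumed names)\n",
    "fpr, tpr, thresholds = roc_curve(y_val, val_pred)\n",
    "roc_auc = auc(fpr, tpr)  # same value as roc_auc_score(y_val, val_pred)"
   ]
  },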
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(80000, 124)\n"
     ]
    }
   ],
   "source": [
    "print(X_test.shape)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Index 0, threshold 0.0\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.00      0.00      0.00     63957\n",
      "           1       0.20      1.00      0.33     16043\n",
      "\n",
      "    accuracy                           0.20     80000\n",
      "   macro avg       0.10      0.50      0.17     80000\n",
      "weighted avg       0.04      0.20      0.07     80000\n",
      "\n",
      "Index 1, threshold 0.1\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.94      0.31      0.46     63957\n",
      "           1       0.25      0.92      0.39     16043\n",
      "\n",
      "    accuracy                           0.43     80000\n",
      "   macro avg       0.60      0.62      0.43     80000\n",
      "weighted avg       0.80      0.43      0.45     80000\n",
      "\n",
      "Index 2, threshold 0.2\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.89      0.66      0.76     63957\n",
      "           1       0.33      0.68      0.45     16043\n",
      "\n",
      "    accuracy                           0.66     80000\n",
      "   macro avg       0.61      0.67      0.60     80000\n",
      "weighted avg       0.78      0.66      0.69     80000\n",
      "\n",
      "Index 3, threshold 0.3\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.85      0.85      0.85     63957\n",
      "           1       0.41      0.42      0.42     16043\n",
      "\n",
      "    accuracy                           0.76     80000\n",
      "   macro avg       0.63      0.64      0.64     80000\n",
      "weighted avg       0.77      0.76      0.76     80000\n",
      "\n",
      "Index 4, threshold 0.4\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.83      0.94      0.88     63957\n",
      "           1       0.50      0.23      0.31     16043\n",
      "\n",
      "    accuracy                           0.80     80000\n",
      "   macro avg       0.66      0.59      0.60     80000\n",
      "weighted avg       0.76      0.80      0.77     80000\n",
      "\n",
      "Index 5, threshold 0.5\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.81      0.98      0.89     63957\n",
      "           1       0.58      0.10      0.18     16043\n",
      "\n",
      "    accuracy                           0.81     80000\n",
      "   macro avg       0.70      0.54      0.53     80000\n",
      "weighted avg       0.77      0.81      0.75     80000\n",
      "\n",
      "Index 6, threshold 0.6\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.80      1.00      0.89     63957\n",
      "           1       0.68      0.03      0.06     16043\n",
      "\n",
      "    accuracy                           0.80     80000\n",
      "   macro avg       0.74      0.51      0.48     80000\n",
      "weighted avg       0.78      0.80      0.72     80000\n",
      "\n",
      "Index 7, threshold 0.7\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.80      1.00      0.89     63957\n",
      "           1       0.77      0.00      0.01     16043\n",
      "\n",
      "    accuracy                           0.80     80000\n",
      "   macro avg       0.79      0.50      0.45     80000\n",
      "weighted avg       0.79      0.80      0.71     80000\n",
      "\n",
      "Index 8, threshold 0.8\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.80      1.00      0.89     63957\n",
      "           1       0.50      0.00      0.00     16043\n",
      "\n",
      "    accuracy                           0.80     80000\n",
      "   macro avg       0.65      0.50      0.44     80000\n",
      "weighted avg       0.74      0.80      0.71     80000\n",
      "\n",
      "Index 9, threshold 0.9\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.80      1.00      0.89     63957\n",
      "           1       0.00      0.00      0.00     16043\n",
      "\n",
      "    accuracy                           0.80     80000\n",
      "   macro avg       0.40      0.50      0.44     80000\n",
      "weighted avg       0.64      0.80      0.71     80000\n",
      "\n",
      "Index 10, threshold 1.0\n",
      "              precision    recall  f1-score   support\n",
      "\n",
      "           0       0.80      1.00      0.89     63957\n",
      "           1       0.00      0.00      0.00     16043\n",
      "\n",
      "    accuracy                           0.80     80000\n",
      "   macro avg       0.40      0.50      0.44     80000\n",
      "weighted avg       0.64      0.80      0.71     80000\n",
      "\n"
     ]
    }
   ],
   "source": [
    "from sklearn.metrics import classification_report\n",
    "\n",
    "# preds holds the hard 0/1 predictions at thresholds 0.0, 0.1, ..., 1.0\n",
    "for i, pred in enumerate(preds):\n",
    "    print(f\"Index {i}, threshold {0.1 * i:.1f}\")\n",
    "    print(classification_report(y_test, pred))"
   ]
  },
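  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Reading eleven classification reports to pick a cutoff is tedious; the threshold can also be chosen programmatically. A minimal sketch, assuming the labels `y_test` and the continuous probability scores `y_pred` are in scope, scoring each candidate cutoff by F1 on the positive class:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.metrics import f1_score\n",
    "\n",
    "# binarize y_pred at each cutoff and keep the one with the best F1\n",
    "thresholds = np.arange(0.0, 1.01, 0.1)\n",
    "f1s = [f1_score(y_test, (y_pred >= t).astype(int)) for t in thresholds]\n",
    "best_t = thresholds[int(np.argmax(f1s))]\n",
    "print('best threshold:', round(float(best_t), 1))"
   ]
  },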
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.19931165968550144\n",
      "0.006265826579031761\n",
      "0.8249600642667576\n"
     ]
    }
   ],
   "source": [
    "# sanity check: distribution of the predicted probabilities\n",
    "print(np.mean(y_pred))\n",
    "print(np.min(y_pred))\n",
    "print(np.max(y_pred))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [],
   "source": [
    "# predict on the test set and write the submission file\n",
    "pred_t = model.predict(X_t, num_iteration=model.best_iteration)\n",
    "df = pd.DataFrame()\n",
    "df['id'] = data_test_a['id'].astype(int)\n",
    "df['isDefault'] = pred_t\n",
    "df.to_csv('./processed_data/submit_first.csv', index=False)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.9"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
