{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Ensemble Learning Case Study 1 (Happiness Prediction)\n",
    "\n",
    "### Background\n",
    "\n",
    "This case study is a baseline for a data-mining competition on predicting happiness. The data come from the official Chinese General Social Survey (CGSS) and contain 139 feature dimensions, covering individual variables (gender, age, region, occupation, health, marriage, political affiliation, etc.), family variables (parents, spouse, children, family capital, etc.), and social attitudes (fairness, trust, public services).\n",
    "\n",
    "\n",
    "### Data\n",
    "The task is to predict individual happiness from these **139** features using **8,000**-odd training samples. The target takes the values 1, 2, 3, 4, 5, where 1 is the lowest happiness and 5 the highest.\n",
    "Because there are many variables and some have complex relationships, the data are released in a full version and a simplified version: you can start with the simplified version to get familiar with the task, then mine the full version for more information. Here I use the full version directly. The competition also provides an index file mapping each variable to its questionnaire item and the meaning of its values, and a survey file containing the original questionnaire for background reference.\n",
    "\n",
    "### Evaluation metric\n",
    "The final evaluation metric is the mean squared error (MSE):\n",
    "$$Score = \\frac{1}{n} \\sum_{i=1}^{n} (y_i - y_i^*)^2$$\n"
   ]
  },
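  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a minimal sketch of the metric (the arrays below are hypothetical, not taken from the competition data), this score is exactly what scikit-learn's `mean_squared_error` computes:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "from sklearn.metrics import mean_squared_error\n",
    "\n",
    "y_true = np.array([4, 5, 3, 4])          # hypothetical ground-truth happiness labels\n",
    "y_pred = np.array([3.8, 4.6, 3.2, 4.1])  # hypothetical model predictions\n",
    "score = mean_squared_error(y_true, y_pred)  # (1/n) * sum((y_i - y_i*)^2)\n",
    "```\n"
   ]
  },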
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Import packages"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "import time \n",
    "import pandas as pd\n",
    "import numpy as np\n",
    "import seaborn as sns\n",
    "from sklearn.linear_model import LogisticRegression\n",
    "from sklearn.svm import SVC, LinearSVC\n",
    "from sklearn.ensemble import RandomForestClassifier\n",
    "from sklearn.neighbors import KNeighborsClassifier\n",
    "from sklearn.naive_bayes import GaussianNB\n",
    "from sklearn.linear_model import Perceptron\n",
    "from sklearn.linear_model import SGDClassifier\n",
    "from sklearn.tree import DecisionTreeClassifier\n",
    "from sklearn import metrics\n",
    "from datetime import datetime\n",
    "import matplotlib.pyplot as plt\n",
    "from sklearn.metrics import roc_auc_score, roc_curve, mean_squared_error,mean_absolute_error, f1_score\n",
    "import lightgbm as lgb\n",
    "import xgboost as xgb\n",
    "from sklearn.ensemble import RandomForestRegressor as rfr\n",
    "from sklearn.ensemble import ExtraTreesRegressor as etr\n",
    "from sklearn.linear_model import BayesianRidge as br\n",
    "from sklearn.ensemble import GradientBoostingRegressor as gbr\n",
    "from sklearn.linear_model import Ridge\n",
    "from sklearn.linear_model import Lasso\n",
    "from sklearn.linear_model import LinearRegression as lr\n",
    "from sklearn.linear_model import ElasticNet as en\n",
    "from sklearn.kernel_ridge import KernelRidge as kr\n",
    "from sklearn.model_selection import  KFold, StratifiedKFold,GroupKFold, RepeatedKFold\n",
    "from sklearn.model_selection import train_test_split\n",
    "from sklearn.model_selection import GridSearchCV\n",
    "from sklearn import preprocessing\n",
    "import logging\n",
    "import warnings\n",
    "\n",
    "warnings.filterwarnings('ignore') # suppress warnings"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Load the datasets"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [],
   "source": [
    "train = pd.read_csv(\"train.csv\", parse_dates=['survey_time'],encoding='latin-1') \n",
    "test = pd.read_csv(\"test.csv\", parse_dates=['survey_time'],encoding='latin-1') # latin-1 is backward compatible with ASCII\n",
    "train = train[train[\"happiness\"]!=-8].reset_index(drop=True) # drop rows where \"happiness\" is -8 (invalid answer)\n",
    "train_data_copy = train.copy()\n",
    "target_col = \"happiness\" # target column\n",
    "target = train_data_copy[target_col]\n",
    "del train_data_copy[target_col] # remove the target column from the features\n",
    "\n",
    "data = pd.concat([train_data_copy,test],axis=0,ignore_index=True)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Inspect basic statistics of the data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "count    7988.000000\n",
       "mean        3.867927\n",
       "std         0.818717\n",
       "min         1.000000\n",
       "25%         4.000000\n",
       "50%         4.000000\n",
       "75%         4.000000\n",
       "max         5.000000\n",
       "Name: happiness, dtype: float64"
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train.happiness.describe() # summary statistics of the target"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Data preprocessing\n",
    "\n",
    "First we deal with the negative values that appear throughout the data. Since the only negative values are -1, -2, -3 and -8, we handle each of them separately, as implemented below:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [],
   "source": [
    "# add count features for problematic (negative) answers\n",
    "# the csv contains negative values (-1, -2, -3, -8); treat them as problematic answers but do not drop them\n",
    "def count_neg(row, pred):\n",
    "    # count the integer entries in the row that satisfy the predicate\n",
    "    return len([x for x in row.values if type(x) == int and pred(x)])\n",
    "\n",
    "# count problematic answers per row\n",
    "data['neg1'] = data.apply(lambda row: count_neg(row, lambda x: x < 0), axis=1)\n",
    "data.loc[data['neg1'] > 20, 'neg1'] = 20  # clip at 20 occurrences to smooth the feature\n",
    "\n",
    "data['neg2'] = data.apply(lambda row: count_neg(row, lambda x: x == -8), axis=1)\n",
    "data['neg3'] = data.apply(lambda row: count_neg(row, lambda x: x == -1), axis=1)\n",
    "data['neg4'] = data.apply(lambda row: count_neg(row, lambda x: x == -2), axis=1)\n",
    "data['neg5'] = data.apply(lambda row: count_neg(row, lambda x: x == -3), axis=1)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Next we fill in missing values using fillna(value), choosing the fill value case by case: most missing entries are treated as 0, hukou_loc is set to 1 (its minimum valid value), and family_income is set to 66365, the mean family income across all households. Part of the implementation:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "# fill missing values: 25 columns contain NaNs; 4 are dropped and the remaining 21 are filled here\n",
    "# the columns below have missing entries; fill each as appropriate\n",
    "data['work_status'] = data['work_status'].fillna(0)\n",
    "data['work_yr'] = data['work_yr'].fillna(0)\n",
    "data['work_manage'] = data['work_manage'].fillna(0)\n",
    "data['work_type'] = data['work_type'].fillna(0)\n",
    "\n",
    "data['edu_yr'] = data['edu_yr'].fillna(0)\n",
    "data['edu_status'] = data['edu_status'].fillna(0)\n",
    "\n",
    "data['s_work_type'] = data['s_work_type'].fillna(0)\n",
    "data['s_work_status'] = data['s_work_status'].fillna(0)\n",
    "data['s_political'] = data['s_political'].fillna(0)\n",
    "data['s_hukou'] = data['s_hukou'].fillna(0)\n",
    "data['s_income'] = data['s_income'].fillna(0)\n",
    "data['s_birth'] = data['s_birth'].fillna(0)\n",
    "data['s_edu'] = data['s_edu'].fillna(0)\n",
    "data['s_work_exper'] = data['s_work_exper'].fillna(0)\n",
    "\n",
    "data['minor_child'] = data['minor_child'].fillna(0)\n",
    "data['marital_now'] = data['marital_now'].fillna(0)\n",
    "data['marital_1st'] = data['marital_1st'].fillna(0)\n",
    "data['social_neighbor']=data['social_neighbor'].fillna(0)\n",
    "data['social_friend']=data['social_friend'].fillna(0)\n",
    "data['hukou_loc']=data['hukou_loc'].fillna(1) # minimum valid value is 1 (hukou registration location)\n",
    "data['family_income']=data['family_income'].fillna(66365) # mean family income after removing invalid values"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Some fields with special formats need extra handling, notably the time-related information, which we process in two steps. First we compute each respondent's actual age: the raw table only records the birth year and the survey time, so age is derived from the two. Then we bin the \"continuous\" age into brackets, here six intervals. Implementation:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "scrolled": true
   },
   "outputs": [],
   "source": [
    "# feature count: 144+1 = 145\n",
    "# special handling for time-related columns\n",
    "# (see happiness_index.xlsx for variable definitions)\n",
    "data['survey_time'] = pd.to_datetime(data['survey_time'], format='%Y-%m-%d',errors='coerce') # errors='coerce' turns unparsable dates into NaT instead of raising\n",
    "data['survey_time'] = data['survey_time'].dt.year # keep only the year, for computing age\n",
    "data['age'] = data['survey_time']-data['birth']\n",
    "# print(data['age'],data['survey_time'],data['birth'])\n",
    "# age binning 145+1 = 146\n",
    "bins = [0,17,26,34,50,63,100]\n",
    "data['age_bin'] = pd.cut(data['age'], bins, labels=[0,1,2,3,4,5])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A third approach is to fill values using real-world common sense, subjectively: for example, a negative \"religion\" value is taken to mean \"not religious\", and \"religion_freq\" is then set to 1, i.e. never attended religious activities. This is the approach I used most in this step; essentially I filled the defaults the way I would answer the questionnaire myself."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [],
   "source": [
    "# religion\n",
    "data.loc[data['religion']<0,'religion'] = 1 # 1 = not religious\n",
    "data.loc[data['religion_freq']<0,'religion_freq'] = 1 # 1 = never attended religious activities\n",
    "# education\n",
    "data.loc[data['edu']<0,'edu'] = 4 # junior high school\n",
    "data.loc[data['edu_status']<0,'edu_status'] = 0\n",
    "data.loc[data['edu_yr']<0,'edu_yr'] = 0\n",
    "# personal income\n",
    "data.loc[data['income']<0,'income'] = 0 # assume no income\n",
    "# political affiliation\n",
    "data.loc[data['political']<0,'political'] = 1 # assume ordinary citizen\n",
    "# weight (recorded in jin, i.e. half-kilograms)\n",
    "data.loc[(data['weight_jin']<=80)&(data['height_cm']>=160),'weight_jin']= data['weight_jin']*2\n",
    "data.loc[data['weight_jin']<=60,'weight_jin']= data['weight_jin']*2 # personal judgement: no adult weighs 60 jin (30 kg), so assume the value was halved\n",
    "# height\n",
    "data.loc[data['height_cm']<150,'height_cm'] = 150 # realistic minimum adult height\n",
    "# health\n",
    "data.loc[data['health']<0,'health'] = 4 # assume fairly healthy\n",
    "data.loc[data['health_problem']<0,'health_problem'] = 4\n",
    "# depression\n",
    "data.loc[data['depression']<0,'depression'] = 4 # assume 'rarely'\n",
    "# media usage\n",
    "data.loc[data['media_1']<0,'media_1'] = 1 # assume 'never'\n",
    "data.loc[data['media_2']<0,'media_2'] = 1\n",
    "data.loc[data['media_3']<0,'media_3'] = 1\n",
    "data.loc[data['media_4']<0,'media_4'] = 1\n",
    "data.loc[data['media_5']<0,'media_5'] = 1\n",
    "data.loc[data['media_6']<0,'media_6'] = 1\n",
    "# leisure activities\n",
    "data.loc[data['leisure_1']<0,'leisure_1'] = 1 # filled according to my own judgement\n",
    "data.loc[data['leisure_2']<0,'leisure_2'] = 5\n",
    "data.loc[data['leisure_3']<0,'leisure_3'] = 3"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For the remaining leisure features we correct invalid values with the mode (implemented with mode()); since these are categorical survey answers, the mode is a reasonable fill. Reference implementation:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [],
   "source": [
    "data.loc[data['leisure_4']<0,'leisure_4'] = data['leisure_4'].mode()[0] # fill with the mode; mode() returns a Series, so take element 0 to avoid index-alignment surprises\n",
    "data.loc[data['leisure_5']<0,'leisure_5'] = data['leisure_5'].mode()[0]\n",
    "data.loc[data['leisure_6']<0,'leisure_6'] = data['leisure_6'].mode()[0]\n",
    "data.loc[data['leisure_7']<0,'leisure_7'] = data['leisure_7'].mode()[0]\n",
    "data.loc[data['leisure_8']<0,'leisure_8'] = data['leisure_8'].mode()[0]\n",
    "data.loc[data['leisure_9']<0,'leisure_9'] = data['leisure_9'].mode()[0]\n",
    "data.loc[data['leisure_10']<0,'leisure_10'] = data['leisure_10'].mode()[0]\n",
    "data.loc[data['leisure_11']<0,'leisure_11'] = data['leisure_11'].mode()[0]\n",
    "data.loc[data['leisure_12']<0,'leisure_12'] = data['leisure_12'].mode()[0]\n",
    "data.loc[data['socialize']<0,'socialize'] = 2 # rarely\n",
    "data.loc[data['relax']<0,'relax'] = 4 # often\n",
    "data.loc[data['learn']<0,'learn'] = 1 # never\n",
    "# social contact\n",
    "data.loc[data['social_neighbor']<0,'social_neighbor'] = 0\n",
    "data.loc[data['social_friend']<0,'social_friend'] = 0\n",
    "data.loc[data['socia_outing']<0,'socia_outing'] = 1\n",
    "data.loc[data['neighbor_familiarity']<0,'neighbor_familiarity'] = 4\n",
    "# perceived social fairness\n",
    "data.loc[data['equity']<0,'equity'] = 4\n",
    "# social class\n",
    "data.loc[data['class_10_before']<0,'class_10_before'] = 3\n",
    "data.loc[data['class']<0,'class'] = 5\n",
    "data.loc[data['class_10_after']<0,'class_10_after'] = 5\n",
    "data.loc[data['class_14']<0,'class_14'] = 2\n",
    "# work status\n",
    "data.loc[data['work_status']<0,'work_status'] = 0\n",
    "data.loc[data['work_yr']<0,'work_yr'] = 0\n",
    "data.loc[data['work_manage']<0,'work_manage'] = 0\n",
    "data.loc[data['work_type']<0,'work_type'] = 0\n",
    "# social insurance\n",
    "data.loc[data['insur_1']<0,'insur_1'] = 1\n",
    "data.loc[data['insur_2']<0,'insur_2'] = 1\n",
    "data.loc[data['insur_3']<0,'insur_3'] = 1\n",
    "data.loc[data['insur_4']<0,'insur_4'] = 1"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Continuous features are instead filled with the mean (implemented with mean()): family income is a continuous value, so the mode is no longer appropriate and missing values are filled with the mean. Reference implementation:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [],
   "source": [
    "# family situation\n",
    "family_income_mean = data['family_income'].mean()\n",
    "data.loc[data['family_income']<0,'family_income'] = family_income_mean\n",
    "data.loc[data['family_m']<0,'family_m'] = 2\n",
    "data.loc[data['family_status']<0,'family_status'] = 3\n",
    "data.loc[data['house']<0,'house'] = 1\n",
    "data.loc[data['car']<0,'car'] = 0\n",
    "data.loc[data['car']==2,'car'] = 0 # recode 2 as 0\n",
    "data.loc[data['son']<0,'son'] = 1\n",
    "data.loc[data['daughter']<0,'daughter'] = 0\n",
    "data.loc[data['minor_child']<0,'minor_child'] = 0\n",
    "# marriage\n",
    "data.loc[data['marital_1st']<0,'marital_1st'] = 0\n",
    "data.loc[data['marital_now']<0,'marital_now'] = 0\n",
    "# spouse\n",
    "data.loc[data['s_birth']<0,'s_birth'] = 0\n",
    "data.loc[data['s_edu']<0,'s_edu'] = 0\n",
    "data.loc[data['s_political']<0,'s_political'] = 0\n",
    "data.loc[data['s_hukou']<0,'s_hukou'] = 0\n",
    "data.loc[data['s_income']<0,'s_income'] = 0\n",
    "data.loc[data['s_work_type']<0,'s_work_type'] = 0\n",
    "data.loc[data['s_work_status']<0,'s_work_status'] = 0\n",
    "data.loc[data['s_work_exper']<0,'s_work_exper'] = 0\n",
    "# parents\n",
    "data.loc[data['f_birth']<0,'f_birth'] = 1945\n",
    "data.loc[data['f_edu']<0,'f_edu'] = 1\n",
    "data.loc[data['f_political']<0,'f_political'] = 1\n",
    "data.loc[data['f_work_14']<0,'f_work_14'] = 2\n",
    "data.loc[data['m_birth']<0,'m_birth'] = 1940\n",
    "data.loc[data['m_edu']<0,'m_edu'] = 1\n",
    "data.loc[data['m_political']<0,'m_political'] = 1\n",
    "data.loc[data['m_work_14']<0,'m_work_14'] = 2\n",
    "# socio-economic status compared with peers\n",
    "data.loc[data['status_peer']<0,'status_peer'] = 2\n",
    "# socio-economic status compared with 3 years ago\n",
    "data.loc[data['status_3_before']<0,'status_3_before'] = 2\n",
    "# views\n",
    "data.loc[data['view']<0,'view'] = 4\n",
    "# expected annual income\n",
    "data.loc[data['inc_ability']<=0,'inc_ability']= 2\n",
    "inc_exp_mean = data['inc_exp'].mean()\n",
    "data.loc[data['inc_exp']<=0,'inc_exp']= inc_exp_mean # fill with the mean\n",
    "\n",
    "# remaining features: fill with the mode\n",
    "for i in range(1,9+1):\n",
    "    data.loc[data['public_service_'+str(i)]<0,'public_service_'+str(i)] = data['public_service_'+str(i)].dropna().mode()[0]\n",
    "for i in range(1,13+1):\n",
    "    data.loc[data['trust_'+str(i)]<0,'trust_'+str(i)] = data['trust_'+str(i)].dropna().mode()[0]"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Data augmentation\n",
    "\n",
    "In this step we analyze the relationships between features to construct new ones. After some thought, I added the following features: age at first marriage, age at current marriage, whether remarried, spouse's age, age gap with spouse, various income ratios (income relative to the spouse's, expected income in ten years relative to current income, and so on), income-to-floor-area ratios (again including the expected-income variants), social class features (class in 10 years, class at age 14, and so on), plus aggregate leisure, public-service satisfaction, and trust indices. In addition, I normalized within each province, city and county: for example, the mean income within a region, and each individual's indicators relative to others in the same province, city or county. I also compared individuals with their age peers, e.g. income and health relative to people of the same age. Implementation:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [],
   "source": [
    "# age at first marriage 147\n",
    "data['marital_1stbir'] = data['marital_1st'] - data['birth'] \n",
    "# age at current marriage 148\n",
    "data['marital_nowtbir'] = data['marital_now'] - data['birth'] \n",
    "# whether remarried 149\n",
    "data['mar'] = data['marital_nowtbir'] - data['marital_1stbir']\n",
    "# spouse's age 150\n",
    "data['marital_sbir'] = data['marital_now']-data['s_birth']\n",
    "# age gap with spouse 151\n",
    "data['age_'] = data['marital_nowtbir'] - data['marital_sbir'] \n",
    "\n",
    "# income ratios 151+7 = 158\n",
    "data['income/s_income'] = data['income']/(data['s_income']+1)\n",
    "data['income+s_income'] = data['income']+(data['s_income']+1)\n",
    "data['income/family_income'] = data['income']/(data['family_income']+1)\n",
    "data['all_income/family_income'] = (data['income']+data['s_income'])/(data['family_income']+1)\n",
    "data['income/inc_exp'] = data['income']/(data['inc_exp']+1)\n",
    "data['family_income/m'] = data['family_income']/(data['family_m']+0.01)\n",
    "data['income/m'] = data['income']/(data['family_m']+0.01)\n",
    "\n",
    "# income / floor-area ratios 158+4 = 162\n",
    "data['income/floor_area'] = data['income']/(data['floor_area']+0.01)\n",
    "data['all_income/floor_area'] = (data['income']+data['s_income'])/(data['floor_area']+0.01)\n",
    "data['family_income/floor_area'] = data['family_income']/(data['floor_area']+0.01)\n",
    "data['floor_area/m'] = data['floor_area']/(data['family_m']+0.01)\n",
    "\n",
    "# class differences 162+3 = 165\n",
    "data['class_10_diff'] = (data['class_10_after'] - data['class'])\n",
    "data['class_diff'] = data['class'] - data['class_10_before']\n",
    "data['class_14_diff'] = data['class'] - data['class_14']\n",
    "# leisure index 166\n",
    "leisure_fea_lis = ['leisure_'+str(i) for i in range(1,13)]\n",
    "data['leisure_sum'] = data[leisure_fea_lis].sum(axis=1) #skew\n",
    "# public-service satisfaction index 167\n",
    "public_service_fea_lis = ['public_service_'+str(i) for i in range(1,10)]\n",
    "data['public_service_sum'] = data[public_service_fea_lis].sum(axis=1) #skew\n",
    "\n",
    "# trust index 168\n",
    "trust_fea_lis = ['trust_'+str(i) for i in range(1,14)]\n",
    "data['trust_sum'] = data[trust_fea_lis].sum(axis=1) #skew\n",
    "\n",
    "#province mean 168+13=181\n",
    "data['province_income_mean'] = data.groupby(['province'])['income'].transform('mean').values\n",
    "data['province_family_income_mean'] = data.groupby(['province'])['family_income'].transform('mean').values\n",
    "data['province_equity_mean'] = data.groupby(['province'])['equity'].transform('mean').values\n",
    "data['province_depression_mean'] = data.groupby(['province'])['depression'].transform('mean').values\n",
    "data['province_floor_area_mean'] = data.groupby(['province'])['floor_area'].transform('mean').values\n",
    "data['province_health_mean'] = data.groupby(['province'])['health'].transform('mean').values\n",
    "data['province_class_10_diff_mean'] = data.groupby(['province'])['class_10_diff'].transform('mean').values\n",
    "data['province_class_mean'] = data.groupby(['province'])['class'].transform('mean').values\n",
    "data['province_health_problem_mean'] = data.groupby(['province'])['health_problem'].transform('mean').values\n",
    "data['province_family_status_mean'] = data.groupby(['province'])['family_status'].transform('mean').values\n",
    "data['province_leisure_sum_mean'] = data.groupby(['province'])['leisure_sum'].transform('mean').values\n",
    "data['province_public_service_sum_mean'] = data.groupby(['province'])['public_service_sum'].transform('mean').values\n",
    "data['province_trust_sum_mean'] = data.groupby(['province'])['trust_sum'].transform('mean').values\n",
    "\n",
    "#city   mean 181+13=194\n",
    "data['city_income_mean'] = data.groupby(['city'])['income'].transform('mean').values\n",
    "data['city_family_income_mean'] = data.groupby(['city'])['family_income'].transform('mean').values\n",
    "data['city_equity_mean'] = data.groupby(['city'])['equity'].transform('mean').values\n",
    "data['city_depression_mean'] = data.groupby(['city'])['depression'].transform('mean').values\n",
    "data['city_floor_area_mean'] = data.groupby(['city'])['floor_area'].transform('mean').values\n",
    "data['city_health_mean'] = data.groupby(['city'])['health'].transform('mean').values\n",
    "data['city_class_10_diff_mean'] = data.groupby(['city'])['class_10_diff'].transform('mean').values\n",
    "data['city_class_mean'] = data.groupby(['city'])['class'].transform('mean').values\n",
    "data['city_health_problem_mean'] = data.groupby(['city'])['health_problem'].transform('mean').values\n",
    "data['city_family_status_mean'] = data.groupby(['city'])['family_status'].transform('mean').values\n",
    "data['city_leisure_sum_mean'] = data.groupby(['city'])['leisure_sum'].transform('mean').values\n",
    "data['city_public_service_sum_mean'] = data.groupby(['city'])['public_service_sum'].transform('mean').values\n",
    "data['city_trust_sum_mean'] = data.groupby(['city'])['trust_sum'].transform('mean').values\n",
    "\n",
    "#county  mean 194 + 13 = 207\n",
    "data['county_income_mean'] = data.groupby(['county'])['income'].transform('mean').values\n",
    "data['county_family_income_mean'] = data.groupby(['county'])['family_income'].transform('mean').values\n",
    "data['county_equity_mean'] = data.groupby(['county'])['equity'].transform('mean').values\n",
    "data['county_depression_mean'] = data.groupby(['county'])['depression'].transform('mean').values\n",
    "data['county_floor_area_mean'] = data.groupby(['county'])['floor_area'].transform('mean').values\n",
    "data['county_health_mean'] = data.groupby(['county'])['health'].transform('mean').values\n",
    "data['county_class_10_diff_mean'] = data.groupby(['county'])['class_10_diff'].transform('mean').values\n",
    "data['county_class_mean'] = data.groupby(['county'])['class'].transform('mean').values\n",
    "data['county_health_problem_mean'] = data.groupby(['county'])['health_problem'].transform('mean').values\n",
    "data['county_family_status_mean'] = data.groupby(['county'])['family_status'].transform('mean').values\n",
    "data['county_leisure_sum_mean'] = data.groupby(['county'])['leisure_sum'].transform('mean').values\n",
    "data['county_public_service_sum_mean'] = data.groupby(['county'])['public_service_sum'].transform('mean').values\n",
    "data['county_trust_sum_mean'] = data.groupby(['county'])['trust_sum'].transform('mean').values\n",
    "\n",
    "# ratios relative to the same province 207+13 = 220\n",
    "data['income/province'] = data['income']/(data['province_income_mean'])                                      \n",
    "data['family_income/province'] = data['family_income']/(data['province_family_income_mean'])   \n",
    "data['equity/province'] = data['equity']/(data['province_equity_mean'])       \n",
    "data['depression/province'] = data['depression']/(data['province_depression_mean'])                                                \n",
    "data['floor_area/province'] = data['floor_area']/(data['province_floor_area_mean'])\n",
    "data['health/province'] = data['health']/(data['province_health_mean'])\n",
    "data['class_10_diff/province'] = data['class_10_diff']/(data['province_class_10_diff_mean'])\n",
    "data['class/province'] = data['class']/(data['province_class_mean'])\n",
    "data['health_problem/province'] = data['health_problem']/(data['province_health_problem_mean'])\n",
    "data['family_status/province'] = data['family_status']/(data['province_family_status_mean'])\n",
    "data['leisure_sum/province'] = data['leisure_sum']/(data['province_leisure_sum_mean'])\n",
    "data['public_service_sum/province'] = data['public_service_sum']/(data['province_public_service_sum_mean'])\n",
    "data['trust_sum/province'] = data['trust_sum']/(data['province_trust_sum_mean']+1)\n",
    "\n",
    "# ratios relative to the same city 220+13 = 233\n",
    "data['income/city'] = data['income']/(data['city_income_mean'])                                      \n",
    "data['family_income/city'] = data['family_income']/(data['city_family_income_mean'])   \n",
    "data['equity/city'] = data['equity']/(data['city_equity_mean'])       \n",
    "data['depression/city'] = data['depression']/(data['city_depression_mean'])                                                \n",
    "data['floor_area/city'] = data['floor_area']/(data['city_floor_area_mean'])\n",
    "data['health/city'] = data['health']/(data['city_health_mean'])\n",
    "data['class_10_diff/city'] = data['class_10_diff']/(data['city_class_10_diff_mean'])\n",
    "data['class/city'] = data['class']/(data['city_class_mean'])\n",
    "data['health_problem/city'] = data['health_problem']/(data['city_health_problem_mean'])\n",
    "data['family_status/city'] = data['family_status']/(data['city_family_status_mean'])\n",
    "data['leisure_sum/city'] = data['leisure_sum']/(data['city_leisure_sum_mean'])\n",
    "data['public_service_sum/city'] = data['public_service_sum']/(data['city_public_service_sum_mean'])\n",
    "data['trust_sum/city'] = data['trust_sum']/(data['city_trust_sum_mean'])\n",
    "\n",
    "# ratios relative to the same county 233+13 = 246\n",
    "data['income/county'] = data['income']/(data['county_income_mean'])                                      \n",
    "data['family_income/county'] = data['family_income']/(data['county_family_income_mean'])   \n",
    "data['equity/county'] = data['equity']/(data['county_equity_mean'])       \n",
    "data['depression/county'] = data['depression']/(data['county_depression_mean'])                                                \n",
    "data['floor_area/county'] = data['floor_area']/(data['county_floor_area_mean'])\n",
    "data['health/county'] = data['health']/(data['county_health_mean'])\n",
    "data['class_10_diff/county'] = data['class_10_diff']/(data['county_class_10_diff_mean'])\n",
    "data['class/county'] = data['class']/(data['county_class_mean'])\n",
    "data['health_problem/county'] = data['health_problem']/(data['county_health_problem_mean'])\n",
    "data['family_status/county'] = data['family_status']/(data['county_family_status_mean'])\n",
    "data['leisure_sum/county'] = data['leisure_sum']/(data['county_leisure_sum_mean'])\n",
    "data['public_service_sum/county'] = data['public_service_sum']/(data['county_public_service_sum_mean'])\n",
    "data['trust_sum/county'] = data['trust_sum']/(data['county_trust_sum_mean'])\n",
    "\n",
    "#age   mean 246+ 13 =259\n",
    "data['age_income_mean'] = data.groupby(['age'])['income'].transform('mean').values\n",
    "data['age_family_income_mean'] = data.groupby(['age'])['family_income'].transform('mean').values\n",
    "data['age_equity_mean'] = data.groupby(['age'])['equity'].transform('mean').values\n",
    "data['age_depression_mean'] = data.groupby(['age'])['depression'].transform('mean').values\n",
    "data['age_floor_area_mean'] = data.groupby(['age'])['floor_area'].transform('mean').values\n",
    "data['age_health_mean'] = data.groupby(['age'])['health'].transform('mean').values\n",
    "data['age_class_10_diff_mean'] = data.groupby(['age'])['class_10_diff'].transform('mean').values\n",
    "data['age_class_mean'] = data.groupby(['age'])['class'].transform('mean').values\n",
    "data['age_health_problem_mean'] = data.groupby(['age'])['health_problem'].transform('mean').values\n",
    "data['age_family_status_mean'] = data.groupby(['age'])['family_status'].transform('mean').values\n",
    "data['age_leisure_sum_mean'] = data.groupby(['age'])['leisure_sum'].transform('mean').values\n",
    "data['age_public_service_sum_mean'] = data.groupby(['age'])['public_service_sum'].transform('mean').values\n",
    "data['age_trust_sum_mean'] = data.groupby(['age'])['trust_sum'].transform('mean').values\n",
    "\n",
    "# ratios relative to age peers 259+13 = 272\n",
    "data['income/age'] = data['income']/(data['age_income_mean'])                                      \n",
    "data['family_income/age'] = data['family_income']/(data['age_family_income_mean'])   \n",
    "data['equity/age'] = data['equity']/(data['age_equity_mean'])       \n",
    "data['depression/age'] = data['depression']/(data['age_depression_mean'])                                                \n",
    "data['floor_area/age'] = data['floor_area']/(data['age_floor_area_mean'])\n",
    "data['health/age'] = data['health']/(data['age_health_mean'])\n",
    "data['class_10_diff/age'] = data['class_10_diff']/(data['age_class_10_diff_mean'])\n",
    "data['class/age'] = data['class']/(data['age_class_mean'])\n",
    "data['health_problem/age'] = data['health_problem']/(data['age_health_problem_mean'])\n",
    "data['family_status/age'] = data['family_status']/(data['age_family_status_mean'])\n",
    "data['leisure_sum/age'] = data['leisure_sum']/(data['age_leisure_sum_mean'])\n",
    "data['public_service_sum/age'] = data['public_service_sum']/(data['age_public_service_sum_mean'])\n",
    "data['trust_sum/age'] = data['trust_sum']/(data['age_trust_sum_mean'])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "After these operations, the features grow from the original 131 dimensions to 272. Next we move on to feature engineering, model training, and model ensembling."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "shape (10956, 272)\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>id</th>\n",
       "      <th>survey_type</th>\n",
       "      <th>province</th>\n",
       "      <th>city</th>\n",
       "      <th>county</th>\n",
       "      <th>survey_time</th>\n",
       "      <th>gender</th>\n",
       "      <th>birth</th>\n",
       "      <th>nationality</th>\n",
       "      <th>religion</th>\n",
       "      <th>...</th>\n",
       "      <th>depression/age</th>\n",
       "      <th>floor_area/age</th>\n",
       "      <th>health/age</th>\n",
       "      <th>class_10_diff/age</th>\n",
       "      <th>class/age</th>\n",
       "      <th>health_problem/age</th>\n",
       "      <th>family_status/age</th>\n",
       "      <th>leisure_sum/age</th>\n",
       "      <th>public_service_sum/age</th>\n",
       "      <th>trust_sum/age</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "      <td>12</td>\n",
       "      <td>32</td>\n",
       "      <td>59</td>\n",
       "      <td>2015</td>\n",
       "      <td>1</td>\n",
       "      <td>1959</td>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "      <td>...</td>\n",
       "      <td>1.285211</td>\n",
       "      <td>0.410351</td>\n",
       "      <td>0.848837</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.683307</td>\n",
       "      <td>0.521429</td>\n",
       "      <td>0.733668</td>\n",
       "      <td>0.724620</td>\n",
       "      <td>0.666638</td>\n",
       "      <td>0.925941</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>1</td>\n",
       "      <td>2</td>\n",
       "      <td>2</td>\n",
       "      <td>18</td>\n",
       "      <td>52</td>\n",
       "      <td>85</td>\n",
       "      <td>2015</td>\n",
       "      <td>1</td>\n",
       "      <td>1992</td>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "      <td>...</td>\n",
       "      <td>0.733333</td>\n",
       "      <td>0.952824</td>\n",
       "      <td>1.179337</td>\n",
       "      <td>1.012552</td>\n",
       "      <td>1.344444</td>\n",
       "      <td>0.891344</td>\n",
       "      <td>1.359551</td>\n",
       "      <td>1.011792</td>\n",
       "      <td>1.130778</td>\n",
       "      <td>1.188442</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>2</td>\n",
       "      <td>3</td>\n",
       "      <td>2</td>\n",
       "      <td>29</td>\n",
       "      <td>83</td>\n",
       "      <td>126</td>\n",
       "      <td>2015</td>\n",
       "      <td>2</td>\n",
       "      <td>1967</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>1.343537</td>\n",
       "      <td>0.972328</td>\n",
       "      <td>1.150485</td>\n",
       "      <td>1.190955</td>\n",
       "      <td>1.195762</td>\n",
       "      <td>1.055679</td>\n",
       "      <td>1.190955</td>\n",
       "      <td>0.966470</td>\n",
       "      <td>1.193204</td>\n",
       "      <td>0.803693</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>3</td>\n",
       "      <td>4</td>\n",
       "      <td>2</td>\n",
       "      <td>10</td>\n",
       "      <td>28</td>\n",
       "      <td>51</td>\n",
       "      <td>2015</td>\n",
       "      <td>2</td>\n",
       "      <td>1943</td>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "      <td>...</td>\n",
       "      <td>1.111663</td>\n",
       "      <td>0.642329</td>\n",
       "      <td>1.276353</td>\n",
       "      <td>4.977778</td>\n",
       "      <td>1.199143</td>\n",
       "      <td>1.188329</td>\n",
       "      <td>1.162630</td>\n",
       "      <td>0.899346</td>\n",
       "      <td>1.153810</td>\n",
       "      <td>1.300950</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <td>4</td>\n",
       "      <td>5</td>\n",
       "      <td>1</td>\n",
       "      <td>7</td>\n",
       "      <td>18</td>\n",
       "      <td>36</td>\n",
       "      <td>2015</td>\n",
       "      <td>2</td>\n",
       "      <td>1994</td>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "      <td>...</td>\n",
       "      <td>0.750000</td>\n",
       "      <td>0.587284</td>\n",
       "      <td>1.177106</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.236957</td>\n",
       "      <td>1.116803</td>\n",
       "      <td>1.093645</td>\n",
       "      <td>1.045313</td>\n",
       "      <td>0.728161</td>\n",
       "      <td>1.117428</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>5 rows × 272 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "   id  survey_type  province  city  county  survey_time  gender  birth  \\\n",
       "0   1            1        12    32      59         2015       1   1959   \n",
       "1   2            2        18    52      85         2015       1   1992   \n",
       "2   3            2        29    83     126         2015       2   1967   \n",
       "3   4            2        10    28      51         2015       2   1943   \n",
       "4   5            1         7    18      36         2015       2   1994   \n",
       "\n",
       "   nationality  religion  ...  depression/age  floor_area/age health/age  \\\n",
       "0            1         1  ...        1.285211        0.410351   0.848837   \n",
       "1            1         1  ...        0.733333        0.952824   1.179337   \n",
       "2            1         0  ...        1.343537        0.972328   1.150485   \n",
       "3            1         1  ...        1.111663        0.642329   1.276353   \n",
       "4            1         1  ...        0.750000        0.587284   1.177106   \n",
       "\n",
       "   class_10_diff/age  class/age  health_problem/age  family_status/age  \\\n",
       "0           0.000000   0.683307            0.521429           0.733668   \n",
       "1           1.012552   1.344444            0.891344           1.359551   \n",
       "2           1.190955   1.195762            1.055679           1.190955   \n",
       "3           4.977778   1.199143            1.188329           1.162630   \n",
       "4           0.000000   0.236957            1.116803           1.093645   \n",
       "\n",
       "   leisure_sum/age  public_service_sum/age  trust_sum/age  \n",
       "0         0.724620                0.666638       0.925941  \n",
       "1         1.011792                1.130778       1.188442  \n",
       "2         0.966470                1.193204       0.803693  \n",
       "3         0.899346                1.153810       1.300950  \n",
       "4         1.045313                0.728161       1.117428  \n",
       "\n",
       "[5 rows x 272 columns]"
      ]
     },
     "execution_count": 11,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "print('shape',data.shape)\n",
    "data.head()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "We should also drop features with too few valid samples, such as features dominated by negative (invalid) codes or by missing values. Here I removed 9 such features, including \"current highest education attained\", leaving a final set of 263 features."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(7988, 263)"
      ]
     },
     "execution_count": 12,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
     "# 272 - 9 = 263\n",
     "# Drop features with very few valid values, plus columns already used elsewhere\n",
     "del_list=['id','survey_time','edu_other','invest_other','property_other','join_party','province','city','county']\n",
     "use_feature = [col for col in data.columns if col not in del_list]\n",
     "data.fillna(0, inplace=True)  # fill remaining NaNs with 0, as before\n",
     "train_shape = train.shape[0]  # number of training rows\n",
     "features = data[use_feature].columns  # feature names after deletion\n",
     "X_train_263 = data[:train_shape][use_feature].values\n",
     "y_train = target\n",
     "X_test_263 = data[train_shape:][use_feature].values\n",
     "X_train_263.shape  # the final set of 263 features"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "Here I selected the 49 most important features as a second feature set, alongside the 263-dimensional set above."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(7988, 49)"
      ]
     },
     "execution_count": 13,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "imp_fea_49 = ['equity','depression','health','class','family_status','health_problem','class_10_after',\n",
    "           'equity/province','equity/city','equity/county',\n",
    "           'depression/province','depression/city','depression/county',\n",
    "           'health/province','health/city','health/county',\n",
    "           'class/province','class/city','class/county',\n",
    "           'family_status/province','family_status/city','family_status/county',\n",
    "           'family_income/province','family_income/city','family_income/county',\n",
    "           'floor_area/province','floor_area/city','floor_area/county',\n",
    "           'leisure_sum/province','leisure_sum/city','leisure_sum/county',\n",
    "           'public_service_sum/province','public_service_sum/city','public_service_sum/county',\n",
    "           'trust_sum/province','trust_sum/city','trust_sum/county',\n",
    "           'income/m','public_service_sum','class_diff','status_3_before','age_income_mean','age_floor_area_mean',\n",
    "           'weight_jin','height_cm',\n",
    "           'health/age','depression/age','equity/age','leisure_sum/age'\n",
    "          ]\n",
    "train_shape = train.shape[0]\n",
    "X_train_49 = data[:train_shape][imp_fea_49].values\n",
    "X_test_49 = data[train_shape:][imp_fea_49].values\n",
     "X_train_49.shape  # the 49 most important features"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "Select the discrete variables that need one-hot encoding, encode them, and concatenate the result with the remaining columns to form a third feature set of 383 dimensions."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(7988, 383)"
      ]
     },
     "execution_count": 14,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "cat_fea = ['survey_type','gender','nationality','edu_status','political','hukou','hukou_loc','work_exper','work_status','work_type',\n",
    "           'work_manage','marital','s_political','s_hukou','s_work_exper','s_work_status','s_work_type','f_political','f_work_14',\n",
    "           'm_political','m_work_14']\n",
     "noc_fea = [col for col in use_feature if col not in cat_fea]\n",
     "\n",
     "onehot_data = data[cat_fea].values\n",
     "enc = preprocessing.OneHotEncoder(categories = 'auto')\n",
     "oh_data = enc.fit_transform(onehot_data).toarray()\n",
     "oh_data.shape  # the one-hot encoded matrix\n",
     "\n",
     "X_train_oh = oh_data[:train_shape,:]\n",
     "X_test_oh = oh_data[train_shape:,:]\n",
     "X_train_oh.shape  # its training portion\n",
     "\n",
     "X_train_383 = np.column_stack([data[:train_shape][noc_fea].values, X_train_oh])  # non-categorical columns first, then the one-hot block\n",
     "X_test_383 = np.column_stack([data[train_shape:][noc_fea].values, X_test_oh])\n",
     "X_train_383.shape"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "With this, we have constructed three feature sets (training matrices). The first is the 49 most important features extracted above, covering health, social class, income relative to age peers, and so on. The second is the expanded 263-dimensional set (which can be regarded as the base features). The third is the one-hot encoded set. One-hot encoding is used here because some features are discrete codes: gender, for instance, is coded 1 for male and 2 for female, and one-hot encoding replaces it with 0/1 indicator columns, which makes the learning algorithms more robust; likewise, nationality originally takes the 56 integer values 1 through 56, and feeding these codes to a model directly would hurt robustness, so one-hot encoding expands the column into a separate 0/1 indicator feature for each observed category."
   ]
  },
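   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "As a toy illustration of the encoding step (synthetic values, not the survey data), `OneHotEncoder(categories='auto')` gives each *observed* category its own 0/1 indicator column, which is why the encoded width depends on the data rather than on the nominal code range:\n",
     "\n",
     "```python\n",
     "import numpy as np\n",
     "from sklearn import preprocessing\n",
     "\n",
     "# column 0: gender coded 1/2; column 1: nationality codes (a few of 1-56)\n",
     "toy = np.array([[1, 1], [2, 3], [2, 1], [1, 8]])\n",
     "enc = preprocessing.OneHotEncoder(categories='auto')\n",
     "oh = enc.fit_transform(toy).toarray()\n",
     "print(oh.shape)  # 2 gender columns + 3 observed nationality columns -> (4, 5)\n",
     "```"
    ]
   },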
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "#### Feature modeling\n",
     "\n",
     "First, we model the original 263-dimensional feature set with LightGBM, using 5-fold cross-validation:\n",
     "\n",
     "1. LightGBM"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "Training until validation scores don't improve for 800 rounds\n",
      "[500]\ttraining's l2: 0.507058\tvalid_1's l2: 0.50963\n",
      "[1000]\ttraining's l2: 0.458952\tvalid_1's l2: 0.469564\n",
      "[1500]\ttraining's l2: 0.433197\tvalid_1's l2: 0.453234\n",
      "[2000]\ttraining's l2: 0.415242\tvalid_1's l2: 0.444839\n",
      "[2500]\ttraining's l2: 0.400993\tvalid_1's l2: 0.440086\n",
      "[3000]\ttraining's l2: 0.388937\tvalid_1's l2: 0.436924\n",
      "[3500]\ttraining's l2: 0.378101\tvalid_1's l2: 0.434974\n",
      "[4000]\ttraining's l2: 0.368159\tvalid_1's l2: 0.43406\n",
      "[4500]\ttraining's l2: 0.358827\tvalid_1's l2: 0.433151\n",
      "[5000]\ttraining's l2: 0.350291\tvalid_1's l2: 0.432544\n",
      "[5500]\ttraining's l2: 0.342368\tvalid_1's l2: 0.431821\n",
      "[6000]\ttraining's l2: 0.334675\tvalid_1's l2: 0.431331\n",
      "[6500]\ttraining's l2: 0.327275\tvalid_1's l2: 0.431014\n",
      "[7000]\ttraining's l2: 0.320398\tvalid_1's l2: 0.431087\n",
      "[7500]\ttraining's l2: 0.31352\tvalid_1's l2: 0.430819\n",
      "[8000]\ttraining's l2: 0.307021\tvalid_1's l2: 0.430848\n",
      "[8500]\ttraining's l2: 0.300811\tvalid_1's l2: 0.430688\n",
      "[9000]\ttraining's l2: 0.294787\tvalid_1's l2: 0.430441\n",
      "[9500]\ttraining's l2: 0.288993\tvalid_1's l2: 0.430433\n",
      "Early stopping, best iteration is:\n",
      "[9119]\ttraining's l2: 0.293371\tvalid_1's l2: 0.430308\n",
      "fold n°2\n",
      "Training until validation scores don't improve for 800 rounds\n",
      "[500]\ttraining's l2: 0.49895\tvalid_1's l2: 0.52945\n",
      "[1000]\ttraining's l2: 0.450107\tvalid_1's l2: 0.496478\n",
      "[1500]\ttraining's l2: 0.424394\tvalid_1's l2: 0.483286\n",
      "[2000]\ttraining's l2: 0.40666\tvalid_1's l2: 0.476764\n",
      "[2500]\ttraining's l2: 0.392432\tvalid_1's l2: 0.472668\n",
      "[3000]\ttraining's l2: 0.380438\tvalid_1's l2: 0.470481\n",
      "[3500]\ttraining's l2: 0.369872\tvalid_1's l2: 0.468919\n",
      "[4000]\ttraining's l2: 0.36014\tvalid_1's l2: 0.467318\n",
      "[4500]\ttraining's l2: 0.351175\tvalid_1's l2: 0.466438\n",
      "[5000]\ttraining's l2: 0.342705\tvalid_1's l2: 0.466284\n",
      "[5500]\ttraining's l2: 0.334778\tvalid_1's l2: 0.466151\n",
      "[6000]\ttraining's l2: 0.3273\tvalid_1's l2: 0.466016\n",
      "[6500]\ttraining's l2: 0.320121\tvalid_1's l2: 0.466013\n",
      "Early stopping, best iteration is:\n",
      "[5915]\ttraining's l2: 0.328534\tvalid_1's l2: 0.465918\n",
      "fold n°3\n",
      "Training until validation scores don't improve for 800 rounds\n",
      "[500]\ttraining's l2: 0.499658\tvalid_1's l2: 0.528985\n",
      "[1000]\ttraining's l2: 0.450356\tvalid_1's l2: 0.497264\n",
      "[1500]\ttraining's l2: 0.424109\tvalid_1's l2: 0.485403\n",
      "[2000]\ttraining's l2: 0.405965\tvalid_1's l2: 0.479513\n",
      "[2500]\ttraining's l2: 0.391747\tvalid_1's l2: 0.47646\n",
      "[3000]\ttraining's l2: 0.379601\tvalid_1's l2: 0.474691\n",
      "[3500]\ttraining's l2: 0.368915\tvalid_1's l2: 0.473648\n",
      "[4000]\ttraining's l2: 0.359218\tvalid_1's l2: 0.47316\n",
      "[4500]\ttraining's l2: 0.350338\tvalid_1's l2: 0.473043\n",
      "[5000]\ttraining's l2: 0.341842\tvalid_1's l2: 0.472719\n",
      "[5500]\ttraining's l2: 0.333851\tvalid_1's l2: 0.472779\n",
      "Early stopping, best iteration is:\n",
      "[4942]\ttraining's l2: 0.342828\tvalid_1's l2: 0.472642\n",
      "fold n°4\n",
      "Training until validation scores don't improve for 800 rounds\n",
      "[500]\ttraining's l2: 0.505224\tvalid_1's l2: 0.508238\n",
      "[1000]\ttraining's l2: 0.456198\tvalid_1's l2: 0.473992\n",
      "[1500]\ttraining's l2: 0.430167\tvalid_1's l2: 0.461419\n",
      "[2000]\ttraining's l2: 0.412084\tvalid_1's l2: 0.454843\n",
      "[2500]\ttraining's l2: 0.397714\tvalid_1's l2: 0.450999\n",
      "[3000]\ttraining's l2: 0.385456\tvalid_1's l2: 0.448697\n",
      "[3500]\ttraining's l2: 0.374527\tvalid_1's l2: 0.446993\n",
      "[4000]\ttraining's l2: 0.364711\tvalid_1's l2: 0.44597\n",
      "[4500]\ttraining's l2: 0.355626\tvalid_1's l2: 0.445132\n",
      "[5000]\ttraining's l2: 0.347108\tvalid_1's l2: 0.44466\n",
      "[5500]\ttraining's l2: 0.339146\tvalid_1's l2: 0.444226\n",
      "[6000]\ttraining's l2: 0.331478\tvalid_1's l2: 0.443992\n",
      "[6500]\ttraining's l2: 0.324231\tvalid_1's l2: 0.444014\n",
      "Early stopping, best iteration is:\n",
      "[5874]\ttraining's l2: 0.333372\tvalid_1's l2: 0.443868\n",
      "fold n°5\n",
      "Training until validation scores don't improve for 800 rounds\n",
      "[500]\ttraining's l2: 0.504304\tvalid_1's l2: 0.515256\n",
      "[1000]\ttraining's l2: 0.456062\tvalid_1's l2: 0.478544\n",
      "[1500]\ttraining's l2: 0.430298\tvalid_1's l2: 0.463847\n",
      "[2000]\ttraining's l2: 0.412591\tvalid_1's l2: 0.456182\n",
      "[2500]\ttraining's l2: 0.398635\tvalid_1's l2: 0.451783\n",
      "[3000]\ttraining's l2: 0.386609\tvalid_1's l2: 0.449154\n",
      "[3500]\ttraining's l2: 0.375948\tvalid_1's l2: 0.447265\n",
      "[4000]\ttraining's l2: 0.366291\tvalid_1's l2: 0.445796\n",
      "[4500]\ttraining's l2: 0.357236\tvalid_1's l2: 0.445098\n",
      "[5000]\ttraining's l2: 0.348637\tvalid_1's l2: 0.444364\n",
      "[5500]\ttraining's l2: 0.340736\tvalid_1's l2: 0.443998\n",
      "[6000]\ttraining's l2: 0.333154\tvalid_1's l2: 0.443622\n",
      "[6500]\ttraining's l2: 0.325783\tvalid_1's l2: 0.443226\n",
      "[7000]\ttraining's l2: 0.318802\tvalid_1's l2: 0.442986\n",
      "[7500]\ttraining's l2: 0.312164\tvalid_1's l2: 0.442928\n",
      "[8000]\ttraining's l2: 0.305691\tvalid_1's l2: 0.442696\n",
      "[8500]\ttraining's l2: 0.29935\tvalid_1's l2: 0.442521\n",
      "[9000]\ttraining's l2: 0.293242\tvalid_1's l2: 0.442655\n",
      "Early stopping, best iteration is:\n",
      "[8594]\ttraining's l2: 0.298201\tvalid_1's l2: 0.44238\n",
      "CV score: 0.45102656\n"
     ]
    }
   ],
   "source": [
     "##### lgb_263 #\n",
     "# LightGBM gradient-boosted decision trees\n",
     "lgb_263_param = {\n",
     "'num_leaves': 7, \n",
     "'min_data_in_leaf': 20, # minimum number of records a leaf may hold\n",
     "'objective':'regression',\n",
     "'max_depth': -1,\n",
     "'learning_rate': 0.003,\n",
     "\"boosting\": \"gbdt\", # use the GBDT algorithm\n",
     "\"feature_fraction\": 0.18, # randomly sample 18% of the features when building each tree\n",
     "\"bagging_freq\": 1,\n",
     "\"bagging_fraction\": 0.55, # fraction of the data used in each iteration\n",
     "\"bagging_seed\": 14,\n",
     "\"metric\": 'mse',\n",
     "\"lambda_l1\": 0.1005,\n",
     "\"lambda_l2\": 0.1996, \n",
     "\"verbosity\": -1}\n",
     "folds = StratifiedKFold(n_splits=5, shuffle=True, random_state=4)  # 5-fold cross-validation splits\n",
    "oof_lgb_263 = np.zeros(len(X_train_263))\n",
    "predictions_lgb_263 = np.zeros(len(X_test_263))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_263, y_train)):\n",
    "\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    trn_data = lgb.Dataset(X_train_263[trn_idx], y_train[trn_idx])\n",
    "    val_data = lgb.Dataset(X_train_263[val_idx], y_train[val_idx])#train:val=4:1\n",
    "\n",
    "    num_round = 10000\n",
    "    lgb_263 = lgb.train(lgb_263_param, trn_data, num_round, valid_sets = [trn_data, val_data], verbose_eval=500, early_stopping_rounds = 800)\n",
    "    oof_lgb_263[val_idx] = lgb_263.predict(X_train_263[val_idx], num_iteration=lgb_263.best_iteration)\n",
    "    predictions_lgb_263 += lgb_263.predict(X_test_263, num_iteration=lgb_263.best_iteration) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_lgb_263, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "Next, I use the trained LightGBM model to assess and visualize feature importance. The results show that the top-ranked feature is health/age, i.e. health relative to one's age peers, which matches our intuition."
   ]
  },
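   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "A minimal self-contained sketch of how such a ranking can be read from a trained booster (synthetic data, not the survey; in the notebook the trained `lgb_263` model and the real feature names are used instead):\n",
     "\n",
     "```python\n",
     "import numpy as np\n",
     "import lightgbm as lgb\n",
     "\n",
     "rng = np.random.RandomState(0)\n",
     "X = rng.rand(200, 5)\n",
     "y = 3 * X[:, 0] + rng.rand(200)  # feature f0 dominates the target\n",
     "ds = lgb.Dataset(X, y, feature_name=['f0', 'f1', 'f2', 'f3', 'f4'])\n",
     "booster = lgb.train({'objective': 'regression', 'verbosity': -1}, ds, num_boost_round=50)\n",
     "gain = booster.feature_importance(importance_type='gain')\n",
     "order = np.argsort(gain)[::-1]\n",
     "print([booster.feature_name()[i] for i in order])  # f0 should rank first\n",
     "```"
    ]
   },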
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAA+gAAAfYCAYAAAC9lvdaAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8QZhcZAAAgAElEQVR4nOzdf7znc53//9vdrxCRTJaUKRXph1ETbSKiQiptNLuxRS1LWz4qlVaraCvbL21bq7CadtlSSqFoIpQSZjB+JVpjv6IV+U0SHt8/Xs/h3emcOWfmzMx5nZnb9XJxOa/36/V8PZ+P9+s9f7i/n8/X652qQpIkSZIkTawVJroASZIkSZJkQJckSZIkqRcM6JIkSZIk9YABXZIkSZKkHjCgS5IkSZLUAwZ0SZIkSZJ6wIAuSZLGLMk2SX450XX0XZJXJfnORNexJCXZLsmvF3B8kySXJrknyYGj9LV3kvMXcPzcJH83Sh+PS3JNkiePXr0k9ZMBXZKkxSjJDUl+n+Tegf82GGefCwxCS1NV/aSqNpnoOqBf12UYHweOnOgilrYkGwx8Ju8Hzq2qNavq80t67Kr6A3A88IElPZYkLSkGdEmSFr/XVtUaA//dPJHFJFlpIsdfEvr8npK8GFirqn6+lMftwzXZBTizbW8EXLWUx/9v4K1JHreUx5WkxcKALknSUpLkJUl+luTOJHOTbDdwbJ8kv2jLga9P8vdt/+OBM4ANBmfkk8xM8s8D5//JbHKbyf9AksuB+5Ks1M77VpJbk8wbXHacZMsks5PcneSWJJ8d4T0MN877klye5L4k/5FkvSRntPdyVpIntrZTk1SS/ZLcnOQ3Sd470NfjknyuHbu5bT9ucNz2nv4P+NoI12XLJBe0a/ybJF9IssrAGJVk/yTXJbkjyReTZOD4vgOfw9VJXtj2j3jthrEzcN6Q6/avSW5s13dOkm0G+v19knUG2m6R5LYkK7fXb2s13ZHkB0k2GvJ+/iHJdcB1CxqrHVstyVdbX79I8v4hn+eC/o2s1v7d3ZHkauDFw7z3XYDvJ/kRsD3whfbZPDvJWkn+s/X9v0k+lGTY/xdN8sp0y9XvSvIFYPAzemaS89qx25KcNP9YVf0auAN4yQI+H0nqLQO6JElLQZKnAN8D/hlYBzgY+FaSKa3Jb4FdgScA+wBHJXlhVd1HF/huXoQZ+b8BXgOsDTwCnAbMBZ4C7AAclOTVre2/Av9aVU8ANga+sRBv743AK4FnA6+lC87/CKxL9/8aQ8Ps9sCzgFcBhyTZse0/lC5YTQM2B7YEPjRw3l/QXbuNgLcw/HV5GHh3G/sv2/t8x5Dxd6ULl5sDbwJeDZBkD+Ajre8nAK8DftdC5IKu3VDPB4bep39xe1/r0M3yfjPJqq3mC9o1nO/NwMlV9ccku9Fdy78CpgA/oftyYtBuwFbAZgsaqx37MDAVeAbdZ7bX/E7G8D4/TPdvY+N2zd46WET7QmFb4IdV9YpW6zvbZ3Mt8G/AWm3sl9Nd532GXrwk6wLfovvs1wX+B9h6oMlHgVnAE4ENW7+DfkH32UrSpGNAlyRp8ftOm8G9M489KGwv4PtV9f2qeqSqfgjMpptxpKq+V1X/U53z6ALINsN3P2afr6obq+r3dIF0SlUdUVUPVtX1wLHAX7e2fwSemWTdqrp3IZdn/1tV3VJVN9GFsgur6tJ2T/ApwBZD2h9eVfdV1RXAV+i+SADYEziiqn5bVbcChwN/O3DeI8CHq+oP7T39maqaU1U/r6qHquoG4Mt0YXDQkVV1Z1X9f8A5dGEW4O+AT1bVxe1z+FVV/S+jX7uh1gbuGVLXCVX1u1bXZ4DHAfPv5f/v+degzeb/ddsH8PfAJ6rqF1X1EN297dMGZ9Hb8dvnX5NRxnoT8PGquqPNNg/eGz7a+3wT8LE21o1DzoUunM+tqnuG7CfJisAM4INVdU/7bD7Dn36+
8+0CXF1VJ1fVH4HPAf83cPyPdF/SbFBVD1TV0IfL3UP3GUjSpGNAlyRp8dutqtZu/+3W9m0E7DEQ3O8EXgasD5Bk5yQ/T3J7O7YL3ezheNw4sL0R3XLwwfH/EVivHX873Qz4NUkuTrLrQoxzy8D274d5vcYC6vpfYP5D9DZor4c7BnBrVT2woELaUurTk/xfkrvpAu3Q6zgY9u4fqO+pdLO1Q4127Ya6A1hzSF3vbUvK72rnrzVQ18nAX6Z7mOC2QNF90TF/7H8dGPd2uuXeTxnofvB6jjbWBkPaL8y/kaHnDn5W0Ja3D39JWBdYhT//fJ8yTNs/Gaeqasi476e7BhcluSrJ24acvyZw5wh1SFKv9eFhIpIkLQ9uBP6rqvYdeiDdfdbfolvy+922tPk7PHbfbQ3T333A6gOv/2KYNoPn3QjMq6pnDVdcVV0H/E1b5vxXwMlJntSW2C9uTwWuadtPA+Yv2b+ZP32w2OAx+PPrMNx1ORq4FPibqronyUHA7mOs60a65dvD7R/x2g3jcrovO4Dup+noniy+A3BVVT2S5A7a51tVdyaZRTdD/Rzgay2Uzh/7Y1V14gLGe/Q6jDYW8Bu6ZeFXt9dPXYj3+ZvWfvDzGbQL8IYRzr2Nx2a+54/9NOCmBYwz/z1l8HVV/R+wbzv2MuCsJD+uql+1Js+hm52XpEnHGXRJkpaOE4DXJnl1khWTrJruwWcb0s0sPg64FXgoyc5092fPdwvwpCRrDey7DNglyTpJ/gI4aJTxLwLuTveQtdVaDc9L98RxkuyVZEpVPcJjs48Pj/tdD++fkqye5Ll09yDPf8jX14APJZnS7kM+jO66jWS467ImcDdwb5JNgQMWoq7jgIOTvCidZ7al5Au8dsP4Pn+6rH5N4CG6z3elJIfR3eM+6L/pvqB5I48tbwf4EvDBdq1oD1rbYwHvYbSxvtH6e2J7LsI7B46N9j4Hz90QeNf8E5M8HXhcVV3DMKrq4Xb+x5Ks2a7rexj+8/0e8Nwkf5XuyfQHMvAFVJI92vjQrVYo2r/V9p7WAZbqE/QlaXExoEuStBS0e3ZfT7dk+Fa62cr3ASu0e3YPpAswd9A9JOzUgXOvoQuv17elxxsA/0X3MK8b6O5Xf/RJ1iOM/zDdA9ymAfPoZjSPo1v+DLATcFWSe+keGPfXoy0nH4fzgF8BZwOfrqpZbf8/092XfzlwBXBJ2zesEa7LwXTX7x66+6cXeF2G9PdN4GN0Afke4DvAOmO4dkP7uQS4K8lWbdcP6B6cdy3dsu4HGLIsne7zfhZwS1XNHejrFOBfgK+3JftX0j0cbySjjXUE8Ov2Ps6iW17/hzbWaO/z8NbnPLp/c/810O9rGHl5+3zvolv5cT1wPt11Pn5oo6q6DdiD7nfkf0d3XX460OTFwIXt3+qpwP+rqnnt2JuBr7bnH0jSpJPHVlBJkiQtOUmm0oW7ldsDz5ZZSV4FvGPgGQS9lOQAui9jhj5Ib2H7+T7whaoaLaQvMe1WkbnAtlX124mqQ5LGwxl0SZKkxayqZvUxnCdZP8nWSVZIsgnwXron7Y/XuXRPxJ8w7en+mxrOJU1mzqBLkqSlYnmaQe+rdu/394Cn0z1r4Ot0P3324IQWJkkCDOiSJEmSJPWCS9wlSZIkSeoBfwddk8q6665bU6dOnegyJEmSJGmRzZkz57aqmjJ0vwFdk8rUqVOZPXv2RJchSZIkSYssyf8Ot98l7pIkSZIk9YABXZIkSZKkHnCJuyaVh269nVuPPmGiy5AkSZLUY1MO2GuiS1gkzqBLkiRJktQDBnRJkiRJknrAgC5JkiRJUg8Y0CVJkiRJ6gEDuiRJkiRJPWBAlyRJkiSpBwzokiRJkiT1gAFdkiRJkqQeMKBLkiRJktQDBnRJkiRJknrAgC5JkiRJUg8Y0CVJkiRJ6gED+iSXZGqSKxdDP3sn+ULb3i3JZgPHzk0yfQHnzkmyynhrkCRJkqTlmQFdw9kN2GzUVnRf
EAA3VdWDS7IgSZIkSVrWGdCXDSsmOTbJVUlmJVktycZJzmyz2z9JsilAktcmuTDJpUnOSrLeYEdJXgq8DvhUksuSbNwO7ZHkoiTXJtlm4JSdgTPbuUcnmd3qOHygz12SXJPk/CSfT3J62//4JMcnubjV8/oleI0kSZIkqdcM6MuGZwFfrKrnAncCbwSOAd5VVS8CDgb+vbU9H3hJVW0BfB14/2BHVfUz4FTgfVU1rar+px1aqaq2BA4CPjxwyk60gA4cWlXTgRcAL0/ygiSrAl8Gdq6qlwFTBs49FPhRVb0Y2J7uS4HHD31zSfZrwX/27+69e+GvjiRJkiRNAitNdAFaLOZV1WVtew4wFXgp8M0k89s8rv3dEDgpyfrAKsC8MY7x7SH90+4737Cqrm/H3pRkP7p/V+vTLZNfAbi+quaP8zVgv7b9KuB1SQ5ur1cFngb8YnDgqjqG7gsHpm30jBpjvZIkSZI0qRjQlw1/GNh+GFgPuLOqpg3T9t+Az1bVqUm2Az6ykGM8zGP/brahm5EnydPpZupfXFV3JJlJF7jDyAK8sap+OcYaJEmSJGmZ5RL3ZdPdwLwkewCks3k7thZwU9t+6wjn3wOsOYZxdgLOaNtPAO4D7mr3te/c9l8DPKM9TA5gxsD5PwDelTbNn2SLMYwpSZIkScskA/qya0/g7UnmAlcB8x/A9hG6pe8/AW4b4dyvA+9rD27beIQ2ANsB5wFU1Vzg0jbW8cBP2/7fA+8AzkxyPnALcFc7/6PAysDl7afiPrrwb1OSJEmSlg2p8pZeLbwkGwLHVtXOY2i7RlXd22bKvwhcV1VHLcq40zZ6Rv3wkCMW5VRJkiRJy4kpB+w10SUsUJI57QHbf8IZdC2Sqvr1WMJ5s2+Sy+hm19eie6q7JEmSJGmAD4nTEtdmyxdpxlySJEmSlhfOoEuSJEmS1AMGdEmSJEmSesCALkmSJElSDxjQJUmSJEnqAQO6JEmSJEk9YECXJEmSJKkHDOiSJEmSJPWAv4OuSWWlKesw5YC9JroMSZIkSVrsnEGXJEmSJKkHDOiSJEmSJPWAAV2SJEmSpB4woEuSJEmS1AMGdEmSJEmSesCALkmSJElSDxjQJUmSJEnqAX8HXZPKQ7feyq1fOmaiy5AkSZLUQ1P232+iSxgXZ9AlSZIkSeoBA7okSZIkST1gQJckSZIkqQcM6JIkSZIk9YABXZIkSZKkHjCgS5IkSZLUAwZ0SZIkSZJ6wIAuSZIkSVIPGNAlSZIkSeoBA7okSZIkST1gQJckSZIkqQcM6JIkSZIk9YABfRJIskGSk9v2tCS7jPG89ZPMWsK1fT/J2ktyDEmSJElaHhjQJ4Gqurmqdm8vpwFjCujATsAPxjpOkhUXobZdqurOhT1PkiRJkvSnDOhLWJK9klyU5LIkX06yYpJ9klyb5Lwkxyb5Qms7M8nuA+fe2/5OTXJlklWAI4AZrb8ZSa5LMqW1WyHJr5Ks27rYCTgjyXZJfpzklCRXJ/lSkhXmj5HkiCQXAn+ZZIcklya5IsnxSR6XZOck3xioa7skp7XtG5Ks22r8RXs/VyWZlWS11uaZSc5KMjfJJUk2bvvfl+TiJJcnOXwJfxSSJEmS1GsG9CUoyXOAGcDWVTUNeBjYCzgc2Bp4JbDZWPurqgeBw4CTqmpaVZ0EnADs2ZrsCMytqtvabPgmVXV1O7Yl8F7g+cDGwF+1/Y8HrqyqrYDZwExgRlU9H1gJOAD4IfCSJI9v58wAThqmxGcBX6yq5wJ3Am9s+09s+zcHXgr8JsmrWvst6VYFvCjJtsO97yT7JZmdZPbv7r13rJdLkiRJkiYVA/qStQPwIuDiJJe11+8Gzq2qW1vgHi7oLozjgbe07bcBX2nbWwEXDrS7qKqur6qHga8BL2v7Hwa+1bY3AeZV1bXt9VeBbavqIeBM4LVJVgJeA3x3mFrmVdVlbXsOMDXJmsBTquoUgKp6oKruB17V/rsUuATYlC6w/5mq
OqaqplfV9CetscboV0SSJEmSJqGVJrqAZVyAr1bVBx/dkewGvGGE9g/RvjRJEmCV0QaoqhuT3JLkFXShfP5s+s50ofrRpkNPbX8faKF9fr0jOQn4B+B24OKqumeYNn8Y2H4YWG0BfQb4RFV9eQFjSpIkSdJywxn0JetsYPckTwZIsg7djPF2SZ6UZGVgj4H2N9DNuAO8Hlh5mD7vAdYcsu84uqXu3xgI2zu08efbMsnT273nM4Dzh+n7GrpZ72e2138LnNe2zwVeCOzLQsz6V9XdwK/bFxO0e9pXp3t43duSrNH2P2X+dZIkSZKk5ZEBfQlq939/CJiV5HK6e7nXBz4CXACcRbe8e75jgZcnuYhuNvy+Ybo9B9hs/kPi2r5TgTVoy9vbQ+MeaOF4vguAI4ErgXnAKcPU+wCwD/DNJFcAjwBfasceBk6nm5k/faEuRBf0D2zX4GfAX1TVLOC/gQvaWCfz5188SJIkSdJyI1VDVz5raUqyNzC9qt45jj6mA0dV1Tbt9V7AhlV1ZHu9HXBwVe06/oon1rSNNqoffvDQiS5DkiRJUg9N2X+/iS5hTJLMqarpQ/d7D/okl+QQuietz7/3nKo6YeIqkiRJkiQtCgP6BKuqmXQ/bbao5x9Jt3R9QW3OpbuHXJIkSZLUU96DLkmSJElSDxjQJUmSJEnqAQO6JEmSJEk9YECXJEmSJKkHDOiSJEmSJPWAAV2SJEmSpB4woEuSJEmS1AP+DromlZWmTGHK/vtNdBmSJEmStNg5gy5JkiRJUg8Y0CVJkiRJ6gEDuiRJkiRJPWBAlyRJkiSpBwzokiRJkiT1gAFdkiRJkqQeMKBLkiRJktQD/g66JpU/3noLtxz9mYkuQ5IkSRKw3gHvnegSlinOoEuSJEmS1AMGdEmSJEmSesCALkmSJElSDxjQJUmSJEnqAQO6JEmSJEk9YECXJEmSJKkHDOiSJEmSJPWAAV2SJEmSpB4woEuSJEmS1AMGdEmSJEmSesCALkmSJElSDxjQJUmSJEnqAQO6AEiyQZKT2/a0JLuM8bz1k8xastVJkiRJ0rLPgC4Aqurmqtq9vZwGjCmgAzsBP1gyVUmSJEnS8sOAvgxIsleSi5JcluTLSVZMsk+Sa5Ocl+TYJF9obWcm2X3g3Hvb36lJrkyyCnAEMKP1NyPJdUmmtHYrJPlVknVbFzsBZyRZI8nZSS5JckWS1w+M8U9JrknywyRfS3Jw279xkjOTzEnykySbLp0rJkmSJEn9s9JEF6DxSfIcYAawdVX9Mcm/A3sBhwMvAu4CzgEuHUt/VfVgksOA6VX1zjbGpsCewOeAHYG5VXVbkhWBTarq6iQrAW+oqrtbeP95klNbDW8EtqD793YJMKcNdwywf1Vdl2Qr4N+BVwzzHvcD9gPYcJ0nLuQVkiRJkqTJwYA++e1AF4IvTgKwGvBS4NyquhUgyUnAs8cxxvHAd+kC+tuAr7T9WwEXtu0AH0+yLfAI8BRgPeBlwHer6vetltPa3zVand9sdQM8brjBq+oYujDP5hs9tcbxPiRJkiSptwzok1+Ar1bVBx/dkewGvGGE9g/Rbm1Il4xXGW2AqroxyS1JXkEXyvdsh3YGzmzbewJTgBe1mfwbgFVbfcNZAbizqqaNNr4kSZIkLQ+8B33yOxvYPcmTAZKsQ7ecfbskT0qyMrDHQPsb6GbcAV4PrDxMn/cAaw7ZdxxwAvCNqnq47duhjQ+wFvDbFs63BzZq+88HXptk1TZr/hqAqrobmJdkj1Z3kmy+0O9ekiRJkpYRBvRJrqquBj4EzEpyOfBDYH3gI8AFwFl0933Pdyzw8iQX0c2G3zdMt+cAm81/SFzbdyqwBm15e3to3AMtaAOcCExPMptuNv2aVt/F7dy5wLeB2XT3xdPavT3JXOAqui8MJEmSJGm55BL3ZUBVnQScNGT3z3ksTO8NTG9tbwFeMtDug23/DcDz2vbtwIuH9Lc53cPhrmmvXw08+vvnVXUb
8JcjlPjpqvpIktWBHwOfaefMo3sKvCRJkiQt9wzoGlWSQ4ADeOzec6rqhIXo4pgkm9Hdk/7VqrpktBMkSZIkaXljQF8OVNVMYOY4zj8SOHIc5795Uc+VJEmSpOWF96BLkiRJktQDBnRJkiRJknrAgC5JkiRJUg8Y0CVJkiRJ6gEDuiRJkiRJPWBAlyRJkiSpBwzokiRJkiT1gL+Drkll5Snrsd4B753oMiRJkiRpsXMGXZIkSZKkHjCgS5IkSZLUAwZ0SZIkSZJ6wIAuSZIkSVIPGNAlSZIkSeoBA7okSZIkST1gQJckSZIkqQf8HXRNKn+89SZ+8+//ONFlSJIkScu99d/x8YkuYZnjDLokSZIkST1gQJckSZIkqQcM6JIkSZIk9YABXZIkSZKkHjCgS5IkSZLUAwZ0SZIkSZJ6wIAuSZIkSVIPGNAlSZIkSeoBA7okSZIkST1gQJckSZIkqQcM6JIkSZIk9YABXZIkSZKkHjCgC4AkGyQ5uW1PS7LLGM9bP8msBRw/IsmObfugJKsvnoolSZIkadliQBcAVXVzVe3eXk4DxhTQgZ2AHyyg38Oq6qz28iDAgC5JkiRJwzCgLwOS7JXkoiSXJflykhWT7JPk2iTnJTk2yRda25lJdh849972d2qSK5OsAhwBzGj9zUhyXZIprd0KSX6VZN3WxU7AGe3Y+5NckWRukiMHx0tyILABcE6Sc5K8PclRA3Xsm+SzS/5qSZIkSVI/GdAnuSTPAWYAW1fVNOBhYC/gcGBr4JXAZmPtr6oeBA4DTqqqaVV1EnACsGdrsiMwt6puS7IisElVXZ1kZ2A3YKuq2hz45JB+Pw/cDGxfVdsDXwdel2Tl1mQf4CsjvMf9ksxOMvt3994/1rciSZIkSZOKAX3y2wF4EXBxksva63cD51bVrS1wnzTOMY4H3tK238ZjQXor4MK2vSPwlaq6H6Cqbl9Qh1V1H/AjYNckmwIrV9UVI7Q9pqqmV9X0J63hCnlJkiRJyyYD+uQX4KtttntaVW0CfASoEdo/RPvckwRYZbQBqupG4JYkr6AL5We0QzsDZw7UMdKYIzkO2JsFzJ5LkiRJ0vLCgD75nQ3snuTJAEnWAS4FtkvypLaEfI+B9jfQzbgDvB5YmT93D7DmkH3H0S11/0ZVPdz27dDGB5gFvG3+U9pbHQvst6ouBJ4KvBn42qjvVJIkSZKWYQb0Sa6qrgY+BMxKcjnwQ2B9uln0C4CzgEsGTjkWeHmSi+hmw+8bpttzgM3mPySu7TsVWIM2090eGvdAVd3d6jiztZndltofPEy/xwBnJDlnYN83gJ9W1R0L+94lSZIkaVmSqoVdlazJJsnewPSqeuc4+pgOHFVV27TXewEbVtWR46zt9Nbv2aM2BjbfaP068wP7jGdISZIkSYvB+u/4+ESXMGklmVNV04fuX2kiitHkkuQQ4AAee5I7VXXCOPtcG7iI7onwYwrnkiRJkrQsM6AvB6pqJjBzHOcfCYxrpnyYPu8Enr04+5QkSZKkycx70CVJkiRJ6gEDuiRJkiRJPWBAlyRJkiSpBwzokiRJkiT1gAFdkiRJkqQeMKBLkiRJktQDBnRJkiRJknrA30HXpLLylKew/js+PtFlSJIkSdJi5wy6JEmSJEk9YECXJEmSJKkHDOiSJEmSJPWAAV2SJEmSpB4woEuSJEmS1AMGdEmSJEmSesCALkmSJElSD/g76JpUHvztPG78tz0nugxJkiRpQjz1XSdOdAlagpxBlyRJkiSpBwzokiRJkiT1gAFdkiRJkqQeMKBLkiRJktQDBnRJkiRJknrAgC5JkiRJUg8Y0CVJkiRJ6gEDuiRJkiRJPWBAlyRJkiSpBwzokiRJkiT1gAFdkiRJkqQeMKAvQ5JskOTktj0tyS5jPG/9JLOWbHWPjvWPS2McSZIkSZpsDOjLkKq6uap2by+nAWMK6MBOwA+WTFV/xoAuSZIkScMwoPdEkr2SXJTksiRfTrJikn2S
XJvkvCTHJvlCazszye4D597b/k5NcmWSVYAjgBmtvxlJrksypbVbIcmvkqzbutgJOKMde3+SK5LMTXJk2zctyc+TXJ7klCRPbPvPTTK9ba+b5Ia2vXeSbyc5s437ybb/SGC1VtOJST6a5P8NvI+PJTlwyV1lSZIkSeovA3oPJHkOMAPYuqqmAQ8DewGHA1sDrwQ2G2t/VfUgcBhwUlVNq6qTgBOAPVuTHYG5VXVbkhWBTarq6iQ7A7sBW1XV5sAnW/v/BD5QVS8ArgA+PIYyprX39Hy6LwqeWlWHAL9vNe0J/Afw1nYNVgD+GjhxrO9TkiRJkpYlK010AQJgB+BFwMVJAFYDXgqcW1W3AiQ5CXj2OMY4Hvgu8DngbcBX2v6tgAvb9o7AV6rqfoCquj3JWsDaVXVea/NV4JtjGO/sqrqr1X41sBFw42CDqrohye+SbAGsB1xaVb8b2lGS/YD9AJ7yxNXH+HYlSZIkaXJxBr0fAny1zSxPq6pNgI8ANUL7h2ifXbpEv8poA1TVjcAtSV5BF8rPaId2Bs4cqGOkMRdYB7DqkGN/GNh+mJG/DDoO2BvYh+5LhOFqP6aqplfV9HXWGDqMJEmSJC0bDOj9cDawe5InAyRZB7gU2C7Jk5KsDOwx0P4Guhl3gNcDKw/T5z3AmkP2HUe31P0bVfVw27dDGx9gFvC2JKvPr6PNgt+RZJvW5m+B+bPpg3U8ek/8KP7Y3s98p9DdA/9ilt6D6iRJkiSpdwzoPVBVVwMfAmYluRz4IbA+3Sz6BcBZwCUDpxwLvDzJRXSz4fcN0+05wGbzHxLX9p0KrEFb3t4eGvdAVd3d6jiztZmd5DLg4HbeW4FPtdqm0T2ADuDTwAFJfgbMf+DcaI4BLk9yYhvzwVbr4JcGkiRJkrTcSdXCrGjWREmyNzC9qt45jj6mA0dV1Tbt9V7AhlV15OKpcpFqWoHuy4c9quq60dq/4GlPqu+9b6clX5gkSZLUQ099l89UXhYkmVNV04fu9yFxy4kkhwAH8NiT3KmqEyauIkiyGXA6cMpYwrkkSZIkLcsM6JNEVc0EZo7j/COBCZspH05b2v+Mia5DkiRJkvrAe9AlSZIkSeoBA7okSZIkST1gQJckSZIkqQcM6Ic4aC4AACAASURBVJIkSZIk9YABXZIkSZKkHjCgS5IkSZLUAwZ0SZIkSZJ6wIAuSZIkSVIPrDTRBUgLY5UnP52nvuvEiS5DkiRJkhY7Z9AlSZIkSeoBA7okSZIkST1gQJckSZIkqQcM6JIkSZIk9YABXZIkSZKkHjCgS5IkSZLUAwZ0SZIkSZJ6wN9B16TywG9/xTVffP1ElyFJkqSlYNN/+O5ElyAtVc6gS5IkSZLUAwZ0SZIkSZJ6wIAuSZIkSVIPGNAlSZIkSeoBA7okSZIkST1gQJckSZIkqQcM6JIkSZIk9YABXZIkSZKkHjCgS5IkSZLUAwZ0SZIkSZJ6wIAuSZIkSVIPGNAlSZIkSeoBA/oiSHJgkl8kOXGc/RyRZMe2fW6S6YupvoOSrL642o3Sx5wkq4ynD0mSJEmSAX1RvQPYpar2HE8nVXVYVZ21mGoadBAwluA91nbDSjIVuKmqHlzUPiRJkiRJHQP6QkryJeAZwKlJPpDkZ0kubX83aW32TvKdJKclmZfknUne09r9PMk6rd3MJLsP6f/tSY4aeL1vks+OUMvjk3wvydwkVyaZkeRAYAPgnCTntHZHJ5md5Kokh7d9w7W7d6Dv3ZPMbNt7tP7nJvnxQAk7A2eONEbbv0uSa5Kcn+TzSU4fqP34JBe36/L6Rfg4JEmSJGmZYUBfSFW1P3AzsD1wNLBtVW0BHAZ8fKDp84A3A1sCHwPub+0uAN6ygCG+Drwuycrt9T7AV0ZouxNwc1VtXlXPA86sqs/Pr6+qtm/tDq2q6cALgJcnecEI7UZyGPDqqtoceN2Q8c8caYwkqwJfBnauqpcBUwbOPRT4UVW9mO5afirJ44cbPMl+
LfzPvuNeJ+slSZIkLZsM6OOzFvDNJFcCRwHPHTh2TlXdU1W3AncBp7X9VwBTR+qwqu4DfgTsmmRTYOWqumKE5lcAOyb5lyTbVNVdI7R7U5JLgEtbjZuN7e096qfAzCT7AisCtPvON6yq6xcwxqbA9VU1r7X52kCfrwIOSXIZcC6wKvC04QavqmOqanpVTX/iGt7uLkmSJGnZtNJEFzDJfZQuiL+h3Y997sCxPwxsPzLw+hFGv+7HAf8IXMPIs+dU1bVJXgTsAnwiyayqOmKwTZKnAwcDL66qO9qy9VVH6nJg+9E2VbV/kq2A1wCXJZkGTAPOH2WMLOA9BnhjVf1yAW0kSZIkabnhDPr4rAXc1Lb3XlydVtWFwFPplsh/baR2STagWzp/AvBp4IXt0D3Amm37CcB9wF1J1qO7b5xh2gHckuQ5SVYA3jAwzsZVdWFVHQbc1mrbCThjlDGuAZ7RvrwAmDEw1g+AdyVJG2OLES+IJEmSJC0HnEEfn08CX03yHrpl6YvTN4BpVXXHAto8n+7e7UeAPwIHtP3HAGck+U1VbZ/kUuAq4Hq65eoM1w44BDgduBG4ElijtftUkmfRzXqfDcwFjqW7N52qmjvcGFX1+yTvAM5Mchtw0cDYHwU+B1zeQvoNwK4Lc4EkSZIkaVmSqhq9lZa69rTzo6rq7ImuZagkGwLHVtXOY2i7RlXd20L4F4Hrquqo0c4byfOetnad/IGXL+rpkiRJmkQ2/YfvTnQJ0hKRZE57yPafcIl7zyRZO8m1wO/7GM4BqurXYwnnzb7tQXBX0d0S8OUlV5kkSZIkTV4uce+ZqroTePbgviRPoltaPtQOVfW7pVLYImqz5Ys8Yy5JkiRJywsD+iTQQvi0ia5DkiRJkrTkuMRdkiRJkqQeMKBLkiRJktQDBnRJkiRJknrAgC5JkiRJUg8Y0CVJkiRJ6gEDuiRJkiRJPeDPrGlSWfXJz2TTf/juRJchSZIkSYudM+iSJEmSJPWAAV2SJEmSpB4woEuSJEmS1AMGdEmSJEmSesCALkmSJElSDxjQJUmSJEnqAQO6JEmSJEk94O+ga1K5/9ZfccmXXjvRZUiSJPXKC/c/baJLkLQYOIMuSZIkSVIPGNAlSZIkSeoBA7okSZIkST1gQJckSZIkqQcM6JIkSZIk9YABXZIkSZKkHjCgS5IkSZLUAwZ0SZIkSZJ6wIAuSZIkSVIPGNAlSZIkSeoBA7okSZIkST1gQJ8AST6S5OCJrgMgyf5J3jLOPuYkWWVx1SRJkiRJy6OVJroALZokK1XVQ+Ptp6q+NM46pgI3VdWD461FkiRJkpZnzqAvJUkOTfLLJGcBm7R9Gyc5s81A/yTJpm3/zCRfavuuTbJr2793km8mOQ2Y1fa9L8nFSS5Pcnjb9/gk30syN8mVSWa0/Ucmubq1/XTb9+hsfpJpSX7ejp+S5Ilt/7lJ/iXJRa2ebQbe2s7Ama3d0UlmJ7lqfi1t/y5JrklyfpLPJzl9oM7jW/2XJnn9kvsEJEmSJKnfnEFfCpK8CPhrYAu6a34JMAc4Bti/qq5LshXw78Ar2mlTgZcDGwPnJHlm2/+XwAuq6vYkrwKeBWwJBDg1ybbAFODmqnpNG3+tJOsAbwA2rapKsvYwpf4n8K6qOi/JEcCHgYPasZWqassku7T9O7b9OwHvbtuHtrpWBM5O8gLgWuDLwLZVNS/J1wbGOxT4UVW9rdVzUZKzquq+hbi8kiRJkrRMcAZ96dgGOKWq7q+qu4FTgVWBlwLfTHIZXYhdf+Ccb1TVI1V1HXA9sGnb/8Oqur1tv6r9dyld6N+ULrBfAezYZr23qaq7gLuBB4DjkvwVcP9ggUnWAtauqvParq8C2w40+Xb7O4fuywPafecbVtX17dibklzS6nkusFmr6fqqmtfaDAb0VwGHtPd/brsmTxt68ZLs12bmZ99xryvpJUmSJC2bnEFfemrI6xWAO6tq2hjbz389OLsc4BNV9eWhJ7dZ
+12ATySZVVVHJNkS2IFuNv+dPDZbPxZ/aH8f5rF/N9sA57fxng4cDLy4qu5IMpMucGcBfQZ4Y1X9ckEDV9UxdKsN2GyjtYdeF0mSJElaJjiDvnT8GHhDktWSrAm8lm4Ge16SPQDS2XzgnD2SrJBkY+AZwHAh9gfA25Ks0fp4SpInJ9kAuL+qTgA+DbywtVmrqr5Pt2z9T74YaLPsdwzcX/63wHks2E7AGW37CXRfHtyVZD26e9MBrgGe0R4mBzBjSP3vSpJW/xajjCdJkiRJyyxn0JeCqrokyUnAZcD/Aj9ph/YEjk7yIWBl4OvA3Hbsl3QBeT26+9QfaDl2sN9ZSZ4DXNCO3QvsBTwT+FSSR4A/AgcAawLfTTJ/Vvvd/Lm3Al9Ksjrdsvp9Rnlr2wGHtVrmJrkUuKqd+9O2//dJ3gGcmeQ24KKB8z8KfA64vIX0G4BdRxlTkiRJkpZJqXLFcN+05eGnV9XJE13LSJJsCBxbVTuPoe0aVXVvC+FfBK6rqqMWZdzNNlq7TvjgNqM3lCRJWo68cP/TJroESQshyZyqmj50v0vctUiq6tdjCefNvu1BcFcBa9E9EE+SJEmSNMAl7j1UVXtPdA2LU5stX6QZc0mSJElaXjiDLkmSJElSDxjQJUmSJEnqAQO6JEmSJEk9YECXJEmSJKkHDOiSJEmSJPWAAV2SJEmSpB4woEuSJEmS1AMGdEmSJEmSemCliS5AWhirT3kmL9z/tIkuQ5IkSZIWO2fQJUmSJEnqAQO6JEmSJEk9YECXJEmSJKkHDOiSJEmSJPWAAV2SJEmSpB4woEuSJEmS1AMGdEmSJEmSesDfQdekcu+tv+Knx+w60WVIkiSN2db7nT7RJUiaJJxBlyRJkiSpBwzokiRJkiT1gAFdkiRJkqQeMKBLkiRJktQDBnRJkiRJknrAgC5JkiRJUg8Y0CVJkiRJ6gEDuiRJkiRJPWBAlyRJkiSpBwzokiRJkiT1gAFdkiRJkqQeMKBLkiRJktQDBvRFlOTAJL9IcuI4+zkiyY5t+9wk0xdTfQclWX1xtRuljzlJVhnh2OuSHNK2d0uy2XjGkiRJkqRllQF90b0D2KWq9hxPJ1V1WFWdtZhqGnQQMJbgPdZ2w0oyFbipqh4c7nhVnVpVR7aXuwEGdEmSJEkahgF9EST5EvAM4NQkH0jysySXtr+btDZ7J/lOktOSzEvyziTvae1+nmSd1m5mkt2H9P/2JEcNvN43yWdHqOXxSb6XZG6SK5PMSHIgsAFwTpJzWrujk8xOclWSw9u+4drdO9D37klmtu09Wv9zk/x4oISdgTNbm52SXNLanD1wHb6Q5KXA64BPJbksycZJLhkY61lJ5iz0hyFJkiRJy4iVJrqAyaiq9k+yE7A98CDwmap6qC1V/zjwxtb0ecAWwKrAr4APVNUWLXy/BfjcCEN8Hbg8yfur6o/APsDfj9B2J+DmqnoNQJK1ququJO8Btq+q21q7Q6vq9iQrAmcneUFVfX6YdiM5DHh1Vd2UZO0h4787yRTgWGDbqpo3/wuIgWv2sySnAqdX1cmt1ruSTKuqy9p7nDncwEn2A/YDWG+d1UYpU5IkSZImJ2fQx28t4JtJrgSOAp47cOycqrqnqm4F7gJOa/uvAKaO1GFV3Qf8CNg1yabAylV1xQjNrwB2TPIvSbapqrtGaPemNmN9aatxYZea/xSYmWRfYEWAdt/5hlV1PfAS4MdVNa+9h9vH0OdxwD7tS4MZwH8P16iqjqmq6VU1fe01hr3VXZIkSZImPQP6+H2ULog/D3gt3Wz5fH8Y2H5k4PUjjL564Thgb7qZ5a+M1KiqrgVeRBfUP5HksKFtkjwdOBjYoapeAHxvSJ1/0uXA9qNtqmp/4EPAU4HLkjwJ2AY4f/4wQ84di2/RLZHfFZhTVb9byPMlSZIkaZlhQB+/tYCb2vbei6vTqrqQLgy/GfjaSO2SbADcX1UnAJ8GXtgO3QOs2bafANwH3JVkPbpQ
zDDtAG5J8pwkKwBvGBhn46q6sKoOA25rte0EnNGaXAC8vH0ZwNAl7sONVVUPAD8AjmYBX0JIkiRJ0vLAgD5+n6Sbuf4pben3YvQN4KdVdccC2jwfuCjJZcChwD+3/ccAZyQ5p6rm0i1tvwo4nm65OkPbtdeHAKfTLbH/zUC7TyW5oi3l/zEwF9gOOA+gLePfD/h2krnAScPU+nXgfe1BeRu3fSfSzbzPWuCVkCRJkqRlXKoWdlWylpYkpwNHVdXZE13LUEk2BI6tqp1Hbbzgfg4G1qqqfxpL+003Wrv+49CXjWdISZKkpWrr/U6f6BIk9UySOVU1feh+n+LeQ+0p6RcBc/sYzgGq6tf86VL5hZbkFGBj4BWLpShJkiRJmsQM6D1UVXcCzx7c1x7KNlxY32GyPlytqt4weitJkiRJWj4Y0CeJFsKnTXQdkiRJkqQlw4fESZIkSZLUAwZ0SZIkSZJ6wIAuSZIkSVIPGNAlSZIkSeoBA7okSZIkST1gQJckSZIkqQf8mTVNKmtMeSZb73f6RJchSZIkSYudM+iSJEmSJPWAAV2SJEmSpB4woEuSJEmS1AMGdEmSJEmSesCALkmSJElSDxjQJUmSJEnqAQO6JEmSJEk94O+ga1K5+7brOOu4XSa6DEmSpGHt+Hffn+gSJE1izqBLkiRJktQDBnRJkiRJknrAgC5JkiRJUg8Y0CVJkiRJ6gEDuiRJkiRJPWBAlyRJkiSpBwzokiRJkiT1gAFdkiRJkqQeMKBLkiRJktQDBnRJkiRJknrAgC5JkiRJUg8Y0CVJkiRJ6oHlNqAnOTDJL5KcOM5+jkiyY9s+N8n0xVTfQUlWX1ztRuljTpJVxtPHGMfZO8kGS3ocSZIkSZqMltuADrwD2KWq9hxPJ1V1WFWdtZhqGnQQMJbgPdZ2w0oyFbipqh5c1D4Wwt6AAV2SJEmShrFcBvQkXwKeAZya5ANJfpbk0vZ3k9Zm7yTfSXJaknlJ3pnkPa3dz5Os09rNTLL7kP7fnuSogdf7JvnsCLU8Psn3ksxNcmWSGUkOpAuy5yQ5p7U7OsnsJFclObztG67dvQN9755kZtveo/U/N8mPB0rYGTiztdkpySWtzdlt3zrtOlze3vcL2v6PJDl4YKwrk0xt//0iybGt1llJVmvXaDpwYpLLkrwmySkD578yybdHuEb7tfc++657lsb3CJIkSZK09C2XAb2q9gduBrYHjga2raotgMOAjw80fR7wZmBL4GPA/a3dBcBbFjDE14HXJVm5vd4H/n/27jRcr6q+4/73B0EpgmGQWlFrJESjTBEig8ogUghqVZRBBRG0cNHWOrRYfQoiUFsHrFbrwGAlIqgIgiJKgjKLjAkJiRP0AXqp+FgRZHYA/s+Lex24OZyTnJDh3ufk+7muXGffa6+91n/vkze/e+29D6eM0ncWcFtVbV1VWwBzqurTQ/VV1ctbvyOraiawFbBLkq1G6Teao4E9q2pr4DXD5p+TZGPgZOANrc++bf+xwPVVtRXwL8CpS5kHYBrw2araHPhdG/Ms4DrggKqaAXwXeEGbF5ZwjarqpKqaWVUzJ6+30u/ElyRJkqSBWC0D+jCTgTOTLAY+CWzet+/iqrqnqn4D3AV8u7UvAqaMNmBV3QdcBLw6yXRgrapaNEr3RcDuST6aZKequmuUfvslmQ9c32p84dhO7xFXALOTHAqsCdCeO39WVd0M7ABcVlW3tHO4ox33MuDLre0iYKMkk5cy1y1VtaBtz2OEa1VV1cY9MMn6wI7A+ct4TpIkSZI0YRjQ4V/pBfEtgL8G1u7b94e+7Yf7Pj8MTFrKuF+g98z1klbPqaobgW3pBfUPJzl6eJ8kzwWOAF7RVrK/M6zOxwzZt/1In3bXwFHAs4EFSTYCdgJ+MDTNsGPpax9pjgd57P+f0a7bQ4x+rU4BDgTeBJxZVQ+O0k+SJEmSJjwDem8F/Zdt++AVNWhVXU0vDL8Z+Opo/dpbze+vqtOAjwPb
tF33AOu17acC9wF3JXk6vefGGaEfwK+TvCDJGsDeffNMraqrq+po4PZW2yweXbW+kt6t889t/Tds7ZcBB7S2XYHbq+pu4NahWpNsAzx36VflsbVW1W30btE/Cpg9huMlSZIkacJa2irw6uBjwJeS/CO929JXpK8DM6rqziX02RI4PsnDwJ+Av23tJwHnJ/lVVb08yfXAj4Cb6d2uzkj9gPcD5wE/BxYD67Z+xyeZRm9F/EJgIb1nzo8GqKrfJDkMOLuF+/8D/go4BjglyQ3A/cBb23jfAA5KsgC4FrhxDNdjNnBCkgeAHavqAeB0YOOq+vEYjpckSZKkCSu9R4G1MiQ5D/hkVV046FqGS/Is4OSq2mupnVduHZ+h9xK6/x5L/+dNmVyfO+qlK7kqSZKkJ2b3v/nuoEuQNA4kmddeAv4Y3uK+EiRZP8mNwANdDOcAVfWLDoTzefTeSn/aIOuQJEmSpC7wFveVoKp+Bzyvv629lG2ksP6KqvrtKimsY6pq20HXIEmSJEldYUBfRVoInzHoOiRJkiRJ3eQt7pIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoA/8yaxpWnPm0au//NdwddhiRJkiStcK6gS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYB/B13jyl2338R5X9xr0GVIkiQB8Oq3nT/oEiRNIK6gS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNCfgCTHJDli0HUAJDk8yUHLOca8JE9aUTUtYZ6Dk2yysueRJEmSpPFo0qALWF0lmVRVDy7vOFV1wnLWMQX4ZVX9cXlrGYODgcXAbatgLkmSJEkaV1xBH6MkRyb5WZLvA89vbVOTzGkr0Jcnmd7aZyc5obXdmOTVrf3gJGcm+TZwQWt7b5Jrk9yQ5NjW9pQk30myMMniJPu39o8k+XHr+/HW9shqfpIZSa5q+89JskFrvyTJR5Nc0+rZqe/U9gLmtH6zksxv817Y2jZM8s025lVJtho+b/u8OMmU9u8nSU5O8qMkFyT5syT7ADOB05MsSPKqJOf0Hf9XSc5e0b83SZIkSRovXEEfgyTbAm8EXkTvms0H5gEnAYdX1U1Jtgc+B+zWDpsC7AJMBS5Osllr3xHYqqruSLIHMA3YDghwbpKdgY2B26rqVW3+yUk2BPYGpldVJVl/hFJPBf6hqi5NchzwQeDdbd+kqtouyStb++6tfRbwniQbAycDO1fVLW0+gGOB66vqdUl2a3PMWMolmwa8qaoOTfJ14A1VdVqSdwBHVNV1SQL8R5KNq+o3wCHAKSMNluQw4DCAjTdaeylTS5IkSdL45Ar62OwEnFNV91fV3cC5wNrAS4AzkywATgSe0XfM16vq4aq6CbgZmN7av1dVd7TtPdq/6+mF/un0wu0iYPe26r1TVd0F3A38HvhCktcD9/cXmGQysH5VXdqavgTs3NdlaHV6Hr0vD2jPnT+rqm4GdgAuq6pbAPpqfBnw5dZ2EbBRm2tJbqmqBcPn61dV1cY9sH3ZsCNw/kiDVdVJVTWzqmZOXnelPyovSZIkSQPhCvrY1bDPawC/q6rRVpOH9x/6fF9fW4APV9WJww9uq/avBD6c5IKqOi7JdsAr6K3mv4NHV+vH4g/t50M8+nvfCfhBXy3Dax5qH66AB3nsFzz9S9t/6Nt+CPizUWo6Bfg2vS8ezlwRz+RLkiRJ0njlCvrYXAbs3Z6lXg/4a3or2Lck2RcgPVv3HbNvkjWSTAU2BX42wrhzgbclWbeN8cwkf97edH5/VZ0GfBzYpvWZXFXfpXfb+mO+GGir7Hf2PV/+FuBSlmwWj65aXwns
kuS5rZahW9wvAw5obbsCt7e7CG4Ftmnt2wDPXcpcAPcA6/XVfBu9F8YdBcwew/GSJEmSNGG5gj4GVTU/yRnAAuB/gcvbrgOAzyc5ClgL+BqwsO37Gb2A/HR6z6n/vvfY9WPGvSDJC4Ar2757gQOBzYDjkzwM/An4W3rB9ltJ1qa3qv2eEUp9K3BCknXo3VZ/yFJObVfg6FbLb9qz3mcnWQP4P+CvgGOAU5LcQO9Libe2Y78BHNRu778WuHEpc0EvhJ+Q5AFgx6p6ADgd2LiqfjyG4yVJkiRpwkrvUWCtSElmA+dV1VmDrmU0SZ4FnFxVew24js/Qewndf4+l/7Qpk+uTR79kJVclSZI0Nq9+24iv0JGkJUoyr6pmDm93BX01VVW/oPcn1gYmyTx6z+T/0yDrkCRJkqQuMKCvBFV18KBrGA+qattB1yBJkiRJXeFL4iRJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgdMGnQB0rKY/LRpvPpt5w+6DEmSJEla4VxBlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gD/DrrGlTtvv4mzTpk16DIkSVLH7XPInEGXIEnLzBV0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA/o4k+SYJEes4DFPTPLSFTnmKPPsmuQlK3seSZIkSRqPDOgC2B64ahXMsytgQJckSZKkERjQOy7JQUluSLIwyZeH7Ts0ybVt3zeSrNPa902yuLVf1to2T3JNkgVtvGmt/QXAjVX1UJLNkny/HTc/ydT0HN/GW5Rk/3bcrknO66vlM0kObtu3Jjm2jbEoyfQkU4DDgfe0GnZKckuStdoxT23HrbWyr6kkSZIkdZEBvcOSbA4cCexWVVsD7xrW5eyqenHb9xPg7a39aGDP1v6a1nY48KmqmgHMBH7R2vcC5rTt04HPtuNeAvwKeD0wA9ga2B04PskzxlD+7VW1DfB54IiquhU4AfhkVc2oqsuBS4BXtf5vBL5RVX8a4TocluS6JNfdfe8fxzC1JEmSJI0/BvRu2w04q6puB6iqO4bt3yLJ5UkWAQcAm7f2K4DZSQ4F1mxtVwL/kuR9wHOq6oHWvicwJ8l6wDOr6pw21++r6n7gZcBXq+qhqvo1cCnw4jHUfnb7OQ+YMkqfLwCHtO1DgFNG6lRVJ1XVzKqa+dR1nzSGqSVJkiRp/DGgd1uAWsL+2cA7qmpL4FhgbYCqOhw4Cng2sCDJRlX1FXqr6Q8Ac5Ps1m6JX7+qbmtzjVbDSB7ksf9/1h62/w/t50PApJEGqKorgClJdgHWrKrFo56pJEmSJE1wBvRuuxDYL8lGAEk2HLZ/PeBX7bntA4Yak0ytqqur6mjgduDZSTYFbq6qTwPnAlsBLwcuBqiqu4FfJHldG+PJLcBfBuyfZM0kGwM7A9cA/wu8sPWbDLxiDOdzT6u536nAVxll9VySJEmSVhcG9A6rqh8B/wZcmmQh8IlhXT4AXA18D/hpX/vx7eVsi+kF7IXA/sDiJAuA6fSCcf/z5wBvAd6Z5Abgh8BfAOcAN7QxLgL+uar+v6r6OfD1tu904PoxnNK3gb2HXhLX2k4HNqAX0iVJkiRptZWqJd1BrYksyXxg+5FezLYKa9gHeG1VvWUs/adOmVwf/eCOK7kqSZI03u1zyJyld5KkAUkyr6pmDm8f8dlgrR7aW9YHJsl/0VvFf+Ug65AkSZKkLjCga2Cq6h8GXYMkSZIkdYXPoEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmS
JHWAAV2SJEmSpA6YNOgCpGWxwdOmsc8hcwZdhiRJkiStcK6gS5IkSZLUAQZ0SZIkSZI6wIAuSZIkVlzGzAAAIABJREFUSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYB/B13jyh2/vYnTZu856DIkSVKHHXjw3EGXIElPiCvokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBvRhkhyT5IhB1wGQ5PAkBy3nGPOSPGlF1TTC+Mcl2X1ljS9JkiRJq4tJgy5gIkoyqaoeXN5xquqE5axjCvDLqvrjGPsvc91VdfQTKE2SJEmSNIwr6ECSI5P8LMn3gee3tqlJ5rQV6MuTTG/ts5Oc0NpuTPLq1n5wkjOTfBu4oLW9N8m1SW5Icmxre0qS7yRZmGRxkv1b+0eS/Lj1/Xhre2Q1P8mMJFe1/eck2aC1X5Lko0muafXs1HdqewFzWr97k/xHkvlJLkyycd/x/57kUuBdSZ7T9t/Qfv5lkslJbk2yRjtmnSQ/T7JWux77tPZbkxzb5ljUd83WTXJKa7shyRta+x5Jrmz9z0yy7kr6FUuSJElS5632AT3JtsAbgRcBrwde3HadBPxDVW0LHAF8ru+wKcAuwKuAE5Ks3dp3BN5aVbsl2QOYBmwHzAC2TbIzMAu4raq2rqotgDlJNgT2Bjavqq2AD41Q6qnA+9r+RcAH+/ZNqqrtgHcPa59FC+jAU4D5VbUNcOmwfutX1S5V9R/AZ4BT2zynA5+uqruAhe2cAf4amFtVfxqhztvbHJ9v1w3gA8BdVbVlG/eiJE8DjgJ2b/2vA/5xhPFIcliS65Jcd/c9Y7oZQJIkSZLGndU+oAM7AedU1f1VdTdwLrA28BLgzCQLgBOBZ/Qd8/WqeriqbgJuBqa39u9V1R1te4/273pgfuszjV643r2teu/Uwu/dwO+BLyR5PXB/f4FJJtML0Ze2pi8BO/d1Obv9nEfvywPac+fPqqqb276HgTPa9mnAy/qOP6Nve0fgK237y339zgD2b9tvHHZMv8fVAuwOfHaoQ1XdCewAvBC4ol3jtwLPGWnAqjqpqmZW1cynrrfSHqeXJEmSpIHyGfSeGvZ5DeB3VTVjjP2HPt/X1xbgw1V14vCD26r9K4EPJ7mgqo5Lsh3wCnrh9x3AbstQ/x/az4d49He6E/CDJRzTfw73jdrr0X7ntno3BLYFLlqGWsLjr1nofaHxpiXMLUmSJEmrDVfQ4TJg7yR/lmQ9erdv3w/ckmRfgPRs3XfMvknWSDIV2BT42QjjzgXeNvRcdZJnJvnzJJsA91fVacDHgW1an8lV9V16t6k/5ouBtsp+Z9/z5W+hd5v6kswCzu/7vAawT9t+M6OH9x/S+5IA4IChflV1L3AN8CngvKp6aCnz97uA3pcOALTn568CXppks9a2TpLnLcOYkiRJkjShrPYr6FU1P8kZwALgf4HL264DgM8nOQpYC/gaveewoRfILwWeDhxeVb9PMnzcC5K8ALiy7bsXOBDYDDg+ycPAn4C/BdYDvtWeZQ/wnhFKfSu9593XoXdb/SFLObVdgf43rN8HbJ5kHnAXj96uPtw7gS8meS/wm2HznAGc2cZeFh8CPptkMb2V9WOr6uwkBwNfTfLk1u8o4MZlHFuSJEmSJoRUDb/zWEuSZDa9FeSzBl3LaJI8Czi5qvbqa7u3qsb9W9I3fe7kOu6DOwy6DEmS1GEHHjx30CVI0hIlmVdVM4e3r/Yr6BNRVf2C3p9YkyRJkiSNEwb0ZVRVBw+6hidiIqyeS5IkSdJE5kviJEmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSB0wadAHSsthwo2kc
ePDcQZchSZIkSSucK+iSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkd4N9B17hy+29v4r9P3XPQZUiSpJXo7QfNHXQJkjQQrqBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0FczSY5JcsQKHvPEJC8dZd8mSc5q2zOSvHJFzi1JkiRJE4UBXSvC9sBVI+2oqtuqap/2cQZgQJckSZKkERjQJ7gkByW5IcnCJF8etu/QJNe2fd9Isk5r3zfJ4tZ+WWvbPMk1SRa08aa19hcAN1bVQ0k2S/L9dtz8JFOTTGljPQk4Dti/jbF/kpuSbNzGWSPJ/yR52iq9QJIkSZLUEQb0CSzJ5sCRwG5VtTXwrmFdzq6qF7d9PwHe3tqPBvZs7a9pbYcDn6qqGcBM4BetfS9gTts+HfhsO+4lwK+GJqqqP7Zxz6iqGVV1BnAacEDrsjuwsKpuH+E8DktyXZLr7rnnj0/oWkiSJElS1xnQJ7bdgLOGQm9V3TFs/xZJLk+yiF5Q3ry1XwHMTnIosGZruxL4lyTvA55TVQ+09j2BOUnWA55ZVee0uX5fVfcvpb4vAge17bcBp4zUqapOqqqZVTVzvfWeNIbTliRJkqTxx4A+sQWoJeyfDbyjqrYEjgXWBqiqw4GjgGcDC5JsVFVfobea/gAwN8lu7Zb49avqtjbXMqmqnwO/TrIbvefYz1/WMSRJkiRpojCgT2wXAvsl2QggyYbD9q8H/CrJWjx6qzlJplbV1VV1NHA78OwkmwI3V9WngXOBrYCXAxcDVNXdwC+SvK6N8eShZ9r73NPm7PcFere6f72qHlruM5YkSZKkccqAPoFV1Y+AfwMuTbIQ+MSwLh8Arga+B/y0r/34JIuSLAYuAxYC+wOLkywApgOn8tjnzwHeArwzyQ3AD4G/GDbfxcALh14S19rOBdZllNvbJUmSJGl1kaol3QEtjS7JfGD7qvrTcowxE/hkVe00lv5Tnju5PnDsDk90OkmSNA68/aC5gy5BklaqJPOqaubw9kmDKEYTQ1VtszzHJ3k/8Lf03V4vSZIkSasrb3HXwFTVR6rqOVX1g0HXIkmSJEmDZkCXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdMGnQBUjL4mkbTePtB80ddBmSJEmStMK5gi5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQB/h10jSv/d8dNfPa0PQddhiRJWoq/P3DuoEuQpHHHFXRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gAD+gAkOSbJEYOuAyDJ4UkOWs4x5iV50ij7XpPk/W37dUleuDxzSZIkSdJENWnQBeiJSTKpqh5c3nGq6oTlrGMK8Muq+uMo458LnNs+vg44D/jx8swpSZIkSRORK+irSJIjk/wsyfeB57e2qUnmtBXoy5NMb+2zk5zQ2m5M8urWfnCSM5N8G7igtb03ybVJbkhybGt7SpLvJFmYZHGS/Vv7R5L8uPX9eGt7ZDU/yYwkV7X95yTZoLVfkuSjSa5p9ezUd2p7AXNav1lJ5rd5L+yr+TNJXgK8Bjg+yYJ27vP7rs+0JPNW1vWXJEmSpK5zBX0VSLIt8EbgRfSu+XxgHnAScHhV3ZRke+BzwG7tsCnALsBU4OIkm7X2HYGtquqOJHsA04DtgADnJtkZ2Bi4rape1eaf
nGRDYG9gelVVkvVHKPVU4B+q6tIkxwEfBN7d9k2qqu2SvLK1797aZwHvSbIxcDKwc1Xd0uZ7RFX9MMm5wHlVdVar664kM6pqAXAIMHuU63cYcBjABhutPdplliRJkqRxzRX0VWMn4Jyqur+q7qZ3y/fawEuAM5MsAE4EntF3zNer6uGqugm4GZje2r9XVXe07T3av+vphf7p9AL7ImD3tuq9U1XdBdwN/B74QpLXA/f3F5hkMrB+VV3amr4E7NzX5ez2cx69Lw9oz50/q6puBnYALquqWwD6alySLwCHJFkT2B/4ykidquqkqppZVTPXfeqIj7pLkiRJ0rjnCvqqU8M+rwH8rqpmjLH/0Of7+toCfLiqThx+cFu1fyXw4SQXVNVxSbYDXkFvNf8dPLpaPxZ/aD8f4tH/NzsBP+irZXjNS/MNeqvxFwHzquq3y3i8JEmSJE0YrqCvGpcBeyf5syTrAX9NbwX7liT7AqRn675j9k2yRpKpwKbAz0YYdy7wtiTrtjGemeTPk2wC3F9VpwEfB7ZpfSZX1Xfp3bb+mC8G2ir7nX3Pl78FuJQlmwWc37avBHZJ8txWy4Yj9L8HWK9vzt+3c/g8cMpS5pIkSZKkCc0V9FWgquYnOQNYAPwvcHnbdQDw+SRHAWsBXwMWtn0/oxeQn07vOfXfJxk+7gVJXgBc2fbdCxwIbEbvZWwPA38C/pZeMP5WkrXprXa/Z4RS3wqckGQderfVH7KUU9sVOLrV8pv2rPjZSdYA/g/4q2H9vwacnOSdwD5V9f8CpwOvp730TpIkSZJWV6la1ruStbIlmU3fy9S6KMmzgJOraq/lHOcIeiv7HxhL/7/cdHK977gdlmdKSZK0Cvz9gXMHXYIkdVaSeVU1c3i7K+h6QqrqF/T+xNoTluQcem+pX5Zn4SVJkiRpQjKgd1BVHTzoGlaFqtp70DVIkiRJUlf4kjhJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBkwZdgLQs/nzDafz9gXMHXYYkSZIkrXCuoEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAfwdd48qv77iJj391z0GXIUmSRnHEm+YOugRJGrdcQZckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAeMqoCc5JskRI7RPSbK4bc9M8ulVX93jJTk8yUGDrmNpkrwpyZGrYJ4pSd68sueRJEmSpPFo0qALWNGq6jrgulU1X5JJVfXgKLWcsKrqWE6zgFXxpcYU4M3AV1bBXJIkSZI0rgx0Bb2tqP40yZeS3JDkrCTrJLk1ydNan5lJLuk7bOskFyW5KcmhI4y5a5Lz2va6SU5JsqiN/4ZR6lgzyewki1vf97T2qUnmJJmX5PIk01v77CSfSHIxcHyrd/2+8f4nydP7V/yTbJbk+0kWJpmfZGprf2+Sa1t9xy7hWj0lyXfa8YuT7N/aR7xWbe4vJbmg9Xl9ko+185uTZK3WL8AMYP5o16utsC9q8360r6Z7+7b3STK77/p8OskPk9ycZJ/W7SPATkkWJHlPu6Yz+sa4IslWI5z7YUmuS3Ldvff8cbRLJEmSJEnjWhdW0J8PvL2qrkjyReDvltJ/K2AH4CnA9Um+s4S+HwDuqqotAZJsMEq/GcAzq2qL1m8obJ8EHF5VNyXZHvgcsFvb9zxg96p6KMkawN7AKa3frVX16172fcTpwEeq6pwkawNrJNkDmAZsBwQ4N8nOVXXZCDXOAm6rqle1Gicv4byHTAVeDrwQuBJ4Q1X9c5JzgFcB3wReBCysqkryuOuVZBPgo8C2
wJ3ABUleV1XfXMrczwBeBkwHzgXOAt4PHFFVr27j3wEcDLw7yfOAJ1fVDcMHqqqT6P0uePamk2sM5y1JkiRJ404XnkH/eVVd0bZPoxfqluRbVfVAVd0OXEwv3I5md+CzQx+q6s5R+t0MbJrkv5LMAu5Osi7wEuDMJAuAE+mFziFnVtVDbfsMYP+2/cb2+RFJ1qP3BcA5rY7fV9X9wB7t3/XAfHphdtooNS4Cdk/y0SQ7VdVdSzjvIedX1Z/asWsCc/rGmtK2ZwHnt+2RrteLgUuq6jftVv7TgZ3HMPc3q+rhqvox8PRR+pwJvLqt5r8NmD2GcSVJkiRpQurCCvrwFdECHuTRLw/WHkP/0WQp+3sDVN2ZZGtgT+Dvgf2AdwO/q6oZoxx2X9/2lcBmSTYGXgd8aIQ6Rqvvw1V14hhqvDHJtsArgQ8nuaCqjmPJ1+oP7diHk/ypqoauxcM8+rvfAxi69X+k6zVa7QzrO+LcSxqjqu5P8j3gtfSu+cwlzCVJkiRJE1oXVtD/MsmObftNwA+AW+ndUg2Phschr02ydpKNgF2Ba5cw9gXAO4Y+jHaLe3uGe42q+ga92+K3qaq7gVuS7Nv6pIX4x2nB9xzgE8BPquq3w/bfDfwiyevaWE9Osg4wF3hbW60nyTOT/PkoNW4C3F9VpwEfB7Zpu25l9Gu1RO02+Ul99Y50va4GdknytCRr0vsdXdq6/DrJC/pu8V+ae4D1hrV9gd4L6q6tqjuWpX5JkiRJmki6ENB/Arw1yQ3AhsDngWOBTyW5HHhoWP9rgO8AVwH/WlW3LWHsDwEbtJebLaT3PPZInglc0m5lnw38P639AODt7dgf0VvpHc0ZwIEMu729z1uAd7bz/CHwF1V1Ab03ml+ZZBG957SHB9ghWwLXtBqP5NFV+iVdq6X5K+D7fZ8fd72q6lf0rsfFwEJgflV9q/V/P3AecBHwqzHMdwPwYHvR3XsAqmoecDdwyjLWLkmSJEkTSh6963kAkydTgPOGXs6mVSvJF4AvVNVVA6xhE+ASYHpVPby0/s/edHK96992WOl1SZKkJ+aIN80ddAmS1HlJ5lXV4x7x7cIKugakqv5mwOH8IHq30B85lnAuSZIkSRPZQF8SV1W3Aqt09TzJ1cCThzW/paoWrco6RtOerb9whF2vGP5s+3hXVacCpw66DkmSJEnqgi68xX2VqqrtB13DkrQQPtqb4yVJkiRJE5S3uEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDVrs/s6bx7ekbTuOIN80ddBmSJEmStMK5gi5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQB/h10jSu33XkTx3x9z0GXIUmSRnDMfnMHXYIkjWuuoEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqgAkZ0JO8M8lPkpy+nOMcl2T3tn1JkpkrqL53J1lnRfVbyhjzkjxpecZYyviPXCNJkiRJ0hM3adAFrCR/B+xVVbcszyBVdfQKqme4dwOnAfevoH4jSjIF+GVV/XGM/SdV1YPLMsdKvEaSJEmStFqZcCvoSU4ANgXOTfK+JD9Mcn37+fzW5+Ak30zy7SS3JHlHkn9s/a5KsmHrNzvJPsPGf3uST/Z9PjTJJ0ap5SlJvpNkYZLFSfZP8k5gE+DiJBe3fp9Pcl2SHyU5trWN1O/evrH3STK7be/bxl+Y5LK+EvYC5gwdm+Q/ksxPcmGSjVv7JUn+PcmlwLuSPKftv6H9/Mskk5PcmmSNdsw6SX6eZK3+a9T6HNvmWJRkemtfN8kpre2GJG9o7XskubL1PzPJuqNcx8Pa9bnu
/rvH9F2DJEmSJI07Ey6gV9XhwG3Ay4HPAztX1YuAo4F/7+u6BfBmYDvg34D7W78rgYOWMMXXgNckWat9PgQ4ZZS+s4DbqmrrqtoCmFNVnx6qr6pe3vodWVUzga2AXZJsNUq/0RwN7FlVWwOvGTb/nLb9FGB+VW0DXAp8sK/f+lW1S1X9B/AZ4NSq2go4Hfh0Vd0FLAR2af3/GphbVX8aoZbb2xyfB45obR8A7qqqLdu4FyV5GnAUsHvrfx3wjyOdXFWdVFUzq2rmOk9daXfrS5IkSdJATbiAPsxk4Mwki4FPApv37bu4qu6pqt8AdwHfbu2LgCmjDVhV9wEXAa9uK8RrVdWiUbovAnZP8tEkO7WgO5L9kswHrm81vnBsp/eIK4DZSQ4F1gRoz50/q6pubn0eBs5o26cBL+s7/oy+7R2Br7TtL/f1OwPYv22/cdgx/c5uP+fx6HXcHfjsUIequhPYgd55XpFkAfBW4DlLOU9JkiRJmrAm6jPoQ/6VXhDfuz2PfUnfvj/0bT/c9/lhln5dvgD8C/BTRl89p6puTLIt8Ergw0kuqKrj+vskeS69leYXV9Wd7bb1tUcbsm/7kT5VdXiS7YFXAQuSzABmAD9Ywjn0j3XfGPqd285hQ2Bbel9SjGToOj7Eo9cxw+YbavteVb1pCXNLkiRJ0mpjdVhB/2XbPnhFDVpVVwPPpneL/FdH65dkE3q3zp8GfBzYpu26B1ivbT+VXkC+K8nT6T03zgj9AH6d5AXtWfC9++aZWlVXtxe23d5qmwWc33fsGsDQ8/RvZvTw/kN6K+QABwz1q6p7gWuATwHnVdVDo533CC4A3tFX7wbAVcBLk2zW2tZJ8rxlGFOSJEmSJpSJHtA/Rm/V9wrard8r0NeBK9rt2qPZErim3cJ9JPCh1n4ScH6Si6tqIb1b238EfJHe7eoM79c+vx84j97q9a/6+h3fXsC2GLiM3vPiu9J71nzIfcDmSeYBuwGPWcnv807gkCQ3AG8B3tW37wzgQEa/vX00HwI2GHqRHb3n6n9D70uTr7a5rgKmL+O4kiRJkjRhpGr4nccaiyTnAZ+sqgsHXctwSZ4FnFxVe/W13VtVI74lfTzZZOrkOuzDOwy6DEmSNIJj9ps76BIkaVxIMq+9KPwxJvoK+gqXZP0kNwIPdDGcA1TVL/rDuSRJkiSp+yb6S+JWuKr6HfCYZ6WTbASMFNZfUVW/XSWFLcVEWD2XJEmSpInMgL4CtBA+Y9B1SJIkSZLGL29xlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgf4Z9Y0rmyywTSO2W/uoMuQJEmSpBXOFXRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gH9mTePKz++8iXd/Y9agy5AkqdP+8w1zBl2CJOkJcAVdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0DskyTFJjljBY56Y5KUrcsxh478myftX1viSJEmStLowoE982wNXjaVjkknLOnhVnVtVH1nmqiRJkiRJj2FAH6AkByW5IcnCJF8etu/QJNe2fd9Isk5r3zfJ4tZ+WWvbPMk1SRa08aa19hcAN1bVQ0kuSfKfSX7Yjt+u9TkmyUlJLgBOTbJ2klOSLEpyfZKXt35XJ9m8r75Lkmyb5OAkn2lts5N8us1xc5J9+vr/cxtzYZKPtLapSeYkmZfk8iTTV+b1liRJkqQuW+YVU60YLeweCby0qm5PsiHwzr4uZ1fVya3vh4C3A/8FHA3sWVW/TLJ+63s48KmqOj3Jk4A1W/tewJy+MZ9SVS9JsjPwRWCL1r4t8LKqeiDJPwFU1ZYtMF+Q5HnA
14D9gA8meQawSVXNS7LlsFN7BvAyYDpwLnBWkr2A1wHbV9X97VwBTgIOr6qbkmwPfA7YbYRrdRhwGMB6T1t7KVdWkiRJksYnV9AHZzfgrKq6HaCq7hi2f4u2qrwIOAAYWr2+Apid5FAeDeJXAv+S5H3Ac6rqgda+J48N6F9tc10GPLUv4J/bd8zLgC+3fj8F/hd4HvB1YN/WZz/gzFHO65tV9XBV/Rh4emvbHTilqu4fOtck6wIvAc5MsgA4kV64f5yqOqmqZlbVzD976pNGmVaSJEmSxjcD+uAEqCXsnw28o6q2BI4F1gaoqsOBo4BnAwuSbFRVXwFeAzwAzE2yW7slfv2quq1vzOHzDX2+b1hdj1NVvwR+m2QrYH96K+oj+cMIY410rmsAv6uqGX3/XjDKmJIkSZI04RnQB+dCYL8kGwH03fY9ZD3gV0nWoreCTus3taqurqqjgduBZyfZFLi5qj5N77byrYCXAxcPG3P/NsbLgLuq6q4R6rpsaL52a/tfAj9r+74G/DMwuaoWLcO5XgC8re85+g2r6m7gliT7trYk2XoZxpQkSZKkCcWAPiBV9SPg34BLkywEPjGsyweAq4HvAT/taz++vWxtMb0wvZBe8F7cbhWfDpzK458/B7gzyQ+BE+g90z6SzwFrtlvrzwAOrqqhVfGzgDfSu919Wc51Dr0vDq5rNQ79KbkDgLe38/8R8NplGVeSJEmSJpJULekua41XSebTeynbn9rnS4Ajquq6gRa2nJ4+dXK96WM7DroMSZI67T/fMPw7eklSlySZV1Uzh7f7FvcJqqq2GXQNkiRJkqSxM6CvJqpq10HXIEmSJEkanc+gS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDpg06AKkZfHsDabxn2+YM+gyJEmSJGmFcwVdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgc8Sqz0AAAgAElEQVQY0CVJkiRJ6gADuiRJkiRJHeCfWdO4cvPvbmK/b80adBmSJA3E11/rnxqVpInMFXRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAzoT0JMck+SIEdqnJFnctmcm+fSqr+7xkhye5KBB17E0Sd6U5MiVOP4mSc5aWeNLkiRJ0upi0qALWBZVdR1w3aqaL8mkqnpwlFpOWFV1LKdZwJi+1EiyZlU9tCyDV9VtwD5PpDBJkiRJ0qNW2gp6W/n+aZIvJbkhyVlJ1klya5KntT4zk1zSd9jWSS5KclOSQ0cYc9ck57XtdZOckmRRG/8No9SxZpLZSRa3vu9p7VOTzEkyL8nlSaa39tlJPpHkYuD4Vu/6feP9T5Kn96/4J9ksyfeTLEwyP8nU1v7eJNe2+o5dwrV6SpLvtOMXJ9m/tY94rdrcX0pyQevz+iQfa+c3J8larV+AGcD8dsyXh1/fdk0vTvIVYFFr+8dWx+Ik725tH03yd301H5Pkn4bd4XBwkrNbDTcl+Vhf/1nt2ixMcmHfeX+xXaPrk7x2lOtzWJLrklz3h7v/ONpllCRJkqRxbWWvoD8feHtVXZHki8DfLaX/VsAOwFOA65N8Zwl9PwDcVVVbAiTZYJR+M4BnVtUWrd9Q2D4JOLyqbkqyPfA5YLe273nA7lX1UJI1gL2BU1q/W6vq173s+4jTgY9U1TlJ1gbWSLIHMA3YDghwbpKdq+qyEWqcBdxWVa9qNU5ewnkPmQq8HHghcCXwhqr65yTnAK8Cvgm8CFhYVdXqHe36bgdsUVW3JNkWOATYvtV9dZJLga8B/9muE8B+re7hX/LMaPP+AfhZkv8Cfg+cDOzc5tiw9T0SuKiq
3tZ+L9ck+X5V3dc/YFWdRO/3xYabTf7/2bvTMMuq8oz7/1taJkFmh6DS2pCgqLTQgCAgGkRwAm0QEQcGw2sSp0QxJsYIaEIQNS+KRkCFiKjIpARkEGWSmYbuBoJCBHwdiIoyCCpg+7wfzio5ljWcpru6dlX9f9fVV+1ae+21nr2rv9xn7b1PDXBtJEmSJGnKmehn0H9YVZe17S8C24/T/+tV9Zuqugu4kF5wHM3OwKeGfqmqu0fpdxvwjCSfTLIrcF+SNYDtgFOSLASOAZ7cd8wpfbd6nwzs3bZf137/gyRr0vsA4IxWx2+r6tfALu3f9cB1wKb0AvtIbgB2bqvUO1TVvWOc95BzqurhduxKwLl9Y81u27sC5/QdM9r1vbqqbm/b2wNnVNUDVXU/cDqwQ1VdDzwhvWfONwfurqr/b4S6vlVV91bVb4H/ATai96HAJUNzVNUvW99dgPe1v8FFwKrA0wY4d0mSJEmadiZ6BX34amcBv+ORDwZWHaD/aDLO/t4AVXe3QPlS4G/prfy+C7inquaOclj/Cu4VwMZJNgD2AD48Qh2j1Xd4VR0zQI23tJXrlwGHJzm/qg5j7Gv1YDv290kerqqha/F7Hvm77gL03/o/2vXtP9/RzgfgVHrPmz+J3or6SB7s217SahntbxV6K//fG2NOSZIkSZoRJnoF/WlJtm3b+wDfAe4Atmxtw58b3z3JqknWA3YCrhlj7POBtw39Mtot7u0Z7sdU1Wn0bovfoqruA25Pslfrkxbi/0QLvmcAHwdurqpfDNt/H/CjJHu0sVZJsjpwHnBAW60nyYZJnjBKjX8G/Lqqvgh8FNii7bqD0a/VmNpt8rOG1TvI9b0E2CO99wU8jt7t/Ze2fV+hdxfBnvTC+qCuAF6Y5OmttqFb3M8D3t6elSfJ85ZiTEmSJEmaViY6oN8MvDnJYmBd4D+BQ4GjklxKb4W139XA2cCVwIfaG8JH82FgnfYis0X0nsceyYbARe026hOAf2zt+wIHtmNvAkZ8QVlzMvAGht3e3ueNwDvaeV4OPKmqzge+BFyR5AZ6gXbNUY5/Dr3nrxfSey57aJV+rGs1npcAFwxrG/f6VtV19K7T1cBVwGfb7e1U1U3tHH5cVXcOWkhV/Rw4CDi9Xe+h6/gh4LHA4vaiuQ8NfHaSJEmSNM3kkTujl/PAyWzgrKGXs2nFSvJZeuH6yvb7IcD9VfXRSS1sGa278Vq188e2Hb+jJEnT0Fd3P3f8TpKkzkuyoKrmDW+fUt+DrsFV1VsmuwZJkiRJ0uAmLKBX1R3ACl09T3IVsMqw5jdW1Q0rso7RtGe/vzXCrr8c/mz78lZVh0zk+JIkSZKkZTOtVtCrapvJrmEsLYSP9uZ4SZIkSdIMNtEviZMkSZIkSQMwoEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdMK2+Zk3T3zPW3oSv7n7uZJchSZIkScudK+iSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAL9mTVPKrffcwW5fP3Cyy5AkaYU6Z/fPTXYJkqQVwBV0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJ9BkhyS5D3LecxjkrxgeY4pSZIkSTORAV3LahvgyskuQpIkSZKmOgP6NJbkTUkWJ1mU5MRh+/4qyTVt32lJVm/teyW5sbVf0to2S3J1koVtvE1a+zOBW6pqyRjjzUlyZdt3WJL7+2o4uLUvTnLoCrswkiRJktRBBvRpKslmwPuBF1fV5sA7h3U5vaq2avtuBg5s7f8CvLS1v6q1vRU4qqrmAvOAH7X23YBzxxnvqHbsVsBP+urbBdgE2BqYC2yZZMdRzuWg
JNcmufah+3671NdCkiRJkqYCA/r09WLg1Kq6C6Cqfjls/7OTXJrkBmBfYLPWfhlwQpK/AlZqbVcA/5TkH4CNquo3rf2lPBLQRxtvW+CUtv2lvvl3af+uB64DNqUX2P9EVR1bVfOqat7Kj1918CsgSZIkSVPIrMkuQBMmQI2x/wRgj6palGQ/YCeAqnprkm2AlwMLk8ytqi8luaq1nZfkLfSeO1+7qn4y1njj1Hd4VR3zKM5NkiRJkqYdV9Cnr28Br02yHkCSdYftXxO4M8lj6a140/rNqaqrqupfgLuApyZ5BnBbVX0COBN4LvAi4MLxxqMX5Oe37df1tZ8HHJBkjTbvhkmesExnLEmSJElTmCvo01RV3ZTkX4GLkyyhdyv5HX1dPgBcBfwAuIFewAY4sr0ELvRC/iLgfcAbkjwM/B9wWPt36gDjvQv4YpJ3A2cD97b6zm8vmbsiCcD9wBuAny2nSyBJkiRJU0qqxroLWhpZkuuAbarq4XH6rQ78pqoqyeuAfapq90c771obr1/bfexRHy5J0pR0zu6fm+wSJEnLUZIFVTVveLsr6HpUqmqLAbtuCRyd3jL5PcABE1eVJEmSJE1dBnRNqKq6FNh8suuQJEmSpK7zJXGSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDZk12AdLS2GTt2Zyz++cmuwxJkiRJWu5cQZckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIH+DVrmlJuvedHvOxr/zDZZUiSNOG+sccRk12CJGkFcwVdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gAD+giS3D9An8tXRC0rSpJjkrxgBcyzU5LtJnoeSZIkSZpqDOiPUlUtc8hMMmt51LKcbANcuQLm2QkwoEuSJEnSMAb0cSQ5OMk1SRYnObSv/f7288lJLkmyMMmNSXbo39+290xyQts+IcnHk1wIHJHkcUk+3+a4PsnuY9SyWZKr21yLk2ySZHaSG/v6vCfJIW37oiT/0eq7OclWSU5PcmuSD/cd80zglqpakmTjJBckWZTkuiRz0nNkO78bkuzdjtspyVl94xydZL+2fUeSQ9sYNyTZNMls4K3A37Vz2CHJ7Uke2455fDvuscPO+6Ak1ya59qH7frNUfz9JkiRJmiq6tILbOUl2ATYBtgYCnJlkx6q6pK/b64Hzqupfk6wErD7A0H8O7NwC8b8B366qA5KsDVyd5IKqemCE494KHFVVJyVZGVgJeOI4cz1UVTsmeSfwdWBL4JfA95P8R1X9AtgNOLf1Pwn496o6I8mq9D7EeQ0wF9gcWB+4JsklwycawV1VtUWSvwHeU1VvSfIZ4P6q+ij0PkQAXg58DXgdcFpVPdw/SFUdCxwLsNbGT6oB5pUkSZKkKccV9LHt0v5dD1wHbEovsPe7Bti/rVo/p6p+NcC4p1TVkr453pdkIXARsCrwtFGOuwL4pyT/AGxUVYMsJ5/Zft4A3FRVd1bVg8BtwFPbvpcC5yZZE9iwqs4AqKrfVtWvge2BL1fVkqr6KXAxsNUAc5/efi4AZo/S57PA/m17f+D4AcaVJEmSpGnHgD62AIdX1dz2b+Oq+lx/h7aaviPwY+DEJG8a2tXXbdVh4/avjgeY3zfH06rq5pGKqaovAa8CfgOcl+TFwO/447/j8LkebD9/37c99PusJKsDa1fVT1otIxmtfdC5lzDK3RpVdRkwO8kLgZWq6saR+kmSJEnSdGdAH9t5wAFJ1gBIsmGSJ/R3SLIR8LOqOg74HLBF2/XTJM9M8hjg1ePM8fYkaeM9b7SO
SZ4B3FZVn6C3Mv5c4KfAE5Ksl2QV4BVLeY4vAi4EqKr7gB8l2aPNt0oL8JcAeydZKckG9D6QuBr4AfCs1m8t4C8HmO9XwJrD2r4AfBlXzyVJkiTNYAb0MVTV+cCXgCuS3ACcyp+Gy52AhUmuB+YDR7X29wFnAd8G7hxjmg8BjwUWt5e9fWiMvnsDN7bb4TcFvtCe1z4MuKrN992BT7Cn//lzgDcC70iyGLgceBJwBrAYWNTO571V9X9V9UPgq23fSfQeBRjPfwOvHnpJXGs7CViHXkiXJEmSpBkpVb5zayZLch2wzfAXs63gGvYEdq+qN47Xd62Nn1Qv+OibV0BVkiRNrm/sccRklyBJmiBJFlTVvOHtvsV9hquqLcbvNXGSfJLeKv7LJrMOSZIkSZpsBvQOSvJSYPjH5rdX1VjPsk9JVfX2ya5BkiRJkrrAgN5BVXUevZfHSZIkSZJmCF8SJ0mSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAb3HXlLLJ2k/hG3sM/wY6SZIkSZr6XEGXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAL8HXVPKrffcycvO+PBklyFJ0oT6xqv/ebJLkCRNAlfQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woC9nSe4foM/lK6KWFSXJMUleMNl1SJIkSdJUZkCfBFW13bKOkWTW8qhlOdkGuHKyi5AkSZKkqcyAPoGSHJzkmiSLkxza135/+/nkJJckWZjkxiQ79O9v23smOaFtn5Dk40kuBI5I8rgkn29zXJ9k9zFq2SzJ1W2uxUk2STI7yY19fd6T5JC2fVGS/2j13ZxkqySnJ7k1yYf7jnkmcEtVLUnyV62WRUlOS7J66zMnyZVt32HDzm/EazSs9oOSXJvk2ofue2Dp/giSJEmSNEUY0CdIkl2ATYCtgbnAlkl2HNbt9cB5VTUX2BxYOMDQfw7sXFXvBt4PfLuqtgJeBByZ5HGjHPdW4Kg21zzgRwPM9VBV7Qh8Bvg68LfAs4H9kqzX+uwGnNu2T6+qrapqc+Bm4MDWflSbeyvgJ0ODD3iNqKpjq2peVc1b+fGjnZ4kSZIkTW0G9ImzS/t3PXAdsCm9MNrvGmD/tmr9nKr61QDjnlJVS/rmeF+ShcBFwKrA00Y57grgn5L8A7BRVf1mgLnObD9vAG6qqjur6kHgNuCpbd9LeSSgPzvJpUluAPYFNmvt2wKntO0v9Y0/yDWSJEmSpBmhS88xTzcBDq+qY0brUFWXtBXjlwMnJjmyqr4AVF+3VYcd1n+Pd4D5VfW98Yqpqi8luarNdV6StwC38Mcf0gyf68H28/d920O/z2q3sK9dVUOr4icAe1TVoiT7ATuNU9a410iSJEmSZgpX0CfOecABSdYASLJhkif0d0iyEfCzqjoO+BywRdv10yTPTPIY4NXjzPH2JGnjPW+0jkmeAdxWVZ+gtzL+XOCnwBOSrJdkFeAVS3mOLwIu7Pt9TeDOJI+lt4I+5Epgftt+3bD6x7xGkiRJkjRTuII+Qarq/PYCtStafr4feAPws75uOwEHJ3m47X9Ta38fcBbwQ+BGYI1RpvkQ8P8Ci1tIv4PRQ/bewBvaXP8HHFZVDyc5DLgKuB347lKe5m7AqX2/f6CN9QN6t8Wv2drfBXwxybuBs4F7YeBrJEmSJEkzQqpq/F7SCJJcB2xTVQ+P02914DdVVUleB+xTVaO+cX4sa228Yb3gyL9+NIdKkjRlfOPV/zzZJUiSJlCSBVU1b3i7K+h61Kpqi/F7AbAlcHRb5b8HOGDiqpIkSZKkqcmAPs0keSlwxLDm
26tqrGfZJ1RVXUrva+QkSZIkSaMwoE8zVXUevZevSZIkSZKmEN/iLkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAb3HXlLLJ2k/mG6/+58kuQ5IkSZKWO1fQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsDvQdeUcus9P+Xlp39sssuQJGm5Ofs1757sEiRJHeEKuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKA3SWYnuXE5jLNfkqPb9h5JntW376Ik85Z1jqWo5f5R2k9IsudynOeYJC9YXuNJkiRJ0kxkQJ9YewDPGrfXo5SeLvwNtwGunOwiJEmSJGkq60K465KVkhyX5KYk5ydZLcmcJOcmWZDk0iSbAiR5ZZKrklyf5IIkT+wfKMl2wKuAI5MsTDKn7dorydVJbkmyw2iFtJX4r7e5v5fkg619dpKbk3wauA54apJ9ktyQ5MYkRwwb52NJrkvyrSQbjDDPlkkubud3XpInt/aLkvxHkkvafFslOT3JrUk+3Hf8M4FbqmpJkr9Kck2SRUlOS7J66zMnyZVt32H9K/tJDm7ti5McujR/LEmSJEmaTgzof2wT4FNVtRlwDzAfOBZ4e1VtCbwH+HTr+x3g+VX1POArwHv7B6qqy4EzgYOram5Vfb/tmlVVWwPvAj44Tj1bA/sCc+kF+6Hb4/8C+EKb+2HgCODFrd9WSfZo/R4HXFdVWwAXD58vyWOBTwJ7tvP7PPCvfV0eqqodgc8AXwf+Fng2sF+S9Vqf3YBz2/bpVbVVVW0O3Awc2NqPAo6qqq2An/TNvwu9a751q33LJDsOvwhJDkpybZJrH7r3gXEumSRJkiRNTbMmu4COub2qFrbtBcBsYDvglCRDfVZpP58CnNxWnFcGbh9wjtOHjT+Wb1bVLwCSnA5sD3wN+EFVDd1SvhVwUVX9vPU7Cdix9fs9cHLr98W+uYf8Bb3A/c12fisBd/btP7P9vAG4qarubHPcBjwV+AXwUmD/1u/ZbXV9bWAN4LzWvi292/0BvgR8tG3v0v5d335fg15gv6S/yKo6lt4HJay18VNrxCslSZIkSVOcAf2PPdi3vQR4InBPVc0doe8ngY9X1ZlJdgIOWco5ljD+9R8eRod+719GDoMbPl7oBe9tR+k/VOvv+eNr83tgVruFfe2qGloVPwHYo6oWJdkP2GmcegIcXlXHDFa+JEmSJE1f3uI+tvuA25PsBX94Kdvmbd9awI/b9ptHOf5XwJrLMP9LkqybZDV6K9CXjdDnKuCFSdZPshKwD73b2aH39x16W/vr6d2W3+97wAZJtoXeLe9JNluK+l4EXNj3+5rAne3W+X372q+k97gAwOv62s8DDkiyRpt/wyRPWIr5JUmSJGnaMKCPb1/gwCSLgJuA3Vv7IfRufb8UuGuUY78CHNxeJDdnlD5j+Q5wIrAQOK2qrh3eod12/o/0gvIies+cf73tfgDYLMkCes+oHzbs2IfoBfgj2vktpHdL/6D6nz8H+AC9Dwy+CXy3r/1dwN8nuRp4MnBvm/98ere8X5HkBuBUlu0DDUmSJEmaslLlI71d1G4Rn1dVb5vsWkaT5Dpgm6p6eJx+qwO/qapK8jpgn6rafaxjRrPWxk+t7T/yrkdzqCRJnXT2a9492SVIklawJAuqat7wdp9B16PW3g4/iC2Bo9N7E909wAETV5UkSZIkTU0G9EmW5KX0viat3+1V9Wp6L12b8qrqUmDzcTtKkiRJ0gxmQJ9kVXUej3wdmSRJkiRphvIlcZIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAl
SZIkSeoAv2ZNU8omaz+Rs1/z7skuQ5IkSZKWO1fQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsDvQdeUcus9P+Plpx892WVIkrTMzn7N2ya7BElSx7iCLkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOgdkGR2khuXwzj7JTm6be+R5Fl9+y5KMm+MYxckWXlZaxiwxj+b6HkkSZIkaaoxoE9fewDPGrcXvQ8IgB9X1UMTWVCzH2BAlyRJkqRhDOjdsVKS45LclOT8JKslmZPk3La6fWmSTQGSvDLJVUmuT3JBkif2D5RkO+BVwJFJFiaZ03btleTqJLck2aHvkN2Ac9uxuya5LsmiJN9qbesm+VqSxUmuTPLc1n5Ikvf0zXtjuxtgdpKbRzifPYF5wEmtrpcnOaPv+JckOX25X1lJkiRJmgIM6N2xCfCpqtoMuAeYDxwLvL2qtgTeA3y69f0O8Pyqeh7wFeC9/QNV1eXAmcDBVTW3qr7fds2qqq2BdwEf7DtkV+DcJBsAxwHzq2pzYK+2/1Dg+qp6LvBPwBcezflU1anAtcC+VTUX+AbwzDYvwP7A8cMHSnJQkmuTXPvQvfcPMLUkSZIkTT2zJrsA/cHtVbWwbS8AZgPbAackGeqzSvv5FODkJE8GVgZuH3COodXpofFpz50/papuS/JK4JKquh2gqn7Z+m9P7wMDqurbSdZLstajOJ8/UlWV5ETgDUmOB7YF3jRCv2PpfVjBWhs/rQY8V0mSJEmaUgzo3fFg3/YS4InAPW2lebhPAh+vqjOT7AQcspRzLOGRv/0O9FbkAQKMFIAzQlsBv+OP78JYdYS5huZbbZSajgf+G/gtcEpV/W604iVJkiRpOvMW9+66D7g9yV4A6dm87VsL+HHbfvMox/8KWHOAeXYFzmnbVwAvTPL0Nue6rf0SYN/WthNwV1XdB9wBbNHatwCePsB8f1RXVf0E+Anwz8AJAxwvSZIkSdOSAb3b9gUOTLIIuAnYvbUfQu/W90uBu0Y59ivAwe1FcnNG6QOwE3AxQFX9HDgIOL3NeXLffPOSLAb+nUc+FDgNWDfJQuCvgVsGOKcTgM+0l8QNraqfBPywqv5ngOMlSZIkaVpKlY/0zlRJngIcV1W7TXIdR9N7Cd3nxuu71sZPq+0/8t7xukmS1Hlnv+Ztk12CJGmSJFlQVfOGt/sM+gxWVT+i9xVrkybJAuAB4N2TWYckSZIkTTYDuiZV+wo5SZIkSZrxfAZdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpA/wedE0pm6z9BM5+zdsmuwxJkiRJWu5cQZckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAvwddU8qtd/+cl5927GSXIUnSUjt7/kGTXYIkqeNcQZckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgT7Aks5PcuBzG2S/J0W17jyTP6tt3UZJ5Yxy7IMnKy1rDGOMflmTniRpfkiRJkmYCA/rUtAfwrHF70fuAAPhxVT00YP9ZS1tMVf1LVV2wtMdJkiRJkh5hQF8xVkpyXJKbkpyfZLUkc5Kc21a3L02yKUCSVya5Ksn1SS5I8sT+gZJsB7wKODLJwiRz2q69klyd5JYkO/POU3cAACAASURBVPQdshtwbjv2/iQfS3Jdkm8l2aC1X5Tk
35JcDLwzyUZt/+L282lJ1kpyR5LHtGNWT/LDJI9NckKSPVv7HUkObXPc0HdeayQ5vrUtTjK/te+S5IrW/5Qka0zUH0GSJEmSusyAvmJsAnyqqjYD7gHmA8cCb6+qLYH3AJ9ufb8DPL+qngd8BXhv/0BVdTlwJnBwVc2tqu+3XbOqamvgXcAH+w7ZlRbQgccB11XVFsDFw/qtXVUvrKqPAUcDX6iq5wInAZ+oqnuBRcALW/9XAudV1cMjnO9dbY7/bOcG8AHg3qp6Thv320nWB/4Z2Ln1vxb4+zGuoyRJkiRNW0t9O7MelduramHbXgDMBrYDTkky1GeV9vMpwMlJngysDNw+4BynDxuf9tz5U6rqtrbv98DJbfuLfcfQ1w6wLfCatn0i8JG+PnsDFwKv45EPFcaqZWicndsxAFTV3UleQe9W/cvadVgZuGL4YEkOAg4CWHX9dUeZUpIkSZKmNgP6ivFg3/YS4InAPVU1d4S+nwQ+XlVnJtkJOGQp51jCI3/XHeityI+m+rYfGKDfmcDhSdYFtgS+vRS1ZNh8Q23frKp9xpibqjqW3h0HrDVno+FjSJIkSdK04C3uk+M+4PYkewGkZ/O2by3gx237zaMc/ytgzQHm2RU4p+/3xwB7tu3XM3p4v5xHVrv3HepXVfcDVwNHAWdV1ZIBahhyPvC2oV+SrANcCbwgycatbfUkf74UY0qSJEnStGFAnzz7AgcmWQTcBOze2g+hd+v7pcBdoxz7FeDg9iK5OaP0AdiJ3rPmQx4ANkuyAHgxcNgox70D2D/JYuCNwDv79p0MvIE/viV+EB8G1klyYzvnF1XVz4H9gC+3ua4ENl3KcSVJkiRpWkiVdwxPR0meAhxXVbv1td1fVVP6Lelrzdmotv/I+ye7DEmSltrZ8w+a7BIkSR2RZEFVzRve7jPo01RV/YjeV6xJkiRJkqYAb3GfQab66rkkSZIkTWcGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AGzJrsAaWlsss4GnD3/oMkuQ5IkSZKWO1fQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsDvQdeU8r93/4JXnHbCZJchSdKfOGv+fpNdgiRpinMFXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHzKiAnuSQJO8ZoX12khvb9rwkn1jx1f2pJG9N8qbJrmM8SfZJ8v4x9n8jydrt39+syNokSZIkaaqYNdkFdE1VXQtcu6LmSzKrqn43Si2fWVF1LKNdgVE/1Kiql0HvgxDgb4BPr5CqJEmSJGkKmdIr6G3l+7tJ/ivJ4iSnJlk9yR1J1m995iW5qO+wzZN8O8mtSf5qhDF3SnJW214jyfFJbmjjzx+ljpWSnJDkxtb371r7nCTnJlmQ5NIkm7b2E5J8PMmFwJGt3rX7xvvfJE/sX/FPsnGSC5IsSnJdkjmt/eAk17T6Dh3jWj0uydnt+BuT7N3aR7xWbe7/SnJ+6/OaJB9p53dukse2fgHmAteNdr365vh3YE6ShUmOTHJikt37ajwpyatG/4tLkiRJ0vQ1HVbQ/wI4sKouS/J5eiu0Y3ku8HzgccD1Sc4eo+8HgHur6jkASdYZpd9cYMOqenbrNxS2jwXeWlW3JtmG3srxi9u+Pwd2rqolSR4DvBo4vvW7o6p+2su+f3AS8O9VdUaSVYHHJNkF2ATYGghwZpIdq+qSEWrcFfhJVb281bjWGOc9ZA7wIuBZwBXA/Kp6b5IzgJcDXwOeByyqqkoy3vV6H/Dsqprb9r8Q+Dvg662e7YA3Dy8iyUHA
QQCrrb/eAGVLkiRJ0tQzpVfQmx9W1WVt+4vA9uP0/3pV/aaq7gIupBduR7Mz8KmhX6rq7lH63QY8I8knk+wK3JdkDXqB85QkC4FjgCf3HXNKVS1p2ycDe7ft17Xf/yDJmvQ+ADij1fHbqvo1sEv7dz1wHbApvcA+khuAnZMckWSHqrp3jPMeck5VPdyOXQk4t2+s2W17V+Cctj3o9RrafzGwcZInAPsAp410u39VHVtV86pq3sqPX3OAsiVJkiRp6pkOK+g1wu+/45EPH1YdoP9oMs7+3gBVdyfZHHgp8LfAa4F3AfcMrRaP4IG+7SvoBdUNgD2AD49Qx2j1HV5VxwxQ4y1JtgReBhye5PyqOoyxr9WD7djfJ3m4qoauxe955P/OLsDQrf8DXa9hTgT2pffBxAFLeawkSZIkTRvTYQX9aUm2bdv7AN8B7gC2bG3DnxvfPcmqSdYDdgKuGWPs84G3Df0y2i3u7fnqx1TVafRui9+iqu4Dbk+yV+uTFuL/RAu+ZwAfB26uql8M238f8KMke7SxVkmyOnAecEBbrSfJhm01eqQa/wz4dVV9EfgosEXbdQejX6sxtdvSZ/XVO971+hUwfAn8BHofZlBVNy3N/JIkSZI0nYwb0NvLyj6X5Jz2+7OSHDjxpQ3sZuDNSRYD6wL/CRwKHJXkUmDJsP5XA2cDVwIfqqqfjDH2h4F12kvVFtF7HnskGwIXtVvZTwD+sbXvCxzYjr0J2H3kw4Hebe1vYNjt7X3eCLyjneflwJOq6nzgS8AVSW4ATuVPA/CQ5wBXtxrfzyOr9GNdq/G8BLig7/cxr1cL8pe1/Ue2tp/S+xsev5RzS5IkSdK0kkfuWh6lQy+YHw+8v6o2TzILuH7oRWCTKb2v7Tpr6OVsWrGSfBb4bFVduQxjrE7vmfYtBnkufu05T6/tP/LBRzudJEkT5qz5+012CZKkKSLJgqqaN7x9kFvc16+qr9J77pj2Eq+lXWnVNFRVb1nGcL4z8F3gkwO+tE6SJEmSpq1BXhL3QHteuwCSPB/oRJiqqjuAFbp6nuQqYJVhzW+sqhtWZB2jaX+rb42w6y+HP9s+2arqAuBpk12HJEmSJHXBIAH974EzgTlJLgM2APac0Ko6rKq2mewaxtJC+GhvjpckSZIkddSYAT3JY+h99dYLgb+g9zVa32vfjS1JkiRJkpaTMQN6+/7rj1XVtvTeQi5JkiRJkibAIC+JOz/J/CSZ8GokSZIkSZqhBn0G/XHA75L8lt5t7lVVj5/QyiRJkiRJmkHGDehVteaKKESSJEmSpJls3ICeZMeR2qvqkuVfjiRJkiRJM9Mgt7gf3Le9KrA1sAB48YRUJI1h43XW46z5+012GZIkSZK03A1yi/sr+39P8lTgIxNWkSRJkiRJM9Agb3Ef7kfAs5d3IZIkSZIkzWSDPIP+SaDar48B5gKLJrIoSZIkSZJmmkGeQb+2b/t3wJer6rIJqkeSJEmSpBlpkIC+dlUd1d+Q5J3D2yRJkiRJ0qM3yDPobx6hbb/lXIckSZIkSTPaqCvoSfYBXg88PcmZfbvWBH4x0YVJkiRJkjSTjHWL++XAncD6wMf62n8FLJ7IoqTR/O/dv+QVp5402WVIkqaQs/bcd7JLkCRpIKMG9Kr6AfADYNsVV44kSZIkSTPTuM+gJ3l+kmuS3J/koSRLkty3IoqTJEmSJGmmGOQlcUcD+wC3AqsBbwE+OZFFSZIkSZI00wzyNWtU1f8mWamqlgDHJ7l8guuSJEmSJGlGGSSg/zrJysDCJB+h9+K4x01sWZIkSZIkzSyD3OL+xtbvbcADwFOB+RNZlCRJkiRJM824K+hV9YMkqwFPrqpDV0BNkiRJkiTNOIO8xf2VwELg3Pb73CRnTnRhkiRJkiTNJIPc4n4IsDVwD0BVLQRmT1xJkiRJkiTNPIME9N9V1b0TXokkSZIkSTPYIG9xvzHJ64GVkmwCvAPwa9YkSZIkSVqORl1BT3Ji2/w+
sBnwIPBl4D7gXRNfmgaVZHaSG5fDOPslObpt75HkWX37Lkoyb4xjF7Sv4xtp36uSvG+kcSVJkiRJPWOtoG+ZZCNgb+BFwMf69q0O/HYiC9Ok2wM4C/if8TommQ38uKoeGml/VZ0JDL1YcOBxJUmSJGkmGesZ9M/Qe3P7psC1ff8WtJ/qlpWSHJfkpiTnJ1ktyZwk57bV7UuTbAq9N/MnuSrJ9UkuSPLE/oGSbAe8CjgyycIkc9quvZJcneSWJDv0HbIbj7zlf9ck1yVZlORbrW2/JEePNG6S6/rm3STJggm7QpIkSZLUYaMG9Kr6RFU9E/h8VT2j79/Tq+oZK7BGDWYT4FNVtRm9N+7PB44F3l5VWwLvAT7d+n4HeH5VPQ/4CvDe/oGq6nJ6K94HV9Xcqvp+2zWrqram94jDB/sO2RU4N8kGwHHA/KraHNhrgHHvTTK3ddkfOGEZr4MkSZIkTUnjviSuqv56RRSiZXZ7+wo86N3lMBvYDjglyVCfVdrPpwAnJ3kysDJw+4BznD5sfNpz50+pqtuSvBK4pKpuB6iqXw4w5meB/ZP8Pb3HKbYe3iHJQcBBAKutv96ApUqSJEnS1DLI16xpaniwb3sJsC5wT1upHvr3zLb/k8DRVfUc4P8BVl3KOZbwyIc7O9BbkQcIUEtZ92n0bpF/BbCgqn4xvENVHVtV86pq3sqPf/xSDi9JkiRJU4MBffq6D7g9yV4A6dm87VsL+HHbfvMox/8KWHOAeXYFzmnbVwAvTPL0Nue6441bVb8FzgP+Ezh+gPkkSZIkaVoyoE9v+wIHJlkE3ATs3toPoXfr+6XAXaMc+xXg4PYiuTmj9AHYCbgYoKp+Tu9W9NPbnCcPOO5J9Fbezx/0xCRJkiRpuknV0t6RLPUkeQpwXFXttozjvAdYq6o+MF7ftec8o7Y/4kPLMp0kaYY5a899J7sESZL+SJIFVTVvePu4L4mTRlNVP6L3/PijluQMYA7w4uVSlCRJkiRNUQZ0TaqqevVk1yBJkiRJXeAz6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQNmTXYB0tLYeJ11OWvPfSe7DEmSJEla7lxBlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gC/B11Tyv/efTevOPWrk12GJGmSnbXnaye7BEmSljtX0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAF9FEnekeTmJD9OcvRk1/NoJfnHJPtO4PjzknxiosaXJEmSpJli1mQX0GF/A+wGvBCYt6yDJZlVVb9b0ccCuwCvnah5qupa4NpHU5gkSZIk6RGuoI8gyWeAZwBnAuv0tW+U5FtJFrefTxun/YQkH09yIXDEKHNtneTyJNe3n3/R2vdLckqS/wbOb20HJ7mmzXNo3xhfS7IgyU1JDuprfzywclX9vNXymSSXJrklyStGmic9Rya5MckNSfZu/U5O8rK+sU9IMj/JTknOam2HJPl8kouS3JbkHX3939TqXpTkxNa2QZLT2jldk+QFy/BnkyRJkqQpzRX0EVTVW5PsCrwIeEXfrqOBL1TVfyU5APgEsMcY7QB/DuxcVUtGme67wI5V9bskOwP/Bsxv+7YFnltVv0yyC7AJsDUQ4MwkO1bVJcABrc9qwDVJTquqXwA7A9/qm2s2vTsC5gAXJtl4hHnmA3OBzYH123iXAF8B9ga+kWRl4C+Bvwa2GXY+m7brtibwvST/2a7B+4EXVNVdSdZtfY8C/qOqvtM+1DgPeObwC9Q+dDgIYLX11x/lMkqSJEnS1GZAXzrbAq9p
2ycCHxmnHeCUMcI5wFrAfyXZBCjgsX37vllVv2zbu7R/17ff16AX2C8B3pHk1a39qa39F8CuwPF94321qn4P3JrkNnphevg82wNfbjX/NMnFwFbAOcAnkqzSxr2kqn6TZPj5nF1VDwIPJvkZ8ETgxcCpVXUXQN9cOwPP6hvj8UnWrKpf9Q9YVccCxwKsPWdOjXgVJUmSJGmKM6Avm9HCYn/7A+OM8SHgwqp6dZLZwEWjHBvg8Ko6pv/gJDvRC7rbVtWvk1wErNp2b01vlXu0eod+Hz7Pn6iq37axX0pvJf3Lo5zPg33bS+j9H8sIc0PvEYttq+o3o4wlSZIkSTOGz6AvncuB17XtfYHvjNM+iLWAH7ft/cbodx5wQJI1AJJsmOQJ7fi7WzjfFHh+278Z8N1hq/d7JXlMkjn0nrH/3gjzXALsnWSlJBsAOwJXt31fAfYHdmj1DOpbwGuTrNdqG7rF/XzgbUOdksxdijElSZIkaVoxoC+ddwD7J1kMvBF45zjtg/gIcHiSy4CVRutUVecDXwKuSHIDcCq957zPBWa1uT8EXNkO2a3t6/c94GJ6t6u/tap+O8JUZwCLgUXAt4H3VtX/tX3n0wvsF1TVQ4OeYFXdBPwrcHGSRcDH2653APPay+P+B3jroGNKkiRJ0nSTKh/pnY6SfBN4U1Xd2X4/ATirqk6d1MKW0dpz5tT2Rxw+2WVIkibZWXsO9A2ikiR1UpIFVfUnX+ftM+jTVFW9ZLJrkCRJkiQNzoC+giTZnz+99f2yqvrbFTF/Ve23IuaRJEmSJD06BvQVpKqO54+/8kySJEmSpD/wJXGSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAL9mTVPKxuusw1l7vnayy5AkSZKk5c4VdEmSJEmSOsCALkmS9P+zd+fRmlT1uce/D7PMg8AVBQkNiKDQQAdEgQAioCYKgtPFIGggRBM0Bo1eDIJDFMjCiEYFDbQDRmQUMTKIIPPQDd3QCBEFclW8KJPIEMbf/ePdLa+HM3X36T51zvl+1ur11rtr196/KlnL9ZxdVa8kSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wN9B14Tyswce5C/OOGe8y5AkLWHf22/v8S5BkqTFzhV0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQO+wJIcluTXJr5J8YbzrWVhJPpJk//GuQ5IkSZK6zIDebe8BXgccMRaDJVlmPI4F9gAuXITjJUmSJGnSM6B3VJIvAxsB5wJr9LW/OMnFSW5qnxuM0D4zyfFJLgGOGWKu7ZJcleTG9vmS1n5gktOTfI8WsJN8MMn1bZ6j+8Y4J8nsJLckOaSvfVVguar6bavlS0kuSXJHkj9LcnK7S2DmMNfikCSzksx64qGHFvqaSpIkSVKXGdA7qqoOBe4GdgUe6Nv1BeDrVbUlcCpwwgjtAJsCu1fVPwwx3W3AzlW1NXAk8M99+3YA3llVuyXZA9gE2A6YDmybZOfW711VtS0wAzgsyVqtfXfg4r7x1gB2A/4e+B7wWWAL4OVJpg9xLU6qqhlVNWO5VVcd4hQkSZIkaWIzoE88OwDfatvfAHYcoR3g9Kp6epgxVwNOTzKPZwPzfBdV1f1te4/270bgBmAzeoEdeqF8LnANsH5f+17AD/rG+15VFXAzcE9V3VxVzwC3ABsOU6MkSZIkTWqL8lyxuqFG0f7ICGN8ArikqvZJsiFw6RDHBvh0VZ3Yf3CSXeitlO9QVY8muRRYoe3eDvibvu6Pt89n+rbnf/e/R0mSJElTlivoE89VwNva9v7AFSO0j8ZqwK/a9oHD9LsAeFeSlQGSvDDJ
Ou34B1o43wx4Rdu/BXDbCKv3kiRJkiRcsZyIDgNOTvJB4LfAQSO0j8axwNeSfAD40VCdqurCJC8Frk4C8DDwDuB84NAkNwH/Re82d4DXtn2SJEmSpBGk9ziwNPaSXAQcUFW/HqsxV5+2ce10zL+M1XCSpAnie/vtPd4lSJI0ZpLMrqoZA9tdQddiU1WvGe8aJEmSJGmiMKBPIUkOAt43oPnKqnrveNQjSZIkSXqWAX0KqapTgFPGuw5JkiRJ0nP5FndJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gD+zpgll4zVW53v77T3eZUiSJEnSmHMFXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQP8HXRNKD974He88Yz/HO8yJEmL6Lv7vW68S5AkqXNcQZckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQb0MZbkqCSHL+Y5Tk7ymyTzBrSvmeSiJLe3zzUWYMy7kjy/bV/V135cklva59pJrk1yY5Kd2v6PJNl/rM5NkiRJkqYqA/rENBPYa5D2DwMXV9UmwMXt+wKrqlf2ff1rYJuq+iDwauC2qtq6qi5v+/cALlyYeSRJkiRJzzKgL6IkByS5KcncJN8YsO/gJNe3fWcmWbG1vznJvNZ+WWvbIsl1Sea08TYZas6qugy4f5BdbwS+1ra/Buw9TN1rJbmwrYafCKRv38Pt81xgJeDaJP8IHAu8rtX4vCSrAstV1W+T/EXf6voPk6zbxli7rebfkOTEJP/dt1L/jr5zPjHJ0kPUekiSWUlmPfHQ74Y6JUmSJEma0AzoiyDJFsARwG5VtRXwvgFdzqqqP237bgXe3dqPBPZs7W9obYcCn6uq6cAM4JcLUdK6VfVrgPa5zjB9PwZcUVVbA+cCGwzsUFVvAB6rqulVdUyr+7T2/TFgd3or9QBXAK9o430b+FDfPD+qqm2As+fPk+SlwFuBV7VzfhoY9Fb5qjqpqmZU1YzlVl1ttNdCkiRJkiaUZca7gAluN+CMqroXoKruT9K//2VJPgmsDqwMXNDarwRmJvkOcFZruxo4IsmL6AX72xdz7TsDb2p1fz/JAwsxxl7AKW37RcBpSV4ALAfc2dp3BPZp85zfN8+rgW2B69s1ex7wm4WoQZIkSZImBVfQF02AGmb/TOBvq+rlwNHACgBVdSjwUWB9YE6StarqW/RW0x8DLkiy20LUc08LyLTPkQLvcLWPxnbAdW3788AX2rn+Ne1c6bt1foAAX2ur8dOr6iVVddQi1iNJkiRJE5YBfdFcDLwlyVrQe4v6gP2rAL9Osix9t28nmVZV11bVkcC9wPpJNgLuqKoT6N1yvuVC1HMu8M62/U7gu8P0vWx+TUleC4z6je/tmC3ovTDu6da0GvCrvrnnuwJ4Sztmj755Lgb2S7JO27dmkhcvSA2SJEmSNJkY0BdBVd0CfAr4cZK5wPEDuvwTcC1wEXBbX/txSW5uP5N2GTCX3vPY85LMATYDvj7UvEn+g94t8S9J8ssk859t/wzwmiS3A69p34dyNLBzkhvovYn9/47mnPu8Fji/7/tRwOlJLqf3R4f+efZo87wW+DXw+6r6Cb27CC5MchO9a/SCBaxBkiRJkiaNVC3qXc6aipJcBBww/6V0w/RbHni6qp5KsgPwpfZSuIWy+rRN6s+O+dzCHi5J6ojv7ve68S5BkqRxk2R2Vc0Y2O5L4rRQquo1o+y6AfCdJEsBTwAHL76qJEmSJGniMqB3VHuu/eJBdr26qu5bgHEO4rk//3ZlVb13UeobrfY2+q2XxFySJEmSNJEZ0DuqhfCFvhW8b5xTePan
0CRJkiRJHeVL4iRJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQB/syaJpSN11iN7+73uvEuQ5IkSZLGnCvokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHeDvoGtC+dkDD7H3GT8c7zIkSUM4Z7/dx7sESZImLFfQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHXAlAroSR4eRZ+rlkQtS0qSE5O8ajGO/4YkH15c40uSJEnSVDGlAvpoVNUrF3WMJMuMRS1jZHvgmtF0XJi6q+rcqvrMAlclSZIkSfojUzagJ/lgkuuT3JTk6L72h9vnC5JclmROknlJdurf37b3SzKzbc9McnySS4BjkqyU5OQ2x41J3jhMLVskua7NdVOSTZJsmGReX5/DkxzVti9N8tlW361J/jTJWUluT/LJvmNeCvy0qp5ux/xrkqva+WzX+hyV5KQkFwJfT7JCklOS3Nzq3rX1uzbJFn1jX5pk2yQHJvlC3zU4oc1xR5L9+vp/qI05N8lnWtu0JOcnmZ3k8iSbDXF9DkkyK8msJx763Yj/20qSJEnSRNSlld4lJskewCbAdkCAc5PsXFWX9XX738AFVfWpJEsDK45i6E2B3Vsg/mfgR1X1riSrA9cl+WFVPTLIcYcCn6uqU5MsBywNrDvCXE9U1c5J3gd8F9gWuB/4eZLPVtV9wGuB8/uOWamqXplkZ+Bk4GWtfVtgYe4XUwAAIABJREFUx6p6LMk/AFTVy1tgvjDJpsC3gbcAH0vyAmC9qpqd5OUD6noBsCOwGXAucEaS1wJ7A9tX1aNJ1mx9TwIOrarbk2wPfBHYbeCJVtVJrS+rT9u0RrgukiRJkjQhTcmADuzR/t3Yvq9ML7D3B/TrgZOTLAucU1VzRjHu6VX1dN8cb0hyePu+ArABcOsgx10NHJHkRcBZLbCONNe57fNm4Jaq+jVAkjuA9YH7gD2Bg/qO+Q+AqrosyartDwcA51bVY217R+Dzrd9tSf6b3h8evgNcBHyMXlA/fYi6zqmqZ4CfJJn/R4bdgVOq6tE27v1JVgZeCZzed67Lj3TSkiRJkjRZTdWAHuDTVXXiUB1aiN0ZeD3wjSTHVdXXgf4V3BUGHNa/Oh5g36r6r5GKqapvJbm2zXVBkr8CfsofP4IwcK7H2+czfdvzvy+TZEVg9aq6u3+qgVMPUfdgNf4qyX1JtgTeCvz1EKfTX0v6PgfOvRTwYFVNH2IcSZIkSZpSpuoz6BcA72qruCR5YZJ1+jskeTHwm6r6CvDvwDZt1z1JXppkKWCfEeb4u7Tl4SRbD9UxyUbAHVV1Ar2V8S2Be4B1kqyVZHngzxfwHHcFLhnQ9tY2347A76pqsAe6LwP2b/02pbfqP/+PDN8GPgSsVlU3L0AtF9K73iu2cdesqoeAO5O8ubUlyVYLMKYkSZIkTSpTMqBX1YXAt4Crk9wMnAGsMqDbLsCcJDcC+wKfa+0fBs4DfgT8ephpPgEsC9zUXvb2iWH6vhWYl2QOvWe3v15VTwIfB65t89026hPsGfj8OcAD6f2M3JeBdw9x3BeBpdt1OQ04sKrmr4qfAbyN3u3uo1ZV59P7w8Osdo7zb/vfH3h3krnALcCQL9KTJEmSpMkuVb5zazJKcgO9l7I92b5fChxeVbPGtbBFtPq0TWuXY7443mVIkoZwzn67j3cJkiR1XpLZVTVjYPtUfQZ90quqbUbuJUmSJEnqCgP6EpRkT+CYAc13VtVwz7KPiaraZXHPIUmSJElaeAb0JaiqLqD38jhJkiRJkv7IlHxJnCRJkiRJXWNAlyRJkiSpAwzokiRJkiR1gAFd
kiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAn1nThLLxGqtyzn67j3cZkiRJkjTmXEGXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAH8HXRPKzx94mH3OvGK8y5AkDeHsfXcc7xIkSZqwXEGXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEG9EWQ5LAktyb5VZIvjHc9CyvJR5LsvwTmmZ7kdYt7HkmSJEmaiAzoi+Y9wOuAI8ZisCTLjMexwB7AhYtw/GhNp3e9JEmSJEkDGNAXUpIvAxsB5wJr9LW/OMnFSW5qnxuM0D4zyfFJLgGOGWKu7ZJcleTG9vmS1n5gktOTfI8WsJN8MMn1bZ6j+8Y4J8nsJLckOaSvfVVguar6bZJ1k5ydZG7798rW5wNJ5rV/729tGyaZ1zfO4UmOatuXJjkmyXVJfppkpyTLAR8H3ppkTpK3Jrk9ydrtmKWS/CzJ8wc5/0OSzEoy6/GHHlzg/60kSZIkaSIwoC+kqjoUuBvYFXigb9cXgK9X1ZbAqcAJI7QDbArsXlX/MMR0twE7V9XWwJHAP/ft2wF4Z1XtlmQPYBNgO3qr1dsm2bn1e1dVbQvMAA5LslZr3x24uG2fAPy4qrYCtgFuSbItcBCwPfAK4OAkW498hVimqrYD3g98rKqeaLWfVlXTq+o04JvA/FvrdwfmVtW9AweqqpOqakZVzVh+1dVHMbUkSZIkTTwG9LG3A/Cttv0NYMcR2gFOr6qnhxlzNeD0tmL9WWCLvn0XVdX9bXuP9u9G4AZgM3qBHXqhfC5wDbB+X/tewA/a9m7AlwCq6umq+l2r8+yqeqSqHgbOAnYa9gr0nNU+ZwMbDtHnZOCAtv0u4JRRjCtJkiRJk9KiPLes0alRtD8ywhifAC6pqn2SbAhcOsSxAT5dVSf2H5xkF3or1DtU1aNJLgVWaLu3A/5mmLkzRPtT/PEfeFYYsP/x9vk0Q/x3VlW/SHJPkt3ordAv9hfVSZIkSVJXuYI+9q4C3ta29weuGKF9NFYDftW2Dxym3wXAu5KsDJDkhUnWacc/0ML5ZvRuVSfJFsBtfav3F9PCepKl2/PplwF7J1kxyUrAPsDlwD3AOknWSrI88OejOI/fA6sMaPsqvVvdvzPCXQSSJEmSNKkZ0MfeYcBBSW4C/hJ43wjto3Es8OkkVwJLD9Wpqi6kdxv91UluBs6gF4jPB5Zpc3+C3m3uAK9t++Z7H7BrO3Y2sEVV3QDMBK4DrgW+WlU3VtWT9F76di1wHr3n5EdyCbD5/JfEtbZzgZXx9nZJkiRJU1yqhroDW5NdkouAA6rq1+NYwwzgs1U1mufaWWPaZrXLsV9dzFVJkhbW2fvuOHInSZKmuCSzq2rGwHafQZ/Cquo14zl/kg/Tu6XeZ88lSZIkTXkG9A5JchDPvfX9yqp673jUs7hV1WeAz4x3HZIkSZLUBQb0DqmqU/BZbEmSJEmaknxJnCRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAn1nThDJtjZU5e98dx7sMSZIkSRpzrqBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gL+Drgnl5w88yr5nzhrvMiRpyjpz3xnjXYIkSZOWK+iSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCAvpCSrJ7kPWM01i5JXjkWYy3k
/LOTLLcE5jkwyXqLex5JkiRJmogM6AtvdeA5AT3J0gsx1i7AuAT0JBsCv6qqJ5bAdAcCBnRJkiRJGoQBfeF9BpiWZE6S65NckuRbwM1JNkwyb37HJIcnOaptH5bkJ0luSvLtFpAPBf6+jbXTYJMleXOSeUnmJrmstR2Y5At9fc5LskvbfjjJMW11/IdJtktyaZI7kryhb+jXAue3Y/ZKckOb4+LWtmaSc1q91yTZsrUfleTwvrnntfPeMMmtSb6S5JYkFyZ5XpL9gBnAqe08X5/k7L7jX5PkrCHO/ZAks5LMevyhB0b3v44kSZIkTTAG9IX3YeDnVTUd+CCwHXBEVW0+iuO2rqotgUOr6i7gy8Bnq2p6VV0+xHFHAntW1VbAG4bo028l4NKq2hb4PfBJ4DXAPsDH+/rtBZyfZG3gK8C+bY43t/1HAze2ev8P8PVRzL0J8G9VtQXwYBvzDGAWsH+7Zv8JvLTNC3AQcMpgg1XVSVU1o6pmLL/qGqOYXpIkSZImHgP62Lmuqu4cRb+b6K0ivwN4agHGvxKYmeRgYDS30T9BWxkHbgZ+XFVPtu0NAdpz5y+qqjuAVwCXzT+Hqrq/Hbsj8I3W9iNgrSSrjTD3nVU1p23Pnj9fv6qqNu47kqwO7AD8YBTnJUmSJEmTkgF97DzSt/0Uf3xtV+jbfj3wb8C2wOwky4xm8Ko6FPgosD4wJ8laI8zzZAvBAM8Aj7dxngHmz7kTcEXbDlA8VwYrZ4S5H+/bfrpvvoFOAd4BvB04vaoW5A8WkiRJkjSpGNAX3u+BVYbYdw+wTpK1kiwP/DlAkqWA9avqEuBD9F40t/IIY9GOnVZV11bVkcC99IL6XcD0JEslWZ/ebfYLYi+eXbW+GvizJH/S5luztV8G7N/adgHuraqH2tzbtPZtgD8ZxXx/dJ5VdTdwN70/PMxcwNolSZIkaVIZ1eqtnquq7ktyZXsZ3GP0Qvn8fU8m+ThwLXAncFvbtTTwzXaLeOg9d/5gku8BZyR5I/B3QzyHflySTdpxFwNzW/ud9G5bnwfcsICnsQu9Z9upqt8mOQQ4q/0h4Tf0nlk/CjglyU3Ao8A727FnAgckmQNcD/x0FPPNBL6c5DFgh6p6DDgVWLuqfrKAtUuSJEnSpJJn74LWVJLkRcBXquq141zHF+i9hO7fR9N/jWmb127HjuY9dZKkxeHMfWeMdwmSJE14SWZX1XP+T9UV9Cmqqn5J7yfWxk2S2fSe3f+H8axDkiRJkrrAgN4xSY7g2Z84m+/0qvrUeNSzOLWfgJMkSZIkYUDvnBbEJ10YlyRJkiQNz7e4S5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQP8mTVNKNPWWJEz950x3mVIkiRJ0phzBV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkD/B10TSh3PPA/vOXMn4x3GZI0qX1n383HuwRJkqYkV9AlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABfTFI8v4kKy7EcRsmmTdGNRyY5Atte+8km/ftuzTJjLGYR5IkSZI0Ngzoi8f7gQUO6IvR3sDmI/aSJEmSJI2bKRvQkxyQ5KYkc5N8I8mLk1zc2i5OskHrNzPJfn3HPdw+d2kr0WckuS3Jqek5DFgPuCTJJUneneSzfccfnOT4YUpbOslXktyS5MIkz2vHTUtyfpLZSS5Psllr/4sk1ya5MckPk6w74DxfCbwBOC7JnCTT2q43J7kuyU+T7DTMdTowyTlJvpfkziR/m+QDbb5rkqy5MPUlOSrJye0a3tGu21A1HJJkVpJZjz90/zCXTpIkSZIm
rikZ0JNsARwB7FZVWwHvA74AfL2qtgROBU4YxVBb01st3xzYCHhVVZ0A3A3sWlW7At8G3pBk2XbMQcApw4y5CfBvVbUF8CCwb2s/Cfi7qtoWOBz4Ymu/AnhFVW3d5vpQ/2BVdRVwLvDBqppeVT9vu5apqu1a/R8b4TxfBvxvYDvgU8Cjbb6rgQMWob7NgD3buB/ru0Z/pKpOqqoZVTVj+VXXHKFUSZIkSZqYlhnvAsbJbsAZVXUvQFXdn2QH4E1t/zeAY0cxznVV9UuAJHOADekF0j+oqkeS/Aj48yS3AstW1c3DjHlnVc1p27OBDZOsDLwSOD3J/H7Lt88XAacleQGwHHDnKOoGOKt/jhH6XlJVvwd+n+R3wPda+83AlotQ3/er6nHg8SS/AdYFfjnK+iVJkiRpUpmqAT1AjdBn/v6naHcapJc+l+vr83jf9tMMfT2/Cvwf4DaGXz0fbMzntfkfrKrpg/T/PHB8VZ2bZBfgqBHGHzjPcHUPVtMzfd+faccubH2jvX6SJEmSNOlNyVvcgYuBtyRZC6A9R30V8La2f3+eXQm/C9i2bb8RGPQ27AF+D6wy/0tVXQusT+828f9Y0GKr6iHgziRvbvUmyVZt92rAr9r2O0dTz1gbg/okSZIkacqbkgG9qm6h9yz1j5PMBY4HDgMOSnIT8Jf0nksH+ArwZ0muA7YHHhnFFCcBP0hySV/bd4Arq+qBhSx7f+Ddrd5b6P2xAHor0qcnuRy4d4hjvw18sL2obdoQfRbVotQnSZIkSVNeqka601tjIcl5wGer6uLxrmUiW3Pay2r3Y78z3mVI0qT2nX39ZU5JkhanJLOrasbA9im5gr4kJVk9yU+BxwznkiRJkqSh+FKuxayqHgQ27W9rz74PFtZfXVX3LZHCBkiyJ3DMgOY7q2qf8ahHkiRJkqYaA/o4aCF8sDeej5uqugC4YLzrkCRJkqSpylvcJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AH+zJomlI3WWIHv7Lv5eJchSZIkSWPOFXRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQO8HfQNaHc9eATHHTW/x3vMiRpUjnlTRuMdwmSJAlX0CVJkiRJ6gQDuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAF9iklyVJLDx2O+JB9Psnvb3inJLUnmJHlekuPa9+OWVG2SJEmS1CXLjHcBmjqq6si+r/sD/1JVpwAk+Wtg7ap6fFyKkyRJkqRx5gr6JJfkgCQ3JZmb5BsD9h2c5Pq278wkK7b2NyeZ19ova21bJLmurXjflGSTYeY8Isl/Jfkh8JK+9plJ9kvyV8BbgCOTnJrkXGAl4Nokbx1kvEOSzEoy639+d/+YXBdJkiRJ6hpX0CexJFsARwCvqqp7k6wJHNbX5ayq+krr+0ng3cDngSOBPavqV0lWb30PBT5XVacmWQ5Yeog5twXeBmxN77+vG4DZ/X2q6qtJdgTOq6oz2nEPV9X0wcasqpOAkwCev/GWtaDXQZIkSZImAlfQJ7fdgDOq6l6Aqhq4/PyyJJcnuZneLedbtPYrgZlJDubZIH418H+S/CPw4qp6bIg5dwLOrqpHq+oh4NwxPB9JkiRJmrQM6JNbgOFWnGcCf1tVLweOBlYAqKpDgY8C6wNzkqxVVd8C3gA8BlyQZLdhxnWVW5IkSZIWkAF9crsYeEuStQDaLe79VgF+nWRZeivotH7Tqura9lK3e4H1k2wE3FFVJ9BbFd9yiDkvA/Zpb2ZfBfiLsT0lSZIkSZqcfAZ9EquqW5J8CvhxkqeBG4G7+rr8E3At8N/AzfQCO8Bx7SVw
oRfy5wIfBt6R5Eng/wEfH2LOG5KcBsxp414+1uclSZIkSZNRqrwbWRPH8zfesv7i2PPGuwxJmlROedMG412CJElTSpLZVTVjYLu3uEuSJEmS1AHe4q6F0p5rv3iQXa+uqvuWdD2SJEmSNNEZ0LVQWggf9HfLJUmSJEkLzlvcJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AH+zJomlA1XX45T3rTBeJchSZIkSWPOFXRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQO8HfQNaHc/eCTHHX23eNdhiRNGkfts954lyBJkhpX0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAF9HCU5Ksnhg7RvmGRe256R5IQlX91zJTk0yQFjON7qSc5IcluSW5PsMFZjS5IkSdJEs8x4F6DhVdUsYNaSmi/JMlX11BC1fHmMp/sccH5V7ZdkOWDFMR5fkiRJkiYMV9DHUFv5vi3J15Lc1FaHV0xyV5Lntz4zklzad9hWSX6U5PYkBw8y5i5JzmvbKyc5JcnNbfx9h6hj6SQzk8xrff++tU9Lcn6S2UkuT7JZa5+Z5PgklwDHtXpX7xvvZ0nW7V/xT7Jxkh8mmZvkhiTTWvsHk1zf6jt6mGu1KrAz8O8AVfVEVT04RN9DksxKMuvRh+4b8vpLkiRJ0kTmCvrYewnw7qq6MsnJwHtG6L8l8ApgJeDGJN8fpu8/Ab+rqpcDJFljiH7TgRdW1ctav/lh+yTg0Kq6Pcn2wBeB3dq+TYHdq+rpJEsB+wCntH53VdU9SfrnOBX4TFWdnWQFYKkkewCbANsBAc5NsnNVXTZIjRsBv21zbAXMBt5XVY8M7FhVJ7XaWW/jrWqY6yNJkiRJE5Yr6GPvF1V1Zdv+JrDjCP2/W1WPVdW9wCX0wu1Qdgf+bf6XqnpgiH53ABsl+XySvYCHkqwMvBI4Pckc4ETgBX3HnF5VT7ft04C3tu23te9/kGQVen8AOLvV8T9V9SiwR/t3I3ADsBm9wD6YZYBtgC9V1dbAI8CHhzl3SZIkSZrUXEEfewNXeAt4imf/GLLCKPoPJSPs7w1Q9UBbld4TeC/wFuD9wINVNX2Iw/pXrq8GNk6yNrA38MlB6hiqvk9X1Ykj1Qj8EvhlVV3bvp+BAV2SJEnSFOYK+tjboO9t5G8HrgDuArZtbQOfG39jkhWSrAXsAlw/zNgXAn87/8tQt7i3592Xqqoz6d0Wv01VPQTcmeTNrU9aiH+OqirgbOB44Naqum/A/oeAXybZu421fJIVgQuAd7XVepK8MMk6Q8zx/4BfJHlJa3o18JNhzl2SJEmSJjUD+ti7FXhnkpuANYEvAUcDn0tyOfD0gP7XAd8HrgE+UVV3DzP2J4E12svf5gK7DtHvhcCl7Vb2mcBHWvv+wLvbsbcAbxxmrtOAdzDg9vY+fwkc1s7zKuB/VdWFwLeAq5PcTG9VfJVh5vg74NQ2xnTgn4fpK0mSJEmTWnqLpRoLSTYEzpv/cjaNvfU23qoOOe4H412GJE0aR+2z3niXIEnSlJNkdlXNGNjuCrokSZIkSR3gS+LGUFXdBSzR1fMk1wLLD2j+y6q6eUnWMZT2bP3Fg+x69cBn2yVJkiRpKjOgT3BVtf141zCcFsKHenO8JEmSJKnxFndJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gD+zpgllvdWX5ah91hvvMiRJkiRpzLmCLkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA7w
Z9Y0ofzmwSf5t7PvGe8yJGnSeO8+6453CZIkqXEFXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA/oSlOT9SVYcq34LMO92Sea0f3OT7DNM3w2TzFvA8TdrY9+YZNqiVyxJkiRJU48Bfcl6PzCa4D3afqM1D5hRVdOBvYATkywzhuPvDXy3qrauqp+P1Dk9/rcnSZIkSX0MSYtJkpWSfL+tWM9L8jFgPeCSJJe0Pl9KMivJLUmObm2HDdLv4b5x90sys22/uY09N8llQ9VSVY9W1VPt6wpAjVD+Mkm+luSmJGfMX81Psm2SHyeZneSCJC9I8jp6f1D4q756P9Dqmpfk/a1twyS3JvkicAOwfpI9klyd5IYkpydZeYhreUi7TrMefuj+EUqXJEmSpInJgL747AXcXVVbVdXLgH8F7gZ2rapdW58jqmoGsCXwZ0m2rKoTBuk3lCOBPatqK+ANw3VMsn2SW4CbgUP7AvtgXgKcVFVbAg8B70myLPB5YL+q2hY4GfhUVf0n8GXgs1W1a5JtgYOA7YFXAAcn2bpv3K9X1dbAI8BHgd2rahtgFvCBwYqpqpOqakZVzVh51TVHuCSSJEmSNDEZ0Befm4HdkxyTZKeq+t0gfd6S5AbgRmALYPMFnONKYGaSg4Glh+tYVddW1RbAnwIfSbLCMN1/UVVXtu1vAjvSC9cvAy5KModeuH7RIMfuCJxdVY9U1cPAWcBObd9/V9U1bfsV9M73yjbeO4EXD3+6kiRJkjR5jeVzyOpTVT9tq8mvAz6d5ML+/Un+BDgc+NOqeqDdtj5UaO6/Jf0Pfarq0CTbA68H5iSZXlX3jVDXrUkeoRe2Z41ivvnfA9xSVTsMN37rN5RHBvS7qKrePsJ4kiRJkjQluIK+mCRZD3i0qr4J/AuwDfB7YJXWZVV6gfV3SdYFXtt3eH8/gHuSvLS9WO0Pb2BPMq2tjB8J3AusP0QtfzL/pXBJXkxvNfyuYcrfIMn8IP524Argv4C157cnWTbJFoMcexmwd5IVk6zU6r18kH7XAK9KsnEbb8Ukmw5TkyRJkiRNaq6gLz4vB45L8gzwJPA3wA7AD5L8uj2vfSNwC3AHvdvV5zupvx/wYeA84Bf03sg+/2VqxyXZhN5q9MXA3CFq2RH4cJIngWeA91TVvcPUfivwziQnArcDX6qqJ5LsB5yQZDV6/+38a6v/D6rqhnY3wHWt6atVdWOSDQf0+22SA4H/SLJ8a/4o8NNh6pIkSZKkSStVI73QW+qODTbeqv7xuAtH7ihJGpX37rPueJcgSdKUk2R2e2H4H/EWd0mSJEmSOsBb3CeRJHsCxwxovrOq9hmk71r0bosf6NUjvWhOkiRJkjT2DOiTSFVdAFwwyr73AdMXb0WSJEmSpNHyFndJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gD+zpgllndWX5b37rDveZUiSJEnSmHMFXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR3gz6xpQrn/gac49czfjncZkjSh7b/v2uNdgiRJGoQr6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKgLXPFAAAgAElEQVSkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHTKmAnuSoJIcP0r5hknlte0aSE5Z8dc+V5NAkB4x3HSNJ8vYkR4x3HZIkSZI0kS0z3gV0TVXNAmYtqfmSLFNVTw1Ry5eXVB2LaC+gE3/UkCRJkqSJakKv
oLeV79uSfC3JTUnOSLJikruSPL/1mZHk0r7DtkryoyS3Jzl4kDF3SXJe2145ySlJbm7j7ztEHUsnmZlkXuv79619WpLzk8xOcnmSzVr7zCTHJ7kEOK7Vu3rfeD9Lsm7/in+SjZP8MMncJDckmdbaP5jk+lbf0cNcq5WSfL8dPy/JW1v7oNeqzf21JBe2Pm9Kcmw7v/OTLNv6BZgO3JBkuyRXJbmxfb6k9VkxyXdajacluTbJjLZvjyRXt3M6PcnKg9R+SJJZSWY99NB9Q52iJEmSJE1ok2EF/SXAu6vqyiQnA+8Zof+WwCuAlYAbk3x/mL7/BPyuql4OkGSNIfpNB15YVS9r/eaH7ZOAQ6vq9iTbA18Edmv7NgV2r6qnkywF7AOc0vrdVVX39LLvH5wKfKaqzk6yArBUkj2ATYDtgADnJtm5qi4bpMa9gLur6vWtxtWGOe/5pgG7ApsDVwP7VtWHkpwNvB44B9gamFtVleQ2YOeqeirJ7sA/A/vS+9/kgaraMsnLgDmthucDH23X4ZEk/wh8APh4fxFVdVK7lmw0bXqNom5JkiRJmnAmQ0D/RVVd2ba/CRw2Qv/vVtVjwGNtBXs7WmAcxO7A2+Z/qaoHhuh3B7BRks8D3wcubCvBrwRO7wvay/cdc3pVPd22TwOOBE5p853WP3iSVej9AeDsVsf/tPY9gD2AG1vXlekF9sEC+s3AvyQ5Bjivqi4f4lz6/aCqnkxyM7A0cH7fWBu27b2AH7Tt1YCvJdkEKGDZ1r4j8LlW+7wkN7X2V9AL/1e2a7QcvT8ESJIkSdKUMxkC+sAV1QKe4tnb91cYRf+hZIT9vQGqHkiyFbAn8F7gLcD7gQeravoQhz3St301sHGStYG9gU8OUsdQ9X26qk4cRY0/TbIt8Drg00kurKqPM/y1erwd+0ySJ6tq/rV4hmf/29mD3io5wCeAS6pqnyQbApeOov6LqurtI9UvSZIkSZPdhH4GvdkgyQ5t++3AFcBdwLatbeBz429MskKStYBdgOuHGftC4G/nfxnqFvd2q/ZSVXUmvdvit6mqh4A7k7y59UkL8c/Rgu/ZwPHArVV134D9DwG/TLJ3G2v5JCsCFwDvmv/cdpIXJllniBrXAx6tqm8C/wJs03bdxdDXaljtNvll+updDfhV2z6wr+sV9P5oQZLNgZe39muAVyXZuO1bMcmmC1KDJEmSJE0WkyGg3wq8s902vSbwJeBo4HNJLgeeHtD/Onq3oV8DfKKq7h5m7E8Ca7SXqs2l9zz2YF4IXJpkDjAT+Ehr3x94dzv2FuCNw8x1GvAOBtze3ucvgcPaeV4F/K+quhD4FnB1uw39DGCVIY5/OXBdq/EInl2lH+5ajeQ1wA/7vh9Lb3X+Snq3xM/3RWDtVvs/AjfRe7b/t/SC/H+0fdcAmy1gDZIkSZI0KeTZu5YnnnYb9XnzX86mJSvJV4GvVtU1I/RbGli2qv6nvX3+YmDTqnpiQefcaNr0+sSxFy1cwZIkAPbfd+3xLkGSpCktyeyqmjGwfTI8g67/z96dh9tV1nf/f38gQaRQEGSswVRkKCIJEkCwUhCKYC1KlWJFEVD52apAEYfH+lirtSL6gzrUIVIBFSeoWsDKIDLJnAAhIqit0asCBWUelPH7/LHvlG085+ScnJPstZP367q4ztr3utd9f9cO/3z2vYYBqao3jLPrWsCF7dVsAf56WcK5JEmSJK3MhjqgV9XPgBW6ep7kKn77aewAr62qhSuyjtG0e+svGGHXXkve276iVNX9wO/8OiRJkiRJetJQB/RBqKpdBl3DWFoIH+3J8ZIkSZKkjloZHhInSZIkSdLQM6BLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHeBr1jRU1n/aNA5+xYaDLkOSJEmSppwr6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAX7Om
oXLP3Y9x5um/GnQZkrRC7X/g0wddgiRJWgFcQZckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIA+BZIcmeSmJKdNcpz3J9m7bV+UZM4yjHH5ZGpYHpJsmuS8QdchSZIkSV02bdAFrCT+BtivqhZNZpCqeu9kC6mq3SY7xnKwL3DuoIuQJEmSpC5zBX2SknwGeBZwZpJ3Jrk8yXXt79atz6FJvpXkrCSLkrwlyTGt35VJ1m/9TknyyiXGf32SE/s+vzHJCWPU80D7u0dbhT8jyc1JTkuStm+nVt+CJFcnWSfJmklOTrKw1bXnBGvfIsk5SeYnuTTJNn1l7Qt8p9V0cZKvJ/lxkuOSHNxqWJhki1HO6Ygk85LMu+++Oyf+jyRJkiRJQ8CAPklV9SbgVmBP4NPA7lW1A/Be4J/6um4HvBrYGfgg8FDrdwVwyBhTfBXYP8n09vkw4ORxlrcDcDSwLb0fEV6QZA3ga8BRVTUL2Bv4NfDmdj7PBf4KODXJmhOofS7w1qraETgW+BRAktWBravqh63fLOAo4LnAa4Gtqmpn4CTgrSOdRFXNrao5VTXn939/g3GeuiRJkiQNFy9xn1rr0gu2WwIFTO/bd2FV3Q/cn+Re4KzWvhDYfrQBq+rBJN8DXprkJmB6VS0cZz1XV9UvAJJcD8wE7gVuq6pr2vj3tf1/DHyitd2c5OfAVuOpPcnawG7A6W2RHuAp7e8uwFV9NV1TVbe1Of8LOK9vrD3HeV6SJEmStNIxoE+tD9ALswckmQlc1Lfv4b7tJ/o+P8HS/x1OAt4N3Mz4V8+XnPPxNk/o/XiwpIzQNtI4I9W+GnBPVc0e4dj9gHMmMJYkSZIkrZK8xH1qrQvc0rYPnapBq+oqYAa9y8y/MsnhbgY2S7ITQLv/fBpwCXBwa9sK2Bz40Tjruw9YlOTAdnySzGq79wIumGTNkiRJkrTSM6BPreOBDyW5DFh9isf+OnBZVd09mUGq6hHgIOATSRYA5wNr0rtnfPUkC+ndo35oVT08+ki/42Dg9W3MG4GXJdkQ+M3iy+glSZIkSaNL1UhXO6trkpwNnFhVQ7ManeQ1wDOq6ripGvPZW8yuE4777lQNJ0lDYf8Dnz7oEiRJ0hRKMr+q5izZ7j2/HZdkPeBqYMEwhXOAqvrSoGuQJEmSpGFhQO+4qrqHJ5+mDkCSDRj5vu69qsoXhUuSJEnSEDKgD6EWwkd6YrokSZIkaUj5kDhJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gK9Z01BZ72nT2P/Apw+6DEmSJEmacq6gS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQN8zZqGyn13PcZ3v/zLQZchScvd3q/ecNAlSJKkFcwVdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOgDkmS9JH8zRWPtkWS3qRhrGeefn2SNUfbtn+RdbfvlSbZdsdVJkiRJ0nAwoA/OesDvBPQkqy/DWHsAAwnoSWYCt1TVIyPtr6ozq+q49vHlgAFdkiRJkkZgQB+c44Atklyf5JokFyb5MrAwycwkP1jcMcmxSd7Xto9M8sMkNyT5agvIbwL+to31wpEmS3Jgkh8kWZDkktZ2aJJP9vU5O8kebfuBJB9uq+PfTbJzkouS/DTJ/n1D7wec047ZN8m1bY4L+udoK/z7Ax9pdW6R5Nq+ubdMMn+U2o9IMi/JvHvv
v3Ni37IkSZIkDYlpgy5gFfYuYLuqmt1C8bfb50UtdI913B9W1cNJ1quqe5J8Bnigqj46xnHvBV5cVbckWW8c9f0ecFFVvTPJN4F/BP6U3gr4qcCZrd++9H4c2BD4HLB7O4f1+werqsuTnAmcXVVnACS5N8nsqroeOAw4ZaRCqmouMBdgq2fNrnHULkmSJElDxxX07ri6qhaNo98NwGlJXgM8NoHxLwNOSfJGYDyX0T9CWxkHFgIXV9WjbXsmQLvv/BlV9VPg+cAli8+hqu4axxwnAYe1y/oPAr48/tORJEmSpJWLAb07Huzbfozf/rdZs2/7z4B/AXYE5icZ11UQVfUm4D3ADOD6JBssZZ5Hq2rxavUTwMNtnCd48sqLFwLfb9sBJrq6/W/0LpF/KTC/qrx+XZIkSdIqy4A+OPcD64yy73ZgoyQbJHkKvQBLktWAGVV1IfAOeg+aW3spY9GO3aKqrqqq9wK/ohfUfwbMTrJakhnAzhM8h32B77TtK4A/SfKHbb71R+j/W3VW1W+Ac4FPAydPcG5JkiRJWqkY0AekrRZf1h4G95El9j0KvB+4CjgbuLntWh34UpKFwHXAiVV1D3AWcMBYD4mj93C2hW2+S4AF9C57X0TvsvWPAteOcuxo9gAubjX/EjgC+EaSBcDXRuj/VeDtSa5LskVrO43eyvt5E5xbkiRJklYqefIqZmn8kjwD+FxV7TfJcY4F1q2q/zue/ls9a3Z96h/Pn8yUkjQU9n71hoMuQZIkLSdJ5lfVnCXbfYq7lklV/YLe/ePLrD0dfgvgRVNSlCRJkiQNMQP6SibJ3wEHLtF8elV9cBD1jKWqDhh0DZIkSZLUFQb0lUwL4p0L45IkSZKksfmQOEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDfIq7hsrvrz+NvV+94aDLkCRJkqQp5wq6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSB/gedA2VB+58jMu/8MtBlyFJy81uh2w46BIkSdKAuIIuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6BqoJOckuSfJ2YOuRZIkSZIGyYCuQfsI8NpBFyFJkiRJg2ZA14QlmZnkpiSfS3JjkvOSPDXJFm1FfH6SS5Ns0/pvkeTKJNckeX+SBxaPVVUXAPcP7GQkSZIkqSMM6FpWWwL/UlXPAe4BXgHMBd5aVTsCxwKfan0/BnysqnYCbp3oREmOSDIvybx77r9zaqqXJEmSpI4xoGtZLaqq69v2fGAmsBtwepLrgc8Cm7b9uwKnt+0vT3SiqppbVXOqas5662wwuaolSZIkqaOmDboADa2H+7YfBzYG7qmq2QOqR5IkSZKGmivomir3AYuSHAiQnllt35X0LoEHeNUgipMkSZKkrjOgayodDLw+yQLgRuBlrf1o4JgkV9O77P3exQckuZTe5e97JflFkhev4JolSZIkqRO8xF0TVlU/A7br+/zRvt37jnDILcDzq6qSvAqY13fsC5dXnZIkSZI0TAzoWhF2BD6ZJPSe+H74gOuRJEmSpM4xoGu5q6pLgVlL7ShJkiRJqzDvQZckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoA34OuobL2BtPY7ZANB12GJEmSJE05V9AlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wPega6g89KvHuO6kOwZdhiQtNzu8YaNBlyBJkgbEFXRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ
6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpA1b6gJ7kyCQ3JTltkuO8P8nebfuiJHOWYYzLJ1PD8pBk0yTnraC53r0i5pEkSZKkYTRt0AWsAH8D7FdViyYzSFW9d7KFVNVukx1jOdgXOHcFzfVu4J9W0FySJEmSNFRW6hX0JJ8BngWcmeSdSS5Pcl37u3Xrc2iSbyU5K8miJG9Jckzrd2WS9Vu/U5K8conxX5/kxL7Pb0xywhj1PND+7tFW4c9IcnOS05Kk7dup1bcgydVJ1kmyZpKTkyxsde05wdq3SHJOkvlJLk2yTV9Z+wLfaf3e0eZYkOS41ja7jXVDkm8meVpr/9+rCJI8PcnP+mr6RpvvJ0mOb+3HAU9Ncn073w8kOarvu/lgkiNH+d6OSDIvyby7779z7H90SZIkSRpSK3VAr6o3AbcCewKfBnavqh2A9/LbK7nbAa8GdgY+CDzU+l0BHDLGFF8F9k8yvX0+DDh5nOXtABwNbEvvR4QXJFkD+BpwVFXNAvYGfg28uZ3Pc4G/Ak5NsuYEap8LvLWqdgSOBT4FkGR1YOuq+mGS/YCXA7u0uY9vx34BeGdVbQ8sBP5+HOc2GzgIeC5wUJIZVfUu4NdVNbuqDgb+FXhdq2M14FXAiLchVNXcqppTVXOets4G45hekiRJkobPqnCJ+2Lr0gu2WwIFTO/bd2FV3Q/cn+Re4KzWvhDYfrQBq+rBJN8DXprkJmB6VS0cZz1XV9UvAJJcD8wE7gVuq6pr2vj3tf1/DHyitd2c5OfAVuOpPcnawG7A6W2RHuAp7e8uwFVte2/g5Kp6qM1zV5J1gfWq6uLW51Tg9HGc2wVVdW+r/YfAM4H/7u9QVT9LcmeSHYCNgeuqyuVxSZIkSausVSmgf4BemD0gyUzgor59D/dtP9H3+QmW/h2dRO/e6psZ/+r5knM+3uYJvR8PlpQR2kYaZ6TaVwPuqarZIxy7H3BO3xwjzT2ax3jyCow1l9g30rmN5CTgUGAT4PMTmFuSJEmSVjor9SXuS1gXuKVtHzpVg1bVVcAMepeZf2WSw90MbJZkJ4B2//k04BLg4Na2FbA58KNx1ncfsCjJge34JJnVdu8FXNC2zwMOT7JW67d+WwW/O8kLW5/XAotX038G7Ni2f+ve/DE82nc7AMA36d0DvxMr7kF1kiRJktRJq1JAPx74UJLLgNWneOyvA5dV1d2TGaSqHqF37/YnkiwAzqe3Ov0pYPUkC+ndo35oVT08+ki/42Dg9W3MG4GXJdkQ+M3iy+ir6hzgTGBeu+T+2Hbs64CPJLmB3r3l72/tHwX+Or1Xxz19nHXMBW5Ie+VdO98Lga9X1eMTOB9JkiRJWumkaiJXNWskSc4GTqyqC5bauSOSvAZ4RlUdN8AaVgOuBQ6sqp+M55htZ86u096zQl7bLkkDscMbNhp0CZIkaTlLMr+q5izZvirdgz7lkqwHXA0sGKZwDlBVXxrk/Em2Bc4GvjnecC5JkiRJKzMD+iRU1T08+TR1AJJswJP3dffby6eUP6mqfkjv9XKSJEmSJAzoU66F8JGemC5JkiRJ0qhWpYfESZIkSZLUWQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR3gU9w1VNZ6+jR2eMNGgy5DkiRJkqacK+iSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkd4HvQNVR+c8ej/Ohfbh90GZI05bZ+88aDLkGSJA2YK+iSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmS
OmCVDehJjk6y1hSM874kx05FTVMhyZuSHDLoOiRJkiRJE7PKBnTgaGDSAX2qJJk2FeNU1Weq6gtTMZYkSZIkacXpdEBPckiSG5IsSPLFJM9MckFruyDJ5q3fKUle2XfcA+3vHkkuSnJGkpuTnJaeI4HNgAuTXJjk9UlO7Dv+jUlOGKOuv0vyoyTfBbbua98iyTlJ5ie5NMk2ffV9prX9OMlLW/uhSU5PchZwXmt7e5Jr2jn+Q2v7vSTfbt/DD5Ic1NqPS/LD1vejre1/V/STzE5yZdv/zSRPa+0XJflwkqtbPS8c41wPTfKtJGclWZTkLUmOSXJdG3v9pZz7nye5qvX/bpKN++r8fKvlp+3fRJIkSZJWWVOyars8JHkO8HfAC6rqVy0Ingp8oapOTXI48HHg5UsZagfgOcCtwGVtvI8nOQbYs439e8ANSd5RVY8ChwH/3yh17Qi8qo07DbgWmN92zwXeVFU/SbIL8CngRW3fTOBPgC3o/TDw7Na+K7B9Vd2VZB9gS2BnIMCZSXYHNgRurao/azWs276PA4BtqqqSrDdCuV8A3lpVFyd5P/D39K4cAJhWVTsneUlr33uM73C7dr5rAv8JvLOqdmg/ahwC/PMY5/594PmtxjcA7wDe1sbdBtgTWAf4UZJPt+9/ye/8COAIgM2e9owxypQkSZKk4dXZgE4v3J1RVb8CaAF2V+Av2v4vAsePY5yrq+oXAEmupxeUv9/foaoeTPI94KVJbgKmV9XCUcZ7IfDNqnqojXlm+7s2sBtwepLFfZ/Sd9zXq+oJ4CdJfkovnAKcX1V3te192n/Xtc9r0wvslwIfTfJh4OyqurRdEv8b4KQk3wbO7i8yybrAelV1cWs6FTi9r8s32t/57TsZy4VVdT9wf5J7gbNa+0Jg+6Wc+zOAryXZFFgDWNQ37rer6mHg4SR3ABsDv1hy8qqaS+8HALbbfFYtpVZJkiRJGkpdDugBlhbGFu9/jHa5fnoJcY2+Pg/3bT/O6Od8EvBu4Gbg5HHO22814J6qmj3OYxZ/frCvLcCHquqzSx7cVu5fAnwoyXlV9f4kOwN70VvRfwtPrtaPx+LvZazvZMm+AE/0fX6iHTvWuX8COKGqzkyyB/C+UcYdTx2SJEmStNLq8j3oFwB/mWQDgHZJ9+X0wijAwTy5Ev4zYMe2/TJg+jjGv5/epdUAVNVVwAzg1cBXxjjuEuCAJE9Nsg7w5+34+4BFSQ5s9SbJrL7jDkyyWpItgGcBPxph7HOBw9uKNEn+IMlGSTYDHqqqLwEfBZ7X+qxbVf9B77L13wrHVXUvcHff/eWvBS5mOVjKua8L3NK2X7c85pckSZKklUFnVyyr6sYkHwQuTvI4vcu+jwQ+n+TtwC/p3SsO8Dng35NcTS/YPzjSmEuYC3wnyW1VtWdr+zowu6ruHqOua5N8Dbge+Dm9y88XOxj4dJL30PuR4KvAgrbvR/QC8sb07tX+Td/l4IvHPi/JHwFXtH0PAK8Bng18JMkTwKPAX9P7ceHfk6xJb+X9b0co93XAZ9J7ndxPefL7Wh5GO/f30bv0/RbgSuAPl2MNkiRJkjS0UuUtvYslORs4saoumOJxT6F37/gZUznuqmi7zWfVv73zvEGXIUlTbus3bzzoEiRJ0gqSZH5VzVmyvcuXuK8wSdZL8mPg11MdziVJkiRJGo/OXuK+IlXVPcBW/W3t3veRwvpeVXXnBMc/dNmrWzGSvBj48BLNi6rqgEHUI0mSJEmrGgP6KFoIH+2J7CudqjqX3kPqJEmSJEkD4CXukiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gBfs6ahsuZG09n6zRsPugxJkiRJmnKuoEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWA70HXUHnk9kf57///fwZdhiRNqRlv22TQ
JUiSpA5wBV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQlyLJkUluSnLaJMd5f5K92/ZFSeYswxiXT6aG5SHJpknOG2N//3kfnWStFVedJEmSJA2PaYMuYAj8DbBfVS2azCBV9d7JFlJVu012jOVgX+Dc0XYucd5HA18CHlreRUmSJEnSsHEFfQxJPgM8CzgzyTuTXJ7kuvZ369bn0CTfSnJWkkVJ3pLkmNbvyiTrt36nJHnlEuO/PsmJfZ/fmOSEMep5oP3do63Cn5Hk5iSnJUnbt1Orb0GSq5Osk2TNJCcnWdjq2nOCtW+R5Jwk85NcmmSbvrL2Bb7T+r2jzbEgyXH9553kSGAz4MIkF0703CVJkiRpZWdAH0NVvQm4FdgT+DSwe1XtALwX+Ke+rtsBrwZ2Bj4IPNT6XQEcMsYUXwX2TzK9fT4MOHmc5e1Ab0V6W3o/IrwgyRrA14CjqmoWsDfwa+DN7XyeC/wVcGqSNSdQ+1zgrVW1I3As8CmAJKsDW1fVD5PsB7wc2KXNfXx/sVX1cdp3WVV7TuTckxyRZF6SeXc9eOc4vx5JkiRJGi5e4j5+69ILtlsCBUzv23dhVd0P3J/kXuCs1r4Q2H60AavqwSTfA16a5CZgelUtHGc9V1fVLwCSXA/MBO4Fbquqa9r492P847UAACAASURBVLX9fwx8orXdnOTnwFbjqT3J2sBuwOltkR7gKe3vLsBVbXtv4OSqeqjNc9dYxU/k3KtqLr0fCdh+xqxa2hcjSZIkScPIgD5+H6AXZg9IMhO4qG/fw33bT/R9foKlf8cnAe8Gbmb8q+dLzvl4myf0fjxYUkZoG2mckWpfDbinqmaPcOx+wDl9c0w0PC/ruUuSJEnSSsdL3MdvXeCWtn3oVA1aVVcBM+hdZv6VSQ53M7BZkp0A2v3n04BLgINb21bA5sCPxlnffcCiJAe245NkVtu9F3BB2z4POHzxU9oX37++hPuBdfrGnspzlyRJkqShZkAfv+OBDyW5DFh9isf+OnBZVd09mUGq6hHgIOATSRYA5wNr0rtnfPUkC+ndo35oVT08+ki/42Dg9W3MG4GXJdkQ+M3iy+ir6hzgTGBeu+T+2BHGmQt8J8mFfW1Tcu6SJEmSNOxS5S29g5bkbODEqrpgqZ07IslrgGdU1XGTHGdC5779jFn17aNHfaubJA2lGW/bZNAlSJKkFSjJ/Kqas2S796APUJL1gKuBBcMUzgGq6kuTOX6Yz12SJEmSlgcD+gBV1T08+TR1AJJswJP3dffbq6pWmneMjXTukiRJkrQqM6B3TAvhIz0xXZIkSZK0EvMhcZIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAX7OmobLGxtOZ8bZNBl2GJEmSJE05V9AlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wPega6g8+j+P8D8f+dmgy5CkKbPJ22cOugRJktQRrqBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gAD+hRL8sA4+ly+ImpZUZJ8NskLRtm3WZIz2vbsJC9ZsdVJkiRJ0nAwoA9AVe022TGSTJuKWqbILsCVI+2oqlur6pXt42zAgC5JkiRJIzCgL0dJ3p7kmiQ3JPmHvvYH2t9Nk1yS5PokP0jywv79bfuVSU5p26ckOSHJhcCHk/xeks+3Oa5L8rIxanlOkqvbXDck2TLJzCQ/6OtzbJL3te2LkpzY
6rspyU5JvpHkJ0n+se+YPwJ+XFWPJ3l2ku8mWZDk2iRbLJ4jyRrA+4GDWg0HtbE2bOOsluQ/kzx9Kr57SZIkSRo2XVqFXakk2QfYEtgZCHBmkt2r6pK+bq8Gzq2qDyZZHVhrHENvBezdAvE/Ad+rqsOTrAdcneS7VfXgCMe9CfhYVZ3WwvLqwMZLmeuRqto9yVHAvwM7AncB/5XkxKq6E9gPOKf1Pw04rqq+mWRNej8AbQRQVY8keS8wp6re0r6jbYCDgX8G9gYWVNWvliwiyRHAEQB/sN5m4/iKJEmSJGn4uIK+/OzT/rsOuBbYhl5g73cNcFhbtX5uVd0/jnFPr6rH++Z4V5LrgYuANYHNRznuCuDdSd4JPLOqfj2Ouc5sfxcCN1bVbVX1MPBTYEbb92LgnCTrAH9QVd8EqKrfVNVDSxn/88Ahbftw4OSROlXV3KqaU1VzNvi9DcZRtiRJkiQNHwP68hPgQ1U1u/337Kr61/4ObTV9d+AW4ItJFofV6uu25hLj9q+OB3hF3xybV9VNIxVTVV8G9gd+DZyb5EXAY/z2/wNLzvVw+/tE3/biz9OSrAWsV1W3tlompKr+G7i91bIL8J2JjiFJkiRJKwsD+vJzLnB4krUBkvxBko36OyR5JnBHVX0O+FfgeW3X7Un+KMlqwAFLmeOtSdLG22G0jkmeBfy0qj5Ob2V8e+B2YKMkGyR5CvDSCZ7jnsCFAFV1H/CLJC9v8z2lBfh+9wPrLNF2EvAl4Ot9VwZIkiRJ0irHgL6cVNV5wJeBK5IsBM7gd8PpHsD1Sa4DXgF8rLW/Czgb+B5w2xjTfACYDtzQHvb2gTH6HgT8oF0Ovw3whap6lN6D265q89087hPs6b//HOC1wJFJbgAuBzZZov+FwLaLHxLX2s4E1maUy9slSZIkaVWRqlp6L2kESa4FdmlBf1nHmAOcWFUvHE//Wc/Yvs496syld5SkIbHJ22cOugRJkrSCJZlfVXOWbPcp7lpmVfW8pfcaXZJ3AX9N70nukiRJkrRKM6CvZJK8GPjwEs2Lqmqse9kHoqqOA44bdB2SJEmS1AUG9JVMVZ1L7+FxkiRJkqQh4kPiJEmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AG+Zk1DZfoma7DJ22cOugxJkiRJmnKuoEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWA70HXUHn09t/wPyf8cNBlSNKU2OSYbQddgiRJ6hBX0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1wNAE9CQ/S/L0tv3AGP02S3LGiqtsfJLsmuRzbfsrSW5I8rdTMO5/JFmvbY/6vYxx/JwkH59sHZIkSZKkyZk26AKmWlXdCrxy0HWMYF/gnCSbALtV1TOnYtCqeskkj58HzJuKWiRJkiRJy66TK+hJvpVkfpIbkxwxwWNnJvlB2z40yTeSnJPkJ0mO7+u3b5JrkyxIckFrW7/NfUOSK5Ns39rfl+TUJOe1lfy/SHJ8koVt7Omt345JLm61n5tk077S9gK+C5wHbJTk+iQvTPLGJNe0Ov4tyVptrFOSfDrJhUl+muRPknw+yU1JTuk7j/+9sqCv7YtJXtb3+bQk+4/yfe2R5Oy+8/x8kovanEf29TukfS8LknyxtT0zyQWt/YIkm0+w9n2SXNH+HU5Psvb4/6UlSZIkaeXSyYAOHF5VOwJzgCOTbDCJsWYDBwHPBQ5KMiPJhsDngFdU1SzgwNb3H4Drqmp74N3AF/rG2QL4M+BlwJeAC6vqucCvgT9rIf0TwCtb7Z8HPgjQAvSjVXUvsD/wX1U1u6ouBb5R
VTu1Om4CXt8359OAFwF/C5wFnAg8B3huktljnPNJwGFt7nWB3YD/GOf3tQ3wYmBn4O+TTE/yHODvgBe1Oo9qfT8JfKF9X6cB/ZfKj1l7+07eA+xdVc+jt4p/zEgFJTkiybwk8+588K5xnoYkSZIkDZeuXuJ+ZJID2vYMYMtJjHVBC8Yk+SHwTHrh8ZKqWgRQVYtT3x8Dr2ht30uyQQu4AN+pqkeTLARWB85p7QuBmcDWwHbA+UlofW5rffaht3I+ku2S/COwHrA2cG7fvrOqqtqct1fVwnYeN7Y5rx9pwKq6OMm/JNkI+Avg36rqsbG/pv/17ap6GHg4yR3AxvSC9hlV9as2/uLva9c2PsAXgeP7xlla7c8AtgUua9/XGsAVo5zPXGAuwKwZ29U4z0OSJEmShkrnAnqSPYC9gV2r6qEkFwFrTmLIh/u2H6d3zgFGCnoZoW1xv4cBquqJJI9W1eL2J/rGvLGqdh1hjP2AE0ap7xTg5VW1IMmhwB4j1P7EEuexeM6xfBE4GHgVcPhS+vabyPe1pP4+S6v9ceD8qvqrCdQmSZIkSSutLl7ivi5wdwvn2wDPXw5zXAH8SZI/hN695639EnqhdvEPBb+qqvvGOeaPgA2T7NqOn57kOektD2/PKKvdwDrAbe0S+YOX5WRGcQpwNEBV3TjJsS4A/nLxrQZ939fl9H4AgF7t35/AmFcCL0jy7DbmWkm2mmSdkiRJkjS0OreCTu/S8TcluYFe6L1yqieoql+m9/C5byRZDbgD+FPgfcDJbe6HgNdNYMxHkrwS+Hi7LH4a8M/AU+nd1z7aCvT/Ba4Cfk7vcvl1lu2sfqee25PcBHxrCsa6MckHgYuTPA5cBxwKHAl8PsnbgV/S7nsf55i/bFcMfCXJU1rze4AfT7ZeSZIkSRpGGT03aiokeQ/wn1X11RU871r0Av/zFt+DvzKYNWO7Ovdvvz7oMiRpSmxyzLaDLkGSJA1AkvlVNWfJ9i6uoK9UquofV/ScSfam9xT5E1amcC5JkiRJK7OhDehJnkvvQWj9Hq6qXQZRT5dU1XeBzfvbkrwY+PASXRdV1QFIkiRJkgZuaAN6e23XWO8CV5+qOpfffoWbJEmSJKlDuvgUd0mSJEmSVjkGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDhvY1a1o1Td94TTY5ZttBlyFJkiRJU84VdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA7wPegaKo/e/hC3//P8QZchSctk46N3HHQJkiSpw1xBlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBvQBSXJkkpuS3JLkk4OuZ1kl+T9JDh50HZIkSZI07Azog/M3wEuAv5uKwZJMG8SxwD7AeZM4XpIkSZKEAX0gknwGeBZwJvC0vvZnJrkgyQ3t7+ZLaT8lyQlJLgQ+PMpcOye5PMl17e/Wrf3QJKcnOYsWsJO8Pck1bZ5/6BvjW0nmJ7kxyRF97b8PrFFVv0zy50muavN8N8nGrc+GSc5Pcm2Szyb5eZKnt32vSXJ1kuvbvtWn8nuWJEmSpGFiQB+AqnoTcCuwJ3B3365PAl+oqu2B04CPL6UdYCtg76p62yjT3QzsXlU7AO8F/qlv367A66rqRUn2AbYEdgZmAzsm2b31O7yqdgTmAEcm2aC17w1c0La/Dzy/zfNV4B2t/e+B71XV84BvAot/XPgj4CDgBVU1G3gcGPFS+SRHJJmXZN5dD949UhdJkiRJGnqTubRZU29X4C/a9heB45fSDnB6VT0+xpjrAqcm2RIoYHrfvvOr6q62vU/777r2eW16gf0SeqH8gNY+
o7XfCewLnNzanwF8LcmmwBrAotb+x8ABAFV1TpLFCXsvYEfgmiQATwXuGOkEqmouMBdg1oxta4xzlSRJkqShZUDvttHCaH/7g0sZ4wPAhVV1QJKZwEWjHBvgQ1X12f6Dk+xBb6V816p6KMlFwJpt987AX7ftTwAnVNWZ7Zj39Y07kgCnVtX/WUr9kiRJkrRK8BL3brkceFXbPpjeZeNjtY/HusAtbfvQMfqdCxyeZG2AJH+QZKN2/N0tnG8DPL/tfw5wc9/qff88r+sb9/vAX7Zj9uHJe+4vAF7Z5iDJ+kmeOYHzkiRJkqSVigG9W44EDktyA/Ba4KiltI/H8cCHklwGjPoQtqo6D/gycEWShcAZwDrAOcC0NvcHgCvbIfu1fYu9Dzg9yaXAr/ra/wHYJ8m17ZjbgPur6ofAe4Dz2tjnA5tO4LwkSZIkaaWSKm/p1cQlOR84pKpuW0q/pwCPV9VjSXYFPt0eCrdMZs3Yts572xeX9XBJGqiNj95x0CVIkqQOSDK/quYs2e496FomVfWn4+y6OfD1JKsBjwBvXH5VSZIkSdLwMqCvJJIcxu9e+n5ZVb15EPUsVlU/AXYYZA2SJEmSNAwM6CuJqjqZJ195JkmSJEkaMj4kTpIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR3ga9Y0VKZvvBYbH73joMuQJEmSpCnnCrokSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIH+B50DZVH73iA2z922aDLkKRx2fioFwy6BEmSNERcQZckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCgS5IkSZLUAQZ0SZIkSZI6wIAuSZIkSVIHGNAlSZIkSeoAA7okSZIkSR1gQJckSZIkqQMM6JIkSZIkdYABXZIkSZKkDjCga2CSzE5yRZIbk9yQ5KBB1yRJkiRJgzJt0AVolfYQcEhV/STJZsD8JOdW1T2DLkySJEmSVjRX0DVhSWYmuSnJ59rq93lJnppkiyTnJJmf5NIk27T+WyS5Msk1Sd6f5AGAqvpxVf2kbd8K3AFsOLgzkyRJkqTBMaBrWW0J/EtVPQe4B3gFMBd4a1XtCBwLfKr1/RjwsaraCbh1pMGS7AysAfzXCPuOSDIvyby7HnBxXZIkSdLKyYCuZbWoqq5v2/OBmcBuwOlJrgc+C2za9u8KnN62v7zkQEk2Bb4IHFZVTyy5v6rmVtWcqpqz/trrTe1ZSJIkSVJHeA+6ltXDfduPAxsD91TV7IkMkuT3gW8D76mqK6ewPkmSJEkaKq6ga6rcByxKciBAema1fVfSuwQe4FWLD0iyBvBN4AtVdTqSJEmStAozoGsqHQy8PskC4EbgZa39aOCYJFfTu+z93tb+l8DuwKFJrm//TWgFXpIkSZJWFl7irgmrqp8B2/V9/mjf7n1HOOQW4PlVVUleBcxrx30J+NJyLFWSJEmShoYBXSvCjsAnk4TeE98PH3A9kiRJktQ5BnQtd1V1KTBrqR0lSZIkaRXmPeiSJEmSJHWAAV2SJEmSpA4woEuSJEmS1AEGdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDpg26AGkipm+0Nhsf9YJBlyFJkiRJU84VdEmSJEmSOsCALkmSJElSBxjQJUmSJEnqAAO6JEmSJEkdYECXJEmSJKkDDOiSJEmSJHWAAV2SJEmSpA7wPegaKo/dcT93fOJ7gy5DkpZqo7e+aNAlSJKkIeMKuiRJkiRJHWBAlyRJkiSpAwzokiRJkiR1gAFdkiRJkqQOMKBLkiRJktQBBnRJkiRJkjrAgC5JkiRJUgcY0CVJkiRJ6gADuiRJkiRJHWBAlyRJkiSpAwzokvT/2rv/cLuq+s7j7w8ERH6YVDQ+VcAoUBEsBI0IKhgRFZCitTD4awYYKw8j
AuIgZXREpdpqa4dioVhKFaqICOKM0qmCEH4YWiCBkESBqoFWkCECEglq+JHv/HHWNYfrTXJvTO7Z9+b9ep773L3XXnutdc4395x8z1p7H0mSJKkDTNAlSZIkSeoAE/QJJMlzk1zatmcmOXgU58xOcvmGH91T+vxQ3/aMJItXU++8JLuO38gkSZIkqbtM0CeIJFOq6idVdVgrmgmsNUEfkA+tvQpU1R9X1feHlyfZdP0PSZIkSZK6zQR9A2szyHe02eLFSS5MckCSuUl+kGSv9nNDklvb7xe1c49KckmSbwJXDM1GJ9kcOB04IsmCJEesro1RjO9jST6f5JokS5Kc0HfsA62/xUne38pOGaqT5IwkV7ft1yX5UpJPAU9v47qwNTUlyQVJFia5NMmW7Zxrksxq28uTnJ7kRmCfYWM8Jsm8JPMeXP7wugdDkiRJkjrMBH187AScCewO7AK8A3g1cDK92eY7gP2qak/gNODP+s7dBziyqvYfKqiqx1q9i6tqZlVdvJY21mYX4I3AXsBHk2yW5GXA0cArgL2B9yTZE7gO2LedNwvYOslm7fFcX1WnAr9s43pnq/ci4Nyq2h34OfDeEcawFbC4ql5RVd/tP1BV51bVrKqate3W08bwsCRJkiRp4pgy6AFsJO6qqkUASb4HXFVVlWQRMAOYClyQZGeggM36zr2yqh4aRR9ramNt/qmqVgArkiwFnkMv4f56VT3axn0ZvcT8HOBlSbYBVgC30EvU9wVOGKlx4MdVNbdtf6nV+8ywOk8CXxvDmCVJkiRpUnEGfXys6Nte2be/kt6HJH8KzKmqlwB/AGzRV//RUfaxpjbGMr4n25gyUsWqehy4m97s+g3A9cBrgR2B21fTfq1lH+BXVfXk6IcsSZIkSZOLCXo3TAXubdtHjfKcR4Btfss21uQ64C1JtkyyFfCH9JLxoWMnt9/XA8cCC6pqKPF+vC17H7JDkqHryt8OPGUJuyRJkiTJBL0r/gL48yRzgdHewXwOsOvQTeLWsY3VqqpbgPOBm4AbgfOq6tZ2+Hrgd4F/qar7gV+xKnkHOBdY2HeTuNuBI5MsBJ5Jb5m8JEmSJKlPVk16St03c4cX1RUfNL+X1H3Tj99/7ZUkSdJGKcn8qpo1vNwZdEmSJEmSOsC7uG8kkhwNnDiseG5VHTeI8UiSJEmSnsoEfSNRVV8AvjDocUiSJEmSRuYSd0mSJEmSOsAEXZIkSZKkDjBBlyRJkiSpA0zQJUmSJEnqABN0SZIkSZI6wARdkiRJkqQO8GvWNKFMmb4N04/ff9DDkCRJkqT1zhl0SZIkSZI6wARdkiRJkqQOMEGXJEmSJKkDTNAlSZIkSeoAE3RJkiRJkjrABF2SJEmSpA4wQZckSZIkqQP8HnRNKE8sXcbSs/7voIchSSOa/r6DBz0ESZI0gTmDLkmSJElSB5igS5IkSZLUASbokiRJkiR1gAm6JEmSJEkdYIIuSZIkSVIHmKBLkiRJktQBJuiSJEmSJHWACbokSZIkSR1ggi5JkiRJUgeYoEuSJEmS1AEm6JIkSZIkdYAJuiRJkiRJHWCCPoIk05K8dz21NTvJK9dHW+vY//wkm2/A9k9PcsCGal+SJEmSNhYm6CObBvxGgp5k03VoazYwkAQ9yQzg3qp6bJT1p4y1j6o6raq+M9bzJEmSJElPZYI+sk8BOyZZkOTmJHOSfBlYlGRGksVDFZOcnORjbfuEJN9PsjDJV1qCfCxwUmtr35E6S3J4ksVJbktyXSs7KslZfXUuTzK7bS9P8uk2O/6dJHsluSbJkiSH9jV9EPCtvnP+KsktSa5K8uxWfk2SP0tyLXBikue34wvb7x2STE1yd5JN2jlbJvlxks2SnJ/ksFZ+d5KPtz4WJdmllW+d5AutbGGSP2rlb0jyL63+JUm2Xs3zc0ySeUnmPbh82ZgCKUmSJEkThQn6yE4FflRVM4EPAnsBH66qXUdx3p5VtTtwbFXdDXwOOKOqZlbV
9as57zTgjVW1B3Doaur02wq4pqpeBjwCfAJ4PfCHwOl99Q6kJejtnFuq6qXAtcBH++pNq6rXVNVfAWcB/9gew4XAZ6tqGXAb8JpW/w+Ab1fV4yOM7YHWxznAya3sI8Cyqvr91u7VSZ4F/E/ggFZ/HvCBkR5sVZ1bVbOqata2W08dxdMjSZIkSROPCfro3FRVd42i3kLgwiTvAp4YQ/tzgfOTvAcYzTL6x1iVeC8Crm3J8iJgBkC77ny7qlrS6q0ELm7bXwJe3dfexX3b+wBfbttf7Kt3MXBE237bsHP6XdZ+zx8aC3AAcPZQhar6GbA3sCswN8kC4Ejg+atpU5IkSZImvTFfc7yRerRv+wme+sHGFn3bbwL2ozcL/pEku42m8ao6Nskr2vkLksxcSz+PV1W17ZXAitbOyr7ryPcFvrumbvu2H11trVX1vgH8eZJnAi8Drl5N/RXt95Os+veVYf0NlV1ZVW9fQ9+SJEmStNFwBn1kjwDbrObY/cD0JNsmeRpwCEC7Pnv7qpoDnELvRnNbr6Ut2rk7VtWNVXUa8ACwPXA3MDPJJkm2p7fMfiwOBP65b38T4LC2/Q5Wn7zfQG+GHOCdQ/WqajlwE3AmcHlVPTmGsVwBvG9oJ8nvAP8KvCrJTq1syyS/N4Y2JUmSJGlScQZ9BFX1YJK57WZwv6SXlA8dezzJ6cCNwF3AHe3QpsCXkkylNzt8RlU9nOSbwKVJ3gwcv5rr0P8yyc7tvKvoXe9Na38RsBi4ZYwPYza9a9uHPArslmQ+sIxVy9WHOwH4fJIPAj8Fju47djFwSWt7LD4BnN2ezyeBj1fVZUmOAi5qH3RA75r0fxtj25IkSZI0KWTVSmlNFkm2A/6+qg7qK1teVSPeJX0imbnDznXFKWcOehiSNKLp7zt40EOQJEkTQJL5VTVreLkz6JNQVd1D7yvWJEmSJEkThAn6OEryYeDwYcWXVNUnN3Tfk2H2XJIkSZImMxP0cdQS8Q2ejEuSJEmSJh7v4i5JkiRJUgeYoEuSJEmS1AEm6JIkSZIkdYAJuiRJkiRJHWCCLkmSJElSB5igS5IkSZLUAX7NmiaUKdOnMv19Bw96GJIkSZK03jmDLkmSJElSB5igS5IkSZLUASbokiRJkiR1gAm6JEmSJEkdYIIuSZIkSVIHmKBLkiRJktQBJuiSJEmSJHWA34OuCeWJpQ+z9OzLBj0MSRrR9OPeOughSJKkCcwZdEmSJEmSOsAEXZIkSZKkDjBBlyRJkiSpA0zQJUmSJEnqABN0SZIkSZI6wARdkiRJkqQOMEGXJEmSJKkDTNAlSZIkSeoAE3RJkiRJkjrABF2SJEmSpA4wQZckSZIkqQNM0CVJkiRJ6gAT9AFIMi3Je9dTW7OTvHJ9tLWO/c9Psvmg+pckSZKkycIEfTCmAb+RoCfZdB3amg0MJEFPMgO4t6oeG0T/kiRJkjSZmKAPxqeAHZMsSHJzkjlJvgwsSjIjyeKhiklOTvKxtn1Cku8nWZjkKy1BPhY4qbW170idJTk8yeIktyW5rpUdleSsvjqXJ5ndtpcn+XSbHf9Okr2SXJNkSZJD+5o+CPhWO+ecJPOSfC/Jx/vaPTjJHUm+m+SzSS5v5Vsl+Xx7/LcmefPqnqwkx7S25z24fNlYnmdJkiRJmjCmDHoAG6lTgZdU1cyWFP9T27+rJd1rOu8FVbUiybSqejjJxnTiIQAAC6dJREFU54DlVfWZNZx3GvDGqro3ybRRjG8r4Jqq+pMkXwc+Abwe2BW4APhGq3cgcFLb/nBVPdRWAVyVZHfg34C/A/Zrj+2ivj4+DFxdVf+1jemmJN+pqkeHD6aqzgXOBZi5w041ivFLkiRJ0oTjDHo33FRVd42i3kLgwiTvAp4YQ/tzgfOTvAcYzTL6x2gz48Ai4NqqerxtzwBo151vV1VLWr3/lOQW4FZgN3rJ/C7Akr7H1p+gvwE4NckC4BpgC2CHMTwmSZIkSZpUnEHvhv5Z4yd4
6gcnW/RtvwnYDzgU+EiS3UbTeFUdm+QV7fwFSWaupZ/Hq2popnolsKK1szLJ0L+ZfYHvAiR5AXAy8PKq+lmS81t7WcOwAvxRVd05mscgSZIkSZOdM+iD8QiwzWqO3Q9MT7JtkqcBhwAk2QTYvqrmAKfQu9Hc1mtpi3bujlV1Y1WdBjwAbA/cDcxMskmS7YG9xvgYDgT+uW0/g96HDMuSPIfetekAdwAv7Fu2f0Tf+d8Gjk+SNsY9x9i/JEmSJE0qzqAPQFU9mGRuuxncL+kl5UPHHk9yOnAjcBe9JBd6S9O/lGQqvdnnM9o16N8ELm03WTu+qq4focu/TLJzO+8q4LZWfhe9ZeuLgVvG+DBm07u2naq6LcmtwPeAJfSW1FNVv2xfJ/etJA8AN/Wd/6fAXwMLW5J+N+3DCEmSJEnaGGXVSmZpdJJsB/x9VR00irpbV9XyloSfDfygqs5Y175n7rBTXfEnf7Gup0vSBjX9uLcOegiSJGkCSDK/qmYNL3eJu8asqu4ZTXLevKfdCO57wFR6d3WXJEmSJA3jEvdJJMmHgcOHFV9SVZ8cxHgA2mz5Os+YS5IkSdLGwgR9EmmJ+MCScUmSJEnSunOJuyRJkiRJHWCCLkmSJElSB5igS5IkSZLUASbokiRJkiR1gAm6JEmSJEkdYIIuSZIkSVIH+DVrmlCmTJ/G9OPeOuhhSJIkSdJ65wy6JEmSJEkdYIIuSZIkSVIHmKBLkiRJktQBqapBj0EatSSPAHcOehwC4FnAA4MehABj0RXGoTuMRXcYi+4wFt1hLLph0HF4flU9e3ihN4nTRHNnVc0a9CAESeYZi24wFt1gHLrDWHSHsegOY9EdxqIbuhoHl7hLkiRJktQBJuiSJEmSJHWACbommnMHPQD9mrHoDmPRDcahO4xFdxiL7jAW3WEsuqGTcfAmcZIkSZIkdYAz6JIkSZIkdYAJuiRJkiRJHWCCrgkhyYFJ7kzywySnDno8k12SzydZmmRxX9kzk1yZ5Aft9++08iT5bIvNwiQvHdzIJ58k2yeZk+T2JN9LcmIrNx7jLMkWSW5KcluLxcdb+QuS3NhicXGSzVv509r+D9vxGYMc/2STZNMktya5vO0bhwFIcneSRUkWJJnXynx9GoAk05JcmuSO9p6xj7EYf0le1P4ehn5+nuT9xmIwkpzU3rMXJ7movZd3+v3CBF2dl2RT4GzgIGBX4O1Jdh3sqCa984EDh5WdClxVVTsDV7V96MVl5/ZzDHDOOI1xY/EE8N+r6sXA3sBx7d+/8Rh/K4D9q2oPYCZwYJK9gU8DZ7RY/Ax4d6v/buBnVbUTcEarp/XnROD2vn3jMDivraqZfd8n7OvTYJwJfKuqdgH2oPf3YSzGWVXd2f4eZgIvA34BfB1jMe6SPA84AZhVVS8BNgXeRsffL0zQNRHsBfywqpZU1WPAV4A3D3hMk1pVXQc8NKz4zcAFbfsC4C195f9YPf8KTEvyu+Mz0smvqu6rqlva9iP0/sP1PIzHuGvP6fK2u1n7KWB/4NJWPjwWQzG6FHhdkozTcCe1JNsBbwLOa/vBOHSJr0/jLMkzgP2AfwCoqseq6mGMxaC9DvhRVf07xmJQpgBPTzIF2BK4j46/X5igayJ4HvDjvv17WpnG13Oq6j7oJY3A9FZufMZJW2q1J3AjxmMg2rLqBcBS4ErgR8DDVfVEq9L/fP86Fu34MmDb8R3xpPXXwCnAyra/LcZhUAq4Isn8JMe0Ml+fxt8LgZ8CX2iXfpyXZCuMxaC9DbiobRuLcVZV9wKfAf6DXmK+DJhPx98vTNA1EYz0yZXfD9gdxmccJNka+Brw/qr6+ZqqjlBmPNaTqnqyLVvcjt7qnhePVK39NhYbQJJDgKVVNb+/eISqxmF8vKqqXkpvme5xSfZbQ11jseFMAV4KnFNVewKPsmoJ9UiMxQbWrms+FLhkbVVHKDMW60G7zv/NwAuA5wJb0XutGq5T7xcm6JoI7gG279vf
DvjJgMayMbt/aMlV+720lRufDSzJZvSS8wur6rJWbDwGqC0dvYbefQGmtaVz8NTn+9exaMen8puXjmjsXgUcmuRuepc87U9vRt04DEBV/aT9XkrvOtu98PVpEO4B7qmqG9v+pfQSdmMxOAcBt1TV/W3fWIy/A4C7quqnVfU4cBnwSjr+fmGCrongZmDndsfFzektF/rGgMe0MfoGcGTbPhL4P33l/6XdhXRvYNnQEi799tq1T/8A3F5V/6vvkPEYZ0menWRa2346vTf+24E5wGGt2vBYDMXoMODqqnJW5LdUVf+jqrarqhn03g+urqp3YhzGXZKtkmwztA28AViMr0/jrqr+H/DjJC9qRa8Dvo+xGKS3s2p5OxiLQfgPYO8kW7b/Tw39XXT6/SK+R2kiSHIwvRmSTYHPV9UnBzykSS3JRcBs4FnA/cBHgf8NfBXYgd4L3uFV9VB7wTuL3l3ffwEcXVXzBjHuySjJq4HrgUWsut72Q/SuQzce4yjJ7vRuHrMpvQ+4v1pVpyd5Ib2Z3GcCtwLvqqoVSbYAvkjvvgEPAW+rqiWDGf3klGQ2cHJVHWIcxl97zr/edqcAX66qTybZFl+fxl2SmfRunLg5sAQ4mvZahbEYV0m2pHct8wuralkr8+9iANL7StQj6H0rzq3AH9O71ryz7xcm6JIkSZIkdYBL3CVJkiRJ6gATdEmSJEmSOsAEXZIkSZKkDjBBlyRJkiSpA0zQJUmSJEnqABN0SZLUaUluGOf+ZiR5x3j2KUkSmKBLkqSOq6pXjldfSaYAMwATdEnSuPN70CVJUqclWV5VWyeZDXwcuB+YCVwGLAJOBJ4OvKWqfpTkfOBXwG7Ac4APVNXlSbYAzgFmAU+08jlJjgLeBGwBbAVsCbwYuAu4APg68MV2DOB9VXVDG8/HgAeAlwDzgXdVVSV5OXBmO2cF8DrgF8CngNnA04Czq+rv1vPTJUmawKYMegCSJEljsAe95PkhYAlwXlXtleRE4Hjg/a3eDOA1wI7AnCQ7AccBVNXvJ9kFuCLJ77X6+wC7V9VDLfE+uaoOAUiyJfD6qvpVkp2Bi+gl+QB70vsg4CfAXOBVSW4CLgaOqKqbkzwD+CXwbmBZVb08ydOAuUmuqKq7NsDzJEmagEzQJUnSRHJzVd0HkORHwBWtfBHw2r56X62qlcAPkiwBdgFeDfwNQFXdkeTfgaEE/cqqemg1fW4GnJVkJvBk3zkAN1XVPW08C+h9MLAMuK+qbm59/bwdfwOwe5LD2rlTgZ3pzdRLkmSCLkmSJpQVfdsr+/ZX8tT/1wy/hq+ArKHdR9dw7CR6y+r3oHf/nl+tZjxPtjFkhP5p5cdX1bfX0JckaSPmTeIkSdJkdHiSTZLsCLwQuBO4DngnQFvavkMrH+4RYJu+/an0ZsRXAv8Z2HQtfd8BPLddh06SbdrN574N/Lckmw2NIclWa2hHkrSRcQZdkiRNRncC19K7Sdyx7frxvwU+l2QRvZvEHVVVK5LfmFhfCDyR5DbgfOBvga8lORyYw5pn26mqx5IcAfxNkqfTu/78AOA8ekvgb0mv058Cb1kfD1aSNDl4F3dJkjSptLu4X15Vlw56LJIkjYVL3CVJkiRJ6gBn0CVJkiRJ6gBn0CVJkiRJ6gATdEmSJEmSOsAEXZIkSZKkDjBBlyRJkiSpA0zQJUmSJEnqgP8P5JlWHJ35NNcAAAAASUVORK5CYII=\n",
      "text/plain": [
       "<Figure size 1008x2016 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "# ---------------- Feature importance ----------------\n",
    "# Show all columns\n",
    "pd.set_option('display.max_columns', None)\n",
    "# Show all rows\n",
    "pd.set_option('display.max_rows', None)\n",
    "# Set the display width of values to 100 characters (default is 50)\n",
    "pd.set_option('display.max_colwidth', 100)\n",
    "df = pd.DataFrame(data[use_feature].columns.tolist(), columns=['feature'])\n",
    "df['importance']=list(lgb_263.feature_importance())\n",
    "df = df.sort_values(by='importance',ascending=False)\n",
    "plt.figure(figsize=(14,28))\n",
    "sns.barplot(x=\"importance\", y=\"feature\", data=df.head(50))\n",
    "plt.title('Features importance (averaged/folds)')\n",
    "plt.tight_layout()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Next, we model the 263-dimensional feature set with several common machine learning methods:\n",
    "\n",
    "2. XGBoost"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "[19:14:55] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:14:55] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.40426\tvalid_data-rmse:3.38329\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.40805\tvalid_data-rmse:0.70588\n",
      "[1000]\ttrain-rmse:0.27046\tvalid_data-rmse:0.70760\n",
      "Stopping. Best iteration:\n",
      "[663]\ttrain-rmse:0.35644\tvalid_data-rmse:0.70521\n",
      "\n",
      "[19:15:46] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "fold n°2\n",
      "[19:15:46] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:15:46] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.39811\tvalid_data-rmse:3.40788\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.40719\tvalid_data-rmse:0.69456\n",
      "[1000]\ttrain-rmse:0.27402\tvalid_data-rmse:0.69501\n",
      "Stopping. Best iteration:\n",
      "[551]\ttrain-rmse:0.39079\tvalid_data-rmse:0.69403\n",
      "\n",
      "[19:16:31] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "fold n°3\n",
      "[19:16:31] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:16:31] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.40181\tvalid_data-rmse:3.39295\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.41334\tvalid_data-rmse:0.66250\n",
      "Stopping. Best iteration:\n",
      "[333]\ttrain-rmse:0.47284\tvalid_data-rmse:0.66178\n",
      "\n",
      "[19:17:07] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "fold n°4\n",
      "[19:17:08] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:17:08] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.40240\tvalid_data-rmse:3.39012\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.41021\tvalid_data-rmse:0.66575\n",
      "[1000]\ttrain-rmse:0.27491\tvalid_data-rmse:0.66431\n",
      "Stopping. Best iteration:\n",
      "[863]\ttrain-rmse:0.30689\tvalid_data-rmse:0.66358\n",
      "\n",
      "[19:18:06] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "fold n°5\n",
      "[19:18:07] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:18:07] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.39347\tvalid_data-rmse:3.42628\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.41704\tvalid_data-rmse:0.64937\n",
      "[1000]\ttrain-rmse:0.27907\tvalid_data-rmse:0.64914\n",
      "Stopping. Best iteration:\n",
      "[598]\ttrain-rmse:0.38625\tvalid_data-rmse:0.64856\n",
      "\n",
      "[19:18:55] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "CV score: 0.45559329\n"
     ]
    }
   ],
   "source": [
    "##### xgb_263\n",
    "# xgboost\n",
    "xgb_263_params = {'eta': 0.02,  # learning rate\n",
    "              'max_depth': 6,\n",
    "              'min_child_weight': 3,  # minimum sum of instance weights needed in a leaf\n",
    "              'gamma': 0,  # minimum loss reduction required to split a node\n",
    "              'subsample': 0.7,  # fraction of rows sampled for each tree\n",
    "              'colsample_bytree': 0.3,  # fraction of columns (features) sampled for each tree\n",
    "              'lambda': 2,  # L2 regularization\n",
    "              'objective': 'reg:linear',  # deprecated alias of 'reg:squarederror' in newer XGBoost\n",
    "              'eval_metric': 'rmse',\n",
    "              'silent': True,  # ignored by newer XGBoost ('verbosity' replaces it)\n",
    "              'nthread': -1}\n",
    "\n",
    "\n",
    "folds = KFold(n_splits=5, shuffle=True, random_state=2019)\n",
    "oof_xgb_263 = np.zeros(len(X_train_263))\n",
    "predictions_xgb_263 = np.zeros(len(X_test_263))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_263, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    trn_data = xgb.DMatrix(X_train_263[trn_idx], y_train[trn_idx])\n",
    "    val_data = xgb.DMatrix(X_train_263[val_idx], y_train[val_idx])\n",
    "\n",
    "    watchlist = [(trn_data, 'train'), (val_data, 'valid_data')]\n",
    "    xgb_263 = xgb.train(dtrain=trn_data, num_boost_round=3000, evals=watchlist, early_stopping_rounds=600, verbose_eval=500, params=xgb_263_params)\n",
    "    oof_xgb_263[val_idx] = xgb_263.predict(xgb.DMatrix(X_train_263[val_idx]), ntree_limit=xgb_263.best_ntree_limit)\n",
    "    predictions_xgb_263 += xgb_263.predict(xgb.DMatrix(X_test_263), ntree_limit=xgb_263.best_ntree_limit) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_xgb_263, target)))"
   ]
  },
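  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The out-of-fold (OOF) pattern above — each fold's model predicts only its held-out indices, so every training sample receives exactly one prediction from a model that never saw it — is what makes the final CV score honest. A minimal sketch of the same pattern on synthetic data (the `Ridge` model and data shapes here are illustrative, not from this notebook):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "from sklearn.model_selection import KFold\n",
    "from sklearn.linear_model import Ridge\n",
    "from sklearn.metrics import mean_squared_error\n",
    "\n",
    "rng = np.random.RandomState(2019)\n",
    "X = rng.randn(200, 5)\n",
    "y = X @ rng.randn(5) + 0.1 * rng.randn(200)\n",
    "\n",
    "folds = KFold(n_splits=5, shuffle=True, random_state=2019)\n",
    "oof = np.zeros(len(X))\n",
    "for trn_idx, val_idx in folds.split(X, y):\n",
    "    model = Ridge().fit(X[trn_idx], y[trn_idx])\n",
    "    oof[val_idx] = model.predict(X[val_idx])  # out-of-fold prediction only\n",
    "\n",
    "print('CV score: {:<8.8f}'.format(mean_squared_error(y, oof)))\n",
    "```"
   ]
  },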
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "3. RandomForestRegressor (random forest)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.6s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    2.6s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    6.5s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:   11.8s\n",
      "[Parallel(n_jobs=-1)]: Done 1234 tasks      | elapsed:   18.9s\n",
      "[Parallel(n_jobs=-1)]: Done 1600 out of 1600 | elapsed:   25.6s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.2s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.2s finished\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°2\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.6s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    2.8s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    6.9s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:   12.9s\n",
      "[Parallel(n_jobs=-1)]: Done 1234 tasks      | elapsed:   21.0s\n",
      "[Parallel(n_jobs=-1)]: Done 1600 out of 1600 | elapsed:   27.5s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.2s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.3s finished\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°3\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.6s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    3.4s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    7.6s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:   13.7s\n",
      "[Parallel(n_jobs=-1)]: Done 1234 tasks      | elapsed:   21.0s\n",
      "[Parallel(n_jobs=-1)]: Done 1600 out of 1600 | elapsed:   26.9s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.2s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.2s finished\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°4\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.8s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    3.5s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    7.9s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:   13.3s\n",
      "[Parallel(n_jobs=-1)]: Done 1234 tasks      | elapsed:   20.6s\n",
      "[Parallel(n_jobs=-1)]: Done 1600 out of 1600 | elapsed:   26.1s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.2s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.2s finished\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°5\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.6s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    2.7s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    6.8s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:   12.2s\n",
      "[Parallel(n_jobs=-1)]: Done 1234 tasks      | elapsed:   19.2s\n",
      "[Parallel(n_jobs=-1)]: Done 1600 out of 1600 | elapsed:   25.1s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.2s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "CV score: 0.47804209\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=8)]: Done 1234 tasks      | elapsed:    0.2s\n",
      "[Parallel(n_jobs=8)]: Done 1600 out of 1600 | elapsed:    0.3s finished\n"
     ]
    }
   ],
   "source": [
    "# RandomForestRegressor (random forest)\n",
    "folds = KFold(n_splits=5, shuffle=True, random_state=2019)\n",
    "oof_rfr_263 = np.zeros(len(X_train_263))\n",
    "predictions_rfr_263 = np.zeros(len(X_test_263))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_263, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_263[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    rfr_263 = rfr(n_estimators=1600, max_depth=9, min_samples_leaf=9, min_weight_fraction_leaf=0.0,\n",
    "                  max_features=0.25, verbose=1, n_jobs=-1)\n",
    "    # verbose = 0: no log output to stdout\n",
    "    # verbose = 1: print progress messages\n",
    "    # verbose = 2: print one line per iteration\n",
    "    rfr_263.fit(tr_x,tr_y)\n",
    "    oof_rfr_263[val_idx] = rfr_263.predict(X_train_263[val_idx])\n",
    "    \n",
    "    predictions_rfr_263 += rfr_263.predict(X_test_263) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_rfr_263, target)))"
   ]
  },
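  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a side note, a fitted `RandomForestRegressor` exposes `feature_importances_` (normalized to sum to 1), which could be plotted with the same seaborn bar-plot recipe used for the LightGBM importances above. A small sketch on synthetic data (all names here are illustrative):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "from sklearn.ensemble import RandomForestRegressor\n",
    "\n",
    "rng = np.random.RandomState(0)\n",
    "X = rng.randn(300, 4)\n",
    "y = 3 * X[:, 0] + 0.1 * rng.randn(300)  # only feature 0 carries signal\n",
    "\n",
    "rf = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)\n",
    "imp = rf.feature_importances_\n",
    "print(imp.argmax())  # feature 0 dominates\n",
    "```"
   ]
  },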
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "4. GradientBoostingRegressor (gradient boosted decision trees)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6419           0.0036           24.34s\n",
      "         2           0.6564           0.0031           23.18s\n",
      "         3           0.6693           0.0031           22.69s\n",
      "         4           0.6589           0.0031           22.78s\n",
      "         5           0.6522           0.0027           22.58s\n",
      "         6           0.6521           0.0031           22.40s\n",
      "         7           0.6370           0.0029           22.23s\n",
      "         8           0.6343           0.0030           22.06s\n",
      "         9           0.6447           0.0029           21.87s\n",
      "        10           0.6397           0.0028           21.75s\n",
      "        20           0.5955           0.0019           20.93s\n",
      "        30           0.5695           0.0016           20.09s\n",
      "        40           0.5460           0.0015           19.34s\n",
      "        50           0.5121           0.0011           18.65s\n",
      "        60           0.4994           0.0012           18.03s\n",
      "        70           0.4912           0.0010           17.44s\n",
      "        80           0.4719           0.0010           16.76s\n",
      "        90           0.4310           0.0007           16.28s\n",
      "       100           0.4437           0.0006           15.84s\n",
      "       200           0.3424           0.0002           10.15s\n",
      "       300           0.3063          -0.0000            4.94s\n",
      "       400           0.2759          -0.0000            0.00s\n",
      "fold n°2\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6836           0.0034           24.61s\n",
      "         2           0.6613           0.0030           22.86s\n",
      "         3           0.6500           0.0031           24.11s\n",
      "         4           0.6621           0.0036           23.15s\n",
      "         5           0.6356           0.0031           23.49s\n",
      "         6           0.6460           0.0029           23.13s\n",
      "         7           0.6263           0.0032           22.83s\n",
      "         8           0.6149           0.0029           22.72s\n",
      "         9           0.6350           0.0030           22.83s\n",
      "        10           0.6325           0.0026           22.65s\n",
      "        20           0.6064           0.0025           21.62s\n",
      "        30           0.5812           0.0018           20.59s\n",
      "        40           0.5460           0.0018           19.98s\n",
      "        50           0.5016           0.0014           19.52s\n",
      "        60           0.4991           0.0010           18.84s\n",
      "        70           0.4645           0.0009           18.24s\n",
      "        80           0.4621           0.0007           17.76s\n",
      "        90           0.4497           0.0007           17.20s\n",
      "       100           0.4374           0.0005           16.51s\n",
      "       200           0.3420           0.0001           10.35s\n",
      "       300           0.3032          -0.0000            4.95s\n",
      "       400           0.2710          -0.0000            0.00s\n",
      "fold n°3\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6692           0.0036           24.95s\n",
      "         2           0.6468           0.0031           23.99s\n",
      "         3           0.6313           0.0034           24.05s\n",
      "         4           0.6499           0.0032           23.70s\n",
      "         5           0.6358           0.0033           23.38s\n",
      "         6           0.6343           0.0029           23.05s\n",
      "         7           0.6312           0.0036           22.71s\n",
      "         8           0.6180           0.0032           22.47s\n",
      "         9           0.6275           0.0035           22.57s\n",
      "        10           0.6168           0.0030           22.24s\n",
      "        20           0.5792           0.0021           20.73s\n",
      "        30           0.5583           0.0023           20.27s\n",
      "        40           0.5521           0.0018           19.70s\n",
      "        50           0.5067           0.0013           18.84s\n",
      "        60           0.4754           0.0010           18.42s\n",
      "        70           0.4811           0.0009           17.84s\n",
      "        80           0.4603           0.0008           17.38s\n",
      "        90           0.4439           0.0006           16.74s\n",
      "       100           0.4323           0.0007           16.25s\n",
      "       200           0.3401           0.0002           10.23s\n",
      "       300           0.2862          -0.0000            4.84s\n",
      "       400           0.2690          -0.0000            0.00s\n",
      "fold n°4\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6687           0.0032           21.09s\n",
      "         2           0.6517           0.0031           23.29s\n",
      "         3           0.6583           0.0031           23.63s\n",
      "         4           0.6607           0.0033           24.45s\n",
      "         5           0.6583           0.0029           24.78s\n",
      "         6           0.6688           0.0028           24.80s\n",
      "         7           0.6320           0.0030           25.08s\n",
      "         8           0.6502           0.0026           24.94s\n",
      "         9           0.6358           0.0026           24.51s\n",
      "        10           0.6258           0.0027           24.24s\n",
      "        20           0.5910           0.0023           22.41s\n",
      "        30           0.5609           0.0020           21.31s\n",
      "        40           0.5399           0.0017           20.50s\n",
      "        50           0.4963           0.0013           19.67s\n",
      "        60           0.4844           0.0012           18.86s\n",
      "        70           0.4781           0.0008           18.21s\n",
      "        80           0.4484           0.0010           17.63s\n",
      "        90           0.4619           0.0006           16.95s\n",
      "       100           0.4430           0.0005           16.46s\n",
      "       200           0.3377           0.0001           10.50s\n",
      "       300           0.3001           0.0001            4.97s\n",
      "       400           0.2623          -0.0000            0.00s\n",
      "fold n°5\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6857           0.0031           23.50s\n",
      "         2           0.6320           0.0035           24.26s\n",
      "         3           0.6573           0.0033           23.41s\n",
      "         4           0.6494           0.0033           24.20s\n",
      "         5           0.6311           0.0033           24.32s\n",
      "         6           0.6362           0.0031           24.20s\n",
      "         7           0.6291           0.0032           24.05s\n",
      "         8           0.6354           0.0032           23.56s\n",
      "         9           0.6383           0.0030           23.54s\n",
      "        10           0.6250           0.0029           23.64s\n",
      "        20           0.5989           0.0023           21.45s\n",
      "        30           0.5736           0.0019           20.27s\n",
      "        40           0.5457           0.0016           19.60s\n",
      "        50           0.5045           0.0015           18.76s\n",
      "        60           0.4820           0.0012           18.20s\n",
      "        70           0.4756           0.0010           17.44s\n",
      "        80           0.4484           0.0009           16.91s\n",
      "        90           0.4410           0.0007           16.34s\n",
      "       100           0.4195           0.0004           15.72s\n",
      "       200           0.3348           0.0001           10.05s\n",
      "       300           0.2933          -0.0000            4.76s\n",
      "       400           0.2658          -0.0000            0.00s\n",
      "CV score: 0.45583290\n"
     ]
    }
   ],
   "source": [
    "# GradientBoostingRegressor (gradient boosting decision trees)\n",
    "folds = StratifiedKFold(n_splits=5, shuffle=True, random_state=2018)  # stratify on the 1-5 labels\n",
    "oof_gbr_263 = np.zeros(train_shape)\n",
    "predictions_gbr_263 = np.zeros(len(X_test_263))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_263, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_263[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    gbr_263 = gbr(n_estimators=400, learning_rate=0.01, subsample=0.65, max_depth=7, min_samples_leaf=20,\n",
    "                  max_features=0.22, verbose=1)\n",
    "    gbr_263.fit(tr_x,tr_y)\n",
    "    oof_gbr_263[val_idx] = gbr_263.predict(X_train_263[val_idx])\n",
    "    \n",
    "    predictions_gbr_263 += gbr_263.predict(X_test_263) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_gbr_263, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "5. ExtraTreesRegressor (extremely randomized trees)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.4s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    1.7s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    4.0s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:    7.2s\n",
      "[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed:    9.0s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.1s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.1s finished\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°2\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.3s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    1.6s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    3.8s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:    6.9s\n",
      "[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed:    8.9s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.1s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.1s finished\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°3\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.4s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    1.7s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    4.1s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:    7.6s\n",
      "[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed:    9.6s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.1s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.1s finished\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°4\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.4s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    1.7s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    4.0s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:    7.6s\n",
      "[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed:   10.6s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.2s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.2s finished\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°5\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=-1)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=-1)]: Done  34 tasks      | elapsed:    0.4s\n",
      "[Parallel(n_jobs=-1)]: Done 184 tasks      | elapsed:    1.9s\n",
      "[Parallel(n_jobs=-1)]: Done 434 tasks      | elapsed:    4.4s\n",
      "[Parallel(n_jobs=-1)]: Done 784 tasks      | elapsed:    8.6s\n",
      "[Parallel(n_jobs=-1)]: Done 1000 out of 1000 | elapsed:   10.7s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.1s finished\n",
      "[Parallel(n_jobs=8)]: Using backend ThreadingBackend with 8 concurrent workers.\n",
      "[Parallel(n_jobs=8)]: Done  34 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 184 tasks      | elapsed:    0.0s\n",
      "[Parallel(n_jobs=8)]: Done 434 tasks      | elapsed:    0.1s\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "CV score: 0.48598792\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[Parallel(n_jobs=8)]: Done 784 tasks      | elapsed:    0.1s\n",
      "[Parallel(n_jobs=8)]: Done 1000 out of 1000 | elapsed:    0.1s finished\n"
     ]
    }
   ],
   "source": [
    "# ExtraTreesRegressor (extremely randomized trees)\n",
    "folds = KFold(n_splits=5, shuffle=True, random_state=13)\n",
    "oof_etr_263 = np.zeros(train_shape)\n",
    "predictions_etr_263 = np.zeros(len(X_test_263))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_263, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_263[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    etr_263 = etr(n_estimators=1000, max_depth=8, min_samples_leaf=12, min_weight_fraction_leaf=0.0,\n",
    "                  max_features=0.4, verbose=1, n_jobs=-1)\n",
    "    etr_263.fit(tr_x,tr_y)\n",
    "    oof_etr_263[val_idx] = etr_263.predict(X_train_263[val_idx])\n",
    "    \n",
    "    predictions_etr_263 += etr_263.predict(X_test_263) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_etr_263, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "At this point we have out-of-fold and test-set predictions from all five models above, along with their architectures and parameters. Next we stack them: the five models' predictions become the input features for a Kernel Ridge Regression meta-model, trained with 5-fold cross-validation repeated twice. The same procedure is carried out for each feature set."
   ]
  },
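  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The stacking step can first be sketched on synthetic data. This is only an illustrative toy: the randomly generated OOF matrix and the closed-form ridge below are stand-ins for the real base-model predictions and for `kr()`.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "rng = np.random.default_rng(7)\n",
    "n, n_models = 200, 5\n",
    "y = rng.integers(1, 6, size=n).astype(float)  # fake 1-5 happiness labels\n",
    "# fake OOF matrix: each \"model\" is the target plus noise, shape (n, 5)\n",
    "train_stack = np.column_stack([y + rng.normal(0, 0.5, n) for _ in range(n_models)])\n",
    "\n",
    "def ridge_fit(X, t, lam=1.0):\n",
    "    # closed-form ridge: w = (X'X + lam*I)^-1 X't\n",
    "    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ t)\n",
    "\n",
    "oof = np.zeros(n)\n",
    "n_splits, n_repeats = 5, 2\n",
    "for _ in range(n_repeats):\n",
    "    order = rng.permutation(n)\n",
    "    for val_idx in np.array_split(order, n_splits):\n",
    "        trn_idx = np.setdiff1d(order, val_idx)\n",
    "        w = ridge_fit(train_stack[trn_idx], y[trn_idx])\n",
    "        # the second repeat overwrites oof here; the real cell instead\n",
    "        # averages its test predictions over all 10 fits (the \"/ 10\")\n",
    "        oof[val_idx] = train_stack[val_idx] @ w\n",
    "print(np.mean((oof - y) ** 2))  # meta-model MSE\n",
    "```\n",
    "\n",
    "Because the meta-model only averages five already-decent predictors, its MSE should come out well below any single base model's noise variance."
   ]
  },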
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold 0\n",
      "fold 1\n",
      "fold 2\n",
      "fold 3\n",
      "fold 4\n",
      "fold 5\n",
      "fold 6\n",
      "fold 7\n",
      "fold 8\n",
      "fold 9\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "0.44815130114230267"
      ]
     },
     "execution_count": 21,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train_stack2 = np.vstack([oof_lgb_263,oof_xgb_263,oof_gbr_263,oof_rfr_263,oof_etr_263]).transpose()\n",
    "# vstack gives shape (5, n_samples); transpose() flips it to (n_samples, 5),\n",
    "# so each row holds one sample's five base-model predictions\n",
    "test_stack2 = np.vstack([predictions_lgb_263, predictions_xgb_263,predictions_gbr_263,predictions_rfr_263,predictions_etr_263]).transpose()\n",
    "\n",
    "# cross-validation: 5 folds, repeated twice (10 fits in total)\n",
    "folds_stack = RepeatedKFold(n_splits=5, n_repeats=2, random_state=7)\n",
    "oof_stack2 = np.zeros(train_stack2.shape[0])\n",
    "predictions_lr2 = np.zeros(test_stack2.shape[0])\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds_stack.split(train_stack2,target)):\n",
    "    print(\"fold {}\".format(fold_))\n",
    "    trn_data, trn_y = train_stack2[trn_idx], target.iloc[trn_idx].values\n",
    "    val_data, val_y = train_stack2[val_idx], target.iloc[val_idx].values\n",
    "    # Kernel Ridge Regression as the stacking meta-model\n",
    "    lr2 = kr()\n",
    "    lr2.fit(trn_data, trn_y)\n",
    "    \n",
    "    oof_stack2[val_idx] = lr2.predict(val_data)\n",
    "    predictions_lr2 += lr2.predict(test_stack2) / 10  # average over 5 folds x 2 repeats\n",
    "    \n",
    "mean_squared_error(target.values, oof_stack2) "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Next we apply the same procedure to the 49-dimensional features that we used for the 263-dimensional data above.\n",
    "\n",
    "1. LightGBM"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "Training until validation scores don't improve for 1000 rounds\n",
      "[1000]\ttraining's l2: 0.46958\tvalid_1's l2: 0.500767\n",
      "[2000]\ttraining's l2: 0.429395\tvalid_1's l2: 0.482214\n",
      "[3000]\ttraining's l2: 0.406748\tvalid_1's l2: 0.477959\n",
      "[4000]\ttraining's l2: 0.388735\tvalid_1's l2: 0.476283\n",
      "[5000]\ttraining's l2: 0.373399\tvalid_1's l2: 0.475506\n",
      "[6000]\ttraining's l2: 0.359798\tvalid_1's l2: 0.475435\n",
      "Early stopping, best iteration is:\n",
      "[5429]\ttraining's l2: 0.367348\tvalid_1's l2: 0.475325\n",
      "fold n°2\n",
      "Training until validation scores don't improve for 1000 rounds\n",
      "[1000]\ttraining's l2: 0.469767\tvalid_1's l2: 0.496741\n",
      "[2000]\ttraining's l2: 0.428546\tvalid_1's l2: 0.479198\n",
      "[3000]\ttraining's l2: 0.405733\tvalid_1's l2: 0.475903\n",
      "[4000]\ttraining's l2: 0.388021\tvalid_1's l2: 0.474891\n",
      "[5000]\ttraining's l2: 0.372619\tvalid_1's l2: 0.474262\n",
      "[6000]\ttraining's l2: 0.358826\tvalid_1's l2: 0.47449\n",
      "Early stopping, best iteration is:\n",
      "[5002]\ttraining's l2: 0.372597\tvalid_1's l2: 0.47425\n",
      "fold n°3\n",
      "Training until validation scores don't improve for 1000 rounds\n",
      "[1000]\ttraining's l2: 0.47361\tvalid_1's l2: 0.4839\n",
      "[2000]\ttraining's l2: 0.433064\tvalid_1's l2: 0.462219\n",
      "[3000]\ttraining's l2: 0.410658\tvalid_1's l2: 0.457989\n",
      "[4000]\ttraining's l2: 0.392859\tvalid_1's l2: 0.456091\n",
      "[5000]\ttraining's l2: 0.377706\tvalid_1's l2: 0.455416\n",
      "[6000]\ttraining's l2: 0.364058\tvalid_1's l2: 0.455285\n",
      "Early stopping, best iteration is:\n",
      "[5815]\ttraining's l2: 0.3665\tvalid_1's l2: 0.455119\n",
      "fold n°4\n",
      "Training until validation scores don't improve for 1000 rounds\n",
      "[1000]\ttraining's l2: 0.471715\tvalid_1's l2: 0.496877\n",
      "[2000]\ttraining's l2: 0.431956\tvalid_1's l2: 0.472828\n",
      "[3000]\ttraining's l2: 0.409505\tvalid_1's l2: 0.467016\n",
      "[4000]\ttraining's l2: 0.391659\tvalid_1's l2: 0.464929\n",
      "[5000]\ttraining's l2: 0.376239\tvalid_1's l2: 0.464048\n",
      "[6000]\ttraining's l2: 0.36213\tvalid_1's l2: 0.463628\n",
      "[7000]\ttraining's l2: 0.349338\tvalid_1's l2: 0.463767\n",
      "Early stopping, best iteration is:\n",
      "[6272]\ttraining's l2: 0.358584\tvalid_1's l2: 0.463542\n",
      "fold n°5\n",
      "Training until validation scores don't improve for 1000 rounds\n",
      "[1000]\ttraining's l2: 0.466349\tvalid_1's l2: 0.507696\n",
      "[2000]\ttraining's l2: 0.425606\tvalid_1's l2: 0.492745\n",
      "[3000]\ttraining's l2: 0.403731\tvalid_1's l2: 0.488917\n",
      "[4000]\ttraining's l2: 0.386479\tvalid_1's l2: 0.487113\n",
      "[5000]\ttraining's l2: 0.371358\tvalid_1's l2: 0.485881\n",
      "[6000]\ttraining's l2: 0.357821\tvalid_1's l2: 0.485185\n",
      "[7000]\ttraining's l2: 0.345577\tvalid_1's l2: 0.484535\n",
      "[8000]\ttraining's l2: 0.33415\tvalid_1's l2: 0.484483\n",
      "Early stopping, best iteration is:\n",
      "[7649]\ttraining's l2: 0.338078\tvalid_1's l2: 0.484416\n",
      "CV score: 0.47052692\n"
     ]
    }
   ],
   "source": [
    "##### lgb_49\n",
    "lgb_49_param = {\n",
    "    'num_leaves': 9,\n",
    "    'min_data_in_leaf': 23,\n",
    "    'objective': 'regression',\n",
    "    'max_depth': -1,\n",
    "    'learning_rate': 0.002,\n",
    "    'boosting': 'gbdt',\n",
    "    'feature_fraction': 0.45,\n",
    "    'bagging_freq': 1,\n",
    "    'bagging_fraction': 0.65,\n",
    "    'bagging_seed': 15,\n",
    "    'metric': 'mse',\n",
    "    'lambda_l2': 0.2,\n",
    "    'verbosity': -1}\n",
    "folds = StratifiedKFold(n_splits=5, shuffle=True, random_state=9)   \n",
    "oof_lgb_49 = np.zeros(len(X_train_49))\n",
    "predictions_lgb_49 = np.zeros(len(X_test_49))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_49, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    trn_data = lgb.Dataset(X_train_49[trn_idx], y_train[trn_idx])\n",
    "    val_data = lgb.Dataset(X_train_49[val_idx], y_train[val_idx])\n",
    "\n",
    "    num_round = 12000\n",
    "    lgb_49 = lgb.train(lgb_49_param, trn_data, num_round, valid_sets = [trn_data, val_data], verbose_eval=1000, early_stopping_rounds = 1000)\n",
    "    oof_lgb_49[val_idx] = lgb_49.predict(X_train_49[val_idx], num_iteration=lgb_49.best_iteration)\n",
    "    predictions_lgb_49 += lgb_49.predict(X_test_49, num_iteration=lgb_49.best_iteration) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_lgb_49, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "2. XGBoost"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "[19:25:31] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:25:31] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.40431\tvalid_data-rmse:3.38307\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.52770\tvalid_data-rmse:0.72110\n",
      "[1000]\ttrain-rmse:0.43563\tvalid_data-rmse:0.72245\n",
      "Stopping. Best iteration:\n",
      "[690]\ttrain-rmse:0.49010\tvalid_data-rmse:0.72044\n",
      "\n",
      "[19:25:44] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "fold n°2\n",
      "[19:25:44] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:25:44] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.39815\tvalid_data-rmse:3.40784\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.52871\tvalid_data-rmse:0.70336\n",
      "[1000]\ttrain-rmse:0.43793\tvalid_data-rmse:0.70446\n",
      "Stopping. Best iteration:\n",
      "[754]\ttrain-rmse:0.47982\tvalid_data-rmse:0.70278\n",
      "\n",
      "[19:25:57] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "fold n°3\n",
      "[19:25:57] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:25:57] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.40183\tvalid_data-rmse:3.39291\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.53169\tvalid_data-rmse:0.66896\n",
      "[1000]\ttrain-rmse:0.44129\tvalid_data-rmse:0.67058\n",
      "Stopping. Best iteration:\n",
      "[452]\ttrain-rmse:0.54177\tvalid_data-rmse:0.66871\n",
      "\n",
      "[19:26:07] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "fold n°4\n",
      "[19:26:07] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:26:07] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.40240\tvalid_data-rmse:3.39014\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.53218\tvalid_data-rmse:0.67783\n",
      "[1000]\ttrain-rmse:0.44361\tvalid_data-rmse:0.67978\n",
      "Stopping. Best iteration:\n",
      "[566]\ttrain-rmse:0.51924\tvalid_data-rmse:0.67765\n",
      "\n",
      "[19:26:18] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "fold n°5\n",
      "[19:26:19] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "[19:26:19] WARNING: /Users/travis/build/dmlc/xgboost/src/learner.cc:480: \n",
      "Parameters: { silent } might not be used.\n",
      "\n",
      "  This may not be accurate due to some parameters are only used in language bindings but\n",
      "  passed down to XGBoost core.  Or some parameters are not used but slip through this\n",
      "  verification. Please open an issue if you find above cases.\n",
      "\n",
      "\n",
      "[0]\ttrain-rmse:3.39345\tvalid_data-rmse:3.42619\n",
      "Multiple eval metrics have been passed: 'valid_data-rmse' will be used for early stopping.\n",
      "\n",
      "Will train until valid_data-rmse hasn't improved in 600 rounds.\n",
      "[500]\ttrain-rmse:0.53565\tvalid_data-rmse:0.66150\n",
      "[1000]\ttrain-rmse:0.44204\tvalid_data-rmse:0.66241\n",
      "Stopping. Best iteration:\n",
      "[747]\ttrain-rmse:0.48554\tvalid_data-rmse:0.66016\n",
      "\n",
      "[19:26:32] WARNING: /Users/travis/build/dmlc/xgboost/src/objective/regression_obj.cu:170: reg:linear is now deprecated in favor of reg:squarederror.\n",
      "CV score: 0.47102840\n"
     ]
    }
   ],
   "source": [
    "##### xgb_49\n",
    "xgb_49_params = {'eta': 0.02, \n",
    "              'max_depth': 5, \n",
    "              'min_child_weight':3,\n",
    "              'gamma':0,\n",
    "              'subsample': 0.7, \n",
    "              'colsample_bytree': 0.35, \n",
    "              'lambda':2,\n",
    "              'objective': 'reg:squarederror',  # 'reg:linear' is deprecated in recent xgboost\n",
    "              'eval_metric': 'rmse', \n",
    "              'nthread': -1}\n",
    "\n",
    "\n",
    "folds = KFold(n_splits=5, shuffle=True, random_state=2019)\n",
    "oof_xgb_49 = np.zeros(len(X_train_49))\n",
    "predictions_xgb_49 = np.zeros(len(X_test_49))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_49, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    trn_data = xgb.DMatrix(X_train_49[trn_idx], y_train[trn_idx])\n",
    "    val_data = xgb.DMatrix(X_train_49[val_idx], y_train[val_idx])\n",
    "\n",
    "    watchlist = [(trn_data, 'train'), (val_data, 'valid_data')]\n",
    "    xgb_49 = xgb.train(dtrain=trn_data, num_boost_round=3000, evals=watchlist, early_stopping_rounds=600, verbose_eval=500, params=xgb_49_params)\n",
    "    oof_xgb_49[val_idx] = xgb_49.predict(xgb.DMatrix(X_train_49[val_idx]), ntree_limit=xgb_49.best_ntree_limit)\n",
    "    predictions_xgb_49 += xgb_49.predict(xgb.DMatrix(X_test_49), ntree_limit=xgb_49.best_ntree_limit) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_xgb_49, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "3. GradientBoostingRegressor (gradient boosted decision trees)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6529           0.0032            9.69s\n",
      "         2           0.6736           0.0029            9.55s\n",
      "         3           0.6522           0.0029            9.29s\n",
      "         4           0.6393           0.0034            9.49s\n",
      "         5           0.6454           0.0032            9.36s\n",
      "         6           0.6467           0.0031            9.22s\n",
      "         7           0.6650           0.0026            9.23s\n",
      "         8           0.6225           0.0030            9.20s\n",
      "         9           0.6350           0.0028            9.09s\n",
      "        10           0.6311           0.0028            9.25s\n",
      "        20           0.6074           0.0022            8.67s\n",
      "        30           0.5790           0.0017            8.19s\n",
      "        40           0.5443           0.0016            7.89s\n",
      "        50           0.5405           0.0013            7.63s\n",
      "        60           0.5141           0.0010            7.47s\n",
      "        70           0.4991           0.0008            7.28s\n",
      "        80           0.4791           0.0007            7.12s\n",
      "        90           0.4707           0.0006            6.92s\n",
      "       100           0.4632           0.0006            6.74s\n",
      "       200           0.4013           0.0001            5.09s\n",
      "       300           0.3924          -0.0001            3.62s\n",
      "       400           0.3526          -0.0000            2.32s\n",
      "       500           0.3355          -0.0000            1.12s\n",
      "       600           0.3201          -0.0000            0.00s\n",
      "fold n°2\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6518           0.0034            8.83s\n",
      "         2           0.6618           0.0033            8.42s\n",
      "         3           0.6483           0.0032            8.28s\n",
      "         4           0.6592           0.0029            8.27s\n",
      "         5           0.6386           0.0030            8.18s\n",
      "         6           0.6438           0.0031            8.16s\n",
      "         7           0.6477           0.0033            8.12s\n",
      "         8           0.6593           0.0029            8.15s\n",
      "         9           0.6182           0.0029            8.19s\n",
      "        10           0.6358           0.0028            8.32s\n",
      "        20           0.5810           0.0025            7.91s\n",
      "        30           0.5816           0.0020            7.74s\n",
      "        40           0.5529           0.0013            7.53s\n",
      "        50           0.5402           0.0011            7.38s\n",
      "        60           0.5096           0.0011            7.17s\n",
      "        70           0.4883           0.0010            7.03s\n",
      "        80           0.4980           0.0007            6.84s\n",
      "        90           0.4706           0.0006            6.71s\n",
      "       100           0.4704           0.0004            6.55s\n",
      "       200           0.3867           0.0001            5.01s\n",
      "       300           0.3686          -0.0000            3.60s\n",
      "       400           0.3363          -0.0000            2.32s\n",
      "       500           0.3357          -0.0000            1.13s\n",
      "       600           0.3160          -0.0000            0.00s\n",
      "fold n°3\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6457           0.0038            8.04s\n",
      "         2           0.6687           0.0033            8.08s\n",
      "         3           0.6462           0.0036            8.04s\n",
      "         4           0.6587           0.0035            8.02s\n",
      "         5           0.6430           0.0031            7.99s\n",
      "         6           0.6540           0.0029            7.95s\n",
      "         7           0.6377           0.0030            7.93s\n",
      "         8           0.6414           0.0030            7.97s\n",
      "         9           0.6399           0.0030            8.07s\n",
      "        10           0.6375           0.0028            8.07s\n",
      "        20           0.5949           0.0025            7.67s\n",
      "        30           0.5854           0.0019            7.72s\n",
      "        40           0.5386           0.0016            7.46s\n",
      "        50           0.5156           0.0013            7.32s\n",
      "        60           0.5080           0.0011            7.17s\n",
      "        70           0.5021           0.0009            7.04s\n",
      "        80           0.4654           0.0008            6.85s\n",
      "        90           0.4712           0.0006            6.72s\n",
      "       100           0.4740           0.0006            6.53s\n",
      "       200           0.3924           0.0000            4.96s\n",
      "       300           0.3568          -0.0000            3.58s\n",
      "       400           0.3400          -0.0001            2.31s\n",
      "       500           0.3283          -0.0001            1.12s\n",
      "       600           0.3044          -0.0000            0.00s\n",
      "fold n°4\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6606           0.0032            8.27s\n",
      "         2           0.6878           0.0030            8.37s\n",
      "         3           0.6490           0.0031            8.37s\n",
      "         4           0.6564           0.0032            8.29s\n",
      "         5           0.6568           0.0027            8.27s\n",
      "         6           0.6496           0.0030            8.27s\n",
      "         7           0.6451           0.0029            8.22s\n",
      "         8           0.6210           0.0031            8.21s\n",
      "         9           0.6239           0.0028            8.35s\n",
      "        10           0.6535           0.0025            8.35s\n",
      "        20           0.6038           0.0022            7.92s\n",
      "        30           0.6032           0.0019            7.76s\n",
      "        40           0.5492           0.0018            7.55s\n",
      "        50           0.5333           0.0011            7.37s\n",
      "        60           0.4973           0.0010            7.24s\n",
      "        70           0.4942           0.0009            7.09s\n",
      "        80           0.4753           0.0008            6.92s\n",
      "        90           0.4806           0.0005            6.76s\n",
      "       100           0.4659           0.0005            6.58s\n",
      "       200           0.4046           0.0000            4.99s\n",
      "       300           0.3647          -0.0000            3.59s\n",
      "       400           0.3561          -0.0000            2.32s\n",
      "       500           0.3330          -0.0000            1.12s\n",
      "       600           0.3152          -0.0000            0.00s\n",
      "fold n°5\n",
      "      Iter       Train Loss      OOB Improve   Remaining Time \n",
      "         1           0.6721           0.0036            8.28s\n",
      "         2           0.6822           0.0034            8.41s\n",
      "         3           0.6634           0.0033            8.26s\n",
      "         4           0.6584           0.0032            8.21s\n",
      "         5           0.6574           0.0030            8.40s\n",
      "         6           0.6544           0.0033            8.31s\n",
      "         7           0.6533           0.0028            8.30s\n",
      "         8           0.6196           0.0029            8.27s\n",
      "         9           0.6530           0.0028            8.43s\n",
      "        10           0.6108           0.0032            8.49s\n",
      "        20           0.6107           0.0027            7.91s\n",
      "        30           0.5649           0.0020            7.70s\n",
      "        40           0.5555           0.0016            7.55s\n",
      "        50           0.5156           0.0014            7.40s\n",
      "        60           0.5144           0.0010            7.21s\n",
      "        70           0.5001           0.0009            7.05s\n",
      "        80           0.4908           0.0007            6.88s\n",
      "        90           0.4820           0.0008            6.73s\n",
      "       100           0.4617           0.0007            6.55s\n",
      "       200           0.3993          -0.0000            5.01s\n",
      "       300           0.3678          -0.0000            3.61s\n",
      "       400           0.3399          -0.0000            2.31s\n",
      "       500           0.3182          -0.0000            1.12s\n",
      "       600           0.3238          -0.0000            0.00s\n",
      "CV score: 0.46724198\n"
     ]
    }
   ],
   "source": [
    "folds = StratifiedKFold(n_splits=5, shuffle=True, random_state=2018)\n",
    "oof_gbr_49 = np.zeros(train_shape)\n",
    "predictions_gbr_49 = np.zeros(len(X_test_49))\n",
    "# GradientBoostingRegressor (stratified folds work here because y_train takes the integer values 1-5)\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_49, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_49[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    gbr_49 = gbr(n_estimators=600, learning_rate=0.01,subsample=0.65,max_depth=6, min_samples_leaf=20,\n",
    "            max_features=0.35,verbose=1)\n",
    "    gbr_49.fit(tr_x,tr_y)\n",
    "    oof_gbr_49[val_idx] = gbr_49.predict(X_train_49[val_idx])\n",
    "    \n",
    "    predictions_gbr_49 += gbr_49.predict(X_test_49) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_gbr_49, target)))"
   ]
  },
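  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The model cells above all share the same out-of-fold (OOF) pattern: each fold's model predicts its held-out rows exactly once (giving an unbiased training-set prediction for the CV score) and contributes 1/k of the averaged test-set prediction. A minimal, self-contained sketch of this pattern on synthetic data (the names and data here are illustrative, not from this notebook):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.model_selection import KFold\n",
    "from sklearn.linear_model import Ridge\n",
    "from sklearn.metrics import mean_squared_error\n",
    "\n",
    "rng = np.random.RandomState(0)\n",
    "X, X_test = rng.randn(200, 5), rng.randn(50, 5)\n",
    "y = 2.0 * X[:, 0] + 0.1 * rng.randn(200)\n",
    "\n",
    "folds = KFold(n_splits=5, shuffle=True, random_state=2019)\n",
    "oof = np.zeros(len(X))         # out-of-fold predictions on the training set\n",
    "preds = np.zeros(len(X_test))  # test predictions, averaged over the fold models\n",
    "\n",
    "for trn_idx, val_idx in folds.split(X, y):\n",
    "    model = Ridge().fit(X[trn_idx], y[trn_idx])\n",
    "    oof[val_idx] = model.predict(X[val_idx])         # each row predicted exactly once\n",
    "    preds += model.predict(X_test) / folds.n_splits  # 1/k of the average per fold\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(y, oof)))"
   ]
  },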
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "At this point we have the predictions, architectures, and parameters of the above three models built on the 49 features. Next we stack their out-of-fold predictions, running 5-fold cross-validation repeated twice with Kernel Ridge Regression as the second-level model."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold 0\n",
      "fold 1\n",
      "fold 2\n",
      "fold 3\n",
      "fold 4\n",
      "fold 5\n",
      "fold 6\n",
      "fold 7\n",
      "fold 8\n",
      "fold 9\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "0.4662728551415085"
      ]
     },
     "execution_count": 25,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train_stack3 = np.vstack([oof_lgb_49,oof_xgb_49,oof_gbr_49]).transpose()\n",
    "test_stack3 = np.vstack([predictions_lgb_49, predictions_xgb_49,predictions_gbr_49]).transpose()\n",
    "#\n",
    "folds_stack = RepeatedKFold(n_splits=5, n_repeats=2, random_state=7)\n",
    "oof_stack3 = np.zeros(train_stack3.shape[0])\n",
    "predictions_lr3 = np.zeros(test_stack3.shape[0])\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds_stack.split(train_stack3,target)):\n",
    "    print(\"fold {}\".format(fold_))\n",
    "    trn_data, trn_y = train_stack3[trn_idx], target.iloc[trn_idx].values\n",
    "    val_data, val_y = train_stack3[val_idx], target.iloc[val_idx].values\n",
    "    # Kernel Ridge Regression as the second-level model\n",
    "    lr3 = kr()\n",
    "    lr3.fit(trn_data, trn_y)\n",
    "    \n",
    "    oof_stack3[val_idx] = lr3.predict(val_data)\n",
    "    predictions_lr3 += lr3.predict(test_stack3) / 10\n",
    "    \n",
    "mean_squared_error(target.values, oof_stack3) \n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Next we apply to the 383-dimensional data the same procedure used above for the 263- and 49-dimensional data.\n",
    "\n",
    "1. Kernel Ridge Regression"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "fold n°2\n",
      "fold n°3\n",
      "fold n°4\n",
      "fold n°5\n",
      "CV score: 0.51412085\n"
     ]
    }
   ],
   "source": [
    "folds = KFold(n_splits=5, shuffle=True, random_state=13)\n",
    "oof_kr_383 = np.zeros(train_shape)\n",
    "predictions_kr_383 = np.zeros(len(X_test_383))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_383, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_383[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    # Kernel Ridge Regression\n",
    "    kr_383 = kr()\n",
    "    kr_383.fit(tr_x,tr_y)\n",
    "    oof_kr_383[val_idx] = kr_383.predict(X_train_383[val_idx])\n",
    "    \n",
    "    predictions_kr_383 += kr_383.predict(X_test_383) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_kr_383, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "2. Ordinary Ridge regression"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "fold n°2\n",
      "fold n°3\n",
      "fold n°4\n",
      "fold n°5\n",
      "CV score: 0.48687670\n"
     ]
    }
   ],
   "source": [
    "folds = KFold(n_splits=5, shuffle=True, random_state=13)\n",
    "oof_ridge_383 = np.zeros(train_shape)\n",
    "predictions_ridge_383 = np.zeros(len(X_test_383))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_383, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_383[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    # Ridge regression\n",
    "    ridge_383 = Ridge(alpha=1200)\n",
    "    ridge_383.fit(tr_x,tr_y)\n",
    "    oof_ridge_383[val_idx] = ridge_383.predict(X_train_383[val_idx])\n",
    "    \n",
    "    predictions_ridge_383 += ridge_383.predict(X_test_383) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_ridge_383, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "3. ElasticNet"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 28,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "fold n°2\n",
      "fold n°3\n",
      "fold n°4\n",
      "fold n°5\n",
      "CV score: 0.53296555\n"
     ]
    }
   ],
   "source": [
    "folds = KFold(n_splits=5, shuffle=True, random_state=13)\n",
    "oof_en_383 = np.zeros(train_shape)\n",
    "predictions_en_383 = np.zeros(len(X_test_383))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_383, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_383[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    # ElasticNet\n",
    "    en_383 = en(alpha=1.0,l1_ratio=0.06)\n",
    "    en_383.fit(tr_x,tr_y)\n",
    "    oof_en_383[val_idx] = en_383.predict(X_train_383[val_idx])\n",
    "    \n",
    "    predictions_en_383 += en_383.predict(X_test_383) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_en_383, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "4. BayesianRidge (Bayesian ridge regression)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "fold n°2\n",
      "fold n°3\n",
      "fold n°4\n",
      "fold n°5\n",
      "CV score: 0.48717310\n"
     ]
    }
   ],
   "source": [
    "folds = KFold(n_splits=5, shuffle=True, random_state=13)\n",
    "oof_br_383 = np.zeros(train_shape)\n",
    "predictions_br_383 = np.zeros(len(X_test_383))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_383, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_383[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    # BayesianRidge\n",
    "    br_383 = br()\n",
    "    br_383.fit(tr_x,tr_y)\n",
    "    oof_br_383[val_idx] = br_383.predict(X_train_383[val_idx])\n",
    "    \n",
    "    predictions_br_383 += br_383.predict(X_test_383) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_br_383, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "At this point we have the predictions, architectures, and parameters of the above four models built on the 383 features. We again stack their out-of-fold predictions, running 5-fold cross-validation repeated twice, this time with plain LinearRegression as the second-level model."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 30,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold 0\n",
      "fold 1\n",
      "fold 2\n",
      "fold 3\n",
      "fold 4\n",
      "fold 5\n",
      "fold 6\n",
      "fold 7\n",
      "fold 8\n",
      "fold 9\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "0.4878202780283125"
      ]
     },
     "execution_count": 30,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train_stack1 = np.vstack([oof_br_383,oof_kr_383,oof_en_383,oof_ridge_383]).transpose()\n",
    "test_stack1 = np.vstack([predictions_br_383, predictions_kr_383,predictions_en_383,predictions_ridge_383]).transpose()\n",
    "\n",
    "folds_stack = RepeatedKFold(n_splits=5, n_repeats=2, random_state=7)\n",
    "oof_stack1 = np.zeros(train_stack1.shape[0])\n",
    "predictions_lr1 = np.zeros(test_stack1.shape[0])\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds_stack.split(train_stack1,target)):\n",
    "    print(\"fold {}\".format(fold_))\n",
    "    trn_data, trn_y = train_stack1[trn_idx], target.iloc[trn_idx].values\n",
    "    val_data, val_y = train_stack1[val_idx], target.iloc[val_idx].values\n",
    "    # plain LinearRegression as the second-level model\n",
    "    lr1 = lr()\n",
    "    lr1.fit(trn_data, trn_y)\n",
    "    \n",
    "    oof_stack1[val_idx] = lr1.predict(val_data)\n",
    "    predictions_lr1 += lr1.predict(test_stack1) / 10\n",
    "    \n",
    "mean_squared_error(target.values, oof_stack1) \n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Since the 49-dimensional feature set contains the most important features, we build additional models on this data.\n",
    "1. KernelRidge (kernel ridge regression)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "fold n°2\n",
      "fold n°3\n",
      "fold n°4\n",
      "fold n°5\n",
      "CV score: 0.50254410\n"
     ]
    }
   ],
   "source": [
    "folds = KFold(n_splits=5, shuffle=True, random_state=13)\n",
    "oof_kr_49 = np.zeros(train_shape)\n",
    "predictions_kr_49 = np.zeros(len(X_test_49))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_49, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_49[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    kr_49 = kr()\n",
    "    kr_49.fit(tr_x,tr_y)\n",
    "    oof_kr_49[val_idx] = kr_49.predict(X_train_49[val_idx])\n",
    "    \n",
    "    predictions_kr_49 += kr_49.predict(X_test_49) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_kr_49, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "2. Ridge regression"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 32,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "fold n°2\n",
      "fold n°3\n",
      "fold n°4\n",
      "fold n°5\n",
      "CV score: 0.49451286\n"
     ]
    }
   ],
   "source": [
    "folds = KFold(n_splits=5, shuffle=True, random_state=13)\n",
    "oof_ridge_49 = np.zeros(train_shape)\n",
    "predictions_ridge_49 = np.zeros(len(X_test_49))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_49, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_49[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    ridge_49 = Ridge(alpha=6)\n",
    "    ridge_49.fit(tr_x,tr_y)\n",
    "    oof_ridge_49[val_idx] = ridge_49.predict(X_train_49[val_idx])\n",
    "    \n",
    "    predictions_ridge_49 += ridge_49.predict(X_test_49) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_ridge_49, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "3. BayesianRidge (Bayesian ridge regression)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 33,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "fold n°2\n",
      "fold n°3\n",
      "fold n°4\n",
      "fold n°5\n",
      "CV score: 0.49534595\n"
     ]
    }
   ],
   "source": [
    "folds = KFold(n_splits=5, shuffle=True, random_state=13)\n",
    "oof_br_49 = np.zeros(train_shape)\n",
    "predictions_br_49 = np.zeros(len(X_test_49))\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_49, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_49[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    br_49 = br()\n",
    "    br_49.fit(tr_x,tr_y)\n",
    "    oof_br_49[val_idx] = br_49.predict(X_train_49[val_idx])\n",
    "    \n",
    "    predictions_br_49 += br_49.predict(X_test_49) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_br_49, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "4. ElasticNet"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 34,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold n°1\n",
      "fold n°2\n",
      "fold n°3\n",
      "fold n°4\n",
      "fold n°5\n",
      "CV score: 0.53841695\n"
     ]
    }
   ],
   "source": [
    "folds = KFold(n_splits=5, shuffle=True, random_state=13)\n",
    "oof_en_49 = np.zeros(train_shape)\n",
    "predictions_en_49 = np.zeros(len(X_test_49))\n",
    "#\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds.split(X_train_49, y_train)):\n",
    "    print(\"fold n°{}\".format(fold_+1))\n",
    "    tr_x = X_train_49[trn_idx]\n",
    "    tr_y = y_train[trn_idx]\n",
    "    en_49 = en(alpha=1.0,l1_ratio=0.05)\n",
    "    en_49.fit(tr_x,tr_y)\n",
    "    oof_en_49[val_idx] = en_49.predict(X_train_49[val_idx])\n",
    "    \n",
    "    predictions_en_49 += en_49.predict(X_test_49) / folds.n_splits\n",
    "\n",
    "print(\"CV score: {:<8.8f}\".format(mean_squared_error(oof_en_49, target)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This gives us the predictions, architectures, and parameters of the four additional models built on the 49 features; once more we stack their out-of-fold predictions with 5-fold cross-validation repeated twice, using LinearRegression as the second-level model."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold 0\n",
      "fold 1\n",
      "fold 2\n",
      "fold 3\n",
      "fold 4\n",
      "fold 5\n",
      "fold 6\n",
      "fold 7\n",
      "fold 8\n",
      "fold 9\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "0.49491439094008133"
      ]
     },
     "execution_count": 35,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train_stack4 = np.vstack([oof_br_49,oof_kr_49,oof_en_49,oof_ridge_49]).transpose()\n",
    "test_stack4 = np.vstack([predictions_br_49, predictions_kr_49,predictions_en_49,predictions_ridge_49]).transpose()\n",
    "\n",
    "folds_stack = RepeatedKFold(n_splits=5, n_repeats=2, random_state=7)\n",
    "oof_stack4 = np.zeros(train_stack4.shape[0])\n",
    "predictions_lr4 = np.zeros(test_stack4.shape[0])\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds_stack.split(train_stack4,target)):\n",
    "    print(\"fold {}\".format(fold_))\n",
    "    trn_data, trn_y = train_stack4[trn_idx], target.iloc[trn_idx].values\n",
    "    val_data, val_y = train_stack4[val_idx], target.iloc[val_idx].values\n",
    "    #LinearRegression\n",
    "    lr4 = lr()\n",
    "    lr4.fit(trn_data, trn_y)\n",
    "    \n",
    "    oof_stack4[val_idx] = lr4.predict(val_data)\n",
    "    predictions_lr4 += lr4.predict(test_stack4) / 10  # 10 = 5 folds x 2 repeats\n",
    "    \n",
    "mean_squared_error(target.values, oof_stack4) \n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Model fusion\n",
    "\n",
    "Here we take a weighted sum of the predictions of the four stacked models above to obtain a final result; of course, hand-picked weights like these are rather crude."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 36,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "0.4527515432292745"
      ]
     },
     "execution_count": 36,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# for comparison with the stacked result below\n",
    "mean_squared_error(target.values, 0.7*(0.6*oof_stack2 + 0.4*oof_stack3)+0.3*(0.55*oof_stack1+0.45*oof_stack4))"
   ]
  },
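  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Rather than hand-picking the blend weights, they can be fitted directly on the out-of-fold predictions. A sketch using scipy's `minimize`, shown on synthetic stand-ins for the four OOF vectors (replace them with `oof_stack1`..`oof_stack4` and the real `target` to fit weights on the actual data):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from scipy.optimize import minimize\n",
    "from sklearn.metrics import mean_squared_error\n",
    "\n",
    "# synthetic stand-ins: four noisy 'model predictions' of a common target\n",
    "rng = np.random.RandomState(7)\n",
    "t = rng.uniform(1, 5, size=500)\n",
    "oofs = np.vstack([t + s * rng.randn(500) for s in (0.6, 0.7, 0.8, 0.9)])\n",
    "\n",
    "def blend_mse(w):\n",
    "    return mean_squared_error(t, w @ oofs)\n",
    "\n",
    "# start from equal weights; constrain the weights to sum to 1\n",
    "res = minimize(blend_mse, x0=np.ones(4) / 4,\n",
    "               constraints={'type': 'eq', 'fun': lambda w: w.sum() - 1})\n",
    "print(\"fitted weights:\", res.x)\n",
    "print(\"blend MSE: {:.4f}  equal-weight MSE: {:.4f}\".format(\n",
    "    blend_mse(res.x), blend_mse(np.ones(4) / 4)))"
   ]
  },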
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A better approach is to run one more round of stacking over the four stacked models above; here we simply use LinearRegression for this final level."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 37,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "fold 0\n",
      "fold 1\n",
      "fold 2\n",
      "fold 3\n",
      "fold 4\n",
      "fold 5\n",
      "fold 6\n",
      "fold 7\n",
      "fold 8\n",
      "fold 9\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "0.4480223491250565"
      ]
     },
     "execution_count": 37,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "train_stack5 = np.vstack([oof_stack1,oof_stack2,oof_stack3,oof_stack4]).transpose()\n",
    "test_stack5 = np.vstack([predictions_lr1, predictions_lr2,predictions_lr3,predictions_lr4]).transpose()\n",
    "\n",
    "folds_stack = RepeatedKFold(n_splits=5, n_repeats=2, random_state=7)\n",
    "oof_stack5 = np.zeros(train_stack5.shape[0])\n",
    "predictions_lr5= np.zeros(test_stack5.shape[0])\n",
    "\n",
    "for fold_, (trn_idx, val_idx) in enumerate(folds_stack.split(train_stack5,target)):\n",
    "    print(\"fold {}\".format(fold_))\n",
    "    trn_data, trn_y = train_stack5[trn_idx], target.iloc[trn_idx].values\n",
    "    val_data, val_y = train_stack5[val_idx], target.iloc[val_idx].values\n",
    "    #LinearRegression\n",
    "    lr5 = lr()\n",
    "    lr5.fit(trn_data, trn_y)\n",
    "    \n",
    "    oof_stack5[val_idx] = lr5.predict(val_data)\n",
    "    predictions_lr5 += lr5.predict(test_stack5) / 10\n",
    "    \n",
    "mean_squared_error(target.values, oof_stack5) \n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Saving the results\n",
    "\n",
    "Load the submission template and fill in the predictions."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 38,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "count    2968.000000\n",
       "mean        3.879322\n",
       "std         0.462290\n",
       "min         1.636433\n",
       "25%         3.667859\n",
       "50%         3.954825\n",
       "75%         4.185277\n",
       "max         5.051027\n",
       "Name: happiness, dtype: float64"
      ]
     },
     "execution_count": 38,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "submit_example = pd.read_csv('submit_example.csv',sep=',',encoding='latin-1')\n",
    "\n",
    "submit_example['happiness'] = predictions_lr5\n",
    "\n",
    "submit_example.happiness.describe()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Finally we save the results. The predicted values are continuous between 1 and 5, while the ground truth is integer-valued, so as a further refinement we snap predictions that fall close to an integer onto that integer before writing the CSV file."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 39,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "count    2968.000000\n",
       "mean        3.879330\n",
       "std         0.462127\n",
       "min         1.636433\n",
       "25%         3.667859\n",
       "50%         3.954825\n",
       "75%         4.185277\n",
       "max         5.000000\n",
       "Name: happiness, dtype: float64"
      ]
     },
     "execution_count": 39,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "submit_example.loc[submit_example['happiness']>4.96,'happiness']= 5\n",
    "submit_example.loc[submit_example['happiness']<=1.04,'happiness']= 1\n",
    "submit_example.loc[(submit_example['happiness']>1.96)&(submit_example['happiness']<2.04),'happiness']= 2\n",
    "\n",
    "submit_example.to_csv(\"submision.csv\",index=False)\n",
    "submit_example.happiness.describe()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The model parameters can be tuned further, for example with a grid search; that is left as an exercise for the reader."
   ]
  }
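  ,
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a starting point for such tuning, here is one possible GridSearchCV sketch over the KernelRidge meta-learner, shown on synthetic data (the grid values are illustrative, not tuned; swap in `train_stack3` and `target` to search on the actual stacked features):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.model_selection import GridSearchCV\n",
    "from sklearn.kernel_ridge import KernelRidge\n",
    "\n",
    "rng = np.random.RandomState(0)\n",
    "X = rng.randn(100, 3)\n",
    "y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.randn(100)\n",
    "\n",
    "param_grid = {'alpha': [0.1, 1.0, 10.0], 'kernel': ['linear', 'rbf']}\n",
    "search = GridSearchCV(KernelRidge(), param_grid,\n",
    "                      scoring='neg_mean_squared_error', cv=5)\n",
    "search.fit(X, y)\n",
    "print(search.best_params_, -search.best_score_)"
   ]
  }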
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.4"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
