{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Train a Deep NN to predict Asset Price movements"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In practice, we need to explore variations of the design options outlined above because we can rarely be sure from the outset which network architecture best suits the data.\n",
    "\n",
     "The GridSearchCV class provided by scikit-learn, which we encountered in Chapter 6, The Machine Learning Workflow, conveniently automates this process. Just be mindful of the risk of false discoveries: keep track of how many experiments you are running so you can adjust the results accordingly.\n",
    "\n",
     "In this section, we will explore various options for building a simple feedforward neural network to predict asset price moves over a one-month horizon."
   ]
  },
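  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a rough illustration of such an adjustment (a sketch, not part of the original workflow), a Bonferroni-style correction shows how the significance threshold shrinks with the number of experiments; the count below matches the 7 x 2 x 3 parameter grid we search later in this notebook:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative sketch: tighten the significance level for the number of experiments run\n",
    "n_experiments = 7 * 2 * 3  # layer configs x activations x dropout rates\n",
    "alpha = 0.05               # unadjusted significance level\n",
    "print(f'Bonferroni-adjusted alpha: {alpha / n_experiments:.5f}')"
   ]
  },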
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "## Set up Docker for GPU acceleration"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "`docker run -it -p 8889:8888 -v /path/to/machine-learning-for-trading/16_convolutions_neural_nets/cnn:/cnn --name tensorflow tensorflow/tensorflow:latest-gpu-py3 bash`"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Imports & Settings"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "import warnings\n",
    "warnings.filterwarnings('ignore')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "from pathlib import Path\n",
    "from importlib import reload\n",
    "from joblib import dump, load\n",
    "\n",
    "import numpy as np\n",
    "import pandas as pd\n",
    "import matplotlib.pyplot as plt\n",
    "from matplotlib.gridspec import GridSpec\n",
    "import seaborn as sns\n",
    "\n",
    "from sklearn.model_selection import train_test_split, GridSearchCV, StratifiedKFold\n",
    "from sklearn.metrics import roc_auc_score\n",
    "\n",
    "import tensorflow as tf\n",
    "from keras.models import Sequential\n",
    "from keras import backend as K\n",
    "from keras.wrappers.scikit_learn import KerasClassifier\n",
    "from keras.layers import Dense, Dropout, Activation\n",
    "from keras.models import load_model\n",
    "from keras.callbacks import Callback, EarlyStopping, TensorBoard, ModelCheckpoint"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [],
   "source": [
    "np.random.seed(42)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Create a stock return series to predict asset price moves"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We will use the last 24 monthly returns and dummy variables for the month and the year to predict whether the price will go up or down the following month. We use the daily Quandl stock price dataset (see GitHub for instructions on how to source the data)."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'pandas.core.frame.DataFrame'>\n",
      "DatetimeIndex: 2896 entries, 2007-01-01 to 2018-03-27\n",
      "Columns: 3199 entries, A to ZUMZ\n",
      "dtypes: float64(3199)\n",
      "memory usage: 70.7 MB\n"
     ]
    }
   ],
   "source": [
    "prices = (pd.read_hdf('../data/assets.h5', 'quandl/wiki/prices')\n",
    "          .adj_close\n",
    "          .unstack().loc['2007':])\n",
    "prices.info()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We will work with monthly returns to keep the size of the dataset manageable and remove some of the noise contained in daily returns, which leaves us with almost 2,500 stocks with 120 monthly returns each:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'pandas.core.frame.DataFrame'>\n",
      "DatetimeIndex: 120 entries, 2017-12-31 to 2008-01-31\n",
      "Freq: -1M\n",
      "Columns: 2489 entries, A to ZUMZ\n",
      "dtypes: float64(2489)\n",
      "memory usage: 2.3 MB\n"
     ]
    }
   ],
   "source": [
    "returns = (prices\n",
    "           .resample('M')\n",
    "           .last()\n",
    "           .pct_change()\n",
    "           .loc['2008': '2017']\n",
    "           .dropna(axis=1)\n",
    "           .sort_index(ascending=False))\n",
    "returns.info()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th>ticker</th>\n",
       "      <th>A</th>\n",
       "      <th>AAL</th>\n",
       "      <th>AAN</th>\n",
       "      <th>AAON</th>\n",
       "      <th>AAP</th>\n",
       "      <th>AAPL</th>\n",
       "      <th>AAWW</th>\n",
       "      <th>ABAX</th>\n",
       "      <th>ABC</th>\n",
       "      <th>ABCB</th>\n",
       "      <th>...</th>\n",
       "      <th>ZEUS</th>\n",
       "      <th>ZIGO</th>\n",
       "      <th>ZINC</th>\n",
       "      <th>ZION</th>\n",
       "      <th>ZIOP</th>\n",
       "      <th>ZIXI</th>\n",
       "      <th>ZLC</th>\n",
       "      <th>ZMH</th>\n",
       "      <th>ZQK</th>\n",
       "      <th>ZUMZ</th>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>date</th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "      <th></th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>2017-12-31</th>\n",
       "      <td>-0.032785</td>\n",
       "      <td>0.030501</td>\n",
       "      <td>0.056469</td>\n",
       "      <td>0.006859</td>\n",
       "      <td>-0.012970</td>\n",
       "      <td>-0.015246</td>\n",
       "      <td>0.015584</td>\n",
       "      <td>0.016003</td>\n",
       "      <td>0.082528</td>\n",
       "      <td>-0.028226</td>\n",
       "      <td>...</td>\n",
       "      <td>0.078815</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.025832</td>\n",
       "      <td>-0.094092</td>\n",
       "      <td>-0.004545</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>-0.044725</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2017-11-30</th>\n",
       "      <td>0.017786</td>\n",
       "      <td>0.078385</td>\n",
       "      <td>0.025000</td>\n",
       "      <td>0.041429</td>\n",
       "      <td>0.235625</td>\n",
       "      <td>0.016623</td>\n",
       "      <td>-0.058680</td>\n",
       "      <td>0.007025</td>\n",
       "      <td>0.107587</td>\n",
       "      <td>0.035491</td>\n",
       "      <td>...</td>\n",
       "      <td>0.055085</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.066509</td>\n",
       "      <td>-0.019313</td>\n",
       "      <td>-0.092784</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.235127</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2017-10-31</th>\n",
       "      <td>0.061814</td>\n",
       "      <td>-0.014108</td>\n",
       "      <td>-0.156544</td>\n",
       "      <td>0.015228</td>\n",
       "      <td>-0.176008</td>\n",
       "      <td>0.096808</td>\n",
       "      <td>-0.067629</td>\n",
       "      <td>0.083987</td>\n",
       "      <td>-0.070091</td>\n",
       "      <td>-0.001043</td>\n",
       "      <td>...</td>\n",
       "      <td>-0.141818</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>-0.015261</td>\n",
       "      <td>-0.241042</td>\n",
       "      <td>-0.008180</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>-0.024862</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2017-09-30</th>\n",
       "      <td>-0.008035</td>\n",
       "      <td>0.061466</td>\n",
       "      <td>-0.013832</td>\n",
       "      <td>0.057515</td>\n",
       "      <td>0.013928</td>\n",
       "      <td>-0.060244</td>\n",
       "      <td>-0.014970</td>\n",
       "      <td>-0.033968</td>\n",
       "      <td>0.031153</td>\n",
       "      <td>0.090808</td>\n",
       "      <td>...</td>\n",
       "      <td>0.205479</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.080623</td>\n",
       "      <td>-0.039124</td>\n",
       "      <td>-0.079096</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.453815</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2017-08-31</th>\n",
       "      <td>0.082455</td>\n",
       "      <td>-0.111179</td>\n",
       "      <td>-0.043431</td>\n",
       "      <td>-0.035503</td>\n",
       "      <td>-0.125971</td>\n",
       "      <td>0.106251</td>\n",
       "      <td>0.124579</td>\n",
       "      <td>-0.013579</td>\n",
       "      <td>-0.140733</td>\n",
       "      <td>-0.038210</td>\n",
       "      <td>...</td>\n",
       "      <td>0.069057</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>-0.034067</td>\n",
       "      <td>0.155515</td>\n",
       "      <td>-0.003752</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>-0.019685</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2008-05-31</th>\n",
       "      <td>0.237670</td>\n",
       "      <td>-0.538999</td>\n",
       "      <td>-0.122768</td>\n",
       "      <td>0.162611</td>\n",
       "      <td>0.162053</td>\n",
       "      <td>0.085082</td>\n",
       "      <td>0.020105</td>\n",
       "      <td>0.153454</td>\n",
       "      <td>0.021099</td>\n",
       "      <td>-0.073431</td>\n",
       "      <td>...</td>\n",
       "      <td>0.269937</td>\n",
       "      <td>0.026587</td>\n",
       "      <td>0.002140</td>\n",
       "      <td>-0.062060</td>\n",
       "      <td>-0.163399</td>\n",
       "      <td>-0.321053</td>\n",
       "      <td>0.051158</td>\n",
       "      <td>-0.018339</td>\n",
       "      <td>-0.122302</td>\n",
       "      <td>0.000477</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2008-04-30</th>\n",
       "      <td>0.012739</td>\n",
       "      <td>-0.035915</td>\n",
       "      <td>0.178947</td>\n",
       "      <td>-0.097354</td>\n",
       "      <td>0.018502</td>\n",
       "      <td>0.212195</td>\n",
       "      <td>0.103273</td>\n",
       "      <td>0.099698</td>\n",
       "      <td>-0.010493</td>\n",
       "      <td>-0.067248</td>\n",
       "      <td>...</td>\n",
       "      <td>0.135255</td>\n",
       "      <td>-0.062701</td>\n",
       "      <td>0.210708</td>\n",
       "      <td>0.017563</td>\n",
       "      <td>0.040816</td>\n",
       "      <td>-0.018088</td>\n",
       "      <td>0.048583</td>\n",
       "      <td>-0.047521</td>\n",
       "      <td>-0.008155</td>\n",
       "      <td>0.335245</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2008-03-31</th>\n",
       "      <td>-0.025482</td>\n",
       "      <td>-0.281452</td>\n",
       "      <td>0.041991</td>\n",
       "      <td>0.213204</td>\n",
       "      <td>0.017068</td>\n",
       "      <td>0.147816</td>\n",
       "      <td>0.086957</td>\n",
       "      <td>-0.204873</td>\n",
       "      <td>-0.017737</td>\n",
       "      <td>0.139290</td>\n",
       "      <td>...</td>\n",
       "      <td>0.092010</td>\n",
       "      <td>-0.023548</td>\n",
       "      <td>-0.262420</td>\n",
       "      <td>-0.046073</td>\n",
       "      <td>-0.048544</td>\n",
       "      <td>-0.012755</td>\n",
       "      <td>0.022774</td>\n",
       "      <td>0.034135</td>\n",
       "      <td>0.090000</td>\n",
       "      <td>-0.107509</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2008-02-29</th>\n",
       "      <td>-0.095983</td>\n",
       "      <td>-0.104046</td>\n",
       "      <td>0.067251</td>\n",
       "      <td>-0.072472</td>\n",
       "      <td>-0.062605</td>\n",
       "      <td>-0.076389</td>\n",
       "      <td>0.013216</td>\n",
       "      <td>-0.104762</td>\n",
       "      <td>-0.102822</td>\n",
       "      <td>-0.098859</td>\n",
       "      <td>...</td>\n",
       "      <td>0.223413</td>\n",
       "      <td>0.086104</td>\n",
       "      <td>0.047365</td>\n",
       "      <td>-0.120684</td>\n",
       "      <td>-0.063636</td>\n",
       "      <td>0.101124</td>\n",
       "      <td>0.180929</td>\n",
       "      <td>-0.036473</td>\n",
       "      <td>-0.055614</td>\n",
       "      <td>-0.085803</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2008-01-31</th>\n",
       "      <td>-0.078389</td>\n",
       "      <td>-0.059143</td>\n",
       "      <td>-0.009270</td>\n",
       "      <td>-0.101917</td>\n",
       "      <td>-0.058173</td>\n",
       "      <td>-0.316640</td>\n",
       "      <td>-0.078938</td>\n",
       "      <td>-0.092303</td>\n",
       "      <td>0.038110</td>\n",
       "      <td>-0.063501</td>\n",
       "      <td>...</td>\n",
       "      <td>0.065594</td>\n",
       "      <td>-0.058587</td>\n",
       "      <td>-0.116676</td>\n",
       "      <td>0.172414</td>\n",
       "      <td>-0.067797</td>\n",
       "      <td>-0.226087</td>\n",
       "      <td>0.018680</td>\n",
       "      <td>0.181255</td>\n",
       "      <td>0.110723</td>\n",
       "      <td>-0.210591</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>10 rows × 2489 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "ticker             A       AAL       AAN      AAON       AAP      AAPL  \\\n",
       "date                                                                     \n",
       "2017-12-31 -0.032785  0.030501  0.056469  0.006859 -0.012970 -0.015246   \n",
       "2017-11-30  0.017786  0.078385  0.025000  0.041429  0.235625  0.016623   \n",
       "2017-10-31  0.061814 -0.014108 -0.156544  0.015228 -0.176008  0.096808   \n",
       "2017-09-30 -0.008035  0.061466 -0.013832  0.057515  0.013928 -0.060244   \n",
       "2017-08-31  0.082455 -0.111179 -0.043431 -0.035503 -0.125971  0.106251   \n",
       "2008-05-31  0.237670 -0.538999 -0.122768  0.162611  0.162053  0.085082   \n",
       "2008-04-30  0.012739 -0.035915  0.178947 -0.097354  0.018502  0.212195   \n",
       "2008-03-31 -0.025482 -0.281452  0.041991  0.213204  0.017068  0.147816   \n",
       "2008-02-29 -0.095983 -0.104046  0.067251 -0.072472 -0.062605 -0.076389   \n",
       "2008-01-31 -0.078389 -0.059143 -0.009270 -0.101917 -0.058173 -0.316640   \n",
       "\n",
       "ticker          AAWW      ABAX       ABC      ABCB  ...      ZEUS      ZIGO  \\\n",
       "date                                                ...                       \n",
       "2017-12-31  0.015584  0.016003  0.082528 -0.028226  ...  0.078815  0.000000   \n",
       "2017-11-30 -0.058680  0.007025  0.107587  0.035491  ...  0.055085  0.000000   \n",
       "2017-10-31 -0.067629  0.083987 -0.070091 -0.001043  ... -0.141818  0.000000   \n",
       "2017-09-30 -0.014970 -0.033968  0.031153  0.090808  ...  0.205479  0.000000   \n",
       "2017-08-31  0.124579 -0.013579 -0.140733 -0.038210  ...  0.069057  0.000000   \n",
       "2008-05-31  0.020105  0.153454  0.021099 -0.073431  ...  0.269937  0.026587   \n",
       "2008-04-30  0.103273  0.099698 -0.010493 -0.067248  ...  0.135255 -0.062701   \n",
       "2008-03-31  0.086957 -0.204873 -0.017737  0.139290  ...  0.092010 -0.023548   \n",
       "2008-02-29  0.013216 -0.104762 -0.102822 -0.098859  ...  0.223413  0.086104   \n",
       "2008-01-31 -0.078938 -0.092303  0.038110 -0.063501  ...  0.065594 -0.058587   \n",
       "\n",
       "ticker          ZINC      ZION      ZIOP      ZIXI       ZLC       ZMH  \\\n",
       "date                                                                     \n",
       "2017-12-31  0.000000  0.025832 -0.094092 -0.004545  0.000000  0.000000   \n",
       "2017-11-30  0.000000  0.066509 -0.019313 -0.092784  0.000000  0.000000   \n",
       "2017-10-31  0.000000 -0.015261 -0.241042 -0.008180  0.000000  0.000000   \n",
       "2017-09-30  0.000000  0.080623 -0.039124 -0.079096  0.000000  0.000000   \n",
       "2017-08-31  0.000000 -0.034067  0.155515 -0.003752  0.000000  0.000000   \n",
       "2008-05-31  0.002140 -0.062060 -0.163399 -0.321053  0.051158 -0.018339   \n",
       "2008-04-30  0.210708  0.017563  0.040816 -0.018088  0.048583 -0.047521   \n",
       "2008-03-31 -0.262420 -0.046073 -0.048544 -0.012755  0.022774  0.034135   \n",
       "2008-02-29  0.047365 -0.120684 -0.063636  0.101124  0.180929 -0.036473   \n",
       "2008-01-31 -0.116676  0.172414 -0.067797 -0.226087  0.018680  0.181255   \n",
       "\n",
       "ticker           ZQK      ZUMZ  \n",
       "date                            \n",
       "2017-12-31  0.000000 -0.044725  \n",
       "2017-11-30  0.000000  0.235127  \n",
       "2017-10-31  0.000000 -0.024862  \n",
       "2017-09-30  0.000000  0.453815  \n",
       "2017-08-31  0.000000 -0.019685  \n",
       "2008-05-31 -0.122302  0.000477  \n",
       "2008-04-30 -0.008155  0.335245  \n",
       "2008-03-31  0.090000 -0.107509  \n",
       "2008-02-29 -0.055614 -0.085803  \n",
       "2008-01-31  0.110723 -0.210591  \n",
       "\n",
       "[10 rows x 2489 columns]"
      ]
     },
     "execution_count": 5,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "returns.head().append(returns.tail())"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [],
   "source": [
    "n = len(returns)\n",
    "T = 24\n",
    "tcols = list(range(25))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'pandas.core.frame.DataFrame'>\n",
      "DatetimeIndex: 236455 entries, 2010-02-01 to 2017-12-01\n",
      "Data columns (total 45 columns):\n",
      "1            236455 non-null float64\n",
      "2            236455 non-null float64\n",
      "3            236455 non-null float64\n",
      "4            236455 non-null float64\n",
      "5            236455 non-null float64\n",
      "6            236455 non-null float64\n",
      "7            236455 non-null float64\n",
      "8            236455 non-null float64\n",
      "9            236455 non-null float64\n",
      "10           236455 non-null float64\n",
      "11           236455 non-null float64\n",
      "12           236455 non-null float64\n",
      "13           236455 non-null float64\n",
      "14           236455 non-null float64\n",
      "15           236455 non-null float64\n",
      "16           236455 non-null float64\n",
      "17           236455 non-null float64\n",
      "18           236455 non-null float64\n",
      "19           236455 non-null float64\n",
      "20           236455 non-null float64\n",
      "21           236455 non-null float64\n",
      "22           236455 non-null float64\n",
      "23           236455 non-null float64\n",
      "24           236455 non-null float64\n",
      "label        236455 non-null int64\n",
      "year_2010    236455 non-null uint8\n",
      "year_2011    236455 non-null uint8\n",
      "year_2012    236455 non-null uint8\n",
      "year_2013    236455 non-null uint8\n",
      "year_2014    236455 non-null uint8\n",
      "year_2015    236455 non-null uint8\n",
      "year_2016    236455 non-null uint8\n",
      "year_2017    236455 non-null uint8\n",
      "month_1      236455 non-null uint8\n",
      "month_2      236455 non-null uint8\n",
      "month_3      236455 non-null uint8\n",
      "month_4      236455 non-null uint8\n",
      "month_5      236455 non-null uint8\n",
      "month_6      236455 non-null uint8\n",
      "month_7      236455 non-null uint8\n",
      "month_8      236455 non-null uint8\n",
      "month_9      236455 non-null uint8\n",
      "month_10     236455 non-null uint8\n",
      "month_11     236455 non-null uint8\n",
      "month_12     236455 non-null uint8\n",
      "dtypes: float64(24), int64(1), uint8(20)\n",
      "memory usage: 51.4 MB\n"
     ]
    }
   ],
   "source": [
    "data = pd.DataFrame()\n",
    "for i in range(n-T-1):\n",
    "    df = returns.iloc[i:i+T+1]\n",
    "    data = pd.concat([data, (df.reset_index(drop=True).T\n",
    "                             .assign(year=df.index[0].year,\n",
    "                                     month=df.index[0].month))],\n",
    "                     ignore_index=True)\n",
    "data[tcols] = (data[tcols].apply(lambda x: x.clip(lower=x.quantile(.01),\n",
    "                                                  upper=x.quantile(.99))))\n",
    "data['label'] = (data[0] > 0).astype(int)\n",
    "data['date'] = pd.to_datetime(data.assign(day=1)[['year', 'month', 'day']])\n",
    "data = pd.get_dummies((data.drop(0, axis=1)\n",
    "                       .set_index('date')\n",
    "                       .apply(pd.to_numeric)), \n",
    "                      columns=['year', 'month']).sort_index()\n",
    "data.info()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "/home/stefan/.pyenv/versions/miniconda3-latest/envs/ml4t/lib/python3.6/site-packages/pandas/io/pytables.py:274: PerformanceWarning: \n",
      "your performance may suffer as PyTables will pickle object types that it cannot\n",
      "map directly to c-types [inferred_type->mixed-integer,key->axis0] [items->None]\n",
      "\n",
      "  f(store)\n"
     ]
    }
   ],
   "source": [
    "data.to_hdf('data.h5', 'returns')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(236455, 45)"
      ]
     },
     "execution_count": 9,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "data.shape"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [],
   "source": [
    "class OneStepTimeSeriesSplit:\n",
    "    \"\"\"Generates tuples of train_idx, test_idx pairs\n",
    "    Assumes the index contains a level labeled 'date'\"\"\"\n",
    "\n",
    "    def __init__(self, n_splits=3, test_period_length=1, shuffle=False):\n",
    "        self.n_splits = n_splits\n",
    "        self.test_period_length = test_period_length\n",
    "        self.shuffle = shuffle\n",
    "        self.test_end = n_splits * test_period_length\n",
    "\n",
    "    @staticmethod\n",
    "    def chunks(l, chunk_size):\n",
    "        for i in range(0, len(l), chunk_size):\n",
    "            yield l[i:i + chunk_size]\n",
    "\n",
    "    def split(self, X, y=None, groups=None):\n",
    "        unique_dates = (X.index\n",
    "                            .get_level_values('date')\n",
    "                            .unique()\n",
    "                            .sort_values(ascending=False)[:self.test_end])\n",
    "\n",
    "        dates = X.reset_index()[['date']]\n",
    "        for test_date in self.chunks(unique_dates, self.test_period_length):\n",
    "            train_idx = dates[dates.date < min(test_date)].index\n",
    "            test_idx = dates[dates.date.isin(test_date)].index\n",
    "            if self.shuffle:\n",
    "                np.random.shuffle(list(train_idx))\n",
    "            yield train_idx, test_idx\n",
    "\n",
    "    def get_n_splits(self, X, y, groups=None):\n",
    "        return self.n_splits"
   ]
  },
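  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To see how the splitter behaves (a minimal sketch, not part of the original notebook), we can apply it to a toy frame with a 'date' index level: each successive fold tests on one later month and trains on all earlier rows:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Demonstrate OneStepTimeSeriesSplit (defined above) on toy data:\n",
    "# 6 monthly dates x 2 tickers = 12 rows; 3 folds of one month each\n",
    "toy_dates = pd.date_range('2017-07-31', periods=6, freq='M')\n",
    "toy_idx = pd.MultiIndex.from_product([toy_dates, ['A', 'B']],\n",
    "                                     names=['date', 'ticker'])\n",
    "toy = pd.DataFrame({'feature': range(12)}, index=toy_idx)\n",
    "for train_idx, test_idx in OneStepTimeSeriesSplit(n_splits=3).split(toy):\n",
    "    test_month = toy.iloc[test_idx].index.get_level_values('date')[0]\n",
    "    print(f'train rows: {len(train_idx):>2} | test month: {test_month:%Y-%m}')"
   ]
  },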
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Define Network Architecture"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Custom AUC Loss Metric"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "For binary classification, AUC is an excellent metric because it assesses performance irrespective of the threshold chosen to convert probabilities into positive predictions. Unfortunately, Keras does not provide it ‘out-of-the-box’ because it focuses on metrics that can be computed on batches of samples during training to guide gradient-descent optimization. However, we can define a custom metric for use with the early stopping callback as follows (included in the compile step):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [],
   "source": [
    "def auc_roc(y_true, y_pred):\n",
    "    # any tensorflow metric\n",
    "    value, update_op = tf.metrics.auc(y_true, y_pred)\n",
    "\n",
    "    # find all variables created for this metric\n",
    "    metric_vars = [i for i in tf.local_variables() if 'auc_roc' in i.name.split('/')[1]]\n",
    "\n",
    "    # Add metric variables to GLOBAL_VARIABLES collection.\n",
    "    # They will be initialized for new session.\n",
    "    for v in metric_vars:\n",
    "        tf.add_to_collection(tf.GraphKeys.GLOBAL_VARIABLES, v)\n",
    "\n",
    "    # force to update metric values\n",
    "    with tf.control_dependencies([update_op]):\n",
    "        value = tf.identity(value)\n",
    "        return value"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Set up `build_fn` for `keras.wrappers.scikit_learn.KerasClassifier`"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "Keras contains a wrapper that we can use with the sklearn `GridSearchCV` class. It requires a `build_fn` that constructs and compiles the model based on arguments that can later be passed during the `GridSearchCV` iterations.\n",
     "\n",
     "The following `make_model` function illustrates how to flexibly define various architectural elements for the search process. The `dense_layers` argument defines both the depth and width of the network as a list of integers. We also use dropout for regularization, expressed as a float in the range [0, 1] that defines the probability that a given unit will be excluded from a training iteration."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 78,
   "metadata": {},
   "outputs": [],
   "source": [
    "def make_model(dense_layers, activation, dropout):\n",
    "    '''Creates a multi-layer perceptron model\n",
    "    \n",
     "    dense_layers: List of layer sizes; one number per layer\n",
     "    Note: relies on the globally defined `input_dim` to size the input layer\n",
     "    '''\n",
    "\n",
    "    model = Sequential()\n",
    "    for i, layer_size in enumerate(dense_layers, 1):\n",
    "        if i == 1:\n",
    "            model.add(Dense(layer_size, input_dim=input_dim))\n",
    "            model.add(Activation(activation))\n",
    "        else:\n",
    "            model.add(Dense(layer_size))\n",
    "            model.add(Activation(activation))\n",
    "    model.add(Dropout(dropout))\n",
    "    model.add(Dense(1))\n",
    "    model.add(Activation('sigmoid'))\n",
    "\n",
    "    model.compile(loss='binary_crossentropy',\n",
    "                  optimizer='Adam',\n",
    "                  metrics=['binary_accuracy', auc_roc])\n",
    "\n",
    "    return model"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Run Keras with `GridSearchCV`"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Train-Test Split"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "We split the data into a training set for cross-validation and keep the last 12 months of data as a holdout test set:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [],
   "source": [
    "data = pd.read_hdf('data.h5', 'returns')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [],
   "source": [
    "X_train = data[:'2016'].drop('label', axis=1)\n",
    "y_train = data[:'2016'].label"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [],
   "source": [
    "X_test = data['2017':].drop('label', axis=1)\n",
    "y_test = data['2017':].label"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Define GridSearch inputs"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "Now we just need to define our Keras classifier using the `make_model` function, set up cross-validation (see Chapter 6, The Machine Learning Workflow, for background and the `OneStepTimeSeriesSplit` defined above), and specify the parameters that we would like to explore. \n",
     "\n",
     "We pick several one-, two-, and three-layer configurations, `relu` and `tanh` activation functions, and different dropout rates. We could also try different optimizers, but we did not run this experiment to limit what is already a computationally intensive effort:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "input_dim = X_train.shape[1]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 62,
   "metadata": {},
   "outputs": [],
   "source": [
    "clf = KerasClassifier(make_model, epochs=10, batch_size=32)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [],
   "source": [
    "n_splits = 12"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [],
   "source": [
    "cv = OneStepTimeSeriesSplit(n_splits=n_splits)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 60,
   "metadata": {},
   "outputs": [],
   "source": [
    "param_grid = {'dense_layers': [[32], [32, 32], [64], [64, 64], [64, 64, 32], [64, 32], [128]],\n",
    "              'activation'  : ['relu', 'tanh'],\n",
    "              'dropout'     : [.25, .5, .75],\n",
    "              }"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To trigger the parameter search, we instantiate a GridSearchCV object, define the fit_params that will be passed to the Keras model’s fit method, and provide the training data to the GridSearchCV fit method:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 64,
   "metadata": {},
   "outputs": [],
   "source": [
    "gs = GridSearchCV(estimator=clf,\n",
    "                  param_grid=param_grid,\n",
    "                  scoring='roc_auc',\n",
    "                  cv=cv,\n",
    "                  refit=True,\n",
    "                  return_train_score=True,\n",
    "                  n_jobs=-1,\n",
    "                  verbose=1,\n",
    "                  iid=False,\n",
    "                  error_score=np.nan)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "fit_params = dict(callbacks=[EarlyStopping(monitor='auc_roc', \n",
    "                                           patience=300, \n",
    "                                           verbose=1, mode='max')],\n",
    "                  verbose=2,\n",
    "                  epochs=50)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "gs.fit(X=X_train.astype(float), y=y_train, **fit_params)\n",
    "print('\\nBest Score: {:.2%}'.format(gs.best_score_))\n",
    "print('Best Params:\\n', pd.Series(gs.best_params_))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Persist best model and training data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "gs.best_estimator_.model.save('best_model.h5')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The saved HDF5 file can later be restored with Keras' load_model, so the search does not need to be re-run to reuse the best network."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "pd.DataFrame(gs.cv_results_).to_csv('cv_results.csv', index=False)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "y_pred = gs.best_estimator_.model.predict(test_data.drop('label', axis=1))\n",
    "roc_auc_score(y_true=test_data.label, y_score=y_pred)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [],
   "source": [
    "with pd.HDFStore('data.h5') as store:\n",
    "    store.put('X_train', X_train)\n",
    "    store.put('X_test', X_test)\n",
    "    store.put('y_train', y_train)\n",
    "    store.put('y_test', y_test)"
   ]
  },
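  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The persisted features and labels can be reloaded in a later session under the same keys (a minimal sketch, assuming `data.h5` sits in the working directory):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "with pd.HDFStore('data.h5') as store:\n",
    "    X_train = store['X_train']\n",
    "    X_test = store['X_test']\n",
    "    y_train = store['y_train']\n",
    "    y_test = store['y_test']"
   ]
  },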
  {
   "cell_type": "code",
   "execution_count": 94,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'pandas.core.frame.DataFrame'>\n",
      "RangeIndex: 504 entries, 0 to 503\n",
      "Data columns (total 5 columns):\n",
      "activation      504 non-null object\n",
      "dense_layers    504 non-null object\n",
      "dropout         504 non-null float64\n",
      "split           504 non-null object\n",
      "score           504 non-null float64\n",
      "dtypes: float64(2), object(3)\n",
      "memory usage: 19.8+ KB\n"
     ]
    }
   ],
   "source": [
    "cv_results = pd.read_csv('gridsearch/cv_results.csv')\n",
    "cv_results = (cv_results.filter(like='param_')\n",
    "              .join(cv_results\n",
    "                    .filter(like='_test_score')\n",
    "                    .filter(like='split'))\n",
    "             .rename(columns = lambda x: x.replace('param_', '')))\n",
    "cv_results =pd.melt(id_vars=['activation', 'dense_layers', 'dropout'], \n",
    "                    frame=cv_results,\n",
    "                   value_name='score',\n",
    "                   var_name='split')\n",
    "cv_results.info()"
   ]
  },
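  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick aggregation summarizes the same information numerically, averaging the out-of-sample score over one hyperparameter at a time (a sketch using the `cv_results` DataFrame built above):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# mean cross-validation score per setting, one hyperparameter at a time\n",
    "for col in ['activation', 'dropout', 'dense_layers']:\n",
    "    print(cv_results.groupby(col).score.mean().sort_values(ascending=False))"
   ]
  },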
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The following chart shows the range of cross-validation results for the various elements of the Neural Network architectures that we tested in our experiment. It shows that the settings that performed best in combination, when evaluated individually, tended to do as good as or better than the alternatives."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 119,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAA/gAAAGyCAYAAABQh2+0AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzs3XmcFNW5//HvMzMiKIiyqAEkkqvEGKOYTHQMiZoFJLnX3esWr5pFEg3qzWKuBKPGSDQxifGKesUtyc+oGDCKCQqo0UTiRDCCRhREcAGUwAz7PjPP74+qxp6mZ6Z7pquru/rzfr3qNVOnT51+uqu7up46p6rM3QUAAAAAAMpbVdwBAAAAAACAriPBBwAAAAAgAUjwAQAAAABIABJ8AAAAAAASgAQfAAAAAIAEIMEHAAAAACABSPABII2ZnW9mnjatN7N5ZjbGzGoK/FxHmdnfzWxj+FzDCtl+JTCzq8P3rqDrppjM7NjwdRT0N7kzny8zuzOs+8tCxpLWvpvZtTnUe9rMnk6bHxa+R32iiCt8jkjWQyFkbJMyp5My6l0dY6gFEa6Hz8UdBwCUo5L7EQOAEvGfko6SdKqk5yXdLOnKAj/HXZJqJB0fPtfCAreP8nCspKtU+N/kvD5fZtZDwedekr4c80GTi8IpZZiC9yiyBF/RrYdC+bWC9Zg5PRNjTFG5ShIJPgB0Qtn2eABAxOa6+6Lw/xlmdoCk/1YXk3wzq5ZkklokfVjSeHd/qkuRBu2apF3cfVtX20L5C3uh8/18nSxpD0nTJH1J0ihJf8zhuXZ1962djTUbd59fyPbiUuDv5TJ3ry9AOwCABCvVo9QAUGpmS+plZnunCszsgnD4/hYzW2Vmd2UOIQ6HzI43s8vNbImkbZIultSsYBv8w7DOm2nLnJPR7v8zsw9ktPummd1rZl81s9fCdv/dzPYP2/ummV1nZu+Fpxnca2a7mdkBZjbdzDaY2SIzOy+j3QPC51tiZpvNbLGZ3WZme2XU+7WZLTWzw83sr2a2ycxeN7NvZr5xZjYkbPM9M9satnlTRp1jzOzJMNaNYYyH5LWGdn5v/svMFoSv469mdqCZ7W5mt5tZg5mtMLNfpPdUh8O03cxODV/jajNbZ2a/M7O+Gc8zxsyeM7NGM1tjZvVm9u9Z4tndzK43szfC1/+emU0xs33C4dRXhVW3p4Zdd/D69jCzCWa2PGxvgZl9O0wmZWbnq53PVzvOk7Ra0vmSNks6N8tzp06JOCT1OZL0YNrjJ5vZrPDztc7MnjezE7K0c0n4GVtvZs+Y2UczHt8xRD98PfeED71u7w9N3z98vMbMxprZa+H7sTxcr90z2uzUekj7TByb0d756XGEZVm/l+Fju5nZT8PXvS38O84iPiXAzA4zs6nhZ3lzuH4+k1En9X2uNbO/hfUWpD7PZvad8LWtM7NHzKx/xvIdrgN7f9v0DTO7xszeDb83j5rZoLR6qc//uLR1fXX42CfNbKYF399NFmxLbo3qvQOAckQPPgDkZoiCpGmDJJnZ9ZK+K+l/JV0maaCkayUdYmafcvfmtGXPl7RY0vckbZT0ooJh/88qGEZ9p6StYbujJd0uaZKksZIGSPqJpCPN7OPuviGt3c8qGLr8I0n/kvRm2mNjJT2tIGk7WNLPFIwaOFzSHZJ+LulCSfeY2Rx3fyVcboCkpQpGK6yW9CFJP1DQq3tUxnuyh6T7JP1K0jWSviLpNjNb4O5/Dl/PkPC1blKQQL0uaT9JI1ONhEnEI5L+JOmcsPh/JP3VzA5193eUv6Ml/VvYTrcwxikK1sMiSWeGda6Q9IakzCThV5KekHSWpAMVrIMBCt7zlP0VrLs39f5Q+D+a2Zfc/bHwtXWTNFPBerpOUr2k3pKOk7RXuPwgSV+T9GkFn7E2hcngnyR9XMFokpcVJJC/lNRfwbr6U9jWTp+vdtodIOkLkia6+0oz
e1jSKWa2l7uvzrLII2HbP1XwuZKZXazg+/Cwgs/dhjDO/TOWPUfSAkmXKlg3N0h6xMwOcvemLM/1JwXfrSsUnEKwNCx/N/x7r4L3/qeS/ibpI5J+HD7vqWFsBV0PHdjpe2nBQaTpCr6LP1aw3uok/VDBaQffzaFdsyynTbTxnqUW+LikvyrY5lyg4Hv4TUlPhNupF9Kq7yHptwq2DcsljZM0xcxukTRU0rck7aPgu3GLpNPTlu1wHaQZG9b5qqS9Jf1C0u8kHRM+fpSk5xScknB7WLbUzHoqeA+fV7BNXR+2/6m2Xj8AVCR3Z2JiYmIKJwU7jq5geHONgp3/byjY4X84rLN/OH9lxrLDw2VPSitzBTvLPTLq1oSPXZ1WVi1phaQ/Z9T9dFj3krSyNxXsrO+bUXf/sO5TGeUPheXnpJXtJalJ0lXtvB81ac9/eFr5r8Oyz6aV7SpplYIkMVX2WwWJ3oB2nmORpCczyvYI2/pVB+vr6jCOmoz3plFS77SyS8J6d2Ys/4/091vBedgu6fGMel8Oyz/fRhxV4Xs1Q9IjaeVfDZc7IZ/X0E7d/wjrnp9Rnkri+7X1+eqg3f8J6x8Vzh8Xzn+zjVgvzbK+1kt6qIPncQUHeXZJKzstLP9UWtnTkp7O8r08IKO9z4Tl57axvoZ1dT2kfSaOzShPxbR/Dt/L/wrrHp1RPk5BL//eObxvbU39Muqlb1OelPSqpG5pZdVh2cNpZb/OjE/SoWHZAknVaeW/lLQ9VZbHOtg/nH8mo973wvIBGa/j2ox6tWH5obl8ppmYmJgqdWKIPgBk95qCndhGBb27v1OQJEjSCAUJ3e/Coak1Yc/a3yWtU9AznO5xd9+cw3N+WEGP1u/SC939WUlv6f0erpR6d3+vjbYey/J6pKAHLNXuagU9jPulysysm5n9IBxqu1nBe/DXtPjSbfKwpz5sb6uC5G1wWp2Rkv7o7suzBWlmByroac98Lzcp6MXLfC9z9Zy7r02b3+n1p5Xvp509mDH/ewU91TtGMZjZJ8zsj2a2QsGBku0KPhvp79NISe+5+9T8X0JWR4dx3J9Rfq+C3vDMURa5OlfS6+7+XDj/hIIDUzsN0w/9IWP+U5J6SpqYw3PNdPftafMvh38HZ6vcgVEKEuQpGZ+fGeHjqc9PoddDe7J9L0cp+A7/LUucuyjoze/I3ZI+mWVak62yBRdNPEbhZzftOU3B+s38bm1097+kzae+M0946xFJryk4gJQ6bSjXdZDyp4z5XNf/6wpe6+0WnMaU7XsLABWPIfoAkN3JCoYCr5f0lrtvSXssdR7+op2WCvTNmH83a62dpc7fz1b/Pe18BfH22s0cVr2tnfL0c5WvU3CNgGsUDKNdr2Do8kMZ9bK1JQW9yOn1+ur9IdXZpN7Lu8Ip09vtLNuezr7+lBXpM+6+zcxWKzgVQ2Fy8aSk+Qrer7cVJPk/VjA8OaWvpGWdiL8tfSQ1+s4XtXsv7fG8mNknFQwd/6mZ7Zn20EOSxpjZUHfPvAJ/5mcv9Zlvb12nNGbMp15LtvXQkb0VHNjY0MbjfdP+FnI9tCfb93JvSR9UcBAom8xtRtZ23X1OHnH0UdBb/8Nw2omZVbl7Szjb6kBB+JmX2v4updZXrusgpVPr393XmtlnFbyWWxVcE+UVBSOQprS3LABUEhJ8AMjun/7+VfQzNYR/Ryp7ktuQMd/uRdPSpHZ8983y2L6SMnfuc203H2dK+q2777hXeXjua2etUpgUtyH1Xo1V0KuYKa67AuyTPhOew72X3k8SRyk4h/t0d1+aVm+3jHZWSerUxQLb0Cipj5l189ZXZk99ZjI/e7lIXWjxf8Ip07kKzn9Pl/nZWxX+HSjpn52IobMaJG1RMEw8m9TIka6sh9TBvW4Z5W0l5dm+lw2Slqj1eevp3sw/rA6tUTDa4xYFp8rsJC2574pc10GXuftcSaeGIwRqFWw3HjSzw9y9mJ87AChZJPgAkL+ZCnacB7v7
zAK2u0BBz/GZSuvNNrNPKej9+0UBn6stu2nnXsavdKG9GQou1vYBd8/Ws7lAQXLzUXe/vgvPU2inKxgSnfKfCk7LSA1hTyXyO94rMxuq4DoM6b3YMySdaWbHu/ujbTxXqgezh4IRE+15RsFFHf9TrU/l+LKCgyF53UYtPHBxpoLTSy7PUuVGSf9lZj909/YOKP1NQQ/uaO18GkQhpL9H6R5XcFCit7s/2c7yXVkPb4V/D9H7w86l4FaCuXpcwcXmNrj7ax1VLgR332hmf5V0mKR/FCiZzybXdZCPbdp5Xe/gwYUF683sh5JOUDBqhgQfAESCDwB5c/c3zOynkiaY2YcVJF1bFJzLPULBhdz+3F4bbbTbbGZXKjjH9F4F51UPlDRewfmn97S3fIE8Luk8M3tZwSkIp6hrV6m+SsFV3v9mZj8J2xwoaZS7n+PubmbfUnAV9W4Kzn1fpaAH/VOS3nb3X3bh+Tvro2Z2j6QHFFxBfLyCi4OlEpgnFAzJ/62Z/ULB+cg/UjBUP/36NvcquHr5/WZ2nYJEupeCi9j9Kkz2Uvd8/66ZPSapuZ2h2I8puDr+/4W3KntFQaL5dUnXufuqNpZry38o6In+rrs/nfmgmd0u6TYFF5pr8zPt7uvNbKykm81sioKDD+sVXE1+i7vfnGdcmVLv0bfM7DcKDqy85O5Pm9n9kiab2S8VXGG9RcEF3b4k6X/C0ws6vR7c/V0ze0bSWDNbpeC6FecouHZErn6n4EDZk+HnZZ6CEQH/piBBPcndN3XQxkAzy3au/lttHDyTpO9I+ouk6WZ2l4LTB/opuLtBtbtnO6iTlzzWQT7mK7jt5+MKRkktD2MereAuDUsk7a7g4pnr9f6BNwCoeCT4ANAJ7v4DM3tVwa2jvqVgWO47Cs7Lfr0L7U40s00KemkfUdArOk3S9731LfKicrGCi3CND+enKbhV3POdaczd3zSzIxXc5uw6BUnVMgWvLVVnmpkdreCK4ncq6Ll7T0Fv9KTOvYwuu1RB4jVJwXnMjypIJiRJ7v6KmX1ZwbUKpiq41d7lCobuH5tWb7uZjVRwoGN0+LdB0iy9f0rGHxWcU3yRglvfWTjtxN1bwtsK/kRBr2lfBSMgvqPg9mX5Ok9BgvT7Nh6/X8FV089TOwl+GNsEM3tPwWf3dwqS8FcVXJegS9x9Xngv9NEKEvUqBbeufFNBsn2xgotgjlPQE/+mgpEEK8Llu7oezlFwoON/FRzMu1vBZ/qOHOPfbmbHKfiMjA5j36jgc/Mn5XYqyvnhlOkyBbe2y/a8/wivsXBVGHtvSSsV3D3i/3KJPUcdroM8jVEQ76MK7s7xIwWfxc0KzsH/gILP7WxJI9JPkwGASmftj7gDAKBymNmxChLZEe6e7ZoAAAAAJYvb5AEAAAAAkAAk+AAAAAAAJABD9AEAAAAASAB68AEAAAAASAASfAAAAAAAEoAEHwAAAACABCDBBwAAAAAgAUjwAQAAAABIgJq4AyiUfv36+f777x93GEBZe+GFF1a5e/+444ga2wug69heAABQPLn+7iYmwd9///01Z86cuMMAypqZvRV3DMXA9gLoOrYXAAAUT66/uwzRBwAAAAAgAUjwAQAAAABIABJ8AAAAAAASgAQfAAAAAIAEIMEHAAAAACABSPABAAAAAEgAEnwAAAAAABKABB8AAAAAgASINME3s1FmtsDMFpnZ5Vkev9HM5obTQjNbk/bYeWb2ejidF2WcAACgPLBvAQBA22qiatjMqiXdImmEpKWSZpvZVHefn6rj7t9Oq3+xpMPD//tIukpSrSSX9EK47Oqo4gUAAKWNfQsAANoXZQ/+EZIWuftid98m6QFJJ7ZT/yxJ94f/Hydpprs3hj+8MyWNijBWAABQ+ti3AACgHVEm+AMlvZM2vzQs24mZfVDSEElP5bOsmY02szlmNmflypUFCbqU/GvtprhDAACglES+bxEuG/v+BfsAAIDOiDLBtyxl3kbdMyVN
dvfmfJZ194nuXuvutf379+9kmKXpxcUr9KVrJ+vFJSviDgUoCs6rBZCDyPctpPj3L9gHAAB0VpQJ/lJJ+6XND5K0vI26Z+r9IXT5Lps4Tc0tumrSLLmkqyfNUlNzS9whAZFKO6/2i5IOlnSWmR2cXsfdv+3uw9x9mKSbJT0ULps6r/ZIBcN3rzKzvYoZP4CiSfy+BfsAAICuiDLBny3pQDMbYmbdFPzQTs2sZGYflrSXpOfSiqdLGmlme4U76iPDsoowadZratywRZLUsH6LHpz1WswRAZHjvFogBys3JO90tDwlft+CfQAAQFdEluC7e5OkMQp+PF+V9KC7v2Jm15jZCWlVz5L0gLt72rKNkn6s4Id8tqRrwrLEW7Vus26bPlebtzVJkjZva9Kt0+eqYf3mmCMDIsU1O4AOzF02Vyffc7LmLZ8XdyixSfq+BfsAAICuiuw2eZLk7tMkTcsouzJj/uo2lr1b0t2RBVeips9douaW1sPxmltc0+cu0dmfObiNpYCyV5RrdkiaKEm1tbVttQ2UpKaWJo1/YrxcrvFPjNd959ynmqpIf8JLVpL3LdgHAAB0VZRD9NEJow4fouqq1qulusp03LAhMUUEFEXiz6sFumLyvMlq3BR0NjdsbNCUl6bEHBGiwD4AAKCrSPBLTN9ePXThccPUo1vQM9O9W40uOm6Y+vbqEXNkQKQSf14t0FkNGxt0R/0d2tIUnJe9pWmLJj43cUfCj+RgHwAA0FUk+CXojOEHqW+v7pKkfr266/ThB8UcERCtpJ9XC3TFzIUz1eKth223eItmLpwZU0SIEvsAAICuIMEvQTXVVbr69OEySVefMVw11awmJJ+7T3P3oe7+b+4+Piy70t2nptW52t0vz7Ls3e5+QDjdU8y4gaiN/PBIVVnr34Eqq9KIoSNiighRYh8AANAV/GqUqMM/tI+mXXGaDh+yT9yhAABi1Ge3Prqg7gJ1rwl6dbvXdNfoo0arz259Yo4MUWEfAADQWST4JWzv3rvFHQIAoAScdthp6rt7X0lS39376tRDT405IkSNfQAAQGeQ4AMAUOJqqmr0g8//QCbTuC+Mq9hb5AEAgPaxhwAAQBkYNnCY/vCVP6h/z/5xh4KIvbNqvfbr1yvuMAAAZYgefAAAygTJPQAAaA8JPgAAAAAACUCCDwAAAABAApDgAwAAAACQACT4AAAAAAAkAAk+AAAAAAAJQIIPAABQQt5etU6zXlsWdxgAgDJEgg8AAAAAQAKQ4AMAAAAAkAAk+AAAAAAAJAAJPgAAAAAACUCCDwAAAABAApDgAwAAAACQACT4AAAAAAAkAAk+AAAAAAAJQIIPAAAAAEACkOADAAAAAJAAJPgAEKN/rd0UdwgAAABICBL8EjXnjRWa88aKuMMAEKEXF6/Ql66drBeX8F0HAABA15HgA0AMmppbdNWkWXJJV0+apabmlrhDAgAAQJkjwQeAGEya9ZoaN2yRJDWs36IHZ70Wc0QAAAAodyT4AFBkq9Zt1m3T52rztiZJ0uZtTbp1+lw1rN8cc2QAAAAoZyT4AFBk0+cuUXNL6yH5zS2u6XOXxBQRAAAAkoAEHwCKbNThQ1Rd1XrzW11lOm7YkJgiAgAAQBKQ4ANAkfXt1UMXHjdMPbrVSJK6d6vRRccNU99ePWKODEDcHnj2Nb2zar3eWbU+7lAAAGWIBB8AYnDG8IPUt1d3SVK/Xt11+vCDYo4IAAAA5Y4EHwBiUFNdpatPHy6TdPUZw1VTzeYYAAAAXVMTdwAAUKkO/9A+mnbFadq7925xhwIAAIAEoMsIAGJEcg8AAIBCIcEHAAAAACABSPABAAAAAEgAEnwAAAAAABKABB8AAAAAgAQgwQcAAAAAIAFI8AEAKBMvLnsx7hAAAEAJI8EHAAAAACABSPABAAAAAEiASBN8MxtlZgvMbJGZXd5GndPNbL6ZvWJm96WVN5vZ3HCaGmWcAACgPLBvAQBA22qiatjMqiXdImmE
pKWSZpvZVHefn1bnQEljJQ1399VmtndaE5vdfVhU8QEAgPLCvgUAAO2Lsgf/CEmL3H2xu2+T9ICkEzPqXCDpFndfLUnu/q8I4wEAAOWNfQsAANoRZYI/UNI7afNLw7J0QyUNNbNZZlZvZqPSHutuZnPC8pMijBNACWDYLYAcFGXfwsxGh/XmrFy5snDRAwAQsciG6EuyLGWe5fkPlHSspEGS/mpmh7j7GkmD3X25mX1I0lNm9rK7v9HqCcxGSxotSYMHDy50/ACKhGG3AHIU+b6FJLn7REkTJam2tjazfQAASlaUPfhLJe2XNj9I0vIsdR5x9+3uvkTSAgU/ynL35eHfxZKelnR45hO4+0R3r3X32v79+xf+FQAoFobdAshF5PsWAACUsygT/NmSDjSzIWbWTdKZkjKHzj4s6bOSZGb9FAyrW2xme5nZrmnlwyXNF4Ck4pQeALlg3wIAgHZENkTf3ZvMbIyk6ZKqJd3t7q+Y2TWS5rj71PCxkWY2X1KzpMvcvcHMPiXpdjNrUXAQ4vr0oboAEodTegB0iH0LAADaF+U5+HL3aZKmZZRdmfa/S/pOOKXX+Zukj0UZG4CSkuuw23p33y5piZmlht3OTh92a2ZPKxh22yrB55xaIBnYtwAAoG1RDtEHgFwx7BYAAADoIhJ8ALFz9yZJqWG3r0p6MDXs1sxOCKtNl9QQDrv9s8Jht5I+ImmOmc0Ly8tq2O2cN1bEHQIAoMhWbuD2iwCiEekQfQDIFcNuAQCVYO6yuRrz0BjdcuotOmzAYXGHAyBh6MEHAAAAiqCppUnjnxgvl2v8E+PV1NIUd0gAEoYEHwAAACiCyfMmq3FToySpYWODprw0JeaIACQNCX6Jenf1Br27ekPcYQAAAKAAGjY26I76O7SlaYskaUvTFk18buKOhB8ACoEEHwAAALFqWpf8C47OXDhTLd7SqqzFWzRz4cyYIgKQRCT4QBdUwg4JAABR2vLWC1r+qxHa8vY/4g4lUiM/PFJV1nrXu8qqNGLoiJgiApBEJPhAJ1XKDgkAAFHxliY1PHKFJFfjI+PkCb7oXJ/d+uiCugvUvaa7JKl7TXeNPmq0+uzWJ+bIUGgb1m6OOwRUMBJ8oBMqaYcEAICorH/+PrVsbJAkNW9o0PrZ98ccUbROO+w09d29rySp7+59deqhp8YcEQpt+eIG/fqaJ7R8cUPcoaBCkeADnVBpOyQAABRa84ZVWvvnCfLtQW+nb9+stU/drOYNq2KOLDo1VTX65Ym/lMk07gvjVFNVE3dIKKCW5hY9cf9cSdITD8xVS3NLB0sAhUeCD+SpEndIAAAotI3/nCb35lZl7s3a+M/HYoqoeP7wlT/osAGHxR0GCuylZ5do04atkqRN67fqpVlLYo4IlYgEH8hTJe+QAABQKLsf8iWZVbcqM6vW7od8MaaIiqd/z/5xh4AC27hui+ofX6CmbcE+YtO2ZtU/tkCb1m+NOTJUGhJ8IE+VvEMCAEChVPfsp96fHSPbpYckyXbpod6fu1jVPfvFHBmQv9dfXCZv8VZl3uJa+OKymCJCpSLBB/LEDgmAOEx7dZreXfdu3GEABdXriLNV3TO46Fx1z37q9cmzYo4oekvXLFX9W/Vxh4ECG/rxQbIqa1VmVaahhw+MKSJUKhJ8oBMqcYcEhffonDf07uoNcYcBALGxqhrt/eXbJZn6nHitjIvOoUzt1mtX1Y36sGq6BaM8a7pVq+6LH9ZuvXaNOTJUGhJ8oBOsqkZ9TrhW7JAAANB1A/57proP/njcYQBdcuinh+xI6HfrtasOHT4k5ohQiUjwgU7q/sFPsEMCAEAB1OyxT9whAF1WVV2lL5w5TJL0hTOHqaqaVAvFR7cj0AXskAAAACBlwIf66vwrv6CevXvEHQoqFIeVAAAAEJvtjW9r86Jn4w4DKBiSe8SJHnwAAACgCCbPm/z+zAfjiwNActGDD3TSljdna8ubs+MOAwAAAAAkkeADQCxunzFP
yxs3aHnjBt0+Y17c4QAAAHTKusaGuENAGhJ8oJOa1y5X89rlcYcBJFLTuhVxhwAAADrw9mvzdfMl39A7C16NOxSESPBLzO0z5u3Us0fvHoBKsuWtF7T8VyO05e1/xB0KAAB5WbtqY9whFE1Lc7MevX2CJGnq7RPU0twcc0SQSPABACXEW5rU8MgVklyNj4yTtzTFHRJQND97+Hm9vWrdjulnDz+vnz38fNxhRWr98/epqfFtNTW+HXcoAPI0e8Zj2rhurSRp49o1mjPzsZgjgkSCDwAoIeufv08tG4Nz+Zo3NGj97PtjjggA0FVbV66MOwQU2IY1q/XM5Pu1fetWSdL2rVv19O/v14a1a2KODCT4AICS0Lxhldb+eYJ8+2ZJkm/frLVP3azmDatijgwA0Flr5s1T/Wn/qbUvvRR3KCigV557Vi0tLa3KvKVF8597NqaIkEKCDwAoCRv/OU3urc/fc2/Wxn8y5A9A+bvxmRv1zpp3dkw3PnOjbnzmxrjDipQ3NWnBT66T3PXaT66TN3HaVVJ89FOfUVVV61TSqqp08FGfjikipJDgAwBKwu6HfElm1a3KzKq1+yFfjCkiAFFa/fh1amp8a8e0+vHrtPrx6+IOCwW07KGHtG31aknStsZGLXvoDzFHhELp2XtPHXPaWdpl110lSbvsuquO/c+z1LP3njFHBhJ8AEBJqO7ZT70/O0a2Sw9Jku3SQ70/d7Gqe/aLOTIAQL62NTRoyZ13qWXLFklSy5YtWnLnndrW2BhzZCiUT478onYPE/qevfdU7QgOyJcCEnwAQMnodcTZqu7ZV3ufe7eqe/ZTr0+eFXdIAIBO+NeTT8qznKP9ryeejCkiFFpVdbWOH/0tSdLx3xijqurqDpZAMdTEHQBQbtY+c2vW+d7HXBRHOChDV02a1WbZj84YXuxwSopV1ajPCddK3qI+J14rq+JnCgDK0d5f+IKW3HmXPK3Mqqq09xc+H1tMKLzBBx2si//3du3Rp2/coSBEDz4AoKQ0r1mq5rXL1X3wx+MOBQDQSd369NGQr39NVd27S5KqunfXkK9/Xd369Ik5MhTS6hXvkdyXGBJ8AEDJWPvMrWpas0xNa5btNFqmkt3197v07rp39e66d3XX3+8YUrZ7AAAgAElEQVSKOxwAyMnAU05R7T13SwoS/oGnnBxzREDyMfaxRGQbspvt8UofvgsAQJJcfFfH5yOn17n5awxvRvmwmjDVMNNBPxj7/jyAyPAtAwAAQFH8674L86qz99m3RRkOiqRu8u+1a//+cYdRFGtWbdSaVRv1wYP2jjsUVCiG6AMAAACITKUk90ApoAcfyFHDI1fk9HjfE68tRjgAAAAlb/M7S7X5naXqU3dk3KEgAo3vvau99tk37jCKavu6rdplj13jDqNN9OADAAAAANCBTW+v0aKbZ2nTO2viDqVNJPgAAAAAALTDW1q0/NH5kqTlU+fLW1pijig7EnwAAAAAANrROHupmjZukyQ1bdymxjlLY44oOxJ8AAAAAADa0LRhq1Y+s1i+Pei19+0tWvn0YjVt2BZzZDsjwQcAAAAAoA1rX1khtXjrQpfWzV8RT0DtIMEHAAAAUHDLpkzR5mVLtXlZaQ5lBnLV+6P7SlXWutCkPQ7eJ56A2kGCDwAoCQ2PXKGmNctbTQ2PXNHhLSpRWcxslJktMLNFZnZ5G3VON7P5ZvaKmd2XVn6emb0eTucVL2oAQDmr6dlN/Y/5kGyXIH22XarU/9gPqaZnt5gj21mkCT4/wgAAoFDMrFrSLZK+KOlgSWeZ2cEZdQ6UNFbScHf/qKT/Dsv7SLpK0pGSjpB0lZntVcTwAQBlrM8nB6lm9yChr+nZTX1qB8UcUXaRJfj8CAMAgAI7QtIid1/s7tskPSDpxIw6F0i6xd1XS5K7/yssP07STHdvDB+bKWlUkeIGKs6im27S5qVLd0yLbrpJi266Ke6wUECzZzymxhXvafaMx+IOpSisqkoDjg/S
2QHHHyyrKs3B8FFGxY8wgJwx4gdADgZKeidtfmlYlm6opKFmNsvM6s1sVB7LSpLMbLSZzTGzOStXrixQ6ACAcrfb4D11wMXDtdt+e8YdSptqImw72w/pkRl1hkqSmc2SVC3pand/vI1ld/oRNrPRkkZL0uDBgwsWOIDiShvxM0LB9322mU119/lpddJH/Kw2s73D8tSIn1pJLumFcNnVxX4dACJnWcoyLmusGkkHSjpW0iBJfzWzQ3JcNih0nyhpoiTV1tZmrQMAqEy77LFr3CG0K8oe/Hx/hM+SdKeZ7ZnjsnL3ie5e6+61/fv372K4AGLEiB8AuVgqab+0+UGSlmep84i7b3f3JZIWKNjXyGVZAADatG315rhD6FCUPfi5/gjXu/t2SUvMLP1H+NiMZZ+OLFIAcYt8xA+ARJgt6UAzGyJpmaQzJZ2dUedhBZ0Gvzazfgq2HYslvSHpJ2nX9BmpYFQQEJnvTf1eXnV+fsLPowwHQAWIMsHnRziLC26bUfDl77hwZJfaBEpA5MNuOaUHKH/u3mRmYyRNV3Cg7253f8XMrpE0x92nho+NNLP5kpolXebuDZJkZj9WsH8iSde4e2PxXwUAANGJLMHnRxhAHiIf8cM5tUAyuPs0SdMyyq5M+98lfSecMpe9W9LdUccIAEBcouzB50cYQK4Y8QO0YfzM8W2WjRsxrtjhAACgGb+9O+v8yHO/Gkc4SBNpgg8AuWDET2Va8ZuvdKruPufdE0U4QKROuv7hSNt5+PKTCtI+0BUvX/b9vOp87IafRRlO0b307JK4QwBI8AGUBkb8AAAAAF0T5W3yAAAAAABIhG2Nm+IOoUP04ANAxPK5e0Z6Xe6QAQAAgHyQ4AMZ8jkvONflOWcYAAAAQNRI8AEAAADk7PmzMm90U9h2jrj/voK0D1QiEnwAAErImIfGdKruhFMmRBEOAAAoIyT4AAAAANAFf/nDP7POH33yIXGEgwrGVfQBAAAAAEgAEnwAAAAAABKABB8AAAAAgAQgwQcAAAAAIAFI8AEAAAAASACuog8AAAB0wRm/PSPSdiadO6kg7QPovMY5S3f87VM7KOZo2kYPPgAAAAAACUAPfoH8x08eKovn/uMPTokwEgAAAKAyPHrH3/Oqc/wFR0YZDiCJHnwAAAAAABKBBB8AAAAAgARgiD4AFEAUp+m01yan2wAAABTHezMWZp3fd+TQOMJpFz34AAAAAAAkAD34AIDILL/puKK2OeDS6QV/PgAAgHJBgg8AAICCWj7h3yNtZ8CYPxWkfQBIGoboAwAAAACQAPTgAwAAAACQ5u1J8/KqM/iMw6IMJ2ck+KgYUZwLHMVzcw4xAAAAgM5giD4AAAAAAAlAgg8AAAAAQAKQ4AMAAAAAkAAk+AAAAAAAJAAJPgAAAAAACcBV9AEAAPL0ict+G3cIreQbzws3nBtRJACAONGDDwAAAABAAtCDDwAAAABo1wM3jM+rzpmXjYsyHLSBBB8AgCI79denFrXNKedPKfjzAQCA0sMQfQAAAAAAEoAEHwAAAACABGCIPgAAAACgIi267blI2zngwqMK0n6u6MEHAAAAACABSPABAAAAAEgAhugDAACgXW9f87G4Q2gl33gGX/lyRJEAQGkhwQcAAACANtx73VORtnPO2M8VpH1AYog+AAAAAACJQIIPAAAAAEACMEQfAAAASDP85uFxh9BKvvHMunhWXvWfOfqYvOpHLd94jvnLMxFFUplu/e6YSNu56BcTCtI+siPBBwB0qNQusNWWzsbJBbjKh5mNknSTpGpJd7r79RmPny/pBknLwqIJ7n5n+FizpNTKftvdTyhK0AAAFEnOCb6ZfVrSge5+j5n1l9TT3Zd0sExJ/gh/4rLfFqqpshPFa3/hhnML3ibKW2e2FwAqT77bCjOrlnSLpBGSlkqabWZT3X1+RtVJ7p6t62izuw8rVPwAAJSanBJ8M7tKUq2kD0u6R9Iuku6V1OZ4IX6EgcrUme0F
gMrTyW3FEZIWufvisI0HJJ0oKXPfAgCAipTrRfZOlnSCpI2S5O7LJfXqYJkdP8Luvk1S6kcYQLJ1ZnshMxtlZgvMbJGZXZ7l8fPNbKWZzQ2nr6c91pxWPrWArwVAdDqzrRgo6Z20+aVhWaZTzewlM5tsZvullXc3szlmVm9mJ7X1JGY2Oqw3Z+XKlTm9GAAASkGuQ/S3ububmUuSme2ewzLZfoSPzFLvVDM7WtJCSd9299Qy3c1sjqQmSde7+8M5xgogXnlvL0ppxE+5nMLT2Tg5pQYlpDP7FpalzDPmH5V0v7tvNbNvSvqNpNRNpge7+3Iz+5Ckp8zsZXd/Y6cG3SdKmihJtbW1me0DAFCycu3Bf9DMbpe0p5ldIOkJSXd0sEyuP8L7u/uhYZu/SXtssLvXSjpb0q/M7N92egKOsAOlqDPbC0b8AJWnM9uKpZLSe+QHSVqeXsHdG9x9azh7h6RPpD22PPy7WNLTkg7vygsAAKDU5NSD7+4/N7MRktYpOFfuSnef2cFiOf0Ip83eIemnaY/t+BE2s6cV/Ai/kbE8R9iBEtPJ7UXkI37MbLSk0ZI0ePDgfF4SgAh0clsxW9KBZjZEwQV6z1TQEbCDmX3A3d8NZ0+Q9GpYvpekTWHPfj8F5/r/rGAvCABK1PhzTos7hFbyjWfcvZMjiiSZOkzww6Gz0939C5I6+uFNx48wUGG6sL2IfNgtBwSB0tHZbYW7N5nZGEnTFdyh5253f8XMrpE0x92nSrrEzE5QcMCvUdL54eIfkXS7mbUoGMF4fZbTgAAAKGsdJvju3mxmm8yst7uvzbVhfoSBytPZ7YWKMOIHQOnowrZC7j5N0rSMsivT/h8raWyW5f4m6WOdDBkAgLKQ60X2tkh62cxmKrzarSS5+yXtLcSPMFCROrO9YMQPUHk6tW8BAADalmuC/6dwAoCO5L29YMQPUJHYtwAAoMByvcjeb8ysm6ShYdECd98eXViodG9fU7kDOKJ47YOvfLngbbals9sLRvwAlYV9CwAACi+nBN/MjlVwQas3FVwMaz8zO8/d/xJdaADKEdsLALlgWwEgLhO++2jcIbSSbzxjfnF8RJEgCXIdov8LSSPdfYEkmdlQSfcr7d6yABBiewEgF2wrAAAosKoc6+2S+gGWJHdfKGmXaEICUObYXgDIBdsKAAAKLNce/Dlmdpek/xfOf1nSC9GEBKDMsb1AxRh+8/C4Q8hJZ+OcdfGsAkfSStluK+rr67V9WXlfy7O+vl51dXVxhwEAKLBcE/wLJX1L0iUKzpP7i6RbowoKQFljewEgF2wrAAAosFwT/BpJN7n7LyXJzKol7RpZVADKGdsLALko221FXV2ddpmyMO4wuoTeewBIplzPwX9SUo+0+R6Snih8OAASgO0FgFywrQAAoMBy7cHv7u4bUjPuvsHMdosoJgDlje0FgFywrQAAFNyr45+KO4RW8o3nI+M+16Xny7UHf6OZfTw1Y2a1kjZ36ZkBJBXbCwC5YFsBAECB5dqDf6mk35vZckkuaYCkMyKLCkA5Y3sBIBdsKwAAKLBcE/whkg6XNFjSyZLqFPwYA0AmthcAcsG2okzU19dr4ZvNcYfRJUO5LSCACpHrEP0fuvs6SXtKGiFpoqTbIosKQDljewEgF2wrAAAosFx78FOHbf9d0v+5+yNmdnU0IUWrvr5e25fNjzuMRKnnqDhaS8z2AkCk2FaUibq6Og2YUR13GF0ymP0UABUi1x78ZWZ2u6TTJU0zs13zWBZAZWF7ASAXbCsAACiwXHvwT5c0StLP3X2NmX1A0mXRhRWduro67TJlYdxhJAq998iQmO0FgEixrQAAoMBySvDdfZOkh9Lm35X0blRBAShfbC8A5IJtBQAAhcdQOAAAAAAAEiDXIfoAkHiVcBFOLooJAACQXPTgAwAAAACQAPTgA0CoEi7C2Zne+/r6ei18s7njimVsKCMbAKAi1NfXa+HyuXGH0SX19f35zUKbSPAB
AACAClVfX6/ZWzbHHUaX7MpBWmAHEnwAQLvq6uo0YEZ13GFEajA7hgBQEerq6jTn9yvjDqNLOJiB9pDgAwAAABWqrq5OW7v3iDuMLiHhBd7HRfYAAAAAAEgAevABAACAUH19vba+vjXuMLqEW6IClYsefAAAAAAAEoAefAAAACBUV1enXWfvGncYXULvPVC56MEHAAAAACABSPABAAAAAEgAhugDAACgXYOvfDmv+ssn/HtEkQQGjPlTpO0DQLkiwUfJqa+v18I3m+MOI1GGcjVdAAAAJFx9fb1eXjIn7jC6ZG39bl3abyfBBwAAAIAEqq+v15KGNXGH0SXc9jE/JPgoOXV1dRowozruMBJlMBtFoOCScK/sjrBTBQAoJ3V1der95Ka4w+iSj3Txd5cEHwAAAAASqK6uTk/23TPuMLqEA835IcEHAKATknCv7I6wUwUAQHnhNnkAAAAAACQACT4AAAAAAAlAgg8AAAAAQAKQ4AMAAAAAkAAk+AAAAAAAJAAJPgAAAAAACUCCDwAAAABAApDgAwCAsmFmo8xsgZktMrPLszx+vpmtNLO54fT1tMfOM7PXw+m84kYOAED0auIOAAAAIBdmVi3pFkkjJC2VNNvMprr7/Iyqk9x9TMayfSRdJalWkkt6IVx2dRFCBwCgKCLtwecoOwAAKKAjJC1y98Xuvk3SA5JOzHHZ4yTNdPfGMKmfKWlURHECABCLyHrwOcoOIB9mNkrSTZKqJd3p7tdnPH6+pBskLQuLJrj7neFj50m6Iiy/1t1/U5SgARTbQEnvpM0vlXRklnqnmtnRkhZK+ra7v9PGsgOzPYmZjZY0WpIGDx6cNZAXbjg3r8BPuv7hvOrn6+HLT4q0fQBAeYiyB5+j7AByknZA8IuSDpZ0lpkdnKXqJHcfFk6p5D51QPBIBdudq8xsryKFDqC4LEuZZ8w/Kml/dz9U0hOSUgf8clk2KHSf6O617l7bv3//TgcLAECxRZng53qk/FQze8nMJpvZfvksa2ajzWyOmc1ZuXJloeIGUHwcEASQi6WS9kubHyRpeXoFd29w963h7B2SPpHrsgAAlLsoE/zIj7JzhB1IDA4IAsjFbEkHmtkQM+sm6UxJU9MrmNkH0mZPkPRq+P90SSPNbK9wlM/IsAwAgMSIMsHnKDuAXHFAEECH3L1J0hgFifmrkh5091fM7BozOyGsdomZvWJm8yRdIun8cNlGST9WcJBgtqRrwjIAABIjytvk7TjKruCiWGdKOju9gpl9wN3fDWczj7L/JO082pGSxkYYK4B45XRAMG32Dkk/TVv22Ixlny54hABKgrtPkzQto+zKtP/Hqo19Bne/W9LdkQYIlKFj/vJMXvWfP+vsjit1wRH33xdp+0CSRZbgu3uTmaWOsldLujt1lF3SHHefquAo+wmSmiQ1Ku0ou5mljrJLBT7Knu+Vb3PxHz95qOBtRuGPPzgl7hCAbDggCAAAAHRRlD34HGUHkJNSPiAIAAAAlItIE3wAyBUHBAEAAICuifIiewAAAAAAoEjowQcAdGjwlS93arnlNx1X4EjaN+BS7noGAAAqFwk+AKTp7EU4i32hTS6YCQAAgEwM0QcAAAAAIAHowQcAAACAhBp37+S86t/63TERRRK46BcTIm2/0tGDDwAAAABAApDgAwAAAACQAAzRBwAAiNjDl5+Utfziu57Mq52bv/b5QoQDAEgoEnyUpM7ekqs9xb5dV2dxmy8AQLkbMOZPWcv/dd+FebWz99m3FSIcAKgYJPgAAABAmlkXz8qr/hm/PSOiSAKTzp0UafsAkoMEHwCATso3CUg59denFjiS9k05f0pRnw8AStmYXxyfV/17r3sqokgC54z9XKTto7JwkT0AAAAAABKABB8AAAAAgARgiD4AAAAAIBE+Mi6/Ux4W3fZcRJEEDrjwqEjbz0QPPgAAAAAACUCCDwAAAABAAjBEHwAAAEDOjrj/vqzlL1/2/bza+dgNPytEOADS
0IMPAAAAAEACkOADAAAAAJAAJPgAAAAAACQACT4AAAAAAAlAgg8AAAAAQAKQ4AMAAAAAkADcJg8AAABFsffZt+34f/Xj12Wts9eoscUKBwAShwQfAAAAACBJuugXE7KWP3DD+LzaOfOycYUIB3liiD4AAAAAAAlAD36B/PEHp+RU74LbZhT8ue+4cGTB2wQAAAAAlBd68AEAAAAASAB68AGgANobxdPZkTuMzgEAAEA+SPABAJEZcOn0Nh9b8ZuvdKrNfc67p7PhAAAAJBoJPgAAANAFk86dlLX8e1O/l1c7Pz/h54UIB0AF4xx8AAAAAAASgB58AACKbMr5U9p8bMxDYzrV5oRTst+3GKXt5q99fsf/P3v4+ax1vn/SEcUKBwAqzgEXHpW1/O1J8/JqZ/AZhxUinC6jBx8AAAAAgAQgwQcAAAAAIAEYoo+K0d7VvNN19sre7eGq3wAAAACiRg8+AAAAAAAJQIIPAAAAAEACMEQfAAAAQJd97Iaf7fh/0U03Za1zwKWXFiscoCLRgw8AAAAAQAKQ4AMAAAAAkAAM0QcAAGXDzEZJuklStaQ73f36NuqdJun3kj7p7nPMbH9Jr0paEFapd/dvRh8xgHJ3ztjPZS1/9I6/59XO8RccWYhwgHaR4AMAgLJgZtWSbpE0QtJSSbPNbKq7z8+o10vSJZIy977fcPdhRQkWAIAYkOAX2R0XjsxaftWkWTkt/6MzhhcyHAAAyskRkha5+2JJMrMHJJ0oaX5GvR9L+pmk7xU3PKC1n5/w8x3/3/jMjVnrfPuYbxcrHAAVINJz8M1slJktMLNFZnZ5O/VOMzM3s9pwfn8z22xmc8Pp/6KMEwAAlIWBkt5Jm18alu1gZodL2s/d/5hl+SFm9qKZPWNmn4kwTgAAYhFZgp82jO6Lkg6WdJaZHZylXrvD6MKJc+SAhOOAIIAcWJYy3/GgWZWkGyV9N0u9dyUNdvfDJX1H0n1mtkfWJzEbbWZzzGzOypUrCxA2AADFEWUP/o5hdO6+TVJqGF2m1DC6LRHGAqCEcUAQQI6WStovbX6QpOVp870kHSLpaTN7U1KdpKlmVuvuW929QZLc/QVJb0gamu1J3H2iu9e6e23//v0jeBkAAEQjygSfYXQAcsUBQQC5mC3pQDMbYmbdJJ0paWrqQXdf6+793H1/d99fUr2kE8Kr6PcPDybKzD4k6UBJi4v/EpCy16ixqunzwR3TXqPGaq9RY+MOCwDKWpQJfuTD6BhCByRG5AcE2V4A5c/dmySNkTRdwS3vHnT3V8zsGjM7oYPFj5b0kpnNkzRZ0jfdvTHaiAEAKK4or6KfzzA6SdpXwTC6E9x9jqStUjCMzsxSw+jmpD+Bu0+UNFGSamtrXQDKVa4HBM/PUi91QLDBzD4h6WEz+6i7r2vVGNsLIBHcfZqkaRllV7ZR99i0/6dImhJpcAAAxCzKHnyG0QHIVVHOqwUAAACSLLIefHdvMrPUMLpqSXenhtFJmuPuU9tZ/GhJ15hZk6RmMYwOSLodBwQlLVNwQPDs1IPuvlZSv9S8mT0t6XupA4KSGt29mQOCAIBS9e1jvq3J8ybvmD/tsNNijAZAUkU5RJ9hdABywgHByrTPefe0mm945Io26/Y98dqowykZE06Z0Gp+/MzxbdYdN2Jc1OEAKKDTDjtN9W/Vxx0GgASLNMEHgFxxQBAAAADomijPwQcAAAAAAEVCDz4AROyOC0e2mr9q0qw26/7ojOFRhwMAAICEIsEHMmSeF5zS3vnB6SrpXGEAALqi1xFna/OiZ+MOAwASgwQfAAAAAPJ0/AVH7vj/L3/4Z9Y6R598SLHCQYENPuOwHf+/N2Nh1jr7jiy9OzNzDj4AAAAAAAlADz4AAEAJ+P5JR+iBZ1/bMX/mpw+KMRoAaO3My96/NeuM396dtc7Ic79arHDQBnrwAQAAAABIABJ8AAAAAAASgAQfAAAAAIAEIMEHAAAAACABSPABAAAAAEgAEnwA
AAAAABKA2+QBAAAARTJoz0Ea1HtQ3GEASCgS/BLxozOGS5JunzEv6+PfGHlYMcMBAAAAAJQZEnwAAADEZpc+g1XTZ3DcYQBAm/YdOVSNc5bumO9TW7qjcDgHHwAAAACABCDBBwAAAAAgAUjwAQAAAABIAM7BBwCUhL4nXqu1z9zaqqz3MRfFFA0QjzM/fZBmvbYs7jAAAGWKHnwAAAAAABKAHnwAAIAS9OKSFTp8yD5xh4ECG9S7dK++jc47+uRD9NKzS3bMH/rpITFGg0pGgg8ARfajM4br9hnzWpV9Y+RhMUUDoJQ0Nbfs+P/qSbM05bKTVFOd7AGX3CIPAAon2b8YAAAAZWTSrNc0uN8euuSuJ9WwfosenPVa3CEBAMoICT4AAEAJWLVus26bPlcn/fQPkqTN25p06/S5ali/OebIAADlgiH6QI76nnitJO10le8UrvYNAOiK6XOXqLmlpVVZc4tr+twlOvszB8cUVXE0rVuhmj0q53oDKzesVP+e/eMOA0AC0YMPAABQAkYdPkTVVa13zaqrTMcNS/bFura89YKW/2qEtrz9j7hDKYq5y+bq5HtO1rzl8zquXMYOuPRS9Rg0aMd0wKWX6oBLL407LKAgNr2zJu4Q2kSCDwBACRs3Ypz23WPfVtO4EeM0bsS4uENDgfXt1UMXHjdMPboFAyy7d6vRRccNU99ePWKOLDre0qSGR66Q5Gp8ZJy8pSnukCLV1NKk8U+Ml8s1/onxakr46wWSavnU+fKMEVelggQfAACgRJwx/CD17dVdktSvV3edPvygmCOK1vrn71PLxgZJUvOGBq2ffX/MEUVr8rzJatzUKElq2NigKS9NiTmiaA089VT1GDhIPQZya0CUP29uUbe9emjF9IVq2rhNjXOWxh1SViT4AAAAJaKmukpXnz5cJunqM4Yn+hZ5zRtWae2fJ8i3BxcR9O2btfapm9W8YVXMkUWjYWOD7qi/Q1uatkiStjRt0cTnJu5I+AGUrqYNW7XymcU75n17i1Y+vVhNG7bFGFV2yf3VAAAAKEOHf2gfTbviNB0+JNkXndv4z2lyb25V5t6sjf98LKaIojVz4Uy1eOshvS3eopkLZ8YUEYBcrX1lhdTieueBtGtnuLRu/or4gmoDCT4AoGT0PuYi1ew5UDV7DuTOFKhoe/feLe4QIrf7IV+SWXWrMrNq7X7IF2OKKFojPzxSVdZ617vKqjRi6IiYIkKhHfrpIerdb3f17rd73KFEbuS5X9Ve+35gxzTy/7d373FWlXXfxz/fmeE8CI6ggaggmkY+ggqkaaVGHnpS0zA1u5PSzEq7O1iPdlDrtoMP+dTrzk5m59vwQFjmbR6y0PIWBZWDqJSJJWqCKCYKIjO/5491DS6mDTPD7D1rz97f9+s1r9nrdO3ftQ7XXte6rrXW+z7AEe/7QNFhVcyw178GGrT5SMF2E6rvQqwr+GZmZmbW6xqbRzDssLNRv+whguo3iGGHn0Nj84iCI6uMlsEtfPDADzKwKXvGwsCmgZx50Jm0DG4pODIz60xTc39GvmV31C+rPqtfAyMP3Z2m5v4FR/avmooOwDb3oSMmAvCbBX8F4JjJ44sMx0pob1V8cdGvARgy8bgiwzGrOUMmHsf6x+YXHYaZ9YKhU9/D2vmz2PjcChqbRzB0yilFh1RR0ydOZ86SOTzx/BPsMGQH3rXvu4oOycy6qGXKGJ5bsIJX1qynqbk/LZOr8+GRbsE3MyvAh46YyOiWZka3NG+6sGe2JadNOY1R241i1HajuHnZzX61ltUMNTTRcuzFgGg57mLUUNttT00NTXz2rZ9FiM9N+xxNNZ7fejR8xBB+84O7efLR1UWHYmWmhgZGHzMBgNHHTEAN1VmVrs6ozPqAxmGjaRw2uugwzKwOzF40m0vnXsqo7UbVxau1rL4M3O0ARn/8Vgbuun/RofSKSTtP4rr3X8fE0b64W2vaWl99iOLvrlq42bDVhsG7DmePcw5m8C7D
iw5li1zBNzMzq2L5V2udPedsv1rLalLTdtX3oKpKGtk8sugQrAIW/2k5sy69HYCXXniZxXcuLzgiqyLMvLQAABl6SURBVIR+2w0oOoStcgW/So3avplR2zcXHYaZmRXMr9YyM6t+L/5zPfNuWsbGDdmrHzduaGXeb5fx0gsvFxyZ1RtX8M3MrOoMHDul6BCqhl+tZWZW/f5y/xNEW2w2LtqCP9//REERWb1yBd/MzKyK+dVaZmbV77X7j0Ed3pOuBvHa/XYuKCKrV67gm5mZVbnpE6ezw5AdAPxqLTOzKjR46AAOPGovmvo3AtDUv5EDj96LwUOr+37tcnl82UNFh2CJK/hm20oNrPzZ6az/+31FR2JmNc6v1jKzvmrQLmNoOfANRYfRK/Y9ZNymCv3goQPY9+BxBUdUWQe89QhadnoNt/zsh1z//ctoa20tOiTDFXyzbRJtG1n9688DwbO//hzhd1KbldXGfz5ddAhVx6/Wykg6StIySY9IOm8r802XFJIm58adn5ZbJunI3onYzOpFQ2MD006eBMC0kyfR0FjbVa35t/x20+cXn1/Dglt/u5W5rbfU9l5nViEv3PML2l5cDUDr2tW8MH9WwRGZ1Y71f7uXJ7/5NveOKaHeX60lqRH4NnA0MAE4RdKEEvMNBT4G3J0bNwE4GXg9cBTwnZSemVnZjN59B2ZcMI3Ru+9QdCgVtXbNc9w+exZXzfwyAK+8/DJzr53F2ufXFByZuYJv1k2ta5/h+T9cRryyDoB4ZR3P//5btK59puDIrK85ZvJ4vw6zA/eOsU5MBR6JiEcjYgNwFXBcifn+A/i/wPrcuOOAqyLi5YhYDjyS0jMzK6vmYYOKDqHilt71J9raNn+Fa7S18eBdfyooImvnCr5ZN734wI1EbH6PUUQrLz7gbklmPeXeMdaJnYHHc8Mr0rhNJO0H7BIRN3R3WTMz65rXv/FNNDRsXpVUQwMTDjqkoIisXUUr+L5PzmrRkH3eTsdenVIjQ/Y5uqCIaoPLC3PvGOsClRi36cXTkhqAbwCf6u6ym80onSlpgaQFq1at2qZAzcxqWfOw4bxl+in0G5A9VLDfgAEceuIpNA8bXnBkVrEKvu+T65nJ43di8vidig7DSmhsHsGww85G/bLuV+o3iGGHn0Nj84iCI+u7XF4YuHeMdckKYJfc8BjgydzwUGAfYK6kx4ADgevTBcHOlt0kIi6PiMkRMXnkyPp+7oGZ2ZZMOeJohqQKffOw4Ux+mxu7qkElW/B9n5zVrKFT30Njc/bwlMbmEQydckrBEfV5Li/MvWOsK+YDe0oaJ6k/2cW969snRsTzETEiIsZGxFhgHnBsRCxI850saYCkccCewD29nwWz+jJoZ98JU6saGhs55syPAnDMh86modHtK9WgkhX8it8n5y50VhQ1NNFy7MWAaDnuYuR3UveUywtz7xjrVERsBM4GbgYeAq6JiKWSviTp2E6WXQpcAzwI3AR8NDp2GTEzs27Zde8JnPOf32eXvV5XdCiWVLJW0tX75GZ0d9lNIyIuBy4HmDx5csn76MwqZeBuBzD647fStJ1vpSiDui0vfCvO5oZOfQ9r589i43Mr3DvGSoqIG4EbO4y7YAvzHtph+MvAlysWnJlZHdqupbZfCdjXVLIFv1fukzMrkiv3ZePywgD3jjEzq0Uvu+ecWa+pZAXf98mZWVe5vLBN2nvHDNx1/6JDMTOzHlqzaBHzpp/I84sXFx2KWV2oWAXf98mZWVe5vLCO3DvGzKzvi40bWfaVr0IED3/lq8TGjUWHZFbzKtr30ffJ9czK519ix2GDiw7DrFfUa3nh49zMzGrVE3PmsOG55wDY8OyzPDHnOsa8+8SCozKrbZXsom89cP+jT/P2i2dz//Kniw7FzCrEx7mZmdWqDatXs/yKH9K2Pnuzbdv69Sy/4go2PPtswZGZ1TZX8KvQxtY2Lrz6TgK46Oo72djaVnRIZlZmPs7NzKyWrbztNqJt
89+2aGtj5e9uKygis/rgCn4VuvrOh3l2bXa1c/UL67nmzocLjsjMys3HuZmZ1bIdp01DDZtXNdTQwI7T3lpQRGb1wRX8KvPMP9fx3ZsXsm5D9hCSdRs28p2bF7L6hXUFR2Zm5eLj3MzMal3/lhbGnXE6DQMHAtAwcCDjzjiD/i0tBUdmVttcwa8yNy9cTmuH7kytbcHNC5cXFJGZlZuPczMzqwc7n3DCpgp9/5YWdj7h+IIjMqt9ruBXmaP2G0djh+5MjQ3iyEnjCorIzMrNx7mZmdUDNTWx1/nngcTenz0fNVX0BV5mhiv4VWeHoYP48JGTGNQ/KwAH9m/iI0dOYoehgwqOzMzKxce5mZnVi+ETJ3Lg7GsZtu++RYdiVhdcwa9CJx28NzsMze5XGjF0IO8+eO+CIzKzcvNxbmZm9WLAyJFFh2BWN1zBr0JNjQ1c9O6DEXDRSQfT1OjNZFZrfJybmZmZWbn5Rpgqtd/uO3Hj56ez47DBRYdiZhXi49zMzMzMyslNRlXMJ/1mtc/HuZmZmZmViyv4ZmZmZmZmZjXAFXwzMzMzMzOzGuAKvpmZmZmZmVkNcAXfzMzMzMzMrAa4gm9mZmZmZmZWA1zBNzMzMzMzM6sBruCbmZmZmZmZ1QBX8M3MzMzMzMxqgCv4ZmZmZmZmZjXAFXwzMzMzMzOzGuAKvpmZmZmZmVkNUEQUHUNZSFoF/K3oOMpsBPBM0UHYVtXaNtotIkYWHUSlVVl5UWv7ULl4vZRWTevF5UVlVdO27g3Ob21zfmub89s7uvS7WzMV/FokaUFETC46DtsybyPrKe9DpXm9lOb1Uj/qbVs7v7XN+a1tzm91cRd9MzMzMzMzsxrgCr6ZmZmZmZlZDXAFv7pdXnQA1ilvI+sp70Oleb2U5vVSP+ptWzu/tc35rW3ObxXxPfhmZmZmZmZmNcAt+GZmZmZmZmY1wBV8MzMzMzMzsxrgCn6VkzRXUtW+hqEWSRou6SM9WN7bzACQdJSkZZIekXReiemflPSgpMWSbpO0W25aq6SF6e/63o2893RhHc2QtCq3Ls4oIs7e0IV18Y3cevizpDW5aXWxv1QzSWMlrZO0MDduuKTZkh6W9JCkgzosc66kkDSiC+nvKumWlM6DksZ2mP4tSWtzw5+Q9HdJl/U8dyXjqVh+Je0m6d60Py+VdFYaP1jSf6f0l0r6Wm6Zbc5vtW27raQzNXecL5J0fBq/i6Q/pPSXSvr33DIzJf1D0rmdr4nNvmuzdSJpoKR70vculfTF3LxXprLrAUk/ktSvk7R7dfuWU4n10q39RNJJqYy/oYj4u6JEHn8kaaWkBzrMNzPle7Gk6yQNT+P7SfqppCVpnZyfxg9K23xDV46bIuTzvi37fNVs34jwX8F/gICGLUybC0wuOsZ6+gPGAg/0YHlvM/8BNAJ/BXYH+gOLgAkd5jkMGJw+fxi4OjdtbdF5qJJ1NAO4rOhYq2FddJj/HOBH9bS/VPtfqd8O4KfAGelzf2B4btouwM3A34ARXUh/LvC29Lm5vexIw5OBn3fcDyp5/FQyv2nZAbm8PgaMBgYDh+Xm+SNwdE/zW43bbgvpDAaa0udRwEqgKX3eP40fCvw5X34AFwHn9mSdkJ2rNqfP/YC7gQPT8NvTdAGzgA9X0/at5H6/LfsJcChwQ5H56GYe3wzsX+IYOSK3P14CXJI+vwe4KrfPPgaMzS33WFeOm6Lzvq37fDVsX7fgFyRdIXpI0neA+4B/k3SXpPskXSupucQy+Svz0yX9pBdDridfA8anq3ffUNayel+6EnkcbLb9fpCu6t0iaVAujRPTVb8/S3pTMdmwgk0FHomIRyNiA3AVcFx+hoj4Q0S8lAbnAWN6OcaidbqO6kh318UpZCcVVqUkbUd2YvxDgIjYEBFrcrN8A/gM0OnTjiVNIDuRvjWltba97JDUCMxMaRWmnPlNy76cBgeQepxGxEsR8Yf2ecjOn8peblbrtkv535gGB7Z/
f0Q8FRH3pc8vAA8BO3clza6KTPt5aL/01/79N6bpAdxDJ9uk6O1bLuXcT6pZRNwBPFti/C25/TF/DhPAEElNwCBgA/DP3oi1nMq5z/c2V/CLtRfwM+BtwOnAtIjYH1gAfLLIwOrcecBfI2IS8Gng+LRdDgMulaQ0357AtyPi9cAa4F25NJoiYirwceDC3gvdqsjOwOO54RVs/YTrdOC3ueGBkhZImifpnZUIsAp0dR29K3UBnC1pl94Jrdd1eX9RdivHOOD3udH1sL/0NbsDq4AfS7pf0hWShgBIOhZ4IiIWdTGt1wJrJM1Jac1MlUOAs4HrI+Kpsuege8qZ3/Zu54vJjotLIuLJDtOHA8cAt5UtB6+q2m0n6Q2SlgJLgLNyFaz26WOB/chaG8tKUmPqtr0SuDUi7u4wvR/wb8BNXUiryO1bLmXd5/u4D/DqOcxs4EXgKeDvwNcj4l8uEPQF5dzne5Mr+MX6W0TMAw4EJgB3pp3oNGC3rS5pvUXAV9KP0O/ITrh3StOWR0T7/Xr3knXraTdnC+OtfqjEuJJX8SW9l6yb5szc6F0jYjJZV7dvShpf/hAL15V19Buyrn37kh2DP614VMXo8v4CnAzMjojW3Lh62F/6miaybq3fjYj9yE54z5M0GPgccEE303oTcC4whaxiMUPSaOBE4FvlDHwblTO/RMTj6bjfAzhNUvtvL6llcBbwnxHxaLkykFO12y4i7k4NC1OA8yUNbJ+Wen/+Evh4RJS9xTQiWlPjxxhgqqR9OszyHeCOiPhjF9IqcvuWS1n3+b5K0ueAjcCVadRUoJXstotxwKck7V5QeD1Szn2+N7mCX6wX03+RXRWalP4mRMTpJebPn+wNLDHdyu9UYCRwQDrAn+bVdf9ybr5WsoKeDtM6jrf6sYLs/rt2Y4AnO84kaRrZicCxuS6LtLdmpJObuWQtMrWm03UUEatz6+UHwAG9FFtv69L+kpxMh+75dbK/9DUrgBW5Fp/ZZJWB8WQnvYskPUa2re+T9JpO0ro/3cKxEfhVSms/sgrSIymtwZIeqURmuqCc+d0k7dtLySrJ7S4H/hIR3yxT7B1V/baLiIfIziP3gU0tib8EroyIOVtbtqdSN/S5wFHt4yRdSHa+1K0eqAVt33KpyD7fl0g6DXgHcGrqrg7ZheabIuKViFgJ3EnWiNFnlXOf7w2u4FeHecDBkvaATU8RfW2J+Z6W9DpJDcDxvRphfXmB7CE1AMOAlRHxiqTDcM8K67r5wJ6SxknqT1Yp2+zp5pL2A75PVrlfmRu/vaQB6fMI4GDgwV6LvPd0ZR2Nyg0eS3ZvaS3qdF0ASNoL2B64KzeuXvaXPiUi/gE8nrYZwFuBByNiSUTsGBFjI2IsWSVh/4j4h7KnpP+sRHLzge0ljUzDh6e0/jsiXpNL66WI2KOyOSutnPmVNEbpuTaStifbp5el4YvJfps/3hfywjZsO0nHS/pqx4RS+dCUPu9GdqvnY5JEdh/4QxHx/8q1Hjp890i9+pT0QcA04OE0fAZwJHBKRLTllqnK7Vsu27KfFBVrJUg6Cvg/ZOcwL+Um/R04XJkhZD2VHy4ixp7Yln2+WrhlsQpExCpJM4BZ7SdpwOfJnoKadx5wA9n9Sg+QPXnUyiwiVku6U9nrQOYDe0taACykDxZQVoyI2CjpbLIn6DaSPfF8qaQvAQsi4nqyLvnNwLXZ+Rl/j4hjgdcB35fURnYh9msRUXMVti6uo4+lexk3kj3kZ0ZhAVdQF9cFZA/XuyrXUgJ1sr/0UecAV6aLNo8C7+9k/l2BdR1HRkSrstec3ZYqc/eS9WipNmXJL9k+famkIOvl+PWIWCJpDFmPp4fJWkQhe6r6FeXKQE6R2248pR9KdghZF/BXgDbgIxHxjKRDyO4DXqJXX/X32Yi4sZPv6Y5RwE+VPT+gAbgmItpfBfY9sifF35W2yZyI+BLVvX3Lpbv7SZ8jaRbZk+FHSFoBXBgRPwQuI3tI4q1p
W82LiLOAbwM/JqurCPhxRCwuIvYe2pZ9vipo83MEMzMzM+suZQ83uyEiOt6j2Z00ZgI/L9fJcGo8mBwRZ5cjvQ5pj6VG8ltteZH0X8AnImJVT9NK6V1E9hq+r3djmbFU0TpJ6c2gQvtzN2IYS8/Xy6Fkry18R5nCKqty5LGT9B8j247PVCL9nqiV7esu+mZmZmY91woMy7WgdltEfLqMlaFPAOdTuddT1VJ+qyovEfHeMlbuZwLv5dXnPnVVVa2TXtifu6pH60XSSWQPZnuurFGVV4+3fSmSBqU0+5H1QKlGNbF93YJvZmZmZmZmVgPcgm9mZmZmZmZWA1zBNzMzMzMzM6sBruCbmZmZmZmZ1QBX8G2bSDpU0htzw2dJet82pjVD0ujc8BWSJpQjTjPbNpJaJS2UtFTSIkmflFTYb4akd26pXJB0kaQnUrwPSjqlJ+mZmdUCSWs7DM+QdFn6XPK8TdLY9JrgUunNlTS5DHEdKumGzuc0s23RVHQA1mcdCqwF/gcgIr7Xg7RmkL0r88mU1hk9jM3Mem5dREwCkLQj8AtgGHBhfiZJTRGxsRfieSdwA7Cl97t/IyK+LmlP4F5JsyPilR6kZ2ZWs3p43tZn9OJvlFnVcAu+bUbSryTdm1rtzkzjjpJ0X2rFuy29I/Is4BOpxexNqQXtXEmvk3RPLr2xkhanzxdImi/pAUmXKzMdmAxcmdIalL9CLOkUSUvSMpfk0l0r6csppnmSduq9tWRWXyJiJXAmcHY6bmdIulbSb4Bb0riZ6Thdkl4T095Kc4ek61LL+vfaewFs7djOfZ4u6Sept9CxwMxUTozfSqx/AV4Ctk9pfDCVO4sk/VLS4FLppb+bUvn3R0l7l31FmplVifbztvT5gFRG3gV8NDfPIElXSVos6WpgUG7aEZLuSueH10pqTuMfk/TFNH5Jd8rSLZwnjpd0X26ePSXdm4v79lRu3yxpVBo/V9JXJN0O/LukE1OaiyTd0cNVZ1b1XMG3jj4QEQeQVbo/lirOPwDeFRETgRMj4jHge2QtZpMi4o/tC0fEQ0B/SbunUScB16TPl0XElIjYh+xH4h0RMRtYAJya0lrXnpaybvuXAIcDk4Apkt6ZJg8B5qWY7gA+WP5VYWbtIuJRst+MHdOog4DTIuJw4ASyY3QiMI2s4jwqzTcV+BTwv4DxwAmdHNulvvt/gOuBT6dy4q9bmlfS/sBf0kUJgDmp3JkIPAScvoX0LgfOSeXfuWTvsTUz68sGpYuYC5W91/tLW5jvx8DHIuKgDuM/DLwUEfsCXwYOAJA0Avg8MC0i9ic7j/tkbrln0vjvkpWnXVXqPPGvwPOSJqV53g/8RFI/4FvA9FRu/yjF2G54RLwlIi4FLgCOTL8Dx3YjHrM+yRV86+hjkhYB84BdyFrt7oiI5QAR8WwX0rgGeHf6fBJwdfp8mKS7JS0hO7F/fSfpTAHmRsSq1L3qSuDNadoGsu61APcCY7sQl5n1jHKfb82VB4cAsyKiNSKeBm4nO34B7omIRyOiFZiV5t3asb2tPiFpGXA3cFFu/D6pRX4JcColyp3U8vRG4Np0Evx9YFTH+czM+ph16SLmpHTL1QUdZ5A0jKwyfHsa9fPc5DcD/wUQEYuBxWn8gcAE4M5UZp4G7JZbbk76393zsy2dJ14BvF9SI9l55S+AvYB9gFtTDJ8HxuTSujr3+U6yiwIfBBq7EY9Zn+R78G0TSYeStb4dFBEvSZoLLCIrRLvjarIT5TlARMRfJA0kaxGbHBGPS7oIGNhZSFuZ9kpERPrcivdls4pKvXJagfaW8Rfzk7eyaJQY7ur8nZURee334J8A/EzS+IhYD/wEeGdELJI0g+z5IR01AGvanzlgZlZHxL+W03mlponsIu+WHmj6cvrf5fOzTs4Tf0n2/JffA/dGxOrUE2xpiV4H7Tb9RkXEWZLeAPxvYKGkSRGxuitxmfVFbsG3vGHAc6lyvzfZFdoBwFskjQOQ1JLm
fQEYWiqR1J2qFfgCr15BbS+kn0mtZdNzi2wprbvTd49IV21PIWsZNLNeJGkk2W05l+UurOXdAZwkqTHN+2ag/VkcUyWNS/fenwT8ia0f208re5ZHA3B87ju2WObkRcQcsu6ip6VRQ4GnUnfOU0ulFxH/BJZLOjHlV5ImdvZdZmZ9XUSsIesCf0galS8n72gflrQPsG8aPw84WNIeadpgSa/tYShbPE9MF2tvJuvy/+M0ehkwUtJBKYZ+kkr2DE0XfO+OiAuAZ8h6qJrVLFfwLe8moEnZQ/H+g6wAX0XWTX9O6rrfXmH/DXB8uq/rTSXSuhp4L+n++/QD8gNgCfArYH5u3p8A30tpbXqAS0Q8BZwP/IGsJ8F9EfHrMuXVzLau/d7NpcDvgFuAL25h3uvIum4uImth+UxE/CNNuwv4GtmbMpYD13VybJ9HdvvN74Gnct9xFfBpSfdrKw/ZS74EtL/W7wtkFxRuBR7eSnqnAqencm4pcFwn32FmViveD3xb2UP21uXGfxdoTueFnyFduI2IVWRvQJqVps0Duvtg0rdKWtH+B7yOLZ8nQnYrV5D9FhERG8guAlySyu2FZLdalTIzPfDvAbKLFou6GatZn6LSjTFmZmY9k277OTci3lF0LGZm1ncpe+L/sIj4QtGxmFU737dsZmZmZmZVSdJ1ZG9hObzoWMz6Arfgm5mZmZmZmdUA34NvZmZmZmZmVgNcwTczMzMzMzOrAa7gm5mZmZmZmdUAV/DNzMzMzMzMaoAr+GZmZmZmZmY14P8DI///1DKK8dEAAAAASUVORK5CYII=\n",
      "text/plain": [
       "<Figure size 1008x432 with 3 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "fig = plt.figure(constrained_layout=True, figsize=(14, 6))\n",
    "gs = GridSpec(nrows=1, ncols=4, figure=fig)\n",
    "ax1 = fig.add_subplot(gs[0, 0])\n",
    "sns.boxenplot(x='activation', y='score', data=cv_results, ax=ax1)\n",
    "ax1.set_xlabel('Activation Function')\n",
    "ax2 = fig.add_subplot(gs[0, 1])\n",
    "sns.boxenplot(x='dropout', y='score', data=cv_results, ax=ax2);\n",
    "ax2.set_xlabel('Dropout Rate')\n",
    "ax3 = fig.add_subplot(gs[0, 2:])\n",
    "sns.boxenplot(x='dense_layers', y='score', data=cv_results, ax=ax3)\n",
    "ax3.set_xlabel('Hidden Layers')\n",
    "fig.suptitle('Performance Impact of Architecture Elements', fontsize=16)\n",
    "fig.savefig('parameter_impact', dpi=300);"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Load best model"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [],
   "source": [
    "model = load_model('gridsearch/best_model.h5', custom_objects={'auc_roc': auc_roc})"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "_________________________________________________________________\n",
      "Layer (type)                 Output Shape              Param #   \n",
      "=================================================================\n",
      "dense_1 (Dense)              (None, 64)                2880      \n",
      "_________________________________________________________________\n",
      "activation_1 (Activation)    (None, 64)                0         \n",
      "_________________________________________________________________\n",
      "dense_2 (Dense)              (None, 64)                4160      \n",
      "_________________________________________________________________\n",
      "activation_2 (Activation)    (None, 64)                0         \n",
      "_________________________________________________________________\n",
      "dropout_1 (Dropout)          (None, 64)                0         \n",
      "_________________________________________________________________\n",
      "dense_3 (Dense)              (None, 1)                 65        \n",
      "_________________________________________________________________\n",
      "activation_3 (Activation)    (None, 1)                 0         \n",
      "=================================================================\n",
      "Total params: 7,105\n",
      "Trainable params: 7,105\n",
      "Non-trainable params: 0\n",
      "_________________________________________________________________\n"
     ]
    }
   ],
   "source": [
    "model.summary()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Predict 1 year of price moves"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "y_pred = model.predict(test_data.drop('label', axis=1))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "0.5106585850411519"
      ]
     },
     "execution_count": 11,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "roc_auc_score(y_score=y_pred, y_true=test_data.label)"
   ]
  },
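  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For context, this score is barely above chance: `roc_auc_score` returns 0.5 for a random ranking and 1.0 for a perfect one. A minimal sketch with hypothetical toy labels illustrates the scale:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Hypothetical toy labels to illustrate the ROC AUC scale\n",
    "toy_true = [0, 0, 1, 1]\n",
    "print(roc_auc_score(y_true=toy_true, y_score=[.1, .2, .8, .9]))  # perfect ranking -> 1.0\n",
    "print(roc_auc_score(y_true=toy_true, y_score=[.9, .8, .2, .1]))  # inverted ranking -> 0.0"
   ]
  },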
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Retrain with all data"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Custom ROC AUC Callback"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [],
   "source": [
    "class auc_callback(Callback):\n",
    "    # Compute ROC AUC on the full train and validation sets after each epoch\n",
    "    def __init__(self, training_data, validation_data):\n",
    "        super().__init__()\n",
    "        self.x, self.y = training_data\n",
    "        self.x_val, self.y_val = validation_data\n",
    "\n",
    "    def on_epoch_end(self, epoch, logs=None):\n",
    "        # predict() returns sigmoid probabilities, which roc_auc_score expects\n",
    "        roc = roc_auc_score(y_true=self.y, y_score=self.model.predict(self.x))\n",
    "        roc_val = roc_auc_score(y_true=self.y_val, y_score=self.model.predict(self.x_val))\n",
    "        print('\\rroc-auc: {:.2%} - roc-auc_val: {:.2%}'.format(roc, roc_val), end=100*' '+'\\n')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Early Stopping"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [],
   "source": [
    "early_stopping = EarlyStopping(monitor='val_loss',\n",
    "                               min_delta=0,\n",
    "                               patience=5,\n",
    "                               verbose=0,\n",
    "                               mode='auto',\n",
    "                               baseline=None,\n",
    "                               restore_best_weights=False)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Model Checkpoints"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [],
   "source": [
    "checkpointer = ModelCheckpoint('models/weights.{epoch:02d}-{val_loss:.2f}.hdf5',\n",
    "                               monitor='val_loss',\n",
    "                               verbose=0,\n",
    "                               save_best_only=True,\n",
    "                               save_weights_only=False,\n",
    "                               mode='auto',\n",
    "                               period=1)"
   ]
  },
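  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The checkpoint path is an ordinary Python format string that Keras fills in with the epoch number and the monitored metric at the end of each epoch; the values below are illustrative:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# How the checkpoint filename template expands (illustrative values)\n",
    "template = 'models/weights.{epoch:02d}-{val_loss:.2f}.hdf5'\n",
    "print(template.format(epoch=3, val_loss=0.5753))  # models/weights.03-0.58.hdf5"
   ]
  },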
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### TensorBoard"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [],
   "source": [
    "tensorboard = TensorBoard(log_dir='./logs',\n",
    "                          histogram_freq=1,\n",
    "                          batch_size=32,\n",
    "                          write_graph=True,\n",
    "                          write_grads=True,\n",
    "                          update_freq='epoch')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [],
   "source": [
    "data = pd.read_hdf('data.h5', 'returns')\n",
    "features = data.drop('label', axis=1)\n",
    "label = data.label"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Run cross-validation"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Train on 233966 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5537 - binary_accuracy: 0.7110 - auc_roc: 0.7790 - val_loss: 0.6052 - val_binary_accuracy: 0.6063 - val_auc_roc: 0.7791\n",
      "roc-auc: 79.44% - roc-auc_val: 66.48%                                                                                                    \n",
      "Epoch 2/50\n",
      "233966/233966 [==============================] - 13s 56us/step - loss: 0.5536 - binary_accuracy: 0.7121 - auc_roc: 0.7792 - val_loss: 0.6026 - val_binary_accuracy: 0.6268 - val_auc_roc: 0.7793\n",
      "roc-auc: 79.50% - roc-auc_val: 67.07%                                                                                                    \n",
      "Epoch 3/50\n",
      "233966/233966 [==============================] - 13s 58us/step - loss: 0.5541 - binary_accuracy: 0.7119 - auc_roc: 0.7794 - val_loss: 0.6074 - val_binary_accuracy: 0.6239 - val_auc_roc: 0.7795\n",
      "roc-auc: 79.51% - roc-auc_val: 65.37%                                                                                                    \n",
      "Epoch 4/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5537 - binary_accuracy: 0.7111 - auc_roc: 0.7796 - val_loss: 0.5859 - val_binary_accuracy: 0.6235 - val_auc_roc: 0.7797\n",
      "roc-auc: 79.52% - roc-auc_val: 65.90%                                                                                                    \n",
      "Epoch 5/50\n",
      "233966/233966 [==============================] - 14s 58us/step - loss: 0.5531 - binary_accuracy: 0.7119 - auc_roc: 0.7798 - val_loss: 0.5977 - val_binary_accuracy: 0.6312 - val_auc_roc: 0.7799\n",
      "roc-auc: 79.42% - roc-auc_val: 66.32%                                                                                                    \n",
      "Epoch 6/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5537 - binary_accuracy: 0.7130 - auc_roc: 0.7800 - val_loss: 0.6024 - val_binary_accuracy: 0.6223 - val_auc_roc: 0.7800\n",
      "roc-auc: 79.61% - roc-auc_val: 65.85%                                                                                                    \n",
      "Epoch 7/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5529 - binary_accuracy: 0.7124 - auc_roc: 0.7801 - val_loss: 0.6206 - val_binary_accuracy: 0.5954 - val_auc_roc: 0.7802\n",
      "roc-auc: 79.54% - roc-auc_val: 65.76%                                                                                                    \n",
      "Epoch 8/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5535 - binary_accuracy: 0.7126 - auc_roc: 0.7802 - val_loss: 0.6131 - val_binary_accuracy: 0.6095 - val_auc_roc: 0.7803\n",
      "roc-auc: 79.60% - roc-auc_val: 65.40%                                                                                                    \n",
      "Epoch 9/50\n",
      "233966/233966 [==============================] - 13s 58us/step - loss: 0.5525 - binary_accuracy: 0.7124 - auc_roc: 0.7804 - val_loss: 0.6038 - val_binary_accuracy: 0.6159 - val_auc_roc: 0.7805\n",
      "roc-auc: 79.57% - roc-auc_val: 65.76%                                                                                                    \n",
      "Epoch 10/50\n",
      "233966/233966 [==============================] - 14s 59us/step - loss: 0.5530 - binary_accuracy: 0.7122 - auc_roc: 0.7805 - val_loss: 0.6535 - val_binary_accuracy: 0.5842 - val_auc_roc: 0.7806\n",
      "roc-auc: 79.52% - roc-auc_val: 64.36%                                                                                                    \n",
      "Epoch 11/50\n",
      "233966/233966 [==============================] - 13s 56us/step - loss: 0.5524 - binary_accuracy: 0.7134 - auc_roc: 0.7807 - val_loss: 0.6469 - val_binary_accuracy: 0.5613 - val_auc_roc: 0.7807\n",
      "roc-auc: 79.59% - roc-auc_val: 63.95%                                                                                                    \n",
      "Epoch 12/50\n",
      "233966/233966 [==============================] - 13s 56us/step - loss: 0.5526 - binary_accuracy: 0.7125 - auc_roc: 0.7808 - val_loss: 0.5977 - val_binary_accuracy: 0.6179 - val_auc_roc: 0.7809\n",
      "roc-auc: 79.61% - roc-auc_val: 64.73%                                                                                                    \n",
      "Epoch 13/50\n",
      "233966/233966 [==============================] - 13s 54us/step - loss: 0.5519 - binary_accuracy: 0.7128 - auc_roc: 0.7809 - val_loss: 0.5924 - val_binary_accuracy: 0.6103 - val_auc_roc: 0.7810\n",
      "roc-auc: 79.68% - roc-auc_val: 65.55%                                                                                                    \n",
      "Epoch 14/50\n",
      "233966/233966 [==============================] - 13s 54us/step - loss: 0.5516 - binary_accuracy: 0.7132 - auc_roc: 0.7811 - val_loss: 0.5868 - val_binary_accuracy: 0.6195 - val_auc_roc: 0.7811\n",
      "roc-auc: 79.67% - roc-auc_val: 65.80%                                                                                                    \n",
      "Epoch 15/50\n",
      "233966/233966 [==============================] - 13s 56us/step - loss: 0.5515 - binary_accuracy: 0.7135 - auc_roc: 0.7812 - val_loss: 0.5954 - val_binary_accuracy: 0.5918 - val_auc_roc: 0.7813\n",
      "roc-auc: 79.66% - roc-auc_val: 65.20%                                                                                                    \n",
      "Epoch 16/50\n",
      "233966/233966 [==============================] - 13s 56us/step - loss: 0.5516 - binary_accuracy: 0.7130 - auc_roc: 0.7813 - val_loss: 0.5753 - val_binary_accuracy: 0.6364 - val_auc_roc: 0.7814\n",
      "roc-auc: 79.58% - roc-auc_val: 65.97%                                                                                                    \n",
      "Epoch 17/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5521 - binary_accuracy: 0.7125 - auc_roc: 0.7815 - val_loss: 0.5823 - val_binary_accuracy: 0.6175 - val_auc_roc: 0.7815\n",
      "roc-auc: 79.68% - roc-auc_val: 65.53%                                                                                                    \n",
      "Epoch 18/50\n",
      "233966/233966 [==============================] - 14s 59us/step - loss: 0.5508 - binary_accuracy: 0.7139 - auc_roc: 0.7816 - val_loss: 0.6029 - val_binary_accuracy: 0.5842 - val_auc_roc: 0.7816\n",
      "roc-auc: 79.72% - roc-auc_val: 65.05%                                                                                                    \n",
      "Epoch 19/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5518 - binary_accuracy: 0.7132 - auc_roc: 0.7817 - val_loss: 0.5917 - val_binary_accuracy: 0.6059 - val_auc_roc: 0.7817\n",
      "roc-auc: 79.67% - roc-auc_val: 65.45%                                                                                                    \n",
      "Epoch 20/50\n",
      "233966/233966 [==============================] - 13s 55us/step - loss: 0.5516 - binary_accuracy: 0.7131 - auc_roc: 0.7818 - val_loss: 0.6021 - val_binary_accuracy: 0.5934 - val_auc_roc: 0.7818\n",
      "roc-auc: 79.72% - roc-auc_val: 65.22%                                                                                                    \n",
      "Epoch 21/50\n",
      "233966/233966 [==============================] - 13s 55us/step - loss: 0.5513 - binary_accuracy: 0.7130 - auc_roc: 0.7819 - val_loss: 0.5871 - val_binary_accuracy: 0.6215 - val_auc_roc: 0.7819\n",
      "roc-auc: 79.70% - roc-auc_val: 66.23%                                                                                                    \n",
      "Epoch 22/50\n",
      "233966/233966 [==============================] - 15s 63us/step - loss: 0.5511 - binary_accuracy: 0.7142 - auc_roc: 0.7820 - val_loss: 0.6219 - val_binary_accuracy: 0.6071 - val_auc_roc: 0.7820\n",
      "roc-auc: 79.75% - roc-auc_val: 64.54%                                                                                                    \n",
      "Epoch 23/50\n",
      "233966/233966 [==============================] - 13s 55us/step - loss: 0.5507 - binary_accuracy: 0.7134 - auc_roc: 0.7821 - val_loss: 0.5907 - val_binary_accuracy: 0.6099 - val_auc_roc: 0.7821\n",
      "roc-auc: 79.75% - roc-auc_val: 64.33%                                                                                                    \n",
      "Epoch 24/50\n",
      "233966/233966 [==============================] - 14s 61us/step - loss: 0.5508 - binary_accuracy: 0.7134 - auc_roc: 0.7822 - val_loss: 0.5853 - val_binary_accuracy: 0.6404 - val_auc_roc: 0.7822\n",
      "roc-auc: 79.77% - roc-auc_val: 65.56%                                                                                                    \n",
      "Epoch 25/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "233966/233966 [==============================] - 14s 60us/step - loss: 0.5511 - binary_accuracy: 0.7135 - auc_roc: 0.7823 - val_loss: 0.5729 - val_binary_accuracy: 0.6364 - val_auc_roc: 0.7823\n",
      "roc-auc: 79.75% - roc-auc_val: 66.25%                                                                                                    \n",
      "Epoch 26/50\n",
      "233966/233966 [==============================] - 13s 58us/step - loss: 0.5501 - binary_accuracy: 0.7141 - auc_roc: 0.7824 - val_loss: 0.6024 - val_binary_accuracy: 0.6047 - val_auc_roc: 0.7824\n",
      "roc-auc: 79.73% - roc-auc_val: 64.77%                                                                                                    \n",
      "Epoch 27/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5502 - binary_accuracy: 0.7137 - auc_roc: 0.7825 - val_loss: 0.5909 - val_binary_accuracy: 0.6272 - val_auc_roc: 0.7825\n",
      "roc-auc: 79.72% - roc-auc_val: 65.44%                                                                                                    \n",
      "Epoch 28/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5502 - binary_accuracy: 0.7144 - auc_roc: 0.7826 - val_loss: 0.5718 - val_binary_accuracy: 0.6167 - val_auc_roc: 0.7826\n",
      "roc-auc: 79.76% - roc-auc_val: 65.95%                                                                                                    \n",
      "Epoch 29/50\n",
      "233966/233966 [==============================] - 14s 58us/step - loss: 0.5502 - binary_accuracy: 0.7130 - auc_roc: 0.7827 - val_loss: 0.6173 - val_binary_accuracy: 0.6159 - val_auc_roc: 0.7827\n",
      "roc-auc: 79.79% - roc-auc_val: 64.45%                                                                                                    \n",
      "Epoch 30/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5508 - binary_accuracy: 0.7143 - auc_roc: 0.7828 - val_loss: 0.5921 - val_binary_accuracy: 0.6010 - val_auc_roc: 0.7828\n",
      "roc-auc: 79.87% - roc-auc_val: 65.77%                                                                                                    \n",
      "Epoch 31/50\n",
      "233966/233966 [==============================] - 14s 58us/step - loss: 0.5503 - binary_accuracy: 0.7141 - auc_roc: 0.7828 - val_loss: 0.6025 - val_binary_accuracy: 0.5862 - val_auc_roc: 0.7829\n",
      "roc-auc: 79.84% - roc-auc_val: 65.13%                                                                                                    \n",
      "Epoch 32/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5502 - binary_accuracy: 0.7136 - auc_roc: 0.7829 - val_loss: 0.6052 - val_binary_accuracy: 0.5705 - val_auc_roc: 0.7830\n",
      "roc-auc: 79.83% - roc-auc_val: 64.10%                                                                                                    \n",
      "Epoch 33/50\n",
      "233966/233966 [==============================] - 13s 58us/step - loss: 0.5503 - binary_accuracy: 0.7138 - auc_roc: 0.7830 - val_loss: 0.5782 - val_binary_accuracy: 0.6163 - val_auc_roc: 0.7830\n",
      "roc-auc: 79.81% - roc-auc_val: 65.42%                                                                                                    \n",
      "Epoch 34/50\n",
      "233966/233966 [==============================] - 14s 58us/step - loss: 0.5499 - binary_accuracy: 0.7141 - auc_roc: 0.7831 - val_loss: 0.6154 - val_binary_accuracy: 0.5894 - val_auc_roc: 0.7831\n",
      "roc-auc: 79.85% - roc-auc_val: 64.34%                                                                                                    \n",
      "Epoch 35/50\n",
      "233966/233966 [==============================] - 14s 58us/step - loss: 0.5500 - binary_accuracy: 0.7148 - auc_roc: 0.7831 - val_loss: 0.5801 - val_binary_accuracy: 0.6308 - val_auc_roc: 0.7832\n",
      "roc-auc: 79.85% - roc-auc_val: 66.07%                                                                                                    \n",
      "Epoch 36/50\n",
      "233966/233966 [==============================] - 14s 58us/step - loss: 0.5492 - binary_accuracy: 0.7148 - auc_roc: 0.7832 - val_loss: 0.5955 - val_binary_accuracy: 0.6087 - val_auc_roc: 0.7833\n",
      "roc-auc: 79.79% - roc-auc_val: 64.99%                                                                                                    \n",
      "Epoch 37/50\n",
      "233966/233966 [==============================] - 14s 58us/step - loss: 0.5488 - binary_accuracy: 0.7142 - auc_roc: 0.7833 - val_loss: 0.6324 - val_binary_accuracy: 0.5962 - val_auc_roc: 0.7834\n",
      "roc-auc: 79.88% - roc-auc_val: 64.45%                                                                                                    \n",
      "Epoch 38/50\n",
      "233966/233966 [==============================] - 14s 58us/step - loss: 0.5500 - binary_accuracy: 0.7143 - auc_roc: 0.7834 - val_loss: 0.6193 - val_binary_accuracy: 0.6103 - val_auc_roc: 0.7834\n",
      "roc-auc: 79.86% - roc-auc_val: 63.82%                                                                                                    \n",
      "Epoch 39/50\n",
      "233966/233966 [==============================] - 14s 60us/step - loss: 0.5497 - binary_accuracy: 0.7151 - auc_roc: 0.7835 - val_loss: 0.6100 - val_binary_accuracy: 0.6179 - val_auc_roc: 0.7835\n",
      "roc-auc: 79.86% - roc-auc_val: 65.18%                                                                                                    \n",
      "Epoch 40/50\n",
      "233966/233966 [==============================] - 14s 58us/step - loss: 0.5500 - binary_accuracy: 0.7146 - auc_roc: 0.7835 - val_loss: 0.6218 - val_binary_accuracy: 0.5970 - val_auc_roc: 0.7836\n",
      "roc-auc: 79.91% - roc-auc_val: 65.95%                                                                                                    \n",
      "Epoch 41/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5497 - binary_accuracy: 0.7142 - auc_roc: 0.7836 - val_loss: 0.6496 - val_binary_accuracy: 0.6135 - val_auc_roc: 0.7836\n",
      "roc-auc: 79.88% - roc-auc_val: 62.27%                                                                                                    \n",
      "Epoch 42/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5491 - binary_accuracy: 0.7147 - auc_roc: 0.7837 - val_loss: 0.6386 - val_binary_accuracy: 0.5950 - val_auc_roc: 0.7837\n",
      "roc-auc: 79.79% - roc-auc_val: 63.07%                                                                                                    \n",
      "Epoch 43/50\n",
      "233966/233966 [==============================] - 13s 56us/step - loss: 0.5491 - binary_accuracy: 0.7140 - auc_roc: 0.7837 - val_loss: 0.6006 - val_binary_accuracy: 0.6139 - val_auc_roc: 0.7838\n",
      "roc-auc: 79.93% - roc-auc_val: 64.58%                                                                                                    \n",
      "Epoch 44/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5487 - binary_accuracy: 0.7155 - auc_roc: 0.7838 - val_loss: 0.6363 - val_binary_accuracy: 0.5842 - val_auc_roc: 0.7838\n",
      "roc-auc: 79.96% - roc-auc_val: 63.40%                                                                                                    \n",
      "Epoch 45/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5489 - binary_accuracy: 0.7143 - auc_roc: 0.7839 - val_loss: 0.6019 - val_binary_accuracy: 0.6328 - val_auc_roc: 0.7839\n",
      "roc-auc: 79.94% - roc-auc_val: 64.84%                                                                                                    \n",
      "Epoch 46/50\n",
      "233966/233966 [==============================] - 13s 56us/step - loss: 0.5492 - binary_accuracy: 0.7143 - auc_roc: 0.7839 - val_loss: 0.6273 - val_binary_accuracy: 0.5982 - val_auc_roc: 0.7840\n",
      "roc-auc: 79.94% - roc-auc_val: 63.96%                                                                                                    \n",
      "Epoch 47/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5487 - binary_accuracy: 0.7147 - auc_roc: 0.7840 - val_loss: 0.6383 - val_binary_accuracy: 0.6227 - val_auc_roc: 0.7840\n",
      "roc-auc: 79.91% - roc-auc_val: 62.70%                                                                                                    \n",
      "Epoch 48/50\n",
      "233966/233966 [==============================] - 13s 57us/step - loss: 0.5486 - binary_accuracy: 0.7147 - auc_roc: 0.7841 - val_loss: 0.6810 - val_binary_accuracy: 0.5685 - val_auc_roc: 0.7841\n",
      "roc-auc: 79.93% - roc-auc_val: 56.54%                                                                                                    \n",
      "Epoch 49/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "233966/233966 [==============================] - 13s 56us/step - loss: 0.5482 - binary_accuracy: 0.7147 - auc_roc: 0.7841 - val_loss: 0.6690 - val_binary_accuracy: 0.5914 - val_auc_roc: 0.7841\n",
      "roc-auc: 79.93% - roc-auc_val: 58.56%                                                                                                    \n",
      "Epoch 50/50\n",
      "233966/233966 [==============================] - 13s 56us/step - loss: 0.5483 - binary_accuracy: 0.7149 - auc_roc: 0.7842 - val_loss: 0.6080 - val_binary_accuracy: 0.6123 - val_auc_roc: 0.7842\n",
      "roc-auc: 79.96% - roc-auc_val: 63.88%                                                                                                    \n",
      "Train on 231477 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5499 - binary_accuracy: 0.7142 - auc_roc: 0.7842 - val_loss: 0.4909 - val_binary_accuracy: 0.7513 - val_auc_roc: 0.7843\n",
      "roc-auc: 79.94% - roc-auc_val: 80.20%                                                                                                    \n",
      "Epoch 2/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5487 - binary_accuracy: 0.7151 - auc_roc: 0.7843 - val_loss: 0.4940 - val_binary_accuracy: 0.7489 - val_auc_roc: 0.7843\n",
      "roc-auc: 79.99% - roc-auc_val: 79.57%                                                                                                    \n",
      "Epoch 3/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5489 - binary_accuracy: 0.7139 - auc_roc: 0.7844 - val_loss: 0.4928 - val_binary_accuracy: 0.7477 - val_auc_roc: 0.7844\n",
      "roc-auc: 79.92% - roc-auc_val: 79.59%                                                                                                    \n",
      "Epoch 4/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5485 - binary_accuracy: 0.7144 - auc_roc: 0.7844 - val_loss: 0.5015 - val_binary_accuracy: 0.7473 - val_auc_roc: 0.7845\n",
      "roc-auc: 79.95% - roc-auc_val: 79.05%                                                                                                    \n",
      "Epoch 5/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5486 - binary_accuracy: 0.7150 - auc_roc: 0.7845 - val_loss: 0.4990 - val_binary_accuracy: 0.7477 - val_auc_roc: 0.7845\n",
      "roc-auc: 79.96% - roc-auc_val: 78.99%                                                                                                    \n",
      "Epoch 6/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5489 - binary_accuracy: 0.7147 - auc_roc: 0.7845 - val_loss: 0.5020 - val_binary_accuracy: 0.7469 - val_auc_roc: 0.7846\n",
      "roc-auc: 79.96% - roc-auc_val: 79.07%                                                                                                    \n",
      "Epoch 7/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5484 - binary_accuracy: 0.7143 - auc_roc: 0.7846 - val_loss: 0.5076 - val_binary_accuracy: 0.7453 - val_auc_roc: 0.7846\n",
      "roc-auc: 79.98% - roc-auc_val: 78.32%                                                                                                    \n",
      "Epoch 8/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5493 - binary_accuracy: 0.7142 - auc_roc: 0.7847 - val_loss: 0.5062 - val_binary_accuracy: 0.7389 - val_auc_roc: 0.7847\n",
      "roc-auc: 80.01% - roc-auc_val: 78.50%                                                                                                    \n",
      "Epoch 9/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5489 - binary_accuracy: 0.7152 - auc_roc: 0.7847 - val_loss: 0.5052 - val_binary_accuracy: 0.7409 - val_auc_roc: 0.7847\n",
      "roc-auc: 79.96% - roc-auc_val: 77.79%                                                                                                    \n",
      "Epoch 10/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5481 - binary_accuracy: 0.7149 - auc_roc: 0.7848 - val_loss: 0.5057 - val_binary_accuracy: 0.7421 - val_auc_roc: 0.7848\n",
      "roc-auc: 79.94% - roc-auc_val: 77.99%                                                                                                    \n",
      "Epoch 11/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5486 - binary_accuracy: 0.7145 - auc_roc: 0.7848 - val_loss: 0.5025 - val_binary_accuracy: 0.7425 - val_auc_roc: 0.7848\n",
      "roc-auc: 79.97% - roc-auc_val: 78.02%                                                                                                    \n",
      "Epoch 12/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5482 - binary_accuracy: 0.7149 - auc_roc: 0.7849 - val_loss: 0.5104 - val_binary_accuracy: 0.7441 - val_auc_roc: 0.7849\n",
      "roc-auc: 79.94% - roc-auc_val: 77.48%                                                                                                    \n",
      "Epoch 13/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5486 - binary_accuracy: 0.7145 - auc_roc: 0.7849 - val_loss: 0.5060 - val_binary_accuracy: 0.7457 - val_auc_roc: 0.7850\n",
      "roc-auc: 79.96% - roc-auc_val: 77.77%                                                                                                    \n",
      "Epoch 14/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5479 - binary_accuracy: 0.7153 - auc_roc: 0.7850 - val_loss: 0.5108 - val_binary_accuracy: 0.7409 - val_auc_roc: 0.7850\n",
      "roc-auc: 79.96% - roc-auc_val: 77.73%                                                                                                    \n",
      "Epoch 15/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5486 - binary_accuracy: 0.7141 - auc_roc: 0.7850 - val_loss: 0.5103 - val_binary_accuracy: 0.7352 - val_auc_roc: 0.7851\n",
      "roc-auc: 79.98% - roc-auc_val: 77.14%                                                                                                    \n",
      "Epoch 16/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5486 - binary_accuracy: 0.7144 - auc_roc: 0.7851 - val_loss: 0.5091 - val_binary_accuracy: 0.7376 - val_auc_roc: 0.7851\n",
      "roc-auc: 79.99% - roc-auc_val: 77.43%                                                                                                    \n",
      "Epoch 17/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5482 - binary_accuracy: 0.7150 - auc_roc: 0.7851 - val_loss: 0.5070 - val_binary_accuracy: 0.7453 - val_auc_roc: 0.7852\n",
      "roc-auc: 79.95% - roc-auc_val: 77.64%                                                                                                    \n",
      "Epoch 18/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5477 - binary_accuracy: 0.7156 - auc_roc: 0.7852 - val_loss: 0.5134 - val_binary_accuracy: 0.7409 - val_auc_roc: 0.7852\n",
      "roc-auc: 80.01% - roc-auc_val: 77.25%                                                                                                    \n",
      "Epoch 19/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5481 - binary_accuracy: 0.7151 - auc_roc: 0.7852 - val_loss: 0.5093 - val_binary_accuracy: 0.7405 - val_auc_roc: 0.7853\n",
      "roc-auc: 79.98% - roc-auc_val: 77.55%                                                                                                    \n",
      "Epoch 20/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5473 - binary_accuracy: 0.7153 - auc_roc: 0.7853 - val_loss: 0.5144 - val_binary_accuracy: 0.7376 - val_auc_roc: 0.7853\n",
      "roc-auc: 80.00% - roc-auc_val: 76.70%                                                                                                    \n",
      "Epoch 21/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5479 - binary_accuracy: 0.7153 - auc_roc: 0.7853 - val_loss: 0.5122 - val_binary_accuracy: 0.7425 - val_auc_roc: 0.7854\n",
      "roc-auc: 80.05% - roc-auc_val: 76.72%                                                                                                    \n",
      "Epoch 22/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5477 - binary_accuracy: 0.7153 - auc_roc: 0.7854 - val_loss: 0.5127 - val_binary_accuracy: 0.7421 - val_auc_roc: 0.7854\n",
      "roc-auc: 80.07% - roc-auc_val: 76.88%                                                                                                    \n",
      "Epoch 23/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5481 - binary_accuracy: 0.7153 - auc_roc: 0.7854 - val_loss: 0.5123 - val_binary_accuracy: 0.7433 - val_auc_roc: 0.7855\n",
      "roc-auc: 80.06% - roc-auc_val: 76.86%                                                                                                    \n",
      "Epoch 24/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5483 - binary_accuracy: 0.7157 - auc_roc: 0.7855 - val_loss: 0.5124 - val_binary_accuracy: 0.7437 - val_auc_roc: 0.7855\n",
      "roc-auc: 80.04% - roc-auc_val: 76.63%                                                                                                    \n",
      "Epoch 25/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5478 - binary_accuracy: 0.7148 - auc_roc: 0.7855 - val_loss: 0.5121 - val_binary_accuracy: 0.7445 - val_auc_roc: 0.7855\n",
      "roc-auc: 80.06% - roc-auc_val: 77.23%                                                                                                    \n",
      "Epoch 26/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5481 - binary_accuracy: 0.7157 - auc_roc: 0.7856 - val_loss: 0.5159 - val_binary_accuracy: 0.7401 - val_auc_roc: 0.7856\n",
      "roc-auc: 80.03% - roc-auc_val: 76.75%                                                                                                    \n",
      "Epoch 27/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5482 - binary_accuracy: 0.7146 - auc_roc: 0.7856 - val_loss: 0.5163 - val_binary_accuracy: 0.7433 - val_auc_roc: 0.7856\n",
      "roc-auc: 80.08% - roc-auc_val: 76.83%                                                                                                    \n",
      "Epoch 28/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5476 - binary_accuracy: 0.7148 - auc_roc: 0.7856 - val_loss: 0.5146 - val_binary_accuracy: 0.7360 - val_auc_roc: 0.7857\n",
      "roc-auc: 80.08% - roc-auc_val: 76.59%                                                                                                    \n",
      "Epoch 29/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5479 - binary_accuracy: 0.7154 - auc_roc: 0.7857 - val_loss: 0.5120 - val_binary_accuracy: 0.7413 - val_auc_roc: 0.7857\n",
      "roc-auc: 80.06% - roc-auc_val: 76.80%                                                                                                    \n",
      "Epoch 30/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5479 - binary_accuracy: 0.7149 - auc_roc: 0.7857 - val_loss: 0.5124 - val_binary_accuracy: 0.7417 - val_auc_roc: 0.7857\n",
      "roc-auc: 80.09% - roc-auc_val: 76.80%                                                                                                    \n",
      "Epoch 31/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5473 - binary_accuracy: 0.7158 - auc_roc: 0.7858 - val_loss: 0.5154 - val_binary_accuracy: 0.7397 - val_auc_roc: 0.7858\n",
      "roc-auc: 80.08% - roc-auc_val: 76.59%                                                                                                    \n",
      "Epoch 32/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5478 - binary_accuracy: 0.7156 - auc_roc: 0.7858 - val_loss: 0.5181 - val_binary_accuracy: 0.7433 - val_auc_roc: 0.7858\n",
      "roc-auc: 80.10% - roc-auc_val: 76.57%                                                                                                    \n",
      "Epoch 33/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5471 - binary_accuracy: 0.7159 - auc_roc: 0.7858 - val_loss: 0.5141 - val_binary_accuracy: 0.7441 - val_auc_roc: 0.7859\n",
      "roc-auc: 80.05% - roc-auc_val: 76.66%                                                                                                    \n",
      "Epoch 34/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5476 - binary_accuracy: 0.7151 - auc_roc: 0.7859 - val_loss: 0.5160 - val_binary_accuracy: 0.7421 - val_auc_roc: 0.7859\n",
      "roc-auc: 80.12% - roc-auc_val: 76.58%                                                                                                    \n",
      "Epoch 35/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5478 - binary_accuracy: 0.7155 - auc_roc: 0.7859 - val_loss: 0.5130 - val_binary_accuracy: 0.7425 - val_auc_roc: 0.7859\n",
      "roc-auc: 79.99% - roc-auc_val: 76.72%                                                                                                    \n",
      "Epoch 36/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5474 - binary_accuracy: 0.7149 - auc_roc: 0.7860 - val_loss: 0.5173 - val_binary_accuracy: 0.7376 - val_auc_roc: 0.7860\n",
      "roc-auc: 80.14% - roc-auc_val: 76.47%                                                                                                    \n",
      "Epoch 37/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5475 - binary_accuracy: 0.7155 - auc_roc: 0.7860 - val_loss: 0.5135 - val_binary_accuracy: 0.7469 - val_auc_roc: 0.7860\n",
      "roc-auc: 80.13% - roc-auc_val: 76.92%                                                                                                    \n",
      "Epoch 38/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5472 - binary_accuracy: 0.7148 - auc_roc: 0.7860 - val_loss: 0.5120 - val_binary_accuracy: 0.7401 - val_auc_roc: 0.7861\n",
      "roc-auc: 80.10% - roc-auc_val: 76.86%                                                                                                    \n",
      "Epoch 39/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5473 - binary_accuracy: 0.7155 - auc_roc: 0.7861 - val_loss: 0.5146 - val_binary_accuracy: 0.7473 - val_auc_roc: 0.7861\n",
      "roc-auc: 80.11% - roc-auc_val: 76.46%                                                                                                    \n",
      "Epoch 40/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5467 - binary_accuracy: 0.7152 - auc_roc: 0.7861 - val_loss: 0.5188 - val_binary_accuracy: 0.7433 - val_auc_roc: 0.7861\n",
      "roc-auc: 80.17% - roc-auc_val: 76.03%                                                                                                    \n",
      "Epoch 41/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5470 - binary_accuracy: 0.7144 - auc_roc: 0.7862 - val_loss: 0.5176 - val_binary_accuracy: 0.7393 - val_auc_roc: 0.7862\n",
      "roc-auc: 80.11% - roc-auc_val: 76.28%                                                                                                    \n",
      "Epoch 42/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5466 - binary_accuracy: 0.7157 - auc_roc: 0.7862 - val_loss: 0.5155 - val_binary_accuracy: 0.7405 - val_auc_roc: 0.7862\n",
      "roc-auc: 80.09% - roc-auc_val: 76.55%                                                                                                    \n",
      "Epoch 43/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5468 - binary_accuracy: 0.7152 - auc_roc: 0.7862 - val_loss: 0.5253 - val_binary_accuracy: 0.7401 - val_auc_roc: 0.7863\n",
      "roc-auc: 80.07% - roc-auc_val: 75.74%                                                                                                    \n",
      "Epoch 44/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5470 - binary_accuracy: 0.7149 - auc_roc: 0.7863 - val_loss: 0.5161 - val_binary_accuracy: 0.7437 - val_auc_roc: 0.7863\n",
      "roc-auc: 80.15% - roc-auc_val: 76.21%                                                                                                    \n",
      "Epoch 45/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5465 - binary_accuracy: 0.7159 - auc_roc: 0.7863 - val_loss: 0.5193 - val_binary_accuracy: 0.7417 - val_auc_roc: 0.7863\n",
      "roc-auc: 80.16% - roc-auc_val: 75.84%                                                                                                    \n",
      "Epoch 46/50\n",
      "231477/231477 [==============================] - 13s 58us/step - loss: 0.5466 - binary_accuracy: 0.7152 - auc_roc: 0.7864 - val_loss: 0.5197 - val_binary_accuracy: 0.7421 - val_auc_roc: 0.7864\n",
      "roc-auc: 80.20% - roc-auc_val: 76.09%                                                                                                    \n",
      "Epoch 47/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5465 - binary_accuracy: 0.7153 - auc_roc: 0.7864 - val_loss: 0.5197 - val_binary_accuracy: 0.7413 - val_auc_roc: 0.7864\n",
      "roc-auc: 80.12% - roc-auc_val: 76.30%                                                                                                    \n",
      "Epoch 48/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5472 - binary_accuracy: 0.7149 - auc_roc: 0.7864 - val_loss: 0.5241 - val_binary_accuracy: 0.7304 - val_auc_roc: 0.7864\n",
      "roc-auc: 80.13% - roc-auc_val: 76.15%                                                                                                    \n",
      "Epoch 49/50\n",
      "231477/231477 [==============================] - 13s 56us/step - loss: 0.5465 - binary_accuracy: 0.7159 - auc_roc: 0.7865 - val_loss: 0.5142 - val_binary_accuracy: 0.7437 - val_auc_roc: 0.7865\n",
      "roc-auc: 80.11% - roc-auc_val: 76.45%                                                                                                    \n",
      "Epoch 50/50\n",
      "231477/231477 [==============================] - 13s 57us/step - loss: 0.5470 - binary_accuracy: 0.7156 - auc_roc: 0.7865 - val_loss: 0.5183 - val_binary_accuracy: 0.7453 - val_auc_roc: 0.7865\n",
      "roc-auc: 80.23% - roc-auc_val: 76.28%                                                                                                    \n",
      "Train on 228988 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7163 - auc_roc: 0.7865 - val_loss: 0.5229 - val_binary_accuracy: 0.6910 - val_auc_roc: 0.7865\n",
      "roc-auc: 80.17% - roc-auc_val: 77.98%                                                                                                    \n",
      "Epoch 2/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5465 - binary_accuracy: 0.7165 - auc_roc: 0.7866 - val_loss: 0.5304 - val_binary_accuracy: 0.6910 - val_auc_roc: 0.7866\n",
      "roc-auc: 80.16% - roc-auc_val: 77.60%                                                                                                    \n",
      "Epoch 3/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5467 - binary_accuracy: 0.7166 - auc_roc: 0.7866 - val_loss: 0.5303 - val_binary_accuracy: 0.6914 - val_auc_roc: 0.7866\n",
      "roc-auc: 80.17% - roc-auc_val: 77.56%                                                                                                    \n",
      "Epoch 4/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5466 - binary_accuracy: 0.7160 - auc_roc: 0.7866 - val_loss: 0.5262 - val_binary_accuracy: 0.6886 - val_auc_roc: 0.7866\n",
      "roc-auc: 80.24% - roc-auc_val: 77.81%                                                                                                    \n",
      "Epoch 5/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5467 - binary_accuracy: 0.7155 - auc_roc: 0.7867 - val_loss: 0.5294 - val_binary_accuracy: 0.6890 - val_auc_roc: 0.7867\n",
      "roc-auc: 80.21% - roc-auc_val: 77.50%                                                                                                    \n",
      "Epoch 6/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5466 - binary_accuracy: 0.7159 - auc_roc: 0.7867 - val_loss: 0.5479 - val_binary_accuracy: 0.6894 - val_auc_roc: 0.7867\n",
      "roc-auc: 80.16% - roc-auc_val: 77.45%                                                                                                    \n",
      "Epoch 7/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5460 - binary_accuracy: 0.7162 - auc_roc: 0.7867 - val_loss: 0.5337 - val_binary_accuracy: 0.6918 - val_auc_roc: 0.7867\n",
      "roc-auc: 80.18% - roc-auc_val: 77.25%                                                                                                    \n",
      "Epoch 8/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5462 - binary_accuracy: 0.7158 - auc_roc: 0.7868 - val_loss: 0.5333 - val_binary_accuracy: 0.6838 - val_auc_roc: 0.7868\n",
      "roc-auc: 80.25% - roc-auc_val: 77.23%                                                                                                    \n",
      "Epoch 9/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5467 - binary_accuracy: 0.7163 - auc_roc: 0.7868 - val_loss: 0.5378 - val_binary_accuracy: 0.6894 - val_auc_roc: 0.7868\n",
      "roc-auc: 80.22% - roc-auc_val: 77.26%                                                                                                    \n",
      "Epoch 10/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5463 - binary_accuracy: 0.7159 - auc_roc: 0.7868 - val_loss: 0.5363 - val_binary_accuracy: 0.6935 - val_auc_roc: 0.7868\n",
      "roc-auc: 80.23% - roc-auc_val: 77.23%                                                                                                    \n",
      "Epoch 11/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7159 - auc_roc: 0.7869 - val_loss: 0.5299 - val_binary_accuracy: 0.6878 - val_auc_roc: 0.7869\n",
      "roc-auc: 80.06% - roc-auc_val: 77.32%                                                                                                    \n",
      "Epoch 12/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5465 - binary_accuracy: 0.7159 - auc_roc: 0.7869 - val_loss: 0.5319 - val_binary_accuracy: 0.6798 - val_auc_roc: 0.7869\n",
      "roc-auc: 80.17% - roc-auc_val: 76.90%                                                                                                    \n",
      "Epoch 13/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5467 - binary_accuracy: 0.7155 - auc_roc: 0.7869 - val_loss: 0.5317 - val_binary_accuracy: 0.6814 - val_auc_roc: 0.7869\n",
      "roc-auc: 80.20% - roc-auc_val: 77.05%                                                                                                    \n",
      "Epoch 14/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5465 - binary_accuracy: 0.7170 - auc_roc: 0.7870 - val_loss: 0.5314 - val_binary_accuracy: 0.6822 - val_auc_roc: 0.7870\n",
      "roc-auc: 80.22% - roc-auc_val: 77.01%                                                                                                    \n",
      "Epoch 15/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5466 - binary_accuracy: 0.7162 - auc_roc: 0.7870 - val_loss: 0.5467 - val_binary_accuracy: 0.6886 - val_auc_roc: 0.7870\n",
      "roc-auc: 80.21% - roc-auc_val: 76.76%                                                                                                    \n",
      "Epoch 16/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5463 - binary_accuracy: 0.7164 - auc_roc: 0.7870 - val_loss: 0.5422 - val_binary_accuracy: 0.6830 - val_auc_roc: 0.7870\n",
      "roc-auc: 80.16% - roc-auc_val: 76.69%                                                                                                    \n",
      "Epoch 17/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7167 - auc_roc: 0.7870 - val_loss: 0.5349 - val_binary_accuracy: 0.6882 - val_auc_roc: 0.7871\n",
      "roc-auc: 80.22% - roc-auc_val: 77.24%                                                                                                    \n",
      "Epoch 18/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7162 - auc_roc: 0.7871 - val_loss: 0.5326 - val_binary_accuracy: 0.6910 - val_auc_roc: 0.7871\n",
      "roc-auc: 80.23% - roc-auc_val: 77.29%                                                                                                    \n",
      "Epoch 19/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7164 - auc_roc: 0.7871 - val_loss: 0.5427 - val_binary_accuracy: 0.6798 - val_auc_roc: 0.7871\n",
      "roc-auc: 80.20% - roc-auc_val: 76.56%                                                                                                    \n",
      "Epoch 20/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5463 - binary_accuracy: 0.7157 - auc_roc: 0.7871 - val_loss: 0.5351 - val_binary_accuracy: 0.6770 - val_auc_roc: 0.7871\n",
      "roc-auc: 80.24% - roc-auc_val: 76.76%                                                                                                    \n",
      "Epoch 21/50\n",
      "228988/228988 [==============================] - 13s 57us/step - loss: 0.5463 - binary_accuracy: 0.7165 - auc_roc: 0.7872 - val_loss: 0.5516 - val_binary_accuracy: 0.6770 - val_auc_roc: 0.7872\n",
      "roc-auc: 80.22% - roc-auc_val: 76.24%                                                                                                    \n",
      "Epoch 22/50\n",
      "228988/228988 [==============================] - 13s 58us/step - loss: 0.5462 - binary_accuracy: 0.7156 - auc_roc: 0.7872 - val_loss: 0.5336 - val_binary_accuracy: 0.6850 - val_auc_roc: 0.7872\n",
      "roc-auc: 80.21% - roc-auc_val: 76.85%                                                                                                    \n",
      "Epoch 23/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7166 - auc_roc: 0.7872 - val_loss: 0.5422 - val_binary_accuracy: 0.6874 - val_auc_roc: 0.7872\n",
      "roc-auc: 80.25% - roc-auc_val: 76.80%                                                                                                    \n",
      "Epoch 24/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5467 - binary_accuracy: 0.7165 - auc_roc: 0.7872 - val_loss: 0.5535 - val_binary_accuracy: 0.6810 - val_auc_roc: 0.7872\n",
      "roc-auc: 80.25% - roc-auc_val: 76.37%                                                                                                    \n",
      "Epoch 25/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7160 - auc_roc: 0.7873 - val_loss: 0.5440 - val_binary_accuracy: 0.6810 - val_auc_roc: 0.7873\n",
      "roc-auc: 80.25% - roc-auc_val: 76.41%                                                                                                    \n",
      "Epoch 26/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5467 - binary_accuracy: 0.7165 - auc_roc: 0.7873 - val_loss: 0.5381 - val_binary_accuracy: 0.6790 - val_auc_roc: 0.7873\n",
      "roc-auc: 80.23% - roc-auc_val: 76.77%                                                                                                    \n",
      "Epoch 27/50\n",
      "228988/228988 [==============================] - 13s 55us/step - loss: 0.5461 - binary_accuracy: 0.7163 - auc_roc: 0.7873 - val_loss: 0.5531 - val_binary_accuracy: 0.6794 - val_auc_roc: 0.7873\n",
      "roc-auc: 80.25% - roc-auc_val: 76.49%                                                                                                    \n",
      "Epoch 28/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7168 - auc_roc: 0.7873 - val_loss: 0.5354 - val_binary_accuracy: 0.6774 - val_auc_roc: 0.7874\n",
      "roc-auc: 80.28% - roc-auc_val: 76.86%                                                                                                    \n",
      "Epoch 29/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7171 - auc_roc: 0.7874 - val_loss: 0.5420 - val_binary_accuracy: 0.6854 - val_auc_roc: 0.7874\n",
      "roc-auc: 80.27% - roc-auc_val: 77.05%                                                                                                    \n",
      "Epoch 30/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7166 - auc_roc: 0.7874 - val_loss: 0.5332 - val_binary_accuracy: 0.6830 - val_auc_roc: 0.7874\n",
      "roc-auc: 80.30% - roc-auc_val: 76.98%                                                                                                    \n",
      "Epoch 31/50\n",
      "228988/228988 [==============================] - 13s 55us/step - loss: 0.5458 - binary_accuracy: 0.7160 - auc_roc: 0.7874 - val_loss: 0.5464 - val_binary_accuracy: 0.6874 - val_auc_roc: 0.7874\n",
      "roc-auc: 80.26% - roc-auc_val: 76.90%                                                                                                    \n",
      "Epoch 32/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7167 - auc_roc: 0.7875 - val_loss: 0.5721 - val_binary_accuracy: 0.6838 - val_auc_roc: 0.7875\n",
      "roc-auc: 80.32% - roc-auc_val: 76.35%                                                                                                    \n",
      "Epoch 33/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7170 - auc_roc: 0.7875 - val_loss: 0.5368 - val_binary_accuracy: 0.6842 - val_auc_roc: 0.7875\n",
      "roc-auc: 80.23% - roc-auc_val: 76.86%                                                                                                    \n",
      "Epoch 34/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5456 - binary_accuracy: 0.7168 - auc_roc: 0.7875 - val_loss: 0.5401 - val_binary_accuracy: 0.6842 - val_auc_roc: 0.7875\n",
      "roc-auc: 80.28% - roc-auc_val: 76.75%                                                                                                    \n",
      "Epoch 35/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7172 - auc_roc: 0.7875 - val_loss: 0.5382 - val_binary_accuracy: 0.6818 - val_auc_roc: 0.7875\n",
      "roc-auc: 80.27% - roc-auc_val: 76.77%                                                                                                    \n",
      "Epoch 36/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5460 - binary_accuracy: 0.7165 - auc_roc: 0.7876 - val_loss: 0.5602 - val_binary_accuracy: 0.6838 - val_auc_roc: 0.7876\n",
      "roc-auc: 80.32% - roc-auc_val: 76.57%                                                                                                    \n",
      "Epoch 37/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5456 - binary_accuracy: 0.7166 - auc_roc: 0.7876 - val_loss: 0.5593 - val_binary_accuracy: 0.6854 - val_auc_roc: 0.7876\n",
      "roc-auc: 80.29% - roc-auc_val: 76.61%                                                                                                    \n",
      "Epoch 38/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7165 - auc_roc: 0.7876 - val_loss: 0.5419 - val_binary_accuracy: 0.6834 - val_auc_roc: 0.7876\n",
      "roc-auc: 80.29% - roc-auc_val: 76.66%                                                                                                    \n",
      "Epoch 39/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7162 - auc_roc: 0.7876 - val_loss: 0.5450 - val_binary_accuracy: 0.6822 - val_auc_roc: 0.7876\n",
      "roc-auc: 80.28% - roc-auc_val: 76.54%                                                                                                    \n",
      "Epoch 40/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7170 - auc_roc: 0.7877 - val_loss: 0.5576 - val_binary_accuracy: 0.6854 - val_auc_roc: 0.7877\n",
      "roc-auc: 80.29% - roc-auc_val: 76.18%                                                                                                    \n",
      "Epoch 41/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5454 - binary_accuracy: 0.7169 - auc_roc: 0.7877 - val_loss: 0.5486 - val_binary_accuracy: 0.6830 - val_auc_roc: 0.7877\n",
      "roc-auc: 80.29% - roc-auc_val: 76.51%                                                                                                    \n",
      "Epoch 42/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5460 - binary_accuracy: 0.7174 - auc_roc: 0.7877 - val_loss: 0.5454 - val_binary_accuracy: 0.6842 - val_auc_roc: 0.7877\n",
      "roc-auc: 80.31% - roc-auc_val: 76.88%                                                                                                    \n",
      "Epoch 43/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5454 - binary_accuracy: 0.7168 - auc_roc: 0.7877 - val_loss: 0.5455 - val_binary_accuracy: 0.6806 - val_auc_roc: 0.7877\n",
      "roc-auc: 80.31% - roc-auc_val: 76.49%                                                                                                    \n",
      "Epoch 44/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5456 - binary_accuracy: 0.7168 - auc_roc: 0.7878 - val_loss: 0.5409 - val_binary_accuracy: 0.6810 - val_auc_roc: 0.7878\n",
      "roc-auc: 80.31% - roc-auc_val: 76.54%                                                                                                    \n",
      "Epoch 45/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "228988/228988 [==============================] - 13s 55us/step - loss: 0.5456 - binary_accuracy: 0.7169 - auc_roc: 0.7878 - val_loss: 0.5380 - val_binary_accuracy: 0.6834 - val_auc_roc: 0.7878\n",
      "roc-auc: 80.34% - roc-auc_val: 76.80%                                                                                                    \n",
      "Epoch 46/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5455 - binary_accuracy: 0.7173 - auc_roc: 0.7878 - val_loss: 0.5386 - val_binary_accuracy: 0.6802 - val_auc_roc: 0.7878\n",
      "roc-auc: 80.32% - roc-auc_val: 76.40%                                                                                                    \n",
      "Epoch 47/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5456 - binary_accuracy: 0.7173 - auc_roc: 0.7878 - val_loss: 0.5477 - val_binary_accuracy: 0.6762 - val_auc_roc: 0.7878\n",
      "roc-auc: 80.35% - roc-auc_val: 76.63%                                                                                                    \n",
      "Epoch 48/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5457 - binary_accuracy: 0.7167 - auc_roc: 0.7878 - val_loss: 0.5348 - val_binary_accuracy: 0.6814 - val_auc_roc: 0.7879\n",
      "roc-auc: 80.27% - roc-auc_val: 76.67%                                                                                                    \n",
      "Epoch 49/50\n",
      "228988/228988 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7165 - auc_roc: 0.7879 - val_loss: 0.5443 - val_binary_accuracy: 0.6770 - val_auc_roc: 0.7879\n",
      "roc-auc: 80.32% - roc-auc_val: 76.10%                                                                                                    \n",
      "Epoch 50/50\n",
      "228988/228988 [==============================] - 13s 55us/step - loss: 0.5458 - binary_accuracy: 0.7170 - auc_roc: 0.7879 - val_loss: 0.5480 - val_binary_accuracy: 0.6798 - val_auc_roc: 0.7879\n",
      "roc-auc: 80.31% - roc-auc_val: 76.32%                                                                                                    \n",
      "Train on 226499 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5475 - binary_accuracy: 0.7153 - auc_roc: 0.7879 - val_loss: 0.3950 - val_binary_accuracy: 0.8043 - val_auc_roc: 0.7879\n",
      "roc-auc: 80.23% - roc-auc_val: 87.58%                                                                                                    \n",
      "Epoch 2/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5469 - binary_accuracy: 0.7159 - auc_roc: 0.7879 - val_loss: 0.4031 - val_binary_accuracy: 0.8003 - val_auc_roc: 0.7879\n",
      "roc-auc: 80.20% - roc-auc_val: 86.76%                                                                                                    \n",
      "Epoch 3/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5477 - binary_accuracy: 0.7158 - auc_roc: 0.7880 - val_loss: 0.4333 - val_binary_accuracy: 0.7847 - val_auc_roc: 0.7880\n",
      "roc-auc: 80.19% - roc-auc_val: 85.24%                                                                                                    \n",
      "Epoch 4/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5469 - binary_accuracy: 0.7159 - auc_roc: 0.7880 - val_loss: 0.4360 - val_binary_accuracy: 0.7798 - val_auc_roc: 0.7880\n",
      "roc-auc: 80.22% - roc-auc_val: 85.06%                                                                                                    \n",
      "Epoch 5/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5473 - binary_accuracy: 0.7160 - auc_roc: 0.7880 - val_loss: 0.4177 - val_binary_accuracy: 0.7895 - val_auc_roc: 0.7880\n",
      "roc-auc: 80.26% - roc-auc_val: 85.62%                                                                                                    \n",
      "Epoch 6/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5463 - binary_accuracy: 0.7158 - auc_roc: 0.7880 - val_loss: 0.4236 - val_binary_accuracy: 0.7923 - val_auc_roc: 0.7880\n",
      "roc-auc: 80.19% - roc-auc_val: 85.57%                                                                                                    \n",
      "Epoch 7/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7163 - auc_roc: 0.7880 - val_loss: 0.4237 - val_binary_accuracy: 0.7907 - val_auc_roc: 0.7881\n",
      "roc-auc: 80.20% - roc-auc_val: 85.39%                                                                                                    \n",
      "Epoch 8/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5470 - binary_accuracy: 0.7162 - auc_roc: 0.7881 - val_loss: 0.4666 - val_binary_accuracy: 0.7626 - val_auc_roc: 0.7881\n",
      "roc-auc: 80.26% - roc-auc_val: 83.26%                                                                                                    \n",
      "Epoch 9/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7164 - auc_roc: 0.7881 - val_loss: 0.4249 - val_binary_accuracy: 0.7879 - val_auc_roc: 0.7881\n",
      "roc-auc: 80.24% - roc-auc_val: 85.24%                                                                                                    \n",
      "Epoch 10/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7157 - auc_roc: 0.7881 - val_loss: 0.4240 - val_binary_accuracy: 0.7887 - val_auc_roc: 0.7881\n",
      "roc-auc: 80.24% - roc-auc_val: 85.24%                                                                                                    \n",
      "Epoch 11/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7156 - auc_roc: 0.7881 - val_loss: 0.4280 - val_binary_accuracy: 0.7899 - val_auc_roc: 0.7881\n",
      "roc-auc: 80.19% - roc-auc_val: 85.07%                                                                                                    \n",
      "Epoch 12/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5465 - binary_accuracy: 0.7162 - auc_roc: 0.7881 - val_loss: 0.4347 - val_binary_accuracy: 0.7875 - val_auc_roc: 0.7882\n",
      "roc-auc: 80.25% - roc-auc_val: 85.04%                                                                                                    \n",
      "Epoch 13/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7166 - auc_roc: 0.7882 - val_loss: 0.4362 - val_binary_accuracy: 0.7867 - val_auc_roc: 0.7882\n",
      "roc-auc: 80.25% - roc-auc_val: 84.89%                                                                                                    \n",
      "Epoch 14/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7166 - auc_roc: 0.7882 - val_loss: 0.4625 - val_binary_accuracy: 0.7690 - val_auc_roc: 0.7882\n",
      "roc-auc: 80.21% - roc-auc_val: 83.62%                                                                                                    \n",
      "Epoch 15/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7159 - auc_roc: 0.7882 - val_loss: 0.4414 - val_binary_accuracy: 0.7830 - val_auc_roc: 0.7882\n",
      "roc-auc: 80.26% - roc-auc_val: 84.48%                                                                                                    \n",
      "Epoch 16/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5465 - binary_accuracy: 0.7159 - auc_roc: 0.7882 - val_loss: 0.4443 - val_binary_accuracy: 0.7802 - val_auc_roc: 0.7882\n",
      "roc-auc: 80.19% - roc-auc_val: 84.26%                                                                                                    \n",
      "Epoch 17/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5465 - binary_accuracy: 0.7165 - auc_roc: 0.7883 - val_loss: 0.4341 - val_binary_accuracy: 0.7859 - val_auc_roc: 0.7883\n",
      "roc-auc: 80.25% - roc-auc_val: 84.32%                                                                                                    \n",
      "Epoch 18/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5462 - binary_accuracy: 0.7154 - auc_roc: 0.7883 - val_loss: 0.4449 - val_binary_accuracy: 0.7790 - val_auc_roc: 0.7883\n",
      "roc-auc: 80.23% - roc-auc_val: 83.63%                                                                                                    \n",
      "Epoch 19/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7167 - auc_roc: 0.7883 - val_loss: 0.4521 - val_binary_accuracy: 0.7762 - val_auc_roc: 0.7883\n",
      "roc-auc: 80.18% - roc-auc_val: 83.66%                                                                                                    \n",
      "Epoch 20/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5456 - binary_accuracy: 0.7161 - auc_roc: 0.7883 - val_loss: 0.4701 - val_binary_accuracy: 0.7654 - val_auc_roc: 0.7883\n",
      "roc-auc: 80.28% - roc-auc_val: 82.93%                                                                                                    \n",
      "Epoch 21/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7170 - auc_roc: 0.7883 - val_loss: 0.4334 - val_binary_accuracy: 0.7871 - val_auc_roc: 0.7883\n",
      "roc-auc: 80.25% - roc-auc_val: 84.76%                                                                                                    \n",
      "Epoch 22/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5462 - binary_accuracy: 0.7167 - auc_roc: 0.7884 - val_loss: 0.4349 - val_binary_accuracy: 0.7867 - val_auc_roc: 0.7884\n",
      "roc-auc: 80.31% - roc-auc_val: 84.51%                                                                                                    \n",
      "Epoch 23/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7165 - auc_roc: 0.7884 - val_loss: 0.4528 - val_binary_accuracy: 0.7778 - val_auc_roc: 0.7884\n",
      "roc-auc: 80.29% - roc-auc_val: 83.15%                                                                                                    \n",
      "Epoch 24/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5460 - binary_accuracy: 0.7159 - auc_roc: 0.7884 - val_loss: 0.4494 - val_binary_accuracy: 0.7802 - val_auc_roc: 0.7884\n",
      "roc-auc: 80.27% - roc-auc_val: 83.20%                                                                                                    \n",
      "Epoch 25/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5467 - binary_accuracy: 0.7157 - auc_roc: 0.7884 - val_loss: 0.4512 - val_binary_accuracy: 0.7822 - val_auc_roc: 0.7884\n",
      "roc-auc: 80.26% - roc-auc_val: 83.53%                                                                                                    \n",
      "Epoch 26/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5456 - binary_accuracy: 0.7158 - auc_roc: 0.7884 - val_loss: 0.4542 - val_binary_accuracy: 0.7770 - val_auc_roc: 0.7884\n",
      "roc-auc: 80.26% - roc-auc_val: 84.07%                                                                                                    \n",
      "Epoch 27/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7161 - auc_roc: 0.7885 - val_loss: 0.4519 - val_binary_accuracy: 0.7762 - val_auc_roc: 0.7885\n",
      "roc-auc: 80.19% - roc-auc_val: 83.27%                                                                                                    \n",
      "Epoch 28/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7157 - auc_roc: 0.7885 - val_loss: 0.4503 - val_binary_accuracy: 0.7810 - val_auc_roc: 0.7885\n",
      "roc-auc: 80.31% - roc-auc_val: 83.48%                                                                                                    \n",
      "Epoch 29/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5460 - binary_accuracy: 0.7157 - auc_roc: 0.7885 - val_loss: 0.4431 - val_binary_accuracy: 0.7826 - val_auc_roc: 0.7885\n",
      "roc-auc: 80.25% - roc-auc_val: 83.98%                                                                                                    \n",
      "Epoch 30/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5457 - binary_accuracy: 0.7164 - auc_roc: 0.7885 - val_loss: 0.4529 - val_binary_accuracy: 0.7778 - val_auc_roc: 0.7885\n",
      "roc-auc: 80.29% - roc-auc_val: 83.79%                                                                                                    \n",
      "Epoch 31/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5466 - binary_accuracy: 0.7164 - auc_roc: 0.7885 - val_loss: 0.4627 - val_binary_accuracy: 0.7762 - val_auc_roc: 0.7885\n",
      "roc-auc: 80.27% - roc-auc_val: 83.31%                                                                                                    \n",
      "Epoch 32/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7164 - auc_roc: 0.7885 - val_loss: 0.4740 - val_binary_accuracy: 0.7622 - val_auc_roc: 0.7886\n",
      "roc-auc: 80.32% - roc-auc_val: 82.17%                                                                                                    \n",
      "Epoch 33/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7165 - auc_roc: 0.7886 - val_loss: 0.4634 - val_binary_accuracy: 0.7646 - val_auc_roc: 0.7886\n",
      "roc-auc: 80.29% - roc-auc_val: 82.73%                                                                                                    \n",
      "Epoch 34/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7163 - auc_roc: 0.7886 - val_loss: 0.4647 - val_binary_accuracy: 0.7738 - val_auc_roc: 0.7886\n",
      "roc-auc: 80.28% - roc-auc_val: 82.90%                                                                                                    \n",
      "Epoch 35/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7163 - auc_roc: 0.7886 - val_loss: 0.4580 - val_binary_accuracy: 0.7722 - val_auc_roc: 0.7886\n",
      "roc-auc: 80.25% - roc-auc_val: 82.66%                                                                                                    \n",
      "Epoch 36/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5463 - binary_accuracy: 0.7169 - auc_roc: 0.7886 - val_loss: 0.4622 - val_binary_accuracy: 0.7686 - val_auc_roc: 0.7886\n",
      "roc-auc: 80.27% - roc-auc_val: 82.65%                                                                                                    \n",
      "Epoch 37/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5465 - binary_accuracy: 0.7167 - auc_roc: 0.7886 - val_loss: 0.4850 - val_binary_accuracy: 0.7646 - val_auc_roc: 0.7886\n",
      "roc-auc: 80.28% - roc-auc_val: 81.60%                                                                                                    \n",
      "Epoch 38/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5460 - binary_accuracy: 0.7168 - auc_roc: 0.7886 - val_loss: 0.4763 - val_binary_accuracy: 0.7650 - val_auc_roc: 0.7887\n",
      "roc-auc: 80.28% - roc-auc_val: 81.48%                                                                                                    \n",
      "Epoch 39/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7158 - auc_roc: 0.7887 - val_loss: 0.4947 - val_binary_accuracy: 0.7670 - val_auc_roc: 0.7887\n",
      "roc-auc: 80.33% - roc-auc_val: 81.40%                                                                                                    \n",
      "Epoch 40/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5456 - binary_accuracy: 0.7163 - auc_roc: 0.7887 - val_loss: 0.4583 - val_binary_accuracy: 0.7802 - val_auc_roc: 0.7887\n",
      "roc-auc: 80.35% - roc-auc_val: 82.70%                                                                                                    \n",
      "Epoch 41/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7165 - auc_roc: 0.7887 - val_loss: 0.5330 - val_binary_accuracy: 0.7425 - val_auc_roc: 0.7887\n",
      "roc-auc: 80.29% - roc-auc_val: 79.25%                                                                                                    \n",
      "Epoch 42/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7169 - auc_roc: 0.7887 - val_loss: 0.5322 - val_binary_accuracy: 0.7485 - val_auc_roc: 0.7887\n",
      "roc-auc: 80.34% - roc-auc_val: 79.42%                                                                                                    \n",
      "Epoch 43/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "226499/226499 [==============================] - 13s 55us/step - loss: 0.5458 - binary_accuracy: 0.7160 - auc_roc: 0.7887 - val_loss: 0.4732 - val_binary_accuracy: 0.7690 - val_auc_roc: 0.7887\n",
      "roc-auc: 80.36% - roc-auc_val: 81.88%                                                                                                    \n",
      "Epoch 44/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5461 - binary_accuracy: 0.7165 - auc_roc: 0.7887 - val_loss: 0.4661 - val_binary_accuracy: 0.7662 - val_auc_roc: 0.7887\n",
      "roc-auc: 80.27% - roc-auc_val: 82.35%                                                                                                    \n",
      "Epoch 45/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5460 - binary_accuracy: 0.7162 - auc_roc: 0.7888 - val_loss: 0.4829 - val_binary_accuracy: 0.7618 - val_auc_roc: 0.7888\n",
      "roc-auc: 80.35% - roc-auc_val: 81.51%                                                                                                    \n",
      "Epoch 46/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5462 - binary_accuracy: 0.7168 - auc_roc: 0.7888 - val_loss: 0.4600 - val_binary_accuracy: 0.7734 - val_auc_roc: 0.7888\n",
      "roc-auc: 80.28% - roc-auc_val: 82.30%                                                                                                    \n",
      "Epoch 47/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7168 - auc_roc: 0.7888 - val_loss: 0.4510 - val_binary_accuracy: 0.7826 - val_auc_roc: 0.7888\n",
      "roc-auc: 80.26% - roc-auc_val: 83.39%                                                                                                    \n",
      "Epoch 48/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7165 - auc_roc: 0.7888 - val_loss: 0.4677 - val_binary_accuracy: 0.7698 - val_auc_roc: 0.7888\n",
      "roc-auc: 80.34% - roc-auc_val: 82.90%                                                                                                    \n",
      "Epoch 49/50\n",
      "226499/226499 [==============================] - 13s 56us/step - loss: 0.5462 - binary_accuracy: 0.7161 - auc_roc: 0.7888 - val_loss: 0.4646 - val_binary_accuracy: 0.7710 - val_auc_roc: 0.7888\n",
      "roc-auc: 80.22% - roc-auc_val: 82.74%                                                                                                    \n",
      "Epoch 50/50\n",
      "226499/226499 [==============================] - 13s 55us/step - loss: 0.5459 - binary_accuracy: 0.7160 - auc_roc: 0.7888 - val_loss: 0.4672 - val_binary_accuracy: 0.7690 - val_auc_roc: 0.7888\n",
      "roc-auc: 80.29% - roc-auc_val: 82.49%                                                                                                    \n",
      "Train on 224010 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5457 - binary_accuracy: 0.7168 - auc_roc: 0.7889 - val_loss: 0.5059 - val_binary_accuracy: 0.7224 - val_auc_roc: 0.7889\n",
      "roc-auc: 80.30% - roc-auc_val: 78.19%                                                                                                    \n",
      "Epoch 2/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5464 - binary_accuracy: 0.7160 - auc_roc: 0.7889 - val_loss: 0.5121 - val_binary_accuracy: 0.7204 - val_auc_roc: 0.7889\n",
      "roc-auc: 80.28% - roc-auc_val: 77.42%                                                                                                    \n",
      "Epoch 3/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5463 - binary_accuracy: 0.7153 - auc_roc: 0.7889 - val_loss: 0.5125 - val_binary_accuracy: 0.7172 - val_auc_roc: 0.7889\n",
      "roc-auc: 80.28% - roc-auc_val: 77.35%                                                                                                    \n",
      "Epoch 4/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5455 - binary_accuracy: 0.7173 - auc_roc: 0.7889 - val_loss: 0.5244 - val_binary_accuracy: 0.7107 - val_auc_roc: 0.7889\n",
      "roc-auc: 80.29% - roc-auc_val: 77.37%                                                                                                    \n",
      "Epoch 5/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5462 - binary_accuracy: 0.7158 - auc_roc: 0.7889 - val_loss: 0.5332 - val_binary_accuracy: 0.6750 - val_auc_roc: 0.7889\n",
      "roc-auc: 80.28% - roc-auc_val: 76.71%                                                                                                    \n",
      "Epoch 6/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5460 - binary_accuracy: 0.7169 - auc_roc: 0.7889 - val_loss: 0.5233 - val_binary_accuracy: 0.7051 - val_auc_roc: 0.7889\n",
      "roc-auc: 80.23% - roc-auc_val: 76.63%                                                                                                    \n",
      "Epoch 7/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5462 - binary_accuracy: 0.7172 - auc_roc: 0.7889 - val_loss: 0.5269 - val_binary_accuracy: 0.6951 - val_auc_roc: 0.7889\n",
      "roc-auc: 80.31% - roc-auc_val: 75.75%                                                                                                    \n",
      "Epoch 8/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5465 - binary_accuracy: 0.7152 - auc_roc: 0.7889 - val_loss: 0.5239 - val_binary_accuracy: 0.7019 - val_auc_roc: 0.7889\n",
      "roc-auc: 80.25% - roc-auc_val: 76.15%                                                                                                    \n",
      "Epoch 9/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7167 - auc_roc: 0.7890 - val_loss: 0.5293 - val_binary_accuracy: 0.6995 - val_auc_roc: 0.7890\n",
      "roc-auc: 80.25% - roc-auc_val: 76.05%                                                                                                    \n",
      "Epoch 10/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5455 - binary_accuracy: 0.7168 - auc_roc: 0.7890 - val_loss: 0.5293 - val_binary_accuracy: 0.7015 - val_auc_roc: 0.7890\n",
      "roc-auc: 80.30% - roc-auc_val: 76.33%                                                                                                    \n",
      "Epoch 11/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5463 - binary_accuracy: 0.7160 - auc_roc: 0.7890 - val_loss: 0.5316 - val_binary_accuracy: 0.6967 - val_auc_roc: 0.7890\n",
      "roc-auc: 80.30% - roc-auc_val: 76.32%                                                                                                    \n",
      "Epoch 12/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5460 - binary_accuracy: 0.7167 - auc_roc: 0.7890 - val_loss: 0.5327 - val_binary_accuracy: 0.6975 - val_auc_roc: 0.7890\n",
      "roc-auc: 80.29% - roc-auc_val: 75.80%                                                                                                    \n",
      "Epoch 13/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5454 - binary_accuracy: 0.7173 - auc_roc: 0.7890 - val_loss: 0.5395 - val_binary_accuracy: 0.6878 - val_auc_roc: 0.7890\n",
      "roc-auc: 80.30% - roc-auc_val: 75.94%                                                                                                    \n",
      "Epoch 14/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5464 - binary_accuracy: 0.7165 - auc_roc: 0.7890 - val_loss: 0.5376 - val_binary_accuracy: 0.6706 - val_auc_roc: 0.7890\n",
      "roc-auc: 80.30% - roc-auc_val: 75.72%                                                                                                    \n",
      "Epoch 15/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5453 - binary_accuracy: 0.7168 - auc_roc: 0.7890 - val_loss: 0.5314 - val_binary_accuracy: 0.6914 - val_auc_roc: 0.7890\n",
      "roc-auc: 80.31% - roc-auc_val: 76.13%                                                                                                    \n",
      "Epoch 16/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5452 - binary_accuracy: 0.7171 - auc_roc: 0.7890 - val_loss: 0.5394 - val_binary_accuracy: 0.6770 - val_auc_roc: 0.7890\n",
      "roc-auc: 80.32% - roc-auc_val: 75.61%                                                                                                    \n",
      "Epoch 17/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5454 - binary_accuracy: 0.7164 - auc_roc: 0.7891 - val_loss: 0.5387 - val_binary_accuracy: 0.6802 - val_auc_roc: 0.7891\n",
      "roc-auc: 80.31% - roc-auc_val: 75.85%                                                                                                    \n",
      "Epoch 18/50\n",
      "224010/224010 [==============================] - 12s 55us/step - loss: 0.5456 - binary_accuracy: 0.7163 - auc_roc: 0.7891 - val_loss: 0.5480 - val_binary_accuracy: 0.6794 - val_auc_roc: 0.7891\n",
      "roc-auc: 80.31% - roc-auc_val: 75.35%                                                                                                    \n",
      "Epoch 19/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5455 - binary_accuracy: 0.7167 - auc_roc: 0.7891 - val_loss: 0.5427 - val_binary_accuracy: 0.6770 - val_auc_roc: 0.7891\n",
      "roc-auc: 80.30% - roc-auc_val: 75.01%                                                                                                    \n",
      "Epoch 20/50\n",
      "224010/224010 [==============================] - 12s 55us/step - loss: 0.5453 - binary_accuracy: 0.7169 - auc_roc: 0.7891 - val_loss: 0.5425 - val_binary_accuracy: 0.6926 - val_auc_roc: 0.7891\n",
      "roc-auc: 80.29% - roc-auc_val: 75.04%                                                                                                    \n",
      "Epoch 21/50\n",
      "224010/224010 [==============================] - 12s 55us/step - loss: 0.5458 - binary_accuracy: 0.7169 - auc_roc: 0.7891 - val_loss: 0.5414 - val_binary_accuracy: 0.6782 - val_auc_roc: 0.7891\n",
      "roc-auc: 80.29% - roc-auc_val: 74.86%                                                                                                    \n",
      "Epoch 22/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5457 - binary_accuracy: 0.7175 - auc_roc: 0.7891 - val_loss: 0.5432 - val_binary_accuracy: 0.6790 - val_auc_roc: 0.7891\n",
      "roc-auc: 80.30% - roc-auc_val: 74.52%                                                                                                    \n",
      "Epoch 23/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5462 - binary_accuracy: 0.7167 - auc_roc: 0.7891 - val_loss: 0.5542 - val_binary_accuracy: 0.6472 - val_auc_roc: 0.7891\n",
      "roc-auc: 80.32% - roc-auc_val: 74.25%                                                                                                    \n",
      "Epoch 24/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5453 - binary_accuracy: 0.7168 - auc_roc: 0.7891 - val_loss: 0.5547 - val_binary_accuracy: 0.6260 - val_auc_roc: 0.7891\n",
      "roc-auc: 80.33% - roc-auc_val: 74.10%                                                                                                    \n",
      "Epoch 25/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5458 - binary_accuracy: 0.7173 - auc_roc: 0.7891 - val_loss: 0.5464 - val_binary_accuracy: 0.6786 - val_auc_roc: 0.7891\n",
      "roc-auc: 80.25% - roc-auc_val: 74.40%                                                                                                    \n",
      "Epoch 26/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5459 - binary_accuracy: 0.7164 - auc_roc: 0.7892 - val_loss: 0.5497 - val_binary_accuracy: 0.6641 - val_auc_roc: 0.7892\n",
      "roc-auc: 80.36% - roc-auc_val: 74.58%                                                                                                    \n",
      "Epoch 27/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5452 - binary_accuracy: 0.7166 - auc_roc: 0.7892 - val_loss: 0.5506 - val_binary_accuracy: 0.6521 - val_auc_roc: 0.7892\n",
      "roc-auc: 80.32% - roc-auc_val: 74.56%                                                                                                    \n",
      "Epoch 28/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5452 - binary_accuracy: 0.7161 - auc_roc: 0.7892 - val_loss: 0.5495 - val_binary_accuracy: 0.6621 - val_auc_roc: 0.7892\n",
      "roc-auc: 80.31% - roc-auc_val: 73.32%                                                                                                    \n",
      "Epoch 29/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7168 - auc_roc: 0.7892 - val_loss: 0.5419 - val_binary_accuracy: 0.6902 - val_auc_roc: 0.7892\n",
      "roc-auc: 80.32% - roc-auc_val: 74.34%                                                                                                    \n",
      "Epoch 30/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5458 - binary_accuracy: 0.7165 - auc_roc: 0.7892 - val_loss: 0.5540 - val_binary_accuracy: 0.6609 - val_auc_roc: 0.7892\n",
      "roc-auc: 80.32% - roc-auc_val: 73.92%                                                                                                    \n",
      "Epoch 31/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7167 - auc_roc: 0.7892 - val_loss: 0.5503 - val_binary_accuracy: 0.6609 - val_auc_roc: 0.7892\n",
      "roc-auc: 80.36% - roc-auc_val: 74.39%                                                                                                    \n",
      "Epoch 32/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5456 - binary_accuracy: 0.7156 - auc_roc: 0.7892 - val_loss: 0.5488 - val_binary_accuracy: 0.6786 - val_auc_roc: 0.7892\n",
      "roc-auc: 80.40% - roc-auc_val: 74.62%                                                                                                    \n",
      "Epoch 33/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5453 - binary_accuracy: 0.7165 - auc_roc: 0.7892 - val_loss: 0.5487 - val_binary_accuracy: 0.6625 - val_auc_roc: 0.7892\n",
      "roc-auc: 80.35% - roc-auc_val: 74.68%                                                                                                    \n",
      "Epoch 34/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5459 - binary_accuracy: 0.7170 - auc_roc: 0.7892 - val_loss: 0.5401 - val_binary_accuracy: 0.6951 - val_auc_roc: 0.7892\n",
      "roc-auc: 80.29% - roc-auc_val: 74.51%                                                                                                    \n",
      "Epoch 35/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5458 - binary_accuracy: 0.7170 - auc_roc: 0.7893 - val_loss: 0.5473 - val_binary_accuracy: 0.6874 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.32% - roc-auc_val: 74.71%                                                                                                    \n",
      "Epoch 36/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5452 - binary_accuracy: 0.7174 - auc_roc: 0.7893 - val_loss: 0.5445 - val_binary_accuracy: 0.6693 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.32% - roc-auc_val: 74.18%                                                                                                    \n",
      "Epoch 37/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5455 - binary_accuracy: 0.7172 - auc_roc: 0.7893 - val_loss: 0.5475 - val_binary_accuracy: 0.6842 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.30% - roc-auc_val: 73.68%                                                                                                    \n",
      "Epoch 38/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5453 - binary_accuracy: 0.7159 - auc_roc: 0.7893 - val_loss: 0.5545 - val_binary_accuracy: 0.6428 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.35% - roc-auc_val: 74.10%                                                                                                    \n",
      "Epoch 39/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5460 - binary_accuracy: 0.7170 - auc_roc: 0.7893 - val_loss: 0.5469 - val_binary_accuracy: 0.6669 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.32% - roc-auc_val: 74.09%                                                                                                    \n",
      "Epoch 40/50\n",
      "224010/224010 [==============================] - 13s 56us/step - loss: 0.5453 - binary_accuracy: 0.7170 - auc_roc: 0.7893 - val_loss: 0.5480 - val_binary_accuracy: 0.6649 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.35% - roc-auc_val: 73.86%                                                                                                    \n",
      "Epoch 41/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "224010/224010 [==============================] - 12s 55us/step - loss: 0.5456 - binary_accuracy: 0.7163 - auc_roc: 0.7893 - val_loss: 0.5499 - val_binary_accuracy: 0.6685 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.31% - roc-auc_val: 74.28%                                                                                                    \n",
      "Epoch 42/50\n",
      "224010/224010 [==============================] - 12s 55us/step - loss: 0.5454 - binary_accuracy: 0.7166 - auc_roc: 0.7893 - val_loss: 0.5489 - val_binary_accuracy: 0.6786 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.32% - roc-auc_val: 73.96%                                                                                                    \n",
      "Epoch 43/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5453 - binary_accuracy: 0.7169 - auc_roc: 0.7893 - val_loss: 0.5439 - val_binary_accuracy: 0.6734 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.33% - roc-auc_val: 75.23%                                                                                                    \n",
      "Epoch 44/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5456 - binary_accuracy: 0.7172 - auc_roc: 0.7893 - val_loss: 0.5593 - val_binary_accuracy: 0.6107 - val_auc_roc: 0.7893\n",
      "roc-auc: 80.37% - roc-auc_val: 73.81%                                                                                                    \n",
      "Epoch 45/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5455 - binary_accuracy: 0.7173 - auc_roc: 0.7894 - val_loss: 0.5581 - val_binary_accuracy: 0.6537 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.33% - roc-auc_val: 73.50%                                                                                                    \n",
      "Epoch 46/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5459 - binary_accuracy: 0.7155 - auc_roc: 0.7894 - val_loss: 0.5538 - val_binary_accuracy: 0.6593 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.33% - roc-auc_val: 74.33%                                                                                                    \n",
      "Epoch 47/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5455 - binary_accuracy: 0.7165 - auc_roc: 0.7894 - val_loss: 0.5478 - val_binary_accuracy: 0.6677 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.34% - roc-auc_val: 74.55%                                                                                                    \n",
      "Epoch 48/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5455 - binary_accuracy: 0.7169 - auc_roc: 0.7894 - val_loss: 0.5537 - val_binary_accuracy: 0.6814 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.32% - roc-auc_val: 74.14%                                                                                                    \n",
      "Epoch 49/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7178 - auc_roc: 0.7894 - val_loss: 0.5568 - val_binary_accuracy: 0.6573 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.38% - roc-auc_val: 74.58%                                                                                                    \n",
      "Epoch 50/50\n",
      "224010/224010 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7174 - auc_roc: 0.7894 - val_loss: 0.5437 - val_binary_accuracy: 0.6983 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.32% - roc-auc_val: 74.52%                                                                                                    \n",
      "Train on 221521 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7180 - auc_roc: 0.7894 - val_loss: 0.5560 - val_binary_accuracy: 0.6589 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.44% - roc-auc_val: 74.16%                                                                                                    \n",
      "Epoch 2/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5452 - binary_accuracy: 0.7177 - auc_roc: 0.7894 - val_loss: 0.5674 - val_binary_accuracy: 0.6545 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.41% - roc-auc_val: 73.31%                                                                                                    \n",
      "Epoch 3/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7177 - auc_roc: 0.7894 - val_loss: 0.5624 - val_binary_accuracy: 0.6569 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.42% - roc-auc_val: 73.39%                                                                                                    \n",
      "Epoch 4/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5455 - binary_accuracy: 0.7163 - auc_roc: 0.7894 - val_loss: 0.5572 - val_binary_accuracy: 0.6541 - val_auc_roc: 0.7894\n",
      "roc-auc: 80.39% - roc-auc_val: 73.67%                                                                                                    \n",
      "Epoch 5/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5457 - binary_accuracy: 0.7166 - auc_roc: 0.7895 - val_loss: 0.5703 - val_binary_accuracy: 0.6513 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.40% - roc-auc_val: 73.37%                                                                                                    \n",
      "Epoch 6/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7176 - auc_roc: 0.7895 - val_loss: 0.5613 - val_binary_accuracy: 0.6541 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.34% - roc-auc_val: 73.24%                                                                                                    \n",
      "Epoch 7/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7180 - auc_roc: 0.7895 - val_loss: 0.5730 - val_binary_accuracy: 0.6501 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.42% - roc-auc_val: 72.96%                                                                                                    \n",
      "Epoch 8/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7179 - auc_roc: 0.7895 - val_loss: 0.5660 - val_binary_accuracy: 0.6509 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.44% - roc-auc_val: 73.19%                                                                                                    \n",
      "Epoch 9/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5454 - binary_accuracy: 0.7172 - auc_roc: 0.7895 - val_loss: 0.5609 - val_binary_accuracy: 0.6485 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.40% - roc-auc_val: 73.07%                                                                                                    \n",
      "Epoch 10/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5456 - binary_accuracy: 0.7174 - auc_roc: 0.7895 - val_loss: 0.5651 - val_binary_accuracy: 0.6472 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.37% - roc-auc_val: 72.78%                                                                                                    \n",
      "Epoch 11/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7172 - auc_roc: 0.7895 - val_loss: 0.5659 - val_binary_accuracy: 0.6464 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.44% - roc-auc_val: 72.67%                                                                                                    \n",
      "Epoch 12/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7170 - auc_roc: 0.7895 - val_loss: 0.5742 - val_binary_accuracy: 0.6485 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.40% - roc-auc_val: 72.34%                                                                                                    \n",
      "Epoch 13/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7180 - auc_roc: 0.7895 - val_loss: 0.5618 - val_binary_accuracy: 0.6489 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.43% - roc-auc_val: 72.59%                                                                                                    \n",
      "Epoch 14/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5457 - binary_accuracy: 0.7173 - auc_roc: 0.7895 - val_loss: 0.5710 - val_binary_accuracy: 0.6456 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.41% - roc-auc_val: 72.55%                                                                                                    \n",
      "Epoch 15/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "221521/221521 [==============================] - 12s 55us/step - loss: 0.5450 - binary_accuracy: 0.7171 - auc_roc: 0.7895 - val_loss: 0.5721 - val_binary_accuracy: 0.6432 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.37% - roc-auc_val: 72.37%                                                                                                    \n",
      "Epoch 16/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5453 - binary_accuracy: 0.7174 - auc_roc: 0.7895 - val_loss: 0.5738 - val_binary_accuracy: 0.6440 - val_auc_roc: 0.7895\n",
      "roc-auc: 80.40% - roc-auc_val: 72.25%                                                                                                    \n",
      "Epoch 17/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7178 - auc_roc: 0.7896 - val_loss: 0.5713 - val_binary_accuracy: 0.6501 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.36% - roc-auc_val: 72.53%                                                                                                    \n",
      "Epoch 18/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7174 - auc_roc: 0.7896 - val_loss: 0.5737 - val_binary_accuracy: 0.6481 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.38% - roc-auc_val: 72.58%                                                                                                    \n",
      "Epoch 19/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7186 - auc_roc: 0.7896 - val_loss: 0.5820 - val_binary_accuracy: 0.6489 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.40% - roc-auc_val: 72.34%                                                                                                    \n",
      "Epoch 20/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7179 - auc_roc: 0.7896 - val_loss: 0.5651 - val_binary_accuracy: 0.6509 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.45% - roc-auc_val: 72.88%                                                                                                    \n",
      "Epoch 21/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5454 - binary_accuracy: 0.7170 - auc_roc: 0.7896 - val_loss: 0.5666 - val_binary_accuracy: 0.6529 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.40% - roc-auc_val: 72.47%                                                                                                    \n",
      "Epoch 22/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7178 - auc_roc: 0.7896 - val_loss: 0.5953 - val_binary_accuracy: 0.6396 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.48% - roc-auc_val: 71.68%                                                                                                    \n",
      "Epoch 23/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7178 - auc_roc: 0.7896 - val_loss: 0.5820 - val_binary_accuracy: 0.6472 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.42% - roc-auc_val: 71.64%                                                                                                    \n",
      "Epoch 24/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7171 - auc_roc: 0.7896 - val_loss: 0.5795 - val_binary_accuracy: 0.6456 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.39% - roc-auc_val: 71.95%                                                                                                    \n",
      "Epoch 25/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7176 - auc_roc: 0.7896 - val_loss: 0.5986 - val_binary_accuracy: 0.6392 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.37% - roc-auc_val: 71.41%                                                                                                    \n",
      "Epoch 26/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5453 - binary_accuracy: 0.7181 - auc_roc: 0.7896 - val_loss: 0.5746 - val_binary_accuracy: 0.6448 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.41% - roc-auc_val: 71.97%                                                                                                    \n",
      "Epoch 27/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7178 - auc_roc: 0.7896 - val_loss: 0.5930 - val_binary_accuracy: 0.6408 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.39% - roc-auc_val: 71.68%                                                                                                    \n",
      "Epoch 28/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7176 - auc_roc: 0.7896 - val_loss: 0.5987 - val_binary_accuracy: 0.6440 - val_auc_roc: 0.7896\n",
      "roc-auc: 80.41% - roc-auc_val: 71.83%                                                                                                    \n",
      "Epoch 29/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7177 - auc_roc: 0.7897 - val_loss: 0.6196 - val_binary_accuracy: 0.6380 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.41% - roc-auc_val: 71.36%                                                                                                    \n",
      "Epoch 30/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7176 - auc_roc: 0.7897 - val_loss: 0.5879 - val_binary_accuracy: 0.6388 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.43% - roc-auc_val: 71.49%                                                                                                    \n",
      "Epoch 31/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7173 - auc_roc: 0.7897 - val_loss: 0.5980 - val_binary_accuracy: 0.6408 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.48% - roc-auc_val: 71.19%                                                                                                    \n",
      "Epoch 32/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7184 - auc_roc: 0.7897 - val_loss: 0.6142 - val_binary_accuracy: 0.6380 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.45% - roc-auc_val: 70.88%                                                                                                    \n",
      "Epoch 33/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7175 - auc_roc: 0.7897 - val_loss: 0.6016 - val_binary_accuracy: 0.6396 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.45% - roc-auc_val: 71.11%                                                                                                    \n",
      "Epoch 34/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7171 - auc_roc: 0.7897 - val_loss: 0.5836 - val_binary_accuracy: 0.6420 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.45% - roc-auc_val: 71.58%                                                                                                    \n",
      "Epoch 35/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7183 - auc_roc: 0.7897 - val_loss: 0.6249 - val_binary_accuracy: 0.6344 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.43% - roc-auc_val: 71.49%                                                                                                    \n",
      "Epoch 36/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7176 - auc_roc: 0.7897 - val_loss: 0.5832 - val_binary_accuracy: 0.6352 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.49% - roc-auc_val: 71.46%                                                                                                    \n",
      "Epoch 37/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5454 - binary_accuracy: 0.7177 - auc_roc: 0.7897 - val_loss: 0.6079 - val_binary_accuracy: 0.6420 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.43% - roc-auc_val: 71.28%                                                                                                    \n",
      "Epoch 38/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7166 - auc_roc: 0.7897 - val_loss: 0.5935 - val_binary_accuracy: 0.6336 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.45% - roc-auc_val: 71.49%                                                                                                    \n",
      "Epoch 39/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5453 - binary_accuracy: 0.7173 - auc_roc: 0.7897 - val_loss: 0.6254 - val_binary_accuracy: 0.6400 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.46% - roc-auc_val: 71.06%                                                                                                    \n",
      "Epoch 40/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7175 - auc_roc: 0.7897 - val_loss: 0.6080 - val_binary_accuracy: 0.6428 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.42% - roc-auc_val: 71.15%                                                                                                    \n",
      "Epoch 41/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7178 - auc_roc: 0.7897 - val_loss: 0.6257 - val_binary_accuracy: 0.6340 - val_auc_roc: 0.7897\n",
      "roc-auc: 80.44% - roc-auc_val: 70.76%                                                                                                    \n",
      "Epoch 42/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5452 - binary_accuracy: 0.7173 - auc_roc: 0.7897 - val_loss: 0.6283 - val_binary_accuracy: 0.6408 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.45% - roc-auc_val: 71.02%                                                                                                    \n",
      "Epoch 43/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7176 - auc_roc: 0.7898 - val_loss: 0.6193 - val_binary_accuracy: 0.6372 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.43% - roc-auc_val: 71.34%                                                                                                    \n",
      "Epoch 44/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7171 - auc_roc: 0.7898 - val_loss: 0.6500 - val_binary_accuracy: 0.6328 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.44% - roc-auc_val: 71.41%                                                                                                    \n",
      "Epoch 45/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7180 - auc_roc: 0.7898 - val_loss: 0.6188 - val_binary_accuracy: 0.6352 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.44% - roc-auc_val: 71.48%                                                                                                    \n",
      "Epoch 46/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7175 - auc_roc: 0.7898 - val_loss: 0.5901 - val_binary_accuracy: 0.6436 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.43% - roc-auc_val: 71.22%                                                                                                    \n",
      "Epoch 47/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7181 - auc_roc: 0.7898 - val_loss: 0.6215 - val_binary_accuracy: 0.6396 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.43% - roc-auc_val: 70.75%                                                                                                    \n",
      "Epoch 48/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7173 - auc_roc: 0.7898 - val_loss: 0.6314 - val_binary_accuracy: 0.6372 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.43% - roc-auc_val: 71.12%                                                                                                    \n",
      "Epoch 49/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7176 - auc_roc: 0.7898 - val_loss: 0.6201 - val_binary_accuracy: 0.6400 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.50% - roc-auc_val: 70.62%                                                                                                    \n",
      "Epoch 50/50\n",
      "221521/221521 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7185 - auc_roc: 0.7898 - val_loss: 0.6330 - val_binary_accuracy: 0.6340 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.46% - roc-auc_val: 70.80%                                                                                                    \n",
      "Train on 219032 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5454 - binary_accuracy: 0.7178 - auc_roc: 0.7898 - val_loss: 0.4998 - val_binary_accuracy: 0.7437 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.45% - roc-auc_val: 82.11%                                                                                                    \n",
      "Epoch 2/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5454 - binary_accuracy: 0.7174 - auc_roc: 0.7898 - val_loss: 0.4998 - val_binary_accuracy: 0.7384 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.37% - roc-auc_val: 81.72%                                                                                                    \n",
      "Epoch 3/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5455 - binary_accuracy: 0.7173 - auc_roc: 0.7898 - val_loss: 0.5055 - val_binary_accuracy: 0.7332 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.44% - roc-auc_val: 81.29%                                                                                                    \n",
      "Epoch 4/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5456 - binary_accuracy: 0.7179 - auc_roc: 0.7898 - val_loss: 0.5025 - val_binary_accuracy: 0.7308 - val_auc_roc: 0.7898\n",
      "roc-auc: 80.47% - roc-auc_val: 81.27%                                                                                                    \n",
      "Epoch 5/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7183 - auc_roc: 0.7898 - val_loss: 0.5145 - val_binary_accuracy: 0.7268 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.44% - roc-auc_val: 81.00%                                                                                                    \n",
      "Epoch 6/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5452 - binary_accuracy: 0.7173 - auc_roc: 0.7899 - val_loss: 0.5079 - val_binary_accuracy: 0.7288 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.50% - roc-auc_val: 80.67%                                                                                                    \n",
      "Epoch 7/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7168 - auc_roc: 0.7899 - val_loss: 0.5152 - val_binary_accuracy: 0.7348 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.47% - roc-auc_val: 80.60%                                                                                                    \n",
      "Epoch 8/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7177 - auc_roc: 0.7899 - val_loss: 0.5277 - val_binary_accuracy: 0.7252 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.46% - roc-auc_val: 80.07%                                                                                                    \n",
      "Epoch 9/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7184 - auc_roc: 0.7899 - val_loss: 0.5284 - val_binary_accuracy: 0.7240 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.45% - roc-auc_val: 80.06%                                                                                                    \n",
      "Epoch 10/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7180 - auc_roc: 0.7899 - val_loss: 0.5162 - val_binary_accuracy: 0.7220 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.45% - roc-auc_val: 80.34%                                                                                                    \n",
      "Epoch 11/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7180 - auc_roc: 0.7899 - val_loss: 0.5246 - val_binary_accuracy: 0.7208 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.52% - roc-auc_val: 80.06%                                                                                                    \n",
      "Epoch 12/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7177 - auc_roc: 0.7899 - val_loss: 0.5369 - val_binary_accuracy: 0.7172 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.51% - roc-auc_val: 79.21%                                                                                                    \n",
      "Epoch 13/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "219032/219032 [==============================] - 12s 55us/step - loss: 0.5452 - binary_accuracy: 0.7175 - auc_roc: 0.7899 - val_loss: 0.5512 - val_binary_accuracy: 0.7155 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.46% - roc-auc_val: 79.06%                                                                                                    \n",
      "Epoch 14/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7184 - auc_roc: 0.7899 - val_loss: 0.5139 - val_binary_accuracy: 0.7244 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.52% - roc-auc_val: 79.91%                                                                                                    \n",
      "Epoch 15/50\n",
      "219032/219032 [==============================] - 12s 55us/step - loss: 0.5446 - binary_accuracy: 0.7174 - auc_roc: 0.7899 - val_loss: 0.5209 - val_binary_accuracy: 0.7216 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.46% - roc-auc_val: 79.19%                                                                                                    \n",
      "Epoch 16/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7172 - auc_roc: 0.7899 - val_loss: 0.5261 - val_binary_accuracy: 0.7143 - val_auc_roc: 0.7899\n",
      "roc-auc: 80.49% - roc-auc_val: 79.16%                                                                                                    \n",
      "Epoch 17/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7178 - auc_roc: 0.7900 - val_loss: 0.5182 - val_binary_accuracy: 0.7220 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.49% - roc-auc_val: 79.39%                                                                                                    \n",
      "Epoch 18/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7169 - auc_roc: 0.7900 - val_loss: 0.5268 - val_binary_accuracy: 0.7160 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.44% - roc-auc_val: 79.45%                                                                                                    \n",
      "Epoch 19/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5453 - binary_accuracy: 0.7166 - auc_roc: 0.7900 - val_loss: 0.5291 - val_binary_accuracy: 0.7172 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.39% - roc-auc_val: 79.21%                                                                                                    \n",
      "Epoch 20/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7174 - auc_roc: 0.7900 - val_loss: 0.5264 - val_binary_accuracy: 0.7200 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.51% - roc-auc_val: 79.53%                                                                                                    \n",
      "Epoch 21/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7180 - auc_roc: 0.7900 - val_loss: 0.5222 - val_binary_accuracy: 0.7164 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.46% - roc-auc_val: 79.36%                                                                                                    \n",
      "Epoch 22/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7179 - auc_roc: 0.7900 - val_loss: 0.5301 - val_binary_accuracy: 0.7091 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.43% - roc-auc_val: 78.83%                                                                                                    \n",
      "Epoch 23/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7178 - auc_roc: 0.7900 - val_loss: 0.5292 - val_binary_accuracy: 0.7164 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.47% - roc-auc_val: 79.03%                                                                                                    \n",
      "Epoch 24/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7181 - auc_roc: 0.7900 - val_loss: 0.5458 - val_binary_accuracy: 0.7107 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.53% - roc-auc_val: 78.86%                                                                                                    \n",
      "Epoch 25/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7170 - auc_roc: 0.7900 - val_loss: 0.5651 - val_binary_accuracy: 0.7051 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.47% - roc-auc_val: 78.02%                                                                                                    \n",
      "Epoch 26/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7177 - auc_roc: 0.7900 - val_loss: 0.5402 - val_binary_accuracy: 0.7143 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.40% - roc-auc_val: 78.67%                                                                                                    \n",
      "Epoch 27/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7178 - auc_roc: 0.7900 - val_loss: 0.5416 - val_binary_accuracy: 0.7127 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.43% - roc-auc_val: 78.58%                                                                                                    \n",
      "Epoch 28/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7179 - auc_roc: 0.7900 - val_loss: 0.5282 - val_binary_accuracy: 0.7184 - val_auc_roc: 0.7900\n",
      "roc-auc: 80.50% - roc-auc_val: 79.11%                                                                                                    \n",
      "Epoch 29/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5437 - binary_accuracy: 0.7176 - auc_roc: 0.7900 - val_loss: 0.5283 - val_binary_accuracy: 0.7151 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.46% - roc-auc_val: 78.87%                                                                                                    \n",
      "Epoch 30/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7171 - auc_roc: 0.7901 - val_loss: 0.5436 - val_binary_accuracy: 0.7119 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.48% - roc-auc_val: 78.07%                                                                                                    \n",
      "Epoch 31/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7183 - auc_roc: 0.7901 - val_loss: 0.5301 - val_binary_accuracy: 0.7119 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.52% - roc-auc_val: 78.58%                                                                                                    \n",
      "Epoch 32/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7188 - auc_roc: 0.7901 - val_loss: 0.5440 - val_binary_accuracy: 0.7143 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.48% - roc-auc_val: 78.73%                                                                                                    \n",
      "Epoch 33/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7179 - auc_roc: 0.7901 - val_loss: 0.5543 - val_binary_accuracy: 0.7063 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.42% - roc-auc_val: 77.92%                                                                                                    \n",
      "Epoch 34/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7177 - auc_roc: 0.7901 - val_loss: 0.5427 - val_binary_accuracy: 0.7115 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.51% - roc-auc_val: 78.11%                                                                                                    \n",
      "Epoch 35/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7176 - auc_roc: 0.7901 - val_loss: 0.5539 - val_binary_accuracy: 0.7139 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.46% - roc-auc_val: 78.26%                                                                                                    \n",
      "Epoch 36/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7177 - auc_roc: 0.7901 - val_loss: 0.5412 - val_binary_accuracy: 0.7164 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.46% - roc-auc_val: 78.62%                                                                                                    \n",
      "Epoch 37/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7178 - auc_roc: 0.7901 - val_loss: 0.5431 - val_binary_accuracy: 0.7176 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.48% - roc-auc_val: 78.32%                                                                                                    \n",
      "Epoch 38/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7178 - auc_roc: 0.7901 - val_loss: 0.5416 - val_binary_accuracy: 0.7095 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.49% - roc-auc_val: 78.14%                                                                                                    \n",
      "Epoch 39/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7181 - auc_roc: 0.7901 - val_loss: 0.5663 - val_binary_accuracy: 0.7079 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.49% - roc-auc_val: 77.64%                                                                                                    \n",
      "Epoch 40/50\n",
      "219032/219032 [==============================] - 12s 55us/step - loss: 0.5446 - binary_accuracy: 0.7177 - auc_roc: 0.7901 - val_loss: 0.5329 - val_binary_accuracy: 0.7103 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.52% - roc-auc_val: 78.27%                                                                                                    \n",
      "Epoch 41/50\n",
      "219032/219032 [==============================] - 12s 55us/step - loss: 0.5445 - binary_accuracy: 0.7177 - auc_roc: 0.7901 - val_loss: 0.5438 - val_binary_accuracy: 0.7059 - val_auc_roc: 0.7901\n",
      "roc-auc: 80.48% - roc-auc_val: 78.00%                                                                                                    \n",
      "Epoch 42/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7175 - auc_roc: 0.7902 - val_loss: 0.5386 - val_binary_accuracy: 0.7131 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.48% - roc-auc_val: 78.18%                                                                                                    \n",
      "Epoch 43/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5439 - binary_accuracy: 0.7174 - auc_roc: 0.7902 - val_loss: 0.5405 - val_binary_accuracy: 0.7099 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.52% - roc-auc_val: 78.23%                                                                                                    \n",
      "Epoch 44/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7175 - auc_roc: 0.7902 - val_loss: 0.5487 - val_binary_accuracy: 0.7095 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.52% - roc-auc_val: 78.03%                                                                                                    \n",
      "Epoch 45/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7175 - auc_roc: 0.7902 - val_loss: 0.5346 - val_binary_accuracy: 0.7099 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.52% - roc-auc_val: 78.03%                                                                                                    \n",
      "Epoch 46/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5439 - binary_accuracy: 0.7179 - auc_roc: 0.7902 - val_loss: 0.5426 - val_binary_accuracy: 0.7003 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.55% - roc-auc_val: 77.62%                                                                                                    \n",
      "Epoch 47/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7180 - auc_roc: 0.7902 - val_loss: 0.5381 - val_binary_accuracy: 0.7123 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.53% - roc-auc_val: 78.31%                                                                                                    \n",
      "Epoch 48/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7178 - auc_roc: 0.7902 - val_loss: 0.5371 - val_binary_accuracy: 0.7079 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.53% - roc-auc_val: 77.93%                                                                                                    \n",
      "Epoch 49/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5441 - binary_accuracy: 0.7175 - auc_roc: 0.7902 - val_loss: 0.5497 - val_binary_accuracy: 0.7027 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.51% - roc-auc_val: 77.13%                                                                                                    \n",
      "Epoch 50/50\n",
      "219032/219032 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7178 - auc_roc: 0.7902 - val_loss: 0.5471 - val_binary_accuracy: 0.7115 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.54% - roc-auc_val: 78.01%                                                                                                    \n",
      "Train on 216543 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7176 - auc_roc: 0.7902 - val_loss: 0.4927 - val_binary_accuracy: 0.7389 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.40% - roc-auc_val: 80.53%                                                                                                    \n",
      "Epoch 2/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7177 - auc_roc: 0.7902 - val_loss: 0.4974 - val_binary_accuracy: 0.7380 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.44% - roc-auc_val: 80.32%                                                                                                    \n",
      "Epoch 3/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5452 - binary_accuracy: 0.7176 - auc_roc: 0.7902 - val_loss: 0.5091 - val_binary_accuracy: 0.7304 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.42% - roc-auc_val: 79.46%                                                                                                    \n",
      "Epoch 4/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7175 - auc_roc: 0.7902 - val_loss: 0.5117 - val_binary_accuracy: 0.7272 - val_auc_roc: 0.7902\n",
      "roc-auc: 80.45% - roc-auc_val: 79.57%                                                                                                    \n",
      "Epoch 5/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7183 - auc_roc: 0.7903 - val_loss: 0.5163 - val_binary_accuracy: 0.7276 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.47% - roc-auc_val: 78.90%                                                                                                    \n",
      "Epoch 6/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7175 - auc_roc: 0.7903 - val_loss: 0.5082 - val_binary_accuracy: 0.7280 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.54% - roc-auc_val: 78.77%                                                                                                    \n",
      "Epoch 7/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7178 - auc_roc: 0.7903 - val_loss: 0.5201 - val_binary_accuracy: 0.7212 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.48% - roc-auc_val: 78.33%                                                                                                    \n",
      "Epoch 8/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7185 - auc_roc: 0.7903 - val_loss: 0.5112 - val_binary_accuracy: 0.7192 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.44% - roc-auc_val: 78.45%                                                                                                    \n",
      "Epoch 9/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7178 - auc_roc: 0.7903 - val_loss: 0.5154 - val_binary_accuracy: 0.7212 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.51% - roc-auc_val: 78.54%                                                                                                    \n",
      "Epoch 10/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5449 - binary_accuracy: 0.7176 - auc_roc: 0.7903 - val_loss: 0.5194 - val_binary_accuracy: 0.7192 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.53% - roc-auc_val: 78.05%                                                                                                    \n",
      "Epoch 11/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5452 - binary_accuracy: 0.7180 - auc_roc: 0.7903 - val_loss: 0.5204 - val_binary_accuracy: 0.7172 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.46% - roc-auc_val: 78.02%                                                                                                    \n",
      "Epoch 12/50\n",
      "216543/216543 [==============================] - 12s 55us/step - loss: 0.5449 - binary_accuracy: 0.7180 - auc_roc: 0.7903 - val_loss: 0.5281 - val_binary_accuracy: 0.7188 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.45% - roc-auc_val: 77.33%                                                                                                    \n",
      "Epoch 13/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7177 - auc_roc: 0.7903 - val_loss: 0.5231 - val_binary_accuracy: 0.7123 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.46% - roc-auc_val: 77.20%                                                                                                    \n",
      "Epoch 14/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5452 - binary_accuracy: 0.7173 - auc_roc: 0.7903 - val_loss: 0.5251 - val_binary_accuracy: 0.7180 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.46% - roc-auc_val: 77.54%                                                                                                    \n",
      "Epoch 15/50\n",
      "216543/216543 [==============================] - 12s 55us/step - loss: 0.5447 - binary_accuracy: 0.7180 - auc_roc: 0.7903 - val_loss: 0.5217 - val_binary_accuracy: 0.7204 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.41% - roc-auc_val: 77.88%                                                                                                    \n",
      "Epoch 16/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7174 - auc_roc: 0.7903 - val_loss: 0.5275 - val_binary_accuracy: 0.7103 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.45% - roc-auc_val: 77.05%                                                                                                    \n",
      "Epoch 17/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7185 - auc_roc: 0.7903 - val_loss: 0.5210 - val_binary_accuracy: 0.7115 - val_auc_roc: 0.7903\n",
      "roc-auc: 80.46% - roc-auc_val: 77.29%                                                                                                    \n",
      "Epoch 18/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7175 - auc_roc: 0.7903 - val_loss: 0.5332 - val_binary_accuracy: 0.7111 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.46% - roc-auc_val: 76.73%                                                                                                    \n",
      "Epoch 19/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7172 - auc_roc: 0.7904 - val_loss: 0.5380 - val_binary_accuracy: 0.6943 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.50% - roc-auc_val: 77.06%                                                                                                    \n",
      "Epoch 20/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7188 - auc_roc: 0.7904 - val_loss: 0.5209 - val_binary_accuracy: 0.7091 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.50% - roc-auc_val: 77.42%                                                                                                    \n",
      "Epoch 21/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7177 - auc_roc: 0.7904 - val_loss: 0.5235 - val_binary_accuracy: 0.7079 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.44% - roc-auc_val: 77.24%                                                                                                    \n",
      "Epoch 22/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7175 - auc_roc: 0.7904 - val_loss: 0.5272 - val_binary_accuracy: 0.7103 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.52% - roc-auc_val: 77.32%                                                                                                    \n",
      "Epoch 23/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7181 - auc_roc: 0.7904 - val_loss: 0.5337 - val_binary_accuracy: 0.7067 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.50% - roc-auc_val: 76.68%                                                                                                    \n",
      "Epoch 24/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5451 - binary_accuracy: 0.7173 - auc_roc: 0.7904 - val_loss: 0.5279 - val_binary_accuracy: 0.7023 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.43% - roc-auc_val: 76.22%                                                                                                    \n",
      "Epoch 25/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5453 - binary_accuracy: 0.7176 - auc_roc: 0.7904 - val_loss: 0.5295 - val_binary_accuracy: 0.7047 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.50% - roc-auc_val: 76.42%                                                                                                    \n",
      "Epoch 26/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7173 - auc_roc: 0.7904 - val_loss: 0.5290 - val_binary_accuracy: 0.7055 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.48% - roc-auc_val: 76.39%                                                                                                    \n",
      "Epoch 27/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7179 - auc_roc: 0.7904 - val_loss: 0.5419 - val_binary_accuracy: 0.7043 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.44% - roc-auc_val: 76.49%                                                                                                    \n",
      "Epoch 28/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7174 - auc_roc: 0.7904 - val_loss: 0.5297 - val_binary_accuracy: 0.7039 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.47% - roc-auc_val: 76.17%                                                                                                    \n",
      "Epoch 29/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7179 - auc_roc: 0.7904 - val_loss: 0.5378 - val_binary_accuracy: 0.7015 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.44% - roc-auc_val: 75.91%                                                                                                    \n",
      "Epoch 30/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7179 - auc_roc: 0.7904 - val_loss: 0.5349 - val_binary_accuracy: 0.6930 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.45% - roc-auc_val: 75.66%                                                                                                    \n",
      "Epoch 31/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7177 - auc_roc: 0.7904 - val_loss: 0.5352 - val_binary_accuracy: 0.7027 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.51% - roc-auc_val: 75.81%                                                                                                    \n",
      "Epoch 32/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7186 - auc_roc: 0.7904 - val_loss: 0.5424 - val_binary_accuracy: 0.6947 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.45% - roc-auc_val: 75.96%                                                                                                    \n",
      "Epoch 33/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7181 - auc_roc: 0.7904 - val_loss: 0.5396 - val_binary_accuracy: 0.6910 - val_auc_roc: 0.7904\n",
      "roc-auc: 80.52% - roc-auc_val: 75.81%                                                                                                    \n",
      "Epoch 34/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7188 - auc_roc: 0.7905 - val_loss: 0.5412 - val_binary_accuracy: 0.6939 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.51% - roc-auc_val: 75.98%                                                                                                    \n",
      "Epoch 35/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7174 - auc_roc: 0.7905 - val_loss: 0.5397 - val_binary_accuracy: 0.6995 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.52% - roc-auc_val: 76.09%                                                                                                    \n",
      "Epoch 36/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7178 - auc_roc: 0.7905 - val_loss: 0.5437 - val_binary_accuracy: 0.6862 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.50% - roc-auc_val: 75.82%                                                                                                    \n",
      "Epoch 37/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7183 - auc_roc: 0.7905 - val_loss: 0.5415 - val_binary_accuracy: 0.6955 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.43% - roc-auc_val: 76.06%                                                                                                    \n",
      "Epoch 38/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5441 - binary_accuracy: 0.7176 - auc_roc: 0.7905 - val_loss: 0.5493 - val_binary_accuracy: 0.6922 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.48% - roc-auc_val: 75.39%                                                                                                    \n",
      "Epoch 39/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7180 - auc_roc: 0.7905 - val_loss: 0.5592 - val_binary_accuracy: 0.6782 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.52% - roc-auc_val: 75.17%                                                                                                    \n",
      "Epoch 40/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7183 - auc_roc: 0.7905 - val_loss: 0.5417 - val_binary_accuracy: 0.6995 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.49% - roc-auc_val: 75.30%                                                                                                    \n",
      "Epoch 41/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5440 - binary_accuracy: 0.7182 - auc_roc: 0.7905 - val_loss: 0.5454 - val_binary_accuracy: 0.6983 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.53% - roc-auc_val: 75.58%                                                                                                    \n",
      "Epoch 42/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7187 - auc_roc: 0.7905 - val_loss: 0.5400 - val_binary_accuracy: 0.6967 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.49% - roc-auc_val: 75.17%                                                                                                    \n",
      "Epoch 43/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7171 - auc_roc: 0.7905 - val_loss: 0.5456 - val_binary_accuracy: 0.6910 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.49% - roc-auc_val: 75.10%                                                                                                    \n",
      "Epoch 44/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7181 - auc_roc: 0.7905 - val_loss: 0.5603 - val_binary_accuracy: 0.6882 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.51% - roc-auc_val: 74.05%                                                                                                    \n",
      "Epoch 45/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5441 - binary_accuracy: 0.7175 - auc_roc: 0.7905 - val_loss: 0.5492 - val_binary_accuracy: 0.6866 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.51% - roc-auc_val: 75.35%                                                                                                    \n",
      "Epoch 46/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7186 - auc_roc: 0.7905 - val_loss: 0.5465 - val_binary_accuracy: 0.6794 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.52% - roc-auc_val: 75.12%                                                                                                    \n",
      "Epoch 47/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7177 - auc_roc: 0.7905 - val_loss: 0.5539 - val_binary_accuracy: 0.6758 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.51% - roc-auc_val: 74.04%                                                                                                    \n",
      "Epoch 48/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5450 - binary_accuracy: 0.7178 - auc_roc: 0.7905 - val_loss: 0.5421 - val_binary_accuracy: 0.6866 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.48% - roc-auc_val: 74.72%                                                                                                    \n",
      "Epoch 49/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7183 - auc_roc: 0.7905 - val_loss: 0.5405 - val_binary_accuracy: 0.6983 - val_auc_roc: 0.7905\n",
      "roc-auc: 80.51% - roc-auc_val: 74.94%                                                                                                    \n",
      "Epoch 50/50\n",
      "216543/216543 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7182 - auc_roc: 0.7906 - val_loss: 0.5499 - val_binary_accuracy: 0.6922 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.43% - roc-auc_val: 74.38%                                                                                                    \n",
      "Train on 214054 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7180 - auc_roc: 0.7906 - val_loss: 0.5659 - val_binary_accuracy: 0.6697 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.47% - roc-auc_val: 74.42%                                                                                                    \n",
      "Epoch 2/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5438 - binary_accuracy: 0.7189 - auc_roc: 0.7906 - val_loss: 0.5786 - val_binary_accuracy: 0.6645 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.54% - roc-auc_val: 74.11%                                                                                                    \n",
      "Epoch 3/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5438 - binary_accuracy: 0.7190 - auc_roc: 0.7906 - val_loss: 0.5632 - val_binary_accuracy: 0.6657 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.58% - roc-auc_val: 74.19%                                                                                                    \n",
      "Epoch 4/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5448 - binary_accuracy: 0.7182 - auc_roc: 0.7906 - val_loss: 0.5609 - val_binary_accuracy: 0.6641 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.55% - roc-auc_val: 74.21%                                                                                                    \n",
      "Epoch 5/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7196 - auc_roc: 0.7906 - val_loss: 0.5728 - val_binary_accuracy: 0.6601 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.52% - roc-auc_val: 73.85%                                                                                                    \n",
      "Epoch 6/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5441 - binary_accuracy: 0.7192 - auc_roc: 0.7906 - val_loss: 0.5665 - val_binary_accuracy: 0.6641 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.56% - roc-auc_val: 73.38%                                                                                                    \n",
      "Epoch 7/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7193 - auc_roc: 0.7906 - val_loss: 0.5848 - val_binary_accuracy: 0.6601 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.53% - roc-auc_val: 73.56%                                                                                                    \n",
      "Epoch 8/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5436 - binary_accuracy: 0.7189 - auc_roc: 0.7906 - val_loss: 0.5878 - val_binary_accuracy: 0.6545 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.51% - roc-auc_val: 73.13%                                                                                                    \n",
      "Epoch 9/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7179 - auc_roc: 0.7906 - val_loss: 0.5847 - val_binary_accuracy: 0.6589 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.55% - roc-auc_val: 73.34%                                                                                                    \n",
      "Epoch 10/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7183 - auc_roc: 0.7906 - val_loss: 0.5828 - val_binary_accuracy: 0.6557 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.54% - roc-auc_val: 72.73%                                                                                                    \n",
      "Epoch 11/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5441 - binary_accuracy: 0.7184 - auc_roc: 0.7906 - val_loss: 0.5779 - val_binary_accuracy: 0.6585 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.55% - roc-auc_val: 73.19%                                                                                                    \n",
      "Epoch 12/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7188 - auc_roc: 0.7906 - val_loss: 0.5860 - val_binary_accuracy: 0.6565 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.59% - roc-auc_val: 73.08%                                                                                                    \n",
      "Epoch 13/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5436 - binary_accuracy: 0.7190 - auc_roc: 0.7906 - val_loss: 0.5957 - val_binary_accuracy: 0.6609 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.54% - roc-auc_val: 72.91%                                                                                                    \n",
      "Epoch 14/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7199 - auc_roc: 0.7906 - val_loss: 0.6153 - val_binary_accuracy: 0.6476 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.54% - roc-auc_val: 72.43%                                                                                                    \n",
      "Epoch 15/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7186 - auc_roc: 0.7906 - val_loss: 0.5850 - val_binary_accuracy: 0.6541 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.56% - roc-auc_val: 72.68%                                                                                                    \n",
      "Epoch 16/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5439 - binary_accuracy: 0.7193 - auc_roc: 0.7906 - val_loss: 0.5972 - val_binary_accuracy: 0.6605 - val_auc_roc: 0.7906\n",
      "roc-auc: 80.54% - roc-auc_val: 72.07%                                                                                                    \n",
      "Epoch 17/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5440 - binary_accuracy: 0.7183 - auc_roc: 0.7907 - val_loss: 0.5974 - val_binary_accuracy: 0.6541 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.54% - roc-auc_val: 72.04%                                                                                                    \n",
      "Epoch 18/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7183 - auc_roc: 0.7907 - val_loss: 0.6049 - val_binary_accuracy: 0.6569 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.54% - roc-auc_val: 72.78%                                                                                                    \n",
      "Epoch 19/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5440 - binary_accuracy: 0.7186 - auc_roc: 0.7907 - val_loss: 0.5975 - val_binary_accuracy: 0.6460 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.53% - roc-auc_val: 72.44%                                                                                                    \n",
      "Epoch 20/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7191 - auc_roc: 0.7907 - val_loss: 0.6145 - val_binary_accuracy: 0.6444 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.55% - roc-auc_val: 71.80%                                                                                                    \n",
      "Epoch 21/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7181 - auc_roc: 0.7907 - val_loss: 0.5888 - val_binary_accuracy: 0.6517 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.54% - roc-auc_val: 72.53%                                                                                                    \n",
      "Epoch 22/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5440 - binary_accuracy: 0.7189 - auc_roc: 0.7907 - val_loss: 0.5971 - val_binary_accuracy: 0.6513 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.65% - roc-auc_val: 72.56%                                                                                                    \n",
      "Epoch 23/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5438 - binary_accuracy: 0.7189 - auc_roc: 0.7907 - val_loss: 0.5968 - val_binary_accuracy: 0.6513 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.52% - roc-auc_val: 72.11%                                                                                                    \n",
      "Epoch 24/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5443 - binary_accuracy: 0.7182 - auc_roc: 0.7907 - val_loss: 0.6076 - val_binary_accuracy: 0.6464 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.52% - roc-auc_val: 72.26%                                                                                                    \n",
      "Epoch 25/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7193 - auc_roc: 0.7907 - val_loss: 0.5917 - val_binary_accuracy: 0.6501 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.61% - roc-auc_val: 72.05%                                                                                                    \n",
      "Epoch 26/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5445 - binary_accuracy: 0.7183 - auc_roc: 0.7907 - val_loss: 0.6034 - val_binary_accuracy: 0.6509 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.56% - roc-auc_val: 72.67%                                                                                                    \n",
      "Epoch 27/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5437 - binary_accuracy: 0.7184 - auc_roc: 0.7907 - val_loss: 0.6073 - val_binary_accuracy: 0.6517 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.53% - roc-auc_val: 72.28%                                                                                                    \n",
      "Epoch 28/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5440 - binary_accuracy: 0.7183 - auc_roc: 0.7907 - val_loss: 0.6140 - val_binary_accuracy: 0.6513 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.59% - roc-auc_val: 72.06%                                                                                                    \n",
      "Epoch 29/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5437 - binary_accuracy: 0.7191 - auc_roc: 0.7907 - val_loss: 0.5933 - val_binary_accuracy: 0.6525 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.58% - roc-auc_val: 72.31%                                                                                                    \n",
      "Epoch 30/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5441 - binary_accuracy: 0.7187 - auc_roc: 0.7907 - val_loss: 0.6046 - val_binary_accuracy: 0.6549 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.54% - roc-auc_val: 71.95%                                                                                                    \n",
      "Epoch 31/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5439 - binary_accuracy: 0.7188 - auc_roc: 0.7907 - val_loss: 0.6063 - val_binary_accuracy: 0.6452 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.59% - roc-auc_val: 71.46%                                                                                                    \n",
      "Epoch 32/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5439 - binary_accuracy: 0.7176 - auc_roc: 0.7907 - val_loss: 0.6139 - val_binary_accuracy: 0.6541 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.56% - roc-auc_val: 71.94%                                                                                                    \n",
      "Epoch 33/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7186 - auc_roc: 0.7907 - val_loss: 0.6128 - val_binary_accuracy: 0.6517 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.55% - roc-auc_val: 71.79%                                                                                                    \n",
      "Epoch 34/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5438 - binary_accuracy: 0.7189 - auc_roc: 0.7907 - val_loss: 0.6038 - val_binary_accuracy: 0.6529 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.56% - roc-auc_val: 72.16%                                                                                                    \n",
      "Epoch 35/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5444 - binary_accuracy: 0.7182 - auc_roc: 0.7907 - val_loss: 0.6257 - val_binary_accuracy: 0.6485 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.53% - roc-auc_val: 71.51%                                                                                                    \n",
      "Epoch 36/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5446 - binary_accuracy: 0.7190 - auc_roc: 0.7907 - val_loss: 0.6090 - val_binary_accuracy: 0.6476 - val_auc_roc: 0.7907\n",
      "roc-auc: 80.54% - roc-auc_val: 71.72%                                                                                                    \n",
      "Epoch 37/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7195 - auc_roc: 0.7907 - val_loss: 0.5944 - val_binary_accuracy: 0.6513 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.54% - roc-auc_val: 72.09%                                                                                                    \n",
      "Epoch 38/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7187 - auc_roc: 0.7908 - val_loss: 0.6165 - val_binary_accuracy: 0.6444 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.53% - roc-auc_val: 71.26%                                                                                                    \n",
      "Epoch 39/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7191 - auc_roc: 0.7908 - val_loss: 0.6289 - val_binary_accuracy: 0.6476 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.55% - roc-auc_val: 72.01%                                                                                                    \n",
      "Epoch 40/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5436 - binary_accuracy: 0.7190 - auc_roc: 0.7908 - val_loss: 0.6460 - val_binary_accuracy: 0.6444 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.53% - roc-auc_val: 71.30%                                                                                                    \n",
      "Epoch 41/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7192 - auc_roc: 0.7908 - val_loss: 0.6097 - val_binary_accuracy: 0.6396 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.53% - roc-auc_val: 70.91%                                                                                                    \n",
      "Epoch 42/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5439 - binary_accuracy: 0.7190 - auc_roc: 0.7908 - val_loss: 0.6178 - val_binary_accuracy: 0.6436 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.54% - roc-auc_val: 71.36%                                                                                                    \n",
      "Epoch 43/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5438 - binary_accuracy: 0.7185 - auc_roc: 0.7908 - val_loss: 0.6159 - val_binary_accuracy: 0.6376 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.59% - roc-auc_val: 70.97%                                                                                                    \n",
      "Epoch 44/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7194 - auc_roc: 0.7908 - val_loss: 0.6148 - val_binary_accuracy: 0.6404 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.58% - roc-auc_val: 71.53%                                                                                                    \n",
      "Epoch 45/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5440 - binary_accuracy: 0.7192 - auc_roc: 0.7908 - val_loss: 0.6456 - val_binary_accuracy: 0.6400 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.53% - roc-auc_val: 71.46%                                                                                                    \n",
      "Epoch 46/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5447 - binary_accuracy: 0.7193 - auc_roc: 0.7908 - val_loss: 0.6045 - val_binary_accuracy: 0.6432 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.57% - roc-auc_val: 71.14%                                                                                                    \n",
      "Epoch 47/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5435 - binary_accuracy: 0.7184 - auc_roc: 0.7908 - val_loss: 0.6246 - val_binary_accuracy: 0.6456 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.60% - roc-auc_val: 71.44%                                                                                                    \n",
      "Epoch 48/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5442 - binary_accuracy: 0.7179 - auc_roc: 0.7908 - val_loss: 0.6442 - val_binary_accuracy: 0.6537 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.55% - roc-auc_val: 71.62%                                                                                                    \n",
      "Epoch 49/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5441 - binary_accuracy: 0.7184 - auc_roc: 0.7908 - val_loss: 0.6691 - val_binary_accuracy: 0.6517 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.58% - roc-auc_val: 71.63%                                                                                                    \n",
      "Epoch 50/50\n",
      "214054/214054 [==============================] - 12s 56us/step - loss: 0.5436 - binary_accuracy: 0.7192 - auc_roc: 0.7908 - val_loss: 0.7108 - val_binary_accuracy: 0.6316 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.56% - roc-auc_val: 70.68%                                                                                                    \n",
      "Train on 211565 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5437 - binary_accuracy: 0.7201 - auc_roc: 0.7908 - val_loss: 0.5706 - val_binary_accuracy: 0.6573 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.57% - roc-auc_val: 73.10%                                                                                                    \n",
      "Epoch 2/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5437 - binary_accuracy: 0.7190 - auc_roc: 0.7908 - val_loss: 0.5697 - val_binary_accuracy: 0.6497 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.66% - roc-auc_val: 72.62%                                                                                                    \n",
      "Epoch 3/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7197 - auc_roc: 0.7908 - val_loss: 0.5706 - val_binary_accuracy: 0.6215 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.60% - roc-auc_val: 71.36%                                                                                                    \n",
      "Epoch 4/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7199 - auc_roc: 0.7908 - val_loss: 0.5691 - val_binary_accuracy: 0.6444 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.63% - roc-auc_val: 71.04%                                                                                                    \n",
      "Epoch 5/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5440 - binary_accuracy: 0.7206 - auc_roc: 0.7908 - val_loss: 0.5795 - val_binary_accuracy: 0.6384 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.58% - roc-auc_val: 70.79%                                                                                                    \n",
      "Epoch 6/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7203 - auc_roc: 0.7908 - val_loss: 0.5789 - val_binary_accuracy: 0.6276 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.62% - roc-auc_val: 70.45%                                                                                                    \n",
      "Epoch 7/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7199 - auc_roc: 0.7908 - val_loss: 0.5884 - val_binary_accuracy: 0.6231 - val_auc_roc: 0.7908\n",
      "roc-auc: 80.58% - roc-auc_val: 70.51%                                                                                                    \n",
      "Epoch 8/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7196 - auc_roc: 0.7909 - val_loss: 0.5756 - val_binary_accuracy: 0.6276 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.64% - roc-auc_val: 69.87%                                                                                                    \n",
      "Epoch 9/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7204 - auc_roc: 0.7909 - val_loss: 0.5755 - val_binary_accuracy: 0.6227 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.66% - roc-auc_val: 70.24%                                                                                                    \n",
      "Epoch 10/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5428 - binary_accuracy: 0.7196 - auc_roc: 0.7909 - val_loss: 0.5771 - val_binary_accuracy: 0.6336 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.67% - roc-auc_val: 70.26%                                                                                                    \n",
      "Epoch 11/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5435 - binary_accuracy: 0.7197 - auc_roc: 0.7909 - val_loss: 0.5848 - val_binary_accuracy: 0.6336 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.62% - roc-auc_val: 69.99%                                                                                                    \n",
      "Epoch 12/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5440 - binary_accuracy: 0.7194 - auc_roc: 0.7909 - val_loss: 0.5776 - val_binary_accuracy: 0.6256 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.67% - roc-auc_val: 69.45%                                                                                                    \n",
      "Epoch 13/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7208 - auc_roc: 0.7909 - val_loss: 0.5821 - val_binary_accuracy: 0.6103 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.67% - roc-auc_val: 68.48%                                                                                                    \n",
      "Epoch 14/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5429 - binary_accuracy: 0.7196 - auc_roc: 0.7909 - val_loss: 0.5800 - val_binary_accuracy: 0.6195 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.56% - roc-auc_val: 69.52%                                                                                                    \n",
      "Epoch 15/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7200 - auc_roc: 0.7909 - val_loss: 0.5887 - val_binary_accuracy: 0.6119 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.61% - roc-auc_val: 69.54%                                                                                                    \n",
      "Epoch 16/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5435 - binary_accuracy: 0.7199 - auc_roc: 0.7909 - val_loss: 0.5942 - val_binary_accuracy: 0.6123 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.62% - roc-auc_val: 68.65%                                                                                                    \n",
      "Epoch 17/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7193 - auc_roc: 0.7909 - val_loss: 0.6013 - val_binary_accuracy: 0.6099 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.63% - roc-auc_val: 68.49%                                                                                                    \n",
      "Epoch 18/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7197 - auc_roc: 0.7909 - val_loss: 0.5883 - val_binary_accuracy: 0.6107 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.64% - roc-auc_val: 68.65%                                                                                                    \n",
      "Epoch 19/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7200 - auc_roc: 0.7909 - val_loss: 0.5997 - val_binary_accuracy: 0.6027 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.61% - roc-auc_val: 68.13%                                                                                                    \n",
      "Epoch 20/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7198 - auc_roc: 0.7909 - val_loss: 0.6106 - val_binary_accuracy: 0.6171 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.61% - roc-auc_val: 69.18%                                                                                                    \n",
      "Epoch 21/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7206 - auc_roc: 0.7909 - val_loss: 0.5900 - val_binary_accuracy: 0.6035 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.54% - roc-auc_val: 69.09%                                                                                                    \n",
      "Epoch 22/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5436 - binary_accuracy: 0.7199 - auc_roc: 0.7909 - val_loss: 0.5976 - val_binary_accuracy: 0.6087 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.63% - roc-auc_val: 68.65%                                                                                                    \n",
      "Epoch 23/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7199 - auc_roc: 0.7909 - val_loss: 0.5950 - val_binary_accuracy: 0.6091 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.63% - roc-auc_val: 68.89%                                                                                                    \n",
      "Epoch 24/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7205 - auc_roc: 0.7909 - val_loss: 0.6008 - val_binary_accuracy: 0.6115 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.57% - roc-auc_val: 68.62%                                                                                                    \n",
      "Epoch 25/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5438 - binary_accuracy: 0.7192 - auc_roc: 0.7909 - val_loss: 0.5959 - val_binary_accuracy: 0.6095 - val_auc_roc: 0.7909\n",
      "roc-auc: 80.66% - roc-auc_val: 68.35%                                                                                                    \n",
      "Epoch 26/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7198 - auc_roc: 0.7910 - val_loss: 0.5992 - val_binary_accuracy: 0.6099 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.62% - roc-auc_val: 68.69%                                                                                                    \n",
      "Epoch 27/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5427 - binary_accuracy: 0.7197 - auc_roc: 0.7910 - val_loss: 0.5989 - val_binary_accuracy: 0.6075 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.63% - roc-auc_val: 68.22%                                                                                                    \n",
      "Epoch 28/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7199 - auc_roc: 0.7910 - val_loss: 0.6195 - val_binary_accuracy: 0.6039 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.63% - roc-auc_val: 68.29%                                                                                                    \n",
      "Epoch 29/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7204 - auc_roc: 0.7910 - val_loss: 0.5938 - val_binary_accuracy: 0.6027 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.61% - roc-auc_val: 67.37%                                                                                                    \n",
      "Epoch 30/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7199 - auc_roc: 0.7910 - val_loss: 0.6162 - val_binary_accuracy: 0.6127 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.56% - roc-auc_val: 67.86%                                                                                                    \n",
      "Epoch 31/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7195 - auc_roc: 0.7910 - val_loss: 0.5966 - val_binary_accuracy: 0.6075 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.58% - roc-auc_val: 67.98%                                                                                                    \n",
      "Epoch 32/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7201 - auc_roc: 0.7910 - val_loss: 0.5984 - val_binary_accuracy: 0.5978 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.56% - roc-auc_val: 68.29%                                                                                                    \n",
      "Epoch 33/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7201 - auc_roc: 0.7910 - val_loss: 0.6438 - val_binary_accuracy: 0.6123 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.60% - roc-auc_val: 68.46%                                                                                                    \n",
      "Epoch 34/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7195 - auc_roc: 0.7910 - val_loss: 0.6018 - val_binary_accuracy: 0.6083 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.65% - roc-auc_val: 67.48%                                                                                                    \n",
      "Epoch 35/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7195 - auc_roc: 0.7910 - val_loss: 0.6353 - val_binary_accuracy: 0.6071 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.69% - roc-auc_val: 66.95%                                                                                                    \n",
      "Epoch 36/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7196 - auc_roc: 0.7910 - val_loss: 0.6243 - val_binary_accuracy: 0.6071 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.55% - roc-auc_val: 67.79%                                                                                                    \n",
      "Epoch 37/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7195 - auc_roc: 0.7910 - val_loss: 0.6199 - val_binary_accuracy: 0.6014 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.66% - roc-auc_val: 68.09%                                                                                                    \n",
      "Epoch 38/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7196 - auc_roc: 0.7910 - val_loss: 0.6595 - val_binary_accuracy: 0.6022 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.60% - roc-auc_val: 67.36%                                                                                                    \n",
      "Epoch 39/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7197 - auc_roc: 0.7910 - val_loss: 0.6253 - val_binary_accuracy: 0.6006 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.67% - roc-auc_val: 67.45%                                                                                                    \n",
      "Epoch 40/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5435 - binary_accuracy: 0.7198 - auc_roc: 0.7910 - val_loss: 0.6337 - val_binary_accuracy: 0.6071 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.64% - roc-auc_val: 67.13%                                                                                                    \n",
      "Epoch 41/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5427 - binary_accuracy: 0.7205 - auc_roc: 0.7910 - val_loss: 0.6444 - val_binary_accuracy: 0.6006 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.62% - roc-auc_val: 67.16%                                                                                                    \n",
      "Epoch 42/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7210 - auc_roc: 0.7910 - val_loss: 0.6306 - val_binary_accuracy: 0.5886 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.61% - roc-auc_val: 67.40%                                                                                                    \n",
      "Epoch 43/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5426 - binary_accuracy: 0.7216 - auc_roc: 0.7910 - val_loss: 0.6175 - val_binary_accuracy: 0.5870 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.72% - roc-auc_val: 67.74%                                                                                                    \n",
      "Epoch 44/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5429 - binary_accuracy: 0.7196 - auc_roc: 0.7910 - val_loss: 0.6647 - val_binary_accuracy: 0.5954 - val_auc_roc: 0.7910\n",
      "roc-auc: 80.65% - roc-auc_val: 66.81%                                                                                                    \n",
      "Epoch 45/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5427 - binary_accuracy: 0.7208 - auc_roc: 0.7910 - val_loss: 0.6504 - val_binary_accuracy: 0.5930 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.57% - roc-auc_val: 66.78%                                                                                                    \n",
      "Epoch 46/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7203 - auc_roc: 0.7911 - val_loss: 0.6756 - val_binary_accuracy: 0.6014 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.61% - roc-auc_val: 66.92%                                                                                                    \n",
      "Epoch 47/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7204 - auc_roc: 0.7911 - val_loss: 0.6625 - val_binary_accuracy: 0.5994 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.70% - roc-auc_val: 65.76%                                                                                                    \n",
      "Epoch 48/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7200 - auc_roc: 0.7911 - val_loss: 0.6882 - val_binary_accuracy: 0.5958 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.63% - roc-auc_val: 65.91%                                                                                                    \n",
      "Epoch 49/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7203 - auc_roc: 0.7911 - val_loss: 0.7204 - val_binary_accuracy: 0.5974 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.68% - roc-auc_val: 66.29%                                                                                                    \n",
      "Epoch 50/50\n",
      "211565/211565 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7207 - auc_roc: 0.7911 - val_loss: 0.6961 - val_binary_accuracy: 0.6031 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.61% - roc-auc_val: 66.81%                                                                                                    \n",
      "Train on 209076 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7191 - auc_roc: 0.7911 - val_loss: 0.5067 - val_binary_accuracy: 0.7180 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.69% - roc-auc_val: 81.26%                                                                                                    \n",
      "Epoch 2/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5439 - binary_accuracy: 0.7201 - auc_roc: 0.7911 - val_loss: 0.5061 - val_binary_accuracy: 0.7172 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.61% - roc-auc_val: 80.78%                                                                                                    \n",
      "Epoch 3/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5429 - binary_accuracy: 0.7193 - auc_roc: 0.7911 - val_loss: 0.5147 - val_binary_accuracy: 0.7168 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.66% - roc-auc_val: 80.71%                                                                                                    \n",
      "Epoch 4/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7199 - auc_roc: 0.7911 - val_loss: 0.5162 - val_binary_accuracy: 0.7160 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.65% - roc-auc_val: 80.27%                                                                                                    \n",
      "Epoch 5/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7202 - auc_roc: 0.7911 - val_loss: 0.5187 - val_binary_accuracy: 0.7135 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.62% - roc-auc_val: 80.22%                                                                                                    \n",
      "Epoch 6/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5438 - binary_accuracy: 0.7200 - auc_roc: 0.7911 - val_loss: 0.5255 - val_binary_accuracy: 0.7143 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.68% - roc-auc_val: 79.52%                                                                                                    \n",
      "Epoch 7/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7207 - auc_roc: 0.7911 - val_loss: 0.5206 - val_binary_accuracy: 0.7139 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.68% - roc-auc_val: 79.74%                                                                                                    \n",
      "Epoch 8/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7195 - auc_roc: 0.7911 - val_loss: 0.5301 - val_binary_accuracy: 0.7151 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.67% - roc-auc_val: 79.03%                                                                                                    \n",
      "Epoch 9/50\n",
      "209076/209076 [==============================] - 12s 55us/step - loss: 0.5438 - binary_accuracy: 0.7194 - auc_roc: 0.7911 - val_loss: 0.5221 - val_binary_accuracy: 0.7127 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.64% - roc-auc_val: 79.38%                                                                                                    \n",
      "Epoch 10/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7206 - auc_roc: 0.7911 - val_loss: 0.5313 - val_binary_accuracy: 0.7115 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.71% - roc-auc_val: 78.42%                                                                                                    \n",
      "Epoch 11/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7203 - auc_roc: 0.7911 - val_loss: 0.5346 - val_binary_accuracy: 0.7075 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.67% - roc-auc_val: 78.17%                                                                                                    \n",
      "Epoch 12/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7205 - auc_roc: 0.7911 - val_loss: 0.5315 - val_binary_accuracy: 0.7047 - val_auc_roc: 0.7911\n",
      "roc-auc: 80.59% - roc-auc_val: 78.37%                                                                                                    \n",
      "Epoch 13/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5437 - binary_accuracy: 0.7199 - auc_roc: 0.7912 - val_loss: 0.5344 - val_binary_accuracy: 0.6983 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.64% - roc-auc_val: 78.51%                                                                                                    \n",
      "Epoch 14/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7203 - auc_roc: 0.7912 - val_loss: 0.5390 - val_binary_accuracy: 0.7051 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.71% - roc-auc_val: 77.83%                                                                                                    \n",
      "Epoch 15/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5435 - binary_accuracy: 0.7207 - auc_roc: 0.7912 - val_loss: 0.5413 - val_binary_accuracy: 0.6991 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.64% - roc-auc_val: 78.07%                                                                                                    \n",
      "Epoch 16/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5429 - binary_accuracy: 0.7196 - auc_roc: 0.7912 - val_loss: 0.5528 - val_binary_accuracy: 0.6999 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.63% - roc-auc_val: 76.95%                                                                                                    \n",
      "Epoch 17/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7205 - auc_roc: 0.7912 - val_loss: 0.5406 - val_binary_accuracy: 0.7027 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.68% - roc-auc_val: 77.83%                                                                                                    \n",
      "Epoch 18/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5437 - binary_accuracy: 0.7201 - auc_roc: 0.7912 - val_loss: 0.5447 - val_binary_accuracy: 0.6967 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.69% - roc-auc_val: 77.04%                                                                                                    \n",
      "Epoch 19/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7209 - auc_roc: 0.7912 - val_loss: 0.5509 - val_binary_accuracy: 0.7051 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.72% - roc-auc_val: 76.51%                                                                                                    \n",
      "Epoch 20/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7198 - auc_roc: 0.7912 - val_loss: 0.5706 - val_binary_accuracy: 0.6967 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.68% - roc-auc_val: 75.26%                                                                                                    \n",
      "Epoch 21/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7205 - auc_roc: 0.7912 - val_loss: 0.6411 - val_binary_accuracy: 0.6119 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.67% - roc-auc_val: 66.81%                                                                                                    \n",
      "Epoch 22/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7203 - auc_roc: 0.7912 - val_loss: 0.6098 - val_binary_accuracy: 0.6774 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.68% - roc-auc_val: 72.54%                                                                                                    \n",
      "Epoch 23/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7203 - auc_roc: 0.7912 - val_loss: 0.5641 - val_binary_accuracy: 0.7003 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.68% - roc-auc_val: 75.76%                                                                                                    \n",
      "Epoch 24/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7198 - auc_roc: 0.7912 - val_loss: 0.6370 - val_binary_accuracy: 0.6095 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.68% - roc-auc_val: 68.82%                                                                                                    \n",
      "Epoch 25/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5438 - binary_accuracy: 0.7201 - auc_roc: 0.7912 - val_loss: 0.6322 - val_binary_accuracy: 0.6143 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.74% - roc-auc_val: 69.04%                                                                                                    \n",
      "Epoch 26/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7199 - auc_roc: 0.7912 - val_loss: 0.6358 - val_binary_accuracy: 0.6111 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.61% - roc-auc_val: 68.03%                                                                                                    \n",
      "Epoch 27/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5435 - binary_accuracy: 0.7202 - auc_roc: 0.7912 - val_loss: 0.5951 - val_binary_accuracy: 0.6826 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.66% - roc-auc_val: 74.70%                                                                                                    \n",
      "Epoch 28/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7197 - auc_roc: 0.7912 - val_loss: 0.5644 - val_binary_accuracy: 0.6951 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.71% - roc-auc_val: 75.47%                                                                                                    \n",
      "Epoch 29/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5435 - binary_accuracy: 0.7206 - auc_roc: 0.7912 - val_loss: 0.5651 - val_binary_accuracy: 0.6926 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.63% - roc-auc_val: 75.03%                                                                                                    \n",
      "Epoch 30/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7216 - auc_roc: 0.7912 - val_loss: 0.6008 - val_binary_accuracy: 0.6882 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.67% - roc-auc_val: 73.51%                                                                                                    \n",
      "Epoch 31/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7194 - auc_roc: 0.7912 - val_loss: 0.6457 - val_binary_accuracy: 0.6051 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.61% - roc-auc_val: 67.57%                                                                                                    \n",
      "Epoch 32/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5432 - binary_accuracy: 0.7212 - auc_roc: 0.7912 - val_loss: 0.5895 - val_binary_accuracy: 0.6842 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.73% - roc-auc_val: 73.87%                                                                                                    \n",
      "Epoch 33/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7212 - auc_roc: 0.7912 - val_loss: 0.6543 - val_binary_accuracy: 0.5870 - val_auc_roc: 0.7912\n",
      "roc-auc: 80.65% - roc-auc_val: 63.89%                                                                                                    \n",
      "Epoch 34/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5435 - binary_accuracy: 0.7202 - auc_roc: 0.7913 - val_loss: 0.6065 - val_binary_accuracy: 0.6742 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.74% - roc-auc_val: 73.14%                                                                                                    \n",
      "Epoch 35/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7204 - auc_roc: 0.7913 - val_loss: 0.6553 - val_binary_accuracy: 0.5950 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.66% - roc-auc_val: 66.00%                                                                                                    \n",
      "Epoch 36/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5428 - binary_accuracy: 0.7204 - auc_roc: 0.7913 - val_loss: 0.6513 - val_binary_accuracy: 0.6079 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.65% - roc-auc_val: 64.17%                                                                                                    \n",
      "Epoch 37/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5429 - binary_accuracy: 0.7206 - auc_roc: 0.7913 - val_loss: 0.6574 - val_binary_accuracy: 0.5942 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.68% - roc-auc_val: 65.50%                                                                                                    \n",
      "Epoch 38/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7204 - auc_roc: 0.7913 - val_loss: 0.6498 - val_binary_accuracy: 0.5894 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.74% - roc-auc_val: 64.83%                                                                                                    \n",
      "Epoch 39/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5427 - binary_accuracy: 0.7201 - auc_roc: 0.7913 - val_loss: 0.6612 - val_binary_accuracy: 0.5950 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.66% - roc-auc_val: 63.33%                                                                                                    \n",
      "Epoch 40/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7206 - auc_roc: 0.7913 - val_loss: 0.6467 - val_binary_accuracy: 0.6002 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.72% - roc-auc_val: 65.94%                                                                                                    \n",
      "Epoch 41/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5435 - binary_accuracy: 0.7200 - auc_roc: 0.7913 - val_loss: 0.6228 - val_binary_accuracy: 0.6738 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.67% - roc-auc_val: 71.15%                                                                                                    \n",
      "Epoch 42/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7200 - auc_roc: 0.7913 - val_loss: 0.6591 - val_binary_accuracy: 0.6035 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.66% - roc-auc_val: 64.07%                                                                                                    \n",
      "Epoch 43/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7201 - auc_roc: 0.7913 - val_loss: 0.6088 - val_binary_accuracy: 0.6714 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.68% - roc-auc_val: 72.55%                                                                                                    \n",
      "Epoch 44/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7198 - auc_roc: 0.7913 - val_loss: 0.6608 - val_binary_accuracy: 0.6018 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.62% - roc-auc_val: 63.66%                                                                                                    \n",
      "Epoch 45/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7202 - auc_roc: 0.7913 - val_loss: 0.6269 - val_binary_accuracy: 0.6714 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.68% - roc-auc_val: 70.58%                                                                                                    \n",
      "Epoch 46/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5438 - binary_accuracy: 0.7204 - auc_roc: 0.7913 - val_loss: 0.6284 - val_binary_accuracy: 0.6657 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.63% - roc-auc_val: 70.55%                                                                                                    \n",
      "Epoch 47/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7199 - auc_roc: 0.7913 - val_loss: 0.6628 - val_binary_accuracy: 0.5870 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.68% - roc-auc_val: 62.12%                                                                                                    \n",
      "Epoch 48/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5427 - binary_accuracy: 0.7203 - auc_roc: 0.7913 - val_loss: 0.6564 - val_binary_accuracy: 0.6135 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.68% - roc-auc_val: 64.24%                                                                                                    \n",
      "Epoch 49/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7206 - auc_roc: 0.7913 - val_loss: 0.6659 - val_binary_accuracy: 0.5962 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.69% - roc-auc_val: 62.63%                                                                                                    \n",
      "Epoch 50/50\n",
      "209076/209076 [==============================] - 12s 56us/step - loss: 0.5423 - binary_accuracy: 0.7205 - auc_roc: 0.7913 - val_loss: 0.6345 - val_binary_accuracy: 0.6179 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.69% - roc-auc_val: 69.34%                                                                                                    \n",
      "Train on 206587 samples, validate on 2489 samples\n",
      "Epoch 1/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5428 - binary_accuracy: 0.7205 - auc_roc: 0.7913 - val_loss: 0.5526 - val_binary_accuracy: 0.6818 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.75% - roc-auc_val: 76.33%                                                                                                    \n",
      "Epoch 2/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7211 - auc_roc: 0.7913 - val_loss: 0.5594 - val_binary_accuracy: 0.6862 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.68% - roc-auc_val: 76.11%                                                                                                    \n",
      "Epoch 3/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "206587/206587 [==============================] - 11s 55us/step - loss: 0.5429 - binary_accuracy: 0.7204 - auc_roc: 0.7913 - val_loss: 0.5697 - val_binary_accuracy: 0.6798 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.70% - roc-auc_val: 75.71%                                                                                                    \n",
      "Epoch 4/50\n",
      "206587/206587 [==============================] - 11s 56us/step - loss: 0.5427 - binary_accuracy: 0.7204 - auc_roc: 0.7913 - val_loss: 0.5649 - val_binary_accuracy: 0.6778 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.72% - roc-auc_val: 75.54%                                                                                                    \n",
      "Epoch 5/50\n",
      "206587/206587 [==============================] - 11s 55us/step - loss: 0.5430 - binary_accuracy: 0.7210 - auc_roc: 0.7913 - val_loss: 0.5623 - val_binary_accuracy: 0.6693 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.75% - roc-auc_val: 75.38%                                                                                                    \n",
      "Epoch 6/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5427 - binary_accuracy: 0.7203 - auc_roc: 0.7913 - val_loss: 0.5738 - val_binary_accuracy: 0.6782 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.70% - roc-auc_val: 75.54%                                                                                                    \n",
      "Epoch 7/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7209 - auc_roc: 0.7913 - val_loss: 0.5708 - val_binary_accuracy: 0.6778 - val_auc_roc: 0.7913\n",
      "roc-auc: 80.73% - roc-auc_val: 75.25%                                                                                                    \n",
      "Epoch 8/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5431 - binary_accuracy: 0.7213 - auc_roc: 0.7913 - val_loss: 0.5855 - val_binary_accuracy: 0.6697 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.69% - roc-auc_val: 74.93%                                                                                                    \n",
      "Epoch 9/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5418 - binary_accuracy: 0.7212 - auc_roc: 0.7914 - val_loss: 0.5701 - val_binary_accuracy: 0.6798 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.71% - roc-auc_val: 75.34%                                                                                                    \n",
      "Epoch 10/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5433 - binary_accuracy: 0.7205 - auc_roc: 0.7914 - val_loss: 0.5726 - val_binary_accuracy: 0.6677 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.76% - roc-auc_val: 74.81%                                                                                                    \n",
      "Epoch 11/50\n",
      "206587/206587 [==============================] - 11s 56us/step - loss: 0.5425 - binary_accuracy: 0.7209 - auc_roc: 0.7914 - val_loss: 0.5828 - val_binary_accuracy: 0.6738 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.71% - roc-auc_val: 75.17%                                                                                                    \n",
      "Epoch 12/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5429 - binary_accuracy: 0.7211 - auc_roc: 0.7914 - val_loss: 0.5724 - val_binary_accuracy: 0.6693 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.73% - roc-auc_val: 74.84%                                                                                                    \n",
      "Epoch 13/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5426 - binary_accuracy: 0.7218 - auc_roc: 0.7914 - val_loss: 0.5878 - val_binary_accuracy: 0.6738 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.70% - roc-auc_val: 75.11%                                                                                                    \n",
      "Epoch 14/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5426 - binary_accuracy: 0.7212 - auc_roc: 0.7914 - val_loss: 0.5736 - val_binary_accuracy: 0.6701 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.72% - roc-auc_val: 75.02%                                                                                                    \n",
      "Epoch 15/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5426 - binary_accuracy: 0.7206 - auc_roc: 0.7914 - val_loss: 0.5959 - val_binary_accuracy: 0.6665 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.72% - roc-auc_val: 74.63%                                                                                                    \n",
      "Epoch 16/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5424 - binary_accuracy: 0.7206 - auc_roc: 0.7914 - val_loss: 0.5838 - val_binary_accuracy: 0.6750 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.80% - roc-auc_val: 74.90%                                                                                                    \n",
      "Epoch 17/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5429 - binary_accuracy: 0.7201 - auc_roc: 0.7914 - val_loss: 0.5821 - val_binary_accuracy: 0.6545 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.75% - roc-auc_val: 74.39%                                                                                                    \n",
      "Epoch 18/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5416 - binary_accuracy: 0.7212 - auc_roc: 0.7914 - val_loss: 0.5871 - val_binary_accuracy: 0.6533 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.72% - roc-auc_val: 74.81%                                                                                                    \n",
      "Epoch 19/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5425 - binary_accuracy: 0.7203 - auc_roc: 0.7914 - val_loss: 0.5848 - val_binary_accuracy: 0.6589 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.72% - roc-auc_val: 74.63%                                                                                                    \n",
      "Epoch 20/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5425 - binary_accuracy: 0.7203 - auc_roc: 0.7914 - val_loss: 0.5890 - val_binary_accuracy: 0.6541 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.76% - roc-auc_val: 74.05%                                                                                                    \n",
      "Epoch 21/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5422 - binary_accuracy: 0.7203 - auc_roc: 0.7914 - val_loss: 0.5831 - val_binary_accuracy: 0.6722 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.77% - roc-auc_val: 74.60%                                                                                                    \n",
      "Epoch 22/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5427 - binary_accuracy: 0.7212 - auc_roc: 0.7914 - val_loss: 0.5889 - val_binary_accuracy: 0.6505 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.77% - roc-auc_val: 73.99%                                                                                                    \n",
      "Epoch 23/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5421 - binary_accuracy: 0.7207 - auc_roc: 0.7914 - val_loss: 0.5911 - val_binary_accuracy: 0.6468 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.75% - roc-auc_val: 73.93%                                                                                                    \n",
      "Epoch 24/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5418 - binary_accuracy: 0.7211 - auc_roc: 0.7914 - val_loss: 0.5919 - val_binary_accuracy: 0.6509 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.74% - roc-auc_val: 73.70%                                                                                                    \n",
      "Epoch 25/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5429 - binary_accuracy: 0.7206 - auc_roc: 0.7914 - val_loss: 0.5989 - val_binary_accuracy: 0.6621 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.70% - roc-auc_val: 73.64%                                                                                                    \n",
      "Epoch 26/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5425 - binary_accuracy: 0.7210 - auc_roc: 0.7914 - val_loss: 0.5931 - val_binary_accuracy: 0.6408 - val_auc_roc: 0.7914\n",
      "roc-auc: 80.71% - roc-auc_val: 73.32%                                                                                                    \n",
      "Epoch 27/50\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "206587/206587 [==============================] - 11s 56us/step - loss: 0.5424 - binary_accuracy: 0.7218 - auc_roc: 0.7914 - val_loss: 0.5871 - val_binary_accuracy: 0.6521 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.77% - roc-auc_val: 73.69%                                                                                                    \n",
      "Epoch 28/50\n",
      "206587/206587 [==============================] - 11s 56us/step - loss: 0.5434 - binary_accuracy: 0.7213 - auc_roc: 0.7915 - val_loss: 0.5938 - val_binary_accuracy: 0.6581 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.75% - roc-auc_val: 72.85%                                                                                                    \n",
      "Epoch 29/50\n",
      "206587/206587 [==============================] - 11s 56us/step - loss: 0.5432 - binary_accuracy: 0.7204 - auc_roc: 0.7915 - val_loss: 0.5974 - val_binary_accuracy: 0.6501 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.74% - roc-auc_val: 73.52%                                                                                                    \n",
      "Epoch 30/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5426 - binary_accuracy: 0.7211 - auc_roc: 0.7915 - val_loss: 0.5948 - val_binary_accuracy: 0.6376 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.78% - roc-auc_val: 71.82%                                                                                                    \n",
      "Epoch 31/50\n",
      "206587/206587 [==============================] - 11s 56us/step - loss: 0.5423 - binary_accuracy: 0.7213 - auc_roc: 0.7915 - val_loss: 0.5899 - val_binary_accuracy: 0.6416 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.77% - roc-auc_val: 72.75%                                                                                                    \n",
      "Epoch 32/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5426 - binary_accuracy: 0.7207 - auc_roc: 0.7915 - val_loss: 0.5954 - val_binary_accuracy: 0.6505 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.75% - roc-auc_val: 72.82%                                                                                                    \n",
      "Epoch 33/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5429 - binary_accuracy: 0.7210 - auc_roc: 0.7915 - val_loss: 0.5896 - val_binary_accuracy: 0.6489 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.71% - roc-auc_val: 72.92%                                                                                                    \n",
      "Epoch 34/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5428 - binary_accuracy: 0.7205 - auc_roc: 0.7915 - val_loss: 0.5953 - val_binary_accuracy: 0.6472 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.72% - roc-auc_val: 73.09%                                                                                                    \n",
      "Epoch 35/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5420 - binary_accuracy: 0.7212 - auc_roc: 0.7915 - val_loss: 0.5969 - val_binary_accuracy: 0.6501 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.80% - roc-auc_val: 73.23%                                                                                                    \n",
      "Epoch 36/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5421 - binary_accuracy: 0.7210 - auc_roc: 0.7915 - val_loss: 0.5936 - val_binary_accuracy: 0.6424 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.76% - roc-auc_val: 72.91%                                                                                                    \n",
      "Epoch 37/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5428 - binary_accuracy: 0.7206 - auc_roc: 0.7915 - val_loss: 0.5920 - val_binary_accuracy: 0.6577 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.79% - roc-auc_val: 73.01%                                                                                                    \n",
      "Epoch 38/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5422 - binary_accuracy: 0.7214 - auc_roc: 0.7915 - val_loss: 0.5921 - val_binary_accuracy: 0.6493 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.71% - roc-auc_val: 72.42%                                                                                                    \n",
      "Epoch 39/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5426 - binary_accuracy: 0.7207 - auc_roc: 0.7915 - val_loss: 0.5979 - val_binary_accuracy: 0.6388 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.72% - roc-auc_val: 72.30%                                                                                                    \n",
      "Epoch 40/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5424 - binary_accuracy: 0.7213 - auc_roc: 0.7915 - val_loss: 0.5910 - val_binary_accuracy: 0.6440 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.72% - roc-auc_val: 72.70%                                                                                                    \n",
      "Epoch 41/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5425 - binary_accuracy: 0.7201 - auc_roc: 0.7915 - val_loss: 0.5939 - val_binary_accuracy: 0.6485 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.72% - roc-auc_val: 72.49%                                                                                                    \n",
      "Epoch 42/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5423 - binary_accuracy: 0.7206 - auc_roc: 0.7915 - val_loss: 0.5886 - val_binary_accuracy: 0.6549 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.76% - roc-auc_val: 73.02%                                                                                                    \n",
      "Epoch 43/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5424 - binary_accuracy: 0.7208 - auc_roc: 0.7915 - val_loss: 0.5926 - val_binary_accuracy: 0.6444 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.76% - roc-auc_val: 72.61%                                                                                                    \n",
      "Epoch 44/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5421 - binary_accuracy: 0.7206 - auc_roc: 0.7915 - val_loss: 0.5886 - val_binary_accuracy: 0.6513 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.74% - roc-auc_val: 72.78%                                                                                                    \n",
      "Epoch 45/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5420 - binary_accuracy: 0.7217 - auc_roc: 0.7915 - val_loss: 0.5933 - val_binary_accuracy: 0.6420 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.79% - roc-auc_val: 72.75%                                                                                                    \n",
      "Epoch 46/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7203 - auc_roc: 0.7915 - val_loss: 0.5891 - val_binary_accuracy: 0.6448 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.77% - roc-auc_val: 72.88%                                                                                                    \n",
      "Epoch 47/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5423 - binary_accuracy: 0.7213 - auc_roc: 0.7915 - val_loss: 0.5888 - val_binary_accuracy: 0.6472 - val_auc_roc: 0.7915\n",
      "roc-auc: 80.73% - roc-auc_val: 73.00%                                                                                                    \n",
      "Epoch 48/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5428 - binary_accuracy: 0.7213 - auc_roc: 0.7915 - val_loss: 0.5913 - val_binary_accuracy: 0.6468 - val_auc_roc: 0.7916\n",
      "roc-auc: 80.75% - roc-auc_val: 72.68%                                                                                                    \n",
      "Epoch 49/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5434 - binary_accuracy: 0.7209 - auc_roc: 0.7916 - val_loss: 0.5949 - val_binary_accuracy: 0.6464 - val_auc_roc: 0.7916\n",
      "roc-auc: 80.75% - roc-auc_val: 72.80%                                                                                                    \n",
      "Epoch 50/50\n",
      "206587/206587 [==============================] - 12s 56us/step - loss: 0.5430 - binary_accuracy: 0.7212 - auc_roc: 0.7916 - val_loss: 0.5925 - val_binary_accuracy: 0.6456 - val_auc_roc: 0.7916\n",
      "roc-auc: 80.76% - roc-auc_val: 72.47%                                                                                                    \n"
     ]
    }
   ],
   "source": [
    "for fold, (train_idx, test_idx) in enumerate(cv.split(data)):\n",
    "    checkpointer = ModelCheckpoint('models/weights.{}.hdf5'.format(fold),\n",
    "                                   monitor='val_loss',\n",
    "                                   verbose=0,\n",
    "                                   save_best_only=True,\n",
    "                                   save_weights_only=False,\n",
    "                                   mode='auto',\n",
    "                                   period=1)\n",
    "    tensorboard = TensorBoard(log_dir='./logs/{}'.format(fold),\n",
    "                              histogram_freq=1,\n",
    "                              batch_size=32,\n",
    "                              write_graph=True,\n",
    "                              write_grads=True,\n",
    "                              update_freq='epoch')\n",
    "    X_train = features.iloc[train_idx]\n",
    "    X_test = features.iloc[test_idx]\n",
    "    y_train = label.iloc[train_idx]\n",
    "    y_test = label.iloc[test_idx]\n",
    "\n",
    "    training = model.fit(X_train,\n",
    "                         y_train,\n",
    "                         batch_size=32,\n",
    "                         epochs=50,\n",
    "                         verbose=1,\n",
    "                         validation_data=(X_test, y_test),\n",
    "                         callbacks=[checkpointer,\n",
    "                                    tensorboard,\n",
    "                                    early_stopping,\n",
    "                                    auc_callback(training_data=(X_train, y_train),\n",
    "                                                 validation_data=(X_test, y_test))])\n",
    "    history = pd.concat([history, pd.DataFrame(training.history).assign(fold=fold)])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {},
   "outputs": [],
   "source": [
    "scores, preds = {}, {}\n",
    "for fold, (train_idx, test_idx) in enumerate(cv.split(data)):\n",
    "    model = load_model(f'models/weights.{fold}.hdf5', custom_objects={'auc_roc': auc_roc})\n",
    "    X_test = features.iloc[test_idx]\n",
    "    month = X_test.index[0].month\n",
    "    preds[month] = model.predict(X_test)\n",
    "    scores[month] = roc_auc_score(y_score=preds[month], y_true=label.iloc[test_idx])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAD/CAYAAAAKVJb/AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAD4lJREFUeJzt3X+QXXV5x/H3Q9KgCIJtVqwkIdQGJSoVuw1MqRYENEAbaqst0B/YUjKdijiFaSctHcrQaRtpO4zTxmkzIqVMFUGrphoNKuKoLZgAwZCE1BAp2TJopEhbqWLw6R/34Fwum+zZ3XPZzcP7NZPJOd/zPed5djf55Nxzzz2JzESSVMtBM92AJKl7hrskFWS4S1JBhrskFWS4S1JBhrskFWS4S1JBhrskFWS4S1JBhrskFTR3pgrPnz8/Fy9ePFPlJemAdOedd34zM0cmmjdj4b548WI2bdo0U+Ul6YAUEf/RZp6XZSSpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgqasQ8xSZp9Pnvry6a032lvuL/jTjRdnrlLUkGGuyQVZLhLUkGGuyQVZLhLUkGGuyQVZLhLUkGGuyQVZLhLUkGGuyQVZLhLUkGGuyQVZLhLUkGtngoZEcuBdwNzgPdm5uqB7YuA64EjmjmrMnN9x71K4/rrX/m5Ke132Qc/3nEn0uwx4Zl7RMwB1gBnAkuB8yJi6cC0PwZuyswTgHOB93TdqCSpvTZn7suAnZm5CyAibgTOAbb1zUnghc3y4cBDXTY5dFcePsX9Huu2D0nqSJtwPwrY3bc+Bpw4MOdK4JaIeAfwAuD0TrrTAWnN79w6pf3e/ndv6LgT6ekWr/rElPZ7YPXZHXcyfG3eUI1xxnJg/TzgHzJzAXAWcENEPOPYEbEyIjZFxKY9e/ZMvltJUittwn0MWNi3voBnXna5ELgJIDP/DXgeMH/wQJm5NjNHM3N0ZGRkah1LkibUJtw3Aksi4piImEfvDdN1A3MeBE4DiIjj6IW7p+aSNEMmDPfM3AtcDGwAttO7K2ZrRFwVESuaaZcBF0XEPcAHgLdl5uClG0nSs6TVfe7NPevrB8au6FveBpzcbWuSrrzyymd1P9XRKtwlaRhe8rnNU9rv4VNf03En9fj4AUkqyHCXpIIMd0kqaFZec38ufYpMkobBM3dJKshwl6SCDHdJKshwl6SCZuUbqurW9lccN6X9jrtve8edHPjGVn1hSvstWP26jjuR9s8zd0kqyHCXpIIMd0kqyHCXpIIMd0kqyHCXpIK8FXIGvPr6V09pvy0XbOm4E0lVeeYuSQUZ7pJUkOEuSQUZ7pJUkOEuSQUZ7pJUkOEuSQUZ7pJUkOEuSQUZ7pJUkOEuSQX5bBlJGpLFqz4xpf0eWH32tGt75i5JBRnuklSQ4S5JBRnuklSQ4S5JBRnuklSQ4S5JBRnuklSQ4S5JBbUK94hYHhE7ImJnRKzax5xfjohtEbE1It7fbZuSpMmY8PEDETEHWAOcAYwBGyNiXWZu65uzBPhD4OTMfDQiXjyshiVJE2tz5r4M2JmZuzLzCeBG4JyBORcBazLzUYDM/Ea3bUqSJqNNuB8F7O5bH2vG+h0LHBsRX4qI2yNieVcNSpImr81TIWOcsRznOEuAU4AFwBci4lWZ+a2nHShiJbASYNGiRZNuVpLUTpsz9zFgYd/6AuChceZ8LDO/l5lfA3bQC/unycy1mTmamaMjIyNT7VmSNIE24b4RWBIRx0TEPOBcYN3AnI8CpwJExHx6l2l2ddmoJKm9CcM9M/cCFwMbgO3ATZm5NSKuiogVzbQNwCMRsQ34HPD7mfnIsJqWJO1fq/+JKTPXA+sHxq7oW07g0uaXJGmG+QlVSSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJek
ggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSrIcJekggx3SSqoVbhHxPKI2BEROyNi1X7mvSUiMiJGu2tRkjRZE4Z7RMwB1gBnAkuB8yJi6TjzDgMuAe7ouklJ0uS0OXNfBuzMzF2Z+QRwI3DOOPP+FLga+E6H/UmSpqBNuB8F7O5bH2vGfiAiTgAWZubH93egiFgZEZsiYtOePXsm3awkqZ024R7jjOUPNkYcBFwDXDbRgTJzbWaOZuboyMhI+y4lSZPSJtzHgIV96wuAh/rWDwNeBdwWEQ8AJwHrfFNVkmZOm3DfCCyJiGMiYh5wLrDuqY2Z+Vhmzs/MxZm5GLgdWJGZm4bSsSRpQhOGe2buBS4GNgDbgZsyc2tEXBURK4bdoCRp8ua2mZSZ64H1A2NX7GPuKdNvS5I0HX5CVZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqSDDXZIKMtwlqaBW4R4RyyNiR0TsjIhV42y/NCK2RcRXIuKzEXF0961KktqaMNwjYg6wBjgTWAqcFxFLB6bdDYxm5vHAh4Cru25UktRemzP3ZcDOzNyVmU8ANwLn9E/IzM9l5uPN6u3Agm7blCRNRptwPwrY3bc+1ozty4XAJ6fTlCRpeua2mBPjjOW4EyN+DRgFfnYf21cCKwEWLVrUskVJ0mS1OXMfAxb2rS8AHhqcFBGnA5cDKzLzu+MdKDPXZuZoZo6OjIxMpV9JUgttwn0jsCQijomIecC5wLr+CRFxAvD39IL9G923KUmajAnDPTP3AhcDG4DtwE2ZuTUiroqIFc20vwQOBW6OiM0RsW4fh5MkPQvaXHMnM9cD6wfGruhbPr3jviRJ0+AnVCWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgoy3CWpIMNdkgpqFe4RsTwidkTEzohYNc72gyPig832OyJicdeNSpLamzDcI2IOsAY4E1gKnBcRSwemXQg8mpk/DlwDvKvrRiVJ7bU5c18G7MzMXZn5BHAjcM7AnHOA65vlDwGnRUR016YkaTLahPtRwO6+9bFmbNw5mbkXeAz4kS4alCRNXmTm/idEvBV4U2b+drP+68CyzHxH35ytzZyxZv3+Zs4jA8daCaxsVl8O7JhCz/OBb05hv6mynvVmYy3rPXfrHZ2ZIxNNmtviQGPAwr71BcBD+5gzFhFzgcOB/xo8UGauBda2qLlPEbEpM0encwzrWe9Ar2U9602kzWWZjcCSiDgmIuYB5wLrBuasAy5olt8C3JoTvSSQJA3NhGfumbk3Ii4GNgBzgPdl5taIuArYlJnrgGuBGyJiJ70z9nOH2bQkaf/aXJYhM9cD6wfGruhb/g7w1m5b26dpXdaxnvWK1LKe9fZrwjdUJUkHHh8/IEkFGe6SVJDhPsMiYllE/FSzvDQiLo2Is56l2v/4bNTR9EXEvIj4jYg4vVk/PyL+NiLeHhE/NNP9afbxmvuAiHgFvU/c3pGZ/9s3vjwzP9VxrT+h98yeucCngROB24DTgQ2Z+Wcd1hq8fTWAU4FbATJzRVe1
9lH/Z+g9yuLezLxlCMc/Ediemf8dEc8HVgGvBbYBf56Zj3Vc7xLgI5m5e8LJ3dT7J3p/Tg4BvgUcCvwzcBq9v8cX7Gf3qdZ8GfBmep9h2Qt8FfhA199LDccBG+4R8ZuZeV3Hx7wEeDuwHXgN8M7M/Fiz7a7MfG3H9bY0dQ4GHgYW9IXTHZl5fIe17qIXdO8Fkl64f4DmttXM/HxXtZp6X87MZc3yRfS+rx8B3gj8S2au7rjeVuAnmlt31wKP0zznqBn/xY7rPQZ8G7if3vfx5szc02WNgXpfyczjmw8J/ifw0sx8snmG0z1d/llp6l0C/DzweeAsYDPwKL2w/93MvK3LehqCzDwgfwEPDuGYW4BDm+XFwCZ6AQ9w9xDq3T3ecrO+ueNaBwG/R+8VwmuasV1D/Pn0f20bgZFm+QXAliHU2963fNcwv5dPfX3N9/SN9D7nsQf4FL0P8x02hHr3AvOAFwH/A/xwM/68/q+9w3pbgDnN8iHAbc3yoiH9XTgcWA3cBzzS/NrejB3Rdb0JevnkEI75QuAvgBuA8we2vWcYX0er+9xnSkR8ZV+bgCOHUHJONpdiMvOBiDgF+FBEHN3U7NoTEXFIZj4O/ORTgxFxOPD9Lgtl5veBayLi5ub3r9Pycw5TdFBEvIheAEY2Z7WZ+e2I2DuEevf2vZq7JyJGM3NTRBwLfG8I9bL5nt4C3NJc9z4TOA/4K2DCZ39M0rX0gm8OcDlwc0TsAk6i96TWYZgLPEnvleVhAJn54JCu8d9E7xLhKZn5MEBEvITeP5Y3A2d0WSwi9vUqPOi9mu7adfQua30Y+K2I+CV6If9dej/Dzs3qyzJNAL2J3svBp20C/jUzX9pxvVuBSzNzc9/YXOB9wK9m5pyO6x3c/HAHx+cDP5qZW7qsN1DjbODkzPyjIR3/AXr/QAW9y0A/nZkPR8ShwBczs9O/QM0/iO8GXkfvYUyvpfek0t3AJZl5T8f17s7ME/ax7fmZ+X9d1muO+1KAzHwoIo6g997Mg5n55SHUeie9/6fhduD1wLsy87qIGAE+nJmv77jejsx8+WS3TaPek/QuOY130nZSZj6/43qb+//MR8Tl9C53rQA+nR1f8oXZH+7XAtdl5hfH2fb+zDy/43oLgL1PnTkMbDs5M7/UZb3noog4BDgyM782pOMfBvwYvbPOscz8+pDqHJuZ/z6MY88WEfFK4Dh6b4LfN+RatwCfAa5/6mcWEUcCbwPOyMzTO653L/DmzPzqONt2Z+bCcXabTr3twCubV3tPjV0A/AG9S8FHd1kPZnm4S3puaC7hraL3H/+8uBn+Or2HEq7OzMFX79Ot9xZ67/0847HjEfELmfnRjutdDdySmZ8ZGF8O/E1mLumyHhjukma5YdwZ91yoZ7hLmtUi4sHMXGS9yZnVd8tIem54tu+Mq14PDHdJs8OR7OfOOOtNnuEuaTb4OL27RjYPboiI26w3eV5zl6SCfCqkJBVkuEtSQYa7JBVkuEtSQYa7JBX0/wf5A8llqT7sAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "pd.Series(scores).sort_index().plot.bar();"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Make Predictions"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 38,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'pandas.core.frame.DataFrame'>\n",
      "RangeIndex: 2489 entries, 0 to 2488\n",
      "Data columns (total 12 columns):\n",
      "1     2489 non-null float32\n",
      "2     2489 non-null float32\n",
      "3     2489 non-null float32\n",
      "4     2489 non-null float32\n",
      "5     2489 non-null float32\n",
      "6     2489 non-null float32\n",
      "7     2489 non-null float32\n",
      "8     2489 non-null float32\n",
      "9     2489 non-null float32\n",
      "10    2489 non-null float32\n",
      "11    2489 non-null float32\n",
      "12    2489 non-null float32\n",
      "dtypes: float32(12)\n",
      "memory usage: 116.8 KB\n"
     ]
    }
   ],
   "source": [
    "predictions = pd.DataFrame({month: data.squeeze() for month, data in preds.items()}, index=range(preds[1].shape[0])).sort_index(axis=1)\n",
    "predictions.info()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Evaluate Results"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 123,
   "metadata": {},
   "outputs": [],
   "source": [
    "from sklearn.metrics import roc_curve, precision_recall_curve, average_precision_score"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 125,
   "metadata": {
    "scrolled": false
   },
   "outputs": [],
   "source": [
    "bins = np.arange(0, 1.01, .01)\n",
    "roc, prc = pd.DataFrame(), pd.DataFrame()\n",
    "avg_roc, avg_precision = [], []\n",
    "for month, y_score in predictions.items():\n",
    "    y_true = label[f'2017{month:02}01']\n",
    "    avg_roc.append(roc_auc_score(y_true=y_true, y_score=y_score))\n",
    "    fpr, tpr, _ = roc_curve(y_true=y_true, y_score=y_score)\n",
    "    df = pd.DataFrame({'fpr': fpr, 'tpr': tpr})\n",
    "    df.fpr = pd.cut(df.fpr, bins=bins, labels=bins[1:])\n",
    "    roc = pd.concat([roc, df.groupby('fpr').tpr.mean().bfill().to_frame('tpr').reset_index()])\n",
    "    \n",
    "    precision, recall, _ = precision_recall_curve(y_true=y_true, probas_pred=y_score)\n",
    "    avg_precision.append(average_precision_score(y_true=y_true, y_score=y_score))\n",
    "    df = pd.DataFrame({'precision': precision, 'recall': recall})\n",
    "    df.recall = pd.cut(df.recall, bins=bins, labels=bins[1:])\n",
    "    prc = pd.concat([prc, df.groupby('recall').precision.mean().ffill().to_frame('precision').reset_index()])"
   ]
  },
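  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The cell above averages the per-month ROC and precision-recall curves by snapping the x-axis values onto a common grid with `pd.cut` before aggregating. A minimal sketch of that binning step, using made-up `fpr`/`tpr` points from a single fold, illustrates the idea:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "import pandas as pd\n",
    "\n",
    "bins = np.arange(0, 1.01, .01)\n",
    "# made-up ROC points for one fold\n",
    "fpr = np.array([0.0, 0.05, 0.2, 0.5, 1.0])\n",
    "tpr = np.array([0.0, 0.4, 0.6, 0.85, 1.0])\n",
    "df = pd.DataFrame({'fpr': fpr, 'tpr': tpr})\n",
    "# map each fpr to its grid bin, then average tpr per bin\n",
    "df.fpr = pd.cut(df.fpr, bins=bins, labels=bins[1:])\n",
    "curve = df.groupby('fpr').tpr.mean().bfill()\n",
    "```\n",
    "\n",
    "Curves binned this way share an index, so the monthly results can be concatenated and averaged point-wise across folds."
   ]
  },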
  {
   "cell_type": "code",
   "execution_count": 126,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(0.773903996249194, 0.6880594179772762)"
      ]
     },
     "execution_count": 126,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "np.mean(avg_roc), np.mean(avg_precision)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To obtain a measure of the model’s generalization error, we evaluate its predictive performance on the hold-out set. To this end, we iteratively predict one month of the test period after training the best-performing architecture on all preceding months.\n",
    "\n",
    "The ROC and precision-recall curves below summarize the out-of-sample performance over the 12 months of 2017. The average AUC score is 0.7739, and the average precision is 68.8%, with the full range of tradeoffs represented by the two curves."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "While the AUC scores indicate solid predictive performance, we need to be careful because binary price moves ignore the size of the moves. We would need to deepen our analysis to understand whether good directional predictions translate into a profitable trading strategy."
   ]
  },
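  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a deliberately simplified illustration of that next step (not a full backtest), directional probabilities could be mapped to long/short signals and combined with realized returns; the probabilities and returns below are made up:\n",
    "\n",
    "```python\n",
    "import pandas as pd\n",
    "\n",
    "# hypothetical predicted up-move probabilities and realized returns\n",
    "probs = pd.Series([0.7, 0.4, 0.55, 0.2])\n",
    "returns = pd.Series([0.03, -0.01, 0.02, -0.04])\n",
    "signals = (probs > 0.5).astype(int) * 2 - 1  # 1 = long, -1 = short\n",
    "strategy_returns = signals * returns\n",
    "```\n",
    "\n",
    "A serious evaluation would also have to account for the size of the moves, transaction costs, and risk."
   ]
  },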
  {
   "cell_type": "code",
   "execution_count": 129,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAA+gAAAGqCAYAAACYrG6qAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzs3Xl4FFX28PHvyb6yJSCGxYCCAsqmuCKijoCgCCIDLriN2zsi6vx0xl0cVHRccBgcdFxAXAIqwoCjgsqmjiioyCCggCxhDyH71tt9/7jVTdPpJB0MJMD5PE9DuvpW1a3q6u46dc+9JcYYlFJKKaWUUkopVb+i6rsCSimllFJKKaWU0gBdKaWUUkoppZRqEDRAV0oppZRSSimlGgAN0JVSSimllFJKqQZAA3SllFJKKaWUUqoB0ABdKaWUUkoppZRqADRAV+oIIiJXiMhMEdksImUi8rOIjBeR1AjnHysiRkRiDnZdD6ag7Qj3OOEQ12WTiEwNmXapiPxPRMqdOjU5lHU6FERkkYgsiqCc/325Mcxrb4nIpgNYd6ZzDLSv7bwHg4j0dbbxd3W4zPNE5BMR2e4cR1ud51cHlenu7IdmdbXeMPU44O8M57Phf/99IpItIu+LyEkRzj/1QI6PAxX0PhoR6Rfm9UxnO4yI3HSo6nU4EJHrw33GqyibGfKd7RKRX0Rkgog0PQR19b/PfYOmRfR9FrKcISLyp0iWr5RSwTRAV+rIcg/gBR4ABgCTgf8HfCoiR+PnvTdwVsgjuz4r5AQybwPbgH5OnYrqs04NxKMiEldHy8oEHgUaRIBe10RkCLAQKAdGYz/rDwJ7gIFBRbtj98NBC9DrwDzsZ6A38AhwOvCFiLSIYN5xwNCDWLeqFAGjwky/Fig+xHU5XFwPRBSgBxmPPTYuAqYCtwKzRETqtGaR+aPzqI0hQKUAHfgeu13f/9ZKKaWOTId1K5lSqpJLjTE5Qc8Xi8he4A2gL7CgXmp1EIhIvDGmooZi3xhjPIekQpFrBaQC7xpjlvzWhYlINCCHejsj3P+Rmo+9WHEr8I86Wma98783B2HRfwJ+AIYaY0zQ9DcOwwtxe4wxS52//ysivwKLgGuA58PN4D/2jDEbDlEdQ30AXCEiycaYkqDpo4CZ2GBU/Xa/Bh0bi0UkFhgL9KCK4LaOv5cCjDGr63BZhcDSGgsqpY5ah9sPuVKqGiHBud8y5/9WdbEOEeknIh+JyA4RKRWRVSLyf04w4i/zoYhUOoESkXZOCuitIdPeFpEcEakQkRUiMjRkPn8a7ckiMk9EioF362Bbaly3U66biMwRkTyxXQe+EpFzw5S700nbLReR5aFlRGQssMl5+pqzTYuc10RE7hbbLcHl7N9JItIoZBlGRJ4QkftEZCPgArqJSL6IPBRU7hSn7Jch828Vkb8FPX9MRL4XkQIR2SMiC0TkzJB5/CmZl4vIKyKSA+wKen2kiKx19uFP4fZhDZYBs4EHRSSpuoIiEiMi9wetb7uIPCciCf66YluXwWaO+NNk+zr7c33I8r6TkK4Pzv7d7W+p+43vzSlVbEd7EVnnHEu1TdttBuwOCc4BMMb4nOVfD0xxJq8L2g+ZzuuNnG3Y7uzHn51t3O+Cgog0F5F/ik0/r3D+f1NE4quqnIgMEJFiZ/m1Pc/wf1+d4N8Op959ROQ9EckHvnFeq5TiLiLJIvKUiGxw6rtTbLefY4LKtJMIPvfV+AAwwOVByzwbOB54M9wMzj75Wuz3R4GIzBaRE4Ne/6eI7JKQrgIiEi/2e+eFoGnpIjJZRLY59V8rIreEzOffb2eLyLsiUuQs//6g+vwgIiUiskxETg1T58tFZKnY7/l8Z/+3DSmzSWxXlJEissZZ3nIR6R1UZhFwHnBO0HG4qObdXEnosTFV7PfZWSLyXxEpA4K/224WkR/Ffh/vEZHXJKS7h3N8vyMihc42TgMq
dTmSMCnu1X02xHZrug5oFbTNm5z5wqXQi0T+HfO4iIwRkY3O+7pYRLocwP5USjVQGqArdeQ7z/l/TR0trz3wOTZdcRC2dX4s8ERQmX8CPUTk9JB5bwFKgHcARKQN9mS7G3A3MBjbMjJTRAaHWfe/gcVOuQkR1DVabEDnfwS+8yJdt4j0BP6LDYpuBoYBucBnwSe1IvIH4AVscDgEm5KZBQQHX68Cw52/H8emOfrTJp/Athh+ClyKPdG8HviPVA5yrsfu+3uc/7cCS4ALgspcAJQBp4tIslPHE7EXahYGlWuF3ZdDnOXuBpaISFcq+we2RXiUUxax/arfAdZhA5ZngL8DJ4aZvzoPAc2BMTWUe8sp+w5228cDf8B2GwD7Ht7u/D2GfV0bvsdmkBzvDzLEBsbdsfspdN8tDAqAf8t7sz10A0SkB/aYWgP8zhiT50z3X4jKrGEffAv0c07Uu4qETfn9D/YYA3vM+ffDDqfO/wFuAJ5ztukTZxsDn2Nn//wXGOG8NhD4MxALhO2OICLXAnOAp40xo/0XDGqhnfN/fsj0t4GNwBXAfVWsOw77Ho3Bfv4uwXYB2IvzOTyA75xwSrEt5cFp7tcCXwG/hqnXAOz+Lsbuy/8HnAx8KSL+C6fTgBbYTJJgl2ADxjedZTVy1jMI+707CJgLTBaRO8LU9Q3gf9iuALOBJ0Xkaezn9GmnPsnAbAnqYiIitznbuBq7z2916rxYKo9pci7wf8DDzvKigQ9l39gaf8RmfKxk33FY23RxCH9sNAamY79rL2bfb8tT2N+hz7Dv8b3YriAfS9DFZOzFlkuw3cJGAB4iyOKJ4LMxDvgIyGHfNld3Eag23zHXYN/3O7Gf4bbAv+UwHztGKRXEGKMPfejjCH1gg6/dwKcRlh+LbRmKibC8YLvKPAjkAVHO9ChgA/BaUNlYYCfwUtC017AnMGkhy/0UWBGmXnfWcjtCH28dwLo/xwZScUHTop1ps4O2Nxv4JGRZI5z1Tg2adoIz7fqgac2w/Ymnhsx/jVN2cNA0gw36EkPK3o0NNOOd57OxYxAUA/2dabcBbiCliv0W7byfPwN/D5re11nvrDDzfIU9iY8KmnaGU35RBO+VAR53/n4TG0w1dp6/BWwKKnuuU/7akGVc7UzvHlLf34WUawb4gOuc50Owx+1rQJYzLcXZR7fV4XsTqA9wIVDorDM6pNwj2ADhuBr2WQvsBRn/cV3gvN+/Dyl3vfP6CSHTLwk9Bp3prwIVQLrz/K/YMS16RPKdgQ1Q3MBNEX5ON2ED7xhsUHOyczx5gZ4h2zAhzPxTQ46PG0PfkzDzRPS5r2Le4PfxAqeerYB457i9GTv+gQneB8By7AWsmKBp7Zx99XzQtF/8x2HQtNnA6qDnDzvHY4eQcq9gxyCICdlvjwSVicH+HriBdkHTBztlzwv6DBQAr4esIxObFXJXyHuYBzQNmnaas7yrgqYtAr6M8Ljw78NbnDonYfuh7yDo8+W8/wa4LMz83uBtd6af45Qf4jy/yHk+MqTcx870viH1XxT0PJLPxlRgazXHUV/neW2/Y9YBsUHTrnCmnx3J/tWHPvTR8B/agq7UEUpEUrAtzh7sVfbg14JblWt11V1EjhWRl0VkM/ZkzY1tqWuCDRwwttXsZWCkiDR2Zh0CHONM9xuAbWUoCKnPPGza9n7pfcCs2tQVOBPoFfR4uDbrFpFEbAbCe4AvqIxgW2b6OMtq7TxC0+5nYvd/JPWMxwakwaY7858XMv0TY0xZyLSFQAJwttPicp6zLV+xr3X4AmCZMSYwkJWI/E5EFopIrrMuN9CR8C3g++1/pyWqF/C+CWopNcZ8w75U/tp4FBsc3FvF6wOwx9zMkPdsvvN6nyrm89drL7YVL3h/LMa+l+cHLSOGfeM11MV74zcce8y9aIz5gzHGG1K/vxpjYowxm2vYjt3GmD7YAdUeAb7ABo0zROSV6uZ19MFeqMgKmf4WNlA+y3neD3u8/BDB
PPzmdq5/YiUnjCviktkVTCnLk/P9AGC1KIq9TjKGxutQh13tmOeyE06kWVPXyfSKABnDJJzI0B5Nks5opoz045M560Ic1uF+Q8YppZ7Z+7kCxuzzNVrrz/ZrMjFgxVMGq2o7UEChN7vC+vFlNays6eA/F4xn2shAVsf48KQ/uypfTvZDVFdXF6+++ipXXnkl1dXV5Ofn5zqSEEKIfnD6pBJmVgR4aX0jr6xvZOnOdiaX5vHpGeWcMK4Qj0Ne53PJohQOW88ulnxYpK/YFQTV/bNuuxVTa1buDjKzUuasC3E4hzvb/dsBX/+mP4OIwWNXWwQ05GXZdfXtba38dVU9n5pWxvnTy7M6hmFqgrEU00cGyPfISX4oWrFiBQsXLuS8885j4cKFUpwLIcQQV+B1cPUJo7l8biWvVzfz/NoGfvmPrTisFo4bU8Bpk0qYN7oQh022sw10ea6Dv0e0WSysrutgSpmf8nzZyiDEwRyyQNdav340g4jBIZRI0xxKUpTllc8Vu4M88NpWppT7+dpp47I6xocdQyeU+CgNuLI6hhjYXn/9da666ip++9vfctlll+U6jhBCiKPIZbdywcyRnD+jnM1NYd7a2sq729t4d0c7eU4b50wt5fwZ5ZT55T3AYOOwWShwO6huChFLG5QHXD1aHZHMGLSGkhT5nFn3LBJisJD1QqLHtNbsbIngtluz2hO2qqaD+1+tZnSRhzsvmIo9y4Zu7dEUlYUuqoo8Wf28GLja2tpobW3lpJNOYvny5YwZMybXkYQYUpRSFcBo9nn911q/lbtEQhyaRSmmlvuZWu7nq6eOY219J4s3NvHcmj08u3oPc0cX8NnZFcyolBVWg4nNaqHI66S+I0ZdMIbTZmFEnotCrwOX3Yrdqj5q+htPGezpjLOnI0YyYzK6yMPkMn+O/wRC9C8p0EWPdcbSdMRSFPt6f8V6dW0H976yiVEFHu65eDo+V3b/9DpiKYrzHEwokcYxQ81bb73F5z//eW6++Wa+9a1vSXEuxBGmlPoJsBDYBHw4B0kDUqCLAc9qUcypKmBOVQFtkSR/39jE4o1NfP+5DZw/vYwvnTxWmsUOIhbVPcINuqcCNXYlqO/onreu6R7H57RZCCcy2KyKgNuBUtDQmWB0kVf+rsWQ1uMqSSnl1Fon+zOMGLi01uxoieB19H7f+dr6Tu59uZqKfDf3XDz9kPuSPkkonsbjsHJMmV9mnQ8xDz30EPfffz+PPPII559/fq7jCDFUXQJMltdyMdgV+5x84fjRXDF3FI8tq+G5NXtYV9/FzedMYlJpXq7jiV6yWy0E3PuvqjRMTcY0KfI69rshY7UoGrvijC32He2YQhw1n7jGWCk1Xym1Hti29+tZSqmH+j2ZGFDaIknCqUyv9/1sbgpxz0ubKA+4uPeSGfizbCz30azzikDWS+PFwNPU1EQ6neakk05i5cqVUpwL0b92AtmdhIUYgBw2C18+ZSz3XjKdZMbk1r+u5fFlNXTF07mOJvrIalE4bR/fUul32akLxkkbZo6SCdH/elLp/Br4DNAOoLVeC5zRn6HEwGKYmh2tUfJ6OcqsNhjj7hc3Ueh1cM/F0wlkWZynDZNE2mBmZb4saRpCXn31VWbPns3bb7/N3LlzGTlyZK4jCTHUxYA1Sqn/Vkr9+sOPXIcSoq9mVebz0FWzOW1iCU+uqOOa/13Oz5dsYWNDF1rrXMcTR5DVojC1pjUkC4HE0NWTisuita454AqWcagni6GnsStOIm1Q1IuZ5y2hBHc+vwGbVXH3RdOznneptaYz3j1OTWadDw1aa2699VaefPJJnnzySU477bRcRxJiuHhh74cQQ47PaeOWcydz+bxR/H1DI29sbuFfW1sZVeDmzGNKOX1SCSV5PX8fIwYuv8vO7mCU0oALq2x5FENQTyqeOqXUfEArpazA14Gt/RtLDBThRJptzREKejFrvCue5s4XNpLIGPzo0pmU9WEUWjCWoiLfzQgZpTIk
xONx3G43EyZMYPXq1RQXF+c6khDDhtb6UaWUA5i096EtWmtZCyyGlKpCD9eeNp4vnjiGd7a1sWRTE48u3c2flu5mekWAMyaXcNL4Yrxy0X/QslstdCVSBKNJSvL+7/2hYWq01h91gBdisOrJ2el6upe5VwHNwD/2PiaGuIxhUt0YwuOw9vgKZSJtcNeLG2mNJLnn4umMLfZm/d+PpTK47VbGl0gjkKHgqaee4tvf/jZr167luuuuy3UcIYYdpdQC4FFgN6CAUUqpf5cxa2IoctmtnD21lLOnltLUleCfW1t4c3MLv35jO//fv3Yyf2whZ0wuYU5VgRR0g5DPYWd3W5Qir5NIKkNLKEFjVwKlYEKxjxF+lzQUFoNWTwr0jNb6yn5PIgacXW1R4imDwh4ubdda85s3t7OjJcIdF0xhann2cyoNUxNLGcwbIy+cg108Huemm27i9ddf59lnn6WgoCDXkYQYrn4BnKu13gKglJoE/AWYm9NUR5ipNR3RFEU+Wc4supUFXFx5XBUL541iW0uENze38Na2Vt7Z3kbAbefCmeVcOGskHofcVR8sXHYr7dEEy3cFSaQNbFYLeU47ptZUN4eo64gxYURe1lsshcilnpyJPlBKbQGeBJ7RWof7OZMYANrCCWqDMUp68Qbn2dV7+NfWVq4+YTTzxxZl/d/WWhOMpphclpf1SDYxMGitiUajOBwOVq1ahd+f/UUbIUSf2T8szgG01luVUkPuJJtIG6QNTdowZeqH2I9SikmleUwqzePLp4xlVW0Hf9/YxOPv1/LcmgYumV3BhTPLpVAfJPwuB1qz33YFK4pir4t4ymB1XSfFPgfjS3yypUEMKp/4yqW1Hg/cS/cV9vVKqeeUUnJHfQhLpA2qm8Lkux0fG29xKKtqO3h06W5OnlDM5XMrs/5va61pj6YYWeBiZL7sOx+stNb84Q9/4NJLL6W4uJiHHnpIinMhcm+FUuoPSqkFez/+B1iZ61BHWiJtUOC1E09JP1txaDarhflji7jzM9N44PJZTCnP4/FlNXzl0RW8tqlJur8PAnarBYft4KWM22GlxOckksiwfFeQna2Rj0azmaamM5aiujHEit3BQ54rtO5+nvxbEEdbjy4naa3fA95TSt0F/Ap4AljUj7lEDu1oiWBR6pAnvQM1dsX52eItVBV6uOmsiT0u6g/0YXFenu9i0oi8rI8jcisUCnHdddexfv16Fi2S04QQA8j1wA3AN+jeg/4W8NucJuoHGhiR52JnW+SIHTOUSGNRSqaJDFETS/O48zPT2Noc5pF3d/HrN7bzwe4ObjxjAv4sR8SKgSHPZcdjamqDMRq64pTmuWgOJciYGqfVikazuq6DY0fl77dyonvEcJia9hjTRwYoz3fn8E8hhptPrMCUUj6l1OeVUi8Cy4FW4KR+TyZyIhhN0RxO4O/h0vJE2uC+l6tRwO0XTM16TrnWmmAsRVmguziXxh6D1+uvv05eXh7Lly9n2rRpuY4jhNhLa53UWj+gtf6s1vpSrfUvtdZDapiwqTVWpRjhd3KkbnqlDRPD1CQzckd+qJtUmsf9l87gP04awwe7g3x90WrW1HXmOpboI6tFUeR14rHbaA4l8TpsFHmd+Fw28lx2LChW13YSTWYASGVMNuzpYk9HgiKvk63NYRJp+f0XR09PLgVvAF4Efqq1fruf84gcyhgmm5tCPS7OAR5dupuaYIwfXjSNsj6MQgvGUpT6XUwuleJ8MNJa8+CDD+Lz+fjKV77CpZdemutIQoi9lFJPaa2vUEqtp/sG83601jNzEKtfdC9vd+CyW/E6/n/27jtOrrM6/P/nudP7zPauXa16b5YrxjYYbMCVgBsmDqYlOIQAISH5JSYkIf45ARMwgRCKacHGVGFsbOTeVSxLVq8r7Wp7n50+9z7fP3YlVHZXs7J2Z8t5v156aWf27p0j7ZR7nnLqKOzJAAAgAElEQVSOjXTWynk12Ej6kmnmlwQ51BUja1pS
uHSaM5TixlVVLK8O8+Un9vCPv9nOJXOKeNeSMpZUhmR13xTmsBmEPKe/fn0uO/F0li1HelhQHuRA++By+KKhOkw2w2Bfe5QlFfL7FxMjlwR9ttbaGvdIRN419sTJZC0CrtwS9NeO9PDIthauXV7Bqpqzr8zdHUtREnBJcj5FdXZ28md/9me0tbXJknYhJqe/Gvr7PXmNYgIkMiazCr0AFAddHO1JvqkEPZ7OEnA6KAu5SZkmTd0Jwl6pCj0T1Bf7+cr7V/DgxkYe39HKC/s7qQx7uHpJGZfPL5Gl79OM1zmYpG9t7MXntBPy/PF1HvI46Iym6IimKHkTk1FC5GrEBF0p9WWt9WeAXyilhhtxv3FcIxMTaiCVpaEzTkGO7Sj6Exn+a/0+qgu8fPDCWWf9uH2JNCGvk3mSnE9Zn/vc51iwYAG/+MUvcDrlwlWIyUZr3TL0ZSeQ0FpbQy3WFgCP5S+yc08B/qFVYGGPkyNd8bM+l9aaWDrL6poCDENR6HfR0Hn25xNTj9th446LarllbTUv7Ovkse2tfOeFQzzwUgOrZ0W4fH4J59UWvOlVGmJy8DrtI1bwD3kc7GmNEvQ4zno7pxC5Gm0G/aGhv++fiEBE/mit2dcWxe2wYeSwdEdrzX8/e4D+ZIZ/umYRLvvZvVENJLM47QaLyoOyZHCKMU2T//zP/+TWW2/l29/+Nna7FE4SYgp4DniLUioCPAlsAm4CbstrVOeIaWkMQ+FzDn4m+d3209fzD8mlBVs0laU85CHkHUz4Ay47boch7dtmIJfdxtsWlvK2haUc6hzgqd0dPLe3g1cPdeN12riovpC3zC1meVUYm0w2TEsOm4FhKA60D7CwPHjWk0rHKsLLUnkxmhGvqrXWG4a+XKi1PilJV0rdxeCHu5gGWnqT9MQzOfc8f2ZvBy/u7+SDF8yivth/Vo+ZSJuYWKyolJHnqaa5uZkPfOADaK25/fbbJTkXYupQWuu4UupO4Ota63uVUlvyHdS5ksyYFPr+2B7UYTMIuBwkM+ZJM14Z06I7lsbjtI1Yc8W0Bvuo1xX5jt+nlKI85OFId5yILHOfseqK/Nx5iZ87LqrljaN9PL27nRf3d7F+Vzshj4OL6gt524JS5pcF8h2qOMeCbgftA0l6DmQoD7kpCrgIuOxYWhNLmfQl0nRE04S9DmoKvacN5CUzJvvao8RTJosqggTGUPNJzCy5XFl/iNNn0e8c5j4xBXVGk+xu7c/5YqM9muRbzx5gYXmQG1edXb/zjGmRyJqsqgnjccoyoakkk8lw+eWXc9ttt/EP//AP2Gzy+xNiClFKqQsZnDG/c+i+Ua8DlFJXAf8F2IDvaK3vOeX79wGXD930AiVa6/A5jTpHqaxJoc970n3FAScNXfGTEvS+RIY5JX4Od8UwLT3sjGdPPE19se+0payFficNnbEzxpIxLQZSWUIeR04r08TUYzMUK6rDrKgO84msxebD3Ty3r5Mnd7fz2PZWlleFuGVtDYsrQvkOVZxDRT43WdOipS9JY08cm6GwtAY9OIjncdho6o3THk2yoCxIZGjraEc0ye7WKIZS2A3FpoYe5pcGKA+7ZTZdnGa0Peg3ATcDdUqpX57wrQAgPSemgb5EhjeO9hP2OnNaYm5pzX+t34fW8Om3zzurZVxaa3riaZZVhWTkcArJZDI89NBD3Hbbbbz88ssUFBTkOyQhxNh9Cvg88Cut9Q6l1Gzg6ZEOVkrZgG8AVwJNwEal1Dqt9c5jx2it//qE4/8SWDlewZ+J1hA4pXBXyONE6z8m1MmMicdpo7rAi92m2N8+QKHv5NVj/YkMYa+DysjJyT6A32XHdYZl7lnTojeepizsprUvNbQ0XgYzpzOn3eDC+iIurC8ikTb5/Y4WfvnaUf7ul2+wrDLEn6yuYpksf5827CdUgzctjaFOXrLudthIZky2NPZQMdQ/vbk3QdjjPP6+4bLb2N0apTeRZk5JQFaTipOMNnK+AegCqhj8
gD4mCkybJXEzVSyVZVtjL36XPee9dL/d2sy2o33cdfkcykJnV8WyO55mVoGX4oBUwZwqDh06xC233EJhYSHvfe97JTkXYorSWj8LPHvC7YPAJ0f5kbXA/qHjUEo9CFwH7Bzh+FuAu89NtGNjWhqn3YbnlETY57KBGhwcVkoxkMocT5TKQx6O9iRIpM3jq7mSGRONZmF5cNhkSilFRcjDkZ44Yc/pK8+ypkVPIsOSyhAlQTeVoQy7WvrojmcJe5wymz4DeJw2blhZxdVLynl8Ryu/eK2Jf1q3g6Dbzvl1hVwwu5AV1WFJyKaJkQZd3A4bLrtBRzQFGop8rpOSeJuhKA646Iql6W7oYl5JgOKAS2bTBTD6HvRDwCFg/cSFIyZCMmOytakXp93IeVT/SHecH7zcwNraAt6xqPSsHncgmSXgtlN3lvvWxcTbvXs3l156KZ///Of51Kc+JR8cQkxBSqmvaq0/pZT6LcP3Qb92hB+tBBpPuN0EnD/CY8wC6oCnRvj+R4GPAtTU1OQefI6ylqbc7zztPWpwpstBMmNhaU3I6zzercRmKOaXBdh8pAe3w8DSMJDKsKqmYNTPxgK/k4OdA6fdb1qa7kSaxWXB462YQl4Ha2oLaOiK09QdBwUex+BAgryfTm9uh43rVlRy1ZIyNjX08PLBLl480MkfdrXhc9p497IKrl1eQUjatU1bSqlhB/JOFPY4SWctdjT3U+BzMqfEj88ltX1mutGWuD+rtX6rUqqHkz/QFaC11jKNNgWlsxbbm/pAgzfHN4CMafGVP+zB47Bx1xVzzuqiImNaZCyTFeWFssRrCkgkEuzZs4dly5bxzDPPsGjRonyHJIQ4ez8a+vs/x/hzw71Zj1QY/Wbg51prc7hvaq2/DXwbYM2aNSOd46wZhqJohEKnxX4X+9sHMLVmYUXBSZ9hYa+TipCHzoEUmaxmXlngeNX2kfhddjwO+/Fl7scKyvUnMywsC1I2tKT1GLvNYE6Jn4qwm/5Ehrb+FD2JNFpDxCuz6tOdy27j4jlFXDyniIxpsa2pj8d3tPLwpkZ+/fpR3rmolBtWVlEcyK1Qr5h+nHaDIr+LgWSWjQ3dVIY9hL1O3I7BiTTpGjHzjJahHSv6UjQRgYjxZ1qaXS39JLLmGUf0TvTQxkYOdMT4+6sXnFXlWtPS9CbSLK0MSVG4KWDXrl3cdNNNvOUtb+Eb3/iGJOdCTHFa681DX25iqA86HN9jPlpW0ARUn3C7Cmge4dibgU+8yVDPmtdhwz/CoHPA4yCRMakp8A47W1lX5KM9mqQs5KLylOR6OEopKsJu9rVHsRkGNkMRcNtZUhGidJTtX8d6LJeFPGRNi8PdcRq746ftgRfTl8NmsHpWhNWzIjT2xPnF5iYe3d7K795oYfWsCFcsKGWt9FWfsfxuO15to60/xdHeBDA4Ihp021lWFZZEfQYZbYm7NfRlNdCstU4rpS4BlgE/BvonID5xjmit2dsWpSeeHtPFwBtH+3h4cyNXLCjhwvqxj9XE01niaZM5xX7Zdz4F/OY3v+HDH/4w//7v/86dd9555h8QQkwlTwJvB46tz/YATwAXjXD8RmCuUqoOOMpgEn7rqQcppeYDEeDlcx1wrnxDPcqH/Z7TTnHATe0JLdNO5HbYWFkTwTuGZeflYQ8hjxOXw8BlN8a8ssxuM6gt9NETSxNLZWVJ6wxUHfHyqbfP49a1NTy6vZWn97SzsWE3PpeNS+cWc8WCEuaXBmQrxAxjKHXaQGJ3LMWBjgEWlAXzFJWYaLl8IvwaOE8pVQ/8EPgd8H/Ae8YzMHFuHeyM0dqfoNCbe3LeNZDi3t/vpjzk4WOXzh7T45nWYLX2gMvOmtqIVGyf5KLRKFprFi1axDPPPMPixYvzHZIQ4txza62Pb57WWg8opU4vVf7H72eVUncBjzPYZu17Q9Xfvwhs0lqvGzr0FuBBrfU5X7qeC5fdoL7EP2Iic6wd1mjbq0bqhz4Sh80g5H1z
s1k2Q7GwPMjGhm5cdiOnbiqjMS19/Lxi6igJurnjolpuv2AW25p6eWp3+/FWbZVhD29bUMLlC0pG3MIhpr+I10lLb4ICr/N4fQsxveWSoFta64xS6kbgq1rrrymlpIr7FHK0J8HhzhiF/tyrQ2ZMi///97tJZk3+9foleJ25j+5nTIveRJr6Ij9VBV65WJjkNm/ezM0338zf/d3fyay5ENNbTCm1Smv9GoBSajWQGO0HtNaPAo+ect8/nXL7C+c4zjFRSo24vP2Yyfo55HPZmV8aYHdr9E0nYD3xNDZDobUm5HFO2n+zGJ7NUKysibCyJsKfp7O8uH+wp/oPXznMD185TGXYw+xiH7OL/Mwu9rGoPCjt+2YIpRQhj5Pdrf0E3A7ZLjoD5JJ1ZZVS7wNuB64fuk+mQ6eIgVSWvW1RCnyuMRWi+d6Lh9jVGuVz75zPrMLhlwWOpDeePt5iRkxu999/P//8z//M/fffz0033ZTvcIQQ4+tTwMNKqWP7yMsBeeHnWVnITXcsTXc8Pab6MCcyLY3dplgzq4COaJKGrhhaI4n6FOV12rlyURlXLiqjtS/J8/s62Nc+wJ7WKM/v6wQGuwFcPKeQKxaUsrgiKMUGpzmHzcBuGOxu7Wd5VRjDUGRMi85oiqaeOA67QYHXid/twOu0yeDNFJdLgv4h4C+Ae7XWB4f2ov10fMMS54JpaXY39+N12sb0Af30nnYe2dbC9SsqeMvc4jE9Zl8iTVnILcn5JJdMJnG73YRCIV555RXq6+vzHZIQYpxprTcqpRYA8xms0L5ba53Jc1gznlKKOaV+Nh/uYSCVPeNqgOFEUxkqQh48Ths1hT5KQ26aexMc6ogR9jqluNQUVhZy8741f6zVGE1m2Nc2wAv7O3lhfyfrd7VTEnDx1nnFvHVe8ZgnVcTUEXA76BxIHR+AO9qbwNIan9NOOqM53BXH0hpLw5LKoNR+msJULlvGlFJ2YM7Qzf1a6+y4RjWKNWvW6E2bNuXr4aeUQ50DHO4aW4XYI91x/vpnrzO3xM+/XrdkTHviMqZFLJ1lbV0BLruM3E1WL7zwAh/4wAf4/e9/z4IFC/IdjhAiB0qpzVrrNW/yHF7g08AsrfVHlFJzgfla60fOSZA5ks/x4cXTWd5o6iNjWoSGmUnPmBZ2Qw27Va1rIMXa2QWnbUfriCZ542gfYY8k6dNRMmPy6qFuntrdxuuNvVh6sCvBZfOKuXResexbn4ZMS9MdT+OwKQIux7ATcOmsRSKTZW1doXQEmGRy/Sw/4zCtUuotDPZQPcrgiHuZUup2rfWLbz5MMV764hkOdQzuO89VxrT48h/24LYb/O07F4y5YE1vPM3SqpAk55OUaZrcc889fP3rX+c73/mOJOdCzDzfBzYDFw7dbgIeBiY0QRfD8zrtrKyJsLOln65YigKvE6UUyYzJQCqLw6bImPq0pCuWyhLxO4etFVMccLO8SrGtqY+A2y6fz9OM22E7PnPeE0/zwr5Ont3bwfdfauCBlxpYUR3mbQtLuWC2TJxMFzZDUXyGa3un3SCehoOdUvl9qsplHdV9wLu01jsBlFILGUzY39RIvhg/GdNiZ0sfAbdjTHuSHtzYyMGOGH//roVEfGPbB3dsabssp5mctNZkMhkaGxvZtGkTVVVV+Q5JCDHx6rXWNymlbgHQWieU9HCaVJx2g6WVIfa2RWnpS6CAgMvB0qoQIY+DbU29py2DT2ZN5pcFRjxnod/Fypowrzf2ojWyN3WainidXLO8gmuWV9DSl+DpoWrw//nEHrxOG2+ZW8xVi8uYU+LPd6hiAoQ8Dpp7E5QG3GO+phf5l0uC7jyWnANorXcppeQ3PUlprTnQMUA6a+H35V7Lb3dLPz/f3MjbFpRw4ezCMT1mxrTQQL286U9Kjz/+OF/60pd48skn+da3vpXvcIQQ+ZNWSnkADTDUPjWV35DEqWyGYkFZgIjHgdtpI+RxHF/WvrA8yIZD3biH2rJlTAuX
3Titb/Kpwl4nK6sjvHG0l7RpjbmtnJhaykMebj1/FjevrWHH0T7W72rn6T3tPL6jlfpiH+9cXMZb5xWPqUOPmFqUGlwCv7u1n/NqC950G0cxsXJ5Zb6mlPofBmfNAW4DpM3aJHW0N0Fzb4KiMew7T2ZMvrJ+L0V+Fx8dY79zgN5EhiUVQVk+NclkMhn+8R//kR//+Mf8+Mc/xm6XD2IhZri7gd8D1UqpnwAXA3fkNSIxLKUUZWHPafd7nXYWlAbY2dpPsd9NNJlhbkkAI4dCsCGvgzW1Bexrj9IRTRHxOE66aLe0xrS07FWfRgylWFoVZmlVmI9cOptn97Tz+x2t/PczB/j+iw28e2k5162oIOyVebfpyO2w0R3LcrgrLpNoU0wuV+wfBz4JfI7BPejPAV8fz6DE2emNp9nbOthSbSyrFr/34iFa+5L82w1LxzyaGktlCXsdFAekEMlks23bNnbu3MmWLVsoLh5bNX4hxPQytJR9N3AjcAGDn+d/pbXuzGtgYsxKQ26642k6B9IAFAZyT67cDhtLKkK09iXZ0xbFUAprqFiwTSkylkWB1yWt2aYhv8vOu5dV8K6l5extG+DXrx/lF681sW5bM+9cVMqNq6qkqNw0FPY6OdwVI+C2UxwYW34g8mfUbEwptRSoB36ltb53YkISZ+NY9degZ/iKjiN5fl8Hj21v5YaVlSytDI3pMbXWJDImiyqD8oKfRH75y1+ya9cu/uEf/oF169blOxwhxCSgtdZKqV9rrVcDv8t3POLsKaWoL/HTHeuhOOga8+o1pRTlYQ9Bj4O2/iSBob7JHoeNQ50xWvqSZ1wyL6YupRTzywL87VULaOqJ8/PNTTy6vZVHt7dyQV0BVy8tZ1llSK7rpglDKcJeJzua+yjwuZhb6petDVPAiL8hpdTfA3cCrwHnKaW+qLX+3oRFJnKWMS12Hu3HbjPG9EG9tbGXr/xhL4srgnzg/Fljfty+ZIaKsFv2sk0SyWSSz3zmMzz22GM8+OCD+Q5HCDH5vKKUOk9rvTHfgYg3x2W3saImjP1NzHT7XHZmF5+87LU46KKxJ/5mwxNTRFXEy6fePo9b1tbwuzdaWL+zjRcPdFEZ9vCORaUsqQxRV+STbQ9TnMNmUDS0JebVg93MLvJRFHBhaY2GoeKRI+cQWdMinjHlen8CjTaEchuwTGsdU0oVA48CkqBPQvvaosQzJpEx7CHa3z7Avz26i6qIh//v3YvG3Ccxa1porakt8o01XDFOvvzlL9PR0cGWLVsIhca2GkIIMSNcDnxcKdUAxBhc5q611svyGpU4KydWcj9XAi47Lvtg8bnhkrKMaWEz1Jg6xIjJrzTo5kMX1/GB82fxwv5OHtvewvdfagDAbijqi/0sKAvwzsVlVBd48xusOGsBtwOvpTnUFeNQVwwANfgxgNNusKI6gsd5cpJuWpo9rVE6YykumF0o9aYmyGjv7imtdQxAa92hlJLhs0moJ5amtT91xp6IJ2ruTfDPv91BwG3nC9csPqsP+b6hwjTyQs2/H/zgByxbtoy//du/xWazybI0IcRIrs53AGJyU0pRGfZwuCt+WuEw09L0xtPYbQaW1tiUwueyy+zqNOK0G1yxoIQrFpTQEU2xty3K3rYoe9qiPLa9lXVbm7lgdiHvX1Mt7dqmKJuhKBymkPRAMsvWpl5WVIePt2K0LM3etijt0RQ2Q9HYnZDf+wQZLTObrZT65dDXCqg/4TZa6xvHNTJxRqal2dMWJTCGBLsnnubudTswteafr11M4VkUBImns3gcNspC0vM8n6LRKJ/4xCfYtGkTDz/8sFRpF0IMSynlZrDg6xzgDeC7WutsfqMSk1Wh38XBjthp9/cl0swp8VMZ8TKQytITS9PSlyCRNgnKnvVppzjgojjg4uI5RQD0JTL8dmszj2xr5uWDXayqCfO+1dUsGWP9IjE5+d12oskMWxt7WV4dxmU32N8RpbUvSaHPiQYau+NUhj2nzbKL
c2+0K/r3nnL7/vEMRIxdc2+CZMYcdiRsOKmsyb/9bhc98TRfumEpVZGxL1NKZkxSWZOVNRGp8ppnN9xwA7W1tWzcuBGfT7YaCCFG9AMgAzzP4Cz6IuCv8hqRmLR8Ljs+l41U1jy+Su7Y0vbysAeboQh5HIQ8DkqDbl491IVpabkmmOZCHgcfuGAWN66q5NE3WvnN60f5/K/eYHFFkPevrmZlTVhW8E1xAbeDaDLDtqZeCnxOmnoG2zYrNbgQ3mk3ONwdY0FZMN+hTnsjJuha6ycnMhAxNom0yYGOAcKe3PadW1rz1fX72NsW5fPvWsi80sCYHzOVNYlnTFZURwhIoYi80Frz0EMP8d73vpcHH3yQoqKifIckhJj8FmmtlwIopb4LbMhzPGKSqwh72Nc+cDxB70tkmF8aOG05u8dpY3aRj0OdcQp80kt7JvA67fzJ6ires6ycP+xs45dbmrj7tzuYU+znuhUVXFRfNOa6RmLyOJakN/bEKTylbXPAZaelN0FVxDsuNTDEH8kraIo61DmA3TByHrH+v1eP8ML+Tu64qJYLZxeO+fHSWYtoMsuKqrC0X8mT7u5ubrzxRv7jP/6D7u5uSc6FELnKHPtClraLXBT4XAy1RyedtXA5DEpH2NZWEfbgtCtSWXMCIxT55nbYuGZ5Bd++fQ1/ecUc4uksX/7DXu54YAPffeEgTdINYMoKuB0U+dynFYNUSuGy22joHMhTZDOHDH9MQYP7vpI5F4Z7ek87D21q5MqFpdywsnLMj5cxLfqTGZZXhQh5JTnPh56eHlatWsUNN9zAgw8+iMs19toBQogZa7lSqn/oawV4hm4fq+Iu6xXFSTxOG0GPg2TGJJbOsqg8OOKEgN1mMK80wLamXlx+2Zs60zhsBu9YVMbbF5byRlMfj+1o5bfbWvj1683MLvaxsjrCypowi8qDUlBwGvC77LRHU/QlMjJhN45yTtCVUi6tdWo8gxFnljEt9rRFCbodOe312dnSz9ee3MeyyhB/fln9mPcHxdNZEhmTZVUhCs6ioJx4cyzLYsuWLaxevZp169axbJl0QxJCjI3WWrImMWZVYTdbm/oIeRwUneHzv8DnpNDvIprMyBa4GcpQiuXVYZZXh+mJp3l6dzsbG7r59etH+cVrTTjtBssqQ6yZFWF1bQFlQSk0PBUppfA67Oxu6afQ78Rlt+FyGNgNA7tNYTcUdsPAYVNSk+BNOGOCrpRaC3wXCAE1SqnlwIe11n853sGJ0x3oGCCVMSnIoTBc50CKf39sF8UBF3939YIxj1z2JzMoBatnyZ7zfGhtbeX2228H4IknnpDkXAghxIQJ+5x4nDbmlgYwzrCdTqnBXtkbDnXjlYJxM17E6+TGVVXcuKqKeDrL9qP9bDnSw6bDg3947iDVEQ9r6wq5dG4RdUU+SeamEJ/LTjydpa0vhak11tB+mGO/QQ3Y1GBRSan6fnZymUH/GvAe4NcAWuutSqnLxzUqMaz2/iTNPYkzjmTD4J6xLz26i1TG4l+vWzKmBFtrTXc8TdjrZGG59DrPh40bN3Lddddx5513cvfdd8sHlxBCiAnlsttYVhUm6M5tsaXPZaeuyMfBzhiFPudp+1fFzOR12llbV8DaugI+qjXNvUk2He5m0+Ge47PrVREPl84t5pI5RVRFPHLNMwV4naO/L5iWpqUvQWN3nNKgm6oCD0GZ7MtZLu+6htb68CkvlpwqgSilrgL+C7AB39Fa3zPCcX8CPAycp7XelMu5Z5p4Osuuln7CXucZ37i01nzjmf3sax/g79+1kFmFY2vB1ZNIUx5yM6ckIKPgEyyTydDX10ddXR0/+clPuPxyGQsTQgiRH2PdYzqr0IupNUe6Bqu6j5akH6tvoyCnVYFi6lNKURnxUBmp5LoVlfQlMrx0oJPn9nbw0w1H+L8NRygNulhVE2HNrAjLqsK4HTJJNBXZDEXY40RrTW88TWt/ksXlwRGLTYqT5ZKgNw4tc9dKKRvw
l8DeM/3Q0LHfAK4EmoCNSql1WuudpxwXAD4JvDrW4GcK09LsbOnHabfltEz9kW0tPLW7nVvOqx5zxXbT0iigrsgvyfkEO3z4MLfccguXXnop99xzjyTnQgghphSlFLOLfNgUHOyIUeBznXYtkcqaDKSyOGwG80oCHO2Nk8yYkojNQCGPg6uXlHP1knK6BlJsaOhmU0MPT+1u57HtrbgdBpfNK+FdS8upKxrbZJOYHJRSBNwO3KbFrtZ+fG67tGjLQS7/Q3/O4DL3GqANWD9035msBfZrrQ8CKKUeBK4Ddp5y3L8A9wKfzTHmGaehM8ZAMkthDiPM24/28Z0XDnJ+XQE3r60Z82NFkxmqIl7pYTnB1q1bx0c+8hE++9nP8pnPfCbf4QghhBBnRSlFbZEfQyn2dwzgc9pJZS0sPTgB4HLYWFQepNA/mLy7nTa2NvZIgj7DFfpdx5P1jGmxo7mfZ/e289Tudn6/o5VF5UGuWlLG2toCfJLgTTkOm4HHYWP70T5W1UQkzziDMz7DtdbtwM1nce5KoPGE203A+SceoJRaCVRrrR9RSo2YoCulPgp8FKCmZuxJ51QWS2U50j24VCyXY7/8h72UBd18+sp5Y97/pbUma2kqwp6zDVeMUTqdxuEYXEL4m9/8hgsuuCDPEQkhhBBvXk2hD7vNoCeWpiLswOey43HYcDuMk7bqRbwOAh4H8XT2jPtaxczgsBmsqA6zojrMhy6u48ld7Ty6vYWv/GEvNkOxuDzImtoIa2sLqYzINetU4XXa6Y2n2dcWZVFFUGoNjCKXKu7/y2BBvpNorT96ph8d5r7j51FKGcB9wB1nikFr/W3g2wBr1qw5LZbp7HBXDKfNyCnZ/vZzB+mOpfiPP1l+Vh9y/cks5TwNOSQAACAASURBVGG3jGJPkD179nDzzTfzpS99iWuvvTbf4QghhBDnVEXYc8ZBf6UUc4r9bDnSKwm6OE3A7eD6lZVcu6KC3a1RNjV0s7Ghm++92MD3XmxgTrGfyxeU8NZ5xdKXewoIe520RVMEexJUF3jzHc6klcs74foTvnYDN3DyzPhImoDqE25XAc0n3A4AS4BnhkZQyoB1SqlrpVDcoGgyQ1t/isIcZs9fOtDJU3vauem8auaVBs7q8TKmRVVEXiwT4Qc/+AGf+cxn+Nd//VeuuuqqfIcjhBBC5E3I4yDsdRBLZUdcvpw1LQZSWbLaIuR2jrl1rJjaDKVYVB5kUXmQD15YS3s0ySsHu3hqdzv/+/xBvvfiIVbXRHjbwhLOqy2Q58ckVuB1sq8tStayKAm4ZcvCMHJZ4v7QibeVUj8C/pDDuTcCc5VSdcBRBpfJ33rCefuAohPO+wzwWUnO/+hwVwyX3TjjEpCeeJpvPL2f+mIfN6+pHvXYkQykshT4nVK4YZxprdFas2nTJp566inpbS6EEGLGGywu52fzke6TLtZNSw8m5ZaJ3bBRVeDBZTfY3RIl4nVilyRsxioJuLl2eSXXLq/kcFeMp/e08/TuDjY0dBNw27lsXjFvW1hKfbE/36GKU9gMRYHPRVNPgsNdcYIeB1VhD2GvU/amDzmbbKwOmHWmg7TWWaXUXcDjDLZZ+57WeodS6ovAJq31urN47BmjP5mhI5o+Y89zrTX3P7WfRMbk01fOP+sPq2TGZFF58Kx+VuRm69at/MVf/AW//e1v+frXv57vcIQQQohJI+R1UOR30Z/IoBSkTAuHoSgNuikJuAm47RhDFeENFDta+yn0nl4lXsw8swp93HFRHbdfUMuWxh6e3DVYBf6321qYW+LnPcvKuWROsSR/k8ixNmwAibTJzpZ+AAIuB8UBJyGPE7/bPmNf37nsQe/hj3vHDaAb+LtcTq61fhR49JT7/mmEYy/L5ZwzRUNnLKe94E/uamdDQzcfvqSOmrPcy5FImwTcdoIemT0fD1prvvnNb3L33Xfz1a9+lYKCgnyHJIQQQkw6dUU+tjb2UuR3URp0E3A7hr1ALwt7
yGrNntYohcO0chMzk81QrJlVwJpZBUSTGZ7Z08Gj21u4b/0+vvvCIa5cVMa7lpRREpRe3JOJx2nD4xzMeZIZk4auOJaO4bAZ1BX6KAm6ZtxqmVEzMjW4tno5g0vUASyt9Ywq0pYPffEMnQMpiv2jv4H0xNN858WDLK4Ics3yirN6rFTWZCCdYWV1RKopjpPm5mYeeughXnrpJebOnZvvcIQQQohJKeB2cPGcopyuR6oiXixLs799gEK/K+fONccuY+WaZ3oLuB1cs7yC9ywrZ1tTH797o4VfbWniV1uaOK+2gHcvLWd5dXjMHY/E+HI7bMcnKDOmxZ62KAc7B6gt8lEWdM+YRH3UBF1rrZVSv9Jar56ogAQc6hzA6zjzbPZ3nj9IKmNx1+VzzuoNJpbKkjEtVtVECHvPXIhOjM3LL7/Mww8/zFe+8hWeffbZfIcjhBBCTHpjSZxrCn1kLc3hrvgZtwTC4OxcNJnBMBSFvjMfL6Y+pRTLq8Msrw7THk3y++2tPLGzjVcPdVMZ9vCORaVcOq84p+ePmFgOm0GR30XGtNjfPkDnQJpllaHjW12ms1yGITYopVaNeyQCgL5Ehp545owVDTcd7ua5fZ3cdF71WVVe702kUQpW10pyfq5ZlsU999zD9ddfz2WXXZbvcIQQQohpq7bQR1HASW88PepxfYk0adNieXUYh80gY1oTFKGYLEoCbj54YS3fv+M8Pn3lPPwuO99/qYEPPbCRv//VGzy+o5W+RCbfYYpTOGwGhT4XPfEUzX2JfIczIUbMApVSdq11FrgE+IhS6gAQY7C/udZaS9I+Dg53nXnveTJj8s1nDlAd8fDeVVVjfoyuWIqI18nC8qAUzDjHtNY89NBDPPLII2zcuJGampp8hySEEEJMW4ahWFAWZMuRHgZS2dO60ZiWpieepijgZF5pAJfdRn2Rxe62qMyiz1AOm8Hl80u4fH4Jzb0Jnt3bwbN7O7j/6f3c//R+6op8LK8KsbwqzOKK0PH90SK/Ih4X+9sHiHid074122j/ug3AKuD6CYplxosmM3QNnLly+09ePUJ7NMU9Ny4dc5/H3kSaiNfJksqQFFU5x9avX49pmtx00028733vw26f3m8eQgghxGTgsBksqQyxuaGHVNbEZbeRzJjE0lmUgrmlfirDnuPL54uDbhq648ePFTNXRdjDLWtruPm8ag50xNh8pIdtjb387o0Wfv16Mw6bYmV1hAvrCzm/roCA25HvkGcsm6Fw2W3sbu1nZXVkWi91Hy2DUABa6wMTFMuM19STwHmGhHt/+wDrth7lqsVlLK4Ijen8A8ksHruNheVBSc7PoWw2y913380DDzzAj3/8YwzDwDBkZYIQQggxUbxOO0urQrx2pAdFBr/LzsKyAAV+12mTGTZDUV/s442j/RT7JUEXg3vV55T4mVPi56Y11aSyJrtaomxs6OalA11saOjGULCsKswlc4q4YHYhIY8k6xPN77LTOZDiaG+C6rPsXjUVjJagFyulPj3SN7XWXxmHeGaseDpLW3+SglH2g6eyJl9dv5eQx8GfXlQ7pvMn0iYmFiurCmRZ+zl21113cejQIV577TVKS0vzHY4QQggxI4W9TlbVRLAZCr/LPmrBuUKfi4BrcKY9l7a2x2RNi+54mpKAtOqazlx2Gyuqw6yoDvPhS+rY1z7ASwe6eOlAJ/c/vZ//fmb/8WT94voi/G5ZNTlRIl7n4FJ3n/O0LS3TxWj/KhvgZ2gmXYyvoz0J7IYa9cPkgZcaONwd55+vWTymJ2Q6a5HImqyqCY/pQ0iM7tFHH+WSSy7hX/7lXygsLJRZcyGEECLPci18axiK+pIA25p6xnRt1J/M4HXYiaezeJ3TMzkQJ1NKMa80wLzSAH964SwOdsZ4cX8nL+wfTNa/9ewBVs+KcNn8Es6rjci2iXFmMxQeh41NDd1EvE6K/S78Hjtehw1DKZSa+m0UR3tnadFaf3HCIpnBkhmTo70JIqN8
qGxs6OaRbS1cu7yCVbMiOZ9ba01vIs2K6rDsmzlHUqkUf/M3f8O6det49NFHWbRoUb5DEkIIIcQYRbwOgh4n3bE0hgJLazTgtBnDXjOZlkYpqC32sqc1Kgn6DKSUor7YT32xn9svmMX+9gGe29fBc3s7efVQNx6HjcvmF3P1knLqinz5Dnfa8rnseJyDK2D2dwxgaQ2AQqHRKMDrsrOsKjQlB0zOuAddjL+WvgSGUiP2Mu+JpfmvJ/dRW+jlTy+sHdO5e+Jpagq8FEp/x3Mim81y6aWXUllZyZYtW4hEch8sEUIIIcTkoZRifmmAlr4ELrsNl93AMBR726LDFpCLpjJUhr2UBNwcaB8ga1rYx1isV0wfSinmlgaYWxrgjovq2N7cx5O72li/q43HtreysCzAVUvKuai+UFawjgNDKbxOO8PNb2qt6U9m2NMaZUnF1OudPlqC/rYJi2IGS2VNGrvjBN3Dz55bWvPVJ/eRSJt86YalY9o/nsqaOOwGtTKCd05s2bKFlStX8s1vfpOVK1dO+eUzQgghxEznc9mZUxI46T6tNduP9uE6oYCc1pqsqakIe7AZiuqIlyPd8ZyX1IvpzWYolleFWV4V5sOXzOap3e08tr2F+9bv5VvP2rhwdiGXLyhhqXRRmhBKKUIeJ52xJE09cWoKp1YuNGKCrrXunshAZqojXXEUasQX6++2tfDakR7+/K311IyhWqHWmr5EhpU1kTG3YhMni8Vi3HXXXbzyyits3ryZVatW5TskIYQQQoyTIr+LQr+LaDJzfKl7NJmlPOw+3hO7JOjmYGcMrbUM2IuTBD0Orl9ZyXUrKtje3M8ze9p5cX8nT+1pp8Dn5KL6Qi6qL2KRdFUad8d6pwc9jik1mCabZ/IolsrS1JOgwDf8E6Y3nuZHrxxm9awIVy8pG9O5+xIZKiOeEc8tctPQ0MDVV1/N+eefz8aNG/F6p29LByGEEEL8seXWqwe78Voam6FImxaVYc/xYzxOGwV+J/GUie+Uwr0Z05LJEYFSiqWVIZZWhvjYpfVsbOjmmb3tPLGjjUe2tRDyODi/roA1tQUsqwyd9jwSb57NUATcDrY397FmVsGU2Wogz4Q8Otg5MLjfaYSR15+8eoS0afHhS+rGNDqbzloohRSneBO01nR0dFBWVsaXvvQlbrjhhnyHJIQQQogJ4nXamV3k41BXDLd9MBk/tXBcdcTLtqbekxKr3kQarTWmBQU+54jXeGJmcdoNLp5TxMVzikikTV470sNLB7p4fl8nT+xsw1AwrzTAyuowSytDzC0NTJlkcrJzO2ykTYs9rVEWlgenRLtpSdDzpDeepiOapniE4m2Hu2I8sbOVdy8tpyqS+6ytaQ1WbZ+qVQsng97eXj7ykY+gtebnP/+5JOdCCCHEDFQZ8dDcl2AglWFBeeC074c9DuyGQca0sBuK7niaiNfJ/LLBwnMHO2NEPE6ZTRcn8Thtx5P1jGmxuzXK6429vN7Yw4MbG/npxkZshqK+2MfCsiAragb3tsvz6OwF3Q5642lePdRFfZGf0pB7Um8vkAQ9DyxLs68tin+E9hxaa777wiG8Tju3rK3J+bzprEVfIs2i8iDFAfe5CndG2bhxI+9///u55ppruPfee/MdjhBCCCHyxG4zmF8WpLE7Rshzets1w1DUFHg42BnD0lARdjO3JIDNUNQV+fG77OxsieK2G9KSTQzLYTOOL4O//YJZRJMZdrVE2dXSz67Wfh7d3sJvtjbjd9m5YHYBl8wpZnlVSLoHnIWw10nWtNjbHuVwd4w5JX6K/K5JWUNC3i3yoL0/yUA6S5Fv+CR68+EetjT28uFL6nLuXZ5ImySyJsurw9JS7SxYloVlWWQyGe677z6uv/76fIckhBBCiDwr8DkJexwjXsSXBN0c7o5TF/FSU+g96bjigJvVTjs7jvbRFUvhMAx8LvuknrkT+RVwO1hbV8DaugJgsJ7BliO9vLC/g5cOdLF+VzuFPifv
W13FOxaXyaz6GNltBoU+F6msyfajfZSFPMwp8U+6/0dJ0CdY1rTY3zlAaIS2alnT4rsvHqIi5OZdS8tzOudAMouJxaqacM4Jvfij9vZ2/vRP/5R3v/vd3HXXXfkORwghhBCTyGg9lN0OG+fVjlx8yu+ys6a2gGgyQ3s0RXt/kqylcdttUhRMnJHDZhxP2NNZi9eO9PCrLUf51nMH+flrTbxvdTVXLiqddAnmZOey23D6DNr7k/QnMiyuDOGfRK9H+W1OsI5oikxWj/hCenxHK009Cf7s4rqcXmwZ0yKrLVbXFEhyfhaeeuopVq5cyapVq/jYxz6W73CEEEIIMcWcqZiXzVCEvU7mlQa4sL6IFdVhPE4bnQNJBpLZCYpSTHVOu8EFswu558al/Mt1Syj2u/jmswf4yA838YvXmhhIyXNpLJRSFPgGVx1vauimvT+Z54j+aPIMFcwAlqVp6Bp+HxNAfyLDT149wrLKEOcPLW05k/5khnklgeN9OcXYPP300zzwwANceeWV+Q5FCCGEENPcsWQ97HXSn8xwuDNG50CKE1fQKwaX4gZl4kUMQynFiuowy6tCbGns5RevNfHASw08tLGRKxeVcu3yCkqDUosqV16nHYfNYHtzH6vtNkLe/L/uJEGfQN3xNMmMhd81/C/+h68cJpbO8tFLZ+dUsCCZMXE7bJSG5EU4Fo2NjXzoQx/ia1/7Gl/84hcnZXEIIYQQQkxvQbeDpVVhBlJZUhkTpRTHrkha+hK09ScJup0ntYXKmBZ9iQxKaXxOh7TimsGUUqyqibCqJsKBjgF+/fpRfvdGC49sa+bKRWXcfF41RVKXKicOm4HdMOhPZiZFgi5L3CeI1pqGjtiI+xv2tkV5Ykcr1yyrYFZhbv3LB1IZ5pb4pdjIGKxbt441a9Zw+eWXM3/+fEnOhRBCCJFXfpedQr+LAp+TyNCfRRUhllSGSGSy9CbSpLImnQMpklmTBWUBllVFyFgW3bEUWut8/xNEntUX+/nMlfP5zgfX8K4l5Ty5q42P/mgT//v8QXrj6XyHNyV4HDa6Yql8hwHIDPqE6UtkiKayw45kWVrzrWcPEPY6uPX83NqqDaSyhL1OCnzDF5sTp0skEtx777386le/4qKLLsp3OEIIIYQQIyoOuAl6HBxoH6A/kWVxRZBCv+v4xMx5tQU0dMY40h0n4HKMuN0xmTFJpE1cDgOPwyaTE9NYkd/Fx95az/UrK3lw4xEe2dbMEztbec/SCm5YWUlwhG22Alx2g554BtPSeZ/8lBn0CXK4Oz7iMqQ/7GxjX/sAH7q4Lqc+mVprkhmT+hK/vMnmYP/+/Xz84x/H4XDw/PPPS3IuhBBCiCnBZbexqCLEBfWFlATdJyUODpvB3NIAq2dFMGzQGUvRFUvRl8iQSJt0x1J0De1vryv24bIbdMXSdMVSJDNmHv9VYryVBt381dvmcf+tq1hbW8AvXmviwz/cxA9fbiCazOQ7vElJKYVGE0/nv9iezKBPgIFUlu6B9LCz59Fkhh+83MDiiiBvnVec0/n6kxnKw24pHpKDn/70p3zyk5/k7rvvxmaTUWMhhBBCTC9hr5M1swbbcA2ksvTE0vQnM1RE/BT6XMdn1qsLvCQzJr3xNI3dcXrjacJeWYk5nVVHvPzNOxfw/jUxHtzYyMObm3hkWwtXLSnjmmUVFAdkj/qJFBBLZfPeGUsS9AnQ1BMfsWXaj145TCyV5eOX1ueUPGZNi6ylqc1xn/pMtmHDBr7whS/wxBNPsHLlynyHI4QQQggxbpx2gwL76Nsf3Q4bZSEPhX4X25p66U9mZMJnBphV6ONvr1rAzV0xfrapkd+8fpR1W5t5y9wiblhRyexif75DnBTcDhtdA2nKQp68xiEJ+jhLZkxa+5IUDDNC2dyb4PEdrbxrSTm1Rbkl3L3JNPNKglK1cxRvvPEGO3bs4Oabb2bbtm24XDI6KIQQQghxjMNmsKQyxNbGXqLJTN5nDMXEmFXo
42/euYAPXphk3dZm/rCzjWf2dHBRfSF/dlEdZTO8M5TLbqM7nkZrnddVt7IHfZy19ydRMOwv+SevHsFhM3j/edU5nWsglSXkdlA+w188I9Fa8z//8z9cccUVZLOD+0ckORdCCCGEOJ3LbmNZVRhDDS7rFTNHadDNR94ym+/dcR63rq1h8+Ee/vwnm3ngpUMz+rlgMxSmqUnkuUaDzKCPI8vSHOlODDsqeagzxvP7OviT1VVEctj/Y1qDheGWVRVgSFu1YX35y1/mRz/6ES+88ALz58/PdzhCCCGEEJOa22FjWXWYLUd66YqlsBsGTpuBy2FgSN2eac/vsnPL2hresaiUH75ymF+8dpT1u9q5fkUl71hUOiOrvmuliaXMnAp3jxdJ0MdRbyJD1rJw2E5/cv/k1cN4nTZuXFmV47nS1Bf78I3QR30m27BhA2VlZdx5553cdddduN2ywkAIIYQQIhdep53VsyLEUlmiySx9icEic5YFGo2Bwmk3cDtseW8/JcZHod/FX799Htcsq+AHLzfwg5cb+OmGI1w2v5hrllXkvBV3OnDZbPTE03ktoCfZ3jhq7InjGWav+J7WKK8e6uYDF8zC7z7zryCezuJ32qmMeMcjzCnLsiy+8pWvcO+99/Lggw9yxRVX5DskIYSYVpRSVwH/BdiA72it7xnmmPcDXwA0sFVrfeuEBimEeNPcDhtuh41CvwvwobUmlbVIZkxiqSy98Qzd8TSmpfE4bHmdXRTjZ06Jn3+5bgkNnTEe2dbM03s7eGJnG/NLA7x1XjGXzC3KaeXvVDZYKC4FpYG8xSCvrnEST2fpGUgPvdGd7EevNBDyOLh2WcUZz2NamnjaZE1tREYtT3HbbbfR0NDAhg0bqK2tzXc4QggxrSilbMA3gCuBJmCjUmqd1nrnCcfMBT4PXKy17lFKleQnWiHEuaSUOp60h71OKiODnYR6ExmauuN0DqRwOwz8rpm3BHomqC3ycdcVc/nghbWs39XGM3s7+PbzB/nOCwdZVhXmkjlFnF9XMC3b9DlsBv3JDMmMmbei3JKgj5O2/uSwCfXWpl62NvXx4UvqjvelHImlNd3xFAtKg1Jd8wRvvPEGS5Ys4XOf+xxLlizB4ZD/GyGEGAdrgf1a64MASqkHgeuAnScc8xHgG1rrHgCtdfuERymEmBB2m0GR30WR30U8nWVrYy/xdFZm06exoMfBjauquHFVFYe7Yjy3r5Nn97Zz/9P7+cbTsKA8yAV1BVwyt4iSwPTaYhpP5y9Blyru48C0NEd7Ti8Op7Xmx68cpsjv5Ool5Wc8T1csxaxCHxWR/Pbimyyy2Sx3330373znO2lqamLlypWSnAshxPipBBpPuN00dN+J5gHzlFIvKqVeGVoSfxql1EeVUpuUUps6OjrGKVwhxETxOu0sqQyRyJhkTOu073fHUnQOpOiKpehLZIY9Rkwtswp93H7BLP739jV87eYV3LK2hlTG5PsvNfDRH23m/qf20d6fzHeY54TdMOhLpPP3+Hl75GmsO5Yia+nTZtC3NfWxuzXKX1xWj9M++thIVyxFWdBNXeHMKcowmr6+Pq699locDgevvfYaZWVl+Q5JCCGmu+H2VelTbtuBucBlQBXwvFJqida696Qf0vrbwLcB1qxZc+o5hBBTUMDtYFF5kO3NfRT6XBhKobWmK5amJOCivsRPKmPROZCiPZqkP5nBUAqf037G62AxeSmlqCvyU1fk55a1NbT2Jfn160d5fEcrT+5u5+0LS3n/muq8Fll7s9wOg66BNHVF+Xl8SdDHQWNPAq/j9P/an21upMDr5G0LSkf9+f5khoDbzrzSgLRUA7q6uohEInzsYx/jpptuwmbLz3ITIYSYYZqA6hNuVwHNwxzzitY6AxxSSu1hMGHfODEhCiHyqSTopjaV5XBXnIjXSXcsRU2hj9lFPgxjcB97yOtgdrGPWNqkJ5amuTdBdCCDzVCEPA6UtHOb0spCbj7+1nreu6qK
hzc38oedbTy1u50bVlby3lVVZ9zSOxm57Da6YikypoXDNvGDSTJ8dY4NpLL0xzOnPRl3t/azramPG1ZWjjpqmMqaACyuCGHPwxNiMkmn0/z1X/81V155JQC33nqrJOdCCDFxNgJzlVJ1SikncDOw7pRjfg1cDqCUKmJwyfvBCY1SCJFXtYU+igJOOgeSzC8LMqfEf9oEk1IKv8tOdYGXtXUFnFdXQFnITVcshWnJoprpoDjg4i8um8P/fGA1F9YX8tCmRj7+k808tbsNS0+937EC4ikzL489szPAcdDWN3xxuJ9taiTgtvPOxSMvzdZa05/MsrA8mLeiBJPFwYMHueiiizh06BDr16/HMOSpKoQQE0lrnQXuAh4HdgE/01rvUEp9USl17dBhjwNdSqmdwNPA32itu/ITsRAiHwxDsaAsyHl1hVTmUDdJKYXPZWdOiZ+5pQG642nZoz6NlATdfPYd8/mP9y6j0OfkvvX7+MzPtvLi/s4pNRijgayVn+elLHE/h7KmRXPf6cXhDnUOsLGhh9vOrxl1mUd/MkN5yE2Bb/q1LBiLbDaLaZrccccdfOITn5ClT0IIkSda60eBR0+5759O+FoDnx76I4SYoRw2g5BnbJMpSimqIl7cdoPtzf34nPacJ6hMS9OXSOO02fC7JZ2ZjBaUB/nP9y3nmT3tPLixkXt+v5uyoJvrV1TwtoWlM34ycjTyjD6HumNpzGGKw/1sUxMeh433LB2573nGtNDA7OKZWxQuHo/zyU9+kkAgwH333cfcuXPzHZIQQgghhBhHRQE3q2bZ2H60j1gsC4DDMHA5DJw247SJmv5EhoxlMavAS2NPnIxp5GWfsDgzQymuWFDKW+eV8MrBLn615Sjfeu4gP9lwhD9ZVcW7l5XjskuifipJ0M+hIz1xfKf0gmzqifPi/k7eu6pq1BG+vkSGxRXBGfsk3b59OzfddBMrV67kvvvuy3c4QgghhBBiggTdDs6vKySRMUmkTfoSafoSWXriGUCjGWx9lTYtSgJO6osDeJw2fC47O5r7KPJPrx7c043NUFw8p4iL6gvZ2dLPQxsb+f5LDfzm9Wbef14171hUKoMsJ5AE/RwZSGWJJrIU+U9uKfDzzU047AbXrRh59rw/kaHQ75zS7QjerKeffprPfvaz3HHHHbKkXQghhBBihrEZg4Xk/C778Wtiy9KDSXvG5P+1d+dhUlV3/sff36rqqq7eVxq6m1VEQTYBFYPGGDNGE5UsJBJDHI38nJhoMjHJ73F+jhMmTjJxzJhJJmYSRhON0ahxTIKjZnGJiUQRBFxAVETZGhC6m95rP78/qiBN00ADVV1V3Z/X89TzVN26de+3Dt19+N7zved0hKJUBP1U9roVtLY0wIjSQlq7I5QHh/ctovnAzDilvpxvzC/nle1t3PP8Zn70zFs8vHobC08bzbknjRj2k2SDEvS02dkWwtentL25M8wf39jNhVNHUlHU/x+NeMIRTSQ4cUTpsEtM29ra+NznPsdVV13Fddddl+1wRERERCSHeDzJCeWKA76DBsEgmfCdMKKEle+0HLQkVigaxznycpmv4WBaQzm3fGwaq7fs5efPb+b7T23kly9uY+FpYzhnUm2/k24PF7pEkQaxeIId/UwO978v78A5x/yZDYf87N6eCOOri4fdH4+VK1cya9YsKisrmTdvXrbDEREREZE8VFjg5eSRpbT1REg4R0coyp7OMEayOL6tJ5LtEOUQzIzZYyu57ZMz+McPTyZY4OW7T7zBF+5bzQtvt2Q7vKzRCHoa9Dc5XE8kzuPrdnDmhGpGlvV/X0wsnsBrxqiKIy9J+YhCPwAAIABJREFUMZQ457jpppu45ZZbWLBgQbbDEREREZE8VltayMjyCDvaehhVHqS+IkhZoY9o3LGuqY2W7jBVRcP3VtJcZ2acMb6a08ZVsWJTMz97fjM3P7qeM8ZXsfjsCYfMpYYqjaCnQX+Twz21YRdd4TgfOczoeVso
yriaYvy+4fHPsGfPHv7u7/6O9vZ2Hn/8cSXnIiIiIpIWJ44o4T0n1DB5VBnlwQLMDL/Pw7SGcmqKA+zpDJNcGVJylceMM0+o4fsLT+WK94zjpW17+cK9q3lg5RYiseysSZ4NwyMzzKDOcIzOntgBa/nFE47fvNTESXWlnDyqrN/PReMJfB5jZPnwuCL0zDPPcOqpp1JRUUFRUdGwu99eRERERDLH5/X0u7a2z+th8qgyRlcGae6K0NIVprkrTGt3hO5ILAuRypEUeD18fFYjP7xsNqeNq+TnK7bwf+5ZxaMvNxGND/1EXSXux6m5I4zXe2CyufKdFna0hbj8zHGH/FxbT5ST6kqHxZICO3bsYNGiRSxdupQLL7ww2+GIiIiIyDDi8SQnkxtVESQaTxCNO0LROHs6w+zpClEZDAzrSclyVW1pgBsunMwr2/by8xVb+NGfNvHQ6u1cOmc0500eMWTzKCXox8E5R1Nbz0Hl7b9eu50RpQHOnFDd7+ei8QR+n4cRZUP7XpimpiYee+wxFi9ezJtvvklh4fCoFhARERGR3GKWnBG+t4aKINtau3lrdyclgYJ+R+Al+6Y1VvDthnLWbt3LvSu2cPsfN/LQ6q1cdvoYzpk0YshdXBmalx0GSWc4Rjh64JIOb+7qYF1TOxfPqD/kD0tbT5QTaoqH9Dp/jz/+OLNnz2bnzp0ASs5FREREJKd4PMaY6mJmja0imkjQ2h0hNgxKqPORmXHqmEpuXTCdr180heKAj+8+8SbX/WI1yzfuITGE5hfQCPpxaOmMHFTe/puXmggWeDl/Sl2/nwlF4wQLvNQO4dkIf/WrX/GlL32JBx98kLPPPjvb4YiIiIiIHFJ5sIA5Y6vY0tLNrvYQ0VAUA4IFPvw+z5Aboc1nZsaccVXMGlvJc281c++KzXz7txs4pb6ML513IqPK8391rIwm6GZ2AfA9wAvc4Zz7dp/3rwcWAzFgN/BZ59zmTMaULv2Vt+9qD/Hsxj1cNG0URf6DmzaecHSEo8weWzUkf9E3bdpEV1cXF154IWeffTY1NTXZDklERERE5Ij8Pg8TR5RwQm0x3ZE4bd1RdneG6AhFiSUc+/7nvm+c1oPh8RjFfu+QrorNVR4z5k2sYe6Eap54bRc/Xf421/1iDZ+dN54Lp47M6wmpM5agm5kXuB34G2AbsNLMljnn1vfabQ0wxznXbWbXAP8GXJqpmNJpX3l7SaBg/7b/Wb0NAz5yav9Lq7V0hzlxRCnlwYJ+389nDz74INdeey3f+c53mDZtmkraRURERCTv7LtXvTjgo74yORobjSeIxJKPuHMkEo5Y3BGOx9nc3E3A66WkUIXJ2eD1GB88ZSSzx1by/Sff5L+eeYvnNjVz3fsnMqI0P/ORTP4knQ5sdM5tAjCz+4H5wP4E3Tn3dK/9nwcWZTCetOpb3t7cGeYP63fxgcl11JQcPPlbW0+E2tIAjZX5X3bRm3OOm266ifvvv3//feciIiIiIkNFgddDgddDcT/zO9eVFfL6zg72dIapLPIPySrZfFBTEuCfLzmF367byU+Wv83f37+Wr55/ErPGVmY7tKOWyXqMBmBrr9fbUtsO5Srg8f7eMLOrzWyVma3avXt3GkM8Nv2Vtz+8ZjsJ51gwu/Gg/UPROGbGpLrSvC636OuNN94gGo2ycOFCVq9ereRcRERERIaVIr+PGY0VnFBbTGt3hObO5BrrbT1ROkLRo163W5PUHTsz48Kpo/jepadSXeJnySPruH/llrybQC6TCXp/mWi/rWNmi4A5wK39ve+cW+qcm+Ocm1NbW5vGEI9N39nbW7sj/PbVnZx70gjq+kz+Fk84OkJRptaXE/ANjaUbnHPceeedzJs3j7Vr1zJ16lTKysqyHZaIiIiIyKDbNxv8aeOrmFJfxgm1xTRWBqktDdAZjtIZjh3xGF3hGLs7Q7R0R446qZcD1VcEuXXBDM45qZZ7V2zhXx5dP6B/g1yRyRL3bcDoXq8b
gaa+O5nZB4AbgXOcc+EMxpM2zX3K23+9ZjuxRIJPzB590L6d4RhjqospLxoa953HYjEuv/xyXnnlFZ555hmmTJmS7ZBERERERLKuJOCjpM9a642VRaxraqOlK1kC37eaticSpysSpaLIz+T6KiKxOK9sb6em+OB9ZeAKC7xc/4FJnFxXyh3Pvs31D65lycWnUF+R+7cbZ3IEfSVwopmNNzM/sBBY1nsHMzsV+DFwiXPu3QzGkjbOOXb0Km9v74ny2Ks7OGtiLQ393F8ejSeoK+vnhpU81Nraitfr5aKLLuKFF15Qci4iIiIichhBv5eZoysYVVHIns4wXeEYrV0RWrrCNHeF8RjMHF3JzNEVlAcLqC0tpLGykJbuSLZDz3tmxoen1/PNj06jKxzjqw+9xGs72rMd1hFlLEF3zsWAa4HfAa8BDzrn1pnZN8zsktRutwIlwC/NbK2ZLTvE4XJG3/L2ZS81EYom+OSc/u89Lw34KC3M79Fz5xz/8R//wfTp0+nq6uKyyy4jGMz9q08iIiIiItnm83qYVFfGlFFlFBf6GF9bzIzRFcydUM2c8VVU9hktH19TQmGBl55IPItRDx1TRpVx64IZlAR83PjrV1i+cU+2QzqsjK4H4Jx7DHisz7Z/6vX8A5k8fybs7Y7un52xJxLnkZebOHNCNWOriw/atysSY/LI0sEOMa2am5u54oor2LVrF8888wwlJSXZDklEREREJO+MrAgycgAl1gVeD1Pqy3jxnVb8Ps8BM8M751T6fgz23Zf+zUfXc8tvN/DZeeMPuTR2tmWyxH1I2tUeIuhPTvb23KY9dEfizJ9Zf9B++2YLrOxvPYY8EY8nr9rNmTOHZ599lgkTJmQ5IhERERGRoa+ssICJI0rY0xWiOVUO39wVprUnQnNXWBPJHYPyYAE3f2Qq7zmhmjuXv83PV2zG5eAM7xkdQR9qwrE4HaHY/nXOn359NyPLCpky6uAZzDtDMUaVF+L35d81kHg8zre+9S3WrFnDww8/zNe//vVshyQiIiIiMqw0VATxew2f10OgwEvA58Fjxq62EG/t6SSRcJQHtfb60Qj4vHztgydT9MeNPLByK+FonM/OG59TVQlK0I9CRyjGvn+7PZ1hXtq6l4Wnje73HzQSTzCyLP/u025qamLRokU45/j5z3+e7XBERERERIYlj8eoKz84n6ivDFJTGmD73m7e2dNNwOfJ+zmvBpPXY1x77kQCPg+/XttEOJbgc+ecgCdHkvT8G97NoubOMAFvsrz9mTd244D3nTTioP0isQTBAi9lwfy7/vGXv/yFc845hyeeeIKGhty8L0NEREREZDjz+zyMrynhjAlVFAd87O5U2fvR8Jhx9dkTWDCrkcdf3cn3n3wzZ8rd8y+DzJJEwrG7I0xJoADnHE9teJfJI0v7XUuvIxxl4oiSnCqVOJxIJMKNN97IySefzFVXXcWCBQuyHZKIiIiIiBxBkd/H9MZydrWFeOPdDrweD4W+5ICiI5lw+r2evMlLBpOZcfmZYzGDX764jfNPGdnvrcuDTSPoA9QViRGLO7weY9OeLra0dHPuyQePnjvncI7996nnurfffpuzzz6b1157jfnz52c7HBEREREROQpmxsiKIKePr6aq2L8/MTcMDPZ0RnJmdDjXmBmfnDOaYr+Xx17Zke1wAI2gD1hbTxRPagKGpze8i89jnDWx5qD9usJxRpQFKCzwDnaIx2TJkiVceumlfPnLX9aVNRERERGRPFVY4GVynxFg5xxvvdvJltZuaooD+v9+PwoLvJw3uY7HXtnB4rPGU1Hkz2o8GkEfoF3tYYIFXuIJxzNv7ua0cVUHTcbgnCMUizOmqihLUQ5MT08P119/PVu2bOGuu+7i+uuv1y+riIiIiMgQY2ZMqC1hVHmQlu5ItsPJWRdOHUks4fjD+l3ZDkUJ+kBEYgnaQ1EKC7ys2drK3u5ov+XtHaEYdWWFOT2L4muvvcYZZ5zBjh07qKio
UGIuIiIiIjKEeTzGpLpSqkv8tHSFD7tvPOGIxIbfZHONlUXMaCzn8XU7iSeyezuAEvQB6AzH2JfGPr1hN6UBH3PGVh6wj3OOSDzB2OrcHT0PhUJcdNFFfPGLX+S+++6jrCz7kyCIiIiIiEhmeT3G5JFllBf5aes59Eh6S3eEtp4IiWF4z/qHpo1id0eYFze3ZDUOJegDsKczjN/roTsS4/lNzZw9qZYC74FN1x6KUl9RSHEg927r7+jo4Ac/+AGBQIBXX32VxYsXa+RcRERERGQY8Xk9TBlVht/noTMUO+j91u4II8sCjK4qoj0UzUKE2XVGapK9R7M8WZwS9CNwzvFue5giv48X3m4hEk/wvkm1B+yTcI5o3DGmqjhLUR7a6tWrmT17Ni+//DKxWIxg8OBl4UREREREZOjz+zxMa6ggToKeSHz/9q5wjIDPw4l1pYyuKiKecMNuFN3rMS44ZSSrt+xlV3soa3EoQT+CrkicWCKB12P8+c091JQEOGlk6QH7tPdEGV1VRNCfWzO3r169mgsuuICbb76ZpUuXUlCQu/fGi4iIiIhI5gX9XmY0VtAdjRGJJYjGE4RicaY2lFPg9VBY4KWxsoj2nuE3in7+lDq8HuPpDe9mLYbcq8fOMZ2p8o7OcIzVW1q5aPooPL3Kw/ddXWqszJ2R6ebmZt544w3mzp3L6tWraWxszHZIIiIiIiKSI0oLC5jeUM7abXtxDqY3lh9wq25jZZBtrd3EEw6vZ/jcGltdEmDuhGr+/OYeQtH4kT+QARpBP4KW7giFPi8vvN1MLOE4a+KB5e3toShjqotyZt3z5cuXM2vWLH7/+99jZkrORURERETkIFUlAU4ZVc4JtSXUlhYe8F5hgZex1Ye/F905Rygapyt88P3s+ezDU0fi93nY0tKdlfNrBP0IWruiFPt9/PnNPdSWBphUV7L/vYRzxBOOUeW5MXp+1113ccMNN3DHHXdw0UUXZTscERERERHJYXXlhYd8r74iyJbmv46iJ5yjJxInFIuDS66xXhr0YQYdoWhOLzV9NKY2lPOdT8xgUl3pkXfOACXohxGKxonGkxMorN26l4tn1B8w+3lHKEZ9RWHWR8937txJMBjkve99L6tWrdKouYiIiIiIHJeAz8vY6mLefLeDAq8HM6guDjChtJiigI9ggRevx+iOxFixqYWiIVIOb2ZZ/R4qcT+MfTMbPr+/vL3mgPej8QQNldld9/z3v/89s2bN4oknnmDChAlKzkVEREREJC3qK4KcOKKEmaMrmHdCDVMbyhlRVkhJwLc/iS3y+zihtpi9h1lfXQZOI+iH0RGK4jHj2Y17qCsLcOKIv5a3d4VjVJX4Kcniuuc33XQTP/3pT7nvvvt43/vel7U4RERERERk6PH7PIyrKTnifg2VRTS1hQhF4wdVF0fjCXweO6ASWQ5NI+iH0dIdIRpLsHbrXs6aWHvAD1UoFmNsVXZGz9va2gA49dRTWbNmjZJzERERERHJGq/HOHlkKe3hKC61fno84WjuCtMZjtLSrdH1gVKCfgiJhKO9J8aarXuJ9ylvD0XjFPt9lAcHfyKEhx9+mEmTJrF582Y+9rGPUVtbe+QPiYiIiIiIZFBFkZ/68iBtoSjtPVH29kSYUFPM3Ak11JYEaOkKZzvEvKAS90PoicZJOMfyjXsYVV7ICbXF+9/risSYMqpsUMs0QqEQX/nKV3j88cd55JFHGDt27KCdW0RERERE5EjG1xTT3BmmNOhj4ogSivzJdPOkkaXEmhx7uyNUFPmzHGVuU4J+CN2ROO2hCC9t28vHZzXuT8aj8QQFXg/VJYFBiyUeT05WV1ZWxpo1aygvLx+0c4uIiIiIiAxEYYGX08ZX4fd6DhjM9Hk9TKkv46Wte2kPRSlLLckWTzhiiWR+5dE96oBK3A9pb3eEN3Z0kXAwd0L1/u0doShjq4sGber9u+++m3nz5uH3+/nWt76l5FxERERERHJWwOftt9K4wOth
WmM5Po/R0hWhuStMVyQGQHOX7lHfRyPoh9DaFWHDrg6K/V5OqE3OXBhPOMxgRGlhxs/f0dHBF77wBVatWsUDDzyAx6NrKSIiIiIikr8CPi+zxlYSTzgKvB68HsM5x8vb2mjviVKWhTm+co0S9H5E4wm6onHWNbUxtaF8/2h5RzhKQ0URfl/mk+X169dTWFjIypUrKS4uPvIHREREREREclyB10PvldjMjEl1pbzwdjOxeAKfd3gPTCpB70d3JE5LZ5gdbSEumj4KAOccsbhjVEXmRs+dc9x+++00Nzfz9a9/nTPOOCNj5xIREREREckFQb+XSXWlvLazg9pBnOsrFylB70dXKMaGHR0ATGuoSG4LxxlRFtg/E2G6tbS0cNVVV7FlyxYeeOCBjJxDREREREQkF40sL+TdjvCwL3Uf3vUDh7C3J8LruzooK/QxtroIgFAsTmNlUcbOedtttzF+/Hj+8pe/MHHixIydR0REREREJNfsK3WPJRKEonG6IzHaeqI0d4XpCEWzHd6g0Qh6H845WroirN/RzrTGCjxmhKJxSgt9lBWmt7kSiQS33HIL559/PjfffPOgrqsuIiIiIiKSS4J+LyePLGNrazclfh/FAR9Ffi9bW7tp64lQHhz6a6grQe8jHEvQtLeHPZ0RPtGQXNKsKxJjakN5WhPonTt38pnPfIZwOMyiRYuUnIuIiIiIyLBXV15IXfmB836VFxXwytY2OkJRSguHdvm7Stz76I7EWZ+6/3x6YznReAK/z0NVUfqu1jjn+MhHPsKZZ57JU089xejRo9N2bBERERERkaEk4PMytbEcj8foDMWyHU5GaQS9j3A0zms72qkq8tNQEaSlO8LEESV4PMc/wh2NRlm6dClXX301Tz75pJZPExERERERGYDCAi8zGitYs7WVvd0REg7AAUYskaC6OLB/eex8pgS9j0TCsWFnB6eOrthfdl5VfPyj55s3b+ayyy6jrKyMT3/601RUVBz3MUVERERERIaLoN/LzNEVbGvtocTvoyjgpbDAy+6OMBvf7aRmCCzRphL3Pt5p7qatJ8r0xnIisQTFfu9xL622c+dOTj/9dD760Y/y6KOPKjkXERERERE5BkV+H5PqSqmvDFJR5KewwEtDRZCyoI/OcP6Xv2sEvY8XN7cCMK2xgq5IjPE1x16GHgqFWLFiBeeccw4rVqxg3LhxaYpSREREREREADwe4+SRZbzwdgvBAm9el7prBL2P1VtaqSnxM7KskIRzVB5jefvrr7/O3LlzueOOO3DOKTkXERERERHJkOKAjxPrStjbE8l2KMdFCXoviYTjxS2tTB5VRjSeoNDnpdjvPerjPPnkk5x11llcc801/OxnP9MSaiIiIiIiIhlWXx6kPFiQ1zO9q8S9l/U72ukIxZgyqozOcIyxVUVHlVx3dnbS0dHBzJkzeeqpp5g2bVoGoxUREREREZF9PB7jpJGlvLi5leauMKWBAvy+/BqTzq9oMyyWcJw2tpLJI8tIOEdVycDL29euXcvs2bO55557qK6uVnIuIiJ5z8wuMLPXzWyjmd3Qz/tXmNluM1ubeizORpwiIiL7FPl9nDauiokjSogmEjR3hWnvieKcy3ZoA6IR9F5mjq7gewtPZcPODgq8HkoCA2ueO++8kxtuuIHvfve7LFq0KMNRioiIZJ6ZeYHbgb8BtgErzWyZc259n10fcM5dO+gBioiIHEJhgZfGyiIaKoK0h2I07e1hR1sPFUE/Bd7cHqNWgt6PrnCMsdVlRyxv7+jooKSkhIaGBpYvX86kSZMGKUIREZGMOx3Y6JzbBGBm9wPzgb4JuoiISE4yM8qDBZQHC6gp8bNhZwceM8oKC7Id2iHl9uWDLPF6oab08IvcP/fcc0ybNo1nn32WCy64QMm5iIgMNQ3A1l6vt6W29fVxM3vZzB4ys9H9HcjMrjazVWa2auPGjZjZ/seLL77Iiy++eMC2JUuWAFBfX79/2+zZswG4+uqr
D9i3qamJRx555IBtS5cu3Xfe/Y+LL74YgIsvvviA7QBLly49YNsjjzxCU1PTAduuvvpqAGbPnr1/W319PQBLlizRd9J30nfSd9J3yvHvNKIsyF23/D9KC318/IPvZfroCqaPruC82ScD8MPb/nX/tnNPrmPtmtVp/U4DZflSi7/PnDlz3KpVqzJ2/KbWHt7a08m8E2rw9LN+XiKR4NZbb+W2225j6dKlzJ8/P2OxiIiIAJjZi865OYN8zk8AH3TOLU69/gxwunPuul77VAOdzrmwmX0O+KRz7v2HO26m+3EREZHDcc6xtaWHjbs7qCkO9Js8N3eFmdZQTnXJ4Qdtj8ZA+3KVuPdhBnVlhYdMzgHa29tZuXIlY8aMGezwREREBss2oPeIeCPQ1HsH51xzr5f/DdwyCHGJiIgcMzNjdFWQSCzO1tYeatKYhKeDStz7KAsW0FgZPGj7k08+yYwZM+ju7uab3/ymknMRERnqVgInmtl4M/MDC4FlvXcws1G9Xl4CvDaI8YmIiBwTM2NCbQkjSgO0dIezHc4BNILeR3GfmdtjsRhLlizhpz/9KXfffTclJSVZikxERGTwOOdiZnYt8DvAC/zEObfOzL4BrHLOLQO+aGaXADGgBbgiawGLiIgchX1rpoe3xWkPRXNm4jgl6Eewfft2NmzYwOrVq6mrq8t2OCIiIoPGOfcY8Fifbf/U6/k/AP8w2HGJiIikg8/r4ZSGctZs2Ut3JEaRP/vpsUrcD+E3v/kNn//85xk7diwPPfSQknMREREREZEhJuDzMqW+jK5wjEQOTKCe/UsEOSYcDvO1r32NZcuW8Ytf/CLb4YiIiIiIiEgGlRUWMKa6mO2t3VQVZ3fSOCXofdx7771s376dNWvWUFlZme1wREREREREJMPGVhexqz1EJJbIahwZLXE3swvM7HUz22hmN/TzfsDMHki9v8LMxmUynsO59957efTRR7nyyit56KGHlJyLiIiIiIgMEwVeD5PqSmgPRbIaR8YSdDPzArcDFwJTgE+Z2ZQ+u10FtDrnJgLfJQvrp3Z1dXHllVdy880309jYiJn1u1i9iIiIiIiIDF01JQFqSgN0h+NZiyGTI+inAxudc5uccxHgfmB+n33mA3ennj8EnGeDnB0vXrwYgFWrVjFjxozBPLWIiIiIiIjkCDNjYm0pRQFv1mLI5D3oDcDWXq+3AWccap/UeqttQDWwp/dOZnY1cDXAmDFj0hrk0qVLKS0tTesxRUREREREJP8E/V6mjCojUJCdJD2TI+j9jYT3nbd+IPvgnFvqnJvjnJtTW1ubluD2UXIuIiIiIiIi+4woK6QkkJ351DOZoG8DRvd63Qg0HWofM/MB5UBLBmMSERERERERyUmZTNBXAiea2Xgz8wMLgWV99lkG/G3q+QLgKedyYHV4ERERERERkUGWsXH71D3l1wK/A7zAT5xz68zsG8Aq59wy4E7gHjPbSHLkfGGm4hERERERERHJZRktrHfOPQY81mfbP/V6HgI+kckYRERERERERPJBJkvcRURERERERGSAlKCLiIiIiIiI5AAl6CIiIiIiIiI5QAm6iIiIiIiISA5Qgi4iIiIiIiKSA5Sgi4iIiIiIiOQAJegiIiIiIiIiOUAJuoiIiIiIiEgOUIIuIiIiIiIikgPMOZftGI6Kme0GNqfxkDXAnjQeb7hSOx4/teHxUxumh9rx+KW7Dcc652rTeLysSVM/rp/R9FA7pofaMT3UjumhdkyPTLTjgPryvEvQ083MVjnn5mQ7jnyndjx+asPjpzZMD7Xj8VMbZpbaNz3UjumhdkwPtWN6qB3TI5vtqBJ3ERERERERkRygBF1EREREREQkByhBh6XZDmCIUDseP7Xh8VMbpofa8fipDTNL7Zseasf0UDumh9oxPdSO6ZG1dhz296CLiIiIiIiI5AKNoIuIiIiIiIjkACXoIiIiIiIiIjlg2CToZnaBmb1uZhvN7IZ+3g+Y2QOp91eY
2bjBjzK3DaANrzez9Wb2spk9aWZjsxFnrjtSO/bab4GZOTPTUhl9DKQNzeyTqZ/HdWZ232DHmOsG8Ps8xsyeNrM1qd/pD2UjzlxmZj8xs3fN7NVDvG9m9v1UG79sZrMGO8Z8pn47fdR/p4f67/RQH54e6sfTIyf7cufckH8AXuAtYALgB14CpvTZ5/PAj1LPFwIPZDvuXHoMsA3PBYpSz69RGx5bO6b2KwX+BDwPzMl23Ln0GODP4onAGqAy9XpEtuPOpccA23ApcE3q+RTgnWzHnWsP4L3ALODVQ7z/IeBxwIC5wIpsx5wvD/Xbg96W6r/T0I6p/dR/H2c7qg9PWzuqHx9YW+ZcXz5cRtBPBzY65zY55yLA/cD8PvvMB+5OPX8IOM/MbBBjzHVHbEPn3NPOue7Uy+eBxkGOMR8M5GcR4Gbg34DQYAaXJwbShv8HuN051wrgnHt3kGPMdQNpQweUpZ6XA02DGF9ecM79CWg5zC7zgZ+5pOeBCjMbNTjR5T312+mj/js91H+nh/rw9FA/nia52JcPlwS9Adja6/W21LZ+93HOxYA2oHpQossPA2nD3q4iebVJDnTEdjSzU4HRzrn/HczA8shAfhYnAZPMbLmZPW9mFwxadPlhIG24BFhkZtuAx4DrBie0IeVo/27KX6nfTh/13+mh/js91Ienh/rxwTPofbkvkwfPIf1dUe+7vtxA9hnOBtw+ZrYImAOck9GI8tNh29HMPMB3gSsGK6A8NJCfRR/JErn3kRwJ+rNOGZRBAAAHjklEQVSZTXXO7c1wbPliIG34KeAu59y/m9mZwD2pNkxkPrwhQ/3KsVO/nT7qv9ND/Xd6qA9PD/Xjg2fQ+5rhMoK+DRjd63UjB5d57N/HzHwkS0EOV+4w3AykDTGzDwA3Apc458KDFFs+OVI7lgJTgT+a2Tsk73VZpolmDjDQ3+ffOOeizrm3gddJdvaSNJA2vAp4EMA59xxQCNQMSnRDx4D+bkq/1G+nj/rv9FD/nR7qw9ND/fjgGfS+fLgk6CuBE81svJn5SU4ms6zPPsuAv009XwA85VIzAwgwgDZMlXb9mGTnrvuF+nfYdnTOtTnnapxz45xz40jeC3iJc25VdsLNSQP5ff41yUmPMLMakuVymwY1ytw2kDbcApwHYGaTSXbsuwc1yvy3DLg8NQPsXKDNObcj20HlCfXb6aP+Oz3Uf6eH+vD0UD8+eAa9Lx8WJe7OuZiZXQv8juSshz9xzq0zs28Aq5xzy4A7SZZ+bCR5BX5h9iLOPQNsw1uBEuCXqXl6tjjnLsla0DlogO0ohzHANvwdcL6ZrQfiwNecc83Zizq3DLANvwL8t5l9mWQp1xVKfg5kZr8gWYJZk7rH7+tAAYBz7kck7/n7ELAR6AauzE6k+Uf9dvqo/04P9d/poT48PdSPp08u9uWmfycRERERERGR7BsuJe4iIiIiIiIiOU0JuoiIiIiIiEgOUIIuIiIiIiIikgOUoIuIiIiIiIjkACXoIiIiIiIiIjlACbpIFphZ3MzW9nqMO8y+48zs1TSc849m9rqZvWRmy83spGM4xufM7PLU8yvMrL7Xe3eY2ZQ0x7nSzGYO4DN/b2ZFx3tuERGRfNDr/xGvmtkjZlaR5uNfYWY/SD1fYmZfTefxReTQlKCLZEePc25mr8c7g3TeTzvnZgB3k1z39qg4537knPtZ6uUVQH2v9xY759anJcq/xvlDBhbn3wNK0EVEZLjY9/+IqUAL8IVsByQi6aEEXSRHpEbK/2xmq1OP9/Szzylm9kLqqvnLZnZiavuiXtt/bGbeI5zuT8DE1GfPM7M1ZvaKmf3EzAKp7d82s/Wp83wntW2JmX3VzBYAc4B7U+cMpka+55jZNWb2b71ivsLM/vMY43wOaOh1rP8ys1Vmts7M/jm17YskLxQ8bWZPp7adb2bPpdrxl2ZWcoTziIiI5Ku+feXXUhVoL+/rK1PbL09te8nM7kltu9jMVqT+H/CE
mdVlIX4R6UUJukh2BHuVt/8qte1d4G+cc7OAS4Hv9/O5zwHfc87NJJkgbzOzyan956W2x4FPH+H8FwOvmFkhcBdwqXNuGuADrjGzKuCjwCnOuenAv/T+sHPuIWAVyZHumc65nl5vPwR8rNfrS4EHjjHOC4Bf93p9o3NuDjAdOMfMpjvnvg80Aec65841sxrgH4EPpNpyFXD9Ec4jIiKSd1IXus8DlqVenw+cCJwOzARmm9l7zewU4Ebg/akKtS+lDvEsMNc5dypwP/B/B/kriEgfvmwHIDJM9aSS1N4KgB+k7rmOA5P6+dxzwI1m1gg87Jx708zOA2YDK80MIEgy2e/PvWbWA7wDXAecBLztnHsj9f7dJMvkfgCEgDvM7FHgfwf6xZxzu81sk5nNBd5MnWN56rhHE2cx4AVm9dr+STO7muTfrlHAFODlPp+dm9q+PHUeP8l2ExERGSqCZrYWGAe8CPwhtf381GNN6nUJyYR9BvCQc24PgHOuJfV+I8mL6KNI9pdvD0r0InJIStBFcseXgV0kO1EPyQT5AM65+8xsBfBh4Hdmthgw4G7n3D8M4Byfds6t2vfCzKr728k5FzOz00lelV8IXAu8/yi+ywPAJ4ENwK+cc86S2fKA4wReAr4N3A58zMzGA18FTnPOtZrZXUBhP5814A/OuU8dRbwiIiL5pMc5N9PMykleRP8Cyco7A/7VOffj3junbgdz/RznP4HbnHPLzOx9wJKMRi0iR6QSd5HcUQ7scM4lgM+QHD0+gJlNADalyrqXkSz1fhJYYGYjUvtUmdnYAZ5zAzDOzCamXn8GeCZ1z3a5c+4xkhOw9TeTegdQeojjPgx8BPgUyWSdo43TORclWao+N1UeXwZ0AW2pe+QuPEQszwPz9n0nMysys/6qEURERPKac64N+CLwVTMrAH4HfHbf3Ctm1pDqd58kWYVWndpelTpEObA99fxvBzV4EemXRtBFcscPgf8xs08AT5NMRvu6FFhkZlFgJ/AN51yLmf0j8Hsz8wBRklfSNx/phM65kJldCfzSzHzASuBHQBXwm9Q96kZydL+vu4AfpUrmz+xz3FYzWw9Mcc69kNq2/mjjdM71mNm/A191zl1lZmuAdcAmkmXz+ywFHjezHan70K8AfrFvwjuSif4biIiIDDHOuTVm9hKw0Dl3T+qi9nOp27w6gUXOuXVm9k2SF+HjJEvgryA5Yv5LM9tO8gL3+Gx8BxH5K3Ouv2oXERERERERERlMKnEXERERERERyQFK0EVERERERERygBJ0ERERERERkRygBF1EREREREQkByhBFxEREREREckBStBFREREREREcoASdBEREREREZEc8P8BkUexEp4NPAIAAAAASUVORK5CYII=\n",
      "text/plain": [
       "<Figure size 1008x432 with 2 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "fig, axes = plt.subplots(ncols=2, figsize=(14, 6))\n",
    "sns.lineplot(x='fpr', y='tpr', data=roc, ax=axes[0])\n",
    "pd.Series(bins, index=bins).plot(ax=axes[0], ls='--', lw=1, c='k')\n",
    "axes[0].set_xlabel('False Positive Rate')\n",
    "axes[0].set_ylabel('True Positive Rate')\n",
    "axes[0].set_title('ROC Curve')\n",
    "axes[0].text(x=.05, y=.94, s=f'Average AUC: {np.mean(avg_roc):.2%}')\n",
    "sns.lineplot(x='recall', y='precision', data=prc, ax=axes[1])\n",
    "axes[1].set_title('Precision-Recall Curve')\n",
    "axes[1].text(x=.65, y=.9, s=f'Average Precision: {np.mean(avg_precision):.2%}')\n",
    "axes[1].set_xlabel('Recall')\n",
    "axes[1].set_ylabel('Precision')\n",
    "axes[1].axhline(.5, ls='--', lw=1, c='k')\n",
    "fig.suptitle('2-Layer Feedforward Network: Stock Price Movement Prediction', fontsize=16)\n",
    "fig.tight_layout()\n",
    "fig.subplots_adjust(top=.86)\n",
    "fig.savefig('figures/roc_prc_curves', dpi=300);"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### How to further improve the results\n",
    "\n",
    "This relatively simple architecture yields some promising results. To further improve performance, you can:\n",
    "- First and foremost, add new features and more data to the model\n",
    "- Expand the set of architectures to explore, including deeper or wider layers\n",
    "- Inspect the training progress and train for more epochs if the validation error was still improving after 50 epochs\n",
    "\n",
    "Finally, you can use more sophisticated architectures, such as recurrent neural networks (RNNs) and convolutional neural networks (CNNs), which are well suited to sequential data, whereas vanilla feedforward NNs are not designed to capture the ordered nature of the features.\n"
   ]
  }
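The architecture search suggested above can be sketched with scikit-learn's `GridSearchCV`, as introduced at the start of this section. This is a minimal, hypothetical illustration: it uses synthetic data and scikit-learn's `MLPClassifier` as a stand-in for the Keras model trained in this notebook, and the parameter grid values are arbitrary examples, not tuned recommendations.

```python
# Hypothetical sketch: grid-searching network depth/width and regularization.
# MLPClassifier stands in for the Keras model; data is synthetic.
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 10))  # toy features, e.g. lagged returns
# toy binary labels driven mostly by the first feature, plus noise
y = (X[:, 0] + rng.normal(scale=0.5, size=500) > 0).astype(int)

# Candidate architectures (one vs. two hidden layers, narrower vs. wider)
# and L2 penalties -- purely illustrative values.
param_grid = {
    'hidden_layer_sizes': [(32,), (64,), (32, 32)],
    'alpha': [1e-4, 1e-3],
}

# Note: shuffled StratifiedKFold leaks future information on real time
# series; in practice a time-aware split should be used instead.
cv = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
search = GridSearchCV(
    MLPClassifier(max_iter=500, random_state=0),
    param_grid, cv=cv, scoring='roc_auc', n_jobs=-1,
)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```

Keep in mind the caveat from the introduction: each additional grid point is another experiment, so the more combinations you try, the more you should discount the best cross-validated score.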
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.8"
  },
  "toc": {
   "base_numbering": 1,
   "nav_menu": {},
   "number_sections": true,
   "sideBar": true,
   "skip_h1_title": true,
   "title_cell": "Table of Contents",
   "title_sidebar": "Contents",
   "toc_cell": false,
   "toc_position": {
    "height": "calc(100% - 180px)",
    "left": "10px",
    "top": "150px",
    "width": "282.222px"
   },
   "toc_section_display": true,
   "toc_window_display": true
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
