{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Variational autoencoders for collaborative filtering "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This notebook accompanies the paper \"*Variational autoencoders for collaborative filtering*\" by Dawen Liang, Rahul G. Krishnan, Matthew D. Hoffman, and Tony Jebara, in The Web Conference (aka WWW) 2018.\n",
    "\n",
    "In this notebook, we will show a complete, self-contained example of training a variational autoencoder (as well as a denoising autoencoder) with multinomial likelihood (described in the paper) on the public MovieLens-20M dataset, including both data preprocessing and model training."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "import os\n",
    "import shutil\n",
    "import sys\n",
    "\n",
    "import numpy as np\n",
    "from scipy import sparse\n",
    "\n",
    "import matplotlib.pyplot as plt\n",
    "%matplotlib inline\n",
    "\n",
    "import seaborn as sn\n",
    "sn.set()\n",
    "\n",
    "import pandas as pd\n",
    "\n",
    "import tensorflow as tf\n",
    "# NOTE: this notebook uses the TF 1.x API; tf.contrib was removed in TensorFlow 2.x\n",
    "from tensorflow.contrib.layers import apply_regularization, l2_regularizer\n",
    "\n",
    "import bottleneck as bn"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Data preprocessing"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We load the data and create train/validation/test splits following strong generalization: \n",
    "\n",
    "- We split all users into training/validation/test sets. \n",
    "\n",
    "- We train models using the entire click history of the training users. \n",
    "\n",
    "- To evaluate, we take part of the click history from held-out (validation and test) users to learn the necessary user-level representations for the model and then compute metrics by looking at how well the model ranks the rest of the unseen click history from the held-out users."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "First, download the dataset at http://files.grouplens.org/datasets/movielens/ml-20m.zip"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "### change `DATA_DIR` to the location where the MovieLens-20M dataset sits\n",
    "DATA_DIR = '/home/ubuntu/data/ml-20m/'"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "raw_data = pd.read_csv(os.path.join(DATA_DIR, 'ratings.csv'), header=0)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# binarize the data (only keep ratings >= 4)\n",
    "raw_data = raw_data[raw_data['rating'] > 3.5]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style>\n",
       "    .dataframe thead tr:only-child th {\n",
       "        text-align: right;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: left;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>userId</th>\n",
       "      <th>movieId</th>\n",
       "      <th>rating</th>\n",
       "      <th>timestamp</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>6</th>\n",
       "      <td>1</td>\n",
       "      <td>151</td>\n",
       "      <td>4.0</td>\n",
       "      <td>1094785734</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>7</th>\n",
       "      <td>1</td>\n",
       "      <td>223</td>\n",
       "      <td>4.0</td>\n",
       "      <td>1112485573</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>8</th>\n",
       "      <td>1</td>\n",
       "      <td>253</td>\n",
       "      <td>4.0</td>\n",
       "      <td>1112484940</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>9</th>\n",
       "      <td>1</td>\n",
       "      <td>260</td>\n",
       "      <td>4.0</td>\n",
       "      <td>1112484826</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>10</th>\n",
       "      <td>1</td>\n",
       "      <td>293</td>\n",
       "      <td>4.0</td>\n",
       "      <td>1112484703</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "    userId  movieId  rating   timestamp\n",
       "6        1      151     4.0  1094785734\n",
       "7        1      223     4.0  1112485573\n",
       "8        1      253     4.0  1112484940\n",
       "9        1      260     4.0  1112484826\n",
       "10       1      293     4.0  1112484703"
      ]
     },
     "execution_count": 5,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "raw_data.head()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Data splitting procedure"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "- Select 10K users as held-out (test) users, 10K users as validation users, and use the rest of the users for training\n",
    "- Use all the items from the training users as the item set\n",
    "- For each held-out (validation and test) user, subsample 80% of the click history as fold-in data and hold out the remaining 20% for prediction"
   ]
  },
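  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As an illustrative sketch (not part of the preprocessing pipeline), the per-user 80/20 fold-in subsampling in the last bullet can be done with a boolean mask over a toy user's interactions:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# toy sketch of the 80/20 fold-in split for a single held-out user\n",
    "rng = np.random.RandomState(98765)\n",
    "n_items_u = 10  # pretend this user has 10 interactions\n",
    "is_te = np.zeros(n_items_u, dtype=bool)\n",
    "is_te[rng.choice(n_items_u, size=int(0.2 * n_items_u), replace=False)] = True\n",
    "fold_in, prediction = np.where(~is_te)[0], np.where(is_te)[0]\n",
    "assert len(fold_in) == 8 and len(prediction) == 2"
   ]
  },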
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "def get_count(tp, id_col):\n",
    "    # group-by size as a Series indexed by the id column (avoids shadowing\n",
    "    # the built-in `id`); callers rely on `count.index[count >= threshold]`\n",
    "    return tp.groupby(id_col).size()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "def filter_triplets(tp, min_uc=5, min_sc=0):\n",
    "    # Only keep the triplets for items which were clicked on by at least min_sc users. \n",
    "    if min_sc > 0:\n",
    "        itemcount = get_count(tp, 'movieId')\n",
    "        tp = tp[tp['movieId'].isin(itemcount.index[itemcount >= min_sc])]\n",
    "    \n",
    "    # Only keep the triplets for users who clicked on at least min_uc items\n",
    "    # After this, some items may have fewer than min_sc users, but that should only be a small proportion\n",
    "    if min_uc > 0:\n",
    "        usercount = get_count(tp, 'userId')\n",
    "        tp = tp[tp['userId'].isin(usercount.index[usercount >= min_uc])]\n",
    "    \n",
    "    # Update both usercount and itemcount after filtering\n",
    "    usercount, itemcount = get_count(tp, 'userId'), get_count(tp, 'movieId') \n",
    "    return tp, usercount, itemcount"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Apply the default filtering: only keep users who clicked on at least 5 items (`min_uc=5`); with `min_sc=0`, no item-level filter is applied"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "raw_data, user_activity, item_popularity = filter_triplets(raw_data)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "After filtering, there are 9990682 watching events from 136677 users and 20720 movies (sparsity: 0.353%)\n"
     ]
    }
   ],
   "source": [
    "sparsity = 1. * raw_data.shape[0] / (user_activity.shape[0] * item_popularity.shape[0])\n",
    "\n",
    "print(\"After filtering, there are %d watching events from %d users and %d movies (sparsity: %.3f%%)\" % \n",
    "      (raw_data.shape[0], user_activity.shape[0], item_popularity.shape[0], sparsity * 100))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "unique_uid = user_activity.index\n",
    "\n",
    "np.random.seed(98765)\n",
    "idx_perm = np.random.permutation(unique_uid.size)\n",
    "unique_uid = unique_uid[idx_perm]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# create train/validation/test users\n",
    "n_users = unique_uid.size\n",
    "n_heldout_users = 10000\n",
    "\n",
    "tr_users = unique_uid[:(n_users - n_heldout_users * 2)]\n",
    "vd_users = unique_uid[(n_users - n_heldout_users * 2): (n_users - n_heldout_users)]\n",
    "te_users = unique_uid[(n_users - n_heldout_users):]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "train_plays = raw_data.loc[raw_data['userId'].isin(tr_users)]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "unique_sid = pd.unique(train_plays['movieId'])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "show2id = dict((sid, i) for (i, sid) in enumerate(unique_sid))\n",
    "profile2id = dict((pid, i) for (i, pid) in enumerate(unique_uid))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "pro_dir = os.path.join(DATA_DIR, 'pro_sg')\n",
    "\n",
    "if not os.path.exists(pro_dir):\n",
    "    os.makedirs(pro_dir)\n",
    "\n",
    "with open(os.path.join(pro_dir, 'unique_sid.txt'), 'w') as f:\n",
    "    for sid in unique_sid:\n",
    "        f.write('%s\\n' % sid)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "def split_train_test_proportion(data, test_prop=0.2):\n",
    "    data_grouped_by_user = data.groupby('userId')\n",
    "    tr_list, te_list = list(), list()\n",
    "\n",
    "    np.random.seed(98765)\n",
    "\n",
    "    for i, (_, group) in enumerate(data_grouped_by_user):\n",
    "        n_items_u = len(group)\n",
    "\n",
    "        if n_items_u >= 5:\n",
    "            idx = np.zeros(n_items_u, dtype='bool')\n",
    "            idx[np.random.choice(n_items_u, size=int(test_prop * n_items_u), replace=False).astype('int64')] = True\n",
    "\n",
    "            tr_list.append(group[np.logical_not(idx)])\n",
    "            te_list.append(group[idx])\n",
    "        else:\n",
    "            tr_list.append(group)\n",
    "\n",
    "        if i % 1000 == 0:\n",
    "            print(\"%d users sampled\" % i)\n",
    "            sys.stdout.flush()\n",
    "\n",
    "    data_tr = pd.concat(tr_list)\n",
    "    data_te = pd.concat(te_list)\n",
    "    \n",
    "    return data_tr, data_te"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "vad_plays = raw_data.loc[raw_data['userId'].isin(vd_users)]\n",
    "vad_plays = vad_plays.loc[vad_plays['movieId'].isin(unique_sid)]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0 users sampled\n",
      "1000 users sampled\n",
      "2000 users sampled\n",
      "3000 users sampled\n",
      "4000 users sampled\n",
      "5000 users sampled\n",
      "6000 users sampled\n",
      "7000 users sampled\n",
      "8000 users sampled\n",
      "9000 users sampled\n"
     ]
    }
   ],
   "source": [
    "vad_plays_tr, vad_plays_te = split_train_test_proportion(vad_plays)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "test_plays = raw_data.loc[raw_data['userId'].isin(te_users)]\n",
    "test_plays = test_plays.loc[test_plays['movieId'].isin(unique_sid)]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0 users sampled\n",
      "1000 users sampled\n",
      "2000 users sampled\n",
      "3000 users sampled\n",
      "4000 users sampled\n",
      "5000 users sampled\n",
      "6000 users sampled\n",
      "7000 users sampled\n",
      "8000 users sampled\n",
      "9000 users sampled\n"
     ]
    }
   ],
   "source": [
    "test_plays_tr, test_plays_te = split_train_test_proportion(test_plays)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Save the data into (user_index, item_index) format"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "def numerize(tp):\n",
    "    # use list comprehensions so this works under Python 3, where `map` returns an iterator\n",
    "    uid = [profile2id[x] for x in tp['userId']]\n",
    "    sid = [show2id[x] for x in tp['movieId']]\n",
    "    return pd.DataFrame(data={'uid': uid, 'sid': sid}, columns=['uid', 'sid'])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "train_data = numerize(train_plays)\n",
    "train_data.to_csv(os.path.join(pro_dir, 'train.csv'), index=False)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "vad_data_tr = numerize(vad_plays_tr)\n",
    "vad_data_tr.to_csv(os.path.join(pro_dir, 'validation_tr.csv'), index=False)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "vad_data_te = numerize(vad_plays_te)\n",
    "vad_data_te.to_csv(os.path.join(pro_dir, 'validation_te.csv'), index=False)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "test_data_tr = numerize(test_plays_tr)\n",
    "test_data_tr.to_csv(os.path.join(pro_dir, 'test_tr.csv'), index=False)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "test_data_te = numerize(test_plays_te)\n",
    "test_data_te.to_csv(os.path.join(pro_dir, 'test_te.csv'), index=False)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Model definition and training"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We define two related models: denoising autoencoder with multinomial likelihood (Multi-DAE in the paper) and partially-regularized variational autoencoder with multinomial likelihood (Multi-VAE^{PR} in the paper)."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Model definition"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "__Notation__: We use $u \\in \\{1,\\dots,U\\}$ to index users and $i \\in \\{1,\\dots,I\\}$ to index items. In this work, we consider learning with implicit feedback. The user-by-item interaction matrix is the click matrix $\\mathbf{X} \\in \\mathbb{N}^{U\\times I}$. The lower-case $\\mathbf{x}_u =[X_{u1},\\dots,X_{uI}]^\\top \\in \\mathbb{N}^I$ is a bag-of-words vector with the number of clicks on each item from user $u$. We binarize the click matrix; it is straightforward to extend the model to general count data."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "__Generative process__: For each user $u$, the model starts by sampling a $K$-dimensional latent representation $\\mathbf{z}_u$ from a standard Gaussian prior. The latent representation $\\mathbf{z}_u$ is transformed via a non-linear function $f_\\theta (\\cdot) \\in \\mathbb{R}^I$ to produce a probability distribution over $I$ items $\\pi (\\mathbf{z}_u)$ from which the click history $\\mathbf{x}_u$ is assumed to have been drawn:\n",
    "\n",
    "$$\n",
    "\\mathbf{z}_u \\sim \\mathcal{N}(0, \\mathbf{I}_K),  \\pi(\\mathbf{z}_u) \\propto \\exp\\{f_\\theta (\\mathbf{z}_u)\\},\\\\\n",
    "\\mathbf{x}_u \\sim \\mathrm{Mult}(N_u, \\pi(\\mathbf{z}_u))\n",
    "$$"
   ]
  },
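  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As an illustrative sketch (not the model code), we can simulate this generative process in NumPy, using a random linear map as a stand-in for $f_\\theta$:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# toy simulation of the generative process; W is a random stand-in for f_theta\n",
    "rng = np.random.RandomState(0)\n",
    "K, I, N_u = 2, 5, 10  # latent dim, number of items, number of clicks\n",
    "z_u = rng.normal(size=K)                    # z_u ~ N(0, I_K)\n",
    "W = rng.normal(size=(K, I))\n",
    "logits = z_u.dot(W)                         # f_theta(z_u)\n",
    "pi = np.exp(logits) / np.exp(logits).sum()  # softmax over items\n",
    "x_u = rng.multinomial(N_u, pi)              # x_u ~ Mult(N_u, pi(z_u))\n",
    "assert x_u.sum() == N_u"
   ]
  },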
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The objective for Multi-DAE for a single user $u$ is:\n",
    "$$\n",
    "\\mathcal{L}_u(\\theta, \\phi) = \\log p_\\theta(\\mathbf{x}_u | g_\\phi(\\mathbf{x}_u))\n",
    "$$\n",
    "where $g_\\phi(\\cdot)$ is the non-linear \"encoder\" function."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "class MultiDAE(object):\n",
    "    def __init__(self, p_dims, q_dims=None, lam=0.01, lr=1e-3, random_seed=None):\n",
    "        self.p_dims = p_dims\n",
    "        if q_dims is None:\n",
    "            self.q_dims = p_dims[::-1]\n",
    "        else:\n",
    "            assert q_dims[0] == p_dims[-1], \"Input and output dimension must equal each other for autoencoders.\"\n",
    "            assert q_dims[-1] == p_dims[0], \"Latent dimension for p- and q-network mismatches.\"\n",
    "            self.q_dims = q_dims\n",
    "        self.dims = self.q_dims + self.p_dims[1:]\n",
    "        \n",
    "        self.lam = lam\n",
    "        self.lr = lr\n",
    "        self.random_seed = random_seed\n",
    "\n",
    "        self.construct_placeholders()\n",
    "\n",
    "    def construct_placeholders(self):        \n",
    "        self.input_ph = tf.placeholder(\n",
    "            dtype=tf.float32, shape=[None, self.dims[0]])\n",
    "        self.keep_prob_ph = tf.placeholder_with_default(1.0, shape=None)\n",
    "\n",
    "    def build_graph(self):\n",
    "\n",
    "        self.construct_weights()\n",
    "\n",
    "        saver, logits = self.forward_pass()\n",
    "        log_softmax_var = tf.nn.log_softmax(logits)\n",
    "\n",
    "        # per-user average negative log-likelihood\n",
    "        neg_ll = -tf.reduce_mean(tf.reduce_sum(\n",
    "            log_softmax_var * self.input_ph, axis=1))\n",
    "        # apply regularization to weights\n",
    "        reg = l2_regularizer(self.lam)\n",
    "        reg_var = apply_regularization(reg, self.weights)\n",
    "        # TensorFlow's l2_regularizer multiplies the squared l2 norm by 0.5,\n",
    "        # so multiply by 2 to put the penalty back on the original scale\n",
    "        loss = neg_ll + 2 * reg_var\n",
    "        \n",
    "        train_op = tf.train.AdamOptimizer(self.lr).minimize(loss)\n",
    "\n",
    "        # add summary statistics\n",
    "        tf.summary.scalar('negative_multi_ll', neg_ll)\n",
    "        tf.summary.scalar('loss', loss)\n",
    "        merged = tf.summary.merge_all()\n",
    "        return saver, logits, loss, train_op, merged\n",
    "\n",
    "    def forward_pass(self):\n",
    "        # construct forward graph        \n",
    "        h = tf.nn.l2_normalize(self.input_ph, 1)\n",
    "        h = tf.nn.dropout(h, self.keep_prob_ph)\n",
    "        \n",
    "        for i, (w, b) in enumerate(zip(self.weights, self.biases)):\n",
    "            h = tf.matmul(h, w) + b\n",
    "            \n",
    "            if i != len(self.weights) - 1:\n",
    "                h = tf.nn.tanh(h)\n",
    "        return tf.train.Saver(), h\n",
    "\n",
    "    def construct_weights(self):\n",
    "\n",
    "        self.weights = []\n",
    "        self.biases = []\n",
    "        \n",
    "        # define weights\n",
    "        for i, (d_in, d_out) in enumerate(zip(self.dims[:-1], self.dims[1:])):\n",
    "            weight_key = \"weight_{}to{}\".format(i, i+1)\n",
    "            bias_key = \"bias_{}\".format(i+1)\n",
    "            \n",
    "            self.weights.append(tf.get_variable(\n",
    "                name=weight_key, shape=[d_in, d_out],\n",
    "                initializer=tf.contrib.layers.xavier_initializer(\n",
    "                    seed=self.random_seed)))\n",
    "            \n",
    "            self.biases.append(tf.get_variable(\n",
    "                name=bias_key, shape=[d_out],\n",
    "                initializer=tf.truncated_normal_initializer(\n",
    "                    stddev=0.001, seed=self.random_seed)))\n",
    "            \n",
    "            # add summary stats\n",
    "            tf.summary.histogram(weight_key, self.weights[-1])\n",
    "            tf.summary.histogram(bias_key, self.biases[-1])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The objective of Multi-VAE^{PR} (evidence lower-bound, or ELBO) for a single user $u$ is:\n",
    "$$\n",
    "\\mathcal{L}_u(\\theta, \\phi) = \\mathbb{E}_{q_\\phi(z_u | x_u)}[\\log p_\\theta(x_u | z_u)] - \\beta \\cdot KL(q_\\phi(z_u | x_u) \\| p(z_u))\n",
    "$$\n",
    "where $q_\\phi$ is the approximating variational distribution (inference model) and $\\beta$ is an additional annealing parameter that we control. The objective for the entire dataset is the average over all users. Thanks to the reparameterization trick, it can be trained in almost the same way as Multi-DAE. "
   ]
  },
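  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick numerical sanity check (a sketch, not part of the model), the closed-form Gaussian KL term used in the implementation, $\\frac{1}{2}\\sum_k \\left(-\\log\\sigma_k^2 + \\sigma_k^2 + \\mu_k^2 - 1\\right)$, agrees with a Monte-Carlo estimate of $KL(q \\| p)$ for a 1-D example:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# closed-form KL( N(mu, sigma^2) || N(0, 1) ) vs. a Monte-Carlo estimate\n",
    "mu, logvar = 0.5, np.log(0.8)\n",
    "kl_closed = 0.5 * (-logvar + np.exp(logvar) + mu**2 - 1)\n",
    "\n",
    "rng = np.random.RandomState(0)\n",
    "z = mu + np.exp(0.5 * logvar) * rng.normal(size=100000)  # z ~ q\n",
    "log_q = -0.5 * (np.log(2 * np.pi) + logvar + (z - mu)**2 / np.exp(logvar))\n",
    "log_p = -0.5 * (np.log(2 * np.pi) + z**2)\n",
    "kl_mc = (log_q - log_p).mean()\n",
    "assert abs(kl_closed - kl_mc) < 0.01"
   ]
  },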
  {
   "cell_type": "code",
   "execution_count": 28,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "class MultiVAE(MultiDAE):\n",
    "\n",
    "    def construct_placeholders(self):\n",
    "        super(MultiVAE, self).construct_placeholders()\n",
    "\n",
    "        # placeholders with default values when scoring\n",
    "        self.is_training_ph = tf.placeholder_with_default(0., shape=None)\n",
    "        self.anneal_ph = tf.placeholder_with_default(1., shape=None)\n",
    "        \n",
    "    def build_graph(self):\n",
    "        self._construct_weights()\n",
    "\n",
    "        saver, logits, KL = self.forward_pass()\n",
    "        log_softmax_var = tf.nn.log_softmax(logits)\n",
    "\n",
    "        neg_ll = -tf.reduce_mean(tf.reduce_sum(\n",
    "            log_softmax_var * self.input_ph,\n",
    "            axis=-1))\n",
    "        # apply regularization to weights\n",
    "        reg = l2_regularizer(self.lam)\n",
    "        \n",
    "        reg_var = apply_regularization(reg, self.weights_q + self.weights_p)\n",
    "        # TensorFlow's l2_regularizer multiplies the squared l2 norm by 0.5,\n",
    "        # so multiply by 2 to put the penalty back on the original scale\n",
    "        neg_ELBO = neg_ll + self.anneal_ph * KL + 2 * reg_var\n",
    "        \n",
    "        train_op = tf.train.AdamOptimizer(self.lr).minimize(neg_ELBO)\n",
    "\n",
    "        # add summary statistics\n",
    "        tf.summary.scalar('negative_multi_ll', neg_ll)\n",
    "        tf.summary.scalar('KL', KL)\n",
    "        tf.summary.scalar('neg_ELBO_train', neg_ELBO)\n",
    "        merged = tf.summary.merge_all()\n",
    "\n",
    "        return saver, logits, neg_ELBO, train_op, merged\n",
    "    \n",
    "    def q_graph(self):\n",
    "        mu_q, std_q, KL = None, None, None\n",
    "        \n",
    "        h = tf.nn.l2_normalize(self.input_ph, 1)\n",
    "        h = tf.nn.dropout(h, self.keep_prob_ph)\n",
    "        \n",
    "        for i, (w, b) in enumerate(zip(self.weights_q, self.biases_q)):\n",
    "            h = tf.matmul(h, w) + b\n",
    "            \n",
    "            if i != len(self.weights_q) - 1:\n",
    "                h = tf.nn.tanh(h)\n",
    "            else:\n",
    "                mu_q = h[:, :self.q_dims[-1]]\n",
    "                logvar_q = h[:, self.q_dims[-1]:]\n",
    "\n",
    "                std_q = tf.exp(0.5 * logvar_q)\n",
    "                KL = tf.reduce_mean(tf.reduce_sum(\n",
    "                        0.5 * (-logvar_q + tf.exp(logvar_q) + mu_q**2 - 1), axis=1))\n",
    "        return mu_q, std_q, KL\n",
    "\n",
    "    def p_graph(self, z):\n",
    "        h = z\n",
    "        \n",
    "        for i, (w, b) in enumerate(zip(self.weights_p, self.biases_p)):\n",
    "            h = tf.matmul(h, w) + b\n",
    "            \n",
    "            if i != len(self.weights_p) - 1:\n",
    "                h = tf.nn.tanh(h)\n",
    "        return h\n",
    "\n",
    "    def forward_pass(self):\n",
    "        # q-network\n",
    "        mu_q, std_q, KL = self.q_graph()\n",
    "        epsilon = tf.random_normal(tf.shape(std_q))\n",
    "\n",
    "        sampled_z = mu_q + self.is_training_ph *\\\n",
    "            epsilon * std_q\n",
    "\n",
    "        # p-network\n",
    "        logits = self.p_graph(sampled_z)\n",
    "        \n",
    "        return tf.train.Saver(), logits, KL\n",
    "\n",
    "    def _construct_weights(self):\n",
    "        self.weights_q, self.biases_q = [], []\n",
    "        \n",
    "        for i, (d_in, d_out) in enumerate(zip(self.q_dims[:-1], self.q_dims[1:])):\n",
    "            if i == len(self.q_dims[:-1]) - 1:\n",
    "                # we need two sets of parameters for mean and variance,\n",
    "                # respectively\n",
    "                d_out *= 2\n",
    "            weight_key = \"weight_q_{}to{}\".format(i, i+1)\n",
    "            bias_key = \"bias_q_{}\".format(i+1)\n",
    "            \n",
    "            self.weights_q.append(tf.get_variable(\n",
    "                name=weight_key, shape=[d_in, d_out],\n",
    "                initializer=tf.contrib.layers.xavier_initializer(\n",
    "                    seed=self.random_seed)))\n",
    "            \n",
    "            self.biases_q.append(tf.get_variable(\n",
    "                name=bias_key, shape=[d_out],\n",
    "                initializer=tf.truncated_normal_initializer(\n",
    "                    stddev=0.001, seed=self.random_seed)))\n",
    "            \n",
    "            # add summary stats\n",
    "            tf.summary.histogram(weight_key, self.weights_q[-1])\n",
    "            tf.summary.histogram(bias_key, self.biases_q[-1])\n",
    "            \n",
    "        self.weights_p, self.biases_p = [], []\n",
    "\n",
    "        for i, (d_in, d_out) in enumerate(zip(self.p_dims[:-1], self.p_dims[1:])):\n",
    "            weight_key = \"weight_p_{}to{}\".format(i, i+1)\n",
    "            bias_key = \"bias_p_{}\".format(i+1)\n",
    "            self.weights_p.append(tf.get_variable(\n",
    "                name=weight_key, shape=[d_in, d_out],\n",
    "                initializer=tf.contrib.layers.xavier_initializer(\n",
    "                    seed=self.random_seed)))\n",
    "            \n",
    "            self.biases_p.append(tf.get_variable(\n",
    "                name=bias_key, shape=[d_out],\n",
    "                initializer=tf.truncated_normal_initializer(\n",
    "                    stddev=0.001, seed=self.random_seed)))\n",
    "            \n",
    "            # add summary stats\n",
    "            tf.summary.histogram(weight_key, self.weights_p[-1])\n",
    "            tf.summary.histogram(bias_key, self.biases_p[-1])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Training/validation data, hyperparameters"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Load the pre-processed training and validation data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "unique_sid = list()\n",
    "with open(os.path.join(pro_dir, 'unique_sid.txt'), 'r') as f:\n",
    "    for line in f:\n",
    "        unique_sid.append(line.strip())\n",
    "\n",
    "n_items = len(unique_sid)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 30,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "def load_train_data(csv_file):\n",
    "    tp = pd.read_csv(csv_file)\n",
    "    n_users = tp['uid'].max() + 1\n",
    "\n",
    "    rows, cols = tp['uid'], tp['sid']\n",
    "    data = sparse.csr_matrix((np.ones_like(rows),\n",
    "                             (rows, cols)), dtype='float64',\n",
    "                             shape=(n_users, n_items))\n",
    "    return data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "train_data = load_train_data(os.path.join(pro_dir, 'train.csv'))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 32,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "def load_tr_te_data(csv_file_tr, csv_file_te):\n",
    "    tp_tr = pd.read_csv(csv_file_tr)\n",
    "    tp_te = pd.read_csv(csv_file_te)\n",
    "\n",
    "    start_idx = min(tp_tr['uid'].min(), tp_te['uid'].min())\n",
    "    end_idx = max(tp_tr['uid'].max(), tp_te['uid'].max())\n",
    "\n",
    "    rows_tr, cols_tr = tp_tr['uid'] - start_idx, tp_tr['sid']\n",
    "    rows_te, cols_te = tp_te['uid'] - start_idx, tp_te['sid']\n",
    "\n",
    "    data_tr = sparse.csr_matrix((np.ones_like(rows_tr),\n",
    "                             (rows_tr, cols_tr)), dtype='float64', shape=(end_idx - start_idx + 1, n_items))\n",
    "    data_te = sparse.csr_matrix((np.ones_like(rows_te),\n",
    "                             (rows_te, cols_te)), dtype='float64', shape=(end_idx - start_idx + 1, n_items))\n",
    "    return data_tr, data_te"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 33,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "vad_data_tr, vad_data_te = load_tr_te_data(os.path.join(pro_dir, 'validation_tr.csv'),\n",
    "                                           os.path.join(pro_dir, 'validation_te.csv'))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Set up training hyperparameters"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 34,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "N = train_data.shape[0]\n",
    "idxlist = list(range(N))  # a list, so it can be shuffled in place under Python 3\n",
    "\n",
    "# training batch size\n",
    "batch_size = 500\n",
    "batches_per_epoch = int(np.ceil(float(N) / batch_size))\n",
    "\n",
    "N_vad = vad_data_tr.shape[0]\n",
    "idxlist_vad = list(range(N_vad))\n",
    "\n",
    "# validation batch size (since the entire validation set might not fit into GPU memory)\n",
    "batch_size_vad = 2000\n",
    "\n",
    "# the total number of gradient updates for annealing\n",
    "total_anneal_steps = 200000\n",
    "# largest annealing parameter\n",
    "anneal_cap = 0.2"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Evaluate function: Normalized discounted cumulative gain (NDCG@k) and Recall@k"
   ]
  },
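  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a tiny worked example of binary NDCG (independent of the batch implementation): suppose the relevant items are $\\{0, 2, 4\\}$ and the model's top-3 ranking is $[2, 1, 0]$."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# worked example of binary NDCG@3\n",
    "rel = np.isin([2, 1, 0], [0, 2, 4]).astype(float)  # relevance of the top-3: [1, 0, 1]\n",
    "disc = 1. / np.log2(np.arange(2, 5))               # rank discounts 1/log2(2), 1/log2(3), 1/log2(4)\n",
    "dcg = (rel * disc).sum()                           # 1.0 + 0.5 = 1.5\n",
    "idcg = disc.sum()                                  # ideal: all 3 relevant items on top\n",
    "ndcg = dcg / idcg\n",
    "assert 0 < ndcg < 1"
   ]
  },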
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "def NDCG_binary_at_k_batch(X_pred, heldout_batch, k=100):\n",
    "    '''\n",
    "    normalized discounted cumulative gain@k for binary relevance\n",
    "    ASSUMPTIONS: all the 0's in heldout_data indicate 0 relevance\n",
    "    '''\n",
    "    batch_users = X_pred.shape[0]\n",
    "    idx_topk_part = bn.argpartition(-X_pred, k, axis=1)\n",
    "    topk_part = X_pred[np.arange(batch_users)[:, np.newaxis],\n",
    "                       idx_topk_part[:, :k]]\n",
    "    idx_part = np.argsort(-topk_part, axis=1)\n",
    "    # X_pred[np.arange(batch_users)[:, np.newaxis], idx_topk] is the sorted\n",
    "    # topk predicted score\n",
    "    idx_topk = idx_topk_part[np.arange(batch_users)[:, np.newaxis], idx_part]\n",
    "    # build the discount template\n",
    "    tp = 1. / np.log2(np.arange(2, k + 2))\n",
    "\n",
    "    DCG = (heldout_batch[np.arange(batch_users)[:, np.newaxis],\n",
    "                         idx_topk].toarray() * tp).sum(axis=1)\n",
    "    IDCG = np.array([(tp[:min(n, k)]).sum()\n",
    "                     for n in heldout_batch.getnnz(axis=1)])\n",
    "    return DCG / IDCG"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 36,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "def Recall_at_k_batch(X_pred, heldout_batch, k=100):\n",
    "    batch_users = X_pred.shape[0]\n",
    "\n",
    "    idx = bn.argpartition(-X_pred, k, axis=1)\n",
    "    X_pred_binary = np.zeros_like(X_pred, dtype=bool)\n",
    "    X_pred_binary[np.arange(batch_users)[:, np.newaxis], idx[:, :k]] = True\n",
    "\n",
    "    X_true_binary = (heldout_batch > 0).toarray()\n",
    "    tmp = (np.logical_and(X_true_binary, X_pred_binary).sum(axis=1)).astype(\n",
    "        np.float32)\n",
    "    recall = tmp / np.minimum(k, X_true_binary.sum(axis=1))\n",
    "    return recall"
   ]
  },
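  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check of the two metrics (a hypothetical toy example, not part of the original pipeline): one user, five items, with items 0 and 1 held out as relevant. The top-2 prediction retrieves item 0 at rank 1 but places the irrelevant item 2 at rank 2, so Recall@2 = 0.5 and NDCG@2 = $1 / (1 + 1/\\log_2 3) \\approx 0.613$."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# toy example: 1 user, 5 items; items 0 and 1 are the held-out relevant items\n",
    "toy_pred = np.array([[0.9, 0.1, 0.8, 0.2, 0.3]])\n",
    "toy_held = sparse.csr_matrix([[1., 1., 0., 0., 0.]])\n",
    "print(NDCG_binary_at_k_batch(toy_pred, toy_held, k=2))\n",
    "print(Recall_at_k_batch(toy_pred, toy_held, k=2))"
   ]
  },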
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Train a Multi-VAE^{PR}"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For the ML-20M dataset, we set both the generative function $f_\\theta(\\cdot)$ and the inference model $g_\\phi(\\cdot)$ to be 3-layer multilayer perceptrons (MLPs) with a symmetrical architecture."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The generative function is a [200 -> 600 -> n_items] MLP, so the inference model is a [n_items -> 600 -> 200] MLP. The overall architecture of the Multi-VAE^{PR} is therefore [n_items -> 600 -> 200 -> 600 -> n_items]."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 37,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "p_dims = [200, 600, n_items]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 38,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "INFO:tensorflow:Scale of 0 disables regularizer.\n"
     ]
    }
   ],
   "source": [
    "tf.reset_default_graph()\n",
    "vae = MultiVAE(p_dims, lam=0.0, random_seed=98765)\n",
    "\n",
    "saver, logits_var, loss_var, train_op_var, merged_var = vae.build_graph()\n",
    "\n",
    "ndcg_var = tf.Variable(0.0)\n",
    "ndcg_dist_var = tf.placeholder(dtype=tf.float64, shape=None)\n",
    "ndcg_summary = tf.summary.scalar('ndcg_at_k_validation', ndcg_var)\n",
    "ndcg_dist_summary = tf.summary.histogram('ndcg_at_k_hist_validation', ndcg_dist_var)\n",
    "merged_valid = tf.summary.merge([ndcg_summary, ndcg_dist_summary])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Set up the logging and checkpoint directories\n",
    "\n",
    "- Change the logging and checkpoint directories below to locations of your choice\n",
    "- Monitor training progress with TensorBoard: `tensorboard --logdir=$log_dir`"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 39,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "arch_str = \"I-%s-I\" % ('-'.join([str(d) for d in vae.dims[1:-1]]))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 40,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "log directory: /volmount/log/ml-20m/VAE_anneal200K_cap2.0E-01/I-600-200-600-I\n"
     ]
    }
   ],
   "source": [
    "log_dir = '/volmount/log/ml-20m/VAE_anneal{}K_cap{:1.1E}/{}'.format(\n",
    "    total_anneal_steps // 1000, anneal_cap, arch_str)\n",
    "\n",
    "if os.path.exists(log_dir):\n",
    "    shutil.rmtree(log_dir)\n",
    "\n",
    "print(\"log directory: %s\" % log_dir)\n",
    "summary_writer = tf.summary.FileWriter(log_dir, graph=tf.get_default_graph())"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 41,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "chkpt directory: /volmount/chkpt/ml-20m/VAE_anneal200K_cap2.0E-01/I-600-200-600-I\n"
     ]
    }
   ],
   "source": [
    "chkpt_dir = '/volmount/chkpt/ml-20m/VAE_anneal{}K_cap{:1.1E}/{}'.format(\n",
    "    total_anneal_steps // 1000, anneal_cap, arch_str)\n",
    "\n",
    "if not os.path.isdir(chkpt_dir):\n",
    "    os.makedirs(chkpt_dir) \n",
    "    \n",
    "print(\"chkpt directory: %s\" % chkpt_dir)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 42,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "n_epochs = 200"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "ndcgs_vad = []\n",
    "\n",
    "with tf.Session() as sess:\n",
    "\n",
    "    init = tf.global_variables_initializer()\n",
    "    sess.run(init)\n",
    "\n",
    "    best_ndcg = -np.inf\n",
    "\n",
    "    update_count = 0.0\n",
    "    \n",
    "    for epoch in range(n_epochs):\n",
    "        np.random.shuffle(idxlist)\n",
    "        # train for one epoch\n",
    "        for bnum, st_idx in enumerate(range(0, N, batch_size)):\n",
    "            end_idx = min(st_idx + batch_size, N)\n",
    "            X = train_data[idxlist[st_idx:end_idx]]\n",
    "            \n",
    "            if sparse.isspmatrix(X):\n",
    "                X = X.toarray()\n",
    "            X = X.astype('float32')           \n",
    "            \n",
    "            if total_anneal_steps > 0:\n",
    "                anneal = min(anneal_cap, 1. * update_count / total_anneal_steps)\n",
    "            else:\n",
    "                anneal = anneal_cap\n",
    "            \n",
    "            feed_dict = {vae.input_ph: X, \n",
    "                         vae.keep_prob_ph: 0.5, \n",
    "                         vae.anneal_ph: anneal,\n",
    "                         vae.is_training_ph: 1}        \n",
    "            sess.run(train_op_var, feed_dict=feed_dict)\n",
    "\n",
    "            if bnum % 100 == 0:\n",
    "                summary_train = sess.run(merged_var, feed_dict=feed_dict)\n",
    "                summary_writer.add_summary(summary_train, \n",
    "                                           global_step=epoch * batches_per_epoch + bnum) \n",
    "            \n",
    "            update_count += 1\n",
    "        \n",
    "        # compute validation NDCG\n",
    "        ndcg_dist = []\n",
    "        for bnum, st_idx in enumerate(range(0, N_vad, batch_size_vad)):\n",
    "            end_idx = min(st_idx + batch_size_vad, N_vad)\n",
    "            X = vad_data_tr[idxlist_vad[st_idx:end_idx]]\n",
    "\n",
    "            if sparse.isspmatrix(X):\n",
    "                X = X.toarray()\n",
    "            X = X.astype('float32')\n",
    "        \n",
    "            pred_val = sess.run(logits_var, feed_dict={vae.input_ph: X})\n",
    "            # exclude items already seen in the fold-in portion of the held-out users\n",
    "            pred_val[X.nonzero()] = -np.inf\n",
    "            ndcg_dist.append(NDCG_binary_at_k_batch(pred_val, vad_data_te[idxlist_vad[st_idx:end_idx]]))\n",
    "        \n",
    "        ndcg_dist = np.concatenate(ndcg_dist)\n",
    "        ndcg_ = ndcg_dist.mean()\n",
    "        ndcgs_vad.append(ndcg_)\n",
    "        merged_valid_val = sess.run(merged_valid, feed_dict={ndcg_var: ndcg_, ndcg_dist_var: ndcg_dist})\n",
    "        summary_writer.add_summary(merged_valid_val, epoch)\n",
    "\n",
    "        # update the best model (if necessary)\n",
    "        if ndcg_ > best_ndcg:\n",
    "            saver.save(sess, '{}/model'.format(chkpt_dir))\n",
    "            best_ndcg = ndcg_"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 44,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAtoAAADQCAYAAAA56sZ8AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzs3Xl8VPW9//HXrEkm+zIJhCVACCABRBBlqSiyqNW6XbUo\nmh8tVfur4tJaQH5V7G2LgrW11rZKW61VvMbLpb1YbbXV1rogyA5RdoRAQjJJJsskk8x2fn8ERiNh\nyChhsryfj4cPmTNzznzmk5Mzn3zP53yPyTAMAxEREREROa3MsQ5ARERERKQnUqEtIiIiItIJVGiL\niIiIiHQCFdoiIiIiIp1AhbaIiIiISCdQoS0iIiIi0gmssQ6gM7hcDTF77/R0B253U8zev7tRvqKn\nnEVH+YqechYd5St6yll0lK/oncmcOZ3JJ31OI9qnmdVqiXUI3YryFT3lLDrKV/SUs+goX9FTzqKj\nfEWvq+RMhbaIiIiISCdQoS0iIiIi0glUaIuIiIiIdAIV2iIiIiIinaBHzjoiIiIiXcOuQ27e3lpG\nnwwHXxndl4yU+FiH1KOFDIOa+mYykuMxm02d8h7elgAACXEqI09FGRIRkS7HMAxMps4pEk7G4/VT\ncqAGi9nE6PxM4mzRzVrwwUdHeW/7US6ZMIBRQzIBaPYFeOX9T/jw40r6ZSUyIi+dgdlJHKzwsPOQ\nm/LqRr4yui+XTczDajGH19m8p4rURDv5uanE2aOLo7HZj9lkiroIChkGpRUeKmu9ZKclkJORQLz9\ni5cJhyoa+J+397N9f3V42f++c4DCwRlMHT+Afunx9MlwnPBzLqtq5F+bj2AAZ+WlM2JgGo54W7vv\n0dDkY8/hOnaX1lLX6CM5wUZyop3czETOGZaF+TTsQz5/kE27XWzeU0WSw0ZuZiL9shIZ2j81/DP7\nPH8gyKbdVSTEWRg5KOOkrzuuqdnPzkO1HDzagKvWS2Wtl4YmH6GQQSBkYDaZMJnAajaTnGhj4sg+\nTCrMOSEvTc1+nl7zEdv3V5PssHFOgZOz8zOxWc20+EP4A0EMwGQCi9nMkL4pZKa2/uETDIXYtLuK\ntTuOkpUaz1fG9GVgTttp6464PPx9Qynv76ggEAzhTItnYHYyeX2SKeifyuC+KdhtFhqb/VTVNuNt\nCWC1mrFZzJjNJgLBEP5AiKbmAK46L1W1zYQMg6+M7kten9b3MgyD3aW1HChvYHR+Jv2yEsPxbdlT\nza5SNxNH9mFIbko4rpBhcMTVSL+sxE774+KLMhmGYcQ6iNMtlvNoO53JMX3/7kb5ip5yFh3lK3rR\n5MwwDAyDDn+5+QNBKtxe0pLiSIy3nlBktfiDrPrnPt7ZVkZKop2cDAf9shKZMb4/WWkJJ42hrLoJ\nnz8YLkiq65s5WtOEy+0lO93BuSOcZKW2ru9tCbC/rJ7q+mY8Xj+eJj97y+rYd6SO49+I8XYL5w7P\n5uyhmWSmxpORHE9TS4Dt+6rZtr8awzCYfXEB/bOTcDqTeeVfe/jtKx9x/Au1cHAG44Y5+cv7n+Bu\naMFuM+Pzh06I3WY14w+EGJidxJxZw9hdWsvr60vxeP0AWMwmBuYkM7hvMv2dSfRzJtLUHKCsqpGy\n6kYsZhMZKfGkJ8dR6fZScqCGg0cbsFhMjByUwYQR2YwtyCKxnULVMAwq3V72Hqlj5yE32/fXUN/o\na/OavpkOvnHZWQztnxpe9snRev61uQyz2US83UJivJXczEQG5CSRmmhn4y4X/95axs5DtQCMGJjG\nlVMGU1nr5Z1tZew7Uh/eVmqSnUE5yfTNSsSZlsC2vVVs3VfdJgaTCbJS40l22ElOsGEAdR4fdY0t\n1HraxvtZeTnJ3DijgGED0vC2BNh5yM3eI3WUVzVRVt1IdV1rkQdgNplIT47DmZZARkocdqsFi9mE\n1xdg0+6q8AjuZ6Ul2Zk1YSAXjs0lIc
6KYRjUN/l5e8sR3tx4mIam1p9hYryVc0dkk5kSj7uhBXdD\nC4FgCLvNQpzNTKXby/7yej5bjVnMJpIdNqzHClSrxYzPHyQYMqjz+AgZBnarmfHDszmnIIvCwRnU\nelp44n+2U1HTRF5OMu6GZuqPxRBJf2cSwweksXVfFVV1zSfk0JmeQHNLgAavn4NHW48L2WkJZKbG\nU1rpCe+rx+O22yzt5utUhg9IY0ReOh98VEFFzafzX+fnpjBsQBrrP66gur4FABNwwdl9ufqCIZQc\nqOGv6w5RVtXI3MtGMPXsXODMHvsjzaOtQvs005d6dJSv6Cln0ekN+fK2BPjkaAMmwGIxYTGbsVpM\nWMwmTCYTjc2txWRTS4CcdAd5fZKwRZhjNi09kTfe38/7O44Sb7cwJj+T0UMySXbYw68JhQzWlhzl\nz+/sx9McYNSgDMbkZ5KcaGffkU+L1imj+3LeWdmYzSbe2VbOmvcOUHesOLJbzTjTExg9OJOxBVmY\nTPDMqx9T4faSnhxHKGRQd6zws9vMXP2VIcyc0B+L2UwwFOKIq5EPd1ay/uMKXLXN7X6WzxrcN5lQ\nCA5VNvD5bz6TCfJzUxmTn0mLP8gHJUfDX+onY7WY+frFQ+nfN4VHn99InN1C0SXDeXdbGSWfuI+9\nxsSl5+dx+aQ8Gr1+dh5yc9jVyIDs1uIm3m7hpTf38u728vB2E+KsXDyuH8GQwZ7SWj452kAw1LGv\naovZxNB+qTS1BCit9ISXjRyUwbnDnWSmxrOvrJ59R+rYX1bfpkhKcdgYPSST/tlJuGq9lFc3setQ\nLSYT3DxrGBecncvr6w6x+t/7TxqPyUQ4tyMGpvHViXkUDs5o8wdVRU0TpTVeNpSUs6u0Nrw/HDe0\nXyqXnDeAZIedjz6pYedB97ERXn/4fe1WM6lJdrLTEigYkMaw/mlkpyfQ0OSnvsnH2pKjfFBSAUC/\nrETKq5vCRTVAUoINZ1oCFosJExAIhqipbwnvb5+VlmRnyui+TCzsQyAQoqy6kf1H6nl3RzktviAJ\ncRYS423UN/rwBULhn+FFY3PxB0N8+HFlu9s9zmwyMSQ3hZGD0inon0ZOegLpKXFYzJ+Ogn/2OFbn\naeHd7eW8s7Wcylpv+GdsMZvwBUJcev5ArrswHyD8R5TZZCLOZsFmNbdWqQY0+4J8dLA1v4Fga+E+\nZXRfLh7fn8qaJt7ZVs62fdVt/hgZ2j+VWRMGMHZoFmazCcMwcDe0sL+snj2H69h7pJYWf4is1Hic\nqQkkJljxHxvFNkJgtbb+0RBvt+BMS8CZlkB9o49/bDxMyYEaoPX3asIIJ8MHprNhVyUl+2swaD0G\nTB7Vl5F56ax57wCHXY1tcjixMIfZ0wtISrCdkLPOpkL7DOoNX+qnk/IVvd6Ws2Zf68jIFz2FfbJ8\nBUOhNl9kn1fnaeGv6w4RZ7PQz5lI7rHTl96WAM2+IEkJNjJT4kl22E7a4hAKGWza7eKtTYdp8Yco\n6J9KQf9UUpPiaGj0Udfkw+X2UlrpobTSg81q5vJJeUwZ3RerxYxhGBwob+BQRQOYwGIyYTabMB/7\nv8frZ8veKnYedHe4EIPWL+V+WYkYQH2Tj0ZvgKQEK9lpCaSnxLPrUC21nrZFpgnIzWrNQ58MB5v3\nVHHY5cFqMZOebD+h0D2eEYPWEb2EOCtVdc3YbWbGD8um2Regpr6F8prGNiO9JmDmhAFcO3VIeGRs\n024XxW/txeP1k5uViNVsoqy6iUCwdb04m4Wzh2aSnhxHMGRghCAt2U6fjEScafHsL69nw85KPj7o\nxmI2Mbhv6whZ30wHSQk2EuNt5GQ4wl/Q0Hoqek9pLfvL63HXt45CmswmRg3OYNTgDA5VeHjmtY/D\nhWq83cL3Zo8lPzcVwzDYvr+GHQeqmT6+PznpjlP+THbsr+Yvaw8yclA6M8YPwBH/6f7u8wcpr27i\nUG
UDZVWNOOKs5GYlkZvVut3q+mZq6ltISbQzYmBa+HeloqaJDbsq+XBnJYcqPCe8Z1ZqPPn9UhmS\nm8Kw/mkMyEk6odXio09qeOp/S/B4/WSnJ1Dp9pKaZOf/XDICZ1o8zb4gDV4/R1weDlV4cNV6OWtQ\nOlPH5JKTcfLP/dnfy4YmH+XVTVTUNJGblUh+v9R21zEMA29LEICEOMspW4v2ldVR/OZe9pfVMzg3\nmZF5GYzIS6efM5GUz/zR+Fk+fxC3p4VAIBT+nervTGr3jE1js5+3Nh3h7S1HMAxISbSTmmhn5KAM\nLhjTN9y6EwoZ7DncWoBmpMSRnhyH/VgrR4sviOPY70ck7R3HQobBJ+UNbNtXxbZ91bg9LXx92lAm\nFvaJuK3Pa/YFOFDeQH9nYps/pqG1FcUfNEiwtxbpndnOdcTVehwcNSSzze9iTX0zB8obOCvv0xai\nYCjEWxuP8M62MoYPSOeS8weEz1gdp0K7E6nQ7j6Ur+h90Zy5ar1U1DRRMCAt6t7TjmjxB9l7pA4j\nZGCzmrFazTjirCQm2EiMt7Ypav2BIO9tP8o/Nh4mPzeFWy4Z3qaHMWQY7DpUyztby9iwy4XZBJNH\n9WHGuQNwpsWzq7SWbfuqOeJqxNsSwOsLYrOYKRyczpghmQzJTSUQav0SsyfY2ftJDdX1zbhqvRx2\neTjsaqShycfIQRlMLuzDuGHONn2wlbVeHntpc4dGSe1WM8MGpjF5VB/GFTgxm00cPNrAzkNu3t5S\nFj4VazGbIhbDGSlxeJr8+AIhstMSKBySwba9VaccVQUYmJNE4eAMbBYzwZBBMGgQCLUWCqGQgSPe\nSnKCnfg4C2WuRvaX13OoogGrxUyyo7XQbGjyU9PQjGFAssPGpMI+XHROP4LBENv2VbNtXzUHKxpo\n9rUWOiZg8ug+XHPBEDJS4jla08S2fdU0+wIMyU1hSN9Umlr8vL2ljHe2ltHYHOCic/pxxaQ8UpPi\n2uwLHx90s3lP62nrKyblMXxg+gmf0eP1U/zWHt7bfhS71UzfrET6ZyUyOj+Ts4dmdWif9rYEsFpM\nEUfzo+FuaOGZ1z7mk6MN3PUfoynon3ZattsZKt1NbNzlwtPsZ0jfVIb2S2nzc4ikqtbLL1dvp7TS\nw9ihWcz96oiTFqoddSaP/af6o7o70Hdl9HpFob106VK2bt2KyWRi8eLFjBkz5oTXPPbYY2zZsoXn\nn38egOXLl7Nx40YCgQC33347s2bNYtGiRZSUlJCW1noQmzdvHhdddNFJ31eFdvehfEWvvZz5A0HM\nZtNJv0wOHm3gpy9tprE5gM1q5qy8dIYPSCMpwYYj3oZhGBx2eTjiasTj9VMwIJXCQRnk9zv5xT7B\nUIijNV72l9WxZU8VJQdqwqdN25OZEkc/ZxJZqfFs3O1qc7p4TH4m37l6FHabhV2H3Lzw990cOXZa\nMCc9gWDICBesdqu5zfvYrGYS4qw0twQivn/bWOJJiLNy2NU6yhdnszBumJNJo3JITrDz+H9vpa7R\nxxWT8xg+IJ3DLg/l1U1YzK0XmMXZLXia/FTXN1PhbgrHGm+3EAoZ4ThsVjNTRvVh5oQBZKbEc6C8\nnt2H6/C2BEhx2ElJbB0V7+dMIinBRq2nhb+8/wlvbykjGDJIiLMwdqiTkYPSMZtMhIzWwjloGBgh\nA6vVzMi8jPDFTNFo72LD1tPnzRQMzqKutqndddwNLZRVNZKZGk/fzMQOvVcg2Fr0n44/8JqaA8Tb\nLV3qgqeMjERqahpP/cJuzOcPUuryMKRvymkZ1dSxPzrKV/S6SqHdabOOrF+/noMHD1JcXMy+fftY\nvHgxxcXFbV6zd+9ePvzwQ2y21lMBH3zwAXv27KG4uBi3280111zDrFmzAPjud7/LtGnTOitckTPG\n4/VztLqJCncTtZ4Whg1IY2i/1DZfXi2+ICGjtTAxmaChyc+RqkbK
qhrxhQxc1Y00eP24G1qoqmum\nvtGH3WpmUJ9khvRLZcTAdEYOSsdqMXOgvJ7HXtqCtyXAlNF9+ORoQ3iE8mR2ldbyl/cPtvb12Vuv\nGLdZLdhtrf82gKM1Tfg/U9j2zXQwdmgWCXFWAsEQvmNXljc2+2lo8lNxbMQTIM5u4dLzB3LROf14\n4fVdbNtXzc9e3kpWajzv7ziKCZhYmMNFY/tR0D+VkGGwZU9V6wVGXj+jBmcwZkgmQ/unhkcn/YEg\nuw7Vsm1/NeVVja0XGtktZKQm4LBbyEiJw5maQG5WYvg07dGaJj4oOcraz/x33E0zCphx7gCg9cK2\nSMqrG1lbcpQPP67EZjWHe0ZHDkpvcyp2+MD0dkdrj0tLiuPmWcO57Pw8XLVe8vultvZUdoL2iiWr\nxUx2ugP7SQpik6n1wrtop2ezWsycpkHkNi0VXYXlFDNK9AR2m4X83PZbOkTk5DrtiLV27VpmzJgB\nQH5+PnV1dXg8HpKSksKveeSRR7j33nt58sknAZgwYUJ41DslJQWv10swGOysEEU6jWEY1Hp8NDX7\nafYFqfW0sPNQLR99UkN59YkjhX0zHUws7ENDo49dpbUcrvSEZy+wWkwEgu2feLKYTa0jonnpNDT5\n2XOkjt2H6/jbukMkxls5p8DJxt0umn0B5l1xFpNH9QVaTwWXVnpobA7Q1OwnZEA/ZyL9nUnE2y3s\nOlRLySc1fHK0Hp+/tWgOBIJ4vK2tDUbIoG+mgwHZSQzMTmbUkIwOjW56vK0F92f7Ye+6bgwrXvmI\nDTsr2V3a2gZxyyXD23ypW0wmxg/PZvzw7JNu22a1MGpIZnhateMijWr0yXBw9QVDuOorg9l3pJ73\nS47y8UE3V31lEBNHdrzPsW9mItdOzefaqfkdXieSzNT4LzRKLSIiXUunFdpVVVUUFhaGH2dkZOBy\nucKF9urVqznvvPPo169f+DUWiwWHo/XiiVWrVjF16lQsltZhkBdeeIFnn32WzMxMHnjgATIyIo8w\niUDr6c7jszCcyvGLnyrdXuLsltZ2BF+Q0soGDlc20uwLMKhPCkNyU+jnTMRy7KK0kGHQ0NQ6auuq\n87KntJbdpbXtTqtkt5kpHJxBf2ciOemtxebG3S427nLxp3/vBwiPiMbbLfj8QVr8QVIT48IX5OUP\nzMDf4icpwUZygq3NKfRmX4ADZfVs3lvFhx9X8u72ckwmuPWKkW0ukMlKSzjpVGkAYwuyGFuQFU2q\nOyQpwUbS5y50slrMfPvKQv6ak4Qj3sbUs/ue8X5K07Gr6T87hZmIiMiXdcbOwX22Fby2tpbVq1fz\n7LPPUlFRccJr//GPf7Bq1SqeeeYZAK666irS0tI466yzWLFiBU8++SQPPvjgSd8rPd2B9XSdp/wC\nIvXqyIki5cvl9rJpVyXDBqYxuIOnLQ3DYNdBN6+8u5/3t5WRmhTHVVPzuWRiXpvJ/Zt9AWobWqht\naGHjzkre2nCISrc34raPzwt7Khkp8UwZ4yQlyR6+IPCsQRkMz8s4oRXgsgvyaWjysXFnJc60BIYN\nTPtSF2sN6JfO1Al5BEMGH+2vxmY1M2JQ1//DdO6Voztt2/qdjJ5yFh3lK3rKWXSUr+h1hZx1WqGd\nnZ1NVVVV+HFlZSVOpxNo7cWuqalhzpw5+Hw+Dh06xNKlS1m8eDHvvPMOTz31FL/73e9ITm5N0KRJ\nk8Lbufjii3nooYcivrfbfeKp+TNFFyxE5/P58gdCeFsClFc38tamI2zc5QrP4dnPmcjEkTkU9E+j\nnzORxHgbzb4Ahyo8HKpooLLWS3Vd600qjrdn5GQ4qPW08MwrJfzXG7vIzXJQ3+ijvtFPi79tW1Kc\nzcJXRvelYEAqPn+IZl8Aq8VM/+wkBjiTsNvMHDzawP6yeircXkIhg5BhYDJBcoKdZIeNtKQ48vun\n4kyNb7cHttZ98gumCgekHntN
5P03mn2sT2rrrAK9eZ/U72T0lLPoKF/RU86io3xFr8dfDDllyhR+\n+ctfMnv2bEpKSsjOzg63jVx66aVceumlABw+fJj777+fxYsX09DQwPLly/nDH/4QnmEEYP78+SxY\nsIABAwawbt06CgoKOits6UTuhhbWfVRBdX0zF4/rF+7pbfEH+d93D/DPTUdOKH77O5OYNCqHfUfq\n2baviv95e3/4uWSHDU+Tn893L9ttZs4pyGLG+P6MyEunqSXAPzcd4R8bD/NJeQPJDhs5GQnHZn1o\n/a+/M5Fxw5ynnKv5VBeziYiIiBzXaYX2uHHjKCwsZPbs2ZhMJpYsWcLq1atJTk5m5syZ7a7z2muv\n4Xa7ueeee8LLli1bxpw5c7jnnntISEjA4XDw8MMPd1bYEoWQYbBhZyWfHG0IT98FrSPDdqs5fJeq\nFl+Qwy4Puw7Vhovif246woVjc5kyth8r/rydymN3ghvaL4X4OCvJDjvnn5XNsAFp4ZHhxmY/2/ZV\nU1rZOg3d0ZpGcjMTyeuTzKA+yfTJdJCZEk9SQtsbiCTG27hi8iAun5SHASfckEFERESkM+iGNadZ\nTzy9YxgGe4/U0dQcIDs9gazUeHbsr2H1O/vD8wd3xND+qUwamUOSw87qf++noqa1RcJkglkTBnD1\nBUM65UYqPU1P3Mc6k/IVPeUsOspX9JSz6Chf0evxrSPS/WzbV81/vbkHgDFDMhk9JIOy6ib+tfkI\nR2tO7Bs2Hbtb39Szc4mzWbBaTBiAzx+ixR8kdOwGFXabmZREO2mfuQvZOQVZvL2ljL1l9cwc358h\nuSln6mOKiIiInBEqtHsJnz/I+zuOEmezMDAniT6ZjvAUat6WAMVv7eXfW8uwmE1YLWb+vqGUv28o\nBVrncZ5YmENuZiKVtV5cbi/pKXFcMWkQuVkduzPc51ktZqaP78/sS/VXuoiIiPRMKrR7gao6L79a\nvYODFZ8WtDarmXh7a5vG8RHo/s4kbv3aSPpkONh9uJaPDtSQkmhn8qg+be5uJyIiIiKnpkK7h/vo\nkxqe+t8SPF4/U0b3IS8nmUMVHkorPfgCrTN8mB0mxg1z8rUpg7Aeu5Vw4aAMCrvB3MsiIiIiXZUK\n7R7GHwix93AtOw7UUHKghkOVHixmE0WXDOfCsbntzu0sIiIiIqdfxELb7Xbz85//nLfffpuqqipM\nJhPZ2dlcfPHF3H333eEbykhsNTUHeG9HOTv217Cr1I3PHwJae6tHDkrnmguGkN9Pt5YWEREROZMi\nFtoLFy5k8uTJ3HHHHWRlZWEYBpWVlaxZs4aFCxfy61//+kzFKe0IBEO8vaWM/333AB6vH4DcrEQK\nB2UwakgGwwakabo8ERERkRiJWGh7vV7mzp3bZllubi7f/va3mTNnTmfGJaewu7SWP/x1J0drmoi3\nW7hm6hCmjOpDRkp8rEMTEREREU5RaPv9fnbs2MGoUaPaLN+8eTOhUKhTA5P2hQyD19Ye5M/vHMDA\nYNo5/bjqK4NJSdSsICIiIiJdScRC+/7772fBggW0tLTgdDoBqKioIDU1lUceeeSMBCifamjyseKV\njyg5UEN6chy3X1nIsAFpsQ5LRERERNoRsdA+++yzee211zhy5AiVlZWYTCb69OlDnz59zlR8coy7\noYWfvrSZ8uomxuRnMu/yszS3tYiIiEgXFrHQ9vl8/OEPf+Dtt98OF9o5OTlMnz6dOXPmYLPZzlSc\nvZqr1suj/7WZqrpmLjlvANdPG4pZ0/SJiIiIdGmnnHUkMzOT7373uzidzjazjjzwwANqHzkDyqsb\nefS/NlPr8XHVVwZz5ZRBmgtbREREpBuIWGi7XC5+/vOft1mWl5fHhAkTuOmmmzo1MIGjNU0sf3Ez\ndY0+bpg2lEvPHxjrkERERESkg8yRnvT5fBw9evSE5aWlpQQCgU4LSqDS3cSj/9VaZN84vUBFto
iI\niEg3E3FE+//+3//L9ddfz+DBg9vMOnLkyBF+8pOfnJEAe6OqWi/L/2sz7oYWbpg2lJkTBsQ6JBER\nERGJUsRCe9q0abz55pts2bKFyspKAPr06cPZZ5/doQshly5dytatWzGZTCxevJgxY8ac8JrHHnuM\nLVu28Pzzz590nfLychYsWEAwGMTpdPLoo49it/fMGTdChsHTa0qoqW/hPy4copFsERERkW4qYusI\ngN1u57zzzuOKK67giiuu4Nxzz8Vms7Fs2bKI661fv56DBw9SXFzMT37yk3ZHwPfu3cuHH354ynWe\neOIJbrrpJl588UXy8vJYtWpVtJ+z2/j3ljL2ldVz3lnZXD5pUKzDEREREZEv6JSF9smUlJREfH7t\n2rXMmDEDgPz8fOrq6vB4PG1e88gjj3Dvvfeecp1169Yxffp0oHWUfe3atV807C6tvtHHqn/tIyHO\nwuzpBbEOR0RERES+hIitIxdeeGG7U8kZhoHb7Y644aqqKgoLC8OPMzIycLlcJCUlAbB69WrOO+88\n+vXrd8p1vF5vuFUkMzMTl8sV8b3T0x1YrZaIr+lMTmfyF1rv+b9vpKklwO3XjKZgcNZpjqrr+qL5\n6s2Us+goX9FTzqKjfEVPOYuO8hW9rpCziIX2+PHjOffcc7nwwgvbLDcMg/vuuy+qNzIMI/zv2tpa\nVq9ezbPPPktFRUWH1om07PPc7qaoYjudnM5kXK6GqNf7+KCbf248TF6fZCYUZH2hbXRHXzRfvZly\nFh3lK3rKWXSUr+gpZ9FRvqJ3JnMWqaCPWGj/6Ec/YvHixVx11VUkJia2ee5UF0NmZ2dTVVUVflxZ\nWRmeueSDDz6gpqaGOXPm4PP5OHToEEuXLj3pOg6Hg+bmZuLj46moqCA7Ozvie3dH//P2PkxA0SXD\nMZt1QxoRERGR7i5ij3ZiYiK/+MUvTiiyAZ555pmIG54yZQqvv/460NrPnZ2dHW4bufTSS3nttdd4\n+eWXefLJJyksLGTx4sUnXWfy5Mnh5W+88QYXXHBB9J+0C9tXVsf+snrOHprF4L4psQ5HRERERE6D\niCPan9XQ0EAoFCI1NRU49Yj2uHHjKCwsZPbs2ZhMJpYsWcLq1atJTk5m5syZHV4HYP78+SxcuJDi\n4mJyc3O5+uqrOxp2t/DmhsMAzDi3f4wjEREREZHTxWScoun53//+N7/61a9ISEggPT2dmpoaRo4c\nyXe/+90OzaUdC7HsY4q2J8jd0MKC37xPToaDH807r92LT3sy9Z1FTzmLjvIVPeUsOspX9JSz6Chf\n0esWPdqnKTggAAAgAElEQVTr1q1jxYoV/PjHP6ag4NPp5j744AMefvhhZs2axbnnnovV2uGBcfmc\nt7ccIRgymDG+f68rskVERER6sogV8m9/+1uWL1/OfffdR3V1NYWFhYwYMYKMjAxKS0txuVy88sor\nXHPNNWcq3h7FHwjxr81HcMRZmVTYJ9bhiIiIiMhpFPFiyJaWFnJzcxk6dCi33347X//61/F6vfzu\nd7/j7rvvZtq0afzzn/88U7H2OB/urKC+yc/Us3OJs8du3m8REREROf0iFtoWS2vxd+DAAa699lrO\nP/987r77bn7xi1/wt7/9jaSkJGpra89IoD3RO1vLMQHTxvU75WtFREREpHuJWGjbbDaam5txOBx8\n8MEH4eXDhw/n0KFDQMduICMn8geC7CurY0BOEs60hFiHIyIiIiKnWcQe7auvvprnnnuORx55hEWL\nFvHYY4+Rl5fH4cOHufzyy9m5cyd9+qi3+Is4UN5AIGgwrH9arEMRERERkU4QsdC+/PLLWbhwIX/6\n05946qmnqK6upqysjL59+2I2m7nzzjvDc11LdHaVtrbcDBugQltERESkJzrlvHzLli3j+eef5+ab\nb2bQoEFkZWVx+PBhPvnkE+677z5GjBhxJuLscfYcK7QLVG
iLiIiI9EgdmgD7lltu4ZZbbuHIkSO4\nXC5SU1MZPHhwZ8fWYwVDIfYcqaNPhoPURHuswxERERGRThDVnWZyc3Pp27evbqzyJZVWemjxBdU2\nIiIiItKDRZx1xOv18uMf/zj8ePr06YwcOZKxY8eye/fuTg+up9p96Hh/dmqMIxERERGRzhKx0P7p\nT3+K2+0mGAwC0K9fP3bu3Mnjjz/Ob37zmzMSYE+0+3AdoAshRURERHqyiIX2hg0bWLZsWfjGNcdN\nmzaNw4cPd2pgPZVhGOwurSUjJY6sVM2fLSIiItJTRSy0k5KSsFo/beP+/ve/H/633a6L+L6I8uom\nPF6/RrNFREREeriIhXZTUxOBQCD8eMyYMQA0Nzfj9Xo7N7IeavfhY/3ZulGNiIiISI8WsdCeNm0a\nDzzwAI2NjeFlbreb73//+9xwww2dHlxPtFs3qhERERHpFSJO7/ed73yHxx57jGnTppGbm0sgEMDl\ncvHNb36T2bNnn3LjS5cuZevWrZhMJhYvXhweEQd4+eWXWbVqFWazmREjRrBkyRJWrVrFmjVrwq/Z\nsWMHmzdv5pZbbqGpqQmHwwHAwoULGTVq1Bf9zDG1p7SWpAQbfTMdsQ5FRERERDpRxELbarWycOFC\n7rrrLg4ePIjFYiEvL69D/dnr16/n4MGDFBcXs2/fPhYvXkxxcTHQOm3gq6++ysqVK7HZbBQVFbF5\n82auv/56rr/++vD6f/3rX8Pbe/jhhxk2bNiX+awxV1PfTHV9C2OHZmkuchEREZEeLmLrSCgU4te/\n/jV2u50RI0ZQUFBAaWlph6b2W7t2LTNmzAAgPz+furo6PB4PAAkJCTz33HPYbDa8Xi8ejwen09lm\n/V/96ld85zvf+aKfq0vae6R1Wr+C/po/W0RERKSnizii/atf/YqdO3fi8/lISGidii4nJ4edO3fy\nxz/+kaKiopOuW1VVRWFhYfhxRkYGLpeLpKSk8LIVK1aEtzNgwIDw8m3bttG3b982xfcTTzyB2+0m\nPz+fxYsXEx8ff9L3Tk93YLVaTvp8Z3M6k9tdfvjdAwCcO6rvSV/TGykX0VPOoqN8RU85i47yFT3l\nLDrKV/S6Qs4iFtr//Oc/eemll9q0iiQlJbFs2TLmzp0bsdD+PMMwTlh22223UVRUxK233sr48eMZ\nP348AKtWreKaa64Jv66oqIjhw4czcOBAlixZwsqVK5k3b95J38vtbupwXKeb05mMy9XQ7nPb91Rh\ntZhJi7ee9DW9TaR8SfuUs+goX9FTzqKjfEVPOYuO8hW9M5mzSAV9xNaR+Pj4dvux4+PjMZsjrkp2\ndjZVVVXhx5WVleER6traWj788MPwtqZOncqmTZvCr123bh3nnHNO+PHMmTMZOHAgABdffHG3vP27\ntyXAocoGBvVNxmaNnDsRERER6f5OOY92U9OJo8N1dXVtpvxrz5QpU3j99dcBKCkpITs7O9w2EggE\nWLRoUXgb27dvZ/DgwQBUVFSQmJgYLvANw2Du3LnU19cDrUV4QUFBNJ+xS9hfXo9hQEE/9WeLiIiI\n9AYRW0euuuoq7rzzTh588EEGDRoEwM6dO/nhD3/IN77xjYgbHjduHIWFhcyePRuTycSSJUtYvXo1\nycnJzJw5kzvuuIOioiKsVivDhw9n+vTpALhcLjIyMsLbMZlM3HDDDcydO5eEhARycnKYP3/+l/zY\nZ97ew60XQg7VhZAiIiIivYLJaK95+jNWrlzJihUraGhowDAMMjMzuf3228PT8HVFsexjOllP0GMv\nbabkEze/uOsrJDt0+/rj1HcWPeUsOspX9JSz6Chf0VPOoqN8Ra+r9GhHHNEGmDNnDnPmzMHj8WAy\nmUhMTDytwfUGoZDBvrJ6+mQ4VGSLiIiI9BKnLLR37NjB73//e3bv3o3ZbGbUqFF885vf7JZ90rFy\n2OWh2RfU/NkiIiIivU
jEiyE3bNjAnXfeyeTJk3n88cf54Q9/yJAhQ5g3bx4bN248UzF2e3vUny0i\nIiLS60Qc0X766ad58sknGTVqVHjZuHHjmDhxIsuWLeOFF17o9AB7gk/vCJkW40hERERE5EyJOKLt\n9XrbFNnHjR49ut1p/6R9B8rrSUqwkZOeEOtQREREROQMiVhoR7opzWdvpS6R1Tf6SE+Ow2QyxToU\nERERETlDIraOVFZWsmrVqnafc7lcnRJQTxMIhmj2BUlKsMU6FBERERE5gyIW2uecc85JL3ocO3Zs\npwTU03i8fgAV2iIiIiK9TMRC++GHHz5TcfRYKrRFREREeqeIhfb9999/0udMJhNLly497QH1NI0q\ntEVERER6pYiF9jXXXHPCsqamJp566incbnenBdWTaERbREREpHeKWGifd955bR7/5S9/4Ze//CXX\nXnst3/jGNzo1sJ6iQYW2iIiISK90yluwA+zevZsf/ehHZGVl8dxzz9GnT5/OjqvHON46kqhCW0RE\nRKRXiVhoezweHn/8cTZs2MD999/P+eeff6bi6jGOt44kO1Roi4iIiPQmEQvtWbNm0adPH26++WbK\ny8v585//3Ob5q6++ulOD6wk8GtEWERER6ZUiFto33ngjJpOJo0ePfqGNL126lK1bt2IymVi8eDFj\nxowJP/fyyy+zatUqzGYzI0aMYMmSJaxfv567776bgoICAIYNG8YDDzxAeXk5CxYsIBgM4nQ6efTR\nR7Hb7V8opjPN03SsRztehbaIiIhIbxKx0J4/f/4X3vD69es5ePAgxcXF7Nu3j8WLF1NcXAyA1+vl\n1VdfZeXKldhsNoqKiti8eTPQegHmE0880WZbTzzxBDfddBOXXXYZP/vZz1i1ahU33XTTF47tTPI0\n+7GYTSTEWWIdioiIiIicQebO2vDatWuZMWMGAPn5+dTV1eHxeABISEjgueeew2az4fV68Xg8OJ3O\nk25r3bp1TJ8+HYBp06axdu3azgr7tPN4AyQm2DCZTLEORURERETOoE4rtKuqqkhPTw8/zsjIwOVy\ntXnNihUrmDlzJpdeeikDBgwAYO/evXz729/mxhtv5L333gNaR8CPt4pkZmaesJ2urNHr19R+IiIi\nIr1Qh6b3Ox0Mwzhh2W233UZRURG33nor48ePZ9CgQdx5551cdtlllJaWUlRUxBtvvHHK7XxeeroD\nqzV2rRpOZzIAwZBBY7OfvL4p4WVyIuUmespZdJSv6Cln0VG+oqecRUf5il5XyFmHCu2//OUv/Pa3\nv6W+vh7DMDAMA5PJxL/+9a+TrpOdnU1VVVX4cWVlZbg9pLa2lj179jBhwgTi4+OZOnUqmzZtYvz4\n8Xz1q18FYODAgWRlZVFRUYHD4aC5uZn4+HgqKirIzs6OGK/b3dSRj9UpnM5kXK4GoHXGEcMAu8UU\nXiZtfTZf0jHKWXSUr+gpZ9FRvqKnnEVH+YremcxZpIK+Q60jv/zlL/nBD37ACy+8wMqVK3nxxRdZ\nuXJlxHWmTJnC66+/DkBJSQnZ2dkkJSUBEAgEWLRoEY2NjQBs376dwYMHs2bNGn7/+98D4HK5qK6u\nJicnh8mTJ4e39cYbb3DBBRd0JOyY0xzaIiIiIr1Xh0a08/LymDBhQlQbHjduHIWFhcyePRuTycSS\nJUtYvXo1ycnJzJw5kzvuuIOioiKsVivDhw9n+vTpNDY2ct999/Hmm2/i9/t56KGHsNvtzJ8/n4UL\nF1JcXExubm63mb9bc2iLiIiI9F4mowNNz7/5zW/wer2cd955WCyf9j5PmjSpU4P7omJ5euWzpyq2\n7Kniif/ZxvXT8rns/LyYxdSV6XRY9JSz6Chf0VPOoqN8RU85i47yFb2u0jrSoRHt999/HyA81zWA\nyWTqsoV2V3F8RFs3qxERERHpfTpUaD///POdHUePFC601aMtIiIi0ut06GLIffv2UVRU
xLhx4xg/\nfjzz5s3j0KFDnR1bt9fYfKzQVo+2iIiISK/ToUL7Rz/6Ed/85jd59913+fe//83s2bNZsmRJZ8fW\n7TU0qdAWERER6a06VGgbhsFFF12Ew+EgMTGRmTNnEgwGOzu2bq9Rs46IiIiI9FodKrT9fj8lJSXh\nx9u2bVOh3QEerx8TkBh/xm7AKSIiIiJdRIcqwIULF/K9732PmpoaDMMgOzubRx55pLNj6/Y8zX4c\n8VYs5g79PSMiIiIiPUiHCu2zzz6bv/3tbzQ0NGAymcJ3eJTIPE1+tY2IiIiI9FIRC+2nn36a22+/\nne9///uYTKYTnl++fHmnBdbdGYaBx+snMzU+1qGIiIiISAxELLRHjhwJwOTJk094rr3CWz7V7AsS\nDBmacURERESkl4pYaF9wwQVA6zza9913X5vn/t//+39cffXVnRdZN3d8xhEV2iIiIiK9U8RC++9/\n/ztvvPEGa9eupbKyMrw8EAjw4Ycfdnpw3VmDCm0RERGRXu2UI9oZGRns2LGDSZMmhZebTCbuvPPO\nTg+uO9Mc2iIiIiK9W8RCOz4+nvHjx/PnP/+ZuLi4Ns8tW7aMhQsXdmpw3ZnnWKGdrEJbREREpFfq\n0PR+GzZs4Gc/+xm1tbUA+Hw+0tLSVGhH4FHriIiIiEiv1qE7qTz++OM88MADZGZm8tRTT3Hdddex\naNGizo6tW/OodURERESkV+vQiHZSUhJjx47FZrNRUFDA3Xffzbe+9S2mTJkScb2lS5eydetWTCYT\nixcvZsyYMeHnXn75ZVatWoXZbGbEiBEsWbIEk8nE8uXL2bhxI4FAgNtvv51Zs2axaNEiSkpKSEtL\nA2DevHlcdNFFX/xTnwFqHRERERHp3TpUaAcCATZs2EBKSgp/+tOfyM/P5/DhwxHXWb9+PQcPHqS4\nuJh9+/axePFiiouLAfB6vbz66qusXLkSm81GUVERmzdvxufzsWfPHoqLi3G73VxzzTXMmjULgO9+\n97tMmzbtS37cM0cj2iIiIiK9W4cK7R/+8IdUVVWxYMECfvSjH1FVVcW3v/3tiOusXbuWGTNmAJCf\nn09dXR0ej4ekpCQSEhJ47rnngNai2+Px4HQ6yc3NDY96p6Sk4PV6CQaDX+bzxYzm0RYRERHp3TpU\naA8ZMoQhQ4YA8Mwzz3Row1VVVRQWFoYfZ2Rk4HK5SEpKCi9bsWIFf/zjHykqKmLAgAEAOBwOAFat\nWsXUqVOxWCwAvPDCCzz77LNkZmbywAMPkJGRcdL3Tk93YLVaOhRnZ3A6k/H6Q8TbLeT2TY1ZHN2F\n05kc6xC6HeUsOspX9JSz6Chf0VPOoqN8Ra8r5CxioX3xxRdHvNX6m2++2eE3MgzjhGW33XYbRUVF\n3HrrrYwfP57x48cD8I9//INVq1aFi/qrrrqKtLQ0zjrrLFasWMGTTz7Jgw8+eNL3crubOhzX6eZ0\nJuNyNVDX0ExivA2XqyFmsXQHx/MlHaecRUf5ip5yFh3lK3rKWXSUr+idyZxFKugjFtp/+MMfACgu\nLsbpdDJx4kSCwSDvvfceTU2Ri9ns7GyqqqrCjysrK3E6nQDU1tayZ88eJkyYQHx8PFOnTmXTpk2M\nHz+ed955h6eeeorf/e53JCe3Bv7Zm+VcfPHFPPTQQxHfuyvweAP0yXDEOgwRERERiZGI0/sNHDiQ\ngQMH8tFHHzF37lxGjBhBYWEht912Gx9//HHEDU+ZMoXXX38dgJKSErKzs8NtI4FAgEWLFtHY2AjA\n9u3bGTx4MA0NDSxfvpynn346PMMIwPz58yktLQVg3bp1FBQUfPFPfAb4/EFa/EGSHOrPFhEREemt\nOtSjXV1dzbvvvsu4ceMwm81s3ryZsrKyiOuMGzeOwsJCZs+ejclkYsmSJaxevZrk5GRmzpzJHXfc\nQVFREVarleHDhzN9+nRefvll3G4399xzT3g7y5Yt
Y86cOdxzzz0kJCTgcDh4+OGHv9yn7mSa2k9E\nRERETEZ7zdOfs2nTJpYvX87u3bsxDIOCggLuu+8+zjvvvDMRY9Ri2cfkdCazcUcZDz37ITPG9+em\nmcNiFkt3oL6z6Cln0VG+oqecRUf5ip5yFh3lK3rdokf7uHHjxvHSSy+dtoB6uobjU/updURERESk\n14pYaP/4xz/mBz/4ATfddFO7s4+sXLmy0wLrzjxNah0RERER6e0iFtrXXXcdQJueaTk1T3hE2x7j\nSEREREQkViIW2m63m7Vr156pWHqMhiYfoLtCioiIiPRmEQvtX//61yd9zmQytZnfWj6lWUdERERE\nJGKh/fzzz5/0ueNzZMuJPLoYUkRERKTX69CsI2VlZbzwwgu43W4AfD4f69at45JLLunU4LqrhmMX\nQ6p1RERERKT3inhnyOMWLFhAWloaW7ZsYdSoUbjdbpYvX97ZsXVbHq+fhDgLVkuH0isiIiIiPVCH\nKkGLxcJtt91GVlYWc+bM4Te/+Y2m9ovA4/VrNFtERESkl+tQod3S0sLRo0cxmUyUlpZitVo5cuRI\nZ8fWLRmGQUOTn6QETe0nIiIi0pt1qEf7W9/6FmvXrmXevHlcddVVWCwWrrjiis6OrVtq9gUJBEMk\n60JIERERkV4tYqFdUVFBTk4OM2bMCC9bv349jY2NpKamdnpw3VF9o+bQFhEREZFTtI587Wtf47bb\nbuONN94gEAgAYLVaVWRHUN/YAqjQFhEREentIhba77zzDldeeSUvv/wyF110EcuWLWPfvn1nKrZu\n6fiItlpHRERERHq3iK0jcXFxXHHFFVxxxRVUVlbyyiuvcO+99+JwOLjuuuu47rrrzlSc3YZaR0RE\nREQEOjjrCEB2djbz5s3j5z//Of369eM///M/T7nO0qVL+frXv87s2bPZtm1bm+defvllbrjhBmbP\nns1DDz2EYRgnXae8vJxbbrmFm266ibvvvhufzxfNZzyjPi20NeuIiIiISG/WoUK7rq6OlStXct11\n13Hvvfdy9tln8/bbb0dcZ/369Rw8eJDi4mJ+8pOf8JOf/CT8nNfr5dVXX2XlypW89NJL7N+/n82b\nN590nSeeeIKbbrqJF198kby8PFatWvUlPnLnUuuIiIiIiMApCu233nqL+fPnc9lll7F7924efPBB\n1qxZQ1FREenp6RE3vHbt2vBsJfn5+dTV1eHxeABISEjgueeew2az4fV68Xg8OJ3Ok66zbt06pk+f\nDsC0adNYu3btl/7gnUWtIyIiIiICpyi0n3nmGaZPn85bb73FD3/4Q8aMGdPhDVdVVbUpxjMyMnC5\nXG1es2LFCmbOnMmll17KgAEDTrqO1+vFbm9txcjMzDxhO11JeNYRjWiLiIiI9GoRL4Z84YUXTtsb\nHe/B/qzbbruNoqIibr31VsaPH9+hddpb9nnp6Q6sVssXC/RLqm/0YTLBoP7pWCwdboHv1ZzO5FiH\n0O0oZ9FRvqKnnEVH+YqechYd5St6XSFnHboz5BeRnZ1NVVVV+HFlZSVOpxOA2tpa9uzZw4QJE4iP\nj2fq1Kls2rTppOs4HA6am5uJj4+noqKC7OzsiO/tdjd1zofqgPpGH444KzU1jTGLoTtxOpNxuRpi\nHUa3opxFR/mKnnIWHeUrespZdJSv6J3JnEUq6DttyHXKlCm8/vrrAJSUlJCdnU1SUhIAgUCARYsW\n0djYWoxu376dwYMHn3SdyZMnh5e/8cYbXHDBBZ0V9pdW3+gjyaEZR0RERER6u04b0R43bhyFhYXM\nnj0bk8nEkiVLWL16NcnJycycOZM77riDoqIirFYrw4cPZ/r06ZhMphPWAZg/fz4LFy6kuLiY3Nxc\nrr766s4K+0sxDIP6Rh9D+qbEOhQRERERiTGT0ZGm524mVqdXmpr93Pn4O4wdmsVd13X8wtHeTKfD\noqecRUf5ip5y
Fh3lK3rKWXSUr+j1+NaR3qjB6wc044iIiIiIqNA+rTxNrYV2subQFhEREen1VGif\nRhrRFhEREZHjVGifRsdHtHVXSBERERFRoX0aebzHW0c0vZ+IiIhIb6dC+zRq8PoAtY6IiIiIiArt\n00oXQ4qIiIjIcSq0TyOPLoYUERERkWNUaJ9GDV4/ZrOJhLhOu+GmiIiIiHQTKrRPI0+TnxSHHbPJ\nFOtQRERERCTGVGifRh6vn+REzTgiIiIiIiq0Tyuz2URuVmKswxARERGRLkDNxKfRg//nXPrnptHk\naY51KCIiIiISYxrRPo0yUuJJ1NR+IiIiIoIKbRERERGRTqFCW0RERESkE6jQFhERERHpBCq0RURE\nREQ6gQptEREREZFOYDIMw4h1ECIiIiIiPY1GtEVEREREOoEKbRERERGRTqBCW0RERESkE6jQFhER\nERHpBCq0RUREREQ6gQptEREREZFOYI11AD3F0qVL2bp1KyaTicWLFzNmzJhYh9QlLV++nI0bNxII\nBLj99tt56623KCkpIS0tDYB58+Zx0UUXxTbILmTdunXcfffdFBQUADBs2DC+9a1vsWDBAoLBIE6n\nk0cffRS73R7jSLuG//7v/2bNmjXhxzt27GDUqFE0NTXhcDgAWLhwIaNGjYpViF3G7t27+c53vsPc\nuXO5+eabKS8vb3e/WrNmDc899xxms5kbbriB66+/Ptahx0x7Obv//vsJBAJYrVYeffRRnE4nhYWF\njBs3LrzeH/7wBywWSwwjj43P52vRokXtHu+1j33q8zm76667cLvdANTW1jJ27Fhuv/12vva1r4WP\nY+np6TzxxBOxDDtmPl9TjB49uusdxwz50tatW2fcdttthmEYxt69e40bbrghxhF1TWvXrjW+9a1v\nGYZhGDU1NcaFF15oLFy40HjrrbdiHFnX9cEHHxjz589vs2zRokXGa6+9ZhiGYTz22GPGypUrYxFa\nl7du3TrjoYceMm6++WZj165dsQ6nS2lsbDRuvvlm4wc/+IHx/PPPG4bR/n7V2NhozJo1y6ivrze8\nXq9x+eWXG263O5ahx0x7OVuwYIHx6quvGoZhGC+88IKxbNkywzAM47zzzotZnF1Fe/lq73ivfexT\n7eXssxYtWmRs3brVKC0tNa655poYRNi1tFdTdMXjmFpHToO1a9cyY8YMAPLz86mrq8Pj8cQ4qq5n\nwoQJ/OIXvwAgJSUFr9dLMBiMcVTdz7p165g+fToA06ZNY+3atTGOqGv61a9+xXe+851Yh9El2e12\nfvvb35KdnR1e1t5+tXXrVkaPHk1ycjLx8fGMGzeOTZs2xSrsmGovZ0uWLOGSSy4BWkcVa2trYxVe\nl9NevtqjfexTkXK2f/9+GhoadLb8M9qrKbricUyF9mlQVVVFenp6+HFGRgYulyuGEXVNFoslfPp+\n1apVTJ06FYvFwgsvvEBRURH33nsvNTU1MY6y69m7dy/f/va3ufHGG3nvvffwer3hVpHMzEzta+3Y\ntm0bffv2xel0AvDEE08wZ84cHnzwQZqbm2McXexZrVbi4+PbLGtvv6qqqiIjIyP8mt58bGsvZw6H\nA4vFQjAY5MUXX+RrX/saAD6fj+9973vMnj2bZ599Nhbhxlx7+QJOON5rH/vUyXIG8Mc//pGbb745\n/Liqqoq77rqL2bNnt2mX603aqym64nFMPdqdwNBd7SP6xz/+wapVq3jmmWfYsWMHaWlpnHXWWaxY\nsYInn3ySBx98MNYhdhmDBg3izjvv5LLLLqO0tJSioqI2ZwG0r7Vv1apVXHPNNQAUFRUxfPhwBg4c\nyJIlS1i5ciXz5s2LcYRd28n2K+1vJwoGgyxYsICJEycyadIkABYsWMCVV16JyWTi5ptv5txzz2X0\n6NExjjT2rrrqqhOO9+ecc06b12gfO5HP52Pjxo089NBDAKSlpXH33Xdz5ZVX0t
DQwPXXX8/EiRNP\nefagp/psTTFr1qzw8q5yHNOI9mmQnZ1NVVVV+HFlZWV4JE3aeuedd3jqqaf47W9/S3JyMpMmTeKs\ns84C4OKLL2b37t0xjrBrycnJ4atf/Somk4mBAweSlZVFXV1deFS2oqKi1x5cI1m3bl34C3zmzJkM\nHDgQ0D4WicPhOGG/au/Ypv2trfvvv5+8vDzuvPPO8LIbb7yRxMREHA4HEydO1D53THvHe+1jp/bh\nhx+2aRlJSkriP/7jP7DZbGRkZDBq1Cj2798fwwhj5/M1RVc8jqnQPg2mTJnC66+/DkBJSQnZ2dkk\nJSXFOKqup6GhgeXLl/P000+HrzqfP38+paWlQGtxdHx2DWm1Zs0afv/73wPgcrmorq7m2muvDe9v\nb7zxBhdccEEsQ+xyKioqSExMxG63YxgGc+fOpb6+HtA+FsnkyZNP2K/OPvtstm/fTn19PY2NjWza\ntIlzzz03xpF2HWvWrMFms3HXXXeFl+3fv5/vfe97GIZBIBBg06ZN2ueOae94r33s1LZv386IESPC\nj6qGE3IAAARZSURBVD/44AMefvhhAJqamti5cyeDBw+OVXgx015N0RWPY2odOQ3GjRtHYWEhs2fP\nxmQysWTJkliH1CW99tpruN1u7rnnnvCya6+9lnvuuYeEhAQcDkf44CGtLr74Yu677z7efPNN/H4/\nDz30EGeddRYLFy6kuLiY3Nxcrr766liH2aW4XK5wP57JZOKGG25g7ty5JCQkkJOTw/z582McYezt\n2LGDZcuWceTIEaxWK6+//jo//elPWbRoUZv9ymaz8f/bu3+QNrcwjuPfNwkqgot/qNAuFq3gVKw4\nCaKT0NEWKdShKIUqRagoLZiqi8RFRKdCO4VqcVEXnQqCUnUQUahOgpRuJYIBN413aK/DNb3D7X0b\nG7+fKSRwcs4hvPx43ufN6e/vp6uriyAI6O3tpaSkJNfTz4lse5ZKpSgsLKSzsxP4/jD8yMgIlZWV\nPHjwgEgkQmtr67V8gC3bfj1+/PjS9b6oqMjf2A/Z9mx6eppv375d3JUDaGhoYGFhgY6ODs7Oznj6\n9Ck3btzI4cxzI1umSCQSDA0NXanrWHBuQ5QkSZL0v7N1RJIkSQqBQVuSJEkKgUFbkiRJCoFBW5Ik\nSQqBQVuSJEkKgX/vJ0l54OvXr7S1tV06aa+5uZnu7u5fHn9zc5PJyUlmZ2d/eSxJui4M2pKUJ0pL\nS0kmk7mehiTpB4O2JOW5uro6enp62Nzc5OTkhEQiwZ07d9jZ2SGRSBCLxQiCgNevX1NdXc3h4SHx\neJxMJkNhYeHFQVKZTIbh4WH29/cpKCjgzZs3APT395NOpzk9PaWlpYVnz57lcrmSdGXYoy1Jee7s\n7IyamhqSySSPHj1iamoKgMHBQV69ekUymeTJkyeMjo4CMDw8TFdXF+/fv6e9vZ3l5WUADg4OeP78\nOXNzc8RiMdbW1vj06ROnp6fMzMzw4cMHiouLyWQyOVurJF0lVrQlKU8cHR1dHAf+t4GBAQCampoA\nqK+v5927d6TTaVKp1MXx4I2Njbx48QKA3d1dGhsbAbh//z7wvUf79u3blJeXA1BZWUk6naa1tZWp\nqSn6+vpobm7m4cOHRCLWcCQJDNqSlDf+rUf7/Pz84nUQBARB8NPPgaxV6Wg0eum9srIyFhcX2d7e\n5uPHj7S3tzM/P09RUdF/WYIk5RXLDpJ0DWxsbACwtbVFbW0tJSUlVFRUsLOzA8D6+jp3794Fvle9\nV1dXAVhaWmJiYuKn466trbGyssK9e/cYHBykuLiYVCoV8mok6c9gRVuS8kS21pFbt24BsLe3x+zs\nLMfHx4yPjwMwPj5OIpEgGo0SiUQYGRkBIB6PE4/HmZmZIRaLMTY2xpcvX7J+Z1VVFS9fvuTt27dE\no1Gampq4efNmeIuUpD9IcP7P+4WSpLxSW1
vL58+ficWsrUjS72TriCRJkhQCK9qSJElSCKxoS5Ik\nSSEwaEuSJEkhMGhLkiRJITBoS5IkSSEwaEuSJEkhMGhLkiRJIfgL9NPpBoSsSKsAAAAASUVORK5C\nYII=\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x7fe6dd70e610>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "plt.figure(figsize=(12, 3))\n",
    "plt.plot(ndcgs_vad)\n",
    "plt.ylabel(\"Validation NDCG@100\")\n",
    "plt.xlabel(\"Epochs\")\n",
    "pass"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Load the test data and compute test metrics"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 45,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "test_data_tr, test_data_te = load_tr_te_data(\n",
    "    os.path.join(pro_dir, 'test_tr.csv'),\n",
    "    os.path.join(pro_dir, 'test_te.csv'))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 46,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "N_test = test_data_tr.shape[0]\n",
    "idxlist_test = range(N_test)\n",
    "\n",
    "batch_size_test = 2000"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 47,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "INFO:tensorflow:Scale of 0 disables regularizer.\n"
     ]
    }
   ],
   "source": [
    "tf.reset_default_graph()\n",
    "vae = MultiVAE(p_dims, lam=0.0)\n",
    "saver, logits_var, _, _, _ = vae.build_graph()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Load the model that performed best on the validation set"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 48,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "chkpt directory: /volmount/chkpt/ml-20m/VAE_anneal200K_cap2.0E-01/I-600-200-600-I\n"
     ]
    }
   ],
   "source": [
    "chkpt_dir = '/volmount/chkpt/ml-20m/VAE_anneal{}K_cap{:1.1E}/{}'.format(\n",
    "    total_anneal_steps/1000, anneal_cap, arch_str)\n",
    "print(\"chkpt directory: %s\" % chkpt_dir)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 50,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "n100_list, r20_list, r50_list = [], [], []\n",
    "\n",
    "with tf.Session() as sess:\n",
    "    saver.restore(sess, '{}/model'.format(chkpt_dir))\n",
    "\n",
    "    for bnum, st_idx in enumerate(range(0, N_test, batch_size_test)):\n",
    "        end_idx = min(st_idx + batch_size_test, N_test)\n",
    "        X = test_data_tr[idxlist_test[st_idx:end_idx]]\n",
    "\n",
    "        if sparse.isspmatrix(X):\n",
    "            X = X.toarray()\n",
    "        X = X.astype('float32')\n",
    "\n",
    "        pred_val = sess.run(logits_var, feed_dict={vae.input_ph: X})\n",
    "        # exclude examples from training and validation (if any)\n",
    "        pred_val[X.nonzero()] = -np.inf\n",
    "        n100_list.append(NDCG_binary_at_k_batch(pred_val, test_data_te[idxlist_test[st_idx:end_idx]], k=100))\n",
    "        r20_list.append(Recall_at_k_batch(pred_val, test_data_te[idxlist_test[st_idx:end_idx]], k=20))\n",
    "        r50_list.append(Recall_at_k_batch(pred_val, test_data_te[idxlist_test[st_idx:end_idx]], k=50))\n",
    "    \n",
    "n100_list = np.concatenate(n100_list)\n",
    "r20_list = np.concatenate(r20_list)\n",
    "r50_list = np.concatenate(r50_list)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 51,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Test NDCG@100=0.42592 (0.00211)\n",
      "Test Recall@20=0.39535 (0.00270)\n",
      "Test Recall@50=0.53540 (0.00284)\n"
     ]
    }
   ],
   "source": [
    "print(\"Test NDCG@100=%.5f (%.5f)\" % (np.mean(n100_list), np.std(n100_list) / np.sqrt(len(n100_list))))\n",
    "print(\"Test Recall@20=%.5f (%.5f)\" % (np.mean(r20_list), np.std(r20_list) / np.sqrt(len(r20_list))))\n",
    "print(\"Test Recall@50=%.5f (%.5f)\" % (np.mean(r50_list), np.std(r50_list) / np.sqrt(len(r50_list))))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Train a Multi-DAE"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The generative function is a [200 -> n_items] MLP, so the overall architecture of the Multi-DAE is [n_items -> 200 -> n_items]. We find that this shallower architecture achieves a better validation NDCG@100 than the [n_items -> 600 -> 200 -> 600 -> n_items] architecture used for Multi-VAE^{PR}."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 52,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "p_dims = [200, n_items]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 53,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "tf.reset_default_graph()\n",
    "dae = MultiDAE(p_dims, lam=0.01 / batch_size, random_seed=98765)\n",
    "\n",
    "saver, logits_var, loss_var, train_op_var, merged_var = dae.build_graph()\n",
    "\n",
    "ndcg_var = tf.Variable(0.0)\n",
    "ndcg_dist_var = tf.placeholder(dtype=tf.float64, shape=None)\n",
    "ndcg_summary = tf.summary.scalar('ndcg_at_k_validation', ndcg_var)\n",
    "ndcg_dist_summary = tf.summary.histogram('ndcg_at_k_hist_validation', ndcg_dist_var)\n",
    "merged_valid = tf.summary.merge([ndcg_summary, ndcg_dist_summary])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Set up logging and checkpoint directory"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 54,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "arch_str = \"I-%s-I\" % ('-'.join([str(d) for d in dae.dims[1:-1]]))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 55,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "log directory: /volmount/log/ml-20m/DAE/I-200-I\n"
     ]
    }
   ],
   "source": [
    "log_dir = '/volmount/log/ml-20m/DAE/{}'.format(arch_str)\n",
    "\n",
    "if os.path.exists(log_dir):\n",
    "    shutil.rmtree(log_dir)\n",
    "\n",
    "print(\"log directory: %s\" % log_dir)\n",
    "summary_writer = tf.summary.FileWriter(log_dir, graph=tf.get_default_graph())"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 56,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "chkpt directory: /volmount/chkpt/ml-20m/DAE/I-200-I\n"
     ]
    }
   ],
   "source": [
    "chkpt_dir = '/volmount/chkpt/ml-20m/DAE/{}'.format(arch_str)\n",
    "\n",
    "if not os.path.isdir(chkpt_dir):\n",
    "    os.makedirs(chkpt_dir)\n",
    "    \n",
    "print(\"chkpt directory: %s\" % chkpt_dir)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 57,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "n_epochs = 200"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "ndcgs_vad = []\n",
    "\n",
    "with tf.Session() as sess:\n",
    "\n",
    "    init = tf.global_variables_initializer()\n",
    "    sess.run(init)\n",
    "\n",
    "    best_ndcg = -np.inf\n",
    "    \n",
    "    for epoch in range(n_epochs):\n",
    "        np.random.shuffle(idxlist)\n",
    "        # train for one epoch\n",
    "        for bnum, st_idx in enumerate(range(0, N, batch_size)):\n",
    "            end_idx = min(st_idx + batch_size, N)\n",
    "            X = train_data[idxlist[st_idx:end_idx]]\n",
    "            \n",
    "            if sparse.isspmatrix(X):\n",
    "                X = X.toarray()\n",
    "            X = X.astype('float32')\n",
    "            \n",
    "            feed_dict = {dae.input_ph: X, \n",
    "                         dae.keep_prob_ph: 0.5}\n",
    "            sess.run(train_op_var, feed_dict=feed_dict)\n",
    "\n",
    "            if bnum % 100 == 0:\n",
    "                summary_train = sess.run(merged_var, feed_dict=feed_dict)\n",
    "                summary_writer.add_summary(summary_train, global_step=epoch * batches_per_epoch + bnum) \n",
    "                    \n",
    "        # compute validation NDCG\n",
    "        ndcg_dist = []\n",
    "        for bnum, st_idx in enumerate(range(0, N_vad, batch_size_vad)):\n",
    "            end_idx = min(st_idx + batch_size_vad, N_vad)\n",
    "            X = vad_data_tr[idxlist_vad[st_idx:end_idx]]\n",
    "\n",
    "            if sparse.isspmatrix(X):\n",
    "                X = X.toarray()\n",
    "            X = X.astype('float32')\n",
    "        \n",
    "            pred_val = sess.run(logits_var, feed_dict={dae.input_ph: X})\n",
    "            # exclude examples from training and validation (if any)\n",
    "            pred_val[X.nonzero()] = -np.inf\n",
    "            ndcg_dist.append(NDCG_binary_at_k_batch(pred_val, vad_data_te[idxlist_vad[st_idx:end_idx]]))\n",
    "        \n",
    "        ndcg_dist = np.concatenate(ndcg_dist)\n",
    "        ndcg_ = ndcg_dist.mean()\n",
    "        ndcgs_vad.append(ndcg_)\n",
    "        merged_valid_val = sess.run(merged_valid, feed_dict={ndcg_var: ndcg_, ndcg_dist_var: ndcg_dist})\n",
    "        summary_writer.add_summary(merged_valid_val, epoch)\n",
    "\n",
    "        # update the best model (if necessary)\n",
    "        if ndcg_ > best_ndcg:\n",
    "            saver.save(sess, '{}/model'.format(chkpt_dir))\n",
    "            best_ndcg = ndcg_"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 59,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAtoAAADQCAYAAAA56sZ8AAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xl8VPW5x/HPmS0zk5nsEzYBkSJocAPlqriBYrXLFa1Y\n3HJtqWIVxKoFzBVDW0XAq95ibZVarSJtY1P0equttra9V9sYN0Sl9aJYkTWZJJN9JrOd+8eEUSQM\nGWSYLN/368XLzJk55zzz5Dh55nee8zuGaZomIiIiIiJyUFmyHYCIiIiIyECkQltEREREJANUaIuI\niIiIZIAKbRERERGRDFChLSIiIiKSASq0RUREREQywJbtADLB72/L2r4LC90EAp1Z239/o3ylTzlL\nj/KVPuUsPcpX+pSz9Chf6TuUOfP5vPt8TiPaB5nNZs12CP2K8pU+5Sw9ylf6lLP0KF/pU87So3yl\nr6/kTIW2iIiIiEgGqNAWEREREckAFdoiIiIiIhmgQltEREREJAMyOuvIsmXL2LBhA4ZhUFFRwbHH\nHrvXa+655x7eeust1qxZA8DKlSt54403iEajzJ07l3PPPZfFixezceNGCgoKAJgzZw5nnXVWJkMX\nEREZdEzTxDTBYjEO6jYDbV047FZynTYMw0guD3bFsFkNHPZDd+FaJBqnoSVIU1sXsZhJ3DTBBFeO\nFbfTjsthJRKLE47EicTieF12Cjw55Dj2H6NpmoSjcbrCMUKRGBYDHHYrOXYrDpsl+d570toZZldj\nJ83tXQTaurDbLJwwzkehN+eT13SEqW8O4sqx4XHacDvt2G2fjJkGu6Js87fT0BIiL9dBcZ6TIm9O\nyvzGTZNAaxehSIxYLE7cNLFbLbhybDgdVuw2C1aLJa1jItqdv7hpYrMa2KwWItE4ze1dNLV10RmK\nYnzqtc3tYZrbu+gIRrDbLDjsViwWg85QhPZglGBXFKslsR2bzYLHaSPXZcfttBGOxAl2RYnFTL74\nLyMpyXf1Os5DIWOF9quvvsqWLVuoqqpi8+bNVFRUUFVVtcdrPvjgA1577TXsdjsAr7zyCu+//z5V\nVVUEAgEuvPBCzj33XABuuukmpk2blqlwRUR6xTRNOkJRXDlWrJZP/sBFonE+2tVKNGbicdnxuOzk\n5dr3eM2+theNxQmGY3R1/wuFYwDk5drJ9+Rgt1loD0ZobQ/T0hGmpaOLlo4woa4YebkOCjwOnDk2\nGpqD7GrqpLG1C4fNgsthw5ljJRZP7MOMw/ASN2NH5DPCl4vVYknuv6ElRF1TkPrmIPl5TojF8bjs\nRKJxWjq6aO0I43baGT3Ey2GluVgtBvWBIHWBIF3hGA67Facjsa+2zjCtHYlYd/8cN2FESS6HlXrI\ny3Xgbw5S3xSksTVEezBCezBCOBrrzpsDj8tOPG4SicaJx01cThsepx2P257Mb67LjsthI6e7GGhq\nCbGzqZO6pk6CXVEisTiRaOJ9lOQ7Kcl3YRjQEYrSGYpgmuDo/qMOEInFiUbjyfUi0TgWw8DltJHr\ntGGa0NQaoqmti7bOcPdrE7ENKXQxeoiXknwnLR1hmlpDtLSHCUVihCPdv9tILFGEdT8OR2JEYmZ3\nIeegwJtDgSfxLy/XTqCtix0NndQFOonHTew2C3abhXAkTkcoQkcwgsft4DBfLiN8HmwWg0B7F83t\nYTqCEULhGKFwFKvFQn5uIq+5Tjs2mwWb1UIsHqe1I0xbZ4SWjnD3z2HA4MiR+ZQdXsTIIR6aWruo\nDwRpaAkmf1ehcIyc7t+5w26lKxwj2BWlKxKjJN/JsOJcSvKdbK1v572PAzS3hwFw2C0Uep3E4ibN\nbSGiMRMAu81CrtOGw27FajGwWgwiMZOucJRQOEY8bmIYBhYLWAwj8bMBebkOxo0sYPzIAvJzHWxv\n6GBbfTv+lhCdoQidoSjh7t+j1WoQi8
VpauvCNNP/f9+Vs7tgThxvse7jM7r7eOn+775YLQYetx2v\ny4HXbe/+56AjGGHzjhb8zaG91ln7wibGjSxg7MgC3t7kZ3tDx16vcdgt5DrtWAyDxta9t2EAvkIX\nI30ehpW4icUTX3DagxHqmzrZ2dSZMu7kdgywWizYrEay6LVaDWwWC3HTJNx9fO8usLNh7GF5g6fQ\nrqmp4ZxzzgFg7NixtLS00N7ejsfjSb5m+fLlfOc73+FHP/oRACeddFJy1DsvL49gMEgsFstUiCLS\nS62dYeqaOhla5MbrdiSXR2Nx/M1BdjR0sKOxk/pAJ4ZhkGOz4nBYKPI6GVLowlfowmoYyT9ETocV\nr9uB02HFMAwi0TjBcBToLnxsVoLhKI0tIRpbQ3v8NxSJYbcmCo7d/7XZLMRiJo2tIZpaQwS7onjd\nDvJzHZQUuYnH4titFkwTGltD+JuDNLd3YRi7/2AYuHPs5LpsuHJshMKJP0LBrihOu5Vclx2nw0pz\nexe7moIEu6I4bBYOK/VwmM+DvznI5u0thD/zx8piGBTl5eArcJFjtxLqLhq6IrHuIihRcO3vj5Jh\ncECFQSoOW+KPZFc4/T+Ku0ei0g3pH1sCPS63GAa5LhsOm5W6piAf17WnueXssVktxONx/v7P3r7e\nSBRrdisupx2v1aC9M8L721r2mU+b1cBiMYhE4ph8ki93jo2m1hA7GjrgH/U9xuZ0WInG4mzzp86p\n3WYhz+1gZKmXSDTG3z8K8PePev59OeyJL3FtnYkveyaJY9SdY8Nms/Dex82893Fz8vV5bjuTj/QR\ni5s0tYYItHfhzLExstSD1+0gFjfpCEboCEXoiiSK6lgsMRLqdNjIczuwWg3i8cToa9w0icdN4ibU\nB4Js83fw5ze39/ie3Dk2HHYL8bhJVziOxWIwbkQ+pYVuivOd3UVj4stwZ1fiC1goHEuMqtqs2KwG\nbZ0RAu1dtLR3dX9ZitERimC1WrBbDZw5NrxuC3ZbovjMcVhxdo9ix00IRxP/z3eGorR1hmloCe71\n+8h12ph4RBEjSz0UeZ0UeBw0t4d57R91vL+1mU1bm3HYLJQdXsjIUi+hSCyZs45QlI5ghEgszlGj\nCxlZ6qG00EVbZyTxeRdI7O+NTX7Y9Jnfpc3C8OJchhW7ceXYsOz+ktP9mRzqihGJxYnF4kS7fy+x\nWDzx5T2e+DkcjWGxGLiddgq6v7jabRZy7FYMg+QXfavFQqHXQaHXSa4zUX6agM1iJL5genPIddkT\nX2C7v8C4nTY8LjuuHBumaRKJmkSiMTpCUdqDiS9SDrsl+bq+VmRDBgvthoYGysrKko+Liorw+/3J\nQnvdunVMmTKFESNGJF9jtVpxu90AVFdXc8YZZ2C1JkYannjiCR599FGKi4tZsmQJRUVF+9x3YaE7\nq/Mnppq4XPY20PJlmmbig7k1lCxgrBaDocW5yZGzWNzkg60B3tncSHNb4nRZZ1eEkgIXhw/NY+QQ\nL/5AkE1bA3y4vQWb1UJxvpPifFdiVMqVOGVomtAVjhIMJ4rPAm8Ohd4cWtrD/P2jRt77qInWjjD5\nuTnkeRyJkdHOxIhUMBQlHI0lR+0KvIkPurxcR3KkIhiK8o+PmthW/8kfhdIiN4eVevAHguxsaE+O\nSB0Ih82CCb0aTektV05i9HF7Qwcf7WqDzY17vcbpsFKc7wQSv4twJM6uQCdddZ98sd99mrutM8LH\n3e/fZrUwrCSXIUVumlpCbNnVyoc7WgE4fFgeE8cW43E5aO0eAW5sCbGrsWOPAtNht+LKseLKsZHv\nyUmennXmJIr83Y8BAm1dBFpDhMKx5O+2wOuksPtnl9NGS1uYprYQHcEIpYVuRvg8DCl2E4nG6QxF\nkqdc7TYrcdPkn9tbeG9LgA+2NYMJOQ4rOQ4rJfkuhvtyGVaSCyROUbd2hHHYEsdVvieHlvYuNm9v\n4Z
/bWzExGeHzMLwkl1yXPfHFoSuKsftY6v7DWeDJId+bQzxu8vGuVj7a2UpzexfDinMZXuJhaLGb\nXJd9j1Pqoa4obZ0RrFYDu82CxTDoCEZo7QzT1j3q2tYRprUzQqgrcVo5FI5SUuDisFIPI3wevLmO\n7kLJQktHF3VNndQ3dWIAHrcDjzuxz90jzbuPx2SBZbPgsCdGLTuDiULGxKSkwEVJgYtCbw42a6IV\nIByJsWVXK5u3tdDQEqQ4z0lJgYuiPCcup6175Dfxe7Vaez7DEY3FaW7roqn7C2VzexdF3hxGDvUy\npChxBiFx9iFRgH66/aI+EGTLrsRxWJTnpDjPicft2KOloCsSS37WRKKx5ChvYff/964c2x6/g0Bb\niA2b/GzztzOk0M2wklyGFueSl+vYow1hd6vEp9siQuEo2+vbqWvqZOQQL4eVelK2THwekWiczdua\neWdzA+2dEUYPy2PM8DyGleTidPTde/KFI7HEWZ/2Llw5NoaV5PaYo9nnHUVjS5DGlhBjhufv8TtN\nh2kmvuRs97djt1pxuxKFaaHXeVDbhPqavlBfGKaZmfH9JUuWcOaZZyZHtS+99FKWLVvGmDFjaG5u\nZt68eTz66KPU1dVx6623Jnu0Af74xz/y0EMP8cgjj+D1eqmpqaGgoICjjjqK1atXs2vXLm6//fZ9\n7jubd4b0+bxZ3X9/83nyFTdNLGl8eIcjMUwT7PbEH27TTBRYnV1RdjZ2sHl7C5t3tNLSESbHZsHh\nsOLOsZGfm0OBx4HVakmOrCZGNhLf5KOxODZLYlQ18WHWlfzD/WkWw2BYcWIU5cMdrbQHIwf0vtOV\n40ic1v00A3A4Ej2DNmvitF9bR6THkc0ch5UvjMhnREkuOxo72LKrjbbOCK4ca6JYKs5lWImbYcW5\nDC1yYxgQjsQJdY9I1weC+JuDmJAchQ6Go8nT1RbDwJ2TKDQN6D71mDgtXZyfKBo+/V9Xjo3o7lP8\nnzpdaxgGxXmfFAy7+z8dLgd19a3J0ebiPCdet73HP2qRaIxgVyx5Ony3aCxxnHic9j3+KEWicXY1\ndZLvcZD3qZH+z9o9SpfT3XfY1+lzLD3KV/qUs/QoX+k7lDlLVdBn7OteaWkpDQ0Nycf19fX4fD4g\n0Yvd1NTE5ZdfTjgc5uOPP2bZsmVUVFTw0ksv8eCDD/Lwww/j9SYCP+WUU5LbmT59OkuXLs1U2JIB\nuwtaAAyIx02CXVE6u6LUt4XZUddKMJR4vPuUejQaJy/XQVFeDl6Xneb2MP6WRMHmb+4+9d/WRVGe\nkzHDvBw+LI8cu5VINFH8tgcjtCT7WcO0dnQR7Pqk2Eyc7jV7LCwdNsteLQCfZbXsPv2buEikq/tU\nIkBpoYuSfCcFnpxkURWJxhK9g/4Otjd0UOjN4YzjhjNxTBElBU7cOYlRL39LiG3+dnY2dFLozWHM\nMC+jh3oxSPReBlpDWB12dvnbEheTGN0jkt3vffcopDMnURyPHZFPrtNOJBqjrTNCNBZP9rV+tuCL\nm2ai/7Izkjh1GzexWgyGlbj36DNOFLDRvUbAejLusJRPH7CcXlw4ZRgGbqcNX0kuNrN3I+Z2mxV7\nD2fDbFZLj4W03WZhZKlnr+UHEq+IiAw8GSu0p06dyv3338/s2bPZuHEjpaWlybaR8847j/POOw+A\nbdu2ceutt1JRUUFbWxsrV67k5z//eXKGEYD58+ezcOFCRo4cSW1tLePGjctU2PI5xU0Tf3Oix3LL\nrja21LWxZVfbQR29NYCivByOGJFHXVOQ1//Pz+v/59/na71uO8V5LvJz7VitlsSFSZE4VovRfare\nSnGeky+MyOeIEfnk5zqImyaR7guOWjrCtLSHicbi3e0bTryunkdE9ydumrR3RvY5oprvyeELI/J7\nXHdokZuhRe4D+pZut1kpyktd7FkMgzx36pFZ2F3A2tPav4iIyGCU
sUJ70qRJlJWVMXv2bAzDoLKy\nknXr1uH1epkxY0aP6zz33HMEAgFuvPHG5LIVK1Zw+eWXc+ONN+JyuXC73dx1112ZCntQi3f3+tUH\ngok+wdYQXZEYZvcFKLsv5ugMRQhH490XpCRGOBMXqLDXyDFASb6Tw4d6E6fzMbtbBWy4nDZKCt0Q\ni+NyJi7scToSvYxWi0FrR5imtkSva4HHga/Aha+773F3n5ppJi6A+7iunXjcxNbdmuBx2cn3JK7s\n3t+sDz2xGEayd7Uoz3lQ8rt7u3m5qQtZERERGRgy1qOdTerRTs3sHnXe2dhJXSBIXaCTrfXtbK1r\n77G3+LMMEqfMDYuBpXuKJYslcXFOrtPG6KFeRpUmWh5GDfGQm2L0sz/kq69RztKjfKVPOUuP8pU+\n5Sw9ylf6BnyPtvQtu5o6eWdzI5u2NvP+tmZaO/ds5bAYiV7cUaVehha7Kc7LocjrxN19c4Hdk+7n\nOm04c2xpXYQoIiIiMhip0B5gOkMR/rElQGtHmI5QlOb2Ljb+s4m6QDD5mkJvDlOOKuUwn4chRW5K\nC1wMK3Yf0jtziYiIiAx0KrQHgGgszoYPGnll4y42bG4kGttzhoUcu5VJR/o4dmwxR48upDjfmbE5\nTUVEREQkQYV2P9YVifHy2zv5fe3HyduuDi/JZcqEUkoLXbididsUjyz1HPAk9yIiIiJyYFRo91M1\nG3fxyz++T3swgt1mYdqkEZx53HBGZvAOXCIiIiLSeyq0+5muSIy1f9jEy2/vJMdh5SunHs45kw/T\nlHEiIiIifYwK7X6kPtDJqt+8w46GDkYP8XLtzDKGFLqzHZaIiIiI9ECFdj8R7Iryw+q32dnYydmT\nD+OSaV9Q37WIiIhIH6ZCux8wTZNHf/ceOxs7Ofekkcw+W7egFxEREenrNCTaD7zw2lZef6+eIw/L\n5+KzxmY7HBERERHpBRXafdymrc38+s+byc91cO3Midis+pWJiIiI9AcpW0cCgQD33Xcf//M//0ND\nQwOGYVBaWsr06dNZsGABXu++7+0un184EuORZ/+Bicm3Z06kwJOT7ZBEREREpJdSDo8uWrSII444\ngieffJK3336bt956iyeeeIKSkhIWLVp0qGIctH5bs4X65iAzThzJkSMLsh2OiIiIiKQhZaEdDAa5\n6qqrGDJkCFarFZvNxvDhw7n22mtpaWk5VDEOSjsaOvjdK1soysth5uljsh2OiIiIiKQpZaEdiUR4\n991391q+fv164vF4xoIa7OKmyeO/f49Y3OTyGUfidGhyGBEREZH+JmUFd+utt7Jw4UK6urrw+XwA\n1NXVkZ+fz/Lly/e78WXLlrFhwwYMw6CiooJjjz12r9fcc889vPXWW6xZs2af6+zcuZOFCxcSi8Xw\n+XzcfffdOBwD906If31nJ5u2tXDCuBJOGOfLdjgiIiIicgBSFtrHHXcczz33HNu3b6e+vh7DMBg6\ndChDhw7d74ZfffVVtmzZQlVVFZs3b6aiooKqqqo9XvPBBx/w2muvYbfbU66zatUqLrvsMs4//3zu\nvfdeqqurueyyyz7H2+67gl1RfvOXzTjsFi6fcWS2wxERERGRA5SydSQcDrN69WoWLlyY/Pfd736X\nn//850QikZQbrqmp4ZxzzgFg7NixtLS00N7evsdrli9fzne+8539rlNbW8vZZ58NwLRp06ipqUn/\nnfYTv6vdQmtnhPP/ZTRFec5shyMiIiIiByjliPaiRYsoLi7mpptuwufzYZom9fX1PPPMMyxZsiRl\n+0hDQwNlZWXJx0VFRfj9fjweDwDr1q1jypQpjBgxYr/rBIPBZKtIcXExfr8/5ZsqLHRjs1lTviaT\nfL4Dm/awoTnIC69upSjPyRVfOhpnzuDozT7QfA1myll6lK/0KWfpUb7Sp5ylR/lKX1/IWcpKzu/3\nc9999+2xbPTo0Zx00klpt26Y
ppn8ubm5mXXr1vHoo49SV1fXq3VSLfusQKAzrdgOJp/Pi9/fdkDr\nPvzbvxOOxrngtMNpaw1yYFvpXz5PvgYr5Sw9ylf6lLP0KF/pU87So3yl71DmLFVBn7LQDofD7Nq1\na6+e7K1btxKNRlPutLS0lIaGhuTj+vr65AWVr7zyCk1NTVx++eWEw2E+/vhjli1bts913G43oVAI\np9NJXV0dpaWlKffdH23Z1cbf3t3FyFIPUycOy3Y4IiIiIvI5pSy0v/3tbzNr1izGjBmzx6wj27dv\n584770y54alTp3L//fcze/ZsNm7cSGlpabJt5LzzzuO8884DYNu2bdx6661UVFTw5ptv9rjOqaee\nyvPPP88FF1zACy+8wOmnn34w3nuf8vRLHwLw9elfwGIxshyNiIiIiHxeKQvtadOm8eKLL/LWW29R\nX18PwNChQznuuOOSM4Xsy6RJkygrK2P27NkYhkFlZSXr1q3D6/UyY8aMXq8DMH/+fBYtWkRVVRXD\nhw9n5syZB/Je+6xAWxdvf9jI4UO9HH14UbbDEREREZGDwDB70/TcgxUrVvTZ27Bns4/pQHqCnq35\niN/8z4dc+cXxTDthxH5fP5Co7yx9yll6lK/0KWfpUb7Sp5ylR/lKX1/p0U45vV8qGzduPNBV5VNM\n0+Tld3Zht1n4l6MGXu+5iIiIyGCVsnXkzDPPxDD27hc2TZNAIJCxoAaTD7a3UNfUyclHD8HtTN2O\nIyIiIiL9R8pCe/LkyZx44omceeaZeyw3TZNbbrklo4ENFi+9vROA047VTCMiIiIiA0nK1pEf/OAH\n1NbWUlBQwIgRI5L/DjvssP1eDCn7FwpHee0f9RTnOZkwujDb4YiIiIjIQZRyRDs3N5cf/vCHPT73\nyCOPZCSgweT19/x0RWJ8ccpILD206IiIiIhI/9XriyHb2tpoaWlJPtaI9udXs3EXAKcdo7YRERER\nkYEm5Yg2wP/+7//ywAMP4HK5KCwspKmpiaOPPpqbbrpJxfbnEI3F+WB7C4f5cikpcGU7HBERERE5\nyFIW2rW1taxevZo77riDcePGJZe/8sor3HXXXZx77rmceOKJ2Gz7rdflMz7a2UYkGufIkQXZDkVE\nREREMiBlhfzTn/6UlStXcsstt9DY2EhZWRkTJkygqKiIrVu34vf7+e///m8uvPDCQxXvgLFpWzOA\nCm0RERGRASplj3ZXVxfDhw/nC1/4AnPnzuXrX/86wWCQhx9+mAULFjBt2jT+/Oc/H6pYB5RNWxOF\n9rjDVGiLiIiIDEQpC22r1QrAP//5Ty666CL+5V/+hQULFvDDH/6Q3//+93g8Hpqbmw9JoANJPG7y\n/rYWSgtcFHpzsh2OiIiIiGRAykLbbrcTCoVwu9288soryeXjx4/n448/BhI3r5H0bPO3E+yKqm1E\nREREZABL2aM9c+ZMHnvsMZYvX87ixYu55557GD16NNu2bePLX/4y7733HkOHDj1UsQ4YybaRkflZ\njkREREREMiVlof3lL3+ZRYsW8dRTT/Hggw/S2NjIjh07GDZsGBaLhXnz5lFZWXmoYh0wNm1LzEc+\nXiPaIiIiIgPWfuflW7FiBWvWrOGKK67g8MMPp6SkhG3btvHRRx9xyy23MGHChEMR54BhmiabtjaT\n73Hg0/zZIiIiIgNWrybAvvLKK7nyyivZvn07fr+f/Px8xowZs9/1li1bxoYNGzAMg4qKCo499tjk\nc08++STV1dVYLBYmTJhAZWUl1dXVPPPMM8nXvPvuu6xfv54rr7ySzs5O3G43AIsWLWLixInpvtc+\noT4QpLUjzJSjSjF023URERGRASutO80MHz6cYcOG9apAfPXVV9myZQtVVVVs3ryZiooKqqqqAAgG\ngzz77LOsXbsWu91OeXk569evZ9asWcyaNSu5/u9+97vk9u666y6OPPLIdMLtk/5P0/qJiIiIDA
op\nZx0JBoPccccdycdnn302Rx99NMcffzybNm1KueGamhrOOeccAMaOHUtLSwvt7e0AuFwuHnvsMex2\nO8FgkPb2dnw+3x7rP/DAA1x33XUH9Kb6sve7C231Z4uIiIgMbClHtP/jP/6D5uZmYrEYVquVESNG\n8Kc//Yk///nP/OQnP+G+++7b57oNDQ2UlZUlHxcVFeH3+/F4PMllq1ev5vHHH6e8vJyRI0cml7/9\n9tsMGzZsj+J71apVBAIBxo4dS0VFBU6nc5/7Lix0Y7NZU7/zDPL5vPt87sNdbeS67Bx31FAsFrWO\nQOp8Sc+Us/QoX+lTztKjfKVPOUuP8pW+vpCzlIX266+/zm9+85vkjWt2mzZtGj/+8Y/T2lFP821f\nc801lJeXc/XVVzN58mQmT54MQHV19R63dS8vL2f8+PGMGjWKyspK1q5dy5w5c/a5r0CgM63YDiaf\nz4vf39bjc+3BCDsbOigbU0RjY/shjqxvSpUv6Zlylh7lK33KWXqUr/QpZ+lRvtJ3KHOWqqBP2Tri\n8Xiw2T6pxb/73e8mf3Y4HCl3WlpaSkNDQ/JxfX19coS6ubmZ1157DQCn08kZZ5zBm2++mXxtbW0t\nJ5xwQvLxjBkzGDVqFADTp0/fb9tKX/XPna0AHDEsL8uRiIiIiEimpSy0Ozs7iUajyce7Zw0JhUIE\ng8GUG546dSrPP/88ABs3bqS0tDTZNhKNRlm8eDEdHR0AvPPOO8lZTOrq6sjNzU0W8qZpctVVV9Ha\nmihSa2trGTduXNpvtC/4cEfiPYwZrkJbREREZKBL2Toybdo0lixZwm233UZubi4AgUCA22+/nUsu\nuSTlhidNmkRZWRmzZ8/GMAwqKytZt24dXq+XGTNmcP3111NeXo7NZmP8+PGcffbZAPj9foqKipLb\nMQyDSy65hKuuugqXy8WQIUOYP3/+533fWaERbREREZHBwzB7ap7uFo1Gueeee/jNb37D8OHDiUaj\n+P1+vvnNbzJ37txDGWdastnHtK+eINM0WbDqZZwOKyu/fWoWIuub1HeWPuUsPcpX+pSz9Chf6VPO\n0qN8pa+v9GinHNG22WwsWrSIG264gS1btmC1Whk9evR++7Nlb/6WEO3BCEcfXpjtUERERETkEEjZ\nox2Px/nxj3+Mw+FgwoQJjBs3jq1bt/KTn/zkUMU3YHy4owWAMWobERERERkUUhbaDzzwABs3biQc\nDieXDRkyhPfee4/HH38848ENJP/ckTh9cYQuhBQREREZFFIW2n/+85+57777cLlcyWUej4cVK1bw\n3HPPZTz3Aq3/AAAf/0lEQVS4geTDnS1YDINRQ7I/ebqIiIiIZF7KQtvpdPbYj+10OrFYUq4qnxKN\nxdmyq53DSnPJsWfvjpUiIiIicujsdx7tzs6977LY0tKSnANb9m+bv51oLK5p/UREREQGkZSF9gUX\nXMC8efP46KOPksvee+89rr32Wr7xjW9kOrYBQzeqERERERl8Uk7v941vfAOHw8G//du/0dbWhmma\nFBcXM3fuXGbOnHmoYuz3/rlDN6oRERERGWxSFtoAl19+OZdffjnt7e0YhpG8Q6T03j93tZHjsDKs\nWLkTERERGSz2W2i/++67/OxnP2PTpk1YLBYmTpzIN7/5TcaNG3co4uv34nGTuqZORg3xYLEY2Q5H\nRERERA6RlD3ar7/+OvPmzePUU0/lP//zP/ne977HEUccwZw5c3jjjTcOVYz9WqCti1jcxFfg2v+L\nRURERGTASDmi/dBDD/GjH/2IiRMnJpdNmjSJk08+mRUrVvDEE09kPMD+rr45CEBpoQptERERkcEk\n5Yh2MBjco8je7Zhjjulx2j/Zm7+70NaItoiIiMjgkrLQTnVTGo/Hc9CDGYjqA90j2iq0RURERAaV\nlK0j9fX1VFdX9/ic3+/PSEADjUa0RURERAanlIX2CSecsM
+LHo8//vj9bnzZsmVs2LABwzCoqKjg\n2GOPTT735JNPUl1djcViYcKECVRWVvLqq6+yYMGC5IwmRx55JEuWLGHnzp0sXLiQWCyGz+fj7rvv\n7vHW8H1RfXMQm9VCgTcn26GIiIiIyCGUstC+6667DnjDr776Klu2bKGqqorNmzdTUVFBVVUVkOj9\nfvbZZ1m7di12u53y8nLWr18PwJQpU1i1atUe21q1ahWXXXYZ559/Pvfeey/V1dVcdtllBxzboeQP\nBPEVOLEYmtpPREREZDBJWWjfeuut+3zOMAyWLVu2z+dramo455xzABg7diwtLS20t7fj8XhwuVw8\n9thjQKLobm9vx+fzsWPHjh63VVtby/e+9z0Apk2bxiOPPNIvCu32YITOrijjDsvPdigiIiIicoil\nLLQvvPDCvZZ1dnby4IMPEggEUm64oaGBsrKy5OOioiL8fv8eF1GuXr2axx9/nPLyckaOHMmOHTv4\n4IMPuPbaa2lpaWHevHlMnTqVYDCYbBUpLi7eb394YaEbm82a8jWZ5PN5AWjemsjRqOH5yWWyN+Um\nfcpZepSv9Cln6VG+0qecpUf5Sl9fyFnKQnvKlCl7PP7tb3/L/fffz0UXXcQ3vvGNtHZkmuZey665\n5hrKy8u5+uqrmTx5Mocffjjz5s3j/PPPZ+vWrZSXl/PCCy/sdzufFQhkb+pBn8+L398GwKZ/NgLg\nybEml8mePp0v6R3lLD3KV/qUs/QoX+lTztKjfKXvUOYsVUG/31uwA2zatIkf/OAHlJSU8NhjjzF0\n6ND9rlNaWkpDQ0PycX19PT6fD4Dm5mbef/99TjrpJJxOJ2eccQZvvvkmkydP5ktf+hIAo0aNoqSk\nhLq6OtxuN6FQCKfTSV1dHaWlpb0JO+s0tZ+IiIjI4JVyHu329nbuuOMOFi5cyLx587jvvvt6VWQD\nTJ06leeffx6AjRs3UlpammwbiUajLF68mI6ODgDeeecdxowZwzPPPMPPfvYzIDF9YGNjI0OGDOHU\nU09NbuuFF17g9NNPP7B3e4j5dVdIERERkUEr5Yj2ueeey9ChQ7niiivYuXMnTz/99B7Pz5w5c5/r\nTpo0ibKyMmbPno1hGFRWVrJu3Tq8Xi8zZszg+uuvp7y8HJvNxvjx4zn77LPp6Ojglltu4cUXXyQS\nibB06VIcDgfz589n0aJFVFVVMXz48JT77Uv8zUEMoCTfme1QREREROQQS1loX3rppRiGwa5duw5o\n47fccssejydMmJD8+aKLLuKiiy7a43mPx8ODDz6413ZKS0t59NFHDyiGbKpvDlLgzcGexQszRURE\nRCQ7Uhba8+fPP1RxDDiRaJxAaxdHjizIdigiIiIikgUpe7TlwDW0BDEBn/qzRURERAYlFdoZsvtC\nSJ9mHBEREREZlFRoZ4im9hMREREZ3Ho1j/Zvf/tbfvrTn9La2oppmpimiWEY/OUvf8lweP1Xvab2\nExERERnUelVo33///dxxxx0MHz480/EMGA3NIUCtIyIiIiKDVa8K7dGjR3PSSSdlOpYBpb45iDvH\nhsdlz3YoIiIiIpIFvSq0TzjhBO69916mTJmC1frJnNCnnHJKxgLr7wJtIYrzNJotIiIiMlj1qtD+\n29/+BsD69euTywzDUKG9D6ZpEuqK4crRjWpEREREBqteFdpr1qzJdBwDSjgSxwScjl6lV0REREQG\noF5N77d582bKy8uZNGkSkydPZs6cOXz88ceZjq3fCoWjADgdGtEWERERGax6VWj/4Ac/4Jvf/CYv\nv/wy//u//8vs2bOprKzMdGz9VigcA1Roi4iIiAxmvSq0TdPkrLPOwu12k5uby4wZM4jFYpmOrd/6\npNBW64iIiIjIYNWrQjsSibBx48bk47fffluFdgpqHRERERGRXg25Llq0iJtvvpmmpiZM06S0tJTl\ny5dnOrZ+S60jIiIiIt
KrQvu4447j97//PW1tbRiGgcfj6dXGly1bxoYNGzAMg4qKCo499tjkc08+\n+STV1dVYLBYmTJhAZWUlhmGwcuVK3njjDaLRKHPnzuXcc89l8eLFbNy4kYKCAgDmzJnDWWedlf67\nPURUaIuIiIhIykL7oYceYu7cuXz3u9/FMIy9nl+5cuU+13311VfZsmULVVVVbN68mYqKCqqqqgAI\nBoM8++yzrF27FrvdTnl5OevXryccDvP+++9TVVVFIBDgwgsv5NxzzwXgpptuYtq0aZ/nvR4yn7SO\nqEdbREREZLBKWQkeffTRAJx66ql7PddT4f1pNTU1nHPOOQCMHTuWlpYW2tvb8Xg8uFwuHnvsMSBR\ndLe3t+Pz+Rg+fHhy1DsvL49gMNgve8E1oi0iIiIiKQvt008/HUjMo33LLbfs8dy///u/M3PmzH2u\n29DQQFlZWfJxUVERfr9/j7aT1atX8/jjj1NeXs7IkSMBcLvdAFRXV3PGGWckb/n+xBNP8Oijj1Jc\nXMySJUsoKira574LC93YbNkrcq3d+x5S6sXn82Ytjv5COUqfcpYe5St9yll6lK/0KWfpUb7S1xdy\nlrLQ/sMf/sALL7xATU0N9fX1yeXRaJTXXnstrR2ZprnXsmuuuYby8nKuvvpqJk+ezOTJkwH44x//\nSHV1NY888ggAF1xwAQUFBRx11FGsXr2aH/3oR9x+++373Fcg0JlWbAeTz+elqTkIQKgzjN/flrVY\n+gOfz6scpUk5S4/ylT7lLD3KV/qUs/QoX+k7lDlLVdDvd0S7qKiId999l1NOOSW53DAM5s2bl3Kn\npaWlNDQ0JB/X19fj8/kAaG5u5v333+ekk07C6XRyxhln8OabbzJ58mReeuklHnzwQR5++GG83kTg\nn9739OnTWbp0acp9Z5um9xMRERGRlPNoO51OJk+ezNNPP82FF16Y/Ddz5kx+/etfp9zw1KlTef75\n5wHYuHEjpaWlybaRaDTK4sWL6ejoAOCdd95hzJgxtLW1sXLlSh566KHkDCMA8+fPZ+vWrQDU1tYy\nbty4A3/Hh4B6tEVERESkV9NivP7669x77700NzcDEA6HKSgoYNGiRftcZ9KkSZSVlTF79mwMw6Cy\nspJ169bh9XqZMWMG119/PeXl5dhsNsaPH8/ZZ5/Nk08+SSAQ4MYbb0xuZ8WKFVx++eXceOONuFwu\n3G43d9111+d825mlO0OKiIiIiGH21Dz9GbNmzeLf//3fWbZsGXfeeSfPPfccJ554IlOnTj0UMaYt\nm31MPp+X7/7wf3jv42YeXjgNiyX17CyDnfrO0qecpUf5Sp9ylh7lK33KWXqUr/T1lR7tXt2C3ePx\ncPzxx2O32xk3bhwLFizg0UcfPWgBDjTBcAyHzaIiW0RERGQQ61VvQzQa5fXXXycvL4+nnnqKsWPH\nsm3btkzH1m91hWPqzxYREREZ5HpVaH/ve9+joaGBhQsX8oMf/ICGhgauvfbaTMfWb4XCUfVni4iI\niAxyvaoGjzjiCI444giA5NzWsm+hcIw8tyPbYYiIiIhIFqUstKdPn57yVusvvvjiQQ+ovzNNU60j\nIiIiIpK60P75z38OQFVVFT6fj5NPPplYLMZf//pXOjuzd/fFviwUjmECOWodERERERnUUlaDo0aN\nAuDvf//7HrOMlJWVMXfu3MxG1k8Fu3RXSBERERHp5fR+jY2NvPzyy3R2dhIKhaipqWHHjh2Zjq1f\nCqnQFhERERF6eTHk0qVLWblyJZs2bcI0TcaNG8eSJUsyHVu/1JkstNU6IiIiIjKY9aoanDRpEr/6\n1a8yHcuAoNYREREREYH9FNp33HEHt912G5dddlmPs4+sXbs2Y4H1Vyq0RURERAT2U2hffPHFANx4\n442HJJiBIBhSoS0iIiIi+ym0A4EANTU1hyqWASGoHm0RERERYT+F9o9//ON9PmcYBqec
cspBD6i/\nC4U1oi0iIiIi+ym016xZs8/nnn/++YMezECg1hERERERgV7OOrJjxw6eeOIJAoEAAOFwmNraWr74\nxS+mXG/ZsmVs2LABwzCoqKjg2GOPTT735JNPUl1djcViYcKECVRWVmIYRo/r7Ny5k4ULFxKLxfD5\nfNx99904HI7P8bYzJzm9X45aR0REREQGs17dsGbhwoUUFBTw1ltvMXHiRAKBACtXrky5zquvvsqW\nLVuoqqrizjvv5M4770w+FwwGefbZZ1m7di2/+tWv+PDDD1m/fv0+11m1ahWXXXYZv/jFLxg9ejTV\n1dWf4y1n1u4e7Ry7RrRFREREBrNeFdpWq5VrrrmGkpISLr/8cn7yk5/sd2q/mpoazjnnHADGjh1L\nS0sL7e3tALhcLh577DHsdjvBYJD29nZ8Pt8+16mtreXss88GYNq0aX36Ak1N7yciIiIi0MvWka6u\nLnbt2oVhGGzdupXhw4ezffv2lOs0NDRQVlaWfFxUVITf78fj8SSXrV69mscff5zy8nJGjhy5z3WC\nwWCyVaS4uBi/359y34WFbmy27BS6oa4YAIcNLyDXZc9KDP2Nz+fNdgj9jnKWHuUrfcpZepSv9Cln\n6VG+0tcXctarQvtb3/oWNTU1zJkzhwsuuACr1cpXvvKVtHZkmuZey6655hrKy8u5+uqrmTx5cq/W\n6WnZZwUCnWnFdjDtHtFubw3S2R7KWhz9hc/nxe9vy3YY/Ypylh7lK33KWXqUr/QpZ+lRvtJ3KHOW\nqqBPWWjX1dUxZMiQZDsHJHqvOzo6yM/PT7nT0tJSGhoako/r6+vx+XwANDc38/7773PSSSfhdDo5\n44wzePPNN/e5jtvtJhQK4XQ6qauro7S0NPU7zqJgVwSH3YLFsvedNEVERERk8EjZo/3Vr36Va665\nhhdeeIFoNDFSa7PZ9ltkA0ydOjU5BeDGjRspLS1Nto1Eo1EWL15MR0cHAO+88w5jxozZ5zqnnnpq\ncvkLL7zA6aeffoBvN/OCXVGcuhBSREREZNBLOaL90ksv8Yc//IEnn3yS73//+3z1q1/l4osvZuzY\nsfvd8KRJkygrK2P27NkYhkFlZSXr1q3D6/UyY8YMrr/+esrLy7HZbIwfP56zzz4bwzD2Wgdg/vz5\nLFq0iKqqKoYPH87MmTMPzrvPgGBXVHeFFBEREREMszdNzyTaOP77v/+b//qv/8LtdnPxxRdz8cUX\nZzq+A5LNPqbr7/sffPkuln5zStZi6E/Ud5Y+5Sw9ylf6lLP0KF/pU87So3ylr6/0aPdqej9I9FzP\nmTOH++67jxEjRvD973//oAQ3kMRNk1A4pqn9RERERKR3s460tLTw29/+lqeeeopwOMzFF1/Mbbfd\nlunY+p1wJIZp6q6QIiIiIrKfQvtPf/oTTz31FG+88QYzZszg9ttv3+M26rKnUDgxh7buCikiIiIi\nKQvtRx55hIsvvpi7774bp9N5qGLqt3YX2modEREREZGUhfYTTzxxqOIYEELh3bdfV+uIiIiIyGDX\n64shZf92335dI9oiIiIiokL7IApFugvtHBXaIiIiIoOdCu2DKNk6ooshRURERAY9FdoH0ScXQ6pH\nW0RERGSwU6F9EKlHW0RERER2U6F9EH0y64gKbREREZHBToX2QdSVvBhSrSMiIiIig50K7YNIN6wR\nERERkd1UaB9EugW7iIiIiOymQvsgCnXpzpAiIiIikpDRinDZsmVs2LABwzCoqKjg2GOPTT73yiuv\ncO+992KxWBgzZgx33nknv/nNb3jmmWeSr3n33XdZv349V155JZ2dnbjdbgAWLVrExIkTMxn6AVHr\niIiIiIjslrFC+9VXX2XLli1UVVWxefNmKioqqKqqSj5/++238/jjjzN06FBuuOEGXnrpJWbNmsWs\nWbOS6//ud79Lvv6uu+7iyCOPzFS4B0UoHCPHYcVi
MbIdioiIiIhkWcZaR2pqajjnnHMAGDt2LC0t\nLbS3tyefX7duHUOHDgWgqKiIQCCwx/oPPPAA1113XabCy4hQJIZLM46IiIiICBkc0W5oaKCsrCz5\nuKioCL/fj8fjAUj+t76+nr/+9a8sWLAg+dq3336bYcOG4fP5kstWrVpFIBBg7NixVFRU4HQ697nv\nwkI3Ntuhb9+IRGO4HDZ8Pu8h33d/pnylTzlLj/KVPuUsPcpX+pSz9Chf6esLOTtkw6+mae61rLGx\nkWuvvZbKykoKCwuTy6urq7nwwguTj8vLyxk/fjyjRo2isrKStWvXMmfOnH3uKxDoPLjB91JHKEqB\nx4nf35aV/fdHPp9X+UqTcpYe5St9yll6lK/0KWfpUb7Sdyhzlqqgz1jrSGlpKQ0NDcnH9fX1e4xQ\nt7e3c/XVV3PjjTdy2mmn7bFubW0tJ5xwQvLxjBkzGDVqFADTp09n06ZNmQr7c8mxWfAVurIdhoiI\niIj0ARkrtKdOncrzzz8PwMaNGyktLU22iwAsX76cf/u3f+OMM87YY726ujpyc3NxOBxAYiT8qquu\norW1FUgU4ePGjctU2J/L7VedxHcunZTtMERERESkD8hY68ikSZMoKytj9uzZGIZBZWUl69atw+v1\nctppp/H000+zZcsWqqurAfjKV77C17/+dfx+P0VFRcntGIbBJZdcwlVXXYXL5WLIkCHMnz8/U2F/\nLkV5TnJddjrbQ9kORURERESyzDB7ap7u57LZx6Q+qvQoX+lTztKjfKVPOUuP8pU+5Sw9ylf6BnyP\ntoiIiIjIYKZCW0REREQkA1Roi4iIiIhkgAptEREREZEMGJAXQ4qIiIiIZJtGtEVEREREMkCFtoiI\niIhIBqjQFhERERHJABXaIiIiIiIZoEJbRERERCQDVGiLiIiIiGSALdsBDBTLli1jw4YNGIZBRUUF\nxx57bLZD6pNWrlzJG2+8QTQaZe7cufzpT39i48aNFBQUADBnzhzOOuus7AbZh9TW1rJgwQLGjRsH\nwJFHHsm3vvUtFi5cSCwWw+fzcffdd+NwOLIcad/w61//mmeeeSb5+N1332XixIl0dnbidrsBWLRo\nERMnTsxWiH3Gpk2buO6667jqqqu44oor2LlzZ4/H1TPPPMNjjz2GxWLhkksuYdasWdkOPWt6ytmt\nt95KNBrFZrNx99134/P5KCsrY9KkScn1fv7zn2O1WrMYeXZ8Nl+LFy/u8fNex9gnPpuzG264gUAg\nAEBzczPHH388c+fO5atf/Wryc6ywsJBVq1ZlM+ys+WxNccwxx/S9zzFTPrfa2lrzmmuuMU3TND/4\n4APzkksuyXJEfVNNTY35rW99yzRN02xqajLPPPNMc9GiReaf/vSnLEfWd73yyivm/Pnz91i2ePFi\n87nnnjNN0zTvuecec+3atdkIrc+rra01ly5dal5xxRXm//3f/2U7nD6lo6PDvOKKK8zbbrvNXLNm\njWmaPR9XHR0d5rnnnmu2traawWDQ/PKXv2wGAoFshp41PeVs4cKF5rPPPmuapmk+8cQT5ooVK0zT\nNM0pU6ZkLc6+oqd89fR5r2PsEz3l7NMWL15sbtiwwdy6dat54YUXZiHCvqWnmqIvfo6pdeQgqKmp\n4ZxzzgFg7NixtLS00N7enuWo+p6TTjqJH/7whwDk5eURDAaJxWJZjqr/qa2t5eyzzwZg2rRp1NTU\nZDmivumBBx7guuuuy3YYfZLD4eCnP/0ppaWlyWU9HVcbNmzgmGOOwev14nQ6mTRpEm+++Wa2ws6q\nnnJWWVnJF7/4RSAxqtjc3Jyt8PqcnvLVEx1jn0iVsw8//JC2tjadLf+UnmqKvvg5pkL7IGhoaKCw\nsDD5uKioCL/fn8WI+iar1Zo8fV9dXc0ZZ5yB1WrliSeeoLy8nO985zs0NTVlOcq+54MPPuDaa6/l\n0ksv5a9//SvB
YDDZKlJcXKxjrQdvv/02w4YNw+fzAbBq1Souv/xybr/9dkKhUJajyz6bzYbT6dxj\nWU/HVUNDA0VFRcnXDObPtp5y5na7sVqtxGIxfvGLX/DVr34VgHA4zM0338zs2bN59NFHsxFu1vWU\nL2Cvz3sdY5/YV84AHn/8ca644ork44aGBm644QZmz569R7vcYNJTTdEXP8fUo50Bpu5qn9If//hH\nqqureeSRR3j33XcpKCjgqKOOYvXq1fzoRz/i9ttvz3aIfcbhhx/OvHnzOP/889m6dSvl5eV7nAXQ\nsdaz6upqLrzwQgDKy8sZP348o0aNorKykrVr1zJnzpwsR9i37eu40vG2t1gsxsKFCzn55JM55ZRT\nAFi4cCH/+q//imEYXHHFFZx44okcc8wxWY40+y644IK9Pu9POOGEPV6jY2xv4XCYN954g6VLlwJQ\nUFDAggUL+Nd//Vfa2tqYNWsWJ5988n7PHgxUn64pzj333OTyvvI5phHtg6C0tJSGhobk4/r6+uRI\nmuzppZde4sEHH+SnP/0pXq+XU045haOOOgqA6dOns2nTpixH2LcMGTKEL33pSxiGwahRoygpKaGl\npSU5KltXVzdoP1xTqa2tTf4BnzFjBqNGjQJ0jKXidrv3Oq56+mzT8banW2+9ldGjRzNv3rzksksv\nvZTc3Fzcbjcnn3yyjrluPX3e6xjbv9dee22PlhGPx8PXvvY17HY7RUVFTJw4kQ8//DCLEWbPZ2uK\nvvg5pkL7IJg6dSrPP/88ABs3bqS0tBSPx5PlqPqetrY2Vq5cyUMPPZS86nz+/Pls3boVSBRHu2fX\nkIRnnnmGn/3sZwD4/X4aGxu56KKLksfbCy+8wOmnn57NEPucuro6cnNzcTgcmKbJVVddRWtrK6Bj\nLJVTTz11r+PquOOO45133qG1tZWOjg7efPNNTjzxxCxH2nc888wz2O12brjhhuSyDz/8kJtvvhnT\nNIlGo7z55ps65rr19HmvY2z/3nnnHSZMmJB8/Morr3DXXXcB0NnZyXvvvceYMWOyFV7W9FRT9MXP\nMbWOHASTJk2irKyM2bNnYxgGlZWV2Q6pT3ruuecIBALceOONyWUXXXQRN954Iy6XC7fbnfzwkITp\n06dzyy238OKLLxKJRFi6dClHHXUUixYtoqqqiuHDhzNz5sxsh9mn+P3+ZD+eYRhccsklXHXVVbhc\nLoYMGcL8+fOzHGH2vfvuu6xYsYLt27djs9l4/vnn+Y//+A8WL168x3Flt9u5+eabmTNnDoZhcP31\n1+P1erMdflb0lLPGxkZycnK48sorgcTF8EuXLmXo0KFcfPHFWCwWpk+fPigvYOspX1dcccVen/dO\np1PHWLeecnb//ffj9/uTZ+UATjzxRJ5++mm+/vWvE4vFuOaaaxgyZEgWI8+OnmqK5cuXc9ttt/Wp\nzzHDVEOUiIiIiMhBp9YREREREZEMUKEtIiIiIpIBKrRFRERERDJAhbaIiIiISAao0BYRERERyQBN\n7yciMgBs27aN8847b6877Z155pl861vf+tzbr62t5T//8z/55S9/+bm3JSIyWKjQFhEZIIqKiliz\nZk22wxARkW4qtEVEBrijjz6a6667jtraWjo6Oli+fDlHHnkkGzZsYPny5dhsNgzD4Pbbb+cLX/gC\nH330EUuWLCEej5OTk5O8kVQ8HqeyspJ//OMfOBwOHnroIQBuvvlmWltbiUajTJs2jW9/+9vZfLsi\nIn2GerRFRAa4WCzGuHHjWLNmDZdeeimrVq0CYOHChdx6662sWbOGb3zjG3zve98DoLKykjlz5rB2\n7Vq+9rWv8bvf/Q6AzZs3M3/+fJ588klsNhsvv/wyf/vb34hGo/ziF7/gV7/6FW63m3g8nrX3KiLS\nl2hEW0RkgGhqakreDny37373uwCcdtppAEyaNImf/exntLa20tjYmLw9+JQpU7
jpppsAePvtt5ky\nZQoAX/7yl4FEj/YRRxxBSUkJAEOHDqW1tZXp06ezatUqFixYwJlnnsmsWbOwWDSGIyICKrRFRAaM\nVD3apmkmfzYMA8Mw9vk80OOotNVq3WtZcXEx//Vf/8X69et58cUX+drXvsZTTz2F0+k8kLcgIjKg\naNhBRGQQeOWVVwB44403GD9+PF6vF5/Px4YNGwCoqanh+OOPBxKj3i+99BIAzz33HPfee+8+t/vy\nyy/zl7/8hcmTJ7Nw4ULcbjeNjY0ZfjciIv2DRrRFRAaInlpHDjvsMAD+/ve/88tf/pKWlhZWrFgB\nwIoVK1i+fDlWqxWLxcLSpUsBWLJkCUuWLOEXv/gFNpuNZcuW8fHHH/e4zzFjxrB48WIefvhhrFYr\np512GiNGjMjcmxQR6UcM87PnC0VEZEAZP348GzduxGbT2IqIyKGk1hERERERkQzQiLaIiIiISAZo\nRFtEREREJANUaIuIiIiIZIAKbRERERGRDFChLSIiIiKSASq0RUREREQyQIW2iIiIiEgG/D8NMbZ7\nOB7bfwAAAABJRU5ErkJggg==\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x7fe6de861690>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "plt.figure(figsize=(12, 3))\n",
    "plt.plot(ndcgs_vad)\n",
    "plt.ylabel(\"Validation NDCG@100\")\n",
    "plt.xlabel(\"Epochs\")\n",
    "pass"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Compute test metrics"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 60,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "tf.reset_default_graph()\n",
    "dae = MultiDAE(p_dims, lam=0.01 / batch_size)\n",
    "saver, logits_var, _, _, _ = dae.build_graph()    "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Load the best performing model on the validation set"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 61,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "chkpt directory: /volmount/chkpt/ml-20m/DAE/I-200-I\n"
     ]
    }
   ],
   "source": [
    "chkpt_dir = '/volmount/chkpt/ml-20m/DAE/{}'.format(arch_str)\n",
    "print(\"chkpt directory: %s\" % chkpt_dir)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 63,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "n100_list, r20_list, r50_list = [], [], []\n",
    "\n",
    "with tf.Session() as sess:    \n",
    "    saver.restore(sess, '{}/model'.format(chkpt_dir))\n",
    "    \n",
    "    for bnum, st_idx in enumerate(range(0, N_test, batch_size_test)):\n",
    "        end_idx = min(st_idx + batch_size_test, N_test)\n",
    "        X = test_data_tr[idxlist_test[st_idx:end_idx]]\n",
    "\n",
    "        if sparse.isspmatrix(X):\n",
    "            X = X.toarray()\n",
    "        X = X.astype('float32')\n",
    "\n",
    "        pred_val = sess.run(logits_var, feed_dict={dae.input_ph: X})\n",
    "        # exclude examples from training and validation (if any)\n",
    "        pred_val[X.nonzero()] = -np.inf\n",
    "        n100_list.append(NDCG_binary_at_k_batch(pred_val, test_data_te[idxlist_test[st_idx:end_idx]], k=100))\n",
    "        r20_list.append(Recall_at_k_batch(pred_val, test_data_te[idxlist_test[st_idx:end_idx]], k=20))\n",
    "        r50_list.append(Recall_at_k_batch(pred_val, test_data_te[idxlist_test[st_idx:end_idx]], k=50))\n",
    "\n",
    "n100_list = np.concatenate(n100_list)\n",
    "r20_list = np.concatenate(r20_list)\n",
    "r50_list = np.concatenate(r50_list)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 64,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Test NDCG@100=0.41997 (0.00212)\n",
      "Test Recall@20=0.38768 (0.00268)\n",
      "Test Recall@50=0.52426 (0.00284)\n"
     ]
    }
   ],
   "source": [
    "print(\"Test NDCG@100=%.5f (%.5f)\" % (np.mean(n100_list), np.std(n100_list) / np.sqrt(len(n100_list))))\n",
    "print(\"Test Recall@20=%.5f (%.5f)\" % (np.mean(r20_list), np.std(r20_list) / np.sqrt(len(r20_list))))\n",
    "print(\"Test Recall@50=%.5f (%.5f)\" % (np.mean(r50_list), np.std(r50_list) / np.sqrt(len(r50_list))))"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 2",
   "language": "python",
   "name": "python2"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 2
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython2",
   "version": "2.7.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}