{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Personalized Recommendation\n",
    "This project uses a text convolutional neural network together with the [`MovieLens`](https://grouplens.org/datasets/movielens/) dataset to build a movie recommender.\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Recommender systems are everywhere in everyday web applications: online shopping, book stores, news apps, social networks, music sites, movie sites, and so on. Wherever there are people, there are recommendations. Content is personalized based on an individual's preferences and on the habits of users with similar tastes. Open a news app, for example, and thanks to personalization every user sees a different front page."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This is genuinely useful. In today's age of information explosion there are countless ways to obtain information, and people no longer spend most of their time finding information but rather sifting through it for what interests them. This is the information-overload problem, and recommender systems arose to solve it."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Collaborative filtering is one of the most widely used techniques in recommender systems. It collects a user's history and preferences, computes that user's similarity to other users, and uses the ratings of similar users to predict how much the target user will like a given item. Its strength is that it can recommend items the user has never seen. Its weakness is the cold-start problem: a brand-new user has no interaction history or stated preferences, so the model cannot find similar users or items."
   ]
  },
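  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick illustration of the idea (a toy sketch, not part of this project's pipeline; the ratings matrix below is made up), user-based collaborative filtering can be implemented with cosine similarity:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "# Toy user-item rating matrix (0 = unrated); rows are users, columns are items\n",
    "R = np.array([\n",
    "    [5, 3, 0, 1],\n",
    "    [4, 0, 0, 1],\n",
    "    [1, 1, 0, 5],\n",
    "    [1, 0, 0, 4],\n",
    "], dtype=float)\n",
    "\n",
    "def cosine_sim(a, b):\n",
    "    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)\n",
    "\n",
    "def predict_rating(R, user, item):\n",
    "    # Similarity-weighted average of the other users' ratings of this item\n",
    "    sims = np.array([cosine_sim(R[user], R[v]) for v in range(len(R))])\n",
    "    mask = R[:, item] > 0\n",
    "    mask[user] = False\n",
    "    return sims[mask] @ R[mask, item] / sims[mask].sum()\n",
    "\n",
    "predict_rating(R, 0, 3)"
   ]
  },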
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A common way to mitigate cold start is to ask newly registered users to pick topics, groups, products, or music genres they are interested in, as Douban FM does:\n",
    "<img src=\"assets/IMG_6242_300.PNG\"/>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Downloading the dataset\n",
    "Run the code below to download the [`dataset`](http://files.grouplens.org/datasets/movielens/ml-1m.zip)."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "'2.0.0-dev20190301'"
      ]
     },
     "execution_count": 6,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "import tensorflow as tf\n",
    "tf.__version__"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "import pandas as pd\n",
    "from sklearn.model_selection import train_test_split\n",
    "import numpy as np\n",
    "from collections import Counter\n",
    "import tensorflow as tf\n",
    "\n",
    "import os\n",
    "import pickle\n",
    "import re\n",
    "from tensorflow.python.ops import math_ops"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [],
   "source": [
    "from urllib.request import urlretrieve\n",
    "from os.path import isfile, isdir\n",
    "from tqdm import tqdm\n",
    "import zipfile\n",
    "import hashlib\n",
    "import shutil\n",
    "\n",
    "def _unzip(save_path, _, database_name, data_path):\n",
    "    \"\"\"\n",
    "    Unzip wrapper with the same interface as _ungzip\n",
    "    :param save_path: The path of the gzip files\n",
    "    :param database_name: Name of database\n",
    "    :param data_path: Path to extract to\n",
    "    :param _: HACK - Used to have the same interface as _ungzip\n",
    "    \"\"\"\n",
    "    print('Extracting {}...'.format(database_name))\n",
    "    with zipfile.ZipFile(save_path) as zf:\n",
    "        zf.extractall(data_path)\n",
    "\n",
    "def download_extract(database_name, data_path):\n",
    "    \"\"\"\n",
    "    Download and extract database\n",
    "    :param database_name: Database name\n",
    "    \"\"\"\n",
    "    DATASET_ML1M = 'ml-1m'\n",
    "\n",
    "    if database_name == DATASET_ML1M:\n",
    "        url = 'http://files.grouplens.org/datasets/movielens/ml-1m.zip'\n",
    "        hash_code = 'c4d9eecfca2ab87c1945afe126590906'\n",
    "        extract_path = os.path.join(data_path, 'ml-1m')\n",
    "        save_path = os.path.join(data_path, 'ml-1m.zip')\n",
    "        extract_fn = _unzip\n",
    "\n",
    "    if os.path.exists(extract_path):\n",
    "        print('Found {} Data'.format(database_name))\n",
    "        return\n",
    "\n",
    "    if not os.path.exists(data_path):\n",
    "        os.makedirs(data_path)\n",
    "\n",
    "    if not os.path.exists(save_path):\n",
    "        with DLProgress(unit='B', unit_scale=True, miniters=1, desc='Downloading {}'.format(database_name)) as pbar:\n",
    "            urlretrieve(\n",
    "                url,\n",
    "                save_path,\n",
    "                pbar.hook)\n",
    "\n",
    "    assert hashlib.md5(open(save_path, 'rb').read()).hexdigest() == hash_code, \\\n",
    "        '{} file is corrupted.  Remove the file and try again.'.format(save_path)\n",
    "\n",
    "    os.makedirs(extract_path)\n",
    "    try:\n",
    "        extract_fn(save_path, extract_path, database_name, data_path)\n",
    "    except Exception as err:\n",
    "        shutil.rmtree(extract_path)  # Remove extraction folder if there is an error\n",
    "        raise err\n",
    "\n",
    "    print('Done.')\n",
    "    # Remove compressed data\n",
    "#     os.remove(save_path)\n",
    "\n",
    "class DLProgress(tqdm):\n",
    "    \"\"\"\n",
    "    Handle Progress Bar while Downloading\n",
    "    \"\"\"\n",
    "    last_block = 0\n",
    "\n",
    "    def hook(self, block_num=1, block_size=1, total_size=None):\n",
    "        \"\"\"\n",
    "        A hook function that will be called once on establishment of the network connection and\n",
    "        once after each block read thereafter.\n",
    "        :param block_num: A count of blocks transferred so far\n",
    "        :param block_size: Block size in bytes\n",
    "        :param total_size: The total size of the file. This may be -1 on older FTP servers which do not return\n",
    "                            a file size in response to a retrieval request.\n",
    "        \"\"\"\n",
    "        self.total = total_size\n",
    "        self.update((block_num - self.last_block) * block_size)\n",
    "        self.last_block = block_num"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Extracting ml-1m...\n",
      "Done.\n"
     ]
    }
   ],
   "source": [
    "data_dir = './'\n",
    "download_extract('ml-1m', data_dir)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## A first look at the data"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This project uses the MovieLens 1M dataset, which contains about one million ratings from some 6,000 users on nearly 4,000 movies.\n",
    "\n",
    "The dataset comes in three files: user data users.dat, movie data movies.dat, and rating data ratings.dat."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### User data\n",
    "Fields: user ID, gender, age, occupation ID, and zip code.\n",
    "\n",
    "Record format: UserID::Gender::Age::Occupation::Zip-code\n",
    "\n",
    "- Gender is denoted by an \"M\" for male and an \"F\" for female\n",
    "- Age is chosen from the following ranges:\n",
    "\n",
    "\t*  1:  \"Under 18\"\n",
    "\t* 18:  \"18-24\"\n",
    "\t* 25:  \"25-34\"\n",
    "\t* 35:  \"35-44\"\n",
    "\t* 45:  \"45-49\"\n",
    "\t* 50:  \"50-55\"\n",
    "\t* 56:  \"56+\"\n",
    "\n",
    "- Occupation is chosen from the following choices:\n",
    "\n",
    "\t*  0:  \"other\" or not specified\n",
    "\t*  1:  \"academic/educator\"\n",
    "\t*  2:  \"artist\"\n",
    "\t*  3:  \"clerical/admin\"\n",
    "\t*  4:  \"college/grad student\"\n",
    "\t*  5:  \"customer service\"\n",
    "\t*  6:  \"doctor/health care\"\n",
    "\t*  7:  \"executive/managerial\"\n",
    "\t*  8:  \"farmer\"\n",
    "\t*  9:  \"homemaker\"\n",
    "\t* 10:  \"K-12 student\"\n",
    "\t* 11:  \"lawyer\"\n",
    "\t* 12:  \"programmer\"\n",
    "\t* 13:  \"retired\"\n",
    "\t* 14:  \"sales/marketing\"\n",
    "\t* 15:  \"scientist\"\n",
    "\t* 16:  \"self-employed\"\n",
    "\t* 17:  \"technician/engineer\"\n",
    "\t* 18:  \"tradesman/craftsman\"\n",
    "\t* 19:  \"unemployed\"\n",
    "\t* 20:  \"writer\"\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>UserID</th>\n",
       "      <th>Gender</th>\n",
       "      <th>Age</th>\n",
       "      <th>OccupationID</th>\n",
       "      <th>Zip-code</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>1</td>\n",
       "      <td>F</td>\n",
       "      <td>1</td>\n",
       "      <td>10</td>\n",
       "      <td>48067</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>2</td>\n",
       "      <td>M</td>\n",
       "      <td>56</td>\n",
       "      <td>16</td>\n",
       "      <td>70072</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>3</td>\n",
       "      <td>M</td>\n",
       "      <td>25</td>\n",
       "      <td>15</td>\n",
       "      <td>55117</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>4</td>\n",
       "      <td>M</td>\n",
       "      <td>45</td>\n",
       "      <td>7</td>\n",
       "      <td>02460</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>5</td>\n",
       "      <td>M</td>\n",
       "      <td>25</td>\n",
       "      <td>20</td>\n",
       "      <td>55455</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "   UserID Gender  Age  OccupationID Zip-code\n",
       "0       1      F    1            10    48067\n",
       "1       2      M   56            16    70072\n",
       "2       3      M   25            15    55117\n",
       "3       4      M   45             7    02460\n",
       "4       5      M   25            20    55455"
      ]
     },
     "execution_count": 6,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "users_title = ['UserID', 'Gender', 'Age', 'OccupationID', 'Zip-code']\n",
    "users = pd.read_csv('./ml-1m/users.dat', sep='::', header=None, names=users_title, engine = 'python')\n",
    "users.head()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "UserID, Gender, Age and OccupationID are all categorical fields; the Zip-code field is one we will not use."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Movie data\n",
    "Fields: movie ID, title, and genres.\n",
    "\n",
    "Record format: MovieID::Title::Genres\n",
    "\n",
    "- Titles are identical to titles provided by the IMDB (including\n",
    "year of release)\n",
    "- Genres are pipe-separated and are selected from the following genres:\n",
    "\n",
    "\t* Action\n",
    "\t* Adventure\n",
    "\t* Animation\n",
    "\t* Children's\n",
    "\t* Comedy\n",
    "\t* Crime\n",
    "\t* Documentary\n",
    "\t* Drama\n",
    "\t* Fantasy\n",
    "\t* Film-Noir\n",
    "\t* Horror\n",
    "\t* Musical\n",
    "\t* Mystery\n",
    "\t* Romance\n",
    "\t* Sci-Fi\n",
    "\t* Thriller\n",
    "\t* War\n",
    "\t* Western\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>MovieID</th>\n",
       "      <th>Title</th>\n",
       "      <th>Genres</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>1</td>\n",
       "      <td>Toy Story (1995)</td>\n",
       "      <td>Animation|Children's|Comedy</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>2</td>\n",
       "      <td>Jumanji (1995)</td>\n",
       "      <td>Adventure|Children's|Fantasy</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>3</td>\n",
       "      <td>Grumpier Old Men (1995)</td>\n",
       "      <td>Comedy|Romance</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>4</td>\n",
       "      <td>Waiting to Exhale (1995)</td>\n",
       "      <td>Comedy|Drama</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>5</td>\n",
       "      <td>Father of the Bride Part II (1995)</td>\n",
       "      <td>Comedy</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "   MovieID                               Title                        Genres\n",
       "0        1                    Toy Story (1995)   Animation|Children's|Comedy\n",
       "1        2                      Jumanji (1995)  Adventure|Children's|Fantasy\n",
       "2        3             Grumpier Old Men (1995)                Comedy|Romance\n",
       "3        4            Waiting to Exhale (1995)                  Comedy|Drama\n",
       "4        5  Father of the Bride Part II (1995)                        Comedy"
      ]
     },
     "execution_count": 7,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "movies_title = ['MovieID', 'Title', 'Genres']\n",
    "movies = pd.read_csv('./ml-1m/movies.dat', sep='::', header=None, names=movies_title, engine = 'python')\n",
    "movies.head()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "MovieID is a categorical field, Title is free text, and Genres is also categorical."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Ratings data\n",
    "Fields: user ID, movie ID, rating, and timestamp.\n",
    "\n",
    "Record format: UserID::MovieID::Rating::Timestamp\n",
    "\n",
    "- UserIDs range between 1 and 6040 \n",
    "- MovieIDs range between 1 and 3952\n",
    "- Ratings are made on a 5-star scale (whole-star ratings only)\n",
    "- Timestamp is represented in seconds since the epoch as returned by time(2)\n",
    "- Each user has at least 20 ratings"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>UserID</th>\n",
       "      <th>MovieID</th>\n",
       "      <th>Rating</th>\n",
       "      <th>timestamps</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>1</td>\n",
       "      <td>1193</td>\n",
       "      <td>5</td>\n",
       "      <td>978300760</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>1</td>\n",
       "      <td>661</td>\n",
       "      <td>3</td>\n",
       "      <td>978302109</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>1</td>\n",
       "      <td>914</td>\n",
       "      <td>3</td>\n",
       "      <td>978301968</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>1</td>\n",
       "      <td>3408</td>\n",
       "      <td>4</td>\n",
       "      <td>978300275</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>1</td>\n",
       "      <td>2355</td>\n",
       "      <td>5</td>\n",
       "      <td>978824291</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "   UserID  MovieID  Rating  timestamps\n",
       "0       1     1193       5   978300760\n",
       "1       1      661       3   978302109\n",
       "2       1      914       3   978301968\n",
       "3       1     3408       4   978300275\n",
       "4       1     2355       5   978824291"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "ratings_title = ['UserID','MovieID', 'Rating', 'timestamps']\n",
    "ratings = pd.read_csv('./ml-1m/ratings.dat', sep='::', header=None, names=ratings_title, engine = 'python')\n",
    "ratings.head()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The Rating field is the target we want to learn; the timestamp field is not used."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Data preprocessing"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "- UserID, Occupation and MovieID need no change.\n",
    "- Gender: map 'F' and 'M' to 0 and 1.\n",
    "- Age: map the seven age brackets to the consecutive integers 0 through 6.\n",
    "- Genres: a categorical field that has to be converted to numbers. First build a dictionary from genre strings to integers, then convert each movie's Genres field into a list of integers, since some movies combine several genres.\n",
    "- Title: handled the same way as Genres; first build a word-to-integer dictionary, then convert each title into a list of integers. The year in the title is stripped out as well.\n",
    "- Genres and Title are padded to fixed lengths so they are easy to handle in the neural network; the empty positions are filled with the number assigned to `<PAD>`."
   ]
  },
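  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The Genres conversion described above can be illustrated on a single movie. The dictionary here is a toy stand-in; the real `genres2int` is built from the data in the next section:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Toy genre dictionary; the real genres2int is built from movies.dat below\n",
    "toy_genres2int = {'<PAD>': 0, 'Animation': 1, \"Children's\": 2, 'Comedy': 3}\n",
    "pad_length = 18\n",
    "\n",
    "genres = \"Animation|Children's|Comedy\"\n",
    "ids = [toy_genres2int[g] for g in genres.split('|')]\n",
    "# Pad with <PAD>'s number so every movie gets an equal-length list\n",
    "ids = ids + [toy_genres2int['<PAD>']] * (pad_length - len(ids))\n",
    "ids"
   ]
  },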
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Implementing the preprocessing"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [],
   "source": [
    "def load_data():\n",
    "    \"\"\"\n",
    "    Load Dataset from File\n",
    "    \"\"\"\n",
    "    # Read the user data\n",
    "    users_title = ['UserID', 'Gender', 'Age', 'JobID', 'Zip-code']\n",
    "    users = pd.read_csv('./ml-1m/users.dat', sep='::', header=None, names=users_title, engine = 'python')\n",
    "    users = users.filter(regex='UserID|Gender|Age|JobID')\n",
    "    users_orig = users.values\n",
    "    # Re-encode Gender and Age in the user data\n",
    "    gender_map = {'F':0, 'M':1}\n",
    "    users['Gender'] = users['Gender'].map(gender_map)\n",
    "\n",
    "    age_map = {val:ii for ii,val in enumerate(set(users['Age']))}\n",
    "    users['Age'] = users['Age'].map(age_map)\n",
    "\n",
    "    # Read the movie data\n",
    "    movies_title = ['MovieID', 'Title', 'Genres']\n",
    "    movies = pd.read_csv('./ml-1m/movies.dat', sep='::', header=None, names=movies_title, engine = 'python')\n",
    "    movies_orig = movies.values\n",
    "    # Strip the release year from Title\n",
    "    pattern = re.compile(r'^(.*)\\((\\d+)\\)$')\n",
    "\n",
    "    title_map = {val:pattern.match(val).group(1) for ii,val in enumerate(set(movies['Title']))}\n",
    "    movies['Title'] = movies['Title'].map(title_map)\n",
    "\n",
    "    # Build a genre-to-integer dictionary\n",
    "    genres_set = set()\n",
    "    for val in movies['Genres'].str.split('|'):\n",
    "        genres_set.update(val)\n",
    "\n",
    "    genres_set.add('<PAD>')\n",
    "    genres2int = {val:ii for ii, val in enumerate(genres_set)}\n",
    "\n",
    "    # Convert Genres to equal-length integer lists (length 18)\n",
    "    genres_map = {val:[genres2int[row] for row in val.split('|')] for ii,val in enumerate(set(movies['Genres']))}\n",
    "\n",
    "    for key in genres_map:\n",
    "        for cnt in range(max(genres2int.values()) - len(genres_map[key])):\n",
    "            genres_map[key].insert(len(genres_map[key]) + cnt,genres2int['<PAD>'])\n",
    "    \n",
    "    movies['Genres'] = movies['Genres'].map(genres_map)\n",
    "\n",
    "    # Build a title-word-to-integer dictionary\n",
    "    title_set = set()\n",
    "    for val in movies['Title'].str.split():\n",
    "        title_set.update(val)\n",
    "    \n",
    "    title_set.add('<PAD>')\n",
    "    title2int = {val:ii for ii, val in enumerate(title_set)}\n",
    "\n",
    "    # Convert Title to equal-length integer lists (length 15)\n",
    "    title_count = 15\n",
    "    title_map = {val:[title2int[row] for row in val.split()] for ii,val in enumerate(set(movies['Title']))}\n",
    "    \n",
    "    for key in title_map:\n",
    "        for cnt in range(title_count - len(title_map[key])):\n",
    "            title_map[key].insert(len(title_map[key]) + cnt,title2int['<PAD>'])\n",
    "    \n",
    "    movies['Title'] = movies['Title'].map(title_map)\n",
    "\n",
    "    # Read the ratings data\n",
    "    ratings_title = ['UserID','MovieID', 'ratings', 'timestamps']\n",
    "    ratings = pd.read_csv('./ml-1m/ratings.dat', sep='::', header=None, names=ratings_title, engine = 'python')\n",
    "    ratings = ratings.filter(regex='UserID|MovieID|ratings')\n",
    "\n",
    "    # Merge the three tables\n",
    "    data = pd.merge(pd.merge(ratings, users), movies)\n",
    "    \n",
    "    # Split the data into features X and targets y\n",
    "    target_fields = ['ratings']\n",
    "    features_pd, targets_pd = data.drop(target_fields, axis=1), data[target_fields]\n",
    "    \n",
    "    features = features_pd.values\n",
    "    targets_values = targets_pd.values\n",
    "    \n",
    "    return title_count, title_set, genres2int, features, targets_values, ratings, users, movies, data, movies_orig, users_orig"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Load the data and save it to disk"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "- title_count: length of the Title field (15)\n",
    "- title_set: set of words appearing in titles\n",
    "- genres2int: dictionary mapping genres to integers\n",
    "- features: the input X\n",
    "- targets_values: the learning target y\n",
    "- ratings: the ratings dataset as a pandas DataFrame\n",
    "- users: the user dataset as a pandas DataFrame\n",
    "- movies: the movie dataset as a pandas DataFrame\n",
    "- data: the three datasets merged into a single pandas DataFrame\n",
    "- movies_orig: the raw, unprocessed movie data\n",
    "- users_orig: the raw, unprocessed user data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [],
   "source": [
    "title_count, title_set, genres2int, features, targets_values, ratings, users, movies, data, movies_orig, users_orig = load_data()\n",
    "\n",
    "pickle.dump((title_count, title_set, genres2int, features, targets_values, ratings, users, movies, data, movies_orig, users_orig), open('preprocess.p', 'wb'))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### The preprocessed data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>UserID</th>\n",
       "      <th>Gender</th>\n",
       "      <th>Age</th>\n",
       "      <th>JobID</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>10</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>2</td>\n",
       "      <td>1</td>\n",
       "      <td>5</td>\n",
       "      <td>16</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>3</td>\n",
       "      <td>1</td>\n",
       "      <td>6</td>\n",
       "      <td>15</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>4</td>\n",
       "      <td>1</td>\n",
       "      <td>2</td>\n",
       "      <td>7</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>5</td>\n",
       "      <td>1</td>\n",
       "      <td>6</td>\n",
       "      <td>20</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "   UserID  Gender  Age  JobID\n",
       "0       1       0    0     10\n",
       "1       2       1    5     16\n",
       "2       3       1    6     15\n",
       "3       4       1    2      7\n",
       "4       5       1    6     20"
      ]
     },
     "execution_count": 13,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "users.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>MovieID</th>\n",
       "      <th>Title</th>\n",
       "      <th>Genres</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>1</td>\n",
       "      <td>[3258, 3347, 2746, 2746, 2746, 2746, 2746, 274...</td>\n",
       "      <td>[16, 11, 5, 10, 10, 10, 10, 10, 10, 10, 10, 10...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>2</td>\n",
       "      <td>[3318, 2746, 2746, 2746, 2746, 2746, 2746, 274...</td>\n",
       "      <td>[7, 11, 6, 10, 10, 10, 10, 10, 10, 10, 10, 10,...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>3</td>\n",
       "      <td>[2286, 3451, 1085, 2746, 2746, 2746, 2746, 274...</td>\n",
       "      <td>[5, 12, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>4</td>\n",
       "      <td>[251, 4209, 3270, 2746, 2746, 2746, 2746, 2746...</td>\n",
       "      <td>[5, 13, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>5</td>\n",
       "      <td>[4457, 2615, 1663, 4559, 2474, 3519, 2746, 274...</td>\n",
       "      <td>[5, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10...</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "   MovieID                                              Title  \\\n",
       "0        1  [3258, 3347, 2746, 2746, 2746, 2746, 2746, 274...   \n",
       "1        2  [3318, 2746, 2746, 2746, 2746, 2746, 2746, 274...   \n",
       "2        3  [2286, 3451, 1085, 2746, 2746, 2746, 2746, 274...   \n",
       "3        4  [251, 4209, 3270, 2746, 2746, 2746, 2746, 2746...   \n",
       "4        5  [4457, 2615, 1663, 4559, 2474, 3519, 2746, 274...   \n",
       "\n",
       "                                              Genres  \n",
       "0  [16, 11, 5, 10, 10, 10, 10, 10, 10, 10, 10, 10...  \n",
       "1  [7, 11, 6, 10, 10, 10, 10, 10, 10, 10, 10, 10,...  \n",
       "2  [5, 12, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10...  \n",
       "3  [5, 13, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10...  \n",
       "4  [5, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10...  "
      ]
     },
     "execution_count": 14,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "movies.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "array([1,\n",
       "       list([3258, 3347, 2746, 2746, 2746, 2746, 2746, 2746, 2746, 2746, 2746, 2746, 2746, 2746, 2746]),\n",
       "       list([16, 11, 5, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10, 10])],\n",
       "      dtype=object)"
      ]
     },
     "execution_count": 15,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "movies.values[0]"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Loading the data from disk"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "title_count, title_set, genres2int, features, targets_values, ratings, users, movies, data, movies_orig, users_orig = pickle.load(open('preprocess.p', mode='rb'))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Model design"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<img src=\"assets/model.001.jpeg\"/>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Looking at the field types in the dataset, several are categorical. The usual treatment is one-hot encoding, but fields like UserID and MovieID would then become extremely sparse and the input dimensionality would explode, which we would rather avoid; my little laptop is no match for the big shops that casually handle inputs with hundreds of millions of dimensions :)\n",
    "\n",
    "So during preprocessing these fields were converted to integers, which we use as indices into embedding matrices. The first layer of the network is an embedding layer, with dimensions (N, 32) and (N, 16).\n",
    "\n",
    "Movie genres need one extra step. A movie can have several genres, so the embedding lookup returns an (n, 32) matrix with one row per genre; we sum over the rows to reduce it to a (1, 32) vector.\n",
    "\n",
    "Movie titles are handled differently: instead of a recurrent network we use a text convolutional network, explained below.\n",
    "\n",
    "After the embedding lookups, each feature is passed through a fully connected layer, and the outputs go through another fully connected layer, finally producing two (1, 200) feature vectors: one for the user and one for the movie.\n",
    "\n",
    "Our goal is to train these user and movie features for use at recommendation time. Once we have the two vectors, any method can be used to fit the rating. I tried two. The first, drawn in the figure above, takes the dot product of the two vectors and regresses the result against the true rating with an MSE loss, since this is fundamentally a regression problem. The second feeds both vectors into a further fully connected layer that outputs a single value, which is again regressed to the true rating with MSE.\n",
    "\n",
    "In practice, after 5 epochs the second approach reaches an MSE loss of around 0.8, and the first around 1."
   ]
  },
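  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of the first approach, with stand-in inputs for the two 200-dimensional feature vectors; the names and the random training data are illustrative only, not part of the real model:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import tensorflow as tf\n",
    "\n",
    "# Stand-ins for the (batch, 200) user and movie feature vectors\n",
    "u_in = tf.keras.layers.Input(shape=(200,), name='user_feature')\n",
    "m_in = tf.keras.layers.Input(shape=(200,), name='movie_feature')\n",
    "\n",
    "# Approach 1: the dot product of the two feature vectors is the predicted rating\n",
    "pred = tf.keras.layers.Dot(axes=1)([u_in, m_in])\n",
    "\n",
    "toy_model = tf.keras.Model([u_in, m_in], pred)\n",
    "toy_model.compile(optimizer=tf.keras.optimizers.Adam(0.0001), loss='mse')\n",
    "\n",
    "# One training step on random data, just to show the shapes line up\n",
    "u = np.random.rand(256, 200).astype('float32')\n",
    "m = np.random.rand(256, 200).astype('float32')\n",
    "y = np.random.randint(1, 6, size=(256, 1)).astype('float32')\n",
    "toy_model.train_on_batch([u, m], y)"
   ]
  },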
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Text convolutional network\n",
    "The network looks like this:"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<img src=\"assets/text_cnn.png\"/>\n",
    "Image from Yoon Kim's paper: [`Convolutional Neural Networks for Sentence Classification`](https://arxiv.org/abs/1408.5882)\n",
    "\n",
    "For applying convolutional networks to text, I recommend reading [`Understanding Convolutional Neural Networks for NLP`](http://www.wildml.com/2015/11/understanding-convolutional-neural-networks-for-nlp/)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The first layer of the network is a word-embedding layer: an embedding matrix made up of the embedding vector of each word. The next layer convolves kernels of several different sizes (window sizes) over the embedding matrix, where the window size is the number of words each convolution covers. This differs from convolution over images: image kernels are usually 2x2, 3x3, or 5x5, while a text kernel must span the full embedding vector of each word, so its size is (number of words, embedding dimension), sliding over, say, 3, 4, or 5 words at a time. The third layer is max pooling, which yields one long vector, and finally dropout is applied for regularization, producing the feature vector of the movie Title."
   ]
  },
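  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal Keras sketch of this text CNN, using the sizes used later in this notebook (15-word titles, 32-dimensional embeddings, window sizes 2 to 5, 8 filters per size); the vocabulary size 5216 and the layer arrangement are illustrative, not the final model:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import tensorflow as tf\n",
    "\n",
    "seq_len, embed_dim, filter_num = 15, 32, 8\n",
    "window_sizes = [2, 3, 4, 5]\n",
    "\n",
    "inp = tf.keras.layers.Input(shape=(seq_len,), dtype='int32')\n",
    "emb = tf.keras.layers.Embedding(5216, embed_dim)(inp)        # (batch, 15, 32)\n",
    "emb = tf.keras.layers.Reshape((seq_len, embed_dim, 1))(emb)  # add a channel axis\n",
    "\n",
    "pooled = []\n",
    "for w in window_sizes:\n",
    "    # Each kernel spans w words and the full embedding width\n",
    "    conv = tf.keras.layers.Conv2D(filter_num, (w, embed_dim), activation='relu')(emb)\n",
    "    # Max-pool over all window positions, keeping one value per filter\n",
    "    pooled.append(tf.keras.layers.MaxPool2D((seq_len - w + 1, 1))(conv))\n",
    "\n",
    "x = tf.keras.layers.Concatenate()(pooled)  # 4 window sizes * 8 filters = 32 values\n",
    "x = tf.keras.layers.Flatten()(x)\n",
    "x = tf.keras.layers.Dropout(0.5)(x)\n",
    "title_cnn = tf.keras.Model(inp, x)"
   ]
  },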
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Helper functions"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [],
   "source": [
    "import tensorflow as tf\n",
    "import os\n",
    "import pickle\n",
    "\n",
    "def save_params(params):\n",
    "    \"\"\"\n",
    "    Save parameters to file\n",
    "    \"\"\"\n",
    "    pickle.dump(params, open('params.p', 'wb'))\n",
    "\n",
    "\n",
    "def load_params():\n",
    "    \"\"\"\n",
    "    Load parameters from file\n",
    "    \"\"\"\n",
    "    return pickle.load(open('params.p', mode='rb'))\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Implementation"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Embedding dimension\n",
    "embed_dim = 32\n",
    "# Number of user IDs\n",
    "uid_max = max(features.take(0,1)) + 1 # 6040 + 1 = 6041\n",
    "# Number of genders\n",
    "gender_max = max(features.take(2,1)) + 1 # 1 + 1 = 2\n",
    "# Number of age brackets\n",
    "age_max = max(features.take(3,1)) + 1 # 6 + 1 = 7\n",
    "# Number of occupations\n",
    "job_max = max(features.take(4,1)) + 1 # 20 + 1 = 21\n",
    "\n",
    "# Number of movie IDs\n",
    "movie_id_max = max(features.take(1,1)) + 1 # 3952 + 1 = 3953\n",
    "# Number of movie genres\n",
    "movie_categories_max = max(genres2int.values()) + 1 # 18 + 1 = 19\n",
    "# Number of distinct words in movie titles\n",
    "movie_title_max = len(title_set) # 5216\n",
    "\n",
    "# How to combine the genre embedding vectors; \"mean\" was considered but only \"sum\" is implemented\n",
    "combiner = \"sum\"\n",
    "\n",
    "# Title length\n",
    "sentences_size = title_count # = 15\n",
    "# Text-convolution window sizes: slide over 2, 3, 4 and 5 words\n",
    "window_sizes = {2, 3, 4, 5}\n",
    "# Number of convolution filters per window size\n",
    "filter_num = 8\n",
    "\n",
    "# Dictionary mapping MovieID to row index; IDs in the dataset are not contiguous, e.g. the movie in row 5 does not necessarily have MovieID 5\n",
    "movieid2idx = {val[0]:i for i, val in enumerate(movies.values)}"
   ]
  },
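  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A toy illustration of why a mapping like `movieid2idx` is needed (made-up rows; MovieLens movie IDs have gaps, so an ID is not a row index):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Toy stand-in for movies.values: (movie_id, title) rows with a gap in the IDs\n",
    "toy_movies = [(1, 'Toy Story'), (2, 'Jumanji'), (5, 'Father of the Bride Part II')]\n",
    "\n",
    "# Same construction as movieid2idx above: map each movie ID to its row index\n",
    "toy_movieid2idx = {row[0]: i for i, row in enumerate(toy_movies)}\n",
    "\n",
    "# Movie ID 5 lives in row 2, not row 5\n",
    "assert toy_movieid2idx[5] == 2"
   ]
  },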
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Hyperparameters"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Number of Epochs\n",
    "num_epochs = 5\n",
    "# Batch Size\n",
    "batch_size = 256\n",
    "\n",
    "# Dropout rate (note: tf.keras.layers.Dropout takes the fraction of units to drop, not to keep)\n",
    "dropout_keep = 0.5\n",
    "# Learning Rate\n",
    "learning_rate = 0.0001\n",
    "# Show stats for every n number of batches\n",
    "show_every_n_batches = 20\n",
    "\n",
    "save_dir = './save'"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Inputs"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Define the model inputs (Keras `Input` layers play the role of placeholders here)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [],
   "source": [
    "def get_inputs():\n",
    "    uid = tf.keras.layers.Input(shape=(1,), dtype='int32', name='uid')  \n",
    "    user_gender = tf.keras.layers.Input(shape=(1,), dtype='int32', name='user_gender')  \n",
    "    user_age = tf.keras.layers.Input(shape=(1,), dtype='int32', name='user_age') \n",
    "    user_job = tf.keras.layers.Input(shape=(1,), dtype='int32', name='user_job')\n",
    "\n",
    "    movie_id = tf.keras.layers.Input(shape=(1,), dtype='int32', name='movie_id') \n",
    "    movie_categories = tf.keras.layers.Input(shape=(18,), dtype='int32', name='movie_categories') \n",
    "    movie_titles = tf.keras.layers.Input(shape=(15,), dtype='int32', name='movie_titles') \n",
    "    return uid, user_gender, user_age, user_job, movie_id, movie_categories, movie_titles"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Building the Network"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### Define the User embedding matrices"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [],
   "source": [
    "def get_user_embedding(uid, user_gender, user_age, user_job):\n",
    "    uid_embed_layer = tf.keras.layers.Embedding(uid_max, embed_dim, input_length=1, name='uid_embed_layer')(uid)\n",
    "    gender_embed_layer = tf.keras.layers.Embedding(gender_max, embed_dim // 2, input_length=1, name='gender_embed_layer')(user_gender)\n",
    "    age_embed_layer = tf.keras.layers.Embedding(age_max, embed_dim // 2, input_length=1, name='age_embed_layer')(user_age)\n",
    "    job_embed_layer = tf.keras.layers.Embedding(job_max, embed_dim // 2, input_length=1, name='job_embed_layer')(user_job)\n",
    "    return uid_embed_layer, gender_embed_layer, age_embed_layer, job_embed_layer"
   ]
  },
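  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "What an embedding layer does can be sketched in NumPy (toy sizes; in the model the table is trainable and learned with the rest of the network): looking up an ID simply selects a row of the weight matrix."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "# Toy embedding table: one row per user ID (uid_max rows of embed_dim values)\n",
    "toy_uid_max, toy_embed_dim = 10, 4\n",
    "embedding_matrix = np.arange(toy_uid_max * toy_embed_dim, dtype=np.float32).reshape(toy_uid_max, toy_embed_dim)\n",
    "\n",
    "# The embedding lookup for user ID 3 is simply row 3 of the matrix\n",
    "uid_vector = embedding_matrix[3]\n",
    "assert uid_vector.shape == (toy_embed_dim,)"
   ]
  },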
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### Pass the User embeddings through fully connected layers to build the User feature"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {},
   "outputs": [],
   "source": [
    "def get_user_feature_layer(uid_embed_layer, gender_embed_layer, age_embed_layer, job_embed_layer):\n",
    "    # First fully connected layer\n",
    "    uid_fc_layer = tf.keras.layers.Dense(embed_dim, name=\"uid_fc_layer\", activation='relu')(uid_embed_layer)\n",
    "    gender_fc_layer = tf.keras.layers.Dense(embed_dim, name=\"gender_fc_layer\", activation='relu')(gender_embed_layer)\n",
    "    age_fc_layer = tf.keras.layers.Dense(embed_dim, name=\"age_fc_layer\", activation='relu')(age_embed_layer)\n",
    "    job_fc_layer = tf.keras.layers.Dense(embed_dim, name=\"job_fc_layer\", activation='relu')(job_embed_layer)\n",
    "\n",
    "    # Second fully connected layer\n",
    "    user_combine_layer = tf.keras.layers.concatenate([uid_fc_layer, gender_fc_layer, age_fc_layer, job_fc_layer], 2)  #(?, 1, 128)\n",
    "    user_combine_layer = tf.keras.layers.Dense(200, activation='tanh')(user_combine_layer)  #(?, 1, 200)\n",
    "\n",
    "    user_combine_layer_flat = tf.keras.layers.Reshape([200], name=\"user_combine_layer_flat\")(user_combine_layer)\n",
    "    return user_combine_layer, user_combine_layer_flat"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### Define the Movie ID embedding matrix"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {},
   "outputs": [],
   "source": [
    "def get_movie_id_embed_layer(movie_id):\n",
    "    movie_id_embed_layer = tf.keras.layers.Embedding(movie_id_max, embed_dim, input_length=1, name='movie_id_embed_layer')(movie_id)\n",
    "    return movie_id_embed_layer"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### Combine each movie's multiple genre embedding vectors"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [],
   "source": [
    "def get_movie_categories_layers(movie_categories):\n",
    "    movie_categories_embed_layer = tf.keras.layers.Embedding(movie_categories_max, embed_dim, input_length=18, name='movie_categories_embed_layer')(movie_categories)\n",
    "    movie_categories_embed_layer = tf.keras.layers.Lambda(lambda layer: tf.reduce_sum(layer, axis=1, keepdims=True))(movie_categories_embed_layer)\n",
    "#     movie_categories_embed_layer = tf.keras.layers.Reshape([1, 18 * embed_dim])(movie_categories_embed_layer)\n",
    "\n",
    "    return movie_categories_embed_layer"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### Text convolutional network for the Movie Title"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [],
   "source": [
    "def get_movie_cnn_layer(movie_titles):\n",
    "    # Look up the embedding vector of each word in the movie title\n",
    "    movie_title_embed_layer = tf.keras.layers.Embedding(movie_title_max, embed_dim, input_length=15, name='movie_title_embed_layer')(movie_titles)\n",
    "    sp = movie_title_embed_layer.shape\n",
    "    movie_title_embed_layer_expand = tf.keras.layers.Reshape([sp[1], sp[2], 1])(movie_title_embed_layer)\n",
    "    # Convolve the embedded text with kernels of different sizes, then max-pool\n",
    "    pool_layer_lst = []\n",
    "    for window_size in window_sizes:\n",
    "        conv_layer = tf.keras.layers.Conv2D(filter_num, (window_size, embed_dim), 1, activation='relu')(movie_title_embed_layer_expand)\n",
    "        maxpool_layer = tf.keras.layers.MaxPooling2D(pool_size=(sentences_size - window_size + 1 ,1), strides=1)(conv_layer)\n",
    "        pool_layer_lst.append(maxpool_layer)\n",
    "    # Concatenate the pooled outputs, then apply dropout\n",
    "    pool_layer = tf.keras.layers.concatenate(pool_layer_lst, 3, name=\"pool_layer\")\n",
    "    max_num = len(window_sizes) * filter_num\n",
    "    pool_layer_flat = tf.keras.layers.Reshape([1, max_num], name=\"pool_layer_flat\")(pool_layer)\n",
    "\n",
    "    dropout_layer = tf.keras.layers.Dropout(dropout_keep, name=\"dropout_layer\")(pool_layer_flat)\n",
    "    return pool_layer_flat, dropout_layer"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### Pass the Movie layers through fully connected layers to build the Movie feature"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [],
   "source": [
    "def get_movie_feature_layer(movie_id_embed_layer, movie_categories_embed_layer, dropout_layer):\n",
    "    # First fully connected layer\n",
    "    movie_id_fc_layer = tf.keras.layers.Dense(embed_dim, name=\"movie_id_fc_layer\", activation='relu')(movie_id_embed_layer)\n",
    "    movie_categories_fc_layer = tf.keras.layers.Dense(embed_dim, name=\"movie_categories_fc_layer\", activation='relu')(movie_categories_embed_layer)\n",
    "\n",
    "    # Second fully connected layer\n",
    "    movie_combine_layer = tf.keras.layers.concatenate([movie_id_fc_layer, movie_categories_fc_layer, dropout_layer], 2)  \n",
    "    movie_combine_layer = tf.keras.layers.Dense(200, activation='tanh')(movie_combine_layer)\n",
    "\n",
    "    movie_combine_layer_flat = tf.keras.layers.Reshape([200], name=\"movie_combine_layer_flat\")(movie_combine_layer)\n",
    "    return movie_combine_layer, movie_combine_layer_flat"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Building the Computational Graph"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {},
   "outputs": [],
   "source": [
    "import tensorflow as tf\n",
    "import datetime\n",
    "from tensorflow import keras\n",
    "from tensorflow.python.ops import summary_ops_v2\n",
    "import time\n",
    "\n",
    "MODEL_DIR = \"./models\"\n",
    "\n",
    "\n",
    "class mv_network(object):\n",
    "    def __init__(self, batch_size=256):\n",
    "        self.batch_size = batch_size\n",
    "        self.best_loss = 9999\n",
    "        self.losses = {'train': [], 'test': []}\n",
    "\n",
    "        # Get the model inputs\n",
    "        uid, user_gender, user_age, user_job, movie_id, movie_categories, movie_titles = get_inputs()\n",
    "        # Get the 4 User embedding vectors\n",
    "        uid_embed_layer, gender_embed_layer, age_embed_layer, job_embed_layer = get_user_embedding(uid, user_gender,\n",
    "                                                                                                   user_age, user_job)\n",
    "        # Build the user feature\n",
    "        user_combine_layer, user_combine_layer_flat = get_user_feature_layer(uid_embed_layer, gender_embed_layer,\n",
    "                                                                             age_embed_layer, job_embed_layer)\n",
    "        # Get the movie ID embedding vector\n",
    "        movie_id_embed_layer = get_movie_id_embed_layer(movie_id)\n",
    "        # Get the movie genre embedding vectors\n",
    "        movie_categories_embed_layer = get_movie_categories_layers(movie_categories)\n",
    "        # Get the movie title feature vector\n",
    "        pool_layer_flat, dropout_layer = get_movie_cnn_layer(movie_titles)\n",
    "        # Build the movie feature\n",
    "        movie_combine_layer, movie_combine_layer_flat = get_movie_feature_layer(movie_id_embed_layer,\n",
    "                                                                                movie_categories_embed_layer,\n",
    "                                                                                dropout_layer)\n",
    "        # Predict the rating\n",
    "        # Scheme 1: take the dot product of the user and movie features as the predicted rating\n",
    "        inference = tf.keras.layers.Lambda(lambda layer: \n",
    "            tf.reduce_sum(layer[0] * layer[1], axis=1), name=\"inference\")((user_combine_layer_flat, movie_combine_layer_flat))\n",
    "        inference = tf.keras.layers.Lambda(lambda layer: tf.expand_dims(layer, axis=1))(inference)\n",
    "        \n",
    "        # Scheme 2: feed the user and movie features into a fully connected layer that outputs a single value\n",
    "#         inference_layer = tf.keras.layers.concatenate([user_combine_layer_flat, movie_combine_layer_flat],\n",
    "#                                                       1)  # (?, 400)\n",
    "        # You can also enable this fully connected layer and compare the results\n",
    "        #inference_dense = tf.keras.layers.Dense(64, kernel_regularizer=tf.nn.l2_loss, activation='relu')(\n",
    "        #    inference_layer)\n",
    "#         inference = tf.keras.layers.Dense(1, name=\"inference\")(inference_layer)  # inference_dense\n",
    "\n",
    "        self.model = tf.keras.Model(\n",
    "            inputs=[uid, user_gender, user_age, user_job, movie_id, movie_categories, movie_titles],\n",
    "            outputs=[inference])\n",
    "\n",
    "        self.model.summary()\n",
    "\n",
    "        self.optimizer = tf.keras.optimizers.Adam(learning_rate)\n",
    "        # MSE loss: regress the predicted value to the rating\n",
    "        self.ComputeLoss = tf.keras.losses.MeanSquaredError()\n",
    "        self.ComputeMetrics = tf.keras.metrics.MeanAbsoluteError()\n",
    "\n",
    "        if tf.io.gfile.exists(MODEL_DIR):\n",
    "            #             print('Removing existing model dir: {}'.format(MODEL_DIR))\n",
    "            #             tf.io.gfile.rmtree(MODEL_DIR)\n",
    "            pass\n",
    "        else:\n",
    "            tf.io.gfile.makedirs(MODEL_DIR)\n",
    "\n",
    "        train_dir = os.path.join(MODEL_DIR, 'summaries', 'train')\n",
    "        test_dir = os.path.join(MODEL_DIR, 'summaries', 'eval')\n",
    "\n",
    "        #         self.train_summary_writer = summary_ops_v2.create_file_writer(train_dir, flush_millis=10000)\n",
    "        #         self.test_summary_writer = summary_ops_v2.create_file_writer(test_dir, flush_millis=10000, name='test')\n",
    "\n",
    "        checkpoint_dir = os.path.join(MODEL_DIR, 'checkpoints')\n",
    "        self.checkpoint_prefix = os.path.join(checkpoint_dir, 'ckpt')\n",
    "        self.checkpoint = tf.train.Checkpoint(model=self.model, optimizer=self.optimizer)\n",
    "\n",
    "        # Restore variables on creation if a checkpoint exists.\n",
    "        self.checkpoint.restore(tf.train.latest_checkpoint(checkpoint_dir))\n",
    "\n",
    "    def compute_loss(self, labels, logits):\n",
    "        return tf.reduce_mean(tf.keras.losses.mse(labels, logits))\n",
    "\n",
    "    def compute_metrics(self, labels, logits):\n",
    "        return tf.keras.metrics.mae(labels, logits)  #\n",
    "\n",
    "    @tf.function\n",
    "    def train_step(self, x, y):\n",
    "        # Record the operations used to compute the loss, so that the gradient\n",
    "        # of the loss with respect to the variables can be computed.\n",
    "        #         metrics = 0\n",
    "        with tf.GradientTape() as tape:\n",
    "            logits = self.model([x[0],\n",
    "                                 x[1],\n",
    "                                 x[2],\n",
    "                                 x[3],\n",
    "                                 x[4],\n",
    "                                 x[5],\n",
    "                                 x[6]], training=True)\n",
    "            loss = self.ComputeLoss(y, logits)\n",
    "            # loss = self.compute_loss(labels, logits)\n",
    "            self.ComputeMetrics(y, logits)\n",
    "            # metrics = self.compute_metrics(labels, logits)\n",
    "        grads = tape.gradient(loss, self.model.trainable_variables)\n",
    "        self.optimizer.apply_gradients(zip(grads, self.model.trainable_variables))\n",
    "        return loss, logits\n",
    "\n",
    "    def training(self, features, targets_values, epochs=5, log_freq=50):\n",
    "\n",
    "        for epoch_i in range(epochs):\n",
    "            # Split the data into training and test sets (random_state is fixed, so the split is identical every epoch)\n",
    "            train_X, test_X, train_y, test_y = train_test_split(features,\n",
    "                                                                targets_values,\n",
    "                                                                test_size=0.2,\n",
    "                                                                random_state=0)\n",
    "\n",
    "            train_batches = get_batches(train_X, train_y, self.batch_size)\n",
    "            batch_num = (len(train_X) // self.batch_size)\n",
    "\n",
    "            train_start = time.time()\n",
    "            #             with self.train_summary_writer.as_default():\n",
    "            if True:\n",
    "                start = time.time()\n",
    "                # Metrics are stateful. They accumulate values and return a cumulative\n",
    "                # result when you call .result(). Clear accumulated values with .reset_states()\n",
    "                avg_loss = tf.keras.metrics.Mean('loss', dtype=tf.float32)\n",
    "                #                 avg_mae = tf.keras.metrics.Mean('mae', dtype=tf.float32)\n",
    "\n",
    "                # Datasets can be iterated over like any other Python iterable.\n",
    "                for batch_i in range(batch_num):\n",
    "                    x, y = next(train_batches)\n",
    "                    categories = np.zeros([self.batch_size, 18])\n",
    "                    for i in range(self.batch_size):\n",
    "                        categories[i] = x.take(6, 1)[i]\n",
    "\n",
    "                    titles = np.zeros([self.batch_size, sentences_size])\n",
    "                    for i in range(self.batch_size):\n",
    "                        titles[i] = x.take(5, 1)[i]\n",
    "\n",
    "                    loss, logits = self.train_step([np.reshape(x.take(0, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                                    np.reshape(x.take(2, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                                    np.reshape(x.take(3, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                                    np.reshape(x.take(4, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                                    np.reshape(x.take(1, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                                    categories.astype(np.float32),\n",
    "                                                    titles.astype(np.float32)],\n",
    "                                                   np.reshape(y, [self.batch_size, 1]).astype(np.float32))\n",
    "                    avg_loss(loss)\n",
    "                    #                     avg_mae(metrics)\n",
    "                    self.losses['train'].append(loss)\n",
    "\n",
    "                    if tf.equal(self.optimizer.iterations % log_freq, 0):\n",
    "                        #                         summary_ops_v2.scalar('loss', avg_loss.result(), step=self.optimizer.iterations)\n",
    "                        #                         summary_ops_v2.scalar('mae', self.ComputeMetrics.result(), step=self.optimizer.iterations)\n",
    "                        # summary_ops_v2.scalar('mae', avg_mae.result(), step=self.optimizer.iterations)\n",
    "\n",
    "                        rate = log_freq / (time.time() - start)\n",
    "                        print('Step #{}\\tEpoch {:>3} Batch {:>4}/{}   Loss: {:0.6f} mae: {:0.6f} ({} steps/sec)'.format(\n",
    "                            self.optimizer.iterations.numpy(),\n",
    "                            epoch_i,\n",
    "                            batch_i,\n",
    "                            batch_num,\n",
    "                            loss, (self.ComputeMetrics.result()), rate))\n",
    "                        # print('Step #{}\\tLoss: {:0.6f} mae: {:0.6f} ({} steps/sec)'.format(\n",
    "                        #     self.optimizer.iterations.numpy(), loss, (avg_mae.result()), rate))\n",
    "                        avg_loss.reset_states()\n",
    "                        self.ComputeMetrics.reset_states()\n",
    "                        # avg_mae.reset_states()\n",
    "                        start = time.time()\n",
    "\n",
    "            train_end = time.time()\n",
    "            print(\n",
    "                '\\nTrain time for epoch #{} ({} total steps): {}'.format(epoch_i + 1, self.optimizer.iterations.numpy(),\n",
    "                                                                         train_end - train_start))\n",
    "            #             with self.test_summary_writer.as_default():\n",
    "            self.testing((test_X, test_y), self.optimizer.iterations)\n",
    "            # self.checkpoint.save(self.checkpoint_prefix)\n",
    "        self.export_path = os.path.join(MODEL_DIR, 'export')\n",
    "        tf.saved_model.save(self.model, self.export_path)\n",
    "\n",
    "    def testing(self, test_dataset, step_num):\n",
    "        test_X, test_y = test_dataset\n",
    "        test_batches = get_batches(test_X, test_y, self.batch_size)\n",
    "\n",
    "        \"\"\"Perform an evaluation of `model` on the examples from `dataset`.\"\"\"\n",
    "        avg_loss = tf.keras.metrics.Mean('loss', dtype=tf.float32)\n",
    "        #         avg_mae = tf.keras.metrics.Mean('mae', dtype=tf.float32)\n",
    "\n",
    "        batch_num = (len(test_X) // self.batch_size)\n",
    "        for batch_i in range(batch_num):\n",
    "            x, y = next(test_batches)\n",
    "            categories = np.zeros([self.batch_size, 18])\n",
    "            for i in range(self.batch_size):\n",
    "                categories[i] = x.take(6, 1)[i]\n",
    "\n",
    "            titles = np.zeros([self.batch_size, sentences_size])\n",
    "            for i in range(self.batch_size):\n",
    "                titles[i] = x.take(5, 1)[i]\n",
    "\n",
    "            logits = self.model([np.reshape(x.take(0, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                 np.reshape(x.take(2, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                 np.reshape(x.take(3, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                 np.reshape(x.take(4, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                 np.reshape(x.take(1, 1), [self.batch_size, 1]).astype(np.float32),\n",
    "                                 categories.astype(np.float32),\n",
    "                                 titles.astype(np.float32)], training=False)\n",
    "            test_loss = self.ComputeLoss(np.reshape(y, [self.batch_size, 1]).astype(np.float32), logits)\n",
    "            avg_loss(test_loss)\n",
    "            # Record the test loss\n",
    "            self.losses['test'].append(test_loss)\n",
    "            self.ComputeMetrics(np.reshape(y, [self.batch_size, 1]).astype(np.float32), logits)\n",
    "            # avg_loss(self.compute_loss(labels, logits))\n",
    "            # avg_mae(self.compute_metrics(labels, logits))\n",
    "\n",
    "        print('Model test set loss: {:0.6f} mae: {:0.6f}'.format(avg_loss.result(), self.ComputeMetrics.result()))\n",
    "        # print('Model test set loss: {:0.6f} mae: {:0.6f}'.format(avg_loss.result(), avg_mae.result()))\n",
    "        #         summary_ops_v2.scalar('loss', avg_loss.result(), step=step_num)\n",
    "        #         summary_ops_v2.scalar('mae', self.ComputeMetrics.result(), step=step_num)\n",
    "        # summary_ops_v2.scalar('mae', avg_mae.result(), step=step_num)\n",
    "\n",
    "        if avg_loss.result() < self.best_loss:\n",
    "            self.best_loss = avg_loss.result()\n",
    "            print(\"best loss = {}\".format(self.best_loss))\n",
    "            self.checkpoint.save(self.checkpoint_prefix)\n",
    "\n",
    "    def forward(self, xs):\n",
    "        predictions = self.model(xs)\n",
    "        # logits = tf.nn.softmax(predictions)\n",
    "\n",
    "        return predictions\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Getting Batches"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {},
   "outputs": [],
   "source": [
    "def get_batches(Xs, ys, batch_size):\n",
    "    for start in range(0, len(Xs), batch_size):\n",
    "        end = min(start + batch_size, len(Xs))\n",
    "        yield Xs[start:end], ys[start:end]"
   ]
  },
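  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick sanity check of the batching logic (toy arrays; the generator is repeated here so the cell runs standalone): the final batch can be shorter than `batch_size`, which is why the training loop above only consumes `len(train_X) // batch_size` full batches."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "def get_batches(Xs, ys, batch_size):\n",
    "    # Same generator as defined above, repeated so this check runs standalone\n",
    "    for start in range(0, len(Xs), batch_size):\n",
    "        end = min(start + batch_size, len(Xs))\n",
    "        yield Xs[start:end], ys[start:end]\n",
    "\n",
    "toy_X = np.arange(10)\n",
    "toy_y = np.arange(10) * 10\n",
    "\n",
    "# Batch sizes come out as [4, 4, 2]: the last batch is short\n",
    "batch_sizes = [len(bx) for bx, by in get_batches(toy_X, toy_y, batch_size=4)]"
   ]
  },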
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Training the Network"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Train with the scheme that feeds the user and movie features through a fully connected layer and outputs a single value"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 187,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Model: \"model_8\"\n",
      "__________________________________________________________________________________________________\n",
      "Layer (type)                    Output Shape         Param #     Connected to                     \n",
      "==================================================================================================\n",
      "movie_titles (InputLayer)       [(None, 15)]         0                                            \n",
      "__________________________________________________________________________________________________\n",
      "movie_title_embed_layer (Embedd (None, 15, 32)       166880      movie_titles[0][0]               \n",
      "__________________________________________________________________________________________________\n",
      "reshape_41 (Reshape)            (None, 15, 32, 1)    0           movie_title_embed_layer[0][0]    \n",
      "__________________________________________________________________________________________________\n",
      "conv2d_39 (Conv2D)              (None, 14, 1, 8)     520         reshape_41[0][0]                 \n",
      "__________________________________________________________________________________________________\n",
      "conv2d_40 (Conv2D)              (None, 13, 1, 8)     776         reshape_41[0][0]                 \n",
      "__________________________________________________________________________________________________\n",
      "conv2d_41 (Conv2D)              (None, 12, 1, 8)     1032        reshape_41[0][0]                 \n",
      "__________________________________________________________________________________________________\n",
      "conv2d_42 (Conv2D)              (None, 11, 1, 8)     1288        reshape_41[0][0]                 \n",
      "__________________________________________________________________________________________________\n",
      "movie_categories (InputLayer)   [(None, 18)]         0                                            \n",
      "__________________________________________________________________________________________________\n",
      "max_pooling2d_37 (MaxPooling2D) (None, 1, 1, 8)      0           conv2d_39[0][0]                  \n",
      "__________________________________________________________________________________________________\n",
      "max_pooling2d_38 (MaxPooling2D) (None, 1, 1, 8)      0           conv2d_40[0][0]                  \n",
      "__________________________________________________________________________________________________\n",
      "max_pooling2d_39 (MaxPooling2D) (None, 1, 1, 8)      0           conv2d_41[0][0]                  \n",
      "__________________________________________________________________________________________________\n",
      "max_pooling2d_40 (MaxPooling2D) (None, 1, 1, 8)      0           conv2d_42[0][0]                  \n",
      "__________________________________________________________________________________________________\n",
      "uid (InputLayer)                [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "user_gender (InputLayer)        [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "user_age (InputLayer)           [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "user_job (InputLayer)           [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "movie_id (InputLayer)           [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "movie_categories_embed_layer (E (None, 18, 32)       608         movie_categories[0][0]           \n",
      "__________________________________________________________________________________________________\n",
      "pool_layer (Concatenate)        (None, 1, 1, 32)     0           max_pooling2d_37[0][0]           \n",
      "                                                                 max_pooling2d_38[0][0]           \n",
      "                                                                 max_pooling2d_39[0][0]           \n",
      "                                                                 max_pooling2d_40[0][0]           \n",
      "__________________________________________________________________________________________________\n",
      "uid_embed_layer (Embedding)     (None, 1, 32)        193312      uid[0][0]                        \n",
      "__________________________________________________________________________________________________\n",
      "gender_embed_layer (Embedding)  (None, 1, 16)        32          user_gender[0][0]                \n",
      "__________________________________________________________________________________________________\n",
      "age_embed_layer (Embedding)     (None, 1, 16)        112         user_age[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "job_embed_layer (Embedding)     (None, 1, 16)        336         user_job[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "movie_id_embed_layer (Embedding (None, 1, 32)        126496      movie_id[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "reshape_40 (Reshape)            (None, 1, 576)       0           movie_categories_embed_layer[0][0\n",
      "__________________________________________________________________________________________________\n",
      "pool_layer_flat (Reshape)       (None, 1, 32)        0           pool_layer[0][0]                 \n",
      "__________________________________________________________________________________________________\n",
      "uid_fc_layer (Dense)            (None, 1, 32)        1056        uid_embed_layer[0][0]            \n",
      "__________________________________________________________________________________________________\n",
      "gender_fc_layer (Dense)         (None, 1, 32)        544         gender_embed_layer[0][0]         \n",
      "__________________________________________________________________________________________________\n",
      "age_fc_layer (Dense)            (None, 1, 32)        544         age_embed_layer[0][0]            \n",
      "__________________________________________________________________________________________________\n",
      "job_fc_layer (Dense)            (None, 1, 32)        544         job_embed_layer[0][0]            \n",
      "__________________________________________________________________________________________________\n",
      "movie_id_fc_layer (Dense)       (None, 1, 32)        1056        movie_id_embed_layer[0][0]       \n",
      "__________________________________________________________________________________________________\n",
      "movie_categories_fc_layer (Dens (None, 1, 32)        18464       reshape_40[0][0]                 \n",
      "__________________________________________________________________________________________________\n",
      "dropout_layer (Dropout)         (None, 1, 32)        0           pool_layer_flat[0][0]            \n",
      "__________________________________________________________________________________________________\n",
      "concatenate_29 (Concatenate)    (None, 1, 128)       0           uid_fc_layer[0][0]               \n",
      "                                                                 gender_fc_layer[0][0]            \n",
      "                                                                 age_fc_layer[0][0]               \n",
      "                                                                 job_fc_layer[0][0]               \n",
      "__________________________________________________________________________________________________\n",
      "concatenate_30 (Concatenate)    (None, 1, 96)        0           movie_id_fc_layer[0][0]          \n",
      "                                                                 movie_categories_fc_layer[0][0]  \n",
      "                                                                 dropout_layer[0][0]              \n",
      "__________________________________________________________________________________________________\n",
      "dense_24 (Dense)                (None, 1, 200)       25800       concatenate_29[0][0]             \n",
      "__________________________________________________________________________________________________\n",
      "dense_25 (Dense)                (None, 1, 200)       19400       concatenate_30[0][0]             \n",
      "__________________________________________________________________________________________________\n",
      "user_combine_layer_flat (Reshap (None, 200)          0           dense_24[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "movie_combine_layer_flat (Resha (None, 200)          0           dense_25[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "concatenate_31 (Concatenate)    (None, 400)          0           user_combine_layer_flat[0][0]    \n",
      "                                                                 movie_combine_layer_flat[0][0]   \n",
      "__________________________________________________________________________________________________\n",
      "inference (Dense)               (None, 1)            401         concatenate_31[0][0]             \n",
      "==================================================================================================\n",
      "Total params: 559,201\n",
      "Trainable params: 559,201\n",
      "Non-trainable params: 0\n",
      "__________________________________________________________________________________________________\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #15650\tEpoch   0 Batch   24/3125   Loss: 0.719377 mae: 0.712000 (3.5135560911641575 steps/sec)\n",
      "Step #15700\tEpoch   0 Batch   74/3125   Loss: 0.866670 mae: 0.709394 (25.830771074540085 steps/sec)\n",
      "Step #15750\tEpoch   0 Batch  124/3125   Loss: 0.700602 mae: 0.718584 (26.19073666941918 steps/sec)\n",
      "Step #15800\tEpoch   0 Batch  174/3125   Loss: 0.746907 mae: 0.717255 (26.54184004517734 steps/sec)\n",
      "Step #15850\tEpoch   0 Batch  224/3125   Loss: 0.604826 mae: 0.712918 (26.49214363385736 steps/sec)\n",
      "Step #15900\tEpoch   0 Batch  274/3125   Loss: 0.818413 mae: 0.713236 (25.899945523901128 steps/sec)\n",
      "Step #15950\tEpoch   0 Batch  324/3125   Loss: 0.787888 mae: 0.715860 (25.442358006871 steps/sec)\n",
      "Step #16000\tEpoch   0 Batch  374/3125   Loss: 0.797755 mae: 0.717225 (24.36439334037065 steps/sec)\n",
      "Step #16050\tEpoch   0 Batch  424/3125   Loss: 0.872372 mae: 0.720618 (24.813824025352993 steps/sec)\n",
      "Step #16100\tEpoch   0 Batch  474/3125   Loss: 0.793410 mae: 0.715396 (25.03474707128417 steps/sec)\n",
      "Step #16150\tEpoch   0 Batch  524/3125   Loss: 0.830577 mae: 0.709085 (25.205638827091274 steps/sec)\n",
      "Step #16200\tEpoch   0 Batch  574/3125   Loss: 0.895504 mae: 0.719491 (25.81138109557371 steps/sec)\n",
      "Step #16250\tEpoch   0 Batch  624/3125   Loss: 0.645105 mae: 0.714722 (26.149956700456315 steps/sec)\n",
      "Step #16300\tEpoch   0 Batch  674/3125   Loss: 0.847390 mae: 0.718730 (26.61955968067475 steps/sec)\n",
      "Step #16350\tEpoch   0 Batch  724/3125   Loss: 0.664384 mae: 0.715354 (25.825646542594328 steps/sec)\n",
      "Step #16400\tEpoch   0 Batch  774/3125   Loss: 0.768167 mae: 0.709734 (26.564500189117425 steps/sec)\n",
      "Step #16450\tEpoch   0 Batch  824/3125   Loss: 0.825303 mae: 0.713202 (26.768330446086082 steps/sec)\n",
      "Step #16500\tEpoch   0 Batch  874/3125   Loss: 0.770639 mae: 0.714686 (26.503995911325095 steps/sec)\n",
      "Step #16550\tEpoch   0 Batch  924/3125   Loss: 0.930788 mae: 0.717562 (26.16838942928596 steps/sec)\n",
      "Step #16600\tEpoch   0 Batch  974/3125   Loss: 0.735224 mae: 0.715772 (24.555650463883154 steps/sec)\n",
      "Step #16650\tEpoch   0 Batch 1024/3125   Loss: 0.818257 mae: 0.712219 (24.033952688977468 steps/sec)\n",
      "Step #16700\tEpoch   0 Batch 1074/3125   Loss: 0.852923 mae: 0.713759 (24.107002041992114 steps/sec)\n",
      "Step #16750\tEpoch   0 Batch 1124/3125   Loss: 0.836703 mae: 0.717131 (25.62833131938823 steps/sec)\n",
      "Step #16800\tEpoch   0 Batch 1174/3125   Loss: 0.752721 mae: 0.717368 (25.591562157057346 steps/sec)\n",
      "Step #16850\tEpoch   0 Batch 1224/3125   Loss: 0.754375 mae: 0.710149 (25.717111430899624 steps/sec)\n",
      "Step #16900\tEpoch   0 Batch 1274/3125   Loss: 0.937837 mae: 0.727498 (25.200713884358617 steps/sec)\n",
      "Step #16950\tEpoch   0 Batch 1324/3125   Loss: 0.769516 mae: 0.712292 (23.32660429411993 steps/sec)\n",
      "Step #17000\tEpoch   0 Batch 1374/3125   Loss: 0.796065 mae: 0.711374 (23.1871597776936 steps/sec)\n",
      "Step #17050\tEpoch   0 Batch 1424/3125   Loss: 0.816951 mae: 0.714958 (26.39631349398297 steps/sec)\n",
      "Step #17100\tEpoch   0 Batch 1474/3125   Loss: 0.723825 mae: 0.715634 (23.630315689178293 steps/sec)\n",
      "Step #17150\tEpoch   0 Batch 1524/3125   Loss: 0.740833 mae: 0.711555 (22.133490631204143 steps/sec)\n",
      "Step #17200\tEpoch   0 Batch 1574/3125   Loss: 0.790521 mae: 0.720749 (25.570986045496472 steps/sec)\n",
      "Step #17250\tEpoch   0 Batch 1624/3125   Loss: 0.904814 mae: 0.709239 (24.59887791735681 steps/sec)\n",
      "Step #17300\tEpoch   0 Batch 1674/3125   Loss: 0.772584 mae: 0.716537 (21.178525259443376 steps/sec)\n",
      "Step #17350\tEpoch   0 Batch 1724/3125   Loss: 0.914185 mae: 0.713413 (22.73573579629043 steps/sec)\n",
      "Step #17400\tEpoch   0 Batch 1774/3125   Loss: 0.753734 mae: 0.713978 (22.284230538892885 steps/sec)\n",
      "Step #17450\tEpoch   0 Batch 1824/3125   Loss: 0.739516 mae: 0.717396 (22.831842548567497 steps/sec)\n",
      "Step #17500\tEpoch   0 Batch 1874/3125   Loss: 0.897828 mae: 0.715829 (23.83359379430474 steps/sec)\n",
      "Step #17550\tEpoch   0 Batch 1924/3125   Loss: 0.997852 mae: 0.713268 (24.565343822533233 steps/sec)\n",
      "Step #17600\tEpoch   0 Batch 1974/3125   Loss: 0.822853 mae: 0.710993 (24.81995592853933 steps/sec)\n",
      "Step #17650\tEpoch   0 Batch 2024/3125   Loss: 0.866712 mae: 0.719618 (24.902167815859382 steps/sec)\n",
      "Step #17700\tEpoch   0 Batch 2074/3125   Loss: 0.886303 mae: 0.711236 (24.98539645685816 steps/sec)\n",
      "Step #17750\tEpoch   0 Batch 2124/3125   Loss: 0.653712 mae: 0.714852 (26.087445446733472 steps/sec)\n",
      "Step #17800\tEpoch   0 Batch 2174/3125   Loss: 0.719407 mae: 0.706014 (26.049238062221242 steps/sec)\n",
      "Step #17850\tEpoch   0 Batch 2224/3125   Loss: 0.712776 mae: 0.713036 (24.177003882796722 steps/sec)\n",
      "Step #17900\tEpoch   0 Batch 2274/3125   Loss: 0.838799 mae: 0.713682 (22.47477691890891 steps/sec)\n",
      "Step #17950\tEpoch   0 Batch 2324/3125   Loss: 0.689336 mae: 0.722462 (21.722472387166047 steps/sec)\n",
      "Step #18000\tEpoch   0 Batch 2374/3125   Loss: 0.782830 mae: 0.713400 (26.02545519528225 steps/sec)\n",
      "Step #18050\tEpoch   0 Batch 2424/3125   Loss: 0.770336 mae: 0.714173 (24.920105776221074 steps/sec)\n",
      "Step #18100\tEpoch   0 Batch 2474/3125   Loss: 0.882002 mae: 0.708309 (26.340187565720097 steps/sec)\n",
      "Step #18150\tEpoch   0 Batch 2524/3125   Loss: 0.776840 mae: 0.719742 (26.21668740597617 steps/sec)\n",
      "Step #18200\tEpoch   0 Batch 2574/3125   Loss: 0.890810 mae: 0.720383 (26.52998410972927 steps/sec)\n",
      "Step #18250\tEpoch   0 Batch 2624/3125   Loss: 0.950805 mae: 0.718372 (26.13231513163215 steps/sec)\n",
      "Step #18300\tEpoch   0 Batch 2674/3125   Loss: 0.817639 mae: 0.715430 (25.197722300614796 steps/sec)\n",
      "Step #18350\tEpoch   0 Batch 2724/3125   Loss: 0.794958 mae: 0.706530 (25.910319726479997 steps/sec)\n",
      "Step #18400\tEpoch   0 Batch 2774/3125   Loss: 0.765923 mae: 0.714593 (26.21223749040704 steps/sec)\n",
      "Step #18450\tEpoch   0 Batch 2824/3125   Loss: 0.858583 mae: 0.722311 (25.907272522653013 steps/sec)\n",
      "Step #18500\tEpoch   0 Batch 2874/3125   Loss: 0.851600 mae: 0.707288 (26.178296856347988 steps/sec)\n",
      "Step #18550\tEpoch   0 Batch 2924/3125   Loss: 0.849690 mae: 0.713650 (26.156753784253727 steps/sec)\n",
      "Step #18600\tEpoch   0 Batch 2974/3125   Loss: 0.720053 mae: 0.715370 (26.02254230124847 steps/sec)\n",
      "Step #18650\tEpoch   0 Batch 3024/3125   Loss: 0.706705 mae: 0.707073 (26.692659045279026 steps/sec)\n",
      "Step #18700\tEpoch   0 Batch 3074/3125   Loss: 0.828910 mae: 0.706889 (26.222342091861073 steps/sec)\n",
      "Step #18750\tEpoch   0 Batch 3124/3125   Loss: 0.856351 mae: 0.709075 (26.500007139446275 steps/sec)\n",
      "\n",
      "Train time for epoch #1 (18750 total steps): 137.94437193870544\n",
      "Model test set loss: 0.833367 mae: 0.720429\n",
      "best loss = 0.8333674669265747\n",
      "Step #18800\tEpoch   1 Batch   49/3125   Loss: 0.917434 mae: 0.720041 (25.02074172949635 steps/sec)\n",
      "Step #18850\tEpoch   1 Batch   99/3125   Loss: 0.999174 mae: 0.706914 (26.21823113902519 steps/sec)\n",
      "Step #18900\tEpoch   1 Batch  149/3125   Loss: 0.778137 mae: 0.718909 (25.829581212995834 steps/sec)\n",
      "Step #18950\tEpoch   1 Batch  199/3125   Loss: 0.874622 mae: 0.715626 (25.933716148411225 steps/sec)\n",
      "Step #19000\tEpoch   1 Batch  249/3125   Loss: 0.820011 mae: 0.708413 (26.045194140616218 steps/sec)\n",
      "Step #19050\tEpoch   1 Batch  299/3125   Loss: 0.787620 mae: 0.715502 (26.087237759541118 steps/sec)\n",
      "Step #19100\tEpoch   1 Batch  349/3125   Loss: 0.901380 mae: 0.716539 (26.04467984379789 steps/sec)\n",
      "Step #19150\tEpoch   1 Batch  399/3125   Loss: 0.925291 mae: 0.716388 (25.52181762532866 steps/sec)\n",
      "Step #19200\tEpoch   1 Batch  449/3125   Loss: 0.802287 mae: 0.722406 (26.29992406557073 steps/sec)\n",
      "Step #19250\tEpoch   1 Batch  499/3125   Loss: 0.800231 mae: 0.709962 (25.86288779341675 steps/sec)\n",
      "Step #19300\tEpoch   1 Batch  549/3125   Loss: 0.789069 mae: 0.708930 (26.313711235416367 steps/sec)\n",
      "Step #19350\tEpoch   1 Batch  599/3125   Loss: 0.718871 mae: 0.723298 (26.11526644856126 steps/sec)\n",
      "Step #19400\tEpoch   1 Batch  649/3125   Loss: 0.780158 mae: 0.718432 (24.352612345426696 steps/sec)\n",
      "Step #19450\tEpoch   1 Batch  699/3125   Loss: 0.776582 mae: 0.717668 (25.604453899756734 steps/sec)\n",
      "Step #19500\tEpoch   1 Batch  749/3125   Loss: 0.783024 mae: 0.705990 (26.062323086996987 steps/sec)\n",
      "Step #19550\tEpoch   1 Batch  799/3125   Loss: 0.894920 mae: 0.716140 (25.94894232006573 steps/sec)\n",
      "Step #19600\tEpoch   1 Batch  849/3125   Loss: 0.943969 mae: 0.714423 (26.195539211767528 steps/sec)\n",
      "Step #19650\tEpoch   1 Batch  899/3125   Loss: 0.890416 mae: 0.713238 (26.268575653713828 steps/sec)\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #19700\tEpoch   1 Batch  949/3125   Loss: 0.705293 mae: 0.712794 (25.831477407840843 steps/sec)\n",
      "Step #19750\tEpoch   1 Batch  999/3125   Loss: 0.858537 mae: 0.712239 (25.8981224172447 steps/sec)\n",
      "Step #19800\tEpoch   1 Batch 1049/3125   Loss: 0.881791 mae: 0.718265 (26.16095643612056 steps/sec)\n",
      "Step #19850\tEpoch   1 Batch 1099/3125   Loss: 0.783116 mae: 0.712359 (26.222463407497806 steps/sec)\n",
      "Step #19900\tEpoch   1 Batch 1149/3125   Loss: 0.745023 mae: 0.717032 (26.26694373757906 steps/sec)\n",
      "Step #19950\tEpoch   1 Batch 1199/3125   Loss: 0.823196 mae: 0.710118 (26.31572210905037 steps/sec)\n",
      "Step #20000\tEpoch   1 Batch 1249/3125   Loss: 0.900386 mae: 0.719564 (25.691107466186548 steps/sec)\n",
      "Step #20050\tEpoch   1 Batch 1299/3125   Loss: 0.789442 mae: 0.722340 (26.12615564354447 steps/sec)\n",
      "Step #20100\tEpoch   1 Batch 1349/3125   Loss: 0.844146 mae: 0.702390 (26.74591829095639 steps/sec)\n",
      "Step #20150\tEpoch   1 Batch 1399/3125   Loss: 0.825736 mae: 0.717600 (26.40209912319155 steps/sec)\n",
      "Step #20200\tEpoch   1 Batch 1449/3125   Loss: 0.825512 mae: 0.715158 (25.800432067477328 steps/sec)\n",
      "Step #20250\tEpoch   1 Batch 1499/3125   Loss: 0.844271 mae: 0.714350 (25.830716987546236 steps/sec)\n",
      "Step #20300\tEpoch   1 Batch 1549/3125   Loss: 0.891946 mae: 0.714242 (26.30739994723848 steps/sec)\n",
      "Step #20350\tEpoch   1 Batch 1599/3125   Loss: 0.830004 mae: 0.713734 (25.873988865491164 steps/sec)\n",
      "Step #20400\tEpoch   1 Batch 1649/3125   Loss: 0.856429 mae: 0.716237 (26.417204881878178 steps/sec)\n",
      "Step #20450\tEpoch   1 Batch 1699/3125   Loss: 0.908601 mae: 0.710375 (25.554857867651027 steps/sec)\n",
      "Step #20500\tEpoch   1 Batch 1749/3125   Loss: 0.818334 mae: 0.712334 (26.473389166213064 steps/sec)\n",
      "Step #20550\tEpoch   1 Batch 1799/3125   Loss: 0.940058 mae: 0.717206 (26.12234811055193 steps/sec)\n",
      "Step #20600\tEpoch   1 Batch 1849/3125   Loss: 0.785987 mae: 0.716771 (26.125761821579836 steps/sec)\n",
      "Step #20650\tEpoch   1 Batch 1899/3125   Loss: 0.878827 mae: 0.712974 (26.279247327686985 steps/sec)\n",
      "Step #20700\tEpoch   1 Batch 1949/3125   Loss: 0.798978 mae: 0.709768 (26.24354017091728 steps/sec)\n",
      "Step #20750\tEpoch   1 Batch 1999/3125   Loss: 0.899784 mae: 0.717475 (26.59971684833578 steps/sec)\n",
      "Step #20800\tEpoch   1 Batch 2049/3125   Loss: 0.804912 mae: 0.722035 (25.869124791409064 steps/sec)\n",
      "Step #20850\tEpoch   1 Batch 2099/3125   Loss: 0.860509 mae: 0.707659 (25.986717375712693 steps/sec)\n",
      "Step #20900\tEpoch   1 Batch 2149/3125   Loss: 0.748080 mae: 0.705866 (26.45683389024906 steps/sec)\n",
      "Step #20950\tEpoch   1 Batch 2199/3125   Loss: 0.757900 mae: 0.709834 (25.262426508471336 steps/sec)\n",
      "Step #21000\tEpoch   1 Batch 2249/3125   Loss: 0.775950 mae: 0.711926 (23.524716268400137 steps/sec)\n",
      "Step #21050\tEpoch   1 Batch 2299/3125   Loss: 0.874092 mae: 0.721197 (24.092180086619873 steps/sec)\n",
      "Step #21100\tEpoch   1 Batch 2349/3125   Loss: 0.816846 mae: 0.715530 (24.659988452859565 steps/sec)\n",
      "Step #21150\tEpoch   1 Batch 2399/3125   Loss: 0.753096 mae: 0.716950 (21.35095421259835 steps/sec)\n",
      "Step #21200\tEpoch   1 Batch 2449/3125   Loss: 0.805619 mae: 0.704887 (25.110805221327862 steps/sec)\n",
      "Step #21250\tEpoch   1 Batch 2499/3125   Loss: 0.900877 mae: 0.718168 (24.557091039899277 steps/sec)\n",
      "Step #21300\tEpoch   1 Batch 2549/3125   Loss: 0.658890 mae: 0.720243 (25.7225589503111 steps/sec)\n",
      "Step #21350\tEpoch   1 Batch 2599/3125   Loss: 0.961057 mae: 0.716548 (24.073578768849096 steps/sec)\n",
      "Step #21400\tEpoch   1 Batch 2649/3125   Loss: 0.881076 mae: 0.713885 (23.08242635621714 steps/sec)\n",
      "Step #21450\tEpoch   1 Batch 2699/3125   Loss: 0.759248 mae: 0.710945 (23.53265668477065 steps/sec)\n",
      "Step #21500\tEpoch   1 Batch 2749/3125   Loss: 0.964585 mae: 0.712072 (25.809541847994172 steps/sec)\n",
      "Step #21550\tEpoch   1 Batch 2799/3125   Loss: 0.839885 mae: 0.715857 (26.188090789290808 steps/sec)\n",
      "Step #21600\tEpoch   1 Batch 2849/3125   Loss: 0.810115 mae: 0.712979 (26.108308886838334 steps/sec)\n",
      "Step #21650\tEpoch   1 Batch 2899/3125   Loss: 0.833338 mae: 0.710993 (26.320957104718815 steps/sec)\n",
      "Step #21700\tEpoch   1 Batch 2949/3125   Loss: 0.851426 mae: 0.714394 (25.611946197072633 steps/sec)\n",
      "Step #21750\tEpoch   1 Batch 2999/3125   Loss: 0.743832 mae: 0.711331 (25.748920668008985 steps/sec)\n",
      "Step #21800\tEpoch   1 Batch 3049/3125   Loss: 0.768872 mae: 0.703642 (25.767893933913154 steps/sec)\n",
      "Step #21850\tEpoch   1 Batch 3099/3125   Loss: 0.858301 mae: 0.708396 (25.81843238443211 steps/sec)\n",
      "\n",
      "Train time for epoch #2 (21875 total steps): 122.00151920318604\n",
      "Model test set loss: 0.833812 mae: 0.720137\n",
      "Step #21900\tEpoch   2 Batch   24/3125   Loss: 0.714828 mae: 0.719840 (43.296264754914866 steps/sec)\n",
      "Step #21950\tEpoch   2 Batch   74/3125   Loss: 0.861777 mae: 0.707867 (25.511308342091947 steps/sec)\n",
      "Step #22000\tEpoch   2 Batch  124/3125   Loss: 0.691228 mae: 0.717055 (25.696647878146187 steps/sec)\n",
      "Step #22050\tEpoch   2 Batch  174/3125   Loss: 0.744891 mae: 0.715433 (25.62235387733903 steps/sec)\n",
      "Step #22100\tEpoch   2 Batch  224/3125   Loss: 0.601366 mae: 0.711255 (26.389029402068186 steps/sec)\n",
      "Step #22150\tEpoch   2 Batch  274/3125   Loss: 0.817633 mae: 0.711639 (25.70016539069753 steps/sec)\n",
      "Step #22200\tEpoch   2 Batch  324/3125   Loss: 0.785614 mae: 0.714554 (26.27761408538415 steps/sec)\n",
      "Step #22250\tEpoch   2 Batch  374/3125   Loss: 0.795298 mae: 0.715842 (25.945516879013947 steps/sec)\n",
      "Step #22300\tEpoch   2 Batch  424/3125   Loss: 0.870885 mae: 0.719502 (25.94297486831214 steps/sec)\n",
      "Step #22350\tEpoch   2 Batch  474/3125   Loss: 0.793392 mae: 0.714597 (26.150572988977153 steps/sec)\n",
      "Step #22400\tEpoch   2 Batch  524/3125   Loss: 0.827058 mae: 0.708406 (26.248302964316284 steps/sec)\n",
      "Step #22450\tEpoch   2 Batch  574/3125   Loss: 0.895391 mae: 0.719130 (26.017005725435407 steps/sec)\n",
      "Step #22500\tEpoch   2 Batch  624/3125   Loss: 0.641685 mae: 0.714070 (26.291381164048335 steps/sec)\n",
      "Step #22550\tEpoch   2 Batch  674/3125   Loss: 0.847071 mae: 0.718132 (26.0298483750211 steps/sec)\n",
      "Step #22600\tEpoch   2 Batch  724/3125   Loss: 0.663462 mae: 0.714632 (25.163161737488174 steps/sec)\n",
      "Step #22650\tEpoch   2 Batch  774/3125   Loss: 0.768797 mae: 0.709454 (24.50930023715149 steps/sec)\n",
      "Step #22700\tEpoch   2 Batch  824/3125   Loss: 0.818912 mae: 0.712923 (25.866221262145572 steps/sec)\n",
      "Step #22750\tEpoch   2 Batch  874/3125   Loss: 0.767478 mae: 0.714189 (25.928903327684697 steps/sec)\n",
      "Step #22800\tEpoch   2 Batch  924/3125   Loss: 0.931454 mae: 0.716871 (25.89400378096668 steps/sec)\n",
      "Step #22850\tEpoch   2 Batch  974/3125   Loss: 0.733199 mae: 0.714793 (25.367226588152104 steps/sec)\n",
      "Step #22900\tEpoch   2 Batch 1024/3125   Loss: 0.814897 mae: 0.710881 (25.481448432117517 steps/sec)\n",
      "Step #22950\tEpoch   2 Batch 1074/3125   Loss: 0.849924 mae: 0.712226 (25.87977132183723 steps/sec)\n",
      "Step #23000\tEpoch   2 Batch 1124/3125   Loss: 0.833062 mae: 0.714846 (26.185746247169035 steps/sec)\n",
      "Step #23050\tEpoch   2 Batch 1174/3125   Loss: 0.745402 mae: 0.715470 (25.694629758277248 steps/sec)\n",
      "Step #23100\tEpoch   2 Batch 1224/3125   Loss: 0.748312 mae: 0.708029 (25.76106008476089 steps/sec)\n",
      "Step #23150\tEpoch   2 Batch 1274/3125   Loss: 0.933713 mae: 0.725487 (25.662596435788565 steps/sec)\n",
      "Step #23200\tEpoch   2 Batch 1324/3125   Loss: 0.762547 mae: 0.710444 (25.735074585197722 steps/sec)\n",
      "Step #23250\tEpoch   2 Batch 1374/3125   Loss: 0.791072 mae: 0.709305 (25.745418243106297 steps/sec)\n",
      "Step #23300\tEpoch   2 Batch 1424/3125   Loss: 0.814867 mae: 0.713429 (26.70043128829474 steps/sec)\n",
      "Step #23350\tEpoch   2 Batch 1474/3125   Loss: 0.720931 mae: 0.714045 (25.94978678047614 steps/sec)\n",
      "Step #23400\tEpoch   2 Batch 1524/3125   Loss: 0.741626 mae: 0.710122 (25.911017611419492 steps/sec)\n",
      "Step #23450\tEpoch   2 Batch 1574/3125   Loss: 0.789810 mae: 0.719868 (25.452297755978822 steps/sec)\n",
      "Step #23500\tEpoch   2 Batch 1624/3125   Loss: 0.899230 mae: 0.708129 (25.57514914706901 steps/sec)\n",
      "Step #23550\tEpoch   2 Batch 1674/3125   Loss: 0.766033 mae: 0.715204 (25.39752050001756 steps/sec)\n",
      "Step #23600\tEpoch   2 Batch 1724/3125   Loss: 0.910642 mae: 0.712362 (25.135707400820632 steps/sec)\n",
      "Step #23650\tEpoch   2 Batch 1774/3125   Loss: 0.753186 mae: 0.712970 (25.068047584479594 steps/sec)\n",
      "Step #23700\tEpoch   2 Batch 1824/3125   Loss: 0.740267 mae: 0.716445 (22.664472215987033 steps/sec)\n",
      "Step #23750\tEpoch   2 Batch 1874/3125   Loss: 0.896657 mae: 0.715039 (23.823248623986192 steps/sec)\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #23800\tEpoch   2 Batch 1924/3125   Loss: 1.000547 mae: 0.713107 (26.06307129230903 steps/sec)\n",
      "Step #23850\tEpoch   2 Batch 1974/3125   Loss: 0.820561 mae: 0.710344 (24.72029312347086 steps/sec)\n",
      "Step #23900\tEpoch   2 Batch 2024/3125   Loss: 0.864284 mae: 0.719297 (25.093330372287674 steps/sec)\n",
      "Step #23950\tEpoch   2 Batch 2074/3125   Loss: 0.889233 mae: 0.710738 (26.36612054491062 steps/sec)\n",
      "Step #24000\tEpoch   2 Batch 2124/3125   Loss: 0.654758 mae: 0.714532 (25.1849736465619 steps/sec)\n",
      "Step #24050\tEpoch   2 Batch 2174/3125   Loss: 0.718836 mae: 0.705476 (23.854425073244386 steps/sec)\n",
      "Step #24100\tEpoch   2 Batch 2224/3125   Loss: 0.710718 mae: 0.712394 (25.755327386416738 steps/sec)\n",
      "Step #24150\tEpoch   2 Batch 2274/3125   Loss: 0.838033 mae: 0.713050 (25.50518063035918 steps/sec)\n",
      "Step #24200\tEpoch   2 Batch 2324/3125   Loss: 0.688421 mae: 0.721697 (25.327139643991682 steps/sec)\n",
      "Step #24250\tEpoch   2 Batch 2374/3125   Loss: 0.779463 mae: 0.712504 (26.03148327489811 steps/sec)\n",
      "Step #24300\tEpoch   2 Batch 2424/3125   Loss: 0.767461 mae: 0.713174 (25.593486032912764 steps/sec)\n",
      "Step #24350\tEpoch   2 Batch 2474/3125   Loss: 0.881240 mae: 0.707282 (25.9465858277326 steps/sec)\n",
      "Step #24400\tEpoch   2 Batch 2524/3125   Loss: 0.775806 mae: 0.718691 (25.585604973959352 steps/sec)\n",
      "Step #24450\tEpoch   2 Batch 2574/3125   Loss: 0.888655 mae: 0.719029 (25.843213918488747 steps/sec)\n",
      "Step #24500\tEpoch   2 Batch 2624/3125   Loss: 0.947004 mae: 0.717068 (26.000742151463122 steps/sec)\n",
      "Step #24550\tEpoch   2 Batch 2674/3125   Loss: 0.812928 mae: 0.714299 (26.490517284283406 steps/sec)\n",
      "Step #24600\tEpoch   2 Batch 2724/3125   Loss: 0.794508 mae: 0.705165 (26.3278632175188 steps/sec)\n",
      "Step #24650\tEpoch   2 Batch 2774/3125   Loss: 0.764899 mae: 0.713668 (26.13534710883256 steps/sec)\n",
      "Step #24700\tEpoch   2 Batch 2824/3125   Loss: 0.856916 mae: 0.721073 (26.000323089799746 steps/sec)\n",
      "Step #24750\tEpoch   2 Batch 2874/3125   Loss: 0.849404 mae: 0.706221 (26.17566655741167 steps/sec)\n",
      "Step #24800\tEpoch   2 Batch 2924/3125   Loss: 0.849582 mae: 0.712179 (25.743297652950996 steps/sec)\n",
      "Step #24850\tEpoch   2 Batch 2974/3125   Loss: 0.715426 mae: 0.713899 (26.14141317441703 steps/sec)\n",
      "Step #24900\tEpoch   2 Batch 3024/3125   Loss: 0.703302 mae: 0.705516 (26.028368741512228 steps/sec)\n",
      "Step #24950\tEpoch   2 Batch 3074/3125   Loss: 0.829947 mae: 0.705282 (25.833634827079326 steps/sec)\n",
      "Step #25000\tEpoch   2 Batch 3124/3125   Loss: 0.853463 mae: 0.707818 (25.00109975407234 steps/sec)\n",
      "\n",
      "Train time for epoch #3 (25000 total steps): 122.12160325050354\n",
      "Model test set loss: 0.834304 mae: 0.720539\n",
      "Step #25050\tEpoch   3 Batch   49/3125   Loss: 0.913244 mae: 0.720053 (23.64072845591803 steps/sec)\n",
      "Step #25100\tEpoch   3 Batch   99/3125   Loss: 0.997891 mae: 0.705465 (26.102186699379367 steps/sec)\n",
      "Step #25150\tEpoch   3 Batch  149/3125   Loss: 0.775702 mae: 0.716634 (26.921914647412667 steps/sec)\n",
      "Step #25200\tEpoch   3 Batch  199/3125   Loss: 0.872658 mae: 0.713613 (25.56349587710323 steps/sec)\n",
      "Step #25250\tEpoch   3 Batch  249/3125   Loss: 0.817473 mae: 0.706612 (26.105653636570164 steps/sec)\n",
      "Step #25300\tEpoch   3 Batch  299/3125   Loss: 0.786056 mae: 0.713442 (26.502391550583358 steps/sec)\n",
      "Step #25350\tEpoch   3 Batch  349/3125   Loss: 0.898996 mae: 0.715145 (25.65253571676271 steps/sec)\n",
      "Step #25400\tEpoch   3 Batch  399/3125   Loss: 0.921526 mae: 0.714635 (26.45687394257445 steps/sec)\n",
      "Step #25450\tEpoch   3 Batch  449/3125   Loss: 0.801972 mae: 0.721073 (26.334141340676055 steps/sec)\n",
      "Step #25500\tEpoch   3 Batch  499/3125   Loss: 0.800990 mae: 0.709175 (25.68409723389952 steps/sec)\n",
      "Step #25550\tEpoch   3 Batch  549/3125   Loss: 0.789415 mae: 0.708260 (26.59738573116266 steps/sec)\n",
      "Step #25600\tEpoch   3 Batch  599/3125   Loss: 0.718824 mae: 0.722507 (26.231391383768837 steps/sec)\n",
      "Step #25650\tEpoch   3 Batch  649/3125   Loss: 0.781388 mae: 0.717524 (26.498038314371826 steps/sec)\n",
      "Step #25700\tEpoch   3 Batch  699/3125   Loss: 0.771484 mae: 0.716912 (25.392157503752852 steps/sec)\n",
      "Step #25750\tEpoch   3 Batch  749/3125   Loss: 0.780235 mae: 0.705380 (25.585736077067388 steps/sec)\n",
      "Step #25800\tEpoch   3 Batch  799/3125   Loss: 0.895378 mae: 0.715844 (26.203486247977718 steps/sec)\n",
      "Step #25850\tEpoch   3 Batch  849/3125   Loss: 0.943989 mae: 0.714663 (26.109511564666843 steps/sec)\n",
      "Step #25900\tEpoch   3 Batch  899/3125   Loss: 0.882034 mae: 0.712891 (26.005165687523895 steps/sec)\n",
      "Step #25950\tEpoch   3 Batch  949/3125   Loss: 0.704146 mae: 0.712687 (26.376861685335403 steps/sec)\n",
      "Step #26000\tEpoch   3 Batch  999/3125   Loss: 0.858340 mae: 0.711577 (25.663902867516065 steps/sec)\n",
      "Step #26050\tEpoch   3 Batch 1049/3125   Loss: 0.879648 mae: 0.717456 (26.01718001901337 steps/sec)\n",
      "Step #26100\tEpoch   3 Batch 1099/3125   Loss: 0.776687 mae: 0.710944 (26.013536478084472 steps/sec)\n",
      "Step #26150\tEpoch   3 Batch 1149/3125   Loss: 0.742710 mae: 0.715269 (25.95526908229959 steps/sec)\n",
      "Step #26200\tEpoch   3 Batch 1199/3125   Loss: 0.821301 mae: 0.708809 (25.970964411459732 steps/sec)\n",
      "Step #26250\tEpoch   3 Batch 1249/3125   Loss: 0.897352 mae: 0.717444 (26.22338478721272 steps/sec)\n",
      "Step #26300\tEpoch   3 Batch 1299/3125   Loss: 0.783081 mae: 0.720149 (26.304199248208477 steps/sec)\n",
      "Step #26350\tEpoch   3 Batch 1349/3125   Loss: 0.838622 mae: 0.700307 (26.56941723187516 steps/sec)\n",
      "Step #26400\tEpoch   3 Batch 1399/3125   Loss: 0.822658 mae: 0.715538 (26.28208293407598 steps/sec)\n",
      "Step #26450\tEpoch   3 Batch 1449/3125   Loss: 0.821051 mae: 0.713375 (26.089739961247663 steps/sec)\n",
      "Step #26500\tEpoch   3 Batch 1499/3125   Loss: 0.844656 mae: 0.712191 (26.149249145753355 steps/sec)\n",
      "Step #26550\tEpoch   3 Batch 1549/3125   Loss: 0.888504 mae: 0.712597 (26.222535541651798 steps/sec)\n",
      "Step #26600\tEpoch   3 Batch 1599/3125   Loss: 0.830420 mae: 0.712427 (24.71080904654062 steps/sec)\n",
      "Step #26650\tEpoch   3 Batch 1649/3125   Loss: 0.855207 mae: 0.714612 (25.29193057753682 steps/sec)\n",
      "Step #26700\tEpoch   3 Batch 1699/3125   Loss: 0.903514 mae: 0.708810 (25.795760601203074 steps/sec)\n",
      "Step #26750\tEpoch   3 Batch 1749/3125   Loss: 0.817164 mae: 0.711081 (25.342415016316696 steps/sec)\n",
      "Step #26800\tEpoch   3 Batch 1799/3125   Loss: 0.935223 mae: 0.716191 (25.839994627839253 steps/sec)\n",
      "Step #26850\tEpoch   3 Batch 1849/3125   Loss: 0.783474 mae: 0.715481 (25.734101938188395 steps/sec)\n",
      "Step #26900\tEpoch   3 Batch 1899/3125   Loss: 0.881175 mae: 0.712567 (25.894256361474906 steps/sec)\n",
      "Step #26950\tEpoch   3 Batch 1949/3125   Loss: 0.798994 mae: 0.708969 (26.208758564718945 steps/sec)\n",
      "Step #27000\tEpoch   3 Batch 1999/3125   Loss: 0.899743 mae: 0.716902 (25.834783688481455 steps/sec)\n",
      "Step #27050\tEpoch   3 Batch 2049/3125   Loss: 0.804864 mae: 0.721371 (26.65921567839799 steps/sec)\n",
      "Step #27100\tEpoch   3 Batch 2099/3125   Loss: 0.857625 mae: 0.707497 (25.79817228830056 steps/sec)\n",
      "Step #27150\tEpoch   3 Batch 2149/3125   Loss: 0.746309 mae: 0.705422 (25.593251779316002 steps/sec)\n",
      "Step #27200\tEpoch   3 Batch 2199/3125   Loss: 0.761209 mae: 0.709551 (22.22798343737791 steps/sec)\n",
      "Step #27250\tEpoch   3 Batch 2249/3125   Loss: 0.773264 mae: 0.711333 (24.280871695372056 steps/sec)\n",
      "Step #27300\tEpoch   3 Batch 2299/3125   Loss: 0.870998 mae: 0.720760 (23.464758531904053 steps/sec)\n",
      "Step #27350\tEpoch   3 Batch 2349/3125   Loss: 0.816462 mae: 0.714961 (25.629512107875193 steps/sec)\n",
      "Step #27400\tEpoch   3 Batch 2399/3125   Loss: 0.749725 mae: 0.716216 (26.188950886975583 steps/sec)\n",
      "Step #27450\tEpoch   3 Batch 2449/3125   Loss: 0.801956 mae: 0.703993 (26.517867157011445 steps/sec)\n",
      "Step #27500\tEpoch   3 Batch 2499/3125   Loss: 0.896887 mae: 0.717282 (25.303871803008494 steps/sec)\n",
      "Step #27550\tEpoch   3 Batch 2549/3125   Loss: 0.657917 mae: 0.719300 (25.28930764293121 steps/sec)\n",
      "Step #27600\tEpoch   3 Batch 2599/3125   Loss: 0.956115 mae: 0.715165 (25.55015660976798 steps/sec)\n",
      "Step #27650\tEpoch   3 Batch 2649/3125   Loss: 0.876868 mae: 0.712736 (26.372397030610387 steps/sec)\n",
      "Step #27700\tEpoch   3 Batch 2699/3125   Loss: 0.752767 mae: 0.709536 (25.811997413325606 steps/sec)\n",
      "Step #27750\tEpoch   3 Batch 2749/3125   Loss: 0.961004 mae: 0.711070 (25.944496163350642 steps/sec)\n",
      "Step #27800\tEpoch   3 Batch 2799/3125   Loss: 0.835401 mae: 0.714867 (25.816096357350617 steps/sec)\n",
      "Step #27850\tEpoch   3 Batch 2849/3125   Loss: 0.807629 mae: 0.711938 (26.20102765051287 steps/sec)\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #27900\tEpoch   3 Batch 2899/3125   Loss: 0.828528 mae: 0.710024 (25.986971768199133 steps/sec)\n",
      "Step #27950\tEpoch   3 Batch 2949/3125   Loss: 0.848071 mae: 0.713094 (24.17492199051817 steps/sec)\n",
      "Step #28000\tEpoch   3 Batch 2999/3125   Loss: 0.742095 mae: 0.710113 (24.281360861792685 steps/sec)\n",
      "Step #28050\tEpoch   3 Batch 3049/3125   Loss: 0.765247 mae: 0.702206 (26.061115036633282 steps/sec)\n",
      "Step #28100\tEpoch   3 Batch 3099/3125   Loss: 0.851693 mae: 0.707014 (26.171730265268764 steps/sec)\n",
      "\n",
      "Train time for epoch #4 (28125 total steps): 121.49230313301086\n",
      "Model test set loss: 0.834679 mae: 0.720304\n",
      "Step #28150\tEpoch   4 Batch   24/3125   Loss: 0.713022 mae: 0.719945 (49.21506834109367 steps/sec)\n",
      "Step #28200\tEpoch   4 Batch   74/3125   Loss: 0.858430 mae: 0.706423 (26.073550679230795 steps/sec)\n",
      "Step #28250\tEpoch   4 Batch  124/3125   Loss: 0.681727 mae: 0.715435 (26.440826013567182 steps/sec)\n",
      "Step #28300\tEpoch   4 Batch  174/3125   Loss: 0.743057 mae: 0.713319 (25.724357422196512 steps/sec)\n",
      "Step #28350\tEpoch   4 Batch  224/3125   Loss: 0.598413 mae: 0.709311 (26.32005528191543 steps/sec)\n",
      "Step #28400\tEpoch   4 Batch  274/3125   Loss: 0.812796 mae: 0.709514 (26.85545653208111 steps/sec)\n",
      "Step #28450\tEpoch   4 Batch  324/3125   Loss: 0.782939 mae: 0.712544 (26.17380769598948 steps/sec)\n",
      "Step #28500\tEpoch   4 Batch  374/3125   Loss: 0.790225 mae: 0.713941 (25.84831356600388 steps/sec)\n",
      "Step #28550\tEpoch   4 Batch  424/3125   Loss: 0.870381 mae: 0.717945 (26.199960546743675 steps/sec)\n",
      "Step #28600\tEpoch   4 Batch  474/3125   Loss: 0.789495 mae: 0.713043 (25.488017003419547 steps/sec)\n",
      "Step #28650\tEpoch   4 Batch  524/3125   Loss: 0.824110 mae: 0.707150 (26.12615564354447 steps/sec)\n",
      "Step #28700\tEpoch   4 Batch  574/3125   Loss: 0.895778 mae: 0.718112 (25.735914657694806 steps/sec)\n",
      "Step #28750\tEpoch   4 Batch  624/3125   Loss: 0.639144 mae: 0.712889 (24.589806792298145 steps/sec)\n",
      "Step #28800\tEpoch   4 Batch  674/3125   Loss: 0.842407 mae: 0.716974 (25.924746052943718 steps/sec)\n",
      "Step #28850\tEpoch   4 Batch  724/3125   Loss: 0.664039 mae: 0.713647 (25.047415848215778 steps/sec)\n",
      "Step #28900\tEpoch   4 Batch  774/3125   Loss: 0.769784 mae: 0.708743 (25.340645055736317 steps/sec)\n",
      "Step #28950\tEpoch   4 Batch  824/3125   Loss: 0.819740 mae: 0.712833 (26.46114354830203 steps/sec)\n",
      "Step #29000\tEpoch   4 Batch  874/3125   Loss: 0.768802 mae: 0.714241 (25.115150667407413 steps/sec)\n",
      "Step #29050\tEpoch   4 Batch  924/3125   Loss: 0.931935 mae: 0.716747 (25.752291229076246 steps/sec)\n",
      "Step #29100\tEpoch   4 Batch  974/3125   Loss: 0.738546 mae: 0.714838 (26.32511355569768 steps/sec)\n",
      "Step #29150\tEpoch   4 Batch 1024/3125   Loss: 0.818416 mae: 0.710536 (25.884633011018746 steps/sec)\n",
      "Step #29200\tEpoch   4 Batch 1074/3125   Loss: 0.847923 mae: 0.711977 (26.503138441753684 steps/sec)\n",
      "Step #29250\tEpoch   4 Batch 1124/3125   Loss: 0.831988 mae: 0.714045 (24.722341777576116 steps/sec)\n",
      "Step #29300\tEpoch   4 Batch 1174/3125   Loss: 0.743998 mae: 0.714664 (24.557617280470872 steps/sec)\n",
      "Step #29350\tEpoch   4 Batch 1224/3125   Loss: 0.744709 mae: 0.706755 (25.870771477845516 steps/sec)\n",
      "Step #29400\tEpoch   4 Batch 1274/3125   Loss: 0.931296 mae: 0.723962 (26.046429829831148 steps/sec)\n",
      "Step #29450\tEpoch   4 Batch 1324/3125   Loss: 0.755278 mae: 0.708622 (26.00006521241332 steps/sec)\n",
      "Step #29500\tEpoch   4 Batch 1374/3125   Loss: 0.790104 mae: 0.706923 (24.28286784315215 steps/sec)\n",
      "Step #29550\tEpoch   4 Batch 1424/3125   Loss: 0.813869 mae: 0.711374 (24.784337528520012 steps/sec)\n",
      "Step #29600\tEpoch   4 Batch 1474/3125   Loss: 0.717798 mae: 0.712216 (24.399288155533547 steps/sec)\n",
      "Step #29650\tEpoch   4 Batch 1524/3125   Loss: 0.737398 mae: 0.708115 (25.75787388277025 steps/sec)\n",
      "Step #29700\tEpoch   4 Batch 1574/3125   Loss: 0.787943 mae: 0.717956 (24.815976307134438 steps/sec)\n",
      "Step #29750\tEpoch   4 Batch 1624/3125   Loss: 0.893297 mae: 0.706365 (25.72577744607792 steps/sec)\n",
      "Step #29800\tEpoch   4 Batch 1674/3125   Loss: 0.758070 mae: 0.713500 (24.978046043696384 steps/sec)\n",
      "Step #29850\tEpoch   4 Batch 1724/3125   Loss: 0.909414 mae: 0.710802 (25.82024746734905 steps/sec)\n",
      "Step #29900\tEpoch   4 Batch 1774/3125   Loss: 0.750860 mae: 0.711448 (24.73455915229711 steps/sec)\n",
      "Step #29950\tEpoch   4 Batch 1824/3125   Loss: 0.741112 mae: 0.714928 (22.190710945407083 steps/sec)\n",
      "Step #30000\tEpoch   4 Batch 1874/3125   Loss: 0.892503 mae: 0.713572 (22.831907177441575 steps/sec)\n",
      "Step #30050\tEpoch   4 Batch 1924/3125   Loss: 1.002429 mae: 0.712502 (25.337200772070158 steps/sec)\n",
      "Step #30100\tEpoch   4 Batch 1974/3125   Loss: 0.818432 mae: 0.709155 (24.380833333914616 steps/sec)\n",
      "Step #30150\tEpoch   4 Batch 2024/3125   Loss: 0.860250 mae: 0.718580 (24.132051324243832 steps/sec)\n",
      "Step #30200\tEpoch   4 Batch 2074/3125   Loss: 0.893238 mae: 0.709926 (24.375582248841894 steps/sec)\n",
      "Step #30250\tEpoch   4 Batch 2124/3125   Loss: 0.655359 mae: 0.714231 (25.915292176521632 steps/sec)\n",
      "Step #30300\tEpoch   4 Batch 2174/3125   Loss: 0.721610 mae: 0.705052 (25.797626445952417 steps/sec)\n",
      "Step #30350\tEpoch   4 Batch 2224/3125   Loss: 0.708168 mae: 0.711902 (26.49544044877016 steps/sec)\n",
      "Step #30400\tEpoch   4 Batch 2274/3125   Loss: 0.837604 mae: 0.712273 (26.41265004969173 steps/sec)\n",
      "Step #30450\tEpoch   4 Batch 2324/3125   Loss: 0.685541 mae: 0.721410 (26.17972495429803 steps/sec)\n",
      "Step #30500\tEpoch   4 Batch 2374/3125   Loss: 0.776363 mae: 0.712182 (26.241540313068793 steps/sec)\n",
      "Step #30550\tEpoch   4 Batch 2424/3125   Loss: 0.766122 mae: 0.712354 (25.91214775254049 steps/sec)\n",
      "Step #30600\tEpoch   4 Batch 2474/3125   Loss: 0.880379 mae: 0.706585 (25.19444993401503 steps/sec)\n",
      "Step #30650\tEpoch   4 Batch 2524/3125   Loss: 0.775105 mae: 0.717794 (26.443513209323886 steps/sec)\n",
      "Step #30700\tEpoch   4 Batch 2574/3125   Loss: 0.884858 mae: 0.717784 (26.004620724614533 steps/sec)\n",
      "Step #30750\tEpoch   4 Batch 2624/3125   Loss: 0.944507 mae: 0.715722 (26.23936347442548 steps/sec)\n",
      "Step #30800\tEpoch   4 Batch 2674/3125   Loss: 0.810674 mae: 0.713238 (25.401331941064946 steps/sec)\n",
      "Step #30850\tEpoch   4 Batch 2724/3125   Loss: 0.792636 mae: 0.703970 (26.111104470713855 steps/sec)\n",
      "Step #30900\tEpoch   4 Batch 2774/3125   Loss: 0.763319 mae: 0.712700 (26.108256881647662 steps/sec)\n",
      "Step #30950\tEpoch   4 Batch 2824/3125   Loss: 0.854299 mae: 0.719748 (25.85358414793342 steps/sec)\n",
      "Step #31000\tEpoch   4 Batch 2874/3125   Loss: 0.847386 mae: 0.705366 (26.32883829565205 steps/sec)\n",
      "Step #31050\tEpoch   4 Batch 2924/3125   Loss: 0.849435 mae: 0.711011 (23.70735599390323 steps/sec)\n",
      "Step #31100\tEpoch   4 Batch 2974/3125   Loss: 0.712368 mae: 0.712937 (24.40111075765484 steps/sec)\n",
      "Step #31150\tEpoch   4 Batch 3024/3125   Loss: 0.700125 mae: 0.704626 (25.271120673086134 steps/sec)\n",
      "Step #31200\tEpoch   4 Batch 3074/3125   Loss: 0.829782 mae: 0.704342 (26.268493395023704 steps/sec)\n",
      "Step #31250\tEpoch   4 Batch 3124/3125   Loss: 0.848849 mae: 0.706723 (26.45481474267777 steps/sec)\n",
      "\n",
      "Train time for epoch #5 (31250 total steps): 122.71955800056458\n",
      "Model test set loss: 0.835090 mae: 0.720690\n"
     ]
    }
   ],
   "source": [
    "mv_net = mv_network()\n",
    "mv_net.training(features, targets_values, epochs=5)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Train the model: the predicted rating is obtained by taking the matrix product of the user features and the movie features."
   ]
  },
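  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The rating prediction described above can be sketched with plain NumPy. This is a minimal illustration rather than the notebook's actual `mv_network` code: it assumes random stand-ins for the 200-dimensional user and movie feature vectors that `user_combine_layer_flat` and `movie_combine_layer_flat` produce, and shows that the predicted rating for a (user, movie) pair is simply the dot product of the two feature vectors.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "# Hypothetical 200-d feature vectors; the dimensions match the model summary.\n",
    "rng = np.random.default_rng(0)\n",
    "user_features = rng.normal(size=(4, 200))   # 4 users\n",
    "movie_features = rng.normal(size=(5, 200))  # 5 movies\n",
    "\n",
    "# The predicted rating for every (user, movie) pair is the dot product\n",
    "# of the corresponding feature vectors: one matrix multiplication.\n",
    "ratings = user_features @ movie_features.T\n",
    "print(ratings.shape)  # (4, 5)\n",
    "\n",
    "# A single pair's rating equals the plain dot product of its two vectors.\n",
    "assert np.isclose(ratings[0, 2], user_features[0] @ movie_features[2])\n",
    "```"
   ]
  },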
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Model: \"model_1\"\n",
      "__________________________________________________________________________________________________\n",
      "Layer (type)                    Output Shape         Param #     Connected to                     \n",
      "==================================================================================================\n",
      "movie_titles (InputLayer)       [(None, 15)]         0                                            \n",
      "__________________________________________________________________________________________________\n",
      "movie_title_embed_layer (Embedd (None, 15, 32)       166880      movie_titles[0][0]               \n",
      "__________________________________________________________________________________________________\n",
      "reshape_1 (Reshape)             (None, 15, 32, 1)    0           movie_title_embed_layer[0][0]    \n",
      "__________________________________________________________________________________________________\n",
      "conv2d_4 (Conv2D)               (None, 14, 1, 8)     520         reshape_1[0][0]                  \n",
      "__________________________________________________________________________________________________\n",
      "conv2d_5 (Conv2D)               (None, 13, 1, 8)     776         reshape_1[0][0]                  \n",
      "__________________________________________________________________________________________________\n",
      "conv2d_6 (Conv2D)               (None, 12, 1, 8)     1032        reshape_1[0][0]                  \n",
      "__________________________________________________________________________________________________\n",
      "conv2d_7 (Conv2D)               (None, 11, 1, 8)     1288        reshape_1[0][0]                  \n",
      "__________________________________________________________________________________________________\n",
      "movie_categories (InputLayer)   [(None, 18)]         0                                            \n",
      "__________________________________________________________________________________________________\n",
      "max_pooling2d_4 (MaxPooling2D)  (None, 1, 1, 8)      0           conv2d_4[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "max_pooling2d_5 (MaxPooling2D)  (None, 1, 1, 8)      0           conv2d_5[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "max_pooling2d_6 (MaxPooling2D)  (None, 1, 1, 8)      0           conv2d_6[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "max_pooling2d_7 (MaxPooling2D)  (None, 1, 1, 8)      0           conv2d_7[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "uid (InputLayer)                [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "user_gender (InputLayer)        [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "user_age (InputLayer)           [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "user_job (InputLayer)           [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "movie_id (InputLayer)           [(None, 1)]          0                                            \n",
      "__________________________________________________________________________________________________\n",
      "movie_categories_embed_layer (E (None, 18, 32)       608         movie_categories[0][0]           \n",
      "__________________________________________________________________________________________________\n",
      "pool_layer (Concatenate)        (None, 1, 1, 32)     0           max_pooling2d_4[0][0]            \n",
      "                                                                 max_pooling2d_5[0][0]            \n",
      "                                                                 max_pooling2d_6[0][0]            \n",
      "                                                                 max_pooling2d_7[0][0]            \n",
      "__________________________________________________________________________________________________\n",
      "uid_embed_layer (Embedding)     (None, 1, 32)        193312      uid[0][0]                        \n",
      "__________________________________________________________________________________________________\n",
      "gender_embed_layer (Embedding)  (None, 1, 16)        32          user_gender[0][0]                \n",
      "__________________________________________________________________________________________________\n",
      "age_embed_layer (Embedding)     (None, 1, 16)        112         user_age[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "job_embed_layer (Embedding)     (None, 1, 16)        336         user_job[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "movie_id_embed_layer (Embedding (None, 1, 32)        126496      movie_id[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "lambda_1 (Lambda)               (None, 1, 32)        0           movie_categories_embed_layer[0][0\n",
      "__________________________________________________________________________________________________\n",
      "pool_layer_flat (Reshape)       (None, 1, 32)        0           pool_layer[0][0]                 \n",
      "__________________________________________________________________________________________________\n",
      "uid_fc_layer (Dense)            (None, 1, 32)        1056        uid_embed_layer[0][0]            \n",
      "__________________________________________________________________________________________________\n",
      "gender_fc_layer (Dense)         (None, 1, 32)        544         gender_embed_layer[0][0]         \n",
      "__________________________________________________________________________________________________\n",
      "age_fc_layer (Dense)            (None, 1, 32)        544         age_embed_layer[0][0]            \n",
      "__________________________________________________________________________________________________\n",
      "job_fc_layer (Dense)            (None, 1, 32)        544         job_embed_layer[0][0]            \n",
      "__________________________________________________________________________________________________\n",
      "movie_id_fc_layer (Dense)       (None, 1, 32)        1056        movie_id_embed_layer[0][0]       \n",
      "__________________________________________________________________________________________________\n",
      "movie_categories_fc_layer (Dens (None, 1, 32)        1056        lambda_1[0][0]                   \n",
      "__________________________________________________________________________________________________\n",
      "dropout_layer (Dropout)         (None, 1, 32)        0           pool_layer_flat[0][0]            \n",
      "__________________________________________________________________________________________________\n",
      "concatenate_3 (Concatenate)     (None, 1, 128)       0           uid_fc_layer[0][0]               \n",
      "                                                                 gender_fc_layer[0][0]            \n",
      "                                                                 age_fc_layer[0][0]               \n",
      "                                                                 job_fc_layer[0][0]               \n",
      "__________________________________________________________________________________________________\n",
      "concatenate_4 (Concatenate)     (None, 1, 96)        0           movie_id_fc_layer[0][0]          \n",
      "                                                                 movie_categories_fc_layer[0][0]  \n",
      "                                                                 dropout_layer[0][0]              \n",
      "__________________________________________________________________________________________________\n",
      "dense_2 (Dense)                 (None, 1, 200)       25800       concatenate_3[0][0]              \n",
      "__________________________________________________________________________________________________\n",
      "dense_3 (Dense)                 (None, 1, 200)       19400       concatenate_4[0][0]              \n",
      "__________________________________________________________________________________________________\n",
      "user_combine_layer_flat (Reshap (None, 200)          0           dense_2[0][0]                    \n",
      "__________________________________________________________________________________________________\n",
      "movie_combine_layer_flat (Resha (None, 200)          0           dense_3[0][0]                    \n",
      "__________________________________________________________________________________________________\n",
      "inference (Lambda)              (None,)              0           user_combine_layer_flat[0][0]    \n",
      "                                                                 movie_combine_layer_flat[0][0]   \n",
      "__________________________________________________________________________________________________\n",
      "lambda_2 (Lambda)               (None, 1)            0           inference[0][0]                  \n",
      "==================================================================================================\n",
      "Total params: 541,392\n",
      "Trainable params: 541,392\n",
      "Non-trainable params: 0\n",
      "__________________________________________________________________________________________________\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "WARNING: Logging before flag parsing goes to stderr.\n",
      "W0304 21:47:02.173334 4664485312 tf_logging.py:161] Entity <method-wrapper '__call__' of weakref object at 0x1424cc6d8> could not be transformed and will be staged without change. Error details can be found in the logs when running with the env variable AUTOGRAPH_VERBOSITY >= 1. Please report this to the AutoGraph team. Cause: Object conversion is not yet supported. If you are trying to convert code that uses an existing object, try including the creation of that object in the conversion. For example, instead of converting the method of a class, try converting the entire class instead. See https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/autograph/README.md#using-the-functional-api for more information.\n",
      "W0304 21:47:02.189912 4664485312 tf_logging.py:161] Entity <method-wrapper '__call__' of weakref object at 0x14202c548> could not be transformed and will be staged without change. Error details can be found in the logs when running with the env variable AUTOGRAPH_VERBOSITY >= 1. Please report this to the AutoGraph team. Cause: Object conversion is not yet supported. If you are trying to convert code that uses an existing object, try including the creation of that object in the conversion. For example, instead of converting the method of a class, try converting the entire class instead. See https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/autograph/README.md#using-the-functional-api for more information.\n",
      "W0304 21:47:02.200783 4664485312 tf_logging.py:161] Entity <method-wrapper '__call__' of weakref object at 0x14202c598> could not be transformed and will be staged without change. Error details can be found in the logs when running with the env variable AUTOGRAPH_VERBOSITY >= 1. Please report this to the AutoGraph team. Cause: Object conversion is not yet supported. If you are trying to convert code that uses an existing object, try including the creation of that object in the conversion. For example, instead of converting the method of a class, try converting the entire class instead. See https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/autograph/README.md#using-the-functional-api for more information.\n",
      "W0304 21:47:02.211294 4664485312 tf_logging.py:161] Entity <method-wrapper '__call__' of weakref object at 0x1424bc408> could not be transformed and will be staged without change. Error details can be found in the logs when running with the env variable AUTOGRAPH_VERBOSITY >= 1. Please report this to the AutoGraph team. Cause: Object conversion is not yet supported. If you are trying to convert code that uses an existing object, try including the creation of that object in the conversion. For example, instead of converting the method of a class, try converting the entire class instead. See https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/autograph/README.md#using-the-functional-api for more information.\n",
      "W0304 21:47:02.222695 4664485312 tf_logging.py:161] Entity <method-wrapper '__call__' of weakref object at 0x1424bc778> could not be transformed and will be staged without change. Error details can be found in the logs when running with the env variable AUTOGRAPH_VERBOSITY >= 1. Please report this to the AutoGraph team. Cause: Object conversion is not yet supported. If you are trying to convert code that uses an existing object, try including the creation of that object in the conversion. For example, instead of converting the method of a class, try converting the entire class instead. See https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/autograph/README.md#using-the-functional-api for more information.\n",
      "W0304 21:47:02.233590 4664485312 tf_logging.py:161] Entity <method-wrapper '__call__' of weakref object at 0x1424bcf98> could not be transformed and will be staged without change. Error details can be found in the logs when running with the env variable AUTOGRAPH_VERBOSITY >= 1. Please report this to the AutoGraph team. Cause: Object conversion is not yet supported. If you are trying to convert code that uses an existing object, try including the creation of that object in the conversion. For example, instead of converting the method of a class, try converting the entire class instead. See https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/autograph/README.md#using-the-functional-api for more information.\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #9400\tEpoch   0 Batch   24/3125   Loss: 1.117654 mae: 0.922864 (3.922135854210507 steps/sec)\n",
      "Step #9450\tEpoch   0 Batch   74/3125   Loss: 1.216622 mae: 0.908119 (18.10271254276855 steps/sec)\n",
      "Step #9500\tEpoch   0 Batch  124/3125   Loss: 1.016003 mae: 0.886247 (21.449404144579574 steps/sec)\n",
      "Step #9550\tEpoch   0 Batch  174/3125   Loss: 0.969421 mae: 0.850738 (20.694946526140697 steps/sec)\n",
      "Step #9600\tEpoch   0 Batch  224/3125   Loss: 0.744969 mae: 0.810330 (14.541664291769655 steps/sec)\n",
      "Step #9650\tEpoch   0 Batch  274/3125   Loss: 0.990777 mae: 0.788238 (16.664443047851137 steps/sec)\n",
      "Step #9700\tEpoch   0 Batch  324/3125   Loss: 0.984966 mae: 0.782725 (20.205285076679615 steps/sec)\n",
      "Step #9750\tEpoch   0 Batch  374/3125   Loss: 0.903125 mae: 0.775597 (19.32058917906161 steps/sec)\n",
      "Step #9800\tEpoch   0 Batch  424/3125   Loss: 0.948713 mae: 0.765667 (20.566152899882162 steps/sec)\n",
      "Step #9850\tEpoch   0 Batch  474/3125   Loss: 0.908609 mae: 0.759096 (19.634737953294806 steps/sec)\n",
      "Step #9900\tEpoch   0 Batch  524/3125   Loss: 0.886103 mae: 0.752096 (17.947946170333946 steps/sec)\n",
      "Step #9950\tEpoch   0 Batch  574/3125   Loss: 0.966865 mae: 0.762127 (16.40985192688325 steps/sec)\n",
      "Step #10000\tEpoch   0 Batch  624/3125   Loss: 0.714627 mae: 0.748382 (18.95226598268736 steps/sec)\n",
      "Step #10050\tEpoch   0 Batch  674/3125   Loss: 0.861893 mae: 0.748417 (20.245152279427337 steps/sec)\n",
      "Step #10100\tEpoch   0 Batch  724/3125   Loss: 0.710298 mae: 0.748331 (20.088186205617617 steps/sec)\n",
      "Step #10150\tEpoch   0 Batch  774/3125   Loss: 0.796161 mae: 0.739216 (17.8091130532611 steps/sec)\n",
      "Step #10200\tEpoch   0 Batch  824/3125   Loss: 0.908184 mae: 0.744844 (18.4427319049416 steps/sec)\n",
      "Step #10250\tEpoch   0 Batch  874/3125   Loss: 0.850119 mae: 0.744604 (20.75822508560044 steps/sec)\n",
      "Step #10300\tEpoch   0 Batch  924/3125   Loss: 0.919433 mae: 0.743012 (18.447243476113034 steps/sec)\n",
      "Step #10350\tEpoch   0 Batch  974/3125   Loss: 0.809681 mae: 0.740242 (17.645546273921152 steps/sec)\n",
      "Step #10400\tEpoch   0 Batch 1024/3125   Loss: 0.866922 mae: 0.739520 (17.656855094959546 steps/sec)\n",
      "Step #10450\tEpoch   0 Batch 1074/3125   Loss: 0.873328 mae: 0.735937 (17.624841140003372 steps/sec)\n",
      "Step #10500\tEpoch   0 Batch 1124/3125   Loss: 0.894555 mae: 0.740011 (18.25982624383329 steps/sec)\n",
      "Step #10550\tEpoch   0 Batch 1174/3125   Loss: 0.787501 mae: 0.741089 (20.504190539639588 steps/sec)\n",
      "Step #10600\tEpoch   0 Batch 1224/3125   Loss: 0.780465 mae: 0.727621 (19.008610992725043 steps/sec)\n",
      "Step #10650\tEpoch   0 Batch 1274/3125   Loss: 0.947377 mae: 0.746048 (15.929436117564542 steps/sec)\n",
      "Step #10700\tEpoch   0 Batch 1324/3125   Loss: 0.826276 mae: 0.732540 (13.559962047789938 steps/sec)\n",
      "Step #10750\tEpoch   0 Batch 1374/3125   Loss: 0.818757 mae: 0.729078 (18.164188377303738 steps/sec)\n",
      "Step #10800\tEpoch   0 Batch 1424/3125   Loss: 0.876408 mae: 0.731607 (20.06939276735049 steps/sec)\n",
      "Step #10850\tEpoch   0 Batch 1474/3125   Loss: 0.741816 mae: 0.733828 (17.310705635410795 steps/sec)\n",
      "Step #10900\tEpoch   0 Batch 1524/3125   Loss: 0.756118 mae: 0.728295 (19.934853779367675 steps/sec)\n",
      "Step #10950\tEpoch   0 Batch 1574/3125   Loss: 0.808088 mae: 0.735761 (19.744739825181803 steps/sec)\n",
      "Step #11000\tEpoch   0 Batch 1624/3125   Loss: 0.908506 mae: 0.723253 (20.313906045162376 steps/sec)\n",
      "Step #11050\tEpoch   0 Batch 1674/3125   Loss: 0.819828 mae: 0.728448 (19.77639115224822 steps/sec)\n",
      "Step #11100\tEpoch   0 Batch 1724/3125   Loss: 0.936280 mae: 0.726600 (20.329592778428346 steps/sec)\n",
      "Step #11150\tEpoch   0 Batch 1774/3125   Loss: 0.774116 mae: 0.728166 (20.066402827425847 steps/sec)\n",
      "Step #11200\tEpoch   0 Batch 1824/3125   Loss: 0.744924 mae: 0.730440 (19.98984471294538 steps/sec)\n",
      "Step #11250\tEpoch   0 Batch 1874/3125   Loss: 0.915810 mae: 0.728119 (20.087839854527452 steps/sec)\n",
      "Step #11300\tEpoch   0 Batch 1924/3125   Loss: 1.030558 mae: 0.726210 (20.864454054021788 steps/sec)\n",
      "Step #11350\tEpoch   0 Batch 1974/3125   Loss: 0.849322 mae: 0.723590 (18.939006640273185 steps/sec)\n",
      "Step #11400\tEpoch   0 Batch 2024/3125   Loss: 0.877280 mae: 0.731662 (18.731560642136024 steps/sec)\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #11450\tEpoch   0 Batch 2074/3125   Loss: 0.919819 mae: 0.722572 (20.934735576119202 steps/sec)\n",
      "Step #11500\tEpoch   0 Batch 2124/3125   Loss: 0.674401 mae: 0.726368 (16.453603886189796 steps/sec)\n",
      "Step #11550\tEpoch   0 Batch 2174/3125   Loss: 0.761150 mae: 0.717496 (14.790782025840452 steps/sec)\n",
      "Step #11600\tEpoch   0 Batch 2224/3125   Loss: 0.736960 mae: 0.723739 (13.705534824468481 steps/sec)\n",
      "Step #11650\tEpoch   0 Batch 2274/3125   Loss: 0.855284 mae: 0.724616 (17.67309189105737 steps/sec)\n",
      "Step #11700\tEpoch   0 Batch 2324/3125   Loss: 0.696391 mae: 0.732655 (19.362201572960988 steps/sec)\n",
      "Step #11750\tEpoch   0 Batch 2374/3125   Loss: 0.782445 mae: 0.722666 (19.077592049305224 steps/sec)\n",
      "Step #11800\tEpoch   0 Batch 2424/3125   Loss: 0.792556 mae: 0.726266 (18.253843909524594 steps/sec)\n",
      "Step #11850\tEpoch   0 Batch 2474/3125   Loss: 0.908641 mae: 0.719441 (18.58504585618521 steps/sec)\n",
      "Step #11900\tEpoch   0 Batch 2524/3125   Loss: 0.802004 mae: 0.728494 (19.73385411123686 steps/sec)\n",
      "Step #11950\tEpoch   0 Batch 2574/3125   Loss: 0.893436 mae: 0.729582 (20.932641804376733 steps/sec)\n",
      "Step #12000\tEpoch   0 Batch 2624/3125   Loss: 0.984020 mae: 0.729569 (20.891078900692882 steps/sec)\n",
      "Step #12050\tEpoch   0 Batch 2674/3125   Loss: 0.855152 mae: 0.725303 (18.959429670812398 steps/sec)\n",
      "Step #12100\tEpoch   0 Batch 2724/3125   Loss: 0.819400 mae: 0.718279 (19.53501829051264 steps/sec)\n",
      "Step #12150\tEpoch   0 Batch 2774/3125   Loss: 0.802988 mae: 0.724313 (18.21575339134157 steps/sec)\n",
      "Step #12200\tEpoch   0 Batch 2824/3125   Loss: 0.886709 mae: 0.731373 (17.511920937695816 steps/sec)\n",
      "Step #12250\tEpoch   0 Batch 2874/3125   Loss: 0.886981 mae: 0.715915 (21.05758022462 steps/sec)\n",
      "Step #12300\tEpoch   0 Batch 2924/3125   Loss: 0.851305 mae: 0.722028 (17.16131365776258 steps/sec)\n",
      "Step #12350\tEpoch   0 Batch 2974/3125   Loss: 0.765563 mae: 0.726585 (18.641727259741153 steps/sec)\n",
      "Step #12400\tEpoch   0 Batch 3024/3125   Loss: 0.717880 mae: 0.715971 (20.205806806984555 steps/sec)\n",
      "Step #12450\tEpoch   0 Batch 3074/3125   Loss: 0.838003 mae: 0.715728 (18.304894140004222 steps/sec)\n",
      "Step #12500\tEpoch   0 Batch 3124/3125   Loss: 0.879602 mae: 0.717684 (20.30889163807859 steps/sec)\n",
      "\n",
      "Train time for epoch #1 (12500 total steps): 179.48765778541565\n",
      "Model test set loss: 0.837667 mae: 0.725071\n",
      "best loss = 0.8376671075820923\n",
      "Step #12550\tEpoch   1 Batch   49/3125   Loss: 0.921032 mae: 0.724679 (19.60485387356169 steps/sec)\n",
      "Step #12600\tEpoch   1 Batch   99/3125   Loss: 1.022029 mae: 0.712330 (18.679443893493737 steps/sec)\n",
      "Step #12650\tEpoch   1 Batch  149/3125   Loss: 0.774985 mae: 0.724549 (17.856863388717954 steps/sec)\n",
      "Step #12700\tEpoch   1 Batch  199/3125   Loss: 0.875200 mae: 0.718791 (18.377341407677683 steps/sec)\n",
      "Step #12750\tEpoch   1 Batch  249/3125   Loss: 0.816185 mae: 0.712610 (18.44760047275917 steps/sec)\n",
      "Step #12800\tEpoch   1 Batch  299/3125   Loss: 0.784824 mae: 0.719703 (17.464332705984244 steps/sec)\n",
      "Step #12850\tEpoch   1 Batch  349/3125   Loss: 0.896935 mae: 0.720693 (16.833707228064227 steps/sec)\n",
      "Step #12900\tEpoch   1 Batch  399/3125   Loss: 0.914286 mae: 0.719254 (17.421140371484498 steps/sec)\n",
      "Step #12950\tEpoch   1 Batch  449/3125   Loss: 0.801234 mae: 0.725455 (18.222184183647535 steps/sec)\n",
      "Step #13000\tEpoch   1 Batch  499/3125   Loss: 0.815844 mae: 0.712358 (18.12018871279604 steps/sec)\n",
      "Step #13050\tEpoch   1 Batch  549/3125   Loss: 0.796498 mae: 0.714242 (17.159182139903393 steps/sec)\n",
      "Step #13100\tEpoch   1 Batch  599/3125   Loss: 0.737774 mae: 0.727708 (17.128319136997835 steps/sec)\n",
      "Step #13150\tEpoch   1 Batch  649/3125   Loss: 0.790554 mae: 0.721343 (20.08582355868562 steps/sec)\n",
      "Step #13200\tEpoch   1 Batch  699/3125   Loss: 0.773473 mae: 0.722071 (18.57839594575608 steps/sec)\n",
      "Step #13250\tEpoch   1 Batch  749/3125   Loss: 0.778746 mae: 0.710079 (17.423563291636682 steps/sec)\n",
      "Step #13300\tEpoch   1 Batch  799/3125   Loss: 0.909694 mae: 0.719989 (18.841751985225976 steps/sec)\n",
      "Step #13350\tEpoch   1 Batch  849/3125   Loss: 0.965190 mae: 0.718990 (20.65377407807329 steps/sec)\n",
      "Step #13400\tEpoch   1 Batch  899/3125   Loss: 0.892525 mae: 0.717651 (21.070225578746193 steps/sec)\n",
      "Step #13450\tEpoch   1 Batch  949/3125   Loss: 0.715483 mae: 0.717744 (21.096788510819543 steps/sec)\n",
      "Step #13500\tEpoch   1 Batch  999/3125   Loss: 0.866204 mae: 0.717969 (20.980333838245446 steps/sec)\n",
      "Step #13550\tEpoch   1 Batch 1049/3125   Loss: 0.895623 mae: 0.724931 (20.90355418392638 steps/sec)\n",
      "Step #13600\tEpoch   1 Batch 1099/3125   Loss: 0.786889 mae: 0.718079 (21.441778951792823 steps/sec)\n",
      "Step #13650\tEpoch   1 Batch 1149/3125   Loss: 0.759155 mae: 0.723530 (19.401975823578233 steps/sec)\n",
      "Step #13700\tEpoch   1 Batch 1199/3125   Loss: 0.836090 mae: 0.713416 (18.98185350474476 steps/sec)\n",
      "Step #13750\tEpoch   1 Batch 1249/3125   Loss: 0.930717 mae: 0.724654 (18.6280250359565 steps/sec)\n",
      "Step #13800\tEpoch   1 Batch 1299/3125   Loss: 0.806900 mae: 0.728242 (20.203278223581385 steps/sec)\n",
      "Step #13850\tEpoch   1 Batch 1349/3125   Loss: 0.854526 mae: 0.706483 (19.91987414819502 steps/sec)\n",
      "Step #13900\tEpoch   1 Batch 1399/3125   Loss: 0.829388 mae: 0.721287 (19.37616414865624 steps/sec)\n",
      "Step #13950\tEpoch   1 Batch 1449/3125   Loss: 0.827869 mae: 0.720033 (18.63573057476976 steps/sec)\n",
      "Step #14000\tEpoch   1 Batch 1499/3125   Loss: 0.849432 mae: 0.717733 (19.680872029360945 steps/sec)\n",
      "Step #14050\tEpoch   1 Batch 1549/3125   Loss: 0.912189 mae: 0.718682 (20.525426537267002 steps/sec)\n",
      "Step #14100\tEpoch   1 Batch 1599/3125   Loss: 0.847885 mae: 0.716072 (16.613878974802287 steps/sec)\n",
      "Step #14150\tEpoch   1 Batch 1649/3125   Loss: 0.870443 mae: 0.719402 (20.988703451515757 steps/sec)\n",
      "Step #14200\tEpoch   1 Batch 1699/3125   Loss: 0.868675 mae: 0.713020 (15.910659133789993 steps/sec)\n",
      "Step #14250\tEpoch   1 Batch 1749/3125   Loss: 0.832088 mae: 0.716024 (15.90300974305344 steps/sec)\n",
      "Step #14300\tEpoch   1 Batch 1799/3125   Loss: 0.934096 mae: 0.719469 (15.662443465813912 steps/sec)\n",
      "Step #14350\tEpoch   1 Batch 1849/3125   Loss: 0.794237 mae: 0.719263 (16.991342254397296 steps/sec)\n",
      "Step #14400\tEpoch   1 Batch 1899/3125   Loss: 0.876163 mae: 0.715506 (19.812060749937483 steps/sec)\n",
      "Step #14450\tEpoch   1 Batch 1949/3125   Loss: 0.816534 mae: 0.711931 (18.873388093690174 steps/sec)\n",
      "Step #14500\tEpoch   1 Batch 1999/3125   Loss: 0.905383 mae: 0.719276 (19.359683118712137 steps/sec)\n",
      "Step #14550\tEpoch   1 Batch 2049/3125   Loss: 0.788895 mae: 0.724192 (18.548698302125516 steps/sec)\n",
      "Step #14600\tEpoch   1 Batch 2099/3125   Loss: 0.870792 mae: 0.709806 (19.309410367232474 steps/sec)\n",
      "Step #14650\tEpoch   1 Batch 2149/3125   Loss: 0.751727 mae: 0.708525 (19.28330165044639 steps/sec)\n",
      "Step #14700\tEpoch   1 Batch 2199/3125   Loss: 0.757704 mae: 0.711675 (20.28012589206262 steps/sec)\n",
      "Step #14750\tEpoch   1 Batch 2249/3125   Loss: 0.761302 mae: 0.713103 (18.991352452142905 steps/sec)\n",
      "Step #14800\tEpoch   1 Batch 2299/3125   Loss: 0.875408 mae: 0.724522 (20.435104633891733 steps/sec)\n",
      "Step #14850\tEpoch   1 Batch 2349/3125   Loss: 0.808316 mae: 0.717200 (19.306822087134385 steps/sec)\n",
      "Step #14900\tEpoch   1 Batch 2399/3125   Loss: 0.742177 mae: 0.717295 (19.88450188903299 steps/sec)\n",
      "Step #14950\tEpoch   1 Batch 2449/3125   Loss: 0.813804 mae: 0.709144 (18.693441840734685 steps/sec)\n",
      "Step #15000\tEpoch   1 Batch 2499/3125   Loss: 0.907375 mae: 0.718620 (18.78752978284014 steps/sec)\n",
      "Step #15050\tEpoch   1 Batch 2549/3125   Loss: 0.651802 mae: 0.721961 (18.9440654902264 steps/sec)\n",
      "Step #15100\tEpoch   1 Batch 2599/3125   Loss: 0.976590 mae: 0.718878 (20.78300360916237 steps/sec)\n",
      "Step #15150\tEpoch   1 Batch 2649/3125   Loss: 0.891440 mae: 0.715839 (18.953867533365923 steps/sec)\n",
      "Step #15200\tEpoch   1 Batch 2699/3125   Loss: 0.760566 mae: 0.713878 (17.403410190472616 steps/sec)\n",
      "Step #15250\tEpoch   1 Batch 2749/3125   Loss: 0.987079 mae: 0.713724 (18.973914070955846 steps/sec)\n",
      "Step #15300\tEpoch   1 Batch 2799/3125   Loss: 0.829215 mae: 0.718262 (19.152650262519327 steps/sec)\n",
      "Step #15350\tEpoch   1 Batch 2849/3125   Loss: 0.838259 mae: 0.714673 (20.58026447599398 steps/sec)\n",
      "Step #15400\tEpoch   1 Batch 2899/3125   Loss: 0.843889 mae: 0.713333 (19.44821340256655 steps/sec)\n",
      "Step #15450\tEpoch   1 Batch 2949/3125   Loss: 0.869141 mae: 0.716856 (18.70731726898316 steps/sec)\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #15500\tEpoch   1 Batch 2999/3125   Loss: 0.745186 mae: 0.713889 (17.02420313914268 steps/sec)\n",
      "Step #15550\tEpoch   1 Batch 3049/3125   Loss: 0.781694 mae: 0.705353 (18.916078141462705 steps/sec)\n",
      "Step #15600\tEpoch   1 Batch 3099/3125   Loss: 0.879136 mae: 0.710527 (18.563360246927022 steps/sec)\n",
      "\n",
      "Train time for epoch #2 (15625 total steps): 166.3247570991516\n",
      "Model test set loss: 0.827377 mae: 0.719194\n",
      "best loss = 0.8273772597312927\n",
      "Step #15650\tEpoch   2 Batch   24/3125   Loss: 0.722387 mae: 0.718930 (33.1196217087886 steps/sec)\n",
      "Step #15700\tEpoch   2 Batch   74/3125   Loss: 0.868104 mae: 0.707424 (13.334583823594398 steps/sec)\n",
      "Step #15750\tEpoch   2 Batch  124/3125   Loss: 0.704735 mae: 0.718248 (17.513203472223207 steps/sec)\n",
      "Step #15800\tEpoch   2 Batch  174/3125   Loss: 0.745867 mae: 0.715609 (17.495324907265648 steps/sec)\n",
      "Step #15850\tEpoch   2 Batch  224/3125   Loss: 0.616620 mae: 0.710086 (18.42146594375193 steps/sec)\n",
      "Step #15900\tEpoch   2 Batch  274/3125   Loss: 0.809966 mae: 0.711517 (16.266395233275546 steps/sec)\n",
      "Step #15950\tEpoch   2 Batch  324/3125   Loss: 0.789194 mae: 0.714804 (20.869715441106738 steps/sec)\n",
      "Step #16000\tEpoch   2 Batch  374/3125   Loss: 0.807398 mae: 0.715340 (20.591714070652444 steps/sec)\n",
      "Step #16050\tEpoch   2 Batch  424/3125   Loss: 0.851041 mae: 0.717801 (21.313451705714783 steps/sec)\n",
      "Step #16100\tEpoch   2 Batch  474/3125   Loss: 0.787050 mae: 0.711965 (18.74303666665892 steps/sec)\n",
      "Step #16150\tEpoch   2 Batch  524/3125   Loss: 0.831110 mae: 0.706831 (16.610744448001682 steps/sec)\n",
      "Step #16200\tEpoch   2 Batch  574/3125   Loss: 0.873257 mae: 0.716958 (20.89489424382903 steps/sec)\n",
      "Step #16250\tEpoch   2 Batch  624/3125   Loss: 0.635928 mae: 0.713362 (21.647278771968796 steps/sec)\n",
      "Step #16300\tEpoch   2 Batch  674/3125   Loss: 0.840082 mae: 0.715602 (21.48500958047233 steps/sec)\n",
      "Step #16350\tEpoch   2 Batch  724/3125   Loss: 0.666181 mae: 0.713153 (20.294917880249308 steps/sec)\n",
      "Step #16400\tEpoch   2 Batch  774/3125   Loss: 0.765016 mae: 0.706465 (21.10770692508062 steps/sec)\n",
      "Step #16450\tEpoch   2 Batch  824/3125   Loss: 0.833893 mae: 0.711793 (16.17672184943379 steps/sec)\n",
      "Step #16500\tEpoch   2 Batch  874/3125   Loss: 0.754802 mae: 0.711323 (20.423502290100423 steps/sec)\n",
      "Step #16550\tEpoch   2 Batch  924/3125   Loss: 0.898809 mae: 0.715913 (19.395029905208922 steps/sec)\n",
      "Step #16600\tEpoch   2 Batch  974/3125   Loss: 0.748144 mae: 0.713814 (19.592606097670238 steps/sec)\n",
      "Step #16650\tEpoch   2 Batch 1024/3125   Loss: 0.809212 mae: 0.709617 (20.889151984808702 steps/sec)\n",
      "Step #16700\tEpoch   2 Batch 1074/3125   Loss: 0.862435 mae: 0.711089 (15.475521595015842 steps/sec)\n",
      "Step #16750\tEpoch   2 Batch 1124/3125   Loss: 0.823123 mae: 0.715648 (15.13248442664885 steps/sec)\n",
      "Step #16800\tEpoch   2 Batch 1174/3125   Loss: 0.742501 mae: 0.713542 (18.50966104267946 steps/sec)\n",
      "Step #16850\tEpoch   2 Batch 1224/3125   Loss: 0.763766 mae: 0.706796 (17.37028620794047 steps/sec)\n",
      "Step #16900\tEpoch   2 Batch 1274/3125   Loss: 0.924196 mae: 0.724814 (20.83635249771209 steps/sec)\n",
      "Step #16950\tEpoch   2 Batch 1324/3125   Loss: 0.742635 mae: 0.708325 (20.9183393059824 steps/sec)\n",
      "Step #17000\tEpoch   2 Batch 1374/3125   Loss: 0.785088 mae: 0.707442 (18.00992475024956 steps/sec)\n",
      "Step #17050\tEpoch   2 Batch 1424/3125   Loss: 0.810725 mae: 0.712172 (18.758796182248936 steps/sec)\n",
      "Step #17100\tEpoch   2 Batch 1474/3125   Loss: 0.703401 mae: 0.712050 (16.05297756940783 steps/sec)\n",
      "Step #17150\tEpoch   2 Batch 1524/3125   Loss: 0.716910 mae: 0.707413 (19.620492151436395 steps/sec)\n",
      "Step #17200\tEpoch   2 Batch 1574/3125   Loss: 0.779953 mae: 0.715934 (19.58920573816576 steps/sec)\n",
      "Step #17250\tEpoch   2 Batch 1624/3125   Loss: 0.856637 mae: 0.704058 (14.561032654411614 steps/sec)\n",
      "Step #17300\tEpoch   2 Batch 1674/3125   Loss: 0.752017 mae: 0.713214 (20.594387341252055 steps/sec)\n",
      "Step #17350\tEpoch   2 Batch 1724/3125   Loss: 0.891409 mae: 0.708870 (18.66062214293837 steps/sec)\n",
      "Step #17400\tEpoch   2 Batch 1774/3125   Loss: 0.752097 mae: 0.709395 (19.887319049893165 steps/sec)\n",
      "Step #17450\tEpoch   2 Batch 1824/3125   Loss: 0.707247 mae: 0.711130 (18.161191800500127 steps/sec)\n",
      "Step #17500\tEpoch   2 Batch 1874/3125   Loss: 0.897822 mae: 0.711814 (18.221653784129455 steps/sec)\n",
      "Step #17550\tEpoch   2 Batch 1924/3125   Loss: 0.987037 mae: 0.708070 (17.185779728363077 steps/sec)\n",
      "Step #17600\tEpoch   2 Batch 1974/3125   Loss: 0.790097 mae: 0.706079 (16.49675013191724 steps/sec)\n",
      "Step #17650\tEpoch   2 Batch 2024/3125   Loss: 0.856534 mae: 0.713870 (19.01077870192897 steps/sec)\n",
      "Step #17700\tEpoch   2 Batch 2074/3125   Loss: 0.869646 mae: 0.703910 (19.945897938806958 steps/sec)\n",
      "Step #17750\tEpoch   2 Batch 2124/3125   Loss: 0.636450 mae: 0.709190 (17.939138292605232 steps/sec)\n",
      "Step #17800\tEpoch   2 Batch 2174/3125   Loss: 0.725179 mae: 0.702541 (19.167347422247172 steps/sec)\n",
      "Step #17850\tEpoch   2 Batch 2224/3125   Loss: 0.718608 mae: 0.707319 (18.442814621550447 steps/sec)\n",
      "Step #17900\tEpoch   2 Batch 2274/3125   Loss: 0.835047 mae: 0.707792 (17.919335680878138 steps/sec)\n",
      "Step #17950\tEpoch   2 Batch 2324/3125   Loss: 0.669637 mae: 0.716121 (18.207362047772587 steps/sec)\n",
      "Step #18000\tEpoch   2 Batch 2374/3125   Loss: 0.768549 mae: 0.707301 (20.142829878805045 steps/sec)\n",
      "Step #18050\tEpoch   2 Batch 2424/3125   Loss: 0.760537 mae: 0.708028 (19.621202573718495 steps/sec)\n",
      "Step #18100\tEpoch   2 Batch 2474/3125   Loss: 0.864986 mae: 0.702833 (16.839823770406568 steps/sec)\n",
      "Step #18150\tEpoch   2 Batch 2524/3125   Loss: 0.764696 mae: 0.712350 (17.625535861079495 steps/sec)\n",
      "Step #18200\tEpoch   2 Batch 2574/3125   Loss: 0.871976 mae: 0.714438 (18.72427551814583 steps/sec)\n",
      "Step #18250\tEpoch   2 Batch 2624/3125   Loss: 0.952016 mae: 0.713022 (16.810266328422987 steps/sec)\n",
      "Step #18300\tEpoch   2 Batch 2674/3125   Loss: 0.816006 mae: 0.708961 (20.035937873795543 steps/sec)\n",
      "Step #18350\tEpoch   2 Batch 2724/3125   Loss: 0.769856 mae: 0.699959 (15.318994761093595 steps/sec)\n",
      "Step #18400\tEpoch   2 Batch 2774/3125   Loss: 0.741094 mae: 0.708568 (16.33613257902275 steps/sec)\n",
      "Step #18450\tEpoch   2 Batch 2824/3125   Loss: 0.815175 mae: 0.716617 (19.313029078322334 steps/sec)\n",
      "Step #18500\tEpoch   2 Batch 2874/3125   Loss: 0.831649 mae: 0.701362 (17.305823057525025 steps/sec)\n",
      "Step #18550\tEpoch   2 Batch 2924/3125   Loss: 0.836493 mae: 0.707181 (16.310255980907574 steps/sec)\n",
      "Step #18600\tEpoch   2 Batch 2974/3125   Loss: 0.732361 mae: 0.710584 (19.023074032945903 steps/sec)\n",
      "Step #18650\tEpoch   2 Batch 3024/3125   Loss: 0.698234 mae: 0.699740 (19.953151075413068 steps/sec)\n",
      "Step #18700\tEpoch   2 Batch 3074/3125   Loss: 0.794219 mae: 0.701680 (16.423522951364912 steps/sec)\n",
      "Step #18750\tEpoch   2 Batch 3124/3125   Loss: 0.854343 mae: 0.701967 (18.082162252579238 steps/sec)\n",
      "\n",
      "Train time for epoch #3 (18750 total steps): 171.23827385902405\n",
      "Model test set loss: 0.816150 mae: 0.713537\n",
      "best loss = 0.8161500692367554\n",
      "Step #18800\tEpoch   3 Batch   49/3125   Loss: 0.888967 mae: 0.713082 (19.884407620343623 steps/sec)\n",
      "Step #18850\tEpoch   3 Batch   99/3125   Loss: 0.991000 mae: 0.699937 (18.334501632325555 steps/sec)\n",
      "Step #18900\tEpoch   3 Batch  149/3125   Loss: 0.777338 mae: 0.711343 (15.704454747078131 steps/sec)\n",
      "Step #18950\tEpoch   3 Batch  199/3125   Loss: 0.843419 mae: 0.708037 (14.916806245229031 steps/sec)\n",
      "Step #19000\tEpoch   3 Batch  249/3125   Loss: 0.831585 mae: 0.701904 (18.941172188426528 steps/sec)\n",
      "Step #19050\tEpoch   3 Batch  299/3125   Loss: 0.776248 mae: 0.707810 (16.659874506297324 steps/sec)\n",
      "Step #19100\tEpoch   3 Batch  349/3125   Loss: 0.878594 mae: 0.709572 (17.58523061731509 steps/sec)\n",
      "Step #19150\tEpoch   3 Batch  399/3125   Loss: 0.888383 mae: 0.708715 (17.86632425786436 steps/sec)\n",
      "Step #19200\tEpoch   3 Batch  449/3125   Loss: 0.799999 mae: 0.713063 (19.875814072155823 steps/sec)\n",
      "Step #19250\tEpoch   3 Batch  499/3125   Loss: 0.778257 mae: 0.699250 (18.229452975244858 steps/sec)\n",
      "Step #19300\tEpoch   3 Batch  549/3125   Loss: 0.767878 mae: 0.700799 (14.74926311776964 steps/sec)\n",
      "Step #19350\tEpoch   3 Batch  599/3125   Loss: 0.705934 mae: 0.712431 (19.8415065206367 steps/sec)\n",
      "Step #19400\tEpoch   3 Batch  649/3125   Loss: 0.761445 mae: 0.708412 (19.9426621077324 steps/sec)\n",
      "Step #19450\tEpoch   3 Batch  699/3125   Loss: 0.770477 mae: 0.708728 (20.42249989312318 steps/sec)\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #19500\tEpoch   3 Batch  749/3125   Loss: 0.768156 mae: 0.695677 (19.467888434898 steps/sec)\n",
      "Step #19550\tEpoch   3 Batch  799/3125   Loss: 0.872204 mae: 0.706183 (20.756601991657973 steps/sec)\n",
      "Step #19600\tEpoch   3 Batch  849/3125   Loss: 0.926446 mae: 0.705749 (19.8480057701741 steps/sec)\n",
      "Step #19650\tEpoch   3 Batch  899/3125   Loss: 0.872620 mae: 0.704646 (20.311842146681162 steps/sec)\n",
      "Step #19700\tEpoch   3 Batch  949/3125   Loss: 0.694071 mae: 0.704013 (18.6387715051348 steps/sec)\n",
      "Step #19750\tEpoch   3 Batch  999/3125   Loss: 0.855050 mae: 0.701800 (18.2450063243901 steps/sec)\n",
      "Step #19800\tEpoch   3 Batch 1049/3125   Loss: 0.848925 mae: 0.709953 (17.620785003269734 steps/sec)\n",
      "Step #19850\tEpoch   3 Batch 1099/3125   Loss: 0.747028 mae: 0.703682 (18.364380342973075 steps/sec)\n",
      "Step #19900\tEpoch   3 Batch 1149/3125   Loss: 0.739013 mae: 0.709048 (16.60404116395314 steps/sec)\n",
      "Step #19950\tEpoch   3 Batch 1199/3125   Loss: 0.835142 mae: 0.699206 (19.742964666232044 steps/sec)\n",
      "Step #20000\tEpoch   3 Batch 1249/3125   Loss: 0.911638 mae: 0.709496 (19.79305720826563 steps/sec)\n",
      "Step #20050\tEpoch   3 Batch 1299/3125   Loss: 0.783789 mae: 0.713354 (18.712897587906543 steps/sec)\n",
      "Step #20100\tEpoch   3 Batch 1349/3125   Loss: 0.831044 mae: 0.689791 (18.644360725251293 steps/sec)\n",
      "Step #20150\tEpoch   3 Batch 1399/3125   Loss: 0.809870 mae: 0.706940 (20.413635748625804 steps/sec)\n",
      "Step #20200\tEpoch   3 Batch 1449/3125   Loss: 0.793934 mae: 0.705486 (18.4352337272286 steps/sec)\n",
      "Step #20250\tEpoch   3 Batch 1499/3125   Loss: 0.827418 mae: 0.703287 (17.70579724473185 steps/sec)\n",
      "Step #20300\tEpoch   3 Batch 1549/3125   Loss: 0.900983 mae: 0.704036 (18.036048497887517 steps/sec)\n",
      "Step #20350\tEpoch   3 Batch 1599/3125   Loss: 0.814512 mae: 0.702014 (16.86636485809394 steps/sec)\n",
      "Step #20400\tEpoch   3 Batch 1649/3125   Loss: 0.854909 mae: 0.706221 (18.42892701241468 steps/sec)\n",
      "Step #20450\tEpoch   3 Batch 1699/3125   Loss: 0.830352 mae: 0.699484 (17.227804400304738 steps/sec)\n",
      "Step #20500\tEpoch   3 Batch 1749/3125   Loss: 0.801928 mae: 0.701003 (18.83035599723123 steps/sec)\n",
      "Step #20550\tEpoch   3 Batch 1799/3125   Loss: 0.892594 mae: 0.705235 (19.793959532828456 steps/sec)\n",
      "Step #20600\tEpoch   3 Batch 1849/3125   Loss: 0.780525 mae: 0.704854 (20.102337156270874 steps/sec)\n",
      "Step #20650\tEpoch   3 Batch 1899/3125   Loss: 0.865426 mae: 0.702147 (18.836067493800805 steps/sec)\n",
      "Step #20700\tEpoch   3 Batch 1949/3125   Loss: 0.792664 mae: 0.697831 (16.91720519631127 steps/sec)\n",
      "Step #20750\tEpoch   3 Batch 1999/3125   Loss: 0.877348 mae: 0.704724 (20.303547499741747 steps/sec)\n",
      "Step #20800\tEpoch   3 Batch 2049/3125   Loss: 0.758855 mae: 0.709053 (20.06761252239262 steps/sec)\n",
      "Step #20850\tEpoch   3 Batch 2099/3125   Loss: 0.848561 mae: 0.693050 (20.834940713143133 steps/sec)\n",
      "Step #20900\tEpoch   3 Batch 2149/3125   Loss: 0.736885 mae: 0.693826 (19.161030622443498 steps/sec)\n",
      "Step #20950\tEpoch   3 Batch 2199/3125   Loss: 0.718140 mae: 0.697813 (20.72603257971285 steps/sec)\n",
      "Step #21000\tEpoch   3 Batch 2249/3125   Loss: 0.732050 mae: 0.697316 (17.786605883912006 steps/sec)\n",
      "Step #21050\tEpoch   3 Batch 2299/3125   Loss: 0.864970 mae: 0.709818 (15.069732161213034 steps/sec)\n",
      "Step #21100\tEpoch   3 Batch 2349/3125   Loss: 0.783677 mae: 0.702673 (17.65038774173892 steps/sec)\n",
      "Step #21150\tEpoch   3 Batch 2399/3125   Loss: 0.719995 mae: 0.702025 (15.403004315613401 steps/sec)\n",
      "Step #21200\tEpoch   3 Batch 2449/3125   Loss: 0.757159 mae: 0.694068 (16.942456455508836 steps/sec)\n",
      "Step #21250\tEpoch   3 Batch 2499/3125   Loss: 0.895214 mae: 0.704725 (18.33493923636535 steps/sec)\n",
      "Step #21300\tEpoch   3 Batch 2549/3125   Loss: 0.622429 mae: 0.707494 (15.242684913728441 steps/sec)\n",
      "Step #21350\tEpoch   3 Batch 2599/3125   Loss: 0.935126 mae: 0.704805 (14.94417646560427 steps/sec)\n",
      "Step #21400\tEpoch   3 Batch 2649/3125   Loss: 0.869599 mae: 0.702036 (18.44096421753766 steps/sec)\n",
      "Step #21450\tEpoch   3 Batch 2699/3125   Loss: 0.750115 mae: 0.697896 (18.057596371452536 steps/sec)\n",
      "Step #21500\tEpoch   3 Batch 2749/3125   Loss: 0.924510 mae: 0.698542 (19.97780214569097 steps/sec)\n",
      "Step #21550\tEpoch   3 Batch 2799/3125   Loss: 0.813447 mae: 0.705895 (17.679407478445878 steps/sec)\n",
      "Step #21600\tEpoch   3 Batch 2849/3125   Loss: 0.817327 mae: 0.699028 (16.716910747493063 steps/sec)\n",
      "Step #21650\tEpoch   3 Batch 2899/3125   Loss: 0.825023 mae: 0.698883 (17.043679614075188 steps/sec)\n",
      "Step #21700\tEpoch   3 Batch 2949/3125   Loss: 0.844203 mae: 0.702106 (17.106902971164764 steps/sec)\n",
      "Step #21750\tEpoch   3 Batch 2999/3125   Loss: 0.722651 mae: 0.699768 (17.022765994923574 steps/sec)\n",
      "Step #21800\tEpoch   3 Batch 3049/3125   Loss: 0.757220 mae: 0.691070 (20.08489442882346 steps/sec)\n",
      "Step #21850\tEpoch   3 Batch 3099/3125   Loss: 0.854701 mae: 0.697093 (17.652886740654708 steps/sec)\n",
      "\n",
      "Train time for epoch #4 (21875 total steps): 172.3115131855011\n",
      "Model test set loss: 0.805467 mae: 0.708258\n",
      "best loss = 0.8054665923118591\n",
      "Step #21900\tEpoch   4 Batch   24/3125   Loss: 0.700578 mae: 0.707952 (30.984539584201954 steps/sec)\n",
      "Step #21950\tEpoch   4 Batch   74/3125   Loss: 0.836919 mae: 0.694345 (19.67673385013308 steps/sec)\n",
      "Step #22000\tEpoch   4 Batch  124/3125   Loss: 0.683775 mae: 0.702938 (18.016525205242058 steps/sec)\n",
      "Step #22050\tEpoch   4 Batch  174/3125   Loss: 0.758006 mae: 0.702068 (14.422156328144439 steps/sec)\n",
      "Step #22100\tEpoch   4 Batch  224/3125   Loss: 0.609306 mae: 0.699388 (13.94835187258848 steps/sec)\n",
      "Step #22150\tEpoch   4 Batch  274/3125   Loss: 0.776637 mae: 0.698963 (20.614467122209767 steps/sec)\n",
      "Step #22200\tEpoch   4 Batch  324/3125   Loss: 0.770275 mae: 0.702118 (21.244232208865203 steps/sec)\n",
      "Step #22250\tEpoch   4 Batch  374/3125   Loss: 0.793902 mae: 0.704710 (19.15049205070193 steps/sec)\n",
      "Step #22300\tEpoch   4 Batch  424/3125   Loss: 0.837741 mae: 0.705003 (19.781892381476663 steps/sec)\n",
      "Step #22350\tEpoch   4 Batch  474/3125   Loss: 0.762290 mae: 0.698549 (19.767672686902195 steps/sec)\n",
      "Step #22400\tEpoch   4 Batch  524/3125   Loss: 0.838741 mae: 0.693895 (21.377360640821745 steps/sec)\n",
      "Step #22450\tEpoch   4 Batch  574/3125   Loss: 0.831310 mae: 0.700731 (21.27240325333616 steps/sec)\n",
      "Step #22500\tEpoch   4 Batch  624/3125   Loss: 0.612761 mae: 0.698734 (20.498769185535505 steps/sec)\n",
      "Step #22550\tEpoch   4 Batch  674/3125   Loss: 0.811426 mae: 0.701995 (19.50730010350119 steps/sec)\n",
      "Step #22600\tEpoch   4 Batch  724/3125   Loss: 0.660438 mae: 0.699104 (19.40159170334461 steps/sec)\n",
      "Step #22650\tEpoch   4 Batch  774/3125   Loss: 0.753347 mae: 0.692317 (20.086425711519876 steps/sec)\n",
      "Step #22700\tEpoch   4 Batch  824/3125   Loss: 0.801164 mae: 0.699827 (20.30495699379064 steps/sec)\n",
      "Step #22750\tEpoch   4 Batch  874/3125   Loss: 0.720689 mae: 0.698485 (21.25504099178948 steps/sec)\n",
      "Step #22800\tEpoch   4 Batch  924/3125   Loss: 0.858892 mae: 0.702450 (19.878423388450173 steps/sec)\n",
      "Step #22850\tEpoch   4 Batch  974/3125   Loss: 0.745969 mae: 0.699487 (19.548608094569254 steps/sec)\n",
      "Step #22900\tEpoch   4 Batch 1024/3125   Loss: 0.769839 mae: 0.696994 (21.197630889168583 steps/sec)\n",
      "Step #22950\tEpoch   4 Batch 1074/3125   Loss: 0.817848 mae: 0.698017 (21.845670120747695 steps/sec)\n",
      "Step #23000\tEpoch   4 Batch 1124/3125   Loss: 0.795173 mae: 0.703592 (21.4181286773676 steps/sec)\n",
      "Step #23050\tEpoch   4 Batch 1174/3125   Loss: 0.721135 mae: 0.702466 (21.425213902025114 steps/sec)\n",
      "Step #23100\tEpoch   4 Batch 1224/3125   Loss: 0.754713 mae: 0.692455 (21.40551248452117 steps/sec)\n",
      "Step #23150\tEpoch   4 Batch 1274/3125   Loss: 0.887816 mae: 0.710222 (21.30429089332464 steps/sec)\n",
      "Step #23200\tEpoch   4 Batch 1324/3125   Loss: 0.694410 mae: 0.693461 (20.895654148483104 steps/sec)\n",
      "Step #23250\tEpoch   4 Batch 1374/3125   Loss: 0.733263 mae: 0.693484 (19.848704584945587 steps/sec)\n",
      "Step #23300\tEpoch   4 Batch 1424/3125   Loss: 0.786791 mae: 0.697715 (21.047366289159616 steps/sec)\n",
      "Step #23350\tEpoch   4 Batch 1474/3125   Loss: 0.702485 mae: 0.698943 (21.14160420583595 steps/sec)\n",
      "Step #23400\tEpoch   4 Batch 1524/3125   Loss: 0.706008 mae: 0.694510 (21.300179641855873 steps/sec)\n",
      "Step #23450\tEpoch   4 Batch 1574/3125   Loss: 0.765995 mae: 0.702509 (20.99578693053431 steps/sec)\n",
      "Step #23500\tEpoch   4 Batch 1624/3125   Loss: 0.814329 mae: 0.691788 (14.809322431863341 steps/sec)\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #23550\tEpoch   4 Batch 1674/3125   Loss: 0.711291 mae: 0.699889 (17.783771792657436 steps/sec)\n",
      "Step #23600\tEpoch   4 Batch 1724/3125   Loss: 0.869617 mae: 0.695721 (18.041441896228882 steps/sec)\n",
      "Step #23650\tEpoch   4 Batch 1774/3125   Loss: 0.762498 mae: 0.697375 (15.748884403456422 steps/sec)\n",
      "Step #23700\tEpoch   4 Batch 1824/3125   Loss: 0.699742 mae: 0.698182 (18.953381044431836 steps/sec)\n",
      "Step #23750\tEpoch   4 Batch 1874/3125   Loss: 0.867409 mae: 0.700424 (20.27585736151225 steps/sec)\n",
      "Step #23800\tEpoch   4 Batch 1924/3125   Loss: 0.965006 mae: 0.694853 (21.02829850880362 steps/sec)\n",
      "Step #23850\tEpoch   4 Batch 1974/3125   Loss: 0.761518 mae: 0.692496 (20.49723048437593 steps/sec)\n",
      "Step #23900\tEpoch   4 Batch 2024/3125   Loss: 0.853272 mae: 0.702434 (20.60425533001901 steps/sec)\n",
      "Step #23950\tEpoch   4 Batch 2074/3125   Loss: 0.839018 mae: 0.691413 (19.89791411536557 steps/sec)\n",
      "Step #24000\tEpoch   4 Batch 2124/3125   Loss: 0.636414 mae: 0.695023 (19.9965940554635 steps/sec)\n",
      "Step #24050\tEpoch   4 Batch 2174/3125   Loss: 0.727030 mae: 0.688550 (21.414900524938485 steps/sec)\n",
      "Step #24100\tEpoch   4 Batch 2224/3125   Loss: 0.692816 mae: 0.693844 (20.580070593259727 steps/sec)\n",
      "Step #24150\tEpoch   4 Batch 2274/3125   Loss: 0.804776 mae: 0.695659 (21.06603487205697 steps/sec)\n",
      "Step #24200\tEpoch   4 Batch 2324/3125   Loss: 0.630790 mae: 0.703828 (20.606581561855844 steps/sec)\n",
      "Step #24250\tEpoch   4 Batch 2374/3125   Loss: 0.728864 mae: 0.693762 (20.96268841936892 steps/sec)\n",
      "Step #24300\tEpoch   4 Batch 2424/3125   Loss: 0.732036 mae: 0.694484 (18.465093992578925 steps/sec)\n",
      "Step #24350\tEpoch   4 Batch 2474/3125   Loss: 0.845373 mae: 0.691247 (13.622812380049828 steps/sec)\n",
      "Step #24400\tEpoch   4 Batch 2524/3125   Loss: 0.736467 mae: 0.700203 (18.58940652590681 steps/sec)\n",
      "Step #24450\tEpoch   4 Batch 2574/3125   Loss: 0.855047 mae: 0.704018 (14.340161727795183 steps/sec)\n",
      "Step #24500\tEpoch   4 Batch 2624/3125   Loss: 0.927693 mae: 0.700423 (14.557431325071716 steps/sec)\n",
      "Step #24550\tEpoch   4 Batch 2674/3125   Loss: 0.785416 mae: 0.697434 (17.342469514363355 steps/sec)\n",
      "Step #24600\tEpoch   4 Batch 2724/3125   Loss: 0.732580 mae: 0.687295 (19.059820049077523 steps/sec)\n",
      "Step #24650\tEpoch   4 Batch 2774/3125   Loss: 0.716931 mae: 0.696593 (18.940469100733413 steps/sec)\n",
      "Step #24700\tEpoch   4 Batch 2824/3125   Loss: 0.787931 mae: 0.705047 (18.70766771452937 steps/sec)\n",
      "Step #24750\tEpoch   4 Batch 2874/3125   Loss: 0.806662 mae: 0.687177 (15.355600635413658 steps/sec)\n",
      "Step #24800\tEpoch   4 Batch 2924/3125   Loss: 0.833707 mae: 0.694723 (17.546198502983515 steps/sec)\n",
      "Step #24850\tEpoch   4 Batch 2974/3125   Loss: 0.710267 mae: 0.698859 (13.367490096217281 steps/sec)\n",
      "Step #24900\tEpoch   4 Batch 3024/3125   Loss: 0.696164 mae: 0.687586 (16.98731924133343 steps/sec)\n",
      "Step #24950\tEpoch   4 Batch 3074/3125   Loss: 0.764176 mae: 0.690245 (18.851789023768344 steps/sec)\n",
      "Step #25000\tEpoch   4 Batch 3124/3125   Loss: 0.836682 mae: 0.690732 (17.604053020538338 steps/sec)\n",
      "\n",
      "Train time for epoch #5 (25000 total steps): 166.06231880187988\n",
      "Model test set loss: 0.797666 mae: 0.705161\n",
      "best loss = 0.7976656556129456\n"
     ]
    }
   ],
   "source": [
    "mv_net = mv_network()\n",
    "mv_net.training(features, targets_values, epochs=5)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Plot the Training Loss"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 166,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAvIAAAH0CAYAAABfKsnMAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAWJQAAFiUBSVIk8AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzs3Xd8VFXex/HvmfRGCaF3EAiItCh1RRAVKQIWXOu6rmV3XeuKqCsoa3903V3b4666K+6DvSwo2BUxKIICgiAgvYcWCOkkmfP8kTBMGgS4NzczfN6vV15z59479/5myITvnDn3HGOtFQAAAIDQ4vO6AAAAAABHjyAPAAAAhCCCPAAAABCCCPIAAABACCLIAwAAACGIIA8AAACEIII8AAAAEIII8gAAAEAIIsgDAAAAIYggDwAAAIQggjwAAAAQggjyAAAAQAgiyAMAAAAhiCAPAAAAhCCCPAAAABCCCPIAAABACIr0ugA3GGPWS6onaYPHpQAAACC8tZO031rbvrZPHJZBXlK9uLi45K5duyZ7XQgAAADC14oVK5Sfn+/JucM1yG/o2rVr8sKFC72uAwAAAGEsLS1NixYt2uDFuekjDwAAAIQggjwAAAAQggjyAAAAQAgiyAMAAAAhiCAPAAAAhCCCPAAAABCCCPIAAABACArXceQBAEAN+P1+ZWZmKjs7W4WFhbLWel0S4BljjGJiYpSUlKTk5GT5fHW7zZsgDwDACcrv92vz5s3Ky8vzuhSgTrDWqqCgQAUFBcrNzVXr1q3rdJgnyAMAcILKzMxUXl6eIiMj1axZMyUkJNTp0AK4ze/3Kzc3VxkZGcrLy1NmZqZSUlK8LqtavFsBADhBZWdnS5KaNWumpKQkQjxOeD6fT0lJSWrWrJmkQ++Ruop3LAAAJ6jCwkJJUkJCgseVAHXLwffEwfdIXUWQBwDgBHXwwlZa4oHyjDGSVOcv/uadCwAAAAQ5GOTrOoI8AAAAEIII8g6r61/BAAAAIDwQ5B30zzlrddpDn+mfc9Z6XQoAAAghOTk5MsZo9OjRx32sU089VYmJiQ5U5ZxnnnlGxhi9/fbbXpcSVgjyDrHW6pEPV2p3zgE98uFKr8sBAAA1YIw5qp+pU6d6XTIQwIRQDqFHDQAAoee+++6rtO7vf/+7srKydMstt6hBgwbltvXq1cuVOhISErRixQpHWtLfeeedOj9sIpxBkHcIOR4AgNAzZcqUSuumTp2qrKws3XrrrWrXrl2t1GGMUWpqqiPHatu2rSPHQd1H1xqXcNErAADh62A/9Pz8fE2aNEknnXSSoqOjdeONN0qS9uzZo0cffVRnnHGGWrRooejoaDVt2lQXXnihFi5cWOl41fWRnzBhgowx+v777/XKK68oLS1NcXFxSklJ0ZVXXqmdO3dWW1uwmTNnyhijv/zlL1qwYIGGDx+uevXqKTExUWeddVaVNUnSpk2bdMUVVyglJUXx8fFKS0vTG2+8Ue54x2vevHkaO3asUlJSFBMTow4dOujWW2/Vrl27Ku27bds23XLLLercubPi4+PVsGFDde3aVddcc402b94c2M/v9+uFF15Qv379lJKSori4OLVp00YjR47U9OnTj7vmuoIWeYcQ3AEAOLH4/X6NHj1aq1at0vDhw9WoUaNAa/jixYt13333aciQIRo7dqzq16+v9evX67333tPMmTP16aefavDgwTU+12OPPaaZM2dq7NixGjp0qL7++mtNmzZNy5Yt0/fff6+IiIgaHWfu3LmaNGmShgwZouuvv17r1q3T9OnTNWTIEC1btqxca/6WLVs0YMAAbdu2TcOGDdNpp52mrVu36qqrrtKIESOO7sWqxptvvqnLL79cERERGj9+vFq1aqVvv/1WTz75pGbMmKGvv/5aLVq0kCTt379f/fr107Zt23TOOedo3LhxKioq0saNG/X222/ryiuvVOvWrSVJ
t956q55++ml16tRJl156qRITE7Vt2zbNnz9f06dP17hx4xyp32sEeYcQ4wEAOLHk5+crOztby5Ytq9SXvk+fPsrIyFDDhg3LrV+7dq369eun22+/Xd99912Nz/X555/rhx9+UOfOnSWVNiCOGzdO7733nj7++GONHDmyRseZMWOG3nrrLV100UWBdU888YQmTJigZ599Vo899lhg/e23365t27bp/vvv1+TJkwPrb7jhBv3iF7+oce3VyczM1LXXXitjjObOnatTTz01sG3y5Ml68MEHdeONN+rdd9+VJM2aNUtbtmzRpEmT9MADD5Q7VkFBgYqLiyUdao3v2LGjfvzxR8XExJTbd/fu3cdde11BkHdIxQZ5a6UQmRQMAIAqtbtrltcl1NiGR0d5ct5HHnmkUoiXpOTk5Cr379ixo8aMGaOXXnpJmZmZ1e5X0R133BEI8VJpn/prr71W7733nhYsWFDjID98+PByIV6Srr/+ek2YMEELFiwIrMvOzta7776rJk2a6I477ii3f//+/TV+/Hi9/vrrNTpndd566y1lZ2fruuuuKxfiJemee+7Riy++qBkzZmj37t1KSUkJbIuLi6t0rNjY2HL3jTGKjo6u8puK4GOFOkf6yBtjLjLGPG2MSTfG7DfGWGPMtKN4/Itlj7HGmJOcqAkAAMBtffv2rXbb7NmzdcEFF6hVq1aKjo4ODGH50ksvSZK2bt1a4/NUDLqSAt1I9u7de1zHSUpKUv369csdZ9myZSouLlZaWlqlkCzJkRb5RYsWSZLOPPPMSttiY2M1cOBA+f1+LVmyRJJ09tlnq3Hjxpo8ebJGjx6tZ599Vj/88IP8fn+5x/p8Pl1yySVasWKFunfvrsmTJ+uTTz5Rdnb2cddc1zjVIj9JUk9JOZK2SKrxZdfGmPMkXVP22Lo1e8FRsBU612zZm682jeI9qgYAALgtPj5eSUlJVW6bNm2afvWrXykxMVFnn3222rdvr4SEBBlj9Mknn2jevHlHNURkVa3+kZGlMa6kpOS4jnPwWMHHycrKkiQ1bdq0yv2rW380Dp6jefPmVW4/uH7fvn2SSlvS58+frylTpmjmzJmaNWtWoJabb75Zd955Z6AF/p///KdSU1P18ssv68EHH5QkRUVFacyYMXriiSfCZmQfp4L8bSoN8GsknSFpdk0eZIxpLOkFSW9Ialb22JBUsWvN9iyCPAAgtHnVXSVUmMP0oZ00aZKSkpK0ePFidejQody21atXa968eW6Xd1zq1asnSdqxY0eV26tbfzTq168vScrIyKhy+/bt28vtJ0nt27fXyy+/LL/fr2XLlunzzz/XM888o3vuuUcRERG68847JZWG9okTJ2rixInKyMhQenq6pk2bpnfeeUcrV67UkiVLanyBcF3mSNcaa+1sa+1qe/RDtzxfdvsHJ+oAAADwWnFxsTZu3KhevXpVCvFFRUV1PsRL0imnnKLIyEgtXLhQBQUFlbbPnTv3uM/Ru3dvSdKXX35ZaVthYaHmzZsnY0yVk3D5fD716NFDt912m2bOnClJ1Q4r2axZM40fP14zZsxQ3759tXz5cq1Zs+a4668LPBtH3hjza0njJP3WWrvHqzqcUuliV2/KAAAAHouMjFTLli21fPnyciOk+P1+3X333Vq/fr2H1dVMUlKSxo0bp507d+rxxx8vt23+/Pl66623jvscF198sRITE/XSSy8F+sEf9Mgjj2j79u2B8eUlaenSpVWOOHPw24H4+NKeEDk5OeUu3D2osLAw0J2nqgtmQ5Eno9YYY9pKelLSNGvtjOM4TtWzFxxFH30AAACn3XbbbZowYYJ69OihCy64QD6fT3PmzNGGDRs0YsQIffjhh16XeERPPPGE5s6dq3vvvVdfffWVTjvtNG3ZskVvvvmmzjvvPE2fPl0+37G3CScnJ+v555/XlVdeqQEDBmj8+PFq2bKlvv32W82ePVtt2rTRM888E9j/vffe0/33369Bgwap
U6dOSklJ0caNGzVjxgxFRERowoQJkkr71Pfr10+pqanq3bu32rRpo7y8PH300UdavXq1LrvsMrVp0+a4X5+6oNaDvDHGJ+lllV7cenNtn98tFS92ZX4oAABOXH/84x+VmJioZ555Rv/+97+VkJCgIUOG6M0339QLL7wQEkG+TZs2+vbbb3X33Xfr448/1ty5c9WtWze9/PLLys/P1/Tp0wN96Y/VpZdeqjZt2ujRRx/VzJkzlZ2drRYtWuimm27SpEmT1KRJk8C+Y8aM0a5du5Senq53331XOTk5at68uc477zzdfvvtgRF5GjVqpIcfflizZ89Wenq6du3apXr16qlTp0668847ddVVVx1XzXWJcXpGUmPMEJVe7PqKtfaKKrbfLukvkkZZaz8IWv+lSi927WStPa6OS8aYhX369OlT3XTDbigu8eukew69Kf/zm74a3LlxrZ0fAICjtWLFCklS165dPa4EoeaWW27RU089pblz52rQoEFel+OKmr4/0tLStGjRokXW2rTaqCtYrfaRN8Z0lvSQpJeCQ3w4iIwo/1K+tmCTR5UAAAA4Y9u2bZXWfffdd3r++efVokUL9evXz4OqcFBtd63pJilG0tXGmKur2Wd12XBO51trq778OAR8uKzqoZQAAABCRdeuXdWnTx+dfPLJio2N1apVqwLdgp599tnAWPbwRm2/+hsk/auabaNUOpb8W5L2l+0LAAAAj9xwww364IMP9MorrygnJ0cNGzbU6NGjNXHiRA0cONDr8k54tRrkrbU/SLq2qm1lfeSbSfrT8faRBwAAwPF75JFH9Mgjj3hdBqrhSJA3xoxT6ZjwUmkYl6QBxpipZcu7rbUTnDgXAAAAAOda5HtJqjiWT4eyH0naKIkgDwAAADjEkVFrrLVTrLXmMD/tanCMIWX70q0GAAAAnnF6eHa31OrwkwAAoO4oGyVOfr/f40qAuuVgkD/4HqmrCPIAAJygYmJiJEm5ubkeVwLULQffEwffI3UVQR4AgBNUUlKSJCkjI0PZ2dny+/0h06UAcJq1Vn6/X9nZ2crIKJ0P6OB7pK5iFH8AAE5QycnJys3NVV5enrZs2eJ1OUCdEh8fr+TkZK/LOCyCPAAAJyifz6fWrVsrMzNT2dnZKiwspEUeJzRjjGJiYpSUlKTk5GT5fHW78wpBHgCAE5jP51NKSopSUlK8LgXAUarbHzMAAAAAVIkgDwAAAIQggjwAAAAQggjyAAAAQAgiyAMAAAAhiCAPAAAAhCCCPAAAABCCCPIAAABACCLIAwAAACGIIA8AAACEIII8AAAAEIII8gAAAEAIIsgDAAAAIYggDwAAAIQggjwAAAAQggjyAAAAQAgiyAMAAAAhiCAPAAAAhCCCPAAAABCCCPIAAABACCLIO6hto3ivSwAAAMAJgiAPAAAAhCCCvIOM1wUAAADghEGQd5AxRHkAAADUDoK8g4jxAAAAqC0EeQAAACAEEeSdRJM8AAAAaglBHgAAAAhBBHkAAAAgBBHkHUTPGgAAANQWgjwAAAAQggjyDmIceQAAANQWgryDerSq73UJAAAAOEEQ5B1021mdvS4BAAAAJwiCvIPqxUYFLUd6WAkAAADCHUHeSUFd5K31rgwAAACEP0eCvDHmImPM08aYdGPMfmOMNcZMq2bfTsaYO40xXxhjNhtjDhhjdhhjZhhjhjpRj1eCr3XNLiz2rhAAAACEPaf6f0yS1FNSjqQtklIPs+8Dkn4p6SdJH0jKlNRF0hhJY4wxt1hrn3KorlpVccyafXkH1CA+2pNaAAAAEN6cCvK3qTTAr5F0hqTZh9n3I0n/Y61dHLzSGHOGpE8lPW6Mectau92h2mqNr8Lwk4XFfo8qAQAAQLhzpGuNtXa2tXa1tUfuGW6tnVoxxJetnyPpS0nRkgY6UVdtqziM/K7sQm8KAQAAQNiraxe7FpXdhmQHc1Ohc82C9ZkeVQIAAIBwV2fGSDTGtJU0TFKepK9q+JiF
1Ww6XB9911RskWeiVwAAALilTgR5Y0yMpFckxUiaaK3d63FJx6RicGcISgAAALjF8yBvjImQ9H+SBkl6Q9JfavpYa21aNcdcKKmPIwUehQia4AEAAFBLPO0jXxbip0kaL+lNSVfU5ILZuqriqDUAAACAWzwL8saYKEmvSbpE0quSLrPWhuRFrgdVzPH+0P1MAgAAgDrOk641xpholbbAj5X0H0lXW2tDftB1UyHJV7wPAAAAOKXWW+TLLmz9r0pD/L8UJiG+KiHcSwgAAAB1nCMt8saYcZLGld1tVnY7wBgztWx5t7V2QtnyPySNlLRb0lZJ91bRcv2ltfZLJ2rzUmxUhNclAAAAIEw51bWml6SrKqzrUPYjSRslHQzy7ctuUyTde5hjfulQbZ7p3rK+1yUAAAAgTDkS5K21UyRNqeG+Q5w4Z13Vs3UDLdm8TxJdawAAAOAeT4efDEe+oF5CfnI8AAAAXEKQd1jwWPK0yAMAAMAtBHmHZWQVBJZ37C/0sBIAAACEM4K8w7buyw8s3ztjmYeVAAAAIJwR5F20J/eA1yUAAAAgTBHkAQAAgBBEkAcAAABCEEEeAAAACEEEeQAAACAEEeQBAACAEESQBwAAAEIQQR4AAAAIQQR5AAAAIAQR5AEAAIAQRJAHAAAAQhBBHgAAAAhBBHkAAAAgBBHkAQAAgBBEkAcAAABCEEEeAAAACEEEeQAAACAEEeQBAACAEESQBwAAAEIQQR4AAAAIQQR5h41Pa+V1CQAAADgBEOQddm73ZoHl09o19LASAAAAhDOCvMN8PhNYjo+O9LASAAAAhDOCvMNM0LLfWs/qAAAAQHgjyDvMZ8yRdwIAAACOE0HeYcE5nhZ5AAAAuIUg77DgFnlyPAAAANxCkHcYfeQBAABQGwjyTgtK8j9uyfKuDgAAAIQ1grzDMnMPBJZzD5R4WAkAAADCGUHeYat35HhdAgAAAE4ABHmHRfgYfhIAAADuI8g7jBwPAACA2kCQd5hhQigAAADUAoK8w2yFIScr3gcAAACcQJB3mK9C35oSP0EeAAAAziPIO8yoQpCnRR4AAAAuIMi7jBwPAAAANzgS5I0xFxljnjbGpBtj9htjrDFm2hEeM9AY84ExJtMYk2+MWWqMudUYE+FETV6peK0rQR4AAABuiHToOJMk9ZSUI2mLpNTD7WyMGSvpHUkFkt6QlCnpPEl/kzRI0niH6qp1FcessSLJAwAAwHlOda25TVJnSfUk/f5wOxpj6kl6QVKJpCHW2mustXdI6iVpnqSLjDGXOFSX52iRBwAAgBscCfLW2tnW2tW2ZmMtXiSpsaTXrbXfBx2jQKUt+9IRPgzUZZW61nhTBgAAAMKcFxe7nll2+1EV276SlCdpoDEmpvZKck7FUWsYRx4AAABu8CLIdym7/bniBmttsaT1Ku273+FIBzLGLKzqR0foo++mwZ0bl7vPMPIAAABwgxdBvn7ZbVY12w+ub1ALtTiuflxU+RUEeQAAALjAqVFrPGGtTatqfVmrfJ9aLkdS5VFqGLUGAAAAbvCiRf5gi3v9arYfXL+vFmpxHV3kAQAA4AYvgvyqstvOFTcYYyIltZdULGldbRblFnI8AAAA3OBFkP+i7PbcKrYNlhQv6RtrbWHtleSe/flFXpcAAACAMORFkH9b0m5JlxhjTj240hgTK+nBsrvPeVCXIyoOP/nJTxkeVQIAAIBw5sjFrsaYcZLGld1tVnY7wBgztWx5t7V2giRZa/cbY65TaaD/0hjzuqRMSWNUOjTl25LecKIuL1S62JW+NQAAAHCBU6PW9JJ0VYV1HXRoLPiNkiYc3GCtnW6MOUPSPZIulBQraY2kP0p6qoYzxIaEsHkiAAAAqFMc6VpjrZ1irTWH+WlXxWO+ttaOtNY2tNbGWWtPsdb+zVpb4kRNXqnYteaDH7d7VAkAAADCmRd95E8oy7ft97oEAAAAhCGCvMvCqJcQAAAA6hCCvMuI8QAAAHADQR4A
AAAIQQR5l9GzBgAAAG4gyAMAAAAhiCAPAAAAhCCCPAAAABCCCPIOi4+JKHe/dXKcR5UAAAAgnBHkHVYvNqrc/XaNEjyqBAAAAOGMIA8AAACEIIK8yxh+EgAAAG4gyAMAAAAhiCDvMiua5AEAAOA8gjwAAAAQggjyLrv41NZelwAAAIAwRJB32cqMbK9LAAAAQBgiyLvsuS/Xel0CAAAAwhBBHgAAAAhBBHkAAAAgBBHkAQAAgBBEkAcAAABCEEEeAAAACEEEeQAAACAEEeQBAACAEESQBwAAAEIQQR4AAAAIQQR5AAAAIAQR5AEAAIAQRJAHAAAAQhBBHgAAAAhBBHkAAAAgBBHkAQAAgBBEkAcAAABCEEEeAAAACEEEeZf5jNcVAAAAIBwR5F1ggsK733pXBwAAAMIXQd4FlvAOAAAAlxHkAQAAgBBEkAcAAABCEEEeAAAACEEEeQAAACAEeRrkjTGjjDGfGGO2GGPyjTHrjDFvGWMGeFkXAAAAUNd5FuSNMf8jaaakPpI+kvSkpEWSxkr62hhzhVe1AQAAAHVdpBcnNcY0kzRB0g5JPay1O4O2DZX0haT7JU3zoj4AAACgrvOqRb5t2bnnB4d4SbLWzpaULamxF4UBAAAAocCrIL9a0gFJfY0xKcEbjDGDJSVJ+syLwgAAAIBQ4EnXGmttpjHmTkl/lfSTMWa6pD2SOkoaI+lTSb890nGMMQur2ZTqVK0AAABAXeRJkJcka+3fjTEbJP1b0nVBm9ZImlqxyw0AAACAQ7wctWaipLclTVVpS3yCpDRJ6yS9Yox57EjHsNamVfUjaaWLpQMAAACe8yTIG2OGSPofSe9Za/9orV1nrc2z1i6SdL6krZJuN8Z08KI+AAAAoK7zqkV+dNnt7IobrLV5khaotLbetVkUAAAAECq8CvIxZbfVDTF5cP2BWqgFAAAACDleBfn0stvrjTEtgzcYY0ZIGiSpQNI3tV0YAAAAEAq8GrXmbZWOE3+WpBXGmP9KypDUVaXdboyku6y1ezyqDwAAAKjTvBpH3m+MGSnpD5IuUekFrvGSMiV9IOkpa+0nXtQGAAAAhAIvx5EvkvT3sh8AAAAAR8GzceQBAAAAHDuCPAAAABCCCPIAAABACCLIAwAAACGIIA8AAACEIII8AAAAEIII8i7o2y7Z6xIAAAAQ5gjyLvjtGR28LgEAAABhjiDvgpTEmMByarMkDysBAABAuCLIu8CYQ8uREab6HQEAAIBjRJB3gS8oyVvrYSEAAAAIWwR5l/kJ8gAAAHABQd4FwV1rLE3yAAAAcAFB3gVGh5L8yoxsDysBAABAuCLIu8BX4VUtKCrxphAAAACELYK8C4Jb5CUp7wBBHgAAAM4iyLvAMOIkAAAAXEaQd4GPIA8AAACXEeRdQZIHAACAuwjyLqjYtYYhKAEAAOA0grwLfHSSBwAAgMsI8i4gxgMAAMBtBHkX0CAPAAAAtxHkXVBxHHl6yAMAAMBpBHkXVL7Y1Zs6AAAAEL4I8i6gaw0AAADcRpB3gSHJAwAAwGUEeRcwsysAAADcRpB3QeWLXekkDwAAAGcR5F1AzxoAAAC4jSDvAnI8AAAA3EaQdwEXuwIAAMBtBHkXVMrxdJEHAACAwwjyLoigRR4AAAAuI8i7ICaq/MtKgzwAAACcRpB3QWxkhNclAAAAIMwR5F3gqzAjlKVJHgAAAA4jyLukab2YwDITQgEAAMBpBHmXBM/uSos8AAAAnEaQdwkD1wAAAMBNBHmXFPsPNcPTIA8AAACnEeRdsiu7MLCcW1jsYSUAAAAIR54HeWPMMGPMf40xGcaYQmPMNmPMx8aYkV7X5pTXF2z2ugQAAACEmUgvT26MeUzSHZK2SHpP0m5JjSWlSRoi6QPPinNQsd/vdQkAAAAIM54FeWPMdSoN8S9Lut5ae6DC9ihPCnOBjytfAQAA4DBPutYYY2IkPSRpk6oI8ZJkrS2q9cJcQo4HAACA07xqkT9bpV1o
/i7Jb4wZJam7pAJJC6y18zyqyxWMIw8AAACneRXkTyu7LZC0WKUhPsAY85Wki6y1uw53EGPMwmo2pR53hQAAAEAd5tWoNU3Kbu9Q6TDrp0tKktRD0ieSBkt6y5vSAAAAgLrPqxb5gx8giiWNsdZuKLv/ozHmfEmrJJ1hjBlwuG421tq0qtaXtdT3cbBeAAAAoE7xqkV+X9nt4qAQL0my1uZJ+rjsbt/aLAoAAAAIFV4F+VVlt/uq2b637DauFmpxneVqVwAAADjMqyD/uUr7xnczxlRVw8GLX9fXXknuiY2K8LoEAAAAhBlPgry1dqOk9yW1kXRL8DZjzDmShqu0tf6j2q/OeYNOSvG6BAAAAIQZr1rkJekPkjZL+qsx5jNjzOPGmLclfSCpRNK11tosD+s7LoNOahRYZkIoAAAAOM2rUWtkrd1ijEmTdK+kMSodcnK/SlvqH7HWLvCqNicYHUrvdJEHAACA0zwL8pJUNuHTTWU/YSW4FZ4cDwAAAKd52bUGAAAAwDEiyNcChp8EAACA0wjyLjFBfWuI8QAAAHAaQd4l2/flB5bzCks8rAQAAADhiCDvktU7cwLLT37+s4eVAAAAIBwR5GvBzztyjrwTAAAAcBQI8gAAAEAIIsgDAAAAIYggDwAAAIQggjwAAAAQggjyAAAAQAgiyAMAAAAhiCBfC0b1aO51CQAAAAgzBHmX/Hpgu8Byl6ZJ3hUCAACAsESQd0m9uKjAst9aDysBAABAOCLIu8QELfvJ8QAAAHAYQd4lPhMU5WmRBwAAgMMI8i7xBeV4WuQBAADgNIK8S3xBSZ4+8gAAAHAaQb4W0CIPAAAApxHkXRLcR96KJA8AAABnEeRd4uNaVwAAALiIIO+S4BZ5P31rAAAA4DCCvEsMo9YAAADARQR5lxj6yAMAAMBFBHmX0EceAAAAbiLIuyQoxzOOPAAAABxHkHcJE0IBAADATQR5l5TrI0+OBwAAgMMI8i4p37XGszIAAAAQpgjyLgkeflKMWgMAAACHEeRd4qNrDQAAAFxEkHcJo9YAAADATQR5l9AiDwAAADcR5N0S1CTPxa4AAABwGkHeJeVa5LnYFQAAAA4jyLskuI88XWsAAADgNIK8S3xBr6wlyQMAAMBZ9t6BAAAgAElEQVRhBHmXmKA2efrIAwAAwGkEeZcETwhFjgcAAIDTCPIuMSa4RZ4oDwAAAGcR5F0SfLErTfIAAABwWp0J8saYK4wxtuznWq/rOV7Bw09+vDzDw0oAAAAQjupEkDfGtJb0jKQcr2txyqJNewPLxVztCgAAAId5HuRNaWfylyTtkfQPj8txzPrduV6XAAAAgDDmeZCXdLOkMyVdLSls0m8JrfAAAABwkadB3hjTVdKjkp601n7lZS1OY6QaAAAAuCnSqxMbYyIl/Z+kTZL+dIzHWFjNptRjrcspRSV+r0sAAABAGPOyRf5eSb0l/dpam+9hHa4Y07Ol1yUAAAAgjHnSIm+M6afSVvgnrLXzjvU41tq0ao6/UFKfYz2uE/q2bxhYjo+O8LASAAAAhKNab5Ev61LzH0k/S5pc2+evPYfGkW9WL9bDOgAAABCOvOhakyips6SukgqCJoGyku4r2+eFsnV/96A+RwTNB8XErgAAAHCcF11rCiX9q5ptfVTab36upFWSjrnbjdeCcrwsI9gAAADAYbUe5MsubL22qm3GmCkqDfIvW2tfrM26nGaCmuQPFDOCDQAAAJxVFyaECkvBLfLbsgo8qwMAAADhiSDvkq37wm5ETQAAANQhdSrIW2unWGtNqHerkZjZFQAAAO6qU0E+nJhynWsAAAAAZxHkXWLI8QAAAHARQR4AAAAIQQR5l9AgDwAAADcR5N1CkgcAAICLCPIuaV4/zusSAAAAEMYI8i5p0SDW6xIAAAAQxgjyLvEFDVsT6aOfDQAAAJxFkHdJcJBncigAAAA4jSDvkuBGeL+VLGEeAAAADiLIu8RUmBFqc2a+R5UAAAAg
HBHka0lWfpHXJQAAACCMEORrSQldawAAAOAggnwtKfH7vS4BAAAAYYQgX0v8NMgDAADAQQT5WlIvNsrrEgAAABBGCPK1JDKCSaEAAADgHIK8izo2TggsM448AAAAnESQd1H52V09LAQAAABhhyDvouAgX0KSBwAAgIMI8i7y+YJb5AnyAAAAcA5B3kVBOV7keAAAADiJIO+i8n3kSfIAAABwDkHeRcEt8nSRBwAAgJMI8i4ytMgDAADAJQR5F5XvI0+QBwAAgHMI8i4qP/ykh4UAAAAg7BDkXcTFrgAAAHALQd5FvqBXlyAPAAAAJxHkXbQ5Mz+w7KdrDQAAABxEkHfR1n2HgvysH7d5WAkAAADCDUG+lnz6006vSwAAAEAYIcjXkl+e1srrEgAAABBGCPIuGtixUWC5S7N6HlYCAACAcEOQd1FKYkxgmQmhAAAA4CSCvIuCZ3Zl+EkAAAA4iSDvonITQjH8JAAAABxEkHeRYWZXAAAAuIQg76LgrjXkeAAAADiJIO8iHy3yAAAAcAlB3kXrducElpdty/KwEgAAAIQbgryLvtuwN7A87dtNHlYCAACAcONJkDfGNDLGXGuM+a8xZo0xJt8Yk2WMmWuMucYYwwcMAAAA4DAiPTrveEnPSdouabakTZKaSrpA0ouSRhhjxltmUQIAAACq5FWQ/1nSGEmzrLWBEdaNMX+StEDShSoN9e94Ux4AAABQt3nShcVa+4W19v3gEF+2PkPSP8ruDqn1wgAAAIAQURf7oheV3RZ7WgUAAABQh3nVtaZKxphISb8qu/tRDfZfWM2mVMeKAgAAAOqgutYi/6ik7pI+sNZ+7HUxAAAAQF1VZ1rkjTE3S7pd0kpJV9bkMdbatGqOtVBSH+eqAwAAAOqWOtEib4y5UdKTkn6SNNRam+lxSQAAAECd5nmQN8bcKulpSctUGuIzPC7JMRef2srrEgAAABCmPA3yxpg7Jf1N0g8qDfE7vazHaWN6tgws922f7GElAAAACDeeBXljzGSVXty6UNIwa+1ur2qpDZE+43UJAAAACCOeXOxqjLlK0v2SSiSlS7rZmEpBd4O1dmotl+ao4KdkrXd1AAAAIPx4NWpN+7LbCEm3VrPPHElTa6Ual9AGDwAAALd40rXGWjvFWmuO8DPEi9rc4qdJHgAAAA7yfNSacBYc3eevZ0RNAAAAOIcg76JVGdnl7hcUlXhUCQAAAMINQd5FPVs3KHc/dfJH2p6Vr7cXbtGIJ9P1xnebPKoMAAAAoc6ri11PCG2S4yutmzx9uT5bsUOSdOc7P+qitNaKYGhKAAAAHCVa5F0UHVH55V23K6fc/Rk/bNW/565XdkFRtcex1mr97lz5/VwwCwAAgFK0yLsoMqJyS/u63bnl7v/xzSWSpI17cvXnsd0lSU9/vlrfb9yried20ckt6uv+mT/ppa83qH+HZL1+/QD3CwcAAECdR4u8i6oK8tV5ed5GSdLCjXv1xKc/a87Pu3TZC/MlSS99vUGS9O26zEot+gAAADgx0SLvoijf0X1O6nzPhzpQ4g/cz8qv3N0m7wAj3wAAAIAWeVf5jvIi1uAQf5CtMJEUQR4AAAASQb7Oa3/3B+XuX/zPebryX/O1ZPM+jyoCAABAXUDXmhCUvnq30lfv1r2ju+n+mT/JZ6Qf7jtH9WKjvC4NAAAAtYQgH8Lun/mTJMlvpcGPzVbLBnEaltpEfzyni8eVAQAAwG10rQkT+/KKtHzbfj31xRq1u2uW3luyrdx2a22NxqH/eHmG/vDKIi1Yn+lWqQAAAHAALfJh6ubXFqt1wzh1bV5Pa3bmaPTTcyVJXZomqXFSjEb3aK5L+rYp95i8A8X67f8tlCTN+nG7Njw6qtbrBgAAQM0Q5MPY+f/7TaV1q3Zka9WObM1ds1tndWuqlMQYWWtljNGu7EIPqgQAAMCxoGuNyxKiI7wuoVqnPviZ2t01S2c8/qUWbtyrlRnZVe63dMs+rapiW2FxiX77
f99r7DNztbYGE1UVl/i1Zmd2YEjNH7dk6Z7//qh5a/dozc5sTV+8Vfl1YHjNndkF2rA7VwVFR66lqMSv9NW7lJVXecz/o/Xuoi266bXFWr4t67iPdSTfb8jUW99vrtFzDFfZBUUqqUF3M0nKyCrQjB+2Krvg+P+dv1mzW7OWbldRFcPNOs3vt5WGsEX1Fm3aq1lLt+tA8fH/2/y8I1t7cmgcqWustfpm7W6t2L7fkWOtzNhfK+9loDq0yLvs+sEd9bfPfva6jMPalJmnC5+r3Hrf7q5Zuiitld5euEWS1D4lQb1aN9D9Y09WUmyUXkxfr4+X75AkDXtijl69tp8kqX+HRlq6NUvfrc/UBX1aKr+oRJ8s36FnZ6/RntwDuuYX7TV5dDed90xpd59X5m9SVIRRUYnV1YPa6b7zTnbkeR38puForNuVozOfmBO4/9bvBui0dsnV7n/vjGV6bcFmtWoYpzl3DFXEUc4dcNC2ffn645tLJEmzV+7Usj8PP6bj1MTmzDxd9I95kqSt+/J161mdHT3+/HV7lLG/QCO6N1d0pE//nLNWc9fs1oRzuqhn6waOnutYfbFyh254ZZGa1YvVB7ecrvjo6v8UlvitLnzuG23dl68R3ZvpuSvSjvm8S7fs02Uvls7Y/MgFp+jSCt3bnLRmZ7au+vd3io+O0OvX91ejxBjHjl1QVKI3v9+slMQYjTyluQqKSnTfjOXKPVCsKWNOVoqD5zoeX6zcoX98uU4X9GlZqSthRet25eiCsm8xJ4/upmt+0f6Yzzt98Vbd+sYPion0Kf3OoWqSFHvMxzqSb9bu1kOzVuj0To1114hUR4+9ZW+e3vhus07v1Fh92ydr/e5cPTRrhTo2SdBd56Ye9d9Xt7y3ZJu++nmXrh/cQZ2bJh123/8u3hr4W/vJbYOPuP/h3Pfecv1n3kb1bN1A028Y6Orr8e+56/Xagk26YWhHnd+7laPHnr9ujz5fuVO/PK21OjZO1Jyfd+nF9NL3jdPnOlbWWr2/dLuy8os0Pq2VYqPqbiNpbTPh2FpjjFnYp0+fPgsXLvS6FBUUlSh18kdel+GotLYNNXl0N4179usj7jsstYm27M3Xqh3lW/TP791S/128tcrHfH77GYry+dSmUXylbemrd+nKfy1Qi/qxurRvGyXGRupXA9opwme0fneuXl+wSUNTm2jH/gLd8voPkqSlU6ofmtNaqxfT1ytjf4HG9mqhe2cs1w9BY/RH+IzWPjyy2ufX7q5ZgeWbh3XSbWd1CvwxLy7xa1Nmnjo0TtSWvXmaPH2ZGiXG6OHzT1F0ZPkvwz5alqHfTTv0+1qT6xOstZq/PlMFRSUa3KmxfD6jnMJird+Vq+4t61X7n8of3/xB7y469Nq/fn1/9WnTUIXFJUqKjdKO/QVqlBCtyIjSGnfsL9AjH6xQk3qxuvPc1HIfVnZmF6jggF9tGsVrw+5czfhhW+CD6+TR3TSgQyONfCo9sP89I7vqQIlf1/yi/WH/EG/Zm6c3v9+iMzqnqE+bhtU+l9krd2r6D1t1Zf+2OvUwH7iCj/v+ku36n49Wllv/84MjKv2bHLRk8z6NDfpd/+n+4eWCv7VWRSVWET4ja62+XZepU1rWV/34yr9zo59O17Kth1oCa3odSnGJP/DvIZV+ICgo8uu0dtW/Nmf/dY5W7yz9pmxUj+Z65tLeyiksVlxUhCIjfMrKL1L9uEM1ZuUV6cNl29W/QyO1S0kod6z8AyWKifTJ5zPamV2g575cq5e+3iCp9Pdn3to9evLz1ZKk09o1VJdmSUptVk9X9G972Oe1aNNezVm1SxeltVLr5Mrv94Nenb9JP+/I1u+HdFTTeocPxdZaLdy4V/PXZ+rxj1cF1v/v5X008pTm1T7umqnf6fOVOwP3K/7bZOUVKSbKp6gIn3xG2rG/UM3qV11L8N+F83u31N9+2euwNVf3PJZsyVKrhnGH/WAUfK7pfxikXq0bKCu/SPViS39HcwqL
lRT092/bvnwtWJ+pYV2blFtf0b68A7rqpe8Cc5YsufccXfzPeYG/5b8e2E57cg/ogj4tNbRLk8M+l/eXbFNGVoEu69dGCTFVf2jOLijSk5+tVlx0hG46s1O178eDrLX65Kcd+jkjW098eqixbOGksw77oTX49Upr21Dv/H5gue15B4oVHx0Z+LYuv6hEidXUHHysD24+Xd1a1DtszVUp8Vst25qlbi3qKSqi6uecU1is7vd9HLh/8HezqMSvqAifCotLZGTKvWYbdudqV06hTm1b/d8ISdpfUKQeUz6RJLVqGKe5d55Z7nk9MK67lmzepxuGdFSHxonVHie3sFj/mbdRjRKjNT6tVbXn3JyZp3/NXa++7ZMP+348qLC4RB8ty9CG3XmB/1v6tk/Wa9f1P+aGMzekpaVp0aJFi6y1x97Sc4wI8rUg+E2BmkuMiVT6xKFqmBCtmUu36d9z12vRpsoTYfVtn6wBHRoFgsThfHjL6UptlqSvVu/Wi+nrlL569xEfc/CDwOsLNunJz1drYMcUTTy3i5Zvy9Jvpn5fbt8nxvfU2F4tVOy36vnnT1RYxVf01w/uoKsHtVPz+nGBddUF+fwDJZr6zQbNXLpNt53VWR0aJ6hJvVgt3LhXV/17QWD//728j3q0qq+z//qV8oO6y/xlfE/1aFVfizftVauG8RrYsZFufHWxZv24vcrnOrpHc836cbvapyTo41sHKyrCp873fBiYdfhPI1OVlV+kpvVitWjjXk3/oXR0pEEnNdLXa/ZUOl67RvHasCevynOlTxyqtxdu0RldGqtPm4blto179uvAB6q4qAiN6N5Mfy0LQ6/O36SPlmfo+tM76Ip/zQ885k8jU3XVwHaKiTz0AcFaq/yiEk2avqzch5eKUpsl6aNbB8vvt/ry553atCdP/To0Utfm9fTD5n3lPrQ2SYpR+p1DtS+vSO8v2aanv1ijrPzyXW5aNojTVxMrf0Nz3tNz9ePWQ12ngv9DfmjWCm3Pytfk0d3UqmFpqM3KK9LfPvtZU7/ZoO4t6+m/NwzSj1uzAi3HknT3iFRdP7iDJGn5tv3q0DhB8dGR1f7diYn06axuTTVr6fZy34ANe+JLrd2Vq0YJ0Uq/c6i+27BXTevFaMST6bJWSkmM1hX92+rvn5V/n3VrXk8/VdNN4Y7hXXR+75b6fuNeDUttUi7E5R0oVrd7D4WTk5ok6g9lrY3WWj3/1TrtzC5U/w6NdN1/Dr3Pnr8yTeec3KzSufbmHtCop9K1Laugylok6du7h6lZ/VgVFJXo0592KLewWENTm6hpvVj9Zup3+iIoyN8wpKMmnpuqOT/v0gtfrdPcNZX/Vtw49CRNGF55uN/g1/6Mzo318m/6yu+3ysw7oH98uVYpSTG6/vQOgdm/8w4U66NlGXohfb2uH9xe5/dupalfr9eU938KHOP2czqrR6vSb7T25BQGwmp1/84DOjTS3rwDWrcrV3+5uKfG9GyhohK/Ot3zoaTSD3d/vbinVu/I0a6cQl390neSpF8NaCuj0m9Ki4O6nj11aW/d/NriKs/19V1nqrjEr905hZU+eH+/ITPwDaAk9e+QrD+P6a4uzZKUlV+kf6WvU5N6sVq/O1f/mrtekhQd6dOcO4aU+xt50KJNe8v9/ldl/SMjZYxRZu4BfbFyp2KjfDoztUmV74u3fjdAPVs10L+/Xq+Xvl6vHftLu0PFRUUov6hEcVERev5XaTq9U+NK5wk+1rs3DFSfNg2VlV+k/flFeuv7zfpF2TcZBxUWl+izn3Zqwfo9uuYXHdSmUbxufm1xYJS5SaO66pentVZSbJSstSr2W0VF+JSRVaD+j3weOE5qsyRt3ZevvAMlurJ/W81cuk2S0du/G6B2KQlauytHw8q+WX7swh4a1aO5sguK9f6SbXrogxWSpAfGnqzvN+7VjB/Kj3A3/0/D1O/hz1WVtQ+P1MY9uaofF1Xpw9LjH6/Us7PXSpKSE6I1/ORmuu+8
boqNitCG3bl6dcEmDenSWA9/sCLQmPHbwR10+zldqvzQ9s7CLbr9rSVV1iGV/p7eP7a7JGn97lwt3bJPbRslqGer+p58U0SQd1hdC/KvL9ikKe8vV0FR1f3oOjROUP6BEm0/zH9ACD9X9m+ry/u30esLNmvqNxsqbU9OiFZm7gFHzzm6R3PNXFp1iK9oynnd1KVZPV36wreO1lCRMdJPfz5X63fn6pbXF+vkFvUCHxCC3Tu6m77bkKkPl2Uc8ZgPjuuuS05rrYv+Ma/cNyyHM7ZXCw3qmKKJ7ywNrLvpzJP09Bdrav5kKmhRP1ZfTBiiNTtzNH99ph4om/vhoHdvGKhHP1ipBRtqNtzr8JObBrqzHU7bRvHaWM0HqIpWPzRC7y/ZFuhu4JaxvVroyUt6a/66PfpoeYbioiL0v1+urbTfgj8N013v/lguVFdnyb3nKCLCaPRT6dV+YKzo5d/01ac/ZWjat5sklX7rNu2afsf8ex4XFaEHxnXXhX1a6ouVO5WVX1TutezWvJ4eOr+7bnx1sbbuyw+sH5bapNw3AMHe+t0AjQ8Kv07Y8OgojXgy3ZG+4YfzxPieujCtlT79aYc27snVg7NWVNqnRf1YfXjrYF338vdH/N3v3rKeZt50ujKyCnTuk19pXw2vR0qfOFTXvPydft5R+q1U/w7J+u3gjrp66ndH/6TK6vjzmO46uUU9vffDNrVsGKfLXzzUkPDG9f31845sPTBrRblrLP54dmf99dPK3WsToiP0xm8HBEaUq06Hxglatyu3RjX2adNAb/9uoDr86YMj73wMYiJ9Kiz2KybSpy/vGKKG8dF66/vNSoyN1G1vVP77ccuwTvrVgLYa/vd07T7C9SJ3npuq3w/pqB+3ZOnC574JNB4dzoRzOuuCPq008NEvAusmnttFNww56eif3HEiyDusrgV5STpQ7Nfd7/6odxZtCay7KK2VJg7voiZlXxV3nvShIxdZAQAAhJJRPZprVg0bmg7nywlDKnUPdJuXQZ5Ra2pJdKRPl/U7dLHV6B7N9ZfxPQMhXpJWPXCuvpwwxIPqAAAAvONEiJekG19b5MhxQgVBvhaltW2oxy/qod+d0VF/HlN5ZBZjjNqlJGjKed0qbYuN4p8KAADgcIIHFDgRMPxkLRt/ausj7vPrQaXDngVf6PTSr09zrd8bAAAAQg9Bvo66rF9bxURFKD46Quf1aCGfz+jqQe0CQ76d17OFGifGKCrSqH5clB77aNXhDwgAAICwQpCvo6IjfZUmi5k8qptG92ih1GZJlcbivWHISVq6ZZ8u/ue8akfHAQAAQPggyIcQn88orW3Darf3aNVAKx8YIal0Frj7Kwx1BwAAgPBBkA9TVw5oqy9W7tTaXTl66tLeOq1s1suZS7fpxlerntQjWMfGCVpbw7FrAQAAUPsI8mEqKsKnadf2k7W23Cxno3u0UFJslNbvylFsVITuevfHSo81pnQWv1FPVT1RxcJJZ+mF9PX6x5zKk7kAAAB45aHzu3tdQq0iyIe5qqYqPqNzY53RubH8fquWDeMUFxWh9ikJOvfJdOUUFOulq0/TyS3q67dndNCcVbu0MiNb0ZE+TTnv5MBY+HeNSNUdw7toT26hCov8ionyKcrn04tz1+nZ2Wt1dremOq9nCz3+8Uqd1i5Zl/drowufOzRTYfBsdYsmny2fkXrd/+lxPddIn9EtwzrpiSpm0QtVDeOjtLeGMxkeLb51OTEczWy+ABDqLujdyusSahWDk5/AfD6j0zs11qntktUoMUZf33mmFtwzTP07NJIk3T2iqz66dbA2PDpKS+49p9yEVlLp1OZNkmLVOjleTZJi1TAhWncMT9WGR0fphV+dqjE9Wyh94pn668W9lNY2WXcM76IzU5vow1tO12e3naFXr+2neXefqeSEaDWIj9antw3WDUM66uxuTTW6R3P94qQUje7RXN/dc5aCP4/cPSJV3949TGsfHql3fj9AKx84V8v+PFwr
HzhXNw3rpPSJQ3Xj0KqnaD4ztYk2PDpKC/40TPPuPlMbHh2lV6/tF9h+61mdAsstG8RpcOfGVR5nzh1D1LlpYrl1l/dro6SYyp+Np159mu4Z2VVtkuMD65okxejZy/oosYr9gy2+9xw9flEPSdL4tFZacu856tI0KbD9orTKf7DioiLULGiiseq0bBivmTf9osptf724p6b/YVCl9aN7NK/2eB0bJyg6svo/KWseGqFlfx5e7fabz6z8b3ZSk0T9YWhH9W2frHNPbqboCJ8mj+6mj249Xe2PMHPfc5f30a8HttNvBrXXT/cPL/f619SkUV1rvO9jF/ZQk6SYcuuiI32689zUIz526ZRz1LZRaX2X9m2td34/8IiPaZQQXaO6rh7UTsNPblrltucu76MOjY9uBsTzerYod793mwbl7m94dJT+ddWpR3XMcb1a6LazOpf7G/PYRT307GV91LJB3GEf+9vBHXRSk0TdMKSjltx3zlGd96BzT25Wo/0axEdp8ujK83yc1bWprju9/WEfe1FaK/3v5X0C9/96cc8q5ww5VhseHVXttv/ecOTfp4oeveCUcvdP75QSWG6THK8Nj45Sj1b1j+qYt53VWded3l5X9D/07/zO7wfonpFdFR8dcdjHjurRXMkJ0frL+J56+Td9j+q8Ryu1WZKuH9yh0vo7hndRz9YNqnjEIdOu6adfD2wXuP/BzafrrK5Vv/+ORfrEodVue/rS3kd9vOB/C0nqE/R+fvrS3lr78MijPubff9lLVw1oq75lXXql0m/yL6+QIapy8P/Q//ymr0Yd5v+b6vxmUHvFHeF3KdwYa63XNTjOGLOwT58+fRYuXOh1KXDI4k179ZdPVmnQSSm6YUjVIT2YtVZPfb5GP+/M1jndmurud39UTKRPH94yWM3qx1bad+HGvTpQ7NeAjo2UX1SiL1buVJ82DdW8fqza331o/P7Pbz9DHRuXBvhNe/L0fPpardyerbG9W+qyvm10oNivT1fs0C2vL1aEMfr4tsGB/Q+ea1dOoZokxZZbN/HtpXpr4RbdMKSjrh7UXs9/tVY3ntlJ9eOiqnxu1pZ+EJOkn3dk674Zy+W3Vg9fcErgfIs37ZWVlFtYrPnrMnXlgLbq9/DngeOMT2ulx8f31PJtWcopKFbbRgkqLC5R20aHgl3+gRLtzilU66AQbK3V2l05WrcrV52bJik5MVpzV+/WoJNSNG/tHv1uWun77slLeqlVw3i9On+Txp/aKvABMaewWHNW7dInP2UoNjJC3VvVV+cmierZuoFSJ38UOE/6xKFq1TCuym+VDtbxxnebq+wetnDSWWqUWD5Ur96RrYc/WKHU5vU0cXgX/WPOOn2zdrduHtZJ0RE+jX3268C+/7wyTWemNlFUhE/vL9mmm14rva4kJTFG39x1prLyi/Sn//6oT3/aUfpaTz5bDROi5fdb5R4oVmbuAf1/e3ceJkV17nH8+84wgiAgICiCyCruiiiLelU04oaJSUyi3qgkccmTuEa9Jm7BJEbjEjFeb0xcogGNBh+VxKAiKriCC27IsDPINqzOMMzArOf+cU4PTU/3LD1DLzO/z/P0UzNV51SffvtU1dunq6s2bi3nqH7dats/a9EGLn78A8Anjicf1ItX5hVy/w+OTPg+F5VV0j4vB+dg4oxFPPL2cg7Ztwv/uGwUXTrkserrMh57ZzkjB/Tg4VlLqXGOyZeM5PAJ02vX8+8rjmdQr0489/Eq+nXvyN5dOlBV7di7a3t6de5AdY2PY++uHThpaE82l1bQZfc8Xp5XSMHGUnrssRv/NbgnG0vL+WJVMecM68PJ985kU2kFAI+PP5qiskpe/bKQW8cdTN9uvq+sLtrGndPy+XRlEe3b5XD84L34wTH9KNpWwQWPzAFg/x4dufM7hzFqQI/a/gxQU+N2+v+SJz9iRv66neKTl2ssvqNukvHa/HU89s4yzjumH6cfug+3TZ1HZbXjhtOGctvUeczIX19b
9obThnLZCQNpl2Pc8uI8nprzFQDHDe7B5J+MZOG6Ek6f+DYAPTu358ObvwH4baqqxjH9y0J6denAifvTr10AABQVSURBVFEf+K94em7tNyA3nDaUDws203G3XB48/yhyc+r25eoax/KNpfTr3pHC4u1cNukjFhSWcOrBe/PIRUfXvqbPVhaRl5vD/TMWcUTfrlw3digXhf4EPpFfubmMqZ+uZsyBvaiqdizbuJVh+3Wj/16dKCzezh9eWcD5I/pxaJ8uFIVv+ibPXkFFVQ0Denbi9EP24fUF66mqdnzv6L4Mufnl2vV/eftp/H5aPmZwy1kH0yHPJ0sfr9jMzS/MY1NpBbvl5jBiQHeuP20of565hMmzfTyPH7wXPx8zmFEDu9duDzU1jorqmtr1VFXXMDjq+SJGDujOs5ePrjN/4oxFvL90EzeecSDVNY67Xl7AIft24aYzD9ppPwLw+28fxvkj9qOorJIT73mTLdurALjy5MH8fMxgnprzFb8NF4c4fvBeTPrJCMyMorIKyiqqeeGT1Rw/eK/aJL68qpqht+x4jgfOO5LfvjSfi0b356pT/GBQ7GmtG7eWM291MaMH9WDWwg1cNunj2jZcN3YopeVV/O3d5VRUO2Yv3cQHBZv51pH7UlpevVPfL7jrLN5fuon3l23i3KP6Mn9tMUvWb+WbR/ShX4+O/OfztUyevYLbzj6YXp3b0yEvly/XbOHpOSvIzclhWL89GXvI3jw/dzUH7tMZM6vdLwHM/tUpPDxrKQfu05nzwpXzqmscUz5ayS0vzqN9uxxqHFwwsh+/OPUATrznTTZu9fuCy04YyLeH9eGg3l1q17dleyW75+WSl5sT+svXfPfP79V5P28680AuO2HQTvO2bK/kmmc+ZVtFNfd9/whez1/HtC8KOfPw3px9eO863+L/9cLhjG3kh/KWNnz4cObOnTvXOTc81c+tRF7ahJLtlezWLof27Zr+Sd05x5drttCvR0e6dKibcMWzcnMZHfJy6RkzQptu078s5KeTP6ZT+3bMvP6kOsluc9XUOF6eV0hVTQ1nHdabdrlN+9Jv1qINTP10Nf89cv96r9AUUVldw9RP17B7Xi7D9+/Gsx+uZMSA7owe1KPJbX9jwTqmfLSK8cf2Z+TAnes75/hidTEDe+5R+y3K5tIKnp+7iuH7d2NYv4bb6pzjtfnr2FZZnVRswMfXLP4pc9F+PXUeT76/gkP27cJLVx7fYPmmmr9mC799aT6H9vGJU1PX/9g7y1m8roSrThnCvg2MuAOs37KdyXO+4uj9u9Eux3jpi7VcMKIfh/Zp2ogwwPVTPmN9STm3jTuIwb0677Rs49Zy3l2ykZMO6EXXjn5b/3xVEa/MK+Q7R/WpUz6e4rJKJs9ZwZBeeySdVGyvrK5NcBOpqXGcev8slm4o5Yej+vG7cw6rt3wyJs1ewcMzlzL+2P5cGmeEuj7F2yq559UF7J6Xy/WnDW3UvvfjFZv592drOXd4X95evJGlG7Zy7akHNPitTKw1Rds47f63GDmwO/d978ja9zLi81VFLFm/lTMP610b50nvF7BwXQlXjBlSZ7Annnmri3n2w5WMO7x3nf1FYzjn2FxakXAfXF3jyM0xVhdt46R73qSy2vG/Fwxj3OH7xi2fLOccNzz3OXOWb+KOcw5L+C10IgsKt/Cn1xczamAPLhrdv1F1nnh3OQsKS7hodH+emrOCTu3bcf3YofV+oxvPmwvXc/mkj/nxcQP45RkNf/O5KymRb2FK5EUSW1u8jS4d8urci0Baj8iHzwP27tzkg6Nkj7KKKvLXljBsvz13+gZDWpf1Jdv5urSSofs0/EFS0iOdibyO5CJtTO+uTRvdkuxjZkmNVkt26bhbu0Z9cyXZrVfnDjudjikSTUM1IiIiIiJZSIm8iIiIiEgWUiIvIiIiIpKFlMiLiIiIiGQhJfIiIiIiIllIibyIiIiISBZSIi8iIiIikoXSmsibWV8ze9zM1phZ
uZkVmNlEM9OFcUVERERE6pG2G0KZ2SDgPaAXMBVYAIwArgZON7PjnHOb0tU+EREREZFMls4R+f/DJ/FXOefOcc790jl3MnA/MBS4I41tExERERHJaGlJ5MNo/FigAHgoZvGvgVLgQjPrlOKmiYiIiIhkhXSNyI8J0+nOuZroBc65EuBdoCMwKtUNExERERHJBulK5IeG6aIEyxeH6QEpaIuIiIiISNZJ149du4ZpcYLlkfl71rcSM/s4waIDk2mUiIiIiEi20HXkRURERESyULpG5CMj7l0TLI/ML6pvJc654fHmh5H6o5JrmoiIiIhI5ktXIr8wTBOdAz8kTBOdQ9+Q/vn5+QwfHjfPFxERERFpEfn5+QD90/Hc5pxL/ZP6y08uwV9+clD0lWvMrDOwFjCgl3OuNIn1Lwe6hPWnUuTc/AUpft5sp7glR3FLjuKWHMUtOYpbchS35ChuyWlu3PoDW5xzA1qmOY2XlhF559xSM5uOv5b8z4EHoxbfDnQC/pJMEh/Wn/JAwo4f3yY65UfiU9ySo7glR3FLjuKWHMUtOYpbchS35GRz3NJ1ag3Az4D3gD+Z2SlAPjASf435RcDNaWybiIiIiEhGS9tVa5xzS4GjgSfwCfx1wCDgAWCUc25TutomIiIiIpLp0jkij3NuJfCjdLZBRERERCQb6TryIiIiIiJZSIm8iIiIiEgWSsvlJ0VEREREpHk0Ii8iIiIikoWUyIuIiIiIZCEl8iIiIiIiWUiJvIiIiIhIFlIiLyIiIiKShZTIi4iIiIhkISXyIiIiIiJZSIl8CzCzvmb2uJmtMbNyMysws4lm1i3dbWspZtbDzC4xsxfMbImZbTOzYjN7x8x+YmZx+5KZHWtm08xsc6jzuZldY2a59TzXODObGda/1czmmNnFDbTvYjP7IJQvDvXHNfd17ypm9kMzc+FxSYIyuzwOZpZrZteG92VbeJ+mmdmxzX2NLcXMTgn9rjBsX2vM7FUzOzNOWfU3wMzOMrPpZrYqxGGZmU0xs9EJyreJuJnZuWb2oJm9bWZbwvY3uYE6GRmbVG67TYmbmQ0xsxvN7A0zW2lmFWa2zsymmtmYBp5nl8fAzHY3s9vNbKGZbTez9Wb2TzM7qPERaZxk+ltM/Udtx3FicIIyKYmBmXU3n9cU2I798ONm1rexr6exktxOc83nKG+Z2de2Y7/3rJkdkKBO6+hvzjk9mvEABgHrAAe8CNwFvBH+XwD0SHcbW+h1/jS8pjXAU8CdwONAUZj/HOEGY1F1vgVUAVuBx4B7QkwcMCXB81wRlm8EHgLuB1aGefcmqHNvWL4ylH8I2BTmXZHu2MVp734hbiWhjZekIw6AAVOi+uo94X3aGt63b2VArO6Oek1/BX4PPALMBe5Wf4vbvj9EvaZHwz7pOaACqAF+2FbjBnwanq8EyA9/T66nfEbGJtXbblPiBjwTln8J/AV/rHg+tMsBV6UrBkB74J1Q58OwrTwNVAKlwMh09reYumdH1XXA4HTFAOgBLAx1XsfvU14M/68DBqZ5O90jtMsBnwATQxsnAQXAuNbc31os8G31Abwa3qQrY+b/Mcx/ON1tbKHXeXLYseTEzN8H+Cq81u9Gze8CrAfKgaOj5ncA3gvlz4tZV39ge9iY+kfN7wYsCXVGx9Q5NsxfAnSLWdemsL7+zXntLRxHA2YAS8OOoE4in6o4AOeHOu8CHaLmHxPet/VA5zTG6tLQvieA3eIsz1N/qxOTfYBqoBDoFbNsTGj7srYatxCDIWE7PIn6E9KMjQ0p3nabGLfxwLA480/Ef5gsB3qnIwbAr0KdKUQdy/Af2CIfPnIaiseuiFtMvZ74bfgZYCaJE/mUxAD/gcwB98XMvyrMfyVd22ko/1Qoc3mC5Xkx/7eq/tZigW+LD/xovAOWx+n4nfGf1EqBTulu6y6Ow00hDg9GzftxmPdknPInh2WzYub/Jsy/PU6duOsD/h7m/yhOnYTrS2OsrsaPip4A
TCB+Ip+SOABvhflj4tRJuL4Uxal92DGuIE4S39i4tLX+BowMbZiaYPkWoERxc9BwQpqxsUnntttQ3BqoO52YQZ9UxQCfFK4I8wfEqZNwfamOG/ACPpHvQf2J/C6PAX60uwyfz8Qmqjn4EW9HC4/KNzZuwFFh+TNNWGer6m86R755Iuf7TXfO1UQvcM6V4D+5dQRGpbphKVYZplVR804O01filH8Lv2M41szaN7LOyzFlmlMnLcI5cXcBDzjn3qqn6C6Pg5l1wI9KlAFvN+F5UuVU/KjU80CN+XO+bzSzqy3+ed7qb95i/KjnCDPbK3qBmZ2AH2CYETVbcUssI2OTBdtufeIdKyA1MRgE9AMWOeeWN7JOypnZeOAc/OjypnrKpSoGo4DdgXdDXlMr5D2vhn/r/f3DLnRBmP7DzLqa//3Zr8zsskS/K6CV9Tcl8s0zNEwXJVi+OEzj/tCiNTCzdsBF4d/ojSJhbJxzVfhvMdoBAxtZZy3+242+ZtYxPHcnoA+wNSyPlTHxD3GahD8N6aYGiqciDoOAXPxpFrEH1UR1UumYMN2OP+fxJfyHoInAe2Y2y8x6RpVXfwOcc5uBG4G9gflm9lczu9PM/okfDX0NuDyqiuKWWKbGJtO33bjMbH/gFHwy9FbU/FTFIOOP1yFGD+BHn6c2UDxVMcj0uEWOFfvjT1mdhP8t1V+ARWb2kEX9ML019jcl8s3TNUyLEyyPzN8zBW1Jl7uAQ4FpzrlXo+YnE5vG1ukaM82G+N8GDAPGO+e2NVA2FXHI9Nj1CtMb8F8//hd+NPlwfEJ6Av68wwj1t8A5NxH4Dj7JvBT4JfA9/I+6nnDOrY8qrrgllqmxybp4hm8tnsKfMjfBOfd11OJUxSCj42b+ym9P4k9huaoRVRQ3L3Ks+CP+NKSD8MeKb+AT+58Bt0aVb3VxUyIvSTOzq4Dr8L/gvjDNzclYZjYSPwp/n3Pu/XS3J0tE9k1VwDedc+8457Y6574Avg2sAk5McJpNm2Zm/4O/Ss0T+JGkTsBwYBnwlJndnb7WSVsTRkMnAccBz+KvFiJ1XYv/QfClMR90pH6RY8UC4AfOuQXhWPE6cC7+N2m/MLPd0tbCXUyJfPPEjq7EiswvSkFbUsrMrsB/BTgf/2ONzTFFkolNY+sUx0wzNv7hlJq/479eu7WB4hGpiEOmxy7yvJ845wqiFzjnythxXuaIMFV/A8zsJPwlzv7lnPuFc26Zc67MOTcX/wFoNXCdmUVOB1HcEsvU2GRNPEMSPxn/jdA/8Zc+dTHFUhWDjI1buM75HcDfnHPTGlmtzcct5nn/7Zyrjl7gnPsMfwpcZ/xIPbTCuCmRb56FYZroHKchYZroHKmsZGbXAA8C8/BJfGGcYgljE5LbAfjR1mWNrNMbP7K4KiRyOOdK8YnJHmF5rEyI/x7413MQsD3q5h4O+HUo80iYNzH8n4o4LMVfpnBgeD8aUyeVIjFItJOLjFjtHlO+rfe3yM1M3oxdEF7HB/j9/rAwW3FLLFNjk+nbLgBmlgf8AzgPf+3sC+KdX5zCGGTy8fpg/GlHP4o+RoTjxImhzOIw75zwf6pikMlxgyYeK1pjf1Mi3zyRg+VYi7mzqZl1xn+VWAbMTnXDdhUzuxF/84RP8Un8+gRF3wjT0+MsOwF/NZ/3nHPljaxzRkyZ5tRJpXL8TSPiPT4JZd4J/0dOu9nlcXDObcdfC7sj/vzzxj5PqkRu7nFw7LYVHBqmkasBqL95kSuo9EywPDK/IkwVt8QyMjZZsO0STmOYgh+J/ztwYexoaYxUxGAp/mIDB5jZgEbWSZUCEh8nIgNlU8L/BZDSGMwGtgHHhbymVtg3jw3/1hk8SJHIVbgOjV0QfpsRSZgLoha1rv7W3OtXtvUHbeSGUOE13Rpe00dA9wbKdgE20LSbqQwgS280k2Q8JxD/OvIpiQONu8FFlzTGZ2po37Ux88fiz3v8Guiq/rZT+74f2lcI9IlZdkaI2zbC
Hafbctxo3A2hMjI26dx2GxG39sB/QplHacQNb1IVA1J8Q6imxK2eejNJfB35lMSAFN8Qqon9rRN+hL0CGBGz7Heh7hutub/tksC3pQf+x2TrwpvyIv521G+E/xcSDpjZ/gAuDq+pCj8iPyHOY3xMnXPYcXvzR4G7ibq9OWBxnufKsLwptze/LyyPvtXyxjAvJbd+TzKmE4iTyKcqDux8y+n88P7sstu8JxGfvuy4a/AM/J1wnwttq6TuTWXafH/Df8v6WmjLFvxVMP4A/AufxDvg6rYat/BanwiPV8JzL42ad2+c8hkXG1K87TYlbsDfwvINwO3EP1aclI4Y4D9kvBvqfIi/6trT+P1JKTAynf0twTpmkjiRT0kM8DemWhjqvI7Pc14M/68DBqV5Oz0Vn0yX40/nuhd/vfdI+4a05v7WYoFvyw9gP/zOay3+U+EK/PWuu6W7bS34GieEzljfY2acescB0/Cjp9uAL/C/zs+t57nOBmYBJaGzfwhc3ED7xodypaHeLGBcuuPWyJjWSeRTFQf8JQqvDe/LtvA+TQOOTXd8Qvt64n+PsSJsWxvxdz0ckaB8m+9vQB5wDf4r8S3hILMefy3+sW05bo3YjxVkS2xSue02JW7sSDzre0xIVwzwp0f8Bn8d73L8B44pwMGZ0N/irCMSzzqJfCpjAHTHX+Aisi9eCzwO9M2EuAFH4Ad6NoT2fQX8Gdg3ndtcKvqbhScSEREREZEsoh+7ioiIiIhkISXyIiIiIiJZSIm8iIiIiEgWUiIvIiIiIpKFlMiLiIiIiGQhJfIiIiIiIllIibyIiIiISBZSIi8iIiIikoWUyIuIiIiIZCEl8iIiIiIiWUiJvIiIiIhIFlIiLyIiIiKShZTIi4iIiIhkISXyIiIiIiJZSIm8iIiIiEgWUiIvIiIiIpKFlMiLiIiIiGSh/wcG3dgB9sXBtwAAAABJRU5ErkJggg==\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "image/png": {
       "height": 250,
       "width": 377
      },
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "%matplotlib inline\n",
    "%config InlineBackend.figure_format = 'retina'\n",
    "import matplotlib.pyplot as plt\n",
    "\n",
    "# Plot the training loss recorded during training\n",
    "plt.plot(mv_net.losses['train'], label='Training loss')\n",
    "plt.legend()\n",
    "_ = plt.ylim()  # assign to _ to suppress ylim's text output"
   ]
  },
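  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The raw loss curve is noisy; a simple moving average makes the downward trend easier to see without running more iterations. This is only a sketch, assuming `mv_net.losses['train']` is the per-batch loss list plotted above:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def moving_average(values, window=20):\n",
    "    # Smooth a curve by averaging each point over a sliding window\n",
    "    weights = np.ones(window) / window\n",
    "    return np.convolve(values, weights, mode='valid')\n",
    "\n",
    "plt.plot(moving_average(mv_net.losses['train']), label='Smoothed training loss')\n",
    "plt.legend()\n",
    "_ = plt.ylim()"
   ]
  },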
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Plot the test loss\n",
    "With more training iterations, the downward trend would become more pronounced."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 167,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAvIAAAH0CAYAAABfKsnMAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAAWJQAAFiUBSVIk8AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvOIA7rQAAIABJREFUeJzs3Xd8VeX9B/DPE/ZSRKxaat0o1VoE64BqHdVWa6mztm6Lbf1pXXXVjROtCxeKIoK42csBYUMII2ETCEkIJITsPW7WfX5/JDckN3ecc+9z9uf9evHK5d5zz/neM57zPc95zvMIKSWIiIiIiMhZEqwOgIiIiIiI9GMiT0RERETkQEzkiYiIiIgciIk8EREREZEDMZEnIiIiInIgJvJERERERA7ERJ6IiIiIyIGYyBMRERERORATeSIiIiIiB2IiT0RERETkQEzkiYiIiIgciIk8EREREZEDMZEnIiIiInIgJvJERERERA7ERJ6IiIiIyIGYyBMREREROVBXqwMwkxBiD4BDAGRbHAoRERERudtxACqllMcbtQBPJfIADunVq9eAIUOGDLA6ECIiIiJyr7S0NNTV1Rm6DK8l8tlDhgwZkJKSYnUcRERERORiw4cPR2pqaraRy2AbeSIiIiIiB2IiT0RERETkQEzkiYiIiIgciIk8EREREZEDMZEnIiIiInIgJvJERERERA7ERJ6IiIiIyIG81o88ERERuYDf70dpaSmqqqpQX18PKaXVIZGLCSHQo0cP9OvXDwMGDEBCgj3qwpVEIYS4TgjxrhBipRCiUgghhRCf65zH4UKIO4UQs4QQGUKIOiFEhRBilRBitBDCHmuMiIiILOX3+5GTk4OioiL4fD4m8WQ4KSV8Ph+KioqQk5MDv99vdUgA1NXIPwXgVwCqAeQCODWGeVwP4AMABwAsBbAPwJEArgEwEcDlQojrJY9WIiIiTystLUVtbS26du2Ko446Cn369LFNDSm5k9/vR01NDfLz81FbW4vS0lIMHDjQ6rCUtZF/EMBgAIcA+L8Y55EOYBSAn0kpb5JSPi6l/DtaLgpyAFyLlqSeiIiIPKyqqgoAcNRRR6Ffv35M4slwCQkJ6NevH4466igAB/dBqynZ86WUS6WUu+OpLZdSLpFSzpNS+oPezwfwYet/L4wjTCIiInKB+vp6AECfPn0sjoS8JrDPBfZBqznlErax9W+TpVEQERGR5QL1hqyJJ7MJIQDANs9l2P4IEEJ0BXBr639/sDIWIiIiIvKuQCJvF07ofvIVAKcD+E5K+aOWLwghUsJ8FMtDuEREREREtmPrGnkhxH0AHgKwE8AtFodDcbDLLSgiIiIit7BtjbwQ4t8A3gawA8AlUspSrd+VUg4PM88UAMPUREhaNDT5MXrKemSX1OCdv56JM39+mNUhERERkY2cddZZ2LlzJ6qrq60OxXFsWSMvhHgAwLsAtgG4qLXnGnKgKUnZWLm7GDmldbjho2SrwyEiInIFIYSuf5MnTzY0nurqagghcOWVVxq6HOrIdjXyQojH0NIufhOAS6WUxRaHRHHYlFve9rqhyR6joBERETnds88+2+m9cePGoaKiAvfffz/69+/f4bOhQ4eaFRqZyPREXgjRDcCJABqllJlBnz0N4HkAKQAu09OchoiIiMgrxowZ0+m9yZMno6KiAg888ACOO+4402Mi8ylpWiOEuEoIMVkIMRnAf1vfPi/wnhDi9XaTDwKQBmBx0DxuQ0sS3wxgJYD7hBBjgv7driJeIiIiIq8qKirCww8/jFNOOQU9e/bEYYcdht///vdYtmxZp2nr6urw+uuvY+jQoejfvz/69OmD448/Htdccw1WrFgBAHjvvffQr18/AMCCBQs6NOl5/fXXO81Tq+bmZrzzzjsYNmwY+vTpg759++Lcc8/FpEmTQk6/ePFiXH755Rg0aBB69OiBo48+GiNHjsSrr77a
Ybq8vDzcf//9GDx4MHr37o3DDjsMQ4YMwejRo5GTkxNzvFZQVSM/FMBtQe+d0PoPAPYCeDjKPI5v/dsFwANhplkOYHIM8ZFV2FkNERGRbaSnp+Piiy/G/v37cdFFF+GPf/wjKisrMXfuXFxyySWYOnUqbrzxxrbpb7jhBsybNw9nnnkmbr/9dvTo0QP79+/HihUrsGTJElxwwQU4++yz8fjjj2Ps2LE4+eSTO3x/xIgRMcXp9/tx7bXXYs6cOTj++OPxr3/9C83NzZg5cyZGjx6N5ORkfPTRR23Tz5gxA9dddx0OP/xwjBo1CkcddRSKi4uxY8cOTJgwAY899hgAoLKyEueccw7y8vJw2WWX4aqrrkJjYyP27t2L6dOn45ZbbsExxxwT49o1n5JEXko5BsAYjdNmA+jUm76eeRARERGRfjfeeCPy8/MxZ84cjBo1qu39kpISjBw5EnfddReuuOIK9O/fHwcOHMC8efNwwQUXYNmyZR0GQ5JSorS0pQX02WefjV/84hcYO3YsBg8eHLLZj16ffPIJ5syZgxEjRiAxMRG9evUCALzwwgsYMWIEPv74Y1x55ZVtvyGQ1CcnJ+Okk07qMK/i4oOPWy5YsAC5ubl46qmn8MILL3SYzufzoampKe7YzWS7h12JiIiI4nHcfxdYHYJm2a/80bRlrV69GikpKbj99ts7JPEAcPjhh+Ppp5/GzTffjLlz5+LWW29t+6xHjx6dRjQVQuDwww83LNZA85nXXnutLYkHgEMOOQQvvfQSrrrqKkycOLHD7xBCoGfPnp3mNXDgwE7vtZ9nQKjv2h0TeTKWvUYyJiIi8qw1a9YAaGkjH6rWfP/+/QCAtLQ0AMDRRx+Niy66CIsWLcLw4cNxzTXX4Pzzz8fZZ59teNK7ceNG9OzZE+edd16nzy6++OK2aQJuuukmLFy4EEOHDsUNN9yAiy66CCNHjsTRRx/d4buXXnopjjjiCDz99NNISkrC5ZdfjpEjR+KMM85AQoIte2WPiIk8GYtt5ImIiGyhpKQEQEvzkgULwt+1aD8w09y5c/Hyyy/jm2++wVNPPQUA6N27N/7617/itddew4ABA5TH6fP5UF9fj+OOO67TnQAA6NevH/r06YPy8oNdXN96663o27cvxo0bhwkTJmD8+PEAgHPPPRevvPIKfvvb3wJoqZ1fu3YtxowZg/nz57ethyOPPBL33XcfHnvsMXTp0kX5bzIKE3kiIiJyFTObqzjJoYceCqCl/fnf//53Td/p27cvXn75Zbz88svYu3cvli9fjk8++QSTJk1CXl4evv/+e+Vx9uzZE927d0dBQUHIz6uqqlBTU4NBgwZ1eP+aa67BNddcg6qqKiQnJ2Pu3LmYMGECrrjiCmzduhUnnNDSB8vxxx+PKVOmwO/3Y9u2bVi8eDHee+89PPnkk+jSpUvbg7FO4Lx7CERERESk27nnngsAWLlyZUzfP/bYY3Hrrbdi8eLFGDRoEBYuXIi6ujoAaKvFbm5uVhLrmWeeibq6Oqxdu7bTZ0uXLgUADBs2LOR3+/Xrh0svvRTvvvsuHnzwQdTW1mLRokWdpktISMAZZ5yBBx98EPPnzwcAzJ49W0n8ZmEiT0REROQBv/3tbzFs2DB8/vnn+Oqrr0JOk5qairKyMgDAgQMH2trLtxeoEe/evXtbAt+rVy/06tUL+/btUxJr4I7Bo48+ivr6+g7LDjTxGT16dNv7y5YtC3kREajV7927NwBgy5YtHXqxCTedU7BpDREREZEHCCEwbdo0XHLJJbjxxhvxxhtv4Ne//jX69euH3NxcbNy4ETt37sTWrVtx2GGHITMzE+effz7OPPNMnH766Rg0aBDKy8sxb948lJeX44knnkD37t3b5n/JJZdg/vz5uPbaa/HLX/4SXbt2xe9+97u2OwF63HnnnZg3bx7mz5+P008/HaNGjWrrRz4nJwd///vf8ec//7nD9NXV1RgxYgSOO+44JCQkYN26dVi5
ciUGDx6Mq6++GkBLm//nn38eI0eOxMknn4yBAwdi7969mDNnDrp06YKHH4427JG9MJEnIiIi8ogTTjgBGzduxNtvv41Zs2bhs88+g5QSRx99NE477TQ88sgjbf2wn3rqqXjmmWewbNkyJCYmoqSkBIcffjiGDBmCcePG4brrrusw7w8//BAPPPAAli1bhtmzZ8Pv96Nnz54xJfIJCQmYNWsW3nvvPUyZMgUffPABhBA47bTT8Mwzz3SojQeAZ599FvPmzUNqaioWLlyILl264Oc//znGjBmDe++9F3379gUAjBo1CkVFRVi5ciVmzpyJ6upqHH300fjTn/6Ehx56CGeddVaMa9YaQkrvdCsihEgZNmzYsJSUFKtD8Yx7vkjFgq0H2v7PB5CIiChegeYeQ4YMsTgS8iKt+9/w4cORmpqaKqUcblQsbCNPRERERORATOTJUJIdyRMREREZgok8EREREZEDMZEnIiIiInIgJvJkKIHOQysTERERUfyYyJOh2EbeG/aX1+Hq8atx48fJqKhrtDocIiIiT2AiT0Rxe+jbTdi4rxxJmSV45fudVodDRERkCLt1285EnojilpxV2vZ6yc4CCyMhIi8QoqXZpt/vtzgS8ppAIh/YB63GRJ6IiIgcpUePHgCAmpoaiyMhrwnsc4F90GpM5ImIiMhR+vXrBwDIz89HVVUV/H6/7Zo8kHtIKeH3+1FVVYX8/HwAB/dBq3W1OgAiInKGSl8j5mzKwxmDDsWvjulvdTjkYQMGDEBNTQ1qa2uRm5trdTjkMb1798aAAQOsDgMAE3kiItLo5QVp+Hp9DgAg9elLMaBPd4sjIqP4GpvRs1sXq8MIKyEhAccccwxKS0tRVVWF+vp61siToYQQ6NGjB/r164cBAwYgIcEejVqYyBMRkSaBJB4AZqTk4h8XnGBhNGSUV77fiYkrszD6/OPx+OVDrA4nrISEBAwcOBADBw60OhQiy9jjcoJcixUkRETO8uHyTDT5JSYsz2Itt4u9uXAXbpiwBptyyq0OheLARJ6IiIjIQzZkl+KdJRlYu6cU136QZHU4FAcm8kSklIA9+tYlIqLQNu47WAvf7OddFydjIk+Gssl4CUSkGI9tIiLrMZGnqHLLanHHp+vw8LTNaGjSN4oem1d6jwQ3uhfw2HYntoknchb2WkNR/eebzViXXQoAGHxkX/zzghMtjoiIiMwgJe++uBErXNyDNfIUVSCJB4Dvt+VbGAk5AdvIExERmYOJPBER6cZaWiLnYoWLezCRJ13YfJKIyL1YxhM5CxN5IiLSjQkfkXOxjbx7MJEnXXg7nYjIO5juEdkbE3nShbVwRATwot6tWMQTOQsTeSIiIiIiB2IiT4ZiDb73sKaWiIjIHEzkiYiIiIgciIk8GYq1s97DuzBEziV5ABM5ChN5IiIiComJPZG9MZEnQ/Ec4D28C0NERGQOJvJERERERA7ERJ6IiIgAsB95IqdhIk+66C3k2cyCiIjIXtjs1T2YyJOhWFgQETkXi3Aie2MiT7qwgp2IiIjIHpjIky6snSEici/eRfUGNnt1DybyRNSmur6J/UYTEbkci3n3YCJPFKOVu4vwduJuFFXVWx2KElOT92Locwtx66R1cSXzrOghIjJeY7MfX6/bh6/X7UNTs9/qcMgiXa0OgMiJ8srrcMsn6wAAm3PLMen2X1scUfyenr0NALBydzGSs0px3omHxzQfVvTYm98vsaekBicM7APB++uutSu/CrsLq3DpL45Ej65dYp4Pa27ta1bqfvx35lYAQEKCwF/OOsbiiMgKrJEnisGCLQfaXi/ZWWhhJMYorWmwOgQyyN+nrMclbyzHE7O2Wh0KGaS4uh5XvrsS//5yI8YvzdT1XclLccd4dMaWg6+nb4kwJbkZE3kiIo8oqqrHsl1FAICv1uVYHA0ZZcLyTDQ2tyTkby/ebXE0RGQkJvJkKNbueA8ba9hXI9vRegI3M5F3MJEnIiIiAGwTT+Q0
TOSJSCnmAUTuwbuqRPamJJEXQlwnhHhXCLFSCFEphJBCiM+tmg8ZSGd1jWBDC0fiyZuIiMj+VHU/+RSAXwGoBpAL4FSL50M2wYTQe3jpRkREZA5VTWseBDAYwCEA/s8G8yGjsN9pIsdSefiyD3oiIuspqZGXUi4NvI6ncFc1HzIQn4QiciwevkRE7sKHXYmISDfJqwIix+LR6x6q2sjbihAiJcxHbHMfL94pIXIsHr7eoHI783qNyN5YI0/6sFQnD5FS4rutBzA9JdcVgympPHzZ/NG+4tnOLOKJnMWVNfJSyuGh3m+tqR9mcjhE5FDLdhXh7i9SAQD1Tc246ZxjLY6IiCh+vAx3D9bIE5FSbqqp/e/MLW2vn5y1zcJI1HDRpqEIuJ0pGt54cQ8m8mQo3qYlIiIiMgYTeSJSir2ZEDkXB/EjchbTE3khRDchxKlCiBPNXjbFj0U8EblNZlE1bp64Fk/O2gq/n6UcETmHkoddhRBXAbiq9b9Htf49TwgxufV1sZTy4dbXgwCkAdgL4Lg45kMOwLaa3uOmNvLkDf/8bAMyi2qwKgMYekx/XH/WMVaHRESkiapea4YCuC3ovRNa/wEtSbuWBFzVfCiK2oYmvLskA90SBO65+CT06NpF0/f0pmhsZUHkTm66XMssqml7vSy9iIm8i5XVNKB/726scCDXUNK0Rko5RkopIvw7rt202cHvxTIfis/7SzPwwbJMvLMkA5NXZ2v+HvNyb+AFGEXDXcSd3Hzsv7N4N858YRH+8Vm4MSPJDXLLajEjJReVvkarQzEFH3b1qPeXZra9fmfxbgsjISIiu3JTYv/monQAQGJaATIKqyyOhozQ1OzH9R+uwUPTNuPxGVutDscUTOSJiIjIU6p8TVaHEJKvsRn3frURt05ah7zyOqvDcZyNOeU4UOEDACzYesDiaMzBRJ6IiHRzbQtjG9dAl9U0YE1mCZrZs45rjV+agXmb87AivQiPTN9sdTiO46Y7SFqpetiViIiIDFLf1IxL31qB4up6/POCE/DEFUMMWY4H8yBbaV+LvDqjxMJIyClYI09E5HCFlT58nrwX+6PcihfurUdXx6ar6Put+SiurgcAfLQiK+K0Nv0JZCNerLl2K9bIE2tgiBzurs9TkLqvHCf/pC8WPnhB2K71OGqnc9U3NWuelluZyDtYI0+G4gnFWlJKrNtT2laTR+6Uuq8cALC7sBpV9fZ8iI/iY9bdFBlUVcuLPyJ7YyJPvA3rYu8vzcBfJqzBRa8vQw0TPE+IdMucTWs0cEHeyq3sXGYNVMXxsNyDiTzpOm+xXZ2zvL6wpd/kKl8TpqzJtjQWsh5rV4kI4LncTZjIk6F40W8fvgbtbWzJHZIyivF24m4UVvqsDsU5WGgRkYMwkXcJv19iyc4CrEgv6tTGMRo95y29t+N40W9PqzOKcfcXKVi6szDk59xuzldY5cONE9fircR0PDStpT9qNq1xH19jM/7zzSbcOWUDChRcsPHYt1a4I7TK14jxyzIwZ9N+U+Mh+2OvNS6xcEcB7vo8BQDw5Z3nYMRJAzV/t33BXV3fhNkb9+P0QYdi6DH9O0/LUt4Vbpq4FgDw3dZ8ZLx0Obp24TW9a7Qeo4t2FLS9tXJ3sUXBkNE+WJaJmRtbkrtmvx+f3nG20vmzzLeHNxamY3JSNgDg6EN74ezjB1gbENkGz94uEUjiAeBf7V7r9er3O/HU7G246v3VKKpiTyde0NDs7/Qe623Js+yauIY5KOdtzmt7vXRXkUnBkNkCSTwATFieaV0gZDtM5N0ojhPR1OS9ba+/WrdPQTBkF7oeajYsCiIi75BSYm1WCRJ3FKDZz5KV1GMiT+QgxdX1GJeYHrZteyx469x+tu2vwJi525G6r0z3dwM90xjdHt613deZ+Lsenb4ZI8YuxuK0gugTm4TlgVqbcytww0fJuPOzDZi7WU37dm4iao+JPIXFAt1+npi5FeMSd+OOyeuxr6TWsOW4NUdz
iivfXYXJSdm4ZnxSzLV4Rnc1yfIhPmsyS/DthlzkVfgwesqGqNPzmFTLrN334dYHzQHgwW82R5iSKDZM5IkcZGG7Bxhn6+y9QE/ixRzNPuqb2G2oG2UVV1sdgiYsC+Lj13khbtadLreOKeHaO4URMJEnInKRwAUbm9bEyJ35DQVx6+7rdV68U8hEngzlxYOKyK5cm3x7meptyjJbKbuuTo4p4R5M5Ml2Gpr8eOjbzbh10jrklIZvB75gywE8OWsr9hTXmBidc7n1VqqR3Hay44W1fXBbUChayhy9gz6SuzGRdyMDcw8zksFPV+/BjNRcrEgvwoPfbAo5TW5ZLe75MhVfrN2H0ZPXGx4TaeemWl9e/HiQC/ZfNx2DRnHDkR3PdmbZ5h5M5N3IRsdnLAXND9vz215v2Bu6+72kzJK211lBNfK+xmZU1DbqXzCRyzHBsw+920JYtPFY+xsfI7ZaYF/gpunMi2UcE3nSRW9TA7MLmvwKH84duxhnv5yI9dml5i7coVRvIp5c7CHUCY3bRgOPryOv1NSale8ZsTatuLjaXVCFugb796DlxTKOibwbKSqhQhXotinkw4Tx1OytKK9tRH2THzd9vNbcmIhsIHBoaDmhsbbVOqpWvQcrIF1NT41yPLXPeirlJizPxKVvrcDFbyxDQ5M/9oWSIZjIk6u0f/C1oblzgeOmxEVvGa6rH/k41pObbm3a4WFXO8RA1gu3F7inRDOX19ebnkq5sd/vBAAcqPBhRmquUSFRjJjIu5HXS6gwHp+5BeeNXYIf27XBdzJuZuPZ4Q5UrDFouaCK57rWtZcXrv1h2rioroMMUOXj82d2w0SeXCXcOShlbxm+WpeD/Eof/jU1xdSYnMiqB+sofpHupqi8MGG+Fx9Vh5iVR2ptQxPeWbwbE1dmoSnEHVAis3nx1NXV6gDIADp3ZC/UwOwtYV/zeripCRKRLibt+k45xCKF+eGyTLyzJAMA0K9nV9zw65+bE5QCTs73rNx1nLLfeglr5IkcSncbeUOiIKMZ2Uae+4Rz2LGmMZDEA8Cbi9ItjISohRcvNJjIky1PEE5z26R1aHTRrWU2rWnhxAdNI53HVJ7knLdmSItYdxGnJVAOC5diVFRVb3UIhmMiTwYXwPYoLo3OS5enF2Hy6mxjFxLEyDXLpjX2Ea1dO7cV2YHT9sJrxidhc0651WE4jtO284hXFiOjsNrqMAzFRJ48wYxcZ2NO6FFo7SLcOmAiGJ4deq3RK7A5tVy7xrPtI9f8S0xatQfPzNmGwipfzMuwwoKtB5BeUGV1GJrx7lns/vZxstUhWCbWQ9/up4vgw6GxWeKhbzdZE4xJmMhT2NrqUAes3Q9iL7Hr6dvKvKK6vgk3TFiDy95ajoxC5yRjkahs3mPW4btidzGen78Dn63Zi8dnbDVpqercMGGN1SHYhpvL/FobjlTKCzP1Cl3evIaJPBlcUBtTKDmxptQrrDzxv7UoHWv3lCK9oBr/NKCb0UembVY+z3gFr2+zjo1IR/a3G3LaXi/eWWjI8n2NzfD7jfmtZbX26yvbrPwu1rs0bk743cit1wte3A+ZyJMu+g9+exxVbi209PDCxc+azJK211lF8Xc5GlwbPi0lF1lF7mlv6dQ9YkN2Kc55eTEueXM5KjlATSes1bUPtzVd9MJ5xGmYyJOuJNcuZZIdexOxy7pxksZmv+NOdHasre3ABqvT6KPzho+SUVHXiD3FNXj9x10GL815nHZMhSOlxM78StQ32a8JjFG0PdsS/3JcsosQmMgTnHlAh60VcOBvsZrqVab1wnBTTjnOG7sYv3tzOSrq7JkcO6H2KTjCwP9DbQezEjyja4Sb2zWpcdKDqfGwY+VFR+r3rVd+2Ik/jFuJP727yrBmVEbj3RFzeXF1M5EnaieQ6MSS8DilALG6tu6TVXvw5/dX46r3V6O4ugGZRTX43w87lczbKdvADFo2sxMv4im6eJLHTruExn3EiH1pwvIsAEB6
QTWSs0qiTO09VhR3RmznhiY/NueUO/ZizWpdrQ6AOqqub8LoyetRVtuA8TcNx0k/6at7Hm7LZcbM3Y6t+yvw3KjTcPqgQw1bztQ12Xjtx104ZkBvlNU04PRBh+LDm4cjIUHbGmVS1CLSeiis9OGF+Ts6vb/jQKWuZewtqcFTs7dhUP9eeOnqX6JL6zbiNvDeOvDa7/Uqn0Ob1xhZceK0Xd/X2IzSmgb8tH+vDu//9aM1SN1XjuuG/wyvX/+ruJbhxfKANfI28+bCg71u/N/nsfW64bb9eHJSNlL2lhneJdzTc7aj0teE7XmVyKvwYeGOAszbkmfoMk0Vth9580LYX16nZD53f5GKlbuL8fX6HHy5dq+SeYZiRXMGvSf+cNOHbloT9H+DSguVa63KpQ+zuq2cjuf3HKiow4yUXNs2saP4Vfka8ZtXl2Dkq0swZ9P+tvdzSmuRuq9lYK7pKblWhedoTORtJimzuO31botHIzP7RLNxXxnu/iIFG/eFHm2vJo4+f2NNyHbmh25/GypJYrOOFpHWg6r2otvzDtbgJ6Yd7N6Q28DE7icjrOzgj/IrfDHdNp+4MgtDn1+E0ZPX6/6uVzh9l/f7Jf4yYQ0emrY54sA9XqxpjcZJ2/69pRkorm6AlMD9Xx/czg3Nfgujcgcm8mQbV49Pwndb8w2Zt+rkxsknFSfH7kV69107bt9zxy7G1eNX677b8OKCNDT7JRbvLMTW3AqDoqP2zN5/dhdWI6e05U5d+4tyt9BbeaFlcisP8VibChWZNCiTFytzmMi7kKr92InHgw1zGPszcaU5bZ9yYq81ur4bx5f1ntA351ZgVUZx9AnDKK9riPm7dqT3WLAqQdF6DMQ8kJTNjrFt+ytwoEJNE0DA+s4FyP2YyFugtKYBbyfuRuKOAkPmr6rYUDEfL5dhOaW1+GBZJjIsbiIV4NZNYXUNTLNfIk9R238g/mPGDts53CaprGsyNQ6VymsbkFtWq2x+dthOKpXVNmJq8l7Tal6NMHvjflz57iqc/+pSZc/zuE3M5VOY7zmtcseOmMhb4OnZ2/BWYjru/GwDMl00SqSZwhUm4QoFKx5avP3TdXj1h524/sOkDv1eq2J1AhsLI2I28mIx2n7j90tc+e4+6LV9AAAgAElEQVQqjHhlCT5cnmlcIDEwep+3S//YZiTEuWW1OHfsYpz/v6VYtst9zT9UeXr2Nvz7y1Rd37FTZc8D37S03W7ySzw/b7slMdjksApL9eZSPj8b7U9mYSJvgQVbD7S9NuIpbTPLge+2HsCdUzYgKcwtc7sXSnr4GvU9bJtZVAOgpaaqvFZ9swBVBZaZt7b1JpdNzX68sXAXnpq9FWU16tahqtvdS3cVIq2168xXvlfTF75enXqiCYyF4LI6XysHRHpy1jb4Gv2QErj9U3s9eBvywfs45hfvfrN2T2nEz42o1DBClc+5d4+M9OaidPzlwzXYnhf+mRW/X2Jfibq7VxQZE3mLBZ+E7VLLFU77eGsbmnD3F6lITCvAjRPXWhdUO0adIj5ekYVfjvkR/4mxVwU7nLoOJnjO8c2GHLy7JAOfJ+/DiwvSQk7T/pDRcviMmbsd57y8uEMXaLEyo7s8lTVMZtVW6S3HqnyNeH9pBmam5oa9yAqXYJpRYhZU+kxYSmxCrS67HuPfbsjBr55biIenbdb9XbNrWr1Ys6vVuuxS3BzhnH/bp+twwWtLMWZu9Lsaqo9fm6dQhmAibzNOejCmVEMNqYN+TkQvfZeGxmaJman7sbekxupwlFK9jVTWnH6WdLCP+Bmp8d+92pVfhclJ2Sisqsf9X2+K2iWiFbXa8S4xsD2trMHWu+Q3F6XjtR934T/fbsaaTH0jeLqkiPGER6dvQXV9E6an5Eas0SX7K6sNXYmRU1qLlbtb7tBPTso2MSLvYiJP4bklC1cs3AN7ZtcEOLHmweiYo+2yeUG9UZzx3EL87wdrmsSoEs/Fhl0O8U9XZ7e9Hr8s9LMG
Vl6Y2JnqpjXBu5NR+0ihzodinVjexcLp+3m9Q0fgdTIm8jZj96Y17TkpVjPYJSlSTUqJp2ZvxbUfJGHb/ui1aCprsY2uEa+ub8L4ZZmorg9zcebwk6rXxTIIlRu48Ve7tXyNhZ3v3IcLzayIbbxqDNPV6gDIxkIk6jsOVIaY0L3snMbFWmAFnwSizeaHbfn4PHkfAODGj5OxZczvY1uwCWK9tqxvbEbfHvqLQztey0Z+ViN42x/8f11DM95ZshsCwH2XnIye3brEHoTF66WuoRl//TgZBRU+/PXsY1BT34RbzzsOxwzobW1g7Xkx42gnsIuofQZE2q6CydtbmczARN5iEhI19U3oE0MSYTgbn2j0RhZr2R5uOSFvZ+tcRkVtI3p174LuXc25MRbr5mzfC0Wlhp4cWIsdH70XWqoO0w+WZ+KD1mYtvbp1wb2XnBzzvKzeB95dshubc8oBAOMSdwMAkjJLsOC+860MKy56EtRIU27JLcezc7fjl4MOxXOjTrMs8VUzTonEmqwSdE1IwNjv01Be24gJtwzH4CP7KYjPvuc/rex0YRO2a2id8WUWVaN/r244vG+PMPPTGZgLsGmNxSYsz8KZLyxS0oMGhb/lqPqaJGRPEZFqQoM+W5FehF+/nIgRryyJuVtFIwsso67hwsXsa/Tj9R934X8/7ERdg742lh4styOKNQF5Z/Huttdvt3vtRFtyOzcB254X+91EQ5IhizKOGyYkY+O+cny2Zi8WhRiU0Oz0NZ7VMHdzHm78eC3+MmENNu4rx57iGtw5ZYO64BSwqnyakpSNs15MxLjEdEOXo7X5Wjz7VZWvEd+s34e3FqXjkjeW47xXliC/wr49SZlNSSIvhLhOCPGuEGKlEKJSCCGFEJ/HOK+fCSEmCSHyhBD1QohsIcQ4IcRhKmK1o4YmP+7/Ony3hnYVSwGVWVSNe75MxUcr4hs8R2XhWFbTEPKEpno57d06aR0amvworq63rP/xgGjtLVXmG+FqatMOVOK9pRkYvywTE+LcN+zG7PasbYsLsao79zkfZh5KIzqo0teIj1dkYXl6UdhpdO9vzq84NVVdu/EwUvaVRZ0+ltW7Jbccj03fglW7Q48vAqhpWhPqvLmv1Lj+y6WUaGjyGzZ/QF15++zc7SipacC4xN1hnwGK13++3YQzX1iE2RuNrYh8aUEaHpuxta2CoaHJj+csGrDLjlTVyD8F4N8AhgKIeYsKIU4EkALgDgDrALwFIAvA/QDWCCEOjz9Ue2tsNraQ0Gviyiyl8xs9eT0WbDmAl7/biQ3ZkQcOiSR8k5fQpWC4wtHvl7jmgyT84zPranFyFA77roWW86aVtycDTSHcYNbGXAx7YREen7nF6lBMFW7/GftdGl76Lg23TVqHLJ2jWuvZJ81uFpFTWosv1+5DSbWOnlh0ZrCGjFpt0Goa9d5qfLMhBzd/slb3QHp21dDkx5/fX43hLy6KeCEazMg9UeudotoGYxL5man7UVHX2DYirlG+Xp/T6b0yAwZZdCpVifyDAAYDOATA/8Uxn/EAfgLgPinlVVLK/0opL0ZLQn8KgJfijtTG9hTXIKNQ38lNhXAnvczimrCD8MQqu91ob3oKQ6Ok5VdiT7G+fuH3lztzxDo9JxSzm9a41YPfbEZZbSO+Wpejud/suPuRj/P7KoTbzO2fsfhszd4wU9lPpN3W75e4cWIynpi1FQ9+q3+gI7cr1nNxE4HV+/WUpGxsya1Ala8Jt01aZ3E0RAcpecJSSrk08DrWtoSttfGXAcgG8H7Qx88C+CeAW4QQD0kp3TUiT6sHvt6oZD6q2nPuiNKmNN7FGJEs6m3G4I/hBshdn6fq/5KNmXmC1LvP6N1HotZOapyfr7FZ+WieBZU+nPbTQ3V/z8imOfHM2TYXZSH7UTcvuKziauSUtoxPsMIGlRPxsnEfB5bK1HkHyavsuPvYpagyip26Srmo9e9CKWWH9EpKWSWEWI2WRP9cAIvNDs4M
m0M8oBULO/cxS87QPkmzugcSs/kam3Hha8uQrziRN0rndu/hj3+tJYOjypAYQy2prsc/p6agyarmjLa5EnKGeNZWdX0TeuvsTlXVIaA3bi3TF+kcTMuW2XUrM44CG/98JeyUyJ/S+jfcI9a70ZLID0aURF4IkRLmo1NjC82bmg0eTMWIdqx26WormB26MgucmLScoJyUx6ne5l+u3ac5iddzkeOkdapFpN9j08OwgzHzdiBlb/SHPQ2jt428AetUSwR2uaiLNYpFOwpw71epGNS/l9J4tDJi7e3MrzJgrtZQtX6W7izEh8sz7TVWhEnslMgH7jmHq5YOvN/fhFg8x++XWLD1QIf3VDz9b/YpINRJZ2tuBR6eZt+2q1afJ1UvP1LCYUTtvupEo7yuUen89Ir35+h7FsKYnU/LdrY62U8M01OV09j9zlG0C22jKzkCHRlkFrmyRa5jGH1BeMfk9QA6jnsS4IB6hbjYKZFXRko5PNT7rTX1w0wOx3Sx1FAuSivAvV+paaNvN1ePX42mMHcXrE4m3MjpNbVGXVkZ9ds7jdYaJvymZr/mk6kZPRvpf/5BLTvcJVOhQvGFp+r1Em6fs+vdUzIXm9bEz06JfKDGPdzTYIH3y02IxdFiufJ9OIbeFuKtXTWrJjpcEq+V1gEvIvkieR8O6dUNN53zc/QMaqtp1vks3Anab/UtgThpSQj8fonn5+/A5KTsqNP6DOonOtbVHOvWCV4rm3PLcVjv7jHOzXpa1kNpTQOen7cdqzLC918eC7fmnHZpNqNX2oFKvLFwl9VhUAjhdqlYLtymJGXjY8VdYLuRnRL5wFE5OMzngfHCjR2mjJQy6vxn5vknMU3bLfhI5VRgIAspJe48/4QOn1l9Lp2Rmou7Lzwp7OdOSmLCxTpn835NSfw9X6ZiwZYDUaezt7CjLGieMl5a9hkj9qsX5+/A7E156mfsNDY6Zo2oeb9p4lqUxjgitu0ZsL5CHefmD1Snf3nPzuWgT1qo6kdehUAXlpcJITrEJYToB2AkgFoAyWYHRrGLdOg6oS5IAvjn1HDPTgdNq+EHqeyXP9YTZHDN/NKdhZ2miWdgMiGANZkluOfL1E7zNnrQznDbYMnO6N0C7sqvMjSJ17q59DZt0HN+1N5rja4QOjEiSQ83y/bra6aOESatvoDWS9cdUIf9Nr3MSOIl1NyN9RonVfy4hemJvBCimxDi1NZ+49tIKTMBLARwHIB7gr72HIA+AKa6tQ95lcxqexi8mLoGd4zg194HyzKtDsFwoRKaL9bui2uef/s4GQu2HMAdk9d3uCiwcyEfy0iBwb/nirdX4t6vNobs8cmsxFFXYs88JSaTVu2xZPA+M3TqztSaMDoxuwZ53Z5S/Or5hfh2Q+dRRSk8O5YpNj7tKKEkkRdCXCWEmCyEmAzgv61vnxd4TwjxervJBwFIQ+guJO8GUAjgHSHEbCHEWCHEErSMHJsO4EkV8ZIxPlphfVs2G5YhhjHyxLYp5+CjKPEWgjX17YcH1zc33X0wh/mCWQX5jgOVmLc5D9MsPPmHbVjj9rOZRlJKTFyZhfo4noV4fv4OXPdhUlx3rgyl9e6PDQpMO8QQTpWvCY9O3wLAvOPHrMPU7IeNrdzMNt7FlFBVIz8UwG2t/37f+t4J7d67TstMWmvlzwIwGcA5AB4CcCKAtwGcK6UsURQvGeCtRH2PL9i5AHcjq9Z3PMtVFXK0+fy4vQBzN6trW71BYf/k0dZf+MQ9+ok6tyy+LmZ35FViavJeVNQG95xi36uGxLRCJU3cymsbW0f7tOYOaEQsW1FR14jn5qlrY+3k81Wo2O3yoHOs1xMVdU1YkV6EKl8j5mzS3qTOjZQ87CqlHANgjMZpsxGh5JNS5gC4Q0VcbiKlNPQKOpZD2r6nam+IZX8oq2nAhyvc1VxIxX74xKytCuZyUFwXL/H2I9828Ff05j1PzNqGOfeMjGk51fVNuOr91Who9iN1bxneumFo22d2
rf1fk1kSU28n4X5Py/q0R0KkSrgHI/0S6JKgbsMG5mTEvvLF2r34YVs+Vu5W23uRU83cGLlDAzOo3sxpBypx66R1lizbbuz0sCuFIKXE/V9vxFkvJuKHbfbqTWNenA8GGjKyq/I5ustz87ZjwnJ9TaBUnmitSvC4X4SWX1EX83d/2JaPhtamJbN0PGQajZEVFn/7ONlVo2KGFbQKk7NC38z+RkMTsIq6Rvxh3Er85tUlSDtQqSK6DoyoGH5y1jblSbxdL05DWRc0KNL/ftB28aqilj7ceZ1Na4zDRN7m1mSWYM6mPJTUNOCuz1M1f+/NRem4bdI67MzXVvDqLaO25Jbjhfk7dH7LeG4/YONlRdd83CbGCj75Skgs2HIAj7S27Q2wWx6iexyKiLXiVrJuzdY3aetgIDBK95bcjsOwVPmaQk3ewdjvdmJXQRUOVPgwunX0TD3stt/FKuZxIBTsn+GajoR6qL64uh5/mbBG9zK25lbg/P8txfUfJsHXaH3HFXZp+uMETORtbm+p/jasFXWNeGfxbixPL8LNE9caEBXw2Zq9hsyX3Kd9gay7+0mHFuZWjxp6z5fhLvq1jQILhE4S2otUc65lO1u9jlRpiPGh2Xh/vd8v8ad3V+la2Kj3VutezsZ9B5/3yKvw6f6+Wc+5GGlHXiW+Xm/NA+zJWSW4/+tNmqf/cXt+TMu5ddJa5JbVYX12Gd5fmhHTPID4B4o0gv0iUouJvMNkF+vrfbO42tpBMyImYu44j3tOUhyjZhrdU4IrC2ydx0ks1z7BNbUAsCkn8gO7dlnXVjZ52FtSiyveWWnJstdllyK9QH0XmMFltpOalBjltk/Dt8VW3XNR8Pp+b0nsSbUeZe0eWN+4r3N5oJUdL9DtF5FaTOQd5uI3liFHZy396MnrI/ZO4cRKTzsWFm4VnHzrvRsTz5ZSlfhbkYyo7AUnkuD1q2eI9MCk13/Y+VZ8POVCPOtb/12b2JcVr0embQ77WXF1fcSKDN1dqwb9v8605g/GHDx69xErryeKqupDvv/4zC04/dkfMXFl+OeOnHghZOb51Y41+E7DRN7mgndxv2zpw1iPxTsL8Z9vw59w6pv8qKqP3laS1LCquUi45UaKxtfYHPf4APH14BL9y1pOlEat8nAXGukFVbjvq40hP6tvarZNk6FAGLH0qR5vghLuBG6PNdNRuFjDlZvvL83AWS8m4uZP1tpiW8eTmMW/ndWwfi12lFtWi6/W5aC+yR+xO9Pgzf/thhz8a+oGbM6JvdY7EhWJsZm7bNiHYxXG4PZLBSbyDhTLgyjBT7HHS+tBFqlGNa6aWkWHphW1JUOfX9Th/4GCrKK2EfkxtEE10pSk7E7vRUsKzF6lgX2xsdmPXQX26JFk2/7QD5mvzSrB2S8txqVvrQgaKEudSNun83EbYdooy4m0nb1ey/bajy29hKzOKMHW/RUhpzEzMY3r7oq6MFylsi624/fR6Vvw4/YCXDU+9PMKdljfRiTyNriedS0m8mSoSLVR8dRUmXEVb5SKuuDBc4Cc0lqcMzYRI19dglU6uk3LKqrGbgOT17Hf74x7Hkbfpl28sxD/+WYTTnvmR/gaQ9cs2+X29g0fJaOirhEZhdV4e/HukNOE2ofzK3xheyhxwj6vR7hN5dSLAy09wwDAtA05GPnKErydGHq/MGsfdtv+ZFd2Xs9mNK1JL6iCr7HZsce1nSgZEIrMZecCwGrByY7qOxFa6blIERB4fObWtiT05k/WIvuVP0b93sZ95W01f1/+4xyMOHFg5JgUxBqIV9dy2r1hVDODmQr7MTdK8E/XegE2e9N+jJm3HUf07YFlj1yI3t0jF9th28jrmDbc9B0+j9RrjQHn5ilJ2ViWXojfDTlS/cwNpHWXD3QX+lZiOm4571gM6NM95mU6OTn6cu0+9OnRBdU2b+4ZvI9PXJmF20Ych25drK0ftbrSQuv+ftlbK9CjawJ+O/gIYwPyANbIe0is3aSFYsfC
oqnZj4LKjg8lxdKfrtkkJA4EDczzwNeh21e3l5hW0Pb69k/19+8cIITQlWDzQWM1krNK0aShx4tn526HlEBhVb3uwbyCdXowVse0wcwuAn7Yno8Jy7NCPphrZ8/M3YaCys5N5iKtv/La+HobU3mMml3WPzFrK+7/ehOenLXN3AXH6cUFafhWwwBbTmBWZWF9kx8LdxSE/IxnGe2YyHvI6WN+xH++1d4fbSR+FSPAxTGLp2Z3LuTDNVVwotmb8rA9L3Tb2lDiuUir9jXF1L90OMHnfRbILYITorrGZjwUodeTUApD9Z5h0QqOlOCZmfvZff/KKqrpNDgXEN9FlJnsUrvvhDvRrypoimh0F70B7BnaPZjI25zKY7qhyY+ZqfuxR2df9KHMTLWuKcOOvM4PEv6w7QDeNam/XS1UnHTMGgNgV0FV2Afy7GhbDLHaIRUJtU/MCTHSbuTa1Og7lp59L56mTkYneCvSiwydv5n0/pZ4m6AFbxs7JmaBGJOzSiyORB07rGclR6UBP8Ss7ni9iIm8A8V72zTe27ZWC/Ww6F2fhxvJ0hlC9vHthCooDdr/DL2/KNT0D+usxXYTp+wSWmoVo01y66Twg/B0mI+mqbwl1DnCqvIk3HYOxBip60YyTqTjL54cI9Rupnfsm5b5OKSwswEm8mQZKw/T5+ftwJbcctvcNg5F9fpxS7kYS5/nKrU/AerZe1TcXQu1DYNPupFOwsF9V8e1S8T5e1Ttj2bs1ma3Ew+9bqwpqxJssp0dQcPxqZcVz6Op3maVvs6Vb6QOE3nypEmr92DUe6sNe3DTS+euaNz2cGysJzmt34vcdjX2pjVzN+d1aqttx/7FrX6Q3g78cR4yoSooYt3WRrXZtnMlipU6PWNkUPFpVBt5Hr/mYyLvQE6q4TAiVLclhqp8tibblrcjbRiSY8WT4H2yak+n9+JqI88TtmHiKeMKK334cHlmx/lJ7WWx6uPVrfuJ1t/ltQsWlvfmYyJPluEBr9Yzc7bjx+35VocRke5tHupWtcU7jt0Sk+DVoWftGLUmtawju61HOwm1i2tdXw9807lnMjtWftQ1Nuscpdx+vyGYmRG+OH9H5+VLqagJn7pfknagUvOgaB1iUBaB+3FAKAeyOgEuq1HzsKwdTy6q6B9kKdRM9C/3g+VZ+MPpR4eOyaL1HetSfY3NyArRw5LVe42Vx5+Kbl9VMaqmceXuYkwKcffAS7Rs5nBrPykzdC8wVl8AB/vHZxtwWO9uVodhOKPayE808BhRuadc/vZKhXOjUFgjb3N2vC330nfaexkwJHp7nY8cY1GYgTf0ipYPBLepjTWBGL8ss9N7j07fjL0l+ntAMKtvZhUira2Zqfvx6PTIvfboWt9xHEsJBp09mv0Sz4eobXSbuJ+F0Ls8ndOboazWuQ9Bhn4OQf1a1tedrJqcwWbXfBQFE3kPivcYnZ6Sa9qyjGbYg0TGzDYuwaPeGiXyyUz7mtkRYkCsbzdo3/fMYMX1wbcbcrGpXe8zVu1rkRIGPcmE3WqKQzF7M980cS2W7iw0eaktgi8iHHQN7DhvLtyF4uqO5XK867tew2jR0djhiHRAsWAbTOQdyC1NUnigRmbX7az3RMPtrN7+srqwnzmljXzAgq0H4lqWEy4EQom0jsprG3HH5PUdpw/+vs7lxd5rTWzfoxaRLmrfWZKBp0OMUt7h+zrWvwTw+Zq92r8QdkbOPKa8iom8za3dU2p1CGQwp5WZTovXa97XMcJxpCQ4ehMqzYuJ6N9fblQzI48p1HGHTUogo7DawGiiUzGiuBNFq5D5flvkDgpKdIzwLaXEroIqjXEZw07P8XgFE3mbm5Fqr6YEbtOo4DZkKCrKsljmYYfKM1Xt0b16PtBSw9x+FQdPv1hHkwypcXl25qTnH9qLd7U/OmNL9InamZKUrWm64LjibXMd+P6dU9ZHmTIyJ+ymKkIMXt87DlQqmKs+8fyOf01NURYHacNEnhxHZXl+9fgk
hXOLjRDeuX1t1cnYI6s3JjmloZvpxLNPavmqqm1iyoWIww9QOwzwk1nkzRp5M+nZzpE2a6yHVH1TM/aXh2/2p4ddm5baERN5B3JCzYQWk5Oycfun65Qd+E7llu0Zjtt+n8Nzug5aBgoKvYH09k7U8TMNy44+CdlApE2pbTur2dJ2O+7sFg8AVNRZ2wuQ35gb3BQFE3kHWrunFA99uxlpFtxyU23ZriI8/G3k7vSCOSExdENtQkOTmlLZDeuivfb7nx26h41n7dY1Nsf8G6z/5c5mZiL46eo9hizPzH3AEeW+xTG+oKPb1sh9izlgZVMbJvIONSM1F3+ZsMbqMJRYkxV6ABOvCHeCjamNvMIz6+Sk2AYcCW7qEOvJTcWp5KLXl2HV7mIFc4qf1t+jZTqVCdSarNjWT7z7mtcvBPQeF/EcD1k6mrUELyfeZxBUXewytYxuzqY8JfOxsmY9KaMY1fX6R4L1MibyDlbla8LY79JQWOnT9b2MAvN6L7C6hsJMMSesYb5n9aqbkbJfyXxkmNdm2FNcg5s/Was0a7TDLXWV63F/ub7yIyBykwvzVlLqvnLc//VGlFSbM04CWePuL1Jx88S1yu4UGsEtNdlW/oobJ67FVe+v9lTuEK+uVgdA8ZmwIktzd1MBens7sBNfYzPeWbLb6jCi0lsI2aGJRjBVJyVb9IqiMISY++M2aP52WL3tWbG952zKM/QIMmLevsZmA+ZqLiGEqTvgqoxifLo6tjuFVgn3MHlYJp0KjFiMqnNGRmE1tu3vPCAghcZE3gWW7SqyOgTTfLwiC+ts2re+0ZWQNQ693Rhr0W6LCwCFtP6auZuj3x5XuauFm1e09R+q1j1lbxn+/WUqDlTEVssfj9mKmhWY5eOV+hJSO/X4tDarBB8uz0Sz3/ygtjDBUyJiG/kYdzaV+2iTBfuWUzGRN0mzXyK9oAqnHNnP6lBMpTq5fWNRutoZKhRPIablrkppTfSBQcyozIn2M4M/l7Klv/4PlmViU065UWFFZuOmNTX1TejeNQHdurS0dNyQbc8L1WChVsNfJqyxJLlzghXpRTjtp4fg8L49AADFJjcF0rrfdkriQnzvho+S4w8oVjbevXyNfoxLTMeoX/0UJxzR1+pw4jZn0378EGXAKqO4rB7HUEzkTfKvqSlITCvA5acfZXUopuLBGF24W+wfLs/E6oxi3HvxSW0nfz2yiqwdyfEgia/X5+BNG1+E6aF6nz7n5cXo3b0LFtx3Po7op387x3s7O1yCF62de6iPmcSHd+ukdRjYtweS/nsxunc1//E0o5uEeU249TIucTe+XpeD5CcuUTpfs5VU1+P+rzdZtny3PG9gBj7saoJmv0RiWgGA6MMxk3PFWlNbEqamPWVvGSYnZePZudsB6D8R3/V5fCPshVtetJ8Z/D0pgfFLM/QvX/c37C3c7erq+iYUVtXjuXmxbed4LyzCPZ/htqZNZol0MVNcXY/vtx2Icc7WbA+njpxrpXydHVDYjZQtHQWQMzCRNwFPiN4Q62bOLYv8MNT8LdpP/Kn7ynHF2yuRuKMA6Qb1TiTRsk9r7SJsemquJW2mzaAnxxmXGPkh7YxC7dur/XKfmLlVexBKxdktoUvzw+kpORE/b2z21vnArdv5oWn6xj9xEn+sbeQVx0HaMJEnMoCV1247DlTizs82GDZ/v1/iqvFJGPb8IszamNvp8+AT94TlWYbFotXMVDVdacZjh8YB3PTuOot3FuoPRoGEOBM0t9ZvPDbDmAureTou6OOherO4dTtvyeVDt8GUVlq6dL8xAhN5E3htf2xstm8/vxS/xTsLsTmnHA3Nfjz4TedaKbeeuAPcWsOoF9eDeYqr67HApEQ+mN02s5vbTtc3NWOtDXpls8MaVhlDnkvvCAcwkSelXv9xF05/9ke8uXCX1aG4ymdrstHgsQsku14QGB2X2U3xwiXkviY/np69DY9M24zy2s7PcQS3rZ+avNeI8GzB6ouW9Hx9
Y4WEEutviHsEX7tdCdjYRJ1dkhol5u4nFcdB2jCRN4FdExIjvLc0A/VNfryzRP/DjU7X/oSlutbomSBtgxwAACAASURBVDnbMW1D5La3qlm12/LEr4W6lRRuTh+tyMTU5L2YlpKLl79L6/y9oC8+PXubvuVyO2tn4LralV+F6z9MwiPTNsPPXocslbK3zLB5r84oxs0T1+Kb9fsARM5LYh/7I8YvhhBrO30vYiJvAjffClRlyc4Cq0OIm9HlzoQV1rc1N4Pdy2+jEtCGJr/mmrBFOwowcWWW5geOY7E6o6Tt9bcbOj8LYfftZFex7D5Gjvz898nrsT67DNNScjE9JTfqdr3q/dWGxaLFd1vd2/Obkde2N01ci1UZxXhsxlaURBnDoLa+GfM0DEzXicIy4c4pxj3n5TbsR54MpfW4/vtkdx20Xk5yVF242nUVtt+2Kk+8WcU1+NN7qzCof6+o085IbUms8xW0/WTNeGhSSkO6Xiysqtd8wdbU7EeCEIZuo/3lB3vNWp1ZjGHHHhZxessGdXOYWPYfs47FvHJfxGXlV/owZY21TeXqm7zVlDQeTORN4OWkzkvclhBZ1W2q29ajHtv2V2Lbfm292wDAxFXWtal1+3Zanl6EC0/5ifL5vvrDTmzJ1ZYM/+bVpejVvQse+f0pcS9Xy+Hs8k1qquEvJuK5UadpmvauqSk4+/gBju6zf9muQgw9pr/VYXgSm9aYoNLXaHUIZAJesHlDfVPokXidKtbkwcE5hybta6pV0zowYH6lD3uKa3D3F6mGxdKZ2oIsKbPYk2OplNY04N6vNmqa9oft+Xh+/g4s2mFOE1Mh1J+vbv90PXJKjTtmKDwm8iZ4O8pAMG4lpfRsDY/3TlsHqThB7CmuwYr0ovhnZIDxSzOtDsEW4t3On67ORpVDKjncUI5ZlUs/+M1mJGdZ36UiGW/WRuvH6/AiJvImKI7yYImbeSmh3VUQfxdxdmLltvuHgQNaxWtVRrHVIbhCk1/izUXpVocRlpEPmFpBy7MrRjXtuOdLM+8oUDQrdxfj+23qxyVgxx7WYBt5IkX2FNdYHYJrZBRWWx2CZ1jZ7OHT1dmWLZs6K63pPF6Ait2DXQnay6s/7LQ6BFKINfImMLKLODvzctntxTahqjhp1bmhnXis69sNvz0St/++UJanFznq+CN74b5jDSbyJmjfHzO5W25ZrdUhWI6FOZH98Lgkcicm8kQKXffBGixOK8BqtqMmIgdK3sMHU4mchG3kyTASwMZ9xg05bUf5lT6MdsmIdFlFbPMfTaaH1pGv0V3dbuqRus/5gyBprZB/evY2Q+OwG6MG/vIiNim1BhN5MkxBpQ9VPm8+H0DuNm1DDnxNfrz24y6rQ4kbT73hVfoa0ae7O06TseZYbt8/pAT2FFfj2MP7WB2K47l9X7Erd5RQZEuTk7KtDoHIEI9M32J1CMrYuQtIKy1OK8SYudtx5CE9rQ5FiYwi63qCKq+173gBz8/fgclJ2Tj7+AFWh+J42SV8RswKTOTJMHUN3r0V7yU78iqtDoFM4LW75olpLaNs7it1R3KyOSe25kFub3QSqHBax2cD4mbXQfzcjg+7kmGmJu+1OgQywV8mrLE6BCIiIk9SlsgLIX4mhJgkhMgTQtQLIbKFEOOEEIfpnM+1QohlQogKIUSdEGK7EOJxIUR3VbESkTpeHSfBa75YywtzL/LYjRgix1GSyAshTgSQAuAOAOsAvAUgC8D9ANYIIQ7XOJ+XAUwHMBzALAAfAKgF8DKA74QQ3VTES0TGYc8F7vTigrQO//f7LQqETFVe23m0VyKyD1Vt5McD+AmA+6SU7wbeFEK8CeBBAC8BuCvSDIQQwwA8DqAcwHApZVbr+6J1/ncBuBfAm4piJiID5FX4rA6BTPDD9nyrQyATFFczkSeys7hr5Ftr4y8DkA3g/aCPnwVQA+AWIUS0vp2uav07MZDEA4Bsqd57ovW/98QbLxERERGRG6hoWnNR
69+FUsoON1ullFUAVgPoDeDcKPM5qvVvVvAHUsoyAGUAThBCHB9fuEREREREzqcikT+l9W+4zoh3t/4dHGU+gTHtOyXqQoj+AAIPzZ4S/DkRERERkdeoaCN/aOvfijCfB97vH2U+C9DSRv4fQojxUspsoK2N/EvtpovaC44QIiXMR6dG+y4RERERkRPYZkAoKeVqIcQnAEYD2CKEmAGgFMD5AM4AsBMtiTj7SiAiIiIiz1ORyAdq3A8N83ngfS3Dyv0DLd1X/gPAX9DShW0ygAsBPIWWRL4w2kyklMNDvd9aUz9MQxxERERERLamIpHf1fo3XBv4k1v/hmtD36a1h5qPWv91IIT4JVpq41NjiJGIiIiIyFVUPOy6tPXvZUKIDvMTQvQDMBItgzolx7oAIcSFAH4OYIGUMlxbfCIiIiIiz4g7kZdSZgJYCOA4dO7n/TkAfQBMlVLWBN4UQpwqhOj04KkQ4pAQ7x0LYCKABrQ0ryEiIiIi8jxVD7veDSAJwDtCiEsApAE4By19zKcDeDJo+sBY3yLo/U9aE/dUtDzoejyAUQC6AbhFSrlFUbxERERERI6momlNoFb+LACT0ZLAPwTgRABvAzhXSlmicVbzATQCuB7AwwB+A2A6gF9JKb9RESsRERERkRso635SSpkD4A6N0wbXxAfenwJgiqqYiIiIiIjcSkmNPBERERERmYuJPBERERG5Vkvv5u7ERJ6IiIiIXOuVH3ZaHYJhmMgTERERkWtNWJ5ldQiGYSJPRERERORATOSJiIiIiByIiTwRERERkQMxkSciIiIiciAm8kREREREDsREnoiIiIjIgZjIExERERE5EBN5IiIiIiIHYiJPRERERORATOSJiIiIyNUam/1Wh2AIJvJERERE5GrTNuRaHYIhmMgTERERkau9/F2a1SEYgok8EREREZEDMZEnIiIiIleTUlodgiGYyBMRERGRq7kzjWciT0RERETkSEzkiYiIiMjVXNqyhok8EREREZETMZEnIiIiIleTLm0lz0SeiIiIiMiBmMgTERERkauxjTwREREREdkGE3kiIiIicjWXVsgzkSciIiIiciIm8kRERETkbi6tkmciT0RERETkQEzkiYiIiMjV2I88EREREZEDsftJIiIiIiKyDSbyRERERORqLq2QZyJvhp7duJqJiIiISC1mmERERETkatKljeSZyBMRERERORATeSIiIiJyNXfWxzORJyIiIiJyJCbyRERERORqLm0iz0SeiIiIiMiJmMgTERERETkQE3kiIiIiIgdiIk9ERERE5EBM5ImIiIiIHIiJPBERERGRAzGRJyIiIiJyICbyREREREQOxESeiIiIiMiBmMgTERERETkQE3kiIiIiIgdiIk9ERERE5EBM5E0gIKwOgYiIiIhcRlkiL4T4mRBikhAiTwhRL4TIFkKME0IcpnM+vxFCzGn9vk8IsU8I8Z0Q4g+qYjWbhLQ6BCIiIiJyGSWJvBDiRAApAO4AsA7AWwCyANwPYI0Q4nCN8/k/ACsBXNL69y0AywH8FsD3QognVcRLREREROR0XRXNZzyAnwC4T0r5buBNIcSbAB4E8BKAuyLNQAjRDcBYAD4Aw6WUu9p99jKAjQCeFEK8LqWsVxQ3EREREZEjxV0j31obfxmAbADvB338LIAaALcIIfpEmdUAAIcCSG+fxAOAlDINQDqAXgD6xhszEREREZHTqWhac1Hr34VSSn/7D6SUVQBWA+gN4Nwo8ykEUARgsBDi5PYfCCEGAzgZwCYpZYmCmImIiIiIHE1FIn9K69/0MJ/vbv07ONJMpJQSwD2tMaUIIaYIIcYKIT5DS/v77QCuVxAvEREREZHjqWgjf2jr34ownwfe7x9tRlLKaUKIPABfAbi13UcFAD5FywO0UQkhUsJ8dKqW76vG7ieJiIiISDVb9SMvhLgZQCJaeqwZgpYmOUMALAbwHoCvrYsudux+koiIiIhUU1EjH6hxPzTM54H3
yyPNpLUd/CQAWwDc0q69/U4hxC1oacJzvRDiQinlskjzklIOD7OMFADDIn2XiIiIiMgJVNTIB3qYCdcGPvDgarg29AGXAegGYHmIh2b9AFa0/jdkkm5nbFpDRERERKqpSOSXtv69TAjRYX5CiH4ARgKoBZAcZT49Wv8eEebzwPsNsQRpJTatISIiIiLV4k7kpZSZABYCOA4tvc609xyAPgCmSilrAm8KIU4VQgQ/eLqy9e91Qogz2n8ghBgK4DoAEsCSeGMmCjbuhqFWh0BERESki6qRXe8GkATgHSHEJQDSAJyDlj7m0wE8GTR9WuvftjYnUsp1QohPAdwBYL0QYhaAvWi5QLgKQHcA46SU2xXFTNRm0GG9rA6BiIiISBclibyUMlMIcRaA5wH8AcAVAA4AeBvAc1LKMo2zGo2WtvC3A/g9gH4AKgGsAvCxlNKRvdawjbz9cQsRERGR06iqkYeUMgcttelapg2ZN7UOCjW59Z9rsI28/Qlm8kREROQwtupHnsg6zOSJiIjIWZjIm4BNa4iIiIhINSbyJmDTGvtj0xoiIiJyGibyRGDDGiIiInIeJvImYNMa+xOskiciIiKHYSJvAjatISIiIiLVmMgTgU1riIiIyHmYyBMhtoddz/x5f/WBEBEREWnERN4EbCPvTn87++e48zfHWx0G6fCv355gdQhERETKMJE3AdvI218sF1vdugjce8nJBkRDRnn88iG6v/PKNb/En4f+1IBoyAgn/aQvfn/akVaHQQY78pAeVodANnbNmYOsDsE0TOSJAPhlbBdb7OzG/f5w+lE467gBVodBGg3o3R3PjTpd9/d+emhPA6Iho1x95s/w7t/O1P29R/9wigHRkFEe/N1g3d+55dxj8eYNQw2Ixp6YyJuATWvsr8rXZHUIZFMCAjLGCz1yjqTHL7E6BNJBCKBrgv5z690XnmRANGSUX/z0EJzNipSImMibgE1r7K+6Xn8iL3iJ5g0C8Pt5DDsJ75QRuYOA/hzKa8c/E3kiAAP7dtf9HSE4kJRXMI8nspeWBI/cTgiWv9EwkTcB623tb/ixh8X0PW5Z9xOCCQOR3QgBsMWbN+ht2ui18zITeSLEXrMe60Oy5CxsI+8gXjuLexibrbofK1KiYyJvAhY27tWtCw8htxPgBZvTMJd3PwHBJhceIAS3czTMQoji0LNbF6tDIBPwREJkLwmCd8q8IEHo7zXMa8+uMZE3AdvIa3PqUf2sDoGok5YaISYMTtHFYydxr0qIoetJcp4uQvBZiCiYyJuATWu0SXDYCdhrV/2q/OG0o6wOQRcBPlQXi0H9e1my3IQEsG2NB3ThBbZprDzVJSQwh4qGiTxpsvThC60OwXaYK8TGidc/vIVPZC/NUvIC2wMShIDfb3UU9sZE3gRamtb8bsiROPknfU2IJja8i0lexX6MnWV1RonVIZAJymsbmch7QJcEwfr4KJjI28ShvbrhzvOPtzqMsNjOn7yMCYOzsLxyv8ZmP5vWeEBsD7saFIxNMZEnTSIdGJeffhQ+vvUs84KxyJf/OMfqEMgCLd3cMWHQy2snUzJXs59Na8xi5aHcJYEPu0bDRN5GnFqLdOEpR+DMn/e3OgzDjThxYIf/M1HxDraRJ7KXZr/kQ5AmsXItJwj9D7s6NZeKFRN5ipvw3GFDXsI28rHhhS4ZiTXy3pDAAaGiYiJvAi1Xk04+6Xm1VoSXL97BpjXO4uTylLRp8ksmeBr859LBcc/D+qY13NCRMJEnTaKdGNmnOrnZyJMGRp/IRVT0Ac8LXXvr37ub1SHEhU1rtOnaxdnHYUIMA0J5LR1hIm8Ct5/QpLRHn+rPXPmLiJ+/d+OZUaehyD68ebjVIZhOCO8l8scN7G11CGQwv8Ors5v8fjat0cDp27lLgrVt9J2AibwJNDWtMSGOeESrcXfCFXAXIdC3R1dl83PCb1bNyeMJnHfC4VaH4ClWHh+xLrpP9y5K47Azh+d3rW3kHf4jTNDs
+MGU9Pca5uDTVEyYyJMSTrjrwCI/fk5uQhVr6E7Yt72otwFJ9zt/O1P5PO2q2eGZfJOfDWu0aHb4xQ4fao6OibwJ3JAIOP8XqOe1dfLsn9Q0SzL6WmDq6LPj+v5pPz1EUSRkpE3PXBby/REnxn7nZfCR/WL+rtM4JcGbf+9vQr5/6lGHsO20Bk5oWnPZL47E7SOOC/nZ4CP76u9+0mPbmYm8R6h4sKlfpGYpNjhwjujXI/pENojTqa4/6xhHrL6zjh2AE4/o0+l9rYX7q9eeEdP33OL8kwcqqQEzerV17xr69HVbmISgvR5dE3DHyOjTudmFg4+wOoSoEkT4C+t/X3xSzL1J/fOCE+IJ6//bO/M4KYqzj/+emdnd2fs+2IvdZU92l4UFlnO5LzkEwfsEMdEoKp4hahSjJCZRo0aN5vWK5no1l4kaTWLUGA3RV6PGhMQjEo1nFAVBQIF6/5gemJ3p7unprr5mnu/nM5/e7aP66aeqq56ueuopX+GHcYuminws7B2meoyIsM/37kH2woa8l7Cx5fvhSRMtp3GnRk+ngDeMnUUaFUEsHhAz43GiI9CKC1BReOhHb6aVmW8e2ue2CJbIzQoayn+fdEjbxmXLetwWwRBaeZmfHTTt51+Wn21BIn+xakqz2yIYwgs2hF9hQ94jEEHqRMxYBprLMNKiuwARMKaxVPu4pdTlEDAwE9NtH+8zZ7e5ev9MQCuL08HFzQlqisNS0nH7XWP0qS6Sk89uQZQ8vvhRA41Dr7FTIA/SW1eMigIDI9VJcOJd1rtDypNdFXllPLsfYEPeQ8wbWY3h5ebDvs0bWY3SvCzcfNzYhP2y0DJEM7HRNvPIc0dWo726QL4wDuHnTkzTk10zsGynO5ofe5zVaUVZvr9j5Vtl5DD/zPfRe/fMjp5lyvvMhrwDGPVRCwUDeGjtNDy0dhp+d/Y0VBnx+Y7h5uPG4pmL5mJ+d40ZMXWJjhacMn2E6nE73pcff36i5gQYs8iV01xqdvYMW50LcVCP/LITjxOVa4bU37bhd7cTzv/MIFlPbXxdK+uj/PBx9VLSYWLRzptk+fzouTNUU/J7PWYUNuQ9RjgriI6aQrRWFeKJdbPQWZNaFAUj7iWpcsHCThQqfsO5DsZZbirPx6FjU6swz5vfoXvc7S90ImDdQZ37/+8aVoTcrCAGmsrw3MVzLaV9yJg6nDtP//mT8R2dBZ/8ZBzJzGcZSU1sKcOmryzAVYdZ9z8/fVYr+uqLJUhlL26VFzff8dtXjccL6+cZm3ifhJNTmJDpdr3mFm4Zat84tA8Prh2Uks93nzxJgkT+R7dHPsm1hWF73JL9AhvyDmC2BzYrGMDNx43F7M4qyRKlxuenqffCRxHCvoYk1Yr6tJmtmNOl7UrkhQZvRkclvnVEHy5ZMhI/+8Jk/OXiubj7lEmWP8K+srQbRbn+G0o2kyd6cw2IrI16yCgjlx7cPeT/s+d2IDc7iBVj63HGrFZLaZ8zr8OykFZc+NIZqz22rZUFKApn4Y9fnGlZli8t7DJ8bsALFZsLuNnh2llThCfXzbKUxqLeYRhoLpMkUSJ+iFgTRa8EZ0rPulnYkPcIWobH8PJ83LpyvLE0XKzM7XIXMVMRFesYs25PeCQQiAiHjKnHqinNyM0OIpzln9UkZaykGJ/Ew2dPR31pbkpphHQ+emTnsZnX6oTJTfjbpfNx4cIufPuoMfIba4v58MvT1GNz24lTUVIi71jy82SvCnrJkpFoKIt8IOWEgghnOde8ar0OK/qddQHpqC5MmGBqJ0lda+L0IsPloiUmtG1W0H0TavGoYY6vSHzTsf14/HzrH6ux6NkvKb+rGfZd634pZGzHbgNfQHiipzuKU70QZp7ZTj35dVJmS2UBJjSntoiPk49qVq/5OSF8bloLlvTVSpbIOsGg/jPZ8Q4dN3G49DStIPMJrz96jKth/tTKaGdNIRrKUvtAtkp+ThBfW97r2P2S2Xd29OTesdLagnND
kFCPzeiossWlNhmyox7p9sgnu9anbZ8sMtuxiPE8sitit993t+/vVWQajkTqejZa2ds+jOuBQiC7N1oVCY95UE8Nfv3iO6nd1sB9tUZtUhX5d2dPw9ade9DfWGL4HnagZsdlgnGTajmOqsSsar6/egIaPeaWRoCmpesnlxT9qDUp5nOGdcmzIc9Yxs6XxlQ95FDlZeap9XRlVYsEhww0hnGIi5eMTNmQN4qMV6W1KrVgBHah5iOfDqZMsixKlodaxmEqeV9fmovDxzVgWHEYU1pTGzl0goADfhVOlCW9ttHswl/+DphsHHatcYCOFCPP+A2vudY4hZkeL5l6+uohzg1hAxHZbasWJSaspeIMLKKuIkPfpXnmVuA027ng13pMzZAPBPzTI3uTTrQsPcwbeMYpDGfhjNltOGxcg+lRjtENJcgOBmwJXOFW77MZVUxrrzSVXvIwo5kNG/IOcM0Ro1GSl6U7ISXZS3HrCeMwrDiM5f11kqWTg22TXXVe4OoijdBfHn6rZYh2zymTcMeq8VjUO2xo2uSP4XQ1Eb1kb3gt0kNLRX7CPqMSBjV8Z632dDJyCGcFUFmYg88NNqe8bkgsarnsJ/eCSSPM9XSn+q66pZOrD+/DC+vnGQ5ckQq2drBIZq7ZxSlTnevqn6IvBTbkHaCpIh8bvzQbGy+YbTqN2V3VeHLdLFx9+GiJkslBZvjJLmUluvL87KQxeu85eTKylEl7txw/LkYg7WvcNnRl3H58UxlmdFSB4t5eQvIly43w/dUTLKdhN1Pb9Hp2SDWfvVK5GxVjQnMZDh9Xj0MtLD7jwhy4/bj1rhm9q90fbEYef3l/PZ66YDYuXDRS+r3czHun8MlUDwDwVXQy29DJMCtx5DMd9pF3iHBWENkWQ1W5bYTqIUuyW08Yhwf++jZmdlYhGCDdF7ixPA9PrJuF7bv2oKWywFD63tVg6vlr17NMbauwKWVtUvkAuWhRF0Y3JE4ujMWKbrzSG/2/ykIxNz/2asIxo88X6YH0yAM5ickC4EaPbfS9t5JLqlFLSL/+9BJmmza/hyWUIY5u2EYJ6TuFvo+8uTCjmQIb8h7BwzZ6UmTKXluSi5MGD6xomKyerioMI36+md4l7q/4aV9G+7kMGWVqa8WQ8qEGQSNqjT0i2Y5aefZDA+2avg18uxDZ/8Hm5PP7fbKrWVlTzUI/6cQoAUqPIAf6UWvkpZWOsGuNg7hVuOy+bcS1xq67yK2gvOw3mqpkTo/QyNKdWqVsNJeT9czIwP9NYnIKc0JoqzI2iuUWoxtKTLsjeG2eg92oGvIerOpkR31JGrVG6t28iZ2R0PTvq02rRt1itpMt097nVGFDnpGCjArDb6+qFxtKWYwdXmpb2qqTXXUyP3ZBpWMNLCqklS9ecU1zVAwdXdxxovbCNk6/i501hThRWVCpqjAHNx7Tj+/pyJcMs997ThcRWd+lfhmBOmRMPQaahq5y3FyRb/rddOLDXgZ2SumOb7l+ft198iSU5w+NODWnSz9ij9rHaJRk0Ym83EHnBOxa4yBeMSRkIyCnAVQbHjRTT+sNM7qdBe679mizaNSBKDjnz+/AEd/dKPkOEfKzE6sdvWxev2QkyvOzUVMcxkE9NYbu4XY++4G6kly8eOl89FzykC3pp5oHX17chWVjatFeXWhpYiDBmG+4P0xAY6i5yAeIcHDfMFz38MuOyZGsjSMAP/zcBLz83nZsensbnnptCz4/rcV0XWY2/KRXenhl2ARenNRckBPCH784C29+9AlufPRVbN+1B5cv68GDf9NeE0LvMdLBdchOuEeeSWDxqGHJT4pDRoWk9q42lsldRc+DdZ40rGbBhmU9+/8uDGepniOjPp3eUbl/6PWM2W1Jzy8vyMH6g7txyvQRhsoZEVnqobG70XCy90jtTretPBDhqSAnhDUzWx2TRw8iwqh68+40sXihp9bJjhu13syTBpvRWlWI648eg5On6c8rcZJQMICuYUVY3l+P
K1aMMhyoQI2Uw0+ayBKvdwo0lOVh9dRm1WNuvga52UG0VhXi6sNH47vHj0NVUdi0PKnPafZ4pkmGe+Q9g3cK3rS2Stz3wtuO31et8a0qCuMrS7tx3wtvo7u2CLc/sdl0+kLIXQVPdgVvR4Nx1EADXv3vDjz12paEY1Nay/HEKx/s/78kZvGdutJc+cIoBIjwwBmDeH3LDttWxvR642sVow1bvB6yQwHM6qxO2GcH/Y2leOnd7QCQNJSsbIzoR+0c54vNASGsGF15cWuUtFUVYH53ZPRq8ahaLB5Vix899Tq27dpj/iYGqCkOm7rO9Pvq/veaqywfU4fu2mK0VBRg687P8OlegR899brbYpnCkotQfNSaNK//4+EeecYzaPWiHT+pCXefPAmDLoRF9DKJIbcSa69QIICisPr3+r592mkX52bhspgeetlkhwJDjHgnhk6Nxxf3FkZUY3TFSLURLlW9SFDCOfM60FtXjLqSXNyxSv5COHok66lNt3Z+TtxCOyunNDk2InD7qvEIBgiFOSFcssRcPHyzPaipjrx4rafWqjTrl3YDiPR+X7q0B19bPnS1b1lFID4dzcUYrd1F84jZETYPDMw5gjRDnojqieg2InqLiHYT0WYiuoaIDM2aI6IZRCQM/Bpkycyo49awtKzbOtWEy24UrKaXaqWdLJ+XjEpcOdYuMqS+BWCPHtcf3G3oPDV/Wrt0XxgO4VenT8Xj589Ed22xTXdJxEhoSc3D3rLzDBOKy1i9iYOymdlRhT+tm4WNF8xGVaG5HnmzVBcluV+adc1+9ZA4Qz3J+TKb8qsO60NBTghLR9diTGOp7oei1iGtDhshhG5WTW3lTjw9pLjWENEIAE8CqAJwL4B/ABgAcCaABUQ0RQjxgU4SALAZwKUax3oBLAfwohDiDRkyZzqnz2rFc298hBf+sxVbd3425FhDqn7pkmoLJz4g0qlet2r4J9O2o71XkrNeTfJ0ynuzuNEjqbpYkc0EPTADMHUJkr8E1x45rnfLjQAAIABJREFUGmf++DmVe8Ub8iry2PgCVCUzqJNgVrRjJw7HXRv/jX9/8Iml+9tNqk3b2OGleObfHybsH1lbNOR/Jz/YVoytx7Ixdba9W3qpfn3FKEy+4vfa17r/uruKrB75GxEx4s8QQiwTQqwTQswC8C0AHQA2JEtACLFZCLFe7QfgU+W0/5Ekb0ahVsiPmzQcd62egMNVln+fPKIcS/pqUZqXhRuP6XdAwgh7HTDknaz41HA6rm93bZHmXc25s/ik79zDNbsdkuVo+LnHG3h+CVNoBQJQUZCDCc1luuf4pizHsHR0naHz0jVCWjzhrCAePnu69glxdZxf1PKDkyao7o8X3+nnsfMDWa/M1pbk4r7TpxpPS4ZAPsKyIa/0xs9DpEf9hrjDlwDYAeA4Iso3mX4FgEMA7ARwp3lJvY1bFcyZc9oT9hERvn3UGDxz0Vws7E09go1Z9Hy2U0HPPh1sq0BZXHzbKBNbtBt+VVye7Grk/MPGNWBhr3rIRrOh26ziRFFXv4exO/vRr1KrRzS+jBj9kE3qY+6DlvL7GsZQFPXJrj54MBXsWqLeAwMbSQkFM2eqX6rvs9GoPrkSIkXFoiWV7oJQUgXwQcGViIw3YKay/Y0QYogpJoT4GMATAPIATDSZ/gkAcgDcI4T4yLSUjCoFOSHNL12tIfGrDuuzRRbbw/5RpAfnp1+YrHr8qsNH23r/ZFite9R6NIIBwjKNXrykrkyOetZ4yXr2kizGdVNlIDKMao+8iXx2e2RLj+h7kJVmBl5HtfEIT2p1gZksKy9wJtqQbcUpLmEzt3G6qJ83v0Pz/TIywmaG3Gy5hrwZPFyleB4ZNV2Hsn1J43h0NYrErl9jfE7Z3mz0AiJ6Ru0HoNOkDGlNqsNlK8YmuuPIQFYPsVYyUbu1uSIfdSWJ4RXrSnKx4RDjkVrMNQraV+WEgpjRUQkAWNCt3ove11CS8j21Psi81PPshCxR
1d90rHPuYk5y28rEyDDxOS/LADdSZXi5YSYi+8uc5OcvyYus7bC8P/HDPDGfE68387xO9cj7dSTEKmrvSHt1oWZIWCORysxgJp9l51imlgEZyDDko+EItmocj+5P2QIhoumIfCi8KIR40oRsnuOYCY0AgEMtGMPNFaa8lDSx2qDJag+djJYzOsYgbiizL2Z6PMlGHW45fhx+tWYqboibmzCsOIxjJjTi+qPGaF6bajWYTBa1RsauLJKdrp4RuaBnGJrKtSd0210MjRi4583v0D2u1kvfU5c8Moy8cHT+bnS1yr7Tj2WmrF2wsCvpObI+2Ni4kkXqGX32XDW316H/+8H1KRa98p6syMavlcAcwOtjj59Xtt9N5SIhxFi1HyLRdFxlwyG9ePrCObjSgnvKd47tR2FOCMW5WZjdWYWsIOGMWd5YndEKsnrk40OxqXHJwSPRWJaHysIc/M/x45Ke7xShYAC99cUJoyQr+uux4ZDeIRGFEnpnDITdi8UtH3k1ZA/tqkatkXoH8xTlqq+aC0SiSZ03v0NzpUarGDXAk5UjQz3yntG4Oxh5+rzsA4HjjL67BTnJg83JcqGSjZYMVmWLjmCWx81/Spwcqkxz9lDdF09UZtXIQwnRiWRlqn46au+yrr+7DYWtpbIA09ojI9ZfmDFi6P3i769sPZzNUpERfjLa467VHRTdn5J/OxGVAViByCTXu8yJ5k3UVjlMpdh31hThzxfORoAI4awgdu/Zi5yQtiGkbtQc2GvVP1la/HdJCZ07vwP3PvemrqFaVRjGo+fOwF4hHPWl9VJPppl8t0v8Ly7oxK+efwt7bPy68IrqjxjfgBsfeRXvbNuFLy8eicvu+/v+Y6unNg9ZYVc28iZBekSZKsSKdtRAA370VGLEYiKyfV7G6bPasOGBTQCAioJsvL/90yHHc7OCOD2FDhhdlcf7gjvoQpUKWlW81dtcsaIX09orMbGlDLOueuzA/Syma4T87CB2fLpX/xwDH1/xqOVh/MrkSbPZoAKS5bMTc5iMhKn93qrxeHfbbtQUh/GdR1/VPM/D1ZMtyDDk/6lstXzg25Stlg+9FtFJrt/jSa6JxPbkqBnxNUVhvLNtFwBg0ojExRS8NbkwgizXmrqSXNx/xiDe2PIJPn/XM5rnBQKEgIUmxEtGOZC6PMmiBNk3/ywx5ZriMB45dwbe/Ggnrv3dy/jTv5ItO2GNKa0V2PyBO0uZ54SCePS8yLOOqCwYYshLX2QsTteyDDMjyRgZGbObdQu6UFUYRl1pLs7/yQtJz5cp8fGTh+Pj3Xuw+7O9GFYcxvpfHcjnX5w2BXUluSjViKBlFVkjUl6r47QoycvG0Yrbqh52PM2PPz8J5/3keYyoLMA/3/0Yr7y3ff+xUICwdHQdhhWru2+qydNcadx1Vlb+JEtGbR6dmTtbjVpDRKgpNr5mgT9Kr3VkdEU+omznEdGQ9IioEMAUAJ8A2JhiutFJrim51TAR7lo9gIP7arHhkB501BiPdmAGWXW9zM7YrmFFmKcxYVQWfq8kzHw4yfjW0op53lCWh4kt5VLKU7IG7osHdWJ8Uyk6awoTwo468YkbzgpiRGWB7fcxMtnVjDFgpEderYfNzGTtVIm9a3FeFs6a247DxyUuCG63e0VOKIiz57bjSwu7kBfXIzu6oUR1ZDaepaNr9/99zIThmuclupAknuO9rpsD2PXBkJCqqUmd+hf11hfjwbXTcMMx/QjGPcdf18/HVYcnd6O9a/UARtUX4+y57br1gl1lNtkzxj8XEHm/542sBgCMbyq1RS6zRJ/Hy2VeJpYNeSHEqwB+A6AJwGlxhy8FkA/gLiHEjuhOIuokIs0IMkQ0CKALaTTJNRmy67G26kJcd9QYzcp/iGuN1cmuBq5vMdDLcNky/YgxTvg1zlUqJiAyqiEbs9lstXx878SBhH2p6lNWGdWKyCCTZKIWhbNwzymT8eDaaWgqH1o2vew/G2Xt7AMDoPH+onrI
m+xq7rrrjhyNzppCjB3urYY/itd6oC9Z0o2Vk5tw/oIOLB5lfE0Pt12fbl85HrXFYSwfMzTCjqaPvEY6fngX9TA692ewrRK/XDMVZ8xu0z3PLn0kGzzTcnu56dixeHDtIG44emhwBnOjPyYuYgDIca0BgFMBPAngOiKaDWATgAmIxJh/CcCFcedvUrZaWWdqkitjjlqVUIxOkh0K4PKlPVjk4OJTWlQVhvHDkybg6c0fYlxTKY655c9uiySF6e2VGDe8FP8Xs+x38kV/7KlZHTHkVUQ/ciD50LvbqOWJWuM9u6sKX1/Riy07PsPxkw58rF+0qAuX378p8QIFw5Ndkxw34s+qxvDyfDy4dhoAoGnd/abSSIZf48er6bwsPxvrD+4esi9b5fmMRDMxk2PxPtlGmdlZhSfWzQIR4Wd/eXP/fr8b5rLRex8jq3K7Lweg7SYXCBA6a4rw3493G7qP3jw4tz8+/YyUGk/plR8H4A5EDPhzAIwAcC2AiUIIww6vRFQK4FCk4SRXLxFrMJTlZ+Nry3sxsaUMd6r03CZNy2LtfMyERhw+viFpPHun3vPJrRU4c06boaFvrzE+Zln62jhfwqa4sKVuRa1xw9C6ZMlIzFAiHqQDRIQjxjfiCzNGDJlId9Jgi+51TvrIu0FbVQFG1ScPwwn4d9g9EKCESFuJCwXJmuzqzByiVG8T39Ofjkxvr8SR4xNdwlJBVhl3om1WaxeMRu/y2kia00hrUYUQbwghVgkhhgkhsoUQw4UQa4UQH6qcS0IIVc0LIT4UQuQKIfJ4kqtzHDXQiB9/ftL+8E5McrTqjh+eNAG9dcUIEHB5Encho2j5lcdTV5KL644agyPGNeDO1UOXqI//3prffcCNaKBpqJ84oG6syWgYinXCL6bCpJZyw+eumtKsWdl7qZdQesjGhJ5aWZPjvNlw3rtmiiHZtM6w66nqS+WOesa6AKrhzdzRCz9pXOLDxtbjihWjzN1fY3/8PBmzHDVwwPDWcoWKzbvDxmmvJ0NE+Nry3iH77ApSkUz9suqNCp2VgsvyszHYFgnMURQO4aJFXThLJZa+GvGdiVrirprSZCg9v+HPMUjGMn6M8WzV4Eq1N9KMhmqLw5jcWoFfrpmCZy6ai2Mnak9QS8Y5SiVWkBPCCZObVORTl/Dgvlp8/dBRaK3Sn0y5ZmYbFnTXYFJLOa4+wvy6Bqkwp6ta2oJmVx7eh/7GErSopGelfBtpLE+dMWLI6NWUVuMfFfGUKZFLggFCQTjR29HoR5waRiZBmsHJgDSLDPqGj6ovHhLNKxlG6pO7T56EixYdWIDJ7CjdpJZyLB41DMW5Wbj+aPVF3WKNQKs9zmqGsZmPL6+0EtGe6ZK8LFy2rMe0e56WDu5YlfpItBrHTByOzw0247Cx9bg0ziUqyoZDenDilGZcvqwHk1UiyrlBMkM91dXftZJb0leLjupCzeN3rBrAg2sH8fwl83DSYIuhNRNU76+x//z5mlMzfY0sH3nGIn40rOO56rA+nHPP826LMYRoHOny/GzM6qpK6dpk7bxankV9h4nIcmi502a2YnxzGVoq81EYTuzFzg4FMG9kNX7z93cNzS+IN1Bzs4O46bixhuWxWkbvO32qVL/PupJc/OzUKXjzo52YcsXvhxzrrS/GU5u3AEi+gE58o2LEwKsuCmOwrQLfOaYfH+/ag7ycIJ54xVzIzLtPnohf/OUtHNRbo9pgHjXQiGsffhkf79qTUtxxNYz2rCV3l3Ouvrp8aQ9G1RXjl8+/hb+9tW3IscuW9eDLv3gRALBhWa/a5ZboqSvCmMYSDCvORV1pLjbc/3fD/sCxEBGuP7of+/YJzfkFp81sxfsff4qdn+3FRYtHWpJbmguVgyMvPzt1Mn76zH/wgz8nhoW9eMlITBpRjjENpQhnGV88zqj44awgfvqFyfjOo69iQU8NzjXZjmUFA7hwkX7eVRWGcfESa/krG6d65IMBwgNnDuKDHbsxsOFh
1eOdNam3EUbLqeyFB70CG/IewemQy0W58rN+xdh67Ph0Dy6+928pXWena8MlS7oxra0SfQ0luotmOUUq9WEgQJiYxH3kpmPH4uX3tqMtSe+7GWS34WX52YYqXBnl4ay57Xjy1Q/w/vbduCXJyr1m70dEOEj5gHrwxXfMJQKgtaoQ587v0DyenxPCw2dPxz/f/TjlHjyvusCkQml+Nk6ePgLvbNuVYMgfOb4BReEQyvNz0GvQN14PNXVlBQP7RwWsfszqTRLOyw7h64eadBkh/f/N4mTp6W8sRX9jqaohn5cdwtLR9vrFjx1eiltOiNQVsYa8m69Q/PtrX/hJfUJBeUoIBghVhfIjwmUybMi7yNlz23H1b19CVpBwmsWeNiPcffIk3LXx31g+pk6qUeshF+MEwlnB/caWbMwsgy67RyAQINvXCYjFS/7kehTkhPDAGVOxd59AKMXJtV58xqqiMKpMhkMtz8/GBzsiq4rKCvuYrJzPSXH0yyxZwYCtBl684e7FhfQANRcqH1ryHsWL9YFRjAaiSOpak+S4beXPIrJWi/c6bMi7yCnTR6C1qgAtlfmOfKEONJdhoFnOpJ6U8eH7VC0pT86d145rfvcyDh1b725PhAN5MLGlDBv/tcVSGvJinZPUniQ9skPeaLjU+N6JA1jzw2cxrDgXp0w3HnNeD72n/crSbk+Ekk2GV41yGcgqjcnSMRpVxCt4xL40jW12aRK9uGY3GMSuuUB+gQ15F8kOBbDQBw1eplKcl4UrD+vDvc+9iZOnmTeA1sxqw0mDLSn5dnoBM24ES/pqLRvyfmRaWyXqSnLx5kc7cZyFCc520FNXjEfOnaHZS2ZmFVC9hvL4SU2GZTuQnsMtL0H1IRPDOA493l1bjKc3RwKxmZ2IZweJceRVJruaSFevp3bdQZ2WJvO7gVEdHNxXi18+/xaASHhkr2Bb1BqdYz11RfjSwi6dM6xjeVHKuP+9MiLgFN6piZi0IN1Gsg4dW49Dx6qHCEulqvCCEe921hi9vx/LUCgYwANnDuJvb23FhGbzEWxkE23PZDdsR4xrwHW/f0Vaen4ZAj9nXjs2/usDvL/90/3+1F6hqjAH7ykTcXvqEucLmNGwWhSlKLJGd7zIJUtGoiAcQlleNg4bZy2Wu0zsek1OmNykOrdtTGMJfn7qlKTX++PtTV/YkGdcJ9mETkYOVv3ziZyflG0E2SKZ7fUqzs3yTDi5KHZl1xdmtEo15N3ATC4XhrPw6zMHTc29sBMiwg9OmoDbn9yMWR1V+0OaWuWGo/tx5Hc34vUtn0hJz2nMTk4uL8jBVw+RHwXJKnZFXTl6oBHvbN2FrTs/GzLZ2IPVPaOCd2oixrcY7SU4beaBHpyD+2oxsaUMq6Y0DVmYyFf4rJY7e277/vjLWjGOY1HrxJ3ZmdxImN3pzERHWcQb7kbKc7qP3LZU6EdBys0O4tYTxmGwrcJwnHdfYCBfI3MvvNd0tlUX4quH9GKOxmJRptbFKMnFb86ahofWTsOYxhJrAjKWaSzL279o0kkS5yeEggGcv6ATG0x+vLhdHbp9f7fhHnlGKnpD5KfNbMXefUAoQFgzq9UT7iapMq29En946b/Iyw6iv1FOBBCnqCjIwWPnzcCbH+40Hb0knBXEvadNweA3HtE854oVo3Dnnzajp64YJ9/1TMr3SHcj2W5yQgHs3rMPQCTevRkuXNSFx1/+735XDTVmd1Vjdlc1Hvjr27j/hbdN3SdKpvm0eoXrjhqDM370F91zwllBdNQUpoWxFC1nfnHlUuN7qwbw5kc70VCWl/Tc6FMeOb4BP376DXsFY1zDe90KjO8wWiXmZYew7qBOnDu/w5dGPABcfXgfLl48Ej8/dYovn2FYcS7GNZVZMpwayvLQUKa95HxlYQ7OmdeB+d01ptL3cRvrCe48cQABinwwX3PkaFNplOVn44l1syRLpo0bhpXaPe2Kx24HE2IiiSzoSf6uqT3LwX21uO/0qTLF
YmwkGCAEAmTIiI9l/cHdOGtOu01Sec9H3svvrR2wIc9IxWsvtGwqCnJw4tRmR2O3pwt2GWtFuYmr3qbCkQMHolIsMPnx4SUmtJTjD+fPxJPrZhlaJVHLjzjLg+4jskiHdv6aI0fjc4PNuOHofoyoTL4gnNbrpzYxNl3xY76fMn0EQgHC6qnNpt/JcFYQ80y4sKZ7e54usGsN4wjL++1dlY+xH682ggU5IWw4pAd3P/0GTjYRSaO/sRRXHtaHV97bjpMGm/HO1l02SOks9aXGe+z8Ek/d6qqq8Rh5atn3lMmw4lxcuGik22L4Cj/21K47qBNr57R5egTYa2qNvrf+qNmsw4Y8YyuLRg1DUTiEz01rcVsUJkXcavTM3PeYCcNxzATzMa1jQ4ymgyHvN5zwka8oyMH72yM+/8PL89XlsF0K9/CjEWsVImB+dzUe+tu7AICD+/zZoSTDiE93l8Wxw0vxzL8jazyYGX3wM2zIM7Zyw9H9bovA2ICdjYLbDY7b93caq73OTtmHJXnWXKhuPWEcDr/5TwgQ4VtHjMYbWz7Bvc9FFv2Z0qoeAjcTjV8t/Doh+fJlvdizVyCcFcTZ8+zzE/cisXWZnSNvbleZRJGJ27c+/hrGNJYYcjVLJ9iQZywT6/ucaUZQPF4eineD2AgqlYU5LktjjFCQ8zAVgg4tLrB6ajPueHIztuz4FJcv60n5+r6GEmz80mwEg4SicBZGVObjSwd14rX3d+DMOW02SJxetFcX7O/xDJnMcye+BRaPGob7lChKy/vrUFmYg1tXjrf/xmnCyslNuOPJzQCAL/ho0a+6klxcvCQzXc3YkGcYifjF39gIMj5Kfn7qFPzv069jYe8w5IS86+MZS2dNIdqrC/DSu9uxfIw/h+JTwapxNaOjav+qokfbuJx9fk4Ij58/E+9s22W6x600Zg0EIkqYUxHf65xen3TWnmbdgi48/vL72LrzM9xu0jA20tFz4zH9OPUHzwIAvnNM6iO6ly3twcjaIoyqK0lprggT4ex57SjJy0J5QQ7maqxJIBsznTxEB8qTX9oWu2BDnmEY2xhZW4RLl6bee+omRISfnzoFz//nIww0lSW/IMPJDgVw/xmDeOE/H2GwrdLWe+XnhDJu2NwrFOdl4bHzZuLTPftsW2EUAOZ31+BGxYCfZyKKVGl+Nk6d0SpbLN9jdLS8KJyFtTaGqoxy1WF9OO8nz6O2JBcnTG5K+fpfrZmKnz77HxzcV+vYqKBXYUOeYSSSTq41iTG1D+xIdxeq/JwQJo+ocFsM31BZmIPZXek3wcyvfuF2EQyQJSPeiDqDAcLC3jRaLZhRZcXYegy2V6A0L9tUWM2euuKMCp2qBxvyjFTstO/S3Xhk/AObd+kJ5yuTTni9yawqNLfyNDOU9F3xg3EFPy99zTBG4VLOMKnjlebBK3Iw9pIp+cyGPOMbeJTbWcyou7YkV7ocDOMUCe5k7ohhC1x/Mkx6wq41jG/ww9d1pjaWBTkhTGwpx4x2eyc7eoV0yuZ0ehZGGy/Un5laP3oBL+Q/Yw9syDOMRNK5stRrg5+7eC5CJiYsMf6iMBzKqIggmWx45tsYmYZhGHmwIc8wjCp6ETvi50KwEZ/+LFDCAgbSONRbOkWdisfoR8lAcxkCBFy4MDMX10knjMxZM7smAOMd2JBnLJPOvdCpksk9eLJIp0W10o10NuLVyMTwk+fP78A4Xj8h7YivV3+1Zio+3bsX/Y2lLknEyIINeYZhVMk8E4Zh0hd+n5lYeus5Bnu6wOPhjGWaKvLdFoFJI9LZvcGLZGCnM8NkHDxynr6wIc+Y4pbjx6E8PxsLumswp6tq/36uLDIDzubMIxMMfnbryox8zkS4ZKcv7FrDmGLOyGr8X9ecjPQhzRQSYmo7lNVsTDEMwzCMMbhHnjENG/EMwzAM4024SyQzYEOeYRhPwT7yDCMf7nc5QCYauEZCUaYbmfLMbMgzUmG3iPRBP468g4IwtmL0w8lO
Q9ArNiZ/RALeyQ3GKpUFOY7ch9sDd2FDnmEYQzhVWfviY5C7N6XilRz3Rdkzyaj6kv1/Dy/Pc0UG/lCyn+uPHgMioCQvC2fNaXdbHMYBeLIr4xs6agrdFiHjmNRSjj/96wNMainPuMWAMoWGMneMOsZZNizrwaa3t2H3nn24+bixrsiQzh9KXmHxqFqMbihBWX428rIPmHis+fSFDXlGKnb22jaU5WH9kpH43ab3cOacNvtuxOzn9lXj8efXtmCAV3pMW+Z3V2NWZxWe3rwF3zx0lNviMDZRVRTGY+fNxD4hkBXkwfh0pr7U2Y9zHqB0FzbkGV+xckozVk5pdluMjCGcFcT09sqE/dyzlj4QEW5bOR579u5DSMfAs9Mtgu0AZwgGCMEk2majLD2xs5ONfeTdhT/LGam45XvpFbgNTF8G2yr2/z1vZLWLktiDnhFvN16xA0IBbhLtxCs+8kF2E2TSCK61GKnM767BzI5KFIVDuOlYd/wwGcYOrjqsD2vntOH7qyeguijstjiMDWSHAjh3XjvK8rNx4cIut8VJO7wyknfcpOHIyw4CAE6dMcJlaRi7uP7o/v1/X3fUGBclsRd2rWGkQkS4fdVA0mH6dOK8+R345kP/BACsO4gb/3SlqiiMtRwFIu1ZM6sNp81s5QXv0piicBYePHMaNr2zDTM7qtwWxyHs+4jy6qsy2FaBW08Yh9179mF+d43b4tgGG/KMLWSKEQ8Aq6c2I5wVRGE4lJYuF2qwT2QGkgFx5KNkshGfKU/eWJ6Hxgx3BZWFV9sDIsLsrvRvk9mQZxiLhLOCWD2VJ+AyDMPo4RUf+UzEq8Y2Y53M6TZlGIZhTMMmWGaQyaMRDONH2JBnGIZhGMZ2vDLZlWHSCTbkGYZJGTub44kt5fv/rivJtfFODMMwmQF/QqUv7CPPMIynOGX6CDz7+kd4b9suXHtk+oYMY5hMg33k3YN95NMXNuQZhkmZ5vJ8/Pfj3QCA4twsqWmHs4K488QBqWky1rHTd7q2hOPyewU2tRnGX7BrDcMwKXPlYX0ozAkhJxTAHavGuy0O4wCNZXLdnO5aPYBQgFCQE8Jly3qkps2YJxS0z5QPZ7HJ4RZ2fiyHeKVcV+G3imGYlGksz8PGC2bjqQvmYExjqdviMDZx+8rxCGcFMLw8D6fNbJWa9mBbJZ780ixsvGA2hhXzXAg3OXxcPQCgs6YQI4cVSU07unJqfWku5mRATG+vUl+ah3PmtqOjuhA3Hyd31fXS/GwMtlUAAJaNrpWaNpMcEhnkOEVEz/T39/c/88wzbovCMAzjCz7e9RnyskMIcq9b2rJ3n8Bzb3yI7tpihLOCUtMWQuDFN7dhRFU+8rLZmzdd2btP4B/vbENXTRECXFfsZ+zYsXj22WefFULI/XqKgd8qhmEYRpPCsNw5EIz3CAYIY4eX2ZI2EaG3vtiWtBnvEAwQums5n92AXWsYhmEYhmEYxoewIc8wDMMwDMMwPkSaIU9E9UR0GxG9RUS7iWgzEV1DRCnPhCOifiL6IRH9R0nrXSJ6jIiOlyUvwzAMwzAMw/gZKT7yRDQCwJMAqgDcC+AfAAYAnAlgARFNEUJ8YDCtNQCuBfAhgPsBvAmgDEAPgIUA7pQhM8MwDMMwDMP4GVmTXW9ExIg/Qwjx7ehOIroawFkANgA4JVkiRDQPwHUAfgvgUCHEx3HHedYVwzAMwzAMw0CCa43SGz8PwGYAN8QdvgTADgDHEVG+geS+CWAngKPjjXgAEEJ8Zk1ahmEYhmEYhkkPZPTIz1S2vxFC7Is9IIT4mIieQMTQnwjgYa1EiKgHwCgAvwCwhYhmAhgLQAB4DsAj8ekzDMMwDMMwTKYiw5DvULYvaRx/GRFDvh06hjyA6Drv7wF4FMC0uON/JaLlQohXkglERForPnUmu5ZhGIZhGIZh/ICMqDXRFQC2ahyP7i9Jkk4HdWKWAAAMZElEQVSVsl0NoAnAIiXtdgDfB9AL4H4iyjYt
KcMwDMMwDMOkCV5a2TX6UREEcKQQ4k/K/9uUsJOdAMYBWAHgR3oJaS2Fq/TU98sRl2EYhmEYhmHcQ0aPfLTHXWtt3uj+j5KkEz3+TowRDwAQQghEwloCkbCWDMMwDMMwDJPRyDDk/6ls2zWOtylbLR/6+HS0DP4PlW2uQbkYhmEYhmEYJm2RYcg/omznEdGQ9IioEMAUAJ8A2JgknY2IhKps0ghV2aNsX7MgK8MwDMMwDMOkBZYNeSHEqwB+g8gE1dPiDl8KIB/AXUKIHdGdRNRJREMiyAghPgFwK4AwgMuJiGLO7wWwEsAeAD+xKjPDMAzDMAzD+B1Zk11PBfAkgOuIaDaATQAmIBJj/iUAF8adv0nZUtz+LyMSdnItgElKDPpqAMsRMfDXKh8ODMMwDMMwDJPRyHCtifbKjwNwByIG/DkARgC4FsBEIcQHBtPZBmAQwFcBlAFYA2AxgD8CmC+EuFaGvAzDMAzDMAzjdygSECYzIKIPcnNzy7q6utwWhWEYhmEYhkljNm3ahJ07d24RQpTbdY9MM+RfA1AEYLMLt4/OCfiHC/f2M6w3c7DezMF6MwfrzRyst9RhnZmD9WYOq3prArBNCNEsR5xEMsqQdxNlMSrNxaoYdVhv5mC9mYP1Zg7WmzlYb6nDOjMH680cftCbFB95hmEYhmEYhmGchQ15hmEYhmEYhvEhbMgzDMMwDMMwjA9hQ55hGIZhGIZhfAgb8gzDMAzDMAzjQzhqDcMwDMMwDMP4EO6RZxiGYRiGYRgfwoY8wzAMwzAMw/gQNuQZhmEYhmEYxoewIc8wDMMwDMMwPoQNeYZhGIZhGIbxIWzIMwzDMAzDMIwPYUOeYRiGYRiGYXwIG/I2Q0T1RHQbEb1FRLuJaDMRXUNEpW7L5gTK8wqN3zsa10wmogeIaAsR7SSiF4hoLREFde6zmIgeJaKtRLSdiP5MRCfY92TWIaJDiejbRPQ4EW1TdPL9JNc4ohsiOoGInlLO36pcv9jss8okFb0RUZNO+RNE9GOd+6SkAyIKEtFZSp7sVPLoASKaLOO5rUBE5UR0EhH9nIheUeTbSkR/JKLVRKTaFmR6eUtVb1zeDkBEXyeih4nojRj5/kJElxBRucY1GV3egNT0xuVNHyI6NkYXJ2mcY3v5sV13Qgj+2fQDMALAuwAEgF8AuALA75X//wGg3G0ZHdDBZgAfAViv8jtX5fylAPYA2A7gVgDfVHQlANyjcY81yvH3AdwA4FsA3lD2Xem2DnR085wi48cANil/f1/nfEd0A+BK5fgbyvk3APhA2bfGT3oD0KQcf06jDB4qQwcACMA9Me/2N5U82q7k2VKXdXaKIttbAH4A4GsAblPeTQHgJ1AWCOTyZl5vXN6GyPgpgI2Kvq4A8G0ATysyvwmggcubNb1xedPVY4Pynn6syH2SG+XHCd25rux0/gF4SMm80+P2X63sv8ltGR3QwWYAmw2eWwTgPQC7AYyL2R8G8KSisyPjrmkCsEt5kZpi9pcCeEW5ZpLbetB43pkA2pQXfQb0DVJHdANgsrL/FQClcWl9oKTXZOW5HdZbk3L8jhTST1kHAI5SrnkCQDhm/3glz94DUOiizmYBWAIgELe/BsDriuwruLxZ1huXt5iyorF/gyL7jVzeLOuNy5v6MxKA3wF4FRHDOcGQd6r8OKE71xWerj9EeuMFgNeQ2AgUIvI1tgNAvtuy2qyHzTBuyJ+o6Ox7KsdmKccei9v/FWX/pamk57UfkhukjugGwJ3K/lUq12im52G9NSH1hi5lHQD4g7J/ZirpeeEH4AJFvm9zebOsNy5vyZ+3T5Hvt1zeLOuNy5v6M54JYB+AaYiMTKgZ8o6UHyd0xz7y9jFT2f5GCLEv9oAQ4mNEvs7yAEx0WjAXyFF81S4gojOJaKaGz+MsZfugyrE/APgEwGQiyjF4za/jzvEzTukmXfVZS0QnK2XwZCIapXNuSjogojAiPTWfAHjcyDUe4zNluydm
H5e35KjpLQqXN22WKNsXYvZxeUuOmt6icHlTIKIuRFySrhVC/EHnVNvLj1O6C1m5mNGlQ9m+pHH8ZQDzALQDeNgRidyjBsBdcfteI6JVQojHYvZp6kwIsYeIXgPQDaAFEd/oZNe8TUQ7ANQTUZ4Q4hMrD+EytuuGiPIB1AHYLoR4W0WGl5Vtu4XncIu5ym8/RPQogBOEEK/H7DOjgxEAggD+JYRQM+o8qzciCgE4Xvk3tnHi8qaDjt6icHlTIKJzARQAKAYwDsBURIzRK2JO4/IWh0G9ReHyhv3v5V2IuL1dkOR0J8qPI7rjHnn7KFa2WzWOR/eXOCCLm9wOYDYixnw+gF4ANyMyJPhrIuqLOdeMzoxeU6xx3C84oZt0LLOfALgMwFhEfB9LAUwH8AgibjkPKxV0FDv17EW9XQGgB8ADQoiHYvZzedNHS29c3hI5F8AlANYiYow+CGCeEOK/MedweUvEiN64vA3lYgBjAKwUQuxMcq4T5ccR3bEhz9iKEOJSIcTvhRDvCiE+EUK8KIQ4BZEJv7mI+K8xjC0IId4TQlwshHhWCPGR8vsDIqNhfwbQCkA1LFm6Q0RnADgHkUgKx7ksjm/Q0xuXt0SEEDVCCEKkM2c5Ir3qfyGifncl8zZG9Mbl7QBENAGRXvirhBB/clseJ2FD3j6S9QRH93/kgCxe5CZlOy1mnxmdGb1G64vYLzihm4wps8ow5y3Kv06VQc/ojYjWALgWwN8RmYS1Je4ULm8qGNCbKple3gBA6cz5OSJGZjkiE/2icHnTIInetK7JqPKmuNTciYibzJcNXuZE+XFEd2zI28c/la2W71ObstXyoU93osODscN+mjpTXtRmRCaW/cvgNcOU9P/jc/94wAHdCCF2IBKnuEA5Hk+6ldmEMmhSB68C2AugRckLI9e4BhGtRSQ29YuIGKNqC7NxeYvDoN70yMjyFo8Q4t+IfAh1E1GFspvLWxI09KZHJpW3AkTKQReAXTGLQAlE3JMA4H+Ufdco/ztRfhzRHRvy9vGIsp1Hiav/FQKYgoh/20anBfMI0Wg9sRXz75XtApXzpyES5edJIcRug9ccFHeOn3FKN5miT0C9DAIp6kAIsQuRWNd5AAaNXOMWRPRFRBYxeQ4RY/Q9jVO5vMWQgt70yLjypkOtst2rbLm8GSNeb3pkUnnbjcgiS2q/vyjn/FH5P+p2Y3v5cUx3VmJX8i9pLNOMXhAKka/jhDj5iEx0fVnRwQUx+4sQ6UVIZVGQZvh0Qai455iB5AtC2a4b+GDBlBT11o+4dRyU/bOVZxEAJlvVAYwt+lHksq6+rMj4fwDKkpzL5c2c3ri8ReRoB1Cssj+AAwsbPcHlzbLeuLwl1+l6qMeRd6T8OKE715Wczj9EQg+9q2TiLxBZ3vv3yv//BFDutow2P/96RJZHvh/AjQC+jsiS5jsVHdwPIDvummU4sEz3LQC+gZhluhG3jLxyzenKccPLLHvhpzzrHcrvQUXeV2P2Xalyvu26AXCVcjx2Cer3lX1eWMLcsN4APIrIcOg9yrN8C5Fwr0L5XSRDBxi6DPcmJW88s4Q5gBMU2fYoz7Ne5beSy5s1vXF52y/fWkTq+d8C+C4ibd9tiLynAsDbAEZyebOmNy5vhnS6HiqGvFPlxwndua7kdP8BaEAkBOPbAD4F8G8A1yDmay5df4iEwfqRUhl/hMgCKv9VKqnj1Spm5bopAB4A8KFSqf0VwFkAgjr3WgLgMUQ+HHYAeBqRGLqu60FH5mgFo/Xb7JZuAKxUztuhXPcYgMVu6yxVvQFYDeA+RFYY3o5ID8jrAP4XwKBMHSCyLsdZSp7sVPLoAcT1iHlUZwLAo1zerOmNy9t+2XoAXI+IK9L7iBgsW5XnWw+NkQ0ub6npjcubIZ1G3+EEQ96p8mO37ki5CcMwDMMwDMMwPoInuzIMwzAMwzCMD2FDnmEYhmEYhmF8CBvyDMMwDMMwDOND2JBnGIZhGIZhGB/ChjzD
MAzDMAzD+BA25BmGYRiGYRjGh7AhzzAMwzAMwzA+hA15hmEYhmEYhvEhbMgzDMMwDMMwjA9hQ55hGIZhGIZhfAgb8gzDMAzDMAzjQ9iQZxiGYRiGYRgfwoY8wzAMwzAMw/gQNuQZhmEYhmEYxoewIc8wDMMwDMMwPoQNeYZhGIZhGIbxIWzIMwzDMAzDMIwP+X9C7MaWQ08i8wAAAABJRU5ErkJggg==\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "image/png": {
       "height": 250,
       "width": 377
      },
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "plt.plot(mv_net.losses['test'], label='Test loss')\n",
    "plt.legend()\n",
    "_ = plt.ylim()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Predicting a rating for a given user and movie\n",
    "This part runs a forward pass through the network to compute the predicted rating."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 140,
   "metadata": {},
   "outputs": [],
   "source": [
    "def rating_movie(mv_net, user_id_val, movie_id_val):\n",
    "    categories = np.zeros([1, 18])\n",
    "    categories[0] = movies.values[movieid2idx[movie_id_val]][2]\n",
    "    \n",
    "    titles = np.zeros([1, sentences_size])\n",
    "    titles[0] = movies.values[movieid2idx[movie_id_val]][1]\n",
    "    \n",
    "    inference_val = mv_net.model([np.reshape(users.values[user_id_val-1][0], [1, 1]),\n",
    "              np.reshape(users.values[user_id_val-1][1], [1, 1]),\n",
    "              np.reshape(users.values[user_id_val-1][2], [1, 1]),\n",
    "              np.reshape(users.values[user_id_val-1][3], [1, 1]),\n",
    "              np.reshape(movies.values[movieid2idx[movie_id_val]][0], [1, 1]),\n",
    "              categories,  \n",
    "              titles])\n",
    "\n",
    "    return inference_val.numpy()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 168,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "array([[3.7995248]], dtype=float32)"
      ]
     },
     "execution_count": 168,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "rating_movie(mv_net, 234, 1401)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Building the movie feature matrix\n",
    "Combine the trained movie features into a movie feature matrix and save it to disk."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 246,
   "metadata": {},
   "outputs": [],
   "source": [
    "movie_layer_model = keras.models.Model(inputs=[mv_net.model.input[4], mv_net.model.input[5], mv_net.model.input[6]], \n",
    "                                 outputs=mv_net.model.get_layer(\"movie_combine_layer_flat\").output)\n",
    "movie_matrics = []\n",
    "\n",
    "for item in movies.values:\n",
    "    categories = np.zeros([1, 18])\n",
    "    categories[0] = item.take(2)\n",
    "\n",
    "    titles = np.zeros([1, sentences_size])\n",
    "    titles[0] = item.take(1)\n",
    "\n",
    "    movie_combine_layer_flat_val = movie_layer_model([np.reshape(item.take(0), [1, 1]), categories, titles])  \n",
    "    movie_matrics.append(movie_combine_layer_flat_val)\n",
    "\n",
    "pickle.dump((np.array(movie_matrics).reshape(-1, 200)), open('movie_matrics.p', 'wb'))\n",
    "movie_matrics = pickle.load(open('movie_matrics.p', mode='rb'))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 57,
   "metadata": {},
   "outputs": [],
   "source": [
    "movie_matrics = pickle.load(open('movie_matrics.p', mode='rb'))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Building the user feature matrix\n",
    "Combine the trained user features into a user feature matrix and save it to disk."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 205,
   "metadata": {},
   "outputs": [],
   "source": [
    "user_layer_model = keras.models.Model(inputs=[mv_net.model.input[0], mv_net.model.input[1], mv_net.model.input[2], mv_net.model.input[3]], \n",
    "                                 outputs=mv_net.model.get_layer(\"user_combine_layer_flat\").output)\n",
    "users_matrics = []\n",
    "\n",
    "for item in users.values:\n",
    "\n",
    "    user_combine_layer_flat_val = user_layer_model([np.reshape(item.take(0), [1, 1]), \n",
    "                                                    np.reshape(item.take(1), [1, 1]), \n",
    "                                                    np.reshape(item.take(2), [1, 1]), \n",
    "                                                    np.reshape(item.take(3), [1, 1])])  \n",
    "    users_matrics.append(user_combine_layer_flat_val)\n",
    "\n",
    "pickle.dump((np.array(users_matrics).reshape(-1, 200)), open('users_matrics.p', 'wb'))\n",
    "users_matrics = pickle.load(open('users_matrics.p', mode='rb'))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 58,
   "metadata": {},
   "outputs": [],
   "source": [
    "users_matrics = pickle.load(open('users_matrics.p', mode='rb'))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Making movie recommendations\n",
    "Use the generated user and movie feature matrices to recommend movies."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Recommending movies of the same type\n",
    "The idea is to compute the cosine similarity between the feature vector of the movie currently being watched and the whole movie feature matrix, then take the top_k most similar movies. A bit of random selection is mixed in so the recommendations differ slightly from run to run."
   ]
  },
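  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The cosine-similarity lookup can be sketched on toy data first (the 5x4 matrix below is a hypothetical stand-in for `movie_matrics`):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "movie_feats = np.random.rand(5, 4)     # toy stand-in for movie_matrics\n",
    "norms = np.sqrt((movie_feats ** 2).sum(axis=1, keepdims=True))\n",
    "normalized = movie_feats / norms       # every row now has unit length\n",
    "\n",
    "query = normalized[2:3]                # the movie being watched\n",
    "sim = query @ normalized.T             # cosine similarities, shape (1, 5)\n",
    "assert abs(sim[0, 2] - 1.0) < 1e-6     # a movie is most similar to itself\n",
    "```\n",
    "\n",
    "Because every row is unit length, the plain dot product equals the cosine of the angle between the feature vectors."
   ]
  },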
  {
   "cell_type": "code",
   "execution_count": 210,
   "metadata": {},
   "outputs": [],
   "source": [
    "def recommend_same_type_movie(movie_id_val, top_k = 20):\n",
    "   \n",
    "    # L2-normalize every movie feature vector so the dot product below is cosine similarity\n",
    "    norm_movie_matrics = tf.sqrt(tf.reduce_sum(tf.square(movie_matrics), 1, keepdims=True))\n",
    "    normalized_movie_matrics = movie_matrics / norm_movie_matrics\n",
    "\n",
    "    # cosine similarity between the watched movie and every other movie\n",
    "    probs_embeddings = (normalized_movie_matrics.numpy()[movieid2idx[movie_id_val]]).reshape([1, 200])\n",
    "    probs_similarity = tf.matmul(probs_embeddings, tf.transpose(normalized_movie_matrics))\n",
    "    sim = probs_similarity.numpy()\n",
    "\n",
    "    print(\"The movie you watched: {}\".format(movies_orig[movieid2idx[movie_id_val]]))\n",
    "    print(\"Here are your recommendations:\")\n",
    "    # keep only the top_k similarities and renormalize them into sampling probabilities\n",
    "    p = np.squeeze(sim)\n",
    "    p[np.argsort(p)[:-top_k]] = 0\n",
    "    p = p / np.sum(p)\n",
    "    results = set()\n",
    "    while len(results) != 5:\n",
    "        c = np.random.choice(len(p), 1, p=p)[0]\n",
    "        results.add(c)\n",
    "    for val in (results):\n",
    "        print(val)\n",
    "        print(movies_orig[val])\n",
    "        \n",
    "    return results"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 211,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
       "The movie you watched: [1401 'Ghosts of Mississippi (1996)' 'Drama']\n",
       "Here are your recommendations:\n",
      "3040\n",
      "[3109 'River, The (1984)' 'Drama']\n",
      "1383\n",
       "[1406 'Cérémonie, La (1995)' 'Drama']\n",
      "1611\n",
      "[1657 'Wonderland (1997)' 'Documentary']\n",
      "2482\n",
      "[2551 'Dead Ringers (1988)' 'Drama|Thriller']\n",
      "3572\n",
      "[3641 'Woman of Paris, A (1923)' 'Drama']\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{1383, 1611, 2482, 3040, 3572}"
      ]
     },
     "execution_count": 211,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "recommend_same_type_movie(1401, 20)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Recommending movies you may like\n",
    "The idea is to multiply the user feature vector by the movie feature matrix to score every movie, take the top_k highest-scoring ones, and again mix in some random selection."
   ]
  },
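  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The randomized top_k selection shared by these functions can be isolated into a small sketch (the score values below are made up):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "scores = np.array([0.1, 0.9, 0.4, 0.8, 0.3, 0.7])  # hypothetical predicted scores\n",
    "top_k = 3\n",
    "p = scores.copy()\n",
    "p[np.argsort(p)[:-top_k]] = 0   # zero out everything except the top_k largest scores\n",
    "p = p / p.sum()                 # renormalize the survivors into a distribution\n",
    "picks = {np.random.choice(len(p), p=p) for _ in range(10)}\n",
    "assert picks <= {1, 3, 5}       # only the three largest scores can ever be drawn\n",
    "```\n",
    "\n",
    "Sampling from this distribution instead of always taking the argmax is what makes the recommendations vary from run to run."
   ]
  },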
  {
   "cell_type": "code",
   "execution_count": 212,
   "metadata": {},
   "outputs": [],
   "source": [
    "def recommend_your_favorite_movie(user_id_val, top_k = 10):\n",
    "\n",
    "    # score every movie for this user: user vector times the movie feature matrix\n",
    "    probs_embeddings = (users_matrics[user_id_val-1]).reshape([1, 200])\n",
    "    probs_similarity = tf.matmul(probs_embeddings, tf.transpose(movie_matrics))\n",
    "    sim = probs_similarity.numpy()\n",
    "\n",
    "    print(\"Here are your recommendations:\")\n",
    "    # keep only the top_k scores and renormalize them into sampling probabilities\n",
    "    p = np.squeeze(sim)\n",
    "    p[np.argsort(p)[:-top_k]] = 0\n",
    "    p = p / np.sum(p)\n",
    "    results = set()\n",
    "    while len(results) != 5:\n",
    "        c = np.random.choice(len(p), 1, p=p)[0]\n",
    "        results.add(c)\n",
    "    for val in (results):\n",
    "        print(val)\n",
    "        print(movies_orig[val])\n",
    "\n",
    "    return results"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 213,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
       "Here are your recommendations:\n",
      "2434\n",
      "[2503 'Apple, The (Sib) (1998)' 'Drama']\n",
      "3366\n",
      "[3435 'Double Indemnity (1944)' 'Crime|Film-Noir']\n",
      "1194\n",
      "[1212 'Third Man, The (1949)' 'Mystery|Thriller']\n",
      "589\n",
      "[593 'Silence of the Lambs, The (1991)' 'Drama|Thriller']\n",
      "942\n",
      "[954 'Mr. Smith Goes to Washington (1939)' 'Drama']\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{589, 942, 1194, 2434, 3366}"
      ]
     },
     "execution_count": 213,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "recommend_your_favorite_movie(234, 10)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### What else did people who watched this movie watch (like)?\n",
    "- First select the top_k users who like the given movie and take their user feature vectors.\n",
    "- Then compute those users' predicted ratings for all movies.\n",
    "- Take each user's highest-rated movie as a recommendation.\n",
    "- Random selection is mixed in here as well."
   ]
  },
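  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Under toy shapes, the steps above reduce to two matrix products and an argmax (all sizes and names here are hypothetical):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "user_feats = np.random.rand(6, 4)    # toy stand-in for users_matrics\n",
    "movie_feats = np.random.rand(5, 4)   # toy stand-in for movie_matrics\n",
    "top_k = 3\n",
    "\n",
    "# 1. users whose feature vectors score highest against the target movie\n",
    "target = movie_feats[1:2]\n",
    "fans = np.argsort((target @ user_feats.T)[0])[-top_k:]\n",
    "\n",
    "# 2. those users' scores over every movie\n",
    "ratings = user_feats[fans] @ movie_feats.T   # shape (top_k, 5)\n",
    "\n",
    "# 3. each fan's single highest-scoring movie becomes a candidate\n",
    "candidates = set(np.argmax(ratings, axis=1))\n",
    "```\n",
    "\n",
    "Because several of the top_k users can share the same favorite, the candidate set may hold fewer than top_k movies."
   ]
  },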
  {
   "cell_type": "code",
   "execution_count": 242,
   "metadata": {},
   "outputs": [],
   "source": [
    "import random\n",
    "\n",
    "def recommend_other_favorite_movie(movie_id_val, top_k = 20):\n",
    "\n",
    "    probs_movie_embeddings = (movie_matrics[movieid2idx[movie_id_val]]).reshape([1, 200])\n",
    "    probs_user_favorite_similarity = tf.matmul(probs_movie_embeddings, tf.transpose(users_matrics))\n",
    "    # argsort returns 0-based row indices into users_matrics, so they index the matrices directly\n",
    "    favorite_user_idx = np.argsort(probs_user_favorite_similarity.numpy())[0][-top_k:]\n",
    "\n",
    "    print(\"The movie you watched: {}\".format(movies_orig[movieid2idx[movie_id_val]]))\n",
    "    print(\"People who liked this movie: {}\".format(users_orig[favorite_user_idx]))\n",
    "\n",
    "    # those users' scores over every movie; each user's argmax is a candidate recommendation\n",
    "    probs_users_embeddings = (users_matrics[favorite_user_idx]).reshape([-1, 200])\n",
    "    probs_similarity = tf.matmul(probs_users_embeddings, tf.transpose(movie_matrics))\n",
    "    sim = probs_similarity.numpy()\n",
    "    p = np.argmax(sim, 1)\n",
    "    print(\"People who liked this movie also liked:\")\n",
    "\n",
    "    if len(set(p)) < 5:\n",
    "        results = set(p)\n",
    "    else:\n",
    "        results = set()\n",
    "        while len(results) != 5:\n",
    "            c = p[random.randrange(top_k)]\n",
    "            results.add(c)\n",
    "    for val in (results):\n",
    "        print(val)\n",
    "        print(movies_orig[val])\n",
    "        \n",
    "    return results"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 243,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
       "The movie you watched: [1401 'Ghosts of Mississippi (1996)' 'Drama']\n",
       "People who liked this movie: [[209 'M' 35 1]\n",
      " [1110 'F' 56 6]\n",
      " [2921 'M' 50 1]\n",
      " [287 'M' 50 13]\n",
      " [74 'M' 35 14]\n",
      " [64 'M' 18 1]\n",
      " [5728 'F' 35 20]\n",
      " [4253 'M' 45 11]\n",
      " [5099 'M' 18 0]\n",
      " [978 'M' 18 0]\n",
      " [5996 'F' 25 0]\n",
      " [4504 'F' 25 0]\n",
      " [2338 'M' 45 17]\n",
      " [277 'F' 35 1]\n",
      " [4506 'M' 50 16]\n",
      " [1636 'F' 25 19]\n",
      " [2496 'M' 50 1]\n",
      " [1985 'M' 45 12]\n",
      " [1855 'M' 18 4]\n",
      " [2154 'M' 25 12]]\n",
       "People who liked this movie also liked:\n",
      "2434\n",
      "[2503 'Apple, The (Sib) (1998)' 'Drama']\n",
      "589\n",
      "[593 'Silence of the Lambs, The (1991)' 'Drama|Thriller']\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "{589, 2434}"
      ]
     },
     "execution_count": 243,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "recommend_other_favorite_movie(1401, 20)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In this run, the 20 selected users' favorite movies collapse to just these two, so only two results are printed."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Conclusion\n",
    "\n",
    "The above implements the common recommendation functions. The network is trained as a regression problem, and the trained user and movie feature matrices are then used to make recommendations."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": true
   },
   "source": [
    "## Further reading\n",
    "If you are interested in personalized recommendation, the following resources are worth a look:"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "- [`Understanding Convolutional Neural Networks for NLP`](http://www.wildml.com/2015/11/understanding-convolutional-neural-networks-for-nlp/)\n",
    "\n",
    "- [`Convolutional Neural Networks for Sentence Classification`](https://github.com/yoonkim/CNN_sentence)\n",
    "\n",
    "- [`Implementing a CNN for text classification in TensorFlow (Chinese)`](http://www.jianshu.com/p/ed3eac3dcb39?from=singlemessage)\n",
    "\n",
    "- [`Convolutional Neural Network for Text Classification in Tensorflow`](https://github.com/dennybritz/cnn-text-classification-tf)\n",
    "\n",
    "- [`SVD Implement Recommendation systems`](https://github.com/songgc/TF-recomm)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "That's all for today. Comments and corrections are welcome!"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.8"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 1
}
