{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "6f7e7923-bf6d-4e89-a4d2-601f11648598",
    "_uuid": "a5f2d3f6246fe1c767e3b9469c17d1bf09c04708"
   },
   "source": [
    "<font color='green'>\n",
    "\n",
    "**Welcome to Deep Learning Tutorial for Beginners**\n",
    "\n",
    "**[Original English version on Kaggle](https://www.kaggle.com/kanncaa1/deep-learning-tutorial-for-beginners)**\n",
    "**Author: [DATAI](https://www.kaggle.com/kanncaa1)**\n",
    "* I will explain deep learning step by step.\n",
    "* Instead of writing long, hard-to-read chapters, I define and repeatedly reinforce the key terms one by one.\n",
    "* By the end of this tutorial you will have enough information about deep learning to explore it further on your own. Let's get to the content.\n",
    "\n",
    "**Translator: [陈沁悦](https://www.kaggle.com/qinyuechen)**\n",
    "**This notebook is translated from the original and represents the author's views.**\n",
    "**The translator's notes are the translator's personal opinions from a hobby project and do not represent the views of the translator's employer.**\n",
    "**This is a purely personal, unpaid translation, shared freely; the translator assumes no liability arising from it.**\n",
    "**Why translate it: after fifteen years as a working engineer, I still prefer getting my hands on code to reading books or algorithm texts. Working through this notebook gives you a quick pass over deep learning, after which reading a book takes half the effort. For certain terms and libraries I have also added my own search results and understanding, which I hope saves you time as well.**\n",
    "\n",
    "**You can play with this notebook on Kaggle, or download the Chinese version from [my GitHub repository](https://github.com/hugulas/cqyblog/blob/master/ai/deeplearning/deep-learning-tutorial-for-beginners.ipynb).**\n",
    "\n",
    "\n",
    "<font color='red'>\n",
    "<br>Contents:\n",
    "* [Introduction](#1)\n",
    "* [Overview of the Dataset](#2)\n",
    "* [Logistic Regression](#3)\n",
    "    * [Computation Graph](#4)\n",
    "    * [Initializing Parameters](#5)\n",
    "    * [Forward Propagation](#6)\n",
    "        * Sigmoid activation function\n",
    "        * Loss (error) function\n",
    "        * Cost function\n",
    "    * [Optimization with Gradient Descent](#7)\n",
    "        * Backward propagation\n",
    "        * Updating parameters\n",
    "    * [Logistic Regression with Sklearn](#8)\n",
    "    * [Summary and Questions](#9)\n",
    "    \n",
    "* [Artificial Neural Networks](#10)\n",
    "    * [2-Layer Neural Network](#11)\n",
    "        * [Size of layers and initializing weight and bias parameters](#12)\n",
    "        * [Forward propagation](#13)\n",
    "        * [Loss and cost functions](#14)\n",
    "        * [Backward propagation](#15)\n",
    "        * [Updating parameters](#16)\n",
    "        * [Prediction with learned weight and bias parameters](#17)\n",
    "        * [Creating the model](#18)\n",
    "    * [L-Layer Neural Network](#19)\n",
    "        * [Implementation with the Keras library](#22)\n",
    "* Time series prediction: https://www.kaggle.com/kanncaa1/time-series-prediction-with-eda-of-world-war-2\n",
    "* [Artificial Neural Network with PyTorch](#23)\n",
    "* [Convolutional Neural Network with PyTorch](#24)\n",
    "* [Recurrent Neural Network with PyTorch](#25)\n",
    "* [Conclusion](#20)\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "036bf2c0-c146-4c70-b29d-206db0fe91b0",
    "_uuid": "01d54760756dd2bc5c2a309f2862833c079a3303"
   },
   "source": [
    "<a id=\"1\"></a> <br>\n",
    "# INTRODUCTION\n",
    "* **Deep learning:** a machine learning technique that learns features directly from data.\n",
    "* **Why deep learning:** as the amount of data keeps growing, traditional machine learning techniques fall short in performance, while deep learning delivers better performance, such as higher accuracy.\n",
    "<a href=\"http://ibb.co/m2bxcc\"><img src=\"http://preview.ibb.co/d3CEOH/1.png\" alt=\"1\" border=\"0\"></a>\n",
    "* **What counts as a large amount of data:** this is not easy to answer, but intuitively one million samples is enough to be called \"a large amount of data\".\n",
    "* **Fields where deep learning is used:** speech recognition, image classification, natural language processing, and recommendation systems.\n",
    "* **Difference between machine learning and deep learning:**\n",
    "    * Machine learning encompasses deep learning.\n",
    "    * In machine learning, features are engineered by hand.\n",
    "    * In deep learning, on the other hand, features are learned directly from the data.\n",
    "<a href=\"http://ibb.co/f8Epqx\"><img src=\"http://preview.ibb.co/hgpNAx/2.png\" alt=\"2\" border=\"0\"></a>\n",
    "\n",
    "<br>Let's take a look at the data."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 102,
   "metadata": {
    "_cell_guid": "6fe1e8d5-b36d-4e39-9a9b-34618b5e275e",
    "_uuid": "79f18357b846d2cd91e0f7b2389e1dba8097cbdb"
   },
   "outputs": [
    {
     "ename": "CalledProcessError",
     "evalue": "Command '['ls', '../input']' returned non-zero exit status 2.",
     "output_type": "error",
     "traceback": [
      "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
      "\u001b[0;31mCalledProcessError\u001b[0m                        Traceback (most recent call last)",
      "\u001b[0;32m<ipython-input-102-67e0ad0764b3>\u001b[0m in \u001b[0;36m<module>\u001b[0;34m()\u001b[0m\n\u001b[1;32m     15\u001b[0m \u001b[0;32mfrom\u001b[0m \u001b[0msubprocess\u001b[0m \u001b[0;32mimport\u001b[0m \u001b[0mcheck_output\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m     16\u001b[0m \u001b[0;31m# 如果是 kaggle 环境\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m---> 17\u001b[0;31m \u001b[0mprint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mcheck_output\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m[\u001b[0m\u001b[0;34m\"ls\"\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0;34m\"../input\"\u001b[0m\u001b[0;34m]\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0mdecode\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\"utf8\"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m\u001b[1;32m     18\u001b[0m \u001b[0;31m# 如果是 github 下载的，你需要自己下载[Sign-language-digits数据集](https://www.kaggle.com/ardamavi/sign-language-digits-dataset)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m     19\u001b[0m \u001b[0;31m#print(check_output([\"ls\", \"input\"]).decode(\"utf8\"))\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n",
      "\u001b[0;32m/usr/lib/python3.6/subprocess.py\u001b[0m in \u001b[0;36mcheck_output\u001b[0;34m(timeout, *popenargs, **kwargs)\u001b[0m\n\u001b[1;32m    354\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m    355\u001b[0m     return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,\n\u001b[0;32m--> 356\u001b[0;31m                **kwargs).stdout\n\u001b[0m\u001b[1;32m    357\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m    358\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
      "\u001b[0;32m/usr/lib/python3.6/subprocess.py\u001b[0m in \u001b[0;36mrun\u001b[0;34m(input, timeout, check, *popenargs, **kwargs)\u001b[0m\n\u001b[1;32m    436\u001b[0m         \u001b[0;32mif\u001b[0m \u001b[0mcheck\u001b[0m \u001b[0;32mand\u001b[0m \u001b[0mretcode\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m    437\u001b[0m             raise CalledProcessError(retcode, process.args,\n\u001b[0;32m--> 438\u001b[0;31m                                      output=stdout, stderr=stderr)\n\u001b[0m\u001b[1;32m    439\u001b[0m     \u001b[0;32mreturn\u001b[0m \u001b[0mCompletedProcess\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0mprocess\u001b[0m\u001b[0;34m.\u001b[0m\u001b[0margs\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mretcode\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstdout\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mstderr\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m    440\u001b[0m \u001b[0;34m\u001b[0m\u001b[0m\n",
      "\u001b[0;31mCalledProcessError\u001b[0m: Command '['ls', '../input']' returned non-zero exit status 2."
     ]
    }
   ],
   "source": [
    "# The Python 3 environment used by this notebook comes with many helpful data-analysis libraries installed.\n",
    "# The environment is defined by the \"kaggle/python\" Docker image: https://github.com/kaggle/docker-python\n",
    "# For example, here we load several helpful libraries\n",
    "\n",
    "import numpy as np # linear algebra\n",
    "import pandas as pd # data processing, CSV file I/O (e.g. pd.read_csv)\n",
    "import matplotlib.pyplot as plt\n",
    "\n",
    "# Input data files are available in the \"../input/\" directory\n",
    "# For example, running the code below (click the Run button or press Shift+Enter) lists the files in the input directory\n",
    "# Load the warnings library\n",
    "import warnings\n",
    "# Filter out warnings\n",
    "warnings.filterwarnings('ignore')\n",
    "from subprocess import check_output\n",
    "# If you are in the Kaggle environment\n",
    "print(check_output([\"ls\", \"../input\"]).decode(\"utf8\"))\n",
    "# If you downloaded this from GitHub, you need to download the [Sign-language-digits dataset](https://www.kaggle.com/ardamavi/sign-language-digits-dataset) yourself\n",
    "#print(check_output([\"ls\", \"input\"]).decode(\"utf8\"))\n",
    "# Any files you write to the input directory are printed below."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "62bb7abd-122c-4f87-97ad-eb203e9dea1f",
    "_uuid": "3e3e94ca5c9349ac36416482d30af378966c7a8a"
   },
   "source": [
    "<a id=\"2\"></a> <br>\n",
    "# Overview of the Dataset\n",
    "* We will use the \"sign language digits data set\" in this tutorial.\n",
    "* The data set contains 2062 images of digits shown as hand signs.\n",
    "* As you know, the digits run from 0 to 9, so there are 10 distinct signs.\n",
    "* At the beginning of this tutorial we simplify the problem to only two signs, 0 and 1.\n",
    "* In the data set, the images for sign 0 lie between indexes 204 and 408; there are 205 of them.\n",
    "* Likewise, the images for sign 1 lie between indexes 822 and 1027; there are 206 of them. Therefore, we will use 205 samples of each sign.\n",
    "* Note: 205 samples is actually very, very few for deep learning, but since this is a tutorial it does not matter much.\n",
    "* Let's prepare our X and Y arrays: X is the array of sign-one and sign-zero images, and Y is the array of labels (0 and 1)."
   ]
  },
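  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# A minimal sketch of the X/Y preparation described above, assuming the\n",
    "# dataset's X.npy file (file name taken from the Kaggle dataset page) sits\n",
    "# under ../input/Sign-language-digits-dataset/ as in the Kaggle environment.\n",
    "import numpy as np\n",
    "\n",
    "x_l = np.load('../input/Sign-language-digits-dataset/X.npy')  # (2062, 64, 64) images\n",
    "# Slice out the two classes: indexes 204-408 are sign 0, 822-1026 are sign 1 (205 each).\n",
    "X = np.concatenate((x_l[204:409], x_l[822:1027]), axis=0)\n",
    "# Build the matching labels: 205 zeros followed by 205 ones, as a column vector.\n",
    "z = np.zeros(205)\n",
    "o = np.ones(205)\n",
    "Y = np.concatenate((z, o), axis=0).reshape(X.shape[0], 1)\n",
    "print('X shape:', X.shape)  # (410, 64, 64)\n",
    "print('Y shape:', Y.shape)  # (410, 1)"
   ]
  },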
  {
   "cell_type": "code",
   "execution_count": 73,
   "metadata": {
    "_cell_guid": "4768ce70-3e7f-4ac9-8764-642e88006b77",
    "_uuid": "d6b38399b27c2d723750c0b4f8787a7d6d0025ea"
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(-0.5, 63.5, 63.5, -0.5)"
      ]
     },
     "execution_count": 73,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAACFCAYAAABL2gNbAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4xLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvAOZPmwAAIABJREFUeJzsvWmQJdl1HvadczPzLVXVXb339OwYDMjBvhACN1GGLO4iKe6kSMqWqLAUVtiy5SBFRsiOUIgRDpMhmZZCsuygzSVIhk0rTIkEKQIEKYmAAIIESAxAbLNgmcF0T8/0Vl1Vr97LvIt/3HvPPfnq1Uz3TFcP3cwTMVPZ+fLly7zrOd/5zjkUQsAggwwyyCD//xd+pR9gkEEGGWSQWyPDgj7IIIMMcofIsKAPMsggg9whMizogwwyyCB3iAwL+iCDDDLIHSLDgj7IIIMMcofIsKAPMsggg9whMizogwwyyCB3iAwL+iCDDDLIHSLV7fyxh37ynwQACAYAxwjVYICQjsFAqFTkKqXjKoDMUkQrB3A6RxRA7OWjqorHzAEmna+MA1P8vKls/B6A2rh4jh0o/R5TAKfjhq0cMwX4EG+S/9pg9p3LxyH9u/UGzse9s0t/W2vK57aCc/G8swyfrvEdI7h0z45BNh6XvwClz8nH/+IXAXblGqSm09d88n/8b8vDvkx569+K/eoNEKp4W9cAIakLvo7/AbF/fR51HL8DACH1ZahCGQMEGScgALm/Kw9K59mE0m8mvpwxZSwQYWW/xs/KcQj7m0P3a/7cOYb36bw6Dvlvx0A6hiMgPQpZ6vdVPnapj9DvM7aQ4/yYZAF2+ThIX7IFuIvHf/jzf++W9ev9/9tPpYYLQFXanqV/grQ1c0Bdx4eujUNTxQet0/wz7DGp4kM27MDp4Sv2aDhf6zAx+RoLg9KPAHp9B5T+YQqoKd6jC0aOF76CW9JZrTfwiN/bczVcusfc1bBp3s1djYWLg3Ru49/WVljYOFjbtpJ+d9bAd+k3FqbMzY7KceozdmXucgeZl+zKNeSW+t7Hiz76L26sXwcNfZBBBhnkDpHbqqGHrI1VQbQ3cJBtJZhQ1BFG0ciM36ehMxfNLGoKRVsXrYGKhm44iDaetYb87/i5xzhpB1qTq6hoCR4k57N2UC1pEV608kqOiQIcp5dMOz4qiNYeQrEOiAJsl9sJUcVE1AD3Z90hyDYfSNqRAuBTe7AnadIVSugtEdGyKxLtO1SQY18Bvg5yPo8DXy31ff6b+5ogmiEogCsv53Mfm8qDxQor/V6Zfr9k0dr68jmgaOra2rKu6D2OWfrNcUBOheSS9uYpFG3dctTSkdo+v6uXbgURieXo87OBZFQxIL/BKMo/gxDkFQ8pH1O2jqoASm1PSxZRlTRxY3yxdiuHUToemWQNUxBNvDFW5lWltPKaPEZc5mDWtHVfGTUfWb231tCzjNiKNu7SguOZsEgDkxHkc0MBXRrIy5Zc/twoFCBb1MFzGbu1R9A6MvUPvHrmEEi08uABlMfu9T3szU3a27qgC8zC5RhcJjOoTHBUZRFntdCzGkxyW7W4s1q4DQdUMrBcOU9q4qdOGpuut3hnyQMyPuoLTxwP6g0KWdypgk+zL9+DXQWrB036mRAIVV02Gk9pIIYyWORbAWWVDigjIRRoJVRBFpUVr3dLJJg0YCvANyjHdT4OCPnYFFgtGAWv6A0+jwcuiziZMsmIgywkTVU2Q72g53NmaWLme+jNXJ/vXFqY1YLesVrQPcsCn98bANr029YaeJcXjwBv03cNIVhZuctkZ8DnjThv3hxQfpKk3zzKRgA1VXylNvZbKaJQBYFZiAKMWsQztMJcjkfGKUUq/u3BLORlXjVsUavFfZTwBr2g5sW6JqfmrseYytzMohd0AHB5MU0LuguMOm0sk9D1Fvfcz0x+34JeqXPOs8zdDI/G3yq90FvY5VZl
o0YIYKi5m4fj0sJ+sxDKALkMMsggg9whcls1dHGGmaC0cvScn1lzJxNgsnZGRUPP0AqzV2Zr0bQr4zFKTs+KiyZg2Jdr0o5vg5FjQGlv7ESTPkhDz6YhULSCeD8v98iOlYq9OFwaLiaoC2U/1U65Lm3TpvKAK79ZFHClqeevEcouDxJIKyojoVx0CD0u0EqjjmvAjTTMUpzgvinQiTbrgeh0y+9EHMDi/PRomqK9ieanNPQMpenPtaal+0+bz1qy9gZArK1a940zosmFQD1YDYjjTxzgnRHoyzuCF9iNi2FlQnGeZTiFSKCVQAGcHWm5zRBhGG2a++rW42mkHaHZujXFOqqNg+H83n2YxSjLN75TgVxGxso82AezpGsYAWPOUEzR0GsqY6BRGnob4sDboD059oH2OUV9YNTp885XPQinSvPYUMAoPUeez3OuemNmt42maFU5gcQCAJ+Pg9LSe8o+yec+W+tQFptNUCtSv+Lm5PZCLhonFZMz9NkLsmAX024ZUgHiIqlx5zywxpWVBd2QRyMsFisLunjHQ8A4Ldgjtqg4QzKlBybcyoJdk5N7ZPGBUKMMuIrK4p4HbesrdHI+D7AWbVr9rDLbiIy8V4sKGnjNHS0p7APLuf7qHoRFodPd68XhVkqBVsqxGwV5Nq+YK6EKChcHkBlJGqNN99UYrYbSKlPMX6IgC4mGUWRh0As6hd4GrsWmPm6UyT938WW8WriZgizwzrP0XZ7sGpIhCrAJW7cwgnkFEyK+DgCW9vk2PBX4halg6wBBowwCuaDsi7dUsnKlNtaqcj1IMzPGGuNE+amUP6pR/SBYuVKYJqbF1LTpffqLeFnIy0vnzxm+N0/ztQZejoGibOVFHuRQh3i/jiqYjBGi4Ox78GCkgZwXV/Vbejz4pXa36ppca2JVyYkQdL+H0t+ArOLB4KaRtAFyGWSQQQa5Q+T2augaHsgqRa208sr3nJ6ZyaC5xBpaEVPbONmtR5UVTYEp9DSF7HDRkq+dcLuSb16Tw4TiPcYKZtEyT2ppFww4bcd1cKIdGAqi+dnsNEXd2/U7zs7UGs4UM36RNIWepp1ZFCEUZ2qNJa+6gln02WU+/y0QYS9xYq6kc15p5eL81CwW5fRkU/pda4O5v4mCON20Vm7YF5jL7IfPKmVVNaw/72t4mY+cz3V+dXyBHhuOGLXpX9M5gzZZY4YDWrEsvTAjrDUI2akGU4hKSSlUXRaZVXIcwFQYE/JMWK0FvlzJfWOMl9iOir1Yw01lxQIeGdsjGGTLN1tKDVvR0A3KPB+xxYiKRT3lpK1rNkuOCSHb074zT107S4GiaRvycpznogPLMatGY/LiONW/mVkwNjDG0te8Mm4hqHiFUFGJIRGHPxXo0QOU4RcD5EEQQp+VRvt/5gXlNtMWy0QWzNSoSc1eMHJjfA8H1fAKEDE7oxb5wmrwsnhWerKzE0hlJPhdmew1uZ65lweUNv0WGU9Af8DlQTj3tQyAua8xQbxfF4yY5guq07M5gVwq9phRMf0y9u4rVwZIqKRzJcgFCqezZWGFR2G/eCxtpLjlUjD0ArMEBkJdFnRZxBUFTm/g2V+iF/HKeGEp1ayuVb6R2MdOjoGIe2r4rOotDl6dL4tDhlzywt6wlcXIBZKNWENu1rN8L/evprxpKI0poFOT0wmdwcVFHVBMJ2i6RFk8DODTIsSeCnZr9pv+t0IyHZQ5qON+n+RNUsMsY2N78AoQ2zPPEw25LM+7jKdrfPwgOEVDMXpBr3NcV+DePAWAGg7jpKBt+zG8WjGz7yo/J9DfzLPY0J9EOVjQe5Y+8Y4R6gSxic+rbM+R9KaUrtyXHBTbSc3pG5QBchlkkEEGuUPkNkMuxewmo7S0zFk1RROgpaAgUtoZEDUsHbYvIf6KzcIUVvJdtYaeNYi+E6Zo6GPqROPKGsiyiAmnzLaanJh7i1DJ7i/ia2AFBNQLaPEs5i1zECdLTnPAlfKqO9PXxOU4IEeuBJQu
uJVSnJ8Kfqk1x1xbZL5o47UrAUKKtbKqL2vjeiwVrQ1qdhIQNfRy7Ipmr4AKrWkbeOEjZ1aED9SDX0astXmW8z452ObJqmIK8pxG8deJ+v2fFcMOBuSTJpc48DoFBnkS6IoVFhOCCmBiHIpqlqEvY3Q/+B6bpVakgwyzTEwncyxr3CO2cm7MXbGUEOQaA48xRe1Yw2FZozbkwSt4H2PqREN3geV4N4ww86P+tdzJvKypOHi7YAQ6fTFxeo4Gxrgq81gHqWW4tDBfWOawJjEECuAM76HEi1C4+WDA27ugaxpUrUy4FblXdLSfxsh1YEitzD0Ns6yZYjJl029iCkYuJpzC7GpymPJCfiNP/n5kWjnWeHoeCHXoe9fnIZ+36KhM+Pyce4lFYdQi4ANhhgi/ZMgmns+slyK2I2lT5z0yV1Hj6YEIbFO768CGWyg6ClTj5hpm4RQsxSZI4FRTF3pbpRZ2zVopi6NXWHh/8eCljZopyLHGa8fc9TYFrxbIfH3enJcpbwtf2j6PjR03kg1c+tV7zBOsZjz3/DKameOE+ljYphlyDZYlQtDDC2NJs18QcGCk4a2SvNk2lcUo5WkZGYdxysmyDLOsVXH+TEwncyy365Rb1HlBJ6sWdN9jrjSrgojUXMyLuxZN/13jBZ7uTgAAfvz934mNT8a55BKiuXjDDD/6lvcAAE5X1wvejqJY5t/Pv5n/sqYQ58hT062EZUKg4jPJ34GOA+SidDnNXCPJbUU3C6BjgFwGGWSQQe4Yua0aOulQYhUopB2eotGwV6H6RXPXzjANhWTH18R0opWPuDvQ6QlEiCRr6Nps05oCk4d5ETJovm+jNPR5qMWL3gWDRdY002/MFP91z0E0rIWv4NPz+0CwJocsU8nCmKNOVMAVsQrW8sWbTl6FqPub95rfiIgjVIXy61gDHV9gKocmaXvL8ArQZynp85qtsla1YpFp56bWCvuwmnKevUhf5lDxzle9+IPn/AYA4IuzTdwzvQYA2Kz3xMoSJ7qvSjCKK9PLG8VdNiRQmg+0r096zu5lszs/vit97AMdCg9dIBcOvayJ2lLKTtGolRetW7c/ANRsJVSfqTg3G7JKAy8voeGVPGdqOOm/miy6NPAMuZ7m/jvXXgsAOPO7FTIfqJrH7639jsM//bF3AgD+yRt/BTMUSCZbZFNeYK7mLgCMyIpV1QUja4wPBJvaYEFVf/1KlqgQGzyJKh5UKg8KJRCQWAztXv6fG5VXJJcLc1mImH0vWCib3qPKrsRP9WDSdLRs7o2W6FEZlplyK4NIQysZs1vjViYzAGX67ce5tbjAMJIVy6FNA2Cs4BKDwtjJmN6ILBzlPB+A9QnLM8V21gEMzjOs6QdGhUCwmRlhCCGDdRx0FicE5EUf5fwtFK9oizIATZCgIZ17pVLMlsq4fYnSJlUnizUAgVZino+8eLQ9XDyPgwyFjNiWRYCtLAKa8aAXgC6UaTBN/T2nWszq3730JXj+/3gAAHDsU9t4z9e/BgDwHd/9PtmI5Xlu0EnRJry8NoXJJKl40cddhR6jJnikiMZjU9C2Wyol2ZbtKVS5f8ambL41ux5eLnPNxLk2JtuL8szzDihzDVDKFrz04SqYJV6bYbU+/XCrGwMA3Ijg0nptJ7ENR1cb2D+JylT3RlNgVCpzdo4aY8Tn87x/vsRFPLGbDItPZbREdRUFLI197xg+zYkQuCzu8r8In4Xs8zI37/MaIJdBBhlkkDtEbi/kImB/CSUmUg4llYelNsXENlS0MB0oJNCKsZgk6KRivzJ8eMQdNngOIDpOspRrbc+50jPzlEc+i2jzVJwyPSYLOXDKsKi959k66ILpOVzz1qqDn2xgzJ1yCiaNyeXk+szC2/eutG8woez4ph/AcEDk+8uSsCqAqNLxBaHHZtEOtpLSuISIl+IiOnbAYWKKA1tr5eIAFedoccoBRcPTzu6+5bXoaen52hx38NSvP4h7futxAAA1Ne5+fzy/9x01
jtWzdL907gC3sw0mOq4BoLLClAiBStqAZNL3+lKlxtD89NiXJNe8FAfai0kO5DIqD4vuH1Ycf0Y5X3OZd9o60gFCWQwFrOKWG/KSUkNL/vxZu4nz3TEAwKnqOh6oL8k1R+s4z4MpRLI81bojBqMrCVYLVRkHAQIRmuCTOdsnP+Q5X4dKxmIXWIICG3bSh46LRS1Ba5UDUiqIYAJCHiuKoRaWGGrhJvt10NAHGWSQQe4Qub0autp5WGnr/TJxRTvT4f7LyX6083NiWuV8cb3w4YKbt6KZa5qU5rhmp4ihfkSaUBgRyi6t3kuwcB2VqcQTw6d759/Qz1aTk1juEXfoksZfke/5CWT3zzu/96X4AXsEk6PQCvccQUXoHhLWKtqFCSWhk/FF46lc8Y3UVni7y9TT+J7F0TZSEYcTblUhBE1pC+L0zO2pYwdidGFxoC6XNcuScVxNVfzD6w8CAM5+cBc0jbgsfAC1Xq6VsYZCjRy5hPvTCED8Xi+60FaY1vFdZiueJWbqi5ocBaXJeS48f1McaS8lovBGROc0N6p/dFh/zcUCzkm2RmSlT0bKWta4ufyG6o9lWmKzZE7WZPELl74KAPD+X30LNp6O3203CK/9oU8BAH703G9hM1lNriZQjqxN97AjxsbT8b7P2SN4oH4+fqCyZWiuuggXX4vnVnHZi5+uMVb62QYWzVz8IXVZHKKfJM3XShXG8JC8+b5MpxuWVySwSGdP1Lkhai6Qy0He9LUqDRq2vcEiA4vs0iAqkMpy9rZlZ0uf2VIW8Z5JmP72E+n78v3UAR2Kw4XJi8Mln+vYiDmqnbFdMBilYBWvHC4+kPCmpZ6pK9Vz2HhxwhCpRTygOELpcCZ+SekA6GpDJSeLX7k4MEquHQ2pZcfixHQCpY3Yqo24ZOXTLKRegAr3ndIAev0IoMdiyVIyZ1q892ORLfHapy8grE/ia13fhV2L0+ZkvVMC0NLvOZBMdgfel1JAJFWuqtmLU9SkTbiqfJ8ZkduZAYX6lQXdoDjEb6HUSpnoQy6ZeVTiP0ZsSzi/ckQbmRt9mEX3iZ6PmoyQ+6Vw/T3e8+jrAQCnLgS067FtNp52+ND7H4k3/57fwn2jK/F7NUSL7Nbix9wBxz+xCwB4pj2Gh5qL8QMNucCDl8ALF0jeWytjWgHr2MDmnE3M+xQw773U3PWehH0WfChEdJXvRbPVblQGyGWQQQYZ5A6R2wu55L9LykrOqazDpnXu6oq94pbnCME+zJIztvU5x8WcWzbfgH6YcNTE99OgtOjzRmkcc8m1rBxtQdc5rMQrk9nnY+pEO41Jgko4cqZK+cDCQ29dSfAlDkHWHP6STMkzSVv3nCra4XILRWdSJEVVlFJlygozyqk2qmyPlghE52fWhJZhFg1raA1p2dFpKPSsr57zWcmq89mC2vYTbD4a7bH2gVOiTTXOY3E8nl83c/mdebbd1JiqyYkGO1Hl6nwg+Jy5D31+OgCEoIom+BKxGkwoubIr5fgOhLB/eL9s0ek2inbapwgalPPaEbpsDQGF0qmTbNVke7Bn7kP9/SnnJHcs4bS+BnyTtW9GzqdVoxTMoADYaFiVnP01RBu+3K4rK7lY7C6UKE6vOPKr3o8pqCh2RclVCeRKJk4GZYuaQzGcCRIdGkxxhNJLmK+3dUE3qv6ghPVXJay/VrUIGy6pOSemUylxywDSofx6YdasBj1A5Bg5qKEsHg10LheF6x2wuOfFpQNjnIJ756GCUQuJ5j3nNABdavINsyec9DF1sqCPuYQSL8irYrr9vCZA8jkkr3nE0NPvBlK1LMvEB+0vpnBLJN+Tg7AzjEp/rBk6k7rr5dfRCzkQN2dtxmuYZcNE9oI2x4ECd6xKp8oHbOqa56w39gyX/MHeQ9h8PK4SdmrgJmkz3Zng2kPx+FS1vS//R4eqV82qx0uWcHErmTZrdn2HDFJ8gYLSSkpWH/nLSGZ6Xtw9xJS/laJhFu3Lkb5SC5gh39vIljeA
5RwsmmOuj/vspD5TpiYLpJQh3hjRnygA3Jb3P8J78WCJFQQAriGEKrbhlXZa3pWcQJ+GvIJOU/ASvMBGI+7k2hHbEnzERoKMgNL3Ok+RcNMNwfsceATp1+VYg5udrwPkMsgggwxyh8ht1dBZRYrKrqWYLQxVG1TBLA0XnrnWyg/im+vEOppnPl4qVKG18niNhl8KhzRbEHPlUZxmRgU8Osr8VC8OkpqcQDGN0hjniZvehQpIrBsXWLS6Lpji4A1WnGq66rhOMKVNYKlJyb5Em1Hx9gQOmV57ayUzW3oh/sUKayoVEao08LGxPc0ciDDLSGno2sG9yum5bL5nEQf3EotC59Yu55UVltrwQ5cfgFkk+MgFcJfafmuGvUcicKbjGfLtmILECURtPT7TwleivcViCUtQDUqRE6usMW8IXpW0E41Na3J085rcjYgQF5byyRdmUpk/kYeuLdu+M7RRc1RDKwAUccH1tHJJzKb7J2vcTAKjdBNCvROPPQjHTfxHu1muzwnk7JSwOB4t4+vduB+PkOZrq+Y5K6svnzfwympQlr2Km7CBxcLJUJrzjFbyyrOqiVzmbkCBX4hx07k6XpkFnUoN0ErlbKlNmeyVghtGKsxfqGlq4a7JYT2Z441auFnRoNZ4UfJHINPbbG9BlCAIhB4r4R9f/EsAgN/5vTdJQV93Nm4w585exbn1LQDAydGubDx3j67h/iYGO5yqrqNeQZcr1Ce7ks6ofQoT06FNMfa5Io4u/BGZQ/k3GMmai/jcikXglooqVpJhlqpyEphSKfaSNt9rFSyUN2+dokHn/5jyoreIr87QV+C1bAaPqeuxiQq1cTXovJ1y7Hz+I/fgvtSXbmykDef3H8OXP/RZ+e3eoo6UcydXFWJGjX4Wx1Vi0y6bGUs+kBxbz+BMU3VcYsHVRg2V/+NWSlGuCstlpDZkzS4bcyd9Vat5pedrFobvwZ65f8bkevNkLFBM/HcHADb7EwCX8fEZML2YfR8NTpjIYmmPBOTEqzlNgl0D2iOxL64vxn0mjVrcM6SiU3g0ad557oTCGBWN0scTVdUoQy5VzujIXuaE99xLCSD8RFMox0OBi0EGGWSQP8Nye3noSWKuc+2gKpq7MCBMKQFWk9uXn1zDDprpwMoEX64z2AvkSXKQ0zNrBz9x4RvwwV9/IwBgYwtieuNzUZPbGZ3Fnxw5G39jHGAWyRGyAFIiPszvcjjxwFUAwHfd/8cAgFePLspvNeTgUfJGZ5n7WmCGha96RRnk+SXywKFLCZ/21ZfM1xxSYFEuVqKrw2vroWIVfMFW4IaJ6bBuSlI1IJnj6XiD5z0Nb1Whg0blsddO0SNUNOc26S1jDbPoYhlqPLx79joAwMmPhsJ0qEk0pb2TDV6/cV5+u83BKIrtUkLjPZziukuIOErofz4HFOcZU78uQLZ6LAeBtIKKNSBLNx+BcgNSKcem5mBrKXO3z4SRQhWqzTUpYU36LAiLxaAwRsYkoTaok8XjQpB0B6CSFM4boJ7F7+2GBmc5Qi5+7IVVkiEX52LAEQDszUel78lhljEcsiWFqMogmcdiG4wqZehXsqVqdmJRS1st5cQXyIU9SEOkyqL+U13gosAD/QK9mtmSB1E8Lia7XtyBiJWN1QKdG1Xj1VNa9NgvZcKvoquV5/Eg/IvLfx4A8P73vgFNWmNnZwK648kMPRE96Zsbe1hPm9PCVtjaiZGB8+cnmJyPHXr8owz6cEy6/4unvjZeezygmsXe+u7v+g94KC3wGpMzS9ilnmBADPwwKeJwbqtSa5QZbLI5Rz2Wy2HQFgU3N16yKjaVE5aSzsszNhbrKThM4+U6O5/GVPPE1ywlvYgDBUJrU3Tl//zFr8NHf//h+L1tghvH53vz1zyGn7j31+L5En/ZM1P/1z/6CwCAh55eoD0aJ7hrCPVuwrSPA2fqrfRMdh97Q99QFz4wvWC1Mv5rxYqQFMLGyndbLhn8mIIUl44URirHhwCiV7IJ
O0Uf1UE1hYU0VsyjMbUCg2loLMMsa7zAs3YTAPBLz74D53eOAgC+/d5H8f1Ho8JTAxgvOXwMAsIo9fURg25dBjYmKT/Lrh9hWsX+CVWAH8Vr/DgpHYHFv3TtyppAPIaC9Ns8lKC/zE4bUycR4Q0ZGZc+sECnU26l9izYKp/IimCjyhUmE3NJgqoojPQSJusAuQwyyCCD3CFyWzV0HRCTRWtvTKHkMqfVwQnF811MvGVHKK8wrV+sUEWsxx0/f9/s1fjV3/lyAIA76WAfiPe+6+QWHjkWNenjTXS8+EB4ei9mfXvs8imMRsmxcnKO2TjCMosTBvV2ypkckRec/X2HtSevAwCm39NiLUEt3rM41UZkMedcziz0NDUAGFcl50TnWZxrvWyWHBDcIajlSkixl0ofq8yZ7ARm0fm0dd7yrMlNuRVzXTMjlllKWStn8lhLmvv/dOHrAQDXf/RuvObxx+LDHTsK1HGYX/239+EX/lns1799/IO9d/hkG/vwxL+LDIjZXQGLjRRMtBNQ7cXfaI/UOFHtpGdSnGnlPNMpBbLzbKTygzgwqpC1XxL4pVGanM5Amh2kxH1zXGCWQ7K8NHSULYoRl3qhPQc2WWG0GAo9dlkWbS3/g0e/DQBw8penWNuK1/zyG78W7/ivngQAvL7ZBi8xPMZgNBvZUV0jJBilOwq0a/HamR+hS41BVpVzy5kpqcCO1cVG0mmM4eGUtj5fquk3D3WPZ99KHxcrJTpFkznvmv1cfOoX88lzxRkPm8pEQvXxS2GlvSKQi+GDKXirzpsVuLjO0xK/WyCXHtaqAoeyF75QjhS+h4A/XNwNAPjJ9/5lMV1O3nMNX3PXEwCAE/WuDICn53EB+MLOcXzm83cBAI58vIFPOZxMLeQPkI2YOgDUO9lkJizOxgQT//I/vhPf+LaPAQDeuPa0PJNfmqUy+NLi2FKFPfvCPb5MiToMyU9pjF9dfUqlX+0XEO7TEoE+HBFrgO4PQAGAx9ozAIBHd++TBfT3/0PEv1+9dRk4HSEuP6nRHY2dMn78Iv7vd381AOC//qu/jzY5Gxoi/MJzMelTsxN/Y3bKoItFitDsBJjd+Jy7945w1kST3mH/QjpGJ/Vjx9zJs8WgswK5rCqIodtvjrzaAAAgAElEQVRLY+gad10lgcNLMs9fTCpScIR6Rl1nM/dfZGSVvlqGoja5pCGbhxr2c+sAADcK2Lk7Kj5HP2/xr6+9FQDwljPvk+uNBPwQpuO4YO7ymqBMfuJhJ7Gdz3fH8KbRM/GZWoJv0nzLqZznRohC48tUKlSh0JNjlHemF5eFu07HndrgdISsZrzsuUYCyXTbrVrcD5SXgKEPkMsggwwyyB0it1VDDzew3WhHkuYrLzNUdACR5hpH00+zWLLWYDESLb/sjNN0fNE1+LF3fx8AYONzBttvjrz2rzj7OSli8Lm9k3h86xQA4KkLx+MzPjPCmU8m83jPiTlXzzxsChe3Y5IE+y7ln7j8WoNqFnf8u3/b4Q8+/BYAwLtf92Y88pYvAADeduwpcZx0vtQx7DIfNhA4hyZTP6/LSjkk5EVr5Tpzpma2iKXEtteXy2UBa7IlzbHS9DSP+X944q+A/vHJeP2OBbl4zatchEIWZzdAKftgt16BUpEQf+IIHvp/tgEAH/3OTbx9FDXtefD48BfvBQCsb8b2XhxDYblUgG9im7/mrU8JK2MeDEymFKm21TDDPBW+qMmJ+uQC99L0mnSNdnrrlA85W58xHvYANks4BJaLwD4qq6LOqaNzuTRq3i2nwQViIJ04u3kBt5aclL6UiasuevzGZ2I2xZ848344iR9I1gxI8hWFSuUQ4pKr5QNXX4W3Tj4f77fDsOsJ1qgLDJPri5oFSbBg5MCnMaP6Jo/PuR8Xhl3wPfaOZrxoyeNfp3/IczfmNErf56DyMCls4iXM10FDH2SQQQa5Q+QVwdCd5wOL6fKK8zrZPCsMfZXDpYbr5RffyAl1yAm3PCe+
cwHYThrw3/3M9+Hop+LuuXN/wP3nLgMAHhhfxmdmEa9978dei/XHoypw+plEMbvuRBNfHGHMT6bduDJS/urEJzr4pCHMj2V+a6QuAsB1GEyfi/e7/99aPPsnDwAAfvnhB4D7Ij3y6x/+FOqlLduGsRwfFj5+I0JKOySFEWbRx9oC09xyTUmUPPLoc3w/30atfPf/ugunP/1FAEBYn8JPMn843ttODbq1ZNmsEcbXklZnCLwVra13b70B7zjzHwHEcbC4FtuyOZKpgJD+AyDh4v/ZuQ/tK0oMACa/FxVOeheqnsMsY6rLFMas/S7ci09H8Ykaj2APIbn9TUgs+LJaJ8zlGHX/5rZaIwveTInPRmPYaXyrnXM16k9EPP3KV1sc5xyNWd6zTXnkEaCSWBVH4seeOYfLZyI+H0wolsA4asuuI/gqa+vAZ9vT8Tm5wykTSQqnzJ7ED0hBG8VNH1PxkzRkFVmhjNeKHapUcrCj1fMj+0k0iSFmRH3pzu5XBHIJgVYu3LrKPaDgEu5kMCw7GoAcLp/Cb2HEkbFGbc/xmb3fOTaoRsBPPpuYEb99Fi7SYVG9agdfc/oJuffnt6OD7cgna9z1vtjpWw/HQbM4wth6dUlvmjnPbuphduP56/dVWEsL9igtLggsCxDbIGZge8Sg3ovHD/2rHfkd/u/2RwVV5LB4ZWLDeqIDiMTZtxQEtcqJpNMcCC+bXA8yy5v5tp/gZ74QYwNOfvgqwigFdj18FLunU8BOcjhX8yCBQL4GuM0BOB72VPR0fuj5B2DOfAAAcMnV4AR/uZzfWE2k0XWP574stvPbx09Jvp41sj3lAUCEYITPXPK0zKnuwYWa8WJ8br8Ud7HUry8KpR2SHBRMlEUH/O37bKke6DLD7MyJCHftrY3RJueznRI2n4z3+8Devfj29ecAAF3IEJ3BfC920DIBTuKDnljD7pvjNW5SfjNXAcLEoV2P/bL5ZIuf/kffG5+vBeab8Zr7f/AJ/Ni9v9l/bvWDu6HpZfMU1hlZdEK8KJDjQgUY9QOLimPcqOC8oDaqm3V2D5DLIIMMMsgdIq+IeqcpbVp46fxB5tyy6N1fawbaWaohnuz8+N3Zq/HB90QnzMgB89Nxl3zTmWdxXxMhl88tTuH89SPxftcCfJ2oS8lMbI8S/v73/ysAwBtGX8RuSCajW8c/f+qdAIBLT91bnjX5iqaXvIQu17OSHXHnnJFEQuPLY8yPxTY4Us2xl3IJLPz+bjtwH9ftzAEHKFQvSw7SHHVfCgVuCUZ5oT7+zOIc/um7vgkAcORJYP187FuaX4Y9ES2X7XsqtLF7YCfJ4nGQ9yRVmo1s+d3nt9bFlH+yO4H6eoFa4sVFKWuudcDrUmQwA7sp81kH2mc1aWkUDx0osNJBibrMi8BV+2huWXn0VELiD0GW4dHV8SF+5bHApdRPfveazVjL86PmrFzr69Kcv3bpzfjGtagljxO8wWDYRWy7yR7Bpn7X3TB+nnApfeCbUMa/KVZOLoxhp0YsZtN6rH8+OuMvX3gQ//1/8+0AgH/0ql+N9wrc68te26zAO1etbwd9/qL0xZuQV4yHrs/lEH9ewiZXmXM6hWoWXccRFKEWIKew1akw4/1/czfylf/Ze74B6ylk2I0BOhOZLa898qxMvs/unsTOc5EvfvaJuVSuWRyPf7/h+z+I71p/Sj1LSq6PPTz80K8AAH7w6/46+GfiIDN7ivu+nfDVIzW2743PuTgGpJKIGF+c4dpfKWHFywu5HhR6SPRyuQTqY3KHLC+GoS+LxpKBuOjloI3/5Te/CQ/9/RgAVN17D9Al/v3D5+CbBFk0wPxUwsjvir8zuWCE9TC5WBZ6eA+zEyetfWYTMx/v91NPfB2q3f6CHgDUMXYM3ZEKX/vgn8TnU9WEawTJ9TFOisQs8L7Cx/m99EKuN7U8/l1osEp6UGXOvNipjSTgUJwoGd50gXqViVblLvFg1CuKQOs6vVreceSzAIA/3HiD
+CrcBNg7EX/zgx9/GLN7E4vNpOLM8Bitxd/o1hrUp+JcCwGYnYtzdOPzwFZOwwgIBpH9PGFeCmP4umDUbAluLQ6atSevYe+nYmzJp386/v3S0QXJnIkD+kn3r2YD5Q2xYg/2Lzwv9k3RgYc+yCCDDPJnU26rhp5DmJ1n+BWZA1+OaJaLnEOBcBwI75u9GgDwz3/9GwEAG+dJqoHb9YBzyVHz4Oh5+LTXfe76cWx8Ju7czdMX8dT33AMA+JG/EbXv7924AE5FCjy8lCuYB4szSZv6xTf9LP6Lv/MDAIDZu6KJuXbBgROrYfdshTaxK+odYO1i0tjWGjzypV9M9169VYvDrHdO/yOUTwMOJYmT5EA3/aRrmscsRTtUdXhdx1Wb8X88ewAAcO/vWJhHYpKt9uQ6qq2okbkRi9Exuhaw/aVJC0yOzeVXrHdyEIAHpcY58lnGt3ziBwEAu797GnUOZEzftWvA5PkUNXqywrcd+6Nyv3RNt0Ir1hkWb0RWFdrQDAgtPdOcyrMeWp77m5CVScqWJBMUPAhvHkerdvfBDuMLcdbYcQCOxZdaf6LChxZxrnzjNObLqGDwwMlovj727LRneNpJstLY9Kz1LCFlv6KWUSXSQaBYHANIrLRk9fBigvH5GK/w6O59AIDXNCU7ag0npekObA8Kxcmd2C66xkNF/kCmX4FJb36uviIYug4wupFgI0CH6+/PRGgQpBNj4YvCN8uMhHdvvwE/966/CABYe7aY1xkW69YDHj4acb1NM8PH9iLuffGxU7jvM9E0D6MG1VfFAfXXjsTiFS4Y+PRsFQwWIXu5CTkRw3F2+MVHfgEA8OhDcZD++Ee/Had/fpJ+m5DjZsZXAiaXolm59aoJ3rHxfHoPs89E2//vcnxwcNGtt81XhvujQGnL+Vt0ng8pHKzqSv7e83HjHV/YwcWviVTF2VnC5uOROljPPNpES2y2PcbPpHqtKfue3QhorqkAtb2EvXcWi3ORysRtwNZ7Y19U8xKYIt+ZAePL8Tmff+sIX5IyLDIZrKXVcw5f8C21IWWoglHyCuni2Ll95HipT7xigdXGoXWloEkvR09Opavrxt5CyZvwgQsPyrObpZQAGmpaJUdT8Njxc1vYfTb2MQiSHbF5Ns4RAPjmr/y5dF/GV5z4HADgcXsvfGoXogBJvaIfVc+BhL1X2ywsMjsm7J5LawFDFcMY4+jHIt72649HH9v3vuNDAtQb8gK/jInh0mTr2BTWk18Ny6xiDOlUuljatG82YGyAXAYZZJBB7hC5rRp6CYFFr8SbFRPVrtzBDpJVocYx73nZ1R5PSZx+9jf/IqbPlYACALATiGaMzVYyKHbB4Peei1ri9DxjHlM3oz61jvkfxUrhs7fF7XxEFTjtyh5BmBNdAOahhD/nYKassfhPr4OTk6/eKdzTal6YKNUiYC8Ro4/Vs5I/WZltugZrzhD+gl7zQ4BcVjlAjaqLCSjn51IhBF2YJH++2yau8bEJrrwtafabc1xP2bLWniFsRSQG60+TKMf1TmK5tKWPAxPsWqrKbh22Hoxavp0Uq6g9AqQKhmKxVVtBlKXFsYANzvUkCSZrZAiioRsFI/VCwTPJAl45Fku7jNiKs7u6ibEfG2jp7y0WPUdZWVI3Mkd1eywLq0IWX3bmafxeSBp6KHEA7RHCkV+LTKbfeHO0qr5leh1fsfY4AODnJl8tzB7iknyrnjGeSRM2jJyUrMtS7RGqecqpvmGE2QYCOBWnmT7LoOsxjUR3Lab4GJPD7BZUh1kVR7NPXkZ/vvJRKUr0wrCKGgWoSMOll86Rabrwc00ev3bpzQCA6UXaZ5Y2WwF7Z+KNjh3fwclUafbp7jgu7URwvd0M2GsT0+SiwYlPxHtv+bignzaVQC7LUifIpQsO24mj+CMf/S4AwMmPl++MrnvJ8VLtefg6URU/eQ2fuho3pHN3XVtJd7sR+lPGD3FI1LZVwUQV+xddoHQWzSw1ObzhxAUAwOOT
Y1h/LFE1T1RYu5zgBi/kEpAvrCC7nv5OATuN1y5OAjYFkkwfazC9lCiHGxXmJ/P91MKViRoBkgNGP2JNRgJd4r/j30XI//ZowsvzD/Wou76/APQZTEt/b7EcBLX4FRj1siwHXC3fd5zu/TVHH8P7XMxjRAFwCQufn2aMrsXr/8HHY6rdb3zHz+GBOp6kiUNI8xKNF0XFjoH3Pfnq9ENlw81jv1sLaK4l+uiUMX4u+fXGZVNn5wGTNvCNHOnLKio4iK+uO2CMr2ICLctK9lfYv07djAyQyyCDDDLIHSKvmIae928P6mUjy7v/wtdisnfBvOCOr8tb6esYAVcWESIx8yDBOybRE7wB5ufiDvy6zSs4aiLVYeZHODaNjIovrm1IMJFZeOGLT3N5KThw2hc9PFzSCDw8uqSpPe8rfP9Hfjh+790RNrj+AKHaifbl2nNOSpyZhYeZRw3i+ms38eCRT8c2UOHDuY102wGAyyXJlmGV7OHvGLC3XkuvjIJWVrAzXkhT187QLN9yPJYh+7G3vgmnHk1tcV8lIfyTSxbz41HrJlecjXsxESbCXXNxHtp5BdfEYT6//ximT0cr7OLbj4ojlFyQIK/GFicZhex0LDm5gZJbpA5eMgGuEp3GYKbOGwSpmzqzjQRXaSjtoBQKK0kEmvFyC0VDLtk6dDcIOyxDLQcVl/nK8RcwP5uycl4yCHV8X3vCYfvB2EHh0zGG4zNvdTiV+rUad/DPxLnt1p04E6++yQGz2N9UeyDdL1cotUcdLr825u2ZXvIYXUvztSKMr6TMkdc9wiRCc5NpzgLq0aUB40CCCCy3R5fgs86/ONPJh/3zddkC09bjjcgrtqCv6t6DqHkHySosT2PoHRhv2IwFfX+7vh8m28XpT3uUMEm1Qc9NtrCZFvRNM8OkSsyWsUd2oZu5hUke8h+/EKNAf+DEB3Aufe+MaeATTjoLDp9OnMi/9ZEfwvS9KfdLyhfRrQfMT6T3HhmsnU8L+tzDTmO3XPnuXXzzeoQfZiu85jr3jVWmeVCWpu8YyEEoCqo4DNEmug8E+yKTf1VyNQA4m2pCdq+bYfTv47nqtMH8RHzXzc9arD2bArE2i4mab+Ed4cTxSDt7/soGkJI8XX+wwemnosnuq+I/8Q3Eb2FTvrPmGhAyVr7pMEopbhkEK9WSXp55DPSTla2S5UIIAp8dEsyiRQcWrZIuGNSJ1eVCCe6rD3ilVYt6TcBXv/VTAIAPv+v1JRCo9mhPxnsf+VRs+1+59nb82Mk/lO9mzNsdSYs3EKOhBYoiUMLWy0LJuP7qdK5iWcS5CzBJYfA1wR1LkGsbf2Pma1EW9SLeBSMUZwCyUUtOYEDmwUHzIQSS5wueXpafa4BcBhlkkEHuEHlFsi16zz3NcnQDcRilNqjWAhP0oArvLWvt33z0UQDAvzn75TjxJ0lzz8EENbA+iSbVullIdry5r6UeJjVeOcoCppfi/f/gf4+OnD/wb8HsbLxf+7oZmiaVoLo0xfqTsXmPXvDYvStes3c2BVdUgNlTplZqA7tm8Pz3RY3/b7/2fZilnV6nX807vQ8kTrOAvunmtakmVGnal6XuVspyia1mRUV7oK+pLQek9FI6XBmhei7y8MenR7DjVNjDMJrrKZPdZiWh47mfvAnYXSSmDAf5tcWx0ibjy4T26FKgDkq4fzWHpHmoNxeKveTEKeqDfo8o88BokZ3hLz6wmYKCHPfrV36JaSKh/45RaoqGF0op85JFP1euZu9Za+I3N5iyhsuBClRFwH955t8BAL7/wdeA5olbPrGgtdix85NxHv3SR96BH/5LMRUEc0CXsimaqRXIJfQgisLvlhTPtYdP2R3nJwmjqwmO9B67Z+J7tZuE3ZSC99zPxmf4ge5v4kfe+h4AwMOjZzEX+OXmGl5r69nycYFif2Z5GYFFg4Y+yCCDDHKHyG3W0MuxaJZLOPBBmGLWdl6MtjhGVxyjAdjgyEVae91VmI9EPmvW0BGA
LhUJuG4nuOaik2VMHdarhdw7O04WJ8a49lCK2pMEPyW6bfTxKSbPp+MxSZ7nK68l+CZpVjnhviOYhM+ZecDkUrzhF/96h7/3+t+N1+oyZfD7fAytr9D60i7aAgoZUydItepgw6Fm5bsR6RX/piD4o5MovCB93VxhGTT1tkV9JF07Ku1S7QEpCSXGl+O77TwATJqUyKutkJMsBga6u6KD7dhjFpdfG4d/2AOarfRMyc9S7XmhLZ4+tg2nqIhFWy/ntJ1RsGeWRGMOfKA2l62aVVir8/zi0dSH3KU2MDIi3AWDRSLr14kSAKCHI7tQoifLXNSJyYLUJjAIuKeKfqw3Pfw0PvboA/Fyy4J/2xQBPP5Cg+98NJILFtdHwDQXFldNoCIrgyfRdnONDKIgnPXFacLiUnKgXlbWMhVrbn49fvGB/9PjZ/79twIAjn7fM/jeuz8MAFjjhWjrC1/3LGq75BjVa9ty7Qfp44CXhaHf5myL5TgzMqxnWdxbX5UgC3alniaM1OLLsEgd3MqB7AP3zmfz/rsf/GP8m2kM/c8hvuyALoUPL3yFqzY6QtbNXCZVmBvUs8yiqCQDXK5nSI7EM48AzFOxhWoH8GkWuNNtMaNyoENrJDx9/bzF574rHv/Dt/wG5imb27ZrZBHvghETLVe2sWqyt9YsOUaTB90pxx0DKzLvvmxZlWHRB5LNJtdCzbIqHDw7RR1I+pgXhPaeY/H82GC8pQpfTErof04xPL4Sf3/n0gjj09H5yRszXFxPwUQ7hO174vHxj17F7FQMGrETQk7Ql1OrmpbBbfy9N544X94LHvOQi6kErAIddGDRKhjlhRhb8js99lKaB87AZ9M8QNhL8HTTbIiblTwOvXJ++sDyfm2opN8MeblGcp6QlQA8BwKn95uj1PX9obs+iB/94wfjvR0hpHtQSnHQbQQsnspRfh40Sc5p44TVFAKVNiKVwyUv9IHinEj3aFNRm9FVErjN1yX2oF2P1zZTg1Mfjru+/9hR/Mu3RW68/fpr+L5X/ZG8t26nnHakU0qXVYqsTWuP9yzPHDyV1M+OgJskMQyQyyCDDDLIHSK3N/Q/a+XEqLMTAxBNjlxAmyK0Rt6gS1zvEfZrdA7FnK1D4YKDgHkyCWM2v7jVvmPtCfz8l0YN/dQfZdjDoEsa856rMUtxxzPXCHe63pzj2kPRQXLy4xYmlyo7myAZpSnRrEJo4nu1Zx0oR7LtVsB6jjNPmuU1xtqF+BtPf6vHP/yqfx3Pk8PMF2dUfqY912DXRu1yJ/3tvEGndvySN3tF4+OllbS6EclcacMlq2LD9qZC2XWGvN1k2tQ7QLce+3J22oilYacEl+iFaxc8RluJvz2Lf6cXamy/KmniazPUG6l+5VaFWXJ8HdmcCG2xOwW5X3asujHAXbzfm9efOjAaeN97gFZq5XNf97TuRTLTe5q4gh6z5Wg9r4wA7sEwAYdSuKRVkEHuy4ZZrGhDhUobs0wmrdvXqE0pxQYAS7nPepKzVr51dB7+aOyU6rkG9ni8R7aG/dRLQjI0HiZBJ03jZMy3i1raxrXcq3cLJBpv1tBNwN7dGTutsP5MiQzOsSriaK8J87Nr6VLCiY9HmKh51zZ+6ae/DADw117zB6KNL3yNNrVTbsfWV2JdL5zBIsG91ipYTdMWw83n0rutC7pNiydRCfLpnEGtGBDZJPEgyfHSBSM4VV6gfWD4dA8PxjRlbzPBI2XEhVO5NE7wDOFVkT1SfSAPRNNngyTZdSMcb+K1rz5zCZ/K+T+eYExSpsbtY2kTWjCqWQoMaUuSNecgGeDMNgO7hZcKAGvPANdeE7/3d7/8vXL+WXtUcLgtN5FFfNc1mCuoJbfdKp8DEcCpTb0hBCGlE7h9ZTF0AL3KL5nlkjFYgyC+jGY7CATSbpDkWXGjEto/O0uoEluoTnhnNQOuXYmjYNx0GI3jIrGz5oTjPzs7wtHPpnS8zQTzU31+d70b0G3Eznzz+AuAMFdW
MztWwSgu8BJnORVq0Bg5eB/LxQYjprtTG/U+OWROesaAmYLUxax9JXZ95w0cKfgl+0MQ5F0lRB6Vig8puVwMCp5+nIEzZ1Kt0Y+exnbKqEmpoHTwJItxNXKo6tIXNi2Ouq2IAK76jUMm9Jorh/bvPejhUrHxI08GGWs5+KybElyTgXiA0gfmwrPYu3A3AGD+6rr4+kCywenAyfx81ple4RJRwlaldrgJGSCXQQYZZJA7RG5vpKjakSRzoPe9ENhspuy5Wsy8BVUYcY5IKxzYF6vbp4Up4M8/+CQA4JNHYo5jtuWZbGDJbOgCYS/twPeuXcXTp6IjZu/kUTRbyRTbTdGjM4KZp3usB9ijyYRrPCiHIHug2k5aZHa8GOCdfzk6UzbNDLMEMyx8LVGh1nNxinrT2+mX5YbyylOAn9x62/zFftsGLpaXKde6QIodUVI35JqQzbaHr9I7j0q+c18D3dGknVUBbiONnwTRVc+SRMcaCqhNyoe+buESl333jMH0QrzfyUe3cfEd8TczGaOeeVy/P/bDvWYBj8xe8DHLIoA2BIELOnEUkmjrHtzjoufyiZ0rRRisLxBGPwXGKsvrgPF+SEZXKxq6h83MHm+EqdSRYryECnVWZxliUTNrdlAqZoI+J31MOcUHcG49auhP0mmBWqoqERE8gxrFLMrQpDfC4ybF1WfjMBrF38wafNcZuEUBgLKOT1VAdzQ55seM5nr87XkqiadrFkwueZh5ijb1AWFUUpTk/o7zta8v6351nuSZnGOETJZwpT4seSpW2A3KbV3Qc0P3CiHQ6vSrrNKQHhTAoFOv5sbrQoU2d1MwmCf0bjc0eGgag1Q+JjUmy718IEwS/eWEaXG5XU+XEE6sRfjl8sYmjnwuYcRXE+TiVWGFo1ZMOASCTwk8zIKkIs7GF1NAzH9+FX9u47MA4uDPC3oMICrMlp6HfClQRQcWeW22vZAcwuQvg5TFBLdsAOX7cIqe6IS5U8nCpgPCntg7DQCYPtdi51xsF3IKWjQA5ZwrAHiesmFeiOdG13ypFUlB8vLsLRrMT2XaaY0rj0Ro59RHttBsp0U6sRpGVztcfiROj+OmD5dlWaj2zu/UKnpiq/qrJis1Q5fhmVUTX2dZ1Av5MiZ8mKI3FU2prLFfsfBhNS0zb9gdjMAwvaLuS36WaUq54StIYWeT/nLdCVOubU2pLUJl0ScqPqQQCG2bNsu84CMFIiFel9vTLQyqveSPIpLI/d6YSx1OPsj4IiaYtfjMC1/1qIrLClgI1POTFJiFesciLwFDHyCXQQYZZJA7RP5U5EPP5nhtVOZFkOx2CyqZF7OJZ4IX3qsDwydtkIOXGp89CcDUpIrvWUOn/i6arYMpt9jJ5nEwWG/i955dLwl8srSbPmZ7A2DWrehetmVwl8zKFgLVtOvxXb/p3k+IdurAYqLOfXGs2GB6mpFowsq8XyXOEXxmvzgqGRYDYSVx+haKDn7Kzz4CelaHV++d3zX35RgdPr8T+eFmt4OvI/2EbLR0ACCYIG2LBQufP2vZi00GfCpi4IwkWjuxsYtLqfsWC4bZS+yD4xOMryROczJ96/Nb2Ls7atQVDHZCHAMdgmTUjO+w7PgqfenBYl3u+pEwWxa+FpilU32s0yZkmEg70oDl0PZ88uaz8t2ISIyDr8ApG2bnjZQTdGCxIB0VTvrc1wVqyQo4QyznWnHSO3ApAkIeJ0cxG2YwEAeoUZZ9fqamcWjbUoJOGHRt1bNiMj89O1BDAJwtzk13PaWIaKk4IakEFu2dToFtOwVabdcZVUpRMD55ApNJtOx9IOnXPVcLiWFmE1PN1ph38ZxzLM/hLZX4FMVYIg/cLHvp9haJTi/ArPNXFLpW54wEmFjPWFB8vJGxZUFIA6gjIxPEsC8Ra4ElmGE5yXxO0ZrNKfIQE2fu6t6ik2XhK5lo3UZAlzzvbpzw3JMtRpNUeah2ZWDt1CWHSgCmz8UBdfHPxXvfVZeCFTtu3AtI
KAs39SCXVQu48+V7vVw5eRFXOBzZww9A0ZGgB9EWNeSQ+y2nme2CwdV5jPLZtB6c6WMdlZwtFsg2r1UTX/kAACAASURBVBsHjLZSm6fgoN27g0TH6qyP1pdJRJZkkzVzi3q7Tb+TIJ55i5MPXpHnXKioUCd/S/CIljzueEnpWPXeAGSBzO0141robVqYggqQgcoyqRajWyi53UzwPcVH2Di+BPxp/JjJ9xgvWXI0dxcq2ei6wBirnD93pZDdwGmRBQQfZ/bousRm8QynqhHldmETUKd8Ssb4wvYSei/QZRhmuy7Q3cjLnAlMSMWxBMe3E0gBa18RJldTG913GsemcROKG7Wer4VSDACdCgR0rhxr9k48kf56GiCXQQYZZJA/q/IKZVskEBVtss2aFwXU2WxztWjGC1eJ5ue5hJFrbTDz0Hf9CGvpeNn5lLWCrKFXM8iubH1huSx81dNCcl4XOjPH/Hh0pJGPv712ZI4mZWZcdLXs/qi9sC7IA9UsQQApp3dNDtt+LM+ms7eVvDW8pBkVSwZILJHsFPUkPP/gqZjmVmlvHoeSbdEmDcoahuECs2QNr1IOXR9IigAsqMaYomacg4k22GPRZRKwB9ukebkAN0razag4i8yCUCeoZffu9Pm9e8JuGFclwMl5ht1N5eguMSaXE1Q260C7MefPZDfBcncfx998VYwP2As53SbgQjjQGZr/nZ1/XajEMbrwdc/5OeVyz9w2rVc8dNXv2UHaK0enFToOMp9upYjDnQv7jCmAU/IcJo8q7NfQ6+DQpjbIcQYahnFEcq0hq6ycUpKQO8AkJ2W7SPfiALtI1nPLoMQu4dpjNC7tmWEWooBOII4072oHv5Oef4/hMyvNFkiFXGzT+EhJi65K7EO9TWi20pw/OcaR0aV4rNaNuasEcsnWVmsNWpuex5piRStmi4bPKNz8fB009EEGGWSQO0RekWyL3hOYiwZCShvJ2mfNrucg1Umq4oP7lVV8prwQrahBnw6XNfecB7uaEUJyrOy0I+wmDX1iOsH1J6bDWtLQj27MsLcZNfTp+fQ8jxAmdeK6OiPUTASSCjq8iNxWANi4+7q8R6e0G62RaW02iw0lsiyf1wm5dIbF4BR/NZDghKTP30IRXFA/ry+RvFpuJH/0+ihqW+QaaUM7JrSJJxyaIFbH5LwRbao7kjDxyuPsZqxYdHl3is3jkbY47yrwdkoncSWg2lOqdhqcNIua+uWvOoH/dPpYvO9NJBu/kffjqIbd2P1uIL7gsDD0nFbCeNPLZy/jz1dCO1xQhZpKNHeuZFRqFajEaoGFTlyTwzxj7ypVB1uIxpx/uRl1gpuHxoObeO10upDI03lbw6cvVJWDTZg7ZVLFvAFlhzqjVPMCJOK4PQrxrUkVI71SElBdy2vJGsYp+dyeq3th/ho7B6JlnX0AXlvROoGeh0Rzc4ebzo56exf09HCeWcLRneeSPtQZ6ZjOG8zFtAsFXkkW154Krd9xfZ5wvp+HxRhdOudlcbfHkpl1oRZnxPZ8hOfnkXu+2ezJvdaqBdq00I9ri61jKV9ISgGw89gR7D6SS5KF4nTbM6h24zXjq2W2ve7UswCAK7lEPaI5mheCha9UPoiSHtd6lmMJKfYsTljnaKWThXSggjucEnQ9nm2GB4jFTG+8k3eyniW4qAtGMks2KaR+HmocGc3TfWu4FPpv18rg9lWpFVnvmlLtKzUzG49pnUzwNeDJS7HW33yvAefNzUNgi1BxgSwW8XuLb9rC8QQfzZccooVHvz9svwumxz/P47km23O268AiIQX4/Ru5odDLZNRjuWgo7RBqxWZ4wFCQHD2WTGHjeIOOM6xWFJS5ryWIKktkohVOen72mR8JWcGTkzmq99C8bjjH4Co7OY1AOF1XQf+a5qHnhTwrO25WSdpqeIJJcKFb89BJQHPOmNwVbAv8QQ7gRWL9TMv6NXdVLxX4nq177dhjtixMn30myh/3f2d/GqsXlAFyGWSQ
QQa5Q+Q2a+gJHiDAHcDHISoJgbJWPqdaad2Jh6rM+RHbnpaetQMDL6YdB49xOl8fyQ6UWrS+vXmN64vopFy4SqCMkSlb5LTu4I5nXSBqlqOrhGsXI8dp7eQMLkE4Zpul4ML4isXO3bGpH15/Tu6XISOdJ3nZEdqqKuIZjso7fudKyHOPqmi5QCvaERoOxzTXTrtVMQVAcZB2wUgGSUYQZ3bWasdgHB/F/AgXj5yS0P/4hVD+pncdX/ZYHE1meNKqhGcM4Ph4F7MujoHZlSmkOwnwdXZ4sSTr7x48AwD4O1/y3lgEGoAL/ajQDMHshuIEa1UVeNFEVVK5eShFhj2ox0P3S3EFPUdoIHE6a6rbcj/yITi7k98fLpA49mKZwZTdUuV7t94IzbgmJ5TiLBu8Vxz/xHBKHc4pAbZ9U5LvqZzkGRYN4xLnAQ6idXvyMg+M8aKhO8eFrpgdjS0XrTsUaiTAAsWOrpGUh3RTDcslKMQGoI3rADn05m52hM5tjb007jL3vOtMzPYIAB0Xq8pD4irIKUvgJcCjtzewSMK1eUVC3Cg5zHmOqpcSoFrh7pUq8x5ia+y4kfCAc0BCvHExi9emcRHxtCbPZOc1dtvUAbYSfndTVRilII/j411MNhKDZhw3ELaA2UoBBKMR+HK8x+Q5wihBLc1Wh/N/NW4i6ybCCZ0C5TSevvCVbFo6B0rM/5EGWfa8h5LPAoH6FeEllLiYr4cBtwCFJ2wdy4Y8qkrswNyVvlxOXyDFLNIsXAst7hpHP8Pnz45RpZXULEgGvfOE6VOx/SaXWuydTCH1k1INKm/Ori4m8fT4DPPtuPkGIuG4A0AGXnfviWT2B5pLmPnMldbPa1ZmVtRwiq6TehCmriG2zF3Om55Tlbv0MaC61bIsMIcVW6B9NvpY17bNz1yjHxSoGS/xHBdOeihc/Q5GxsAatWjUQi9TPs1RZ00vFl6ni85jPzDgXeaqB1kDJCTfF245FExFlqTymJ0Q0jSVZ/AVRBlgW1hupg3YTbELhn0PN5cYEV+eP6QFnTrFbPFQ+VtuHjfXMkAugwwyyCB3iNzmbIvpryOEzEM3jNCVHalNjxSCw0J51kk0vBISnaXhwgyZcIvtVK3ABZZw/zF1sn1tjOO56744HVzL2N6N39tYm0t5KCZIvvbWV5gkBsYsRqdjfAUSer6oG0zPxx+ZPB9w5Km4zV/4ygm+9Ut+P16TzM7laNRsgsesjwkisE3Pmdjm7GwrmC2+4xI+7DWvlYoJ9xKS/dyIZA1E52c37Huh7ONcts0bgZpGbPfBDbu+wT1NDMPbOWuw+URs7/mxBnacx4zBsc/El6rmrnDSU156u864Nouath0zduZRg9rbHR2cm6yO7T++FE3p5+wG5vXl+MwqIrQLLJzzLhjRtDP3vA2md6yjjxcqvcPC7596WsPN0nM0Wy7l03R5Mo9DgdJ0vu5VUcoHpp64AVaQJGUDi3NaWz5RC07HizLGWZV79C4/nxENPWvnQHSOZ066T/x1CsWBTLbMh2Agbeibkg89S7VLqHfS8axALgCwVqc1xlgJ82+9gauKdQ2kCPnk1A21gj8ZQE4T0qno0JfQr7d3QRfTMMixd4QUfwPnSsBRBwNO1X11wJH1GZKplyCZnHlR4e1qwHVcTLvT00hp2+K7Sud2ygPtGZUpEE+GYlwgwa99k+GU8l5sWcLJ18+3eO6tcVF587d9EsdSusW5WtAznGJVQdk9V6NNA6BzBovsIQ+lBmHBVJeYLXmC63wQAdAVUA5j4udMdo6DbLxOVd1hBNiqmOnZL7DwFabCeMmYssPxKs6cnfuA0x+Ok6XeqdEejddWexFqASJ0kid+pp3ZlrGYFww3B6aEPQOTJk4wJPh8MIzQJEbOlchw+mJ7Aph8Pj1TP2xfBwh1SxDSDbXX0gIpOLxa5HWaVaH7BoJXOT8kAMWV1K63UlhXokrzYWwsplVs+5GxWEsZSiemxTQdT7kVaDFj4mPq
xIfF8MJsWeOFXGMQ5Dw5FNZJXsM7htcaiVpDcui/70pxaWdNSY87z0F+pSAKWSoreg3ZWHxTiqnX17MfLCDVssbkkkW4HteQvRNlPM9sg3lituy2TQ87BxLrJk9FE4A09nlOPWYLqXl8szDpALkMMsggg9whcls1dEmEU0NSZQcm8aaDGDadNyagpfJ4WZ+xRrNPSmCRaOjkMQpFy99K5dxr73C0ilry0TpqD3YCmFwa1FApfmBGJUAIMVwYiOyS3e0Iy1R5KyRgeik5XrYJzU4KdT5e4eHviIEpX7r+bE8zB1LdyOwIdZXALK0r/PvWF0fooqt6bAdgidniqDjJesdllyd/SDz0zA9XxSt0ERPDLO9UkS9JqHyDkcsdnm9WWA9H33gZ+H8TPLPlsX1/cih5gtlN3OWmaMhJuUO1VcEl59MulzFidktsANsASqpv0EwaeQyS0mhOOfxalWTLoZSYyywdD629v/j08oF7sQbAsgN8iakh1lYJGGP38hxpB8n9RyP0Na06TFLwzFq1kLQF62YhWveUy3FNTjTt3JeGvJSjW06aJ7CbOhc4QRsoWTb9tDj+iaDwktIuwVHESZGMUYF545+o+Ze5kWvJ+rpo/twWlksT/fNorgfU6XkmT20BowjjXfnqFidT/8xtjXmyqK3jUsBCZVXsQWYrhBxgWnXiTzPkIoOOIJVh0HIpAhsIqPOALZnSrDNoVyT2z9DKzDZiBra+QoK6sJ5Xa8TF/WoX60xmKqKdBolG4xrwO8lEahlIeSKIA1rtWd9LVYiyo3wCjK+kYIeKpZLJ1TfVeOfGxfh8rlHZINMC7etCf3ONwCxzV5eMks70KIqSt8WtyNmi8TZtqnk69AVdZ5DLg7ilAr9oxotmRrS+EjxdzG4OEmz0tXd/Gh86+vb4+Y6Fr1KgyxygLs3Q2kgul1QXA9UuCe6qQXPuCGlPB7d9k56vxw+uvf0uAMAj4/MSwahFF3KY+1oWcMkuGPpVilYt6g4sPp9eWmSVq0UXLvEHsJe0HEaOnvM7RwHEfDi5EPik6kp2SPZoEl+yZieLPiMIdThXGqvZYZQrFrHFOB0zlfwtDVnJ6RMMUCWmSY6cJFcWxEAQfEGnsaHKgzN9tWVZOHNAGbdUgoaqwgBjlT7XLEjglRxNbDpgdC0pEZ99Chd/+G0AgPvvfqZX+LlLc7O1RvLH5P4LnoRZQ44kYnV5M5b56tQ6eYMyQC6DDDLIIHeI3GYNXY6Ko6NBgV+omO+gkknQUXEEZmEKoskCJYm89YyxKdkKs4bA5OV4kkxGux5Q7xTtJ2sCngkh56DQeRYcoUoavZkpB2867CYE08bv7Z31SwyH4gjMz5ZhFusLJNH54ghd2EqYLV45x4I2LzWzRe/+qa11yDJbOhRNLtdDDBQEdtHc5c4ZgVksl1B3pwoCzChp6ly0+ftGl/GuN0WN7a4P7Ag8Vu8AoS5FCqaXUt6Q5NjcPVfygOjQaXJF6wMgkAs5j3A5wguX3nAOAHBvYrgAkXteinIUjnV2ssd3Udq1ZF5U5fZ8JbCbTu+gU1xI/IHKm90rbhFI5eiBjEvqIBrlrZTL12J6ClM5VMnR2FQWTSr31hgnmvtIOUuZghxnrX1iOoFq2DeYpYE4NQvR1jsq7dytA2vnE0tKiAuEkCBXVF7aIijrnbiE/sOyaMQZtuGuOJPdqMzdHjTpIDz0ejelbth1GH0h5sf3jzyEvf8kOkU3AbGuF11VLOrOFD58nqNtCfijtsRVkC2BRTHgLR3SzVvUrwzkggDFVSpmhUWhMy4FH2k6YxYx4w2L6We5TKzWGzk/YosFV3IMAH7ipUKNb4I0JBiCqUgQAgByRjzkTexPVDMIna7dLInvR88zrqeKCx4l0k4XfdaJfLKJvbBVD4fTFEWnFncgDZRVOVs65TX3akI49IHKWyXZDK6owEEqwrEzLCyl1lewPrW/SsCmYYq8UE65xfbr4yJwz6/vYHxlTd4jY+dm
1sFMYntNn08LzXXG3qnEqikZisG2mNihAlyTqHMLC9qIi9f4DdfiPeAFcvGBJRK0C6XAs1cYul7c88I9882BCdi06Dq6QIQyrGKYZNQm5gIpioZAB4Z6eUhulWQ/hPcEl9IYu5rRuRRwVVmM6/LD+T0adkrBKm1ocxFvdgKzLHwleWIYQWjG7UkH/lRZhAGAFwSfA8IdSc3R+EMlek4WdJXjRu6hKNLLi3j+rN4GxlfTBrad/l5t4c/HPEyf/RtvwckjEU7d62oFs/x/7V1Lr1zHcf6q+5yZuQ8+RVKQJSiKIdiKFciIgyBGECSbbLIKkuyyzDI/Lj8gcLIJkI29deQYlkHKpBhLl6T4uPM63V1ZdHdV9WhokAEvaYz6AwSOZubOnDl9uk49vvpqUPGtqF2hbIbNSJrFOFd2eIt1uLp8bkdHR8e3GK+Zh17+JVJdDtO2j0TNnVZbZvffdyQshRaR5oOS+xd+Eq8htyaXdEjREqHjgOr+UCRwmQ2KWdIqtNENcZOGYkMNxVYJ528X75Ih3ObjLxlfrC4DAK7M1t/w0Js0S/Ty+pScerbRN+dApEPr+XjO8IqGy2rnEv4/eK0vBFuM3dO5ZJtjNnHAM1LdnbDThPIIJ8Jnntjjwz/InlC4cR1XPivPX/aIxSv3ywluW3jHoxasj74qKZ4FyWg69u358uXv3NM1uDSKnC/zsd0Ob0kLf4KTYl2rjKnzX2txN7KTx6s4NjMmq7e6joNy8eMgKbYaukfTRLad1OuL6wHY1KjCSYoQgIxMe5WQ64y1ISZMvkn9iSTAqMeSPGHYM/d2Uzz0uYsSJU8uijzASFG1mm4u4TdZqrrOAvEbIK3LHhgYvCgXc4JG1AygDsFg+kbBMQ26T2oapn5HLZgvHiVNtTwthdx7D5E++m4+to8fS2olRF2rEHzLaLHaSiheebUlwUQNAU3fyN59/ILoHnpHR0fHgeD1euj7qJdJc0nsWfOxcOBvOnv6UY0nOMgYuCl6yUVuaRBh/k0cMFVvvebVjyZQGQNHCZLfYiI91q2DK16BX5F4C2OhM20uOWyulueWwPa0eugRvzzLPLqPbv5WPLJ9inqbOIgIvh1TFYLmzePkWs+8nrs9wyssPdFFavKEF9H6P17KJ2WcacGMiOGddo1WxUMX1CNdDaNQSM+Hootu6G+ji3jnOA8N/vSj93DrJ78BAPj3byDOS8Fp9KDQujHra2a4yBYYz+srWgQb1iwUU5oCeCoLey+78w/+5FQGopyneVMIrZHeMs0kX14L3BN7iQDt5Pd1HKVzNrCTc2AlHWxBTXoNghNBp12FPqHZ2Yn1rxI1B8wJXAcee9aAjGmvt85Msq5WL34o+25NLEMhPLGs98wF+CKS9s61J5hc9tBnT1U0q7YVJC7t8xXCI3ager7Y7OlaQzft/hSU8z0sWb5n9ixhWBZFyWeFQHHnN7j7T38BAFgMZ9IFGiw90ebNJ1Pfqm39W0tVRLOWzR419a+XzaG/ES0XsmkAaK8EkRZ6YFrWOZFOv6kniVzDAKgqfxn6s5aohiKIca9h/unRBlPKsapfE9KsslycHAdNajRtBbr+lu1VEulSilogHc4jll/kQtvy2iMtbpq7Wt3gm+hljuY2DLKZY/BSmGJTsRfDHUwhNDy/ENpcIBc44CJGh2AYB5O5GOtaETG2pY18Gz02vp27OPcBS1eKoqah5ezHAbf+rWyyR0u4o7IQzsGVobTHX+bNl4Y5plM9z3XqDCUW0nKdVSrHF/Nnn9zLrz+Ox/LaMs3NUItWk6XKNleDbtlLea7kKL/PrreVcRAJiKCNY8mm1/ap8hkmEy5Io0caAeGyhjByqiOZ79IpZK5phppK513VQdq4JEZ+5qL8ZstWG10UNtS1+RKfX8vvqYZ2e9kURUlTGgxIYxptnU4kYk1r2DRGtT1+o589rFjTLE8i/Lo0RN39Kv++Tz7C+uNMJXJhkH6LMPmGpCA3X5NeESNuip8u
0P41YzQFUvpdXu0e9JRLR0dHx4HgtXro1ZNlB/U6WCmMDAbB3HUrPx1OvYJSdAzI8gD1I4TCGB2m0rafB0QU2pjzkqaphdKjccKmOHqzx/XAgLjlZoZgvcunEahjTMORegH27l89+NWtEVc+zZ/35MNFM5MRyMXPyXhpNdzebr2OywptmoWEsla+LyldjaKeX8s3J6NadxFeHADxJqdEOoJv58ush14joZicKtLVtnfvMfcabdX1+/j7d/H1j98HAFz+9JF42jRFxOKtU/G6Z08TuNBX2ZFoV9uWwuR1ziuPA6jwmy9/nk/cp+fv4JPTuwCANQ+iX2+piJukkg3WQ9+XZtmEYW/RMyYnnnn1+jjutIibKEyKZ83gElxIysV6utJOz9yIhVXZBDZuey6CFypvidhmpls4OifXx+C0X8OR6uYPLmF1Mz9/er90pi5JRhJmnf8S9c4Tqg3xK9dEpLXwWfeJ3yqxwW90ruywZgzLouD5bAv/OHvj8Sz3I9z+l+/i6LjMA560+NkoYJoo2kb2lQ7pgomcA5qMhXjukx4rMb/0CLo3I5/L0AsTpBVqR6iWm0B6jYYS9gFI8odOPwQOk7lYKledmSS3bmeXViQmbK7n507uAuN5zUET4lHJ/85Zwry0YKR5fs/irObFDI+eIe/dXiKc3s/Hcfb0BNcv5URu3eCWp53bhK0+y3PSLLKZ9aJohfHrDzPn14bjF8RykTBzSLKRybEo4BGUOw8QuNxF46A89AX2o6r8HQ9b/Pff5Tvr6e1RW/+Z4crjNK+MJZYaBxPD1XSBSbm6wHCbcjLGARjznfrkThaOuP3sOr53knnHmzQK39w2ieWhwDUXrgwWm2aptYMQvax3SCqRwKxaHzJz106cijsNY9YIXPCsWDunVFKQINmLbNIBaUgykJlZb+BVviOxmWzlnTQkMUdj0FlSLjMfsLlRuPirkmpbAnZ8sMyY3aoRd5NxctgwZCo7bc2m0Y7hq/7POolB9+dbpNu5XjP91Q8BAOHDlbBVYjBpFtvcF1qeuTOMlvzvTprM3JytcdeGt5fPofeUS0dHR8eB4PV2iu7xIhjQQtXEGsIx7/DTy/slbCWpcqfE2m1nZgoGw+kejHpi1TqPyWG6VLyAtc4ajLMckgNAPE2Zr47sMVMpXtbQz4ZIgHqB4ViZB9OdE6y/nxkTwYTdUboqgVA+NyuyVVaDYa5M1HjmQFsYcwGwKnLOhJ0XXTwTNkQi6d5jMh66KZRyoka0SHjMMvLPteyFgjWN+MH79wEA9//0A7z9H9l7xnwGWmY3rHrtdDog1nmhDoDwmBkkxT1V10yLAb6kXPwXOcT+nzvv4S9vfAYgp1bsyDibZllLAbR45WGQAu/aSDdM0QufPASvIlONwJr51xa7g3lsGBp70y+vEDoYhcBVqsN4k8ww3rp68cyEVMb31fOakpPnopk3sCWPWXnsiJsZvulmWdfyffMnSVr/3USSFmXfSlrUIviw0uJ3ZbO4iWX0oJuMV74OcE+KfsJvv4J/N4u0/fLv85cMbtI9alNiWyeb3pIUXDCRwnPSLE2Xatj/+Pea5VINEhO3xl2icQKqUL+R2G3UGWWvq0od+SQXU2JtP3c+ibbIFL2EeVszRJhnvPO5OW0ST0vTyemE+SIf1LQdEIpWy7YMJl480AvORZYwy87yvHTbYfWByrgC2YAlI4creiiW7hTMxm4ah/YwWALtDeEazQ+T6nqlMM06ciIddM4jIEMHmj9LpGG6kUW1+dV6idoJSA//fMLb/1l3MwOhGIRVGYZxPMN0Woz1QLIOcW5uiqwbmyIDJW/PT3PK5fQXMzz45ESOtaojTskLSyoYHRahIZo5sDGpdINtOmE7mCS2LJb8744RNzK59T0NHfWCbtRS8zKMM1jZjmRT907y6Clpnr0+Z8kakVUy28x6x+gSIlUpBcKVK7nThymvw+JsCyoSC2HhVMaB0OzfKh/itwxfjXcx8uN5wPgo
51/c18+ATRmUEiL4adbzCH/2R/j1P5Y5oW+ty29ybSt/PR87a9Xos5gZpPXU2T3aGG55npsaQE0Xvih6yqWjo6PjQPBGiqIuUqOQVu+uTKXWCQCm4McebSs+yh2/vu5IHQjHSJXL7hh+qLrmCcOeQQaYlbZxMygheQYfFaGnWcBYmmWcY0kNhOoBPiH4tRYxKiOGCVhfLcyJOwGfPyyDNkoTjh0nxpEkbdHMA41WtIeMd1afA6z4kC2Y2bs89jz/KiF85UG/a5d1wSJenSMq8wIAHdqRGR9ViI2wmNWQbSYF0rff+RrTjdw/MD5cAkNZu6mwIR6vMFzPEdHmike9wNKQi2JADr1rYxEPTmaKVtz62Qa/+tub+fuOngrH2g6kWMdRlTGjNgpJK3/QQmgMTmdgJpNWs1GY5ZsbhkQbmtcUnPHqwsWsa/U206ARdWK9DqGnNi+jKgXI/1TJAJiomEiHdjiXGkbMpkRKcx9xXOb3PvjjPMD3Oz85A5WBNeworxuANHMiuQFS0TUQ1OZMJbV6PoFWWz3QorGfHj7C+m8+AQDc+QeGP6qeeVmzHTaL7tF2rZStYtJAdt/W4wkmdWr2pZUEoMSy118Ub4blQpp+ybkJebpJvzQhlTyuxoPldThWVUQicA1TPNd+CJAnHSBr4tOa341zwrzMAx2WhG1JrYRpgK80yOSQtr75uzRTMXxwztMCmdYYa5PRMuLksxzCLX9Q41hSdkhCQ0lsWCxmgIU15MBORX+328ykQZy5sCi9XAj3Qqjn2N6omcxast6QyRp31oaQ8iEh+KZhLBXaWRwDZsUIzHzE+Xfy+bx6/2tgViiDl47KcaixxmWv9Q4TvpLZXGnm4Y4MfQLA4tdf4Vdf3sjf8d5KjHimwurQETvcANhhs0SHaVtqLkHZSzCDDnZD9vqvrLXJu9q8rItqEOxA5VcJ+fxEkvJ0zCrJxKSPI4HH6myxMGAk9eK4oa7WOor3DqnsLyLGUJlfTm8AT36UU2m3fnaEP3Z+DAAAA+FJREFU4YG0/cqNPB2NwnCKC286zyHnbnxWGoXOdRwQnx5Lus7fvIF7f50/w83X6mwZR2tfXQPPa+Jr0mP6nL0Jq7PGhnKse9R1LZeOjo6Oby/ezICL5s6jnjhMcYMJ4nWzM89LWGfTLIbn7Fj0o2EYBOy5YVtUCCPmGLh0rxRNzz2Gp8XjmidMzuiwbNt7YFhA5lR66wGaVMf28oDrv8i34+X72Ztc3PdIpSC7vZqkyOJMqqkposTWEwDQ6lI8J82S503yN97zKiGpMdsY5jTyameE6bHB2wyNFitlzRJhnNWKks5UnQ0Rm7fy42urDXjeFpzZE/yqtJk/8whHmguwDAcX68kl8Lw0J1Vv/8szuJ+/BwB4cONECupTUl3zKXppiKrsJZtmmbaDKGSm4JSvbNJnu/INAFppDG6bUfaOE7yg/oL91wrpPmKAasRM3PSW2GIpkK+HVDebYzhf11jTmGQGpBBpI+D8JHvVZz88xa2flp6DmWr4pJlHOClpuoFMxAgTIlR2kwPKcBQeHPxZDq+3f3gL4XL5wZPpAygeOkUyc0ltb4Cee5ty2RdRU9qJqsq+tBFWnnULOb+/U9BqD15vp6gh/EtYtGOsRdcFUCaAU0pUIwUrf0eQRkwH7Sp1JpVr0wE2lV6phZd1s8+eMDZlIlF85kXUy9olGTpwpJ2Pfq2NDDa1MR0TLt/OYeO7/55D++N//S8MH+TOx8/++V2ReH3eRm1yqXvOY3NhNfQoblMxF5hDB+tNmEDm5KM557YRS05q/QhmoITgkR24sBrSGDEMevDbIoiGlPJ/yCmA/IAktPbHg+RX2em65bnC5f0DZOPQWJqeVg/x1s/z9/3vjy7h2nHe+MFollhZYJFQtWkWo8kCO4wk7DfoTWrMdv3uSatBmyMvjL1ExhaS2V/ymOz1pAvMnmUt5BJIxjZ5s0dJjb/d8/ml1pitrxPiotQqToZGj4fr
pKyRZB8DezR7Ri+5dwoJWJeBGld0QInI7wLN3M+GUbZPN8nUMojRrFv+jJ00i2XB2Nx60vf0lEtHR0fHtxTEL+nSd3R0dHT8fqJ76B0dHR0Hgm7QOzo6Og4E3aB3dHR0HAi6Qe/o6Og4EHSD3tHR0XEg6Aa9o6Oj40DQDXpHR0fHgaAb9I6Ojo4DQTfoHR0dHQeCbtA7Ojo6DgTdoHd0dHQcCLpB7+jo6DgQdIPe0dHRcSDoBr2jo6PjQNANekdHR8eBoBv0jo6OjgNBN+gdHR0dB4Ju0Ds6OjoOBN2gd3R0dBwIukHv6OjoOBB0g97R0dFxIOgGvaOjo+NA0A16R0dHx4Hg/wCtNMkyTzvEQgAAAABJRU5ErkJggg==\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x3ff9bb35208>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "# 译者注：载入npy格式的数组文件，这是numpy存储数组的一种格式\n",
    "# 如果是 kaggle 环境（数据在 ../input 目录下）\n",
    "x_l = np.load('../input/Sign-language-digits-dataset/X.npy')\n",
    "Y_l = np.load('../input/Sign-language-digits-dataset/Y.npy')\n",
    "# 如果是 github 下载的，请改用下面两行（数据在 input 目录下）\n",
    "#x_l = np.load('input/Sign-language-digits-dataset/X.npy')\n",
    "#Y_l = np.load('input/Sign-language-digits-dataset/Y.npy')\n",
    "img_size = 64\n",
    "plt.subplot(1, 3, 1)\n",
    "# 译者注：大家可以通过改变x_l序号，看看不同手势符号\n",
    "# 符号0图片\n",
    "plt.imshow(x_l[205].reshape(img_size, img_size))\n",
    "plt.axis('off')\n",
    "plt.subplot(1, 3, 2)\n",
    "# 符号1图片\n",
    "plt.imshow(x_l[823].reshape(img_size, img_size))\n",
    "plt.axis('off')\n",
    "plt.subplot(1, 3, 3)\n",
    "# 译者注：符号3图片，你还可以试试其他序号\n",
    "plt.imshow(x_l[1].reshape(img_size, img_size))\n",
    "plt.axis('off')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "1c9cf8f2-f318-4f69-b92e-9c1bfe78c2c7",
    "_uuid": "c12d1854c6f5fb43c8f081083a441585ed48dc93"
   },
   "source": [
    "* 为了创建图像数组，我把0和1的数组连接到一起。\n",
    "* 然后我为符号0图像数组创建了标签数组z，为符号1图像数组创建了标签数组o\n",
    "* 译者注：不熟悉numpy的初学者，可以搜索“numpy.reshape 例子”或者调用help(np.reshape)来看下它的用法 [python基础之numpy.reshape详解](https://www.jianshu.com/p/fc2fe026f002)"
   ]
  },
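  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "译者注（补充小例子）：下面用一个很小的数组演示 np.reshape 的用法。数字 2、3 只是演示用的形状，和本文的数据无关。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "a = np.arange(6)         # 一维数组 [0 1 2 3 4 5]\n",
    "b = a.reshape(2, 3)      # 变形为 2x3 的二维数组\n",
    "c = b.reshape(-1)        # -1 表示让 numpy 自动推算长度, 变回一维\n",
    "print(b.shape, c.shape)  # (2, 3) (6,)"
   ]
  },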
  {
   "cell_type": "code",
   "execution_count": 67,
   "metadata": {
    "_cell_guid": "88a9b18b-12c7-46d6-8370-e2f02b55dd7a",
    "_uuid": "ad417f6189b0d9476388d92b6be09c887f7f1a40",
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(410, 64, 64)\n",
      "(410, 1)\n"
     ]
    }
   ],
   "source": [
    "# Join a sequence of arrays along a row axis.\n",
    "# 沿行方向把一串数组连接起来\n",
    "X = np.concatenate((x_l[204:409], x_l[822:1027] ), axis=0) # 拼接后的 X 中, 索引 0 到 204 是符号“0”, 205 到 409 是符号“1”\n",
    "z = np.zeros(205)\n",
    "o = np.ones(205)\n",
    "Y = np.concatenate((z, o), axis=0).reshape(X.shape[0],1)\n",
    "# 译者笔记：这里将z和o连接起来,然后变成一个410x1的数组\n",
    "print(X.shape)\n",
    "print(Y.shape)\n",
    "# 去掉注释就可以阅读到np.reshape的帮助\n",
    "# help(np.reshape)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "d6024477-cf34-40f9-b079-8388b5146e1b",
    "_uuid": "a999c02e74af603d65a4442842d739aa351f593c"
   },
   "source": [
    "* X的形状是 (410, 64, 64)\n",
    "    * 410表示我们有410张照片 (标志0和标志1的照片)\n",
    "    * 64 表示我们照片的大小是64x64(64x64像素)\n",
    "* Y的形状是(410,1)\n",
    "    *  410 表示我们有410个标签(0和1组成) \n",
    "* 让我们把X,Y拆分成训练组和测试组\n",
    "    * 参数 test_size = 测试集所占的比例。这里测试集占 15%，训练集占 85%。（译者注：原文这里标注成75%是计算错误。）\n",
    "    * 参数 random_state = 42 表示我们使用相同的随机种子。这意味着反复划分训练集和测试集时，只要 random_state 参数值相同，得到的划分就总是相同的"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 64,
   "metadata": {
    "_cell_guid": "00db1f03-75d3-48dc-a6fb-824a58790d42",
    "_uuid": "82869efdb6890ede1899c1bc8c90fa8b1caceec3"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "348\n",
      "62\n"
     ]
    }
   ],
   "source": [
    "# 让我们创建数组 x_train, y_train, x_test, y_test \n",
    "from sklearn.model_selection import train_test_split\n",
    "X_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.15, random_state=42)\n",
    "number_of_train = X_train.shape[0]\n",
    "number_of_test = X_test.shape[0]\n",
    "print(number_of_train)\n",
    "print(number_of_test)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "4bb55e07-cabb-46ce-b21a-fe4a7b34995f",
    "_uuid": "e294dc952e7ba34569162a8965680ead8edd2834"
   },
   "source": [
    "* 现在我们有3维数组 X 作为输入，但是我们要把它降到2维，才能把它变成我们第一个深度学习模型的输入\n",
    "* 译者注：也就是通过 numpy.reshape 把 64x64 的二维数组变成长度为 4096 的一维数组。\n",
    "* 我们的标签数组(Y)已经是2维，所以我们不用动它。\n",
    "* 让我们把 X 数组（图像数组）变平吧。\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "_cell_guid": "f5937123-1e16-4036-844b-f498ed42e504",
    "_uuid": "f8b7203144bc85873d960f765e2bb3e711b57f30"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "X train flatten (348, 4096)\n",
      "X test flatten (62, 4096)\n"
     ]
    }
   ],
   "source": [
    "X_train_flatten = X_train.reshape(number_of_train,X_train.shape[1]*X_train.shape[2])\n",
    "X_test_flatten = X_test.reshape(number_of_test,X_test.shape[1]*X_test.shape[2])\n",
    "print(\"X train flatten\",X_train_flatten.shape)\n",
    "print(\"X test flatten\",X_test_flatten.shape)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "dab40a5b-e319-4807-a608-12065d860847",
    "_uuid": "cb26fc242ffcae8bf8f164d32dc9c5436e8312fc"
   },
   "source": [
    "* 如你所见, 我们的训练数组中有348张图像，每张图像有4096个像素。\n",
    "* 同时, 我们的测试数组有62张图像，每张图像也有4096个像素。\n",
    "* 然后，我们对矩阵做转置，也就是做行和列的交换。\n",
    "* 我知道你会问为什么要做转置，这里没有什么技术上的解释，你看到我后面写的代码就明白了。：）\n",
    "* 译者注：如果你和我一样已经把线性代数基本还给大学老师了，请看一下百度百科的转置条目，非常简单：https://baike.baidu.com/item/%E8%BD%AC%E7%BD%AE%E7%9F%A9%E9%98%B5"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 74,
   "metadata": {
    "_cell_guid": "ad9bee66-78f1-44ec-a114-465356b9cc7d",
    "_uuid": "88eef1b839ee51234a53da3ccb1e36a2e5e9a0e6"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "x train:  (4096, 348)\n",
      "x test:  (4096, 62)\n",
      "y train:  (1, 348)\n",
      "y test:  (1, 62)\n"
     ]
    }
   ],
   "source": [
    "x_train = X_train_flatten.T\n",
    "x_test = X_test_flatten.T\n",
    "y_train = Y_train.T\n",
    "y_test = Y_test.T\n",
    "print(\"x train: \",x_train.shape)\n",
    "print(\"x test: \",x_test.shape)\n",
    "print(\"y train: \",y_train.shape)\n",
    "print(\"y test: \",y_test.shape)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "0dfdae4c-5c3e-4ebd-8714-2ecf974ef2fa",
    "_uuid": "ed7b18eea8062e401823686bb2f672e1c548fac0"
   },
   "source": [
    "<font color='purple'>\n",
    "到现在我们都做了什么:\n",
    "* 为符号0和符号1创建了标签\n",
    "* 创建了已经降维的训练和测试组\n",
    "* 我们最终的输入(图像)和输出（标签和分类）看起来如图所示\n",
    "<a href=\"http://ibb.co/bWMK7c\"><img src=\"http://image.ibb.co/fOqCSc/3.png\" alt=\"3\" border=\"0\"></a>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "8b5f8812-f21c-4936-91b9-d33048dec40b",
    "_uuid": "5f037b5d3f44a9bf139dddb7946c0b33cf7d0298",
    "collapsed": true
   },
   "source": [
    "<a id=\"3\"></a> <br>\n",
    "# 逻辑回归\n",
    "* 当我们说到二元分类时（0和1的输出），我们就会立刻想到逻辑回归。\n",
    "* 译者注：如果你和我一样看到回归两个字就很晕，可以看一下关于逻辑回归的解释。\n",
    "（https://easyai.tech/ai-definition/logistic-regression/）\n",
    "* 然而，在深度学习指南中，逻辑回归是指什么呢？\n",
    "* 答案是逻辑回归是个非常简单的神经网络。\n",
    "* 顺便说一嘴，神经网络和深度学习就是同一件东西。当我们讲到人工神经网络的时候，我会解释“deep”“深度”这个名词的具体含义。\n",
    "* 为了能理解逻辑回归（简易深度学习），让我们先学习下计算图（computation graph）。 "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "9db2d7f4-0393-47d2-bbf3-88600177b4d1",
    "_uuid": "e2f49564d7d487417546d3b1ce94bc63e6394632"
   },
   "source": [
    "<a id=\"4\"></a> <br>\n",
    "##  计算图\n",
    "* 计算图是理解数学表达式的一种不错的方式。\n",
    "* 它看起来像数学表达式的可视化展示。\n",
    "* 比如说表达式 $$c = \\sqrt{a^2 + b^2}$$\n",
    "* 它的计算图如图所示. 如我们所见的，数学被用图形的方式表达出来\n",
    "<a href=\"http://imgbb.com/\"><img src=\"http://image.ibb.co/hWn6Lx/d.jpg\" alt=\"d\" border=\"0\"></a>"
   ]
  },
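  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "译者注（补充示例）：把上面的计算图逐个节点用代码算一遍，能更直观地理解“图就是计算”。这里假设 a=3、b=4，数值是随意选的。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import math\n",
    "a, b = 3, 4        # 输入节点\n",
    "a2 = a ** 2        # 节点: a 的平方\n",
    "b2 = b ** 2        # 节点: b 的平方\n",
    "s = a2 + b2        # 节点: 求和\n",
    "c = math.sqrt(s)   # 节点: 开平方, c = sqrt(a^2 + b^2)\n",
    "print(c)           # 5.0"
   ]
  },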
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "7c968bd6-39d6-4497-add1-6fd1eb331659",
    "_uuid": "5060aff503466a5c54bcf76a545484abda032e03"
   },
   "source": [
    "* 让我们来看看逻辑回归的计算图吧。\n",
    "<a href=\"http://ibb.co/c574qx\"><img src=\"http://preview.ibb.co/cxP63H/5.jpg\" alt=\"5\" border=\"0\"></a>\n",
    "    * 参数包括了 weight(权重) 和 bias(偏置) \n",
    "    * Weights权重: 每个像素的系数\n",
    "    * Bias偏置: 截距\n",
    "    * z = (w.T)x + b  => z equals (transpose of weights times input x) + bias \n",
    "    * z = (w.T)x + b  => z 等于 (权重的转置 乘以 输入x) + 偏置值    \n",
    "    * 换一种方式来说 => z = b + px1*w1 + px2*w2 + ... + px4096*w4096\n",
    "    * y_head = sigmoid(z)\n",
    "    * 把 z 传入激活函数 sigmoid，就能得到 0 和 1 之间的返回值。你可以在计算图中看到 sigmoid 函数\n",
    "* 为什么我们要使用sigmoid函数?\n",
    "    * 它给出了概率性的结果\n",
    "    * 它是可导的，我们可以在梯度下降算法用到它。(我们很快就会看到它)\n",
    "* 让我们来举个例子吧：\n",
    "    * 比方说我们算出 z=4，然后把 z 作为参数传给 sigmoid 函数，结果 y_head 的取值约是 0.98。它表示分类的结果有约 98% 的概率是 1。\n",
    "* 现在，让我们挨个仔细研究计算图的每个部分吧。"
   ]
  },
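  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "译者注（补充玩具示例）：用一个只有 4 个像素的“迷你图片”走一遍 z = (w.T)x + b 和 sigmoid。其中的像素值都是随意假设的，真实数据有 4096 个像素。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "x = np.array([[0.1], [0.2], [0.3], [0.4]])  # 玩具输入: 4 个像素, 1 张图片\n",
    "w = np.full((4, 1), 0.01)                   # 权重, 形状 (4,1)\n",
    "b = 0.0                                     # 偏置\n",
    "z = np.dot(w.T, x) + b                      # z = (w.T)x + b, 形状 (1,1)\n",
    "y_head = 1 / (1 + np.exp(-z))               # sigmoid, 得到 0~1 之间的概率\n",
    "print(z[0, 0], y_head[0, 0])"
   ]
  },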
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "b9ec7e1d-186b-4911-be80-bc856f43b689",
    "_uuid": "40c4a1af2372960d36b6649f5baf4915e9e16738"
   },
   "source": [
    "<a id=\"5\"></a> <br>\n",
    "## 参数初始化\n",
    "* 从前面的数据准备阶段，你知道我们的输入是4096像素的图片数组。（图片保存在 x_train 中）\n",
    "* 每个数组有自己对应的权重。\n",
    "* 第一步是每个像素乘以自己的权重。\n",
    "* 但是，问题是权重的初始值应该是多少呢？ \n",
    "    * 关于权重初始化的一些技术，我会在人工神经网络部分解释。这一次，我们先把初始权重都设为 0.01 \n",
    "    * 好的，权重是0.01，那么权重数组的形状是什么样的？如你对计算图所理解的那样，它是(4096,1)\n",
    "    * 译者注：我们的训练集的形状是x train:  (4096, 348)\n",
    "    * 并且初始偏置值是0.\n",
    "* 让我们来写代码吧。为了能在人工神经网络(ANN)等章节中反复使用这些代码，我定义了一个方法(method)。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 83,
   "metadata": {
    "_cell_guid": "74d461fc-4aa9-4b76-bd26-43551cef138b",
    "_uuid": "f3be95b8ca86fea08336badabf563b5f8f59a142"
   },
   "outputs": [],
   "source": [
    "# 简单的定义方法示例\n",
    "# 译者注：如果你已经学会 python, 不用管他\n",
    "def dummy(parameter):\n",
    "    dummy_parameter = parameter + 5\n",
    "    return dummy_parameter\n",
    "result = dummy(3)     # result = 8\n",
    "\n",
    "# 让我们初始化参数吧\n",
    "# 我们的初始化方法需要一个维度参数 dimension（这里是4096，即每张图片的像素个数）\n",
    "# 译者注：initialize_weights_and_bias函数接受dimension做为参数，比如 dimension=4096, \n",
    "# 然后通过 np.full 创建一个形状为 (4096,1) 的数组，并把所有元素赋值为0.01\n",
    "# 译者注：np.full 返回一个根据指定shape和type,并用fill_value填充的新数组。\n",
    "def initialize_weights_and_bias(dimension):\n",
    "    w = np.full((dimension,1),0.01)\n",
    "    b = 0.0\n",
    "    return w, b\n",
    "#help(np.full)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 81,
   "metadata": {
    "_uuid": "d71d5b420ee146833d479781ea889eeebcb74e47"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(4096, 1)\n",
      "0.0\n"
     ]
    }
   ],
   "source": [
    "w,b = initialize_weights_and_bias(4096)\n",
    "# 译者注：我们来看看参数的形状吧\n",
    "print(w.shape)\n",
    "print(b)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "1fbf27d5-1f4b-4b94-a757-99e2becf54a0",
    "_uuid": "dde6fab55966b2de3dc808e03613df9bea7bb0d2"
   },
   "source": [
    "<a id=\"6\"></a> <br>\n",
    "## 前向传播\n",
    "* 从像素（输入）一路计算到代价(cost)的全部步骤，都可以称为前向传播。\n",
    "    * z = (w.T)x + b => 在这个等式中我们知道x是像素数组，我们知道w(权重)和b(偏置值)，所以我们就只要通过计算就能知道z的值 (T是transpose转置运算)\n",
    "    * 然后，我们将z代入sigmoid函数中，它返回一个y_head值（可能性）。如果你的思路看到这里感到混乱，那么就再去看一眼计算图。sigmoid函数的公式已经在计算图中表示出来。\n",
    "    * 然后我们运算损失loss(error)函数\n",
    "    * 代价函数是所有损失的总和 \n",
    "    * 让我们从计算 z 和编写 sigmoid 方法开始吧。sigmoid 把 z 作为输入参数，返回 y_head 值（可能性）。 "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 84,
   "metadata": {
    "_cell_guid": "697be401-792b-46c0-8fe6-79cd65110419",
    "_uuid": "e024479bce9ce2022f65ffd586a6c29b124e7ab5"
   },
   "outputs": [],
   "source": [
    "# 计算z值\n",
    "#z = np.dot(w.T,x_train)+b\n",
    "def sigmoid(z):\n",
    "    y_head = 1/(1+np.exp(-z))\n",
    "    return y_head"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 85,
   "metadata": {
    "_uuid": "da7e0244449200eaf382507806f01901f1d281c8"
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "0.5"
      ]
     },
     "execution_count": 85,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "y_head = sigmoid(0)\n",
    "y_head"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "571dc02a-b25d-4726-ad2b-9661ae6783e2",
    "_uuid": "d67ca31d01cc8f03fdd862789454e4acf1b2033b"
   },
   "source": [
    "我们写了sigmoid方法，也试着用它计算了y_head值。让我们来学习下损失函数吧\n",
    "* 译者注：参考 \n",
    "[分类问题之损失函数](https://zh.wikipedia.org/wiki/%E5%88%86%E9%A1%9E%E5%95%8F%E9%A1%8C%E4%B9%8B%E6%90%8D%E5%A4%B1%E5%87%BD%E6%95%B8)\n",
    "* 让我们来举个例子吧：我把一张图片作为输入，把它乘以相应的权重、加上偏置值，得到 z；再把 z 代入 sigmoid 方法，得到 y_head 值。到这里为止，我们都知道自己在做什么。接着，比方说 y_head 是 0.9，比 0.5 大，所以我们预测图像是符号 1。OK，一切看起来都很好。但是，我们的预测正确吗？我们怎么检查它是对是错呢？答案是：通过损失函数：\n",
    "    * 损失函数的数学表达式如下\n",
    "    <a href=\"https://imgbb.com/\"><img src=\"https://image.ibb.co/eC0JCK/duzeltme.jpg\" alt=\"duzeltme\" border=\"0\"></a>\n",
    "    * 它表示：如果你做了错误的预测，损失（误差）就会变大。\n",
    "        * 例子：我们真实的图像是符号 1，它的标签是 1 (y=1)，而我们的预测值 y_head=1。当我们把 y 和 y_head 代入损失公式，结果就是 0。做出正确预测时损失是 0；然而如果做出错误的预测，比如 y_head=0，损失就是无穷大。\n",
    "* 代价函数是损失的总和：每张图片都有自己的损失，把所有图片的损失加起来就得到代价函数。        \n",
    "* 让我们来实现前向传播。\n"
   ]
  },
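  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "译者注（补充示例）：下面直接算几组损失值，感受一下“预测越错、损失越大”。0.9、0.1 这些 y_head 取值都是随意假设的。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "def log_loss(y, y_head):\n",
    "    # 损失函数: -[ y*log(y_head) + (1-y)*log(1-y_head) ]\n",
    "    return -(y * np.log(y_head) + (1 - y) * np.log(1 - y_head))\n",
    "print(log_loss(1, 0.9))   # 真实标签是1, 预测接近1 -> 损失很小\n",
    "print(log_loss(1, 0.1))   # 真实标签是1, 预测接近0 -> 损失很大"
   ]
  },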
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "_cell_guid": "adbc7d22-8bba-48c1-b754-75c9d4f0543c",
    "_uuid": "447f1ee51819fdfeba0e1ce0b03449cb549510ed"
   },
   "outputs": [],
   "source": [
    "# 前向传播步骤:\n",
    "# 计算 z = w.T*x+b\n",
    "# y_head = sigmoid(z)\n",
    "# loss(error) = loss(y,y_head)\n",
    "# cost = sum(loss)\n",
    "def forward_propagation(w,b,x_train,y_train):\n",
    "    z = np.dot(w.T,x_train) + b\n",
    "    y_head = sigmoid(z) # probabilistic 0-1\n",
    "    loss = -y_train*np.log(y_head)-(1-y_train)*np.log(1-y_head)\n",
    "    cost = (np.sum(loss))/x_train.shape[1]      # x_train.shape[1]  is for scaling\n",
    "    return cost "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "b4700f54-c2c5-4a53-ad1f-7fd63f5cd2e9",
    "_uuid": "e577873215128f94fea2eb328de90a4f32b7f803"
   },
   "source": [
    "<a id=\"7\"></a> <br>\n",
    "##  通过梯度下降优化算法\n",
    "* 好的，现在我们知道代价（cost）衡量的就是误差 \n",
    "* 因此，我们需要降低代价，因为代价高就表示我们做出了错误的预测 \n",
    "* 让我们来想想第一步：一切都从初始化权重和偏置值开始，因此代价依赖于它们。\n",
    "* 为了降低代价，我们需要更新权重和偏置值。\n",
    "* 换句话说，我们的模型需要学习出能最小化代价函数的权重和偏置值参数。这项技术被称为梯度下降。\n",
    "* 让我们举个例子:\n",
    "    * 我们设置 w=5、bias=0（这里先忽略偏置值）。然后我们进行前向传播，得到的代价是 1.5。 \n",
    "    \n",
    "    * 它看起来像这样 (红线)\n",
    "    <a href=\"http://imgbb.com/\"><img src=\"http://image.ibb.co/dAaYJH/7.jpg\" alt=\"7\" border=\"0\"></a>\n",
    "    * 如你在这幅图中所见，我们当前的结果并不在代价函数的最低点，因此需要向最低点移动。好吧，让我们来更新权重。（符号 := 表示更新操作） \n",
    "    * w := w - step. 问题是这里的步长(step)是什么？此时，步长就是图中的斜率 slope1。为了走向最低点，我们沿着斜率更新。比如 slope1 = 3，我们更新权重：w := w - slope1 => w = 2。\n",
    "    * 现在我们的权重 w 是2，如你记得那样， 我们需要继续前向传播寻找最低代价值(cost)\n",
    "    * 比如说根据前向传播，我们算出 w=2 时代价函数取值是 0.4。代价变小了，看来我们做对了。我们得到了新的权重值，代价是 0.4。这样够了吗？事实上我并不知道，所以我们再试一步。\n",
    "    * Slope2 = 0.7 且 w = 2。让我们来更新权重：w := w - step(slope2) => w = 1.3。权重现在是 1.3，让我们找出新的代价取值。\n",
    "    * 再做一次前向传播，w=1.3 时代价 cost=0.3。好的，我们的代价变得更低了，看起来不错。但它是不是已经足够好了，还是需要再走一步呢？答案还是我不知道，让我们再试试。\n",
    "    * Slope3 = 0.01 并且 w = 1.3. 更新权重 w := w - step(slope3) => w = 1.29 ~ 1.3. 这次权重并没有变化，因为我们找到了代价函数的最低点。 \n",
    "    * 这一切看起来都不错，但是我们是怎么确定斜率的呢? 如果你还记得高中或者大学的课程，为了找到函数在某一指定点的斜率（代价函数的斜率），我们对这一点求导数. 也许你还会问，好吧我知道怎么求斜率了，但是怎么知道该往那边走呢？你会说也许顺着斜率，代价值也可能变大而不是走向最低点。我们的答案是斜率（导数）同时给出了距离和方向。因此不用担心。\n",
    "    * 更新的等式如下。它表示：代价函数由权重和偏置值计算得到；把代价函数分别对权重和偏置值求导，乘以学习率 α，再用结果去更新权重和偏置值。（为了便于解释，这里忽略了偏置值，但所有步骤同样适用于偏置值）\n",
    "    <a href=\"http://imgbb.com/\"><img src=\"http://image.ibb.co/hYTTJH/8.jpg\" alt=\"8\" border=\"0\"></a>\n",
    "    * 现在，我确定你会问什么是学习率，我前面从没有提到过。它是个非常直白的术语：它决定了学习的速度。不过，这里存在“学得快”和“学不到”之间的权衡。比如说你在巴黎（当前的代价），想去马德里（最小代价）。如果你的速度（学习率）很慢，你会花很长时间才能到达马德里；另一方面，如果速度（学习率）太大，你可能一路冲过头甚至撞车，永远到不了马德里。因此，我们要明智地选择速度（学习率）\n",
    "    * 学习率也称为超参数(hyperparameter)，需要被选择和调校。我会在人工神经网络部分更详细地介绍它和其他超参数。现在，我们假设前面例子中的学习率为 1\n",
    "    \n",
    "* 我想现在你已经理解了前向传播（从权重和偏置值算到代价）和后向传播（根据代价更新权重和偏置值）的逻辑了，也学习了梯度下降。在实现代码前，你还要学习另外一件事情：怎么根据权重和偏置值对代价函数求导。这和Python或者编程无关，是纯数学。有两个选择：第一个是自己用 Google 搜索怎么对 log loss 函数求导；第二个是直接查 log loss 函数的导数是什么。我选择第二种，因为不把式子写出来，我没法解释清楚数学 ：）\n",
    "\n",
    "$$ \\frac{\\partial J}{\\partial w} = \\frac{1}{m}x(y_{head} - y)^T$$\n",
    "$$ \\frac{\\partial J}{\\partial b} = \\frac{1}{m} \\sum_{i=1}^m (y_{head}-y)$$"
   ]
  },
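  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "译者注（补充玩具示例）：用一个一维的玩具代价函数 J(w)=(w-1.3)² 模拟上面“w 从 5 一路降到 1.3”的过程。这个函数形式和学习率 0.4 都是为演示随意选的，并不是本文真正的代价函数。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 玩具代价函数 J(w) = (w - 1.3)**2, 最低点在 w = 1.3\n",
    "w = 5.0\n",
    "learning_rate = 0.4\n",
    "for i in range(10):\n",
    "    slope = 2 * (w - 1.3)              # 导数(斜率) dJ/dw\n",
    "    w = w - learning_rate * slope      # w := w - 学习率 * 斜率\n",
    "print(round(w, 3))                     # 收敛到最低点附近"
   ]
  },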
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {
    "_cell_guid": "2b74bb7c-bcbe-4865-ac61-25ced274e7de",
    "_uuid": "088f192f99d2a31f041116c893ec713ef3b4fe43"
   },
   "outputs": [],
   "source": [
    "# 在后向传播中，我们会使用前向传播得到的 y_head \n",
    "# 因此，我们把前向传播和后向传播合并在一个函数里，而不是再单独写一个后向传播的函数。\n",
    "def forward_backward_propagation(w,b,x_train,y_train):\n",
    "    # 前向传播\n",
    "    z = np.dot(w.T,x_train) + b\n",
    "    y_head = sigmoid(z)\n",
    "    loss = -y_train*np.log(y_head)-(1-y_train)*np.log(1-y_head)\n",
    "    cost = (np.sum(loss))/x_train.shape[1]      # x_train.shape[1]  is for scaling\n",
    "    # 后向传播\n",
    "    derivative_weight = (np.dot(x_train,((y_head-y_train).T)))/x_train.shape[1] # x_train.shape[1]  is for scaling\n",
    "    derivative_bias = np.sum(y_head-y_train)/x_train.shape[1]                 # x_train.shape[1]  is for scaling\n",
    "    gradients = {\"derivative_weight\": derivative_weight,\"derivative_bias\": derivative_bias}\n",
    "    return cost,gradients"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "d82dbae8-d11e-4ea8-bbfe-545fee3166b3",
    "_uuid": "9e4d028259897e1565341736d58e46f70ea6b312"
   },
   "source": [
    "* 到这里，我们学习了：\n",
    "    * 初始化参数 (已经实现)\n",
    "    * 通过前向传播和代价函数寻找代价 (已经实现)\n",
    "    * 更新（学习）参数（权重和偏置值）。让我们来实现它。\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 86,
   "metadata": {
    "_cell_guid": "0940e18d-636d-4503-a2c6-1994024726fe",
    "_uuid": "31299bda686ae2ab157ea18df70e4741e5225345"
   },
   "outputs": [],
   "source": [
    "# 更新学习参数\n",
    "def update(w, b, x_train, y_train, learning_rate,number_of_iteration):\n",
    "    cost_list = []\n",
    "    cost_list2 = []\n",
    "    index = []\n",
    "    # 更新学习参数需要number_of_iteration次迭代\n",
    "    for i in range(number_of_iteration):\n",
    "        # 做前向和后向传播来寻找代价和梯度\n",
    "        cost,gradients = forward_backward_propagation(w,b,x_train,y_train)\n",
    "        cost_list.append(cost)\n",
    "        # 让我们来做更新\n",
    "        w = w - learning_rate * gradients[\"derivative_weight\"]\n",
    "        b = b - learning_rate * gradients[\"derivative_bias\"]\n",
    "        if i % 10 == 0:\n",
    "            cost_list2.append(cost)\n",
    "            index.append(i)\n",
    "            print (\"Cost after iteration %i: %f\" %(i, cost))\n",
    "    # 我们更新了学习参数权重 weights 和偏置值 bias\n",
    "    parameters = {\"weight\": w,\"bias\": b}\n",
    "    plt.plot(index,cost_list2)\n",
    "    plt.xticks(index,rotation='vertical')\n",
    "    plt.xlabel(\"Number of Iteration\")\n",
    "    plt.ylabel(\"Cost\")\n",
    "    plt.show()\n",
    "    return parameters, gradients, cost_list\n",
    "#parameters, gradients, cost_list = update(w, b, x_train, y_train, learning_rate = 0.009,number_of_iteration = 200)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "202a05c6-2187-40eb-9733-e4da792f04b6",
    "_uuid": "1892ccefd7debe1e9f59b500873b7afded3a4a01"
   },
   "source": [
    "* 哇，我觉得有点累了 :) 到这里为止，我们已经学到了参数，这意味着我们拟合(fit)了数据 \n",
    "* 参数已经学好了，可以用来做预测了。那么，我们开始预测吧。\n",
    "* 在预测步骤中，我们用测试集 x_test 做为输入，对它做前向传播。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 87,
   "metadata": {
    "_cell_guid": "1fd35d0f-e989-49fa-9bcb-1769d92a5ab4",
    "_uuid": "c9b80f081f49a722818cfebfc01e8b8dc69db9c1"
   },
   "outputs": [],
   "source": [
    " # 预测\n",
    "def predict(w,b,x_test):\n",
    "    # x_test 是前向传播的输入\n",
    "    z = sigmoid(np.dot(w.T,x_test)+b)\n",
    "    Y_prediction = np.zeros((1,x_test.shape[1]))\n",
    "    # 如果 z 大于 0.5, 我们的预测结果是符号1 (y_head=1),\n",
    "    # 如果 z 小于 0.5, 我们的预测结果是符号0 (y_head=0),\n",
    "    for i in range(z.shape[1]):\n",
    "        if z[0,i]<= 0.5:\n",
    "            Y_prediction[0,i] = 0\n",
    "        else:\n",
    "            Y_prediction[0,i] = 1\n",
    "\n",
    "    return Y_prediction\n",
    "# predict(parameters[\"weight\"],parameters[\"bias\"],x_test)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "ce815319-161b-408b-84fe-4bb1fab92d13",
    "_uuid": "40dbb73794b6b742b01e33038284aa9f11cc698c"
   },
   "source": [
    "* 我们做了预测。\n",
    "* 现在让我们把所有工作串起来吧。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 90,
   "metadata": {
    "_cell_guid": "029e2cd3-125b-4ca4-8b94-25d8164ba1a7",
    "_uuid": "81fb6989ff3860d72462f8212b1d00325272a471"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Cost after iteration 0: 14.014222\n",
      "Cost after iteration 10: 2.544689\n",
      "Cost after iteration 20: 2.577950\n",
      "Cost after iteration 30: 2.397999\n",
      "Cost after iteration 40: 2.185019\n",
      "Cost after iteration 50: 1.968348\n",
      "Cost after iteration 60: 1.754195\n",
      "Cost after iteration 70: 1.535079\n",
      "Cost after iteration 80: 1.297567\n",
      "Cost after iteration 90: 1.031919\n",
      "Cost after iteration 100: 0.737019\n",
      "Cost after iteration 110: 0.441355\n",
      "Cost after iteration 120: 0.252278\n",
      "Cost after iteration 130: 0.205168\n",
      "Cost after iteration 140: 0.196168\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAETCAYAAAA7wAFvAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4xLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvAOZPmwAAIABJREFUeJzt3Xl0ZGd95vHvr6q0lfaS1KukVntrg5tgoAkcIAmEJU7CYMKQGA8kJvGMZyYzgZAhgIc52YeYkGTgJCGZDhBDQsyBAIEQ8ALYQBhjW25v3W7vvbh3tXqRWi2pJdVv/rhXUqlaa0m3rlT3+ZxTp0r3Xr337ZJaT733Xa65OyIiklypuCsgIiLxUhCIiCScgkBEJOEUBCIiCacgEBFJOAWBiEjCKQhERBJOQSAiknAKAhGRhMvEXYHFaG9v956enrirISKypjz44IMn3b1joePWRBD09PTQ29sbdzVERNYUMzuwmON0aUhEJOEUBCIiCacgEBFJOAWBiEjCKQhERBIusiAws8+Y2Qkz2z3LvvebmZtZe1TnFxGRxYmyRXArcE3xRjPrAt4IHIzw3CIiskiRBYG7fx84Ncuu/wN8AIj8HpnffeI4n7znmahPIyKyppW1j8DM3gIcdvdHFnHsTWbWa2a9fX19JZ3v357u5y++8wy6L7OIyNzKFgRmlgU+DPzOYo53953uvsPdd3R0LDhDelbduTqGxyY4ee5CSd8vIpIE5WwRXApsBR4xs/1AJ7DLzDZEdcLutiwAB0+dj+oUIiJrXtmCwN0fc/d17t7j7j3AIeCl7n4sqnN254IgeF5BICIypyiHj94G3AtsM7NDZnZjVOeaS2ergkBEZCGRrT7q7tcvsL8nqnNPqq1Ks76pRpeGRETmUfEzi7tzWQWBiMg8Kj4IunJZXRoSEZlH5QdBa5ajAyOMjk/EXRURkVWp4oOgO5fFHQ6fHo67KiIiq1LlB4HmEoiIzKvyg0BzCURE5lXxQdDRUENNJqUWgYjIHCo+CFIpo0tDSEVE5lTxQQCTcwnUWSwiMpvEBMHzp85rOWoRkVkkIgi6clnOjY5z5vxY3FUREVl1EhEEkyOH1E8gInIxBYGISMIlIgi6cnWAgkBEZDaJCIJsdYb2hhpNKhMRmUUiggCCVoFaBCIiF0tMEOi+BCIis0tUEBw5M8zYRD7uqoiIrCqJCYKuXJa8w5EzmmEsIlIoypvXf8bMTpjZ7oJtHzOzJ8zsUTP7qpm1RHX+YhpCKiIyuyhbBLcC1xRtuwvY7u4/BjwF3Bzh+WdQEIiIzC6yIHD37wOnirbd6e7j4Zc/AjqjOn+x9U21VKe1HLWISLE4+wh+DfhWuU6WThmdrXWaSyAiUiSWIDCzDwPjwOfnOeYmM+s1s96+vr4VOW9XLsvzWo5aRGSGsgeBmd0AvBl4p8+zLrS773T3He6+o6OjY0XOrbkEIiIXK2sQmNk1wAeBt7h72f8id+eynB0e46yWoxYRmRLl8NHbgHuBbWZ2yMxuBP4SaATuMrOHzexvojr/bLomb2R/Wq0CEZFJmagKdvfrZ9n86ajOtxiFq5Bu39wcZ1VERFaNxMwshukWgfoJRESmJSoImmqraM1WKQhERAokKghg+kb2IiISSFwQdGkIqYjIDIkLgu5clsOnhxnXctQiIkBCg2A87xw9OxJ3VUREVoVEBgGgfgIRkVDigkCTykREZkpcEGxsriWTMnUYi4iEEhcEmXSKza11HNQqpCIiQAKDALQKqYhIoUQGQZcmlYmITElmELRmOTV0gcERLUctIpLIIJgeQqp+AhGRRAeB+glERBIeBOonEBFJaBA0Z6toqs2oRSAiQkKDAKC7TUNIRUQgyUGgIaQiIkCCg6Arl+XQ6WHyeY+7KiIisYosCMzsM2Z2wsx2F2zLmdldZvZ0+Nwa1fkX0p3LcmEiz/FBLUctIskWZYvgVuCa
om0fAr7j7pcD3wm/jsXUENJ+XR4SkWSLLAjc/fvAqaLN1wKfDV9/FnhrVOdfiOYSiIgEyt1HsN7djwKEz+vKfP4pm1rqSJnmEoiIrNrOYjO7ycx6zay3r69vxcuvSqfY1FKnFoGIJF65g+C4mW0ECJ9PzHWgu+909x3uvqOjoyOSynS1ai6BiEi5g+DrwA3h6xuAr5X5/DME9yXQwnMikmxRDh+9DbgX2GZmh8zsRuAW4I1m9jTwxvDr2HS3ZTl5bpTzF8bjrIaISKwyURXs7tfPsev1UZ1zqboKlqPetqEx5tqIiMRj1XYWl4OGkIqIKAgABYGIJFuig6A1W0VDTUZzCUQk0RIdBGamG9mLSOIlOggAunOaVCYiyaYgyAWTyty1HLWIJJOCIJdldDxP3+Bo3FUREYlF4oOgSyOHRCThEh8EGkIqIkmX+CDY3FqHmYJARJIr8UFQk0mzoalWQSAiiZX4IAA0l0BEEk1BwPQQUhGRJFIQEATB8YFRRsYm4q6KiEjZKQiYHjl06LRaBSKSPAoCNJdARJJNQcB0i+B53bZSRBJIQQC0N1RTV5VWi0BEEklBQLActUYOiUhSKQhCmksgIkkVSxCY2fvMbI+Z7Taz28ysNo56FNJy1CKSVGUPAjPbDLwH2OHu24E08I5y16NYd66O8xcm6B+6EHdVRETKKq5LQxmgzswyQBY4ElM9pmgIqYgkVdmDwN0PA38KHASOAmfd/c7i48zsJjPrNbPevr6+yOs1PYRUQSAiyRLHpaFW4FpgK7AJqDezdxUf5+473X2Hu+/o6OiIvF6drWGLoF9BICLJEseloTcA+9y9z93HgK8Ar4qhHjPUVadZ11ijS0MikjhxBMFB4JVmljUzA14P7I2hHhfRXAIRSaJFBYGZ/f1iti2Gu98H/BOwC3gsrMPOUspaad2aSyAiCbTYFsFVhV+YWRp4Wakndfffdfcr3X27u/+yu4+WWtZK6splOTowwoXxfNxVEREpm3mDwMxuNrNB4MfMbCB8DAIngK+VpYZl1J3L4g6Hz2jxORFJjnmDwN3/2N0bgY+5e1P4aHT3Nne/uUx1LJvuNs0lEJHkWeyloW+YWT2Amb3LzP7czLZEWK9YdGtSmYgk0GKD4K+B82b2YuADwAHgc5HVKiYdDTXUZFLqMBaRRFlsEIx7sBrbtcAn3P0TQGN01YpHKmV05bKaVCYiiZJZ5HGDZnYz8MvAT4Sjhqqiq1Z8ulrrdGlIRBJlsS2C64BR4Nfc/RiwGfhYZLWK0eRcAi1HLSJJsaggCP/4fx5oNrM3AyPuXnF9BBDMJRgcHefM+bG4qyIiUhaLnVn8S8D9wC8CvwTcZ2Zvj7JicdHIIRFJmsX2EXwYeLm7nwAwsw7g2wRLRVSUwrkEL+5qibk2IiLRW2wfQWoyBEL9S/jeNaWrVS0CEUmWxbYIbjezO4Dbwq+vA74ZTZXiVV+Tob2hWnMJRCQx5g0CM7sMWO/uv21mbwNeAxhwL0HncUXqymV5/rSCQESSYaHLOx8HBgHc/Svu/lvu/j6C1sDHo65cXHRfAhFJkoWCoMfdHy3e6O69QE8kNVoFunNZjpwZYWxCy1GLSOVbKAhq59lXt5IVWU26clkm8s7RMyNxV0VEJHILBcEDZvafijea2Y3Ag9FUKX6aSyAiSbLQqKHfBL5qZu9k+g//DqAa+IUoKxYnBYGIJMm8QeDux4FXmdnrgO3h5n919+9GXrMYrW+qpSptCgIRSYRFzSNw97uBu1fqpGbWAnyKIFycYDG7e1eq/OVKp4zOVt3IXkSSYbETylbaJ4Db3f3tZlYNZGOqx5y6NIRURBKi7MtEmFkT8JPApwHc/YK7nyl3PRbSndN9CUQkGeJYL+gSoA/4OzN7yMw+NXk/5NWkO5fl7PAYZ7UctYhUuDiCIAO8FPhrd38JMAR8qPggM7vJzHrNrLevr6/cdZwaOaSlJkSk0sURBIeAQ+5+X/j1PxEEwwzuvtPd
d7j7jo6OjrJWEII+AkAdxiJS8coeBOHdzp43s23hptcDj5e7Hgvp0lwCEUmIuEYN/Qbw+XDE0HPAr8ZUjzk11VbRmq1SEIhIxYslCNz9YYIZyquaViEVkSSoyLuMrZSunCaViUjlUxDMozuX5dDpYSbyHndVREQioyCYR3cuy3jeOXp2OO6qiIhERkEwD40cEpEkUBDMo1tzCUQkARQE89jYXEs6peWoRaSyKQjmkUmn2NxSx8FT6iMQkcqlIFiA5hKISKVTECygK5flkIJARCqYgmAB3bks/UMXODc6HndVREQioSBYgEYOiUilUxAsoFtzCUSkwikIFqAWgYhUOgXBApqzVTTVZtQiEJGKpSBYhO42DSEVkcqlIFgEzSUQkUqmIFiErtYsh04Nk9dy1CJSgRQEi9CVy3JhIs/xwZG4qyIisuIUBIswNYS0X5eHRKTyKAgWQXMJRKSSxRYEZpY2s4fM7Btx1WGxNrXUkTLNJRCRyhRni+C9wN4Yz79o1ZkUG5vreP60lqMWkcoTSxCYWSfw88Cn4jh/KTSEVEQqVVwtgo8DHwDyMZ1/yRQEIlKpyh4EZvZm4IS7P7jAcTeZWa+Z9fb19ZWpdnPrbsvSNzjK8IWJuKsiIrKi4mgRvBp4i5ntB74A/LSZ/UPxQe6+0913uPuOjo6OctfxIl2Ti8+dVqtARCpL2YPA3W9290537wHeAXzX3d9V7nosleYSiEil0jyCRdJcAhGpVJk4T+7u9wD3xFmHxWrNVlFfnVYQiEjFUYtgkcyMrlxWk8pEpOIoCJZAQ0hFpBIpCJZgMgjctRy1iFQOBcESdLdlGR3P0zc4GndVRERWjIJgCTSXQEQqkYJgCTSEVEQqkYJgCTa31GEGB/u1CqmIVA4FwRLUVqXZ0FSrFoGIVBQFwRJpLoGIVBoFwRJpLoGIVBoFwRJ157IcGxhhZEzLUYtIZVAQLNHkyKFDum2liFQIBcESdeXqAN3IXkQqh4Jgibo0l0BEKkysy1CvRR0NNdRWpbh99zGGLowzNu6M5/NcmMgzPuGMTeTDR/B6fMK5EG6bfD1esH/y2PGJPON5J1dfTWdrHZta6tjcEjxvaqmjs7WOjoYaUimL+y0QkQqjIFgiM+Pqrhbufa6fe5/rD7dBVTpFdTpFJm0Xvc6kjOpMaup1fU1m6nVVJkVVKjguZcbJc6McOj3M/ftOMTAyPuPcVWljQ3PtVEBsLgqLzS111FWn43hbRGQNUxCU4B9ufAXDYxNUpYM/7umIPqUPjoxx5MwIR84Mc+jMMEfCx+HTw/zo2X6ODYyQL1oINVdfzaaWmWHR2VpHd66eLW1Z6mv0IxeRmfRXoQSZdIrGdPTdK421VWzbUMW2DY2z7h+byHN8YIQjZ0Y4fOZ8+BwExXN9Q/zg6ZOcvzBzmGtHYw09bVm6c/X0tGXZ0h4+t9XTXFcV+b9JRFYfBcEaVpVO0dmapbM1C+Qu2u/unB0e49DpYQ70n2d//xAH+oc40H+eHz5zki/vGplxfGu2ii1tQcthS9t0QPS0ZcnVV2Om/gmRSlT2IDCzLuBzwAYgD+x090+Uux5JYGa0ZKtpyVazfXPzRfuHL0xw8NR0QOzvP8/B/vM8eOA0//LIkRmXnRprMmxpz7IlvMTUEwZGT3s96xprFBIia1gcLYJx4H+4+y4zawQeNLO73P3xGOqSaHXVabZtaJz10tPo+ETYkhhi/8nzU4Hx+NEB7thzjPGClKirSrOlLcvW9nq2tNWztT0bPiskRNaCsgeBux8FjoavB81sL7AZUBCsIjWZNJd2NHBpR8NF+8Yn8hw5M8K+yZbEySAknjw+yLf3Hmds4uKQ6Gmrpyfsjwie61nfpJAQWQ1i7SMwsx7gJcB9cdZDliaTTtHdlqW7LQt0zNg3PpHn6NkR9p0MQmLfyfMc6B/iqRODfOeJmSFRW5WacYmppy18tGdZ31irORMiZRJbEJhZA/Bl4DfdfWCW/TcBNwF0
d3eXuXZSqkw6RVcuG87AnhkSE3nnyJlh9vcPsf/kdEg8feIc333ixEUhMdUf0T6zX2Jjc11kQ3ZFksjcfeGjVvqkZlXAN4A73P3PFzp+x44d3tvbG33FJDaTIbHv5BAHTp3nwMmg8/pAf/D1hfH81LHV6RRdubqpPonp0U31bGqpJVOGob0ia4GZPejuOxY6Lo5RQwZ8Gti7mBCQZEinrKAlMVM+7xwbGAlbEufDEU7BMNh/e+YkI2PTIZEJy5nql5iaK1FPZ2sdVQoJkYvEcWno1cAvA4+Z2cPhtv/p7t+MoS6yBqRSNrWMxqsunbnP3TkxOMr+k0MFcyWC5wf2nWKoYEJdOmV0ttbRE45omgyJrWFIqCUhSRXHqKF/A3SBV1aEmbG+qZb1TbW84pK2Gfvcnf6hC1Od1vtPDrEv7J/o3T8zJCZbEpOjmraGrYit7fVsalGfhFQ2zSyWimVmtDfU0N5Qw8u2zJx57e6cPHeB/f1D7DsZhMP+MDDu23dqxtIcVekgJLZODoENWxE97Vk2NddpdJOseQoCSSQzo6Oxho7GGl7ec3FInBgcnTEEdjIofvjszD6J6kyKS9rrwzkX9Vy6Lph7cUlHPdlq/feStUG/qSJFCi83vbLoclM+7xwfHAlbEefZd/Icz/UNsefIWb61++iMZTk2NddOBUPwXM9lHQ10aLa1rDIKApElSKWMjc11bGy+uON6dHyCA/3nefbEOZ7tO8ezfUM823eOL/U+P6M/orEmwyVhMEzO3r5sXT3duXqqM+qwlvJTEIiskJpMmivWN3LF+plrN7k7xwdGw3A4FwbFEPc+289Xdh2eOi6dMrbkslzS0cBl6xq4MlwH6pKOemoyuuGQREdBIBIxs+DOchuaa3n1Ze0z9p0bHWdf2HKYfDxz4hzfe2p6pnUmZWxtrw8WCFwfhMOVG5robFVHtawMBYFIjBpqMryos5kXdc5cJnxsIs++k0M8eWyQJ48N8sSxQR45dIZvPHp06phsdZrL1zdy5VQ4NHLFhkbaG2rK/c+QNS6WJSaWSktMiASGRsd56vh0ODx5bJAnjw9yaujC1DHtDdVh66GJbRsa2LahiSvWN2gUUwKt2iUmRKR09TUZXtLdyku6W2ds7xscDcNhgCePDfLU8UFuu/8gw2NBJ7UZdLVmeeHGJrZvbuKqTc1ctbmJdY21cfwzZJVREIhUgMk5Ea+5fLoPIp93Dp46zxNhMDxxbIDHjwxw+55jU8esa6zhqk1NbN/cHITDpqDvQcNbk0VBIFKhUimbmgl9zfYNU9sHRsbYe2SA3UcG2HPkLHsOD/C9p/qm5kA011VNtxrCkNjaVq+O6QqmIBBJmKbaKl5xSduMtZlGxiZ44tgguw+fDcLhyAC3/nA/FyaCWdTZ6jQv3NjEVZuauGpzM9s3NXP5+gat5loh1FksIrMam8jz9PFzU8Gw+/BZHj86MLUOU3U6xbYNjTNaDy/Y2ERtleY8rBaL7SxWEIjIok3knf39Q+w5MsCew2fZfeQsuw8PcHZ4DAgmxV3W0cBVm5vYvqmZ7ZubeeGmJhpqdPEhDgoCESkLd+fwmWF2Hw76HILLSwOcGBydOmZre/1Uf8P2sPXQWl8dY62TQcNHRaQszIzO1iydrdkZndInBkamLintPnKWhw7OnBC3uaVuOhzCFsS6Jg1njYOCQEQisa6plnVNtbzuynVT286cv1AQDsHlpbv2HmfywkR7Q81UKLxgYxPduSybW+tozVZpSGuEFAQiUjYt2WpefVn7jDWXzo2Os/doGA7h5aUfPH2SiYI1vbPVaTpb68KWRx2drXVsbpl+nauvVlAsg4JARGLVUJPh5T25GTcIGhmb4Nm+cxw+Pcyhqcd5Dp0epnf/KQZGxmeUUVeVng6IGYERPLcpKOalIBCRVae2Kh0OSW2edf/Z4TEOnx7m8JnpgJh83nXwzNQopunyUmxuCYKhraGa2qo0NZkUNZnguTqTCr6e2h7u
q0rNOK62avr15P7qdGrNT7aLJQjM7BrgE0Aa+JS73xJHPURkbWquq6K5rooXbmqadf/AyNhUa+LwVFAM8/zp8zxz4hyj43lGxycYHc9zYTw/axlLEYQBpMwwgmcs/Nqmt9vU12BY8Fx4zIxjg31//LYXXXQ71ZVW9iAwszTwV8AbgUPAA2b2dXd/vNx1EZHK1FRbRdPGKl6wcfagKJTPOxcm8tPhMJafERTB1+Hr8TyjY9MBMnncyFievDvuTt7BHfJhD3iwPXwmGG479bUTHh/sm+3YbHX0E/TiaBH8OPCMuz8HYGZfAK4FFAQiUnaplFGbSoczoqvirk4s4lgoZDPwfMHXh8JtIiISgziCYLZelYumN5vZTWbWa2a9fX19ZaiWiEgyxREEh4Cugq87gSPFB7n7Tnff4e47Ojo6ylY5EZGkiSMIHgAuN7OtZlYNvAP4egz1EBERYugsdvdxM/vvwB0Ew0c/4+57yl0PEREJxDKPwN2/CXwzjnOLiMhMur2QiEjCKQhERBJuTdyYxsz6gAMlfns7cHIFq7MWy11LdV1r5a6luq61ctdSXVdruVvcfcFhl2siCJbDzHoXc4eeSi53LdV1rZW7luq61spdS3Vdi+UW0qUhEZGEUxCIiCRcEoJgp8pdU3Vda+WupbqutXLXUl3XYrlTKr6PQERE5peEFoGIiMxDQSAiknAVd89iM7uS4EY3mwmWtz4CfN3d98ZaMRGRVaqiWgRm9kHgCwT3PLifYKVTA24zsw/FWTcRkdWqojqLzewp4Cp3HyvaXg3scffL46nZxcysGbgZeCswOfPvBPA14BZ3P1NiuRngRuAXgE1Mt4q+Bny6+L1ZBeWu+PsQ4Xu7Zn5mUdW16BzrKWh5u/vxZZZnBLeyLWzN3+/L/CMVVblh2Sv6HkRd7pznq7AgeAL4GXc/ULR9C3Cnu28rsdwo/ljdAXwX+Ky7Hwu3bQBuAN7g7m8ssa63AWeAzxLcBAiCm//cAOTc/bpVVu6Kvw8Rvrdr5mcWVV3Dcq4G/gZoBg4X1PcM8OvuvquEMt8EfBJ4uqjMy8Iy7yyxrlGVu+LvQZTlLsjdK+YBXAM8A3yLYOztTuD2cNs1yyj3DuCDwIaCbRvCbXeVWOaTpexbZrlPrbFyS3ofYnpvV9XPLKq6ht//MPCKWba/EnikxDL3Aj2zbN8K7F1GXaMqd8XfgyjLXehRUX0E7n47cAXw+wR/vO8Efg/YFu4rVY+7f9TDT1bhuY65+0eB7hLLPGBmHwibgEDQHAz7OZ5fRl1Pm9kvmtnUz9bMUmZ2HXB6FZYbxfsQ1Xu7ln5mUdUVoN7d7yve6O4/AupLLDPDdGuo0GGgqsQyoyw3ivcgynLnVXGjhtw9D/xohYs9YGYfIGhmH4epa3jvpvT/VNcBHwK+F5blwHGC23b+0jLq+g7go8BfmdnkJasW4O5w33LL/aSZnSbohG9egXKjeB+iem/X0s9ssq73FITBStQV4Ftm9q/A55j+/e8CfoWgBV6KzwAPmNkXisp8B/DpZdQ1qnKjeA+iLHdeFdVHEBUzayX4T3UtsC7cPPmf6hZ3L+lTWzjUtRP4kbufK9h+zXJaMGb2CoI/Us8CLyBoVj7uwZ3hls3M2giC4OPu/q6VKLOg7J8g6Nh7zEu/fvsK4Al3P2tmWYKf3UuBPcBH3P1sieW+B/iquy/3E3VxudXA9QSdmLuAnwVeRVDfnV56R/xlBB3QXcA48BRwW6n//qKyf5bpYdpG8Kn768v5HTOzF8xR5uPLrOsLgbdEUO7PzVHusv6fRfHeLnhOBcHymNmvuvvflfB97wH+G8E1zKuB97r718J9u9z9pSXW53cJ/pBkgLsI/qh+D3gDcIe7/+8Sy/36LJt/mqBDEnd/S4nl3u/uPx6+/o8E78k/A28C/sXdbymhzD3Aiz24P/ZOYAj4MvD6cPvbSqzr2bCsZ4F/BL7k7stef97MPk/w86oDzhJcAvhqWF9z9xtK
KPM9wJuB7wM/R3Dt+TRBMPy6u9+z3HpLeZjZOnc/EelJoup8SMoDOFji9z0GNISve4BegjAAeGgZ9XkMSANZYABoCrfXAY8uo9xdwD8ArwV+Knw+Gr7+qWWU+1DB6weAjvB1PUGroJQy9xa83lW07+Hl1JVg7s2bCC4r9BE0128AGpdR7qPhc4agpZkOv7ZSf2aTvwfh6yxwT/i6ezm/X2EZzcAtBB9i+sPH3nBby3LKnuN831rG9zYBfwz8PXB90b5PLqPcDcBfA38FtBH0RT4KfBHYuIxyc7M89gOtBCPIVvS9nXxUXB9BFMzs0bl2Aevn2LeQtIeXg9x9v5m9FvincKirlVgmwLi7TwDnzexZdx8IzzFsZvlllLsDeC/wYeC33f1hMxt29+8to0yAVHjpLUXw6bcvrO+QmY2XWObugpbaI2a2w917zewKoKTLLCH3oA/qTuBOM6siaH1dD/wp00OLlyoVXh6qJ/ij3QycAmpYfkfpRFhOY/gPOBjWezm+SNASfJ3PHJr6buBLQClDfudqARtBi7lUf0cwdPTLwK+Z2duB/+DuowSXTEt1K/CvBD+zu4HPE7TAriUY/nltieWe5OK7MW4m+CDmwCUllju/qBKmkh4En9KuBrYUPXoIJnuUUuZ3gauLtmUIOokmllHX+4Bs+DpVsL2Zok/HJZbfSfCf/S8psTVUVN5+4DlgX/i8IdzeQImf3sN/660El3DuI/jj/xzBJbIXL6Ouc36SBuqWUe77wvodAN4DfAf4W4JP9b9bYpnvJfiEuhN4AvjVcHsH8P1l/syiGPI7Ef6fuHuWx/Ay6vpw0dcfBn5I8Cm+5P8PzGzJHpzvnEss9/0ErcwXFWzbt5yf16LOG/UJKuFBcBngNXPs+8cSy+ykYF5C0b5XL6OuNXNsby/85VqB9+TnCTpeo3rPs8DWZZbRCLwYeBmwfgXqdEWE/95NwKbwdQvwduDHl1nmVWE5V65wXe8EPlD4nhK0jD8IfLvEMncDl8+x7/ll1HUvBR+Iwm03EHTEH1hGuY8UvP6jon0lXdIs+P7JD1t/Hv4OPxfV793kQ53FIrIkUYyiCy/ZPObuT86y763u/s8l1vVPCFYV+HbR9muAv/ASl50xsz8A/sRrfNtfAAAEbUlEQVQLRvuF2y8jeA/eXkq5RWX9O4IWTI+7b1huefOeS0EgIiul1FF05S5zrZRrZnXApe6+O6r6goJARFaQmR1091Jn25etTJU7k0YNiciSRDGKLqKReSp3kRQEIrJU64Gf4eJ1kAz4f6uoTJW7SAoCEVmqbxBMhny4eIeZ3bOKylS5i6Q+AhGRhKuoZahFRGTpFAQiIgmnIJBYmZmb2Z8VfP1+M/u9FSr71nCiUqTCm8rsNbO7i7b3mNnu8PXV4bLFUdbjm2bWEuU5pDIpCCRuo8DbzKw97ooUMrP0Eg6/kWBp59fNc8zVBMtBL6UOixrMYYGUu/+cr8BN6SV5FAQSt3GChdHeV7yj+BO9mZ0Ln19rZt8zsy+a2VNmdouZvdPM7jezx8zs0oJi3mBmPwiPe3P4/Wkz+5iZPWBmj5rZfy4o924z+0eCBd+K63N9WP5uM/touO13gNcAf2NmH5vtHxiuLPoHwHVm9rCZXWdm9Wb2mbAOD5nZteGx7zazL5nZvxCscNpgZt8xs13huSeP6wlbIZ8kWJmyy8z2Twaqmf1WWM/dZvabRd/zt2a2x8zuDGeuStJFvZiRHnrM9wDOEawZv59g1dD3A78X7rsVeHvhseHza4EzwEaCJZYPA78f7nsvwZ3TJr//doIPPJcT3OmpFrgJ+F/hMTUE94LYGpY7xCyL3REsCneQYPXODMFKmW8N990D7Jjle3qA3eHrdwN/WbDvI8C7wtctBHcPqw+PO0S49nx4rsl7SrQDzxCMKe8B8sArC8rcHx7zMoIgqydYxXUP8JLwe8YJV70lWE76XXH/DugR/0MtAomdB/dM+BzBEsyL9YC7H/Vg
XflnCVbEhOAPYE/BcV9097y7P02w1POVBDeW+RUze5hgmeo2gqAAuN/d981yvpcT3Nylz93HCdaf/8kl1LfYm4APhXW4hyCgJpcPuMvdT4WvDfhIOOP02wRr00/OMD3gwU3Ni72G4JaaQx4sivYV4CfCfft8eoz6g8x8ryShNKFMVouPE1ziKFxUa5zw8qWZGVBdsG+04HW+4Os8M3+viyfKOMEf199w9zsKd1hwc6ChOeq3nJsFzVXev/ei1TYtuN9yYR3eSdAKeZm7j5nZfoLQoMS6Fr5vEwR3rpOEU4tAVoXwE/AXCTpeJ+0nuMwBwZLHpdxZ6xfNLBX2G1wCPAncAfzXyTt1mdkVZla/QDn3AT9lZu1hR/L1BDe6WaxBwruEhe4AfiMMOMzsJXN8XzNwIgyB1xHcEGkh3wfeambZ8N/1C8APllBXSRgFgawmf0ZwjXvS3xL88b0fKP6kvFhPEvzB/hbwX9x9BPgU8DiwKxze+X9ZoHXs7keBmwnumPUIwd2tvraEetwNvHCysxj4Q4JgezSswx/O8X2fB3aYWS9B6+CJhU7k7rsI+kfuJwiwT7n7Q0uoqySMlpgQEUk4tQhERBJOQSAiknAKAhGRhFMQiIgknIJARCThFAQiIgmnIBARSTgFgYhIwv1/Ow+euXoA5hIAAAAASUVORK5CYII=\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x3ff9b9db9b0>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "train accuracy: 92.816091954023 %\n",
      "test accuracy: 93.54838709677419 %\n"
     ]
    }
   ],
   "source": [
    "def logistic_regression(x_train, y_train, x_test, y_test, learning_rate ,  num_iterations):\n",
    "    # initialize\n",
    "    dimension =  x_train.shape[0]  # that is 4096\n",
    "    w,b = initialize_weights_and_bias(dimension)\n",
    "    # do not change the learning rate\n",
    "    parameters, gradients, cost_list = update(w, b, x_train, y_train, learning_rate,num_iterations)\n",
    "    \n",
    "    y_prediction_test = predict(parameters[\"weight\"],parameters[\"bias\"],x_test)\n",
    "    y_prediction_train = predict(parameters[\"weight\"],parameters[\"bias\"],x_train)\n",
    "\n",
    "    # print train / test accuracy\n",
    "    print(\"train accuracy: {} %\".format(100 - np.mean(np.abs(y_prediction_train - y_train)) * 100))\n",
    "    print(\"test accuracy: {} %\".format(100 - np.mean(np.abs(y_prediction_test - y_test)) * 100))\n",
    "    \n",
    "logistic_regression(x_train, y_train, x_test, y_test,learning_rate = 0.01, num_iterations = 150)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "6603c5a2-0a1b-4e4a-addd-a89be453e6fd",
    "_uuid": "1da3b972bab3207d2ba77f3d21516296f3e9be02"
   },
   "source": [
    "* We learned the logic behind the simplest neural network (logistic regression) and how to implement it.\n",
    "* Now that we understand the logic, we can use the sklearn library, which makes implementing these logistic regression steps much easier.\n",
    "\n",
    "\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "119db4ab-f04c-41c0-aef9-728565786e94",
    "_uuid": "ba8b9c960d1735b146fbbad378c90902e7218d59"
   },
   "source": [
    "<a id=\"8\"></a> <br>\n",
    "## Logistic Regression with sklearn\n",
    "* The sklearn library provides a logistic regression method that makes it easy to implement logistic regression.\n",
    "* I am not going to explain every parameter of sklearn's logistic regression here, but if you are interested you can read http://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html\n",
    "* Note that the accuracies differ from the ones we computed earlier, because LogisticRegression uses many features we did not, such as different optimization parameters and regularization.\n",
    "* Let's wrap up logistic regression and start the artificial neural network section."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 89,
   "metadata": {
    "_cell_guid": "5bc37200-cd53-4da3-a2ef-c630cf1a2d10",
    "_uuid": "a13b1565f8a5ae234eaa9ed6da288103a3579938"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "test accuracy: 0.967741935483871 \n",
      "train accuracy: 1.0 \n"
     ]
    }
   ],
   "source": [
    "from sklearn import linear_model\n",
    "logreg = linear_model.LogisticRegression(random_state = 42,max_iter= 150)\n",
    "print(\"test accuracy: {} \".format(logreg.fit(x_train.T, y_train.T).score(x_test.T, y_test.T)))\n",
    "print(\"train accuracy: {} \".format(logreg.fit(x_train.T, y_train.T).score(x_train.T, y_train.T)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "b889d43b-15c9-4f31-a4ff-c51c9b5ac077",
    "_uuid": "3379f61c2fb28d4f6e0e7b55fa31da22bab037a9"
   },
   "source": [
    "<a id=\"9\"></a> <br>\n",
    "## Summary and Questions\n",
    "<font color='purple'>\n",
    "What we did in this first part:\n",
    "* Initialize parameters (weights and bias)\n",
    "* Forward propagation\n",
    "* Loss function\n",
    "* Cost function\n",
    "* Backward propagation (gradient descent)\n",
    "* Prediction with the learned weight and bias parameters\n",
    "* Logistic regression with sklearn\n",
    "\n",
    "<br> If you have questions so far, now is the time to ask, because next we will build an artificial neural network on top of logistic regression.\n",
    "<br> Homework: this is a good point to stop and practice. Your homework is to create your own logistic regression method and use it to classify two other digit signs."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "a800aa00-7adf-4d10-83c1-3a5221219329",
    "_uuid": "dd0e970ce5194a34ca55b27b6580ef85714e1b92"
   },
   "source": [
    "<a id=\"10\"></a> <br>\n",
    "# Artificial Neural Network (ANN)\n",
    "* It is also called a deep neural network, or deep learning.\n",
    "* **What is a neural network:** it is basically logistic regression repeated at least twice.\n",
    "* In logistic regression there are an input layer and an output layer; in a neural network there is at least one hidden layer between them.\n",
    "* **What is deep; how many layers count as deep?** When I asked my teacher this question, he said that 'deep' is a relative term; it of course refers to the depth of the network, that is, how many hidden layers it has. 'How deep is your swimming pool?' It can be 12 feet deep or 2 feet deep; either way it has a depth, and the quality of a pool can be described by its depth. 32 years ago I used two or three hidden layers, because I was limited by the specialized hardware of the time. A few years ago, 20 layers was considered quite deep. In October, Andrew Ng mentioned 152 layers as the largest commercial network he knew of. Last week I talked with someone working at a well-known large company who said he was using thousands of layers. So I tend to keep asking: 'how deep?'\n",
    "* **Why is it called 'hidden':** because the hidden layer does not see the inputs (the training set).\n",
    "* Say you have an input layer, one hidden layer, and an output layer. When someone asks 'hey, my friend, how many layers does your neural network have?', the answer is 'I have a 2-layer neural network', because the input layer is ignored when counting layers.\n",
    "* Let's look at a 2-layer neural network: \n",
    "<a href=\"http://ibb.co/eF315x\"><img src=\"http://preview.ibb.co/dajVyH/9.jpg\" alt=\"9\" border=\"0\"></a>\n",
    "* We will go through this figure step by step.\n",
    "    * As you can see, there is one hidden layer between the input and output layers, and it has 3 nodes. If you wonder why I chose 3 nodes, there is no reason; I just picked it. The number of nodes is a hyperparameter, like the learning rate, and we will look at hyperparameters at the end of the neural network sections.\n",
    "    * The input and output layers do not change; they are the same as in logistic regression.\n",
    "    * The figure contains a tanh function that you may not know yet. Like sigmoid, it is an activation function. tanh works better than sigmoid as a hidden-unit activation because the mean of its output is closer to zero, so it centers the data for the next layer. tanh is also nonlinear, which helps our model learn better.\n",
    "    * As you can see, the purple area has two parts, and each part works like logistic regression. The only differences are the activation function, the inputs, and the outputs.\n",
    "        * Logistic regression: input => output\n",
    "        * 2-layer neural network: input => hidden layer => output layer. You can think of the hidden layer as the output of part one and the input of part two.\n",
    "* That's it. We will follow the same path as logistic regression to learn the 2-layer neural network.\n",
    "\n",
    "   \n",
    "    \n",
    "    "
   ]
  },
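  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Translator's note: as a small illustration of the zero-centering point above (a minimal sketch with made-up inputs, not part of the original tutorial), over inputs symmetric around zero, sigmoid outputs average to about 0.5, while tanh outputs average to about 0.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "def sigmoid(z):\n",
    "    return 1 / (1 + np.exp(-z))\n",
    "\n",
    "# inputs symmetric around zero\n",
    "z = np.linspace(-3, 3, 7)\n",
    "# sigmoid outputs lie in (0, 1), so their mean sits near 0.5\n",
    "print(sigmoid(z).mean())\n",
    "# tanh outputs lie in (-1, 1) and are centered on 0\n",
    "print(np.tanh(z).mean())\n",
    "```\n",
    "This is why the tutorial says tanh 'centers' the data for the next layer."
   ]
  },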
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "35b73665-27cb-48a7-abb1-0e0e6bc10ec0",
    "_uuid": "689a4a049579fff0c718bc30639f18e2dcab8ccc"
   },
   "source": [
    "<a id=\"11\"></a> <br>\n",
    "## 2-Layer Neural Network\n",
    "* Size of layers and initializing weight and bias parameters\n",
    "* Forward propagation\n",
    "* Loss function and cost function\n",
    "* Backward propagation\n",
    "* Update parameters\n",
    "* Prediction with the learned weight and bias parameters\n",
    "* Create the model"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "bbfe8df0-bf38-4842-aec1-c04417a99420",
    "_uuid": "0a5e286cf360d579eb6e5d5f220dd1a17c458039"
   },
   "source": [
    "<a id=\"12\"></a> <br>\n",
    "## Size of Layers and Initializing Weight and Bias Parameters\n",
    "* Our training set x_train has 348 samples; for sample $x^{(348)}$:\n",
    "$$z^{[1] (348)} =  W^{[1]} x^{(348)} + b^{[1] (348)}$$ \n",
    "$$a^{[1] (348)} = \\tanh(z^{[1] (348)})$$\n",
    "$$z^{[2] (348)} = W^{[2]} a^{[1] (348)} + b^{[2] (348)}$$\n",
    "$$\\hat{y}^{(348)} = a^{[2] (348)} = \\sigma(z^{ [2] (348)})$$\n",
    "\n",
    "* In logistic regression we initialized the weights to 0.01 and the bias to 0. This time we initialize the weights randomly. If we initialized every neuron of the first hidden layer to zero, they would all perform the same computation; even after many iterations of gradient descent, each neuron in that layer would still compute the same thing as the others. So we initialize randomly. We also keep the initial weights small: if they were very large, the inputs to tanh would be very large, the gradients would then be close to zero, and the optimization algorithm would be very slow.\n",
    "* The bias can start at zero."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 94,
   "metadata": {
    "_cell_guid": "089fd577-95a0-4218-9b53-72bf1c0ab206",
    "_uuid": "922670a74f6999885759399ebea8b10692796a29"
   },
   "outputs": [],
   "source": [
    "# initialize parameters and layer sizes\n",
    "def initialize_parameters_and_layer_sizes_NN(x_train, y_train):\n",
    "    parameters = {\"weight1\": np.random.randn(3,x_train.shape[0]) * 0.1,\n",
    "                  \"bias1\": np.zeros((3,1)),\n",
    "                  \"weight2\": np.random.randn(y_train.shape[0],3) * 0.1,\n",
    "                  \"bias2\": np.zeros((y_train.shape[0],1))}\n",
    "    return parameters"
   ]
  },
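  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Translator's note: a tiny sketch (made-up shapes, not from the original) of why all-zero weights would be a problem: with weight1 = 0, every hidden unit computes exactly the same activation, so the three units stay redundant no matter how long we train.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "x = np.random.randn(4096, 5)         # 5 fake samples with 4096 features each\n",
    "w_zero = np.zeros((3, 4096))         # all-zero first-layer weights\n",
    "b_zero = np.zeros((3, 1))            # zero bias is fine on its own\n",
    "\n",
    "A1 = np.tanh(np.dot(w_zero, x) + b_zero)   # hidden activations\n",
    "# every row (hidden unit) is identical, so the units are indistinguishable\n",
    "print(np.allclose(A1[0], A1[1]) and np.allclose(A1[1], A1[2]))\n",
    "```\n",
    "Random initialization, as in initialize_parameters_and_layer_sizes_NN above, breaks this symmetry."
   ]
  },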
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "65832cdf-7ee8-447b-a068-48ac2e46b49f",
    "_uuid": "66147bbafbe25dac498f3963ea8419126f624ce9"
   },
   "source": [
    "<a id=\"13\"></a> <br>\n",
    "## Forward Propagation\n",
    "* Forward propagation is almost the same as in logistic regression.\n",
    "* The only differences are that we use the tanh function and that we do everything twice.\n",
    "* numpy already has a tanh function, so we do not need to implement it ourselves."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {
    "_cell_guid": "d64d6b90-7f14-453f-a401-4119504496e3",
    "_uuid": "41e1e2f1c7afff027ba0a9f9b2fdcbe312e9a194"
   },
   "outputs": [],
   "source": [
    "\n",
    "def forward_propagation_NN(x_train, parameters):\n",
    "\n",
    "    Z1 = np.dot(parameters[\"weight1\"],x_train) +parameters[\"bias1\"]\n",
    "    A1 = np.tanh(Z1)\n",
    "    Z2 = np.dot(parameters[\"weight2\"],A1) + parameters[\"bias2\"]\n",
    "    A2 = sigmoid(Z2)\n",
    "\n",
    "    cache = {\"Z1\": Z1,\n",
    "             \"A1\": A1,\n",
    "             \"Z2\": Z2,\n",
    "             \"A2\": A2}\n",
    "    \n",
    "    return A2, cache\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "30a0abc9-7ee0-4093-afd5-ae9d5b1fcd5e",
    "_uuid": "ee7a42ee207e222eed5c24b1bfbf2d6ce0cdec37"
   },
   "source": [
    "<a id=\"14\"></a> <br>\n",
    "## Loss Function and Cost Function\n",
    "* The loss and cost functions are the same as in logistic regression\n",
    "* Cross entropy function\n",
    "<a href=\"https://imgbb.com/\"><img src=\"https://image.ibb.co/nyR9LU/as.jpg\" alt=\"as\" border=\"0\"></a><br />"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 95,
   "metadata": {
    "_cell_guid": "24143d72-bc62-4f2d-b0cf-4f67a4016299",
    "_uuid": "b55887b28cffc8083a76af45d25957d4f3e9f6fa"
   },
   "outputs": [],
   "source": [
    "# compute cost (full cross entropy, as in logistic regression)\n",
    "def compute_cost_NN(A2, Y, parameters):\n",
    "    logprobs = np.multiply(np.log(A2),Y) + np.multiply(np.log(1-A2),1-Y)\n",
    "    cost = -np.sum(logprobs)/Y.shape[1]\n",
    "    return cost\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "39839772-976a-4e53-a2e0-07aa76c9bd98",
    "_uuid": "43767f9271e2b2b6e0c3560e414f3c0c596ffe2f"
   },
   "source": [
    "<a id=\"15\"></a> <br>\n",
    "## Backward Propagation\n",
    "* As you know, backward propagation means taking derivatives.\n",
    "* If you want to learn the details, they are easier to explain in speech than in writing, so I recommend watching videos on YouTube.\n",
    "* The logic is the same as before, so let's write the code."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 96,
   "metadata": {
    "_cell_guid": "2fcfd3c4-f935-4272-a284-c2dbc2c35afb",
    "_uuid": "6bf7bce2e4413ecdc16ea778008eee4072738aab"
   },
   "outputs": [],
   "source": [
    "# backward propagation\n",
    "def backward_propagation_NN(parameters, cache, X, Y):\n",
    "\n",
    "    dZ2 = cache[\"A2\"]-Y\n",
    "    dW2 = np.dot(dZ2,cache[\"A1\"].T)/X.shape[1]\n",
    "    db2 = np.sum(dZ2,axis =1,keepdims=True)/X.shape[1]\n",
    "    dZ1 = np.dot(parameters[\"weight2\"].T,dZ2)*(1 - np.power(cache[\"A1\"], 2))\n",
    "    dW1 = np.dot(dZ1,X.T)/X.shape[1]\n",
    "    db1 = np.sum(dZ1,axis =1,keepdims=True)/X.shape[1]\n",
    "    grads = {\"dweight1\": dW1,\n",
    "             \"dbias1\": db1,\n",
    "             \"dweight2\": dW2,\n",
    "             \"dbias2\": db2}\n",
    "    return grads"
   ]
  },
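  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Translator's note: if you want to check derivatives like these yourself, a standard trick (not part of the original notebook) is a numerical gradient check: compare the analytic derivative with a centered finite difference.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "# analytic derivative of tanh: 1 - tanh(z)^2 (the factor used in dZ1 above)\n",
    "def dtanh(z):\n",
    "    return 1 - np.tanh(z) ** 2\n",
    "\n",
    "def numerical_grad(f, z, eps=1e-6):\n",
    "    # centered finite-difference approximation of df/dz\n",
    "    return (f(z + eps) - f(z - eps)) / (2 * eps)\n",
    "\n",
    "z = 0.7\n",
    "print(abs(dtanh(z) - numerical_grad(np.tanh, z)) < 1e-8)   # True\n",
    "```\n",
    "The same check can be applied to dW1, db1, dW2 and db2 by perturbing one parameter at a time and recomputing the cost."
   ]
  },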
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "af195fda-5649-4e6d-830e-72123f2726a8",
    "_uuid": "b1996782dc44fda7993407c9b5efee5d4fef46e4"
   },
   "source": [
    "<a id=\"16\"></a> <br>\n",
    "## Update Parameters\n",
    "* Updating the parameters is the same as in logistic regression.\n",
    "* We really did a lot of this work already with logistic regression."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 97,
   "metadata": {
    "_cell_guid": "d9ae95d4-1d11-4293-822d-e1d5f0c16d1e",
    "_uuid": "facf2b475cb82e14dcc6be56b57fe2c3ad0b1a8f"
   },
   "outputs": [],
   "source": [
    "# update parameters\n",
    "def update_parameters_NN(parameters, grads, learning_rate = 0.01):\n",
    "    parameters = {\"weight1\": parameters[\"weight1\"]-learning_rate*grads[\"dweight1\"],\n",
    "                  \"bias1\": parameters[\"bias1\"]-learning_rate*grads[\"dbias1\"],\n",
    "                  \"weight2\": parameters[\"weight2\"]-learning_rate*grads[\"dweight2\"],\n",
    "                  \"bias2\": parameters[\"bias2\"]-learning_rate*grads[\"dbias2\"]}\n",
    "    \n",
    "    return parameters"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "ac416480-ec9c-45b4-ac9d-1caeded9ba90",
    "_uuid": "9c471502563017fabb991494359091215e4ad583"
   },
   "source": [
    "<a id=\"17\"></a> <br>\n",
    "## Prediction with the Learned Weight and Bias Parameters\n",
    "* Let's write the predict method; it is much like the one in logistic regression."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 98,
   "metadata": {
    "_cell_guid": "96004eb5-d6ca-41ab-a577-70fb0628a2f4",
    "_uuid": "53c00c4430c6fcc3298dde8de804cab71884caa5"
   },
   "outputs": [],
   "source": [
    "# prediction\n",
    "def predict_NN(parameters,x_test):\n",
    "    # x_test is the input of forward propagation\n",
    "    A2, cache = forward_propagation_NN(x_test,parameters)\n",
    "    Y_prediction = np.zeros((1,x_test.shape[1]))\n",
    "    # if A2 > 0.5, our prediction is sign 1,\n",
    "    # if A2 <= 0.5, our prediction is sign 0\n",
    "    for i in range(A2.shape[1]):\n",
    "        if A2[0,i]<= 0.5:\n",
    "            Y_prediction[0,i] = 0\n",
    "        else:\n",
    "            Y_prediction[0,i] = 1\n",
    "\n",
    "    return Y_prediction"
   ]
  },
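  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Translator's note: the loop above is fine for learning, but numpy can apply the 0.5 threshold to every column at once (a small equivalent sketch, not from the original):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "A2 = np.array([[0.2, 0.8, 0.5, 0.9]])   # fake sigmoid outputs\n",
    "# boolean mask converted to 0/1, same rule as the for-loop version\n",
    "Y_prediction = (A2 > 0.5).astype(float)\n",
    "print(Y_prediction)\n",
    "```\n",
    "Both versions map 0.5 itself to class 0, matching the <= 0.5 branch above."
   ]
  },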
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "d0df9e13-300b-4d5e-b8ec-f702ed0e06af",
    "_uuid": "94202fbc047d59fa5c8b81ba02962f1f3cc56d8f"
   },
   "source": [
    "<a id=\"18\"></a> <br>\n",
    "## Create the Model\n",
    "* Let's put everything together."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "_cell_guid": "b66f3c28-0f71-4176-8a6b-35b98b0db936",
    "_uuid": "9babf239f800bedc9864c6a75677985d57f1cc78"
   },
   "outputs": [],
   "source": [
    "# 2-layer neural network\n",
    "def two_layer_neural_network(x_train, y_train,x_test,y_test, num_iterations):\n",
    "    cost_list = []\n",
    "    index_list = []\n",
    "    # initialize parameters and layer sizes\n",
    "    parameters = initialize_parameters_and_layer_sizes_NN(x_train, y_train)\n",
    "\n",
    "    for i in range(0, num_iterations):\n",
    "        # forward propagation\n",
    "        A2, cache = forward_propagation_NN(x_train,parameters)\n",
    "        # compute cost\n",
    "        cost = compute_cost_NN(A2, y_train, parameters)\n",
    "        # backward propagation\n",
    "        grads = backward_propagation_NN(parameters, cache, x_train, y_train)\n",
    "        # update parameters\n",
    "        parameters = update_parameters_NN(parameters, grads)\n",
    "        \n",
    "        if i % 100 == 0:\n",
    "            cost_list.append(cost)\n",
    "            index_list.append(i)\n",
    "            print (\"Cost after iteration %i: %f\" %(i, cost))\n",
    "    plt.plot(index_list,cost_list)\n",
    "    plt.xticks(index_list,rotation='vertical')\n",
    "    plt.xlabel(\"Number of Iterations\")\n",
    "    plt.ylabel(\"Cost\")\n",
    "    plt.show()\n",
    "    \n",
    "    # predict\n",
    "    y_prediction_test = predict_NN(parameters,x_test)\n",
    "    y_prediction_train = predict_NN(parameters,x_train)\n",
    "\n",
    "    # print train / test accuracy\n",
    "    print(\"train accuracy: {} %\".format(100 - np.mean(np.abs(y_prediction_train - y_train)) * 100))\n",
    "    print(\"test accuracy: {} %\".format(100 - np.mean(np.abs(y_prediction_test - y_test)) * 100))\n",
    "    return parameters\n",
    "\n",
    "parameters = two_layer_neural_network(x_train, y_train,x_test,y_test, num_iterations=2500)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "04d3b704-2fd4-410b-ad90-a5a0aedd4b7d",
    "_uuid": "62bb69efcf883ba22581dfc8d4f2e0eaa2d99ee3"
   },
   "source": [
    "<font color='purple'>\n",
    "\n",
    "So far we have created a 2-layer neural network and learned how to implement:\n",
    "* Size of layers and initializing weight and bias parameters\n",
    "* Forward propagation\n",
    "* Loss function and cost function\n",
    "* Backward propagation\n",
    "* Update parameters\n",
    "* Prediction with the learned weight and bias parameters\n",
    "* Create the model\n",
    "\n",
    "<br> Now let's learn how to implement an L-layer neural network with keras."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "c054ed2a-9dd9-498c-9be6-6e9b16ff5913",
    "_uuid": "aa2f896b72236e09687afc3c613a4fc801b16552"
   },
   "source": [
    "<a id=\"19\"></a> <br>\n",
    "# L-Layer Neural Network\n",
    "* **What happens if the number of hidden layers increases:** the earlier layers can detect simple features.\n",
    "* When later layers combine these simple features, the model can learn more complex functions. For example, let's look at sign one:\n",
    "<a href=\"http://ibb.co/dNgDJH\"><img src=\"http://preview.ibb.co/mpD4Qx/10.jpg\" alt=\"10\" border=\"0\"></a>\n",
    "* The first layers learn edges or basic shapes such as lines. As the number of layers increases, the layers start to learn more complex things, such as convex shapes or characteristic features like an index finger.\n",
    "* Let's create our model:\n",
    "    * There are some hyperparameters we need to choose: learning rate, number of iterations, number of hidden layers, number of hidden units, and the type of activation function. Wow, that is a lot :)\n",
    "    * If you spend a lot of time in the deep learning world, you can pick these hyperparameters by intuition.\n",
    "    * If you do not want to spend too much time, you can always fall back on a Google search, though even that is not required: you simply need to try different hyperparameters and find the best ones.\n",
    "    * In this tutorial, our model will have 2 hidden layers with 8 and 4 nodes, respectively. Increasing the number of hidden layers and nodes takes more time.\n",
    "    * As activation functions we use relu (first hidden layer), relu (second hidden layer), and sigmoid (output layer).\n",
    "    * The number of iterations will be 100.\n",
    "* Our approach is the same as before. Although you have already learned the logic of deep learning, we can simplify our work with the keras library, and we will use keras for the deep neural network.\n",
    "\n",
    "* First, let's transpose x_train, x_test, y_train and y_test.\n",
    "\n",
    "    \n",
    "    "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 100,
   "metadata": {
    "_cell_guid": "631a05c4-e362-4fa0-9048-21c599f55344",
    "_uuid": "0a978924a68d423de4babe73c15412ad938c1858"
   },
   "outputs": [],
   "source": [
    "# transpose\n",
    "x_train, x_test, y_train, y_test = x_train.T, x_test.T, y_train.T, y_test.T"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "e17b5a34-00dc-49a8-b7ac-a2bd3a753f78",
    "_uuid": "5a78a5570bc50180a02190dee46180f19eb165e1"
   },
   "source": [
    "<a id=\"22\"></a> <br>\n",
    "## Implementing with the keras Library\n",
    "Let's look at some of the keras parameters:\n",
    "* units: output dimension of the layer (number of nodes)\n",
    "* kernel_initializer: how to initialize the weights\n",
    "* activation: the activation function; we use relu\n",
    "* input_dim: the input dimension, here the number of pixels in our images (4096 pixels)\n",
    "* optimizer: we use the adam optimizer\n",
    "    * Adam is one of the most effective optimization algorithms for training neural networks.\n",
    "    * Some advantages of Adam are its relatively low memory requirements and that it usually works well even with little hyperparameter tuning.\n",
    "* loss: the cost function is the same cross-entropy we used before:\n",
    "$$J = - \\frac{1}{m} \\sum\\limits_{i = 0}^{m} \\large\\left(\\small y^{(i)}\\log\\left(a^{[2] (i)}\\right) + (1-y^{(i)})\\log\\left(1- a^{[2] (i)}\\right)  \\large  \\right) \\small \\tag{6}$$\n",
    "* metrics: accuracy\n",
    "* cross_val_score: uses cross-validation. If you do not know cross-validation, please see my machine learning tutorial:\n",
    "https://www.kaggle.com/kanncaa1/machine-learning-tutorial-for-beginners\n",
    "* epochs: number of iterations"
   ]
  },
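  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Translator's note: the cross-entropy formula above can be written directly in numpy (a small sketch with made-up labels and probabilities, independent of keras):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "def cross_entropy_cost(y, a2):\n",
    "    # J = -(1/m) * sum( y*log(a2) + (1-y)*log(1-a2) )\n",
    "    m = y.shape[1]\n",
    "    return -np.sum(y * np.log(a2) + (1 - y) * np.log(1 - a2)) / m\n",
    "\n",
    "y  = np.array([[1, 0, 1, 1]])            # true labels\n",
    "a2 = np.array([[0.9, 0.1, 0.8, 0.7]])    # predicted probabilities\n",
    "print(round(cross_entropy_cost(y, a2), 4))   # 0.1976\n",
    "```\n",
    "The better the predictions match the labels, the closer the cost gets to zero."
   ]
  },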
  {
   "cell_type": "code",
   "execution_count": 93,
   "metadata": {
    "_cell_guid": "8870c45b-b0fe-4050-a8af-41498a417ed5",
    "_uuid": "9361c3183a40fa0080055b7d5c1002aef68b4d77"
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Epoch 1/100\n",
      "232/232 [==============================] - 0s 348us/step - loss: 0.6931 - accuracy: 0.5086\n",
      "Epoch 2/100\n",
      "232/232 [==============================] - 0s 55us/step - loss: 0.6929 - accuracy: 0.5431\n",
      "Epoch 3/100\n",
      "232/232 [==============================] - 0s 56us/step - loss: 0.6927 - accuracy: 0.5431\n",
      "Epoch 4/100\n",
      "232/232 [==============================] - 0s 54us/step - loss: 0.6926 - accuracy: 0.5431\n",
      "Epoch 5/100\n",
      "232/232 [==============================] - 0s 53us/step - loss: 0.6924 - accuracy: 0.5431\n",
      "Epoch 6/100\n",
      "232/232 [==============================] - 0s 54us/step - loss: 0.6923 - accuracy: 0.5431\n",
      "Epoch 7/100\n",
      "232/232 [==============================] - 0s 51us/step - loss: 0.6922 - accuracy: 0.5431\n",
      "Epoch 8/100\n",
      "232/232 [==============================] - 0s 54us/step - loss: 0.6921 - accuracy: 0.5431\n",
      "Epoch 9/100\n",
      "232/232 [==============================] - 0s 56us/step - loss: 0.6920 - accuracy: 0.5431\n",
      "Epoch 10/100\n",
      "232/232 [==============================] - 0s 52us/step - loss: 0.6918 - accuracy: 0.5431\n",
      "Epoch 11/100\n",
      "232/232 [==============================] - 0s 58us/step - loss: 0.6917 - accuracy: 0.5431\n",
      "Epoch 12/100\n",
      "232/232 [==============================] - 0s 55us/step - loss: 0.6916 - accuracy: 0.5431\n",
      "Epoch 13/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.6915 - accuracy: 0.5431\n",
      "Epoch 14/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.6914 - accuracy: 0.5431\n",
      "Epoch 15/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.6913 - accuracy: 0.5431\n",
      "Epoch 16/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.6913 - accuracy: 0.5431\n",
      "[... epochs 17-79 omitted: loss inches from 0.6913 down to 0.6897 while accuracy is stuck at 0.5431; this run has stalled ...]\n",
      "Epoch 80/100\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "232/232 [==============================] - 0s 65us/step - loss: 0.6895 - accuracy: 0.5431\n",
      "[... epochs 81-99 omitted: loss flat at ~0.6895, accuracy flat at 0.5431 ...]\n",
      "Epoch 100/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.6895 - accuracy: 0.5431\n",
      "116/116 [==============================] - 0s 98us/step\n",
      "Epoch 1/100\n",
      "232/232 [==============================] - 0s 339us/step - loss: 0.6937 - accuracy: 0.5216\n",
      "Epoch 2/100\n",
      "232/232 [==============================] - 0s 56us/step - loss: 0.6922 - accuracy: 0.5216\n",
      "Epoch 3/100\n",
      "232/232 [==============================] - 0s 55us/step - loss: 0.6913 - accuracy: 0.5216\n",
      "Epoch 4/100\n",
      "232/232 [==============================] - 0s 52us/step - loss: 0.6910 - accuracy: 0.5216\n",
      "Epoch 5/100\n",
      "232/232 [==============================] - 0s 54us/step - loss: 0.6898 - accuracy: 0.5216\n",
      "Epoch 6/100\n",
      "232/232 [==============================] - 0s 54us/step - loss: 0.6884 - accuracy: 0.5216\n",
      "Epoch 7/100\n",
      "232/232 [==============================] - 0s 52us/step - loss: 0.6868 - accuracy: 0.5216\n",
      "Epoch 8/100\n",
      "232/232 [==============================] - 0s 52us/step - loss: 0.6845 - accuracy: 0.5216\n",
      "Epoch 9/100\n",
      "232/232 [==============================] - 0s 53us/step - loss: 0.6811 - accuracy: 0.5216\n",
      "Epoch 10/100\n",
      "232/232 [==============================] - 0s 52us/step - loss: 0.6756 - accuracy: 0.5216\n",
      "Epoch 11/100\n",
      "232/232 [==============================] - 0s 56us/step - loss: 0.6677 - accuracy: 0.5216\n",
      "Epoch 12/100\n",
      "232/232 [==============================] - 0s 56us/step - loss: 0.6570 - accuracy: 0.5216\n",
      "Epoch 13/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.6413 - accuracy: 0.5216\n",
      "Epoch 14/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.6258 - accuracy: 0.5216\n",
      "Epoch 15/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.6061 - accuracy: 0.5216\n",
      "Epoch 16/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.5895 - accuracy: 0.6422\n",
      "Epoch 17/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.5601 - accuracy: 0.6724\n",
      "Epoch 18/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.5079 - accuracy: 0.8491\n",
      "Epoch 19/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.4503 - accuracy: 0.9310\n",
      "Epoch 20/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.3902 - accuracy: 0.9397\n",
      "Epoch 21/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.3547 - accuracy: 0.8793\n",
      "Epoch 22/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.3450 - accuracy: 0.8578\n",
      "Epoch 23/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.2973 - accuracy: 0.9052\n",
      "Epoch 24/100\n",
      "232/232 [==============================] - 0s 66us/step - loss: 0.2939 - accuracy: 0.9009\n",
      "Epoch 25/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.2361 - accuracy: 0.9397\n",
      "Epoch 26/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.2415 - accuracy: 0.9138\n",
      "Epoch 27/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2467 - accuracy: 0.8966\n",
      "Epoch 28/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.2072 - accuracy: 0.9440\n",
      "Epoch 29/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.1913 - accuracy: 0.9397\n",
      "Epoch 30/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.1807 - accuracy: 0.9526\n",
      "Epoch 31/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.1635 - accuracy: 0.9612\n",
      "Epoch 32/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.1605 - accuracy: 0.9440\n",
      "Epoch 33/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.1569 - accuracy: 0.9569\n",
      "Epoch 34/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.1562 - accuracy: 0.9526\n",
      "Epoch 35/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.1426 - accuracy: 0.9612\n",
      "Epoch 36/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.1312 - accuracy: 0.9655\n",
      "Epoch 37/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.1307 - accuracy: 0.9612\n",
      "Epoch 38/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.1367 - accuracy: 0.9698\n",
      "Epoch 39/100\n",
      "232/232 [==============================] - 0s 68us/step - loss: 0.1216 - accuracy: 0.9698\n",
      "Epoch 40/100\n",
      "232/232 [==============================] - 0s 66us/step - loss: 0.1142 - accuracy: 0.9741\n",
      "Epoch 41/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.1122 - accuracy: 0.9698\n",
      "Epoch 42/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.1059 - accuracy: 0.9741\n",
      "Epoch 43/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.1006 - accuracy: 0.9741\n",
      "Epoch 44/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.1007 - accuracy: 0.9655\n",
      "Epoch 45/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.1035 - accuracy: 0.9698\n",
      "Epoch 46/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0930 - accuracy: 0.9741\n",
      "Epoch 47/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0958 - accuracy: 0.9741\n",
      "Epoch 48/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.0859 - accuracy: 0.9741\n",
      "Epoch 49/100\n",
      "232/232 [==============================] - 0s 68us/step - loss: 0.0888 - accuracy: 0.9828\n",
      "Epoch 50/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.1012 - accuracy: 0.9698\n",
      "Epoch 51/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.0924 - accuracy: 0.9784\n",
      "Epoch 52/100\n",
      "232/232 [==============================] - 0s 68us/step - loss: 0.0819 - accuracy: 0.9741\n",
      "Epoch 53/100\n",
      "232/232 [==============================] - 0s 69us/step - loss: 0.0791 - accuracy: 0.9698\n",
      "Epoch 54/100\n",
      "232/232 [==============================] - 0s 66us/step - loss: 0.0726 - accuracy: 0.9784\n",
      "Epoch 55/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.0700 - accuracy: 0.9828\n",
      "Epoch 56/100\n",
      "232/232 [==============================] - 0s 67us/step - loss: 0.0706 - accuracy: 0.9784\n",
      "Epoch 57/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0692 - accuracy: 0.9828\n",
      "Epoch 58/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.0670 - accuracy: 0.9828\n",
      "Epoch 59/100\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "232/232 [==============================] - 0s 64us/step - loss: 0.0635 - accuracy: 0.9828\n",
      "Epoch 60/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.0617 - accuracy: 0.9828\n",
      "Epoch 61/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.0591 - accuracy: 0.9828\n",
      "Epoch 62/100\n",
      "232/232 [==============================] - 0s 68us/step - loss: 0.0562 - accuracy: 0.9828\n",
      "Epoch 63/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0546 - accuracy: 0.9828\n",
      "Epoch 64/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0541 - accuracy: 0.9914\n",
      "Epoch 65/100\n",
      "232/232 [==============================] - 0s 68us/step - loss: 0.0594 - accuracy: 0.9871\n",
      "Epoch 66/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.0596 - accuracy: 0.9828\n",
      "Epoch 67/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.0493 - accuracy: 0.9871\n",
      "Epoch 68/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.0663 - accuracy: 0.9784\n",
      "Epoch 69/100\n",
      "232/232 [==============================] - 0s 67us/step - loss: 0.0464 - accuracy: 0.9871\n",
      "Epoch 70/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.0505 - accuracy: 0.9871\n",
      "Epoch 71/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0513 - accuracy: 0.9871\n",
      "Epoch 72/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0400 - accuracy: 0.9957\n",
      "Epoch 73/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0396 - accuracy: 0.9871\n",
      "Epoch 74/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0407 - accuracy: 0.9957\n",
      "Epoch 75/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.0436 - accuracy: 0.9828\n",
      "Epoch 76/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.0422 - accuracy: 0.9914\n",
      "Epoch 77/100\n",
      "232/232 [==============================] - 0s 70us/step - loss: 0.0384 - accuracy: 0.9914\n",
      "Epoch 78/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.0461 - accuracy: 0.9871\n",
      "Epoch 79/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.0403 - accuracy: 0.9871\n",
      "Epoch 80/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0416 - accuracy: 0.9957\n",
      "Epoch 81/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.0324 - accuracy: 0.9914\n",
      "Epoch 82/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0318 - accuracy: 0.9914\n",
      "Epoch 83/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.0296 - accuracy: 0.9957\n",
      "Epoch 84/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.0312 - accuracy: 1.0000\n",
      "Epoch 85/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0381 - accuracy: 0.9914\n",
      "Epoch 86/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0469 - accuracy: 0.9784\n",
      "Epoch 87/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0339 - accuracy: 0.9914\n",
      "Epoch 88/100\n",
      "232/232 [==============================] - 0s 66us/step - loss: 0.0280 - accuracy: 0.9957\n",
      "Epoch 89/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.0308 - accuracy: 0.9914\n",
      "Epoch 90/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.0330 - accuracy: 0.9871\n",
      "Epoch 91/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.0257 - accuracy: 0.9957\n",
      "Epoch 92/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0302 - accuracy: 0.9957\n",
      "Epoch 93/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.0264 - accuracy: 0.9957\n",
      "Epoch 94/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0237 - accuracy: 0.9957\n",
      "Epoch 95/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.0238 - accuracy: 1.0000\n",
      "Epoch 96/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.0200 - accuracy: 1.0000\n",
      "Epoch 97/100\n",
      "232/232 [==============================] - 0s 72us/step - loss: 0.0202 - accuracy: 1.0000\n",
      "Epoch 98/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.0226 - accuracy: 0.9957\n",
      "Epoch 99/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.0223 - accuracy: 0.9957\n",
      "Epoch 100/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.0236 - accuracy: 1.0000\n",
      "116/116 [==============================] - 0s 99us/step\n",
      "Epoch 1/100\n",
      "232/232 [==============================] - 0s 340us/step - loss: 0.6932 - accuracy: 0.4957\n",
      "Epoch 2/100\n",
      "232/232 [==============================] - 0s 55us/step - loss: 0.6930 - accuracy: 0.4957\n",
      "Epoch 3/100\n",
      "232/232 [==============================] - 0s 56us/step - loss: 0.6928 - accuracy: 0.5302\n",
      "Epoch 4/100\n",
      "232/232 [==============================] - 0s 56us/step - loss: 0.6923 - accuracy: 0.5603\n",
      "Epoch 5/100\n",
      "232/232 [==============================] - 0s 54us/step - loss: 0.6916 - accuracy: 0.6078\n",
      "Epoch 6/100\n",
      "232/232 [==============================] - 0s 53us/step - loss: 0.6899 - accuracy: 0.5043\n",
      "Epoch 7/100\n",
      "232/232 [==============================] - 0s 53us/step - loss: 0.6869 - accuracy: 0.4957\n",
      "Epoch 8/100\n",
      "232/232 [==============================] - 0s 55us/step - loss: 0.6821 - accuracy: 0.6078\n",
      "Epoch 9/100\n",
      "232/232 [==============================] - 0s 53us/step - loss: 0.6777 - accuracy: 0.4957\n",
      "Epoch 10/100\n",
      "232/232 [==============================] - 0s 53us/step - loss: 0.6692 - accuracy: 0.5129\n",
      "Epoch 11/100\n",
      "232/232 [==============================] - 0s 55us/step - loss: 0.6541 - accuracy: 0.6509\n",
      "Epoch 12/100\n",
      "232/232 [==============================] - 0s 56us/step - loss: 0.6386 - accuracy: 0.6552\n",
      "Epoch 13/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.6243 - accuracy: 0.5862\n",
      "Epoch 14/100\n",
      "232/232 [==============================] - 0s 60us/step - loss: 0.6144 - accuracy: 0.7974\n",
      "Epoch 15/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.5895 - accuracy: 0.6293\n",
      "Epoch 16/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.5694 - accuracy: 0.8405\n",
      "Epoch 17/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.5740 - accuracy: 0.6767\n",
      "Epoch 18/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.5523 - accuracy: 0.8319\n",
      "Epoch 19/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.5322 - accuracy: 0.8147\n",
      "Epoch 20/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.5127 - accuracy: 0.7802\n",
      "Epoch 21/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.4946 - accuracy: 0.8276\n",
      "Epoch 22/100\n",
      "232/232 [==============================] - 0s 67us/step - loss: 0.4894 - accuracy: 0.8362\n",
      "Epoch 23/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.4725 - accuracy: 0.8793\n",
      "Epoch 24/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.4569 - accuracy: 0.8836\n",
      "Epoch 25/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.4467 - accuracy: 0.8879\n",
      "Epoch 26/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.4385 - accuracy: 0.8922\n",
      "Epoch 27/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.4341 - accuracy: 0.9224\n",
      "Epoch 28/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.4241 - accuracy: 0.8793\n",
      "Epoch 29/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.4170 - accuracy: 0.8836\n",
      "Epoch 30/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.4213 - accuracy: 0.9483\n",
      "Epoch 31/100\n",
      "232/232 [==============================] - 0s 67us/step - loss: 0.4486 - accuracy: 0.8448\n",
      "Epoch 32/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.4053 - accuracy: 0.9052\n",
      "Epoch 33/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.4080 - accuracy: 0.9483\n",
      "Epoch 34/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.3994 - accuracy: 0.9009\n",
      "Epoch 35/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.3862 - accuracy: 0.9267\n",
      "Epoch 36/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.3815 - accuracy: 0.9655\n",
      "Epoch 37/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.3666 - accuracy: 0.9224\n",
      "Epoch 38/100\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "232/232 [==============================] - 0s 66us/step - loss: 0.3652 - accuracy: 0.9655\n",
      "Epoch 39/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.3732 - accuracy: 0.9052\n",
      "Epoch 40/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.3732 - accuracy: 0.9310\n",
      "Epoch 41/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.3503 - accuracy: 0.9655\n",
      "Epoch 42/100\n",
      "232/232 [==============================] - 0s 60us/step - loss: 0.3504 - accuracy: 0.9310\n",
      "Epoch 43/100\n",
      "232/232 [==============================] - 0s 69us/step - loss: 0.3568 - accuracy: 0.9569\n",
      "Epoch 44/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.3348 - accuracy: 0.9397\n",
      "Epoch 45/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.3491 - accuracy: 0.9741\n",
      "Epoch 46/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.3577 - accuracy: 0.9052\n",
      "Epoch 47/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.3272 - accuracy: 0.9612\n",
      "Epoch 48/100\n",
      "232/232 [==============================] - 0s 66us/step - loss: 0.3260 - accuracy: 0.9440\n",
      "Epoch 49/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.3299 - accuracy: 0.9483\n",
      "Epoch 50/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.3205 - accuracy: 0.9784\n",
      "Epoch 51/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.3135 - accuracy: 0.9612\n",
      "Epoch 52/100\n",
      "232/232 [==============================] - 0s 66us/step - loss: 0.3139 - accuracy: 0.9784\n",
      "Epoch 53/100\n",
      "232/232 [==============================] - 0s 66us/step - loss: 0.3072 - accuracy: 0.9569\n",
      "Epoch 54/100\n",
      "232/232 [==============================] - 0s 67us/step - loss: 0.3042 - accuracy: 0.9741\n",
      "Epoch 55/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.3078 - accuracy: 0.9526\n",
      "Epoch 56/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2990 - accuracy: 0.9871\n",
      "Epoch 57/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2955 - accuracy: 0.9569\n",
      "Epoch 58/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.2981 - accuracy: 0.9569\n",
      "Epoch 59/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2894 - accuracy: 0.9784\n",
      "Epoch 60/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2885 - accuracy: 0.9698\n",
      "Epoch 61/100\n",
      "232/232 [==============================] - 0s 66us/step - loss: 0.2850 - accuracy: 0.9828\n",
      "Epoch 62/100\n",
      "232/232 [==============================] - 0s 68us/step - loss: 0.2868 - accuracy: 0.9698\n",
      "Epoch 63/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2812 - accuracy: 0.9828\n",
      "Epoch 64/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2770 - accuracy: 0.9828\n",
      "Epoch 65/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.2881 - accuracy: 0.9526\n",
      "Epoch 66/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.2786 - accuracy: 0.9828\n",
      "Epoch 67/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2754 - accuracy: 0.9828\n",
      "Epoch 68/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2792 - accuracy: 0.9698\n",
      "Epoch 69/100\n",
      "232/232 [==============================] - 0s 69us/step - loss: 0.2774 - accuracy: 0.9698\n",
      "Epoch 70/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2716 - accuracy: 0.9784\n",
      "Epoch 71/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2640 - accuracy: 0.9871\n",
      "Epoch 72/100\n",
      "232/232 [==============================] - 0s 68us/step - loss: 0.2698 - accuracy: 0.9655\n",
      "Epoch 73/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.2613 - accuracy: 0.9828\n",
      "Epoch 74/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2595 - accuracy: 0.9828\n",
      "Epoch 75/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2576 - accuracy: 0.9784\n",
      "Epoch 76/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2573 - accuracy: 0.9871\n",
      "Epoch 77/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2534 - accuracy: 0.9871\n",
      "Epoch 78/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2541 - accuracy: 0.9871\n",
      "Epoch 79/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.2501 - accuracy: 0.9871\n",
      "Epoch 80/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2463 - accuracy: 0.9871\n",
      "Epoch 81/100\n",
      "232/232 [==============================] - 0s 70us/step - loss: 0.2446 - accuracy: 0.9828\n",
      "Epoch 82/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2467 - accuracy: 0.9871\n",
      "Epoch 83/100\n",
      "232/232 [==============================] - 0s 67us/step - loss: 0.2410 - accuracy: 0.9871\n",
      "Epoch 84/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.2445 - accuracy: 0.9914\n",
      "Epoch 85/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.2400 - accuracy: 0.9828\n",
      "Epoch 86/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2413 - accuracy: 0.9871\n",
      "Epoch 87/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2656 - accuracy: 0.9784\n",
      "Epoch 88/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2476 - accuracy: 0.9526\n",
      "Epoch 89/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.2628 - accuracy: 0.9741\n",
      "Epoch 90/100\n",
      "232/232 [==============================] - 0s 70us/step - loss: 0.2348 - accuracy: 0.9871\n",
      "Epoch 91/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2306 - accuracy: 0.9914\n",
      "Epoch 92/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.2308 - accuracy: 0.9871\n",
      "Epoch 93/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2294 - accuracy: 0.9828\n",
      "Epoch 94/100\n",
      "232/232 [==============================] - 0s 70us/step - loss: 0.2340 - accuracy: 0.9828\n",
      "Epoch 95/100\n",
      "232/232 [==============================] - 0s 61us/step - loss: 0.2385 - accuracy: 0.9784\n",
      "Epoch 96/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.3289 - accuracy: 0.9181\n",
      "Epoch 97/100\n",
      "232/232 [==============================] - 0s 64us/step - loss: 0.2326 - accuracy: 0.9784\n",
      "Epoch 98/100\n",
      "232/232 [==============================] - 0s 65us/step - loss: 0.2263 - accuracy: 0.9828\n",
      "Epoch 99/100\n",
      "232/232 [==============================] - 0s 62us/step - loss: 0.2178 - accuracy: 0.9914\n",
      "Epoch 100/100\n",
      "232/232 [==============================] - 0s 63us/step - loss: 0.2161 - accuracy: 0.9871\n",
      "116/116 [==============================] - 0s 102us/step\n",
      "Accuracy mean: 0.7557471295197805\n",
      "Accuracy variance: 0.23092438143369948\n"
     ]
    }
   ],
   "source": [
    "# Evaluating the ANN with k-fold cross-validation\n",
    "from keras.wrappers.scikit_learn import KerasClassifier\n",
    "from sklearn.model_selection import cross_val_score\n",
    "from keras.models import Sequential # the neural network model container\n",
    "from keras.layers import Dense # the fully-connected layer\n",
    "def build_classifier():\n",
    "    classifier = Sequential() # initialize the neural network\n",
    "    classifier.add(Dense(units = 8, kernel_initializer = 'uniform', activation = 'relu', input_dim = x_train.shape[1]))\n",
    "    classifier.add(Dense(units = 4, kernel_initializer = 'uniform', activation = 'relu'))\n",
    "    classifier.add(Dense(units = 1, kernel_initializer = 'uniform', activation = 'sigmoid'))\n",
    "    classifier.compile(optimizer = 'adam', loss = 'binary_crossentropy', metrics = ['accuracy'])\n",
    "    return classifier\n",
    "classifier = KerasClassifier(build_fn = build_classifier, epochs = 100)\n",
    "accuracies = cross_val_score(estimator = classifier, X = x_train, y = y_train, cv = 3)\n",
    "mean = accuracies.mean()\n",
    "variance = accuracies.std()  # note: despite the name, this is the standard deviation\n",
    "print(\"Accuracy mean: \"+ str(mean))\n",
    "print(\"Accuracy variance: \"+ str(variance))"
   ]
  },
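  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "`cross_val_score` above splits the training set into 3 folds, trains a fresh classifier on two of them, scores it on the held-out fold, and reports the mean and standard deviation of the fold accuracies. As an illustration of that bookkeeping (not sklearn's actual implementation), the next cell sketches the same procedure by hand with NumPy; `fit_score` is a hypothetical placeholder for fitting and scoring a model.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Manual k-fold cross-validation sketch (illustration only):\n",
    "# this reproduces the bookkeeping that cross_val_score does above.\n",
    "import numpy as np\n",
    "\n",
    "def kfold_indices(n_samples, k):\n",
    "    # split the sample indices into k (almost) equal folds\n",
    "    return np.array_split(np.arange(n_samples), k)\n",
    "\n",
    "def cross_validate(fit_score, X, y, k=3):\n",
    "    # fit_score(X_tr, y_tr, X_te, y_te) -> accuracy on the held-out fold;\n",
    "    # a hypothetical stand-in for fitting and scoring the Keras model\n",
    "    folds = kfold_indices(len(X), k)\n",
    "    scores = []\n",
    "    for i in range(k):\n",
    "        test_idx = folds[i]\n",
    "        train_idx = np.concatenate([folds[j] for j in range(k) if j != i])\n",
    "        scores.append(fit_score(X[train_idx], y[train_idx], X[test_idx], y[test_idx]))\n",
    "    scores = np.array(scores)\n",
    "    return scores.mean(), scores.std()"
   ]
  },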
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "_uuid": "59ad9da159449a3ed9e99ed0ec931ee14b7aab66"
   },
   "outputs": [],
   "source": []
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "3963de5c-1b2e-45ec-b499-4c404afa8595",
    "_uuid": "a73620b9d04cd7627e7da9961db541140dfd8467"
   },
   "source": [
    "<a id=\"23\"></a> <br>\n",
    "## Artificial Neural Network with the PyTorch Library\n",
    "* PyTorch, like Keras, is a deep learning framework.\n",
    "* It makes building and running deep learning building blocks simpler.\n",
    "* Artificial neural network: https://www.kaggle.com/kanncaa1/pytorch-tutorial-for-deep-learning-lovers"
   ]
  },
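  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a taste of the PyTorch tutorial linked above, the next cell is a minimal sketch (my own illustration, not the tutorial's code) of the same 8-4-1 binary classifier expressed with `torch.nn`, assuming PyTorch is installed; `n_features` stands in for `x_train.shape[1]` and is assumed to be 64 x 64 = 4096 here.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Minimal PyTorch counterpart of the Keras classifier above (sketch only)\n",
    "import torch\n",
    "import torch.nn as nn\n",
    "\n",
    "n_features = 64 * 64  # assumption: x_train.shape[1] for the flattened 64x64 images\n",
    "\n",
    "model = nn.Sequential(\n",
    "    nn.Linear(n_features, 8),  # Dense(units=8, activation='relu')\n",
    "    nn.ReLU(),\n",
    "    nn.Linear(8, 4),           # Dense(units=4, activation='relu')\n",
    "    nn.ReLU(),\n",
    "    nn.Linear(4, 1),           # Dense(units=1, activation='sigmoid')\n",
    "    nn.Sigmoid(),\n",
    ")\n",
    "loss_fn = nn.BCELoss()  # binary_crossentropy\n",
    "optimizer = torch.optim.Adam(model.parameters())  # optimizer='adam'"
   ]
  },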
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "46ac06d5-1f74-4e8d-b7b6-189a1b7bf646",
    "_uuid": "b57204c8acc548963ad6be04f75baef0bf09219f"
   },
   "source": [
    "<a id=\"24\"></a> <br>\n",
    "## Convolutional Neural Network with the PyTorch Library\n",
    "* PyTorch, like Keras, is a deep learning framework.\n",
    "* It makes building and running deep learning building blocks simpler.\n",
    "* Convolutional neural network: https://www.kaggle.com/kanncaa1/pytorch-tutorial-for-deep-learning-lovers"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "d7273dad-b03c-453f-b444-d1a910d6d2a3",
    "_uuid": "e5c5e2dbf4d36428211699a43b819b2ea95600ea"
   },
   "source": [
    "<a id=\"25\"></a> <br>\n",
    "## Recurrent Neural Network with the PyTorch Library\n",
    "* PyTorch, like Keras, is a deep learning framework.\n",
    "* It makes building and running deep learning building blocks simpler.\n",
    "* Recurrent neural network (RNN): https://www.kaggle.com/kanncaa1/recurrent-neural-network-with-pytorch"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "_cell_guid": "1a62d8b1-8308-4654-bdfd-479965ee08af",
    "_uuid": "dc3f380e7953a5c3bb6580b30c0da7cb94599c81"
   },
   "source": [
    "<a id=\"20\"></a> <br>\n",
    "# Conclusion\n",
    "* First of all, thank you to the provider of the dataset.\n",
    "* If you spot any spelling mistakes, please ignore them, thanks :)\n",
    "* This tutorial is fairly introductory; if you would like more detail on any topic, leave a comment.\n",
    "* If you feel something was not explained clearly, you can study it on YouTube (especially Andrew Ng's lectures) and then continue.\n",
    "* If there is anything about Python or machine learning you do not understand, see my other tutorials:\n",
    "    * Data science: https://www.kaggle.com/kanncaa1/data-sciencetutorial-for-beginners\n",
    "    * Machine learning: https://www.kaggle.com/kanncaa1/machine-learning-tutorial-for-beginners\n",
    "* I hope you now understand what deep learning is. We do not need to write long code from scratch every time we build a deep learning model, because frameworks let us build models simply and quickly:\n",
    "    * Artificial neural network: https://www.kaggle.com/kanncaa1/pytorch-tutorial-for-deep-learning-lovers\n",
    "    * Convolutional neural network: https://www.kaggle.com/kanncaa1/pytorch-tutorial-for-deep-learning-lovers\n",
    "    * Recurrent neural network: https://www.kaggle.com/kanncaa1/recurrent-neural-network-with-pytorch\n",
    "\n",
    "### <br>If you like this tutorial, please upvote the original author's notebook.\n",
    "## <br>If you have any questions, I will be happy to hear them.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.9"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 1
}
