{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Companion video course for this book: [解剖深度学习原理，从0实现深度学习库](https://ke.qq.com/course/2900371?tuin=ac5537fd) \n",
    "\n",
    "Additional code and learning materials are provided to readers who purchase the video course or the book.\n",
    "\n",
    "\n",
    "+ Blog: [https://hwdong-net.github.io](https://hwdong-net.github.io)\n",
    "+ YouTube channel: [hwdong](http://www.youtube.com/c/hwdong)\n",
    "+ Bilibili: [hw-dong](https://space.bilibili.com/281453312)\n",
    "\n",
    "# Chapter 4 Neural Networks\n",
    "\n",
    "## 4.1 Neural Networks\n",
    "\n",
    "### 4.1.2 Activation Functions\n",
    "\n",
    "#### 1. The step function sign(x)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def sign(x):\n",
    "    # step function: 1 where x > 0, otherwise 0 (np.int was removed in NumPy 1.24; use int)\n",
    "    return np.array(x > 0, dtype=int)\n",
    "\n",
    "def grad_sign(x):\n",
    "    # the derivative of the step function is 0 wherever it is defined\n",
    "    return np.zeros_like(x)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import matplotlib.pyplot as plt\n",
    "%matplotlib inline\n",
    "\n",
    "x = np.arange(-5.0,5.0, 0.1)\n",
    "plt.ylim(-0.1, 1.1) # set the range of the y-axis\n",
    "plt.plot(x, sign(x),label=\"sign\")\n",
    "plt.plot(x, grad_sign(x),label=\"derivative\")\n",
    "plt.legend(loc=\"upper right\", frameon=False)\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 2. The tanh function\n",
    "\n",
    "The tanh function:\n",
    "\n",
    "$$tanh(x)=(e^x-e^{-x})/(e^x+e^{-x})\\tag{4-5}$$\n",
    "\n",
    "Its derivative is:\n",
    "\n",
    "$$\\begin{aligned}tanh'(x)&=[(e^x+e^{-x})(e^x+e^{-x})-(e^x-e^{-x})(e^x-e^{-x})]/(e^x+e^{-x})^2\\\\&=1-((e^x-e^{-x})^2)/(e^x+e^{-x})^2=1-tanh^2(x) \\end{aligned}\\tag{4-6}$$\n",
    "\n",
    "numpy provides the function tanh() for computing tanh. The following code computes tanh'(x) and plots the curves of tanh(x) and tanh'(x):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def grad_tanh(x):\n",
    "    a = np.tanh(x)\n",
    "    return 1 - a**2"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "x = np.arange(-5.0, 5.0, 0.1)\n",
    "plt.plot(x, np.tanh(x),label=\"tanh\")\n",
    "plt.plot(x, grad_tanh(x),label=\"derivative\")\n",
    "plt.legend(loc=\"upper right\", frameon=False)\n",
    "plt.show()"
   ]
  },
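  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check of formula (4-6), grad_tanh() can be compared against a central-difference approximation of the derivative (an illustrative check, not part of the original text):\n",
    "\n",
    "```python\n",
    "xs = np.linspace(-3, 3, 50)\n",
    "h = 1e-5\n",
    "numeric = (np.tanh(xs + h) - np.tanh(xs - h)) / (2 * h)  # central difference\n",
    "print(np.max(np.abs(numeric - grad_tanh(xs))))  # should be very close to 0\n",
    "```"
   ]
  },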
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 4. The ReLU function\n",
    "\n",
    "The ReLU function f(x) outputs x directly when x is greater than 0, and outputs 0 otherwise:\n",
    "\n",
    "$$Relu(x)= \\begin{cases} x &  (x > 0)\\\\\n",
    "          0 & (x \\leq 0)\\end{cases}  \\tag{4-7}$$\n",
    "          \n",
    "Its derivative:\n",
    "\n",
    "$$Relu'(x)= \\begin{cases} 1 &  (x > 0)\\\\\n",
    "          0 & (x \\leq 0)\\end{cases}\\tag{4-8}$$\n",
    "\n",
    "The following code computes Relu(x) and Relu'(x) and plots their curves:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def relu(x):\n",
    "    return np.maximum(0, x)\n",
    "def grad_relu(x):\n",
    "    return 1. * (x > 0)\n",
    "\n",
    "x = np.arange(-5.0, 5.0, 0.1)\n",
    "plt.plot(x, relu(x),label=\"relu\")\n",
    "plt.plot(x, grad_relu(x),label=\"derivative\")\n",
    "plt.legend(loc=\"upper right\", frameon=False)\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "There are also variants of the ReLU function, such as the LeakRelu function:\n",
    "\n",
    "$$LeakRelu(x)= \\begin{cases} x &  (x > 0)\\\\\n",
    "          kx & (x \\leq 0)\\end{cases}  \\tag{4-9}$$\n",
    "          \n",
    "Its derivative:\n",
    "\n",
    "$$LeakRelu'(x)= \\begin{cases} 1 &  (x > 0)\\\\\n",
    "          k & (x \\leq 0)\\end{cases}\\tag{4-10}$$\n",
    "\n",
    "The following code computes $LeakRelu(x)$ and $LeakRelu'(x)$ and plots their curves:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "def leakRelu(x,k=0.2):\n",
    "    y = np.copy( x )\n",
    "    y[ y < 0 ] *= k        # slope k on the negative side\n",
    "    return y\n",
    "\n",
    "def grad_leakRelu(x,k=0.2):\n",
    "    grad = np.ones_like(x)\n",
    "    grad[x < 0] = k        # derivative is k for x < 0, 1 otherwise\n",
    "    return grad\n",
    "\n",
    "x = np.arange(-5.0, 5.0, 0.1)\n",
    "plt.plot(x, leakRelu(x),label=\"leakrelu\")\n",
    "plt.plot(x, grad_leakRelu(x),label=\"derivative\")\n",
    "plt.legend(loc=\"upper right\", frameon=False)\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.1.3 Neural Networks and Deep Learning\n",
    "\n",
    "The forward computation (forward propagation) of a neural network can be implemented with the following Python code:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "def sigmoid(x):\n",
    "    return 1 / (1 + np.exp(-x))\n",
    "\n",
    "g1 = sigmoid\n",
    "\n",
    "g2 = sigmoid\n",
    "\n",
    "# x and W1, b1\n",
    "x = np.array([1.0, 0.5])              # input x: vector of length 2\n",
    "W1 = np.array([[0.1, 0.3,0.5,0.2],\n",
    "               [0.4,0.6,0.7, 0.1]])   # W1: 2x4 matrix\n",
    "b1 = np.array([0.1, 0.2, 0.3,0.4])    # bias b1: vector of length 4\n",
    "print(\"x.shape\",x.shape)                        # (2,)\n",
    "print(\"W1.shape\",W1.shape)                       # (2, 4)\n",
    "print(\"b1.shape\",b1.shape)                       # (4,)\n",
    "\n",
    "# compute z1 and a1 from the input x and W1, b1\n",
    "z1 = np.dot(x,W1) + b1                # (4,)\n",
    "a1 = g1(z1)                           # (4,)\n",
    "print(\"z1\",z1)                             # (4,)\n",
    "print(\"a1\",a1) \n",
    "\n",
    "# a1, W2, b2\n",
    "W2 = np.array([[0.1, 1.4,0.2],[2.5, 0.6, 0.3],[1.1,0.7,0.8],[0.3,1.5,2.1]])\n",
    "b2 = np.array([0.1, 2,0.3])\n",
    "print(\"a1.shape\",a1.shape) # (4,)\n",
    "print(\"W2.shape\",W2.shape) # (4, 3)\n",
    "print(\"b2.shape\",b2.shape) # (3,)\n",
    "\n",
    "# compute z2 and a2 from a1, W2, b2\n",
    "z2 = np.dot(a1,W2) + b2\n",
    "a2 = g2(z2)\n",
    "print(\"z2\",z2)\n",
    "print(\"a2\",a2)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.1.4 Forward Computation for Multiple Samples\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "X = np.array([[1.0, 2.],[3.0,4.0]])  # each row is one sample\n",
    "W1 = np.array([[0.1, 0.3,0.5,0.2],\n",
    "               [0.4,0.6,0.7, 0.1]])   # W1: 2x4 matrix\n",
    "b1 = np.array([0.1, 0.2, 0.3,0.4])    # bias b1: row vector of length 4\n",
    "\n",
    "print(\"X.shape\",X.shape) # (2,2)\n",
    "print(\"W1.shape\",W1.shape) # (2, 4)\n",
    "print(\"b1.shape\",b1.shape) # (4,)\n",
    "\n",
    "# compute Z1, A1 of layer 1\n",
    "Z1 = np.dot(X,W1) + b1\n",
    "A1 = sigmoid(Z1)\n",
    "print(\"Z1:\",Z1)\n",
    "print(\"A1:\",A1)\n",
    "\n",
    "W2 = np.array([[0.1, 1.4,0.2],[2.5, 0.6, 0.3],[1.1,0.7,0.8],[0.3,1.5,2.1]])\n",
    "b2 = np.array([0.1, 2,0.3])\n",
    "print(\"A1.shape\",A1.shape) # (2, 4)\n",
    "print(\"W2.shape\",W2.shape) # (4, 3)\n",
    "print(\"b2.shape\",b2.shape) # (3,)\n",
    "\n",
    "# compute Z2, A2 of layer 2\n",
    "Z2 = np.dot(A1,W2) + b2\n",
    "A2 = sigmoid(Z2)\n",
    "print(\"Z2:\",Z2)\n",
    "print(\"A2:\",A2)"
   ]
  },
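  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Since each row of X is one sample, the batched computation above should agree row by row with the single-sample computation of Section 4.1.3. A small illustrative check (using the X, W1, b1 and Z1 defined above):\n",
    "\n",
    "```python\n",
    "z_row0 = np.dot(X[0], W1) + b1     # forward pass of the first sample alone\n",
    "print(np.allclose(z_row0, Z1[0]))  # True: row 0 of the batched result matches\n",
    "```"
   ]
  },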
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.1.6 Loss Functions\n",
    "\n",
    "#### 1. Mean squared error loss\n",
    "\n",
    "For a single sample $(\\pmb f^{(i)},\\pmb y^{(i)} )$, the quantity $\\frac{1}{2}{\\|\\pmb f^{(i)}-\\pmb y^{(i)} \\|_2}^2$ can be computed as follows:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "f = np.array([0.1, 0.2,0.5])\n",
    "y = np.array([0.3, 0.4,0.2])\n",
    "loss =  np.sum((f - y) ** 2)/2 \n",
    "print(loss)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For multiple samples, $\\mathcal L(\\pmb F,\\pmb Y)= \\frac{1}{2m}\\sum_{i=1}^m{\\|\\pmb f^{(i)}-\\pmb y^{(i)} \\|_2}^2$ can be computed with the following code:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 64,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.08499999999999999\n"
     ]
    }
   ],
   "source": [
    "F = np.array([[0.1, 0.2,0.5],[0.1, 0.2,0.5]])\n",
    "Y = np.array([[0.3, 0.4,0.2],[0.3, 0.4,0.2]])\n",
    "\n",
    "m = F.shape[0] #len(F)\n",
    "loss =  np.sum((F - Y) ** 2)/(2*m)\n",
    "# loss = (np.square(F-Y)).mean() \n",
    "print(loss)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The mean squared error can be written as a function:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def mse_loss(F,Y,divide_2=False):\n",
    "    m = F.shape[0]\n",
    "    loss =  np.sum((F - Y) ** 2)/m\n",
    "    if divide_2:\n",
    "        loss/=2\n",
    "    return loss\n",
    "\n",
    "mse_loss(F,Y,True)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 2. Binary cross-entropy loss\n",
    "\n",
    "For a binary classification problem, the output layer has a single logistic-regression neuron whose output is the probability that a sample belongs to a class (say class 1). The probability outputs of all samples form a vector $\\pmb f$. The target value of a training sample is 1 or 0, indicating which class the sample belongs to, and the targets of all samples form a vector $\\pmb y$. The cross-entropy loss is then:\n",
    "\n",
    "$$\\begin{aligned}\n",
    "L(\\pmb f,\\pmb y) &= \\frac{1}{m} \\sum_{i=1}^m L_i( {y}^{(i)}, {f}^{(i)}) = -\\frac{1}{m} \\sum_{i=1}^{m} \\left[  y^{(i)} \\log(f^{(i)}) + (1- y^{(i)})\\log(1- f^{(i)}) \\right] \\\\\n",
    " &= -\\frac{1}{m}sum(\\pmb y \\odot \\log{\\pmb f} +(1-\\pmb y) \\odot \\log(1-\\pmb f))\n",
    "\\end{aligned}  \\tag{4-30}$$\n",
    "\n",
    "where $y^{(i)}$ is 1 or 0, indicating the class of the sample, and $f^{(i)}$ is the probability that the sample belongs to class 1.\n",
    "\n",
    "The binary cross-entropy loss can be computed with the following code.\n",
    "```python\n",
    "- (1./m)*np.sum(np.multiply(y,np.log(f)) + np.multiply((1 - y), np.log(1 - f)))\n",
    "```\n",
    "For example:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "f = np.array([0.1, 0.2,0.5])   # probabilities of class 1 for 3 samples\n",
    "y = np.array([0,   1,   0])   # class labels of the 3 samples\n",
    "m = y.shape[0]\n",
    "\n",
    "loss = - (1./m)*np.sum(np.multiply(y,np.log(f)) + np.multiply((1 - y), np.log(1 - f)))\n",
    "print(loss)  "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To prevent log() from misbehaving when f or 1-f is 0, a tiny quantity 𝜖 can be added inside the logarithm, giving the following binary cross-entropy loss function:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def binary_cross_entropy(f,y,epsilon = 1e-8):\n",
    "    #np.sum(y*np.log(f+epsilon)+ (1-y)*np.log(1-f+epsilon), axis=1)   \n",
    "    m = len(y)\n",
    "    return - (1./m)*np.sum(np.multiply(y,np.log(f+epsilon)) + \n",
    "                           np.multiply((1 - y), np.log(1 - f+epsilon)))    \n",
    "\n",
    "binary_cross_entropy(f,y)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 3. Multi-class cross-entropy loss\n",
    "\n",
    "The cross-entropy loss above is for binary classification; it generalizes to problems with more than 2 classes. Let $f^{(i)}_c$ denote the probability that the i-th sample belongs to class c, and let $y^{(i)}_c$ be 1 or 0 according to whether the i-th sample belongs to class c; that is, the target of each sample is a one-hot vector $y^{(i)}$. From the softmax regression of Chapter 3, the cross-entropy loss over multiple samples is:\n",
    "\n",
    "$${L}(\\pmb f,\\pmb y)= \\frac{1}{m}  \\sum_{i=1}^m {L}_i(\\mathbf{y}^{(i)},\\mathbf{f}^{(i)}) = -\\frac{1}{m} \\sum_{i=1}^m \\sum_{c=1}^{C} y^{(i)}_c \\cdot \\log( f^{(i)}_c )= -\\frac{1}{m} \\sum_{i=1}^m y^{(i)} \\cdot \\log( f^{(i)}) \\tag{4-31}$$\n",
    "\n",
    "If the targets of all samples are one-hot vectors, then, as in Chapter 3, for m samples ${\\mathcal L}(\\pmb {f},\\pmb{y})$ can be written with a vectorized Hadamard product:\n",
    "\n",
    "$${\\mathcal L}(\\pmb {f},\\pmb{y})= -\\frac{1}{m}sum( \\mathbf{y} \\odot \\log(\\mathbf{f}))\\tag{4-32}$$\n",
    "\n",
    "In numpy code this can be written as: \n",
    "```\n",
    "-(1./m)*np.sum(np.multiply(y, np.log(f)))\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def cross_entropy_loss_onehot(F,Y):\n",
    "    m = len(F)  #F.shape[0] \n",
    "    return -(1./m) *np.sum(np.multiply(Y, np.log(F)))\n",
    "\n",
    "F = np.array([[0.2,0.5,0.3],[0.4,0.3,0.3]])\n",
    "Y = np.array([[0,0,1],[1,0,0]])\n",
    "cross_entropy_loss_onehot(F,Y)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The target of each sample may also be represented not by a one-hot vector but by a single integer indicating its class: for a C-class problem, the integers $0,1,2,\\cdots,C-1$ denote the classes, e.g. the integer 2 means the sample belongs to the third class. The cross-entropy loss of such a sample is the negative log of the corresponding component of $f^{(i)}$ (the component with index 2, $f^{(i)}_2$), i.e. $-\\log f^{(i)}_2$.\n",
    "\n",
    "For integer-coded target classes, the multi-class cross-entropy loss is:\n",
    "\n",
    "$${\\mathcal L}(\\pmb f,\\pmb y)= \\frac{1}{m}  \\sum_{i=1}^m {L}_i(\\pmb{y}^{(i)},\\pmb{f}^{(i)}) =  -\\frac{1}{m} \\sum_{i=1}^m  \\log( f^{(i)}_{y^{(i)}}) \\tag{4-33}$$\n",
    "\n",
    "where $y^{(i)}$ is the integer (index) of the class the i-th sample belongs to.\n",
    "\n",
    "Accordingly, the following multi-class cross-entropy function can be defined:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [],
   "source": [
    "def cross_entropy_loss(F,Y,onehot=False):\n",
    "    m = len(F) #F.shape[0]      # number of samples\n",
    "    if onehot:\n",
    "        return -(1./m) *np.sum(np.multiply(Y, np.log(F)))\n",
    "    else: return  - (1./m) *np.sum( np.log(F[range(m),Y]) )  # log of the entry in F[i] for class Y[i]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "F = np.array([[0.2,0.5,0.3],[0.4,0.3,0.3]])  # each row is one sample\n",
    "Y = np.array([2,0])  # sample 1 belongs to class 2, sample 2 to class 0\n",
    "\n",
    "cross_entropy_loss(F,Y)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.1.7 Training a Neural Network with Numerical Gradients"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "def initialize_parameters(n_x, n_h, n_o):\n",
    "    np.random.seed(2)            # fix the seed so every run produces the same random numbers   \n",
    "  \n",
    "    W1 = np.random.randn(n_x,n_h)* 0.01\n",
    "    b1 = np.zeros((1,n_h))\n",
    "    W2 = np.random.randn(n_h,n_o) * 0.01\n",
    "    b2 = np.zeros((1,n_o))\n",
    "   \n",
    "    assert (W1.shape == (n_x, n_h))\n",
    "    assert (b1.shape == (1, n_h))\n",
    "    assert (W2.shape == (n_h, n_o))\n",
    "    assert (b2.shape == (1, n_o))\n",
    "    \n",
    "    parameters = [W1,b1,W2,b2]\n",
    "    return parameters"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Test this function:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "n_x, n_h, n_o = 2,4,3\n",
    "parameters = initialize_parameters(n_x, n_h, n_o)\n",
    "print(\"W1 = \" + str(parameters[0]))\n",
    "print(\"b1 = \" + str(parameters[1]))\n",
    "print(\"W2 = \" + str(parameters[2]))\n",
    "print(\"b2 = \" + str(parameters[3]))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Write the forward computation function forward_propagation(X, parameters):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def sigmoid(x):\n",
    "    return 1 / (1 + np.exp(-x))\n",
    "\n",
    "def forward_propagation(X, parameters):     \n",
    "    W1,b1,W2,b2 = parameters      \n",
    "   \n",
    "    Z1 = np.dot(X,W1) + b1    # Z1 shape: (3,2)(2,4)+(1,4)=>(3,4)  \n",
    "    A1 = np.tanh(Z1)\n",
    "    Z2 = np.dot(A1,W2) + b2   # Z2 shape: (3,4)(4,3)+(1,3)=>(3,3) \n",
    "    #A2 = sigmoid(Z2)  \n",
    "   \n",
    "    assert(Z2.shape == (X.shape[0],3))\n",
    "    return Z2"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Test this function:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "X = np.array([[1.,2.],[3.,4.],[5.,6.]])  # each row is one sample\n",
    "\n",
    "Z2 = forward_propagation(X, parameters)\n",
    "print(Z2)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The forward computation outputs the score $Z$ of each class. The scores can be converted into class probabilities with the softmax() function, and the probabilities are then combined with the true targets to compute the multi-class cross-entropy loss. The functions softmax_cross_entropy() and softmax_cross_entropy_reg() compute the cross-entropy loss from the output scores Z and the true values y; the latter also includes the regularization loss (reg is the regularization coefficient)."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {},
   "outputs": [],
   "source": [
    "def softmax(Z):\n",
    "    exp_Z = np.exp(Z-np.max(Z,axis=1,keepdims=True))\n",
    "    return exp_Z/np.sum(exp_Z,axis=1,keepdims=True)\n",
    "\n",
    "def softmax_cross_entropy(Z, y, onehot=False):\n",
    "    m = len(Z)\n",
    "    F = softmax(Z)\n",
    "    if onehot:\n",
    "        loss = -np.sum(y*np.log(F))/m\n",
    "    else:\n",
    "        y = y.flatten()\n",
    "        log_Fy = -np.log(F[range(m),y])        \n",
    "        loss = np.sum(log_Fy) / m\n",
    "    return loss\n",
    "\n",
    "def softmax_cross_entropy_reg(Z, y, parameters,onehot=False,reg=1e-3):\n",
    "    W1 = parameters[0]  \n",
    "    W2 = parameters[2]\n",
    "    L  = softmax_cross_entropy(Z,y,onehot)+ reg*(np.sum(W1**2)+np.sum(W2**2))    \n",
    "    assert(isinstance(L, float))    \n",
    "    return L"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "y = np.array([2,0,1])  # one target label per sample\n",
    "softmax_cross_entropy_reg(Z2,y,parameters)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Usually we want the network to compute the loss directly from a batch of inputs X and the corresponding targets y, so the forward computation and the cross-entropy loss function are combined into one function:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def compute_loss_reg(f,loss,X, y, parameters,reg=1e-3):\n",
    "    Z2 = f(X,parameters)\n",
    "    return loss(Z2,y,parameters,reg=reg)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Test this function:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "reg  =1e-3\n",
    "compute_loss_reg(forward_propagation,softmax_cross_entropy_reg, X, y, parameters,reg)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Define a function f() that returns the loss value of the network; passing it together with the model parameters to the general numerical-gradient function of Section 2.4 computes the numerical gradients of the neural network."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import util\n",
    "\n",
    "def f():\n",
    "    return compute_loss_reg(forward_propagation,softmax_cross_entropy_reg, X, y, parameters,reg)\n",
    "num_grads = util.numerical_gradient(f,parameters)\n",
    "print(num_grads[0])\n",
    "print(num_grads[3])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Now the earlier gradient descent method can be adapted to train the neural network model:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def max_abs(grads):\n",
    "    return max([np.max(np.abs(grad)) for grad in grads])\n",
    "    \n",
    "def gradient_descent_ANN(f,X, y,parameters, reg=0., alpha=0.01, \n",
    "                         iterations=100,gamma = 0.8,epsilon=1e-8):   \n",
    "    losses = []\n",
    "    for i in range(0,iterations):\n",
    "        loss = f()\n",
    "        grads = util.numerical_gradient(f, parameters)      \n",
    "        if max_abs(grads)<epsilon:\n",
    "            print(\"gradient is small enough!\")\n",
    "            print(\"iterated num is :\",i)\n",
    "            break  \n",
    "        for param, grad in zip(parameters, grads):\n",
    "            param-=alpha * grad  \n",
    "            \n",
    "        losses.append(loss)\n",
    "    return parameters,losses"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Now test again on the earlier spiral dataset. To illustrate the complete training process, the neural network is redefined with 5 neurons in the hidden layer."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import matplotlib.pyplot as plt\n",
    "%matplotlib inline\n",
    "\n",
    "np.random.seed(100)\n",
    "\n",
    "def gen_spiral_dataset(N=100,D=2,K=3):   \n",
    "    X = np.zeros((N*K,D)) # data matrix (each row = single example)\n",
    "    y = np.zeros(N*K, dtype='uint8') # class labels\n",
    "    for j in range(K):\n",
    "        ix = range(N*j,N*(j+1))\n",
    "        r = np.linspace(0.0,1,N) # radius\n",
    "        t = np.linspace(j*4,(j+1)*4,N) + np.random.randn(N)*0.2  # theta\n",
    "        X[ix] = np.c_[r*np.sin(t), r*np.cos(t)]\n",
    "        y[ix] = j\n",
    "    return X,y\n",
    "\n",
    "N,D,K = 100 ,2,3\n",
    "\n",
    "X_spiral,y_spiral = gen_spiral_dataset()\n",
    "plt.scatter(X_spiral[:, 0], X_spiral[:, 1], c=y_spiral, s=40, cmap=plt.cm.Spectral)\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "X = X_spiral\n",
    "y = y_spiral\n",
    "n_x, n_h, n_o = 2,5,3\n",
    "parameters = initialize_parameters(n_x, n_h, n_o)\n",
    "alpha = 1e-0\n",
    "iterations  =1000\n",
    "lambda_ = 1e-3\n",
    "parameters,losses = gradient_descent_ANN(f,X,y,parameters,lambda_, alpha, iterations)\n",
    "for param in parameters:\n",
    "    print(param)\n",
    "print(losses[:-1:len(losses)//10])\n",
    "plt.plot(losses, color='r')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The following function computes the model's prediction accuracy on the dataset (𝑋,𝑦) by comparing the predictions with the target values:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def getAccuracy(X,y,parameters):\n",
    "    predicts = forward_propagation(X,parameters)    \n",
    "    #probs = softmax(np.dot(X,w))\n",
    "    predicts = np.argmax(predicts,axis=1) \n",
    "    accuracy = sum(predicts == y)/(float(len(y)))\n",
    "    return accuracy"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "getAccuracy(X,y,parameters)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The model reaches a prediction accuracy of 0.943 on the training set, whereas the earlier softmax regression model only reached 0.516. The decision regions can again be plotted with code similar to the earlier example:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# plot the resulting classifier\n",
    "h = 0.02\n",
    "x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1\n",
    "y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1\n",
    "xx, yy = np.meshgrid(np.arange(x_min, x_max, h),\n",
    "                     np.arange(y_min, y_max, h))\n",
    "\n",
    "XX = np.c_[xx.ravel(), yy.ravel()]\n",
    "Z = forward_propagation(XX,parameters)\n",
    "Z = np.argmax(Z, axis=1)\n",
    "Z = Z.reshape(xx.shape)\n",
    "fig = plt.figure()\n",
    "plt.contourf(xx, yy, Z, cmap=plt.cm.Spectral, alpha=0.3)\n",
    "plt.scatter(X[:, 0], X[:, 1], c=y, s=40, cmap=plt.cm.Spectral)\n",
    "plt.xlim(xx.min(), xx.max())\n",
    "plt.ylim(yy.min(), yy.max())\n",
    "#fig.savefig('spiral_linear.png')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As can be seen, the decision boundaries of the 2-layer neural network are no longer straight lines but can be arbitrarily curved."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 4.2 Backward Differentiation\n",
    "### 4.2.3 Gradient of the Loss with Respect to the Output\n",
    "#### 1. Gradient of the binary cross-entropy loss with respect to the output"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "def sigmoid(x):\n",
    "    return 1 / (1 + np.exp(-x))\n",
    "\n",
    "def binary_cross_entropy(f,y,epsilon = 1e-8):\n",
    "    #np.sum(y*np.log(f+epsilon)+ (1-y)*np.log(1-f+epsilon), axis=1)    \n",
    "    m = len(y)\n",
    "    return - (1./m)*np.sum(np.multiply(y,np.log(f+epsilon)) + np.multiply((1 - y), np.log(1 - f+epsilon)))\n",
    "\n",
    "def binary_cross_entropy_grad(out,y,sigmoid_out = True,epsilon = 1e-8):\n",
    "    if sigmoid_out:\n",
    "        f = out\n",
    "        grad = ((f-y)/(f*(1-f)+epsilon)  )/(len(y)) \n",
    "    else:\n",
    "        f = sigmoid(out) # out is z\n",
    "        grad = (f-y)/(len(y)) \n",
    "    return grad\n",
    "\n",
    "def binary_cross_entropy_loss_grad(out,y,sigmoid_out = True,epsilon = 1e-8):     \n",
    "    if sigmoid_out: \n",
    "        f = out\n",
    "        grad = ((f-y)/(f*(1-f)+epsilon)  )/(len(y))         \n",
    "    else: \n",
    "        f = sigmoid(out) # out is z\n",
    "        grad = (f-y)/(len(y)) \n",
    "    loss = binary_cross_entropy(f,y,epsilon)\n",
    "    return loss,grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "z = np.array([-4, 5,2])   # class scores of 3 samples\n",
    "f = sigmoid(z)  # probabilities of class 1 for the 3 samples\n",
    "y = np.array([0,   1,   0])   # class labels of the 3 samples\n",
    "\n",
    "loss,grad = binary_cross_entropy_loss_grad(z,y,False)\n",
    "print(loss,grad)\n",
    "loss,grad = binary_cross_entropy_loss_grad(f,y)\n",
    "print(loss,grad)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 2. Gradient of the mean squared error loss with respect to the output \n",
    "For the matrices $\\pmb{F},\\pmb{Y}$ formed by multiple samples, the gradient of the mean squared error $L(\\pmb{F},\\pmb{Y}) = \\frac{1}{2m}\\sum_{i=1}^m{\\|\\pmb f^{(i)}-\\pmb y^{(i)} \\|_2}^2$ with respect to $\\pmb{F}$ is $\\frac{1}{m}(\\pmb{F}-\\pmb{Y})$. Since $\\pmb F=\\pmb Z$:\n",
    "\n",
    "$$\\frac{\\partial\\mathcal{L}}{\\partial{\\pmb Z}} = \\frac{\\partial\\mathcal{L}}{\\partial{\\pmb F}} = \\frac{1}{m}(\\pmb{F}-\\pmb{Y})\\tag{4-39}$$"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "def mse_loss_grad(f,y):\n",
    "    # note: this version omits the 1/2 factor of (4-39): loss = (1/m)*sum((f-y)^2), so grad = (2/m)*(f-y)\n",
    "    m = len(f)\n",
    "    loss = (1./m)*np.sum((f-y)**2)\n",
    "    grad = (2./m)*(f-y)\n",
    "    return loss,grad"
   ]
  },
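  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The gradient returned by mse_loss_grad() can be verified with a finite-difference check: perturbing one entry of f by a small h should change the loss by approximately grad times h (an illustrative check, not part of the original text):\n",
    "\n",
    "```python\n",
    "f = np.array([[0.1, 0.2], [0.5, 0.3]])\n",
    "y = np.array([[0.3, 0.4], [0.2, 0.1]])\n",
    "loss, grad = mse_loss_grad(f, y)\n",
    "h = 1e-6\n",
    "f2 = f.copy()\n",
    "f2[0, 1] += h                          # perturb a single entry\n",
    "loss2, _ = mse_loss_grad(f2, y)\n",
    "print((loss2 - loss) / h, grad[0, 1])  # the two values should agree closely\n",
    "```"
   ]
  },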
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 3. Gradient of the multi-class cross-entropy loss with respect to the output "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {},
   "outputs": [],
   "source": [
    "def softmax(Z):\n",
    "    A = np.exp(Z-np.max(Z,axis=-1,keepdims=True))\n",
    "    return A/np.sum(A,axis=-1,keepdims=True)\n",
    "\n",
    "def cross_entropy_grad(Z,Y,onehot = False,softmax_out=False):  \n",
    "    if softmax_out:\n",
    "        F = Z\n",
    "    else:\n",
    "        F = softmax(Z)\n",
    "    if onehot:\n",
    "        dZ = (F - Y) /len(Z)\n",
    "    else:\n",
    "        m = len(Y)\n",
    "        dZ = F.copy()\n",
    "        dZ[np.arange(m),Y] -= 1\n",
    "        dZ /= m\n",
    "        #I_i = np.zeros_like(Z)\n",
    "        #I_i[np.arange(len(Z)),Y] = 1    \n",
    "        #return (F - I_i) /len(Z)  #Z.shape[0]\n",
    "    return dZ"
   ]
  },
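  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "cross_entropy_grad() can likewise be checked against a finite-difference approximation. The helper ce_loss() below recomputes the integer-label cross-entropy (it mirrors softmax_cross_entropy from Section 4.1.7; this check is illustrative and not part of the original text):\n",
    "\n",
    "```python\n",
    "def ce_loss(Z, Y):\n",
    "    # multi-class cross-entropy with integer labels\n",
    "    F = softmax(Z)\n",
    "    m = len(Y)\n",
    "    return -np.sum(np.log(F[np.arange(m), Y])) / m\n",
    "\n",
    "Z = np.array([[1.0, 2.0, 0.5], [0.3, 0.1, 1.2]])\n",
    "Y = np.array([2, 0])\n",
    "dZ = cross_entropy_grad(Z, Y)\n",
    "h = 1e-6\n",
    "Zh = Z.copy()\n",
    "Zh[0, 1] += h                              # perturb one score\n",
    "print((ce_loss(Zh, Y) - ce_loss(Z, Y)) / h, dZ[0, 1])  # should agree closely\n",
    "```"
   ]
  },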
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.2.5 Python Implementation of a 2-Layer Neural Network"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from util import *\n",
    "def dRelu(x):\n",
    "    return 1 * (x > 0)\n",
    "\n",
    "def max_abs(s):\n",
    "    max_value = 0\n",
    "    for x in s:\n",
    "        max_value_ = np.max(np.abs(x))\n",
    "        if(max_value_>max_value):\n",
    "            max_value = max_value_\n",
    "    return max_value\n",
    "\n",
    "class TwoLayerNN:\n",
    "    def __init__(self, input_units, hidden_units,output_units):\n",
    "        # initialize parameters randomly\n",
    "        n = input_units\n",
    "        h = hidden_units\n",
    "        K = output_units\n",
    "      \n",
    "        self.W1 = 0.01 * np.random.randn(n,h)\n",
    "        self.b1 = np.zeros((1,h))\n",
    "        self.W2 = 0.01 * np.random.randn(h,K)\n",
    "        self.b2 = np.zeros((1,K))        \n",
    "        \n",
    "        \n",
    "    def train(self,X,y,reg=0,iterations=10000, learning_rate=1e-0,epsilon = 1e-8):\n",
    "        m = X.shape[0]\n",
    "        W1 =  self.W1\n",
    "        b1 =  self.b1\n",
    "        W2 =  self.W2\n",
    "        b2 =  self.b2        \n",
    "        for i in range(iterations):\n",
    "            # forward evaluate class scores, [N x K]\n",
    "            Z1 = np.dot(X, W1) + b1\n",
    "            A1 = np.maximum(0,Z1)  #ReLU activation\n",
    "            Z2 = np.dot(A1, W2) + b2\n",
    "            \n",
    "            data_loss = softmax_cross_entropy(Z2,y)\n",
    "            reg_loss = reg*np.sum(W1*W1) + reg*np.sum(W2*W2)\n",
    "            loss = data_loss + reg_loss\n",
    "            if i % 1000 == 0:\n",
    "                print(\"iteration %d: loss %f\" % (i, loss))\n",
    "    \n",
    "            # backward \n",
    "            dZ2 = cross_entropy_grad(Z2,y)\n",
    "            dW2 = np.dot(A1.T, dZ2) +2*reg*W2\n",
    "            db2 = np.sum(dZ2, axis=0, keepdims=True)\n",
    "            dA1 = np.dot(dZ2,W2.T)\n",
    "            \n",
    "        \n",
    "            dA1[A1 <= 0] = 0\n",
    "            dZ1 = dA1\n",
    "            #dZ1 = dA1*dReLU(A1)   \n",
    "            #dZ1 = np.multiply(dA1,dRelu(A1) )           \n",
    "            dW1 = np.dot(X.T, dZ1)+2*reg*W1\n",
    "            db1 = np.sum(dZ1, axis=0, keepdims=True)      \n",
    "            \n",
    "            if max_abs([dW2,db2,dW1,db1])<epsilon:\n",
    "                print(\"gradient is small enough at iter : \",i);\n",
    "                break\n",
    "\n",
    "            # perform a parameter update\n",
    "            W1 += -learning_rate * dW1\n",
    "            b1 += -learning_rate * db1\n",
    "            W2 += -learning_rate * dW2\n",
    "            b2 += -learning_rate * db2\n",
    "        return W1,b1,W2,b2  \n",
    "    \n",
    "    def predict(self,X):\n",
    "        Z1 = np.dot(X, self.W1) + self.b1\n",
    "        A1 = np.maximum(0,Z1)  #ReLU activation\n",
    "        Z2 = np.dot(A1, self.W2) + self.b2\n",
    "        return Z2"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import matplotlib.pyplot as plt\n",
    "%matplotlib inline\n",
    "import  data_set  as ds\n",
    "\n",
    "np.random.seed(89)\n",
    "X,y = ds.gen_spiral_dataset()\n",
    "\n",
    "# lets visualize the data:\n",
    "plt.scatter(X[:, 0], X[:, 1], c=y, s=20, cmap=plt.cm.spring)\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "nn = TwoLayerNN(2,100,3)\n",
    "W1,b1,W2,b2 = nn.train(X,y)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The following code prints the accuracy of the trained model:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# evaluate training set accuracy\n",
    "#A1 = np.maximum(0, np.dot(X, W1) + b1)\n",
    "#Z2 = np.dot(A1, W2) + b2\n",
    "Z2 = nn.predict(X)\n",
    "predicted_class = np.argmax(Z2, axis=1)\n",
    "print ('training accuracy: %.2f' % (np.mean(predicted_class == y)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As can be seen, the model trained with analytically computed gradients is more accurate, reaching 99%. The decision boundary visualized by the following code is also better than that of the model trained with numerical gradients:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# plot the resulting classifier\n",
    "h = 0.02\n",
    "x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1\n",
    "y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1\n",
    "xx, yy = np.meshgrid(np.arange(x_min, x_max, h),\n",
    "                     np.arange(y_min, y_max, h))\n",
    "XX = np.c_[xx.ravel(), yy.ravel()]\n",
    "Z = nn.predict(XX)\n",
    "Z = np.argmax(Z, axis=1)\n",
    "Z = Z.reshape(xx.shape)\n",
    "fig = plt.figure()\n",
    "plt.contourf(xx, yy, Z, cmap=plt.cm.Spectral, alpha=0.8)\n",
    "plt.scatter(X[:, 0], X[:, 1], c=y, s=20, cmap=plt.cm.spring)\n",
    "plt.xlim(xx.min(), xx.max())\n",
    "plt.ylim(yy.min(), yy.max())\n",
    "#fig.savefig('spiral_net.png')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 4.3 Implementing a Simple Deep Learning Framework\n",
    "\n",
    "### 4.3.2 Implementing the network layers\n",
    "\n",
    "The Layer class represents an abstract neural network layer. Besides the constructor __init__(), it contains two main methods: the forward computation forward(self, x), which takes an input x and produces an output, and the backward differentiation backward(self, grad), which takes the gradient grad passed back from the following layer. Here grad is the gradient of the loss function with respect to this layer's output (for the last layer, grad is the gradient of the loss with respect to the output layer's output). backward() computes the gradients of the quantities this layer involves (such as the weighted sum Z and the weight parameters W)."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [],
   "source": [
    "class Layer:\n",
    "    def __init__(self):\n",
    "        pass\n",
    "    def forward(self, x):        \n",
    "        raise NotImplementedError\n",
    "        \n",
    "    def backward(self, grad):      \n",
    "        raise NotImplementedError"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Below is the implementation of the network layer:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [],
   "source": [
    "class Layer:\n",
    "    def __init__(self):\n",
    "        pass\n",
    "    def forward(self, x):        \n",
    "        raise NotImplementedError\n",
    "        \n",
    "    def backward(self, grad):      \n",
    "        raise NotImplementedError\n",
    "\n",
    "class Dense(Layer):\n",
    "    def __init__(self, input_dim, out_dim,activation=None): \n",
    "        super().__init__()\n",
    "        self.W = np.random.randn(input_dim, out_dim) * 0.01  #0.01 * np.random.randn\n",
    "        self.b = np.zeros((1,out_dim))  #np.zeros(out_dim)     \n",
    "      \n",
    "        self.activation = activation\n",
    "        self.A = None\n",
    "        \n",
    "    def forward(self, x):\n",
    "        # f(x) = xw+b\n",
    "        self.x = x\n",
    "        Z = np.matmul(x, self.W) + self.b\n",
    "        self.A = self.g(Z)       \n",
    "        return self.A\n",
    "    \n",
    "    def backward(self, dA_out):\n",
    "        # backward propagation\n",
    "        A_in = self.x       \n",
    "        dZ = self.dZ_(dA_out)\n",
    "        \n",
    "        self.dW = np.dot(A_in.T, dZ)\n",
    "        self.db = np.sum(dZ, axis=0, keepdims=True)          \n",
    "        dA_in = np.dot(dZ, np.transpose(self.W))    \n",
    "        return dA_in\n",
    "    \n",
    "    def g(self,z):\n",
    "        if self.activation=='relu':\n",
    "            return np.maximum(0, z)  \n",
    "        elif self.activation=='sigmoid':\n",
    "            return 1 / (1 + np.exp(-z)) \n",
    "        else:\n",
    "            return z\n",
    "        \n",
    "    def dZ_(self,dA_out):\n",
    "        if self.activation=='relu':\n",
    "            grad_g_z = 1. * (self.A > 0)  # equivalent to 1.*(Z > 0) since A = max(0, Z)\n",
    "            return np.multiply(dA_out,grad_g_z)\n",
    "        elif self.activation=='sigmoid':\n",
    "            grad_g_z = self.A*(1-self.A)\n",
    "            return np.multiply(dA_out,grad_g_z)\n",
    "        else:\n",
    "            return dA_out"
   ]
  },
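The `dZ_` method's ReLU branch computes the mask from the activation `A` rather than the pre-activation `Z`; the two are interchangeable because `A = max(0, Z)`. A minimal self-contained check with made-up toy values:

```python
import numpy as np

Z = np.array([[-1.5, 0.0, 2.0],
              [ 3.0, -0.2, 0.7]])
A = np.maximum(0, Z)  # ReLU forward output

# The backward masks computed from A and from Z agree everywhere,
# including at Z == 0 where both are 0.
assert np.array_equal(1. * (A > 0), 1. * (Z > 0))
print("masks match")
```

This is why the layer only needs to cache `A`, not `Z`, to run its backward pass.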
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let us test the forward() method of this Dense layer:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(3, 10)\n",
      "[[-0.03953509 -0.00214997  0.00743433 -0.16926214 -0.05162853  0.06734225\n",
      "  -0.00221485 -0.11710758 -0.07046456  0.02609659]\n",
      " [ 0.00848392  0.08259757 -0.09858177  0.0374092  -0.08303008  0.04151241\n",
      "  -0.01407859 -0.02415486  0.04236149  0.0648261 ]\n",
      " [-0.13877363 -0.04122276 -0.00984716 -0.03461381  0.11513754  0.1043094\n",
      "   0.00170353 -0.00449278 -0.0057236  -0.01403174]]\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "np.random.seed(1)\n",
    "x = np.random.randn(3,48)  # 3 samples, each a 48-dimensional feature vector\n",
    "dense = Dense(48,10)\n",
    "o = dense.forward(x)\n",
    "print(o.shape)\n",
    "print(o)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.3.3 Gradient checking for network layers\n",
    "\n",
    "The code below assumes that $f$ is a function of a multi-variable parameter $p$, i.e., given $p$ we can evaluate $f(p)$. If the gradient $ \\frac{\\partial \\mathcal{L}}{\\partial f}$ of the loss function $\\mathcal{L}$ with respect to $f$ is known, the gradient of $\\mathcal{L}$ with respect to $p$ can be computed from it:\n",
    "\n",
    "$$\\frac{\\partial \\mathcal{L} }{\\partial p} = \\frac{\\partial \\mathcal{L}}{\\partial f}\\frac{\\partial f}{\\partial p}  \\tag{4-73}$$\n",
    "\n",
    "Writing grad and df for $\\frac{\\partial \\mathcal{L}}{\\partial p}$ and $\\frac{\\partial \\mathcal{L}}{\\partial f}$ respectively, this is:\n",
    "\n",
    "$$grad = df \\frac{\\partial f}{\\partial p} \\tag{4-74}$$\n",
    "\n",
    "That is, approximating each $\\frac{\\partial f}{\\partial p_j}$ by a central difference:\n",
    "\n",
    "$$\\frac{\\partial \\mathcal{L} }{\\partial p_j } = \\sum_{i}\\frac{\\partial \\mathcal{L}}{\\partial f_i}\\frac{f_i(p_j+\\epsilon) - f_i(p_j-\\epsilon)} {2\\epsilon} = \\frac{\\partial \\mathcal{L}}{\\partial f} \\cdot \\frac{ f(p_j+\\epsilon) - f(p_j-\\epsilon)}{2\\epsilon}  = df \\cdot \\frac{ f(p_j+\\epsilon) - f(p_j-\\epsilon)}{2\\epsilon}\\tag{4-77}$$\n",
    "\n",
    "Here $f$ is the output of the dense layer's forward() method. Denoting this computation by $f = dense.forward(x)$, which depends on some parameter $p$, the numerical derivative of the loss with respect to $p$ can be implemented as follows:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 32,
   "metadata": {},
   "outputs": [],
   "source": [
    "def numerical_gradient_from_df(f, p, df, h=1e-5):\n",
    "  grad = np.zeros_like(p)\n",
    "  it = np.nditer(p, flags=['multi_index'], op_flags=['readwrite'])\n",
    "  while not it.finished:\n",
    "    idx = it.multi_index\n",
    "    \n",
    "    oldval = p[idx]\n",
    "    p[idx] = oldval + h\n",
    "    pos = f()       # re-evaluate f() after perturbing the parameter p[idx] it depends on\n",
    "    p[idx] = oldval - h\n",
    "    neg = f()       # re-evaluate f() after perturbing the parameter p[idx] it depends on\n",
    "    p[idx] = oldval\n",
    "       \n",
    "    grad[idx] = np.sum((pos - neg) * df) / (2 * h)\n",
    "    #grad[idx] = np.dot((pos - neg), df) / (2 * h)\n",
    "    it.iternext()\n",
    "  return grad"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The code below first simulates a gradient df of the loss with respect to the dense layer's output, then calls `dense.backward(df)` to back-propagate it and compute the gradients of the layer's parameters; the returned dx is the gradient with respect to the input x. It then uses the numerical gradient function numerical_gradient_from_df above to compute the numerical gradient dx_num with respect to x, and compares the error between dx and dx_num:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 33,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "2.1851062625977136e-12\n"
     ]
    }
   ],
   "source": [
    "df = np.random.randn(3, 10)\n",
    "dx = dense.backward(df)\n",
    "dx_num = numerical_gradient_from_df(lambda :dense.forward(x),x,df)\n",
    "\n",
    "diff_error = lambda x, y: np.max(np.abs(x - y)) \n",
    "print(diff_error(dx,dx_num))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We can likewise compare the gradients of the layer's parameters. The following code checks whether the gradient with respect to the weight parameter W of dense is consistent:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 34,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "2.2715163083830703e-12\n"
     ]
    }
   ],
   "source": [
    "dW_num = numerical_gradient_from_df(lambda :dense.forward(x),dense.W,df)\n",
    "print(diff_error(dense.dW,dW_num))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.3.4 The neural network class\n",
    "\n",
    "On top of the layers, we can define a class NeuralNetwork that represents the whole neural network:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {},
   "outputs": [],
   "source": [
    "class NeuralNetwork:   \n",
    "    def __init__(self):\n",
    "        self._layers = []\n",
    " \n",
    "    def add_layer(self, layer):       \n",
    "        self._layers.append(layer)\n",
    "    \n",
    "    def forward(self, X):    \n",
    "        self.X = X\n",
    "        for layer in self._layers:           \n",
    "            X = layer.forward(X)      \n",
    "        return X     \n",
    "    \n",
    "    def predict(self, X):        \n",
    "        p = self.forward(X)\n",
    "     \n",
    "        if p.ndim == 1:     # single sample\n",
    "            return np.argmax(p)\n",
    "        \n",
    "        # multiple samples\n",
    "        return np.argmax(p, axis=1)\n",
    "  \n",
    "\n",
    "    def backward(self,loss_grad,reg = 0.):\n",
    "        for i in reversed(range(len(self._layers))):\n",
    "            layer = self._layers[i] \n",
    "            loss_grad = layer.backward(loss_grad)\n",
    "            \n",
    "        for i in range(len(self._layers)):\n",
    "            self._layers[i].dW += 2*reg * self._layers[i].W\n",
    "    \n",
    "    def reg_loss(self,reg):\n",
    "        loss = 0\n",
    "        for i in range(len(self._layers)):\n",
    "            loss+= reg*np.sum(self._layers[i].W*self._layers[i].W)\n",
    "        return loss\n",
    "    \n",
    "    def update_parameters(self,learning_rate):\n",
    "        for i in range(len(self._layers)):  \n",
    "            self._layers[i].W += -learning_rate *  self._layers[i].dW\n",
    "            self._layers[i].b += -learning_rate * self._layers[i].db \n",
    "            \n",
    "    def parameters(self):\n",
    "        params = []\n",
    "        for i in range(len(self._layers)):\n",
    "            params.append(self._layers[i].W)\n",
    "            params.append(self._layers[i].b)\n",
    "        return params\n",
    "    \n",
    "    def grads(self):\n",
    "        grads = []\n",
    "        for i in range(len(self._layers)):\n",
    "            grads.append(self._layers[i].dW)\n",
    "            grads.append(self._layers[i].db)\n",
    "        return grads"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "With the layer class Layer and the NeuralNetwork class, we can define a two-layer neural network model for a practical problem such as classifying a set of points in the 2D plane:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 36,
   "metadata": {},
   "outputs": [],
   "source": [
    "nn = NeuralNetwork()\n",
    "nn.add_layer(Dense(2, 100, 'relu'))\n",
    "nn.add_layer(Dense(100, 3))  # no activation: softmax is folded into the loss"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For a multi-class problem, the earlier softmax_cross_entropy() and cross_entropy_grad() compute the multi-class cross-entropy loss and its gradient with respect to the weighted sums:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 37,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "1.098695480580774 -9.25185853854297e-18\n"
     ]
    }
   ],
   "source": [
    "X_temp = np.random.randn(2,2)\n",
    "y_temp = np.random.randint(3, size=2)\n",
    "F = nn.forward(X_temp)\n",
    "loss = softmax_cross_entropy(F,y_temp)\n",
    "loss_grad =  cross_entropy_grad(F,y_temp)\n",
    "print(loss,np.mean(loss_grad))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.3.5 Gradient checking for the neural network\n",
    "\n",
    "To make sure the neural network's forward computation, loss computation, and backward differentiation are all correct, we can compare numerical gradients with analytic gradients."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 38,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(2, 100) (2, 100)\n",
      "(1, 100) (1, 100)\n",
      "(100, 3) (100, 3)\n",
      "(1, 3) (1, 3)\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "2.3017241064515748e-10"
      ]
     },
     "execution_count": 38,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "import util\n",
    "\n",
    "# compute the gradients of the model parameters from loss_grad, the gradient of the loss w.r.t. the output\n",
    "nn.backward(loss_grad)\n",
    "grads= nn.grads()\n",
    "\n",
    "def loss_fun():\n",
    "    F = nn.forward(X_temp)\n",
    "    return softmax_cross_entropy(F,y_temp)\n",
    "\n",
    "params = nn.parameters()\n",
    "numerical_grads = util.numerical_gradient(loss_fun,params,1e-6)\n",
    "\n",
    "for i in range(len(params)):\n",
    "    print(numerical_grads[i].shape,grads[i].shape)\n",
    "  \n",
    "\n",
    "def diff_error(x, y):\n",
    "  return np.max(np.abs(x - y)) \n",
    "\n",
    "def diff_errors(xs, ys):\n",
    "    errors = []\n",
    "    for i in range(len(xs)):\n",
    "        errors.append(diff_error(xs[i],ys[i]))\n",
    "    return np.max(errors)\n",
    "\n",
    "diff_errors(numerical_grads,grads)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The error between the numerical and analytic gradients is very small, indicating the analytic gradients are essentially correct. Below is the code for the gradient descent algorithm:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 39,
   "metadata": {},
   "outputs": [],
   "source": [
    "def cross_entropy_grad_loss(F,y,softmax_out=False,onehot=False):\n",
    "    if softmax_out:\n",
    "        loss = cross_entropy_loss(F,y,onehot)\n",
    "    else:    \n",
    "        loss = softmax_cross_entropy(F,y,onehot)\n",
    "    loss_grad =  cross_entropy_grad(F,y,onehot,softmax_out)\n",
    "    return loss,loss_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 41,
   "metadata": {},
   "outputs": [],
   "source": [
    "def train(nn,X,y,loss_function,epochs=10000,learning_rate=1e-0,reg = 1e-3,print_n=10):\n",
    "    for epoch in range(epochs):\n",
    "        f = nn.forward(X)        \n",
    "        loss,loss_grad = loss_function(f,y)        \n",
    "        loss+=nn.reg_loss(reg)\n",
    "        \n",
    "        nn.backward(loss_grad,reg)\n",
    "      \n",
    "        nn.update_parameters(learning_rate)\n",
    "       \n",
    "        if epoch % print_n == 0:\n",
    "            print(\"iteration %d: loss %f\" % (epoch, loss)) "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Train the model on the training set above and print the model's prediction accuracy:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 42,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "iteration 0: loss 1.098749\n",
      "iteration 1000: loss 0.199245\n",
      "iteration 2000: loss 0.129508\n",
      "iteration 3000: loss 0.116411\n",
      "iteration 4000: loss 0.110031\n",
      "iteration 5000: loss 0.105776\n",
      "iteration 6000: loss 0.103647\n",
      "iteration 7000: loss 0.102508\n",
      "iteration 8000: loss 0.101521\n",
      "iteration 9000: loss 0.100991\n",
      "0.9933333333333333\n"
     ]
    }
   ],
   "source": [
    "import data_set as ds\n",
    "\n",
    "np.random.seed(89)\n",
    "X,y = ds.gen_spiral_dataset()\n",
    "\n",
    "epochs=10000\n",
    "learning_rate=1e-0\n",
    "reg = 1e-4\n",
    "print_n = epochs//10\n",
    "train(nn,X,y,cross_entropy_grad_loss,epochs,learning_rate,reg,print_n)\n",
    "print(np.mean(nn.predict(X)==y))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In practice, mini-batch gradient descent, implemented by train_batch(), is usually used:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 43,
   "metadata": {},
   "outputs": [],
   "source": [
    "def data_iter(X,y,batch_size,shuffle=False):\n",
    "    m = len(X)  \n",
    "    indices = list(range(m))\n",
    "    if shuffle:                 # shuffle=True means the sample order is randomized\n",
    "        np.random.shuffle(indices)\n",
    "    for i in range(0, m, batch_size):   # cover all samples; the last batch may be smaller\n",
    "        batch_indices = np.array(indices[i: min(i + batch_size, m)])      \n",
    "        yield X.take(batch_indices,axis=0), y.take(batch_indices,axis=0)"
   ]
  },
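A quick self-contained sanity check of the batching idea (the helper `batches` restates the logic of data_iter, and the toy arrays are made up for illustration):

```python
import numpy as np

def batches(X, y, batch_size, shuffle=False):
    # Same idea as data_iter: permute the indices, then yield successive slices.
    m = len(X)
    indices = np.arange(m)
    if shuffle:
        np.random.shuffle(indices)
    for i in range(0, m, batch_size):
        idx = indices[i: i + batch_size]
        yield X.take(idx, axis=0), y.take(idx, axis=0)

X_demo = np.arange(20).reshape(10, 2)   # 10 toy samples with 2 features
y_demo = np.arange(10)
shapes = [Xb.shape for Xb, _ in batches(X_demo, y_demo, 4, shuffle=True)]
print(shapes)   # [(4, 2), (4, 2), (2, 2)] -- the last batch is smaller
```

Shuffling only permutes indices, so every sample is visited exactly once per epoch regardless of the batch size.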
  {
   "cell_type": "code",
   "execution_count": 44,
   "metadata": {},
   "outputs": [],
   "source": [
    "def train_batch(nn,XX,YY,loss_function,epochs=10000,batch_size=50,learning_rate=1e-0,reg = 1e-3,print_n=10):\n",
    "    iter = 0\n",
    "    for epoch in range(epochs):\n",
    "        for X,y in data_iter(XX,YY,batch_size,True):             \n",
    "            f = nn.forward(X)        \n",
    "            loss,loss_grad = loss_function(f,y)        \n",
    "            loss+=nn.reg_loss(reg)\n",
    "\n",
    "            nn.backward(loss_grad,reg)\n",
    "\n",
    "            nn.update_parameters(learning_rate)\n",
    "\n",
    "            if iter % print_n == 0:\n",
    "                print(\"iteration %d: loss %f\" % (iter, loss)) \n",
    "            iter+=1"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Train a two-layer neural network with mini-batch gradient descent:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 45,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "iteration 0: loss 1.098579\n",
      "iteration 600: loss 0.377089\n",
      "iteration 1200: loss 0.198609\n",
      "iteration 1800: loss 0.129696\n",
      "iteration 2400: loss 0.208457\n",
      "iteration 3000: loss 0.090015\n",
      "iteration 3600: loss 0.110976\n",
      "iteration 4200: loss 0.095018\n",
      "iteration 4800: loss 0.084522\n",
      "iteration 5400: loss 0.095629\n",
      "0.9866666666666667\n"
     ]
    }
   ],
   "source": [
    "nn = NeuralNetwork()\n",
    "nn.add_layer(Dense(2, 100, 'relu'))\n",
    "nn.add_layer(Dense(100, 3))\n",
    "\n",
    "epochs=1000\n",
    "batch_size=50\n",
    "learning_rate=1e-0\n",
    "reg = 1e-4\n",
    "print_n = epochs*len(X)//batch_size//10\n",
    "\n",
    "train_batch(nn,X,y,cross_entropy_grad_loss,epochs,batch_size,learning_rate,reg,print_n)\n",
    "print(np.mean(nn.predict(X)==y))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.3.6 Handwritten digit recognition on MNIST with the deep learning framework"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 46,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "float32\n",
      "(50000, 784)\n",
      "(10000, 784)\n"
     ]
    }
   ],
   "source": [
    "#%%time \n",
    "import pickle, gzip, urllib.request, json\n",
    "import numpy as np\n",
    "import os.path\n",
    "\n",
    "if not os.path.isfile(\"mnist.pkl.gz\"):\n",
    "    # Load the dataset\n",
    "    urllib.request.urlretrieve(\"http://deeplearning.net/data/mnist/mnist.pkl.gz\", \"mnist.pkl.gz\")\n",
    "    \n",
    "with gzip.open('mnist.pkl.gz', 'rb') as f:\n",
    "    train_set, valid_set, test_set = pickle.load(f, encoding='latin1')\n",
    "\n",
    "train_X, train_y = train_set\n",
    "valid_X, valid_y = valid_set\n",
    "print(train_X.dtype)\n",
    "print(train_set[0].shape)\n",
    "print(valid_X.shape)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 47,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAS4AAAD4CAYAAABSUAvFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAUT0lEQVR4nO3df6xX9X3H8edLxDB/1KpUQ+AWmWFbGV1px9SWTilNOzDZsMlYwbXTthaxxdikJlibTBNqwtaqdc6W3iIqqZWQioU1rJQQV2s2G8CggMz2Sh1cIVCmnW39o97y3h/fc/XL98f5fu/9fu8953N5PZJv7vd73uec++Gb68vP+ZxzPkcRgZlZSk4rugFmZkPl4DKz5Di4zCw5Di4zS46Dy8ySc/po/jJJPoVpNsIiQp1sP3/+/Dh+/Hhb6+7atWtrRMzv5PcNR0fBJWk+cC8wDlgTEau60iozK8zx48fZuXNnW+tKmjjCzWlo2IeKksYB9wMLgBnAEkkzutUwMytORLT1KkonPa5Lgb6IOAAgaT2wEHi+Gw0zs+KcOHGi6Cbk6iS4JgOHqj73A5fVriRpKbC0g99jZqOo6N5UOzoJrkYDgHX/2ojoBXrBg/NmqRjLwdUP9FR9ngIc7qw5ZlYGZQ+uTq7j2gFMlzRN0hnAYmBzd5plZkUas4PzETEgaTmwlcrlEGsjYl/XWmZmhSl7j6uj67giYguwpUttMbMSiIgxfVbRzMaoMd3jMrOxycFlZslxcJlZUoo+Y9gOB5eZ1fHgvJklxz0uM0uKDxXNLEkOLjNLjoPLzJLj4DKzpPiWHzNLkntcZpYcB5eZJcfBZWbJcXCZWVI8OG9mSXKPy8yS4+Ays+Q4uMwsKb7J2syS5OAyS9D27dtz61KjB7m/Zd68ed1szqjzWUUzS457XGaWFI9xmVmSHFxmlhwHl5klx8FlZknxvYpmliT3uMxK6J577smtf+ADH8itr1u3rpvNKZ2yB9dpnWws6SVJeyTtlrSzW40ys2INXhLR6tUOSfMlvSCpT9KtDernSvo3Sc9K2ifpU6322Y0e14ci4ngX9mNmJdGtHpekccD9wEeAfmCHpM0R8XzVap8Hno+Iv5b0DuAFSY9ExO+a7deHimZ2ki4Pzl8K9EXEAQBJ64GFQHVwBXCOKvdRnQ28Agzk7bSjQ8XsF/5I0i5JSxutIGmppJ0+lDRLxxAOFScO/vedvWpzYDJwqOpzf7as2r8C7wIOA3uAmyMiNzk77XHNiYjDki4Etkn674h4snqFiOgFegEklXvEz8yAIR0qHo+I2Tn1Rnej1+78r4DdwDzgEipZ8pOIeK3ZTjvqcUXE4eznMeBxKt1CM0tcFwfn+4Geqs9TqPSsqn0K2BgVfcAvgD/J2+mwg0vSWZLOGXwPfBTYO9z9mVk5tBtabQbXDmC6pGmSzgAWA5tr1jkIfBhA0kXAHwMH8nbayaHiRcDj2bxEpwPfjYgfdrA/s65atWpV09qyZctyt33jjTdy663m60pdt84qRsSApOXAVmAcsDYi9klaltVXAyuBhyTtoXJouaLVlQrDDq7sLMF7hru9mZVXN2/5iYgtwJaaZaur3h+mcsTWNl8OYWZ1yn7lvIPLzE7iiQTNLEkOLjNLjoPLzJLj4DIryOWXX960Nn78+Nxtn3rqqdz6hg0bhtWmFHgiQTNLkntcZpYcB5eZJcfBZWbJcXCZWVI8OG9mSXKPy8yS4+CyQl1xxRW59S9/+cu59SVLluTWX3nllSG3qVtatW3mzJlNay+++GLutrfccsuw2jRWOLjMLCm+ydrMkuTgMrPk+KyimSXHPS4zS4rHuMwsSQ4uM0uOg8sK1dvbm1ufPn16bn3GjBm59VbzVo2k2267Lbd+wQUXNK199rOfzd322WefHVabxgoHl5klxfcqmlmS3OMys+Q4uMwsOQ4uM0uOg8vMkuLBeTNL
kntcVqjXX389t97qD3TChAndbM6QzJo1K7c+derU3Hper6HIf1cKyh5cp7VaQdJaScck7a1adr6kbZJ+nv08b2SbaWajafB+xVavorQMLuAhYH7NsluB7RExHdiefTazMaDd0Cp1cEXEk0Dt/LwLgYez9w8DV3e5XWZWoLIH13DHuC6KiCMAEXFE0oXNVpS0FFg6zN9jZgU45c8qRkQv0AsgqdwjfmZWeG+qHe2McTVyVNIkgOznse41ycyK1s1DRUnzJb0gqU9Sw/FwSXMl7Za0T9KPW+1zuMG1Gbg2e38tsGmY+zGzEupWcEkaB9wPLABmAEskzahZ5+3AN4C/iYg/BRa12m/LQ0VJjwJzgYmS+oHbgVXABkmfAQ6284ts5KxcubJp7d3vfnfutvv378+tj+S8VGeddVZufcWKFbn1M888M7f+9NNPN61973vfy932VNfFQ8VLgb6IOAAgaT2Vk3vPV61zDbAxIg5mv7vlEVzL4IqIZk/d/HCrbc0sPUO85WeipJ1Vn3uzce1Bk4FDVZ/7gctq9vFHwHhJ/wGcA9wbEevyfqmvnDezOkPocR2PiNk5dTXafc3n04E/p9IZ+gPgvyQ9HRE/a7ZTB5eZ1enioWI/0FP1eQpwuME6xyPit8BvJT0JvAdoGlzDHZw3szGsi2cVdwDTJU2TdAawmMrJvWqbgL+UdLqkM6kcSuYOvrrHZWZ1utXjiogBScuBrcA4YG1E7JO0LKuvjoj9kn4IPAecANZExN7me3VwmVmNbl+AGhFbgC01y1bXfP4q8NV29+ngSkBPT09uPe9RWwMDA7nbLl++PLf+y1/+Mrfeibvvvju3vmhR/lU2hw/XDpWcbM6cOUNuk1Wc8rf8mFl6yn7Lj4PLzOo4uMwsKSncZO3gMrM6Di4zS46Dy8yS47OKZpYUj3FZW2bOnJlbf/zxx3PrEydObFq77777crf98Y9bztnWkVtuuaVp7brrruto33feeWdH21tzDi4zS46Dy8yS4+Ays6QMcSLBQji4zKyOe1xmlhwHl5klx8FlZslxcJ0CTj89/2v8xCc+kVt/4IEHcuunnZY/w3beQOr73//+3G2/9KUv5dZbzZl1/vnn59bz5tSSGj1H4S3r1uU+6IVvfetbuXUbHl+AamZJ8llFM0uOe1xmlhwHl5klxWNcZpYkB5eZJcfBZWbJ8VnFU8DixYtz62vWrMmtt/q/W6s/or6+vqa12bNn527bqr5w4cLc+uTJk3PrkyZNalpr9czGT3/607l1GxkpjHHlX9kISFor6ZikvVXL7pD0sqTd2euqkW2mmY2mwfBq9SpKy+ACHgLmN1h+T0TMyl5bGtTNLFFlD66Wh4oR8aSki0e+KWZWFskfKuZYLum57FDyvGYrSVoqaaeknR38LjMbJYMTCbbzKspwg+ubwCXALOAIcFezFSOiNyJmR0T+KLCZlUbyh4qNRMTRwfeSvg38oGstMrPCjclDRUnV57g/Buxttq6ZpSf5HpekR4G5wERJ/cDtwFxJs4AAXgJuGME2lsLHP/7xprUHH3wwd9s33ngjt/6rX/0qt37NNdfk1l999dWmtbvuanoUD8CVV16ZW291nVerObXy/rjzngcJcOjQodz63Llzc+svvvhibt2aK3uPq52ziksaLM6f+c7MklV0b6odnZxVNLMxqptnFSXNl/SCpD5Jt+as9xeSfi/pb1vt08FlZnW6NcYlaRxwP7AAmAEskTSjyXr/BGxtp30OLjOr08XB+UuBvog4EBG/A9YDjW6AvQl4DDjWzk4dXGZ2knZDKwuuiYMXmGevpTW7mwxUn2Xpz5a9SdJkKlcnrG63jZ4dwszqDGFw/niLi8sbnXau3fnXgRUR8ftWZ6kHObjadMMNza/4OHjwYO62X/nKV3LrrS6n6MRNN92UW2/1iK9WjzfrRKs/0ieeeCK37ssdRk4Xzyr2Az1Vn6cAh2vWmQ2sz/4eJgJXSRqIiO8326mDy8zq
dPE+xB3AdEnTgJeBxcBJFyZGxLTB95IeAn6QF1rg4DKzGt28jisiBiQtp3K2cBywNiL2SVqW1dse16rm4DKzOt28ADWbr29LzbKGgRUR17WzTweXmdUp+5XzDi4zq+PgMrOkDE4kWGYOLjOr4x7XGLFp06amtY0bN+Zu22p6lpHUauqYmTNndrT/JUsaTR7ylr17hz9VW39//7C3tc44uMwsOQ4uM0uOg8vMkpLCRIIOLjOr47OKZpYc97jMLDkOLjNLise4xpB777236CY0de655zatLVq0KHfbt73tbbn1VnNebdiwIbduaXJwmVlyPDhvZknxoaKZJcnBZWbJcXCZWXIcXGaWHAeXmSXFEwnaqPjc5z7XtHbjjTfmbnvsWP4Tz+fNmzesNlnayt7jOq3VCpJ6JD0hab+kfZJuzpafL2mbpJ9nP88b+eaa2WgYvCSi1asoLYMLGAC+GBHvAi4HPi9pBnArsD0ipgPbs89mNgYkH1wRcSQinsne/xrYD0wGFgIPZ6s9DFw9Uo00s9HTbmgVGVxDGuOSdDHwXuCnwEURcQQq4SbpwibbLAWWdtZMMxtNZR/jaju4JJ0NPAZ8ISJek9TWdhHRC/Rm+yj3t2FmQPnvVWxnjAtJ46mE1iMRMfhIm6OSJmX1SUD+6SkzS0byh4qqdK0eAPZHxN1Vpc3AtcCq7Gfz53dZR6ZOnZpbv/7665vWWv1x9fb25tb9iLBTT9Gh1I52DhXnAJ8E9kjanS27jUpgbZD0GeAgkD/xk5klI/ngioingGYDWh/ubnPMrAySDy4zO/WUfXDewWVmJxkrY1xmdopxcJlZchxcZpYcB5d1bNu2bbn1vOu8vvOd7+Rue/vttw+rTTa2dTO4JM0H7gXGAWsiYlVN/e+BFdnH3wA3RsSzeft0cJnZSbo5kaCkccD9wEeAfmCHpM0R8XzVar8AroyIVyUtoHKL4GV5+3VwmVmdLva4LgX6IuIAgKT1VGaWeTO4IuI/q9Z/GpjSaqcOLjOrM4TgmihpZ9Xn3mxihUGTgUNVn/vJ7019Bvj3Vr/UwWVmdYYQXMcjYnZOvdFdNw13LulDVILrg61+qYPLzE7S5QtQ+4Geqs9TgMO1K0n6M2ANsCAi/rfVTtua1sbMTi1dnNZmBzBd0jRJZwCLqcws8yZJ7wQ2Ap+MiJ+1s1P3uMysTrfOKkbEgKTlwFYql0OsjYh9kpZl9dXAPwIXAN/IJigdaHH46eBKwYMPPphbX7lyZdPapk2eJs2GrpvXcUXEFmBLzbLVVe+vB5pPKteAg8vMTuKbrM0sSQ4uM0uOg8vMkuOJBM0sKR7jMrMkObjMLDllDy6NZgP9JGuzkRcR7T1mvokJEyZET09P6xWBvr6+Xa0uFh0J7nGZWZ2y97gcXGZ2km5OJDhSHFxmVsc9LjNLjoPLzJLj4DKzpPgCVDNLUtmDq+UMqJJ6JD0hab+kfZJuzpbfIellSbuz11Uj31wzGw0nTpxo61WUdnpcA8AXI+IZSecAuyQNPqH0noj42sg1z8yKUPYeV8vgiogjwJHs/a8l7afyyCEzG4NSGOMa0sMyJF0MvBf4abZouaTnJK2VdF6TbZZK2lnz7DUzK7EuPixjRLQdXJLOBh4DvhARrwHfBC4BZlHpkd3VaLuI6I2I2UXcz2Rmw1P24GrrrKKk8VRC65GI2AgQEUer6t8GfjAiLTSzUVf2W37aOaso4AFgf0TcXbV8UtVqHwP2dr95Zjba2u1tlb3HNQf4JLBH0u5s2W3AEkmzqDxO+yXghhFpoZmNurIPzrdzVvEpoNH8PlsaLDOzMSD54DKzU4+Dy8yS4+Ays6R4IkEzS5J7XGaWHAeXmSXHwWVmSSn64tJ2OLjMrI6Dy8yS47OKZpYc97jMLCkpjHENaSJBMzs1dHN2CEnzJb0gqU/SrQ3qkvQvWf05Se9rtU8Hl5nV6VZwSRoH3A8sAGZQmVVmRs1qC4Dp2WsplUlKczm4zKxO
F5/ycynQFxEHIuJ3wHpgYc06C4F1UfE08Paa+f7qjPYY13Hgf6o+T8yWlVFZ21bWdoHbNlzdbNvULuxjK5U2tWNCzfMkeiOit+rzZOBQ1ed+4LKafTRaZzLZQ3oaGdXgioh3VH+WtLOsc9GXtW1lbRe4bcNVtrZFxPwu7q7RXH61x5jtrHMSHyqa2UjqB3qqPk8BDg9jnZM4uMxsJO0ApkuaJukMYDGwuWadzcA/ZGcXLwf+L3uea1NFX8fV23qVwpS1bWVtF7htw1XmtnUkIgYkLacybjYOWBsR+yQty+qrqUwDfxXQB7wOfKrVflX2C83MzGr5UNHMkuPgMrPkFBJcrW4BKJKklyTtkbS75vqUItqyVtIxSXurlp0vaZukn2c/zytR2+6Q9HL23e2WdFVBbeuR9ISk/ZL2Sbo5W17od5fTrlJ8bykZ9TGu7BaAnwEfoXIadAewJCKeH9WGNCHpJWB2RBR+saKkK4DfULmqeGa27J+BVyJiVRb650XEipK07Q7gNxHxtdFuT03bJgGTIuIZSecAu4Crgeso8LvLadffUYLvLSVF9LjauQXAgIh4EnilZvFC4OHs/cNU/vBHXZO2lUJEHImIZ7L3vwb2U7kSu9DvLqddNkRFBFezy/vLIoAfSdolaWnRjWngosFrXLKfFxbcnlrLszv81xZ1GFtN0sXAe4GfUqLvrqZdULLvreyKCK4hX94/yuZExPuo3LH++eyQyNrzTeASYBaV+8zuKrIxks4GHgO+EBGvFdmWag3aVarvLQVFBNeQL+8fTRFxOPt5DHicyqFtmRwdvHM++3ms4Pa8KSKORsTvI+IE8G0K/O4kjacSDo9ExMZsceHfXaN2lel7S0URwdXOLQCFkHRWNmiKpLOAjwJ787cadZuBa7P31wKbCmzLSWqmIvkYBX13kgQ8AOyPiLurSoV+d83aVZbvLSWFXDmfne79Om/dAnDnqDeiAUl/SKWXBZXbob5bZNskPQrMpTLFyFHgduD7wAbgncBBYFFEjPogeZO2zaVyuBPAS8ANre45G6G2fRD4CbAHGJw06jYq40mFfXc57VpCCb63lPiWHzNLjq+cN7PkOLjMLDkOLjNLjoPLzJLj4DKz5Di4zCw5Di4zS87/A7raA5h83xMhAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 432x288 with 2 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "import matplotlib.pyplot as plt\n",
    "%matplotlib inline\n",
    "\n",
    "digit = train_set[0][9].reshape(28,28)\n",
    "\n",
    "plt.imshow(digit,cmap='gray')\n",
    "plt.colorbar()\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(50000, 784)\n"
     ]
    }
   ],
   "source": [
    "print(train_X.shape)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 48,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "iteration 0: loss 2.320527\n",
      "iteration 3906: loss 0.436557\n",
      "iteration 7812: loss 0.363573\n",
      "iteration 11718: loss 0.289885\n",
      "iteration 15624: loss 0.177679\n",
      "iteration 19530: loss 0.286339\n",
      "iteration 23436: loss 0.189970\n",
      "iteration 27342: loss 0.143797\n",
      "iteration 31248: loss 0.158769\n",
      "iteration 35154: loss 0.153224\n",
      "0.98474\n",
      "0.9766\n",
      "[4] 4\n"
     ]
    }
   ],
   "source": [
    "nn = NeuralNetwork()\n",
    "nn.add_layer(Dense(784, 200, 'relu'))\n",
    "nn.add_layer(Dense(200, 100, 'relu'))\n",
    "nn.add_layer(Dense(100, 10, ))\n",
    "\n",
    "epochs = 25\n",
    "batch_size = 32\n",
    "learning_rate = 0.1\n",
    "reg = 1e-3\n",
    "print_n = epochs*len(train_X)//batch_size//10\n",
    "train_batch(nn,train_X,train_y,cross_entropy_grad_loss,epochs,batch_size,learning_rate,reg,print_n)\n",
    "\n",
    "print(np.mean(nn.predict(train_X)==train_y))\n",
    "print(np.mean(nn.predict(valid_X)==valid_y))\n",
    "print(nn.predict(valid_X[9]),valid_y[9])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.3.7 An improved general neural network framework: separating the weighted sum from the activation function\n",
    "\n",
    "The Layer class adds a member variable params that holds the model parameters, and a method reg_loss_grad() that adds the gradient of the regularization term of the loss to the gradients of the model parameters.\n",
    "\n",
    "The Dense class now performs only the weighted-sum computation. Its constructor accepts an argument specifying how to randomly initialize the weight parameters, and initializes them according to the chosen method. The features of a single sample accepted by Dense need not be a vector; they can also be a multi-channel 2D image. For example, a color image has red, green, and blue channels, each a 2D array. Therefore both forward() and backward() first flatten the multi-channel input into a 1D vector with the following code."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "x1 = x.reshape(x.shape[0],np.prod(x.shape[1:]))  # flatten the multi-channel x"
   ]
  },
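A self-contained check of the flatten/un-flatten round trip used by forward() and backward() (toy shapes chosen for illustration):

```python
import numpy as np

x = np.random.randn(6, 3, 4, 4)   # 6 samples, 3 channels of 4x4 images
x1 = x.reshape(x.shape[0], np.prod(x.shape[1:]))   # flatten to (6, 48)
print(x1.shape)       # (6, 48)

x2 = x1.reshape(x.shape)          # backward() un-flattens dx the same way
assert np.array_equal(x, x2)      # the round trip loses nothing
```

Because reshape only reinterprets the memory layout, backward() can reshape dx back to `x.shape` and each gradient entry still lines up with its original input element.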
  {
   "cell_type": "code",
   "execution_count": 49,
   "metadata": {},
   "outputs": [],
   "source": [
    "class Layer:\n",
    "    def __init__(self):\n",
    "        self.params = None\n",
    "    def forward(self, x):       \n",
    "        raise NotImplementedError\n",
    "    def backward(self, x, grad):        \n",
    "        raise NotImplementedError\n",
    "    def reg_grad(self,reg):\n",
    "        pass\n",
    "    def reg_loss(self,reg):\n",
    "        return 0.  \n",
    "    \n",
    "#---------- weighted-sum computation ------------\n",
    "class Dense(Layer): \n",
    "    # Z = XW+b\n",
    "    def __init__(self, input_dim, out_dim,init_method = ('random',0.01)):  \n",
    "        super().__init__()\n",
    "        random_method_name,random_value = init_method      \n",
    "        if random_method_name == \"random\":\n",
    "            self.W = np.random.randn(input_dim, out_dim) * random_value  #0.01 * np.random.randn\n",
    "            self.b = np.random.randn(1,out_dim)* random_value  \n",
    "        elif random_method_name == \"he\":\n",
    "            self.W = np.random.randn(input_dim, out_dim)*np.sqrt(2/input_dim)\n",
    "            #self.b = np.random.randn(1,out_dim)* random_value\n",
    "            self.b = np.zeros((1,out_dim))\n",
    "        elif random_method_name == \"xavier\":\n",
    "            self.W = np.random.randn(input_dim, out_dim)*np.sqrt(1/input_dim)\n",
    "            self.b = np.random.randn(1,out_dim)* random_value  \n",
    "        elif random_method_name == \"zeros\":\n",
    "            self.W = np.zeros((input_dim, out_dim))\n",
    "            self.b = np.zeros((1,out_dim))   \n",
    "        else:            \n",
    "            self.W = np.random.randn(input_dim, out_dim)* random_value\n",
    "            self.b = np.zeros((1,out_dim))  \n",
    "            \n",
    "        self.params = [self.W,self.b]\n",
    "        self.grads = [np.zeros_like(self.W),np.zeros_like(self.b)]\n",
    "        \n",
    "    def forward(self, x): \n",
    "        self.x = x        \n",
    "        x1 = x.reshape(x.shape[0],np.prod(x.shape[1:]))  # flatten the multi-channel x\n",
    "        Z = np.matmul(x1, self.W) + self.b        \n",
    "        return Z\n",
    "    \n",
    "    def backward(self, dZ):\n",
    "        # backward propagation\n",
    "        x = self.x\n",
    "        x1 = x.reshape(x.shape[0],np.prod(x.shape[1:]))  # flatten the multi-channel x\n",
    "        dW = np.dot(x1.T, dZ)\n",
    "        db = np.sum(dZ, axis=0, keepdims=True)          \n",
    "        dx = np.dot(dZ, np.transpose(self.W)) \n",
    "        dx = dx.reshape(x.shape)    # un-flatten back to the multi-channel shape of x\n",
    "        \n",
    "        self.grads[0] += dW\n",
    "        self.grads[1] += db\n",
    "\n",
    "        return dx\n",
    "    \n",
    "    #-------- add the gradient of the regularization term -----\n",
    "    def reg_grad(self,reg):\n",
    "        self.grads[0]+= 2*reg * self.W\n",
    "        \n",
    "    def reg_loss(self,reg):\n",
    "        return  reg*np.sum(self.W**2)\n",
    "    \n",
    "    def reg_loss_grad(self,reg):\n",
    "        self.grads[0]+= 2*reg * self.W\n",
    "        return  reg*np.sum(self.W**2)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "假如x是6个样本，每个样本是3个通道而每个通道是```4*4```的图像，下面代码是这3个样本作为输入的正向计算："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 50,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(3, 10)\n",
      "[[-0.03953509 -0.00214997  0.00743433 -0.16926214 -0.05162853  0.06734225\n",
      "  -0.00221485 -0.11710758 -0.07046456  0.02609659]\n",
      " [ 0.00848392  0.08259757 -0.09858177  0.0374092  -0.08303008  0.04151241\n",
      "  -0.01407859 -0.02415486  0.04236149  0.0648261 ]\n",
      " [-0.13877363 -0.04122276 -0.00984716 -0.03461381  0.11513754  0.1043094\n",
      "   0.00170353 -0.00449278 -0.0057236  -0.01403174]]\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "np.random.seed(1)\n",
    "x = np.random.randn(3,3,4, 4)  #3个样本，3个通道，每个通道是4x4图像\n",
    "dense = Dense(3*4*4,10,('no',0.01))\n",
    "o = dense.forward(x)\n",
    "print(o.shape)\n",
    "print(o)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "梯度验证"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 51,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "3.638244314951079e-09\n",
      "1.3450414982951384e-11\n",
      "[[ 1.77463167  0.11663492  1.87794917  0.27986781  1.27243915 -2.44375556\n",
      "  -2.1266117   0.99629747 -0.73720237 -0.68570287]\n",
      " [-0.69807196  0.22547472 -0.93721649  0.3286185  -1.0421723   0.66487528\n",
      "   1.33111205  0.25677848 -0.58451408  0.71015412]\n",
      " [ 0.12251147 -0.4041516   0.57764614  0.89962639 -0.35195022  0.77829011\n",
      "  -0.01618803 -0.62209694 -1.28543176 -0.37554316]]\n",
      "[[ 1.77463167  0.11663492  1.87794917  0.27986781  1.27243915 -2.44375556\n",
      "  -2.1266117   0.99629747 -0.73720237 -0.68570287]\n",
      " [-0.69807196  0.22547472 -0.93721649  0.3286185  -1.0421723   0.66487528\n",
      "   1.33111205  0.25677848 -0.58451408  0.71015412]\n",
      " [ 0.12251147 -0.4041516   0.57764614  0.89962639 -0.35195022  0.77829011\n",
      "  -0.01618803 -0.62209694 -1.28543176 -0.37554316]]\n"
     ]
    }
   ],
   "source": [
    "do = np.random.randn(3, 10)\n",
    "dx = dense.backward(do)\n",
    "dx_num = numerical_gradient_from_df(lambda :dense.forward(x),x,do)\n",
    "\n",
    "diff_error = lambda x, y: np.max(np.abs(x - y)/(np.maximum(1e-8, np.abs(x) + np.abs(y) )) )\n",
    "print(diff_error(dx,dx_num))\n",
    "\n",
    "dW_num = numerical_gradient_from_df(lambda :dense.forward(x),dense.params[0],do)\n",
    "print(diff_error(dense.grads[0],dW_num))\n",
    "print(dense.grads[0][:3])\n",
    "print(dW_num[:3])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "还可以给Dense层后面接一个损失函数，比较损失函数关于Dense模型参数的分析梯度和数值梯度："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 52,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "2.0148860313259954e-07\n",
      "[[ 0.47568681 -0.06324119 -0.29294422 -0.76304343 -0.09660146  0.62794569\n",
      "   1.16087896  0.06261028 -0.6611078  -0.02940735]\n",
      " [-0.10777785 -1.47174583  0.63258553  1.22381944 -0.35702633  0.4409597\n",
      "  -2.42444873 -0.28804741 -1.33377026  0.66775208]]\n",
      "[array([ 0.47568681, -0.06324119, -0.29294422, -0.76304343, -0.09660146,\n",
      "        0.62794569,  1.16087896,  0.06261028, -0.6611078 , -0.02940735]), array([-0.10777785, -1.47174583,  0.63258553,  1.22381944, -0.35702633,\n",
      "        0.4409597 , -2.42444873, -0.28804741, -1.33377026,  0.66775208])]\n"
     ]
    }
   ],
   "source": [
    "import util\n",
    "x = np.random.randn(3,3,4, 4)\n",
    "y = np.random.randn(3,10) \n",
    "\n",
    "dense = Dense(3*4*4,10,('no',0.01))\n",
    "\n",
    "f = dense.forward(x)\n",
    "loss,do = mse_loss_grad(f,y)\n",
    "dx = dense.backward(do)\n",
    "def loss_f():\n",
    "    f = dense.forward(x)\n",
    "    loss= mse_loss(f,y)\n",
    "    return loss\n",
    "    \n",
    "dW_num = util.numerical_gradient(loss_f,dense.params[0],1e-6)\n",
    "print(diff_error(dense.grads[0],dW_num))\n",
    "print(dense.grads[0][:2])\n",
    "print(dW_num[:2])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Dense层只计算加权和，而不需要在其中根据激活函数不同计算激活函数的值或求激活函数的导数，变得很简单。不同的激活函数可以单独实现为一个激活函数层类，下面代码定义了神经网络中最长使用的激活函数对应的激活函数层："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 53,
   "metadata": {},
   "outputs": [],
   "source": [
    "class Relu(Layer):\n",
    "    def __init__(self):\n",
    "        super().__init__()\n",
    "        pass\n",
    "    def forward(self, x):\n",
    "        self.x = x  \n",
    "        return np.maximum(0, x)\n",
    "    def backward(self, grad_output):\n",
    "        # 如果x>0，导数为1,否则0\n",
    "        x = self.x\n",
    "        relu_grad = x > 0\n",
    "        return grad_output * relu_grad \n",
    "    \n",
    "class Sigmoid(Layer):\n",
    "    def __init__(self):\n",
    "        super().__init__()\n",
    "        pass\n",
    "    def forward(self, x):\n",
    "        self.x = x  \n",
    "        return 1.0/(1.0 + np.exp(-x))     \n",
    "    def backward(self, grad_output): \n",
    "        x = self.x  \n",
    "        a  = 1.0/(1.0 + np.exp(-x))         \n",
    "        return grad_output * a*(1-a) \n",
    "    \n",
    "class Tanh(Layer):\n",
    "    def __init__(self):\n",
    "        super().__init__()\n",
    "        pass\n",
    "    def forward(self, x):\n",
    "        self.x = x  \n",
    "        self.a = np.tanh(x)  \n",
    "        return self.a    \n",
    "    def backward(self, grad_output):           \n",
    "        d = (1-np.square(self.a))           \n",
    "        return grad_output * d\n",
    "    \n",
    "class Leaky_relu(Layer):\n",
    "    def __init__(self,leaky_slope):\n",
    "        super().__init__()\n",
    "        self.leaky_slope = leaky_slope        \n",
    "    def forward(self, x):\n",
    "        self.x = x  \n",
    "        return np.maximum(self.leaky_slope*x,x)            \n",
    "    def backward(self, grad_output): \n",
    "        x = self.x    \n",
    "        d=np.zeros_like(x)\n",
    "        d[x<=0]=self.leaky_slope\n",
    "        d[x>0]=1       \n",
    "        return grad_output * d"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "激活层没有模型参数，只是将输入x经过变换产生一个输出。输入和输出张量的形状是一样的。同样，可以用数值梯度检查激活曾的分析梯度是否正确。下面代码用模拟的损失函数关于激活层输出的梯度do，检查了上面每个激活层的分析梯度和数值梯度的误差："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 54,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "3.2756345281587516e-12\n",
      "7.43892997215858e-12\n",
      "5.170019175240593e-11\n",
      "3.282573028416693e-11\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "np.random.seed(1)\n",
    "x = np.random.randn(3,3,4, 4)\n",
    "do = np.random.randn(3,3,4, 4)\n",
    "\n",
    "relu = Relu()\n",
    "relu.forward(x)\n",
    "dx = relu.backward(do)\n",
    "dx_num = numerical_gradient_from_df(lambda :relu.forward(x),x,do)\n",
    "print(diff_error(dx,dx_num))\n",
    "\n",
    "leaky_relu = Leaky_relu(0.1)\n",
    "leaky_relu.forward(x)\n",
    "dx = leaky_relu.backward(do)\n",
    "dx_num = numerical_gradient_from_df(lambda :leaky_relu.forward(x),x,do)\n",
    "print(diff_error(dx,dx_num))\n",
    "\n",
    "tanh = Tanh()\n",
    "tanh.forward(x)\n",
    "dx = tanh.backward(do)\n",
    "dx_num = numerical_gradient_from_df(lambda :tanh.forward(x),x,do)\n",
    "print(diff_error(dx,dx_num))\n",
    "\n",
    "sigmoid = Sigmoid()\n",
    "sigmoid.forward(x)\n",
    "dx = sigmoid.backward(do)\n",
    "dx_num = numerical_gradient_from_df(lambda :sigmoid.forward(x),x,do)\n",
    "print(diff_error(dx,dx_num))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "从这些激活层的分梯度和数值梯度误差几乎相等，可以基本确信分析梯度代码的正确性。\n",
    "\n",
    "在dense层和各个激活层的基础上，可以定义一个表示神经网络的类NeuralNetwork:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 55,
   "metadata": {},
   "outputs": [],
   "source": [
    "class NeuralNetwork:  \n",
    "    def __init__(self):\n",
    "        self._layers = []\n",
    "        self._params = []\n",
    " \n",
    "    def add_layer(self, layer):      \n",
    "        self._layers.append(layer)\n",
    "        if layer.params: \n",
    "           # for  i in range(len(layer.params)): \n",
    "            for  i, _ in enumerate(layer.params):                         \n",
    "                self._params.append([layer.params[i],layer.grads[i]])            \n",
    "    \n",
    "    def forward(self, X): \n",
    "        for layer in self._layers:\n",
    "            X = layer.forward(X) \n",
    "        return X   \n",
    "\n",
    "    def __call__(self, X):\n",
    "        return self.forward(X)\n",
    "    \n",
    "    def predict(self, X):\n",
    "        \"\"\"\n",
    "        输入X，预测其分类\n",
    "        \"\"\"\n",
    "        p = self.forward(X)\n",
    "        # One row\n",
    "        if p.ndim == 1:     #单样本\n",
    "            return np.argmax(ff)        \n",
    "        # 多样本\n",
    "        return np.argmax(p, axis=1)\n",
    "  \n",
    "   \n",
    "    def backward(self,loss_grad,reg = 0.):\n",
    "        for i in reversed(range(len(self._layers))):\n",
    "            layer = self._layers[i] \n",
    "            loss_grad = layer.backward(loss_grad)\n",
    "            layer.reg_grad(reg) \n",
    "        return loss_grad\n",
    "    \n",
    "    \n",
    "    def backpropagation(self, X, y,loss_function,reg=0):\n",
    "        \"\"\"\n",
    "        反向计算，loss_function函数用于计算损失函数关于输出层的梯度\n",
    "        \"\"\"        \n",
    "        # Feed forward for the output\n",
    "        f = self.forward(X)          \n",
    "        #损失函数关于输出f的梯度\n",
    "        loss,loss_grad = loss_function(f,y)         \n",
    "      \n",
    "        #从loss_grad反向求导\n",
    "        self.zero_grad()\n",
    "        self.backward(loss_grad)  \n",
    "        reg_loss = self.reg_loss_grad(reg)       \n",
    "        return loss+reg_loss\n",
    "        #return np.mean(loss)\n",
    "    \n",
    "    def reg_loss(self,reg):\n",
    "        reg_loss = 0\n",
    "        for i in range(len(self._layers)):\n",
    "            reg_loss+=self._layers[i].reg_loss(reg)\n",
    "        return reg_loss\n",
    "    \n",
    "    def parameters(self): \n",
    "        return self._params\n",
    "    \n",
    "    def zero_grad(self):\n",
    "        for i,_ in enumerate(self._params):           \n",
    "            #self.params[i][1].fill(0.) \n",
    "            self._params[i][1][:] = 0 \n",
    "            \n",
    "    def get_parameters(self):\n",
    "        return self._params "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "为了确保forward()和backward()方法的正确性，可以用数值梯度的方法检验它们的正确性。下面代码定义了一个简单的神经网络，并用一组随机生成的样本(x,y)来计算比较backward()计算的分析梯度和用1.4节的通用数值梯度函数求得的数值梯度，看看它们的计算结果是否一致。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 56,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "1.892395698905401e-06\n",
      "1.7651393552515298e-06\n",
      "2.306498772862026e-06\n",
      "2.3545204992835373e-10\n"
     ]
    }
   ],
   "source": [
    "import util\n",
    "\n",
    "np.random.seed(1)\n",
    "nn = NeuralNetwork()\n",
    "nn.add_layer(Dense(2, 100,('no',0.01)))\n",
    "nn.add_layer(Relu())\n",
    "nn.add_layer(Dense(100, 3,('no',0.01)))\n",
    "\n",
    "x = np.random.randn(5,2)\n",
    "y = np.random.randint(3, size=5)\n",
    "\n",
    "f = nn.forward(x)\n",
    "dZ = cross_entropy_grad(f,y)  #util.grad_softmax_cross_entropy(f,y) #\n",
    "nn.zero_grad()                            #梯度清零\n",
    "reg = 0.1\n",
    "dx =  nn.backward(dZ,reg)\n",
    "\n",
    "#-----计算数值梯度-----------\n",
    "params = nn.parameters()\n",
    "nn_params=[]\n",
    "for i in range(len(params) ):    \n",
    "    nn_params.append(params[i][0]) \n",
    "    \n",
    "def loss_fn():\n",
    "    f = nn.forward(x)\n",
    "    loss =  softmax_cross_entropy(f,y) #util.softmax_cross_entropy(f,y) # \n",
    "    return loss+nn.reg_loss(reg)\n",
    "\n",
    "numerical_grads = util.numerical_gradient(loss_fn,nn_params,1e-6)\n",
    "for i in range(len(numerical_grads)):\n",
    "    print(diff_error(params[i][1],numerical_grads[i]))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "可见数值梯度和分析梯度非常接近，初步确定模型的forward()和backward()没什么问题。\n",
    "\n",
    "### 4.3.8 独立的参数优化器\n",
    "\n",
    "为了方便用不同的梯度下降优化策略更新模型参数，可以将它们编写为一个单独的类，如："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 59,
   "metadata": {},
   "outputs": [],
   "source": [
    "class SGD():\n",
    "    def __init__(self,model_params,learning_rate=0.01, momentum=0.9):\n",
    "        self.params,self.lr,self.momentum = model_params,learning_rate,momentum\n",
    "        self.vs = []\n",
    "        for p,grad in self.params:\n",
    "            v = np.zeros_like(p)\n",
    "            self.vs.append(v)\n",
    "        \n",
    "    def zero_grad(self):        \n",
    "        #for p,grad in params:\n",
    "        for i,_ in enumerate(self.params):           \n",
    "            #self.params[i][1][:] = 0.          \n",
    "            self.params[i][1].fill(0) \n",
    "                \n",
    "    def step(self):           \n",
    "        for i,_ in enumerate(self.params):     \n",
    "            p,grad = self.params[i]           \n",
    "            self.vs[i] = self.momentum*self.vs[i]+self.lr* grad             \n",
    "            self.params[i][0] -= self.vs[i] \n",
    "            #self.params[i][0][:] =  self.params[i][0] - self.vs[i] \n",
    "      \n",
    "    def scale_learning_rate(self,scale):\n",
    "        self.lr *= scale"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 60,
   "metadata": {},
   "outputs": [],
   "source": [
    "learning_rate = 1e-1\n",
    "momentum = 0.9\n",
    "optimizer = SGD(nn.parameters(),learning_rate,momentum)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "同样，也可以定义其他的优化器类，如下面的Adam优化器："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 61,
   "metadata": {},
   "outputs": [],
   "source": [
    "class Adam():\n",
    "    def __init__(self,model_params,learning_rate=0.01, beta_1 = 0.9,beta_2 = 0.999,epsilon =1e-8):\n",
    "        self.params,self.lr = model_params,learning_rate\n",
    "        self.beta_1,self.beta_2,self.epsilon = beta_1,beta_2,epsilon\n",
    "        self.ms = []\n",
    "        self.vs = []\n",
    "        self.t = 0\n",
    "        for p,grad in self.params:\n",
    "            m = np.zeros_like(p)\n",
    "            v = np.zeros_like(p)\n",
    "            self.ms.append(m)\n",
    "            self.vs.append(v)\n",
    "        \n",
    "    def zero_grad(self):        \n",
    "        #for p,grad in params:        \n",
    "        for i,_ in enumerate(self.params):\n",
    "            #self.params[i][1][:] = 0.          \n",
    "            self.params[i][1].fill(0) \n",
    "                \n",
    "    def step(self):   \n",
    "        #for  i in range(len(self.params)): \n",
    "        beta_1,beta_2,lr = self.beta_1,self.beta_2,self.lr\n",
    "        self.t+=1\n",
    "        t = self.t\n",
    "        for i,_ in enumerate(self.params):     \n",
    "            p,grad = self.params[i]       \n",
    "            \n",
    "            self.ms[i] = beta_1*self.ms[i]+(1-beta_1)*grad\n",
    "            self.vs[i] = beta_2*self.vs[i]+(1-beta_2)*grad**2 \n",
    "            \n",
    "            m_1 = self.ms[i]/(1-np.power(beta_1, t))\n",
    "            v_1 = self.vs[i]/(1-np.power(beta_2, t))  \n",
    "            self.params[i][0]-= lr*m_1/(np.sqrt(v_1)+self.epsilon)\n",
    "      \n",
    "    def scale_learning_rate(self,scale):\n",
    "        self.lr *= scale"
   ]
  },
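  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "下面用一个自包含的小例子演示Adam的更新公式：用step()中同样的计算最小化简单的二次函数f(w)=w*w。其中的学习率、迭代次数等都只是演示用的假设值："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "w = np.array([5.0])                     #待优化的参数，初值为5\n",
    "m,v = np.zeros_like(w),np.zeros_like(w)\n",
    "lr,beta_1,beta_2,epsilon = 0.1,0.9,0.999,1e-8\n",
    "for t in range(1,501):\n",
    "    grad = 2*w                          #f(w)=w**2的导数\n",
    "    m = beta_1*m+(1-beta_1)*grad\n",
    "    v = beta_2*v+(1-beta_2)*grad**2\n",
    "    m_1 = m/(1-np.power(beta_1, t))     #偏差修正\n",
    "    v_1 = v/(1-np.power(beta_2, t))\n",
    "    w -= lr*m_1/(np.sqrt(v_1)+epsilon)\n",
    "print(w)                                #w应接近最小值点0"
   ]
  },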
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "下面的训练函数train()接受一个数据迭代器，每次从中取出一批训练样本（input, target），对每批样本，先执行forwrd()计算输出output，然后用损失函数计算其损失loss和损失函数关于输出output的梯度loss_grad，将这个梯度loss_grad通过backward()函数反向回传，求得模型参数和中间变量的梯度。随后用optimizer的step()函数更新模型参数。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 62,
   "metadata": {},
   "outputs": [],
   "source": [
    "def data_iterator(X,y,batch_size,shuffle=False):\n",
    "    m = len(X)  \n",
    "    indices = list(range(m))\n",
    "    if shuffle:                 # shuffle是True表示打乱次序\n",
    "        np.random.shuffle(indices)\n",
    "    for i in range(0, m - batch_size + 1, batch_size):\n",
    "        batch_indices = np.array(indices[i: min(i + batch_size, m)])      \n",
    "        yield X.take(batch_indices,axis=0), y.take(batch_indices,axis=0)"
   ]
  },
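  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "下面用一个小例子（演示用的假设数据）检验data_iterator的行为：5个样本、batch_size为2时只产生2个完整批次，末尾不足一批的样本被丢弃："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "X = np.arange(10).reshape(5,2)\n",
    "y = np.arange(5)\n",
    "batches = list(data_iterator(X,y,batch_size=2))\n",
    "print(len(batches))                      #2，最后1个样本不足一批被丢弃\n",
    "for X_batch,y_batch in batches:\n",
    "    print(X_batch.shape,y_batch.shape)   #(2, 2) (2,)"
   ]
  },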
  {
   "cell_type": "code",
   "execution_count": 63,
   "metadata": {},
   "outputs": [],
   "source": [
    "def train_nn(nn,X,y,optimizer,loss_fn,epochs=100,batch_size = 50,reg = 1e-3,print_n=10):\n",
    "    iter = 0\n",
    "    losses = [] \n",
    "    for epoch in range(epochs):\n",
    "        for X_batch,y_bacth in data_iter(X,y,batch_size):     \n",
    "            optimizer.zero_grad()      \n",
    "            \n",
    "            f = nn(X_batch) # nn.forward(X_batch)      \n",
    "            loss,loss_grad = loss_fn(f, y_bacth)       \n",
    "            nn.backward(loss_grad,reg)               \n",
    "            loss += nn.reg_loss(reg)\n",
    "          \n",
    "            optimizer.step()\n",
    "\n",
    "            losses.append(loss)\n",
    "            \n",
    "            if iter%print_n==0:\n",
    "                print(iter,\"iter:\",loss)\n",
    "            iter +=1          \n",
    "\n",
    "    return losses"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "现在可以用这个神经网络去训练前面的3分类的问题"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 65,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0 iter: 1.0985916677722303\n",
      "480 iter: 0.7056240023920841\n",
      "960 iter: 0.6422407772314334\n",
      "1440 iter: 0.5246104670488081\n",
      "1920 iter: 0.4186441561530432\n",
      "2400 iter: 0.37118840941018727\n",
      "2880 iter: 0.34583485668931857\n",
      "3360 iter: 0.32954842747580104\n",
      "3840 iter: 0.31961537369884196\n",
      "4320 iter: 0.3124394704919282\n",
      "4800 iter: 0.30620107113884415\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "[<matplotlib.lines.Line2D at 0x1e4f78f69d0>]"
      ]
     },
     "execution_count": 65,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD4CAYAAAD8Zh1EAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nO3deXyV5Z338c8v+0oSkrAmyCqYqrgExKVqtbWIY+lMF5daNyxDn9raTuuMnT5tpzOvmXaWdqrVllLFddSp3UQf19qqKBYJCsqqEUFCWBIgCYGQ9ff8cQ4YQkIOcJI7uc/3/Xqd1znnvq+c/C6F77m47uUyd0dERAa/pKALEBGR+FCgi4iEhAJdRCQkFOgiIiGhQBcRCYmUoH5xUVGRjx07NqhfLyIyKC1fvrzW3Yu72xdYoI8dO5aKioqgfr2IyKBkZpt62qcpFxGRkFCgi4iEhAJdRCQkFOgiIiGhQBcRCYleA93MFprZDjNb1cP+KWb2mpk1m9m34l+iiIjEIpYR+n3AzCPs3wV8DfiveBQkIiLHptdAd/eXiYR2T/t3uPsyoDWehfXk/dq9/OS59by4fgf7W9v741eKiAwK/XphkZnNBeYCjBkz5pg+4+0t9dz550o6HHIzUph3wQTmXTCB5CSLZ6kiIoNOvwa6uy8AFgCUl5cf08oan5o6iounDGPZxl38z9IP+M9n11O1u4kf/s0pca1VRGSwGZRnuWSnp3Dh5GH86tpy/vaC8Tzy+gf8ad32oMsSEQnUoAz0zr75icmcUJjFnX+qDLoUEZFAxXLa4iPAa8BkM6syszlmNs/M5kX3jzCzKuDvgP8bbTOkb8v+UFpKEtedPZY3Pqjj3e17+uvXiogMOL3Oobv7Vb3s3waUxK2iY3DpKSP45yfX8Me1O5g0PDfIUkREAjPop1wARuZlctLIISx+tyboUkREAhOKQAcoP6GAt6rq6eg4ppNnREQGvdAE+qkleTQ2t7GhtjHoUkREAhGiQM8HYNWWhoArEREJRmgCfWxRFkkGG2o0QheRxBSaQE9PSaZ0aBbv1e4NuhQRkUCEJtABxhdls6FGgS4iiSlUgT62KJuNtXtx15kuIpJ4QhXoo/MzaWptp76pX+7kKyIyoIQq0EfmZQJQXbc/4EpERPpfqAJ9VH4GAFvrmwKuRESk/4Us0KMj9HqN0EUk8YQq0Ity0klJMrbWaYQuIoknVIGenGQMH5LBtgaN0EUk8YQq0AGKctKobWwJugwRkX4XukAvzElnZ2Nz0GWIiPS7WFYsWmhmO8xsVQ/7zczuMLNKM3vLzM6If5mxi4zQFegiknhiGaHfB8w8wv5LgUnRx1zgF8df1rEryklnZ2OL7osuIgmn10B395eBXUdoMht4wCP+AuSb2ch4FXi0CnPSaetwGvbralERSSzxmEMfDWzu9L4quu0wZjbXzCrMrKKmpm+WiyvKSQPQtIuIJJx4BLp1s63b+Q53X+Du5e5eXlxcHIdffbjinHQAavboTBcRSSzxCPQqoLTT+xKgOg6fe0yKciOBvnOvRugikljiEeiLgGujZ7vMAOrdfWscPveYFGZHp1z2KNBFJLGk9NbAzB4BLgSKzKwK+D6QCuDu84GngFlAJbAPuKGvio1FQVYaSQa79mrKRUQSS6+B7u5X9bLfga/EraLjlJRkFGSlUatAF5EEE7orRQGGZqexS5f/i0iCCWWgF+akacpFRBJOOAM9O11nuYhIwglloA/NTmOnRugikmBCG+h1+1ppa+8IuhQRkX4TykA/cPn/7n26n4uIJI5QBvrQ7MjVojowKiKJJKSBHhmha6ELEUkkoQz0wuiUiw6MikgiCWWgHxiha8pFRBJJKAO9ICsNM43QRSSxhDLQk6P3c9EcuogkklAGOkTv56IRuogkkFAHuqZcRCSRhDbQi3SDLhFJMDEFupnNNLP1ZlZpZrd1s7/A
zH5vZm+Z2etmdnL8Sz06Q7M1hy4iiaXXQDezZOAu4FKgDLjKzMq6NPtHYIW7nwpcC9we70KP1tDsdOqaWmnv6Ha9ahGR0IllhD4dqHT3De7eAjwKzO7Spgx4AcDd1wFjzWx4XCs9SoXZabjD7n2adhGRxBBLoI8GNnd6XxXd1tlK4G8AzGw6cAJQEo8Cj5UuLhKRRBNLoFs327rOY/wIKDCzFcBXgTeBtsM+yGyumVWYWUVNTc1RF3s0Dlz+X6t5dBFJEL0uEk1kRF7a6X0JUN25gbs3ADcAmJkB70cfdGm3AFgAUF5e3qeT24W646KIJJhYRujLgElmNs7M0oArgUWdG5hZfnQfwE3Ay9GQD4ymXEQk0fQ6Qnf3NjO7GXgWSAYWuvtqM5sX3T8fOAl4wMzagTXAnD6sOSYFWakA7GxUoItIYohlygV3fwp4qsu2+Z1evwZMim9pxyclOYmCrFSN0EUkYYT2SlE4cPm/DoqKSGIIdaAXZqdrykVEEkaoA113XBSRRBLqQC/UDbpEJIGEO9Cz09i1r0X3cxGRhBDqQB8avZ9Lne7nIiIJINyBnqOrRUUkcYQ60AujV4tq5SIRSQThDvToDbp06qKIJIJQB/qH93PRxUUiEn6hDvSCLE25iEjiCHWgpyYnMTQ7jR17NEIXkfALdaADjMrPYGtdU9BliIj0ufAHel4m1XX7gy5DRKTPhT/Q8zPZUteEu64WFZFwC32gj87PpLG5jYb9hy1xKiISKjEFupnNNLP1ZlZpZrd1sz/PzJ4ws5VmttrMboh/qcdmVH4mANWaRxeRkOs10M0sGbgLuBQoA64ys7Iuzb4CrHH3qcCFwI87rTEaqFH5GYACXUTCL5YR+nSg0t03uHsL8Cgwu0sbB3LNzIAcYBcwIOY4RhdohC4iiSGWQB8NbO70viq6rbM7iSwUXQ28Ddzi7h1xqfA4FWWnk5aSxObdCnQRCbdYAt262db1lJFPAiuAUcBpwJ1mNuSwDzKba2YVZlZRU1Nz1MUei6QkY1xhNhtq9vbL7xMRCUosgV4FlHZ6X0JkJN7ZDcDvPKISeB+Y0vWD3H2Bu5e7e3lxcfGx1nzUJgzLZkNNY7/9PhGRIMQS6MuASWY2Lnqg80pgUZc2HwAXA5jZcGAysCGehR6PCcU5bNq1j5a2ATELJCLSJ3oNdHdvA24GngXWAr9299VmNs/M5kWb/Qtwjpm9DbwA/IO71/ZV0UdrQnEO7R3Opp2adhGR8EqJpZG7PwU81WXb/E6vq4FL4lta/EwozgGgckcjk4bnBlyNiEjfCP2VogATh+WQnGSsrm4IuhQRkT6TEIGemZbMlBG5rNhcF3QpIiJ9JiECHWBqaT4rN9fR0aGbdIlIOCVMoJ9ems+e5jbe3aHTF0UknBIm0M+bVATAi+t3BFyJiEjfSJhAH5mXSdnIIbywToEuIuGUMIEO8PGy4VRs3MUW3ahLREIooQL98+UlOPDw0k1BlyIiEncJFeglBVlcUjacB5ZsYmdjc9DliIjEVUIFOsCtn5zMvtZ2/uXJNVpnVERCJeECfeKwXL560UT+sKKa+5ZsDLocEZG4ieleLmHz1YsmsWpLAz94Yg0tbR3MPX88kcWWREQGr4QboQMkJxk//8IZzDplBD98eh3zHlpOzR7NqYvI4JaQgQ6QlpLEXVefwf+97CT+tG4HF/34RR54bSPtujWAiAxSCRvoAGbGTR8dz9O3nM+pJXl87/HVXPLfL/HkW9W654uIDDoJHegHTByWw0NzzmL+NWeQZMbND7/JrDsW8+Rb1Rqxi8igEVOgm9lMM1tvZpVmdls3+281sxXRxyozazezofEvt++YGTNPHskzXz+f2688jZa2Dm5++E0u+vGLPLz0A/a3tgddoojIEVlv52KbWTLwDvAJIgtGLwOucvc1PbS/HPiGu190pM8tLy/3ioqKYyq6P7R3OM+v2cbPX3yPt6rqKc5N58Zzx/GFGWMYkpEadHki
kqDMbLm7l3e3L5YR+nSg0t03uHsL8Cgw+wjtrwIeOfoyB5bkpMiI/fGvnMv/3HQWk4fn8u/PrOOcH/6Jf3tqLVvrdT8YERlYYjkPfTSwudP7KuCs7hqaWRYwk8ii0t3tnwvMBRgzZsxRFRoUM+PciUWcO7GIt6vqWbB4A3cv3sDCV97nU6eNYu7545kyYkjQZYqIxDRC7+6Km57maS4HXnX3Xd3tdPcF7l7u7uXFxcWx1jhgnFKSx8+uOp2Xbv0Y18w4gaff3sbMny7muoWvs6SyVrcSEJFAxRLoVUBpp/clQHUPba8kBNMtvSkdmsU/feojLLntIr75iRNZXV3P1Xcv5fI7X+GJldW0tXcEXaKIJKBYDoqmEDkoejGwhchB0avdfXWXdnnA+0Cpu+/t7RcP9IOiR2N/azu/e2MLdy/ewIbavZQUZHLTeeP4/LRSstIS8u4KItJHjnRQtNdAj37ALOCnQDKw0N3/1czmAbj7/Gib64GZ7n5lLEWFKdAP6Ohwnl+7nQUvb2D5pt3kZ6VywznjuPG8seTqzBgRiYPjDvS+EMZA72z5pl384sUN/HHtdgqyUvk/F07ki2efQEZqctClicggpkAP0MrNdfzXc+tZ/G4tw4ek87WLJ3FFeSkpybpIV0SO3vGehy7HYWppPg/OOYtH586gtCCL7/x+FZfd8QpLKmuDLk1EQkaB3k9mjC/ksXlnM/+aM9nb0sbVdy/lyw8tZ/OufUGXJiIhoUDvR5H7xYzgj393Ad+65EReXF/Dx3/yEnf9uZJWneooIsdJgR6AjNRkbr5oEn/61gVcNGUY//nsembf+SqrttQHXZqIDGIK9ACNzMvkF9ecyfxrzqCmsZnZd73Kj55epzs7isgxUaAPADNPHskfv3EBnzljNPNfeo9P3/Uq67ftCbosERlkFOgDRF5WKv/x2ance/00ahub+dSdr3D/ko26P4yIxEyBPsB8bMownr7lfM6eUMj3F61mzv0V7NrbEnRZIjIIKNAHoOLcdO69fhr/dHkZr1TWcvnPXtEBUxHplQJ9gDIzrj93HI/97dm4O5/5xRJ+u7wq6LJEZABToA9wU0vzWfTV8zh9TD7ffGwl3398lW7PKyLdUqAPAkU56Tw05yzmnDeO+1/bxE0PVNDY3BZ0WSIywCjQB4mU5CS++1dl/Ntfn8Lid2v53PzX2Fa/P+iyRGQAUaAPMlefNYaF109j8659fPquV1lT3RB0SSIyQCjQB6ELTizmsXlnYwZXLHiNio3dLuEqIgkmpkA3s5lmtt7MKs3sth7aXGhmK8xstZm9FN8ypauTRg7hN18+h+KcdK65ZykvvVMTdEkiErBeA93MkoG7gEuBMuAqMyvr0iYf+DnwKXf/CPC5PqhVuhidn8mv553N+KIcbrp/GU+9vTXokkQkQLGM0KcDle6+wd1bgEeB2V3aXA38zt0/AHD3HfEtU3pSlJPOI3NnMLUkn5sffoPHKjYHXZKIBCSWQB8NdE6Jqui2zk4ECszsRTNbbmbXdvdBZjbXzCrMrKKmRlME8ZKXmcoDc6Zz7sQi/v63b+kCJJEEFUugWzfbut4xKgU4E7gM+CTwXTM78bAfcl/g7uXuXl5cXHzUxUrPstJS+NW15ZwzoZBbf7OSx1dsCbokEelnsQR6FVDa6X0JUN1Nm2fcfa+71wIvA1PjU6LEKiM1mbuvncb0cUP5xv+u4ImVXf83iUiYxRLoy4BJZjbOzNKAK4FFXdo8DnzUzFLMLAs4C1gb31IlFplpySy8fhrlJwzl6/+7gmdWbQu6JBHpJ70Guru3ATcDzxIJ6V+7+2ozm2dm86Jt1gLPAG8BrwN3u/uqvitbjiQrLYWFN0zj1JI8vvbImyx5rzbokkSkH1hQCyiUl5d7RUVFIL87UdTta+Hzv3yN6rr9PPKlGZxSkhd0SSJynMxsubuXd7dPV4qGWH5WGg/ceBZ5malcd+/rvFfTGHRJItKHFOghNyIvgwfn
TMeAa+95na31TUGXJCJ9RIGeAMYX53D/jdOpb2rl2nteZ7eWtBMJJQV6gjh5dB6/uracTTv3MffBCva3tgddkojEmQI9gZw9oZAff34qyzbu5puPraSjI5gD4iLSN1KCLkD61+VTR1Fd18QPn15HSX4m3551UtAliUicKNAT0Nzzx1O1u4lfvryBkoJMvnj22KBLEpE4UKAnIDPj+5eXsbW+ie8vWs3IvEw+XjY86LJE5DhpDj1BpSQnccdVp3Py6Dy++sibrNxcF3RJInKcFOgJLCsthXuum0ZhThpz7l/G5l37gi5JRI6DAj3BFeemc98N02ltd667V+eoiwxmCnRh4rAc7r6unKrdTcy5f5nOURcZpBToAsC0sUO5/YrTeHNzHV975E3adY66yKCjQJeDLj1lJN+9rIzn1mznB0+sJqg7cYrIsdFpi3KIG88bx9b6Jn61+H1G5Wcy74IJQZckIjFSoMthvn3pSWxraOZHT69jZF4Gs0/ruia4iAxEMU25mNlMM1tvZpVmdls3+y80s3ozWxF9fC/+pUp/SUoy/utzpzJj/FC+9dhKXq3Uikcig0GvgW5mycBdwKVAGXCVmZV103Sxu58WffxznOuUfpaekswvv1jO+KIc5j24nLVbG4IuSUR6EcsIfTpQ6e4b3L0FeBSY3bdlyUCQl5nKvTdMIzs9hevvfV0XHokMcLEE+mhgc6f3VdFtXZ1tZivN7Gkz+0h3H2Rmc82swswqampqjqFc6W+j8jO578ZpNLW0c809S9nRsD/okkSkB7EEunWzrev5bG8AJ7j7VOBnwB+6+yB3X+Du5e5eXlxcfHSVSmCmjBjCfTdOp2ZPM9fcs1RXk4oMULEEehVQ2ul9CVDduYG7N7h7Y/T1U0CqmRXFrUoJ3BljCrj72nI27tzHdfe+zp79rUGXJCJdxBLoy4BJZjbOzNKAK4FFnRuY2Qgzs+jr6dHP3RnvYiVY50ws4udXn8Ga6gbm3F9BU4tuESAykPQa6O7eBtwMPAusBX7t7qvNbJ6ZzYs2+yywysxWAncAV7ouMwylj5cNjy5jt4u/fWi57vsiMoBYULlbXl7uFRUVgfxuOX7/u+wDbvvd25w7oYhfXVtOZlpy0CWJJAQzW+7u5d3t071c5JhcMW0M//GZU3n1vVquv/d19ja3BV2SSMJToMsx+1x5KT+94jQqNu3muoU6UCoSNAW6HJfZp43mZ1edzorNdXzh7qXUNjYHXZJIwlKgy3GbdcpIfvnFM3ln+x4+84slbKzdG3RJIglJgS5xcfFJw3nkSzNoaGrlM79YwgotOi3S7xToEjenjyngt18+h6z0ZK5a8BeeWbU16JJEEooCXeJqfHEOv/vyuUwekcu8h97gv59/hw4tZyfSLxToEnfFuek8OncGnz2zhNtfeJd5Dy2nUac1ivQ5Bbr0iYzUZP7zs6fy/cvLeGHdDmbf+YruqS7SxxTo0mfMjBvOHceDc6bTsL+N2Xe9yoOvbdTi0yJ9RIEufe6cCUU8fctHOWdCId99fDXzHlqu89VF+oACXfpFUU46C6+bxndmncSf19VwyX+/zKKV1Rqti8SRAl36TVKS8aXzx/Pk186jdGgWX3vkTeY+uFyrIInEiQJd+t2Jw3P57byz+cdZU3j5nRou/slL3Pvq+7S2dwRdmsigpkCXQKQkJzH3/Ak8fctHOa00nx88sYbL7ljMksraoEsTGbQU6BKo8cU5PHDjdBZ88UyaWtu5+u6lfOmBCtZt0ymOIkcrpkA3s5lmtt7MKs3stiO0m2Zm7Wb22fiVKGFnZlzykRE8/40LuPWTk/nLezu59PbF3PLom7rRl8hR6HXFIjNLBt4BPkFkwehlwFXuvqabds8D+4GF7v6bI32uViySntTta2H+Sxu4b8n7tLY7l586krnnT6Bs1JCgSxMJ3PGuWDQdqHT3De7eAjwKzO6m3VeB3wI7jrlSESA/K43bLp3Cy7d+jOvPGcvza7Yz
647FfPGepbxaWatTHUV6EEugjwY2d3pfFd12kJmNBv4amH+kDzKzuWZWYWYVNTU1R1urJJhhQzL47l+VseS2i7n1k5NZu3UPX7h7KZfevpgHXttIg1ZIEjlELIFu3WzrOkT6KfAP7n7EJeDdfYG7l7t7eXFxcaw1SoLLy0rlKx+byCv/8DF+9DenkJxkfO/x1Zz1ry/w979ZyYrNdRq1iwApMbSpAko7vS8Bqru0KQceNTOAImCWmbW5+x/iUqUIkRt+XTl9DFdMK+WtqnoeXvoBi1ZW8+uKKsYXZzN76mhmnzaKsUXZQZcqEohYDoqmEDkoejGwhchB0avdfXUP7e8DntRBUekPDftb+X9vbeUPb25h6fu7AJhams+nTxvFZaeOZFhuRsAVisTXkQ6K9hro0Q+YRWRaJZnIGSz/ambzANx9fpe296FAlwBU1zXxxMpqHl9RzZqtDZjB6aX5XPKREVxSNpzxxTlBlyhy3I470PuCAl360rvb9/DMqm08t2Y7b2+pB2DisBw+UTacC08s5owTCkhN1nV1Mvgo0CWhbalr4o9rtvPcmm38ZcMu2juc3PQUzplYyAUnDuP8E4soKcgKukyRmCjQRaIa9reypHInL71Tw8vv1LClrgmIjN7Pn1TMuRMLKT9hKHlZqQFXKtI9BbpIN9yd92oaeXF9DS+9U8PS93fR0ha54+Pk4bmUjy1g2tihlI8tYHR+JtGzuEQCpUAXicH+1nZWbK6jYuMulm3czRubdrMnurj1yLwMzjyhgDPGFHBqSR5lo4aQlRbLWb8i8XWkQNefSJGojNRkZowvZMb4QgDaO5z12/ZQsSkS8Mve38WTb20FIMlgQnEOp4zO4+TReZxSkkfZyCFkp+uvlARHI3SRo7C9YT9vV9Xz9pZ6Vm2JPO/YE1kf1QzGFWUzZUQuk4cPYfKIXCaPyGXM0CySkzRdI/GhEbpInAwfksHwsgw+Xjb84LYdDft5Oxrua6obWFPdwNOrtnFgrJSRmsSJw3M5cXguU0Z8+Fycm655eYkrjdBF+sC+ljbe3d7I+u17WL8t8li3bQ+1jc0H2xRkpR4M98kjhjB5RA4nDs8lN0Nn2EjPNEIX6WdZaSlMLc1namn+Idt3NjYfDPl3tkdC/jfLq9jb8uF97UbnZzJ5RC5jC7M5oTCLMUOzGFOYRUlBJukpyf3dFRlEFOgi/agwJ51zctI5Z0LRwW0dHc6WuqbISL5T2L/23k6aWj8MejMYlZfJmKFZkaAvzKK0IItR+RmMzMtkWG46Kbr6NaEp0EUClpRklA7NonRo1iFz8+5OTWMzH+zcx6ad+9i0ax8f7NzLpl37eH7NdnbubTnkc5KTjGG56YzKz2RkXsbB55F5mYzKj7wvzE7TvH2IKdBFBigzY1huBsNyMygfO/Sw/Y3NbWzZ3UR1fRNb6/ZTXffh61Vb6nluzfaDF0odkJpsFGanU5SbRnFOOkU56RTlRp6Lc9MpyolsL85NJy8zVeE/yCjQRQapnPSUg6dGdsfd2bW3ha31+9lS18TWuia272mmdk8zNY2Rx9qtkQO1bR2HnxzROfwLsyMB3/UxpPP7rMhzdlqyvggCokAXCSkzozAnncKcdE4enddju44Op76pldpoyNfsaaa2sSXyfk8ztY3N7Nrbwsade6lvaqWhqZVu8v+glCQ7GPSHBH5mypG/EDJTyUlP0ZfBcVCgiyS4pCSjIDuNguw0Jg3vfrTfWUeH09jSRv2+1oMBX3+kx74WPjjwZbC/jfYjfBskJxlDMlJ6DPxDwj8jhdyMyJdAbkYKOekpZCX4vw4U6CJyVJKSjCEZqQzJSD1kbcpYuDuNzW2HBH7DIa/bDvtC2LK76eDr7qaGDqnNIDs9hdz0lEMCPycjui09hezoc07GgdfJZKcduj0nPYX0lKRB9+UQU6Cb2UzgdiIrFt3t7j/qsn828C9AB9AGfN3dX4lzrSIyyJkZuRmp5GakUlJwdD/r7uxr
aT8Y7o3NbTTub2NP9LmxufXg+z37D2xro66plard+2iMbt/XcsS17A9KTjKy05K7hH9Kp/BP7nZ7bucviugXSHZaSr/c/qHXQDezZOAu4BNEFoxeZmaL3H1Np2YvAIvc3c3sVODXwJS+KFhEEpOZHQzIUfmZx/w57R3OvpY29ja3R74EmtvZ2xwJ/wPPB15H2hy6fXvD/oPbG5uPPIXUWVZa8sHw/8JZY7jpo+OPuQ89iWWEPh2odPcNAGb2KDAbOBjo7t7YqX02EMz9BEREepGc9OG/EuD4FhF3d5rbOg4J/a5fFId/WbRTlJMen850EUugjwY2d3pfBZzVtZGZ/TXwQ2AYcFl3H2Rmc4G5AGPGjDnaWkVEBhQzIyM1mYzU5D4L6aMRy3XC3U38HDYCd/ffu/sU4NNE5tMP/yH3Be5e7u7lxcXFR1epiIgcUSyBXgWHHMwuAap7auzuLwMTzKyopzYiIhJ/sQT6MmCSmY0zszTgSmBR5wZmNtGi5/eY2RlAGrAz3sWKiEjPep1Dd/c2M7sZeJbIaYsL3X21mc2L7p8PfAa41sxagSbgCg/qRusiIglKC1yIiAwiR1rgQjdPFhEJCQW6iEhIKNBFREIisDl0M6sBNh3jjxcBtXEsZzBQnxOD+pwYjqfPJ7h7txfyBBbox8PMKno6KBBW6nNiUJ8TQ1/1WVMuIiIhoUAXEQmJwRroC4IuIADqc2JQnxNDn/R5UM6hi4jI4QbrCF1ERLpQoIuIhMSgC3Qzm2lm682s0sxuC7qe42FmC81sh5mt6rRtqJk9b2bvRp8LOu37drTf683sk522n2lmb0f33XHgzpcDjZmVmtmfzWytma02s1ui28Pc5wwze93MVkb7/IPo9tD2+QAzSzazN83syej7UPfZzDZGa11hZhXRbf3bZ3cfNA8id3t8DxhP5Ba9K4GyoOs6jv6cD5wBrOq07T+A26KvbwP+Pfq6LNrfdGBc9L9DcnTf68DZRBYjeRq4NOi+9dDfkcAZ0de5wDvRfoW5zwbkRF+nAkuBGWHuc6e+/x3wMPBk2P9sR2vdCBR12davfR5sI/SD65u6ewtwYH3TQckji4Hs6rJ5NnB/9PX9RFaAOrD9UXdvdvf3gUpgupmNBJrGQDEAAAI9SURBVIa4+2se+dPwQKefGVDcfau7vxF9vQdYS2SJwzD32f3DNXdTow8nxH0GMLMSIktR3t1pc6j73IN+7fNgC/Tu1jcdHVAtfWW4u2+FSAASWaMVeu776OjrrtsHNDMbC5xOZMQa6j5Hpx5WADuA59099H0Gfgr8PdDRaVvY++zAc2a23CLrJ0M/9zmWRaIHkpjWNw2pnvo+6P6bmFkO8Fvg6+7ecIQpwlD02d3bgdPMLB/4vZmdfITmg77PZvZXwA53X25mF8byI91sG1R9jjrX3avNbBjwvJmtO0LbPunzYBuhH9X6poPU9ug/u4g+74hu76nvVdHXXbcPSGaWSiTM/8fdfxfdHOo+H+DudcCLwEzC3edzgU+Z2UYi06IXmdlDhLvPuHt19HkH8HsiU8T92ufBFui9rm8aAouA66KvrwMe77T9SjNLN7NxwCTg9eg/4/aY2Yzo0fBrO/3MgBKt7x5grbv/pNOuMPe5ODoyx8wygY8D6whxn9392+5e4u5jifwd/ZO7X0OI+2xm2WaWe+A1cAmwiv7uc9BHho/hSPIsImdHvAd8J+h6jrMvjwBbgVYi38xzgELgBeDd6PPQTu2/E+33ejod+QbKo3943gPuJHoF8EB7AOcR+efjW8CK6GNWyPt8KvBmtM+rgO9Ft4e2z136fyEfnuUS2j4TOfNuZfSx+kA29Xefdem/iEhIDLYpFxER6YECXUQkJBToIiIhoUAXEQkJBbqISEgo0EVEQkKBLiISEv8f5OhX9djROHMAAAAASUVORK5CYII=\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "\n",
    "import data_set as ds\n",
    "import util\n",
    "\n",
    "np.random.seed(1)\n",
    "nn = NeuralNetwork()\n",
    "nn.add_layer(Dense(2, 100, ('no', 0.01)))  # 2 input features -> 100 hidden units\n",
    "nn.add_layer(Relu())\n",
    "nn.add_layer(Dense(100, 3, ('no', 0.01)))  # 3 output classes\n",
    "\n",
    "X, y = ds.gen_spiral_dataset()\n",
    "epochs = 5000\n",
    "batch_size = len(X)  # full-batch gradient descent\n",
    "reg = 0.5e-3         # L2 regularization strength\n",
    "print_n = 480        # print the loss every print_n iterations\n",
    "\n",
    "learning_rate = 1e-1\n",
    "momentum = 0.5\n",
    "optimizer = SGD(nn.parameters(), learning_rate, momentum)\n",
    "\n",
    "losses = train_nn(nn, X, y, optimizer, cross_entropy_grad_loss, epochs, batch_size, reg, print_n)\n",
    "\n",
    "import matplotlib.pylab as plt\n",
    "%matplotlib inline\n",
    "plt.plot(losses)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.3.9 Classification Training on fashion-mnist"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 66,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(60000, 784) (60000,)\n",
      "uint8 uint8\n"
     ]
    }
   ],
   "source": [
    "import mnist_reader  # loader script shipped with the fashion-mnist dataset repository\n",
    "X_train, y_train = mnist_reader.load_mnist('data/fashion', kind='train')\n",
    "X_test, y_test = mnist_reader.load_mnist('data/fashion', kind='t10k')\n",
    "print(X_train.shape, y_train.shape)\n",
    "print(X_train.dtype, y_train.dtype)"
   ]
  },
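  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The loader returns each image as a flattened row of `uint8` pixel values. Before feeding them to a network it is usual to cast them to floats scaled to [0, 1]; the cell below is a minimal sketch of that step (the names `X_train_f` and `X_test_f` are illustrative):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Minimal preprocessing sketch: scale uint8 pixels to floats in [0, 1].\n",
    "# X_train_f / X_test_f are illustrative names, not part of the book's code.\n",
    "X_train_f = X_train.astype(np.float32) / 255.0\n",
    "X_test_f = X_test.astype(np.float32) / 255.0\n",
    "print(X_train_f.dtype, X_train_f.min(), X_train_f.max())"
   ]
  },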
  {
   "cell_type": "code",
   "execution_count": 67,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(60000, 28, 28)\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAU4AAAD7CAYAAAAFI30bAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOy9W2xk2XUe/O26nLrfSBbv7HvPtGZGox5ppJFHmnGsQLIuMRw9xIoCB7EdwC8JYBt5sJyX/G+/ngL8QAIYAmJINow4AiRAgS3DNmRFkUZSe6ZHo+mZ7pkedjf7xmsVWfd71fkf2N/mqs1TbF6KxSL7fABRZNWpU4dn1V57Xb61lrJtGy5cuHDhYvfwHPUFuHDhwsVxg6s4Xbhw4WKPcBWnCxcuXOwRruJ04cKFiz3CVZwuXLhwsUe4itOFCxcu9ogDKU6l1OeVUu8rpeaVUl/r10W5OFq4cj25cGXbH6j98jiVUl4ANwF8FsADAK8D+Kpt29f7d3kuBg1XricXrmz7B98B3vsJAPO2bd8GAKXUXwH4TQA9haCUetLZ9hnbttNHfRGPwdDI1ePxwO/3w+fzYXR0FIFAANVqFbVaDe12G41GA7Ztg5s/H30+H/x+P/x+P8LhMJRSaDab6HQ6KBaLqFQqsG0bnU6nX5d6HOQK7FG2/Zarx+PRMo1GowCAUqmkZbNbeXg8HliWpc/l8XhQrVZRr9fR54KennI9iOKcAXBf/P0AwEsHON+TgLtHfQG7wMDl6vF4oJSC1+vVi8vr9SIcDmNychLpdBpf+cpXcO7cOVy/fh3z8/PI5XJYXFxEq9VCu91Gp9NBq9UCAIyOjiKdTiOdTuP555+Hz+dDJpNBuVzGT3/6U/zyl79EvV5HtVpFp9NBu92Gbdv6XFIZ7xLHQa7AEa/ZcDiMeDyO6elpvPLKKwCAK1euYHl5GaVSCeVyuet4ykAp1fV8LBbD7OwswuEw0uk0LMvCu+++i/n5ebRaLTQajX5dck+5HkRxKofntn3blFK/D+D3D/A5LgaLgcrVsiyMjY0hFArh3LlzGBsbw+TkJObm5uDz+RAMBhEMBnHx4kUkEglEo1E8++yzWuEB0JYKFV4gEEAgENCK2LZtTExMoNVq4ZlnnkG9XketVkO5XEalUsG9e/dQLBbxzjvvYHFxEYVCAfl8/qD/2jDisbI9jPXq9/vh9XoxOTmJ8+fP49SpU3jppZcQCATw9NNPo1wua1nQE7BtW8uX7+eGGgqFMD4+Dtu2sbKyglKphHa7jXA4jEwmgzt37uj3HhYOojgfAJgTf88CWDQPsm37GwC+Abiu+jHBQOXq9XoRj8eRSCRw4cIFnDlzBhcvXsRzzz0HpRQ6nQ6UUggEAvB4PBgdHYXX60UgEEAkEtlmjUhUKhUsLy+j1Wpp6yWdTiOVSqFWq6FYLCKfz+PatWvIZDIoFAool8totVooFAr9dvuGAY+V7WGsV4/HA5/Ph2QyienpaczNzeHMmTOIRCI4c+YMOp0OyuWy9gAoL3oQwWAQPp8PPp8PXq8XwWAQqVQK9Xodv/jFL7C6uopyuayV7sLCQj8ue0ccRHG+DuCiUuosgIcA/jWAf9OXq3JxlDhUuXq9Xni9XkxPT+O5555DLBbD6dOnEYlEMDc3h1QqhZGREW0xcPG0Wi0opbRb7/P5UCwW9aJUSmmXHdi0PhuNhrZiqARzuRzq9Tra7TaazSba7TbGxsYQDofx6quv4plnnsHy8jIePnyI1dVVvP3226jVav36948aR7JmGf5oNBpoNBpYXV3Fz3/+c0SjUZw9exbxeByhUAjhcLhLnoSMYVOh0r1fWFjA6uoqCoXCnuKkB8W+Fadt2y2l1H8E8HcAvAD+zLbtd/t2ZS6OBIctV5/PB8uy8PTTT+MrX/kKRkZGcOrUKQSDQR3fbDabWqkxcSBjj7Ztb4uFKqXQaDS0tWIm
jIh6vQ5gU4H7fD54PB5MTExAKYWLFy9CKYUHDx7g7t27uHbtGj744IMToziPas1y82OIZGlpCffv30c8Hkc0GkUkEkEsFkM0GkUwGEQsFtOJH2DTc2g0Gmg2m2g0GlhfX8eDBw+QzWZx8+ZNrKyswLIs+P3+gXkJB7E4Ydv29wF8v0/XMjDQagkEAgiFQmg2myiVSj13K6XUSXTbeuKw5KqUwtTUFGZnZ3HhwgWMjIzo7Kq0Ftvttv6b8S7z/vO5TqeDRqMBj8eDVqvV83gT8tz8PvD3YDCI0dFRfX21Wg31en1g1sxh4qjXrMfj0e54rVbDysqK9kCYKa/X6/B6vfp+1+t1NJtNlMtllEolrK2tYXFxEblcDu12G5Zl6bDOYcc2iQMpzuMKUly4iDc2NvDee+9pa0SC7mGf6StPJDweD1599VV86UtfQiqVwtTUFACg0WigVqtphaeU0orMXAh8TR5TrVb1+0z0ioFS4ZKqJBEKhXDx4kWUSiXMzMzA6/ViZWUF1Wq1T3fiyQSZE8Dm5lgqlXD16lW89957+MQnPqHpZtwIfb5N9USqUSaT0WGU1157DbVaDdFoFPF4XB8z9K76cYBpKTIzR5cgkUggHo+j1WohGAwC2HLtaPW46A/ooicSCUxMTCAUCsHv92urT8anaAEylsnnJKRClHKSz/McEtLd72WdSm8kkUigUqkgm83250a4ALAlp3K5jGazifX1dWQyGQQCAdRqNXi93i5XvV6vI5vNIpPJIJfLaUUZjUZ1THSQXuGJVZxy0di2Db/fj1QqhVAohOeffx4zMzOaKM0sa61W0zGytbU1rK+vO1o9Tp9BOO14XPRPqsXq8/kwOTmJeDyOqakpjI6O6kyqtDJpjTCD6vf7EYlE9CKiQgXQ5QFI61Ny/+Q5qSQZP63X6zo84ySXTqeDZDKJl156CUtLS9jY2ECxWBzE7TqxoHuulEI4HNbfgUKhgJ/97Ge4du0aQqEQ4vG4lr3H48HGxgYqlQqKxSIKhYJOFIbDYR2nlt+NQeDEKk6g2+Jk7CoajWJ6ehrnzp3Ti6jT6WB0dBS1Wk0nJZixBR6v8EwrR+58gxTmsMLj8SASiSAejyMSiSAYDHZtUl6v19FSJO3I6/XqOBYXCJNFPJ6xM5M0TYuVCpKLjL8DcLQ6yQedmprSv7s4ONrttt4UWclVq9V0uCUcDiOZTMLn8yEej8Pj8SCbzWqeZ7VahWVZekMdtMIkTqziNONglmVhbm4OY2NjuHDhAs6fP49qtYpKpYJoNKpL+CiEubk55HI5TahtNpuo1WpdStTM3NK1lFbRThbrkwLGk6empjAyMqIXjtfr7YpNSgu+3W5jZWUFV69e1TSWdrutEzV+vx+BQACdTkfTiwhpxVIeMkwTCoUwMjKC2dlZANDxVdNLCYfDuHTpks7+ujgYZHgE2JR3MBjUXgGTc5VKBR6PR7MZmPQLBAI6gURL04m2NAicaMXJpA6wWaEyNTWlq1JOnTqlTf9wOKz5YRRMqVRCrVbD/Pw8MpmMpkSY1qcpLPIU+Xyr1dJ0jCcVXq8X4+PjmJubQzwe1/eI96lXUiibzeLKlSvI5/PI5XKatF4sFhEKhZBMJtFqtZDL5fQ9lllbfga9DZ/Ph5mZGYyPj+PSpUs4ffo0vF6vTgry8/m9CYfDOHv2LAKBAMLh8FHewhMBM66slNKJWoJhFEk/syxLb7aWZelzyE1u0DixipOLjzfV6/UikUhgdHRUl+5JazCRSKBer+tgNa2TqakpXL58GeVyGWtra5pUTQHXajXdtMCyLCSTSQQCAZRKJVQqFVQqFeRyOR1fexLh8/m04ozFYtvcaRlO8Xg8Wm7lchkLCwvI5/N6Q2NSqV6vY2lpSVceeb3ebRlZno/KtNFoYHFxEdlsFrFYDO12uysRxWOBLZcyGAwiHA4jGo0iFouhXq/3sxb6iYaTxyYbu9BIkRvgTnDKORwWTqziNC1Dy7IwPT2N
U6dOYWJiAqOjo9pdZMUClWMul8Pk5CRGR0cxOjqKD33oQ6hWq1hYWECpVMLS0hLy+bzOBMZisa4SskgkggcPHmBpaQmZTKaLzP0kwrIsXLx4EZcvX0YymeziT0rXjaR0xiPX19fx+uuvo1Ao4Omnn9abXiQSwZ07d3Dz5k3EYjG88soriMfjWF5eRj6f16V5UmmWSiXU63XcuXMH2WwWkUgEzWazy5Xn+1jl4vP5EIvFUKvVkE6nMT4+jmw26yrOfULGqAlaoVSKTAw6QSpOMxxGGbqKs0+g1UCLgfQFLly6AKFQCACQTCbh8XgQDoe7XEkmKtrtdle8y+fzIRwO64w94zXRaBQTExOwbRuZTEbXPz+JYAaU2VLTNZchFQBdll+z2YRt25pzG4lEEAqF0Gg0cPfuXSQSCZw6dQqJRAIANI0oFovpc7fbbWxsbKBaraJQKOgKI0l/kp8vKVI8jt+hJ1WG/YBUnI9TcDtZlzK5Z/J6B4UTrziTySTOnDmD06dP4/z585iZmYFlWTrREwgEuhIN6XRa88ry+TzK5TJyuZx230OhEKanpzXFic0nvF4v2u02VldXkcvlcPr0abzwwgu4deuWLhVbXl4+6ttxJKCrfurUKR07lu65rNrh8UwcKKUQj8fxW7/1W/jUpz6l3fj3338fp0+fRjKZxOc//3kkk0lcu3YNS0tLmJubw7lz57TV2Ww28eDBA+RyOfz5n/85bt++3RVvsyxrm9tIq5M0pPHxcVy4cEF7JS72DjbooLHC8JWUPSEVY69zmXzfnazVfuPEK85AIIB0Oo3R0VFEo1FtSbLrjoyDAZtVI1w0lUqlK/MbCAS6qClsTABsJoGazaZWAtFoVLt2lmUd2f9/1OCmYlmWJjdLpUk5mFaDdLs8Hg+SySRGR0e1tZhMJjE2NoZ4PK6z3qx7jsfjGBkZ6VKc1WpVL1xgy0XsdDr6s0zGRKfT0WEWWrF8v4u9Q643JyYFvw9O79vpnPI787g4aL9w4hXn7OwsvvCFLyCdTmN6ehqxWKzrBlNQZgXL+Pi4Tgaw3RVrYckrq9frqFQq2gryer14/vnnEQqFdPb29u3bmJ+ffyLL9QKBgG4qTM4ey+JM0jsATXJnQwfyPPP5PP7iL/4CP/rRj7QiYz9NAPjZz34GALqihCRqyQeNRqNQSuHWrVsANss82XOTnECZ4ZeKEwCmp6ehlML8/Pwgb+GJgjRSTMUpm1mbRQxOPGm54cpzuzHOPcCprI5IpVJ49tlnkUqlkEwmdRmXfJ9021iVQAum3W53ddxptVraisrlcl3trAKBAKanpzE2NoaRkRHE43HdfbxUKg3mZgwR/H4/4vE4YrGY5lvK2CKArgXE+0qCO4+r1Wq4cuUK3nnnHd2lncqx0WjgwYMHmvNnvlcpBcuycOHCBcRiMe1mU/kyJiorjIDtHNxEIqE/18X+sZNyo1KU3wn5mvmceU43xrlH7MTjarfbqNfretHSHZBVJBJmpYn8DMZixsbGEIlEMDo6isnJyS6+2dzcnO6os7S0pKlITyLi8TguX76MmZkZTXyXnFaZTZf8TSo+v9+PdDoNv9+PZDKpY550/YPBoLb2WQEGQCd1uJhYIx8KhTA1NYVgMIjZ2VmEQiEd3yQ1ybIsrSwZOwOAkZERXQboYn+QPF3ZXFq+9jhCu6zykgm8Qa+xxypOpdSfAfgXAFZt237u0XMjAP4XgDMAFgD8lm3bG4d3mftHq9XS7hsTPHT3JD+MP+T/kbhuVpPYto3x8XH9t7RsyBX1+Xz44IMPcO/ePWSz2aGsUR+EXFOpFD75yU9ienpa079I5ZE0FFoMMhzS6XRgWRYmJyd1fNGyLIRCIa1ASV2Kx+Na4XJRUnakGbGuORQKYW5uDmfPnkUkEtGKstPpaI6oTFaRopROpzUrY9gxzGuWa0kqO26c0qp0KoN1AgsdBr3GdmNxfhPAfwPw5+K5rwH4gW3bX1ebs5m/BuCP+395
B4cZV3HikEkuIbC9XNMUoulems8D0PQlLsQhxDdxyHJlzT9/qKToFvP+sQ6dFSOURzwexwsvvKA3PnZYYixUVgnJihT+Lbl9gUBAW7ztdhszMzNadiZNhj+dTkezL7LZrONAsSHFNzGka5ZrynzOdNN7HWu+B4D2FrhJDmK9PVZx2rb9f5VSZ4ynfxPAP3v0+7cA/B8MseK0LAuWZelFxAXH+CW7trTbbZ3skdVFcjeTC4xde2QygYIOh8MYGRl57Fyco8Ig5FqpVHD37l1UKhXdni2ZTCIWi2kqF91uAMhms12JuDNnzuDFF190pJjIzWwnLp9Je2JzkVqtprsjmYR5HttqtXQF2Ntvv60b6A47hnXNOslHcmhNefL3nc7DsApj1YNiPew3xjlh2/YSANi2vaSUGu91oDrCKZdUklwYhNzFpJXC0aLNZrMrliUFSvqK07mAbheEvDLLsjSXc8jRV7k2m01sbGzA4/FgaWkJjUZDN1axLAvhcFgnkEhUlx2LfD4fEomE7qQjEzcAtlGZnLwKGUoB0BUuYFs7yr5cLuvNlJ2zaGUuLy9jbW3tuFicTtiVbA9jvT4ucbOTZfm495iVQ4PKrB96csgewJRL060GtpQW21TReiTNpd1u65stLU8qTo4hZTyU55SZX2aKaTXZ9mYfTypPv9+PRCKB06dPo1Kp4ObNm4fx7x8JdiPX9fV1vPbaa7AsC6+99pqu6Q+Hwzq5lkwm8eKLLyKVSmF8fBzRaFTf80qlgqWlJe3GM7lnJvR6hUucjqH8gU15NhoNrKysoFgs4u///u9x/fr1bW4/Qw6SwnRScRjrlWESzpWiousV/trh2roqzoAtQ8Xj8eix0GwEIntV9Bv7VZwrSqmpRzvXFIDVfl7UfiGFIUnXvJmmS71TzJPJB7O7uNk4Qlo9tr05xqFarSIYDOpuLizzPAboq1zr9fq2aqlQKIRAIIBYLKbrv8fHx1Gv1/WYYJl5rVQqADbdfqk4d2NZyMUoFxq7YJFaxOqwd955Bz//+c8P8i8PM45kzcqQlowp89FJee4FMpRmhlsOs53jflfz/wbw7wB8/dHj9/p2RfuEdPEA6Nrm8+fP645FTD6YJj3joB6PR3fN8fv9+sbLfoF0J6l4WT4WDodRq9Vw//59ZDIZpFIpxONxXds8qIqGA+LQ5SqnVjYaDRQKBUSjUaTTaYyMjGBqaqorUcB7zySbrCjarfsn3ye5uNwYl5aWsLq6qpX0CcWRrFmuGVruXFM7Zc13Sgjx0fxd9mqVxROHhd3Qkf4nNoPKY0qpBwD+CzZv/reVUv8ewD0A/+rQrnCXMLNxExMTePrppzE7O6stPi5apz6Q0g1nfE1WBMnjpZXJ3TQYDKLVamFlZQULCwuYnp7Wk/mc3MujxlHJlRZfvV5HsVhEPp/X3NiXX355mzKUcWoZxwS2d8ByUqgm5UVaJKSxZDIZLC0tnRjFOWxrlhvlQeZ4OWXd5Wvka5sNXA4Lu8mqf7XHS/+8z9eyL0jqCLC1UKrVKnK5nA74O5V0URHKprossZNuurnDMX5KJdBut3UvyHQ6DdverHV/+PChtmaGraXcUcnVKR4NoIvlYJbT9VoEfH0vloWZfWefTjabkMcNmlTdLwzjmpXWfz/PZ04RGBSOReCtF2hB0B2QKBQKePjwoe6fSAsD6O7IQzoKKTFSEdbr9S43z+QEUtFy4XU6HZw/fx5zc3P46U9/infeeQe3b9/GwsLC0CnOo4C0+KgoJTiehIk703LshyIzrVYAeuaNuVE6KXgXe4e8f2YLv4NCrulBYugUp7lz7OcmK6WQTCZx9uxZjI2NaQVJ15sVK9JKpZXjxA10uj6Te8ZdLxwO61ZzdB3cUcObcJIlrXPZ2ENamJSLPN5MLuzGOjTfJ5+X1UrEcbY4hw1O1mA/7625Jk8EHWk3oBXA3+WjtCbNm+1kaQKbu8+v/dqv4bd/+7d1eR4XJ11z7nxMFsi5MwTJ
7aaryM+UhHfG4djOLBgMolqtbnMBn3SYPQBarRbW19d1o2dOO2RMU95rwkl5mjCflxVJMjlIShkLH/jex4UJXOwe0tMgnKhFJkwObq9zS+/RfO6wMBSKU8JJce60OGSgn3+zxvn06dM6aMwF62TByJsu3fidBObkwimlNAWJWeBBx16GHaY8GWOkxWm66VJee72XpqstNzl5LsnbdNF/7Gcd7FYW/fBQ94MjUZw7Ka5eis0JqVQK6XRau+WRSASTk5OIRqP42Mc+puOPlUqlSxHKDDk/mzFImS3n77R+JJ0F6CbfMivP2UKTk5P46Ec/Co/Hg2vXrj3xky4JuYikxc9sqKQKmQwG8/29zitj33xOUstMxenkVbjoDyTzRFIGTbfaTMztpGydWBMmQ0Y2fTkMHJnFaVqRvX7fCeFwGOl0Wk+iTCaTuHjxoq6HltVAZvZcKk02igC6g8xcZJLHCWy5m9LV4GLlAo3H45idncX9+/ddq9OAkxtNV9qMNe8Uc94t30/KW3omThupi/6Da8hMDEnZSsVHPM5Fl+cwP29YCfAHRq8vKonobMzBmmY5r4Td2ScnJzEzM4NoNIrp6WkEg0EkEgkd16Qy5DwhuoIkwcsFJhvU2ratR5RWq1VsbGx0Kd5EIqGb88pFJ2MtsVgM09PTevibiy04xap7Ka9eitO0UOQjfzdjaOZnuJnzwcNMwAIHd6+dEriHbawcieLsdaNYb+r3+3Xvw1gshvHxcQQCASQSCYTDYXzkIx/B3NwcRkZGdNacZrkM+rPmPBwO67kz7N8nY6Occsldir0gfT6fHgcMAMFgEJZlYWxsDMlkUi840mikFcu+nKOjo67i3AUoQ7PN304WSK/kglPWXb5HhgCA7iSfPMZVpv2DU7jFycrk63s5r8SgusEfieJkV5xkMolUKqW7F3m9XkQiEd2cg8PQeEwwGEQgENAllLL+W1p/tDaZMWXCgRYsaS+MS8pFwt6QvKZAIKCb11Kp87yy8S6tW7qBLMN8kge17QWUDwfhSQvTXAi7Sd6Zx0qY1uogZ9U8idirItutbJ0+43Hx0X5h4IrT6/VifHwco6Oj+PjHP45PfepTCIVCXQpTzgWSiQJadTLAz0azMvmilNLKsVgsYm1tDZFIBJcvX0YkEsHDhw9RLpe7Eg/sGE6lLG88FS+7u7Tbbayvr+vPoKKlxczOSrSa3UX5eCilEIlEkEwmtQXP56VLt5M1+bjz8z3tdltvulScrvI8POxHmfWSreneS/mZP06bZr8wcMVJ5TQ+Pq5jlIFAQE8aZKdvmSigsuLzkpMnf6erzpvu9XrRarWQz+d1N28qZTY2lvwyZnjZao5UGZn88Xg8ulcjM+hOSSf5/7roxk6hGpay7sZ62M29dYpnmtcwCAvlSUc/7+9OynBQHsTAFWcwGMSnP/1p/Mqv/IpWnPxnga3ZysyIA1vWhUlnYBxTuuh+v1+3lLMsC6urq7h27Rr8fj9arRZSqRSeeuopzM7Ool6vd8UmG40G7t69i2azqcc9BINBbQ1TucvBYGYSgs9Xq1UopTSh28UmZBLN6TUzvkkqmJSxk9Uhz8Hn5EbGxSQz+DzWtTgPF1JmZkLVvO9OspQwvQ7zczhXCoD2Rg8DA1WcVGzT09N46qmnEIlEEA6HAWyvxnGqZaZl1yuwTwExNhkKhaCUQi6Xg8/nw9LSEmq1Gi5cuKBdaJlpbzabyOfzqNVq2NjYQKFQ0HPY/X6/VspceJKqJCE5qXTzXTwelLuEaSma2IurLj/HfM2J2+nicPC4pi2PgzSinM5BPXGY626gijMcDuPy5cuYnZ1FKpVCq9VCLpfrqvyhhcHqG4I3RCrHWq2mZ8hUq1UAW3QmPj8xMYEvf/nLyOVyuH79OhYWFrC+vq5LIyW53ePx6Bnb8XgcY2NjCAQCCIfDjg105ZRMqcjl+cgG4Bx2F85g2IWbkrTgTQvDKRzCY83XnGbamMcEAgEEg0HHTdDF
wSF5nNK7dEr47QdmYpjdyw4Tu+nHOYfNaXmTADoAvmHb9v+n9jFuNBgM4tKlS5icnEQsFkMul0OpVNIWIhWTzLKL69A3n7SgYrGobxjderaQY3xydHQUZ8+excLCAn784x/j7t27uHXrluZjMmPPTvFnzpzRSnN8fLzL4mGlEIViXiOwPWjNDvC0fIcF/ZRrv8AQjXTHpDtNmIpQyqjXa3xOKlgAXWGdk+AZDKNcpUUvf5ws/70qT6dzUH8ctcXZAvCfbNt+UykVA3BVKfUPAH4Hexw3ysx4pVLRTRXkHGsmcABoHiWwxc3i+ItarYZGo4FSqYRisdiVUZcZdrkDeb1efPSjH8Xp06d1eV88HsfIyIiOi/j9foyOjiIUCiEajW6LqfJHNoMwq4r4O6+Zc3aGkMvZN7n2C4+LR/VabNIifdxiMa1OsjTIinA69phh6ORqJvlMOe3nPpvfFZP/e9jYTSPjJQCcjldUSt0AMIN9jBvl0Kt8Pq+nHwYCAU0ip5VIAjrdbklJ8ng8umdmoVDQrj7jj7RYmFwiZSgQCODLX/4yAGB5eRn5fF5bnH6/H5FIRFuqTBTxWmTSQnZq6uUGygUeCoWQTqdRq9Vw/fr1vcjmUNFPufYT+w3myzaBEr0sT8nIoGsnN8rjan0Om1xpbTqNqzFDJruF0/sOezibiT0FApRSZwC8AOAK9jFuNBgMdrW2l7u+rP3mF1jGu5iMYWaUilG6cnJnk8J6dB1aEZPQHolENJGelqOsXWfMjddrVraYCSwZTpC7K7Pxw4qDyrXP17Ite77X9+8GcvFJz0G+ftwxbHLtJZt+bFRSnoMYE7xrxamUigL4DoA/tG27sIcvqB43GolE7Fwup5tukA5El1bWo8vGts1mU7vc8oaQ28lYI7PfjFvJcRlKKVQqFSilEI1GEY/Hu6zLarWKTqejQwgUNLsryR2Nj5JSQaslGo3qTjA8B6lPw4h+yFX1aYzs48rleikz83293m9anLRQJWPipGCY5GqGQszQl9zA9gq+VxoxbO145IpTKeXHphD+0rbt7z56es/jRhnDpFIkFUi6WWYyRpYzSouUpr7M1EmLU+468nxU0n6/fxuRnb07ZTkmFbepOOUAKipsJpmk1T/moCsAACAASURBVAtgWyXSsKBfct0rdmvN7fae7aQ0d7JeZQ/Ww04mDBJHJdcdrsfx3u7XVXcCz+VkYB0GdpNVVwD+B4Abtm3/V/HSnseNNptN3L9/Hzdu3IBlWZiamsKpU6f0aM9Op4N8Pt8VYzRnJUvrUipMSZSmwpUWBJ/vdDool8va/TetSGmFAtDxVr6fljIVeq1W09xOOcqWHFXGYodt5lA/5XoI17ajIuSjuSDNJIT5Hh5jWj7yueOOYZSr7Fcr48umXHYLp8QS17LP50M6nYZlWXjw4EH//gkDu7E4PwXg3wK4ppR669Fz/xn7GDfKGu+lpSWkUikkEglEIhE0Gg1dwcMORvzx+/268YOT8pQKFIBWjiZ9RSaepFlvuguyu1Kn09FuhhSQDBXIruV8bDab2srkpjBsihN9lGu/8bj45k5WyuMWoLnoDuImDimGTq5OlVly0+Lfe1WeMhHLNc6QC0uhDwu7yar/BECv/2jP40bb7Tbm5+d1RjyfzyMajWJyclJbagB0/BPYUoayWkcqUNOKALZnUYHtM4J2slhk/FKWgdqP6CuMoVJB8lrC4bCuqV9fX8fi4iLef/99/b8MC/ot136AMgawI4HZtDadLJfHuYByw9xvImoYMWxyVY+SstFoVBtAfN7EbjYxU9FKOXON0hA7zK7+A69Vb7VauH79Om7cuIH79+/j3r17OHPmDD73uc8hkUjo5Eq5XNZJIlqh5lwgJ5fdzGibmVPz0XzOSWhKbXZPYpKI72MPTwBaocbjcViWheXlZWQyGdy9exe//OUvUalUBnSHjy/4xZcVYyacrMTdLBAni8ZUnCeExzlUUGpz8mssFutqsWhu
cDsZPPI9prLk85Rls9lEuVzWyd7DwpE1MrZtG6VSCSsrK1BK4a233kIsFkM6nUY4HNYxQrZpA9D1BeejqTjN2JjTWAxJJyJkQxEn5cmYptO0RDYFsW0bKysrAIDFxUWsrq7i7t27XbQpF7urENnJcpSeAY8zeZw7fYapIHeyflwcHKan1yu2+ThvoVcyScapzdzFYeFI2srxH1pbW8PGxgZu3LiBn/3sZwiFQrh48SLGxsbw6quv4sUXX9S14sxYS26mjJ04BfyBTZe/VCp1uWaFQkHXsgObSpEUqV5JAipgOYKDC3djYwM3b95EuVzG2toayuUy7t+/j8XFRT1b3cXu0cuy4GvyGLnwepHgCacFa27ALvoP2cWM7ReB7go7YEs3yF4FUiZmq0m5psmU4Xo7bGPlSMcDk85Tr9dRLpdhWRYSiQSazSZWVlawsrKi27ox/inda2ltMCZm3rBqtYpCodDVsYjzu4Gt5hL1en2bIPm6PI7ugCS+53I5rK6uolwuY2VlBZVKBaurq1hfXz/cG3hM0esLLYsMeh27U1LBVJrSQtmtheuivyBDhUwUYLvnyOd6vb8XpMEkDSrSDE+U4tzpn2m1Wrh79y4WFxexvLyMv/3bv+1q7JFOp3W7OI7AqNfrsCxLE9oLhUJX9rxUKmF9fb1L0clkEyF5pL2uu9eCZlyFCpiZdBe7h23bKJfLyOVyiMfjCAaDALbPPDeTdtJKcTqOMpOVYfIz5WbpKs/+o9PpYG1tDaVSCUopTExM6N4QpPiZIRYZswS2x0EJ6gXZ4azRaGBlZUXnRQ4LR2pxmqCiA4CNje7GLYFAAFNTUwgGgzpDV6/XUa1WEQgEkEqlAADZbFa74Z1OB8ViEZlMxo1ZDTm4iBqNRtcmZ9LFzL8fd5x83cm66ZUYctEf2PYmxZDjUMh3Zl9d875LN13GKXutX7roMm9RqVT0TLDDwlApzp3QarWwvr4On8+HjY0NPWKBVT6ZTAbApmsuEziMXboYbpC2woF8LDrgIuKoE3m8ZFc4geEVYMvirFQqqFarmlLWbreRSqU091ae3/3e9Ac0YhYXF9FutxGJRDA+Pq45l7KvLbAVqzY3QqeMO2eHZTIZbGxsIJfL6bLpE+Wq7xeswHFxcsG+rGxiLelnpKmZGXWl1I6cT3kOADrr6vP5tOKMx+Oo1+vbaFCu8uwPGBZbXV3FxsYGUqkULMvSNCXKkPJ1UpxAN+OF+REmeu/du4f79+9rr+WwcWwUp4uTDdu2UalUsLGxoUMwZDsopfQQP2B7Rn0n+orsugVAc/xYkVYul3H37l1ks1kdJpLvd9E/0AOoVqtYWVnRHgYZM+zBywYdphsvKYHsKVEsFvXIGzNeephwFaeLoUCn00E2m8X9+/d1pykWHdi27dhomJaJzNiakJ2sAOhsq9/vRygUQrVaxfz8PIrFYleHfldp9h+0PPP5PIrFYlfMmZxtjtX2+Xxd40y4sZZKJTQaDeTzeb2x8ryDUpqAqzhdHAF6ucB0vwh2pgK2V/7Q4qTiNLt/8zPkEEDzM3w+X1frwkEuvCcNUuZSLgRryykPYDu9TDYnJyVwkFamhBrkzqqUWgNQBpAZ2If2D2M4+HWftm073Y+LGSa4cnXlOoQ4VLkOVHECgFLqDdu2Xxzoh/YBx/W6B4Xjen+O63UPCsf1/hz2dbvkNRcuXLjYI1zF6cKFCxd7xFEozm8cwWf2A8f1ugeF43p/jut1DwrH9f4c6nUPPMbpwoULF8cdrqvuwoULF3uEqzhduHDhYo8YqOJUSn1eKfW+UmpeKfW1QX72bqGUmlNK/VApdUMp9a5S6g8ePT+ilPoHpdQHjx5TR32twwJXricTrlx3+NxBxTiVUl4ANwF8FsADAK8D+Kpt29cHcgG7hNqcOT1l2/abSqkYgKsA/iWA3wGwbtv21x99iVK2bf/xEV7qUMCV68mEK9edMUiL8xMA5m3bvm3bdgPAXwH4zQF+/q5g
2/aSbdtvPvq9COAGgBlsXuu3Hh32LWwKx4Ur15MKV6474ECKc4+m/AyA++LvB4+eG1oopc4AeAHAFQATtm0vAZvCAjB+dFd2uHDlenKxB9m6ct0B+1acj0z5/w7gCwCeAfBVpdQzO73F4bmh5UIppaIAvgPgD23bfmIagbpyPbnYo2xdue70efuNcSqlfgXA/2Pb9q8/+vtPAMC27f93h+N/us/r7AJbjFmWhUAgAJ/Ph3A4DI/Ho9uIsdO37Ono9/vh8XgQDof1ewOBQNd8mlKphGaziVKpdBhzgzLD3gziKOXab8ju8IfcEXzo5QrsTbaHIVfOifJ6vbq/ajQahWVZui0cOx7JVoDs9G9Zlm4+Xa/X0Ww2df/WQ5JtT7kepK2ckyn/knmQUur3Afz+fj+EX34qPq/Xi5GREUQiEczMzOD06dNIp9N4/vnnYVmWnjfyzjvv4P3330c+n8fDhw/h8/kwMTGBSCSC559/HjMzM5idncWZM2dg27aetPlP//RPWF5exptvvonr16939frrwxiOuwd584AwELk6gZugbB/HBcRZRDvdf76PCywQCCCRSADYnERaq9XQaDR0q7o+4jjIFdiFbPshV9mZn8qSBks4HMbIyAieeuoppFIpfPrTn8bk5KRWhIVCAWtra7qFHACtZKempjA7O4tCoYDbt29jbW0N3/ve93D37t2ukcBcr6YC3gd6yvUginNXprxt29/Ao/InpdSe/4NkMomZmRkkk0lcvnxZNznlOIVYLKa7R3s8HsTjcQDAr/7qr+LVV1/t6gLebDahlEIikUA4HIbX69UCqtVq6HQ6OHfuHGZmZvDMM8+gVqshl8thcXERmUwGP/7xj7ua3Z5QDESuJgKBAL7whS/g2Wefhd/v15ZFrVZDsVjEj370Izx8+FB3cN920Y+a4VqWhU9+8pP4zGc+o8czdDodrK+vo1Kp4Ic//CFef/11bdk8YXisbA8qVzYgjsfjeP7555FMJjE1NYVUKqW77vt8PkQiEViWhfHxcYRCIYyNjSEUCnWNiKbC44ZIy9Pv96PVamF8fBzJZBKVSgW1Wk1/V5aXl5HL5XDt2jUUi0WtVPuJgyjOBwDmxN+zABYPdjnbEQqFMDU1hampKbz88ssYGxvTjWs5T50D75VSuhX/5OQkRkdHdUfpZrOJTCbT1TG6XC6jWCx2NbJNp9Pwer2Ix+OIRqNYXl7G+++/j7t37+Lq1atPguIciFxN+Hw+PPvss/jMZz6DYDCIUCikQyaZTAY3b95EPp9Ho9HoqTipKC9duoQvfelLXbOLOBpjYWEBv/jFLwDgSVSchy5bzo2Kx+N4+umnMTk5iUuXLmFychJer1ev13q9DmAr7JZIJDAyMqKVq/Q6qEiLxSIKhQKUUhgZGUGn08GpU6fg8XhQLpdRr9exurqK+fl5LC0tYWFhQY/Y6DcOojhfB3BRKXUWwEMA/xrAv+nLVQEIBoMIBoOYmprCpUuXkEgkUKlUsLa2po+hC0BrEticcsmdiQO4AoEA2u02yuWy7hjNsaWch043nG5GPp+H1+vV50skEvj4xz+ObDaLmzdvYnV1tV//6rDhUOW6E9SjOTMca0EFGg6H8Ru/8Rv45Cc/iWKxiEqlgkKhgNXVVViWhZmZGX2sZVl4/vnnEQqFdIf4drutZ9kA0N3Dn0Acmmy5Xs+cOYOXXnoJiUQCZ86cQSQSgc/nQ71e75oRZYZLqtUqlpeXtbfBsSjAVtf+ZrOp45ntdhtKKbRaLT1KWCmFSCSCubk5JJNJAJshmjt37iCTySCXy+lpuAfFvhWnbdstpdR/BPB3ALwA/sy27Xf7clUAIpEIEokE5ubm8OEPfxg+n0/vOH6/v2sIPeMYsiV/vV7XUzHNY/hTr9dRq9X0++jW00Ws1WoIhUKIx+MYGRnBq6++ilKphFKpdGIV52HLdSdQnpZlIRKJaAtFKYWnn35ab4aNRgN3797F22+/jWg0ipdffhnJ
ZFJvmDLGxVGxHAcMYCBTEIcRhylbrtcXXngBv/d7vwfLsrQ3x1CYdMH5yOFr1WoV9XpdD257dL1aydLDpByj0aj2OJkHYagulUqh0+ngqaeeQq1Ww09+8hPMz8/jgw8+QDab7Usi6UAzh2zb/j6A7x/4KgwopZBKpTA3N4fR0VG9CMy5MjyWi0Wi0WjoQV9UpnLHo9A4vIuLTWbYeQzdCi7oVCqFsbExbbGeNByWXB8HykPOCWq1WnqBeDweBAIB+P1+NJtN3L59G4lEAh/+8IcRDAYRDofh8/l0TEtalbRYnvS5QochW6UUxsbGcObMGUxMTOg1R6UpjRJzfck4JtenebycVMrNUa5hep3SA+XfXq8XY2NjaDQayGQy8Pl8WoEfBEM5rM3j8eDSpUt45ZVX4PV6NS1I/rPcZXw+X5fio8VYLpdRLpfRbDZRq9V0lk/edN5YYGuqoRQWh0etr6/DsixMTk4iFovhwoULAID79+/j1q1b7kTEPsC2bdRqNZRKJUQiEb34GLu2H02mpEu+srKCb3/720in07h48SJarRYmJyeRSCTQbDZRqVQAbC3IWq2Gcrn8xFqbhwmPx4OPfOQj+OxnP4t4PI5qtapzCK1WS7NipLdnJn+8Xi+CwWBXNpzuvKQxBQKBrrnrdNN5HqmAOS312WefxaVLl1CtVvH2229rr+Ug63YoFSeArt2FN13efMY0TEHIRwqBx0rr1LZtrTQlnHZFJ0vXRf/BWLO0FDudjl4cdMfIhsjn87AsC+VyWbMipNUieZwHpKW46AHKIxKJYGRkRLMhpAdHNxvozpBLRer3+7ssQRouthgN7ff74ff7u3SDCZ6P69zj8SAYDOr4ZzgchlJKe5r7xVAqzk6ng9u3b8Pj8WBubg6XL1/WmTPGH9vttk780Mrke6lYefOj0WjXTsRMmzmOVn4+z0Gyrsfj0dSGDz74ANeuXUO1WnUXY59g27ZO+DBeacaunUbBtttt5HI5ZLNZzM3NIRQKdbEuaO2Q2+m0WbrYHzweDyKRCEKhEBKJBBKJhGa40KOjccO16PNtqhyuRT6aBo8MsVFZknYIdHuI8v30UgDoxBEtz8nJSVy+fBlra2t45513dAhuPxhKxQkAhUIBDx8+RDweh9fr1T+8OXS5uMBMcjSVqdfrhd/v7zq3FFYvUIA+n08vvlqthmq1qheqi/6Bybpqtaply0XH13vNTq/X6130Fvk+bph095ysFBf7B40X/nDmubT+pSykHOSPkxvP9cnCF1qcALZ5g/K7YRLf+bmRSATpdFrPcD8IhlJx2raNXC6HZrOJaDSKe/fuIRaLIZ1O65gmg8IUAp+nq8cfGcvoFRRmPIW7mryppLEUCgW89dZbWF9fx/Ly8sDuxZMCWpxra2s4ffq03hD5GhcjPYVAIICJiQnEYjG9mGQCSCpMvv+wOH1PKpisC4VCiEQiiEaj2tqX1D4aHlImMu7pBKn4ZIiGxhPXsVNugmufjBkq3tHRUZ2foOW7Xwyl4gSAYrGIYrGIRCKBpaUltFotzQvL5/NdpVym4pRJIhlspiVDl43vdyrvkkJjpvbatWtYWlpyJGC7OBhs20a5XEYmk0GlUukKrUhWBRWnZVm69FZ6ItKd5yLl+/jdcNE/WJalGQ2RSAQej0dzLanoLMvSxHeGULjZyZgkAEePQFqohPk+qUyBrRg3ObxerxeJRAKnTp1CoVA4mRanRKFQwPvvv49KpYIXX3wR0WgU4XBYZ01lkogLhLQWKkP5OhWodBsoUBJqCRJ3GXurVqsHDiq7cAaz6nTVZehFbpKtVgvVahXpdBq//uu/DsuycPHiRSQSCV2wQO9BLjST6uTi4GDZczKZ1NU+TMYA6OoxwdcoV2bea7Wa3tQkNRBAV94CQNe64/nC4TD8fj/i8TgikYg+XobZaChRgfcjzj30inNlZQWZTAYf+tCH8JWvfAWjo6NYX1/XlBMmaPhDd4xNAwi5kGh5cDEC
W7ExLloGovP5PObn55HNZlEoFA6jY5ILbC6KSqWCXC6n+bcyTkZZUb7nzp3DH/3RHwGAtjLZW0C66jIOWq1WXVe9j/D5fBgbG8PU1BTi8bhWmlRMXFu0GJmsqdfrOtG6tram8waFQqHLM6Bcycvlc/QCLcvC9PQ0YrEYnnrqKUQiEZ2cAqCVOevjS6WS5gEfNNY99IqTu0epVML9+/fh8XhQKpW2kdmlay5dNTNITMj393IBKOSNjQ0tVBeHB254Ttlzgi4541bAlmVi8nn5nZCxUhf9A7ma7EA2Pz+vFRoflVIoFos6gUcOJRXn+vq6bs5RLpe7FCZlx79NIjzjlKFQCF6vF/V6HdFoFCMjI11VZ7R8eU0nWnHKwH6n08HKygr+9E//FKlUCp/61Kdw7tw5fSx3JRnPZBxF7kDc+WSAmZl3J26Yx+NBLpfD22+/ra1bF4cDWpz5fB6VSkVbJ0D3JsfNq9lsolgs6tdl9Vij0UCj0dALmN8JN6PeX9Trddy4cQPz8/O4c+cOvv/972N2dhbPPvssIpEIJiYmoJTCa6+9hlu3biGXy2F9fV2vU2DLW+Dak4kjKj6618xLMNnXbrdRKBTQbDa1JfnKK6/gd3/3d3WTHsq/0+kgEonojkq0hveLoVacEvV6Hffv38fGxgY+/OEP60Uiqwicyriczvu4YDSwZb1wd3Rd9MMFlSJDJU7JAD7KTQ/AtqytfF2GcVz0H2Qx0HL0er0YHx9HrVbTNeeLi4tYWFhAqVRCPp/vej/lQg+ClqIkzNPQAaDjpAC6ekqUSiXYto1MJoNyuQyfz6d7WtDVp8XbjzzF0CpOxqkklahUKqHdbiObzWJtbQ2BQADhcFhzx7hgZGxTEmJplTCBRFI0IV101qFTIC4OF0wOlUoleDwejIyMaDlJ1oOEWY3ixPXs5ba7ODhCoRBeeuklTE5OasXXbrexurqKtbU1LC8vo9PpoFAoIBgM4mMf+5guZgG2aGZSxub6BdBljQLoUq7sq1sqlbRX+O1vf1uvZSaimHxie0l6K/vF0CpOoDsmxQA/AK3QvF4vYrEYgG53zaw4abfb2jJluVWr1dI7kqxS4LlY496Hru8udgnZVyASiXS1IpPxLYKWJiHjYIS0UGX828XBQUbDhQsX9H1eWVnBzZs30Wq1kM1mddKOx37xi1/UcpNZdZbNst+qdOdND9Ln8yEUCiEQCGBmZgbhcBjZbBb5fB6vv/46/vEf/1E3N2bfinq93tfm1UOtOHcCTXjOKyFo1ssEA5MIkuZgkuHNxcbuSiTRujhccGNkeISQpGnzOTPMIkv8TJ4vCfCu4uwfWq0WlpeX4fV6kUwmEY/HEYvFcP78efj9foyOjsLr9WJhYQEbGxsoFov4wQ9+oEsgOTeMSpReAeVvVn/RwKGFqZTC0tKSTkCVSiUUCgW88MILOuzD1oKNRgOJRAKpVArLy8u4cuWKbgK0n+/EYxWnUurPAPwLAKu2bT/36LkRAP8LwBkACwB+y7btjT1/+j7BG8nefTKrymybtDQlQVqSqJ1iYHys1+solUoHqmcdZgyjXGu1mo5DmZw9ytBJYZoLzKxSIc3sSaEjDUq2rVYLDx48QK1Ww/nz5xGLxZBIJDQ96bnnnoNlWbhy5Qru3LmDxcVF/PVf/7XuhBQIBPToDGa7ge6QGdcvY5vSCm02m1hcXNQWZaPRwNzcHF566SWdFGq321pxXrx4Ec888wzefPNN3L5929GA2i12Y3F+E8B/A/Dn4rmvAfiBbdtfV5uzmb8G4I/3/On7hCyfc1pETjXN3NVkth7obg4gE0YMJpuu+k5lYscM38SQyRXAto1MPr+b9/Z6vleTkBOKb2IAsvX5fHroIecKsabc5/PpWGU0GsXExASCwSBSqZQu1fT5fEgkEvp3bo6ScgSgKyZqVgfF4/GuDTeVSiEajQLYNH48Ho/2HtfX1/Hw4UOsrq5q/bHftfxYxWnb9v9Vm4PeJX4T
wD979Pu3APwfHOICMzmb5XIZGxsbSCQSXRVALLGSzVMBbOvZKbN2NNUbjUYXcbdWq2FjY0OTsfm8jKMeZwyDXHuBCYLdVPlIRSs3QzMmSovzECZcDh0GJdtwOIxXX30VL7zwQte4mmKxCI/Hg2w2q7Ps09PTSKVSGB0ddfQaZHd4U+4mtxpAF1VJKaWn2xYKBaysrKDRaOiClVwuh+XlZaytreG9997D/fv3kc1mu9b2XrHfGOeEbdtLj/6pJaXUeK8DVZ/HyNJENxs2OMW9TIqStDZl0sGkMfF500J5AniARyZXE72szsfJoNem1quF4BOEXcl2r3KVRHeuLZZcyi5XlFuv7kZSPk5rUSpOeoSS8yldeh7LiQAMBbAakBax5PjuFYeeHLL7PEbWtm2USiWsra1hamqqq4qg3W7rm1QsFvWgNRJsg8GgtiapECU9iaA16tJXeqPfcjXOvWvupeTlOh0vKWqHMSb2pGEvcm00Grh58yaUUpienkY6nUYwGNRjZRYWFrR3WKlUEIvFkEwmtTUoS2rNmCafl0PbgC0laio82UHJ7/fr+UfhcBhjY2NYXl7G5OQkZmZm8NZbb+HevXtYXl7GysrKvgpb9qs4V5RSU492rikAA51c1mg0NI0B2LrpdLVlTBPYslR4c03Xn787ZdmfMMV5pHLdDUxqCn9/nJxcKlL/ZUuOZiaT0eN92Tgc2Fyn7HiVy+UQiUSQy+VQr9c1VUmuXan4ZGJPfh4z70za0rihJRmJRJBMJhEOh5FKpZBMJpHP5+Hz+TA7O4uzZ88il8shHo8fqEvSfhXn/wbw7wB8/dHj9/Z5nj2DZj25mKZlIv+Ws0pkmSUbPsjem7LVHN8r/35CcGRyNdHLJTez5sAWzcwpUcjvAntGHrQP4zFG32VLGtL4+LhuYsy6c9av5/N5BAIBTE9PI5lMYnR0VE+1JAvGiQlB9JKp9BhpgbZaLRSLRSwuLqLRaGBxcRHVahXZbBbZbBatVgu5XA7Xr1/H8vKybha0H+yGjvQ/sRlUHlNKPQDwX7B587+tlPr3AO4B+Ff7+vR9QmZId1Kc0vSXs2cYIyWPjApWJo0kD/AkYhjlKq7NMYHgdJz5KEtw5QJjlZk5DeAkYlCy9Xi2xvFalqW5k5VKBcViESsrKygUCjh37hzS6TTGxsZ0/bqZcCWcqIG95CyPZ5OQhYUFrK6uotVq6cf19XXk83mUSiUsLS3h1q1b2go+NMVp2/ZXe7z0z/f1iX0Ab5as7pFD6+Vo2F4ZWq/XqysXOD1P8sROuls3jHJ9HMx4mJOLbvI3+XowGEQkEnkiFOegZKuUQigUQiwW064ymxqXSiXd+pHrT4ZLzMmUpvEjH+XnAeiKg/I8sqUd1zHdfs6hYv16LBbTPX33S1E7Fn6LU3aVcY5SqdTVuLhWq3U1MeZrJNBSUH6/Xx/v9/v1DuY0k9vF0cApjmkqT2B7FlYqTi6KWCyGTqeDUCg0wP/gZMPj8SCRSOgKIdaOJ5NJXXVXKpW6asW5dlmRx3XH2KX08kyKmaxVZ48JGePkDymGVOQc1hgIBBAMBrG6uopEIqEbhOwnYXgsFKcJBpPl+AtakDKwTLoBsHWzZRKJRf8+n0/vVlSirEoyk0lPACVpaGG6bPL5XpYnwREPJzX0chSQiR1zkwPQ5cH18hB2WlM7FTT0ug7GO6VFKzslsUGI3+/XYbr94FgoTqm4AGjzmwF/Uk18Ph8ikUiXgGiVAtALh5UKFKy0UKhUi8Uims0mCoXCtiSEi8Gjl9KUr8sFJS0ZpRTi8biOc7roHySDRXKf6c2xaQuTr7QiybeUHa7M2LaUtdlazgzX0IBiF7VwOKxlz7nqNLYCgQCSySRqtRoymcy+/u9joTglpJIzg8oAtmXCO52O7gvI1ldScZpznnku7khPcBZ24KBs5Syo3bzHhOzZSDC26c5VPxyYhSSyUg/Y6mRlhlTMR6d17bTO5TlMypJk3ADdDBkq
UFqg+/Ugj4VWMAPHpDysr69jaWlJt+DvdDra/GaySHZioRCLxWJXww9m1zudTlcjiEgkoueWuDhcR+Al3gAAIABJREFUKKWQSCQQCoUQj8f1VESz3M4pxmk+mpVhtDrM/qsuDg7ZK9PkRFerVU0uZwhMylNiN8khuuFyCq2Me0qLk20JZWJKfk8OGro5FopTQmbTy+UycrkcGo2GHtJFl4A3LBgMIhaLdVkhcpJis9nUQWTbtvX5EonEk877Gyio3GKx2K6/0E4xMz5PdDodbWEArsV5GHAyLGS1FrDdE5Sy26l4QSpZacn2ShTKrvDmrCKptGlQnWiLU4LdwaenpxEMBrsK+WWHI2bHLcvSs7eZOOJNZYzT7/frage+5vV6EYlEEA6HEQ6H4fF4trU7c9E/KKX0oC2/36/pLOZMGnm8k8VpyofvZXhmZGQEMzMzeqKmK8/+wIw5kuTORhqSI22+D9heuOLE4TQ/Q2baCX62GRIw8yROYYG94FgqznQ6jTNnzqBWq+lZJ6wM4NA2Duzy+/160YTDYSiluhYkJyay3RWwlYmLRqP6h++Tyhl44koyDw101cfHx+H3+3Ut805UEdnQwZQDQzP8YcfwiYkJzM3NIZPJoFAouLSzPoHuOrClvCqVCsrlss4Z7KSkTBfeaZN0WndOj2and/leaa0ehGFxbBSnUkq71JFIBNFotGtn4kxl/s1W/KQa+Xw+xGIxPUZU3ljWuPLmko7ExNLExATy+Tyq1arezXq5iS72B4/Hg2g0itHRUc21dEoE7OaeOy1QeguxWAynTp2CbdtYWFjoy7W72A7btru4m2YVnlR0ptXpJONenoSTZSkz7GZ8kzD7VuwVQ6045Q31er067jgxMYHp6WkUCgVsbGyg1WohFotBKaWHN2WzWRQKBc3HtCwLY2Nj8Pv9OsbJagen0kq69MlkEi+++CIymYy2bnlNLvoHj8eD6elpPPPMM5iYmOh63gzsO2VazVJbqWTb7TZyuRy8Xi+mp6fxmc98BleuXMEvf/lLt1vSAdGraTiTuJVKRceZ+bypHM0MvFPiyIyJmpajdPO5puv1OiqVyrb4Kg2jJyLG6fV6EQqFEI1Gu7ogkd/Fm0Prkg08SHaVLjtdbr/fr3l9UmhKKd00gJZQtVrV9CS33Vz/wU2PfEsnTmav9/V6Xr7G0EwoFMLIyEiXh+Gi/zCTMU7xxMdRjeRGuJMl6hS3VEppi9csv+WxB8FQK075jwaDQVy8eBGpVAq2bWNxcbFrp6rVavB4PLrhQL1ex8bGBqLRKEKhkN6VOp0OSqUSSqWSbrPPgU78HM5iprKMxWJoNpsYGxuDUgq5XG5fPfxc9IbH48H4+DjOnz+PZDKp5SrjmOaXn3XJTouAlghZEbRQUqkU4vE43n//fbeKaECQFXm9FJ2TO02YcUx5rKyBpyFF3vbGxgaCwSDGx8c13VBayAdx1Y/NN8fn8yGZTCKVSgFA1yA1BoS5iBifJI9LglwwjguVWXg5XVEmJkiGZ5bdpSj1H7Q4OYPGaYGZx/cK8Eu3Xrpptr3ZnTyRSOhEoYv+YCcPjKEwpxi1k1x3Oq/p4puWqPyuMFQge1T0y1M8NhrA5/NhbGwM6XQaxWJR04aUUro5AOMWkvogg8WEvOG0QnmOcDiMUCjUNaaUSpbW6cbGwAY/PjEgHYkbo8nVcyJN74ZOwmMYYwsGgzrByNplN7O+P0gjhNU5wPYsOC3OXvFN85xOLrV5Thm/5t8yudvpdFAsFhEKhTS90Az/HGTjfKzFqZSaU0r9UCl1Qyn1rlLqDx49P6KU+gel1AePHlP7vopdwOfz6WFPfr9fxxllSZfsIi2uXytPp+yd7Cpt27Zuk0WXHdga6BYOhxGNRk9Ea7Jhkau4Hl015GRxAti2oKQV42RJyE2TsW7KMBQK6Q32JFmeg5Yru4+ZOQKp8Mxet4RpwOzwP3Wdq5cRJI9nBRGrBPez6e6E3bjqLQD/ybbt
DwH4JID/oJR6BlvjRi8C+MGjv/sONu4IBoO6YojJIO54shMKsJWFl5QDmYmTQWspUOnyc/az3+/X72OZ1gmpPjlSuRLqEc2Mw7R4f80BXeZ7gO7Yl6lk5YZp2zay2SyWlpZQLBZ12V4ymUQikTgp8iQGIlfT+pOusFkr7mS8OJ3P7LLkdMxOfSqAbkuS1YVM8jKWyuMOlY5kb07G43S8olLqBoAZDGiUbCAQwMjICGKxGCqViiZGMztOd9pUkFyQclHIGURScUqqBEn0fr8f8Xgc1WpVcwBDoZD+7OOOo5YrQSZEOBxGJBLRNcYyXr2Tu25uhMB2C6XZbOL+/ft69kwikUAwGMTc3ByCwaBeXCcBg5CrXDfm8yx5pAyBLfqPbDEnrlcfw7970ZKkcjbXstw8uVnmcjkA0K669E4Oqjj3lBxSSp0B8AKAKzDGjQLoOW5UKfWGUuqNfV3go3I5NmfolREzzXenHc5MFDy6vq4djMrYtu2uLwe/LLKv30lx8Y5CruI8WnHKhtKm62e+x+mHr0nI74uUGSvJTnKPzsOWq2n5S4tTJmWdZLhT0qeXG/04l57vpTw51FFm4QeeHFJKRQF8B8Af2rZd2K3SsA84Rpa98+Lx+LaefDJ2QqUGdC862T2FtCTZ8EO2nOp0Osjn86hUKkgkEojFYl3zmzkvmtYRBXOccVRyJSzLwvT0NMbHxxGJRHhuHTahTLkouajMptN8n3GN+j3kbrJ6LBKJYGpqCp1OB/Pz8/u9/KHFYcvVKa7o9XrRaDRQLBZRKBS6LEynuGSv88pzyud6xb1lroIbsSzHrtfrjs2WD7Jh7uqdSik/NoXwl7Ztf/fR0ytqc8wo1CGOkmXLN6daV9m7Ud4Y0/wnSIh32n1oncpdipal/Axm7aV1dFxxlHIlqMSi0ahuF+a0sJwsEulVSNdLumR8H+PTVLK0ONnA5SRhkHJ1kpNJ7dvpeBOmB7GbY+W5qcBZOVSr1Q6FNbGbKZcKwP8AcMO27f8qXhrIKFlzUQDQTTxkeRWVmuxCbU7BlM1L2QiAFmU8Htd9PRuNBtLpNOLxuLZCZcaefM7jHBc7arkSLGyYnZ1FMBhEqVTq+rJLawPYipdxo+RCAbZaxrFdoIyFhcNh2Latk33BYBBTU1N6csBJwVHIVcqm2WzqiZJSYTkZMeKatyWbdvocJwvU9CxbrZa2emVyqF/YzTfmUwD+LYBrSqm3Hj33nzHAUbLSkmBmjEJho2EqNtPSkL/L+lRp4tO85y7Fji6hUAjValW/R2bXT0B3+COXK7C5mY2Pj2NychJ+v183YKFceilOp94CZFKwpaBpcfIYehPxeFy77icIRypXtpMj+wVwLpmlEuNrO1HLTKtSrlvKTp6HSWF2StuJKrVf7Car/hMAvT7h0EfJsrsR3Sy605VKRXMqaW0CW9PyzMYCPBcbHcuBbcFgEPF4XGcCOaGPi5jH0spk2OA4J4eOWq6E1+tFLBZDMpnUXdpJUOciAKATduxk9fDhQ7zzzjsoFot48OABbNvG5z73OXzoQx/S3wdaGzIuxr4FkUgEo6OjyOVyx30D7MKg5Sqz1Epttl7kjHM5opteAJt+0Fszk7ROWfVeiSKCXilzINxY2eCDhTLyeqXHsh8M/TfG6/Xq7CddtEajgXK5rHssSsXJ8cB8Ly1FANpSlDQlVgyNjIzoaqRSqYRKpaLHaMghT81mU4/nOM6Kc1jArlcjIyOIRqN6vr3k2zE0w8bUsVgMKysr+O53v4vl5WW89dZb8Hg8OHv2LC5fvqwbunBgmFycZGhEo1GMj49jY2PjpFmcRwYqzlwu1+WqMyxGb45yMWvHgW46kqz+k5/BR6W2eNjcEOkRKqVQrVb1d4dwShzvB0OvOOVN4T8qY1hy5CdvuuyZCUATcrkgnfiAMn7Kagh2UaJgKWhpubo4GNjyLZvNdik5E3TRzTg3FSoVLL8DJkUG6G5fxs23Wq32jaLioncjYZnI
lTkLae1LNxzodvFN69NUoDLPEQgEdPWZeS6nZOJ+MPSKk53YOf5CKaVH/rbbbYRCoa5BTawxp1Ll0Ci/369HlUpaiwQzgrVaDdVqVe+OvPlsU0WKVKlU6lqYLvaOarWK9957D6VSCWfPnsXMzIzuiiTjU7Qs5OZZr9dRr9e7uihxc5SbJI/nI93Jhw8fYmVlZVsjGBePh6m4+Dflwvgi0N3/koqSIRj2DJDWJ9/X6/P4aNu2lj+9DDaKKZfLXcMYZQ8LKtiDcLGHXnHKLCqwPZMmdxBzxzKVI3dCKk7eRKcffpYUooznmFVJLvaHVquFXC6HcDiMRqOxjWok77+ZeZVZW2lNyqoSQrp9jJNvbGy44zMOADNuyN9N97vZbOrm4Yw7N5tN7doD6Kk4e3mGVJz0Cul98DPoKTolmgaSHDpqsHLI7/drpcfgPnctKjLeTLoJJEmHQiFNiuVr3J1isZj+YQemarWqyyuBLSFJWhPr2F0cDKVSCW+88QZGR0fxxS9+scuDaLVaelyJkzIE0LXIarUaSqVSVzEE5c3QSjAYRCAQQCaTwQ9/+EOsrq6iWCwO6L89GZD8WVrw9PbMMFir1cKdO3dw9epVrbSazSZKpVJXIkcq217ZePk35U75lkolbGxs4N69e1hYWEA2m9UbsfRAduqBsBccC8UpqUZ0y8xaWSc6A+kKtA5l3EVal5LUHgwGdeccQp5XWqsuDo5Wq4W1tTXNlOD9NeeqPw60QGXHbwlZJGHbNsrlMh48eID19XXXVd8HZBYccB5fwudzuRyWlpZ0IxeGSkwl28u7cEoSyYw9j7FtG+vr6zo5RV3B65DfpxOvOC3LwujoKHw+H8rlMprNplZ2pvvFBce2c+SUra+vIxKJIJlMdi2gfD6PWq2GVCql6StceD6fD6FQSMdrmG2v1Wq4d+8elpaWsL6+7sY3+4R6vY7vf//7WF5exkc/+lG8/PLLXXFOxppl5Q8LEagog8EgotGoZk/QHQSAUCgEpRSuXr2K69ev4+rVq1hbW9ONbl3sHlxnsveDXH+06uk1XL9+HSsrK13NPsxadnOD3K1io+LkZxaLRayurqJWq6HRaCAYDG5bo9Lz2G+4begVZyAQQCKR0OR00oNMEjStQFqPFGqtVsPy8jISiYSmHlF58iZXKhUkk0kdM1FK6d1RJhXIT1tZWcGDBw96lpW52DsajQZ+8pOf4N1334XH48Grr77aVXQgWRQA9MYmm3RYlqWbhchEBACdZb1+/Tq+853vYHFxUQ/6c7E3UEHKhB0VHWl7skLv1q1buHXr1pFcK/tLSEuYZdykLe0HQ684s9ksrl69Co/Hg3w+r838er2OXC6H1dVVWJalyyOXl5e1ywcAhUIB2WwWlUoFb7/9NkKhkJ4Z1Gw2Ua/Xsb6+rmcOPXz4EJVKBY1GA0tLS9qFbDabKBQKqNfr+jqcsn8u9ge6zwDwxhtv4C//8i+7+nPSOhgbG0MymcSDBw+wtramlZ9SCktLS7h586be9CqVCpaWljR/sFar4ec//zkWFxeRy+Vc+e0TZJ+Q3cLSxmq1ikwmg0wmg1wuNxSbUqfTwerqKu7cuaO9l0KhgPX1deTz+X1f49ArzoWFBSwtLQFAVxUIsBXzCIfDSKfTAIBMJoNarYZ4PI5oNIpSqYRMJgPbtvHBBx8A2J4FlKVbDHCTVC2P5UKTyQoX/YFt29jY2EAul8N3v/td/M3f/A0SiQQuXLiAcDiMyclJRKNRnDt3DrOzs7h27Rrm5+dRLBZ1vfmNGzd0445IJILV1VW88cYb2NjYwLvvvqv7EJh17C72BlbXBQIBrK2tYWRkRHOtHz58qJMzwxA7psUbj8cRj8eRTCaRzWbx8OFDzR3eD4ZecbIxKn8HugPTpAdxcJukJJCWQGW4UyzLFPIw7JZPGihP8jM9Ho+2Qmu1mp4+yuIEk3bCXqosUOCk0kqloluduegfeN85MYFrTdL+COkSDzovIDP/
clTOQcZ8q0H+E0qpNQBlAJmBfWj/MIaDX/dp27bT/biYYYIrV1euQ4hDletAFScAKKXesG37xYF+aB9wXK97UDiu9+e4XvegcFzvz2Fft0tGdOHChYs9wlWcLly4cLFHHIXi/MYRfGY/cFyve1A4rvfnuF73oHBc78+hXvfAY5wuXLhwcdzhuuouXLhwsUe4itOFCxcu9oiBKk6l1OeVUu8rpeaVUl8b5GfvFkqpOaXUD5VSN5RS7yql/uDR8yNKqX9QSn3w6DF11Nc6LHDlejLhynWHzx1UjFMp5QVwE8BnATwA8DqAr9q2fX0gF7BLqM2Z01O2bb+plIoBuArgXwL4HQDrtm1//dGXKGXb9h8f4aUOBVy5nky4ct0Zg7Q4PwFg3rbt27ZtNwD8FYDfHODn7wq2bS/Ztv3mo9+LAG4AmMHmtX7r0WHfwqZwXLhyPalw5boDDqQ492jKzwC4L/5+8Oi5oYVS6gyAFwBcATBh2/YSsCksAONHd2WHC1euJxd7kK0r1x2wb8X5yJT/7wC+AOAZAF9VSj2z01scnhtaLpRSKgrgOwD+0LbtJ6Y7hCvXk4s9ytaV606QLe/38gPgVwD8nfj7TwD8yWOOt5/wn7X93u9B/QyjXJVSdigUsqPRqB0MBm2/3297vd6ex3u9XtuyLDsU+v/b+7bYtq5zzW+R3OTmdfMqXkRdLfkSW7aTTlOnadGgkymCybRJH87gZIBBTjHF6csAc4B5OMV5GaDAAH06mOljgVM0AxSdCZAE9UOB4qBo07hNUtdpHDuW7ViyZVkXUyLF+53c8yD9y4tbmxIpiRQl7w8QJFHcm1v73+tb//23q4qiqIqiqBaL5amWa7ey7Ydcj8BXW7nup62cnir/Fe2bGGN/D+Dv9/E5xwkLh30BHaDnchXn0ei9TrOgaAifLMuYnp6Gx+PhkymLxSIymYxuT01FUeD1euHxeDAyMoJGo4FPPvkEa2trvB0djeM4IBwFuQIdyNZYry1oK9f9EGdHqryqqj/FVvkTY2zb3w0MHA5FrowxOBwO2Gw2jIyMYHJyEqFQCBcuXIDL5YLf7+czoIj8qCGxFpIkwWq1QpIkOBwONJtNrK+vo1Ao4Pr167h9+zaWl5dx69attuRJ/V6PGXaVrbFeO8N+iPMRgBHh9ziA5f1djoEBQE/l2m7GC2MMNpsNTqcTsVgMZ8+excTEBF555RV4PB4+Nrhb0Mwiaq7rdrsBbGq7t2/f3vVajxl5Gmv2gLAf4rwKYJoxNgFgCcDfAvhPB3JVBg4TfZErEZLP58PXvvY1BAIBBAIBuFwuxGIxjIyMwOfzwWq1QlVVlEolAOAmPHX1VtUng9xoLhF96Znjk5OTMJvNOHnyJE6dOsXn5BSLRdy4cQPLy8eaR47Emm3nytkvXC4XPB4PKpUKNjY2AABOpxMWiwWlUqmrMRp7Jk5VVeuMsf8K4DcAzAB+pqrq53s9n4HBQK/k2m4xhEIhvPHGGzh16hSGhobgdrthNpthtVrRbDb5ULxqtYpGo8GnK9IQvXq9zofzkWZKozMajQafRCpJEsxmM06dOoWzZ8+2DBq7desWEokEMpkMJ066zl4t4sPAUViz2rHAB3nfyeedyWSQy+XQbDahKApkWUYymUSlUun48/Y1c0hV1V8D+PV+zmFg8NAPuQYCAYyNjWFychLhcBiKosBqtfK/02wY+gJaSYxG1JL5rqoqJEni7yETXTyGxkPTa+QDDYfDsNlsOHPmDEqlEhKJBB8QeNwwyGt2r6N69WA2m7k2SeOKw+EwYrEY/H4/FEUBAPj9flitVnz++efIZDIAOiPrgR/WZuB4QPswzszM4Ac/+AHC4TDOnTsHl8uFYrGIYrHITW06TiRKYJMULRYLZFneRoxEmDQyWJIkAOCaa61W41oNYwxOpxPnz59Ho9GAzWbDiy++iN/85jd499139zXMy8D+sZ97b7PZMDExAZfLhUgkAo/HA7/fj1AoxKfi
khXSaDTwi1/8Ag8fPuw428IgTgN9hSRJsFgs8Hq9iMfjCAQCsNlsPBDTaDTAGIPJtFmboTWZ9V6j9+qlJmkDPETEpHkyxrgfNRgMol6vIxAIwG6384mpBvoD0ULQA40gpo0RACwWC0wmE2w2G5djs9mE0+lENBqF2+1GMBjk48LtdjtkWW4hTdpU6Ro6gUGcBvoGxhiGh4cRj8cxMzODEydO8BSjcrnMxzebTKZtGieBfhZn2+/kh6T3kLZK76Hz1ut1pNNpAMDQ0BACgQDm5uZw7tw5bGxsYH5+fiDmgz8t0MpbRCwWw/T0NIrFIvdFx+NxKIqCc+fOYWpqCuVyGcViEbIsY3x8HHa7HcViEbVajY+IXlpawvvvv49isYhEIoFCoYCVlZWu5GwQp4G+QlEURKNRhEIhKIoCi8WCjY2NbQ+tnuYhkh7QqmFqgwriMVoNVjwPme/AZtTVZrMhGAwiEokA2CRcgzgPFyRXt9uNaDSKXC6HfD4Pxhii0Sj8fj/OnDmDCxcu8MIIq9WKeDwOq9WKtbU15PN5vkkWCgXcv38f6XQai4uLyOVyXV+TQZwGegI9LdBsNuO5557D66+/jmg0ilqthkajAbPZDMYYqtUqJzMy2elLNM+7jXTTOek8InmKpFqv16GqKk6fPo0333wTN2/exMrKCtbX17lGLF6P4f/sHSwWCzweDzfFGWO8MKJarSIUCoExhnA4DKfTiVwuh2vXriGdTuPRo0eo1+tcnjabjZvmZrMZmUwG1WoVzWYTbrcbVqsVhUKhP+lIBgy0QztyY4zh3Llz+M53voNKpYJCoYB6vQ6LxcLN5mq1CsYYz9fUapHtfGB6JCaSo1bTFM9nMpn45zcaDUxMTOD8+fPw+/145513uKkn+l8pCGWgNzCbzfB4PJBlmW+s4XAYIyMjUFUVsVgMwKaVYLFYsLS0hPn5eSQSCXzxxRfI5/N48OABqtUqxsbG4PP5EI1GMTw8jFwux32kDocDdrsdjUbDIE4DhwstodhsNpw+fRqhUAixWAzVahW1Wg21Wg0mkwlWqxWMMVQqlZZz6JHdbgGEdtejPUardQJPUqAajQbPGQ2FQiiXyyiVSi0asUGaBwvaJGVZ5mRos9l4sYOqqkgmk5ibm+NRcYvFgmaziUqlguXlZXzxxRfcjC8Wi/w4ytTwer1QVRVWqxXhcJjnBtfrdZRKJe7r7gQGcRroCURicblc+Pa3v42ZmRlerVOtVlGpVHhdOVVviMdrczGBvZdB6kXWteY/aZW1Wg3VahVWqxUTExOwWCxIJpPI5/O6WquB/cNsNsNiscDv92NsbAwAuGZIvQkWFhaQy+UwMjKCeDwOu92OTCaDUqmE2dlZvP/++7BarXA4HHzzazabSKfTKBaL8Pl8aDabcDgcmJqa4qlH9Xod2Wy2q6oxgzgN9BxmsxmKoiAYDMJut+t2RdpJi2zXRekgQQQtflksFjidTrjdbh6V17seA/sHuWfoi7TPRqMBu90OAAgGgwiHw/B6vVyrpMDPxsYGTx0jTZTkZDKZWvzYlLUhSRI8Hg8AcLcAbZy7wSBOAz2HxWLB8PAwpqamIMsyALRU/ogRb60ZvBNJ7aUccqeUJboOKtl0u90YGxvjeYIG9get71uUhSRJvNdAo9GAJEkYGhqC1WrF6Ogo/H4/Tp8+jYsXL2J+fh7vvvsuEokE7ty5w1sNAuBpRwC4T9put8PpdIIxxk14q9UKWZYxMzPD2xUmEgnkcjmsra3p5gSLMIjTQM9BnY/EDkeiZidCS5o79e7cK2FqjxN/15ZkejweHt010FuIG6jJZILdbofdbofH44HX64Xf70cgEMDy8jJSqRQvjaWGHXQs5QMDTzRZiqqTH5si7rIsw+1288T4crm8LZNDD8f2adipUUCnKSXdLE69PEEDm6CINZU7EnlSKpKInSqIRHNae37xePG7NuFdfJ/4WWK+J+V2OhwOXLhwAX6/H5cvXz7gu/L0
YafNi0xkIj3yLzscDqyurmJ+fh4fffQRms0mUqkUbt++jWKxiEKhsOvnUrtCh8MBWZZRr9exsrKCZDIJSZLgdruRSCR4QMrpdKLZbCKfz7c957EgznZaCaWZaP/WSS5gN/mC4gI3Iq76oGi1XpRcTwNsNpv8ntLxzWazhWi1iez0fr0OO/QeAvnR6NyqqvJzE3FSErWqqtzPZmB/aLc2aHMlGVmtVvj9fjgcDszNzeH+/ft49OgRFhYWulpf1K+AGn1YLBbU63Xe0EOWZTgcDmSzWR6gotLNY0+cu93IdqaZ9nVJkjA2NgaXy4Xx8XFEo1HcvXsXf/jDH3Z0GIuL0iDNJyBTNxgM8jphxhg3k8i/KRKf2MhD1DD17q+2gkjrq9S+X/yb1p9KNetixyUia8OK6D0or5JKItPpNOr1OiRJwuLiIs9qMJvN28zxnSA2yKZ2hOVyGZVKBc1mE8vLy7BYLDxlyWazwePxoNlsIplMtj3vrsTJGPsZgP8AIKGq6rmt1/wA/h+AcQAPAPxHVVU32p3jMLBblFZvIVBrsdHRUbz00kt4/vnn8d577+Hjjz/eNdK2mzN50NAPudpsNoRCIYTDYXg8HjgcDlQqFZ7kTiQlkqTFYuG5dUSqorxEzVK856I/tFNZiBseEafNZuNpLNq2dkcFR3HNUvpQpVLh2uDs7CyAJ5sfaYTihtYJ7HY73G43TCYTisUiJ85KpYL19XWoqgqXywWn08n7ue4m807GA/8cwCua134I4Leqqk4D+O3W7wMF7YIjMMagKAoikQhGR0cxNTWFs2fP4tKlS3jhhRdw6tQpjI6Owu12Q1VVKIqCs2fPtkSE9UAmQTQaxfj4OE9zGGD8HD2WK7kw6Evr+tC2j6Nj6Ls2f1Pv53aviSko7d6r1yREm2hvtVp5gv4Rws9xBNcsQXSt6Gn87QKLeqDNmKwJvWdC9G/TeXcb07Krxqmq6h/Y5qB3Ea8BeGnr57cA/B7AP+76X/QR7bRKs9mM6enRdW8BAAAdgUlEQVRpxONxBINBhEIhDA0N4eLFiy3lXRaLBblcDqOjo/je976HhYUFvPXWW7pJsnSjXS4XvvrVryIajeLKlSv49NNP+/Gv7gn9kqtInOJDSxVDoh+aQPdTjzxFP6ZW8xT/1g56gUGROEUt02KxwO12I5/P876eRwFHdc0S2rUU1Ns8O4k/UOScSnotFkvLpq6qKu/vSi3mepWOFFZVdWXrwlcYY0M7XPihjBulm0JjGMh/ZrPZWmbaEHkGg0HIsszLsADwoEAsFkOpVGq7eMxmMxwOB5/GGAgE4HA4+vnvHhQOVK5EjnT/RdISd/itz9Mls/2CouQiaeudu9210HVShZMYwDhi6Ei2h7Ve20F0wdDv3fqbae3T+icyBdCy3rtBz4ND6iGNG3W73XC5XAiHw5ienobD4UA8HofL5cLMzAyvma5UKjCZTCiVStz/0Wg0eDDD7/cjEom03GwtqB8gJekGg0Hemv+4ohO5yrKMeDyOeDwOp9PZEsneOg4AeLSd6tfJlwW091dqA0P0M2kQev5OrZahzfcjk65cLvMBcJVKBY1GA5FIBBMTE1hbW0Mqler+hh0RHNZ6FaGVuRgY3MuGSlH14eFhjI2NYWNjA7Ozs0in07h58yZKpRJqtRrK5TLXSncj570S52PGWHRr54oCSOzxPAcOWhjkEA6FQpiYmICiKDhx4gQ8Hg9Onz6NSCSCTCaDVCqFarXKO/WUSiVe5kUak6IoUBRFN+eQKhMikQgnTKfTyf1iRywae6BypZJFatrQzrwiE5kCBGK0Hdi+kHa6p1rtktDOdwqgRRsWzX+6HmBzGqLP5+MVKkcQA7tmO4WobbaTsd6zQZanx+NBLBaDLMtYW1vjmyXJWqxd3w17Jc7LAN4E8OOt77/a43nagh5ibf6dHsiUcjgcOH/+PEKhEE6cOME1zEAg0DK0KZvNcu2G0hKo1KtUKqFSqXDTe2VlBZ9+
+inm5uZ4Xhf19xsaGkI0GsXk5CReffVVWCwW/OUvf8Hjx4+RTCYxPDyMYrGIjY2No0KgPZOrSFhiF6Kd3i+mDImkJr6H0lPod+3xBPEcJGtKsqdxHmL5JxE4uWHOnz/fkjx9BNHzNdtPtJO5+DcCtTBMpVJYWVlBvV7H0NAQJEmCyWRCtVqFLMsIBALIZrMtzVzaoZN0pF9i06kcZIw9AvA/sHnz32aM/RcADwH8Tcf/cfvPAYCWHYUe8N1AtcQejwfPPfccTp48ifPnz+PUqVN8RGytVkM6nea1rJlMpqXnI5mHNDaWOqmsra3h6tWrWFpaQrFY5HlhsiwjFovhwoULOH36NF5++WUUi0W89957+Oijj2C32xEOh5FMJpFOpweOOPslV1Gb69ZXJRInTbAUcz61mmOnUVZyGdB56FnT83OazWbY7XacPHkSLpcLt2/f3sNd6C/6JdvDhMgT2udK+75qtYpSqYRMJoNEIgFZluH3+7nMqROW1+vlwaHdeKeTqPobbf70b3c7thtoiZN8UHoRV5qf7XK54PV64Xa7MTw8DLfbjWeeeYb36stkMjwnkFRx0jC0oM9yu91cMy2Xy7zJgKIoYGyzSQClNvj9fgSDQdRqNVy+fBm5XA4LCwsolUq8EmJQh331Q650DyjhXQsxcX23dCTxb3qkuRtEq4WOFxefWGpJzwkdR7+LGukgo19rdhCwU/YMucxqtRpSqRTK5TISiQScTifi8Tiq1SoP/lKeZ6PRQLFY3NVcH4jKIb1F0M48N5lMiEajCIfDmJqawpkzZxCNRvHlL3+ZD/5qNBooFApYXV1tefAp3YgK/mlRi3NnQqEQT5TNZrNwOBx49tlnIUkSvvvd7/LjGo0GUqkUkskkbty4gZ/85CfY2NjgBFwulyFJEsrl8sBpm/1Cs9nkkyLFnEmx4zrwZPCaVovYb2SdPkubC6hNjSIznfpwisUOdG303BwF4nya0G5tkSuNGncsLy9jY2MDyWQSgUCAu178fj/8fj9PfKe5RLsVvAwEce4Em80Gl8sFSZL4MK0TJ04gFArxXEyPxwOr1dpSKge05vppE7G15qMYuSN/GGkZog8MANbW1pDL5bggUqkUb0Pm9Xp5yoPZbEYymeQpD4OqffYKVqsVgUAAgUCAj3UVLQoCyYSqhQ4qFUlEu2II8RkQ/el0HUS6TqcTiqIY7eUGHGK3o2AwCJvNhlwuh3K5zIPGlFYmbpJknpPitW9TvR/Yyd9F2mQwGMSlS5fg8/ng8/ngcrm4xsgY46VT4oAmt9ute04xZUVcpM1mE8VikSfEUhs0xhgvz8rn83j77bdx7do1/vmBQACXLl1CIBDAV77yFYTDYd5o9dq1a3j33XdRqVSwuLjYk/s3qAiFQvjWt76FeDwOv9/PH0gKygBP/NMESg87CI0TeLJ5assoKaJKGyLVMVMHJ6vVCgD896mpKdTrdYTD4aOYLXHsoFd2CwDDw8N44YUX4HA4oCgKVFXFtWvXkMlkEA6Hcf78eb4hVioVPHjwALlcDl6vFy6XC8lksqVLUzscCnHSwtBqgMD2KgGv14toNIpIJILJyUneMMJms6FQKCCbzfJBS6JPVJblbV279RzHWtA5xGABaSLkZF5fX+cVRORzDYfDiEQiOHHiBCKRCI/OLS8vQ1GUlrEQTwskSUIoFEIoFIIkSbruF1H22tLMg4BedJ0gPoei5SHmm9IGS53riVANDAbE5tM0aSAcDsPhcMDhcKBer3MXnd1u5wEgUpBKpRIKhQJ3w1CWTa/yOPcMk8mEQCAAp9PJ51fLsgyv18t3fnpQZVmGz+fD8PAw/+cLhQIKhcK2SJq27T6RpljpoRfZ1ZbrUXoLReMo6maxWDiBf//738drr73GzXhJkriJ3mg0eAdpxjYn8126dAm1Wg23bt3q120eCNhsNvh8Pni9Xq71AeCaHk23zGQy20i1XWrJfiH6VqnjuDboo5f6lMvlUKlUeODP
wOGCNjhSomZmZjA9PQ2v14tYLAbGGB8GOL7VO4KIkrIpzGYzt0olSeqqKqzvxMkYg9PphN/vx8jICKanp+FyuRCJRLgvkWYqO51OnvUvRqkrlQrq9TpPCxK1Bto5xARqWija+me90joxmCDWrNJuRhPyAPBywkqlwhPp8/k8r22mWufJycmOkmqPG0wmE28gK6Z4iClA9XqdV2xQaaYWB0VUovUAoIU0tUSt9X+XSiVeYWJgcEBjMaampnDp0iUuy0ajgWw2i2aziWAwCIfDgXQ6jfX1dQBPyjBtNhtfm53kjBP6Spx2ux1nz57FqVOnMDQ0hGAwiKGhIe6PFE0k0gYAtDhwAfCHnbRDcVGST4oWp+hTE6tXtGkvWnNRa8ZRxA14srgogZ6i7KqqcoGQJvo0jlygzY42PsqPFdPB6OGm+6atCNFLcKb3ac1pgl4+n/Y89KyQdSO2rhPHAovXQRsAlWQaeIJeWAbapHb6IkVGkiRekfalL30JQ0NDiMViPJPFbrcjl8vh6tWr2NjY4CW0olnu9Xq5L5uKYvL5fEfllkCfidPlcuHFF1/ExYsXEY/HObmIxKVtcktlUC0XvUWApEFQxFqb0iIuCiI/ceaNaNrTe0WfCS0gOkbMSRQXpJhzSucjbfQoddU5KFAxgsvl4g8maZVEWHTPtL0VtdaASHj0PBC0myBBG73X1rM3Go0WrZc+n3J+xa5NdA1OpxOyLBtRdQ12Sj7f6/m0MQ/iAjKlJUlCIBBAKBTCN77xDUxOTiKVSiGdTvOeCMViER988AEWFxe3baJutxsjIyOQZZln5Aw0cdZqNayuruLhw4eo1Wq8ntxsNkOWZf4g040TyU97M8Xfxdpm7fHigiSSFv2ZotYqmufiedotTC2Bat9DJt76+vpTZarbbDYepRQ1N71MBq1LRM+Noof9LFKRrLXRU9HfqQ0s9SJN6qiDZHpQGqd4PvpOz4hYIUjdzcgkbzQasNlsyOfzWFpawsrKCp9oqYWY1yu6aWgj7wR9Jc5CoYA//vGPuH//PrxeLyKRCK/4iUQiPJhgs9m4tkYBGJHw9BYh/Z0aStD7RBKl10UtU5uORCZlrVZrqTgSNRfRtKN0KHGWs/h6MpnE559/jnK53M9bfajw+Xw4efIk4vF4iyxEU526D5FFQQ+4OBpBG90U7/9eIZK01i1AGzi5F0jDEZ8hgzxbsdPmtldo5UOuOuofMT4+jm9+85twOBzI5XJIpVKIRqMIBoO4evUq3n77bR5r2A0kW4qfdNpirq/E2Ww2ebE9mW2SJKFYLAIAbyQqTqUjBy418tDTOkXNk1RtrUBJ5ReP1aapNJtNntZEOX20sEWtlCAueDIjqcsSmfqJRALpdPqpIk7yBYryaqed61kSevITzXY97X6n38Vj6G8igYoLVZsSZWichw+SAWXZKIoCh8PBNUxVVXkzHarm2229iTIXC2c63QT6HrlQVRXpdBrZbBYrKyu4efMmH5JEO77FYuE16NTOi4rwyTEsppIATx5qiqrTwqM+e5SHqaoq97dRiV29XucVA6VSqeV3seJH1Ez0zJNmc3OkKGmrlGqTSCSOavPbPYE0drEhNI1/1WZBmEwmyLKMZrOp23pOa9ZrSZY+T88NQO+jTY0CUvQzaTLkRqFnT7w2sW7dQH+hXTNTU1N46aWXYLPZuHbo8/nAGMOHH36IGzduIJ1Od1yhx9hm2iMpaO16KujhUIiT6reJuAhiAMfn88Hj8cDtdvPu7FRCRYO/SKujY4Ht882p25GoTRYKBZTLZT7xTuzDSRojEateqeROs0tyudy2mueDNmUGHaJrQ9Tw9LRO0c0h3lPRtN9JWxW103a+tnY+aq0LBtiehC8GFgwC3Rv2G3mnTczn82FiYgK1Wg3ZbJZPwTSbzVhfX8fNmze7PjdlwIjtCTs6roOLHgHwfwBEADQB/FRV1f/N9jA1j8iLHk6aKig+kGK1DvXGzOfzsFgsWF1d5aMw
tH7Kdg+1uED0mkqQVkSLh4hb7Pau1WrF1+ka6MYXi0WuxZKWS1UKg9TH8SDlqkU2m8X9+/cRiURafEYiyWnTkEjmFPXW64xFoACONudWa+prnytguxYjunvEwJBoyh8l9FKunUBsokNFK+Q3TqfTO47c1YPFYsEzzzyDWCyG6elpqOpm9VYoFEKpVMKHH36I1dVV3L17d9uxeoFIretFlmWeatbVdXXwnjqA/66q6ieMMTeAa4yxfwXwd9icmvdjxtgPsTk1b8fhT6QBij5LkTjFnYlItl6vY319vSUqrn2oxYdfXIi0GMThTKThiP4ssfMNfQaRsyRJLRF/4ElZpngOStInrZbImXy6zWZzoIgTByhXLXK5HK+gEs11glYLBfSJE2jf6VtvwxS1Q9Jg2/lFxXMTyWpLL3sR+OgDeibXTkBrQZIknuqjKApkWYaqqkilUt1pdhYLTp48iZmZGT7Hi869sbGBv/71r201zXaWA/2NfPHa8uyOrmu3N6ibA55oyFOOMTYLYBj7mJonaiHtfEjiAyz+8/SAa7sctRtrIRIe0NpSTPu5FAiiKBsAPqxeex4RRLDAk+FPZO6XSiWk0+mB0156IVeCy+XiUU5x/K/YZIMWmJjLCWyfEbRTIK+dM38nS0SUvx6pizLW01wHHb2UayewWq3weDw8JU2cvEDWXCfaPGObDcMdDgd8Ph9CoRBkWYbdbkcmk8Gf//xnJJNJPoO9HUTC1HtWSOPsNt+6K5pljI0DeBbAx+hiIqIWRCqVSqXtQ0nqvkiStGBIGxRL5rSaiVhZIuZqaqEN9FQqFa4Zi7mXonapNRNpoQJPNBU6njpPDxpxijgouRICgQDOnj2L8fFxnuxOMqdMBOo+Rb9rU7+0m554/0TNVC+ivpNGKrpVSL7aTVh0KRxRrRPAwcu1E9DkAxpFQVVj5I+klm6UjtYOVEfu8XgQj8cxOTnJ/ZGrq6v45S9/ibW1tR0DQVotU/t5JpMJHo8HPp+v7SDGduiYOBljLgDvAPgHVVWzne7CbJdxo+1unhgYEN9L2qCodehpnNTbcesaALRqukD7IV0UFabgjhh4os/SBi20xEkkQRrooC6+XsiVmraICfDA9qAMyfLx48eoVCpQFAV2u52/VyQ+vQCD3v3f5Zpb3kfZF2LWBJXparVZuh5y/Qy6D7RX63U3UKcyVVWRzWYhSRInTlJKOtmMrFYrotEo/H4/fD4fHA4H8vk81tfXkUgkeIB3ryDesNlsvIlxN+jo3YwxCZtC+IWqqu9uvdzR1Dx1j+NGxRxJUejaxdTugejWvNLzhWiFq70OraNZ7/hucsP6jV7JdWhoCOfPn8fY2FjLmF/RhUGklUwm8atf/QqpVAqvvfYaLly4wP3QomZCJCVWd2irjbauS/dnsUyXfrfZbCiVSrh//z6q1So8Hg9vuEw5vFrIsgxFUfgAsEGU7WGsV0I2m+UBYHG2EwCeadLJPfP7/Xj99dcxPj6O6elphMNh/Pa3v8Xly5extraGQqGw6zn0ynNFMMbg9/sRjUbhdDq7+C87i6ozAP8CYFZV1X8W/tTzqXntVGwD+0cv5Wq1WvlcerHsFdheO16r1bC8vIxEIsGDSmJPAXF0hbgZaf4Xfm7tAhE1V9HCECPoxWKR5+2K59LTjqiJxKBqm4e5XgHwXGjyYWutOq2bRAsK0LpcLsTjcYyOjvJ5QPl8HouLi7zEcr+gzJ6eBIcAvAjgPwO4wRj7dOu1f8Ixm5r3FKJnchWzE6j5Cj2kWhdHqVTC7du38eDBA0xPT6NcLiMej2N8fJwvNDrWZDLxQVp67pl2kXsAvKEMvV4sFrG6uor19XVcuXIFpVIJfr8fkUiEH0ckLVo4sVgMMzMzWF5eRi6XG8QeBAOxXoPBIF599VWEw2E+NO3evXuYnZ1FKpXCvXv3dP2Tw8PDuHjxIsbHx3Hx4kWEw2HMz8/jiy++wJ07d7C8vMw3uf2Cii8cDsfBB4dU
Vb0CoJ3de+ym5j0t6KVcRV8ikRylf4hVXcCm+ba6uopHjx5hbm6ON32hudekDVCwkPxnYjCHINayk4kozkcX+wlUq1Ukk0msrq5ibm6OTxOg4/W+M8agKAri8ThKpdJARtsHZb263W48//zzmJychNfrhd1ux8cff8xbvz148ECXOBVFwTPPPIOxsTGMjIxAURTcvHkTDx8+xOPHj/ecoaIXGGJbKY60oXeDp69ZpIG+gMiLHkwaHdLuAa3X67h9+zYymQxu376Njz76qKW1H+XZUtCOCiH0/Juitil2nRf90sViEZlMBtlsFrOzs2CMbWsKIWqaZMb7fD6cOHEC2Wy2o4DUUYYkSYhEIi3FIuQ6EeePi7588WeKU9DIiomJCRQKBbhcLszOzvICkWazyWvQT58+jWeffRY+nw+ZTAaZTAbXrl3DtWvXMD8/fyBuO3HDpRTEbmEQp4EDhxjppr6ku80kr9fruH79Oj777LO2QT8KGlHeLBVP6AXmgCeLQiym0LtWVVWhKArPCRS1WQpwEAGHQiGcOXMGjx8/PvbEabPZMDExwQmTGnfXajUUCoWWqZBiOhl90XFOpxOBQADT09M8c+J3v/sdL2lWVRWhUAijo6M4d+4cXnjhBTDGsLS0hPX1dfzpT3/C73//+65qyXcDWUHiZtANDOI0cOBIJpO4desWqtUqLly4wP2Tet2HRHSSpkL+Uu20TL3UNdKSyDTfaXE0m5stBYlgxZxOOr/JtDnMK5PJPBXD90ymzc73Yt4yESLdT/E7bVL1ep37isvlMi+zpFSifD4PRVEAbG5EAHDixAnE43H4fD4UCgVUKhXcvXsXa2trvJuaGPzbD4GSVlyr1ZDJZJBMJruWp0GcBg4cN2/exNLSEr7+9a/j5Zdf5pUZlCdJpLSXXEgiTMoL3A2dZmao6maDlrW1NV6hQiRAQSqLxYJkMol79+5hdXV1YCPrBwWLxYJQKASfz8eHJ9LIbOqET9oa9ZUgUqLg4NraGu81sb6+jtXVVRSLRYyNjcFkMmFsbAwejwfRaBShUAgmkwkLCwtYXV3FO++8g0ePHuHhw4ctbSj3al4TaCMtFouYn59HLpfrvoZ+z59uwEAb0CTBtbU1LC8vQ1XVlo5SZPoWi0Ue7OkWB13Ro6qbfQby+TwYY1zTIq2TNKpsNou1tTXkcrljT5x0TygvUywAEHM0aUPUljaTWU/TJnO5HG/xSDmzgUAAiqLA5XJBkiSUy2VkMhk8fvwYiUQCyWSSb5AHJW8xzYwCj0ZwyMChgwIHn332GX70ox8hFovhjTfewPT0NDfbk8kk7t+/j7t373aUzKytINLzbYrQvrab5tloNLC4uIibN29iamqKm5CyLKNSqeDBgwfI5XJ4//338etf/xr5fH4QU5EOFLlcDh988AEPzIkzuohEKevBbrfz5h70XVGUlsIDq9XKNVfqrev1emG1WpFKpfDgwQOsrKzg1q1byGQyPNtBHNZ4EFNGqcDBYrHgueeew5kzZ3Dr1i1cuXKl43MYxGngwEER1WQyiatXryIWi+GVV15p0U7K5TL3X3Wak9euakzr39zpurTnEwk1l8thfX0d0Wi0JWHbZDJxX9jy8jIWFha6uR1HFvV6fceOXuQDtVqtvK7carXC4XDAarVykx3YvNdut5sTJmmc1JC8WCwikUhgcXERd+7cQalUQjab3UaUB6F1ku8b2Mw3jcVicLlcXZ3DIE4DPUO9Xkcmk4Hb7ebVP9SD1eFwcBONTHjSaMS2cgQyrbSaZzt0ooVqfa3pdBorKysYGxvjgSer1YpisYjZ2VnMzc1haWlpP7fkWIECajTxIJ/Pb+vHSXIS++9SZRClqzHGkM1mkcvlkM1msbGx0TJB4KBBQSFJknD9+nWUy2UsLy93dQ6DOA30DNRRn+bCkH+MEttdLhccDgf3jZHvTNQItDioMlxtyhM1paCoL5V+Uine3Nwcrl+/jvX19X197nEDJbEf
pZlajUaDk/ydO3e49dMNDOI00HNUq1XcvXuXjzsxm81IpVJYXFzEwsICTwXR64jVa4hETHmK4rRDseHHQQekDBwuqtUq5ubmkE6nkUjo9jxpC4M4DfQcpVIJV65cwfz8PMrlMiqVCp9KmM/neeK5WIXSD2hJsFwuI5fLoVQqtYyHpnlUBmkeL9DoDZPJ1HWgzyBOA30BDcYrlUot32ny6CBA7CpP/tR+ErmB/mOvUXrWz4eWMbYGoADgKDqKgtj/dY+pqho6iIsZJBhyNeQ6gOipXPtKnADAGPuLqqr/pq8fegA4qtfdLxzV+3NUr7tfOKr3p9fXfby7FBgwYMBAD2AQpwEDBgx0icMgzp8ewmceBI7qdfcLR/X+HNXr7heO6v3p6XX33cdpwIABA0cdhqluwIABA12ir8TJGHuFMXaHMXaPMfbDfn52p2CMjTDGfscYm2WMfc4Y+29br/sZY//KGPti67vvsK91UGDI9XjCkOsOn9svU50xZgZwF8C/A/AIwFUAb6iqeqsvF9Ah2ObM6aiqqp8wxtwArgF4HcDfAUipqvrjrYfIp6rqPx7ipQ4EDLkeTxhy3Rn91DifB3BPVdV5VVWrAP4vgNf6+PkdQVXVFVVVP9n6OQdgFsAwNq/1ra23vYVN4Rgw5HpcYch1B/STOIcBLAq/P9p6bWDBGBsH8CyAjwGEVVVdATaFBWDo8K5soGDI9XjCkOsO6Cdx6jVQHNiQPmPMBeAdAP+gqmr2sK9ngGHI9XjCkOsO6CdxPgIwIvweB9Bd99A+gTEmYVMIv1BV9d2tlx9v+VPIr9JdH6rjC0OuxxOGXHdAP4nzKoBpxtgEY8wK4G8BXO7j53cEttnd9l8AzKqq+s/Cny4DeHPr5zcB/Krf1zagMOR6PGHIdafP7XN3pH8P4H8BMAP4maqq/7NvH94hGGNfA/ABgBsAqJ/YP2HTb/I2gFEADwH8jaqqqUO5yAGDIdfjCUOuO3yuUTlkwIABA93BqBwyYMCAgS5hEKcBAwYMdAmDOA0YMGCgSxjEacCAAQNdwiBOAwYMGOgSBnEaMGDAQJcwiNOAAQMGuoRBnAYMGDDQJf4/CORZoMLIUToAAAAASUVORK5CYII=\n",
      "text/plain": [
       "<Figure size 432x288 with 9 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "import numpy as np\n",
    "import matplotlib.pyplot as plt\n",
    "%matplotlib inline\n",
    "trainX = X_train.reshape(-1,28,28)\n",
    "print(trainX.shape)\n",
    "# plot the first few images\n",
    "for i in range(9):\n",
    "    # define subplot\n",
    "    plt.subplot(330 + 1 + i)\n",
    "    # plot raw pixel data\n",
    "    plt.imshow(trainX[i], cmap=plt.get_cmap('gray'))\n",
    "# show the figure\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 68,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "72.94035223214286 0.2860402\n"
     ]
    }
   ],
   "source": [
    "train_X = trainX.astype('float32')/255.0\n",
    "print(np.mean(trainX),np.mean(train_X))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Define the neural network model to be trained:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 102,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import util\n",
    "##from NeuralNetwork import *\n",
    "##from train import *\n",
    "np.random.seed(1)\n",
    "\n",
    "nn = NeuralNetwork()\n",
    "nn.add_layer(Dense(784, 500))\n",
    "nn.add_layer(Relu())\n",
    "nn.add_layer(Dense(500, 200))\n",
    "nn.add_layer(Relu())\n",
    "nn.add_layer(Dense(200, 100))\n",
    "nn.add_layer(Relu())\n",
    "nn.add_layer(Dense(100, 10))\n",
    "#nn.add_layer(Tanh())"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Define the optimizer object:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 103,
   "metadata": {},
   "outputs": [],
   "source": [
    "learning_rate = 0.01\n",
    "momentum = 0.9\n",
    "optimizer = SGD(nn.parameters(),learning_rate,momentum)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 104,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0 iter: 2.3016755298047347\n",
      "1000 iter: 1.1510374540057933\n",
      "2000 iter: 0.47471113470221005\n",
      "3000 iter: 0.5333139450988945\n",
      "4000 iter: 0.259167391843765\n",
      "5000 iter: 0.3629363583454308\n",
      "6000 iter: 0.3486191552507917\n",
      "7000 iter: 0.4914253677369693\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "[<matplotlib.lines.Line2D at 0x1e486700790>]"
      ]
     },
     "execution_count": 104,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD4CAYAAAD8Zh1EAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nO3dd3xV5f0H8M83i7BCIIQ9wgh7ExkiyJSlohX3qNZRLW2VtlDUilRsS4XaFhfOFrUqTuAnMkRlKQIJexhmgLASCCQhIfv5/XHPvdx9T3LHOffyeb9eeXHvOc8955vB9z73maKUAhERhb8oowMgIqLAYEInIooQTOhERBGCCZ2IKEIwoRMRRYgYo27cuHFjlZKSYtTtiYjCUkZGxlmlVLK7c4Yl9JSUFKSnpxt1eyKisCQiRz2dY5MLEVGEYEInIooQTOhERBGCCZ2IKEIwoRMRRQgmdCKiCMGETkQUIcI+oecXlyP/UjlyCkpw7mIpKqsU1mTmYNPhcyivrLKVq6pSqKpSOHexFGsycwyMmIgoOAybWFRTR88V4dq5a4J2/QN/GY/Y6LB/nyOiK1DYZa5dJ/KDev3Up5cj+3xxUO9BRBQMYtSORWlpaaqmU/8LSsqREB+LgzkX8dTnuxAbI+jfpiGuatcIB85cRP6lcpy9WIqRXZqgS/MEXCqrRFLdOBSVVeCzjBO4/arWKCmvxIGci3j4XfcxHPnbBIiIP98iEVHAiUiGUirN7blwTOjBUlWl0P6prwAAc37WE3cMaGNwREREjrwl9LBrcgmmqCjB78d0AgDM+HyXwdEQEVUPE7qT34xKNToEIqIaYUInIooQTOhuTOzVHABgVP8CEVFNMKG70aNFAwDApfJKgyMhItKPCd2NRnVjAQB5RWUGR0JEpB8TuhtVWkvLgrWHjA2EiKgamNDdKCwpBwC8/+MxgyMhItKPCd2NIR0bAwAGpDQyOBIiIv2Y0N1IbVIfADA0tbHBkRAR6ceE7kZstCA6SlBSwVEuRBQ+mNDdEBHEx0ShpLzKd2EiIpNgQvegqKwSW4+dNzoMIiLdmNC92HbsgtEhEBHpxoRORBQhmNCJiCIEEzoRUYRgQvegW/MEAMDJC5cMjoSISB8mdA/2nioAAOzMZscoEYUHJnQfSis4Fp2IwgMTug9FpZwtSkThwWdCF5HWIvKdiOwTkT0i8ribMiIi80XkoIjsFJF+wQk39OYs32d0CEREuuipoVcA+L1SqiuAQQCmiEg3pzLjAaRqX48AeC2gURrgpj4tAADchY6IwoXPhK6UOqWU2qo9LgSwD0BLp2KTALyrLH4EkCgizQMebQg1TYgHAJRXsQ2diMJDtdrQRSQFQF8Am5xOtQRw3O55NlyTPkTkERFJF5H03Nzc6kUaYjHRAgAor2QVnYjCg+6ELiL1AHwG4AmlVIHzaTcvccmESqk3lFJpSqm05OTk6kUaYkNTLfFd28nccRIRWelK6CISC0sy/59S6nM3RbIBtLZ73grASf/DM06f1okAgLSUhgZHQkSkj55RLgLgbQD7lFIveii2FMB92miXQQDylVKnAhhnyMVEWT50VLDJhYjCRIyOMkMA3Atgl4hs1449BaANACilFgD4CsAEAAcBFAN4IPChhla0ltAzTxcaHAkRkT4+E7pSagPct5Hbl1EApgQqKDOwfDABlu06hVcMjoWISA/OFCUiihBM6EREEYIJnYgoQjChExFFCCZ0IqIIwYRORBQhmNCJiCKEnolFV6x+bRK5wQURhQ0mdC8yTxeiqIwJnYjCA5tcvGAyJ6JwwoRORBQhmNB1KK/krkVEZH5M6DpcKC43OgQiIp+Y0HWwLqVLRGRmTOg6FJVWGB0CEZFPTOg6TF203XchIiKDMaHrsP8Mdy0iIvNjQtehoIRNLkRkfkzoREQRggmdiChCMKETEUUIJnQiogjBhE5EFCGY0ImI
IgQTOhFRhGBCJyKKEEzoREQRggndi4eHtjM6BCIi3ZjQvXhyfFcAwNjuTQ2OhIjINyZ0L6KiBF2bJ6BKGR0JEZFvMUYHYHb7ThVg36kCo8MgIvKJNXQ/7T6Rj5QZy3Awh0vsEpGxmNB1Ol9U5vb40h0nAQDf7MsJZThERC6Y0HX6OP241/N/W/4T8i9xM2kiMg4Tuk6iY5/oL3eeDH4gREQeMKHrJHCf0ZXiEBgiMgefCV1E3hGRHBHZ7eH8cBHJF5Ht2tfMwIdpPou3ncDxvGKjwyAistFTQ/8vgHE+yqxXSvXRvp7zPyzzOVtUirfWH7bVyJ9YtB03vfI9RE9bDBFRCPgch66UWiciKcEPxdxeX3sYAHBNamN0aZYAADjnYeQLEZERAtWGPlhEdojIchHp7qmQiDwiIukikp6bmxugW4dWRaVjmznb0InILAKR0LcCaKuU6g3gJQCLPRVUSr2hlEpTSqUlJycH4Nahd+RsEVJmLDM6DCIiF34ndKVUgVLqovb4KwCxItLY78hMat3+8PxkQUSRz++ELiLNROsZFJEB2jXP+Xtds3LuA7XvFGXrCxEZyWenqIh8CGA4gMYikg3gWQCxAKCUWgBgMoDHRKQCwCUAd6gIblh2/s4i+FslojCjZ5TLnT7Ovwzg5YBFZHLeRilyBCMRGYkzRavp4/Rso0MgInKLCd2HZgnxusuy9YWIjMSE7kOUj2aUN9cfCU0gREQ+MKH7wKn9RBQumNB9SK5fS3dZ5n4iMhITug//ur2P7rJsQyciIzGh+9CwTly1XzP+3+vxyLvpQYiGiMgzn+PQr3g1aEbZd6oA+04VBD4WIiIvWEMPIBHgmcVu9wEhIgo6JnQfqtPRqRTw3o9HgxcMEZEXTOhERBGCCd2HenHsZiCi8MCE7kOUr6miREQmwYRORBQhmNADaN6qTKNDIKIrGBN6AF0oLvd4rryyClVVnEpKRMHDhB4iqU8vx4zPdxodBhFFMCb0EOLmGEQUTEzoREQRggndBFbvPYNVe04bHQYRhTnOmgkBX52hD2krM2bNmRiKcIgoQrGGHgJVOhdK33uSKzQSUc0xoevw3KTuNXrdR5uPYcn2E6h0SuhvrT+MqYu2u5R/cOGWGt2HiAhgk4su7RrXrdHrZny+CwDwwi29HI4/v2wfAOCf1dgNiYjIF9bQdWjbqGYJ3epsUWmAIiEi8owJXYc2SXX8er0/M0SnLtqO9zZm+XV/IroyMKGHwLxV+z2eyysq8/raL7adwDNL9gQ6JCKKQEzoBus3+2ujQyCiCMGEHmIrdp8yOgQiilBM6CG2/sBZj+dO5Zfg+4OezxMRecOEHmL23aM5BSUu52cu2a3rOgUl5SitqAxQVEQUCZjQQ8x+jtGAv37jcl5E35Z3vWatwt1vbgpUWEQUAZjQdVr9u2EBuc6Hm48F5DoAkH70fMCuRUThjwldp45N6ofkPtySmohqigmdiChCMKGbjM4mdCIiFz4Tuoi8IyI5IuJ2+IVYzBeRgyKyU0T6BT7MK8fxvEtGh0BEYUpPDf2/AMZ5OT8eQKr29QiA1/wP68p1qbwS72w4YnQYRBSGfCZ0pdQ6AHleikwC8K6y+BFAoog0D1SAV6LnvtxrdAhEFIYC0YbeEsBxu+fZ2jEXIvKIiKSLSHpubm4Abh25vK3QeKmsZhOKbnx5Ax5amF7TkIjI5AKR0N1147nNRkqpN5RSaUqptOTk5ADcOrR6t2oQsnvNW5Xp8dzi7SdqdM2d2flYve9MTUMiIpMLRELPBtDa7nkrACcDcF3T+fSxq7Fj5nUhuderaw45PB/6wrc4crYIAMeqE5F7gUjoSwHcp412GQQgXykVkUsKxkZHoUGdWCyZMiTk9z6edwkLf8gCAGw64q1Lg4iuVHqGLX4IYCOAziKSLSIPisijIvKoVuQrAIcBHATwJoBfBS1ak+jdOhFPju8S8vv+94csKKXwxTbvTS5nL5YiZcYyPM/OVaIr
is9NopVSd/o4rwBMCVhEYSLKoBlAp92s0Gjv6LkiXDt3DQDgrQ1H8Kfru4UgKiIyA84UrSGjZnTOXem5sxQAss+7n5hU05ExRBQ+mNBraGiqZZROy8TaIb3v51u9N7e4e585e7EUXWeuCE5ARGQaPptcyL3Ozeoja85EFJVWoPuzK40OB19sy8YX205i3X7X8f2n87030xgtt7AUCgpN6scbHQpRWGNCjxBTF+3weC4YzUPllVUQADHR/n/Iu+ovqwEAWXMm+n0toisZE3qYu3XBDxjcobHH8+eLypBfXB7w+6Y+vRydm9bHyqmB2fiDiPzHhO4nzxP0/ffxluM+y2zJOo8tWZ53Luo7++tAhuQg80xh0K5NRNXHTlETm/7ZTkPvP/vLvfguM8f2fMXuUxj94lpUellnhoiMw4ROHr294Qge+M8W2/Npn+zEwZyLKCqrMDAqIvKECT2Apo3tbHQI1ZZfXI7zRWW6yvpTL/9g0zGc8TEpymwKS8pxoVjfz4bIDJjQ/WSZKAvUjYvGlBEdDY6m+no/t8pnO/tb6w/j0fcybM+rO2gmp6AET32xy6G2Hw7Snl+NPs8Frw+CKNDYKRogEuabge7MvoDE2nFok1TH5dzzy/YBsLxp+fLQwnQkxMfgxdv72I6Va23u58OstltaUWV0CETVwhp6gL1zf5rRIeh24sLlZQJufPl7DJv7HSqrFHZl56OismbJbPW+M/jcafEw66eYULzlfbPvDDYcOBuCO4XOXW/+iBe/3m90GBQGWEP3k3O78sguTQ2Jo7r+vuInvOa05joAdHjqKwDAPYPaBOxeWj4PyaeYB7UdmWoySenj9OPo1jwBPVqGbiMTPX44dA4/HDqHxvXicO+gtqb6NFhSXolPM7Jx98A2porrSsUaeoCE25+yu2Ru7/0fj3k85+9/3NV7z+DEhUsoKa9Eyoxlfl3LXyt2n7Z11k7/dCeuf2lDUO7z2w+34eF3/dv+b+aSPV7nHBhh3spM/GnxbqzYfdroUAisofutblwMOjetj6ljOhkdStBVZ5TLheIyJNaJc3vuIT8TW8qMZRjdtQne+vlVul/z7JLdKCmvwt8n97Idq6pSePT9DLRNqoO100b4FZMvS3cEZhOv0gpzrZqZp/WLFHE1T1NgDd1P0VGClVOHYVyPZkaHotvekwVBv0ewR4es3pfju5CdhRuPYlG648xb6xvU8bziAEVFZCwm9CBYNXUYbu7b0ugwPJowf33ArtV/9tcoKCnH5iN56DXL/aqTl9vQfV9v1tI9tmaY/EvlyNL2UXWnrKIKhSU1X6fG2llLfuCP0FSY0IOgU9P6eGFyL7R1MwQwnNkSs92xc0Vl2JWdj/nfHEBBifcZpHoS+n+1fVMB4KZXvsfweWs8lr3/P5vRc9Yq3xf1wblP4ADXqKEwxYQeJLHRUXjj3vAZwqjHpXJLO6lzpaykvNJjsi4uq8Dwed9V+16ZpwtxxEvtHLCM/qiue9/e5LPMmH+uq/Z1a+qV7w7iyc9d1+z5eMtxZJ4OgzeWcBsNEOGY0IOoc7P66GmyIXCB8Op3Bx2eP7gwHes9jP3ed6oQVbaavf7//WP/5ZhUC0rKkTJjGf77/ZHqBevEU5xGmbsyEx9uvty2f/ZiKb7adQrTP9vp8jOwKigpR06YLaNAocFRLkFWKyby3jMzjuofOnco56LtcVlFFV5dc9BLac+sCey9H4/W6PXh4sGF6dhx/ILXMsNe+A4XistNtyHIuv252Jl9Ab8emWp0KFesyMs2JjP/zr5GhxBw1elLtF8C+HRBCV5Y4X2Ta3dW7D5VrXvqkV9cju0+EqdD+UvlyDiaZ3vtruz8wAakOXHe94ibCwHcsCTrbBH2nAzM93LfO5sxbxVntBqJCT3IWoR4E+lQUCEe2vDo+1ttj/VOavrZq9+jwMsImHve3oTJCzZ6PF9S7jiu+oH/bMYtr21EeWUVbn9jI2542fMEpNzCUmysQft+Tazc
cxr9Zn9d4/Hpw+etwcT5+idTfZqRbfsEsXTHSXx/0FxNWGaTW1ga0hU7mdCp2sw2W9GdrccuYPXeMx7P7zrhvVba5ZkVbstXKYWffHRW3vLaD7jzzR91Ruqf2V/uRV5RGXIKSkNyvz98sgOTXvkegGX265kQ3TdcXfWX0K7YyYROYeHoueBN/vFV56+qUiivtC4wdrl0cVkFqqoUHlq4BT8cPIvyyirsys7HsQiYqFRVpcJqZ6o//98efLDJslzF4m0n8OXOk9VqUosU7BQNgecmdcfMJXuMDiPkikoDt7ORdbmAg3adrPZW7jmNqzskORwL1FpR73gYWdNt5krsnHUdVu/LwabDeSgMyPfrPeh7397s9nig+xjufPNHbDqSZ7qOV0/+830WAOCugW3wxKLttuOv3t0PE3o2NygqRzM+24kLxeVYcG//oN2DNfQQuG9wCjY/NcroMELugf9Wb0OLn07XbEmChT9k4ZfvZeAPn+xwOP7DwXMoC8Ca5oftxsM79x8YPdm0Om9aazJzcNvrG1Glo+a96UieH1FVz8XSCqzJrN5SDnr5mssQSh9tOY4Ve4K7iBkTeogk1atldAght7maSWHcv2q2JMGzSy2ffpzbtj/JyMYNOlZPrKhSXkd6eMuZb6zzvmqlXikzluHcRf/aoyurlENHcMbRPOTZbS/4mw+2YfORPFwMwZ6wSinkX9I3Gmfqou24/z9bkK1jhE8olVVUVftv2GhM6CESHSXYMfM6o8OIaKXlrrXxzDOFLiNODua4dmre5mXEi30t2LlG/sp3nhP6hgNnkTJjmcsoh3X7c902R/32o20er+XJ8TzLJiUKCrOW7kGvWatsI15ueW0jJi/4AYBltUlrk1B1PlVsPHQOBSXlUEpVa9OTN9YdRu8/r8JJu01UPDmca2lGsx9ZVFBS7nYP2qyzRfjup+DU5p39bfk+3Pb6xpAsZhcobEMPoQZ1YnF7WmssSj+OXwxphzOFJVi285TRYUWM0x5mT85Z8ZPD89Evus7A9Lb867E830nJXTX+FW1G7evrDtuOzVyyG+9utEyOatPIca2f80WuNdpDue77DADgq12OfzuLtZ2idp/IR552rcO5liaHhRtrNiHrzjd/xJCOSRjZpSlmf7nXY7kFax3f2FZpI4xOXLjkMnR3xe5T6NikPjo2qedw3P6NZvQ/1iKnsNSlDd+6tk912vbPFJTgUO5FXN2hse7XALAtvRDorRPf3nAEdw9sg/hY31s6Vhdr6CH2s36WVRjHdm+K5CuwGcYIvmZe+rJuf67t8as+NgaxZ00E9puJvGuXWJ1Hw+w95VoTHPWPtR6vv99pETFrPrzltY1eN9Nw1+7++tpDLmPvrTJPF+Lzrdkerwe4dlZ7a6Z69P2tGP2i5+8LAHIKLc1PKTOW+bWloAgwcf563PWm7zV8PAl0P8nsL/fi5W9rNmPaFyb0EBvYPgkH/jIeA9sncfnWMDT/mwO6y9ZkqOXZGrajP/7RdlzUOcrG3Z/d35b/5DXJRAVgyNCRs0Uun0j3niywdVx6usUHm/1b7uHsxcs17JV7TiNlxjIcc/O7GfvPdXjey6cQdy6WVuB0fvXX1fFn2WdvmNANEBtt+bFPMvGa6eS/Sx5qvMFQ3THXe08WuCSVAzmFeHPdYZcNqc9eLHOZiDXh3/o6sHccv4B/rLIs9zDyH2sw5YOtDucnzF9vW7wtGPUb52su2W5pltp5wvXnlXmmEG9tcB2i6u297IaXNmDQ377xK8ZAYkI3UL82DZE1ZyKu7ZRsdCgUhv61Wv+nBWcT5q93WUt+5Z4z+MtX+3R9CnHXPGTPmgSfX7YPL2k1/5om7IullaiorHJY3iBlxjK8uCrTZViqc8177krHtYNqsuSyfdwZR/Pwq/9l2IZ+Wj9drNxz2mGbwWmf7DBkITl2iprAQ0PbYa1dOy2Fn0Ifm3uYyZc7A7O/aXXs9rHUgjfr9udi1ItrXUYGzf/2INYf
PIsvfjXEdmzYXM9r728/fkH3wmbllVWocDNe/+F3M5BXVIa8SWUOndK/fC8DAHBj7xYALENmP8m43O/gPELI343WPWEN3QR6tUxEfGwUbk9rbXQoFIGcp/A711pD4Xod8wG8OXqu2KEt3GrbsQv48bC+Wnde0eX+CYFg94l8jP/3emQczcNLTp9K+j33tdsx6FFaHn7524MeZ3+vdDN5qOPTy3XF6C9dCV1ExolIpogcFJEZbs4PF5F8Edmufc0MfKiRq0GdWPw0ezwGtm9kdCgUgTo89ZXD80Auv+uOuw5HPUorqvDmusPVGu8OAHe8Uf2F0LYeO4/JC37AvlMFuOW1jfiHU7+B8zIOl/sbLBndfqtEe/e9s9lWWzeCzyYXEYkG8AqAMQCyAWwRkaVKKefu4PVKqeuDEOMVw9pWFx8bhXaN62Gfj3bK2Tf1wDOLd4cgMiL9vDV72HOuAb+x7jCW7jiJ+NjgNxy87abz05Otx87jnrc34c370nyOQlpncNOpnp/cAAAHlVKHlVJlAD4CMCm4YV2ZorTfxg29WiA22ncb290D2gQ5IqLgcd696oK2VIC32bf+0Nsp65y0tx2zLBett2nHSHoSeksAx+2eZ2vHnA0WkR0islxEuru7kIg8IiLpIpKem8tOQGcTe7bAA0NS8PTEroiL9v2riYoSzJ7k9kdNZHrbjjkOHbTWbj3N+A2V951Gp3yXaYkrWGPHA0lPQndXVXR+r9sKoK1SqjeAlwAsdnchpdQbSqk0pVRacjKH6jmLi4nCszd0R2KdONtEjkWPDPL6msn92ZFK4Unv4l2BskPntoGe1or5ON37bFkz0JPQswHYZ41WABzGPSmlCpRSF7XHXwGIFZHqLZxADnxt8za6a1MAlh10nN01kE0xRM70zvLVm/j98XH6cd+FakBPQt8CIFVE2olIHIA7ACy1LyAizUQbWCkiA7Trmr/BycSseVpEMKFnMzw2vIPD+ZQky8JO7hJ6rZiosNmYgOhKVOxlMTh/+BzlopSqEJFfA1gJIBrAO0qpPSLyqHZ+AYDJAB4TkQoAlwDcobhQiV+sPzwR4NW7LTucWBd5+uW17fGbkakAgNpuVmwTn5uqEVEk0jVTVGtG+crp2AK7xy8DeDmwoV3Zbr+qNTKOnkdKUl3bsRYN4nEyvwRPju9qOxbjpvM0SJPQiMjkOPXfpG5La43bnGaOLn98mK61mZPrc1leoisRp/6HkQZ1YpHSuK7H8/cOagvAfTMMEUU+JvQI0KB2rMvuL77smsXt8IgiDZtcIsCOZy3JOfN0IT7bmo0RnZt4Ld+4Xhzqx8eGIjQiCiEm9AjSuVl97H1unM9yvhI+EYUnNrlcgTielCgyMaFHuC7N6tse33FV9ZYJyJoz0TaBiYjMj00uEWzfc+MQHSXYe6oAcdFR2H0yHx9tOW6bhTq4fRI2+lhBrpLzw4jCBmvoEax2XDTiYqLQp3UiurVIQK0Yy6/but70wl8McHmN865J08d2sT3u2yYxiNESkb+Y0K8gE3s2x+OjUjFjvCVJx8VEYfXvhjmUcZ5lekPvFjj4l/G4d1BbvHxXv1CFSkQ1wIR+BYmJjsLUMZ0chix2bFIfHzw8EL8f0wmA+2UDYqKjMPumHmjRID5UoRJRDbANnXB1h8bIOnt5H8iPHhmEsgrXfR317lS+fvoILN1x0mUz4seGd8CxvGIs23nKwyuJyB+soZOLQe2TMKxTzTcgad2oDqaM6Gh7PnW0pfYfLYJX2GxDFDRM6BQQG/44wuM5vas/2g+xJKLqY5MLAfC9Q5LV3Mm9MO3TnQCAzx67GjkFJTh8tggtE2vj33f0QZtGruPWrZtwRPlI7LemtUajurGYuWQPCksqPJZbO204rp27Rle8RFcS1tDJifese6s2rLFr8wT0b9sQ43s2x5QRHSEimNSnJfq2aejyGvvdl5y1aljb9jgmSnBz31bYNWusS7nOTS/X3qs4NJ7ILSZ0AgC0
05bl7dEywWfZ5Y8PxUc+Nq+2alwvDspWQ3dN6OumXW6qaeJmHfdnru8GwDHxJ9WL03VvIrNqlhCcEWNM6ATAMtLl66nDcNcA3xtMd22egAa1fa/W+OZ9aVj662tsNWprk8uM8V3w2t39kP6n0Yiya4cZ16OZyzVStWWB69a63DqYEB+LQ3+dgIk9m3u89+T+rVyOxcU4/rn/cZxlPP6dA9pg//PjPV7rVjfXstLzBuhLQjxbPikwmNDJJrVpfd1DE/UY060pWiTWxjWpjQEAV3dMAgA8em0HjO/ZHI3rXa6RN6wT63LvuOgoXNOxMaaN7YzZk3o4nIuOElvr0MND2zmca5lYG/Nu7Y0/39gdANCndSI2PzUKPVo4Jt96WiKNEtdkb99B26henK5Nt633q65A/swpPATrV86ETkE3qH0SDv11Avq3beT2/I6Z12HDH0c6HPv8V1dj7fThiIoSTBnREQ3qeP5E0KuV+yUJErXXDO+cjCYJ8UisY2mqeePe/vjuD8NtjfvW/1xt7RYiW/HE5Rm07pqKAKBxvVq2Wj4A3NyvpccYrd570HW5hQHt3P9cALi8keh5YwGC95HeytuoJjIOP+tRSER7GeLiLln3c9O56mx8j2ZYtvMUurVIQFx0FMoqLZOhxnRrCgC4oVcLlFZU4aY+lkQ779beWLL9BMZ0awoRQe3YaMxbtR/3DU4BAKydZpkQlX+p3OE+1tBnXt8Nz325FwDw4cODMLhDkkM5X5WumCiBOJUa2K6Ry+if2GhBeaXCpD4tHI5vfnqUw/Nx3ZvhiTGpGPev9Q7HH722Ayb3b4nRL67zEZF3DevE4nxxucvx347siFYNuQqnP4K15h1r6BQ21k8fgfXTL9cMr+9lWWemQ3I9PH/z5SaZP03sCgCIihLcltba1pzSqG4cHhjSztbE0axBPHY8ex062Y2gubF3C9verHHRltdZa+i/uOZy045zMnfWsUk9zLqhm8MxESCh9uU61HXdmmLRLwfbnv98cFvcf3UKdv95LB4flYq/39LL4fVN6ltq3S0TLR3EsTFR6NLMsRkpqW4cZozvgo5N/B/T72k00YgurhukxPgak6pZ/vhQf0Lyat6tvbHnz64jpMwgEH0tejChU56NEW8AAAoLSURBVNho3agOWjuNc4/Rku5taa2xZMoQbH5qlO2Yvz58ZCAAYGjq5VmzM6/vhnfuT3NbPl7bnPuJ0alY/btrcf+QdujdqoFDmV6tEvGPW3ujbVId/O46ywxa23LGHZIw68buqBUTjaljOtmu57yGzuOjUwFcfsOxt3LqMJdj7pJcfaeO2A8eGuhSZmx3yycda7K+a6Clw9y+78NquN0uWIPbJ2HttOEuZb79/bXo2tz/xOYcu1W9WtEOnefVMaSj9zdof4zu2hRf/iZ4b2T2mNApYvRunYgmAWw77t+2EQ7/dYJDG/cvrmmHkV2aOpVriEZ14xAbHYWsORPxhLbUAWAZ6WOtqVubW27p3wprp42w1a6TtARZJ859Mlo7fYTDKJzhnZOREB+DB69x7Aze/NQot8nWXZJr67RxSQe7TcZfuasfsuZMRG3tDeVPE7sia85EzJ7UA+unj3B5UwUszTNWPVs1QNukug7nx3ZvivbJ1dvIHHD/pjW4vWPy/e1IyzITTd387rs1T7CtLgrA4+isZ2+oWYe2N9ZPUjOv7+ajZOAwoRN5EaWjKeGzx67G1mfGuD3XJCEe9w1OQauGtfH3yT3dlnnm+q746809MVQbDeQsNjrKYRROk/rx2DlrLLo5jdpxfjNzrtnX0xL7gnv6oVZMtMO5pgnx+Oq3Q5H5/DhM7GUZDurc4hIdJW6T+XOTuuNZuxE+08d2djjfMrE2FtzT3+V129z8zHq3duzg3jbzcpkXb+uNrDkTbWUeGdYe+58fj8dHd8LiKUPcTmoTAW7ua+lDublvS2x6apRLmVfv7odOTevb3sDS2vruvwFgW6EUsHRsvzDZsYls
zbThyJozEW1CuOsXO0WJgiwqSlxG8dirExdja86oiXXTRqCgxLXzcvnjw1BSUQkA2D5zDOrExeB8cRmaJsTjf5uO2coN1D6BOL9BWPsOPHVoj+7aBJuP5Nk6ld+5Pw25haUuTV7142Mchmbe2r8VPsnIRsO6jhPEJvVpgYZ14rDj+AXbsbq1YvD6vf3Rv21Dl08fMVFie6Pr09r9SCcRy5vViieGokNyPcS6qfFP0OYzWL/Pdx64Cr1mrXIo8/bP0/DgwnTb8x3PXoeE+Bjsz7mIO65qjSEdG+Pj9OMOr3F3Lyu9S21UFxM6UZjzVANsUCcWDWBpYrAO2bQ2S7x4Wx8s2nIMt/RvhYZ13M+8nTqmEyqrlG25B2dv/fwqh+fOTVFWzhPA5t7aG3Nv7Q0Atj1rs84VI1oEM8Z3QYcm9fDM4t22Joux3V0nnOllbeZy7jwGLDtw9WhxuY9jcIckfL33DGKjHBPxkI5JGNW1KXbNug49Z61C3bhoW9PNS3f2dblup6b18IfrOrscDwUmdKIrUHL9Wvj1yFSvZRrUjsXsm3p4LePN5P6t8GlGNn49sqPHMmumjcAX27IxddEOVCmF+Nho3DuorW2kkTvWpiP7jVrsPT4qFf+38yTyi8sxfZxrYm3XuC6OnC3CF78a4nB8/h19ceJCMWrHReOlO/viWF4x5q7MRJW2NYD108B1Pt5gerZM9FkmWJjQiSgo5t3aG/O0mrg3QzpY+g7u1ZpufLl7YBtUKYV7PCT9qWM6Yapd+7azxVOGILew1OV47bho23DPG3q3wIkLlzB3ZaZtSYpaMdHY/NQo26cdZ8naWkTuVhwFgHd/MQCL0o9j2c5TthFMgSbKoF3d09LSVHp6uu+CREQGKS6rQO3YaN3LM3z70xkMS032OHRWKYVX1xzC9b2au4wE0ktEMpRSbsfOsoZOROSBp6GknnjqR7ASEYfdvAKNwxaJiCIEEzoRUYRgQiciihBM6EREEYIJnYgoQjChExFFCCZ0IqIIwYRORBQhDJspKiK5AI7W8OWNAZwNYDjBwBj9Z/b4APPHaPb4AMZYXW2VUsnuThiW0P0hIumepr6aBWP0n9njA8wfo9njAxhjILHJhYgoQjChExFFiHBN6G8YHYAOjNF/Zo8PMH+MZo8PYIwBE5Zt6ERE5Cpca+hEROSECZ2IKEKEXUIXkXEikikiB0VkRgjv+46I5IjIbrtjjUTkaxE5oP3b0O7ck1qMmSIy1u54fxHZpZ2bL3q3QtEXY2sR+U5E9onIHhF53Exxiki8iGwWkR1afH82U3xOsUaLyDYR+dKMMYpIlnbt7SKSbrYYRSRRRD4VkZ+0v8fBJouvs/azs34ViMgTZoqxRpRSYfMFIBrAIQDtAcQB2AGgW4juPQxAPwC77Y69AGCG9ngGgL9rj7tpsdUC0E6LOVo7txnAYAACYDmA8QGMsTmAftrj+gD2a7GYIk7tWvW0x7EANgEYZJb4nGL9HYAPAHxp0t91FoDGTsdMEyOAhQAe0h7HAUg0U3xOsUYDOA2grVlj1P29GHXjGv7gBwNYaff8SQBPhvD+KXBM6JkAmmuPmwPIdBcXgJVa7M0B/GR3/E4Arwcx3iUAxpgxTgB1AGwFMNBs8QFoBeAbACNxOaGbLcYsuCZ0U8QIIAHAEWiDLswWn5t4rwPwvZlj1PsVbk0uLQEct3uerR0zSlOl1CkA0P5toh33FGdL7bHz8YATkRQAfWGpBZsmTq0pYzuAHABfK6VMFZ/mXwCmA6iyO2a2GBWAVSKSISKPmCzG9gByAfxHa7Z6S0Tqmig+Z3cA+FB7bNYYdQm3hO6ubcqM4y49xRmS+EWkHoDPADyhlCrwVtRDPEGLUylVqZTqA0steICI9DBTfCJyPYAcpVSG3pd4iCXYv+shSql+AMYDmCIiw7yUDXWMMbA0T76mlOoLoAiW5gtPDPv/IiJxAG4E8Imvoh5iMVVOCreEng2gtd3zVgBOGhQL
AJwRkeYAoP2box33FGe29tj5eMCISCwsyfx/SqnPzRqnUuoCgDUAxpksviEAbhSRLAAfARgpIu+bLEYopU5q/+YA+ALAABPFmA0gW/v0BQCfwpLgzRKfvfEAtiqlzmjPzRijbuGW0LcASBWRdto76x0AlhoYz1IAP9ce/xyWNmvr8TtEpJaItAOQCmCz9hGuUEQGaT3h99m9xm/aNd8GsE8p9aLZ4hSRZBFJ1B7XBjAawE9miQ8AlFJPKqVaKaVSYPn7+lYpdY+ZYhSRuiJS3/oYljbg3WaJUSl1GsBxEemsHRoFYK9Z4nNyJy43t1hjMVuM+hnVeO9HB8YEWEZvHALwdAjv+yGAUwDKYXlXfhBAEiydZwe0fxvZlX9aizETdr3eANJg+c93CMDLcOo48jPGa2D5uLcTwHbta4JZ4gTQC8A2Lb7dAGZqx00Rn5t4h+Nyp6hpYoSljXqH9rXH+v/AZDH2AZCu/a4XA2hopvi0a9cBcA5AA7tjpoqxul+c+k9EFCHCrcmFiIg8YEInIooQTOhERBGCCZ2IKEIwoRMRRQgmdCKiCMGETkQUIf4fodmCRfNFcL0AAAAASUVORK5CYII=\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "epochs=8\n",
    "batch_size = 64\n",
    "reg = 0  # 1e-3\n",
    "print_n=1000\n",
    "\n",
    "losses = train_nn(nn,train_X,y_train,optimizer,cross_entropy_grad_loss,epochs,batch_size,reg,print_n)\n",
    "\n",
    "plt.plot(losses)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 105,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.87965\n",
      "0.8585\n"
     ]
    }
   ],
   "source": [
    "print(np.mean(nn.predict(train_X)==y_train))\n",
    "test_X = X_test.reshape(-1,28,28).astype('float32')/255.0\n",
    "print(np.mean(nn.predict(test_X)==y_test))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 4.3.9 Reading and Writing Model Parameters"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 56,
   "metadata": {},
   "outputs": [],
   "source": [
    "class NeuralNetwork:  \n",
    "    def __init__(self):\n",
    "        self._layers = []\n",
    "        self._params = []\n",
    " \n",
    "    def add_layer(self, layer):\n",
    "        self._layers.append(layer)\n",
    "        if layer.params:\n",
    "            for i, _ in enumerate(layer.params):\n",
    "                self._params.append([layer.params[i], layer.grads[i]])\n",
    "    \n",
    "    def forward(self, X): \n",
    "        for layer in self._layers:\n",
    "            X = layer.forward(X) \n",
    "        return X   \n",
    "\n",
    "    def __call__(self, X):\n",
    "        return self.forward(X)\n",
    "    \n",
    "    def predict(self, X):\n",
    "        \"\"\"\n",
    "        Predict the class label(s) for input X\n",
    "        \"\"\"\n",
    "        p = self.forward(X)\n",
    "        if p.ndim == 1:     # single sample\n",
    "            return np.argmax(p)\n",
    "        # multiple samples\n",
    "        return np.argmax(p, axis=1)\n",
    "  \n",
    "   \n",
    "    def backward(self,loss_grad,reg = 0.):\n",
    "        for i in reversed(range(len(self._layers))):\n",
    "            layer = self._layers[i] \n",
    "            loss_grad = layer.backward(loss_grad)\n",
    "            layer.reg_grad(reg) \n",
    "        return loss_grad\n",
    "    \n",
    "    \n",
    "    def backpropagation(self, X, y, loss_function, reg=0):\n",
    "        \"\"\"\n",
    "        Backward pass; loss_function computes the loss and its gradient\n",
    "        with respect to the network output\n",
    "        \"\"\"\n",
    "        # Feed forward for the output\n",
    "        f = self.forward(X)\n",
    "        # gradient of the loss with respect to the output f\n",
    "        loss, loss_grad = loss_function(f, y)\n",
    "\n",
    "        # backpropagate from loss_grad, accumulating regularization gradients\n",
    "        self.zero_grad()\n",
    "        self.backward(loss_grad, reg)\n",
    "        reg_loss = self.reg_loss(reg)\n",
    "        return loss + reg_loss\n",
    "    \n",
    "    def reg_loss(self,reg):\n",
    "        reg_loss = 0\n",
    "        for i in range(len(self._layers)):\n",
    "            reg_loss+=self._layers[i].reg_loss(reg)\n",
    "        return reg_loss\n",
    "    \n",
    "    def parameters(self): \n",
    "        return self._params\n",
    "    \n",
    "    def zero_grad(self):\n",
    "        for i, _ in enumerate(self._params):\n",
    "            self._params[i][1][:] = 0\n",
    "            \n",
    "    def get_parameters(self):\n",
    "        return self._params \n",
    "    \n",
    "    \n",
    "    def save_parameters(self,filename):\n",
    "        params = {}\n",
    "        for i in range(len(self._layers)):\n",
    "            if self._layers[i].params:\n",
    "                params[i] = self._layers[i].params\n",
    "        np.save(filename, params)\n",
    "                \n",
    "        \n",
    "    def load_parameters(self, filename):\n",
    "        params = np.load(filename, allow_pickle=True)\n",
    "        count = 0\n",
    "        for i in range(len(self._layers)):\n",
    "            if self._layers[i].params:\n",
    "                layer_params = params.item().get(i)\n",
    "                self._layers[i].params = layer_params                \n",
    "                for j in range(len(layer_params)):                   \n",
    "                    self._params[count][0] = layer_params[j]\n",
    "                    count+=1  "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 57,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "p [[-1.1933103  -0.34171091]\n",
      " [-0.08202091  0.07448119]\n",
      " [ 0.71977306 -0.08260996]]\n",
      "\n",
      "p [[ 0.00687304 -0.0052508 ]]\n",
      "\n",
      "p [[ 1.25016925 -0.27714009 -0.27932718 -1.07294965]\n",
      " [-0.98489125 -0.23448778  0.31553585 -0.80151211]]\n",
      "\n",
      "p [[-0.00451004  0.01688496  0.01032032 -0.01244033]]\n",
      "\n",
      "p [[-1.1933103  -0.34171091]\n",
      " [-0.08202091  0.07448119]\n",
      " [ 0.71977306 -0.08260996]]\n",
      "\n",
      "p [[ 0.00687304 -0.0052508 ]]\n",
      "\n",
      "p [[ 1.25016925 -0.27714009 -0.27932718 -1.07294965]\n",
      " [-0.98489125 -0.23448778  0.31553585 -0.80151211]]\n",
      "\n",
      "p [[-0.00451004  0.01688496  0.01032032 -0.01244033]]\n",
      "\n"
     ]
    }
   ],
   "source": [
    "#from NeuralNetwork import *\n",
    "nn = NeuralNetwork()\n",
    "nn.add_layer(Dense(3, 2,('xavier',0.01)))\n",
    "nn.add_layer(Relu())\n",
    "nn.add_layer(Dense(2, 4,('xavier',0.01)))\n",
    "nn.add_layer(Relu())\n",
    "\n",
    "def print_nn_parameters(params,print_grad=False): \n",
    "    for p,grad in params:  \n",
    "        print(\"p\",p)\n",
    "        if print_grad:\n",
    "            print(\"grad\",grad)\n",
    "        print()    \n",
    "print_nn_parameters(nn.get_parameters())\n",
    "nn.save_parameters('model_params.npy')\n",
    "nn.load_parameters('model_params.npy')\n",
    "print_nn_parameters(nn.get_parameters())"
   ]
  },
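  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a minimal standalone sketch (independent of the NeuralNetwork class; the file name and demo dict are hypothetical): save_parameters() and load_parameters() rely on numpy's pickled-dict round trip, where np.save() writes a Python dict to a .npy file and np.load(..., allow_pickle=True).item() recovers it:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "# Hypothetical sketch of the dict round trip used by save/load_parameters\n",
    "demo_params = {0: [np.arange(6.).reshape(3, 2)], 2: [np.zeros((2, 4))]}\n",
    "np.save('demo_params.npy', demo_params)  # the dict is pickled into the file\n",
    "loaded = np.load('demo_params.npy', allow_pickle=True).item()  # .item() unwraps it\n",
    "print(np.allclose(loaded[0][0], demo_params[0][0]))  # True"
   ]
  },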
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
