{
 "cells": [
  {
   "cell_type": "markdown",
   "source": [
    "# Chapter 4: Training Neural Networks\n",
    "<p>\"Learning\" here means the process of automatically acquiring the optimal weight parameters from training data.</p>\n",
    "\n",
    "## 4.1 Learning from Data\n",
    "\n",
    "### 4.1.1 Data-Driven\n",
    "### 4.1.2 Training Set and Test Set\n",
    "\n",
    "## 4.2 Loss Functions\n",
    "<p>A neural network searches for the optimal weight parameters by using some metric as a guide; the metric used during learning is called the loss function. In other words, the network looks for the weights that minimize the loss. In principle any function could serve as the loss, but the mean squared error and the cross-entropy error are the usual choices.</p>\n",
    "\n",
    "### 4.2.1 Mean Squared Error\n",
    "<p>The mean squared error computes the squared difference between each element of the network's output and the corresponding element of the supervised (target) data, then sums these squared differences.</p>"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.09750000000000003\n",
      "0.5975\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "\n",
    "\n",
    "# A Python implementation of the mean squared error\n",
    "def mean_squared_error(y, t):\n",
    "    return 0.5 * np.sum((y - t)**2)\n",
    "\n",
    "t = [0,0,1,0,0,0,0,0,0,0]  # the correct label is '2' (one-hot)\n",
    "y = [0.1,0.05,0.6,0.0,0.05,0.1,0.0,0.1,0.0,0.0]  # '2' has the highest probability (0.6)\n",
    "print(mean_squared_error(np.array(y), np.array(t)))\n",
    "y = [0.1,0.05,0.1,0.0,0.05,0.1,0.0,0.6,0.0,0.0]  # '7' has the highest probability (0.6)\n",
    "print(mean_squared_error(np.array(y), np.array(t)))"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "markdown",
   "source": [
    "### 4.2.2 Cross-Entropy Error\n",
    "<p>The value of the cross-entropy error is determined by the network's output for the correct-label class.</p>"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.510825457099338\n",
      "2.302584092994546\n"
     ]
    }
   ],
   "source": [
    "def cross_entropy_error(y, t):\n",
    "    # y: the network's output (from the softmax function)\n",
    "    # t: the one-hot target vector from the training data\n",
    "    delta = 1e-7\n",
    "    return -np.sum(t*np.log(y+delta))  # delta guards against log(0)\n",
    "\n",
    "t = [0,0,1,0,0,0,0,0,0,0]  # the correct label is '2' (one-hot)\n",
    "y = [0.1,0.05,0.6,0.0,0.05,0.1,0.0,0.1,0.0,0.0]  # '2' has the highest probability (0.6)\n",
    "print(cross_entropy_error(np.array(y), np.array(t)))\n",
    "y = [0.1,0.05,0.1,0.0,0.05,0.1,0.0,0.6,0.0,0.0]  # '7' has the highest probability (0.6)\n",
    "print(cross_entropy_error(np.array(y), np.array(t)))"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-26T11:40:05.298400Z",
     "start_time": "2023-09-26T11:40:05.291377600Z"
    }
   }
  },
  {
   "cell_type": "markdown",
   "source": [
    "### 4.2.3 Mini-Batch Learning\n",
    "<p>In principle we should sum the loss over every training sample to obtain the total loss. When the dataset is very large, however, computing the loss over all samples is too expensive, so instead we sum the loss over only a small, randomly chosen portion of the data. That small portion is called a mini-batch.</p>"
   ],
   "metadata": {
    "collapsed": false
   }
  },
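  {
   "cell_type": "markdown",
   "source": [
    "Averaged over a mini-batch of $N$ samples, the cross-entropy error becomes (with $t_{nk}$ the one-hot target and $y_{nk}$ the network output for sample $n$, class $k$):\n",
    "\n",
    "$$E = -\\frac{1}{N}\\sum_{n}\\sum_{k} t_{nk}\\log y_{nk}$$"
   ],
   "metadata": {
    "collapsed": false
   }
  },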
  {
   "cell_type": "code",
   "execution_count": 5,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(60000, 784)\n",
      "(60000, 10)\n"
     ]
    }
   ],
   "source": [
    "from mnist import load_mnist\n",
    "(x_train,t_train),(x_test,t_test)= load_mnist(normalize=True,one_hot_label=True)\n",
    "print(x_train.shape)\n",
    "print(t_train.shape)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-26T12:09:26.766912800Z",
     "start_time": "2023-09-26T12:09:25.374109800Z"
    }
   }
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[[0. 0. 0. ... 0. 0. 0.]\n",
      " [0. 0. 0. ... 0. 0. 0.]\n",
      " [0. 0. 0. ... 0. 0. 0.]\n",
      " ...\n",
      " [0. 0. 0. ... 0. 0. 0.]\n",
      " [0. 0. 0. ... 0. 0. 0.]\n",
      " [0. 0. 0. ... 0. 0. 0.]]\n",
      "[[0. 0. 0. 1. 0. 0. 0. 0. 0. 0.]\n",
      " [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]\n",
      " [0. 0. 0. 0. 1. 0. 0. 0. 0. 0.]\n",
      " [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]\n",
      " [1. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\n",
      " [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.]\n",
      " [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.]\n",
      " [0. 0. 0. 0. 0. 0. 0. 0. 1. 0.]\n",
      " [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]\n",
      " [0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]]\n"
     ]
    }
   ],
   "source": [
    "train_size = x_train.shape[0]\n",
    "batch_size = 10\n",
    "batch_mask = np.random.choice(train_size,batch_size)\n",
    "x_batch = x_train[batch_mask]\n",
    "t_batch = t_train[batch_mask]\n",
    "print(x_batch)\n",
    "print(t_batch)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-26T12:23:43.357071900Z",
     "start_time": "2023-09-26T12:23:43.352089800Z"
    }
   }
  },
  {
   "cell_type": "markdown",
   "source": [
    "### 4.2.4 Implementing Mini-Batch Cross-Entropy Error\n"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "outputs": [
    {
     "data": {
      "text/plain": "0.510825457099338"
     },
     "execution_count": 4,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "import numpy as np\n",
    "\n",
    "# Mini-batch version: averages the cross-entropy error over the batch\n",
    "def cross_entropy_error(y, t):\n",
    "    if y.ndim == 1:\n",
    "        t = t.reshape(1, t.size)\n",
    "        y = y.reshape(1, y.size)\n",
    "    batch_size = y.shape[0]\n",
    "    return -np.sum(t*np.log(y+1e-7)) / batch_size\n",
    "\n",
    "t = [0,0,1,0,0,0,0,0,0,0]  # the correct label is '2' (one-hot)\n",
    "y = [0.1,0.05,0.6,0.0,0.05,0.1,0.0,0.1,0.0,0.0]  # '2' has the highest probability (0.6)\n",
    "\n",
    "cross_entropy_error(np.array(y), np.array(t))"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T01:40:11.290747900Z",
     "start_time": "2023-09-27T01:40:11.239763900Z"
    }
   }
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "outputs": [],
   "source": [
    "# When the supervision data is given as class labels (e.g. 2 or 7)\n",
    "# rather than as one-hot vectors\n",
    "def cross_entropy_error(y, t):\n",
    "    if y.ndim == 1:\n",
    "        t = t.reshape(1, t.size)\n",
    "        y = y.reshape(1, y.size)\n",
    "\n",
    "    batch_size = y.shape[0]\n",
    "    # pick out only the output for the correct class of each sample\n",
    "    return -np.sum(np.log(y[np.arange(batch_size), t] + 1e-7)) / batch_size\n"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T01:52:59.650488700Z",
     "start_time": "2023-09-27T01:52:59.617447700Z"
    }
   }
  },
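  {
   "cell_type": "markdown",
   "source": [
    "A quick check (an added sketch, not in the original text): for the same prediction, the label-form cross-entropy equals the one-hot form, because only the correct class's log-probability contributes."
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "outputs": [],
   "source": [
    "# Compare the label-form implementation above with the one-hot form.\n",
    "y = np.array([0.1,0.05,0.6,0.0,0.05,0.1,0.0,0.1,0.0,0.0])\n",
    "t_label = np.array([2])  # class-index form: the correct class is '2'\n",
    "t_onehot = np.array([0,0,1,0,0,0,0,0,0,0])\n",
    "print(cross_entropy_error(y, t_label))     # label form\n",
    "print(-np.sum(t_onehot*np.log(y + 1e-7)))  # one-hot form, same value\n"
   ],
   "metadata": {
    "collapsed": false
   }
  },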
  {
   "cell_type": "markdown",
   "source": [
    "## 4.3 Numerical Differentiation\n",
    "<p>The gradient method uses gradient information to decide the direction in which to move (the direction in which to adjust the parameters). Computing a derivative from a tiny finite difference is called numerical differentiation.</p>\n",
    "\n",
    "### 4.3.1 Derivatives\n",
    "<p>A derivative expresses the amount of change at a given instant: how much a \"tiny change\" in x changes f(x).</p>"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "outputs": [],
   "source": [
    "def numerical_diff(f, x):\n",
    "    h = 1e-4  # 0.0001\n",
    "    # centered difference: smaller error than the one-sided (f(x+h)-f(x))/h\n",
    "    return (f(x+h) - f(x-h)) / (2*h)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T02:13:23.609199100Z",
     "start_time": "2023-09-27T02:13:23.587829300Z"
    }
   }
  },
  {
   "cell_type": "markdown",
   "source": [
    "### 4.3.2 An Example of Numerical Differentiation"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "outputs": [
    {
     "data": {
      "text/plain": "<Figure size 432x288 with 1 Axes>",
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXgAAAEGCAYAAABvtY4XAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3deXhV1b3/8feXhAAJcwbmAGGSQcZAglKqOFzlUlGrFixSlUGtVu291uut/Vlbe68d1OvUWlFQkNEJBxxxlgqBAGEM8xSmDIwJgYQk6/dHwr2YJiFAdvY5J5/X8+Th5Ox9sr6uc/JxZ++11zLnHCIiEnrq+V2AiIh4QwEvIhKiFPAiIiFKAS8iEqIU8CIiISrc7wJOFxMT4zp16uR3GSIiQWP58uU5zrnYirYFVMB36tSJ1NRUv8sQEQkaZrazsm06RSMiEqIU8CIiIUoBLyISojwNeDNrbmZvmtkGM0s3s6FeticiIv/H64uszwAfO+duMLMIINLj9kREpIxnAW9mTYHhwK0AzrlCoNCr9kRE5Pu8PEWTAGQDr5jZSjN72cyiPGxPRERO42XAhwMDgReccwOAY8BD5Xcys8lmlmpmqdnZ2R6WIyISeJbvPMhL32zz5Gd7GfC7gd3OuZSy79+kNPC/xzk3xTmX6JxLjI2t8GYsEZGQlL7vKLe9soxZKTs5VlBU4z/fs4B3zu0HMsysR9lTlwHrvWpPRCSY7Mg5xi1TlxIZEc5rE5KIalDzl0S9HkXzC2BW2QiabcBtHrcnIhLw9h85wbipKRSXlDB38lA6tPRmgKGnAe+cSwMSvWxDRCSYHM4vZPy0FA4dK2TO5GS6xjXxrK2AmmxMRCSUHSso4tZXlrHjQD6v3jaYvu2be9qepioQEakFJ04WM3F6Kmv2HOH5sQO4qEuM520q4EVEPFZYVMLPZ61gyfYDPHljP67s3bpW2lXAi4h4qLjE8ct5aXyxIYv/uvZCrh3QrtbaVsCLiHikpMTxH2+t5oM1+3h4ZE9uToqv1fYV8CIiHnDO8bv31/Hm8t3cd1k3Jg1PqPUaFPAiIh74yycbmb54JxOHdeb+y7v5UoMCXkSkhv31yy387autjB0Sz8P/2hMz86UOBbyISA169R/b+csnGxndvy1/uLaPb+EOCngRkRrzemoGj76/nit6teKJG/sRVs+/cAcFvIhIjViwei8PvbWaH3SL4fmbB1A/zP949b8CEZEg98WGTO6fm8agji148ZZBNAgP87skQAEvInJevt2czZ0zV9CzTVOm3jqYyIjAmeJLAS8ico6+25rDxOmpJMREMeP2ITRtWN/vkr5HAS8icg6Wbj/IhFdTiW8ZyayJSbSIivC7pH+igBcROUvLdx7itleW0qZ5Q2ZNSiK6cQO/S6qQAl5E5CysyjjMrdOWEtukAXMmJRPXpKHfJVVKAS8iUk1r9xzhlqkpNI+qz+xJybRqGrjhDgp4EZFqSd93lHFTU2jSsD6zJybTtnkjv0s6IwW8iMgZbM7MZdzLKTQMD2P2pCTPFsmuaQp4EZEqbM3OY+xLKdSrZ8yelETH6Ci/S6o2BbyISCV25Bzj5peWAI45k5JIiG3sd0lnRQEvIlKBjIP53PzSEgqLSpg1MZmucU38LumsBc49tSIiASLjYD5jpizhWGExsycl0aN18IU7KOBFRL5n14F8xkxZzLHCYmZNTKJ322Z+l3TOPA14M9sB5ALFQJFzLtHL9kREzsfOA8cYO2UJ+SdLw71Pu+ANd6idI/hLnXM5tdCOiMg525FzjLEvLeHEyWJmT0ymV9umfpd03nSKRkTqvO05pUfuhcUlzJ6UTM82wR/u4P0oGgd8ambLzWxyRTuY2WQzSzWz1OzsbI/LERH5vm3ZeYyZsrgs3JNCJtzB+4C/2Dk3ELgauNvMhpffwTk3xTmX6JxLjI2N9bgcEZH/szU7jzFTllBU
7JgzKZkLWodOuIPHAe+c21v2bxYwHxjiZXsiItW1Jas03EucY87k5KAdClkVzwLezKLMrMmpx8CVwFqv2hMRqa4tWbmMmbIE52DOpGS6twq9cAdvL7K2Auab2al2ZjvnPvawPRGRM9qcmcvYl5ZgZsyZlEzXuOCafuBseBbwzrltQD+vfr6IyNnauD+Xn75cN8IdNBeNiNQRa/cc4SdTFhNWz5g7OfTDHRTwIlIHLN95iLEvLSEqIpzX7xhKlyCbFfJc6UYnEQlpi7ceYML0ZcQ1acCsScm0C4KVmGqKAl5EQtbXm7KZPCOV+JaRzJqYRFyAr6Fa0xTwIhKSFq7P5O5ZK+gS15iZE4YQ3biB3yXVOgW8iIScBav3cv/cNHq3a8aM24bQLLK+3yX5QhdZRSSkvLV8N/fOWcmA+ObMnFB3wx10BC8iIWRWyk4enr+Wi7tG89L4RCIj6nbE1e3/ehEJGVMXbeexBesZcUEcf/vpQBrWD/O7JN8p4EUk6P31yy385ZONXN2nNc+MGUBEuM4+gwJeRIKYc44/fryBF7/exrX92/LEjf0ID1O4n6KAF5GgVFzi+M07a5izNINxyfH8/po+1KtnfpcVUBTwIhJ0CotK+OXraXyweh93X9qFB67sQdnMtXIaBbyIBJXjhcXcOXM5X2/K5tcjL2Dy8C5+lxSwFPAiEjSOHD/JhFeXsWLXIf704wv5yeB4v0sKaAp4EQkK2bkFjJ+2lC1ZuTx/80BGXtjG75ICngJeRALe7kP5jHs5hcyjBUz92WCGd4/1u6SgoIAXkYC2JSuXcS8vJb+wiJkTkxjUsYXfJQUNBbyIBKzVuw/zs2lLCatXj3l3DKVnm6Z+lxRUFPAiEpCWbDvAxOmpNI+sz8wJSXSKifK7pKCjgBeRgPPRmn3cNy+Nji0jeW1CEq2b1a2FOmqKAl5EAsprS3byyLtrGdChOdNuHUzzyAi/SwpaCngRCQjOOZ5auInnvtjC5T3jeG7sQBpFaEbI86GAFxHfFRWX8Jt31jJ3WQY/SezAf13XR5OG1QDPA97MwoBUYI9zbpTX7YlIcDleWMwv5qzks/RMfjGiK/92RXfNK1NDauMI/j4gHdD4JhH5nsP5hUyYnsqKXYd4bHRvbhnaye+SQoqnfwOZWXvgX4GXvWxHRILP3sPHueHvi1mz+wh/u3mgwt0DXh/BPw08CDSpbAczmwxMBoiP18RBInXBpsxcxk9dyrGCImZMGEJyQrTfJYUkz47gzWwUkOWcW17Vfs65Kc65ROdcYmys5pcQCXXLdhzkhhe+o8Q5Xr9zqMLdQ14ewV8MXGNmI4GGQFMzm+mcG+dhmyISwD5eu5/75q6kXYtGzLh9CO1bRPpdUkjz7AjeOfefzrn2zrlOwBjgC4W7SN01ddF27pq1nF5tm/LmnRcp3GuBxsGLiKeKSxyPLVjPq9/t4KrerXl6TH8a1tcNTLWhVgLeOfcV8FVttCUigeN4YTH3zl3JwvWZTBjWmV+P7EmYFsauNTqCFxFPZOcWMHH6MlbvOcKjP+rFrRd39rukOkcBLyI1bmt2Hre+spTs3AJeHDeIK3u39rukOkkBLyI1aun2g0yakUr9MGPu5KH079Dc75LqLAW8iNSY91bt5YHXV9G+ZSNevXUI8dEaKeMnBbyInDfnHC98vZU/f7yRIZ1bMuWWQZrHPQAo4EXkvJwsLuGRd9cxZ+kurunXlr/c2JcG4RoGGQgU8CJyzo7kn+Tu2StYtCWHuy7pwq+u7EE9DYMMGAp4ETknO3KOcfv0ZWQczOfPN/TlpsQOfpck5SjgReSsLd56gLtmlc4jOHNCEkmaMCwgKeBF5KzMW7aLh+evpWN0JNNuHUzH6Ci/S5JKKOBFpFqKSxx/+ngDU77Zxg+6xfD8zQNp1qi+32VJFRTwInJGeQVF3D93JZ+lZzF+aEceGdVLi2IHAQW8iFRpz+HjTHh1GZuz8vj96N6M19J6QUMBLyKVWrHr
EJNnLKfgZDGv3DqY4d216lowUcCLSIXeTdvDr95cTeumDZkzKYlurSpdWlkClAJeRL6nuMTxl0828vevtzKkU0v+fssgWkZp2oFgpIAXkf915PhJ7pu7kq82ZnNzUjyP/qg3EeG6mBqsFPAiAsCWrDwmzUgl42A+f7i2D+OSO/pdkpwnBbyI8Hl6JvfPTSMivB6zJyUzpHNLv0uSGqCAF6nDnHP87autPPHpRnq3bcqLtyTSrnkjv8uSGqKAF6mj8guL+NUbq/lgzT5G92/LH6/vS6MITfMbShTwInVQxsF8Js1IZVNmLr8eeQGTfpCAmab5DTUKeJE65rutOdw9awXFJY5XbhvCD3XzUsiqVsCbWRxwMdAWOA6sBVKdcyUe1iYiNcg5xyv/2MF/fZhO55goXhqfSOcYzQQZyqoMeDO7FHgIaAmsBLKAhsC1QBczexN40jl3tILXNgS+ARqUtfOmc+63NVu+iFTHsYIiHnp7De+v2ssVvVrx1E39aNJQM0GGujMdwY8EJjnndpXfYGbhwCjgCuCtCl5bAIxwzuWZWX1gkZl95Jxbcr5Fi0j1bc3O487XlrM1O48Hr+rBncO7aFm9OqLKgHfO/aqKbUXAO1Vsd0Be2bf1y77cOdQoIufo47X7eeCNVUSE1+O1CUlc3DXG75KkFlXrHmQze83Mmp32fScz+7warwszszRKT+0sdM6lVLDPZDNLNbPU7Ozss6ldRCpRVFzC4x+lc+fM5XSJa8yCXwxTuNdB1Z1kYhGQYmYjzWwS8Cnw9Jle5Jwrds71B9oDQ8ysTwX7THHOJTrnEmNjdTVf5Hzl5BVwy9SlvPj1NsYlx/P6Hcm01c1LdVK1RtE45140s3XAl0AOMMA5t7+6jTjnDpvZV8BVlI7AEREPrNh1iJ/PXMGh/EKeuLEfNwxq73dJ4qPqnqK5BZgGjAdeBT40s35neE2smTUve9wIuBzYcF7VikiFnHPMWLyDn7y4mPrhxts/v0jhLtW+0enHwDDnXBYwx8zmUxr0A6p4TRtgupmFUfo/ktedcwvOp1gR+Wf5hUX8Zv5a3l65hxEXxPE/N/WnWaSGQEr1T9FcW+77pWaWdIbXrKbq/wGIyHnanJnLz2etYEt2Hv92RXfuubSrhkDK/6ryFI2Z/cbMKpw31DlXaGYjzGyUN6WJSFXeWr6ba57/B4fyC3nt9iTuvaybwl2+50xH8GuA983sBLACyKb0TtZuQH/gM+C/Pa1QRL7neGExj7y7ljeW7yY5oSXPjhlAXNOGfpclAehMAX+Dc+5iM3uQ0rHsbYCjwExgsnPuuNcFisj/2ZJVekpmc1Ye947oyn2XdydMR+1SiTMF/CAz6wj8FLi03LZGlE48JiK14O0Vu3l4/loiI8KYcfsQftBN941I1c4U8H8HPgYSgNTTnjdKpx1I8KguESlzvLCYR99bx7zUDJI6t+TZsQNopVMyUg1nmovmWeBZM3vBOXdXLdUkImW2ZOVy96yVbMrK5RcjunLfZd0ID6vuDehS11V3mKTCXaQWOeeYtyyDR99fR1REONNvG8JwLcwhZ0krOokEmCPHT/Lrt9fwwZp9DOsaw1M39dMoGTknCniRAJK64yD3zU0j8+gJHrr6Aib/IEFj2+WcKeBFAkBxieOvX27h6c820aFlJG/edRH9OzT3uywJcgp4EZ/tPXyc++elsXT7Qa4b0I7fj+6t5fSkRijgRXz08dr9/MdbqykqLuGpm/px/UDNACk1RwEv4oP8wiL+8EE6s1N2cWG7Zjw7dgCdY6L8LktCjAJepJalZRzml/PS2HHgGHcMT+Dfr+xBRLjGtkvNU8CL1JKi4hKe/3ILz32xhdZNGzJnUjLJCdF+lyUhTAEvUgu25xzj/nlprMo4zHUD2vG70b1pqgup4jEFvIiHnHPMWZrBYwvWExFej+dvHsCovm39LkvqCAW8iEeycwt46K3VfL4hi2FdY3jixn60bqY7UqX2KOBFPLBw
fSYPvbWa3IIiHhnVi1sv6qQ7UqXWKeBFatCR/JP8bsE63l6xh55tmjJnTH+6t2rid1lSRyngRWrIlxuzeOit1eTkFXLviK7cM6Kbhj+KrxTwIucp98RJ/rAgnXmpGXSLa8xL4xPp217zyIj/FPAi52HR5hwefHMV+4+e4M4fduH+y7vRsH6Y32WJAAp4kXNyrKCIxz9KZ+aSXSTERvHmXRcxML6F32WJfI9nAW9mHYAZQGugBJjinHvGq/ZEasuSbQf41Zur2H3oOBOHdeaBf+mho3YJSF4ewRcB/+6cW2FmTYDlZrbQObfewzZFPJN74iR//GgDs1J20TE6ktfvGMrgTi39LkukUp4FvHNuH7Cv7HGumaUD7QAFvASdz9Mz+c07a8k8eoKJwzrzb1d2JzJCZzglsNXKJ9TMOgEDgJQKtk0GJgPEx8fXRjki1XYgr4Dfvb+e91btpUerJrwwbpBWWpKg4XnAm1lj4C3gfufc0fLbnXNTgCkAiYmJzut6RKrDOce7aXv53fvryCso4peXd+euS7poXLsEFU8D3szqUxrus5xzb3vZlkhN2Xv4OA/PX8OXG7MZEN+cP/24r+5GlaDk5SgaA6YC6c65p7xqR6SmlJQ4ZqXs5I8fbaDEwSOjevGzizoRpjlkJEh5eQR/MXALsMbM0sqe+7Vz7kMP2xQ5J+n7jvLr+WtYuesww7rG8Pj1F9KhZaTfZYmcFy9H0SwCdOgjAS2/sIinP9vM1EXbad6oPk/d1I/rBrSj9A9QkeCmcV5SZ322PpPfvreOPYePM2ZwBx66+gKaR0b4XZZIjVHAS52z78hxHn1vHZ+sy6R7q8a8caduWJLQpICXOqOouITpi3fy1KcbKXaOB6/qwcRhCRr6KCFLAS91wspdh/h/765l7Z6jXNIjlsdG99FFVAl5CngJaQfyCvjTxxt4PXU3cU0a8NebBzLywta6iCp1ggJeQlJRcQmzUnbx5KcbyS8s5o7hCfzism40bqCPvNQd+rRLyFm24yCPvLuO9H1HGdY1hkev6U3XuMZ+lyVS6xTwEjKyjp7g8Y82MH/lHto2a8gLPx3IVX10OkbqLgW8BL2TxSVM/24HT3+2mcKiEu65tCs/v7SLpvOVOk+/ARK0nHN8uTGLP3yQzrbsY1zSI5bf/qg3nWOi/C5NJCAo4CUobcrM5bEF6/l2cw4JMVG8PD6Ry3rG6XSMyGkU8BJUDh4r5H8WbmL20l1ERYTx/0b14pbkjrpZSaQCCngJCoVFJcxYvINnPt9MfmEx45Liuf/y7rSI0twxIpVRwEtAc86xcH0m//1hOjsO5HNJj1geHtmTblqAQ+SMFPASsFZlHObxj9JZsu0gXeMa88ptg7m0R5zfZYkEDQW8BJydB47x50828sHqfURHRfD70b0ZOySe+mE6zy5yNhTwEjBy8gp47vPNzErZRf2wetw7oiuThifQpGF9v0sTCUoKePFdfmERL3+7nSnfbOP4yWJ+MrgD91/WjbimDf0uTSSoKeDFN0XFJcxLzeDpzzaTnVvAv/RuxYNXXUCXWM0bI1ITFPBS60pKHB+s2cf/fLaJbdnHSOzYgr+PG8igjlpVSaQmKeCl1pwa8vjUwk1s2J9L91aNmXLLIK7o1Up3oIp4QAEvnnPO8e3mHJ78dCOrdh+hc0wUz4zpz6i+bQmrp2AX8YoCXjyVsu0AT366iaU7DtKueSP+fENfrh/QjnANeRTxnAJePJGWcZgnP93It5tziGvSgMdG9+amwR1oEB7md2kidYYCXmrU8p2HeO6LzXy1MZuWURE8PLIn45I70ihCwS5S2zwLeDObBowCspxzfbxqRwJDyrYDPPfFFhZtyaFlVAQPXtWD8UM7aQ1UER95+dv3KvA8MMPDNsRHzjkWbz3AM59vJmX7QWIaN+DhkT35aXK8VlMSCQCe/RY6574xs05e/Xzxz6lRMc9+vpnUnYdo1bQBv/1RL8YOiadhfZ2KEQkUvh9m
mdlkYDJAfHy8z9VIVUpKHAvTM3nhq62kZRymbbOGPDa6NzcmdlCwiwQg3wPeOTcFmAKQmJjofC5HKlBQVMw7K/fw4jfb2JZ9jA4tG/H49Rfy44HttZKSSADzPeAlcOWeOMnslF1M+8d2Mo8W0LttU54bO4Cr+7TWOHaRIKCAl3+SlXuCV/6xg5lLdpJ7ooiLu0bzxI39GNY1RlMKiAQRL4dJzgEuAWLMbDfwW+fcVK/ak/O3NTuPl7/dzlsrdnOyuISRfdpwxw8T6Nu+ud+licg58HIUzVivfrbUHOcci7bkMG3Rdr7cmE1EeD1+PLA9k4cn0Dkmyu/yROQ86BRNHXXiZOmF02n/2M6mzDxiGjfgl5d35+akeGKbNPC7PBGpAQr4Oibr6AleW7KTWSm7OHiskF5tmvLEjf34Ub82midGJMQo4OuIVRmHefW7HSxYvZeiEscVPVtx+7DOJHVuqQunIiFKAR/CjhcW8/6qvcxM2cnq3UeIighjXHJHbr2oEx2jdX5dJNQp4EPQtuw8ZqXs4o3UDI6eKKJ7q8Y8Nro31w5oR5OG9f0uT0RqiQI+RBQVl/BZeiYzl+xi0ZYc6ocZV/Vpw7ikeIboNIxInaSAD3K7D+XzRupu5i3LYP/RE7Rt1pAHruzOTYM7ENekod/liYiPFPBBqKComE/XZfJ6agaLtuQAMKxrDL8f3ZsRF8RpGgERARTwQSV931HmLcvgnbQ9HM4/Sbvmjbh3RDduTGxP+xaRfpcnIgFGAR/gjp44yXtpe3k9NYPVu48QEVaPK3q34ieJHbi4awxh9XRuXUQqpoAPQIVFJXyzKZv5aXv4bH0mBUUlXNC6CY+M6sV1A9rRIirC7xJFJAgo4AOEc46VGYd5Z+Ue3l+1l0P5J2kZFcGYwR24fmB7+rZvppEwInJWFPA+255zjHdW7uGdtD3sPJBPg/B6XNGrFdcNaMfw7rHU1wVTETlHCngf7D18nA/X7GPB6n2kZRzGDIYmRHPPpV25qk9r3YwkIjVCAV9L9h05zodr9vPB6r2s2HUYgF5tmvKfV1/ANf3b0qZZI58rFJFQo4D30P4jJ/hwzT4+WLOP5TsPAaWh/qt/6cHIC9tovnUR8ZQCvobtyDnGwvWZfLJuP6llod6zTVMeuLI7Iy9sQ0JsY58rFJG6QgF/nkpKHGm7D7NwfSafrc9kc1YeUBrq/35Fd0b2bUMXhbqI+EABfw5OnCzmu605paGenkV2bgFh9Yykzi25OSmey3u2okNL3VkqIv5SwFdTxsF8vt6UzVcbs/luaw75hcVERYRxSY84rujVikt7xNEsUqNfRCRwKOArceJkMSnbD/L1xmy+2pTFtuxjALRv0YjrB7bj8p6tGNolWsvciUjAUsCXcc6xNTuPbzfn8NXGbJZsO0BBUQkR4fVITohmXFJHftgjloSYKN1RKiJBoc4GvHOOXQfzWbz1AN9tPcDibQfIzi0AICEmirFD4rmkRyxJnaNpFKGjdBEJPnUq4PcdOc53W0rDfPHWA+w5fByA2CYNGJoQzUVdormoSwzx0bpAKiLBz9OAN7OrgGeAMOBl59wfvWzvdCUljs1ZeaTuPMjyHYdI3XmIXQfzAWgRWZ/khGju/GECQ7tE0yW2sU67iEjI8SzgzSwM+CtwBbAbWGZm7znn1nvR3vHCYtIyDrN850FSdx5ixc5DHD1RBEBM4wgGdWzB+KEduahLDBe0bkI9zaMuIiHOyyP4IcAW59w2ADObC4wGajTgC4qKuenFJazbc4SiEgdAt7jG/GvfNgzq2JLEji3oGB2pI3QRqXO8DPh2QMZp3+8GksrvZGaTgckA8fHxZ91Ig/AwOkdHcnGXaBI7tWBgfAuaR2pBDBERLwO+okNm909PODcFmAKQmJj4T9ur4+kxA87lZSIiIc3L1SR2Ax1O+749sNfD9kRE5DReBvwyoJuZdTazCGAM8J6H7YmIyGk8O0XjnCsys3uATygdJjnNObfO
q/ZEROT7PB0H75z7EPjQyzZERKRiWtFZRCREKeBFREKUAl5EJEQp4EVEQpQ5d073FnnCzLKBnef48hggpwbLqSmq6+wFam2q6+yorrN3LrV1dM7FVrQhoAL+fJhZqnMu0e86ylNdZy9Qa1NdZ0d1nb2ark2naEREQpQCXkQkRIVSwE/xu4BKqK6zF6i1qa6zo7rOXo3WFjLn4EVE5PtC6QheREROo4AXEQlRQRXwZnaVmW00sy1m9lAF283Mni3bvtrMBtZSXR3M7EszSzezdWZ2XwX7XGJmR8wsrezrkVqqbYeZrSlrM7WC7bXeZ2bW47R+SDOzo2Z2f7l9aq2/zGyamWWZ2drTnmtpZgvNbHPZvy0qeW2Vn0kP6vqLmW0oe6/mm1nzSl5b5fvuQV2Pmtme096vkZW8trb7a95pNe0ws7RKXutlf1WYD7XyGXPOBcUXpVMObwUSgAhgFdCr3D4jgY8oXU0qGUippdraAAPLHjcBNlVQ2yXAAh/6bQcQU8V2X/qs3Pu6n9KbNXzpL2A4MBBYe9pzfwYeKnv8EPCnSmqv8jPpQV1XAuFlj/9UUV3Ved89qOtR4IFqvNe12l/ltj8JPOJDf1WYD7XxGQumI/j/XcTbOVcInFrE+3SjgRmu1BKguZm18bow59w+59yKsse5QDqla9IGA1/67DSXAVudc+d6B/N5c859Axws9/RoYHrZ4+nAtRW8tDqfyRqtyzn3qXOuqOzbJZSulFarKumv6qj1/jrFzAy4CZhTU+1VVxX54PlnLJgCvqJFvMuHaHX28ZSZdQIGACkVbB5qZqvM7CMz611LJTngUzNbbqULnJfnd5+NofJfOj/665RWzrl9UPoLCsRVsI/ffXc7pX99VeRM77sX7ik7dTStktMNfvbXD4BM59zmSrbXSn+VywfPP2PBFPDVWcS7Wgt9e8XMGgNvAfc7546W27yC0tMQ/YDngHdqqayLnXMDgauBu81seLntvvWZlS7leA3wRgWb/eqvs+Fn3z0MFAGzKtnlTO97TXsB6AL0B/ZRejqkPD9/P8dS9dG75/11hnyo9GUVPFftPgumgK/OIt6+LfRtZvUpffNmOefeLr/dOXfUOZdX9vhDoL6ZxXhdl3Nub9m/WQ3uDPoAAAJZSURBVMB8Sv/kO52fi6NfDaxwzmWW3+BXf50m89SpqrJ/syrYx5e+M7OfAaOAn7qyE7XlVeN9r1HOuUznXLFzrgR4qZL2/OqvcOB6YF5l+3jdX5Xkg+efsWAK+Oos4v0eML5sZEgycOTUn0BeKju/NxVId849Vck+rcv2w8yGUNr3BzyuK8rMmpx6TOkFurXldvOlz8pUelTlR3+V8x7ws7LHPwPerWCfWl9Y3syuAv4DuMY5l1/JPtV532u6rtOv21xXSXu13l9lLgc2OOd2V7TR6/6qIh+8/4x5cdXYqy9KR3xsovSq8sNlz90J3Fn22IC/lm1fAyTWUl3DKP2zaTWQVvY1slxt9wDrKL0KvgS4qBbqSihrb1VZ24HUZ5GUBnaz057zpb8o/Z/MPuAkpUdME4Bo4HNgc9m/Lcv2bQt8WNVn0uO6tlB6TvbU5+zv5euq7H33uK7Xyj4/qykNoDaB0F9lz7966nN12r612V+V5YPnnzFNVSAiEqKC6RSNiIicBQW8iEiIUsCLiIQoBbyISIhSwIuIhCgFvIhIiFLAi4iEKAW8SCXMbHDZ5FkNy+52XGdmffyuS6S6dKOTSBXM7A9AQ6ARsNs597jPJYlUmwJepApl838sA05QOl1Csc8liVSbTtGIVK0l0JjSlXga+lyLyFnREbxIFczsPUpX0elM6QRa9/hckki1hftdgEigMrPxQJFzbraZhQHfmdkI59wXftcmUh06ghcRCVE6By8iEqIU8CIiIUoBLyISohTwIiIhSgEvIhKiFPAiIiFKAS8iEqL+Py3Z/7D0OmBhAAAAAElFTkSuQmCC\n"
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "def function_1(x):\n",
    "    return 0.01*x**2+0.1*x\n",
    "\n",
    "# Plot the function\n",
    "import numpy as np\n",
    "import matplotlib.pyplot as plt\n",
    "x = np.arange(0.0,20.0,0.1)\n",
    "y = function_1(x)\n",
    "plt.xlabel(\"x\")\n",
    "plt.ylabel(\"f(x)\")\n",
    "plt.plot(x,y)\n",
    "plt.show()"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T02:19:10.349688Z",
     "start_time": "2023-09-27T02:19:07.518385600Z"
    }
   }
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "derivative at x=5: 0.1999999999990898\n",
      "derivative at x=10: 0.2999999999986347\n"
     ]
    }
   ],
   "source": [
    "print('derivative at x=5:', numerical_diff(function_1, 5))\n",
    "print('derivative at x=10:', numerical_diff(function_1, 10))"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T02:20:54.226509200Z",
     "start_time": "2023-09-27T02:20:54.207465Z"
    }
   }
  },
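  {
   "cell_type": "markdown",
   "source": [
    "As a sanity check (an added sketch, not in the original text), the numerical results can be compared against the analytic derivative $f'(x) = 0.02x + 0.1$:"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "outputs": [],
   "source": [
    "# The analytic derivative of f(x) = 0.01x^2 + 0.1x is f'(x) = 0.02x + 0.1,\n",
    "# so the numerical values should be very close to 0.2 (x=5) and 0.3 (x=10).\n",
    "def analytic_diff(x):\n",
    "    return 0.02*x + 0.1\n",
    "\n",
    "for x in (5, 10):\n",
    "    print(x, numerical_diff(function_1, x), analytic_diff(x))\n"
   ],
   "metadata": {
    "collapsed": false
   }
  },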
  {
   "cell_type": "markdown",
   "source": [
    "### 4.3.3 Partial Derivatives\n",
    "<p>A partial derivative holds the other variables fixed and differentiates with respect to a single target variable.</p>"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "6.00000000000378\n",
      "7.999999999999119\n"
     ]
    }
   ],
   "source": [
    "# partial derivative of f(x0, x1) = x0**2 + x1**2 with respect to x0 at (3, 4):\n",
    "# fix x1 = 4 and differentiate the resulting one-variable function\n",
    "def function_tmp1(x0):\n",
    "    return x0*x0 + 4.0**2.0\n",
    "\n",
    "print(numerical_diff(function_tmp1, 3))\n",
    "\n",
    "# partial derivative with respect to x1 at (3, 4): fix x0 = 3\n",
    "def function_tmp2(x1):\n",
    "    return 3.0**2.0 + x1*x1\n",
    "\n",
    "print(numerical_diff(function_tmp2, 4))"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T02:34:02.661037600Z",
     "start_time": "2023-09-27T02:34:02.639012600Z"
    }
   }
  },
  {
   "cell_type": "markdown",
   "source": [
    "## 4.4 Gradients\n",
    "<p>The vector that collects the partial derivatives with respect to all of the variables is called the gradient.</p>"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[6. 8.]\n",
      "[0. 4.]\n",
      "[6. 0.]\n"
     ]
    }
   ],
   "source": [
    "def function_2(x):\n",
    "    return x[0]**2 + x[1]**2\n",
    "\n",
    "def numerical_gradient(f, x):\n",
    "    h = 1e-4  # 0.0001\n",
    "    grad = np.zeros_like(x)  # array of zeros with the same shape as x\n",
    "    for idx in range(x.size):\n",
    "        tmp_val = x[idx]\n",
    "        # compute f(x+h)\n",
    "        x[idx] = tmp_val + h\n",
    "        fxh1 = f(x)\n",
    "        # compute f(x-h)\n",
    "        x[idx] = tmp_val - h\n",
    "        fxh2 = f(x)\n",
    "\n",
    "        grad[idx] = (fxh1 - fxh2) / (2*h)\n",
    "        x[idx] = tmp_val  # restore the original value\n",
    "\n",
    "    return grad\n",
    "\n",
    "# compute some gradients\n",
    "print(numerical_gradient(function_2, np.array([3.0, 4.0])))\n",
    "print(numerical_gradient(function_2, np.array([0.0, 2.0])))\n",
    "print(numerical_gradient(function_2, np.array([3.0, 0.0])))"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T02:57:54.101415100Z",
     "start_time": "2023-09-27T02:57:54.053405500Z"
    }
   }
  },
  {
   "cell_type": "markdown",
   "source": [
    "### 4.4.1 Gradient Method\n",
    "<p>In the gradient method, we move a certain distance from the current position along the gradient direction, recompute the gradient at the new position, move again along the new gradient, and so on. Gradually reducing the function's value by repeatedly stepping along the gradient in this way is the gradient method.</p>"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "outputs": [
    {
     "data": {
      "text/plain": "array([-6.11110793e-10,  8.14814391e-10])"
     },
     "execution_count": 11,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "def gradient_descent(f, init_x, lr=0.01, step_num=100):\n",
    "    x = init_x\n",
    "    for i in range(step_num):\n",
    "        grad = numerical_gradient(f, x)\n",
    "        x -= lr*grad\n",
    "    return x\n",
    "\n",
    "# Find the minimum of function_2 by gradient descent\n",
    "init_x = np.array([-3.0, 4.0])\n",
    "gradient_descent(function_2, init_x=init_x, lr=0.1, step_num=100)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T04:16:50.992332100Z",
     "start_time": "2023-09-27T04:16:50.664336700Z"
    }
   }
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "outputs": [],
   "source": [
    "# A learning rate that is too large or too small is problematic.\n",
    "# Note: gradient_descent updates its argument in place, so pass a fresh\n",
    "# array each time; reusing the mutated init_x gives misleading results.\n",
    "print(gradient_descent(function_2, init_x=np.array([-3.0, 4.0]), lr=10, step_num=100))     # too large: diverges\n",
    "print(gradient_descent(function_2, init_x=np.array([-3.0, 4.0]), lr=1e-10, step_num=100))  # too small: barely moves"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T04:19:12.116168500Z",
     "start_time": "2023-09-27T04:19:12.082145200Z"
    }
   }
  },
  {
   "cell_type": "markdown",
   "source": [
    "### 4.4.2 Gradients for a Neural Network"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[[-1.67418753 -0.36928904  1.07259889]\n",
      " [-0.67516453  0.5515613   0.26371204]]\n",
      "[-1.6121606   0.27483174  0.88090017]\n",
      "2\n",
      "0.487442937024093\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "from common.functions import softmax,cross_entropy_error\n",
    "from common.gradient import numerical_gradient\n",
    "\n",
    "class simpleNet:\n",
    "    def __init__(self):\n",
    "        self.W = np.random.randn(2,3)  # initialize with a Gaussian distribution\n",
    "\n",
    "    def predict(self,x):\n",
    "        return np.dot(x,self.W)\n",
    "\n",
    "    def loss(self,x,t):\n",
    "        z = self.predict(x)\n",
    "        y = softmax(z)\n",
    "        loss = cross_entropy_error(y,t)\n",
    "\n",
    "        return loss\n",
    "\n",
    "\n",
    "net = simpleNet()\n",
    "print(net.W)  # the weight parameters\n",
    "x = np.array([0.6, 0.9])\n",
    "p = net.predict(x)\n",
    "print(p)\n",
    "print(np.argmax(p))  # index of the maximum value\n",
    "t = np.array([0, 0, 1])\n",
    "print(net.loss(x, t))"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T04:48:20.331058600Z",
     "start_time": "2023-09-27T04:48:20.102350500Z"
    }
   }
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[[ 0.03046034  0.20102273 -0.23148307]\n",
      " [ 0.04569052  0.30153409 -0.3472246 ]]\n"
     ]
    }
   ],
   "source": [
    "# numerical_gradient passes the parameter array to f; the dummy argument W\n",
    "# is unused here because the loss reads net.W directly, which\n",
    "# numerical_gradient perturbs in place.\n",
    "def f(W):\n",
    "    return net.loss(x, t)\n",
    "\n",
    "dW = numerical_gradient(f, net.W)\n",
    "print(dW)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-27T04:55:34.582200300Z",
     "start_time": "2023-09-27T04:55:34.565208Z"
    }
   }
  },
  {
   "cell_type": "markdown",
   "source": [
    "## 4.5 Implementing the Learning Algorithm\n",
    "<p>Premise: a neural network has adjustable weights and biases, and tuning them to fit the training data is called \"learning\". Learning proceeds in four steps:</p>\n",
    "<ol>\n",
    "    <li>(Mini-batch) Randomly select a portion of the training data, called a mini-batch. The goal is to reduce the value of the loss function on this mini-batch.</li>\n",
    "    <li>(Compute gradients) To reduce the mini-batch loss, compute the gradient of the loss with respect to each weight parameter. The gradient gives the direction in which the loss decreases the most.</li>\n",
    "    <li>(Update parameters) Move the weight parameters a small step along the gradient direction.</li>\n",
    "    <li>(Repeat) Repeat steps 1, 2, and 3.</li>\n",
    "</ol>\n",
    "<p>Because the data used here is a randomly selected mini-batch, this procedure is called stochastic gradient descent (SGD): gradient descent on randomly chosen data.</p>\n",
    "\n",
    "### 4.5.1 A Class for a Two-Layer Network"
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "W1:(784, 100)\n",
      "b1:(100,)\n",
      "W2:(100, 10)\n",
      "b2:(10,)\n"
     ]
    }
   ],
   "source": [
    "import sys\n",
    "import numpy as np\n",
    "from common.functions import *\n",
    "from common.gradient import numerical_gradient\n",
    "\n",
    "def sigmoid(x):\n",
    "    return 1/(1+np.exp(-x))\n",
    "\n",
    "class TwoLayerNet:\n",
    "    def __init__(self,input_size,hidden_size,output_size,weight_init_std=0.01):\n",
    "        # initialize the weights\n",
    "        self.params = {}\n",
    "        self.params['W1'] = weight_init_std *(np.random.randn(input_size,hidden_size))\n",
    "        self.params['b1'] = np.zeros(hidden_size)\n",
    "        self.params['W2'] = weight_init_std*np.random.randn(hidden_size,output_size)\n",
    "        self.params['b2'] = np.zeros(output_size)\n",
    "\n",
    "    def predict(self,x):\n",
    "        W1,W2 = self.params['W1'],self.params['W2']\n",
    "        b1,b2 = self.params['b1'],self.params['b2']\n",
    "\n",
    "        a1 = np.dot(x,W1) + b1\n",
    "        z1 = sigmoid(a1)\n",
    "        a2 = np.dot(z1,W2)+b2\n",
    "        y = softmax(a2)\n",
    "\n",
    "        return y\n",
    "\n",
    "    # x: input data, t: supervision (target) data\n",
    "    def loss(self, x, t):\n",
    "        y = self.predict(x)\n",
    "        return cross_entropy_error(y, t)\n",
    "\n",
    "    def accuracy(self,x,t):\n",
    "        y = self.predict(x)\n",
    "        y = np.argmax(y,axis=1)\n",
    "        t = np.argmax(t,axis=1)\n",
    "\n",
    "        accuracy = np.sum(y==t)/float(x.shape[0])\n",
    "        return accuracy\n",
    "\n",
    "    # x: input data, t: supervision (target) data\n",
    "    def numerical_gradient(self,x,t):\n",
    "        loss_W = lambda W:self.loss(x,t)\n",
    "\n",
    "        grads = {}\n",
    "        grads['W1'] = numerical_gradient(loss_W,self.params['W1'])\n",
    "        grads['b1'] = numerical_gradient(loss_W,self.params['b1'])\n",
    "        grads['W2'] = numerical_gradient(loss_W,self.params['W2'])\n",
    "        grads['b2'] = numerical_gradient(loss_W,self.params['b2'])\n",
    "\n",
    "        return grads\n",
    "\n",
    "# The params attribute holds all of this network's parameters\n",
    "net = TwoLayerNet(input_size=784,hidden_size=100,output_size=10)\n",
    "print('W1:'+str(net.params['W1'].shape))\n",
    "print('b1:'+str(net.params['b1'].shape))\n",
    "print('W2:'+str(net.params['W2'].shape))\n",
    "print('b2:'+str(net.params['b2'].shape))"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-28T02:28:00.986959700Z",
     "start_time": "2023-09-28T02:28:00.966957700Z"
    }
   }
  },
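  {
   "cell_type": "markdown",
   "source": [
    "The four learning steps from Section 4.5 can be sketched as a training loop. This is an illustrative sketch, not the original text's code: it assumes x_train and t_train from the earlier load_mnist call, and it runs only a few iterations because numerical gradients are very slow."
   ],
   "metadata": {
    "collapsed": false
   }
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "outputs": [],
   "source": [
    "# Illustrative SGD loop: mini-batch -> gradient -> update -> repeat\n",
    "iters_num = 3       # kept tiny; numerical gradients are very slow\n",
    "batch_size = 100\n",
    "learning_rate = 0.1\n",
    "train_size = x_train.shape[0]\n",
    "\n",
    "for i in range(iters_num):\n",
    "    # step 1: draw a mini-batch\n",
    "    batch_mask = np.random.choice(train_size, batch_size)\n",
    "    x_batch = x_train[batch_mask]\n",
    "    t_batch = t_train[batch_mask]\n",
    "    # step 2: compute the gradients\n",
    "    grad = net.numerical_gradient(x_batch, t_batch)\n",
    "    # step 3: update the parameters\n",
    "    for key in ('W1', 'b1', 'W2', 'b2'):\n",
    "        net.params[key] -= learning_rate * grad[key]\n",
    "    # step 4: repeat; the loss should trend downward\n",
    "    print(net.loss(x_batch, t_batch))\n"
   ],
   "metadata": {
    "collapsed": false
   }
  },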
  {
   "cell_type": "code",
   "execution_count": 4,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[[0.10475243 0.11032539 0.09116674 0.10220091 0.10172536 0.09842263\n",
      "  0.10179765 0.09026611 0.09485547 0.1044873 ]\n",
      " [0.1035365  0.10974417 0.09003455 0.10285809 0.10145122 0.09912368\n",
      "  0.10327253 0.09103757 0.09645629 0.10248538]\n",
      " [0.10427155 0.10975969 0.09070582 0.10195828 0.10215042 0.09859317\n",
      "  0.10280475 0.09048789 0.09634961 0.10291881]\n",
      " [0.10461388 0.1100126  0.09046154 0.10225756 0.10194716 0.09893651\n",
      "  0.10323114 0.09042767 0.09545887 0.10265307]\n",
      " [0.1043753  0.11189079 0.09066916 0.10229771 0.10066781 0.09944436\n",
      "  0.10135108 0.08981215 0.09651174 0.10297989]\n",
      " [0.10341266 0.11032167 0.0917084  0.10270671 0.10262031 0.0994829\n",
      "  0.10198429 0.09168853 0.09389383 0.1021807 ]\n",
      " [0.10338262 0.10974415 0.09099714 0.10275952 0.10070994 0.10080189\n",
      "  0.10349676 0.08963532 0.09488161 0.10359105]\n",
      " [0.10271374 0.10934638 0.09166805 0.10223093 0.10257697 0.09874987\n",
      "  0.10262728 0.09183149 0.09553212 0.10272316]\n",
      " [0.10422833 0.10986638 0.09060966 0.10246165 0.10206345 0.09936966\n",
      "  0.1019011  0.0906077  0.09526297 0.10362909]\n",
      " [0.10481478 0.11046706 0.09084182 0.10327483 0.10156638 0.09947642\n",
      "  0.10045479 0.09045316 0.09488404 0.10376672]\n",
      " [0.10464123 0.11011565 0.09111092 0.10228776 0.10187109 0.09929453\n",
      "  0.10198501 0.08999728 0.09502249 0.10367405]\n",
      " [0.10265426 0.11037246 0.09094085 0.10243242 0.10173967 0.09967983\n",
      "  0.1026304  0.09033328 0.09579978 0.10341705]\n",
      " [0.10315069 0.11022803 0.09031923 0.10199797 0.1021078  0.10021113\n",
      "  0.10149043 0.09108739 0.09578362 0.10362371]\n",
      " [0.10283844 0.11069859 0.0904629  0.10350083 0.10166266 0.09943702\n",
      "  0.10218424 0.09097352 0.09563919 0.10260262]\n",
      " [0.1043777  0.10929246 0.09149496 0.10368731 0.10221421 0.09916937\n",
      "  0.10178925 0.08958512 0.09551957 0.10287005]\n",
      " [0.10280384 0.11032906 0.090817   0.10279734 0.10109634 0.09933361\n",
      "  0.10263361 0.0903554  0.09519689 0.10463692]\n",
      " [0.10368671 0.11142831 0.09093746 0.10185652 0.10132438 0.09866431\n",
      "  0.10283384 0.09015832 0.09572463 0.10338552]\n",
      " [0.10489815 0.11077875 0.09065465 0.10139836 0.10192664 0.0991868\n",
      "  0.10287624 0.08963614 0.09528237 0.1033619 ]\n",
      " [0.10420491 0.1099665  0.09051587 0.10243189 0.10181251 0.09953366\n",
      "  0.1023229  0.09040052 0.09480674 0.10400451]\n",
      " [0.10406843 0.11013897 0.09108994 0.10208308 0.101984   0.09798063\n",
      "  0.10263654 0.09065935 0.09595661 0.10340245]\n",
      " [0.10319096 0.11120567 0.09109553 0.10141546 0.10208178 0.10018073\n",
      "  0.10206245 0.0896435  0.09608123 0.1030427 ]\n",
      " [0.10351104 0.1113891  0.09073463 0.10167654 0.10197315 0.09817344\n",
      "  0.10108831 0.09130259 0.09519704 0.10495417]\n",
      " [0.10458107 0.1088155  0.09130262 0.10282286 0.10194858 0.09966765\n",
      "  0.10227685 0.09043791 0.09549504 0.10265193]\n",
      " [0.10352591 0.11078743 0.09053554 0.10278998 0.1011535  0.09844194\n",
      "  0.10321092 0.09082881 0.09481424 0.10391173]\n",
      " [0.10339856 0.11021162 0.09070257 0.10278546 0.10259479 0.09898489\n",
      "  0.10230561 0.09013295 0.09618813 0.10269541]\n",
      " [0.10366811 0.1096337  0.09077123 0.10345812 0.10086313 0.09872732\n",
      "  0.10299674 0.09138063 0.09523144 0.10326957]\n",
      " [0.10378588 0.1107346  0.09088546 0.10273982 0.10156676 0.09950639\n",
      "  0.10209201 0.0906062  0.09581109 0.1022718 ]\n",
      " [0.1035688  0.1105209  0.09095069 0.10202819 0.10211765 0.09914229\n",
      "  0.10215335 0.09090553 0.0954094  0.1032032 ]\n",
      " [0.10438881 0.11026087 0.09056395 0.10204933 0.10225847 0.09851115\n",
      "  0.10170869 0.09110774 0.09564247 0.10350852]\n",
      " [0.10389384 0.1093214  0.09111229 0.10252702 0.1019395  0.09833497\n",
      "  0.10291505 0.0902013  0.09489208 0.10486253]\n",
      " [0.10466655 0.11052095 0.08938725 0.1039091  0.10179802 0.09838936\n",
      "  0.10327756 0.08967462 0.09541237 0.10296422]\n",
      " [0.10357414 0.10970395 0.09047525 0.10210259 0.10180427 0.09959486\n",
      "  0.1024331  0.09042923 0.09565883 0.10422378]\n",
      " [0.10441783 0.11136575 0.09003303 0.10199737 0.10143998 0.09862094\n",
      "  0.10217295 0.09002946 0.09565127 0.10427142]\n",
      " [0.10440687 0.11039055 0.09047685 0.10277307 0.10212112 0.09879487\n",
      "  0.10249496 0.09100604 0.09480997 0.10272569]\n",
      " [0.10329751 0.11029664 0.09151522 0.10192856 0.10223413 0.09885582\n",
      "  0.10268283 0.09001469 0.09593735 0.10323724]\n",
      " [0.10400024 0.10964907 0.09157687 0.10228843 0.10274021 0.09873275\n",
      "  0.1012711  0.09096581 0.09500726 0.10376827]\n",
      " [0.10363503 0.10954912 0.0901675  0.10358137 0.10249135 0.09893078\n",
      "  0.10229011 0.09006716 0.09610254 0.10318504]\n",
      " [0.10353825 0.11002649 0.09096671 0.10301948 0.10216411 0.09976927\n",
      "  0.10224562 0.09029026 0.09511455 0.10286525]\n",
      " [0.10333028 0.11135996 0.09012713 0.10245599 0.10247937 0.09854407\n",
      "  0.10127335 0.0908257  0.0959018  0.10370236]\n",
      " [0.10408614 0.11001089 0.09082123 0.10199186 0.1013451  0.09870208\n",
      "  0.10304359 0.09086839 0.09569573 0.10343499]\n",
      " [0.10448373 0.10956834 0.09109137 0.10252902 0.10171055 0.09954913\n",
      "  0.10123324 0.09155611 0.09524005 0.10303848]\n",
      " [0.10531896 0.11038825 0.08968524 0.10474456 0.10115735 0.09880792\n",
      "  0.10202189 0.08946472 0.09569211 0.10271902]\n",
      " [0.10532127 0.11098999 0.09101921 0.10220661 0.1015384  0.09949424\n",
      "  0.10118248 0.09030496 0.09500715 0.1029357 ]\n",
      " [0.10433584 0.10967881 0.09079376 0.10264823 0.10218454 0.09849071\n",
      "  0.10263859 0.09104214 0.09529462 0.10289277]\n",
      " [0.1045104  0.11057233 0.08984459 0.1031083  0.10139243 0.09912815\n",
      "  0.10226395 0.09142076 0.09513716 0.10262193]\n",
      " [0.10416255 0.10996333 0.09023574 0.10299911 0.10137649 0.0987501\n",
      "  0.10316677 0.09089829 0.09492576 0.10352186]\n",
      " [0.10419314 0.11108174 0.09037712 0.1015066  0.10191754 0.09940866\n",
      "  0.10202693 0.08992237 0.0957005  0.1038654 ]\n",
      " [0.10510157 0.11139313 0.09097012 0.10215962 0.10115224 0.09880071\n",
      "  0.10187949 0.08979854 0.09499164 0.10375296]\n",
      " [0.10418419 0.11063127 0.09110997 0.10199541 0.10196606 0.0992782\n",
      "  0.10317563 0.09067674 0.09439213 0.1025904 ]\n",
      " [0.10407127 0.11071184 0.09058088 0.10230399 0.10087937 0.09837232\n",
      "  0.10150681 0.09158422 0.09619333 0.10379598]\n",
      " [0.10295814 0.10932188 0.09124172 0.10331585 0.10164219 0.09887545\n",
      "  0.10195478 0.09156369 0.09527219 0.10385412]\n",
      " [0.10462087 0.11120427 0.09043846 0.10314669 0.1017122  0.09956296\n",
      "  0.10106232 0.09032206 0.09533565 0.10259451]\n",
      " [0.10478662 0.11002353 0.09091465 0.10266173 0.10193535 0.09879765\n",
      "  0.10107059 0.09112731 0.09619131 0.10249127]\n",
      " [0.1040528  0.11124057 0.0907927  0.10239839 0.10157594 0.09779931\n",
      "  0.10193362 0.09057997 0.09524043 0.10438628]\n",
      " [0.10387813 0.11076344 0.09101507 0.10196485 0.10080264 0.09946225\n",
      "  0.10190652 0.08974523 0.09594396 0.10451791]\n",
      " [0.10527368 0.11049735 0.09044584 0.10271988 0.10098357 0.09843609\n",
      "  0.10249036 0.09102245 0.09580415 0.10232664]\n",
      " [0.1034583  0.11113979 0.09032961 0.10304439 0.102546   0.09887716\n",
      "  0.10237238 0.09025217 0.09522717 0.10275302]\n",
      " [0.10432727 0.11120457 0.09103767 0.10240671 0.10038742 0.09943862\n",
      "  0.10201162 0.08995821 0.09543656 0.10379135]\n",
      " [0.10372263 0.11099401 0.09096883 0.10154967 0.10221896 0.09798726\n",
      "  0.1028396  0.09079768 0.09501802 0.10390334]\n",
      " [0.10444344 0.11058785 0.0918952  0.10316926 0.10121523 0.09816018\n",
      "  0.10109458 0.09060503 0.095035   0.10379423]\n",
      " [0.10424899 0.1099443  0.09069835 0.10284261 0.10136047 0.0992455\n",
      "  0.10260211 0.090554   0.09581781 0.10268585]\n",
      " [0.10487411 0.11089144 0.09088564 0.10262544 0.10151151 0.09940361\n",
      "  0.10112122 0.09035821 0.09537666 0.10295216]\n",
      " [0.10431611 0.11109486 0.08988285 0.10269105 0.10207149 0.09899598\n",
      "  0.10169894 0.0908124  0.09480742 0.10362889]\n",
      " [0.10473278 0.11019417 0.0904397  0.10294308 0.10206122 0.09972422\n",
      "  0.10090505 0.09035995 0.09533235 0.10330749]\n",
      " [0.10462267 0.11003199 0.0910865  0.10247379 0.10164395 0.09878352\n",
      "  0.10246289 0.0910272  0.09507533 0.10279215]\n",
      " [0.10415747 0.11110352 0.08974352 0.10304395 0.10053445 0.0997816\n",
      "  0.1030039  0.08990784 0.0962408  0.10248294]\n",
      " [0.10323446 0.11081612 0.09046369 0.10275962 0.1019168  0.09904972\n",
      "  0.10095495 0.09123604 0.09611922 0.10344938]\n",
      " [0.1043016  0.10931233 0.09069161 0.10213649 0.10266706 0.09952658\n",
      "  0.10201104 0.09083773 0.09568971 0.10282585]\n",
      " [0.10310045 0.11051696 0.08980249 0.10110039 0.10199698 0.10006998\n",
      "  0.10417103 0.08963059 0.09599125 0.10361989]\n",
      " [0.10356368 0.11175553 0.09056612 0.10219904 0.10161463 0.09938574\n",
      "  0.1020776  0.09093016 0.09624845 0.10165905]\n",
      " [0.10376508 0.11159219 0.09096411 0.10234603 0.10068477 0.10041677\n",
      "  0.10166433 0.08972536 0.09438618 0.10445517]\n",
      " [0.10377626 0.10967679 0.09085077 0.1028287  0.10206117 0.09912414\n",
      "  0.10271726 0.09023575 0.09573043 0.10299873]\n",
      " [0.10436464 0.11104977 0.09082663 0.10187843 0.10171873 0.09881087\n",
      "  0.10161066 0.09155605 0.09536897 0.10281525]\n",
      " [0.10332606 0.11085701 0.0912249  0.10240569 0.1013304  0.09937256\n",
      "  0.10198735 0.09018294 0.09546243 0.10385066]\n",
      " [0.10407814 0.11014438 0.08955842 0.10198872 0.10218928 0.09967087\n",
      "  0.1019673  0.09047244 0.09665413 0.10327631]\n",
      " [0.10398569 0.11014538 0.09006463 0.1034191  0.10163895 0.09912027\n",
      "  0.10286475 0.0900211  0.09459428 0.10414585]\n",
      " [0.10337129 0.11093684 0.09061512 0.10237342 0.1013738  0.09941097\n",
      "  0.10253557 0.09079494 0.09512549 0.10346255]\n",
      " [0.10357431 0.10983015 0.09157241 0.1028499  0.1014104  0.09953168\n",
      "  0.10172683 0.09044623 0.09559428 0.10346381]\n",
      " [0.10449211 0.11020539 0.09157053 0.1019706  0.10047816 0.0996118\n",
      "  0.10364634 0.09116157 0.09513409 0.10172941]\n",
      " [0.1038912  0.10849174 0.09064823 0.10336141 0.10181121 0.09980589\n",
      "  0.10328806 0.08943218 0.09486556 0.10440451]\n",
      " [0.10473682 0.1106233  0.08947272 0.10295348 0.1017979  0.09859307\n",
      "  0.10140212 0.09116971 0.09560614 0.10364474]\n",
      " [0.10459324 0.11003591 0.09068703 0.10286206 0.10044873 0.09947927\n",
      "  0.10331941 0.08912302 0.09533233 0.10411899]\n",
      " [0.10431976 0.11016401 0.09184719 0.10107835 0.10200438 0.09915134\n",
      "  0.10307319 0.08962942 0.09500528 0.10372708]\n",
      " [0.10363577 0.10916449 0.09057876 0.10357133 0.10210362 0.09854829\n",
      "  0.10298823 0.09084527 0.09544481 0.10311942]\n",
      " [0.10465499 0.11063132 0.09044634 0.10225081 0.10162575 0.09834738\n",
      "  0.10301338 0.09068448 0.09436216 0.1039834 ]\n",
      " [0.1056647  0.11097671 0.08970911 0.10220163 0.10190718 0.09833504\n",
      "  0.10223055 0.09072233 0.09526247 0.10299027]\n",
      " [0.10359966 0.11108518 0.09002111 0.10364869 0.1003232  0.09923047\n",
      "  0.10190766 0.09090764 0.09517489 0.1041015 ]\n",
      " [0.10266131 0.11134042 0.09039563 0.10302849 0.10171348 0.09969543\n",
      "  0.10198629 0.09075109 0.09569916 0.1027287 ]\n",
      " [0.10334286 0.11024932 0.09046824 0.10313894 0.10172645 0.09923922\n",
      "  0.10161205 0.09134392 0.09524748 0.10363152]\n",
      " [0.10397292 0.11008401 0.09058779 0.10318949 0.10208942 0.09944372\n",
      "  0.10147805 0.09079563 0.09588012 0.10247886]\n",
      " [0.1041701  0.10998597 0.09081786 0.10328924 0.10106105 0.09909907\n",
      "  0.10216473 0.09038846 0.09576707 0.10325645]\n",
      " [0.10373037 0.10943909 0.09123816 0.10237005 0.1027302  0.09886812\n",
      "  0.10260389 0.08943874 0.09581463 0.10376674]\n",
      " [0.10384644 0.11074844 0.09050041 0.10105787 0.10202763 0.09901448\n",
      "  0.1014182  0.0914227  0.09563056 0.10433328]\n",
      " [0.10256771 0.11057417 0.09161307 0.10263958 0.10142023 0.09910988\n",
      "  0.10277919 0.09037879 0.09583713 0.10308025]\n",
      " [0.10348135 0.11101411 0.09063354 0.10261542 0.10164334 0.09870694\n",
      "  0.1016997  0.09136518 0.09491909 0.10392133]\n",
      " [0.10326383 0.11106203 0.09074444 0.10202384 0.1020241  0.09899067\n",
      "  0.10204865 0.09084416 0.09564271 0.10335556]\n",
      " [0.10302012 0.1102459  0.09043438 0.10218448 0.10243741 0.09895195\n",
      "  0.10207522 0.09173609 0.09622358 0.10269086]\n",
      " [0.10354985 0.11010542 0.09170386 0.10202922 0.10184862 0.09959006\n",
      "  0.10281084 0.08960265 0.09489627 0.10386321]\n",
      " [0.10347121 0.11030155 0.09082301 0.10255215 0.10172342 0.09893067\n",
      "  0.10291291 0.09127915 0.09527283 0.10273312]\n",
      " [0.10458327 0.11083858 0.09034364 0.10300173 0.1010854  0.1001627\n",
      "  0.10145534 0.09001315 0.09629958 0.10221661]]\n",
      "y的形状 (100, 10)\n"
     ]
    }
   ],
   "source": [
    "# 做推理\n",
    "x = np.random.randn(100,784)\n",
    "y = net.predict(x)\n",
    "print(y)\n",
    "print('y的形状',y.shape)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-28T02:30:49.975550400Z",
     "start_time": "2023-09-28T02:30:49.875544500Z"
    }
   }
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "W1的梯度: (784, 100)\n",
      "b1的梯度: (100,)\n",
      "W2的梯度: (100, 10)\n",
      "b2的梯度: (10,)\n"
     ]
    }
   ],
   "source": [
    "# grad属性中保留了所有参数的梯度\n",
    "x = np.random.rand(100,784) # 伪输入数据\n",
    "t = np.random.rand(100,10) # 伪正解数据\n",
    "grads = net.numerical_gradient(x,t) # 计算梯度\n",
    "\n",
    "print('W1的梯度:',grads['W1'].shape) #(784,100)\n",
    "print('b1的梯度:',grads['b1'].shape)\n",
    "print('W2的梯度:',grads['W2'].shape)\n",
    "print('b2的梯度:',grads['b2'].shape)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-28T02:42:55.883854200Z",
     "start_time": "2023-09-28T02:41:11.508132300Z"
    }
   }
  },
  {
   "cell_type": "markdown",
   "source": [
    "### 4.5.2.mini-batch的实现\n",
    "<p>所谓mini-batch学习,就是从训练数据中随机选择一部分数据(称为mini-batch),再以这些mini-batch为对象,使用梯度法更新参数的过程</p>"
   ],
   "metadata": {
    "collapsed": false
   }
  },
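  {
   "cell_type": "code",
   "execution_count": null,
   "outputs": [],
   "source": [
    "# Sketch of how mini-batch sampling works (sizes here are illustrative,\n",
    "# matching MNIST): np.random.choice(N, k) draws k random indices from\n",
    "# [0, N); indexing the training arrays with that index array yields the\n",
    "# mini-batch used in the training loop below.\n",
    "import numpy as np\n",
    "\n",
    "train_size = 60000\n",
    "batch_size = 10\n",
    "batch_mask = np.random.choice(train_size, batch_size)\n",
    "print(batch_mask)        # 10 random indices into the training set\n",
    "print(batch_mask.shape)  # (10,)"
   ],
   "metadata": {
    "collapsed": false
   }
  },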
  {
   "cell_type": "code",
   "execution_count": 8,
   "outputs": [
    {
     "ename": "ValueError",
     "evalue": "x and y must have same first dimension, but have shapes (1,) and (100,)",
     "output_type": "error",
     "traceback": [
      "\u001B[1;31m---------------------------------------------------------------------------\u001B[0m",
      "\u001B[1;31mValueError\u001B[0m                                Traceback (most recent call last)",
      "\u001B[1;32m<ipython-input-8-86029a77a811>\u001B[0m in \u001B[0;36m<module>\u001B[1;34m\u001B[0m\n\u001B[0;32m     29\u001B[0m \u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0;32m     30\u001B[0m \u001B[0mx_index\u001B[0m \u001B[1;33m=\u001B[0m \u001B[0mlen\u001B[0m\u001B[1;33m(\u001B[0m\u001B[0mtrain_loss_list\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[1;32m---> 31\u001B[1;33m \u001B[0mplt\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mplot\u001B[0m\u001B[1;33m(\u001B[0m\u001B[0mx_index\u001B[0m\u001B[1;33m,\u001B[0m\u001B[0mtrain_loss_list\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0m\u001B[0;32m     32\u001B[0m \u001B[0mplt\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mxlabel\u001B[0m\u001B[1;33m(\u001B[0m\u001B[1;34m\"iter_num\"\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0;32m     33\u001B[0m \u001B[0mplt\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mylabel\u001B[0m\u001B[1;33m(\u001B[0m\u001B[1;34m'loss value'\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n",
      "\u001B[1;32mE:\\myAnaconda\\lib\\site-packages\\matplotlib\\pyplot.py\u001B[0m in \u001B[0;36mplot\u001B[1;34m(scalex, scaley, data, *args, **kwargs)\u001B[0m\n\u001B[0;32m   2759\u001B[0m \u001B[1;33m@\u001B[0m\u001B[0mdocstring\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mcopy\u001B[0m\u001B[1;33m(\u001B[0m\u001B[0mAxes\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mplot\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0;32m   2760\u001B[0m \u001B[1;32mdef\u001B[0m \u001B[0mplot\u001B[0m\u001B[1;33m(\u001B[0m\u001B[1;33m*\u001B[0m\u001B[0margs\u001B[0m\u001B[1;33m,\u001B[0m \u001B[0mscalex\u001B[0m\u001B[1;33m=\u001B[0m\u001B[1;32mTrue\u001B[0m\u001B[1;33m,\u001B[0m \u001B[0mscaley\u001B[0m\u001B[1;33m=\u001B[0m\u001B[1;32mTrue\u001B[0m\u001B[1;33m,\u001B[0m \u001B[0mdata\u001B[0m\u001B[1;33m=\u001B[0m\u001B[1;32mNone\u001B[0m\u001B[1;33m,\u001B[0m \u001B[1;33m**\u001B[0m\u001B[0mkwargs\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m:\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[1;32m-> 2761\u001B[1;33m     return gca().plot(\n\u001B[0m\u001B[0;32m   2762\u001B[0m         *args, scalex=scalex, scaley=scaley, **({\"data\": data} if data\n\u001B[0;32m   2763\u001B[0m         is not None else {}), **kwargs)\n",
      "\u001B[1;32mE:\\myAnaconda\\lib\\site-packages\\matplotlib\\axes\\_axes.py\u001B[0m in \u001B[0;36mplot\u001B[1;34m(self, scalex, scaley, data, *args, **kwargs)\u001B[0m\n\u001B[0;32m   1645\u001B[0m         \"\"\"\n\u001B[0;32m   1646\u001B[0m         \u001B[0mkwargs\u001B[0m \u001B[1;33m=\u001B[0m \u001B[0mcbook\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mnormalize_kwargs\u001B[0m\u001B[1;33m(\u001B[0m\u001B[0mkwargs\u001B[0m\u001B[1;33m,\u001B[0m \u001B[0mmlines\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mLine2D\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[1;32m-> 1647\u001B[1;33m         \u001B[0mlines\u001B[0m \u001B[1;33m=\u001B[0m \u001B[1;33m[\u001B[0m\u001B[1;33m*\u001B[0m\u001B[0mself\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0m_get_lines\u001B[0m\u001B[1;33m(\u001B[0m\u001B[1;33m*\u001B[0m\u001B[0margs\u001B[0m\u001B[1;33m,\u001B[0m \u001B[0mdata\u001B[0m\u001B[1;33m=\u001B[0m\u001B[0mdata\u001B[0m\u001B[1;33m,\u001B[0m \u001B[1;33m**\u001B[0m\u001B[0mkwargs\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m]\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0m\u001B[0;32m   1648\u001B[0m         \u001B[1;32mfor\u001B[0m \u001B[0mline\u001B[0m \u001B[1;32min\u001B[0m \u001B[0mlines\u001B[0m\u001B[1;33m:\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0;32m   1649\u001B[0m             \u001B[0mself\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0madd_line\u001B[0m\u001B[1;33m(\u001B[0m\u001B[0mline\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n",
      "\u001B[1;32mE:\\myAnaconda\\lib\\site-packages\\matplotlib\\axes\\_base.py\u001B[0m in \u001B[0;36m__call__\u001B[1;34m(self, *args, **kwargs)\u001B[0m\n\u001B[0;32m    214\u001B[0m                 \u001B[0mthis\u001B[0m \u001B[1;33m+=\u001B[0m \u001B[0margs\u001B[0m\u001B[1;33m[\u001B[0m\u001B[1;36m0\u001B[0m\u001B[1;33m]\u001B[0m\u001B[1;33m,\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0;32m    215\u001B[0m                 \u001B[0margs\u001B[0m \u001B[1;33m=\u001B[0m \u001B[0margs\u001B[0m\u001B[1;33m[\u001B[0m\u001B[1;36m1\u001B[0m\u001B[1;33m:\u001B[0m\u001B[1;33m]\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[1;32m--> 216\u001B[1;33m             \u001B[1;32myield\u001B[0m \u001B[1;32mfrom\u001B[0m \u001B[0mself\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0m_plot_args\u001B[0m\u001B[1;33m(\u001B[0m\u001B[0mthis\u001B[0m\u001B[1;33m,\u001B[0m \u001B[0mkwargs\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0m\u001B[0;32m    217\u001B[0m \u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0;32m    218\u001B[0m     \u001B[1;32mdef\u001B[0m \u001B[0mget_next_color\u001B[0m\u001B[1;33m(\u001B[0m\u001B[0mself\u001B[0m\u001B[1;33m)\u001B[0m\u001B[1;33m:\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n",
      "\u001B[1;32mE:\\myAnaconda\\lib\\site-packages\\matplotlib\\axes\\_base.py\u001B[0m in \u001B[0;36m_plot_args\u001B[1;34m(self, tup, kwargs)\u001B[0m\n\u001B[0;32m    340\u001B[0m \u001B[1;33m\u001B[0m\u001B[0m\n\u001B[0;32m    341\u001B[0m         \u001B[1;32mif\u001B[0m \u001B[0mx\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mshape\u001B[0m\u001B[1;33m[\u001B[0m\u001B[1;36m0\u001B[0m\u001B[1;33m]\u001B[0m \u001B[1;33m!=\u001B[0m \u001B[0my\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mshape\u001B[0m\u001B[1;33m[\u001B[0m\u001B[1;36m0\u001B[0m\u001B[1;33m]\u001B[0m\u001B[1;33m:\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n\u001B[1;32m--> 342\u001B[1;33m             raise ValueError(f\"x and y must have same first dimension, but \"\n\u001B[0m\u001B[0;32m    343\u001B[0m                              f\"have shapes {x.shape} and {y.shape}\")\n\u001B[0;32m    344\u001B[0m         \u001B[1;32mif\u001B[0m \u001B[0mx\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mndim\u001B[0m \u001B[1;33m>\u001B[0m \u001B[1;36m2\u001B[0m \u001B[1;32mor\u001B[0m \u001B[0my\u001B[0m\u001B[1;33m.\u001B[0m\u001B[0mndim\u001B[0m \u001B[1;33m>\u001B[0m \u001B[1;36m2\u001B[0m\u001B[1;33m:\u001B[0m\u001B[1;33m\u001B[0m\u001B[1;33m\u001B[0m\u001B[0m\n",
      "\u001B[1;31mValueError\u001B[0m: x and y must have same first dimension, but have shapes (1,) and (100,)"
     ]
    },
    {
     "data": {
      "text/plain": "<Figure size 432x288 with 1 Axes>",
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAD8CAYAAAB0IB+mAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAANQklEQVR4nO3cX2id933H8fdndg3rnzWhUUtnp9QbTlNfNCNR0zDWLV3ZamcXptCLpKVhoWDCmtLLhMHai9ysF4NSktSYYEJv6os1tO5IGwajzSBLFxlSJ05I0VwWay7EaUsHKSw4+e7inE1Cka3H5xxJjr7vFwj0nOcn6asf8tuPj3WeVBWSpO3vd7Z6AEnS5jD4ktSEwZekJgy+JDVh8CWpCYMvSU2sG/wkx5K8nOS5i5xPkm8kWUxyKsmNsx9TkjStIVf4jwAHLnH+ILBv/HYY+Ob0Y0mSZm3d4FfVE8CvLrHkEPCtGnkKuCrJ+2c1oCRpNnbO4HPsBs6uOF4aP/aL1QuTHGb0rwDe8Y533HT99dfP4MtLUh8nT558parmJvnYWQQ/azy25v0aquoocBRgfn6+FhYWZvDlJamPJP856cfO4rd0loBrVxzvAc7N4PNKkmZoFsE/Adw5/m2dW4DfVNWbns6RJG2tdZ/SSfJt4FbgmiRLwFeBtwFU1RHgMeA2YBH4LXDXRg0rSZrcusGvqjvWOV/AF2c2kSRpQ/hKW0lqwuBLUhMGX5KaMPiS1ITBl6QmDL4kNWHwJakJgy9JTRh8SWrC4EtSEwZfkpow+JLUhMGXpCYMviQ1YfAlqQmDL0lNGHxJasLgS1ITBl+SmjD4ktSEwZekJgy+JDVh8CWpCYMvSU0YfElqwuBLUhMGX5KaMPiS1ITBl6QmDL4kNWHwJakJgy9JTRh8SWrC4EtSEwZfkpoYFPwkB5K8mGQxyX1rnH93ku8n+WmS00numv2okqRprBv8JDuAB4GDwH7gjiT7Vy37IvB8Vd0A3Ar8Q5JdM55VkjSFIVf4NwOLVXWmql4DjgOHVq0p4F1JArwT+BVwYaaTSpKmMiT4u4GzK46Xxo+t9ADwYeAc8Czw5ap6Y/UnSnI4yUKShfPnz084siRpEkOCnzUeq1XHnwKeAX4f+CPggSS/96YPqjpaVfNVNT83N3fZw0qSJjck+EvAtSuO9zC6kl/pLuDRGlkEfg5cP5sRJUmzMCT4TwP7kuwd/0fs7cCJVWteAj4JkOR9wIeAM7McVJI0nZ3rLaiqC0nuAR4HdgDHqup0krvH548A9wOPJHmW0VNA91bVKxs4tyTpMq0bfICqegx4bNVjR1a8fw74y9mOJkmaJV9pK0lNGHxJasLgS1ITBl+SmjD4ktSEwZekJgy+JDVh8CWpCYMvSU0YfElqwuBLUhMGX5KaMPiS1ITBl6QmDL4kNWHwJakJgy9JTRh8SWrC4EtSEwZfkpow+JLUhMGXpCYMviQ1YfAlqQmDL0lNGHxJasLgS1ITBl+SmjD4ktSEwZekJgy+JDVh8CWpCYMvSU0YfElqYlDwkxxI8mKSxST3XWTNrUmeSXI6yY9nO6YkaVo711uQZAfwIPAXwBLwdJITVfX8ijVXAQ8BB6rqpSTv3aiBJUmTGXKFfzOwWFVnquo14DhwaNWazwKPVtVLAFX18mzHlCRNa0jwdwNnVxwvjR9b6Trg6iQ/SnIyyZ1rfaIkh5MsJFk4f/78ZBNLkiYyJPhZ47FadbwTuAn4K+BTwN8lue5NH1R1tKrmq2p+bm7usoeVJE1u3efwGV3RX7vieA9wbo01r1TVq8CrSZ4AbgB+NpMpJUlTG3KF/zSwL8neJLuA24ETq9Z8D/h4kp1J3g58DHhhtqNKkqax7hV+VV1Icg/wOLADOFZVp5PcPT5/pKpeSPJD4BTwBvBwVT23kYNLki5PqlY/Hb855ufna2FhYUu+tiS9VSU5WVXz
k3ysr7SVpCYMviQ1YfAlqQmDL0lNGHxJasLgS1ITBl+SmjD4ktSEwZekJgy+JDVh8CWpCYMvSU0YfElqwuBLUhMGX5KaMPiS1ITBl6QmDL4kNWHwJakJgy9JTRh8SWrC4EtSEwZfkpow+JLUhMGXpCYMviQ1YfAlqQmDL0lNGHxJasLgS1ITBl+SmjD4ktSEwZekJgy+JDVh8CWpiUHBT3IgyYtJFpPcd4l1H03yepLPzG5ESdIsrBv8JDuAB4GDwH7gjiT7L7Lua8Djsx5SkjS9IVf4NwOLVXWmql4DjgOH1lj3JeA7wMsznE+SNCNDgr8bOLvieGn82P9Lshv4NHDkUp8oyeEkC0kWzp8/f7mzSpKmMCT4WeOxWnX8deDeqnr9Up+oqo5W1XxVzc/NzQ2dUZI0AzsHrFkCrl1xvAc4t2rNPHA8CcA1wG1JLlTVd2cypSRpakOC/zSwL8le4L+A24HPrlxQVXv/7/0kjwD/ZOwl6cqybvCr6kKSexj99s0O4FhVnU5y9/j8JZ+3lyRdGYZc4VNVjwGPrXpszdBX1V9PP5YkadZ8pa0kNWHwJakJgy9JTRh8SWrC4EtSEwZfkpow+JLUhMGXpCYMviQ1YfAlqQmDL0lNGHxJasLgS1ITBl+SmjD4ktSEwZekJgy+JDVh8CWpCYMvSU0YfElqwuBLUhMGX5KaMPiS1ITBl6QmDL4kNWHwJakJgy9JTRh8SWrC4EtSEwZfkpow+JLUhMGXpCYMviQ1YfAlqYlBwU9yIMmLSRaT3LfG+c8lOTV+ezLJDbMfVZI0jXWDn2QH8CBwENgP3JFk/6plPwf+rKo+AtwPHJ31oJKk6Qy5wr8ZWKyqM1X1GnAcOLRyQVU9WVW/Hh8+BeyZ7ZiSpGkNCf5u4OyK46XxYxfzBeAHa51IcjjJQpKF8+fPD59SkjS1IcHPGo/VmguTTzAK/r1rna+qo1U1X1Xzc3Nzw6eUJE1t54A1S8C1K473AOdWL0ryEeBh4GBV/XI240mSZmXIFf7TwL4ke5PsAm4HTqxckOQDwKPA56vqZ7MfU5I0rXWv8KvqQpJ7gMeBHcCxqjqd5O7x+SPAV4D3AA8lAbhQVfMbN7Yk6XKlas2n4zfc/Px8LSwsbMnXlqS3qiQnJ72g9pW2ktSEwZekJgy+JDVh8CWpCYMvSU0YfElqwuBLUhMGX5KaMPiS1ITBl6QmDL4kNWHwJakJgy9JTRh8SWrC4EtSEwZfkpow+JLUhMGXpCYMviQ1YfAlqQmDL0lNGHxJasLgS1ITBl+SmjD4ktSEwZekJgy+JDVh8CWpCYMvSU0YfElqwuBLUhMGX5KaMPiS1ITBl6QmDL4kNTEo+EkOJHkxyWKS+9Y4nyTfGJ8/leTG2Y8qSZrGusFPsgN4EDgI7AfuSLJ/1bKDwL7x22HgmzOeU5I0pSFX+DcDi1V1pqpeA44Dh1atOQR8q0aeAq5K8v4ZzypJmsLOAWt2A2dXHC8BHxuwZjfwi5WLkhxm9C8AgP9J8txlTbt9XQO8stVDXCHci2XuxTL3YtmHJv3AIcHPGo/VBGuoqqPAUYAkC1U1P+Drb3vuxTL3Ypl7scy9WJZkYdKPHfKUzhJw7YrjPcC5CdZIkrbQkOA/DexLsjfJLuB24MSqNSeAO8e/rXML8Juq+sXqTyRJ2jrrPqVTVReS3AM8DuwAjlXV6SR3j88fAR4DbgMWgd8Cdw342kcnnnr7cS+WuRfL3Itl7sWyifciVW96ql2StA35SltJasLgS1ITGx58b8uwbMBefG68B6eSPJnkhq2YczOstxcr1n00yetJPrOZ822mIXuR5NYkzyQ5neTHmz3jZhnwZ+TdSb6f5KfjvRjy/4VvOUmOJXn5Yq9VmribVbVhb4z+k/c/gD8AdgE/BfavWnMb8ANGv8t/C/CTjZxpq94G7sUfA1eP3z/YeS9WrPsXRr8U8JmtnnsLfy6uAp4HPjA+fu9Wz72Fe/G3wNfG788B
vwJ2bfXsG7AXfwrcCDx3kfMTdXOjr/C9LcOydfeiqp6sql+PD59i9HqG7WjIzwXAl4DvAC9v5nCbbMhefBZ4tKpeAqiq7bofQ/aigHclCfBORsG/sLljbryqeoLR93YxE3Vzo4N/sVsuXO6a7eByv88vMPobfDtady+S7AY+DRzZxLm2wpCfi+uAq5P8KMnJJHdu2nSba8hePAB8mNELO58FvlxVb2zOeFeUibo55NYK05jZbRm2gcHfZ5JPMAr+n2zoRFtnyF58Hbi3ql4fXcxtW0P2YidwE/BJ4HeBf0vyVFX9bKOH22RD9uJTwDPAnwN/CPxzkn+tqv/e6OGuMBN1c6OD720Zlg36PpN8BHgYOFhVv9yk2TbbkL2YB46PY38NcFuSC1X13c0ZcdMM/TPySlW9Crya5AngBmC7BX/IXtwF/H2NnsheTPJz4Hrg3zdnxCvGRN3c6Kd0vC3DsnX3IskHgEeBz2/Dq7eV1t2LqtpbVR+sqg8C/wj8zTaMPQz7M/I94ONJdiZ5O6O71b6wyXNuhiF78RKjf+mQ5H2M7hx5ZlOnvDJM1M0NvcKvjbstw1vOwL34CvAe4KHxle2F2oZ3CBy4Fy0M2YuqeiHJD4FTwBvAw1W17W4tPvDn4n7gkSTPMnpa496q2na3TU7ybeBW4JokS8BXgbfBdN301gqS1ISvtJWkJgy+JDVh8CWpCYMvSU0YfElqwuBLUhMGX5Ka+F/Xe3Wlc9XddQAAAABJRU5ErkJggg==\n"
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "import numpy as np\n",
    "from mnist import load_mnist\n",
    "import matplotlib.pyplot as plt\n",
    "\n",
    "(x_train,t_train),(x_test,t_test) = load_mnist(normalize=True,one_hot_label=True)\n",
    "train_loss_list = []\n",
    "\n",
    "# 超参数\n",
    "iters_num = 100 # 也可取10000次\n",
    "train_size = x_train.shape[0]\n",
    "batch_size = 100\n",
    "learning_rate = 0.1\n",
    "network = TwoLayerNet(input_size=784,hidden_size=50,output_size=10)\n",
    "for i in range(iters_num):\n",
    "    # 获取mini-batch\n",
    "    batch_mask = np.random.choice(train_size,batch_size)\n",
    "    x_batch = x_train[batch_mask]\n",
    "    t_batch = t_train[batch_mask]\n",
    "\n",
    "    # 计算梯度\n",
    "    grad = network.numerical_gradient(x_batch,t_batch)\n",
    "    # 更新参数\n",
    "    for key in ('W1','b1','W2','b2'):\n",
    "        network.params[key] -=grad[key]*learning_rate\n",
    "\n",
    "    # 记录学习过程\n",
    "    loss = network.loss(x_batch,t_batch)\n",
    "    train_loss_list.append(loss)"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-28T04:33:05.184863100Z",
     "start_time": "2023-09-28T03:12:07.448622500Z"
    }
   }
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "outputs": [
    {
     "data": {
      "text/plain": "<Figure size 432x288 with 1 Axes>",
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEHCAYAAAC5u6FsAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3deXhU1fnA8e+bPSEJCSRsWQjKIotsRkRBEEWraKVq3YqUWtHWH1axWqu2tbV2U1tbWxeKgFqLW4VW3FdANLIECGvYkTWQAEkIZE/e3x/3JoSQkBnIZCDzfp4nD3fuPffOe4DMO+ece88RVcUYY4zxVJC/AzDGGHN6scRhjDHGK5Y4jDHGeMUShzHGGK9Y4jDGGOOVEH8H0BISEhI0LS3N32EYY8xpZenSpftUNbH+/oBIHGlpaWRmZvo7DGOMOa2IyLaG9ltXlTHGGK9Y4jDGGOMVSxzGGGO8YonDGGOMVyxxGGOM8YolDmOMMV6xxGGMMcYrPkscIpIiInNFJFtE1ojIPQ2UGSsiK0UkS0QyRWR4nWOXi8h6EdkkIg/W2d9ORD4RkY3un/G+qsPirQd4c8kOX13eGGNOS75scVQC96lqb2AoMElE+tQr8xkwQFUHAj8EpgGISDDwLHAF0Ae4uc65DwKfqWoP9/wH8ZH3Vu7m9+9n++ryxhhzWvJZ4lDVHFVd5m4XAdlAUr0yh/TISlJtgJrtIcAmVd2iquXA68BY99hY4GV3+2XgO76qQ0xEKIfKKrHFrowx5ogWGeMQkTRgELCogWPXiMg64D2cVgc4CaZuH9FOjiSdjqqaA05yAjo08p53uN1fmXl5eScUd0xECFXVSnF51Qmdb4wxrZHPE4eIRAOzgMmqerD+cVX9r6qehdNyeKzmtAYu5dXXflWdqqrpqpqemHjMHF0eiY0MBeBgacUJnW+MMa2RTxOHiITiJI2Zqjr7eGVV9QvgTBFJwGlhpNQ5nAzsdrf3ikhn9/qdgdxmD9wVE+HMAVlUWumrtzDGmNOOL++qEmA6kK2qTzVSprtbDhEZDIQB+4ElQA8R6SYiYcBNwBz3tDnABHd7AvC2r+oQE+G0OIqsxWGMMbV8Oa36MGA8sEpEstx9DwOpAKo6BbgO+L6IVAAlwI3uYHmliNwFfAQEAzNUdY17jT8Bb4rIbcB24HpfVSDWbXEcLLEWhzHG1PBZ4lDVL2l4rKJumceBxxs59j7wfgP79wOXNEeMTalpcdgYhzHGHGFPjh9HbKTb4rAxDmOMqWWJ4zhibYzDGGOOYYnjOMJDgggNFruryhhj6rDEcRwiQmxEKAdLrMVhjDE1LHE0ISYixFocxhhThyWOJsREhNpdVcYYU4cljibERlqLwxhj6rLE0YSY8FC7q8oYY+qwxNGE2MgQe3LcGGPqsMTRhJgIa3EYY0xdljiaEBMRwuHyKiqrqv0dijHGnBIscTSh5unxQ2XWXWWMMWCJo0m2JocxxhzNEkcTalYBLLSnx40xBrDE0SRrcRhjzNEscTTBZsg1xpijWeJoQmztYk7W4jDGGPDtmuMpIjJXRLJFZI2I3NNAmXEistL9yRCRAXWO3SMiq91zJ9fZP1BEFopIlohkisgQX9UB6nZVWYvDGGPAty2OSuA+Ve0NDAUmiUifemW2AiNVtT/wGDAVQET6AbcDQ4ABwFUi0sM95wngUVUdCDzivvaZaFt33BhjjuKzxKGqOaq6zN0uArKBpHplMlQ13325EEh2t3sDC1W1WFUrgfnANTWnAbHudltgt6/qABAaHERUWLC1OIwxxhXSEm8iImnAIGDRcYrdBnzgbq8Gfi8i7YESYAyQ6R6bDHwkIn/GSXwXNPKedwB3AKSmpp5U/LYmhzHGHOHzwXERiQZmAZNV9WAjZUbhJI6fA6hqNvA48AnwIbACp+sL4E7gXlVNAe4Fpjd0TVWdqqrpqpqemJh4UnWI
tTU5jDGmlk8Th4iE4iSNmao6u5Ey/YFpwFhV3V+zX1Wnq+pgVR0BHAA2uocmADXX+g/OOIhPWYvDGGOO8OVdVYLTGshW1acaKZOKkwTGq+qGesc61ClzLfCae2g3MNLdvpgjCcVnbBVAY4w5wpdjHMOA8cAqEcly9z0MpAKo6hScu6LaA885eYZKVU13y85yxzgqgEl1BtFvB54WkRCgFHccw5diI0PZfqDY129jjDGnBZ8lDlX9EpAmykwEJjZy7MLjXPeckw7QC05XlbU4jDEG7Mlxj8RE2CqAxhhTwxKHB2IjQimvqqa0osrfoRhjjN9Z4vBArM2Qa4wxtSxxeKBmTQ67s8oYYyxxeMTW5DDGmCMscXggpmZqdVsF0BhjLHF44shiTtbiMMYYSxwesDU5jDHmCEscHrDBcWOMOcIShwfahAUTJNZVZYwxYInDIyJCdHiIDY4bYwyWODwWGxlqLQ5jjMESh8ecqdUtcRhjjCUOD8VEhNjguDHGYInDY7ER1lVljDFgicNjsRE2OG6MMWCJw2PO4LglDmOM8eWa4ykiMldEskVkjYjc00CZcSKy0v3JEJEBdY7dIyKr3XMn1zvvJyKy3j32hK/qUFdsRAhFZZVUVWtLvJ0xxpyyfLnmeCVwn6ouE5EYYKmIfKKqa+uU2QqMVNV8EbkCmAqcJyL9cNYWHwKUAx+KyHuqulFERgFjgf6qWiYiHXxYh1rto8NRhfzichKiw1viLY0x5pTksxaHquao6jJ3uwjIBpLqlclQ1Xz35UIg2d3uDSxU1WJVrQTmA9e4x+4E/qSqZe41cn1Vh7pqksW+Q2Ut8XbGGHPKapExDhFJAwYBi45T7DbgA3d7NTBCRNqLSBQwBkhxj/UELhSRRSIyX0TObeQ97xCRTBHJzMvLO+k6JESHAZBXZInDGBPYfNlVBYCIRAOzgMmqerCRMqNwEsdwAFXNFpHHgU+AQ8AKnK6vmpjjgaHAucCbInKGqh41+KCqU3G6vkhPTz/pgYmEGGtxGGMM+LjFISKhOEljpqrObqRMf2AaMFZV99fsV9XpqjpYVUcAB4CN7qGdwGx1LAaqgQRf1gMgsSZxFJX7+q2MMeaU5su7qgSYDmSr6lONlEkFZgPjVXVDvWMd6pS5FnjNPfQ/4GL3WE8gDNjnizrUFRMeQlhIkLU4jDEBz5ddVcOA8cAqEcly9z0MpAKo6hTgEaA98JyTZ6hU1XS37CwRaQ9UAJPqDKLPAGaIyGqcO64m1O+m8gURITE63MY4jDEBz2eJQ1W/BKSJMhOBiY0cu7CR/eXALScd4AlIiAknz1ocxpgAZ0+OeyExOox9h2yMwxgT2CxxeCHBuqqMMcYShzcSY8I5cLjMph0xxgQ0SxxeSIgOp9qddsQYYwKVJQ4v1Ew7Yt1VxphAZonDC4n29Lgxxlji8EbNfFWWOIwxgcwShxdq5quyripjTCCzxOGFI9OO2OC4MSZwWeLwQs20I/usxWGMCWCWOLxk044YYwKdJQ4vJUaH2RiHMSagWeLwUmJMuI1xGGMCmiUOLyVE27QjxpjAZonDSzXTjhw4bK0OY0xgssThJXt63BgT6DxKHCLSVURGu9uRIhLj27BOXTXzVVniMMYEqiYTh4jcDrwF/NPdlYyz7ndT56WIyFwRyRaRNSJyTwNlxonISvcnQ0QG1Dl2j4isds+d3MC594uIikhCU7E0p5ppR+zOKmNMoPKkxTEJZ/3wgwCquhHo4MF5lcB9qtobGApMEpE+9cpsBUaqan/gMWAqgIj0A24HhgADgKtEpEfNSSKSAlwKbPcgjmZlXVXGmEDnSeIoc9f5BkBEQoAmbylS1RxVXeZuFwHZQFK9Mhmqmu++XIjTmgHoDSxU1WJVrQTmA9fUOfWvwAOexNHcosNDCLdpR4wxAcyTxDFfRB4GIkXkUuA/wDvevImIpAGDgEXHKXYb8IG7vRoYISLtRSQKGAOkuNe6GtilqiuaeM87RCRT
RDLz8vK8Cfe4RMSWkDXGBLQQD8o8iPOhvgr4EfA+MM3TNxCRaGAWMFlVDzZSZpT7HsMBVDVbRB4HPgEOASuASjeJ/AK4rKn3VdWpuF1f6enpzdoySYgJt64qY0zAajJxqGo18IL74xURCcVJGjNVdXYjZfrjJKIrVHV/nfedDkx3y/wB2AmcCXQDVogIOF1by0RkiKru8Ta+E5UYHc7O/OKWejtjjDmlNJk4RGQrDYwlqOoZTZwnOB/82ar6VCNlUoHZwHhV3VDvWAdVzXXLXAuc746HdKhT5hsgXVX3NVWP5pQYE0bWjvymCxpjTCvkSVdVep3tCOB6oJ0H5w0DxgOrRCTL3fcwkAqgqlOAR4D2wHNuC6JSVWveb5aItAcqgEl1BtH9zpl2pJyqaiU4SPwdjjHGtChPuqr219v1NxH5EudD/3jnfQkc91NVVScCExs5dqEHsaU1VcYXOsQ4047sO1RGx9gIf4RgjDF+40lX1eA6L4NwWiAB++Q4QFJ8JAA780sscRhjAo4nXVV/qbNdCXwD3OCTaE4TSXFRAOwqKOGcrvF+jsYYY1qWJ11Vo1oikNNJTYtjV36JnyMxxpiW12jiEJGfHu/Exu6UCgTR4SG0jQxlV4HdkmuMCTzHa3EE9DhGU5LiIq3FYYwJSI0mDlV9tCUDOd0kxUeybf9hf4dhjDEtzpO7qiJwpgPpi/McBwCq+kMfxnXKS4qLJGPTPlQV9xkUY4wJCJ5McvgK0An4Fs4stclAkS+DOh0kx0dyuLyKwpIKf4dijDEtypPE0V1VfwUcVtWXgSuBs30b1qkvKe7IsxzGGBNIPEkcNV+pC9wFltoCaT6L6DRRe0tugSUOY0xg8eQBwKkiEg/8CpgDRLvbAS053n0I0FocxpgA40nieFFVq3DGN447I24giY8KJTI02FocxpiA40lX1VYRmSoil4jdPlRLREiKt2c5jDGBx5PE0Qv4FJgEfCMiz4jIcN+GdXpIiou0FocxJuA0mThUtURV31TVa4GBQCxOt1XAS4q3xGGMCTyetDgQkZEi8hywDOchwICeHbdGUlwkBw6XU1xe6e9QjDGmxXi6dGwW8CbwM1W1eTZcye4tubsLSujewab2MsYEBk9aHANU9RpVfc2bpCEiKSIyV0SyRWSNiNzTQJlxIrLS/ckQkQF1jt0jIqvdcyfX2f+kiKxzz/mviMR5GlNzq3kIcIcNkBtjAognYxwHT/DalcB9qtobGApMEpE+9cpsBUaqan/gMWAqgPug4e3AEGAAcJWI9HDP+QTo556zAXjoBOM7abYuhzEmEHk0xnEiVDVHVZe520VANpBUr0yGqua7LxfizIMF0BtYqKrFqlqJMxh/jXvOx+6++ue0uA4xEYQEiQ2QG2MCis8SR10ikgYMAhYdp9htwAfu9mpghIi0F5EoYAyQ0sA5P6xzTosLDhI6x0VYi8MYE1CaTBzuWEOsOKaLyDIRuczTNxCRaGAWMLmxbi8RGYWTOH4OoKrZwOM43VIfAitwur7qnvMLd9/MRq55h4hkikhmXl6ep+F6zZ7lMMYEGk9aHD90P/AvAxKBW4E/eXJxEQnFSRozVXV2I2X6A9OAsaq6v2a/qk5X1cGqOgI4AGysc84E4CpgnKpqQ9dV1amqmq6q6YmJiZ6Ee0KS4qKsxWGMCSiezFVVM83IGJx5q1Z4MvWIW2Y6kN3Y+uQikgrMBsar6oZ6xzqoaq5b5lrgfHf/5Tgtk5Gq6vdFv5PiI9lbVEp5ZTVhIS3S82eMMX7lSeJYKiIfA92Ah0QkBqj24LxhwHhglYhkufseBlIBVHUK8AjQHnjOzUWVqprulp0lIu1xpnWfVGcQ/RkgHPjEPWehqv7Yg3h8IjkuElXYU1hKavsof4VhjDEtxpPEcRvOVCNbVLVYRNrhdFcdl6p+yZHWSmNlJgITGzl2YSP7uzcZcQuquy6HJQ5jTCDwpG/lfGC9qhaIyC3AL4FC34Z1+qh5CHC3
DZAbYwKEJ4njeaDYfar7AWAb8C+fRnUa6dQ2ArDEYYwJHJ4kjkr3zqWxwNOq+jRgEzO5IkKDSYgOt1tyjTEBw5MxjiIReQhnoPtCEQkGQn0b1uklKS7CEocxJmB40uK4ESjDeZ5jD860IU/6NKrTTFJ8pHVVGWMChieTHO7BeTq7rYhcBZSqqo1x1NGlrfP0eCPPIhpjTKviyZQjNwCLgetxFnBaJCLf9XVgp5MucZGUVlSTX1zh71CMMcbnPBnj+AVwrqrmAohIIs4a5G/5MrDTSVKdBZ3atQnzczTGGONbnoxxBNUkDdd+D88LGDXPctgAuTEmEHjS4vhQRD4CXnNf3wi877uQTj9d4mxBJ2NM4Ggycajqz0TkOpy5pwSYqqr/9Xlkp5H4qFAiQoPszipjTEDwpMWBqs7CmR7dNEBESIqLZHehJQ5jTOvXaOIQkSKgoftLBVBVjfVZVKehLnGR1lVljAkIjSYOVbVpRbyQFBdJdk6Rv8Mwxhifs7ujmkmXuEj2HSqjtKLK36EYY4xPWeJoJjW35O4pLPVzJMYY41uWOJpJF3uWwxgTICxxNBN7CNAYEyh8ljhEJEVE5opItoisEZF7GigzTkRWuj8Z7mJRNcfuEZHV7rmT6+xvJyKfiMhG9894X9XBG53aRiBiCzoZY1o/X7Y4KoH7VLU3MBSYJCJ96pXZCoxU1f7AY8BUABHpB9wODAEGAFeJSA/3nAeBz1S1B/CZ+9rvwkKC6BATbonDGNPq+SxxqGqOqi5zt4uAbJy1POqWyVDVfPflQiDZ3e4NLFTVYlWtBOYD17jHxgIvu9svA9/xVR281SUu0rqqjDGtXouMcYhIGjAIWHScYrcBH7jbq4ERItJeRKKAMUCKe6yjquaAk5yADo285x0ikikimXl5eSdfCQ90iYtkd4HdVWWMad18njhEJBpnupLJqnqwkTKjcBLHzwFUNRt4HPgE+BBYgdP15TFVnaqq6aqanpiYeBI18FxynC3oZIxp/XyaOEQkFCdpzFTV2Y2U6Q9MA8aq6v6a/ao6XVUHq+oI4ACw0T20V0Q6u+d2BnLrX9NfusRFUl5Zzb5D5f4OxRhjfMaXd1UJMB3IVtWnGimTCswGxqvqhnrHOtQpcy1HpnWfA0xwtycAbzd/9Ccm2V3QaWOuTT1ijGm9PJod9wQNA8YDq0Qky933MJAKoKpTgEeA9sBzTp6hUlXT3bKzRKQ9UAFMqjOI/ifgTRG5DdiOs6TtKWHoGe1pExbM7GW7uODMBH+HY4wxPuGzxKGqX+LMpHu8MhOBiY0cu7CR/fuBS046QB9oEx7C1QOT+O/ynTzy7T7ERoT6OyRjjGl29uR4M7t5SAqlFdW8nbXb36EYY4xPWOJoZmcntaVP51heX7zd36EYY4xPWOJoZiLCzUNSWLP7IKt2Fvo7HGOMaXaWOHxg7KAkIkKDeG2JtTqMMa2PJQ4fiI0I5cqzuzAnazeHy7x6btEYY055ljh85Ib0ZA6VVfLFhpaZ7sQYY1qKJQ4fGZQaT1hIEMt3FPg7FGOMaVaWOHwkLCSIvl1iWb49v+nCxhhzGrHE4UODUuJZtauQiqpqf4dijDHNxhKHDw1MjaO0opr1e2zuKmNM62GJw4cGpcQB2DiHMaZVscThQ8nxkSREh5G13RKHMab1sMThQyLCwJQ4snbYALkxpvWwxOFjA1Pi2Jx3mMLiCn+HYowxzcISh48NSo0HYMVO664yxrQOljh8rH9yW0QgywbIjTGthCUOH4uJCKV7YrQ9CGiMaTV8ueZ4iojMFZFsEVkjIvc0UGaciKx0fzJEZECdY/e6560WkddEJMLdP1BEFopIlohkisgQX9WhuQxKjSNrRwGq6u9QjDHmpPmyxVEJ3KeqvYGhwCQR6VOvzFZgpKr2Bx4DpgKISBJwN5Cuqv2AYOAm95wngEdVdSDOmuVP+LAOzWJgSjz5xRVsP1Ds71CMMeak+SxxqGqOqi5zt4uAbCCp
XpkMVa3pw1kIJNc5HAJEikgIEAXUrMWqQKy73bbO/lPWoFTnQcDFWw/4ORJjjDl5LTLGISJpwCBg0XGK3QZ8AKCqu4A/A9uBHKBQVT92y00GnhSRHW6Zhxp5zzvcrqzMvDz/Tm3eq2MMXdpG8NGaPX6NwxhjmoPPE4eIRAOzgMmqerCRMqNwEsfP3dfxwFigG9AFaCMit7jF7wTuVdUU4F5gekPXVNWpqpququmJiYnNWSWvBQUJV5zdmS827ONgqT3PYYw5vfk0cYhIKE7SmKmqsxsp0x+YBoxV1f3u7tHAVlXNU9UKYDZwgXtsgvsa4D/AKT84DnBl/86UV1Xz6dq9/g7FGGNOii/vqhKc1kC2qj7VSJlUnCQwXlU31Dm0HRgqIlHudS7BGSMBZ0xjpLt9MbDRF/E3t0EpcXRpG8F7K3P8HYoxxpyUEB9eexgwHlglIlnuvoeBVABVnYJzV1R74DknP1Dpdi8tEpG3gGU4d2ctx73jCrgdeNodNC8F7vBhHZqNiDDm7M786+ttHCytIDYi1N8hGWPMCZFAeLYgPT1dMzMz/R0Gy7bnc+1zGTx1wwCuHZzc9AnGGONHIrJUVdPr77cnx1vQoJQ4kuIirbvKGHNas8TRgkSEK/p1YsHGfRSW2N1VxpjTkyWOFjbGvbvqwVkr2bjXlpQ1xpx+LHG0sEEpcdx50ZnMXZ/LpX/9gokvL2FXQYm/wzLGGI9Z4mhhIsLPLz+LjAcvYfLoHny1aT9//mi9v8MyxhiP+fJ2XHMc7dqEMXl0T3IKSnlvVQ6lFVVEhAb7OyxjjGmStTj87KoBnTlUVsn8Df6dT8sYYzxlicPPzj+jPe3ahPGu3aJrjDlNWOLws5DgIC7v14nPsvdSUl7l73CMMaZJljhOAVf170xxeRWfr8v1dyjGGNMkSxyngPO6tScxJpz3Vp3ya1IZY4wljlNBcJAwpl8nPl+Xy+GySn+HY4wxx2WJ4xRx1YAulFZU88zcTSz55gC7CkoIhAkojTGnH3uO4xRxTmo83RLa8Py8zTw/bzMA1w1O5i83DPBzZMYYczRLHKeIoCDhw8kXsuNACbsLSnhvZQ5vZO7gxnNTGNKtnb/DM8aYWtZVdQoJDwmme4doRvRM5NdX96FTbASPvbuW6mrrsjLGnDoscZyiosJCeODyXqzaVcj/snb5OxxjjKnlyzXHU0Rkrohki8gaEbmngTLjRGSl+5MhIgPqHLvXPW+1iLwmIhF1jv1ERNa7x5/wVR387TsDk+if3JYnPlxPcblzt1VpRRUVVdV+jswYE8h8OcZRCdynqstEJAZYKiKfqOraOmW2AiNVNV9ErsBZV/w8EUkC7gb6qGqJiLwJ3AS8JCKjgLFAf1UtE5EOPqyDXwUFCb+6qg/XT/maa5/LoKi0kt2FJSREh/PY2L5c3q+zv0M0xgQgn7U4VDVHVZe520VANpBUr0yGqua7LxcCdRfiDgEiRSQEiAJqno67E/iTqpa512jVj1ufm9aOW4elERIsnNM1nrsv7kFidDg//vcyfvRKJnsPlvo7RGNMgJGWeFZARNKAL4B+qnqwkTL3A2ep6kT39T3A74ES4GNVHefuzwLeBi4HSoH7VXVJA9e7A7gDIDU19Zxt27Y1c638p6KqmmkLtvK3TzegwFVnd2bc0FQGp8YjIl5f75O1e/nj+9k8deNABqbENX/AxpjTkogsVdX0+vt9PjguItHALGDycZLGKOA24Ofu63ic7qhuQBegjYjc4hYPAeKBocDPgDelgU9LVZ2qqumqmp6YmNjMtfKv0OAg7rzoTD6+dwQ3pqfw8dq9XPf813x3ytfkHy73+DqqyrNzN3HHK5ls2XeYmQtbT3JtbUorqmyd+hOkqjz6zhoe/u8qf4fS4nw1HurTxCEioThJY6aqzm6kTH9gGjBWVfe7u0cDW1U1T1UrgNnABe6xncBsdSwGqoEEX9bjVNW1fRse+04/Fj18CY+N7cvqXYWMm7aI
guLGk0duUSnLt+fzwaoc7nptOU9+tJ5v9+/Clf078+GaPZRVNjxD744DxUyYsZhZS3f6qjoGWL2rkPV7jl2L/q5XlzPm6QWN/vuYxv0ncycvfvUNry7azsqdBf4O54Rt31/MK19/w8It+znkwdRE3+w7zEVPzmPhlv1NlvWWzwbH3VbAdCBbVZ9qpEwqTlIYr6ob6hzaDgwVkSicrqpLgEz32P+Ai4F5ItITCAP2+aYWp4c24SGMPz+N1PZtuP3lTG6ZvoiZtw2lbVRobRlV5S8fb+CZuZtq9wUJ/Oxbvfi/i85k3oY83luZw4IN+xjdp+NR15+3PpfJb2RRUFzBsm35jOiZSGJMeKPxqOoJdZkFOlXlR68spaKqms/uG0lMhPPvt3jrAT7N3gs4H4K3DO16Uu9TVa0EB53cv8+y7fk8/sE6SiuqGH9+Gt8e0JnwkODaehw4XM7eg2XsPVhKuzZhDPBTF+im3CJ+PWcNQ7q1Y13OQZ6du4l/jj+m56XZ5RSW0L5NOGEhzfPdfPWuQr4/YzEH3B4FERiUEsc/x6c3+LtYVa3c/58VHCytoGv7qGaJoS5ftjiGAeOBi0Uky/0ZIyI/FpEfu2UeAdoDz7nHMwFUdRHwFrAMWOXGOdU9ZwZwhoisBl4HJqhN6gTAyJ6J/HP8OazfU8RNLyxk/oY8VJWKqmru/89Knpm7iesGJzPjB+m8d/dwlv/qMiaN6o6IMLx7AnFRobyz8ugZep/5fCO3vrSETrERvHjruZRWVjW6RnppRRU/fGkJt/9raaPzbKkqT360jsv/9gU784ub/e/geFSVVxdtZ86K5p2FePv+Yn76RhYZm0/u+8uqXYXsKight6iMpz5xvkepKo9/uI6OseH0T27L8/M2n1T3w/wNeQz87ce8dYItxz2Fpdz7RhbXPpfB1n2HKamo4v7/rGDYnz5n4suZXP63L+jzyEec87tPGfP3Bdz60hLGPvsV/8nc4fV77cwv5vK/fcEr9bpQK6qqeWPJdj5cnVN7m3pDSiuquOvV5USGBfOPmwfxgwvS+GjNXjbsPbZFB/DVpn3838ylZGze59E8carK7oISquo8oKuqvPTVVoY/PpfRT83n7axdHj3AW15Z3eh6PEu3Ha6SENEAABNiSURBVODmFxYSGRrM25OG8eKt53LPJT3Izili4stLGjzvhQVbyNyWz2/H9qVz28gm399bLTI47m/p6emamZnZdMFWYu66XB6cvZK9B8s4q1MMbSNDWbT1APeO7sndl3RvtDXw0OyVzMnaTeYvLyUyLJj3VuYw6dVlXD2gC49f15/IsGD+8H42LyzYwpxJwzk7uW3tuRVV1fz4laV85q4p8sz3BnFV/y5HXb+6WvnV26uZuWg7IUFCarso/vPj82kf3XjrpSnvrcxh+fZ8duQXs/dgGd8/vyvXDk4+ptzhskoemLWS91bmEBwkvHb7UI+mctl3qIyvN++noLicguIKgoOFced1pW2k0xrYcaCYG//5NbsLnbvbrh7QhV9c2ZuOsRFUVytFpZVs3neIDXuK2Jx3iM5tIzk3rR29O8cQEnz097YnP1rHlPlbGHN2Z95buZs5dw1nT2EpE/+VyR+uOZvObSO49aUlPHFdf244N6XJ2Ou3/DblHuKa577icFklIUFBvHbHeZzTteG/g8NllYSFBBHqxqiqvLFkB797L5vyymomXtiN/xvVnTZhwXy1aT8vfrWVbQeKSWsfRdf2bUiOj6RTbAQdYsP56ycbydi8j7/ffOz/iZprb847zJmJbY6K96dvZDF7ufPw623Du/HwmN7szC/m7tezWLHD6XIKDwniwh4J3HtpT/p2aXvUdX8zZw0vZXzDiz84l1FndSD/cDnDHv+cS/t05OmbBh1T39FPzSfH/Xc8p2s8E4d3o0fHGDq3jaBN+NGdM/mHy/nl26t5b2UOPTpEM3l0T0b36cCj76zl1UXbGdEzkbyiMrJzDtK3SyxP3TCQXp1i
Gqz7+6v28Jt31nCwpILL+3Xiu+ck06tjDOv2FLFqVyHPzt1Ex9gI/j3xPJLijiSBT9bu5Y5XMhnduyNTbjmnthW5bs9Brv7HV1x8Vgeev2XwSbX+Gxsct8TRSpVXVvN21i5eWLCFzXmH+f13+nHTkNTjnvPVpn2Mm7aI58YNJj0tnm/99QtS2kUx684Laj9ADpZWcPGf59G1fRve+vH5iAhV1cq9b2QxZ8Vufju2L68v3kFhSQWf/nQkkWFO90VlVTUPvLWS2ct38aORZzC6d0fGT19Ejw4xvHr7ebXdMt50c9UktojQIJLjo6hW5Zt9h3nh++lc0vtId9s3+w7zo1eWsjG3iMmje/Lf5bsoLq/kvbsvJKGRpLV290Fe/Gorb6/YTXnl0d/wE6LD+fW3+zAoNY4b/7mQQ2WVzPhBOl9s2Mfz8zcjQERoMEWlFdT9shkWHES521qICgvmsbH9uO6cI0nukr/Mo2NsBM+PO4dLnppHcnwUJeXOA58f3zuC4CDh6me+orCkgs/vG3lM4gEnOX+2LpdpC7awdFs+3zsvlcmjexIk8J1nv+JQWSUv/3AIk2Yu41BZJW/fNfyoD6OC4nKe+XwT//p6G5FhwVzapyMXn9WB1xZvZ8HGfQw9ox2PX9efru3bePRvBFBcXsmEGYtZvr2AKbecc0xX6Iwvt/Lbd9fy6NV9mXBBWu3f/5X/WMDtF55BeWU1L2V8w5Bu7Vi7+yBBAr+/5mwSosP5eO0e5mTtJjIsmI/vHUFUmPMBv3RbPt+dksGE89P4zdV9a9/rD+9nM23BFj6/7yLSEo7U4Y8fZPPP+Vt49fbz2Jx7iCnzt7CroKT2eHxUKEO6tWN4j0TaRoby2LtrKSguZ9x5Xfly0z425R4iJiKEotJK7rzoTH52WS8A53fi3bX06BDNGz86/6h67y4o4Vf/W81n63LplxTLgOQ43lmxm4OlR7eiBqbEMfX759AhJoL6XvpqK795Zy03pqdwUa9EqhWembuJvKJSPpo84qS+lIEljoBLHDVUlYLiCuLbhDVZtqpaOe8Pn3FuWjzlldUs2LSP9+8eTvcOR39TenPJDh6YtZJL+3QkLDiIXQUlZO0o4OeXn8WdF53Joi37uXHqQu4d3ZN7RvfgwOFyJr+RxRcb8rj/sp613WNz1+Vy+78y6ZbQhjbhIezML0EEZkw496jWzJa8Q/x6zhruGtWd885oDzgtgcv++gXJ8ZHMvvMCQoKDOFxWyY1Tv2Zz7mHe/NH5nNU5hulfbuXpTzcSHhrEP24exIU9Elmzu5BrnsvgvG7tePnWIQS539QOHC7n3ZW7+e/yXSzfXkBkaDDXnZPEDekpdG4bSdvIUDbsLeKh2atYtauQyNBgQoOFmROH1sa7bf9hXvzqG6pVaRsZSmxEKGkJbejVMYbk+Ej2HCwlc1s+0xZsYWd+CQseGEWb8BA25R5i9FPzaz88Zy/byU/fXAHAs98bzJX9nYc9P16zhzteWcpjY/uSGBPOvPV5rN9bRGhQECHBws78ErYfKCYpLpLBXeN5f1UOUWHBJMdHsTn3UG0rY1NuEdc8m0Fyuyh+OCyNkooq9hSW8u+F2ygqq+SaQUmowqfZeykqrSQqLJiHrjiLced1rf378kZRaQXjpi1i3Z4i3p40jN6dY2v/zi96ci7F5VWIwFs/voABKXFMmLGYrB0FfPGzUbSNCuWlr5zkkp7Wjr/eOPCoZLfkmwNcP+VrfjisG498uw8VVdV8+x9fUlhSwSc/HUl0ndZC7sFShj8xl2/17cRfbxhASHAQG/YWMebpBVw3OJnHv9sfcL54Ld+eT05hKTmFpWzJO0TG5v21yaRXxxieunEAfbu0papambNiFzMXbmfc0FSuGXR0i3fagi387r1s3p40rHasp7Siisv++gV5RWXcd1lPfnBBGiHBQZRWOCuB7j1YSq9OMfTpHEtc1PF/dx97dy3Tv9xa+zpIYMot53BZ305e/zvVZ4kj
QBOHtx55ezX/+trpU/7VVX24bXi3Y8pUVyu3vbyE7JwiosKDiQ4P4cqzO/OjkWfWlpn06jI+y97Ln68fwO/ezeZAcTm/vbrvMa2ed1fu5pnPN5EQHU5yfCTzN+QRJMI7PxlOuzZhFJZUcM2zX7Fl32EiQoOYMeFcLuiewKSZy/hk7V7evXs4PTseSWy5RaVc82wGZZXVtGsTyoa9h7i0T0d+c3Xfoz5sXl20nYf/u4oRPRMJEqfvflPuISqrlbM6xXDd4GRuSE856gaDGpVV1bz89TZmL9vJH645+4QGfpdtz+fa5zJqk+2zczfx5Efr+fqhi+ncNhJV5baXMzlcVslrtw+t/bBWVcb8/Uuyc5w722PCQ+iX5CStiqpqIsOCuencVL7VtyMhwUFs3FvEH97PZu76PJ78bn+uTz/SxTV3fS63v5xJZZ1m0aheifz8irM4q5PzwV5eWU3mtgN0S2hz0n3l+w6VccXTC4iLDGXOXcOJDAvmEbfr8vU7hjL59SwAfnllb+6cuYyHx5zFHSOO/J/KLSqlfZvwBgf2H3l7Na8s3MasOy9g0ZYDPP7hOl74fjqX1mvdADz+4Tqen7eZXh1j+PXVffjbpxvZsLeIz++7iHbH+YKlqnyzv5hNuYcY0TOh9maAphwqq+T8P37GiJ6JPPu9wQA8N28TT3y4nn/fdh7De5zcTaGqypZ9h6moqiZIhLaRoXSMPbZ1ciIscVji8MjirQe44Z9fM/SMdrw6cegJfbsE2FVQwsV/nkdZZTVd20fx7PcG137AHc+KHQVcP+VrhnRrx4wfnMvEf2WSscnpH3/60418s/8wtwztyvQvt/Kzb/Vi0qjux1xj494irns+g5iIUB69uu8xXSPg/LL99t21vLcyhw6x4XSKjaB7hxiuHtCFPl1iT6jO3powYzErdxbw5c8v5nsvLATg7buGHxVjtXLMB+WqnYV8tm4vF5yZwKDUuNpuxOMpKC5v8Jvr/kNlFJdXERkWTJuwkNquRV/5cuM+xs9YxM1DUvnBBWlc8fQCxp2Xym/H9iNrRwHXT8mgslrpHBvB5/dfRESo5x/Olz01n7CQIPYcLHVvFGn47ilV5eO1e3ns3bXszHdaEH+69uwmu3JPxh8/yOaFL7Yw7/5RRIQFMerJeVzQPYEXvu/7O7xOhiUOSxweUVX+vWg73+rTkQ4n+a3ljSXbydpRyENjziI24thv7o15ffF2Hpy9iu4dotmUe4g/Xns2Nw9J5cDhcm6Ztoi1OQfpn9y2touqIQXF5USEBnv8weMPNa2O75/flX99vY0HLu/F/110bCJsbWrGE7q2jyL/cDnzfjaq9pv+yxnf8Os5a/jL9QOOGv/xxNz1udz64hKiw0P45KcjmmwhlVZUMfWLLew5WMrvxvY74S9JnthTWMqFT3zOuPO6UlJexezlO/n43pF0S/B8rMgfLHFY4jitPDR7Ja8t3sGE87vy6Nh+tfsLisv5+2ebGH9+11P+l84TE2YsZv6GPAA+u28kZyZG+zki3yuvrOb6KRms2FnYYHforoKSo7oVvTHjy62kJURx8VnHtjL97b43V/Duyt2UV1Vz27Bu/PKqPv4OqUmWOCxxnFbKK6vJ2LyP4d0TGm1VtAbLt+dzzXMZdO8Qzac/HenvcFrM7oIS3lmxm1uHdWu2h+ROdev3FPGtv31BfFQo8342qvaW7lNZY4nDlo41p6SwkCAu6tVqZ8yvNSg1np9c3P2oAf5A0CUu8qibKQJBr04xPHB5L3p1jDktksbxWIvDGGNMg/w2O64xxpjWxRKHMcYYr1jiMMYY4xVLHMYYY7xiicMYY4xXLHEYY4zxiiUOY4wxXrHEYYwxxisB8QCgiOQB25oseEQCgbmOeSDWOxDrDIFZ70CsM5xcvbuqamL9nQGROLwlIpkNPS3Z2gVivQOxzhCY9Q7EOoNv6m1dVcYYY7xiicMYY4xXLHE0bKq/A/CTQKx3INYZArPegVhn8EG9
bYzDGGOMV6zFYYwxxiuWOIwxxnjFEkc9InK5iKwXkU0i8qC/4/EFEUkRkbkiki0ia0TkHnd/OxH5REQ2un/G+zvW5iYiwSKyXETedV8HQp3jROQtEVnn/puf39rrLSL3uv+3V4vIayIS0RrrLCIzRCRXRFbX2ddoPUXkIfezbb2IfOtE39cSRx0iEgw8C1wB9AFuFpFTf0V571UC96lqb2AoMMmt54PAZ6raA/jMfd3a3ANk13kdCHV+GvhQVc8CBuDUv9XWW0SSgLuBdFXtBwQDN9E66/wScHm9fQ3W0/0dvwno657znPuZ5zVLHEcbAmxS1S2qWg68Doz1c0zNTlVzVHWZu12E80GShFPXl91iLwPf8U+EviEiycCVwLQ6u1t7nWOBEcB0AFUtV9UCWnm9gRAgUkRCgChgN62wzqr6BXCg3u7G6jkWeF1Vy1R1K7AJ5zPPa5Y4jpYE7Kjzeqe7r9USkTRgELAI6KiqOeAkF6CD/yLzib8BDwDVdfa19jqfAeQBL7pddNNEpA2tuN6qugv4M7AdyAEKVfVjWnGd62msns32+WaJ42jSwL5We7+yiEQDs4DJqnrQ3/H4kohcBeSq6lJ/x9LCQoDBwPOqOgg4TOvoommU26c/FugGdAHaiMgt/o3qlNBsn2+WOI62E0ip8zoZp4nb6ohIKE7SmKmqs93de0Wks3u8M5Drr/h8YBhwtYh8g9MFebGI/JvWXWdw/k/vVNVF7uu3cBJJa673aGCrquapagUwG7iA1l3nuhqrZ7N9vlniONoSoIeIdBORMJyBpDl+jqnZiYjg9Hlnq+pTdQ7NASa42xOAt1s6Nl9R1YdUNVlV03D+XT9X1VtoxXUGUNU9wA4R6eXuugRYS+uu93ZgqIhEuf/XL8EZx2vNda6rsXrOAW4SkXAR6Qb0ABafyBvYk+P1iMgYnL7wYGCGqv7ezyE1OxEZDiwAVnGkv/9hnHGON4FUnF++61W1/sDbaU9ELgLuV9WrRKQ9rbzOIjIQ54aAMGALcCvOl8ZWW28ReRS4EecOwuXARCCaVlZnEXkNuAhn6vS9wK+B/9FIPUXkF8APcf5eJqvqByf0vpY4jDHGeMO6qowxxnjFEocxxhivWOIwxhjjFUscxhhjvGKJwxhjjFcscRhjjPGKJQ5jPCQiGe6faSLyPX/HY4y/WOIwxkOqeoG7mQZ4lThOdPpqY05FljiM8ZCIHHI3/wRcKCJZ7oJBwSLypIgsEZGVIvIjt/xF7oJZr+I8pd/QNdPcxZVecBce+lhEIt1j80Qk3d1OcOfZQkR+ICL/E5F3RGSriNwlIj91Z79dKCLtfP13YQKbJQ5jvPcgsEBVB6rqX4HbcKbuPhc4F7jdnQsInPUOfqGqx1sQrAfwrKr2BQqA6zyIoR9Oq2cI8Hug2J399mvg+ydSKWM8FeLvAIxpBS4D+ovId93XbXGSQTmw2F0053i2qmqWu70UpyusKXPdRbiKRKQQeMfdvwro703wxnjLEocxJ0+An6jqR0ftdCZTPOzB+WV1tquASHe7kiO9AhHHOae6zutq7Pfa+Jh1VRnjvSIgps7rj4A73TVOEJGe7ip7J+sb4Bx3+7vHKWdMi7JvJsZ4byVQKSIrgJeAp3G6l5a56z/k0TzrWf8ZeFNExgOfN8P1jGkWNq26McYYr1hXlTHGGK9YV5UxLcBdafCzBg5doqr7WzoeY06GdVUZY4zxinVVGWOM8YolDmOMMV6xxGGMMcYrljiMMcZ45f8BXVnvqqd0fEQAAAAASUVORK5CYII=\n"
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "import matplotlib.pyplot as plt\n",
    "\n",
    "# Plot the loss value recorded at each training iteration\n",
    "x_index = len(train_loss_list)\n",
    "plt.plot(range(1, x_index + 1), train_loss_list)\n",
    "plt.xlabel(\"iter_num\")\n",
    "plt.ylabel(\"loss value\")\n",
    "plt.show()"
   ],
   "metadata": {
    "collapsed": false,
    "ExecuteTime": {
     "end_time": "2023-09-28T05:01:04.352367500Z",
     "start_time": "2023-09-28T05:01:01.758725400Z"
    }
   }
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
     "version": 3
    },
    "file_extension": ".py",
    "mimetype": "text/x-python",
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
    "version": "3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 0
}
