{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 作业题目：\n",
    "#### 问题描述\n",
    "以mnist为数据集，使用tensorflow，构造并训练一个神经网络，在测试机上达到超过98%的准确率。\n",
    "#### 解题提示\n",
    "在完成过程中，需要综合运用目前学到的基础知识：\n",
    "1. 深度神经网络\n",
    "2. 激活函数\n",
    "3. 正则化\n",
    "4. 初始化\n",
    "\n",
    "并探索如下超参数设置：\n",
    "* 隐层数量\n",
    "* 各隐层中神经元数量\n",
    "* 学习率\n",
    "* 正则化因子\n",
    "* 权重初始化分布参数"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 导入工具和数据"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "C:\\ProgramData\\Anaconda3\\lib\\site-packages\\h5py\\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n",
      "  from ._conv import register_converters as _register_converters\n"
     ]
    }
   ],
   "source": [
    "\"\"\"A very simple MNIST classifier.\n",
    "See extensive documentation at\n",
    "https://www.tensorflow.org/get_started/mnist/beginners\n",
    "\"\"\"\n",
    "from __future__ import absolute_import\n",
    "from __future__ import division\n",
    "from __future__ import print_function\n",
    "\n",
    "import argparse\n",
    "import sys\n",
    "\n",
    "from tensorflow.examples.tutorials.mnist import input_data\n",
    "\n",
    "import tensorflow as tf\n",
    "\n",
    "FLAGS = None\n",
    "#\n",
    "import numpy as np\n",
    "import matplotlib.pyplot as plt\n",
    "%matplotlib inline\n",
    "#"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "我们在这里调用系统提供的Mnist数据函数为我们读入数据，如果没有下载的话则进行下载。\n",
    "\n",
    "这里将data_dir改为适合你的运行环境的目录 <font color=#ff0000>** **</font>"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Extracting input_data\\train-images-idx3-ubyte.gz\n",
      "Extracting input_data\\train-labels-idx1-ubyte.gz\n",
      "Extracting input_data\\t10k-images-idx3-ubyte.gz\n",
      "Extracting input_data\\t10k-labels-idx1-ubyte.gz\n"
     ]
    }
   ],
   "source": [
    "# Import data\n",
    "data_dir = 'input_data'  #E:\\lzp\\csdn\\prj\\week6\\week6_mnist\\\n",
    "mnist = input_data.read_data_sets(data_dir, one_hot=True)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 熟悉样本"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      " 类型是 <class 'tensorflow.contrib.learn.python.learn.datasets.base.Datasets'>\n",
      " 训练数据有 55000\n",
      " 测试数据有 10000\n",
      " 数据类型 is <class 'numpy.ndarray'>\n",
      " 标签类型 <class 'numpy.ndarray'>\n",
      " 训练集的shape (55000, 784)\n",
      " 训练集的标签的shape (55000, 10)\n",
      " 测试集的shape' is (10000, 784)\n",
      " 测试集的标签的shape (10000, 10)\n"
     ]
    }
   ],
   "source": [
    "#观察\n",
    "print (\" 类型是 %s\" % (type(mnist)))\n",
    "print (\" 训练数据有 %d\" % (mnist.train.num_examples))\n",
    "print (\" 测试数据有 %d\" % (mnist.test.num_examples))\n",
    "trainimg   = mnist.train.images\n",
    "trainlabel = mnist.train.labels\n",
    "testimg    = mnist.test.images\n",
    "testlabel  = mnist.test.labels\n",
    "# 28 * 28 * 1\n",
    "print (\" 数据类型 is %s\"    % (type(trainimg)))\n",
    "print (\" 标签类型 %s\"  % (type(trainlabel)))\n",
    "print (\" 训练集的shape %s\"   % (trainimg.shape,))\n",
    "print (\" 训练集的标签的shape %s\" % (trainlabel.shape,))\n",
    "print (\" 测试集的shape' is %s\"    % (testimg.shape,))\n",
    "print (\" 测试集的标签的shape %s\"  % (testlabel.shape,))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "1th 训练数据 标签是 3\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAQQAAAECCAYAAAAYUakXAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAADk5JREFUeJzt3V+InfWdx/HPdxv3ZvQiZkYN1ky2IklrYaNMZKExKGWj6Y2TC8tGkRRWR6RCY/Zi4x9UiImyNK7JTXS6hqbQuEj+rFKUVqTE7E3IP9GYiWsp2WgdMom5UPGi6Hz34jz57jSd+f3OnHPm/J5J3i+QOfN8z5+vj+OH53nO9/yOubsAQJL+pnQDAOqDQAAQCAQAgUAAEAgEAIFAABCKBIKZ3WlmH5rZH8xsfYkeUszspJm9b2bvmtmhGvSz3czGzOzYhG1XmtlbZvZR9XNuzfp72sz+VO3Dd83sRwX7u87Mfm9mI2b2gZn9rNpei32Y6K/r+9C6PYdgZt+S9D+S/lHSJ5IOSlrt7se72kiCmZ2UNODuZ0v3IklmtlzSl5J+5e7fr7b9m6Rz7v5cFapz3f1fa9Tf05K+dPefl+hpIjObL2m+ux8xsyskHZY0KOknqsE+TPT3Y3V5H5Y4QrhF0h/c/Y/u/mdJ/ynprgJ9zBru/o6kcxdsvkvSjur2DjX+gIqYor/acPdRdz9S3f5C0oika1WTfZjor+tKBMK1kj6e8PsnKvQvn+CSfmdmh81sqHQzU7ja3Uelxh+UpKsK9zOZh83sveqUotgpzURmtlDSTZIOqIb78IL+pC7vwxKBYJNsq9v89A/c/WZJKyX9tDokxvRsk3S9pCWSRiVtLtuOZGaXS9otaa27f166nwtN0l/X92GJQPhE0nUTfv+2pE8L9DEld/+0+jkmaa8apzl1c7o69zx/DjpWuJ+/4O6n3f0bdx+X9AsV3odmdpka/7P92t33VJtrsw8n66/EPiwRCAcl3WBmf2dmfyvpnyS9XqCPSZlZT3VhR2bWI2mFpGPpRxXxuqQ11e01kl4r2MtfOf8/WmWVCu5DMzNJL0sacffnJ5RqsQ+n6q/EPuz6uwySVL198oKkb0na7u4bu97EFMzsO2ocFUjSHEk7S/dnZq9Iuk1Sr6TTkp6S9F+SXpW0QNIpSXe7e5ELe1P0d5sah7ou6aSkB8+frxfob5mk/ZLelzRebX5MjfP04vsw0d9qdXkfFgkEAPXEpCKAQCAACAQCgEAgAAgEAoBQNBBqPBYsif7aVef+6tybVK6/0kcItf6PIvprV537q3NvUqH+SgcCgBppazDJzO6UtEWNicP/cPfnMvdnCgooxN0n+2DhX2g5EFpZ6IRAAMppJhDaOWVgoRPgItNOIMyGhU4ATMOcNh7b1EIn1dsndb+iC0DtBUJTC524+7CkYYlrCEDdtXPKUOuFTgBMX8tHCO7+tZk9LOm3+v+FTj7oWGcAuq6rC6RwygCUM9NvOwK4yBAIAAKBACAQCAACgQAgEAgAAoEAIBAIAAKBACAQCAACgQAgEAgAAoEAIBAIAAKBACAQCAACgQAgEAgAAoEAIBAIAAKBACAQCAACgQAgEAgAAoEAIBAIAAKBACAQCAACgQAgEAgAwpzSDaA++vv7k/X7778/WX/88ceTdXdP1s3S31Y+MjKSrD/xxBPJ+t69e5N1tBkIZnZS0heSvpH0tbsPdKIpAGV04gjhdnc/24HnAVAY1xAAhHYDwSX9zswOm9lQJxoCUE67pww/cPdPzewqSW+Z2Ql3f2fiHaqgICyAWaCtIwR3/7T6OSZpr6RbJrnPsLsPcMERqL+WA8HMeszsivO3Ja2QdKxTjQHoPsu9NzzlA82+o8ZRgdQ49djp7hszj2ntxdCUvr6+ZP3RRx9N1u+9995kfd68ecl6bo6g3TmE3OM//vjjZH3p0qXJ+tmzF/ebZe6e3sFq4xqCu/9R0t+3+ngA9cPbjgACgQAgEAgAAoEAIBAIAAKBACC0PIfQ0osxh9CW3HoDGzZsSNZLzwGcOXMmWc/p7e1N1hcuXJisHz9+PFm/8cYbp9vSrNLMHAJHCAACgQAgEAgAAoEAIBAIAAKBACAQCAACcwizyMGDB5P1m2++OVlvdw4h9z7+7bffnqy3u97AsmXLkvV9+/Yl67l//zlzLu6vKWEOAcC0EAgAAoEAIBAIAAKBACAQCAACgQAgMIdQI4sXL07Wc3MIn332WbKeW48gNyfwyCOPJOtr165N1jdt2pSsnzp1KlnPyf0tj4+PJ+sPPfRQsj48PDztnuqEOQQA00IgAAgEAoBAIAAIBAKAQCAACAQCgMAcwiySm1PIzRG0ux7B0NBQsr5t27ZkfenSpcn6kSNHkvVVq1Yl67t27UrWc3/r11xzTbLe7v4rrSNzCGa23czGzOzYhG1XmtlbZvZR9XNuu80CKK+ZU4ZfSrrzgm3rJb3t7jdIerv6HcAslw0Ed39H0rkLNt8laUd1e4ekwQ73BaCAVi8qXu3uo5JU/byqcy0BKGXGV5U0syFJ6atRAGqh1SOE02Y2X5Kqn2NT3dHdh919wN0HWnwtAF3SaiC8LmlNdXuNpNc60w6AkrKnDGb2iqTbJPWa2SeSnpL0nKRXzeyfJZ2SdPdMNomGEydOFH393HoKH374YbKeW68ht97C+vXpN7Ny3ysx03MaF4NsILj76ilKP+xwLwAKY3QZQCAQAAQCAUAgEAAEAgFAIBAAhBkfXUb3LF++PFnPraeQmzMYGRlJ1hctWpSsHzhwIFnv6+tL1nPrGeT6X7lyZbIOjhAATEAgAAgEAoBAIAAIBAKAQCAACAQCgMAcwkXknnvuSdYfeOCBZD23nkBuDiD3+NycQbvrGWzdujVZz33vAzhCADABgQAgEAgAAoEAIBAIAAKBACAQCAACcwiXkNwcQenH79+/P1lft25dss6cQfs4QgAQCAQAgUAAEAgEAIFAABAIBACBQAAQmEO4iOzcuTNZ7+/vT9Z7e3uT9dz3OvT09CTrOU8++WSyzpzBzMseIZjZdjMbM7NjE7Y9bWZ/MrN3q39+NLNtAuiGZk4Zfinpzkm2/7u7L6n+eaOzbQEoIRsI7v6OpHNd6AVAYe1cVHzYzN6rTinmdqwjAMW0GgjbJF0vaYmkUUmbp7qjmQ2Z2SEzO9TiawHokpYCwd1Pu/s37j4u6ReSbkncd9jdB9x9oNUmAXRHS4FgZvMn/LpK0rGp7gtg9rAm1tp/RdJtknolnZb0VPX7Ekku6aSkB919NPtiZu19oB5F5eYQnnnmmWR9cHAwWT969GiyvnLlymQ9970Nlzp3T3/xhZoYTHL31ZNsfrmljgDUGqPLAAKBACAQCAACgQAgEAgAAoEAIGTnEDr6YrN8DqGvry9ZP3PmTJc6mZ3efPPNZP2OO+5I1nPfy/DCCy9Mu6dLSTNzCBwhAAgEAoBAIAAIBAKAQCAACAQCgEAgAAh8L8MEy5cvT9Y3b55ypThJ0okTJ5L1++67b9o9XUw2btyYrK9YsSJZX7RoUSfbwSQ4QgAQCAQAgUAAEAgEAIFAABAIBACBQAAQLqk5hNx6Bi+++GKyPjY2lqxf6nMGPT09yfpLL72UrJtlP66PGcYRAoBAIAAIBAKAQCAACAQCgEAgAAgEAoBwSc0hrFq1KlnPfd5+3759nWxn1lm8eHGyvnv37mQ9t39z3xGSW28C7cseIZjZdWb2ezMbMbMPzOxn1fYrzewtM/uo+jl35tsFMJOaOWX4WtK/uPt3Jf2DpJ+a2fckrZf0trvfIOnt6ncAs1g2ENx91N2PVLe/kDQi6VpJd0naUd1th6TBmWoSQHdM66KimS2UdJOkA5KudvdRqREakq7qdHMAuqvpi4pmdrmk3ZLWuvvnzX4QxcyGJA211h6AbmrqCMHMLlMjDH7t7nuqzafNbH5Vny9p0o8Cuvuwuw+4+0AnGgYwc5p5l8EkvSxpxN2fn1B6XdKa6vYaSa91vj0A3WS5937NbJmk/ZLelzRebX5MjesIr0paIOmUpLvd/VzmudIvNsNy76OPjIwk68ePH0/Wn3322bae//Dhw8l6Tn9/f7J+6623Juu5OY3BwfR149xpZO5vbcuWLcn6unXrknWkuXv2PD97DcHd/1vSVE/0w+k2BaC+GF0GEAgEAIFAABAIBACBQAAQCAQAITuH0NEXKzyHkLNr165kfabfhz969GiynrNgwYJkfd68ecl6u/3nHr9x48ZkfevWrcn62bNnk3WkNTOHwBECgEAgAAgEAoBAIAAIBAKAQCAACAQCgMAcwgR9fX3J+htvvJGsDwykF4UaHx9P1md6DiD3+K+++ipZz30vwqZNm5L1vXv3JuuYWcwhAJgWAgFAIBAABAIBQCAQAAQCAUAgEAAE5hCmobe3N1nfsGFDW88/NJT+xrs9e/Yk6+2uF5D7XoTcHALqjTkEANNCIAAIBAKAQCAACAQCgEAgAAgEAoCQnUMws+sk/UrSNZLGJQ27+xYze1rSA5LOVHd9zN2TCwbM9jkEYDZrZg6hmUCYL2m+ux8xsyskHZY0KOnHkr5095832xCBAJTTTCDMaeJJRiWNVre/MLMRSde23x6AupnWNQQzWyjpJkkHqk0Pm9l7ZrbdzOZ2uDcAXdZ0IJjZ5ZJ2S1rr7p9L2ibpeklL1DiC2DzF44bM7JCZHepAvwBmUFMfbjKzyyT9RtJv3f35SeoLJf3G3b+feR6uIQCFdOTDTdZYyvdlSSMTw6C62HjeKknHWmkSQH008y7DMkn7Jb2vxtuOkvSYpNVqnC64pJOSHqwuQKaeiyMEoJCOvO3YSQQCUA7rIQCYFgIBQCAQAAQCAUAgEAAEAgFAIBAABAIBQCAQAAQCAUAgEAAEAgFAIBAABAIBQCAQAITsqssddlbS/074vbfaVlf0154691fn3qTO99ffzJ26ukDKX7242SF3HyjWQAb9tafO/dW5N6lcf5wyAAgEAoBQOhCGC79+Dv21p8791bk3qVB/Ra8hAKiX0kcIAGqEQAAQCAQAgUAAEAgEAOH/AHjxiVGQ9BiwAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x3a75198>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "#熟悉样本\n",
    "nsample = 1\n",
    "# randidx = np.random.randint(trainimg.shape[0], size=nsample) \n",
    "randidx = [1] #第n个图\n",
    "for i in randidx:\n",
    "    curr_img   = np.reshape(trainimg[i, :], (28, 28)) # 28 by 28 matrix \n",
    "    curr_label = np.argmax(trainlabel[i, :] ) # Label\n",
    "    plt.matshow(curr_img, cmap=plt.get_cmap('gray'))\n",
    "    print (\"\" + str(i) + \"th 训练数据 \" + \"标签是 \" + str(curr_label))\n",
    "    plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "*"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 熟悉激活函数"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "激活函数(Activation Functions)： "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "sigmoid tanh relu softplus RReLU  leakRelu PReLU MaxOut ELU SELU Swish cRelu MPELU"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "官网地址：http://www.tensorfly.cn/tfdoc/api_docs/python/nn.html"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "优缺点分析：\n",
    "1. https://zhuanlan.zhihu.com/p/22142013   \n",
    "2. https://blog.csdn.net/weixin_39881922/article/details/79045687"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAArMAAADuCAYAAAAqaI8bAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzs3Xd8Tff/wPHXvcnNHjKRQUKMIASxaY0aRbVVdKlVOtBW2/SLTjq1X1209Kv8bGrVqJbWqKpaTWIEsQWJSCJ73nl+f1xSaYKQcTPez8fjNPee8znnvG+in/u+n/sZKkVREEIIIYQQoipSWzoAIYQQQggh7pUks0IIIYQQosqSZFYIIYQQQlRZkswKIYQQQogqS5JZIYQQQghRZUkyK4QQQgghqixJZoUQQgghRJUlyawQQgghhKiyJJkVQgghhBBV1t0ms4psspXX1q9fP4vHIFu13moiS//OZavGW0nr7NTly5WYpsFK/H/+Y/GYZatSW4lJy6yoNK5du2bpEIQQQpRQSevsvMgoABzatCnPcEQNJsmsEEIIIcpN7qFDANi3lmRWlA9JZoUQQghRLvQJCRgSElA7O2PbKMjS4YhqSpJZIYQQQpSL3ChzFwP70FBUakk5RPmwLu0F9Ho9cXFx5Ofnl0U8lZ6dnR1+fn5oNBpLhyKEEHetptXZpSH1fenlRZm7GDi0aW3hSER1VupkNi4uDmdnZwICAlCpVGURU6WlKAopKSnExcURGBho6XCEEOKu1aQ6uzSkvi8buYeut8xKf1lRjkrd5p+fn4+Hh0eNqBRVKhUeHh7SoiGEqLJqUp1dGlLfl54xOwftyVNgZYV9yxBLhyOqsTLpwFKTKsWa9FqFqGwup+ayJTrB0mFUeVKPlYz8nkon/+gRMJmwCw5G7eBg6XBENSa9sYUQVUKO1sC4JRG8uDyKDYfiLR2OEOIOcq/3l7WX/rKinNWoZHbUqFGsXbvW0mEIIe6SyaTw2urDnLyaRQMvR3oGe1s6JFFB/vzzT5o3b05oaCgxMTGsWLHijud07ty5AiITd5IXJYsliIpRo5JZIUTV9PWOM/x6PBFnO2u+HxGGi52MLq8pli9fTnh4OIcPHyYxMbFEyezevXsrIDJxO4rRSN6RI4AM/hLlr9SzGdwsYMrPZXm5ArEzBtzyWE5ODsOGDSMuLg6j0cg777xDUFAQr732GtnZ2Xh6erJo0SLq1q1bONaAACIiIvD09CQiIoLw8HB27dpVLvELIe7dlugEvt5xBrUKZj/ZmoZeTpYOqdqwRJ0Nxdfbnp6ehIeHYzAYaNeuHXPnzmXp0qWsXr2aX3/9le3bt3Pu3DliYmIIDQ1l5MiR9OnTh9GjR6PT6TCZTKxbt45GjRrh5OREdnY2JpOJiRMn8scffxAYGIjJZGLMmDEMGTKEgIAARo4cyU8//YRer2fNmjU0bdq0XH4fNZH29GlMOTlofH3R1JZvUkT5KtNk1hK2bt2Kj48PP/9srpQzMjJ48MEH2bhxI15eXqxatYq33nqL//u//7NwpEKIu3XiSiavrTa37kx9MJjuTeRNsToort5u0aIFO3bsoHHjxowYMYK5c+cyadIk9uzZw8CBAxkyZAi7du1i5syZbN68GYCXXnqJV155haeffhqdTofRaCx0nx9//JHY2Fiio6NJSkoiODiYMWPGFBz39PQkKiqKOXPmMHPmTObPn19xv4RqrmCxBOliICpAmSazd/o0Xh5CQkIIDw9n8uTJDBw4EDc3N44dO0bv3r0BMBqNRVplhRCVX0q2lnFLIsjTGxnc2pex3WSuz7JmiTobitbbLi4uBAYG0rhxYwBGjhzJt99+y6RJk257nU6dOvHRRx8RFxfH4MGDadSoUaHje/bsYejQoajVaurUqUOPHj0KHR88eDAAbdu25ccffyzDVyhksQRRkap8n9nGjRsTGRlJSEgIU6dOZd26dTRv3pzDhw9z+PBhoqOj+e2334qcZ21tjclkApB5BEthzJgxeHt706JFi4J9qamp9O7dm0aNGtG7d2/S0tIsGKGoivRGE+OXRxGfnkcr/1p8PDhEpkmqRv5db2/cuPGervPUU0+xadMm7O3t6du3Lzt37ix0XFGU255va2sLgJWVFQaD4Z5iqAqKq6dvpigKL7/8MkFBQbRs2ZKo662qpVGwWIK0zIoKUOWT2StXruDg4MDw4cMJDw/nwIEDJCcns2/fPsC8dOPx48eLnBcQEEBkZCQA69atq9CYq5NRo0axdevWQvtmzJhBr169OHPmDL169WLGjBkWik5UVdN/Os6BC6l4O9sy75m22GmszAfyM+HKYcsGJ0rt3/X23r17iY2N5ezZswAsXbqU+++/v8h5zs7OZGVlFTw/f/48DRo04OWXX2bQoEEcPXq0UPmuXbuybt06TCYTiYmJNXZcRHH19M22bNnCmTNnOHPmDPPmzePFF18s1f30V69iuJKA2skJ26CgUl2rslIUBZNiwqSYMJqMGE1GDCbDHTe9SV+5N6POvBm0xWz5RTd9Xok2o0Ffrn+PKt9nNjo6mjfeeAO1Wo1Go2Hu3LlYW1vz8ssvk5GRgcFgYNKkSTRv3rzQee+99x7PPvssH3/8MR06dLBQ9FXffffdR2xsbKF9GzduLHjTGDlyJN27d+fTTz+t+OBElbT8wEWW7b+EjbWa/z3TltouduYDRj0Rc3pzxCqJZ/vPg0a9LRuouGfF1dsZGRkMHTq0YADYCy+8UOS8li1bYm1tTatWrRg1ahT5+fksW7YMjUZDnTp1ePfddwuVf+yxx9ixYwctWrSgcePGdOjQAVdX14p6mZVGcfX0zTZu3MiIESNQqVR07NiR9PR0EhIS7rmL3o0puexbtUJlZXVP17gX2bpsEnISSMhJ4GrOVVLzU8nWZZOlzyJLl0WOPod8Qz46ow6tSYvOqENn1JkTUcWAUTEnpSbFhMI/yaqiKJgw/1S4fWu/KN4k9zCefWhhuV2/yiezffv2pW/fvkX27969u8i+RYsWFTzu1q0bp0+fLs/QaqzExMSCSrBu3bokJSXdsuy8efOYN28eAMnJyRUSn6i8DpxP4b2N5m9SPnk0hNb13MwHFIW0za/wrksGIzIAj+rZ2lNT3KrePnToUJF9N9fbGo2GHTt2FDo+derUIudkZ2cDoFarmTlzJk5OTqSkpNC+fXtCQszLqt6c3IWFhdXYVluA+Ph4/P39C577+fkRHx9fbDJbkjq7IhZLSM9P50jyEQ4nH+Zw0mFOpZ4iS5915xPLmOp6V5YbnaBK0hmqJnaYUpfzq67yyayo2p577jmee+45wPyGImquuLRcxi+PwmBSGNctkMfa+hUc0+75kleubqOv1sgTw1aBuwwGEyUzcOBA0tPT0el0vPPOO9SpU8fSIVU6xfUtvlUf9ZLU2eW1WEKWLosfz/zI+jPrOZdxrshxOys76jrVpY5DHeo61cXDzgNnG2ecbJxw1jjjqHHE3toeW7UGm+wkbJLPYHPtDNbpF7FKi8Uq/TJWJj1qxdwHUw2oFQUVFN7UGlT2bmDnCrbOYOdi/mnjDDYOoLEHjSNo7MDaHqxtwMoWrG3Byub6pjFv6hs/rUBtDSqrmx6rzZva6vpjq3/2qVTXN7U5KtX16G4cK7TvplRb9a+0++a/cxUelyDJrChztWvXLviKKiEhAW9vmU5J3F6uzsC4JZGk5Oi4r7EXUx4MLji2d+FU1ueuwht4qcdn4N/ecoGKKqcmt7iWlJ+fH5cvXy54HhcXh4+Pzz1dy5ieTn5MDGg02LdqVSbxXc68zPKTy1l/Zj25hlwAbK1saeHZglCvUEK9Q2nh2QIPO4/ik3CDDi4fgJM7IT4CrhwBbUYxd1KBqz+41QdXP3DxMW/OPuBUGxzcwcHDnLhW4cSvOpJkVpS5QYMGsXjxYqZMmcLixYt5+OGHLR2SqMQURSF8zRFiEjIJ9HRk9hOtsVKb3yiO/7aE/WnLuOJoz0vadqhDhlg4WiGqn0GDBvHNN9/wxBNPcODAAVxdXe+5v2zOwYOgKNi3aonawaFUcWXpsvhg3wdsjd1a0Fe1fZ32DA8eTlffrmisbrMSYOYVOP0rnN0O5/8A3b+6IDjXBZ824BMKno3BsxG4NzC3qooqR5JZUSpPPvkku3bt4tq1a/j5+TF9+nSmTJnCsGHDWLBgAfXq1WPNmjWWDlNUYrN3nuWX6Ks425qXqnV1ML9BxR3fR9TfU9hW24WXztrS4eOlFo5UiKqpuHparzePLn/hhRfo378/v/zyC0FBQTg4OLBw4b0P1MndfwAAx46dShXz5czLTNw5kfMZ59GoNTwY+CDPNHuGpu63WaXNoIVTv0DUUji3E24erOUVDEG9oH5ncxLrIvPPVyeSzIpSWblyZbH7/z1IQ4ji/Hr8Kl9sO41KBbOebE2Qt3mp2oyky/z9w2DmB7rxxqkcek+PQqWu8jMJCmERt6qnb1CpVHz77bdlcq+c/fsBcOzU8Z6vEXE1gld3vUq6Np2gWkHM6jkLf2f/W5+Qcg4Ofg9HV0FeqnmflQ0E9YbGfaBhL6h1m/NFlSfJrBDCIk5dzeK1VeY5Y//Ttyk9mpr7Vuvzc9n9eVe+auLGpAtpdHt1Pxq70n1dKYQof/rERHTnz6NycMD++qwRd2v9mfW8v/99DCYDXX278t/7/ouTjVPxhdNi4Y/P4MhKUMyLIFE7BNo8AyFDzX1cRY1Qo5LZ7t27M3PmTBk1L4SFpeXoGLvkb3J0Rh4O9eGF+xuYDygK+z7vw5xG9jx7NZ2woWtwrVPfssGKCjF27Fhee+01mjVrVm736N+/PytWrKBWrVqF9k+bNg0nJyfCw8PL7d41Qe71VlmHsLaobGzu+vz50fP5OuprAIYHD+f1sNexVheTpmTEwe7/wqFlYDKYR/m3Hg7txpn7wIpyozeayMzTk5GnJzPfQGaenqx8A1n5ejLzbzw2kK01kH39Z1a+ntFdAnmktW+5xVXtkllFUVAUBbV8JSlEpaQ3mpiwIorLqXm09HPl08daFoxA1v75OfM8k+iTraVN24/xb9nVwtGKijJ//vxyv8cvv/xS7veoyXL2Xe9icA/9ZY9fO87sQ7NRq9S81eEthjUZVrSQ0QB/fWlujTXqzNNQtXoS7v+PefCWKDG90UR6rp60XB0p2TrSc3WkXX+enqsjPVdP+o2kNU9Peq75cZ7eeE/3i0vLLeNXUFiZJ7O3Wz/9f//7X8H8dPPmzeP555+/Zdk7ral9s9jYWB588EF69OjBvn37mDRpEt999x1arZaGDRuycOFCnJwKf03h5ORUMLH22rVr2bx5c6HJuYUQ5eOjn2PYey4FTydb/nfTUrWmYz/ydvQc6gKv9P4KdfNHLRtoTTCtnFbDmlbctEf/yMnJYdiwYcTFxWE0GnnnnXeYO3duwTdnCxYs4NNPP8XHx4dGjRpha2vLN998w6hRo7C3t+fkyZNcvHiRhQsXsnjxYvbt20eHDh0K6vCVK1fy8ccfoygKAwYMKFiBMCAggIiICDw9Pfnoo49YsmQJ/v7+eHl50bZt2/L5XdQQiqLcc39ZvUnPu3vfxaSYGNFsRPGJbFIMbHgRrlxfWKPFY3D/FPBqXNrQqwVFUcjWGkjO0nItW3f9541NR0q2ltQcHSk55seZ+YZ7uo+VWoWLnTUu9hpc7TU421njYmf+6XzzT1trnOyscbazxsnWGj+38u0qVm1aZk+dOsXChQt5//33GTx4MNu3b8fR0ZFPP/2UL774osgyh0KIirfq70ss2huLjZV5qdq6ruZpcI79uohtMe9x1d6W+cHPSSJbzW3duhUfHx9+/vlnADIyMpg7dy4AV65c4YMPPiAqKgpnZ2d69uxJq5vmK01LS2Pnzp1s2rSJhx56iL/++ov58+fTrl07Dh8+jLe3N5MnTyYyMhI3Nzf69OnDhg0beOSRRwquERkZyQ8//MChQ4cwGAy0adNGktlS0sXGYrh6FSs3N2ybNLmrcxcdW8TptNP4OvkyIXRC4YNGA+ybDb9/bG6NdfWHQbOhYY8yjL5yMxhNJGZpuZqRx9UMLQkZeVzNyCcxS0tiZj5JmfkkZWnJ1ZW81VStgloONrg72uDuYEMtBw1u13/WcrDBzUFDLQdNQdJay8EGl+uJ6e0aLS2lzJPZkrao3ryKSFmoX78+HTt2ZPPmzZw4cYIuXboAoNPp6NSpdFOECCFKLyI2lbc3HAPgw0db0La+eanay0f3EBX5Jr95uzA10Qvbbq9bMsya5Q4tqOUlJCSE8PBwJk+ezMCBA+nWrVvBsYMHD3L//ffj7m4evDN06NBCS48/9NBDqFQqQkJCqF27dsHytM2bNyc2NpaLFy/SvXt3vLy8AHj66afZvXt3oWT2zz//5NFHH8Xh+jyogwYNKvfXXN0V9Jft0OGuZh45n3GeuUfMH2Te6/QeDpqbWvCyEmH1M+YFDwDajIQ+H5pX3KpG9EYTCen5XE7L5XJqLpfTcolPyyM+PY8r6flczczHaLpzbmWnUePlbIuXky1ezrZ4Ol3fnG3xdDQnrh5Otrg72uBqrymYz7s6qDYts46OjoA5me7du3eJpiK5IT8/v1xjE6Kmi0/P44VlkeiNCqO7BDAszDxNTsbVi0SsGcqCQDfCT+XQafqvsrJODdC4cWMiIyP55ZdfmDp1Kn369Ck4dqcGEVtbWwDUanXB4xvPDQYD1tYle1urjK1LVdk//WVL3sXApJiYvnc6epOeR4MepZPPTQ1PiSdgxTDIuGxegevh2RD0QFmHXWGMJoX4tDzOJWdz4VoOsSk5XLiWw8WUXOLT8+6YrHo52+LjakcdVzvqutpTx9WO2i621Ha2w9vFDm8XW5wraatpRag2yewNHTt2ZMKECZw9e5agoCByc3OJi4ujcePC/Wpq165NTEwMTZo0Yf369Tg7O1soYiGqtzydkeeXRnAtW0fXIE/e6m9eqlafn8ufX95XMAXXfa8flCm4aogrV67g7u7O8OHDcXJyKjReoX379rz66qukpaXh7OzMunXrClpfS6JDhw688sorXLt2DTc3N1auXMlLL71UqMx9993HqFGjmDJlCgaDgZ9++um2YzjE7SkmE7kHri+WcBf9ZVefWk1UUhSe9p68HnbTNzLndsLqkaDNBL928MRKcPIq67DLhdGkcCk1l1NXMzl5NYszSdmcSzInsFqDqdhzVCqo62qHv5sDfu725p9u9vi62eNby5y42lpbVfArqVqqXTLr5eXFokWLePLJJ9FqtQB8+OGHRZLZGTNmMHDgQPz9/WnRokXBYDAhRNlRFIU31h7hWHwm9T0c+Oap1lhbqVFMJrZN68Q3jRwYm5BOuyd+xNVbJjWvKaKjo3njjTdQq9VoNBrmzp1bMC2Wr68vb775Jh06dMDHx4dmzZrh6lrygWp169blk08+oUePHiiKQv/+/Yssqd2mTRsef/xxQkNDqV+/fqFuDuLu5cfEYMzIwNqnLpp69Up0ztWcq3wZ+SUAb3Z4E1fb63/jyEWw+TVQjNDsEXj0u0q7xGy+3sipq1lEx2dwLD6DEwmZnE7MIl9ffNJa28WWhl5OBHo6EujpSICHIwGejvi720uyWkqqu5k1gEJrw5nFxMQQHBxcdhFVATXxNVeEsLAwIiIiLB2GKEPf/n6W//56CkcbKzZM6EKj2uZvQPbPeorZthGE5eTRp8VHNO8zoiLCqYnfv1XJOjs7OxsnJycMBgOPPvooY8aM4dFHLTMosCr8vizlRp2dsmABSf+dievgwfh8/FGJzv304Kcsi1lGr3q9+KrHV+adv38Mf5hnnqDra9DzHagk02wqikJsSi5RF9OIvJTG4UvpnE7MwlBM94A6LnY0qeNM0zrONKrtTJC3Ew29HHG201gg8iqtxHV2tWuZFUJUDttPJDLzt1OoVPD1E60LEllT9FrWWO3H1wAd3YZXVCIrqpBp06axfft28vPz6dOnT6HBW6LyKegvW8IuBnqTnp/Pm2eyGBcyzrxzz5fmRFZlBQO/hLYjyyXWkjKZFGKuZrLvXAr7z6cSdSmN1BxdoTIqFQR5OxHi60oLX1ea+7jQtI4ztRzufsEIUTqSzAohytyZxCwmrTqMosAbfZvwQLPa5gOXDzLr9/9wzdaKec3HY9v1VcsGKiqlmTNnWjoEUUKKTkduZCRgnsmgJPbE7SFNm0ZD14Y082gGEf8H26cBKhg8D0KGlF/AtxGfnsfvJ5PYc+YaBy6kkJarL3Tc08mWNvVq0aa+G639a9HC1xVHW0mjKgP5KwghylR6ro6xSyLI1hoY2LIu47s3BODSkd3s/m0429wcWObVE9sukywcqRCitPKOHEHJy8MmqCEab+8SnbPp3CYABgUNQnVsnbmPLMCAzys0kTWaFKIupbHzZBK/n0zi5NWsQsd9XO3o1NCTTg09aB/gjr+7fY2dLaCyk2RWCFFmDEYTE1cc4mJKLs19XPjvkFaoVCrSE2KJWPc43we48e4lBbfRs2QKLiGqgZx9+4CSL2Gbnp/OrrhdqFVqBiqOsP55QIFe70G7Z8sxUjOTSSHyUho/HbnCL9FXuZatLTjmaGNF10ae3N/Ymy5BHtRzd5DktYqQZFYIUWY+/uUke85ew9PJhnkjwrC3sUKXl82fX9/H143dmHQ+nXav/A1WUvUIUR1k7doFgFO3riUqvzV2KwaTgS7uzfHe+DKYDNBlEnR7rRyjhNOJWaz++zKbjyZwNfOfueXruTvwQHBtejb1pl2gm8wqUEXJO4oQokysjrjM//11AY2VirnD2+Jby948Bdf0Tnwb5MhzV9Jo/9QGXLx8LR2qEKIMKHo92hMxqBwccCjhYgk3uhg8FHsYDPnQdhQ8MK1c4svXG/klOoEVBy4RcTGtYL9vLXsGtqzLwJY+tPB1kdbXaqDaJrMBAQFERETg6elp6VCEqPYiL6bx9nrzUrUfPNyCdgHmpUi3fTyQ5YFaHszMo02nz/FtVrIBIqL6Sk9PZ8WKFYwfP/6ezu/evTszZ84kLCysjCMTd8uYlQU2tjh16YL6ptXYbuV8xnmir0XjqKjomXoVArpB/8/LvMtRQkYe8/+8wJqIy2TmGwBzF4KHW/vyWBs/2tSrJQlsNVNtk1khRMVIyMjj+aWR6IwmRnaqzxPtzZOmH10xja3up6inN9HBYxTBvZ6ycKSiMkhPT2fOnDn3nMyKysOUmQWetjj17Fmi8pvOmltl+2ZnYe9UF4b8X5l2ObpwLYf//XGOdVFx6I3m+V9b+bnyZPt6PNTKR2YeqMbK9i87reSrtNzddTNue3jZsmXMmjULnU5Hhw4dmDNnTsGx2NhYBg4cyLFj5lajmTNnkp2dzbRp08onViFqkHy9keeWRHItW0unBh68PbCZ+cCl/WxPWUKKjRXDM1vTceIHlg1UFCtkccmXib0b0SOjb3lsypQpnDt3jtDQUHr06MHRo0dJS0tDr9fz4Ycf8vDDDxMbG8uDDz5I165d2bt3L76+vmzcuBF7e/NKUGvWrGH8+PGkp6ezYMECWcHLAozZ2ZhycsDbG6f777tzeZORn06tBmBQTj48tRqcSjb7wZ2cTcri6x1n+fnoFUyKuaF3YMu6PH9fQ0L8yikvEZVKlf+YEhMTw6pVq/jrr7/QaDSMHz+e5cuXWzosIao9RVGYvO4o0fEZ+LvbM+fpNmis1JByjjUbnmGngw1LvXtTa9zXlg5VVCIzZszg2LFjHD58GIPBQG5uLi4uLly7do2OHTsyaNAgAM6cOcPKlSv5/vvvGTZsGOvWrWP48OEAGAwGDh48yC+//ML06dPZvn27JV9SjZSzZw8oCvatW2Pt7n7H8gdj1pCkz8JPr6dNj/fBv12pY0jP1fHV9jMs3X8Ro0lBY6ViaGs/nr+/AQ28nEp9fVF1lHHL7O1bUMvDjh07iIyMpF078/8YeXl5eJdwrjshxL373+7zbDx8BUcbK+aPaIebow1pV87z18JefFvHgcU2QbgN+LLSLEcpirpdC2pFUBSFN998k927d6NWq4mPjycxMRGAwMBAQkNDAWjbti2xsbEF5w0ePLjY/aLiZO3cCYBzSboY5Geyae/HYAODnBqgaje2VPc2GE2sPHiJz7edJj1Xj1oFT3Wox8QeQfjUsi/VtUXVVOVbZhVFYeTIkXzyySeF9i9atAgAa2trTCZTwf78/HyEEKXz+8kkPt16EoAvHw+lSR1ndHnZ7JnVnZmNnfjP5RzqT14mU3CJ21q+fDnJyclERkai0WgICAgoqKNtbxpQZGVlRV5eXsHzG8esrKwwGAwVG7RA0evJ/mM3AE49e9yxfO7WyeywNgJqHupbujmmj1xO5z9rj3Iq0bzAQccG7rz3UHOC67rc8zVF1Vflm0x69erF2rVrSUpKAiA1NZWLFy8WHK9duzZJSUmkpKSg1WrZvHmzpUIVolo4m5TFyysPoSjwWu/G9GlexzwF17SOfBPkyPNX0mk9bA3YyZuLKMrZ2ZmsLHMikpGRgbe3NxqNht9//71Q3S0qr9yoQ5gyMlDZ2mAbGHj7wqd/JfLUj+Sp1bR0bYSfW9A93dNoUpiz6yyPzd3LqcQs/Nzs+W54G1aO6yiJrKj6LbPNmjXjww8/pE+fPphMJjQaDd9++23BcY1Gw7vvvkuHDh0IDAykadOmFoy2Zvnyyy+ZP38+KpWKkJAQFi5ciJ2dnaXDEqWQkatn3JJIsrQG+ofU4aWe5jembR/2Z1lDHQMy8mjd+Qt8gttbOFJRWXl4eNClSxdatGhBu3btOHnyJGFhYYSGhkr9XEVkX+9ioHZ2vn3B3FTY9BKH7Mwt6W39S7awwr9dSc/jtdWH2X8+FYDRXQKY3K8pdhpZ4ECYqRRFuZvyRQrHxMQQHBxcdhFVATXxNd+t+Ph4unbtyokTJ7C3t2fYsGH079+fUaNG3fKcsLAwIiIiKi5IcVeMJoXRi/5m9+lkguu6sO7FTjjYWLPnu1dYa9qMvcnEw84j6DhyuqVDvZWaOLGk1NmlJL+vwhRF4VzffugvXeJptYqoEyduVRDWjIITGxgVEESkSsfsnrPp7t/9ru63JTqBKT9Gk5G3yqnvAAAgAElEQVSnx9PJlplDW9K9iYyLqSFKXGdX+ZZZUXkZDAby8vLQaDTk5ubi4+Nj6ZBEKczYEsPu08m4O9rw/Yi2ONhYc3bnMvbmryfDzpYBWWF0nFhpE1khRBnQnTuH/tIlrNzcUOt1ty54bB2c2IDOxoljVoAJQr1CS3wfRVH47o/zBX3zezb15rMhLfF0uvPiDKLmqfJ9ZkXl5OvrS3h4OPXq1aNu3bq4urrSp0+fIuXmzZtHWFgYYWFhJCcnWyBSURLrIuP4/s8LWKtVzHm6DX5uDpByjqhj0/nT0YGhF2vxwH9+sHSYQohylrXD3MXAqXv3WxfKTICfXwfgRLcJaE06Gro2pJZdrRLdw2RSeH/zCT7dehKVCt4eEMyCkWGSyIpbkmRWlIu0tDQ2btzIhQsXuHLlCjk5OSxbtqxIueeee46IiAgiIiLw8vKyQKTiTg5fTmfqevMUTtMGNadjAw/ITWXPqseY66hhjn1T+n3wlywPKUQNcKO/7C1nMVAU+OllyE+HoN4cdqsLQOvarUt0fZ3BxKRVh1n4VywaKxWzn2zN2G4NpH4RtyXJrCgX27dvJzAwEC8vLzQaDYMHD2bv3r2WDkvcpcTMfJ5bEoHOYOLpDvUY3rE+2pxMdszsxJt2Or4wuuA/dClqaxtLhyqEKGeG5GTyjh5FZWODU+fOxRc6tg7O/AZ2rjBoNlFJhwBo493mjtfP1hoYs+hvNh25gpOtNYtGt2dgS+meJu5MkllRLurVq8f+/fvJzc1FURR27NghgyiqmHy9keeWRpKUpaVDoHkuR8VkYtv7nfjUV834q5mEPr4ObO8wolkIUS1kbtkKioJj586oHR2LFshLg61TzI97f4DiXIdD15PZUO/b95fVGoyMXniQPWev4elkww/PdaRLkGdZvwRRTckAMFEuOnTowJAhQ2jTpg3W1ta0bt2a5557ztJhiRJSFIU310dz5HI6vrXMS9XaWKv5dXofljQ0MCgth9DOX6By9bV0qEKICpKxcSMArg8PKr7AtnchJxnqdYbWz3Ah8wLp2nS87L3wc/K75XUVRWHqumj+jk2jjosdq57vSH2PYpJlIW5BWmaBadOmMXPmTEuHUe1Mnz6dkydPcuzYMZYuXVpoRR9RuS3Yc4Efo+Kx11jx/YgwPJxs+XPuS2z2OkeQVke72uNo2n2YpcMUNcCff/5J8+bNCQ0NJSYmhhUrVpToPCcnp3KOrGbRnjlD/vHjqJ2dcepRTH/Zi3shagmoNfDQV6BWcyjR3Crb2rv1bfu8ztl1jh8PxeNgY8WCUWGSyIq7VmOSWaPRaOkQhKgS/jidzMe/xADwxbBWNPNx4cjGOfyl20SOWk2vvA50eOY9C0cpaorly5cTHh7O4cOHSUxMLHEyK8pWxqZNALj064f634vfGLTw0yvmx91eA68mAEQlRQHQpvat+8tuiU7gv7+eQqWCrx4PpbmPa9kHL6q9atHNIDY2ln79+tGhQwcOHTpE48aNWbJkCc2aNWPMmDH89ttvTJw4kXbt2jFhwgSSk5NxcHDg+++/L7LiTPfu3Zk5cyZhYWFcu3aNsLAwYmNjLfPChKhg55OzmbgiCpMCL/dqxIMhdbl6fA8RMR/xl4cz42Nd6fWRTMFVXcQ0LZ9+7MEnY257PCcnh2HDhhEXF4fRaOSdd97B09OT8PBwDAYD7dq1Y+7cuSxdupTVq1fz66+/sn37ds6dO0dMTAyhoaGMHDkSNzc31q9fj1ar5cKFCzz11FO8917hD1q7du1i5syZBUuZT5w4kbCwMEaNGsWUKVPYtGkT1tbW9OnTR76huwXFaCTjJ/Pvz/WRh4sW2PMlXDsNHo2g62sFu2/0l23tXfxMBkfj0nl19WEApj7YlD7N65Rx5KKmKPNkNmRxSFlfkuiR0Xcsc+rUKRYsWECXLl0YM2YMc+bMAcDOzo49e/YA0KtXL7777jsaNWrEgQMHGD9+PDuvTzMiRE2Xma9n7JIIsvIN9G1em0m9GkFOCqf/mMAyL1deP6Wlz0d/WjpMUQ1s3boVHx8ffv75ZwAyMjJo0aIFO3bsoHHjxowYMYK5c+cyadIk9uzZw8CBAxkyZEiRxHTRokUcPHiQY8eO4eDgQLt27RgwYABhYWF3jCE1NZX169dz8uRJVCoV6enp5fqaq7LcgwcxXL2Kxs8P+zb/amW9dgb+/Nz8+KGvQGNutU3OTeZy1mUcrB1o7Na4yDWvZuQzdnEE+XoTj4f5M65bg/J+GaIaK/NktiSJZ3nw9/enS5cuAAwfPpxZs2YB8PjjjwOQnZ3N3r17GTp0aME5Wq224gMVohIymhReWXmI88k5NKntzBfDQlEbtZz6YShv2+mZZXClxbQtWGlkCq7q5E4tqOUlJCSE8PBwJk+ezMCBA3FxcSEwMJDGjc1Jz8iRI/n222+ZNGnSHa/Vu3dvPDw8ABg8eDB79uwpUTLr4uKCnZ0dY8eOZcCAAQwcOLB0L6oay9hwY+DXw0X7vm5+FYw6aD0cAroW7L7RKtvKqxXW6sKpxo0BpklZWjo2cOeDR1rIPLKiVKpNn9l//49w47nj9elDTCYTtWrV4vDhwwVbTEzRitza2hqTyQRAfn5+OUctROXw2a8n+f1UMrUcNHw/IgwHjZpdM7oxkQTezFUR+sSPWDuUbPUeIe6kcePGREZGEhISwtSpU9l4fZT8vbhV3X/DzXU6/FOvW1tbc/DgQR577DE2bNhAv3797jmGqmDr1q00adKEoKAgZsyYUeT4okWL8PLyIjQ0lNDQUObPnw+AKSeHzG3bgGJmMchLg9g/wcEDen9Q6FBBF4NiFkv49XgiO08m4WxrzawnWmNjXW1SEWEh1eZf0KVLl9i3bx8AK1eupGvXroWO3/jkv2bNGsD8yfDIkSNFrhMQEEBkZCQAa9euLeeohbC8DYfi+d8f57FSq5jzVBvqeTjw2/t9mFM7k0fTsun96ApwqWvpMEU1cuXKFRwcHBg+fDjh4eHs3buX2NhYzp49C8DSpUu5//77i5zn7OxMVlZWoX3btm0jNTWVvLw8NmzYUPAN3Q3169fnxIkTaLVaMjIy2LFjB2D+ti4jI4P+/fvz1Vdfcfjw4XJ6tZZnNBqZMGECW7Zs4cSJE6xcuZITJ04UKff4448XNPaMHTsWgKzt21Fyc7Fv0wabevX+KZyXDhnx5se93wcH90LXKhj89a/FEnK0Bqb/dByAN/o1wdvlX4PJhLgH1SaZDQ4OZvHixbRs2ZLU1FRefPHFImWWL1/OggULaNWqFc2bNy+2NSA8PJy5c+fSuXNnrl27VhGhC2ExR+PSmbzuKADvDmxG5yBPdn87ns3eF2ii1RFW93msfFtZOEpR3URHR9O+fXtCQ0P56KOP+PDDD1m4cCFDhw4lJCQEtVrNCy+8UOS8li1bYm1tTatWrfjyyy8B6Nq1K8888wyhoaE89thjRboY+Pv7M2zYMFq2bMnTTz9N69bmlsKsrCwGDhxIy5Ytuf/++wuuVx0dPHiQoKAgGjRogI2NDU888USJW8P/mVv2XwO/dn4AJj3U6wStnip0KEefw8nUk1iprAjxLDyO5qvtp0nIyKelnytPd6h/7y9KiJtUi9kMANRqNd99912hff+ehSAwMJCtW7cWOXfatGkFj5s2bcrRo0cLnn/44YdlGqcQlUVSZj7PLYlEazDxZHt/RnSqz+H1s9lj+Jk8Wxseye1I+6fftnSYohrq27cvffv2LbL/0KFDRfYtWrSo4LFGoyloWb1xzNvbm2+++abIednZ2QWPP/vsMz777LMiZQ4ePHi3oVdJ8fHx+Pv7Fzz38/PjwIEDRcqtW7eO3bt307hxY7788kvqaDTk7NuPysYGl37mv9e8efPYs2oWi7peBlQw4AtQF24XO5p8FJNiooVHCxw0DgX7T1zJ5P/+ikWtgo8eCcFKLf1kRdmoNi2zQoiS0xqMvLAskquZ+bQLcGP6oBZciNhGxOkZHHCwZ/BlT3qFL7d0mEKIMqAoSpF9/+5b/NBDDxEbG8vRo0d54IEHGDlyJBk//QSKglPPnli5mud/fW7ssywZ5oFaBTh5Qe1mRa5dXH9Zk0nh7Q3RGE0KIzoFEOIn88mKslMtktmAgACOHTtm6TCEqBIUReHt9ceIupSOj6sdc4e3JT85lqifR7LC04VxZ/T0nfaHpcMU4o5GjRpVbKusKMzPz4/Lly8XPI+Li8PHx6dQGQ8Pj4JVGseNG0dkZCQZ6zcA/xr4FfF/kHAYXPzAufi+9MX1l10VcZmoS+l4O9vyWp+iU3UJURplkswW96mvuqpJr1VUTwv/imVNZBx2GjXzRoThaasQ/8uzzK7nxivnM+kxdb9MwVXNST1WMtXl99SuXTvOnDnDhQsX0Ol0/PDDDwwaVHhmgoSEhILHmzZtYmhQELrz57H29sbpxoDqrETYcX3WggdngKpoCqEoCjEp5pmCbvSXzcjVM2PLSQDefagZLnaasn6JooYrdTJrZ2dHSkpKtfmf/nYURSElJQW7fy/lJ0QVsefMNT66vlTtf4e0okVdZxLXj+UlrvJWLvR7dS+Obl4WjlKUp5pUZ5dGdarvra2t+eabb+jbty/BwcEMGzaM5s2b8+6777Lp+jK1s2bNonnz5rRq1YpZs2bxesMgANxHPINKcz353DoZtBnQqC80LX5e3qTcJDJ1mbjauuLt4A3AqohLZOTp6djAnQEhMjOKKHulHgDm5+dHXFwcycnJZRFPpWdnZ4efn5+lwxDirsVey2HCiiiMJoUJPRryUCsf9vx3EF+7nuLJfAN9hm0ETxldXN3VtDq7NKpTfd+/f3/69+9faN/7779f8PiTTz7hk08+ASD/xAkuDH4MtYMDtYYNMxc4/RscXw8aB+j/X7jFIgdn083TqwXVCkKlUmE0KSzZdxGAsV0byOIIolyUOpnVaDQEBgaWRSxCiHKSla9n3JIIMvL0PBDszeu9m7B79guscjpBsNbIUz1mQZ0Wlg5TVACps8WdpCxcBECtoUOxcnEBbTb8/Lr5YI83we3WH3pvTmYBdp5MIi4tD393e3o09S7XuEXNVS0GgAkhbs1kUnh11WHOJGXTyNuJLx8P5ciG2fxp2oJOBT103bAPecjSYQohKgF9QgKZv/wCVla4j3jGvHPXJ5BxCeq0hA5F53C/2Zm0MwA0qtUIgMV7YwEY0TFApuIS5UaSWSGquc+3nWJ7TBKu9hrmjwwj+egOIs98yt8OdgyOq02P15dYOkQhRCWRumQpGI249OuHxtcXrhyG/XPMg70GzQKr23+hW9Ay6xbE2aQs9py9hr3GimFh/rc9T4jSkGRWiGrspyNX+Pb3c1ipVXz7VBsccy4T+ctoVni6MPasib7TZQouIYSZMSuL9NWrAXAfPRqMBvjpFVBM0OEF8Gl92/NNiolz6ecAczeDG31lH2nti6uDzGAgyo8ks0JUU8fiM3hj7REA3uofTHsfDXvnPsDsem5MOpdBjyl7Ud+hlUUIUXOkr1mLKScHh/btsW/RHA7+7585ZXu8dcfz47PiyTfm423vjVpxYF1kHAAjO8vAUlG+JJkVohpKztIybkkE+XoTQ9v6MbpzPVI2v8jXQc5MiEun49hfZQouIUQBRa8ndYm5y5H7mNGQcg52Xl/OfcBMsHW64zXOpJv7ywa5BbE2Mo4cnZGODdxpWsel3OIWAspgNgMhROWiNRh5cVkkCRn5tKlXiw8fbUHu9veYmBHJ8Hwjg0Zuxr5+iKXDFEJUIplbtmC4ehWbhg1x6toFFg8AfS60GAJNHizRNW70l23oGsSS3eYuBqM6B5RXyEIUkGRWiGpEURTe23iciItp1HGx47tn2nLo/yax2LiFlkYTox6ch6p+mztfSAhRY5i0WpJnzQbAY8xoVPu/gcsHzMvV9v9via9zNu3s9evV5sK1HHxc7XgguHa5xCzEzaSbgRDVyJJ9F/nh78vYWquZN6Itl7fO4bf8nzABLzV7CVWjBywdohCikkldvAR9XBy2jRrh2iEIfv/YfGDQN+DgXuLr3OhmcOicedW04Z3qY20laYYof/KvTIhqYu/Za7y/+QQAnw1picPlv4i88AWH7G15JL4O7j1ftXCEQojKRp+URMp33wFQ+z/hqH6aAEYdtB0Nd/HhV2/UE5sRC8Chs7aoVMh0XKLCSDIryk16ejpDhgyhadOmBAcHs2/fPkuHVG1dSsll/PWlal+4vyGdXdOJ+m0sqzycGXvGRN/puywdohCiEkr+6mtMubk49eyJo24XJEaDWwD0+fCurnMx8yIGxYCnbV10Bmta+Lji6WRbLjEL8W/SZ1aUm1deeYV+/fqxdu1adDodubm5lg6pWsrWGhi75G/Sc/X0bOrNxE6e7Py0Fd80ciP8bAY93joiU3AJIYrIO3acjPXrQaOh9jP94LcRgAoe+a5Esxfc7MbgL1uTLwBdG3mWdbhC3JK8w4lykZmZye7du1m0aBEANjY22NjYWDaoaujGUrWnE7Np6OXIl8NC+GNaCF83cWFCXBodx/6Gg6u8qQghClMUhcSPPwZFwf2JIdjsn2peHKHzy1C/011f70Z/2cxMc33TLUjqHVFxpJuBKBfnz5/Hy8uL0aNH07p1a8aOHUtOTk6RcvPmzSMsLIywsDCSk5MtEGnV9tX202w7kYiLnTXzR7bD6s/3WdBIzeMpWbR9YB7eDWQKLiFEUVlbtpAXFYWVuzueXgch4zL4hkHPt+/pejdmMkhKdcNOo6ZtgFtZhivEbUkyK8qFwWAgKiqKF198kUOHDuHo6MiMGTOKlHvuueeIiIggIiICLy+ZxP9u/BKdwKydZ1Gr4Jun2uAfu4rw86tprdXzZN+5BHV+yNIhCiEqIVN+PokzZwLg1bchVlf+BAdPGLYErO+tn+uNbgYmbW3aB3pga21VZvEKcSeSzIpy4efnh5+fHx06dABgyJAhREVFWTiq6uP4lQxeX21eqvbN/sG4RC/j4/3mARtTO76DcytJZIUQxUucMQPDlQRsA+pSy7gRVGoYuhBcfe/penmGPC5nXUaFFSatp3QxEBVOkllRLurUqYO/vz+nTp0CYMeOHTRr1szCUVUPKdlanlsSSZ7eyOA2vtyvPsb+859z2FbDW+59sA4bbekQhRCVVOaWLaT/sAqVxpq6Lc6hUgMPTIfA++75muczzqOggN4LsJbBX6LCyQAwUW5mz57N008/jU6no0GDBixcuNDSIVV5OoOJF5dHEZ+eR6h/LV5tBXtXPc8a31q8esqI7yezLB2iEKKS0l26RMI77wLg3dkae6d0aPYwdH6pVNe90V9Wl+uNp5MtTes4lzpWIe6GJLOi3ISGhhIREWHpMKqVaT8d5+CFVLydbflqUACH5rbn2yA3ws9k0OvtaJmCSwhRLJNOR/yrr2HKzsa5sSNudc6AZxN4+FtQqUp17Zv7y3YN8kBVyusJcbfknU+IKmLp/ousOHAJG2s13z0dyvHZnfi6iQsTL6fR6fmd2LuUfNlJIUTNkjRzJvnHj6OpZUPdFmdROdeBp34A29K3ot6YlsukrUPXRjKQV1Q86TMrRBWw/3wK0zcdB2DG4BCSFjzC/OtTcLXpMx+vgGALRyiEqKyytm8nbclSsFLh2y4eK9da8Mx6cG9QJte/0c3AqK1NVxn8JSxAklkhKrnLqbmMXx6FwaQwrlsgDxm38KPvVdrk5hPW4HWCOg2wdIhCiEoqLzqaK1OmAuDdMh37urbw9DqoXTYDcjN1mSTmJqKYNDR0q0cdV7syua4Qd0O6GQhRieVoDYxbEkFqjo77G3sxOSieD7d9gpWVmomt3sCt+0RLhyiEqKTyjh/n0rNjMWVn41IvD/dgAzy5Fvzaltk9zqWfA8Ck9aZbI+8yu64Qd0OSWSEqKZNJIXzNEU5ezaKBpyOv1r/E4q3/Idpew5KAoThKIiuEuIX8kye5NHoMpsxMnP3y8OmUierx5aWagqs4Z9Ju9JetTTeZkktYiHQzEKKSmr3zLFuOXcXZ1prP7rfl5F8vscLJlo+Uhjg+8L6lwxNCVFL5p05zadQoTJmZOPnk43ufFtWTy6DJg2V+r5hr5mQWXR06BHqU+fWFKAlpmRWiEtp67Cpfbj+NSgUzB/oTv/4B5gS5EX4mk/pTl4FaPocKIYrKP32aSyNHYEzPwLFuPr59Naie+RF8WpfL/Y4kngYg0DUQR1tJKYRlyDuiEJXMyauZvLb6MACTH2iAYe0gvm7oykuX0uj0/DaZgksIUayMnzYTO2yYOZGtk4/fo3VQP7+j3BJZgPicSwB0qiczqgjLkY9RQlQiqTk6xi6OIFdn5JFQH+rtfJZ5QVY8dS2TNg8ukCm4hBBFmLRaEj/8gPQ16wBwqZdL3SfboH5yMdi5lN99FRNa0zUUxYpO9RuV232EuBNJZoWoJPRGE+OXRxKXlkdLP1cGXv6cH/yTaZero22jyTTs0N/SIQohKhnd5cvEvziO/LMXUakVarfJotboF1B1nwrlvCKgzqhDhQqTzoPmdd3K9V5C3I4ks0JUEh9sPsH+86l4OduysGsGX+/Zi52ippv6QVoPftnS4QkhKhFFrydt2WKSZ83ClKdH42jAd2At7J9fWa7dCm6Wa8jHEXusDN7UlfllhQVJMitEJbDy4CWW7LuIjZWaJQMc2LD7eWLsNMz2GkCdR7+wdHhCiEpCURSyf9tC0sfvo0vMAMDJLx+fV57G6sF3wNq2wmLJ1WtxxB4PWz9UKlWF3VeIf5NkVggL+zs2lXc3HgPgnfZwftcoVjrbsNy5LbUf+dzC0QkhKou8w5EkTZtM7sl4AGycDXg/UBen8V+g8g+r8Hi0Bi0A9Z0DKvzeQtxMklkhLCg+PY8XlkaiNyqMbu2Cy4En+DTIlc+zHKk9Yj5Ia4cQNZpJqyVr9ULSli8lLzYVACsbE55dauH28vuomvaxWD2hN+kAaOEtg7+EZUkyK4SF5OoMjFscQUqOji6BLrSNHMUXTVx4+VIaDZ5dDxrpgyZETaQoCtojB8lY8i0ZOyMw5isAqDUmarV0xPO1t7BqM9iiH3ZNigkTegA6+je1WBxCgCSzQliEoii8seYoJxIyqe/hwBOx4cxrpDFPwTVgMR7+TSwdohCiAplyc8nZsoLsXzeTfegshixjwTE7dyO1erbEdeSrqIM6V4pvbOKzrqJgwmRwpI2fj6XDETWcJLNCWMC3v5/l5+gEnGyteU07l431U+mYqyOsyZs0aNfH0uEJIcqRoigYzh0nb/fP5EUdJO9ULPlXclCM/ySpVnZGnJu6UWvoMOwfehFsHCwYcVH7LsUAYGOqIyt/CYuTf4FCVLBtJxKZ+Zt5qdq33XYSYfU39iY1Xa0HEPrIBEuHJ4QoA4qiYLp2Ff2F4+jPRKM7dQJtbCy6K9fQJedj1P77DBV2XuDUOginPgOx6/UEKnvXMotn69atvPLKKxiNRsaOHcuUKVMKHddqtYwYMYLIyEg8PDxYtWoVAQEBt7zeoavmZWzdbXzLLEYh7pUks0JUoNOJWUz64RAAMzpD+rnVnLK1YcwVP+6bNsfC0QkhiqPk52LKTMWUkYIpLRljWhKm9BRMGakY01IwpKRgTE/HmJGFMTMHfXo+hiwjJsOtuwNY2Ziw83PEvnEA9m3bYd9tAFb1Q8qlC4HRaGTChAls27YNPz8/2rVrx6BBg2jWrFlBmQULFuDm5sbZs2f54YcfmDx5MqtWrbrlNU+nngOgvktgmccrxN2SZFaICpKeq2PckghydEaGN9PgEjuJ+c52/M+qGYHvrrF0eEKgPxdN1spiPlQpyp1PvrnM9cdKcecpyj9lFQW4/th0/RyUf8rcXE4BRTGZH5vMxwo/N6GYTGA0me9rMqEYTeb9RhMYjdd/msspBiPK9X2K4abNaEIxKChGBZMeTAZAuZcEU4XKSkHjrEZTyw4bPy9sAgKwbdICmxbtsW7cFlU5r9B1w8GDBwkKCqJBgwYAPPHEE2zcuLFQMrtx40amTZsGwJAhQ5g4cSKKotxy/tiE3MsANPdqWL7BC1ECkswKUQEMRhMTVkRxMSWXUA8D/dPeYoozfI8PDYavALXa0iEKgf70YRKX7bJ0GJXETUmcSkFtDWoNqG3VqG2tsbLXoLazwcrFEStXF6zc3bF298TKqy7W9YPQNGiO2rseqkrw/3Z8fDz+/v4Fz/38/Dhw4MAty1hbW+Pq6kpKSgqenp7FXjPbeAWAjn4yk4GwPFWxn5xv7a4KC3E3wsLCiIiIsHQY5WLapuMs2huLl72at7PH81VTe95MN9H7xb3g4G7p8GoKyw8Br2D9+vVTrl27VuLySl42hqQEDAYD1tZl1dZxh197MYdV1/+rN+jRaDRFC6uKe/6vx9d/qlRc/+peZf55fVOpbn6uBrXanHhef4zaCpWVFeYsVk1ycjJeXl53++LLxd3GkpaWRmZmJvXr1wcgJSWFnJwc6tWrV1Dm+PHjNGrUCBsbGwCio6MJDg4u8u8gOTmZ5JQUFC8j2itaWrdoXSlW/6osf5/KEgdUnljuNY7IyMhfFUXpV5Ky0jIrRDlb/fdlFu2NRWOlYkr2O8xvbMszyRk06r1EEllRrrZu3XpP51WWD5aVJQ6o2rHs27ePadOm8euvvwLwySefADB16tSCMn379mXatGl06tQJg8FAnTp1OHToULGJ6pqj+3n/0DjOvhtLZGRkKV9N2agsf5/KEgdUnlhKEUeJElkAy3//Iao1o9FI69atGThwoKVDsYjIi6m8tSEagLdN/+Pnhhl0zsmjbfBbBLTtZeHohBA1Qbt27Thz5gwXLlxAp9Pxww8/MGjQoEJlBg0axOLFiwFYu3YtPXv2vGWLa2TCKfMDo6QQonKQlllRrr7++muCg4PJzMy0dCgVLiEjj+eXRqE3KozXbOGU94ap5eYAACAASURBVFGcTGq62j5Mq4fHWzo8IUQNYW1tzTfffEPfvn0xGo2MGTOG5s2b8+677xL2/+3deXxU1dnA8d9MJvu+r2xZhLCEEMJaRbaAqAVZKigKFRTUolisSku1aEXAhaql2lJE0SoovGIQ2QREAVEM4MIiJECAhITs+zbLef8YGBKSQICZTALP1898ZubeM/c+9zDeeXLuueckJjJy5EimTp3K/fffT3R0NH5+fqxcubLR7R0tOG5+YWy0iBDNSpJZYTMZGRl88cUXzJkzh0WLFtk7nGZVWWNk2vt7ySur5g7H/bh6J7PX2ZUp2e255bk37R2eEJc0bdo0e4cAtJw4oPXHcvvtt3P77bfXWfbCCy9YXru4uLBqVdNGVcmqOAU6cHNyveI4bKWl/Pu0lDig5cTSHHHIDWDCZsaNG8ef//xnSktLefXVV1m3bt0ly7eU/j3XSinF4yt/5POfztDTp4L7nf7MW96OPHFUy4j5v7SIu5tvUPa/S6X5yTlbWJXJpIhbehsa5zPoXzPy68+H7B2SuH41+Zwtv6rCJtatW0dQUBA9e/a8ZLklS5aQmJhIYmIiubm5zRSdbb399TE+/+kMgU41PO31Cv/wceLVGn+GPLdbElkhRKt2sqAMHM3naldHZztHI4SZ/LIKm9i1axdr166lffv2TJgwgW3btnHffffVKzdt2jRSUlJISUlpEUOIXKuth8/yyqYjaE01vOGziGecy3ipyom4yWtxdveyd3hCWKxatYouXbqg1WrrXRGZP38+0dHRdOzY0XIH/MVOnDhBnz59iImJYfz48dTU1FxzTOPHjyc+Pp74+Hjat29PfHx8g+Xat29Pt27diI+PJzEx8Zr325C5c+cSHh5uiWf9+vUNltu4cSMdO3YkOjqaBQsWWD2Op556ik6dOhEXF8fo0aMpKipqsJwt66T2MT7/n7fQaPXolBcOGgfAPBXu+PHjiY6Opk+fPqSnp1t1/wCnT59m0KBBxMbG0qVLF9544416ZbZv3463t7fl36x2Nwpru1x9K6V4/PHHiY6OJi4ujn379tkkjiNHjliONz4+Hi8vL15//fU6ZWxZL1OmTCEoKIiuXbtalhUUFJCUlERMTAxJSUkUFhY2+Nnly5cTExNDTEyM5ebDq6aUupKHEFfsq6++Unfcccdly/Xs2bMZorGd1LMlqstzG1Xbp9eqDx7rpu5c0lF9+Fp7pfLS7B2aMLvS89318GjUoUOH1K+//qpuvfVW9cMPP1iWHzx4UMXFxamqqip1/PhxFRkZqQwGQ73P/+53v1MrVqxQSik1ffp09dZbb11qd1ds1qxZ6vnnn29wXbt27VRubq5V93exv/3tb+qVV165ZBmDwaAiIyPVsWPHVHV1tYqLi1MHDx60ahybNm1Ser1eKaXU008/rZ5++ukGy9mqTi4+xo7T71dd3+uqBn/4O8s5+1//+peaPn26UkqpFStWqLvvvtvqcZw5c0bt3btXKaVUSUmJiomJqVfXTf2tsYbL1fcXX3yhbrvtNmUymdTu3btV7969bR6TwWBQwcHBKj09vc5yW9bL119/rfbu3au6dOliWfbUU0+p+fPnK6WUmj9/foPf2fz8fNWhQweVn5+vCgoKVIcOHVRBQcHFxZp8rpOWWSGsoLhCz4PLUyirNjCzYjFbOldyS3kFXW+aDf4y3aNoeWJjY+nYsWO95cnJyUyYMAFnZ2c6dOhAdHQ0e/bsqVNGKcW2bdsYN24cAJMnT+azzz6zWmxKKT755BPuueceq23TFmpPE+vk5GSZJtaahg0bZpm4oG/fvmRkZFh1+5dz8TH6RPkD0MaznaVMcnIykydPBsz3SmzdurXhqYyvQWhoKAkJCQB4enoSGxtLZmamVfdhTcnJyUyaNAmNRkPfvn0pKioiKyvLpvvcunUrUVFRlskxmsOAAQPw86s7Xnrt70Nj54ZNmzaRlJSEn58fvr6+JCUlXfW42CDdDEQzGDhw4GVv/mrNDEYTM1bsIz2/gtHlq8jteBRfk4n+rmOIu3O6vcMT4oo0NPXpxUlDfn4+Pj4+liSroTLXYseOHQQHBxMTE9Pgeo1Gw7Bhw+jZsydLliyx2n4vtnjxYuLi4pgyZUqDl0qbUlfWtGzZMkaMGNHgOlvVycXHWOlcCkDngJgGy9SeCtdW0tPT2b9/P3369Km3bvfu3XTv3p0RI0Zw8OBBm8Vwufpu7u8GwMqVKxv9A7C56gXg7NmzhIaGAuY/QnJycuqVsXb9yNBcQlyj+Rt+ZUdqHj3Ld9Gh/VZ2O7nyQHYkNz/3+uU/LIQNDR06lOzs7HrL582bx6hRoxr8TEMtahcPnt+UMtcS04oVKy7ZKrtr1y7CwsLIyckhKSmJTp06MWDAgCbtv6mxPPLIIzz77LNoNBqeffZZnnzySZYtW1an3LXUQ1PjOF8n8+bNQ6fTMXHixAa3Ya06uVjtY6ysMVKjy0MH9AyL4ZMGypxnqyluy8rKGDt2LK+//jpeXnXvQ0hISODkyZN4eHiwfv167rrrLlJTU20Sx+XquznrBKCmpoa1a9daZnerrTnrpamsXT+SzApxDVbvzeCdnScIrUhlTPC7vO/tw+OpDiTNb/hmESGa05YtW674MxEREZw+fdryPiMjg7CwsDplAgICKCoqwmAwoNPpGixztTEZDAY+/fTTS06Ten5fQUFBjB49mj179lxV4tbU+nnooYcanMWwKXVljTiWL1/OunXr2Lp1a6M/+Naqk4vVPsbD2SVoncwjGcT4RtUrExERgcFgoLi4uN6lZ2vQ6/WMHTuWiRMnMmbMmHrraye3t99+O48++ih5eXkEBARYPZbL1be1vhtNtWHDBhISEggODq63rjnrBSA4OJisrCxCQ0PJysoiKCioXpmIiAi2b99ueZ+RkcHAgQOvep/SzUCIq7TvVCF/+fQX3KnkhTZLWRLmy4y0cpLmfidDcIlWa+TIkaxcuZLq6mpOnDhBamoqvXv3rlNGo9EwaNAgVq9eDZiTrcZaeq/Uli1b6NSpExEREQ2uLy8vp7S01PJ68+bNde6ktpba/RvXrFnT4D6aMk3stdq4cSMLFy5k7dq1uLm5NVjGlnVS+xi3Hj6J1rEEjdIR5nEhMbuSqXCvllKKqVOnEhsby6xZsxosk52dbWnx27NnDyaTCX9/f6vGAU2r75EjR/L++++jlOK7777D29vbcundFi51NaO56uW82t+Hxs4Nw4cPZ/PmzRQWFlJYWMjmzZsZPnz41e/0Su4Wu9q73YRoitY0mkFWUaVKfPFLFflMsvpq0WB169JY9fUbXVXhmeP2Dk00zt4jC7So0Qw+/fRTFR4erpycnFRQUJAaNmyYZd2LL76oIiMj1U033aTWr19vWT5ixAiVmZmplFLq2LFjqlevXioqKkqNGzdOVVVVXWp3TTZ58mT19ttv11mWmZmpRowYYdlvXFyciouLU507d1YvvviiVfZ7sfvuu0917dpVdevWTf32t79VZ86cqReLUua71mNiYlRkZKRNYomKilIRERGqe/fuqnv37pZRA5qzTs4fY4fH5phHMlh5h3r22WdVVFSUUkqpyspKNW7cOBUVFaV69eqljh07ZtX9K6XUjh07FKC6detmqYsvvvhCvf3225bvyz//+U/VuXNnFRcXp/r06aN27dpl9TiUary+a8diMpnUo48+qiIjI1XXrl3rjBhibeXl5crPz08VFRVZljVXvUyYMEGFhIQonU6nwsPD1dKlS1VeXp4aPHiwio6OVoMHD1b5+flKKaV++OEHNXXqVMtn33nnHRUVFaWioqLUsmXLGtp8k891MgOYaDFaywxgVXoj4/+zmx9PF/Ky21u8H36CBypNjJu4Efwi7R2eaJzMACbEVaqoMdBj0XycQj9hUMRQ3hzyj1ZzzhatVpPP2dJnVogroJTiz5/+wk8ZxTyQ/xrrIzMZXK5n3OiVksgKIa5b358oACfzTWqxATfZORoh6pKOfUJcgf/uOM6a/ZkMyf2QyviT+BtN9HEdA2372js0IYSwmZ2peWidzwIQ49PwkGlC2Isks0I00VdHcpi/4Vc65W8nrssOTjs6Mjy/IzdPW2Tv0IQQwqbMyay5ZTbaJ9rO0QhRlySzQjTBsdwyHl+xn8CSI0xs/yEbvDy474Qzw+Zcv5NBCCEEQE5JFUdyc9E6FuOkdaKNZ5vLf0iIZiTJrBCXUVyp56HlKRiLs3na+2WWhvowI62coX/bLUNwCSGuezvTLnQxiPKJwkHrYOeIhKhLbgAT4hKMJsXjK/ZzKq+Y13wXsyjCj5nphdzyxHc4uXrYOzwhhLC5nal5OEgXA9GCSbOSEJfw8sZf+fpoDs+5vct/wkp5rFRPr3Gf4B3Szt6hCSGEzSml6rTMRvtKMitaHmmZFaIRa/Zn8J9vjvOA/kO2BB9kWIWRcWNWQpvel/+wEEJcB46eLSOntBrvgBxMSMusaJmkZVaIBvx0uohn/u8XBma/T1nk9wQajDw+6GVJZIUQN5QdqbkAOLjIsFyi5ZJkVoiL5JRUMe2DFKLObiEu7lvO6HQML+qCtts4e4cmhBDNamdaHhqHMvSU4u7oToh7iL1DEqIeSWaFqKXaYGT6//aiyfyRe9uv4EsvdyaecCXpz8n2Dk0IIZpVtcHI98cL6owvq9HciLNCi5ZOklkhzlFKMWfNAY6mpjLL+1WWhvrwh9RKkubKEFxCiBvPvpNFVOqNhAQUAdJfVrRc8gstxDnv7DzBmj1pzDHN4V/Rfsw8Ucgtf/wGRxc3e4cmhBDNbmeaub+sv18hADG+0l9WtEySzAoBfHM0l5fWH+IZ3dt80M2TqdlF9L57tQzBJYS4Ye1MzQNAOZ4B5OYv0XLJ0Fzihncir5wZH+1jisNavmlzitsq9PTu8zIR3X5j79CEEMIujmSX8lNGMW5OWnKqTgIyxqxouaRlVtzQSqr0PLj8B/oWr6c4dAuhBgOPDVlEpyH32Ts0IYSwm+W70wEYEe9ChaECPxc//Fz87BqTEI2RZFbcsIwmxRMrf8T517V0bLOGbJ2Ov3f7A9ouo+0dmhBC2E1xhZ41+zIB6HVTNSBdDETLJsmsuGG9uvkIh3/Yzvj2K9ni5caEE2443/xHe4clhBB2tWrvaSr1Rm6ODqBcmZNa6WIgWjJJZsUNKfnHTD7YtJtZPq+xLNSHR1MrGTb3WxmCSwhxQzOaFO/vNveRndy/PWlFaYAMyyVaNvnlFjZx+vRpBg0aRGxsLF26dOGNN96wd0gWv2QU8+eP9/AX0xwWR/nxxIlCBjy5U4bgEkLc8LYfyeFUQQURvq4M7hQkyaxoFWQ0A2ETOp2O1157jYSEBEpLS+nZsydJSUl07tzZrnHllFbx0PI9zMh7ivf7ePNgVhG9JnyKd1Abu8YlhBAtwXvfpgMwqV87FEaOFx0HJJkVLZu0zAqbCA0NJSEhAQBPT09iY2PJzMy0a0zVBiOP/G8fd5S8z65ELcNLykjotZCILv3sGpcQQrQEx3LL2JGah4ujlrsT23C69DQ1phpC3UPxcPKwd3hCNEqSWWFz6enp7N+/nz59+tRbt2TJEhITE0lMTCQ3N9dmMSileO6zgwSfXk9R211EGAwMDptOl2H322yfQgjRmrx/rlV2dI9wfNycpIuBaDUkmRU2VVZWxtixY3n99dfx8vKqt37atGmkpKSQkpJCYGCgzeJY/m06h3etJSroQ3J1DrwQN4PuE/5qs/0JIURrUlqlZ/XeDMB84xdAWuG5ZFZGMhAtnCSzwmb0ej1jx45l4sSJjBkzxm5x7ErLY+mKVYwOfIutHs78IyRJhuASQohaVqVkUF5jpE8HPzqFmBseUotSARljVrR8kswKm1BKMXXqVGJjY5k1a5bd4jiZX86flq7nMe9FvBfiw6OpVXiPWAQajd1iEkKIliS/rJo3t5kT1yk3d7Asl24GorWQZFbYxK5du/jggw/Ytm0b8fHxxMfHs379+maNoazawPR3vuHxqtm8FeXHE8eLGPCnXeicXJo1DiGEsJeCggKSkpKIiYkhKSmJwsLCemXmb/iVogo9urw0np54GyNHjqSkpoT04nQctY5E+kTaIXIhmk6G5hI2cfPNN6OUstv+TSbFzI/2MSp1Bsv7eDPtTCG97/0Mr8Bwu8UkhBDNbcGCBQwZMoTZs2ezYMECFixYwMKFCy3rvzuez+q9GShDDV8ueJAOATMB2JGxA4Wii38XnB2c7RW+EE0iLbPiurToy6NEfP0YOxIduK24jIR+iwjvXH80BSGEuJ4lJyczefJkACZPnsxnn31mWVdjMPHXzw4AULE3mQ4B7pZ1+3P2A9AjuEczRivE1ZGWWXHdWffzGY5veweX3nm00ZvoE/B7YofcY++whBCi2Z09e5bQ0FDAPP53Tk6OZd1/dxwnLaeMDgHu7Pj2YxITD6DT6Zg9ezb7XPYBkBCUUGd7S5YsYcmSJQA2HU5RiCshyay4rhzILObDVZ+QELaKAw46ZmgG0mvy3+0dlhBC2MzQoUPJzs6ut3zevHmNfuZUfgVvbjXf9PX3UV2JvPcYYWFhHD9+nMFJg/Gf6w9AfGB8nc9NmzaNadOmAZCYmGitQxDimkgyK64beWXVPPefjxnp/Sar3Z35ICgJ39++ae+whBDCprZs2dLouuDgYLKysggNDSUrK4ugoCDzJDJrD1BtMDEqPoybYwIs5SMjI0m8I5EjpiNEeUfh4+LTHIcgxDWRPrPiulBjMDHz358z3vQs7/m68IZjNL53/EOG4BJC3NBGjhzJ8uXLAVi+fDmjRo3iw+9Psf1ILp4uOubcEUthYSHV1dUA5OXlcaDQ3I9W+suK1kKSWdHqKaV4bvUefpv5GG9F+jPzeCFBd/4XHOTCgxDixjZ79my+/PJLYmJi+PLLLxk04WH+tvYgAG1zvyPI04XDhw+TmJhI9+7dGTRoEB0HdwTq95cVoqWSX3vR6n3w7Qk6fDWRd3v5MP1MEX3uS8bTP9TeYQkhhN35+/uzdetWANJyyhj91i6MJsUjA6N45rY7AOjfvz+//PILACZl4taPb4Vq6BEkLbOidZBkVrRq3x7LI/2/4znc34Hbi8uI7/caYbG97R2WEEK0KIXlNUxd/gOlVQaGdwnmqWEdGyyXXpxOUXURQa5BhHvIuNyidZBkVrRapwsqWPHSQzj0LqBDjYk+AVPoLENwCSFEHTUGE9P/t5eT+RV0DffiH+Pj0Wobvp9gX455SK4ewT3QyD0HopWQPrOiVSqvNvCPpe/Rpts+irVaBpUl0nfy8/YOSwghWhSjSTH705/Zc6KAYC9nlk7qhZtT4+1YlskSpIuBaEUkmRWtjsmkWPC/dXTTvsoOd1emFndg6NMf2zssIYRoUar0Rh5bsY9P92Xi4qhl6aRehHi7XPIz+842PFmCEC2ZdDMQrc6bn31D95zZ/CvIhX9rI+n4p89AK3+XCSHEeSVVeqa9n8J3xwvwdNbx38mJdIvwvuRncipyyCjLwN3RnRjfmGaKVIhrJ8msaFXW7TlKyM57+GdXH+aVu9Fx2koZgksIIWrJKali8rs/cDirhEBPZ96f0pvYUK/Lfu58F4Pugd3RaeW8KloP+baKVuNgRhFZ79zGqt7eTMssJGbCh+DsYe+whBCixTiWW8bv393D6YJKOgS48/6U3rTxc2vSZ6W/rGitJJkVrUJ+WTXr/jaMvX0cGVlYSs+b3yA4Rk64QggB5sljVqVkMPfzg1TUGOke4c2y3/fC38O5yduQ/rKitZJkVrR4eqOJN2ffR2HPfKKrTfQIfJBOg+62d1hCCNEiFFfo+fOan1n/SzYAv+0exoIx3XB3bvpPfLm+nCOFR9BpdHQN6GqrUIWwCUlmRYv38kvP4RK9hzKtE8OLevKbx+baOyQhhGgRdh/LZ9YnP5JVXIWHs44XRnVhdI/wKx4j9qfcnzApE10CuuDm2LRuCUK0FJLMihZt7bavCXF6j8/cPXgg1Z07Xv7E3iEJIYTd5ZZW89rmI3ycchqloEdbH94Y34O2/leXiJ7vYiD9ZUVrJMmsaLH2HkqlZs90/hfoxdOndAybt9veIQkhhF3VGEy89+0J/rk1jdJqAzqthkcHR/PY4GgcHa5uiEKlFF+e/BKAPqF9rBmuEM1CklnRIp3IzCbjs3t4M0TLc8WuDP3L1+DoZO+whBDCLpRSfHnoLPM3/MqJvHIABncKYs4dsUQFXtuoLgfzD3K8+Dh+Ln70C+tnjXCFaFaSzIoWp7yqhm9e6s//Epx5rMDAwKmbZQguIcQNyWhSrP8li399lcav2aUARAW68+ydnRnYMcgq+0hOSwbg9g6346h1tMo2hWhOksyKFkUpxXtPDuDr7o7cVVBKp55voPONsHdYQgjRrKoNRj7bn8nb24+Rnl8BQJCnM48MjOK+vu2uukvBxfRGPRvSNwAwMmqkVbYpRHOTZFa0KG8+OYa0Lrl0rDYR6T2FboPH2jskIYRoNifyylm55xSr9mZQUF4DQFs/Nx4ZGMWYhHCcdQ5W3d83Gd9QXF1MjG8Mnfw6WXXbQjQXSWaFzWzcuJGZM2diNBp58MEHmT17dqNlv/32W04cO8CeB4vxUhq65PZkxAsvNGO0QghhH1V6I5sPnWXF96fYfTzfsjw21IuHb43kjm6h6KzUEnux5GPmLgajokZd8XBeQrQUkswKmzAajfzhD3/gyy+/JCIigl69ejFy5Eg6d+5cr+x3P3zHI2+Podpbw8jyChzzuzDmpdV2iFoIIZpHtcHIN0fzWPfzGbYcOkt5jREAF0ctv40L494+bYlv42PTBLOgqoAdGTtw0DhwR+QdNtuPELZm82T2zTff5O9//3uD67y9vUlLS7O8T0xM5OTJkw2Wfeyxx3juuecA+Prrrxk3blyj+/zhhx9o3749AFOnTmXt2rUNlhswYAD/93//B0BRURExMTGNbnPp0qWMGjVKjqmJx7Rnzx78/Pzo08c8zEtFRQW9evXCze3CGIg7v9tJsVsx8089T9++nrAtD6/S7tz87JpGYxZCiNaquFLPztQ8tv56li8PnaW0ymBZFxfhzbieEdzVIxwvl+a5CWvDiQ0YlIFbwm8hwDWgWfYphC1olFJNLnzbbbepvLy8K9pBdnY2mZmZDa5zcHAgPj7e8v6XX36hpqamwbLBwcFERJhvBCopKSE1NbXRfXbt2hVnZ/N81MePH6ewsLDBcp6entx0002AuSXxxx9/bHSbkZGR+Pr6XnfHZDAYCAwMtPoxFRYWkluQS1lFGRqd5sLDQYNWp7W8dlBaAvU1+CkjB/KgXUwXPF3tOwRXbm4ugYGBdo3hPInFenHs3bt3k1LqNhuE1JI1/QQvrM5kUhzNKeWbo7ls+zWHlPRCDKYL/ySxoV7cGRfKnXGhtPN3b/b4xq8bz6H8Q7xy6yvc1v7K/9dITEwkJSXFBpEJAUCTL0tcUcvsxo0brziSiooK+vXrx5YtW+qt02g0BARc+GswPz8fk8nU4HZcXV3x8DAPz1RTU0NxcXGj+/Tz88PBwdxJvqSkhOrqagCGDh1aJw5HR0d8fHwAMJlM5Ofn19/YOZ6enri4uFiOqby8vMFyTT2moUOHsmvXrms+potd6THdfPPNpKSkXPExldWUkVOZQ05VTp3nvJo8cipzMJQZ8KrxIiowiiCXIIpOF1GcUcwDYx8gyDWIYNdgIv2C0H04Fk3mXg6a2tHrrTyOHPy50XibS0s6QUssVo3jRktkRTNTSnEst5zdx/P57lg+u4/nW27iAnDQaujdwY/BnYJI6hx8zePDXou0wjQO5R/C09GTQW0G2S0OIazB5t0M3NzccHR0bFJLir+/f5O26eTk1OSWGS8vL8vrS8Wh1WqbvE03N7c6l8svpbFjcnR0tCSycPXHdClXe0wGk4HcilyyyrMsj+y0bLLLsy3v9UY9Ie4hhLqHEuoRSoh7CDcF3USoRyih7qGk/5LOSy+8xOpN5r6v8+fPBxcYF3euK0VxBrw/ErJ/5pQpkMcd5kDVg02KVQghWoKKGgM/nS5m36lC9p0sZP/pojrJK0CIlwv9o/0Z1DGIATGBeLu1jHFc1x4zd1Ub3mE4zg7Odo5GiGsjN4DdYJRSlNSUWBJTYzcji/YuIrssm+wK87K8yjz8XPwIdTcnqSFuIXTw7kD/sP6WZT7Ol74xIbxPOKmpqZw4cYLw8HBWrlzJRx99ZF6ZkQIr74Wys5wihPv1T/P4qN/w3csNt8oLIYS9lVUbOHSmhAOZxRzILOaXzGKO5ZZhuqgjR4CHM30j/egX5U//qADa+7u1uFECDCYD646vA8yjGAjR2jVLMjtt2rTm2M1ltZQ4wHax1BhrOFtx9kIratm5ltVarao6jY4QD3OS2umWTng5eXFTm5ssiWqQW9A1zwKj0+lYvHgxw4cPx2g0MmXKFLp06QK/rIbPHgVjNb+6xDO+6FHiYtozJiGcWQEt4waEG+F7cjVaSiwtJQ5xfarSG0nPLyf1bBlHskv5NbuUX7NLyCisrFdWp9XQOcyThLa+9GznS0JbXyJ8XVtc8nqxTembyK3MpZ1XO7oHdrd3OEJcsyu6AQy5mcCulFLkV+WTXV73kn/t98XVxQS5BRHsFkyYR5ilK0CIe4jltaeTZ/MHb6iBrxfCjlcBONH+bpJ+vRMXZxc2PnELEb5uLaZPprhutewMwzbknN0Ak0lxtrSKE3nlpOdVkJ5fzrGcMtJyyzhdUFGvtRXAyUFLdJAH3cK96RrhTbdwbzqFeOLiaN1JDGytqKqIUcmjKKgqYG6/uYy96eonppFztrAx29wAJmyrQl9BdkU22WUNJ6rZ5dm4Obpd6Kt67hEXEGdZFuAagIO2hZ1cj30FG56GvKOg0VJ86wuM3B6DASPP3dmZCN+m9T8WQoimUEpRXKknruHb4gAAFXBJREFUs6iSjMJKThdUWJ5PF1ZwqqCCKn3D3Zq0GugQ4E5UoDsdQzzpGOJFbIgn7QPcrTaFrD29/MPLFFQV0CukF2Nixtg7HCGsQpLZZmI0GcmrzKt3yT+rPIuz5WfJKs+i0lBpaUENcQsh1COUHkE9LDdVhbiH4KpztfehNF3RKdg0Bw6fGxPXLwp15yIe2+5GaVUuQzoF8bvECPvGKIRoVZRSFFboySmtIrvY/MgqruJsifn5TFElmUWVVJybhKAxAR5OtPN3p72/Ox0C3IgM9CA6yIN2/m5WnzK2pdiZuZPPj3+Os4Mzc/vNbfHdIYRoKqsks6tWrWLu3LkcPnyYPXv2kJiYaFk3f/583nnnHRwcHHjzzTcZPnx4vc+fOHGCCRMmUFBQQEJCAh988AFOTtc+1uj48eM5cuQIYJ5AwMfHp8FxV9u3b4+npycODg7odLqrumxSWlNaryX1fJ/V7HJza6uh1IC2XIumVMPN3W+mf9f+9AnpQ4iHuVXV19mXTZs2MfOxC1PA/mH2H665Hmp76qmn+Pzzz3FyciIqKop3333XMpRXbddUJ8UZ8MNS+O7fYKgERzcY8BT0+wMfpmTzzdEDUFPO9lenszD77nrT3FZXVzNp0iT27t2Lv78/H3/8sWXCCGs5ffo0kyZNIjs7G61Wy7Rp05g5c2adMtu3b2fUqFF06NABgDFjxlgmubC2y9W3UoqZM2eyfv163NzceO+990hISLBqDEeOHGH8+PGW98ePH+eFF17giSeesCyzZZ1MmTKFdevWERQUxIEDBwAoKChg/PjxpKen0759ez755BPLeM+1LV++nBdffBGAv/71r0yePNkqMYnmoZSipNJAfnk1+eU15JdVk1tWQ15pNbll1eSVVpNTWk3uuUeN8fI3i7o7ORDu60q4jytt/Nxo4+tGGz9XInzdaOvv1mwTE7QUFfoKXthtniL80fhHaevV1s4RCWE9Vklmu3btyqeffsr06dPrLD906BArV67k4MGDnDlzhqFDh3L06FHLeKnnPfPMM/zxj39kwoQJPPzww7zzzjs88sgj1xzXxx9/bHn95JNP4u3t3WjZr776qs5YqrXpjXrOVpxtMFk9/96ojIS5h1lurArzCKtz9/+S15bg7eHNn2b/qdEYrmQK2KuVlJTE/Pnz0el0PPPMM8yfP5+FCxc2WPZSdVKPUnByF3z/H/j1C1DnWkW6jIFhL4J3OCfzy3lp/WEAnhtxE/ct+K7BY3znnXfw9fUlLS2NlStX8swzz9T5t7QGnU7Ha6+9RkJCAqWlpfTs2ZOkpKR6dX3LLbewbt06q+67MZeq7w0bNpCamkpqairff/89jzzyCN9//71V99+xY0fLH3tGo5Hw8HBGjx5dr5yt6uT3v/89M2bMYNKkSZZlCxYsYMiQIcyePZsFCxawYMGCet/XgoICnn/+eVJSUtBoNPTs2ZORI0c2mPQK2zMYTRRX6imu1FNUqae4Qk9hRQ2FFXqKKmrMr8v1FJSbX59/1hub3r3Xy0VHkJcLwV7OhHq7EuLlQoi3CyFeLoT5uBLu64qXi05aHmt5c/+bZJVnEesXy6TOky7/ASFaEasks7GxsQ0uT05OZsKECTg7O9OhQweio6PZs2cP/fr1s5RRSrFt2zbLsE2TJ09m7ty5Vklma+/jk08+Ydu2bQ2u07hpOFp8lJ8qfjInqmVZlmGqssuyKaguIMA1wJyYnrv8f5PvTdwacaulW4CXk9clT5wOXP6y1Z49e4iOjiYyMhKACRMmkJycbNVkdtiwYZbXffv2ZfXq1Ve/MaXg7EE4ugEOfAo5h8zLtTroPAb6PgJtegOgN5r406qfqKgx4ll4hClJ5nnAGzrG5ORk5s6dC8C4ceOYMWOG+d/Jij9MoaGhhIaGAubJI2JjY8nMzLRqXVtTcnIykyZNQqPR0LdvX4qKisjKyrIcg7Vt3bqVqKgo2rVrZ5PtN2TAgAGkp6fXWZacnMz27dsB87lh4MCB9ZLZTZs2kZSUhJ+fH2D+g23jxo3cc889zRH2dUUpRUWNkbJqA6VVBkqr9JRVGyirMlBSpae0ykDJueWlVQaKK/WUnEtcz78vqzZcfkcN8HTW4efhhL+7E37uzgR6OhHo4UyAp7PlOdjThSAv51Z305W9/ZjzIx8d/ggHjQMv/OYFdFrpYSiuLzb9RmdmZtK3b1/L+4iIiHpTpubn5+Pj44NOp2u0zLWoMlSx7ut1BPYM5AAH2PLjlnqtqu6PuTN9zXQ0pRq6tuvKkF5D6OzfmRB3cwtrgGuAVf7nX7x4Me+//z6JiYm89tpr9VqOMjMzadOmjeV9RESE1Vvfalu2bFmdy8q1aTQahg0bhkajYfr06ReGQ6osgowfIHUzHNkIxacufMg9CBIfgJ4PgNeFJMtoUsz65Cd+SC/E01HRw5RmWdfQMdauB51Oh7e3N/n5+U1vJb5C6enp7N+/nz59+tRbt3v3brp3705YWBivvvqqeXgxG2i0vs9p6LuRmZlps2R25cqVjSaDzVUnAGfPnrUcY2hoKDk5OfXKNFY34srtPp7Pvf+9tnOOVgPero6Wh4+bEz5ujvi6OeHt6oivmyN+Hs74uTnh6+6In7sTvm5OkqDaSEZpBnN2zkGheKDrA3Ty62TvkISwuiZnaBqNZktDP1rz5s1j1KiGB11uaNivi1vXmlKmMSZlYsjIIeRW5YIn4IXlOTw2nCrHKspqytBWaPG/w5+UsymEuIcQFxjHsPbDzN0C3EMoyi0iLCyMnJwckpKSePifDzOg24AmxVDb0KFDyc7Orrd83rx5PPLIIzz77LNoNBqeffZZnnzySZYtW2a1umhqHOf/rebNm4dOp2PixIkNbmPXrl2E+bhQkPYDbz/3MGdNGwjWn4b81LoF3QMhZjh0HAExw0BXt6+zyaT486c/8/lPZ/Bw1vH7DmUczqo7Q441vxNXqqysjLFjx/L666/Xm1ktISGBkydP4uHhwfr167nrrrtITU1tZEvXZteuXXW+g506dWLAgAvfweask5qaGtauXWuete0izVknTdWcdXO983R2xNXRAQ8XHZ7OOjxddHi46PBw1uHp4oini/nZy8W8ztvVES9XR7xcziWvbo54OOnQaqX+W4J9Z/fxxFdPUFhdSEffjjzc/WF7hySETTQ5mVVKDeUKxyyMiIjg9OnTlvcZGRmEhYXVKRMQEEBRUREGgwGdTlenTLm+3DLof0PDVJ2tOIvnfZ50de9abyzV89Oseum8aBPRhjV71xAR0fCd825h5qGhgoKCGD16NHv27KmTSDTVli1bmlTuoYce4s4776y3vCn1ZY04li9fzrp1n7N1/Wdo8o5CSSaUnDE/ik5BfhphealQkYcfMCceyD7XRcPBGULjoP0t0PF2CO8J2oaHq1FK8fznB/kkJQMXRy3Lft8LY/YRNq+49DGer4eIiAgMBgPFxcWWS8jWpNfrGTt2LBMnTmTMmPpD1NRObm+//XYeffRR8vLybNJCfL4OGvsOWuu70RQbNmwgISGB4ODgeuuas04AgoODLd0psrKyCAoKqlcmIiLC0hUBzHUzcOBAm8RzvesW4c3hv99m7zCEFSSnJfP87ufRm/T8Jvw3vDLgFZm2Vly3bNrNYOTIkdx7773MmjWLM2fOkJqaSu/e5j6UBpOB3IpcssqzSJiYwKyVswiKDmLzd5vRjtfSf0V/DCYDwW7BlsQ0xC2E3qG9LYlrsFswLjqXS8awceNGOnXq1GgiW15ejslkwtPTk/LycjZv3myTO9Zr929cs2YNXbt2rVemV69eF6aADQtl9ccr+GD5u+ZL+yYDGPVgrDn3XA2GavN7QzXoK0FfUeu5AqpLzY+qEqgugapiSs+mc3tBJpNGOKB5o+Olg3Z0x+jbgS9/PE3b/mPonHQ/BHer1/raEKUUCzceYfnukzg5aPnvpER6d/DD0KZX49PcnjNy5EiWL19Ov379WL16NYMHD7Z6S5tSiqlTpxIbG8usWbMaLJOdnU1wcDAajYY9e/ZgMpnw9/e3ahzQtO/gyJEjWbx4MRMmTOD777/H29vbZl0MVqxY0WgXg+aqk/POfxdmz57N8uXLG7wKNHz4cP7yl79QWFgIwObNmxtsVRbiRmBSJt7Y9wbLDpiv/E2MncifEv8k/WTFdc0qM4CtWbOGxx57jNzcXHx8fIiPj2fTpk2off9jzZcvkl5dTp6zA5XeXuQ7a8nSmMhTJvzREoKWIAO4FpbgV6UnTKMjLiiEMKXFBw2aK560p26IJ0+ewt3djQD/Cy1HeoOe06dPEdkhkuqaatLT09Fgvp/Jx8eHYEvrjzIvrL1dpeq/rl2mzrJzz0qRX5CHQa9HqwFHBy2enh44aMBkNFBVWY6biwuYjJhMBrTq0uMjWku5XkOpxoOQmB5U6HxZ+/VeJjz0R87UuDN51ovkVDlgMBi59957mTNnTpO3azQpXtt8hLe2H0On1fD2fT1J6nyhlW/9+vU88cQTlmlu58yZw3PPPcdHH31EWloaVVVV3H///ezfvx8/Pz9WrlxpuSnOWnbu3Mktt9xCt27d0J5rWX7ppZc4dcrcB/jhhx9m8eLFvP322+h0OlxdXVm0aBH9+/e3ahxgHgLr/KgBBoPBUt///ve/LbEopZgxYwYbN27Ezc2Nd999t84QeNZSUVFBmzZtOH78uGX0j9px2LJO7rnnHrZv305eXh7BwcE8//zz3HXXXdx9992cOnWKtm3bsmrVKvz8/EhJSeHf//43S5cuBcz9v1966SUA5syZwwMPPNDQLm7Ea98yA9gNQm/UszF9Ix8c+oDDBYdx0Djwlz5/4e6Od9tsnzIDmLCxJp+zbTud7c5/MOvnxfgajYQajIQYDOZno4Egg5Eba5S/K6TRgsbBPDKAg+O5ZyfzawdH8+V+ndO5Z2dwdD33cLvw7OwFLl7g7Gl+uHiDWwC4+YObn/lzVpZTWsUfP/6RXWn5aDXwxoQe/LZ70y6Hy4lR2Jgks+K6U1hVyCdHPmHlkZXkVeYB4Ofix4JbFtAvrN9lPn1t5JwtbKyFTGfb/V4WRQ68hg1c5W9Pg5ekG1hWr1xjZTQXldc08PqiMhqNOSGtvV7rcC5J1dZ6rzEnrRqt+b1Wdy6JbX3TJu5Ky2Pmyh/JK6vG392Jf4yPZ8BNgfYOSwghrht5lXn8lPMT+3P282PujxzMP4jBZB4OLdonmkmdJ3F75O2W/rGXmtSoto0bNzJz5oUJey6ezEaIlsy2yaxnsPkhrmsGo4k3t6byz6/SUAr6RvrxxoQeBHtduj+zEELcqJRSGJSBGmMNNcYaqo3VlNaUUqYvMz/XlFFYXWiZ7vz8jJI5lXWHp9Og4ZbwW7i/8/30De1b7/6CxiY1qq05JuwRwpakR7i4akopNh7I5pXNRzieW45GAzOHxPD4kBgcZGgeIVqdw/mHmbOr6f3jW6sr7F5X53PqXM8NhbJs5/zri59NmDAp88NoMmJQBsvrGlMNJnX5aXkv5qpzJS4wjvjAeHoE9aBbYDe8nLwaLd/YpEa1NceEPULYkiSz4qrsSstj4cZf+TmjGIC2fm4sGNON/tG2GaJJCGF7VcYqUgvtO27wjUSn0eHo4IizgzNOWic8nDzwcPLA09ETDycPvJ28zSP51BpyMsgtyOojE1zJhD1LlixhyZIlAOTm5lo1DiGuliSzosmq9Ea2HD7Lh9+dYvfxfAACPZ15fEgM4xPb4KRrff18hRAXdPTtyOrfXsMU161IU4b7a2g0HQ0ay2c1aMy3RZz/T3PhWavRokVrea3T6nDQOKDT6tBqtDhqHa2WlDZlopxLuZKJR6ZNm2aZodAWI6oIcTUkmRWXpJTil8xiVu/NIPnHMxRX6gHwctHx8MAoft+/PW5O8jUS4nrg5uhGR7/LjD8tWpymTtjTmOaclEUIW5AsRNRTXKln97F8dqblsjM1j/T8Csu6LmFe/K5nBKN7RODtJoOrCSFEa1dnwp5GJrMRoiWTZPYGpzeaSMsp4+CZEg6eKebH00X8dLoIU62rTn7uTtwVH864nhF0Dmv8RgMhhBAtS+1Jje644w7LpEZnzpzhwQcfZP369eh0OhYvXszw4cMtk9l06dLF3qEL0WS2nTRBtAhVeiO5pdVkFFZyqqCck/kVnCyoID2vnNScMmoMde+o1Wk1JLT15eaYAG6OCSAu3Budg+37w8oA3MLGbsQhNuScLWxGztnCxlrIpAnimplMimqDiSq9kSqDkSq9iYoaA+XVRsqrDZTXGCivNlBcqaeoQk9RpZ7iSj2F5TXklFaTU1JFSZXhkvto6+dGlzAv8yPcm17t/fBwlq+GEEIIIVo+m2Ys3xzNZfOh+ndYXq2LG5EbanJouKG5/sLz5SzP58cOVBdKm1+bF1xYZi5pUhdeo8CkFOrc8/l1JqUwnnttNJkfJqUwmBQmk/nZYFQYTCbL6xqjCb3RRI3B/Kw3XnvDik6rIdDTmVBvF9r5u9PWz412/uZHTLAnXi7S91UIIYQQrZNNk9mDZ0r433enbLmLG4KzTouLowMujuZnV0cH3J11uDvr8HB2wN1Jh7erI96ujvi4OeLt5oSPqyNBXs4Eebrg4+qIViYxEEIIIcR1yKbJ7C0xAXg4W7kT+UVj3zWUojU0PF6D4wVq6m7jwnuNZaEGzo0dWGu9BrTnxxnUaNCeG2dQqzlXVgMOGg1arfm9g0aD9tx7B40GnYP5vU6rReegQafVoHPQotNqcNJpcXTQnnvW4OSgbdJ4iC3JU089xeeff46TkxNRUVG8++67+Pj42DssIYQQQlyH5AYwYXWbN29m8ODB6HQ6nnnmGQAWLlx42c/JzQTCxlrXX4XWIedsYTNyzhY21uRztkzZJKxu2LBh6HTmRv++ffuSkZFh54iEEEIIcb2SZFbY1LJlyxgxYkSj65csWUJiYiKJiYkyz7cQQgghrph0MxBXpSlzgc+bN4+UlBQ+/fTTJvX7lUtWwsakm4EQViTnbGFjMs6ssK3LzQW+fPly1q1bx9atW1vdDWxCCCGEaD0kmRVWt3HjRhYuXMjXX3+Nm5ubvcMRQgghxHVM+swKq5sxYwalpaUkJSURHx/Pww8/bO+QhBBCCHGdkpZZYXVpaWn2DkEIIYQQN4grvQFMCJvRaDQblVK32TsOIYQQlyfnbNFSSDIrhBBCCCFaLekzK4QQQgghWi1JZoUQQgghRKslyawQQgghhGi1JJkVQgghhBCtliSzQgghhBCi1ZJkVgghhBBCtFqSzAohhBBCiFZLklkhhBBCCNFqSTIrhBBCCCFarf8HqxDUrnPIQCYAAAAASUVORK5CYII=\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0xfa7bef0>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "#--------------------------------------------------\n",
    "# tf.nn.relu(features, name=None)\n",
    "# tf.nn.relu6(features, name=None)\n",
    "# tf.nn.softplus(features, name=None)\n",
    "# tf.nn.dropout(x, keep_prob, noise_shape=None, seed=None, name=None)\n",
    "# tf.nn.bias_add(value, bias, name=None)\n",
    "# tf.sigmoid(x, name=None)\n",
    "# tf.tanh(x, name=None)\n",
    "# tf.nn.elu(features, name=None)\n",
    "# tf.nn.crelu(features, name=None)\n",
    "# tf.nn.softsign(features, name=None)\n",
    "#--------------------------------------------------\n",
    "# 画图认识一下 \n",
    "x = np.linspace(-10,10, 60)  \n",
    "def elu(x,a):  \n",
    "    y=[]  \n",
    "    for i in x:  \n",
    "        if i>=0:  \n",
    "            y.append(i)  \n",
    "        else:  \n",
    "            y.append(a*np.exp(i)-1)  \n",
    "    return y\n",
    "\n",
    "def func_selu(x):\n",
    "    #with tf.name_scope('elu') as scope:\n",
    "    alpha = 1.6732632423543772848170429916717\n",
    "    scale = 1.0507009873554804934193349852946\n",
    "    y = scale*np.where(x>=0.0, x, alpha*(np.exp(x)-1))\n",
    "    return y #tf.nn.\n",
    "    \n",
    "relu=np.maximum(x,[0]*60)  \n",
    "relu6=np.minimum(np.maximum(x,[0]*60),[6]*60)  \n",
    "softplus=np.log(np.exp(x)+1)  \n",
    "elu=elu(x,1)  \n",
    "selu=func_selu(x)\n",
    "softsign=x/(np.abs(x)+1)  \n",
    "sigmoid=1/(1+np.exp(-x))  \n",
    "tanh=np.tanh(x)  \n",
    "prelu=np.maximum(0.1*x,x) #lrelu类似\n",
    "\n",
    "plt.figure(figsize=(12,4))#matplotlib 的 figure 就是一个 单独的 figure 小窗口  \n",
    "# plt.figure()\n",
    "plt.subplot(121)\n",
    "# plt.plot(x, relu6,label='relu6',linewidth=3.0)\n",
    "plt.plot(x, selu,label='selu',linewidth=2.0)  \n",
    "plt.plot(x, relu,label='relu',color='black',linestyle='--',linewidth=2.0)  \n",
    "plt.plot(x, elu,label='elu',linewidth=2.0)  \n",
    "plt.plot(x, prelu,label='prelu',linewidth=1.0)  \n",
    "  \n",
    "ax = plt.gca()  \n",
    "ax.spines['right'].set_color('none')  \n",
    "ax.spines['top'].set_color('none')  \n",
    "ax.xaxis.set_ticks_position('bottom')  \n",
    "ax.spines['bottom'].set_position(('data', 0))  \n",
    "ax.yaxis.set_ticks_position('left')  \n",
    "ax.spines['left'].set_position(('data', 0))  \n",
    "plt.legend(loc='best')  \n",
    "\n",
    "# plt.figure()  \n",
    "plt.subplot(122)\n",
    "plt.ylim((-1.2, 1.2))  \n",
    "plt.plot(x, softsign,label='softsign',linewidth=2.0)  \n",
    "plt.plot(x, sigmoid,label='sigmoid',linewidth=2.0)  \n",
    "plt.plot(x, tanh,label='tanh',linewidth=2.0)  \n",
    "plt.plot(x, softplus,label='softplus',linewidth=2.0)  \n",
    "# plt.plot(x, hyperbolic_tangent,label='hyperbolic_tangent',linewidth=2.0)  \n",
    "ax = plt.gca()  \n",
    "ax.spines['right'].set_color('none')  \n",
    "ax.spines['top'].set_color('none')  \n",
    "ax.xaxis.set_ticks_position('bottom')  \n",
    "ax.spines['bottom'].set_position(('data', 0))  \n",
    "ax.yaxis.set_ticks_position('left')  \n",
    "ax.spines['left'].set_position(('data', 0))  \n",
    "plt.legend(loc='best')  \n",
    "plt.show()  "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "从上图看，激活函数有如下分类，将在训练中起到不同作用：\n",
    "1. 可导：sigmoid、selu，不可导：relu\n",
    "2. 线性： relu、prelu,非线性:selu、elu、sigmoid、tanh\n",
    "3. 变化率：例如selu大于elu"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 获取输入和标签："
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "placeholder,数值不可改变"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Create the model\n",
    "x = tf.placeholder(tf.float32, [None, 784])\n",
    "# 定义我们的ground truth 占位符 Define loss and optimizer\n",
    "y = tf.placeholder(tf.float32, [None, 10])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 定义函数"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### selu函数"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [],
   "source": [
    "def selu(x):\n",
    "    with tf.name_scope('elu') as scope:\n",
    "        alpha = 1.6732632423543772848170429916717\n",
    "        scale = 1.0507009873554804934193349852946\n",
    "        return scale*tf.where(x>=0.0, x, alpha*tf.nn.elu(x))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 模型初始化函数"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Xavier参考：\n",
    "1. https://blog.csdn.net/qq_26898461/article/details/50996507 重理论\n",
    "2. https://stackoverflow.com/questions/33640581/how-to-do-xavier-initialization-on-tensorflow/33711310  重实践"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [],
   "source": [
    "def initNN(InitializationMethod = 'gauss',initConstant = 0.1,stddevConstant = 0.1,\n",
    "           numHiddenUnits1=200,numHiddenUnits2=80,inputSize=784,numClasses=10):\n",
    "#----------------------------------------------------\n",
    "# 1 功能说明：\n",
    "#   参数初始化，最大支持3层神经网络，即2个隐层\n",
    "# 2 参数说明：\n",
    "#   InitializationMethod：初始化方法，支持3种，固定值'constant'，固定统计特性的高斯分布'gauss' ，以及'Xavier'方法\n",
    "#   initConstant：固定值\n",
    "#   stddevConstant：固定的高斯方差值\n",
    "#----------------------------------------------------\n",
    "    HiddenLayer2 = 1 if(numHiddenUnits2>0) else 0\n",
    "    numHiddenUnits2 = numHiddenUnits2 if(numHiddenUnits2>0) else numClasses\n",
    "    if(InitializationMethod == 'constant'):\n",
    "        W1 = tf.Variable(tf.zeros([inputSize, numHiddenUnits1])+initConstant) \n",
    "        W2 = tf.Variable(tf.zeros([numHiddenUnits1, numHiddenUnits2])+initConstant) \n",
    "        W3 = tf.Variable(tf.zeros([numHiddenUnits2, numClasses])+initConstant) \n",
    "        B1 = tf.Variable(tf.constant(initConstant), [numHiddenUnits1])\n",
    "        B2 = tf.Variable(tf.constant(initConstant), [numHiddenUnits2])\n",
    "        B3 = tf.Variable(tf.constant(initConstant), [numClasses])\n",
    "    elif(InitializationMethod == 'gauss'):\n",
    "        W1 = tf.Variable(tf.truncated_normal([inputSize, numHiddenUnits1], stddev=stddevConstant)) \n",
    "        W2 = tf.Variable(tf.truncated_normal([numHiddenUnits1, numHiddenUnits2], stddev=stddevConstant)) \n",
    "        W3 = tf.Variable(tf.truncated_normal([numHiddenUnits2, numClasses], stddev=stddevConstant)) \n",
    "        B1 = tf.Variable(tf.random_normal([numHiddenUnits1]))\n",
    "        B2 = tf.Variable(tf.random_normal([numHiddenUnits2]))\n",
    "        B3 = tf.Variable(tf.random_normal([numClasses]))\n",
    "    elif(InitializationMethod == 'Xavier'):\n",
    "        W1 = tf.get_variable(\"W1\", shape=[inputSize, numHiddenUnits1], initializer=tf.contrib.layers.xavier_initializer())\n",
    "        W2 = tf.get_variable(\"W2\", shape=[numHiddenUnits1, numHiddenUnits2], initializer=tf.contrib.layers.xavier_initializer())\n",
    "        W3 = tf.get_variable(\"W3\", shape=[numHiddenUnits2, numClasses], initializer=tf.contrib.layers.xavier_initializer())\n",
    "        B1 = tf.Variable(tf.random_normal([numHiddenUnits1]))\n",
    "        B2 = tf.Variable(tf.random_normal([numHiddenUnits2]))\n",
    "        B3 = tf.Variable(tf.random_normal([numClasses]))\n",
    "    else : #if(InitializationMethod == 'gauss'):\n",
    "        W1 = tf.Variable(tf.truncated_normal([inputSize, numHiddenUnits1], stddev=stddevConstant)) \n",
    "        W2 = tf.Variable(tf.truncated_normal([numHiddenUnits1, numHiddenUnits2], stddev=stddevConstant)) \n",
    "        W3 = tf.Variable(tf.truncated_normal([numHiddenUnits2, numClasses], stddev=stddevConstant)) \n",
    "        B1 = tf.Variable(tf.random_normal([numHiddenUnits1]))\n",
    "        B2 = tf.Variable(tf.random_normal([numHiddenUnits2]))\n",
    "        B3 = tf.Variable(tf.random_normal([numClasses]))\n",
    "    if(HiddenLayer2 == 0) :\n",
    "        W3 = tf.Variable(0) \n",
    "        \n",
    "    return W1,W2,W3,B1,B2,B3\n",
    "#--------------------------------------------------------"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [],
   "source": [
    "# #初始化测试\n",
    "# initConstant = 0.1\n",
    "# stddevConstant = 0.1\n",
    "# InitializationMethod = ['constant','gauss','Xavier']\n",
    "# W1,W2,W3,B1,B2,B3 = initNN(InitializationMethod[1],initConstant,stddevConstant,numHiddenUnits1[1],numHiddenUnits2[1],inputSize,numClasses)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 网络结构函数"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "层连接，激活函数"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [],
   "source": [
    "def connectNN(x,W1,W2,W3,B1,B2,B3,activationFunction = 'relu'):\n",
    "#----------------------------------------------------\n",
    "# 1 功能说明：\n",
    "#   层连接，激活函数选择，最大支持3层神经网络，即2个隐层\n",
    "# 2 参数说明：\n",
    "#   activationFunction：激活函数，支持3种，'relu','selu','elu'\n",
    "#   yout:输出是没有激活的output值\n",
    "#----------------------------------------------------\n",
    "    #第1个计算层：第1个隐层\n",
    "    logits1 = tf.matmul(x, W1) + B1\n",
    "    if(activationFunction == 'relu'):\n",
    "        output1 = tf.nn.relu(logits1)  \n",
    "    elif(activationFunction == 'selu'):\n",
    "        output1 = selu(logits1) \n",
    "    elif(activationFunction == 'elu'):\n",
    "        output1 = tf.nn.elu(logits1)  \n",
    "    else:\n",
    "        output1 = tf.nn.elu(logits1) \n",
    "    #第2个计算层：第2个隐层或输出层 \n",
    "    logits2 = tf.matmul(output1, W2) + B2\n",
    "    if(W3.shape != []):\n",
    "        if(activationFunction == 'relu'):\n",
    "            output2 = tf.nn.relu(logits2)  \n",
    "        elif(activationFunction == 'selu'):\n",
    "            output2 = selu(logits2) \n",
    "        elif(activationFunction == 'elu'):\n",
    "            output2 = tf.nn.elu(logits2)  \n",
    "        else:\n",
    "            output2 = tf.nn.elu(logits2) \n",
    "        logits3 = tf.matmul(output2, W3) + B3\n",
    "    #选择输出    \n",
    "    yout = logits2 if(W3.shape == []) else logits3\n",
    "    return yout\n",
    "#----------------------------------------------------"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [],
   "source": [
    "# #激活函数测试\n",
    "# activationFunction = ['relu','selu','elu']\n",
    "# yout = connectNN(x,W1,W2,W3,B1,B2,B3,activationFunction = activationFunction[2])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 正则函数"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "参考例程：\n",
    "1. https://blog.csdn.net/u012560212/article/details/73000740"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [],
   "source": [
    "def regularNN(W1,W2,W3,RegularExpression = 'l2',regular_lambda = 0.0001):\n",
    "#----------------------------------------------------\n",
    "# 1 功能说明：\n",
    "#   计算正则损失，最大支持3层神经网络，即2个隐层\n",
    "# 2 参数说明：\n",
    "#   RegularExpression：正则函数，支持2种，'l1' , 'l2'\n",
    "#   regular_lambda   ：惩罚因子\n",
    "#   cost_regular     : 输出正则损失\n",
    "#----------------------------------------------------\n",
    "    #选择每层的正则函数\n",
    "    if(RegularExpression == 'l1'):\n",
    "        cost_w1 = tf.contrib.layers.l1_regularizer(regular_lambda)(W1)  \n",
    "        cost_w2 = tf.contrib.layers.l1_regularizer(regular_lambda)(W2)\n",
    "    else:\n",
    "        cost_w1 = tf.contrib.layers.l2_regularizer(regular_lambda)(W1)  \n",
    "        cost_w2 = tf.contrib.layers.l2_regularizer(regular_lambda)(W2)\n",
    "    #形成正则代价函数\n",
    "    if(W3.shape == []):\n",
    "        cost_regular = tf.reduce_mean(cost_w1 + cost_w2)   \n",
    "    else:\n",
    "        if(RegularExpression == 'l1'):\n",
    "            cost_w3 = tf.contrib.layers.l1_regularizer(regular_lambda)(W3)\n",
    "        else:\n",
    "            cost_w3 = tf.contrib.layers.l2_regularizer(regular_lambda)(W3)\n",
    "        cost_regular = tf.reduce_mean(cost_w1 + cost_w2 + cost_w3)\n",
    "    return cost_regular\n",
    "#----------------------------------------------------"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [],
   "source": [
    "# # 正则测试\n",
    "# RegularExpression = ['l1','l2']\n",
    "# regular_lambda = 0.0001\n",
    "# cost_regular = regularNN(W1,W2,W3,RegularExpression[1],regular_lambda)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 学习率函数"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "参考例程：\n",
    "1. https://blog.csdn.net/zSean/article/details/75196092  讲理论\n",
    "2. https://blog.csdn.net/uestc_c2_403/article/details/72213286  有图示"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [],
   "source": [
    "def learningRateNN(learning_rate = 0.3,global_step=tf.Variable(0),decay_steps = 150,decay_rate = 0.98,staircase=True):\n",
    "#----------------------------------------------------\n",
    "# 1 功能说明：\n",
    "#   定义优化方法，学习率，梯度下降\n",
    "# 2 参数说明：如下\n",
    "#----------------------------------------------------\n",
    "    #--------------学习速率的设置（学习速率呈指数下降）---------------------  \n",
    "    # tf.train.exponential_decay(learning_rate, global_, decay_steps, decay_rate, staircase=False)\n",
    "    # 公式：decayed_learning_rate=learining_rate*decay_rate^(global_step/decay_steps) \n",
    "    # learning_rate为事先设定的初始学习率；\n",
    "    # global_ = tf.Variable(0)  #必须是tf变量，且最好是0\n",
    "    # decay_steps为衰减速度。\n",
    "    # decay_rate为衰减系数；\n",
    "    # 默认值为False,当为True时，（global_step/decay_steps）则被转化为整数) ,选择不同的衰减方式，因此一般设置为Ture更好\n",
    "    eita = tf.train.exponential_decay(learning_rate,global_step,decay_steps,decay_rate,staircase) \n",
    "    return eita"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 学习率测试\n",
    "# eita =  learningRateNN(learning_rate = 0.3,global_step=tf.Variable(0),decay_steps = 150,decay_rate = 0.98,staircase=True)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 损失计算函数"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {},
   "outputs": [],
   "source": [
    "def lossNN(y_logits,loss_calculation = 'softmax'):\n",
    "#----------------------------------------------------\n",
    "# 1 功能说明：\n",
    "#   定义损失函数，及其计算方法\n",
    "# 2 参数说明：\n",
    "#   y_logits：输入未激活的logits\n",
    "#   loss_function: 支持3种损失计算方式，'manual'手动计算，'softmax'系统函数计算，'mse'最小均方差\n",
    "#----------------------------------------------------\n",
    "    # ----------- 交叉熵计算方法1：使用系统函数 \n",
    "    \"\"\"\n",
    "    接下来我们计算交叉熵，\n",
    "    1. 注意这里不要使用注释中的手动计算方式，而是使用系统函数。\n",
    "    2. 另一个注意点就是，softmax_cross_entropy_with_logits的logits参数是**未经激活的wx+b**\n",
    "    \"\"\"\n",
    "    # The raw formulation of cross-entropy,\n",
    "    #\n",
    "    #   tf.reduce_mean(-tf.reduce_sum(y * tf.log(tf.nn.softmax(y)),\n",
    "    #                                 reduction_indices=[1]))\n",
    "    #\n",
    "    # can be numerically unstable.\n",
    "    #\n",
    "    # So here we use tf.nn.softmax_cross_entropy_with_logits on the raw\n",
    "    # outputs of 'y', and then average across the batch.\n",
    "    # cross_entropy = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=yout))  \n",
    "    # The raw formulation end\n",
    "#     cross_entropy = tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=yout)\n",
    "    # ----------- 交叉熵计算方法2：手动计算 \n",
    "    y_pred = tf.nn.softmax(y_logits)    #relu  #softmax #elu\n",
    "#     y_pred = selu(yout)    \n",
    "    #计算交叉熵  \n",
    "#     cross_entropy = -tf.reduce_sum(y*tf.log(y_pred),reduction_indices=[1]) #\n",
    "#     cross_entropy = tf.square(y-y_pred)\n",
    "    # \n",
    "#     cost = tf.reduce_mean(cross_entropy)\n",
    "    if(loss_calculation == 'manual'):\n",
    "        cross_entropy = -tf.reduce_sum(y*tf.log(y_pred),reduction_indices=[1])\n",
    "    elif(loss_calculation == 'softmax'):\n",
    "        cross_entropy = tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=y_logits)\n",
    "    elif(loss_calculation == 'mse'):\n",
    "        cross_entropy = tf.square(y-y_pred)\n",
    "    else:\n",
    "        cross_entropy = tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=y_logits)\n",
    "    #-----------------------\n",
    "    return cross_entropy\n",
    "#----------------------------------------------------"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [],
   "source": [
    "#损失函数测试\n",
    "# y_logits = y\n",
    "# loss = lossNN(y_logits = y_logits,loss_calculation = 'softmax')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 训练函数"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [],
   "source": [
    "def trainingNN(trainingIterations = 13000,batchSize = 100,\n",
    "               #网络结构参数\n",
    "               numHiddenUnits1 = 200,numHiddenUnits2 = 80,inputSize = 784,numClasses = 10,               \n",
    "               #初始化参数\n",
    "               initConstant = 0.1, stddevConstant = 0.1, InitializationMethod = 'gauss',               \n",
    "               #激活函数参数\n",
    "               activationFunction = 'relu',\n",
    "               #正则参数\n",
    "               RegularExpression = 'l2',  regular_lambda = 0.0001,\n",
    "               #学习率参数\n",
    "               learning_rate = 0.4,global_step=tf.Variable(0),decay_steps = 150,decay_rate = 0.98,staircase=True,\n",
    "               #损失函数参数\n",
    "               loss_calculation = 'softmax'\n",
    "               ):\n",
    "#----------------------------------------------------\n",
    "# 1 功能说明：\n",
    "#   调用初始化、激活、正则、学习率、损失计算等各函数，完成训练、精度计算、预测\n",
    "# 2 参数说明：\n",
    "#   见具体函数。\n",
    "#----------------------------------------------------\n",
    "    #初始化\n",
    "#     initConstant = 0.1\n",
    "#     stddevConstant = 0.1\n",
    "#     InitializationMethod = ['constant','gauss','Xavier']\n",
    "    W1,W2,W3,B1,B2,B3 = initNN(InitializationMethod,initConstant,stddevConstant,numHiddenUnits1,numHiddenUnits2,inputSize,numClasses)\n",
    "    #-----------------------\n",
    "    #激活函数\n",
    "#     activationFunction = ['relu','selu','elu']\n",
    "    #y_logits是没有激活的logits值\n",
    "    y_logits = connectNN(x,W1,W2,W3,B1,B2,B3,activationFunction = activationFunction)\n",
    "    #-----------------------\n",
    "    #正则\n",
    "#     RegularExpression = ['l1','l2']\n",
    "#     regular_lambda = 0.0001\n",
    "    cost_regular = regularNN(W1,W2,W3,RegularExpression,regular_lambda)\n",
    "    #-----------------------\n",
    "    #损失:交叉熵或mse\n",
    "    cross_entropy = lossNN(y_logits = y_logits,loss_calculation = loss_calculation)\n",
    "    #-----------------------\n",
    "    #代价\n",
    "    cost = tf.reduce_mean(cross_entropy + cost_regular)\n",
    "    # 学习率：\n",
    "#     global_step = tf.Variable(0)\n",
    "    eita =  learningRateNN(learning_rate,global_step,decay_steps,decay_rate,staircase)\n",
    "    #-----------------------\n",
    "    # 优化方法：梯度下降\n",
    "    # train_step = tf.train.GradientDescentOptimizer(learning_rate = learning_rate).minimize(cost,global_step=global_step)\n",
    "    opti = tf.train.GradientDescentOptimizer(learning_rate = eita).minimize(cost)\n",
    "    #-----------------------  \n",
    "    # 精度    \n",
    "    y_pred = tf.nn.softmax(y_logits)    \n",
    "    correct_prediction = tf.equal(tf.argmax(y_pred, 1), tf.argmax(y, 1))\n",
    "    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))\n",
    "    #-----------------------\n",
    "    # 声明session\n",
    "    sess = tf.Session()\n",
    "    #所有变量初始化\n",
    "    init_op = tf.global_variables_initializer()\n",
    "    sess.run(init_op)\n",
    "    #-----------------------\n",
    "    for i in range(trainingIterations): #trainingIterations\n",
    "        batch_xs, batch_ys = mnist.train.next_batch(batchSize)\n",
    "        feed_dict={x: batch_xs, y: batch_ys}\n",
    "        batch_eita = sess.run(eita,feed_dict={global_step: i}) \n",
    "        batch_opti,batch_loss = sess.run([opti,cost], feed_dict=feed_dict)\n",
    "        if (i%1000 == 0) or (i == (trainingIterations-1)):\n",
    "            trainAccuracy = accuracy.eval(session=sess, feed_dict=feed_dict)\n",
    "            testAccuracy  = accuracy.eval(session=sess, feed_dict = {x: mnist.test.images,y: mnist.test.labels})\n",
    "            print (\"step=%5d, trainAccuracy=%1.2f, loss=%1.6f, eita=%1.6f,testAccuracy=%1.4f\"\n",
    "                   %(i,       trainAccuracy,     batch_loss,   batch_eita,testAccuracy      ))\n",
    "    #-------------\n",
    "#     acc = accuracy.eval(session=sess, feed_dict = {x: mnist.test.images,y: mnist.test.labels})\n",
    "#     print(\"testing accuracy=%1.4f\"%(acc))\n",
    "#----------------------------------------------------"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 训练测试\n",
    "# trainingNN()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 开始训练"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 训练参数"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "trainingIterations = 11000\n"
     ]
    }
   ],
   "source": [
    "#类别数量\n",
    "numClasses = 10 \n",
    "#特征维度\n",
    "inputSize = 784 \n",
    "#样本遍历次数\n",
    "epochs = 20\n",
    "#训练迭代次数\n",
    "trainingIterations = 11000 #int(trainimg.shape[0]*epochs/batchSize)\n",
    "print(\"trainingIterations = %d\"% trainingIterations)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 监测量选择说明"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "选取了随迭代变化、对性能影响大的量，打印输出包括：       \n",
    "1. step      ： 迭代序号，标记训练过程\n",
    "2. trainAccuracy：训练精度，观察变化 \n",
    "3. loss      ： 训练损失，观察是否训练不足或过拟合\n",
    "4. eita      ： 学习率，观察步进情况，了解参数的量级\n",
    "5. testAccuracy ：预测精度，观察拟合情况和预测效果"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 训练参数的影响"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "以batchSize为例"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "batchSize =  10\n",
      "step=    0, trainAccuracy=0.20, loss=2.792613, eita=0.400000,testAccuracy=0.1135\n",
      "step= 1000, trainAccuracy=1.00, loss=0.357671, eita=0.354337,testAccuracy=0.8534\n",
      "step= 2000, trainAccuracy=1.00, loss=0.257041, eita=0.307609,testAccuracy=0.9088\n",
      "step= 3000, trainAccuracy=0.90, loss=0.529501, eita=0.267043,testAccuracy=0.9156\n",
      "step= 4000, trainAccuracy=1.00, loss=0.522495, eita=0.236558,testAccuracy=0.9059\n",
      "step= 5000, trainAccuracy=1.00, loss=0.531087, eita=0.205362,testAccuracy=0.9358\n",
      "step= 6000, trainAccuracy=0.90, loss=0.508001, eita=0.178280,testAccuracy=0.9150\n",
      "step= 7000, trainAccuracy=1.00, loss=0.410735, eita=0.157928,testAccuracy=0.9284\n",
      "step= 8000, trainAccuracy=0.90, loss=0.426208, eita=0.137102,testAccuracy=0.9380\n",
      "step= 9000, trainAccuracy=0.90, loss=1.071163, eita=0.119021,testAccuracy=0.9458\n",
      "step=10000, trainAccuracy=1.00, loss=0.346427, eita=0.105434,testAccuracy=0.9235\n",
      "step=11000, trainAccuracy=1.00, loss=0.904257, eita=0.091530,testAccuracy=0.9426\n",
      "step=12000, trainAccuracy=0.90, loss=0.756698, eita=0.079460,testAccuracy=0.8925\n",
      "step=12999, trainAccuracy=1.00, loss=0.158469, eita=0.070389,testAccuracy=0.9529\n",
      "batchSize =  100\n",
      "step=    0, trainAccuracy=0.16, loss=2.851851, eita=0.400000,testAccuracy=0.1192\n",
      "step= 1000, trainAccuracy=1.00, loss=0.120922, eita=0.354337,testAccuracy=0.9676\n",
      "step= 2000, trainAccuracy=1.00, loss=0.183732, eita=0.307609,testAccuracy=0.9716\n",
      "step= 3000, trainAccuracy=1.00, loss=0.135332, eita=0.267043,testAccuracy=0.9725\n",
      "step= 4000, trainAccuracy=1.00, loss=0.068604, eita=0.236558,testAccuracy=0.9780\n",
      "step= 5000, trainAccuracy=1.00, loss=0.070823, eita=0.205362,testAccuracy=0.9750\n",
      "step= 6000, trainAccuracy=1.00, loss=0.077957, eita=0.178280,testAccuracy=0.9791\n",
      "step= 7000, trainAccuracy=1.00, loss=0.061592, eita=0.157928,testAccuracy=0.9802\n",
      "step= 8000, trainAccuracy=1.00, loss=0.057855, eita=0.137102,testAccuracy=0.9789\n",
      "step= 9000, trainAccuracy=1.00, loss=0.053233, eita=0.119021,testAccuracy=0.9809\n",
      "step=10000, trainAccuracy=1.00, loss=0.050338, eita=0.105434,testAccuracy=0.9815\n",
      "step=11000, trainAccuracy=1.00, loss=0.048081, eita=0.091530,testAccuracy=0.9806\n",
      "step=12000, trainAccuracy=1.00, loss=0.046202, eita=0.079460,testAccuracy=0.9816\n",
      "step=12999, trainAccuracy=1.00, loss=0.044328, eita=0.070389,testAccuracy=0.9810\n"
     ]
    }
   ],
   "source": [
    "#定义一个批次的样本数量，并训练\n",
    "batchSize   = [10,100]\n",
    "for i in range(len(batchSize)):\n",
    "    print('batchSize = ' ,batchSize[i])\n",
    "    trainingNN(batchSize = batchSize[i])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "小结：\n",
    "* 整体上，训练中对样本使用重复覆盖多，有助于提高精度\n",
    "* 不一定训练次数越多越好，训练少会欠拟合，训练多容易过拟合，需要根据参数和表现调试\n",
    "* 模型越复杂，相对要求的训练时间越长才能出好效果，需要在效果和时间上找到平衡"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 初始化的影响"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "InitializationMethod =  Xavier\n",
      "step=    0, trainAccuracy=0.17, loss=2.429571, eita=0.400000,testAccuracy=0.1501\n",
      "step= 1000, trainAccuracy=1.00, loss=0.146191, eita=0.354337,testAccuracy=0.9592\n",
      "step= 2000, trainAccuracy=1.00, loss=0.097048, eita=0.307609,testAccuracy=0.9753\n",
      "step= 3000, trainAccuracy=1.00, loss=0.058004, eita=0.267043,testAccuracy=0.9779\n",
      "step= 4000, trainAccuracy=1.00, loss=0.069011, eita=0.236558,testAccuracy=0.9788\n",
      "step= 5000, trainAccuracy=1.00, loss=0.047101, eita=0.205362,testAccuracy=0.9792\n",
      "step= 6000, trainAccuracy=1.00, loss=0.035372, eita=0.178280,testAccuracy=0.9800\n",
      "step= 7000, trainAccuracy=1.00, loss=0.033875, eita=0.157928,testAccuracy=0.9814\n",
      "step= 8000, trainAccuracy=1.00, loss=0.047923, eita=0.137102,testAccuracy=0.9815\n",
      "step= 9000, trainAccuracy=1.00, loss=0.032775, eita=0.119021,testAccuracy=0.9815\n",
      "step=10000, trainAccuracy=1.00, loss=0.030523, eita=0.105434,testAccuracy=0.9819\n",
      "step=11000, trainAccuracy=1.00, loss=0.028906, eita=0.091530,testAccuracy=0.9819\n",
      "step=12000, trainAccuracy=1.00, loss=0.028827, eita=0.079460,testAccuracy=0.9823\n",
      "step=12999, trainAccuracy=1.00, loss=0.027832, eita=0.070389,testAccuracy=0.9825\n",
      "InitializationMethod =  gauss , stddevConstant =  0.1\n",
      "step=    0, trainAccuracy=0.27, loss=3.141652, eita=0.400000,testAccuracy=0.2053\n",
      "step= 1000, trainAccuracy=0.99, loss=0.216906, eita=0.354337,testAccuracy=0.9609\n",
      "step= 2000, trainAccuracy=1.00, loss=0.103823, eita=0.307609,testAccuracy=0.9741\n",
      "step= 3000, trainAccuracy=1.00, loss=0.101925, eita=0.267043,testAccuracy=0.9747\n",
      "step= 4000, trainAccuracy=1.00, loss=0.097328, eita=0.236558,testAccuracy=0.9762\n",
      "step= 5000, trainAccuracy=1.00, loss=0.108738, eita=0.205362,testAccuracy=0.9798\n",
      "step= 6000, trainAccuracy=1.00, loss=0.079559, eita=0.178280,testAccuracy=0.9780\n",
      "step= 7000, trainAccuracy=1.00, loss=0.059410, eita=0.157928,testAccuracy=0.9806\n",
      "step= 8000, trainAccuracy=1.00, loss=0.058458, eita=0.137102,testAccuracy=0.9807\n",
      "step= 9000, trainAccuracy=1.00, loss=0.053001, eita=0.119021,testAccuracy=0.9815\n",
      "step=10000, trainAccuracy=1.00, loss=0.051657, eita=0.105434,testAccuracy=0.9821\n",
      "step=11000, trainAccuracy=1.00, loss=0.056265, eita=0.091530,testAccuracy=0.9804\n",
      "step=12000, trainAccuracy=1.00, loss=0.078762, eita=0.079460,testAccuracy=0.9740\n",
      "step=12999, trainAccuracy=1.00, loss=0.069644, eita=0.070389,testAccuracy=0.9759\n",
      "InitializationMethod =  constant , initConstant  =  0.1\n",
      "step=    0, trainAccuracy=0.14, loss=2.389304, eita=0.400000,testAccuracy=0.0980\n",
      "step= 1000, trainAccuracy=0.15, loss=715.209839, eita=0.354337,testAccuracy=0.0980\n",
      "step= 2000, trainAccuracy=0.06, loss=660.409241, eita=0.307609,testAccuracy=0.0980\n",
      "step= 3000, trainAccuracy=0.10, loss=609.808594, eita=0.267043,testAccuracy=0.0980\n",
      "step= 4000, trainAccuracy=0.12, loss=563.085999, eita=0.236558,testAccuracy=0.0980\n",
      "step= 5000, trainAccuracy=0.14, loss=520.001221, eita=0.205362,testAccuracy=0.0980\n",
      "step= 6000, trainAccuracy=0.03, loss=480.199890, eita=0.178280,testAccuracy=0.0980\n",
      "step= 7000, trainAccuracy=0.10, loss=443.440338, eita=0.157928,testAccuracy=0.0980\n",
      "step= 8000, trainAccuracy=0.10, loss=409.539032, eita=0.137102,testAccuracy=0.0980\n",
      "step= 9000, trainAccuracy=0.09, loss=378.218781, eita=0.119021,testAccuracy=0.0980\n",
      "step=10000, trainAccuracy=0.14, loss=349.317505, eita=0.105434,testAccuracy=0.0980\n",
      "step=11000, trainAccuracy=0.12, loss=322.635620, eita=0.091530,testAccuracy=0.0980\n",
      "step=12000, trainAccuracy=0.13, loss=298.004395, eita=0.079460,testAccuracy=0.0980\n",
      "step=12999, trainAccuracy=0.06, loss=275.294800, eita=0.070389,testAccuracy=0.0980\n"
     ]
    }
   ],
   "source": [
    "#定义初始化方法，并训练\n",
    "InitializationMethod = ['Xavier','gauss','constant'] #注意：Xavier需要先训练，因为使用了tf.get_variable()函数\n",
    "initConstant = 0.1\n",
    "stddevConstant = 0.1\n",
    "for i in range(len(InitializationMethod)):\n",
    "    #print\n",
    "    if(InitializationMethod[i] == 'constant'):\n",
    "        print('InitializationMethod = ' ,InitializationMethod[i],', initConstant  = ' ,initConstant)\n",
    "    elif(InitializationMethod[i] == 'gauss'):\n",
    "        print('InitializationMethod = ' ,InitializationMethod[i],', stddevConstant = ' ,stddevConstant)\n",
    "    if(InitializationMethod[i] == 'Xavier'):\n",
    "        print('InitializationMethod = ' ,InitializationMethod[i])\n",
    "    #training\n",
    "    trainingNN(InitializationMethod = InitializationMethod[i],initConstant=initConstant,stddevConstant=stddevConstant)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "小结：\n",
    "* Xavier方法效果最好，收敛很快(见监测量loss)，而且比较稳定(见监测量testAccuracy)\n",
    "* gauss方法效果低于Xavier，但也可以，需要其他参数配合，防止过拟合\n",
    "* 固定值方法只在单层神经网络中验证可行，在有隐层的网络中试了多次效果很差，可以看到误差在逐渐减少，但是迭代很慢受不了，可能需要其他参数配合。"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 网络结构的影响"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "隐层及神经元数目"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "numHiddenUnits1 =  100 , numHiddenUnits2 =  0\n",
      "step=    0, trainAccuracy=0.14, loss=3.424922, eita=0.400000,testAccuracy=0.1078\n",
      "step= 1000, trainAccuracy=0.99, loss=0.237386, eita=0.354337,testAccuracy=0.9590\n",
      "step= 2000, trainAccuracy=0.96, loss=0.258977, eita=0.307609,testAccuracy=0.9682\n",
      "step= 3000, trainAccuracy=0.98, loss=0.158048, eita=0.267043,testAccuracy=0.9718\n",
      "step= 4000, trainAccuracy=1.00, loss=0.075299, eita=0.236558,testAccuracy=0.9739\n",
      "step= 5000, trainAccuracy=1.00, loss=0.112960, eita=0.205362,testAccuracy=0.9769\n",
      "step= 6000, trainAccuracy=1.00, loss=0.054160, eita=0.178280,testAccuracy=0.9750\n",
      "step= 7000, trainAccuracy=1.00, loss=0.049235, eita=0.157928,testAccuracy=0.9783\n",
      "step= 8000, trainAccuracy=1.00, loss=0.091784, eita=0.137102,testAccuracy=0.9789\n",
      "step= 9000, trainAccuracy=1.00, loss=0.066709, eita=0.119021,testAccuracy=0.9777\n",
      "step=10000, trainAccuracy=1.00, loss=0.051409, eita=0.105434,testAccuracy=0.9781\n",
      "step=11000, trainAccuracy=1.00, loss=0.052146, eita=0.091530,testAccuracy=0.9786\n",
      "step=12000, trainAccuracy=1.00, loss=0.037634, eita=0.079460,testAccuracy=0.9784\n",
      "step=12999, trainAccuracy=1.00, loss=0.053306, eita=0.070389,testAccuracy=0.9785\n",
      "numHiddenUnits1 =  280 , numHiddenUnits2 =  0\n",
      "step=    0, trainAccuracy=0.20, loss=3.297922, eita=0.400000,testAccuracy=0.1208\n",
      "step= 1000, trainAccuracy=0.98, loss=0.279449, eita=0.354337,testAccuracy=0.9612\n",
      "step= 2000, trainAccuracy=0.98, loss=0.214556, eita=0.307609,testAccuracy=0.9739\n",
      "step= 3000, trainAccuracy=1.00, loss=0.168815, eita=0.267043,testAccuracy=0.9778\n",
      "step= 4000, trainAccuracy=1.00, loss=0.095884, eita=0.236558,testAccuracy=0.9749\n",
      "step= 5000, trainAccuracy=1.00, loss=0.115544, eita=0.205362,testAccuracy=0.9791\n",
      "step= 6000, trainAccuracy=1.00, loss=0.088438, eita=0.178280,testAccuracy=0.9798\n",
      "step= 7000, trainAccuracy=1.00, loss=0.084696, eita=0.157928,testAccuracy=0.9808\n",
      "step= 8000, trainAccuracy=1.00, loss=0.072764, eita=0.137102,testAccuracy=0.9798\n",
      "step= 9000, trainAccuracy=1.00, loss=0.074638, eita=0.119021,testAccuracy=0.9804\n",
      "step=10000, trainAccuracy=1.00, loss=0.066679, eita=0.105434,testAccuracy=0.9811\n",
      "step=11000, trainAccuracy=1.00, loss=0.073139, eita=0.091530,testAccuracy=0.9812\n",
      "step=12000, trainAccuracy=1.00, loss=0.065914, eita=0.079460,testAccuracy=0.9815\n",
      "step=12999, trainAccuracy=1.00, loss=0.059803, eita=0.070389,testAccuracy=0.9809\n",
      "numHiddenUnits1 =  200 , numHiddenUnits2 =  80\n",
      "step=    0, trainAccuracy=0.23, loss=3.512236, eita=0.400000,testAccuracy=0.1475\n",
      "step= 1000, trainAccuracy=0.99, loss=0.154811, eita=0.354337,testAccuracy=0.9624\n",
      "step= 2000, trainAccuracy=1.00, loss=0.154837, eita=0.307609,testAccuracy=0.9704\n",
      "step= 3000, trainAccuracy=1.00, loss=0.098287, eita=0.267043,testAccuracy=0.9726\n",
      "step= 4000, trainAccuracy=0.99, loss=0.098824, eita=0.236558,testAccuracy=0.9751\n",
      "step= 5000, trainAccuracy=1.00, loss=0.074335, eita=0.205362,testAccuracy=0.9753\n",
      "step= 6000, trainAccuracy=1.00, loss=0.064894, eita=0.178280,testAccuracy=0.9817\n",
      "step= 7000, trainAccuracy=1.00, loss=0.075974, eita=0.157928,testAccuracy=0.9809\n",
      "step= 8000, trainAccuracy=1.00, loss=0.057729, eita=0.137102,testAccuracy=0.9819\n",
      "step= 9000, trainAccuracy=1.00, loss=0.056534, eita=0.119021,testAccuracy=0.9815\n",
      "step=10000, trainAccuracy=1.00, loss=0.052044, eita=0.105434,testAccuracy=0.9821\n",
      "step=11000, trainAccuracy=1.00, loss=0.048793, eita=0.091530,testAccuracy=0.9823\n",
      "step=12000, trainAccuracy=1.00, loss=0.046599, eita=0.079460,testAccuracy=0.9817\n",
      "step=12999, trainAccuracy=1.00, loss=0.045531, eita=0.070389,testAccuracy=0.9827\n"
     ]
    }
   ],
   "source": [
    "#设置网络结构，并训练\n",
    "numHiddenUnits = [[100,  0], #第1种结构的第1、2隐层的神经元数目,0表示没有该隐层\n",
    "                  [280,  0], #第2种结构的第1、2隐层的神经元数目,0表示没有该隐层\n",
    "                  [200, 80]] #第3种结构的第1、2隐层的神经元数目\n",
    "for i in range(len(numHiddenUnits)):\n",
    "# for i in range(1,2):\n",
    "    numHiddenUnits1 = numHiddenUnits[i][0] \n",
    "    numHiddenUnits2 = numHiddenUnits[i][1] \n",
    "    print('numHiddenUnits1 = ',numHiddenUnits1,', numHiddenUnits2 = ',numHiddenUnits2)\n",
    "    trainingNN(numHiddenUnits1 = numHiddenUnits1, numHiddenUnits2 = numHiddenUnits2)\n",
    "    "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "小结：\n",
    "* 隐层数目相同，神经元数目增加，有助于获得更好精度\n",
    "* 神经元数目相同，优先设置隐层数目多的结构，预测效果一般更好。"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 激活函数的影响"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 52,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "activationFunction =  relu\n",
      "step=    0, trainAccuracy=0.13, loss=2.839070, eita=0.200000,testAccuracy=0.1022\n",
      "step= 1000, trainAccuracy=0.99, loss=0.181809, eita=0.106288,testAccuracy=0.9554\n",
      "step= 2000, trainAccuracy=0.99, loss=0.207884, eita=0.050837,testAccuracy=0.9689\n",
      "step= 3000, trainAccuracy=1.00, loss=0.104175, eita=0.024315,testAccuracy=0.9735\n",
      "step= 4000, trainAccuracy=1.00, loss=0.089707, eita=0.012922,testAccuracy=0.9717\n",
      "step= 5000, trainAccuracy=1.00, loss=0.100331, eita=0.006181,testAccuracy=0.9764\n",
      "step= 6000, trainAccuracy=1.00, loss=0.085677, eita=0.002956,testAccuracy=0.9755\n",
      "step= 7000, trainAccuracy=1.00, loss=0.081749, eita=0.001571,testAccuracy=0.9767\n",
      "step= 8000, trainAccuracy=1.00, loss=0.069027, eita=0.000751,testAccuracy=0.9785\n",
      "step= 9000, trainAccuracy=1.00, loss=0.067764, eita=0.000359,testAccuracy=0.9783\n",
      "step=10000, trainAccuracy=1.00, loss=0.066035, eita=0.000191,testAccuracy=0.9791\n",
      "step=11000, trainAccuracy=1.00, loss=0.064262, eita=0.000091,testAccuracy=0.9801\n",
      "step=12000, trainAccuracy=1.00, loss=0.060243, eita=0.000044,testAccuracy=0.9790\n",
      "step=12999, trainAccuracy=1.00, loss=0.058429, eita=0.000023,testAccuracy=0.9788\n",
      "activationFunction =  selu\n",
      "step=    0, trainAccuracy=0.17, loss=3.314286, eita=0.200000,testAccuracy=0.1174\n",
      "step= 1000, trainAccuracy=1.00, loss=0.136967, eita=0.106288,testAccuracy=0.9599\n",
      "step= 2000, trainAccuracy=0.99, loss=0.177520, eita=0.050837,testAccuracy=0.9691\n",
      "step= 3000, trainAccuracy=1.00, loss=0.079219, eita=0.024315,testAccuracy=0.9707\n",
      "step= 4000, trainAccuracy=1.00, loss=0.135271, eita=0.012922,testAccuracy=0.9734\n",
      "step= 5000, trainAccuracy=1.00, loss=0.088622, eita=0.006181,testAccuracy=0.9748\n",
      "step= 6000, trainAccuracy=1.00, loss=0.087459, eita=0.002956,testAccuracy=0.9753\n",
      "step= 7000, trainAccuracy=1.00, loss=0.078246, eita=0.001571,testAccuracy=0.9770\n",
      "step= 8000, trainAccuracy=1.00, loss=0.085734, eita=0.000751,testAccuracy=0.9766\n",
      "step= 9000, trainAccuracy=1.00, loss=0.086188, eita=0.000359,testAccuracy=0.9787\n",
      "step=10000, trainAccuracy=1.00, loss=0.070580, eita=0.000191,testAccuracy=0.9773\n",
      "step=11000, trainAccuracy=1.00, loss=0.074412, eita=0.000091,testAccuracy=0.9790\n",
      "step=12000, trainAccuracy=1.00, loss=0.067677, eita=0.000044,testAccuracy=0.9797\n",
      "step=12999, trainAccuracy=1.00, loss=0.062745, eita=0.000023,testAccuracy=0.9785\n",
      "activationFunction =  elu\n",
      "step=    0, trainAccuracy=0.14, loss=3.223276, eita=0.200000,testAccuracy=0.0980\n",
      "step= 1000, trainAccuracy=1.00, loss=0.173497, eita=0.106288,testAccuracy=0.9575\n",
      "step= 2000, trainAccuracy=0.98, loss=0.222858, eita=0.050837,testAccuracy=0.9671\n",
      "step= 3000, trainAccuracy=0.99, loss=0.156366, eita=0.024315,testAccuracy=0.9730\n",
      "step= 4000, trainAccuracy=1.00, loss=0.113447, eita=0.012922,testAccuracy=0.9763\n",
      "step= 5000, trainAccuracy=1.00, loss=0.151421, eita=0.006181,testAccuracy=0.9753\n",
      "step= 6000, trainAccuracy=0.99, loss=0.205242, eita=0.002956,testAccuracy=0.9747\n",
      "step= 7000, trainAccuracy=1.00, loss=0.077475, eita=0.001571,testAccuracy=0.9780\n",
      "step= 8000, trainAccuracy=1.00, loss=0.100265, eita=0.000751,testAccuracy=0.9790\n",
      "step= 9000, trainAccuracy=1.00, loss=0.067679, eita=0.000359,testAccuracy=0.9801\n",
      "step=10000, trainAccuracy=1.00, loss=0.068655, eita=0.000191,testAccuracy=0.9790\n",
      "step=11000, trainAccuracy=1.00, loss=0.066749, eita=0.000091,testAccuracy=0.9760\n",
      "step=12000, trainAccuracy=1.00, loss=0.067178, eita=0.000044,testAccuracy=0.9800\n",
      "step=12999, trainAccuracy=1.00, loss=0.069885, eita=0.000023,testAccuracy=0.9800\n"
     ]
    }
   ],
   "source": [
    "#定义激活函数，并训练\n",
    "activationFunction = ['relu','selu','elu']\n",
    "for i in range(len(activationFunction)):\n",
    "    print('activationFunction = ' ,activationFunction[i])\n",
    "    trainingNN(activationFunction = activationFunction[i],learning_rate = 0.2 , decay_rate = 0.9)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "小结：\n",
    "* relu作为基本的激活函数，收敛快，表现稳定\n",
    "* selu收敛最快，但是经常会跑飞。通过查看激活函数图形，可以看到selu斜率最大，因此需要其他参数例如较小的学习率等配合，发挥selu的效果\n",
    "* elu收敛快，且收敛平稳，大多数情况下效果均衡，这可能与elu可导同时斜率适中有关"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 学习率的影响"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "初始学习率和步进下降系数"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "learningRate =  0.5 decay_rate =  0.93\n",
      "step=    0, trainAccuracy=0.17, loss=3.287596, eita=0.500000,testAccuracy=0.1010\n",
      "step= 1000, trainAccuracy=1.00, loss=0.133813, eita=0.323495,testAccuracy=0.9660\n",
      "step= 2000, trainAccuracy=0.99, loss=0.218233, eita=0.194647,testAccuracy=0.9728\n",
      "step= 3000, trainAccuracy=1.00, loss=0.075372, eita=0.117119,testAccuracy=0.9783\n",
      "step= 4000, trainAccuracy=1.00, loss=0.087444, eita=0.075775,testAccuracy=0.9766\n",
      "step= 5000, trainAccuracy=1.00, loss=0.065243, eita=0.045594,testAccuracy=0.9768\n",
      "step= 6000, trainAccuracy=1.00, loss=0.064249, eita=0.027434,testAccuracy=0.9772\n",
      "step= 7000, trainAccuracy=1.00, loss=0.076648, eita=0.017749,testAccuracy=0.9799\n",
      "step= 8000, trainAccuracy=1.00, loss=0.053872, eita=0.010680,testAccuracy=0.9804\n",
      "step= 9000, trainAccuracy=1.00, loss=0.049121, eita=0.006426,testAccuracy=0.9803\n",
      "step=10000, trainAccuracy=1.00, loss=0.046307, eita=0.004158,testAccuracy=0.9814\n",
      "step=11000, trainAccuracy=1.00, loss=0.043494, eita=0.002502,testAccuracy=0.9807\n",
      "step=12000, trainAccuracy=1.00, loss=0.041224, eita=0.001505,testAccuracy=0.9818\n",
      "step=12999, trainAccuracy=1.00, loss=0.038302, eita=0.000974,testAccuracy=0.9816\n",
      "learningRate =  0.5 decay_rate =  0.98\n",
      "step=    0, trainAccuracy=0.16, loss=3.136833, eita=0.500000,testAccuracy=0.1028\n",
      "step= 1000, trainAccuracy=0.99, loss=0.211238, eita=0.442921,testAccuracy=0.9648\n",
      "step= 2000, trainAccuracy=1.00, loss=0.079709, eita=0.384511,testAccuracy=0.9724\n",
      "step= 3000, trainAccuracy=1.00, loss=0.083697, eita=0.333804,testAccuracy=0.9761\n",
      "step= 4000, trainAccuracy=1.00, loss=0.083767, eita=0.295698,testAccuracy=0.9714\n",
      "step= 5000, trainAccuracy=1.00, loss=0.069436, eita=0.256703,testAccuracy=0.9754\n",
      "step= 6000, trainAccuracy=1.00, loss=0.064194, eita=0.222850,testAccuracy=0.9755\n",
      "step= 7000, trainAccuracy=1.00, loss=0.065439, eita=0.197410,testAccuracy=0.9784\n",
      "step= 8000, trainAccuracy=1.00, loss=0.056051, eita=0.171377,testAccuracy=0.9799\n",
      "step= 9000, trainAccuracy=0.95, loss=0.457769, eita=0.148777,testAccuracy=0.9269\n",
      "step=10000, trainAccuracy=1.00, loss=0.412858, eita=0.131793,testAccuracy=0.9585\n",
      "step=11000, trainAccuracy=1.00, loss=0.199146, eita=0.114413,testAccuracy=0.9735\n",
      "step=12000, trainAccuracy=1.00, loss=0.158622, eita=0.099325,testAccuracy=0.9752\n",
      "step=12999, trainAccuracy=1.00, loss=0.147737, eita=0.087986,testAccuracy=0.9766\n",
      "learningRate =  0.3 decay_rate =  0.93\n",
      "step=    0, trainAccuracy=0.15, loss=3.073013, eita=0.300000,testAccuracy=0.1352\n",
      "step= 1000, trainAccuracy=0.99, loss=0.250618, eita=0.194097,testAccuracy=0.9579\n",
      "step= 2000, trainAccuracy=1.00, loss=0.148008, eita=0.116788,testAccuracy=0.9680\n",
      "step= 3000, trainAccuracy=1.00, loss=0.097503, eita=0.070272,testAccuracy=0.9763\n",
      "step= 4000, trainAccuracy=0.99, loss=0.160483, eita=0.045465,testAccuracy=0.9752\n",
      "step= 5000, trainAccuracy=1.00, loss=0.074934, eita=0.027356,testAccuracy=0.9798\n",
      "step= 6000, trainAccuracy=1.00, loss=0.091275, eita=0.016460,testAccuracy=0.9789\n",
      "step= 7000, trainAccuracy=1.00, loss=0.070083, eita=0.010650,testAccuracy=0.9788\n",
      "step= 8000, trainAccuracy=1.00, loss=0.062917, eita=0.006408,testAccuracy=0.9799\n",
      "step= 9000, trainAccuracy=1.00, loss=0.061041, eita=0.003856,testAccuracy=0.9802\n",
      "step=10000, trainAccuracy=1.00, loss=0.055718, eita=0.002495,testAccuracy=0.9791\n",
      "step=11000, trainAccuracy=1.00, loss=0.055897, eita=0.001501,testAccuracy=0.9802\n",
      "step=12000, trainAccuracy=1.00, loss=0.053837, eita=0.000903,testAccuracy=0.9810\n",
      "step=12999, trainAccuracy=1.00, loss=0.051260, eita=0.000584,testAccuracy=0.9804\n",
      "learningRate =  0.3 decay_rate =  0.98\n",
      "step=    0, trainAccuracy=0.18, loss=3.317314, eita=0.300000,testAccuracy=0.0958\n",
      "step= 1000, trainAccuracy=1.00, loss=0.104660, eita=0.265753,testAccuracy=0.9608\n",
      "step= 2000, trainAccuracy=1.00, loss=0.160224, eita=0.230707,testAccuracy=0.9703\n",
      "step= 3000, trainAccuracy=1.00, loss=0.097119, eita=0.200282,testAccuracy=0.9736\n",
      "step= 4000, trainAccuracy=1.00, loss=0.101852, eita=0.177419,testAccuracy=0.9777\n",
      "step= 5000, trainAccuracy=1.00, loss=0.079106, eita=0.154022,testAccuracy=0.9766\n",
      "step= 6000, trainAccuracy=1.00, loss=0.072992, eita=0.133710,testAccuracy=0.9789\n",
      "step= 7000, trainAccuracy=1.00, loss=0.076564, eita=0.118446,testAccuracy=0.9778\n",
      "step= 8000, trainAccuracy=1.00, loss=0.062947, eita=0.102826,testAccuracy=0.9794\n",
      "step= 9000, trainAccuracy=1.00, loss=0.060245, eita=0.089266,testAccuracy=0.9802\n",
      "step=10000, trainAccuracy=1.00, loss=0.057090, eita=0.079076,testAccuracy=0.9796\n",
      "step=11000, trainAccuracy=1.00, loss=0.057495, eita=0.068648,testAccuracy=0.9793\n",
      "step=12000, trainAccuracy=1.00, loss=0.053748, eita=0.059595,testAccuracy=0.9805\n",
      "step=12999, trainAccuracy=1.00, loss=0.053492, eita=0.052792,testAccuracy=0.9800\n"
     ]
    }
   ],
   "source": [
    "#定义学习率，并训练\n",
    "learningRate = [0.5,0.3]\n",
    "decay_rate   = [0.93,0.98]\n",
    "for i in range(len(learningRate)):\n",
    "    for j in range(len(decay_rate)):\n",
    "        print('learningRate = ' ,learningRate[i],'decay_rate = ' ,decay_rate[j])\n",
    "        trainingNN(learning_rate = learningRate[i],decay_rate = decay_rate[j])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "小结：\n",
    "* 学习率是本次调参消耗精力最大的一个参数，与训练迭代参数、批次参数、网络结构、激活函数、正则等关系都很密切，很难调\n",
    "* 初始值：网络结构复杂/批次大时，适于较大值，便于快速快速收敛，同时辅助较大下降系数，避免过拟合；激活函数收敛快(例如selu)时，需要配小值\n",
    "* 下降值：配合初始值使用，即保证一定收敛速度，又不过拟合\n",
    "* 步进decay_steps：调优过程中也是一个重要参数"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 正则表达式的影响"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "RegularExpression =  l1 lambda =  0.001\n",
      "step=    0, trainAccuracy=0.14, loss=15.752938, eita=0.400000,testAccuracy=0.1315\n",
      "step= 1000, trainAccuracy=0.94, loss=0.900204, eita=0.354337,testAccuracy=0.9213\n",
      "step= 2000, trainAccuracy=0.97, loss=0.683569, eita=0.307609,testAccuracy=0.9323\n",
      "step= 3000, trainAccuracy=0.95, loss=0.625435, eita=0.267043,testAccuracy=0.9289\n",
      "step= 4000, trainAccuracy=0.89, loss=0.974937, eita=0.236558,testAccuracy=0.9058\n",
      "step= 5000, trainAccuracy=0.97, loss=0.670109, eita=0.205362,testAccuracy=0.9225\n",
      "step= 6000, trainAccuracy=0.96, loss=0.670483, eita=0.178280,testAccuracy=0.9271\n",
      "step= 7000, trainAccuracy=0.97, loss=0.606056, eita=0.157928,testAccuracy=0.9283\n",
      "step= 8000, trainAccuracy=0.98, loss=0.525477, eita=0.137102,testAccuracy=0.9474\n",
      "step= 9000, trainAccuracy=0.97, loss=0.603286, eita=0.119021,testAccuracy=0.9227\n",
      "step=10000, trainAccuracy=0.68, loss=1.735352, eita=0.105434,testAccuracy=0.7413\n",
      "step=11000, trainAccuracy=0.96, loss=0.549732, eita=0.091530,testAccuracy=0.9391\n",
      "step=12000, trainAccuracy=0.96, loss=0.568546, eita=0.079460,testAccuracy=0.9228\n",
      "step=12999, trainAccuracy=0.99, loss=0.532393, eita=0.070389,testAccuracy=0.9218\n",
      "RegularExpression =  l1 lambda =  0.0001\n",
      "step=    0, trainAccuracy=0.15, loss=4.422222, eita=0.400000,testAccuracy=0.1220\n",
      "step= 1000, trainAccuracy=1.00, loss=0.872496, eita=0.354337,testAccuracy=0.9559\n",
      "step= 2000, trainAccuracy=1.00, loss=0.498749, eita=0.307609,testAccuracy=0.9699\n",
      "step= 3000, trainAccuracy=0.99, loss=0.362984, eita=0.267043,testAccuracy=0.9725\n",
      "step= 4000, trainAccuracy=1.00, loss=0.257116, eita=0.236558,testAccuracy=0.9723\n",
      "step= 5000, trainAccuracy=1.00, loss=0.235602, eita=0.205362,testAccuracy=0.9750\n",
      "step= 6000, trainAccuracy=1.00, loss=0.242407, eita=0.178280,testAccuracy=0.9690\n",
      "step= 7000, trainAccuracy=0.99, loss=0.356472, eita=0.157928,testAccuracy=0.9632\n",
      "step= 8000, trainAccuracy=1.00, loss=0.259015, eita=0.137102,testAccuracy=0.9720\n",
      "step= 9000, trainAccuracy=1.00, loss=0.255485, eita=0.119021,testAccuracy=0.9726\n",
      "step=10000, trainAccuracy=0.99, loss=0.287285, eita=0.105434,testAccuracy=0.9747\n",
      "step=11000, trainAccuracy=1.00, loss=0.172007, eita=0.091530,testAccuracy=0.9741\n",
      "step=12000, trainAccuracy=1.00, loss=0.200920, eita=0.079460,testAccuracy=0.9734\n",
      "step=12999, trainAccuracy=1.00, loss=0.179376, eita=0.070389,testAccuracy=0.9726\n",
      "RegularExpression =  l2 lambda =  0.001\n",
      "step=    0, trainAccuracy=0.17, loss=4.147881, eita=0.400000,testAccuracy=0.1008\n",
      "step= 1000, trainAccuracy=0.99, loss=0.553669, eita=0.354337,testAccuracy=0.9644\n",
      "step= 2000, trainAccuracy=1.00, loss=0.276826, eita=0.307609,testAccuracy=0.9696\n",
      "step= 3000, trainAccuracy=0.99, loss=0.222228, eita=0.267043,testAccuracy=0.9747\n",
      "step= 4000, trainAccuracy=1.00, loss=0.197915, eita=0.236558,testAccuracy=0.9748\n",
      "step= 5000, trainAccuracy=1.00, loss=0.211716, eita=0.205362,testAccuracy=0.9736\n",
      "step= 6000, trainAccuracy=1.00, loss=0.197394, eita=0.178280,testAccuracy=0.9734\n",
      "step= 7000, trainAccuracy=1.00, loss=0.121743, eita=0.157928,testAccuracy=0.9769\n",
      "step= 8000, trainAccuracy=1.00, loss=0.155757, eita=0.137102,testAccuracy=0.9788\n",
      "step= 9000, trainAccuracy=0.99, loss=0.159078, eita=0.119021,testAccuracy=0.9668\n",
      "step=10000, trainAccuracy=1.00, loss=0.115324, eita=0.105434,testAccuracy=0.9802\n",
      "step=11000, trainAccuracy=0.99, loss=0.139341, eita=0.091530,testAccuracy=0.9756\n",
      "step=12000, trainAccuracy=1.00, loss=0.169381, eita=0.079460,testAccuracy=0.9765\n",
      "step=12999, trainAccuracy=1.00, loss=0.126446, eita=0.070389,testAccuracy=0.9782\n",
      "RegularExpression =  l2 lambda =  0.0001\n",
      "step=    0, trainAccuracy=0.16, loss=3.213193, eita=0.400000,testAccuracy=0.1088\n",
      "step= 1000, trainAccuracy=1.00, loss=0.172484, eita=0.354337,testAccuracy=0.9647\n",
      "step= 2000, trainAccuracy=1.00, loss=0.094474, eita=0.307609,testAccuracy=0.9743\n",
      "step= 3000, trainAccuracy=1.00, loss=0.092723, eita=0.267043,testAccuracy=0.9744\n",
      "step= 4000, trainAccuracy=1.00, loss=0.119653, eita=0.236558,testAccuracy=0.9757\n",
      "step= 5000, trainAccuracy=1.00, loss=0.078554, eita=0.205362,testAccuracy=0.9783\n",
      "step= 6000, trainAccuracy=1.00, loss=0.074039, eita=0.178280,testAccuracy=0.9801\n",
      "step= 7000, trainAccuracy=1.00, loss=0.059545, eita=0.157928,testAccuracy=0.9796\n",
      "step= 8000, trainAccuracy=1.00, loss=0.055604, eita=0.137102,testAccuracy=0.9797\n",
      "step= 9000, trainAccuracy=1.00, loss=0.054083, eita=0.119021,testAccuracy=0.9807\n",
      "step=10000, trainAccuracy=1.00, loss=0.051544, eita=0.105434,testAccuracy=0.9800\n",
      "step=11000, trainAccuracy=1.00, loss=0.050170, eita=0.091530,testAccuracy=0.9800\n",
      "step=12000, trainAccuracy=1.00, loss=0.047771, eita=0.079460,testAccuracy=0.9806\n",
      "step=12999, trainAccuracy=1.00, loss=0.042442, eita=0.070389,testAccuracy=0.9807\n"
     ]
    }
   ],
   "source": [
    "#定义正则表达式，并训练\n",
    "RegularExpression = ['l1','l2']\n",
    "regular_lambda = [0.001,0.0001]\n",
    "for i in range(len(RegularExpression)):\n",
    "    for j in range(len(regular_lambda)):\n",
    "        print('RegularExpression = ' ,RegularExpression[i],'lambda = ',regular_lambda[j])\n",
    "        trainingNN(RegularExpression = RegularExpression[i],regular_lambda = regular_lambda[j])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "小结：\n",
    "* 正则作用极其重要，尤其当网络结构复杂时，正则项是必备项，避免还未找到极值点就过拟合，保证训练稳定收敛不跑飞,而且所有W都要正则不能漏\n",
    "* l1正则：能够实现稀疏，但是有时会收敛过程会震荡，且惩罚不够有力\n",
    "* l2正则：惩罚力度大，比较平稳，本次任务表现比较好\n",
    "* 惩罚系数lambda：最重要的调参工作之一，需要配合网络复杂程度、迭代次数等调试,例如结构复杂时惩罚加大。例如l2情况下，本次任务前期lambda=1e-5表现最好，后期1e-4表现好，还有很大发挥空间"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 损失计算的影响"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "loss_calculation =  manual\n",
      "step=    0, trainAccuracy=0.19, loss=2.937502, eita=0.400000,testAccuracy=0.1283\n",
      "step= 1000, trainAccuracy=0.99, loss=0.279545, eita=0.354337,testAccuracy=0.9667\n",
      "step= 2000, trainAccuracy=1.00, loss=0.086593, eita=0.307609,testAccuracy=0.9743\n",
      "step= 3000, trainAccuracy=1.00, loss=0.131396, eita=0.267043,testAccuracy=0.9729\n",
      "step= 4000, trainAccuracy=1.00, loss=0.078062, eita=0.236558,testAccuracy=0.9788\n",
      "step= 5000, trainAccuracy=1.00, loss=0.087291, eita=0.205362,testAccuracy=0.9797\n",
      "step= 6000, trainAccuracy=1.00, loss=0.063167, eita=0.178280,testAccuracy=0.9802\n",
      "step= 7000, trainAccuracy=1.00, loss=0.076729, eita=0.157928,testAccuracy=0.9803\n",
      "step= 8000, trainAccuracy=1.00, loss=0.056848, eita=0.137102,testAccuracy=0.9816\n",
      "step= 9000, trainAccuracy=1.00, loss=0.055274, eita=0.119021,testAccuracy=0.9819\n",
      "step=10000, trainAccuracy=1.00, loss=0.051767, eita=0.105434,testAccuracy=0.9818\n",
      "step=11000, trainAccuracy=1.00, loss=0.047683, eita=0.091530,testAccuracy=0.9817\n",
      "step=12000, trainAccuracy=1.00, loss=0.046114, eita=0.079460,testAccuracy=0.9822\n",
      "step=12999, trainAccuracy=1.00, loss=0.042422, eita=0.070389,testAccuracy=0.9814\n",
      "loss_calculation =  softmax\n",
      "step=    0, trainAccuracy=0.12, loss=3.245414, eita=0.400000,testAccuracy=0.1032\n",
      "step= 1000, trainAccuracy=0.99, loss=0.146358, eita=0.354337,testAccuracy=0.9650\n",
      "step= 2000, trainAccuracy=1.00, loss=0.126295, eita=0.307609,testAccuracy=0.9724\n",
      "step= 3000, trainAccuracy=1.00, loss=0.079250, eita=0.267043,testAccuracy=0.9740\n",
      "step= 4000, trainAccuracy=1.00, loss=0.092532, eita=0.236558,testAccuracy=0.9766\n",
      "step= 5000, trainAccuracy=1.00, loss=0.089355, eita=0.205362,testAccuracy=0.9784\n",
      "step= 6000, trainAccuracy=1.00, loss=0.076925, eita=0.178280,testAccuracy=0.9784\n",
      "step= 7000, trainAccuracy=1.00, loss=0.077080, eita=0.157928,testAccuracy=0.9781\n",
      "step= 8000, trainAccuracy=1.00, loss=0.062736, eita=0.137102,testAccuracy=0.9792\n",
      "step= 9000, trainAccuracy=1.00, loss=0.052994, eita=0.119021,testAccuracy=0.9800\n",
      "step=10000, trainAccuracy=1.00, loss=0.052993, eita=0.105434,testAccuracy=0.9789\n",
      "step=11000, trainAccuracy=1.00, loss=0.049847, eita=0.091530,testAccuracy=0.9802\n",
      "step=12000, trainAccuracy=1.00, loss=0.045834, eita=0.079460,testAccuracy=0.9798\n",
      "step=12999, trainAccuracy=1.00, loss=0.045520, eita=0.070389,testAccuracy=0.9802\n",
      "loss_calculation =  mse\n",
      "step=    0, trainAccuracy=0.12, loss=0.169428, eita=0.400000,testAccuracy=0.1374\n",
      "step= 1000, trainAccuracy=0.93, loss=0.075342, eita=0.354337,testAccuracy=0.8953\n",
      "step= 2000, trainAccuracy=0.96, loss=0.068067, eita=0.307609,testAccuracy=0.9191\n",
      "step= 3000, trainAccuracy=0.91, loss=0.068845, eita=0.267043,testAccuracy=0.9283\n",
      "step= 4000, trainAccuracy=0.92, loss=0.064797, eita=0.236558,testAccuracy=0.9341\n",
      "step= 5000, trainAccuracy=0.96, loss=0.054740, eita=0.205362,testAccuracy=0.9397\n",
      "step= 6000, trainAccuracy=0.96, loss=0.054885, eita=0.178280,testAccuracy=0.9416\n",
      "step= 7000, trainAccuracy=0.99, loss=0.045530, eita=0.157928,testAccuracy=0.9462\n",
      "step= 8000, trainAccuracy=0.99, loss=0.042012, eita=0.137102,testAccuracy=0.9488\n",
      "step= 9000, trainAccuracy=0.95, loss=0.044468, eita=0.119021,testAccuracy=0.9533\n",
      "step=10000, trainAccuracy=0.96, loss=0.041500, eita=0.105434,testAccuracy=0.9549\n",
      "step=11000, trainAccuracy=0.99, loss=0.036973, eita=0.091530,testAccuracy=0.9568\n",
      "step=12000, trainAccuracy=0.96, loss=0.036866, eita=0.079460,testAccuracy=0.9601\n",
      "step=12999, trainAccuracy=0.98, loss=0.032940, eita=0.070389,testAccuracy=0.9591\n"
     ]
    }
   ],
   "source": [
    "#定义损失计算方式，并训练\n",
    "loss_calculation = ['manual','softmax','mse'] \n",
    "for i in range(len(loss_calculation)):\n",
    "    print('loss_calculation = ' ,loss_calculation[i])\n",
    "    trainingNN(loss_calculation = loss_calculation[i])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "小结：\n",
    "* softmax几乎是输出层激活函数的唯一选择，收敛快，求导后表达式很简单,计算很快，因此目前还没有找到比softmax更好的函数\n",
    "* (1) manual部分：softmax手动计算方式，目前表现尚好，还没出现问题，课件中提到有数值问题，因此有待进一步验证，不作为默认计算方式\n",
    "* (2) softmax部分：softmax使用系统函数计算方式，表现稳定，因此作为默认的计算方式\n",
    "* (3) mse部分：使用MSE作为损失函数，有一定合理性，效果不及softmax"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 参数合影"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "参数逐个检查，记录最优"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "step=    0, trainAccuracy=0.10, loss=2.943863, eita=0.400000,testAccuracy=0.1008\n",
      "step= 1000, trainAccuracy=0.99, loss=0.148374, eita=0.354337,testAccuracy=0.9605\n",
      "step= 2000, trainAccuracy=1.00, loss=0.113335, eita=0.307609,testAccuracy=0.9754\n",
      "step= 3000, trainAccuracy=1.00, loss=0.113394, eita=0.267043,testAccuracy=0.9743\n",
      "step= 4000, trainAccuracy=1.00, loss=0.081421, eita=0.236558,testAccuracy=0.9773\n",
      "step= 5000, trainAccuracy=1.00, loss=0.072134, eita=0.205362,testAccuracy=0.9793\n",
      "step= 6000, trainAccuracy=1.00, loss=0.066714, eita=0.178280,testAccuracy=0.9795\n",
      "step= 7000, trainAccuracy=1.00, loss=0.104032, eita=0.157928,testAccuracy=0.9779\n",
      "step= 8000, trainAccuracy=1.00, loss=0.057324, eita=0.137102,testAccuracy=0.9816\n",
      "step= 9000, trainAccuracy=1.00, loss=0.056135, eita=0.119021,testAccuracy=0.9818\n",
      "step=10000, trainAccuracy=1.00, loss=0.053918, eita=0.105434,testAccuracy=0.9813\n",
      "step=11000, trainAccuracy=1.00, loss=0.047661, eita=0.091530,testAccuracy=0.9818\n",
      "step=12000, trainAccuracy=1.00, loss=0.046253, eita=0.079460,testAccuracy=0.9825\n",
      "step=12999, trainAccuracy=1.00, loss=0.044403, eita=0.070389,testAccuracy=0.9820\n"
     ]
    }
   ],
   "source": [
    "# 全部参数\n",
    "trainingNN(#1 训练迭代参数\n",
    "           trainingIterations = 13000,batchSize = 100,\n",
    "           #2 网络结构参数\n",
    "           numHiddenUnits1 = 200,numHiddenUnits2 = 80,inputSize = 784,numClasses = 10,               \n",
    "           #3 初始化参数\n",
    "           initConstant = 0.1, stddevConstant = 0.1, InitializationMethod = 'gauss',               \n",
    "           #4 激活函数参数\n",
    "           activationFunction = 'relu',\n",
    "           #5 正则参数\n",
    "           RegularExpression = 'l2',  regular_lambda = 0.0001,\n",
    "           #6 学习率参数\n",
    "           learning_rate = 0.4,global_step=tf.Variable(0),decay_steps = 150,decay_rate = 0.98,staircase=True,\n",
    "           #7 损失函数参数\n",
    "           loss_calculation = 'softmax'\n",
    "           )"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Hint：\n",
    "- 多隐层\n",
    "- 激活函数\n",
    "- 正则化\n",
    "- 初始化\n",
    "- 摸索一下各个超参数\n",
    "  - 隐层神经元数量\n",
    "  - 学习率\n",
    "  - 正则化惩罚因子\n",
    "  - 最好每隔几个step就对loss、accuracy等等进行一次输出，有根据地进行调整"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 作业小结\n",
    "1. 学习了tensorflow的常量、变量、session、run等概念，对tensorflow的程序设计思路和运行流程有了一定了解，tensorflow机制很有特点\n",
    "2. 学习了神经网络(NN)的层级结构(隐层、神经元等)、初始化、激活函数、学习率、正则、损失函数等内容，参数调优仍然是个很大的工程\n",
    "3. 数据可视化采用了打印监测量的方式，便于捕捉训练过程的数值变化，获取更多细节，及时调整训练策略\n",
    "4. 七类参数封装了函数，统一调用，便于逐类认识和快速调优，努力编写清晰的程序结构，有助于提高效率\n",
    "5. 本次训练找极值采用各影响因子多次轮询分析的方法，没有用gridSearch的方法，这可能导致找到的极值是局部极点，有待提高运算力，优先使用多个参数维度同时遍历找全局最优的方法，以获得更好效果。"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.4"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
