{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "# MNIST 手寫數字辨識 - Softmax Regression"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "## 目標\n",
    "* 下載並熟悉 MNIST 資料集\n",
    "* 建立 tensorflow softmax regression model\n",
    "* 訓練 model 並計算出準確度"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import tensorflow as tf\n",
    "import matplotlib.pyplot as plt\n",
    "%matplotlib inline"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "當我們在學習一個新的程式語言的時候，譬如 `java`, `python` 之類的語言的時候，第一堂課都是學習印出 \"Hello World!\"．而在機器學習的領域裡面，MNIST 就有著相當於 Hello World 的地位．\n",
    "\n",
    "而什麼是 MNIST 呢？就是一個手寫數字的資料集．它包含了一連串的手寫數字圖片，並且有相對應的數字，如下圖，看得出來就是 5, 0, 4, 1 這四個數字．\n",
    "\n",
    "![](https://www.tensorflow.org/images/MNIST.png)\n",
    "\n",
    "在這裡會嘗試用一個簡單的 Softmax Regression 的模型來解決這個問題，當然沒辦法達到頂尖的辨識水準，不過會是一個好的開始，接下來也會學到比較複雜但更為準確的模型，讓我們先來看看 MNIST 資料集吧．"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "## 下載並熟悉 MNIST data"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "MNIST 資料集的官網是在[Yann LeCun's website](http://yann.lecun.com/exdb/mnist/)．這裡我們只要在 python 內把以下的兩行程式碼貼上，就可以下載 MNIST 的資料集．"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Extracting MNIST_data/train-images-idx3-ubyte.gz\n",
      "Extracting MNIST_data/train-labels-idx1-ubyte.gz\n",
      "Extracting MNIST_data/t10k-images-idx3-ubyte.gz\n",
      "Extracting MNIST_data/t10k-labels-idx1-ubyte.gz\n"
     ]
    }
   ],
   "source": [
    "from tensorflow.examples.tutorials.mnist import input_data\n",
    "mnist = input_data.read_data_sets(\"MNIST_data/\", one_hot=True)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "MNIST 資料集分成三個部分\n",
    "\n",
    "1. 55,000 筆的 training data (`mnist.train`)\n",
    "2. 10,000 筆的 test data (`mnist.test`)\n",
    "3. 5,000 筆的 validation data (`mnist.validation`)\n",
    "\n",
    "把資料集分成這三個部分在機器學習裡面是非常重要的，因為我們必須從學習資料以及驗證資料來看我們的學習到底有沒有效果 (generalizes)\n",
    "\n",
    "如同上面講到的，每筆 MNIST 資料都有兩個部分，第一個是手寫的數字 image，另一個是對應的 label．在這裡我們把 image 稱作 `x`，而 label 稱作 `y`． training set 還有 test set 都有 image 以及 label．例如 training images 被稱作 `mnist.train.images`， training labels 稱作 `mnist.train.labels`．"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'tensorflow.contrib.learn.python.learn.datasets.base.Datasets'>\n",
      "55000\n",
      "5000\n",
      "10000\n"
     ]
    }
   ],
   "source": [
    "# 來看看 mnist 的型態\n",
    "print type(mnist)\n",
    "print mnist.train.num_examples\n",
    "print mnist.validation.num_examples\n",
    "print mnist.test.num_examples"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "讓我們看一下 MNIST 訓練還有測試的資料集長得如何\n",
      "\n",
      " train_img 的 type : <type 'numpy.ndarray'>\n",
      " train_img 的 dimension : (55000, 784)\n",
      " train_label 的 type : <type 'numpy.ndarray'>\n",
      " train_label 的 dimension : (55000, 10)\n",
      " test_img 的 type : <type 'numpy.ndarray'>\n",
      " test_img 的 dimension : (10000, 784)\n",
      " test_label 的 type : <type 'numpy.ndarray'>\n",
      " test_label 的 dimension : (10000, 10)\n"
     ]
    }
   ],
   "source": [
    "print(\"讓我們看一下 MNIST 訓練還有測試的資料集長得如何\")\n",
    "train_img = mnist.train.images\n",
    "train_label = mnist.train.labels\n",
    "test_img = mnist.test.images\n",
    "test_label = mnist.test.labels\n",
    "print\n",
    "print(\" train_img 的 type : %s\" % (type(train_img)))\n",
    "print(\" train_img 的 dimension : %s\" % (train_img.shape,))\n",
    "print(\" train_label 的 type : %s\" % (type(train_label)))\n",
    "print(\" train_label 的 dimension : %s\" % (train_label.shape,))\n",
    "print(\" test_img 的 type : %s\" % (type(test_img)))\n",
    "print(\" test_img 的 dimension : %s\" % (test_img.shape,))\n",
    "print(\" test_label 的 type : %s\" % (type(test_label)))\n",
    "print(\" test_label 的 dimension : %s\" % (test_label.shape,))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "從上面可以看到每個 image 有 784 個數字，因為每張圖片其實是 28 pixels X 28 pixels，我們可以把它看成一個很大的 array 如下圖\n",
    "\n",
    "![](https://www.tensorflow.org/images/MNIST-Matrix.png)\n",
    "\n",
    "把這個 array `拉平` 成一個 28x28 = 784 的向量，是用什麼方法拉平的並不重要，只要確保對於每張圖片都是用同樣的方法拉平就可以了．從這個角度來看， MNIST 就是一個 784 維向量空間裡面的點，只是有著比較複雜的[結構](http://colah.github.io/posts/2014-10-Visualizing-MNIST/)（注意，這個連結有很複雜的視覺化計算）\n",
    "\n",
    "或許你會想問說把這個二維圖片變成一個向量，會不會造成什麼不好的效果．當然有一些更強大的電腦視覺技巧是可以學習這一些原始的數字結構的，之後我們還會看到．但在這裡我們使用的方法是 Softmax Regression ，他並不需要保留原始的數字結構，直接把它變成一個向量即可．\n",
    "\n",
    "我們把 `mnist.train.images` 稱作一個 `tensor`（喔喔！tensorflow 的 tensor 出現了，他其實就是一個 n-dimensional array．而 tensor + flow = tensorflow，也就是指這個 n-dimensional array 變化流動的意思）這個 `tensor` 形狀為 [55000, 784]．第一個維度指的是圖片的 index，第二個則是每個圖片的 pixel 點，這個 pixel 點是一個介於 0 到 1 的值，來表示 pixel 點的強度．\n",
    "\n",
    "![](https://www.tensorflow.org/images/mnist-train-xs.png)\n",
    "\n",
    "而每個 MNIST 中的圖片都有一個對應的 label 也就是從 0 到 9 的數值．在這裡每個 label 都是一個 **one-hot vectors**． one-hot vector 是指說只有一個維度是 1 其他都是 0．在這裡數字 n 表示一個只有在第 n 維度（從 0 開始）數字為 1 的 10 維向量．例如 label 0 的表示法就是（[1, 0, 0, 0, 0, 0, 0, 0, 0, 0]．因此，`mnist.train.labels` 是一個 [60000, 10] 的矩陣．\n",
    "\n",
    "![](https://www.tensorflow.org/images/mnist-train-ys.png)\n",
    "\n",
    "以下我們實際印出了 MNIST 的資料集來看看他長得怎麼樣"
   ]
  },
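  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "As a quick sanity check of the one-hot encoding described above, here is a small pure-NumPy sketch. It does not depend on the MNIST download, and the digit labels in it are made up purely for illustration."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "# Hypothetical labels, for illustration only\n",
    "labels = np.array([5, 0, 4, 1])\n",
    "\n",
    "# One-hot encode: row n has a 1 only at column labels[n]\n",
    "one_hot = np.zeros((labels.size, 10))\n",
    "one_hot[np.arange(labels.size), labels] = 1\n",
    "\n",
    "print(one_hot[0])                  # the one-hot vector for the digit 5\n",
    "print(np.argmax(one_hot, axis=1))  # argmax recovers the original labels"
   ]
  },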
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAQQAAAEMCAYAAAAiW8hnAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAE0xJREFUeJzt3X2QVfV9x/H3J4jRCiJWg/hIUaNo0xAljE5MSkxMxalR\nW6shtRLTzCaOzoDRPxzHNKRjkiYjajJ2dEikaqJktBpEJ0kFSgYpMQYyhAc1mmYgCriICixqsOK3\nf5yzP25W9tx79z6cu8vnNXNnd+/3PHzvYfnc8/DbcxURmJkBvKfsBsysczgQzCxxIJhZ4kAws8SB\nYGaJA8HMEgdCH5JmSfphSeseL2lns6fdl0j6oaRZzZ5X0nRJP22kt8FgyAeCpKslrZC0S9LdfWpT\nJL04wOUeK2lnxSMkvV7x80frXWZE/D4iRjR72nrl/zHektSTP9ZI+rqkg+tYxouSpjTQwzJJnxvo\n/M0WEfdExNR655P02z6/J29L+nEremyGIR8IwCbgJmBuMxcaEX+IiBG9j/zpD1Y890TfeSQNa2YP\nLfaNiBgJHA78M/BR4AlJB5bb1uASESdV/I4cTPb7+GDJbfVryAdCRDwcEfOBVyqfl3QQ8FPgyIr0\nPjIv7y/p3vzdcZ2kSQNZd/5O+++SfibpdeCjkj4taZWkHZL+IOkrFdOfICkqfl4m6WuSlue9/EzS\nofVOm9evyNe3VdINtb6DR8QfI+Ip4HzgCGB6vrwTJS2R9Gq+zB9IGpXX5gFHAj/Nt+uXJb1H0n9K\neknSNkk/lzRhANu0luUcLmlxvh2WSDqmYv5TJC3K+35W0t/XuN4vSPp5RQ/flbRF0nZJqyWdUsNi\nPg6MAryH0Gki4nVgKrCp4l19U17+NPAj4BBgAXB7A6v6LPA1YCTwC2An8I/5ss8HZkj62yrzTwfG\nAAcBX653WkkfAL4LfAY4iuxd/4h6XkREbAcWk+0pAIhsz+sI4BRgPPCVfNppZO+EU/Pteks+z2PA\nifk8a4Ef1NNDhWrLuQz4F+Aw4OneuqQRwELgXuB9ZP8OcySdVOf6pwJn5D2MJtuur9Yw33TgwYh4\ns871tc0+GwhVLIuIn0TEbrJfpg82sKwfR8QvIuKdiNgVEf8dEevyn39DFjx/XTD/XRHxfES8Qbar\nOXEA0/4DMD8ilkfELuDGAb6WTcChABHxXEQsjoi3ImILcGvR68hf790R0RMRfwRmAafne2o1q3E5\nj0bE/+Sv9QbgY5LGAhcAz0XEvRHxdkSsBOYDF9fTA/B/ZLv/J+c9PR0RLxXNkIfR3wF317mutnIg\n7F3lP+4bwAGS9hvgsl6o/EHSmflu7suStgNfIHsnq7WXohOJ/U17ZGUf+d7RazX03tdR5O+Eko6Q\n9ICkjZJ2kP2i9/s6JA2T9G1Jv8+n/11eKnrtA11O5WvdDmwn2wbHAR/JDzW2SdoGXAqMraeHiHgc\nuBO4A+iWdKekkVVmuxh4KSKW1bOudtvXA6Edf+rZdx0/Ah4CjomIUcD3yXa/W2kzcHTvD/m76eh6\nFpBfYTgb6D1Z+i1gF/CBiDgY+Bx/+jr6vu7LgfPyZYwCTuhddD191LicynMGo/LpNpEFxeKIOKTi\nMSIirq6zByLitog4DfhLskOmokM5yA4X7q13Pe025ANB0n6SDgCGAcMkVb7bdwN/3nsyrE1GAq9G\nxB8lnUF2/NlqDwIXSjpD0v7Av9Y6o6T35idVHwFeZs8v9UjgdWB7ftLuuj6zdpOdV6Bi+l1kJ3f/\nDPh6Dasfnv979T6G17ic8/M9sfeSned4IiI2k50POlXSZyUNzx+T6z2HkM8zOf89eh14C3inYPrj\nyM69OBA6wI3Am8D1ZCeb3syfIyKeBeYBv893IY/sdynNcyXwTUk9ZMe3D7R6hRGxGriGLBg2kf1n\neoXsP1Z/bsh7fAW4B3gS+Eh+fgLgq8
Bkst3xBWR7PZW+AXwt364zgf/I170JWAcsr6H1OWT/Xr2P\n79W4nB+SBcFW4K/I9ip6Dx/+huz3YDPZIdY3gffW0EulQ4C7gG3A+nxZtxRMfzlZKK2vcz3tFxFt\nfwDnAr8lO/67voweqvS3HlgDrAJWdEA/c4EtwNqK5w4lO2P+fP51dB3LO5jsHe2YFvY3C9iYb8NV\nwHklbr9jgCVkVxzWATMa3YZt6q/t27CMFz8M+F+y3cn9gd8Ap5T1y9JPj+uBw8ruo6KfjwGn9fkP\n9+3eMCXb+/lWlWV8mmwXewTZO23Tgq6f/mYB15W97fJexgKn5d+PBJ4jO+6vaxuW0F/bt2EZhwyT\ngd9FNvT2LbKTbBeU0MegERFLefd17gvIduXJv15YZTEXke1mvwiMA6a1uL+OERGbI+LX+fc9wDNk\nV0zq3Ybt7q/tygiEo/jTS3EvUtKLLxDA45JWSuoqu5l+jInsRBlkx8JjiiaOiCtiz5n1cyLi+da3\nyNX5KL65kuq6qtEqksYBHwJ+SZ3bsB369Adt3ob7wknFgTgrsktKU4GrJH2s7IaKRLav2Wl3y70D\nOJ5scNRmYHa57aTBQQ8BMyNiR2WtE7bhXvpr+zYsIxA2UnGdmOz6+MYS+uhXRGzMv24hG3c+udyO\n9qo7H31H/nVLyf38iYjojojdEfEO2TmLUrdhfsnyIeC+iHg4f7pjtuHe+itjG5YRCL8CTpT0F/k1\n8c+QXbbqCJIO6h11lg/g+RTZePlOs4D8D43yr4+U2Mu79P5Hy11EidtQksguEz4Te/6uAjpkG/bX\nXxnbUPmZzbaSdB5wG9kVh7kRUcsglbaQNJ49f422H3B/2f0p++vBKWTDc7vJxgDMJxvDcCywAbgk\nIko5sddPf1PIdnWD7KrNFyuO19vd31lkIyzXsGcA0Q1kx+mlb8OC/qbR5m1YSiCYWWfySUUzSxwI\nZpY4EMwscSCYWeJAMLOk1EDo4GHBgPtrVCf318m9QXn9lb2H0NH/KLi/RnVyf53cG5TUX9mBYGYd\npKGBSZLOBb5DNuLw+xHxb1Wm9ygos5JERNX7Vw44EJR9CtFzwDlkf8L8K2BaRDxdMI8DwawktQRC\nI4cMvtGJ2RDTSCAMhhudmFkdBvrhIzXLL590+hldM6OxQKjpRicRMYfsdto+h2DW4Ro5ZOjoG52Y\nWf0GvIcQEW9Luhr4L/bc6GRd0zozs7Zr6w1SfMhgVp5WX3Y0syHGgWBmiQPBzBIHgpklDgQzSxwI\nZpY4EMwscSCYWeJAMLPEgWBmiQPBzBIHgpklDgQzSxwIZpY4EMwscSCYWeJAMLPEgWBmiQPBzBIH\ngpklDgQzSxwIZpY4EMwscSCYWeJAMLPEgWBmiQPBzBIHgpklDgQzSxwIZpY4EMws2a+RmSWtB3qA\n3cDbETGpGU2ZWTkaCoTcxyNiaxOWY2Yl8yGDmSWNBkIAj0taKamrGQ2ZWXkaPWQ4KyI2SnofsFDS\nsxGxtHKCPCgcFmaDgCKiOQuSZgE7I+LmgmmaszIzq1tEqNo0Az5kkHSQpJG93wOfAtYOdHlmVr5G\nDhnGAD+W1Luc+yPiZ03pysxK0bRDhppW5kMGs9K09JDBzIYeB4KZJQ4EM0scCGaWOBDMLHEgmFnS\njL92tA5xxRVXFNarXWJ+5ZVXCusTJkworC9fvrywvmzZssK6lc97CGaWOBDMLHEgmFniQDCzxIFg\nZokDwcwSB4KZJUNqHMK0adMK66eddlphvdp1/E53yCGHNDT/7t27C+v7779/Yf3NN98srL/xxhuF\n9TVr1hTWL7nkksL6yy+/XFi36ryHYGaJA8HMEgeCmSUOBDNLHAhmljgQzCxxIJhZMqhuwz579uzC\n+owZMwrrw4YNa2T1VrIlS5YU1quNQ+nu7m5mO4OOb8NuZnVxIJhZ4kAws8SBYGaJA8HMEgeCmSUO\nBD
NLBtU4hBdeeKGwfvTRRxfWV69eXViv9vf8rVbtcwvmz5/fpk4G5pxzzimsX3755YX1cePGNbT+\nauMULr300sL6UL+fQlPGIUiaK2mLpLUVzx0qaaGk5/Ovoxtt1szKV8shw93AuX2eux5YHBEnAovz\nn81skKsaCBGxFHi1z9MXAPfk398DXNjkvsysBAM9qTgmIjbn378EjGlSP2ZWooZvshoRUXSyUFIX\n0NXoesys9Qa6h9AtaSxA/nVLfxNGxJyImBQRkwa4LjNrk4EGwgJgev79dOCR5rRjZmWqOg5B0jxg\nCnAY0A18FZgPPAAcC2wALomIvice97ashsYhvP/97y+sn3rqqYX1RYsWFdZ7enrq7slqN378+ML6\nY489VlifMGFCQ+u/7rrrCuvV7rcx2NUyDqHqOYSI6O+uE5+ouyMz62geumxmiQPBzBIHgpklDgQz\nSxwIZpY4EMwsGVT3Q7Ch7eKLLy6sP/jggw0tf+vWrYX1ww8/vKHldzp/LoOZ1cWBYGaJA8HMEgeC\nmSUOBDNLHAhmljgQzCxxIJhZ4kAws8SBYGaJA8HMEgeCmSUOBDNLHAhmljgQzCxp+KPczGp15ZVX\nFtY//OEPt3T9BxxwQGH99NNPL6yvXLmyme10JO8hmFniQDCzxIFgZokDwcwSB4KZJQ4EM0scCGaW\n+HMZhpCxY8cW1i+77LLC+syZM5vZzrtU60+q+rEBLbVjx47C+qhRo9rUSWs05XMZJM2VtEXS2orn\nZknaKGlV/jiv0WbNrHy1HDLcDZy7l+dvjYiJ+eMnzW3LzMpQNRAiYinwaht6MbOSNXJS8WpJq/ND\nitFN68jMSjPQQLgDOB6YCGwGZvc3oaQuSSskrRjgusysTQYUCBHRHRG7I+Id4HvA5IJp50TEpIiY\nNNAmzaw9BhQIkiqvH10ErO1vWjMbPKreD0HSPGAKcJikF4GvAlMkTQQCWA98sYU97jM++clPFtar\n/b1+V1dXYX38+PF197QvmTt3btktlK5qIETEtL08fVcLejGzknnospklDgQzSxwIZpY4EMwscSCY\nWeJAMLPEn8vQRCeccEJh/c477yysn3322YX1Vt8vYMOGDYX11157raHl33jjjYX1Xbt2FdZvv/32\nwvpJJ51Ud0+VNm3a1ND8Q4H3EMwscSCYWeJAMLPEgWBmiQPBzBIHgpklDgQzSzwOoQ7XXHNNYf2q\nq64qrB9//PGF9Z07dxbWt23bVli/7bbbCuvVrrMvX768sF5tnEKrbd++vaH5e3p6CuuPPvpoQ8sf\nCryHYGaJA8HMEgeCmSUOBDNLHAhmljgQzCxxIJhZ4nEIdTjzzDML69XGGSxYsKCwPnt2v5+IB8DS\npUsL64PdxIkTC+vHHXdcQ8uvdr+FZ599tqHlDwXeQzCzxIFgZokDwcwSB4KZJQ4EM0scCGaWOBDM\nLPE4hDp86UtfKqyvXr26sH7TTTc1s50hp9rnWowZM6ah5S9atKih+fcFVfcQJB0jaYmkpyWtkzQj\nf/5QSQslPZ9/Hd36ds2slWo5ZHgbuDYiTgHOAK6SdApwPbA4Ik4EFuc/m9kgVjUQImJzRPw6/74H\neAY4CrgAuCef7B7gwlY1aWbtUddJRUnjgA8BvwTGRMTmvPQS0NgBnpmVruaTipJGAA8BMyNiR+UH\nj0ZESIp+5usCuhpt1Mxar6Y9BEnDycLgvoh4OH+6W9LYvD4W2LK3eSNiTkRMiohJzWjYzFqnlqsM\nAu4CnomIWypKC4Dp+ffTgUea356ZtZMi9rqnv2cC6SzgCWAN8E7+9A1k5xEeAI4FNgCXRMSrVZZV\nvDLbp918882F9WuvvbawXu1zK6ZOnVpYf/LJJwvrg11EqNo0Vc8hRMQyoL8FfaLepsysc3nospkl\nDgQzSxwIZpY4EMwscSCYWeJAMLPE90OwtlmzZk1h/eSTT25o+Y8/
/nhhfaiPM2gG7yGYWeJAMLPE\ngWBmiQPBzBIHgpklDgQzSxwIZpZ4HIK1zbhx4wrr++1X/Ou4ffv2wvqtt95ab0vWh/cQzCxxIJhZ\n4kAws8SBYGaJA8HMEgeCmSUOBDNLPA7BmmbatGmF9QMPPLCw3tPTU1jv6ir+REDf76Bx3kMws8SB\nYGaJA8HMEgeCmSUOBDNLHAhmljgQzCxRRLRvZVL7VmZNN3z48ML6U089VViv9rkL8+bNK6x//vOf\nL6xbsYhQtWmq7iFIOkbSEklPS1onaUb+/CxJGyWtyh/nNaNpMytPLSMV3waujYhfSxoJrJS0MK/d\nGhE3t649M2unqoEQEZuBzfn3PZKeAY5qdWNm1n51nVSUNA74EPDL/KmrJa2WNFfS6H7m6ZK0QtKK\nhjo1s5arORAkjQAeAmZGxA7gDuB4YCLZHsTsvc0XEXMiYlJETGpCv2bWQjUFgqThZGFwX0Q8DBAR\n3RGxOyLeAb4HTG5dm2bWDrVcZRBwF/BMRNxS8fzYiskuAtY2vz0za6darjJ8BPgnYI2kVflzNwDT\nJE0EAlgPfLElHVrHqDZm5f777y+sr1q1qrC+cOHCwrq1Xi1XGZYBexvQ8JPmt2NmZfLQZTNLHAhm\nljgQzCxxIJhZ4kAws8SBYGaJ74dgto9oyv0QzGzf4UAws8SBYGaJA8HMEgeCmSUOBDNLHAhmltRy\nP4Rm2gpsqPj5sPy5TuX+GtPJ/XVyb9D8/o6rZaK2Dkx618qlFZ18r0X315hO7q+Te4Py+vMhg5kl\nDgQzS8oOhDklr78a99eYTu6vk3uDkvor9RyCmXWWsvcQzKyDOBDMLHEgmFniQDCzxIFgZsn/A7Ef\nH9ZUofg/AAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x7fd590437210>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAQQAAAEMCAYAAAAiW8hnAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAFDlJREFUeJzt3X2wXHV9x/H3RxO1TXgIJIZAISmKIGhFDAyOIUIdgTAF\nkrGlBmqjFcIwMgOk/oGAEAsJ2ikImXaCUSgogoNAhDKg0hRDqJZCACHJDWBpeAghD1AggIKQb/84\nJz+WmPs7u3f37tncfF4zO/fe/Z6H757kfvY8/O5ZRQRmZgDvqrsBM+sdDgQzSxwIZpY4EMwscSCY\nWeJAMLPEgbAFSbMlXVvTuveW9Eqnp92eSLpW0uxOzytphqQ72ultWzCkA0HSeyVdKelJSRslPSRp\nSkP9cEnPDHDZe0l6peERkl5t+PmwVpcZEU9ExMhOT9uq8hfjjXKbbZT0iKQ5knZsYRnPSDq8jR7u\nkfTFgc7faRFxTURMqZ7ynSRdUm6LlyWtknT2YPTXKUM6EIBhwNPAp4GdgPOAGyRNaHfBEfFURIzc\n/Cif/ljDc0u2nEfSu9tdbxfNjYgdgDHAl4HDgCWS/qjetrY5C4APRcSOFNvwi5KOq7mnfg3pQIiI\nVyNidkSsiohNEXEb8L/AJySNAO4Adm94V9+9nPU9kr5fvjsulzRxIOsv32n/RdJPJb0KHCbpuHJP\n5WVJT0n6esP0H5QUDT/fI+kbkn5Z9vJTSbu0Om1Z/1K5vg2Szmn2HTwifhcR/w0cC+wGzCiXt4+k\nuyS9UC7zB5J2KmvXA7sDd5TbdZakd0m6UdJzkl6U9AtJHx7ANm1mOWMkLSq3w12S9myYf39J/172\nvVLS55pc78mSftHQwzxJ6yS9JOlhSftvbb6IeDQiXtv8I7AJ+GCrr7tbhnQgbEnSWOBDwPKIeBWY\nAjzb8K7+bDnpccCPgJ2BW4F/bmO1JwLfAHYAfgW8ApxULvtY4AxJf1Ex/wxgLDACmNXqtJI+CswD\nPg/sQfGuv1srLyIiXgIWUbzLAQi4qFzO/sDewNfLaacDzwJTyu16aTnPbcA+5TzLgB+00kODquX8\nDXA+MBpYsbkuaSRwJ/B94P0U/w4LJO3b4vqnAIeWPYyi2K4v9DexpHPLN4SngfcC17e4vq7ZbgJB\n0nDgh8A1EbGyYvJ7IuL2iHiL4j/Tx9pY9cKI+FW5h/J6RPxHRCwvf/41RfB8OjP/lRHxePku82Pg\nwAFM+1fATyLilxHxOsWh00A8C+wCEBGPRcSiiHgjItYB3869jvL1Xh0RGyPid8Bs3t5Ta1qTy/m3\niPjP8rWeA0yWNA44HngsIr4fEW9GxFLgJ8BfttID8HtgR2C/sqcVEfFcpuc5wEjgE8C1wMstrq9r\ntotAkPQuil/sN4DTm5il8R/3NeB9koYNcPVPb9HLJ8vd3PWSXgJOpngna7aX3InE/qbdvbGPcu/o\n/5rofUt7UL4TStpN0g2SVkt6GbiazOuQ9G5J/yjpiXL635Sl3Gsf6HIaX+tLwEsU22A88KnyUONF\nSS8Cfw2Ma6WHiPg5cAUwH1gr6QpJO1TMExHxAEWYXNDK+rppyAeCJAFXUuxGfy4ift9Q7safem65\njh8BNwF7RsROwPcodr8H0xrgTzb/UL6bjmplASquMPw5sPlk6beA14GPlifMvsg7X8eWr/tvgWPK\nZezE28fRrb72ZpbTeM5gp3K6ZymCYlFE7NzwGBkRzbxJvENEXBYRBwEfoThkyh3KNRoGfKDV9XXL\nkA8EihT/MHBsRPx2i9paYNfNJ8O6ZAfghYj4naRDKY4/B9uPgamSDpX0HuAfmp1RxaXbicAtwHqK\n428oXserwEvlSbuvbjHrWorzCjRM/zrwPPDHwJwmVj9c0vsaHsObXM6x5Z7YeynOcyyJiDUU54MO\nkHSipOHl45BWzyGU8xxS7jW+SrHnuW
kr0w2XdIqkncsTkZ8ETqM4F9OThnQgSBoPnEpxLP2c3r6a\ncBJAeS7heuCJchdy98ziOuU04GJJGymOb28Y7BVGxMPAWRTB8CzFL9PzFL9Y/Tmn7PF54Brgv4BP\nNZwxvwA4hGJ3/FaKvZ5Gc4FvlNv1TOBfy3U/CywHftlE6wuA3zY8vtvkcq6lCIINwJ9R7FVsPnw4\niuKk4xqKQ6yLKU70tWJnir3OF4FV5bIu3cp0QXF+4gmK8wZXl9PNb3F93RMRXX8ARwOPUhz/nV1H\nDxX9rQIeAR4C7u+Bfq4C1gHLGp7bheKM+ePl11EtLG9Hine0PQexv9nA6nIbPgQcU+P22xO4i+KK\nw3LgjHa3YZf66/o2rOPFvxv4H4rdyfcAvwb2r+s/Sz89rgJG191HQz+TgYO2+IX7x81hCpwNfKti\nGcdR7GKPpHin7VjQ9dPfbOCrdW+7spdxwEHl9zsAj1Ec97e0DWvor+vbsI5DhkOA30Qx9PYNipNs\nx9fQxzYjIu7mD69zH0+xK0/5dWrFYqZR7GY/A0wApg9yfz0jItZEcYafiNgI9FFcMWl1G3a7v66r\nIxD24J2X4p6hphefEcDPJS2VNLPuZvoxNooTZVAcC4/NTRwRX4q3z6x/NiIeH/wWOb0cxXeVpJau\nagwWFcPWPw7cS4vbsBu26A+6vA2H9EnFNkyK4pLSFOArkibX3VBOFPuavXa33PkUl9cOpDjpdkm9\n7aSRijcBZ0bEOwYH9cI23Ep/Xd+GdQTCahquE1NcH19dQx/9iojV5dd1wEKKw5xes7YcfUf5dV3N\n/bxDRKyNiLciYhPFOYtat2F5yfIm4IcRcXP5dM9sw631V8c2rCMQ7gP2kfSn5TXxz1NctuoJkkZs\nHnVWDuA5kmK8fK+5lfIPjcqvt9TYyx/Y/ItWmkaN27BhcFpfvP13FdAj27C//urYhirPbHaVpGOA\nyyiuOFwVxVjvniBpb4q9AihGlV1Xd38q/nrwcIrhuWspxgD8hGIMw17Ak8AJEVHLib1++jucYlc3\nKK7anNpwvN7t/iZRjLB8hLcHEJ1DcZxe+zbM9DedLm/DWgLBzHqTTyqaWeJAMLPEgWBmiQPBzBIH\ngpkltQZCDw8LBtxfu3q5v17uDerrr+49hJ7+R8H9tauX++vl3qCm/uoOBDPrIW0NTJJ0NHA5xYjD\n70XENyum9ygos5pEROX9KwccCCo+hegx4LMUf8J8HzA9IlZk5nEgmNWkmUBo55DBNzoxG2LaCYRt\n4UYnZtaCgX74SNPKyye9fkbXzGgvEJq60UlELKC4nbbPIZj1uHYOGXr6Ridm1roB7yFExJuSTgd+\nxts3Olnesc7MrOu6eoMUHzKY1WewLzua2RDjQDCzxIFgZokDwcwSB4KZJQ4EM0scCGaWOBDMLHEg\nmFniQDCzxIFgZokDwcwSB4KZJQ4EM0scCGaWOBDMLHEgmFniQDCzxIFgZokDwcwSB4KZJQ4EM0sc\nCGaWOBDMLHEgmFniQDCzxIFgZokDwcwSB4KZJQ4EM0uG1d2A9Y7x48dn6yeffHK2fu6552brEZGt\nS/lPK+/r68vWzzvvvGx94cKF2bq1GQiSVgEbgbeANyNiYieaMrN6dGIP4YiI2NCB5ZhZzXwOwcyS\ndgMhgJ9LWippZicaMrP6tHvIMCkiVkt6P3CnpJURcXfjBGVQOCzMtgFt7SFExOry6zpgIXDIVqZZ\nEBETfcLRrPcNOBAkjZC0w+bvgSOBZZ1qzMy6T1XXhvudUdqbYq8AikOP6yJiTsU8A1uZNWXMmDHZ\n+te+9rVs/aSTTsrWd91112y9ahxBu+MQquZ/+umns/WDDz44W9+wYWhfLIuI/AamjXMIEfEE8LGB\nzm9mvceXHc0scSCYWeJAMLPEgWBmiQPBzBIHgpklAx6HMKCVeRxCW6ruN3DhhRdm63WPA1i/fn22\nXm
X06NHZ+oQJE7L1FStWZOsHHHBAqy1tU5oZh+A9BDNLHAhmljgQzCxxIJhZ4kAws8SBYGaJA8HM\nEo9D2Ibcd9992fpBBx2Urbc7DqHqOv4RRxyRrbd7v4FJkyZl64sXL87Wq17/sGFD+2NKPA7BzFri\nQDCzxIFgZokDwcwSB4KZJQ4EM0scCGaWeBxCD9lvv/2y9apxCM8//3y2XnU/gqpxAmeddVa2fuaZ\nZ2brc+fOzdafeuqpbL1K1f/lTZs2ZeunnXZatr5gwYKWe+olHodgZi1xIJhZ4kAws8SBYGaJA8HM\nEgeCmSUOBDNLPA5hG1I1TqFqHEG79yOYOXNmtj5//vxs/eCDD87WH3jggWx92rRp2fqNN96YrVf9\nX99tt92y9Xa3X906Mg5B0lWS1kla1vDcLpLulPR4+XVUu82aWf2aOWS4Gjh6i+fOBhZFxD7AovJn\nM9vGVQZCRNwNvLDF08cD15TfXwNM7XBfZlaDgZ5UHBsRa8rvnwPGdqgfM6tR23eVjIjInSyUNBPI\nn40ys54w0D2EtZLGAZRf1/U3YUQsiIiJETFxgOsysy4ZaCDcCswov58B3NKZdsysTpWHDJKuBw4H\nRkt6BrgA+CZwg6QvA08CJwxmk1ZYuXJlreuvup/Co48+mq1X3a+h6n4LZ5+dv5hV9bkSgz1OYyio\nDISImN5P6TMd7sXMauahy2aWOBDMLHEgmFniQDCzxIFgZokDwcyStocuW++YPHlytl51P4WqcQZ9\nfX3Z+r777put33vvvdn6mDFjsvWq+xlU9T9lypRs3byHYGYNHAhmljgQzCxxIJhZ4kAws8SBYGaJ\nA8HMEo9DGEJOPPHEbP2UU07J1qvuJ1A1DqBq/qpxBu3ez2DevHnZetXnPpj3EMysgQPBzBIHgpkl\nDgQzSxwIZpY4EMwscSCYWeJxCNuRqnEEdc+/ZMmSbH3WrFnZuscZtM97CGaWOBDMLHEgmFniQDCz\nxIFgZokDwcwSB4KZJR6HMIRcd9112fr48eOz9dGjR2frVZ/rMGLEiGy9yvnnn5+te5zB4KvcQ5B0\nlaR1kpY1PDdb0mpJD5WPYwa3TTPrhmYOGa4Gjt7K89+OiAPLx+2dbcvM6lAZCBFxN/BCF3oxs5q1\nc1LxdEkPl4cUozrWkZnVZqCBMB/4AHAgsAa4pL8JJc2UdL+k+we4LjPrkgEFQkSsjYi3ImIT8F3g\nkMy0CyJiYkRMHGiTZtYdAwoESeMafpwGLOtvWjPbdqiJe+1fDxwOjAbWAheUPx8IBLAKODUi1lSu\nTGrvD+qtVlXjEC666KJsferUqdn6gw8+mK1PmTIlW6/63IbtXUTkP/iCJgYmRcT0rTx95YA6MrOe\n5qHLZpY4EMwscSCYWeJAMLPEgWBmiQPBzJLKcQgdXdk2Pg5hzJgx2fr69eu71Mm26Y477sjWjzrq\nqGy96nMZLrvsspZ72p40Mw7BewhmljgQzCxxIJhZ4kAws8SBYGaJA8HMEgeCmSX+XIYGkydPztYv\nuaTfO8UBsHLlymz9C1/4Qss9DSVz5szJ1o888shsfd999+1kO7YV3kMws8SBYGaJA8HMEgeCmSUO\nBDNLHAhmljgQzCzZrsYhVN3P4IorrsjW161bl61v7+MMRowYka1/5zvfydalyj/Xt0HmPQQzSxwI\nZpY4EMwscSCYWeJAMLPEgWBmiQPBzJLtahzCtGnTsvWqv7dfvHhxJ9vZ5uy3337Z+k033ZStV23f\nqs8IqbrfhLWvcg9B0p6S7pK0QtJySWeUz+8i6U5Jj5dfRw1+u2Y2mJo5ZHgT+PuI2B84FPiKpP2B\ns4FFEbEPsKj82cy2YZWBEBFrIuKB8vuNQB+wB3A8cE052TXA1MFq0sy6o6WTipImAB8H7gXGRsSa\nsvQcMLajnZlZ1zV9UlHSSOAm4MyIeLnxD1EiIvr7IFdJM4GZ7TZq
ZoOvqT0EScMpwuCHEXFz+fRa\nSePK+jhgq38KGBELImJiREzsRMNmNniaucog4EqgLyIubSjdCswov58B3NL59sysm1R17VfSJGAJ\n8AiwqXz6HIrzCDcAewFPAidExAsVy8qvbJBVXUfv6+vL1lesWJGtX3zxxW0tf+nSpdl6lfHjx2fr\nhx12WLZeNU5j6tT8eeOq+xlU/V+7/PLLs/VZs2Zl65YXEZU3nKg8hxAR9wD9LegzrTZlZr3LQ5fN\nLHEgmFniQDCzxIFgZokDwcwSB4KZJZXjEDq6sprHIVS58cYbs/XBvg7/4IMPZutV9tprr2x91113\nzdbb7b9q/jlz5mTr8+bNy9Y3bNiQrVteM+MQvIdgZokDwcwSB4KZJQ4EM0scCGaWOBDMLHEgmFni\ncQgNxowZk63ffvvt2frEifmbQm3atClbH+xxAFXzv/baa9l61ecizJ07N1tfuHBhtm6Dy+MQzKwl\nDgQzSxwIZpY4EMwscSCYWeJAMLPEgWBmicchtGD06NHZ+oUXXtjW8mfOzH/i3c0335ytt3u/gKrP\nRagah2C9zeMQzKwlDgQzSxwIZpY4EMwscSCYWeJAMLPEgWBmicchmG0nOjIOQdKeku6StELSckln\nlM/PlrRa0kPl45hONG1m9ancQ5A0DhgXEQ9I2gFYCkwFTgBeiYh/anpl3kMwq00zewjDmljIGmBN\n+f1GSX3AHu23Z2a9pqWTipImAB8H7i2fOl3Sw5KukjSqn3lmSrpf0v1tdWpmg67pk4qSRgKLgTkR\ncbOkscAGIIALKQ4r/q5iGT5kMKtJM4cMTQWCpOHAbcDPIuLSrdQnALdFxEcqluNAMKtJp64yCLgS\n6GsMg/Jk42bTgGUDadLMekczVxkmAUuAR4DNHyxwDjAdOJDikGEVcGp5AjK3LO8hmNWkY4cMneJA\nMKuPb5BiZi1xIJhZ4kAws8SBYGaJA8HMEgeCmSUOBDNLHAhmljgQzCxxIJhZ4kAws8SBYGaJA8HM\nEgeCmSWVN1ntsA3Akw0/jy6f61Xurz293F8v9wad7298MxN19X4If7By6f6ImFhbAxXcX3t6ub9e\n7g3q68+HDGaWOBDMLKk7EBbUvP4q7q89vdxfL/cGNfVX6zkEM+stde8hmFkPcSCYWeJAMLPEgWBm\niQPBzJL/By402Tk7k2myAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x7fd58dcf6d90>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAQQAAAEMCAYAAAAiW8hnAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAEzdJREFUeJzt3XuwXWV9xvHvwyWRkgtJoyGkiWjEtlAxSialBUKoN2Aq\nl2lrRToFRieMFQcQnUEGMCJiSgHTtBQnmBS8wSB3O0hNAwwXLTU4MRBIiWWCuXEiSbnFQm6//rHe\n87KNOWvvffZl7XPyfGb2nL33u9a7fnudc579rrXes48iAjMzgH2qLsDMeocDwcwyB4KZZQ4EM8sc\nCGaWORDMLHMg7EbSXEnfqWjb75T0WruX3ZtI+o6kue1eV9JZkn7YSm1DwbAPhPRN3ijpFUnPSvpU\nTdtsSesG2e9USa/V3ELS1prHxzXbZ0Q8FxGj2r1ss9I+2ybp1XR7UtJXJY1poo91kma3UMOjks4e\n7PrtFhE3R8RJg11f0gRJmyU91May2m7YBwLwNeDQiBgDnAJcKemoVjuNiF9GxKj+W3r6vTXPPbL7\nOpL2bXW7XXRVRIwG3gp8EjgOeETSAdWWNWT9A7Cy6iLqGfaBEBErI+KN/ofpNk3SgcAPgUNq3tUP\nScuNkPSt9O64UtKMwWw7vdNeL+l+SVuB4ySdIml5GrH8UtJlNcu/S1LUPH5U0pcl/TjVcr+k8c0u\nm9rPSdt7UdIljb6DR8TrEfFfwEeBg4GzUn+HSXpQ0pbU57cljU1ttwCHAD9M+/VzkvaRdLukFyS9\nJOkhSX84iH3aSD9vlbQ07YcHJU2pWf9wSf+R6l4l6S8a3O6n+t/dUw0LJG2S9LKkFZIOL1n3OOAw\n4NvNvt5uG/aBACDpXyT9GlgFbATui4itwEnAhpp39Q1plVOAW4GDgHuBf25h858AvgyMBn4CvAac\nmfr+KHC+pD+vs/5ZwETgQOBzzS4r6T3AAuDjwGSKd/2Dm3kREfEysJRipAAg4MrUz+HAO4HL0rJn\nABuAk9J+vS6t828UvxgHA08x+F+Qev38DXA5MAF4ur9d0ihgCfAt4G0U34eFkn6/ye2fBBydahhH\nsV+37GlBSfsB/wScR/Fm1NP2ikCIiL+j+IU8DrgTeKN8DR6NiPsiYifFD9N7W9j8XRHxk4jYFRFv\nRMQDadSyKyJ+ThE8x5esvygiVkfEr4HvA9MHsexfAXdHxI/TaOnSQb6WDcB4gIh4NiKWRsS2iNgE\nfL3sdaTXe1NEvBoRrwNzgaPSSK1hDfbzg4h4LL3WS4BZkiYBpwLPRsS3ImJHRDwB3A38ZTM1ANuB\nMcAfpJqejogXBlj2QuCRiFje5DYqsVcEAkBE7IyIR4HfAz5dZ/Hab+6vgbekpB+MtbUPJP1JGub+\nStLLwKco3skaraXsROJAyx5SW0caHf1vA7XvbjLpnVDSwZJuk7Re0ivATZS8Dkn7Srpa0nNp+V+k\nprLXPth+al/ry8DLFPvg7cAx6VDjJUkvAX8NTGqmhoj4EfAN4AagT9I3JI3eQ61TKH7WLtu9rVft\nNYFQYz9gWrrfjSHc7tu4FbgDmBIRY4FvUgy/O2kjRRACkN5NxzXTQbrC8GdA/8nSv6cYab0nnbA9\nm998Hbu/7r8FTk59jAXe1d91M3U02E/tOYOxabkNFEGxNCIOqrmNiojzmqyBiJgfEe8H/ojikGlP\nh3J/TBE2qyS9AFwL/Gm635OGdSBIepukj0sald5ZPgKcQXEsDNAH/G7/ybAuGQ1siYjXJR1NcfzZ\nad8HTpN0tKQRwBWNrihpZDqpeg/wK4rjbyhex1bg5fRO+PndVu2jOK9AzfJvAJuB3wG+2sDm95f0\nlprb/g3289E0EhtJcZ7jkYjYSHE+6AhJn5C0f7rNbPYcQlpnZho1bgW2Abv2sOgPgHdQHLpNpziX\ntIzyw75KDetAoHiX+jSwjmKIfA1wQU
TcCxARq4BbgOfSEPKQAXtqn08DX5P0KsXx7W2d3mBErKA4\nlv0+xTvl5nQrO5dySapxM3Az8J/AMen8BMCXgJkUw/F7KUY9ta4Cvpz26wXAv6Ztb6C4/PbjBkpf\nCPxfze3GBvv5DkUQvAgcSTGq6D98+AjFSceNFIdYXwNGNlBLrYOARcBLwJrU13W7L5TOGb3QfwNe\nAbaVnG+oXkR0/QacCPw3xfHfxVXUUKe+NcCTwHJgWQ/UsxjYBDxV89x4ijPmq9PXcU30N4biHW1K\nB+ubC6xP+3A5cHKF+28K8CDFFYeVwPmt7sMu1df1fVjFi98X+B+K4eQI4OfA4VX9sAxQ4xpgQtV1\n1NQzC3j/br9wV/eHKXAx8Pd1+jiFYog9iuKdtm1BN0B9c4HPV73vUi2TgPen+6OBZymO+5vahxXU\n1/V9WMUhw0zgF1FMvd1GcZLt1ArqGDIi4mF++zr3qRRDedLX0+p0czrFMHsdcCjFuZRO1tczImJj\nRPws3X8VeIbiikmz+7Db9XVdFYEwmd+8FLeOil58iQB+JOkJSXOqLmYAE6M4UQbFsfDEsoUj4px4\n88z6hyJidedL5Lw0i2+xpKauanSKpEOB9wGP0+Q+7Ibd6oMu78PhflJxsI6N4pLSScBnJM2quqAy\nUYw1e20W3A0Ul3enU5x0u7bacvJMxTsoTiy/UtvWC/twD/V1fR9WEQjrqblOTHF9fH0FdQwoItan\nr5uAuygOc3pNX5p9R/q6qeJ6fkNE9EUxGWwXxTmLSvdhumR5B/DdiLgzPd0z+3BP9VWxD6sIhJ8C\nh0l6R7om/nGKy1Y9QdKB/bPO0gSeD1PMl+8195L+0Ch9vafCWn5L/y9acjoV7kNJorhM+Ey8+XcV\n0CP7cKD6qtiHSmc2u0rSycB8iisOiyOikUkqXSHpnRSjAihmNX6v6vpU/PXgbIrpuX0UcwDuppjD\nMBV4HvhYRFRyYm+A+mZTDHWD4qrNuTXH692u71iKGZZP8uYEoksojtMr34cl9Z1Bl/dhJYFgZr3J\nJxXNLHMgmFnmQDCzzIFgZpkDwcyySgOhh6cFA66vVb1cXy/XBtXVV/UIoae/Kbi+VvVyfb1cG1RU\nX9WBYGY9pKWJSZJOBP6RYsbhNyNiXp3lPQvKrCIRUffzKwcdCCr+C9GzwIco/oT5p8AZEfF0yToO\nBLOKNBIIrRwy+INOzIaZVgJhKHzQiZk1YbD/fKRh6fJJr5/RNTNaC4SGPugkIhZSfJy2zyGY9bhW\nDhl6+oNOzKx5gx4hRMQOSecB/86bH3Sysm2VmVnXdfUDUnzIYFadTl92NLNhxoFgZpkDwcwyB4KZ\nZQ4EM8scCGaWORDMLHMgmFnmQDCzzIFgZpkDwcwyB4KZZQ4EM8scCGaWORDMLHMgmFnmQDCzzIFg\nZpkDwcwyB4KZZQ4EM8scCGaWORDMLHMgmFnmQDCzzIFgZpkDwcwyB4KZZQ4EM8scCGaWORDMLNuv\nlZUlrQFeBXYCOyJiRjuKMrNqtBQIyQkR8WIb+jGzivmQwcyyVgMhgB9JekLSnHYUZGbVafWQ4diI\nWC/pbcASSasi4uHaBVJQOCzMhgBFRHs6kuYCr0XENSXLtGdjZta0iFC9ZQZ9yCDpQEmj++8DHwae\nGmx/Zla9Vg4ZJgJ3Serv53sRcX9bqrI9GjFiRGn70qVLS9uPOeaY0vb0vRzQSy+9VNp+5JFHlrav\nXbu2tN2qN+hAiIjngPe2sRYzq5gvO5pZ5kAws8yBYGaZA8HMMgeCmWUOBDPL2vHXjtYm9eYZLFq0\nqLS93jyDeu6+++7S9nnz5pW2b9iwoaXtd9rEiRNL2/v6+rpUSe/yCMHMMgeCmWUOBDPLHAhmljkQ\nzCxzIJhZ5kAws8zzEHrIRRddVNp+5plnttT/9ddfX9r+hS98obT99ddfb2n7nXbNNQN+WBcA55xz\nTm
n7V77yldL2+fPnN13TUOMRgpllDgQzyxwIZpY5EMwscyCYWeZAMLPMgWBmmechdNERRxxR2n7p\npZe21P9rr71W2n7hhReWtu/YsaOl7XfajBkzStvPPvvs0vZx48a1sZrhySMEM8scCGaWORDMLHMg\nmFnmQDCzzIFgZpkDwcwyz0Pooosvvri0/YADDihtrzdP4JRTTmlp/V5X7/Maxo8fX9q+ffv20vZ6\n/5dib1B3hCBpsaRNkp6qeW68pCWSVqevnvFhNgw0cshwE3Dibs9dDCyNiMOApemxmQ1xdQMhIh4G\ntuz29KnAzen+zcBpba7LzCow2JOKEyNiY7r/AlD+T/PMbEho+aRiRISkGKhd0hxgTqvbMbPOG+wI\noU/SJID0ddNAC0bEwoiYERHlf6pmZpUbbCDcC5yV7p8F3NOecsysSnUPGSTdAswGJkhaB3wJmAfc\nJumTwPPAxzpZ5HBx1FFHtbT+/fffX9r+0EMPtdT/vvvuW9o+YsSIlvqvZ9q0aaXtxx9/fEv93377\n7aXta9asaan/4aBuIETEGQM0faDNtZhZxTx12cwyB4KZZQ4EM8scCGaWORDMLHMgmFnmz0MYQkaO\nHNnS+jNnzixtv/LKK0vbP/jBD7a0/U7r6+srbb/qqqu6VMnQ5RGCmWUOBDPLHAhmljkQzCxzIJhZ\n5kAws8yBYGaZ5yF00dVXX13avnjx4tL2E044obT9gQceKG2fNWtWafs++wzt94cbb7yxtH3lypVd\nqmToGto/AWbWVg4EM8scCGaWORDMLHMgmFnmQDCzzIFgZpnnIXTR1KlTW1p/v/3Kv12zZ89uqf/H\nH3+8tP2uu+4qbZ88eXJp+2c/+9mma2rGsmXLOtr/3sAjBDPLHAhmljkQzCxzIJhZ5kAws8yBYGaZ\nA8HMMs9D6KJ6n3ewbdu2jm7/1ltvLW1fu3ZtafvOnTtL27/4xS82XVMzHnvssdL2++67r6Pb3xvU\nHSFIWixpk6Snap6bK2m9pOXpdnJnyzSzbmjkkOEm4MQ9PP/1iJiebo5ms2GgbiBExMPAli7UYmYV\na+Wk4nmSVqRDinFtq8jMKjPYQLgBmAZMBzYC1w60oKQ5kpZJ8l+emPW4QQVCRPRFxM6I2AXcCAz4\nb4UjYmFEzIiIGYMt0sy6Y1CBIGlSzcPTgacGWtbMho668xAk3QLMBiZIWgd8CZgtaToQwBrg3A7W\nOGysW7eutH3evHldqqQztm7d2tH+FyxYUNq+Y8eOjm5/b1A3ECLijD08vagDtZhZxTx12cwyB4KZ\nZQ4EM8scCGaWORDMLHMgmFnmz0Owtqn3eQn17Nq1q7R99erVLfVv9XmEYGaZA8HMMgeCmWUOBDPL\nHAhmljkQzCxzIJhZ5nkI1jbnntvax2IsWbKktH358uUt9W/1eYRgZpkDwcwyB4KZZQ4EM8scCGaW\nORDMLHMgmFnmeQjWsLFjx5a2jxkzpqX+58+f39L61jqPEMwscyCYWeZAMLPMgWBmmQPBzDIHgpll\nDgQzyzwPwRo2c+bM0vapU6eWtm/fvr20ffPmzU3XZO1Vd4QgaYqkByU9LWmlpPPT8+MlLZG0On0d\n1/lyzayTGjlk2AFcFBGHA0cDn5F0OHAxsDQiDgOWpsdmNoTVDYSI2BgRP0v3XwWeASYDpwI3p8Vu\nBk7rVJFm1h1NnVSUdCjwPuBxYGJEbExNLwAT21qZmXVdwycVJY0C7gAuiIhXJOW2iAhJMcB6c4A5\nrRZqZp3X0AhB0v4UYfDdiLgzPd0naVJqnwRs2tO6EbEwImZExIx2FGxmndPIVQYBi4BnIuK6mqZ7\ngbPS/bOAe9pfnpl1kyL2ONJ/cwHpWOAR4ElgV3r6EorzCLcBU4HngY9FxJY6fZVvzHraqlWrStvf\n/e53l7Zv2VL648GECROarskaFxGqt0zdcwgR8SgwUEcfaLYoM+td
nrpsZpkDwcwyB4KZZQ4EM8sc\nCGaWORDMLPPnIVjDRo4c2dL6K1asaFMl1ikeIZhZ5kAws8yBYGaZA8HMMgeCmWUOBDPLHAhmlnke\ngnXNzp07qy7B6vAIwcwyB4KZZQ4EM8scCGaWORDMLHMgmFnmQDCzzPMQrGtmzZpV2n755ZeXtl9x\nxRXtLMf2wCMEM8scCGaWORDMLHMgmFnmQDCzzIFgZpkDwcwyz0Owhi1YsKC0/bLLLittP+igg0rb\nd+3a1XRN1l51RwiSpkh6UNLTklZKOj89P1fSeknL0+3kzpdrZp3UyAhhB3BRRPxM0mjgCUlLUtvX\nI+KazpVnZt1UNxAiYiOwMd1/VdIzwOROF2Zm3dfUSUVJhwLvAx5PT50naYWkxZLGDbDOHEnLJC1r\nqVIz67iGA0HSKOAO4IKIeAW4AZgGTKcYQVy7p/UiYmFEzIiIGW2o18w6qKFAkLQ/RRh8NyLuBIiI\nvojYGRG7gBuBmZ0r08y6oZGrDAIWAc9ExHU1z0+qWex04Kn2l2dm3aSIKF9AOhZ4BHgS6L9QfAlw\nBsXhQgBrgHPTCciyvso3ZmYdExGqt0zdQGgnB4JZdRoJBE9dNrPMgWBmmQPBzDIHgpllDgQzyxwI\nZpY5EMwscyCYWeZAMLPMgWBmmQPBzDIHgpllDgQzyxwIZpZ1+/8yvAg8X/N4QnquV7m+1vRyfb1c\nG7S/vrc3slBXPw/htzYuLevlz1p0fa3p5fp6uTaorj4fMphZ5kAws6zqQFhY8fbrcX2t6eX6erk2\nqKi+Ss8hmFlvqXqEYGY9xIFgZpkDwcwyB4KZZQ4EM8v+Hyp7/AmhJxHPAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x7fd58dc8b9d0>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "trainimg = mnist.train.images\n",
    "trainlabel = mnist.train.labels\n",
    "nsample = 1\n",
    "randidx = np.random.randint(trainimg.shape[0], size=nsample)\n",
    "\n",
    "for i in [0, 1, 2]:\n",
    "    curr_img   = np.reshape(trainimg[i, :], (28, 28)) # 28 by 28 matrix \n",
    "    curr_label = np.argmax(trainlabel[i, :] ) # Label\n",
    "    plt.matshow(curr_img, cmap=plt.get_cmap('gray'))\n",
    "    plt.title(\"\" + str(i + 1) + \"th Training Data \" \n",
    "              + \"Label is \" + str(curr_label))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "## Softmax Regressions"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "我們知道 MNIST 裡面的手寫數字圖片都是從 0 到 9 的數字，因此我們想要知道給定一張圖片的時候對應到各個數字的機率是多少．例如我們的模型可能看了一張數字為 9 的圖片，模型不可能百分之百的確定他是某個數字，可能覺得有 80% 的機率為 9．然後有 5% 的機率是 8．剩餘的機率為其他數字．\n",
    "\n",
    "`softmax regression` 是一個簡單且自然的模型，而 MNIST 就是一個經典的例子．`softmax` 的作用就是把機率分配給許多的類別，並且這些類別機率的加總為 1．即使是之後更複雜的模型，在模型最後一個階段仍然是使用 `softmax` 來分配機率．\n",
    "\n",
    "`softmax regression` 有兩個步驟，第一個我們必須先蒐集對應類別的**證據** (`evidence`)，然後再把 `evidence` 轉化成機率．\n",
    "\n",
    "為了得到一張給定圖片屬於哪一個數字的證據 (`evidence`) ，我們把圖片的 pixenl 加權求和．權重是負的表示這部分的 pixel 有很強的證據顯示突變並不屬於該數字，而如果很可能是這個數字，權重就會是正的．\n",
    "\n",
    "下面的圖片顯示了模型從每個數字學習到的權值．紅色表示負權重，藍色表示正權重．\n",
    "\n",
    "![](https://www.tensorflow.org/images/softmax-weights.png)\n",
    "\n",
    "我們同樣加入了額外的 evidence 稱作**偏移量** (bias)．基本上我們會說這個證據是和輸入比較沒有關係的．結果是給定一個輸入 **x** 對應到的數字 **i** 他的證據如下:\n",
    "\n",
    "$$\\text{evidence}_i = \\sum_j W_{i,~ j} x_j + b_i$$\n",
    "\n",
    "$$\\text{y= softmax(evidence)}$$\n",
    " \n",
    "在這裡 `softmax` 代表著一種`激勵函數`或是`連結函數`，把線性輸出轉化成我們需要的形式，在這裡則是把它轉成 10 個類別．你可以想把 `evidence` 轉化成每個數字類別的機率．它的定義如下：\n",
    "\n",
    "$$\\text{softmax}(x)= \\text{normalize(exp(x))}$$\n",
    "\n",
    "如果我們把這個等式展開會得到：\n",
    "\n",
    "![](http://imgur.com/E2qHl47.jpg)\n",
    " \n",
    "\n",
    "通常如果我們把 softmax 函數想像成這樣的形式會幫助我們理解：把輸入指數化並且正規化．指數話的意思為如果一個 `evidence` 權重值增加的話，會對輸出的 hypothesis 貢獻更多的比重．相對的，如果權重值減少，在 hypothesis 中會減少得更多．注意到沒有一個 hypothesis 有 0 或是負數的值．Softmax 同時會正規化這些權重，讓他加總會是 1，成為一個機率分佈（如果想要用更直覺的方式了解 softmax，可以參考 Michael Nielsen 的書，裡面有完整的互動式視覺化．\n",
    "\n",
    "現在可以把 softmax 回歸表示成以下的圖像．對於每個輸出，我們計算其權重和並且加上偏移量，最後用上 softmax．\n",
    "\n",
    "![](https://www.tensorflow.org/images/softmax-regression-scalargraph.png)\n",
    "\n",
    "如果把它寫成等式，我們會得到：\n",
    "\n",
    "![](https://www.tensorflow.org/images/softmax-regression-scalarequation.png)\n",
    "\n",
    "我們可比這些過程向量化，把它變成一系列的矩陣相乘，還有相加．這些過程對於計算效率有很大的幫助（同時也幫助我們思考）\n",
    "\n",
    "更精簡地，我們可以直接寫下：\n",
    "\n",
    "$$y = \\text{softmax}(Wx+b)$$\n",
    "\n",
    "現在我們可以來把這些過程用 Tensorflow 來實現．"
   ]
  },
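  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "Before wiring this up in TensorFlow, the exponentiate-then-normalize recipe above can be sketched in a few lines of NumPy. This is only a minimal illustration, not the implementation TensorFlow uses, and the evidence values here are made up."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "def softmax(x):\n",
    "    # Subtracting the max improves numerical stability without changing the result\n",
    "    e = np.exp(x - np.max(x))\n",
    "    return e / e.sum()\n",
    "\n",
    "evidence = np.array([2.0, 1.0, 0.1])  # made-up evidence for 3 classes\n",
    "probs = softmax(evidence)\n",
    "print(probs)        # larger evidence gets a larger share of probability\n",
    "print(probs.sum())  # the probabilities sum to 1"
   ]
  },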
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "source": [
    "## 實現 Regression"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "我們通常會用 Numpy 這類的套件來在 Python 中更有效率地處理像是矩陣相乘這樣的數值運算，而它會把這些計算移到 Python 外面，使用別種的程式語言還更有效率的實現方法．很不幸的是這樣的方法當把結果移回 Python 的時候會有 overhead 的情形．特別是在程式執行在 GPUs 或者分散式系統的時候，移動資料的成本會變得非常的高．\n",
    "\n",
    "Tensorflow 同樣的也把這些計算移到 Python 外，但是它用了一些方法來避免 overhead．它先讓我們先敘述一個交互操作的圖，然後再把所有交互計算的過程移到 Python 外面，而不是只是在 Python 外面執行單一個昂貴的操作．(這樣的方式可以在一些機器學習套件中看到)\n",
    "\n",
    "要開始執行 tensorflow 之前先讓我們 import 它．"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "import tensorflow as tf"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "We describe these interacting operations by manipulating symbolic variables. Let's create one:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "x = tf.placeholder(tf.float32, [None, 784])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "`x` isn't a specific value. It's a `placeholder`: a value that we ask TensorFlow to reserve, and that we fill in only when we actually run a computation. We want to be able to input any number of MNIST images, each flattened into a 784-dimensional vector. We represent this as a 2-D tensor of floating-point numbers, with shape `[None, 784]`. (Here `None` means the first dimension can be of any length.)\n",
    "\n",
    "Our model also needs weights and biases. We could imagine treating these like additional inputs, but TensorFlow has an even better way to handle them: `Variable`. A `Variable` is a modifiable tensor that lives in TensorFlow's graph of interacting operations. It can be used and even modified by the computation. For machine learning applications, the model parameters are generally `Variable`s."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "W = tf.Variable(tf.zeros([784, 10]))\n",
    "b = tf.Variable(tf.zeros([10]))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "We create these `Variable`s by giving `tf.Variable` an initial value: in this case, we initialize both `W` and `b` as tensors full of zeros. Since `W` and `b` are going to be learned during training, it doesn't matter very much what they initially are.\n",
    "\n",
    "Notice that `W` has a shape of [784, 10] because we want to multiply the 784-dimensional image vectors by it to produce 10-dimensional vectors of evidence for the different digit classes. `b` has a shape of [10], so we can add it to the final output.\n",
    "\n",
    "We can now implement our model. It only takes one line to define it!"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "y = tf.nn.softmax(tf.matmul(x, W) + b)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "First, we multiply `x` by `W` with the expression `tf.matmul(x, W)`. This corresponds to **Wx** in our earlier equation, where `x` is a 2-D tensor holding multiple inputs. We then add `b`, and finally apply `tf.nn.softmax`.\n",
    "\n",
    "That's it. It only took a few lines to set up the variables and a single line to define our model. TensorFlow isn't just making `softmax regression` particularly easy: it can describe many other kinds of numerical computation, from machine learning models to physics simulations. And once defined, our model can be run on different devices: your computer's CPU, its GPU, and even a phone!"
   ]
  },
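  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "A quick shape check in NumPy (an illustrative sketch mirroring `tf.matmul(x, W) + b`, not part of the original tutorial) shows how a whole batch of inputs flows through the model at once:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "x = np.random.rand(5, 784)  # a batch of 5 flattened images\n",
    "W = np.zeros((784, 10))\n",
    "b = np.zeros(10)\n",
    "\n",
    "logits = np.dot(x, W) + b   # broadcasting adds b to every row\n",
    "print(logits.shape)  # (5, 10): one 10-way score vector per image\n",
    "```"
   ]
  },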
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "## Training the Model"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "In order to train our model, we first need to define what it means for the model to be good. Actually, in machine learning we typically define what it means for a model to be bad. We call this the cost, or the loss, and it represents how far our model is from the desired outcome. We try to minimize it: the smaller the cost or loss, the better our model is.\n",
    "\n",
    "One very common, very nice cost function is `cross-entropy`. It originally arose from thinking about information-compressing codes in information theory, but it wound up being an important idea in many fields, from gambling to machine learning. It's defined as:\n",
    "\n",
    "$$H_{y'}(y) = -\\sum_i y'_i \\log(y_i)$$\n",
    "\n",
    "Here `y` is our predicted probability distribution, and `y'` is the true distribution (the one-hot vector with the digit labels). In some rough sense, `cross-entropy` measures how far our predictions are from the truth. Going into more detail about `cross-entropy` is beyond the scope of this tutorial, but it's well worth [understanding](http://colah.github.io/posts/2015-09-Visual-Information/).\n",
    "\n",
    "To implement `cross-entropy`, we first need to add a new placeholder to hold the correct answers."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "y_ = tf.placeholder(tf.float32, [None, 10])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "Then we can implement the cross-entropy function, $-\\sum y'\\log(y)$:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))"
   ]
  },
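  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "To make the cross-entropy formula concrete, here is a small NumPy illustration (a sketch added for clarity, not part of the original tutorial) comparing the loss of a good and a bad prediction against a one-hot label:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "def cross_entropy(y_prime, y):\n",
    "    # H_{y'}(y) = -sum_i y'_i * log(y_i)\n",
    "    return -np.sum(y_prime * np.log(y))\n",
    "\n",
    "y_true = np.array([0.0, 0.0, 1.0])    # one-hot: the true class is 2\n",
    "y_good = np.array([0.1, 0.1, 0.8])    # confident and correct\n",
    "y_bad = np.array([0.6, 0.3, 0.1])     # confident and wrong\n",
    "\n",
    "print(cross_entropy(y_true, y_good))  # small loss\n",
    "print(cross_entropy(y_true, y_bad))   # much larger loss\n",
    "```\n",
    "\n",
    "Because the label is one-hot, only the predicted probability of the true class contributes, so the loss reduces to $-\\log(y_{\\text{true class}})$."
   ]
  },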
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "First, `tf.log` computes the logarithm of each element of `y`. Next, we multiply each element of `y_` by the corresponding element of `tf.log(y)`. Then `tf.reduce_sum` adds up the elements along the second dimension (that's the `reduction_indices=[1]` parameter). Finally, `tf.reduce_mean` computes the mean over all the examples in the batch.\n",
    "\n",
    "(Note that in real source code we wouldn't use this formulation, because it is numerically unstable. Instead, we'd apply `tf.nn.softmax_cross_entropy_with_logits` to the unnormalized logits, i.e. with `tf.matmul(x, W) + b` as its input. In your own code, consider using `tf.nn.(sparse_)softmax_cross_entropy_with_logits` instead.)\n",
    "\n",
    "OK, now we know what we want our model to do, and TensorFlow knows the entire graph of our computations, so we can let TensorFlow train the model for us. It can automatically apply the [backpropagation algorithm](http://colah.github.io/posts/2015-08-Backprop/) and adjust the parameters to minimize the cost. Of course, you get to choose which optimization algorithm adjusts the parameters."
   ]
  },
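  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "To see the instability the note above warns about, here is a NumPy sketch (illustrative only, not part of the original tutorial): with large logits the naive formulation overflows, while the standard shift-by-max trick stays finite. Fused ops such as `tf.nn.softmax_cross_entropy_with_logits` avoid the problem internally:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "logits = np.array([1000.0, 1001.0, 1002.0])\n",
    "\n",
    "with np.errstate(over='ignore', invalid='ignore'):\n",
    "    naive = np.exp(logits) / np.exp(logits).sum()  # exp overflows: nan\n",
    "\n",
    "shifted = np.exp(logits - logits.max())  # largest exponent becomes 0\n",
    "stable = shifted / shifted.sum()\n",
    "\n",
    "print(naive)   # [nan nan nan]\n",
    "print(stable)  # a valid probability distribution\n",
    "```"
   ]
  },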
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "In this case, we ask TensorFlow to minimize `cross_entropy` using the [gradient descent algorithm](https://en.wikipedia.org/wiki/Gradient_descent) with a learning rate of 0.5. Gradient descent is a simple procedure in which TensorFlow shifts each parameter a little bit in the direction that reduces the cost. But TensorFlow also provides many other [optimization algorithms](https://www.tensorflow.org/api_docs/python/train#optimizers): using one is as simple as tweaking one line of code.\n",
    "\n",
    "What TensorFlow actually does here, behind the scenes, is add new operations to your computation graph that implement backpropagation and gradient descent. It then gives you back a single operation which, when run, performs one step of gradient descent training, nudging your parameters to reduce the cost."
   ]
  },
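  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "What a single training step does can be seen in miniature with a toy one-parameter problem (an illustrative sketch, not the tutorial's actual code): repeatedly nudge the parameter against the gradient of the cost, here $\\text{cost}(w) = (w - 3)^2$:\n",
    "\n",
    "```python\n",
    "w = 0.0                    # initial parameter value\n",
    "learning_rate = 0.1\n",
    "\n",
    "for _ in range(100):\n",
    "    grad = 2 * (w - 3.0)   # derivative of (w - 3)**2\n",
    "    w -= learning_rate * grad\n",
    "\n",
    "print(w)  # close to 3.0, the minimum of the cost\n",
    "```"
   ]
  },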
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "Our model is now set up. But before we can run it, there is one last thing to do: initialize the variables we created. Note that at this point we are only defining the initialization operation; nothing has actually been executed yet."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "init = tf.global_variables_initializer()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "We can now launch the model in a `Session`, and run the operation that initializes the variables."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "sess = tf.Session()\n",
    "sess.run(init)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "Let's train! We'll run the training step 1000 times."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "for i in range(1000):\n",
    "    batch_xs, batch_ys = mnist.train.next_batch(100)\n",
    "    sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "In each iteration of the loop, we grab a random batch of 100 examples from the training set, then run `train_step`, feeding in the batch data to replace the `placeholder`s we defined earlier.\n",
    "\n",
    "Using small batches of random data is called stochastic training, in this case stochastic gradient descent. Ideally, we'd like to use all our data for every step of training, since that would give us a better result, but that's computationally expensive. So, instead, we use a different random subset each time: it's cheap and has much of the same benefit."
   ]
  },
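  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "Sampling a mini-batch is easy to sketch in plain NumPy (illustrative only: the arrays below are random stand-ins for the real training set, and `mnist.train.next_batch` additionally shuffles and cycles through epochs):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "# random stand-ins with the same shapes as the MNIST training set\n",
    "images = np.random.rand(55000, 784)\n",
    "labels = np.eye(10)[np.random.randint(0, 10, 55000)]\n",
    "\n",
    "def next_batch(batch_size):\n",
    "    idx = np.random.choice(len(images), batch_size, replace=False)\n",
    "    return images[idx], labels[idx]\n",
    "\n",
    "batch_xs, batch_ys = next_batch(100)\n",
    "print(batch_xs.shape, batch_ys.shape)  # (100, 784) (100, 10)\n",
    "```"
   ]
  },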
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "source": [
    "## Evaluating Our Model\n",
    "\n",
    "How well does our model do?\n",
    "Let's check whether the predicted digits are correct. `tf.argmax` is an extremely useful function: it gives you the index of the highest entry in a tensor along some axis. For example, `tf.argmax(y, 1)` is the digit our model thinks is most likely for each input, while `tf.argmax(y_, 1)` is the correct digit. We can use `tf.equal` to check whether our prediction matches the truth.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "That gives us a list of booleans. To determine what fraction of the predictions are correct, we cast the booleans to floating-point numbers and then take the mean. For example, `[True, False, True, True]` would become `[1, 0, 1, 1]`, whose mean is `0.75`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))"
   ]
  },
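  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "The same bookkeeping is easy to verify in plain NumPy (an illustrative sketch, separate from the TensorFlow graph above):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "# two fake predicted distributions and their one-hot labels\n",
    "y = np.array([[0.1, 0.7, 0.2],\n",
    "              [0.8, 0.1, 0.1]])\n",
    "y_ = np.array([[0.0, 1.0, 0.0],\n",
    "               [0.0, 0.0, 1.0]])\n",
    "\n",
    "correct = np.equal(np.argmax(y, 1), np.argmax(y_, 1))  # [True, False]\n",
    "accuracy = np.mean(correct.astype(np.float32))\n",
    "print(accuracy)  # 0.5\n",
    "```"
   ]
  },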
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "source": [
    "Finally, we can print the accuracy of our model on the test data."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "0.9189\n"
     ]
    }
   ],
   "source": [
    "print(sess.run(accuracy, feed_dict = {x: mnist.test.images, y_: mnist.test.labels}))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "The result should be about 92%.\n",
    "\n",
    "Is that good? Well, not really; in fact, it's pretty bad. That's because we're using a very simple model. With some small changes, we can get to 97%. The best models can reach over 99.7% accuracy! (For more information, have a look at this [list of results](http://rodrigob.github.io/are_we_there_yet/build/classification_datasets_results.html).)\n",
    "\n",
    "What matters is what we learned from this model. Still, if you're feeling a bit down about these results, check out the [next tutorial](https://www.tensorflow.org/tutorials/mnist/pros/index) and learn how to build more sophisticated models with TensorFlow!"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[  3.29472641e-05   7.88767363e-09   1.11483721e-04   2.53725634e-03\n",
      "   1.26105158e-06   2.56959665e-05   2.00715462e-08   9.96722996e-01\n",
      "   1.77862657e-05   5.50678174e-04]\n"
     ]
    }
   ],
   "source": [
    "print(sess.run(y[0,:], feed_dict = {x: mnist.test.images, y_: mnist.test.labels}))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 2",
   "language": "python",
   "name": "python2"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 2
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython2",
   "version": "2.7.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
