{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "# Batch and Stochastic Training\n",
    "\n",
    "This script illustrates two different training methods: batch and stochastic training.  For each method, we fit a simple regression model with a single model variable.\n",
    "\n",
    "We start by loading the necessary libraries and resetting the computational graph."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "import matplotlib.pyplot as plt\n",
    "import numpy as np\n",
    "import tensorflow as tf\n",
    "from tensorflow.python.framework import ops\n",
    "ops.reset_default_graph()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "We start a computational graph session."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "sess = tf.Session()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "# Stochastic Training\n",
    "\n",
    "----------------------\n",
    "\n",
    "### Generate Data\n",
    "\n",
    "The data we will create is 100 random samples from a `Normal(mean = 1, sd = 0.1)`.  The target will be an array of size 100 filled with the constant 10.0.\n",
    "\n",
    "We also create the necessary placeholders in the graph for the data and target.  Note that we use a shape of `[1]` for stochastic training."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "x_vals = np.random.normal(1, 0.1, 100)\n",
    "y_vals = np.repeat(10., 100)\n",
    "\n",
    "x_data = tf.placeholder(shape=[1], dtype=tf.float32)\n",
    "y_target = tf.placeholder(shape=[1], dtype=tf.float32)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "### Model Variables and Operations\n",
    "\n",
    "We create the one variable in the graph, `A`.  We then create the model operation, which is just the multiplication of the input data and `A`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Create variable (one model parameter = A)\n",
    "A = tf.Variable(tf.random_normal(shape=[1]))\n",
    "\n",
    "# Add operation to graph\n",
    "my_output = tf.multiply(x_data, A)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "### Loss Function\n",
    "\n",
    "For this, we choose the L2 loss.  We can easily choose the L1 loss by replacing `tf.square()` with `tf.abs()`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Add L2 loss operation to graph\n",
    "loss = tf.square(my_output - y_target)"
   ]
  },
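  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "As a sketch of the L1 alternative mentioned above (left commented out so the L2 loss stays in effect):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Hypothetical L1 variant: swap tf.square() for tf.abs().\n",
    "# loss = tf.abs(my_output - y_target)"
   ]
  },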
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "### Optimization and Initialization\n",
    "\n",
    "For the optimization function, we will use standard gradient descent with a learning rate of `0.02`.  We also add and run a variable initialization operation."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Create Optimizer\n",
    "my_opt = tf.train.GradientDescentOptimizer(0.02)\n",
    "train_step = my_opt.minimize(loss)\n",
    "\n",
    "# Initialize variables\n",
    "init = tf.global_variables_initializer()\n",
    "sess.run(init)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "### Train Model\n",
    "\n",
    "We run the training step for 100 iterations and print out the value of `A` and the loss every 5 iterations."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true,
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #5 A = [ 2.15584159]\n",
      "Loss = [ 59.26628113]\n",
      "Step #10 A = [ 3.6463747]\n",
      "Loss = [ 38.98156357]\n",
      "Step #15 A = [ 4.82094002]\n",
      "Loss = [ 22.86421394]\n",
      "Step #20 A = [ 5.77139997]\n",
      "Loss = [ 19.58551025]\n",
      "Step #25 A = [ 6.52262354]\n",
      "Loss = [ 9.57485962]\n",
      "Step #30 A = [ 7.12625837]\n",
      "Loss = [ 6.36455345]\n",
      "Step #35 A = [ 7.63606024]\n",
      "Loss = [ 2.55103087]\n",
      "Step #40 A = [ 7.93105412]\n",
      "Loss = [ 2.34990478]\n",
      "Step #45 A = [ 8.30565166]\n",
      "Loss = [ 2.08385348]\n",
      "Step #50 A = [ 8.55292988]\n",
      "Loss = [ 6.9521904]\n",
      "Step #55 A = [ 8.81163883]\n",
      "Loss = [ 1.32351923]\n",
      "Step #60 A = [ 8.95606041]\n",
      "Loss = [ 6.76757669]\n",
      "Step #65 A = [ 9.12868977]\n",
      "Loss = [ 0.61686873]\n",
      "Step #70 A = [ 9.18059349]\n",
      "Loss = [ 0.01681929]\n",
      "Step #75 A = [ 9.22375107]\n",
      "Loss = [ 0.00448726]\n",
      "Step #80 A = [ 9.28870773]\n",
      "Loss = [ 0.08788975]\n",
      "Step #85 A = [ 9.44317436]\n",
      "Loss = [ 0.01499911]\n",
      "Step #90 A = [ 9.61464977]\n",
      "Loss = [ 0.00621458]\n",
      "Step #95 A = [ 9.65019512]\n",
      "Loss = [ 0.32731441]\n",
      "Step #100 A = [ 9.64083195]\n",
      "Loss = [ 0.31338796]\n"
     ]
    }
   ],
   "source": [
    "loss_stochastic = []\n",
    "# Run Loop\n",
    "for i in range(100):\n",
    "    rand_index = np.random.choice(100)\n",
    "    rand_x = [x_vals[rand_index]]\n",
    "    rand_y = [y_vals[rand_index]]\n",
    "    sess.run(train_step, feed_dict={x_data: rand_x, y_target: rand_y})\n",
    "    if (i+1)%5==0:\n",
    "        print('Step #' + str(i+1) + ' A = ' + str(sess.run(A)))\n",
    "        temp_loss = sess.run(loss, feed_dict={x_data: rand_x, y_target: rand_y})\n",
    "        print('Loss = ' + str(temp_loss))\n",
    "        loss_stochastic.append(temp_loss)"
   ]
  },
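  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "As an optional sanity check, we can print the final value of `A`.  Since the targets are all 10.0 and the inputs average 1.0, the learned multiplier should end up near 10:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# The learned multiplier should be close to 10.0 (= target / mean input).\n",
    "print('Final A (stochastic): ' + str(sess.run(A)))"
   ]
  },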
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "# Batch Training\n",
    "\n",
    "------------------\n",
    "\n",
    "We start by resetting the computational graph and opening a new session."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Batch Training:\n",
    "# Re-initialize graph\n",
    "ops.reset_default_graph()\n",
    "sess = tf.Session()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "For batch training, we need to declare our batch size. The larger the batch size, the smoother the convergence will be toward the optimal value.  But if the batch size is too large, the optimization algorithm may get stuck in a shallow local minimum that the noisier stochastic updates could have jumped out of.\n",
    "\n",
    "Here we may change the batch size anywhere from 1 to 100 to see its effect on the convergence plots at the end."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Declare batch size\n",
    "batch_size = 25"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "### Generate the Data\n",
    "\n",
    "The data we will create is 100 random samples from a `Normal(mean = 1, sd = 0.1)`.  The target will be an array of size 100 filled with the constant 10.0.\n",
    "\n",
    "We also create the necessary placeholders in the graph for the data and target.\n",
    "\n",
    "Note that here, our placeholders have shape `[None, 1]`, where the batch size will take the place of the `None` dimension."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Create data\n",
    "x_vals = np.random.normal(1, 0.1, 100)\n",
    "y_vals = np.repeat(10., 100)\n",
    "x_data = tf.placeholder(shape=[None, 1], dtype=tf.float32)\n",
    "y_target = tf.placeholder(shape=[None, 1], dtype=tf.float32)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "### Model Variables and Operations\n",
    "\n",
    "We create the one variable in the graph, `A`.  We then create the model operation, which is just the multiplication of the input data and `A`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Create variable (one model parameter = A)\n",
    "A = tf.Variable(tf.random_normal(shape=[1,1]))\n",
    "\n",
    "# Add operation to graph\n",
    "my_output = tf.matmul(x_data, A)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "### Loss Function\n",
    "\n",
    "For this, we choose the L2 loss.  We can easily choose the L1 loss by replacing `tf.square()` with `tf.abs()`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Add L2 loss operation to graph\n",
    "loss = tf.reduce_mean(tf.square(my_output - y_target))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "### Optimization and Initialization\n",
    "\n",
    "For the optimization function, we will use standard gradient descent with a learning rate of `0.02`.  We also add and run a variable initialization operation."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": [
    "# Initialize variables\n",
    "init = tf.global_variables_initializer()\n",
    "sess.run(init)\n",
    "\n",
    "# Create Optimizer\n",
    "my_opt = tf.train.GradientDescentOptimizer(0.02)\n",
    "train_step = my_opt.minimize(loss)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "### Train Model\n",
    "\n",
    "We run the training step for 100 iterations and print out the value of `A` and the loss every 5 iterations.\n",
    "\n",
    "Note that here we select a batch of data instead of just one data point."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true,
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Step #5 A = [[ 1.87009048]]\n",
      "Loss = 65.4902\n",
      "Step #10 A = [[ 3.36440945]]\n",
      "Loss = 45.8059\n",
      "Step #15 A = [[ 4.58703709]]\n",
      "Loss = 29.0102\n",
      "Step #20 A = [[ 5.57683086]]\n",
      "Loss = 19.0908\n",
      "Step #25 A = [[ 6.38111591]]\n",
      "Loss = 13.8178\n",
      "Step #30 A = [[ 7.03755045]]\n",
      "Loss = 10.0696\n",
      "Step #35 A = [[ 7.5764699]]\n",
      "Loss = 6.38239\n",
      "Step #40 A = [[ 8.01234055]]\n",
      "Loss = 5.45151\n",
      "Step #45 A = [[ 8.35609531]]\n",
      "Loss = 2.84672\n",
      "Step #50 A = [[ 8.64307308]]\n",
      "Loss = 2.26105\n",
      "Step #55 A = [[ 8.84179211]]\n",
      "Loss = 2.03782\n",
      "Step #60 A = [[ 9.0474844]]\n",
      "Loss = 1.74324\n",
      "Step #65 A = [[ 9.21524715]]\n",
      "Loss = 1.54201\n",
      "Step #70 A = [[ 9.32061958]]\n",
      "Loss = 1.09765\n",
      "Step #75 A = [[ 9.4463768]]\n",
      "Loss = 1.29265\n",
      "Step #80 A = [[ 9.53134918]]\n",
      "Loss = 0.934568\n",
      "Step #85 A = [[ 9.59794903]]\n",
      "Loss = 0.810933\n",
      "Step #90 A = [[ 9.66680145]]\n",
      "Loss = 1.40375\n",
      "Step #95 A = [[ 9.71764565]]\n",
      "Loss = 1.36738\n",
      "Step #100 A = [[ 9.73819923]]\n",
      "Loss = 0.717119\n"
     ]
    }
   ],
   "source": [
    "loss_batch = []\n",
    "# Run Loop\n",
    "for i in range(100):\n",
    "    rand_index = np.random.choice(100, size=batch_size)\n",
    "    rand_x = np.transpose([x_vals[rand_index]])\n",
    "    rand_y = np.transpose([y_vals[rand_index]])\n",
    "    sess.run(train_step, feed_dict={x_data: rand_x, y_target: rand_y})\n",
    "    if (i+1)%5==0:\n",
    "        print('Step #' + str(i+1) + ' A = ' + str(sess.run(A)))\n",
    "        temp_loss = sess.run(loss, feed_dict={x_data: rand_x, y_target: rand_y})\n",
    "        print('Loss = ' + str(temp_loss))\n",
    "        loss_batch.append(temp_loss)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "deletable": true,
    "editable": true
   },
   "source": [
    "# Plot Stochastic vs Batch Training\n",
    "\n",
    "Here is the matplotlib code to plot the loss for each training method."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {
    "collapsed": false,
    "deletable": true,
    "editable": true
   },
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXcAAAEACAYAAABI5zaHAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3X2cjXX++PHXe0xDaEREuS/FKIaRSNS4mWXLUqK0qVS7\nUcSvLZvaatTWxkbk282mmyXbDaWWVjea5UjbrSLFIIqkDNJMbmNm3r8/PmeOMXNm5szNuZ338/E4\nD2eu6zrX9T6nq/e5zvv63IiqYowxJrbEhTsAY4wxVc+SuzHGxCBL7sYYE4MsuRtjTAyy5G6MMTHI\nkrsxxsSgMpO7iJwpIqtE5HPvvzkiMk5E6ovIEhHZICLviEi9UARsjDGmbFKedu4iEgd8D3QDxgI/\nqerfReQOoL6qTgxOmMYYY8qjvGWZfsBmVd0GDAbmeJfPAS6pysCMMcZUXHmT+xXAi97njVU1C0BV\ndwCNqjIwY4wxFRdwcheR44BBwCveRTZugTHGRKj4cmz7W+AzVd3t/TtLRBqrapaINAF2+nuRiNiX\ngDHGVICqSkVfW56yzJXAS4X+XgSM9D6/FlhY0gtV1R6qpKenhz2GSHnYZ2GfhX0WpT8qK6DkLiLH\n426mvlZo8RQgTUQ2eNdNrnQ0xhhjqkRAZRlVPUiRG6aqugeX1I0xxkQY66EaQqmpqeEOIWLYZ3GU\nfRZH2WdRdcrVialCBxDRYB/DmGBr1aoVW7duDXcYJga1bNmSLVu2FFsuImglbqhacjcmAN7/0cId\nholBJZ1blU3uVpYxxpgYZMndGGNikCV3Y4yJQZbcjYlCr7zyCikpKaSkpNC+fXtGjBjhW3ffffeR\nm5tbqf337t2bN998s7JhArB161aefvrpY5YNHDiQb7/9tlz7iYuL48CBA1USU3VgN1SNCUAk3VDd\nsWMHHTt2ZPXq1Zx66qkArFmzho4dOwIuCe7bt4/atWtX+Bi9e/dmwoQJXHTRRZWO1+PxMGHCBD79\n9NNK7adGjRrs3bu3Uu8rEtkNVWMM4JJ7QkIC9evX9y0rSOxjx45FROjRowcpKSn88ssv7Ny5kyFD\nhpCcnExycjJz5871vW79+vX079/f7zqPx0OvXr1o06YNd955p2/5I488Qrdu3ejSpQvnn38+X3zx\nBQAHDx7k8ssv5+yzz6Zz584MHz7cF1NmZiYpKSlcfvnlALRu3Zp169YB8MMPPzB06FCSk5Pp1KkT\nU6ZM8fu+S/pyffvtt0lJSaFTp06kpaWxefNmADZu3EiPHj3o3LkzHTt25JFHHgFg4cKFdOzYkZSU\nFDp27Mh7771Xjk8/ioRgfAQ1JtpF0nmcn5+vl1xyiTZs2FCHDh2qM2bM0J9++sm3XkT0wIEDvr+v\nuOIKvffee1VV9ccff9RTTjlF165dq7m5uXrmmWfqggULfNvu2bNHVVVTU1N1+PDhqqqak5OjDRs2\n1E2bNqmq6u7du33bZ2RkaPfu3VVV9fXXX9cBAwb41mVnZ6uqqsfj0a5dux7zHlq1aqVr165VVdXe\nvXvrtGnTfOsKv5fCRET3799/zLKdO3dqo0aNdP369aqq+uyzz2q3bt1UVXX8+PE6efLkYvEkJyfr\nRx995Pss9+7d6/d4oVLSueVdXuHca1fuxlQBkco9yncs4fXXX2f58uX06dOHxYsXk5ycTHZ2tm8b\nLXSVm5GRwahRowBo0qQJF198McuWLWPDhg3k5eUxZMgQ37aFfw0MGzYMgMTERJKSknxXxJ9++ikX\nXnghHTp04E9/+pPvyj05OZnMzExuueUWXn31VRISEsp8L/v37+eDDz7g1ltv9S1r0KBBwJ/Fxx9/\nTKdOnWjbti0A1113HatXr2b//v1ccMEFPPPMM9x7770sW7aMevXcTKB9+/bl1ltvZerUqaxbt466\ndesGfLxoYsndmCqgWrlHRbRv356bbr
qJJUuWkJiYiMfj8budiCBFvkEK6rxaysFr1arle16jRg1y\nc3M5cuQIw4YNY+bMmXz55Ze8/fbb/Prrr4Artaxdu5a0tDQyMjJITk7m8OHDZb6PQO9nFH0P4L7E\n/L03gCFDhrBixQratGnD5MmTufrqqwGYNm0aTz/9NDVr1mTYsGE8++yzZR47GoUmue/fH5LDGFMd\n/PDDD3z00Ue+v7///nt2797NaaedBrgr7ZycHN/6fv36MWvWLMDV69966y369OlDu3btOO6441iw\nYIFv2z179pR67EOHDpGXl0ezZs0AePzxx33rtm/fTlxcHIMGDeKRRx5h9+7d7Nmzp1g8hdWpU4ce\nPXowffp037KffvrJ77b+vgDOO+88Vq9ezcaNGwGYPXs2nTt3pk6dOmzevJnGjRtzzTXXkJ6e7ruh\nu3HjRs466yxuueUWRowYUekbvZGqPJN1VNyKFTBgQEgOZUysy83NJT09ne+++45atWqhqjz44IO+\nm6q33XYbvXv3pnbt2ng8Hh599FFGjRpFcnIyAFOmTKFdu3aAu7k4ZswY7rvvPmrUqMHtt9/OVVdd\nVeLV8AknnMD999/POeecQ8OGDRk6dKhvmy+//JKJEycCkJ+fz1133UWTJk1o1KgRbdu2pUOHDiQl\nJTF//vxj9j937lzGjBnD7NmziY+P5/e//z0TJkwo9r5FhLZt2/qu9OvWrUtmZibPP/88V155JXl5\neTRq1Ih//etfAMyfP58XXniBhIQE4uLimDlzJgATJ05k06ZN1KhRg/r168fslXtomkLedhtMnRrU\n4xgTTJHUFNLEluhuCrl3b0gOY4wxxrFOTMYEwK7cTbBE95W7McaYkLLkbowxMciSuzHGxCBL7sYY\nE4NCl9xXroRYHaDHGGMiTOiS+4YNMGNGyA5nTCxr1aoV7du3p3PnzrRv355Ro0aRl5dX5uvmzJnD\npk2bytxu+fLldO3aNaBYrrvuOp544omAtg2V9PR0XnnllaAe44EHHvCNgNm1a1eWLFniW3fw4EGG\nDx/OGWecQfv27Vm8eHFQY/EnoB6qIlIPeAY4G8gHrgc2AvOAlsAW4HJV9d/HGKBvXxg7FnJzIT40\nHWONiVUiwoIFC0hKSkJV6dmzJ6+99ppvsK+SzJ49m0aNGtGmTZuAjhGt7rvvvqAfo1u3btx+++3U\nqlWLNWvWcOGFF7Jjxw5q1qzJ1KlTSUxM5Ouvv2bTpk306tWLzZs3h3Qs+kCv3B8F3lTVJCAZWA9M\nBDJUtS2wFLizlNdDkybQrBl89lklwjXGFChoG33gwAEOHTrkG9Fx6dKl9OjRgy5dupCcnMz8+fMB\nl9hXrlzJuHHjSElJYenSpQA89NBDdOzYkU6dOtGzZ0/f/o8cOcLo0aNJTk6mc+fObNiwoVzxlTSO\nvKpy8803+3559OrVC4Bdu3aRlpbm2/62224r8xgffvghXbp0ISUlhQ4dOjBv3jzg2F8TgwcP9s1a\n1bJlS7p06QK4cXaGDRtG9+7dSU5OZvLkyeV6f2lpab7B1Tp27Iiq+sbFmTdvHqNHjwagTZs2dO3a\nlbfeeqtc+6+0ssYEBk4ANvtZvh5o7H3eBFhfwuuPDlB8662qDzxQnqGOjYkIlDWee3q6/wEf09PL\n3r6kbUrRqlUrTUpK0k6dOmliYqIOHTrUty47O1vz8/NVVTUrK0ubNWvmG8s8NTVVFy9e7Nt29uzZ\n2qNHD923b5+qHh3P3ePxaEJCgn7xxReqqvrggw/qiBEj/MYycuRIffzxx4stL2kc+VWrVmlSUtIx\n8aqqTp8+XUePHl1seWkGDx6sL7/8su/vnJycEmPKycnR5ORkXbhwoaqqpqWl6YoVK1RV9fDhw9qr\nVy/NyMhQVdVx48Zp586d/T6++eabYnHMnj1bu3Tp4vv7hBNOOGbc+5tvvlmnT5/u9z2UdG5RyfHc\nA6
mPnAbsFpF/4q7aVwL/z5vYs7zZe4eINCpzT/36wcMPw1/+Uq4vIGMi3qRJ7hGs7f0oKMscPnyY\nIUOGMHPmTMaNG8fOnTu57rrr+Prrr4mPj+fnn39mw4YNnHvuucX2sXjxYm666Sbq1KkDHDuee9u2\nbX2DkXXv3p3//Oc/5YovIyPDN/tR4XHkr776anJzc7nhhhvo3bs3AwcO9B1jxowZ3HHHHVxwwQX0\n79+/zGP07t2bBx54gE2bNpGWlub3PYL7FTJkyBCuv/56Bg0axIEDB/B4POzevdv3C2jfvn1kZmbS\nt29fHn300YDf5/Lly0lPTycjI8O3LBJKWoEk93ggBRijqitFZDquJBNwX+xJBSfx4cOk9upFarnD\nNMYUVZCUEhISGDhwIIsXL2bcuHHcdNNNDB48mNdeew1wSfrQoUOl7sMff+O5l0dJ48gnJiby1Vdf\n4fF4yMjI4I477mDVqlV0796dVatW8e677zJ37lwmT57MihUrSj3G+PHjGTRoEBkZGdxyyy3079+f\n+++/v9h2N954Ix06dGDcuHGAG7UyLi6OlStXEhdXvDo9fvx4v9PvFdzraN26NeDKQtdccw2LFi06\n5j5GixYt2Lp1KyeddBIA3333HX369Cn1vXg8nhLH5K+Qsi7tgcbAN4X+7gn8B8jk2LJMZgmv9/uT\nw5hoEmnnceFp6vLy8nT48OF62223qarqOeeco4sWLVJV1SVLlmhcXJwuX75cVVUHDRqkL774om8/\nc+bM0R49evimmiuY4q7o1Hj+psorMHLkSH3ssceKLR8+fLhOmjRJVV1ZpmnTprpu3TrdtWuX/vzz\nz77Y27dvrx9++KF+++23euTIEVVV/f7777V27dqqqrp9+3Zt166d32Nv3LjR9/yFF17Q/v37+2Iq\nKMukp6frJZdcUuy1/fr107/+9a++v7dt26Y7duzwexx/PvnkE23RooV+8sknxdZNmjRJb7zxRl+M\nTZo08ZW+iirp3CLYZRlVzRKRbSJypqpuBPoCa72PkcAU4FpgYdV95RhjSiMiDB06lFq1anH48GHO\nPvts7rnnHsDdIL355ptJT0+na9euvnHcwV3B3n777UydOpWHH36Ya665hu3bt9O9e3fi4+NJTEys\n0ITR9957L1OmTPHNjDRr1ixmzpzJjTfeeMw48klJSaxatYo//vGP5OXlkZuby0UXXUT37t2ZPXs2\n06ZNIz4+HlXlqaeeAtzkJMcdd5zf486cOZNly5aRkJBArVq1eOyxx3yfT4H777+ftm3b0rlzZ9+Y\n8C+99BL/+te/uPXWW0lOTkZVSUxM5LnnnqNx48YBvecxY8Zw6NAhRo0a5Xvfc+fO5ayzzmLChAmM\nHDmSM844g/j4eJ5++mlf6StUAhoVUkSScU0hjwO+Aa4DagDzgebAd8AwVc3281oN5BjGRDIbFTJ8\npk+fTuPGjfn9738f7lCCIlijQtqQv8YEwJK7CZaoHvLX5uowxpjQCklyL1bCe+MNGD8+FIc2xphq\nKSTJ3dsR7qgzz4TXX3ddOIwxxlS5kAzy4je5q8LXX7vnxkS4li1bRkTHFBN7WrZsGZT9hiS5f/MN\n/PQTeNvzg4jrrZqRYcndRIUtW7aEOwRjyiUkZZmePWHZsiILC5K7McaYKheS5N6nj5/STL9+8Omn\nVnc3xpggCF9yb9zY1WusjmmMMVUuJMk9ORl274bt24usKKFLsTHGmMoJSXKPi4PUVD9X78YYY4Ii\nZHOo+i3NGGOMCYqQJ3e7f2qMMcEXsuTetq2bG/ubb4qsyMmB1atDFYYxxlQLIUvuIu7q/b//LbIi\nMxOuvTZUYRhjTLUQsuQOJdTdzzkHtm6FrKxQhmKMMTEtLMn9mLp7fLw1pTHGmCoW0uTesiUkJsJX\nXxVZ0a8fvPtuKEMxxpiYFtLkDqUMRZCRYU1pjDGmikRGcm/bFoYN
g0OHQh2OMcbEpJDPoZqVBe3a\nwa5drtxujDGmuKiYQ7Wwxo2hWTP4/PNQH9kYY6qPkCd3sKEIjDEm2Cy5G2NMDAqo5i4iW4AcIB84\noqrnikh9YB7QEtgCXK6qOX5eq0WPkZ0NzZu7YYBr1qz0ezDGmJgTqpp7PpCqqp1V9VzvsolAhqq2\nBZYCdwZ60BNPhKQk+OijIiu+/RYmTAh0N8YYY0oQaHIXP9sOBuZ4n88BLinPgf2WZho1giefhP37\ny7MrY4wxRQSa3BV4R0Q+FZE/eJc1VtUsAFXdATQqz4H9Jve6daFLF1ixojy7MsYYU0SgLc17qOoO\nEWkELBGRDbiEH5BJkyb5nqemppKamkrPnrBqlbtIr1On0MYFvVUHDAh098YYE/U8Hg8ej6fK9lfu\nTkwikg7sA/6Aq8NniUgTYJmqJvnZvtgN1QIXXgh33lkkj3/4Idx0k43xboyp1oJ+Q1VEaotIXe/z\nOsBvgC+BRcBI72bXAgvLe3C/pZmuXW0IYGOMqaRAyjKNgddFRL3bv6CqS0RkJTBfRK4HvgOGlffg\nffrArbcWjSjeNaM56aTy7s4YY4xXyMeWKezwYWjYELZsgQYNghqGMcZElagbW6awhATo0QOWLw9n\nFMYYE3vCmtzBhiIwxphgsORujDExKOzJvXNn+PFH9ygmOzvk8RhjTCwIe3KvUcO1d1+2rMiK3buh\ndWvIzQ1LXMYYE83CntyhhNJMw4Zu6MjPPgtLTMYYE80iIrn37VtC3T0tDd59N+TxGGNMtIuI5J6U\nBAcOuBF/j1EwzowxxphyiYjkLlJCaeaCC1xZZt++sMRljDHRKiKSO5SQ3OvUgSuv9HNJb4wxpjRh\nHX6gsG+/db1Vf/jBXckbY0x1FtXDDxTWujXUqgWZmeGOxBhjol/EJHew3qrGGFNVLLkbY0wMipia\nO7ghCM46C3btcj1XjTGmuoqZmjvAKadAkyYlzLA3axasXRvymIwxJhpFVHKHUkozmzbByy+HPB5j\njIlG0ZPcL70U/v3vkMdjjDHRKKJq7gB79kCrVm5QyISEQivy86FpU1ixAtq0qfI4jTEmksRUzR3c\nXKpnnAGffFJkRVwcDBoECxeGJS5jjIkmEZfcoZTSzCWXWGnGGGMCEF3JvU8feOKJkMdjjDHRJuJq\n7uAGgWzSBHbuhNq1gxSYMcZEsJDV3EUkTkQ+F5FF3r9bichHIrJBRF4SkfiKBlFU3brQqRP8739V\ntUdjjKleylOWGQ+sK/T3FGCaqrYFsoEbqjIwG4rAGGMqLqDkLiLNgIuAZwot7gMs8D6fA1xalYFZ\ncjfGmIoL9Mp9OjABUAAROQn4WVXzveu/B06tysDOOw/WrYPsbD8rVd1ANMYYY/wqs04uIhcDWaq6\nWkRSCxZ7H4WVeNd00qRJvuepqamkpqaWtKlPzZrQvTu8955r3n6MHTvcCGNZWXDccWXuyxhjIp3H\n48Hj8VTZ/spsLSMifwNGALnA8cAJwL+B3wBNVDVfRLoD6ar6Wz+vL3drmQIPPeTy94wZflZ26wZ/\n+xv07VuhfRtjTCQLemsZVb1LVVuo6mnAcGCpqo4AlgHDvJtdC1R519G+fUupu1uHJmOMKVFlOjFN\nBP4kIhuBBsCzVRPSUSkpsG2bm1e1mILkHuR2+sYYE40ishNTYVdf7WrvY8YUWaEK7drBiy9Cly6V\nC9IYYyJMzA0cVtRll8GCBX5WiMAtt0BOTshjMsaYSBfxV+4HD7qhCDZtgkaNqjAwY4yJYDF/5X78\n8TBggN07NcaY8oj45A6llGaMMcb4FfFlGXCjRJ56KmzdCvXrV1FgxhgTwWK+LANulMi+fWHRonBH\nYowx0SEqkjuUUZp56SV47bWQxmOMMZEsapL7734HHg/88ksJGzz3XCjDMcaYiBY1yb1ePejVCxYv\n9rPyoovcCGN794Y8LmOMiURR
k9yhlNJMvXpujOB33gl5TMYYE4miKrkPHgzvvgv79/tZaQOJGWOM\nT1Ql95NOgnPPhbff9rNy0CB4803IzQ15XMYYE2miKrlDKaWZpk1hzRqIr7J5uo0xJmpFRSemwrKy\noG1bNxlTrVpVtltjjIko1aITU2GNG0Nysqu9G2OM8S/qkjvA0KHw6qvhjsIYYyJX1JVlALZvh44d\n4ccfISGhSndtjDERodqVZcDdOz3zTFi2zM9KVXdj1RhjqrGoTO5QSmlGFfr3d7N7GGNMNRW1yX3I\nEFi40E+z9rg41+Z94cKwxGWMMZEgapN769bQvDmsWOFnpfVWNcZUc1Gb3KGU0kyfPvDll7BzZ8hj\nMsaYSBDVyf2yy+D11yE/v8iKmjVd3f2NN8ISlzHGhFuZyV1EaorIxyKySkS+FJF07/JWIvKRiGwQ\nkZdEJOT9/s88Exo2hA8+8LPyj3+EBg1CHZIxxkSEgNq5i0htVT0gIjWA/wHjgT8Br6rqKyLyJLBa\nVZ/y89oqb+de2H33QXY2TJ8etEMYY0zIhaSdu6oe8D6tCcQDCvQGCobwmgNcWtEgKqNgILEg98Uy\nxpioElByF5E4EVkF7ADeBTYD2apaUO3+Hjg1OCGW7qyzoHZt+PTTcBzdGGMiU0B1cm8S7ywiicDr\nQJK/zUp6/aRJk3zPU1NTSU1NLVeQpRE5evV+7rlVtltjjAkpj8eDx+Opsv2Ve2wZEbkXOAD8GWii\nqvki0h1IV9Xf+tk+qDV3gM8/h8svh6+/dsneGGOiXdBr7iLSUETqeZ8fD/QD1gHLgGHeza4FwtYl\ntHNn1xzyiy/8rHz3Xbj//pDHZIwx4RRIzf0UYJmIrAY+Bt5R1TeBicCfRGQj0AB4Nnhhlq5waaaY\n1q3hsccgJyfkcRljTLhE5ZC//nz8MYwcCZmZflaOHAktWtgVvDEmalS2LBMzyT0/H1q2hHfegfbt\ni6zcsgW6dHGZ/+STgx6LMcZUVrUcz92fuDg3UqTf0kyrVjBiBDz4YKjDMsaYsIiZ5A5lTL/3l7+4\n2bWtt5MxphqImbIMQF6em6Xp/fehTZuQHNIYY4LCyjKF1KgBl15aQmnGGGOqkZhK7lBGacYYY6qJ\nmCrLgJt275RTYOVK13rGGGOikZVlioiPh8GDAyjN7NhhN1eNMTEr5pI7lNJbtYCqm6kpIyNkMRlj\nTCjFXFkG4PBhaNLETaPatGkJG82fD3//uxsr2EYbM8ZEGCvL+JGQAAMHuvlVSzR0qLuCt6Y1xpgY\nFJPJHQIozcTFwUMPwd13u7uwxhgTQ2I2uf/mN7BqFezcWcpGaWmuac2cOSGLyxhjQiFmk/vxx8OA\nAfDvf5eykQg8/jhceGHI4jLGmFCI2eQOAZRmwA0haWMVGGNiTEy2limwb59rLfPtt9CgQVhCMMaY\nCrHWMqWoWxf69oVFi8IdiTHGhFZMJ3dwpZmnn4Zffgl3JMYYEzrVIrknJbnS+oIFZYw48N13MHt2\nqEIzxpigiemae2ErVsCoUXD66W6+bL+DimVluW+Bzz+3UceMMWFlNfcA9eoFq1dD9+5uOtVp0/z0\nXWrcGG6+GSZNCkeIxhhTZarNlXthmzbBTTfBrl3w1FPQrVuhlTk5cMYZ4PH4mWnbGGNCw67cK6BN\nG1iyBCZMgEsugbFjXU4HoF49+POf3bAExhgTpcpM7iLSTESWisg6EflSRMZ5l9cXkSUiskFE3hGR\nesEPt+qIwFVXwdq1bhTJs85yMzipAmPGwLp18OOP4Q7TGGMqpMyyjIg0AZqo6moRqQt8BgwGrgN+\nUtW/i8gdQH1Vnejn9RFXlvHn/ffdDdfWrd0N11bNct3MH8YYEwZBL8uo6g5VXe19vg/IBJrhEnzB\niFtzgEsqGkQk6NnTDTTWoweccw5MnRHPkSPhjsoYYyqmXDdURaQV4AHOBrapav1C635S1ZP8vC
Yq\nrtwL27TJNZrZudPPDVdjjAmByl65B1x38JZkXgXGq+o+EQk4Y08q1LQwNTWV1NTUcoQYem3awDvv\nwEsvuRuuQ4bA3/7m7rUaY0wweDwePB5Ple0voCt3EYkH/gO8paqPepdlAqmqmuWtyy9T1SQ/r426\nK/fC9uyBW8YqB/Yrry+slo2LjDFhEKqmkM8B6woSu9ciYKT3+bXAwooGEckaNIDZjW7nrPee5OOP\nwx2NMcYEJpDWMucD7wFfAup93AV8AswHmgPfAcNUNdvP66P6yh2Ades4cF5fpjZ/lHu/ujzc0Rhj\nqoHKXrlXyx6qFXHkszX83K0/WXfNpMP9w8IdjjEmxlkP1RA5rktHPnvwHU6ZPA6d/0q4wzHGmFLZ\nlXs55OfDFe2+YFqzR2jx39mum6sxxgSBXbmHUFwcXPtIMhftnENeviV2Y0zksuReThdfDImJrg28\nMcZEKivLVIDHAzfcAJmZkJAQ7miMMbHIyjJhkJrqerE+95x3wYED8N574QzJGGOOYcm9gh58EP76\nV5fX2bIFhg2DhTHZj8sYE4UsuVfQOee4Kfsefxw3Y9Obb8KNN8KiReEOzRhjrOZeGevWuRLN1197\nBxVbudLdcX3mGfjd78IdnjEmilnNPYzat4eLLoJHHvEuOOccWLwY/vAH+PDDsMZmjKne7Mq9kr79\n1uX09euhUSPvwnXr4PTToWbNsMZmjIleNrZMBBg71jWJ9F3BG2NMJVlyjwA//ghnnw2rV0Pz5uGO\nxhgTCyy5R4g774SffoJZs0rYQNXGojHGBMySe4TYswfOPBM++MD9W8zll7sWNCNGWJI3xpTJknsE\nefBB+OqrEsadWbMGrrwSkpPhiSfgxBNDHp8xJnpYU8gIMn68G3fmiy/8rOzY0bWDr18fOnWC998P\ndXjGmGrErtyr2MyZ8O678MYbpWz0n//AH/8I8+bBBReELDZjTPSwskyE+fVXV3N/8UU4//xSNty5\nE046CWrUCFlsxpjoYck9Aj33HMyZ40o0du/UGFMRVnOPQNdcA1lZrjxTbtXsi9AYExyW3IMgPt4N\nB3zXXeXM1Xv3wrnnwooVQYvNGFM9WHIPkssucxNqv/ZaOV50wgmQnu7axN9zDxw5ErT4jDGxrczk\nLiLPikiWiKwptKy+iCwRkQ0i8o6I1AtumNEnLs61e7/7bsjLK8cLBw6EVavg00+hVy/YvDloMRpj\nYlcgV+7/BPoXWTYRyFDVtsBS4M6qDiwWDBjgRor817/K+cImTdzkH1de6Zrc7NkTlPiMMbEroNYy\nItISeEMm1Nu6AAAPSUlEQVRVO3r/Xg9cqKpZItIE8KhquxJeW+1ayxS2YgVcfTVs2FDBEYB374aG\nDas8LmNMZAtXa5mTVTULQFV3AI3K2L7a6tXLTerx9NMV3IEldmNMBcSH4iCTJk3yPU9NTSU1NTUU\nh40YDz7oZmwaORLq1q2inebnu8K+MSYmeDwePB5Ple2vomWZTCC1UFlmmaomlfDaal2WKTB2LCxZ\n4ibUTkur5M4++8xNxj13rvtZYIyJOaEqy4j3UWARMNL7/FpgYUUDqC4ee8zN1DRqFFxxBfzwQyV2\nlpICo0fDhRfC//2fdXwyxhQTSFPIF4EPgDNF5DsRuQ6YDKSJyAagn/dvU4aBA92QwGec4QaJnDED\ncnMrsCMRN/DYBx+4pji//a2bDsoYY7xsbJkwWb8exoxxrRyffBK6d6/gjo4ccUX9N95wQwrbYDbG\nxAQbOCyKqbqJPW6/3V3VP/SQGyiyQg4cgNq1qzQ+Y0z42MBhUUwEfv97yMx0beDPOgv++U/XEKbc\nLLEbYwqxK/cI8tlncNNNLtE/8QR06FDJHRaMTXPccZWOzRgTWnblHkO6dIEPP3RX8336wIQJsG9f\nJXb4wgtu+IKNG6ssRmNMdLDkHmFq1HBX71995caEb9/ejS
xZoR8/117rek6dfz489VTMNJn8/HPX\nCnTTpvDGkZVlw/6YyGVlmQi3fLlL9q1buybtp51WgZ1kZsKIEXDqqfDMM9C4cZXHGSrLl8OwYTBk\nCLzzjptnvGnT0Mexdaub/rZ+fRdDlfU8NsbLyjIx7sILYfVql0jOPRf+8Y8KXIAnJbl6z9lnu1pP\nlFq0CIYOdS2M/vEP96WXlubGVgul7duhb1/XyqlrVzd4Z7mGdTYmBOzKPYps2ADDh7ur+GeegQYN\nKrATVf9t4Z9+2k3+2qGD62HVoYN7VOggVe/55+HPf3bN+bt2Pbr8zjshIwP++19ITAx+HFlZ7gv3\n+utdPEeOuD5kZ5/tOqUZU1Xsyr0aadsWPvoIWraETp3gvfcqsJOSOjldcQX8/e+QnOwK/nfdBa1a\nwbRplQm5Sjz6KPzlL7B06bGJHeBvf3M3ogcPhoMHgxvHTz+5XwpXXukSO7iGSK++enTcIGMihV25\nR6nFi+GGG9wQM3ff7eZtrXKq8OuvUKtW8XWrVrmG+QkJQTjw0cOnp8PLL7vJxlu29L9dXh5cdZXr\nx7VgQXBafmZnu1JMWprrbFb0O/Lbb6FHD3j2WTcCqDGVZVfu1dTFF7v8+v770Ls3fPddEA4i4j+x\nA0yd6gbJefJJ9wVQxfLz4ZZbXBlmxYqSEzu4FkbPP+/G6bn++gp2AivF3r2u9NKzp//EDq5U9tpr\nrnHSmjXF1xsTapbco9gpp7hywMCBcM457qo1ZF54AebNc9m3TRtXkzh0qEp2feSIm71qzRrweAJr\n3JOQ4MojW7fC+PFV1+rzwAH3+Xbo4GrqpQ3dc955rkXT735n47iZ8LOyTIz4+GPX+SktzQ0tHNLR\nCD75BO6/H+rVc0m/Eg4cgMsvd8/nzy//+8jJgdRUl2Dvv79SoXDoEAwa5Ka0nT078LlRHngAFi50\nX0x16lQuBlN92cBhxueXX1wN/osvXJ260sMXlNehQyWXcQKQne2ScosWLplWtHa+c6eb3nD0aLj1\n1ort4/BhuOwyOP54ePHF8t3TUHXlmb173a8JmzDLVITV3I1PYqK7cP7zn93wBY8/HuJOqSUl9gCK\n4FlZ7t5Bp05ugqnK3BQ9+WR3A3bGDDcQW3nl5rpfQXFx7vMs781qEZg1y7WumTix/Mc3pipYco8x\nIm7Ugf/9zzVbv/RSl2TCJjvb1eSnTYP9+/1usmWLu9IePBhmzqyaK90WLdz9iLvucjc6A5WX5666\n9+1zZaGKfsnUrOmO++9/V2JydGMqwZJ7jDrzTDdRU5s27mq4CufdLZ8TT4TXX3c9ZE8/HR5++Jgk\nv26dS+xjx8KkSVU710jbtvDmm648k5FR9vb5+W4axO3bXWKuWbNyxz/pJNdk9Z57Aju+MVVKVYP6\ncIcw4fTWW6pNmqjefbfqkSNhDGTNGtVhw1RPPln15Zf1449VGzdWff75Qtvk5VX5Yd97T7VhQ9UP\nPyx5m/x81bFjVXv0UN27t2qPv3y5e8tr11btfk1s8+bOCudeu6FaTezY4co169e7poW1a7vH8ccf\nfV7assLL69SBE044+ih36WLzZt7/JIFLxzXnuefcTVSfm2+GV16B5s2hWTP3b/PmrmaTlFTh9//m\nm3Ddde4KuuiNZlW44w7XA/a//3WNfqra3LmuQ9ZHH7l7AsaUxVrLmIDl57vkvm+fa3JY8Dh48Ni/\nS1u3f7977N179HHccccm+6KPunWP/Xv/fledeeUVN05LsSB37YJt2+D774/+e+ml/ieaveOOo20O\nCz9Gjy42VsHLL8OcW1by7MN7OPWMOi6wxEQmP5HIvLfrkeGJr/g0hwG49153o3fpUvdFaUxpLLmb\nsFJ1XwB797ovjcJJv6THwYOuo1FKShUEsGWL+1lS8K1T8EhNdTccivj40skcefu/nHv2ARJ+3UvO\ntl/QX36BZ5/lxJGXFt
//1KlurJ3ERHdJn5joHgMG+O82+8sv7o7w8ce7rrNFPqurrnLfXy++aE0k\nTeksuRtTTpMnuzLJ8OEwZ44bgO3UU0vY+P334euvXdLOyXH//vKLG2+4c+fi2191levBdPCga0NZ\nUM96/nno149Dh9wYNb17u85OPPqo61Z78snQqJH79+ST3SwtJ5wQzI8hNq1b57494+KOPkTcRAhF\nvmwB1ykCjm4XF+f+m1X2bnoVCGtyF5EBwAxcq5tnVXWKn20suZuIouran8+b5yb/KG3cmkod5PDh\no/Ws+vV93W137XIVpnvvhWubvON+Gezc6R67drl/Z8xwI5EVNW2a6xRQ8CVw8slu3yV9GTz/PGze\nfGyN7eBBuO8+16SqqHvucftv0MDtt+Dfvn3dv+Gg6r5Yt207+rj6av/dl3v2hJ9/dq/Jzz/6WLnS\ntdwqql07N51W4W0PHnS/Bv29302b3OwwIairhS25i0gcsBHoC/wAfAoMV9X1Rbaz5O7l8XhITU0N\ndxgRIRI+i9zcII2mGYDMTFc5mjcPoByfxcKF7sUFXwK7drnkNGuWa/Na1D/+4RJV4Tvixx/vxqlo\n1Kj49m+/7UpdP//sHnv2uH+nTPFb5mLIEPjhB5cI69Z1g/wkJLgvjxYtim8/Z47r+1CwXcGjf384\n8cTi50VamrsLLXL05nrz5m4Et4YNA/vMyisv7+iVfGH5+e5LdMsW90XRsqUbFrtVKxdPFdfZKpvc\nK3Nqnwt8rapbvYG8DAwG1pf6qmosEhJapIiEzyJciR1cw58XX3TD6Ldo4eGyy1Jp0eJo7mratIRW\nSIMHu0egRo8uX2ADBpRv+4cfdl8ye/a4XwWHD7tHSfMOFtwsL9iu4NG9u//kPmuW+/UQjCZMJfFX\nvgGXvNevd0l+xw6X5LdudaPEReANlMqc3k2BbYX+/h6X8I0xAejb100dOGmSy41ffOGGbt62zeWO\nRo2OvVgtnPybN3dNWsOeU04/3T0Cdfvt5dt/69bl274IVTfkUU6O+8GQnV3284JJXwou3EWKPo8D\nTkXkVKCHW5Zx7LbjxsFvflOp0CutMsnd388Fq78YUw7durnHpEnHLs/NdReEhcvM33zj7hFs2+a+\nBHJy3BX+Kacc/RVSNBEFukzVHTPQx5EjxZeBiyM+3l38+vu3tGU1argL4eXLi8da+N/SluXlufvd\nhZO1iKui1Kvn/i38vODfpk2PPi9cTlc9Oj5T0eelrW/XrtT/7CFRmZp7d2CSqg7w/j0R16NqSpHt\nLOEbY0wFhOuGag1gA+6G6o/AJ8CVqppZ0WCMMcZUjQqXZVQ1T0TGAks42hTSErsxxkSAoHdiMsYY\nE3pBu9cuIgNEZL2IbBSRO4J1nEgkIs1EZKmIrBORL0VknHd5fRFZIiIbROQdEQlh+67wEpE4Eflc\nRBZ5/24lIh95P4uXRCSMDRNDR0TqicgrIpIpImtFpFt1PS9E5FYR+UpE1ojICyKSUF3OCxF5VkSy\nRGRNoWUlngciMlNEvhaR1SLip0NDcUFJ7t4OTo8B/YGzgCtFJALuH4dMLvAnVW0PnAeM8b7/iUCG\nqrYFlgJ3hjHGUBsPrCv09xRgmvezyAZuCEtUofco8KaqJgHJuH4h1e68ENeO8BYgRVU74krEV1J9\nzot/4vJjYX7PAxH5LXC6qp4BjAL+EcgBgnXl7uvgpKpHgIIOTtWCqu5Q1dXe5/uATKAZ7jOY491s\nDnBJeCIMLRFpBlwEPFNocR9ggff5HMDPqF2xRUROAHqp6j8BVDVXVXOopucFUAOo4706Px7X0703\n1eC8UNX3gZ+LLC56HgwutPx57+s+BuqJSOOyjhGs5O6vg1PTIB0roolIK6AT8BHQWFWzwH0BAH76\nf8ek6cAEvP0gROQk4GdVLZhc9XugpKG7YslpwG4R+ae3RDVLRGpTDc8LVf0BmAZ8B2wH
coDPgexq\neF4UOLnIeVAw8n/RfLqdAPJpsJK7dXACRKQu8Cow3nsFXx0/g4uBLO8vmYLzQih+jlSHzyYeSAEe\nV9UUYD/up3h1eO/HEJETcVekLXEJvA7wWz+bVrvPxo8K5dNgJffvgcKjBjXD/eSqNrw/NV8F5qrq\nQu/irIKfUyLSBNgZrvhC6HxgkIh8A7yEK8fMwP20LDj/qsv58T2wTVVXev9egEv21fG86Ad8o6p7\nVDUPeB3oAZxYDc+LAiWdB98DzQttF9DnEqzk/inQRkRaikgCMBxYFKRjRarngHWq+mihZYuAkd7n\n1wILi74o1qjqXaraQlVPw50HS1V1BLAMGObdrLp8FlnANhEpGGu3L7CWanhe4Mox3UWklogIRz+L\n6nReFP0FW/g8GMnR974IuAZ8IwNkF5RvSt15sNq5e8d6f5SjHZwmB+VAEUhEzgfeA77E/XxS4C5c\nL975uG/h74BhqpodrjhDTUQuBG5T1UEi0hp3o70+sAoY4b35HtNEJBl3Y/k44BvgOtyNxWp3XohI\nOu4L/wjuHPgD7qo05s8LEXkRSAVOArKAdODfwCv4OQ9E5DFgAK6Ud52qfl7mMawTkzHGxJ5wDxhq\njDEmCCy5G2NMDLLkbowxMciSuzHGxCBL7sYYE4MsuRtjTAyy5G6MMTHIkrsxxsSg/w+hscLysKVJ\n/wAAAABJRU5ErkJggg==\n",
      "text/plain": [
       "<matplotlib.figure.Figure at 0x7ff4069cb710>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "plt.plot(range(0, 100, 5), loss_stochastic, 'b-', label='Stochastic Loss')\n",
    "plt.plot(range(0, 100, 5), loss_batch, 'r--', label='Batch Loss, size=25')\n",
    "plt.legend(loc='upper right', prop={'size': 11})\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true,
    "deletable": true,
    "editable": true
   },
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.5.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
