{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {
    "tags": [
     "pdf-title"
    ]
   },
   "source": [
    "# Convolutional Networks\n",
    "\n",
    "So far we have worked with deep fully-connected networks, using them to explore different optimization strategies and network architectures. Fully-connected networks are a good testbed for experimentation because they are very computationally efficient, but in practice all state-of-the-art results use convolutional networks instead.\n",
    "\n",
    "First you will implement several layer types that are used in convolutional networks. You will then use these layers to train a convolutional network on the CIFAR-10 dataset."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "tags": [
     "pdf-ignore"
    ]
   },
   "outputs": [],
   "source": [
    "# As usual, a bit of setup\n",
    "import numpy as np\n",
    "import matplotlib.pyplot as plt\n",
    "from cs231n.classifiers.cnn import *\n",
    "from cs231n.data_utils import get_CIFAR10_data\n",
    "from cs231n.gradient_check import eval_numerical_gradient_array, eval_numerical_gradient\n",
    "from cs231n.layers import *\n",
    "from cs231n.fast_layers import *\n",
    "from cs231n.solver import Solver\n",
    "\n",
    "%matplotlib inline\n",
    "plt.rcParams['figure.figsize'] = (10.0, 8.0) # set default size of plots\n",
    "plt.rcParams['image.interpolation'] = 'nearest'\n",
    "plt.rcParams['image.cmap'] = 'gray'\n",
    "\n",
    "# for auto-reloading external modules\n",
    "# see http://stackoverflow.com/questions/1907993/autoreload-of-modules-in-ipython\n",
    "%load_ext autoreload\n",
    "%autoreload 2\n",
    "\n",
    "def rel_error(x, y):\n",
    "  \"\"\" returns relative error \"\"\"\n",
    "  return np.max(np.abs(x - y) / (np.maximum(1e-8, np.abs(x) + np.abs(y))))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "tags": [
     "pdf-ignore"
    ]
   },
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "X_train:  (49000, 3, 32, 32)\ny_train:  (49000,)\nX_val:  (1000, 3, 32, 32)\ny_val:  (1000,)\nX_test:  (1000, 3, 32, 32)\ny_test:  (1000,)\n"
     ]
    }
   ],
   "source": [
    "# Load the (preprocessed) CIFAR10 data.\n",
    "\n",
    "data = get_CIFAR10_data()\n",
    "for k, v in data.items():\n",
    "  print('%s: ' % k, v.shape)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Convolution: Naive forward pass\n",
    "The core of a convolutional network is the convolution operation. In the file `cs231n/layers.py`, implement the forward pass for the convolution layer in the function `conv_forward_naive`. \n",
    "\n",
    "You don't have to worry too much about efficiency at this point; just write the code in whatever way you find most clear.\n",
    "\n",
    "You can test your implementation by running the following:"
   ]
  },
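  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a rough illustration of the idea (a hypothetical sketch, not the official solution; your implementation must follow the docstring in `cs231n/layers.py`, including returning a cache), the naive forward pass can simply loop over every output element:\n",
    "\n",
    "```python\n",
    "def conv_forward_sketch(x, w, b, conv_param):\n",
    "    # Naive convolution: zero-pad the input, then slide each filter\n",
    "    # over every stride-spaced spatial location.\n",
    "    N, C, H, W = x.shape\n",
    "    F, _, HH, WW = w.shape\n",
    "    stride, pad = conv_param['stride'], conv_param['pad']\n",
    "    H_out = 1 + (H + 2 * pad - HH) // stride\n",
    "    W_out = 1 + (W + 2 * pad - WW) // stride\n",
    "    x_pad = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)))\n",
    "    out = np.zeros((N, F, H_out, W_out))\n",
    "    for n in range(N):\n",
    "        for f in range(F):\n",
    "            for i in range(H_out):\n",
    "                for j in range(W_out):\n",
    "                    window = x_pad[n, :, i*stride:i*stride+HH, j*stride:j*stride+WW]\n",
    "                    out[n, f, i, j] = np.sum(window * w[f]) + b[f]\n",
    "    return out\n",
    "```"
   ]
  },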
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Testing conv_forward_naive\ndifference:  2.2121476417505994e-08\n"
     ]
    }
   ],
   "source": [
    "x_shape = (2, 3, 4, 4)\n",
    "w_shape = (3, 3, 4, 4)\n",
    "x = np.linspace(-0.1, 0.5, num=np.prod(x_shape)).reshape(x_shape)\n",
    "w = np.linspace(-0.2, 0.3, num=np.prod(w_shape)).reshape(w_shape)\n",
    "b = np.linspace(-0.1, 0.2, num=3)\n",
    "\n",
    "conv_param = {'stride': 2, 'pad': 1}\n",
    "out, _ = conv_forward_naive(x, w, b, conv_param)\n",
    "correct_out = np.array([[[[-0.08759809, -0.10987781],\n",
    "                           [-0.18387192, -0.2109216 ]],\n",
    "                          [[ 0.21027089,  0.21661097],\n",
    "                           [ 0.22847626,  0.23004637]],\n",
    "                          [[ 0.50813986,  0.54309974],\n",
    "                           [ 0.64082444,  0.67101435]]],\n",
    "                         [[[-0.98053589, -1.03143541],\n",
    "                           [-1.19128892, -1.24695841]],\n",
    "                          [[ 0.69108355,  0.66880383],\n",
    "                           [ 0.59480972,  0.56776003]],\n",
    "                          [[ 2.36270298,  2.36904306],\n",
    "                           [ 2.38090835,  2.38247847]]]])\n",
    "\n",
    "# Compare your output to ours; difference should be around e-8\n",
    "print('Testing conv_forward_naive')\n",
    "print('difference: ', rel_error(out, correct_out))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Aside: Image processing via convolutions\n",
    "\n",
     "As a fun way to both check your implementation and gain a better understanding of the type of operation that convolutional layers can perform, we will set up an input containing two images and manually set up filters that perform common image processing operations (grayscale conversion and edge detection). The convolution forward pass will apply these operations to each of the input images. We can then visualize the results as a sanity check."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Colab Users Only\n",
    "\n",
     "Please execute the cell below to copy the two example images to the Colab VM."
   ]
  },
  {
   "source": [
    "# Colab users only!\n",
    "%mkdir -p cs231n/notebook_images\n",
    "%cd drive/My\\ Drive/$FOLDERNAME/cs231n\n",
    "%cp -r notebook_images/ /content/cs231n/\n",
    "%cd /content/"
   ],
   "cell_type": "code",
   "execution_count": null,
   "outputs": [],
   "metadata": {}
  },
  {
   "source": [
    "from imageio import imread\n",
    "from PIL import Image\n",
    "\n",
    "kitten = imread('cs231n/notebook_images/kitten.jpg')\n",
    "puppy = imread('cs231n/notebook_images/puppy.jpg')\n",
    "# kitten is wide, and puppy is already square\n",
    "d = kitten.shape[1] - kitten.shape[0]\n",
    "kitten_cropped = kitten[:, d//2:-d//2, :]\n",
    "\n",
    "img_size = 200   # Make this smaller if it runs too slow\n",
    "resized_puppy = np.array(Image.fromarray(puppy).resize((img_size, img_size)))\n",
    "resized_kitten = np.array(Image.fromarray(kitten_cropped).resize((img_size, img_size)))\n",
    "x = np.zeros((2, 3, img_size, img_size))\n",
    "x[0, :, :, :] = resized_puppy.transpose((2, 0, 1))\n",
    "x[1, :, :, :] = resized_kitten.transpose((2, 0, 1))\n",
    "\n",
     "# Set up convolutional weights holding 2 filters, each 3x3\n",
    "w = np.zeros((2, 3, 3, 3))\n",
    "\n",
    "# The first filter converts the image to grayscale.\n",
    "# Set up the red, green, and blue channels of the filter.\n",
    "w[0, 0, :, :] = [[0, 0, 0], [0, 0.3, 0], [0, 0, 0]]\n",
    "w[0, 1, :, :] = [[0, 0, 0], [0, 0.6, 0], [0, 0, 0]]\n",
    "w[0, 2, :, :] = [[0, 0, 0], [0, 0.1, 0], [0, 0, 0]]\n",
    "\n",
    "# Second filter detects horizontal edges in the blue channel.\n",
    "w[1, 2, :, :] = [[1, 2, 1], [0, 0, 0], [-1, -2, -1]]\n",
    "\n",
    "# Vector of biases. We don't need any bias for the grayscale\n",
    "# filter, but for the edge detection filter we want to add 128\n",
    "# to each output so that nothing is negative.\n",
    "b = np.array([0, 128])\n",
    "\n",
    "# Compute the result of convolving each input in x with each filter in w,\n",
    "# offsetting by b, and storing the results in out.\n",
    "out, _ = conv_forward_naive(x, w, b, {'stride': 1, 'pad': 1})\n",
    "\n",
    "def imshow_no_ax(img, normalize=True):\n",
    "    \"\"\" Tiny helper to show images as uint8 and remove axis labels \"\"\"\n",
    "    if normalize:\n",
    "        img_max, img_min = np.max(img), np.min(img)\n",
    "        img = 255.0 * (img - img_min) / (img_max - img_min)\n",
    "    plt.imshow(img.astype('uint8'))\n",
    "    plt.gca().axis('off')\n",
    "\n",
    "# Show the original images and the results of the conv operation\n",
    "plt.subplot(2, 3, 1)\n",
    "imshow_no_ax(puppy, normalize=False)\n",
    "plt.title('Original image')\n",
    "plt.subplot(2, 3, 2)\n",
    "imshow_no_ax(out[0, 0])\n",
    "plt.title('Grayscale')\n",
    "plt.subplot(2, 3, 3)\n",
    "imshow_no_ax(out[0, 1])\n",
    "plt.title('Edges')\n",
    "plt.subplot(2, 3, 4)\n",
    "imshow_no_ax(kitten_cropped, normalize=False)\n",
    "plt.subplot(2, 3, 5)\n",
    "imshow_no_ax(out[1, 0])\n",
    "plt.subplot(2, 3, 6)\n",
    "imshow_no_ax(out[1, 1])\n",
    "plt.show()"
   ],
    "cell_type": "code",
    "execution_count": null,
    "outputs": [],
   "metadata": {
    "tags": [
     "pdf-ignore-input"
    ]
   }
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Convolution: Naive backward pass\n",
    "Implement the backward pass for the convolution operation in the function `conv_backward_naive` in the file `cs231n/layers.py`. Again, you don't need to worry too much about computational efficiency.\n",
    "\n",
    "When you are done, run the following to check your backward pass with a numeric gradient check."
   ]
  },
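  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "One possible shape for the backward pass (a hedged sketch that assumes the cache stores `(x, w, b, conv_param)`; adapt it to whatever your forward pass actually caches):\n",
    "\n",
    "```python\n",
    "def conv_backward_sketch(dout, cache):\n",
    "    # Each output element out[n, f, i, j] touched one input window and\n",
    "    # one filter, so we accumulate its upstream gradient into both.\n",
    "    x, w, b, conv_param = cache\n",
    "    stride, pad = conv_param['stride'], conv_param['pad']\n",
    "    N, F, H_out, W_out = dout.shape\n",
    "    _, _, HH, WW = w.shape\n",
    "    x_pad = np.pad(x, ((0, 0), (0, 0), (pad, pad), (pad, pad)))\n",
    "    dx_pad = np.zeros_like(x_pad)\n",
    "    dw = np.zeros_like(w)\n",
    "    db = np.sum(dout, axis=(0, 2, 3))\n",
    "    for n in range(N):\n",
    "        for f in range(F):\n",
    "            for i in range(H_out):\n",
    "                for j in range(W_out):\n",
    "                    hs, ws = i * stride, j * stride\n",
    "                    window = x_pad[n, :, hs:hs+HH, ws:ws+WW]\n",
    "                    dw[f] += window * dout[n, f, i, j]\n",
    "                    dx_pad[n, :, hs:hs+HH, ws:ws+WW] += w[f] * dout[n, f, i, j]\n",
    "    dx = dx_pad[:, :, pad:-pad, pad:-pad] if pad > 0 else dx_pad\n",
    "    return dx, dw, db\n",
    "```"
   ]
  },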
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Testing conv_backward_naive function\ndx error:  1.159803161159293e-08\ndw error:  2.247109434939654e-10\ndb error:  3.37264006649648e-11\n"
     ]
    }
   ],
   "source": [
    "np.random.seed(231)\n",
    "x = np.random.randn(4, 3, 5, 5)\n",
    "w = np.random.randn(2, 3, 3, 3)\n",
    "b = np.random.randn(2,)\n",
    "dout = np.random.randn(4, 2, 5, 5)\n",
    "conv_param = {'stride': 1, 'pad': 1}\n",
    "\n",
    "dx_num = eval_numerical_gradient_array(lambda x: conv_forward_naive(x, w, b, conv_param)[0], x, dout)\n",
    "dw_num = eval_numerical_gradient_array(lambda w: conv_forward_naive(x, w, b, conv_param)[0], w, dout)\n",
    "db_num = eval_numerical_gradient_array(lambda b: conv_forward_naive(x, w, b, conv_param)[0], b, dout)\n",
    "\n",
    "out, cache = conv_forward_naive(x, w, b, conv_param)\n",
    "dx, dw, db = conv_backward_naive(dout, cache)\n",
    "\n",
    "# Your errors should be around e-8 or less.\n",
    "print('Testing conv_backward_naive function')\n",
    "print('dx error: ', rel_error(dx, dx_num))\n",
    "print('dw error: ', rel_error(dw, dw_num))\n",
    "print('db error: ', rel_error(db, db_num))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Max-Pooling: Naive forward\n",
    "Implement the forward pass for the max-pooling operation in the function `max_pool_forward_naive` in the file `cs231n/layers.py`. Again, don't worry too much about computational efficiency.\n",
    "\n",
    "Check your implementation by running the following:"
   ]
  },
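  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of the operation (hypothetical, assuming the `pool_param` keys used in this notebook; your version in `cs231n/layers.py` should also return a cache):\n",
    "\n",
    "```python\n",
    "def max_pool_forward_sketch(x, pool_param):\n",
    "    # Take the max over each (pool_height x pool_width) window.\n",
    "    N, C, H, W = x.shape\n",
    "    ph = pool_param['pool_height']\n",
    "    pw = pool_param['pool_width']\n",
    "    s = pool_param['stride']\n",
    "    H_out = 1 + (H - ph) // s\n",
    "    W_out = 1 + (W - pw) // s\n",
    "    out = np.zeros((N, C, H_out, W_out))\n",
    "    for i in range(H_out):\n",
    "        for j in range(W_out):\n",
    "            window = x[:, :, i*s:i*s+ph, j*s:j*s+pw]\n",
    "            out[:, :, i, j] = np.max(window, axis=(2, 3))\n",
    "    return out\n",
    "```"
   ]
  },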
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Testing max_pool_forward_naive function:\ndifference:  4.1666665157267834e-08\n"
     ]
    }
   ],
   "source": [
    "x_shape = (2, 3, 4, 4)\n",
    "x = np.linspace(-0.3, 0.4, num=np.prod(x_shape)).reshape(x_shape)\n",
    "pool_param = {'pool_width': 2, 'pool_height': 2, 'stride': 2}\n",
    "\n",
    "out, _ = max_pool_forward_naive(x, pool_param)\n",
    "\n",
    "correct_out = np.array([[[[-0.26315789, -0.24842105],\n",
    "                          [-0.20421053, -0.18947368]],\n",
    "                         [[-0.14526316, -0.13052632],\n",
    "                          [-0.08631579, -0.07157895]],\n",
    "                         [[-0.02736842, -0.01263158],\n",
    "                          [ 0.03157895,  0.04631579]]],\n",
    "                        [[[ 0.09052632,  0.10526316],\n",
    "                          [ 0.14947368,  0.16421053]],\n",
    "                         [[ 0.20842105,  0.22315789],\n",
    "                          [ 0.26736842,  0.28210526]],\n",
    "                         [[ 0.32631579,  0.34105263],\n",
    "                          [ 0.38526316,  0.4       ]]]])\n",
    "\n",
    "# Compare your output with ours. Difference should be on the order of e-8.\n",
    "print('Testing max_pool_forward_naive function:')\n",
    "print('difference: ', rel_error(out, correct_out))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Max-Pooling: Naive backward\n",
    "Implement the backward pass for the max-pooling operation in the function `max_pool_backward_naive` in the file `cs231n/layers.py`. You don't need to worry about computational efficiency.\n",
    "\n",
    "Check your implementation with numeric gradient checking by running the following:"
   ]
  },
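  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The key idea is that gradient flows only to the input entries that achieved the max in each window. A sketch, assuming the cache stores `(x, pool_param)`:\n",
    "\n",
    "```python\n",
    "def max_pool_backward_sketch(dout, cache):\n",
    "    # Route each upstream gradient to the argmax of its window\n",
    "    # (ties send gradient to every maximal entry in this sketch).\n",
    "    x, pool_param = cache\n",
    "    ph = pool_param['pool_height']\n",
    "    pw = pool_param['pool_width']\n",
    "    s = pool_param['stride']\n",
    "    N, C, H_out, W_out = dout.shape\n",
    "    dx = np.zeros_like(x)\n",
    "    for n in range(N):\n",
    "        for c in range(C):\n",
    "            for i in range(H_out):\n",
    "                for j in range(W_out):\n",
    "                    window = x[n, c, i*s:i*s+ph, j*s:j*s+pw]\n",
    "                    mask = (window == np.max(window))\n",
    "                    dx[n, c, i*s:i*s+ph, j*s:j*s+pw] += mask * dout[n, c, i, j]\n",
    "    return dx\n",
    "```"
   ]
  },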
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Testing max_pool_backward_naive function:\ndx error:  3.27562514223145e-12\n"
     ]
    }
   ],
   "source": [
    "np.random.seed(231)\n",
    "x = np.random.randn(3, 2, 8, 8)\n",
    "dout = np.random.randn(3, 2, 4, 4)\n",
    "pool_param = {'pool_height': 2, 'pool_width': 2, 'stride': 2}\n",
    "\n",
    "dx_num = eval_numerical_gradient_array(lambda x: max_pool_forward_naive(x, pool_param)[0], x, dout)\n",
    "\n",
    "out, cache = max_pool_forward_naive(x, pool_param)\n",
    "dx = max_pool_backward_naive(dout, cache)\n",
    "\n",
    "# Your error should be on the order of e-12\n",
    "print('Testing max_pool_backward_naive function:')\n",
    "print('dx error: ', rel_error(dx, dx_num))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Fast layers\n",
    "\n",
    "Making convolution and pooling layers fast can be challenging. To spare you the pain, we've provided fast implementations of the forward and backward passes for convolution and pooling layers in the file `cs231n/fast_layers.py`."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "The fast convolution implementation depends on a Cython extension; to compile it, either follow the local development instructions (Option A) if you are developing locally, or run the Colab cell (Option B) if you are doing this assignment in Colab.\n",
    "\n",
    "---\n",
    "\n",
     "**Very Important, Please Read**. For **both** Option A and Option B, you must **restart** the notebook after compiling the Cython extension. In Colab, save the notebook (`File -> Save`), then click `Runtime -> Restart Runtime -> Yes`. Restarting the kernel discards local variables, so re-execute the cells from top to bottom, skipping the compilation cell below, which only needs to run once.\n",
    "\n",
    "---"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Option A: Local Development\n",
    "\n",
    "Go to the cs231n directory and execute the following in your terminal:\n",
    "\n",
    "```bash\n",
    "python setup.py build_ext --inplace\n",
    "```"
   ]
  },
  {
   "source": [
    "## Option B: Colab\n",
    "\n",
     "Execute the cell below only **ONCE**."
   ],
   "cell_type": "markdown",
   "metadata": {}
  },
  {
   "source": [
    "%cd drive/My\\ Drive/$FOLDERNAME/cs231n/\n",
    "!python setup.py build_ext --inplace"
   ],
    "cell_type": "code",
    "execution_count": null,
    "outputs": [],
   "metadata": {}
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "The API for the fast versions of the convolution and pooling layers is exactly the same as the naive versions that you implemented above: the forward pass receives data, weights, and parameters and produces outputs and a cache object; the backward pass receives upstream derivatives and the cache object and produces gradients with respect to the data and weights.\n",
    "\n",
    "**NOTE:** The fast implementation for pooling will only perform optimally if the pooling regions are non-overlapping and tile the input. If these conditions are not met then the fast pooling implementation will not be much faster than the naive implementation.\n",
    "\n",
    "You can compare the performance of the naive and fast versions of these layers by running the following:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Testing conv_forward_fast:\n",
      "Naive: 0.232064s\n",
      "Fast: 0.036982s\n",
      "Speedup: 6.275111x\n",
      "Difference:  4.926407851494105e-11\n",
      "\n",
      "Testing conv_backward_fast:\n",
      "Naive: 0.666152s\n",
      "Fast: 0.012001s\n",
      "Speedup: 55.508990x\n",
      "dx difference:  1.006330206620509e-11\n",
      "dw difference:  5.61594157319212e-13\n",
      "db difference:  0.0\n"
     ]
    }
   ],
   "source": [
    "# Rel errors should be around e-9 or less\n",
    "from cs231n.fast_layers import conv_forward_fast, conv_backward_fast\n",
    "from time import time\n",
    "np.random.seed(231)\n",
    "x = np.random.randn(100, 3, 31, 31)\n",
    "w = np.random.randn(25, 3, 3, 3)\n",
    "b = np.random.randn(25,)\n",
    "dout = np.random.randn(100, 25, 16, 16)\n",
    "conv_param = {'stride': 2, 'pad': 1}\n",
    "\n",
    "t0 = time()\n",
    "out_naive, cache_naive = conv_forward_naive(x, w, b, conv_param)\n",
    "t1 = time()\n",
    "out_fast, cache_fast = conv_forward_fast(x, w, b, conv_param)\n",
    "t2 = time()\n",
    "\n",
    "print('Testing conv_forward_fast:')\n",
    "print('Naive: %fs' % (t1 - t0))\n",
    "print('Fast: %fs' % (t2 - t1))\n",
    "print('Speedup: %fx' % ((t1 - t0) / (t2 - t1)))\n",
    "print('Difference: ', rel_error(out_naive, out_fast))\n",
    "\n",
    "t0 = time()\n",
    "dx_naive, dw_naive, db_naive = conv_backward_naive(dout, cache_naive)\n",
    "t1 = time()\n",
    "dx_fast, dw_fast, db_fast = conv_backward_fast(dout, cache_fast)\n",
    "t2 = time()\n",
    "\n",
    "print('\\nTesting conv_backward_fast:')\n",
    "print('Naive: %fs' % (t1 - t0))\n",
    "print('Fast: %fs' % (t2 - t1))\n",
    "print('Speedup: %fx' % ((t1 - t0) / (t2 - t1)))\n",
    "print('dx difference: ', rel_error(dx_naive, dx_fast))\n",
    "print('dw difference: ', rel_error(dw_naive, dw_fast))\n",
    "print('db difference: ', rel_error(db_naive, db_fast))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Testing pool_forward_fast:\nNaive: 0.009003s\nfast: 0.003999s\nspeedup: 2.251148x\ndifference:  0.0\n\nTesting pool_backward_fast:\nNaive: 0.017006s\nfast: 0.011008s\nspeedup: 1.544844x\ndx difference:  0.0\n"
     ]
    }
   ],
   "source": [
    "# Relative errors should be close to 0.0\n",
    "from cs231n.fast_layers import max_pool_forward_fast, max_pool_backward_fast\n",
    "np.random.seed(231)\n",
    "x = np.random.randn(100, 3, 32, 32)\n",
    "dout = np.random.randn(100, 3, 16, 16)\n",
    "pool_param = {'pool_height': 2, 'pool_width': 2, 'stride': 2}\n",
    "\n",
    "t0 = time()\n",
    "out_naive, cache_naive = max_pool_forward_naive(x, pool_param)\n",
    "t1 = time()\n",
    "out_fast, cache_fast = max_pool_forward_fast(x, pool_param)\n",
    "t2 = time()\n",
    "\n",
    "print('Testing pool_forward_fast:')\n",
    "print('Naive: %fs' % (t1 - t0))\n",
    "print('fast: %fs' % (t2 - t1))\n",
    "print('speedup: %fx' % ((t1 - t0) / (t2 - t1)))\n",
    "print('difference: ', rel_error(out_naive, out_fast))\n",
    "\n",
    "t0 = time()\n",
    "dx_naive = max_pool_backward_naive(dout, cache_naive)\n",
    "t1 = time()\n",
    "dx_fast = max_pool_backward_fast(dout, cache_fast)\n",
    "t2 = time()\n",
    "\n",
    "print('\\nTesting pool_backward_fast:')\n",
    "print('Naive: %fs' % (t1 - t0))\n",
    "print('fast: %fs' % (t2 - t1))\n",
    "print('speedup: %fx' % ((t1 - t0) / (t2 - t1)))\n",
    "print('dx difference: ', rel_error(dx_naive, dx_fast))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Convolutional \"sandwich\" layers\n",
     "Previously we introduced the concept of \"sandwich\" layers that combine multiple operations into commonly used patterns. In the file `cs231n/layer_utils.py` you will find sandwich layers implementing a few patterns that are commonly used in convolutional networks. Run the cells below to sanity check that they work."
   ]
  },
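  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As an illustration of the pattern (a sketch built from the naive layers above; the real versions in `cs231n/layer_utils.py` typically compose the layer helpers and can use the fast implementations), a conv-relu sandwich just chains the forward passes and unchains the backward passes:\n",
    "\n",
    "```python\n",
    "def conv_relu_forward_sketch(x, w, b, conv_param):\n",
    "    a, conv_cache = conv_forward_naive(x, w, b, conv_param)\n",
    "    out = np.maximum(0, a)  # ReLU\n",
    "    return out, (conv_cache, a)\n",
    "\n",
    "def conv_relu_backward_sketch(dout, cache):\n",
    "    conv_cache, a = cache\n",
    "    da = dout * (a > 0)  # gradient passes only where a > 0\n",
    "    return conv_backward_naive(da, conv_cache)\n",
    "```"
   ]
  },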
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Testing conv_relu_pool\ndx error:  4.3975044095018435e-09\ndw error:  3.651673815374511e-09\ndb error:  3.721670750819115e-10\n"
     ]
    }
   ],
   "source": [
    "from cs231n.layer_utils import conv_relu_pool_forward, conv_relu_pool_backward\n",
    "np.random.seed(231)\n",
    "x = np.random.randn(2, 3, 16, 16)\n",
    "w = np.random.randn(3, 3, 3, 3)\n",
    "b = np.random.randn(3,)\n",
    "dout = np.random.randn(2, 3, 8, 8)\n",
    "conv_param = {'stride': 1, 'pad': 1}\n",
    "pool_param = {'pool_height': 2, 'pool_width': 2, 'stride': 2}\n",
    "\n",
    "out, cache = conv_relu_pool_forward(x, w, b, conv_param, pool_param)\n",
    "dx, dw, db = conv_relu_pool_backward(dout, cache)\n",
    "\n",
    "dx_num = eval_numerical_gradient_array(lambda x: conv_relu_pool_forward(x, w, b, conv_param, pool_param)[0], x, dout)\n",
    "dw_num = eval_numerical_gradient_array(lambda w: conv_relu_pool_forward(x, w, b, conv_param, pool_param)[0], w, dout)\n",
    "db_num = eval_numerical_gradient_array(lambda b: conv_relu_pool_forward(x, w, b, conv_param, pool_param)[0], b, dout)\n",
    "\n",
    "# Relative errors should be around e-8 or less\n",
    "print('Testing conv_relu_pool')\n",
    "print('dx error: ', rel_error(dx_num, dx))\n",
    "print('dw error: ', rel_error(dw_num, dw))\n",
    "print('db error: ', rel_error(db_num, db))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Testing conv_relu:\ndx error:  4.84744795054139e-09\ndw error:  3.8283065057775625e-10\ndb error:  2.9449034603190923e-10\n"
     ]
    }
   ],
   "source": [
    "from cs231n.layer_utils import conv_relu_forward, conv_relu_backward\n",
    "np.random.seed(231)\n",
    "x = np.random.randn(2, 3, 8, 8)\n",
    "w = np.random.randn(3, 3, 3, 3)\n",
    "b = np.random.randn(3,)\n",
    "dout = np.random.randn(2, 3, 8, 8)\n",
    "conv_param = {'stride': 1, 'pad': 1}\n",
    "\n",
    "out, cache = conv_relu_forward(x, w, b, conv_param)\n",
    "dx, dw, db = conv_relu_backward(dout, cache)\n",
    "\n",
    "dx_num = eval_numerical_gradient_array(lambda x: conv_relu_forward(x, w, b, conv_param)[0], x, dout)\n",
    "dw_num = eval_numerical_gradient_array(lambda w: conv_relu_forward(x, w, b, conv_param)[0], w, dout)\n",
    "db_num = eval_numerical_gradient_array(lambda b: conv_relu_forward(x, w, b, conv_param)[0], b, dout)\n",
    "\n",
    "# Relative errors should be around e-8 or less\n",
    "print('Testing conv_relu:')\n",
    "print('dx error: ', rel_error(dx_num, dx))\n",
    "print('dw error: ', rel_error(dw_num, dw))\n",
    "print('db error: ', rel_error(db_num, db))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Three-layer ConvNet\n",
    "Now that you have implemented all the necessary layers, we can put them together into a simple convolutional network.\n",
    "\n",
    "Open the file `cs231n/classifiers/cnn.py` and complete the implementation of the `ThreeLayerConvNet` class. Remember you can use the fast/sandwich layers (already imported for you) in your implementation. Run the following cells to help you debug:"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Sanity check loss\n",
    "After you build a new network, one of the first things you should do is sanity check the loss. When we use the softmax loss, we expect the loss for random weights (and no regularization) to be about `log(C)` for `C` classes. When we add regularization the loss should go up slightly."
   ]
  },
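  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick reference for the expected value: with random weights the softmax assigns roughly uniform probability `1/C` to each class, so the loss should be near `-log(1/C) = log(C)`:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "print(np.log(10))  # about 2.3026 for the C = 10 CIFAR-10 classes\n",
    "```"
   ]
  },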
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Initial loss (no regularization):  2.302586575497883\n",
      "Initial loss (with regularization):  2.5083480528490716\n"
     ]
    }
   ],
   "source": [
    "model = ThreeLayerConvNet()\n",
    "\n",
    "N = 50\n",
    "X = np.random.randn(N, 3, 32, 32)\n",
    "y = np.random.randint(10, size=N)\n",
    "\n",
    "loss, grads = model.loss(X, y)\n",
    "print('Initial loss (no regularization): ', loss)\n",
    "\n",
    "model.reg = 0.5\n",
    "loss, grads = model.loss(X, y)\n",
    "print('Initial loss (with regularization): ', loss)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Gradient check\n",
     "After the loss looks reasonable, use numeric gradient checking to make sure that your backward pass is correct. When you use numeric gradient checking, you should use a small amount of artificial data and a small number of neurons at each layer. Note: correct implementations may still have relative errors up to the order of e-2."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "W1 max relative error: 1.380104e-04\n",
      "W2 max relative error: 1.822723e-02\n",
      "W3 max relative error: 3.064049e-04\n",
      "b1 max relative error: 3.477652e-05\n",
      "b2 max relative error: 2.516375e-03\n",
      "b3 max relative error: 7.945660e-10\n"
     ]
    }
   ],
   "source": [
    "num_inputs = 2\n",
    "input_dim = (3, 16, 16)\n",
    "reg = 0.0\n",
    "num_classes = 10\n",
    "np.random.seed(231)\n",
    "X = np.random.randn(num_inputs, *input_dim)\n",
    "y = np.random.randint(num_classes, size=num_inputs)\n",
    "\n",
    "model = ThreeLayerConvNet(num_filters=3, filter_size=3,\n",
    "                          input_dim=input_dim, hidden_dim=7,\n",
    "                          dtype=np.float64)\n",
    "loss, grads = model.loss(X, y)\n",
    "# Errors should be small, but correct implementations may have\n",
    "# relative errors up to the order of e-2\n",
    "for param_name in sorted(grads):\n",
    "    f = lambda _: model.loss(X, y)[0]\n",
    "    param_grad_num = eval_numerical_gradient(f, model.params[param_name], verbose=False, h=1e-6)\n",
     "    e = rel_error(param_grad_num, grads[param_name])\n",
     "    print('%s max relative error: %e' % (param_name, e))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Overfit small data\n",
    "A nice trick is to train your model with just a few training samples. You should be able to overfit small datasets, which will result in very high training accuracy and comparatively low validation accuracy."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "(Iteration 1 / 30) loss: 2.414060\n",
      "(Epoch 0 / 15) train acc: 0.200000; val_acc: 0.137000\n",
      "(Iteration 2 / 30) loss: 3.102925\n",
      "(Epoch 1 / 15) train acc: 0.140000; val_acc: 0.087000\n",
      "(Iteration 3 / 30) loss: 2.270330\n",
      "(Iteration 4 / 30) loss: 2.096705\n",
      "(Epoch 2 / 15) train acc: 0.240000; val_acc: 0.094000\n",
      "(Iteration 5 / 30) loss: 1.838880\n",
      "(Iteration 6 / 30) loss: 1.934188\n",
      "(Epoch 3 / 15) train acc: 0.510000; val_acc: 0.173000\n",
      "(Iteration 7 / 30) loss: 1.827912\n",
      "(Iteration 8 / 30) loss: 1.639574\n",
      "(Epoch 4 / 15) train acc: 0.520000; val_acc: 0.188000\n",
      "(Iteration 9 / 30) loss: 1.330082\n",
      "(Iteration 10 / 30) loss: 1.756115\n",
      "(Epoch 5 / 15) train acc: 0.630000; val_acc: 0.167000\n",
      "(Iteration 11 / 30) loss: 1.024162\n",
      "(Iteration 12 / 30) loss: 1.041826\n",
      "(Epoch 6 / 15) train acc: 0.750000; val_acc: 0.229000\n",
      "(Iteration 13 / 30) loss: 1.142777\n",
      "(Iteration 14 / 30) loss: 0.835706\n",
      "(Epoch 7 / 15) train acc: 0.790000; val_acc: 0.247000\n",
      "(Iteration 15 / 30) loss: 0.587786\n",
      "(Iteration 16 / 30) loss: 0.645509\n",
      "(Epoch 8 / 15) train acc: 0.820000; val_acc: 0.252000\n",
      "(Iteration 17 / 30) loss: 0.786844\n",
      "(Iteration 18 / 30) loss: 0.467054\n",
      "(Epoch 9 / 15) train acc: 0.820000; val_acc: 0.178000\n",
      "(Iteration 19 / 30) loss: 0.429880\n",
      "(Iteration 20 / 30) loss: 0.635498\n",
      "(Epoch 10 / 15) train acc: 0.900000; val_acc: 0.206000\n",
      "(Iteration 21 / 30) loss: 0.365807\n",
      "(Iteration 22 / 30) loss: 0.284220\n",
      "(Epoch 11 / 15) train acc: 0.820000; val_acc: 0.201000\n",
      "(Iteration 23 / 30) loss: 0.469343\n",
      "(Iteration 24 / 30) loss: 0.509369\n",
      "(Epoch 12 / 15) train acc: 0.920000; val_acc: 0.211000\n",
      "(Iteration 25 / 30) loss: 0.111638\n",
      "(Iteration 26 / 30) loss: 0.145388\n",
      "(Epoch 13 / 15) train acc: 0.930000; val_acc: 0.213000\n",
      "(Iteration 27 / 30) loss: 0.155575\n",
      "(Iteration 28 / 30) loss: 0.143398\n",
      "(Epoch 14 / 15) train acc: 0.960000; val_acc: 0.212000\n",
      "(Iteration 29 / 30) loss: 0.158160\n",
      "(Iteration 30 / 30) loss: 0.118934\n",
      "(Epoch 15 / 15) train acc: 0.990000; val_acc: 0.220000\n"
     ]
    }
   ],
   "source": [
    "np.random.seed(231)\n",
    "\n",
    "num_train = 100\n",
    "small_data = {\n",
    "  'X_train': data['X_train'][:num_train],\n",
    "  'y_train': data['y_train'][:num_train],\n",
    "  'X_val': data['X_val'],\n",
    "  'y_val': data['y_val'],\n",
    "}\n",
    "\n",
    "model = ThreeLayerConvNet(weight_scale=1e-2)\n",
    "\n",
    "solver = Solver(model, small_data,\n",
    "                num_epochs=15, batch_size=50,\n",
    "                update_rule='adam',\n",
    "                optim_config={\n",
    "                  'learning_rate': 1e-3,\n",
    "                },\n",
    "                verbose=True, print_every=1)\n",
    "solver.train()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {
    "id": "small_data_train_accuracy"
   },
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Small data training accuracy: 0.82\n"
     ]
    }
   ],
   "source": [
    "# Print final training accuracy\n",
    "print(\n",
    "    \"Small data training accuracy:\",\n",
    "    solver.check_accuracy(small_data['X_train'], small_data['y_train'])\n",
    ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {
    "id": "small_data_validation_accuracy"
   },
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Small data validation accuracy: 0.252\n"
     ]
    }
   ],
   "source": [
    "# Print final validation accuracy\n",
    "print(\n",
    "    \"Small data validation accuracy:\",\n",
    "    solver.check_accuracy(small_data['X_val'], small_data['y_val'])\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Plotting the loss, training accuracy, and validation accuracy should show clear overfitting:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [
    {
     "output_type": "display_data",
     "data": {
      "text/plain": "<Figure size 720x576 with 2 Axes>",
      "image/svg+xml": "<?xml version=\"1.0\" encoding=\"utf-8\" standalone=\"no\"?>\r\n<!DOCTYPE svg PUBLIC \"-//W3C//DTD SVG 1.1//EN\"\r\n  \"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd\">\r\n<!-- Created with matplotlib (https://matplotlib.org/) -->\r\n<svg height=\"479.63625pt\" version=\"1.1\" viewBox=\"0 0 608.98125 479.63625\" width=\"608.98125pt\" xmlns=\"http://www.w3.org/2000/svg\" xmlns:xlink=\"http://www.w3.org/1999/xlink\">\r\n <defs>\r\n  <style type=\"text/css\">\r\n*{stroke-linecap:butt;stroke-linejoin:round;}\r\n  </style>\r\n </defs>\r\n <g id=\"figure_1\">\r\n  <g id=\"patch_1\">\r\n   <path d=\"M 0 479.63625 \r\nL 608.98125 479.63625 \r\nL 608.98125 0 \r\nL 0 0 \r\nz\r\n\" style=\"fill:none;\"/>\r\n  </g>\r\n  <g id=\"axes_1\">\r\n   <g id=\"patch_2\">\r\n    <path d=\"M 43.78125 204.872727 \r\nL 601.78125 204.872727 \r\nL 601.78125 7.2 \r\nL 43.78125 7.2 \r\nz\r\n\" style=\"fill:#ffffff;\"/>\r\n   </g>\r\n   <g id=\"matplotlib.axis_1\">\r\n    <g id=\"xtick_1\">\r\n     <g id=\"line2d_1\">\r\n      <defs>\r\n       <path d=\"M 0 0 \r\nL 0 3.5 \r\n\" id=\"m74a7a71402\" style=\"stroke:#000000;stroke-width:0.8;\"/>\r\n      </defs>\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"69.144886\" xlink:href=\"#m74a7a71402\" y=\"204.872727\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_1\">\r\n      <!-- 0 -->\r\n      <defs>\r\n       <path d=\"M 31.78125 66.40625 \r\nQ 24.171875 66.40625 20.328125 58.90625 \r\nQ 16.5 51.421875 16.5 36.375 \r\nQ 16.5 21.390625 20.328125 13.890625 \r\nQ 24.171875 6.390625 31.78125 6.390625 \r\nQ 39.453125 6.390625 43.28125 13.890625 \r\nQ 47.125 21.390625 47.125 36.375 \r\nQ 47.125 51.421875 43.28125 58.90625 \r\nQ 39.453125 66.40625 31.78125 66.40625 \r\nz\r\nM 31.78125 74.21875 \r\nQ 44.046875 74.21875 50.515625 64.515625 \r\nQ 56.984375 54.828125 56.984375 36.375 \r\nQ 56.984375 17.96875 50.515625 8.265625 \r\nQ 44.046875 -1.421875 31.78125 -1.421875 \r\nQ 19.53125 -1.421875 
13.0625 8.265625 \r\nQ 6.59375 17.96875 6.59375 36.375 \r\nQ 6.59375 54.828125 13.0625 64.515625 \r\nQ 19.53125 74.21875 31.78125 74.21875 \r\nz\r\n\" id=\"DejaVuSans-48\"/>\r\n      </defs>\r\n      <g transform=\"translate(65.963636 219.471165)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_2\">\r\n     <g id=\"line2d_2\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"156.605701\" xlink:href=\"#m74a7a71402\" y=\"204.872727\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_2\">\r\n      <!-- 5 -->\r\n      <defs>\r\n       <path d=\"M 10.796875 72.90625 \r\nL 49.515625 72.90625 \r\nL 49.515625 64.59375 \r\nL 19.828125 64.59375 \r\nL 19.828125 46.734375 \r\nQ 21.96875 47.46875 24.109375 47.828125 \r\nQ 26.265625 48.1875 28.421875 48.1875 \r\nQ 40.625 48.1875 47.75 41.5 \r\nQ 54.890625 34.8125 54.890625 23.390625 \r\nQ 54.890625 11.625 47.5625 5.09375 \r\nQ 40.234375 -1.421875 26.90625 -1.421875 \r\nQ 22.3125 -1.421875 17.546875 -0.640625 \r\nQ 12.796875 0.140625 7.71875 1.703125 \r\nL 7.71875 11.625 \r\nQ 12.109375 9.234375 16.796875 8.0625 \r\nQ 21.484375 6.890625 26.703125 6.890625 \r\nQ 35.15625 6.890625 40.078125 11.328125 \r\nQ 45.015625 15.765625 45.015625 23.390625 \r\nQ 45.015625 31 40.078125 35.4375 \r\nQ 35.15625 39.890625 26.703125 39.890625 \r\nQ 22.75 39.890625 18.8125 39.015625 \r\nQ 14.890625 38.140625 10.796875 36.28125 \r\nz\r\n\" id=\"DejaVuSans-53\"/>\r\n      </defs>\r\n      <g transform=\"translate(153.424451 219.471165)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-53\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_3\">\r\n     <g id=\"line2d_3\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"244.066516\" xlink:href=\"#m74a7a71402\" y=\"204.872727\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_3\">\r\n      <!-- 10 -->\r\n      <defs>\r\n       <path d=\"M 12.40625 8.296875 
\r\nL 28.515625 8.296875 \r\nL 28.515625 63.921875 \r\nL 10.984375 60.40625 \r\nL 10.984375 69.390625 \r\nL 28.421875 72.90625 \r\nL 38.28125 72.90625 \r\nL 38.28125 8.296875 \r\nL 54.390625 8.296875 \r\nL 54.390625 0 \r\nL 12.40625 0 \r\nz\r\n\" id=\"DejaVuSans-49\"/>\r\n      </defs>\r\n      <g transform=\"translate(237.704016 219.471165)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-49\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_4\">\r\n     <g id=\"line2d_4\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"331.527332\" xlink:href=\"#m74a7a71402\" y=\"204.872727\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_4\">\r\n      <!-- 15 -->\r\n      <g transform=\"translate(325.164832 219.471165)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-49\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-53\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_5\">\r\n     <g id=\"line2d_5\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"418.988147\" xlink:href=\"#m74a7a71402\" y=\"204.872727\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_5\">\r\n      <!-- 20 -->\r\n      <defs>\r\n       <path d=\"M 19.1875 8.296875 \r\nL 53.609375 8.296875 \r\nL 53.609375 0 \r\nL 7.328125 0 \r\nL 7.328125 8.296875 \r\nQ 12.9375 14.109375 22.625 23.890625 \r\nQ 32.328125 33.6875 34.8125 36.53125 \r\nQ 39.546875 41.84375 41.421875 45.53125 \r\nQ 43.3125 49.21875 43.3125 52.78125 \r\nQ 43.3125 58.59375 39.234375 62.25 \r\nQ 35.15625 65.921875 28.609375 65.921875 \r\nQ 23.96875 65.921875 18.8125 64.3125 \r\nQ 13.671875 62.703125 7.8125 59.421875 \r\nL 7.8125 69.390625 \r\nQ 13.765625 71.78125 18.9375 73 \r\nQ 24.125 74.21875 28.421875 74.21875 \r\nQ 39.75 74.21875 46.484375 68.546875 \r\nQ 53.21875 62.890625 53.21875 53.421875 \r\nQ 53.21875 48.921875 51.53125 44.890625 \r\nQ 49.859375 40.875 45.40625 
35.40625 \r\nQ 44.1875 33.984375 37.640625 27.21875 \r\nQ 31.109375 20.453125 19.1875 8.296875 \r\nz\r\n\" id=\"DejaVuSans-50\"/>\r\n      </defs>\r\n      <g transform=\"translate(412.625647 219.471165)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-50\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_6\">\r\n     <g id=\"line2d_6\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"506.448962\" xlink:href=\"#m74a7a71402\" y=\"204.872727\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_6\">\r\n      <!-- 25 -->\r\n      <g transform=\"translate(500.086462 219.471165)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-50\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-53\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_7\">\r\n     <g id=\"line2d_7\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"593.909777\" xlink:href=\"#m74a7a71402\" y=\"204.872727\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_7\">\r\n      <!-- 30 -->\r\n      <defs>\r\n       <path d=\"M 40.578125 39.3125 \r\nQ 47.65625 37.796875 51.625 33 \r\nQ 55.609375 28.21875 55.609375 21.1875 \r\nQ 55.609375 10.40625 48.1875 4.484375 \r\nQ 40.765625 -1.421875 27.09375 -1.421875 \r\nQ 22.515625 -1.421875 17.65625 -0.515625 \r\nQ 12.796875 0.390625 7.625 2.203125 \r\nL 7.625 11.71875 \r\nQ 11.71875 9.328125 16.59375 8.109375 \r\nQ 21.484375 6.890625 26.8125 6.890625 \r\nQ 36.078125 6.890625 40.9375 10.546875 \r\nQ 45.796875 14.203125 45.796875 21.1875 \r\nQ 45.796875 27.640625 41.28125 31.265625 \r\nQ 36.765625 34.90625 28.71875 34.90625 \r\nL 20.21875 34.90625 \r\nL 20.21875 43.015625 \r\nL 29.109375 43.015625 \r\nQ 36.375 43.015625 40.234375 45.921875 \r\nQ 44.09375 48.828125 44.09375 54.296875 \r\nQ 44.09375 59.90625 40.109375 62.90625 \r\nQ 36.140625 65.921875 28.71875 65.921875 \r\nQ 24.65625 65.921875 20.015625 
65.03125 \r\nQ 15.375 64.15625 9.8125 62.3125 \r\nL 9.8125 71.09375 \r\nQ 15.4375 72.65625 20.34375 73.4375 \r\nQ 25.25 74.21875 29.59375 74.21875 \r\nQ 40.828125 74.21875 47.359375 69.109375 \r\nQ 53.90625 64.015625 53.90625 55.328125 \r\nQ 53.90625 49.265625 50.4375 45.09375 \r\nQ 46.96875 40.921875 40.578125 39.3125 \r\nz\r\n\" id=\"DejaVuSans-51\"/>\r\n      </defs>\r\n      <g transform=\"translate(587.547277 219.471165)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-51\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"text_8\">\r\n     <!-- iteration -->\r\n     <defs>\r\n      <path d=\"M 9.421875 54.6875 \r\nL 18.40625 54.6875 \r\nL 18.40625 0 \r\nL 9.421875 0 \r\nz\r\nM 9.421875 75.984375 \r\nL 18.40625 75.984375 \r\nL 18.40625 64.59375 \r\nL 9.421875 64.59375 \r\nz\r\n\" id=\"DejaVuSans-105\"/>\r\n      <path d=\"M 18.3125 70.21875 \r\nL 18.3125 54.6875 \r\nL 36.8125 54.6875 \r\nL 36.8125 47.703125 \r\nL 18.3125 47.703125 \r\nL 18.3125 18.015625 \r\nQ 18.3125 11.328125 20.140625 9.421875 \r\nQ 21.96875 7.515625 27.59375 7.515625 \r\nL 36.8125 7.515625 \r\nL 36.8125 0 \r\nL 27.59375 0 \r\nQ 17.1875 0 13.234375 3.875 \r\nQ 9.28125 7.765625 9.28125 18.015625 \r\nL 9.28125 47.703125 \r\nL 2.6875 47.703125 \r\nL 2.6875 54.6875 \r\nL 9.28125 54.6875 \r\nL 9.28125 70.21875 \r\nz\r\n\" id=\"DejaVuSans-116\"/>\r\n      <path d=\"M 56.203125 29.59375 \r\nL 56.203125 25.203125 \r\nL 14.890625 25.203125 \r\nQ 15.484375 15.921875 20.484375 11.0625 \r\nQ 25.484375 6.203125 34.421875 6.203125 \r\nQ 39.59375 6.203125 44.453125 7.46875 \r\nQ 49.3125 8.734375 54.109375 11.28125 \r\nL 54.109375 2.78125 \r\nQ 49.265625 0.734375 44.1875 -0.34375 \r\nQ 39.109375 -1.421875 33.890625 -1.421875 \r\nQ 20.796875 -1.421875 13.15625 6.1875 \r\nQ 5.515625 13.8125 5.515625 26.8125 \r\nQ 5.515625 40.234375 12.765625 48.109375 \r\nQ 20.015625 56 32.328125 56 \r\nQ 43.359375 56 49.78125 48.890625 \r\nQ 
56.203125 41.796875 56.203125 29.59375 \r\nz\r\nM 47.21875 32.234375 \r\nQ 47.125 39.59375 43.09375 43.984375 \r\nQ 39.0625 48.390625 32.421875 48.390625 \r\nQ 24.90625 48.390625 20.390625 44.140625 \r\nQ 15.875 39.890625 15.1875 32.171875 \r\nz\r\n\" id=\"DejaVuSans-101\"/>\r\n      <path d=\"M 41.109375 46.296875 \r\nQ 39.59375 47.171875 37.8125 47.578125 \r\nQ 36.03125 48 33.890625 48 \r\nQ 26.265625 48 22.1875 43.046875 \r\nQ 18.109375 38.09375 18.109375 28.8125 \r\nL 18.109375 0 \r\nL 9.078125 0 \r\nL 9.078125 54.6875 \r\nL 18.109375 54.6875 \r\nL 18.109375 46.1875 \r\nQ 20.953125 51.171875 25.484375 53.578125 \r\nQ 30.03125 56 36.53125 56 \r\nQ 37.453125 56 38.578125 55.875 \r\nQ 39.703125 55.765625 41.0625 55.515625 \r\nz\r\n\" id=\"DejaVuSans-114\"/>\r\n      <path d=\"M 34.28125 27.484375 \r\nQ 23.390625 27.484375 19.1875 25 \r\nQ 14.984375 22.515625 14.984375 16.5 \r\nQ 14.984375 11.71875 18.140625 8.90625 \r\nQ 21.296875 6.109375 26.703125 6.109375 \r\nQ 34.1875 6.109375 38.703125 11.40625 \r\nQ 43.21875 16.703125 43.21875 25.484375 \r\nL 43.21875 27.484375 \r\nz\r\nM 52.203125 31.203125 \r\nL 52.203125 0 \r\nL 43.21875 0 \r\nL 43.21875 8.296875 \r\nQ 40.140625 3.328125 35.546875 0.953125 \r\nQ 30.953125 -1.421875 24.3125 -1.421875 \r\nQ 15.921875 -1.421875 10.953125 3.296875 \r\nQ 6 8.015625 6 15.921875 \r\nQ 6 25.140625 12.171875 29.828125 \r\nQ 18.359375 34.515625 30.609375 34.515625 \r\nL 43.21875 34.515625 \r\nL 43.21875 35.40625 \r\nQ 43.21875 41.609375 39.140625 45 \r\nQ 35.0625 48.390625 27.6875 48.390625 \r\nQ 23 48.390625 18.546875 47.265625 \r\nQ 14.109375 46.140625 10.015625 43.890625 \r\nL 10.015625 52.203125 \r\nQ 14.9375 54.109375 19.578125 55.046875 \r\nQ 24.21875 56 28.609375 56 \r\nQ 40.484375 56 46.34375 49.84375 \r\nQ 52.203125 43.703125 52.203125 31.203125 \r\nz\r\n\" id=\"DejaVuSans-97\"/>\r\n      <path d=\"M 30.609375 48.390625 \r\nQ 23.390625 48.390625 19.1875 42.75 \r\nQ 14.984375 37.109375 14.984375 27.296875 \r\nQ 14.984375 
17.484375 19.15625 11.84375 \r\nQ 23.34375 6.203125 30.609375 6.203125 \r\nQ 37.796875 6.203125 41.984375 11.859375 \r\nQ 46.1875 17.53125 46.1875 27.296875 \r\nQ 46.1875 37.015625 41.984375 42.703125 \r\nQ 37.796875 48.390625 30.609375 48.390625 \r\nz\r\nM 30.609375 56 \r\nQ 42.328125 56 49.015625 48.375 \r\nQ 55.71875 40.765625 55.71875 27.296875 \r\nQ 55.71875 13.875 49.015625 6.21875 \r\nQ 42.328125 -1.421875 30.609375 -1.421875 \r\nQ 18.84375 -1.421875 12.171875 6.21875 \r\nQ 5.515625 13.875 5.515625 27.296875 \r\nQ 5.515625 40.765625 12.171875 48.375 \r\nQ 18.84375 56 30.609375 56 \r\nz\r\n\" id=\"DejaVuSans-111\"/>\r\n      <path d=\"M 54.890625 33.015625 \r\nL 54.890625 0 \r\nL 45.90625 0 \r\nL 45.90625 32.71875 \r\nQ 45.90625 40.484375 42.875 44.328125 \r\nQ 39.84375 48.1875 33.796875 48.1875 \r\nQ 26.515625 48.1875 22.3125 43.546875 \r\nQ 18.109375 38.921875 18.109375 30.90625 \r\nL 18.109375 0 \r\nL 9.078125 0 \r\nL 9.078125 54.6875 \r\nL 18.109375 54.6875 \r\nL 18.109375 46.1875 \r\nQ 21.34375 51.125 25.703125 53.5625 \r\nQ 30.078125 56 35.796875 56 \r\nQ 45.21875 56 50.046875 50.171875 \r\nQ 54.890625 44.34375 54.890625 33.015625 \r\nz\r\n\" id=\"DejaVuSans-110\"/>\r\n     </defs>\r\n     <g transform=\"translate(301.658594 233.14929)scale(0.1 -0.1)\">\r\n      <use xlink:href=\"#DejaVuSans-105\"/>\r\n      <use x=\"27.783203\" xlink:href=\"#DejaVuSans-116\"/>\r\n      <use x=\"66.992188\" xlink:href=\"#DejaVuSans-101\"/>\r\n      <use x=\"128.515625\" xlink:href=\"#DejaVuSans-114\"/>\r\n      <use x=\"169.628906\" xlink:href=\"#DejaVuSans-97\"/>\r\n      <use x=\"230.908203\" xlink:href=\"#DejaVuSans-116\"/>\r\n      <use x=\"270.117188\" xlink:href=\"#DejaVuSans-105\"/>\r\n      <use x=\"297.900391\" xlink:href=\"#DejaVuSans-111\"/>\r\n      <use x=\"359.082031\" xlink:href=\"#DejaVuSans-110\"/>\r\n     </g>\r\n    </g>\r\n   </g>\r\n   <g id=\"matplotlib.axis_2\">\r\n    <g id=\"ytick_1\">\r\n     <g id=\"line2d_8\">\r\n      <defs>\r\n       <path 
d=\"M 0 0 \r\nL -3.5 0 \r\n\" id=\"m9671dea085\" style=\"stroke:#000000;stroke-width:0.8;\"/>\r\n      </defs>\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"202.594285\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_9\">\r\n      <!-- 0.0 -->\r\n      <defs>\r\n       <path d=\"M 10.6875 12.40625 \r\nL 21 12.40625 \r\nL 21 0 \r\nL 10.6875 0 \r\nz\r\n\" id=\"DejaVuSans-46\"/>\r\n      </defs>\r\n      <g transform=\"translate(20.878125 206.393504)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-48\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_2\">\r\n     <g id=\"line2d_9\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"172.556635\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_10\">\r\n      <!-- 0.5 -->\r\n      <g transform=\"translate(20.878125 176.355854)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-48\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-53\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_3\">\r\n     <g id=\"line2d_10\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"142.518984\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_11\">\r\n      <!-- 1.0 -->\r\n      <g transform=\"translate(20.878125 146.318203)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-49\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_4\">\r\n     <g id=\"line2d_11\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" 
xlink:href=\"#m9671dea085\" y=\"112.481334\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_12\">\r\n      <!-- 1.5 -->\r\n      <g transform=\"translate(20.878125 116.280553)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-49\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-53\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_5\">\r\n     <g id=\"line2d_12\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"82.443683\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_13\">\r\n      <!-- 2.0 -->\r\n      <g transform=\"translate(20.878125 86.242902)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-50\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_6\">\r\n     <g id=\"line2d_13\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"52.406033\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_14\">\r\n      <!-- 2.5 -->\r\n      <g transform=\"translate(20.878125 56.205252)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-50\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-53\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_7\">\r\n     <g id=\"line2d_14\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"22.368382\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_15\">\r\n      <!-- 3.0 -->\r\n      <g transform=\"translate(20.878125 26.167601)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-51\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" 
xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"text_16\">\r\n     <!-- loss -->\r\n     <defs>\r\n      <path d=\"M 9.421875 75.984375 \r\nL 18.40625 75.984375 \r\nL 18.40625 0 \r\nL 9.421875 0 \r\nz\r\n\" id=\"DejaVuSans-108\"/>\r\n      <path d=\"M 44.28125 53.078125 \r\nL 44.28125 44.578125 \r\nQ 40.484375 46.53125 36.375 47.5 \r\nQ 32.28125 48.484375 27.875 48.484375 \r\nQ 21.1875 48.484375 17.84375 46.4375 \r\nQ 14.5 44.390625 14.5 40.28125 \r\nQ 14.5 37.15625 16.890625 35.375 \r\nQ 19.28125 33.59375 26.515625 31.984375 \r\nL 29.59375 31.296875 \r\nQ 39.15625 29.25 43.1875 25.515625 \r\nQ 47.21875 21.78125 47.21875 15.09375 \r\nQ 47.21875 7.46875 41.1875 3.015625 \r\nQ 35.15625 -1.421875 24.609375 -1.421875 \r\nQ 20.21875 -1.421875 15.453125 -0.5625 \r\nQ 10.6875 0.296875 5.421875 2 \r\nL 5.421875 11.28125 \r\nQ 10.40625 8.6875 15.234375 7.390625 \r\nQ 20.0625 6.109375 24.8125 6.109375 \r\nQ 31.15625 6.109375 34.5625 8.28125 \r\nQ 37.984375 10.453125 37.984375 14.40625 \r\nQ 37.984375 18.0625 35.515625 20.015625 \r\nQ 33.0625 21.96875 24.703125 23.78125 \r\nL 21.578125 24.515625 \r\nQ 13.234375 26.265625 9.515625 29.90625 \r\nQ 5.8125 33.546875 5.8125 39.890625 \r\nQ 5.8125 47.609375 11.28125 51.796875 \r\nQ 16.75 56 26.8125 56 \r\nQ 31.78125 56 36.171875 55.265625 \r\nQ 40.578125 54.546875 44.28125 53.078125 \r\nz\r\n\" id=\"DejaVuSans-115\"/>\r\n     </defs>\r\n     <g transform=\"translate(14.798438 115.694176)rotate(-90)scale(0.1 -0.1)\">\r\n      <use xlink:href=\"#DejaVuSans-108\"/>\r\n      <use x=\"27.783203\" xlink:href=\"#DejaVuSans-111\"/>\r\n      <use x=\"88.964844\" xlink:href=\"#DejaVuSans-115\"/>\r\n      <use x=\"141.064453\" xlink:href=\"#DejaVuSans-115\"/>\r\n     </g>\r\n    </g>\r\n   </g>\r\n   <g id=\"line2d_15\">\r\n    <defs>\r\n     <path d=\"M 0 3 \r\nC 0.795609 3 1.55874 2.683901 2.12132 2.12132 \r\nC 2.683901 1.55874 3 0.795609 3 0 \r\nC 3 -0.795609 2.683901 -1.55874 2.12132 -2.12132 \r\nC 
1.55874 -2.683901 0.795609 -3 0 -3 \r\nC -0.795609 -3 -1.55874 -2.683901 -2.12132 -2.12132 \r\nC -2.683901 -1.55874 -3 -0.795609 -3 0 \r\nC -3 0.795609 -2.683901 1.55874 -2.12132 2.12132 \r\nC -1.55874 2.683901 -0.795609 3 0 3 \r\nz\r\n\" id=\"m9d21bdd65e\" style=\"stroke:#1f77b4;\"/>\r\n    </defs>\r\n    <g clip-path=\"url(#p2a4ce5bee9)\">\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"69.144886\" xlink:href=\"#m9d21bdd65e\" y=\"57.568891\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"86.637049\" xlink:href=\"#m9d21bdd65e\" y=\"16.185124\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"104.129212\" xlink:href=\"#m9d21bdd65e\" y=\"66.203498\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"121.621375\" xlink:href=\"#m9d21bdd65e\" y=\"76.634103\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"139.113538\" xlink:href=\"#m9d21bdd65e\" y=\"92.122997\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"156.605701\" xlink:href=\"#m9d21bdd65e\" y=\"86.397342\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"174.097864\" xlink:href=\"#m9d21bdd65e\" y=\"92.781902\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"191.590027\" xlink:href=\"#m9d21bdd65e\" y=\"104.096409\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"209.08219\" xlink:href=\"#m9d21bdd65e\" y=\"122.689238\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"226.574353\" xlink:href=\"#m9d21bdd65e\" y=\"97.095168\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"244.066516\" xlink:href=\"#m9d21bdd65e\" y=\"141.067429\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"261.558679\" xlink:href=\"#m9d21bdd65e\" y=\"140.006275\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"279.050842\" xlink:href=\"#m9d21bdd65e\" y=\"133.941606\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"296.543005\" xlink:href=\"#m9d21bdd65e\" y=\"152.389022\"/>\r\n     <use 
style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"314.035168\" xlink:href=\"#m9d21bdd65e\" y=\"167.282864\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"331.527332\" xlink:href=\"#m9d21bdd65e\" y=\"163.815122\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"349.019495\" xlink:href=\"#m9d21bdd65e\" y=\"155.324414\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"366.511658\" xlink:href=\"#m9d21bdd65e\" y=\"174.535887\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"384.003821\" xlink:href=\"#m9d21bdd65e\" y=\"176.769124\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"401.495984\" xlink:href=\"#m9d21bdd65e\" y=\"164.416573\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"418.988147\" xlink:href=\"#m9d21bdd65e\" y=\"180.61833\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"436.48031\" xlink:href=\"#m9d21bdd65e\" y=\"185.519678\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"453.972473\" xlink:href=\"#m9d21bdd65e\" y=\"174.398346\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"471.464636\" xlink:href=\"#m9d21bdd65e\" y=\"171.993765\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"488.956799\" xlink:href=\"#m9d21bdd65e\" y=\"195.887603\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"506.448962\" xlink:href=\"#m9d21bdd65e\" y=\"193.860082\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"523.941125\" xlink:href=\"#m9d21bdd65e\" y=\"193.248058\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"541.433288\" xlink:href=\"#m9d21bdd65e\" y=\"193.979632\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"558.925451\" xlink:href=\"#m9d21bdd65e\" y=\"193.092756\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"576.417614\" xlink:href=\"#m9d21bdd65e\" y=\"195.449283\"/>\r\n    </g>\r\n   </g>\r\n   <g id=\"patch_3\">\r\n    <path d=\"M 43.78125 204.872727 \r\nL 43.78125 7.2 \r\n\" 
style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\r\n   </g>\r\n   <g id=\"patch_4\">\r\n    <path d=\"M 601.78125 204.872727 \r\nL 601.78125 7.2 \r\n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\r\n   </g>\r\n   <g id=\"patch_5\">\r\n    <path d=\"M 43.78125 204.872727 \r\nL 601.78125 204.872727 \r\n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\r\n   </g>\r\n   <g id=\"patch_6\">\r\n    <path d=\"M 43.78125 7.2 \r\nL 601.78125 7.2 \r\n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\r\n   </g>\r\n  </g>\r\n  <g id=\"axes_2\">\r\n   <g id=\"patch_7\">\r\n    <path d=\"M 43.78125 442.08 \r\nL 601.78125 442.08 \r\nL 601.78125 244.407273 \r\nL 43.78125 244.407273 \r\nz\r\n\" style=\"fill:#ffffff;\"/>\r\n   </g>\r\n   <g id=\"matplotlib.axis_3\">\r\n    <g id=\"xtick_8\">\r\n     <g id=\"line2d_16\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"69.144886\" xlink:href=\"#m74a7a71402\" y=\"442.08\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_17\">\r\n      <!-- 0 -->\r\n      <g transform=\"translate(65.963636 456.678437)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_9\">\r\n     <g id=\"line2d_17\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"136.78125\" xlink:href=\"#m74a7a71402\" y=\"442.08\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_18\">\r\n      <!-- 2 -->\r\n      <g transform=\"translate(133.6 456.678437)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-50\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_10\">\r\n     <g id=\"line2d_18\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"204.417614\" xlink:href=\"#m74a7a71402\" y=\"442.08\"/>\r\n      
</g>\r\n     </g>\r\n     <g id=\"text_19\">\r\n      <!-- 4 -->\r\n      <defs>\r\n       <path d=\"M 37.796875 64.3125 \r\nL 12.890625 25.390625 \r\nL 37.796875 25.390625 \r\nz\r\nM 35.203125 72.90625 \r\nL 47.609375 72.90625 \r\nL 47.609375 25.390625 \r\nL 58.015625 25.390625 \r\nL 58.015625 17.1875 \r\nL 47.609375 17.1875 \r\nL 47.609375 0 \r\nL 37.796875 0 \r\nL 37.796875 17.1875 \r\nL 4.890625 17.1875 \r\nL 4.890625 26.703125 \r\nz\r\n\" id=\"DejaVuSans-52\"/>\r\n      </defs>\r\n      <g transform=\"translate(201.236364 456.678437)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-52\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_11\">\r\n     <g id=\"line2d_19\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"272.053977\" xlink:href=\"#m74a7a71402\" y=\"442.08\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_20\">\r\n      <!-- 6 -->\r\n      <defs>\r\n       <path d=\"M 33.015625 40.375 \r\nQ 26.375 40.375 22.484375 35.828125 \r\nQ 18.609375 31.296875 18.609375 23.390625 \r\nQ 18.609375 15.53125 22.484375 10.953125 \r\nQ 26.375 6.390625 33.015625 6.390625 \r\nQ 39.65625 6.390625 43.53125 10.953125 \r\nQ 47.40625 15.53125 47.40625 23.390625 \r\nQ 47.40625 31.296875 43.53125 35.828125 \r\nQ 39.65625 40.375 33.015625 40.375 \r\nz\r\nM 52.59375 71.296875 \r\nL 52.59375 62.3125 \r\nQ 48.875 64.0625 45.09375 64.984375 \r\nQ 41.3125 65.921875 37.59375 65.921875 \r\nQ 27.828125 65.921875 22.671875 59.328125 \r\nQ 17.53125 52.734375 16.796875 39.40625 \r\nQ 19.671875 43.65625 24.015625 45.921875 \r\nQ 28.375 48.1875 33.59375 48.1875 \r\nQ 44.578125 48.1875 50.953125 41.515625 \r\nQ 57.328125 34.859375 57.328125 23.390625 \r\nQ 57.328125 12.15625 50.6875 5.359375 \r\nQ 44.046875 -1.421875 33.015625 -1.421875 \r\nQ 20.359375 -1.421875 13.671875 8.265625 \r\nQ 6.984375 17.96875 6.984375 36.375 \r\nQ 6.984375 53.65625 15.1875 63.9375 \r\nQ 23.390625 74.21875 37.203125 74.21875 \r\nQ 40.921875 74.21875 44.703125 
73.484375 \r\nQ 48.484375 72.75 52.59375 71.296875 \r\nz\r\n\" id=\"DejaVuSans-54\"/>\r\n      </defs>\r\n      <g transform=\"translate(268.872727 456.678437)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-54\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_12\">\r\n     <g id=\"line2d_20\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"339.690341\" xlink:href=\"#m74a7a71402\" y=\"442.08\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_21\">\r\n      <!-- 8 -->\r\n      <defs>\r\n       <path d=\"M 31.78125 34.625 \r\nQ 24.75 34.625 20.71875 30.859375 \r\nQ 16.703125 27.09375 16.703125 20.515625 \r\nQ 16.703125 13.921875 20.71875 10.15625 \r\nQ 24.75 6.390625 31.78125 6.390625 \r\nQ 38.8125 6.390625 42.859375 10.171875 \r\nQ 46.921875 13.96875 46.921875 20.515625 \r\nQ 46.921875 27.09375 42.890625 30.859375 \r\nQ 38.875 34.625 31.78125 34.625 \r\nz\r\nM 21.921875 38.8125 \r\nQ 15.578125 40.375 12.03125 44.71875 \r\nQ 8.5 49.078125 8.5 55.328125 \r\nQ 8.5 64.0625 14.71875 69.140625 \r\nQ 20.953125 74.21875 31.78125 74.21875 \r\nQ 42.671875 74.21875 48.875 69.140625 \r\nQ 55.078125 64.0625 55.078125 55.328125 \r\nQ 55.078125 49.078125 51.53125 44.71875 \r\nQ 48 40.375 41.703125 38.8125 \r\nQ 48.828125 37.15625 52.796875 32.3125 \r\nQ 56.78125 27.484375 56.78125 20.515625 \r\nQ 56.78125 9.90625 50.3125 4.234375 \r\nQ 43.84375 -1.421875 31.78125 -1.421875 \r\nQ 19.734375 -1.421875 13.25 4.234375 \r\nQ 6.78125 9.90625 6.78125 20.515625 \r\nQ 6.78125 27.484375 10.78125 32.3125 \r\nQ 14.796875 37.15625 21.921875 38.8125 \r\nz\r\nM 18.3125 54.390625 \r\nQ 18.3125 48.734375 21.84375 45.5625 \r\nQ 25.390625 42.390625 31.78125 42.390625 \r\nQ 38.140625 42.390625 41.71875 45.5625 \r\nQ 45.3125 48.734375 45.3125 54.390625 \r\nQ 45.3125 60.0625 41.71875 63.234375 \r\nQ 38.140625 66.40625 31.78125 66.40625 \r\nQ 25.390625 66.40625 21.84375 63.234375 \r\nQ 18.3125 60.0625 18.3125 54.390625 \r\nz\r\n\" 
id=\"DejaVuSans-56\"/>\r\n      </defs>\r\n      <g transform=\"translate(336.509091 456.678437)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-56\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_13\">\r\n     <g id=\"line2d_21\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"407.326705\" xlink:href=\"#m74a7a71402\" y=\"442.08\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_22\">\r\n      <!-- 10 -->\r\n      <g transform=\"translate(400.964205 456.678437)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-49\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_14\">\r\n     <g id=\"line2d_22\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"474.963068\" xlink:href=\"#m74a7a71402\" y=\"442.08\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_23\">\r\n      <!-- 12 -->\r\n      <g transform=\"translate(468.600568 456.678437)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-49\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-50\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"xtick_15\">\r\n     <g id=\"line2d_23\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"542.599432\" xlink:href=\"#m74a7a71402\" y=\"442.08\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_24\">\r\n      <!-- 14 -->\r\n      <g transform=\"translate(536.236932 456.678437)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-49\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-52\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"text_25\">\r\n     <!-- epoch -->\r\n     <defs>\r\n      <path d=\"M 18.109375 8.203125 \r\nL 18.109375 -20.796875 \r\nL 9.078125 -20.796875 \r\nL 9.078125 54.6875 \r\nL 18.109375 54.6875 \r\nL 18.109375 46.390625 \r\nQ 20.953125 51.265625 25.265625 53.625 \r\nQ 29.59375 56 35.59375 56 \r\nQ 45.5625 56 51.78125 48.09375 
\r\nQ 58.015625 40.1875 58.015625 27.296875 \r\nQ 58.015625 14.40625 51.78125 6.484375 \r\nQ 45.5625 -1.421875 35.59375 -1.421875 \r\nQ 29.59375 -1.421875 25.265625 0.953125 \r\nQ 20.953125 3.328125 18.109375 8.203125 \r\nz\r\nM 48.6875 27.296875 \r\nQ 48.6875 37.203125 44.609375 42.84375 \r\nQ 40.53125 48.484375 33.40625 48.484375 \r\nQ 26.265625 48.484375 22.1875 42.84375 \r\nQ 18.109375 37.203125 18.109375 27.296875 \r\nQ 18.109375 17.390625 22.1875 11.75 \r\nQ 26.265625 6.109375 33.40625 6.109375 \r\nQ 40.53125 6.109375 44.609375 11.75 \r\nQ 48.6875 17.390625 48.6875 27.296875 \r\nz\r\n\" id=\"DejaVuSans-112\"/>\r\n      <path d=\"M 48.78125 52.59375 \r\nL 48.78125 44.1875 \r\nQ 44.96875 46.296875 41.140625 47.34375 \r\nQ 37.3125 48.390625 33.40625 48.390625 \r\nQ 24.65625 48.390625 19.8125 42.84375 \r\nQ 14.984375 37.3125 14.984375 27.296875 \r\nQ 14.984375 17.28125 19.8125 11.734375 \r\nQ 24.65625 6.203125 33.40625 6.203125 \r\nQ 37.3125 6.203125 41.140625 7.25 \r\nQ 44.96875 8.296875 48.78125 10.40625 \r\nL 48.78125 2.09375 \r\nQ 45.015625 0.34375 40.984375 -0.53125 \r\nQ 36.96875 -1.421875 32.421875 -1.421875 \r\nQ 20.0625 -1.421875 12.78125 6.34375 \r\nQ 5.515625 14.109375 5.515625 27.296875 \r\nQ 5.515625 40.671875 12.859375 48.328125 \r\nQ 20.21875 56 33.015625 56 \r\nQ 37.15625 56 41.109375 55.140625 \r\nQ 45.0625 54.296875 48.78125 52.59375 \r\nz\r\n\" id=\"DejaVuSans-99\"/>\r\n      <path d=\"M 54.890625 33.015625 \r\nL 54.890625 0 \r\nL 45.90625 0 \r\nL 45.90625 32.71875 \r\nQ 45.90625 40.484375 42.875 44.328125 \r\nQ 39.84375 48.1875 33.796875 48.1875 \r\nQ 26.515625 48.1875 22.3125 43.546875 \r\nQ 18.109375 38.921875 18.109375 30.90625 \r\nL 18.109375 0 \r\nL 9.078125 0 \r\nL 9.078125 75.984375 \r\nL 18.109375 75.984375 \r\nL 18.109375 46.1875 \r\nQ 21.34375 51.125 25.703125 53.5625 \r\nQ 30.078125 56 35.796875 56 \r\nQ 45.21875 56 50.046875 50.171875 \r\nQ 54.890625 44.34375 54.890625 33.015625 \r\nz\r\n\" id=\"DejaVuSans-104\"/>\r\n     
</defs>\r\n     <g transform=\"translate(307.553125 470.356562)scale(0.1 -0.1)\">\r\n      <use xlink:href=\"#DejaVuSans-101\"/>\r\n      <use x=\"61.523438\" xlink:href=\"#DejaVuSans-112\"/>\r\n      <use x=\"125\" xlink:href=\"#DejaVuSans-111\"/>\r\n      <use x=\"186.181641\" xlink:href=\"#DejaVuSans-99\"/>\r\n      <use x=\"241.162109\" xlink:href=\"#DejaVuSans-104\"/>\r\n     </g>\r\n    </g>\r\n   </g>\r\n   <g id=\"matplotlib.axis_4\">\r\n    <g id=\"ytick_8\">\r\n     <g id=\"line2d_24\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"410.60719\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_26\">\r\n      <!-- 0.2 -->\r\n      <g transform=\"translate(20.878125 414.406409)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-48\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-50\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_9\">\r\n     <g id=\"line2d_25\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"370.805977\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_27\">\r\n      <!-- 0.4 -->\r\n      <g transform=\"translate(20.878125 374.605196)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-48\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-52\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_10\">\r\n     <g id=\"line2d_26\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"331.004763\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_28\">\r\n      <!-- 0.6 -->\r\n      <g transform=\"translate(20.878125 334.803982)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-48\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n  
     <use x=\"95.410156\" xlink:href=\"#DejaVuSans-54\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_11\">\r\n     <g id=\"line2d_27\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"291.20355\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_29\">\r\n      <!-- 0.8 -->\r\n      <g transform=\"translate(20.878125 295.002768)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-48\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-56\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"ytick_12\">\r\n     <g id=\"line2d_28\">\r\n      <g>\r\n       <use style=\"stroke:#000000;stroke-width:0.8;\" x=\"43.78125\" xlink:href=\"#m9671dea085\" y=\"251.402336\"/>\r\n      </g>\r\n     </g>\r\n     <g id=\"text_30\">\r\n      <!-- 1.0 -->\r\n      <g transform=\"translate(20.878125 255.201555)scale(0.1 -0.1)\">\r\n       <use xlink:href=\"#DejaVuSans-49\"/>\r\n       <use x=\"63.623047\" xlink:href=\"#DejaVuSans-46\"/>\r\n       <use x=\"95.410156\" xlink:href=\"#DejaVuSans-48\"/>\r\n      </g>\r\n     </g>\r\n    </g>\r\n    <g id=\"text_31\">\r\n     <!-- accuracy -->\r\n     <defs>\r\n      <path d=\"M 8.5 21.578125 \r\nL 8.5 54.6875 \r\nL 17.484375 54.6875 \r\nL 17.484375 21.921875 \r\nQ 17.484375 14.15625 20.5 10.265625 \r\nQ 23.53125 6.390625 29.59375 6.390625 \r\nQ 36.859375 6.390625 41.078125 11.03125 \r\nQ 45.3125 15.671875 45.3125 23.6875 \r\nL 45.3125 54.6875 \r\nL 54.296875 54.6875 \r\nL 54.296875 0 \r\nL 45.3125 0 \r\nL 45.3125 8.40625 \r\nQ 42.046875 3.421875 37.71875 1 \r\nQ 33.40625 -1.421875 27.6875 -1.421875 \r\nQ 18.265625 -1.421875 13.375 4.4375 \r\nQ 8.5 10.296875 8.5 21.578125 \r\nz\r\nM 31.109375 56 \r\nz\r\n\" id=\"DejaVuSans-117\"/>\r\n      <path d=\"M 32.171875 -5.078125 \r\nQ 28.375 -14.84375 24.75 -17.8125 \r\nQ 21.140625 -20.796875 15.09375 -20.796875 \r\nL 7.90625 -20.796875 
\r\nL 7.90625 -13.28125 \r\nL 13.1875 -13.28125 \r\nQ 16.890625 -13.28125 18.9375 -11.515625 \r\nQ 21 -9.765625 23.484375 -3.21875 \r\nL 25.09375 0.875 \r\nL 2.984375 54.6875 \r\nL 12.5 54.6875 \r\nL 29.59375 11.921875 \r\nL 46.6875 54.6875 \r\nL 56.203125 54.6875 \r\nz\r\n\" id=\"DejaVuSans-121\"/>\r\n     </defs>\r\n     <g transform=\"translate(14.798438 365.803011)rotate(-90)scale(0.1 -0.1)\">\r\n      <use xlink:href=\"#DejaVuSans-97\"/>\r\n      <use x=\"61.279297\" xlink:href=\"#DejaVuSans-99\"/>\r\n      <use x=\"116.259766\" xlink:href=\"#DejaVuSans-99\"/>\r\n      <use x=\"171.240234\" xlink:href=\"#DejaVuSans-117\"/>\r\n      <use x=\"234.619141\" xlink:href=\"#DejaVuSans-114\"/>\r\n      <use x=\"275.732422\" xlink:href=\"#DejaVuSans-97\"/>\r\n      <use x=\"337.011719\" xlink:href=\"#DejaVuSans-99\"/>\r\n      <use x=\"391.992188\" xlink:href=\"#DejaVuSans-121\"/>\r\n     </g>\r\n    </g>\r\n   </g>\r\n   <g id=\"line2d_29\">\r\n    <path clip-path=\"url(#p95d0c31365)\" d=\"M 69.144886 410.60719 \r\nL 102.963068 422.547554 \r\nL 136.78125 402.646948 \r\nL 170.599432 348.915309 \r\nL 204.417614 346.925249 \r\nL 238.235795 325.034581 \r\nL 272.053977 301.153853 \r\nL 305.872159 293.19361 \r\nL 339.690341 287.223428 \r\nL 373.508523 287.223428 \r\nL 407.326705 271.302943 \r\nL 441.144886 287.223428 \r\nL 474.963068 267.322821 \r\nL 508.78125 265.332761 \r\nL 542.599432 259.362579 \r\nL 576.417614 253.392397 \r\n\" style=\"fill:none;stroke:#1f77b4;stroke-linecap:square;stroke-width:1.5;\"/>\r\n    <g clip-path=\"url(#p95d0c31365)\">\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"69.144886\" xlink:href=\"#m9d21bdd65e\" y=\"410.60719\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"102.963068\" xlink:href=\"#m9d21bdd65e\" y=\"422.547554\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"136.78125\" xlink:href=\"#m9d21bdd65e\" y=\"402.646948\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"170.599432\" 
xlink:href=\"#m9d21bdd65e\" y=\"348.915309\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"204.417614\" xlink:href=\"#m9d21bdd65e\" y=\"346.925249\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"238.235795\" xlink:href=\"#m9d21bdd65e\" y=\"325.034581\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"272.053977\" xlink:href=\"#m9d21bdd65e\" y=\"301.153853\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"305.872159\" xlink:href=\"#m9d21bdd65e\" y=\"293.19361\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"339.690341\" xlink:href=\"#m9d21bdd65e\" y=\"287.223428\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"373.508523\" xlink:href=\"#m9d21bdd65e\" y=\"287.223428\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"407.326705\" xlink:href=\"#m9d21bdd65e\" y=\"271.302943\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"441.144886\" xlink:href=\"#m9d21bdd65e\" y=\"287.223428\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"474.963068\" xlink:href=\"#m9d21bdd65e\" y=\"267.322821\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"508.78125\" xlink:href=\"#m9d21bdd65e\" y=\"265.332761\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"542.599432\" xlink:href=\"#m9d21bdd65e\" y=\"259.362579\"/>\r\n     <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"576.417614\" xlink:href=\"#m9d21bdd65e\" y=\"253.392397\"/>\r\n    </g>\r\n   </g>\r\n   <g id=\"line2d_30\">\r\n    <path clip-path=\"url(#p95d0c31365)\" d=\"M 69.144886 423.144573 \r\nL 102.963068 433.094876 \r\nL 136.78125 431.701834 \r\nL 170.599432 415.980354 \r\nL 204.417614 412.995263 \r\nL 238.235795 417.174391 \r\nL 272.053977 404.836014 \r\nL 305.872159 401.253905 \r\nL 339.690341 400.258875 \r\nL 373.508523 414.985324 \r\nL 407.326705 409.413154 \r\nL 441.144886 410.408184 \r\nL 474.963068 408.418124 \r\nL 508.78125 408.020111 \r\nL 542.599432 408.219118 \r\nL 576.417614 406.627069 \r\n\" 
style=\"fill:none;stroke:#ff7f0e;stroke-linecap:square;stroke-width:1.5;\"/>\r\n    <defs>\r\n     <path d=\"M 0 3 \r\nC 0.795609 3 1.55874 2.683901 2.12132 2.12132 \r\nC 2.683901 1.55874 3 0.795609 3 0 \r\nC 3 -0.795609 2.683901 -1.55874 2.12132 -2.12132 \r\nC 1.55874 -2.683901 0.795609 -3 0 -3 \r\nC -0.795609 -3 -1.55874 -2.683901 -2.12132 -2.12132 \r\nC -2.683901 -1.55874 -3 -0.795609 -3 0 \r\nC -3 0.795609 -2.683901 1.55874 -2.12132 2.12132 \r\nC -1.55874 2.683901 -0.795609 3 0 3 \r\nz\r\n\" id=\"m4c8f4990d0\" style=\"stroke:#ff7f0e;\"/>\r\n    </defs>\r\n    <g clip-path=\"url(#p95d0c31365)\">\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"69.144886\" xlink:href=\"#m4c8f4990d0\" y=\"423.144573\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"102.963068\" xlink:href=\"#m4c8f4990d0\" y=\"433.094876\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"136.78125\" xlink:href=\"#m4c8f4990d0\" y=\"431.701834\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"170.599432\" xlink:href=\"#m4c8f4990d0\" y=\"415.980354\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"204.417614\" xlink:href=\"#m4c8f4990d0\" y=\"412.995263\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"238.235795\" xlink:href=\"#m4c8f4990d0\" y=\"417.174391\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"272.053977\" xlink:href=\"#m4c8f4990d0\" y=\"404.836014\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"305.872159\" xlink:href=\"#m4c8f4990d0\" y=\"401.253905\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"339.690341\" xlink:href=\"#m4c8f4990d0\" y=\"400.258875\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"373.508523\" xlink:href=\"#m4c8f4990d0\" y=\"414.985324\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"407.326705\" xlink:href=\"#m4c8f4990d0\" y=\"409.413154\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"441.144886\" xlink:href=\"#m4c8f4990d0\" y=\"410.408184\"/>\r\n 
    <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"474.963068\" xlink:href=\"#m4c8f4990d0\" y=\"408.418124\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"508.78125\" xlink:href=\"#m4c8f4990d0\" y=\"408.020111\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"542.599432\" xlink:href=\"#m4c8f4990d0\" y=\"408.219118\"/>\r\n     <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"576.417614\" xlink:href=\"#m4c8f4990d0\" y=\"406.627069\"/>\r\n    </g>\r\n   </g>\r\n   <g id=\"patch_8\">\r\n    <path d=\"M 43.78125 442.08 \r\nL 43.78125 244.407273 \r\n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\r\n   </g>\r\n   <g id=\"patch_9\">\r\n    <path d=\"M 601.78125 442.08 \r\nL 601.78125 244.407273 \r\n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\r\n   </g>\r\n   <g id=\"patch_10\">\r\n    <path d=\"M 43.78125 442.08 \r\nL 601.78125 442.08 \r\n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\r\n   </g>\r\n   <g id=\"patch_11\">\r\n    <path d=\"M 43.78125 244.407273 \r\nL 601.78125 244.407273 \r\n\" style=\"fill:none;stroke:#000000;stroke-linecap:square;stroke-linejoin:miter;stroke-width:0.8;\"/>\r\n   </g>\r\n   <g id=\"legend_1\">\r\n    <g id=\"patch_12\">\r\n     <path d=\"M 50.78125 281.763523 \r\nL 106.05625 281.763523 \r\nQ 108.05625 281.763523 108.05625 279.763523 \r\nL 108.05625 251.407273 \r\nQ 108.05625 249.407273 106.05625 249.407273 \r\nL 50.78125 249.407273 \r\nQ 48.78125 249.407273 48.78125 251.407273 \r\nL 48.78125 279.763523 \r\nQ 48.78125 281.763523 50.78125 281.763523 \r\nz\r\n\" style=\"fill:#ffffff;opacity:0.8;stroke:#cccccc;stroke-linejoin:miter;\"/>\r\n    </g>\r\n    <g id=\"line2d_31\">\r\n     <path d=\"M 52.78125 257.50571 \r\nL 72.78125 257.50571 \r\n\" style=\"fill:none;stroke:#1f77b4;stroke-linecap:square;stroke-width:1.5;\"/>\r\n    </g>\r\n    <g 
id=\"line2d_32\">\r\n     <g>\r\n      <use style=\"fill:#1f77b4;stroke:#1f77b4;\" x=\"62.78125\" xlink:href=\"#m9d21bdd65e\" y=\"257.50571\"/>\r\n     </g>\r\n    </g>\r\n    <g id=\"text_32\">\r\n     <!-- train -->\r\n     <g transform=\"translate(80.78125 261.00571)scale(0.1 -0.1)\">\r\n      <use xlink:href=\"#DejaVuSans-116\"/>\r\n      <use x=\"39.208984\" xlink:href=\"#DejaVuSans-114\"/>\r\n      <use x=\"80.322266\" xlink:href=\"#DejaVuSans-97\"/>\r\n      <use x=\"141.601562\" xlink:href=\"#DejaVuSans-105\"/>\r\n      <use x=\"169.384766\" xlink:href=\"#DejaVuSans-110\"/>\r\n     </g>\r\n    </g>\r\n    <g id=\"line2d_33\">\r\n     <path d=\"M 52.78125 272.183835 \r\nL 72.78125 272.183835 \r\n\" style=\"fill:none;stroke:#ff7f0e;stroke-linecap:square;stroke-width:1.5;\"/>\r\n    </g>\r\n    <g id=\"line2d_34\">\r\n     <g>\r\n      <use style=\"fill:#ff7f0e;stroke:#ff7f0e;\" x=\"62.78125\" xlink:href=\"#m4c8f4990d0\" y=\"272.183835\"/>\r\n     </g>\r\n    </g>\r\n    <g id=\"text_33\">\r\n     <!-- val -->\r\n     <defs>\r\n      <path d=\"M 2.984375 54.6875 \r\nL 12.5 54.6875 \r\nL 29.59375 8.796875 \r\nL 46.6875 54.6875 \r\nL 56.203125 54.6875 \r\nL 35.6875 0 \r\nL 23.484375 0 \r\nz\r\n\" id=\"DejaVuSans-118\"/>\r\n     </defs>\r\n     <g transform=\"translate(80.78125 275.683835)scale(0.1 -0.1)\">\r\n      <use xlink:href=\"#DejaVuSans-118\"/>\r\n      <use x=\"59.179688\" xlink:href=\"#DejaVuSans-97\"/>\r\n      <use x=\"120.458984\" xlink:href=\"#DejaVuSans-108\"/>\r\n     </g>\r\n    </g>\r\n   </g>\r\n  </g>\r\n </g>\r\n <defs>\r\n  <clipPath id=\"p2a4ce5bee9\">\r\n   <rect height=\"197.672727\" width=\"558\" x=\"43.78125\" y=\"7.2\"/>\r\n  </clipPath>\r\n  <clipPath id=\"p95d0c31365\">\r\n   <rect height=\"197.672727\" width=\"558\" x=\"43.78125\" y=\"244.407273\"/>\r\n  </clipPath>\r\n </defs>\r\n</svg>\r\n",
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAmEAAAHgCAYAAADt8bqrAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOzdeXxU9b3/8deHEEiEQBCQQABBQBBFQeNWbEu1Cm6Vcq1Fu2kXqtXW9ra49LZqe9urt/R28Wcrl1pre69LvULRVirVumDrxiqrUUCWJOwQSCAJWT6/P2aCSZiQSTInZ2byfj4eeczMOWdmPnMckzfnu5m7IyIiIiIdq0vYBYiIiIh0RgphIiIiIiFQCBMREREJgUKYiIiISAgUwkRERERCoBAmIiIiEoKuYRfQWv369fNhw4aFXYaIiIhIi5YuXbrb3fvH2pdyIWzYsGEsWbIk7DJEREREWmRmm5vbp+ZIERERkRAohImIiIiEQCFMREREJAQKYSIiIiIhSLmO+alg/vJiZi0spKS0gkG52cycPJqpE/LDLktERESSiEJYgs1fXsyd81ZRUV0LQHFpBXfOWwWgICYiIiJHqDkywWYtLDwSwOpVVNcya2FhSBWJiIhIMlIIS7CS0opWbRcREZHOSSEswQblZrdqu4iIiHROgYUwM8sys7fM7G0zW2NmP4hxjJnZ/Wa23sxWmtmZQdXTUWZOHk12ZkajbdmZGcycPDqkikRERCQZBdkxvwq40N3LzSwT+IeZ/dXd32hwzKXAqOjPucCD0duUVd/5XqMjRURE5FgCC2Hu7kB59GFm9MebHHYV8IfosW+YWa6ZDXT3bUHV1RGmTshX6BIREZFjCrRPmJllmNkKYCfwvLu/2eSQfGBrg8dF0W0iIiIiaS3QEObute4+HhgMnGNmpzU5xGI9rekGM5thZkvMbMmuXbuCKFVERESkQ3XI6Eh3LwVeBqY02VUEDGnweDBQEuP5c9y9wN0L+vfvH1idIiIiIh0lyNGR/c0sN3o/G/g48E6Tw54BPh8dJXkesD/V+4OJiIiIxCPI0ZEDgd+bWQaRsPeku//FzG4EcPfZwALgMmA9cAi4IcB6RERERJJGkKMjVwITYmyf3eC+AzcHVYOIiIhIstKM+SIiIiIhUAgTERERCYFCmIiIiEgIFMJEREREQqAQJiIiIhIChTARERGRECiEiYiIiIRAIUxEREQkBAphIiIiIiFQCBMREREJgUKYiIiISAgUwkRERERCoBAmIiIiEgKFMBEREZEQKISJiIiIhEAhTERERCQECmEiIiIiIVAIExEREQmBQpiIiIhICBTCREREREKgECYiIiISgsBCmJkNMbOXzGydma0xs1tjHDPJzPab2Yroz11B1SMiIiKSTLoG+No1wLfdfZmZ5QBLzex5d1/b5LhX3f2KAOsQERERSTqBXQlz923uvix6vwxYB+QH9X4iIiIiqaRD+oSZ2TBgAvBmjN3nm9nbZvZXMzu1I+oRERERCVuQzZEAmFlPYC7wTXc/0GT3MuBEdy83s8uA+cCoGK8xA5gBMHTo0IArFhEREQleoFfCzCyTSAB71N3nNd3v7gfcvTx6fwGQaWb9Yhw3x90L3L2gf//+QZYsIiIi0iECuxJmZgb8Fljn7j9r5pg8YIe7u5mdQyQU7gmqpnjMX17MrIWFlJRWMCg3m5mTRzN1grqyiYiISGIF2Rw5EfgcsMrMVkS3fRcYCuDus4GrgZvMrAaoAKa7uwdY0zHNX17MnfNWUVFdC0BxaQV3zlsFoCAmIiIiCRVYCHP3fwDWwjEPAA8EVUNrzVpYeCSA1auormXWwkKFMBEREUkozZjfQElpRau2i4iIiLSVQlgDg3KzW7VdREREpK0UwhqYOXk02ZkZjbZlZ2Ywc/LokCoSERGRdBX4PGGppL7fl0ZHioiISNAUwpqYOiFfoUtEREQC
p+ZIERERkRDoSlgS08SxIiIi6UshLElp4lgREZH0pubIJHWsiWNFREQk9SmEJSlNHCsiIpLeFMKSlCaOFRERSW8KYUlKE8eKiIikN3XMT1KaOFZERCS9KYQlMU0cKyIikr7UHCkiIiISAoUwERERkRCoOTLNadZ9ERGR5KQQlsY0676IiEjyUnNkGtOs+yIiIslLISyNadZ9ERGR5KUQlsY0676IiEjyCiyEmdkQM3vJzNaZ2RozuzXGMWZm95vZejNbaWZnBlVPZ6RZ90VERJJXkB3za4Bvu/syM8sBlprZ8+6+tsExlwKjoj/nAg9GbyUBNOu+iIhI8goshLn7NmBb9H6Zma0D8oGGIewq4A/u7sAbZpZrZgOjz5UE0Kz7IiIiyalD+oSZ2TBgAvBmk135wNYGj4ui20RERETSWuAhzMx6AnOBb7r7gaa7YzzFY7zGDDNbYmZLdu3aFUSZIiIiIh0q0MlazSyTSAB71N3nxTikCBjS4PFgoKTpQe4+B5gDUFBQcFRIk9Sg2ftFREQ+EOToSAN+C6xz9581c9gzwOejoyTPA/arP1h6qp+9v7i0AueD2fvnLy8OuzQREZFQBHklbCLwOWCVma2IbvsuMBTA3WcDC4DLgPXAIeCGAOuREB1r9n5dDRMRkc4oyNGR/yB2n6+Gxzhwc1A1SPLQ7P0iIiKNacZ86RCavV9ERKQxhTDpEJq9X0REpLFAR0dK+mjvyEbN3i8iItKYQpi0qH5kY33H+vqRjUCrg1giQpemuhARkXSg5khp0bFGNnY0TXUhIiLpQiFMWpRMIxuTKRCKiIi0h0KYtCiZRjYmUyAUERFpD4UwaVEyjWxMpkCYKPOXFzPxvhcZfsezTLzvRTWtioh0Egph0qKpE/K5d9o48nOzMSA/N5t7p40LpTN8MgXCRFAfNxGRzkujIyUuiRrZmIg6IH2mutByTiIinZdCmKScZAmEiaA+biIinZeaI0VClI593EREJD5xhTAzu9XMelnEb81smZldEnRxIuku3fq4iYhI/OK9EvZFdz8AXAL0B24A7gusKpFOIpkGPYiISMeKt0+YRW8vA37n7m+bmR3rCSLJLJmWPkqnPm4iIhK/eEPYUjP7GzAcuNPMcoC64MoSCU6i1sIUERFpj3ibI78E3AGc7e6HgEwiTZIiKUdLH4mISDKIN4SdDxS6e6mZfRb4HrA/uLJEgqNpIUREJBnEG8IeBA6Z2RnAbcBm4A+BVSUSIE0LISIiySDeEFbj7g5cBfzS3X8J5ARXlkhwNC2EiIgkg3g75peZ2Z3A54APm1kGkX5hIikn3ZY+EhGR1BRvCPs0cB2R+cK2m9lQYNaxnmBmDwNXADvd/bQY+ycBTwPvRzfNc/cfxlu4SHtoWggREQlbXM2R7r4deBTobWZXAJXu3lKfsEeAKS0c86q7j4/+KICJiIhIpxHvskXXAG8BnwKuAd40s6uP9Rx3XwTsbXeFIiIiImko3ubIfyMyR9hOADPrD7wAPNXO9z/fzN4GSoDvuPuadr6eiIiISEqIN4R1qQ9gUXuIf2Rlc5YBJ7p7uZldBswHRsU60MxmADMAhg4d2s63FREREQlfvEHqOTNbaGbXm9n1wLPAgva8sbsfcPfy6P0FQKaZ9Wvm2DnuXuDuBf3792/P24qIiIgkhbiuhLn7TDP7F2AikcW857j7n9rzxmaWB+xwdzezc4gEwj3teU0RERGRVBFvcyTuPheYG+/xZvY4MAnoZ2ZFwN1E5xZz99nA1cBNZlYDVADToxPCioiIiKS9Y4YwMysDYgUjA9zdezX3XHe/9liv7e4PAA/EU6SIiIhIujlmCHN3LU0kIiIiEoD2jnAUERERkTaIu0+YiBxt/vJirUEpIiJtohAm0kbzlxdz57xVVFTXAlBcWsGd81YBKIiJiEiL1Bwp0kazFhYeCWD1KqprmbWwMKSKREQklSiEibRRSWlFq7aLiIg0pBAm0kaDcrNbtV1ERKQh
hTCRNpo5eTTZmRmNtmVnZjBz8uiQKhIRkVSijvkibVTf+V6jI0VEpC0UwkTaYeqEfIUuERFpEzVHioiIiIRAIUxEREQkBAphIiIiIiFQCBMREREJgTrmi0hCaT1NEZH4KISJSMJoPU0RkfipOVJEEkbraYqIxE8hTEQSRutpiojETyFMRBJG62mKiMRPIUxEEkbraYqIxE8d80XkiPaObNR6miIi8VMIE0kDiZgWIlEjG7WepohIfAJrjjSzh81sp5mtbma/mdn9ZrbezFaa2ZlB1SKSzurDU3FpBc4H4Wn+8uJWvY5GNoqIdKwg+4Q9Akw5xv5LgVHRnxnAgwHWIpK2EhWeNLJRRKRjBRbC3H0RsPcYh1wF/MEj3gByzWxgUPWIpKtEhSeNbBQR6Vhhjo7MB7Y2eFwU3XYUM5thZkvMbMmuXbs6pDiRVJGo8KSRjSIiHSvMEGYxtnmsA919jrsXuHtB//79Ay5LJLUkKjxNnZDPvdPGkZ+bjQH5udncO22cOtmLiAQkzNGRRcCQBo8HAyUh1SKSshI5LYRGNoqIdJwwQ9gzwC1m9gRwLrDf3beFWI9IylJ4EhFJPYGFMDN7HJgE9DOzIuBuIBPA3WcDC4DLgPXAIeCGoGoRERERSTaBhTB3v7aF/Q7cHNT7i4iIiCQzrR0pIiIiEgItWyQiSSkRSzGJiCQzhTARSTqJWsdSRCSZqTlSRJKO1rEUkc5AV8JEJOmk4zqWal4VkaZ0JUxEkk66rWNZ37xaXFqB80Hz6vzlxWGXJiIhUggTkaSTbutYqnlVRGJRc6SIJJ1ELsWUDNKxeVVE2k8hTESSUjotxTQoN5viGIErVZtXRSQx1BwpIhKwdGteFZHE0JUwEZGApVvzqogkhkKYiEgHSKfmVRFJDDVHioiIiIRAV8JERFqgiVZj03kRaR+FMBGRY9A6lrHpvIi0n5ojRUSOQROtxqbzItJ+uhImImkrEc1lmmg1Np0XkfbTlTARSUuJWq8x3daxTBSdF5H2UwgTkbSUqOYyTbQam86LSPupOVJE0lKimss00WpsOi8i7acQJiJpKZHrNWqi1dh0XkTaJ9DmSDObYmaFZrbezO6IsX+Sme03sxXRn7uCrEdEOg81l0lY5i8vZuJ9LzL8jmeZeN+Lre6HKJ1HYFfCzCwD+BVwMVAELDazZ9x9bZNDX3X3K4KqQ0Q6JzWXSRg0f5q0RpDNkecA6919I4CZPQFcBTQNYSIigUjH5jLNUh+cRJzbYw0I0X8naSrI5sh8YGuDx0XRbU2db2Zvm9lfzezUWC9kZjPMbImZLdm1a1cQtYqIJL1ETbshR0vUudX8adIaQYYwi7HNmzxeBpzo7mcA/w+YH+uF3H2Ouxe4e0H//v0TXKaISGpI1LQb6rN0tESdW82fJq0RZAgrAoY0eDwYKGl4gLsfcPfy6P0FQKaZ9QuwJhGRlJWIqyy6mhZboq5gaUCItEaQIWwxMMrMhptZN2A68EzDA8wsz8wsev+caD17AqxJRCRlJeIqi9Z8jC1RV7CmTsjn3mnjyM/NxoD83GzunTZO/cEkpsA65rt7jZndAiwEMoCH3X2Nmd0Y3T8buBq4ycxqgApgurs3bbIUEREiV1kajryD1l9lUZ+l2BJxbuul44AQCUagk7VGmxgXNNk2u8H9B4AHgqxBRCRdJGLajUROYpsIyTLaU1OaSBgs1S48FRQU+JIlS8IuQ0QkJTWdxwoiV3zCaDJLplpEgmJmS929INY+LeAtItKJJFOfJfVPk85Oa0eKiHQyydJnSf3TpLPTlTAREQmF5tSSzk4hTEREQpHIObU0Aa2kIjVHiohIKBI1IlGLZkuqUggTEZHQJKJ/mhbNllSl5kgREUlp6uAvqUohTEREUpo6+EuqUggTEZGUpkWzJVWpT5iIiKQ0LTkkqUohTEREUl6yTEAr0hoKYSIiItKsZFlkPdlqSQSFMBERkTTV3tCSyDnYkqmWZKGO+SIiImmo
PrQUl1bgfBBaWrOaQKIWWU+mWurrSYYVFhTCREREkkwiQkIiQkui5mBLploSEQgTRSFMREQkiSQqJCQitCRqDrZkqiWRV9TaSyFMREQkiSQqJCQitCRqDrZkqiWZVlhQCBMREUkiiQoJiQgtUyfkc++0ceTnZmNAfm42904b1+qO8MlUSzKtsKDRkSIiIklkUG42xTECV2tDQqImsU3EHGzJVMvMyaMbjbKE8FZYMHfv8Ddtj4KCAl+yZEnYZYiIiASi6VQMEAkJbbnqI7F15HxjZrbU3Qti7Qv0SpiZTQF+CWQAD7n7fU32W3T/ZcAh4Hp3XxZkTSIiIslMyzAFL1lWWAgshJlZBvAr4GKgCFhsZs+4+9oGh10KjIr+nAs8GL0VERHptJIlJEiwguyYfw6w3t03uvth4AngqibHXAX8wSPeAHLNbGCANYmIiIgkhSBDWD6wtcHjoui21h6Dmc0wsyVmtmTXrl0JL1RERESkowUZwizGtqajAOI5Bnef4+4F7l7Qv3//hBQnIiIiEqYgQ1gRMKTB48FASRuOEREREUk7gU1RYWZdgXeBi4BiYDFwnbuvaXDM5cAtREZHngvc7+7ntPC6u4DNgRTdWD9gdwe8T2ekcxscndtg6fwGR+c2WDq/wWnp3J7o7jGb8QIbHenuNWZ2C7CQyBQVD7v7GjO7Mbp/NrCASABbT2SKihvieN0OaY80syXNzesh7aNzGxyd22Dp/AZH5zZYOr/Bac+5DXSeMHdfQCRoNdw2u8F9B24OsgYRERGRZKS1I0VERERCoBDWvDlhF5DGdG6Do3MbLJ3f4OjcBkvnNzhtPrcpt3akiIiISDrQlTARERGRECiENWFmU8ys0MzWm9kdYdeTbsxsk5mtMrMVZrYk7HpSmZk9bGY7zWx1g23Hm9nzZvZe9LZPmDWmsmbO7z1mVhz9/q4ws8vCrDFVmdkQM3vJzNaZ2RozuzW6Xd/fdjrGudV3NwHMLMvM3jKzt6Pn9wfR7W367qo5soHoouPv0mDRceDaJouOSzuY2SagwN01X007mdlHgHIi66+eFt32E2Cvu98X/UdEH3e/Pcw6U1Uz5/ceoNzdfxpmbakuukbwQHdfZmY5wFJgKnA9+v62yzHO7TXou9tuZmZAD3cvN7NM4B/ArcA02vDd1ZWwxuJZdFwkKbj7ImBvk81XAb+P3v89kV++0gbNnF9JAHff5u7LovfLgHVE1g3W97edjnFuJQE8ojz6MDP647Txu6sQ1lhcC4pLuzjwNzNbamYzwi4mDQ1w920Q+WUMnBByPenoFjNbGW2uVHNZO5nZMGAC8Cb6/iZUk3ML+u4mhJllmNkKYCfwvLu3+burENZYXAuKS7tMdPczgUuBm6NNPiKp4kFgBDAe2Ab8V7jlpDYz6wnMBb7p7gfCriedxDi3+u4miLvXuvt4Iutdn2Nmp7X1tRTCGtOC4gFz95Lo7U7gT0SagCVxdkT7hNT3DdkZcj1pxd13RH8B1wG/Qd/fNov2p5kLPOru86Kb9f1NgFjnVt/dxHP3UuBlYApt/O4qhDW2GBhlZsPNrBswHXgm5JrShpn1iHYUxcx6AJcAq4/9LGmlZ4AvRO9/AXg6xFrSTv0v2ahPou9vm0Q7N/8WWOfuP2uwS9/fdmru3Oq7mxhm1t/McqP3s4GPA+/Qxu+uRkc2ER22+ws+WHT8xyGXlDbM7CQiV78gsm7pYzq/bWdmjwOTgH7ADuBuYD7wJDAU2AJ8yt3VubwNmjm/k4g05ziwCfhqfT8QiZ+ZXQC8CqwC6qKbv0uk75K+v+1wjHN7LfrutpuZnU6k430GkQtZT7r7D82sL2347iqEiYiIiIRAzZEiIiIiIVAIExEREQmBQpiIiIhICBTCREREREKgECYiIiISAoUwEUlJZvZa9HaYmV2X4Nf+bqz3EhFJJE1RISIpzcwmAd9x9yta8ZwMd689xv5yd++ZiPpERJqjK2EikpLMrDx69z7gw2a2
wsy+FV1cd5aZLY4uVvzV6PGTzOwlM3uMyESWmNn86GLya+oXlDez+4Ds6Os92vC9LGKWma02s1Vm9ukGr/2ymT1lZu+Y2aPRmctFRJrVNewCRETa6Q4aXAmLhqn97n62mXUH/mlmf4seew5wmru/H338RXffG11+ZLGZzXX3O8zslugCvU1NIzLr+BlEZtJfbGaLovsmAKcSWW/2n8BE4B+J/7giki50JUxE0s0lwOfNbAWRZXD6AqOi+95qEMAAvmFmbwNvAEMaHNecC4DHowsh7wBeAc5u8NpF0QWSVwDDEvJpRCRt6UqYiKQbA77u7gsbbYz0HTvY5PHHgfPd/ZCZvQxkxfHazalqcL8W/X4VkRboSpiIpLoyIKfB44XATWaWCWBmJ5tZjxjP6w3siwawMcB5DfZV1z+/iUXAp6P9zvoDHwHeSsinEJFOR/9SE5FUtxKoiTYrPgL8kkhT4LJo5/hdwNQYz3sOuNHMVgKFRJok680BVprZMnf/TIPtfwLOB94GHLjN3bdHQ5yISKtoigoRERGREKg5UkRERCQECmEiIiIiIVAIExEREQmBQpiIiIhICBTCREREREKgECYiIiISAoUwERERkRAohImIiIiEQCFMREREJAQpt2xRv379fNiwYWGXISIiItKipUuX7nb3/rH2pVwIGzZsGEuWLAm7DBEREZEWmdnm5vapOVJEREQkBIGFMDN72Mx2mtnqZvabmd1vZuvNbKWZnRlULSIiIiLJJsgrYY8AU46x/1JgVPRnBvBggLWIiIiIJJXA+oS5+yIzG3aMQ64C/uDuDrxhZrlmNtDdt7X2vaqrqykqKqKysrKN1aaOrKwsBg8eTGZmZtiliIiISDuE2TE/H9ja4HFRdFurQ1hRURE5OTkMGzYMM0tUfUnH3dmzZw9FRUUMHz487HJERERS0vzlxcxaWEhJaQWDcrOZOXk0Uyfkd3gdYXbMj5WWPOaBZjPMbImZLdm1a9dR+ysrK+nbt29aBzAAM6Nv376d4oqfiIhIEOYvL+bOeasoLq3AgeLSCu6ct4r5y4s7vJYwQ1gRMKTB48FASawD3X2Ouxe4e0H//jGn2kj7AFavs3xOERGRRNt/qJofPbuWiuraRtsrqmuZtbCww+sJsznyGeAWM3sCOBfY35b+YMmgtLSUxx57jK997Wutet5ll13GY489Rm5ubkCViYiIdD5VNbVs2HmQwh0HeGd7GYXRn237m29JKimt6MAKIwILYWb2ODAJ6GdmRcDdQCaAu88GFgCXAeuBQ8ANQdUStNLSUn79618fFcJqa2vJyMho9nkLFiwIujQREZG05e4Ul1bwzrYyCneURQPXATbuOkhNXaSHU2aGMfKEHM47qS+j83L4zaKN7Dl4+KjXGpSb3dHlBzo68toW9jtwc1DvfyyJ7pB3xx13sGHDBsaPH09mZiY9e/Zk4MCBrFixgrVr1zJ16lS2bt1KZWUlt956KzNmzAA+mP2/vLycSy+9lAsuuIDXXnuN/Px8nn76abKzO/4LISIikoz2V1RTuL2Md7Z/cHXr3e1llFXVHDkmPzebUwbmcPHYAYzO68WYvByG9+tBZsYHva/yemVx57xVjZokszMzmDl5dId+HkjBZYvaq75DXv3Jr++QB7Q5iN13332sXr2aFStW8PLLL3P55ZezevXqIyMYH374YY4//ngqKio4++yz+Zd/+Rf69u3b6DXee+89Hn/8cX7zm99wzTXXMHfuXD772c+245OKiIgkl3gughyuqWPDrnIKt5exbvuBmE2JvbK6MiavF588M5/ReTmMycvh5AE55GS1PH1T/fslw+jItAthP/jzGtaWHGh2//ItpRyurWu0raK6ltueWsnjb22J+Zyxg3px95Wnxl3DOeec02gKifvvv58//elPAGzdupX33nvvqBA2fPhwxo8fD8BZZ53Fpk2b4n4/ERGRZBfrIsjtc1eyumQ/fY7r1mxT4oj+PTl3+PGMGdjr
SODK65XVroFqUyfkhxK6mkq7ENaSpgGspe1t0aNHjyP3X375ZV544QVef/11jjvuOCZNmhRzionu3bsfuZ+RkUFFRcd3EBQREQnC/orYoxKraup46NX3gUhT4pi8HD5+ygDGDIzdlJhu0i6EtXTFauJ9L1IcYwREfm42f/zq+W16z5ycHMrKymLu279/P3369OG4447jnXfe4Y033mjTe4iIiCS7hk2J70T7b7U0KtGAlfdcEldTYrpJuxDWkpmTRye8Q17fvn2ZOHEip512GtnZ2QwYMODIvilTpjB79mxOP/10Ro8ezXnnndeu+kVERMJWPyrxg7B17KbE0Xm9eOjV5kcldsYABmCRQYqpo6CgwJcsWdJo27p16zjllFPifo1kWa6grVr7eUVERNqqflRiYXRU4jvNjEock5fD6OjPmLxenNS/cVNi0z5hELkIcu+0cSn1N7i1zGypuxfE2tfproRB8nTIExER6QitHZXYXFNiTlZXxuTlMHVCg1GJeTn0SrFRicmiU4YwERGRzqKlUYn1c2/Fako8Z/jxjInOtzU6L4eBvdNjVGKyUAgTEREJSNDdX2pq6yirrKGssoYDldXR+x/cHqisYc6iDS2OShydl8NFpwxgTLQpcXi/HnTrmr6jEpOFQpiIiEgAWpocvKUAVVZZQ1lVNExVxDqm5qhw1RoGrLj7Enpnd85O8clAIUxERCQAsxYWHhWSKqpr+dcnVxzVQb05WZldyMnKJCerKzlZmfTK6sqg3Cxyun+wLXL7wf6crEx6ZUdue3bvysd++nLMqZkG5WYrgIVMIUxERCQAJTGCD0Cdw2fOHdpsgGq4LRFNgkFMzSSJoRAWgp49e1JeXh52GSIigUj1aYDaq6yyml++8B7NTQCVn5vN964Y22H1aFRi8lIIExGRhGmpH1Q6q6tz5i4r4j+fK2TPwSrOG348K4pKqaz+YFm8sK5AaVRicuqcIWzlk/D3H8L+Iug9GC66C06/ps0vd/vtt3PiiSfyta99DYB77rkHM2PRokXs27eP6upqfvSjH3HVVVcl6hOIiCSl5vpBzVpYmNYhYGVRKXc/s4blW0oZPySX336hgDOG5Hb6q4JybJ0vhK18Ev78DaiOttXv3xp5DF58YSoAACAASURBVG0OYtOnT+eb3/zmkRD25JNP8txzz/Gtb32LXr16sXv3bs477zw+8YlPtGt+FRGRZOXuvPLurpgdwKH5/lGpbnd5FbOeK+TJpVvp26M7P/3UGUybkE+XLpHf9boCJceSfiHsr3fA9lXN7y9aDLVVjbdVV8DTt8DS38d+Tt44uPS+Zl9ywoQJ7Ny5k5KSEnbt2kWfPn0YOHAg3/rWt1i0aBFdunShuLiYHTt2kJeX14YPJSKSnGrrnAWrtvHgyxtYu+0AXSzS8bwpB+55Zg1f/vBwBvc5rsPrTLSa2jr+543N/Oz5d6k4XMuXLxjO1y8aFdfM8SL10i+EtaRpAGtpe5yuvvpqnnrqKbZv38706dN59NFH2bVrF0uXLiUzM5Nhw4ZRWdn8KvIiIqmksrqWecuK+e9FG9i85xAn9e/BT64+nS7A959e06hJsnvXLpwxuDf/+8Zm/veNzXxi/CBu+ugIRg3ICe8DtMNrG3ZzzzNreHdHOR8e1Y+7rxzLyBNS87NIuNIvhB3jihUAPz8t0gTZVO8hcMOzbX7b6dOn85WvfIXdu3fzyiuv8OSTT3LCCSeQmZnJSy+9xObNm9v82iIiyaKssprH3tzCQ/94n11lVZw+uDezP3sml4zNO9IE1zWjS8x+UCWlFTz06vs8/tYW5i0r5uKxA/japBFMGNon5E8Vn+LSCv7j2XU8u2obg/tk89+fO4tLxg5QNxNps/QLYS256K7GfcIAMrMj29vh1FNPpaysjPz8fAYOHMhnPvMZrrzySgoKChg/fjxjxoxpZ+EiIuHZXV7FI//cxB9e38SByhouGNmPX3x6PB8a0feoENJcP6hBudnc
deVYbrlwJL9/bROPvLaJ59fu4LyTjudrk0by4VH9kjLQVFbXMmfRRn798nrc4VsfP5mvfvQksjIzwi5NUpy5NzeTSXIqKCjwJUuWNNq2bt06TjnllPhfJMGjIztaqz+viEgbbd17iIde3cgfl2ylqqaOKafmceNHR3DGkNx2v/bBqhoef2sLD736PtsPVHJafi9u+uhIppyWR0aX8MOYu/P82h38+7Nr2bq3gsvG5fHdy05Jiz5t0nHMbKm7F8Ta1/muhEEkcKVQ6BIR6Wjv7ihj9ssbePrtEroYfHJCPjM+MoKRJ/RM2Hv06N6VL3/4JD53/onMX17Mf7+ykZsfW8bwfj346kdO4pNn5tO9azhXm9bvLOeHf1nLond3MeqEnjz65XOZOLJfKLVI+uqcIUxERGJatmUfv35pAy+s20F2ZgbXf2gYX7pgOINyswN7z+5dM/j02UO5+qwhLFyznQdf3sAd81bx8xfe5csXnMS15w6lZ/eO+XNVVlnN/3txPQ//432yu2Vw1xVj+dz5J5KZ0f7lg0SaUggTEenk3J1F7+3m1y+t583395J7XCbf/PgovnD+MPr06NZhdWR0MS4bN5BLT8vjn+v38OuX1/PjBet44KX1fOH8E7l+4nCOD6ieujrnT8uLue+5d9hdXsU1Zw1h5pTR9OvZPZD3E4E0CmHunpQdOhMt1frwiUjyqq1z/ro6MsfXmpIDDOydxfevGMv0s4fQo4OuPMViZlwwqh8XjOrHiq2lPPjyeu5/cT1zXt3I9LOH8pWPnER+Aq/MrSraz93PrGbZllLOGJLLQ58vSEifN5GWpEXH/Pfff5+cnBz69j16lE46cXf27NlDWVkZw4cPD7scEUlRVTXROb5e2cCm6BxfN350BFPH59Ota3I2u63fWcbsVzYyf3kxAFeNz+emSSe1a36uPeVV/PRvhTyxeCt9e3Tj9ilj+JczBx+ZakMkEY7VMT8tQlh1dTVFRUWdYjLUrKwsBg8eTGamZmUWkdYpr6rhsTc389Cr77MzOsfX1yaN4OKxyTEaMR4lpRX85tWNPPHWViqqa7lk7AC+9rGRjG/Flaua2jr+Nzrb/aHDtXzhQ8O49eOa7V6CkfYhTEREmrenvIpHXtvE71+LzPE1cWRfbvroSCaOTN3Wg70HDx/5TPsrqvnQiL7cNGkEF4w89lxjr2/Yww/+vIZ3tpdxwcjIbPepOnO/pAaFMBGRAM1fXhxzhviw6/jSBcPYsreCJxZvoaqmjslj87hpUmLm+EoW5VU1PPHWFn7z6kZ2HKhiXH5vbpo0gsmn5vHnt0uOnI8TenVnYO8sVmzdT35uNt+/4hQmn5qXsiFUUodCmIhIQOYvL+bOeasarZWYnZnBvdPGdWgQi1UHgAFXnzWYr340sXN8JZuqmlrmLy9m9isbeX/3Qfr37EZpRTXVtY3/xk05dQC/mD5Bs91Lh1EIExFJsKqaWtbvLOezD73JvkPVR+3PzDBO7sBmrnd3lB0VOAAG9OrOm9/9eIfVEbbaOmfhmu3c+sTymOcjPzebf95xYQiVSWcV2oz5ZjYF+CWQATzk7vc12d8b+F9gaLSWn7r774KsSUSkNdydon0VvLO9jMLtB6K3ZWzcfZDauub/EVtd6wzsndVhda4pORBz+84DVR1WQzKon2vs5keXxdxfUloRc7tIGAILYWaWAfwKuBgoAhab2TPuvrbBYTcDa939SjPrDxSa2aPufjioukREmrP/UDXvRINWfeh6d0c55VU1R44Z3CebMXk5XHLqAEbn9eJHf1nLzrKjg05+bjYPfeHsDqt94n0vUhwjYAQ5030yG5SbrfMhSS/IK2HnAOvdfSOAmT0BXAU0DGEO5FikZ2RPYC9Q0/SFREQSqaqmlg07D1K4Ixq4tkWubm0/8ME0N72zMxmdl8O0M/MZnZfDmLwcTh6QQ06TaQzq6jxmn7CZk0d32OcBmDl5dFLUkSx0PiQVBBnC8oGtDR4XAec2
OeYB4BmgBMgBPu3udQHWJCJppKVRifVNiYXbyyjcURYNXAd4f/dBaqJNiZkZxsgTcjh/RF9G5+UcCVx5vbLiGjlX/35hj45MljqShc6HpILAOuab2aeAye7+5ejjzwHnuPvXGxxzNTAR+FdgBPA8cIa7H2jyWjOAGQBDhw49a/PmzYHULCKpI9ZowG5du3Dl6QPpnpkRCV7by2I2JUbCVi/G5OUwvF8PLc4sIoEJq2N+ETCkwePBRK54NXQDcJ9HkuB6M3sfGAO81fAgd58DzIHI6MjAKhaRlFBTW8ePF6w7ajqGwzV1zF1WHHdToohImIIMYYuBUWY2HCgGpgPXNTlmC3AR8KqZDQBGAxsDrElEUtChwzUs31LK4k17WbJpH8u27OPQ4dqYxxqw4q6LNQmniCS9wEKYu9eY2S3AQiJTVDzs7mvM7Mbo/tnAvwOPmNkqIr87b3f33UHVJCKpYXd5FUs27WXxpn0s2bSX1SUHqK1zzOCUvF586qzBPPN2Scz5uQblZiuAiUhKCHSeMHdfACxosm12g/slwCVB1iAiyc3d2bznEG9t2suS6JWujbsPAtC9axfGD8nlpo+OoGBYH848sc+RRZYnDO2j0W8iktICDWEiIk3V1NaxblvZkdC1eNM+dpdH5tnKPS6TghOP59NnD6Fg2PGclt+L7l1jLy+j0W8ikuoUwkQkUIcO17BiSymLN+1j8aa9jfpzDe6TzUdG9aNg2PGcPawPI/r3pEuX+JsSp07IV+gSkZSlECYirXas+bki/bn2RTvRN+7PNSban6tg2PEUDOvDwN6avVxEOi8t4C0irRJrfq7MDOPMobnsKjvcqD/XGUNyOScauBr25xIR6SxCW8BbRNLPrIWFR83PVV3rvLVpHxeNOSGu/lwiIqIQJiKtVBJjUWQAnA5dsFpEJNVprQ4RidtrG3bT3BRcg3LVv0tEpDV0JUxEWlRVU8vP/vYuc17dSL8e3ThQWUNVTd2R/ZqfS0Sk9RTCROSY1u8s4xuPr2DttgNcd+5Qvnf5KfxtzQ7NzyUi0k4KYSISk7vzP29s5sfPrqNH96785vMFXDx2AKD5uUREEkEhTESOsrOsktueWsnLhbuYNLo/P7n6dE7IyQq7LBGRtKIQJiKNvLB2B7fPXUl5VQ0/vOpUPnfeiVoQW0QkAAphIgJElhf60bPreOzNLYwd2Isnpo9n1ICcsMsSEUlbCmEiwsqiUr75xAre33OQr370JP714pM10aqISMAUwkQ6sdo6Z/YrG/j58+/SP6c7j375XD40ol/YZYmIdAoKYSKd1Na9h/j2k2/z1qa9XHH6QH48dRy9j9PajiIiHUUhTKSTcXfmryjmrvlrcOBn15zBJyfkq/O9iEgHUwgT6UT2V1Tzvfmr+fPbJZw9rA8/u2Y8Q44/LuyyREQ6JYUwkU7i9Q17+PaTK9hZVsV3LjmZmyaNJKOLrn6JiIRFIUwkzR2uqeO/ni9kzqKNDOvbg7k3fYgzhuSGXZaISKenECaSxtbvLOPWJ1awpuQA154zlO9fcQrHddP/9iIiyUC/jUXSkLvzv29s5kfRdR/nfO4sLjk1L+yyRESkAYUwkTSzq6yK2556m5cKd/HRk/sz61Na91FEJBkphImkkb+v28FtT0XWffzBJ07l8+dr3UcRkWSlECaSBioO1/KjZ9fy6JtbOGVgLx6fPp6Tte6jiEhSUwgTSXGrivZz6x+X8/7ug8z4yEl8+xKt+ygikgoUwkRSzPzlxcxaWEhJaQU5WV0pq6xhQK8sHv3SuXxopNZ9FBFJFV3iOcjM5prZ5WYW1/EiEoz5y4u5c94qiksrcOBAZQ1dDL5x0UgFMBGRFBNvqHoQuA54z8zuM7MxAdYkIs2YtfAdKqprG22rdfjVSxtCqkhERNoqrhDm7i+4+2eAM4FNwPNm9pqZ3WBmmUEWKCIRm3YfpLi0Mua+ktKKDq5GRETaK+7mRTPrC1wPfBlYDvySSCh7PpDK
RASAmto65izawORfLKK5ySYG5WZ3aE0iItJ+8fYJmwe8ChwHXOnun3D3P7r714Gex3jeFDMrNLP1ZnZHM8dMMrMVZrbGzF5py4cQSVdrSw7wyV+/xn8seIePnNyfuz8xluzMxiMfszMzmDl5dEgViohIW8U7OvIBd38x1g53L4i13cwygF8BFwNFwGIze8bd1zY4Jhf4NTDF3beY2Qmtql4kTVXV1PLAi+t58OUN5B6Xya+uO5PLxuVhZuRmdzsyOnJQbjYzJ49m6oT8sEsWEZFWijeEnWJmy9y9FMDM+gDXuvuvj/Gcc4D17r4x+pwngKuAtQ2OuQ6Y5+5bANx9Z2s/gEi6Wbp5L7c9tZINuw4y7cx8vn/5WPr06HZk/9QJ+QpdIiJpIN4+YV+pD2AA7r4P+EoLz8kHtjZ4XBTd1tDJQB8ze9nMlprZ5+OsRyTtHKyq4Z5n1nD17NeprK7j9188h59dM75RABMRkfQR75WwLmZm7u5wpKmxpb8MsfoQe4z3Pwu4CMgGXjezN9z93UYvZDYDmAEwdOjQOEsWSR2vvLuL785bRcn+Cr5w/jC+M3k0PbtrLmURkXQW72/5hcCTZjabSJC6EXiuhecUAUMaPB4MlMQ4Zre7HwQOmtki4AygUQhz9znAHICCgoKmQU4kZe07eJh/f3Yt85YVM6J/D5668XzOOvH4sMsSEZEOEG8Iux34KnATkStcfwMeauE5i4FRZjYcKAamE+kD1tDTwANm1pXIlbVzgZ/HWZNIynJ3Fqzazt3PrKb0UDVfv3AkN39sJFmZWvNRRKSziCuEuXsdkVnzH4z3hd29xsxuIXIVLQN42N3XmNmN0f2z3X2dmT0HrATqgIfcfXVrP4RIKtlxoJLvzV/N82t3MC6/N//zpXM5ZWCvsMsSEZEOZtFuXsc+yGwUcC8wFsiq3+7uJwVXWmwFBQW+ZMmSjn5bkXZzd/64eCs/XrCOwzV1fPuSk/nixOF0zdCSrCIi6crMljY3nVe8zZG/A+4m0lT4MeAGYne8F5EYNu0+yJ3zVvH6xj2cd9Lx3DftdIb16xF2WSIiEqJ4Q1i2u/89OkJyM3CPmb1KJJiJSDNqauv43T838V/PF5LZpQv3ThvHpwuG0KWL/g0jItLZxRvCKs2sC/BetJ9XMaDZ7UWOYd22A9w+dyUri/bz8VMG8KOpp5HXO6vlJ4qISKcQbwj7JpF1I78B/DuRJskvBFWUSCqrqqnlVy+u59cvb6B3diYPXDeBy8cNxExXv0RE5AMthrDoxKzXuPtMoJxIfzARiWHp5r3cPncV63eWM21CPt+/YqxmvBcRkZhaDGHuXmtmZzWcMV9EGjtYVcOshYX8/vVNDOqdzSM3nM2k0WqxFxGR5sXbHLkceNrM/g84WL/R3ecFUpVICln07i7ujC459PnzTmTmlDFackhERFoU71+K44E9wIUNtjmgECadVumhw/z7X9Yxd1kRI/r34P++ej4Fw7TkkIiIxCfeGfPVD0w6vfnLi5m1sJCS0gpyj8ukuraOyuo6bvnYSG65UEsOiYhI68QVwszsd0SufDXi7l9MeEUiSWj+8mLumLeSyuo6APYdqsYMvnPJaG7+2MiQqxMRkVQUb3PkXxrczwI+CZQkvhyRozW8AjUoN5uZk0czdUJ+q16jsrqWssoayiqro7c1HKisPvL4QKN9HxxTf3/PwcNHvaY7PPbmFoUwERFpk3ibI+c2fGxmjwMvBFKRSAPzlxdz57xVVFTXAlBcWsFtT63k7a37OGVg72iQOjpUNQ1Th2vrWnyvnt27kpNV/5NJ357dGNavBzlZXXnszS0xn1NSWpHQzysiIp1HW4dwjQKGJrIQkVhmLSw8EsDqHa6t43evbW60rUe3DHKyMsnJ6kqv7MYBKierK72i+3KyupLTPfNI0Krf1zOrKxnHWErolcJdFMcIXINysxPzQUVEpNOJt09YGY37hG0Hbg+kIpEGmrvSZMCi2z4WV4BK
hJmTRze6IgeQnZnBzMmjA31fERFJX/E2R+YEXYhILAN6ZbH9QOVR2wflZjPk+OM6rI76Pmjt7ZsmIiJSL94rYZ8EXnT3/dHHucAkd58fZHEiJ+R0OyqEhXUFauqEfIUuERFJmC5xHnd3fQADcPdS4O5gShKJeH7tDlYWH+DycXnk52ZjQH5uNvdOG6cwJCIiKS/ejvmxwprWZZHAlFfVcNfTqxk9IIdfTJ9AZka8/14QERFJDfH+ZVtiZj8zsxFmdpKZ/RxYGmRh0rn9198K2X6gkv+YNk4BTERE0lK8f92+DhwG/gg8CVQANwdVlHRub28t5ZHXNvGZc4dy1ol9wi5HREQkEPGOjjwI3BFwLSLU1NZx57xV9O/ZndumjAm7HBERkcDEdSXMzJ6Pjoisf9zHzBYGV5Z0Vg//833WbjvAPZ84lV5ZmWGXIyIiEph4myP7RUdEAuDu+4ATgilJOqutew/x8+ff46IxJ3DpaXlhlyMiIhKoeENYnZkdWabIzIbReAZ9kXZxd77/9GrM4IdTT8Ms2BnwRUREwhbvNBP/BvzDzF6JPv4IMCOYkqQz+svKbbxcuIvvXzGWfK3HKCIinUC8HfOfM7MCIsFrBfA0kRGSIu22/1A1P/jzWsbl9+b6Dw0LuxwREZEOEe+yRV8GbgUGEwlh5wGvAxcGV5p0Fvc99w57D1bxyA1nB74Qt4iISLKIt0/YrcDZwGZ3/xgwAdgVWFXSaSzetJfH39rCFycO57T83mGXIyIi0mHiDWGV7l4JYGbd3f0doONXUJa0UlVTy53zVpGfm823Lj457HJEREQ6VLwd84ui84TNB543s31ASXBlSWfw369sZP3Och6+voAe3bUUqYiIdC7xdsz/ZPTuPWb2EtAbeC6wqiTtbdxVzgMvrefycQO5cMyAsMsRERHpcK1eGdndX3H3Z9z9cEvHmtkUMys0s/Vm1uyyR2Z2tpnVmtnVra1HUo+7829/Wk33rl24+8qxYZcjIiISilaHsHiZWQbwK+BSYCxwrZkd9Rc3etx/AloGqZN4amkRr2/cwx2XjuGEXllhlyMiIhKKwEIYcA6w3t03Rq+aPQFcFeO4rwNzgZ0B1iJJYk95FT9esI6CE/tw7dlDW36CiIhImgoyhOUDWxs8LopuO8LM8oFPArMDrEOSyI+fXcfBqhr+Y9o4umhOMBER6cSCDGGx/sI2XW/yF8Dt7l57zBcym2FmS8xsya5dmp4sVf3jvd3MW17MVz8ygpMH5IRdjoiISKiCnBegCBjS4PFgjp7WogB4IrpYcz/gMjOrcff5DQ9y9znAHICCggItHJ6CKqtr+bf5qxjW9zhuuXBk2OWIiIiELsgQthgYZWbDgWJgOnBdwwPcfXj9fTN7BPhL0wAm6eH+v7/H5j2HePTL55KVmRF2OSIiIqELLIS5e42Z3UJk1GMG8LC7rzGzG6P71Q+sk3hn+wHmLNrItDPzmTiyX9jliIiIJIVApyl39wXAgibbYoYvd78+yFokHHV1znfnrSInqyvfu1xzgomIiNQLsmO+CI++tYVlW0r53uVjOb5Ht7DLERERSRoKYRKYHQcq+clf32HiyL5MOzO/5SeIiIh0IgphEpgf/HkNVbV1/GjqOKIjYEVERCRKIUwC8fd1O1iwajvfuHAkw/v1CLscERGRpKMQJgl3sKqGu55ew8kDejLjIyPCLkdERCQpBTo6Ujqnnz3/LsWlFTx14/l066qcLyIiEov+QkpCrSraz+/++T6fOXcoBcOOD7scERGRpKUQJglTU1vHHfNW0rdnd26bMibsckRERJKamiMlYR55bRNrSg7wq+vOpHd2ZtjliIiIJDVdCZOEKNp3iP/627tcOOYELhuXF3Y5IiIiSU8hTNrN3bnr6TUA/PCqUzUnmIiISBwUwqTdFqzazovv7OTbl5zM4D7HhV2OiIhISlAIk3bZX1HNPX9ew2n5vbj+Q8PCLkdERCRlqGO+tMtPnnuHPeVV
PPyFs+maoUwvIiISL/3VlDZbunkvj765hRsmDmfc4N5hlyMiIpJSFMKkTQ7X1HHnvFXk52bzrxefHHY5IiIiKUfNkdImcxZt4N0d5fz2CwX06K6vkYiISGvpSpi02vu7D3L/i+u5bFweF50yIOxyREREUpJCmLSKu/Nvf1pF94wu3H3lqWGXIyIikrIUwqRV5i0r5rUNe7jt0jEM6JUVdjkiIiIpSyFM4rb34GF+9Oxazhyay2fOGRp2OSIiIilNIUzi9uNn11FWWcO9006nSxctTSRyxMon4eenwT25kduVT4ZdkYikAA1rk7i8tn43c5cVcfPHRjA6LyfsckQiVj4Jf/8h7C+C3oPhorvg9Gs6voY/fwOqKyKP92+NPIaOr0VEUopCmDRr/vJiZi0spKS0gi5djH49Mvn6haPCLkskIt7w4w51NVBTCTWHI7e1VVDT4KfR40qoPdz4+GMd8+7CyG1D1RWRcKgQJiLHoBAmMc1fXsyd81ZRUV0LQG2dc6CyludWb2fqhPyQq5NOxx0q90PZNjhQEvlZ+N0PAli96gr4043wt+81Dlhe1/4aunSFjO7QtcFPRvejA1i9/Vsj79+1e/vfW0TSkkKYxDRrYeGRAFbvcG0dsxYWKoRJYtXWQPmOSLAqK4ED2xrcRkNX2TaoPhTf63ktjL40dmDqmgVdu0VuM7o1eRzr+PrndIcuGbHf7+enRQJXc/vO/jIUfBF69m/b+RGRtKUQJjGVlFa0art0MvH2xao80DhIHbltELQO7jz6SlVGN8jJg5xBMPB0OHkK9BoEvQZGtvUaCI9cHnn/pnoPgSt/GcznjuWiuxo3iwJkZsM5X4Uda+Dl/4BXfwrjPgXn3hj5PCIiKIRJE+7OX1dvp4sZte5H7R+Umx1CVZJUYvXFmv81WD0PsnMbB63DZUc/Pys3EqhyBsKAUz8IVb3yI9t6DYLj+oK1MAL3ortjh5+L7krcZ41HffhsLpTufg/enA0rHoMVj8KJF8B5N8Loy5q/uiYiwUqGQT2AeYw/tMmsoKDAlyxZEnYZaendHWXc88waXtuwh0G9s9h98DCHaz64QpGdmcG908apObKz++loKN8ee1+vwdGrVQM/CFpNb7sdl7hakuQXaVwq9sGy/4G3fgP7t0Du0MjVsjM/B1m9w65OpPNo+g9JiPwD7sr7A/n9YWZL3b0g5j6FMNlfUc0vXniXP7y+mZ7du/KdS07m2nOG8peV246MjhyUm83MyaMVwDqr8l2w+il4+wnYtqKZgwzuKe3QslJSbQ0ULoA3HoQtr0FmD5jwmUhTZd8RYVcniZYs/1DojHXU1sDhcjh8MPoTvf9/18Oh3Ucf33sIfGt1wstQCJOY6uqc/1u6lZ88V8jeQ4e57pyhfOeS0fTp0S3s0iQZVFdA4V8jwWv9C5EO7wPPgH2boTJG2AroF1haK1kRaapcPTcy5cWoyZGmypM+1nJzbDJLlj/4YevgKy4pXcfYqxoHpbjvN7OvqjwyMrpVgvmHZGghzMymAL8EMoCH3P2+Jvs/A9wefVgO3OTubx/rNYMOYQ3nxkrnqz/Lt+zj7mfWsLJoP2cP68PdV57KaflqEun06upgy+uw8glYMx+qDkT6bJ1+DZwxHU44JXl+oaeTsh2w5GFY8ls4uAv6nxIJY+OuSWzzbUdIpu9H0GHQPTJFSeWByP8rlfsjP1UHItue/37kcVOZ2ZHA7bVQF/3xhrd1TR4fa3tdy8fVHm7+M1gXwCK3Zsd4bC3s7/LBMc3t37shMmdfe2UeB916RH96Nrjf9HEz9+d+Ccp3Hv266XQlzMwygHeBi4EiYDFwrbuvbXDMh4B17r7PzC4F7nH3c4/1ukGGsKZzY0H69YPaVVbFfz73Dk8tLWJAr+5897JT+MQZg7BU/le3tN/u9ZHg9fYfI/2VMnvA2E9EgtewDx/dgfz/t3fv0XHXZR7H30+SNm2TNik0vSS903uxF4sLWlBokSLQi8cb
iCyL7vF4RdR1pauy6KLrOe4qsCrIERdcWXRlYYugtFy0ShFESktpCy2X3tI7bVqalqZJnv3j+wszTyNifAAAEclJREFUk05CL8l8J5nP65w5M/ObX6ZPvif9zTPfy/NVT0fnaDwcesWe/DFsXwW9+8OMq0KZi4o8vwa5hw+2W2eGRLK10n4w8+q3KAeSVhKkrXOOdTHDWyWDRyVQ++Hwvszn6QnV4WzP90PzkRNrrwHjwIrD72NFyX1x5n22Y1nPLcpyXtrxZTe2Hce7vxISOfdklbKnPfdWz1u/nv6ct3jdYc3/tR3HrG8cWwLVs+zkF7QUwpwwM3snIamakzxfCODu/9rG+f2B59293StNZyZhM7/7GLVZSjDUVPZm2bWzOuXfzJUjTc3c+cQGbnpkPW80NvGJs0fzuVljKC/VAtmCVf8arL4XVt4Ntc+Ei/voc2HqZTDh4nCxkzjcYeMT8NQt8MKDgIXhmrM+A8PeET+217fBrhdg14up+51rsw9Td7SiklZ13tpI5jYsg8YsJXWKisMK3WNKoAxK+4YEsle/VvcVrY61ft4PfjYH9tce/ba5Hrpvq5ZdocYBOf0i2V4S1pmfwDVAemtvAdrr5foE8LtsL5jZJ4FPAgwfPryj4jtKd62N9af1u/jmb9bw0s4DnDe+iuvmTmbUAH3AHrfu0PvTeBjWPRR6vNYvDkMDg06HC26A0z8YVjZKfGYwcma47d0If7ktrKxcfS/UzAjJ2KT5UNyj82JwD3/rbyZaL6QSrsP7U+f17h+GTye/PwxX//F72XvCKobB55e32hYq27ZRx7CN1JvPsx1rgIN7sidgEIboJs1PJUu9KrInUL36Qc++oTfpRJ1/fX6UUWmrll2hxgHh2p0H1+/OTMKyjW9l7XYzs/MISdjZ2V5399uA2yD0hHVUgK1VV/bO2hPmwBW3P8X8aTXMmTyIvr068cLXgTbvOcgND65h8eodjDi1D7dfeQazJw6KHVbX1JU3aXaHzU+FCfar7w1DKuWD4axPw5RLYfDpsSOU9vQfAXO+DecuDL2WT94S5rQs+XoYppxxFZSdeuLv39wchqBb92rtXhcmObcoq4KqCeHvvWpC6lY2IHMRQe/+bX/QlvQMt1xor9flkh/kJoa3qiGXK4ojb0UfjjSzKcB9wPvcfd1bvW+u54SVlhTxnnEDeGH7ATbtOUhpSRGzJw5k/rQazh1fRWlJ/hVbPNTQxC1LX+YnS1+myIzPzRrD358zKi9jzQsNB0PV9vrdYU5L/c7wTf7ArnBfvytMVs82obRnWZhTUTkcKoaH+/KB+bGybc8rocfruV/B3lfDZNYJl4R5XqPPVaHQrqq5OaxWffLH8Mrvw5DclA/DmZ+GHc+3/QHX3AR7N2QmW7vWhmKy6VtClQ+GqvFJkjU+9G4NGH98iV4+9Brn0wIBKWix5oSVECbmzwZqCRPzP+ruq9POGQ48Bvytuz9xLO8ba3Wku/Ps5joWPVvLA89t47X6Bvr2KuGi04cwf3o1Z446leKiuB+87s5Dz2/nhgfXUlt3iHlTq1l40QSGVHTxKvfHe0F3D4Ux63eHhOrAztTjjORqZ3h8pD77+5RWhG/55QNDEnasikuhcliSmCX3lSNSx8oHn9wQR3sO7oHV94XEa/NTgMGod4d5XhMvCfNbpPvYuTaUuFj5qzD8ZkWZW0AV9YDq6eG13eszNxvvV5OZbFVNhKpxoSeru8iHZFAKXswSFRcBNxJKVPzM3b9tZp8CcPdbzeynwAeAjcmPNLYVaIt8qBPW2NTMspdfY9GztSxevZ36hiYG9Stl7pRqFkyvYXJ1v5yvNkyvdj9hcF++OW8yZ44+iSGKfJHt22xxaShu2X9UqqeqfleSbO0KCVe2CbdWFLbDKRuYSq4yHleFW/lA6DMAevRK/Wx7QxufeTK8Vrfp6Nu+zUfPjynqET4QKocnidmIzISt7xAobmemQOsPlvP+KcxrWXk3rFsc5sZU
TYSpHwklDvJ9VZ2cvIN74OZp2cshWDGcNiuVcA2cCAPGqkq/SI6oWGsnOtTQxCNrd7BoxVaWrtvJkSZndFUZC6bVMG9qNSM7eQJ8W9XuS4o7qacl174/GfZn2aS5RXHPkEiVVyUJVVXyuCrtePK4zyknPgR3MkMbDQeTJG0z1G1MJWctidqBHZnnF5WELX4qR6T1pCUJ2/ZV8Ni/ZMbRoqwqbBI99VIYPCU/hkQld66vJPu0W+1kIBKTkrAcqTvYwG9XbWfRilqeenUPAFOHVbJgWjUXTxnCwL693uIdjl3raveXJdXuT+ku1e4P7gkrwv6QtaIJYHDtxrCKKVfJRmcNbRx5I7xn3cZWPWrJ49e30caalpQ+A+DLL7bfgybdWz4t/xeRNykJi2Br3SF+s3Iri1ZsZc22/RQZzBwzoENWWD67aS/X37+alVv2ccaI/lw/rxtVu6/bBH/+ESz/eZgsXNIrcx5Li0L6YGlsCL2BdZvg5/PbOEm9HQVPE9FF8pKSsMjW73idRSu2smhlLZv3HKK0pIjzJw5i3rTq41phmV7tfmDfUhZeNIEF02q6R7X7Hath2U2w6p7Qs/W2D8G7rg6rvfTBkqLeDmmPJqKL5B0lYXnC3Vm+qY77V6RWWPbrVcJFbxvCvGnVnDXqVIqK7KgVml9671j2HjzyZrX7j589is/PGtv1q923VAVfdiOsXxK2yplxZShEWTksdZ4+WFLU2yEi0qUoCTseOfrAP9LUzLKXdnP/iq1vrrAc3K8XE4eU88TLezjcmFpmboQZQe8ZV8V1cydxWlV5h8eTU83N8OJvQ/K15emwYvHMT4XCk31OiR1d/lNSKiLSZSgJO1aRehlSKyxreWRtlp3dgVPKevLM18/v2kOPjQ2hftUTN4dq3JXDw5DjtMuhZ5/Y0YmIiHS4WHtHdj2Pfuvopf9HDoXjnZiE9e5ZzNyp1cydWs2oax/Mug5ub31D103A3tgPz9wRKny/vg0GvQ0+cDtMWqDVfCIiUrD0CZhuXxv1qPZthj99H8ZdGAoddmIy1Nb+ldWVXbDq/YGdYZ+7p2+Hw/tg5Dkw/4dw2mzVsBIRkYKnJCxdxdDsK8+KesCj3wy3imEw9oKQkI06JwxXdqCvzBl/1P6VvXsU85U54zv03+lUr70MT/wHrPjvUL194lw4+xqomRE7MhERkbyhJCzd7OvanhM28pywgm/dYlj5S/jr7VDSO+zLN+4CGDsnc0XfCVowPWwxk23/yry39Vl4/EZYe3+o+j71sjDna8CY2JGJiIjkHU3Mb+1YVp41HoYNj4eEbP1i2LshHB84OSRk4y6Eoe848S1yuhJ3eOUP8PgP4NWloYL9GR+Hsz4NfQfHjk5ERCQqrY7sTO6wez2seyj0lG36MzQ3Qu/+MOb8kJCdNqv7lV5oboI1i0KZiW0roXxwSLzOuEobA4uIiCS0OrIzmUHVuHCbeTUcqoOXHwsJ2folsOrXYEUw7MzUXLJOntzfYbL1Ck6cCyvuCnO+9m6AU8eE4dqpl0JJaeyIRUREugz1hHWm5iaoXR6GLNcthu3PheOdPLm/Q2SrmVbUI+zl2PB6mGQ/8xqYcHFhDLuKiIicAA1H5ov9W5PJ/Uvgld8nG1S3TO6fE24VQ3NfEb2pMcTScms4CP/1fqjPUji2pBdcfg+MPLtr9OaJiIhEpCQsHx15AzY+HhKydQ9B3cZwvG8N1O8I88pa9OgNc74TVmCmJ0pvJk6HoKE+3B9J7rO+nv6zaec2NRxH4AbX13VoU4iIiHRXSsLynXvYxmfdYnjsBmg6fHLvZ8XQswx69AkJXM+ycN+jT7j1TI73KGv79Qe+BAd3H/3eFcPgi8+fXHwiIiIFQhPz850ZVI0Pt4eva/u8uTelJU59UklT68SqpOfJx9R4OHvNtNntxCciIiLHTElYvmmran/FMJjxd7mLo2UOWi7npomIiBQQJWH5pq2q/TF6oKZ8WEmXiIhIJymKHYC0
MuXDoe5WxTDAwv3cm5UMiYiIdDPqCctH6oESERHp9tQTJiIiIhKBkjARERGRCJSEiYiIiESgJExEREQkgi5XMd/MdgEbc/BPDQCylIwvSGqLTGqPFLVFJrVHJrVHitoiUyG1xwh3r8r2QpdLwnLFzP7a1jYDhUZtkUntkaK2yKT2yKT2SFFbZFJ7BBqOFBEREYlASZiIiIhIBErC2nZb7ADyiNoik9ojRW2RSe2RSe2RorbIpPZAc8JEREREolBPmIiIiEgESsJaMbMLzexFM3vJzK6NHU9MZjbMzH5vZmvNbLWZfSF2TLGZWbGZPWtmD8SOJTYzqzSze8zsheRv5J2xY4rJzL6Y/D953szuNrNesWPKFTP7mZntNLPn046dYmYPm9n65L5/zBhzqY32+F7yf+U5M7vPzCpjxphL2doj7bV/MDM3swExYotNSVgaMysGfgS8D5gEXGZmk+JGFVUj8GV3nwicBXy2wNsD4AvA2thB5ImbgIfcfQIwlQJuFzOrAa4GznD304Fi4NK4UeXUHcCFrY5dCzzq7mOBR5PnheIOjm6Ph4HT3X0KsA5YmOugIrqDo9sDMxsGvBfYlOuA8oWSsEx/A7zk7q+4ewPwS2B+5Jiicfdt7r48efw64UO2Jm5U8ZjZUOBi4KexY4nNzPoB7wZuB3D3BnevixtVdCVAbzMrAfoAWyPHkzPu/kdgT6vD84E7k8d3AgtyGlRE2drD3Ze4e2Py9ElgaM4Di6SNvw+AHwD/CBTs5HQlYZlqgM1pz7dQwElHOjMbCUwHnoobSVQ3Ei4YzbEDyQOjgV3AfybDsz81s7LYQcXi7rXAvxG+0W8D9rn7krhRRTfI3bdB+EIHDIwcTz75OPC72EHEZGbzgFp3Xxk7lpiUhGWyLMcKNkNvYWblwP8C17j7/tjxxGBmlwA73f2Z2LHkiRLg7cAt7j4dqKewhpsyJPOd5gOjgGqgzMw+FjcqyUdm9jXCVI+7YscSi5n1Ab4GXBc7ltiUhGXaAgxLez6UAhpSyMbMehASsLvc/d7Y8UQ0E5hnZhsIw9SzzOwXcUOKaguwxd1bekbvISRlhep84FV33+XuR4B7gXdFjim2HWY2BCC53xk5nujM7ErgEuByL+z6UKcRvrCsTK6pQ4HlZjY4alQRKAnL9DQw1sxGmVlPwsTa+yPHFI2ZGWHOz1p3/37seGJy94XuPtTdRxL+Lh5z94Lt6XD37cBmMxufHJoNrIkYUmybgLPMrE/y/2Y2BbxQIXE/cGXy+EpgUcRYojOzC4GvAvPc/WDseGJy91XuPtDdRybX1C3A25PrSkFREpYmmTT5OWAx4QL6P+6+Om5UUc0EriD0+qxIbhfFDkryxueBu8zsOWAa8J3I8UST9AjeAywHVhGurQVTEdzM7gb+DIw3sy1m9gngu8B7zWw9YQXcd2PGmEtttMcPgb7Aw8m19NaoQeZQG+0hqGK+iIiISBTqCRMRERGJQEmYiIiISARKwkREREQiUBImIiIiEoGSMBEREZEIlISJiBwjMzvXzB6IHYeIdA9KwkREREQiUBImIt2OmX3MzP6SFMX8iZkVm9kBM/t3M1tuZo+aWVVy7jQze9LMnjOz+5J9IDGzMWb2iJmtTH7mtOTty83sHjN7wczuSirki4gcNyVhItKtmNlE4CPATHefBjQBlwNlwHJ3fzuwFPjn5Ed+DnzV3acQqt23HL8L+JG7TyXsA7ktOT4duAaYBIwm7CwhInLcSmIHICLSwWYDM4Cnk06q3oTNo5uBXyXn/AK418wqgEp3X5ocvxP4tZn1BWrc/T4Ad38DIHm/v7j7luT5CmAk8Hjn/1oi0t0oCROR7saAO919YcZBs2+0Oq+9PdvaG2I8nPa4CV1HReQEaThSRLqbR4EPmtlAADM7xcxGEK53H0zO+SjwuLvvA/aa2TnJ8SuApe6+H9hiZguS9yg1sz45/S1EpNvTNzgR6VbcfY2ZfR1YYmZFwBHgs0A9
MNnMngH2EeaNAVwJ3JokWa8AVyXHrwB+YmbfSt7jQzn8NUSkAJh7ez3yIiLdg5kdcPfy2HGIiLTQcKSIiIhIBOoJExEREYlAPWEiIiIiESgJExEREYlASZiIiIhIBErCRERERCJQEiYiIiISgZIwERERkQj+H8fqnbzq/+ywAAAAAElFTkSuQmCC\n"
     },
     "metadata": {
      "needs_background": "light"
     }
    }
   ],
   "source": [
    "plt.subplot(2, 1, 1)\n",
    "plt.plot(solver.loss_history, 'o')\n",
    "plt.xlabel('iteration')\n",
    "plt.ylabel('loss')\n",
    "\n",
    "plt.subplot(2, 1, 2)\n",
    "plt.plot(solver.train_acc_history, '-o')\n",
    "plt.plot(solver.val_acc_history, '-o')\n",
    "plt.legend(['train', 'val'], loc='upper left')\n",
    "plt.xlabel('epoch')\n",
    "plt.ylabel('accuracy')\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Train the net\n",
    "By training the three-layer convolutional network for one epoch, you should achieve greater than 40% accuracy on the training set:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "(Iteration 1 / 980) loss: 2.304740\n",
      "(Epoch 0 / 1) train acc: 0.103000; val_acc: 0.107000\n",
      "(Iteration 21 / 980) loss: 2.098229\n",
      "(Iteration 41 / 980) loss: 1.949788\n",
      "(Iteration 61 / 980) loss: 1.888398\n",
      "(Iteration 81 / 980) loss: 1.877093\n",
      "(Iteration 101 / 980) loss: 1.851877\n",
      "(Iteration 121 / 980) loss: 1.859353\n",
      "(Iteration 141 / 980) loss: 1.800181\n",
      "(Iteration 161 / 980) loss: 2.143292\n",
      "(Iteration 181 / 980) loss: 1.830573\n",
      "(Iteration 201 / 980) loss: 2.037280\n",
      "(Iteration 221 / 980) loss: 2.020304\n",
      "(Iteration 241 / 980) loss: 1.823728\n",
      "(Iteration 261 / 980) loss: 1.692679\n",
      "(Iteration 281 / 980) loss: 1.882594\n",
      "(Iteration 301 / 980) loss: 1.798261\n",
      "(Iteration 321 / 980) loss: 1.851960\n",
      "(Iteration 341 / 980) loss: 1.716323\n",
      "(Iteration 361 / 980) loss: 1.897655\n",
      "(Iteration 381 / 980) loss: 1.319744\n",
      "(Iteration 401 / 980) loss: 1.738790\n",
      "(Iteration 421 / 980) loss: 1.488866\n",
      "(Iteration 441 / 980) loss: 1.718409\n",
      "(Iteration 461 / 980) loss: 1.744440\n",
      "(Iteration 481 / 980) loss: 1.605460\n",
      "(Iteration 501 / 980) loss: 1.494847\n",
      "(Iteration 521 / 980) loss: 1.835179\n",
      "(Iteration 541 / 980) loss: 1.483923\n",
      "(Iteration 561 / 980) loss: 1.676871\n",
      "(Iteration 581 / 980) loss: 1.438325\n",
      "(Iteration 601 / 980) loss: 1.443469\n",
      "(Iteration 621 / 980) loss: 1.529369\n",
      "(Iteration 641 / 980) loss: 1.763475\n",
      "(Iteration 661 / 980) loss: 1.790329\n",
      "(Iteration 681 / 980) loss: 1.693343\n",
      "(Iteration 701 / 980) loss: 1.637078\n",
      "(Iteration 721 / 980) loss: 1.644564\n",
      "(Iteration 741 / 980) loss: 1.708919\n",
      "(Iteration 761 / 980) loss: 1.494252\n",
      "(Iteration 781 / 980) loss: 1.901751\n",
      "(Iteration 801 / 980) loss: 1.898991\n",
      "(Iteration 821 / 980) loss: 1.489988\n",
      "(Iteration 841 / 980) loss: 1.377615\n",
      "(Iteration 861 / 980) loss: 1.763751\n",
      "(Iteration 881 / 980) loss: 1.540284\n",
      "(Iteration 901 / 980) loss: 1.525582\n",
      "(Iteration 921 / 980) loss: 1.674166\n",
      "(Iteration 941 / 980) loss: 1.714316\n",
      "(Iteration 961 / 980) loss: 1.534668\n",
      "(Epoch 1 / 1) train acc: 0.504000; val_acc: 0.499000\n"
     ]
    }
   ],
   "source": [
    "model = ThreeLayerConvNet(weight_scale=0.001, hidden_dim=500, reg=0.001)\n",
    "\n",
    "solver = Solver(model, data,\n",
    "                num_epochs=1, batch_size=50,\n",
    "                update_rule='adam',\n",
    "                optim_config={\n",
    "                  'learning_rate': 1e-3,\n",
    "                },\n",
    "                verbose=True, print_every=20)\n",
    "solver.train()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {
    "id": "full_data_train_accuracy"
   },
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Full data training accuracy: 0.4\n"
     ]
    }
   ],
   "source": [
    "# Print final training accuracy\n",
    "print(\n",
    "    \"Full data training accuracy:\",\n",
    "    solver.check_accuracy(data['X_train'], data['y_train'])\n",
    ")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {
    "id": "full_data_validation_accuracy"
   },
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Full data validation accuracy: 0.499\n"
     ]
    }
   ],
   "source": [
    "# Print final validation accuracy\n",
    "print(\n",
    "    \"Full data validation accuracy:\",\n",
    "    solver.check_accuracy(data['X_val'], data['y_val'])\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Visualize Filters\n",
    "You can visualize the first-layer convolutional filters from the trained network by running the following:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [
    {
     "output_type": "error",
     "ename": "NameError",
     "evalue": "name 'model' is not defined",
     "traceback": [
      "\u001b[1;31m---------------------------------------------------------------------------\u001b[0m",
      "\u001b[1;31mNameError\u001b[0m                                 Traceback (most recent call last)",
      "\u001b[1;32m<ipython-input-7-7dcb55853a2d>\u001b[0m in \u001b[0;36m<module>\u001b[1;34m\u001b[0m\n\u001b[0;32m      1\u001b[0m \u001b[1;32mfrom\u001b[0m \u001b[0mcs231n\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mvis_utils\u001b[0m \u001b[1;32mimport\u001b[0m \u001b[0mvisualize_grid\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m      2\u001b[0m \u001b[1;33m\u001b[0m\u001b[0m\n\u001b[1;32m----> 3\u001b[1;33m \u001b[0mgrid\u001b[0m \u001b[1;33m=\u001b[0m \u001b[0mvisualize_grid\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mmodel\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mparams\u001b[0m\u001b[1;33m[\u001b[0m\u001b[1;34m'W1'\u001b[0m\u001b[1;33m]\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mtranspose\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;36m0\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;36m2\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;36m3\u001b[0m\u001b[1;33m,\u001b[0m \u001b[1;36m1\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0m\u001b[0;32m      4\u001b[0m \u001b[0mplt\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mimshow\u001b[0m\u001b[1;33m(\u001b[0m\u001b[0mgrid\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0mastype\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;34m'uint8'\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n\u001b[0;32m      5\u001b[0m \u001b[0mplt\u001b[0m\u001b[1;33m.\u001b[0m\u001b[0maxis\u001b[0m\u001b[1;33m(\u001b[0m\u001b[1;34m'off'\u001b[0m\u001b[1;33m)\u001b[0m\u001b[1;33m\u001b[0m\u001b[1;33m\u001b[0m\u001b[0m\n",
      "\u001b[1;31mNameError\u001b[0m: name 'model' is not defined"
     ]
    }
   ],
   "source": [
    "from cs231n.vis_utils import visualize_grid\n",
    "\n",
    "grid = visualize_grid(model.params['W1'].transpose(0, 2, 3, 1))\n",
    "plt.imshow(grid.astype('uint8'))\n",
    "plt.axis('off')\n",
    "plt.gcf().set_size_inches(5, 5)\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Spatial Batch Normalization\n",
    "We already saw that batch normalization is a very useful technique for training deep fully-connected networks. As proposed in the original paper (link in `BatchNormalization.ipynb`), batch normalization can also be used for convolutional networks, but we need to tweak it a bit; the modification will be called \"spatial batch normalization.\"\n",
    "\n",
    "Normally, batch normalization accepts inputs of shape `(N, D)` and produces outputs of shape `(N, D)`, where we normalize across the minibatch dimension `N`. For data coming from convolutional layers, batch normalization needs to accept inputs of shape `(N, C, H, W)` and produce outputs of shape `(N, C, H, W)`, where the `N` dimension gives the minibatch size and the `(H, W)` dimensions give the spatial size of the feature map.\n",
    "\n",
    "If the feature map was produced using convolutions, then we expect every feature channel's statistics (e.g. mean and variance) to be relatively consistent both between different images and between different locations within the same image; after all, every feature channel is produced by the same convolutional filter! Therefore spatial batch normalization computes a mean and variance for each of the `C` feature channels by computing statistics over the minibatch dimension `N` as well as the spatial dimensions `H` and `W`.\n",
    "\n",
    "\n",
    "[1] [Sergey Ioffe and Christian Szegedy, \"Batch Normalization: Accelerating Deep Network Training by Reducing\n",
    "Internal Covariate Shift\", ICML 2015.](https://arxiv.org/abs/1502.03167)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Spatial batch normalization: forward\n",
    "\n",
    "In the file `cs231n/layers.py`, implement the forward pass for spatial batch normalization in the function `spatial_batchnorm_forward`. Check your implementation by running the following:"
   ]
  },
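One common way to implement this forward pass is to fold the spatial dimensions into the batch dimension and then apply ordinary batch normalization over the resulting `(N*H*W, C)` matrix. Below is a minimal self-contained sketch of that idea; `spatial_batchnorm_forward_sketch` is a hypothetical name (your real implementation goes in `spatial_batchnorm_forward` and can reuse `batchnorm_forward`), and the training-time normalization is inlined here for clarity, with running averages omitted:

```python
import numpy as np

def spatial_batchnorm_forward_sketch(x, gamma, beta, eps=1e-5):
    # Training-time forward pass only; running averages are omitted.
    N, C, H, W = x.shape
    # Fold H and W into the batch dimension:
    # (N, C, H, W) -> (N, H, W, C) -> (N*H*W, C)
    x_flat = x.transpose(0, 2, 3, 1).reshape(-1, C)
    mu = x_flat.mean(axis=0)          # per-channel mean, shape (C,)
    var = x_flat.var(axis=0)          # per-channel variance, shape (C,)
    x_hat = (x_flat - mu) / np.sqrt(var + eps)
    out_flat = gamma * x_hat + beta   # gamma, beta broadcast over rows
    # Undo the reshape: (N*H*W, C) -> (N, H, W, C) -> (N, C, H, W)
    return out_flat.reshape(N, H, W, C).transpose(0, 3, 1, 2)

x = 4 * np.random.randn(2, 3, 4, 5) + 10
out = spatial_batchnorm_forward_sketch(x, np.ones(3), np.zeros(3))
# Per-channel means should be ~0 and per-channel stds ~1.
```

The transpose before the reshape matters: a plain `x.reshape(-1, C)` would mix values from different channels into the same column, so the per-column statistics would no longer be per-channel statistics.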
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Before spatial batch normalization:\n  Shape:  (2, 3, 4, 5)\n  Means:  [9.33463814 8.90909116 9.11056338]\n  Stds:  [3.61447857 3.19347686 3.5168142 ]\nAfter spatial batch normalization:\n  Shape:  (2, 3, 4, 5)\n  Means:  [ 6.18949336e-16  5.99520433e-16 -1.22124533e-16]\n  Stds:  [0.99999962 0.99999951 0.9999996 ]\nAfter spatial batch normalization (nontrivial gamma, beta):\n  Shape:  (2, 3, 4, 5)\n  Means:  [6. 7. 8.]\n  Stds:  [2.99999885 3.99999804 4.99999798]\n"
     ]
    }
   ],
   "source": [
    "np.random.seed(231)\n",
    "# Check the training-time forward pass by checking means and variances\n",
    "# of features both before and after spatial batch normalization\n",
    "\n",
    "N, C, H, W = 2, 3, 4, 5\n",
    "x = 4 * np.random.randn(N, C, H, W) + 10\n",
    "\n",
    "print('Before spatial batch normalization:')\n",
    "print('  Shape: ', x.shape)\n",
    "print('  Means: ', x.mean(axis=(0, 2, 3)))\n",
    "print('  Stds: ', x.std(axis=(0, 2, 3)))\n",
    "\n",
    "# Means should be close to zero and stds close to one\n",
    "gamma, beta = np.ones(C), np.zeros(C)\n",
    "bn_param = {'mode': 'train'}\n",
    "out, _ = spatial_batchnorm_forward(x, gamma, beta, bn_param)\n",
    "print('After spatial batch normalization:')\n",
    "print('  Shape: ', out.shape)\n",
    "print('  Means: ', out.mean(axis=(0, 2, 3)))\n",
    "print('  Stds: ', out.std(axis=(0, 2, 3)))\n",
    "\n",
    "# Means should be close to beta and stds close to gamma\n",
    "gamma, beta = np.asarray([3, 4, 5]), np.asarray([6, 7, 8])\n",
    "out, _ = spatial_batchnorm_forward(x, gamma, beta, bn_param)\n",
    "print('After spatial batch normalization (nontrivial gamma, beta):')\n",
    "print('  Shape: ', out.shape)\n",
    "print('  Means: ', out.mean(axis=(0, 2, 3)))\n",
    "print('  Stds: ', out.std(axis=(0, 2, 3)))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "After spatial batch normalization (test-time):\n  means:  [-0.08034406  0.07562881  0.05716371  0.04378383]\n  stds:  [0.96718744 1.0299714  1.02887624 1.00585577]\n"
     ]
    }
   ],
   "source": [
    "np.random.seed(231)\n",
    "# Check the test-time forward pass by running the training-time\n",
    "# forward pass many times to warm up the running averages, and then\n",
    "# checking the means and variances of activations after a test-time\n",
    "# forward pass.\n",
    "N, C, H, W = 10, 4, 11, 12\n",
    "\n",
    "bn_param = {'mode': 'train'}\n",
    "gamma = np.ones(C)\n",
    "beta = np.zeros(C)\n",
    "for t in range(50):\n",
    "  x = 2.3 * np.random.randn(N, C, H, W) + 13\n",
    "  spatial_batchnorm_forward(x, gamma, beta, bn_param)\n",
    "bn_param['mode'] = 'test'\n",
    "x = 2.3 * np.random.randn(N, C, H, W) + 13\n",
    "a_norm, _ = spatial_batchnorm_forward(x, gamma, beta, bn_param)\n",
    "\n",
    "# Means should be close to zero and stds close to one, but will be\n",
    "# noisier than training-time forward passes.\n",
    "print('After spatial batch normalization (test-time):')\n",
    "print('  means: ', a_norm.mean(axis=(0, 2, 3)))\n",
    "print('  stds: ', a_norm.std(axis=(0, 2, 3)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Spatial batch normalization: backward\n",
    "In the file `cs231n/layers.py`, implement the backward pass for spatial batch normalization in the function `spatial_batchnorm_backward`. Run the following to check your implementation using a numeric gradient check:"
   ]
  },
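The backward pass can reuse the same flattening trick: reshape `dout` and the cached input to `(N*H*W, C)`, apply the standard batch-normalization gradient per channel, and reshape back. Below is a minimal sketch under that assumption; `spatial_batchnorm_backward_sketch` is a hypothetical helper that recomputes the forward statistics instead of reading them from a cache, and it inlines the simplified batchnorm gradient rather than calling `batchnorm_backward`:

```python
import numpy as np

def spatial_batchnorm_backward_sketch(dout, x, gamma, eps=1e-5):
    N, C, H, W = dout.shape
    x_flat = x.transpose(0, 2, 3, 1).reshape(-1, C)
    d_flat = dout.transpose(0, 2, 3, 1).reshape(-1, C)

    # Recompute training-time statistics (a real implementation would
    # read these from the cache produced by the forward pass).
    mu = x_flat.mean(axis=0)
    std = np.sqrt(x_flat.var(axis=0) + eps)
    x_hat = (x_flat - mu) / std

    dgamma = (d_flat * x_hat).sum(axis=0)
    dbeta = d_flat.sum(axis=0)

    # Simplified batchnorm gradient, applied column-by-column (per channel).
    dx_hat = d_flat * gamma
    dx_flat = (dx_hat - dx_hat.mean(axis=0)
               - x_hat * (dx_hat * x_hat).mean(axis=0)) / std
    dx = dx_flat.reshape(N, H, W, C).transpose(0, 3, 1, 2)
    return dx, dgamma, dbeta
```

Because every spatial location contributes to each channel's statistics, `dgamma` and `dbeta` are sums over the `N`, `H`, and `W` axes, not just over `N`.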
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "dx error:  2.786648193872555e-07\ndgamma error:  7.0974817113608705e-12\ndbeta error:  3.275608725278405e-12\n"
     ]
    }
   ],
   "source": [
    "np.random.seed(231)\n",
    "N, C, H, W = 2, 3, 4, 5\n",
    "x = 5 * np.random.randn(N, C, H, W) + 12\n",
    "gamma = np.random.randn(C)\n",
    "beta = np.random.randn(C)\n",
    "dout = np.random.randn(N, C, H, W)\n",
    "\n",
    "bn_param = {'mode': 'train'}\n",
    "fx = lambda x: spatial_batchnorm_forward(x, gamma, beta, bn_param)[0]\n",
    "fg = lambda a: spatial_batchnorm_forward(x, gamma, beta, bn_param)[0]\n",
    "fb = lambda b: spatial_batchnorm_forward(x, gamma, beta, bn_param)[0]\n",
    "\n",
    "dx_num = eval_numerical_gradient_array(fx, x, dout)\n",
    "da_num = eval_numerical_gradient_array(fg, gamma, dout)\n",
    "db_num = eval_numerical_gradient_array(fb, beta, dout)\n",
    "\n",
    "# You should expect errors with magnitudes between 1e-12 and 1e-06.\n",
    "_, cache = spatial_batchnorm_forward(x, gamma, beta, bn_param)\n",
    "dx, dgamma, dbeta = spatial_batchnorm_backward(dout, cache)\n",
    "print('dx error: ', rel_error(dx_num, dx))\n",
    "print('dgamma error: ', rel_error(da_num, dgamma))\n",
    "print('dbeta error: ', rel_error(db_num, dbeta))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Group Normalization\n",
    "In the previous notebook, we mentioned that Layer Normalization is an alternative normalization technique that mitigates the batch size limitations of Batch Normalization. However, as the authors of [2] observed, Layer Normalization does not perform as well as Batch Normalization when used with Convolutional Layers:\n",
    "\n",
    ">With fully connected layers, all the hidden units in a layer tend to make similar contributions to the final prediction, and re-centering and rescaling the summed inputs to a layer works well. However, the assumption of similar contributions is no longer true for convolutional neural networks. The large number of the hidden units whose\n",
    "receptive fields lie near the boundary of the image are rarely turned on and thus have very different\n",
    "statistics from the rest of the hidden units within the same layer.\n",
    "\n",
    "The authors of [3] propose an intermediate technique. In contrast to Layer Normalization, where you normalize over the entire feature vector of each datapoint, they suggest consistently splitting each datapoint's features into G groups and normalizing per group, per datapoint, instead.\n",
    "\n",
    "<p align=\"center\">\n",
    "<img src=\"https://raw.githubusercontent.com/cs231n/cs231n.github.io/master/assets/a2/normalization.png\">\n",
    "</p>\n",
    "<center>Visual comparison of the normalization techniques discussed so far (image edited from [3])</center>\n",
    "\n",
    "Even though an assumption of equal contribution is still being made within each group, the authors hypothesize that this is not as problematic, as innate grouping arises within features for visual recognition. One example they use to illustrate this is that many high-performance handcrafted features in traditional Computer Vision have terms that are explicitly grouped together. Take, for example, Histogram of Oriented Gradients [4]: after computing histograms per spatially local block, each per-block histogram is normalized before being concatenated together to form the final feature vector.\n",
    "\n",
    "You will now implement Group Normalization. Note that the technique you are about to implement was only published at ECCV in 2018; normalization is still an active and exciting area of research!\n",
    "\n",
    "[2] [Ba, Jimmy Lei, Jamie Ryan Kiros, and Geoffrey E. Hinton. \"Layer Normalization.\" stat 1050 (2016): 21.](https://arxiv.org/pdf/1607.06450.pdf)\n",
    "\n",
    "\n",
    "[3] [Wu, Yuxin, and Kaiming He. \"Group Normalization.\" arXiv preprint arXiv:1803.08494 (2018).](https://arxiv.org/abs/1803.08494)\n",
    "\n",
    "\n",
    "[4] [N. Dalal and B. Triggs. Histograms of oriented gradients for\n",
    "human detection. In Computer Vision and Pattern Recognition\n",
    "(CVPR), 2005.](https://ieeexplore.ieee.org/abstract/document/1467360/)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Spatial group normalization: forward\n",
    "\n",
    "In the file `cs231n/layers.py`, implement the forward pass for group normalization in the function `spatial_groupnorm_forward`. Check your implementation by running the following:"
   ]
  },
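As in the check cell below, the key step is reshaping `x` to `(N*G, -1)` so that each row holds exactly one group of one datapoint; normalization then happens over `axis=1`. Below is a minimal sketch of the forward pass under that assumption; `spatial_groupnorm_forward_sketch` is a hypothetical name and, unlike the real `spatial_groupnorm_forward`, takes no `gn_param` dict and returns no cache:

```python
import numpy as np

def spatial_groupnorm_forward_sketch(x, gamma, beta, G, eps=1e-5):
    N, C, H, W = x.shape
    assert C % G == 0, 'C must be divisible by G'
    # Each row covers one (datapoint, group) pair: (C//G)*H*W values.
    x_g = x.reshape(N * G, -1)
    mu = x_g.mean(axis=1, keepdims=True)
    var = x_g.var(axis=1, keepdims=True)
    x_hat = ((x_g - mu) / np.sqrt(var + eps)).reshape(N, C, H, W)
    # gamma and beta remain per-channel, shape (1, C, 1, 1), and broadcast.
    return gamma * x_hat + beta

x = 4 * np.random.randn(2, 6, 4, 5) + 10
out = spatial_groupnorm_forward_sketch(
    x, np.ones((1, 6, 1, 1)), np.zeros((1, 6, 1, 1)), G=2)
# Each of the N*G = 4 rows of out.reshape(4, -1) should have mean ~0, std ~1.
```

Note that, unlike batch normalization, no statistics cross the batch dimension, so the computation is identical at training and test time and no running averages are needed.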
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "Before spatial group normalization:\n  Shape:  (2, 6, 4, 5)\n  Means:  [9.72505327 8.51114185 8.9147544  9.43448077]\n  Stds:  [3.67070958 3.09892597 4.27043622 3.97521327]\nAfter spatial group normalization:\n  Shape:  (2, 6, 4, 5)\n  Means:  [-2.14643118e-16  5.25505565e-16  2.65528340e-16 -3.38618023e-16]\n  Stds:  [0.99999963 0.99999948 0.99999973 0.99999968]\n"
     ]
    }
   ],
   "source": [
    "np.random.seed(231)\n",
    "# Check the training-time forward pass by checking means and variances\n",
    "# of features both before and after spatial group normalization\n",
    "\n",
    "N, C, H, W = 2, 6, 4, 5\n",
    "G = 2\n",
    "x = 4 * np.random.randn(N, C, H, W) + 10\n",
    "x_g = x.reshape((N*G,-1))\n",
    "print('Before spatial group normalization:')\n",
    "print('  Shape: ', x.shape)\n",
    "print('  Means: ', x_g.mean(axis=1))\n",
    "print('  Stds: ', x_g.std(axis=1))\n",
    "\n",
    "# Means should be close to zero and stds close to one\n",
    "gamma, beta = np.ones((1,C,1,1)), np.zeros((1,C,1,1))\n",
    "bn_param = {'mode': 'train'}\n",
    "\n",
    "out, _ = spatial_groupnorm_forward(x, gamma, beta, G, bn_param)\n",
    "out_g = out.reshape((N*G,-1))\n",
    "print('After spatial group normalization:')\n",
    "print('  Shape: ', out.shape)\n",
    "print('  Means: ', out_g.mean(axis=1))\n",
    "print('  Stds: ', out_g.std(axis=1))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Spatial group normalization: backward\n",
    "In the file `cs231n/layers.py`, implement the backward pass for spatial group normalization in the function `spatial_groupnorm_backward`. Run the following to check your implementation using a numeric gradient check:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {},
   "outputs": [
    {
     "output_type": "stream",
     "name": "stdout",
     "text": [
      "dx error:  7.413109753818481e-08\ndgamma error:  9.468195772749234e-12\ndbeta error:  3.354494437653335e-12\n"
     ]
    }
   ],
   "source": [
    "np.random.seed(231)\n",
    "N, C, H, W = 2, 6, 4, 5\n",
    "G = 2\n",
    "x = 5 * np.random.randn(N, C, H, W) + 12\n",
    "gamma = np.random.randn(1,C,1,1)\n",
    "beta = np.random.randn(1,C,1,1)\n",
    "dout = np.random.randn(N, C, H, W)\n",
    "\n",
    "gn_param = {}\n",
    "fx = lambda x: spatial_groupnorm_forward(x, gamma, beta, G, gn_param)[0]\n",
    "fg = lambda a: spatial_groupnorm_forward(x, gamma, beta, G, gn_param)[0]\n",
    "fb = lambda b: spatial_groupnorm_forward(x, gamma, beta, G, gn_param)[0]\n",
    "\n",
    "dx_num = eval_numerical_gradient_array(fx, x, dout)\n",
    "da_num = eval_numerical_gradient_array(fg, gamma, dout)\n",
    "db_num = eval_numerical_gradient_array(fb, beta, dout)\n",
    "\n",
    "_, cache = spatial_groupnorm_forward(x, gamma, beta, G, gn_param)\n",
    "dx, dgamma, dbeta = spatial_groupnorm_backward(dout, cache)\n",
    "# You should expect errors with magnitudes between 1e-12 and 1e-07.\n",
    "print('dx error: ', rel_error(dx_num, dx))\n",
    "print('dgamma error: ', rel_error(da_num, dgamma))\n",
    "print('dbeta error: ', rel_error(db_num, dbeta))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "celltoolbar": "Edit Metadata",
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.3-final"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}