{
  "nbformat": 4,
  "nbformat_minor": 0,
  "metadata": {
    "celltoolbar": "Slideshow",
    "kernelspec": {
      "name": "python3",
      "display_name": "Python 3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.8.2"
    },
    "colab": {
      "name": "PyTorch_Introduction.ipynb",
      "provenance": [],
      "collapsed_sections": [
        "zqyBFdobMTha",
        "jlzoXa6UMThg",
        "l08nQdE9MThp",
        "Tmg4eFQAMThr",
        "jifMOIcNMTh5",
        "aSO1McZLMTiT",
        "OCwLf9C2MTiY",
        "IrapEC2XMTiY",
        "NwsmNTYLMTig",
        "OyN-mHRoMTii"
      ],
      "include_colab_link": true
    },
    "accelerator": "GPU"
  },
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "view-in-github",
        "colab_type": "text"
      },
      "source": [
        "<a href=\"https://colab.research.google.com/github/Iallen520/lhy_DL_Hw/blob/master/PyTorch_Introduction.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "X7Lg2NXGMThA",
        "colab_type": "text"
      },
      "source": [
        "# PyTorch Introduction\n",
        "\n",
        "### TA: Chi-Liang Liu\n",
        "##### This Tutorial is modified from [University of Washington CSE446](https://courses.cs.washington.edu/courses/cse446/19au/section9.html) and [PyTorch Official Tutorials](https://pytorch.org/tutorials/)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Xi_QP1bmMThC",
        "colab_type": "text"
      },
      "source": [
        "Today, we will be introducing PyTorch, \"an open source deep learning platform that provides a seamless path from research prototyping to production deployment\".\n",
        "\n",
        "This notebook is by no means comprehensive. If you have any questions, the **documentation** and **Google** are your friends.\n",
        "\n",
        "Key takeaways:\n",
        "- Automatic differentiation is a powerful tool\n",
        "- PyTorch implements common functions used in deep learning\n",
        "- Data processing with PyTorch `Dataset`\n",
        "- Mixed precision training in PyTorch"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "RQIPkkKdMThD",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "import torch\n",
        "import torch.nn as nn\n",
        "import torch.nn.functional as F\n",
        "\n",
        "from mpl_toolkits.mplot3d import Axes3D\n",
        "import matplotlib.pyplot as plt\n",
        "\n",
        "import numpy as np\n",
        "\n",
        "torch.manual_seed(446)\n",
        "np.random.seed(446)"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "fCQTBsnWMThH",
        "colab_type": "text"
      },
      "source": [
        "## Tensors and relation to numpy\n",
        "\n",
        "By this point, we have worked with numpy quite a bit. PyTorch's basic building block, the `tensor`, is similar to numpy's `ndarray`."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "IXnFLFr1MThI",
        "colab_type": "code",
        "outputId": "9eb2c156-0c35-4e11-d388-f3c20edc5e2c",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 260
        }
      },
      "source": [
        "# we create tensors in a similar way to numpy nd arrays\n",
        "x_numpy = np.array([0.1, 0.2, 0.3])\n",
        "x_torch = torch.tensor([0.1, 0.2, 0.3])\n",
        "print('x_numpy, x_torch')\n",
        "print(x_numpy, x_torch)\n",
        "print()\n",
        "\n",
        "# to and from numpy, pytorch\n",
        "print('to and from numpy and pytorch')\n",
        "print(torch.from_numpy(x_numpy), x_torch.numpy())\n",
        "print()\n",
        "\n",
        "# we can do basic operations like +-*/\n",
        "y_numpy = np.array([3,4,5.])\n",
        "y_torch = torch.tensor([3,4,5.])\n",
        "print(\"x+y\")\n",
        "print(x_numpy + y_numpy, x_torch + y_torch)\n",
        "print()\n",
        "\n",
        "# many functions that are in numpy are also in pytorch\n",
        "print(\"norm\")\n",
        "print(np.linalg.norm(x_numpy), torch.norm(x_torch))\n",
        "print()\n",
        "\n",
        "# to apply an operation along a dimension,\n",
        "# we use the dim keyword argument instead of axis\n",
        "print(\"mean along the 0th dimension\")\n",
        "x_numpy = np.array([[1,2],[3,4.]])\n",
        "x_torch = torch.tensor([[1,2],[3,4.]])\n",
        "print(np.mean(x_numpy, axis=0), torch.mean(x_torch, dim=0))\n"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "x_numpy, x_torch\n",
            "[0.1 0.2 0.3] tensor([0.1000, 0.2000, 0.3000])\n",
            "\n",
            "to and from numpy and pytorch\n",
            "tensor([0.1000, 0.2000, 0.3000], dtype=torch.float64) [0.1 0.2 0.3]\n",
            "\n",
            "x+y\n",
            "[3.1 4.2 5.3] tensor([3.1000, 4.2000, 5.3000])\n",
            "\n",
            "norm\n",
            "0.37416573867739417 tensor(0.3742)\n",
            "\n",
            "mean along the 0th dimension\n",
            "[2. 3.] tensor([2., 3.])\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "TtyttsoZMThL",
        "colab_type": "text"
      },
      "source": [
        "### `Tensor.view`\n",
        "We can use the `Tensor.view()` function to reshape tensors, similarly to `numpy.reshape()`.\n",
        "\n",
        "It can also automatically calculate the correct dimension if a `-1` is passed in. This is useful if we are working with batches, but the batch size is unknown."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "ABhZ5mKpMThM",
        "colab_type": "code",
        "outputId": "312318b2-7ba9-4867-f289-141f09786ee4",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 69
        }
      },
      "source": [
        "# \"MNIST\"\n",
        "N, C, W, H = 10000, 3, 28, 28\n",
        "X = torch.randn((N, C, W, H))\n",
        "\n",
        "print(X.shape)\n",
        "print(X.view(N, C, 784).shape)\n",
        "print(X.view(-1, C, 784).shape) # automatically choose the 0th dimension"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "torch.Size([10000, 3, 28, 28])\n",
            "torch.Size([10000, 3, 784])\n",
            "torch.Size([10000, 3, 784])\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "AXgxfCTMOjIp",
        "colab_type": "text"
      },
      "source": [
        "### Broadcasting semantics\n",
        "Two tensors are “broadcastable” if the following rules hold:\n",
        "\n",
        "- Each tensor has at least one dimension.\n",
        "- When iterating over the dimension sizes, starting at the trailing dimension, the dimension sizes must either be equal, one of them is 1, or one of them does not exist."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "9ioj-DAhOjiN",
        "colab_type": "code",
        "outputId": "53f3e777-491b-4be6-a6a2-9eedc70f2046",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 35
        }
      },
      "source": [
        "# PyTorch operations support NumPy Broadcasting Semantics.\n",
        "x=torch.empty(5,1,4,1)\n",
        "y=torch.empty(  3,1,1)\n",
        "print((x+y).size())"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "torch.Size([5, 3, 4, 1])\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Y3vQ3yD9MThP",
        "colab_type": "text"
      },
      "source": [
        "## Computation graphs\n",
        "\n",
        "What's special about PyTorch's `tensor` object is that it implicitly creates a computation graph in the background. A computation graph is a way of writing a mathematical expression as a graph. There is an algorithm (backpropagation) to compute the gradients of all the variables of a computation graph in time on the same order as it takes to compute the function itself.\n",
        "\n",
        "Consider the expression $e=(a+b)*(b+1)$ with values $a=2, b=1$. We can draw the evaluated computation graph as\n",
        "<br>\n",
        "<br>\n",
        "\n",
        "In PyTorch, we can write this as"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "G-HpojJ_MThQ",
        "colab_type": "text"
      },
      "source": [
        "![tree-img](https://colah.github.io/posts/2015-08-Backprop/img/tree-eval.png)\n",
        "\n",
        "[source](https://colah.github.io/posts/2015-08-Backprop/)"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "n7NGX7CVMThR",
        "colab_type": "code",
        "outputId": "e390ff56-83dc-458d-ca87-e1d57b9304af",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 69
        }
      },
      "source": [
        "a = torch.tensor(2.0, requires_grad=True) # we set requires_grad=True to let PyTorch know to keep the graph\n",
        "b = torch.tensor(1.0, requires_grad=True)\n",
        "c = a + b\n",
        "d = b + 1\n",
        "e = c * d\n",
        "print('c', c)\n",
        "print('d', d)\n",
        "print('e', e)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "c tensor(3., grad_fn=<AddBackward0>)\n",
            "d tensor(2., grad_fn=<AddBackward0>)\n",
            "e tensor(6., grad_fn=<MulBackward0>)\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "orGtJTjkMThU",
        "colab_type": "text"
      },
      "source": [
        "We can see that PyTorch kept track of the computation graph for us."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "gPZfJ1hy4uxj",
        "colab_type": "text"
      },
      "source": [
        "## CUDA semantics\n",
        "It's easy to copy a tensor from CPU to GPU, or from GPU to CPU."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "JYqe5vVv43tG",
        "colab_type": "code",
        "outputId": "202de93f-1f82-4ee4-9c53-0ec985655323",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 121
        }
      },
      "source": [
        "cpu = torch.device(\"cpu\")\n",
        "gpu = torch.device(\"cuda\")\n",
        "\n",
        "x = torch.rand(10)\n",
        "print(x)\n",
        "x = x.to(gpu)\n",
        "print(x)\n",
        "x = x.to(cpu)\n",
        "print(x)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "tensor([0.3959, 0.6177, 0.7256, 0.0971, 0.9186, 0.8277, 0.4409, 0.9344, 0.8967,\n",
            "        0.1897])\n",
            "tensor([0.3959, 0.6177, 0.7256, 0.0971, 0.9186, 0.8277, 0.4409, 0.9344, 0.8967,\n",
            "        0.1897], device='cuda:0')\n",
            "tensor([0.3959, 0.6177, 0.7256, 0.0971, 0.9186, 0.8277, 0.4409, 0.9344, 0.8967,\n",
            "        0.1897])\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "R7Wy2mEOMThU",
        "colab_type": "text"
      },
      "source": [
        "## PyTorch as an autograd framework\n",
        "\n",
        "Now that we have seen that PyTorch keeps the graph around for us, let's use it to compute some gradients for us.\n",
        "\n",
        "Consider the function $f(x) = (x-2)^2$.\n",
        "\n",
        "Q: Compute $\\frac{d}{dx} f(x)$ and then compute $f'(1)$.\n",
        "\n",
        "We call `backward()` on the output variable (`y`), which computes the gradients of `y` with respect to all of the leaf variables (here, `x`) at once."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "zvN0jSOKMThV",
        "colab_type": "code",
        "outputId": "c5b113bc-1b6b-42d7-83a1-fdfe6240a2da",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 52
        }
      },
      "source": [
        "def f(x):\n",
        "    return (x-2)**2\n",
        "\n",
        "def fp(x):\n",
        "    return 2*(x-2)\n",
        "\n",
        "x = torch.tensor([1.0], requires_grad=True)\n",
        "\n",
        "y = f(x)\n",
        "y.backward()\n",
        "\n",
        "print('Analytical f\\'(x):', fp(x))\n",
        "print('PyTorch\\'s f\\'(x):', x.grad)\n"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Analytical f'(x): tensor([-2.], grad_fn=<MulBackward0>)\n",
            "PyTorch's f'(x): tensor([-2.])\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "yvJR6H7KMThX",
        "colab_type": "text"
      },
      "source": [
        "It can also compute gradients of multivariate functions.\n",
        "\n",
        "Let $w = [w_1, w_2]^T$\n",
        "\n",
        "Consider $g(w) = 2w_1w_2 + w_2\\cos(w_1)$\n",
        "\n",
        "Q: Compute $\\nabla_w g(w)$ and verify $\\nabla_w g([\\pi,1]) = [2, 2\\pi - 1]^T$"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "-WCp53C1MThY",
        "colab_type": "code",
        "outputId": "2d448c41-cb28-46c1-e072-5095a1dd82aa",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 52
        }
      },
      "source": [
        "def g(w):\n",
        "    return 2*w[0]*w[1] + w[1]*torch.cos(w[0])\n",
        "\n",
        "def grad_g(w):\n",
        "    return torch.tensor([2*w[1] - w[1]*torch.sin(w[0]), 2*w[0] + torch.cos(w[0])])\n",
        "\n",
        "w = torch.tensor([np.pi, 1], requires_grad=True)\n",
        "\n",
        "z = g(w)\n",
        "z.backward()\n",
        "\n",
        "print('Analytical grad g(w)', grad_g(w))\n",
        "print('PyTorch\\'s grad g(w)', w.grad)\n"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Analytical grad g(w) tensor([2.0000, 5.2832])\n",
            "PyTorch's grad g(w) tensor([2.0000, 5.2832])\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "zqyBFdobMTha",
        "colab_type": "text"
      },
      "source": [
        "## Using the gradients\n",
        "Now that we have gradients, we can use our favorite optimization algorithm: gradient descent!\n",
        "\n",
        "Let $f$ be the same function we defined above.\n",
        "\n",
        "Q: What is the value of $x$ that minimizes $f$?"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "m4-8fhqAMThb",
        "colab_type": "code",
        "outputId": "98f1d676-b28f-4269-8cd0-17a405d9a8f8",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 295
        }
      },
      "source": [
        "x = torch.tensor([5.0], requires_grad=True)\n",
        "step_size = 0.25\n",
        "\n",
        "print('iter,\\tx,\\tf(x),\\tf\\'(x),\\tf\\'(x) pytorch')\n",
        "for i in range(15):\n",
        "    y = f(x)\n",
        "    y.backward() # compute the gradient\n",
        "    \n",
        "    print('{},\\t{:.3f},\\t{:.3f},\\t{:.3f},\\t{:.3f}'.format(i, x.item(), f(x).item(), fp(x).item(), x.grad.item()))\n",
        "    \n",
        "    x.data = x.data - step_size * x.grad # perform a GD update step\n",
        "    \n",
        "    # We need to zero the grad variable since the backward()\n",
        "    # call accumulates the gradients in .grad instead of overwriting.\n",
        "    # The detach_() is for efficiency. You do not need to worry too much about it.\n",
        "    x.grad.detach_()\n",
        "    x.grad.zero_()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "iter,\tx,\tf(x),\tf'(x),\tf'(x) pytorch\n",
            "0,\t5.000,\t9.000,\t6.000,\t6.000\n",
            "1,\t3.500,\t2.250,\t3.000,\t3.000\n",
            "2,\t2.750,\t0.562,\t1.500,\t1.500\n",
            "3,\t2.375,\t0.141,\t0.750,\t0.750\n",
            "4,\t2.188,\t0.035,\t0.375,\t0.375\n",
            "5,\t2.094,\t0.009,\t0.188,\t0.188\n",
            "6,\t2.047,\t0.002,\t0.094,\t0.094\n",
            "7,\t2.023,\t0.001,\t0.047,\t0.047\n",
            "8,\t2.012,\t0.000,\t0.023,\t0.023\n",
            "9,\t2.006,\t0.000,\t0.012,\t0.012\n",
            "10,\t2.003,\t0.000,\t0.006,\t0.006\n",
            "11,\t2.001,\t0.000,\t0.003,\t0.003\n",
            "12,\t2.001,\t0.000,\t0.001,\t0.001\n",
            "13,\t2.000,\t0.000,\t0.001,\t0.001\n",
            "14,\t2.000,\t0.000,\t0.000,\t0.000\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "2TPWiwARMThd",
        "colab_type": "text"
      },
      "source": [
        "# Linear Regression\n",
        "\n",
        "Now, instead of minimizing a made-up function, let's minimize a loss function on some made-up data.\n",
        "\n",
        "We will implement Gradient Descent in order to solve the task of linear regression."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "3el-4esEMThe",
        "colab_type": "code",
        "outputId": "40f1a3ea-24e9-4fd1-b93d-12882e6052c7",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 69
        }
      },
      "source": [
        "# make a simple linear dataset with some noise\n",
        "\n",
        "d = 2\n",
        "n = 50\n",
        "X = torch.randn(n,d)\n",
        "true_w = torch.tensor([[-1.0], [2.0]])\n",
        "y = X @ true_w + torch.randn(n,1) * 0.1\n",
        "print('X shape', X.shape)\n",
        "print('y shape', y.shape)\n",
        "print('w shape', true_w.shape)\n"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "X shape torch.Size([50, 2])\n",
            "y shape torch.Size([50, 1])\n",
            "w shape torch.Size([2, 1])\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "jlzoXa6UMThg",
        "colab_type": "text"
      },
      "source": [
        "### Note: dimensions\n",
        "PyTorch does a lot of operations on batches of data. The convention is to have your data be of size $(N, d)$ where $N$ is the size of the batch of data."
      ]
    },
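    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "A minimal sketch of this convention (an added illustration; the shapes here are arbitrary): stacking $N$ examples as rows means a single matrix multiply produces one output per example in the batch."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# a batch of N=32 examples, each a d=4 feature vector (illustrative shapes)\n",
        "X_batch = torch.randn(32, 4)\n",
        "w_batch = torch.randn(4, 1)\n",
        "# one matrix multiply gives one output per example\n",
        "print((X_batch @ w_batch).shape) # torch.Size([32, 1])"
      ],
      "execution_count": 0,
      "outputs": []
    },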
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "l08nQdE9MThp",
        "colab_type": "text"
      },
      "source": [
        "### Sanity check\n",
        "To verify PyTorch is computing the gradients correctly, let's recall the gradient for the RSS objective:\n",
        "\n",
        "$$\\nabla_w \\mathcal{L}_{RSS}(w; X) = \\nabla_w\\frac{1}{n} ||y - Xw||_2^2 = -\\frac{2}{n}X^T(y-Xw)$$"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "R5HfA5YcMThp",
        "colab_type": "code",
        "outputId": "cad02754-6dde-4c34-c4dc-b273bcf2e614",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 52
        }
      },
      "source": [
        "# define a linear model with no bias\n",
        "def model(X, w):\n",
        "    return X @ w\n",
        "\n",
        "# the residual sum of squares loss function\n",
        "def rss(y, y_hat):\n",
        "    return torch.norm(y - y_hat)**2 / n\n",
        "\n",
        "# analytical expression for the gradient\n",
        "def grad_rss(X, y, w):\n",
        "    return -2*X.t() @ (y - X @ w) / n\n",
        "\n",
        "w = torch.tensor([[1.], [0]], requires_grad=True)\n",
        "y_hat = model(X, w)\n",
        "\n",
        "loss = rss(y, y_hat)\n",
        "loss.backward()\n",
        "\n",
        "print('Analytical gradient', grad_rss(X, y, w).detach().view(2).numpy())\n",
        "print('PyTorch\\'s gradient', w.grad.view(2).numpy())\n"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Analytical gradient [ 4.342543  -3.5023162]\n",
            "PyTorch's gradient [ 4.342543 -3.502316]\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Tmg4eFQAMThr",
        "colab_type": "text"
      },
      "source": [
        "Now that we've seen PyTorch is doing the right thing, let's use the gradients!\n",
        "\n",
        "## Linear regression using GD with automatically computed derivatives\n",
        "\n",
        "We will now use the gradients to run the gradient descent algorithm.\n",
        "\n",
        "Note: This example is an illustration to connect ideas we have seen before to PyTorch's way of doing things. We will see how to do this in the \"PyTorchic\" way in the next example."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "gea4LETnMThs",
        "colab_type": "code",
        "outputId": "e9701f26-7f06-4bc2-80c2-1cb2b68ea666",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 433
        }
      },
      "source": [
        "step_size = 0.1\n",
        "\n",
        "print('iter,\\tloss,\\tw')\n",
        "for i in range(20):\n",
        "    y_hat = model(X, w)\n",
        "    loss = rss(y, y_hat)\n",
        "    \n",
        "    loss.backward() # compute the gradient of the loss\n",
        "    \n",
        "    w.data = w.data - step_size * w.grad # do a gradient descent step\n",
        "    \n",
        "    print('{},\\t{:.2f},\\t{}'.format(i, loss.item(), w.view(2).detach().numpy()))\n",
        "    \n",
        "    # We need to zero the grad variable since the backward()\n",
        "    # call accumulates the gradients in .grad instead of overwriting.\n",
        "    # The detach_() is for efficiency. You do not need to worry too much about it.\n",
        "    w.grad.detach_()\n",
        "    w.grad.zero_()\n",
        "\n",
        "print('\\ntrue w\\t\\t', true_w.view(2).numpy())\n",
        "print('estimated w\\t', w.view(2).detach().numpy())"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "iter,\tloss,\tw\n",
            "0,\t7.82,\t[0.13149136 0.70046324]\n",
            "1,\t2.84,\t[-0.11822014  0.9229876 ]\n",
            "2,\t1.84,\t[-0.31444427  1.1054724 ]\n",
            "3,\t1.19,\t[-0.4684834  1.2552956]\n",
            "4,\t0.77,\t[-0.58927345  1.3784461 ]\n",
            "5,\t0.50,\t[-0.68387645  1.4797904 ]\n",
            "6,\t0.33,\t[-0.75787055  1.563287  ]\n",
            "7,\t0.22,\t[-0.8156596  1.632159 ]\n",
            "8,\t0.15,\t[-0.86071837  1.6890337 ]\n",
            "9,\t0.10,\t[-0.89578694  1.736055  ]\n",
            "10,\t0.07,\t[-0.9230244  1.7749742]\n",
            "11,\t0.05,\t[-0.94413096  1.8072236 ]\n",
            "12,\t0.03,\t[-0.9604442  1.8339758]\n",
            "13,\t0.02,\t[-0.9730157  1.8561921]\n",
            "14,\t0.02,\t[-0.9826713  1.8746614]\n",
            "15,\t0.01,\t[-0.99005884  1.8900318 ]\n",
            "16,\t0.01,\t[-0.99568594  1.9028363 ]\n",
            "17,\t0.01,\t[-0.99994993  1.913514  ]\n",
            "18,\t0.01,\t[-1.0031612  1.9224268]\n",
            "19,\t0.01,\t[-1.0055621  1.9298735]\n",
            "\n",
            "true w\t\t [-1.  2.]\n",
            "estimated w\t [-1.0055621  1.9298735]\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "AexdjJtcMThu",
        "colab_type": "text"
      },
      "source": [
        "## torch.nn.Module\n",
        "\n",
        "`Module` is PyTorch's way of performing operations on tensors. Modules are implemented as subclasses of the `torch.nn.Module` class. All modules are callable and can be composed together to create complex functions.\n",
        "\n",
        "[`torch.nn` docs](https://pytorch.org/docs/stable/nn.html)\n",
        "\n",
        "Note: most of the functionality implemented for modules can be accessed in a functional form via `torch.nn.functional`, but these require you to create and manage the weight tensors yourself.\n",
        "\n",
        "[`torch.nn.functional` docs](https://pytorch.org/docs/stable/nn.html#torch-nn-functional)."
      ]
    },
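    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a quick sketch of the functional form (an added example; the tensor shapes here are arbitrary), we can call `F.linear` directly, creating and managing the weight and bias tensors ourselves:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# weight of shape (out_features, in_features) and bias, managed by hand\n",
        "W = torch.randn(4, 3)\n",
        "b = torch.randn(4)\n",
        "x = torch.randn(2, 3)\n",
        "out = F.linear(x, W, b) # equivalent to x @ W.t() + b\n",
        "print(out.shape) # torch.Size([2, 4])"
      ],
      "execution_count": 0,
      "outputs": []
    },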
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "vuigjBAiMThv",
        "colab_type": "text"
      },
      "source": [
        "### Linear Module\n",
        "The bread and butter of modules is the `Linear` module, which applies a linear transformation with a bias. It takes the input and output dimensions as parameters, and creates the weights in the object.\n",
        "\n",
        "Unlike how we initialized our $w$ manually, the `Linear` module automatically initializes the weights randomly. For minimizing non-convex loss functions (e.g. training neural networks), initialization is important and can affect results. If training isn't working as well as expected, one thing to try is manually initializing the weights to something different from the default. PyTorch implements some common initializations in `torch.nn.init`.\n",
        "\n",
        "[`torch.nn.init` docs](https://pytorch.org/docs/stable/nn.html#torch-nn-init)"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "Yi4lhPVCMThv",
        "colab_type": "code",
        "outputId": "e0b12ab0-9759-460e-bd50-07cdb0003ae5",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 225
        }
      },
      "source": [
        "d_in = 3\n",
        "d_out = 4\n",
        "linear_module = nn.Linear(d_in, d_out)\n",
        "\n",
        "example_tensor = torch.tensor([[1.,2,3], [4,5,6]])\n",
        "# applies a linear transformation to the data\n",
        "transformed = linear_module(example_tensor)\n",
        "print('example_tensor', example_tensor.shape)\n",
        "print('transformed', transformed.shape)\n",
        "print()\n",
        "print('We can see that the weights exist in the background\\n')\n",
        "print('W:', linear_module.weight)\n",
        "print('b:', linear_module.bias)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "example_tensor torch.Size([2, 3])\n",
            "transformed torch.Size([2, 4])\n",
            "\n",
            "We can see that the weights exist in the background\n",
            "\n",
            "W: Parameter containing:\n",
            "tensor([[ 0.5260,  0.4925, -0.0887],\n",
            "        [ 0.3944,  0.4080,  0.2182],\n",
            "        [-0.1409,  0.0518,  0.3034],\n",
            "        [ 0.0913,  0.2452, -0.2616]], requires_grad=True)\n",
            "b: Parameter containing:\n",
            "tensor([0.5021, 0.0118, 0.1383, 0.4757], requires_grad=True)\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "YLNmKz9BMThx",
        "colab_type": "text"
      },
      "source": [
        "### Activation functions\n",
        "PyTorch implements a number of activation functions including but not limited to `ReLU`, `Tanh`, and `Sigmoid`. Since they are modules, they need to be instantiated."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "toOsF9qXMThy",
        "colab_type": "code",
        "outputId": "47865021-7a3f-4f36-8151-29d5cb41e382",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 52
        }
      },
      "source": [
        "activation_fn = nn.ReLU() # we instantiate an instance of the ReLU module\n",
        "example_tensor = torch.tensor([-1.0, 1.0, 0.0])\n",
        "activated = activation_fn(example_tensor)\n",
        "print('example_tensor', example_tensor)\n",
        "print('activated', activated)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "example_tensor tensor([-1.,  1.,  0.])\n",
            "activated tensor([0., 1., 0.])\n"
          ],
          "name": "stdout"
        }
      ]
    },
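    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The other activations mentioned above work the same way; a brief added sketch:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "tanh_fn = nn.Tanh()\n",
        "sigmoid_fn = nn.Sigmoid()\n",
        "example_tensor = torch.tensor([-1.0, 1.0, 0.0])\n",
        "print('tanh', tanh_fn(example_tensor))       # squashes values into (-1, 1)\n",
        "print('sigmoid', sigmoid_fn(example_tensor)) # squashes values into (0, 1)"
      ],
      "execution_count": 0,
      "outputs": []
    },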
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "tPdu_KS_MTh0",
        "colab_type": "text"
      },
      "source": [
        "### Sequential\n",
        "\n",
        "Many times, we want to compose Modules together. `torch.nn.Sequential` provides a good interface for composing simple modules."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "yn-jaKd3MTh1",
        "colab_type": "code",
        "outputId": "1dd86d16-7560-4882-c7bf-c8b3f1e63fbd",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 35
        }
      },
      "source": [
        "d_in = 3\n",
        "d_hidden = 4\n",
        "d_out = 1\n",
        "model = torch.nn.Sequential(\n",
        "                            nn.Linear(d_in, d_hidden),\n",
        "                            nn.Tanh(),\n",
        "                            nn.Linear(d_hidden, d_out),\n",
        "                            nn.Sigmoid()\n",
        "                           )\n",
        "\n",
        "example_tensor = torch.tensor([[1.,2,3],[4,5,6]])\n",
        "transformed = model(example_tensor)\n",
        "print('transformed', transformed.shape)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "transformed torch.Size([2, 1])\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "D5GkJ1UTMTh2",
        "colab_type": "text"
      },
      "source": [
        "Note: we can access *all* of the parameters (of any `nn.Module`) with the `parameters()` method. "
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "zTTsMkxoMTh3",
        "colab_type": "code",
        "outputId": "6b52d368-18a1-4486-8735-00d82482dc2b",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 208
        }
      },
      "source": [
        "params = model.parameters()\n",
        "\n",
        "for param in params:\n",
        "    print(param)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Parameter containing:\n",
            "tensor([[-0.5607,  0.4221, -0.0254],\n",
            "        [-0.3630,  0.4541,  0.0275],\n",
            "        [-0.0703, -0.1463,  0.3065],\n",
            "        [ 0.0065, -0.2664,  0.0267]], requires_grad=True)\n",
            "Parameter containing:\n",
            "tensor([-0.3196,  0.2911,  0.1999, -0.3758], requires_grad=True)\n",
            "Parameter containing:\n",
            "tensor([[-0.0289,  0.1544,  0.3992, -0.3301]], requires_grad=True)\n",
            "Parameter containing:\n",
            "tensor([-0.1438], requires_grad=True)\n"
          ],
          "name": "stdout"
        }
      ]
    },
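    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Note: `named_parameters()` additionally yields each parameter's name along with the tensor, which is handy for inspecting shapes or counting the total number of parameters of the model above:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "for name, param in model.named_parameters():\n",
        "    print(name, tuple(param.shape))\n",
        "\n",
        "# total scalar parameters: (3*4 + 4) + (4*1 + 1) = 21\n",
        "print('total:', sum(p.numel() for p in model.parameters()))"
      ],
      "execution_count": 0,
      "outputs": []
    },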
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "jifMOIcNMTh5",
        "colab_type": "text"
      },
      "source": [
        "### Loss functions\n",
        "PyTorch implements many common loss functions including `MSELoss` and `CrossEntropyLoss`."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "b8NXNEhlMTh6",
        "colab_type": "code",
        "outputId": "24e9a149-87e7-48a4-f96f-1ee6a14f0f86",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 35
        }
      },
      "source": [
        "mse_loss_fn = nn.MSELoss()\n",
        "\n",
        "input = torch.tensor([[0., 0, 0]])\n",
        "target = torch.tensor([[1., 0, -1]])\n",
        "\n",
        "loss = mse_loss_fn(input, target)\n",
        "\n",
        "print(loss)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "tensor(0.6667)\n"
          ],
          "name": "stdout"
        }
      ]
    },
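    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a quick sanity check, `MSELoss` is just the mean of the elementwise squared differences, so for the tensors above the loss is $(1^2 + 0^2 + (-1)^2)/3 \\approx 0.6667$:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# reproduce nn.MSELoss by hand\n",
        "manual_loss = ((input - target) ** 2).mean()\n",
        "print(manual_loss)  # tensor(0.6667)"
      ],
      "execution_count": 0,
      "outputs": []
    },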
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Hh0YZh1QMTh7",
        "colab_type": "text"
      },
      "source": [
        "## torch.optim\n",
        "PyTorch implements a number of gradient-based optimization methods in `torch.optim`, including Gradient Descent. At minimum, an optimizer takes the model parameters and a learning rate.\n",
        "\n",
        "Optimizers do not compute the gradients for you, so you must call `backward()` yourself. You must also call `optim.zero_grad()` before calling `backward()`, since by default PyTorch does an in-place add to the `.grad` member variable rather than overwriting it, i.e. gradients accumulate across calls.\n",
        "\n",
        "`zero_grad()` performs both the `detach_()` and `zero_()` calls on every parameter's `grad` tensor.\n",
        "\n",
        "[`torch.optim` docs](https://pytorch.org/docs/stable/optim.html)"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "CldNJzMHMTh8",
        "colab_type": "code",
        "outputId": "c62667a6-2b2a-4191-85d2-84e3ba51f1be",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 87
        }
      },
      "source": [
        "# create a simple model\n",
        "model = nn.Linear(1, 1)\n",
        "\n",
        "# create a simple dataset\n",
        "X_simple = torch.tensor([[1.]])\n",
        "y_simple = torch.tensor([[2.]])\n",
        "\n",
        "# create our optimizer\n",
        "optim = torch.optim.SGD(model.parameters(), lr=1e-2)\n",
        "mse_loss_fn = nn.MSELoss()\n",
        "\n",
        "y_hat = model(X_simple)\n",
        "print('model params before:', model.weight)\n",
        "loss = mse_loss_fn(y_hat, y_simple)\n",
        "optim.zero_grad()\n",
        "loss.backward()\n",
        "optim.step()\n",
        "print('model params after:', model.weight)\n"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "model params before: Parameter containing:\n",
            "tensor([[-0.9604]], requires_grad=True)\n",
            "model params after: Parameter containing:\n",
            "tensor([[-0.9060]], requires_grad=True)\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "dxv9VHTOMTh-",
        "colab_type": "text"
      },
      "source": [
        "As we can see, the parameter was updated in the correct direction, i.e. the one that reduces the loss."
      ]
    },
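    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "We can verify the step by hand: for the squared-error loss $(wx - y)^2$ (dropping the bias for simplicity), the gradient with respect to $w$ is $2(wx - y)x$, and SGD updates $w \\leftarrow w - \\mathrm{lr} \\cdot \\partial L / \\partial w$. A small sketch with fresh tensors so it runs on its own:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "w = torch.tensor([[-0.5]], requires_grad=True)  # an arbitrary starting weight\n",
        "x = torch.tensor([[1.]])\n",
        "y = torch.tensor([[2.]])\n",
        "lr = 1e-2\n",
        "\n",
        "loss = ((w * x - y) ** 2).mean()\n",
        "loss.backward()\n",
        "\n",
        "manual_grad = 2 * (w.detach() * x - y) * x\n",
        "print('autograd:', w.grad, ' manual:', manual_grad)  # both tensor([[-5.]])\n",
        "print('updated w:', w.detach() - lr * w.grad)        # -0.5 - 0.01 * (-5) = -0.45"
      ],
      "execution_count": 0,
      "outputs": []
    },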
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "VjiD9FATMTh_",
        "colab_type": "text"
      },
      "source": [
        "## Linear regression using GD with automatically computed derivatives and PyTorch's Modules\n",
        "\n",
        "Now let's combine what we've learned to solve linear regression in a \"PyTorchic\" way."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "RGz8gPweMTh_",
        "colab_type": "code",
        "outputId": "e3401d7e-9f8a-4e42-96e5-dfb00c32ccf6",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 433
        }
      },
      "source": [
        "step_size = 0.1\n",
        "\n",
        "linear_module = nn.Linear(d, 1, bias=False)\n",
        "\n",
        "loss_func = nn.MSELoss()\n",
        "\n",
        "optim = torch.optim.SGD(linear_module.parameters(), lr=step_size)\n",
        "\n",
        "print('iter,\\tloss,\\tw')\n",
        "\n",
        "for i in range(20):\n",
        "    y_hat = linear_module(X)\n",
        "    loss = loss_func(y_hat, y)\n",
        "    optim.zero_grad()\n",
        "    loss.backward()\n",
        "    optim.step()\n",
        "    \n",
        "    print('{},\\t{:.2f},\\t{}'.format(i, loss.item(), linear_module.weight.view(2).detach().numpy()))\n",
        "\n",
        "print('\\ntrue w\\t\\t', true_w.view(2).numpy())\n",
        "print('estimated w\\t', linear_module.weight.view(2).detach().numpy())"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "iter,\tloss,\tw\n",
            "0,\t6.14,\t[-0.4951109  -0.20055914]\n",
            "1,\t4.19,\t[-0.64017504  0.1509075 ]\n",
            "2,\t2.87,\t[-0.7496651  0.4441856]\n",
            "3,\t1.98,\t[-0.8317375  0.689143 ]\n",
            "4,\t1.37,\t[-0.8927491   0.89393103]\n",
            "5,\t0.95,\t[-0.93764454  1.0652909 ]\n",
            "6,\t0.67,\t[-0.9702622  1.208804 ]\n",
            "7,\t0.47,\t[-0.99357456  1.3290964 ]\n",
            "8,\t0.33,\t[-1.0098771  1.4300069]\n",
            "9,\t0.23,\t[-1.0209374  1.5147243]\n",
            "10,\t0.17,\t[-1.028112   1.5859002]\n",
            "11,\t0.12,\t[-1.0324373  1.6457422]\n",
            "12,\t0.09,\t[-1.0347017  1.6960896]\n",
            "13,\t0.06,\t[-1.035502   1.7384766]\n",
            "14,\t0.05,\t[-1.0352864  1.7741843]\n",
            "15,\t0.04,\t[-1.0343897  1.8042834]\n",
            "16,\t0.03,\t[-1.033059  1.829669]\n",
            "17,\t0.02,\t[-1.031475   1.8510911]\n",
            "18,\t0.02,\t[-1.0297676  1.8691778]\n",
            "19,\t0.01,\t[-1.0280287  1.8844559]\n",
            "\n",
            "true w\t\t [-1.  2.]\n",
            "estimated w\t [-1.0280287  1.8844559]\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "3j9hUvjQMTiD",
        "colab_type": "text"
      },
      "source": [
        "## Linear regression using SGD\n",
        "In the previous examples, we computed the average gradient over the entire dataset (Gradient Descent). We can implement Stochastic Gradient Descent with a simple modification: at each step, compute the gradient on a single randomly sampled point."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "q3EUcFMbMTiE",
        "colab_type": "code",
        "outputId": "34a3f055-cefa-4bc2-b6fb-13b3117b620c",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 260
        }
      },
      "source": [
        "step_size = 0.01\n",
        "\n",
        "linear_module = nn.Linear(d, 1)\n",
        "loss_func = nn.MSELoss()\n",
        "optim = torch.optim.SGD(linear_module.parameters(), lr=step_size)\n",
        "print('iter,\\tloss,\\tw')\n",
        "for i in range(200):\n",
        "    rand_idx = np.random.choice(n) # take a random point from the dataset\n",
        "    x = X[rand_idx] \n",
        "    y_hat = linear_module(x)\n",
        "    loss = loss_func(y_hat, y[rand_idx]) # only compute the loss on the single point\n",
        "    optim.zero_grad()\n",
        "    loss.backward()\n",
        "    optim.step()\n",
        "    \n",
        "    if i % 20 == 0:\n",
        "        print('{},\\t{:.2f},\\t{}'.format(i, loss.item(), linear_module.weight.view(2).detach().numpy()))\n",
        "\n",
        "print('\\ntrue w\\t\\t', true_w.view(2).numpy())\n",
        "print('estimated w\\t', linear_module.weight.view(2).detach().numpy())"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "iter,\tloss,\tw\n",
            "0,\t5.33,\t[-0.52818084  0.2690754 ]\n",
            "20,\t1.33,\t[-0.5849738   0.54701847]\n",
            "40,\t0.21,\t[-0.68336743  0.93094164]\n",
            "60,\t0.41,\t[-0.76554966  1.3865377 ]\n",
            "80,\t0.22,\t[-0.8548197  1.528812 ]\n",
            "100,\t0.45,\t[-0.9011376  1.679943 ]\n",
            "120,\t0.04,\t[-0.9418524  1.7858417]\n",
            "140,\t0.00,\t[-0.97288156  1.857902  ]\n",
            "160,\t0.00,\t[-0.98335326  1.893024  ]\n",
            "180,\t0.01,\t[-0.9927237  1.904962 ]\n",
            "\n",
            "true w\t\t [-1.  2.]\n",
            "estimated w\t [-0.99158174  1.9331173 ]\n"
          ],
          "name": "stdout"
        }
      ]
    },
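    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Between full-batch Gradient Descent and single-point SGD sits minibatch SGD. A minimal sketch using `TensorDataset` and `DataLoader` from `torch.utils.data` (with freshly generated data, since `X` and `y` were created in an earlier cell):"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "from torch.utils.data import TensorDataset, DataLoader\n",
        "\n",
        "torch.manual_seed(0)\n",
        "d_mb, n_mb = 2, 100\n",
        "true_w_mb = torch.tensor([[-1.], [2.]])\n",
        "X_mb = torch.randn(n_mb, d_mb)\n",
        "y_mb = X_mb @ true_w_mb + 0.1 * torch.randn(n_mb, 1)\n",
        "\n",
        "model = nn.Linear(d_mb, 1)\n",
        "loss_func = nn.MSELoss()\n",
        "optim = torch.optim.SGD(model.parameters(), lr=0.05)\n",
        "\n",
        "loader = DataLoader(TensorDataset(X_mb, y_mb), batch_size=10, shuffle=True)\n",
        "for epoch in range(20):\n",
        "    for x_batch, y_batch in loader:  # each step uses a batch of 10 points\n",
        "        loss = loss_func(model(x_batch), y_batch)\n",
        "        optim.zero_grad()\n",
        "        loss.backward()\n",
        "        optim.step()\n",
        "\n",
        "print('true w\\t\\t', true_w_mb.view(2).numpy())\n",
        "print('estimated w\\t', model.weight.view(2).detach().numpy())"
      ],
      "execution_count": 0,
      "outputs": []
    },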
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "6Re8u8STMTiI",
        "colab_type": "text"
      },
      "source": [
        "# Neural Network Basics in PyTorch\n",
        "We will try to fit a simple neural network to the data."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "401On5ckMTiJ",
        "colab_type": "code",
        "outputId": "8f604a99-f52f-4736-fc9c-4de39d01c96d",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 297
        }
      },
      "source": [
        "%matplotlib inline\n",
        "\n",
        "d = 1\n",
        "n = 200\n",
        "X = torch.rand(n,d)\n",
        "y = 4 * torch.sin(np.pi * X) * torch.cos(6*np.pi*X**2)\n",
        "\n",
        "plt.scatter(X.numpy(), y.numpy())\n",
        "plt.title('plot of $f(x)$')\n",
        "plt.xlabel('$x$')\n",
        "plt.ylabel('$y$')\n",
        "\n",
        "plt.show()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAEYCAYAAABRB/GsAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nO3dcXRc1X0n8O/PYoBxoJG9KJswWJi6\niUmMgtXoYHF8ziYQElMSmymEOK6dbLbZ+LTdZDFlxdqgU8upU5xoE9jddJt1GrZp7RAH4kzt2F0H\naih7OJG3cmRbCHBKKLYZ0sUpiJJYAVn+7R8zY49m3nvzZvTeu/e99/2c43OkN29m7pOl+b177+/+\nrqgqiIgovWaZbgAREZnFQEBElHIMBEREKcdAQESUcgwEREQpx0BARJRyDARERCnHQEBElHIMBJQ6\nIvKCiNwQ0XstFJFDIvK6iPxHl3M6ROQREXlVRB4QkXtFZJ3P1/+/IrIo2FZT2pxnugFENhORFwD8\ne1V9tMWXuAvAY6q62OOcDQD+QVU/JCIdAA4B+A2fr/9fAHwBwK0tto+IPQKikF0OYKzBOTcAeKj8\n9acB7FXVCZ+vvwvAdSLy9taaR8RAQAlVHv7ZICJPl4dc/peIXOhw3rtF5HERGReRMRFZUfXYXwHo\nBLBbRH4hInc1+fz9AK4D8LXy899V89zzReQ1AF3l9xgF8FsA/q7mvC+LSKHq+0ER+VsROV9VfwXg\nIIBlrf2kiBgIKNlWo/QBuQDAuwD0Vz8oIhkAuwH8EMDbAHwewHYRWQgAqvpJAMcBLFfVi1T1y00+\n/3oA/wfA58rP/0n181X1TQDXAni5/HgXSkHhaM11fAmlu/5uEfk9ADcCuKX8fAB4BsDVrfyAiAAG\nAkq2r6nqCVV9BcAXAayqebwXwEUAtqjqm6q6H8APHM5zM9PnA8BiAIervm8H8Hr1Car6zwDuA/At\nlOYTblLV16pOeb38PKKWMBBQkp2o+voYgEtrHr8UwAlVPVNzXs7n68/0+UB9IHgVwMUO542g1FvY\noKonah67GMB4E+9JNA0DASXZvKqvOwG8VPP4SwDmicismvOKVd97bdjh5/mNXI3pgeAISsNYZ4lI\nF4A/Q6lH8LsOr/HumtcgagoDASXZfxCRy0RkLoB7AOyoefwAgFMA7hKRjIh8AMByAN+pOuf/Afh1\nl9f38/xGagPBXgDvr3wjIjmU5iF+D8AfAOgqv0/l8QsBvA/AI028J9E0DASUZN9GaSL3eQA/BbC5\n+sHyZOtylDJ1fg7gfwD4lKo+W3XavQD6y1lB/6mF57sqp3zOAVB9/l8CuElEsiLyaygFhq+q6i5V\nPQVgEKX5jorlAB5X1dreDpFvwq0qKYkCWAhmjIj8CUqZRPf7OPcAgM+o6lPht4ySiiuLiSyjqnc3\nce6SMNtC6cChISKilOPQEBFRyrFHQESUcrGcI7jkkkt0/vz5pptBRBQrBw8e/LmqdtQej2UgmD9/\nPoaHh003g4goVkTkmNNxDg0REaUcAwERUcpZEwhEpE1ERkTkB6bbQkSUJtYEAgC3o1RXnYiIImRF\nIBCRywB8BMCfm24LEVHa2JI1dD9Km3w71WEHAIjIWgBrAaCzszOiZhGZVxgp4p7vj+KXb04BAATA\n6t5ObM53mW0YJYbxHoGIfBSlAlsHvc5T1a2q2qOqPR0ddWmwRIm0+hs/wrodh84GAaC0QcK2oeOY\nv34P+guj5hpHiWE8EABYCmBFuVrkdwBcLyLbzDaJyLz+wiie/OkrnudsGzrOYEAzZjwQqOoGVb1M\nVecD+ASA/aq6xnCziIx78EDtjpQzO4/IjfFAQETOpnwWhPR7HpEbWyaLAQCq+jiAxw03g8iowkgR\ng/uO+j6/TSTE1lAaWBUIiNKu
MFLEH+44hDNNPGfVknmhtYfSgYGAyCIbdh7xHQSYRkpBYSAgssjE\npHsYeGHLRyJsCaUJJ4uJiFKOPQIiSxRGiq6PzXKZD65MLL80PoFL27PoW7YQ+e5cSC2kpGIgILJA\nYaSIDTvdF4b9zpL6siqV50xMllYdF8cnzr4GgwE1g0NDRBYY3Hf07Ad6rTUuE8JOz5mYnGoq9ZQI\nYCAgskJxfML1MbesILfneL0WkRMGAiILuM0BeC0Wc3uMC8yoWQwERIYVRoo441Ilwqt8hNtjLDlB\nzWIgIDLMa0w/155t+jGv5xA5YSAgMuwljzH9vmULPR/LZtqmHctm2jyfQ+SEgYDIsEtd7uDbsxnP\nNNB8dw733tKFXHsWglJP4N5bupg6Sk3jOgIiw/qWLZy2HgAo3dkPrFjU8Ln57hw/+GnGGAiIDKt8\nkHOFMJnCQEBkAd7Zk0kMBEQGsVYQ2cB4IBCRCwE8AeAClNrzsKpuNNsqovCxVhDZwoasoTcAXK+q\nVwNYDOBGEek13Cai0LFWENnCeI9AVRXAL8rfZsr/uDSSEs9t/YDXugKiMNjQI4CItInIIQAvA3hE\nVQ84nLNWRIZFZPjkyZPRN5IoYG7rB9yOE4XFikCgqlOquhjAZQCuEZGrHM7Zqqo9qtrT0dERfSOJ\nAsaVwWQLKwJBhaqOA3gMwI2m20IUNq4MJlsYnyMQkQ4Ak6o6LiJZAB8C8CXDzSKKBNcPkA2MBwIA\n7wDwLRFpQ6mH8l1V/YHhNhERpYbxQKCqRwB0m24HEVFaWTVHQERE0WMgICJKOQYCIqKUMz5HQJQ2\n/YVRPHjgBKZU0SaCVUvmYXO+y3SzKMUYCIgi1F8Yxbah42e/n1I9+z2DAZnCoSGiCD144ERTx4mi\nwEBAFKEpda6n6HacKAoMBEQRahNp6jhRFBgIiCK0asm8po63qjBSxNIt+3HF+j1YumU/CiPFQF+f\nkoWBgChCm/NdWNPbebYH0CaCNb2dgU4UV3Y+K45PQFHa+eyOHYfQXxgN7D0oWURjODbZ09Ojw8PD\npptBZKWlW/aj6LC5jQC4b+ViFrlLMRE5qKo9tceZPkqUMG47nClK22MmLRBUr8uoyLVn0bdsYeKu\nNSwcGiJKGK8dzpK2DWZlXUZt1hWHw5rDQECUMH3LFsItBylp22B6rb9QANuHjnOi3AcGAqKEyXfn\nsLq3sy4YJG0bzP7CaMP1FwpgYNdYNA2KMQYCogTanO/CfSsXJ3YbzNpSHV7GJyY5RNSA8UAgIvNE\n5DEReVpExkTkdtNtIkqCfHcOfcsW4tL2LF4an8DgvqOJGSZptiTHNg4RebIha+g0gDtV9ccicjGA\ngyLyiKo+bbphREEqjBQxuO8oXhqfwKURZLVU1hNMTE4BKE2gbthZujOOe8+glZIcScyYCorxHoGq\n/kxVf1z++nUAzwDg/xYlitMirw07R0O9Sx3cd/RsEKiYmJzC4L6job1nVFopyeG0toJKjAeCaiIy\nH6X9iw84PLZWRIZFZPjkyZNRN41oRgZ2jUX+oeyWKpqEFFK3khwZj080ATg85MKaQCAiFwH4HoB1\nqvovtY+r6lZV7VHVno6OjugbSNSiwkgR4xOTjo+F+aH81mymqeNx4laqY/C2xa4fapUFdVTPhjkC\niEgGpSCwXVV3mm4PUZC8PnzCzOt3Gz1JSqHTzfku1xpN63YccjyehN5QGIz3CEREAHwTwDOq+lXT\n7SEKmteHT5h5/eOnnHshbsfjwE9V1Xx3DjmXAJu0BXVBsaFHsBTAJwGMikgljN+tqnsNtsl6Thko\nACLNSiF/Lm3POk5UzpmdCfX/x+194/ph2EwWVN+yhdPOBZK3oC5IrD4aA4WRIgZ2jbmOMwNApk0A\nBSbPuP9/ZjOzcO8t72VwiFjtBxhQ+lAKe4GXqfcNi1tV1Vx7Fk+uv77ueNTpunHA6qMx5CcAVE
xO\nNQ7oE5NnsG7HIQwfe4UbpUeo8uET9YdS5fWrf4cu9EqrsVyzWVD57lzqP/j9YiCwUDMBoBXbho5j\nz5GfYePyRfxDiYjJD6U3Tp85+/WrpyZju6gsaUNdNonv7UFC9RdGcceOQ6EFgYpXT02yTG8KJGlR\nWd+yhchm2qYd47h/MNgjsEhhpIjtQ8fRyqyNnzmCWpUyvT2Xz43d3WGcmByrTsqissrPcGJyCm0i\nmFLl5jMBYiCwQDOVFJ3karKGmllKn9Rdq2xhut5PEoZTan+GU6pnewL8vQ0GA4FhrQaB2ZlZ+BOH\nDKDK96U/niOYmDzj9PRp4nZ3GCdeQzNRfIglIY3S9M8wDRgIDGu2nG57NoOBFY0neasnJxtNPsfp\n7jBuTA/NmMpYCpLpn2EaMBAYUhnz9FtOdybjoZWg0F8YrZuDiNvdYdzYMDQT9zRKG36GScdAELFm\nU0PvX7k4sD/izfku9Fw+N9Z3h3GThKEZ0667ssNx+PS6K1l8MigMBBFyWunpZU1vZ+Af0nG/O4yb\nJAzNmPbYs85l592OU/MYCCJSGCnizu8e9jUUJABW93Zy9W9CMPjOjFsWHOcIgsNAEIFKT6BREHCr\nmUKUVoWRIgRwXFvDOYLgMBCEzG9PgOPGRPUG9x11DAKC1kp4sxCdMwaCEDll6TiZMztjVd0f/rGQ\nLdyGfxTNL8gzvbjPZgwEISiMFNH30CE0WsvVJoKvfPxqq34J+cdCNnFLHXXbeMaL28K0gV1jqf/d\nZtG5gPUXRrFuR+MgkM20WRcEAO8/FqKoBVlozm3SeXxiMvWb2jMQBKgwUvRVLqJNxNrNQdy64vxj\nIRPy3Tnce0sXcu1ZCEo9gVb/dto8NmuOYzXWIFkRCETkARF5WUSeMt2Wmdi0u/FdswBW9gQqvDIx\n0v7HQtELcr7KK2Ej7amoVgQCAH8B4EbTjWhVZUPtV31sCr46hEViQfLqcqf9j4WiVZmvKo5PQHFu\nvqrVnqnXvELaU1GtCASq+gSAV0y3oxWrv/EjrNtxyFfp56UL5lq/SCzfncOc2RnHx2aJcHiIIhP0\npjp9yxYiM6t+eCjTJqlP3bYiEPghImtFZFhEhk+etGNp+epv/AhP/tRf/FrT24ntn7025BYFY+Py\nRXUTdECpaz2TOzKiZgRddTTfncPgbVejPXvuRmfO7AwGP2bvUG1UYpM+qqpbAWwFgJ6enlY28QpU\nYaToKwhkZgkGb4vXL1qlrU4L4VgH3h+uxZi5MKqOstyHs9j0CGxSSRFtJNeejV0QqMh353DGZXKN\ncwXegh7bTivuURyd2PQIbOF3OCjI8tGmsA58a7ijVjDCrtzKXts5VgQCEXkQwAcAXCIiLwLYqKrf\nNNuqen6Hg5YuSMZm8Kyl3xruqBWcsIZyuIJ+OisCgaquMt0GP/xkKyxdMDc2k8KNsJZ+a9iTsh97\nbdNZEQjiotEdXXs2k5ggUMHJteaxJ2U/9tqm42RxE7zu6GYBGFixKLrGkLWCLItA4XD7W05rr409\ngiY43ekBQGYWMHhb/CeHKTg296Q4ScpeWy0GAheFkSLuevgw3pw6l0K5dMFc3HtLV+r/iPhBEl+c\nJC3h/Nd0DAQO3FJEK8fSvJ1kaa+Fw5g8UwqQxfEJ9D10GEC6PkjiipOk59QGg0oySNp+DgDnCOr0\nF0Y9U0T9lpRIqoFdY2eDQMXkGeV+BTHBSdJzCiNF9D18eNrCv76HD6dy4R8DQRW/+wmk2fiEc4VV\nt+NpU6lEe8X6PVi6Zb91HyqcJD1n0+4xTE7V3NRMqa9y8knDQFBWGfIgalUcSkuwbMM5bmXj/ZST\nTxoGgrINO4/UDXk4WbpgbgStsZdbiWq342kSdNnkMDC1lZxwshileYGJRpsMI1mrhlu1cfki9D18\nuK5LrVq6I07zB0pcxt9tTm2NUns24zikWV2mOi1SHQgKI0
XcvfMITjUIAmt6O63fUCYqlQ+QTbvH\npnWhxycmU5mGWO2tLh8sb03hB0scDKxYNC0DDiiVjU/jwtDUDg0VRoq486HDDYPAnNkZBoEa+e4c\nZp9ffw9h2zBI1Nz2RvfYM50MqmxUUz1MFtey8TOV2h7Bpt1jmPIxJ7BxefruDvyIyzBIlMZdJhnd\njpN5Ng6TFUaKGNg1Nq132Z7NYGDFotDamsoeQWGk6CszYI3lG82bxDTEevyZ0ExVshdrhxjHJyax\nbsch9BdGQ3nf1PUIKil+jXBewBtrtdTjz4SaVVuu5dSbpz2zF7cPHUfP5cHvd5K6HoFTil8tBoHG\nmIZYjz8TaobTyuZGIxUKYN2OQ4EvVhR12Zc2SiJyI4D/CqANwJ+r6hav83t6enR4eLip96hEXqcN\nQ6oxCBBRFLq/8MMZLV7LZtqavtEQkYOq2lN73HiPQETaAPwpgN8C8B4Aq0TkPUG+R/WKTzdtIrh/\n5WIGASIKnd95Si9BZuk1DAQi8oiIXB3Iuzm7BsBzqvq8qr4J4DsAbg7yDRoNB2UzbfjKx9OZNkZE\n0fIzT+l3UVtQWXp+Jov/M4D7ReQFAHer6s8CeedzcgBOVH3/IoAlQb6B1w8rl/I65EQUrUY3pu3Z\nDA5t/DCAxkPaQWWkNQwEqvpjANeJyK0A/reI7ATwZVWNNGFcRNYCWAsAnZ2dTT3XbTPxXHs21XsL\nBI0b1hA15nVjWruyubLOoXZDISDYjDRfcwQiIgCOAvgzAJ8H8A8i8slAWgAUAcyr+v6y8rFpVHWr\nqvaoak9HR0dTb8CKi+Fzqrx5R4h5z0Rx5XYX3ybiurI57Iy0hllDIvIkgCsAjAEYAnAAwLMAbgdw\ngaqunVEDRM4D8BMAH0QpAPw9gN9RVdei4DPJGuLdajiWbtnv2OsSAPet5H7OtuLfRfTc7u6jSDV2\nyxryM0ewFsDTWh8xPi8iz8y0Yap6WkQ+B2AfSumjD3gFgVbZuJQ8Sdy6uwqkchvEOOD+xWbYuF+y\nnzkCrw/ljwTRCFXdC2BvEK9FZrjNwwDprj9kM+5fbI5tN6YzWkegqs8H1RCKt75lC+FWZJO1duzE\nwoFUkbpaQxSOfHcOw8dewfah46geQ0zDpHxcx9ndenEM3OljfGUxJcfmfBfuW7l42raVF5yX7F+x\nOOxT7IbZdM4KI0Us3bIfV6zfE3hNH1sl+6+UjPhV1WY/lZ3LkvrHFId9it2wSF69OAf2meDQEAUq\nbROQcR9nt23S0rS0/f5WMBBQoOL+wdis9tkZx+Jh7bO5T3EcBf37G5f5Iw4NUaDStkuX23pMC6q7\nUwuC/P112m+g7+HDVg4zMRBQoNI2AfnahHMpYbfjZLcgf3837R7D5NT0O4LJKcWm3YGvl50xDg1R\noGxcNRkmpmAmS5C/v277Dcx0H4IwMBBQ4KonICtjpHfsOJTIoMB9ipMnjRPoDAQUmjTUsklbD4j8\n8ZoH8LvpTJQYCCg0aUnFS+MdZFq0mvXjNQ9Qvd+ALRgIKDRuKXdee0cT2aLVHm2j/YhtvGlg1hCF\nxmvC1MYUOqJqra4a93o8Z2kSAQMBhcZrwtTGFDqiaq0uLvN63NYkAgYCCo1XF9jGFDqiaq32aN2e\n157NWDksBDAQEBE56lu2EJlZ9btsKIB1Ow7VBYNK1dLi+ETd3hzZTJuVk8QVRgOBiNwmImMickZE\n6vbRpPhzS5WzMYWOqFq+O4eLLnTPp7l755GzX1eXkwAwbU+OOFR1Nd0jeArALQCeMNwOCsnAikV1\nd1WZWWL13RFRxbjHEOapqnLrd+88UldOAgDmzM7gyfXXWx0EAMPpo6r6DACIuG1ySHHHBVfxEZdK\nmVHy2osbODdXUB0UqsVlLiw26whEZC2AtQDQ2dlpuDXUDC64sl8aVoG3om/ZQqzbccj18ThsQORH\n6ENDIvKoiDzl8O/mZl
5HVbeqao+q9nR0dITVXApRGrcAjIs477QWpnx3DhmPT8ni+IRnjyEuc2Gh\n9whU9Yaw34PsxztOu6VtQ6FmDN622LNX4CUuc2GmJ4spJXjHaTcTGwrFpYeY785hTW9nXUqoFwGw\nprczNjc5ptNHf1tEXgRwLYA9IrLPZHsoPKw7ZLeoNxSK2ybxm/NduG/lYuTasw0DQq49i/tWLsbm\nfFckbQuC0UCgqt9X1ctU9QJV/dequsxkeyg8bneWAtYdskG+O4d7b+k6+0EXdu57HHuI+e4cnlx/\nPf5xy0dcawbl2rOxSBetFZusIYq3vmULcceOQ6jNtFYgcWWp4yrK7K64z0kkbUMizhFQJPLdubog\nUBGXP/40iGrc3sScRJCi7kGFjT0CikyO+/taLcrMriTcUSdpfQx7BBSZqCckqTlRj9tfWJWg357N\nxPqOOu7YI6DIsNyE3aIat6/teQDAG6edSzRQNBgIKFJJ6k4njVtdnaCH7tKyl3WccGiIiABEN3QX\n94yhJGIgICIA0WXCxD1jKIk4NETGxLHscRzb3Iwohu6SkDGUNAwEZEQci9D1F0axfej42fUQcWiz\njZg0YB8GAjIibhOGhZHitCBQYXObbcakAbtwjoCMiNuE4eC+o1wZTYnFHgEZEVWqYlC8PuxtbbON\nkj7HElfsEZARcVtl7FU91dY22yZupafThIGAjIhb0S6nwCUAVsdo8xHT4lh6Oi0YCMiYSn331b2d\n+KfXfoV1Ow5hwYa96C+Mmm5aHafAFbfNR0yL27xQmnCOgIzqL4xi29Dxs99PqZ793rYPWWa6zEzc\n5oXSxPRWlYMi8qyIHBGR74tIu8n2UPQePHCiqeMUnf7CKBZs2Iv56/cE0lO77sqOum0ebZ4XShPT\nQ0OPALhKVd8L4CcANhhuD0VsSp2TMt2OUzQqPbXK/0Olp9ZqMCiMFPG9g8VpKbgC4Nb3sZdlA9N7\nFv9QVU+Xvx0CcJnJ9lD02sR5K3C34xSNoHtqThPFCuCxZ0+29HoULNM9gmq/C+Bv3B4UkbUiMiwi\nwydP8pcnKVYtmdfUcYpG0D01ThTbLfRAICKPishTDv9urjrnHgCnAWx3ex1V3aqqPara09HREXaz\nKSKb811Y09t5tgfQJoI1vZ3WTRSnTdA9NVYctVvoWUOqeoPX4yLyaQAfBfBBVQ4Mp9HmfJfVH/z9\nhVE8eOAEplTRJoJVS+ZZ3d4grFoyb1o2V/XxVlx3ZYfj6113JW/qbGA0fVREbgRwF4D3q+opk20h\nchKn9NYgVa6tutDeW85vQ8/lc1t6Pbe5AM4R2MH0HMHXAFwM4BEROSQiXzfcHqJp0pze2nP5XJzX\ndm4o6JdvTqHv4cMtlYTgHIHdjPYIVPU3TL4/USNpTm/dtHsMk1PTr3NySrFp91jTKZ9cTGY30z0C\nIqulOb311VOTTR13Uxgp4pdvnK47zsVk9mAgIPLA9NaZqVQcHZ+YHjzmzM5YXWQwbRgIiDykOb21\nPZtp6rgTp4VkADD7/PMYBCzConNEDdie3hqWgRWL0PfQYUyemT5P8NGr3+H7NThJHA/sERCRo3x3\nDiuvmVdXKO57B4u+M4e4kCweGAiIyNVjz56s26t5YnIKm3aP+Xp+3HaiSysGAiJy5TaE8+qpSV+9\ngrjtRJdWnCMgIldu+f9AaSLYzwc6N/SxH3sEROTKawiHE77JwUBARK7y3TnXdFFO+CYHAwFZrTBS\nxNIt+3HF+j1YumV/S3VuaGYGVizihG/CcY6ArFVZlVpZkFQcn8CGnaWtEsMccy6MFDG47yheGp/A\npe1Z9C1bmOox7sq182eSXAwEZC2nVakTk1O+JylbURgpTltEVRyfQN9DhwGEG3xsxwnfZOPQEFnL\nLVvF7XgQBnaN1a2knTyjGNjlL2+eKI7YIyBrtYk4lnsOs/JnbXG0RsfTxu+wGYfX4oWB
gKyV5r0A\nbOR3zsbU3A61zujQkIj8sYgcKe9O9kMRudRke8guOZf0RLfjQZgz2zlV0u14mnjN2VQb2DXm6zyy\nh+k5gkFVfa+qLgbwAwB/ZLg9ZBETdWo2Ll+ETNv0oadMm2Dj8kWhvWdc+KkkWhgpug6jcQGavUxv\nVfkvVd++Bairb0UpZiJtkamS7vxsN+l1188FaPYSNTzeKiJfBPApAK8BuE5VT7qctxbAWgDo7Ox8\n37Fjx6JrJFmBE5Bm1Y79V7RnMxhYsQj57hyuWL/H9W7u/pWL+f9lmIgcVNWeuuNhBwIReRTA2x0e\nukdV/7rqvA0ALlTVjY1es6enR4eHhwNsJdnO6UMom2ljJcuIFUaK2LR7rG7fYgGwurcTjz170rHX\nMGd2BiN/9OGIWklu3AJB6HMEqnqDql7l8O+va07dDuDWsNtD8eQ2Ucn8/mjlu3OYfX79iLIC2DZ0\n3DEIZDNtnGOxnOmsoXdWfXszgGdNtYXs5jbROD7hry4+BaeZSV/uPxAPprOGtojIUyJyBMCHAdxu\nuD1kKa+JRqYlRsvvpG+uPYsn11/PIBADRgOBqt5aHiZ6r6ouV1Xe2pEjr5TRMEtOUL2+ZQvr9jF2\nwnTR+DDdIyDyJd+d81zU1V8YjbA16ZbvzmF1b2fD85guGh8MBBQbXhOODx440fLr9hdGsWDDXsxf\nvwcLNuxlUPFhc74LSxfMdX08M0u4X0GMMBBQbHiNNbdaf6i/MIptQ8fPPn9KFduGjjMY+LD9s9di\nTW8namsAtmczGLztas4NxAiLzlGsuFUkBUo57s1++GwfOu54/MEDJ7A539V0+9Jmc76LP6cEYI+A\nYmXVknmuj23a3dyagsJI0XUVLCucUpowEFCseN19vnqquTUFXmmnYe55QGQbBgKKHa8y1M2sNPZK\nO/XqeRAlDQMBxY5XNorflcZek8FvOb+N496UKgwEFDv57hzas+5rChrNFRRGiq6TxALgi7/NIEDp\nwkBAsTSwwn1NQW1lzFqD+466ThIruJ0ipQ8DAcXSTD6svUofhLkNJpGtGAgottyGh7yGjQojRcxy\nyQgSeM8/ECUVAwHF1sCKRcjMqtlfeJa4DhtVNrdxWiNQ2ViFw0KURlxZTLHVaH/h/sIovn3gOM40\nWBvWJoKvfJwlESi9GAgo1vLdOccP8EoNIT/OqDIIUKpxaIgSqZlqpCyXTGnHQECJ5LdWUDbTxgli\nSj0rAoGI3CkiKiKXmG4LJYOfWkHcT5eoxPgcgYjMQ2m/Yn8DukQ+rFoyz3WOINMmGPwYJ4eJKmzo\nEdwH4C7AdbEnUdM257uwprcTNdmlmDM7wyBAVEPUYN11EbkZwPWqeruIvACgR1V/7nLuWgBrAaCz\ns/N9x44di66hREQJICIHVV28+UMAAAUVSURBVLWn9njoQ0Mi8iiAtzs8dA+Au1EaFmpIVbcC2AoA\nPT097D0QEQUk9ECgqjc4HReRLgBXADgspYm9ywD8WESuUdV/CrtdRERUYmyyWFVHAbyt8n2joSEi\nIgqHDZPFRERkkPH00QpVnW+6DUREaWQ0a6hVInISQKtpQ5cASNvwE685HXjN6dHqdV+uqh21B2MZ\nCGZCRIad0qeSjNecDrzm9Aj6ujlHQESUcgwEREQpl8ZAsNV0AwzgNacDrzk9Ar3u1M0REBHRdGns\nERARURUGAiKilEtkIBCRG0XkqIg8JyLrHR6/QER2lB8/ICLzo29lsHxc8x+KyNMickRE/lZELjfR\nzqA1uu6q824tb34U+1RDP9csIh8v/3+Pici3o25j0Hz8fneKyGMiMlL+Hb/JRDuDJCIPiMjLIvKU\ny+MiIv+t/DM5IiK/2fKbqWqi/gFoA/BTAL8O4HwAhwG8p+acPwDw9fLXnwCww3S7I7jm6wDMLn/9\n+3G/Zr/XXT7vYgBPABhCqZ6V8baH/H/9TgAjAOaU
v3+b6XZHcM1bAfx++ev3AHjBdLsDuO5/A+A3\nATzl8vhNAP4GgADoBXCg1fdKYo/gGgDPqerzqvomgO8AuLnmnJsBfKv89cMAPijiY29DezW8ZlV9\nTFVPlb8dQqnaa9z5+b8GgD8G8CUAv4qycSHxc82fBfCnqvoqAKjqyxG3MWh+rlkB/Fr567cCeCnC\n9oVCVZ8A8IrHKTcD+EstGQLQLiLvaOW9khgIcgBOVH3/YvmY4zmqehrAawD+VSStC4efa672GZTu\nJOKu4XWXu8vzVHVPlA0LkZ//63cBeJeIPCkiQyJyY2StC4efax4AsEZEXgSwF8Dno2maUc3+3buy\npugcRUNE1gDoAfB+020Jm4jMAvBVAJ823JSonYfS8NAHUOr5PSEiXao6brRV4VoF4C9U9Ssici2A\nvxKRq1T1jOmGxUESewRFAPOqvr+sfMzxHBE5D6Wu5D9H0rpw+LlmiMgNKO0Mt0JV34iobWFqdN0X\nA7gKwOPl/S56AeyK+YSxn//rFwHsUtVJVf1HAD9BKTDElZ9r/gyA7wKAqv4IwIUoFWZLMl9/934k\nMRD8PYB3isgVInI+SpPBu2rO2QXg35a//hiA/VqefYmphtcsIt0A/idKQSDuY8YVntetqq+p6iWq\nOl9LZc6HULr+YTPNDYSf3+8CSr0BiMglKA0VPR9lIwPm55qPA/ggAIjIu1EKBCcjbWX0dgH4VDl7\nqBfAa6r6s1ZeKHFDQ6p6WkQ+B2AfStkGD6jqmIh8AcCwqu4C8E2Uuo7PoTQZ8wlzLZ45n9c8COAi\nAA+V58WPq+oKY40OgM/rThSf17wPwIdF5GkAUwD6VDW2PV6f13wngG+IyB0oTRx/OuY3dxCRB1EK\n6JeU5z42AsgAgKp+HaW5kJsAPAfgFIB/1/J7xfxnRUREM5TEoSEiImoCAwERUcoxEBARpRwDARFR\nyjEQEBGlHAMBEVHKMRAQEaUcAwFRAMq18D9U/nqziPx3020i8itxK4uJDNkI4Asi8jYA3QBivWqb\n0oUri4kCIiJ/h1IZjw+o6uum20PkF4eGiAIgIl0A3gHgTQYBihsGAqIZKu8KtR2lHaN+kYCNYChl\nGAiIZkBEZgPYCeBOVX0GpW0xN5ptFVFzOEdARJRy7BEQEaUcAwERUcoxEBARpRwDARFRyjEQEBGl\nHAMBEVHKMRAQEaXc/wdbG4mxcZWRhAAAAABJRU5ErkJggg==\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "iQl2A_XDMTiL",
        "colab_type": "text"
      },
      "source": [
        "Here we define a simple neural network with two hidden layers and Tanh activations. There are a few hyperparameters to play with to get a feel for how they change the results."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "C72y6DtpMTiM",
        "colab_type": "code",
        "outputId": "e91ae971-a87f-430e-ab35-1918173b0b4f",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 208
        }
      },
      "source": [
        "# feel free to play with these parameters\n",
        "\n",
        "step_size = 0.05\n",
        "n_epochs = 6000\n",
        "n_hidden_1 = 32\n",
        "n_hidden_2 = 32\n",
        "d_out = 1\n",
        "\n",
        "neural_network = nn.Sequential(\n",
        "                            nn.Linear(d, n_hidden_1), \n",
        "                            nn.Tanh(),\n",
        "                            nn.Linear(n_hidden_1, n_hidden_2),\n",
        "                            nn.Tanh(),\n",
        "                            nn.Linear(n_hidden_2, d_out)\n",
        "                            )\n",
        "\n",
        "loss_func = nn.MSELoss()\n",
        "\n",
        "optim = torch.optim.SGD(neural_network.parameters(), lr=step_size)\n",
        "print('iter,\\tloss')\n",
        "for i in range(n_epochs):\n",
        "    y_hat = neural_network(X)\n",
        "    loss = loss_func(y_hat, y)\n",
        "    optim.zero_grad()\n",
        "    loss.backward()\n",
        "    optim.step()\n",
        "    \n",
        "    if i % (n_epochs // 10) == 0:\n",
        "        print('{},\\t{:.2f}'.format(i, loss.item()))\n",
        "\n"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "iter,\tloss\n",
            "0,\t4.33\n",
            "600,\t4.27\n",
            "1200,\t3.87\n",
            "1800,\t1.54\n",
            "2400,\t0.71\n",
            "3000,\t0.78\n",
            "3600,\t0.24\n",
            "4200,\t0.10\n",
            "4800,\t0.08\n",
            "5400,\t0.08\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "vQBkFt9LMTiO",
        "colab_type": "code",
        "outputId": "c4353831-4f51-4087-fcaf-41b21cca1b73",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 302
        }
      },
      "source": [
        "X_grid = torch.from_numpy(np.linspace(0,1,50)).float().view(-1, d)\n",
        "y_hat = neural_network(X_grid)\n",
        "plt.scatter(X.numpy(), y.numpy())\n",
        "plt.plot(X_grid.detach().numpy(), y_hat.detach().numpy(), 'r')\n",
        "plt.title(r'plot of $f(x)$ and $\\hat{f}(x)$')\n",
        "plt.xlabel('$x$')\n",
        "plt.ylabel('$y$')\n",
        "plt.show()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAYIAAAEdCAYAAAABymAfAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nO3deXxU1fn48c9JGCCsYQlLAmEn7BCN\niEDdFQXBiFhFtFoX6te27qBUfoIWlTYu1Npal1qr4oZiZLOgIqAgKBAgCSFAgASGLSxBJCFkOb8/\n7kyYSSbJkMzMvTPzvF+vvEju3Ln33CG5zz3bc5TWGiGEEOErwuwCCCGEMJcEAiGECHMSCIQQIsxJ\nIBBCiDAngUAIIcKcBAIhhAhzEgiEECLMSSAQIkwopUYrpUabXQ5hPUomlAkR+pRSbYFljh+v0lof\nNbM8wlokEAgRBpRS/wA+ByKBcVrr35tcJGEhEgiEECLMSR+BEEKEOQkEQggR5iQQCEtRSu1RSl0Z\noHMlKKU2KaVOKqUeqGafGKXUV0qp40qpt5VSzyulHvLy+D8qpfr7ttTVnusdpdSsWvYJimsRgdfA\n7AIIUVdKqT3APVrrr+t4iKnAt1rrITXsMw3YobW+SikVA2wCenp5/BeAZ4Ab61g+XwulaxE+JDUC\nEc66AJm17HMlMM/x/Z3AEq11kZfHXwBcppTqULfi+VwoXYvwIQkEIuAczT/TlFJbHc0U/1FKNfaw\nX1+l1AqlVIFSKlMpNc7ltfeAeGChUuoXpdTUc3z/cuAy4FXH+3tXem9DpdQJYKDjHOnAtcDKSvv9\nVSmV6vJzilLqG6VUQ631aWADMKqaz+EJpVSOo2lqq1LqBg+f02NKqS1KqRNKqY+dn5NSKlEptdHx\n3o+BKp/fuVxLTdcBUNu1iCCntZYv+QroF7AHyAA6A62B1cAsl9euBGzATuBPQEPgcuAkkFDpOFdW\ncw5v3r8Co2mpunL2Aw65/JwPXFBpnzbACSARuA9IB1q6vP4K8FI1x78JiMV4ILsZOAV0rHR9Pzr2\naQ1kOc7REMgFHnZc5wSgxPkZ1uVaaruO2q5FvoL7S2oEwiyvaq33aq2PAc8CEyu9PgxoBszWWp/R\nWi8HFnnYrzr1fT/AEGCzy8/RGMGkgjZm6L4M/BejDX601vqEyy4nHe+rQms9T2u9X2tdrrX+GNgB\nDK202yuOfY4BCx1lGoYRAOZorUu01p8CP9XnWry4jhqvRQQ3CQTCLHtdvs/FeOp1FQvs1VqXV9ov\nzsvj1/f9UPXmeRxo7mG/NIxml2la672VXmsOFHg6uFLqN45RSwVKqQJgANC20m4HXb4vxAhusYBd\na+06GzTXB9dS03XUeC0iuEkgEGbp7PJ9PLC/0uv7gc5KqYhK+9ldfq5pWrw376/NYNxvnluAyn0J\nA4HXMJ6k7/JwjL6VjuF8XxfgTeAPQButdTRGc5nyolwHgDillOu+8bW8p8Zr8eI6oJprEcFPAoEw\ny++VUp2UUq2BJ4GPK72+DuMJeKpSyqaUuhQYC3zkss8hoHs1x/fm/bWpfPNcAlzi/EEpFYfRXHMf\ncD8w0HEe5+uNgfOBrzwcuylGIMt37PtbjBqBN34ASoEHHNc2nqpNSl5fS23X4cW1iCAngUCY5QOM\nbJi7gBzAbTKU1voMxo37WuAI8E/gN1rrbS67PQ9MdzStPFaH91fLMUyyFeC6/7vAaKVUlFKqBcbN\n9CWt9QKtdSGQgtHf4TQWWKG1rlzbQWu9FXgR46Z+CKNJZrU3ZXNc23iMIaDHMDqa59fxWlp6cR01\nXosIfpJ0TgScDyaCmUYp9RxwWGs9x4t91wF3a60z/F+ycxdK1yLqRwKBCLhgDgRChCJpGhJCiDAn\nNQIhhAhzUiMQQogwF5TZR9u2
bau7du1qdjGEECKobNiw4YjWOqby9qAMBF27dmX9+vVmF0MIIYKK\nUsrjDHRpGhJCiDAngUAIIcKcBAIhhAhzEgiEECLMWSYQKKUilVJpSqlFZpdFCCHCiWUCAfAgxgpM\nQgghAsgSgUAp1QkYA7xldlmEECLcWGUewRxgKp5XfwJAKTUZmAwQH1/bGhxChI5Jb/7A6pxjbtta\nRTXgb/FFXNyoEG691aSSiVBheiBQSl2HkQp3Q+XFMFxprd8A3gBISkqSBEki5E1PTef9tXlu22xl\nJVyX9R2/3bCAQQd3GhtHjIAuXUwooQgVVmgaGgGMc6Qm/gi4XCn1vrlFEsJclYNAm1MFPLD6Q1a/\ndhcvL36JJmdO8/rQ8QD8+NGXZhVThAjTawRa62nANABHjeAxrfVtphZKCJN9uO7s2vF3//g5U1f9\nl0Zlpazodj6PJY3ju26JNCgv484NC8lZ9A1DH7/PxNKKYGd6IBBCVFXmSA9/YV46T377Nst7JPH8\nZXeR06ZzxT4lkRFktu9Ozz1bzSqmCBFWaBqqoLVeobW+zuxyCGGm6anpALQsOsnLi15kT6uOPDBu\nqlsQcNrUMYGBh3KgpCTQxRQhxFKBQIhwN+nNH4y+Aa2Z/b+/0/ZUAQ+Mm0phwyiP+6fHJdC4pBgy\nMwNcUhFKJBAIYRGpafaKYaI3b1nGtdvX8MLFt5PRoScAI3q0plUTW8X+0VE2rr37euOHdesCXl4R\nOqSPQAiLSFmaDUCPo3uZ8c0bfN9lMG8OvaHi9bn3XlT1TVpDmzbw44/wu98FqqgixEiNQAiL2F9Q\nRMPSEv628AVON2jEo2MeRivjTzRSKc9vUgqGDpUagagXCQRCWETLKBuPrXqXAYdyePzaBzjUvG3F\naxMvrNpRDEZz0ltn2lGeuZWrnl5Eapo9UMUVIUSahoSwgNQ0O4OzfmTyT5/zXuJovuo1rOK1ET1a\nMyt5oMf3TJm3meGtu3MPmrbb0plSbDzbJSfGBazsIvhJjUAIC3h6YSZPffU6O9p05tnL7qrY3rRh\npOe+AWDmgkxKyjWbO/YCYMiBbErKNTMXyAgicW4kEAhhAeVHj9Hj2D7mD7ic07bGFdtPnSmr9j0F\nRcbcgYKoFuxu1ZHBB7a7bRfCWxIIhDBZapqdfod3AZDZrnudjrG5Y2+G7M/2ZbFEGJFAIITJUpZm\n0/9QDgCZ7Xu4vRYdZfP0FgC3OQWbOibQ4ZdjtD95xG27EN6QQCCEyfYXFDHgUA4HmrXhaNNot9dm\njutf7ftmjO2PLdIYVropNgGApIPbmTG2+vcI4YmMGhLCZLHRUfQ/tIuMDlVrAzWN/nG+lrI0m6zS\nbpRENuDBFgX0lhFD4hxJIBDCZE9c3Jnuf7KzJGFkxbYoW2SNtQGn5MS4s8FiVSK9JROpqANpGhLC\nZGPVESJ1OQe690EBcdFRPD9+4LnPBRg6FNavh7LqRxoJ4YnUCIQwW1oaALOfvZPZ9VmP+8IL4R//\ngKwsGDDAR4UT4UBqBEKYKDXNzoJ3FnO8cXNGzN1RvxQRQ4ca/0reIXGOTA8ESqnGSqkflVKblVKZ\nSqmnzS6TEIGQmmZn2vx0uuVlk9m+O/YTp5k2P73uwaBXL4iONjKRCnEOTA8EQDFwudZ6MDAEuEYp\nNayW9wgR9FKWZlNyupjeR3LJcMwfKCopq0hHfc4iIiQTqagT0wOBNvzi+NHm+NImFkmIgNhfUESv\no3k0Kitlq8tEsv0FRXU/6NChkJEBp075oIQiXJgeCACUUpFKqU3AYeArrXWVRxql1GSl1Hql1Pr8\n/PzAF1IIH4uNjmLAQeeM4u5u2+vswguNUUMbN9a3eCKMWCIQaK3LtNZDgE7AUKVUlSEPWus3tNZJ\nWuukmJiYwBdSCB+bMiqBQUd2c8rWmN2tYgFj/sCUUQl1P6h0GIs6sNTwUa11gVLqW+AaIMPs8g
jh\nT8mJcRwpOUhObA90RCRx0VFMGZVQv7UE2rWDrl2lw1icE9MDgVIqBihxBIEo4CrgLyYXSwj/Ky+n\n7c6ttL3zTnbPHuO74154Ifzwg++OJ0KeFZqGOgLfKqW2AD9h9BEsMrlMQvjfzp3wyy+QmOjb4w4d\nCnl5cPCgb48rQpbpNQKt9RbAx38JQgQBx4xivwQCgA0bYIwPaxoiZFmhRiBEeNq4EWw26O/jtNHO\n9BIZ0s0mvCOBQAizpKUZN+2GDX173Oho6NRJAoHwmgQCIcygtREIfN0s5DRggAQC4TUJBEIE2PTU\ndEb+4V04coQZexsxPTXd9ycZMMDIQlpa6vtji5AjgUCIAJqems77a/Po45hRnN6uO++vzfN9MBgw\nAIqLISfHt8cVIUkCgRAB9OG6vQD0P5RDOYpt7bq6bfeZgQONf6V5SHhBAoEQAVSmjXyKAw7lsKt1\nHIUNo9y2+0zfvqCUBALhFQkEQgRQpFIA9Du0i0yXjKPO7T4TFQU9e0ogEF6RQCBEAE28sDOtCk8Q\ndzK/Yg0C53ZfSk2zs7Jhe3YuX8uI2cvrt/KZCHkSCIQIoFnJA3mgtbH8Rmb77kQqxW3D4pmVPNBn\n50hNszNl3mY2texE1+P7OXLkBFPmbZZgIKpleooJIcLNb5sUAPDBP/8PWrf2+fFnLsikpFyzvW0X\nGuhyuh/bR1a77sxckFm/zKYiZEmNQIhAS0uDLl38EgQACopKAMiO6QJA7/xct+1CVCY1AiECbeNG\n/80odrGnVSxnIhqQcCTX7+cyS2qanacXZnK88GyQi46yMXNcf6n9nAOpEQgRSL/8Ajt2+DUQtGpi\nA6A0sgE5bTpV1Aic20NFapqdKZ9udgsCYNR8Hvp4k39mbIcoCQRCBNKmTUaeofPO89spZoztjy3S\nGI66vW0XEo7kYYtUzBjr4yynJpqems5DH2+ipKz6+Rd+mbEdoiQQCBFIzjUI/BgIkhPjSJkwmLjo\nKLbHdKHziUO8fG2PkGkqefbd74h+6a80LK29z+P9tXlMelNWa6uN6YFAKdVZKfWtUmqrUipTKfWg\n2WUSwm82bjTWFe7Y0a+nSU6MY8qoBI506QXAFx98HTLDRxv9+y0e++59JmR87dX+q3OOSc2gFqYH\nAqAUeFRr3Q8YBvxeKdXP5DIJ4XOpaXZ2LP2OFU07M+Iv3/r1xpyaZmfa/HTWNDECTqvd25k2Pz0k\ngsEVO9YBcO+P84koL/PqPT7P5RRiTA8EWusDWuuNju9PAllAaNRhhXBITbPz1Ccb6HpoD5ntu2Mv\nKPLrjTllaTZFJWXsa9mOQlsjEo7kUlRSRsrSbL+cL2AOHCDxQDabOvam2/EDjNruXbOPz3M5hRjT\nA4ErpVRXjPWL15lbEiF8a+aCTOIP7MZWXlaRWsKfN+b9BUUAaBXB9rbxJOTvAcDu2B60Fi0CYNo1\nf2B3q47ct+4zo/Md6NWuaY1vDYXakL9YJhAopZoBnwEPaa1/9vD6ZKXUeqXU+vz8/MAXUIg6Sk2z\nU1BUwoCDOwHcks3t99ONOTY6quJ7Y+SQMYRUEeQ3xC++gG7dOH/sJfx76I0MPriDkXnp3DYsnq8e\nuZQRPaqfpBf0tSE/skQgUErZMILAXK31fE/7aK3f0Fonaa2TYmJiAlvAUHPyJMydC+PGGR2Xq1eb\nXaKQ5rwB9T+8i58bNSUvukPFa643bF+aMioBZz7T7LZdiDlVQOvCE2iC+IZ46hR8/TWMG8esGwYx\nKzUF2rfn/eMrK3I1zb33omrf7q+gGwpMDwRKKQX8G8jSWr9kdnlC1qlT8MkncOONxs3/ttuMoYyR\nkXDHHcbrwi+cN6ABB3PY2q6bsU6Aw5RRCX45Z3JiHM5W8e3OVBOOWkGw3hDX/etDKC5m4qH2RkbV\nrKPw4IOwdKkxP8Mhrprg6q+gGwpMDwTACOB24HKl1CbH12
izC2V1qWl2RsxeTrcnFlekGa68bcHa\nHPj8c7jlFuPmf/PNsGYN3HsvfP895ObCRx8Zyxk+/rjZlxSyYqOjiCwvo0/+HrfU062a2Pw6tt95\nQ8xu655zKBhviKlpduzvfsKJRk35qVO/is72xcPHQbNmkJJSse+UUQlE2SLd3h9li/Rb0A0Fpuca\n0lp/D/h4VY7QND01nQ/X7a0yAsJeUMSUTzeDBkrOcOmeTVyXtYrLZqyFM0UcjWrBlwmXsKjvr+h1\n47X8+cYhZ998ySXw0EMwZw7ccANccUVgLyoMTBmVwL9fX0RUaTEZHXoCxo3J3zN9p4xKYNr8dA43\na01B42YkHMkN2hvii19uJXXHOpb3uIDSSOO2VVRSxnM/HGLM735n/P4++yx07VoRXFOWZrO/oIjY\n6CimjEoImQl1/mB6IBA185RUq7LI8jKG7d7MdVnfMWrHD0Sf/oWCxs1Y3OdXLOrzK37oMoiyCOMJ\nae1PdnYdK3JvS33uOViyBO66C9LToUULf19WWElOjKNz3GkAtrbrTlyAbkzO489ckEl22y70zs+j\nsc0KjQDnrsPWNNoU/czXPS90276/oMh4kHnlFXjxRfj73wHj2uXG7z0JBBY2PTWd99fmeXwtoryM\nC/dmct22VVyTvYY2RT9zsmEUy3oNY2Hfi1nddQglkZ6TjK3OOUbiM8uYMdaRoTEqCv77XxgxAh55\nBN56y5+XFZbOP54LUVEse30yNAjsn11xaTnbY7pw/daVHD91hmnzjVm2wXSjvCFvA2ciGrCy+/lu\n22Ojo6BTJ5g0Cf79b5gxA9q2NamUwUsCgQWlptl58vN0Tp1xnzWpdDnn27O4Lus7Rmevpt2p45yy\nNeabnkNZ1OdXrOx+PsUNGnp1juOFJTz88SbW5x4zRlwMGwZTp8Ls2TB+PIyWbhqf2rgRBg0KeBBw\nTizLbtuFFsWn6HDyKAdbtCVlaXZQBYKxeev5sesgfmnUpGKbWzPX1Knwzjvw6qswc6YpZQxmEggs\nxplatyKrotYMPrCdsVmrGJ29mtiTRzjdoCHLuyexqO/FLO+RxGlbYyPbpAbKvZ9BqYG5a/NI6tLa\nuCnMnGlM2LnnHmPRcz8tnBJuUjfs5cq16/m83yX8a/bygLZXO0cIORepSTiSy8EWbYNr5NC2bTTP\n3cWq0X+o2NSqie1sjRagb19jOPSrr8KUKdC05sllwp0EAgtITbMz9dPNnHFJqdvi9C/c++PnXL91\nBfEnDjmqxecx+9I7+abHUE65PBk525zhbAdZY1sERSXltZ7bOa48OTEOGjUymoguvBAeeADef9/n\n1xpuUtPs/PPtr0guPkVmu7OpJSAwTTOx0VHYC4rY7jJyaGX384Nq5FDGa+8xAFjY5Wyz0GlPv9tT\np8LIkfD22/DHPwaugCFAAoHJUtPsPPzxpoox3xHlZfx6y1dMWfUu0ad/4buuibwyYiLLeg3j58bN\n3N47okfrKhNoXG8uNfUxuHJ7OjzvPJg+3agdTJwIY8bU9dIERpAdZN8BUDFiyJlaIhCBwDly6ATN\nOdisdVCOHCr/4gvS2/fgQIuzE0k9foYjRhhfL74I990HttBaiMefgnMIQQhJWZpdEQQS7dtIfe9R\nZi99lZw2nRh7xxzu/PXTfDrwSrcg0KqJjTk3D6lxFiXArOSB7Jk9hjk3DyE6qvo/iipPh3/6E3Tu\nDP/8Z10vSzjsLyhiwKEcSiIiK57KndsDITkxjufHDzTWJmjbhQHH9/L8+IHB0z9w+DADcrdWGS0E\n1XyGU6ca82PmzQtA4UKH1AhMkppmJ2VpNvaCImJ+Oc7jK99hQsY3HGrWmgfGPsaCvpe4zUAFuG1Y\nfMVU+nPhHEo3PTWduWvzcO1F8Ph0aLMZM4//+lc4eBA6dEDUTWx0FP0P7WJH23jONLC5bQ+UiqGU\n+ZfBa6/RZ1AQ/X8uWk
QEmq96DavyksfP8LrroF8/43d34sQqf0PCM6kRmGB6ajoPf7wJe0ER12Wt\n4ps3f8e4rSv514U3cvk9/2JBv0vdfoGjo4waQF2CgKtZyQN5+eYhxEVHoTD6Fqp9Orz9digrgw8/\nrNc5w92Uq3sz8NBOtxnFpjXNDBgARUWwe3fgz11HB977BHuLGCM1hwtbhPL8GUZEGJ3FmzfDsmUB\nKmXwkxpBgKWm2Zm7No/IslKeWPEf7ln/BRti+zBl9EPsatPJbV9PfQD15fVEm7594YIL4L334OGH\nfVqGcJLcXkHhCexd+6DA3FmuAwYY/2ZkQM+egT//uSospNXqFXw08KoqT/bNGjeo/jO89Vajn+sv\nf4FRowJQ0OAnNYIASk2z8+gnm2lz6jhzP57OPeu/4J3zruOWW58PSBA4Z7ffbiSmy8gwtxzBzLFG\n8cOP38Lu2WNY/cTl5rXP93Ms/JeZac75z9U339C4pNhj/0BBDTPtadjQeHj59lv46Sc/FjB0SCAI\nEOfSgYP3bWXROw8y6MAOHrruUWZedV/FDOC46Cj2zB7DntljzA8CYCSra9DAqBWIutm40XiaHTzY\n7JIYY+tjY2HnTrNL4pWcDz7nlK0x6+IHVHmt1j6We++Fli2NvgJRK2ka8rOKTuHjhdy26Uue+voN\nDrRoy/jbZ5LVrnvFfgr/pSSus5gYuPZaYz7Bc88ZKavFuUlLg969jQyZVtCzZ9AEghM/buTntl2q\npErx6m+lRQu4/35jpvyOHdCrV8XfoiSiq0pqBH7k7BQ+kl/AC0vmMGvZP/m+6xDG3jGnShCYNCze\nMr+Urumsn2yeCPv3w/LlZhcrOG3cCImJZpfirCAKBPEH91TMiHal8XIy3gMPGM1EL75YUSO3FxSh\nwe9rRgcbCQR+ctVLK3h/bR5xBQeZ//4UxmcsZ86Iidw94Sm3OQGRSvGyD0YE+UrlP5hPOw7hRONm\n5P3tdbOLFnyOHoW8PGOSnlX07GkMCbb6QkSHD9O28ITb3Aun6haeqaJDB2PRpXfe4d+f/kBRiXvu\nrqKSMqbN3+KL0gY9CQR+cNVLK9hx+BSX7NrAov8+RKcTh7hrwlPMGTkJrc5+5FG2SF789WDL1ATg\nbJIyp+IGDVmcMJKYZYvgl19MLFkQcnQUW65GAMZiRFbmGKCwp4P7sNFzHnr72GNw5gzXLP/E48tF\nJeVMT02vczFDhQQCH5uems7OQyf54+oP+c+8mRxo3paxd8xhRY8L3ParcQy/iTzN1vxswOVElRSz\n4eV/m1CiIGblQGD15iFHILjpjlHezXupTq9eMH48v9m0hKbFhR53+XDdXh8UOLhZIhAopd5WSh1W\nSgX1OMXUNDsLVmzljfmzePT7uaT2v5Qbbn+BvFYdK/ZRwJybh5g7jLAGnkZjbIjrS250B8rffdeE\nEgWxjRshPh7atDG7JGf1cExss3ogyMzkTMtoZm04Xv/O3ccfp/npU0zc/D+PL1de8S8cWSIQAO8A\n15hdiPpITbPz+qtf8MW7D3Pprg08deXveGTMI5y2NXbbz0qdwp54rHYrxef9L+P8nWmwb1/gCxWs\n0tKs1T8AxmiamBjLB4Kj6zayuWUn7CdO179z94IL4LLLuHv9F9jKqs4/iJQ0FNYIBFrrVcAxs8tR\nV5Pe/IHlM/7GZ+89QpOSYm6Z+Dzvnj+2ymzIXu2aWqZTuDrJiXG0alI1Qd38/pcTgeYf9z4jIy28\ncfIkbN9uvUAA1h85pDWNtmWxrU2822ZnxtE6mTqVjiePcv3WlVVemnhh57odM4RYIhB4Qyk1WSm1\nXim1Pj8/3+ziAEYtYPCTi7ji9dm8sjCFjPY9uO6Ov7GhU78q+/Zq15SvHrk08IWsgxlj+xNlc58z\nkNeqI+vj+nLl+qVM+2yLBIPabN4MWlurf8DJ6oHAbqdZ8SmyPYwYqnPW1lGjYNAgpm5Z
QANH2sVI\npeqcyDHUBE0g0Fq/obVO0lonxcTE1P4GP0tNs/Piuyt5490nuGvDAt4+fxy33vIc+c1aue2nMLKG\nBksQAPfUxa7mD7ichCN5dLfvqPuTWZjY9OVqAEYuzmfE7OXWCpw9expNfKdPm10Szxwdxds9zCGo\nc9ZWpWDqVNrt28XOEZo9s8eQ8/xoCQIOQRMIrCQ1zc67L37Ip2/9kUEHd/LA2Md45srJlEa6T9S2\n2hyBc5GcGMfqJy7HtXFrUZ9fURzZgPEZy4NrqcMAS02zs+XrHyi0NcLeoq31Ji/17GnUVqyahdQR\nCPbWd+hoZTffDF26GMnohBsJBOcodeM+0qb+mY8/fIKiBo244fYXjLTRHlhtjkBduD6B/dy4GSu6\nJzF62/fEtWhkYqmsLWVpNl0P55HTulPFvJF6tW/7mtWHkGZmQocOPH7biPoNHa2sQQN49FFYvZpV\n/0mtmD1vuRqbCSwRCJRSHwI/AAlKqX1KqbvNLpNHhYU0uOtOnv76dVZ2O49xd7zMtkp50p1G9Ggd\n9EEAjFFErv0Fi/uMpOMvR3ku1uIzU020v6CIHkf3sbNN5yrbLcHqQ0gzMmDAgIpaqU+ztt51F8XR\nrSiZ/RdJN+HCEoFAaz1Ra91Ra23TWnfSWltv5tKuXTB8OKM3L+elkZO498b/V2UNYTjbJ2CJ7KE+\n4NpfoICs8y6mrGEjLt5SdfSFMPSIgriT+VUCgWUWjG/dGqKjrRkIysth61bo398/x2/alPfPH8sV\n29fS88jZ9bwtVWMzgWQf9caXX8KkSaA1d02YwYoeSR53Uwp2Px96i71XWcwmbRR8+im88IKxIpRw\n82RP4zNxDQSWWjBeKeuOHNqzBwoLzy6i4wf/6DeKW1d+zOQf5zN19EMV2y1TYzOB/BXXpLwcnnkG\nxowxZohu2MCmAdU/6U+6ML7a10LKhAmwd68s+lGNy/RRAE526+m79m1fs2ogcC6C5MdAEBXbgY8H\nXUVy5go6/HykYrtlamwmkEBQnYICuP56mDGDz/tdRp8r/h893txG/9jm2CKqzkQc0aN1UI4OqpNx\n44wF7ufNM7sk1pSVBZGRfDB7kvmrknmQmmbnnUMNKN29h4ufXWattnFnIOhXdS6Or0wZlcB7wycQ\nocv57YYFgMVqbCaQQODJli2QlETpki/5f1fdx8NjHua0rTFlWrM65xhDu7VyG80w5+YhIdMn4I3U\nXb+wuvt57HvzPUY8/421biRWkJVlPHE3bGh2SapwphlPj4qhgS5H5eVaq6M0I8Oofbdo4bdTJCfG\n8ce7rmT5oEu5ddOXJDQqs16NLcCkj6CyDz6Ae+7hqK0Jkyc+73GW8Npdx8l5frQJhTNfapqdKfM2\nM67HcF7MXkebbVuYcrIY8EQ09AEAACAASURBVHKxkHCQlQV9+5pdCo+cacb3OBIhdj1+gNxWsaQs\nzbbG/19mpl+bhZySE+P49onHaT5xFFes+IyUqGYV28OR1AicSkrgwQdh0iT2dOvLNbfP8RgEILyz\nFc5ckElJuearXhdyJqIBo7d9T0m5ZuaCIFkQ3d9KSoy2d4sGAmeHaF60EQi6HN/vtt1UJSWwbVtA\nAkFqmp3J6aWs6prIbzcs4MiRE0z5dLN1akYBJoEAjBWbrrgCXnmFNddN4srRT1VJFeEqnLMVFhQZ\n2Rt/btyM1V0HMzp7NWhdsT3cfb1gNZSW8siWYktOVHJ2iOY3jeaUrTFdjx9w226qnTvhzBn/DR11\n8fTCTErKNK8Nm0DMqQLGZyynpEzz9MLwfKCRQLB6tZEhcsMGPnlkNrf2n1glVURlkq3QsCRhBPEn\nDjHgkMVXuwqQ1DQ7Cz76BoAdbTpbcqJSxQRBpcht1ZEuBQes01Ga6bgJB6BGcLzQeHD5IX4Qmzv0\nYvKPnxFRXlaxPdyEbyDQGv7+d7j0UmjShL8/9y5T
bbX/AgZDKml/ck1RvazXRZRERDI6+3uPqavD\nTcrSbDodNiYp5bTpBFhvopLrBMHc6I70/PmgdTpKMzKMOQ6BbFZTitcvvJFuxw9w9Y61gTuvxYRn\nICgshN/8Bh54AK65hllP/5cXDzSu9W3BlEraX2aM7Y8t0mgaOxHVnDVdBjN622p0ubbUk68Z9hcU\n0fPoXuzNYyhsGOW23UqcqRuuveFXdCk4SPKgDmYXyZCRYaS/iPJ/M1V01NkHl//1vojdrTpy37pP\niW4cnuNnwi8Q5OTARRfB3LnwzDOkzvwnb6UX1Pq2YEsl7S/JiXGkTBhcUQNYkjCCrgUHiMvNtlwz\nSKC1jLLR8+jeitqA63ZL6tnT6KC1yqpzARoxBDBzXP+K+UDlEZG8OXQ8Qw7s4NXYEwE5v9WEVyBY\nvBiSkoxZsYsXM31gMg/N21Lr22TxCnfJiXE0aWg8OS3rNYxSFcG12ast1wwSaBG6nB5H91UJBJYd\nW2ClLKSnT8OOHQELBMmJcaTcNLhiPtAPI8ZwunVbRn7+n4Cc32rCqx60di107Qqffcb0Lad4f21e\nrW+RIOCZs7njeJOW/BA/iNHbvueFX91uuWaQQIo6fICmJaerJJsrsGoHpGsW0iuuMLcs2dlQVhaQ\nEUNOVXJoRT4MTz5prC43eHDAyuFqemo6H67b6zZEPS46iimjEvzajxNeNYKZM2HNGlJPNGKuBIF6\ncR1uuKTPSLof30+f/D3WGIZokqGnDwNYN+toZXFx0KiRNWoEARwxVK3/+z9o1gz++ldTTj89NZ33\n1+ZVmadkLyji4Y83MT013W/nDq9AEBkJUVGkLM2mtilhUbYICQI1cF2nYFmvYZSpCK7fscYawxBN\ncldrY+lHy2YdrSwiwqgVWCEQZGQYC8f07m1eGVq1gsmT4eOPjSyofjY9NZ0e05bQ9YnF9Ji2pMaH\nUw3MXZvntz648AoEDrU1X9giFM+PHxSg0gQn12GIx5pGk9Z9MLft+5HkIbFmF800A0/u50zLaBrH\ndrBu1tHKrJKFNCMDEhLMz8/08MNGp85LL/n1NJWf/su0rvXhVAOPfuKf2c+W6CNQSl0D/A2IBN7S\nWs/2x3lS0+y11gaa2CJ4bvwga//xWoRbG2uXPLj//oCO/LCcrCwaDujP6mkmt7efi5494auvjHk1\nZvZqZ2TABReYd36nTp2MtUfeegueegratvXLaT5ct7dO7yvTmmnzjSYiX96jaq0RKKW+Ukr5redE\nKRUJ/AO4FugHTFRK+TwHrTPror2a2oBzZbGtf75WgkBdjB9vNDWEc2rqbdssm2OoWj17QlERHDhg\nXhlOnYLdu63zADF1qvGZ/OMffjl8apq9XvnK/DE6z5umoceBOUqp/yilOvr07IahwE6t9S6t9Rng\nI+B6X5/EmXXRk7joKF6+eYj0CdRH+/Zw8cXGymXh6OhRyM+HPn3MLsm5scL6xVu3Gv9aJRD06wdj\nx8KcObBpk08P7XwgrYk3ucx8PTqv1kCgtd6otb4MWAT8Tyk1Qynly2EQcYBrPWmfY5sbpdRkpdR6\npdT6/Pz8cz5JdR+cAsstHBK0Jkww/qidf9jhJCvL+DcYawRgbiBwLkYTwKGjtXrhBWME0a9+BcuW\n+eywNT2QgtEqkfP8aPbMHsOcm4dUGxR8PRLNq85ipZQCsoHXgD8CO5RSt/u0JLXQWr+htU7SWifF\nxMSc8/ur++AsO7QvCH3Z6yLKleKl3z1nycybfhWsgSA+3hitY2YgyMw0hrE6aydW0Ls3/PADdO9u\nLFX7zjs+OWxNT/KVh6snJ8bx4q8HV4zOc/LHSDRv+ghWA3bgZYwn9TuBS4GhSqk3fFAGO+A68LqT\nY5tPuQ53dLL00L4gk5pm55HvDvNTXD9Gb/s+IGOfLSUry8iR06WL2SU5Nw0aQLdu5tcI+vY1hndb\nSadOsGoVXHIJ
/Pa38Oc/G53q9VDdg2dcdJTHpmnX0Xn+HInmzaihycBWrat8An9USmX5oAw/Ab2U\nUt0wAsAtwK0+OK4b5weXsjSb/QVFxAZgtl44cVZ5l/QZydNfv06Po3vJadOZuWvzSOrSOvQ/56ws\nY/hjRPCMyHaOoptV1oLY1WlkpdnN+X/KyIDLLgv8eb3RsiUsWQL33GOMIsrLg9deMwJoHUwZlcC0\n+eluzUO1PZBWmQHtB7Vejda6ppUaxtS3AFrrUqXUH4ClGMNH367lnHUWiA80XDmrvP/rfRFPf/06\n12av5tXht6DBOssg+lNWFgwfbnYpvObstDSWrYwlKX0ryZ8ZebcC+n9VUAB2u7X6Bypr2BD++1+j\nGe3ZZ40kfU89BRdeeM6B36oPpPWaR6C13uWLQmitlwBLfHEsYY7Y6CjsBUUcat6Wn+L6MWbb97w6\n/BbAemmYfe7UKcjNhbvvNrskXnPttMyN7kjzM0VE/Xw88EHbaiOGqqMUzJoFnTsb6ev/9z9o184Y\nXTRuHFx5JTRp4tWhrPhAGjz1WGFpU0Yl4BzfsKTPCPrm76HbMaOrJ9Q75L/94jsA7t9QGDSd5K7B\nObdiIfv9gQ/azhxDVq4RuPrd74ylbT/4AC6/3Jg3c/310KaNERDeest4PchIIBA+kZwYx6Rh8Sjg\ny94jALg2ezUKuKzPuY/yChapaXYWz/sWgJ1tOllyeUpPXINzbsVC9gcCH7QzMownaYt0sqem2Rkx\nezndnlhcfVBv1QomToQPPzTmjnz1Fdx7r5G19N57ITbWWPPk+eeNQFfPDuZAkEAgfGZW8kAmDYvn\nUIu2bIjtw5ht36OBzzbYLX9jrKuUpdnEH8qlTEWwp5VR3Q+GdRlcR9Hta9meMhVBz58PBn4UXWam\nMYHLAp3srtkHNHgX1Bs2NJqFXnnFSFS3eTM884yRUvtPfzKavHr2NHIYrVgBpaUBuppzY/6nL0LK\nt9vy0Rgrl/U/vIsux/cHxY2xrpzLU+ZGd+BMA5vbditzHZZY0sDGoeh2jGt6KvBt1xbKTeVpstc5\n/e4qBYMGwfTp8OOPRif4668bs81fe80YGdWuHdx2G3zyCZywzmpoEgiETzlvgF/2MZqHRmevdtse\naqKb2Oh5dB85wbIGgQvn2sW7Z48hNmkgnY8EuNZ29KjRnm6R/oHqfkfr/LsbG2uktV68GI4cgfnz\njf6EpUvh5pshJgauvhpefdUYbGAiCQTCp5w3wP0t2rGpY29Gb/vebXsoSU2zU1RYTNfj+93WILBF\nquCbqJiQYKwSFsj2bIt1FPs6+4DbegPPrmK66gn/+Y8R/L77Dh580JiX8Mc/GisnJibCjBmwYUPA\n+xUkEAifcm17XpwwkoGHcuj1y+HguzF6IWVpNrFH99OwvNQtEDRt2MBywwNrlZAAJ08GdsSLxQKB\nL7MPeFpv4P21ecZM+8hIGDkSUlKMjLXbthnfN2tmDFFNSjKGqd5/vzFMtbjYJ9dXEwkEwqdc256d\nzUMvR+4MvhujF5z9A2CMGHI6UWTRNYprkuC42WUHsC8nMxOaNzduehbgy3QO1a034HF7QgI89phR\nSzh40MhrNHQovPsuXHutsSbChAnGz0eOnHNZvGGJhWlEaHGdMHN81StEzP+MblHDLTOL0ldio6Mq\nAoFrH0FQNoO5BoJLLw3MOTMyjNqAmQviVOKryV7VrTdQ6zoEMTFwxx3G1+nT8O23sGCB8fXZZ8bo\nqs8/N+Ys+JDUCITfpKbZ+XeH8+m3L5u4E4eCZoy9t6aMSqDv0Tz2tYjhl0bGrNKgTWTYqZORNC/Q\nNQKLNAv5Uk2/396sNVChcWOjRvDaa7B3L/z0Ezz5pJHawsckEAi/SVmazRe9jPw71zhGD4XSUNLk\nxDguLj5AbsfuwbNGcXUiIozUy4EKBIcPG80cFhk66olXk8s8eHph9anSJl5Yx2
awiAij7+CZZ4xF\noHxMmoaE3+wvKEJHdyC9fQ/GbFvNW0PHV2wPCcXFtNyTw4gpU9j9XL3zL5ovIcEYsRIIFusorsw1\nKR+cnVwGNSflS02zc7yw+j4iq66CKDUC4TfOtvIvE0aQeCCb2J8PA8bY+5CwbZsxU3TQILNL4hsJ\nCcbawQEYpWLJVclc1HVyWU2vx1m470gCgfCbKaMSsEUqliQ4cw+tAYxRNSHRT7DFSNscMoGgd28o\nL4ecHP+fKzMToqOhoz+WQa+/uk4uq+l1K/cdSSAQfpOcGIctQrGndRxb23XjWkc/QbmuuR01aKSn\nG7lmevc2uyS+EcghpM7UEhYaMeSqrpPLqns9Ospm6b4jCQTCrwpLygEj91CSPYv2J41x0DW1owaN\nLVuMpo06rlZlOYEKBFpbfsSQp8llYPQV1LT8anWT0maOs+61ggQCESBLEkYCZ5uHQsKWLaHTLATQ\nogV06OD/QHDgABw/bulAkJwYx43ne36Cr5gh7MI5wujhjzfR2BZBdJQtqEaSmRoIlFI3KaUylVLl\nSqkkM8si/CM6yugY3tWmE9vadqloHnJuD1r5+cYNbaA1R4HUmTPnkD85RwxZeOgoGJl0q/PBuryK\n71PT7Ez5dHNF+urjhSWcOlPKyzcPYfUTl1s+CID5NYIMYDywyuRyCD+ZOa4/tgijHXhJn5FcsG8r\nsYXHLV9VrlW644kwlGoEENhAYOEaAdTc8VvuMkH4T/O3UFLmPmO4pEwHVT+YqYFAa52ltQ6N2UXC\no+TEOFJuGmzkHkoYQQSa15rkBsVTUo1CbMSQs2lj1o4yOHaMJcurbwevt8xMI39Ou3b+O4cP1NYx\nnJpmLLjk7AerLJj6wcyuEXhNKTVZKbVeKbU+P7/6KpuwHmfe+9//fhy723Wh6IOPgmZt32pt2WLc\nyPwwyzPQXFfmynEkz3v/3WX++/9x5hiyuNqGez76yWYe+nhTgErjX34PBEqpr5VSGR6+rj+X42it\n39BaJ2mtk2JiQncN3FDlvNks6DmcoXszKbbvD+68QyHUUew6eWpXa6OmFnd4r39SgWgNW7davn8A\njAeYJrbqb5G1JZALpn4wvwcCrfWVWusBHr6+8Pe5hXU4bzZL+hjNQ6O2/xC8eYfKyozmjRAJBK5t\n4ftatudMRAO6H7P7JxXIvn3w889sahFXpzw+gfbc+Lr/HwdTP1jQNA2J4Oa8qWS37UJO606Mzv7e\nbXtQ2bnTSBEcIoHAtS28LCKS3FYd6X5sn3/SaTtSS7y4r8G5LRJvkuTEOG4bFs+5THtTwG3D4oOq\nH8zs4aM3KKX2ARcBi5VSS80sj/CfipuKMlJODMvLoHXhieDMOxRiHcWVJ0Htah1Hj+N2/6REcIwY\nSm/VyW2zlWuHs5IH8vLNQyoWrKkplXRcdBQv3zzEssnlqmP2qKHPtdadtNaNtNbttdajzCyP8B9n\n3iGAJX1GEKnLGbX9B345XWrJJ8EabdliLDfYt6/ZJfGJyitzHY7tSveCgyQP9ENHeGYmh5u2oiCq\nRZWXrFw7dA542D17DC/+erDH2cNzgmjeQGXSNCQCIjkxjqYNjVQMWTHd2N2qI9dmr6akXFv2SbBa\nW7YY+YUaNza7JD7jvNG9fPMQ8tp2JqK0hF8/8aHvg3RmJrkdunl8KVhWdvPlkpZWESJJUkQwqFjL\nVymWJIzkd+s+o1XhCfabW6xzt2WLX1aJMptzZFffxsb4/qa5OUybb4zQ88lNrrwcMjNpff0tRNki\n3dI8B9vKbr5a0tIqpEYgAsb1iW9Jwgga6HKu3rE2aJ4EAfj5Z9izJ2T6B1w5R3btbh0LQPej+3zb\ndp+bC4WF9LhsWMg9UQc7CQQiYFw7JTPb9yCvZXvG7PghqJ4EKxZUCcFA4GyjP96kJccbN6fHMbvb\n9npzdBSvsrUjZWk2+wuKiI2OYsqoBAkCJp
NAIALGrW1VKVYNvpSRuZtI7trE7KJ5L8RGDLlyrZnt\nah1H92P7qmyvF0cgeCyzJCiGjoYTCQQioFxHX9z2wqNElJbCF0E0t3DLFmjZEjrXcRFyC3Otse1q\n3Ynux+y+bbvPyOBwi7YcjnQPLFYeOhouJBAI8yQlQXw8fPqp2SXx3pYtRuppi66sVR+uNbZdbeJo\nd+o4KVd39V2zTWYmWW3iPb5k5aGj4UACgTBN6qb9fBg/lDNfLuXqmQss3zyQunEfp9an8d6plpZO\ni1Afzhrb4w8aqcCua3zSNwcuKYGsLOydenh8OagGDIQgCQTCFM6hivO6XkjD8lL6bVhl6bbi1DQ7\nc/79FU2LC8lq1xV7QRFTPt1s2fLWm6+Xrdy0CU6fpud1V3icjBVUAwZCkAQCYQrnUMW02AT2N2/L\n6OzVlm4rfnphJj0P7gZgW0xXIPgWHzknPXpARITvAsEaY4nSobeOkaGjFiQTyoQpnG3CWkXwv97D\nmbTpS5oVF7K/wOSCVeN4YQkJ+XsAI3Ge6/aQ1KgRdOvm20AQHw+dOpHcyUcT1ITPSI1AmMJtclmf\nETQqK+HynJ8s3VbcN38PudEdONUoiIa71kdCAmzf7ptjrVnDvr6JQZF6OhxJIBCmcB2quCGuL4ea\ntea6HWss21YcHWWjz+HdFc1CrttDljMQlHteitFre/fCvn28Q0eZP2BREgiEKVyHKqIi+G7gxVyx\nez3JvVqaXTSPnrm6O92O73cLBLYIFVSLj5yzhAQoKjIWk6kPR//A2g593DZbuU8o3EggEKZxDlWc\nNCyez7oNI7K4mD/e9izTU/24cHodjWt4gkhdzuGuvSs6OVNuGhzabd2+Gjm0Zg2FtkZValMg8wes\nQjqLhammp6bz/to8IuL6kt8kmlHbvucPa0cCWGtxD0dqiWdnTOLZ3r1NLkyAuAaCq66q+3HWrGFb\n5z6URla93Vi5TyicmL1CWYpSaptSaotS6nOlVLSZ5RGB9+G6vQCUR0SytPdFXL7rJxqXnK7Ybhlb\ntkBUlDGsMgykptkZ8c5WTjaM4tMPvql7W/6pU5CWRtNLL65YmMjJFqks2ycUbsxuGvoKGKC1HgRs\nB6aZXB4RYGVaV3y/JGEETUqKuWTXRrftlrBpEwwYYKxMFuJS0+xMmbcZ+4nT7GrdiXYHcpkyr46T\n59avh7Iyjg1Ogsr/pRb7Lw5nZi9VuUxrXer4cS3Qqab9RehxXf91XfxAjka1YHT26hrXhQ24sjL4\n6aeQXIzGk5kLMikpN+7Su1rH0ftILiVl5cxcUIfJc46O4qePtKg4plNQrk4XosyuEbi6C/iyuheV\nUpOVUuuVUuvz8/MDWCzhTxMvPJvFs8zRPHRFzo/cPiTGxFJVkplpNHGESSAoKDo7Se6nzv3p8Msx\nehzd57bda2vWQN++ZJ/xPMxWOoutwe+BQCn1tVIqw8PX9S77PAmUAnOrO47W+g2tdZLWOikmxkI3\nCVEvs5IHctuw+IoawNI+I2l2poiZjS20gOXatca/w4aZWw4TrOx2PgCX7t5w7m/W2ggEw4fTspr5\nFtJZbA1+HzWktb6ypteVUncC1wFXaG21hmERCLOSB54dIVRyNXz1MsybB9dfX/MbA2XtWmjTJmw6\nils1sVWkzrC3bMeONp25ZNcG5l9y07kdKDsbjh1jY1xfTp0prfKyLUI6i63C7FFD1wBTgXFa60Iz\nyyIswmaD5GRYuBCKi80uDQCHvl7F8ujudJ22hB7TllhynoMvzRjb322Ez4ru5zNsbzrPXNnt3A7k\n6B9IOdmGkrKqz3jNGjcI7XkYQcTsPoJXgebAV0qpTUqpf5lcHmEFEyYYi8R//bXZJWHW3NW035vD\nxljjybVMa95fmxfSwSA5MY6UCYMr0mes7HY+DctKWfaPj85t5NCaNdC6NetsbTy+XBCqCfuCkNmj\nhnpqrT
trrYc4vu4zszzCIq64wlgOct48s0vC9oUrAEiLdU+PYLl5Dn7gbM75sfMACm2NSNq27tyG\nka5ZAxddRMdWTT2+LP0D1mF2jUCIqho2NPoHvvgCzpwxtSiD7VmUo9jSsZfbdsvNc/CxlKXZFc05\nZxrYWBM/iEt3baCkrNy7IZ/HjkFWFgwf7pZg0EkWo7EWCQTCmm66CQoKYPlyU4tx3oFsdrbpzMlG\n7k+1lprn4AeVh3Wu6J5El4KDdDu+37shn86RVsOHA9CowdlbTasmNlmMxmIkEAhruuoqaN7c3IXt\ntWbo4Z2kxVZ9cnWd/xCKKjfbrOxuDCO9ZNcG75p01qyByEgWNoxj2vx0tzkIp0vqmdZa+JwEAmFN\njRrBuHHw+efGwudmyMmh6ckCGo28qKIGEKkUtw2Lt1ZCPD+YMirBbeTQ3ugO5LSO47LdG7xr0lmz\nBoYMYfaqvRSVlLm9JOmnrUeyjwrrmjAB5s6FFSvql/2yrhzNG8n/N4HkgaF946/M2Wzz9MLMijkF\nK7udz62b/8ey07U0DZWWwrp1cPfd1TYjyYxia5EagbCuUaOgWTPzmofWrjXO36+fOec3WXJinNuc\ngpXdz6dx6RlS//ZBzSOHtmyBwkIYPrzaZiQZMWQtEgiEdUVFwXXXGc1DpVVnpvrd2rUwdGhYZByt\nztMLMytGD63tPIDTDRryq53reXphDQnoHBPJGDFCRgwFCQkEwtomTID8fPjuu8Cet7AQNm8Om0Rz\n1TnuMumr2NaIH+IHcsmuDW7bq1izBjp1gs6d3ZYkda7sJiOGrEf6CIS1XXstNGliNA9ddlngzrtx\no1ELCcNEczVZ2e18Zu56g/jjB6rfyZFozik5MU5u/BYnNQJhbU2awOjR8NlnxroAgeIcBx/mNYLo\nSllDVziGkV67b5PnN2zYALm5boFAWJ8EAmF9N90Ehw7B6tWBO+e6ddCtG7RvH7hzWtDMcf2xRZwd\nRrqndRy50R357c/bqu68cqWRHiQuDm68MYClFPUlgUBYWmqanSuzmnK6QUPmPTGn7mvnnuM5D321\nki8axzNi9vKAnNOqkhPjSLlpsFsbf9nVV9Nh4w9w+vTZHT//3Bjl1bHj2T4CETSkj0BYVmqanWnz\n0ykqMZokfpW+kss/2wzgtzbn1DQ7c95ZTvKJfNKSErAXFDFtfrpfz2l1Vdr4FxfBJ/81OvCvugpe\nfx3uvx8uuAAWLzbWbhBBRWoEwrJSlmZXzEpdkjCCDr8co8+erX6dlZqyNJs+eVkAFaklZCZsJZde\nasz8/vJLeOYZuO8+uOYa+OYbCQJBSmoEwrLsLrNPl/cYSnGkjTHZ3/PnTn39es7f7M+mOLIBWe26\neyxLOEtNs5OyNJvZHfsx7JW/YysrhTvugDffNBYVctlnf0ERsdFRTBmVELa1qWAhNQJhWa4ZPn9p\n1IRveyQxbutKGpX7b/RQpFIk7t9GZvsenGlgc9se7pxNdfaCIr7qORRbWSlvXTSB1AdmuQUB5z4a\nKprWwrmfJRiYvVTln5VSWxyrky1TSsWaWR5hLZVz/n8y8CpiCgu4ZOePfjunKi1h4MGcKgvRhPr6\nA95wbap777wxXPPbvzPr4jtJWba9Yp+ZCzIlyVwQMrtGkKK1HqS1HgIsAp4yuTzCQuI8pEI+2Kw1\nd2z13xKWvzp9gKjS4iqppyuXJRy5JorTKoJt7bq5bU9Ns7ulm67uvcJ6zF6q8meXH5sC8tglKlTO\nU1MWEUnq4KsZvuMnsPunqeGhZscA96UpJTeOobYEcjU99UuSOWszu0aAUupZpdReYBJSIxAuKuep\nadXExsLzr0GVl/P6XU/5pd15sH0bp9u2g/h4yY1TiacEcgq4rE8MUPNTvwRSa1Paz22fSqmvgQ4e\nXnpSa/2Fy37TgMZa6xnVHGcyMBkgPj7+/NzcXH8UV1jU2TkFZcz96E90
LjjENX94m+duHOy7m7TW\n0LMnDBpkTJASVUxPTWfu2jy3qrsCJg2L59tt+R5HV7VqYiPtqasDVkZRPaXUBq11UuXtfq8RaK2v\n1FoP8PD1RaVd5wLVzkvXWr+htU7SWifFxMT4t9DCclw7Kj8edDXxJw4xZGeabzshFy+GXbvg+ut9\nd8wQ8+22/Crttxp4f22exyAQZYtkxtj+ASmbqDuzRw31cvnxesBDAhMh3MfxL+09nILGzbh5y1e+\nG9+vNcycCd27w6RJvjlmCDqXTl9pVgseZk8om62USgDKgVzgPpPLIywqUqmKIZzFDRryef/LuHXT\nl7Q57aNfmcWLjcyZb79dMSZeVBUbHeVV8I2LjmL1E5cHoETCF8weNXSjo5lokNZ6rNZaZp0Ij6rM\nKRh0FY3KShmb8W39O41dawO33Va/Y4W4KaMS8GZqnQwXDS6mjxoSwhuVx/FntevO5g69uHnLMqZ8\nsql+wcBZG5g+XWoDtUhOjGPSsPha95PhosFFAoEICp6GLn48+Gr65u+h7/7tzFxQwxq6NdEa+0OP\nkxvdgZ5bW9Nj2hKmp6b7oMSha1byQG6rIRhEKBkuGmwkEIig4JxT4Gph34spatCIWzYvq3ZGa23e\nm/4qcTlbefWimymNjf6BDQAACEVJREFUbECZ1ry/Nk+CQS1mJQ9kzs1DaNrQPTg3sUXw0q+HSAdx\nkPH7PAJ/SEpK0uvXrze7GMIEXZ9Y7PbzC4tfZtT2NQz9/Xs8f/uwc7sBaU16x160KD7FFff8i9LI\ns2MnIpUi5/nRviq2EJZg2jwCIXypVRP3NvyPB11F8zNFjMn+nqcXnlvz0Nq//ZeBh3IqagOuJMmc\nCCcSCERQqTw56adO/clpHcektC8pOFXsfaex1kSnPEdudAc+739ZlZcl7bQIJxIIRFBJToxzrxUo\nxZsX3EDigWweXP2h953GixbRZ/8Oj7UBgIkXdvZRiYWwPgkEIuhUrhV8NHgU8wZcyUOrP2R4mhfz\nCsrLK0YKeaoNNG0YyazkgR7eKERokkAggk5yYhzRUe61gidH/Z4NsX14cfHLfPRG5TRWLk6dYv+V\nY4jblcXfRkysUhtQwLM3SBAQ4UUCgQhKM8e51wrONLBx3w1PcjyqOS/OnQmHDlV90/79cMkltF+x\njJlXTGZ+/6opEDTI0EcRdiQQiKDk6Wad36wV946fTuuin2H8eCguPvvipk0wdChs28bk8dN5J2kc\neOgQlpXIRDiSQCCCllvzkENmh57MSH4U1qyB++838ggtXAgjR1JYWs4dd7/MNz2HejyeQmbEivBk\ndvZRIeps5rj+TJm3mZLys2P+bRGK4X+6H/pGwp//DEePwoIFHO87iOuvmkpe45Yej+VcXEWahUQ4\nkkAggpbzpp2yNJv9BUXERkcxZVQCyYlxpJbfS/N5y7niiy/4svdwHr7mEU7bGns8TpzL+4QIR5Ji\nQoSc1DQ7U+ZtJqL4NMP2ZrCqWyJaeW4FVcDu2WMCW0AhTFJdigmpEYiQk7I022gusjViZffza9xX\n0iULIZ3FIgR5uyhKlC1SOoeFwCKBQCn1qFJKK6Xaml0WEfy8ecqPVErW0xXCwfRAoJTqDFwN5Jld\nFhEapoxKwBZRfdI4W6TixV8PliAghIPpgQB4GZiKMalTiHpLTowj5abBHucZtGpiI2WCBAEhXJna\nWayUuh6wa603q1rS/iqlJgOTAeLja18zVYS35MQ4udkL4SW/BwKl1NdABw8vPQn8CaNZqFZa6zeA\nN8AYPuqzAgohRJjzeyDQWl/pabtSaiDQDXDWBjoBG5VSQ7XWB/1dLiGEEAbTmoa01ulAO+fPSqk9\nQJLW+ohZZRJCiHBkhc5iIYQQJrLMzGKtdVezyyCEEOEoKHMNKaXygdw6vr0tEG7NT3LN4UGuOTzU\n55q7aK1jKm8MykBQH0qp9Z6SLoUy
uebwINccHvxxzdJHIIQQYU4CgRBChLlwDARvmF0AE8g1hwe5\n5vDg82sOuz4CIYQQ7sKxRiCEEMKFBAIhhAhzIRsIlFLXKKWylVI7lVJPeHi9kVLqY8fr65RSXQNf\nSt/y4pofUUptVUptUUp9o5TqYkY5fam2a3bZ70bH4kdBPdTQm+tVSv3a8f+cqZT6INBl9DUvfq/j\nlVLfKqXSHL/bo80opy8ppd5WSh1WSmVU87pSSr3i+Ey2KKXOq9cJtdYh9wVEAjlAd6AhsBnoV2mf\n+4F/Ob6/BfjY7HIH4JovA5o4vv+/cLhmx37NgVXAWox8VqaX3Y//x72ANKCV4+d2Zpc7ANf8BvB/\nju/7AXvMLrcPrvti4Dwgo5rXRwNfAgoYBqyrz/lCtUYwFNiptd6ltT4DfARcX2mf64H/Or7/FLhC\n1bYogrXVes1a62+11oWOH9diZHwNZt78PwP8GfgLcDqQhfMDb673XuAfWuvjAFrrwwEuo695c80a\naOH4viWwP4Dl8wut9SrgWA27XA+8qw1rgWilVMe6ni9UA0EcsNfl532ObR730VqXAieANgEpnX94\nc82u7sZ4oghmtV6zo8rcWWu9OJAF8xNv/o97A72VUquVUmuVUtcErHT+4c01zwRuU0rtA5YAfwxM\n0Ux1rn/vNbJM0jkROEqp24Ak4BKzy+JPSqkI4CXgTpOLEkgNMJqHLsWo8a1SSg3UWheYWir/mgi8\no7V+USl1EfCeUmqA1rrc7IIFi1CtEdiBzi4/d3Js87iPUqoBRpXyaEBK5x/eXDNKqSsxVocbp7Uu\nDlDZ/KW2a24ODABWONa7GAYsCOIOY2/+j/cBC7TWJVrr3cB2jMAQrLy55ruBTwC01j8AjTESs4Uy\nr/7evRWqgeAnoJdSqptSqiFGZ/CCSvssAO5wfD8BWK4dvTBBqtZrVkolAq9jBIFgbzuGWq5Za31C\na91Wa91VG2nO12Jc+3pziltv3vxep2LUBlBKtcVoKtoVyEL6mDfXnAdcAaCU6osRCPIDWsrAWwD8\nxjF6aBhwQmt9oK4HC8mmIa11qVLqD8BSjFEHb2utM5VSzwDrtdYLgH9jVCF3YnTK3GJeievPy2tO\nAZoB8xz94nla63GmFbqevLzmkOHl9S4FrlZKbQXKgCla66Ct6Xp5zY8CbyqlHsboOL4zyB/qUEp9\niBHQ2zr6PmYANgCt9b8w+kJGAzuBQuC39TpfkH9eQggh6ilUm4aEEEJ4SQKBEEKEOQkEQggR5iQQ\nCCFEmJNAIIQQYU4CgRBChDkJBEIIEeYkEAjhA458+Fc5vp+llPq72WUSwlshObNYCBPMAJ5RSrUD\nEoGgnbEtwo/MLBbCR5RSKzFSeFyqtT5pdnmE8JY0DQnhA0qpgUBH4IwEARFsJBAIUU+OlaHmYqwa\n9UsILAYjwowEAiHqQSnVBJgPPKq1zsJYFnOGuaUS4txIH4EQQoQ5qREIIUSYk0AghBBhTgKBEEKE\nOQkEQggR5iQQCCFEmJNAIIQQYU4CgRBChLn/D4RrHd5K7mWKAAAAAElFTkSuQmCC\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "K8mPkng3MTiQ",
        "colab_type": "text"
      },
      "source": [
        "# Things that might help on the homework\n",
        "\n",
        "## Brief Sidenote: Momentum\n",
        "\n",
        "There are other optimization algorithms besides stochastic gradient descent. One is a modification of SGD called momentum. We won't get into it here, but if you would like to read more [here](https://distill.pub/2017/momentum/) is a good place to start.\n",
        "\n",
        "We only change the step size and add the momentum keyword argument to the optimizer. Notice how it reduces the training loss in fewer iterations."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "QhGP8gZDMTiQ",
        "colab_type": "code",
        "outputId": "41f87ecd-db93-46d8-9798-30795069e265",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 208
        }
      },
      "source": [
        "# feel free to play with these parameters\n",
        "\n",
        "step_size = 0.05\n",
        "momentum = 0.9\n",
        "n_epochs = 1500\n",
        "n_hidden_1 = 32\n",
        "n_hidden_2 = 32\n",
        "d_out = 1\n",
        "\n",
        "neural_network = nn.Sequential(\n",
        "                            nn.Linear(d, n_hidden_1), \n",
        "                            nn.Tanh(),\n",
        "                            nn.Linear(n_hidden_1, n_hidden_2),\n",
        "                            nn.Tanh(),\n",
        "                            nn.Linear(n_hidden_2, d_out)\n",
        "                            )\n",
        "\n",
        "loss_func = nn.MSELoss()\n",
        "\n",
        "optim = torch.optim.SGD(neural_network.parameters(), lr=step_size, momentum=momentum)\n",
        "print('iter,\\tloss')\n",
        "for i in range(n_epochs):\n",
        "    y_hat = neural_network(X)\n",
        "    loss = loss_func(y_hat, y)\n",
        "    optim.zero_grad()\n",
        "    loss.backward()\n",
        "    optim.step()\n",
        "    \n",
        "    if i % (n_epochs // 10) == 0:\n",
        "        print('{},\\t{:.2f}'.format(i, loss.item()))\n",
        "\n"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "iter,\tloss\n",
            "0,\t4.41\n",
            "150,\t2.87\n",
            "300,\t0.51\n",
            "450,\t0.06\n",
            "600,\t0.13\n",
            "750,\t0.03\n",
            "900,\t0.01\n",
            "1050,\t0.00\n",
            "1200,\t0.00\n",
            "1350,\t0.00\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "rGZL4mkbMTiS",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "X_grid = torch.from_numpy(np.linspace(0,1,50)).float().view(-1, d)\n",
        "y_hat = neural_network(X_grid)\n",
        "plt.scatter(X.numpy(), y.numpy())\n",
        "plt.plot(X_grid.detach().numpy(), y_hat.detach().numpy(), 'r')\n",
        "plt.title(r'plot of $f(x)$ and $\\hat{f}(x)$')\n",
        "plt.xlabel('$x$')\n",
        "plt.ylabel('$y$')\n",
        "plt.show()"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "aSO1McZLMTiT",
        "colab_type": "text"
      },
      "source": [
        "## CrossEntropyLoss\n",
        "So far, we have been considering regression tasks and have used the [MSELoss](https://pytorch.org/docs/stable/nn.html#torch.nn.MSELoss) module. For the homework, we will be performing a classification task and will use the cross entropy loss.\n",
        "\n",
        "PyTorch implements a version of the cross entropy loss in one module called [CrossEntropyLoss](https://pytorch.org/docs/stable/nn.html#torch.nn.CrossEntropyLoss). Its usage is slightly different from MSELoss, so we will break it down here. \n",
        "\n",
        "- input: The first parameter to CrossEntropyLoss is the output of our network. It expects a *real valued* tensor of dimensions $(N,C)$ where $N$ is the minibatch size and $C$ is the number of classes. In our case $N=3$ and $C=2$. The values along the second dimension are the raw, unnormalized scores for each class. The CrossEntropyLoss module does the softmax calculation for us, so we do not need to apply our own softmax to the output of our neural network.\n",
        "- target: The second parameter to CrossEntropyLoss is the true label. It expects an *integer valued* tensor of dimension $(N)$. The integer at each element is the index of the correct class. In our case, the correct classes are class 1, class 1, and class 0.\n",
        "\n",
        "Try out the loss function below on four toy score tensors (uncomment one `input` at a time), each containing three predictions. The true class labels are $y=[1,1,0]$. The first two tensors give higher raw scores to the correct class, so they are \"correct\" predictions; the second is \"more confident\", leading to a smaller loss. The last two are incorrect predictions, with lower and higher (misplaced) confidence respectively."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "ALoGYsu1MTiU",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "loss = nn.CrossEntropyLoss()\n",
        "\n",
        "input = torch.tensor([[-1., 1],[-1, 1],[1, -1]]) # raw scores correspond to the correct class\n",
        "# input = torch.tensor([[-3., 3],[-3, 3],[3, -3]]) # raw scores correspond to the correct class with higher confidence\n",
        "# input = torch.tensor([[1., -1],[1, -1],[-1, 1]]) # raw scores correspond to the incorrect class\n",
        "# input = torch.tensor([[3., -3],[3, -3],[-3, 3]]) # raw scores correspond to the incorrect class with incorrectly placed confidence\n",
        "\n",
        "target = torch.tensor([1, 1, 0])\n",
        "output = loss(input, target)\n",
        "print(output)\n"
      ],
      "execution_count": 0,
      "outputs": []
    },
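    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a sanity check (the code below is just an illustration, not something you need for the homework), CrossEntropyLoss is equivalent to applying `log_softmax` to the raw scores and then averaging the negative log-probabilities of the correct classes:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "log_probs = torch.log_softmax(input, dim=1)\n",
        "# pick out the log-probability of the correct class for each example\n",
        "manual_loss = -log_probs[torch.arange(3), target].mean()\n",
        "print(manual_loss)  # matches loss(input, target)\n"
      ],
      "execution_count": 0,
      "outputs": []
    },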
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "OCwLf9C2MTiY",
        "colab_type": "text"
      },
      "source": [
        "## Learning rate schedulers\n",
        "\n",
        "Often we do not want to use a fixed learning rate throughout training. PyTorch offers learning rate schedulers to change the learning rate over time. Common strategies include multiplying the learning rate by a constant factor (e.g. 0.9) every epoch, and halving the learning rate when the training loss flattens out.\n",
        "\n",
        "See the [learning rate scheduler docs](https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate) for usage and examples."
      ]
    },
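    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a minimal sketch (the scheduler choice and parameter values here are just for illustration), `StepLR` multiplies the learning rate by `gamma` every `step_size` epochs:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# illustrative only: decay the learning rate by a factor of 0.9 every 100 epochs\n",
        "optim = torch.optim.SGD(neural_network.parameters(), lr=0.05)\n",
        "scheduler = torch.optim.lr_scheduler.StepLR(optim, step_size=100, gamma=0.9)\n",
        "for epoch in range(300):\n",
        "    # the usual forward pass, loss.backward(), and optim.step() would go here\n",
        "    scheduler.step()  # update the learning rate once per epoch\n",
        "print(optim.param_groups[0]['lr'])  # 0.05 * 0.9**3\n"
      ],
      "execution_count": 0,
      "outputs": []
    },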
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "IrapEC2XMTiY",
        "colab_type": "text"
      },
      "source": [
        "## Convolutions\n",
        "When working with images, we often want to use convolutions to extract features. PyTorch implements this for us in the `torch.nn.Conv2d` module. It expects the input to have a specific shape $(N, C_{in}, H_{in}, W_{in})$ where $N$ is the batch size, $C_{in}$ is the number of channels the image has, and $H_{in}, W_{in}$ are the image height and width respectively.\n",
        "\n",
        "We can modify the convolution to have different properties with the parameters:\n",
        "- kernel_size\n",
        "- stride\n",
        "- padding\n",
        "\n",
        "These parameters change the output dimensions, so be careful.\n",
        "\n",
        "See the [`torch.nn.Conv2d` docs](https://pytorch.org/docs/stable/nn.html#torch.nn.Conv2d) for more information."
      ]
    },
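    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "For example (a small illustration; the shapes follow from the formula in the `Conv2d` docs), here is how kernel_size, padding, and stride each change the output shape on a $28 \\times 28$ input:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "x = torch.zeros(1, 1, 28, 28)  # (N, C_in, H_in, W_in)\n",
        "print(nn.Conv2d(1, 1, kernel_size=3)(x).shape)             # torch.Size([1, 1, 26, 26])\n",
        "print(nn.Conv2d(1, 1, kernel_size=3, padding=1)(x).shape)  # torch.Size([1, 1, 28, 28])\n",
        "print(nn.Conv2d(1, 1, kernel_size=3, stride=2)(x).shape)   # torch.Size([1, 1, 13, 13])\n"
      ],
      "execution_count": 0,
      "outputs": []
    },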
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "8gKSeJYuMTiZ",
        "colab_type": "text"
      },
      "source": [
        "To illustrate what the `Conv2d` module is doing, let's set the conv weights manually to a Gaussian blur kernel.\n",
        "\n",
        "We can see that it applies the kernel to the image."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "QJjlv1lOMTiZ",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "# an entire mnist digit\n",
        "image = np.array([0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0.3803922 , 0.37647063, 0.3019608 ,0.46274513, 0.2392157 , 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0.3529412 , 0.5411765 , 0.9215687 ,0.9215687 , 0.9215687 , 0.9215687 , 0.9215687 , 0.9215687 ,0.9843138 , 0.9843138 , 0.9725491 , 0.9960785 , 0.9607844 ,0.9215687 , 0.74509805, 0.08235294, 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.54901963,0.9843138 , 0.9960785 , 0.9960785 , 0.9960785 , 0.9960785 ,0.9960785 , 0.9960785 , 0.9960785 , 0.9960785 , 0.9960785 ,0.9960785 , 0.9960785 , 0.9960785 , 0.9960785 , 0.9960785 ,0.7411765 , 0.09019608, 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0.8862746 , 0.9960785 , 0.81568635,0.7803922 , 0.7803922 , 0.7803922 , 0.7803922 , 0.54509807,0.2392157 , 0.2392157 , 0.2392157 , 0.2392157 , 0.2392157 ,0.5019608 , 0.8705883 , 0.9960785 , 0.9960785 , 0.7411765 ,0.08235294, 0., 0., 0., 0.,0., 0., 0., 0., 0.,0.14901961, 0.32156864, 0.0509804 , 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.13333334,0.8352942 , 0.9960785 , 0.9960785 , 0.45098042, 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0.32941177, 0.9960785 ,0.9960785 , 0.9176471 , 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 
0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0.32941177, 0.9960785 , 0.9960785 , 0.9176471 ,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0.4156863 , 0.6156863 ,0.9960785 , 0.9960785 , 0.95294124, 0.20000002, 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0.09803922, 0.45882356, 0.8941177 , 0.8941177 ,0.8941177 , 0.9921569 , 0.9960785 , 0.9960785 , 0.9960785 ,0.9960785 , 0.94117653, 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0.26666668, 0.4666667 , 0.86274517,0.9960785 , 0.9960785 , 0.9960785 , 0.9960785 , 0.9960785 ,0.9960785 , 0.9960785 , 0.9960785 , 0.9960785 , 0.5568628 ,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0.14509805, 0.73333335,0.9921569 , 0.9960785 , 0.9960785 , 0.9960785 , 0.8745099 ,0.8078432 , 0.8078432 , 0.29411766, 0.26666668, 0.8431373 ,0.9960785 , 0.9960785 , 0.45882356, 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0.4431373 , 0.8588236 , 0.9960785 , 0.9490197 , 0.89019614,0.45098042, 0.34901962, 0.12156864, 0., 0.,0., 0., 0.7843138 , 0.9960785 , 0.9450981 ,0.16078432, 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0.6627451 , 0.9960785 ,0.6901961 , 0.24313727, 0., 0., 0.,0., 0., 0., 0., 0.18823531,0.9058824 , 0.9960785 , 0.9176471 , 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0.07058824, 0.48627454, 0., 0.,0., 0., 0., 0., 0.,0., 0., 0.32941177, 0.9960785 , 0.9960785 ,0.6509804 , 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0.54509807, 0.9960785 , 0.9333334 , 0.22352943, 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0.8235295 , 0.9803922 , 0.9960785 ,0.65882355, 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0.9490197 , 0.9960785 , 0.93725497, 0.22352943, 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0.34901962, 0.9843138 , 0.9450981 ,0.3372549 , 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 
0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.01960784,0.8078432 , 0.96470594, 0.6156863 , 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0.01568628, 0.45882356, 0.27058825,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0., 0.,0., 0., 0., 0.], dtype=np.float32)\n",
        "image_torch = torch.from_numpy(image).view(1, 1, 28, 28)"
      ],
      "execution_count": 0,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "28-oVco6MTib",
        "colab_type": "code",
        "outputId": "19ef863a-74fc-4697-a2e1-2c525ea74a24",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 545
        }
      },
      "source": [
        "# a 3x3 Gaussian blur kernel\n",
        "gaussian_kernel = torch.tensor([[1., 2, 1],[2, 4, 2],[1, 2, 1]]) / 16.0\n",
        "\n",
        "conv = nn.Conv2d(1, 1, 3, bias=False)  # no bias, so the output is a pure convolution\n",
        "# manually set the conv weight (broadcast into the (out, in, 3, 3) weight tensor)\n",
        "conv.weight.data[:] = gaussian_kernel\n",
        "\n",
        "convolved = conv(image_torch)\n",
        "\n",
        "plt.title('original image')\n",
        "plt.imshow(image_torch.view(28,28).detach().numpy())\n",
        "plt.show()\n",
        "\n",
        "plt.title('blurred image')\n",
        "plt.imshow(convolved.view(26,26).detach().numpy())\n",
        "plt.show()"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPsAAAEICAYAAACZA4KlAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAARoElEQVR4nO3de5BW9X3H8fdH5KKCgFEJRcwab1Uz\nKZpVU7UNVmOUXNSmpZJqqGOC9ZLW6pgYM1biJBnjGB1TLwlW6yVeYsYbpNioGGvNeFuMEbzEKwq4\nsgo6oEZY4Ns/noPzgHvOsz733d/nNbOzz57vuXyfBz7POc85zzlHEYGZDX6btboBM2sOh90sEQ67\nWSIcdrNEOOxmiXDYzRLhsA8Akn4m6Zx6j1thPh2SQtLmOfWnJE2udTnWPPJxduuLpA7gZWBoRKxt\nbTdWD16ztzlJQ1rdgw0ODnsLSNpD0v2S3s42h79SVrtG0hWS5kp6Fzg4G/aDsnG+Lalb0muSvpFt\nbu9SNv0PsseTJS2RdIaknmya48vm80VJv5e0UtJiSTM/wnNYJOnQ7PFMSb+S9AtJqyQtkLSbpO9m\ny10s6bCyaY+X9Ew27kuSTtxk3kXPb7ikCyW9KmlZ9rFli4/6b5Aih73JJA0F5gB3A9sD3wJukLR7\n2WhfA34IjAIe3GT6w4HTgUOBXYDJFRb5cWA0MAE4AbhM0tis9i7wdWAM8EXgJElHVfnUvgxcD4wF\nfg/8htL/rwnAecDPy8btAb4EbA0cD1wsaZ9+Pr/zgd2ASVl9AvDvVfacFIe9+T4LjATOj4g1EXEf\n8GtgWtk4d0bE7yJifUS8v8n0U4H/ioinIuI9YGaF5fUC50VEb0TMBd4BdgeIiPsjYkG2nCeBm4DP\nVfm8/i8ifpN9vv8VsF32HHuBm4EOSWOy5f53RLwYJf9L6Y3vryo9P0kCZgD/FhErImIV8CPgmCp7\nTkqfe1qtof4MWBwR68uGvUJpDbXB4grTd/VzXIDlm+xge4/Smw2S9qe0pvwUMAwYTimo1VhW9vhP\nwJsRsa7sb7Llvi3pCOBcSmvozYAtgQXZOEXPb7ts3Pml3AMgwPs1+sFr9uZ7DZgoqfy13xFYWvZ3\n0SGSbmCHsr8n1tDLjcBsYGJEjAZ+Rik8DSNpOHArcCEwLiLGAHPLllv0/N6k9MaxV0SMyX5GR8TI\nRvY8WDjszfcIpbXrtyUNzY5Vf5nSpm5/3AIcn+3k2xKo5Zj6KGBFRLwvaT9K+woabcMWxBvA2mwt\nf1hZPff5ZVtDV1L6jL89gKQJkr7QhL4HPIe9ySJiDaVwH0FpTXU58PWIeLaf098F/BT4LfAC8HBW\nWl1FOycD50laRWkn1y1VzOMjyT5n/0u2rLcovcHMLqtXen7f2TBc0krgXrJ9EFbMX6oZ4CTtASwE\nhg/GL78M9ufXTF6zD0CSjs6ON48FfgzMGUxBGOzPr1Uc9oHpRErHql8E1gEntbaduhvsz68lvBlv\nlgiv2c0S0dQv1QzT8BjBVs1cpFlS3udd1sTqPr8rUVPYs+8xX0LpG0z/GRHnF40/gq3YX4fUskgz\nK/BIzMutVb0Zn516eRml48V7AtMk7Vnt/MyssWr5zL4f8EJEvJR9UeRm4Mj6tGVm9VZL2Cew8UkK\nS9j4ZA4AJM2Q1CWpq7eqL3mZWT00fG98RMyKiM6I6BzK8EYvzsxy1BL2pWx8RtIObHzmlpm1kVrC\n/hiwq6SdJA2jdAGB2RWmMbMWqfrQW0SslXQqpcsPDQGujoin6taZmdVVTcfZs8scza1TL2bWQP66\nrFkiHHazRDjsZolw2M0S4bCbJcJhN0uEw26WCIfdLBEOu1kiHHazRDjsZolw2M0S4bCbJcJhN0uE\nw26WCIfdLBEOu1kiHHazRDjs
Zolw2M0S4bCbJcJhN0uEw26WCIfdLBEOu1kiHHazRDjsZolw2M0S\n4bCbJcJhN0tETbdslrQIWAWsA9ZGRGc9mjKz+qsp7JmDI+LNOszHzBrIm/Fmiag17AHcLWm+pBl9\njSBphqQuSV29rK5xcWZWrVo34w+KiKWStgfukfRsRDxQPkJEzAJmAWytbaLG5ZlZlWpas0fE0ux3\nD3A7sF89mjKz+qs67JK2kjRqw2PgMGBhvRozs/qqZTN+HHC7pA3zuTEi/qcuXZlZ3VUd9oh4CfiL\nOvZiZg3kQ29miXDYzRLhsJslwmE3S4TDbpaIepwIYy3WffoBuTVV+M7iiOXFI7z158XTj39oXfH8\n5zxaPANrGq/ZzRLhsJslwmE3S4TDbpYIh90sEQ67WSIcdrNEDJrj7D2n5B9rBnj7072F9dsPu7Se\n7TTVHsMeq3ra92NtYX30ZlsU1nuOe7ew/tpP8/+LXfT65wunXT5168L62sVLCuu2Ma/ZzRLhsJsl\nwmE3S4TDbpYIh90sEQ67WSIcdrNEKKJ5N2nZWtvE/jqk6umfu3Lf3NqzUy4vnHa4hla9XGuNYxdN\nLqy/9bUKx+EXvVrHbgaGR2IeK2OF+qp5zW6WCIfdLBEOu1kiHHazRDjsZolw2M0S4bCbJWJAnc9+\nxcHX5dYqHUf/8fJdC+s9a0ZV1VM93Db/M4X1Hef0edi0LSw5pHh9ccGUG3NrXx25snDaX3TcX1g/\n9sbJhfW3/mGH3FqK58JXXLNLulpSj6SFZcO2kXSPpOez32Mb26aZ1ao/m/HXAIdvMuwsYF5E7ArM\ny/42szZWMewR8QCwYpPBRwLXZo+vBY6qc19mVmfVfmYfFxHd2ePXgXF5I0qaAcwAGMGWVS7OzGpV\n8974KJ1Jk3s2TUTMiojOiOgcyvBaF2dmVao27MskjQfIfvfUryUza4Rqwz4bmJ49ng7cWZ92zKxR\nKp7PLukmYDKwLbAMOBe4A7gF2BF4BZgaEZvuxPuQWs9n12f2yq29Oan43Obt7/hjYX3d8ortWxU2\n+3T+Dd6/dPPvCqc9Zczimpa9+1Un5dY6znmopnm3q6Lz2SvuoIuIaTml6lNrZk3nr8uaJcJhN0uE\nw26WCIfdLBEOu1kiBtSlpG1wWf7Nvyysd33/iprmP3/1mtza2TvtV9O825UvJW1mDrtZKhx2s0Q4\n7GaJcNjNEuGwmyXCYTdLhMNulgiH3SwRDrtZIhx2s0Q47GaJcNjNEuGwmyXCYTdLxIC6ZbMNPEvO\nPiC3tn7vVQ1d9rgh+eezr/2b4ttkb37f/Hq303Jes5slwmE3S4TDbpYIh90sEQ67WSIcdrNEOOxm\nifB14weBzT/ZkVt74YTxhdNefsysOnezsckjenNrQ9S6dc2Lve8U1k/+xEFN6qS+arpuvKSrJfVI\nWlg2bKakpZKeyH6m1LNhM6u//ry1XgMc3sfwiyNiUvYzt75tmVm9VQx7RDwArGhCL2bWQLV8aDpV\n0pPZZv7YvJEkzZDUJamrl9U1LM7MalFt2K8AdgYmAd3AT/JGjIhZEdEZEZ1DGV7l4sysVlWFPSKW\nRcS6iFgPXAkMzltimg0iVYVdUvnxnKOBhXnjmll7qHg+u6SbgMnAtpKWAOcCkyVNAgJYBJzYwB4H\nvXf+fv/C+hv7FL8nn/e3N+fWjhn1VlU91U97fm/r0HtPK6zvRleTOmmeimGPiGl9DL6qAb2YWQO1\n59uumdWdw26WCIfdLBEOu1kiHHazRPhS0nWgvfcqrI+5tLuwPrfjisJ6I08FvePdkYX1hX/aoab5\n//qCybm1IauLT6+eft6cwvqM0a9V0xIAw14fWvW0A5XX7GaJcNjNEuGwmyXCYTdLhMNulgiH3SwR\nDrtZInycvZ9e+X7+rYfPOeaXhdP+46jlhfVX175XWH92Te5VvwD41k3fyK1t2d3nVYU/MP7+Nw
vr\n655+rrBeyWgernra5787rsLMi4+zv1xwueiOO4svJT0Yec1ulgiH3SwRDrtZIhx2s0Q47GaJcNjN\nEuGwmyXCx9n7acy+Pbm1SsfRD3n6K4X13v/4eGF9izsfLax38FBhvci6qqes3frP7V1YP2pMpYsY\nF6+rVqwfll98dEGFeQ8+XrObJcJhN0uEw26WCIfdLBEOu1kiHHazRDjsZonozy2bJwLXAeMo3aJ5\nVkRcImkb4JdAB6XbNk+NiFbfH7hhPnZC/vnPu5x+UuG0O59ZfBx8c16tqqeB7q3dRhTWDxxR27po\nxsJjc2vbUtt5+gNRf17NtcAZEbEn8FngFEl7AmcB8yJiV2Be9reZtamKYY+I7oh4PHu8CngGmAAc\nCVybjXYtcFSjmjSz2n2k7SRJHcDewCPAuIjYcF+j1ylt5ptZm+p32CWNBG4FTouIleW1iAhKn+f7\nmm6GpC5JXb2srqlZM6tev8IuaSiloN8QEbdlg5dJGp/VxwN9nikSEbMiojMiOocyvB49m1kVKoZd\nkoCrgGci4qKy0mxgevZ4OnBn/dszs3rpzymuBwLHAQskPZENOxs4H7hF0gnAK8DUxrTYHtZ2v55b\n2/nM/JrlW77v2pqmf2ZN8SW4R10+uqb5DzYVwx4RDwJ5Fx8/pL7tmFmj+Bt0Zolw2M0S4bCbJcJh\nN0uEw26WCIfdLBG+lLQ11BcWrsyt3T7msgpTF1wKGpj+1PTC+ti7Hqsw/7R4zW6WCIfdLBEOu1ki\nHHazRDjsZolw2M0S4bCbJcLH2a2h/m7rJ3NrW242snDa53rfLaxveemYqnpKldfsZolw2M0S4bCb\nJcJhN0uEw26WCIfdLBEOu1kifJzdatJz8gGF9XFD8s8pf7k3/zbYANN+dGZhfdu7im+FbRvzmt0s\nEQ67WSIcdrNEOOxmiXDYzRLhsJslwmE3S0TF4+ySJgLXAeOAAGZFxCWSZgLfBN7IRj07IuY2qlFr\nDQ0fXlj/6j/fV1hftX5Nbm3KoycVTrvjz30cvZ7686WatcAZEfG4pFHAfEn3ZLWLI+LCxrVnZvVS\nMewR0Q10Z49XSXoGmNDoxsysvj7SZ3ZJHcDewCPZoFMlPSnpakljc6aZIalLUlcvq2tq1syq1++w\nSxoJ3AqcFhErgSuAnYFJlNb8P+lruoiYFRGdEdE5lOLPf2bWOP0Ku6ShlIJ+Q0TcBhARyyJiXUSs\nB64E9mtcm2ZWq4phlyTgKuCZiLiobPj4stGOBhbWvz0zq5f+7I0/EDgOWCDpiWzY2cA0SZMoHY5b\nBJzYkA6ttdZHYfn6OQcX1u/6w+Tc2o63PFxNR1al/uyNfxBQHyUfUzcbQPwNOrNEOOxmiXDYzRLh\nsJslwmE3S4TDbpYIX0raCkVv/imqAB3f82moA4XX7GaJcNjNEuGwmyXCYTdLhMNulgiH3SwRDrtZ\nIhRRfL5yXRcmvQG8UjZoW+DNpjXw0bRrb+3aF7i3atWzt09ExHZ9FZoa9g8tXOqKiM6WNVCgXXtr\n177AvVWrWb15M94sEQ67WSJaHfZZLV5+kXbtrV37AvdWrab01tLP7GbWPK1es5tZkzjsZoloSdgl\nHS7pj5JekHRWK3rII2mRpAWSnpDU1eJerpbUI2lh2bBtJN0j6fnsd5/32GtRbzMlLc1euyckTWlR\nbxMl/VbS05KekvSv2fCWvnYFfTXldWv6Z3ZJQ4DngM8DS4DHgGkR8XRTG8khaRHQGREt/wKGpL8G\n3gGui4hPZcMuAFZExPnZG+XYiPhOm/Q2E3in1bfxzu5WNL78NuPAUcA/0cLXrqCvqTThdWvFmn0/\n4IWIeCki1gA3A0e2oI+2FxEPACs2GXwkcG32+FpK/1maLqe3thAR3RHxePZ4FbDhNuMtfe0K+mqK\nVoR9ArC47O8ltNf93gO4W9J8STNa3UwfxkVEd/b4dWBcK5
vpQ8XbeDfTJrcZb5vXrprbn9fKO+g+\n7KCI2Ac4Ajgl21xtS1H6DNZOx077dRvvZunjNuMfaOVrV+3tz2vVirAvBSaW/b1DNqwtRMTS7HcP\ncDvtdyvqZRvuoJv97mlxPx9op9t493WbcdrgtWvl7c9bEfbHgF0l7SRpGHAMMLsFfXyIpK2yHSdI\n2go4jPa7FfVsYHr2eDpwZwt72Ui73MY77zbjtPi1a/ntzyOi6T/AFEp75F8EvteKHnL6+iTwh+zn\nqVb3BtxEabOul9K+jROAjwHzgOeBe4Ft2qi364EFwJOUgjW+Rb0dRGkT/UngiexnSqtfu4K+mvK6\n+euyZonwDjqzRDjsZolw2M0S4bCbJcJhN0uEw26WCIfdLBH/D3ImkM6hEnS6AAAAAElFTkSuQmCC\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAP0AAAEICAYAAACUHfLiAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0\ndHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAUTElEQVR4nO3dfZBddX3H8fdnN7tZ8kQSEsJmk/IQ\nHgNTA0YUZWpaEBG1QKdjxY6io4Y6OmLrTMGHKtOhU3R8qJ3p2AZBEFFLRYXO0GqItNTOiAYa8wQY\nCInZkEcSyBPZZHe//eOedS6493c2u3f33s3v85rZ2XvP9+zvfPduPjn33nPu7ygiMLN8tDS6ATMb\nWw69WWYcerPMOPRmmXHozTLj0JtlxqFvEEmbJF1eo7ZEUvdY91S1/dMkhaQJNerrJC0Z47asTgb9\no5qlRMT5je7Bhs97+uPMYHvnWntsy5ND31ivk7Re0l5J35TUMdhKxVPtM6vu3yXp1uL2Ekndkm6S\ntB34pqRbJH1f0rcl7QPeL+lESXdI2iZpq6RbJbUWY7RK+pKk3ZI2Am9PNV390qTY1r8V29ovaY2k\nsyV9StJOSVskXVH1sx+Q9GSx7kZJN7xq7L8uenxe0oeqf3dJE4s+fyNph6R/lnTCsB75jDn0jfXn\nwFuBBcDZwGeHOc4pwEzgVGBpsexq4PvAdOBe4C6gFzgTuBC4AvhQse6HgXcUyxcDf3qM238ncA8w\nA/g/4MdU/m11AX8L/EvVujuLbU0DPgB8VdJFAJKuBP4KuLzoc8mrtnMblcdpUVHvAj53jL1aRPir\nAV/AJuAvqu5fBTxb3F4CdFfVAjiz6v5dwK1V6x4BOqrqtwCPVt2fA/QAJ1Qtuw54pLj901f1ckWx\nzQmJ3i+v2tbyqto7gQNAa3F/ajHW9Bpj/Qi4sbh9J/D3VbUzB353QMBBYEFV/RLguUb/Lcfbl1/r\nNdaWqtubgbnDHGdXRBxOjH0q0AZskzSwrKVqnbmD9HIsdlTdfhnYHRF9VfcBpgAvSnob8Hkqe+wW\nYBKwpqqPlTV+h9nFuo9X/Q4CWo+x1+w59I01v+r27wHP11jvEJV/8ANOAaoP6Q32UcnqZVuo7Oln\nRUTvIOtuG6SXupM0EbgfeB/wQEQclfQjKuEd6GNe1Y9U97Sbyn8g50fE1tHoLxd+Td9YH5U0T9JM\n4DPAv9ZYbxXwnuINtyuBNx/LRiJiG/AT4MuSpklqkbRA0sA49wEfL3qZAdw8vF+nVDswEdgF9BZ7\n/Suq6vcBH5B0nqRJwN9U/Q79wO1U3gM4GUBSl6S3jlKvxy2HvrG+QyWMG4FngVtrrHcjldfKL1J5\n8+9Hw9jW+6iEbj2wl8qbfJ1F7XYqb779CngC+MEwxi8VEfuBj1MJ917gPcCDVfX/AP4ReAR4Bvh5\nUeopvt80sLw4KvEwcM5o9Ho8U/GGiFnTkXQesBaYWONliQ2D9/TWVCRdWxyPnwF8Afh3B76+HHpr\nNjdQOZb/LNAHfKSx7Rx//PTeLDPe05tlZkyP07drYnQweSw3aZaVwxzkSPQotc6IQl8cM/4albOi\nvhERt6XW72Ayr9dlI9mkmSU8FitK1xn20/viE1r/BLwNWAhcJ2nhcMczs7Exktf0FwPPRMTGiDgC\nfI/KJ7vMrImNJPRdvPIDEd3FMjNrYqP+Rp6kpRSf8e54xWdGzKwRRrKn38orPwU1r1j2ChGxLCIW\nR8TiNiaOYHNmVg8jCf0vgbMknS6pHXg3VR+eMLPmNOyn9xHRK+ljVD6d1QrcGRHr6taZmY2KEb2m\nj4iHgIfq1IuZjQGfhmuWGYfeLDMOvVlmHHqzzDj0Zplx6M0y49CbZcahN8uMQ2+WGYfeLDMOvVlm\nHHqzzDj0Zplx6M0y49CbZcah
N8uMQ2+WGYfeLDMOvVlmHHqzzDj0Zplx6M0y49CbZcahN8uMQ2+W\nGYfeLDMOvVlmHHqzzDj0Zplx6M0y49CbZcahN8vMhEY3YINr6egoX2fG9PQKE9vr1E1tcfDl0nX6\n9+5Nj9HbW692bAhGFHpJm4D9QB/QGxGL69GUmY2eeuzp/zAidtdhHDMbA35Nb5aZkYY+gJ9IelzS\n0sFWkLRU0kpJK4/SM8LNmdlIjfTp/aURsVXSycBySU9FxKPVK0TEMmAZwDTNjBFuz8xGaER7+ojY\nWnzfCfwQuLgeTZnZ6Bl26CVNljR14DZwBbC2Xo2Z2egYydP7OcAPJQ2M852I+M+RNKOJE0vXaZ09\nK1nvm11y7Bo4clL6GHj/xMa/v9kzrbV0nQNd6T77Thh5H6F0fWL6EDwAU7v7kvXJ3YdKx2h57vlk\nve+FPekBwq8sBww79BGxEXhNHXsxszHQ+F2amY0ph94sMw69WWYcerPMOPRmmXHozTLj0JtlZkwn\n0VBLCy2TJtes973mzNIxtr2+9s8D7Dv3aOkYc+anzyiZNelg6Rij7fcmlvdw9uSdyfqJreUTXJRp\nUX+yvrVnRukYq1/qStbX/Xpe6RidK85K1mf893PJeu+O9GMFZHMCj/f0Zplx6M0y49CbZcahN8uM\nQ2+WGYfeLDMOvVlmxvZiF+1t6NTax2w3v31S6RBvvGxNsn75jPWlY5zWtitZn9pypHSMMocjPQnG\nxiMnJ+ubj6QnCwE43N+WrO/pTZ/TMBRTWw8n6xdP2Vg6xrtn/CJZf2runNIxPjf9j5P1aDk9WZ/5\nSOkmyo/lHyfH8b2nN8uMQ2+WGYfeLDMOvVlmHHqzzDj0Zplx6M0yM6bH6aNF9E9qr1k/Oi392W2A\nnr50y/dsfUPpGDsPTEnW+/pH/n9hz5F0n0d2pM9JOOH58otdTCi/RsSIlV0w49D89IUsAC76/WeT\n9Y/MLT+I/neLHkjWb+r9k2S9/cCppduY9F/pB7R///7SMcYD7+nNMuPQm2XGoTfLjENvlhmH3iwz\nDr1ZZhx6s8w49GaZGduLXRzto3Xbnpr1rp+WT/rw9Npzk/WOPeUn+MzY05usq3/kkyXoaLqPCS++\nlKy3vFh+Ikj0jHyyjzJqT0/U0XdK+cUunnnj2cn6F6/pKB3jC2fcn6z/5aIVyfo//OYdpds465lT\n0is8mcnJOZLulLRT0tqqZTMlLZe0ofhe/pc3s6YwlKf3dwFXvmrZzcCKiDgLWFHcN7NxoDT0EfEo\n8Orn5FcDdxe37wauqXNfZjZKhvuafk5EbCtubwdqzmwoaSmwFKCjdeowN2dm9TLid+8jIoCa73xF\nxLKIWBwRi9tbSj6yZWajbrih3yGpE6D4PoTrAJtZMxhu6B8Eri9uXw+kP+xsZk2j9DW9pO8CS4BZ\nkrqBzwO3AfdJ+iCwGXjXUDYWR4/Su21HzfqUh8uPg04tOW4cB8tnlug/nL6Aw1goO5ug/GyDJpH4\new7o7DkrWd9w6rzSMZ6alz6GfvWUJ5P1O8+/pHQbPZ3TkvUJ6U2MG6Whj4jrapQuq3MvZjYGfBqu\nWWYcerPMOPRmmXHozTLj0JtlxqE3y4xDb5aZMZ1EA4D+2ldEOV6uIJKTlpKTpQCiI/3PTH0qHeNw\npLczu3Visn769NqTtwzYPiM9LUTbhPTvEb3pyVmahff0Zplx6M0y49CbZcahN8uMQ2+WGYfeLDMO\nvVlmxv44vQ2Nyo9dt04tmWi0q+Z8pb915JT0GL0drcl6z/R0HWDveenfpevCbck6wLnt6XUmkO7j\nohO3lG7j3rPTF+U4sTM9kUfvlu7SbTQD7+nNMuPQm2XGoTfLjENvlhmH3iwzDr1ZZhx6s8z4OP0o\nUVt7st4yc3qyHp2zSrex97z0xRl2X1h+rH/KOXuT9ZOnHEjW53YcLN3Gn01/Lll/46QNpWMsbK
s9\nDwNAqzqS9T+asr50G984703Jem/XzPQAPk5vZs3IoTfLjENvlhmH3iwzDr1ZZhx6s8w49GaZcejN\nMuOTcwbTkp6QoXX2SaVDHFk4L1nffUH6ZJJ9Z6dPRgE4/bznk/Wbu35ROsZrOzYn6xOV7mNX/6TS\nbWzoSU8+8VRPZ+kY01vSfS5Qf+kYZaLkohvqi/TPj7iDsVG6p5d0p6SdktZWLbtF0lZJq4qvq0a3\nTTOrl6E8vb8LuHKQ5V+NiEXF10P1bcvMRktp6CPiUaD8QmBmNi6M5I28j0laXTz9r3nlP0lLJa2U\ntPIoPSPYnJnVw3BD/3VgAbAI2AZ8udaKEbEsIhZHxOI20lcWNbPRN6zQR8SOiOiLiH7gduDi+rZl\nZqNlWKGXVH2M5Vpgba11zay5lB6nl/RdYAkwS1I38HlgiaRFVA5NbgJuGMUe665lUvrYsk6fn6zv\nvKRkMgVg75sPJ+tvPfeJZP3sSdtLt9FWcgy9+0h5n8v3LEzWt+xPT/ax44UTS7fRuil9TsLRKeVH\nuK+9NH3OwadO/p9kfV3PgtJttG8tmfhkb/r97PIzK5pDaegj4rpBFt8xCr2Y2RjwabhmmXHozTLj\n0JtlxqE3y4xDb5YZh94sMw69WWaynESjZc7sZL37LelJMqZcWX7izPs6n0rW9/WmT1j53ubFpdvY\nuTHd56Tu9GQgAJO3pk+MOWF3b7J+xotHSrcxYc+uZH3vReVX81l9QVeyfmh2+vdYcyg9qQnAlC0l\nK7zwYukY44H39GaZcejNMuPQm2XGoTfLjENvlhmH3iwzDr1ZZrI8Th/tbcn6kfS8EbS1lF9Y4dvr\n0jOInfB4eiKPmU8dLd3GOVteStZb9uwvHaP/pX3p+oED6QFiCJd4mJ6eaKO3I33eBEDnpPTverSk\njTV755ZuY8rW9DQYfSWP1XjhPb1ZZhx6s8w49GaZcejNMuPQm2XGoTfLjENvlpksj9Oz84Vkec5j\nNa/HCcC+7Z3JOkDX5vTn0Cev2pis9+3aXbqN/t70NsrPJhgbOnFasn5wrkrHWDS1O1l/9mj6b7bx\nuTml2zhn+6FkPfrHy+Us0rynN8uMQ2+WGYfeLDMOvVlmHHqzzDj0Zplx6M0y49CbZSbLk3P6Xkxf\ntGDS/z6drj+RvlAFQOxPTz7Reyh9Ish40XrSzNJ19r02PYFF7wUHS8c4tT19stL9e9IXBzlxdXri\nFIDW7t8k6+lTocaP0j29pPmSHpG0XtI6STcWy2dKWi5pQ/E9fUqUmTWFoTy97wU+GRELgTcAH5W0\nELgZWBERZwErivtm1uRKQx8R2yLiieL2fuBJoAu4Gri7WO1u4JrRatLM6ueYXtNLOg24EHgMmBMR\n24rSdmDQTzRIWgosBeggPRmkmY2+Ib97L2kKcD/wiYh4xbSgERHAoPORRsSyiFgcEYvbmDiiZs1s\n5IYUekltVAJ/b0T8oFi8Q1JnUe8Edo5Oi2ZWT0N5917AHcCTEfGVqtKDwPXF7euBB+rfnpnV21Be\n078JeC+wRtKqYtmngduA+yR9ENgMvGt0WhwFJRdo6NtXclGDsnpG+hZ0la7TfWV6Oo/PLvpx+XZI\nT7SxfP3CZH3BqpfLt7E7PbnK8aI09BHxM6j5iF9W33bMbLT5NFyzzDj0Zplx6M0y49CbZcahN8uM\nQ2+WmSw/T29Dp4npU6f3nT65dIzXn//rZP11J2wqHeNzm69O1qf/vD1Zb9+QvrgIQG/JxUOOF97T\nm2XGoTfLjENvlhmH3iwzDr1ZZhx6s8w49GaZcejNMuOTcyypde4pyfrec8v3G2+ZsiNZv2fPJaVj\nPP3TBcn6aT/bk6znMkHGUHhPb5YZh94sMw69WWYcerPMOPRmmXHozTLj0JtlxsfpLalv1rRkvWdm\n+kIWAI/uPDNZ37K6s3SMMx4+lKzHhufS9UwmyBgK7+nNMu
PQm2XGoTfLjENvlhmH3iwzDr1ZZhx6\ns8w49GaZKT05R9J84FvAHCCAZRHxNUm3AB8GdhWrfjoiHhqtRq0xWva9nKyftGpK6Rj71s9N1s9Y\nkz7xBqB1TfoKNf09PaVjWMVQzsjrBT4ZEU9Imgo8Lml5UftqRHxp9Nozs3orDX1EbAO2Fbf3S3oS\n6BrtxsxsdBzTa3pJpwEXAo8Viz4mabWkOyXNqHNvZjYKhhx6SVOA+4FPRMQ+4OvAAmARlWcCX67x\nc0slrZS08ih+3WXWaEMKvaQ2KoG/NyJ+ABAROyKiLyL6gduBiwf72YhYFhGLI2JxG+nLHpvZ6CsN\nvSQBdwBPRsRXqpZXfx7yWmBt/dszs3obyrv3bwLeC6yRtKpY9mngOkmLqBzG2wTcMCodmlldKSLG\nbmPSLmBz1aJZwO4xa2D43Gd9jYc+x0OP8Lt9nhoRs1M/MKah/52NSysjYnHDGhgi91lf46HP8dAj\nDK9Pn4ZrlhmH3iwzjQ79sgZvf6jcZ32Nhz7HQ48wjD4b+prezMZeo/f0ZjbGHHqzzDQs9JKulPS0\npGck3dyoPspI2iRpjaRVklY2up8BxYecdkpaW7VspqTlkjYU3xv6IagaPd4iaWvxeK6SdFUjeyx6\nmi/pEUnrJa2TdGOxvNkez1p9HtNj2pDX9JJagV8DbwG6gV8C10XE+jFvpoSkTcDiiGiqEzUk/QFw\nAPhWRFxQLPsisCcibiv+I50RETc1WY+3AAeaaR6G4pTyzuo5I4BrgPfTXI9nrT7fxTE8po3a018M\nPBMRGyPiCPA94OoG9TIuRcSjwJ5XLb4auLu4fTeVfxANU6PHphMR2yLiieL2fmBgzohmezxr9XlM\nGhX6LmBL1f1umndijgB+IulxSUsb3UyJOcWkJwDbqUxx1oyadh6GV80Z0bSP50jmtvAbeeUujYiL\ngLcBHy2esja9qLxua8bjsUOah6ERBpkz4rea6fEc7twWAxoV+q3A/Kr784plTScithbfdwI/pMa8\nAU1ix8BHnovvOxvcz+8Y6jwMY22wOSNowsdzJHNbDGhU6H8JnCXpdEntwLuBBxvUS02SJhdvmCBp\nMnAFzT1vwIPA9cXt64EHGtjLoJpxHoZac0bQZI9n3ea2iIiGfAFXUXkH/1ngM43qo6THM4BfFV/r\nmqlP4LtUnsodpfKeyAeBk4AVwAbgYWBmE/Z4D7AGWE0lVJ1N8FheSuWp+2pgVfF1VRM+nrX6PKbH\n1KfhmmXGb+SZZcahN8uMQ2+WGYfeLDMOvVlmHHqzzDj0Zpn5fwGpJA7BvjhWAAAAAElFTkSuQmCC\n",
            "text/plain": [
              "<Figure size 432x288 with 1 Axes>"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
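    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "By default the convolution shrinks the image from 28x28 to 26x26. If we want the blurred image to keep the input size, we can pad the input (a quick sketch using ``padding=1``, which for a 3x3 kernel preserves the spatial dimensions):"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# padding=1 adds a border of zeros, so a 3x3 kernel keeps the 28x28 size\n",
        "conv_same = nn.Conv2d(1, 1, 3, padding=1, bias=False)\n",
        "conv_same.weight.data[:] = gaussian_kernel\n",
        "\n",
        "blurred_same = conv_same(image_torch)\n",
        "print(blurred_same.shape)  # torch.Size([1, 1, 28, 28])"
      ],
      "execution_count": 0,
      "outputs": []
    },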
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "zhf848TRMTid",
        "colab_type": "text"
      },
      "source": [
        "As we can see, the image is blurred as expected.\n",
        "\n",
        "In practice, we learn many kernels at a time. In this example, we take in an RGB image (3 channels) and output a 16-channel image. After an activation function, that output could be used as the input to another `Conv2d` module."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "noG9FyJ0MTie",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "im_channels = 3 # RGB images have 3 input channels; grayscale images have 1\n",
        "out_channels = 16 # this is a hyperparameter we can tune\n",
        "kernel_size = 3 # this is another hyperparameter we can tune\n",
        "batch_size = 4\n",
        "image_width = 32\n",
        "image_height = 32\n",
        "\n",
        "im = torch.randn(batch_size, im_channels, image_width, image_height)\n",
        "\n",
        "m = nn.Conv2d(im_channels, out_channels, kernel_size)\n",
        "convolved = m(im) # it is a module so we can call it\n",
        "\n",
        "print('im shape', im.shape)\n",
        "print('convolved im shape', convolved.shape)"
      ],
      "execution_count": 0,
      "outputs": []
    },
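    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a quick sanity check (a sketch assuming the ``nn.Conv2d`` defaults of stride 1 and no padding), each spatial dimension shrinks by ``kernel_size - 1``, so the 32x32 input above becomes 30x30:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# with stride 1 and no padding: output size = input size - kernel_size + 1\n",
        "expected_hw = image_width - kernel_size + 1\n",
        "print('expected spatial size:', expected_hw)\n",
        "assert convolved.shape == (batch_size, out_channels, expected_hw, expected_hw)"
      ],
      "execution_count": 0,
      "outputs": []
    },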
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "NwsmNTYLMTig",
        "colab_type": "text"
      },
      "source": [
        "## Useful links:\n",
        "- [60 minute PyTorch Tutorial](https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html)\n",
        "- [PyTorch Docs](https://pytorch.org/docs/stable/index.html)\n",
        "- [Lecture notes on Auto-Diff](https://courses.cs.washington.edu/courses/cse446/19wi/notes/auto-diff.pdf)\n",
        "\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "0d77LgKaMTih",
        "colab_type": "text"
      },
      "source": [
        "\n",
        "Custom Datasets, DataLoaders\n",
        "===================================================\n",
        "This is modified from the official PyTorch tutorial.\n",
        "**Author**: `Sasank Chilamkurthy <https://chsasank.github.io>`_\n",
        "\n",
        "A lot of effort in solving any machine learning problem goes into\n",
        "preparing the data. PyTorch provides many tools to make data loading\n",
        "easy and, hopefully, to make your code more readable. In this tutorial,\n",
        "we will see how to load and preprocess/augment data from a non-trivial\n",
        "dataset.\n",
        "\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "OyN-mHRoMTii",
        "colab_type": "text"
      },
      "source": [
        "Dataset class\n",
        "-------------\n",
        "\n",
        "``torch.utils.data.Dataset`` is an abstract class representing a\n",
        "dataset.\n",
        "Your custom dataset should inherit ``Dataset`` and override the following\n",
        "methods:\n",
        "\n",
        "-  ``__len__`` so that ``len(dataset)`` returns the size of the dataset.\n",
        "-  ``__getitem__`` to support indexing, so that ``dataset[i]`` returns the\n",
        "   $i$-th sample.\n",
        "\n",
        "Below we create a minimal ``FakeDataset`` that simply wraps a pair of\n",
        "in-memory arrays. For a larger dataset (e.g. a folder of images), you\n",
        "would typically read lightweight metadata such as a csv in ``__init__``\n",
        "but defer loading the images themselves to ``__getitem__``. This is\n",
        "memory efficient because the images are not all stored in memory at\n",
        "once but read as required.\n",
        "\n",
        "A dataset can also take an optional ``transform`` argument so that any\n",
        "required preprocessing or augmentation can be applied to each sample\n",
        "as it is fetched."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "I302HaeiMTij",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "from torch.utils.data import Dataset, DataLoader\n",
        "\n",
        "\n",
        "class FakeDataset(Dataset):\n",
        "\n",
        "    def __init__(self, x, y):\n",
        "        self.x = x\n",
        "        self.y = y\n",
        "\n",
        "    def __len__(self):\n",
        "        return len(self.x)\n",
        "\n",
        "    def __getitem__(self, idx):\n",
        "        return self.x[idx], self.y[idx]"
      ],
      "execution_count": 0,
      "outputs": []
    },
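    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Since ``FakeDataset`` implements ``__len__`` and ``__getitem__``, it already supports ``len()`` and indexing. A minimal sketch using small numpy arrays:"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# a tiny dataset: 5 samples with 3 features each\n",
        "tiny = FakeDataset(np.arange(15).reshape(5, 3), np.arange(5))\n",
        "print(len(tiny))   # 5\n",
        "print(tiny[2])     # (array([6, 7, 8]), 2)"
      ],
      "execution_count": 0,
      "outputs": []
    },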
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "vOgpiQcIMTik",
        "colab_type": "text"
      },
      "source": [
        "However, we are losing a lot of features by using a simple ``for`` loop to\n",
        "iterate over the data. In particular, we are missing out on:\n",
        "\n",
        "-  Batching the data\n",
        "-  Shuffling the data\n",
        "-  Loading the data in parallel using ``multiprocessing`` workers.\n",
        "\n",
        "``torch.utils.data.DataLoader`` is an iterable that provides all these\n",
        "features. The parameters used below should be clear. One parameter of\n",
        "interest is ``collate_fn``: you can specify exactly how the samples\n",
        "should be combined into a batch by passing a ``collate_fn``. However,\n",
        "the default collate function works fine for most use cases."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "TMHL8d06MTik",
        "colab_type": "code",
        "outputId": "a5fc7f8a-0364-43fa-bf4c-bf63b9c71881",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        }
      },
      "source": [
        "x = np.random.rand(100, 10)\n",
        "y = np.random.rand(100)\n",
        "\n",
        "dataset = FakeDataset(x, y)\n",
        "dataloader = DataLoader(dataset, batch_size=4,\n",
        "                        shuffle=True, num_workers=4)\n",
        "\n",
        "for i_batch, sample_batched in enumerate(dataloader):\n",
        "    print(i_batch, sample_batched)"
      ],
      "execution_count": 0,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "0 [tensor([[0.8425, 0.0163, 0.0450, 0.9658, 0.1704, 0.7332, 0.5599, 0.3465, 0.5721,\n",
            "         0.3130],\n",
            "        [0.6278, 0.0859, 0.0657, 0.6674, 0.6366, 0.9306, 0.8121, 0.6326, 0.5322,\n",
            "         0.7793],\n",
            "        [0.1630, 0.2736, 0.0070, 0.1457, 0.0807, 0.6849, 0.8761, 0.0502, 0.7291,\n",
            "         0.2466],\n",
            "        [0.3544, 0.1138, 0.4130, 0.6864, 0.0309, 0.2801, 0.9899, 0.4980, 0.8004,\n",
            "         0.2637]], dtype=torch.float64), tensor([0.3623, 0.9702, 0.6671, 0.3185])]\n",
            "1 [tensor([[0.1096, 0.4694, 0.7044, 0.3765, 0.8890, 0.9266, 0.2414, 0.1361, 0.3555,\n",
            "         0.6641],\n",
            "        [0.6713, 0.4263, 0.9188, 0.9392, 0.6510, 0.8450, 0.1234, 0.0928, 0.8625,\n",
            "         0.0309],\n",
            "        [0.9241, 0.5794, 0.1106, 0.4831, 0.1394, 0.8072, 0.4977, 0.0555, 0.7470,\n",
            "         0.6890],\n",
            "        [0.5961, 0.1317, 0.5008, 0.0683, 0.5264, 0.8927, 0.2962, 0.2142, 0.7640,\n",
            "         0.8714]], dtype=torch.float64), tensor([0.2701, 0.0337, 0.0983, 0.8594])]\n",
            "2 [tensor([[0.9806, 0.3530, 0.0961, 0.5138, 0.8475, 0.7146, 0.5949, 0.2177, 0.5165,\n",
            "         0.7706],\n",
            "        [0.1328, 0.4722, 0.4579, 0.2818, 0.5141, 0.8366, 0.0479, 0.0689, 0.7267,\n",
            "         0.3794],\n",
            "        [0.9347, 0.8148, 0.4499, 0.0176, 0.4606, 0.1236, 0.1444, 0.2298, 0.5406,\n",
            "         0.6820],\n",
            "        [0.5355, 0.7904, 0.9605, 0.7997, 0.1974, 0.2357, 0.5871, 0.5880, 0.8157,\n",
            "         0.7967]], dtype=torch.float64), tensor([0.8239, 0.3083, 0.6488, 0.3752])]\n",
            "3 [tensor([[0.3225, 0.1331, 0.9460, 0.9248, 0.6805, 0.3465, 0.6849, 0.1555, 0.6478,\n",
            "         0.7804],\n",
            "        [0.3914, 0.0533, 0.7894, 0.8275, 0.7835, 0.2978, 0.4156, 0.8235, 0.9098,\n",
            "         0.8376],\n",
            "        [0.9852, 0.7606, 0.0070, 0.6510, 0.2410, 0.3949, 0.1414, 0.7690, 0.8134,\n",
            "         0.7123],\n",
            "        [0.3330, 0.9045, 0.2981, 0.6869, 0.8339, 0.6206, 0.7687, 0.8025, 0.1287,\n",
            "         0.2582]], dtype=torch.float64), tensor([0.0759, 0.8951, 0.8446, 0.4160])]\n",
            "4 [tensor([[0.5660, 0.5365, 0.7909, 0.4810, 0.2659, 0.1149, 0.3142, 0.5249, 0.2611,\n",
            "         0.9927],\n",
            "        [0.1415, 0.0567, 0.5027, 0.6150, 0.1651, 0.4861, 0.4452, 0.6551, 0.6557,\n",
            "         0.7662],\n",
            "        [0.1333, 0.8867, 0.5232, 0.9336, 0.1000, 0.0323, 0.3984, 0.9970, 0.4848,\n",
            "         0.4662],\n",
            "        [0.1505, 0.9721, 0.3268, 0.8636, 0.4060, 0.1426, 0.2754, 0.0926, 0.8451,\n",
            "         0.1824]], dtype=torch.float64), tensor([0.2815, 0.1673, 0.6319, 0.7661])]\n",
            "5 [tensor([[0.1754, 0.2864, 0.7618, 0.0244, 0.3647, 0.9209, 0.0891, 0.1538, 0.1855,\n",
            "         0.7084],\n",
            "        [0.3389, 0.7122, 0.8899, 0.6695, 0.9767, 0.1120, 0.2055, 0.7342, 0.6544,\n",
            "         0.9195],\n",
            "        [0.5439, 0.1739, 0.6939, 0.3658, 0.5159, 0.9156, 0.0107, 0.5253, 0.1156,\n",
            "         0.4393],\n",
            "        [0.4609, 0.6859, 0.3709, 0.9270, 0.6991, 0.5749, 0.8408, 0.4865, 0.5438,\n",
            "         0.6000]], dtype=torch.float64), tensor([0.8977, 0.1774, 0.1617, 0.8094])]\n",
            "6 [tensor([[0.1443, 0.5263, 0.4120, 0.3804, 0.8041, 0.9519, 0.1409, 0.5286, 0.9267,\n",
            "         0.2848],\n",
            "        [0.3207, 0.4271, 0.3206, 0.9046, 0.7617, 0.8463, 0.6925, 0.1443, 0.0974,\n",
            "         0.5341],\n",
            "        [0.2514, 0.5027, 0.7723, 0.5894, 0.3749, 0.9916, 0.5685, 0.2184, 0.4930,\n",
            "         0.4290],\n",
            "        [0.0795, 0.8094, 0.9198, 0.5961, 0.0574, 0.6314, 0.6500, 0.2977, 0.3418,\n",
            "         0.7521]], dtype=torch.float64), tensor([0.4732, 0.8877, 0.3861, 0.6891])]\n",
            "7 [tensor([[0.0472, 0.2678, 0.7080, 0.2435, 0.6636, 0.8487, 0.3345, 0.3422, 0.7687,\n",
            "         0.3709],\n",
            "        [0.3773, 0.7661, 0.9189, 0.0614, 0.9269, 0.1185, 0.6387, 0.1429, 0.6989,\n",
            "         0.3441],\n",
            "        [0.2442, 0.1764, 0.6082, 0.0751, 0.6153, 0.5185, 0.6593, 0.6469, 0.2874,\n",
            "         0.4393],\n",
            "        [0.3332, 0.0981, 0.7359, 0.7838, 0.3023, 0.2381, 0.5067, 0.4706, 0.2707,\n",
            "         0.3132]], dtype=torch.float64), tensor([0.0040, 0.0164, 0.0552, 0.4883])]\n",
            "8 [tensor([[0.4232, 0.2060, 0.8339, 0.6456, 0.2742, 0.0868, 0.9288, 0.2806, 0.1588,\n",
            "         0.8433],\n",
            "        [0.1711, 0.7196, 0.2643, 0.3807, 0.5423, 0.7101, 0.6998, 0.4714, 0.5880,\n",
            "         0.9182],\n",
            "        [0.6122, 0.3149, 0.4599, 0.4874, 0.5861, 0.7513, 0.2208, 0.0972, 0.9090,\n",
            "         0.7897],\n",
            "        [0.0274, 0.2815, 0.4425, 0.2692, 0.4284, 0.3600, 0.6660, 0.2705, 0.7473,\n",
            "         0.2071]], dtype=torch.float64), tensor([0.4983, 0.5712, 0.0727, 0.9358])]\n",
            "9 [tensor([[0.8506, 0.4169, 0.4541, 0.9331, 0.0052, 0.1280, 0.9007, 0.1913, 0.8718,\n",
            "         0.0106],\n",
            "        [0.5955, 0.0951, 0.6246, 0.7979, 0.9589, 0.3035, 0.9799, 0.8582, 0.6784,\n",
            "         0.7507],\n",
            "        [0.2766, 0.8681, 0.0590, 0.0525, 0.6620, 0.5223, 0.9452, 0.1036, 0.2693,\n",
            "         0.3226],\n",
            "        [0.5691, 0.1234, 0.0489, 0.6661, 0.6113, 0.6277, 0.0384, 0.6170, 0.2797,\n",
            "         0.8429]], dtype=torch.float64), tensor([0.6756, 0.3991, 0.4770, 0.3899])]\n",
            "10 [tensor([[0.0431, 0.4885, 0.0495, 0.2853, 0.8376, 0.3061, 0.0888, 0.8225, 0.8141,\n",
            "         0.8305],\n",
            "        [0.1141, 0.1068, 0.5190, 0.8968, 0.4951, 0.1997, 0.6193, 0.2545, 0.8648,\n",
            "         0.8976],\n",
            "        [0.5321, 0.4342, 0.0405, 0.1812, 0.6136, 0.7870, 0.0483, 0.9592, 0.4478,\n",
            "         0.5764],\n",
            "        [0.3276, 0.4199, 0.6992, 0.4885, 0.0958, 0.6588, 0.2741, 0.4439, 0.1695,\n",
            "         0.8619]], dtype=torch.float64), tensor([0.9373, 0.8928, 0.8922, 0.3145])]\n",
            "11 [tensor([[0.3854, 0.9316, 0.0298, 0.9188, 0.8910, 0.3670, 0.8487, 0.1697, 0.0127,\n",
            "         0.0498],\n",
            "        [0.6574, 0.5226, 0.4919, 0.2658, 0.3278, 0.4690, 0.0562, 0.9769, 0.1475,\n",
            "         0.5129],\n",
            "        [0.4771, 0.3198, 0.2225, 0.2465, 0.5702, 0.4464, 0.7550, 0.1298, 0.1753,\n",
            "         0.1040],\n",
            "        [0.7026, 0.0047, 0.9629, 0.5305, 0.8144, 0.2466, 0.7914, 0.4742, 0.6611,\n",
            "         0.9934]], dtype=torch.float64), tensor([0.0839, 0.3191, 0.5800, 0.1524])]\n",
            "12 [tensor([[0.6947, 0.4833, 0.4664, 0.0637, 0.6154, 0.3276, 0.2890, 0.9505, 0.2479,\n",
            "         0.4722],\n",
            "        [0.0165, 0.3028, 0.3071, 0.0787, 0.2653, 0.3550, 0.3574, 0.0368, 0.0728,\n",
            "         0.8916],\n",
            "        [0.2653, 0.5330, 0.0414, 0.4293, 0.3036, 0.1316, 0.0314, 0.6894, 0.7553,\n",
            "         0.2532],\n",
            "        [0.7834, 0.7959, 0.4286, 0.5653, 0.3529, 0.3601, 0.0417, 0.5342, 0.4847,\n",
            "         0.0560]], dtype=torch.float64), tensor([0.1866, 0.1207, 0.0190, 0.0361])]\n",
            "13 [tensor([[0.1089, 0.5596, 0.5591, 0.7019, 0.5667, 0.2379, 0.3286, 0.0240, 0.3211,\n",
            "         0.1834],\n",
            "        [0.1796, 0.2674, 0.7746, 0.9959, 0.3893, 0.6051, 0.6899, 0.3247, 0.2102,\n",
            "         0.8673],\n",
            "        [0.4786, 0.1964, 0.8945, 0.9900, 0.8276, 0.1961, 0.2638, 0.0117, 0.1773,\n",
            "         0.6262],\n",
            "        [0.9219, 0.4175, 0.7378, 0.7466, 0.6516, 0.3478, 0.3196, 0.4792, 0.0107,\n",
            "         0.0097]], dtype=torch.float64), tensor([0.3808, 0.3140, 0.0566, 0.3341])]\n",
            "14 [tensor([[0.7485, 0.2783, 0.3977, 0.0927, 0.1109, 0.4401, 0.9317, 0.4913, 0.9146,\n",
            "         0.5833],\n",
            "        [0.2575, 0.8604, 0.1631, 0.3485, 0.5333, 0.5776, 0.7931, 0.6992, 0.4117,\n",
            "         0.0477],\n",
            "        [0.6240, 0.6081, 0.1885, 0.8287, 0.0992, 0.9690, 0.7310, 0.4237, 0.0929,\n",
            "         0.3810],\n",
            "        [0.8435, 0.2153, 0.4496, 0.2253, 0.4997, 0.3479, 0.4391, 0.0605, 0.4542,\n",
            "         0.0157]], dtype=torch.float64), tensor([0.5321, 0.3106, 0.2295, 0.6350])]\n",
            "15 [tensor([[0.4420, 0.6948, 0.1094, 0.5931, 0.7619, 0.4407, 0.4690, 0.3976, 0.9607,\n",
            "         0.3494],\n",
            "        [0.7332, 0.8916, 0.8853, 0.7412, 0.7929, 0.9534, 0.1238, 0.0607, 0.1257,\n",
            "         0.0268],\n",
            "        [0.1370, 0.7772, 0.2095, 0.0615, 0.9563, 0.1148, 0.7654, 0.8568, 0.8793,\n",
            "         0.3518],\n",
            "        [0.5042, 0.4699, 0.3214, 0.7640, 0.5101, 0.6616, 0.3617, 0.4770, 0.0069,\n",
            "         0.1894]], dtype=torch.float64), tensor([0.1883, 0.9971, 0.0032, 0.4412])]\n",
            "16 [tensor([[0.2195, 0.3428, 0.2049, 0.9051, 0.9741, 0.3217, 0.1570, 0.6321, 0.9287,\n",
            "         0.0520],\n",
            "        [0.6150, 0.6825, 0.1255, 0.7408, 0.5374, 0.0448, 0.4369, 0.7070, 0.6105,\n",
            "         0.2682],\n",
            "        [0.2921, 0.9617, 0.9515, 0.3262, 0.4368, 0.5069, 0.7174, 0.9699, 0.8135,\n",
            "         0.2042],\n",
            "        [0.0705, 0.8939, 0.1630, 0.9421, 0.4714, 0.8645, 0.1503, 0.4629, 0.6484,\n",
            "         0.6583]], dtype=torch.float64), tensor([0.1052, 0.2601, 0.8708, 0.7861])]\n",
            "17 [tensor([[0.2729, 0.1071, 0.9125, 0.4087, 0.2112, 0.2144, 0.4645, 0.4689, 0.5563,\n",
            "         0.3527],\n",
            "        [0.3428, 0.9303, 0.4622, 0.7690, 0.5182, 0.2178, 0.8088, 0.3814, 0.2451,\n",
            "         0.0097],\n",
            "        [0.1366, 0.9362, 0.8221, 0.4098, 0.0233, 0.8347, 0.3028, 0.7414, 0.2492,\n",
            "         0.3986],\n",
            "        [0.5625, 0.4209, 0.3199, 0.5329, 0.4318, 0.4624, 0.7504, 0.0850, 0.8262,\n",
            "         0.8674]], dtype=torch.float64), tensor([0.2831, 0.3401, 0.1860, 0.6624])]\n",
            "18 [tensor([[0.1269, 0.5661, 0.6663, 0.0428, 0.6780, 0.2926, 0.7766, 0.1815, 0.7354,\n",
            "         0.6291],\n",
            "        [0.9117, 0.1700, 0.2482, 0.3003, 0.1531, 0.6073, 0.5338, 0.5332, 0.2620,\n",
            "         0.0839],\n",
            "        [0.0994, 0.4514, 0.3857, 0.9796, 0.9205, 0.1876, 0.3074, 0.2653, 0.4389,\n",
            "         0.1293],\n",
            "        [0.6515, 0.9134, 0.5027, 0.7137, 0.3813, 0.0376, 0.6971, 0.9692, 0.9135,\n",
            "         0.2692]], dtype=torch.float64), tensor([0.6234, 0.0576, 0.2245, 0.6051])]\n",
            "19 [tensor([[0.8126, 0.4741, 0.5860, 0.3143, 0.0428, 0.0722, 0.0376, 0.9781, 0.0124,\n",
            "         0.2722],\n",
            "        [0.5806, 0.0531, 0.8747, 0.0568, 0.1727, 0.7886, 0.5517, 0.1274, 0.0775,\n",
            "         0.2411],\n",
            "        [0.7651, 0.3097, 0.8265, 0.7565, 0.3180, 0.9248, 0.9444, 0.4895, 0.4972,\n",
            "         0.6315],\n",
            "        [0.3985, 0.6418, 0.9978, 0.6291, 0.6545, 0.5499, 0.6170, 0.3744, 0.1309,\n",
            "         0.0333]], dtype=torch.float64), tensor([0.4335, 0.1575, 0.8146, 0.9767])]\n",
            "20 [tensor([[0.5020, 0.6228, 0.7321, 0.8117, 0.8293, 0.7492, 0.4505, 0.8574, 0.3756,\n",
            "         0.7858],\n",
            "        [0.0157, 0.9246, 0.2020, 0.1813, 0.5590, 0.1894, 0.8663, 0.3976, 0.6099,\n",
            "         0.7796],\n",
            "        [0.1478, 0.4510, 0.5293, 0.0144, 0.6534, 0.0011, 0.7619, 0.8707, 0.0184,\n",
            "         0.0679],\n",
            "        [0.7997, 0.7779, 0.2931, 0.6034, 0.3622, 0.1584, 0.6144, 0.7697, 0.3829,\n",
            "         0.0093]], dtype=torch.float64), tensor([0.8451, 0.1214, 0.9799, 0.2867])]\n",
            "21 [tensor([[0.4305, 0.7886, 0.6214, 0.4531, 0.4469, 0.7205, 0.1592, 0.1842, 0.5187,\n",
            "         0.6969],\n",
            "        [0.7630, 0.1256, 0.6109, 0.5586, 0.9137, 0.5597, 0.6269, 0.3198, 0.3077,\n",
            "         0.3593],\n",
            "        [0.7059, 0.0629, 0.4574, 0.6665, 0.9461, 0.6405, 0.6298, 0.0565, 0.9597,\n",
            "         0.4506],\n",
            "        [0.3676, 0.1686, 0.0707, 0.4301, 0.8522, 0.6364, 0.6080, 0.8737, 0.3691,\n",
            "         0.1897]], dtype=torch.float64), tensor([0.2027, 0.4832, 0.9747, 0.2156])]\n",
            "22 [tensor([[0.2407, 0.3158, 0.8096, 0.1740, 0.9177, 0.7346, 0.3027, 0.5294, 0.2325,\n",
            "         0.8409],\n",
            "        [0.0633, 0.2424, 0.2661, 0.7229, 0.8196, 0.0043, 0.3061, 0.9725, 0.4043,\n",
            "         0.5097],\n",
            "        [0.3322, 0.7527, 0.8107, 0.2407, 0.2330, 0.0650, 0.1372, 0.2534, 0.4211,\n",
            "         0.1260],\n",
            "        [0.6050, 0.2824, 0.6153, 0.2513, 0.0531, 0.0236, 0.7016, 0.6394, 0.9860,\n",
            "         0.4539]], dtype=torch.float64), tensor([0.3310, 0.4057, 0.7878, 0.9452])]\n",
            "23 [tensor([[0.3348, 0.1017, 0.6351, 0.7048, 0.8362, 0.4930, 0.7561, 0.2674, 0.9701,\n",
            "         0.8510],\n",
            "        [0.5713, 0.6165, 0.7750, 0.5141, 0.1918, 0.8182, 0.5630, 0.2650, 0.9904,\n",
            "         0.0493],\n",
            "        [0.4832, 0.9956, 0.4878, 0.2249, 0.9667, 0.6957, 0.0909, 0.6688, 0.8366,\n",
            "         0.0182],\n",
            "        [0.7553, 0.7957, 0.3804, 0.9051, 0.6033, 0.3121, 0.1475, 0.6260, 0.8481,\n",
            "         0.0456]], dtype=torch.float64), tensor([0.3492, 0.8794, 0.3888, 0.2310])]\n",
            "24 [tensor([[0.6934, 0.8673, 0.6935, 0.4884, 0.7295, 0.3680, 0.1332, 0.1841, 0.8962,\n",
            "         0.7861],\n",
            "        [0.3471, 0.9848, 0.1266, 0.3642, 0.3110, 0.2262, 0.7783, 0.4433, 0.5535,\n",
            "         0.4736],\n",
            "        [0.3525, 0.3487, 0.2770, 0.5047, 0.1117, 0.9636, 0.4308, 0.3456, 0.0113,\n",
            "         0.2805],\n",
            "        [0.2460, 0.8893, 0.5499, 0.0054, 0.8566, 0.2870, 0.2730, 0.5990, 0.5467,\n",
            "         0.3042]], dtype=torch.float64), tensor([0.0302, 0.0902, 0.4506, 0.3607])]\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "-m-ml0NoMTim",
        "colab_type": "text"
      },
      "source": [
        "Mixed Presision Training\n",
        "===================================================\n",
        "**Author**: `Chi-Liang Liu <https://liangtaiwan.github.io>`\n",
        "**Ref**: https://github.com/NVIDIA/apex\n",
        "Using mixed precision to train your networks can be:\n",
        "- 2-4x faster\n",
        "- memory-efficient\n",
        "in only 3 lines of Python."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "a94qxAC_MTin",
        "colab_type": "text"
      },
      "source": [
        "# Apex \n",
        "\n",
        "NVIDIA-maintained utilities to streamline mixed precision and distributed training in Pytorch. Some of the code here will be included in upstream Pytorch eventually. The intention of Apex is to make up-to-date utilities available to users as quickly as possible."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "fw6ociH0MTin",
        "colab_type": "text"
      },
      "source": [
        "## apex.amp\n",
        "\n",
        "Amp allows users to easily experiment with different pure and mixed precision modes.\n",
        "Commonly-used default modes are chosen by\n",
        "selecting an \"optimization level\" or ``opt_level``; each ``opt_level`` establishes a set of\n",
        "properties that govern Amp's implementation of pure or mixed precision training.\n",
        "Finer-grained control of how a given ``opt_level`` behaves can be achieved by passing values for\n",
        "particular properties directly to ``amp.initialize``.  These manually specified values\n",
        "override the defaults established by the ``opt_level``."
      ]
    },
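    {
      "cell_type": "markdown",
      "metadata": {
        "id": "amp-properties-note",
        "colab_type": "text"
      },
      "source": [
        "For example, ``opt_level=\"O2\"`` casts the model to FP16 while keeping batchnorm in FP32, and individual properties such as ``loss_scale`` can be overridden when calling ``amp.initialize``. A minimal sketch based on the Apex documentation (the particular values chosen here are illustrative, not recommendations):\n",
        "\n",
        "```python\n",
        "model, optimizer = amp.initialize(\n",
        "    model, optimizer,\n",
        "    opt_level=\"O2\",              # \"almost FP16\" mixed precision\n",
        "    keep_batchnorm_fp32=True,    # explicitly override a property of the opt_level\n",
        "    loss_scale=\"dynamic\",        # the default; can also be a fixed float\n",
        ")\n",
        "```\n",
        "\n",
        "Any property passed this way takes precedence over the default established by the ``opt_level``."
      ]
    },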
    {
      "cell_type": "code",
      "metadata": {
        "id": "IxP-tvHcMTio",
        "colab_type": "code",
        "colab": {}
      },
      "source": [
        "from apex import amp\n",
        "\n",
        "# Declare model and optimizer as usual, with default (FP32) precision\n",
        "model = torch.nn.Linear(10, 100).cuda()\n",
        "optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)\n",
        "\n",
        "# Allow Amp to perform casts as required by the opt_level\n",
        "model, optimizer = amp.initialize(model, optimizer, opt_level=\"O1\")\n",
        "...\n",
        "# loss.backward() becomes:\n",
        "with amp.scale_loss(loss, optimizer) as scaled_loss:\n",
        "    scaled_loss.backward()\n",
        "..."
      ],
      "execution_count": 0,
      "outputs": []
    }
  ]
}