{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 3.2 autograd\n",
    "\n",
    "用Tensor训练网络很方便，但从上一小节最后的线性回归例子来看，反向传播过程需要手动实现。这对于像线性回归等较为简单的模型来说，还可以应付，但实际使用中经常出现非常复杂的网络结构，此时如果手动实现反向传播，不仅费时费力，而且容易出错，难以检查。torch.autograd就是为方便用户使用，而专门开发的一套自动求导引擎，它能够根据输入和前向传播过程自动构建计算图，并执行反向传播。\n",
    "\n",
    "计算图(Computation Graph)是现代深度学习框架如PyTorch和TensorFlow等的核心，其为高效自动求导算法——反向传播(Back Propogation)提供了理论支持，了解计算图在实际写程序过程中会有极大的帮助。本节将涉及一些基础的计算图知识，但并不要求读者事先对此有深入的了解。关于计算图的基础知识推荐阅读Christopher Olah的文章[^1]。\n",
    "\n",
    "[^1]: http://colah.github.io/posts/2015-08-Backprop/\n",
    "\n",
    "\n",
    "### 3.2.1 requires_grad\n",
    "PyTorch在autograd模块中实现了计算图的相关功能，autograd中的核心数据结构是Variable。从v0.4版本起，Variable和Tensor合并。我们可以认为需要求导(requires_grad)的tensor即Variable. autograd记录对tensor的操作记录用来构建计算图。\n",
    "\n",
    "Variable提供了大部分tensor支持的函数，但其不支持部分`inplace`函数，因这些函数会修改tensor自身，而在反向传播中，variable需要缓存原来的tensor来计算反向传播梯度。如果想要计算各个Variable的梯度，只需调用根节点variable的`backward`方法，autograd会自动沿着计算图反向传播，计算每一个叶子节点的梯度。\n",
    "\n",
    "`variable.backward(gradient=None, retain_graph=None, create_graph=None)`主要有如下参数：\n",
    "\n",
    "- grad_variables：形状与variable一致，对于`y.backward()`，grad_variables相当于链式法则${dz \\over dx}={dz \\over dy} \\times {dy \\over dx}$中的$\\textbf {dz} \\over \\textbf {dy}$。grad_variables也可以是tensor或序列。\n",
    "- retain_graph：反向传播需要缓存一些中间结果，反向传播之后，这些缓存就被清空，可通过指定这个参数不清空缓存，用来多次反向传播。\n",
    "- create_graph：对反向传播过程再次构建计算图，可通过`backward of backward`实现求高阶导数。\n",
    "\n",
    "上述描述可能比较抽象，如果没有看懂，不用着急，会在本节后半部分详细介绍，下面先看几个例子。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "from __future__ import print_function\n",
    "import torch as t"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[ 0.6743,  1.0805, -0.0148,  0.1897],\n",
       "        [-2.0161,  0.6986,  0.7225,  1.4667],\n",
       "        [ 0.4757, -1.4019,  0.3953,  0.9282]], requires_grad=True)"
      ]
     },
     "execution_count": 2,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "#在创建tensor的时候指定requires_grad\n",
    "a = t.randn(3,4, requires_grad=True)\n",
    "# 或者\n",
    "a = t.randn(3,4).requires_grad_()\n",
    "# 或者\n",
    "a = t.randn(3,4)\n",
    "a.requires_grad=True\n",
    "a"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[0., 0., 0., 0.],\n",
       "        [0., 0., 0., 0.],\n",
       "        [0., 0., 0., 0.]], requires_grad=True)"
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "b = t.zeros(3,4).requires_grad_()\n",
    "b"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[ 0.6743,  1.0805, -0.0148,  0.1897],\n",
       "        [-2.0161,  0.6986,  0.7225,  1.4667],\n",
       "        [ 0.4757, -1.4019,  0.3953,  0.9282]], grad_fn=<AddBackward0>)"
      ]
     },
     "execution_count": 4,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 也可写成c = a + b\n",
    "c = a.add(b)\n",
    "c"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "d = c.sum()\n",
    "d.backward() # 反向传播"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 6,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "d # d还是一个requires_grad=True的tensor,对它的操作需要慎重\n",
    "d.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[1., 1., 1., 1.],\n",
       "        [1., 1., 1., 1.],\n",
       "        [1., 1., 1., 1.]])"
      ]
     },
     "execution_count": 7,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "a.grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(True, True, True)"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 此处虽然没有指定c需要求导，但c依赖于a，而a需要求导，\n",
    "# 因此c的requires_grad属性会自动设为True\n",
    "a.requires_grad, b.requires_grad, c.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(True, True, False)"
      ]
     },
     "execution_count": 9,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 由用户创建的variable属于叶子节点，对应的grad_fn是None\n",
    "a.is_leaf, b.is_leaf, c.is_leaf"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 10,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# c.grad是None, 因c不是叶子节点，它的梯度是用来计算a的梯度\n",
    "# 所以虽然c.requires_grad = True,但其梯度计算完之后即被释放\n",
    "c.grad is None"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "计算下面这个函数的导函数：\n",
    "$$\n",
    "y = x^2\\bullet e^x\n",
    "$$\n",
    "它的导函数是：\n",
    "$$\n",
    "{dy \\over dx} = 2x\\bullet e^x + x^2 \\bullet e^x\n",
    "$$\n",
    "来看看autograd的计算结果与手动求导计算结果的误差。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [],
   "source": [
    "def f(x):\n",
    "    '''计算y'''\n",
    "    y = x**2 * t.exp(x)\n",
    "    return y\n",
    "\n",
    "def gradf(x):\n",
    "    '''手动求导函数'''\n",
    "    dx = 2*x*t.exp(x) + x**2*t.exp(x)\n",
    "    return dx"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[0.1231, 0.3861, 4.4141, 0.1981],\n",
       "        [0.1323, 0.2422, 0.0375, 0.5060],\n",
       "        [0.3397, 8.7858, 0.3481, 0.0198]], grad_fn=<MulBackward0>)"
      ]
     },
     "execution_count": 12,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x = t.randn(3,4, requires_grad = True)\n",
    "y = f(x)\n",
    "y"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[-0.4411,  1.9714, 11.9577, -0.4610],\n",
       "        [ 0.9825,  1.4459, -0.3101,  0.1116],\n",
       "        [-0.3939, 20.9747,  1.8385, -0.2412]])"
      ]
     },
     "execution_count": 13,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "y.backward(t.ones(y.size())) # gradient形状与y一致\n",
    "x.grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[-0.4411,  1.9714, 11.9577, -0.4610],\n",
       "        [ 0.9825,  1.4459, -0.3101,  0.1116],\n",
       "        [-0.3939, 20.9747,  1.8385, -0.2412]], grad_fn=<AddBackward0>)"
      ]
     },
     "execution_count": 14,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# autograd的计算结果与利用公式手动计算的结果一致\n",
    "gradf(x) "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 3.2.2 计算图\n",
    "\n",
    "PyTorch中`autograd`的底层采用了计算图，计算图是一种特殊的有向无环图（DAG），用于记录算子与变量之间的关系。一般用矩形表示算子，椭圆形表示变量。如表达式$ \\textbf {z = wx + b}$可分解为$\\textbf{y = wx}$和$\\textbf{z = y + b}$，其计算图如图3-3所示，图中`MUL`，`ADD`都是算子，$\\textbf{w}$，$\\textbf{x}$，$\\textbf{b}$即变量。\n",
    "\n",
    "![图3-3:computation graph](imgs/com_graph.svg)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "如上有向无环图中，$\\textbf{X}$和$\\textbf{b}$是叶子节点（leaf node），这些节点通常由用户自己创建，不依赖于其他变量。$\\textbf{z}$称为根节点，是计算图的最终目标。利用链式法则很容易求得各个叶子节点的梯度。\n",
    "$${\\partial z \\over \\partial b} = 1,\\space {\\partial z \\over \\partial y} = 1\\\\\n",
    "{\\partial y \\over \\partial w }= x,{\\partial y \\over \\partial x}= w\\\\\n",
    "{\\partial z \\over \\partial x}= {\\partial z \\over \\partial y} {\\partial y \\over \\partial x}=1 * w\\\\\n",
    "{\\partial z \\over \\partial w}= {\\partial z \\over \\partial y} {\\partial y \\over \\partial w}=1 * x\\\\\n",
    "$$\n",
    "而有了计算图，上述链式求导即可利用计算图的反向传播自动完成，其过程如图3-4所示。\n",
    "\n",
    "![图3-4：计算图的反向传播](imgs/com_graph_backward.svg)\n",
    "\n",
    "\n",
    "在PyTorch实现中，autograd会随着用户的操作，记录生成当前variable的所有操作，并由此建立一个有向无环图。用户每进行一个操作，相应的计算图就会发生改变。更底层的实现中，图中记录了操作`Function`，每一个变量在图中的位置可通过其`grad_fn`属性在图中的位置推测得到。在反向传播过程中，autograd沿着这个图从当前变量（根节点$\\textbf{z}$）溯源，可以利用链式求导法则计算所有叶子节点的梯度。每一个前向传播操作的函数都有与之对应的反向传播函数用来计算输入的各个variable的梯度，这些函数的函数名通常以`Backward`结尾。下面结合代码学习autograd的实现细节。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [],
   "source": [
    "x = t.ones(1)\n",
    "b = t.rand(1, requires_grad = True)\n",
    "w = t.rand(1, requires_grad = True)\n",
    "y = w * x # 等价于y=w.mul(x)\n",
    "z = y + b # 等价于z=y.add(b)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(False, True, True)"
      ]
     },
     "execution_count": 16,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x.requires_grad, b.requires_grad, w.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 17,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 虽然未指定y.requires_grad为True，但由于y依赖于需要求导的w\n",
    "# 故而y.requires_grad为True\n",
    "y.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(True, True, True)"
      ]
     },
     "execution_count": 18,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x.is_leaf, w.is_leaf, b.is_leaf"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(False, False)"
      ]
     },
     "execution_count": 19,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "y.is_leaf, z.is_leaf"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<AddBackward0 at 0x7fb73c7cd490>"
      ]
     },
     "execution_count": 20,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# grad_fn可以查看这个variable的反向传播函数，\n",
    "# z是add函数的输出，所以它的反向传播函数是AddBackward\n",
    "z.grad_fn "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "((<MulBackward0 at 0x7fb73c7cd7d0>, 0L),\n",
       " (<AccumulateGrad at 0x7fb73c7cdad0>, 0L))"
      ]
     },
     "execution_count": 21,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# next_functions保存grad_fn的输入，是一个tuple，tuple的元素也是Function\n",
    "# 第一个是y，它是乘法(mul)的输出，所以对应的反向传播函数y.grad_fn是MulBackward\n",
    "# 第二个是b，它是叶子节点，由用户创建，grad_fn为None，但是有\n",
    "z.grad_fn.next_functions "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 22,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# variable的grad_fn对应着和图中的function相对应\n",
    "z.grad_fn.next_functions[0][0] == y.grad_fn"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "((<AccumulateGrad at 0x7fb73c7cdc10>, 0L), (None, 0L))"
      ]
     },
     "execution_count": 23,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 第一个是w，叶子节点，需要求导，梯度是累加的\n",
    "# 第二个是x，叶子节点，不需要求导，所以为None\n",
    "y.grad_fn.next_functions"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(None, None)"
      ]
     },
     "execution_count": 24,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 叶子节点的grad_fn是None\n",
    "w.grad_fn,x.grad_fn"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "计算w的梯度的时候，需要用到x的数值(${\\partial y\\over \\partial w} = x $)，这些数值在前向过程中会保存成buffer，在计算完梯度之后会自动清空。为了能够多次反向传播需要指定`retain_graph`来保留这些buffer。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([1.])"
      ]
     },
     "execution_count": 25,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 使用retain_graph来保存buffer\n",
    "z.backward(retain_graph=True)\n",
    "w.grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([2.])"
      ]
     },
     "execution_count": 26,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 多次反向传播，梯度累加，这也就是w中AccumulateGrad标识的含义\n",
    "z.backward()\n",
    "w.grad"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "PyTorch使用的是动态图，它的计算图在每次前向传播时都是从头开始构建，所以它能够使用Python控制语句（如for、if等）根据需求创建计算图。这点在自然语言处理领域中很有用，它意味着你不需要事先构建所有可能用到的图的路径，图在运行时才构建。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([1.])"
      ]
     },
     "execution_count": 27,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "def abs(x):\n",
    "    if x.data[0]>0: return x\n",
    "    else: return -x\n",
    "x = t.ones(1,requires_grad=True)\n",
    "y = abs(x)\n",
    "y.backward()\n",
    "x.grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 28,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "tensor([-1.])\n"
     ]
    }
   ],
   "source": [
    "x = -1*t.ones(1)\n",
    "x = x.requires_grad_()\n",
    "y = abs(x)\n",
    "y.backward()\n",
    "print(x.grad)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([1.], grad_fn=<NegBackward>)"
      ]
     },
     "execution_count": 29,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "y"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 30,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([-1.], requires_grad=True)"
      ]
     },
     "execution_count": 30,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 31,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 31,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 32,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 32,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x.requires_grad\n",
    "cc=x*3\n",
    "cc.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 33,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([0., 0., 0., 6., 3., 2.])"
      ]
     },
     "execution_count": 33,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "def f(x):\n",
    "    result = 1\n",
    "    for ii in x:\n",
    "        if ii.item()>0: \n",
    "            result=ii*result\n",
    "    return result\n",
    "x = t.arange(-2,4,dtype=t.float32).requires_grad_()\n",
    "y = f(x) # y = x[3]*x[4]*x[5]\n",
    "y.backward()\n",
    "x.grad"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "变量的`requires_grad`属性默认为False，如果某一个节点requires_grad被设置为True，那么所有依赖它的节点`requires_grad`都是True。这其实很好理解，对于$ \\textbf{x}\\to \\textbf{y} \\to \\textbf{z}$，x.requires_grad = True，当需要计算$\\partial z \\over \\partial x$时，根据链式法则，$\\frac{\\partial z}{\\partial x} = \\frac{\\partial z}{\\partial y} \\frac{\\partial y}{\\partial x}$，自然也需要求$ \\frac{\\partial z}{\\partial y}$，所以y.requires_grad会被自动标为True. \n",
    "\n",
    "\n",
    "\n",
    "有些时候我们可能不希望autograd对tensor求导。认为求导需要缓存许多中间结构，增加额外的内存/显存开销，那么我们可以关闭自动求导。对于不需要反向传播的情景（如inference，即测试推理时），关闭自动求导可实现一定程度的速度提升，并节省约一半显存，因其不需要分配空间计算梯度。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 34,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(True, True, True)"
      ]
     },
     "execution_count": 34,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x = t.ones(1, requires_grad=True)\n",
    "w = t.rand(1, requires_grad=True)\n",
    "y = x * w\n",
    "# y依赖于w，而w.requires_grad = True\n",
    "x.requires_grad, w.requires_grad, y.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(False, True, False)"
      ]
     },
     "execution_count": 35,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "with t.no_grad():\n",
    "    x = t.ones(1)\n",
    "    w = t.rand(1, requires_grad = True)\n",
    "    y = x * w\n",
    "# y依赖于w和x，虽然w.requires_grad = True，但是y的requires_grad依旧为False\n",
    "x.requires_grad, w.requires_grad, y.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 36,
   "metadata": {},
   "outputs": [],
   "source": [
    "t.no_grad??"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 37,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(False, True, False)"
      ]
     },
     "execution_count": 37,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "t.set_grad_enabled(False)\n",
    "x = t.ones(1)\n",
    "w = t.rand(1, requires_grad = True)\n",
    "y = x * w\n",
    "# y依赖于w和x，虽然w.requires_grad = True，但是y的requires_grad依旧为False\n",
    "x.requires_grad, w.requires_grad, y.requires_grad\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 38,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<torch.autograd.grad_mode.set_grad_enabled at 0x7fb76c4a0d90>"
      ]
     },
     "execution_count": 38,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 恢复默认配置\n",
    "t.set_grad_enabled(True)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "如果我们想要修改tensor的数值，但是又不希望被autograd记录，那么我么可以对tensor.data进行操作"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 39,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[1., 1., 1., 1.],\n",
       "        [1., 1., 1., 1.],\n",
       "        [1., 1., 1., 1.]])"
      ]
     },
     "execution_count": 39,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "a = t.ones(3,4,requires_grad=True)\n",
    "b = t.ones(3,4,requires_grad=True)\n",
    "c = a * b\n",
    "\n",
    "a.data # 还是一个tensor"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 40,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "False"
      ]
     },
     "execution_count": 40,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "a.data.requires_grad # 但是已经是独立于计算图之外"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 41,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "False"
      ]
     },
     "execution_count": 41,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "d = a.data.sigmoid_() # sigmoid_ 是个inplace操作，会修改a自身的值\n",
    "d.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 42,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[0.7311, 0.7311, 0.7311, 0.7311],\n",
       "        [0.7311, 0.7311, 0.7311, 0.7311],\n",
       "        [0.7311, 0.7311, 0.7311, 0.7311]], requires_grad=True)"
      ]
     },
     "execution_count": 42,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "a "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "如果我们希望对tensor，但是又不希望被记录, 可以使用tensor.data 或者tensor.detach()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 43,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 43,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "a.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 44,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "False"
      ]
     },
     "execution_count": 44,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 近似于 tensor=a.data, 但是如果tensor被修改，backward可能会报错\n",
    "tensor = a.detach()\n",
    "tensor.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 45,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 统计tensor的一些指标，不希望被记录\n",
    "mean = tensor.mean()\n",
    "std = tensor.std()\n",
    "maximum = tensor.max()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 46,
   "metadata": {},
   "outputs": [],
   "source": [
    "tensor[0]=1\n",
    "# 下面会报错：　RuntimeError: one of the variables needed for gradient\n",
    "#             computation has been modified by an inplace operation\n",
    "#　因为 c=a*b, b的梯度取决于a，现在修改了tensor，其实也就是修改了a，梯度不再准确\n",
    "# c.sum().backward() "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "在反向传播过程中非叶子节点的导数计算完之后即被清空。若想查看这些变量的梯度，有两种方法：\n",
    "- 使用autograd.grad函数\n",
    "- 使用hook\n",
    "\n",
    "`autograd.grad`和`hook`方法都是很强大的工具，更详细的用法参考官方api文档，这里举例说明基础的使用。推荐使用`hook`方法，但是在实际使用中应尽量避免修改grad的值。"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 47,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(True, True, True)"
      ]
     },
     "execution_count": 47,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x = t.ones(3, requires_grad=True)\n",
    "w = t.rand(3, requires_grad=True)\n",
    "y = x * w\n",
    "# y依赖于w，而w.requires_grad = True\n",
    "z = y.sum()\n",
    "x.requires_grad, w.requires_grad, y.requires_grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 48,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([0.2772, 0.2943, 0.0755]), tensor([1., 1., 1.]), None)"
      ]
     },
     "execution_count": 48,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 非叶子节点grad计算完之后自动清空，y.grad是None\n",
    "z.backward()\n",
    "(x.grad, w.grad, y.grad)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 49,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([1., 1., 1.]),)"
      ]
     },
     "execution_count": 49,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 第一种方法：使用grad获取中间变量的梯度\n",
    "x = t.ones(3, requires_grad=True)\n",
    "w = t.rand(3, requires_grad=True)\n",
    "y = x * w\n",
    "z = y.sum()\n",
    "# z对y的梯度，隐式调用backward()\n",
    "t.autograd.grad(z, y)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 50,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "y的梯度： tensor([1., 1., 1.])\n"
     ]
    }
   ],
   "source": [
    "# 第二种方法：使用hook\n",
    "# hook是一个函数，输入是梯度，不应该有返回值\n",
    "def variable_hook(grad):\n",
    "    print('y的梯度：',grad)\n",
    "\n",
    "x = t.ones(3, requires_grad=True)\n",
    "w = t.rand(3, requires_grad=True)\n",
    "y = x * w\n",
    "# 注册hook\n",
    "hook_handle = y.register_hook(variable_hook)\n",
    "z = y.sum()\n",
    "z.backward()\n",
    "\n",
    "# 除非你每次都要用hook，否则用完之后记得移除hook\n",
    "hook_handle.remove()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "最后再来看看variable中grad属性和backward函数`grad_variables`参数的含义，这里直接下结论：\n",
    "\n",
    "- variable $\\textbf{x}$的梯度是目标函数${f(x)} $对$\\textbf{x}$的梯度，$\\frac{df(x)}{dx} = (\\frac {df(x)}{dx_0},\\frac {df(x)}{dx_1},...,\\frac {df(x)}{dx_N})$，形状和$\\textbf{x}$一致。\n",
    "- 对于y.backward(grad_variables)中的grad_variables相当于链式求导法则中的$\\frac{\\partial z}{\\partial x} = \\frac{\\partial z}{\\partial y} \\frac{\\partial y}{\\partial x}$中的$\\frac{\\partial z}{\\partial y}$。z是目标函数，一般是一个标量，故而$\\frac{\\partial z}{\\partial y}$的形状与variable $\\textbf{y}$的形状一致。`z.backward()`在一定程度上等价于y.backward(grad_y)。`z.backward()`省略了grad_variables参数，是因为$z$是一个标量，而$\\frac{\\partial z}{\\partial z} = 1$"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 51,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([2., 4., 6.])"
      ]
     },
     "execution_count": 51,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x = t.arange(0,3, requires_grad=True,dtype=t.float)\n",
    "y = x**2 + x*2\n",
    "z = y.sum()\n",
    "z.backward() # 从z开始反向传播\n",
    "x.grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 52,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([2., 4., 6.])"
      ]
     },
     "execution_count": 52,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x = t.arange(0,3, requires_grad=True,dtype=t.float)\n",
    "y = x**2 + x*2\n",
    "z = y.sum()\n",
    "y_gradient = t.Tensor([1,1,1]) # dz/dy\n",
    "y.backward(y_gradient) #从y开始反向传播\n",
    "x.grad"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "另外值得注意的是，只有对variable的操作才能使用autograd，如果对variable的data直接进行操作，将无法使用反向传播。除了对参数初始化，一般我们不会修改variable.data的值。"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "在PyTorch中计算图的特点可总结如下：\n",
    "\n",
    "- autograd根据用户对variable的操作构建其计算图。对变量的操作抽象为`Function`。\n",
    "- 对于那些不是任何函数(Function)的输出，由用户创建的节点称为叶子节点，叶子节点的`grad_fn`为None。叶子节点中需要求导的variable，具有`AccumulateGrad`标识，因其梯度是累加的。\n",
    "- variable默认是不需要求导的，即`requires_grad`属性默认为False，如果某一个节点requires_grad被设置为True，那么所有依赖它的节点`requires_grad`都为True。\n",
    "- variable的`volatile`属性默认为False，如果某一个variable的`volatile`属性被设为True，那么所有依赖它的节点`volatile`属性都为True。volatile属性为True的节点不会求导，volatile的优先级比`requires_grad`高。\n",
    "- 多次反向传播时，梯度是累加的。反向传播的中间缓存会被清空，为进行多次反向传播需指定`retain_graph`=True来保存这些缓存。\n",
    "- 非叶子节点的梯度计算完之后即被清空，可以使用`autograd.grad`或`hook`技术获取非叶子节点的值。\n",
    "- variable的grad与data形状一致，应避免直接修改variable.data，因为对data的直接操作无法利用autograd进行反向传播\n",
    "- 反向传播函数`backward`的参数`grad_variables`可以看成链式求导的中间结果，如果是标量，可以省略，默认为1\n",
    "- PyTorch采用动态图设计，可以很方便地查看中间层的输出，动态的设计计算图结构。\n",
    "\n",
    "这些知识不懂大多数情况下也不会影响对pytorch的使用，但是掌握这些知识有助于更好的理解pytorch，并有效的避开很多陷阱"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 3.2.3 扩展autograd\n",
    "\n",
    "\n",
    "目前绝大多数函数都可以使用`autograd`实现反向求导，但如果需要自己写一个复杂的函数，不支持自动反向求导怎么办? 写一个`Function`，实现它的前向传播和反向传播代码，`Function`对应于计算图中的矩形， 它接收参数，计算并返回结果。下面给出一个例子。\n",
    "\n",
    "```python\n",
    "\n",
    "class Mul(Function):\n",
    "                                                            \n",
    "    @staticmethod\n",
    "    def forward(ctx, w, x, b, x_requires_grad = True):\n",
    "        ctx.x_requires_grad = x_requires_grad\n",
    "        ctx.save_for_backward(w,x)\n",
    "        output = w * x + b\n",
    "        return output\n",
    "        \n",
    "    @staticmethod\n",
    "    def backward(ctx, grad_output):\n",
    "        w,x = ctx.saved_tensors\n",
    "        grad_w = grad_output * x\n",
    "        if ctx.x_requires_grad:\n",
    "            grad_x = grad_output * w\n",
    "        else:\n",
    "            grad_x = None\n",
    "        grad_b = grad_output * 1\n",
    "        return grad_w, grad_x, grad_b, None\n",
    "```\n",
    "\n",
    "分析如下：\n",
    "\n",
    "- 自定义的Function需要继承autograd.Function，没有构造函数`__init__`，forward和backward函数都是静态方法\n",
    "- backward函数的输出和forward函数的输入一一对应，backward函数的输入和forward函数的输出一一对应\n",
    "- backward函数的grad_output参数即t.autograd.backward中的`grad_variables`\n",
    "- 如果某一个输入不需要求导，直接返回None，如forward中的输入参数x_requires_grad显然无法对它求导，直接返回None即可\n",
    "- 反向传播可能需要利用前向传播的某些中间结果，需要进行保存，否则前向传播结束后这些对象即被释放\n",
    "\n",
    "Function的使用利用Function.apply(variable)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 53,
   "metadata": {},
   "outputs": [],
   "source": [
    "from torch.autograd import Function\n",
    "class MultiplyAdd(Function):\n",
    "                                                            \n",
    "    @staticmethod\n",
    "    def forward(ctx, w, x, b):                              \n",
    "        ctx.save_for_backward(w,x)\n",
    "        output = w * x + b\n",
    "        return output\n",
    "        \n",
    "    @staticmethod\n",
    "    def backward(ctx, grad_output):                         \n",
    "        w,x = ctx.saved_tensors\n",
    "        grad_w = grad_output * x\n",
    "        grad_x = grad_output * w\n",
    "        grad_b = grad_output * 1\n",
    "        return grad_w, grad_x, grad_b                       "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 54,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(None, tensor([1.]), tensor([1.]))"
      ]
     },
     "execution_count": 54,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x = t.ones(1)\n",
    "w = t.rand(1, requires_grad = True)\n",
    "b = t.rand(1, requires_grad = True)\n",
    "# 开始前向传播\n",
    "z=MultiplyAdd.apply(w, x, b)\n",
    "# 开始反向传播\n",
    "z.backward()\n",
    "\n",
    "# x不需要求导，中间过程还是会计算它的导数，但随后被清空\n",
    "x.grad, w.grad, b.grad"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 55,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([1.]), tensor([0.8435], grad_fn=<MulBackward0>), tensor([1.]))"
      ]
     },
     "execution_count": 55,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x = t.ones(1)\n",
    "w = t.rand(1, requires_grad = True)\n",
    "b = t.rand(1, requires_grad = True)\n",
    "#print('forward pass')\n",
    "z = MultiplyAdd.apply(w, x, b)\n",
    "#print('backward pass')\n",
    "\n",
    "# calling z.grad_fn.apply invokes MultiplyAdd.backward directly\n",
    "# it returns grad_w, grad_x, grad_b\n",
    "z.grad_fn.apply(t.ones(1))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The inputs to the `forward` function are tensors while the inputs to `backward` are variables in order to support higher-order differentiation. Although `backward` takes and returns variables, in practice `autograd.Function` extracts the input variables as tensors and wraps the resulting tensors into variables before returning them. The reason `backward` operates on variables is precisely so that the gradient of a gradient (backward of backward) can be computed. An example follows; for more detail on `torch.autograd.grad`, consult the documentation."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 56,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([10.], grad_fn=<MulBackward0>),)"
      ]
     },
     "execution_count": 56,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x = t.tensor([5], requires_grad=True,dtype=t.float)\n",
    "y = x ** 2\n",
    "grad_x = t.autograd.grad(y, x, create_graph=True)\n",
    "grad_x # dy/dx = 2 * x"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 57,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([2.]),)"
      ]
     },
     "execution_count": 57,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "grad_grad_x = t.autograd.grad(grad_x[0],x)\n",
    "grad_grad_x # 二阶导数 d(2x)/dx = 2"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Although this design gives `autograd` the ability to compute higher-order derivatives, it also restricts what can be done with tensors, because the functions used in the backward pass are limited to operations that Variables already support. This design was introduced in version `0.2`. For more flexibility, and for compatibility with older code, PyTorch provides another way to extend autograd: the decorator `@once_differentiable`, which automatically extracts the input variables in `backward` as tensors and wraps the resulting tensors back into variables. With this feature we can conveniently use functions from numpy/scipy, and the operations are no longer limited to those supported by Variables. However, as the name suggests, such a function can only be differentiated once: it breaks the backward graph and no longer supports higher-order derivatives.\n",
    "\n",
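    "As a minimal sketch of this approach (the `NumpySin` class below is invented for illustration and is not from the book): decorating `backward` with `@once_differentiable` lets it operate on plain tensors, so numpy can be used freely inside it.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "import torch as t\n",
    "from torch.autograd import Function\n",
    "from torch.autograd.function import once_differentiable\n",
    "\n",
    "class NumpySin(Function):\n",
    "    @staticmethod\n",
    "    def forward(ctx, x):\n",
    "        ctx.save_for_backward(x)\n",
    "        # compute the forward pass with numpy instead of torch ops\n",
    "        return t.from_numpy(np.sin(x.detach().numpy()))\n",
    "\n",
    "    @staticmethod\n",
    "    @once_differentiable  # backward receives detached tensors, so numpy is allowed\n",
    "    def backward(ctx, grad_output):\n",
    "        x, = ctx.saved_tensors\n",
    "        grad = grad_output.numpy() * np.cos(x.detach().numpy())\n",
    "        return t.from_numpy(grad)\n",
    "\n",
    "x = t.linspace(0, 3, steps=10, requires_grad=True)\n",
    "y = NumpySin.apply(x)\n",
    "y.sum().backward()  # x.grad now equals cos(x)\n",
    "```\n",
    "\n",
    "Because `@once_differentiable` breaks the backward graph, trying to differentiate `x.grad` again here would fail.\n",
    "\n",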
    "\n",
    "Everything described above concerns new-style Functions. There is also the legacy Function, which may have an `__init__` method and whose `forward` and `backward` need not be declared as `@staticmethod`. As versions move on, this kind of Function is encountered less and less, so we do not cover it further here.\n",
    "\n",
    "In addition, after implementing your own Function you can use the `gradcheck` function to verify that the implementation is correct. `gradcheck` approximates the gradient numerically, so it carries some error; the tolerated error can be controlled through the `eps` parameter.\n",
    "For more on this topic, see the developers' discussion on GitHub[^3].\n",
    "\n",
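    "As a quick sanity check (the `BadSquare` class below is a made-up example, not from the book), `gradcheck` will flag a backward that returns a wrong gradient:\n",
    "\n",
    "```python\n",
    "import torch as t\n",
    "from torch.autograd import Function\n",
    "\n",
    "class BadSquare(Function):\n",
    "    @staticmethod\n",
    "    def forward(ctx, x):\n",
    "        ctx.save_for_backward(x)\n",
    "        return x ** 2\n",
    "\n",
    "    @staticmethod\n",
    "    def backward(ctx, grad_output):\n",
    "        x, = ctx.saved_tensors\n",
    "        return grad_output * x  # wrong on purpose: should be 2 * x\n",
    "\n",
    "inp = t.randn(5, dtype=t.double, requires_grad=True)\n",
    "# with raise_exception=False, gradcheck returns False instead of raising\n",
    "print(t.autograd.gradcheck(BadSquare.apply, (inp,), raise_exception=False))\n",
    "```\n",
    "\n",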
    "[^3]: https://github.com/pytorch/pytorch/pull/1016"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The following example shows how to implement the sigmoid function with `Function`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 58,
   "metadata": {},
   "outputs": [],
   "source": [
    "class Sigmoid(Function):\n",
    "\n",
    "    @staticmethod\n",
    "    def forward(ctx, x):\n",
    "        output = 1 / (1 + t.exp(-x))\n",
    "        ctx.save_for_backward(output)\n",
    "        return output\n",
    "\n",
    "    @staticmethod\n",
    "    def backward(ctx, grad_output):\n",
    "        output, = ctx.saved_tensors\n",
    "        grad_x = output * (1 - output) * grad_output\n",
    "        return grad_x"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 59,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "True"
      ]
     },
     "execution_count": 59,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# verify the gradient formula against a numerical approximation\n",
    "test_input = t.randn(3, 4, requires_grad=True).double()\n",
    "t.autograd.gradcheck(Sigmoid.apply, (test_input,), eps=1e-3)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 60,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "100 loops, best of 3: 192 µs per loop\n",
      "100 loops, best of 3: 226 µs per loop\n",
      "100 loops, best of 3: 102 µs per loop\n"
     ]
    }
   ],
   "source": [
    "def f_sigmoid(x):\n",
    "    y = Sigmoid.apply(x)\n",
    "    y.backward(t.ones(x.size()))\n",
    "    \n",
    "def f_naive(x):\n",
    "    y =  1/(1 + t.exp(-x))\n",
    "    y.backward(t.ones(x.size()))\n",
    "    \n",
    "def f_th(x):\n",
    "    y = t.sigmoid(x)\n",
    "    y.backward(t.ones(x.size()))\n",
    "    \n",
    "x=t.randn(100, 100, requires_grad=True)\n",
    "%timeit -n 100 f_sigmoid(x)\n",
    "%timeit -n 100 f_naive(x)\n",
    "%timeit -n 100 f_th(x)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Clearly `f_sigmoid` is noticeably faster than the version built purely from elementary `autograd` operations (addition, division, exponentiation), because `f_sigmoid`'s custom backward computes the whole backward pass in one optimized step. The built-in interface (`t.sigmoid`) is faster still."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 3.2.4 A quick exercise: linear regression with Variable\n",
    "The previous section implemented linear regression with plain tensors, computing the backward pass by hand. In this subsection we implement linear regression with autograd/Variable, to get a feel for how convenient autograd is."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 61,
   "metadata": {},
   "outputs": [],
   "source": [
    "import torch as t\n",
    "%matplotlib inline\n",
    "from matplotlib import pyplot as plt\n",
    "from IPython import display \n",
    "import numpy as np"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 62,
   "metadata": {},
   "outputs": [],
   "source": [
    "# set the random seed so that the output below is the same on different machines\n",
    "t.manual_seed(1000)\n",
    "\n",
    "def get_fake_data(batch_size=8):\n",
    "    '''generate random data: y = x*2 + 3, with some noise added'''\n",
    "    x = t.rand(batch_size, 1) * 5\n",
    "    y = x * 2 + 3 + t.randn(batch_size, 1)\n",
    "    return x, y"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 63,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<matplotlib.collections.PathCollection at 0x7fb693516090>"
      ]
     },
     "execution_count": 63,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD8CAYAAABn919SAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAEE1JREFUeJzt3WuMHWd9x/Hvv2sDm4DYJF7S2MF1\nqqJVuRQcVlG4Cpq2G65xI1QZlYoiVEstbaEvtsJ9QVT6glbuC9oXLbKANrSQkAbHRQjYRFyaqojQ\nTRywQ1gIIYGsAS+E5RJWxXb/fXFmw3rrvZwzs+fynO9HWu3sM7Nn/ho/+8vM88xMIjORJA2+X+h1\nAZKkZhjoklQIA12SCmGgS1IhDHRJKoSBLkmFMNAlqRAGuiQVwkCXpEJs6+bOduzYkXv27OnmLiVp\n4N19993fy8zxjbbraqDv2bOH2dnZbu5SkgZeRDy8me02HHKJiPdHxKmIOLGi7eKIuCMivlZ9v6hO\nsZKk+jYzhv7PwLWr2t4OfCoznwF8qvpZktRDGwZ6Zt4JPLqq+Trgxmr5RmBfw3VJktrU6V0ul2bm\nt6vl7wCXNlSPJKlDtW9bzNYL1dd8qXpEHIiI2YiYXVhYqLs7SdIaOg3070bEZQDV91NrbZiZhzNz\nMjMnx8c3vOtGktShTm9b/CjwRuCvq+//3lhFklSAo8fmOTQzx8nFJXaOjTI9NcG+vbu2dJ8bBnpE\n3AS8DNgREY8AN9AK8lsi4s3Aw8DvbGWRkjRIjh6b5+CR4yydPgvA/OISB48cB9jSUN8w0DPz9Wus\nuqbhWiSpCIdm5h4P82VLp89yaGZuSwPdd7lIUsNOLi611d4UA12SGrZzbLSt9qYY6JLUsOmpCUa3\nj5zTNrp9hOmpiS3db1dfziVJw2B5nLzv7nKRJLVv395dWx7gqznkIkmFMNAlqRAGuiQVwkCXpEIY\n6JJUCANdkgphoEtSIQx0SSqEgS5JhTDQJakQBrokFcJAl6RCGOiSVAgDXZIKYaBLUiEMdEkqhIEu\nSYUw0CWpELUCPSLeGhEnIuK+iHhbU0VJktrXcaBHxLOBPwCuAp4LvDoifqWpwiRJ7alzhv6rwF2Z\n+dPMPAP8B3B9M2VJktpVJ9BPAC+JiEsi4gLglcDTmylLktSubZ3+YmbeHxF/A9wOPAbcC5xdvV1E\nHAAOAOzevbvT3UmSNlBrUjQz35eZz8/MlwI/AL56nm0OZ+ZkZk6Oj4/X2Z0kaR0dn6EDRMTTMvNU\nROymNX5+dTNlSZLaVSvQgY9ExCXAaeAtmbnYQE2SpA7UCvTMfElThUiS6vFJUUkqhIEuSYUw0CWp\nEAa6JBXCQJekQhjoklQIA12SCmGgS1IhDHRJKoSBLkmFqPsuF2lgHT02z6GZOU4uLrFzbJTpqQn2\n7d3V67KkjhnoGkpHj81z8Mhxlk63XuE/v7jEwSPHAQx1DSyHXDSUDs3MPR7my5ZOn+XQzFyPKpLq\nM9A1lE4uLrXVLg0CA11DaefYaFvt0iAw0DWUpqcmGN0+ck7b6PYRpqcmelSRVJ+TohpKyxOf3uWi\nkhjoGlr79u4ywFUUh1wkqRAGuiQVwkCXpEIY6JJUCANdkgpRK9Aj4s8i4r6IOBERN0XEk5oqTJLU\nno4DPSJ2AX8KTGbms4ERYH9ThUmS2lN3yGUbMBoR24ALgJP1S5IkdaLjQM/MeeBvgW8C3wZ+mJm3\nN1WYJKk9dYZcLgKuA64AdgIXRsQbzrPdgYiYjYjZhYWFziuVJK2rzpDLbwDfyMyFzDwNHAFeuHqj\nzDycmZOZOTk+Pl5jd5Kk9dQJ9G8CV0fEBRERwDXA/c2UJUlqV50x9LuAW4F7gOPVZx1uqC5JUptq\nvW0xM28AbmioFklSDT4pKkmFMNAlqR
AGuiQVwkCXpEIY6JJUCANdkgphoEtSIQx0SSqEgS5JhTDQ\nJakQtR79VxmOHpvn0MwcJxeX2Dk2yvTUBPv27up1WZLaZKAPuaPH5jl45DhLp88CML+4xMEjxwEM\ndWnAGOhD7tDM3ONhvmzp9FkOzcwZ6NJ59PMVrYE+5E4uLrXVLg2zfr+idVJ0yO0cG22rXRpm613R\n9gMDfchNT00wun3knLbR7SNMT030qCKpf/X7Fa2BPuT27d3Fu65/DrvGRglg19go77r+OX1x+Sj1\nm36/onUMXezbu8sAlzZhemrinDF06K8rWgNdkjZp+cTHu1wkqQD9fEVroHdRP9+/KmnwGehd0u/3\nr0oafN7l0iX9fv+qpMFnoHdJv9+/KmnwdRzoETEREfeu+PpRRLytyeJK0u/3r0oafB0HembOZebz\nMvN5wPOBnwK3NVZZYXwiU9JWa2pS9Brg65n5cEOfV5x+v39V0uBrKtD3Azedb0VEHAAOAOzevbuh\n3Q2mfr5/VdLgqz0pGhFPAF4L/Nv51mfm4cyczMzJ8fHxuruTJK2hibtcXgHck5nfbeCzJEkdaiLQ\nX88awy2SpO6pFegRcSHwm8CRZsqRJHWq1qRoZj4GXNJQLZKkGnxSVJIKYaBLUiEMdEkqhK/P1VDz\nHfUqiYGuoeU76lUah1w0tHxHvUpjoGto+Y56lcZA19DyHfUqjYGuoeU76lUaJ0U1tHxHvUpjoGuo\n+Y56lcQhF0kqxMCcofsAiCStbyAC3QdAJGljAzHk4gMgkrSxgQh0HwCRpI0NRKD7AIgkbWwgAt0H\nQCRpYwMxKeoDIJK0sYEIdPABEEnayEAMuUiSNmagS1IhDHRJKkStQI+IsYi4NSK+EhH3R8QLmipM\nktSeupOifwd8MjNfFxFPAC5ooCZJUgc6DvSIeCrwUuD3ATLzZ8DPmilLktSuOkMuVwALwD9FxLGI\neG9EXLh6o4g4EBGzETG7sLBQY3eSpPXUCfRtwJXAP2bmXuAx4O2rN8rMw5k5mZmT4+PjNXYnSVpP\nnUB/BHgkM++qfr6VVsBLknqg40DPzO8A34qI5ReqXAN8uZGqJEltq3uXy58AH6zucHkQeFP9kiRJ\nnagV6Jl5LzDZUC2SpBp8UlSSCmGgS1IhDHRJKoSBLkmFMNAlqRAGuiQVwkCXpEIY6JJUCANdkgph\noEtSIQx0SSqEgS5JhTDQJakQBrokFcJAl6RCGOiSVAgDXZIKYaBLUiEMdEkqhIEuSYUw0CWpEAa6\nJBXCQJekQmyr88sR8RDwY+AscCYzJ5soSpLUvlqBXnl5Zn6vgc+RJNXgkIskFaJuoCdwe0TcHREH\nmihIktSZukMuL87M+Yh4GnBHRHwlM+9cuUEV9AcAdu/eXXN3kqS11DpDz8z56vsp4DbgqvNsczgz\nJzNzcnx8vM7uJEnr6DjQI+LCiHjK8jLwW8CJpgqTJLWnzpDLpcBtEbH8OR/KzE82UpUkqW0dB3pm\nPgg8t8FaJEk1eNuiJBXCQJekQhjoklQIA12SCmGgS1IhDHRJKoSBLkmFMNAlqRAGuiQVwkCXpEIY\n6JJUCANdkgphoEtSIQx0SSqEgS5JhTDQJakQBrokFcJAl6RCGOiSVAgDXZIKYaBLUiEMdEkqhIEu\nSYWoHegRMRIRxyLiY00UJEnqTBNn6G8F7m/gcyRJNdQK9Ii4HHgV8N5mypEkdaruGfq7gT8H/reB\nWiRJNXQc6BHxauBUZt69wXYHImI2ImYXFhY63Z0kaQN1ztBfBLw2Ih4CbgZ+PSL+dfVGmXk4Mycz\nc3J8fLzG7iRJ6+k40DPzYGZenpl7gP3ApzPzDY1VJklqi/ehS1IhtjXxIZn5WeCzTXyWJKkzjQR6\nvzl6bJ5DM3OcXFxi59go01MT7Nu7q9dlSdKWKi7Qjx6b5+CR4yydPgvA/OISB48cBzDUJRWtuDH0\nQz
Nzj4f5sqXTZzk0M9ejiiSpO4oL9JOLS221S1Ipigv0nWOjbbVLUimKC/TpqQlGt4+c0za6fYTp\nqYkeVSRJ3VHcpOjyxKd3uUgaNsUFOrRC3QCXNGyKG3KRpGFloEtSIQx0SSqEgS5JhTDQJakQBrok\nFSIys3s7i1gAHt7EpjuA721xOXVZYzOssRnW2Ix+rfGXMnPD/+VbVwN9syJiNjMne13HeqyxGdbY\nDGtsxiDUuB6HXCSpEAa6JBWiXwP9cK8L2ARrbIY1NsMamzEINa6pL8fQJUnt69czdElSm7oa6BHx\n/og4FREn1lj/uxHxpYg4HhGfi4jnrlj3UNV+b0TM9rDGl0XED6s67o2Id6xYd21EzEXEAxHx9h7W\nOL2ivhMRcTYiLq7WbflxjIinR8RnIuLLEXFfRLz1PNtERPx9day+FBFXrlj3xoj4WvX1xh7W2NP+\nuMkae9ofN1ljr/vjkyLiCxHxxarGvzzPNk+MiA9Xx+quiNizYt3Bqn0uIqa2osbGZGbXvoCXAlcC\nJ9ZY/0Lgomr5FcBdK9Y9BOzogxpfBnzsPO0jwNeBXwaeAHwReGYvaly17WuAT3fzOAKXAVdWy08B\nvrr6WACvBD4BBHD18r81cDHwYPX9omr5oh7V2NP+uMkae9ofN1NjH/THAJ5cLW8H7gKuXrXNHwHv\nqZb3Ax+ulp9ZHbsnAldUx3RkK+ut89XVM/TMvBN4dJ31n8vMH1Q/fh64vCuFnVvDujWu4yrggcx8\nMDN/BtwMXNdocZU2a3w9cNNW1LGWzPx2Zt5TLf8YuB9Y/YL664APZMvngbGIuAyYAu7IzEervnAH\ncG0vaux1f9zkcVxLV/pjBzX2oj9mZv6k+nF79bV68vA64MZq+VbgmoiIqv3mzPyfzPwG8ACtY9uX\n+nkM/c20zuCWJXB7RNwdEQd6VNOyF1SXb5+IiGdVbbuAb63Y5hE2/8e3JSLiAlph+JEVzV09jtWl\n615aZ0UrrXW8un4c16lxpZ72xw1q7Iv+uNFx7GV/jIiRiLgXOEXrhGHN/piZZ4AfApfQh3/X6+nL\n/2NRRLyc1h/Qi1c0vzgz5yPiacAdEfGV6ky12+6h9RjuTyLilcBR4Bk9qGMzXgP8V2auPJvv2nGM\niCfT+uN9W2b+aCv2Uddmaux1f9ygxr7oj5v8t+5Zf8zMs8DzImIMuC0inp2Z552DGmR9d4YeEb8G\nvBe4LjO/v9yemfPV91PAbfTosiczf7R8+ZaZHwe2R8QOYB54+opNL6/aemk/qy5vu3UcI2I7rT/w\nD2bmkfNsstbx6tpx3ESNPe+PG9XYD/1xM8ex0rP+uGJ/i8Bn+P/DeI8fr4jYBjwV+D79+Xe9tm4P\n2gN7WHvCcTetMaoXrmq/EHjKiuXPAdf2qMZf5Of3718FfJPWpMs2WhN4V/DzSahn9aLGav1TaY2z\nX9jt41gdjw8A715nm1dx7qToF6r2i4Fv0JoQvahavrhHNfa0P26yxp72x83U2Af9cRwYq5ZHgf8E\nXr1qm7dw7qToLdXyszh3UvRB+nhStKtDLhFxE61Z+R0R8QhwA60JCjLzPcA7aI1b/UNrPoIz2XpR\nzqW0LpOg1VE/lJmf7FGNrwP+MCLOAEvA/mz9y5+JiD8GZmjdYfD+zLyvRzUC/DZwe2Y+tuJXu3Uc\nXwT8HnC8GrcE+AtaAblc48dp3enyAPBT4E3Vukcj4q+A/65+75157iV6N2vsdX/cTI297o+bqRF6\n2x8vA26MiBFaoxK3ZObHIuKdwGxmfhR4H/AvEfEArf/w7K/qvy8ibgG+DJwB3pKt4Zu+5JOiklSI\nvhtDlyR1xkCXpEIY6JJUCANdkgphoEtSIQx0SSqEgS5JhTDQJakQ/wcmkGjLLip0VAAAAABJRU5E\nrkJggg==\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "# take a look at what the generated x-y distribution looks like\n",
    "x, y = get_fake_data()\n",
    "plt.scatter(x.squeeze().numpy(), y.squeeze().numpy())"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 64,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD8CAYAAABn919SAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAHEtJREFUeJzt3Xl0VfW99/H3lznMyqQEAjIFVFAw\n9qq0DjiA1FprrdZWq7X30t5OWlsqrN51+9y7nvuA4my1Sh17a1ttpd7eKpMCoqhUEJVqEgjIFJAw\nGMKQkOn7/HEChJjAOTnDPmefz2stFsnJSfY3R/ysvfb57d/H3B0REcl8bYIeQEREEkOBLiISEgp0\nEZGQUKCLiISEAl1EJCQU6CIiIaFAFxEJCQW6iEhIKNBFREKiXSoP1rt3bx88eHAqDykiknAO7N5f\nTVlFFbX1To+c9vTr3omO7eI/Ry4/UENpeSX1je7iry3f7nWVFcf94SkN9MGDB7NixYpUHlJEJGHc\nnZdXf8Ks+UXs2nWAK045kemTR3HmwJ4JO8b4mYuoLa886rFtz9xm0XxvSgNdRCRTvb1+FzPmFvH+\n5nJG9OvKkzcXcFF+X8yiytqobW0S5rFQoIuIHEPxJ3u5c14Ri4rKOLlHJ+66ZgxfHTeAtm0SG+SH\n9O+ZQ2krQ12BLiLSjG17Krl3wRpeeHcLXTq2445JI/n2+MF0at82Kcd7cVUps+YXU1peiRG5Tn+Y\ne300P0OBLiLSyJ7KGn69ZB1PLfsYd7hl/Cn84KJhnNClQ9KO+eKqUqbPWU1lTR0QCfNDoZ7bM4fN\nFTs2RvNzFOgiIsDB2jr++62N/GpxCXsqa7jqzFxuv3QEA0/snPRjz5pffDjMDzkU5sumTcCmV+yO\n5uco0EUkq9XXO//zfil3z19DaXkl54/owx2T8jmtf4+UzdDSG6GxvkGqQBeRrLV0zQ5mzi3io20V\nnJ7bnbuuGcP4Yb1TPkdLb4T275kT089RoItI1vlH6R5mzi3ijZKdDDwxhwe+fiZfGtOfNklauXI8\nUyfmH3UNHSCnfVumTsyP6eco0EUka2zefYBZ84v56/tbOaFze/79ilP55jl5dGyXnJUr0bpqbC4Q\nuZa+tbyS/j1zmDox//Dj0VKgi0jo7d5fzUOL1vK7tzfSto3xg4uG8t0LhtK9U/ugRzvsqrG5MQd4\nUwp0EQmtyuo6nlz2MY8uWcf+6lquLRjIbZeM4KQenYIeLSmOG+hm9iRwBVDm7qc3PDYL+BJQDawD\nvu3u5ckcVEQkWrV19fx55Rbue2UN2ysOcsmoftwxKZ/h/boFPVpSRbM12NPApCaPLQROd/cxwBpg\neoLnEhGJmbuz8KPtXP7A60ybs5r+PXP40/fO5fGbCkIf5hDFGbq7LzWzwU0eW9Do07eBaxI7lohI\nbN7d9CkzXi7knQ2fMqR3Fx69YRwTTzsp4ZtnpbNEXEO/BXguAT9HRCRm63bsY9a8YuZ9+Am9u3bk\n/151OtedPZD2bbOvvyeuQDezXwC1wLPHeM4UYApAXl5ePIcTETmsbG8VD7yylj++s5lO7dpw+6Uj\n+M7nT6FLx+xd69Hq39zMbibyZunF7u4tPc/dZwOzAQoKClp8nohINPYdrGX20vU8/vp6qmvrueGf\n8vjRxcPp3bVj0KMFrlWBbmaTgJ8DF7j7gcSOJCLyWTV19fzh75t48NW17NxXzRfHnMzUy/IZ3LtL\n0KOljWiWLf4BuBDobWZbgF8SWdXSEVjY8IbD2+7+vSTOKSJZqnHt24ZdBzhnyIk8cdMozkhg7VtY\nRLPK5fpmHn4iCbOIiBylce1bfr9uPHXz2VyY3yerVq7EInvfPRCRtNW09m3WNWO4Oom1b2GhQBeR\ntLG1vJJ7F0Zq37p2bMe0y0dy83nJq3
0LGwW6iARuT2UNjywp4ellG3CHf/58pPatZ+fk1b6FkQJd\nRAJTVXOk9q2iqoavnJnL7ZeNYMAJya99CyMFuoikXH298+J7pdyz4Ejt27RJIzm1f/egR8toCnQR\nSRl3Z+nancycW0RhwLVvYaRAF5GUWL1lDzPnFbKsZFda1L6FkQJdRJIqXWvfwkiBLiJJ0bT27YcX\nDWPKBUPSqvYtbBToIpJQTWvfrjs7UvvWr3s4a9/SiQJdRBLiUO3bvQvXULb3IJeeGql9G9Y3uKag\nF1eVMmt+MVvLK+nfM4epE/PjLmJOZwp0EYmLu/NKYRl3ziuipGwf4/J68vA3x3H24BMDnevFVaVM\nn7Oaypo6AErLK5k+ZzVAaENdgS4irbZy46fMnNtQ+9anC4/ecBYTT+uXFptnzZpffDjMD6msqWPW\n/GIFuojIIet27OOueUXM/3A7fbp15L++cjrXFQykXRrVvm0tr4zp8TBQoItI1Moqqrj/1bU816j2\n7Z+/cAqdO6RflPTvmUNpM+Hdv2dOANOkRvr9VxCRtLPvYC2zX1vHb17/mJq6zKh9mzox/6hr6AA5\n7dsydWJ+gFMllwJdRFpUXXuk9m3X/syqfTt0nVyrXEQkq7k7L63exqz5xWxsqH178vLMq327amxu\nqAO8KQW6iBzlrXW7mDm3kPe37FHtW4ZRoIsIAEWfVHDn3CIWF+9Q7VuGUqCLZLnGtW/dVPuW0RTo\nIllqz4FI7dtTb24A4F++MITvXzhUtW8ZTIEukmWqaur47VsbeHjxukjt29hcbr9UtW9hoEAXyRJ1\n9c6Lq0q5d2Gk9u2CEX24Q7VvoaJAFwk5d+e1NTuYObeIok/2Mjq3B7OuGcN5qn0LneMGupk9CVwB\nlLn76Q2PnQg8BwwGNgDXuvunyRtTRFpj9ZY9zJhbyJvrIrVvD14/litGn5xRtW/ZtgVuPKI5Q38a\n+BXw20aPTQNedfeZZjat4fM7Ej+eiLTGpl0HmLWgmP99fysndunAL790Kt/8p0F0aBfM5lmtDeVs\n3AI3HscNdHdfamaDmzz8ZeDCho+fAZagQBcJ3K59B3loUQnPLj9S+/bdC4bQLcDat3hCORu3wI1H\na6+h93P3bQ0ffwL0S9A8ItIKB6prefKNj3n0tfUcSLPat3hCORu3wI1H3G+Kurubmbf0dTObAkwB\nyMvLi/dwItJIbV09f1q5hfuOUfsW9DXoeEI5G7fAjUdrL6htN7OTARr+Lmvpie4+290L3L2gT58+\nrTyciDTm7sz/8BMm3r+U6XNWM+CEHP78vXP5zbcKPhPm0+esprS8EufI5Y4XV5WmbNaWwjeaUJ46\nMZ+cJneshn0L3Hi0NtD/CtzU8PFNwP8kZhwROZ6VG3fztUff4rv/vRIHHr3hLF741/MoaKbD81iX\nO1IlnlC+amwuM64eTW7PHAzI7ZnDjKtH6/p5C6JZtvgHIm+A9jazLcAvgZnA82b2HWAjcG0yhxQR\nKCnbx6z5sdW+pcM16Hj3Jc+2LXDjEc0ql+tb+NLFCZ5FRJpRVlHFfa+s5fkVsde+pcs1aIVyauhO\nUZE0tbeqhtlL1/N4Q+3bjecM4ocThsVU+5aNNWzZTIEukmaqa+v5/fKNPLSohF37q7lizMlMnZjP\noF6x175lYw1bNlOgi6QJd+dvH2zj7gWR2rdzh/Ri2uUj46590+WO7KFAF0kDb67bycy5RXywZQ8j\nT+rGU98+mwtHqPZNYqNAFwlQ4bYK7pxXxBLVvkkCKNBFAlBaXsm9C9YwZ1Wk9m365SO5SbVvEicF\nukgKqfZNkkmBLpICqn2TVFCgiyTRodq3exYUs3VPlWrfJKkU6CJJ4O4sWbODOxvVvt39tTNU+yZJ\npUAXSbAPtpQz4+Ui3lq/i7wTO2dk7ZtkJgW6SIJs3LWfuxesOVz79n++dCrfCLD2Ld0EvS97NlCg\ni8
Spae3bjyYMY8r5wda+pRt1g6aGAl2klQ5U1/LE6x/z2NL1VNbUcW3BQG67ZHha1L6lG3WDpoYC\nXSRGtXX1PL9iC/e/Eql9u+zUfvx80kiG9e0a9GhpKx32Zc8GCnSRKLk7Cz7azl3zili3Yz9nDTqB\nR745rtmmIDlauuzLHnYKdJEorNy4mxkvF7Fi46cM6dOFx248i8tO7afNs6KkfdlTQ4EucgwlZfu4\na14RCz6K1L79v6+M5tqCAcesfZPP0r7sqaFAF2lG49q3nPZt+emlI/hOlLVv0jzty558+tcpWau5\nddEXj+p7uPattj5S+/ajCcPoFUPtm0hQFOiSlZpbFz31T+/ToX0b9h+si6v2TSQoCnTJSs2ti66p\nd6zO+esPxzNmQHy1byJBUKBL2krmreItrX+urq1XmEvGUqBLWkrmreKF2yro0K4NB2vrP/O1XK2L\nlgwW19orM/uJmX1oZv8wsz+Yme55loQ41q3irVVaXslPn3+fyQ++Thsz2jXZ/VDroiXTtfoM3cxy\ngR8Dp7p7pZk9D3wdeDpBs0kWS+St4nsO1PDwkhKebqh9m/KFIXz/wmEsLi7TumgJlXgvubQDcsys\nBugMbI1/JJHE3CpeVVPHM29u4OHFJew9WMvVYwdw+2UjDl9WSZd10dpWVhKl1YHu7qVmdjewCagE\nFrj7goRNJlktnlvF6+qdv6wq5d6G2rcL8yO1b6NOTr/aN20rK4kUzyWXE4AvA6cA5cCfzOwGd/9d\nk+dNAaYA5OXlxTGqZJPW3CretPZtzIAe3H3tGZw3NH1r37StrCRSPJdcLgE+dvcdAGY2BzgPOCrQ\n3X02MBugoKDA4zieZJlYLok0rX176PqxfDEDat+0rawkUjyBvgk4x8w6E7nkcjGwIiFTiURp4679\nzJpfzN8+2JaRtW/aVlYSKZ5r6MvN7M/Au0AtsIqGM3GRZGtc+9auTZuMrX3TtrKSSHGtcnH3XwK/\nTNAskkVau7Kjae3bdWcP5LaLh9M3Q2vftK2sJJLuFJWUa83Kjqa1bxNP68fUieGofUuX5ZOS+RTo\nknKxrOxoWvtWMOgEfn3DOM4apNo3kaYU6JJy0a7sWLFhNzPmFrFy46cM7dOF2TeexaWqfRNpkQJd\nUu54Kzsa17717daRGVeP5mtnqfZN5HgU6JJyLa3smHL+EKbPWX249u1nl43gls+nvvZNt+JLplKg\np5lsCJOmKztO6tGJ03N7MHNuUeC1b7oVXzKZuafu5s2CggJfsUL3HrWkaZhA5Mx1xtWjQxkm1bX1\n/H75Rh5cVMLu/dV86Yz+TL0sn7xenQObafzMRc1eDsrtmcOyaRMCmEgEzGyluxcc73k6Q08j2bKv\nR32987fV27h7fjGbdh/gvKG9mHb5yLRoCtKt+JLJFOhpJBvC5M2SncyYW8Tq0j2MPKkbT3/7bC4Y\n0SdtVq7oVnzJZAr0NBLmMCncVsHMuUW8tmYHuT1zuPfaM7jqzNy02zxLt+JLJlOgp5EwhklpeSX3\nLCjmL6tK6d6pPb+YPIobzx1Ep/Ztgx6tWboVXzKZAj2NhClMPlP7dv4Qvn/BMHp0Tv/Ns3QrvmQq\nBXqayfQwaVr79tVxA/jJpUdq30QkeRTokhBNa98uyu/DHZePZORJ6Vf7JhJWCnSJS9PatzMyoPZN\nJKwU6NJqjWvfBvXqzK++Eal9S5cliCLZRoEuMWtc+9arSwf+48rTuP5zeRlT+yYSVgp0iVrT2rcf\nTxjGv2Rg7ZtIWCnQ5bjCVvsmElYKdGlRmGvfRMJIgS6f4e7M/3A7d80vYr1q30QyhgJdjqLaN5HM\npUAXAErK9nLnvGIWqvZNJGMp0LPc9ooq7n9lDc+9s5nOHdoFVvsmIvHT/7VZam9VDY+9tp7H31hP\nXb3zrXMHp7z2LRvq9kRSKa5AN7OewOPA6YADt7j7W4kYTJKjurae
Z5dv5KGAa9/U3SmSePGeoT8A\nzHP3a8ysAxBcGaQcU7rVvmVL3Z5IKrU60M2sB3A+cDOAu1cD1YkZSxIpHWvfsqFuTyTV4jlDPwXY\nATxlZmcAK4Fb3X1/4yeZ2RRgCkBeXl4ch5NYpXPtW5jr9kSCEs+atHbAOODX7j4W2A9Ma/okd5/t\n7gXuXtCnT584DifRKi2v5Pbn32Pyg6/z3uZyfjF5FK/+9AKuHjcgLcIcInV7OU1q6DK9bk8kaPGc\noW8Btrj78obP/0wzgS6pU36gmkeWrMuI2rcw1e2JpItWB7q7f2Jmm80s392LgYuBjxI3mkQrU2vf\nMr1uTyTdxLvK5UfAsw0rXNYD345/JImWat9EpLG4At3d3wMKEjSLRMndWVK8gzvnHal9u+faMzl3\naK+gRxORAOlO0Qzz/uZyZswt5O31u1X7JiJHUaBniI279nPX/GJeUu2biLRAgd6CdNlnZOe+gzz0\n6lqeXb6J9m1V+yYiLVOgNyMd9hk5UF3L469/zGOvraOqtl61byJyXAr0ZgS5z0htXT3PrdjM/a+s\nZYdq30QkBgr0ZgSxz0hztW+PqvZNRGKgQG9GqvcZUe2biCSCAr0ZUyfmH3UNHZKzz4hq30QkkRTo\nzUj2PiOqfRORZDB3T9nBCgoKfMWKFSk7XrppWvt27pBerNm+j+0VVdqcSkRaZGYr3f24d+XrlDAF\nmta+XXlGf8YM6ME9C9aogk1EEkaBnkSHat9mzS9i8+5Kzhvai+mXj2L0gB6Mn7lIFWwiklAK9CRZ\nVrKTmceofVMFm4gkmgI9wT7aWsHMeUUsPU7tmyrYRCTRFOgJsuXTA9y7YA1/ea+U7p3a84vJo7jx\n3EF0alKzdkiqlkaKSPZQoMep/EA1Dy8u4Zm3NgLN174da6OvdNgALBHSZTMzkWymQG+lqpo6nn5z\nA480qn27/dIRn7lkcryNvsIQeumwmZmIhCTQU3l2WFfvzHl3C/ctXBNV7VuQG32lSjb8jiKZIOMD\nPVVnh62tfcuG1SzZ8DuKZIKMD/RUnB3GU/uWDatZsuF3FMkEGb8LVDLPDjfs3M8Pfv8uX354GX//\neDcANbX11NZ51DshTp2YT06TlS5hW82SDb+jSCbI+DP0ZJwdNq59a2NGuzZGbX1kz5ute6piuqQT\nttUszcmG31EkE2T85lxNr6FD5OxwxtWjYw6U5mrfFhWW8UlF1Weem9szh2XTJsQ9v4jI8WTN5lyJ\nODusqavn+Ua1b5NOO4mpk/IZ2qcrpyx/qdnv0Rt+IpJu4g50M2sLrABK3f2K+EeKXWvXc0dq3z7h\nrnnFrN+5n7MHn8CjN5zFWYNOOPwcveEnIpkiEWfotwKFQPMLsdPUOxt2M+PlQt7dVM6wvl35zbcK\nuGRU38+82alb9EUkU8QV6GY2APgi8F/A7QmZKMlKyvYyc24xrxRup1/3jsy8ejTXHKP2TW/4iUim\niPcM/X7g50C3BMySVNsrqrhv4RqeX7GZLh3aMXViPreMP4WcDs1vntVYWG7RF5Fwa3Wgm9kVQJm7\nrzSzC4/xvCnAFIC8vLzWHq7VKqpqeOy1dTzxxsfU1Ts3nTeYH00YzoldOqR8FhGRZIrnDH08cKWZ\nTQY6Ad3N7HfufkPjJ7n7bGA2RJYtxnG8mBysrePZtzfx0KK1fHqghivP6M/PLssnr1fnVI0gIpJS\nrQ50d58OTAdoOEP/WdMwD0J9vfO/H2zl7gXFbN5dyfhhvZg2KVL7JiISZhm/Dr2xZSU7mTG3kH+U\nVjDq5O48c8tozh/eO+rb9EVEMllCAt3dlwBLEvGzWiPa2jcRkTDL6DP0LZ8e4J4Fa3ixofbt3744\nihvOabn2TUQkzDIy0MsPVPOrRSX89q2NmMF3zx/Kv144lB457Y//zSIiIZVRgV5VU8dTyzbwyJIS\n9h2s5ZpxA/hJM7VvIiLZKCMC
va7eeaGh9m3bniomjOzLHZNGkn9S2t/PJCKSMmkd6O7O4uIy7pxb\nTPH2SO3bfdedyTlDjl37JiKSjdI20N/bXM6MlwtZ/vFuBvfqzMPfGMfk0SdpCaKISAvSLtA37NzP\nrPnFvLR6G726dOA/v3wa138uj/YtbJ4lIiIRaRPoO/cd5MFX1/L75Zvo0K4NP754OFPOH0LXjmkz\noohIWgs8LfcfjNS+zV4aqX37+tkDufWS4fTt1ino0UREMkpggV5TV88f39nMA6+sZee+g1x++kn8\nbGKk9k1ERGKX8kBvrvbtsRuPrn0TEZHYpTTQ9x+s5epfv8mqTeUM79uVx79VwMXN1L6JiEjsUhro\n63fup3t5JXd+dTRfHddy7ZuIiMQupYHer3snlvzsoqhq30REJDYpPUXu262jwlxEJEl0zUNEJCQU\n6CIiIaFAFxEJCQW6iEhIKNBFREJCgS4iEhIKdBGRkFCgi4iEhAJdRCQkFOgiIiHR6kA3s4FmttjM\nPjKzD83s1kQOJiIisYlnc65a4Kfu/q6ZdQNWmtlCd/8oQbOJiEgMWn2G7u7b3P3dho/3AoVAbqIG\nExGR2CTkGrqZDQbGAsub+doUM1thZit27NiRiMOJiEgz4g50M+sKvADc5u4VTb/u7rPdvcDdC/r0\n6RPv4UREpAVxBbqZtScS5s+6+5zEjCQiIq0RzyoXA54ACt393sSNJCIirRHPGfp44EZggpm91/Bn\ncoLmEhGRGLV62aK7vwFYAmcREZE46E5REZGQUKCLiISEAl1EJCQU6CIiIaFAFxEJCQW6iEhIKNBF\nREJCgS4iEhIKdBGRkFCgi4iEhAJdRCQkFOgiIiGhQBcRCQkFuohISCjQRURCQoEuIhISCnQRkZBQ\noIuIhIQCXUQkJBToIiIhoUAXEQkJBbqISEgo0EVEQkKBLiISEnEFuplNMrNiMysxs2mJGkpERGLX\n6kA3s7bAw8DlwKnA9WZ2aqIGExGR2MRzhv45oMTd17t7NfBH4MuJGUtERGIVT6DnApsbfb6l4TER\nEQlAu2QfwMymAFMaPj1oZv9I9jEzRG9gZ9BDpAm9FkfotThCr8UR+dE8KZ5ALwUGNvp8QMNjR3H3\n2cBsADNb4e4FcRwzNPRaHKHX4gi9FkfotTjCzFZE87x4Lrm8Aww3s1PMrAPwdeCvcfw8ERGJQ6vP\n0N291sx+CMwH2gJPuvuHCZtMRERiEtc1dHd/GXg5hm+ZHc/xQkavxRF6LY7Qa3GEXosjonotzN2T\nPYiIiKSAbv0XEQmJlAS6tgg4wsyeNLOybF++aWYDzWyxmX1kZh+a2a1BzxQUM+tkZn83s/cbXov/\nCHqmoJlZWzNbZWZ/C3qWIJnZBjNbbWbvRbPSJemXXBq2CFgDXErk5qN3gOvd/aOkHjhNmdn5wD7g\nt+5+etDzBMXMTgZOdvd3zawbsBK4Khv/XZiZAV3cfZ+ZtQfeAG5197cDHi0wZnY7UAB0d/crgp4n\nKGa2AShw96jW46fiDF1bBDTi7kuB3UHPETR33+bu7zZ8vBcoJEvvNPaIfQ2ftm/4k7VvbpnZAOCL\nwONBz5JpUhHo2iJAjsnMBgNjgeXBThKchksM7wFlwEJ3z9rXArgf+DlQH/QgacCBBWa2suGu+2PS\nm6ISKDPrCrwA3ObuFUHPExR3r3P3M4nccf05M8vKy3FmdgVQ5u4rg54lTXze3ccR2dX2Bw2XbFuU\nikCPaosAyT4N14tfAJ519zlBz5MO3L0cWAxMCnqWgIwHrmy4dvxHYIKZ/S7YkYLj7qUNf5cBfyFy\nCbtFqQh0bREgn9HwRuATQKG73xv0PEEysz5m1rPh4xwiCwiKgp0qGO4+3d0HuPtgIlmxyN1vCHis\nQJhZl4YFA5hZF+Ay4Jir45Ie6O5eCxzaIqAQeD6btwgwsz8AbwH5ZrbFzL4T9EwBGQ/cSOQM7L
2G\nP5ODHiogJwOLzewDIidAC909q5frCQD9gDfM7H3g78BL7j7vWN+gO0VFREJCb4qKiISEAl1EJCQU\n6CIiIaFAFxEJCQW6iEhIKNBFREJCgS4iEhIKdBGRkPj/EdFpJtWwQGYAAAAASUVORK5CYII=\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "2.02689576149 2.97328233719\n"
     ]
    }
   ],
   "source": [
    "# randomly initialize the parameters\n",
    "w = t.rand(1,1, requires_grad=True)\n",
    "b = t.zeros(1,1, requires_grad=True)\n",
    "losses = np.zeros(500)\n",
    "\n",
    "lr = 0.005 # learning rate\n",
    "\n",
    "for ii in range(500):\n",
    "    x, y = get_fake_data(batch_size=32)\n",
    "    \n",
    "    # forward: compute the loss\n",
    "    y_pred = x.mm(w) + b.expand_as(y)\n",
    "    loss = 0.5 * (y_pred - y) ** 2\n",
    "    loss = loss.sum()\n",
    "    losses[ii] = loss.item()\n",
    "    \n",
    "    # backward: autograd computes the gradients automatically\n",
    "    loss.backward()\n",
    "    \n",
    "    # update the parameters\n",
    "    w.data.sub_(lr * w.grad.data)\n",
    "    b.data.sub_(lr * b.grad.data)\n",
    "    \n",
    "    # zero the gradients\n",
    "    w.grad.data.zero_()\n",
    "    b.grad.data.zero_()\n",
    "    \n",
    "    if ii % 50 == 0:\n",
    "        # plot the current fit\n",
    "        display.clear_output(wait=True)\n",
    "        x = t.arange(0, 6).view(-1, 1).float()\n",
    "        y = x.mm(w.data) + b.data.expand_as(x)\n",
    "        plt.plot(x.numpy(), y.numpy()) # predicted\n",
    "        \n",
    "        x2, y2 = get_fake_data(batch_size=20) \n",
    "        plt.scatter(x2.numpy(), y2.numpy()) # true data\n",
    "        \n",
    "        plt.xlim(0,5)\n",
    "        plt.ylim(0,13)   \n",
    "        plt.show()\n",
    "        plt.pause(0.5)\n",
    "        \n",
    "print(w.item(), b.item())"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 65,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(5, 50)"
      ]
     },
     "execution_count": 65,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD8CAYAAABn919SAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAIABJREFUeJztnXm4HFWZ/79vVXff/Wa92QkBEghE\nIEDYZJGEfVHUcRAdlZ+DMrjiMgrojOiMMjAquI0L4giMAiKCICBbCCCCwUAWskFWSEKSe7PftW93\n1/n9UXWqTp06VV293b7d93yeJ0+661ZXnao65633fM973kOMMWg0Go2m9jGqXQCNRqPRlAdt0DUa\njaZO0AZdo9Fo6gRt0DUajaZO0AZdo9Fo6gRt0DUajaZOSMTZiYg2A+gGkAOQZYzNI6KxAH4HYAaA\nzQAuY4ztrUwxNRqNRpOPQjz0+YyxuYyxec736wAsZIzNArDQ+a7RaDSaKlGK5HIpgDudz3cCeG/p\nxdFoNBpNsVCcmaJEtAnAXgAMwC8YY7cR0T7G2Gjn7wRgL/8u/fYqAFcBQEtLywmzZ88uqcDrO3uQ\nMAgzxreUdByNRqOpFV555ZVdjLGOfPvF0tABnM4Y20ZEEwA8RURrxT8yxhgRKd8MjLHbANwGAPPm\nzWNLliyJeUo1l/7kBYxuTuHOfz6ppONoNBpNrUBEb8bZL5bkwhjb5vzfCeBBACcB2ElEk52TTQbQ\nWVxRC4QIOvuMRqPRBMlr0ImohYja+GcA5wFYCeBhAFc4u10B4KFKFdJXHgA6oZhGo9EEiSO5TATw\noC2TIwHgbsbY40T0dwD3EdGVAN4EcFnliulhF0Oj0Wg0MnkNOmNsI4BjFdt3Azi7EoWKwvbQh/qs\nGo1GM/ypuZmiRASmVXSNRqMJUHsGHdpD12g0GhW1Z9C1hq7RaDRKas6gA9pD12g0GhU1Z9AJWkPX\naDQaFTVn0EHaQ9doNBoVNWfQCdD+uUaj0SioPYOuLbpGo9EoqT2DrjV0jUajUVJ7Bl1r6BqNRqOk\nNg16tQuh0Wg0w5DaM+jQM4s0Go1GRc0ZdECnz9VoNBoVNWfQteSi0Wg0amrOoAN6UFSj0WhUxDbo\nRGQS0VIiesT5fgcRbSKiZc6/uZUrpq8c2kPXaDQaBXEXiQaAawCsAdAubPsKY+z+8hYpGgK0i67R\naDQKYnnoRDQNwMUAbq9sceKURWvoGo1GoyKu5PIDAF8FYEnbv0NEK4joViJqKG/R1OgFLjQajUZN\nXoNORJcA6GSMvSL96XoAswGcCGAsgGtDfn8VES0hoiVdXV2lllcvQafRaDQhxPHQTwPwHiLaDOBe\nAAuI6DeMse3MJg3g1wBOUv2YMXYbY2weY2xeR0dHyQXW04o0Go1GTV6Dzhi7njE2jTE2A8DlAJ5h\njH2EiCYDABERgPcCWFnRkvrKNFRn0mg0mtqhkCgXmd8SUQdsp3kZgKvLU6RodHIujUajUVOQQWeM\nPQvgWefzggqUJwY6Dl2j0WhU1NxMUdtD1yZdo9FoZGrPoFe7ABqNRjNMqT2DrjV0jUajUVJ7Bl0v\nQafRaDRKas+ga81Fo9FolNScQQe05KLRaDQqas6g6+RcGo1Go6b2DDpIhy1qNBqNgpoz6NAeukaj\n0SipOYNuL3BR7VJoNBrN8KP2DLpegk6j0WiU1J5Bh576r9FoNCpqz6ATsHl3H7bt6692UTQajWZY\nUXsG3fl//nefrWYxNBqNZthRcwadM5iTlzfVaDSakU3NGXTSc/81Go1GSWyDTkQmES0lokec74cQ\n0WIiWk9EvyOiVOWKKZQjZPuO/QO446+bhqIIGo1GMywpxEO/BsAa4fvNAG5ljM0EsBfAleUsWBhh\n8S2fuOvv+OafVuvBUo1GM2KJZdCJaB
qAiwHc7nwnAAsA3O/scifshaIrTs5Sm/R9fRn77zkd0qjR\naEYmcT30HwD4KgA+EjkOwD7GWNb5vhXAVNUPiegqIlpCREu6urpKKiwQbtC5tK5zpWs0mpFKXoNO\nRJcA6GSMvVLMCRhjtzHG5jHG5nV0dBRzCB9ZSx3dQo66ruccaTSakUoixj6nAXgPEV0EoBFAO4Af\nAhhNRAnHS58GYFvliukR5qFrNBrNSCevh84Yu54xNo0xNgPA5QCeYYz9E4BFAD7g7HYFgIcqVkqB\nTIhG7kkuGo1GMzIpJQ79WgBfIqL1sDX1X5WnSNGEaujO/zrPi0ajGanEkVxcGGPPAnjW+bwRwEnl\nL1I0oRq6nnCk0WhGODU3UzSfhq79c41GM1KpOYOezSu5DF1ZNBqNZjhRcwY91EPXiotGoxnh1JxB\nzwpRLuoBUO2iazSakUnNGXTRQxdDGLXkotFoRjo1Z9DFKBcxJzqPctH2XKPRjFRq0KB7JnswKxh0\n539Lu+gajWaEUnsGXZBZ0tlc4O8hYeoajUZT99ScQRc19E1dve5nPq9Ie+gajWakUnMGXZRcPnz7\nYvezzrao0WhGOjVn0HN5NBXtoWs0mpFKQblchgPcQ586ugntTUl3u5ZcNBrNSKcGPXTbYI9va1BO\nLNLp0jUazUil5gw699AbTEPpjev0uRqNZqRSewbdmUyUShg+b5xPLNIeukajGanEWVO0kYheJqLl\nRLSKiL7lbL+DiDYR0TLn39zKF9cz2A0JA5YVnPqvNXSNRjNSiTMomgawgDHWQ0RJAC8Q0Z+dv32F\nMXZ/5YoXju2hCwZdD4pqNJoRTl6DzmxRusf5mnT+Vd1qypILR9tzjUYzUomloRORSUTLAHQCeIox\nxmf0fIeIVhDRrUTUEPLbq4hoCREt6erqKlOxAdMg7aFrNBqNQCyDzhjLMcbmApgG4CQiegeA6wHM\nBnAigLGwF41W/fY2xtg8xti8jo6Okgt84owxdsGJfN44nymqB0U1Gs1IpaAoF8bYPgCLAFzAGNvO\nbNIAfo0hWjD6jo+fhOe/Mh8Gqb1x7aFrNJqRSpwolw4iGu18bgJwLoC1RDTZ2UYA3gtgZSULymlp\nSGD6uGYYpJZcdBy6plaY+x9P4oO/eKnaxdDUEXGiXCYDuJOITNgvgPsYY48Q0TNE1AE7YnAZgKsr\nWM4ARARhfQsvbFGnz9XUCPv6Mli8aU+1i6GpI+JEuawAcJxi+4KKlCgmpqH2xrXkotFoRio1N1OU\nI0su0DNFNRrNCKfGDbr33VskWlt0jUYzMqlZg05SlIsXh16lAmk0Gk2VqVmDLsehc7SGrtFoRio1\nbNAlD935Xxt0jUYzUqlhg06+BaN5+lxtzzUazUildg26oSUXjUajEaldgx4quVSnPBqNRlNtatig\n62yLmtpFh9dqKkHNGnSS4tA5uqFoaoGc7kpqKkDNGnRDSsal0+dqaomsrqiaClDDBt024AvXdNob\ntOSiqSG0h66pBDVs0O3/P3HXEgxkcnpQVFNTaA9dUwlq16Bziw5/49AauqYWyOZ0nmdN+aldg06C\nQc9ZXpSL9nw0NYCWXDSVIM6KRY1E9DIRLSeiVUT0LWf7IUS0mIjWE9HviChV+eJ6CA46BnOWHhTV\n1BRactFUgjgeehrAAsbYsQDmAriAiE4BcDOAWxljMwHsBXBl5YoZRPTQB7Ne91UPimpqAe2haypB\nXoPuLATd43xNOv8YgAUA7ne23wl7XdEhgwSDnskxYU1RYOeBAXz3ibVFyy+96Sw27eotRzE1GiUZ\nraFrKkAsDZ2ITCJaBqATwFMANgDYxxjLOrtsBTA15LdXEdESIlrS1dVVjjIDkCQXyUP/0n3L8D+L\nNmDplr1FHftj//sy5n/v2RJLqNGEoz10TSWIZdAZYznG2FwA0wCcBGB23BMwxm5jjM1jjM3r6Ogo\nsp
hBDJ+HLhp0IJ2xvxfrBL3yZnEvAgA479bn8F+PrSn695qRgdbQq8Orb+1FOpurdjEqRkFRLoyx\nfQAWATgVwGgi4otMTwOwrcxli0QMWxwUo1wYc419NUIY39jZg188v3HIz6upLbSHPvRs2tWL9//0\nRXzz4dXVLkrFiBPl0kFEo53PTQDOBbAGtmH/gLPbFQAeqlQhVciSC49yYYwJs0aHskQaTXy0hj70\n7OsbBACs3n6gyiWpHIn8u2AygDuJyIT9AriPMfYIEa0GcC8RfRvAUgC/qmA5A8iSi7imqLtgNEqz\n6Iwx3+CrRlMutIc+9IyEO57XoDPGVgA4TrF9I2w9vSpEDYqSZ9FLwmKAqe35sKGzewB96RxmjG+p\ndlFKppwa+mOvbcf8IyagKWWW7Zj1TD036ZqdKUoRg6Ku/BLjOPv7M3h4+dvKv+mY9uHFNfcsw1nf\nexZb9/ZVuyglUy4PfeW2/fj0b1/Fv/1xZVmOVwx7ewdx3q3PYWNXT/6dNRWlZg26b2JRzp/LxTD4\n5/zH+fJ9y/H5e5ZifWewMlqM4ZU392IgU7+j4rXE7t40AOCJVTvLdsx7X34Lr23dX7bjxaVcGnq/\nUzc3767evIknVu3AGzt78PPnNlStDHEYCf5ZDRt07/Ng1nK9aYsxIQ1A/if49r5+AFAa7W17+/EP\nP3sRX3vgtTKUWFMqHW0NAMqb2Oq6B17Du3/yQtmOF5dyeehJ027Couw41PAroRoRM+p5WKxmDbpp\n+CUXbrst5j2wUptM94A9b+q1bUPvwRXCjxeuwz/+/MVqF6PiiM+41imXhp5yDPpwiJoZ/oayDipO\nHuJEuQxLSMrl4jX2wlLpxqmEw11L//5Tb1S7CEPKcH8ecSiXh87lxcEqGvQ6eBx1Q8166KLkkslZ\nbogiY56xL7WiccOhK+zwwH1pl8kYVjPVcrk8assq7/GKgbe94e+h29RIMYuihg26f6aoJTR2buxL\n9YIyOU+X11SHFzfswqK19jKD3HCUyw5nrOoZwXJ56LxuVlNDHwoYY/jT8rdLus6R0Ixr2KB7nwez\nliuPiROLCtEpVQ+bV564R9GrJZWfnz+3ET942paUVLJaKWRy1Xte5dLQ+b2o5rV4VM73fX7dLnzu\nnqX4/lOvAwAeX7kDn7xrSVHHqufJgnWhoYuSiyXM7izVCxrM5dxjxqEeBuuGG5mshZwkfZXNoFfR\nqy2fh27/X81rGQo/hjtX63ba4cVX/+aVyp+0BqlhD90/KMorNmNM8NDjV3JVmoDBrPOSiHkYLc2U\nn6xluVkzuWEvm0GvouRSrtBL/mKo6qCo838lHd/2Rtv3PNCfKfoYI6F11qxBN4WSZ3LMlTvEsMVC\nvCDVvryRxJVSdH6O8jMoPFt+f8tlu+pBcuH3ppoGnVNJISPhNHgeSlwMvBnXr+BSwwadpEFR3jwK\nlVyijH/amWwUX0OPuaMmNtmcf9IYUL6ximrKFDwqpVSvltfbqta9ITg5f/bdA34PvZC6MBJ60DVr\n0EXJpS+d9aJchEHR8nnoMY9R5QpTzTC8SpHNMcEzL7PkUkWvlvcOkkZpTXA4PPKhkFz4s5c99ELa\neD22D5kaNuje55c37XHf1EzItlhIt1ZljPlATPxB0epWmGq/UCpBxvImjdWT5MJfJiXa86rXOUCU\nMipn0bkx7k77DXohNprvW8dBLrVs0L2n8vb+AbyxsxuAP5dLYW/v4DbPoMc9RpUNeh16IJmcFRgM\nrQ8PvUwTi4aBQedU1EMPuc5Crr8SDs+7vrsIv138ZtmPWyxxViw6iIgWEdFqIlpFRNc4279JRNuI\naJnz76LKF1csl//7gLOOaDbHXM88jofOjb8qIsaNQy8ibLEaMen1aNCzOYaegSzW7jgg6MX1YNDL\nM0lqODzzoajrYZdZyKm58S9nT+LN3X34+oPVS10sEycOPQvgy4yx
V4moDcArRPSU87dbGWPfq1zx\nwjFC3IEHlnpLmxbiMave9K6GXsQxchZDYohXx6hLySXHsLt3EBf84C+YPrYZQPmus5qRIe6MxxIv\nZTg8cldDr+A5wtpyIR56KS+ezgMDeGnjblw6d2pZjlcp4qxYtB3AdudzNxGtATA1+leVR8y2GEZB\nGnqk5BLTQxfOl7UYEkO8gExuWMwWLC9iz6lQCSzvsasatlhY3QpjOHjonErOwBSvU2xnBUkuJby/\n/+n2xVjX2YOFazqxdMte/OWrC4bVvecUpKET0QzYy9EtdjZ9lohWENH/EtGYkN9cRURLiGhJV1dX\nSYUViWHPkStg4ohq3zQ3IDEfnLhbNbrzdemhC6GF6awTRloPkguftFbitQwHDX0oiiDW7X5h7YJC\n5oZZ3uhtwWxwVmN6ePnb2LKnP1Cm4UJsg05ErQD+AOALjLEDAH4G4DAAc2F78N9X/Y4xdhtjbB5j\nbF5HR0cZiuyWJ+8+JXvoJYQtVsP7Gw4ew+s7urHNWTSkHGSEa+LjJOW6zuEwKFrqlRSaLroSDMVZ\nRaeqb1Aw6EMkuaiqXBUnGocSy6ATURK2Mf8tY+wBAGCM7WSM5RhjFoBfYogXjA7T0EWiGv5bu/vQ\n1Z329lU87HSmeMmlGtPKh4NBP/8Hz+O0m54p2/HEKfLcQy/XZQ6W6aV72k3P4PP3LC3w3J6zUC5D\nU63Hz8s/VFEufYNe6OJQSS4qCkktMlTEiXIhAL8CsIYxdouwfbKw2/sADOlQbxzJJcpDP/O7i3DS\njU+731WySimDoiPVQy8nOYspDVa5wkOzitma6zu7Cz7Otn39oQuNhyH2DkpxrMVnXu3nX8k4dPHa\netOih67ef1dPOrDNi3IpD8PQnsfy0E8D8FEAC6QQxf8moteIaAWA+QC+WMmCysTx0PM1fHsxDPuz\nyvgPOh5h3IYi7qYNemHct2QLnl7tX/w5TBIpdxw6Y/aaso+seBvn3PI8nly1oyzHjz53cQN7MuJv\nSznOrp40Hly6tejfx+XOFzfj8ZXbC/6deGmih67q3azv7Ma8bz+N/3tpM17f0e2+pMs93jAcNfQ4\nUS4vQP1Se6z8xYkPN8RE4R5OIRq6yvjzRhdXa82VKLkcGMigfzCHie2NBf8WKF/Cp2IpxXP+6v0r\nAACbb7rY3RZ2PZWQXGb/++P4wjmzAAArt+3HeXMm2WXIWdjQ1YsjJrWV56QOPg+9hOPIobLF8ok7\nl2DZln04fWaHuxh3HHYeGMDPn9sAIJ7kcsPDqwD4n3McfB76YLSHvqtnEADwrT+tduvQ5psuLtqg\nh0liNSm5DFd42KKYC+PcoyZiyijPGBaUyyVy6n+8dKesRMllwfeew8k3Liz4d5y9fYM479bn3Fmz\nQ02fEH1QDsLuebk8Izk5V1PSjjMVoyhufnwtzv/B89i8q7fo82zs6sGFP/wL9vUNeucWrq0kD124\nhFJe6Fv32gPZher5n7t7qWtA81HKWIFPQxem/6vqAm978v3g96pQrf+Dt/1NuX0Y2vPaNehcchEn\n7zQmTVzxzhnu9zhv0KjVjfggHBAv70epYYsq3a8Qnl69E2/s7MH/LFpf0nGKpTddfGpTFWH3nDGG\nq+5aghnXPVri8SWDngoa9CVv7gUA7O4t/tn85Jn1WLP9ABau6RTOLUanePu+tbsPH/3V4tj3UjRo\npfSQin2pHBiIn598fwm5zK0wDz1PuxUp1hF4edOeyOPFGc8bKmrYoNv/J4XE6AT/hKNiMrH1C5VF\nXL8wzlqG4vmqIX9wQ5QS7smSzXuwdW/fkJy/J48RWrezG3e+uDn28UI1dAt4UtLbw9i6tw8n3/g0\ntuwJ3gP5GTU6HjoPi1u5bX/sskbBQy9F5yPMQ7/5ibX4y7pdWLjWM/5RiF5vKT0X3hvKFFhvxbGs\nfHZNjCqTyeQsfPF3y7AppCcU
FuUStXSkjBuNU6ZhUT6RbzgtaVezBp2TFBoJEZAQDHohskfOYrjn\n5bdw5Dced7elxUktufxygtgwqxHjzF9GyYT3WD/w85dw+s2LhuT8fenoe/T1P67EDQ+vwuq3D8Q6\nXtjzK8RwPfDqNuw8kMbvl2wJ/E1u+FxyGcjksOj1Tlzy4xew9K19AEqLROHGMiHIg+K5S4ty8T4X\n66HnLOZGdBW6kpKYLTKfXeuMMOivvrkXDy7dhq/ev1z5d5+Hno6OQ0+HGPRifayw68q5L4jhQ80a\ndP4cxUZiEMEUvNNCGr7FGB58dZtv29odnhYd9tZf39mNB17d6h4j3/7lZHdP2vfi6FN46ENJPg99\nxjg7F8tDy7dF7scJG1guRIvlje3BZdsw47pHfQskyC9d3nAHMlZAM7/h4VVFS2JcXhE9dLF3UK4o\nlwMD2YIkEM77f/ail9yuBA8930+5hz6qKRn4m5sfPqTuir3ffHHooZJLkRad5xAKO16ciLuhomYN\nOn82YiMhAMkYkovKIGQthr5MuEEKM9Dn3PI8vnTfcl+ZgPBKVS4GMjm867vP4v5XvFCzAcdDb0hU\n57Fy3Tfs/K0NdkOOuy5kqIdeQMPkbY1P1+aDf0DQeOUE2U1uoqvePoB/vuPvRRlMnlYiaRKyOQu/\nX7IFA5noSI24JkI0aOfc8hyO+eaTBZdv+ZZ97udCB/PJZ9Cjf7un1x48bW8KBtfxl3cixKCL9yjf\nvVN56INZq+gJUAeNiTboYQ9rIJPDjOsexS+cKKChoGYNOsf3Rie/hh7mbagMQs5ikZJBnMx84nG5\nx1MpDgxk0JPOYvv+AXebq6ErDOr+/gxeXL+romXqdTwnPrgowxt8XIMcHocev0yyvukb55CMFy9f\nf0i0zoqt+3HZz1+Kf3J+HuecpmHgrpfexFfuX+F7sagcjLiXWO4c/IVKheLdzdfR4EEKKg07667g\npLaO/NmYBvmcK/kl0tWd9qUG4PQP5oqWXFQvKsvyVtIKez/wQeDbX9hU3ImLoGYNOnOqvKiZG0Q+\njz0s+6DK0FsWU1YEThwJRWyYlfbQuV6eFowPL79Kcrn6/17Bh29f7EoOG7t6lDlXLIvhO4+uxlu7\nCx9I5dpmU9LEks17cNnPX/LdN94A4nbr40wsKrQbLZ5bTsjGTzeQyYUOdIkynIoVW/cFwkYzroZO\nymgZ0V4U2nkv9/y1Qu+naH/z/ZbfX6Z4XXHtPp/k0pgwfM7VzX9e697fgUwOJ37nadz057WB3/cO\nZouWXFQGPceYu90gwrqd3bhPMU4DDK3GXrMGnbdF0yBvkhFsL4gTZjhUhiJrMZ82J1NolEu6BA89\njtfFjbfY/eRGXuWhr9lxwFfGBd9/TplzZfX2A/jlXzbhc/cWlpvELpN9/xqTJj5791K8vHkPtu8P\nShxR19d5YAA3P74WOYtFTCwqfvBZNOLBOGXPQ4/bLZe96/f85K8479bnfduy7mIWzKe3cmMYJVX8\n0+1/wxxnoF4VqlnuhFyFTpYxCpBc3BWnIhLhha0hwMfDmlKmT1JZuLYTDy2z0y6EDYYCdnuxipRc\nVNVQrJ9EwIU//Is7OY5TjYmkNWvQ+VueiFwvnUjW0NUPOOubds3/V3vo//2BYwDkN+iMMUlDL96g\nx/FgeVnF87zueIamQXho2Tafvs6NVT4vpZRFH/jAlkHAjgO2FCR2r60QD1008Nf+YQV+9uwGLN60\nOzJs0TtndHnlASvx2cv3ghsNlYYeRhyvLyvce/G4DU7C/KhD/HX9bjfuWhWqqTp/Tzqr7CFu7OrJ\nO0GqUA29kEHRqBWnsnkGRXkdaUyagfkJcdIq9w1mizawjLHA+gsW80suWUX78mxUcecthpo16Ac5\nI89Xv+tQ92YTSKmhL31rL/6+2ZscIEZPiCvKqwwpD2VL5zEcFouWXHbsH8CM6x71lSOMOF5Sv8
JD\n5+QshmvuXYZ//b0XAsaLlu9lwWdPpopYbUnlofnudYiGLu7jhc+x0IlFhSRBkxtTVDIrn4YesxXG\niaTiz5Mxf4F4T0qpoUcc9z8fWY2bH1+LXT1ppRF9xw1P4D0//mtg+4LvP4ezvvdsnrKG92r39gZn\nhIq3KV9vgd9v1Sn4PUqEaOi8+TUlTTfHEifMURDpTeeKjtO3WLBcduI4T3LhiO3eM/hDZ9Fr1qC3\nNyax+aaLcencqW7oomH4u2x8wPB9P30R/ygMZvlWP+FGJuRh88km+Tz0nMV8x0hnLOQshntffguZ\nnIXFm3YDAO56Kf+CsvE8dFveUA2+qq6FX2c+jzZf+FgUrgcmbPvyfcvdcLWcwosB1F5zzmKhMdGF\nSC5yU8pGGXRuGHIsfpRJjA4Nv76gh+4YdLG8MV4kv3phE3727AbM+/bToTLH60Wmfwi7n1/83TIc\n959PBbaLxixfb4WXVT0rOzrKhdfpxqQZaIth9UqkbzAbuaZozmLKyWe83PK4lGUJ9VY4nNgeXYOu\nPfTC8Lxy8mnom3f1qvVylQEJ8fSaYhp0SyG53LdkC6574DX8+q+b3IYaJ+Y4TreXR2Koutaqa+Fb\n8h07oxic2tjV49PCRSyLub0FVcNatmUf/uvPa0L/Dkgx2c5t7k5n3VViAucUfq6a2ZjO5vCbv70J\ny2KRHnpY2GKOBX8XRpzeVDbkuA1J+x6XlsulvEJtmFF8ZIWdIVF+yYoTi/IVxZMlgvdswJ1Dob7x\nzLl3KWlQVDyv2NZlR79vMBd5r378zDqc8d+LXKPOGMP/LFqP3T1pWFZwfWBxUFT8i9hjdjX20LOW\nnziLRA97RA09IUkubyqiNcQHnxO72QqaUnaNjeOhixUmnc1hp6Mj9wxkQaPs7XEGseIYCW9QNLiv\n6lo87yj62PwFIRr0Bd9/DoA6Q96Nj63B7S9swvrvXOieQ75X3IvLhfSGRCPByxe1YIQvq6Xiufx0\n0Qb8cOE6NKfMgDfme/YBDd3bHrebHM9D55KL/7hxNPR8lDuFa5jMZRqEnMWQzlo+L1r00PPVbbEH\nJMPrcaiHbjGYREia5C7f5x5XIeWNbk65ce98H/5n1ct6yWY7Z8/6rh5MGd2E/3tpM777xOtY+tZe\nWCzYY80JYYuGQe79EQ2656FryaUguIduUFDr2qjw8vweof15y161B9qUtN95+QYLxTc2YHvovBvZ\nkDTdil9IFz0K16ArPHTVFGt+yYPZ6GN7sezxKuGfVtgRBlv29rv3Ve4V8UeSywUbHiDJIDHsk2g4\nzvres76GC3gJo/b0DgYa76Di5cERB47jtsE4BjXjXrd/e2tDwnde33EjBo5928tgz/3jTuoKylNs\nRA3257sX/JpUM4BVY0HysQ3WKBS+AAAgAElEQVSDkEqYgfEsVTisPBtV1LxVjG6299/dM4gfP7MO\n3/zTagD2ALPFWMCgy4Oi/B6qJJehJM6KRQcR0SIiWk1Eq4joGmf7WCJ6iojWOf8rF4keChLCoKjc\nNVJ5qz7JxXnIb+5Wj/7zSTJ5o1wseWJRzg1dbEgY4PVBFYMbVb4w+h0NXRUeqTLoLKaHzl8UcTX0\ng8e2ALBfnNzoyPeKV/bQQVExUVWMRiDv8rYUT8/LnrVYILIh2kP3vsf1qbKKAXb5M98nx5jv/G2N\n4R1kWQ4KcyjKIbk0CmGuYUaIp6mWJb6Colx4HYzw0MPObzkeeso0ghq6ol61S/fWYt69Utn1ducF\nsHVvH159y5s5SyDHQw8OiooeOI+uG1AMig4lcVptFsCXGWNHATgFwGeI6CgA1wFYyBibBWCh870q\nmKYnuYgaOqCuPL5GmOMGXT0gEldDzyk0dF75GxKGoKFHHiZQvjAiPfQDA4FtvBLLXeor7/i7L7yx\nP2JykgoebbSxq9fzwKRzeL0TvyfV64TXFZqlUm4ockgZf8
FnslbAaxSfY5iGDsQfyHptq5eRUTy2\nOKeB18E/Lt2GnwipjdsbbSMSNhNRJDzhVBkMetKb2RsmufCEb7IDId76vHHowvOX5Rlej8Pqfs6y\nn3NDwghEuXh12/ttY9JEY9Krw5bltU9VOfmz27Kn3yfj2QvohHjobpSL6KErNPThNCjKGNvOGHvV\n+dwNYA2AqQAuBXCns9udAN5bqULmw41yEWLSOWIFyebsyJOMwkMPwzXo+SQXqUuXzli+RshLxSvy\nXS9txrxvB6MG7DLHl1zkBtacMrGvL5hvxNXQpetYuLbTF96oytgYRUuDfX82dPW491K+V7yyy7G6\nc254Apf+5K/+5xHjZSY3SDnWnDe+jOBFcVQzV93jCt/jLCC9qyeNK+9c4n4XvVcxDTO/7meklLiu\n5KI4lVwHyp1wSsSXNCyknodJLoVo6GJbk8vNDaHq9IwxHBjIwAgZFFUNtjckTbftAlxDZ+5nmZ4B\n+wXc1ZP22Qwie39Z27d8PXJy61zaJ7lY7jGGioI0dCKaAeA4AIsBTGSM8cUBdwCYGPKbq4hoCREt\n6erqKqGo4YgemuyticZi5tf/jI/f8XdfpRXrhvxbwItEiBflYp+rKWkinc25lT+dtYTKZO//jYdW\nYVfPoLLLHE9yUcehm0Qhg6LOsfMYAP5bfifyGQx+f1dvPxA6ecmQInzEv6/d0e1/6cYwUHJ7lL06\nbnyyOStwL0VDHczl4n0eiEgDwZFz//g9dMGghxjJ1kZu0INSz4auHty9+C3lsUXiLLySj+aUJ0+E\n3f9EiORCBYQtirch8MJyJZfgdf7suQ24/5WtODCQVQ6KqjT0lGn4DHpO8KhVxexO205QNmcFXuYW\nC0bf5EQNnbyXonh/eP3asqcfO/YHe82VILZBJ6JWAH8A8AXGmC+hNbNfzcqnyRi7jTE2jzE2r6Oj\no6TChuGbKSrdeNngPf9GlzIOHbC9W9WxkybF8tD5cfn0ZJ5nZTBnedPeZQlAmYYghuSSURt0w6DI\nPDKZnKX0pPhxuCHilTHfYBU3Vmt3dIfeIzfKJcTg+2buxpFcpPLLRs310HNW4F5GeejiccOingA7\nDnzW1x8LeF7pEIMetmgE19BVju2v/7oZX3vwNeWxRcqRd7+tMeHW/cI9dO9zXg094sXN65nqhfLY\na96i0uqwxWC96mhLoTEleujefVbVf+6hZ63g/AeVhy72yAneCy9sUPSU/yp+aclCiGXQiSgJ25j/\nljH2gLN5JxFNdv4+GUC8JVYqgH+mqP+SVDm6xQYm3nSVQTcN9UCMTE4YgLM9dAsDzm8Gs56nKIdO\nqQ16HA/dmVgklSthUGSmx2xOPSOWDyzKDSvKsIn7DWYtrNupjhvPWRYWvd6pnB4tHgPI73EaFHwp\nykaN14eM4lrFfbOWhXkHj3EjIkSDE3Xd//nIamRywQFXX14dIRVzmOfa5mjocUJZb3x0jXJ7lEGP\nO2BqMYZ3TLHjancqBtT7BrNKSUEm/0xR77NsNHk74Pdq5bb9+Ozdr+LN3b0wxdm1ZnBikWrS3KT2\nJr/kYnlhxWoPPeuWSzwOgcBY0FG0GHPbtJgUUKWhDyVxolwIwK8ArGGM3SL86WEAVzifrwDwUPmL\nF49ERNgif/OKiJXJ8hn0YNQBEdleQd5cLl7Fsj30nJsffDDrVRJe6XnUgOq4hYQtBuSNPAscZi1L\naWB4g3I9dMe45fPQxcq/LyTP+Z0vvYmP//rveMVZnzPooXvHUGWAFEmYRmTkinj8TC54rbKH3tKQ\nwJWnH+KUQ/DQ40guUv580SvvH8wfvtbWGK6hy4QtSRf1AlQNmKuwLCDphKn+7NkNeHLVDvdv97+y\nFUd94wlscZYxlHt/hWS+9M/wVffS+DEWre3EIyu247tPvO6r06q2yB+/eP7Joxr9koslSi7hHro9\nQ1mQwBwHQhmHzj10we
744tDLPEcgDnE89NMAfBTAAiJa5vy7CMBNAM4lonUAznG+VwVTkFzEz4CX\no1tErEyihyyO9ovEMeiiptacMpHOWG6qWtGwiHmdgTCDHty2bmc33vlfC90IlrBUv2aeEZhMjim9\nOl6OfslDz5fXXaz8cXox9rH9+xWiAyedCRxRv+dlz+ZY4OXo99AZEoaX/0esC3HSHz+6Yrvvu1jX\nojJ3clpSCg29wAG0KA89bk5+izF3khNgz+7lPO0kBOPHkiUXv3wZfR45rPOhZdvw+Mrtvr/JevhA\nxvJ76BGSi+gRd7Q3+PLyW8zrUanKyXvymRxTDrqqDDp3Bn2Si+gwlDu3cQziRLm8wBgjxtgxjLG5\nzr/HGGO7GWNnM8ZmMcbOYYzlzzpVIfjNJKHrw3XbboWH/uBSL0xPbBCphBHoWgH2jL58DVwlufAJ\nL4NZy5V5uC3jb3SVNqrqqv3mb2/i7f0DeHi5PZEnzINUDeyKvLBuF34rDLZxuDHm94O/VPJq6IJx\njqvnypfHf3fI+Ja8v00mjKDkkrXw8V+/jIeWbfMdz36R+suUljx00yC3rojlj+Oh//iZ9b7vqsiW\nKPizEi+nUKcuarwl37Pj2B6oV29Ex0auT3I7CFtKb+eBAXzmt6+6vVTA77FmchauuXcZrv7Nq76/\nuRIer4eW5ffQFe3T9e4FAzqmOeW7DnEikCwNZXOWr2cqOwGhkos7KEqRYYtDSV1M/Re9cte4w5Zg\nehUa+hOrvDSk4j1PmYSEYSAjLQjd0pBAT54FkMW41KaUib7BnGfQc5ZrIF0P3Yzw0BWNdPLoJgBe\nwrGw5fLyGfTfLdmijDHn5ZA9pfySi7+RxkG+Pj6zU54MoiJhGAFtOJOzsOj1Lix6vQsXHz3ZDdtM\nZy00RWjoOcvO0cFvmU9yiWkMRXoLNOiqfOiFdtOjZv7GuYZVb+/H9n0DmDWhzd0mLiEYMOgZtXcM\n2IYvZzH85m9v4qUNu/H4qh04Y9Z4XH7SdHtf4Z7I90ceXxkUxp9kD12G1yexXh3W0SKFLQr52KV7\nLC46nbX8PVgiCpVcxPzq/POwnylaC3CvXEyfazGGhGHkXbhYJJUwlAn22xoSvsWFVYgPuDll+lKb\nprOW4B04ZS5QQx/XkgIAN/xJ9iAP7WjBr//fiXkNOqAeiOXTqVVd3iiyluXKBHFzwFuW30viaVnD\nJC+RpEkBD1+8nn/9/XLc8eJmALZBy6ehm4bhG0Tl9BexQEm/bzKR8wKPaNQ85E+0L4V206Mll/wG\n/eIfvYDudNafNCzKoEvPWE5098CrW3HDw6vwuKPDi/uLRlwumxzyynsCg1nLVwaVM+L2Kp3fPvnF\nM9HWmAxo6Pz0sr/EQxb59fgHRdUaum/qP4ntRXw5lB6BVCh1YdD9HrrTSGAbelly+eQZh4QeJ2ka\nyinvbY0J5YtB1gTFJPwi9qCoY9Cd6E7ehRvMBRudyrvjjf7t/f14/0//il09/vwlt39sHubPnhDL\noKvghk7Un4H8Xl4mx9Do6K+FeOjivdvTazeoOAbdcDwm8TJFD+uPzuo1gK1jR02h5xo6N6xiA4wr\nV4j0KTx0Vd4SAFh+w3ll8dBL0dDFl5s4QahBeA7ypK2wCBP++ek1du83IYwRfebuV3H34rd8Lzc5\nPYW4dsGM6x7FG07E1GBOklwSwTqSkQZU+blFDZ0xMcrF/n9P7yD+/Np2t223NiQCEw/t/VX50L3z\nGeSN64iOlvbQi8TL5eLXJRMGBQzxxPbG0OOkTCOQdhOwJ4CotHixMV3y4xfc9AFy+KMdtsglF3ub\nWaCGzg3Dzv0DvlwTHO7x5xsUDYM3VG96dkwNPWe53eC4g5s5y3+Ne/u4h56/OvJJWuKLSyWrAXY6\nAtngyZKLaRB4p0z8W8kG3bkXYfdkVFPSNZbiHoUagWwumKubE3YNvHe0U0gRId5P0UjL
hkzW0MXy\nMgasetueosInTfWks3h0xXZ87cHXfC+rHVI6Zj7WccBpZy9vsofkbMnF208luXjjPtyg2/s0+jx0\n77p4Ma65dyk+9dtX3XDbUU3JwNwFPvVfFYcuJufiZRAHxrVBLxIv2yL5POyEGZRcVCuiHDHR1g+T\nQs4VkTAPXTa8PGe02NVrSpq+iUWyFxE3ysXVFkM8Mq7J5wtbDCPMQ88/KMrcLnrcCpyzLN+9415N\nUwwPvX8wC4v5ZyiGyWqd3Wl3vUlOOmNhfWc3ugcyyFoWEga598wnuQjG+QMnTItxVf7IFm6gVOl9\nuQEmhYde6EDaYM5SDuQD/rolesf8o5jUTKz2oowi16eoKJccY+7Lldfv14VFtcV9397nvUwYC6Zo\nEK9BfNmorjUjyYW8LYj1qSedwe+dnEX8fvOFV95ycqCPaUnaE4uEstiSS9BuiGNmJHjovkFgbdCL\nw4ty8QaaTjh4DBIG4YAUG21Kb9qJ7Q2YM7UdANBgGkoPt7Uhie6BTGB0XG6sfB1NsSJNHtXoe+vz\nyhcdtqjw0HN+bVuGV7iwJbzyMej2IGQN3TNsG7t68KX7lvk82UyOuekR4pK1mE8r5l5fHMmlL2Mv\n9msqDLosN41vTQV+f2Agg3NueR5Hf/NJ7DyQDo1yEWO4O9oa4lwW1m73jFfG9dD9z/fzC2bihWvn\nAxA1dMHYFuqhW1Zo3h1fRI/vpWFv3yF46PIyanw+QNBDt9B5YMCNKMpJkgsfGOb1+rVtdvKyqaOb\nAlEwnEHFfAHxbz45SHGtfGCY92I9ycXb95d/2eQrJ+BlWHzL6VmPaU45M6m9YzNnf7lZPbLibTz3\nup3KRNTQCx0YLzd1YdB5N4xgN5JHP386fv3xE5EwggNoSYNw/9Wnut/trGy2IUmahlKDbmtMIJNj\nAe8kTB9tEiYojW9tsCUXy9/A3UFRxeCZqiJwj0809mJX2+2llOihi7HiL27YhXte3uLu8893/B0P\nvLrNt9BwzrJcDT0ulsV8947f1yiDnjQJ41tTuGDOJMj5ymWvkHPyoeMCxxGNLv8NNxj+iUVe+eKm\nEl64thOnOucM61EdObkdExzZjxdXNCCqgbTPzD8MHz3lYOU5M1nm1iUZ1fqW4uft+0WD7v3utuc3\n4rSbnsG6nd2KKJccvvz75bjm3mXYurfPndRjR3p49YgbNv5imD622VcGUcLsi1jvMzAoqjDoT6/Z\nifWdPUENPaQ+8VPxbJebnNTZo5tTgR4pHxuTe+73vLwFix1ZiOD1qvsiJJewtArlpC4MOp9xx2/6\nnCmj0N6YVK5+YhqEeTPG4pbLjgVgv/G5QUolDKjaRpugB4qE6aNNgsfalDJ9U//f2NmDp1fvdKNp\nXEOq8KBU5xL/JnorvBIXsbazvxyC5PLhXy7G6u1e2p7NjieTNA0MZHL449JtyBbroQuVfTCGQT+s\noxVL/u1cTBnd5M7K/fhpMwB4z0U2vHOmtAeO0y09QzvKxSlHzlLO+GttiP/COm2mbdAzbo/MX0dE\n4+QlLfOMr8pTFSNxZDI5K3TZtrC8NdxhEGdRi8fnA+4PLXs7MCjaN5hz67JtRC1cMGcSzjq8wxfl\nE5zBbEeXcCdE1Jp7B7Oh0T2yh54y1c/iW39aFdDQJ49qUo4vcA+dp3zY5DgoY5qTyhmsjAUHh2W8\nlNDhHnrcKLBSqAuDztOQyqjkB97oeQUW8yYnQyUX+/hyxrSwN66YQiCVMJAWBkUB4BN3LQlo6L4G\np6jcoqHliN6f6UouxT1SOZdGVHcxxxj++/HX8YXfLcPGXb2+WYZxsJg/NIxH0oiDyf/yrkOVv+VL\nfVmW561xD11+4Y5pDkouMnYcuie58JdKv8+gJ5W/VcG972yOYX9/JiC5iC8dXtVu/8tGHPFvj2NP\n76DyvpvCxBWZwZwVumxbWMggN56iBy96oLy+/2TR
ery4YZfvmN0DWRzaYU8AW7O92x5Ydu7hXkXa\nZk4my5C1LNcJETNV9g3mQuub3bv1T/5TMaY5FdDQzz5yAl68fkFgsH3z7j6s7+xxV+Xq6k6DSL3g\nSNayc+pHdXx5/D0ga+jhk9oqRX0YdOdByN0lVUw538YNX0PCcBuxaajfxC1OBb/kxy/4tveGTDYS\ns7zxqcpihR3f2hCIchH1xVff2ov1nX5pgL8QxOOIjdzNCV/kEw2bWKTCshh2HPAG1OJEp4jIHjpv\nBLxBdbQ14CMnqyUGPokjx5ibCTPsOUStCMQxJcmFX4tYvtYYx+F0tNp6+10vvYljv/Uk3tjpf47i\nM+Priz7pTK/f16dOp5wwww16NscKHhTldUg0MAYBP7x8LgC/7s8XTeZ0pzNuUrGbH1+LDV128izD\nIOzr84fS+sqSs5BjXjpqcWJcr7PMW9g1iE5M2LWObUkFNHQiwvjWBmWbPueW53zX39qQUEprmRwL\nRFXJiIm6OrvT+NULtl4v+3tx0kmUSl0Y9DbH4MrdaZW36koTzp9ED91iag1anJL+x6Xb8MxauwFu\n369OJNUsSAcNTqZGsVKObUkGJhaJBuSRFdtxzi3P+46p8mA+7MzAs6+HfP8XiuuhKzLXyTywdBse\ne81L4FSoh86Y//g96SwM8jRPO0xMfR0mkesRmUSRk8e4RhqFmMtF9NBF4rwYOGNbUjANwq4eO4Ji\nzXZfpmnfdcmPSpUdEvC/dGQyOStU4+cGqyedxVZhzVzVLGCDCJfOnWrnIRIMnSxBHOjPKrNbGhQd\ntpq1LFiWlzOmN6aHnrX8uVXCPPTetCfbyG0gLJRXfOG1NyaVPfqsZQWiqmQs5pdC//MRez3SgIde\nxGS1QqmLqf9uzKsUK6700F0jb/+tIWG6KXdzzkQTmcMntuEfjp+GF9Z34Qu/WwYAWPedCwNrWXKa\nJQ89k7OQsSy0NiRw1OR2bOjqwXjHk/MGRaOvUa7wX73gCN/EEe9FVaTkwsMWQxZyFvnZsxt831WR\nB/kQy943mENT0nSlAzt3hv+YvEEZRG7jMRwPPdSgN+U36AaRK31kLaZModwWIumpGNuSQsIQw9ik\nXqPwfGTnQZUdEoDz4ircoA9mLZxy40JfNAvgGR+/h66OkpIHdbsHMgGpMeqF45Yzy2AaQZmMf46q\nb2IIqaquHT6xFd0D3iQy+RrCiiZe/6gm9ZibraFHSy58Fmqzk/IDsJ0SHuGjOl+lqAsPnWuccWLO\n3RwqTqVsSBruQKLFWKju2pg0fF729Q+8hm37gquQNCVNd6adQV6mxpzFMK41hdNmjsfu3kG3i5nO\nWnh4+dvYE9JdfWnDbizeuDugqycNw3d93DiUOiiqylyXjzjhhjJi5e5JZ9GUMt3utMWYm15YxiBh\ncpaT2liMLJgyyps4FsezZoDPQ08YRmAgrRDJhRt0TqBORnjo6awV7qFLO58/Z6JT5nA5oD+TCxhz\nwHMeRI+RG+R8ET3dA9nAgh1cconi9Z3dWL39gKehi6mGFSkaRJY4aZcB9aBoW2MS3emMO1FM9qZD\nxx+EOji6We2hZ3K2hx71wuK2RFSNHl7+ti9nFKAll9i0hXnoCqPADQXvbjYmTLcyWoxhrCJ2GbAr\n+l7B6D6+cocyd3dTynQrbcI03MUxsjnb+5/YbnvmOw/YXfKte/vw+XuW4pN3LQkcCwA+9Mu/4YO3\n/S0Q+WIapOyB5JNcVKFcRKqJRfG9iTAP/dHPn45ZE1qV5RMbU286i4aE6T4vhmDvin8zpJdYUpo8\nJk73jiO5ZIUoiqxjHOWonbBBdxXNKdPn6cmzWP1Gw3+Nn7xriRtxAQAtzrUkTArIBuKqTKpHbpC9\njJ0KdxZwNufbXzxuGN0D2cD8CyPEQz98Ymtgm2fQvfsykMegi6gklzZnJnfGspT1P8wYxzHoWYsp\n49BFuATV75uz
0RvYT3voMWkN09AjBkX58nCNSW92aM7ykmAFfifEtM+a0IqedBZb9vQFMgSaBnlR\nMwYh6QyK8q4xlwH4hCceIra+U934OLLnljRJ6cV60S7Baz/h4DE4+8gJge2tqUQgHr4QDz3MoDcm\nzYBx5N6v6K1YzDbE/NmoJBeO2Dj5SjGifCNGGMXx0AeznkHneUPkHkdbAVEuJMkjvYMRkov0iPb0\nDvoM28RRPF6dIN8Ofh+zlt9D/+Hlc3HT+4/GqKYkXtvq7/Jz3ARYoofO602eLt6gkGrWuyZSGryD\nxjTjtW+eh8vmeTNtuYZuMe+FJSavE2mR5K9f/78TlYOirQ22Qc/l1JJpmP4t1sFRTSml5JLJ2hON\niAiHjG/ByYeMdf921ZmH4qCxTe6L4SThb00K6W4oNPQ4Kxb9LxF1EtFKYds3iWibtOBF1eAGvSct\nzQpVPFwvztiLffYmeIRLLuLDPnySnSrg7X397oi/eHxeabmHzheV4MvZAd7b/Pk3whfOFo18ULdU\n527n16zyxG+9bC4+ddZhge3NDabCQy9dcrEHHCWDnuAG3X89TUnTfTZyfm7A00HFR2oaQcMvNqQ4\nUhB/Lu5nCkbtxJFc7r/6VDzw6XcC8BvFgIcu/K0lj+fPI2bEOsURPVXxb9PHNuPyk6ajIWEq5RYA\nePb1Lnzizr/jpY273W38/vL7GZYfBgB296Z990jW0Pmza0yaaGtM+ssq/I47NwMZ9aCoeN/PPWoi\n5s+eEOKhJ51UDmqDzvniOYdjwWzPoRHHBxoShvK3fB+DCIv+9Sx835m/AgAzJ7Ti9Jkdrod+4Tsm\n4SOnTEdzylSOxbQ3VX7IMo6HfgeACxTbbxUXvChvsQqDJ9y6/MTpvu28coo3lxvmMw4fDwC4+JjJ\nbneW69wqRAMz28n90tmdDjw4w9F1+W/45/5MzjbwCb9Bj+LDv/yb+1mu8Akz6Eny8wP+jHmchqSh\n9KZbUgl0p7Po6k4H0pjGIcxDNxSDeZEGnRuRiKgCn+RCFDA8qoYUxWDO605zyUWe+RoncujgcS04\nfvoYAH4vXE49Id6PfAt68HtlGoSPnHKwz9MVX2RiPWgQJsmF8Z3H1uDpNf4l7UxXQ/c89bBj7Okd\n9PWEZIPO71+D2w7El49g0B1nKGz1LdFZ4ukRVGVqdyQXO7+9qsz2b5MJ8kXorNzmRSARQflbXk/5\nYxPrQkPC8EX3JEwD7Y1JDGaD0VI//8jxOGbaaOV1lpO8Bp0x9jyAqq1GFIemlIlN/3URPjN/pm87\nbzxi15tvmz2pHZtvuhjHTx/jm7E3NkRyESvlYYIuLHetTCEMbvrYZt8gUNIg9ztj+ZNRiSlGZY85\noZAGxOsT81hwGhOmMsSwpSGB59/owonfeTpv2lcV4stDfPGJ94LjSi7SC60haSDp7Kt6lXgeuv/4\nskQgG3Tx/NPHNgeOO5i1hORcFojU9/WkGWMD28LOI5ZJXmdVNPb59GoxFLUpZeLb7z1a+Vvxmrlu\nXWjkkRfl4rxEiNAQUr49vRlf3eVhixxeH/j/4ktXvP6mlAnToIBB5/daXIOAG1ZVz6G9KYl01paC\nVC9fbw0CCtWxCeSru7+/+lR84IRpbs/VCzoQXlxJ01cfeU8qa7HAfIJio88KpZSzfJaIVjiSzJiw\nnYjoKiJaQkRLurrC5YVSUXl0/J6KyZVUGuH5cybh2ING47MLZuLUQ8eppRrhd5OESAq58ZuGPZnh\nh5fPxW0fm+d6FLyyiR5Gi2JK+ecXzAxsA4Iec8I0lBN6eMVT5VdpSBpKD0dsnLznUEhabrGRiYbG\nTk3rv5cNMTz0sEkmgL9BmQYFXoryQPiGGy/CdRfOBmCPfWy+6WIcPXWU+/eMOCjqxLbz+zp9bDP+\n8tX5AIC7rjwJn3AWk1bhl4K8L/tlgy7Vv+OmB7220c1J3P6xeUIoqt97BrxFnQ
H/8+P3MMpDjyo/\nP4dpUiDpFz9mTzrje4kYJHnozv0TZ2CrSCUMNCaMgCzFs1uKDg2vL6TolY1utj35rp60Ujbx1vEN\nXxvYNPzPbWxLCmNbUsJ5nWtVeOjeMbz2HRw7yd/LKwfFGvSfATgMwFwA2wF8P2xHxthtjLF5jLF5\nHR0dRZ6uOHh3d0KbZ4BVkS+jmpN46DOn4ZDxLRjX2oANNwaHBMQBSDGnelBysf+/dO5UjG9t8DTz\nwRySpuHzkFUa6qyJbfj3S44KbM/kLH/DNUgpq3CDpxqUSZlqyUVu/GESQ5iumjQpoMHy48gGLFRy\ncbw1wGuA9151Cn7x0RN8+41q9rrhBlEgkiVq3MSNZRf2sXVzT3IzhZ7P6OYkDnK8+sakiTEhvTfA\n/wIU64psQOSGfc8nT8HTXzrTt+3wiW0456iJgcliotMiPoumlIk/fOpUvHT9AndboR46uZKL4ZYz\nTM4ayFhobvD3esV7yu8fr+tinWBC/4vP0pY99AVHTsAPL5+Lzwo9brG+yOMrfNyrqzutdNj4GU0K\nPo+zZ0/Ae46dgk+dNdNnG1KmX1PndUT20MVnkjS9HnjU2EklKcqgM8Z2MsZyjDELwC8BnFTeYpWH\nff12BAkPFQSKf1OKD5RzDngAAByaSURBVERMyyp7iAGJwXnA3QMZn6Yu/5ZPXlF5nYC9cECL1IhU\njTYhDEjJGIZaF5W3hYXphSXhSgg5cMRjqXKQ8MYtx+Q2Jrw4dG4cTzl0HCY7vSE+Tf5sYVDLNMhN\nsMQxFOXw8vc45ZVCH8UiMnizGfMtv8ZpbUj4BvCiNHdZp21Mmu4kMwD490uOws3/cIxTTk+ekxGN\nbWPSxAkHj8XkUU3e3ws06G50lOm9RORjiDOgxQR04j0k8srmeuhCbyJned4uN+hyrP7Y5hQunTsV\nl86d4m4TDbFcLp9BVzhsXP4wTSMwUYqI8KMPHYexLSnfiyIlpAQBvHoV9ND9YweqiVNA8TO4C6Uo\ng05Ek4Wv7wOwMmzfasIXC/Z56EW+KcWG2CA8ONkTlmNe+X4HBrJobUz6jLBooEe3JN3fqwb2tu3r\n948FhAyK8rSkE0JyeDcmTLzzsHH4yYePc7fJHk+YQQ+LGmlIeNkARUNjKCQX10PPBD10MQ6dIxuz\nca0NbqioSRSYDWoawAOffiee/dez3G2ikeK/43zrPXN8jbQnnXMNkVx20aiI6WxXfut8nxcalm9E\nLIOIaKDeO3eKO1jK91UNUItyiKq+FJqOQY5DTxhBeU6s60nTcA2z+OJOCtFXjQoNPWtZbg8mlbB7\njNz4feOSo7DmPy5wn8fMCW34w6fsVNeiAyCXi4977epJK50cfvcSBgU8dNHwymuXio6Ve61ST0R8\nP49uTgmSS/45MZUgbxwNEd0D4CwA44loK4AbAJxFRHNh36vNAP6lgmUsGj4gJUokxd7YpNQQG50Z\noAENV2rMYkNva0yEGvQxzSls2dOPrGWFelf+dLmGUidf6ixPd8qh49wVlEQMg3D3J08BYHtCS7fs\nw9od/gRSYfHbYYO47U1Jpc4bNSgqrynZmBQ9dM+A8fGPM52oJLt8SRwYyMJQeOimQW60CYcbEN49\n5mX67PyZGN/a4JsE0t2fQaMz6C3PfuTG4N8uPhKfOONQ/N/f3lTdjkhvTGXsZW+bk4gw6GIvQ/Vc\nSpVcTIMC2rcvi6hpe6c5J+9OSvgdv35eBlGW4JFEyDkSYNLEixvs8Mlkwgg4SLyNRHroLV4dUMmQ\nvDqZKoMuGF7xelMJw5dkz1BILrKHPqYlic5u7qH7e6BD5aHnNeiMsQ8pNv+qAmUpO/zh+SSXMnjo\ngPfw5cZ04Tsm+76Lla+tIeH7LnZhuWHqH8xFdpeJvPVSVYOiY1qS2LavHyccPMY9f9hA0Dtnjsc7\nZ47Hl+9b7tse7qGry9XemBTC3sSXTlBD5x
X7npff8m1vSnoauuiVTxndhJeuX+DrZbW68hRCJRcR\n10OXDLobuSBc1oGBTLiH7izonc9YhqWzFc8dtr947CgPXawjjarxkqKjXLz/ozz0hOnlq2lMmm6P\nIWGSez385SRGfNjpfgnI2L0I8QWnkkNbnJeIT0OXnLLRTZ4EqvTQmZfjhUsuFx09CY+9tiPcQ0/4\nPXT+J999lzT00U0p9/xcRjq0owUbu3oje23lZGj6AVXiyMn2AgdiPGtYjpB8yA+EGwOxkr/67+fi\n09LEnYaAh+5P3MUZ7eiA/ZlcqAFOZ7180naUS7Ah/+qKE/H7q091G+jU0U2BfWQCGnqIhx4mubQ3\nJdwcOaJBt2c4+o99WIc69jqVMNzfyuZr8qgmX2Pj0UEGqT10GW5gRJ0X8AyI+BLY359xez7B1Xos\nt6xR8OMeNbkdN77vaHz4ZG9+RL76Jxp3/iJSTboR77PaQy9McuGHEz10OWxRNZ+DbxcHUxOShy4u\n3iImE7Pz8HierCorIj+nGOaqGqfiY1CqOmopPPSvnG9HPp15uBeoIb9cfAZdGKB+97G2tt/WmPCN\nv4xuTrr3vW8wax9POPdQUBfZFsO455MnY8eBAV/3xyzWQ5caYkJh0FUx7D4PXZo1Jz7jsc3eJAvu\nXU8Z1Yi3hUU10pkcGpMmBjKWLxpDZGJ7Iya2N6I3ncX41gb8x6VzsKd30F0QV4Xs1YR66CFGwueh\nSx6mfLvfMXUUPnzydNy92O+hJ03Pm48KWwS8brhpBGdQqjz0pOSRi+WTf5PJMfe+BiQXntAtj7Hk\nxi6ZMFxjzq+3kCUCebnkNKyAFOWiqAdFe+jCeIM4mAlIWUQlg+6TAyUPXV68hbedVMIIXTXJO7b9\nrCeIsqmiBzS6JYnudDZEQ+ceuuHrtS/+2tm+VB9iGyci31wO0RP/0eVz8bWLZqOtMenetyZnKUsv\ntNOOamOKY1eSuvbQRzenMHtSu3/SR5FvyoCH7jzI5hCvlSM2rNaGhB3KpzAwJziTKQ4e24w5U0Zh\n5bfOxz9Lcc8DgmYvhkipaGlIYMm/nYMzZnXg0rlT8YkzDo1VRiBcQw+LcmlvTLrX0pLyd1NNKXfJ\n7EntyheG7dmFR3WI8N8zKQ81EO2hyx5gWA55LrnIi4If1mFr69PGRPd6+POVx10KhR9H6aHnHRTN\n37RnT2rDOUfaWRsDYYtmMGzR1s39ZQNsY+ZlyvT+plosJCMs85eSYtBVz64pZeJHHzoO/3elF0in\nki+4NKO6btFDP+uIDmc/ExPbG30vh2A9CEougH2veEQR3z7GccjEKJeEQW4dGtZRLrWG2JCLN+iS\nh24GPXQVfg894dsmlus9x07BY58/AxcebWvwrQ0JvOtwf9w+99ABhHroxRBIFxvioZ+iWHQZsCUa\nfl9HC3HiYqKq46aPxvrvXISjp41yG59IwlTn0lCezylf32AWc6aMcs5l/y1qUpjsvfP7Lzv1vAcg\n92o+M38m/vCpUzEv5qzRuItLh3GRUxdU911cR1RVD7hhO3R8C/7lzENdo3/9hbNxlCNF/sPx09z8\nIrJGbCqiXMTEZX7JxVvtx2JMkFyCHjqfjQvYPb6ewWiDDthtwxeSqbiv/PqUbUKYKfqjDx2HF66d\nrzzXQWOacfTUUbj1g3a+Fr+Gri4bv5ZRjmQqxqGnEp6Hrg16GRFvZrE3NjC4507giVatxMrHtekG\nt9H4j3mUtKjxrIlt+PzZs9zvA4KGnjTDFw4ulGAcul+XHteSwovXLXBn8MmIuTzk5GbilGn+WaXR\ni5JLPlrcZGw5HDK+BRtvvAgXvmOSfT6l5GL4yiKWW/yfw71WOf2saRBOONgz5tdeMNunj4vXAiAw\n07JQTj50HDbfdLE7FgQEk2gBaqeC15OmlInrLzrSjRZ61xEdrufckDTc+H45MsWk4AvJENIiiC+U\nppQnNY
CJcyGCHnrW8taTbU6Zvt5Y3Be66nqbIz10Hodul3/amGAKCMCetPanz52O9x03LXCesFWP\neH1rl5y1nnQWCcNwr29ozHmda+gcscFGLSUVhayBRWU1FBGNJZ/VyLfF0VPFbnvOYj4PvVzEGRSd\nMropUofnt0c26Cp5qVWR8iBhGLE9Wu6h8+66mI9bdQjPQ/dvV2noD33mNBw0thnf/cAxmJJnQFmV\nudI+rjPoV4HIhoRByORY3kFR+ZmKOYQSbvm8WHLeLtxMoYbhpi1oTBoYyNh518VBeY44KGqvv2n4\njnX0NCHVQtZyDXpgDkfMOq3ywrmHrhrfEOPQC0EVhy7DDynH3PemsxjVlHT1+yLNTsGMEA+9uN/N\ndtLkAp7XJXft8y2QLHrosuQSp37Jg7hervXyPbqg5OJvFHG6jdzTkw2JGP3AUaU8SJjhy6zJHDbB\njpQRB6FVccJeGdSeuGzQEwbh2IPs3Cr/OO8gnDZzPIqBv4QrMRDmTuASHQVFWlYx7zjgGZx01nJf\nvqmE4S0cQvx33NmAOz9hrnNPRA9d7E01p0y3DjF418/r6vlzJuFPnz0dgD2vgGcnbE6ZeORzp7vH\nie2hRxh0VXu0itSx40guhtSz4XHwFrPbleehD41FHyEeeuENa/HXzvZpyV7om9845NOxxYbHu73u\n1PIYr225krseehm9v6A3p74mVXm5DBOmYbsx3T4PXTUoGl9Ceu/cqRjdlPKNMaiSJ4nHBhQauiS5\nFBoZEoaroZfpeCL2tfjXER3XEpwVzK+Fx4Dz8M6ss8weYD9nOYslH/i2LGDACSk8Ztpo/G3jHpDg\nocu5ZNzzMSbIN149OnraKPzw8rk49bBxOP3mRfbvkibeMXUUDhnfgk27emN76FFhmkoP3dXQC3se\nvolFIT/l949f/5jmpDdXxCTwMd+h8tBHhEHn9UTMmZEPcXYpIMYs29/d3Bd5KqFvIpGj8/HGYBhk\nDxZGrFYkV0I3yqWCkkuYYVVV6u/947H+fUj9AhK1Um7QecUH7MrPu/1fOf+IyPISEebPDq68BKhf\nOvKgqDhz0N5ufy+XQXfDFgt46d73L6fGeqGpUiyoyt0gGFgA+P5lx+LXf92M46ePce+H+Dtyf2c/\nr6xl4e5PnoKlW/Ziv5NCgzHvN2K99A+KetcvR0VdOncqAAQkl0TMtsThvzvriA58891zAHgOTlSP\nuSIeuiRVNacSmDHOfkElTQM3vPsIfOOhVZjQHt/2lMKIMOi823nijNAsv3nhFVYeRMoXM80b3phm\ncXqyF+Xy4KdPi/y92LVdMHtCQEN/4gtnKtPwFoI8kCRXfD7gE8fDkeUt3ijEpEhccmlrSOCAE4fM\njd/mmy4uoOQe/DGovDw5OZdbVvIb+kKny4fhDlwX4BGKy5dFIafUDS1D0m/QJ7Q14toL7Mk0pvti\nY95LTip7zmI4etooHD1tFH767Hr3WHISL8Aftghf2KK6XvJnxR0c9/nEdGP5cU86ZCxm8Lw3ipnK\nMoVq6OKxQhdc4VKV8CKZPakNm3b1ImEaOG/OJJw3Z1JB5y2FEWHQDxnfgp9/5HjfrLBCCWjo7qSP\n6N8lTAM//afjfflFxLwX+eD7HDGxDb/46An4xkMr3eMCwBGCzl8ssoZuGoT/vHQODh7XgnWdPTjv\nKDvqQ7ZP31Ck+Q166PaPxJVieINsFQx6ufRm1T1NSnIZhxslHoVRLg+dz4bd3Rs+iFwscqKxMFJm\nsGfEEZ0Rfku44ecGXYx954aQMa/ei3VGzMxoObldgPCJaByue3NpKq4HzZ2EgcHg7NEo96qUQIKw\nn3IHT3QGDp/Yhj+v3FHYogJlYkQMigLABe+Y7EsuVCiyUfj0fDvCQbWyucxFR09WLooRJ+KGe3mt\njQlfPvVyJsyXvRqTCB89dQbOPLwDV55+iJsTXPSgbrnsWN/EJz6FW56U
xAeJxHQGPBPktc7CE0D5\n8kUrJRdFpA3gPcu4M0DjcsQkO8xwg2Ll91JxVxTK03JlycV3DOde24OTfhmKPy8x1JCPQTF4i7vI\nKRfEQdGEYefHzyc5cemEy4dxI9D4DE5xGUd31bGIpRNLqWNhkouq7nC59sBAVvmbSjIiPPRykJA8\n9AWzJ/rkgQ/OOyj2sbhnEif6RtZ5VVEGpSI7x2EDrqKHI78EeAhhe2MSf/rs6di6tw+A501lJMmF\n37tr7l0GoHQPnTfjSMlF9tCdfXmo5buPmYJycISz5qy4WtG/vOtQdB4o3WOXI3NUKx4BwqCowr7x\nxadVElOUh24xhu9+4FhcefohmDNlFL76hxXuPmLY4gkHj8Fbe/ryGmjXQ3d+Ky+EHgavU/2+/C72\n/7kIr7gUJyjsp9xREXt3fL0EebWqoUAb9JiERUoAheu+vELGinLhLxLHk5rQ1oDWhkRZQ+LkkKqw\ncokNVNZHuYc+qinpaq/ifvKaqDLl89DDjy03Sm78O9oasPwb54WmPCiUUc1JfOX8I3DGLC/s8foL\njyzLsROCtPDCtfND18AVtXCZay+cjUM7WnHOkRPx7Btd7vHCfidmwWxKmTjOkQ//8tX52NtnLyLD\nDRpjdm/4AinrqIrmpDM2Y/p7Svngdap/0NufP8uoxc3DvOw4hL2ceBZI8eU43umB7nPuzVCiDXpM\nZA29FHi3Nk6YlusVO/99+OTpOFdYnqwS5BvoBcIHEOUFJ7iGHrbaD6fUuPqonBlhM0VFj01c2q4c\nyAuWlwsxpW7YjEdANLDBZ9mcSuCKd84AIMxgdDV0L8qF42no/mMdNLbZleOKSXPAJRcu12TyvPTl\n3w0Ii16ElVGkGCcoYZC71qwKvvCGz6A7YaQR75aKkfcKnUWgO4lopbBtLBE9RUTrnP+LDx+pMUp5\ny3PcyRux4tB5dI1NY9J0G1GlyOdN83KokLVVleSiolweujIOXcqHzhmq/BrlhJc53zOSJxaF4WrP\nrobO5Y+ghx51rLD1ZqPgTpJcZ/LBB1vFQdGrzjgM7507BR89dUbo74qZuyFOtFLheuhCexjfFr72\nbKWJ8xTuAHCBtO06AAsZY7MALHS+1zXtjUm877ip+N//d2LJx2pQTLYJw1vguOTThiMdW85gqCLM\nQ5c93caYBr3UBQC8hYCDx0klDLQ2JDBakifK8XIeaniirnw9Cu6hR2nKQLBe8ecqestx0hoXEyHE\nZYwb3j0Hnz97FhaEzC2Q4eMGH3N6GYB9P35w+XGRL4diNPSGPAEMXEMXc8eXEnxRKnFWLHqeiGZI\nmy+FvSwdANwJ4FkA15axXMMOwyDc+sG5ZTkW957yGTnA8yoqOXX45EPG4sQZY7CvL4N1nT2xur5h\nqXRbpcrMJZewRTs45RoTCBsUffKLZwYmltWgPcfXLz4SHzllet6FSxoiJBcVTJJcxBzsYpRLGIW8\nkGeMa8bm3X3u91HNSXzp3MNj/35ca0NR8xWK6ZHl6017Hrq//janTFx8dP5xhHJT7KtkImOML1i5\nA8DEMpVnRMArST4jB4ghd5UrT3Mqgd9f/U5cc+9SrOvsieWhh8UYywbV89CjDUu55I8wrTNfoq1a\nIWkamDkh/9yDhogoFxF+t+RBUTHKRZyIFFquAjz0hz93um9hi6GiKA89T94lvpKS3GNd/R+yqDE0\nlGwmmP2UQ580EV1FREuIaElXV1epp6sL+MPPN1AICDMgh8Cd5CslRQ22ccI8dJlGxUxRFaXmDh/q\npb6GOwkhjDAKco21/b1BkfKW39Ko93whGnp7Y7IqL9jiPHRvuUMVXpRLeeYwlEqxHvpOIprMGNtO\nRJMBdIbtyBi7DcBtADBv3rwqjPsOPzyDnsuzZ3Q3t9x89JSDceqh4zBrYn4PUPbQ7/j4icr0unEX\n4ajkoOhIhEtdJ+VZjIPjeeg8ysWr
efJsUhUlv5CHgGJkPf6CC/OnLp07Bc+90VWWGdvloFiD/jCA\nKwDc5Pz/UNlKNALgjYYvOhwF7+YWm8e9EIgoljEHgh76WUeoB7QaY3bFSw5bdEyStuc2zakEnvjC\nmZieJyKKVysmTf3P+Qx6/Gn1YatdDQdK0dDDxrDef/w0vO+4qUPSPuOQ9+4T0T2wB0DHE9FWADfA\nNuT3EdGVAN4EcFklC1lv8IiAOBMphnrFk7jE7WLyrn++hl4uD32o8k7XAnG8Rvl+cQP20VMOdrcZ\nMTR0ALj1g8fiuIOGbwRzMRp6yo36CW+rw8WYA/GiXD4U8qezy1yWEYMrucTx0Id4xZO4FOLt/M+H\nj8c7prZH7lOq9l2FPEh1geeh8++EN759oS9qhX/KN8DKl24brhQjx7m96RjjXcOB4ds/qmMakgVo\n6MPUQy+Ei4/JH75VCxpsPaKqV8HFoe3/44ZA1hOFRKQNB3QrqgKjnWRQ42IsuHG4o2nzxQHqlXJF\npwy3nsxwx/XQIxRyLstUYyp7tSkkgGE4oA16FTh++hj84INz8c33zMm770Fjm7Hxxovw3uPq26CX\nSiHO45VnHAIAOGpytAw0ErjQmfxyxqzwtQLiRLkMZ/7j0jm+9YELgafCTdRID1JLLlWiEAOtQ/Hi\nE+dOzT9iQtErI9Ubx08fk/deDKdBv2L42Kkz8LGIHC9RfPHcwzFpVCMuqcKsz2LQBl1TF0RJBprS\nMKSB05FEY9LEx087pNrFiE1t9CM0mpjUuDM5LPEyMo5Ai15jaIOu0Wgi8cIWtUEf7mjJRVMQXzn/\niKJyX1cabWsqB08h295Y3kVANOVHG3RNQZR7JZ4xzUns7Svn2otacyk3x08fjX+/5Ci8X0daDXu0\nQddUlWe/Mh/9g6XH+GoHvXIQEa48vXYGBkcy2qBrqsqopmTBS5BFoQdFNSOZ4SeGajRFoDV0jUYb\ndE2dcO5RdvremRNaq1wSjaZ6aMlFUxdcNu8gXHzMlGGdj1ujqTTaQ9fUBUSkjblmxKMNukaj0dQJ\nJbk0RLQZQDeAHIAsY2xeOQql0Wg0msIpRx91PmNsVxmOo9FoNJoS0JKLRqPR1AmleugMwJNExAD8\ngjF2m7wDEV0F4Crnaw8RvV7kucYDGGk9AX3NIwN9zSODUq754Py7AFTKOoFENJUxto2IJgB4CsDn\nGGPPF33A6HMtGWkavb7mkYG+5pHBUFxzSZILY2yb838ngAcBnFSOQmk0Go2mcIo26ETUQkRt/DOA\n8wCsLFfBNBqNRlMYpWjoEwE86Kw3mABwN2Ps8bKUSk1Anx8B6GseGehrHhlU/JpL0tA1Go1GM3zQ\nYYsajUZTJ2iDrtFoNHVCTRh0IrqAiF4novVEdF21y1MuiOh/iaiTiFYK28YS0VNEtM75f4yznYjo\nR849WEFEx1ev5MVBRAcR0SIiWk1Eq4joGmd73V4zABBRIxG9TETLnev+lrP9ECJa7Fzf74go5Wxv\ncL6vd/4+o5rlLxYiMoloKRE94nyv6+sF7HQoRPQaES0joiXOtiGr38PeoBORCeB/AFwI4CgAHyKi\no6pbqrJxB4ALpG3XAVjIGJsFYKHzHbCvf5bz7yoAPxuiMpaTLIAvM8aOAnAKgM84z7KerxkA0gAW\nMMaOBTAXwAVEdAqAmwHcyhibCWAvgCud/a8EsNfZfquzXy1yDYA1wvd6v17OfMbYXCHmfOjqN2Ns\nWP8DcCqAJ4Tv1wO4vtrlKuP1zQCwUvj+OoDJzufJAF53Pv8CwIdU+9XqPwAPATh3hF1zM4BXAZwM\ne9Zgwtnu1nMATwA41fmccPajape9wOuc5hivBQAegb16d91er3DdmwGMl7YNWf0e9h46gKkAtgjf\ntzrb6pWJjLHtzucdsMNDgTq7D063+jgAizECrtmRH5YB6IQ9q3oDgH2Msayzi3ht7nU7f98PYNzQ\n
lrhkfgDgqwAs5/s41Pf1cng6lFectCfAENZvvSLAMIYxxpw8OXUFEbUC+AOALzDGDpCwsnO9XjNj\nLAdgLhGNhj2renaVi1QxiOgSAJ2MsVeI6Kxql2eIOZ0J6VCIaK34x0rX71rw0LcBOEj4Ps3ZVq/s\nJKLJAOD83+lsr4v7QERJ2Mb8t4yxB5zNdX3NIoyxfQAWwZYcRhMRd6rEa3Ov2/n7KAC7h7iopXAa\ngPc46yXcC1t2+SHq93pdmDodypDV71ow6H8HMMsZIU8BuBzAw1UuUyV5GMAVzucrYOvMfPvHnJHx\nUwDsF7pxNQHZrvivAKxhjN0i/KlurxkAiKjD8cxBRE2wxw3WwDbsH3B2k6+b348PAHiGOSJrLcAY\nu54xNo0xNgN2e32GMfZPqNPr5VB4OpShq9/VHkSIOdBwEYA3YOuOX692ecp4XfcA2A4gA1s/uxK2\ndrgQwDoATwMY6+xLsKN9NgB4DcC8ape/iOs9HbbGuALAMuffRfV8zc51HANgqXPdKwF8w9l+KICX\nAawH8HsADc72Ruf7eufvh1b7Gkq49rMAPDISrte5vuXOv1XcVg1l/dZT/zUajaZOqAXJRaPRaDQx\n0AZdo9Fo6gRt0DUajaZO0AZdo9Fo6gRt0DUajaZO0AZdo9Fo6gRt0DUajaZO+P9tug6qWwY2MQAA\nAABJRU5ErkJggg==\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "plt.plot(losses)\n",
    "plt.ylim(5,50)"
   ]
  },
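  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of why gradients must be zeroed before each backward pass (the tensor `x` below is illustrative only, not part of the linear-regression example above): `backward` accumulates into `.grad` rather than overwriting it, so without zeroing, a second backward pass doubles the stored gradient."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "x = t.ones(2, 2, requires_grad=True)\n",
    "x.sum().backward()  # 1st backward: x.grad is all ones\n",
    "x.sum().backward()  # gradients accumulate: x.grad is now all twos\n",
    "x.grad.zero_()      # clear the accumulated gradients in place\n",
    "x.sum().backward()  # after zeroing: x.grad is all ones again\n",
    "x.grad"
   ]
  },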
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The biggest difference in this autograd-based implementation of linear regression is that backpropagation no longer has to be implemented by hand: autograd computes the gradients automatically. This is useful not only in deep learning but in many other machine-learning problems as well. Note, however, that the gradients must be zeroed before each backward pass, because `backward` accumulates gradients instead of overwriting them.\n",
    "\n",
    "This chapter introduced two fundamental low-level data structures in PyTorch: Tensor, and the Variable in autograd. Tensor is an efficient multi-dimensional numerical array similar to a NumPy ndarray, with a NumPy-like interface and simple, easy-to-use GPU acceleration. Variable wraps a Tensor and adds automatic differentiation, while exposing an interface almost identical to Tensor's. `autograd` is PyTorch's automatic differentiation engine; built on dynamic computation graphs, it computes derivatives quickly and efficiently."
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 2",
   "language": "python",
   "name": "python2"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 2
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython2",
   "version": "2.7.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
