{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# PyTorch 中的循环神经网络模块\n",
    "前面我们讲了循环神经网络的基础知识和网络结构，下面我们教大家如何在 pytorch 下构建循环神经网络，因为 pytorch 的动态图机制，使得循环神经网络非常方便。"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 一般的 RNN\n",
    "\n",
    "![](https://ws1.sinaimg.cn/large/006tKfTcly1fmt9xz889xj30kb07nglo.jpg)\n",
    "\n",
    "对于最简单的 RNN，我们可以使用下面两种方式去调用，分别是 `torch.nn.RNNCell()` 和 `torch.nn.RNN()`，这两种方式的区别在于 `RNNCell()` 只能接受序列中单步的输入，且必须传入隐藏状态，而 `RNN()` 可以接受一个序列的输入，默认会传入全 0 的隐藏状态，也可以自己申明隐藏状态传入。\n",
    "\n",
    "`RNN()` 里面的参数有\n",
    "\n",
    "input_size 表示输入 $x_t$ 的特征维度\n",
    "\n",
    "hidden_size 表示输出的特征维度\n",
    "\n",
    "num_layers 表示网络的层数\n",
    "\n",
    "nonlinearity 表示选用的非线性激活函数，默认是 'tanh'\n",
    "\n",
    "bias 表示是否使用偏置，默认使用\n",
    "\n",
    "batch_first 表示输入数据的形式，默认是 False，就是这样形式，(seq, batch, feature)，也就是将序列长度放在第一位，batch 放在第二位\n",
    "\n",
    "dropout 表示是否在输出层应用 dropout\n",
    "\n",
    "bidirectional 表示是否使用双向的 rnn，默认是 False\n",
    "\n",
    "对于 `RNNCell()`，里面的参数就少很多，只有 input_size，hidden_size，bias 以及 nonlinearity"
   ]
  },
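  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To make the shape conventions above concrete, here is a minimal sketch (assuming the standard `torch.nn.RNN` interface) of how `batch_first` and `bidirectional` change the input and output shapes:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import torch\n",
    "from torch import nn\n",
    "\n",
    "# batch_first=True expects (batch, seq, feature) instead of (seq, batch, feature)\n",
    "rnn_bf = nn.RNN(input_size=100, hidden_size=200, batch_first=True)\n",
    "x_bf = torch.randn(5, 6, 100)  # batch 5, sequence length 6, feature 100\n",
    "out_bf, h_bf = rnn_bf(x_bf)\n",
    "print(out_bf.shape)  # torch.Size([5, 6, 200])\n",
    "\n",
    "# bidirectional=True doubles the output feature dimension (forward + backward)\n",
    "rnn_bi = nn.RNN(input_size=100, hidden_size=200, bidirectional=True)\n",
    "x_bi = torch.randn(6, 5, 100)  # seq 6, batch 5, feature 100\n",
    "out_bi, h_bi = rnn_bi(x_bi)\n",
    "print(out_bi.shape)  # torch.Size([6, 5, 400])\n",
    "print(h_bi.shape)    # torch.Size([2, 5, 200]), num_layers * num_directions = 2"
   ]
  },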
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "import torch\n",
    "from torch.autograd import Variable\n",
    "from torch import nn"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 47,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# 定义一个单步的 rnn\n",
    "rnn_single = nn.RNNCell(input_size=100, hidden_size=200)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 48,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "Parameter containing:\n",
       "1.00000e-02 *\n",
       " 6.2260 -5.3805  3.5870  ...  -2.2162  6.2760  1.6760\n",
       "-5.1878 -4.6751 -5.5926  ...  -1.8942  0.1589  1.0725\n",
       " 3.3236 -3.2726  5.5399  ...   3.3193  0.2117  1.1730\n",
       "          ...             ⋱             ...          \n",
       " 2.4032 -3.4415  5.1036  ...  -2.2035 -0.1900 -6.4016\n",
       " 5.2031 -1.5793 -0.0623  ...   0.3424  6.9412  6.3707\n",
       "-5.4495  4.5280  2.1774  ...   1.8767  2.4968  5.3403\n",
       "[torch.FloatTensor of size 200x200]"
      ]
     },
     "execution_count": 48,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 访问其中的参数\n",
    "rnn_single.weight_hh"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# 构造一个序列，长为 6，batch 是 5， 特征是 100\n",
    "x = Variable(torch.randn(6, 5, 100)) # 这是 rnn 的输入格式\n",
    "# 6是第一层，5是第二层，100是第三层"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 50,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# 定义初始的记忆状态\n",
    "h_t = Variable(torch.zeros(5, 200))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 51,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# 传入 rnn\n",
    "out = []\n",
    "for i in range(6): # 通过循环 6 次作用在整个序列上\n",
    "    h_t = rnn_single(x[i], h_t)\n",
    "    out.append(h_t)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 52,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "Variable containing:\n",
       " 0.0136  0.3723  0.1704  ...   0.4306 -0.7909 -0.5306\n",
       "-0.2681 -0.6261 -0.3926  ...   0.1752  0.5739 -0.2061\n",
       "-0.4918 -0.7611  0.2787  ...   0.0854 -0.3899  0.0092\n",
       " 0.6050  0.1852 -0.4261  ...  -0.7220  0.6809  0.1825\n",
       "-0.6851  0.7273  0.5396  ...  -0.7969  0.6133 -0.0852\n",
       "[torch.FloatTensor of size 5x200]"
      ]
     },
     "execution_count": 52,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "h_t"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 54,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "6"
      ]
     },
     "execution_count": 54,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "len(out)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 55,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([5, 200])"
      ]
     },
     "execution_count": 55,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "out[0].shape # 每个输出的维度"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "可以看到经过了 rnn 之后，隐藏状态的值已经被改变了，因为网络记忆了序列中的信息，同时输出 6 个结果"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "下面我们看看直接使用 `RNN` 的情况"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "rnn_seq = nn.RNN(100, 200)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "Parameter containing:\n",
       "tensor([[ 0.0495,  0.0594, -0.0170,  ..., -0.0540,  0.0464, -0.0082],\n",
       "        [-0.0270,  0.0400, -0.0660,  ...,  0.0393, -0.0243, -0.0012],\n",
       "        [ 0.0656, -0.0424,  0.0199,  ...,  0.0478, -0.0693,  0.0547],\n",
       "        ...,\n",
       "        [-0.0602,  0.0239,  0.0688,  ..., -0.0230,  0.0374, -0.0148],\n",
       "        [ 0.0063, -0.0675, -0.0552,  ..., -0.0414,  0.0440, -0.0452],\n",
       "        [-0.0265,  0.0449,  0.0317,  ...,  0.0631,  0.0584, -0.0202]],\n",
       "       requires_grad=True)"
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 访问其中的参数\n",
    "rnn_seq.weight_hh_l0"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[[ 2.3337, -0.3935, -0.0730,  ..., -2.6293,  0.0339,  0.9435],\n",
       "         [ 0.2736,  0.7379,  0.0568,  ..., -0.2879,  1.2094, -0.4880],\n",
       "         [-0.3442,  1.2629, -1.1864,  ...,  0.0544, -0.7236, -2.3669],\n",
       "         [-0.4588, -0.6725, -0.2958,  ...,  0.5017,  1.2665,  0.7472],\n",
       "         [-1.2024, -0.0426, -1.5522,  ..., -0.6369, -0.5132,  0.5688]],\n",
       "\n",
       "        [[-0.1256, -0.7223, -0.1060,  ..., -0.0851,  1.1671,  0.2434],\n",
       "         [-0.5946,  0.1273,  1.0485,  ..., -0.4135,  0.3583, -0.7491],\n",
       "         [ 0.0916, -0.7837,  0.5234,  ..., -0.5842, -1.0563, -0.0737],\n",
       "         [ 0.6007, -0.0144,  0.7806,  ..., -1.7140, -0.4707, -1.0492],\n",
       "         [ 1.1021,  0.0721,  0.0560,  ...,  0.5380, -0.5533,  0.6495]],\n",
       "\n",
       "        [[-0.6520, -0.1016, -0.5006,  ...,  1.7436, -2.4917,  1.2215],\n",
       "         [-0.2105, -1.4971,  1.7161,  ...,  0.0461,  2.4243,  0.4196],\n",
       "         [ 1.5345,  1.3704,  0.1540,  ..., -0.9233,  0.9775,  0.5871],\n",
       "         [-0.9520,  0.3556, -1.2828,  ...,  0.4603, -0.6394,  0.0310],\n",
       "         [ 0.2464, -0.2842,  0.5034,  ..., -0.6263,  0.4160,  0.4864]],\n",
       "\n",
       "        [[-0.6444, -0.5083, -1.4501,  ...,  0.1382,  0.7963, -1.1112],\n",
       "         [-0.7624, -0.1767,  0.0835,  ...,  0.1520, -0.9241, -0.9122],\n",
       "         [ 2.4732, -0.4949, -1.4434,  ...,  0.7328,  0.4291, -1.3238],\n",
       "         [ 0.2786,  0.5268,  1.2083,  ..., -0.5961, -0.0181,  0.1301],\n",
       "         [-1.2226,  0.7160, -0.2687,  ..., -0.1740, -0.2402,  0.7683]],\n",
       "\n",
       "        [[-0.9851,  0.7887,  0.3409,  ..., -0.5939, -0.3186,  0.0697],\n",
       "         [-0.5188, -3.0232, -0.2951,  ...,  0.2169,  0.3773, -0.9107],\n",
       "         [-0.2125, -1.0329,  0.7220,  ...,  0.5019, -0.0160,  0.1643],\n",
       "         [ 1.0428, -0.6607, -0.3897,  ...,  0.0907, -0.8109, -1.0317],\n",
       "         [-0.1065,  1.8048,  0.3149,  ..., -0.0721, -1.0127, -0.7613]],\n",
       "\n",
       "        [[ 0.0091, -0.4303, -1.4101,  ...,  0.8644,  0.1530, -0.9894],\n",
       "         [-1.3198,  0.0421, -1.4443,  ..., -0.1208, -0.5738,  0.9360],\n",
       "         [ 1.6767,  0.0207,  1.3295,  ...,  0.1924,  1.3105,  0.9618],\n",
       "         [ 0.6547,  0.8355, -0.4637,  ...,  0.1809, -0.6653, -1.2893],\n",
       "         [-1.8609,  0.9387, -0.3488,  ..., -1.7976, -1.0713,  0.5294]]])"
      ]
     },
     "execution_count": 5,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "x"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "out, h_t = rnn_seq(x) # 使用默认的全 0 隐藏状态"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[[ 1.1333e-01,  1.5885e-01,  4.1467e-02, -4.1598e-03, -2.3698e-01,\n",
       "           7.7856e-01,  8.2923e-02,  2.3304e-01, -1.3479e-01,  2.1382e-01,\n",
       "           9.3382e-02, -1.3492e-03, -4.5977e-02,  7.6041e-02,  1.9969e-01,\n",
       "          -4.5109e-01,  6.2368e-02,  5.6468e-01, -6.4800e-01, -4.2772e-01,\n",
       "          -1.1335e-01,  2.8719e-01,  2.0987e-02, -8.2709e-02, -6.7418e-01,\n",
       "          -3.1197e-01,  3.8830e-01,  3.9278e-01,  5.0124e-01, -4.1049e-01,\n",
       "          -9.3685e-02, -1.9230e-01, -5.3560e-01,  2.6041e-01, -1.4221e-01,\n",
       "           2.2028e-01, -3.0141e-01, -1.9759e-01,  1.4725e-01, -6.7380e-01,\n",
       "           1.8927e-01, -1.3742e-01,  2.7362e-01,  7.7025e-01, -1.1050e-02,\n",
       "           7.0728e-01, -6.6395e-01, -4.2013e-01, -1.6709e-02, -1.2752e-02,\n",
       "          -8.1493e-02,  5.9625e-01, -3.6534e-01, -4.4869e-01,  1.5868e-01,\n",
       "           2.5594e-01, -5.8189e-01,  1.5935e-01,  2.9986e-01, -6.6092e-01,\n",
       "           2.5414e-01,  1.6611e-01, -2.8547e-01,  1.9613e-02,  4.0227e-02,\n",
       "          -7.0440e-01, -4.2571e-02,  2.5250e-01, -4.7731e-02,  6.1641e-01,\n",
       "          -4.3611e-02, -4.3344e-01, -1.4384e-01,  7.9705e-01, -1.7841e-02,\n",
       "          -5.5412e-01,  3.3516e-01, -7.1146e-02, -6.5710e-01, -4.6942e-01,\n",
       "          -3.5815e-01,  1.8904e-01, -1.2555e-01,  4.4845e-01,  2.4242e-01,\n",
       "          -1.7767e-02, -2.9842e-01,  4.5036e-01, -3.1875e-01,  3.3050e-01,\n",
       "           6.0664e-02,  5.6425e-01, -4.8079e-01,  6.7178e-02, -1.6364e-01,\n",
       "          -2.8031e-02,  7.8458e-01,  5.8046e-01, -8.8500e-03, -1.9226e-01,\n",
       "          -5.5157e-01, -4.9476e-01,  4.8151e-01,  6.2854e-02,  5.2827e-02,\n",
       "          -2.7246e-02, -6.2305e-01,  5.2033e-01,  3.9183e-01,  5.3035e-01,\n",
       "          -3.2824e-01,  2.4203e-01, -1.2596e-01, -3.6850e-01,  5.3687e-01,\n",
       "           6.8619e-01, -4.0564e-01,  2.3586e-01,  3.8086e-01,  8.4018e-01,\n",
       "           4.0339e-01, -1.8915e-02, -1.1060e-01, -1.0305e-01, -7.7821e-02,\n",
       "           3.5617e-01, -4.3336e-01,  5.9209e-01,  1.6361e-01,  4.4984e-03,\n",
       "           1.8924e-01, -1.0637e-01,  4.7823e-01, -3.7325e-01,  3.4345e-01,\n",
       "           1.0437e-02, -1.1450e-01,  3.5640e-01,  2.4121e-01,  1.2666e-01,\n",
       "          -3.4810e-01,  5.3347e-01,  4.8448e-01, -2.6108e-02,  4.3594e-01,\n",
       "           3.0750e-02, -6.8106e-02,  2.8016e-02, -3.7796e-01,  2.9825e-01,\n",
       "           3.6519e-01,  2.6988e-01,  1.6736e-02,  3.4779e-01, -3.6138e-01,\n",
       "          -5.7805e-01,  5.7597e-02,  4.3435e-01, -4.2368e-02,  3.0988e-01,\n",
       "           2.0042e-01,  2.0993e-01,  4.7922e-01,  1.1868e-01, -1.1851e-01,\n",
       "           4.5553e-01,  3.7708e-01, -1.6790e-01, -2.7367e-01, -1.7158e-01,\n",
       "          -3.8039e-01, -6.4601e-02,  1.3118e-01, -3.7543e-01, -5.8987e-01,\n",
       "          -5.1390e-01, -3.7312e-01,  2.8403e-01, -4.4659e-01,  4.2278e-01,\n",
       "          -1.5281e-01,  5.7084e-01, -3.0367e-01,  3.3350e-01,  1.2369e-02,\n",
       "          -6.8026e-01, -4.0338e-03,  2.2079e-01,  1.8614e-01,  2.5105e-01,\n",
       "          -1.5955e-01,  2.2249e-01,  1.3995e-01, -7.6564e-01, -6.5170e-01,\n",
       "          -4.1618e-03, -4.7143e-01, -2.0208e-01,  5.8075e-01, -1.2151e-01],\n",
       "         [ 3.8105e-01,  3.8160e-01, -2.0083e-01, -1.4067e-01, -5.9709e-01,\n",
       "           6.9748e-01, -3.8272e-01, -6.8977e-01, -1.7910e-01,  1.4449e-01,\n",
       "          -2.0036e-01, -4.7418e-01, -2.7828e-01, -6.2963e-01,  1.3805e-01,\n",
       "           2.5144e-01,  2.4074e-03,  9.2719e-02, -6.5571e-01,  3.3713e-02,\n",
       "          -1.0780e-01, -1.0764e-01, -2.5998e-01,  1.3800e-01,  5.0109e-01,\n",
       "          -2.2037e-01, -3.4372e-01, -7.3940e-01, -3.9438e-01, -1.0568e-01,\n",
       "          -3.1472e-02,  6.2736e-01,  3.3210e-01,  2.7996e-02, -3.9655e-01,\n",
       "           5.1917e-01,  3.7640e-01, -1.2992e-01,  4.8017e-01,  1.1641e-01,\n",
       "          -2.8019e-01,  5.3436e-02, -1.6225e-01, -2.4883e-01,  2.3426e-01,\n",
       "           2.3127e-01,  6.1225e-03, -5.1290e-01,  4.4010e-01, -7.5828e-01,\n",
       "           1.1770e-01,  5.3659e-01, -4.5349e-01,  6.2796e-01,  1.1158e-01,\n",
       "          -3.5555e-01,  3.9130e-01, -2.8065e-01,  1.7428e-01, -5.5042e-01,\n",
       "          -5.4952e-01, -5.0928e-01, -5.5796e-01,  2.9717e-02,  2.6472e-01,\n",
       "           2.4971e-01, -6.6920e-01, -2.6801e-01,  2.1138e-01,  2.8839e-01,\n",
       "           4.8042e-03,  5.7628e-01, -2.0776e-01, -2.1986e-01,  1.0972e-01,\n",
       "           1.2122e-01, -5.3642e-02,  2.8031e-01, -6.5222e-01,  2.6228e-01,\n",
       "          -5.9874e-01, -5.5464e-01,  3.4574e-01,  1.5002e-01,  3.3130e-01,\n",
       "           8.2086e-01,  2.2505e-01, -3.3721e-01,  5.7500e-01, -1.4017e-01,\n",
       "           3.4915e-01,  2.7124e-01,  4.8423e-01, -2.5866e-01,  2.9013e-01,\n",
       "          -1.6470e-01, -5.7517e-02,  2.7647e-01,  4.6972e-01, -4.7282e-02,\n",
       "           7.7063e-01, -2.3530e-01, -7.9087e-02,  8.8192e-02,  2.3925e-01,\n",
       "           2.3047e-01, -5.8297e-01, -7.7614e-01,  3.5726e-01, -2.1197e-01,\n",
       "           3.5829e-02, -3.9944e-01, -3.4101e-01, -2.5393e-01, -2.0071e-01,\n",
       "          -8.0387e-02, -2.8245e-01,  1.4163e-01,  6.3917e-01, -4.8184e-03,\n",
       "           3.3737e-01, -6.4155e-01,  5.1128e-01,  1.7942e-01,  7.8466e-02,\n",
       "          -7.4842e-01,  4.9155e-01, -2.8504e-01,  2.7376e-01,  2.3414e-01,\n",
       "          -5.1921e-01,  7.4739e-01, -1.0949e-02,  3.7171e-01, -1.8497e-01,\n",
       "           5.4580e-02,  3.5336e-01,  4.5023e-01, -6.8025e-01, -1.2923e-01,\n",
       "           2.9356e-02, -2.4788e-01, -5.9319e-02, -7.7475e-02, -7.1484e-01,\n",
       "          -8.3826e-01,  5.4403e-01,  1.5490e-01, -6.7083e-02, -7.0501e-01,\n",
       "           5.3041e-02, -1.4546e-02, -6.1355e-01,  2.4089e-01, -2.2094e-01,\n",
       "           6.1187e-01, -1.3159e-01, -4.1949e-01, -1.8331e-01,  1.9640e-01,\n",
       "           1.8149e-01, -1.3605e-01, -3.1543e-01, -3.9621e-01, -2.3576e-01,\n",
       "          -5.3976e-01, -1.6723e-01,  3.2618e-01, -3.1994e-01, -3.8873e-01,\n",
       "          -1.8042e-02, -1.4516e-01, -6.3053e-01,  6.4031e-01, -3.1954e-01,\n",
       "           3.5876e-01,  2.3368e-01, -1.8702e-01,  8.0107e-02,  1.2340e-01,\n",
       "          -6.9573e-02, -3.8268e-01,  4.1190e-02,  6.1654e-01, -5.4934e-01,\n",
       "           2.7678e-01, -2.1214e-01,  1.5815e-01,  3.8957e-01,  5.2916e-01,\n",
       "           6.6004e-02, -3.0989e-01, -2.3035e-01, -6.4198e-03,  6.7915e-02,\n",
       "          -2.3281e-01, -5.1932e-01,  5.8535e-01, -1.5538e-01, -6.2948e-01],\n",
       "         [ 6.3911e-02,  6.0429e-01, -4.3521e-01,  8.1831e-02, -1.8088e-01,\n",
       "           9.3712e-02, -2.7287e-01,  2.1750e-01,  3.7792e-01, -6.4603e-01,\n",
       "           8.3571e-01, -5.9463e-02, -1.2471e-02,  4.1060e-01,  4.3455e-01,\n",
       "           3.9026e-01,  4.1321e-01, -2.9216e-01,  2.7732e-01,  1.3208e-01,\n",
       "           6.0285e-01,  2.5605e-01,  4.2834e-01,  4.3552e-03,  6.2964e-01,\n",
       "           2.9963e-01,  5.0871e-01, -3.5363e-01,  2.6393e-01,  1.1061e-01,\n",
       "          -3.8740e-01,  1.6748e-01, -3.1708e-01,  5.1767e-01, -4.6685e-01,\n",
       "          -3.0800e-01,  6.8178e-01, -1.0236e-01, -8.9042e-02, -1.6198e-01,\n",
       "          -5.3060e-01,  3.3905e-01, -2.3259e-01,  3.6051e-01,  6.5969e-01,\n",
       "           1.1407e-01,  2.1329e-01,  1.6021e-01,  1.2152e-01, -5.9205e-01,\n",
       "           2.1677e-01,  4.4351e-01,  2.6607e-01,  5.7190e-01,  7.7724e-01,\n",
       "          -2.9488e-01,  3.3577e-01, -2.2565e-01, -3.8349e-01, -4.4188e-01,\n",
       "           1.9195e-03, -3.0170e-02, -1.6308e-01,  6.5919e-02, -3.9777e-01,\n",
       "          -7.7024e-01,  4.9891e-01,  1.4516e-01,  3.6539e-01, -5.4812e-01,\n",
       "          -6.6937e-01,  5.6637e-01,  3.2272e-01, -5.9503e-01,  3.6230e-01,\n",
       "           8.5570e-02, -1.1914e-01, -4.5350e-01, -1.1963e-01,  4.7285e-01,\n",
       "           2.8878e-01, -5.9265e-02,  5.6877e-01,  5.1695e-01,  8.7426e-01,\n",
       "           1.7637e-03,  3.1861e-01, -2.1093e-01,  6.5960e-01,  4.0056e-01,\n",
       "           5.5552e-01,  2.6890e-01, -3.6500e-01, -6.2268e-01,  5.3769e-01,\n",
       "           3.1707e-01, -8.3516e-02,  3.0651e-01,  4.2277e-01,  3.1428e-01,\n",
       "          -1.0055e-01, -5.0615e-01,  4.2011e-01, -4.6456e-01,  2.7884e-01,\n",
       "          -1.7962e-01, -4.4736e-02, -1.1770e-01, -4.8496e-01, -8.3252e-02,\n",
       "          -2.7845e-01,  2.1586e-01,  2.3801e-01,  7.5834e-01, -3.8672e-01,\n",
       "          -6.3076e-01,  3.8102e-02, -1.4574e-01, -2.9606e-01,  6.8419e-01,\n",
       "          -3.4980e-02,  2.5895e-01,  4.0484e-01,  1.6800e-01,  1.9677e-01,\n",
       "          -5.5401e-01,  2.4309e-01,  5.7823e-01,  4.0152e-02, -4.0045e-01,\n",
       "           2.0456e-01,  2.9267e-01, -1.0258e-01, -8.6060e-02,  2.3996e-01,\n",
       "           2.8594e-01,  1.4188e-01,  2.1178e-01, -2.6816e-03, -2.1537e-01,\n",
       "           1.6737e-01, -8.2726e-02, -1.7401e-01,  4.1772e-01,  3.8375e-01,\n",
       "          -3.3469e-01,  2.8644e-01, -2.4267e-01, -2.5954e-01, -9.5156e-02,\n",
       "           1.5712e-01, -5.2707e-01,  4.0699e-01,  5.2537e-01, -2.6988e-01,\n",
       "          -3.6461e-01,  4.8050e-01, -7.2475e-01, -5.6256e-01,  3.0393e-01,\n",
       "          -1.1885e-01, -6.6653e-01, -5.2889e-01, -1.4750e-01,  2.5170e-01,\n",
       "          -1.5842e-01, -3.2212e-01, -3.0366e-01, -3.5094e-01, -4.7790e-01,\n",
       "           3.8216e-01, -7.9255e-01, -1.2842e-01,  8.0835e-02,  6.2077e-01,\n",
       "          -3.4337e-01, -1.0018e-01,  3.5668e-02, -8.6934e-03,  1.6719e-01,\n",
       "          -4.8586e-01, -4.7055e-01,  2.4087e-01,  2.3722e-01, -2.1609e-01,\n",
       "           1.3467e-01, -7.8848e-01, -6.1148e-01,  4.4824e-01, -5.2596e-01,\n",
       "           6.4961e-01, -1.3039e-01,  3.2938e-01, -7.7706e-02,  4.1510e-01,\n",
       "           1.4958e-01, -3.7154e-01, -1.7087e-01,  1.0107e-01,  5.4957e-01],\n",
       "         [ 5.7845e-01, -2.3482e-01, -1.5270e-01,  3.2934e-01, -3.0766e-01,\n",
       "          -6.2181e-02,  1.5827e-01,  9.1037e-02, -4.3704e-01, -2.6513e-01,\n",
       "          -5.2287e-01,  1.7078e-01,  4.4074e-01, -2.1733e-01,  1.2664e-01,\n",
       "          -3.2632e-01,  5.5028e-02,  6.1636e-01,  2.6154e-01, -1.6852e-01,\n",
       "          -3.8369e-01,  1.1406e-01, -1.4193e-01, -1.7216e-02,  7.5114e-01,\n",
       "          -4.3522e-01,  6.7456e-01,  1.9791e-01,  8.7605e-01,  6.2270e-01,\n",
       "          -1.6363e-01, -1.3238e-01, -1.4665e-01,  2.9144e-01, -3.5652e-01,\n",
       "           5.7618e-01,  3.4954e-01, -8.2889e-01, -2.4486e-02,  3.8569e-02,\n",
       "           3.8245e-01, -3.5602e-01,  3.6415e-01,  2.5259e-01,  1.5930e-02,\n",
       "          -4.3391e-01, -2.9566e-01,  1.4432e-01,  1.0464e-01, -5.5026e-01,\n",
       "           4.1480e-01, -3.5308e-01, -1.4036e-01, -5.6716e-01,  1.6197e-01,\n",
       "          -2.2236e-01, -4.3209e-01,  1.6584e-01, -9.3161e-02, -3.1813e-01,\n",
       "           1.9988e-01, -2.8883e-01, -4.9519e-01,  2.5457e-01, -3.2991e-02,\n",
       "           1.0750e-01, -1.2342e-01, -2.2626e-01, -6.5720e-01, -6.3445e-01,\n",
       "           2.7579e-01, -1.2699e-01, -1.1106e-01,  5.6694e-01, -1.9621e-02,\n",
       "          -2.8738e-01,  1.3854e-01,  2.9560e-01, -4.6598e-01,  3.4746e-01,\n",
       "           1.8036e-01,  3.3544e-01,  1.0515e-01,  8.6303e-02,  6.8082e-01,\n",
       "           1.0158e-01,  5.3018e-01, -7.3813e-03, -7.6816e-01, -1.7511e-01,\n",
       "           4.5548e-02, -6.0771e-01,  5.0363e-01,  6.4844e-01, -1.5657e-01,\n",
       "          -3.7344e-01, -2.9119e-01,  4.4068e-01, -3.5001e-01,  3.7643e-01,\n",
       "          -2.6541e-01, -2.7512e-01, -3.9560e-01,  1.2337e-01, -7.1469e-02,\n",
       "          -3.3931e-01,  1.2951e-01,  6.6011e-02,  9.3252e-02, -2.9070e-01,\n",
       "          -7.6847e-03, -4.1818e-01, -4.0792e-01,  1.6881e-01,  7.8115e-01,\n",
       "          -2.6657e-01,  8.1391e-01,  5.7394e-01, -1.4311e-01,  2.9017e-01,\n",
       "          -5.0462e-01, -1.4978e-02,  6.9478e-01, -3.2815e-01, -4.1266e-01,\n",
       "           4.0556e-01, -4.2912e-01, -3.6029e-01, -1.7307e-01, -5.4268e-01,\n",
       "           1.2195e-01,  3.3614e-01, -2.8532e-01, -5.8846e-01,  6.1370e-01,\n",
       "          -5.7412e-01, -3.9076e-01,  4.1925e-01, -2.4648e-01, -3.9990e-02,\n",
       "          -1.0702e-04,  6.5422e-01, -1.9746e-01, -7.7925e-02,  4.3325e-01,\n",
       "          -6.0128e-02,  6.6015e-01, -5.5444e-01,  2.6252e-01,  1.7620e-01,\n",
       "          -2.5649e-01,  1.4655e-01, -2.9936e-01, -4.4785e-01, -2.4319e-01,\n",
       "          -1.6943e-01,  2.8072e-01, -3.5121e-01, -3.2593e-02, -3.0758e-01,\n",
       "          -5.9383e-01,  1.8649e-02,  1.5183e-01,  7.8557e-01, -7.1017e-01,\n",
       "          -1.6023e-01, -7.6401e-01,  9.5685e-02, -1.9640e-01,  2.2626e-01,\n",
       "           1.9687e-02,  1.9617e-01,  6.8839e-01,  3.9520e-01,  9.5970e-02,\n",
       "           1.5884e-01,  5.0409e-01,  5.9132e-01, -1.9321e-01, -6.3061e-02,\n",
       "          -4.8119e-01, -1.1833e-01,  6.4132e-02,  3.1705e-01, -1.0134e-01,\n",
       "          -3.7966e-01,  3.2759e-01, -2.3496e-02, -6.7605e-01,  1.2973e-01,\n",
       "           2.5850e-01,  6.5795e-02,  2.3295e-01,  5.7402e-01, -1.9921e-01,\n",
       "          -1.4031e-01, -4.4459e-03, -9.4705e-02,  4.7955e-02,  1.6560e-01],\n",
       "         [-1.2864e-01,  5.7423e-01, -3.2700e-01,  1.6794e-01,  6.6260e-01,\n",
       "           2.7125e-01, -7.4934e-01,  5.1868e-02,  1.4993e-01,  5.3286e-02,\n",
       "           7.3107e-02,  3.5565e-01, -2.5660e-01,  4.5221e-01,  6.4116e-01,\n",
       "           3.4328e-01,  3.5433e-02, -5.6173e-01, -4.9608e-01,  5.6402e-01,\n",
       "           3.1740e-01, -1.4534e-01, -2.9747e-01,  1.2869e-01, -2.3517e-01,\n",
       "          -3.7471e-01,  2.2314e-01, -8.2738e-01, -3.9925e-01,  6.1978e-03,\n",
       "           9.6749e-02,  3.0186e-03,  1.7492e-01,  1.2174e-01, -4.8693e-01,\n",
       "           3.4099e-02, -9.3309e-02, -1.2312e-01,  1.5616e-01,  1.7810e-01,\n",
       "          -5.8181e-02,  6.6446e-01, -5.5270e-01,  1.4527e-02, -1.6872e-01,\n",
       "           1.9402e-01, -6.1356e-01, -4.7771e-01,  5.0720e-02, -1.4775e-01,\n",
       "           3.0970e-01, -4.7754e-01,  1.3801e-01,  7.1627e-01,  4.5432e-01,\n",
       "           5.2869e-01, -3.1742e-01,  1.9348e-01,  5.6400e-02, -1.1919e-01,\n",
       "          -1.3477e-01, -2.7730e-01,  7.5605e-01,  3.5907e-01,  4.4924e-01,\n",
       "          -5.4316e-01, -1.6506e-01, -3.8497e-01, -2.5617e-02,  2.8428e-01,\n",
       "          -2.1267e-01, -2.1881e-01, -6.9089e-02,  3.3529e-01,  6.1193e-01,\n",
       "          -2.7944e-01,  4.9640e-01,  3.2285e-01, -5.1981e-01,  2.4303e-01,\n",
       "          -3.7868e-01, -4.6365e-02,  4.5754e-01,  1.6135e-01, -1.3061e-01,\n",
       "           2.4648e-01, -4.0944e-01, -2.5326e-01,  4.9134e-01,  6.4189e-01,\n",
       "           5.4417e-01, -2.0598e-01,  1.3028e-01, -3.5381e-01, -4.8535e-01,\n",
       "           7.8719e-02, -1.2654e-01, -2.0000e-01, -3.2162e-01, -4.9452e-01,\n",
       "          -1.0762e-01, -3.9386e-01,  2.4993e-01, -2.3760e-01,  4.9464e-01,\n",
       "          -1.8928e-01,  7.2702e-03,  7.0322e-02, -6.9641e-01, -5.1724e-01,\n",
       "          -6.8848e-02,  2.1939e-01, -2.3836e-01, -8.9294e-02,  1.7091e-01,\n",
       "          -1.6486e-02,  2.2339e-01, -5.1747e-01,  5.2012e-01, -3.8664e-01,\n",
       "           4.4189e-01,  5.9154e-02,  3.5246e-01, -1.4983e-02,  3.6237e-01,\n",
       "           5.5347e-02, -8.5705e-02, -5.0845e-01,  3.3581e-01,  4.2633e-01,\n",
       "          -2.4378e-01,  5.7465e-01,  1.4512e-01,  4.5498e-01,  3.9564e-01,\n",
       "          -6.9795e-01, -5.9737e-01,  5.8178e-01, -1.7396e-01,  7.6164e-02,\n",
       "           6.0638e-01,  8.1386e-02,  2.5860e-01,  1.1751e-01, -6.1966e-01,\n",
       "           3.6954e-02,  1.6930e-01,  2.8931e-01,  2.4658e-01,  4.1852e-01,\n",
       "           2.4338e-01, -4.7662e-01,  7.2391e-02, -1.6179e-01, -3.0505e-02,\n",
       "           9.4159e-02,  4.2004e-01, -1.3559e-01, -1.3835e-01,  4.1368e-01,\n",
       "           5.3771e-01, -8.8174e-01, -4.1296e-01, -2.7064e-01,  1.1284e-01,\n",
       "          -5.6771e-01, -7.4982e-01, -7.4116e-01,  5.2884e-01, -1.9310e-01,\n",
       "          -7.2175e-01, -4.8640e-01,  5.5996e-01,  5.6928e-01,  1.4528e-01,\n",
       "          -7.6671e-02,  6.0840e-02, -2.8427e-01,  3.8251e-01,  3.7221e-01,\n",
       "          -6.4411e-01, -2.5539e-01,  8.3626e-01, -2.1400e-01, -5.2170e-01,\n",
       "           1.4516e-01, -9.5631e-03, -3.6237e-01,  3.8101e-01, -4.1717e-01,\n",
       "           3.1941e-01,  3.3018e-02,  4.1189e-02,  7.0664e-01, -3.5381e-01,\n",
       "          -2.2230e-01,  3.3559e-01,  3.2675e-01, -5.5850e-01,  2.2148e-01]]],\n",
       "       grad_fn=<StackBackward0>)"
      ]
     },
     "execution_count": 7,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "h_t"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "6"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "len(out)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "这里的 h_t 是网络最后的隐藏状态，网络也输出了 6 个结果"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 40,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# 自己定义初始的隐藏状态\n",
    "h_0 = Variable(torch.randn(1, 5, 200))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "这里的隐藏状态的大小有三个维度，分别是 (num_layers * num_direction, batch, hidden_size)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 41,
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "out, h_t = rnn_seq(x, h_0)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 42,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "Variable containing:\n",
       "( 0 ,.,.) = \n",
       "  0.2091  0.0353  0.0625  ...   0.2340  0.3734 -0.1307\n",
       "  0.5498  0.4221  0.7877  ...  -0.4143 -0.1209  0.3335\n",
       "  0.0757  0.4204  0.3826  ...   0.3187 -0.4626 -0.2336\n",
       "  0.3106  0.7355  0.6436  ...   0.6611  0.2587 -0.0338\n",
       "  0.1025  0.6350  0.1943  ...   0.5720  0.8749  0.4525\n",
       "[torch.FloatTensor of size 1x5x200]"
      ]
     },
     "execution_count": 42,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "h_t"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 45,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([6, 5, 200])"
      ]
     },
     "execution_count": 45,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "out.shape"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "同时输出的结果也是 (seq, batch, feature)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "一般情况下我们都是用 `nn.RNN()` 而不是 `nn.RNNCell()`，因为 `nn.RNN()` 能够避免我们手动写循环，非常方便，同时如果不特别说明，我们也会选择使用默认的全 0 初始化隐藏状态"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## LSTM"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "![](https://ws1.sinaimg.cn/large/006tKfTcly1fmt9qj3uhmj30iz07ct90.jpg)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": true
   },
   "source": [
    "LSTM 和基本的 RNN 是一样的，他的参数也是相同的，同时他也有 `nn.LSTMCell()` 和 `nn.LSTM()` 两种形式，跟前面讲的都是相同的，我们就不再赘述了，下面直接举个小例子"
   ]
  },
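  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before the `nn.LSTM` example, here is a minimal sketch of the single-step `nn.LSTMCell` interface; unlike `nn.RNNCell`, it takes and returns a pair of states (h, c):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import torch\n",
    "from torch import nn\n",
    "\n",
    "lstm_cell = nn.LSTMCell(input_size=50, hidden_size=100)\n",
    "x_step = torch.randn(3, 50)       # one time step: batch 3, feature 50\n",
    "h = torch.zeros(3, 100)           # initial hidden state\n",
    "c = torch.zeros(3, 100)           # initial cell state\n",
    "h, c = lstm_cell(x_step, (h, c))  # one step forward\n",
    "print(h.shape, c.shape)  # torch.Size([3, 100]) torch.Size([3, 100])"
   ]
  },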
  {
   "cell_type": "code",
   "execution_count": 58,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "lstm_seq = nn.LSTM(50, 100, num_layers=2) # 输入维度 100，输出 200，两层"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 80,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "Parameter containing:\n",
       "1.00000e-02 *\n",
       " 3.8420  5.7387  6.1351  ...   1.2680  0.9890  1.3037\n",
       "-4.2301  6.8294 -4.8627  ...  -6.4147  4.3015  8.4103\n",
       " 9.4411  5.0195  9.8620  ...  -1.6096  9.2516 -0.6941\n",
       "          ...             ⋱             ...          \n",
       " 1.2930 -1.3300 -0.9311  ...  -6.0891 -0.7164  3.9578\n",
       " 9.0435  2.4674  9.4107  ...  -3.3822 -3.9773 -3.0685\n",
       "-4.2039 -8.2992 -3.3605  ...   2.2875  8.2163 -9.3277\n",
       "[torch.FloatTensor of size 400x100]"
      ]
     },
     "execution_count": 80,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "lstm_seq.weight_hh_l0 # 第一层的 h_t 权重"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**小练习：想想为什么这个系数的大小是 (400, 100)**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 59,
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "lstm_input = Variable(torch.randn(10, 3, 50)) # 序列 10，batch 是 3，输入维度 50"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 64,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "out, (h, c) = lstm_seq(lstm_input) # 使用默认的全 0 隐藏状态"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "注意这里 LSTM 输出的隐藏状态有两个，h 和 c，就是上图中的每个 cell 之间的两个箭头，这两个隐藏状态的大小都是相同的，(num_layers * direction, batch, feature)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 66,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([2, 3, 100])"
      ]
     },
     "execution_count": 66,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "h.shape # 两层，Batch 是 3，特征是 100"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 67,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([2, 3, 100])"
      ]
     },
     "execution_count": 67,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "c.shape"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 61,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([10, 3, 100])"
      ]
     },
     "execution_count": 61,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "out.shape"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Instead of the default hidden state, we can pass in our own initial state; in that case we need to supply two tensors, an initial h and an initial c"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 68,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "h_init = Variable(torch.randn(2, 3, 100))\n",
    "c_init = Variable(torch.randn(2, 3, 100))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 69,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "out, (h, c) = lstm_seq(lstm_input, (h_init, c_init))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 70,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([2, 3, 100])"
      ]
     },
     "execution_count": 70,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "h.shape"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 71,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([2, 3, 100])"
      ]
     },
     "execution_count": 71,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "c.shape"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 72,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([10, 3, 100])"
      ]
     },
     "execution_count": 72,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "out.shape"
   ]
  },
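The shape rule (num_layers * num_directions, batch, hidden_size) generalizes: with `bidirectional=True` the first dimension of h and c doubles, and the feature dimension of `out` doubles as well. A minimal sketch, using plain tensors (which newer PyTorch versions accept directly, without `Variable`):

```python
import torch
from torch import nn

# Bidirectional two-layer LSTM: h and c get num_layers * 2 slices,
# and out concatenates the forward and backward features.
bi_lstm = nn.LSTM(50, 100, num_layers=2, bidirectional=True)
x = torch.randn(10, 3, 50)  # sequence 10, batch 3, input dimension 50
out, (h, c) = bi_lstm(x)

print(h.shape)    # torch.Size([4, 3, 100])  -> num_layers * 2 directions
print(out.shape)  # torch.Size([10, 3, 200]) -> hidden_size * 2 directions
```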
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## GRU\n",
    "![](https://ws3.sinaimg.cn/large/006tKfTcly1fmtaj38y9sj30io06bmxc.jpg)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "GRU works in the same way as the two modules above, so we won't go into the details again; here is a quick example"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 73,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "gru_seq = nn.GRU(10, 20) # input dimension 10, hidden dimension 20\n",
    "gru_input = Variable(torch.randn(3, 32, 10)) # sequence 3, batch 32, input dimension 10\n",
    "\n",
    "out, h = gru_seq(gru_input)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 76,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "Parameter containing:\n",
       " 0.0766 -0.0548 -0.2008  ...  -0.0250 -0.1819  0.1453\n",
       "-0.1676  0.1622  0.0417  ...   0.1905 -0.0071 -0.1038\n",
       " 0.0444 -0.1516  0.2194  ...  -0.0009  0.0771  0.0476\n",
       "          ...             ⋱             ...          \n",
       " 0.1698 -0.1707  0.0340  ...  -0.1315  0.1278  0.0946\n",
       " 0.1936  0.1369 -0.0694  ...  -0.0667  0.0429  0.1322\n",
       " 0.0870 -0.1884  0.1732  ...  -0.1423 -0.1723  0.2147\n",
       "[torch.FloatTensor of size 60x20]"
      ]
     },
     "execution_count": 76,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "gru_seq.weight_hh_l0"
   ]
  },
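The (60, 20) shape above has the same explanation as the LSTM exercise: a GRU has three gates (reset, update, new), and PyTorch stacks their hidden-to-hidden weights along the first dimension, giving (3 * hidden_size, hidden_size). A quick check:

```python
import torch
from torch import nn

# GRU stacks the weights of its 3 gates (reset, update, new),
# so weight_hh_l0 is (3 * hidden_size, hidden_size).
gru = nn.GRU(10, 20)
print(gru.weight_hh_l0.shape)  # torch.Size([60, 20])
print(gru.weight_ih_l0.shape)  # torch.Size([60, 10])
```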
  {
   "cell_type": "code",
   "execution_count": 75,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([1, 32, 20])"
      ]
     },
     "execution_count": 75,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "h.shape"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 74,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([3, 32, 20])"
      ]
     },
     "execution_count": 74,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "out.shape"
   ]
  }
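As a closing sanity check (a sketch added here, not part of the original example): for a one-layer, unidirectional GRU, the final hidden state `h[0]` is exactly the output at the last time step, `out[-1]`.

```python
import torch
from torch import nn

gru = nn.GRU(10, 20)        # one layer, unidirectional
x = torch.randn(3, 32, 10)  # sequence 3, batch 32, input dimension 10
out, h = gru(x)

# out collects the top-layer hidden state at every time step,
# so its last step coincides with the final hidden state.
print(torch.allclose(out[-1], h[0]))  # True
```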
 ],
 "metadata": {
  "interpreter": {
   "hash": "07b2cea26089ea0ebec7cc3a83022d31e9f80d0db55f432f89380becb3d80933"
  },
  "kernelspec": {
   "display_name": "mx",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
