{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "a895e69e",
   "metadata": {},
   "source": [
     "# Automatic Differentiation\n",
     "We already introduced automatic differentiation in the introductory section on customization: the `GradOperation` interface provided by the `mindspore.ops` module generates the gradients of a network model."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1ada55ce",
   "metadata": {},
   "source": [
     "## 1. First-Order Derivatives\n",
     "Applying the `mindspore.ops.GradOperation` interface directly to a network model computes its gradients. It takes three parameters:\n",
     "\n",
     "- `get_all`: when `False`, differentiate only with respect to the first input; when `True`, differentiate with respect to all inputs.\n",
     "- `get_by_list`: when `False`, do not differentiate with respect to the weights; when `True`, also differentiate with respect to the weights.\n",
     "- `sens_param`: scales the network output to change the final gradient; its shape must therefore match the output shape."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "580e45d8",
   "metadata": {},
   "source": [
     "Take a simple linear function as an example: $$f(x)=2x+3$$"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "id": "7108f7d6",
   "metadata": {},
   "outputs": [],
   "source": [
    "import mindspore.nn as nn\n",
    "import mindspore.ops as ops\n",
    "import numpy as np\n",
    "import mindspore as ms"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "100367dc",
   "metadata": {},
   "source": [
     "Define this function as a network:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "id": "6de28a25",
   "metadata": {},
   "outputs": [],
   "source": [
    "class Net(nn.Cell):\n",
     "    \"\"\"Simple linear network: f(x) = 2x + 3\"\"\"\n",
    "    def __init__(self):\n",
    "        super(Net, self).__init__()\n",
    "        self.mul = ops.Mul()\n",
    "        self.add = ops.Add()\n",
    "        self.weight = ms.Tensor(np.array([2]), ms.float32)\n",
    "        self.bias = ms.Tensor(np.array([3]), ms.float32)\n",
    "        \n",
    "    def construct(self, x):\n",
    "        x = self.mul(x, self.weight)\n",
    "        x = self.add(x, self.bias)\n",
    "        return x"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b94dc32f",
   "metadata": {},
   "source": [
     "Define the gradient network:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "id": "54e615c4",
   "metadata": {},
   "outputs": [],
   "source": [
    "class GradNet(nn.Cell):\n",
     "    \"\"\"First-order gradient network\"\"\"\n",
    "    def __init__(self, net):\n",
    "        super(GradNet, self).__init__()\n",
    "        self.grad = ops.GradOperation()\n",
    "        self.net = net\n",
    "    \n",
    "    def construct(self, x):\n",
    "        x = self.grad(self.net)(x)\n",
    "        return x"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "id": "c0284d8e",
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[2. 2. 2.]\n"
     ]
    }
   ],
   "source": [
     "x = ms.Tensor(np.array([1, 2, 3]), ms.float32)\n",
     "output = GradNet(Net())(x)\n",
    "print(output)"
   ]
  },
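  {
   "cell_type": "markdown",
   "id": "3f2a1b9c",
   "metadata": {},
   "source": [
    "The example above differentiates only with respect to the single input `x`. As a quick sketch of `get_all=True` (the `MulNet` class below is illustrative, not part of the example above): for $f(x, y) = x \\cdot y$, the returned tuple holds $\\partial f/\\partial x = y$ and $\\partial f/\\partial y = x$."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "4b5c6d7e",
   "metadata": {},
   "outputs": [],
   "source": [
    "class MulNet(nn.Cell):\n",
    "    \"\"\"f(x, y) = x * y\"\"\"\n",
    "    def __init__(self):\n",
    "        super(MulNet, self).__init__()\n",
    "        self.mul = ops.Mul()\n",
    "\n",
    "    def construct(self, x, y):\n",
    "        return self.mul(x, y)\n",
    "\n",
    "# get_all=True returns the gradient for every input as a tuple\n",
    "grad_all = ops.GradOperation(get_all=True)\n",
    "x = ms.Tensor(np.array([1.0, 2.0]), ms.float32)\n",
    "y = ms.Tensor(np.array([3.0, 4.0]), ms.float32)\n",
    "dx, dy = grad_all(MulNet())(x, y)\n",
    "print(dx)  # equals y\n",
    "print(dy)  # equals x"
   ]
  },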
  {
   "cell_type": "markdown",
   "id": "24bfcdab",
   "metadata": {},
   "source": [
     "## 2. Second-Order Derivatives\n",
     "A second-order derivative is obtained simply by nesting two first-order derivatives. Take the sine function as an example:\n",
     "$$f(x) = \\sin(x) \\tag{1}$$\n",
     "\n",
     "Its first derivative is:\n",
     "\n",
     "$$f'(x) = \\cos(x) \\tag{2}$$\n",
     "\n",
     "Its second derivative is:\n",
     "\n",
     "$$f''(x) = (\\cos(x))' = -\\sin(x) \\tag{3}$$"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ca718d98",
   "metadata": {},
   "source": [
     "Define the sine function:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "id": "a76c051b",
   "metadata": {},
   "outputs": [],
   "source": [
    "class Net(nn.Cell):\n",
     "    \"\"\"Sine function network\"\"\"\n",
    "    def __init__(self):\n",
    "        super(Net, self).__init__()\n",
    "        self.sin = ops.Sin()\n",
    "        \n",
    "    def construct(self, x):\n",
    "        x = self.sin(x)\n",
    "        return x"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "4d0403e2",
   "metadata": {},
   "source": [
    "定义二阶求导网络："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "id": "7c8a502e",
   "metadata": {},
   "outputs": [],
   "source": [
    "class GradNet(nn.Cell):\n",
     "    \"\"\"First-order gradient network\"\"\"\n",
    "    def __init__(self, net):\n",
    "        super(GradNet, self).__init__()\n",
    "        self.grad = ops.GradOperation()\n",
    "        self.net = net\n",
    "        \n",
    "    def construct(self, x):\n",
    "        x = self.grad(self.net)(x)\n",
    "        return x\n",
    "\n",
    "class GradSecNet(nn.Cell):\n",
     "    \"\"\"Second-order gradient network: wraps a first-order gradient network\"\"\"\n",
    "    def __init__(self, net):\n",
    "        super(GradSecNet, self).__init__()\n",
    "        self.grad = ops.GradOperation()\n",
    "        self.net = net\n",
    "        \n",
    "    def construct(self, x):\n",
    "        x = self.grad(self.net)(x)\n",
    "        return x"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 25,
   "id": "0ea2b73f",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[-0.         -0.9999997  -0.00159255]\n"
     ]
    }
   ],
   "source": [
     "x = ms.Tensor(np.array([0, 3.14 / 2, 3.14]), ms.float32)  # approximately [0, pi/2, pi]\n",
     "net = Net()\n",
     "firstgrad = GradNet(net)\n",
     "secondgrad = GradSecNet(firstgrad)\n",
     "output = secondgrad(x)\n",
    "print(output)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "f44d0920",
   "metadata": {},
   "source": [
     "MindSpore also provides operations such as gradient scaling (the `sens_param` parameter) and stopping gradient flow (the `stop_gradient` interface). They are not covered in detail here; see the official documentation for usage."
   ]
  }
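,
  {
   "cell_type": "markdown",
   "id": "5e6f7a8b",
   "metadata": {},
   "source": [
    "As a rough sketch of `sens_param` (the `LinearNet` class and the scaling values below are illustrative): with `sens_param=True`, the gradient call takes an extra tensor, shaped like the network output, that scales the gradient elementwise. For $f(x) = 2x + 3$ the raw gradient is $2$ everywhere, so the result is `2 * sens`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "6a7b8c9d",
   "metadata": {},
   "outputs": [],
   "source": [
    "class LinearNet(nn.Cell):\n",
    "    \"\"\"f(x) = 2x + 3, so df/dx = 2\"\"\"\n",
    "    def __init__(self):\n",
    "        super(LinearNet, self).__init__()\n",
    "        self.mul = ops.Mul()\n",
    "        self.add = ops.Add()\n",
    "        self.weight = ms.Tensor(np.array([2]), ms.float32)\n",
    "        self.bias = ms.Tensor(np.array([3]), ms.float32)\n",
    "\n",
    "    def construct(self, x):\n",
    "        return self.add(self.mul(x, self.weight), self.bias)\n",
    "\n",
    "# sens_param=True: the gradient call takes a scaling tensor shaped like the output\n",
    "grad_sens = ops.GradOperation(sens_param=True)\n",
    "x = ms.Tensor(np.array([1, 2, 3]), ms.float32)\n",
    "sens = ms.Tensor(np.array([0.1, 1.0, 10.0]), ms.float32)\n",
    "output = grad_sens(LinearNet())(x, sens)\n",
    "print(output)  # 2 * sens"
   ]
  }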
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "mindspore",
   "language": "python",
   "name": "mindvision"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.5"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
