{
 "cells": [
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "## Derivative Formulas for Basic Elementary Functions\n",
   "id": "6c861cb41dd270d"
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "### 1. Derivative and Differential Formulas\n",
     "|            | Derivative Formula                         |           | Differential Formula                                            |\n",
     "|------------|--------------------------------------------|-----------|-----------------------------------------------------------------|\n",
     "| Derivative formulas for basic elementary functions | $$ c'=0 \\ (c \\text{ is a constant}) $$ | Differential formulas for basic elementary functions | $$ \\mathrm{d}c = 0 \\ (c \\text{ is a constant})$$ |\n",
     "|            | $$(x^u)'=ux^{u-1} $$                       |           | $$ \\mathrm{d}(x^u)= ux^{u-1}\\mathrm{d}x$$                       |\n",
     "|            | $$(a^x)'=a^x\\ln a $$                       |           | $$ \\mathrm{d}(a^x)= a^x\\ln{a}\\mathrm{d}x$$                      |\n",
     "|            | $$(e^x)'=e^x $$                            |           | $$ \\mathrm{d}(e^x)= e^x\\mathrm{d}x$$                            |\n",
     "|            | $$(\\ln{x})'=\\frac{1}{x} $$                 |           | $$ \\mathrm{d}(\\ln{x})=\\frac{1}{x}\\mathrm{d}x $$                 |\n",
     "|            | $$(\\sin{x})'=\\cos{x} $$                    |           | $$ \\mathrm{d}(\\sin{x})=\\cos{x}\\mathrm{d}x $$                    |\n",
     "|            | $$(\\cos{x})'=-\\sin{x} $$                   |           | $$ \\mathrm{d}(\\cos{x})=-\\sin{x}\\mathrm{d}x $$                   |\n",
     "|            | $$ (\\tan{x})'=(\\sec{x})^2 $$               |           | $$ \\mathrm{d}(\\tan{x})=(\\sec{x})^2\\mathrm{d}x $$                |\n",
     "|            | $$ (\\cot{x})'=-(\\csc{x})^2 $$              |           | $$ \\mathrm{d}(\\cot{x})=-(\\csc{x})^2\\mathrm{d}x $$               |\n",
     "|            | $$ (\\sec{x})'=\\sec{x}\\tan{x} $$            |           | $$ \\mathrm{d}(\\sec{x})=\\sec{x}\\tan{x}\\mathrm{d}x $$             |\n",
     "|            | $$ (\\csc{x})'=-\\csc{x}\\cot{x} $$           |           | $$ \\mathrm{d}(\\csc{x})=-\\csc{x}\\cot{x}\\mathrm{d}x $$            |\n",
     "|            | $$ (\\arcsin{x})'=\\frac{1}{\\sqrt{1-x^2}} $$ |           | $$ \\mathrm{d}(\\arcsin{x})=\\frac{1}{\\sqrt{1-x^2}}\\mathrm{d}x $$  |\n",
     "|            | $$ (\\arccos{x})'=-\\frac{1}{\\sqrt{1-x^2}} $$ |          | $$ \\mathrm{d}(\\arccos{x})=-\\frac{1}{\\sqrt{1-x^2}}\\mathrm{d}x $$ |\n",
     "|            | $$ (\\arctan{x})'=\\frac{1}{1+x^2} $$        |           | $$ \\mathrm{d}(\\arctan{x})=\\frac{1}{1+x^2}\\mathrm{d}x $$         |\n",
     "|            | $$ (\\operatorname{arccot}{x})'=-\\frac{1}{1+x^2} $$ |   | $$ \\mathrm{d}(\\operatorname{arccot}{x})=-\\frac{1}{1+x^2}\\mathrm{d}x $$ |\n",
    "\n"
   ],
   "id": "c0f58b96874d9ed"
  },
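  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "The formulas above can be spot-checked numerically with PyTorch's autograd. The snippet below is an illustrative sketch, not part of the table; it verifies $(\\sin{x})'=\\cos{x}$ and $(e^x)'=e^x$ at $x=1$:\n",
    "```python\n",
    "import torch\n",
    "\n",
    "x = torch.tensor([1.], requires_grad=True)\n",
    "\n",
    "# d(sin x)/dx should equal cos(x)\n",
    "y = torch.sin(x)\n",
    "(g,) = torch.autograd.grad(y, x)\n",
    "print(torch.allclose(g, torch.cos(x)))  # True\n",
    "\n",
    "# d(e^x)/dx should equal e^x\n",
    "y = torch.exp(x)\n",
    "(g,) = torch.autograd.grad(y, x)\n",
    "print(torch.allclose(g, torch.exp(x)))  # True\n",
    "```"
   ],
   "id": "a7e3f29cbd104a11"
  },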
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "### 2. Automatic Differentiation in PyTorch\n",
     "\n",
     "#### 1. torch.autograd.backward()\n",
     "This function computes gradients automatically. Its signature is:\n",
     "```python\n",
     "torch.autograd.backward(tensors,\n",
     "                        grad_tensors=None,\n",
     "                        retain_graph=None,\n",
     "                        create_graph=False)\n",
     "```\n",
     " ##### 【Parameters】\n",
     " + tensors: the tensor(s) to differentiate, e.g. a loss.\n",
     " + grad_tensors: per-output gradient weights; when several losses contribute to the gradient, this sets the weight of each one.\n",
     " + retain_graph: because PyTorch uses a dynamic graph mechanism, the computation graph is freed after each backward pass; set this to True to keep it for another backward pass.\n",
     " + create_graph: set this to True to build the graph of the derivative itself, which enables higher-order derivatives.\n",
     "\n",
     "For example, the first derivative of a simple expression can be computed as follows:"
   ],
   "id": "e2cd1564af42f0ff"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-03-04T11:37:52.419134Z",
     "start_time": "2025-03-04T11:37:52.368270Z"
    }
   },
   "cell_type": "code",
   "source": [
    "import torch\n",
    "\n",
     "# Create a tensor w that requires gradients\n",
     "w = torch.tensor([1.], requires_grad=True)\n",
     "# Create a tensor x that requires gradients\n",
     "x = torch.tensor([2.], requires_grad=True)\n",
     "# Add x and w, storing the result in a\n",
     "a = torch.add(x, w)\n",
     "# Add 1 to w, storing the result in b\n",
     "b = torch.add(w, 1)\n",
     "# Multiply a and b, storing the result in y\n",
     "y = torch.mul(a, b)\n",
     "\n",
     "# Backpropagate through y to compute gradients\n",
     "y.backward()\n",
     "# Print the gradient of w\n",
     "print(w.grad)\n"
   ],
   "id": "31008352acc45ba4",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "tensor([9.])\n"
     ]
    }
   ],
   "execution_count": 16
  },
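  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "The retain_graph parameter described above can be illustrated by calling backward() twice on the same graph: without retain_graph=True the first call frees the graph and the second call raises an error. A minimal sketch, reusing the same y as the example above:\n",
    "```python\n",
    "import torch\n",
    "\n",
    "w = torch.tensor([1.], requires_grad=True)\n",
    "x = torch.tensor([2.], requires_grad=True)\n",
    "y = torch.mul(torch.add(x, w), torch.add(w, 1))\n",
    "\n",
    "# Keep the graph alive so backward() can run again\n",
    "y.backward(retain_graph=True)\n",
    "print(w.grad)  # tensor([5.])\n",
    "\n",
    "# Second backward pass; note that the gradients accumulate\n",
    "y.backward()\n",
    "print(w.grad)  # tensor([10.])\n",
    "```"
   ],
   "id": "b8f4a30cde215b22"
  },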
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "The following example shows how to use the grad_tensors parameter.\n",
   "id": "360361a378dcae09"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-03-04T11:45:01.561621Z",
     "start_time": "2025-03-04T11:45:01.551648Z"
    }
   },
   "cell_type": "code",
   "source": [
    "import torch\n",
    "\n",
     "# Create a tensor w that requires gradients\n",
     "w = torch.tensor([1.], requires_grad=True)\n",
     "# Create a tensor x that requires gradients\n",
     "x = torch.tensor([2.], requires_grad=True)\n",
     "# Add x and w, storing the result in a\n",
     "a = torch.add(x, w)\n",
     "# Add 1 to w, storing the result in b\n",
     "b = torch.add(w, 1)\n",
     "# Multiply a and b, storing the result in y0\n",
     "y0 = torch.mul(a, b)\n",
     "# Add a and b, storing the result in y1\n",
     "y1 = torch.add(a, b)\n",
     "# Concatenate y0 and y1 into a new tensor loss\n",
     "loss = torch.cat([y0, y1], dim=0)\n",
     "# Define the gradient-weight tensor grad_t\n",
     "grad_t = torch.tensor([1., 2.])\n",
     "# Backpropagate through loss to compute gradients\n",
     "loss.backward(gradient=grad_t)\n",
     "# Print the gradient of w\n",
     "print(w.grad)\n"
   ],
   "id": "e41131c6458553",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "tensor([9.])\n"
     ]
    }
   ],
   "execution_count": 18
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "Here:\n",
     "$$y_0 = (x + w)(w + 1),\\quad \\frac{\\partial y_0}{\\partial w}=2w+x+1=5$$\n",
     "$$y_1 = (x + w)+(w + 1),\\quad \\frac{\\partial y_1}{\\partial w}=2$$\n",
     "$$w.grad = \\frac{\\partial y_0}{\\partial w} \\times 1 + \\frac{\\partial y_1}{\\partial w} \\times 2 = 5 \\times 1 + 2 \\times 2 = 9$$"
   ],
   "id": "8d4fe6ec444c4b8d"
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "\n",
    "\n",
     "#### 2. torch.autograd.grad()\n",
     "This function computes and returns the gradients of outputs with respect to inputs. Its signature is:\n",
    "\n",
    "```python\n",
    "torch.autograd.grad(outputs,\n",
    "                    inputs,\n",
    "                    grad_outputs=None,\n",
    "                    retain_graph=None,\n",
    "                    create_graph=False)\n",
    "```"
   ],
   "id": "23bae2cfa5f6637a"
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     " ##### 【Parameters】\n",
     " + outputs: the tensor(s) to differentiate, e.g. loss in the example above.\n",
     " + inputs: the tensor(s) whose gradients are needed, e.g. w in the example above.\n",
     " + retain_graph: keep the computation graph after the call.\n",
     " + grad_outputs: per-output gradient weights.\n",
     " + create_graph: build the graph of the derivative itself, which enables higher-order derivatives.\n",
     "\n",
     "For example, the second derivative of $y = x^2$ can be computed as follows:\n",
    "\n"
   ],
   "id": "a9b72bf45e4327b1"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-03-04T12:13:55.421378Z",
     "start_time": "2025-03-04T12:13:52.265820Z"
    }
   },
   "cell_type": "code",
   "source": [
    "import torch\n",
    "\n",
     "# Create a tensor x with value 3.0 and requires_grad=True so gradients are tracked\n",
     "x = torch.tensor([3.], requires_grad=True)\n",
     "# Square x to get y\n",
     "y = torch.pow(x, 2)\n",
     "# Compute the gradient of y w.r.t. x; create_graph=True builds a graph of the derivative so a second derivative can be taken\n",
     "grad1 = torch.autograd.grad(y, x, create_graph=True)\n",
     "print(grad1)\n",
     "# Compute the gradient of the first derivative w.r.t. x\n",
     "grad2 = torch.autograd.grad(grad1[0], x)\n",
     "print(grad2)  # prints the second derivative: [2.]"
   ],
   "id": "5bbc89c3a80f2613",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(tensor([6.], grad_fn=<MulBackward0>),)\n",
      "(tensor([2.]),)\n"
     ]
    }
   ],
   "execution_count": 1
  },
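  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "The grad_outputs parameter plays the same weighting role as grad_tensors in torch.autograd.backward(). As a sketch, the earlier weighted-loss example can be reproduced with torch.autograd.grad():\n",
    "```python\n",
    "import torch\n",
    "\n",
    "w = torch.tensor([1.], requires_grad=True)\n",
    "x = torch.tensor([2.], requires_grad=True)\n",
    "a = torch.add(x, w)\n",
    "b = torch.add(w, 1)\n",
    "y0 = torch.mul(a, b)  # dy0/dw = 2w + x + 1 = 5\n",
    "y1 = torch.add(a, b)  # dy1/dw = 2\n",
    "loss = torch.cat([y0, y1], dim=0)\n",
    "\n",
    "# Weight the two outputs by 1 and 2, respectively\n",
    "(g,) = torch.autograd.grad(loss, w, grad_outputs=torch.tensor([1., 2.]))\n",
    "print(g)  # tensor([9.])\n",
    "```"
   ],
   "id": "c9a5b41def326c33"
  },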
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "#### 3. Caveats\n",
     "(1) Gradients are not cleared automatically: each backward pass accumulates into them, as the following code shows:"
   ],
   "id": "301c787846095823"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-03-04T12:39:12.032243Z",
     "start_time": "2025-03-04T12:39:12.019275Z"
    }
   },
   "cell_type": "code",
   "source": [
    "import torch\n",
    "\n",
     "# Create a tensor w that requires gradients\n",
     "w = torch.tensor([1.], requires_grad=True)\n",
     "# Create a tensor x that requires gradients\n",
     "x = torch.tensor([2.], requires_grad=True)\n",
     "for i in range(3):\n",
     "    # Add x and w, storing the result in a\n",
     "    a = torch.add(x, w)\n",
     "    # Add 1 to w, storing the result in b\n",
     "    b = torch.add(w, 1)\n",
     "    # Multiply a and b, storing the result in y\n",
     "    y = torch.mul(a, b)\n",
     "    # Compute gradients\n",
     "    y.backward()\n",
     "    # Print the gradient of w\n",
     "    print(w.grad)\n"
   ],
   "id": "dca6fc6978def81a",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "tensor([5.])\n",
      "tensor([10.])\n",
      "tensor([15.])\n"
     ]
    }
   ],
   "execution_count": 2
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "This accumulation gives incorrect results, so the gradients must be zeroed manually:\n",
   "id": "d04f7cfe54795ad6"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-03-04T12:41:53.517222Z",
     "start_time": "2025-03-04T12:41:53.503261Z"
    }
   },
   "cell_type": "code",
   "source": [
    "import torch\n",
    "\n",
     "# Create a tensor w that requires gradients\n",
     "w = torch.tensor([1.], requires_grad=True)\n",
     "# Create a tensor x that requires gradients\n",
     "x = torch.tensor([2.], requires_grad=True)\n",
     "for i in range(3):\n",
     "    # Add x and w, storing the result in a\n",
     "    a = torch.add(x, w)\n",
     "    # Add 1 to w, storing the result in b\n",
     "    b = torch.add(w, 1)\n",
     "    # Multiply a and b, storing the result in y\n",
     "    y = torch.mul(a, b)\n",
     "    # Compute gradients\n",
     "    y.backward()\n",
     "    # Print the gradient of w\n",
     "    print(w.grad)\n",
     "    # Zero the gradients in place\n",
     "    w.grad.zero_()"
   ],
   "id": "2a762cf73834d246",
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "tensor([5.])\n",
      "tensor([5.])\n",
      "tensor([5.])\n"
     ]
    }
   ],
   "execution_count": 3
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 2
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython2",
   "version": "2.7.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
