{
 "cells": [
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
    "# Backpropagation in Neural Networks: Derivation and Worked Example\n",
    "\n",
    "## I. Mathematical Derivation of Backpropagation\n",
    "\n",
    "### 1. Notation\n",
    "Consider a multilayer feedforward network with an input layer, hidden layers, and an output layer. Define:\n",
    "- $ l $: layer index (input layer $ l=1 $, output layer $ l=L $)\n",
    "- $ n_l $: number of neurons in layer $ l $\n",
    "- $ z_i^{(l)} $: input (weighted sum) of the $ i $-th neuron in layer $ l $\n",
    "- $ a_i^{(l)} $: output (activation) of the $ i $-th neuron in layer $ l $, satisfying\n",
    "  $$ a_i^{(l)} = \\sigma\\left(z_i^{(l)}\\right) $$\n",
    "  where $ \\sigma(\\cdot) $ is the activation function\n",
    "- $ w_{ij}^{(l)} $: weight from the $ j $-th neuron in layer $ l $ to the $ i $-th neuron in layer $ l+1 $\n",
    "- $ b_i^{(l)} $: bias of the $ i $-th neuron in layer $ l $\n",
    "- $ \\hat{y} $: network output; $ y $: ground-truth label; loss function $ L(\\hat{y}, y) $ (the symbol $ L $ is reused for the output-layer index and the loss; context disambiguates)\n",
    "\n",
    "\n",
    "### 2. Forward Propagation\n",
    "#### General recurrence\n",
    "Input of the $ i $-th neuron in layer $ l+1 $:\n",
    "$$ z_i^{(l+1)} = \\sum_{j=1}^{n_l} w_{ij}^{(l)} \\cdot a_j^{(l)} + b_i^{(l+1)} $$\n",
    "Its output (activation):\n",
    "$$ a_i^{(l+1)} = \\sigma\\left(z_i^{(l+1)}\\right) $$\n",
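    "\n",
    "The recurrence maps directly onto a short loop over layers. A minimal sketch of the forward pass (NumPy, the sigmoid activation, and the 2-3-1 layer sizes here are illustrative assumptions, not fixed by the derivation):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "def sigmoid(z):\n",
    "    return 1.0 / (1.0 + np.exp(-z))\n",
    "\n",
    "def forward(x, weights, biases):\n",
    "    # a^(1) is the input; each step computes z^(l+1) = W^(l) a^(l) + b^(l+1)\n",
    "    a = x\n",
    "    activations = [a]\n",
    "    for W, b in zip(weights, biases):\n",
    "        z = W @ a + b       # weighted sum z^(l+1)\n",
    "        a = sigmoid(z)      # activation a^(l+1)\n",
    "        activations.append(a)\n",
    "    return activations\n",
    "\n",
    "# Illustrative 2-3-1 network with random parameters\n",
    "rng = np.random.default_rng(0)\n",
    "weights = [rng.normal(size=(3, 2)), rng.normal(size=(1, 3))]\n",
    "biases = [rng.normal(size=3), rng.normal(size=1)]\n",
    "acts = forward(np.array([1.0, 0.5]), weights, biases)\n",
    "print(acts[-1])  # network output y_hat, a value in (0, 1)\n",
    "```\n",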
    "\n",
    "\n",
    "### 3. The Core of Backpropagation: Error Terms\n",
    "Define the error term of the $ i $-th neuron in layer $ l $ as the partial derivative of the loss with respect to that neuron's input:\n",
    "$$ \\delta_i^{(l)} = \\frac{\\partial L}{\\partial z_i^{(l)}} $$\n",
    "\n",
    "\n",
    "### 4. Recurrence for the Error Terms\n",
    "#### Output layer ($ l=L $)\n",
    "By the chain rule:\n",
    "$$ \\delta_i^{(L)} = \\frac{\\partial L}{\\partial a_i^{(L)}} \\cdot \\sigma'\\left(z_i^{(L)}\\right) $$\n",
    "- $ \\frac{\\partial L}{\\partial a_i^{(L)}} $: derivative of the loss with respect to the output-layer activation\n",
    "- $ \\sigma'\\left(z_i^{(L)}\\right) $: derivative of the activation function\n",
    "\n",
    "#### Hidden layers ($ 2 \\leq l \\leq L-1 $)\n",
    "The error propagates backward from the next layer:\n",
    "$$ \\delta_i^{(l)} = \\left( \\sum_{k=1}^{n_{l+1}} w_{ki}^{(l)} \\cdot \\delta_k^{(l+1)} \\right) \\cdot \\sigma'\\left(z_i^{(l)}\\right) $$\n",
    "\n",
    "\n",
    "### 5. Parameter Gradients and Updates\n",
    "#### Weight gradient\n",
    "$$ \\frac{\\partial L}{\\partial w_{ij}^{(l)}} = \\delta_i^{(l+1)} \\cdot a_j^{(l)} $$\n",
    "\n",
    "#### Bias gradient\n",
    "$$ \\frac{\\partial L}{\\partial b_i^{(l+1)}} = \\delta_i^{(l+1)} $$\n",
    "\n",
    "#### Parameter update (gradient descent with learning rate $ \\eta $)\n",
    "$$ w_{ij}^{(l)} \\leftarrow w_{ij}^{(l)} - \\eta \\cdot \\frac{\\partial L}{\\partial w_{ij}^{(l)}} $$\n",
    "$$ b_i^{(l+1)} \\leftarrow b_i^{(l+1)} - \\eta \\cdot \\frac{\\partial L}{\\partial b_i^{(l+1)}} $$\n",
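    "\n",
    "The three pieces above (error recursion, gradients, updates) fit into one backward sweep per training step. A minimal sketch, assuming a sigmoid activation and the squared-error loss $ L = \\frac{1}{2}(\\hat{y}-y)^2 $, so that $ \\partial L / \\partial a^{(L)} = a^{(L)} - y $:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "def sigmoid(z):\n",
    "    return 1.0 / (1.0 + np.exp(-z))\n",
    "\n",
    "def backprop_step(x, y, weights, biases, eta=0.1):\n",
    "    # Forward pass, caching every activation a^(l)\n",
    "    a, acts = x, [x]\n",
    "    for W, b in zip(weights, biases):\n",
    "        a = sigmoid(W @ a + b)\n",
    "        acts.append(a)\n",
    "    # Output-layer error: delta^(L) = (a^(L) - y) * sigma'(z^(L))\n",
    "    delta = (acts[-1] - y) * acts[-1] * (1 - acts[-1])\n",
    "    # Backward sweep: gradients, error recursion, gradient-descent update\n",
    "    for l in range(len(weights) - 1, -1, -1):\n",
    "        grad_W = np.outer(delta, acts[l])  # dL/dW^(l) = delta^(l+1) (a^(l))^T\n",
    "        grad_b = delta                     # dL/db^(l+1) = delta^(l+1)\n",
    "        if l > 0:\n",
    "            # delta^(l) = (W^(l)^T delta^(l+1)) * sigma'(z^(l)), using W before its update\n",
    "            delta = (weights[l].T @ delta) * acts[l] * (1 - acts[l])\n",
    "        weights[l] -= eta * grad_W\n",
    "        biases[l] -= eta * grad_b\n",
    "    return acts[-1]  # prediction made before the update\n",
    "```\n",
    "\n",
    "Repeated calls should drive the loss down, which is an easy sanity check on the gradient formulas.\n",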
    "\n",
    "\n",
    "## II. A Worked Backpropagation Example\n",
    "\n",
    "### Network architecture\n",
    "- Input layer: 1 neuron, $ x $\n",
    "- Hidden layer: 1 neuron with activation $ \\sigma(x) = \\text{sigmoid}(x) = \\frac{1}{1+e^{-x}} $ and derivative $ \\sigma'(x) = \\sigma(x)(1-\\sigma(x)) $\n",
    "- Output layer: 1 neuron, $ \\hat{y} $\n",
    "- Initial parameters: $ w_1=0.2 $, $ w_2=0.5 $, $ b_1=0.1 $, $ b_2=0.3 $\n",
    "- Input $ x=1 $, target $ y=0.5 $, loss $ L = \\frac{1}{2}(\\hat{y}-y)^2 $\n",
    "\n",
    "\n",
    "### 1. Forward pass\n",
    "- Hidden-layer input:\n",
    "  $$ z_1 = w_1 x + b_1 = 0.2 \\times 1 + 0.1 = 0.3 $$\n",
    "- Hidden-layer output:\n",
    "  $$ a_1 = \\sigma(0.3) = \\frac{1}{1+e^{-0.3}} \\approx 0.5744 $$\n",
    "- Output-layer input:\n",
    "  $$ z_2 = w_2 a_1 + b_2 = 0.5 \\times 0.5744 + 0.3 \\approx 0.5872 $$\n",
    "- Output-layer output:\n",
    "  $$ \\hat{y} = \\sigma(0.5872) \\approx 0.6427 $$\n",
    "- Loss:\n",
    "  $$ L = \\frac{1}{2}(0.6427 - 0.5)^2 \\approx 0.0102 $$\n",
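    "\n",
    "These numbers are easy to verify in plain Python (values in the comments are rounded to four decimals):\n",
    "\n",
    "```python\n",
    "import math\n",
    "\n",
    "def sigmoid(z):\n",
    "    return 1.0 / (1.0 + math.exp(-z))\n",
    "\n",
    "w1, w2, b1, b2 = 0.2, 0.5, 0.1, 0.3\n",
    "x, y = 1.0, 0.5\n",
    "\n",
    "z1 = w1 * x + b1               # 0.3\n",
    "a1 = sigmoid(z1)               # ~0.5744\n",
    "z2 = w2 * a1 + b2              # ~0.5872\n",
    "y_hat = sigmoid(z2)            # ~0.6427\n",
    "loss = 0.5 * (y_hat - y) ** 2  # ~0.0102\n",
    "print(round(a1, 4), round(y_hat, 4), round(loss, 4))\n",
    "```\n",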
    "\n",
    "\n",
    "### 2. Backward pass\n",
    "#### Output-layer error $ \\delta_2 $\n",
    "- Derivative of the loss with respect to the output:\n",
    "  $$ \\frac{\\partial L}{\\partial \\hat{y}} = \\hat{y} - y \\approx 0.1427 $$\n",
    "- Activation derivative:\n",
    "  $$ \\sigma'(z_2) = \\hat{y}(1-\\hat{y}) \\approx 0.6427 \\times 0.3573 \\approx 0.2296 $$\n",
    "- Error term:\n",
    "  $$ \\delta_2 = 0.1427 \\times 0.2296 \\approx 0.0328 $$\n",
    "\n",
    "#### Hidden-layer error $ \\delta_1 $\n",
    "- Weighted error from the next layer:\n",
    "  $$ \\sum_k w_{k1} \\delta_k = w_2 \\cdot \\delta_2 \\approx 0.5 \\times 0.0328 \\approx 0.0164 $$\n",
    "- Activation derivative:\n",
    "  $$ \\sigma'(z_1) = a_1(1-a_1) \\approx 0.5744 \\times 0.4256 \\approx 0.2445 $$\n",
    "- Error term:\n",
    "  $$ \\delta_1 = 0.0164 \\times 0.2445 \\approx 0.0040 $$\n",
    "\n",
    "#### Parameter updates ($ \\eta=0.1 $)\n",
    "- Weights:\n",
    "  $$ w_2 \\leftarrow 0.5 - 0.1 \\times (0.0328 \\times 0.5744) \\approx 0.4981 $$\n",
    "  $$ w_1 \\leftarrow 0.2 - 0.1 \\times (0.0040 \\times 1) \\approx 0.1996 $$\n",
    "- Biases:\n",
    "  $$ b_2 \\leftarrow 0.3 - 0.1 \\times 0.0328 \\approx 0.2967 $$\n",
    "  $$ b_1 \\leftarrow 0.1 - 0.1 \\times 0.0040 \\approx 0.0996 $$\n",
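    "\n",
    "The backward-pass arithmetic can be re-checked the same way (self-contained; the forward pass is recomputed first):\n",
    "\n",
    "```python\n",
    "import math\n",
    "\n",
    "def sigmoid(z):\n",
    "    return 1.0 / (1.0 + math.exp(-z))\n",
    "\n",
    "w1, w2, b1, b2 = 0.2, 0.5, 0.1, 0.3\n",
    "x, y, eta = 1.0, 0.5, 0.1\n",
    "\n",
    "# Forward pass\n",
    "a1 = sigmoid(w1 * x + b1)\n",
    "y_hat = sigmoid(w2 * a1 + b2)\n",
    "\n",
    "# Error terms\n",
    "delta2 = (y_hat - y) * y_hat * (1 - y_hat)  # ~0.0328\n",
    "delta1 = w2 * delta2 * a1 * (1 - a1)        # ~0.0040\n",
    "\n",
    "# Gradient-descent updates\n",
    "w2_new = w2 - eta * delta2 * a1  # ~0.4981\n",
    "w1_new = w1 - eta * delta1 * x   # ~0.1996\n",
    "b2_new = b2 - eta * delta2       # ~0.2967\n",
    "b1_new = b1 - eta * delta1       # ~0.0996\n",
    "print(round(w2_new, 4), round(w1_new, 4), round(b2_new, 4), round(b1_new, 4))\n",
    "```\n",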
    "\n",
    "\n",
    "## III. Complexity Analysis\n",
    "\n",
    "### Per-iteration cost versus node count\n",
    "Let the total number of nodes be $ n = \\sum_{l=1}^m n_l $, where $ m $ is the number of layers:\n",
    "\n",
    "1. **Forward propagation**:\n",
    "   - Each layer costs $ n_l \\times n_{l+1} $ weight multiplications plus $ n_{l+1} $ bias additions\n",
    "   - Total cost: $ \\sum_{l=1}^{m-1} (n_l n_{l+1} + n_{l+1}) = O(n^2) $ for a fixed number of layers of comparable width\n",
    "\n",
    "2. **Backward propagation**:\n",
    "   - Error-term computation: symmetric to the forward pass, $ O(n^2) $\n",
    "   - Gradient updates: one operation per weight, $ O(n^2) $\n",
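    "\n",
    "The quadratic growth can be checked by counting multiply-add operations directly (a sketch; the equal layer widths are an illustrative assumption):\n",
    "\n",
    "```python\n",
    "def forward_ops(layer_sizes):\n",
    "    # One forward pass: n_l * n_{l+1} multiplications plus n_{l+1} bias adds per layer\n",
    "    return sum(a * b + b for a, b in zip(layer_sizes, layer_sizes[1:]))\n",
    "\n",
    "small = forward_ops([10, 10, 10])   # 220 operations\n",
    "large = forward_ops([20, 20, 20])   # 840 operations\n",
    "print(large / small)  # ~3.8: doubling the widths roughly quadruples the cost\n",
    "```\n",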
    "\n",
    "\n",
    "### Conclusion\n",
    "The computation for one training iteration scales **quadratically** with the total node count $ n $, i.e. $ O(n^2) $: the number of inter-layer weights grows with the square of the layer widths, and weight operations dominate the cost."
   ],
   "id": "c1c48fea65eb04a4"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "outputs": [],
   "execution_count": null,
   "source": "",
   "id": "2bada338d304c73c"
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 2
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython2",
   "version": "2.7.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
