{
 "cells": [
  {
   "metadata": {
    "collapsed": true
   },
   "cell_type": "markdown",
   "source": [
     "# 2.3 Linear Algebra\n",
     "\n",
     "## 2.3.1 Scalars\n",
     "\n",
     "A value consisting of just one number is called a scalar. Scalar variables are usually denoted by lowercase letters. A scalar is represented by a tensor with a single element: a 0-dimensional array.\n",
     "\n",
     "A one-element array is not a scalar, since its number of dimensions is 1. A scalar can be thought of as an ordinary everyday number."
   ],
   "id": "fea22fe635522808"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.013107Z",
     "start_time": "2025-11-16T13:41:58.998197Z"
    }
   },
   "cell_type": "code",
   "source": [
    "import torch\n",
    "\n",
     "# Create scalars\n",
    "x = torch.tensor(3.0)\n",
    "y = torch.tensor(2.0)\n",
    "\n",
    "x + y, x * y, x / y, x**y"
   ],
   "id": "52e9e1dd25137dc1",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor(5.), tensor(6.), tensor(1.5000), tensor(9.))"
      ]
     },
     "execution_count": 6,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 6
  },
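  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "As a quick check of the claim above (a minimal sketch), a tensor's dim() method distinguishes a true scalar from a one-element array:",
   "id": "a0b1c2d3e4f50001"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "# A scalar tensor has 0 dimensions; a one-element vector has 1\n",
    "x = torch.tensor(3.0)\n",
    "v = torch.tensor([3.0])\n",
    "x.dim(), v.dim(), v.shape"
   ],
   "id": "a0b1c2d3e4f50002",
   "outputs": [],
   "execution_count": null
  },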
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "## 2.3.2 Vectors\n",
     "\n",
     "A vector can be viewed as a list of scalar values. These scalars are called the elements (or components) of the vector. Vectors are usually denoted by bold lowercase letters.\n",
     "\n",
     "A vector is a one-dimensional array. By default, a vector is taken to be a column vector rather than a row vector."
   ],
   "id": "62b111df63290ea7"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.025849Z",
     "start_time": "2025-11-16T13:41:59.019410Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Create a vector\n",
    "x = torch.arange(4)\n",
    "x"
   ],
   "id": "5d38355d372463ff",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([0, 1, 2, 3])"
      ]
     },
     "execution_count": 7,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 7
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.043264Z",
     "start_time": "2025-11-16T13:41:59.033028Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Access an element by index\n",
    "x[3]"
   ],
   "id": "86c90d4503061530",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor(3)"
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 8
  },
  {
   "metadata": {},
   "cell_type": "markdown",
    "source": "The number of elements in a vector is its length; the length of a vector is usually called its dimension.",
   "id": "b859772ce9b5d5d2"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.055141Z",
     "start_time": "2025-11-16T13:41:59.050076Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Get the length of the vector\n",
    "len(x)"
   ],
   "id": "e8e996253ba30336",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "4"
      ]
     },
     "execution_count": 9,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 9
  },
  {
   "metadata": {},
   "cell_type": "markdown",
    "source": "A vector is a one-dimensional array with a single axis, so its shape contains just one number: its length. For a vector, therefore, length, dimension, and shape all come down to the same thing, the number of elements.",
   "id": "a1b4eca7c7b82f43"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.064368Z",
     "start_time": "2025-11-16T13:41:59.059875Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Get the shape of the vector\n",
    "x.shape"
   ],
   "id": "6987a9ad5bd391a1",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "torch.Size([4])"
      ]
     },
     "execution_count": 10,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 10
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "## 2.3.3 Matrices\n",
     "\n",
     "A matrix is a two-dimensional array with two axes, usually denoted by a bold uppercase letter. A matrix with the same number of rows and columns is called a square matrix. The shape of a matrix is rows × columns."
   ],
   "id": "438784fee8ffa5f2"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.081342Z",
     "start_time": "2025-11-16T13:41:59.072999Z"
    }
   },
   "cell_type": "code",
   "source": [
    "A = torch.arange(20).reshape(5, 4)\n",
    "A"
   ],
   "id": "570511a07f1f1d6e",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[ 0,  1,  2,  3],\n",
       "        [ 4,  5,  6,  7],\n",
       "        [ 8,  9, 10, 11],\n",
       "        [12, 13, 14, 15],\n",
       "        [16, 17, 18, 19]])"
      ]
     },
     "execution_count": 11,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 11
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.096912Z",
     "start_time": "2025-11-16T13:41:59.088549Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Transpose of the matrix\n",
    "A.T"
   ],
   "id": "a282e4ebc4e64592",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[ 0,  4,  8, 12, 16],\n",
       "        [ 1,  5,  9, 13, 17],\n",
       "        [ 2,  6, 10, 14, 18],\n",
       "        [ 3,  7, 11, 15, 19]])"
      ]
     },
     "execution_count": 12,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 12
  },
  {
   "metadata": {},
   "cell_type": "markdown",
    "source": "A matrix that is equal to its own transpose is called a symmetric matrix; its elements are symmetric about the main diagonal.",
   "id": "63a89f6f5069c2c9"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.117081Z",
     "start_time": "2025-11-16T13:41:59.102399Z"
    }
   },
   "cell_type": "code",
   "source": [
    "B = torch.tensor([[1, 2, 3], [2, 0, 4], [3, 4, 5]])\n",
     "B  # a symmetric matrix"
   ],
   "id": "1703ebe2b8e22818",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[1, 2, 3],\n",
       "        [2, 0, 4],\n",
       "        [3, 4, 5]])"
      ]
     },
     "execution_count": 13,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 13
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.133294Z",
     "start_time": "2025-11-16T13:41:59.124791Z"
    }
   },
   "cell_type": "code",
   "source": "B == B.T",
   "id": "68e4b6d2345bd22d",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[True, True, True],\n",
       "        [True, True, True],\n",
       "        [True, True, True]])"
      ]
     },
     "execution_count": 14,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 14
  },
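  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "The elementwise comparison above can be collapsed into a single boolean with torch.equal, which checks that two tensors have the same shape and the same elements (a minimal sketch using the B defined above):",
   "id": "a0b1c2d3e4f50003"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "# torch.equal returns one Python bool: same shape and same elements\n",
    "torch.equal(B, B.T)"
   ],
   "id": "a0b1c2d3e4f50004",
   "outputs": [],
   "execution_count": null
  },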
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "Although a single vector defaults to a column vector, in a matrix representing a tabular dataset it is more common to treat each data sample as a row vector of the matrix.\n",
     "\n",
     "## 2.3.4 Tensors\n",
     "\n",
     "Tensors are usually denoted by uppercase letters in a special font, to distinguish them from matrices."
   ],
   "id": "887be2288c8716ca"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.149358Z",
     "start_time": "2025-11-16T13:41:59.138293Z"
    }
   },
   "cell_type": "code",
   "source": [
    "X = torch.arange(24).reshape(2, 3, 4)\n",
    "X"
   ],
   "id": "746feae8402e7fe5",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[[ 0,  1,  2,  3],\n",
       "         [ 4,  5,  6,  7],\n",
       "         [ 8,  9, 10, 11]],\n",
       "\n",
       "        [[12, 13, 14, 15],\n",
       "         [16, 17, 18, 19],\n",
       "         [20, 21, 22, 23]]])"
      ]
     },
     "execution_count": 15,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 15
  },
  {
   "metadata": {},
   "cell_type": "markdown",
    "source": "## 2.3.5 Tensor Arithmetic",
   "id": "51062825301fc420"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.180146Z",
     "start_time": "2025-11-16T13:41:59.164104Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Addition\n",
     "A = torch.arange(20, dtype=torch.float32).reshape(5, 4)\n",
     "B = A.clone()  # assign a copy of A to B by allocating new memory\n",
    "A, A + B"
   ],
   "id": "6ecd441189232602",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([[ 0.,  1.,  2.,  3.],\n",
       "         [ 4.,  5.,  6.,  7.],\n",
       "         [ 8.,  9., 10., 11.],\n",
       "         [12., 13., 14., 15.],\n",
       "         [16., 17., 18., 19.]]),\n",
       " tensor([[ 0.,  2.,  4.,  6.],\n",
       "         [ 8., 10., 12., 14.],\n",
       "         [16., 18., 20., 22.],\n",
       "         [24., 26., 28., 30.],\n",
       "         [32., 34., 36., 38.]]))"
      ]
     },
     "execution_count": 16,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 16
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.196565Z",
     "start_time": "2025-11-16T13:41:59.186511Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Hadamard product\n",
     "# elements in the same position are multiplied\n",
    "A * B"
   ],
   "id": "478683a7715500dc",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[  0.,   1.,   4.,   9.],\n",
       "        [ 16.,  25.,  36.,  49.],\n",
       "        [ 64.,  81., 100., 121.],\n",
       "        [144., 169., 196., 225.],\n",
       "        [256., 289., 324., 361.]])"
      ]
     },
     "execution_count": 17,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 17
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.214810Z",
     "start_time": "2025-11-16T13:41:59.200825Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Addition and multiplication with a scalar\n",
     "# the scalar is added to (or multiplied with) every element\n",
    "a = 2\n",
    "X = torch.arange(24).reshape(2, 3, 4)\n",
    "a + X, (a * X).shape"
   ],
   "id": "f6e1865c139aa856",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([[[ 2,  3,  4,  5],\n",
       "          [ 6,  7,  8,  9],\n",
       "          [10, 11, 12, 13]],\n",
       " \n",
       "         [[14, 15, 16, 17],\n",
       "          [18, 19, 20, 21],\n",
       "          [22, 23, 24, 25]]]),\n",
       " torch.Size([2, 3, 4]))"
      ]
     },
     "execution_count": 18,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 18
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "## 2.3.6 Reduction\n",
     "\n",
     "Summing a Python list clearly yields a single number, not a list. In the same way, summing a vector yields a scalar: the tensor's number of dimensions drops from 1 to 0, so the sum reduces the tensor's dimensionality."
   ],
   "id": "feb7338dd5ff83e5"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.238026Z",
     "start_time": "2025-11-16T13:41:59.220704Z"
    }
   },
   "cell_type": "code",
   "source": [
    "x = torch.arange(4, dtype=torch.float32)\n",
    "x, x.sum()"
   ],
   "id": "8340614d79f4ebd0",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([0., 1., 2., 3.]), tensor(6.))"
      ]
     },
     "execution_count": 19,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 19
  },
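  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "For comparison (a small sketch), summing the same values as a plain Python list also collapses them to one number; tolist() converts the tensor to a list:",
   "id": "a0b1c2d3e4f50005"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "# Python's built-in sum over a list yields a single number, mirroring x.sum()\n",
    "sum(x.tolist()), x.sum().item()"
   ],
   "id": "a0b1c2d3e4f50006",
   "outputs": [],
   "execution_count": null
  },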
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.253083Z",
     "start_time": "2025-11-16T13:41:59.243026Z"
    }
   },
   "cell_type": "code",
   "source": "A",
   "id": "179dfce2ca709178",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[ 0.,  1.,  2.,  3.],\n",
       "        [ 4.,  5.,  6.,  7.],\n",
       "        [ 8.,  9., 10., 11.],\n",
       "        [12., 13., 14., 15.],\n",
       "        [16., 17., 18., 19.]])"
      ]
     },
     "execution_count": 20,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 20
  },
  {
   "metadata": {},
   "cell_type": "markdown",
    "source": "By default the sum method reduces a tensor all the way down to a scalar. sum is really a reduction function: the axis keyword argument specifies the axis to reduce along, and that axis disappears from the result.",
   "id": "654b82c88834d8d1"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.269198Z",
     "start_time": "2025-11-16T13:41:59.259548Z"
    }
   },
   "cell_type": "code",
   "source": "A.shape, A.sum()",
   "id": "2b3ad55692ad31bf",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(torch.Size([5, 4]), tensor(190.))"
      ]
     },
     "execution_count": 21,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 21
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.284874Z",
     "start_time": "2025-11-16T13:41:59.273483Z"
    }
   },
   "cell_type": "code",
   "source": [
    "A_sum_axis0 = A.sum(axis=0)\n",
    "A_sum_axis0, A_sum_axis0.shape"
   ],
   "id": "66d0b8df7821454",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([40., 45., 50., 55.]), torch.Size([4]))"
      ]
     },
     "execution_count": 22,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 22
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.301417Z",
     "start_time": "2025-11-16T13:41:59.289873Z"
    }
   },
   "cell_type": "code",
   "source": [
    "A_sum_axis1 = A.sum(axis=1)\n",
    "A_sum_axis1, A_sum_axis1.shape"
   ],
   "id": "610240386b31d8f0",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([ 6., 22., 38., 54., 70.]), torch.Size([5]))"
      ]
     },
     "execution_count": 23,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 23
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.315427Z",
     "start_time": "2025-11-16T13:41:59.305295Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Reduce along both axes\n",
     "A.sum(axis=[0, 1])  # same result as A.sum()"
   ],
   "id": "84d9f1c4f7811146",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor(190.)"
      ]
     },
     "execution_count": 24,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 24
  },
  {
   "metadata": {},
   "cell_type": "markdown",
    "source": "The mean method, which computes the average, is a reduction function just like sum.",
   "id": "d03035117d5e3bfc"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.331396Z",
     "start_time": "2025-11-16T13:41:59.322610Z"
    }
   },
   "cell_type": "code",
   "source": "A.mean(), A.sum() / A.numel()",
   "id": "b6c3ffe965e76885",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor(9.5000), tensor(9.5000))"
      ]
     },
     "execution_count": 25,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 25
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.353987Z",
     "start_time": "2025-11-16T13:41:59.343987Z"
    }
   },
   "cell_type": "code",
   "source": "A.mean(axis=0), A.sum(axis=0) / A.shape[0]",
   "id": "37823a00b0fa10c7",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([ 8.,  9., 10., 11.]), tensor([ 8.,  9., 10., 11.]))"
      ]
     },
     "execution_count": 26,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 26
  },
  {
   "metadata": {},
   "cell_type": "markdown",
    "source": "With the keyword argument keepdims=True, a reduction method keeps the reduced axis as a dimension of size 1 instead of dropping it, so the tensor's number of dimensions is preserved. This is convenient for broadcasting.",
   "id": "afa39d25809e0fb2"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.372079Z",
     "start_time": "2025-11-16T13:41:59.363825Z"
    }
   },
   "cell_type": "code",
   "source": [
    "sum_A = A.sum(axis=1, keepdims=True)\n",
    "sum_A"
   ],
   "id": "8b52887c12e3aedf",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[ 6.],\n",
       "        [22.],\n",
       "        [38.],\n",
       "        [54.],\n",
       "        [70.]])"
      ]
     },
     "execution_count": 27,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 27
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.386942Z",
     "start_time": "2025-11-16T13:41:59.380832Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Broadcasting\n",
    "A / sum_A"
   ],
   "id": "5b83ac1ffdd78bdd",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[0.0000, 0.1667, 0.3333, 0.5000],\n",
       "        [0.1818, 0.2273, 0.2727, 0.3182],\n",
       "        [0.2105, 0.2368, 0.2632, 0.2895],\n",
       "        [0.2222, 0.2407, 0.2593, 0.2778],\n",
       "        [0.2286, 0.2429, 0.2571, 0.2714]])"
      ]
     },
     "execution_count": 28,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 28
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.402455Z",
     "start_time": "2025-11-16T13:41:59.395019Z"
    }
   },
   "cell_type": "code",
   "source": "A",
   "id": "277a6479d4477fea",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[ 0.,  1.,  2.,  3.],\n",
       "        [ 4.,  5.,  6.,  7.],\n",
       "        [ 8.,  9., 10., 11.],\n",
       "        [12., 13., 14., 15.],\n",
       "        [16., 17., 18., 19.]])"
      ]
     },
     "execution_count": 29,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 29
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.417790Z",
     "start_time": "2025-11-16T13:41:59.409698Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Cumulative sum\n",
     "# row i of the result is the sum of rows 0 through i of A\n",
    "A.cumsum(axis=0)"
   ],
   "id": "9b5ecffe57d6bc38",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([[ 0.,  1.,  2.,  3.],\n",
       "        [ 4.,  6.,  8., 10.],\n",
       "        [12., 15., 18., 21.],\n",
       "        [24., 28., 32., 36.],\n",
       "        [40., 45., 50., 55.]])"
      ]
     },
     "execution_count": 30,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 30
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "## 2.3.7 Dot Product\n",
     "\n",
     "The dot product is the inner product of two vectors: for vectors with the same number of elements, multiply corresponding components and sum the results. It is often written as a row vector times a column vector, $x^\\top y$."
   ],
   "id": "363f4a346bd7f66f"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:41:59.432839Z",
     "start_time": "2025-11-16T13:41:59.423169Z"
    }
   },
   "cell_type": "code",
   "source": "x",
   "id": "e25e51e3df95a5ba",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor([0., 1., 2., 3.])"
      ]
     },
     "execution_count": 31,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 31
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:42:41.174578Z",
     "start_time": "2025-11-16T13:42:41.150529Z"
    }
   },
   "cell_type": "code",
   "source": [
     "y = torch.ones(4, dtype=torch.float32)\n",
     "# Dot product\n",
     "x, y, torch.dot(x, y)"
   ],
   "id": "714512ff753622d7",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([0., 1., 2., 3.]), tensor([1., 1., 1., 1.]), tensor(6.))"
      ]
     },
     "execution_count": 34,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 34
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:44:16.327801Z",
     "start_time": "2025-11-16T13:44:16.311057Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# An equivalent way to compute the dot product\n",
    "torch.sum(x * y)"
   ],
   "id": "7a96247a2f265d01",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor(6.)"
      ]
     },
     "execution_count": 35,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 35
  },
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "## 2.3.8 Matrix-Vector Products\n",
     "\n",
     "A matrix-vector product takes the dot product of each row of the matrix with the column vector; the results form a new column vector."
   ],
   "id": "3d5287af13fd504a"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:52:15.662545Z",
     "start_time": "2025-11-16T13:52:15.641570Z"
    }
   },
   "cell_type": "code",
   "source": "A, x",
   "id": "621f74aac04a2fcf",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([[ 0.,  1.,  2.,  3.],\n",
       "         [ 4.,  5.,  6.,  7.],\n",
       "         [ 8.,  9., 10., 11.],\n",
       "         [12., 13., 14., 15.],\n",
       "         [16., 17., 18., 19.]]),\n",
       " tensor([0., 1., 2., 3.]))"
      ]
     },
     "execution_count": 38,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 38
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:49:06.034138Z",
     "start_time": "2025-11-16T13:49:06.019207Z"
    }
   },
   "cell_type": "code",
   "source": [
     "# Matrix-vector product\n",
     "A.shape, x.shape, torch.mv(A, x)"
   ],
   "id": "3e8238027546e9e8",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(torch.Size([5, 4]), torch.Size([4]), tensor([ 14.,  38.,  62.,  86., 110.]))"
      ]
     },
     "execution_count": 36,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 36
  },
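  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "To connect torch.mv back to the definition, the same result can be rebuilt row by row from dot products (a minimal sketch using the A and x above):",
   "id": "a0b1c2d3e4f50007"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "# each output element is the dot product of one row of A with x\n",
    "row_dots = torch.stack([torch.dot(row, x) for row in A])\n",
    "row_dots, torch.equal(row_dots, torch.mv(A, x))"
   ],
   "id": "a0b1c2d3e4f50008",
   "outputs": [],
   "execution_count": null
  },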
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "## 2.3.9 Matrix Multiplication\n",
     "\n",
     "The matrix-vector product above is a special case of matrix multiplication in which the second matrix is a single column vector."
   ],
   "id": "a1bd9a23830b9eb5"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:55:30.759740Z",
     "start_time": "2025-11-16T13:55:30.739736Z"
    }
   },
   "cell_type": "code",
   "source": [
     "B = torch.ones(4, 3)\n",
     "# Matrix multiplication\n",
     "A, B, torch.mm(A, B)"
   ],
   "id": "cc26b4d951860e53",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(tensor([[ 0.,  1.,  2.,  3.],\n",
       "         [ 4.,  5.,  6.,  7.],\n",
       "         [ 8.,  9., 10., 11.],\n",
       "         [12., 13., 14., 15.],\n",
       "         [16., 17., 18., 19.]]),\n",
       " tensor([[1., 1., 1.],\n",
       "         [1., 1., 1.],\n",
       "         [1., 1., 1.],\n",
       "         [1., 1., 1.]]),\n",
       " tensor([[ 6.,  6.,  6.],\n",
       "         [22., 22., 22.],\n",
       "         [38., 38., 38.],\n",
       "         [54., 54., 54.],\n",
       "         [70., 70., 70.]]))"
      ]
     },
     "execution_count": 41,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 41
  },
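  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "The @ operator is an equivalent way to write matrix multiplication in Python; a quick check (a minimal sketch) that it matches torch.mm for the A and B above:",
   "id": "a0b1c2d3e4f50009"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "# A @ B dispatches to matrix multiplication, same result as torch.mm(A, B)\n",
    "torch.equal(A @ B, torch.mm(A, B))"
   ],
   "id": "a0b1c2d3e4f5000a",
   "outputs": [],
   "execution_count": null
  },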
  {
   "metadata": {},
   "cell_type": "markdown",
   "source": [
     "## 2.3.10 Norms\n",
     "\n",
     "- The L1 norm is the sum of the absolute values of a vector's elements.\n",
     "- The L2 norm is the vector's Euclidean distance from the origin.\n",
     "- The Frobenius norm of a matrix is the square root of the sum of the squares of all its elements (analogous to a vector's distance from the origin)."
   ],
   "id": "b4dc464497a3359d"
  },
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T13:57:59.280951Z",
     "start_time": "2025-11-16T13:57:59.268066Z"
    }
   },
   "cell_type": "code",
   "source": [
    "u = torch.tensor([3.0, -4.0])\n",
    "torch.norm(u)"
   ],
   "id": "748751b4cbc0d7b2",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor(5.)"
      ]
     },
     "execution_count": 42,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 42
  },
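  {
   "metadata": {},
   "cell_type": "markdown",
   "source": "The list above also mentions the L1 norm, which the calls so far do not demonstrate; it can be computed as the sum of absolute values, or by passing p=1 to torch.norm (a minimal sketch using the u above):",
   "id": "a0b1c2d3e4f5000b"
  },
  {
   "metadata": {},
   "cell_type": "code",
   "source": [
    "# L1 norm of u = [3, -4]: |3| + |-4| = 7\n",
    "torch.abs(u).sum(), torch.norm(u, p=1)"
   ],
   "id": "a0b1c2d3e4f5000c",
   "outputs": [],
   "execution_count": null
  },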
  {
   "metadata": {
    "ExecuteTime": {
     "end_time": "2025-11-16T14:00:51.251180Z",
     "start_time": "2025-11-16T14:00:51.247408Z"
    }
   },
   "cell_type": "code",
   "source": "torch.norm(torch.ones((4, 9)))",
   "id": "342b7ca9177a3828",
   "outputs": [
    {
     "data": {
      "text/plain": [
       "tensor(6.)"
      ]
     },
     "execution_count": 43,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "execution_count": 43
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 2
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython2",
   "version": "2.7.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
