{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "b499934e",
   "metadata": {},
   "source": [
    "# Building a Network\n",
    "To define a custom network, we subclass `nn.Cell`. MindSpore's `Cell` class is the base class of all networks and their basic building unit. This section focuses on how to construct a feedforward neural network."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1ad0f936",
   "metadata": {},
   "source": [
    "A basic custom network subclasses `Cell` and overrides the `__init__` and `construct` methods: the layers and operators are defined in `__init__`, and the network topology is assembled in `construct`.\n",
    "\n",
    "A simple convolutional network serves as an example:"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1dba9187",
   "metadata": {},
   "source": [
    "## 1. Basic Usage"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "id": "847f1d39",
   "metadata": {},
   "outputs": [],
   "source": [
    "import mindspore.nn as nn\n",
    "\n",
    "class MyNet(nn.Cell):\n",
    "    \"\"\"A simple convolutional network\"\"\"\n",
    "    def __init__(self):\n",
    "        super(MyNet, self).__init__()\n",
    "        self.conv = nn.Conv2d(10, 20, 3, has_bias=True, weight_init='normal')\n",
    "        self.relu = nn.ReLU()\n",
    "    \n",
    "    def construct(self, x):\n",
    "        x = self.conv(x)\n",
    "        x = self.relu(x)\n",
    "        return x"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "8cf2a90d",
   "metadata": {},
   "source": [
    "Building a network is much like assembling building blocks: the blocks you need are defined in `__init__`, and they are put together in `construct`."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "16ff29d7",
   "metadata": {},
   "source": [
    "## 2. Getting Network Parameters\n",
    "`nn.Cell` offers three methods that return parameters: `parameters_dict`, `get_parameters`, and `trainable_params`; they differ in their return values.\n",
    "\n",
    "- `parameters_dict`: returns all parameters in the network as a dictionary whose keys are parameter names and whose values are the parameters.\n",
    "- `get_parameters`: returns an iterator over all parameters in the network.\n",
    "- `trainable_params`: returns a list of all trainable parameters."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "id": "7c7a8f39",
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "OrderedDict([('conv.weight', Parameter (name=conv.weight, shape=(20, 10, 3, 3), dtype=Float32, requires_grad=True)), ('conv.bias', Parameter (name=conv.bias, shape=(20,), dtype=Float32, requires_grad=True))])\n"
     ]
    }
   ],
   "source": [
    "net = MyNet()\n",
    "\n",
    "param_dict = net.parameters_dict()  # avoid shadowing the builtin 'dict'\n",
    "# The returned dictionary is an OrderedDict\n",
    "print(param_dict)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "id": "6745ffbe",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Parameter (name=conv.weight, shape=(20, 10, 3, 3), dtype=Float32, requires_grad=True)\n",
      "Parameter (name=conv.bias, shape=(20,), dtype=Float32, requires_grad=True)\n"
     ]
    }
   ],
   "source": [
    "params = net.get_parameters()\n",
    "for param in params:\n",
    "    print(param)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "id": "d5dda2ba",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[Parameter (name=conv.weight, shape=(20, 10, 3, 3), dtype=Float32, requires_grad=True), Parameter (name=conv.bias, shape=(20,), dtype=Float32, requires_grad=True)]\n"
     ]
    }
   ],
   "source": [
    "params_list = net.trainable_params()  # avoid shadowing the builtin 'list'\n",
    "print(params_list)"
   ]
  },
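  {
   "cell_type": "markdown",
   "id": "a1f20c3d",
   "metadata": {},
   "source": [
    "The difference between `get_parameters` and `trainable_params` becomes visible once a parameter is frozen. As a minimal sketch (assuming the `MyNet` instance above), setting a parameter's `requires_grad` to `False` removes it from `trainable_params`, while `get_parameters` still yields it:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b2e31d4e",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Freeze the convolution bias: get_parameters still yields it,\n",
    "# but trainable_params no longer includes it.\n",
    "net.conv.bias.requires_grad = False\n",
    "print(len(list(net.get_parameters())))  # number of all parameters\n",
    "print(len(net.trainable_params()))      # number of trainable parameters only\n",
    "net.conv.bias.requires_grad = True      # restore for the cells below"
   ]
  },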
  {
   "cell_type": "markdown",
   "id": "ebfe2f1a",
   "metadata": {},
   "source": [
    "## 3. Related Methods\n",
    "Here we introduce a few other methods of `Cell`:"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "dd7df5ac",
   "metadata": {},
   "source": [
    "### cells_and_names\n",
    "Iterates over all cells in the network (including the network itself), yielding each cell together with its name:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "id": "699546b8",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "('', MyNet<\n",
      "  (conv): Conv2d<input_channels=10, output_channels=20, kernel_size=(3, 3), stride=(1, 1), pad_mode=same, padding=0, dilation=(1, 1), group=1, has_bias=True, weight_init=normal, bias_init=zeros, format=NCHW>\n",
      "  (relu): ReLU<>\n",
      "  >)\n",
      "('conv', Conv2d<input_channels=10, output_channels=20, kernel_size=(3, 3), stride=(1, 1), pad_mode=same, padding=0, dilation=(1, 1), group=1, has_bias=True, weight_init=normal, bias_init=zeros, format=NCHW>)\n",
      "('relu', ReLU<>)\n"
     ]
    }
   ],
   "source": [
    "for item in net.cells_and_names():\n",
    "    print(item)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e99b35ee",
   "metadata": {},
   "source": [
    "### set_grad\n",
    "`set_grad` specifies whether the network needs gradients computed. Called with no argument, it sets `requires_grad` to `True` by default, so the backward network used for gradient computation is built when the forward network executes. The `TrainOneStepCell` and `GradOperation` interfaces already call it internally and need no explicit `set_grad`; when implementing a custom training interface of this kind, `set_grad` must be called inside or outside of it."
   ]
  },
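  {
   "cell_type": "markdown",
   "id": "c3f42e5a",
   "metadata": {},
   "source": [
    "As a minimal sketch (assuming the `MyNet` instance defined above), gradients can be taken with `ops.GradOperation`, which sets `set_grad` internally, so no explicit call is needed:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "d4a53f6b",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import mindspore as ms\n",
    "import mindspore.ops as ops\n",
    "\n",
    "# GradOperation wraps the network into a function that returns\n",
    "# the gradient with respect to the first input.\n",
    "grad_fn = ops.GradOperation()(net)\n",
    "x = ms.Tensor(np.ones([1, 10, 32, 32]), ms.float32)\n",
    "# The gradient has the same shape as the input\n",
    "print(grad_fn(x).shape)"
   ]
  },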
  {
   "cell_type": "markdown",
   "id": "347684d2",
   "metadata": {},
   "source": [
    "### set_train\n",
    "`set_train`接口指定模型是否为训练模式，在不传入参数调用时，默认设置的`mode`属性为`True`。\n",
    "\n",
    "在实现训练和推理结构不同的网络时可以通过training属性区分训练和推理场景，当`mode`设置为`True`时，为训练场景；当`mode`设置为`False`时，为推理场景。\n",
    "\n",
    "我们以Dropout算子举例："
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "id": "f9d3577b",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "training result:\n",
      " [[[0.        0.        1.4285715]\n",
      "  [1.4285715 0.        1.4285715]]]\n",
      "infer result:\n",
      " [[[1. 1. 1.]\n",
      "  [1. 1. 1.]]]\n"
     ]
    }
   ],
   "source": [
    "import numpy as np\n",
    "import mindspore as ms\n",
    "\n",
    "net = nn.Dropout(keep_prob=0.7)\n",
    "x = ms.Tensor(np.ones([1, 2, 3]), ms.float32)  # avoid shadowing the builtin 'input'\n",
    "\n",
    "# Training mode: some elements are zeroed, the rest are scaled by 1/keep_prob\n",
    "net.set_train()\n",
    "output = net(x)\n",
    "print(\"training result:\\n\", output)\n",
    "\n",
    "# Inference mode: Dropout acts as an identity mapping\n",
    "net.set_train(mode=False)\n",
    "output = net(x)\n",
    "print(\"infer result:\\n\", output)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "13f88e69",
   "metadata": {},
   "source": [
    "## 4. Building Networks\n",
    "There are two ways to build a network: with operators from the `nn` module or with operators from the `ops` module.\n",
    "\n",
    "- The `mindspore.ops` module provides primitive operators, such as mathematical and neural-network operators.\n",
    "- The `mindspore.nn` module provides operators wrapped for neural networks.\n",
    "\n",
    "We take the simple function $f(x)=2x+3$ as an example:"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "61cb7ecf",
   "metadata": {},
   "source": [
    "### Using the ops module"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "id": "574007a9",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[5. 7. 9.]\n"
     ]
    }
   ],
   "source": [
    "import mindspore.ops as ops\n",
    "\n",
    "class Net(nn.Cell):\n",
    "    \"\"\"A simple function\"\"\"\n",
    "    def __init__(self):\n",
    "        super(Net, self).__init__()\n",
    "        self.mul = ops.Mul()  # element-wise multiplication\n",
    "        self.add = ops.Add()  # element-wise addition\n",
    "        self.bias = ms.Parameter(ms.Tensor(np.array([3, 3, 3]), ms.float32))\n",
    "        \n",
    "    def construct(self, x):\n",
    "        x = self.add(self.mul(2, x), self.bias)\n",
    "        return x\n",
    "    \n",
    "net = Net()\n",
    "x = ms.Tensor(np.array([1, 2, 3]), ms.float32)  # avoid shadowing the builtin 'input'\n",
    "output = net(x)\n",
    "print(output)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "89347e26",
   "metadata": {},
   "source": [
    "### Using the nn module"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 40,
   "id": "7edfad0a",
   "metadata": {},
   "outputs": [],
   "source": [
    "class Net(nn.Cell):\n",
    "    \"\"\"A simple function\"\"\"\n",
    "    def __init__(self):\n",
    "        super(Net, self).__init__()\n",
    "        # Dense computes x * W^T + b; initializing W to 2 and b to 3\n",
    "        # realizes f(x) = 2x + 3 exactly\n",
    "        self.dense = nn.Dense(1, 1,\n",
    "                              weight_init=ms.Tensor([[2.0]], ms.float32),\n",
    "                              bias_init=ms.Tensor([3.0], ms.float32))\n",
    "    \n",
    "    def construct(self, x):\n",
    "        x = self.dense(x)\n",
    "        return x\n",
    "    \n",
    "net = Net()\n",
    "x = ms.Tensor(np.array([[1.0], [2.0], [3.0]]), ms.float32)\n",
    "print(net(x))"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1320c0f0",
   "metadata": {},
   "source": [
    "Networks can also be built with containers: to make sub-modules and layers easier to manage and to compose into more complex networks, `mindspore.nn` provides two containers, `nn.CellList` and `nn.SequentialCell`. They are not covered in detail here; see the official documentation."
   ]
  }
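  ,
  {
   "cell_type": "markdown",
   "id": "e5b64a7c",
   "metadata": {},
   "source": [
    "As a brief illustrative sketch, `nn.SequentialCell` chains a list of cells so that each cell's output feeds the next; the `MyNet` network from the beginning could equivalently be written as:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "f6c75b8d",
   "metadata": {},
   "outputs": [],
   "source": [
    "seq_net = nn.SequentialCell([\n",
    "    nn.Conv2d(10, 20, 3, has_bias=True, weight_init='normal'),\n",
    "    nn.ReLU()\n",
    "])\n",
    "x = ms.Tensor(np.ones([1, 10, 32, 32]), ms.float32)\n",
    "# Spatial size is preserved (pad_mode defaults to 'same')\n",
    "print(seq_net(x).shape)"
   ]
  }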
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "mindspore",
   "language": "python",
   "name": "mindvision"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.5"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
