{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "e8b3dd55",
   "metadata": {},
   "source": [
    "# Section 5: Model Training\n",
    "Model training generally consists of four steps:\n",
    "\n",
    "1. Build the dataset.\n",
    "2. Define the neural network.\n",
    "3. Define the hyperparameters, loss function, and optimizer.\n",
    "4. Train the network on the dataset for the specified number of epochs.\n",
    "\n",
    "Before training the model, we need to specify the hyperparameters, loss function, and optimizer."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "19dcb5d9",
   "metadata": {},
   "source": [
    "## 1. Hyperparameters\n",
    "Hyperparameters are parameters that control the training process and can be tuned; different settings lead to differences in model accuracy and convergence speed. Common hyperparameters include `epoch`, `batch_size`, and `learning rate`.\n",
    "\n",
    "`epoch`: the number of passes over the entire dataset during training.\n",
    "\n",
    "`batch_size`: the number of samples in each batch.\n",
    "\n",
    "`learning rate`: the step size used when updating parameters."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "7e8959f3",
   "metadata": {},
   "outputs": [],
   "source": [
    "epochs = 10\n",
    "batch_size = 6\n",
    "# the dataset is traversed epochs times; each epoch takes (total samples / batch_size) steps\n",
    "lr = 0.001\n",
    "momentum = 0.9"
   ]
  },
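  {
   "cell_type": "markdown",
   "id": "a1f05c01",
   "metadata": {},
   "source": [
    "With these settings, each epoch runs (total samples / batch_size) steps. A quick sketch, assuming the standard MNIST training set of 60,000 images:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "a1f05c02",
   "metadata": {},
   "outputs": [],
   "source": [
    "import math\n",
    "\n",
    "num_samples = 60000  # MNIST training-set size (assumed for illustration)\n",
    "batch_size = 6\n",
    "epochs = 10\n",
    "\n",
    "steps_per_epoch = math.ceil(num_samples / batch_size)\n",
    "total_steps = steps_per_epoch * epochs\n",
    "print(steps_per_epoch, total_steps)  # 10000 steps per epoch, 100000 in total"
   ]
  },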
  {
   "cell_type": "markdown",
   "id": "ebc965b4",
   "metadata": {},
   "source": [
    "## 2. Loss Functions\n",
    "`mindspore.nn` provides many common loss functions, such as `SoftmaxCrossEntropyWithLogits`, `MSELoss`, and `SmoothL1Loss`; see the official documentation for the full list.\n",
    "Take `L1Loss` as an example. With the default reduction `'mean'`, it averages the element-wise absolute errors:\n",
    "$$\\text{L1 Loss}=\\frac{1}{n} \\sum_{i=1}^{n}\\left|y_{true,i}-y_{pred,i}\\right|$$"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "555cd2f1",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "6.0\n"
     ]
    }
   ],
   "source": [
    "import mindspore.nn as nn\n",
    "import mindspore as ms\n",
    "import numpy as np\n",
    "\n",
    "loss = nn.L1Loss()\n",
    "y_true = ms.Tensor(np.array([10, 6, 10]), ms.float32)\n",
    "y_pre = ms.Tensor(np.array([3, 4, 1]), ms.float32)\n",
    "output = loss(y_true, y_pre)\n",
    "# with the default reduction='mean', the element-wise absolute errors are averaged\n",
    "print(output)"
   ]
  },
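  {
   "cell_type": "markdown",
   "id": "a1f05c03",
   "metadata": {},
   "source": [
    "The value 6.0 can be checked by hand in plain NumPy: the element-wise absolute errors are [7, 2, 9], whose mean is 6.0 (their sum, 18.0, is what `reduction='sum'` would return):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "a1f05c04",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "y_true = np.array([10, 6, 10], dtype=np.float32)\n",
    "y_pre = np.array([3, 4, 1], dtype=np.float32)\n",
    "\n",
    "abs_err = np.abs(y_true - y_pre)  # [7. 2. 9.]\n",
    "print(abs_err.mean())  # 6.0, matching nn.L1Loss() above\n",
    "print(abs_err.sum())   # 18.0, the reduction='sum' result"
   ]
  },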
  {
   "cell_type": "markdown",
   "id": "5ab6b2fc",
   "metadata": {},
   "source": [
    "## 3. Optimizers\n",
    "`mindspore.nn` provides many common optimizers; see the official documentation for the full list. Here we take `Momentum` as an example. When defining an optimizer, we need to pass it the parameters to be optimized, typically all trainable parameters of the network, i.e. `net.trainable_params()`, along with other arguments such as the learning rate and the weight decay."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "3a270c49",
   "metadata": {},
   "outputs": [],
   "source": [
    "from mindspore.nn import Momentum\n",
    "from mindvision.classification.models import LeNet5\n",
    "\n",
    "net = LeNet5()\n",
    "optimizer = Momentum(net.trainable_params(), lr, momentum)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "916fa486",
   "metadata": {},
   "source": [
    "## 4. Training the Model"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "id": "b14662bc",
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "[WARNING] ME(8392:13252,MainProcess):2022-09-30-20:48:26.568.165 [mindspore\\dataset\\engine\\datasets_user_defined.py:656] Python multiprocessing is not supported on Windows platform.\n",
      "[WARNING] ME(8392:13252,MainProcess):2022-09-30-20:48:26.569.349 [mindspore\\dataset\\core\\validator_helpers.py:804] 'Resize' from mindspore.dataset.vision.c_transforms is deprecated from version 1.8 and will be removed in a future version. Use 'Resize' from mindspore.dataset.vision instead.\n",
      "[WARNING] ME(8392:13252,MainProcess):2022-09-30-20:48:26.569.349 [mindspore\\dataset\\core\\validator_helpers.py:804] 'Rescale' from mindspore.dataset.vision.c_transforms is deprecated from version 1.8 and will be removed in a future version. Use 'Rescale' from mindspore.dataset.vision instead.\n",
      "[WARNING] ME(8392:13252,MainProcess):2022-09-30-20:48:26.570.498 [mindspore\\dataset\\core\\validator_helpers.py:804] 'Rescale' from mindspore.dataset.vision.c_transforms is deprecated from version 1.8 and will be removed in a future version. Use 'Rescale' from mindspore.dataset.vision instead.\n",
      "[WARNING] ME(8392:13252,MainProcess):2022-09-30-20:48:26.571.510 [mindspore\\dataset\\core\\validator_helpers.py:804] 'HWC2CHW' from mindspore.dataset.vision.c_transforms is deprecated from version 1.8 and will be removed in a future version. Use 'HWC2CHW' from mindspore.dataset.vision instead.\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Epoch:[  0/ 10], step:[10000/10000], loss:[0.290/1.666], time:0.000 ms, lr:0.00100\n",
      "Epoch time: 58124.161 ms, per step time: 5.812 ms, avg loss: 1.666\n",
      "Epoch:[  1/ 10], step:[10000/10000], loss:[0.003/0.109], time:4.999 ms, lr:0.00100\n",
      "Epoch time: 59327.496 ms, per step time: 5.933 ms, avg loss: 0.109\n",
      "Epoch:[  2/ 10], step:[10000/10000], loss:[0.465/0.061], time:0.000 ms, lr:0.00100\n",
      "Epoch time: 58018.628 ms, per step time: 5.802 ms, avg loss: 0.061\n",
      "Epoch:[  3/ 10], step:[10000/10000], loss:[0.067/0.044], time:0.000 ms, lr:0.00100\n",
      "Epoch time: 57095.731 ms, per step time: 5.710 ms, avg loss: 0.044\n",
      "Epoch:[  4/ 10], step:[10000/10000], loss:[0.001/0.034], time:21.001 ms, lr:0.00100\n",
      "Epoch time: 57706.813 ms, per step time: 5.771 ms, avg loss: 0.034\n",
      "Epoch:[  5/ 10], step:[10000/10000], loss:[0.003/0.027], time:0.000 ms, lr:0.00100\n",
      "Epoch time: 68957.694 ms, per step time: 6.896 ms, avg loss: 0.027\n",
      "Epoch:[  6/ 10], step:[10000/10000], loss:[0.131/0.023], time:7.448 ms, lr:0.00100\n",
      "Epoch time: 62450.289 ms, per step time: 6.245 ms, avg loss: 0.023\n",
      "Epoch:[  7/ 10], step:[10000/10000], loss:[0.000/0.019], time:18.137 ms, lr:0.00100\n",
      "Epoch time: 65703.132 ms, per step time: 6.570 ms, avg loss: 0.019\n",
      "Epoch:[  8/ 10], step:[10000/10000], loss:[0.001/0.016], time:15.623 ms, lr:0.00100\n",
      "Epoch time: 59203.137 ms, per step time: 5.920 ms, avg loss: 0.016\n",
      "Epoch:[  9/ 10], step:[10000/10000], loss:[0.004/0.013], time:0.000 ms, lr:0.00100\n",
      "Epoch time: 59145.516 ms, per step time: 5.915 ms, avg loss: 0.013\n"
     ]
    }
   ],
   "source": [
    "import mindspore as ms\n",
    "import mindspore.nn as nn\n",
    "from mindvision.classification.dataset import Mnist\n",
    "from mindvision.classification.models import LeNet5\n",
    "from mindspore.nn import Momentum\n",
    "from mindvision.engine.callback import LossMonitor\n",
    "\n",
    "# Hyperparameters\n",
    "epochs = 10\n",
    "batch_size = 6\n",
    "lr = 0.001\n",
    "momentum = 0.9\n",
    "\n",
    "# Load the dataset\n",
    "data_dir = './Mnist'\n",
    "dataset = Mnist(path=data_dir, batch_size=batch_size, shuffle=True, resize=32, split='train', repeat_num=1)\n",
    "dataset = dataset.run()\n",
    "\n",
    "# Define the network\n",
    "net = LeNet5(num_classes=10)\n",
    "\n",
    "# Define the loss function\n",
    "loss = nn.SoftmaxCrossEntropyWithLogits(sparse=True, reduction='mean')\n",
    "\n",
    "# Define the optimizer\n",
    "optim = Momentum(net.trainable_params(), learning_rate=lr, momentum=momentum)\n",
    "\n",
    "# Initialize the model\n",
    "model = ms.Model(net, loss_fn=loss, optimizer=optim, metrics={'acc'})\n",
    "\n",
    "# Start training\n",
    "model.train(epochs, dataset, callbacks=[LossMonitor(lr, 10000)])  # print the loss every 10000 steps"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "mindspore",
   "language": "python",
   "name": "mindvision"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.5"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
