{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "e9959ba9",
   "metadata": {},
   "source": [
     "# ShuffleNet V2: Key Observations and Innovations\n",
    "\n",
     "ShuffleNet V2 is the improved successor to the lightweight convolutional network ShuffleNet, designed to further raise model efficiency and accuracy on mobile and embedded devices. Compared with the first-generation ShuffleNet, V2's architecture puts more emphasis on balancing actual inference speed against model accuracy.\n",
    "\n",
     "## Key Observations\n",
    "\n",
     "When designing ShuffleNet V2, the authors identified the following factors that strongly affect model efficiency:\n",
    "\n",
     "| Observation | Explanation |\n",
     "|--------|------|\n",
     "| 1. Memory access cost is lowest when input and output channel counts are equal | For a fixed amount of computation, a convolution's memory access cost (MAC) is minimized when the input and output channel counts are equal |\n",
     "| 2. Excessive group convolution increases memory access overhead | Group convolution reduces FLOPs but raises MAC, which can become the bottleneck |\n",
     "| 3. Network fragmentation reduces parallelism | Multi-branch structures like those in Inception lower parallel efficiency on GPUs/TPUs |\n",
     "| 4. Element-wise operations are not negligible | Operations such as Add and ReLU account for a significant share of runtime in lightweight models |\n",
    "\n",
     "These observations, derived from extensive experiments and theoretical analysis, guided the design of the ShuffleNet V2 block.\n",
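     "\n",
     "Observation 1 can be checked with a quick back-of-the-envelope calculation. The sketch below (channel counts are purely illustrative) compares the MAC of a 1x1 convolution for two layers with identical FLOPs:\n",
     "\n",
     "```python\n",
     "def mac_1x1(h, w, c_in, c_out):\n",
     "    # MAC of a 1x1 conv: read the input feature map, write the output\n",
     "    # feature map, and read the c_in * c_out weight matrix\n",
     "    return h * w * (c_in + c_out) + c_in * c_out\n",
     "\n",
     "# both layers cost the same FLOPs: h * w * c_in * c_out\n",
     "balanced = mac_1x1(28, 28, 100, 100)  # c_in == c_out\n",
     "skewed = mac_1x1(28, 28, 25, 400)  # same channel product, skewed ratio\n",
     "print(balanced, skewed)  # the balanced layer touches far less memory\n",
     "```\n",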
    "\n",
    "---\n",
    "\n",
     "## Innovations\n",
     "\n",
     "Based on the observations above, ShuffleNet V2 introduces the following key innovations:\n",
    "\n",
    "<img src=\"resources/shufflenet_v1_block.png\" alt=\"drawing\" width=\"80%\"/>\n",
    "\n",
     "### 1. Improved Block Structure (Channel Split)\n",
    "\n",
     "- **Channel Split** divides the input channels into two parts:\n",
     "  - One part passes through the convolution branch (depthwise-separable convolutions); after the two parts are recombined, a channel shuffle mixes information between them\n",
     "  - The other part is forwarded directly to the output as an identity mapping\n",
    "\n",
     "> ✅ Avoids heavy use of group convolution  \n",
     "> ✅ Reduces computation and memory access overhead  \n",
     "> ✅ Improves model efficiency and training stability\n",
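     "\n",
     "A minimal PyTorch sketch (tensor sizes are illustrative, and the conv branch is elided) of the channel-split / concat / channel-shuffle pattern described above:\n",
     "\n",
     "```python\n",
     "import torch\n",
     "\n",
     "def channel_shuffle(x, groups):\n",
     "    # reshape (N, C, H, W) -> (N, g, C//g, H, W), swap the group and\n",
     "    # channel axes, then flatten back: channels from different groups interleave\n",
     "    n, c, h, w = x.size()\n",
     "    return x.view(n, groups, c // groups, h, w).transpose(1, 2).reshape(n, c, h, w)\n",
     "\n",
     "x = torch.randn(2, 116, 28, 28)\n",
     "left, right = x.chunk(2, dim=1)  # channel split: two 58-channel halves\n",
     "# `right` would pass through the conv branch here; `left` is the identity path\n",
     "out = torch.cat([left, right], dim=1)  # concat instead of element-wise add\n",
     "out = channel_shuffle(out, groups=2)  # exchange information between branches\n",
     "assert out.shape == x.shape\n",
     "```\n",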
    "\n",
     "### 2. Removing Element-wise Add from the Block\n",
    "\n",
     "- Concat replaces Add, reducing the time spent on element-wise operations\n",
     "- Information flow is preserved, so the model's representational capacity is maintained\n",
    "\n",
     "### 3. A More Efficient Overall Architecture\n",
    "\n",
     "- The network is built by stacking Shuffle Units\n",
     "- Each stage gradually increases the channel count, allocating compute sensibly across stages\n",
     "- Depthwise-separable convolutions further compress the parameter count\n",
    "\n",
    "---\n",
    "\n",
     "## Summary\n",
    "\n",
     "ShuffleNet V2 succeeds because it looks beyond FLOPs and parameter counts: it optimizes for **actual hardware execution efficiency**, jointly accounting for memory access, parallelism, and element-wise operations.\n",
    "\n",
     "### Key advantages:\n",
    "\n",
     "- Faster inference\n",
     "- Higher accuracy\n",
     "- Well suited to deployment on mobile and embedded devices\n",
     "- A simple, efficient block structure that is easy to implement\n",
    "\n",
     "If you are building an efficient mobile vision system, ShuffleNet V2 is a backbone well worth trying.\n",
    "\n",
    "---"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "a3312ad0",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Device:  cuda\n"
     ]
    }
   ],
   "source": [
     "# Automatically reload external modules, so code edits take effect without re-importing\n",
    "# see http://stackoverflow.com/questions/1907993/autoreload-of-modules-in-ipython\n",
    "%load_ext autoreload\n",
    "%autoreload 2\n",
    "\n",
    "from hdd.device.utils import get_device\n",
    "\n",
    "import torch\n",
    "import torch.nn as nn\n",
    "import torch.optim as optim\n",
    "from torchvision import datasets, transforms\n",
    "\n",
     "# Path to the training data\n",
     "DATA_ROOT = \"~/workspace/hands-dirty-on-dl/dataset\"\n",
     "# Path for TensorBoard logs\n",
     "TENSORBOARD_ROOT = \"~/workspace/hands-dirty-on-dl/dataset\"\n",
     "# Path for pretrained model weights\n",
     "TORCH_HUB_PATH = \"~/workspace/hands-dirty-on-dl/pretrained_models\"\n",
     "torch.hub.set_dir(TORCH_HUB_PATH)\n",
     "# Pick the best available training device\n",
     "DEVICE = get_device([\"cuda\", \"cpu\"])\n",
    "print(\"Device: \", DEVICE)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "999a0aca",
   "metadata": {},
   "outputs": [],
   "source": [
    "from hdd.dataset.imagenette_in_memory import ImagenetteInMemory\n",
    "from hdd.data_util.auto_augmentation import ImageNetPolicy\n",
    "\n",
    "from hdd.data_util.transforms import RandomResize\n",
    "from torch.utils.data import DataLoader\n",
    "\n",
    "TRAIN_MEAN = [0.4625, 0.4580, 0.4295]\n",
    "TRAIN_STD = [0.2452, 0.2390, 0.2469]\n",
    "train_dataset_transforms = transforms.Compose(\n",
    "    [\n",
     "        RandomResize([256, 296, 384]),  # randomly resize to one of the three sizes\n",
    "        transforms.RandomCrop(224),\n",
    "        transforms.RandomHorizontalFlip(),\n",
    "        ImageNetPolicy(),\n",
    "        transforms.ToTensor(),\n",
    "        transforms.Normalize(mean=TRAIN_MEAN, std=TRAIN_STD),\n",
    "    ]\n",
    ")\n",
    "val_dataset_transforms = transforms.Compose(\n",
    "    [\n",
    "        transforms.Resize(256),\n",
    "        transforms.CenterCrop(224),\n",
    "        transforms.ToTensor(),\n",
    "        transforms.Normalize(mean=TRAIN_MEAN, std=TRAIN_STD),\n",
    "    ]\n",
    ")\n",
    "train_dataset = ImagenetteInMemory(\n",
    "    root=DATA_ROOT,\n",
    "    split=\"train\",\n",
    "    size=\"full\",\n",
    "    download=True,\n",
    "    transform=train_dataset_transforms,\n",
    ")\n",
    "val_dataset = ImagenetteInMemory(\n",
    "    root=DATA_ROOT,\n",
    "    split=\"val\",\n",
    "    size=\"full\",\n",
    "    download=True,\n",
    "    transform=val_dataset_transforms,\n",
    ")\n",
    "\n",
    "\n",
    "def build_dataloader(batch_size, train_dataset, val_dataset):\n",
    "    train_dataloader = DataLoader(\n",
    "        train_dataset, batch_size=batch_size, shuffle=True, num_workers=8\n",
    "    )\n",
    "    val_dataloader = DataLoader(\n",
    "        val_dataset, batch_size=batch_size, shuffle=False, num_workers=8\n",
    "    )\n",
    "    return train_dataloader, val_dataloader"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "id": "3bf1a0eb",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "----------------------------------------------------------------\n",
      "        Layer (type)               Output Shape         Param #\n",
      "================================================================\n",
      "            Conv2d-1         [-1, 24, 112, 112]             648\n",
      "       BatchNorm2d-2         [-1, 24, 112, 112]              48\n",
      "              ReLU-3         [-1, 24, 112, 112]               0\n",
      "           _Conv2d-4         [-1, 24, 112, 112]               0\n",
      "         MaxPool2d-5           [-1, 24, 56, 56]               0\n",
      "            Conv2d-6           [-1, 24, 28, 28]             216\n",
      "       BatchNorm2d-7           [-1, 24, 28, 28]              48\n",
      "           _Conv2d-8           [-1, 24, 28, 28]               0\n",
      "            Conv2d-9           [-1, 58, 28, 28]           1,392\n",
      "      BatchNorm2d-10           [-1, 58, 28, 28]             116\n",
      "             ReLU-11           [-1, 58, 28, 28]               0\n",
      "          _Conv2d-12           [-1, 58, 28, 28]               0\n",
      "           Conv2d-13           [-1, 58, 56, 56]           1,392\n",
      "      BatchNorm2d-14           [-1, 58, 56, 56]             116\n",
      "             ReLU-15           [-1, 58, 56, 56]               0\n",
      "          _Conv2d-16           [-1, 58, 56, 56]               0\n",
      "           Conv2d-17           [-1, 58, 28, 28]             522\n",
      "      BatchNorm2d-18           [-1, 58, 28, 28]             116\n",
      "          _Conv2d-19           [-1, 58, 28, 28]               0\n",
      "           Conv2d-20           [-1, 58, 28, 28]           3,364\n",
      "      BatchNorm2d-21           [-1, 58, 28, 28]             116\n",
      "             ReLU-22           [-1, 58, 28, 28]               0\n",
      "          _Conv2d-23           [-1, 58, 28, 28]               0\n",
      "ShuffleBlockLayer-24          [-1, 116, 28, 28]               0\n",
      "         Identity-25           [-1, 58, 28, 28]               0\n",
      "           Conv2d-26           [-1, 58, 28, 28]           3,364\n",
      "      BatchNorm2d-27           [-1, 58, 28, 28]             116\n",
      "             ReLU-28           [-1, 58, 28, 28]               0\n",
      "          _Conv2d-29           [-1, 58, 28, 28]               0\n",
      "           Conv2d-30           [-1, 58, 28, 28]             522\n",
      "      BatchNorm2d-31           [-1, 58, 28, 28]             116\n",
      "          _Conv2d-32           [-1, 58, 28, 28]               0\n",
      "           Conv2d-33           [-1, 58, 28, 28]           3,364\n",
      "      BatchNorm2d-34           [-1, 58, 28, 28]             116\n",
      "             ReLU-35           [-1, 58, 28, 28]               0\n",
      "          _Conv2d-36           [-1, 58, 28, 28]               0\n",
      "ShuffleBlockLayer-37          [-1, 116, 28, 28]               0\n",
      "         Identity-38           [-1, 58, 28, 28]               0\n",
      "           Conv2d-39           [-1, 58, 28, 28]           3,364\n",
      "      BatchNorm2d-40           [-1, 58, 28, 28]             116\n",
      "             ReLU-41           [-1, 58, 28, 28]               0\n",
      "          _Conv2d-42           [-1, 58, 28, 28]               0\n",
      "           Conv2d-43           [-1, 58, 28, 28]             522\n",
      "      BatchNorm2d-44           [-1, 58, 28, 28]             116\n",
      "          _Conv2d-45           [-1, 58, 28, 28]               0\n",
      "           Conv2d-46           [-1, 58, 28, 28]           3,364\n",
      "      BatchNorm2d-47           [-1, 58, 28, 28]             116\n",
      "             ReLU-48           [-1, 58, 28, 28]               0\n",
      "          _Conv2d-49           [-1, 58, 28, 28]               0\n",
      "ShuffleBlockLayer-50          [-1, 116, 28, 28]               0\n",
      "         Identity-51           [-1, 58, 28, 28]               0\n",
      "           Conv2d-52           [-1, 58, 28, 28]           3,364\n",
      "      BatchNorm2d-53           [-1, 58, 28, 28]             116\n",
      "             ReLU-54           [-1, 58, 28, 28]               0\n",
      "          _Conv2d-55           [-1, 58, 28, 28]               0\n",
      "           Conv2d-56           [-1, 58, 28, 28]             522\n",
      "      BatchNorm2d-57           [-1, 58, 28, 28]             116\n",
      "          _Conv2d-58           [-1, 58, 28, 28]               0\n",
      "           Conv2d-59           [-1, 58, 28, 28]           3,364\n",
      "      BatchNorm2d-60           [-1, 58, 28, 28]             116\n",
      "             ReLU-61           [-1, 58, 28, 28]               0\n",
      "          _Conv2d-62           [-1, 58, 28, 28]               0\n",
      "ShuffleBlockLayer-63          [-1, 116, 28, 28]               0\n",
      "           Conv2d-64          [-1, 116, 14, 14]           1,044\n",
      "      BatchNorm2d-65          [-1, 116, 14, 14]             232\n",
      "          _Conv2d-66          [-1, 116, 14, 14]               0\n",
      "           Conv2d-67          [-1, 116, 14, 14]          13,456\n",
      "      BatchNorm2d-68          [-1, 116, 14, 14]             232\n",
      "             ReLU-69          [-1, 116, 14, 14]               0\n",
      "          _Conv2d-70          [-1, 116, 14, 14]               0\n",
      "           Conv2d-71          [-1, 116, 28, 28]          13,456\n",
      "      BatchNorm2d-72          [-1, 116, 28, 28]             232\n",
      "             ReLU-73          [-1, 116, 28, 28]               0\n",
      "          _Conv2d-74          [-1, 116, 28, 28]               0\n",
      "           Conv2d-75          [-1, 116, 14, 14]           1,044\n",
      "      BatchNorm2d-76          [-1, 116, 14, 14]             232\n",
      "          _Conv2d-77          [-1, 116, 14, 14]               0\n",
      "           Conv2d-78          [-1, 116, 14, 14]          13,456\n",
      "      BatchNorm2d-79          [-1, 116, 14, 14]             232\n",
      "             ReLU-80          [-1, 116, 14, 14]               0\n",
      "          _Conv2d-81          [-1, 116, 14, 14]               0\n",
      "ShuffleBlockLayer-82          [-1, 232, 14, 14]               0\n",
      "         Identity-83          [-1, 116, 14, 14]               0\n",
      "           Conv2d-84          [-1, 116, 14, 14]          13,456\n",
      "      BatchNorm2d-85          [-1, 116, 14, 14]             232\n",
      "             ReLU-86          [-1, 116, 14, 14]               0\n",
      "          _Conv2d-87          [-1, 116, 14, 14]               0\n",
      "           Conv2d-88          [-1, 116, 14, 14]           1,044\n",
      "      BatchNorm2d-89          [-1, 116, 14, 14]             232\n",
      "          _Conv2d-90          [-1, 116, 14, 14]               0\n",
      "           Conv2d-91          [-1, 116, 14, 14]          13,456\n",
      "      BatchNorm2d-92          [-1, 116, 14, 14]             232\n",
      "             ReLU-93          [-1, 116, 14, 14]               0\n",
      "          _Conv2d-94          [-1, 116, 14, 14]               0\n",
      "ShuffleBlockLayer-95          [-1, 232, 14, 14]               0\n",
      "         Identity-96          [-1, 116, 14, 14]               0\n",
      "           Conv2d-97          [-1, 116, 14, 14]          13,456\n",
      "      BatchNorm2d-98          [-1, 116, 14, 14]             232\n",
      "             ReLU-99          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-100          [-1, 116, 14, 14]               0\n",
      "          Conv2d-101          [-1, 116, 14, 14]           1,044\n",
      "     BatchNorm2d-102          [-1, 116, 14, 14]             232\n",
      "         _Conv2d-103          [-1, 116, 14, 14]               0\n",
      "          Conv2d-104          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-105          [-1, 116, 14, 14]             232\n",
      "            ReLU-106          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-107          [-1, 116, 14, 14]               0\n",
      "ShuffleBlockLayer-108          [-1, 232, 14, 14]               0\n",
      "        Identity-109          [-1, 116, 14, 14]               0\n",
      "          Conv2d-110          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-111          [-1, 116, 14, 14]             232\n",
      "            ReLU-112          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-113          [-1, 116, 14, 14]               0\n",
      "          Conv2d-114          [-1, 116, 14, 14]           1,044\n",
      "     BatchNorm2d-115          [-1, 116, 14, 14]             232\n",
      "         _Conv2d-116          [-1, 116, 14, 14]               0\n",
      "          Conv2d-117          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-118          [-1, 116, 14, 14]             232\n",
      "            ReLU-119          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-120          [-1, 116, 14, 14]               0\n",
      "ShuffleBlockLayer-121          [-1, 232, 14, 14]               0\n",
      "        Identity-122          [-1, 116, 14, 14]               0\n",
      "          Conv2d-123          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-124          [-1, 116, 14, 14]             232\n",
      "            ReLU-125          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-126          [-1, 116, 14, 14]               0\n",
      "          Conv2d-127          [-1, 116, 14, 14]           1,044\n",
      "     BatchNorm2d-128          [-1, 116, 14, 14]             232\n",
      "         _Conv2d-129          [-1, 116, 14, 14]               0\n",
      "          Conv2d-130          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-131          [-1, 116, 14, 14]             232\n",
      "            ReLU-132          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-133          [-1, 116, 14, 14]               0\n",
      "ShuffleBlockLayer-134          [-1, 232, 14, 14]               0\n",
      "        Identity-135          [-1, 116, 14, 14]               0\n",
      "          Conv2d-136          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-137          [-1, 116, 14, 14]             232\n",
      "            ReLU-138          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-139          [-1, 116, 14, 14]               0\n",
      "          Conv2d-140          [-1, 116, 14, 14]           1,044\n",
      "     BatchNorm2d-141          [-1, 116, 14, 14]             232\n",
      "         _Conv2d-142          [-1, 116, 14, 14]               0\n",
      "          Conv2d-143          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-144          [-1, 116, 14, 14]             232\n",
      "            ReLU-145          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-146          [-1, 116, 14, 14]               0\n",
      "ShuffleBlockLayer-147          [-1, 232, 14, 14]               0\n",
      "        Identity-148          [-1, 116, 14, 14]               0\n",
      "          Conv2d-149          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-150          [-1, 116, 14, 14]             232\n",
      "            ReLU-151          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-152          [-1, 116, 14, 14]               0\n",
      "          Conv2d-153          [-1, 116, 14, 14]           1,044\n",
      "     BatchNorm2d-154          [-1, 116, 14, 14]             232\n",
      "         _Conv2d-155          [-1, 116, 14, 14]               0\n",
      "          Conv2d-156          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-157          [-1, 116, 14, 14]             232\n",
      "            ReLU-158          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-159          [-1, 116, 14, 14]               0\n",
      "ShuffleBlockLayer-160          [-1, 232, 14, 14]               0\n",
      "        Identity-161          [-1, 116, 14, 14]               0\n",
      "          Conv2d-162          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-163          [-1, 116, 14, 14]             232\n",
      "            ReLU-164          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-165          [-1, 116, 14, 14]               0\n",
      "          Conv2d-166          [-1, 116, 14, 14]           1,044\n",
      "     BatchNorm2d-167          [-1, 116, 14, 14]             232\n",
      "         _Conv2d-168          [-1, 116, 14, 14]               0\n",
      "          Conv2d-169          [-1, 116, 14, 14]          13,456\n",
      "     BatchNorm2d-170          [-1, 116, 14, 14]             232\n",
      "            ReLU-171          [-1, 116, 14, 14]               0\n",
      "         _Conv2d-172          [-1, 116, 14, 14]               0\n",
      "ShuffleBlockLayer-173          [-1, 232, 14, 14]               0\n",
      "          Conv2d-174            [-1, 232, 7, 7]           2,088\n",
      "     BatchNorm2d-175            [-1, 232, 7, 7]             464\n",
      "         _Conv2d-176            [-1, 232, 7, 7]               0\n",
      "          Conv2d-177            [-1, 232, 7, 7]          53,824\n",
      "     BatchNorm2d-178            [-1, 232, 7, 7]             464\n",
      "            ReLU-179            [-1, 232, 7, 7]               0\n",
      "         _Conv2d-180            [-1, 232, 7, 7]               0\n",
      "          Conv2d-181          [-1, 232, 14, 14]          53,824\n",
      "     BatchNorm2d-182          [-1, 232, 14, 14]             464\n",
      "            ReLU-183          [-1, 232, 14, 14]               0\n",
      "         _Conv2d-184          [-1, 232, 14, 14]               0\n",
      "          Conv2d-185            [-1, 232, 7, 7]           2,088\n",
      "     BatchNorm2d-186            [-1, 232, 7, 7]             464\n",
      "         _Conv2d-187            [-1, 232, 7, 7]               0\n",
      "          Conv2d-188            [-1, 232, 7, 7]          53,824\n",
      "     BatchNorm2d-189            [-1, 232, 7, 7]             464\n",
      "            ReLU-190            [-1, 232, 7, 7]               0\n",
      "         _Conv2d-191            [-1, 232, 7, 7]               0\n",
      "ShuffleBlockLayer-192            [-1, 464, 7, 7]               0\n",
      "        Identity-193            [-1, 232, 7, 7]               0\n",
      "          Conv2d-194            [-1, 232, 7, 7]          53,824\n",
      "     BatchNorm2d-195            [-1, 232, 7, 7]             464\n",
      "            ReLU-196            [-1, 232, 7, 7]               0\n",
      "         _Conv2d-197            [-1, 232, 7, 7]               0\n",
      "          Conv2d-198            [-1, 232, 7, 7]           2,088\n",
      "     BatchNorm2d-199            [-1, 232, 7, 7]             464\n",
      "         _Conv2d-200            [-1, 232, 7, 7]               0\n",
      "          Conv2d-201            [-1, 232, 7, 7]          53,824\n",
      "     BatchNorm2d-202            [-1, 232, 7, 7]             464\n",
      "            ReLU-203            [-1, 232, 7, 7]               0\n",
      "         _Conv2d-204            [-1, 232, 7, 7]               0\n",
      "ShuffleBlockLayer-205            [-1, 464, 7, 7]               0\n",
      "        Identity-206            [-1, 232, 7, 7]               0\n",
      "          Conv2d-207            [-1, 232, 7, 7]          53,824\n",
      "     BatchNorm2d-208            [-1, 232, 7, 7]             464\n",
      "            ReLU-209            [-1, 232, 7, 7]               0\n",
      "         _Conv2d-210            [-1, 232, 7, 7]               0\n",
      "          Conv2d-211            [-1, 232, 7, 7]           2,088\n",
      "     BatchNorm2d-212            [-1, 232, 7, 7]             464\n",
      "         _Conv2d-213            [-1, 232, 7, 7]               0\n",
      "          Conv2d-214            [-1, 232, 7, 7]          53,824\n",
      "     BatchNorm2d-215            [-1, 232, 7, 7]             464\n",
      "            ReLU-216            [-1, 232, 7, 7]               0\n",
      "         _Conv2d-217            [-1, 232, 7, 7]               0\n",
      "ShuffleBlockLayer-218            [-1, 464, 7, 7]               0\n",
      "        Identity-219            [-1, 232, 7, 7]               0\n",
      "          Conv2d-220            [-1, 232, 7, 7]          53,824\n",
      "     BatchNorm2d-221            [-1, 232, 7, 7]             464\n",
      "            ReLU-222            [-1, 232, 7, 7]               0\n",
      "         _Conv2d-223            [-1, 232, 7, 7]               0\n",
      "          Conv2d-224            [-1, 232, 7, 7]           2,088\n",
      "     BatchNorm2d-225            [-1, 232, 7, 7]             464\n",
      "         _Conv2d-226            [-1, 232, 7, 7]               0\n",
      "          Conv2d-227            [-1, 232, 7, 7]          53,824\n",
      "     BatchNorm2d-228            [-1, 232, 7, 7]             464\n",
      "            ReLU-229            [-1, 232, 7, 7]               0\n",
      "         _Conv2d-230            [-1, 232, 7, 7]               0\n",
      "ShuffleBlockLayer-231            [-1, 464, 7, 7]               0\n",
      "          Conv2d-232           [-1, 1024, 7, 7]         475,136\n",
      "     BatchNorm2d-233           [-1, 1024, 7, 7]           2,048\n",
      "            ReLU-234           [-1, 1024, 7, 7]               0\n",
      "         _Conv2d-235           [-1, 1024, 7, 7]               0\n",
      "AdaptiveAvgPool2d-236           [-1, 1024, 1, 1]               0\n",
      "         Flatten-237                 [-1, 1024]               0\n",
      "          Linear-238                 [-1, 1000]       1,025,000\n",
      "================================================================\n",
      "Total params: 2,278,604\n",
      "Trainable params: 2,278,604\n",
      "Non-trainable params: 0\n",
      "----------------------------------------------------------------\n",
      "Input size (MB): 0.57\n",
      "Forward/backward pass size (MB): 65.35\n",
      "Params size (MB): 8.69\n",
      "Estimated Total Size (MB): 74.61\n",
      "----------------------------------------------------------------\n"
     ]
    }
   ],
   "source": [
    "import torchsummary\n",
    "from hdd.models.cnn.shufflenetv2 import ShuffleNetV2\n",
    "\n",
    "net = ShuffleNetV2(\n",
    "    num_classes=1000,\n",
    "    stage_layers=[4, 8, 4],\n",
    "    stage_out_channels=[24, 116, 232, 464, 1024],\n",
    ").to(DEVICE)\n",
    "torchsummary.summary(net, (3, 224, 224))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "id": "851e736e",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "#Parameter: 1263854\n",
      "Epoch: 1/200 Train Loss: 2.1699 Accuracy: 0.2207 Time: 4.27057  | Val Loss: 2.1491 Accuracy: 0.2599\n",
      "Epoch: 2/200 Train Loss: 1.9332 Accuracy: 0.3273 Time: 4.32623  | Val Loss: 1.6790 Accuracy: 0.4125\n",
      "Epoch: 3/200 Train Loss: 1.7936 Accuracy: 0.3863 Time: 4.35719  | Val Loss: 1.4286 Accuracy: 0.5208\n",
      "Epoch: 4/200 Train Loss: 1.7058 Accuracy: 0.4188 Time: 4.32647  | Val Loss: 1.4172 Accuracy: 0.5246\n",
      "Epoch: 5/200 Train Loss: 1.6045 Accuracy: 0.4559 Time: 4.38447  | Val Loss: 1.1889 Accuracy: 0.6166\n",
      "Epoch: 6/200 Train Loss: 1.5400 Accuracy: 0.4865 Time: 4.40782  | Val Loss: 1.1843 Accuracy: 0.6084\n",
      "Epoch: 7/200 Train Loss: 1.4893 Accuracy: 0.5057 Time: 4.40190  | Val Loss: 1.0707 Accuracy: 0.6540\n",
      "Epoch: 8/200 Train Loss: 1.4105 Accuracy: 0.5267 Time: 4.37572  | Val Loss: 1.0230 Accuracy: 0.6703\n",
      "Epoch: 9/200 Train Loss: 1.3553 Accuracy: 0.5484 Time: 4.42825  | Val Loss: 1.1059 Accuracy: 0.6448\n",
      "Epoch: 10/200 Train Loss: 1.3013 Accuracy: 0.5610 Time: 4.25508  | Val Loss: 0.9701 Accuracy: 0.6910\n",
      "Epoch: 11/200 Train Loss: 1.2642 Accuracy: 0.5812 Time: 4.31993  | Val Loss: 0.9071 Accuracy: 0.7085\n",
      "Epoch: 12/200 Train Loss: 1.2252 Accuracy: 0.5967 Time: 4.33222  | Val Loss: 0.8448 Accuracy: 0.7287\n",
      "Epoch: 13/200 Train Loss: 1.1830 Accuracy: 0.6085 Time: 4.29895  | Val Loss: 0.8252 Accuracy: 0.7353\n",
      "Epoch: 14/200 Train Loss: 1.1704 Accuracy: 0.6080 Time: 4.45391  | Val Loss: 0.8065 Accuracy: 0.7452\n",
      "Epoch: 15/200 Train Loss: 1.1043 Accuracy: 0.6333 Time: 4.32376  | Val Loss: 0.7553 Accuracy: 0.7498\n",
      "Epoch: 16/200 Train Loss: 1.0985 Accuracy: 0.6347 Time: 4.30808  | Val Loss: 0.7611 Accuracy: 0.7582\n",
      "Epoch: 17/200 Train Loss: 1.0635 Accuracy: 0.6492 Time: 4.34610  | Val Loss: 0.7622 Accuracy: 0.7526\n",
      "Epoch: 18/200 Train Loss: 1.0062 Accuracy: 0.6725 Time: 4.35980  | Val Loss: 0.6926 Accuracy: 0.7722\n",
      "Epoch: 19/200 Train Loss: 1.0056 Accuracy: 0.6648 Time: 4.33366  | Val Loss: 0.6877 Accuracy: 0.7771\n",
      "Epoch: 20/200 Train Loss: 0.9834 Accuracy: 0.6776 Time: 4.39890  | Val Loss: 0.6830 Accuracy: 0.7865\n",
      "Epoch: 21/200 Train Loss: 0.9723 Accuracy: 0.6817 Time: 4.36872  | Val Loss: 0.6568 Accuracy: 0.7873\n",
      "Epoch: 22/200 Train Loss: 0.9294 Accuracy: 0.6949 Time: 4.41084  | Val Loss: 0.7015 Accuracy: 0.7666\n",
      "Epoch: 23/200 Train Loss: 0.9369 Accuracy: 0.6935 Time: 4.38656  | Val Loss: 0.5864 Accuracy: 0.8082\n",
      "Epoch: 24/200 Train Loss: 0.8894 Accuracy: 0.7088 Time: 4.35993  | Val Loss: 0.6097 Accuracy: 0.7967\n",
      "Epoch: 25/200 Train Loss: 0.8778 Accuracy: 0.7098 Time: 4.36108  | Val Loss: 0.6441 Accuracy: 0.7873\n",
      "Epoch: 26/200 Train Loss: 0.8836 Accuracy: 0.7129 Time: 4.37971  | Val Loss: 0.5760 Accuracy: 0.8069\n",
      "Epoch: 27/200 Train Loss: 0.8557 Accuracy: 0.7214 Time: 4.36023  | Val Loss: 0.5507 Accuracy: 0.8201\n",
      "Epoch: 28/200 Train Loss: 0.8456 Accuracy: 0.7230 Time: 4.37387  | Val Loss: 0.6106 Accuracy: 0.8066\n",
      "Epoch: 29/200 Train Loss: 0.8271 Accuracy: 0.7249 Time: 4.31997  | Val Loss: 0.5734 Accuracy: 0.8127\n",
      "Epoch: 30/200 Train Loss: 0.7855 Accuracy: 0.7465 Time: 4.44719  | Val Loss: 0.5093 Accuracy: 0.8329\n",
      "Epoch: 31/200 Train Loss: 0.7979 Accuracy: 0.7417 Time: 4.29901  | Val Loss: 0.5134 Accuracy: 0.8352\n",
      "Epoch: 32/200 Train Loss: 0.7750 Accuracy: 0.7461 Time: 4.36681  | Val Loss: 0.5295 Accuracy: 0.8280\n",
      "Epoch: 33/200 Train Loss: 0.7876 Accuracy: 0.7413 Time: 4.24445  | Val Loss: 0.5463 Accuracy: 0.8257\n",
      "Epoch: 34/200 Train Loss: 0.7717 Accuracy: 0.7523 Time: 4.32110  | Val Loss: 0.5042 Accuracy: 0.8313\n",
      "Epoch: 35/200 Train Loss: 0.7426 Accuracy: 0.7563 Time: 4.51842  | Val Loss: 0.4906 Accuracy: 0.8359\n",
      "Epoch: 36/200 Train Loss: 0.7372 Accuracy: 0.7565 Time: 4.31175  | Val Loss: 0.4978 Accuracy: 0.8400\n",
      "Epoch: 37/200 Train Loss: 0.7392 Accuracy: 0.7570 Time: 4.37339  | Val Loss: 0.4982 Accuracy: 0.8400\n",
      "Epoch: 38/200 Train Loss: 0.7078 Accuracy: 0.7653 Time: 4.35360  | Val Loss: 0.4736 Accuracy: 0.8441\n",
      "Epoch: 39/200 Train Loss: 0.6939 Accuracy: 0.7726 Time: 4.37889  | Val Loss: 0.4951 Accuracy: 0.8425\n",
      "Epoch: 40/200 Train Loss: 0.6903 Accuracy: 0.7758 Time: 4.34506  | Val Loss: 0.4487 Accuracy: 0.8527\n",
      "Epoch: 41/200 Train Loss: 0.6856 Accuracy: 0.7734 Time: 4.37080  | Val Loss: 0.4480 Accuracy: 0.8563\n",
      "Epoch: 42/200 Train Loss: 0.6713 Accuracy: 0.7790 Time: 4.39857  | Val Loss: 0.4312 Accuracy: 0.8606\n",
      "Epoch: 43/200 Train Loss: 0.6785 Accuracy: 0.7764 Time: 4.34363  | Val Loss: 0.4617 Accuracy: 0.8550\n",
      "Epoch: 44/200 Train Loss: 0.6603 Accuracy: 0.7853 Time: 4.38400  | Val Loss: 0.4343 Accuracy: 0.8601\n",
      "Epoch: 45/200 Train Loss: 0.6489 Accuracy: 0.7884 Time: 4.38801  | Val Loss: 0.4536 Accuracy: 0.8527\n",
      "Epoch: 46/200 Train Loss: 0.6463 Accuracy: 0.7846 Time: 4.34790  | Val Loss: 0.4206 Accuracy: 0.8629\n",
      "Epoch: 47/200 Train Loss: 0.6422 Accuracy: 0.7902 Time: 4.41974  | Val Loss: 0.4049 Accuracy: 0.8685\n",
      "Epoch: 48/200 Train Loss: 0.6255 Accuracy: 0.7981 Time: 4.40865  | Val Loss: 0.4282 Accuracy: 0.8639\n",
      "Epoch: 49/200 Train Loss: 0.6247 Accuracy: 0.7968 Time: 4.29573  | Val Loss: 0.4107 Accuracy: 0.8652\n",
      "Epoch: 50/200 Train Loss: 0.6020 Accuracy: 0.8018 Time: 4.36237  | Val Loss: 0.4070 Accuracy: 0.8688\n",
      "Epoch: 51/200 Train Loss: 0.6150 Accuracy: 0.7981 Time: 4.37154  | Val Loss: 0.4108 Accuracy: 0.8665\n",
      "Epoch: 52/200 Train Loss: 0.6041 Accuracy: 0.7987 Time: 4.26798  | Val Loss: 0.3887 Accuracy: 0.8683\n",
      "Epoch: 53/200 Train Loss: 0.5890 Accuracy: 0.8049 Time: 4.36536  | Val Loss: 0.3836 Accuracy: 0.8729\n",
      "Epoch: 54/200 Train Loss: 0.5829 Accuracy: 0.8101 Time: 4.40932  | Val Loss: 0.4079 Accuracy: 0.8685\n",
      "Epoch: 55/200 Train Loss: 0.5708 Accuracy: 0.8128 Time: 4.30109  | Val Loss: 0.3958 Accuracy: 0.8701\n",
      "Epoch: 56/200 Train Loss: 0.5631 Accuracy: 0.8175 Time: 4.44087  | Val Loss: 0.3936 Accuracy: 0.8716\n",
      "Epoch: 57/200 Train Loss: 0.5556 Accuracy: 0.8176 Time: 4.29072  | Val Loss: 0.3851 Accuracy: 0.8790\n",
      "Epoch: 58/200 Train Loss: 0.5541 Accuracy: 0.8172 Time: 4.32627  | Val Loss: 0.3875 Accuracy: 0.8741\n",
      "Epoch: 59/200 Train Loss: 0.5669 Accuracy: 0.8157 Time: 4.41418  | Val Loss: 0.4048 Accuracy: 0.8685\n",
      "Epoch: 60/200 Train Loss: 0.5388 Accuracy: 0.8224 Time: 4.32728  | Val Loss: 0.4076 Accuracy: 0.8680\n",
      "Epoch: 61/200 Train Loss: 0.5463 Accuracy: 0.8184 Time: 4.32134  | Val Loss: 0.3763 Accuracy: 0.8746\n",
      "Epoch: 62/200 Train Loss: 0.5243 Accuracy: 0.8255 Time: 4.30592  | Val Loss: 0.3878 Accuracy: 0.8749\n",
      "Epoch: 63/200 Train Loss: 0.5244 Accuracy: 0.8303 Time: 4.39435  | Val Loss: 0.3920 Accuracy: 0.8772\n",
      "Epoch: 64/200 Train Loss: 0.5244 Accuracy: 0.8276 Time: 4.30312  | Val Loss: 0.3732 Accuracy: 0.8805\n",
      "Epoch: 65/200 Train Loss: 0.5349 Accuracy: 0.8300 Time: 4.26692  | Val Loss: 0.3878 Accuracy: 0.8744\n",
      "Epoch: 66/200 Train Loss: 0.5145 Accuracy: 0.8287 Time: 4.23132  | Val Loss: 0.3919 Accuracy: 0.8795\n",
      "Epoch: 67/200 Train Loss: 0.5155 Accuracy: 0.8318 Time: 4.23324  | Val Loss: 0.3877 Accuracy: 0.8846\n",
      "Epoch: 68/200 Train Loss: 0.5070 Accuracy: 0.8291 Time: 4.22789  | Val Loss: 0.3748 Accuracy: 0.8759\n",
      "Epoch: 69/200 Train Loss: 0.4954 Accuracy: 0.8396 Time: 4.41306  | Val Loss: 0.3637 Accuracy: 0.8815\n",
      "Epoch: 70/200 Train Loss: 0.4903 Accuracy: 0.8394 Time: 4.30282  | Val Loss: 0.3826 Accuracy: 0.8833\n",
      "Epoch: 71/200 Train Loss: 0.4974 Accuracy: 0.8388 Time: 4.31948  | Val Loss: 0.3745 Accuracy: 0.8836\n",
      "Epoch: 72/200 Train Loss: 0.4899 Accuracy: 0.8380 Time: 4.31733  | Val Loss: 0.3584 Accuracy: 0.8838\n",
      "Epoch: 73/200 Train Loss: 0.4850 Accuracy: 0.8369 Time: 4.38120  | Val Loss: 0.3796 Accuracy: 0.8848\n",
      "Epoch: 74/200 Train Loss: 0.4783 Accuracy: 0.8431 Time: 4.40423  | Val Loss: 0.3651 Accuracy: 0.8866\n",
      "Epoch: 75/200 Train Loss: 0.4684 Accuracy: 0.8449 Time: 4.30505  | Val Loss: 0.3743 Accuracy: 0.8841\n",
      "Epoch: 76/200 Train Loss: 0.4537 Accuracy: 0.8506 Time: 4.26328  | Val Loss: 0.3402 Accuracy: 0.8915\n",
      "Epoch: 77/200 Train Loss: 0.4629 Accuracy: 0.8495 Time: 4.32696  | Val Loss: 0.3437 Accuracy: 0.8902\n",
      "Epoch: 78/200 Train Loss: 0.4617 Accuracy: 0.8482 Time: 4.27975  | Val Loss: 0.3576 Accuracy: 0.8820\n",
      "Epoch: 79/200 Train Loss: 0.4618 Accuracy: 0.8480 Time: 4.38585  | Val Loss: 0.3613 Accuracy: 0.8841\n",
      "Epoch: 80/200 Train Loss: 0.4505 Accuracy: 0.8511 Time: 4.41684  | Val Loss: 0.3452 Accuracy: 0.8935\n",
      "Epoch: 81/200 Train Loss: 0.4507 Accuracy: 0.8537 Time: 4.32774  | Val Loss: 0.3429 Accuracy: 0.8910\n",
      "Epoch: 82/200 Train Loss: 0.4410 Accuracy: 0.8536 Time: 4.28094  | Val Loss: 0.3447 Accuracy: 0.8912\n",
      "Epoch: 83/200 Train Loss: 0.4417 Accuracy: 0.8546 Time: 4.30240  | Val Loss: 0.3345 Accuracy: 0.8920\n",
      "Epoch: 84/200 Train Loss: 0.4351 Accuracy: 0.8594 Time: 4.34875  | Val Loss: 0.3443 Accuracy: 0.8922\n",
      "Epoch: 85/200 Train Loss: 0.4353 Accuracy: 0.8575 Time: 4.34258  | Val Loss: 0.3345 Accuracy: 0.8938\n",
      "Epoch: 86/200 Train Loss: 0.4251 Accuracy: 0.8612 Time: 4.29419  | Val Loss: 0.3375 Accuracy: 0.8981\n",
      "Epoch: 87/200 Train Loss: 0.4200 Accuracy: 0.8629 Time: 4.30782  | Val Loss: 0.3480 Accuracy: 0.8915\n",
      "Epoch: 88/200 Train Loss: 0.4194 Accuracy: 0.8633 Time: 4.31947  | Val Loss: 0.3440 Accuracy: 0.8948\n",
      "Epoch: 89/200 Train Loss: 0.4205 Accuracy: 0.8565 Time: 4.39335  | Val Loss: 0.3264 Accuracy: 0.8991\n",
      "Epoch: 90/200 Train Loss: 0.3972 Accuracy: 0.8693 Time: 4.30157  | Val Loss: 0.3367 Accuracy: 0.8932\n",
      "Epoch: 91/200 Train Loss: 0.4081 Accuracy: 0.8649 Time: 4.35098  | Val Loss: 0.3212 Accuracy: 0.9017\n",
      "Epoch: 92/200 Train Loss: 0.3916 Accuracy: 0.8684 Time: 4.26313  | Val Loss: 0.3357 Accuracy: 0.8912\n",
      "Epoch: 93/200 Train Loss: 0.3896 Accuracy: 0.8685 Time: 4.31223  | Val Loss: 0.3166 Accuracy: 0.9047\n",
      "Epoch: 94/200 Train Loss: 0.3924 Accuracy: 0.8703 Time: 4.27540  | Val Loss: 0.3442 Accuracy: 0.8996\n",
      "Epoch: 95/200 Train Loss: 0.3972 Accuracy: 0.8695 Time: 4.28767  | Val Loss: 0.3360 Accuracy: 0.8938\n",
      "Epoch: 96/200 Train Loss: 0.3931 Accuracy: 0.8683 Time: 4.30373  | Val Loss: 0.3292 Accuracy: 0.9034\n",
      "Epoch: 97/200 Train Loss: 0.3877 Accuracy: 0.8731 Time: 4.30930  | Val Loss: 0.3252 Accuracy: 0.8996\n",
      "Epoch: 98/200 Train Loss: 0.3740 Accuracy: 0.8784 Time: 4.26858  | Val Loss: 0.3274 Accuracy: 0.8994\n",
      "Epoch: 99/200 Train Loss: 0.3789 Accuracy: 0.8713 Time: 4.33010  | Val Loss: 0.3346 Accuracy: 0.8976\n",
      "Epoch: 100/200 Train Loss: 0.3670 Accuracy: 0.8787 Time: 4.42321  | Val Loss: 0.3299 Accuracy: 0.9011\n",
      "Epoch: 101/200 Train Loss: 0.3701 Accuracy: 0.8786 Time: 4.32938  | Val Loss: 0.3343 Accuracy: 0.8968\n",
      "Epoch: 102/200 Train Loss: 0.3428 Accuracy: 0.8850 Time: 4.31662  | Val Loss: 0.3146 Accuracy: 0.9029\n",
      "Epoch: 103/200 Train Loss: 0.3588 Accuracy: 0.8802 Time: 4.28023  | Val Loss: 0.3191 Accuracy: 0.9011\n",
      "Epoch: 104/200 Train Loss: 0.3564 Accuracy: 0.8778 Time: 4.29216  | Val Loss: 0.3254 Accuracy: 0.9014\n",
      "Epoch: 105/200 Train Loss: 0.3618 Accuracy: 0.8803 Time: 4.29815  | Val Loss: 0.3276 Accuracy: 0.8981\n",
      "Epoch: 106/200 Train Loss: 0.3540 Accuracy: 0.8844 Time: 4.39871  | Val Loss: 0.3241 Accuracy: 0.8968\n",
      "Epoch: 107/200 Train Loss: 0.3509 Accuracy: 0.8829 Time: 4.22731  | Val Loss: 0.3208 Accuracy: 0.9011\n",
      "Epoch: 108/200 Train Loss: 0.3429 Accuracy: 0.8839 Time: 4.27256  | Val Loss: 0.3235 Accuracy: 0.9034\n",
      "Epoch: 109/200 Train Loss: 0.3322 Accuracy: 0.8875 Time: 4.26989  | Val Loss: 0.3091 Accuracy: 0.9075\n",
      "Epoch: 110/200 Train Loss: 0.3531 Accuracy: 0.8839 Time: 4.34495  | Val Loss: 0.3116 Accuracy: 0.9037\n",
      "Epoch: 111/200 Train Loss: 0.3404 Accuracy: 0.8848 Time: 4.31632  | Val Loss: 0.3220 Accuracy: 0.9050\n",
      "Epoch: 112/200 Train Loss: 0.3195 Accuracy: 0.8938 Time: 4.25937  | Val Loss: 0.3141 Accuracy: 0.9065\n",
      "Epoch: 113/200 Train Loss: 0.3298 Accuracy: 0.8910 Time: 4.33733  | Val Loss: 0.3123 Accuracy: 0.9068\n",
      "Epoch: 114/200 Train Loss: 0.3492 Accuracy: 0.8855 Time: 4.46583  | Val Loss: 0.3118 Accuracy: 0.9062\n",
      "Epoch: 115/200 Train Loss: 0.3342 Accuracy: 0.8897 Time: 4.55313  | Val Loss: 0.3159 Accuracy: 0.9027\n",
      "Epoch: 116/200 Train Loss: 0.3205 Accuracy: 0.8923 Time: 4.34344  | Val Loss: 0.3192 Accuracy: 0.9022\n",
      "Epoch: 117/200 Train Loss: 0.3156 Accuracy: 0.9001 Time: 4.42740  | Val Loss: 0.3015 Accuracy: 0.9124\n",
      "Epoch: 118/200 Train Loss: 0.3145 Accuracy: 0.8958 Time: 4.38342  | Val Loss: 0.3014 Accuracy: 0.9019\n",
      "Epoch: 119/200 Train Loss: 0.3297 Accuracy: 0.8927 Time: 4.39061  | Val Loss: 0.3143 Accuracy: 0.9017\n",
      "Epoch: 120/200 Train Loss: 0.3061 Accuracy: 0.9000 Time: 4.38604  | Val Loss: 0.3096 Accuracy: 0.9062\n",
      "Epoch: 121/200 Train Loss: 0.3245 Accuracy: 0.8946 Time: 4.39514  | Val Loss: 0.2929 Accuracy: 0.9098\n",
      "Epoch: 122/200 Train Loss: 0.2974 Accuracy: 0.8998 Time: 4.35040  | Val Loss: 0.3101 Accuracy: 0.9047\n",
      "Epoch: 123/200 Train Loss: 0.3011 Accuracy: 0.8996 Time: 4.35502  | Val Loss: 0.2973 Accuracy: 0.9065\n",
      "Epoch: 124/200 Train Loss: 0.2940 Accuracy: 0.9032 Time: 4.42997  | Val Loss: 0.3055 Accuracy: 0.9057\n",
      "Epoch: 125/200 Train Loss: 0.3003 Accuracy: 0.9017 Time: 4.31951  | Val Loss: 0.3074 Accuracy: 0.9050\n",
      "Epoch: 126/200 Train Loss: 0.2989 Accuracy: 0.9035 Time: 4.56631  | Val Loss: 0.2955 Accuracy: 0.9090\n",
      "Epoch: 127/200 Train Loss: 0.2916 Accuracy: 0.9081 Time: 4.33058  | Val Loss: 0.2943 Accuracy: 0.9096\n",
      "Epoch: 128/200 Train Loss: 0.3018 Accuracy: 0.9010 Time: 4.32899  | Val Loss: 0.2932 Accuracy: 0.9090\n",
      "Epoch: 129/200 Train Loss: 0.2743 Accuracy: 0.9097 Time: 4.35678  | Val Loss: 0.2960 Accuracy: 0.9136\n",
      "Epoch: 130/200 Train Loss: 0.2742 Accuracy: 0.9118 Time: 4.39516  | Val Loss: 0.2930 Accuracy: 0.9154\n",
      "Epoch: 131/200 Train Loss: 0.2853 Accuracy: 0.9055 Time: 4.46053  | Val Loss: 0.2919 Accuracy: 0.9103\n",
      "Epoch: 132/200 Train Loss: 0.2958 Accuracy: 0.9046 Time: 4.40224  | Val Loss: 0.2921 Accuracy: 0.9101\n",
      "Epoch: 133/200 Train Loss: 0.2924 Accuracy: 0.9051 Time: 4.51495  | Val Loss: 0.3044 Accuracy: 0.9070\n",
      "Epoch: 134/200 Train Loss: 0.2831 Accuracy: 0.9085 Time: 4.47115  | Val Loss: 0.2989 Accuracy: 0.9080\n",
      "Epoch: 135/200 Train Loss: 0.2718 Accuracy: 0.9108 Time: 4.64016  | Val Loss: 0.3053 Accuracy: 0.9090\n",
      "Epoch: 136/200 Train Loss: 0.2704 Accuracy: 0.9119 Time: 4.45716  | Val Loss: 0.3073 Accuracy: 0.9070\n",
      "Epoch: 137/200 Train Loss: 0.2856 Accuracy: 0.9071 Time: 4.74292  | Val Loss: 0.2911 Accuracy: 0.9126\n",
      "Epoch: 138/200 Train Loss: 0.2622 Accuracy: 0.9127 Time: 4.58595  | Val Loss: 0.2990 Accuracy: 0.9080\n",
      "Epoch: 139/200 Train Loss: 0.2617 Accuracy: 0.9139 Time: 4.50313  | Val Loss: 0.2972 Accuracy: 0.9096\n",
      "Epoch: 140/200 Train Loss: 0.2666 Accuracy: 0.9148 Time: 4.33440  | Val Loss: 0.3046 Accuracy: 0.9083\n",
      "Epoch: 141/200 Train Loss: 0.2700 Accuracy: 0.9123 Time: 4.40885  | Val Loss: 0.2950 Accuracy: 0.9083\n",
      "Epoch: 142/200 Train Loss: 0.2747 Accuracy: 0.9106 Time: 4.35515  | Val Loss: 0.2945 Accuracy: 0.9116\n",
      "Epoch: 143/200 Train Loss: 0.2570 Accuracy: 0.9172 Time: 4.37729  | Val Loss: 0.3010 Accuracy: 0.9070\n",
      "Epoch: 144/200 Train Loss: 0.2519 Accuracy: 0.9186 Time: 4.47675  | Val Loss: 0.2932 Accuracy: 0.9118\n",
      "Epoch: 145/200 Train Loss: 0.2567 Accuracy: 0.9148 Time: 4.35659  | Val Loss: 0.2823 Accuracy: 0.9159\n",
      "Epoch: 146/200 Train Loss: 0.2681 Accuracy: 0.9135 Time: 4.30767  | Val Loss: 0.2978 Accuracy: 0.9106\n",
      "Epoch: 147/200 Train Loss: 0.2582 Accuracy: 0.9144 Time: 4.27084  | Val Loss: 0.2928 Accuracy: 0.9124\n",
      "Epoch: 148/200 Train Loss: 0.2484 Accuracy: 0.9196 Time: 4.39313  | Val Loss: 0.2873 Accuracy: 0.9169\n",
      "Epoch: 149/200 Train Loss: 0.2594 Accuracy: 0.9160 Time: 4.31150  | Val Loss: 0.2953 Accuracy: 0.9118\n",
      "Epoch: 150/200 Train Loss: 0.2511 Accuracy: 0.9196 Time: 4.28765  | Val Loss: 0.2868 Accuracy: 0.9131\n",
      "Epoch: 151/200 Train Loss: 0.2498 Accuracy: 0.9194 Time: 4.37220  | Val Loss: 0.2932 Accuracy: 0.9116\n",
      "Epoch: 152/200 Train Loss: 0.2449 Accuracy: 0.9196 Time: 4.33370  | Val Loss: 0.2912 Accuracy: 0.9126\n",
      "Epoch: 153/200 Train Loss: 0.2460 Accuracy: 0.9209 Time: 4.36251  | Val Loss: 0.2956 Accuracy: 0.9111\n",
      "Epoch: 154/200 Train Loss: 0.2330 Accuracy: 0.9257 Time: 4.41428  | Val Loss: 0.2961 Accuracy: 0.9111\n",
      "Epoch: 155/200 Train Loss: 0.2387 Accuracy: 0.9223 Time: 4.55261  | Val Loss: 0.2854 Accuracy: 0.9192\n",
      "Epoch: 156/200 Train Loss: 0.2381 Accuracy: 0.9214 Time: 4.27618  | Val Loss: 0.2846 Accuracy: 0.9172\n",
      "Epoch: 157/200 Train Loss: 0.2430 Accuracy: 0.9196 Time: 4.42527  | Val Loss: 0.2845 Accuracy: 0.9169\n",
      "Epoch: 158/200 Train Loss: 0.2344 Accuracy: 0.9233 Time: 4.45526  | Val Loss: 0.2815 Accuracy: 0.9164\n",
      "Epoch: 159/200 Train Loss: 0.2224 Accuracy: 0.9277 Time: 4.43005  | Val Loss: 0.2896 Accuracy: 0.9172\n",
      "Epoch: 160/200 Train Loss: 0.2385 Accuracy: 0.9213 Time: 4.38298  | Val Loss: 0.2805 Accuracy: 0.9200\n",
      "Epoch: 161/200 Train Loss: 0.2298 Accuracy: 0.9242 Time: 4.36689  | Val Loss: 0.2826 Accuracy: 0.9164\n",
      "Epoch: 162/200 Train Loss: 0.2131 Accuracy: 0.9306 Time: 4.52961  | Val Loss: 0.2792 Accuracy: 0.9185\n",
      "Epoch: 163/200 Train Loss: 0.2293 Accuracy: 0.9254 Time: 4.44461  | Val Loss: 0.2829 Accuracy: 0.9172\n",
      "Epoch: 164/200 Train Loss: 0.2169 Accuracy: 0.9285 Time: 4.29348  | Val Loss: 0.2767 Accuracy: 0.9190\n",
      "Epoch: 165/200 Train Loss: 0.2327 Accuracy: 0.9225 Time: 4.34389  | Val Loss: 0.2832 Accuracy: 0.9162\n",
      "Epoch: 166/200 Train Loss: 0.2202 Accuracy: 0.9302 Time: 4.51848  | Val Loss: 0.2893 Accuracy: 0.9167\n",
      "Epoch: 167/200 Train Loss: 0.2304 Accuracy: 0.9232 Time: 4.42723  | Val Loss: 0.2834 Accuracy: 0.9175\n",
      "Epoch: 168/200 Train Loss: 0.2228 Accuracy: 0.9267 Time: 4.41296  | Val Loss: 0.2818 Accuracy: 0.9185\n",
      "Epoch: 169/200 Train Loss: 0.2233 Accuracy: 0.9287 Time: 4.30212  | Val Loss: 0.2815 Accuracy: 0.9177\n",
      "Epoch: 170/200 Train Loss: 0.2165 Accuracy: 0.9291 Time: 4.40441  | Val Loss: 0.2807 Accuracy: 0.9167\n",
      "Epoch: 171/200 Train Loss: 0.2180 Accuracy: 0.9286 Time: 4.32767  | Val Loss: 0.2772 Accuracy: 0.9187\n",
      "Epoch: 172/200 Train Loss: 0.2133 Accuracy: 0.9334 Time: 4.29857  | Val Loss: 0.2763 Accuracy: 0.9190\n",
      "Epoch: 173/200 Train Loss: 0.2163 Accuracy: 0.9293 Time: 4.43410  | Val Loss: 0.2740 Accuracy: 0.9195\n",
      "Epoch: 174/200 Train Loss: 0.2141 Accuracy: 0.9316 Time: 4.43783  | Val Loss: 0.2754 Accuracy: 0.9169\n",
      "Epoch: 175/200 Train Loss: 0.2270 Accuracy: 0.9253 Time: 4.32411  | Val Loss: 0.2775 Accuracy: 0.9225\n",
      "Epoch: 176/200 Train Loss: 0.2094 Accuracy: 0.9302 Time: 4.30695  | Val Loss: 0.2790 Accuracy: 0.9180\n",
      "Epoch: 177/200 Train Loss: 0.2207 Accuracy: 0.9287 Time: 4.38354  | Val Loss: 0.2799 Accuracy: 0.9192\n",
      "Epoch: 178/200 Train Loss: 0.2225 Accuracy: 0.9269 Time: 4.43975  | Val Loss: 0.2770 Accuracy: 0.9192\n",
      "Epoch: 179/200 Train Loss: 0.2130 Accuracy: 0.9281 Time: 4.41024  | Val Loss: 0.2782 Accuracy: 0.9200\n",
      "Epoch: 180/200 Train Loss: 0.2125 Accuracy: 0.9308 Time: 4.37843  | Val Loss: 0.2741 Accuracy: 0.9187\n",
      "Epoch: 181/200 Train Loss: 0.2176 Accuracy: 0.9283 Time: 4.45176  | Val Loss: 0.2761 Accuracy: 0.9190\n",
      "Epoch: 182/200 Train Loss: 0.2024 Accuracy: 0.9343 Time: 4.40142  | Val Loss: 0.2732 Accuracy: 0.9200\n",
      "Epoch: 183/200 Train Loss: 0.2050 Accuracy: 0.9346 Time: 4.29856  | Val Loss: 0.2743 Accuracy: 0.9197\n",
      "Epoch: 184/200 Train Loss: 0.2162 Accuracy: 0.9310 Time: 4.42029  | Val Loss: 0.2800 Accuracy: 0.9182\n",
      "Epoch: 185/200 Train Loss: 0.2059 Accuracy: 0.9332 Time: 4.29871  | Val Loss: 0.2768 Accuracy: 0.9172\n",
      "Epoch: 186/200 Train Loss: 0.2039 Accuracy: 0.9335 Time: 4.40195  | Val Loss: 0.2734 Accuracy: 0.9192\n",
      "Epoch: 187/200 Train Loss: 0.2130 Accuracy: 0.9324 Time: 4.41419  | Val Loss: 0.2758 Accuracy: 0.9195\n",
      "Epoch: 188/200 Train Loss: 0.2135 Accuracy: 0.9291 Time: 4.35219  | Val Loss: 0.2732 Accuracy: 0.9175\n",
      "Epoch: 189/200 Train Loss: 0.2098 Accuracy: 0.9329 Time: 4.34330  | Val Loss: 0.2748 Accuracy: 0.9180\n",
      "Epoch: 190/200 Train Loss: 0.2017 Accuracy: 0.9321 Time: 4.31514  | Val Loss: 0.2750 Accuracy: 0.9187\n",
      "Epoch: 191/200 Train Loss: 0.2052 Accuracy: 0.9324 Time: 4.30030  | Val Loss: 0.2747 Accuracy: 0.9203\n",
      "Epoch: 192/200 Train Loss: 0.2188 Accuracy: 0.9284 Time: 4.35004  | Val Loss: 0.2752 Accuracy: 0.9205\n",
      "Epoch: 193/200 Train Loss: 0.2037 Accuracy: 0.9320 Time: 4.32202  | Val Loss: 0.2744 Accuracy: 0.9208\n",
      "Epoch: 194/200 Train Loss: 0.2048 Accuracy: 0.9324 Time: 4.40137  | Val Loss: 0.2787 Accuracy: 0.9190\n",
      "Epoch: 195/200 Train Loss: 0.2026 Accuracy: 0.9336 Time: 4.45232  | Val Loss: 0.2730 Accuracy: 0.9200\n",
      "Epoch: 196/200 Train Loss: 0.2003 Accuracy: 0.9330 Time: 4.49925  | Val Loss: 0.2739 Accuracy: 0.9195\n",
      "Epoch: 197/200 Train Loss: 0.2081 Accuracy: 0.9301 Time: 4.47371  | Val Loss: 0.2760 Accuracy: 0.9208\n",
      "Epoch: 198/200 Train Loss: 0.1962 Accuracy: 0.9358 Time: 4.44104  | Val Loss: 0.2755 Accuracy: 0.9185\n",
      "Epoch: 199/200 Train Loss: 0.2158 Accuracy: 0.9343 Time: 4.49329  | Val Loss: 0.2714 Accuracy: 0.9203\n",
      "Epoch: 200/200 Train Loss: 0.1978 Accuracy: 0.9335 Time: 4.39961  | Val Loss: 0.2742 Accuracy: 0.9208\n",
      "#Parameter: 1263854 Accuracy: 0.9207643312101911\n"
     ]
    }
   ],
   "source": [
    "from hdd.train.classification_utils import (\n",
    "    naive_train_classification_model,\n",
    "    eval_image_classifier,\n",
    ")\n",
    "from hdd.models.nn_utils import count_trainable_parameter\n",
    "\n",
    "\n",
    "def train_net(\n",
    "    train_dataloader,\n",
    "    val_dataloader,\n",
    "    net,\n",
    "    lr=1e-3,\n",
    "    weight_decay=0,\n",
    "    max_epochs=200,\n",
    ") -> dict[str, list[float]]:\n",
    "\n",
    "    print(f\"#Parameter: {count_trainable_parameter(net)}\")\n",
    "    criteria = nn.CrossEntropyLoss()\n",
    "    optimizer = torch.optim.AdamW(net.parameters(), lr=lr, weight_decay=weight_decay)\n",
    "    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(\n",
    "        optimizer, max_epochs, eta_min=lr / 100\n",
    "    )\n",
    "    training_stats = naive_train_classification_model(\n",
    "        net,\n",
    "        criteria,\n",
    "        max_epochs,\n",
    "        train_dataloader,\n",
    "        val_dataloader,\n",
    "        DEVICE,\n",
    "        optimizer,\n",
    "        scheduler,\n",
    "        verbose=True,\n",
    "    )\n",
    "    return training_stats\n",
    "\n",
    "\n",
    "train_dataloader, val_dataloader = build_dataloader(64, train_dataset, val_dataset)\n",
    "\n",
    "net = ShuffleNetV2(\n",
    "    num_classes=10,\n",
    "    stage_layers=[4, 8, 4],\n",
    "    stage_out_channels=[24, 116, 232, 464, 1024],\n",
    ").to(DEVICE)\n",
    "width_multiplier_1 = train_net(\n",
    "    train_dataloader,\n",
    "    val_dataloader,\n",
    "    net,\n",
    "    lr=0.001,\n",
    "    weight_decay=0,\n",
    ")\n",
    "\n",
    "eval_result = eval_image_classifier(net, val_dataloader.dataset, DEVICE)\n",
    "ss = [result.gt_label == result.predicted_label for result in eval_result]\n",
    "print(f\"#Parameter: {count_trainable_parameter(net)} Accuracy: {sum(ss) / len(ss)}\")"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "pytorch-cu124",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
