{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "b367fc60",
   "metadata": {},
   "source": [
    "## ResNeXt 模型关键创新点总结\n",
    "\n",
    "ResNeXt 是由何凯明团队在 2017 年 CVPR 提出的图像分类网络，作为 ResNet 的改进版本，其核心创新点在于 **“聚合残差变换”（Aggregated Residual Transformations）**，通过以下关键技术提升了模型性能与效率：\n",
    "\n",
    "---\n",
    "\n",
    "#### 1. **Cardinality（基数）概念**\n",
    "- **定义**：Cardinality 是指并行分支的数量（类似 ResNet 的残差分支），但不同于深度（depth）和宽度（width），它是一个新的维度超参数。\n",
    "- **作用**：通过增加基数而非单纯加深或加宽网络，能够更高效地提升模型性能，同时保持参数量可控。\n",
    "\n",
    "#### 2. **分组卷积（Grouped Convolution）**\n",
    "- **实现方式**：将输入通道划分为多个组（groups），每组独立进行卷积操作，最后合并输出。\n",
    "- **优势**：\n",
    "  - 减少计算量和参数量（相比标准卷积）。\n",
    "  - 增强特征多样性，提升模型鲁棒性。\n",
    "\n",
    "#### 3. **聚合残差块（Aggregated Residual Block）**\n",
    "- **结构特点**：使用 **并行堆叠的相同拓扑结构块**（如多个分组卷积分支），替代 ResNet 中的三层卷积块（Bottleneck Block）。\n",
    "- **优势**：\n",
    "  - 简化网络设计，减少超参数数量。\n",
    "  - 通过多路径特征融合增强表达能力。\n",
    "\n",
    "![alt text](resources/resnext_comparison.png \"Title\")\n",
    "\n",
    "\n",
    "### 总结\n",
    "ResNeXt 的核心思想是通过 **分组卷积 + 多路径聚合**，在降低计算复杂度的同时提升模型性能。其设计哲学强调“基数”这一新维度的重要性，为后续轻量化模型（如 MobileNet、ShuffleNet）提供了重要启发。\n",
    "\n",
    "![alt text](resources/resnext_block.png \"Title\")"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "22e18951",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "The autoreload extension is already loaded. To reload it, use:\n",
      "  %reload_ext autoreload\n"
     ]
    }
   ],
   "source": [
    "# 自动重新加载外部module，使得修改代码之后无需重新import\n",
    "# see http://stackoverflow.com/questions/1907993/autoreload-of-modules-in-ipython\n",
    "%load_ext autoreload\n",
    "%autoreload 2\n",
    "\n",
    "from hdd.device.utils import get_device\n",
    "\n",
    "import torch\n",
    "import torch.nn as nn\n",
    "import torch.optim as optim\n",
    "from torchvision import datasets, transforms\n",
    "\n",
    "# 设置训练数据的路径\n",
    "DATA_ROOT = \"~/workspace/hands-dirty-on-dl/dataset\"\n",
    "# 设置TensorBoard的路径\n",
    "TENSORBOARD_ROOT = \"~/workspace/hands-dirty-on-dl/dataset\"\n",
    "# 设置预训练模型参数路径\n",
    "TORCH_HUB_PATH = \"~/workspace/hands-dirty-on-dl/pretrained_models\"\n",
    "torch.hub.set_dir(TORCH_HUB_PATH)\n",
    "# 挑选最合适的训练设备\n",
    "DEVICE = get_device([\"cuda\", \"cpu\"])\n",
    "print(\"Device: \", DEVICE)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "0f2c648d",
   "metadata": {},
   "outputs": [],
   "source": [
    "from hdd.dataset.imagenette_in_memory import ImagenetteInMemory\n",
    "from hdd.data_util.auto_augmentation import ImageNetPolicy\n",
    "\n",
    "from hdd.data_util.transforms import RandomResize\n",
    "from torch.utils.data import DataLoader\n",
    "\n",
    "TRAIN_MEAN = [0.4625, 0.4580, 0.4295]\n",
    "TRAIN_STD = [0.2452, 0.2390, 0.2469]\n",
    "train_dataset_transforms = transforms.Compose(\n",
    "    [\n",
    "        RandomResize([256, 296, 384]),  # 随机在三个size中选择一个进行resize\n",
    "        transforms.RandomCrop(224),\n",
    "        transforms.RandomHorizontalFlip(),\n",
    "        ImageNetPolicy(),\n",
    "        transforms.ToTensor(),\n",
    "        transforms.Normalize(mean=TRAIN_MEAN, std=TRAIN_STD),\n",
    "    ]\n",
    ")\n",
    "val_dataset_transforms = transforms.Compose(\n",
    "    [\n",
    "        transforms.Resize(256),\n",
    "        transforms.CenterCrop(224),\n",
    "        transforms.ToTensor(),\n",
    "        transforms.Normalize(mean=TRAIN_MEAN, std=TRAIN_STD),\n",
    "    ]\n",
    ")\n",
    "train_dataset = ImagenetteInMemory(\n",
    "    root=DATA_ROOT,\n",
    "    split=\"train\",\n",
    "    size=\"full\",\n",
    "    download=True,\n",
    "    transform=train_dataset_transforms,\n",
    ")\n",
    "val_dataset = ImagenetteInMemory(\n",
    "    root=DATA_ROOT,\n",
    "    split=\"val\",\n",
    "    size=\"full\",\n",
    "    download=True,\n",
    "    transform=val_dataset_transforms,\n",
    ")\n",
    "\n",
    "\n",
    "def build_dataloader(batch_size, train_dataset, val_dataset):\n",
    "    train_dataloader = DataLoader(\n",
    "        train_dataset, batch_size=batch_size, shuffle=True, num_workers=8\n",
    "    )\n",
    "    val_dataloader = DataLoader(\n",
    "        val_dataset, batch_size=batch_size, shuffle=False, num_workers=8\n",
    "    )\n",
    "    return train_dataloader, val_dataloader"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "b8c163f7",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "#Parameter: 23000394\n",
      "Epoch: 1/200 Train Loss: 2.3253 Accuracy: 0.2187 Time: 21.82687  | Val Loss: 2.3011 Accuracy: 0.3302\n",
      "Epoch: 2/200 Train Loss: 2.0201 Accuracy: 0.3379 Time: 21.39554  | Val Loss: 2.3912 Accuracy: 0.3990\n",
      "Epoch: 3/200 Train Loss: 1.8613 Accuracy: 0.4157 Time: 21.45586  | Val Loss: 2.1527 Accuracy: 0.4986\n",
      "Epoch: 4/200 Train Loss: 1.7678 Accuracy: 0.4559 Time: 21.57703  | Val Loss: 1.5627 Accuracy: 0.5689\n",
      "Epoch: 5/200 Train Loss: 1.6942 Accuracy: 0.4941 Time: 21.31908  | Val Loss: 2.0545 Accuracy: 0.5531\n",
      "Epoch: 6/200 Train Loss: 1.6133 Accuracy: 0.5270 Time: 21.71740  | Val Loss: 1.5808 Accuracy: 0.5870\n",
      "Epoch: 7/200 Train Loss: 1.5650 Accuracy: 0.5518 Time: 21.32236  | Val Loss: 1.3356 Accuracy: 0.6622\n",
      "Epoch: 8/200 Train Loss: 1.5192 Accuracy: 0.5744 Time: 21.96170  | Val Loss: 1.3602 Accuracy: 0.6441\n",
      "Epoch: 9/200 Train Loss: 1.4673 Accuracy: 0.5930 Time: 21.74933  | Val Loss: 1.2467 Accuracy: 0.7019\n",
      "Epoch: 10/200 Train Loss: 1.4190 Accuracy: 0.6169 Time: 21.05814  | Val Loss: 1.2783 Accuracy: 0.6757\n",
      "Epoch: 11/200 Train Loss: 1.3674 Accuracy: 0.6424 Time: 21.53493  | Val Loss: 1.2071 Accuracy: 0.7126\n",
      "Epoch: 12/200 Train Loss: 1.3199 Accuracy: 0.6627 Time: 21.11272  | Val Loss: 1.1722 Accuracy: 0.7294\n",
      "Epoch: 13/200 Train Loss: 1.3029 Accuracy: 0.6704 Time: 21.87283  | Val Loss: 1.1903 Accuracy: 0.7149\n",
      "Epoch: 14/200 Train Loss: 1.2847 Accuracy: 0.6734 Time: 21.47753  | Val Loss: 1.0747 Accuracy: 0.7717\n",
      "Epoch: 15/200 Train Loss: 1.2607 Accuracy: 0.6862 Time: 21.61251  | Val Loss: 1.2295 Accuracy: 0.7282\n",
      "Epoch: 16/200 Train Loss: 1.2430 Accuracy: 0.6956 Time: 21.44592  | Val Loss: 1.1741 Accuracy: 0.7317\n",
      "Epoch: 17/200 Train Loss: 1.2114 Accuracy: 0.7080 Time: 21.33899  | Val Loss: 1.0314 Accuracy: 0.7878\n",
      "Epoch: 18/200 Train Loss: 1.2262 Accuracy: 0.7069 Time: 21.51775  | Val Loss: 1.0446 Accuracy: 0.7819\n",
      "Epoch: 19/200 Train Loss: 1.1788 Accuracy: 0.7217 Time: 21.34291  | Val Loss: 11.0824 Accuracy: 0.3934\n",
      "Epoch: 20/200 Train Loss: 1.3029 Accuracy: 0.6692 Time: 21.44521  | Val Loss: 1.0275 Accuracy: 0.7949\n",
      "Epoch: 21/200 Train Loss: 1.1728 Accuracy: 0.7273 Time: 21.47160  | Val Loss: 1.0246 Accuracy: 0.7916\n",
      "Epoch: 22/200 Train Loss: 1.1573 Accuracy: 0.7359 Time: 21.53170  | Val Loss: 1.0515 Accuracy: 0.7862\n",
      "Epoch: 23/200 Train Loss: 1.1283 Accuracy: 0.7451 Time: 21.37671  | Val Loss: 0.9260 Accuracy: 0.8390\n",
      "Epoch: 24/200 Train Loss: 1.0974 Accuracy: 0.7591 Time: 21.36911  | Val Loss: 0.9310 Accuracy: 0.8372\n",
      "Epoch: 25/200 Train Loss: 1.1011 Accuracy: 0.7617 Time: 21.56189  | Val Loss: 0.9379 Accuracy: 0.8306\n",
      "Epoch: 26/200 Train Loss: 1.0800 Accuracy: 0.7716 Time: 20.70689  | Val Loss: 0.9545 Accuracy: 0.8316\n",
      "Epoch: 27/200 Train Loss: 1.0636 Accuracy: 0.7724 Time: 20.68008  | Val Loss: 0.9332 Accuracy: 0.8415\n",
      "Epoch: 28/200 Train Loss: 1.0432 Accuracy: 0.7834 Time: 20.66925  | Val Loss: 0.8493 Accuracy: 0.8652\n",
      "Epoch: 29/200 Train Loss: 1.0356 Accuracy: 0.7839 Time: 20.63516  | Val Loss: 0.8608 Accuracy: 0.8629\n",
      "Epoch: 30/200 Train Loss: 1.0431 Accuracy: 0.7815 Time: 20.66952  | Val Loss: 0.8589 Accuracy: 0.8596\n",
      "Epoch: 31/200 Train Loss: 1.0318 Accuracy: 0.7880 Time: 20.65634  | Val Loss: 0.9439 Accuracy: 0.8313\n",
      "Epoch: 32/200 Train Loss: 1.0151 Accuracy: 0.7960 Time: 20.68275  | Val Loss: 0.9039 Accuracy: 0.8431\n",
      "Epoch: 33/200 Train Loss: 1.0235 Accuracy: 0.7916 Time: 20.70695  | Val Loss: 0.9374 Accuracy: 0.8487\n",
      "Epoch: 34/200 Train Loss: 1.0071 Accuracy: 0.7953 Time: 20.70255  | Val Loss: 0.9536 Accuracy: 0.8324\n",
      "Epoch: 35/200 Train Loss: 1.0109 Accuracy: 0.7949 Time: 20.67435  | Val Loss: 0.8835 Accuracy: 0.8568\n",
      "Epoch: 36/200 Train Loss: 0.9851 Accuracy: 0.8054 Time: 20.68598  | Val Loss: 0.8755 Accuracy: 0.8583\n",
      "Epoch: 37/200 Train Loss: 0.9694 Accuracy: 0.8120 Time: 20.69238  | Val Loss: 0.8460 Accuracy: 0.8637\n",
      "Epoch: 38/200 Train Loss: 0.9701 Accuracy: 0.8097 Time: 20.79959  | Val Loss: 0.8015 Accuracy: 0.8841\n",
      "Epoch: 39/200 Train Loss: 0.9547 Accuracy: 0.8186 Time: 20.68101  | Val Loss: 0.8777 Accuracy: 0.8538\n",
      "Epoch: 40/200 Train Loss: 0.9393 Accuracy: 0.8214 Time: 20.64857  | Val Loss: 0.7962 Accuracy: 0.8859\n",
      "Epoch: 41/200 Train Loss: 0.9400 Accuracy: 0.8244 Time: 20.76361  | Val Loss: 0.8247 Accuracy: 0.8739\n",
      "Epoch: 42/200 Train Loss: 0.9364 Accuracy: 0.8254 Time: 20.75433  | Val Loss: 0.8397 Accuracy: 0.8716\n",
      "Epoch: 43/200 Train Loss: 0.9362 Accuracy: 0.8261 Time: 20.64131  | Val Loss: 0.8102 Accuracy: 0.8818\n",
      "Epoch: 44/200 Train Loss: 0.9433 Accuracy: 0.8227 Time: 20.67432  | Val Loss: 0.8553 Accuracy: 0.8632\n",
      "Epoch: 45/200 Train Loss: 0.9288 Accuracy: 0.8322 Time: 20.67555  | Val Loss: 0.7808 Accuracy: 0.8897\n",
      "Epoch: 46/200 Train Loss: 0.9297 Accuracy: 0.8262 Time: 20.67608  | Val Loss: 0.7730 Accuracy: 0.8948\n",
      "Epoch: 47/200 Train Loss: 0.9148 Accuracy: 0.8381 Time: 21.85090  | Val Loss: 0.7701 Accuracy: 0.8955\n",
      "Epoch: 48/200 Train Loss: 0.9010 Accuracy: 0.8432 Time: 21.92676  | Val Loss: 0.8480 Accuracy: 0.8757\n",
      "Epoch: 49/200 Train Loss: 0.9077 Accuracy: 0.8378 Time: 21.88077  | Val Loss: 0.7580 Accuracy: 0.8989\n",
      "Epoch: 50/200 Train Loss: 0.9012 Accuracy: 0.8429 Time: 22.23141  | Val Loss: 0.7583 Accuracy: 0.9009\n",
      "Epoch: 51/200 Train Loss: 0.8853 Accuracy: 0.8475 Time: 21.80080  | Val Loss: 0.7760 Accuracy: 0.8912\n",
      "Epoch: 52/200 Train Loss: 0.8771 Accuracy: 0.8517 Time: 21.61522  | Val Loss: 0.7646 Accuracy: 0.8978\n",
      "Epoch: 53/200 Train Loss: 0.8716 Accuracy: 0.8491 Time: 21.63914  | Val Loss: 0.7701 Accuracy: 0.8940\n",
      "Epoch: 54/200 Train Loss: 0.8808 Accuracy: 0.8493 Time: 21.80223  | Val Loss: 0.7497 Accuracy: 0.9022\n",
      "Epoch: 55/200 Train Loss: 0.8612 Accuracy: 0.8587 Time: 21.70541  | Val Loss: 0.8301 Accuracy: 0.8736\n",
      "Epoch: 56/200 Train Loss: 0.8870 Accuracy: 0.8453 Time: 21.54556  | Val Loss: 0.7748 Accuracy: 0.8915\n",
      "Epoch: 57/200 Train Loss: 0.8607 Accuracy: 0.8556 Time: 21.74154  | Val Loss: 0.7865 Accuracy: 0.9001\n",
      "Epoch: 58/200 Train Loss: 0.8505 Accuracy: 0.8604 Time: 21.70019  | Val Loss: 0.7596 Accuracy: 0.9034\n",
      "Epoch: 59/200 Train Loss: 0.8435 Accuracy: 0.8632 Time: 21.79803  | Val Loss: 0.7538 Accuracy: 0.9014\n",
      "Epoch: 60/200 Train Loss: 0.8303 Accuracy: 0.8675 Time: 21.69842  | Val Loss: 0.7918 Accuracy: 0.8935\n",
      "Epoch: 61/200 Train Loss: 0.8392 Accuracy: 0.8714 Time: 21.86161  | Val Loss: 0.7534 Accuracy: 0.9098\n",
      "Epoch: 62/200 Train Loss: 0.8179 Accuracy: 0.8725 Time: 21.69879  | Val Loss: 0.7578 Accuracy: 0.8976\n",
      "Epoch: 63/200 Train Loss: 0.8173 Accuracy: 0.8740 Time: 21.38398  | Val Loss: 0.7436 Accuracy: 0.9090\n",
      "Epoch: 64/200 Train Loss: 0.8159 Accuracy: 0.8747 Time: 21.60365  | Val Loss: 0.7336 Accuracy: 0.9108\n",
      "Epoch: 65/200 Train Loss: 0.8053 Accuracy: 0.8828 Time: 21.49327  | Val Loss: 0.7130 Accuracy: 0.9159\n",
      "Epoch: 66/200 Train Loss: 0.8193 Accuracy: 0.8730 Time: 21.58395  | Val Loss: 0.7520 Accuracy: 0.8983\n",
      "Epoch: 67/200 Train Loss: 0.8092 Accuracy: 0.8764 Time: 21.67838  | Val Loss: 0.7430 Accuracy: 0.9045\n",
      "Epoch: 68/200 Train Loss: 0.8059 Accuracy: 0.8786 Time: 21.64960  | Val Loss: 0.8050 Accuracy: 0.8971\n",
      "Epoch: 69/200 Train Loss: 0.8083 Accuracy: 0.8757 Time: 21.64438  | Val Loss: 0.7461 Accuracy: 0.9050\n",
      "Epoch: 70/200 Train Loss: 0.8060 Accuracy: 0.8819 Time: 21.58553  | Val Loss: 0.7230 Accuracy: 0.9136\n",
      "Epoch: 71/200 Train Loss: 0.7980 Accuracy: 0.8844 Time: 21.72330  | Val Loss: 0.7206 Accuracy: 0.9146\n",
      "Epoch: 72/200 Train Loss: 0.7897 Accuracy: 0.8863 Time: 21.78280  | Val Loss: 0.8138 Accuracy: 0.8825\n",
      "Epoch: 73/200 Train Loss: 0.7946 Accuracy: 0.8874 Time: 21.53846  | Val Loss: 0.7149 Accuracy: 0.9180\n",
      "Epoch: 74/200 Train Loss: 0.7834 Accuracy: 0.8894 Time: 21.55225  | Val Loss: 0.7035 Accuracy: 0.9190\n",
      "Epoch: 75/200 Train Loss: 0.7664 Accuracy: 0.8966 Time: 21.42389  | Val Loss: 0.7276 Accuracy: 0.9108\n",
      "Epoch: 76/200 Train Loss: 0.7650 Accuracy: 0.8979 Time: 21.53508  | Val Loss: 0.7342 Accuracy: 0.9139\n",
      "Epoch: 77/200 Train Loss: 0.7660 Accuracy: 0.8959 Time: 21.82128  | Val Loss: 0.7234 Accuracy: 0.9169\n",
      "Epoch: 78/200 Train Loss: 0.7780 Accuracy: 0.8939 Time: 21.71838  | Val Loss: 0.7244 Accuracy: 0.9146\n",
      "Epoch: 79/200 Train Loss: 0.7697 Accuracy: 0.8953 Time: 21.74520  | Val Loss: 0.7543 Accuracy: 0.9047\n",
      "Epoch: 80/200 Train Loss: 0.7689 Accuracy: 0.8930 Time: 22.03686  | Val Loss: 0.7174 Accuracy: 0.9146\n",
      "Epoch: 81/200 Train Loss: 0.7670 Accuracy: 0.8967 Time: 21.96375  | Val Loss: 0.7169 Accuracy: 0.9172\n",
      "Epoch: 82/200 Train Loss: 0.7721 Accuracy: 0.8938 Time: 21.45981  | Val Loss: 0.7263 Accuracy: 0.9139\n",
      "Epoch: 83/200 Train Loss: 0.7500 Accuracy: 0.9015 Time: 21.62502  | Val Loss: 0.6910 Accuracy: 0.9307\n",
      "Epoch: 84/200 Train Loss: 0.7418 Accuracy: 0.9028 Time: 21.75018  | Val Loss: 0.7011 Accuracy: 0.9215\n",
      "Epoch: 85/200 Train Loss: 0.7369 Accuracy: 0.9103 Time: 21.63329  | Val Loss: 0.7299 Accuracy: 0.9134\n",
      "Epoch: 86/200 Train Loss: 0.7662 Accuracy: 0.8964 Time: 21.75165  | Val Loss: 0.7122 Accuracy: 0.9180\n",
      "Epoch: 87/200 Train Loss: 0.7442 Accuracy: 0.9053 Time: 21.72330  | Val Loss: 0.6963 Accuracy: 0.9241\n",
      "Epoch: 88/200 Train Loss: 0.7371 Accuracy: 0.9103 Time: 21.81505  | Val Loss: 0.6875 Accuracy: 0.9256\n",
      "Epoch: 89/200 Train Loss: 0.7291 Accuracy: 0.9112 Time: 21.59560  | Val Loss: 0.7013 Accuracy: 0.9180\n",
      "Epoch: 90/200 Train Loss: 0.7184 Accuracy: 0.9147 Time: 21.72414  | Val Loss: 0.7053 Accuracy: 0.9182\n",
      "Epoch: 91/200 Train Loss: 0.7141 Accuracy: 0.9147 Time: 21.83843  | Val Loss: 0.6814 Accuracy: 0.9312\n",
      "Epoch: 92/200 Train Loss: 0.7181 Accuracy: 0.9166 Time: 21.83415  | Val Loss: 0.6992 Accuracy: 0.9256\n",
      "Epoch: 93/200 Train Loss: 0.7088 Accuracy: 0.9214 Time: 21.57302  | Val Loss: 0.6919 Accuracy: 0.9264\n",
      "Epoch: 94/200 Train Loss: 0.7050 Accuracy: 0.9209 Time: 21.61565  | Val Loss: 0.6875 Accuracy: 0.9220\n",
      "Epoch: 95/200 Train Loss: 0.7037 Accuracy: 0.9219 Time: 21.61732  | Val Loss: 0.6834 Accuracy: 0.9269\n",
      "Epoch: 96/200 Train Loss: 0.7040 Accuracy: 0.9214 Time: 22.02132  | Val Loss: 0.6813 Accuracy: 0.9271\n",
      "Epoch: 97/200 Train Loss: 0.7113 Accuracy: 0.9156 Time: 21.79188  | Val Loss: 0.6916 Accuracy: 0.9213\n",
      "Epoch: 98/200 Train Loss: 0.6975 Accuracy: 0.9249 Time: 21.68323  | Val Loss: 0.6729 Accuracy: 0.9315\n",
      "Epoch: 99/200 Train Loss: 0.6911 Accuracy: 0.9291 Time: 21.55866  | Val Loss: 0.6778 Accuracy: 0.9299\n",
      "Epoch: 100/200 Train Loss: 0.6903 Accuracy: 0.9296 Time: 21.77042  | Val Loss: 0.6828 Accuracy: 0.9276\n",
      "Epoch: 101/200 Train Loss: 0.6980 Accuracy: 0.9240 Time: 21.73576  | Val Loss: 0.6762 Accuracy: 0.9287\n",
      "Epoch: 102/200 Train Loss: 0.6851 Accuracy: 0.9267 Time: 21.63873  | Val Loss: 0.6715 Accuracy: 0.9325\n",
      "Epoch: 103/200 Train Loss: 0.6941 Accuracy: 0.9255 Time: 21.49827  | Val Loss: 0.6797 Accuracy: 0.9282\n",
      "Epoch: 104/200 Train Loss: 0.6819 Accuracy: 0.9291 Time: 21.60484  | Val Loss: 0.6750 Accuracy: 0.9327\n",
      "Epoch: 105/200 Train Loss: 0.6788 Accuracy: 0.9314 Time: 21.49993  | Val Loss: 0.6633 Accuracy: 0.9363\n",
      "Epoch: 106/200 Train Loss: 0.6804 Accuracy: 0.9302 Time: 21.83985  | Val Loss: 0.6797 Accuracy: 0.9282\n",
      "Epoch: 107/200 Train Loss: 0.6741 Accuracy: 0.9340 Time: 21.81366  | Val Loss: 0.6638 Accuracy: 0.9332\n",
      "Epoch: 108/200 Train Loss: 0.6802 Accuracy: 0.9300 Time: 21.68835  | Val Loss: 0.6782 Accuracy: 0.9307\n",
      "Epoch: 109/200 Train Loss: 0.6688 Accuracy: 0.9349 Time: 21.44976  | Val Loss: 0.6609 Accuracy: 0.9371\n",
      "Epoch: 110/200 Train Loss: 0.6633 Accuracy: 0.9393 Time: 21.43648  | Val Loss: 0.6671 Accuracy: 0.9343\n",
      "Epoch: 111/200 Train Loss: 0.6628 Accuracy: 0.9347 Time: 21.57318  | Val Loss: 0.6734 Accuracy: 0.9332\n",
      "Epoch: 112/200 Train Loss: 0.6612 Accuracy: 0.9402 Time: 21.60457  | Val Loss: 0.6683 Accuracy: 0.9358\n",
      "Epoch: 113/200 Train Loss: 0.6640 Accuracy: 0.9370 Time: 21.58435  | Val Loss: 0.6663 Accuracy: 0.9363\n",
      "Epoch: 114/200 Train Loss: 0.6565 Accuracy: 0.9392 Time: 21.57357  | Val Loss: 0.6636 Accuracy: 0.9386\n",
      "Epoch: 115/200 Train Loss: 0.6669 Accuracy: 0.9362 Time: 21.43539  | Val Loss: 0.6633 Accuracy: 0.9361\n",
      "Epoch: 116/200 Train Loss: 0.6547 Accuracy: 0.9441 Time: 21.59427  | Val Loss: 0.6655 Accuracy: 0.9317\n",
      "Epoch: 117/200 Train Loss: 0.6598 Accuracy: 0.9411 Time: 21.62528  | Val Loss: 0.6650 Accuracy: 0.9368\n",
      "Epoch: 118/200 Train Loss: 0.6446 Accuracy: 0.9455 Time: 21.31115  | Val Loss: 0.6699 Accuracy: 0.9343\n",
      "Epoch: 119/200 Train Loss: 0.6528 Accuracy: 0.9416 Time: 21.51582  | Val Loss: 0.6650 Accuracy: 0.9394\n",
      "Epoch: 120/200 Train Loss: 0.6444 Accuracy: 0.9459 Time: 21.31200  | Val Loss: 0.6644 Accuracy: 0.9355\n",
      "Epoch: 121/200 Train Loss: 0.6559 Accuracy: 0.9409 Time: 20.75900  | Val Loss: 0.6611 Accuracy: 0.9348\n",
      "Epoch: 122/200 Train Loss: 0.6444 Accuracy: 0.9455 Time: 21.01926  | Val Loss: 0.6650 Accuracy: 0.9348\n",
      "Epoch: 123/200 Train Loss: 0.6341 Accuracy: 0.9511 Time: 21.91034  | Val Loss: 0.6569 Accuracy: 0.9394\n",
      "Epoch: 124/200 Train Loss: 0.6366 Accuracy: 0.9504 Time: 21.76173  | Val Loss: 0.6628 Accuracy: 0.9376\n",
      "Epoch: 125/200 Train Loss: 0.6290 Accuracy: 0.9532 Time: 21.78814  | Val Loss: 0.6638 Accuracy: 0.9348\n",
      "Epoch: 126/200 Train Loss: 0.6407 Accuracy: 0.9479 Time: 21.91078  | Val Loss: 0.6538 Accuracy: 0.9445\n",
      "Epoch: 127/200 Train Loss: 0.6322 Accuracy: 0.9500 Time: 22.17569  | Val Loss: 0.6476 Accuracy: 0.9399\n",
      "Epoch: 128/200 Train Loss: 0.6322 Accuracy: 0.9510 Time: 22.04459  | Val Loss: 0.6552 Accuracy: 0.9399\n",
      "Epoch: 129/200 Train Loss: 0.6328 Accuracy: 0.9518 Time: 21.81023  | Val Loss: 0.6614 Accuracy: 0.9414\n",
      "Epoch: 130/200 Train Loss: 0.6270 Accuracy: 0.9512 Time: 21.83470  | Val Loss: 0.6472 Accuracy: 0.9462\n",
      "Epoch: 131/200 Train Loss: 0.6227 Accuracy: 0.9542 Time: 22.00222  | Val Loss: 0.6508 Accuracy: 0.9439\n",
      "Epoch: 132/200 Train Loss: 0.6249 Accuracy: 0.9533 Time: 21.80601  | Val Loss: 0.6619 Accuracy: 0.9322\n",
      "Epoch: 133/200 Train Loss: 0.6269 Accuracy: 0.9532 Time: 21.65508  | Val Loss: 0.6509 Accuracy: 0.9406\n",
      "Epoch: 134/200 Train Loss: 0.6248 Accuracy: 0.9534 Time: 21.93204  | Val Loss: 0.6550 Accuracy: 0.9391\n",
      "Epoch: 135/200 Train Loss: 0.6189 Accuracy: 0.9571 Time: 21.06397  | Val Loss: 0.6501 Accuracy: 0.9396\n",
      "Epoch: 136/200 Train Loss: 0.6190 Accuracy: 0.9546 Time: 20.88583  | Val Loss: 0.6530 Accuracy: 0.9394\n",
      "Epoch: 137/200 Train Loss: 0.6157 Accuracy: 0.9568 Time: 20.87478  | Val Loss: 0.6478 Accuracy: 0.9468\n",
      "Epoch: 138/200 Train Loss: 0.6204 Accuracy: 0.9572 Time: 20.71163  | Val Loss: 0.6491 Accuracy: 0.9419\n",
      "Epoch: 139/200 Train Loss: 0.6164 Accuracy: 0.9578 Time: 20.88943  | Val Loss: 0.6462 Accuracy: 0.9417\n",
      "Epoch: 140/200 Train Loss: 0.6120 Accuracy: 0.9575 Time: 20.99598  | Val Loss: 0.6438 Accuracy: 0.9437\n",
      "Epoch: 141/200 Train Loss: 0.6115 Accuracy: 0.9588 Time: 20.79232  | Val Loss: 0.6512 Accuracy: 0.9422\n",
      "Epoch: 142/200 Train Loss: 0.6149 Accuracy: 0.9570 Time: 20.76010  | Val Loss: 0.6550 Accuracy: 0.9386\n",
      "Epoch: 143/200 Train Loss: 0.6146 Accuracy: 0.9584 Time: 20.73074  | Val Loss: 0.6434 Accuracy: 0.9457\n",
      "Epoch: 144/200 Train Loss: 0.6111 Accuracy: 0.9572 Time: 20.78076  | Val Loss: 0.6438 Accuracy: 0.9447\n",
      "Epoch: 145/200 Train Loss: 0.6050 Accuracy: 0.9619 Time: 20.88059  | Val Loss: 0.6463 Accuracy: 0.9424\n",
      "Epoch: 146/200 Train Loss: 0.6135 Accuracy: 0.9585 Time: 21.08118  | Val Loss: 0.6396 Accuracy: 0.9493\n",
      "Epoch: 147/200 Train Loss: 0.6038 Accuracy: 0.9623 Time: 20.90001  | Val Loss: 0.6435 Accuracy: 0.9468\n",
      "Epoch: 148/200 Train Loss: 0.6021 Accuracy: 0.9625 Time: 20.78684  | Val Loss: 0.6395 Accuracy: 0.9496\n",
      "Epoch: 149/200 Train Loss: 0.6013 Accuracy: 0.9635 Time: 20.79747  | Val Loss: 0.6424 Accuracy: 0.9447\n",
      "Epoch: 150/200 Train Loss: 0.6017 Accuracy: 0.9607 Time: 20.79005  | Val Loss: 0.6426 Accuracy: 0.9460\n",
      "Epoch: 151/200 Train Loss: 0.6013 Accuracy: 0.9630 Time: 20.78885  | Val Loss: 0.6454 Accuracy: 0.9465\n",
      "Epoch: 152/200 Train Loss: 0.5927 Accuracy: 0.9681 Time: 20.94757  | Val Loss: 0.6386 Accuracy: 0.9468\n",
      "Epoch: 153/200 Train Loss: 0.5990 Accuracy: 0.9636 Time: 20.85907  | Val Loss: 0.6361 Accuracy: 0.9462\n",
      "Epoch: 154/200 Train Loss: 0.5964 Accuracy: 0.9662 Time: 20.72756  | Val Loss: 0.6389 Accuracy: 0.9457\n",
      "Epoch: 155/200 Train Loss: 0.5903 Accuracy: 0.9677 Time: 20.76454  | Val Loss: 0.6398 Accuracy: 0.9478\n",
      "Epoch: 156/200 Train Loss: 0.5874 Accuracy: 0.9711 Time: 20.81154  | Val Loss: 0.6433 Accuracy: 0.9445\n",
      "Epoch: 157/200 Train Loss: 0.5871 Accuracy: 0.9687 Time: 20.85124  | Val Loss: 0.6377 Accuracy: 0.9470\n",
      "Epoch: 158/200 Train Loss: 0.5933 Accuracy: 0.9665 Time: 20.73687  | Val Loss: 0.6373 Accuracy: 0.9478\n",
      "Epoch: 159/200 Train Loss: 0.5871 Accuracy: 0.9678 Time: 20.85895  | Val Loss: 0.6451 Accuracy: 0.9452\n",
      "Epoch: 160/200 Train Loss: 0.5835 Accuracy: 0.9705 Time: 20.79918  | Val Loss: 0.6404 Accuracy: 0.9473\n",
      "Epoch: 161/200 Train Loss: 0.5891 Accuracy: 0.9675 Time: 20.73198  | Val Loss: 0.6396 Accuracy: 0.9465\n",
      "Epoch: 162/200 Train Loss: 0.5859 Accuracy: 0.9682 Time: 20.77798  | Val Loss: 0.6399 Accuracy: 0.9452\n",
      "Epoch: 163/200 Train Loss: 0.5891 Accuracy: 0.9682 Time: 20.79025  | Val Loss: 0.6379 Accuracy: 0.9475\n",
      "Epoch: 164/200 Train Loss: 0.5863 Accuracy: 0.9703 Time: 20.90893  | Val Loss: 0.6406 Accuracy: 0.9445\n",
      "Epoch: 165/200 Train Loss: 0.5849 Accuracy: 0.9695 Time: 20.72064  | Val Loss: 0.6361 Accuracy: 0.9493\n",
      "Epoch: 166/200 Train Loss: 0.5913 Accuracy: 0.9665 Time: 20.89401  | Val Loss: 0.6342 Accuracy: 0.9488\n",
      "Epoch: 167/200 Train Loss: 0.5822 Accuracy: 0.9717 Time: 20.97545  | Val Loss: 0.6346 Accuracy: 0.9473\n",
      "Epoch: 168/200 Train Loss: 0.5842 Accuracy: 0.9696 Time: 20.75759  | Val Loss: 0.6358 Accuracy: 0.9468\n",
      "Epoch: 169/200 Train Loss: 0.5828 Accuracy: 0.9704 Time: 20.73584  | Val Loss: 0.6356 Accuracy: 0.9475\n",
      "Epoch: 170/200 Train Loss: 0.5805 Accuracy: 0.9709 Time: 20.76948  | Val Loss: 0.6348 Accuracy: 0.9473\n",
      "Epoch: 171/200 Train Loss: 0.5861 Accuracy: 0.9687 Time: 20.72314  | Val Loss: 0.6332 Accuracy: 0.9483\n",
      "Epoch: 172/200 Train Loss: 0.5838 Accuracy: 0.9724 Time: 20.77173  | Val Loss: 0.6364 Accuracy: 0.9470\n",
      "Epoch: 173/200 Train Loss: 0.5803 Accuracy: 0.9698 Time: 20.80724  | Val Loss: 0.6396 Accuracy: 0.9452\n",
      "Epoch: 174/200 Train Loss: 0.5820 Accuracy: 0.9700 Time: 20.77022  | Val Loss: 0.6381 Accuracy: 0.9462\n",
      "Epoch: 175/200 Train Loss: 0.5813 Accuracy: 0.9711 Time: 21.85112  | Val Loss: 0.6355 Accuracy: 0.9460\n",
      "Epoch: 176/200 Train Loss: 0.5796 Accuracy: 0.9726 Time: 21.63625  | Val Loss: 0.6351 Accuracy: 0.9470\n",
      "Epoch: 177/200 Train Loss: 0.5772 Accuracy: 0.9734 Time: 21.61707  | Val Loss: 0.6366 Accuracy: 0.9475\n",
      "Epoch: 178/200 Train Loss: 0.5785 Accuracy: 0.9722 Time: 21.89432  | Val Loss: 0.6340 Accuracy: 0.9490\n",
      "Epoch: 179/200 Train Loss: 0.5814 Accuracy: 0.9702 Time: 21.69114  | Val Loss: 0.6335 Accuracy: 0.9475\n",
      "Epoch: 180/200 Train Loss: 0.5750 Accuracy: 0.9723 Time: 21.83569  | Val Loss: 0.6311 Accuracy: 0.9498\n",
      "Epoch: 181/200 Train Loss: 0.5726 Accuracy: 0.9756 Time: 21.74179  | Val Loss: 0.6352 Accuracy: 0.9473\n",
      "Epoch: 182/200 Train Loss: 0.5753 Accuracy: 0.9736 Time: 21.74505  | Val Loss: 0.6327 Accuracy: 0.9490\n",
      "Epoch: 183/200 Train Loss: 0.5729 Accuracy: 0.9733 Time: 21.88877  | Val Loss: 0.6332 Accuracy: 0.9490\n",
      "Epoch: 184/200 Train Loss: 0.5825 Accuracy: 0.9695 Time: 21.65817  | Val Loss: 0.6298 Accuracy: 0.9513\n",
      "Epoch: 185/200 Train Loss: 0.5736 Accuracy: 0.9743 Time: 21.74729  | Val Loss: 0.6310 Accuracy: 0.9496\n",
      "Epoch: 186/200 Train Loss: 0.5745 Accuracy: 0.9735 Time: 21.09957  | Val Loss: 0.6312 Accuracy: 0.9478\n",
      "Epoch: 187/200 Train Loss: 0.5737 Accuracy: 0.9747 Time: 21.02071  | Val Loss: 0.6357 Accuracy: 0.9468\n",
      "Epoch: 188/200 Train Loss: 0.5719 Accuracy: 0.9753 Time: 20.99627  | Val Loss: 0.6339 Accuracy: 0.9478\n",
      "Epoch: 189/200 Train Loss: 0.5734 Accuracy: 0.9744 Time: 21.24880  | Val Loss: 0.6351 Accuracy: 0.9475\n",
      "Epoch: 190/200 Train Loss: 0.5766 Accuracy: 0.9713 Time: 21.24571  | Val Loss: 0.6325 Accuracy: 0.9480\n",
      "Epoch: 191/200 Train Loss: 0.5774 Accuracy: 0.9707 Time: 21.12806  | Val Loss: 0.6328 Accuracy: 0.9478\n",
      "Epoch: 192/200 Train Loss: 0.5712 Accuracy: 0.9752 Time: 20.90483  | Val Loss: 0.6325 Accuracy: 0.9485\n",
      "Epoch: 193/200 Train Loss: 0.5780 Accuracy: 0.9703 Time: 20.91994  | Val Loss: 0.6338 Accuracy: 0.9483\n",
      "Epoch: 194/200 Train Loss: 0.5698 Accuracy: 0.9750 Time: 20.91198  | Val Loss: 0.6310 Accuracy: 0.9490\n",
      "Epoch: 195/200 Train Loss: 0.5775 Accuracy: 0.9729 Time: 21.39987  | Val Loss: 0.6324 Accuracy: 0.9493\n",
      "Epoch: 196/200 Train Loss: 0.5705 Accuracy: 0.9756 Time: 22.16663  | Val Loss: 0.6320 Accuracy: 0.9490\n",
      "Epoch: 197/200 Train Loss: 0.5689 Accuracy: 0.9761 Time: 21.74348  | Val Loss: 0.6337 Accuracy: 0.9457\n",
      "Epoch: 198/200 Train Loss: 0.5729 Accuracy: 0.9739 Time: 22.14608  | Val Loss: 0.6323 Accuracy: 0.9480\n",
      "Epoch: 199/200 Train Loss: 0.5749 Accuracy: 0.9733 Time: 21.92082  | Val Loss: 0.6324 Accuracy: 0.9473\n",
      "Epoch: 200/200 Train Loss: 0.5725 Accuracy: 0.9750 Time: 22.08944  | Val Loss: 0.6336 Accuracy: 0.9475\n",
      "#Parameter: 23000394 Accuracy: 0.947515923566879\n"
     ]
    }
   ],
   "source": [
    "from hdd.models.cnn.resnext import ResNextNet50_32_4, ResNextNet\n",
    "from hdd.train.classification_utils import (\n",
    "    naive_train_classification_model,\n",
    "    eval_image_classifier,\n",
    ")\n",
    "from hdd.models.nn_utils import count_trainable_parameter\n",
    "\n",
    "\n",
    "def train_net(\n",
    "    train_dataloader,\n",
    "    val_dataloader,\n",
    "    lr=1e-3,\n",
    "    weight_decay=1e-3,\n",
    "    max_epochs=200,\n",
    ") -> tuple[ResNextNet, dict[str, list[float]]]:\n",
    "    net = ResNextNet50_32_4(num_classes=10, dropout=0.5).to(DEVICE)\n",
    "    print(f\"#Parameter: {count_trainable_parameter(net)}\")\n",
    "    criteria = nn.CrossEntropyLoss(label_smoothing=0.1)\n",
    "    optimizer = torch.optim.AdamW(net.parameters(), lr=lr, weight_decay=weight_decay)\n",
    "    scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(\n",
    "        optimizer, max_epochs, eta_min=lr / 100\n",
    "    )\n",
    "    training_stats = naive_train_classification_model(\n",
    "        net,\n",
    "        criteria,\n",
    "        max_epochs,\n",
    "        train_dataloader,\n",
    "        val_dataloader,\n",
    "        DEVICE,\n",
    "        optimizer,\n",
    "        scheduler,\n",
    "        verbose=True,\n",
    "    )\n",
    "    return net, training_stats\n",
    "\n",
    "\n",
    "train_dataloader, val_dataloader = build_dataloader(64, train_dataset, val_dataset)\n",
    "\n",
    "net, width_multiplier_1 = train_net(\n",
    "    train_dataloader,\n",
    "    val_dataloader,\n",
    "    lr=0.001,\n",
    "    weight_decay=0,\n",
    ")\n",
    "\n",
    "eval_result = eval_image_classifier(net, val_dataloader.dataset, DEVICE)\n",
    "ss = [result.gt_label == result.predicted_label for result in eval_result]\n",
    "print(f\"#Parameter: {count_trainable_parameter(net)} Accuracy: {sum(ss) / len(ss)}\")"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "pytorch-cu124",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.11.11"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
