{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Week 8 Assignment\n",
    "Implement a DenseNet yourself and plug it into the slim framework for training.  \n",
    "Details: https://gitee.com/ai100/quiz-w7-2-densenet  \n",
    "TinyMind usage guide: https://gitee.com/ai100/quiz-w7-doc"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 1. Training log screenshot"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Screenshot of the densenet run on TinyMind:\n",
    "<img src=\"densenet-tinymind.png\" width=\"70%\">"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 2. DenseNet implementation\n",
    "This DenseNet is built strictly with the parameters given in the paper.  \n",
    "Concretely, following the scaffold code provided by the instructor, the body of the densenet function is completed as follows."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# Build the network with the parameters from the paper (growth = 24, compression = 0.5)\n",
    "net = images\n",
    "# Conv before the first block (the paper uses 2k output channels here; layers inside blocks emit k), then pooling\n",
    "net = slim.conv2d(net, 2 * growth, [7, 7], stride=2, scope='conv1')\n",
    "net = slim.max_pool2d(net, [3, 3], stride=2, padding='SAME', scope='pool1')\n",
    "# block1: 6 layers, each H = BN + ReLU + Conv3x3\n",
    "net = block(net, 6, growth, scope='block1')\n",
    "# transition1: H = BN + ReLU + Conv1x1, compressing the channel count by 0.5\n",
    "net = bn_act_conv_drp(net, num_outputs=reduce_dim(net), kernel_size=[1, 1], scope='transition1')\n",
    "# block2: 12 layers\n",
    "net = block(net, 12, growth, scope='block2')\n",
    "# transition2\n",
    "net = bn_act_conv_drp(net, num_outputs=reduce_dim(net), kernel_size=[1, 1], scope='transition2')\n",
    "# block3: 24 layers\n",
    "net = block(net, 24, growth, scope='block3')\n",
    "# transition3\n",
    "net = bn_act_conv_drp(net, num_outputs=reduce_dim(net), kernel_size=[1, 1], scope='transition3')\n",
    "# block4: 16 layers (the paper places transitions only between blocks, so none follows the last block)\n",
    "net = block(net, 16, growth, scope='block4')\n",
    "\n",
    "# Classification layer\n",
    "# 1. global average pooling over the full spatial extent (the kernel covers everything, so stride 1 suffices)\n",
    "net = slim.avg_pool2d(net, net.shape[1:3], stride=1, scope='global_average_pool')\n",
    "# 2. a 1x1 conv maps the 1x1xdepth tensor to num_classes channels\n",
    "last_conv = slim.conv2d(net, num_classes, [1, 1], scope='last_conv')\n",
    "# Squeeze only the spatial dims so a batch dimension of size 1 is preserved\n",
    "logits = tf.squeeze(last_conv, [1, 2], name='SpatialSqueeze')\n",
    "\n",
    "end_points['logits'] = logits"
   ]
  },
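  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a sanity check on the channel bookkeeping above, here is a minimal NumPy sketch (illustrative only, not the slim implementation; `fake_layer` is a hypothetical stand-in for H = BN + ReLU + Conv3x3): concatenating each layer's output onto its input makes the channel count grow by k per layer."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "def fake_layer(x, growth):\n",
    "    # Stand-in for H = BN + ReLU + Conv3x3: any op that emits `growth` channels\n",
    "    return np.zeros(x.shape[:-1] + (growth,), dtype=x.dtype)\n",
    "\n",
    "def dense_block(x, num_layers, growth):\n",
    "    # Each layer sees the concatenation of all previous outputs (the global state)\n",
    "    for _ in range(num_layers):\n",
    "        new_features = fake_layer(x, growth)\n",
    "        x = np.concatenate([x, new_features], axis=-1)\n",
    "    return x\n",
    "\n",
    "x = np.zeros((1, 56, 56, 2 * 24))  # 2k input channels, k = 24\n",
    "print(dense_block(x, 6, 24).shape[-1])  # 48 + 6 * 24 = 192"
   ]
  },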
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "My understanding of the growth rate:  \n",
    "DenseNet concatenates the feature maps produced by every convolutional layer into a global state, which strengthens information flow and mitigates vanishing gradients. But because each layer keeps appending new feature maps, the global state would grow too deep and the computation would blow up, so the number of feature maps each layer adds is made a tunable hyperparameter: it maximizes the information passed between layers while keeping the computation under control. This hyperparameter, denoted k, sets the number of convolution kernels (output channels) per layer.  \n",
    "Questions:\n",
    "1. To control feature-map growth, couldn't one also adjust the number of blocks and the number of layers per block?\n",
    "2. In the bottleneck layer, the input first goes through a 1x1 conv with 4k channels and then a 3x3 conv with k channels. Is the factor of 4k an empirical value? Does it work for all datasets?"
   ]
  },
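  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To make the growth-rate trade-off concrete, a small back-of-the-envelope calculation (assuming, as above, k = 24, compression 0.5, an initial conv with 2k channels, and transitions only between blocks) of how many channels reach the classifier:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def channels_after(blocks, growth=24, compression=0.5):\n",
    "    c = 2 * growth  # channels out of the initial conv\n",
    "    for i, num_layers in enumerate(blocks):\n",
    "        c += num_layers * growth  # each dense layer appends k channels\n",
    "        if i < len(blocks) - 1:   # transitions sit between blocks only\n",
    "            c = int(c * compression)\n",
    "    return c\n",
    "\n",
    "print(channels_after([6, 12, 24, 16]))      # 768 with k = 24\n",
    "print(channels_after([6, 12, 24, 16], 32))  # 1024, matching DenseNet-121"
   ]
  },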
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Understanding dense connections:  \n",
    "Dense connections strengthen the flow of information between layers: each layer's input is the concatenation of the outputs of all preceding layers. Unlike ResNet, which sums the input with the output, concatenation (in my understanding) better preserves each layer's feature-map information. The goal is to prevent vanishing gradients when updating the network by gradient descent, and it also lets the network go deeper. And thanks to the tunable growth rate, DenseNet achieves results as good as other networks with fewer parameters.  \n",
    "Thought: dense connections do achieve very good results, but they also depend heavily on available compute."
   ]
  }
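  ,
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The contrast with ResNet drawn above can be sketched in a few lines (illustrative shapes only): a residual connection adds the layer output to its input, leaving the channel count unchanged, while a dense connection concatenates, so the input features are kept verbatim."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "x = np.ones((1, 4, 4, 8))                  # input features\n",
    "f_x = np.full((1, 4, 4, 8), 2.0)           # pretend layer output F(x)\n",
    "\n",
    "residual = x + f_x                         # ResNet: elementwise sum, shape kept\n",
    "dense = np.concatenate([x, f_x], axis=-1)  # DenseNet: channel concatenation\n",
    "\n",
    "print(residual.shape[-1], dense.shape[-1])  # 8 16\n",
    "print(np.array_equal(dense[..., :8], x))    # True: x survives unchanged"
   ]
  }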
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
