{
 "cells": [
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Copyright 2019 Google LLC\n",
    "#\n",
    "# Licensed under the Apache License, Version 2.0 (the \"License\");\n",
    "# you may not use this file except in compliance with the License.\n",
    "# You may obtain a copy of the License at\n",
    "#\n",
    "#     https://www.apache.org/licenses/LICENSE-2.0\n",
    "#\n",
    "# Unless required by applicable law or agreed to in writing, software\n",
    "# distributed under the License is distributed on an \"AS IS\" BASIS,\n",
    "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n",
    "# See the License for the specific language governing permissions and\n",
    "# limitations under the License."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<a target=\"_blank\" href=\"https://colab.research.google.com/github/GoogleCloudPlatform/keras-idiomatic-programmer/blob/master/workshops/Wide_Convolutional_Neural_Networks/Idiomatic%20Programmer%20-%20handbook%201%20-%20Codelab%203.ipynb\">\n",
    "<img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" />Run in Google Colab</a>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Idiomatic Programmer Code Labs\n",
    "\n",
    "## Code Labs #3 - Get Familiar with Wide Convolutional Neural Networks\n",
    "\n",
    "## Prerequisites:\n",
    "\n",
    "    1. Familiarity with Python\n",
    "    2. Completed Handbook 1/Part 3: Wide Convolutional Neural Networks\n",
    "\n",
    "## Objectives:\n",
    "\n",
    "    1. Branch Convolutions in an Inception v1 Module\n",
    "    2. Branch Convolutions in a ResNeXt Module"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Inception Module with the Functional API\n",
    "\n",
    "Let's create an Inception module.\n",
    "\n",
    "We will use these approaches:\n",
    "\n",
    "    1. Dimensionality reduction by replacing one convolution in each pair with a bottleneck \n",
    "       (1x1) convolution.\n",
    "    2. Branching the input through multiple convolutions (wide).\n",
    "    3. Concatenating the branches back together.\n",
    "\n",
    "You fill in the blanks (replace the ??), make sure the code passes the Python interpreter, and then verify its correctness against the summary output.\n",
    "\n",
    "You will need to:\n",
    "\n",
    "    1. Set the filter size, strides and input tensor for the bottleneck layer paired with the max pooling layer.\n",
    "    2. Set the filter size and input tensor for the bottleneck layer paired with the 3x3 convolution.\n",
    "    3. Set the filter size and input tensor for the bottleneck layer paired with the 5x5 convolution.\n",
    "    4. Concatenate the branches."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from keras import Model, Input\n",
    "from keras import layers\n",
    "\n",
    "# Our hypothetical input to an inception module\n",
    "x = inputs = Input((229, 229, 3))\n",
    "\n",
    "# The inception branches (where x is the previous layer)\n",
    "x1 = layers.MaxPooling2D((3, 3), strides=(1,1), padding='same')(x)\n",
    "# Add the bottleneck after the 3x3 max pooling layer\n",
    "# HINT: x1 is the branch for pooling + bottleneck. So the output from pooling is the input to the bottleneck\n",
    "x1 = layers.Conv2D(64, ??, strides=??, padding='same')(??)\n",
    "\n",
    "# Add the second branch which is a single bottleneck convolution\n",
    "x2 = layers.Conv2D(64, (1, 1), strides=(1, 1), padding='same')(x)  # passes straight through\n",
    "\n",
    "x3 = layers.Conv2D(64, (1, 1), strides=(1, 1), padding='same')(x)\n",
    "# Add the 3x3 convolutional layer after the bottleneck\n",
    "# HINT: x3 is the branch for bottleneck + convolution. So the output from bottleneck is the input to the convolution\n",
    "x3 = layers.Conv2D(96, ??, strides=(1, 1), padding='same')(??)\n",
    "\n",
    "x4 = layers.Conv2D(64, (1, 1), strides=(1, 1), padding='same')(x)\n",
    "# Add the 5x5 convolutional layer after the bottleneck\n",
    "# HINT: x4 is the branch for bottleneck + convolution. So the output from bottleneck is the input to the convolution\n",
    "x4 = layers.Conv2D(48, ??, strides=(1, 1), padding='same')(??)\n",
    "\n",
    "# Concatenate the filters from each of the four branches\n",
    "# HINT: List the branches (variable names) as a list\n",
    "x = outputs = layers.concatenate([??, ??, ??, ??])\n",
    "\n",
    "# Let's create a mini-inception neural network using a single inception v1 module\n",
    "model = Model(inputs, outputs)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Verify the model architecture using the summary method\n",
    "\n",
    "It should look like the output below:\n",
    "\n",
    "```\n",
    "__________________________________________________________________________________________________\n",
    "Layer (type)                    Output Shape         Param #     Connected to                     \n",
    "==================================================================================================\n",
    "input_1 (InputLayer)            (None, 229, 229, 3)  0                                            \n",
    "__________________________________________________________________________________________________\n",
    "max_pooling2d_3 (MaxPooling2D)  (None, 229, 229, 3)  0           input_1[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_3 (Conv2D)               (None, 229, 229, 64) 256         input_1[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_5 (Conv2D)               (None, 229, 229, 64) 256         input_1[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_1 (Conv2D)               (None, 229, 229, 64) 256         max_pooling2d_3[0][0]            \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_2 (Conv2D)               (None, 229, 229, 64) 256         input_1[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_4 (Conv2D)               (None, 229, 229, 96) 55392       conv2d_3[0][0]                   \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_6 (Conv2D)               (None, 229, 229, 48) 76848       conv2d_5[0][0]                   \n",
    "__________________________________________________________________________________________________\n",
    "concatenate_1 (Concatenate)     (None, 229, 229, 272 0           conv2d_1[0][0]                   \n",
    "                                                                 conv2d_2[0][0]                   \n",
    "                                                                 conv2d_4[0][0]                   \n",
    "                                                                 conv2d_6[0][0]                   \n",
    "==================================================================================================\n",
    "Total params: 133,264\n",
    "Trainable params: 133,264\n",
    "Non-trainable params: 0\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "model.summary()"
   ]
  },
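  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check (a sketch, not part of the solution): concatenation joins the branches along the channel axis, so the output channel count is simply the sum of the branch filter counts, 64 + 64 + 96 + 48 = 272. The spatial size stays 229x229 because every branch uses `padding='same'` with stride 1."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sanity check: concatenating along the channel axis sums the branch filters\n",
    "branch_filters = [64, 64, 96, 48]  # pooling+bottleneck, 1x1, 3x3, 5x5 branches\n",
    "print(sum(branch_filters))         # 272, matching the Concatenate layer above"
   ]
  },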
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## ResNeXt Module with the Functional API\n",
    "\n",
    "Let's create a ResNeXt module.\n",
    "\n",
    "We will use these approaches:\n",
    "\n",
    "    1. Splitting and branching the input through parallel convolutions (wide).\n",
    "    2. Concatenating the branches back together.\n",
    "    3. Dimensionality reduction by sandwiching the split/branch between two bottleneck \n",
    "       convolutions.\n",
    "\n",
    "You will need to:\n",
    "\n",
    "    1. Append each split (parallel) convolution to a list.\n",
    "    2. Set the number of input and output filters of each residual block group."
   ]
  },
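  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Before filling in the blanks, it may help to see what the Lambda slicing in the cardinality layer does. A minimal NumPy sketch (the numbers 128 and 32 match the first group below; nothing here is part of the solution): each of the cardinality branches receives a contiguous slice of filters_in // cardinality channels."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "filters_in, cardinality = 128, 32\n",
    "filters_card = filters_in // cardinality   # 4 channels per branch\n",
    "x = np.zeros((1, 56, 56, filters_in))      # a dummy feature map\n",
    "\n",
    "# The same slicing the Lambda layers perform: branch i gets channels\n",
    "# [i * filters_card, (i + 1) * filters_card)\n",
    "groups = [x[:, :, :, i * filters_card:(i + 1) * filters_card]\n",
    "          for i in range(cardinality)]\n",
    "print(len(groups), groups[0].shape)        # 32 branches of shape (1, 56, 56, 4)"
   ]
  },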
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from keras import Input, Model\n",
    "from keras import layers\n",
    "\n",
    "def _resnext_block(shortcut, filters_in, filters_out, cardinality=32, strides=(1, 1)):\n",
    "    \"\"\" Construct a ResNeXt block\n",
    "        shortcut   : previous layer, also used as the shortcut for the identity link\n",
    "        filters_in : number of filters (channels) at the input convolution\n",
    "        filters_out: number of filters (channels) at the output convolution\n",
    "        cardinality: number of parallel branches (width of the cardinality layer)\n",
    "        strides    : strides for the cardinality-group convolutions\n",
    "    \"\"\"\n",
    "\n",
    "    # Bottleneck layer\n",
    "    # HINT: remember, it's all about 1s\n",
    "    x = layers.Conv2D(filters_in, kernel_size=??, strides=??,\n",
    "                      padding='same')(shortcut)\n",
    "    x = layers.BatchNormalization()(x)\n",
    "    x = layers.ReLU()(x)\n",
    "\n",
    "    # Cardinality (Wide) Layer\n",
    "    filters_card = filters_in // cardinality\n",
    "    groups = []\n",
    "    for i in range(cardinality):\n",
    "        # Split the input evenly across parallel branches\n",
    "        group = layers.Lambda(lambda z: z[:, :, :, i * filters_card:i *\n",
    "                              filters_card + filters_card])(x)\n",
    "        # Maintain a list of the parallel branches\n",
    "        # HINT: You're building a list of the split inputs (group), each passed\n",
    "        # through a 3x3 convolution.\n",
    "        groups.append(layers.Conv2D(filters_card, kernel_size=(3, 3),\n",
    "                                    strides=strides, padding='same')(??))\n",
    "\n",
    "    # Concatenate the outputs of the cardinality layer together\n",
    "    # HINT: It's the list of parallel branches to concatenate\n",
    "    x = layers.concatenate(??)\n",
    "    x = layers.BatchNormalization()(x)\n",
    "    x = layers.ReLU()(x)\n",
    "\n",
    "    # Bottleneck layer\n",
    "    x = layers.Conv2D(filters_out, kernel_size=(1, 1), strides=(1, 1),\n",
    "                      padding='same')(x)\n",
    "    x = layers.BatchNormalization()(x)\n",
    "\n",
    "    # Special case for the first ResNeXt block in a group\n",
    "    if shortcut.shape[-1] != filters_out:\n",
    "        # Use a 1x1 convolution to project the shortcut to the same number\n",
    "        # of output filters as the block (so we can add them)\n",
    "        shortcut = layers.Conv2D(filters_out, kernel_size=(1, 1), strides=strides,\n",
    "                                 padding='same')(shortcut)\n",
    "        shortcut = layers.BatchNormalization()(shortcut)\n",
    "\n",
    "    # Identity Link: Add the shortcut (input) to the output of the block\n",
    "    x = layers.add([shortcut, x])\n",
    "    x = layers.ReLU()(x)\n",
    "    return x\n",
    "\n",
    "# The input tensor\n",
    "inputs = layers.Input(shape=(224, 224, 3))\n",
    "\n",
    "# Stem Convolutional layer\n",
    "x = layers.Conv2D(64, kernel_size=(7, 7), strides=(2, 2), padding='same')(inputs)\n",
    "x = layers.BatchNormalization()(x)\n",
    "x = layers.ReLU()(x)\n",
    "x = layers.MaxPool2D(pool_size=(3, 3), strides=(2, 2), padding='same')(x)\n",
    "\n",
    "# First ResNeXt Group, inputs are 128 filters and outputs are 256\n",
    "# HINT: the second number will be twice as big as the first number\n",
    "x = _resnext_block(x, ??, ??, strides=(2, 2))\n",
    "for _ in range(2):\n",
    "    x = _resnext_block(x, ??, ??)\n",
    "\n",
    "# Strided convolution to match the number of output filters of the next block\n",
    "# and halve the feature map size\n",
    "x = layers.Conv2D(512, kernel_size=(1, 1), strides=(2, 2), padding='same')(x)\n",
    "\n",
    "# Second ResNeXt Group, inputs will be 256 and outputs will be 512\n",
    "for _ in range(4):\n",
    "    x = _resnext_block(x, ??, ??)\n",
    "\n",
    "# Strided convolution to match the number of output filters of the next block\n",
    "# and halve the feature map size\n",
    "x = layers.Conv2D(1024, kernel_size=(1, 1), strides=(2, 2), padding='same')(x)\n",
    "\n",
    "# Third ResNeXt Group, inputs will be 512 and outputs 1024\n",
    "for _ in range(6):\n",
    "    x = _resnext_block(x, ??, ??)\n",
    "\n",
    "# Strided convolution to match the number of output filters of the next block\n",
    "# and halve the feature map size\n",
    "x = layers.Conv2D(2048, kernel_size=(1, 1), strides=(2, 2), padding='same')(x)\n",
    "\n",
    "# Fourth ResNeXt Group, inputs will be 1024 and outputs will be 2048\n",
    "for _ in range(3):\n",
    "    x = _resnext_block(x, ??, ??)\n",
    "\n",
    "# Final dense output layer for the 1000 classes\n",
    "x = layers.GlobalAveragePooling2D()(x)\n",
    "outputs = layers.Dense(1000, activation='softmax')(x)\n",
    "\n",
    "model = Model(inputs, outputs)"
   ]
  },
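  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A hint on the group schedule (the numbers restate what the comments above already say; this is a check, not the solution): the four groups use 3, 4, 6 and 3 blocks, the input filters double from group to group, and each group's output filters are twice its input filters."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Group schedule from the comments above: (blocks, filters_in, filters_out)\n",
    "schedule = [(3, 128, 256), (4, 256, 512), (6, 512, 1024), (3, 1024, 2048)]\n",
    "for blocks, f_in, f_out in schedule:\n",
    "    assert f_out == 2 * f_in  # each group's output is twice its input\n",
    "    print(blocks, f_in, f_out)"
   ]
  },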
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Verify the model architecture using the summary method\n",
    "\n",
    "It should look like the output below:\n",
    "\n",
    "```\n",
    "Layer (type)                    Output Shape         Param #     Connected to                     \n",
    "==================================================================================================\n",
    "input_2 (InputLayer)            (None, 224, 224, 3)  0                                            \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_36 (Conv2D)              (None, 112, 112, 64) 9472        input_2[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "batch_normalization_5 (BatchNor (None, 112, 112, 64) 256         conv2d_36[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "re_lu_4 (ReLU)                  (None, 112, 112, 64) 0           batch_normalization_5[0][0]      \n",
    "__________________________________________________________________________________________________\n",
    "max_pooling2d_2 (MaxPooling2D)  (None, 56, 56, 64)   0           re_lu_4[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_37 (Conv2D)              (None, 56, 56, 128)  8320        max_pooling2d_2[0][0]            \n",
    "__________________________________________________________________________________________________\n",
    "batch_normalization_6 (BatchNor (None, 56, 56, 128)  512         conv2d_37[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "re_lu_5 (ReLU)                  (None, 56, 56, 128)  0           batch_normalization_6[0][0]      \n",
    "__________________________________________________________________________________________________\n",
    "lambda_33 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_34 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_35 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_36 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_37 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_38 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_39 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_40 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_41 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_42 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_43 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_44 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_45 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_46 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_47 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_48 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_49 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_50 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_51 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_52 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_53 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_54 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_55 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_56 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_57 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_58 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_59 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_60 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_61 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_62 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_63 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "lambda_64 (Lambda)              (None, 56, 56, 4)    0           re_lu_5[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_38 (Conv2D)              (None, 56, 56, 4)    148         lambda_33[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_39 (Conv2D)              (None, 56, 56, 4)    148         lambda_34[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_40 (Conv2D)              (None, 56, 56, 4)    148         lambda_35[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_41 (Conv2D)              (None, 56, 56, 4)    148         lambda_36[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_42 (Conv2D)              (None, 56, 56, 4)    148         lambda_37[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_43 (Conv2D)              (None, 56, 56, 4)    148         lambda_38[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_44 (Conv2D)              (None, 56, 56, 4)    148         lambda_39[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_45 (Conv2D)              (None, 56, 56, 4)    148         lambda_40[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_46 (Conv2D)              (None, 56, 56, 4)    148         lambda_41[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_47 (Conv2D)              (None, 56, 56, 4)    148         lambda_42[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_48 (Conv2D)              (None, 56, 56, 4)    148         lambda_43[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_49 (Conv2D)              (None, 56, 56, 4)    148         lambda_44[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_50 (Conv2D)              (None, 56, 56, 4)    148         lambda_45[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_51 (Conv2D)              (None, 56, 56, 4)    148         lambda_46[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_52 (Conv2D)              (None, 56, 56, 4)    148         lambda_47[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_53 (Conv2D)              (None, 56, 56, 4)    148         lambda_48[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_54 (Conv2D)              (None, 56, 56, 4)    148         lambda_49[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_55 (Conv2D)              (None, 56, 56, 4)    148         lambda_50[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_56 (Conv2D)              (None, 56, 56, 4)    148         lambda_51[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_57 (Conv2D)              (None, 56, 56, 4)    148         lambda_52[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_58 (Conv2D)              (None, 56, 56, 4)    148         lambda_53[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_59 (Conv2D)              (None, 56, 56, 4)    148         lambda_54[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_60 (Conv2D)              (None, 56, 56, 4)    148         lambda_55[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_61 (Conv2D)              (None, 56, 56, 4)    148         lambda_56[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_62 (Conv2D)              (None, 56, 56, 4)    148         lambda_57[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_63 (Conv2D)              (None, 56, 56, 4)    148         lambda_58[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_64 (Conv2D)              (None, 56, 56, 4)    148         lambda_59[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_65 (Conv2D)              (None, 56, 56, 4)    148         lambda_60[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_66 (Conv2D)              (None, 56, 56, 4)    148         lambda_61[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_67 (Conv2D)              (None, 56, 56, 4)    148         lambda_62[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_68 (Conv2D)              (None, 56, 56, 4)    148         lambda_63[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_69 (Conv2D)              (None, 56, 56, 4)    148         lambda_64[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "concatenate_2 (Concatenate)     (None, 56, 56, 128)  0           conv2d_38[0][0]                  \n",
    "                                                                 conv2d_39[0][0]                  \n",
    "                                                                 conv2d_40[0][0]                  \n",
    "                                                                 conv2d_41[0][0]                  \n",
    "                                                                 conv2d_42[0][0]                  \n",
    "                                                                 conv2d_43[0][0]                  \n",
    "                                                                 conv2d_44[0][0]                  \n",
    "                                                                 conv2d_45[0][0]                  \n",
    "                                                                 conv2d_46[0][0]                  \n",
    "                                                                 conv2d_47[0][0]                  \n",
    "                                                                 conv2d_48[0][0]                  \n",
    "                                                                 conv2d_49[0][0]                  \n",
    "                                                                 conv2d_50[0][0]                  \n",
    "                                                                 conv2d_51[0][0]                  \n",
    "                                                                 conv2d_52[0][0]                  \n",
    "                                                                 conv2d_53[0][0]                  \n",
    "                                                                 conv2d_54[0][0]                  \n",
    "                                                                 conv2d_55[0][0]                  \n",
    "                                                                 conv2d_56[0][0]                  \n",
    "                                                                 conv2d_57[0][0]                  \n",
    "                                                                 conv2d_58[0][0]                  \n",
    "                                                                 conv2d_59[0][0]                  \n",
    "                                                                 conv2d_60[0][0]                  \n",
    "                                                                 conv2d_61[0][0]                  \n",
    "                                                                 conv2d_62[0][0]                  \n",
    "                                                                 conv2d_63[0][0]                  \n",
    "                                                                 conv2d_64[0][0]                  \n",
    "                                                                 conv2d_65[0][0]                  \n",
    "                                                                 conv2d_66[0][0]                  \n",
    "                                                                 conv2d_67[0][0]                  \n",
    "                                                                 conv2d_68[0][0]                  \n",
    "                                                                 conv2d_69[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "batch_normalization_7 (BatchNor (None, 56, 56, 128)  512         concatenate_2[0][0]              \n",
    "__________________________________________________________________________________________________\n",
    "re_lu_6 (ReLU)                  (None, 56, 56, 128)  0           batch_normalization_7[0][0]      \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_71 (Conv2D)              (None, 56, 56, 256)  16640       max_pooling2d_2[0][0]            \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_70 (Conv2D)              (None, 56, 56, 256)  33024       re_lu_6[0][0]                    \n",
    "__________________________________________________________________________________________________\n",
    "batch_normalization_9 (BatchNor (None, 56, 56, 256)  1024        conv2d_71[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "batch_normalization_8 (BatchNor (None, 56, 56, 256)  1024        conv2d_70[0][0]                  \n",
    "__________________________________________________________________________________________________\n",
    "add_2 (Add)                     (None, 56, 56, 256)  0           batch_normalization_9[0][0]      \n",
    "                                                                 batch_normalization_8[0][0]      \n",
    "                                                                 \n",
    "... (intermediate layers removed for brevity) ...\n",
    "\n",
    "batch_normalization_53 (BatchNo (None, 7, 7, 1024)   4096        concatenate_17[0][0]             \n",
    "__________________________________________________________________________________________________\n",
    "re_lu_51 (ReLU)                 (None, 7, 7, 1024)   0           batch_normalization_53[0][0]     \n",
    "__________________________________________________________________________________________________\n",
    "conv2d_584 (Conv2D)             (None, 7, 7, 2048)   2099200     re_lu_51[0][0]                   \n",
    "__________________________________________________________________________________________________\n",
    "batch_normalization_54 (BatchNo (None, 7, 7, 2048)   8192        conv2d_584[0][0]                 \n",
    "__________________________________________________________________________________________________\n",
    "add_17 (Add)                    (None, 7, 7, 2048)   0           re_lu_49[0][0]                   \n",
    "                                                                 batch_normalization_54[0][0]     \n",
    "__________________________________________________________________________________________________\n",
    "re_lu_52 (ReLU)                 (None, 7, 7, 2048)   0           add_17[0][0]                     \n",
    "__________________________________________________________________________________________________\n",
    "global_average_pooling2d_1 (Glo (None, 2048)         0           re_lu_52[0][0]                   \n",
    "__________________________________________________________________________________________________\n",
    "dense_1 (Dense)                 (None, 1000)         2049000     global_average_pooling2d_1[0][0] \n",
    "==================================================================================================\n",
    "Total params: 26,493,160\n",
    "Trainable params: 26,432,104\n",
    "Non-trainable params: 61,056\n",
    "```"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "model.summary()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## End of Code Lab"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
