{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Table of Contents\n",
    "* [Intro](#Intro)\n",
    "\t* [Logistic Regression](#Logistic-Regression)\n",
    "\t* [From Linear to Logistic Regression](#From-Linear-to-Logistic-Regression)\n",
    "\t* [Logistic Function](#Logistic-Function)\n",
    "* [Decision Boundary [TOFIX]](#Decision-Boundary-[TOFIX])\n",
    "* [Simulate Data](#Simulate-Data)\n",
    "* [Logistic Regression (Sklearn)](#Logistic-Regression-%28Sklearn%29)\n",
    "* [Gradient Descent](#Gradient-Descent)\n",
    "\t* [Training Animation](#Training-Animation)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Intro"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Exploratory notebook related to basic concepts and theory behind logistic regression. Includes toy examples implementation and relative visualization."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Logistic Regression"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Contrary from what the name suggests, logistic regression solves Classification type of problems. It moves away from regression to overcome linearity limitations in the context of classification, and adopts the logistic function for hypothesis building."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 59,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:38:31.706870",
     "start_time": "2017-08-02T13:38:31.293846"
    },
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "%matplotlib notebook\n",
    "\n",
    "import numpy as np\n",
    "import pandas as pd\n",
    "import seaborn as sns\n",
    "from sklearn import linear_model, datasets\n",
    "from matplotlib import pyplot as plt, animation\n",
    "\n",
    "sns.set_context(\"paper\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## From Linear to Logistic Regression"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Demonstrate the rationale behind the move from linear to logistic regression using reproduced examples from [Coursera course](https://www.coursera.org/learn/machine-learning). Consider again the statements \"classification is not a linear function\".\n",
    "\n",
    "We can clearly see how outliers can easily demonstrate the non feasibility of regression of classification problems."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T12:52:14.965049",
     "start_time": "2017-08-02T12:52:14.956049"
    },
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# Tumor data\n",
    "x = np.arange(10)\n",
    "y = np.array([0]*5 + [1]*5)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T12:52:25.202635",
     "start_time": "2017-08-02T12:52:25.003624"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Slope = 0.152 (r = 0.870, p = 0.00105)\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "<IPython.core.display.Javascript object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\">"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Plot separation when \"clean\" data\n",
    "from scipy import stats\n",
    "\n",
    "slope, intercept, r, p, _ = stats.linregress(x, y)\n",
    "print('Slope = {:.3f} (r = {:.3f}, p = {:.5f})'.format(slope, r, p))\n",
    "\n",
    "ax = sns.regplot(x, y)\n",
    "x_intersect = (0.5 - intercept)/slope\n",
    "ax.plot([x_intersect, x_intersect], [-1,2], 'k-')\n",
    "sns.plt.show()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T12:52:33.791126",
     "start_time": "2017-08-02T12:52:33.610116"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Slope = 0.047 (r = 0.613, p = 0.04494)\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "<IPython.core.display.Javascript object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\">"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Plot sepration when outlier\n",
    "x = np.append(x, [25])\n",
    "y = np.append(y, [1])\n",
    "slope, intercept, r, p, _ = stats.linregress(x, y)\n",
    "print('Slope = {:.3f} (r = {:.3f}, p = {:.5f})'.format(slope, r, p))\n",
    "\n",
    "ax = sns.regplot(x, y)\n",
    "x_intersect = (0.5 - intercept)/slope\n",
    "ax.plot([x_intersect, x_intersect], [-1,2], 'k-')\n",
    "sns.plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Logistic Function"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The hypothesis function associated with the Logistic Regression model. \n",
    "\n",
    "$$\\frac{1}{1+e^{-x}}$$\n",
    "\n",
    "A sigmoid function is a function characterized by an S shaped curve. Logistic function is a special case of sigmoid function, but often the two terms are used interchangeably.\n",
    "\n",
    "Statistical approaches tend to mention the logit function (inverse of the sigmoid one) and the concept of odds. [Great article about the connection of the two interpretations](https://sebastianraschka.com/faq/docs/logistic-why-sigmoid.html)"
   ]
  },
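  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick side sketch (our own addition, not from the referenced article): the logit maps a probability to its log-odds, and feeding the log-odds back through the logistic function recovers the original probability."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Logit (log-odds) as the inverse of the logistic function\n",
    "p = np.linspace(0.01, 0.99, 99)\n",
    "log_odds = np.log(p / (1 - p))\n",
    "# Round trip: logistic(logit(p)) recovers p\n",
    "p_back = 1 / (1 + np.exp(-log_odds))\n",
    "print(np.allclose(p, p_back))"
   ]
  },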
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:10:31.648776",
     "start_time": "2017-08-02T13:10:31.511768"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<IPython.core.display.Javascript object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\">"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Exponential\n",
    "x = np.linspace(-2, 5, 100)\n",
    "y = np.exp(-x)\n",
    "\n",
    "ax = plt.plot(x, y)\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:10:37.732124",
     "start_time": "2017-08-02T13:10:37.685121"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<IPython.core.display.Javascript object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\">"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Sigmoid \n",
    "x = np.linspace(-10, 10, 100)\n",
    "y = 1/(1 + np.exp(-x))\n",
    "\n",
    "ax = plt.plot(x, y)\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Decision Boundary [TOFIX]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 44,
   "metadata": {
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "h_0 = lambda x : t_0 + (t_1 * x[0]) + (t_2 * x[1])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 46,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<IPython.core.display.Javascript object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\">"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "t_0 = -3\n",
    "t_1 = 1\n",
    "t_2 = 1\n",
    "x_1 = np.arange(5)\n",
    "x_2 = np.arange(5)\n",
    "res = np.dstack(np.meshgrid(x_1, x_2)).reshape(-1, 2)\n",
    "s_1 = filter(lambda x : h_0((x[0],x[1]))>=0, res)\n",
    "s_2 = filter(lambda x : h_0((x[0],x[1]))<0, res)\n",
    "\n",
    "m = ['+','o']\n",
    "for i, s in enumerate([s_1, s_2]):\n",
    "    x_1, x_2 = list(map(np.array, zip(*s)))\n",
    "    sns.regplot(x_1, x_2, fit_reg=False, marker=m[i])\n",
    "    sns.plt.show()"
   ]
  },
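  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Since the hypothesis is linear in the features, the boundary itself has a closed form: setting $h(x) = 0$ and solving for $x_2$ gives $x_2 = (-t_0 - t_1 x_1)/t_2$. A minimal sketch reusing the `t_0`, `t_1`, `t_2` values defined above:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Closed-form boundary: h(x) = 0  =>  x_2 = (-t_0 - t_1*x_1) / t_2\n",
    "xs = np.linspace(0, 4, 50)\n",
    "boundary = (-t_0 - t_1 * xs) / t_2\n",
    "plt.plot(xs, boundary, 'k--', label='decision boundary')\n",
    "plt.legend()\n",
    "plt.show()"
   ]
  },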
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Simulate Data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 82,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:57:34.223218",
     "start_time": "2017-08-02T13:57:34.211217"
    },
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "iris = datasets.load_iris()\n",
    "X = iris.data[:, :2]  # we only take the first two features.\n",
    "Y = iris.target\n",
    "# Replace label 2 with value 1, so we have only two classes to predict\n",
    "np.place(Y, Y==2, 1)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 83,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:57:35.120269",
     "start_time": "2017-08-02T13:57:35.089268"
    },
    "collapsed": false,
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>feat_1</th>\n",
       "      <th>feat_2</th>\n",
       "      <th>class</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>5.1</td>\n",
       "      <td>3.5</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>4.9</td>\n",
       "      <td>3.0</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>4.7</td>\n",
       "      <td>3.2</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>4.6</td>\n",
       "      <td>3.1</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>5.0</td>\n",
       "      <td>3.6</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "   feat_1  feat_2  class\n",
       "0     5.1     3.5      0\n",
       "1     4.9     3.0      0\n",
       "2     4.7     3.2      0\n",
       "3     4.6     3.1      0\n",
       "4     5.0     3.6      0"
      ]
     },
     "execution_count": 83,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "df = pd.DataFrame(X, columns=['feat_1', 'feat_2'])\n",
    "df['class'] = Y\n",
    "df.head()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 84,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:57:35.939316",
     "start_time": "2017-08-02T13:57:35.853311"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<IPython.core.display.Javascript object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\">"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "sns.regplot(x='feat_1', y='feat_2', data=df[df['class']==0], color='g', fit_reg=False)\n",
    "sns.regplot(x='feat_1', y='feat_2', data=df[df['class']==1], color='b', fit_reg=False)\n",
    "sns.plt.legend(['0 Class', '1 Class'])\n",
    "sns.plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Logistic Regression (Sklearn)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "from sklearn import metrics"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "logreg = linear_model.LogisticRegression(C=1e5)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "LogisticRegression(C=100000.0, class_weight=None, dual=False,\n",
       "          fit_intercept=True, intercept_scaling=1, max_iter=100,\n",
       "          multi_class='ovr', n_jobs=1, penalty='l2', random_state=None,\n",
       "          solver='liblinear', tol=0.0001, verbose=0, warm_start=False)"
      ]
     },
     "execution_count": 17,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "logreg.fit(X, Y)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "predictions = logreg.predict(X)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "1.0"
      ]
     },
     "execution_count": 21,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "metrics.accuracy_score(Y, predictions)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Gradient Descent"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Fit model using gradient descent.\n",
    "\n",
    "For the cost function we can rely on cross-entropy loss, which for binary cases is:\n",
    "\n",
    "$$\n",
    "L(y,\\hat{y})\\ =\\ -y\\log {\\hat  {y}}-(1-y)\\log(1-{\\hat  {y}})\n",
    "$$\n",
    "\n",
    "[Ref 1](http://aimotion.blogspot.ie/2011/11/machine-learning-with-python-logistic.html)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:18:42.361843",
     "start_time": "2017-08-02T13:18:42.352843"
    },
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# Sigmoid function\n",
    "def sigmoid(X):\n",
    "    res = 1.0 / (1.0 + np.exp(-1.0 * X))\n",
    "    return res"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 51,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:35:22.943073",
     "start_time": "2017-08-02T13:35:22.932073"
    },
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "# Cost for single prediction\n",
    "def compute_cost(X, y_true, theta):\n",
    "    m = len(y_true)\n",
    "    \n",
    "    y_pred = sigmoid(X.dot(theta).flatten())\n",
    "    \n",
    "    # Simplified\n",
    "    #if y_true == 1:\n",
    "    #    return -log(y_pred)\n",
    "    #else:\n",
    "    #    return -log(1 - y_pred)\n",
    "    \n",
    "    # One liner\n",
    "    cost = ((-y_true.T.dot(np.log(y_pred)) - \n",
    "             (1-y_true).T.dot(np.log(1-y_pred))) /(1.0*m))\n",
    "    return cost"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 52,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:35:23.398099",
     "start_time": "2017-08-02T13:35:23.393099"
    },
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# single gradient descent step\n",
    "def gradient_descent_step(X, y, theta, alpha):\n",
    "    m = len(y)\n",
    "    # compute predictions\n",
    "    pred = sigmoid(X.dot(theta).flatten())\n",
    "    \n",
    "    # get error\n",
    "    errors = -np.sum((y-pred)*X.T, axis=1).reshape(3,1)\n",
    "    \n",
    "    theta -= alpha * (errors/m)\n",
    "    \n",
    "    return theta"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 53,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:35:24.714175",
     "start_time": "2017-08-02T13:35:24.706174"
    },
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "# run an entire training cycle\n",
    "def train(X, y, alpha, iters):\n",
    "    cost_history = np.zeros(shape=(iters, 1))\n",
    "    theta_history = []\n",
    "    \n",
    "    # our parameters are slope and intercepts (bias)\n",
    "    theta = np.random.randn(3, 1)\n",
    "    for i in range(iters):\n",
    "        theta = gradient_descent_step(X, y, theta, alpha)\n",
    "        \n",
    "        cost_history[i, 0] = compute_cost(X, y, theta)\n",
    "        theta_history.append(theta.copy())\n",
    "    \n",
    "    return theta_history, cost_history"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 89,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:58:53.957779",
     "start_time": "2017-08-02T13:58:53.941778"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(150, 3)\n",
      "(150,)\n"
     ]
    }
   ],
   "source": [
    "# Parameter learning\n",
    "\n",
    "# input data including bias\n",
    "iris = datasets.load_iris()\n",
    "X = iris.data[:, :3]\n",
    "X[:, 2] = 1\n",
    "y = iris.target\n",
    "# Replace label 2 with value 1, so we have only two classes to predict\n",
    "np.place(y, y==2, 1)\n",
    "\n",
    "print(X.shape)\n",
    "print(y.shape)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 55,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:35:26.541279",
     "start_time": "2017-08-02T13:35:26.483276"
    },
    "collapsed": false
   },
   "outputs": [],
   "source": [
    "alpha = 0.01\n",
    "epochs = 1000\n",
    "theta_history, cost_history = train(X, y, alpha, epochs)"
   ]
  },
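  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check (our own addition, reusing the `sigmoid` function and `theta_history` from above), threshold the model output at 0.5 and compare the resulting predictions against the labels:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Accuracy of the final learned parameters\n",
    "theta_final = theta_history[-1]\n",
    "probs = sigmoid(X.dot(theta_final).flatten())\n",
    "preds = (probs >= 0.5).astype(int)\n",
    "print('Accuracy: {:.3f}'.format(np.mean(preds == y)))"
   ]
  },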
  {
   "cell_type": "code",
   "execution_count": 57,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T13:37:12.966366",
     "start_time": "2017-08-02T13:37:12.875361"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<IPython.core.display.Javascript object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\">"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Plot history\n",
    "fig, axes = plt.subplots(2, 1)\n",
    "# plot cost\n",
    "axes[0].set_title('Cost History')\n",
    "axes[0].plot(cost_history.reshape(-1))\n",
    "axes[0].set_ylabel(\"cost\")\n",
    "# plot theta\n",
    "axes[1].set_title('Theta History')\n",
    "for t_idx in range(len(theta_history[0])):\n",
    "    axes[1].plot([t[t_idx] for t in theta_history], label='theta_{}'.format(t_idx))\n",
    "axes[1].set_xlabel(\"epoch\")\n",
    "plt.legend()\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Training Animation"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 95,
   "metadata": {
    "ExecuteTime": {
     "end_time": "2017-08-02T14:01:49.810837",
     "start_time": "2017-08-02T14:01:49.671829"
    },
    "collapsed": false
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<IPython.core.display.Javascript object>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\">"
      ],
      "text/plain": [
       "<IPython.core.display.HTML object>"
      ]
     },
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "alpha = 0.01\n",
    "epochs = 100\n",
    "\n",
    "# Plot SGD animation\n",
    "fig, ax = sns.plt.subplots(figsize=(8, 6))\n",
    "xx, yy = np.mgrid[0:10:.5, 0:10:.5]\n",
    "grid = np.c_[xx.ravel(), yy.ravel()]\n",
    "X_grid = np.ones(shape=(len(xx)*len(yy), 3))\n",
    "X_grid[:, :2] = grid\n",
    "theta = np.random.randn(3, 1)\n",
    "pred = sigmoid(X_grid.dot(theta).flatten()).reshape(xx.shape)\n",
    "contour = ax.contourf(xx, yy, pred, 25, cmap=\"RdBu\",\n",
    "                      vmin=0, vmax=1)\n",
    "sns.regplot(x='feat_1', y='feat_2', data=df[df['class']==0], color='g', fit_reg=False)\n",
    "sns.regplot(x='feat_1', y='feat_2', data=df[df['class']==1], color='b', fit_reg=False)\n",
    "ax_c = fig.colorbar(contour)\n",
    "ax_c.set_ticks([0, .25, .5, .75, 1])\n",
    "epoch_text = ax.text(0, 0, \"Epoch 0\")\n",
    "\n",
    "def animate(i):\n",
    "    global X, y, theta, alpha, df\n",
    "    theta = gradient_descent_step(X, y, theta, alpha)\n",
    "\n",
    "    pred = sigmoid(X_grid.dot(theta).flatten()).reshape(xx.shape)\n",
    "    contour = ax.contourf(xx, yy, pred, 25, cmap=\"RdBu\",\n",
    "                      vmin=0, vmax=1)\n",
    "    cost = compute_cost(X, y, theta)\n",
    "    epoch_text.set_text(\"Epoch {}, cost {:.3f}\".format(i, cost))\n",
    "    sns.regplot(x='feat_1', y='feat_2', data=df[df['class']==0], color='g', fit_reg=False)\n",
    "    sns.regplot(x='feat_1', y='feat_2', data=df[df['class']==1], color='b', fit_reg=False)\n",
    "    return epoch_text,\n",
    "\n",
    "ani = animation.FuncAnimation(fig, animate, epochs, interval=1, repeat=False)"
   ]
  }
 ],
 "metadata": {
  "anaconda-cloud": {},
  "kernelspec": {
   "display_name": "Python [conda env:kaggle]",
   "language": "python",
   "name": "conda-env-kaggle-py"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3.0
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.5.2"
  },
  "toc": {
   "colors": {
    "hover_highlight": "#DAA520",
    "running_highlight": "#FF0000",
    "selected_highlight": "#FFD700"
   },
   "moveMenuLeft": true,
   "nav_menu": {
    "height": "123px",
    "width": "252px"
   },
   "navigate_menu": true,
   "number_sections": true,
   "sideBar": true,
   "threshold": 4.0,
   "toc_cell": false,
   "toc_section_display": "block",
   "toc_window_display": false
  }
 },
 "nbformat": 4,
 "nbformat_minor": 0
}