{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Ungraded Lab - Gradient for regularized logistic regression\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Goals\n",
    "In this lab you will:\n",
     "- extend the gradient computation to include regularization\n",
     "- write a version using loops\n",
     "- optionally write a vectorized version\n",
     "- compare the speed of the two versions\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from lab_utils import sigmoid"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Looping Version\n",
    "\n",
     "The gradient of the regularized cost function is the partial derivative of the cost with respect to the parameters $w$ and $b$:\n",
     "\n",
     "$$\\frac{\\partial J(\\mathbf{w},b)}{\\partial b} = \\frac{1}{m}  \\sum_{i=0}^{m-1} (f_{\\mathbf{w},b}(\\mathbf{x}^{(i)}) - y^{(i)})  \\tag {1}$$\n",
    "\n",
    "$$\\frac{\\partial J(\\mathbf{w},b)}{\\partial w_j} = \\left( \\frac{1}{m}  \\sum_{i=0}^{m-1} (f_{\\mathbf{w},b}(\\mathbf{x}^{(i)}) - y^{(i)}) x_j^{(i)} \\right) +  \\frac{\\lambda}{m} w_j  \\quad\\, \\mbox{for $j=0...(n-1)$} \\tag {2}$$\n",
    "\n",
    "\n",
    "You will implement a function called `compute_gradient_reg` which will return $\\frac{\\partial J(\\mathbf{w},b)}{\\partial w},\\frac{\\partial J(\\mathbf{w},b)}{\\partial b}$."
   ]
  },
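   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "Note that the regularization term is added only to the $w_j$ gradients (equation 2); $b$ is not regularized. A minimal numeric sketch of just that term, using illustrative values for `w`, `lambda_`, and `m`:\n",
     "\n",
     "```python\n",
     "import numpy as np\n",
     "\n",
     "lambda_, m = 1.0, 5                   # illustrative regularization constant and example count\n",
     "w = np.array([[0.5], [-0.2], [0.1]])  # illustrative parameters, shape (n,1)\n",
     "reg_term = (lambda_ / m) * w          # added element-wise to dJdw; dJdb is unchanged\n",
     "print(reg_term)\n",
     "```"
    ]
   },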
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Please complete the `compute_gradient_reg` function:\n",
    "\n",
    "- Calculate the gradient for each element `dJdw` and `dJdb` exactly as you did in the `compute_gradient` function in earlier labs:\n",
    "    - initialize variables to accumulate `dJdw` and `dJdb`\n",
    "    - loop over all examples\n",
     "        - calculate the error for that example $g(\\mathbf{x}^{(i)T}\\mathbf{w} + b) - y^{(i)}$\n",
    "        - add the error to `dJdb` (equation 1 above)\n",
    "        - for each input value $x_{j}^{(i)}$ in this example,  \n",
    "            - multiply the error by the input  $x_{j}^{(i)}$, and add to the corresponding element of `dJdw`. (equation 2 above)\n",
     "    - divide `dJdb` and `dJdw` by the total number of examples (m)\n",
    "- Now compute the regularization term\n",
    "    - loop over all $w$\n",
     "        - add $\\frac{\\lambda}{m} w_j$ to the corresponding element of `dJdw`\n",
    "\n",
     "As you are doing this, remember that the variables X and y are not scalar values but arrays of shape ($m, n$) and ($m$, 1) respectively, where $n$ is the number of features and $m$ is the number of training examples. \n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<details>\n",
    "  <summary><font size=\"2\" color=\"darkgreen\"><b>Hints</b></font></summary>\n",
    "    \n",
    "```python     \n",
    "def compute_gradient_reg(X, y, w, b, lambda_ = 1): \n",
    "    \"\"\"\n",
     "    Computes the regularized gradient for logistic regression \n",
     " \n",
     "    Args:\n",
     "      X : (array_like Shape (m,n)) matrix of examples \n",
    "      y : (array_like Shape (m,1)) actual value \n",
    "      w : (array_like Shape (n,1)) values of parameters of the model      \n",
    "      b : (scalar)                 value of parameter of the model  \n",
    "      lambda_ : (scalar,float)      regularization constant\n",
    "    Returns\n",
    "      dJdw: (array_like Shape (n,1)) The gradient of the cost w.r.t. the parameters w. \n",
    "      dJdb: (scalar)                The gradient of the cost w.r.t. the parameter b. \n",
    "    \"\"\"\n",
    "    m,n = X.shape\n",
    "    dJdw = np.zeros((n,1))\n",
    "    dJdb = 0.\n",
    "    err  = 0.\n",
    "\n",
    "    ### START CODE HERE ### \n",
    "\n",
    "    for i in range(m):\n",
    "        err = sigmoid(X[i] @ w + b)  - y[i]    \n",
    "        for j in range(n):\n",
    "            dJdw[j] = dJdw[j] + err * X[i][j]\n",
    "        dJdb = dJdb + err\n",
    "    dJdw = dJdw/m\n",
    "    dJdb = dJdb/m\n",
    "    \n",
    "    for j in range(n):\n",
    "        dJdw[j] = dJdw[j] + (lambda_/m) * w[j]\n",
    "\n",
    "\n",
    "    ### END CODE HERE ###         \n",
    "        \n",
    "    return dJdb[0],dJdw  #index dJdb to return scalar value\n",
    "```\n",
    "</details>"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def compute_gradient_reg(X, y, w, b, lambda_ = 1): \n",
    "    \"\"\"\n",
     "    Computes the regularized gradient for logistic regression \n",
     " \n",
     "    Args:\n",
     "      X : (array_like Shape (m,n)) matrix of examples \n",
    "      y : (array_like Shape (m,1)) actual value \n",
    "      w : (array_like Shape (n,1)) values of parameters of the model      \n",
    "      b : (scalar)                 value of parameter of the model  \n",
    "      lambda_ : (scalar,float)      regularization constant\n",
    "    Returns\n",
    "      dJdw: (array_like Shape (n,1)) The gradient of the cost w.r.t. the parameters w. \n",
    "      dJdb: (scalar)                The gradient of the cost w.r.t. the parameter b. \n",
    "    \"\"\"\n",
    "    m,n = X.shape\n",
    "    dJdw = np.zeros((n,1))\n",
    "    dJdb = 0.\n",
    "    err  = 0.\n",
    "\n",
    "    ### START CODE HERE ### \n",
    "    ### BEGIN SOLUTION ###\n",
    "    for i in range(m):\n",
    "        err = sigmoid(X[i] @ w + b)  - y[i]    \n",
    "        for j in range(n):\n",
    "            dJdw[j] = dJdw[j] + err * X[i][j]\n",
    "        dJdb = dJdb + err\n",
    "    dJdw = dJdw/m\n",
    "    dJdb = dJdb/m\n",
    "    \n",
    "    for j in range(n):\n",
    "        dJdw[j] = dJdw[j] + (lambda_/m) * w[j]\n",
    "\n",
    "    ### END SOLUTION ### \n",
    "    ### END CODE HERE ###         \n",
    "        \n",
    "    return dJdb[0],dJdw  #index dJdb to return scalar value"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Run the cell below to check your implementation of the `compute_gradient_reg` function."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "np.random.seed(1)\n",
    "X_tmp = np.random.rand(5,3)\n",
    "y_tmp = np.array([0,1,0,1,0]).reshape(-1,1)\n",
    "initial_w  = np.random.rand(X_tmp.shape[1]).reshape(-1,1)\n",
    "initial_b = 0.5\n",
    "lambda_ = 1\n",
    "dJdb, dJdw =  compute_gradient_reg(X_tmp, y_tmp, initial_w, initial_b, lambda_)\n",
    "\n",
    "print(f\"dJdb: {dJdb}\", )\n",
    "print(f\"Regularized dJdw:\\n {dJdw.tolist()}\", )"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**Expected Output**\n",
    "```\n",
    "dJdb: 0.341798994972791\n",
    "Regularized dJdw:\n",
    " [[0.2140281799506471], [0.34511336695769707], [0.1412845236752601]]\n",
    " ```"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Vectorized Version\n",
    "\n",
     "Using the equations above and your previous experience with vectorized gradient descent in Lab 4, write `compute_gradient_reg_matrix`.\n",
     "\n",
     "The first section is the same as in Lab 4. Add a final section which adds $\\frac{\\lambda}{m} w_j$ to the corresponding element of `dJdw`."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<details>\n",
    "  <summary><font size=\"2\" color=\"darkgreen\"><b>Hints</b></font></summary>\n",
    "    \n",
    "```python     \n",
    "def compute_gradient_reg_matrix(X, y, w, b, lambda_= 1): \n",
    "    \"\"\"\n",
     "    Computes the regularized gradient for logistic regression \n",
     " \n",
     "    Args:\n",
     "      X : (array_like Shape (m,n)) matrix of examples \n",
     "      y : (array_like Shape (m,1)) actual value \n",
     "      w : (array_like Shape (n,1)) Values of parameters of the model      \n",
     "      b : (scalar )                Values of parameter of the model      \n",
     "      lambda_ : (scalar,float)     regularization constant\n",
    "    Returns\n",
    "      dJdw: (array_like Shape (n,1)) The gradient of the cost w.r.t. the parameters w. \n",
    "      dJdb: (scalar)                The gradient of the cost w.r.t. the parameter b. \n",
    "                                  \n",
    "    \"\"\"\n",
    "    m,n = X.shape\n",
    "    ### START CODE HERE ### \n",
    "\n",
    "    f_wb = sigmoid(X @ w + b)      \n",
    "    e   = f_wb - y                 \n",
    "    dJdw  = (1/m) * (X.T @ e)      \n",
    "    dJdb  = (1/m) * np.sum(e)  \n",
    "    dJdw += (lambda_/m) * w\n",
    "\n",
    "    ### END CODE HERE ###         \n",
    "   \n",
    "    return dJdb,dJdw\n",
    "```\n",
    "</details>"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def compute_gradient_reg_matrix(X, y, w, b, lambda_= 1): \n",
    "    \"\"\"\n",
     "    Computes the regularized gradient for logistic regression \n",
     " \n",
     "    Args:\n",
     "      X : (array_like Shape (m,n)) matrix of examples \n",
     "      y : (array_like Shape (m,1)) actual value \n",
     "      w : (array_like Shape (n,1)) Values of parameters of the model      \n",
     "      b : (scalar )                Values of parameter of the model      \n",
     "      lambda_ : (scalar,float)     regularization constant\n",
    "    Returns\n",
    "      dJdw: (array_like Shape (n,1)) The gradient of the cost w.r.t. the parameters w. \n",
    "      dJdb: (scalar)                The gradient of the cost w.r.t. the parameter b. \n",
    "                                  \n",
    "    \"\"\"\n",
    "    m,n = X.shape\n",
    "    ### START CODE HERE ### \n",
    "    ### BEGIN SOLUTION ###\n",
    "    f_wb = sigmoid(X @ w + b)      \n",
    "    e   = f_wb - y                 \n",
    "    dJdw  = (1/m) * (X.T @ e)      \n",
    "    dJdb  = (1/m) * np.sum(e)  \n",
    "    dJdw += (lambda_/m) * w\n",
    "    ### END SOLUTION ### \n",
    "    ### END CODE HERE ###         \n",
    "   \n",
    "    return dJdb,dJdw"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "np.random.seed(1)\n",
    "X_tmp = np.random.rand(5,3)\n",
    "y_tmp = np.array([0,1,0,1,0]).reshape(-1,1)\n",
    "initial_w  = np.random.rand(X_tmp.shape[1]).reshape(-1,1)\n",
    "initial_b = 0.5\n",
    "lambda_ = 1\n",
    "dJdb, dJdw =  compute_gradient_reg_matrix(X_tmp, y_tmp, initial_w, initial_b, lambda_)\n",
    "\n",
    "print(f\"dJdb: {dJdb}\", )\n",
    "print(f\"Regularized dJdw:\\n {dJdw.tolist()}\", )"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**Expected Output**\n",
    "```\n",
    "dJdb: 0.34179899497279104\n",
    "Regularized dJdw:\n",
    " [[0.2140281799506471], [0.3451133669576971], [0.14128452367526012]]\n",
    " ```"
   ]
  },
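   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "Before comparing speed, it is worth confirming that the loop and vectorized versions agree numerically. The sketch below is self-contained for illustration (it redefines `sigmoid` and minimal versions of both gradients from equations 1 and 2); in this notebook you can simply compare the outputs of your own `compute_gradient_reg` and `compute_gradient_reg_matrix` instead:\n",
     "\n",
     "```python\n",
     "import numpy as np\n",
     "\n",
     "def sigmoid(z):\n",
     "    return 1 / (1 + np.exp(-z))\n",
     "\n",
     "def grad_loop(X, y, w, b, lambda_):    # looping version of equations (1) and (2)\n",
     "    m, n = X.shape\n",
     "    dJdw = np.zeros((n, 1))\n",
     "    dJdb = 0.\n",
     "    for i in range(m):\n",
     "        err = sigmoid(X[i] @ w + b) - y[i]\n",
     "        for j in range(n):\n",
     "            dJdw[j] += err * X[i][j]\n",
     "        dJdb += err\n",
     "    return dJdb[0] / m, dJdw / m + (lambda_ / m) * w\n",
     "\n",
     "def grad_matrix(X, y, w, b, lambda_):  # vectorized version\n",
     "    m = X.shape[0]\n",
     "    e = sigmoid(X @ w + b) - y\n",
     "    return (1 / m) * np.sum(e), (1 / m) * (X.T @ e) + (lambda_ / m) * w\n",
     "\n",
     "np.random.seed(1)\n",
     "X = np.random.rand(5, 3)\n",
     "y = np.array([0, 1, 0, 1, 0]).reshape(-1, 1)\n",
     "w = np.random.rand(3).reshape(-1, 1)\n",
     "db1, dw1 = grad_loop(X, y, w, 0.5, 1)\n",
     "db2, dw2 = grad_matrix(X, y, w, 0.5, 1)\n",
     "print(np.allclose(db1, db2), np.allclose(dw1, dw2))\n",
     "```"
    ]
   },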
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
     "## Speed Test Your Implementations"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import time\n",
    "\n",
    "np.random.seed(1)\n",
    "m = 500; n=100\n",
    "X_tmp = np.random.rand(m,n)\n",
    "y_tmp = np.zeros((m,1))\n",
    "initial_w  = np.random.rand(X_tmp.shape[1]).reshape(-1,1)\n",
    "initial_b = 0.5\n",
    "lambda_ = 1\n",
    "\n",
    "start = time.time()\n",
    "for i in range(10):\n",
    "    dJdb, dJdw =  compute_gradient_reg(X_tmp, y_tmp, initial_w, initial_b, lambda_)\n",
    "end = time.time()\n",
     "print(f\"non-matrix time  {end - start}\")\n",
    "\n",
    "\n",
    "start = time.time()\n",
    "for i in range(10):\n",
    "    dJdb, dJdw =  compute_gradient_reg_matrix(X_tmp, y_tmp, initial_w, initial_b, lambda_)\n",
    "end = time.time()\n",
    "print(f\"matrix time      {end - start}\")\n"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
