{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### Learn how to sell, write blogposts, show your work"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Drawbacks of random forest"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "* Rf works with clever nearest neighbours, so not good where we need extrapolation.  \n",
    "* Grocery sales competetion where there is time series component . \n",
    "* "
   ]
  },
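  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of the extrapolation drawback (toy data of my own, not from the lesson): a random forest fit on a linear trend predicts fine inside the training range but plateaus outside it, because every prediction is an average of training targets."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.ensemble import RandomForestRegressor\n",
    "\n",
    "x = np.linspace(0, 10, 100).reshape(-1, 1)\n",
    "y = 2 * x.ravel()  # simple linear trend y = 2x\n",
    "rf = RandomForestRegressor(n_estimators=10, random_state=0).fit(x, y)\n",
    "\n",
    "print(rf.predict([[5.0]]))   # inside the training range: close to 10\n",
    "print(rf.predict([[20.0]]))  # outside: stuck near max(y) = 20, not 40"
   ]
  },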
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Intro to neural nets"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "* **pickle** - saving some object, for any python object (not optimal for all e.g. not for pandas)  \n",
    "* **destructuring** -- splitting x,y  \n",
    "* vector = rank 1 tensor, matrix = rank 2 tensor, 3D matrix = rank 3 tensor . \n",
    "* row = dimension 0 (axis = 0), columns = dimension 1 (axis = 1) . \n",
    "* Random forest is one algo for which we can completely ignore normalization as it depends on the order not actual values (think of splits) (general idea - algos needing trees) . \n",
    "* Normalization matters in neural nets/ DL . (matters for k-near neighbours)  \n",
    "* **reshape(-1)** - figure out itself about number of dimensions . \n",
    "* **logistic regression** is literally a neural net with 1 layer  \n",
    "* think **torch** like **numpy** for pytorch . \n",
    "* we have **view** in torch like **reshape** in numpy  \n",
    "* "
   ]
  },
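  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick sketch of the axis / reshape points above, using numpy only (torch's **view** behaves like numpy's **reshape**):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "m = np.array([[1, 2, 3],\n",
    "              [4, 5, 6]])       # matrix = rank-2 tensor, shape (2, 3)\n",
    "\n",
    "print(m.sum(axis=0))            # collapse axis 0 (rows): [5 7 9]\n",
    "print(m.sum(axis=1))            # collapse axis 1 (columns): [6 15]\n",
    "print(m.reshape(-1))            # -1: infer the size -> flat, 6 elements\n",
    "print(m.reshape(3, -1).shape)   # infer one dimension: (3, 2)"
   ]
  },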
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Caveats of neural network \n",
    "(michael neelson universal approx)\n",
    "\n",
    "* for continuous functions only. not with discountinuour or with jumps  \n",
    "* "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Blog ideas --\n",
    "* why normalization matters in neural net, not in rf. Also talk about same normalization for same training and validation . \n",
    "* can talk about where need -1 in reshape, slice, reorder . \n",
    "* something from rf from scratch . (about tree interpretor, feature importance, pdp) . \n",
    "* -ve log likelihood cost = cross entroy (logistic loss fn) -- can be binary and multi class . \n",
    "* "
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## HW\n",
    "* rewrite loss fn with if statement  \n",
    "* read from Michael Nielsen blog  \n",
    "* "
   ]
  },
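  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A sketch of the first homework item (binary case; function names are my own, y is 0 or 1 and p is the predicted probability): the negative log likelihood written with an if statement, next to the usual one-line cross-entropy formula."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import math\n",
    "\n",
    "def nll_if(y, p):\n",
    "    # loss written with an if statement: penalize -log of the\n",
    "    # probability assigned to the true class\n",
    "    if y == 1:\n",
    "        return -math.log(p)\n",
    "    else:\n",
    "        return -math.log(1 - p)\n",
    "\n",
    "def nll_formula(y, p):\n",
    "    # the same loss as the usual one-line cross-entropy expression\n",
    "    return -(y * math.log(p) + (1 - y) * math.log(1 - p))\n",
    "\n",
    "print(nll_if(1, 0.9), nll_formula(1, 0.9))  # identical values"
   ]
  },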
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "Dl questions\n",
    "sumproduct, matmul\n",
    "\n"
   ]
  }
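,
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A sketch for the sum-product vs matmul question (toy matrices of my own): a matrix multiply is a grid of sum-products, each output cell being the dot product of one row of a with one column of b."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "a = np.array([[1, 2], [3, 4]])\n",
    "b = np.array([[5, 6], [7, 8]])\n",
    "\n",
    "# one cell of the result: sum-product of row 0 of a with column 0 of b\n",
    "print(a[0] @ b[:, 0])  # 1*5 + 2*7 = 19\n",
    "\n",
    "# the full matmul computes every such sum-product at once\n",
    "print(a @ b)"
   ]
  }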
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
