{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Assignment-03 First Step of Machine Learning: Model and Evaluation"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Part-2 Question and Answer 问答"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 1. What's the *model*? Why are all models wrong, but some are useful? (5 points)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Ans: \n",
    "##### What's the *model*? \n",
    "1. 本质上是一个函数，实现从样本x到标签y的映射\n",
    "2. 解决问题的一种人为构造的方法，不同类型的问题可以定义不同的模型\n",
    "3. 一般基于假设的前提，然后通过数据不断更新参数\n",
    "* 参考网址: <https://www.zhihu.com/question/285520177>\n",
    "\n",
    "##### why  all the models are wrong, but some are useful?\n",
    "1. 此话来自于George E. P. Box\n",
    "2. 受影响的features很多，但是往往我们考虑的点都很片面\n",
    "3. 唯一正确的模型是现实本身\n",
    "* 参考网址：<https://www.zhihu.com/question/35786261>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<评阅点>\n",
    "> + 对模型的理解是否正确,对模型的抽象性是否正确(5')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 2. What are underfitting and overfitting? List the reasons that could make a model overfit or underfit. (10 points)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Ans:\n",
    "\n",
    "*过拟合：*特征纬度过多，或者模型过于复杂，训练迭代次数过多，导致的模型过度学习，拟合的函数完美的经过训练集，但是对新数据的预测结果则较差。\n",
    "*欠拟合：*特征纬度过少，或者模型训练次数太少，导致模型没有很好的学习，模型泛化效果不好，拟合的函数无法满足训练集，误差较大。"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<评阅点>\n",
    "> + 对过拟合和欠拟合的理解是否正确 (3')\n",
    "+ 对欠拟合产生的原因是否理解正确(2')\n",
    "+ 对过拟合产生的原因是否理解正确(5')"
   ]
  },
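  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The two failure modes can be illustrated with a small sketch (synthetic data, purely illustrative, not part of the graded answer): a degree-1 polynomial underfits a cubic signal, while a degree-15 polynomial drives the training error toward zero by fitting the noise:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "rng = np.random.RandomState(0)\n",
    "x = np.linspace(-1, 1, 20)\n",
    "y = x ** 3 + 0.1 * rng.randn(20)        # cubic signal plus noise\n",
    "x_test = np.linspace(-1, 1, 50)\n",
    "y_test = x_test ** 3                    # noise-free test targets\n",
    "\n",
    "train_err, test_err = {}, {}\n",
    "for degree in (1, 3, 15):\n",
    "    coef = np.polyfit(x, y, degree)\n",
    "    train_err[degree] = np.mean((np.polyval(coef, x) - y) ** 2)\n",
    "    test_err[degree] = np.mean((np.polyval(coef, x_test) - y_test) ** 2)\n",
    "    print(degree, train_err[degree], test_err[degree])\n",
    "```\n",
    "\n",
    "Training error only goes down as the degree grows, while test error is smallest near the true model complexity."
   ]
  },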
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 3. What are precision, recall, AUC, F1, and the F2 score? What does each mainly focus on? (12')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Ans:\n",
    "```\n",
    "真正例（True Positive，TP）：真实类别为正例，预测类别为正例。\n",
    "假正例（False Positive，FP）：真实类别为负例，预测类别为正例。\n",
    "假负例（False Negative，FN）：真实类别为正例，预测类别为负例。\n",
    "真负例（True Negative，TN）：真实类别为负例，预测类别为负例。\n",
    "```\n",
    "**precision:** 预测正确的正例数据占预测为正例数据的比例, 看重预测正样本预测的准确度\n",
    "$$ P=\\frac{TP}{TP+FP}$$ \n",
    "\n",
    "**recall:** 预测为正例的数据占实际为正例数据的比例, 看重真实为正样本被正确预测的比例\n",
    "\n",
    "$$ R=\\frac{TP}{TP+FN} $$  \n",
    "**AUC:** Area Under roc Curve，就是ROC曲线的积分，也是ROC曲线下面的面积。一般用于分类问题的评估。\n",
    "\n",
    "\n",
    "**F1:** F1的核心思想在于，在尽可能地提高精确度(Precision)和召回率(Recall)的同时，也希望两者之间的差异尽可能的小。\n",
    "\n",
    "$$ F1 = \\frac{2*precision*recall}{precision + recall} $$\n",
    "\n",
    "\n",
    "\n",
    "**Fi:**\n",
    "    $$ F_i = (1+i)^2 * \\frac{P*R}{i^2*P + R} $$"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "* 参考网址：<https://www.zhihu.com/question/30643044>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<评阅点>\n",
    "> + 对precision, recall, AUC, F1, F2 理解是否正确(6‘)\n",
    "+ 对precision, recall, AUC, F1, F2的使用侧重点是否理解正确 (6’)"
   ]
  },
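  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick numeric sanity check of the formulas above (the counts TP=8, FP=2, FN=4 are made up for illustration):\n",
    "\n",
    "```python\n",
    "def f_beta(tp, fp, fn, beta=1.0):\n",
    "    p = tp / (tp + fp)                  # precision\n",
    "    r = tp / (tp + fn)                  # recall\n",
    "    return (1 + beta ** 2) * p * r / (beta ** 2 * p + r)\n",
    "\n",
    "print(f_beta(8, 2, 4, beta=1))          # F1\n",
    "print(f_beta(8, 2, 4, beta=2))          # F2, weights recall more heavily\n",
    "```\n",
    "\n",
    "Because recall (2/3) is lower than precision (0.8) here, F2 comes out below F1."
   ]
  },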
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 4. Based on our course and yourself mind, what's the machine learning?  (8')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Ans:\n",
    "\n",
    "1. 机器学习，是让机器学会人的思维，给其数据、特征，让机器学会人“识别事物的方法”。\n",
    "2. 机器学习就是模仿人识别事物的过程，即：学习、提取特征、识别、分类。\n",
    "3. 不同于分析式编程，传统的分析式编程，是对数据进行编程，人为的选取合适的模型方法，进行编程。"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<评阅点> 开放式问题，是否能说出来机器学习这种思维方式和传统的分析式编程的区别（8'）"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 5. \"正确定义了机器学习模型的评价标准(evaluation)， 问题基本上就已经解决一半\". 这句话是否正确？你是怎么看待的？ (8‘)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "我觉得正确。针对不同的问题，有对应的衡量评价指标，选择正确合适的评估指标，有利于对于结果的观察，评估，从而做进一步改进优化。如果选取的不对，可能会导致后续评估观察等产生问题，不利于做进一步的改进"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<评阅点> 开放式问题，主要看能理解评价指标对机器学习模型的重要性."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Part-03 Programming Practice 编程练习"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 1. In our course and previous practice, we complete some importance components of Decision Tree. In this problem, you need to build a **completed** Decision Tree Model. You show finish a `predicate()` function, which accepts three parameters **<gender, income, family_number>**, and outputs the predicated 'bought': 1 or 0.  (20 points)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 35,
   "metadata": {},
   "outputs": [],
   "source": [
    "import pandas as pd"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 36,
   "metadata": {},
   "outputs": [],
   "source": [
    "from icecream import ic"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 37,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 38,
   "metadata": {},
   "outputs": [],
   "source": [
    "from collections import Counter"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 39,
   "metadata": {},
   "outputs": [],
   "source": [
    "def entropy(elements_list):\n",
    "    length = len(elements_list)\n",
    "    counter = Counter(elements_list)\n",
    "    probs = [counter[c] / length for c in set(elements_list)]\n",
    "#     print(probs)\n",
    "    ic(probs)\n",
    "    # 不写底数时默认以e为底\n",
    "    return -sum(p * np.log(p) for p in probs)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 52,
   "metadata": {},
   "outputs": [],
   "source": [
    "mock_data = {\n",
    "    'gender':['F', 'F', 'F', 'F', 'M', 'M', 'M'],\n",
    "    'income': ['+10', '-10', '+10', '+10', '+10', '+10', '-10'],\n",
    "    'family_number': [1, 1, 2, 1, 1, 1, 2],\n",
    "    'bought': [1, 1, 1, 0, 0, 0, 1],\n",
    "}\n",
    "dataset = pd.DataFrame.from_dict(mock_data)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 53,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>gender</th>\n",
       "      <th>income</th>\n",
       "      <th>family_number</th>\n",
       "      <th>bought</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>F</td>\n",
       "      <td>+10</td>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>F</td>\n",
       "      <td>-10</td>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>F</td>\n",
       "      <td>+10</td>\n",
       "      <td>2</td>\n",
       "      <td>1</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>F</td>\n",
       "      <td>+10</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>M</td>\n",
       "      <td>+10</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>5</th>\n",
       "      <td>M</td>\n",
       "      <td>+10</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>6</th>\n",
       "      <td>M</td>\n",
       "      <td>-10</td>\n",
       "      <td>2</td>\n",
       "      <td>1</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "  gender income  family_number  bought\n",
       "0      F    +10              1       1\n",
       "1      F    -10              1       1\n",
       "2      F    +10              2       1\n",
       "3      F    +10              1       0\n",
       "4      M    +10              1       0\n",
       "5      M    +10              1       0\n",
       "6      M    -10              2       1"
      ]
     },
     "execution_count": 53,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "dataset"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 57,
   "metadata": {},
   "outputs": [],
   "source": [
    "def find_optimal_spilter(train_data: pd.DataFrame, target: str) -> str:\n",
    "    # set(train_data.columns.tolist()): {'family_number', 'income', 'gender', 'bought'}\n",
    "    x_fields = set(train_data.columns.tolist()) - {target}\n",
    "#     print(x_fields)\n",
    "    spliter = None\n",
    "    min_entropy = float('inf')\n",
    "    for f in x_fields:\n",
    "        \"\"\"\n",
    "            values:\n",
    "                {'M', 'F'}\n",
    "                {1, 2}\n",
    "                {'-10', '+10'}\n",
    "                {0, 1}\n",
    "        \"\"\"\n",
    "        values = set(train_data[f])\n",
    "        print(values)\n",
    "        for v in values:\n",
    "            sub_spliter_1 = train_data[train_data[f] == v][target].tolist()\n",
    "            ic(sub_spliter_1)\n",
    "            entropy_1 = entropy(sub_spliter_1)\n",
    "            \n",
    "            sub_spliter_2 = train_data[train_data[f] != v][target].tolist()\n",
    "            ic(sub_spliter_2)\n",
    "            entropy_2 = entropy(sub_spliter_2)\n",
    "            \n",
    "            entropy_v = entropy_1 + entropy_2\n",
    "            ic(entropy_v)\n",
    "            \n",
    "            if entropy_v <= min_entropy:\n",
    "                min_entropy = entropy_v\n",
    "                spliter = (f, v)\n",
    "    print('spliter is: {}'.format(spliter))\n",
    "    print('the min entropy is: {}'.format(min_entropy))    \n",
    "    return spliter, min_entropy             "
   ]
  },
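  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The split quality above adds the two child entropies without weighting. A sketch of the standard ID3-style alternative, which weights each child by its share of the rows (shown standalone here, not wired into the tree code below):\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "from collections import Counter\n",
    "\n",
    "def entropy(labels):\n",
    "    n = len(labels)\n",
    "    return -sum(c / n * np.log(c / n) for c in Counter(labels).values())\n",
    "\n",
    "def weighted_split_entropy(left, right):\n",
    "    # each child entropy is weighted by its fraction of the rows\n",
    "    n = len(left) + len(right)\n",
    "    return len(left) / n * entropy(left) + len(right) / n * entropy(right)\n",
    "\n",
    "# 'bought' labels for income == '+10' vs income == '-10' in the mock data\n",
    "print(weighted_split_entropy([1, 1, 0, 0, 0], [1, 1]))\n",
    "```"
   ]
  },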
  {
   "cell_type": "code",
   "execution_count": 58,
   "metadata": {},
   "outputs": [],
   "source": [
    "# 反复调用find_optimal_spilter， 获取整个完整决策树，存放进dict中\n",
    "def get_tree(train_data: pd.DataFrame, target: str):\n",
    "    tree = {}\n",
    "    while True:\n",
    "        # 寻找最优划分\n",
    "        (k, v), min_entropy = find_optimal_spilter(train_data=train_data, target=target)\n",
    "        tree[k] = v\n",
    "        if min_entropy == 0:\n",
    "            break\n",
    "        # 获取下一数据表\n",
    "        train_data = train_data[train_data[k] != v]\n",
    "        print(train_data)\n",
    "        train_data = train_data.drop([k],axis=1)\n",
    "        # 直到为0\n",
    "        if len(train_data.columns) == 1:\n",
    "            break\n",
    "        if train_data.shape[0] == 0:\n",
    "            break     \n",
    "    i = 0\n",
    "    print(\"划分如下：\")\n",
    "    for k,v in tree.items():\n",
    "        i += 1\n",
    "        print(\"第(%d)次，为(%s)\" %(i,(k,v)))    \n",
    "    return tree        "
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 60,
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "ic| sub_spliter_1: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| sub_spliter_2: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{1, 2}\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "| sub_spliter_2: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| sub_spliter_2: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1, 1, 0]\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'-10', '+10'}\n",
      "{'F', 'M'}\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "ic| probs: [0.25, 0.75]\n",
      "ic| sub_spliter_2: [0, 0, 1]\n",
      "ic| probs: [0.6666666666666666, 0.3333333333333333]\n",
      "ic| entropy_v: 1.198849312913621\n",
      "ic| sub_spliter_1: [0, 0, 1]\n",
      "ic| probs: [0.6666666666666666, 0.3333333333333333]\n",
      "ic| sub_spliter_2: [1, 1, 1, 0]\n",
      "ic| probs: [0.25, 0.75]\n",
      "ic| entropy_v: 1.198849312913621\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "spliter is: ('income', '+10')\n",
      "the min entropy is: 0.6730116670092565\n",
      "  gender income  family_number  bought\n",
      "1      F    -10              1       1\n",
      "6      M    -10              2       1\n",
      "{1, 2}\n",
      "{'F', 'M'}\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "ic| entropy_v: -0.0\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "spliter is: ('gender', 'M')\n",
      "the min entropy is: -0.0\n",
      "划分如下：\n",
      "第(1)次，为(('income', '+10'))\n",
      "第(2)次，为(('gender', 'M'))\n"
     ]
    }
   ],
   "source": [
    "decision_tree = get_tree(train_data=dataset, target='bought')"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 76,
   "metadata": {},
   "outputs": [],
   "source": [
    "def predicate(gender, income, family_number):\n",
    "    decisiontree = get_tree(train_data=dataset, target='bought')\n",
    "    bought = 0\n",
    "    data_list = [gender, income, family_number]\n",
    "#     data_list.append(gender)\n",
    "\n",
    "    for k in decisiontree:\n",
    "        print(k)\n",
    "        if decisiontree[k] in data_list:\n",
    "            bought = 1\n",
    "            break\n",
    "    if bought == 0:\n",
    "        print(str(data_list), \"会购买, bought==0\")\n",
    "    else:\n",
    "        print(str(data_list), \"不会购买, bought==1\")\n",
    "\n",
    "    return bought"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 77,
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "ic| sub_spliter_1: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| sub_spliter_2: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1]"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{1, 2}\n",
      "{'-10', '+10'}\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| sub_spliter_2: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1, 1, 0]\n",
      "ic| probs: [0.25, 0.75]\n",
      "ic| sub_spliter_2: [0, 0, 1]\n",
      "ic| probs: [0.6666666666666666, 0.3333333333333333]\n",
      "ic| entropy_v: 1.198849312913621\n",
      "ic| sub_spliter_1: [0, 0, 1]\n",
      "ic| probs: [0.6666666666666666, 0.3333333333333333]\n",
      "ic| sub_spliter_2: [1, 1, 1, 0]\n",
      "ic| probs: [0.25, 0.75]\n",
      "ic| entropy_v: 1.198849312913621\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'F', 'M'}\n",
      "spliter is: ('income', '+10')\n",
      "the min entropy is: 0.6730116670092565\n",
      "  gender income  family_number  bought\n",
      "1      F    -10              1       1\n",
      "6      M    -10              2       1\n",
      "{1, 2}\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      ": [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'F', 'M'}\n",
      "spliter is: ('gender', 'M')\n",
      "the min entropy is: -0.0\n",
      "划分如下：\n",
      "第(1)次，为(('income', '+10'))\n",
      "第(2)次，为(('gender', 'M'))\n",
      "income\n",
      "gender\n",
      "['M', '-10', 1] 不会购买, bought==1\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "1"
      ]
     },
     "execution_count": 77,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "predicate('M', '-10', 1)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 78,
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "ic| sub_spliter_1: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| sub_spliter_2: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| entropy_v: 0.6730116670092565\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{1, 2}\n",
      "{'-10', '+10'}\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "ic| sub_spliter_1: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1, 0, 0, 0]\n",
      "ic| probs: [0.6, 0.4]\n",
      "ic| sub_spliter_2: [1, 1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: 0.6730116670092565\n",
      "ic| sub_spliter_1: [1, 1, 1, 0]\n",
      "ic| probs: [0.25, 0.75]\n",
      "ic| sub_spliter_2: [0, 0, 1]\n",
      "ic| probs: [0.6666666666666666, 0.3333333333333333]\n",
      "ic| entropy_v: 1.198849312913621\n",
      "ic| sub_spliter_1: [0, 0, 1]\n",
      "ic| probs: [0.6666666666666666, 0.3333333333333333]\n",
      "ic| sub_spliter_2: [1, 1, 1, 0]\n",
      "ic| probs: [0.25, 0.75]\n",
      "ic| entropy_v: 1.198849312913621\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'F', 'M'}\n",
      "spliter is: ('income', '+10')\n",
      "the min entropy is: 0.6730116670092565\n",
      "  gender income  family_number  bought\n",
      "1      F    -10              1       1\n",
      "6      M    -10              2       1\n",
      "{1, 2}\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n",
      "ic| sub_spliter_1: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| sub_spliter_2: [1]\n",
      "ic| probs: [1.0]\n",
      "ic| entropy_v: -0.0\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'F', 'M'}\n",
      "spliter is: ('gender', 'M')\n",
      "the min entropy is: -0.0\n",
      "划分如下：\n",
      "第(1)次，为(('income', '+10'))\n",
      "第(2)次，为(('gender', 'M'))\n",
      "income\n",
      "['M', '+10', 1] 不会购买, bought==1\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "1"
      ]
     },
     "execution_count": 78,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "predicate('M', '+10', 1)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<评阅点>\n",
    "> + 是否将之前的决策树模型的部分进行合并组装， predicate函数能够顺利运行(8')\n",
    "+ 是够能够输入未曾见过的X变量，例如gender, income, family_number 分别是： <M, -10, 1>, 模型能够预测出结果 (12')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### 2. 将上一节课(第二节课)的线性回归问题中的Loss函数改成\"绝对值\"，并且改变其偏导的求值方式，观察其结果的变化。(19 point)"
   ]
  },
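  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "With the absolute-value loss $L = \\frac{1}{n}\\sum_i |y_i - (kx_i + b)|$, the partial derivatives no longer contain the residual itself, only its sign: $\\partial L/\\partial k = \\frac{1}{n}\\sum_i sign(kx_i + b - y_i) x_i$ and $\\partial L/\\partial b = \\frac{1}{n}\\sum_i sign(kx_i + b - y_i)$. A minimal sketch of the resulting update loop (hypothetical 1-D data; the lecture used its own dataset):\n",
    "\n",
    "```python\n",
    "xs = [1.0, 2.0, 3.0, 4.0, 5.0]\n",
    "ys = [2.1, 3.9, 6.2, 7.8, 10.1]        # roughly y = 2x\n",
    "\n",
    "def sign(v):\n",
    "    return (v > 0) - (v < 0)\n",
    "\n",
    "k, b, lr = 0.0, 0.0, 1e-2\n",
    "for _ in range(2000):\n",
    "    # sub-gradient of the L1 loss: the residual is replaced by its sign\n",
    "    dk = sum(sign(k * x + b - y) * x for x, y in zip(xs, ys)) / len(xs)\n",
    "    db = sum(sign(k * x + b - y) for x, y in zip(xs, ys)) / len(xs)\n",
    "    k, b = k - lr * dk, b - lr * db\n",
    "print(k, b)\n",
    "```\n",
    "\n",
    "Because the step size no longer shrinks with the residual, the iterate tends to oscillate around the optimum instead of settling, and the fit is less sensitive to outliers than with the squared loss."
   ]
  },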
  {
   "cell_type": "code",
   "execution_count": 82,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<matplotlib.collections.PathCollection at 0x1a17c0c128>"
      ]
     },
     "execution_count": 82,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD8CAYAAABn919SAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJztnX+QHOWZ37/Pjho0i8+MwGsHBmThS0q643RizZYhpypXJF9QbGy8Eb9M4SuScoX84UqMTe1ZTjlGXJGgi3KB++PKV5SdHCl8WCDsNZg6y1eGS+qoAkdi2eN0oLrYgOQRMfKhwUYaxOzukz9mejQz22/32z39e76fKtWuZqa7n+7e+b5vP8/zPo+oKgghhBSfiawNIIQQEg8UdEIIKQkUdEIIKQkUdEIIKQkUdEIIKQkUdEIIKQkUdEIIKQkUdEIIKQkUdEIIKQlr0jzY+973Pt2wYUOahySEkMJz6NChX6jqVNDnUhX0DRs24ODBg2kekhBCCo+IvGbzObpcCCGkJFDQCSGkJFDQCSGkJFDQCSGkJFDQCSGkJFhluYjIqwB+BWAZwJKqzojIBQD2AdgA4FUAN6nqyWTMJKMyv9DA3gNHcLzZwsW1KuZ2bMTsdD1rs0Yi7nNK8xrFeay0721cx3P302i2UBHBsmrv56QzgdbSClSBighuuepS3DO7eeTjD2+7bdMUnn75RKz3of986il/18SmY1FX0GdU9Rd9r/0XAG+q6h4R2QVgnap+2W8/MzMzyrTF9JlfaOAr33kRrfZy77WqU8G9OzcXVtTjPqc0r1Gcx0r73sZ1PK/9BPHZq9dj5oMXRD6+zTHjvA+j7rMfETmkqjNBnxvF5fJpAA92f38QwOwI+yIJsvfAkVV/aK32MvYeOJKRRaMT9zmleY3iPFba9zau43ntJ4iHnzs20vFtjhnnfRh1n1GwFXQF8EMROSQit3df+4Cqvg4A3Z/v99pQRG4XkYMicvDEiROjW0xCc7zZCvV6EYj7nNK8RnEeK+17G9fxoti3rDrS8W2PGed9GGWfUbAV9K2q+mEAHwfweRH5qO0BVPUBVZ1R1ZmpqcCVqyQBLq5VQ71eBOI+pzSvUZzHSvvexnW8KPZVREY6vu0x47wPo+wzClaCrqrHuz/fAPBdAB8B8HMRuQgAuj/fSMpIMhpzOzai6lQGXqs6Fczt2JiRRaMT9zmleY3iPFba9zau43ntJ4hbrrp0pOPbHDPO+zDqPqMQmOUiIucBmFDVX3V/vwbAHwB4HMBtAPZ0f34vSUNJdNxgTJmyXOI+pzSvUZzHSvvexnW8/v2EzXKJenwv290sF9eGfn93mHMynU/uslxE5EPozMqBzgDw56r6n0TkQgCPAFgP4CiAG1X1Tb99McuFEOJFlmm1RcgCs81yCZyhq+pPAWzxeP0fAHwsmnmEENJhWFAbzRa+8p0XAYSbJfvt32+w8MucyYug28KVooSQTEky9dIdLBrNFhRnB4v5hUbvM2XKAqOgE0IyJUlBtRksypQFRkEnhGRKkoJqM1iUKQuMgk4IyZQkBdVmsJidruPenZtRr1UhAOq1aq4ComFItQUdIYQMk2Tq5dyOjZ4ZLMODxex0vZACPgwFnRCSOUkJahnXYPhBQSeEJEYeyjaXZfZtAwWdEJIISeeXk9UwKEoISYQylm3OOxR0QkjszC800CjRgp2iQEEnhMSK62oxUcQFO0WBPnRCSKz4de8ZThnMQ9C0TFDQCSGx4udS6V+ww6Bp/NDlQgiJFZNLpV6rWlc5JNGgoBNCYsV2KX+aVQ7nFxrYuucpXLbrSWzd89RAtcUyQZcLISRWbFdnXlyrembCxB00HSfXDgWdEBI7NqszbeusjEqZGlgEQUEnhGRCWnVWytTAIggKOiHEiiRSDNOos5KWaycPMChKCAnEppVb1P0mHawsUwOLICjohJBATH7o3Y8fjrzP
pAaJYcrUwCIIulwIIYGY/M3NVhvzC41I4phmsHJcSuhyhk4ICcTP33zHvhciuUvGKViZFhR0QkggQf7mKO6SJJtDDzMuC4so6ISQQGan61g36fh+Juyy/bSClWn56vMABZ0QYsVdn7p8lQAP02i2rGfCaQUrx6lmDIOihBAr+hcCmZpXCNB7z2aJfRrBynHy1XOGTgixZna6jmd2bcf9N1+xarYuAHTo83mYCafpq88aCjohJDRe7pJhMXfJeibs5at3KoJTZ5ZKFySly4WQgpGXLj/D7pKte57K5RL74ZoxtUkHb7+zhGarDaBc1Rc5QyekQOQ5YyPPS+xdV9Ere67F5Dlr0F4ZfJ7Ig2soDijohBSIPGdsFGWJfZmDpHS5EFIg8i5GRVhiX+bqi5yhE1IgxiljIyny7BoaFQo6IQWizGKUFkVxDUXB2uUiIhUABwE0VPWTInIZgG8DuADA8wB+T1XfTcZMQgiQXpefslME11AUwvjQvwDgJQDv7f7/DwHcp6rfFpE/BfA5AF+P2T5CyBBFF6O8pF2WESuXi4hcAuBaAN/o/l8AbAewv/uRBwHMJmEgIaQ85DntsgzY+tDvB/D7AFa6/78QQFNVl7r//xkAzyFWRG4XkYMicvDEiRMjGUsIKTZ5TrssA4GCLiKfBPCGqh7qf9njo54rf1X1AVWdUdWZqampiGYSQspA3tMui46ND30rgOtE5BMA1qLjQ78fQE1E1nRn6ZcAOJ6cmYSQMlDmHPA8EDhDV9WvqOolqroBwGcAPKWqtwJ4GsAN3Y/dBuB7iVlJCCkFcaZdjksXojCMkof+ZQBfEpH/i45P/ZvxmEQIKStx5YAzuOqNqJqKXsbPzMyMHjx4MLXjEULKiamyY71WxTO7tmdgUbKIyCFVnQn6HGu5EFJyypj3zeCqN1z6T0iJKatrgjVtvKGgE1Jiypj3Pb/QwOl3l1a9zpo2dLkQUlhsXCllc024TxzDg1St6mD3dZcX3pU0KhR0QgrIV+dfxLeePdpbzWdqo3Z+1em1Wuvn/KqThpmx4/XEAQDnnbtm7MUcoMuFkMIxv9AYEHMXL1eKeK3p9nk975TtiSNuKOiEFIy9B45419nAamFrnl49O/d7Pe8wGOoPBZ2QguE3Gx0WNpPQKYDpP/hh4bJd2ODDHwo6GVuKunTcJNICrBK2uR0b4VS8/SsnT7dxx74XCiXsZe42FAcMipKxZDhbwhRUzCNzOzauyvQQALdevd7b9oDF4CdPt3vnDuS/G1LRG3wkCQWdjCV++dl5F4swbej2HjiC9kpweY9Wexl3P3EY77RXCjnIkQ50uZCxpOjZErPTdczt2IiLa1Ucb7aw98ART7dJmPM5ebpdukVI4wZn6GQsKXpdbluXkek8wzDqIFfGWjJ5hTN0MpbkPVsiKGBru6Tf6zzDMsogV9ZaMnmFgk7GkjxnS9iIoGnW3Gi2BgaA/vOMwqiDXBlryeQZulzI2JLXbAmbgG1t0sFJw+KgYffL7HQdB197Ew89ezTw2FVnAhecd25s7pGixyqKBgWdkJxhI4JBfWmGB4CHnztmdeylFY3Vx130WEXRoMuFkJxhs7z9LY+CW8P0DwDLlp3J2ssaqzsk77GKskFBJyRn2IigzQy3/zOVENW44nSH5DlWUUbociEkZ9gsHPJaLdrP8ABwy1WXWvnQgfjdIXmNVZQRCjohOSRIBN337n7icC84Kuis8q97DAD3zG7Gd59v4NS73gOAi5c7hHnkxYGCTkgOsRHRg6+9OVAGV3FWkL0E93SAmJ/dy6Adc/sX0V7uvN5otjC3fxEAywHkEQo6ITnDZhWoTZOLvQeOoNFsoSKCZdXeTz9a7RXMPXpWsO9+4nBPzF3ay4q7nzhMQc8hDIoSkjNsFuP4NblwBwA3XdAVcetMl5WzmS6mXHfT6yRbOEMnJCGi+p5t8tD9MlEqIsZgKXDW1x7FBpJvKOiEJEDYeuv94j9hcI30Z5+YFuwI
gmfiNvN091g1Q5PpWkGbTJcdulwISYAwNUyGa7eYBPnUmaVejRavXHW3ycWoYutMSC/TZfd1l8OZkFXv777u8pGOQZKBM3RCEiBMDRMv8fei2WqvmuV7uXSe/JvXjfuoOhWsdSaMPvBa1cHu6y7v7T9MMw2SPRR0QhIgTA2TMP7q/hotplz1pk/AstVexrlrJuBUZCB7pepUPFdwMge9WFDQCUmAbZumVqUVmmqYhG1CETQABO2v2WrDmRCsm3TQPN1eJdSuiDearYEAKlvS5R/60AmJmfmFBh471BgQcwFw/ZXeM+qwTSjOD/CR2+yvvaKYPGcNXtlzLZ7ZtX1AzPtTHv3y3En+oKATEjNePnEF8PTLJzw/7xawWjdpF8w89e6Sb8ef4YJYJoZn+vMLDdz5yGKgP58pjfmFgk5IzNh2E+pndrqOha9dYyXqwyVuvdrVzU7X8cyu7Xhlz7XGbkX9/nx3Zm6z+Ii1zPMLBZ2QmPETPK92cv2CbLsC0x00bNrV2ZTjtc20YS3zfBMo6CKyVkR+LCKLInJYRO7uvn6ZiDwnIn8vIvtE5JzkzSUk/wT5sPv90MOCbIs7aPjlu7sDxRf3vYBz10xg3aRjrEnu50Zx3TasZZ5/bLJczgDYrqpvi4gD4K9F5C8AfAnAfar6bRH5UwCfA/D1BG0lpBD0526bsk3c121nxv30z5L93Dv9K1WbrTaqTgX33XyFpyCbMmMqIvijm7ZQxAtC4AxdO7zd/a/T/acAtgPY3339QQCziVhISAGZna5jbsdGY6cgQWd2HiZdEQDOO+dsvvj8QgMThv171XPxy1AxuWUo5sXCKg9dRCoADgH4xwD+BMBPADRVdan7kZ8B4F0nhSLJRTNBQUZFZ3ZuU9K2H7em+VfnX/Qsnwt0hNg06zfN6LkitBxYCbqqLgO4QkRqAL4L4De8Pua1rYjcDuB2AFi/fn1EMwmJl7DFs8Jy9xOHrdL/wvjNgc6XbPfjh/FWq+25bUUE9+7cbHT3+AVs2Squ+IRaKaqqTRH5KwBXA6iJyJruLP0SAMcN2zwA4AEAmJmZCfv3S0gi+AUTo4ha/2y/NulYZau44mryXZtm7l7VD12WVfHFfS+gNunAmRC0VwaX9zNDpdzYZLlMdWfmEJEqgN8F8BKApwHc0P3YbQC+l5SRhMRNmOJZQQxnqtiIuVPpVDTctmnK8/2rP7TOd1GQHz0bpFNsy5TZQsqHzQz9IgAPdv3oEwAeUdXvi8jfAfi2iNwDYAHANxO0k5BYCVM8y0R/zZOwtJcVdz6yaJyFv/oPLdx69XrPejB+1RKHj/Grd5aMmS2kfAQKuqr+DYBpj9d/CuAjSRhFSNLM7dg44EMHwrkkhn3wUfALhh5vtnDP7GYAwMPPHev1BL3+yjpmPniB9bGXVVlQa4zgSlEylgzXO7F1SbiLde7Y98JIYh7ExbUq5hca2PfjYwM9Qff9+BgArLLdr2QAC2qND6IhUqZGZWZmRg8ePJja8Uj+KVK97TCzcqciOO+cNb4BTL9t996wBbsfP2xs//bCXdeEsk0AvLLn2tC2kHwgIodUdSboc6yHTjIj6dTBuLFd1VkfGpg27Hoy3IG6cyzTYOD1unssk1+eBbXGg0IIepFmccSeuFMHkyYoA8bU9acesoFFe0UjuUjc444SGyDFJvc+dJtqcqSYxJk6mAZ+s1w/H3wUMW00W5h0zF9P099/1NgAKQe5n6EXbRZH7IkjdXBUvJ7+AO8l8KbMGJtenFVnAq32irVdFRGc61Rw2rCN398/V3yOL7kX9KLN4og9o6YOjoqXD3/u0UVA0Gug7OXXD3L/ee3XqYjnyk2TT35Z1bfZM//+iRe5F/Q8zOJIMmRdEMrr6a9fcF36nwhtZr+e+11WnHdOBSvtlYGc8qdfPmH0r4sApiQ0kbPB1lrVwe7rLuesnOTfh27TbYUUk6yD3WFmuTafdXPUTQJ96t3lgZzyxw41sG3TlLEZhsfY
4vles9XG3KOLjCuR/As6gzzlJA/B7jBPeUGf7T8fW1rtZXx/8XWs9Ql+Ap3ZeO93w2eiZsaQcpF7lwvAIE8ZyUOw28uH70zIgA8dsHsijNJ5CPCvnNhDgVe7i4Iu88lpp1+dFELQSfnIQ7Db5MP3ei1okPGzu16r4tSZpUirRoHBpwNTTGn4c2Q8oaCTTMhLsNv09Bf2KcF0PvVaFc/s2h65mJdbZtdlbsdGzO1fHHiCADpPFowrEQo6yYSsUxZdhgOz2zZN4emXT4QO1Aadj9fTwOl3l3zL4K6bdHDXpwazV9zf737icG9bZrkQFwo6yYSsUhaHOwu9/c5SL1Wx0WzhoWeP9j5rqi0zv9DwFNR7d24eeP3cNf7Bzmt/+yI8dqgxMAgIOqVchuvB9OMXU8o6c4hkC6stklLhJ2hR3R4VEayo9mbwD//4GJY9cgonnQm0l3Ugl90V6HVDgwfQmcH356K7n+1//96dnZroNiLtdX6mlaykWNhWW6Sgk1IwPGt26Rc0vxzxrHB97CbbalUHZ5ZWrETatA/3GKS4sHwuGRv8Zt6t9jJ2P34YB197M3diDpxtEG3KkvHKjDGld+Yhc4hkS+4XFhESRFAOeLPVHvCN54lKd9VQ2OweL5E27YPpjOMDZ+iksIzSpDkvLKvisl1PojbpeBbvMjWEdlvUDWfoDAdZWSZjvKCgk1hJK8sijibNeUGBVaLtZs4A3g0rtm2aWlXR8aFnj6LqTGDdpIPm6TazXMYQCjqJjTRbykVdal8Uzix16qCb0jtN59+puS647+YrKORjyNgIOvNzk8e2PovpXoS5R2UP9AWV7P3ivhestiXjxVgIetGaERcVmywL0704+NqbA/7foHvkV9OkLPgNWkHnX/YBj3gzFlkufjNHEh82WRame/Hwc8dC3aNtm6ZGtDb/+GWnePUJsN2WlJexmKEzPzcdbOqzmGaVy4YFbu49ml9oYPfjhyNXLCwaQdkpXjVdbLcl5WUsZujMz00Hm2YkFTG1aPDGTc+be3RxbMS8ImK1XH92uo6Fr12D+2++gg1gCIAxWfrPGhf5YYNPg4bhpsnuPSp6rnkUBIgcMCblg0v/+8i6GXEeyIsg1H3qhrvpeI1mCxURtNrLni6FcaC/LV/YgDHJD2l/78Zihj7u5OkJJciWPCwYqohgWbX3Myr333wF7nxk0bgPpyKAYmB1qJ89w7DoVr6J83tnO0MfCx/6uJOnLJ8gP3seFgy54jmKmAOdc/Xbx94btmDvjVt61yLInmEY1M83WXzvxsLlMu7kLcvHr0FDWURq3aQDwN/F5F4D96ep/K1phs6gfr7J4nvHGfoYkHSWz/xCA1v3PIXLdj2JrXuewvxCI/I+bObE6yYdTIRLlkkVpyK461OdOixe+eJORXDqzNKq6+X12apTwS1XXer5OlMT800W2XUU9DHAJBRxCILrJ2w0WwOBvDCi3r+PIKpOBWfaywhwO2dGRQR7b9gyMPvudzGtm3QA7ZT0Hb5eJnfUPbObA9NBSf5I8ntngkHRMSGpaHscXXL8Ogmtm3SgCrzVOls98A6fOiZZYhPwYleh8SKu711saYsicimA/wngHwFYAfCAqv6xiFwAYB+ADQBeBXCTqp4MbSlJBT+/9Sj4+Qlt/5hN+xAAC1+7BsDZL4ZfUaq0EAEuPr/aS69cVu2lXQId0Tadc97iGSRZkvrembAJii4BuFNVnxeRXwNwSET+EsC/AvAjVd0jIrsA7ALw5eRMJXmkNul45onXJh3rgmimQlMTItiw60lMCHLlYlGF52zapgic6VwZ4CRxEOhDV9XXVfX57u+/AvASgDqATwN4sPuxBwHMJmUkySfzCw28/c6S53vNVts6ZctUaMrN7MiTmAMd98gw8wsN3PnIYuA5Z+FXJeNDqLRFEdkAYBrAcwA+oKqvAx3RF5H3x24dyTV7DxwxLooxhWa8XAvDK3knRlzQkzQbLhwUdHdmbpMvzlXLJEms
BV1E3gPgMQB3qOovxbLIkojcDuB2AFi/fn0UG0lOieL37XctePnYAeQ26OnyzE/exFfnX8Q9s5sBBC+GGnanpO1XJeODVdqiiDjoiPm3VPU73Zd/LiIXdd+/CMAbXtuq6gOqOqOqM1NT5a9hPU6E9fv2uxa80h3n9i/iSzkXc5eHnzvW+90v3ZLuFJImgYIunan4NwG8pKr/re+txwHc1v39NgDfi988kmeCmiz0Y7PEv72sWIndymRw3SvzCw3jsn3bMriExIWNy2UrgN8D8KKIuNOn/wBgD4BHRORzAI4CuDEZE0le6RfnoEVBw1khRU/TmxD//HkB8Ec3baGYk1QJFHRV/WvAOAn5WLzmkKLR7w/+zf/4FzjdXj3Hduua9FP4nqDq72rJb0iXlBku/Sex8Z93/nanJGwf/XVN+sljT9CKT4GY4XdsXENhSyAQMiqstphj8tKUwpYwKXlPv3wibfN8mXQmPJ8uRsHNQc/zPSPlgoKeU2xWHaZhQ9gBxTYlL28+9CAxj+pCydt5hqFoEwpCl0tuybopRRxVFP32PRGyWXRRKeqS/iTvP0kOCnpOybqIU1IDStCqyjwSdejxykGPo3Z8GmQ9oSDRoKDnlCyK4/eT1IAStcWcMyG9bJlRmltE2fTWq9db59u7eNUsL9KsN+sJBYkGBT2nZF3EKakBJYogCICbP3Ip7vrU5ajXqiMV61IAterqNEoT9Vq112CiYlvuAp0snmF/c5FmvVlPKEg0GBTNKVkXcZrbsdGzY/moA0qU/HMF8P3F17Hv/xxDezk9V03/+brXffiaeKEAHnr2KL6/+PpAY44izXqTuv8kWdixiKzCzW7wauAw6oAynL2TN+q1qu8AOr/QiFQ8rOpUsNaZ8Kwd7x43b1kkzHLJD7F1LCLlweYLOiy4y6q9mVkcX+Yw5QKiIOjMkNcZGm/44dUGzlQRMiyt9jLOXTOBqlPxHMyySEsNglUhiwd96GOCbUAuDT/v7HQdz+zajlf3XBvL/lzfdr1Wxa1Xr0e9VkUzQMxt4hOma1Z1on1t3mq1e82evcirP50UBwr6mGAr1HH6eW1S9MIEKL2oVR385N5P4P6br8CpM0t46NmjPQH22+benZsHjr3WQ6RN12xtyIwXl4tr1d5gZgqv5tGfTooDBX1MsBXquLIbbJ4Ivjr/IpqtcG6RfpwJwe7rLu8dy2Zf7jYAcGbp7OrQk6fbq+wzXbOg2b8Xw08AzCIhSUBBHxNMQjEhMiBicaVLBj0RzC808K1nj4baZz/1WhV7b+yUp7XJbReLbYafWPxE1+Q2GT6ma6ubk+4+tTSarVWzdGaRkFFhUHRM8EpDAzpBz/5gXFzpkqaAp/v63gNHItdHEQzWVw9yU3gFO4PsAzrXbO7RxVV9U493nzrcAKwJHTr2cMC5fx8VkYEBJYtgJLNaig8FfUxwv5h3PrK4atn9cFXAOLIbKoZGz24AcxRf8flDfne/3PaqU8G2TVPYuuepAaEKsq+Hh7Nb+34GiXr/eXo9Fbj7cG3JKtslD8XgyOjQ5VJy+gOTew8csepMHwem47ivj+IrbrbaA0FWUyu8dZMOrr+yjscONQZ8+Xfse8HXPne/ew8cCVzI5M7CTS6Y/vM0XePhI2SR7VKkVazEDGfoJcZr1mWaUcYRjOt/ZDfNgF3hM7mAbPGaQXrli3s9kQTh7td2kHOfDoavrVcg1Db3Pu1slyKtYiVmKOglxvSI78WoHYS8FiQN47WU/u4nDq9aABTkxnDpdxUNu4lGqero7jdsmYIgn7jXIOY3wKbp0zadK7NuigVdLpYUpexpP2HEaNQOQkGZJl7VBwHgl62lVZ91hdEG0wwyalVHl0azZXTl+GHyic8vNDA7Xcf1V9Z7fvqKCH7n1y/wzCratmkq1cqMWReDI/FAQbegSGVPXeYXGqFKxY76aO23/XDpgPmFBq64+4e+vmzXNw14BCr78JpBzi80Yisr4Ley04TJJz6/
0MBjhxq9c15WxfNH38L1V9ZRr1V7qZX37tyMp18+kapPe3a63jvXfjsYEC0WLM5lgZs3PIxXOlxeMNlsYtRzCTqeu3/b4lzD9nht57or6n0+892PHx5psZLJBr/j2yDoZOd42ebaP1wQzbSfV2IqmUCKA4tzxUgRA0ZBM+aoZVFNfl1TzvawPTauEKciOHVmCZftenKV79gVvX4xbTRbmHt0ESsAln2KpYcR4H6bAW+f/1pnAksralXSt+ZTLMx94vOLP7jQp038oMvFgiIu0zbZ5j5KR3m09nM9zU7X8Z615vmBa0/QIChd1W222p7HeGbXdtRr1VXC3F5RXzHv7DzwFD1t7uedvmbSrfaKlZhXnQr8HoTdAKrNfujTJn5Q0C0oYsDIz2ZXGF/Zcy2e2bXd2k8alKvsV+PEvVZBg6AAq2b5rfYy7n7icO//UZ6M6rWqr6h62dFotgYC4FECre6A+ZaPG8gmE4c+bWIDBd2CIgaMkrA5yPVkEuta1ekdNyhzxDTJPnm63RPWKE9G2zZNWbeQAwbdOe4TQtiBxC1RMDtdN9q8btIJDLq6vvw8/72RfMCgKLEmKDjsFTisOhXcu3MzgLMLf2qTDlQ79cEnfAKAYY5js+22TVN4KKAgmMnP7opu1EBz0LUxnY/7GYr5eGMbFOUMnVhjml2ffnep5+P2eioAMOB7P3m6jTNLK7jv5iuwEmJC4c6Q3eOE4XizhXtmN+OzV68fyAPf+usXDNhrsuZ4yLz0YZec3xNT/3uuXUAxngRJvuAM3ZIiVqJLwub5hYZnaqDfTNIvpdEvRW+Y4VTGKKmZpmvQ30fV79g2PUVrVQe7r7s8938fpDhwhh4jRV1YlITNs9N1nHfu6mwWN3DptZrWz/fsJeZOReBMDPq7vYLQXjNmZ0LgVLx95aZr0H+tvBguWRDki+9vnEFImlDQLShiJbokbTYJ9MnTbc8BxCaIWRE524Tihi3Ye+OWwICulxtj741bsPeGLaH6dvplr3gdO+iJIu9/G6S8cGGRBWVaWBSHzbZFq1xhm9uxEXP7F31ztldUV62AtHFZmGq3z07XcdmuJz194sPXwHRNhhtpuNQtzj/PfxukvHCGbkGZFhbFYXOY4ODxZqvjpjlcMgupAAAJD0lEQVTHf+4QpWdpULE022sQ9lrZnH9t0ilcMTdSfCjoFpRtYdGoeLk6akNdhFxcUfRbWBPWLtv4gO012HCht3CbXh/OShn2qDsVwdvvLBUq5kLKAV0uFsTVZzMKUTNV0rb5k1suwmOHGsYaMSY3TUUkVGre/ELDqo0eYH8Nnv3pSc9jmV53991fPbL/GKfOLK3KAvKyb1SKmHlFkiUwbVFE/juATwJ4Q1V/q/vaBQD2AdgA4FUAN6mq+a+/S5HTFrPAbzGK3xc36S+6ya7rr6zj6ZdPeB436rkEHbefqJUIN+x60vjeqxH2Z/Ldx1kpMY7rSYpDnGmLfwbgXwy9tgvAj1T1nwD4Uff/JGaiZKqkkWJpsuvpl08Ya8TEUYogqJZK1PiAXxpilOuWRsyliJlXJHkCXS6q+r9FZMPQy58G8M+6vz8I4K8AfDlGuwiiZar4fdGHW7RFncVHzaAxZaTY4rd/ASLHB2656lJjSYAobhKvVnNxx1yKmHlFkieqD/0Dqvo6AKjq6yLy/hhtIl2i9Hm0+aJ7NY8ebrgct11x4JcuqbCz3Yt7ZjcbBX34enoNhMBqP/29Ozcn6vZiD1DiReJZLiJyu4gcFJGDJ06M1rdy3IiSqWLzuD/q43pWWT9zOzYaS5qHbRNnu33/dfNyZ809uoi5/YurXFwAIpUotqWImVckeaIK+s9F5CIA6P58w/RBVX1AVWdUdWZqarTO8uNGFL+zzRd91Mf1rMoJz07XcevV61eJehxCZnPdvAbCtkfHojR82UUs6UySJ6rL5XEAtwHY0/35vdgsKhFxZJuE9TvbpOrF8bg+qj88KvfMbsbMBy8wnl+SaZ5h
/NNp+LKzugckvwQKuog8jE4A9H0i8jMAd6Ej5I+IyOcAHAVwY5JGFpFR/dSjEPRFTyNo50UUsTVtY6qYGOaae+3br1G2bckD97OEpI1Nlssthrc+FrMtpcI22yQL4lx0ZCvSUQY4m236jw+sbk5huuZR7PEaCJ0JAQQDbpc4BkcuGiJR4ErRhMh7WtnwLNetjRJ29mwrilEGuKBtbDsXeV3zKPaYBkKv10YR3yyf7kixoaAnRJHSyqIKSBhRjDLABW1j27TZ65rHnUsfp9Dm+emO5BsW50qIIqWVRU1jDCOKUVZPBm1j+7Tjdc3zXEEz7093JL9Q0BMir2llXmVnowpIGFGMMsAFbWMjvusmHc9rntcBd36hgQlDKYI8DDYk39DlkiB5SyszuVZqkw5Onl5d3tZLQPqDdedXHTgVsQoIRgnEBm3jFaTsp+pUcNenLo+0b7/z7v9snMFL9/54dUTKw2BD8g+bRGdMmtkMpqbKtaqDM0srgZX7vIKQzoTgPWvXoHm6jYtrVWzbNGWsuJgEwwOMCHq2xHVsv+qSXiWDoz6Jme5PRQR/dNOWXE0OSLrYVlvkDD1D0s5mMLlQ3mq1cd/NVwQOLKaVkpPnrMHC167JJDsjjacgU4zh4eeOWdVlt8V0f1ZUKebECgp6hqSdzeCXeWMjjFGyTsqQnWE6b1Oz6KjByyJlRpF8wqBohqSdzTBqIDBq1kkS52PTUzQuTOdtqqMeVYDzGqglxYGCniFpp86NmnkTNesk7vNJo4lHP6bzvuWqS2MV4LxmRpHiQJdLhmRRU2UUn3OUrJMkzidt147fefsVCot6LAo4iQqzXDKmbDU70jgfm56dZbuuZLxhlktBKNuMLI3zCQoeshYKGVfoQyeFI8iXzwbKZFzhDJ0kRpz1z/sJ8uWzFgoZVyjoJBGSqn/u4ufaYT43GVfociGJEMXtEZerhPncZFzhDJ0kQhL1z22JsyMTIUWCgk4SIYrbI05XSdmyhwixgS4XkghJ1D8nhPjDGTpJhCTqnxNC/OFKUUIIyTm2K0XpciGEkJJAQSeEkJJAQSeEkJJAQSeEkJJAQSeEkJKQapaLiJwA8FpqB4zG+wD8ImsjUoDnWS7G5TyB8TnX/vP8oKpOBW2QqqAXARE5aJMeVHR4nuViXM4TGJ9zjXKedLkQQkhJoKATQkhJoKCv5oGsDUgJnme5GJfzBMbnXEOfJ33ohBBSEjhDJ4SQkkBB70NEKiKyICLfz9qWJBGRV0XkRRF5QURKWy1NRGoisl9EXhaRl0Tkn2ZtU9yIyMbufXT//VJE7sjariQQkS+KyGER+VsReVhE1mZtUxKIyBe653g47L1k+dxBvgDgJQDvzdqQFNimqmXP5f1jAD9Q1RtE5BwAk1kbFDeqegTAFUBnQgKgAeC7mRqVACJSB/DvAfymqrZE5BEAnwHwZ5kaFjMi8lsA/g2AjwB4F8APRORJVf17m+05Q+8iIpcAuBbAN7K2hYyOiLwXwEcBfBMAVPVdVW1ma1XifAzAT1Q174v3orIGQFVE1qAzOB/P2J4k+A0Az6rqaVVdAvC/APxL240p6Ge5H8DvA1jJ2pAUUAA/FJFDInJ71sYkxIcAnADwP7putG+IyHlZG5UwnwHwcNZGJIGqNgD8VwBHAbwO4C1V/WG2ViXC3wL4qIhcKCKTAD4B4FLbjSnoAETkkwDeUNVDWduSEltV9cMAPg7g8yLy0awNSoA1AD4M4OuqOg3gFIBd2ZqUHF2X0nUAHs3aliQQkXUAPg3gMgAXAzhPRD6brVXxo6ovAfhDAH8J4AcAFgEs2W5PQe+wFcB1IvIqgG8D2C4iD2VrUnKo6vHuzzfQ8bd+JFuLEuFnAH6mqs91/78fHYEvKx8H8Lyq/jxrQxLidwG8oqonVLUN4DsAfidjmxJBVb+pqh9W1Y8CeBOAlf8coKADAFT1K6p6iapuQOex9SlVLd3oDwAicp6I/Jr7O4Br0HnMKxWq+v8A
HBMRt8P0xwD8XYYmJc0tKKm7pctRAFeLyKSICDr386WMbUoEEXl/9+d6ADsR4r4yy2X8+ACA73a+E1gD4M9V9QfZmpQY/w7At7ruiJ8C+NcZ25MIXV/rPwfwb7O2JSlU9TkR2Q/geXRcEAso74rRx0TkQgBtAJ9X1ZO2G3KlKCGElAS6XAghpCRQ0AkhpCRQ0AkhpCRQ0AkhpCRQ0AkhpCRQ0AkhpCRQ0AkhpCRQ0AkhpCT8f9F55igLg2gbAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "from sklearn.datasets import load_boston\n",
    "import matplotlib.pyplot as plt\n",
    "import random\n",
    "dataset = load_boston()\n",
    "x, y = dataset['data'], dataset['target']\n",
    "X_rm = x[:, 5]  # column index 5 is the RM feature (average number of rooms per dwelling)\n",
    "# plot the RM with respect to y\n",
    "plt.scatter(X_rm,y)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 83,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Define the model: price as a linear function of RM\n",
    "def price(rm, k, b):\n",
    "    return k * rm + b"
   ]
  },
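  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick sanity check of the model function with hypothetical (not fitted) parameter values:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# e.g. a 6-room dwelling, with hypothetical k=10, b=-20\n",
    "price(6.0, 10, -20)  # -> 40.0"
   ]
  },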
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "$$ loss = \\frac{1}{n} \\sum{|y_i - \\hat{y_i}|}$$"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "$$ loss = \\frac{1}{n} \\sum{|y_i - (kx_i+b)|}$$"
   ]
  },
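  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Since $\\hat{y_i} = kx_i + b$, the (sub)gradients implemented below are\n",
    "$$ \\frac{\\partial loss}{\\partial k} = \\frac{1}{n} \\sum{-x_i \\cdot \\mathrm{sign}(y_i - \\hat{y_i})} \\qquad \\frac{\\partial loss}{\\partial b} = \\frac{1}{n} \\sum{-\\mathrm{sign}(y_i - \\hat{y_i})} $$\n",
    "(the absolute value is not differentiable at $y_i = \\hat{y_i}$; the code treats that point as the $-x_i$ / $-1$ branch)."
   ]
  },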
  {
   "cell_type": "code",
   "execution_count": 148,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Loss function (changed to absolute value): mean absolute error\n",
    "def loss(y, y_hat):\n",
    "    n = len(y)\n",
    "    loss_ = 0\n",
    "    for y_i, y_hat_i in zip(list(y), list(y_hat)):\n",
    "        loss_ += abs(y_i - y_hat_i)\n",
    "    return loss_ / n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 149,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Partial derivative of the loss with respect to k\n",
    "def partial_derivative_k(x, y, y_hat):\n",
    "    n = len(y)\n",
    "    gradient = 0\n",
    "    for x_i, y_i, y_hat_i in zip(list(x), list(y), list(y_hat)):\n",
    "        # d|y_i - (k*x_i + b)|/dk = -x_i if y_i >= y_hat_i, else +x_i\n",
    "        if y_i >= y_hat_i:\n",
    "            gradient += -1 * x_i\n",
    "        else:\n",
    "            gradient += 1 * x_i\n",
    "    return 1 / n * gradient"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 150,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Partial derivative of the loss with respect to b\n",
    "def partial_derivative_b(y, y_hat):\n",
    "    n = len(y)\n",
    "    gradient = 0\n",
    "    for y_i, y_hat_i in zip(list(y), list(y_hat)):\n",
    "        # d|y_i - (k*x_i + b)|/db = -1 if y_i >= y_hat_i, else +1\n",
    "        if y_i >= y_hat_i:\n",
    "            gradient += -1\n",
    "        else:\n",
    "            gradient += 1\n",
    "    return 1 / n * gradient"
   ]
  },
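  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Optional sanity check (a sketch on synthetic data, assuming NumPy is available): the analytic gradient for $k$ should roughly match a central finite-difference estimate of the loss."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "rng = np.random.RandomState(0)\n",
    "x_s = rng.rand(50) * 9   # synthetic RM-like inputs\n",
    "y_s = rng.rand(50) * 50  # synthetic prices\n",
    "k0, b0, eps = 20.0, 10.0, 1e-6\n",
    "analytic = partial_derivative_k(x_s, y_s, price(x_s, k0, b0))\n",
    "numeric = (loss(y_s, price(x_s, k0 + eps, b0)) - loss(y_s, price(x_s, k0 - eps, b0))) / (2 * eps)\n",
    "print(analytic, numeric)  # the two values should be close"
   ]
  },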
  {
   "cell_type": "code",
   "execution_count": 151,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
       "Iteration 0, the loss is 421.9467509435635, parameters k is 61.51157215025401 and b is 57.90181571211795\n",
       "Iteration 1, the loss is 421.9062543141808, parameters k is 61.50528751586666 and b is 57.900815712117954\n",
       "Iteration 2, the loss is 421.86575768479844, parameters k is 61.49900288147931 and b is 57.899815712117956\n",
       "Iteration 3, the loss is 421.8252610554156, parameters k is 61.492718247091965 and b is 57.89881571211796\n",
       "Iteration 4, the loss is 421.78476442603295, parameters k is 61.486433612704616 and b is 57.89781571211796\n",
       "...\n",
      "loss_ /n 414.90033743098013\n",
      "Iteration 174, the loss is 414.90033743098013, parameters k is 60.418045766855315 and b is 57.72781571211836\n",
      "loss_ /n 414.85984080159744\n",
      "Iteration 175, the loss is 414.85984080159744, parameters k is 60.411761132467966 and b is 57.72681571211836\n",
      "loss_ /n 414.819344172215\n",
      "Iteration 176, the loss is 414.819344172215, parameters k is 60.40547649808062 and b is 57.72581571211836\n",
      "loss_ /n 414.77884754283207\n",
      "Iteration 177, the loss is 414.77884754283207, parameters k is 60.39919186369327 and b is 57.724815712118364\n",
      "loss_ /n 414.73835091344915\n",
      "Iteration 178, the loss is 414.73835091344915, parameters k is 60.39290722930592 and b is 57.723815712118366\n",
      "loss_ /n 414.69785428406675\n",
      "Iteration 179, the loss is 414.69785428406675, parameters k is 60.38662259491857 and b is 57.72281571211837\n",
      "loss_ /n 414.65735765468395\n",
      "Iteration 180, the loss is 414.65735765468395, parameters k is 60.38033796053122 and b is 57.72181571211837\n",
      "loss_ /n 414.61686102530126\n",
      "Iteration 181, the loss is 414.61686102530126, parameters k is 60.37405332614387 and b is 57.72081571211837\n",
      "loss_ /n 414.5763643959189\n",
      "Iteration 182, the loss is 414.5763643959189, parameters k is 60.367768691756524 and b is 57.719815712118375\n",
      "loss_ /n 414.5358677665364\n",
      "Iteration 183, the loss is 414.5358677665364, parameters k is 60.361484057369175 and b is 57.71881571211838\n",
      "loss_ /n 414.49537113715337\n",
      "Iteration 184, the loss is 414.49537113715337, parameters k is 60.355199422981826 and b is 57.71781571211838\n",
      "loss_ /n 414.45487450777057\n",
      "Iteration 185, the loss is 414.45487450777057, parameters k is 60.34891478859448 and b is 57.71681571211838\n",
      "loss_ /n 414.41437787838805\n",
      "Iteration 186, the loss is 414.41437787838805, parameters k is 60.34263015420713 and b is 57.715815712118385\n",
      "loss_ /n 414.3738812490059\n",
      "Iteration 187, the loss is 414.3738812490059, parameters k is 60.33634551981978 and b is 57.71481571211839\n",
      "loss_ /n 414.33338461962273\n",
      "Iteration 188, the loss is 414.33338461962273, parameters k is 60.33006088543243 and b is 57.71381571211839\n",
      "loss_ /n 414.29288799023993\n",
      "Iteration 189, the loss is 414.29288799023993, parameters k is 60.32377625104508 and b is 57.71281571211839\n",
      "loss_ /n 414.25239136085804\n",
      "Iteration 190, the loss is 414.25239136085804, parameters k is 60.31749161665773 and b is 57.711815712118394\n",
      "loss_ /n 414.2118947314751\n",
      "Iteration 191, the loss is 414.2118947314751, parameters k is 60.311206982270384 and b is 57.7108157121184\n",
      "loss_ /n 414.1713981020924\n",
      "Iteration 192, the loss is 414.1713981020924, parameters k is 60.304922347883036 and b is 57.7098157121184\n",
      "loss_ /n 414.1309014727099\n",
      "Iteration 193, the loss is 414.1309014727099, parameters k is 60.29863771349569 and b is 57.7088157121184\n",
      "loss_ /n 414.09040484332695\n",
      "Iteration 194, the loss is 414.09040484332695, parameters k is 60.29235307910834 and b is 57.7078157121184\n",
      "loss_ /n 414.049908213944\n",
      "Iteration 195, the loss is 414.049908213944, parameters k is 60.28606844472099 and b is 57.706815712118406\n",
      "loss_ /n 414.009411584562\n",
      "Iteration 196, the loss is 414.009411584562, parameters k is 60.27978381033364 and b is 57.70581571211841\n",
      "loss_ /n 413.9689149551791\n",
      "Iteration 197, the loss is 413.9689149551791, parameters k is 60.27349917594629 and b is 57.70481571211841\n",
      "loss_ /n 413.92841832579654\n",
      "Iteration 198, the loss is 413.92841832579654, parameters k is 60.26721454155894 and b is 57.70381571211841\n",
      "loss_ /n 413.88792169641357\n",
      "Iteration 199, the loss is 413.88792169641357, parameters k is 60.260929907171594 and b is 57.702815712118415\n"
     ]
    }
   ],
   "source": [
    "# Initialize parameters randomly in [-100, 100)\n",
    "k = random.random() * 200 - 100\n",
    "b = random.random() * 200 - 100\n",
    "learning_rate = 1e-3\n",
    "iteration_num = 200\n",
    "losses = []\n",
    "for i in range(iteration_num):\n",
    "    # Predict with the current parameters and record the loss\n",
    "    price_use_current_parameters = [price(r, k, b) for r in X_rm]\n",
    "    current_loss = loss(y, price_use_current_parameters)\n",
    "    losses.append(current_loss)\n",
    "    print(\"Iteration {}, the loss is {}, parameters k is {} and b is {}\".format(i, current_loss, k, b))\n",
    "    # Gradient descent: step each parameter against its partial derivative\n",
    "    k_gradient = partial_derivative_k(X_rm, y, price_use_current_parameters)\n",
    "    b_gradient = partial_derivative_b(y, price_use_current_parameters)\n",
    "    k = k - k_gradient * learning_rate\n",
    "    b = b - b_gradient * learning_rate\n",
    "best_k = k\n",
    "best_b = b"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Issue: the losses initially failed to converge\n",
    "Resolved: the formula in the loss computation had been written incorrectly."
   ]
  },
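  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The training loop above relies on `loss`, `partial_derivative_k`, and `partial_derivative_b`, which are defined in an earlier cell. For reference, a minimal sketch of what they might look like once the loss is switched to the absolute value (MAE) is shown below; the exact definitions used in this notebook may differ.\n",
    "```python\n",
    "def loss(y, y_hat):\n",
    "    # Mean absolute error: L = (1/n) * sum(|y_i - y_hat_i|)\n",
    "    return sum(abs(y_i - y_hat_i) for y_i, y_hat_i in zip(y, y_hat)) / len(y)\n",
    "\n",
    "def partial_derivative_k(x, y, y_hat):\n",
    "    # dL/dk = (1/n) * sum(-x_i * sign(y_i - y_hat_i))\n",
    "    return sum(-x_i * (1 if y_i >= y_hat_i else -1)\n",
    "               for x_i, y_i, y_hat_i in zip(x, y, y_hat)) / len(y)\n",
    "\n",
    "def partial_derivative_b(y, y_hat):\n",
    "    # dL/db = (1/n) * sum(-sign(y_i - y_hat_i))\n",
    "    return sum(-(1 if y_i >= y_hat_i else -1)\n",
    "               for y_i, y_hat_i in zip(y, y_hat)) / len(y)\n",
    "```"
   ]
  },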
  {
   "cell_type": "code",
   "execution_count": 152,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "[<matplotlib.lines.Line2D at 0x1a190e8940>]"
      ]
     },
     "execution_count": 152,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXoAAAD8CAYAAAB5Pm/hAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4zLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvnQurowAAIABJREFUeJzt3Xd4VHX6/vH3kxB6l4BA6IgI0ocOibp0EBRRsYCrIjakZHd12V1dd/W3a9kNRUVE7KJYsCDSZTehQ0LovYQuBFAQQern90eG75VFQiYhyZlM7td15TJz8pmZ25PJzcnJzDPmnENEREJXmNcBREQkd6noRURCnIpeRCTEqehFREKcil5EJMSp6EVEQpyKXkQkxKnoRURCnIpeRCTEFfI6AECFChVczZo1vY4hIpKvJCUlHXLORWa2LiiKvmbNmiQmJnodQ0QkXzGznYGs06kbEZEQp6IXEQlxKnoRkRAXcNGbWbiZJZvZNP/lSWa2yczWmtnbZhbh336Pma32fywysya5FV5ERDKXlSP6YcCGdJcnAfWBRkAxYJB/+w4gxjnXGHgOmJADOUVEJJsCKnoziwJ6AhMvbHPOTXd+wDIgyr99kXPuB/+yJRe2i4iINwI9oh8NPAmcv/gL/lM2A4CZl7jeg8CMbKcTEZErlmnRm1kv4KBzLimDJeOABOfc/IuudyNpRf9UBrc72MwSzSwxNTU1i7HTHD5+ir9/s55jv5zJ1vVFRAqCQI7o2wO9zSwFmAzcZGYfApjZX4FIIDb9FcysMWmnefo45w5f6kadcxOccz7nnC8yMtMXdl3Swm2HeXfRDjrHxTN3/YFs3YaISKjLtOidcyOdc1HOuZpAf2Cec+5eMxsEdAXucs793ykdM6sOfAEMcM5tzqXcAPRuUoUvH2tPueKFGfR+IkM/Tubw8VO5eZciIvnOlTyPfjxQCVhsZivN7Bn/9meAq4Bx/u25OtugSbWyTB3SgRGd6jFj7X46xcXz9cq9pP2NWERELBgK0efzuZyYdbP5wE88+flqVu7+kZvqV+T5W66nStliOZBQRCT4mFmSc86X2bqQemVsvUqlmPJoO/7S8zoWbTtEl1EJTFq6k/Pnvf/HTETEKyFV9ADhYcagjrWZPTyGxlFl+POXa7nrzSXsOPSz19FERDwRckV/QfWrijNpUGtevK0R6/cfo9voBCYkbOPsuV+9FEBEJKSFbNEDmBl3tqzO3NgYoutF8o/pG+n7+iI27D/mdTQRkTwT0kV/QaXSRZkwoAWv3t2MvT+c5OZXFhA3exOnzp7zOpqISK4rEEUPaUf3vRpXYW5sDDc3qcLYeVvpNXYBK3b9kPmVRUTysQJT9BeUK1GYUXc25Z3ftuT4qbPc9voi/v7Nek6cPut1NBGRXFHgiv6CG+tXZPaIaO5pXZ23F+6g6+gEFm495HUsEZEcV2CLHqBU0Qiev6URnwxuQ6GwMO6ZuJSnPl/N0ZMakiYioaNAF/0FrWtfxYxhHXkkpg6fr9hD57h4Zq373utYIiI5QkXvVzQinD92r89Xj7XnqpJFePiDJB6ftILUnzQkTUTyNxX9RRpFlWHqkPb8vks95qw/QOdR8XyxYo+GpIlIvqWiv4SI8DCG3HQN04d1oHaFEsR+uor7313O3h9Peh1NRCTLVPSXUbdiKT57pB1/vbkBS7cfoUtcPB8sTtGQNBHJV1T0mQgPM+5vX4vZI6JpXqMcT3+9jv4TlrA99bjX0UREAqKiD1C18sV5/4FWvNyvMRu/P0a3MfN5/b8akiYiwS/gojezcDNLNrNp/suTzGyTma01s7fNLMK/vb6ZLTazU2b2+9wK7gUz43ZfNebGxnDjtZG8OHMjt4xbyLp9R72OJiKSoawc0Q8DNqS7PAmoDzQCigGD/NuPAEOBf+VEwGBUsXRR3hjg4/V7mvP90VP0fnUhL8/ayC9nNCRNRIJP
QEVvZlFAT2DihW3OuenOD1gGRPm3H3TOLQdC/uWl3RtVZm5sNLc0rcpr/9lGz7HzSdp5xOtYIiL/I9Aj+tHAk8CvTkj7T9kMAGZm5Y7NbLCZJZpZYmpqalauGlTKFi/Mv+9ownsPtOKXM+fpN34xz05dx8+nNCRNRIJDpkVvZr2Ag865pAyWjAMSnHPzs3LHzrkJzjmfc84XGRmZlasGpZh6kcwaEc3ANjV4b3EKXUYlkLA5//4DJiKhI5Aj+vZAbzNLASYDN5nZhwBm9lcgEojNtYT5SMkihfhbn+v59OG2FIkIY+Dby/j9Z6v48cRpr6OJSAGWadE750Y656KcczWB/sA859y9ZjYI6Arc5ZzTcwzTaVmzPNOHduSxG+rwZfJeOsUlMGPNfq9jiUgBdSXPox8PVAIWm9lKM3sGwMyuNrM9pB3l/8XM9phZ6RzImq8UjQjnyW71+frx9lQsVYRHJ63g0Q+TOPjTL15HE5ECxoJhWJfP53OJiYlex8g1Z86dZ0LCdsZ8t4ViEeH8ped19GsRhZl5HU1E8jEzS3LO+TJbp1fG5oGI8DAev7Eu04d25JqKJfnD56sZ+PYydh854XU0ESkAVPR5qG7Fknz6cFv+3qchK3b+QNfRCby7cIeGpIlIrlLR57GwMGNg25rMGhGNr2Z5nv1mPXe8sZitBzUkTURyh4reI1HlivPe/S359+1N2HLwOD3GzOe1/2zljIakiUgOU9F7yMy4rUUUc2Nj6NSgIi/P2kSfVxeydq+GpIlIzlHRB4HIUkUYd08Lxt/bgtTjp+jz2kJenKkhaSKSM1T0QaTb9Vczd0QMtzWvyuv/3UaPMfNZnqIhaSJyZVT0QaZM8Qhe6teEDx9szelz57l9/GKe+XotxzUkTUSySUUfpDpcU4FZw6O5v31NPliyky5x8fxn00GvY4lIPqSiD2IlihTirzc35PNH2lG8SCHuf2c5sZ+s5IefNSRNRAKnos8HWtQox7dDO/DETXWZumofnUfF8+3q/QTD+AoRCX4q+nyiSKFwftflWqYO6UDlMsV4/KMVPPxBEgePaUiaiFyeij6faVClNF8+1o6R3esTvzmV38TF8+ny3Tq6F5EMqejzoULhYTwcU4cZwzpyXeXSPDllNQPeWsauwxqSJiK/pqLPx2pHlmTyQ214/pbrWbn7R7qOTuCtBTs4pyFpIpJOwEVvZuFmlmxm0/yXJ5nZJjNba2Zv+98kHEsz1sy2mtlqM2ueW+ElbUjavW1qMHtENK1rl+e5aevpN34RWw785HU0EQkSWTmiHwZsSHd5ElAfaAQUAwb5t3cHrvF/DAZev/KYkpkqZYvxzm9bMvrOpqQc+pmeYxcw9rstnD6rIWkiBV1ARW9mUUBPYOKFbc656c4PWAZE+b/UB3jf/6UlQFkzq5zDueUSzIxbmlVlTmwMXa+/mrg5m+n96gJW7/nR62gi4qFAj+hHA08Cvzo89J+yGQDM9G+qCuxOt2SPf5vkkQoli/DKXc14c6CPH06c5pbXFvLP6Rs0JE2kgMq06M2sF3DQOZeUwZJxQIJzbv6Fq1xiza/+Omhmg80s0cwSU1NTAw4sgevcoBKzR8RwZ8tqvJGwnW6jE1iy/bDXsUQkjwVyRN8e6G1mKcBk4CYz+xDAzP4KRAKx6dbvAaqluxwF7Lv4Rp1zE5xzPuecLzIyMpvxJTNlikXwz76N+WhQa8476D9hCX/+cg0//XLG62gikkcyLXrn3EjnXJRzribQH5jnnLvXzAYBXYG7nHPpT+lMBQb6n33TBjjqnNufG+ElcO3qVmDm8I4M6lCLj5ftosuoBOZtPOB1LBHJA1fyPPrxQCVgsZmtNLNn/NunA9uBrcCbwGNXFlFySvHChfhLrwZMebQdpYoW4oF3Exk+OZkjGpImEtIsGF467/P5XGJiotcxCpTTZ8/z2n+2Mu6/WylVNIJnezfk5saVMbvUn1hEJBiZWZJzzpfZOr0ytoAqXCiM
EZ3r8c0THahWrhhDP07mofeT+P6ohqSJhBoVfQFX/+rSfPFYe/7c4zoWbE2lc1w8Hy/bpSFpIiFERS+EhxkPRddm5rBoGlYtzcgv1nD3m0vZefhnr6OJSA5Q0cv/qVmhBB8NasM/bm3E2r1H6To6gYnzt2tImkg+p6KX/xEWZtzdujqzY6NpX6cCz3+7gb6vL2LT9xqSJpJfqejlkiqXKcbE+3yMvasZu4+coNcr8xk9d7OGpInkQyp6yZCZ0btJFebGxtCjUWVGz93Cza8sYOVuDUkTyU9U9JKp8iUKM6Z/M966z8fRk2foO24hz09bz8nTGpImkh+o6CVgv7muErNjo+nfqjoTF+yg6+gEFm075HUsEcmEil6ypHTRCP5xayM+fqgNYQZ3v7mUkV+s5piGpIkELRW9ZEvbOlcxY1g0D0fX5pPlu+kcF8/c9RqSJhKMVPSSbcUKhzOyx3V89Xh7yhUvzKD3E3ni42QOHz/ldTQRSUdFL1escVRZpg7pQGznesxcu59OcfF8lbxXYxREgoSKXnJE4UJhDP3NNXw7tCM1rirB8E9W8uB7iez78aTX0UQKPBW95Kh6lUox5dF2PN2rAYu3HabLqAQ+XLKT8xqjIOIZFb3kuPAw48EOtZg1PJom1crwl6/WctebS9hxSEPSRLwQcNGbWbiZJZvZNP/lIWa21cycmVVIt66cmX1pZqvNbJmZXZ8bwSX4Vb+qOB8+2JqXbmvM+v3H6DY6gTfit3H2nMYoiOSlrBzRDwM2pLu8EOgE7Lxo3Z+Alc65xsBAYMwVJZR8zcy4o2U15sbGEF0vkn/O2Ejf1xexYf8xr6OJFBgBFb2ZRQE9gYkXtjnnkp1zKZdY3gD4zr9mI1DTzCpdeVTJzyqVLsqEAS147e7m7PvxJDe/soC42Zs4dVZjFERyW6BH9KOBJ4FAfudeBfQFMLNWQA0gKlvpJKSYGT0bV2bOiBh6N6nC2Hlb6Tl2AUk7f/A6mkhIy7TozawXcNA5lxTgbb4AlDOzlcATQDJw9hK3O9jMEs0sMTU1NSuZJZ8rV6IwcXc25Z37W3Li1Fn6jV/E375Zx4nTv3qYiEgOsMxe1GJm/wQGkFbWRYHSwBfOuXv9X08BfM65X023MjMDdgCNnXMZnpT1+XwuMTExu/8Pko8dP3WWl2Zu5P3FO4kqV4wX+jamwzUVMr+iiGBmSc45X2brMj2id86NdM5FOedqAv2BeRdKPoM7Lmtmhf0XBwEJlyt5KdhKFinE3/tcz6cPtyUiPIx731rKk5+v4uhJDUkTySnZfh69mQ01sz2knX9fbWYX/lB7HbDOzDYC3Ul7to7IZbWqVZ4Zwzry6A11mLJiL53j4pm17nuvY4mEhExP3eQFnbqR9NbsOcqTU1azYf8xejaqzLO9GxJZqojXsUSCTo6duhHJa42iyjB1SHv+0PVa5qw/QKe4eKYk7dGQNJFsUtFLUIoID+PxG+syfVgH6lYsye8+W8Vv31nOXg1JE8kyFb0EtboVS/HZw2159uYGLE85Qpe4eN5fnKIhaSJZoKKXoBcWZvy2fdqQtOY1yvHM1+u4c8JitqUe9zqaSL6gopd8o1r54rz/QCte7teYTd//RPcx8xn3362c0ZA0kctS0Uu+Ymbc7qvG3N/FcNO1FXlp5iZueW0ha/ce9TqaSNBS0Uu+VLFUUcYPaMHr9zTnwLFT9HltIS/P2sgvZzQkTeRiKnrJ17o3qszc2GhubVaV1/6zjR5j55OYcsTrWCJBRUUv+V7Z4oX51+1NeP+BVpw6c57b31jMs1PX8fMpDUkTARW9hJDoepHMHhHNfW1r8t7iFLqMSiBhsyajiqjoJaSUKFKIZ3s35LOH21IkIoyBby/j95+t4scTp72OJuIZFb2EJF/N8kwf2pHHb6zDl8l76RSXwIw1+72OJeIJFb2ErKIR4fyha32mDmlPpdJFeHTSCh75IImDx37xOppInlLRS8hrWKUMXz/enqe6
1WfepoN0iovns8TdGpImBYaKXgqEQuFhPHpDHWYM68i1V5fiD5+vZuDby9h95ITX0URyXcBFb2bhZpZsZtP8l4eY2VYzc2ZWId26Mmb2jZmtMrN1ZnZ/bgQXyY46kSX5ZHBbnuvTkBU7f6Dr6ATeXbhDQ9IkpGXliH4YsCHd5YVAJ2DnReseB9Y755oANwD/TvfWgiKeCwszBrStyawR0bSsWZ5nv1nP7W8sZuvBn7yOJpIrAip6M4sCegIX3i4Q51yycy7lEssdUMr/xuAlgSOkvbG4SFCJKlecd+9vSdwdTdiWepweYxbw6rwtGpImISfQI/rRwJNAID8Br5L2vrH7gDXAMOecfnIkKJkZfZtHMWdEDJ0bVuJfszfT+1UNSZPQkmnRm1kv4KBzLinA2+wKrASqAE2BV82s9CVud7CZJZpZYmqqXr0o3oosVYTX7m7OGwNacOh42pC0F2ZoSJqEhkCO6NsDvc0sBZgM3GRmH15m/f3AFy7NVmAHUP/iRc65Cc45n3POFxkZmY3oIjmva8OrmTsihn7Noxgfv40eY+azbIeGpEn+lmnRO+dGOueinHM1gf7APOfcvZe5yi7gNwBmVgm4FtieA1lF8kSZ4hG82K8xHz7YmtPnznPHG4t5+qu1/PTLGa+jiWRLtp9Hb2ZDzWwPEAWsNrMLf6h9DmhnZmuA74CnnHOHrjyqSN7qcE0FZo+I5oH2tfhw6U66jkrgP5sOeh1LJMssGF4d6PP5XGJiotcxRDKUtPMH/jhlNVsOHqdvs6o83asB5UroWcPiLTNLcs75MlunV8aKBKBFjXJMG9qBoTfVZeqqfXSKi2fa6n0aoyD5gopeJEBFCoUT2+VavnmiA1XKFmPIR8k8/EESBzQkTYKcil4ki66rXJovH2vHyO71id+cSqe4eD5ZvktH9xK0VPQi2VAoPIyHY+owc3g011UuzVNT1nDvW0vZdVhD0iT4qOhFrkCtCiWY/FAbnr/lelbtPkrX0Qm8tWAH5zQkTYKIil7kCoWFGfe2qcHsEdG0rXMVz01bz22vL2LzAQ1Jk+CgohfJIVXKFuOt+3yM6d+UnYd/pufY+Yz9bgunz2rUk3hLRS+Sg8yMPk2rMjc2hm7XVyZuzmZ6v7qAVbt/9DqaFGAqepFccFXJIrxyVzPeHOjjhxOnuXXcQv45fQMnT2tImuQ9Fb1ILurcoBJzYmO4s2U13kjYTvcxCSzedtjrWFLAqOhFclnpohH8s29jPhrUmvMO7npzCX/6cg3HNCRN8oiKXiSPtKtbgVnDo3moYy0mL9tFl7gE5m084HUsKQBU9CJ5qFjhcP7cswFfPNaeMsUieODdRIZNTubw8VNeR5MQpqIX8UDTamX55okODO90DdPX7KfzqASmrtKQNMkdKnoRjxQuFMbwTvWY9kRHqpUvztCPk3no/US+P6ohaZKzVPQiHrv26lJ88Wg7/tLzOhZsPUTnuHg+WrqL8xqjIDkk4KI3s3AzSzazaf7LQ8xsq5k5M6uQbt0fzGyl/2OtmZ0zs/K5EV4kVISHGYM61mbW8Giur1qGP325hrsnLiHl0M9eR5MQkJUj+mHAhnSXFwKdgJ3pFznnXnbONXXONQVGAvHOOb27skgAalxVgo8eas0LfRuxbu8xuo1J4M2E7RqSJlckoKI3syigJ3DhfWFxziU751IyuepdwMfZTidSAJkZ/VtVZ05sDB3qVuD/Td9A33EL2fS9hqRJ9gR6RD8aeBIIeDqTmRUHugFTspFLpMC7ukxR3hzo45W7mrHnh5P0emU+o+Zs1pA0ybJMi97MegEHnXNJWbztm4GFGZ22MbPBZpZoZompqalZvGmRgsHMuLlJFebExtCzUWXGfLeFXq/MJ3nXD15Hk3wkkCP69kBvM0sBJgM3mdmHAVyvP5c5beOcm+Cc8znnfJGRkQGFFSmoypcozOj+zXj7tz5++uUsfV9fxHPT1nPi9Fmvo0k+kGnRO+dGOueinHM1
SSvvec65ey93HTMrA8QAX+dIShEB4Kb6lZg9Ipp7WlfnrQU76DZ6Pou2HvI6lgS5bD+P3syGmtkeIApYbWYT0335VmC2c07PDRPJYaWKRvD8LY2YPLgNYQZ3T1zKH6es5uhJDUmTS7NgeMm1z+dziYmJXscQyXd+OXOOUXM382bCdiJLFeH5WxrRuUElr2NJHjGzJOecL7N1emWsSD5WNCKckd2v46vH21OueGEeej+RIR+t4JCGpEk6KnqRENA4qixTh3Tgd53rMXvdATrHxfNV8l4NSRNARS8SMgoXCuOJ31zDt0M7ULNCCYZ/spIH3l3Ovh9Peh1NPKaiFwkx11QqxeePtOOZXg1Ysv0IXUYl8MGSnRqSVoCp6EVCUHiY8UCHWsweEU3TamV5+qu19H9zCTs0JK1AUtGLhLBq5YvzwYOteOm2xmzYf4xuoxMYH7+Ns+c0RqEgUdGLhDgz446W1ZgbG0NMvUhemLGRW8ctYv2+Y15HkzyiohcpICqVLsobA1rw2t3N2X/0JL1fXcC/Z2/i1NlzXkeTXKaiFylAzIyejSszZ0QMvZtW4ZV5W+k5dgFJOzUkLZSp6EUKoHIlChN3R1Pevb8lJ0+fo9/4Rfztm3X8fEpD0kKRil6kALvh2orMGhHNgDY1eGdhCl1HJzB/i8aGhxoVvUgBV7JIIf7e53o+fbgthcPDGPDWMp78fBVHT2hIWqhQ0YsIAK1qlWf6sI48ekMdpqzYS6dR8cxc+73XsSQHqOhF5P8UjQjnqW71+frx9kSWLMIjHybx+KQVpP6kIWn5mYpeRH7l+qpl+HpIe/7Q9VrmbDhAp7h4piTt0ZC0fCrgojezcDNLNrNp/stDzGyrmTkzq3DR2hvMbKWZrTOz+JwOLSK5LyI8jMdvrMv0oR2pW7Ekv/tsFfe9s5w9P5zwOppkUVaO6IcBG9JdXgh0AnamX2RmZYFxQG/nXEPg9isNKSLeqVuxJJ893Ja/9W5IYsoRuo5K4P3FKRqSlo8EVPRmFgX0BP7v7QKdc8nOuZRLLL8b+MI5t8u/7mAO5BQRD4WFGfe1q8ms4dE0r1GOZ75ex50TFrMt9bjX0SQAgR7RjwaeBAKZhFQPKGdm/zWzJDMbmO10IhJUqpUvzvsPtOJftzdh84HjdB8zn3H/3coZDUkLapkWvZn1Ag4655ICvM1CQAvSfgPoCjxtZvUucbuDzSzRzBJTU/UCDZH8wszo1yKKObHRdLquIi/N3MQtry1k7d6jXkeTDARyRN8e6G1mKcBk4CYz+/Ay6/cAM51zPzvnDgEJQJOLFznnJjjnfM45X2RkZDaii4iXKpYqyrh7WjD+3uYcOHaKPq8t5KWZG/nljIakBZtMi945N9I5F+Wcqwn0B+Y55+69zFW+BjqaWSEzKw605n//iCsiIaTb9ZX5LjaGvs2qMu6/2+gxdj6JKUe8jiXpZPt59GY21Mz2AFHAajObCOCc2wDMBFYDy4CJzrm1ORFWRIJTmeIRvHx7E95/oBWnzpzn9jcW89ev13JcQ9KCggXDCyB8Pp9LTEz0OoaI5ICfT53l5VmbeG9xClXKFOMffRsRU0+nZ3ODmSU553yZrdMrY0UkR5UoUohnezfk80faUjQijPveXsbvPl3FjydOex2twFLRi0iuaFGjPN8O7ciQG+vy9cq9dIqLZ/qa/V7HKpBU9CKSa4pGhPP7rtfy9ZD2XF2mKI9NWsEjHyRx8NgvXkcrUFT0IpLrGlYpw1ePteepbvWZt+kgneLi+TRxt4ak5REVvYjkiULhYTx6Qx1mDutI/atL8+Tnqxn49jJ2H9GQtNymoheRPFU7siSTB7fhuT4NWbHzB7qOTuCdhTs4pyFpuUZFLyJ5LizMGNC2JrNjY2hVqzx/+2Y9t49fxNaDP3kdLSSp6EXEM1XLFuOd37Zk1J1N2H7oZ3qMWcCr87ZoSFoOU9GLiKfMjFubRTE3NobODSvxr9mbufmVBazZoyFpOUVF
LyJBoULJIrx2d3PeGNCCIz+f5pZxC3lhhoak5QQVvYgEla4Nr2ZObAz9mkcxPn4b3cfMZ+n2w17HytdU9CISdMoUi+DFfo2ZNKg1Z8+f584JS3j6q7X89MsZr6PlSyp6EQla7etWYNbwaB7sUIsPl+6k66gE/rNR706aVSp6EQlqxQsX4uleDZjyaDtKFCnE/e8uZ8QnKznys4akBUpFLyL5QvPq5Zg2tANDf3MN36zaR+e4eKat3qcxCgFQ0YtIvlGkUDixnevxzRMdqFquGEM+SmbwB0kc0JC0ywq46M0s3MySzWya//IQM9tqZs7MKqRbd4OZHTWzlf6PZ3IjuIgUXNdVLs0Xj7bjTz3qk7A5lU5x8XyyfJeO7jOQlSP6Yfzve78uBDoBOy+xdr5zrqn/4+9XElBE5FIKhYcxOLoOs4ZH06ByaZ6asoZ7Ji5l12ENSbtYQEVvZlFAT2DihW3OuWTnXEou5RIRCUjNCiX4+KE2/OPWRqzec5Quo+OZOH+7hqSlE+gR/WjgSSDQARRtzWyVmc0ws4bZiyYiEpiwMOPu1tWZExtNuzoVeP7bDdz2+iI2H9CQNAig6M2sF3DQOZcU4G2uAGo455oArwBfZXC7g80s0cwSU1NTAw4sIpKRymWK8dZ9Psb0b8quIyfoOXY+Y+Zu4fTZgj0kzTL744WZ/RMYAJwFigKlgS+cc/f6v54C+JxzhzK4/mW/DuDz+VxiYmJ28ouIXNLh46f42zfrmbpqH/WvLsWLtzWmSbWyXsfKUWaW5JzzZbYu0yN659xI51yUc64m0B+Yd6HkM7jjq83M/J+38t+HBlWISJ66qmQRxt7VjIkDffx44gy3jlvIP6Zv4OTpgjckLdvPozezoWa2B4gCVpvZhT/U9gPWmtkqYCzQ3+k5TyLikU4NKjE7Npr+raozIWE73cYksHhbwTr2zPTUTV7QqRsRyQuLth1i5Bdr2Hn4BHe1qs7IHvUpXTTC61jZlmOnbkREQkW7OhWYOSyawdG1+WT5LrrEJfDdhgNex8p1KnoRKVCKFQ7nTz2u44vH2lOmWAQPvpfI0I+TOXz8lNfRco2KXkQKpKbVyvLNEx0Y0akeM9bup/OoBL5euTckxyio6EWkwCpcKIxVndx+AAAIi0lEQVRhna7h26EdqV6+OMMmr2TQe4nsP3rS62g5SkUvIgVevUqlmPJoO/7S8zoWbjtEl7gEPlq6i/MhMkZBRS8iAoSHGYM61mb28BgaRZXhT1+u4e6JS0g59LPX0a6Yil5EJJ3qVxVn0qDWvNC3Eev2HqPr6AQmJGzj7Ln8O0ZBRS8ichEzo3+r6syJjaHjNZH8Y/pGbnt9ERu/P+Z1tGxR0YuIZODqMkV5c2ALXrmrGXt+OEmvsQuIm7OZU2fz1xgFFb2IyGWYGTc3qcKc2BhublKFsd9t4eZXFpC86wevowVMRS8iEoDyJQoz6s6mvPPblvz0y1n6vr6I56at58Tps15Hy5SKXkQkC26sX5HZI6K5p3V13lqwg66jE1i4NcMp7EFBRS8ikkWlikbw/C2N+GRwGwqFhXHPxKX8ccpqjp4843W0S1LRi4hkU+vaVzFjWEcejqnNp4m76RwXz+x133sd61dU9CIiV6BoRDgju1/HV4+3p3yJwgz+IIkhH63gUBANSVPRi4jkgMZRaUPSft+lHrPXHaBTXDxfJu8JiiFpARe9mYWbWbKZTfNfHmJmW83MmVmFS6xvaWbnzKxfTgYWEQlWEeFhDLnpGqYP60DtCiUY8ckq7n93OXt/9HZIWlaO6IcBG9JdXgh0AnZevNDMwoEXgVlXlE5EJB+qW7EUnz3Sjr/e3ICl24/QJS6eD5bs9GxIWkBFb2ZRQE/gwvvC4pxLds6lZHCVJ4ApwMErDSgikh+Fhxn3t6/F7BHRNKtejqe/Wkv/CUvYnno8z7MEekQ/GngSyHSqj5lVBW4FxmeybrCZJZpZYmpqaoAxRETyl2rli/PBg614
qV9jNn5/jO5j5jM+Pm+HpGVa9GbWCzjonEsK8DZHA0855y47DMI5N8E553PO+SIjIwO8aRGR/MfMuMNXjbmxMdxwbSQvzNjILeMWsn5f3gxJC+SIvj3Q28xSgMnATWb24WXW+4DJ/vX9gHFmdsuVBhURye8qli7KGwN8vH5Pc74/eorery7grQU7cv1+C2W2wDk3EhgJYGY3AL93zt17mfW1LnxuZu8C05xzX11xUhGRENG9UWXa1rmK56ZtoEb54rl+f9l+Hr2ZDTWzPUAUsNrMJmZ2HRERSVO2eGH+fUcTOjWolOv3ZcHwZH6fz+cSExO9jiEikq+YWZJzzpfZOr0yVkQkxKnoRURCnIpeRCTEqehFREKcil5EJMSp6EVEQpyKXkQkxAXF8+jNLJVLjDsOUAUgWN+ZN1izKVfWBGsuCN5sypU12c1VwzmX6bCwoCj6K2FmiYG8YMALwZpNubImWHNB8GZTrqzJ7Vw6dSMiEuJU9CIiIS4Uin6C1wEuI1izKVfWBGsuCN5sypU1uZor35+jFxGRywuFI3oREbmMfF30ZtbNzDaZ2VYz+6OHOaqZ2X/MbIOZrTOzYf7tz5rZXjNb6f/o4UG2FDNb47//RP+28mY2x8y2+P9bzoNc16bbLyvN7JiZDfdin5nZ22Z20MzWptt2yX1kacb6H3Orzax5Hud62cw2+u/7SzMr699e08xOpttvl33P5lzIleH3zcxG+vfXJjPrmlu5LpPtk3S5UsxspX97Xu6zjDoibx5nzrl8+QGEA9uA2kBhYBXQwKMslYHm/s9LAZuBBsCzpL0jl5f7KQWocNG2l4A/+j//I/BiEHwvvwdqeLHPgGigObA2s30E9ABmAAa0AZbmca4uQCH/5y+my1Uz/ToP9tclv2/+n4NVQBGglv9nNjwvs1309X8Dz3iwzzLqiDx5nOXnI/pWwFbn3Hbn3GnS3s+2jxdBnHP7nXMr/J//BGwAqnqRJUB9gPf8n78HeP2evr8BtjnnsvuiuSvinEsAjly0OaN91Ad436VZApQ1s8p5lcs5N9s5d9Z/cQlp7/CWpzLYXxnpA0x2zp1yzu0AtpL2s5vn2czMgDuAj3Pr/jNymY7Ik8dZfi76qsDudJf3EATlamY1gWbAUv+mIf5fvd724hQJ4IDZZpZkZoP92yo55/ZD2gMQqOhBrvT6878/fF7vM8h4HwXT4+4B0o76LqhlZslmFm9mHT3Ic6nvWzDtr47AAefclnTb8nyfXdQRefI4y89Fb5fY5ulTiMysJDAFGO6cOwa8DtQBmgL7Sfu1Ma+1d841B7oDj5tZtAcZMmRmhYHewGf+TcGwzy4nKB53ZvZn4Cwwyb9pP1DdOdcMiAU+MrPSeRgpo+9bUOwvv7v43wOKPN9nl+iIDJdeYlu291t+Lvo9QLV0l6OAfR5lwcwiSPsGTnLOfQHgnDvgnDvnnDsPvEku/sqaEefcPv9/DwJf+jMcuPBroP+/B/M6VzrdgRXOuQMQHPvML6N95PnjzszuA3oB9zj/CV3/qZHD/s+TSDsXXi+vMl3m++b5/gIws0JAX+CTC9vyep9dqiPIo8dZfi765cA1ZlbLf1TYH5jqRRD/ub+3gA3Oubh029OfU7sVWHvxdXM5VwkzK3Xhc9L+kLeWtP10n3/ZfcDXeZnrIv9zlOX1Pksno300FRjof1ZEG+DohV+984KZdQOeAno7506k2x5pZuH+z2sD1wDb8zBXRt+3qUB/MytiZrX8uZblVa50OgEbnXN7LmzIy32WUUeQV4+zvPiLc259kPaX6c2k/Uv8Zw9zdCDt16rVwEr/Rw/gA2CNf/tUoHIe56pN2jMeVgHrLuwj4CrgO2CL/7/lPdpvxYHDQJl02/J8n5H2D81+4AxpR1IPZrSPSPuV+jX/Y24N4MvjXFtJO3d74XE23r/2Nv/3eBWwArg5j3Nl+H0D/uzfX5uA7nn9vfRvfxd45KK1ebnPMuqIPHmc6ZWxIiIh
Lj+fuhERkQCo6EVEQpyKXkQkxKnoRURCnIpeRCTEqehFREKcil5EJMSp6EVEQtz/B6ytteWn7lLZAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "plt.plot(list(range(iteration_num)), losses)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<Review points>\n",
    "+ Was the loss changed to the absolute value? (3')\n",
    "+ Were the partial derivatives redefined accordingly? (5')\n",
    "+ Does the new model's loss converge? (11')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
