{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "060b1a56",
   "metadata": {},
   "source": [
    "![avatar](images/line_regression.png)\n",
     "Many points in the plane are generated according to some rule. Observing the points in the figure above, we can see that they all lie along a straight line with equation y = w*x + b, where w is the slope of the line and b is the bias (intercept). Given the distribution of these points, how can we find the corresponding values of w and b?"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "aa67abef",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import matplotlib.pyplot as plt"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "76d5cf04",
   "metadata": {},
   "source": [
     "Next, we use numpy to generate some (x, y) points. Pretending we do not know how these points were generated, we set out to discover the underlying rule."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "4d55a053",
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "[0.5488135  0.71518937 0.60276338 0.54488318 0.4236548  0.64589411\n",
      " 0.43758721 0.891773   0.96366276 0.38344152 0.79172504 0.52889492\n",
      " 0.56804456 0.92559664 0.07103606 0.0871293  0.0202184  0.83261985\n",
      " 0.77815675 0.87001215 0.97861834 0.79915856 0.46147936 0.78052918\n",
      " 0.11827443 0.63992102 0.14335329 0.94466892 0.52184832 0.41466194\n",
      " 0.26455561 0.77423369 0.45615033 0.56843395 0.0187898  0.6176355\n",
      " 0.61209572 0.616934   0.94374808 0.6818203 ]\n",
      "[5.74406752 6.57594683 6.01381688 5.72441591 5.118274   6.22947057\n",
      " 5.18793606 7.458865   7.8183138  4.91720759 6.95862519 5.6444746\n",
      " 5.84022281 7.62798319 3.35518029 3.4356465  3.10109199 7.16309923\n",
      " 6.89078375 7.35006074 7.89309171 6.99579282 5.30739681 6.90264588\n",
      " 3.59137213 6.19960511 3.71676644 7.72334459 5.60924161 5.0733097\n",
      " 4.32277806 6.87116845 5.28075166 5.84216974 3.093949   6.08817749\n",
      " 6.06047861 6.08466998 7.71874039 6.4091015 ]\n"
     ]
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAWoAAAD4CAYAAADFAawfAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/YYfK9AAAACXBIWXMAAAsTAAALEwEAmpwYAAATa0lEQVR4nO3df2xd5X3H8c/XjqPG0BlI3KqF2RfGxjaVQckdY91Wpbjb+LGUFvEHm6EdmnRFrHVh/0BbS0XpZGmr9oeZtoRdoVJW7mATC13TDNYqFWNSB51dKIbSTinNdSndcELnDYxGcvPdH/Z1netzfc+9Pufe8+P9klDjc0/t79OET06f5/s8x9xdAIDk6ut1AQCAjRHUAJBwBDUAJBxBDQAJR1ADQMJtieOb7tixwwuFQhzfGgAyaXZ29ri7Dwd9FktQFwoFzczMxPGtASCTzKza7DOmPgAg4UIFtZn9sZm9YGbPm9lDZva2uAsDACxrGdRmdr6kP5JUdPf3SOqXdHPchQEAloWd+tgiaZuZbZE0KOmV+EoCAKzVMqjd/YeS/lzSvKQfSVp096803mdmJTObMbOZhYWF6CsFgJwKM/VxrqQbJF0o6d2SzjKzWxrvc/eyuxfdvTg8HNhhAgCZVJmrqDBdUN++PhWmC6rMVSL9/mGmPj4o6fvuvuDuJyUdlPS+SKsAgJSqzFVUOlRSdbEql6u6WFXpUCnSsA4T1POSrjKzQTMzSWOSXoysAgBIsckjk1o6uXTGtaWTS5o8MhnZz2i54cXdnzazRyR9U9IpSc9IKkdWAQCkSGWuoskjk5pfnNfI0Iiqi8H7VOYX5yP7maF2Jrr73ZLujuynAkAK1ac56k/Q1cWqTCbX+hewjAyNRPZz2ZkIACEFTXO4XCY749rgwKCmxqYi+7kENQCE1Gw6w+UaHRqVyTQ6NKry7rLGLx2P7OfGcigTAKTdxOEJlWfLqnlN/dav0s5S0znp0aFRHbvjWGy18EQNAA0mDk/owMwB1bwmSap5TQdmDuji8y7W4MDgGfdGPc0RhKAGgAbl2eDGtieOPaHy7nKs0xxBmPoAgAb1J+mg6+OXjscezI14ogaQa0Hbv/utP/DeZtfjRlADyK1m2793FXYF3l/aWepugSsIagC51Wz799HXjmpPcc/qE3S/9WtPcY/2X7+/F2XK3NfvqNmsYrHovDMRQNL17esL3FVoMp2++3RXazGzWXcvBn3GEzWA3Gq2zTvK7d9RIKgB5NbU2FRP+qLbRVADyK3xS8d70hfdLuaoAWRS43GkU2NTiQvgtTaao2bDC4DMCTqOtHRoubUuyWHdDFMfADKnG29d6SaCGkDmNDuONMq3rnQTQQ0gc9LSdhcWQQ0gtYLO6ZDS03YXFouJAFIpzIJhmro+NkJ7HoBUKkwXevK2lbiwhRxA5mRtwXAjBDWAVMraguFGCGoAqZS1BcONENQAUikt53REgcVEAImRtvM5osRZHwASL2vnc0SJqQ8AiZC18zmiRFADSIQ8tdu1i6AGkAh5ardrF0ENIBHy1G7XLoIaQCLkqd2uXS3b88zsEkl/t+bSRZI+7e7Tzf47tOcBqMtzy107NtWe5+7flXT5yjfql/RDSY9GWSCAbKLlLhrtTn2MSfqeu68/sgoA1qjMVfSxRz9Gy10E2g3qmyU9FPSBmZXMbMbMZhYWFjZfGYDUqj9J17wW+Dktd+0JvTPRzLZK+pCkTwZ97u5lSWVpeY46kuoApMbaueg+62sa0hItd+1qZwv5tZK+6e7/FVcxANKnMlfR3sf26sSbJ1avbRTStNy1r52g/l01mfYAkE+Ni4Wt9Fs/LXcdCDVHbWaDkn5T0sF4ywGQJkHnczQzODCoBz7yACHdgVBB7e5L7r7d3RfjLghAeoRZFGTzyuZxzCmAjo0MjQS+YLZu+7btOn7n8S5WlE1sIQfQsaDzOeoG+gZ0z7X3dLmibCKoAXRs7fkc
0vJioSSNDo3q/g/fz1RHRHgVFwAkwEZnffBEDQAJR1ADQMIR1ACQcAQ1ACQcQQ3kVGWuosJ0QX37+lSYLqgyV+l1SWiCDS9ADnGgf7rwRA3kUNAZHRzon1wENZBDzc7o4ED/ZCKogRxqdnA/B/onE0EN5FDQGR0c6J9cBDWQQ2vP6OAY0uTjrA8gQ9a+t3BkaERTY1OEb0psdNYH7XlARtByl11MfQAZQctddhHUQEbQcpddBDWQEbTcZRdBDWQELXfZRVADGUHLXXbRngcACcCruIAU4zhS0EcNJFRlrqK9j+3ViTdPrF6jNzqfeKIGEqi+eWVtSNfRG50/BDWQQEGbV9aiNzpfCGoggVoFMb3R+UJQAwm0URDTG50/BDXQI5W5inZ8dodsn8n2mXZ8dsdqR0fQ5hVJ2r5tO73ROUTXB9ADE4cndGDmwBnXTrx5Qrd98TZJP+no4MhSSGx4AbquMlfRrQdvlSv4373RoVEdu+NYd4tCz216w4uZnWNmj5jZd8zsRTP71WhLBLKvvnHlloO3NA1piY4OrBd26uMeSY+7+01mtlXS+skzAE01Huq/ETo60KhlUJvZT0l6v6TflyR3f0vSW/GWBWRLq77ouoG+ATo6sE6YqY+LJC1Iut/MnjGz+8zsrMabzKxkZjNmNrOwsBB5oUCahZnOOGvgLN3/4ftZMMQ6YYJ6i6QrJB1w9/dKekPSJxpvcveyuxfdvTg8PBxxmUC6bTSdMTo0qgdvfFCvf+p1QhqBwgT1y5JedvenV75+RMvBDSCkZof6P3jjgzp2xzECGhtqGdTu/p+SfmBml6xcGpP07VirAjKGQ/2xGaH6qM3sckn3Sdoq6SVJt7n7j5vdTx818mLi8ITKs2XVvKZ+61dpZ0n7r9/f67KQQhv1UYdqz3P3ZyUFfgMgrxp3F9a8tvo1YY0ocdYH0KHybLmt60CnCGqgQzWvtXUd6BRBDYQQ9N7CfusPvLfZdaBTBDXQwsThCd168FZVF6ty+ep7C3cVdgXeX9pZ6m6ByDyCGthAZa6ie2fuXXeI0tLJJR197aj2FPesPkH3W7/2FPewkIjIccwpsIHCdEHVxWrgZybT6btPd7kiZNWm2/OAvKjMVc44rL9ZSEuccofuIaiRe/Vwri5WZbLVaY7Gr9cyGafcoWsIauRa4znRjaHs8nVhbTLdXryd7d/oGhYTkWthzol2+RlndHzhxi+wYIiu4okauRbmnGjeYYhe44kaudZqQXBwYJC5aPQcQY1cCzon2mSSxFGkSAymPpBr9RBe25I3NTZFOCNR2PACAAmw0YYXpj4AIOEIagBIOIIaABKOoAaAhCOoASDhCGqkUtAbV4Csoo8aqVKZq2jvY3t14s0Tq9fqb1yRRP8zMoknaqRG/aS7tSFdt3RySZNHJntQFRA/ghqp0eqkuzAHLAFpRFAjNVoFMW9cQVYR1EisxgXD87ad1/ReTrlDlrGYiERqfPNKdbGqgb4Bbe3fqrdqb51x7/Zt23XPtfewkIjMIqiRSEHz0SdPn9T2bdt19tazOekOuUJQIxHCvv37tTdf0/E7j3e5OqC3CGr0TCdv/2bBEHlEUKMnOnn7NwuGyCu6PtATnbz9m9diIa9CPVGb2TFJ/yupJulUs7cQAGHx9m8gvHamPj7g7qziIBIbLRhKTHMAazH1gZ7g7d9AeGGfqF3SV8zMJf21u5cbbzCzkqSSJI2MsDKPjfH2byC8UG8hN7N3u/srZvYOSV+V9HF3f7LZ/byFPN8ae6IJYKC1Tb+F3N1fWfnPVyU9KunK6MpDltTb7qqLVbl89axoDvYHOtcyqM3sLDN7e/3Xkn5L0vNxF4Z0Cmq746xoYHPCzFG/U9KjZla//2/d/fFYq0JqNWu746xooHMtg9rdX5J0WRdqQQY0a7tj6zfQOdrzEKmgtjt6ooHNIagRqfFLx1XeXWbrNxChUO157aI9DwDas+n2PABA7xDUAJBwBDUAJBxBDQAJ
R1ADQMIR1ACQcAQ1ACQcQQ0ACUdQA0DCEdQAkHAENQAkHEENAAlHUANAwhHUAJBwBDUAJBxBDQAJR1CnXGWuosJ0QX37+lSYLqgyV+l1SQAiFuYt5EioylxFpUMlLZ1ckiRVF6sqHSpJEq++AjKEJ+oUmzwyuRrSdUsnlzR5ZLJHFQGIA0GdYvOL821dB5BOBHWKjQyNtHUdQDoR1Ck2NTalwYHBM64NDgxqamyqRxUBiANBnWLjl46rvLus0aFRmUyjQ6Mq7y6zkAhkjLl75N+0WCz6zMxM5N8XALLKzGbdvRj0GU/UAJBwBDUAJBxBDQAJR1ADQMIR1ACQcKGD2sz6zewZM/tynAUBAM7UzhP1XkkvxlUIACBYqKA2swskXS/pvnjLAQA0CvtEPS3pTkmn4ysFABCkZVCb2e9IetXdZ1vcVzKzGTObWVhYiKxAAMi7ME/UvybpQ2Z2TNLDkq42swcbb3L3srsX3b04PDwccZkAkF8tg9rdP+nuF7h7QdLNkr7m7rfEXhkAQBJ91ACQeG29M9Hdn5D0RCyVAAAC8UQNAAlHUANAwhHUAJBwBHUIlbmKCtMF9e3rU2G6oMpcpdclAciRthYT86gyV1HpUElLJ5ckSdXFqkqHSpLEuwkBdAVP1C1MHplcDem6pZNLmjwy2aOKAOQNQd3C/OJ8W9cBIGoEdQsjQyNtXQeAqBHU2nixcGpsSoMDg2fcPzgwqKmxqW6XCSCnch/U9cXC6mJVLl9dLKyH9fil4yrvLmt0aFQm0+jQqMq7yywkAugac/fIv2mxWPSZmZnIv28cCtMFVRer666PDo3q2B3Hul8QgFwys1l3LwZ9lvsnahYLASRd7oOaxUIASZf7oGaxEEDS5T6oWSwEkHSZXkyszFU0eWRS84vzGhka0dTYFAEMIJE2WkzM7FkfnNEBICsyO/XBGR0AsiIzT9SN0xxBvdESbXcA0icTQR00zWEyudbPv9N2ByBtMjH1ETTN4XKZ7IxrtN0BSKNMBHWz6QyX03YHIPUyMfXRbE6a8zoAZEEmnqjZXQggyzIR1OwuBJBlmd6ZCABpwTGnAJBiBDUAJBxBDQAJl5ignjg8oS2f2SLbZ9rymS2aODzR65IAIBES0Uc9cXhCB2YOrH5d89rq1/uv39+rsgAgERLxRF2eLbd1HQDyJBFBXfNaW9cBIE9aBrWZvc3MvmFm3zKzF8xsX9RF9Ft/W9cBIE/CPFH/n6Sr3f0ySZdLusbMroqyiNLOUlvXASBPWi4m+vLWxddXvhxY+SfS7Yz1BcPybFk1r6nf+lXaWWIhEQAUcgu5mfVLmpV0saS/cve7Au4pSSpJ0sjIyM5qNfgNKwCA9Ta9hdzda+5+uaQLJF1pZu8JuKfs7kV3Lw4PD2+qYADAT7TV9eHu/y3pCUnXxFEMAGC9MF0fw2Z2zsqvt0n6oKTvxFwXAGBFmJ2J75L0wMo8dZ+kv3f3L8dbFgCgLkzXx3OS3tuFWgAAAWJ5cYCZLUhq1faxQ9LxyH94OuR57FK+x5/nsUv5Hn+rsY+6e2AnRixBHYaZzTRrRcm6PI9dyvf48zx2Kd/j38zYE3HWBwCgOYIaABKul0Gd5zNM8zx2Kd/jz/PYpXyPv+Ox92yOGgAQDlMfAJBwBDUAJFysQW1m15jZd83sqJl9IuBzM7O/WPn8OTO7Is56ui3E+MdXxv2cmX3dzC7rRZ1xaDX2Nff9spnVzOymbtYXtzDjN7NdZvbsygs5/qXbNcYlxJ/7ITM7tOZlJLf1os44mNnnzOxVM3u+yeedZZ67x/KPpH5J35N0kaStkr4l6Rcb7rlO0mOSTNJVkp6Oq55u/xNy/O+TdO7Kr6/NyvjDjH3NfV+T9E+Sbup13V3+vT9H0rcljax8/Y5e193FsX9K0p+t/HpY0muStva69ojG/35JV0h6vsnnHWVenE/UV0o66u4vuftbkh6WdEPDPTdI+htf9pSkc8zsXTHW1E0tx+/uX3f3
H698+ZSWj5HNgjC/95L0cUn/IOnVbhbXBWHG/3uSDrr7vCS5e1b+Nwgzdpf0djMzSWdrOahPdbfMeLj7k1oeTzMdZV6cQX2+pB+s+frllWvt3pNW7Y7tD7T8N20WtBy7mZ0v6SOS7u1iXd0S5vf+5ySda2ZPmNmsmX20a9XFK8zY/1LSL0h6RdKcpL3ufro75fVcR5kX5vS8TlnAtcZewDD3pFXosZnZB7Qc1L8ea0XdE2bs05Lucvfa8oNVpoQZ/xZJOyWNSdom6d/M7Cl3/4+4i4tZmLH/tqRnJV0t6WckfdXM/tXd/yfm2pKgo8yLM6hflvTTa76+QMt/g7Z7T1qFGpuZ/ZKk+yRd6+4nulRb3MKMvSjp4ZWQ3iHpOjM75e5f7EqF8Qr7Z/+4u78h6Q0ze1LSZZLSHtRhxn6bpD/15Unbo2b2fUk/L+kb3SmxpzrKvDinPv5d0s+a2YVmtlXSzZK+1HDPlyR9dGUl9CpJi+7+oxhr6qaW4zezEUkHJd2agSeptVqO3d0vdPeCuxckPSJpIiMhLYX7s/+Pkn7DzLaY2aCkX5H0YpfrjEOYsc9r+f9JyMzeKekSSS91tcre6SjzYnuidvdTZvaHkv5ZyyvBn3P3F8zs9pXP79Xyav91ko5KWtLy37SZEHL8n5a0XdL+lSfLU56Bk8VCjj2zwozf3V80s8clPSfptKT73D2wpStNQv7e/4mkz5vZnJanAu5y90wcfWpmD0naJWmHmb0s6W5JA9LmMo8t5ACQcOxMBICEI6gBIOEIagBIOIIaABKOoAaAhCOoASDhCGoASLj/B1pl4rNWXTGyAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
     "np.random.seed(0)  # Fix the random seed so that repeated runs produce the same sequence of random numbers.\n",
     "\n",
     "w, b = 5, 3\n",
     "x = np.random.rand(40)\n",
     "y = w*x + b\n",
     "plt.plot(x, y, 'o', color='green')\n",
    "print(x)\n",
    "print(y)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "25049db1",
   "metadata": {},
   "source": [
     "Suppose we do not know w and b to begin with, and only have all these (x, y) pairs. How can we determine w and b?\n",
     "\n",
     "First, we pick an arbitrary (w, b) as the initial values. Using the line formula, each x gives a predicted y under this random (w, b); call it predict_y. We then compute the difference between predict_y and the actual y: the smaller the difference, the closer our assumed (w, b) is to the true values.\n",
     "\n",
     "Question 1: how do we measure the difference between predict_y and the actual y? Two common choices:\n",
     "abs(predict_y - y), the mean absolute error (MAE), and (predict_y - y)**2, the mean squared error (MSE) loss.\n",
     "![avatar](images/mse.png)\n",
     "\n",
     "Question 2: how do we adjust the values of w and b?\n",
     "Derivative theory (to be filled in).\n",
     "\n",
     "For the squared loss ((w*x + b) - y)**2 / 2, differentiating gives: the partial derivative with respect to w is (w*x + b - y)*x, and the partial derivative with respect to b is (w*x + b - y)*1."
   ]
  },
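   {
    "cell_type": "markdown",
    "id": "b3e9f1a0",
    "metadata": {},
    "source": [
     "To make the gradient formulas above concrete, here is a hand-computed single update step for one data point. The starting guess w=1.0, b=0.0 and the point (2.0, 13.0) (which lies on the true line y = 5*x + 3) are chosen purely for illustration:\n",
     "\n",
     "```python\n",
     "x0, y0 = 2.0, 13.0             # one point from the line y = 5*x + 3\n",
     "w, b = 1.0, 0.0                # arbitrary initial guess\n",
     "\n",
     "residual = (w * x0 + b) - y0   # predict_y - y = -11.0\n",
     "grad_w = residual * x0         # (w*x0 + b - y0) * x0 = -22.0\n",
     "grad_b = residual              # (w*x0 + b - y0) * 1  = -11.0\n",
     "\n",
     "learning_rate = 0.05\n",
     "w = w - learning_rate * grad_w  # 1.0 - 0.05 * (-22.0) = 2.1\n",
     "b = b - learning_rate * grad_b  # 0.0 - 0.05 * (-11.0) = 0.55\n",
     "```\n",
     "\n",
     "The negative residual pushes w and b upward, toward the true values 5 and 3."
    ]
   },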
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "8a12a867",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "w1,b1: 8 1\n",
      "(40,)\n",
      "w1:7.997,b1:1.014,error:14.721\n"
     ]
    }
   ],
   "source": [
    "learning_rate = 0.05\n",
    "\n",
    "w1,b1 = np.random.randint(0,10,2)\n",
    "print(\"w1,b1:\", w1,b1)\n",
    "\n",
     "def forward(x):\n",
     "    '''The model: predict y for a given x.'''\n",
     "    return w1*x + b1\n",
     "\n",
     "def loss(x, y):\n",
     "    '''Squared-error loss, summed over all points and halved.'''\n",
     "    return np.sum((forward(x) - y)**2)/2\n",
     "\n",
     "def back(w1, b1, x, y):\n",
     "    \"\"\"One gradient-descent update of the w, b parameters.\"\"\"\n",
     "    predict_y = forward(x)\n",
     "    gradient = predict_y - y\n",
     "    tmp_w = w1 - learning_rate * np.sum(gradient * x) / x.shape[0]\n",
     "    tmp_b = b1 - learning_rate * np.mean(gradient)  # sum/count is equivalent to mean\n",
    "    return tmp_w,tmp_b\n",
    "\n",
    "print(x.shape)\n",
    "w1,b1 = back(w1,b1,x,y)\n",
    "error = loss(x,y) \n",
    "print(\"w1:%.3f,b1:%.3f,error:%.3f\"%(w1,b1,error))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "id": "382f6c9c",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "w1,b1: 15.0 30.0\n",
      "w1:13.826,b1:28.035,error:18245.026\n",
      "w1:12.749,b1:26.229,error:15414.940\n",
      "w1:11.760,b1:24.568,error:13024.571\n",
      "w1:10.852,b1:23.041,error:11005.594\n",
      "w1:10.019,b1:21.636,error:9300.301\n",
      "w1:9.255,b1:20.345,error:7859.950\n",
      "w1:8.553,b1:19.158,error:6643.375\n",
      "w1:7.910,b1:18.066,error:5615.804\n",
      "w1:7.320,b1:17.061,error:4747.870\n",
      "w1:6.779,b1:16.138,error:4014.768\n",
      "w1:6.282,b1:15.288,error:3395.548\n",
      "w1:5.828,b1:14.507,error:2872.513\n",
      "w1:5.411,b1:13.788,error:2430.720\n",
      "w1:5.029,b1:13.126,error:2057.543\n",
      "w1:4.679,b1:12.518,error:1742.323\n",
      "w1:4.358,b1:11.958,error:1476.053\n",
      "w1:4.065,b1:11.442,error:1251.128\n",
      "w1:3.797,b1:10.968,error:1061.123\n",
      "w1:3.551,b1:10.532,error:900.612\n",
      "w1:3.327,b1:10.130,error:765.014\n",
      "w1:3.121,b1:9.760,error:650.457\n",
      "w1:2.934,b1:9.419,error:553.671\n",
      "w1:2.763,b1:9.105,error:471.897\n",
      "w1:2.606,b1:8.816,error:402.800\n",
      "w1:2.464,b1:8.549,error:344.413\n",
      "w1:2.334,b1:8.304,error:295.070\n",
      "w1:2.216,b1:8.077,error:253.366\n",
      "w1:2.109,b1:7.869,error:218.116\n",
      "w1:2.011,b1:7.676,error:188.315\n",
      "w1:1.922,b1:7.499,error:163.119\n",
      "w1:1.842,b1:7.335,error:141.810\n",
      "w1:1.769,b1:7.184,error:123.786\n",
      "w1:1.703,b1:7.044,error:108.536\n",
      "w1:1.644,b1:6.915,error:95.630\n",
      "w1:1.590,b1:6.796,error:84.703\n",
      "w1:1.542,b1:6.686,error:75.447\n",
      "w1:1.499,b1:6.584,error:67.605\n",
      "w1:1.460,b1:6.490,error:60.955\n",
      "w1:1.426,b1:6.402,error:55.313\n",
      "w1:1.395,b1:6.322,error:50.523\n",
      "w1:1.368,b1:6.247,error:46.452\n",
      "w1:1.344,b1:6.177,error:42.988\n",
      "w1:1.323,b1:6.112,error:40.038\n",
      "w1:1.305,b1:6.052,error:37.522\n",
      "w1:1.290,b1:5.997,error:35.372\n",
      "w1:1.276,b1:5.945,error:33.533\n",
      "w1:1.265,b1:5.897,error:31.954\n",
      "w1:1.256,b1:5.852,error:30.598\n",
      "w1:1.249,b1:5.810,error:29.428\n",
      "w1:1.243,b1:5.770,error:28.416\n",
      "w1:1.238,b1:5.734,error:27.539\n",
      "w1:1.235,b1:5.699,error:26.774\n",
      "w1:1.234,b1:5.667,error:26.105\n",
      "w1:1.233,b1:5.637,error:25.517\n",
      "w1:1.234,b1:5.609,error:24.997\n",
      "w1:1.235,b1:5.582,error:24.536\n",
      "w1:1.237,b1:5.557,error:24.123\n",
      "w1:1.241,b1:5.533,error:23.752\n",
      "w1:1.244,b1:5.511,error:23.417\n",
      "w1:1.249,b1:5.490,error:23.112\n",
      "w1:1.254,b1:5.470,error:22.832\n",
      "w1:1.260,b1:5.451,error:22.574\n",
      "w1:1.266,b1:5.433,error:22.334\n",
      "w1:1.273,b1:5.415,error:22.110\n",
      "w1:1.280,b1:5.399,error:21.899\n",
      "w1:1.288,b1:5.383,error:21.700\n",
      "w1:1.296,b1:5.368,error:21.511\n",
      "w1:1.304,b1:5.354,error:21.330\n",
      "w1:1.312,b1:5.340,error:21.157\n",
      "w1:1.321,b1:5.327,error:20.989\n",
      "w1:1.330,b1:5.314,error:20.828\n",
      "w1:1.339,b1:5.302,error:20.671\n",
      "w1:1.349,b1:5.290,error:20.518\n",
      "w1:1.359,b1:5.278,error:20.368\n",
      "w1:1.368,b1:5.267,error:20.222\n",
      "w1:1.378,b1:5.257,error:20.079\n",
      "w1:1.388,b1:5.246,error:19.938\n",
      "w1:1.399,b1:5.236,error:19.799\n",
      "w1:1.409,b1:5.226,error:19.663\n",
      "w1:1.419,b1:5.216,error:19.528\n",
      "w1:1.430,b1:5.207,error:19.395\n",
      "w1:1.440,b1:5.197,error:19.264\n",
      "w1:1.451,b1:5.188,error:19.134\n",
      "w1:1.462,b1:5.179,error:19.005\n",
      "w1:1.472,b1:5.171,error:18.878\n",
      "w1:1.483,b1:5.162,error:18.751\n",
      "w1:1.494,b1:5.154,error:18.626\n",
      "w1:1.505,b1:5.145,error:18.502\n",
      "w1:1.516,b1:5.137,error:18.379\n",
      "w1:1.526,b1:5.129,error:18.257\n",
      "w1:1.537,b1:5.121,error:18.136\n",
      "w1:1.548,b1:5.113,error:18.016\n",
      "w1:1.559,b1:5.105,error:17.897\n",
      "w1:1.570,b1:5.098,error:17.778\n",
      "w1:1.581,b1:5.090,error:17.661\n",
      "w1:1.592,b1:5.083,error:17.544\n",
      "w1:1.603,b1:5.075,error:17.428\n",
      "w1:1.614,b1:5.068,error:17.313\n",
      "w1:1.625,b1:5.060,error:17.199\n",
      "w1:1.635,b1:5.053,error:17.086\n",
      "w1:1.646,b1:5.046,error:16.973\n",
      "w1:1.657,b1:5.039,error:16.861\n",
      "w1:1.668,b1:5.032,error:16.750\n",
      "w1:1.679,b1:5.025,error:16.639\n",
      "w1:1.690,b1:5.018,error:16.530\n",
      "w1:1.700,b1:5.011,error:16.421\n",
      "w1:1.711,b1:5.004,error:16.313\n",
      "w1:1.722,b1:4.997,error:16.205\n",
      "w1:1.732,b1:4.990,error:16.098\n",
      "w1:1.743,b1:4.984,error:15.992\n",
      "w1:1.754,b1:4.977,error:15.887\n",
      "w1:1.764,b1:4.970,error:15.782\n",
      "w1:1.775,b1:4.964,error:15.678\n",
      "w1:1.785,b1:4.957,error:15.575\n",
      "w1:1.796,b1:4.950,error:15.472\n",
      "w1:1.807,b1:4.944,error:15.370\n",
      "w1:1.817,b1:4.937,error:15.269\n",
      "w1:1.827,b1:4.931,error:15.168\n",
      "w1:1.838,b1:4.924,error:15.069\n",
      "w1:1.848,b1:4.918,error:14.969\n",
      "w1:1.859,b1:4.912,error:14.871\n",
      "w1:1.869,b1:4.905,error:14.773\n",
      "w1:1.879,b1:4.899,error:14.675\n",
      "w1:1.889,b1:4.893,error:14.579\n",
      "w1:1.900,b1:4.886,error:14.483\n",
      "w1:1.910,b1:4.880,error:14.387\n",
      "w1:1.920,b1:4.874,error:14.293\n",
      "w1:1.930,b1:4.868,error:14.198\n",
      "w1:1.940,b1:4.861,error:14.105\n",
      "w1:1.950,b1:4.855,error:14.012\n",
      "w1:1.960,b1:4.849,error:13.920\n",
      "w1:1.970,b1:4.843,error:13.828\n",
      "w1:1.980,b1:4.837,error:13.737\n",
      "w1:1.990,b1:4.831,error:13.646\n",
      "w1:2.000,b1:4.825,error:13.556\n",
      "w1:2.010,b1:4.819,error:13.467\n",
      "w1:2.020,b1:4.813,error:13.378\n",
      "w1:2.030,b1:4.807,error:13.290\n",
      "w1:2.040,b1:4.801,error:13.203\n",
      "w1:2.049,b1:4.795,error:13.116\n",
      "w1:2.059,b1:4.789,error:13.029\n",
      "w1:2.069,b1:4.783,error:12.944\n",
      "w1:2.078,b1:4.777,error:12.858\n",
      "w1:2.088,b1:4.771,error:12.774\n",
      "w1:2.098,b1:4.765,error:12.690\n",
      "w1:2.107,b1:4.759,error:12.606\n",
      "w1:2.117,b1:4.754,error:12.523\n",
      "w1:2.126,b1:4.748,error:12.440\n",
      "w1:2.136,b1:4.742,error:12.358\n",
      "w1:2.145,b1:4.736,error:12.277\n",
      "w1:2.155,b1:4.730,error:12.196\n",
      "w1:2.164,b1:4.725,error:12.116\n",
      "w1:2.173,b1:4.719,error:12.036\n",
      "w1:2.183,b1:4.713,error:11.957\n",
      "w1:2.192,b1:4.708,error:11.878\n",
      "w1:2.201,b1:4.702,error:11.800\n",
      "w1:2.210,b1:4.696,error:11.722\n",
      "w1:2.220,b1:4.691,error:11.645\n",
      "w1:2.229,b1:4.685,error:11.568\n",
      "w1:2.238,b1:4.680,error:11.492\n",
      "w1:2.247,b1:4.674,error:11.416\n",
      "w1:2.256,b1:4.669,error:11.341\n",
      "w1:2.265,b1:4.663,error:11.266\n",
      "w1:2.274,b1:4.658,error:11.192\n",
      "w1:2.283,b1:4.652,error:11.118\n",
      "w1:2.292,b1:4.647,error:11.045\n",
      "w1:2.301,b1:4.641,error:10.972\n",
      "w1:2.310,b1:4.636,error:10.900\n",
      "w1:2.319,b1:4.631,error:10.828\n",
      "w1:2.328,b1:4.625,error:10.757\n",
      "w1:2.337,b1:4.620,error:10.686\n",
      "w1:2.345,b1:4.614,error:10.616\n",
      "w1:2.354,b1:4.609,error:10.546\n",
      "w1:2.363,b1:4.604,error:10.476\n",
      "w1:2.372,b1:4.598,error:10.407\n",
      "w1:2.380,b1:4.593,error:10.339\n",
      "w1:2.389,b1:4.588,error:10.271\n",
      "w1:2.397,b1:4.583,error:10.203\n",
      "w1:2.406,b1:4.577,error:10.136\n",
      "w1:2.415,b1:4.572,error:10.069\n",
      "w1:2.423,b1:4.567,error:10.003\n",
      "w1:2.432,b1:4.562,error:9.937\n",
      "w1:2.440,b1:4.557,error:9.871\n",
      "w1:2.449,b1:4.552,error:9.806\n",
      "w1:2.457,b1:4.547,error:9.742\n",
      "w1:2.465,b1:4.541,error:9.678\n",
      "w1:2.474,b1:4.536,error:9.614\n",
      "w1:2.482,b1:4.531,error:9.550\n",
      "w1:2.490,b1:4.526,error:9.488\n",
      "w1:2.499,b1:4.521,error:9.425\n",
      "w1:2.507,b1:4.516,error:9.363\n",
      "w1:2.515,b1:4.511,error:9.301\n",
      "w1:2.523,b1:4.506,error:9.240\n",
      "w1:2.531,b1:4.501,error:9.179\n",
      "w1:2.540,b1:4.496,error:9.119\n",
      "w1:2.548,b1:4.491,error:9.059\n",
      "w1:2.556,b1:4.486,error:8.999\n",
      "w1:2.564,b1:4.481,error:8.940\n",
      "w1:2.572,b1:4.477,error:8.881\n",
      "w1:2.580,b1:4.472,error:8.822\n",
      "w1:2.588,b1:4.467,error:8.764\n",
      "w1:2.596,b1:4.462,error:8.706\n",
      "w1:2.604,b1:4.457,error:8.649\n",
      "w1:2.612,b1:4.452,error:8.592\n",
      "w1:2.620,b1:4.448,error:8.536\n",
      "w1:2.627,b1:4.443,error:8.479\n",
      "w1:2.635,b1:4.438,error:8.423\n",
      "w1:2.643,b1:4.433,error:8.368\n",
      "w1:2.651,b1:4.429,error:8.313\n",
      "w1:2.659,b1:4.424,error:8.258\n",
      "w1:2.666,b1:4.419,error:8.204\n",
      "w1:2.674,b1:4.415,error:8.150\n",
      "w1:2.682,b1:4.410,error:8.096\n",
      "w1:2.689,b1:4.405,error:8.043\n",
      "w1:2.697,b1:4.401,error:7.990\n",
      "w1:2.705,b1:4.396,error:7.937\n",
      "w1:2.712,b1:4.391,error:7.885\n",
      "w1:2.720,b1:4.387,error:7.833\n",
      "w1:2.727,b1:4.382,error:7.781\n",
      "w1:2.735,b1:4.378,error:7.730\n",
      "w1:2.742,b1:4.373,error:7.679\n",
      "w1:2.750,b1:4.369,error:7.628\n",
      "w1:2.757,b1:4.364,error:7.578\n",
      "w1:2.764,b1:4.360,error:7.528\n",
      "w1:2.772,b1:4.355,error:7.479\n",
      "w1:2.779,b1:4.351,error:7.429\n",
      "w1:2.786,b1:4.346,error:7.381\n",
      "w1:2.794,b1:4.342,error:7.332\n",
      "w1:2.801,b1:4.337,error:7.284\n",
      "w1:2.808,b1:4.333,error:7.236\n",
      "w1:2.816,b1:4.328,error:7.188\n",
      "w1:2.823,b1:4.324,error:7.141\n",
      "w1:2.830,b1:4.320,error:7.094\n",
      "w1:2.837,b1:4.315,error:7.047\n",
      "w1:2.844,b1:4.311,error:7.000\n",
      "w1:2.851,b1:4.307,error:6.954\n",
      "w1:2.858,b1:4.302,error:6.909\n",
      "w1:2.866,b1:4.298,error:6.863\n",
      "w1:2.873,b1:4.294,error:6.818\n",
      "w1:2.880,b1:4.290,error:6.773\n",
      "w1:2.887,b1:4.285,error:6.728\n",
      "w1:2.894,b1:4.281,error:6.684\n",
      "w1:2.900,b1:4.277,error:6.640\n",
      "w1:2.907,b1:4.273,error:6.596\n",
      "w1:2.914,b1:4.268,error:6.553\n",
      "w1:2.921,b1:4.264,error:6.510\n",
      "w1:2.928,b1:4.260,error:6.467\n",
      "w1:2.935,b1:4.256,error:6.424\n",
      "w1:2.942,b1:4.252,error:6.382\n",
      "w1:2.948,b1:4.248,error:6.340\n",
      "w1:2.955,b1:4.243,error:6.298\n",
      "w1:2.962,b1:4.239,error:6.257\n",
      "w1:2.969,b1:4.235,error:6.215\n",
      "w1:2.975,b1:4.231,error:6.174\n",
      "w1:2.982,b1:4.227,error:6.134\n",
      "w1:2.989,b1:4.223,error:6.093\n",
      "w1:2.995,b1:4.219,error:6.053\n",
      "w1:3.002,b1:4.215,error:6.013\n",
      "w1:3.009,b1:4.211,error:5.974\n",
      "w1:3.015,b1:4.207,error:5.934\n",
      "w1:3.022,b1:4.203,error:5.895\n",
      "w1:3.028,b1:4.199,error:5.856\n",
      "w1:3.035,b1:4.195,error:5.818\n",
      "w1:3.041,b1:4.191,error:5.779\n",
      "w1:3.048,b1:4.187,error:5.741\n",
      "w1:3.054,b1:4.183,error:5.704\n",
      "w1:3.061,b1:4.179,error:5.666\n",
      "w1:3.067,b1:4.176,error:5.629\n",
      "w1:3.073,b1:4.172,error:5.592\n",
      "w1:3.080,b1:4.168,error:5.555\n",
      "w1:3.086,b1:4.164,error:5.518\n",
      "w1:3.092,b1:4.160,error:5.482\n",
      "w1:3.099,b1:4.156,error:5.446\n",
      "w1:3.105,b1:4.152,error:5.410\n",
      "w1:3.111,b1:4.149,error:5.374\n",
      "w1:3.117,b1:4.145,error:5.339\n",
      "w1:3.124,b1:4.141,error:5.304\n",
      "w1:3.130,b1:4.137,error:5.269\n",
      "w1:3.136,b1:4.134,error:5.234\n",
      "w1:3.142,b1:4.130,error:5.200\n",
      "w1:3.148,b1:4.126,error:5.165\n",
      "w1:3.154,b1:4.122,error:5.131\n",
      "w1:3.160,b1:4.119,error:5.097\n",
      "w1:3.167,b1:4.115,error:5.064\n",
      "w1:3.173,b1:4.111,error:5.031\n",
      "w1:3.179,b1:4.108,error:4.997\n",
      "w1:3.185,b1:4.104,error:4.964\n",
      "w1:3.191,b1:4.100,error:4.932\n",
      "w1:3.197,b1:4.097,error:4.899\n",
      "w1:3.203,b1:4.093,error:4.867\n",
      "w1:3.208,b1:4.090,error:4.835\n",
      "w1:3.214,b1:4.086,error:4.803\n",
      "w1:3.220,b1:4.082,error:4.771\n",
      "w1:3.226,b1:4.079,error:4.740\n",
      "w1:3.232,b1:4.075,error:4.709\n",
      "w1:3.238,b1:4.072,error:4.678\n",
      "w1:3.244,b1:4.068,error:4.647\n",
      "w1:3.249,b1:4.065,error:4.616\n",
      "w1:3.255,b1:4.061,error:4.586\n",
      "w1:3.261,b1:4.058,error:4.556\n",
      "w1:3.267,b1:4.054,error:4.526\n",
      "w1:3.272,b1:4.051,error:4.496\n",
      "w1:3.278,b1:4.047,error:4.466\n",
      "w1:3.284,b1:4.044,error:4.437\n",
      "w1:3.289,b1:4.040,error:4.408\n",
      "w1:3.295,b1:4.037,error:4.379\n",
      "w1:3.301,b1:4.033,error:4.350\n",
      "w1:3.306,b1:4.030,error:4.321\n",
      "w1:3.312,b1:4.027,error:4.293\n",
      "w1:3.317,b1:4.023,error:4.264\n",
      "w1:3.323,b1:4.020,error:4.236\n",
      "w1:3.329,b1:4.016,error:4.208\n",
      "w1:3.334,b1:4.013,error:4.181\n",
      "w1:3.340,b1:4.010,error:4.153\n",
      "w1:3.345,b1:4.006,error:4.126\n",
      "w1:3.350,b1:4.003,error:4.099\n",
      "w1:3.356,b1:4.000,error:4.072\n",
      "w1:3.361,b1:3.997,error:4.045\n",
      "w1:3.367,b1:3.993,error:4.018\n",
      "w1:3.372,b1:3.990,error:3.992\n",
      "w1:3.378,b1:3.987,error:3.965\n",
      "w1:3.383,b1:3.983,error:3.939\n",
      "w1:3.388,b1:3.980,error:3.913\n",
      "w1:3.394,b1:3.977,error:3.888\n",
      "w1:3.399,b1:3.974,error:3.862\n",
      "w1:3.404,b1:3.971,error:3.837\n",
      "w1:3.409,b1:3.967,error:3.811\n",
      "w1:3.415,b1:3.964,error:3.786\n",
      "w1:3.420,b1:3.961,error:3.761\n",
      "w1:3.425,b1:3.958,error:3.736\n",
      "w1:3.430,b1:3.955,error:3.712\n",
      "w1:3.435,b1:3.951,error:3.687\n",
      "w1:3.441,b1:3.948,error:3.663\n",
      "w1:3.446,b1:3.945,error:3.639\n",
      "w1:3.451,b1:3.942,error:3.615\n",
      "w1:3.456,b1:3.939,error:3.591\n",
      "w1:3.461,b1:3.936,error:3.568\n",
      "w1:3.466,b1:3.933,error:3.544\n",
      "w1:3.471,b1:3.930,error:3.521\n",
      "w1:3.476,b1:3.927,error:3.497\n",
      "w1:3.481,b1:3.924,error:3.474\n",
      "w1:3.486,b1:3.921,error:3.452\n",
      "w1:3.491,b1:3.918,error:3.429\n",
      "w1:3.496,b1:3.914,error:3.406\n",
      "w1:3.501,b1:3.911,error:3.384\n",
      "w1:3.506,b1:3.908,error:3.361\n",
      "w1:3.511,b1:3.905,error:3.339\n",
      "w1:3.516,b1:3.902,error:3.317\n",
      "w1:3.521,b1:3.899,error:3.296\n",
      "w1:3.526,b1:3.897,error:3.274\n",
      "w1:3.531,b1:3.894,error:3.252\n",
      "w1:3.535,b1:3.891,error:3.231\n",
      "w1:3.540,b1:3.888,error:3.210\n",
      "w1:3.545,b1:3.885,error:3.188\n",
      "w1:3.550,b1:3.882,error:3.167\n",
      "w1:3.555,b1:3.879,error:3.147\n",
      "w1:3.559,b1:3.876,error:3.126\n",
      "w1:3.564,b1:3.873,error:3.105\n",
      "w1:3.569,b1:3.870,error:3.085\n",
      "w1:3.574,b1:3.867,error:3.064\n",
      "w1:3.578,b1:3.865,error:3.044\n",
      "w1:3.583,b1:3.862,error:3.024\n",
      "w1:3.588,b1:3.859,error:3.004\n",
      "w1:3.592,b1:3.856,error:2.984\n",
      "w1:3.597,b1:3.853,error:2.965\n",
      "w1:3.602,b1:3.850,error:2.945\n",
      "w1:3.606,b1:3.848,error:2.926\n",
      "w1:3.611,b1:3.845,error:2.907\n",
      "w1:3.615,b1:3.842,error:2.887\n",
      "w1:3.620,b1:3.839,error:2.868\n",
      "w1:3.625,b1:3.836,error:2.850\n",
      "w1:3.629,b1:3.834,error:2.831\n",
      "w1:3.634,b1:3.831,error:2.812\n",
      "w1:3.638,b1:3.828,error:2.794\n",
      "w1:3.643,b1:3.825,error:2.775\n",
      "w1:3.647,b1:3.823,error:2.757\n",
      "w1:3.652,b1:3.820,error:2.739\n",
      "w1:3.656,b1:3.817,error:2.721\n",
      "w1:3.660,b1:3.815,error:2.703\n",
      "w1:3.665,b1:3.812,error:2.685\n",
      "w1:3.669,b1:3.809,error:2.667\n",
      "w1:3.674,b1:3.807,error:2.650\n",
      "w1:3.678,b1:3.804,error:2.632\n",
      "w1:3.682,b1:3.801,error:2.615\n",
      "w1:3.687,b1:3.799,error:2.598\n",
      "w1:3.691,b1:3.796,error:2.581\n",
      "w1:3.695,b1:3.793,error:2.564\n",
      "w1:3.700,b1:3.791,error:2.547\n",
      "w1:3.704,b1:3.788,error:2.530\n",
      "w1:3.708,b1:3.786,error:2.513\n",
      "w1:3.713,b1:3.783,error:2.497\n",
      "w1:3.717,b1:3.780,error:2.480\n",
      "w1:3.721,b1:3.778,error:2.464\n",
      "w1:3.725,b1:3.775,error:2.448\n",
      "w1:3.729,b1:3.773,error:2.432\n",
      "w1:3.734,b1:3.770,error:2.416\n",
      "w1:3.738,b1:3.768,error:2.400\n",
      "w1:3.742,b1:3.765,error:2.384\n",
      "w1:3.746,b1:3.763,error:2.368\n",
      "w1:3.750,b1:3.760,error:2.353\n",
      "w1:3.754,b1:3.757,error:2.337\n",
      "w1:3.759,b1:3.755,error:2.322\n",
      "w1:3.763,b1:3.752,error:2.306\n",
      "w1:3.767,b1:3.750,error:2.291\n",
      "w1:3.771,b1:3.748,error:2.276\n",
      "w1:3.775,b1:3.745,error:2.261\n",
      "w1:3.779,b1:3.743,error:2.246\n",
      "w1:3.783,b1:3.740,error:2.231\n",
      "w1:3.787,b1:3.738,error:2.217\n",
      "w1:3.791,b1:3.735,error:2.202\n",
      "w1:3.795,b1:3.733,error:2.188\n",
      "w1:3.799,b1:3.730,error:2.173\n",
      "w1:3.803,b1:3.728,error:2.159\n",
      "w1:3.807,b1:3.726,error:2.145\n",
      "w1:3.811,b1:3.723,error:2.131\n",
      "w1:3.815,b1:3.721,error:2.116\n",
      "w1:3.819,b1:3.718,error:2.103\n",
      "w1:3.822,b1:3.716,error:2.089\n",
      "w1:3.826,b1:3.714,error:2.075\n",
      "w1:3.830,b1:3.711,error:2.061\n",
      "w1:3.834,b1:3.709,error:2.048\n",
      "w1:3.838,b1:3.707,error:2.034\n",
      "w1:3.842,b1:3.704,error:2.021\n",
      "w1:3.846,b1:3.702,error:2.008\n",
      "w1:3.849,b1:3.700,error:1.994\n",
      "w1:3.853,b1:3.697,error:1.981\n",
      "w1:3.857,b1:3.695,error:1.968\n",
      "w1:3.861,b1:3.693,error:1.955\n",
      "w1:3.864,b1:3.691,error:1.942\n",
      "w1:3.868,b1:3.688,error:1.929\n",
      "w1:3.872,b1:3.686,error:1.917\n",
      "w1:3.876,b1:3.684,error:1.904\n",
      "w1:3.879,b1:3.681,error:1.892\n",
      "w1:3.883,b1:3.679,error:1.879\n",
      "w1:3.887,b1:3.677,error:1.867\n",
      "w1:3.890,b1:3.675,error:1.854\n",
      "w1:3.894,b1:3.673,error:1.842\n",
      "w1:3.898,b1:3.670,error:1.830\n",
      "w1:3.901,b1:3.668,error:1.818\n",
      "w1:3.905,b1:3.666,error:1.806\n",
      "w1:3.909,b1:3.664,error:1.794\n",
      "w1:3.912,b1:3.662,error:1.782\n",
      "w1:3.916,b1:3.659,error:1.771\n",
      "w1:3.919,b1:3.657,error:1.759\n",
      "w1:3.923,b1:3.655,error:1.747\n",
      "w1:3.927,b1:3.653,error:1.736\n",
      "w1:3.930,b1:3.651,error:1.724\n",
      "w1:3.934,b1:3.649,error:1.713\n",
      "w1:3.937,b1:3.646,error:1.702\n",
      "w1:3.941,b1:3.644,error:1.691\n",
      "w1:3.944,b1:3.642,error:1.679\n",
      "w1:3.948,b1:3.640,error:1.668\n",
      "w1:3.951,b1:3.638,error:1.657\n",
      "w1:3.955,b1:3.636,error:1.646\n",
      "w1:3.958,b1:3.634,error:1.636\n",
      "w1:3.961,b1:3.632,error:1.625\n",
      "w1:3.965,b1:3.630,error:1.614\n",
      "w1:3.968,b1:3.627,error:1.604\n",
      "w1:3.972,b1:3.625,error:1.593\n",
      "w1:3.975,b1:3.623,error:1.582\n",
      "w1:3.978,b1:3.621,error:1.572\n",
      "w1:3.982,b1:3.619,error:1.562\n",
      "w1:3.985,b1:3.617,error:1.551\n",
      "w1:3.989,b1:3.615,error:1.541\n",
      "w1:3.992,b1:3.613,error:1.531\n",
      "w1:3.995,b1:3.611,error:1.521\n",
      "w1:3.998,b1:3.609,error:1.511\n",
      "w1:4.002,b1:3.607,error:1.501\n",
      "w1:4.005,b1:3.605,error:1.491\n",
      "w1:4.008,b1:3.603,error:1.481\n",
      "w1:4.012,b1:3.601,error:1.471\n",
      "w1:4.015,b1:3.599,error:1.462\n",
      "w1:4.018,b1:3.597,error:1.452\n",
      "w1:4.021,b1:3.595,error:1.443\n",
      "w1:4.025,b1:3.593,error:1.433\n",
      "w1:4.028,b1:3.591,error:1.424\n",
      "w1:4.031,b1:3.589,error:1.414\n",
      "w1:4.034,b1:3.587,error:1.405\n",
      "w1:4.037,b1:3.585,error:1.396\n",
      "w1:4.041,b1:3.583,error:1.387\n",
      "w1:4.044,b1:3.582,error:1.377\n",
      "w1:4.047,b1:3.580,error:1.368\n",
      "w1:4.050,b1:3.578,error:1.359\n",
      "w1:4.053,b1:3.576,error:1.350\n",
      "w1:4.056,b1:3.574,error:1.341\n",
      "w1:4.059,b1:3.572,error:1.333\n",
      "w1:4.063,b1:3.570,error:1.324\n",
      "w1:4.066,b1:3.568,error:1.315\n",
      "w1:4.069,b1:3.566,error:1.306\n",
      "w1:4.072,b1:3.564,error:1.298\n",
      "w1:4.075,b1:3.563,error:1.289\n",
      "w1:4.078,b1:3.561,error:1.281\n",
      "w1:4.081,b1:3.559,error:1.272\n",
      "w1:4.084,b1:3.557,error:1.264\n",
      "w1:4.087,b1:3.555,error:1.256\n",
      "w1:4.090,b1:3.553,error:1.247\n",
      "w1:4.093,b1:3.552,error:1.239\n",
      "w1:4.096,b1:3.550,error:1.231\n",
      "w1:4.099,b1:3.548,error:1.223\n",
       "w1:4.102,b1:3.546,error:1.215\n"
     ]
    }
   ],
   "source": [
    "w1,b1 = 15.0,30.0\n",
    "learning_rate = 0.06\n",
    "print(\"w1,b1:\", w1,b1)\n",
    "\n",
    "for i in range(500):\n",
    "    w1,b1 = back(w1,b1,x,y) \n",
    "    error = loss(x,y)\n",
    "    print(\"w1:%.3f,b1:%.3f,error:%.3f\"%(w1,b1,error))\n",
    "        \n",
    "error = loss(x,y) \n",
    "print(\"w1:%.3f,b1:%.3f,error:%.3f\"%(w1,b1,error))\n",
    "    "
   ]
  },
  {
   "cell_type": "markdown",
   "id": "24c52f94",
   "metadata": {},
   "source": [
    "### 接下来，我们使用目前比较流行的深度学习框架pytorch来实现上面同样的逻辑"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "id": "ec075b65",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "(40,) (40,)\n",
      "tensor([[-0.4022],\n",
      "        [-0.3292],\n",
      "        [-0.3785],\n",
      "        [-0.4039],\n",
      "        [-0.4571],\n",
      "        [-0.3596],\n",
      "        [-0.4510],\n",
      "        [-0.2517],\n",
      "        [-0.2202],\n",
      "        [-0.4748],\n",
      "        [-0.2956],\n",
      "        [-0.4109],\n",
      "        [-0.3938],\n",
      "        [-0.2369],\n",
      "        [-0.6118],\n",
      "        [-0.6048],\n",
      "        [-0.6341],\n",
      "        [-0.2777],\n",
      "        [-0.3016],\n",
      "        [-0.2613],\n",
      "        [-0.2136],\n",
      "        [-0.2924],\n",
      "        [-0.4405],\n",
      "        [-0.3005],\n",
      "        [-0.5911],\n",
      "        [-0.3622],\n",
      "        [-0.5801],\n",
      "        [-0.2285],\n",
      "        [-0.4140],\n",
      "        [-0.4611],\n",
      "        [-0.5269],\n",
      "        [-0.3033],\n",
      "        [-0.4429],\n",
      "        [-0.3936],\n",
      "        [-0.6347],\n",
      "        [-0.3720],\n",
      "        [-0.3744],\n",
      "        [-0.3723],\n",
      "        [-0.2289],\n",
      "        [-0.3438]], grad_fn=<AddmmBackward>)\n"
     ]
    }
   ],
   "source": [
    "import torch\n",
    "import torch.nn as nn\n",
    "\n",
    "m = nn.Linear(1, 1)  # y = w*x + b 的封装，实际上这个可以实现n元线性方程，y=w0*1 + w1*x1 + w2*x2 + .. + wn*xn,用矩阵表示:Y=W*X\n",
    "# print(m.weight,m.bias) # 系统会随机给定weight和bias, 如果要每次执行都一样的初始值，可以设定 torch.manual_seed(0)\n",
    "\n",
    "print(x.shape,y.shape)\n",
    "# inX = torch.from_numpy(x).unsqueeze(1) #from_numpy不能修改数据类型？\n",
    "# outY = torch.from_numpy(y).unsqueeze(1)\n",
    "inX = torch.as_tensor(x,dtype=torch.float32).unsqueeze(1)\n",
    "outY = torch.as_tensor(y,dtype=torch.float32).unsqueeze(1)\n",
    "# print(inX,inX.shape)\n",
    "\n",
    "predict_Y = m(inX)\n",
    "print(predict_Y)\n",
    "\n",
    "# .reshape(-1,1) -> .unsqueeze(1)\n",
    "# inX = torch.as_tensor(x,dtype=torch.float32) #将numpy转成tensor\n",
    "# outY = torch.as_tensor(y,dtype=torch.float32).reshape(-1,1)\n",
    "# print(inX.shape)\n",
    "# print(inX)\n",
    "# inX = torch.reshape(inX, (-1,1)) #将shape从 [1,40]转成[40,1]\n",
    "# print(inX,inX.shape)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "id": "045919b2",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "epoch 0, loss 33.63819122314453\n",
      "epoch 1, loss 23.778167724609375\n",
      "epoch 2, loss 16.85102081298828\n",
      "epoch 3, loss 11.983793258666992\n",
      "epoch 4, loss 8.563364028930664\n",
      "epoch 5, loss 6.159113883972168\n",
      "epoch 6, loss 4.468597412109375\n",
      "epoch 7, loss 3.2793936729431152\n",
      "epoch 8, loss 2.4423093795776367\n",
      "epoch 9, loss 1.8525581359863281\n",
      "epoch 10, loss 1.4365441799163818\n",
      "epoch 11, loss 1.1425782442092896\n",
      "epoch 12, loss 0.9343550801277161\n",
      "epoch 13, loss 0.7863761186599731\n",
      "epoch 14, loss 0.6807320713996887\n",
      "epoch 15, loss 0.604844331741333\n",
      "epoch 16, loss 0.5498780012130737\n",
      "epoch 17, loss 0.5096278190612793\n",
      "epoch 18, loss 0.47973519563674927\n",
      "epoch 19, loss 0.4571380615234375\n",
      "epoch 20, loss 0.43968629837036133\n",
      "epoch 21, loss 0.42586904764175415\n",
      "epoch 22, loss 0.4146248400211334\n",
      "epoch 23, loss 0.4052082598209381\n",
      "epoch 24, loss 0.3970951437950134\n",
      "epoch 25, loss 0.3899172842502594\n",
      "epoch 26, loss 0.38341575860977173\n",
      "epoch 27, loss 0.3774084448814392\n",
      "epoch 28, loss 0.3717673718929291\n",
      "epoch 29, loss 0.3664019703865051\n",
      "epoch 30, loss 0.3612487316131592\n",
      "epoch 31, loss 0.35626277327537537\n",
      "epoch 32, loss 0.35141220688819885\n",
      "epoch 33, loss 0.3466746509075165\n",
      "epoch 34, loss 0.3420336842536926\n",
      "epoch 35, loss 0.33747804164886475\n",
      "epoch 36, loss 0.33299916982650757\n",
      "epoch 37, loss 0.3285912573337555\n",
      "epoch 38, loss 0.3242497742176056\n",
      "epoch 39, loss 0.3199711740016937\n",
      "epoch 40, loss 0.3157529830932617\n",
      "epoch 41, loss 0.3115931451320648\n",
      "epoch 42, loss 0.30749017000198364\n",
      "epoch 43, loss 0.3034425377845764\n",
      "epoch 44, loss 0.2994491457939148\n",
      "epoch 45, loss 0.2955089509487152\n",
      "epoch 46, loss 0.2916211485862732\n",
      "epoch 47, loss 0.28778472542762756\n",
      "epoch 48, loss 0.28399914503097534\n",
      "epoch 49, loss 0.28026336431503296\n",
      "epoch 50, loss 0.27657702565193176\n",
      "epoch 51, loss 0.27293911576271057\n",
      "epoch 52, loss 0.2693491578102112\n",
      "epoch 53, loss 0.2658063769340515\n",
      "epoch 54, loss 0.262310266494751\n",
      "epoch 55, loss 0.2588602602481842\n",
      "epoch 56, loss 0.2554556131362915\n",
      "epoch 57, loss 0.25209569931030273\n",
      "epoch 58, loss 0.24877996742725372\n",
      "epoch 59, loss 0.24550795555114746\n",
      "epoch 60, loss 0.24227885901927948\n",
      "epoch 61, loss 0.23909230530261993\n",
      "epoch 62, loss 0.2359476387500763\n",
      "epoch 63, loss 0.23284435272216797\n",
      "epoch 64, loss 0.229781836271286\n",
      "epoch 65, loss 0.2267596274614334\n",
      "epoch 66, loss 0.223777174949646\n",
      "epoch 67, loss 0.22083401679992676\n",
      "epoch 68, loss 0.21792955696582794\n",
      "epoch 69, loss 0.2150631844997406\n",
      "epoch 70, loss 0.21223458647727966\n",
      "epoch 71, loss 0.20944328606128693\n",
      "epoch 72, loss 0.20668859779834747\n",
      "epoch 73, loss 0.20397016406059265\n",
      "epoch 74, loss 0.2012874335050583\n",
      "epoch 75, loss 0.19864001870155334\n",
      "epoch 76, loss 0.19602744281291962\n",
      "epoch 77, loss 0.19344916939735413\n",
      "epoch 78, loss 0.1909049153327942\n",
      "epoch 79, loss 0.18839403986930847\n",
      "epoch 80, loss 0.1859162300825119\n",
      "epoch 81, loss 0.18347103893756866\n",
      "epoch 82, loss 0.1810579150915146\n",
      "epoch 83, loss 0.178676575422287\n",
      "epoch 84, loss 0.17632654309272766\n",
      "epoch 85, loss 0.17400740087032318\n",
      "epoch 86, loss 0.1717187762260437\n",
      "epoch 87, loss 0.169460266828537\n",
      "epoch 88, loss 0.16723138093948364\n",
      "epoch 89, loss 0.16503193974494934\n",
      "epoch 90, loss 0.16286133229732513\n",
      "epoch 91, loss 0.16071929037570953\n",
      "epoch 92, loss 0.15860548615455627\n",
      "epoch 93, loss 0.15651938319206238\n",
      "epoch 94, loss 0.15446080267429352\n",
      "epoch 95, loss 0.1524292528629303\n",
      "epoch 96, loss 0.15042439103126526\n",
      "epoch 97, loss 0.1484459638595581\n",
      "epoch 98, loss 0.1464935541152954\n",
      "epoch 99, loss 0.1445668488740921\n",
      "epoch 100, loss 0.14266541600227356\n",
      "epoch 101, loss 0.14078903198242188\n",
      "epoch 102, loss 0.13893729448318481\n",
      "epoch 103, loss 0.1371098905801773\n",
      "epoch 104, loss 0.13530655205249786\n",
      "epoch 105, loss 0.13352695107460022\n",
      "epoch 106, loss 0.1317707598209381\n",
      "epoch 107, loss 0.13003766536712646\n",
      "epoch 108, loss 0.12832733988761902\n",
      "epoch 109, loss 0.12663953006267548\n",
      "epoch 110, loss 0.12497389316558838\n",
      "epoch 111, loss 0.12333015352487564\n",
      "epoch 112, loss 0.12170803546905518\n",
      "epoch 113, loss 0.1201072558760643\n",
      "epoch 114, loss 0.11852757632732391\n",
      "epoch 115, loss 0.11696865409612656\n",
      "epoch 116, loss 0.11543023586273193\n",
      "epoch 117, loss 0.11391202360391617\n",
      "epoch 118, loss 0.11241377890110016\n",
      "epoch 119, loss 0.1109352707862854\n",
      "epoch 120, loss 0.10947616398334503\n",
      "epoch 121, loss 0.10803631693124771\n",
      "epoch 122, loss 0.10661538690328598\n",
      "epoch 123, loss 0.10521312803030014\n",
      "epoch 124, loss 0.10382933914661407\n",
      "epoch 125, loss 0.1024637222290039\n",
      "epoch 126, loss 0.10111606121063232\n",
      "epoch 127, loss 0.09978614747524261\n",
      "epoch 128, loss 0.0984736829996109\n",
      "epoch 129, loss 0.09717850387096405\n",
      "epoch 130, loss 0.09590034186840057\n",
      "epoch 131, loss 0.09463904798030853\n",
      "epoch 132, loss 0.09339433163404465\n",
      "epoch 133, loss 0.09216594696044922\n",
      "epoch 134, loss 0.0909537523984909\n",
      "epoch 135, loss 0.08975750207901001\n",
      "epoch 136, loss 0.08857693523168564\n",
      "epoch 137, loss 0.08741194009780884\n",
      "epoch 138, loss 0.08626225590705872\n",
      "epoch 139, loss 0.08512773364782333\n",
      "epoch 140, loss 0.08400808274745941\n",
      "epoch 141, loss 0.08290316164493561\n",
      "epoch 142, loss 0.0818127989768982\n",
      "epoch 143, loss 0.08073676377534866\n",
      "epoch 144, loss 0.07967488467693329\n",
      "epoch 145, loss 0.07862696051597595\n",
      "epoch 146, loss 0.07759281992912292\n",
      "epoch 147, loss 0.07657228410243988\n",
      "epoch 148, loss 0.07556517422199249\n",
      "epoch 149, loss 0.07457133382558823\n",
      "epoch 150, loss 0.07359050214290619\n",
      "epoch 151, loss 0.07262266427278519\n",
      "epoch 152, loss 0.07166749984025955\n",
      "epoch 153, loss 0.07072485983371735\n",
      "epoch 154, loss 0.0697946697473526\n",
      "epoch 155, loss 0.0688767209649086\n",
      "epoch 156, loss 0.06797081977128983\n",
      "epoch 157, loss 0.06707680225372314\n",
      "epoch 158, loss 0.06619462370872498\n",
      "epoch 159, loss 0.06532398611307144\n",
      "epoch 160, loss 0.06446484476327896\n",
      "epoch 161, loss 0.06361696869134903\n",
      "epoch 162, loss 0.06278024613857269\n",
      "epoch 163, loss 0.06195452809333801\n",
      "epoch 164, loss 0.06113964319229126\n",
      "epoch 165, loss 0.060335516929626465\n",
      "epoch 166, loss 0.05954193323850632\n",
      "epoch 167, loss 0.05875878781080246\n",
      "epoch 168, loss 0.057985950261354446\n",
      "epoch 169, loss 0.05722330883145332\n",
      "epoch 170, loss 0.056470680981874466\n",
      "epoch 171, loss 0.05572793632745743\n",
      "epoch 172, loss 0.05499500036239624\n",
      "epoch 173, loss 0.05427166819572449\n",
      "epoch 174, loss 0.05355783551931381\n",
      "epoch 175, loss 0.05285344272851944\n",
      "epoch 176, loss 0.05215831473469734\n",
      "epoch 177, loss 0.051472295075654984\n",
      "epoch 178, loss 0.05079527944326401\n",
      "epoch 179, loss 0.05012720823287964\n",
      "epoch 180, loss 0.049467913806438446\n",
      "epoch 181, loss 0.04881729558110237\n",
      "epoch 182, loss 0.04817521572113037\n",
      "epoch 183, loss 0.04754162207245827\n",
      "epoch 184, loss 0.04691634699702263\n",
      "epoch 185, loss 0.04629925265908241\n",
      "epoch 186, loss 0.04569031670689583\n",
      "epoch 187, loss 0.045089367777109146\n",
      "epoch 188, loss 0.04449637234210968\n",
      "epoch 189, loss 0.043911103159189224\n",
      "epoch 190, loss 0.04333357885479927\n",
      "epoch 191, loss 0.042763639241456985\n",
      "epoch 192, loss 0.04220118373632431\n",
      "epoch 193, loss 0.04164613410830498\n",
      "epoch 194, loss 0.04109841585159302\n",
      "epoch 195, loss 0.040557861328125\n",
      "epoch 196, loss 0.04002442583441734\n",
      "epoch 197, loss 0.03949800878763199\n",
      "epoch 198, loss 0.03897853568196297\n",
      "epoch 199, loss 0.03846586123108864\n",
      "epoch 200, loss 0.03795995935797691\n",
      "epoch 201, loss 0.03746070712804794\n",
      "epoch 202, loss 0.036968015134334564\n",
      "epoch 203, loss 0.0364818274974823\n",
      "epoch 204, loss 0.03600199520587921\n",
      "epoch 205, loss 0.035528458654880524\n",
      "epoch 206, loss 0.035061195492744446\n",
      "epoch 207, loss 0.03460004925727844\n",
      "epoch 208, loss 0.03414498642086983\n",
      "epoch 209, loss 0.033695872873067856\n",
      "epoch 210, loss 0.03325270488858223\n",
      "epoch 211, loss 0.0328153632581234\n",
      "epoch 212, loss 0.032383762300014496\n",
      "epoch 213, loss 0.03195784613490105\n",
      "epoch 214, loss 0.03153753653168678\n",
      "epoch 215, loss 0.031122734770178795\n",
      "epoch 216, loss 0.030713384971022606\n",
      "epoch 217, loss 0.030309418216347694\n",
      "epoch 218, loss 0.029910782352089882\n",
      "epoch 219, loss 0.029517367482185364\n",
      "epoch 220, loss 0.029129141941666603\n",
      "epoch 221, loss 0.028746027499437332\n",
      "epoch 222, loss 0.028367947787046432\n",
      "epoch 223, loss 0.02799484133720398\n",
      "epoch 224, loss 0.027626648545265198\n",
      "epoch 225, loss 0.027263304218649864\n",
      "epoch 226, loss 0.026904720813035965\n",
      "epoch 227, loss 0.026550855487585068\n",
      "epoch 228, loss 0.02620166540145874\n",
      "epoch 229, loss 0.025857066735625267\n",
      "epoch 230, loss 0.025516975671052933\n",
      "epoch 231, loss 0.025181377306580544\n",
      "epoch 232, loss 0.02485017478466034\n",
      "epoch 233, loss 0.024523314088582993\n",
      "epoch 234, loss 0.024200765416026115\n",
      "epoch 235, loss 0.02388247847557068\n",
      "epoch 236, loss 0.02356836572289467\n",
      "epoch 237, loss 0.023258402943611145\n",
      "epoch 238, loss 0.022952508181333542\n",
      "epoch 239, loss 0.022650597617030144\n",
      "epoch 240, loss 0.022352691739797592\n",
      "epoch 241, loss 0.02205870859324932\n",
      "epoch 242, loss 0.021768584847450256\n",
      "epoch 243, loss 0.021482262760400772\n",
      "epoch 244, loss 0.02119971439242363\n",
      "epoch 245, loss 0.020920898765325546\n",
      "epoch 246, loss 0.020645711570978165\n",
      "epoch 247, loss 0.02037418819963932\n",
      "epoch 248, loss 0.02010621502995491\n",
      "epoch 249, loss 0.019841784611344337\n",
      "epoch 250, loss 0.01958082616329193\n",
      "epoch 251, loss 0.019323280081152916\n",
      "epoch 252, loss 0.01906912960112095\n",
      "epoch 253, loss 0.018818324431777\n",
      "epoch 254, loss 0.018570849671959877\n",
      "epoch 255, loss 0.01832658238708973\n",
      "epoch 256, loss 0.018085556104779243\n",
      "epoch 257, loss 0.0178476944565773\n",
      "epoch 258, loss 0.01761295273900032\n",
      "epoch 259, loss 0.017381299287080765\n",
      "epoch 260, loss 0.0171526949852705\n",
      "epoch 261, loss 0.016927117481827736\n",
      "epoch 262, loss 0.016704484820365906\n",
      "epoch 263, loss 0.016484778374433517\n",
      "epoch 264, loss 0.016267983242869377\n",
      "epoch 265, loss 0.016053996980190277\n",
      "epoch 266, loss 0.01584285870194435\n",
      "epoch 267, loss 0.01563449017703533\n",
      "epoch 268, loss 0.01542886532843113\n",
      "epoch 269, loss 0.015225944109261036\n",
      "epoch 270, loss 0.015025684610009193\n",
      "epoch 271, loss 0.014828061684966087\n",
      "epoch 272, loss 0.014633026905357838\n",
      "epoch 273, loss 0.014440575614571571\n",
      "epoch 274, loss 0.014250645413994789\n",
      "epoch 275, loss 0.01406320370733738\n",
      "epoch 276, loss 0.013878241181373596\n",
      "epoch 277, loss 0.013695694506168365\n",
      "epoch 278, loss 0.013515567407011986\n",
      "epoch 279, loss 0.013337801210582256\n",
      "epoch 280, loss 0.013162377290427685\n",
      "epoch 281, loss 0.01298926305025816\n",
      "epoch 282, loss 0.012818428687751293\n",
      "epoch 283, loss 0.012649839743971825\n",
      "epoch 284, loss 0.012483463622629642\n",
      "epoch 285, loss 0.01231928076595068\n",
      "epoch 286, loss 0.012157252989709377\n",
      "epoch 287, loss 0.01199736725538969\n",
      "epoch 288, loss 0.011839550919830799\n",
      "epoch 289, loss 0.011683833785355091\n",
      "epoch 290, loss 0.011530161835253239\n",
      "epoch 291, loss 0.011378497816622257\n",
      "epoch 292, loss 0.011228865943849087\n",
      "epoch 293, loss 0.011081160977482796\n",
      "epoch 294, loss 0.010935438796877861\n",
      "epoch 295, loss 0.010791609063744545\n",
      "epoch 296, loss 0.01064967829734087\n",
      "epoch 297, loss 0.010509605519473553\n",
      "epoch 298, loss 0.010371376760303974\n",
      "epoch 299, loss 0.010234952904284\n",
      "epoch 300, loss 0.010100334882736206\n",
      "epoch 301, loss 0.00996750220656395\n",
      "epoch 302, loss 0.009836392477154732\n",
      "epoch 303, loss 0.009707026183605194\n",
      "epoch 304, loss 0.009579366073012352\n",
      "epoch 305, loss 0.009453373961150646\n",
      "epoch 306, loss 0.009329037740826607\n",
      "epoch 307, loss 0.00920633040368557\n",
      "epoch 308, loss 0.009085243567824364\n",
      "epoch 309, loss 0.008965743705630302\n",
      "epoch 310, loss 0.008847823366522789\n",
      "epoch 311, loss 0.00873146578669548\n",
      "epoch 312, loss 0.008616629056632519\n",
      "epoch 313, loss 0.008503306657075882\n",
      "epoch 314, loss 0.008391465060412884\n",
      "epoch 315, loss 0.008281087502837181\n",
      "epoch 316, loss 0.008172173984348774\n",
      "epoch 317, loss 0.008064685389399529\n",
      "epoch 318, loss 0.007958615198731422\n",
      "epoch 319, loss 0.007853944785892963\n",
      "epoch 320, loss 0.007750640157610178\n",
      "epoch 321, loss 0.007648696191608906\n",
      "epoch 322, loss 0.007548110093921423\n",
      "epoch 323, loss 0.007448837161064148\n",
      "epoch 324, loss 0.007350859697908163\n",
      "epoch 325, loss 0.0072541749104857445\n",
      "epoch 326, loss 0.007158766034990549\n",
      "epoch 327, loss 0.0070646158419549465\n",
      "epoch 328, loss 0.0069716982543468475\n",
      "epoch 329, loss 0.0068799941800534725\n",
      "epoch 330, loss 0.006789498962461948\n",
      "epoch 331, loss 0.006700216326862574\n",
      "epoch 332, loss 0.006612083874642849\n",
      "epoch 333, loss 0.0065251244232058525\n",
      "epoch 334, loss 0.006439291872084141\n",
      "epoch 335, loss 0.006354598794132471\n",
      "epoch 336, loss 0.006271027959883213\n",
      "epoch 337, loss 0.0061885397881269455\n",
      "epoch 338, loss 0.006107150577008724\n",
      "epoch 339, loss 0.006026831455528736\n",
      "epoch 340, loss 0.005947566591203213\n",
      "epoch 341, loss 0.005869345739483833\n",
      "epoch 342, loss 0.005792145151644945\n",
      "epoch 343, loss 0.005715967155992985\n",
      "epoch 344, loss 0.005640791729092598\n",
      "epoch 345, loss 0.005566599778831005\n",
      "epoch 346, loss 0.0054933736100792885\n",
      "epoch 347, loss 0.00542111974209547\n",
      "epoch 348, loss 0.005349824205040932\n",
      "epoch 349, loss 0.00527945114299655\n",
      "epoch 350, loss 0.005210027564316988\n",
      "epoch 351, loss 0.005141501780599356\n",
      "epoch 352, loss 0.005073888227343559\n",
      "epoch 353, loss 0.005007162224501371\n",
      "epoch 354, loss 0.004941300489008427\n",
      "epoch 355, loss 0.004876310471445322\n",
      "epoch 356, loss 0.00481216749176383\n",
      "epoch 357, loss 0.004748871084302664\n",
      "epoch 358, loss 0.0046864161267876625\n",
      "epoch 359, loss 0.004624779336154461\n",
      "epoch 360, loss 0.004563943948596716\n",
      "epoch 361, loss 0.004503920674324036\n",
      "epoch 362, loss 0.0044446783140301704\n",
      "epoch 363, loss 0.004386215470731258\n",
      "epoch 364, loss 0.004328535404056311\n",
      "epoch 365, loss 0.004271599464118481\n",
      "epoch 366, loss 0.0042154183611273766\n",
      "epoch 367, loss 0.004159973934292793\n",
      "epoch 368, loss 0.004105258267372847\n",
      "epoch 369, loss 0.004051276948302984\n",
      "epoch 370, loss 0.00399799132719636\n",
      "epoch 371, loss 0.003945402801036835\n",
      "epoch 372, loss 0.0038935106713324785\n",
      "epoch 373, loss 0.003842306090518832\n",
      "epoch 374, loss 0.0037917736917734146\n",
      "epoch 375, loss 0.003741899039596319\n",
      "epoch 376, loss 0.0036926858592778444\n",
      "epoch 377, loss 0.003644117619842291\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "epoch 378, loss 0.003596179187297821\n",
      "epoch 379, loss 0.003548887325450778\n",
      "epoch 380, loss 0.00350220431573689\n",
      "epoch 381, loss 0.0034561436623334885\n",
      "epoch 382, loss 0.0034106853418052197\n",
      "epoch 383, loss 0.0033658319152891636\n",
      "epoch 384, loss 0.0033215670846402645\n",
      "epoch 385, loss 0.0032778754830360413\n",
      "epoch 386, loss 0.0032347559463232756\n",
      "epoch 387, loss 0.0031922131311148405\n",
      "epoch 388, loss 0.0031502177007496357\n",
      "epoch 389, loss 0.0031087840907275677\n",
      "epoch 390, loss 0.0030678967013955116\n",
      "epoch 391, loss 0.0030275399331003428\n",
      "epoch 392, loss 0.0029877235647290945\n",
      "epoch 393, loss 0.0029484278056770563\n",
      "epoch 394, loss 0.0029096510261297226\n",
      "epoch 395, loss 0.002871372504159808\n",
      "epoch 396, loss 0.0028336052782833576\n",
      "epoch 397, loss 0.0027963335160166025\n",
      "epoch 398, loss 0.002759553026407957\n",
      "epoch 399, loss 0.002723255194723606\n",
      "epoch 400, loss 0.0026874386239796877\n",
      "epoch 401, loss 0.0026520933024585247\n",
      "epoch 402, loss 0.0026172087527811527\n",
      "epoch 403, loss 0.002582796383649111\n",
      "epoch 404, loss 0.0025488289538770914\n",
      "epoch 405, loss 0.002515302738174796\n",
      "epoch 406, loss 0.0024822228588163853\n",
      "epoch 407, loss 0.0024495769757777452\n",
      "epoch 408, loss 0.0024173615965992212\n",
      "epoch 409, loss 0.0023855690378695726\n",
      "epoch 410, loss 0.002354186959564686\n",
      "epoch 411, loss 0.0023232244420796633\n",
      "epoch 412, loss 0.002292667981237173\n",
      "epoch 413, loss 0.0022625168785452843\n",
      "epoch 414, loss 0.0022327620536088943\n",
      "epoch 415, loss 0.0022033986169844866\n",
      "epoch 416, loss 0.002174414461478591\n",
      "epoch 417, loss 0.0021458189003169537\n",
      "epoch 418, loss 0.0021175947040319443\n",
      "epoch 419, loss 0.0020897346548736095\n",
      "epoch 420, loss 0.0020622550509870052\n",
      "epoch 421, loss 0.0020351263228803873\n",
      "epoch 422, loss 0.002008365234360099\n",
      "epoch 423, loss 0.00198194058611989\n",
      "epoch 424, loss 0.001955874729901552\n",
      "epoch 425, loss 0.0019301483407616615\n",
      "epoch 426, loss 0.0019047599053010345\n",
      "epoch 427, loss 0.001879713498055935\n",
      "epoch 428, loss 0.001854990958236158\n",
      "epoch 429, loss 0.001830598688684404\n",
      "epoch 430, loss 0.0018065206240862608\n",
      "epoch 431, loss 0.0017827635165303946\n",
      "epoch 432, loss 0.001759312697686255\n",
      "epoch 433, loss 0.001736173639073968\n",
      "epoch 434, loss 0.0017133364453911781\n",
      "epoch 435, loss 0.0016908046090975404\n",
      "epoch 436, loss 0.0016685653245076537\n",
      "epoch 437, loss 0.001646626042202115\n",
      "epoch 438, loss 0.0016249653417617083\n",
      "epoch 439, loss 0.0016035981243476272\n",
      "epoch 440, loss 0.0015825077425688505\n",
      "epoch 441, loss 0.0015616889577358961\n",
      "epoch 442, loss 0.0015411563217639923\n",
      "epoch 443, loss 0.0015208907425403595\n",
      "epoch 444, loss 0.0015008870977908373\n",
      "epoch 445, loss 0.0014811439905315638\n",
      "epoch 446, loss 0.0014616698026657104\n",
      "epoch 447, loss 0.0014424460241571069\n",
      "epoch 448, loss 0.0014234735863283277\n",
      "epoch 449, loss 0.0014047531876713037\n",
      "epoch 450, loss 0.0013862715568393469\n",
      "epoch 451, loss 0.0013680399861186743\n",
      "epoch 452, loss 0.0013500459026545286\n",
      "epoch 453, loss 0.001332286512479186\n",
      "epoch 454, loss 0.0013147660065442324\n",
      "epoch 455, loss 0.0012974758865311742\n",
      "epoch 456, loss 0.0012804113794118166\n",
      "epoch 457, loss 0.001263574929907918\n",
      "epoch 458, loss 0.0012469543144106865\n",
      "epoch 459, loss 0.0012305602431297302\n",
      "epoch 460, loss 0.0012143723433837295\n",
      "epoch 461, loss 0.0011984009761363268\n",
      "epoch 462, loss 0.0011826397385448217\n",
      "epoch 463, loss 0.001167084788903594\n",
      "epoch 464, loss 0.0011517362436279655\n",
      "epoch 465, loss 0.001136590028181672\n",
      "epoch 466, loss 0.0011216390412300825\n",
      "epoch 467, loss 0.0011068868916481733\n",
      "epoch 468, loss 0.0010923314839601517\n",
      "epoch 469, loss 0.0010779683943837881\n",
      "epoch 470, loss 0.0010637894738465548\n",
      "epoch 471, loss 0.0010497983312234282\n",
      "epoch 472, loss 0.001035994035191834\n",
      "epoch 473, loss 0.0010223612189292908\n",
      "epoch 474, loss 0.00100891652982682\n",
      "epoch 475, loss 0.0009956449503079057\n",
      "epoch 476, loss 0.0009825477609410882\n",
      "epoch 477, loss 0.0009696257184259593\n",
      "epoch 478, loss 0.0009568772511556745\n",
      "epoch 479, loss 0.0009442873415537179\n",
      "epoch 480, loss 0.0009318710071966052\n",
      "epoch 481, loss 0.000919615849852562\n",
      "epoch 482, loss 0.0009075202979147434\n",
      "epoch 483, loss 0.0008955828961916268\n",
      "epoch 484, loss 0.0008838053909130394\n",
      "epoch 485, loss 0.000872179982252419\n",
      "epoch 486, loss 0.0008607084746472538\n",
      "epoch 487, loss 0.0008493890054523945\n",
      "epoch 488, loss 0.0008382241940125823\n",
      "epoch 489, loss 0.0008271950064226985\n",
      "epoch 490, loss 0.0008163148304447532\n",
      "epoch 491, loss 0.0008055813377723098\n",
      "epoch 492, loss 0.0007949856808409095\n",
      "epoch 493, loss 0.0007845278596505523\n",
      "epoch 494, loss 0.0007742098532617092\n",
      "epoch 495, loss 0.0007640270050615072\n",
      "epoch 496, loss 0.0007539769867435098\n",
      "epoch 497, loss 0.0007440610788762569\n",
      "epoch 498, loss 0.0007342767203226686\n",
      "epoch 499, loss 0.000724618963431567\n",
      "tensor([[5.7498],\n",
      "        [6.5655],\n",
      "        [6.0143],\n",
      "        [5.7306],\n",
      "        [5.1362],\n",
      "        [6.2258],\n",
      "        [5.2046],\n",
      "        [7.4312],\n",
      "        [7.7837],\n",
      "        [4.9391],\n",
      "        [6.9407],\n",
      "        [5.6522],\n",
      "        [5.8441],\n",
      "        [7.5970],\n",
      "        [3.4075],\n",
      "        [3.4864],\n",
      "        [3.1584],\n",
      "        [7.1412],\n",
      "        [6.8742],\n",
      "        [7.3245],\n",
      "        [7.8570],\n",
      "        [6.9772],\n",
      "        [5.3217],\n",
      "        [6.8858],\n",
      "        [3.6391],\n",
      "        [6.1965],\n",
      "        [3.7621],\n",
      "        [7.6906],\n",
      "        [5.6176],\n",
      "        [5.0922],\n",
      "        [4.3563],\n",
      "        [6.8550],\n",
      "        [5.2956],\n",
      "        [5.8460],\n",
      "        [3.1514],\n",
      "        [6.0873],\n",
      "        [6.0601],\n",
      "        [6.0838],\n",
      "        [7.6860],\n",
      "        [6.4019]], grad_fn=<AddmmBackward>)\n",
      "Parameter containing:\n",
      "tensor([[4.9026]], requires_grad=True)\n",
      "Parameter containing:\n",
      "tensor([3.0593], requires_grad=True)\n"
     ]
    }
   ],
   "source": [
    "# 创建一个线性回归的模型\n",
    "class LinearRegressionModel(nn.Module):\n",
    "    def __init__(self, input_dim, output_dim):\n",
    "        super(LinearRegressionModel, self).__init__()\n",
    "        self.linear = nn.Linear(input_dim, output_dim,bias=True)  \n",
    "\n",
    "    def forward(self, x):\n",
    "        out = self.linear(x)\n",
    "        return out\n",
    "\n",
    "model = LinearRegressionModel(1, 1) #模型初始化\n",
    "criterion = nn.MSELoss() #定义损失函数：均方误差\n",
    "optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate) #定义最优化算法\n",
    "\n",
    "#常见的几种最优化算法?\n",
    "for epoch in range(500):  #迭代次数\n",
    "    optimizer.zero_grad() #清理模型里参数的梯度值\n",
    "    predict_Y = model(inX) #根据输入获得当前参数下的输出值\n",
    "    loss = criterion(predict_Y, outY) #计算误差\n",
    "    loss.backward() #反向传播，计算梯度，\n",
    "    optimizer.step() #更新模型参数\n",
    "    print('epoch {}, loss {}'.format(epoch, loss.item()))\n",
    "\n",
    "predict_Y = model(inX)\n",
    "print(predict_Y)\n",
    "\n",
    "for name,p in model.named_parameters():\n",
    "    print(p)"
   ]
  },
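  {
   "cell_type": "markdown",
   "id": "7f3a9c10",
   "metadata": {},
   "source": [
    "After training, we can read the fitted slope and intercept directly off the `Linear` layer and plot the fitted line against the data. This is a quick sanity check assuming `x`, `y`, `model` and `plt` from the cells above; the names `w_fit` and `b_fit` are our own."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3d8e5b22",
   "metadata": {},
   "outputs": [],
   "source": [
    "w_fit = model.linear.weight.item()  # learned slope w\n",
    "b_fit = model.linear.bias.item()    # learned intercept b\n",
    "print(\"w=%.3f, b=%.3f\" % (w_fit, b_fit))\n",
    "\n",
    "plt.scatter(x, y, label=\"data\")\n",
    "plt.plot(x, w_fit * x + b_fit, color=\"red\", label=\"fitted line\")\n",
    "plt.legend()\n",
    "plt.show()"
   ]
  },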
  {
   "cell_type": "markdown",
   "id": "0a17793a",
   "metadata": {},
   "source": [
    "至此，根据一些点找到直线上的w,b已经完成。对于复杂一些的分布规律，该如何搞呢？"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "90c3e67c",
   "metadata": {},
   "source": [
    "常见的几种最优化算法，https://pytorch.org/docs/master/optim.html\n",
    "\n",
    "![常见优化算法](https://ai-studio-static-online.cdn.bcebos.com/f4cf80f95424411a85ad74998433317e721f56ddb4f64e6f8a28a27b6a1baa6b)\n",
    "\n",
    "1. 随机梯度，Stochastic Gradient Descent,SGD\n",
    "    * 随机批量梯度，Batch Gradient Descent,BGD\n",
    "2. 动量 Momentum, 引入物理“动量”的概念，累积速度，减少震荡，使参数更新的方向更稳定\n",
    "3. AdaGrad,根据不同参数距离最优解的远近，动态调整学习率。学习率逐渐下降，依据各参数变化大小调整学习率\n",
    "4. Adam, 由于动量和自适应学习率两个优化思路是正交的，因此可以将两个思路结合起来，这就是当前广泛应用的算法。"
   ]
  },
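  {
   "cell_type": "markdown",
   "id": "b91e4d57",
   "metadata": {},
   "source": [
    "The optimizer is a drop-in choice: in the training loop above, only the `torch.optim` construction line changes. A minimal sketch, reusing `LinearRegressionModel`, `criterion`, `inX` and `outY` from the cells above (the learning rates here are illustrative, not tuned):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c24f7e83",
   "metadata": {},
   "outputs": [],
   "source": [
    "model2 = LinearRegressionModel(1, 1)\n",
    "optimizer = torch.optim.Adam(model2.parameters(), lr=0.1)\n",
    "# optimizer = torch.optim.SGD(model2.parameters(), lr=0.06, momentum=0.9)  # SGD with momentum\n",
    "# optimizer = torch.optim.Adagrad(model2.parameters(), lr=0.5)\n",
    "\n",
    "for epoch in range(200):\n",
    "    optimizer.zero_grad()\n",
    "    loss = criterion(model2(inX), outY)\n",
    "    loss.backward()\n",
    "    optimizer.step()\n",
    "print('final loss', loss.item())"
   ]
  },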
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "8c556c21",
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.12"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
