{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Intelligence 2011, *Introduction to AI*, In-Class Demo 2: Fully Connected Neural Networks (Regression)\n",
    "<h1 style='color:blue'> 1. A Regression Problem: Boston Housing Prices</h1>\n",
    "\n",
    "### [Boston housing price dataset](https://keras-zh.readthedocs.io/datasets/#_9)\n",
    "\n",
    "- The dataset comes from the StatLib library maintained by Carnegie Mellon University.\n",
    "- The samples describe houses at different locations in the Boston suburbs in the 1970s, with 13 attributes per house. The target is **the median value of owner-occupied homes at a location (unit: k$)**.\n",
    "- The 13 features are:\n",
    "    1. CRIM: per-capita crime rate by town\n",
    "    1. ZN: proportion of residential land zoned for lots over 25,000 sq. ft.\n",
    "    1. INDUS: proportion of non-retail business acres per town\n",
    "    1. CHAS: Charles River dummy variable (1 if the tract bounds the river; 0 otherwise)\n",
    "    1. NOX: nitric oxide concentration (parts per 10 million)\n",
    "    1. RM: average number of rooms per dwelling\n",
    "    1. AGE: proportion of owner-occupied units built before 1940\n",
    "    1. DIS: weighted distances to five Boston employment centers\n",
    "    1. RAD: index of accessibility to radial highways\n",
    "    1. TAX: full-value property-tax rate per $10,000\n",
    "    1. PTRATIO: pupil-teacher ratio by town\n",
    "    1. B: 1000(Bk - 0.63)^2, where Bk is the proportion of Black residents by town\n",
    "    1. LSTAT: percentage of lower-status population\n",
    "\n",
    "- The label MEDV is the median value of owner-occupied homes, in units of $1000"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Experiment Steps\n",
    "\n",
    "- (1) Load the data and split it into training and test sets.\n",
    "- (2) Standardize the data to remove differences in feature scales and make the features comparable.\n",
    "- (3) Build and train the neural network.\n",
    "- (4) Visualize the training history.\n",
    "- (5) Save the model.\n",
    "- (6) Predict with the model and apply inverse normalization."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Show plots inline in the Jupyter notebook\n",
    "%matplotlib inline  \n",
    "\n",
    "import matplotlib.pyplot as plt  \n",
    "import pandas as pd \n",
    "import numpy as np\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 1.1 Loading the Data"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "((404, 13), (404,), (102, 13), (102,), numpy.ndarray)"
      ]
     },
     "execution_count": 2,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Load the data\n",
    "from tensorflow.keras.datasets import boston_housing\n",
    "\n",
    "# seed: fixing the random seed makes the split identical across runs, for easy comparison.\n",
    "# test_split: the fraction of data held out as the test set.\n",
    "(x_train, y_train), (x_test, y_test) = boston_housing.load_data(seed = 430, test_split=0.2)  \n",
    "\n",
    "\n",
    "# Inspect the data shapes and type\n",
    "x_train.shape, y_train.shape, x_test.shape, y_test.shape, type(x_train)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 3,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(array([[1.67600e-01, 0.00000e+00, 7.38000e+00, 0.00000e+00, 4.93000e-01,\n",
       "         6.42600e+00, 5.23000e+01, 4.54040e+00, 5.00000e+00, 2.87000e+02,\n",
       "         1.96000e+01, 3.96900e+02, 7.20000e+00],\n",
       "        [1.30751e+01, 0.00000e+00, 1.81000e+01, 0.00000e+00, 5.80000e-01,\n",
       "         5.71300e+00, 5.67000e+01, 2.82370e+00, 2.40000e+01, 6.66000e+02,\n",
       "         2.02000e+01, 3.96900e+02, 1.47600e+01],\n",
       "        [9.33889e+00, 0.00000e+00, 1.81000e+01, 0.00000e+00, 6.79000e-01,\n",
       "         6.38000e+00, 9.56000e+01, 1.96820e+00, 2.40000e+01, 6.66000e+02,\n",
       "         2.02000e+01, 6.07200e+01, 2.40800e+01]]),\n",
       " array([23.8, 20.1,  9.5]))"
      ]
     },
     "execution_count": 3,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Preview the data (first three rows)\n",
    "x_train[:3], y_train[:3]"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 4,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "(50.0, 5.0)"
      ]
     },
     "execution_count": 4,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Max and min of the house-price target y\n",
    "y_train.max(), y_train.min()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 1.2 Previewing the Data\n",
    "- Convert the arrays to DataFrame format, then inspect them directly with describe()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "metadata": {},
   "outputs": [],
   "source": [
    "column_names = ['CRIM', 'ZN', 'INDUS', 'CHAS', 'NOX', 'RM', \n",
    "                'AGE', 'DIS', 'RAD', 'TAX', 'PTRATIO', 'B', 'LSTAT']"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 6,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>CRIM</th>\n",
       "      <th>ZN</th>\n",
       "      <th>INDUS</th>\n",
       "      <th>CHAS</th>\n",
       "      <th>NOX</th>\n",
       "      <th>RM</th>\n",
       "      <th>AGE</th>\n",
       "      <th>DIS</th>\n",
       "      <th>RAD</th>\n",
       "      <th>TAX</th>\n",
       "      <th>PTRATIO</th>\n",
       "      <th>B</th>\n",
       "      <th>LSTAT</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>0.16760</td>\n",
       "      <td>0.0</td>\n",
       "      <td>7.38</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.493</td>\n",
       "      <td>6.426</td>\n",
       "      <td>52.3</td>\n",
       "      <td>4.5404</td>\n",
       "      <td>5.0</td>\n",
       "      <td>287.0</td>\n",
       "      <td>19.6</td>\n",
       "      <td>396.90</td>\n",
       "      <td>7.20</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>13.07510</td>\n",
       "      <td>0.0</td>\n",
       "      <td>18.10</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.580</td>\n",
       "      <td>5.713</td>\n",
       "      <td>56.7</td>\n",
       "      <td>2.8237</td>\n",
       "      <td>24.0</td>\n",
       "      <td>666.0</td>\n",
       "      <td>20.2</td>\n",
       "      <td>396.90</td>\n",
       "      <td>14.76</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>9.33889</td>\n",
       "      <td>0.0</td>\n",
       "      <td>18.10</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.679</td>\n",
       "      <td>6.380</td>\n",
       "      <td>95.6</td>\n",
       "      <td>1.9682</td>\n",
       "      <td>24.0</td>\n",
       "      <td>666.0</td>\n",
       "      <td>20.2</td>\n",
       "      <td>60.72</td>\n",
       "      <td>24.08</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "       CRIM   ZN  INDUS  CHAS    NOX     RM   AGE     DIS   RAD    TAX  \\\n",
       "0   0.16760  0.0   7.38   0.0  0.493  6.426  52.3  4.5404   5.0  287.0   \n",
       "1  13.07510  0.0  18.10   0.0  0.580  5.713  56.7  2.8237  24.0  666.0   \n",
       "2   9.33889  0.0  18.10   0.0  0.679  6.380  95.6  1.9682  24.0  666.0   \n",
       "\n",
       "   PTRATIO       B  LSTAT  \n",
       "0     19.6  396.90   7.20  \n",
       "1     20.2  396.90  14.76  \n",
       "2     20.2   60.72  24.08  "
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>CRIM</th>\n",
       "      <th>ZN</th>\n",
       "      <th>INDUS</th>\n",
       "      <th>CHAS</th>\n",
       "      <th>NOX</th>\n",
       "      <th>RM</th>\n",
       "      <th>AGE</th>\n",
       "      <th>DIS</th>\n",
       "      <th>RAD</th>\n",
       "      <th>TAX</th>\n",
       "      <th>PTRATIO</th>\n",
       "      <th>B</th>\n",
       "      <th>LSTAT</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>0.10008</td>\n",
       "      <td>0.0</td>\n",
       "      <td>2.46</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.488</td>\n",
       "      <td>6.563</td>\n",
       "      <td>95.6</td>\n",
       "      <td>2.8470</td>\n",
       "      <td>3.0</td>\n",
       "      <td>193.0</td>\n",
       "      <td>17.8</td>\n",
       "      <td>396.90</td>\n",
       "      <td>5.68</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>0.10153</td>\n",
       "      <td>0.0</td>\n",
       "      <td>12.83</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.437</td>\n",
       "      <td>6.279</td>\n",
       "      <td>74.5</td>\n",
       "      <td>4.0522</td>\n",
       "      <td>5.0</td>\n",
       "      <td>398.0</td>\n",
       "      <td>18.7</td>\n",
       "      <td>373.66</td>\n",
       "      <td>11.97</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>0.13960</td>\n",
       "      <td>0.0</td>\n",
       "      <td>8.56</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.520</td>\n",
       "      <td>6.167</td>\n",
       "      <td>90.0</td>\n",
       "      <td>2.4210</td>\n",
       "      <td>5.0</td>\n",
       "      <td>384.0</td>\n",
       "      <td>20.9</td>\n",
       "      <td>392.69</td>\n",
       "      <td>12.33</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "      CRIM   ZN  INDUS  CHAS    NOX     RM   AGE     DIS  RAD    TAX  PTRATIO  \\\n",
       "0  0.10008  0.0   2.46   0.0  0.488  6.563  95.6  2.8470  3.0  193.0     17.8   \n",
       "1  0.10153  0.0  12.83   0.0  0.437  6.279  74.5  4.0522  5.0  398.0     18.7   \n",
       "2  0.13960  0.0   8.56   0.0  0.520  6.167  90.0  2.4210  5.0  384.0     20.9   \n",
       "\n",
       "        B  LSTAT  \n",
       "0  396.90   5.68  \n",
       "1  373.66  11.97  \n",
       "2  392.69  12.33  "
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "from IPython.display import display\n",
    "# Convert x_train, x_test to DataFrames\n",
    "x_train_df = pd.DataFrame(x_train, columns= column_names)\n",
    "x_test_df = pd.DataFrame(x_test, columns= column_names)\n",
    "\n",
    "display(x_train_df.head(3))\n",
    "display(x_test_df.head(3))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'pandas.core.frame.DataFrame'>\n",
      "RangeIndex: 404 entries, 0 to 403\n",
      "Data columns (total 13 columns):\n",
      " #   Column   Non-Null Count  Dtype  \n",
      "---  ------   --------------  -----  \n",
      " 0   CRIM     404 non-null    float64\n",
      " 1   ZN       404 non-null    float64\n",
      " 2   INDUS    404 non-null    float64\n",
      " 3   CHAS     404 non-null    float64\n",
      " 4   NOX      404 non-null    float64\n",
      " 5   RM       404 non-null    float64\n",
      " 6   AGE      404 non-null    float64\n",
      " 7   DIS      404 non-null    float64\n",
      " 8   RAD      404 non-null    float64\n",
      " 9   TAX      404 non-null    float64\n",
      " 10  PTRATIO  404 non-null    float64\n",
      " 11  B        404 non-null    float64\n",
      " 12  LSTAT    404 non-null    float64\n",
      "dtypes: float64(13)\n",
      "memory usage: 41.2 KB\n"
     ]
    }
   ],
   "source": [
    "# Check each column's data type\n",
    "x_train_df.info()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>CRIM</th>\n",
       "      <th>ZN</th>\n",
       "      <th>INDUS</th>\n",
       "      <th>CHAS</th>\n",
       "      <th>NOX</th>\n",
       "      <th>RM</th>\n",
       "      <th>AGE</th>\n",
       "      <th>DIS</th>\n",
       "      <th>RAD</th>\n",
       "      <th>TAX</th>\n",
       "      <th>PTRATIO</th>\n",
       "      <th>B</th>\n",
       "      <th>LSTAT</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>count</th>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "      <td>404.000000</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>mean</th>\n",
       "      <td>3.767236</td>\n",
       "      <td>10.569307</td>\n",
       "      <td>11.244381</td>\n",
       "      <td>0.074257</td>\n",
       "      <td>0.557519</td>\n",
       "      <td>6.286725</td>\n",
       "      <td>68.644802</td>\n",
       "      <td>3.776204</td>\n",
       "      <td>9.504950</td>\n",
       "      <td>404.730198</td>\n",
       "      <td>18.395050</td>\n",
       "      <td>355.701411</td>\n",
       "      <td>12.692475</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>std</th>\n",
       "      <td>9.151749</td>\n",
       "      <td>22.634794</td>\n",
       "      <td>6.849242</td>\n",
       "      <td>0.262514</td>\n",
       "      <td>0.116204</td>\n",
       "      <td>0.717093</td>\n",
       "      <td>28.045879</td>\n",
       "      <td>2.092890</td>\n",
       "      <td>8.665122</td>\n",
       "      <td>168.033640</td>\n",
       "      <td>2.174583</td>\n",
       "      <td>92.100490</td>\n",
       "      <td>7.183601</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>min</th>\n",
       "      <td>0.006320</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.740000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.392000</td>\n",
       "      <td>3.561000</td>\n",
       "      <td>2.900000</td>\n",
       "      <td>1.137000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>187.000000</td>\n",
       "      <td>12.600000</td>\n",
       "      <td>2.520000</td>\n",
       "      <td>1.730000</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>25%</th>\n",
       "      <td>0.084123</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>5.190000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.453000</td>\n",
       "      <td>5.883000</td>\n",
       "      <td>46.175000</td>\n",
       "      <td>2.084875</td>\n",
       "      <td>4.000000</td>\n",
       "      <td>277.000000</td>\n",
       "      <td>17.000000</td>\n",
       "      <td>375.085000</td>\n",
       "      <td>7.197500</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>50%</th>\n",
       "      <td>0.279465</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>9.690000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.538000</td>\n",
       "      <td>6.198500</td>\n",
       "      <td>77.700000</td>\n",
       "      <td>3.142300</td>\n",
       "      <td>5.000000</td>\n",
       "      <td>329.500000</td>\n",
       "      <td>18.900000</td>\n",
       "      <td>391.115000</td>\n",
       "      <td>11.360000</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>75%</th>\n",
       "      <td>3.594927</td>\n",
       "      <td>12.500000</td>\n",
       "      <td>18.100000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.624000</td>\n",
       "      <td>6.636750</td>\n",
       "      <td>93.825000</td>\n",
       "      <td>5.212575</td>\n",
       "      <td>24.000000</td>\n",
       "      <td>666.000000</td>\n",
       "      <td>20.200000</td>\n",
       "      <td>396.157500</td>\n",
       "      <td>16.672500</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>max</th>\n",
       "      <td>88.976200</td>\n",
       "      <td>100.000000</td>\n",
       "      <td>27.740000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>0.871000</td>\n",
       "      <td>8.780000</td>\n",
       "      <td>100.000000</td>\n",
       "      <td>12.126500</td>\n",
       "      <td>24.000000</td>\n",
       "      <td>711.000000</td>\n",
       "      <td>22.000000</td>\n",
       "      <td>396.900000</td>\n",
       "      <td>37.970000</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "             CRIM          ZN       INDUS        CHAS         NOX          RM  \\\n",
       "count  404.000000  404.000000  404.000000  404.000000  404.000000  404.000000   \n",
       "mean     3.767236   10.569307   11.244381    0.074257    0.557519    6.286725   \n",
       "std      9.151749   22.634794    6.849242    0.262514    0.116204    0.717093   \n",
       "min      0.006320    0.000000    0.740000    0.000000    0.392000    3.561000   \n",
       "25%      0.084123    0.000000    5.190000    0.000000    0.453000    5.883000   \n",
       "50%      0.279465    0.000000    9.690000    0.000000    0.538000    6.198500   \n",
       "75%      3.594927   12.500000   18.100000    0.000000    0.624000    6.636750   \n",
       "max     88.976200  100.000000   27.740000    1.000000    0.871000    8.780000   \n",
       "\n",
       "              AGE         DIS         RAD         TAX     PTRATIO           B  \\\n",
       "count  404.000000  404.000000  404.000000  404.000000  404.000000  404.000000   \n",
       "mean    68.644802    3.776204    9.504950  404.730198   18.395050  355.701411   \n",
       "std     28.045879    2.092890    8.665122  168.033640    2.174583   92.100490   \n",
       "min      2.900000    1.137000    1.000000  187.000000   12.600000    2.520000   \n",
       "25%     46.175000    2.084875    4.000000  277.000000   17.000000  375.085000   \n",
       "50%     77.700000    3.142300    5.000000  329.500000   18.900000  391.115000   \n",
       "75%     93.825000    5.212575   24.000000  666.000000   20.200000  396.157500   \n",
       "max    100.000000   12.126500   24.000000  711.000000   22.000000  396.900000   \n",
       "\n",
       "            LSTAT  \n",
       "count  404.000000  \n",
       "mean    12.692475  \n",
       "std      7.183601  \n",
       "min      1.730000  \n",
       "25%      7.197500  \n",
       "50%     11.360000  \n",
       "75%     16.672500  \n",
       "max     37.970000  "
      ]
     },
     "execution_count": 8,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Summary statistics for each column\n",
    "x_train_df.describe()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 1.3 Standardization\n",
    "\n",
    "- StandardScaler standardizes **each feature dimension** separately, not each sample.\n",
    "- It transforms the data toward a standard normal distribution, i.e. mean 0 and standard deviation 1: **x_std = (x - μ)/σ**\n",
    "- where μ is the mean and σ is the standard deviation of the training samples.\n",
    "<h4 style = 'color:red'>The μ and σ used to standardize the test set must both come from the training set: fit on train, then transform test</h4>"
   ]
  },
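  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick numeric check of x_std = (x - μ)/σ (a sketch; the toy matrix X below is made up for illustration): computing the z-score by hand matches StandardScaler, since the scaler uses the per-column mean and population standard deviation.\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.preprocessing import StandardScaler\n",
    "\n",
    "# Toy feature matrix (illustrative only, not the Boston data)\n",
    "X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])\n",
    "\n",
    "# Manual z-score: subtract the per-column mean, divide by the per-column std\n",
    "manual = (X - X.mean(axis=0)) / X.std(axis=0)\n",
    "\n",
    "# Should agree with StandardScaler\n",
    "print(np.allclose(manual, StandardScaler().fit_transform(X)))  # True"
   ]
  },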
  {
   "cell_type": "code",
   "execution_count": 9,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>CRIM</th>\n",
       "      <th>ZN</th>\n",
       "      <th>INDUS</th>\n",
       "      <th>CHAS</th>\n",
       "      <th>NOX</th>\n",
       "      <th>RM</th>\n",
       "      <th>AGE</th>\n",
       "      <th>DIS</th>\n",
       "      <th>RAD</th>\n",
       "      <th>TAX</th>\n",
       "      <th>PTRATIO</th>\n",
       "      <th>B</th>\n",
       "      <th>LSTAT</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>0.16760</td>\n",
       "      <td>0.0</td>\n",
       "      <td>7.38</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.493</td>\n",
       "      <td>6.426</td>\n",
       "      <td>52.3</td>\n",
       "      <td>4.5404</td>\n",
       "      <td>5.0</td>\n",
       "      <td>287.0</td>\n",
       "      <td>19.6</td>\n",
       "      <td>396.90</td>\n",
       "      <td>7.20</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>13.07510</td>\n",
       "      <td>0.0</td>\n",
       "      <td>18.10</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.580</td>\n",
       "      <td>5.713</td>\n",
       "      <td>56.7</td>\n",
       "      <td>2.8237</td>\n",
       "      <td>24.0</td>\n",
       "      <td>666.0</td>\n",
       "      <td>20.2</td>\n",
       "      <td>396.90</td>\n",
       "      <td>14.76</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>9.33889</td>\n",
       "      <td>0.0</td>\n",
       "      <td>18.10</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.679</td>\n",
       "      <td>6.380</td>\n",
       "      <td>95.6</td>\n",
       "      <td>1.9682</td>\n",
       "      <td>24.0</td>\n",
       "      <td>666.0</td>\n",
       "      <td>20.2</td>\n",
       "      <td>60.72</td>\n",
       "      <td>24.08</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "       CRIM   ZN  INDUS  CHAS    NOX     RM   AGE     DIS   RAD    TAX  \\\n",
       "0   0.16760  0.0   7.38   0.0  0.493  6.426  52.3  4.5404   5.0  287.0   \n",
       "1  13.07510  0.0  18.10   0.0  0.580  5.713  56.7  2.8237  24.0  666.0   \n",
       "2   9.33889  0.0  18.10   0.0  0.679  6.380  95.6  1.9682  24.0  666.0   \n",
       "\n",
       "   PTRATIO       B  LSTAT  \n",
       "0     19.6  396.90   7.20  \n",
       "1     20.2  396.90  14.76  \n",
       "2     20.2   60.72  24.08  "
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>CRIM</th>\n",
       "      <th>ZN</th>\n",
       "      <th>INDUS</th>\n",
       "      <th>CHAS</th>\n",
       "      <th>NOX</th>\n",
       "      <th>RM</th>\n",
       "      <th>AGE</th>\n",
       "      <th>DIS</th>\n",
       "      <th>RAD</th>\n",
       "      <th>TAX</th>\n",
       "      <th>PTRATIO</th>\n",
       "      <th>B</th>\n",
       "      <th>LSTAT</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>0.10008</td>\n",
       "      <td>0.0</td>\n",
       "      <td>2.46</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.488</td>\n",
       "      <td>6.563</td>\n",
       "      <td>95.6</td>\n",
       "      <td>2.8470</td>\n",
       "      <td>3.0</td>\n",
       "      <td>193.0</td>\n",
       "      <td>17.8</td>\n",
       "      <td>396.90</td>\n",
       "      <td>5.68</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>0.10153</td>\n",
       "      <td>0.0</td>\n",
       "      <td>12.83</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.437</td>\n",
       "      <td>6.279</td>\n",
       "      <td>74.5</td>\n",
       "      <td>4.0522</td>\n",
       "      <td>5.0</td>\n",
       "      <td>398.0</td>\n",
       "      <td>18.7</td>\n",
       "      <td>373.66</td>\n",
       "      <td>11.97</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>0.13960</td>\n",
       "      <td>0.0</td>\n",
       "      <td>8.56</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.520</td>\n",
       "      <td>6.167</td>\n",
       "      <td>90.0</td>\n",
       "      <td>2.4210</td>\n",
       "      <td>5.0</td>\n",
       "      <td>384.0</td>\n",
       "      <td>20.9</td>\n",
       "      <td>392.69</td>\n",
       "      <td>12.33</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "      CRIM   ZN  INDUS  CHAS    NOX     RM   AGE     DIS  RAD    TAX  PTRATIO  \\\n",
       "0  0.10008  0.0   2.46   0.0  0.488  6.563  95.6  2.8470  3.0  193.0     17.8   \n",
       "1  0.10153  0.0  12.83   0.0  0.437  6.279  74.5  4.0522  5.0  398.0     18.7   \n",
       "2  0.13960  0.0   8.56   0.0  0.520  6.167  90.0  2.4210  5.0  384.0     20.9   \n",
       "\n",
       "        B  LSTAT  \n",
       "0  396.90   5.68  \n",
       "1  373.66  11.97  \n",
       "2  392.69  12.33  "
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "# One more look at the data before standardization:\n",
    "display(x_train_df.head(3))\n",
    "display(x_test_df.head(3))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'numpy.ndarray'> <class 'numpy.ndarray'>\n"
     ]
    }
   ],
   "source": [
    "# Standardize the features with mean and standard deviation (z-score)\n",
    "from sklearn.preprocessing import StandardScaler\n",
    "\n",
    "sc = StandardScaler()  # initialize the scaler\n",
    "x_train_std = sc.fit_transform(x_train_df)  # fit on train, then transform\n",
    "x_test_std = sc.transform(x_test_df)        # transform test with train statistics\n",
    "print(type(x_train_std),type(x_test_std))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "After standardization:\n",
      "train preview: ------------\n",
      "[[-0.39 -0.47 -0.56 -0.28 -0.56  0.19 -0.58  0.37 -0.52 -0.7   0.55  0.45\n",
      "  -0.77]\n",
      " [ 1.02 -0.47  1.   -0.28  0.19 -0.8  -0.43 -0.46  1.67  1.56  0.83  0.45\n",
      "   0.29]\n",
      " [ 0.61 -0.47  1.   -0.28  1.05  0.13  0.96 -0.86  1.67  1.56  0.83 -3.21\n",
      "   1.59]]\n",
      "train max:\n",
      " [9.32 3.96 2.41 3.53 2.7  3.48 1.12 3.99 1.67 1.82 1.66 0.45 3.52]\n",
      "train min:\n",
      " [-0.41 -0.47 -1.54 -0.28 -1.43 -3.81 -2.35 -1.26 -0.98 -1.3  -2.67 -3.84\n",
      " -1.53]\n",
      "train mean:\n",
      " [ 0. -0.  0.  0. -0. -0. -0.  0.  0. -0.  0.  0.  0.]\n",
      "train std:\n",
      " [1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1. 1.]\n"
     ]
    }
   ],
   "source": [
    "print(\"After standardization:\\ntrain preview: ------------\")\n",
    "print(x_train_std[:3].round(2))\n",
    "print(\"train max:\\n\", np.max(x_train_std, axis=0).round(2))\n",
    "print(\"train min:\\n\", np.min(x_train_std, axis=0).round(2))\n",
    "print(\"train mean:\\n\", np.mean(x_train_std, axis=0).round(2))\n",
    "print(\"train std:\\n\", np.std(x_train_std, axis=0).round(2))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "After standardization:\n",
      "test preview: ------------\n",
      "[[-0.4  -0.47 -1.28 -0.28 -0.6   0.39  0.96 -0.44 -0.75 -1.26 -0.27  0.45\n",
      "  -0.98]\n",
      " [-0.4  -0.47  0.23 -0.28 -1.04 -0.01  0.21  0.13 -0.52 -0.04  0.14  0.2\n",
      "  -0.1 ]\n",
      " [-0.4  -0.47 -0.39 -0.28 -0.32 -0.17  0.76 -0.65 -0.52 -0.12  1.15  0.4\n",
      "  -0.05]]\n",
      "test max:\n",
      " [3.78 3.29 2.41 3.53 2.7  3.4  1.12 3.32 1.67 1.82 1.66 0.45 2.5 ]\n",
      "test min:\n",
      " [-0.41 -0.47 -1.58 -0.28 -1.49 -1.96 -1.87 -1.27 -0.98 -1.26 -2.67 -3.86\n",
      " -1.36]\n",
      "test mean:\n",
      " [-0.08  0.17 -0.08 -0.1  -0.12 -0.01 -0.01  0.04  0.03  0.1   0.14  0.05\n",
      " -0.03]\n",
      "test std:\n",
      " [0.65 1.13 1.01 0.82 0.98 0.9  1.02 1.03 1.02 1.01 0.97 0.96 0.97]\n"
     ]
    }
   ],
   "source": [
    "print(\"After standardization:\\ntest preview: ------------\")\n",
    "print(x_test_std[:3].round(2))\n",
    "print(\"test max:\\n\", np.max(x_test_std, axis=0).round(2))\n",
    "print(\"test min:\\n\", np.min(x_test_std, axis=0).round(2))\n",
    "print(\"test mean:\\n\", np.mean(x_test_std, axis=0).round(2))\n",
    "print(\"test std:\\n\", np.std(x_test_std, axis=0).round(2))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 13,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "((404, 13), (404,), (102, 13), (102,))"
      ]
     },
     "execution_count": 13,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Finally, confirm the data shapes after preprocessing\n",
    "x_train_std.shape, y_train.shape, x_test_std.shape, y_test.shape"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**x: 404 samples with 13 features; y: 404 samples with a numeric target**"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<h3 style = 'color:red'>That concludes data preprocessing; now on to modeling</h3>\n",
    "\n",
    "---"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 2.1 Solving with Machine Learning: Linear Regression"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Parameters learned on the training set:\n",
      "w:\n",
      " [-0.87659378  1.06634679  0.29915997  0.52168038 -2.13559059  2.97638104\n",
      " -0.05700562 -2.95144347  2.78042878 -2.40524802 -2.19528919  0.82657633\n",
      " -3.26265917] \n",
      "b:\n",
      " 22.57722772277233\n"
     ]
    }
   ],
   "source": [
    "from sklearn.linear_model import LinearRegression  # linear regression\n",
    "# Initialize the model\n",
    "lr = LinearRegression()\n",
    "# Train\n",
    "lr.fit(x_train_std, y_train)\n",
    "# Print the learned parameters\n",
    "print(\"Parameters learned on the training set:\\nw:\\n\", lr.coef_, \"\\nb:\\n\", lr.intercept_)"
   ]
  },
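  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a sanity check (a sketch, not executed in this notebook), the same weights can be recovered from the closed-form normal equations w = (XᵀX)⁻¹Xᵀy by appending a bias column; `X` and `y` stand for the standardized training data, and `normal_equation_fit` is a hypothetical helper name:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "def normal_equation_fit(X, y):\n",
    "    # append a column of ones so the last coefficient is the intercept b\n",
    "    Xb = np.hstack([X, np.ones((X.shape[0], 1))])\n",
    "    # least-squares solution of Xb @ theta = y (numerically stabler than inverting XᵀX)\n",
    "    theta = np.linalg.lstsq(Xb, y, rcond=None)[0]\n",
    "    return theta[:-1], theta[-1]  # (w, b)\n",
    "```\n",
    "\n",
    "`normal_equation_fit(x_train_std, y_train)` should agree with `lr.coef_` and `lr.intercept_` up to numerical precision."
   ]
  },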
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### The coefficient of determination R²\n",
    "- The coefficient of determination (also called R² or goodness of fit) summarizes how well a regression fits the data.\n",
    "- R² gives **the fraction of the variance of the dependent variable y that is explained by the independent variables x, so R² lies between 0 and 1, and the closer to 1 the better.**\n",
    "- Interpretation: the larger R² is, the better x explains y, and the more tightly the observations cluster around the regression line."
   ]
  },
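  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of R² computed directly from its definition, R² = 1 − SS_res / SS_tot (equivalent to `sklearn.metrics.r2_score`); `y_true` and `y_pred` are placeholder array names:\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "\n",
    "def r2(y_true, y_pred):\n",
    "    ss_res = np.sum((y_true - y_pred) ** 2)           # residual sum of squares\n",
    "    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)  # total sum of squares\n",
    "    return 1 - ss_res / ss_tot\n",
    "```"
   ]
  },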
  {
   "cell_type": "code",
   "execution_count": 15,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "在训练集上的决定系数r2得分为： 0.7565107960048918\n",
      "在训练集上的均方误差MSE为： 20.006453472555805\n"
     ]
    }
   ],
   "source": [
    "from sklearn.metrics import r2_score\n",
    "from sklearn.metrics import mean_squared_error\n",
    "\n",
    "y_train_pred = lr.predict(x_train_std)\n",
    "print(\"在训练集上的决定系数r2得分为：\", r2_score(y_train, y_train_pred))\n",
    "print(\"在训练集上的均方误差MSE为：\", mean_squared_error(y_train, y_train_pred))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 16,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "在测试集上的决定系数r2得分为： 0.675908902305892\n",
      "在测试集上的均方误差MSE为： 30.240273702472102\n"
     ]
    }
   ],
   "source": [
    "y_test_pred = lr.predict(x_test_std)\n",
    "print(\"在测试集上的决定系数r2得分为：\", r2_score(y_test, y_test_pred))\n",
    "print(\"在测试集上的均方误差MSE为：\", mean_squared_error(y_test, y_test_pred))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 2.2 Building a baseline neural network (base_line)\n",
    "\n",
    "- <h3 style='color: red'> input layer 13 → hidden layer 15 → output layer 1 </h3>\n",
    "\n",
    "### Tips:\n",
    "- `input_dim = 13` and `input_shape = (13,)` are equivalent, but `input_dim` can only be used when the input is a 2-D batch of vectors.\n",
    "- Compiling the model:\n",
    " - Loss function - measures how far the model's predictions are from the targets during training; training minimizes it to steer the model in the right direction.\n",
    " - Optimizer - determines how the model is updated based on the data it sees and its loss function.\n",
    " - Metrics - used to monitor the training and testing steps; for regression, MSE or MAE is typical, whereas accuracy (the fraction of correctly classified samples) only applies to classification.\n",
    " \n",
    " https://keras-zh.readthedocs.io/getting-started/sequential-model-guide/"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Model: \"sequential\"\n",
      "_________________________________________________________________\n",
      "Layer (type)                 Output Shape              Param #   \n",
      "=================================================================\n",
      "dense (Dense)                (None, 15)                210       \n",
      "_________________________________________________________________\n",
      "dense_1 (Dense)              (None, 1)                 16        \n",
      "=================================================================\n",
      "Total params: 226\n",
      "Trainable params: 226\n",
      "Non-trainable params: 0\n",
      "_________________________________________________________________\n",
      "None\n"
     ]
    }
   ],
   "source": [
    "from tensorflow.keras.models import Sequential\n",
    "from tensorflow.keras.layers import Dense\n",
    "\n",
    "# Build the network architecture\n",
    "# (1) initialize the model\n",
    "model = Sequential()\n",
    "# (2) first hidden layer with 15 neurons\n",
    "model.add(Dense(units=15, input_dim=13, activation='relu'))\n",
    "# (3) output layer with 1 neuron; for regression use the linear activation\n",
    "model.add(Dense(units=1, activation='linear'))\n",
    "\n",
    "# compile the model\n",
    "model.compile(optimizer='sgd',  # default learning rate is 0.01\n",
    "              loss='mean_squared_error')\n",
    "# show the model structure (summary() prints it and returns None)\n",
    "print(model.summary())\n"
   ]
  },
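  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The `Param #` column in the summary follows from `inputs × units + units` (weights plus biases) for each `Dense` layer, which we can verify by hand:\n",
    "\n",
    "```python\n",
    "hidden = 13 * 15 + 15  # parameters in the 13→15 hidden layer\n",
    "output = 15 * 1 + 1    # parameters in the 15→1 output layer\n",
    "assert hidden + output == 226  # matches \"Total params\" above\n",
    "```"
   ]
  },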
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Model training and visualization"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Epoch 1/100\n",
      "81/81 [==============================] - 0s 2ms/step - loss: 75.1475 - val_loss: 247.0158\n",
      "Epoch 2/100\n",
      "81/81 [==============================] - 0s 666us/step - loss: 45.9485 - val_loss: 21.7228\n",
      "Epoch 3/100\n",
      "81/81 [==============================] - 0s 770us/step - loss: 41.7090 - val_loss: 32.3340\n",
      "Epoch 4/100\n",
      "81/81 [==============================] - 0s 674us/step - loss: 24.7535 - val_loss: 21.5779\n",
      "Epoch 5/100\n",
      "81/81 [==============================] - 0s 675us/step - loss: 22.3866 - val_loss: 18.1009\n",
      "Epoch 6/100\n",
      "81/81 [==============================] - 0s 596us/step - loss: 24.5470 - val_loss: 23.7802\n",
      "Epoch 7/100\n",
      "81/81 [==============================] - 0s 676us/step - loss: 17.7731 - val_loss: 12.2572\n",
      "Epoch 8/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 14.9535 - val_loss: 41.1698\n",
      "Epoch 9/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 14.9917 - val_loss: 13.5854\n",
      "Epoch 10/100\n",
      "81/81 [==============================] - 0s 660us/step - loss: 15.0843 - val_loss: 23.3353\n",
      "Epoch 11/100\n",
      "81/81 [==============================] - 0s 815us/step - loss: 15.4364 - val_loss: 17.4759\n",
      "Epoch 12/100\n",
      "81/81 [==============================] - 0s 631us/step - loss: 16.7626 - val_loss: 13.7954\n",
      "Epoch 13/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 12.5471 - val_loss: 20.4775\n",
      "Epoch 14/100\n",
      "81/81 [==============================] - 0s 803us/step - loss: 14.1652 - val_loss: 82.1308\n",
      "Epoch 15/100\n",
      "81/81 [==============================] - 0s 656us/step - loss: 17.1535 - val_loss: 14.1575\n",
      "Epoch 16/100\n",
      "81/81 [==============================] - 0s 618us/step - loss: 14.2868 - val_loss: 14.1600\n",
      "Epoch 17/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 17.1125 - val_loss: 18.2367\n",
      "Epoch 18/100\n",
      "81/81 [==============================] - 0s 663us/step - loss: 14.1213 - val_loss: 12.6771\n",
      "Epoch 19/100\n",
      "81/81 [==============================] - 0s 819us/step - loss: 13.5015 - val_loss: 20.1915\n",
      "Epoch 20/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 12.3214 - val_loss: 14.3597\n",
      "Epoch 21/100\n",
      "81/81 [==============================] - 0s 662us/step - loss: 10.1799 - val_loss: 22.3894\n",
      "Epoch 22/100\n",
      "81/81 [==============================] - 0s 802us/step - loss: 11.0208 - val_loss: 13.9294\n",
      "Epoch 23/100\n",
      "81/81 [==============================] - 0s 732us/step - loss: 14.0173 - val_loss: 16.7107\n",
      "Epoch 24/100\n",
      "81/81 [==============================] - 0s 692us/step - loss: 11.3994 - val_loss: 16.2438\n",
      "Epoch 25/100\n",
      "81/81 [==============================] - 0s 621us/step - loss: 14.6794 - val_loss: 13.5855\n",
      "Epoch 26/100\n",
      "81/81 [==============================] - 0s 660us/step - loss: 12.3641 - val_loss: 11.8734\n",
      "Epoch 27/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 10.9421 - val_loss: 12.0124\n",
      "Epoch 28/100\n",
      "81/81 [==============================] - 0s 824us/step - loss: 12.5045 - val_loss: 11.5246\n",
      "Epoch 29/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 10.6118 - val_loss: 13.5893\n",
      "Epoch 30/100\n",
      "81/81 [==============================] - 0s 661us/step - loss: 11.5856 - val_loss: 13.5943\n",
      "Epoch 31/100\n",
      "81/81 [==============================] - 0s 820us/step - loss: 9.1081 - val_loss: 14.0738\n",
      "Epoch 32/100\n",
      "81/81 [==============================] - 0s 661us/step - loss: 10.8278 - val_loss: 12.8220\n",
      "Epoch 33/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 14.2571 - val_loss: 19.3678\n",
      "Epoch 34/100\n",
      "81/81 [==============================] - 0s 821us/step - loss: 10.7711 - val_loss: 24.6012\n",
      "Epoch 35/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 10.3074 - val_loss: 15.6219\n",
      "Epoch 36/100\n",
      "81/81 [==============================] - 0s 905us/step - loss: 10.0887 - val_loss: 12.8675\n",
      "Epoch 37/100\n",
      "81/81 [==============================] - 0s 692us/step - loss: 8.4956 - val_loss: 35.1694\n",
      "Epoch 38/100\n",
      "81/81 [==============================] - 0s 648us/step - loss: 10.3015 - val_loss: 13.7790\n",
      "Epoch 39/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 10.0485 - val_loss: 16.4581\n",
      "Epoch 40/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 9.7751 - val_loss: 13.2767\n",
      "Epoch 41/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 10.7486 - val_loss: 10.1823\n",
      "Epoch 42/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 9.5464 - val_loss: 27.9698\n",
      "Epoch 43/100\n",
      "81/81 [==============================] - 0s 664us/step - loss: 8.9063 - val_loss: 12.2760\n",
      "Epoch 44/100\n",
      "81/81 [==============================] - 0s 936us/step - loss: 8.8072 - val_loss: 14.7088\n",
      "Epoch 45/100\n",
      "81/81 [==============================] - 0s 617us/step - loss: 9.5596 - val_loss: 10.9233\n",
      "Epoch 46/100\n",
      "81/81 [==============================] - 0s 771us/step - loss: 8.9828 - val_loss: 11.4553\n",
      "Epoch 47/100\n",
      "81/81 [==============================] - 0s 934us/step - loss: 11.1900 - val_loss: 10.6116\n",
      "Epoch 48/100\n",
      "81/81 [==============================] - 0s 767us/step - loss: 12.5592 - val_loss: 16.5449\n",
      "Epoch 49/100\n",
      "81/81 [==============================] - 0s 755us/step - loss: 8.3567 - val_loss: 13.3214\n",
      "Epoch 50/100\n",
      "81/81 [==============================] - 0s 600us/step - loss: 9.5681 - val_loss: 15.8015\n",
      "Epoch 51/100\n",
      "81/81 [==============================] - 0s 812us/step - loss: 8.3026 - val_loss: 13.8092\n",
      "Epoch 52/100\n",
      "81/81 [==============================] - 0s 646us/step - loss: 8.3652 - val_loss: 16.5745\n",
      "Epoch 53/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 8.1799 - val_loss: 15.2482\n",
      "Epoch 54/100\n",
      "81/81 [==============================] - 0s 822us/step - loss: 9.1440 - val_loss: 17.0174\n",
      "Epoch 55/100\n",
      "81/81 [==============================] - 0s 675us/step - loss: 7.7172 - val_loss: 15.4553\n",
      "Epoch 56/100\n",
      "81/81 [==============================] - 0s 657us/step - loss: 10.9206 - val_loss: 14.5035\n",
      "Epoch 57/100\n",
      "81/81 [==============================] - 0s 895us/step - loss: 9.2370 - val_loss: 12.7549\n",
      "Epoch 58/100\n",
      "81/81 [==============================] - 0s 545us/step - loss: 8.0097 - val_loss: 10.5057\n",
      "Epoch 59/100\n",
      "81/81 [==============================] - 0s 854us/step - loss: 9.3722 - val_loss: 13.2221\n",
      "Epoch 60/100\n",
      "81/81 [==============================] - 0s 583us/step - loss: 7.7224 - val_loss: 14.1099\n",
      "Epoch 61/100\n",
      "81/81 [==============================] - 0s 783us/step - loss: 8.4899 - val_loss: 12.0738\n",
      "Epoch 62/100\n",
      "81/81 [==============================] - 0s 663us/step - loss: 9.2283 - val_loss: 13.7812\n",
      "Epoch 63/100\n",
      "81/81 [==============================] - 0s 660us/step - loss: 10.8752 - val_loss: 22.5924\n",
      "Epoch 64/100\n",
      "81/81 [==============================] - 0s 783us/step - loss: 8.0530 - val_loss: 12.1511\n",
      "Epoch 65/100\n",
      "81/81 [==============================] - 0s 635us/step - loss: 6.8116 - val_loss: 11.7200\n",
      "Epoch 66/100\n",
      "81/81 [==============================] - 0s 648us/step - loss: 7.8288 - val_loss: 16.3515\n",
      "Epoch 67/100\n",
      "81/81 [==============================] - 0s 1ms/step - loss: 8.9547 - val_loss: 16.7683\n",
      "Epoch 68/100\n",
      "81/81 [==============================] - 0s 618us/step - loss: 7.0917 - val_loss: 13.6515\n",
      "Epoch 69/100\n",
      "81/81 [==============================] - 0s 664us/step - loss: 7.9776 - val_loss: 13.1191\n",
      "Epoch 70/100\n",
      "81/81 [==============================] - 0s 696us/step - loss: 7.5205 - val_loss: 10.7955\n",
      "Epoch 71/100\n",
      "81/81 [==============================] - 0s 581us/step - loss: 7.0824 - val_loss: 13.0660\n",
      "Epoch 72/100\n",
      "81/81 [==============================] - 0s 790us/step - loss: 6.9101 - val_loss: 14.5806\n",
      "Epoch 73/100\n",
      "81/81 [==============================] - 0s 670us/step - loss: 9.2543 - val_loss: 13.0638\n",
      "Epoch 74/100\n",
      "81/81 [==============================] - 0s 666us/step - loss: 6.7671 - val_loss: 14.4165\n",
      "Epoch 75/100\n",
      "81/81 [==============================] - 0s 712us/step - loss: 7.8255 - val_loss: 12.6438\n",
      "Epoch 76/100\n",
      "81/81 [==============================] - 0s 594us/step - loss: 7.7841 - val_loss: 13.3841\n",
      "Epoch 77/100\n",
      "81/81 [==============================] - 0s 798us/step - loss: 7.8855 - val_loss: 12.4280\n",
      "Epoch 78/100\n",
      "81/81 [==============================] - 0s 661us/step - loss: 7.1184 - val_loss: 17.0288\n",
      "Epoch 79/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 8.4999 - val_loss: 14.7243\n",
      "Epoch 80/100\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "81/81 [==============================] - 0s 747us/step - loss: 10.9989 - val_loss: 15.6397\n",
      "Epoch 81/100\n",
      "81/81 [==============================] - 0s 533us/step - loss: 7.9440 - val_loss: 13.4372\n",
      "Epoch 82/100\n",
      "81/81 [==============================] - 0s 755us/step - loss: 7.8091 - val_loss: 15.2494\n",
      "Epoch 83/100\n",
      "81/81 [==============================] - 0s 749us/step - loss: 10.2807 - val_loss: 15.2022\n",
      "Epoch 84/100\n",
      "81/81 [==============================] - 0s 680us/step - loss: 8.0060 - val_loss: 14.8067\n",
      "Epoch 85/100\n",
      "81/81 [==============================] - 0s 821us/step - loss: 8.3503 - val_loss: 13.0396\n",
      "Epoch 86/100\n",
      "81/81 [==============================] - 0s 657us/step - loss: 7.3075 - val_loss: 13.2452\n",
      "Epoch 87/100\n",
      "81/81 [==============================] - 0s 971us/step - loss: 7.8197 - val_loss: 13.8765\n",
      "Epoch 88/100\n",
      "81/81 [==============================] - 0s 655us/step - loss: 7.3775 - val_loss: 15.4879\n",
      "Epoch 89/100\n",
      "81/81 [==============================] - 0s 669us/step - loss: 6.9616 - val_loss: 15.7300\n",
      "Epoch 90/100\n",
      "81/81 [==============================] - 0s 657us/step - loss: 8.9432 - val_loss: 14.2249\n",
      "Epoch 91/100\n",
      "81/81 [==============================] - 0s 645us/step - loss: 7.5877 - val_loss: 13.8765\n",
      "Epoch 92/100\n",
      "81/81 [==============================] - 0s 811us/step - loss: 6.9643 - val_loss: 11.9327\n",
      "Epoch 93/100\n",
      "81/81 [==============================] - 0s 643us/step - loss: 7.1212 - val_loss: 14.9353\n",
      "Epoch 94/100\n",
      "81/81 [==============================] - 0s 657us/step - loss: 9.0853 - val_loss: 16.8201\n",
      "Epoch 95/100\n",
      "81/81 [==============================] - 0s 652us/step - loss: 7.1131 - val_loss: 13.1685\n",
      "Epoch 96/100\n",
      "81/81 [==============================] - 0s 896us/step - loss: 7.0148 - val_loss: 11.2334\n",
      "Epoch 97/100\n",
      "81/81 [==============================] - 0s 569us/step - loss: 6.4309 - val_loss: 19.4343\n",
      "Epoch 98/100\n",
      "81/81 [==============================] - 0s 823us/step - loss: 6.5783 - val_loss: 11.6005\n",
      "Epoch 99/100\n",
      "81/81 [==============================] - 0s 593us/step - loss: 6.1197 - val_loss: 15.2096\n",
      "Epoch 100/100\n",
      "81/81 [==============================] - 0s 551us/step - loss: 7.7521 - val_loss: 18.8694\n"
     ]
    }
   ],
   "source": [
    "# Train the model\n",
    "hs = model.fit(x_train_std, y_train,\n",
    "               epochs=100,  # number of passes over the training set\n",
    "               batch_size=5,  # mini-batch size used for each gradient-descent step\n",
    "               validation_data=(x_test_std, y_test),  # validation set\n",
    "               verbose=1  # 0: silent, 1: full progress bar (default), 2: one line per epoch\n",
    "               )"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Evaluating the neural network"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 19,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "在训练集上的决定系数r2得分为： 0.8792749206514454\n",
      "在训练集上的均方误差MSE为： 9.919456975209423\n"
     ]
    }
   ],
   "source": [
    "y_train_pred_nn = model.predict(x_train_std)\n",
    "print(\"在训练集上的决定系数r2得分为：\", r2_score(y_train, y_train_pred_nn))\n",
    "print(\"在训练集上的均方误差MSE为：\", mean_squared_error(y_train, y_train_pred_nn))"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 20,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "在测试集上的决定系数r2得分为： 0.7977732287348548\n",
      "在测试集上的均方误差MSE为： 18.869364066263856\n",
      "x_test中前十个样本 预测值 是：\n",
      " [34.54 20.23 21.08 34.51 24.73 48.51 32.91 27.86 20.58 30.51]\n",
      "x_test中前十个样本 真实值 是：\n",
      " [32.5 20.  20.1 37.3 23.1 50.  32.  23.5 21.7 23.6]\n"
     ]
    }
   ],
   "source": [
    "y_test_pred_nn = model.predict(x_test_std)\n",
    "print(\"在测试集上的决定系数r2得分为：\", r2_score(y_test, y_test_pred_nn))\n",
    "print(\"在测试集上的均方误差MSE为：\", mean_squared_error(y_test, y_test_pred_nn))\n",
    "print('x_test中前十个样本 预测值 是：\\n',y_test_pred_nn[:10].flatten().round(2))\n",
    "print('x_test中前十个样本 真实值 是：\\n', y_test[:10])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Visualizing the training history"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "metadata": {
    "scrolled": false
   },
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAfQAAAEWCAYAAACQWmUDAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3dd5zcVb3/8ddnZntL282mV5KQBgFDKKEFpFoAC0VRVBRFuIg/8Qp4Fb2KV72CHa8oTUG5uVKlhkBoUpMQ0nvdbLLZbO+zM3N+f5zZ7GQzm2ySnd3M8n4+HvuYme/MfOfMd2fm8z2f8/merznnEBERkdQW6O0GiIiIyOFTQBcREekDFNBFRET6AAV0ERGRPkABXUREpA9QQBcREekDFNBFZC9mNsbMnJmldeGxXzCz1w93PSJy+BTQRVKYmW02s5CZFXZYviQWTMf0TstEpKcpoIukvk3AFW03zGw6kN17zRGR3qCALpL6/gp8Pu72VcBf4h9gZv3M7C9mVm5mW8zsP8wsELsvaGa/MLPdZrYR+EiC595jZjvMbLuZ/djMggfbSDMbZmZPmlmlma03s6/E3TfLzBaaWa2ZlZnZnbHlWWb2oJlVmFm1mb1rZsUH+9oiHwQK6CKp7y2gwMwmxwLtZcCDHR7zW6AfMA44A78D8MXYfV8BPgocB8wEPtXhuQ8AYeCo2GPOBb58CO38O1ACDIu9xk/M7OzYfb8Gfu2cKwDGA3Njy6+KtXskMAj4GtB0CK8t0ucpoIv0DW299HOA1cD2tjvigvwtzrk659xm4A7gc7GHXAr8yjm3zTlXCfxX3HOLgQuAG51zDc65XcAvgcsPpnFmNhI4FfiOc67ZObcE+HNcG1qBo8ys0DlX75x7K275IOAo51zEObfIOVd7MK8t8kGhgC7SN/wV+AzwBTqk24FCIAPYErdsCzA8dn0YsK3DfW1GA+nAjljKuxr4IzD4INs3DKh0ztV10oargYnA6lha/aNx7+t54GEzKzWzn5tZ+kG+tsgHggK6SB/gnNuCL467EHi0w9278T3d0XHLRtHei9+BT2nH39dmG9ACFDrn+sf+CpxzUw+yiaXAQDPLT9QG59w659wV+B2FnwH/MLNc51yrc+6HzrkpwCn4oYHPIyL7UEAX6TuuBs5yzjXEL3TORfBj0rebWb6ZjQb+H+3j7HOBG8xshJkNAG6Oe+4OYB5wh5kVmFnAzMab2RkH0zDn3DbgDeC/YoVux8Ta+xCAmV1pZkXOuShQHXtaxMzmmNn02LBBLX7HJHIwry3yQaGALtJHOOc2OOcWdnL3vwENwEbgdeBvwL2x+/6ET2u/Dyxm3x7+5/Ep+5VAFfAPYOghNPEKYAy+t/4YcJtz7oXYfecDK8ysHl8gd7lzrhkYEnu9WmAV8Ar7FvyJCGDOud5ug4iIiBwm9dBFRET6AAV0ERGRPkABXUREpA9QQBcREekDUvq0hoWFhW7MmDG93QwREZEes2jRot3OuaKOy1M6oI8ZM4aFCzs7SkdERKTvMbMtiZYr5S4iItIHKKCLiIj0AQroIiIifUDSxtBjp0v8C37qxihwt3Pu12b2A/z5l8tjD73VOfdM7Dm34Od3jgA3OOeeT1b7REQk9bS2tlJSUkJzc3NvNyXpsrKyGDFiBOnpXTvBYDKL4sLAt5xzi2NnWFpkZm3zNv/SOfeL+Aeb2RT8OZan4k+1ON/MJsZOLCEiIkJJSQn5+fmMGTMGM+vt5iSNc46KigpKSkoYO3Zsl56TtJS7c26Hc25x7Hod/sQKw/fzlIuAh51zLc65TcB6YFay2iciIqmnubmZQYMG9elgDmBmDBo06KAyET0yhm5mY4DjgLdji643s6Vmdm/sdI3gg/22uKeVkGAHwMyuMbOFZrawvLy8490iItLH9fVg3uZg32fSA7qZ5QGPADc652qBPwDjgRnADuCOtocmePo+
p4Jzzt3tnJvpnJtZVLTPcfWHrrESFvwEdi7rvnWKiIj0kKQGdDNLxwfzh5xzjwI458qccxHnXBR/Hua2tHoJMDLu6SPw503uGc018MrPYOfyHntJERFJLRUVFcyYMYMZM2YwZMgQhg8fvud2KBTa73MXLlzIDTfckLS2JbPK3YB7gFXOuTvjlg91zu2I3bwEaIugTwJ/M7M78UVxE4B3ktW+fQQz/GVk//8QERH54Bo0aBBLliwB4Ac/+AF5eXncdNNNe+4Ph8OkpSUOrTNnzmTmzJlJa1syq9xnA58DlpnZktiyW4ErzGwGPp2+GfgqgHNuhZnNBVbiK+Sv69EK92DssAAFdBEROQhf+MIXGDhwIO+99x7HH388l112GTfeeCNNTU1kZ2dz3333MWnSJF5++WV+8Ytf8NRTT/GDH/yArVu3snHjRrZu3cqNN9542L33pAV059zrJB4Xf2Y/z7kduD1ZbdqvPQG9tVdeXkREDs4P/7mClaW13brOKcMKuO1jUw/6eWvXrmX+/PkEg0Fqa2t59dVXSUtLY/78+dx666088sgj+zxn9erVLFiwgLq6OiZNmsS1117b5WPOE0npk7N0q7aUe1QBXUREDs6nP/1pgsEgADU1NVx11VWsW7cOM6O1NXFc+chHPkJmZiaZmZkMHjyYsrIyRowYcchtUEBvE1DKXUQklRxKTzpZcnNz91z/3ve+x5w5c3jsscfYvHkzZ555ZsLnZGZm7rkeDAYJh8OH1QbN5d5GKXcREekGNTU1DB/up1G5//77e+x1FdDbmPleugK6iIgchn//93/nlltuYfbs2UQiPVfbbc7tM3dLypg5c6ZbuHBh963w9qEw80twXu/U5YmIyP6tWrWKyZMn93Yzekyi92tmi5xz+xz/ph56vKB66CIikpoU0OMFM1TlLiIiKUkBPV4gXVXuIiKSkhTQ4ynlLiIiKUoBPV4wQwFdRERSkgJ6vKBS7iIikpoU0OMp5S4iIvtRXV3NXXfddUjP/dWvfkVjY2M3t6idAno8VbmLiMh+HMkBXXO5x9NMcSIish8333wzGzZsYMaMGZxzzjkMHjyYuXPn0tLSwiWXXMIPf/hDGhoauPTSSykpKSESifC9732PsrIySktLmTNnDoWFhSxYsKDb26aAHi+YDuHm3m6FiIh0xbM3w85l3bvOIdPhgp92evdPf/pTli9fzpIlS5g3bx7/+Mc/eOedd3DO8fGPf5xXX32V8vJyhg0bxtNPPw34ud379evHnXfeyYIFCygsLOzeNsco5R5PVe4iItJF8+bNY968eRx33HEcf/zxrF69mnXr1jF9+nTmz5/Pd77zHV577TX69evXI+1RDz2eArqISOrYT0+6JzjnuOWWW/jqV7+6z32LFi3imWee4ZZbbuHcc8/l+9//ftLbox56vGCaDlsTEZFO5efnU1dXB8B5553HvffeS319PQDbt29n165dlJaWkpOTw5VXXslNN93E4sWL93luMqiHHk9V7iIish+DBg1i9uzZTJs2jQsuuIDPfOYznHzyyQDk5eXx4IMPsn79er797W8TCARIT0/nD3/4AwDXXHMNF1xwAUOHDk1KUZxOnxrv8a/Dplfhm8u7b50iItJtdPpUnT61awJKuYuISGpSQI+nojgREUlRCujxFNBFRI54qTxUfDAO9n0qoMdTlbuIyBEtKyuLioqKPh/UnXNUVFSQlZXV5eeoyj2eqtxFRI5oI0aMoKSkhPLy8t5uStJlZWUxYsSILj9eAT1eMANcFKIRCAR7uzUiItJBeno6Y8eO7e1mHJGUco8XiO3fKO0uIiIpRgE9XjDDXyqgi4hIilFAj7cnoId7tx0iIiIHSQE9XlApdxERSU0K6PGUchcRkRSlgB6vLaBHlXIXEZHUkrSAbmYjzWyBma0ysxVm9o3Y8oFm9oKZrYtdDoh7zi1mtt7M1pjZeclqW6dU5S4iIikqmT30MPAt59xk4CTgOjObAtwM
vOicmwC8GLtN7L7LganA+cBdZtazB4Mr5S4iIikqaQHdObfDObc4dr0OWAUMBy4CHog97AHg4tj1i4CHnXMtzrlNwHpgVrLal5Cq3EVEJEX1yBi6mY0BjgPeBoqdczvAB31gcOxhw4FtcU8riS3ruK5rzGyhmS3s9qn/VOUuIiIpKukB3czygEeAG51ztft7aIJl+8y+75y72zk30zk3s6ioqLua6SnlLiIiKSqpAd3M0vHB/CHn3KOxxWVmNjR2/1BgV2x5CTAy7ukjgNJktm8fe6rcdYIWERFJLcmscjfgHmCVc+7OuLueBK6KXb8KeCJu+eVmlmlmY4EJwDvJal9Ce6rcFdBFRCS1JPNsa7OBzwHLzGxJbNmtwE+BuWZ2NbAV+DSAc26Fmc0FVuIr5K9zzkWS2L59KeUuIiIpKmkB3Tn3OonHxQHO7uQ5twO3J6tNB7QnoKuHLiIiqUUzxcULKuUuIiKpSQE9nlLuIiKSohTQ46nKXUREUpQCerxgur9Uyl1ERFKMAnq8QFtAV8pdRERSiwJ6PFW5i4hIilJAj6eUu4iIpCgF9HhmfrY4pdxFRCTFKKB3FMxQlbuIiKQcBfSOgulKuYuISMpRQO8okK6Uu4iIpBwF9I6CGeqhi4hIylFA70gpdxERSUEK6B0FlXIXEZHUo4DeUTBDAV1ERFKOAnpHwXSIhnu7FSIiIgdFAb0jVbmLiEgKUkDvSFXuIiKSghTQO1KVu4iIpCAF9I5U5S4iIilIAb0jpdxFRCQFKaB3FEzXyVlERCTlKKB3pCp3ERFJQQroHSnlLiIiKUgBvSNVuYuISApSQO9IVe4iIpKCFNA7UspdRERSkAJ6R6pyFxGRFKSA3pGq3EVEJAUpoHcUzAAXhWikt1siIiLSZQroHQXT/aXG0UVEJIUooHcUzPCXSruLiEgKSVpAN7N7zWyXmS2PW/YDM9tuZktifxfG3XeLma03szVmdl6y2nVA6qGLiEgKSmYP/X7g/ATLf+mcmxH7ewbAzKYAlwNTY8+5y8yCSWxb59oCuirdRUQkhSQtoDvnXgUqu/jwi4CHnXMtzrlNwHpgVrLatl9KuYuISArqjTH0681saSwlPyC2bDiwLe4xJbFl+zCza8xsoZktLC8v7/7WBZRyFxGR1NPTAf0PwHhgBrADuCO23BI81iVagXPubufcTOfczKKiou5vocbQRUQkBfVoQHfOlTnnIs65KPAn2tPqJcDIuIeOAEp7sm17KOUuIiIpqEcDupkNjbt5CdBWAf8kcLmZZZrZWGAC8E5Ptm0P9dBFRCQFpSVrxWb2d+BMoNDMSoDbgDPNbAY+nb4Z+CqAc26Fmc0FVgJh4DrnXO9M1aYqdxERSUFJC+jOuSsSLL5nP4+/Hbg9We3pMqXcRUQkBWmmuI72VLkroIuISOpQQO9IY+giIpKCFNA72pNyV0AXEZHUoYDeUVApdxERST0K6B0p5S4iIimoSwHdzHLNLBC7PtHMPm5m6cltWi9pS7nrsDUREUkhXe2hvwpkmdlw4EXgi/izqfU9qnIXEZEU1NWAbs65RuATwG+dc5cAU5LXrF6klLuIiKSgLgd0MzsZ+CzwdGxZ0ial6VWqchcRkRTU1YB+I3AL8FhsmtZxwILkNasXqcpdRERSUJd62c65V4BXAGLFcbudczcks2G9RudDFxGRFNTVKve/mVmBmeXiT6Cyxsy+ndym9ZJAAAJpqnIXEZGU0tWU+xTnXC1wMfAMMAr4XNJa1dsC6Uq5i4hISulqQE+PHXd+MfCEc64VfwrUvimYoZS7iIiklK4G9D/iz1+eC7xqZqOB2mQ1qtcF0xXQRUQkpXS1KO43wG/iFm0xsznJadIRIKiUu4iIpJauFsX1M7M7zWxh7O8OfG+9b1IPXUREUkxXU+73AnXApbG/WuC+ZDWq1wUzVOUuIiIppauzvY13zn0y7vYPzWxJMhp0RFCVu4iIpJiu9tCbzOzUthtmNhtoSk6T
jgBKuYuISIrpag/9a8BfzKxf7HYVcFVymnQE0GFrIiKSYrpa5f4+cKyZFcRu15rZjcDSZDau1wQzlHIXEZGU0tWUO+ADeWzGOID/l4T2HBmCaeqhi4hISjmogN6BdVsrjjSqchcRkRRzOAG9j0/9qpS7iIikjv2OoZtZHYkDtwHZSWnRkSCglLuIiKSW/QZ051x+TzXkiKIqdxERSTGHk3LvuxTQRUQkxSigJxJM0xi6iIikFAX0RFQUJyIiKUYBPZFgBkTDvd0KERGRLlNATySglLuIiKSWpAV0M7vXzHaZ2fK4ZQPN7AUzWxe7HBB33y1mtt7M1pjZeclqV5co5S4iIikmmT30+4HzOyy7GXjROTcBeDF2GzObAlwOTI095y4zCyaxbfsXzAAXhWik15ogIiJyMJIW0J1zrwKVHRZfBDwQu/4AcHHc8oedcy3OuU3AemBWstp2QMHY4fk6dE1ERFJET4+hFzvndgDELgfHlg8HtsU9riS2bB9mdo2ZLTSzheXl5clpZTDDXyrtLiIiKeJIKYpLdKKXhHPFO+fuds7NdM7NLCoqSk5r2gK6Kt1FRCRF9HRALzOzoQCxy12x5SXAyLjHjQBKe7ht7QJtKXf10EVEJDX0dEB/Ergqdv0q4Im45ZebWaaZjQUmAO/0cNvaKeUuIiIpZr8nZzkcZvZ34Eyg0MxKgNuAnwJzzexqYCvwaQDn3AozmwusBMLAdc653isx3xPQVRQnIiKpIWkB3Tl3RSd3nd3J428Hbk9Wew6KqtxFRCTFHClFcUcWpdxFRCTFKKAnsqfKXT10ERFJDQroiQSUchcRkdSigJ6IUu4iIpJiFNATUZW7iIikGAX0RFTlLiIiKUYBPRGl3EVEJMUooCfSl6rcm2ugtam3WyEiIkmmgJ5IMN1f9oWU+18/Ac9/t7dbISIiSZa0meJSWqAtoPeBlHvFesjM7+1WiIhIkqmHnkhfqXKPtEJzNTRV9nZLREQkyRTQE+krKfemKn/ZWNW77RARkaRTQE8k2EdS7g27/WWTArqISF+ngB5TUd/CXS+vZ11ZXd85bK0xFtBDdRBO8fciIiL7pYAeE4pE+flza3hrY0V7UVw03LuNOlxtPXRQL11EpI9TQI8ZUpBFflYaa8rqIBAAC/aBHnpF+3UVxomI9GkK6DFmxqTifNburPcLghl9K6A3KqCLiPRlCuhxJg7JZ01ZHc65WEDvSyl3BXQRkb5MAT3OxMF51DS1squuxZ+gJeV76LshLdtf1xi6iEifpoAeZ+IQP6Pa2rZK95QP6BUw6KjYdfXQRUT6MgX0OJOKfUBfs7POH4ue8lXuFdB/lN85UcpdRKRPU0CPMygvk8K8DN9DD6T3gR76bsgdBNkD1UMXEenjFNA7mFicz5qy+tRPuTvnU+45hZAzUGPoIiJ9nAJ6BxOL81lXVocLpqd2lXtzjR8yyBkE2QPUQxcR6eMU0DuYWJxPYyhCyKX4xDJtx6DnFvqArh66iEifpoDewaQheQA0RQKpHdDbjkHfk3JXD11EpC9TQO9gQqzSvT4cSO0q97YTs+QMbC+Kc6532yQiIkmjgN5BQVY6w/plUddKavfQ41PuOQMh2gqh+t5tk4iIJI0CegITivOpCVlqB/T4lHv2QH9dhXEiIn2WAnoCk4bkU90CLtLa2005dI0VkJ4DGTm+KA5UGCci0ocpoCcwsTifkAvSGmrp7aYcuobd/pA18Cl3UGGciEgfpoCewKTifFpJI9yawgG9saI9oCvlLiLS5/VKQDezzWa2zMyWmNnC2LKBZvaCma2LXQ7ojbYBHDU4j1bSiIZTOeW+2xfEQVwPXSl3EZG+qjd76HOcczOcczNjt28GXnTOTQBejN3uFdkZQbIyM3EpXRQXm/YV2sfQ1UMXEemzjqSU+0XAA7HrDwAX92JbyMvJIZDSRXFxY+jBdMgs0Bi6iEgf1lsB3QHzzGyRmV0TW1bsnNsBELscnOiJZnaNmS00
s4Xl5eVJa2BBXg4BF6YlHEnaayRNqBFaG/2Z1tpk91fKXUSkD+utgD7bOXc8cAFwnZmd3tUnOufuds7NdM7NLCoqSloD++fnkk6YjeUNSXuNpGmbVKYt5Q46haqISB/XKwHdOVcau9wFPAbMAsrMbChA7HJXb7StzcD8XNIsysrt1b3ZjEOzZ9rXuB665nMXEenTejygm1mumeW3XQfOBZYDTwJXxR52FfBET7ctXv/8XAB+8tRS3li/uzebcvAa4qZ9baMeuohIn9YbPfRi4HUzex94B3jaOfcc8FPgHDNbB5wTu91r0tIzARhREOTz977Dw+9s7c3mHJxEKXf10EVE+rS0nn5B59xG4NgEyyuAs3u6PZ0KpAPw4BeP5+uPbubmR5exaXcD3zn/aAIB6+XGHUBbyn2vorgB0FwDkTAEe/zfLiIiSXYkHbZ2ZAn6gJ6f5rjvCydw5Umj+OOrG/ndgvXd9xo1JfDWH7r/tKYNu8GCkNmvfVnbbHHNNd37WiIickRQQO9MMMNfRlpJCwb40UXTOHHsQJ5aWtp9r7HwPnjuZqja3H3rhPZj0ANx/17N5y4i0qcpoHcm1kNvO4WqmXHOlGLWltWzrbKxe15j1yp/Wbaie9bXprFy74I40HzuIiJ9nAJ6Z/YE9PbZ4uYc7ee6WbCmm46o27XSX5Yt7571tYk/01qbnLZTqCqgi4j0RQronWlLuUfbA/q4wlzGDMrhpdXdENBDDe2p9u4O6I0JArp66CIifZoCemcCe6fcwafd5xw9mDc2VNAYCh/e+svXAA7Sc5OQcq9IkHJXD71HbX0b3vhtb7dCRD5AFNA7kyDlDnD20cWEwlHeWF9xeOtvS7dP/hhUboKW+sNbX5tI2M/ZntMhoGf185Xvms+9Z7zxG5j3PW1vEekxCuidiaty32PnMmYVQ25GkBcPN+2+axWkZcHkjwKuvUDucLX1wDum3M18L10p9+RzDra9AzjY9m5vt0ZEPiAU0DvTocqdujL409lkPH0Dp00o4uU1u3CHc/z4rpVQNAmGHONvly07vPa2aUgwqUwbzRbXM6q3QENsh2/bW73bFhH5wFBA70zHlPtbd0GkBdY8zUXD69hR08yqHXWHvv5dq2DwFOg/yp+rvLvG0RNN+9pG87n3jLZeeVZ/2KqALiI9QwG9M/FV7k3V8O49MP4sSMvmzMqHAXhpddmhrbuxEup2wODJPhVePLUbA3pbDz1BQM8ZqDHdnrDtbV/seOzlsH0RhFt6u0Ui8gGggN6Z+Cr3hfdAqA4+/EM47kqyV/6DM4e2Hvrha23j5YOn+su2gN4dU8A2JDh1apvsAQroPaHkHRh+PIyeDeFm2PH+4a8z0gqPfx1Klxz+ukSkT1JA70xbyr251s+3ftSHYegxcMr14KJcn/MC722rpqL+EHpfbRXugyf7y+Jp0FIL1d1wRrc9KfdOArpS7skVaoCdy2HkiTDqJL+sO9Lu296BJQ/5z6KISAIK6J1pS7kvuh8ayuHU/+dvDxgDUy/huF2Pk+8aeGVt+cGve9cqf+KUgmH+dvE0f9kdE8w0VvhD1Np2SOLlDIRwE7Q2Hf7rSGLbF4OLwMhZkDcYBo7vnoC+fr6/XPvsPodSioiAAnrn2gLijiUwYhaMPqX9vtk3EGyt52u5L/OTZ1Zz96sbqG9pn2gmHInyzLIdfOkPz3Pj/zxOSziy97p3rWwfP4dYT926Zxy9YXfigjjQbHGHqmE3/Olsf1z5gZS84y9HnOAvR50MW988/OGU9fP9uHxzDWx+7fDWJSJ9kgJ6Z9p66ACnfrM9+AIMPRbGn8VXMuYxrTiDnzyzmtk/fYk7563hDy9v4PSfL+C2h17ix2XX8/0d1/GLp+PGPZ3zAb14SvuyzDwYOHbfHvrO5fD7E2Hjy11vd6JpX9v01BnXmqqTu/6e1FQNf70Eti+EN3934J2ube/CoAnt23rUiX5771536G2oK4OdS+Hk6yA9B1b989DX
JfvavqhvfWblA0sBvTNtPfSiyTDx/H3vn30j6U3l3D9tOY9fN5sTxw7kNy+t52fPrWbigADzh/6eoYEqBlo9de/8jRdWxiri63b4XtbgKXuvr3iqD+DxXvxPKF8N//t52LW6a+1uSDDta5ue6KFveg3+ezy886fkvUZPaamHhz7th0g+eQ9k5u+/l+6c76GPnNW+bNTJ/nLrm4fejg0v+cvJH/W1HKufhmj00Ncn7cpWwp8/DM/c1NstETlsCuidSc+FiRfAuT/e+7zibcaeDmNOg+dvYcbSH3H35VN4+aYzmX/jbO7Pv4t+1auwy/5KtHg612bN49v/t4TS6qZ9C+LaFE+Dyo2+qAp8T2/d83DCVyAtE/52KdQfYLzeOT/e31kPfc987odR6R5ugU2vJk4hh1vgqRshGoYXf3Tg9h7JWpvg77HDzj59H0z/FJz+77Dhxfbx7I4qN/oahrZ0O8Cgo/z/Y9vbh96W9fMhdzAUT/dTBdeXQYlmoDtszsG874KLwvJH/P9PJIUpoHcmEIDPPAwTPpz4fjP47D/g5Ovh3T/DH09jTMtqjnr3Nlg3Dy78BUy6gMBJ1zI6spXjI0v5xsPvEdkZS9nu00Ofxl5TwC643Y+Ff/gHcMXDUL8LHv4MtDZ33uaF9/oZyoYfn/j+7ki5P3czPPAxeOVn+973+q+gYj2c/1NobYAXf3Dor9ObohGYexVsfh0u+R8fRAFmfcUXRc77vn9MR9ti4+cjT2xfZtY+jn6obdnwEhx1tv9MTjjXH1K5Wmn3w7buBb9tZ9/ot+nrv+rtFsmRbuUTMP+HXa+JibTC0rndc0hyFyigH470LDjvdvj8k75H96ezfVX8qd+EE672j5n2Scgp5PZh/+LdzVWsWPIWLm9Ie3BtUxw7Jr1sOWx5AzYugFNv9OPrIz4En/ijT+c+8fXEH44dS+G5W3xK9vgvJG7vgVLuzvmdkyeuS7zjsPFlv9NQMBxe/i94/3/b79u9Hl67A6Z+Ak661v+99yCULOpk4x3BXv6pz45c+N9wzKXty9My/Q7WrhX+ELKOtr3tZ/0rOnrv5SNP9L2/ukOYiKh0id8BOyq2Y5nd32eHVv2zx34k+qRIKzx/q8+gnPUfcNyVsORvUFArYh4AAB+ZSURBVFva2y2TI1XNdj8XxOt3wrJ/HPjxVVvgvgvg0a/43/QeoIDeHcadAde+Acd/Hk68Fs76fvt96Vkw80sM3fky104PwK5VvN8ylMVbO6S9+4+GjHw/jv7S7ZBXDDOvbr9/ykU+mCx/BOZ+jqaaCl5ZW85vX1zH2q074P++4HcSLvlj4iECYFNNhBbL4oVFq1hX1mHa2nALLY9eB09/C957kOqHrtq7F9pSB0/8m/8B/Pqbfrjhyev9B9U5ePqbPuCd/1/+8af/u38Pz9y093hva5PfaXj9V7Dk7z6dvHPZkTOb2vr58Op/w4zPwglf3vf+KRf7ox5e+vG+Z8greReGf2jf7d82jh4/r3uoAUrfO3BQ3vAiYDBuTvuyyR+Dqs1JOO1upX//h7qjEA7Byid9piLS4fTCocbYZ/fz8Ph1sPqZ3j18cuG9ULHOD6kF02H2DT713pVT3jbX+J790rnw5l2+x/bWH46cz3BXLP0/+PsVPkvxQd0x3LXa77yvnde1xz9/qx9OHDwVnvuOr1fqzIrH4H9O86fJ/tR9MGZ297T5ANJ65FU+CLL7w8d/k/i+E66G13/Jtwe8RDRtB3MjU7n1rjf4+LHDuOHsCRRkp+EcDBh0NMFljxBsrqT81B9RWRkhFK6htrmV6sZWqoKXMHJMFbNX/Y6KlW/y69D1LHYTGPfqXUwIbMKueiphQVxTKMLvF6zn7lc38kpaLo015Zz/69e48sRRfPOcieSEKqi67zKKa97nt+GLCaUX8K3Nf+HN313NpC/9kYF5mTD/B1CzDb70nD/O/bK/wp/P8cMAJ17rx9U/cgfkD/EvmlUA5/wIHrvG92ZnfBaW
/8MX+tVs23cb5Q+DM2/2jwv20seypgQe+YofDrnwF3sf2dDGzGdl7jkH5v2HH15Iz/ITEO1a6XdkOhp6rD+z3ta3YdKFsPgBeOXnfix8zGnwsV/DoPGJ27R+Pgw7bu+T7Rz9EXjqm76XPmRa97z31iZ48JNQujhxmyKt8P7ffUHeuDlw7GXtNRnO+bbMv619HDqzAMac6g/33LkcVj8FoXrIG+Jfa8mDvmJ//Fkwfg4MOdZnqTJyuuf97E9jpc8wjT2jveB1wBifjVl0P5z2rc4LS3evh79evPdn2IJ+7oF374GP/hLGnpbsd3B43vidrx0IZsKaZ/znffY3fHatfqf/f5Ut9wW8Uy/xn4eO34Vwiz98MtQIgSBYwG+HrAJfM5IzyJ/LoJPOxQGFGmD9i76GpaXO3w7V+9cZd6b/v/UbfvDrbaryO5bvPeQ/621O+DKce7v/Lieyfj6sfBzmfNd///54ut+Gl/zP3o9rbfKZ0kX3wfCZ8Kl7/Gerh9hhnTGsl82cOdMtXLiwt5vRNY98BVY8CtEwLRf+ht9Xn8Tdr22kubW99/rjtHu4Mu1FSt1A5rTcSQsZCVd1cdEObgvdQf9QGY1jzyN34zP8d/hSBpx3C1efOhaLffnqW8I8v3wnd76wlu3VTXziuOH8bPd1mIvyXOY5LNpSRU6acZU9RV60nnuKvs15n/4aQ/tlsey+b3BK2UP8lssZM2MOH1vyVWpnXEPWR39GRlrsS1q50VcIN1b4D+/VL+z9BXYO7j3fj6v3H+l7pEOPpeSE75I95gQGUeNrA2q2wdv/AyXvEh5wFA/mfp43M07hS6eOY9bYgXvez2Fxzn+Z63f5H62mKv9DVjjR/1hFWuG+C2HXKsJffolHtmQxdVg/pg3vl3h9T3/LZxr6jYKzv+cDwF8vgSsfaU+Px7vvI/4sbMF0v91GnQITzvGZinAznPkdOOWGvScEaqqCn4+D026Cs7679/ruPd//0F37L//e1j7ve5dFk/wOR3p217dNNAr/+KIfHzzxqz71HGmFObf6uoGl/+uHU6q3+uK8hl1+B2XKxf49vPMnn30oOhrO+p4//8HGl/1f1WY/idKUj/uAOXq2z/xsed3vHKx+BupiaW4L+EP+Rp3oh6rGnOaDRVc11/iq9bpSv9PRcVirzbM3wzt/hK++tvcOUfkaf5jo6Tf5NHxHpUv8Tg/ARb/3Ozy5hT5wbXgJnv5//v0e+xnf8090xsP9cQ52r4U1z/pgWTgJJp3vMzzxM1duft3fnz/Eb6d+I7q+/hf/06eMp1zs38Pqp+Bfv/Y7o4E03wNtk54DrY2+GPPkr/uAX/oeLH3Y90Cba/b/ehbwGavpn/bPzSvq/LGRMNRu98NWq56EdfP9JFjBDH90SUae/2upg5rYjJpDjvHftYJhfucye4D/fxRP33dHItLqsyiv/MzvGBRP852HKRf5E2+9+Tu/7FP3+u9QvNZm+MPJgPlMbHqW346v3QGfe8zvlILfEXrkan9k0uwb/Wco0QRf3cDMFjnnZu6zXAG9h2xfBH+K/eO/8hIM/xCl1U0sWLML5yBgxoStczlhxY9YPP02to2/jPRggPRggH7Z6fTPSad/djoF2elkpQf9l+nJG2Dl40TGzuHfAt/lmRW7uGLWSI4fNYDnlu/ktfW7CYWjHD0knx9dPI0Txgz0OxbL5u7VtLJAMVvPvZsTTjqzfWE0Su3DV1Ow9lGqXB5VLo8LQ/9Fi2UyrF82x43qz6yxAzktcyNjFt2Offw37XUA8Xa8D3fPgfwhVJ9yC/+xfjJPLS8jOz3Il04dwzWnjadfTjouGuXNZx+k+N2fMZ4SNjKcx1pPZmPxuXz87DM4Z3IxgUDiwO6cIxx1pAcMtvzLB9q18/b+cYqGfS+qo9zBvicZDcOqJyk7/498bfFI3ttaTVrA+MbZE7j2zPGkBRP0NDa+7A9j27nUD5eE6uA7W3y2pqOXbodXf+7TdR++zRe3mUHd
Tnjm2/5HrHgafPRXMDJWJb/iMT+U8qV5PsjFa+tlXfDf8N5f/LBF3hC/s1I8HS59oPNefydti374P2k96XoyG8v8DsuaZ3zgDjfDsON9BmXCuf79LnrAp5xDdX4bnvVdmHHlvtmV2lJfu9FZz8c5nxnZudTXgexc6rM9bb35qZf4k9wMm5H4uWuf9xmPncvbf+jBt3vqJ3x2bPiH/I7cysd972zb23D8VYkzav/7Odj4Cnxzmc9Etdn8L3/UQ1Y/+NzjUHjUvs9tbfLDNf/6tW9bXjEUDPUBZ+B4PySX6H9SscEPAax5pj3DUTjRj8FGWvxrjj/Lf1a2veM/x8FMfx/md5Kmf8rvNFds8DvQFev9eoZM94Fv6LF+h2/xA/ChL/psWtvOknO+B7phgW/fkOl+ZzeQ5n8r3rwLylf54BoJ+UA/+WM+UOcP8UMV0Yi/bK7xGZDG3T4DtW6+PzW0BX0mZvAUv51aG32vu7HCv8/a7e3fz7whfv1TPu53fOM/U875Ha+1z8Ka53xdketwCGe/UXDcZ33A7j/Sf56e+bYPtBPPhzNv8dsjvqOw7gV47Gu+XaffBMd9zs/0CPDyz+Dln8CVj/riVIgF+VN8m699w9cLzfue/+5f8j/tQT5JFNCPBH8+x38Aby2FjNx972+ugfcfhplf6tqenXN+T33YcUTT87jjhTX8fsEGAIb3z+a8qUO4YPoQPjRqQHswjEahucMkGpkFidPc4RDu75fBhgUsP+9hVqdPpaSqiQ3l9SzcXMXOWl841y87nU8cP5yrTh7DmMJ931do1wbuX9bMr17ZRiTq+Orp49hc0ciT75dSkJXGV04bx6KtVby8ppxZowr47fT1FK6dS3CbLyRZHh3DmvTJjOyXzvD8IMU5RiAtjbJIAStqs3hzVzo0VXJ19ssMC23BZfXHpl4MWf1xDmqbW6ltiTJ02HDSCob6H6HMfN/jauvt1O1g1ajP8IlNF5GRFuC7H5nMa+t288/3Szl+VH9+edkMRg9K8D+LRmHZ/8FLP/I/3Fd3Mh7XUgclC31BW6Je5+qn4embfJpz5hfh7Nt8wF71T/j2xn3/P1Wb4dfH+usDx/s08TGX+h/lx67xPZ6LfusDYtvr1+7wP2IDxu5ZX9VbDzLguet4o+ACvl73RVrCjjsuPZYLpw3xOxRrnoFjLvM9oY6ZkpZ6/3keMcsXb3aX1iYfqJf9nz9iJBLyQevk6/0PciDgt+UL3/c7cP1G+sLD4qn+L6u/70Uunet3DPqPguptgPM7VNNihZuJvoOlS+DuM/wPfuFEyC3y2Y43f+/X87nHDtwj3rXK7zjUlrb/VW70P/5TLvK9t2EzfHB+4zew6ikfPMed6XvkE8/3r9FS74tj1zzrMwD5Q3zmYfwc/35rt/virKVzfT1Am8wCH5ijEd+WaNxUwafFsg8Hk/Vyzrdj1T/96x790YP7f+9a5du4/BEf5NOz/WHBGTm+V91/tN+2A0b7eT8S1aF0Jhzyv2dNVf6vcpP/3298GTCfgdm5zL/GBT+DSRd0vq66nfDE9bD+Bf//mHi+HyJ76ptw9IXw6fv3fvzm1+H+j0DBCKgtgQnnwcV3dT5c040U0I8EpUt87+DErybtJRZvrSJoxjEj+nVPqjoc8inxDj0L5xwlVU0s3FLJS6vLeW75DsJRx5xJg7li1ihC4Shry+pYt6uOJVurKa1p5pwpxXz/o1MYOdCPk64sreWOeWt4cfUustODfOf8SXz+5DHtOx8124ksf4zahf9LRs1GmiJBmkmnlTTSLcogV022hfa0aQXjub/1bN7KPpOTJo1gS2UjK0tr90zLO7RfFl8/czyXnjCSzDQfVJtCEV5aVcbTbyzm2S2O0yYW8/NPHsOQfr5H+cSS7fzH48uJRB0fO2YYxQWZFOX7v4LsdHIz0sjJCJKd5ijMTScrK3Gqu6aplZWlteRkBMnNTCMvM428rDRy0oPt77eljtD8H5O+8G6a0gdA
JETZoJNouOgeJg8tINgxQ/H23T6tPPWSvXcSqrf5FHrJuz7Y1+/yPem2/10wg7q8sSxrLmZm85u8547iW5m3cfLEYazfVc+SbdXcdO5ErptzVPd8hrqoMRTmdy+tZ1tVE+dNLeasoweTE6lvPylNzTZflDlogu+h5RbBGd+BD30h8Q5wS50PJGue9YdyTv0EDG4/AqG2uZX5K8t4ZW05x47oz5UnjfbDSa/d4Xt/DeV+2t9QnR9S+szcg0+jt6nb6d/Dwnv9iZgGjPE7ZVn9/fjtrGsgv/jQ1u2cD1otdVA4wW+Xtv9bOOR7pjuX+kzJ0RcecHWNoTAPvrWFV9aW89kTR3PBtCFd+hxsqWigMC+T3MwjoDSrarMfJ1/3vA/Mp36z68NQ5Wt91uv9h/1nICMPrn+3/dwb8Z76pu+dn/Mj/7veQ98XBXRJql21zTz09lYeensru2NnoDOD0QNzmFicz+WzRnLW0Yl/sNaW1dEvO53igk7SsjGhcJSFWyp5ZU05u+tDzJlUyBmjs8hvrQAcoQETWLBmF/+3sIR3N1cyvig3Ng5eQEFWOn9+fROLtlQxpCCLz508mrVldbywsozGUISi/Ey+cfYEPnviqH1+vEqrm/j+EytYsq2KioZQp0XBWekBzphYxPnThnDW0cVkpwd5ec0uHl+ynfmrdhEK7zu7mxnkZvgAHwwY26ubmGqbuT39HmYENvCt0Nd4JHo6+ZlpHD96ABMG5zFyYA4jB2YzYkAO+VlppAUCpAeNtGCA8roWNu2uZ3NZNaNX/oHBTRuJ5A3F+g0ja+BIdlTVUbLmPYaFNjE5bTuB/GLqPvl3xo8aiZnR3BrhO48s5YklpVxy3HD+6xPTSQsYK0preWdTJWvL6pgyrICTxw9i4uD8TodBEqluDLFqRx2D8jKYMDhvr+386tpybn1sGSVVTQzMzaCyIUR2epCzJg/mvKlDOGlMAYO3PufHOnevh5Ovo+FDX+Ot7SE27W7gtAlFTCzO22/gqWoIsXF3w57/+2vrymmNOAqy0qhtDjO2MJdbL5zMhycP3ns9rc3+CI5O1t3QEiYUjjIgN3HNy16aa2DhfT7FffRH/eFyB+jttkaivLaunHGFeQkzYN2lLZD/8ZWNVDSEKMzLZHd9C2dMLOI/L5qaOEMFLNxcya9fXMdr63ZTkJXGZ08azRdPGcPgA3yfD5dzjvL6ForyMg95x3NDeT39stMpzMvc985Iq88Q5QxqP3NiR9EotNS0F4j2EAV06REt4Qhvb6xkYG4GRw3O8+P9RwjnHP9aX8GvX1zLu5ur6J+TzgXThvKxY4dy4thB+/aAEwhHolQ2hiiva6G2KUxjKExjKEJjKMyK0lqeX7GTstoW0gJGTkaQ2uYwg3Iz+NixwzhzUhGRqKO+JUxDS4T6llbqWyLUN4dpaAnTEo5w1OA8pg7vx9QhuQyuWU5p/jTe3VLNO5sqWbSlis0VDXsVUu5P/5x00gLG7vrQXstnjRnIl04dyzlTihO+Z+ccv3tpPXe8sJYRA7KpbAjRGPLjmwNy0qlq9CncgbkZzBw9gJyMIBEH0agj6hzpwQBZ6QGy04NkpAXYUtHIitJatle3H6Y2vH82p08s4oyJhcxbWcaji7czriiXn37iGD40egDvbq7kqaWlPLtsJxUNvv3jinI5cexAiguyeHNDBYu3VtEaaf/9Gl+Uy4XTh3L6xCIq6lvYUN7AxvIGNu6uZ9PuBqobW/d6/QunD+GC6UOZMaI/L6/dxY+fXsXG8gZOGT+IL582lhPGDCQ/a++ev3OOrZWNvLe1mkVbqli8tYpVO2oBOG1CEZ/60AjOmVK853Nf3xJmzc46dtQ0MSAng6L8TArzMumfnX7AnaHGUJj/fXcbf3p1I6U1zQQDxkUzhvFvZ01gbFxgj0YdG3c3UN0YwswImK/JcfjPaygSJRxxZKUHGVeUy6DcjL0KZ9/cUMGra8t5dvkOdteHOG1CId84ewIzRvbn
L29u4c4X1hKKRPna6eOYNrwfGWkBMtICNIUi3PuvTfxrfQWDcjO46pQxrN5Zy3PLdxIMGBfPGM7koQXUNLVS09RKdWOI/jkZHDuyH8eO6M+YQbkEAoZzjtqmMDtrm2kJRxiUl0lhXsaeLFpHVQ0hHllcwsPvbmP9rnqKCzI59agiTp9YyCnjCynKTxCc49Q2t/LkklIefncry7fXkhYwzplSzBWzRnHqUYUHtZN6INGo471t1Rw/qn+3ZrsU0EVinHNsq2xiaP8s0hMVux2GaNTxfkk1z63YSWV9iAunD+XUCYXd9jptvZJtlU2UVDXSGIoQjkRpjTjC0SgDcjIYV5TLuMK8PT3G5tYIO2uaKa1uon9OBlOGFXTptZ5ZtoMH3tjMpCH5zBo7kFljBjK4IIuSqkbe2ljJmxsqeG9bFeGIIxhoDyStkSjNrVGaWiM0t0YY3j/b76QMK2Dy0AJ2xIpB/7W+gvqWMGkB42tnjOf6s47aZwcwHImyvLSWtzdW8PamSt7dVEldS5ipwwo4dUIhp08oYvSgHBasKeeZpTt4e1MF0biftOKCTMYV5jG2KJdxhbmMjfvr+APbGonyt7e38sv5a6lubCVgMG14P04aN4iAGcu317C0pJraZj+Ek5sR5LhRAzh+VH8izvHY4u2U1jSTn5XGcaMGsGl3PdsqEx9rnxYwBuZmUJiXyaA8f5mdESQrLUhmeoBQOMqji0uoamxl1piBfHH2GBZtqeLBt7cQCke5eMZwhvbPYsm2apZuq6Eu7myPB1KQlca4ojzSg8Z7W6sJRx05GUFOPaqQa04fx8wxex8dUFbbzI+eWslTS3fss67CvAy+evp4PnvSKHIyfKp9S0UD97y+ibkLt+3Z+SzISqNfTjq760I0tfqdw/ysNAbkZFBW20xLguxVflaa3z65GQzMzWBQXiZ1za3MW1FGKBLluFH9OWdKMStKa/nX+t17dtj656QzYkA2I/rnMKx/Nmb+O9ASjlLb1Mpr63bT1Brh6CH5XDpzJKXVTTwS29bD+2dzwpgBpAVjWa9AgILsNIb1z/Z//bLpl51OKBylJezXaQZFeZkMzM0gLRggHInyzuZKnl22k+dW7KS8roWn/u3Uzo+YOQQK6CJyRAmFo7y3tYrBBVl79Tj3JxJ1NITCFGQlLhrdXd/C4i1VDOnn19mxh90VTaEIi7dW8fbGCt7aWMl72/wkUEcPKWD6iH4cM7wfx4zoz6Qh+XtlOKJRx5sbK3hkUQkrd9QyfnAek4fkM2lIASMGZFPd2Mru+hbK61oor2+hsj7E7voWdjeEqGxooSkUoaU1SnM4QjjqOGvSYK49c/xeAba8roW7X93AX9/aQjjiOHpoPseO6M+Mkf0Z0i+LqIOoc7T9rqcHA3uGZOpbwnsyFht2NdDYGuHkcYM4fWIhHxo9oNMecZttlY3UNLXSEo4SCkeJOsfxowaQnZH4eW1DEQXZ6Xu2UzgSZX15PUu31bCkpJr65jBD+mUxOD+T4oIsMtMCVDSE2F3XQkVDaM92qmwIxYa7HB87dhiXzxrJ0UPad0wjUcfy7TW8vamCrZWNlFQ1UVLVxI7qJsyMzLQAmWkBstKDnDhuIJefMGqvOqOWcIR5K8qYu3AbWyoa/U5y1BGORKlrDhOOHjhOmsHAnAzCUUdNUytZ6QHmTBrMBdOHcvbRg7u1tkABXUTkEDS3RjDjgAGvO0Wjbr+p34aWMMGAHVFDWn1VJOoor2the3UTpdVN1DWHyYjtIGSkBYhGHbtjOyHl9S1EIo4zJxVxxqSiPVmL7tZZQD8CyhFFRI5cvRE0DzSOe0RUkn9ABAPGkH5ZDOmXxYdG92zx28E64uZyN7PzzWyNma03s5t7uz0iIiKp4IgK6GYWBH4PXABMAa4wsyn7f5aIiIgcUQEdmAWsd85tdM6FgIeBi3q5TSIiIke8Iy2gDwfiT8VVElu2h5ldY2YLzWxheXl5jzZORETkSHWkBfRElSB7
leE75+52zs10zs0sKtrP2XtEREQ+QI60gF4CjIy7PQIo7aW2iIiIpIwjLaC/C0wws7FmlgFcDjzZy20SERE54h1RBzM658Jmdj3wPBAE7nXOrejlZomIiBzxUnqmODMrB7Z082oLgd3dvM4PIm3H7qHt2D20HbuHtmP3ONztONo5t08RWUoH9GQws4WJptSTg6Pt2D20HbuHtmP30HbsHsnajkfaGLqIiIgcAgV0ERGRPkABfV9393YD+ghtx+6h7dg9tB27h7Zj90jKdtQYuoiISB+gHrqIiEgfoIAuIiLSByigx+g87IfGzEaa2QIzW2VmK8zsG7HlA83sBTNbF7sc0NttTQVmFjSz98zsqdhtbceDZGb9zewfZrY69rk8Wdvx4JnZN2Pf6eVm9nczy9J2PDAzu9fMdpnZ8rhlnW43M7slFnfWmNl5h/PaCujoPOyHKQx8yzk3GTgJuC627W4GXnTOTQBejN2WA/sGsCrutrbjwfs18Jxz7mjgWPz21HY8CGY2HLgBmOmcm4afufNytB274n7g/A7LEm632G/l5cDU2HPuisWjQ6KA7uk87IfIObfDObc4dr0O/+M5HL/9Hog97AHg4t5pYeowsxHAR4A/xy3WdjwIZlYAnA7cA+CcCznnqtF2PBRpQLaZpQE5+BNlaTsegHPuVaCyw+LOtttFwMPOuRbn3CZgPT4eHRIFdO+A52GXAzOzMcBxwNtAsXNuB/igDwzuvZaljF8B/w5E45ZpOx6ccUA5cF9s6OLPZpaLtuNBcc5tB34BbAV2ADXOuXloOx6qzrZbt8YeBXTvgOdhl/0zszzgEeBG51xtb7cn1ZjZR4FdzrlFvd2WFJcGHA/8wTl3HNCA0sIHLTbGexEwFhgG5JrZlb3bqj6pW2OPArqn87AfBjNLxwfzh5xzj8YWl5nZ0Nj9Q4FdvdW+FDEb+LiZbcYP+ZxlZg+i7XiwSoAS59zbsdv/wAd4bceD82Fgk3Ou3DnXCjwKnIK246HqbLt1a+xRQPd0HvZDZGaGH69c5Zy7M+6uJ4GrYtevAp7o6balEufcLc65Ec65MfjP30vOuSvRdjwozrmdwDYzmxRbdDawEm3Hg7UVOMnMcmLf8bPx9THajoems+32JHC5mWWa2VhgAvDOob6IZoqLMbML8WOYbedhv72Xm5QSzOxU4DVgGe1jv7fix9HnAqPwPw6fds51LBSRBMzsTOAm59xHzWwQ2o4Hxcxm4AsLM4CNwBfxnRdtx4NgZj8ELsMfyfIe8GUgD23H/TKzvwNn4k+RWgbcBjxOJ9vNzL4LfAm/nW90zj17yK+tgC4iIpL6lHIXERHpAxTQRURE+gAFdBERkT5AAV1ERKQPUEAXERHpAxTQRT7AzCxiZkvi/rptVjUzGxN/xikRSa603m6AiPSqJufcjN5uhIgcPvXQRWQfZrbZzH5mZu/E/o6KLR9tZi+a2dLY5ajY8mIze8zM3o/9nRJbVdDM/hQ7r/Y8M8vutTcl0scpoIt8sGV3SLlfFndfrXNuFvA7/CyKxK7/xTl3DPAQ8JvY8t8ArzjnjsXPnb4itnwC8Hvn3FSgGvhkkt+PyAeWZooT+QAzs3rnXF6C5ZuBs5xzG2Mn39npnBtkZruBoc651tjyHc65QjMrB0Y451ri1jEGeME5NyF2+ztAunPux8l/ZyIfPOqhi0hnXCfXO3tMIi1x1yOobkckaRTQRaQzl8Vdvhm7/gb+bHAAnwVej11/EbgWwMyCZlbQU40UEU97yyIfbNlmtiTu9nPOubZD1zLN7G38jv8VsWU3APea2beBcvyZzAC+AdxtZlfje+LXAjuS3noR2UNj6CKyj9gY+kzn3O7ebouIdI1S7iIiIn2AeugiIiJ9gHroIiIifYACuoiISB+ggC4iItIHKKCLiIj0AQroIiIifcD/B1GbLzAzxDacAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 576x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "# 绘制训练 & 验证的损失值  \n",
    "# 模型每一步迭代的训练误差和测试误差，都保存在history里。\n",
    "plt.figure(figsize = (8,4))\n",
    "plt.plot(hs.history['loss'], label = 'Train')  \n",
    "plt.plot(hs.history['val_loss'],label = 'test')  \n",
    "plt.title('Model loss')  \n",
    "plt.ylabel('Loss')  \n",
    "plt.xlabel('Epoch')  \n",
    "plt.legend(loc=0)  # loc=0 表示最优位置，自动。也可以手动设置，如：'upper left'等\n",
    "plt.show()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# 3. 模型的保存与加载"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "x_test中前十个样本 预测值 是：\n",
      " [34.54 20.23 21.08 34.51 24.73 48.51 32.91 27.86 20.58 30.51]\n",
      "x_test中前十个样本 真实值 是：\n",
      " [32.5 20.  20.1 37.3 23.1 50.  32.  23.5 21.7 23.6]\n"
     ]
    }
   ],
   "source": [
    "from tensorflow.keras.models import load_model  \n",
    "\n",
    "# 保存模型 ,生成模型文件 'my_model.h5'  \n",
    "model.save('model_MLP.h5')  \n",
    "# 加载模型  \n",
    "model_reload = load_model('model_MLP.h5')\n",
    "# 验证加载后模型效果\n",
    "y_test_pred_reload = model_reload.predict(x_test_std)\n",
    "print('x_test中前十个样本 预测值 是：\\n',y_test_pred_reload[:10].flatten().round(2))\n",
    "print('x_test中前十个样本 真实值 是：\\n', y_test[:10])"
   ]
  },
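  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick sanity check (an illustrative sketch): the reloaded model should reproduce the original model's predictions exactly. The variables `model`, `model_reload`, `x_test_std`, and `y_test_pred_reload` are assumed to come from the cells above."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative check: compare predictions from the original and the reloaded model.\n",
    "# Saving/loading preserves the weights, so they should agree to floating-point precision.\n",
    "y_pred_orig = model.predict(x_test_std)\n",
    "print('Max absolute difference:', np.abs(y_pred_orig - y_test_pred_reload).max())"
   ]
  },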
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<h3 style = 'color:red'>The above is the complete workflow of training and prediction with a fully connected neural network. Next, let's look at how to improve it.</h3>"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "---\n",
    "---"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 脑洞：如果仅一个输入层，一个输出层，是不是等价于线性回归，来试试看咯"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Model: \"sequential_1\"\n",
      "_________________________________________________________________\n",
      "Layer (type)                 Output Shape              Param #   \n",
      "=================================================================\n",
      "dense_2 (Dense)              (None, 1)                 14        \n",
      "=================================================================\n",
      "Total params: 14\n",
      "Trainable params: 14\n",
      "Non-trainable params: 0\n",
      "_________________________________________________________________\n",
      "Epoch 1/100\n",
      "81/81 [==============================] - 0s 1ms/step - loss: 193.2418 - val_loss: 52.4583\n",
      "Epoch 2/100\n",
      "81/81 [==============================] - 0s 794us/step - loss: 28.7668 - val_loss: 33.3592\n",
      "Epoch 3/100\n",
      "81/81 [==============================] - 0s 577us/step - loss: 22.6512 - val_loss: 31.4833\n",
      "Epoch 4/100\n",
      "81/81 [==============================] - 0s 631us/step - loss: 22.3835 - val_loss: 31.3105\n",
      "Epoch 5/100\n",
      "81/81 [==============================] - 0s 654us/step - loss: 21.9332 - val_loss: 30.7877\n",
      "Epoch 6/100\n",
      "81/81 [==============================] - 0s 704us/step - loss: 21.6495 - val_loss: 30.3077\n",
      "Epoch 7/100\n",
      "81/81 [==============================] - 0s 643us/step - loss: 21.2699 - val_loss: 30.8453\n",
      "Epoch 8/100\n",
      "81/81 [==============================] - 0s 704us/step - loss: 21.1621 - val_loss: 30.6181\n",
      "Epoch 9/100\n",
      "81/81 [==============================] - 0s 612us/step - loss: 21.1461 - val_loss: 31.5385\n",
      "Epoch 10/100\n",
      "81/81 [==============================] - 0s 662us/step - loss: 20.9325 - val_loss: 29.4853\n",
      "Epoch 11/100\n",
      "81/81 [==============================] - 0s 918us/step - loss: 21.2188 - val_loss: 31.1541\n",
      "Epoch 12/100\n",
      "81/81 [==============================] - 0s 542us/step - loss: 21.2988 - val_loss: 30.6412\n",
      "Epoch 13/100\n",
      "81/81 [==============================] - 0s 666us/step - loss: 21.1506 - val_loss: 30.5593\n",
      "Epoch 14/100\n",
      "81/81 [==============================] - 0s 841us/step - loss: 21.2560 - val_loss: 30.1072\n",
      "Epoch 15/100\n",
      "81/81 [==============================] - 0s 1ms/step - loss: 21.3687 - val_loss: 30.0965\n",
      "Epoch 16/100\n",
      "81/81 [==============================] - 0s 1ms/step - loss: 21.0716 - val_loss: 31.1509\n",
      "Epoch 17/100\n",
      "81/81 [==============================] - 0s 835us/step - loss: 21.4293 - val_loss: 30.7262\n",
      "Epoch 18/100\n",
      "81/81 [==============================] - 0s 763us/step - loss: 21.3206 - val_loss: 29.7566\n",
      "Epoch 19/100\n",
      "81/81 [==============================] - 0s 809us/step - loss: 21.1706 - val_loss: 29.8960\n",
      "Epoch 20/100\n",
      "81/81 [==============================] - 0s 646us/step - loss: 21.2153 - val_loss: 29.8151\n",
      "Epoch 21/100\n",
      "81/81 [==============================] - 0s 581us/step - loss: 21.0601 - val_loss: 29.8240\n",
      "Epoch 22/100\n",
      "81/81 [==============================] - 0s 692us/step - loss: 21.4056 - val_loss: 31.0468\n",
      "Epoch 23/100\n",
      "81/81 [==============================] - 0s 627us/step - loss: 21.4103 - val_loss: 30.0941\n",
      "Epoch 24/100\n",
      "81/81 [==============================] - 0s 903us/step - loss: 21.2820 - val_loss: 30.0925\n",
      "Epoch 25/100\n",
      "81/81 [==============================] - 0s 667us/step - loss: 21.2936 - val_loss: 30.4521\n",
      "Epoch 26/100\n",
      "81/81 [==============================] - 0s 679us/step - loss: 21.0305 - val_loss: 30.3378\n",
      "Epoch 27/100\n",
      "81/81 [==============================] - 0s 656us/step - loss: 21.1116 - val_loss: 29.7618\n",
      "Epoch 28/100\n",
      "81/81 [==============================] - 0s 919us/step - loss: 21.4257 - val_loss: 30.3110\n",
      "Epoch 29/100\n",
      "81/81 [==============================] - 0s 654us/step - loss: 21.0552 - val_loss: 31.2853\n",
      "Epoch 30/100\n",
      "81/81 [==============================] - 0s 692us/step - loss: 21.4103 - val_loss: 29.8740\n",
      "Epoch 31/100\n",
      "81/81 [==============================] - 0s 651us/step - loss: 21.4246 - val_loss: 30.2242\n",
      "Epoch 32/100\n",
      "81/81 [==============================] - 0s 660us/step - loss: 21.1098 - val_loss: 31.4361\n",
      "Epoch 33/100\n",
      "81/81 [==============================] - 0s 795us/step - loss: 21.3230 - val_loss: 29.6991\n",
      "Epoch 34/100\n",
      "81/81 [==============================] - 0s 655us/step - loss: 21.3455 - val_loss: 30.0730\n",
      "Epoch 35/100\n",
      "81/81 [==============================] - 0s 641us/step - loss: 21.0455 - val_loss: 29.8301\n",
      "Epoch 36/100\n",
      "81/81 [==============================] - 0s 671us/step - loss: 21.2701 - val_loss: 29.9691\n",
      "Epoch 37/100\n",
      "81/81 [==============================] - 0s 749us/step - loss: 21.1945 - val_loss: 30.3278\n",
      "Epoch 38/100\n",
      "81/81 [==============================] - 0s 532us/step - loss: 20.9974 - val_loss: 31.7331\n",
      "Epoch 39/100\n",
      "81/81 [==============================] - 0s 773us/step - loss: 20.7112 - val_loss: 31.2658\n",
      "Epoch 40/100\n",
      "81/81 [==============================] - 0s 744us/step - loss: 21.2666 - val_loss: 30.9402\n",
      "Epoch 41/100\n",
      "81/81 [==============================] - 0s 1ms/step - loss: 20.8745 - val_loss: 29.8305\n",
      "Epoch 42/100\n",
      "81/81 [==============================] - 0s 982us/step - loss: 20.7963 - val_loss: 31.0617\n",
      "Epoch 43/100\n",
      "81/81 [==============================] - 0s 618us/step - loss: 21.3419 - val_loss: 30.5698\n",
      "Epoch 44/100\n",
      "81/81 [==============================] - 0s 655us/step - loss: 21.3863 - val_loss: 29.9882\n",
      "Epoch 45/100\n",
      "81/81 [==============================] - 0s 705us/step - loss: 21.3824 - val_loss: 29.7962\n",
      "Epoch 46/100\n",
      "81/81 [==============================] - 0s 586us/step - loss: 21.2654 - val_loss: 30.6566\n",
      "Epoch 47/100\n",
      "81/81 [==============================] - 0s 606us/step - loss: 21.4618 - val_loss: 30.2769\n",
      "Epoch 48/100\n",
      "81/81 [==============================] - 0s 663us/step - loss: 21.2797 - val_loss: 32.9341\n",
      "Epoch 49/100\n",
      "81/81 [==============================] - 0s 672us/step - loss: 21.4043 - val_loss: 30.3735\n",
      "Epoch 50/100\n",
      "81/81 [==============================] - 0s 810us/step - loss: 20.9485 - val_loss: 30.6557\n",
      "Epoch 51/100\n",
      "81/81 [==============================] - 0s 645us/step - loss: 21.4044 - val_loss: 30.8078\n",
      "Epoch 52/100\n",
      "81/81 [==============================] - 0s 645us/step - loss: 20.9761 - val_loss: 29.6222\n",
      "Epoch 53/100\n",
      "81/81 [==============================] - 0s 660us/step - loss: 20.8350 - val_loss: 29.8239\n",
      "Epoch 54/100\n",
      "81/81 [==============================] - 0s 645us/step - loss: 21.0278 - val_loss: 30.6224\n",
      "Epoch 55/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 21.2623 - val_loss: 30.8487\n",
      "Epoch 56/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 21.5205 - val_loss: 30.3599\n",
      "Epoch 57/100\n",
      "81/81 [==============================] - 0s 652us/step - loss: 21.0654 - val_loss: 30.1677\n",
      "Epoch 58/100\n",
      "81/81 [==============================] - 0s 665us/step - loss: 21.1481 - val_loss: 30.6063\n",
      "Epoch 59/100\n",
      "81/81 [==============================] - 0s 638us/step - loss: 21.5586 - val_loss: 29.9468\n",
      "Epoch 60/100\n",
      "81/81 [==============================] - 0s 584us/step - loss: 21.1103 - val_loss: 30.2876\n",
      "Epoch 61/100\n",
      "81/81 [==============================] - 0s 869us/step - loss: 21.1090 - val_loss: 30.0780\n",
      "Epoch 62/100\n",
      "81/81 [==============================] - 0s 576us/step - loss: 21.2375 - val_loss: 31.1896\n",
      "Epoch 63/100\n",
      "81/81 [==============================] - 0s 668us/step - loss: 21.2112 - val_loss: 31.6520\n",
      "Epoch 64/100\n",
      "81/81 [==============================] - 0s 757us/step - loss: 21.2714 - val_loss: 30.0199\n",
      "Epoch 65/100\n",
      "81/81 [==============================] - 0s 527us/step - loss: 20.9599 - val_loss: 31.6354\n",
      "Epoch 66/100\n",
      "81/81 [==============================] - 0s 660us/step - loss: 21.3273 - val_loss: 31.0174\n",
      "Epoch 67/100\n",
      "81/81 [==============================] - 0s 772us/step - loss: 21.2490 - val_loss: 30.0568\n",
      "Epoch 68/100\n",
      "81/81 [==============================] - 0s 684us/step - loss: 21.2258 - val_loss: 30.2969\n",
      "Epoch 69/100\n",
      "81/81 [==============================] - 0s 657us/step - loss: 21.0436 - val_loss: 29.6886\n",
      "Epoch 70/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 21.0417 - val_loss: 30.8545\n",
      "Epoch 71/100\n",
      "81/81 [==============================] - 0s 660us/step - loss: 21.2866 - val_loss: 30.7769\n",
      "Epoch 72/100\n",
      "81/81 [==============================] - 0s 1ms/step - loss: 21.5066 - val_loss: 30.5287\n",
      "Epoch 73/100\n",
      "81/81 [==============================] - 0s 785us/step - loss: 20.8739 - val_loss: 30.9158\n",
      "Epoch 74/100\n",
      "81/81 [==============================] - 0s 747us/step - loss: 21.3245 - val_loss: 31.2261\n",
      "Epoch 75/100\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "81/81 [==============================] - 0s 637us/step - loss: 21.2827 - val_loss: 30.9937\n",
      "Epoch 76/100\n",
      "81/81 [==============================] - 0s 659us/step - loss: 21.2271 - val_loss: 30.2076\n",
      "Epoch 77/100\n",
      "81/81 [==============================] - 0s 658us/step - loss: 21.0490 - val_loss: 30.3327\n",
      "Epoch 78/100\n",
      "81/81 [==============================] - 0s 663us/step - loss: 21.2632 - val_loss: 30.4530\n",
      "Epoch 79/100\n",
      "81/81 [==============================] - 0s 594us/step - loss: 21.1765 - val_loss: 29.7745\n",
      "Epoch 80/100\n",
      "81/81 [==============================] - 0s 660us/step - loss: 21.6573 - val_loss: 30.1429\n",
      "Epoch 81/100\n",
      "81/81 [==============================] - 0s 651us/step - loss: 21.0937 - val_loss: 31.2597\n",
      "Epoch 82/100\n",
      "81/81 [==============================] - 0s 764us/step - loss: 20.9353 - val_loss: 31.9587\n",
      "Epoch 83/100\n",
      "81/81 [==============================] - 0s 693us/step - loss: 21.3855 - val_loss: 30.8902\n",
      "Epoch 84/100\n",
      "81/81 [==============================] - 0s 617us/step - loss: 21.3258 - val_loss: 30.5883\n",
      "Epoch 85/100\n",
      "81/81 [==============================] - 0s 667us/step - loss: 20.7965 - val_loss: 31.3616\n",
      "Epoch 86/100\n",
      "81/81 [==============================] - 0s 673us/step - loss: 21.6589 - val_loss: 30.8118\n",
      "Epoch 87/100\n",
      "81/81 [==============================] - 0s 563us/step - loss: 20.9045 - val_loss: 31.6358\n",
      "Epoch 88/100\n",
      "81/81 [==============================] - 0s 663us/step - loss: 21.2823 - val_loss: 30.8099\n",
      "Epoch 89/100\n",
      "81/81 [==============================] - 0s 522us/step - loss: 21.3381 - val_loss: 29.8409\n",
      "Epoch 90/100\n",
      "81/81 [==============================] - 0s 823us/step - loss: 21.3935 - val_loss: 30.5556\n",
      "Epoch 91/100\n",
      "81/81 [==============================] - 0s 645us/step - loss: 21.4128 - val_loss: 31.4513\n",
      "Epoch 92/100\n",
      "81/81 [==============================] - 0s 474us/step - loss: 20.9440 - val_loss: 30.7022\n",
      "Epoch 93/100\n",
      "81/81 [==============================] - 0s 608us/step - loss: 21.0070 - val_loss: 29.9554\n",
      "Epoch 94/100\n",
      "81/81 [==============================] - 0s 605us/step - loss: 21.2523 - val_loss: 30.9434\n",
      "Epoch 95/100\n",
      "81/81 [==============================] - 0s 653us/step - loss: 21.3603 - val_loss: 30.2316\n",
      "Epoch 96/100\n",
      "81/81 [==============================] - 0s 629us/step - loss: 21.1408 - val_loss: 32.3674\n",
      "Epoch 97/100\n",
      "81/81 [==============================] - 0s 481us/step - loss: 21.3060 - val_loss: 30.1760\n",
      "Epoch 98/100\n",
      "81/81 [==============================] - 0s 795us/step - loss: 21.3050 - val_loss: 31.1683\n",
      "Epoch 99/100\n",
      "81/81 [==============================] - 0s 624us/step - loss: 21.2543 - val_loss: 30.7625\n",
      "Epoch 100/100\n",
      "81/81 [==============================] - 0s 654us/step - loss: 21.4324 - val_loss: 30.1392\n"
     ]
    },
    {
     "data": {
      "text/plain": [
       "[array([[-0.8455168 ],\n",
       "        [ 1.0948737 ],\n",
       "        [ 0.29024783],\n",
       "        [ 0.492974  ],\n",
       "        [-2.1748314 ],\n",
       "        [ 2.90304   ],\n",
       "        [ 0.01392851],\n",
       "        [-2.95751   ],\n",
       "        [ 2.8230338 ],\n",
       "        [-2.3882077 ],\n",
       "        [-2.1511388 ],\n",
       "        [ 0.8203659 ],\n",
       "        [-3.3363357 ]], dtype=float32),\n",
       " array([22.451803], dtype=float32)]"
      ]
     },
     "execution_count": 23,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# 构建线性回归的NN\n",
    "model_lr = Sequential()\n",
    "model_lr.add(Dense(units = 1, input_dim = 13 ,activation='linear'))           \n",
    "model_lr.compile(optimizer = 'sgd', loss = 'mean_squared_error')   \n",
    "model_lr.summary()\n",
    "\n",
    "# 训练模型\n",
    "hs_lr = model_lr.fit(x_train_std, y_train,  \n",
    "               epochs=100,  # 迭代次数  \n",
    "               batch_size=5,  # 每次用来梯度下降的批处理数据大小  \n",
    "               validation_data = (x_test_std, y_test),  # 验证集  \n",
    "               verbose = 1  # 0不显示训练过程，1全部显示（默认），2精简显示\n",
    "         )\n",
    "\n",
    "# 输出模型的参数组合\n",
    "model_lr.get_weights()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 24,
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "在训练集上的决定系数r2得分为： 0.7561422995650757\n",
      "在训练集上的均方误差MSE为： 20.036731229257217\n",
      "在测试集上的决定系数r2得分为： 0.6769923483379441\n",
      "在测试集上的均方误差MSE为： 30.139179581762757\n"
     ]
    }
   ],
   "source": [
    "y_train_pred_x = model_lr.predict(x_train_std)\n",
    "print(\"在训练集上的决定系数r2得分为：\", r2_score(y_train, y_train_pred_x))\n",
    "print(\"在训练集上的均方误差MSE为：\", mean_squared_error(y_train, y_train_pred_x))\n",
    "\n",
    "y_test_pred_x = model_lr.predict(x_test_std)\n",
    "print(\"在测试集上的决定系数r2得分为：\", r2_score(y_test, y_test_pred_x))\n",
    "print(\"在测试集上的均方误差MSE为：\", mean_squared_error(y_test, y_test_pred_x))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "<h3 style = 'color:red'>The experiment confirms it: a neural network can also perform plain linear regression ^.^</h3>"
   ]
  },
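  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a cross-check (an illustrative sketch, assuming scikit-learn is available), we can fit scikit-learn's `LinearRegression` on the same standardized data and compare its coefficients with the weights of the single `Dense` layer: after enough SGD epochs the two should be close, since both minimize the same mean-squared-error objective."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from sklearn.linear_model import LinearRegression\n",
    "\n",
    "# Fit an exact least-squares linear regression on the same standardized data\n",
    "lr = LinearRegression().fit(x_train_std, y_train)\n",
    "\n",
    "# Weights and bias learned by the one-unit network\n",
    "w_nn, b_nn = model_lr.get_weights()\n",
    "\n",
    "print('sklearn coefficients:', lr.coef_.round(2))\n",
    "print('NN weights:          ', w_nn.flatten().round(2))\n",
    "print('sklearn intercept:', round(float(lr.intercept_), 2), '| NN bias:', b_nn.round(2))"
   ]
  },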
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.3"
  },
  "varInspector": {
   "cols": {
    "lenName": 16,
    "lenType": 16,
    "lenVar": 40
   },
   "kernels_config": {
    "python": {
     "delete_cmd_postfix": "",
     "delete_cmd_prefix": "del ",
     "library": "var_list.py",
     "varRefreshCmd": "print(var_dic_list())"
    },
    "r": {
     "delete_cmd_postfix": ") ",
     "delete_cmd_prefix": "rm(",
     "library": "var_list.r",
     "varRefreshCmd": "cat(var_dic_list()) "
    }
   },
   "types_to_exclude": [
    "module",
    "function",
    "builtin_function_or_method",
    "instance",
    "_Feature"
   ],
   "window_display": false
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
