{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Week 4 Basic Assignment"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 1. For a continuous feature, which function can visualize its distribution? (List one.) Give an example based on the code output.\n",
    "   1) A continuous feature is one whose possible values cannot be enumerated one by one; it can take any point within some interval on the number line.\n",
    "   \n",
    "   2) The distribution of a continuous feature is usually summarized with descriptive statistics and visualized with a histogram or a box plot. For example, the boxplot function in the seaborn package draws a box plot."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Import the required packages"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 39,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Import the libraries we need\n",
    "import numpy as np\n",
    "import pandas as pd\n",
    "\n",
    "import matplotlib.pyplot as plt\n",
    "import seaborn as sn\n",
    "\n",
    "# Display figures inline in the notebook\n",
    "%matplotlib inline"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Load the bike-sharing ride dataset"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 40,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'pandas.core.frame.DataFrame'>\n",
      "RangeIndex: 731 entries, 0 to 730\n",
      "Data columns (total 16 columns):\n",
      "instant       731 non-null int64\n",
      "dteday        731 non-null object\n",
      "season        731 non-null int64\n",
      "yr            731 non-null int64\n",
      "mnth          731 non-null int64\n",
      "holiday       731 non-null int64\n",
      "weekday       731 non-null int64\n",
      "workingday    731 non-null int64\n",
      "weathersit    731 non-null int64\n",
      "temp          731 non-null float64\n",
      "atemp         731 non-null float64\n",
      "hum           731 non-null float64\n",
      "windspeed     731 non-null float64\n",
      "casual        731 non-null int64\n",
      "registered    731 non-null int64\n",
      "cnt           731 non-null int64\n",
      "dtypes: float64(4), int64(11), object(1)\n",
      "memory usage: 91.5+ KB\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>instant</th>\n",
       "      <th>season</th>\n",
       "      <th>yr</th>\n",
       "      <th>mnth</th>\n",
       "      <th>holiday</th>\n",
       "      <th>weekday</th>\n",
       "      <th>workingday</th>\n",
       "      <th>weathersit</th>\n",
       "      <th>temp</th>\n",
       "      <th>atemp</th>\n",
       "      <th>hum</th>\n",
       "      <th>windspeed</th>\n",
       "      <th>casual</th>\n",
       "      <th>registered</th>\n",
       "      <th>cnt</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>count</th>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "      <td>731.000000</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>mean</th>\n",
       "      <td>366.000000</td>\n",
       "      <td>2.496580</td>\n",
       "      <td>0.500684</td>\n",
       "      <td>6.519836</td>\n",
       "      <td>0.028728</td>\n",
       "      <td>2.997264</td>\n",
       "      <td>0.683995</td>\n",
       "      <td>1.395349</td>\n",
       "      <td>0.495385</td>\n",
       "      <td>0.474354</td>\n",
       "      <td>0.627894</td>\n",
       "      <td>0.190486</td>\n",
       "      <td>848.176471</td>\n",
       "      <td>3656.172367</td>\n",
       "      <td>4504.348837</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>std</th>\n",
       "      <td>211.165812</td>\n",
       "      <td>1.110807</td>\n",
       "      <td>0.500342</td>\n",
       "      <td>3.451913</td>\n",
       "      <td>0.167155</td>\n",
       "      <td>2.004787</td>\n",
       "      <td>0.465233</td>\n",
       "      <td>0.544894</td>\n",
       "      <td>0.183051</td>\n",
       "      <td>0.162961</td>\n",
       "      <td>0.142429</td>\n",
       "      <td>0.077498</td>\n",
       "      <td>686.622488</td>\n",
       "      <td>1560.256377</td>\n",
       "      <td>1937.211452</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>min</th>\n",
       "      <td>1.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>0.059130</td>\n",
       "      <td>0.079070</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>0.022392</td>\n",
       "      <td>2.000000</td>\n",
       "      <td>20.000000</td>\n",
       "      <td>22.000000</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>25%</th>\n",
       "      <td>183.500000</td>\n",
       "      <td>2.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>4.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>0.337083</td>\n",
       "      <td>0.337842</td>\n",
       "      <td>0.520000</td>\n",
       "      <td>0.134950</td>\n",
       "      <td>315.500000</td>\n",
       "      <td>2497.000000</td>\n",
       "      <td>3152.000000</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>50%</th>\n",
       "      <td>366.000000</td>\n",
       "      <td>3.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>7.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>3.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>0.498333</td>\n",
       "      <td>0.486733</td>\n",
       "      <td>0.626667</td>\n",
       "      <td>0.180975</td>\n",
       "      <td>713.000000</td>\n",
       "      <td>3662.000000</td>\n",
       "      <td>4548.000000</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>75%</th>\n",
       "      <td>548.500000</td>\n",
       "      <td>3.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>10.000000</td>\n",
       "      <td>0.000000</td>\n",
       "      <td>5.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>2.000000</td>\n",
       "      <td>0.655417</td>\n",
       "      <td>0.608602</td>\n",
       "      <td>0.730209</td>\n",
       "      <td>0.233214</td>\n",
       "      <td>1096.000000</td>\n",
       "      <td>4776.500000</td>\n",
       "      <td>5956.000000</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>max</th>\n",
       "      <td>731.000000</td>\n",
       "      <td>4.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>12.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>6.000000</td>\n",
       "      <td>1.000000</td>\n",
       "      <td>3.000000</td>\n",
       "      <td>0.861667</td>\n",
       "      <td>0.840896</td>\n",
       "      <td>0.972500</td>\n",
       "      <td>0.507463</td>\n",
       "      <td>3410.000000</td>\n",
       "      <td>6946.000000</td>\n",
       "      <td>8714.000000</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "          instant      season          yr        mnth     holiday     weekday  \\\n",
       "count  731.000000  731.000000  731.000000  731.000000  731.000000  731.000000   \n",
       "mean   366.000000    2.496580    0.500684    6.519836    0.028728    2.997264   \n",
       "std    211.165812    1.110807    0.500342    3.451913    0.167155    2.004787   \n",
       "min      1.000000    1.000000    0.000000    1.000000    0.000000    0.000000   \n",
       "25%    183.500000    2.000000    0.000000    4.000000    0.000000    1.000000   \n",
       "50%    366.000000    3.000000    1.000000    7.000000    0.000000    3.000000   \n",
       "75%    548.500000    3.000000    1.000000   10.000000    0.000000    5.000000   \n",
       "max    731.000000    4.000000    1.000000   12.000000    1.000000    6.000000   \n",
       "\n",
       "       workingday  weathersit        temp       atemp         hum   windspeed  \\\n",
       "count  731.000000  731.000000  731.000000  731.000000  731.000000  731.000000   \n",
       "mean     0.683995    1.395349    0.495385    0.474354    0.627894    0.190486   \n",
       "std      0.465233    0.544894    0.183051    0.162961    0.142429    0.077498   \n",
       "min      0.000000    1.000000    0.059130    0.079070    0.000000    0.022392   \n",
       "25%      0.000000    1.000000    0.337083    0.337842    0.520000    0.134950   \n",
       "50%      1.000000    1.000000    0.498333    0.486733    0.626667    0.180975   \n",
       "75%      1.000000    2.000000    0.655417    0.608602    0.730209    0.233214   \n",
       "max      1.000000    3.000000    0.861667    0.840896    0.972500    0.507463   \n",
       "\n",
       "            casual   registered          cnt  \n",
       "count   731.000000   731.000000   731.000000  \n",
       "mean    848.176471  3656.172367  4504.348837  \n",
       "std     686.622488  1560.256377  1937.211452  \n",
       "min       2.000000    20.000000    22.000000  \n",
       "25%     315.500000  2497.000000  3152.000000  \n",
       "50%     713.000000  3662.000000  4548.000000  \n",
       "75%    1096.000000  4776.500000  5956.000000  \n",
       "max    3410.000000  6946.000000  8714.000000  "
      ]
     },
     "execution_count": 40,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "data_path = 'D:/DeepLearning_Dataset/bike_sharing_hourly/day.csv'\n",
    "rides = pd.read_csv(data_path)\n",
    "rides.info()\n",
    "rides.describe()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Visualize the \"cnt\" feature with a box plot"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 41,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAX0AAAD4CAYAAAAAczaOAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8li6FKAAANkklEQVR4nO3dUYiddX6H8ee7k6pjl1DFGNJJbLJMqI3C0jqI7UJvbDG1pfFGyMLWUISAuNNpKZTYm70K7EUpjaEKobs10qUh2AVD0W4l7V4UrDrZFdwYUw/KmkmyOrvSXUEbN9lfL+ZdeppMMmc0nnPk/3zgMO/5n/c98zuSPPP6zplMqgpJUhs+M+oBJEnDY/QlqSFGX5IaYvQlqSFGX5IasmbUA6zkpptuqs2bN496DEn6VDl27NgPq2rdxetjH/3NmzczPz8/6jEk6VMlyfeXW/fyjiQ1xOhLUkOMviQ1xOhLUkOMviQ1xOhLUkOMviQ1ZOzfp6/xsX//fnq93qjHGAunT58GYGpqasSTjIfp6WlmZ2dHPYYGYPSlj+CDDz4Y9QjSR2L0NTDP5P7P3NwcAPv27RvxJNLqeE1fkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhoyUPST/FmS40m+l+Qfk1yX5MYkzyV5vft4Q9/+jyTpJTmZ5J6+9TuSvNI99miSfBIvSpK0vBWjn2QK+BNgpqpuByaAncAe4GhVbQWOdvdJsq17/DZgO/BYkonu6R4HdgNbu9v2q/pqJElXNOjlnTXAZJI1wPXAGWAHcLB7/CBwX7e9AzhUVeeq6k2gB9yZZAOwtqqer6oCnuw7RpI0BCtGv6pOA38FvAWcBX5cVf8KrK+qs90+Z4Gbu0OmgFN9T7HQrU112xevXyLJ7iTzSeYXFxdX94okSZc1yOWdG1g6e98C/DLwi0m+dKVDllmrK6xfulh1oKpmqmpm3bp1K40oSRrQIJd3fgd4s6oWq+qnwDeB3wLe7i7Z0H18p9t/AdjUd/xGli4HLXTbF69LkoZkkOi/BdyV5Pru3TZ3AyeAI8Cubp9dwNPd9hFgZ5Jrk2xh6Ru2L3aXgN5Lclf3PA/0HSNJGoIVf11iVb2Q5CngO8B54LvAAeCzwOEkD7L0heH+bv/jSQ4Dr3b7P1xVF7qnewh4ApgEnu1ukqQhGeh35FbVV4CvXLR8jqWz/uX23wvsXWZ9Hrh9lTNKkq4SfyJXkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhpi9CWpIUZfkhoyUPST/FKSp5K8luREkt9McmOS55K83n28oW//R5L0kpxMck/f+h1JXukeezRJPokXJUla3qBn+vuAf6mqW4HPAyeAPcDRqtoKHO3uk2QbsBO4DdgOPJZkonuex4HdwNbutv0qvQ5J0gBWjH6StcBvA18DqKoPq+q/gR3AwW63g8B93fYO4FBVnauqN4EecGeSDcDaqnq+qgp4su8YSdIQrBlgn88Bi8DfJ/k8cAyYA9ZX1VmAqjqb5OZu/yngP/uOX+jWftptX7x+iSS7Wfo/Am655ZaBX8wnYf/+/fR6vZHOoPHz8z8Tc3NzI55E42Z6eprZ2dlRj3FZg0R/DfAbwGxVvZBkH92lnMtY7jp9XWH90sWqA8ABgJmZmWX3GZZer8fL3zvBhetvHOUYGjOf+XDpj+WxN94e8SQaJxPvvzvqEVY0SPQXgIWqeqG7/xRL0X87yYbuLH8D8E7f/pv6jt8InOnWNy6zPvYuXH8j
H9x676jHkDTmJl97ZtQjrGjFa/pV9QPgVJJf7ZbuBl4FjgC7urVdwNPd9hFgZ5Jrk2xh6Ru2L3aXgt5Lclf3rp0H+o6RJA3BIGf6ALPAN5JcA7wB/DFLXzAOJ3kQeAu4H6Cqjic5zNIXhvPAw1V1oXueh4AngEng2e4mSRqSgaJfVS8DM8s8dPdl9t8L7F1mfR64fTUDSpKuHn8iV5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFGX5IaYvQlqSFrRj3AuDt9+jQT7/+YydeeGfUoksbcxPs/4vTp86Me44o805ekhnimv4KpqSl+cG4NH9x676hHkTTmJl97hqmp9aMe44o805ekhhh9SWqI0Zekhhh9SWqI0Zekhhh9SWqI0Zekhgwc/SQTSb6b5J+7+zcmeS7J693HG/r2fSRJL8nJJPf0rd+R5JXusUeT5Oq+HEnSlazmTH8OONF3fw9wtKq2Ake7+yTZBuwEbgO2A48lmeiOeRzYDWztbts/1vSSpFUZKPpJNgK/D/xd3/IO4GC3fRC4r2/9UFWdq6o3gR5wZ5INwNqqer6qCniy7xhJ0hAMeqb/N8BfAD/rW1tfVWcBuo83d+tTwKm+/Ra6talu++L1SyTZnWQ+yfzi4uKAI0qSVrJi9JP8AfBOVR0b8DmXu05fV1i/dLHqQFXNVNXMunXrBvy0kqSVDPIPrn0B+MMk9wLXAWuT/APwdpINVXW2u3TzTrf/ArCp7/iNwJlufeMy65KkIVnxTL+qHqmqjVW1maVv0P5bVX0JOALs6nbbBTzdbR8Bdia5NskWlr5h+2J3Cei9JHd179p5oO8YSdIQfJx/WvmrwOEkDwJvAfcDVNXxJIeBV4HzwMNVdaE75iHgCWASeLa7SZKGZFXRr6pvA9/utn8E3H2Z/fYCe5dZnwduX+2QkqSrw5/IlaSGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGfJx/WrkZE++/y+Rrz4x6DI2Rz/zPTwD42XVrRzyJxsnE++8C60c9xhUZ/RVMT0+PegSNoV7vPQCmPzfef8E1bOvHvhlGfwWzs7OjHkFjaG5uDoB9+/aNeBJpdbymL0kNMfqS1BCjL0kNMfqS1BCjL0kNMfqS1BCjL0kNMfqS1BCjL0kNMfqS1BCjL0kNMfqS1BCjL0kNMfqS1BCjL0kNMfqS1BCjL0kNMfqS1BCjL0kNMfqS1BCjL0kNWTH6STYl+fckJ5IcTzLXrd+Y5Lkkr3cfb+g75pEkvSQnk9zTt35Hkle6xx5Nkk/mZUmSljPImf554M+r6teAu4CHk2wD9gBHq2orcLS7T/fYTuA2YDvwWJKJ7rkeB3YDW7vb9qv4WiRJK1gx+lV1tqq+022/B5wApoAdwMFut4PAfd32DuBQVZ2rqjeBHnBnkg3A2qp6vqoKeLLvGEnSEKzqmn6SzcCvAy8A66vqLCx9YQBu7nabAk71HbbQrU112xevL/d5dieZTzK/uLi4mhElSVcwcPSTfBb4J+BPq+onV9p1mbW6wvqli1UHqmqmqmbWrVs36IiSpBUMFP0kv8BS8L9RVd/slt/uLtnQfXynW18ANvUdvhE4061vXGZdkjQkg7x7J8DXgBNV9dd9Dx0BdnXbu4Cn+9Z3Jrk2yRaWvmH7YncJ6L0kd3XP+UDfMZKkIVgzwD5fAP4IeCXJy93aXwJfBQ4neRB4C7gfoKqOJzkMvMrSO38erqoL3XEPAU8Ak8Cz3U2SNCQrRr+q/oPlr8cD3H2ZY/YCe5dZnwduX82AkqSrx5/IlaSGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SG
GH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JaojRl6SGGH1JasjQo59ke5KTSXpJ9gz780tSy4Ya/SQTwN8CvwdsA76YZNswZ5Cklq0Z8ue7E+hV1RsASQ4BO4BXhzyHPoL9+/fT6/VGPcZY+Pl/h7m5uRFPMh6mp6eZnZ0d9RgawLAv70wBp/ruL3Rr/0+S3Unmk8wvLi4ObThpUJOTk0xOTo56DGnVhn2mn2XW6pKFqgPAAYCZmZlLHtdoeCYnffoN+0x/AdjUd38jcGbIM0hSs4Yd/ZeArUm2JLkG2AkcGfIMktSsoV7eqarzSb4MfAuYAL5eVceHOYMktWzY1/SpqmeAZ4b9eSVJ/kSuJDXF6EtSQ4y+JDXE6EtSQ1I13j/7lGQR+P6o55CWcRPww1EPIV3Gr1TVuosXxz760rhKMl9VM6OeQ1oNL+9IUkOMviQ1xOhLH92BUQ8grZbX9CWpIZ7pS1JDjL4kNcToSx9Bku1JTibpJdkz6nmkQXlNX1qlJBPAfwG/y9IvBnoJ+GJV+bueNfY805dW706gV1VvVNWHwCFgx4hnkgZi9KXVmwJO9d1f6NaksWf0pdXLMmteJ9WngtGXVm8B2NR3fyNwZkSzSKti9KXVewnYmmRLkmuAncCREc8kDWTovyNX+rSrqvNJvgx8C5gAvl5Vx0c8ljQQ37IpSQ3x8o4kNcToS1JDjL4kNcToS1JDjL4kNcToS1JDjL4kNeR/AZ2+IfvB8DqkAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "sn.boxplot(data=rides[\"cnt\"]); "
   ]
  },
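  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a minimal sketch (assuming `rides` is already loaded as above), a histogram gives the other common view of the same distribution; `bins=30` is an arbitrary illustrative choice:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Histogram of the daily ride counts (assumes rides is loaded above)\n",
    "plt.hist(rides[\"cnt\"], bins=30)\n",
    "plt.xlabel(\"cnt\")\n",
    "plt.ylabel(\"frequency\")\n",
    "plt.show()"
   ]
  },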
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 2. For two continuous features, which function gives the correlation between them? Give an example based on the code output.\n",
    "1) The relationship between two numeric features can be inspected through a correlation matrix. (Some machine-learning algorithms do not handle highly correlated inputs well; when features are highly correlated, consider PCA dimensionality reduction at the feature level or adding a regularization term at the model level.)\n",
    "\n",
    "2) Use the DataFrame corr() method to compute the pairwise correlation matrix, then pass it to seaborn's heatmap() function to render a color-coded matrix."
   ]
  },
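  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a minimal sketch (assuming `rides` is already loaded as above), the correlation between a single pair of features can also be read off directly with `Series.corr`; `temp` and `atemp` are picked here purely for illustration:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Pearson correlation between temperature and \"feels-like\" temperature\n",
    "rides[\"temp\"].corr(rides[\"atemp\"])"
   ]
  },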
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Visualize the correlations among the features"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 97,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<matplotlib.axes._subplots.AxesSubplot at 0x16c1d563408>"
      ]
     },
     "execution_count": 97,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAV8AAAEoCAYAAAD/kvL4AAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4xLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy8li6FKAAAgAElEQVR4nOzdd3gU5RbA4d9JQk8IPQm9KiolFAEVQUCKKE0UlSJNAXu5il5EQUHA3hHRq2IXRGkioIBU6R0RVHoLhJYAAZLsuX/skmySBTZ1s+G8Pvu4M3NmzuyGPfvtN9/MiKpijDEmZwX4egeMMeZyZMXXGGN8wIqvMcb4gBVfY4zxASu+xhjjA1Z8jTHGB6z4GmMueyLSTkS2isg/IvKsh+WhIjJdRNaLyGYR6ZvpnDbO1xhzORORQGAb0BrYC6wE7lHVP91ihgChqvqMiJQGtgLhqnouo3mt5WuMudw1Av5R1e2uYvod0ClVjAIhIiJAMHAUSMhM0qDMrJwXxUdv9+lPgXylqvoyvTE5RTKzcno+p/lLVxsIDHCbNV5Vx7tNlwP2uE3vBRqn2sz7wDRgPxAC3KWqjnTtdCpWfI0xeZqr0I6/SIinL4LUxb0tsA5oCVQDfhWRRaoak9H9sm4HY4z/cSR6/7i0vUAFt+nyOFu47voCP6rTP8AOoGZmXoIVX2OM/1GH949LWwnUEJEqIpIfuBtnF4O73UArABEJA64EtmfmJVi3gzHG72hipo51pdyWaoKIPAzMBgKBT1V1s4gMci0fB4wAPheRjTi7KZ5R1ejM5LWhZqnYATdjckSmDrid27vR+wNu5WtnKld2sZavMcb/ZG6gQa5gxdcY43+8O5CWq1nxNcb4H2v5GmNMzsvKA26+YsXXGON/HNbyNcaYnGfdDua8oaPeZOGSFZQoXowpX43z9e4Yk7flgQNuueoMNxEpJiIP+no/MqJz+9aMe3Okr3fDmMtD1p7h5hO5qvgCxQC/LL4NI2sTWjTE17thzOXB4fD+kUvltm6HMUA1EVkH/AocAroBBYCfVHWYiFQGZgGLgSbAeuAz4EWgDNBDVVeIyHCcVx8qh/OiGa+q6sc5+mqMMdkjD4x2yG0t32eBf1U1EmfxrYHzQseRQAMRaeaKqw68A9TBeWWh7kBT4ClgiNv26gC3AtcBL4hIWU9JRWSAiKwSkVWffPFt1r8qY0yWUk30+pFb5baWr7s2rsda13QwzmK8G9ihqhsBRGQzMFdV1XXRi8pu25iqqnFAnIjMx1nIp6RO5H69T19f28EY44Vc3JfrrdxcfAUYraofpZjp7HY46zbL4TbtIOVrSl1IrbAakxfk4r5cb+W2bodYnLfoAOfl3fqJSDCAiJQTkTLp3F4nESkoIiWBm3BetzNbPD1sDD0GPsHO3Xtp1bknk6fPzq5Uxpg8MNohV7V8VfWIiCwRkU3AL8A3wB/Oe9ZxEugJpKcTZwXwM1ARGKGqqa9On2VeezHN3aaNMdklD4zzzVXFF0BVu6ea9Y6HsFpu8X3cnu90XwZsU1X3G+cZY/KCPDDaIdcVX2OMuaRc3J3grTxbfFV1uK/3wRiTTfLAAbc8W3yNMXmYFV9jjMl5ufnkCW9Z8TXG+B874GaMMT5g3Q7GGOMDNtrBGGN8wFq+xhjjA9byNVktPnq7z3LnK1XVZ7mNSRdr+RpjjA/YaAdjjPEBa/kaY4wPWJ+vMcb4gLV8jTHGB6zla4wxPmAtX2OM8YFE/7+wTm67h5sxxlyaw+H9wwsi0k5EtorIPyLi8Z5gInKTiKwTkc0isiCzL8FavsYY/5OF3Q4iEgh8ALQG9gIrRWSaqv7pFlMMGAu0U9XdGbiZbxrW8jXG+J+svXtxI+AfVd2uqueA74BOqWK6Az+q6m4AVT2U2ZdgxdcY43/S0e0gIgNEZJXbI/VNdcsBe9ym97rmubsCKC4iv4vIahG5N7Mv
wbodvLR42SrGvD2ORIeDrh3acV+vbimWn4iJ5fnRb7Fn3wEK5M/PiCFPUKNqZQC+nDiFydNmoarc0bEdve7qkqX7NnTUmyxcsoISxYsx5atxWbptY3KldBxwU9XxwPiLhIin1VJNBwENgFZAIeAPEVmmqtu83pFUcl3LV0SG+HofUktMTGTkGx/w4RsjmPb1R8z87Xf+3bErRczHX3xPzRrV+OmLDxn1/FOMedtZBP/evpPJ02bx7SdvM3nCWBYsXcGuPfuydP86t2/NuDdHZuk2jcnVsvaA216ggtt0eWC/h5hZqnpKVaOBhUDdzLyEXFd8gVxXfDdu2UbF8mWpUC6CfPnycUur5sxbtCxFzL87d9OkgfNvUbVSBfYdiCL66DG279xDnWtqUqhgQYKCAmkYWZu5C5dm6f41jKxNaNGQLN2mMbla1vb5rgRqiEgVEckP3A1MSxUzFbhRRIJEpDDQGNiSmZfg0+IrIlNc/SebXf0yY4BCruEcX7tieorICte8j1xHJhGRkyLyimv930Skkas/ZruIdHTF9BGRqSIyyzWMZFhG9vPQ4WjCy5ROmg4rU4pDh4+kiLmyelV+W+Asqhv/3MqBqENEHYqmetVKrF6/ieMnYog7c4ZFf6zkYNThjOyGMcZFHer145LbUk0AHgZm4yyoE1V1s4gMEpFBrpgtwCxgA7AC+ERVN2XmNfi6z7efqh4VkUI4v32aAw+raiSAiFwF3AXcoKrxIjIW6AF8ARQBflfVZ0TkJ2AkzqEiVwMTSP7magTUAk7jHELys6quct8JVwf8AICxb4zkvnvvSbGT6uHvJ6l6ie7rdSdj3v6Irr0foka1ytSsUY3AwECqVa5Ivx53cv/jQyhcqBBXVK9KYGBgxt4tY4xTFp/hpqozgZmp5o1LNf0a8FpW5fR18X1URM4ffaoA1Ei1vBXOTu6V4qx2hYDzQzzO4fwmAtgInHUV6I1AZbdt/KqqRwBE5EegKZCi+Lp3yMdHb09TasPKlOLgoeTWatShaEqXKpkiJrhIEUY+9+T57dH2jj6ULxsGQNcObenaoS0Ab4/7nPAypS74hhhjvJAHru3gs24HEbkJuBm4TlXrAmuBgqnDgAmqGul6XKmqw13L4lWT2qQO4CyAqjpI+aWSuphe+ndIKrVqXsHuvfvZu/8g8fHx/DJ3AS2aNkkRExN7kvj4eAAmT59Fg8jaBBcpAsCRY8cBOHDwEHMXLOGWm5undxeMMe4SEr1/5FK+bPmGAsdU9bSI1ATOV7N4EcmnqvHAXGCqiLylqodEpAQQoqq7LrRRD1q71osDOgP90rujQUGBDHniAQY+OZTExES63NaG6lUr8f1PPwNwV5db2b5rD0NGvE5gQABVK1fkpf8+nrT+E0NGcjwmhqCgIJ77z4NZfnDs6WFjWLl2A8ePx9Cqc08e7N8rqaVtTJ6UBy6sI+qpQzMnEosUAKbgHMy8FSgNDAduAToCa1S1h4jcBfwXZys9HnhIVZeJyElVDXZtazhwUlVfd02fVNVgEekDtMfZP1wd+EZVX7zYfnnqdrhc2D3cTA7yNLbWa6ffHuj157Tw4x9lKld28VnLV1XP4iy0qf0OPOMW9z3wvYf1g92eD7/QMuCQqj6cyd01xuQmeaDl6+sDbsYYk35eDCHL7fJ08VXVz4HPfbwbxpislgeu55uni68xJm9S63YwxhgfsG4HY4zxgTxwkoUVX2OM/7GWrzHG+ID1+RpjjA/YaAdjjPEB63bIe96v/4LPcufz4b+ngWtfIj56u8/y26nNJj1sqJkxxviCtXyNMcYHrPgaY4wP2DhfY4zJeZpgxdcYY3KedTsYY4wP2GgHY4zxAWv5GmOMD1jxNcaYnKeJ1u1gjDE5z1q+xhiT89SK7+WjUvM63DS8FwGBAWz67ndWjp2eYnnxahG0eX0AZWpVZulrk1g9fiYAgQXy0W3SUALzBxEQFMjfM1fwx5s/pit3xZvqcOPwXkhgAH9+
+ztrUuUuVi2Cm98YQOlalVn22iTWfjQzxXIJELr9PIJTB48xo+8bGXj1FzZ01JssXLKCEsWLMeWrcVm6bWMuKA8U3wBf78CliEhlEdnk030IEFqO7M2U3q8yodVgruzYhBI1yqaIOXP8FL8P+zKp6J6XeDaeH+4exVftnuOrds9RqXkdwutVS1fu5iN7M/3eV/mm5WCu6NSE4qlynz1+ioXDvmRtqtzn1e3fjmP/7Pc6Z3p0bt+acW+OzJZtG3NBjnQ8cqlcX3xzg/DIahzfGcWJ3YdxxCeydfoyqrVpkCIm7kgMURu240hIe53R+NNnAQgICiQgKAjS8aUdFlmNEzujiHHl/nvaMqp6yH1o/XYc8WlzFwkvQaWWkWz+9nfvk6ZDw8jahBYNyZZtG3Mh6lCvH7mVvxTfQBH5WEQ2i8gcESkkIr+LSEMAESklIjtdz/uIyBQRmS4iO0TkYRF5UkTWisgyESmR3uTB4cWJ3X80afrkgaMEhxX3en0JEHr88jID145l9+KNHFz3r9frFvGQu0i497lvHN6TpaO+zRM/04xJkqDeP3Ipfym+NYAPVPUa4DjQ9RLxtYDuQCPgZeC0qtYD/gDuTR0sIgNEZJWIrPrj5N9ptyaSZpam42+qDuXrW57jk8aPEl63GiWvKO/9yh5ye9tyrtwqkrgjMRzeuNP7fMb4gbzQ8vWXA247VHWd6/lqoPIl4ueraiwQKyIngPNHqDYCdVIHq+p4YDzAWxV7pvlrnTxwlJCyyQ3m4IgSnDp0LL2vgbMxp9m7bAuVb6rDkW17vVrnlKfcUd7ljmh4BVVa16dSi7oEFshH/pBCtH7nAX597MN077sxuUou7sv1lr+0fM+6PU/E+aWRQPL+F7xIvMNt2kEGvnAOrt9O8SrhFK1QmoB8gVzZoQnbf13j1bqFSoRQoGhhwDnyoWLTWhz91/uDX1HrtxNaOZwQV+4aHZuww8vcf7wykc8bPcoX1z/BnIc+YN+SP63wmjwhq1u+ItJORLaKyD8i8uxF4q4VkUQRuSOzr8FfWr6e7AQaACuATL8RF6OJDuY9P4HbvxyMBAaw+fsFHNm2jzo9WwKw4at5FC4dSvcZI8gfXAh1OKjXvx1ftHqGImWK0fbNgUhgABIgbJuxnB1z110iY8rcC5+fQKevnLn//H4BR7ft4xpX7s2u3N1+Ts5dt387vm75DPEn47Ll/XD39LAxrFy7gePHY2jVuScP9u9F1w5tsz2vucxlYctXRAKBD4DWwF5gpYhMU9U/PcS9AszOkryans5LHxCRysAMVa3lmn4KCAa+AyYCJ4F5QE9VrSwifYCGqvqwK36nazo69TJPPHU75BRf38PNl+webpcdDwczvHekQ3OvPy0lpy+4aC4RuQ4YrqptXdP/BVDV0aniHgfigWtx1qQf0rvf7nJ9y1dVd+I8gHZ++nW3xe79t0Ndyz8HPneLr+z2PMUyY4x/0gTvY0VkADDAbdZ413Ge88oBe9ym9wKNU22jHNAFaImz+GZari++xhiTRjq6HdwPqF+Ap5Zx6pb128AzqpoonkYgZYAVX2OM38niW7jtBSq4TZcHUh8Vbwh85yq8pYD2IpKgqlMymtSKrzHG72Rx8V0J1BCRKsA+4G6c5wkk51Otcv65iHyOs883w4UXrPgaY/xQVhZfVU0QkYdxjmIIBD5V1c0iMsi1PFuuGGXF1xjjdzQxa/pdk7anOhOYmWqex6Krqn2yIqcVX2OM31FH1hZfX7Dia4zxO1nc5+sTVnyNMX5H1Vq+xhiT46zlmwe9d3qzz3KfTjjjs9yPl73RZ7nj9i8iPnq7z/Lbqc3+x/p8jTHGBxxZPNrBF6z4GmP8jrV8jTHGB3L5xRi9YsXXGON3rOVrjDE+YEPNjDHGB2yomTHG+ECiw19uP3lhVnyNMX7H+nyNMcYHbLSDMcb4gLV8jTHGBxx5YLSD//da56AXRg1m3oqpzFzwPdfUqekxplf/u5i3Yirb
o9dSvESxFMsa39CAGfO/Y9biH/h22ifpyj3ylSH8sWYW85ZMoXbdqz3G9Lu/O3+smcXB41so4ZY7pGgwX3w3lrmLf2LBH9O5u0eXdOV+682X+OvPxaxZ/Sv1Imt5jPliwnts3rSQdWvn8vH4NwgKcn6vN292HUcOb2HVyjmsWjmHoc89nq7cFzN01Js0u/VuOvcclGXbNP7B4RCvH7lVpouviMwUkWKXjkyKrywimzKbNyNE5GRG173p5qZUrlqRlo06MeTJkYx4bYjHuNUr1tGr6yD27k55/72QosG89OoQBvR8nHZN7+Dhfk97nbtV62ZUrVqJ6+q346nHhvHKGy94jFuxfC3dOvdjz+59Keb3va872/76l1ZNu3D7bfcybORg8uXL51XuW9q1pEb1KtS8uikPPPAMH7w/2mPct9/+xDW1mhFZrxWFChWkf7/kW2AtXryChte2oeG1bRj58ttevupL69y+NePeHJll2zP+w6Hi9SO3ynTxVdX2qno8K3YmN7v5lub8NHEGAOtWb6RoaAilw0qliftz41b27TmQZn6nrrcwe8Zc9u87CMCR6GNe527bviUTv5sKwJpV6ykaWpQyYaXTxG3asIU9u1PfdBVUleDgIgAUCS7M8WMnSEhI8Cp3hw5t+fLrHwBYvmINocVCCQ8vkybul1nzkp6vXLmO8uUjvNp+ZjSMrE1o0ZBsz2NyH1Xx+pFbXbL4ishgEXnU9fwtEZnnet5KRL4SkZ0iUsrVot0iIh+LyGYRmSMihVyxDURkvYj8ATzktu1rRGSFiKwTkQ0iUsO1nb9EZIJr3g8iUthtOwtEZLWIzBaRCNf8aiIyyzV/kYjUdM2vIiJ/iMhKERmRmTcqPKIMB1yFE+Dg/ijCI9IWoQupUq0SocWK8s3Uj5k692u6dLvN63UjIsKSijbAgf0HiUhH7k8//poaV1Zl/V8Lmb9kKs8/Oxr18nBxubLh7N2TXND37T1AubLhF4wPCgqiR4+uzJ49P2lekyYNWL3qV2ZM+5Krr77C6/025kJUvX/kVt60fBcC5y/22hAIFpF8QFNgUarYGsAHqnoNcBzo6pr/GfCoql6XKn4Q8I6qRrq2vdc1/0pgvKrWAWKAB1053wPuUNUGwKfAy6748cAjrvlPAWNd898BPlTVa4Hk6pWKiAwQkVUisirmTPSFYtLM87aAAQQGBVKr7lX0v+cR+tz5EI88dT9VqlX0at3M5m7RsimbNv5F3ZrNaHXj7Yx6bSjBIUWyJff7741i0aLlLF6yAoA1azdStXojGjRszQdjP2PypE+93m9jLuRy6XZYDTQQkRDgLPAHzkJ5I2mL7w5VXee2XmURCQWKqeoC1/wv3eL/AIaIyDNAJVWNc83fo6pLXM+/wlnorwRqAb+KyDpgKFBeRIKB64FJrvkfAed/894AfOshbwqqOl5VG6pqw6IFk7sSevXrxoz53zFj/ndEHTxMRLnkFl942TCiDh6+0CbTOLj/EAvnLSXu9BmOHT3OiqVrqHnNhVuBfe/rzm+LfuS3RT9y8OAhyrrljigbzsF05L67x+3MnP4rADt37Gb3rr3UqHHhC4g/MKh30gGy/QcOUr5C2aRl5cpHsP9AlMf1nh/6BKVLl+Spp4cnzYuNPcmpU6cBZ9dEvnxBlCxZ3Ot9N8aTy6LbQVXjgZ1AX2ApzoLbAqgGbEkVftbteSLOoWwCeGwqqeo3QEcgDpgtIi3PL0od6trOZlWNdD1qq2ob12s47jY/UlWvSrVuhnz56URua3E3t7W4m19nzk/qKohsUJvYmJMcjvLcSvbk119+59om9QgMDKRgoYLUbVCLf7ftuGD8Z598w8033s7NN97OrJ/n0u3uTgDUb1iX2JhYDkV5X3z37T3Ajc2bAFCqdEmqVa/Crp17Lhj/4bgJSQfIpk2bTa8edwDQuFF9Yk7EcPDgoTTr9Ot7D21a30SPng+laBmHufVNX9swkoCAAI4c8b6/2xhPElW8
fuRW3h5wW4jz5/xCnMV3ELBOvfjt6zoYd0JEmrpm9Ti/TESqAttV9V1gGlDHtaiiiJzvorgHWAxsBUqfny8i+UTkGlWNAXaIyJ2u+SIidV3rLgHuTp03I+b/upg9u/Yyf+U0Rr/1PC8MTj7q/+m371Em3Flket9/D0s2zCK8bBlmLpzI6LedIxP+/XsHC+YtZebCifw050smfvUT2/7616vcv81ZwK6de1m2djZvvPMSz/7npaRlX0/8iDBX7v4De7Jm83wiyoYxb8lU3njX2c395mtjadioHvOXTOWHqZ8xcvgbHD3q3THSmb/MZfuO3WzdsoRx417l4UeSR3lMn/oFERFhAIz9YAxlypRi8aJpKYaUdb39Vtavm8fqVb/y9lsj6NHzQa/yeuPpYWPoMfAJdu7eS6vOPZk8fXaWbdvkbnmh20G86TsUkVbALJzdB6dEZBswTlXfFJGduPqCgRmqWsu1zlNAsKoOF5HzfbSngdk4+21rich/gZ5APM4+2e5AUWAmzkJ/PfA30EtVT4tIJPAuEIqzVf22qn4sIlWAD3F2N+QDvlPVl1zzv3HFTgaGqmrwxV5r1VL1fNZF78t7uEWfjvFZ7rj9qXuvcpbdw80nMlUVl4Tf4fXn9IaDP+TKCuxV8c1JIlIZtyKe06z45jwrvpelTBXERekovjfm0uJrpxcbY/yOZq525wq5rviq6k6coxqMMcajhFzcl+utXFd8jTHmUqzla4wxPpAH7iJkxdcY43+s5WuMMT5gLV9jjPGBvFB87WLqxhi/kyji9cMbItJORLaKyD8i8qyH5T1cV1ncICJL3c6izTBr+Rpj/I4jC/t8RSQQ+ABojfPKiitFZJqq/ukWtgNorqrHROQWnFdSbJyZvFZ8U9n0TD1f74JPtHt7l89yV7uik89y/7ttKvHR232W386uy5gsPg21EfCPqm4HEJHvgE5AUvFV1aVu8cuA8plNat0Oxhi/40jHw/163a7HgFSbKwe4X+Zvr2vehfQHfsnsa7CWrzHG7zi87MsF5/W6cXYTXIinjXlsXItIC5zFt6mn5elhxdcY43eyuNthL1DBbbo8kOZmiCJSB/gEuEVVj2Q2qRVfY4zfScjacyxWAjVcl6Ddh/Ma4N3dA0SkIvAjzsvbbsuKpFZ8jTF+JytHO6hqgog8jPNa44HAp6q6WUQGuZaPA14ASgJjXfc1TFDVhpnJa8XXGON3svqi26o6E+dNHNznjXN7fh9wX1bmtOJrjPE7Dv+/tIMVX2OM/8kLpxdb8TXG+J1Ea/kaY0zOs5avMcb4gBXfy0hA5Vrkb9UdREjYsIiEFTNTLq9wJQW6PIKeiAYgYdtqEv6YnhwgQsFeL6Anj3P2x3dyfe5HX3qIJi0bczbuLKOfeJVtm/5OExNRIZxhY4dStHgI2zb+zchHx5AQn8Ddg7rR+vZWAAQGBlKpRkU61ulKsZKhDP/w+aT1y1aM4NPXP+e198al2fZ5L45+lhatbyQu7gz/eWgomzZsSRPT+7576D+oJ5WrVqRu9Rs5dvQ4AAMf6UPnO24FICgokOpXVCWyRjNOHM/8nZqHjnqThUtWUKJ4MaZ8deH9N9kjD9zC7fIpviLyOc5b0v+QgZXJ37onZye+gcYepWCvF0j8dx16JOVJMI69f1+wuAU1aI3jyAGkQKFcn7tJy0aUr1Ke7k3v5er6V/Hk6McY1OHhNHEDn7ufiR9PZt60+fxnzOPces8tTP1iOt+Nm8h34yYCcH3r6+h2f1dij8cSezyW/m0GAhAQEMDk1d+z8JfFF9yPFjffSOVqlWjW8FbqNazDy28MpVPrHmniVi1fy9zZC/h++qcp5n/03ud89N7nANzctjn9H+iVJYUXoHP71nTv2pEhI17Pku2Z9MkLLV+7sI4XAiKqoscOoScOgyORhL+WE1g90uv1Jbg4gVXrkLBxoV/kbtr2Bmb/MAeAP9dsITg0mJJlSqSJq39DPRb8vACAWZPmcGPb
G9LEtOrUgt+mzEszv0HTeuzftZ+ofYcuuB9t2rdg8nfTAFi7agNFi4ZQJqxUmrjNG/9i7540Z4Om0LFre6b9mOlroSRpGFmb0KIhWbY9kz7pubBObuWz4isi97ouTLxeRL4UkQ4islxE1orIbyIS5oprLiLrXI+1IhIiIjeJyAy3bb0vIn1cz18QkZUisklExouk4wocF9rX4GJo7NGkaY09hgQXTxMXULYaBXu/SIGuTyAlyybNz9fyHs4tmASa/qHhvshdKrwUh/YfTpo+fOAwpcJTFr3Q4kU5eeIkiYmOC8YUKFiAxjddy4KZi9LkaNmpBXM9FGV34RFlOLDvYNL0wf1RhEeU8fp1nFewUEFuanUDM6f9mu51Te6UKN4/ciufFF8RuQZ4DmipqnWBx4DFQBNVrQd8Bwx2hT8FPKSqkcCNQNwlNv++ql6rqrWAQsBtXuxP0iXnPl221VOEh3kpi5kjahdxHz3NmQnDiF/zGwW6PAJAQNW66OkYNCqj18vN+dyevq40dfH2HJRi8oY217Fx1WZij8emmB+UL4gb2lzP/BmXaI17yJGB7y9at2vOquVrs6zLwfheXmj5+qrPtyXwg6pGA6jqURGpDXwvIhFAfpxXjgdYArwpIl8DP6rq3ks0ZluIyGCgMFAC2AxMv9gK7pecO/1avzQfbz15DAlJ/tktIcXRk8dTBp07k/TUsWMjBARCoWACy1UnsHokgVXrIEH5IH9B8t96P+d+/vhiu5Tjubv07sRtPdoD8Ne6rZQpWzppWemI0hyJSnkRpxNHTxAcGkxgYACJiQ5KR5QmOlVMy46eW7dNWjTi741/cyz6WJpl9/a/m3vu7QrAhrWbiCgXnrQsvGwYUQcv3E1xIR263MLUyVnX5WB8LzcXVW/5qttBSHt69ns4W621gYFAQQBVHYPznOpCwDIRqQkkkHLfCwKISEFgLHCHazsfn1+WGY4DO5DiYUhoKQgIJKhmYxL/WZcyqEjRpKcB4VWcrba4k8QvmsyZcU9xZvxgzk4fh2P3X14X3pzM/dOEqfRvM5D+bQayaPYS2t7RBoCr61/FqZhTHDl0NM06a5euo/mtzQFod2cbFs9Jvth/kZAiRDapw+LZS9Os16pzS4/9wHtaRZEAACAASURBVABf/O87bml+J7c0v5PZP8+j690dAajXsA6xMSc5FBV9kXcrrZCQYJrc0JA5v8xP13omd9N0PHIrX7V85wI/ichbqnpEREoAoTgv5wbQ+3ygiFRT1Y3ARhG5DqgJrAauFpECOItrK5zdFucLbbSIBAN3AOkf3ZCaOjj321cUuONJCAggYeNi9Mh+gureBEDC+t8JuqIhQZEtwOFAE85xbnoWDT/yQe5lc5dzXcvGfLvkS87GnWH0k68lLXv1i1G88vQbHIk6wriXP2b42KHcN7gvf2/+h5+/TW5d3nhLU1YuXM2ZuDMptl2gYAEaNmvA68+8dcn9mPfrIlq0bsai1TOJizvDUw8PTVr2+fdjeeaxYUQdPEzfAd0Z9Gg/SpcpyZxFk5n32yKeeWw4AG1va8XC+UuJO32p3qr0eXrYGFau3cDx4zG06tyTB/v3omuHtlmaw1xYXri2g6Tpy8upxCK9gaeBRGAt8BPwFs4CvAy4VlVvEpH3gBauuD+BPqp6VkRexXmfpb+Bc8A0Vf1cREbivB7nTpy3BtmlqsO9HWrmqdvhcuDLe7jtjEt/V0JW+XfbVJ/lhsv6Hm6ZKp9jKvX0+nP67K6vcmWp9tk4X1WdAExINTvNJ0FVH7nA+oNJPijnPn8oMNTD/D4Z2lFjTK6TmKs7FLxz2ZxkYYzJO/LCATcrvsYYv+P/7V4rvsYYP2QtX2OM8YG8MNrBiq8xxu/YATdjjPEB63YwxhgfcFjL1xhjcp7/l14rvsYYP2TdDnmRw4ffqQG+O4Qb78N/zkfPnPRZ7hM9+vosd+jXnxEfvd1n+f351GbrdjDGGB9I9PUOZAErvsYYv6PW
8jXGmJxnfb7GGOMD1udrjDE+4P+l14qvMcYPWcvXGGN8IC9c28FXN9A0xpgMy+pbx4tIOxHZKiL/iMizHpaLiLzrWr5BROpn9jVY8TXG+B1Nx3+XIiKBwAfALcDVwD0icnWqsFuAGq7HAODDzL4GK77GGL+TxS3fRsA/qrpdVc8B3+G8Oa+7TsAX6rQMKCYiEZl5DVZ8jTF+x6Hq9UNEBojIKrfHgFSbK4fzTufn7XXNS29MutgBNy8FVKlF/lbdISCAhPULSVg+M+XyCldSoOuj6PFoABK2rSZh6bTkABEK9h6Gxh7j7OR30pe7siu3CAkbFpGwwkPuLo+gJ9xy/zE9Ze5eL6Anj3P2R+9yP/HSI1zfsjFn4s4w4olX2Lbp7zQxERXCGTH2BYoWD2Hrxr958dFRJMQnEBIazHNvDKZcpbKcO3uOl//zKtu37gTguTcGc/3NTTgWfZyerfpdcj9ee30YbdreRNzpMwwc+BTr121OE/O/T9+iXv06JMTHs2r1eh59+DkSEhLodlcnnnxyEAAnT53i8ceeZ9PGLV69fnf5GjaiyKBHkMAAzvzyM3ETv/EYF3RFTULfHkvsqBc5t3hBuvN4a+ioN1m4ZAUlihdjylfjsi1PbpaeA26qOh4Yf5EQTxdVSZ3Am5h0ydGWr4h09NSZ7bY8UkTaZ2P+4SLyVAZWJH/rXpyd9BZnPnmOoKsbIyXLpglz7NnGmc+HcebzYSkLLxDUsDWOIwcystPkb92Tsz+8xZlPhxJ01QVy7/2bMxOGc2bC8JSFFwhqkL7c17VsTIUq5bizaU/GPPMGg0c/4THuoecG8t3Hk+jWtBexJ2LpcI/zT9f7kR5s2/wPvVrfx0uPjeaJlx5JWufnibN4osczXu1Hm7Y3Ua16ZerWbsEjD/+Xt98Z6THu+++nUj+yFY2ubUehggXp0/cuAHbt3EO7tnfRpPEtvDLmPd57f5TX70GSgACCH3qcmKGDOXZ/bwq0aEVgxUoe4wr3H0j86pXpz5FOndu3Ztybnt+Ly0VW9vnibMVWcJsuD+zPQEy6ZLj4uo7+pWt9VZ2mqmMuEhIJpKv4iki2t94DIqqixw+hJw6DI5GELSsIrFHP6/UlpDiBVeuSsH5hxnIfc8v913ICq0d6nzu4OIFV65Cw0fvczdrewC8/zAFg85otBIcWoWSZEmniGtxQj/k/O1t4MyfNplnbpgBUvqIyqxavAWDXv3sILx9G8VLFAVi3fAMxx2O82o/bbmvNt1//CMDKlesIDS1KWHjpNHFzZv+e9HzVqvWUK+fsilu+fA3HXblWrlhLuXLhXuV1F3TlVSTu34fj4AFISODs7/PIf13TNHEFO93OucULcBw/lu4c6dUwsjahRUOyPU9ulsV9viuBGiJSRUTyA3cD01LFTAPuddW9JsAJVc1AaypZuoqniFQWkS0iMhZYA/QSkT9EZI2ITBKRYFdcexH5S0QWu4ZnzHDN7yMi77ue3ykim0RkvYgsdL3ol4C7RGSdiNwlIkVE5FMRWSkia0Wkk9t2JonIdGCOa97TrrgNIvKi2z4/5xpC8htwZUbeJAkpjsYcTZrW2KNIcPE0cQHlqlOw74sUuPMJpFRy6zRfq3s49/tE0PSfkS7BxdBY99zHPOcuW42CvV+kQNcnUrSM87W8h3MLJoF6/wupdHgpovYfSpo+fCCa0uGlUsSEFi/KyRMnSUx0vqZDBw4nxfzz57/c1L4ZAFdH1iS8fDhlItIWzUuJKBvG3r3J/7737ztA2bIXLqBBQUHc070Lv85J+5P/3t53McfD/EsJKFkKx+Hk98IRfZiAUqXSxBS4/kbO/Jz682qyiwP1+nEpqpoAPAzMBrYAE1V1s4gMEpFBrrCZwHbgH+Bj4MHMvoaMtBqvBPoCLwA/Ajer6ikReQZ4UkReBT4CmqnqDhH59gLbeQFoq6r7RKSYqp4TkReAhqr6MICIjALmqWo/ESkGrHAVUYDrgDqqelRE2uAc
AtIIZ9/MNBFpBpzC+S1Wz/Va1wCrU++IqwN+AMB7Xa6jX2NvanTKP6ojahdxHz4F8WcJqFqHAl0e5czHzxJQrS56KhaN2oVUyEjtv3RXkyNqF3EfPe3MXaU2Bbo8wplP/ktA1bro6Zj055a0OVPXbvEQcz7oi/e/4YmXHmbCnI/596/tbNv0N4mJ6b8IoKccepEvkbfeGcGSxStYujTlT/9mzZrQu3c3Wt98Z7r3wdN7kfrzXGTQI5z630fgyAuXe/EPWX1VM1WdibPAus8b5/ZcgYeyMmdGiu8uVV0mIrfhHBO3xPUhyQ/8AdQEtqvqDlf8t7gKWypLgM9FZCLOIu5JG6CjWz9tQaCi6/mvqnrULa4NsNY1HYyzGIcAP6nqaQAR8dg0ce+QP/1K3zR/VY09hhRN/tktISXQk8dTBp07k/TUsX0DtOkFhYIJLFeDwBqRBFargwTmgwIFyX/bAM7NuFj/v1vuk8eQEPfcxS+ee8dGCAh05a5OYPVIAqvWQYLyQf6C5L/1fs79/HGaPF17d6Zjj1sB2LLuL8LKlklaVjqiFNFR0Snijx89QXBoMIGBASQmOigTUZrDUUcAOH3yNC8/+WpS7I/LvmX/bu9+oQ0Y2Is+fe8GYPXqDZQvnzyap2y5CA4ciPK43n+HPEqpUiXo/vCQFPOvqVWT98eO4fbOfTl69LjHdS/GEX2YgNLJ70VAqdI4jqR8L4KuuJKQ/77gXB4aSv5GTTiZmMi5PxanO5/xTl74mstI8T3l+r/gLID3uC8UEa86Q1V1kIg0Bm4F1omIp45MAbqq6tZUORq77cf5uNGq+lGquMfJgmtwOA7sQIqXQUJLobHHCLqqEWenf5QyqEhROOXsXwyIqOJsMcWdJH7hD8Qv/ME5v8KV5GvUzuvCm5w7LDl3zcacnXGR3OFuuRdNJn7R5OTc17bzWHgBJk+YwuQJUwC4vlUT7ujTmV+nzuOa+ldxKuYURw4dTbPOmqVraXFrc36bNp/2d7Zl0ZwlAAQXLcKZuLMkxCfQsfutrFu+gdMnT3v1esd/9CXjP/oSgLbtWjBw0L1MmjSda6+NJCYmlqiDh9Os07vPXbS6uRm3te+RomVcvnxZvvn2Q+7v/yT//LMjzXreSNj6F4HlyhMQFo7jSDQFbmpJ7JgRKWKO9b476Xnwf57l3PI/rPBms8QMdOHlNpk5WLUM+EBEqqvqPyJSGOcRwL+AqiJSWVV3And5WllEqqnqcmC5iHTAeSQxFmdr9bzZwCMi8oiqqojUU9W1HjY3GxghIl+r6kkRKQfEAwtxtq7HuF5rB5xdIumjDs79+jUFuv0HJICEjYvQ6P0ERd4EQMK63wm68lqC6rUARyKaEM+5aVk0BEgdnPvtKwrc8aRzmNvGxeiR/QTVdeVe/ztBVzQkKLIFOBxowjnOTc9c7qVzl3F9y8ZMWvIVZ+POMvLJV5KWvfHFaEY//TrRUUf44OXxjBj7PAMH92fb5r+Z/q3zV1vlGpV44Z3/4kh0sGPbTkY99VrS+i9+MJT610VSrEQoU1dN5JPXP2fjh2M97sfsWfNp27YFGzb9TtzpOAYNGpy0bPJPn/LQg89y8MAh3nl3JLt372Pe784fUNOmzmLM6Pd4dsijlChRnLfecRbLhIQEmjVNPXb+EhyJnPzgbUJHvQ4BAZyZM5PEXTspeGtHAJ/08z49bAwr127g+PEYWnXuyYP9e9G1Q9sc3w9f8v/SC3KxPrQ0wSKVgRmqWss13RJ4BSjgChmqqtNcxfQ1IBpYAYSpag8R6YOrT1dEfsTZNSDAXOBxoDjOQpoPGI3zCOPbwPWuuJ2qepv7dtz27THgPtfkSaCnqv4rIs8B9wK7cA4X+VNVX7/Qa/TU7ZBjfHgPt1ZvZ6xlmBU2Htvps9w7b6x46aBsEvr1Zz7LDT6/h1um/rHfVvFWrz+nM3b/7LsP1kWkq/h6vVGRYFcLVHCeM/23qr6V5YmygRXfnGfF1zf8ufi2
r9je68/pzN0zc2Xxza6TLO4XkXXAZiCUjPzUN8aYC1BVrx+5VbacoOBq5fpFS9cY43/yQp+vXdvBGON3EvNA+bXia4zxO7m5O8FbVnyNMX7H7uFmjDE+kNWnF/uCFV9jjN9xWLeDMcbkvLxw92IrvsYYv2N9vnmQVKvmu+QBvrulXqL+67Pc8Y4En+UuMriHz3L7Wnz0dp/lzuzZdTbawRhjfMBavsYY4wM22sEYY3zAuh2MMcYHLveLqRtjjE9Yn68xxviA9fkaY4wP2BluxhjjA9byNcYYH7ADbsYY4wPW7WCMMT5g3Q6XkSXb9vHqz6twOJQuDavTr3mtNDErtx/ktZ9XkeBwULxwAf53f1vOxifS7+PZxCc6SHA4uPmaSjx4c9305d66j1dnrHDmvrYG/W6q7Tn3jBUkJDooXqQg/xvQzpl7/C/EJ7hy16rMg60jM/weAPxnxKNc37IxZ+LO8tITo9m68e80MXf27cLd991BhSrlaV2rIyeOnshwvjffeJF27Vpy+nQc993/JOvWbUoT8/nn79Kgfh3i4xNYuWodDz30LAkJCXS4rQ3Dhj2Fw+EgISGRp54eztKlK73Ku2TzDl6dNB+HKl2ur0W/to3TxKzctofXfpjves8L8b8n72Jn1FEG/29GUsy+6BM8cNv19GzZwOvXvHjZKsa8PY5Eh4OuHdpxX69uKZafiInl+dFvsWffAQrkz8+IIU9Qo2plANp07U2RwoUJCAggMDCQiZ++63Vebwwd9SYLl6ygRPFiTPlqXJZuOz2s5esnRKQycL2qfpOR9RMdDkZPX8G4vjcTVrQwPT78heZXladamWJJMTFx5xg9bQUf9GlFRLEiHD0ZB0D+oAA+7t+awgXyEZ/ooO/4WTS9oix1Kpb2Pve0ZYzr38aZ+4OfaX5VBaqFpco9dRkf9L2ZiGLBKXPf1zY597hfaHplOa9zp3Z9y8ZUqFKerjf0oFb9q3lm9JP0u+2BNHHrV25i8a9/8OHktzOU57x2bVtQvXoVrr7mRho1qsd7747ixmYd08R99+1P9OnzKABffPE+/frew/iPv2Te/MVMnzEHgFq1avLN1x9Sp26LS+ZNdDgY/f1cxj16B2HFQujxytc0r1OdahElk2JiTp9h9He/8cHDXYkoUZSjsacBqBxWgolD7k3aTpshH9Gybg2vX3NiYiIj3/iAj98eRXiZUtx132O0aNqYalUqJcV8/MX31KxRjXdHv8D2XXt4+Y0P+N+7Y5KWf/reGIoXC/U6Z3p0bt+a7l07MmTE69myfW/lhZav7y6jlbMqA90zuvKmvUeoUCKE8iVCyBcUSNs6lfh9y54UMb+s30HLayoQUawIACWCCwEgIhQukA+AhEQHCYmKSDpy74mmQsmiybnrVkmbe912Wl5TkYhiwRfP7XCQjtRpNGvblJk/zHbu15o/CQkNpmSZEmnitm36mwN7D2Yik1OHDm346uvJAKxYsZZixYoSHl4mTdys2fOTnq9auY5y5SMAOHXqdNL8IkUKe31K6qadB6lQuhjlSxVzvucNruT39f+kiPll5V+0jKxBRImiAJQIKZxmO8v/2k35UsUoW7KoV3kBNm7ZRsXyZalQLoJ8+fJxS6vmzFu0LEXMvzt306SB89dT1UoV2Hcgiuijx7zOkRkNI2sTWjQkR3JdjKrD60du5dctXxG5F3gKUGADkAjEAA2BcGCwqv4AjAGuEpF1wATXre29dijmNOGhRZKmw4oWYeOe6BQxu47EkJDooP8nczh9Np7u19ekQz3n5SkTHQ7u+WAme47GclfjK6ldwfuWZ9rchdm453DK3NExJDgc9B8/y5n7hqvpUN8t9/sz2HMklrua1KR2Blu9AGXCSxG1/1Dyvu0/TJnw0hw5dDTD27yYsmXD2bt3f9L0vn0HKFs2nIMHD3mMDwoKonv32/nPU8OT5nXs2I6RI56hdOlSdO7S26u8h46fJLx4coEJKx7Cxp0HUsTsOnSM
hMRE+r/1PafPnKN7i/p0aHJNipjZq//iloY1vcqZlPtwNOFlkv9GYWVKsXHz1hQxV1avym8LllK/bi02/rmVA1GHiDoUTakSxRERBjzxHCLCnZ1u4c5O7dOV31/khdEOftvyFZFrgOeAlqpaF3jMtSgCaArchrPoAjwLLFLVSE+FV0QGiMgqEVn1v1/T9gl6ajClbr0mJipb9h/l/XtbMLZPK8bP38iu6BgAAgMCmPjIbcwe3JVNe6P5J8r7VoqntpqkSp7ocLBl3xHe79OKsf1aM37eenYdPpGc+9GOzH72Tmfug5loIXlosmfnBU5Sv85L5Xv33ZdZvHg5S5asSJo3bdos6tRtwZ3d7mP4sKe8yuvpJ23qPUl0ONiy+xDvP3g7Yx/pyvhflrErKvlLKD4hkQUb/qV1/Su8ypmU24t/a/f1upOY2JN07f0QX/8wjZo1qhEYGAjAlx++waTP3ufDN0bw7Y8zWLVuY7ry+wsH6vUjt/Lnlm9L4AdVjQZQ1aOuD+sUdf7W+FNEwrzZkKqOB8YDxP0wMs1fKyy0MAdPnEqajoo5RemihdLEFCtSgEL581Eofz4aVC7D1gPHqFQq+Sdn0UL5aVgljCXb9lM9rLhXLzKsaOrcpyldNOVP3LDQIhQrUjA5d5Uwth48RqXSyf1+ybn3UT3cu9wAd/TpTOcetwHw57qthJVN/tlfpmxpDkdFX2jVDBk0sDf9+t0DwKrV6ylfvmzSsnLlIjhwIMrjes899zilS5Wk20PPely+ePFyqlatRMmSxTly5OJfQGHFQjh4LDZpOupYLKVDg1PFBFOsSCEKFchHoQL5aFC9PFv3HaZSmLMbZvHmHdSsEEbJokVIj7AypTh4KPmXTdShaEqXKpkiJrhIEUY+9yTg/DJqe0cfypd1/lMvU9oZW7J4MVo1u56Nf26lYWTaA7T+Li9c1cxvW744GyOe/gJnU8Vk2jXlSrL7SCz7jsYSn5DI7A27aF6zQoqYm66qwNqdh0hIdBB3LoGNe6KpWqYoR0+dISbuHABn4hNY/u9BqpT2/mDINeVLsTs6Jjn3+h00v6p8ytxXV2DtzqiUuUuHcvRk6twH0pUb4IfPp9Cz9X30bH0fC2Ytov0dbQGoVf9qTsacyvIuh3EfTaBR43Y0atyO6dNm07NHVwAaNarHiROxHrsc+va9m9Y3N6fXvQ+n+FBWc40AAIiMrEW+fPkvWXgBrqkUzu5Dx9kXfcL5nq/eSvM6Ke9wclOd6qz9d5/rPY9n484DVA1PLpKzVv1Fu2vT1+UAUKvmFezeu5+9+w8SHx/PL3MX0KJpkxQxMbEniY+PB2Dy9Fk0iKxNcJEinI47k9TPfTruDEtXrEkaBZHXOFS9fuRW/tzynQv8JCJvqeoREUl75CdZLJDhowRBgQE826ERD3w+F4cqnepXp3pYMSYt3wbAnY2voGqZUK6/oizd3puBCHRpWIPqYcXZdvAYz/+wBIfD+Q+hTe3KNKtZ/hIZU+Xu2JgHPv0Nhzro5NrupOVbXbmvpGqZYlx/RTm6vTsNEXHmDi/OtgNHeX7SkqR/hG1qV6bZVRUukfHClsxdxvWtmvDj0m84E3eWEU8kH2F/68tXePmpV4mOOkK3/l3p9cDdlCxTgm9++5Sl85bx8lOvpTvfL7Pm0a5dS7b8uZjTp+O4f8B/kpZNnTKBQQ8M5sCBKN5/bzS7d+9j4YIpAEyZ+gujRr1D5y630LNHV+LjE4iLO0PPXg96lTcoMIBn72rJA+9PxuFw0Om6WlQvW4pJC9cDcGezulSNKMn1V1em28sTnO/5DbWpXrYUAHHn4ln21y6Gdm+d7tccFBTIkCceYOCTQ0lMTKTLbW2oXrUS3//0MwB3dbmV7bv2MGTE6wQGBFC1ckVe+u/jABw5eozHhowAIDEhkfZtbqJpk4bp3oeLeXrYGFau3cDx4zG06tyTB/v3omuHtlmawxs5NdrBVVe+x3nQfifQTVWPpYqpAHyB
8ziTAxivqu9cctv+3HwXkd7A0zgPtK11zZ7hOsiGiJxU1WARyQfMAkoBn1/sgJunbocc48N7uDV/eLbPcq8/6rt7iR3/+Xmf5Q6qe7PPcvtavlJVM/WrNCy0ptef06gTf2U4l4i8ChxV1TEi8ixQXFWfSRUTAUSo6hoRCQFWA51V9c+LbdufW76o6gRgwkWWB7v+Hw+0yqn9MsZkrxwc7dAJuMn1fALwO5Ci+KrqAeCA63msiGwBygEXLb7+3OdrjLlMpafP1300k+sxIB2pwlzF9XyRTTvQ3I3rhK56wPJLbdivW77GmMtTerpL3UczeSIiv+Hsr03tufTsk4gEA5OBx1U15lLxVnyNMX4nK8fvquoFO99FJEpEIlT1gKtv1+MZPq7jSpOBr1X1R2/yWreDMcbvqKrXj0yaBpw/NbI3MDV1gDhPMPgfsEVV3/R2w1Z8jTF+J1EdXj8yaQzQWkT+Blq7phGRsiIy0xVzA9ALaCki61yPS57Xbd0Oxhi/k1MnT6jqETyMlFLV/UB71/PFZOCELiu+xhi/48/nJ5xnxdcY43fywvV8rfgaY/yOtXyNMcYH8kLx9etrO+RGIjLANajbcltuy20uyIaaZb30nLpouS235b5MWfE1xhgfsOJrjDE+YMU36/myH8xyW+7LIXeeYAfcjDHGB6zla4wxPmDF1xhjfMCKrzHG+IAVX2PMJYnIY97MM96zA25ZQETCgUaAAitV9WAO5q4PNHXlXqKqa3IwdzHgXpy31U46VV1VH82pfcgpInL7xZZ7e/eCTOR/8hL5vb6Idwbzr1HV+qnmrVXVetmZNy+zaztkkojcB7wAzMN5Tc/3ROQlVf00B3K/ANwJnP/gfyYik1R1ZHbndpkJLAM2Atl+O1kRiYULX85KVYtmY/oOF1mmJP8NskuI6/9XAtfivMMCOPdrYXYlFZF7gO5AFRGZ5rYoBDiSXXkvB9byzSQR2Qpc77roMiJSEliqqlfmQO4tQD1VPeOaLgSsUdWrsju3K1+a1lAO5X0JOAh8ifMLrwcQoqqv5vS+5DQRmQN0VdVY13QIMElV22VTvkpAFWA08Kzbolhgg6omZEfey4G1fDNvL85/iOfFAntyKPdOoCBwxjVdAPg3h3IDfCki9wMzgLPnZ6rq0WzO21ZVG7tNfygiy4EcKb4icitwDc73HgBVfSkncgMVgXNu0+dwdvtkC1XdBewCrsuuHJcrK76Ztw9YLiJTcf787ASsON9Hl819cWeBzSLyqyt3a2CxiLzryp3dfa/ngNdw3mL7/E8oBapmc95EEekBfOfKdw+QmM05ARCRcUBhoAXwCXAHsCIncrt8ifPf1084X3sX4IvsTurq834FKIPz14YAms1dPXmadTtkkogMu9hyVX0xG3P3vthyVZ2QXbld+f8FGqtqdHbm8ZC3MvAOzhsXKrAEeFxVd+ZA7g2qWsft/8HAj6raJrtzu+1DfeBG1+RCVV2bAzn/ATqo6pbsznW5sJZvJmVncfUid7YWVy9sBk7ndFJXke2U03ld4lz/Py0iZXEedKqSw/tQGIhR1c9EpLSIVFHVHdmcM8oKb9ay4ptJItIQ58/uSqQcblUnB3LfBoxwy53TPwUTgXUiMp+Ufb7Z2t0hIlcAHwJhqlpLROoAHXNolMcM1xC714A1OFven+RAXiDpl1ZDnKMePgPyAV/h/BWQnVaJyPfAFFL+rbN7lEeeZd0OmeQa7fA0qYZbuQ5UZHfuf4DbgY3qgz/khbo9cqC7YwHO9/yj8+NMRWSTqtbKzrwe9qMAUFBVT+RgznVAPZyjWs6/9g3Z/WUvIp95mK2q2i878+Zl1vLNvMOqOu3SYdliD7DJF4UXfNrtUVhVV4iI+7wcGfIkIvd6mIeqZvtBL5dzqqoioq7cRXIobwDwmKoed+UtDryRQ7nzJCu+mTdMRD4B5pLzP8cGAzNdLUH33Nl6ttN5IrIDDyc9qGp2j3aIFpFq53OLyB3AgWzO
ed61bs8LAq1wdj/kVPGdKCIfAcVcw/z6AR/nQN465wsvgKoeExE7uy0TrPhmXl+gJs6+t/PdDjlx5ilEXgAACKJJREFUxhPAy8BJnEUgfw7kS62h2/OCOM+2K5EDeR/CeTHvmvL/9s4/5sqyjOOf7wtEhMJyUVFCQxOINJjolMRMZmw4aysjZ5bmXFtR2ipXmcs5HbWMWsZa2pYolS1p09INAZkFJGyCIKhpWzIcWasWiEPkR377477POOBrr3HOcz/vc97rszF2nvOefe+9HK7nfq77ur6X9FdgO/CpArrYvrr9taSxpPKvItheJOlDwB5S3vcG26sKSPdJerPtXQCSTiDiR0dEzrdDJG2zfVpN2httnzHwT5ZD0jrbswtpjQb6Wt1edSBpBKnTq/KuQknDgBW2L6haqx/ty4HrgN+QNhefABbaLnbj6TXiztU5GyRNs/1UDdoPSZpre2UN2q160xZ9pJ3w8a/x493UfRvwbeAdtudJmgbMsv2zAtr3czjV0gdMA+6pWhfA9n8kvSRpbMlDvqy9VNJGYA6pquZjNX3ne4bY+XZI9lc4mfTou5/D5V4lSs1eBEaTOs0OtGkXKTXLJWatL9AhUrvzItt/rlh3OanM6nrb0yUNBzaXeAKRdF7by0PADts7q9Zt078HOBtYBextXe9FJ7leJ4Jvh2TjkVdRotSsbiS9EbiYIy0lXbXPgaRHbZ/ZbmkoaYvtGVXqZp3RwD7br+R646nActsHq9bO+rWU9wXdJ9IOHWJ7h6TZwCmtjiPguBLaSrVWlwGTbN8saQIw3nYpr4H7gN2k0/6XB/jZbrI3u8e1qh3OBko9hq8Bzs2lVquBjcAlpH+HyrF9V3avm2j7mRKaQTXEzrdD2juObE/OLafLbFfdcYSkn5AqLObYfk8OCCttnznAR7ulX7yxIeueDiwGTgWeAMYBH7e9tYD2Y7ZPl3Q1MMr2LSVNxSV9GFgEvMH2JEkzgJtsf6SEftA9YufbOR8ldxwB2H4+e6yW4KwcCDZn7V2SSpacPSLpNNvbCmpi+7Gce51CynM/U+qxn/TAMYu0070qXyv5/+hG0tSU3wPY3iKptLdE0AUi+HZOXR1HAAdz+VFLexxlJkpsy5rDgSslPUvBw8aca17A4fFJayXd5mwqXzFfIpVc3Wv7SUknAQ8X0G1xyPYLR3X3xeNrA4ng2zl1dRwB/Ai4F3irpIUkb9lvFdC9qIDG/2IpybR+cX59KanRYX7VwrbX0Da2x/azQMlKgyckfRIYJumUrP1IQf2gS0TOt0MkfRd4CJhL2vmtAC6w/fVC+lNJLa4CVg8F2z9Jj9uePtC1irTHkdq6j55kMadq7az/JpKLXss/eAVws+39r/2pYDASwbdD1P9U18pdprLOz21/eqBrvYakO4HbbG/Ir88CrrC9oID2SuDXwLXA54ArSOZKpW62820vG+haMPiJ4HuMSPo8Ke94EkfOTTueNMK9cq+BowN/bjbYanta1dp1khtbpgDP5UsTgT+R8t2V5pwlbbI9s/0GK+kPts8b6LNd0u/vZl/LINOgMyLne+zcDSynn6murniApKTrgG8CoyTtaXvrIMlwptepZFLv66RVVfE3pUGazwMnVi0qaR5wIfBO5Rl9mTEUstMMukvsfBuMpO+QJvZO5nD+0flQqGfJdpI7be+X9EHgfcDSdsvDCrUvAtYCE0gHfmOAG23fX7HudGAGcBNwQ9tbLwIPt9zGguYQwbfB5OqKa0g7ry2knv/1pQ5/6kJpmsMZpLbmFcDvSE0uFxbQvosjTcVPIPlZFJnoIGlEq6Y5N9VMKNFcEnSfvroXEHTENSRz7x22zyc1e/yz3iUV4RXbh0gjlH5o+8vA+ELaR5uK/5v0ey/FKkljctB/HFgiqYh5ftBdIvg2m5dbjQWSRtp+mnQQ1esclHQpcDnwQL42opB2X95xArWYio+1vYd041lieyZQ3N836Jw4cGs2O5Um6d5H2hHt
Ih0A9TpXksq8Ftrenttrf1FI+/uktuojTMULaQMMlzQ+615fUDfoMpHz7RGy18FY4EHbB+peTy+TzdtbpuKrS5qKS5pP6mJcZ3tBbm/+nu2LS60h6A4RfIPG0OYp0S8lGluCoFtE2iFoEi1PiS/kv1vzwy4DXiq/nHJI+lq2r1xM/xOjY5JFw4jgGzSG1nQQSecc5Zf8DUl/JNXA9iotz46Nta4i6BoRfIMmMlrSbNvrACS9nzTLrmdpNXHEuKDeIYJv0ESuAu6QNDa/3k2y8ux5dOT05BYvkHbEtxfyNA66QBy4BY1F0hjSd7joGPU6kXQraWzSr/KlS4C/A6OAMb3uaNdLRPANGoekkbx6ajJVT00eDEhaY/sD/V2T9KTt99a1tuD/I9IOQRP5LelRexNpfNFQYpykibafA5A0EXhLfi/quxtEBN+giZxou05byTr5KrBO0l9ITR6TgAV5dmAcxjWISDsEjUPST4HFpacmDxZy2mUqKfg+HYdszSSCb9A4JD0FvBvYTsGpyYOBPMPtK8C7bH82D9GcYvuBAT4aDDIi7RA0kXl1L6BGlpBy3bPy653AMg67uwUNISwlg8aQS8sgTW/o789Q4GTbt5DHGdneR9r5Bw0jdr5Bk7ib5O+widRo0B50TBpm2usckDSK3GiRRyoNtYqPniCCb9AYbLeMddYBa4C12UB+SCBJwG3Ag8AESb8EzgE+U+e6gmMjDtyCxiFpDjAbOJe0291MCsS31rqwAkjaBMwlzesTsMH2v+pdVXAsRPANGomkYaT5deeTplrssz213lVVj6QfA3fafrTutQSdEcE3aBySVpNczNaTxrivs/2PeldVhlxmNxnYAexlCJXZ9RqR8w2ayFZgJnAqqc14t6T1+eS/1xnKZXY9Rex8g8Yi6TjSMM1rgbfbHlnzkoLgdRM736BxSPoi6bBtJunx+w5S+iEIGkME36CJjAJ+AGyyfajuxQTBsRBphyAIghqI9uIgCIIaiOAbBEFQAxF8gyAIaiCCbxAEQQ38F67lKTpRZhFSAAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 432x288 with 2 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "# Correlation matrix of the continuous features and the ride-count targets\n",
    "corrMatt = rides[['temp','atemp','hum','windspeed','casual','registered','cnt']].corr()\n",
    "\n",
    "# Boolean mask hiding the redundant strict upper triangle of the symmetric matrix\n",
    "mask = np.triu(np.ones_like(corrMatt, dtype=bool), k=1)\n",
    "sn.heatmap(corrMatt, mask=mask, vmax=.8, square=True, annot=True)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 3. If strong correlations are found between features, what measures should be taken when using a linear regression model?\n",
    "1) On the data side: drop one feature from each highly correlated pair, or use PCA to project the features onto uncorrelated principal components.\n",
    "\n",
    "2) On the model side: add a regularization term, i.e. replace ordinary least squares with ridge regression (L2) or Lasso (L1), which keeps the coefficients stable under multicollinearity."
   ]
  },
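  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of both remedies (assuming a feature matrix `X` and target `y` are already in memory; the names here are illustrative, not taken from the course code):\n",
    "\n",
    "```python\n",
    "from sklearn.decomposition import PCA\n",
    "from sklearn.linear_model import Ridge\n",
    "\n",
    "# Remedy 1: project correlated features onto uncorrelated principal components\n",
    "pca = PCA(n_components=0.95)   # keep enough components for 95% of the variance\n",
    "X_pca = pca.fit_transform(X)\n",
    "\n",
    "# Remedy 2: keep all features but shrink the coefficients with an L2 penalty\n",
    "ridge = Ridge(alpha=1.0).fit(X, y)\n",
    "```"
   ]
  },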
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 4. Regularized models and stochastic-gradient-descent optimizers both require the (continuous) input features to be rescaled to a common range. The course code showed the result with standardization (StandardScaler); switch the preprocessing to min-max scaling (MinMaxScaler) and retrain the ordinary least squares, ridge, and Lasso models."
   ]
  },
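  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For reference, the two scalers differ only in the mapping applied to each feature column: StandardScaler computes $x' = (x - \\mu) / \\sigma$ (zero mean, unit variance), while MinMaxScaler computes $x' = (x - x_{min}) / (x_{max} - x_{min})$ (values squeezed into $[0, 1]$). Both remove the feature's unit; only the resulting range differs."
   ]
  },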
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### One-hot encode the categorical features"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 43,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>season_1</th>\n",
       "      <th>season_2</th>\n",
       "      <th>season_3</th>\n",
       "      <th>season_4</th>\n",
       "      <th>mnth_1</th>\n",
       "      <th>mnth_2</th>\n",
       "      <th>mnth_3</th>\n",
       "      <th>mnth_4</th>\n",
       "      <th>mnth_5</th>\n",
       "      <th>mnth_6</th>\n",
       "      <th>...</th>\n",
       "      <th>weathersit_1</th>\n",
       "      <th>weathersit_2</th>\n",
       "      <th>weathersit_3</th>\n",
       "      <th>weekday_0</th>\n",
       "      <th>weekday_1</th>\n",
       "      <th>weekday_2</th>\n",
       "      <th>weekday_3</th>\n",
       "      <th>weekday_4</th>\n",
       "      <th>weekday_5</th>\n",
       "      <th>weekday_6</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>5 rows × 26 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "   season_1  season_2  season_3  season_4  mnth_1  mnth_2  mnth_3  mnth_4  \\\n",
       "0         1         0         0         0       1       0       0       0   \n",
       "1         1         0         0         0       1       0       0       0   \n",
       "2         1         0         0         0       1       0       0       0   \n",
       "3         1         0         0         0       1       0       0       0   \n",
       "4         1         0         0         0       1       0       0       0   \n",
       "\n",
       "   mnth_5  mnth_6  ...  weathersit_1  weathersit_2  weathersit_3  weekday_0  \\\n",
       "0       0       0  ...             0             1             0          0   \n",
       "1       0       0  ...             0             1             0          1   \n",
       "2       0       0  ...             1             0             0          0   \n",
       "3       0       0  ...             1             0             0          0   \n",
       "4       0       0  ...             1             0             0          0   \n",
       "\n",
       "   weekday_1  weekday_2  weekday_3  weekday_4  weekday_5  weekday_6  \n",
       "0          0          0          0          0          0          1  \n",
       "1          0          0          0          0          0          0  \n",
       "2          1          0          0          0          0          0  \n",
       "3          0          1          0          0          0          0  \n",
       "4          0          0          1          0          0          0  \n",
       "\n",
       "[5 rows x 26 columns]"
      ]
     },
     "execution_count": 43,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "categorical_features = ['season','mnth','weathersit','weekday']\n",
    "\n",
    "# Cast to object dtype so that pd.get_dummies recognizes these columns as categorical\n",
    "for col in categorical_features:\n",
    "    rides[col] = rides[col].astype('object')\n",
    "    \n",
    "X_rides_cat = rides[categorical_features]\n",
    "X_rides_cat = pd.get_dummies(X_rides_cat)\n",
    "X_rides_cat.head()"
   ]
  },
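  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Note: encoding all levels of a categorical feature makes its dummy columns sum to 1, which is collinear with a model intercept (the \"dummy variable trap\"), itself a form of the feature correlation discussed in question 3. If desired, one level per feature can be dropped; a sketch, not what the course code does:\n",
    "\n",
    "```python\n",
    "X_rides_cat = pd.get_dummies(rides[categorical_features], drop_first=True)\n",
    "```"
   ]
  },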
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Rescale the numerical features with MinMaxScaler"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 44,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>temp</th>\n",
       "      <th>atemp</th>\n",
       "      <th>hum</th>\n",
       "      <th>windspeed</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>0.355170</td>\n",
       "      <td>0.373517</td>\n",
       "      <td>0.828620</td>\n",
       "      <td>0.284606</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>0.379232</td>\n",
       "      <td>0.360541</td>\n",
       "      <td>0.715771</td>\n",
       "      <td>0.466215</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>0.171000</td>\n",
       "      <td>0.144830</td>\n",
       "      <td>0.449638</td>\n",
       "      <td>0.465740</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>0.175530</td>\n",
       "      <td>0.174649</td>\n",
       "      <td>0.607131</td>\n",
       "      <td>0.284297</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>0.209120</td>\n",
       "      <td>0.197158</td>\n",
       "      <td>0.449313</td>\n",
       "      <td>0.339143</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "</div>"
      ],
      "text/plain": [
       "       temp     atemp       hum  windspeed\n",
       "0  0.355170  0.373517  0.828620   0.284606\n",
       "1  0.379232  0.360541  0.715771   0.466215\n",
       "2  0.171000  0.144830  0.449638   0.465740\n",
       "3  0.175530  0.174649  0.607131   0.284297\n",
       "4  0.209120  0.197158  0.449313   0.339143"
      ]
     },
     "execution_count": 44,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# These columns already look normalized (values lie in [0, 1]); apply MinMaxScaler anyway so each feature spans exactly [0, 1]\n",
    "from sklearn.preprocessing import MinMaxScaler\n",
    "mn_X = MinMaxScaler()\n",
    "numerical_features = ['temp','atemp','hum','windspeed']\n",
    "scaled = mn_X.fit_transform(rides[numerical_features])\n",
    "\n",
    "X_rides_num = pd.DataFrame(data=scaled, columns=numerical_features, index=rides.index)\n",
    "X_rides_num.head()"
   ]
  },
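  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A tiny self-contained check of the scaler on toy data (not the bike-sharing columns): the smallest value maps to 0, the largest to 1, and everything else lands in between.\n",
    "\n",
    "```python\n",
    "import numpy as np\n",
    "from sklearn.preprocessing import MinMaxScaler\n",
    "\n",
    "toy = np.array([[1.0], [3.0], [5.0]])\n",
    "print(MinMaxScaler().fit_transform(toy).ravel())  # [0.  0.5 1. ]\n",
    "```"
   ]
  },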
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Assemble the processed features into one table"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 69,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "<class 'pandas.core.frame.DataFrame'>\n",
      "RangeIndex: 731 entries, 0 to 730\n",
      "Data columns (total 35 columns):\n",
      "instant         731 non-null int64\n",
      "season_1        731 non-null uint8\n",
      "season_2        731 non-null uint8\n",
      "season_3        731 non-null uint8\n",
      "season_4        731 non-null uint8\n",
      "mnth_1          731 non-null uint8\n",
      "mnth_2          731 non-null uint8\n",
      "mnth_3          731 non-null uint8\n",
      "mnth_4          731 non-null uint8\n",
      "mnth_5          731 non-null uint8\n",
      "mnth_6          731 non-null uint8\n",
      "mnth_7          731 non-null uint8\n",
      "mnth_8          731 non-null uint8\n",
      "mnth_9          731 non-null uint8\n",
      "mnth_10         731 non-null uint8\n",
      "mnth_11         731 non-null uint8\n",
      "mnth_12         731 non-null uint8\n",
      "weathersit_1    731 non-null uint8\n",
      "weathersit_2    731 non-null uint8\n",
      "weathersit_3    731 non-null uint8\n",
      "weekday_0       731 non-null uint8\n",
      "weekday_1       731 non-null uint8\n",
      "weekday_2       731 non-null uint8\n",
      "weekday_3       731 non-null uint8\n",
      "weekday_4       731 non-null uint8\n",
      "weekday_5       731 non-null uint8\n",
      "weekday_6       731 non-null uint8\n",
      "temp            731 non-null float64\n",
      "atemp           731 non-null float64\n",
      "hum             731 non-null float64\n",
      "windspeed       731 non-null float64\n",
      "holiday         731 non-null int64\n",
      "workingday      731 non-null int64\n",
      "yr              731 non-null int64\n",
      "cnt             731 non-null int64\n",
      "dtypes: float64(4), int64(5), uint8(26)\n",
      "memory usage: 70.1 KB\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>instant</th>\n",
       "      <th>season_1</th>\n",
       "      <th>season_2</th>\n",
       "      <th>season_3</th>\n",
       "      <th>season_4</th>\n",
       "      <th>mnth_1</th>\n",
       "      <th>mnth_2</th>\n",
       "      <th>mnth_3</th>\n",
       "      <th>mnth_4</th>\n",
       "      <th>mnth_5</th>\n",
       "      <th>...</th>\n",
       "      <th>weekday_5</th>\n",
       "      <th>weekday_6</th>\n",
       "      <th>temp</th>\n",
       "      <th>atemp</th>\n",
       "      <th>hum</th>\n",
       "      <th>windspeed</th>\n",
       "      <th>holiday</th>\n",
       "      <th>workingday</th>\n",
       "      <th>yr</th>\n",
       "      <th>cnt</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>1</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0.355170</td>\n",
       "      <td>0.373517</td>\n",
       "      <td>0.828620</td>\n",
       "      <td>0.284606</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>985</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>2</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0.379232</td>\n",
       "      <td>0.360541</td>\n",
       "      <td>0.715771</td>\n",
       "      <td>0.466215</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>801</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>3</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0.171000</td>\n",
       "      <td>0.144830</td>\n",
       "      <td>0.449638</td>\n",
       "      <td>0.465740</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>1349</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>4</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0.175530</td>\n",
       "      <td>0.174649</td>\n",
       "      <td>0.607131</td>\n",
       "      <td>0.284297</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>1562</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>5</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>...</td>\n",
       "      <td>0</td>\n",
       "      <td>0</td>\n",
       "      <td>0.209120</td>\n",
       "      <td>0.197158</td>\n",
       "      <td>0.449313</td>\n",
       "      <td>0.339143</td>\n",
       "      <td>0</td>\n",
       "      <td>1</td>\n",
       "      <td>0</td>\n",
       "      <td>1600</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>5 rows × 35 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "   instant  season_1  season_2  season_3  season_4  mnth_1  mnth_2  mnth_3  \\\n",
       "0        1         1         0         0         0       1       0       0   \n",
       "1        2         1         0         0         0       1       0       0   \n",
       "2        3         1         0         0         0       1       0       0   \n",
       "3        4         1         0         0         0       1       0       0   \n",
       "4        5         1         0         0         0       1       0       0   \n",
       "\n",
       "   mnth_4  mnth_5  ...  weekday_5  weekday_6      temp     atemp       hum  \\\n",
       "0       0       0  ...          0          1  0.355170  0.373517  0.828620   \n",
       "1       0       0  ...          0          0  0.379232  0.360541  0.715771   \n",
       "2       0       0  ...          0          0  0.171000  0.144830  0.449638   \n",
       "3       0       0  ...          0          0  0.175530  0.174649  0.607131   \n",
       "4       0       0  ...          0          0  0.209120  0.197158  0.449313   \n",
       "\n",
       "   windspeed  holiday  workingday  yr   cnt  \n",
       "0   0.284606        0           0   0   985  \n",
       "1   0.466215        0           0   0   801  \n",
       "2   0.465740        0           1   0  1349  \n",
       "3   0.284297        0           1   0  1562  \n",
       "4   0.339143        0           1   0  1600  \n",
       "\n",
       "[5 rows x 35 columns]"
      ]
     },
     "execution_count": 69,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "#fields_to_drop = ['season', 'mnth','weathersit','weekday','temp','atemp','hum','windspeed','casual', 'dteday', 'registered']\n",
    "#rides = rides.drop(fields_to_drop, axis=1)\n",
    "#FE_rides = pd.concat([X_rides_cat,X_rides_num,rides],axis = 1)\n",
    "\n",
    "FE_rides = pd.concat([rides['instant'], X_rides_cat, X_rides_num,\n",
    "                      rides['holiday'], rides['workingday'], rides['yr'], rides['cnt']], axis=1)\n",
    "FE_rides.to_csv('FE_day.csv', index=False)\n",
    "FE_rides.info()\n",
    "FE_rides.head()"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Split the data into training and test sets"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 77,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Reload the assembled feature table\n",
    "df = pd.read_csv('FE_day.csv')\n",
    "\n",
    "# Separate the input features X from the target y\n",
    "y = df['cnt']\n",
    "X = df.drop(['cnt'], axis = 1)\n",
    "\n",
    "# Split into training and test sets\n",
    "from sklearn.model_selection import train_test_split\n",
    "\n",
    "# Randomly hold out 20% of the data as the test set; the rest is used for training\n",
    "X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=33, test_size=0.2)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Model fitting and comparison\n",
    "#### 1) Ordinary least squares linear regression"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 78,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "The r2 score of LinearRegression on test_dataset is 0.8279474225980329\n",
      "The r2 score of LinearRegression on train_dataset is 0.8516480637403496\n"
     ]
    }
   ],
   "source": [
    "from sklearn.linear_model import LinearRegression\n",
    "\n",
    "# 1. Instantiate the estimator with its default settings\n",
    "lr = LinearRegression()\n",
    "\n",
    "# 2. Fit the model parameters on the training data\n",
    "lr.fit(X_train, y_train)\n",
    "\n",
    "# 3. Predict on the test and training sets with the fitted model\n",
    "y_test_pred_lr = lr.predict(X_test)\n",
    "y_train_pred_lr = lr.predict(X_train)\n",
    "\n",
    "# Evaluate with r2_score on both sets and print the results\n",
    "from sklearn.metrics import r2_score\n",
    "print('The r2 score of LinearRegression on test_dataset is', r2_score(y_test, y_test_pred_lr))\n",
    "print('The r2 score of LinearRegression on train_dataset is', r2_score(y_train, y_train_pred_lr))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 2) Ridge regression (L2-regularized linear model)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 98,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "The r2 score of RidgeCV on test_dataset is 0.828974067758238\n",
      "The r2 score of RidgeCV on train_dataset is 0.8497700029725643\n"
     ]
    }
   ],
   "source": [
    "from sklearn.linear_model import RidgeCV\n",
    "\n",
    "# 1. Candidate values for the regularization strength alpha\n",
    "alphas = [0.01, 0.1, 1, 10, 100]\n",
    "\n",
    "# 2. RidgeCV picks the best alpha by built-in cross-validation\n",
    "ridge = RidgeCV(alphas=alphas, store_cv_values=True)\n",
    "\n",
    "# 3. Fit on the training data\n",
    "ridge.fit(X_train, y_train)\n",
    "\n",
    "# 4. Predict on the test and training sets\n",
    "y_test_pred_ridge = ridge.predict(X_test)\n",
    "y_train_pred_ridge = ridge.predict(X_train)\n",
    "\n",
    "# Evaluate with r2_score on both sets and print the results\n",
    "from sklearn.metrics import r2_score\n",
    "print('The r2 score of RidgeCV on test_dataset is', r2_score(y_test, y_test_pred_ridge))\n",
    "print('The r2 score of RidgeCV on train_dataset is', r2_score(y_train, y_train_pred_ridge))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "#### 3) Lasso (L1-regularized linear regression)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 99,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "The r2 score of LassoCV on test_dataset is 0.8283764110690385\n",
      "The r2 score of LassoCV on train_dataset is 0.8514566909611481\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "c:\\python\\lib\\site-packages\\sklearn\\linear_model\\_coordinate_descent.py:472: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 128155765.49747854, tolerance: 176331.28855032122\n",
      "  tol, rng, random, positive)\n"
     ]
    }
   ],
   "source": [
    "from sklearn.linear_model import LassoCV\n",
    "\n",
    "# 1. Set the hyperparameter search range\n",
    "#alphas = [0.01, 0.1, 1, 10, 100]\n",
    "# LassoCV can determine the largest useful alpha automatically, so an alternative is to set only\n",
    "# the smallest-to-largest alpha ratio (eps) and the number of candidates (n_alphas); LassoCV then\n",
    "# spaces n_alphas values evenly on a log scale between the two ends:\n",
    "# np.logspace(np.log10(alpha_max * eps), np.log10(alpha_max), num=n_alphas)[::-1]\n",
    "alphas = np.logspace(-3, 2, 20)\n",
    "\n",
    "# 2. Create a LassoCV instance (the default search range is used if alphas is omitted);\n",
    "# raising max_iter would silence the ConvergenceWarning messages\n",
    "lasso = LassoCV(alphas=alphas)\n",
    "#lasso = LassoCV()\n",
    "\n",
    "# 3. Fit the model (cross-validation happens inside fit)\n",
    "lasso.fit(X_train, y_train)\n",
    "\n",
    "# 4. Predict on the test and training sets\n",
    "y_test_pred_lasso = lasso.predict(X_test)\n",
    "y_train_pred_lasso = lasso.predict(X_train)\n",
    "\n",
    "# Evaluate the model on the test and training sets with r2_score and print the results\n",
    "from sklearn.metrics import r2_score\n",
    "print('The r2 score of LassoCV on test_dataset is', r2_score(y_test, y_test_pred_lasso))\n",
    "print('The r2 score of LassoCV on train_dataset is', r2_score(y_train, y_train_pred_lasso))"
   ]
  },
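  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A small self-contained sketch (synthetic data, illustrative names, not the bike-sharing features) of the alpha grid used above: `np.logspace(-3, 2, 20)` spaces 20 candidates evenly on a log scale between 1e-3 and 1e2, and `LassoCV` selects one of them by cross-validation, exposed afterwards as `alpha_`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.linear_model import LassoCV\n",
    "\n",
    "alpha_grid = np.logspace(-3, 2, 20)  # 20 candidates from 1e-3 to 1e2\n",
    "\n",
    "# synthetic regression problem with two irrelevant features\n",
    "rng = np.random.RandomState(0)\n",
    "X_syn = rng.randn(200, 5)\n",
    "y_syn = X_syn @ np.array([3.0, -2.0, 0.0, 0.0, 1.0]) + 0.1 * rng.randn(200)\n",
    "\n",
    "lasso_demo = LassoCV(alphas=alpha_grid, cv=5).fit(X_syn, y_syn)\n",
    "print('grid endpoints:', alpha_grid[0], alpha_grid[-1])\n",
    "print('alpha chosen by CV:', lasso_demo.alpha_)  # one value from the grid"
   ]
  },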
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## 5. The code above tunes the hyperparameter (alphas) for ridge regression (RidgeCV) and Lasso (LassoCV). Combining the two tuned models with the ordinary least squares results, when should ridge regression, Lasso, and ordinary least squares each be used?\n",
    "1) Ridge vs. ordinary least squares: the results above show ridge scoring slightly higher on the test set and slightly lower on the training set, i.e. the L2 penalty shrinks the regression coefficients and improves generalization. The feature correlation analysis showed that 'temp' and 'atemp' are strongly correlated with each other (and both with the target 'cnt'). The ridge solution is $(X^TX + \\lambda I)^{-1}X^Ty$: with collinear features $X^TX$ has eigenvalues equal or close to zero, but since $\\lambda > 0$ shifts every eigenvalue by $\\lambda$, $(X^TX + \\lambda I)$ can still be inverted stably. Ridge regression therefore suits inputs with collinear features.\n",
    "\n",
    "2) Lasso achieves an effect similar to ridge regression here, although its test-set score is slightly lower. With a suitable regularization strength, the L1 penalty drives some regression coefficients exactly to zero, yielding a sparse model. When there are many input features and some are only weakly related to the target, Lasso tends to keep only the strongly relevant ones, which makes the model easier to interpret.\n",
    "\n",
    "3) Ordinary least squares is suitable when the features are not collinear. Its solution $(X^TX)^{-1}X^Ty$ requires inverting $X^TX$; with collinear features $X$ is close to rank-deficient, $X^TX$ is close to singular, the inverse is numerically unstable, and the model easily overfits the training data."
   ]
  }
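  ,
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A numeric sketch (synthetic data, hypothetical names) of the conditioning argument in 1) and 3) above: with two nearly identical columns, as with 'temp' and 'atemp', $X^TX$ is close to singular, while $X^TX + \\lambda I$ stays well conditioned, which is why the ridge solution can be computed stably."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "rng = np.random.RandomState(0)\n",
    "x1 = rng.randn(100)\n",
    "# two almost identical columns, like 'temp' and 'atemp'\n",
    "X_col = np.column_stack([x1, x1 + 1e-6 * rng.randn(100)])\n",
    "\n",
    "XtX = X_col.T @ X_col\n",
    "lam = 1.0\n",
    "print('cond(X^T X)           =', np.linalg.cond(XtX))                     # huge: nearly singular\n",
    "print('cond(X^T X + lam * I) =', np.linalg.cond(XtX + lam * np.eye(2)))  # modest: stable inverse"
   ]
  }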
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.6"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
