{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "b6724d45",
   "metadata": {},
   "source": [
    "# Advanced Python - ML Model Applications 5: Ridge and Lasso Regression - Solving Overfitting in Linear Regression\n",
    "\n",
    "\n",
    "Ridge regression and Lasso regression are both variants of linear regression that prevent overfitting by adding a regularization term to the loss function. Ridge regression uses L2 regularization, while Lasso regression uses L1 regularization.\n",
    "\n",
    "The loss function of Ridge regression is:\n",
    "\n",
    "$$ J(\\theta) = \\frac{1}{2m} \\sum_{i=1}^{m} (h_{\\theta}(x^{(i)}) - y^{(i)})^2 + \\lambda \\sum_{j=1}^{n} \\theta_j^2 $$\n",
    "\n",
    "where $m$ is the number of samples, $n$ the number of features, $\\lambda$ the regularization parameter, $h_{\\theta}(x)$ the predicted value, $y$ the true value, and $\\theta$ the model parameters.\n",
    "\n",
    "The loss function of Lasso regression is:\n",
    "\n",
    "$$ J(\\theta) = \\frac{1}{2m} \\sum_{i=1}^{m} (h_{\\theta}(x^{(i)}) - y^{(i)})^2 + \\lambda \\sum_{j=1}^{n} |\\theta_j| $$\n",
    "\n",
    "The formulas are written in LaTeX syntax."
   ]
  },
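  {
   "cell_type": "markdown",
   "id": "a1f0c001",
   "metadata": {},
   "source": [
    "As an illustrative aside (not part of the original analysis), both penalized losses can be evaluated directly in NumPy. The toy data and the helper names `ridge_loss` / `lasso_loss` below are assumptions of this sketch, term-by-term matching the two formulas above:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "a1f0c002",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "\n",
    "rng = np.random.default_rng(0)\n",
    "X_toy = rng.normal(size=(50, 3))\n",
    "theta = np.array([1.5, -2.0, 0.0])\n",
    "y_toy = X_toy @ theta + rng.normal(scale=0.1, size=50)\n",
    "\n",
    "def ridge_loss(theta, X, y, lam):\n",
    "    # squared-error term plus the L2 penalty from the formula above\n",
    "    r = X @ theta - y\n",
    "    return (r @ r) / (2 * len(y)) + lam * np.sum(theta ** 2)\n",
    "\n",
    "def lasso_loss(theta, X, y, lam):\n",
    "    # squared-error term plus the L1 penalty\n",
    "    r = X @ theta - y\n",
    "    return (r @ r) / (2 * len(y)) + lam * np.sum(np.abs(theta))\n",
    "\n",
    "print(ridge_loss(theta, X_toy, y_toy, 0.1))\n",
    "print(lasso_loss(theta, X_toy, y_toy, 0.1))"
   ]
  },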
  {
   "cell_type": "markdown",
   "id": "e9710b7b",
   "metadata": {},
   "source": [
    "Ridge regression and Lasso regression are both extensions of linear regression that address some of its problems, such as overfitting.\n",
    "\n",
    "Ridge regression modifies standard linear regression by adding an L2 regularization term to the loss function: the coefficient α times the sum of squared coefficients, where α controls the regularization strength. A larger α means stronger regularization (less overfitting, but possibly underfitting!), while a smaller α means weaker regularization (more risk of overfitting).\n",
    "\n",
    "Similarly, Lasso regression modifies standard linear regression, but with an L1 regularization term. Lasso has two further properties: sparsity and the ability to select features. Under gradient descent, the gradient of the L1 penalty with respect to each coefficient is a constant ±1 (its sign), so the penalty shrinks every coefficient toward zero by a fixed step regardless of its magnitude. This lets Lasso drive the coefficients of features unrelated to the response exactly to zero, thereby performing feature selection.\n",
    "\n",
    "In summary, both Ridge and Lasso improve linear regression by adding a regularization term to the loss function, preventing overfitting and improving model accuracy. They differ in the form of the regularization term and in whether they perform feature selection."
   ]
  },
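  {
   "cell_type": "markdown",
   "id": "b2e1d101",
   "metadata": {},
   "source": [
    "A minimal sketch of the difference described above, on synthetic data (`X_demo`, `y_demo` and the chosen alpha values are illustrative assumptions): Ridge shrinks coefficients but rarely zeroes them, while Lasso sets some coefficients exactly to zero:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b2e1d102",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.linear_model import Ridge, Lasso\n",
    "\n",
    "rng = np.random.default_rng(42)\n",
    "X_demo = rng.normal(size=(100, 10))\n",
    "# only the first two features actually influence the response\n",
    "y_demo = 3 * X_demo[:, 0] - 2 * X_demo[:, 1] + rng.normal(scale=0.5, size=100)\n",
    "\n",
    "ridge = Ridge(alpha=1.0).fit(X_demo, y_demo)\n",
    "lasso = Lasso(alpha=0.5).fit(X_demo, y_demo)\n",
    "\n",
    "# Ridge typically leaves no coefficient exactly at zero;\n",
    "# Lasso zeroes out coefficients of the irrelevant features\n",
    "print('Ridge zero coefficients:', np.sum(ridge.coef_ == 0))\n",
    "print('Lasso zero coefficients:', np.sum(lasso.coef_ == 0))"
   ]
  },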
  {
   "cell_type": "markdown",
   "id": "fef51a8f",
   "metadata": {},
   "source": [
    "Overfitting:\n",
    "\n",
    "Overfitting is an undesirable phenomenon in machine learning. It occurs when a model fits the training data too closely, so that the loss on a validation set or on new data is poor and the model's generalization performance degrades. In other words, an overfitted model predicts the training data very accurately but predicts new, unseen data poorly.\n",
    "\n",
    "Possible causes of overfitting include an overly complex model, too little training data, or too many features. Common remedies include collecting more data, reducing the number of features, applying regularization (such as Ridge or Lasso regression), and using ensemble learning."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "0df5fe7c",
   "metadata": {},
   "source": [
    "How can we tell, with Ridge or Lasso regression, that a model is no longer overfitting?\n",
    "\n",
    "Judging whether a Ridge or Lasso model is overfitting relies mainly on evaluation metrics. For Ridge regression, measures such as mean squared error (MSE) or root mean squared error (RMSE) can be used to assess the model: if these values are small on data the model has not seen, the model generalizes well and has probably not overfit. The ridge trace plot can also be inspected to analyze the relationship between model complexity and error.\n",
    "\n",
    "Lasso regression can be evaluated with the same MSE/RMSE metrics. The difference is that Lasso compresses some coefficients exactly to zero, which helps reveal which features have the greatest influence on the predictions.\n",
    "\n",
    "The regularization parameter alpha plays an important role in both models. In Ridge regression, alpha governs the trade-off between model complexity and error; in Lasso regression, alpha scales the penalty on the sum of absolute coefficients and thus determines which coefficients are compressed to zero. Choosing a suitable alpha is therefore also critical for preventing overfitting."
   ]
  },
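  {
   "cell_type": "markdown",
   "id": "c4f2a201",
   "metadata": {},
   "source": [
    "One common way to choose alpha is a cross-validated search. The sketch below uses scikit-learn's `RidgeCV` and `LassoCV` on synthetic data (the data and the alpha grid are illustrative assumptions, not part of the original notebook):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "c4f2a202",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.linear_model import RidgeCV, LassoCV\n",
    "\n",
    "rng = np.random.default_rng(7)\n",
    "X_cv = rng.normal(size=(100, 5))\n",
    "y_cv = X_cv @ np.array([1.0, 0.0, -1.0, 0.0, 2.0]) + rng.normal(scale=0.3, size=100)\n",
    "\n",
    "# search an alpha grid by cross-validation\n",
    "alphas = np.logspace(-3, 2, 30)\n",
    "ridge_cv = RidgeCV(alphas=alphas).fit(X_cv, y_cv)\n",
    "lasso_cv = LassoCV(alphas=alphas, cv=5).fit(X_cv, y_cv)\n",
    "print('best Ridge alpha:', ridge_cv.alpha_)\n",
    "print('best Lasso alpha:', lasso_cv.alpha_)"
   ]
  },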
  {
   "cell_type": "markdown",
   "id": "96e046ad",
   "metadata": {},
   "source": [
    "## Worked Example 1"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "69cbf3e6",
   "metadata": {},
   "source": [
    "### 1. Import Libraries and the Dataset"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 14,
   "id": "92270263",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import pandas as pd\n",
    "import matplotlib.pyplot as plt\n",
    "\n",
    "# Note: load_boston was deprecated in scikit-learn 1.0 and removed in 1.2;\n",
    "# on newer versions the Boston data must be obtained from another source.\n",
    "from sklearn.datasets import load_boston  # Boston house-price dataset\n",
    "\n",
    "boston = load_boston()"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 24,
   "id": "41a3aa90",
   "metadata": {
    "collapsed": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "{'data': array([[6.3200e-03, 1.8000e+01, 2.3100e+00, ..., 1.5300e+01, 3.9690e+02,\n",
       "         4.9800e+00],\n",
       "        [2.7310e-02, 0.0000e+00, 7.0700e+00, ..., 1.7800e+01, 3.9690e+02,\n",
       "         9.1400e+00],\n",
       "        [2.7290e-02, 0.0000e+00, 7.0700e+00, ..., 1.7800e+01, 3.9283e+02,\n",
       "         4.0300e+00],\n",
       "        ...,\n",
       "        [6.0760e-02, 0.0000e+00, 1.1930e+01, ..., 2.1000e+01, 3.9690e+02,\n",
       "         5.6400e+00],\n",
       "        [1.0959e-01, 0.0000e+00, 1.1930e+01, ..., 2.1000e+01, 3.9345e+02,\n",
       "         6.4800e+00],\n",
       "        [4.7410e-02, 0.0000e+00, 1.1930e+01, ..., 2.1000e+01, 3.9690e+02,\n",
       "         7.8800e+00]]),\n",
       " 'target': array([24. , 21.6, 34.7, 33.4, 36.2, 28.7, 22.9, 27.1, 16.5, 18.9, 15. ,\n",
       "        18.9, 21.7, 20.4, 18.2, 19.9, 23.1, 17.5, 20.2, 18.2, 13.6, 19.6,\n",
       "        15.2, 14.5, 15.6, 13.9, 16.6, 14.8, 18.4, 21. , 12.7, 14.5, 13.2,\n",
       "        13.1, 13.5, 18.9, 20. , 21. , 24.7, 30.8, 34.9, 26.6, 25.3, 24.7,\n",
       "        21.2, 19.3, 20. , 16.6, 14.4, 19.4, 19.7, 20.5, 25. , 23.4, 18.9,\n",
       "        35.4, 24.7, 31.6, 23.3, 19.6, 18.7, 16. , 22.2, 25. , 33. , 23.5,\n",
       "        19.4, 22. , 17.4, 20.9, 24.2, 21.7, 22.8, 23.4, 24.1, 21.4, 20. ,\n",
       "        20.8, 21.2, 20.3, 28. , 23.9, 24.8, 22.9, 23.9, 26.6, 22.5, 22.2,\n",
       "        23.6, 28.7, 22.6, 22. , 22.9, 25. , 20.6, 28.4, 21.4, 38.7, 43.8,\n",
       "        33.2, 27.5, 26.5, 18.6, 19.3, 20.1, 19.5, 19.5, 20.4, 19.8, 19.4,\n",
       "        21.7, 22.8, 18.8, 18.7, 18.5, 18.3, 21.2, 19.2, 20.4, 19.3, 22. ,\n",
       "        20.3, 20.5, 17.3, 18.8, 21.4, 15.7, 16.2, 18. , 14.3, 19.2, 19.6,\n",
       "        23. , 18.4, 15.6, 18.1, 17.4, 17.1, 13.3, 17.8, 14. , 14.4, 13.4,\n",
       "        15.6, 11.8, 13.8, 15.6, 14.6, 17.8, 15.4, 21.5, 19.6, 15.3, 19.4,\n",
       "        17. , 15.6, 13.1, 41.3, 24.3, 23.3, 27. , 50. , 50. , 50. , 22.7,\n",
       "        25. , 50. , 23.8, 23.8, 22.3, 17.4, 19.1, 23.1, 23.6, 22.6, 29.4,\n",
       "        23.2, 24.6, 29.9, 37.2, 39.8, 36.2, 37.9, 32.5, 26.4, 29.6, 50. ,\n",
       "        32. , 29.8, 34.9, 37. , 30.5, 36.4, 31.1, 29.1, 50. , 33.3, 30.3,\n",
       "        34.6, 34.9, 32.9, 24.1, 42.3, 48.5, 50. , 22.6, 24.4, 22.5, 24.4,\n",
       "        20. , 21.7, 19.3, 22.4, 28.1, 23.7, 25. , 23.3, 28.7, 21.5, 23. ,\n",
       "        26.7, 21.7, 27.5, 30.1, 44.8, 50. , 37.6, 31.6, 46.7, 31.5, 24.3,\n",
       "        31.7, 41.7, 48.3, 29. , 24. , 25.1, 31.5, 23.7, 23.3, 22. , 20.1,\n",
       "        22.2, 23.7, 17.6, 18.5, 24.3, 20.5, 24.5, 26.2, 24.4, 24.8, 29.6,\n",
       "        42.8, 21.9, 20.9, 44. , 50. , 36. , 30.1, 33.8, 43.1, 48.8, 31. ,\n",
       "        36.5, 22.8, 30.7, 50. , 43.5, 20.7, 21.1, 25.2, 24.4, 35.2, 32.4,\n",
       "        32. , 33.2, 33.1, 29.1, 35.1, 45.4, 35.4, 46. , 50. , 32.2, 22. ,\n",
       "        20.1, 23.2, 22.3, 24.8, 28.5, 37.3, 27.9, 23.9, 21.7, 28.6, 27.1,\n",
       "        20.3, 22.5, 29. , 24.8, 22. , 26.4, 33.1, 36.1, 28.4, 33.4, 28.2,\n",
       "        22.8, 20.3, 16.1, 22.1, 19.4, 21.6, 23.8, 16.2, 17.8, 19.8, 23.1,\n",
       "        21. , 23.8, 23.1, 20.4, 18.5, 25. , 24.6, 23. , 22.2, 19.3, 22.6,\n",
       "        19.8, 17.1, 19.4, 22.2, 20.7, 21.1, 19.5, 18.5, 20.6, 19. , 18.7,\n",
       "        32.7, 16.5, 23.9, 31.2, 17.5, 17.2, 23.1, 24.5, 26.6, 22.9, 24.1,\n",
       "        18.6, 30.1, 18.2, 20.6, 17.8, 21.7, 22.7, 22.6, 25. , 19.9, 20.8,\n",
       "        16.8, 21.9, 27.5, 21.9, 23.1, 50. , 50. , 50. , 50. , 50. , 13.8,\n",
       "        13.8, 15. , 13.9, 13.3, 13.1, 10.2, 10.4, 10.9, 11.3, 12.3,  8.8,\n",
       "         7.2, 10.5,  7.4, 10.2, 11.5, 15.1, 23.2,  9.7, 13.8, 12.7, 13.1,\n",
       "        12.5,  8.5,  5. ,  6.3,  5.6,  7.2, 12.1,  8.3,  8.5,  5. , 11.9,\n",
       "        27.9, 17.2, 27.5, 15. , 17.2, 17.9, 16.3,  7. ,  7.2,  7.5, 10.4,\n",
       "         8.8,  8.4, 16.7, 14.2, 20.8, 13.4, 11.7,  8.3, 10.2, 10.9, 11. ,\n",
       "         9.5, 14.5, 14.1, 16.1, 14.3, 11.7, 13.4,  9.6,  8.7,  8.4, 12.8,\n",
       "        10.5, 17.1, 18.4, 15.4, 10.8, 11.8, 14.9, 12.6, 14.1, 13. , 13.4,\n",
       "        15.2, 16.1, 17.8, 14.9, 14.1, 12.7, 13.5, 14.9, 20. , 16.4, 17.7,\n",
       "        19.5, 20.2, 21.4, 19.9, 19. , 19.1, 19.1, 20.1, 19.9, 19.6, 23.2,\n",
       "        29.8, 13.8, 13.3, 16.7, 12. , 14.6, 21.4, 23. , 23.7, 25. , 21.8,\n",
       "        20.6, 21.2, 19.1, 20.6, 15.2,  7. ,  8.1, 13.6, 20.1, 21.8, 24.5,\n",
       "        23.1, 19.7, 18.3, 21.2, 17.5, 16.8, 22.4, 20.6, 23.9, 22. , 11.9]),\n",
       " 'feature_names': array(['CRIM', 'ZN', 'INDUS', 'CHAS', 'NOX', 'RM', 'AGE', 'DIS', 'RAD',\n",
       "        'TAX', 'PTRATIO', 'B', 'LSTAT'], dtype='<U7'),\n",
        " 'DESCR': \".. _boston_dataset:\\n\\nBoston house prices dataset\\n---------------------------\\n\\n**Data Set Characteristics:**  \\n\\n    :Number of Instances: 506 \\n\\n    :Number of Attributes: 13 numeric/categorical predictive. Median Value (attribute 14) is usually the target.\\n\\n    :Attribute Information (in order):\\n        - CRIM     per capita crime rate by town\\n        - ZN       proportion of residential land zoned for lots over 25,000 sq.ft.\\n        - INDUS    proportion of non-retail business acres per town\\n        - CHAS     Charles River dummy variable (= 1 if tract bounds river; 0 otherwise)\\n        - NOX      nitric oxides concentration (parts per 10 million)\\n        - RM       average number of rooms per dwelling\\n        - AGE      proportion of owner-occupied units built prior to 1940\\n        - DIS      weighted distances to five Boston employment centres\\n        - RAD      index of accessibility to radial highways\\n        - TAX      full-value property-tax rate per $10,000\\n        - PTRATIO  pupil-teacher ratio by town\\n        - B        1000(Bk - 0.63)^2 where Bk is the proportion of black people by town\\n        - LSTAT    % lower status of the population\\n        - MEDV     Median value of owner-occupied homes in $1000's\\n\\n    :Missing Attribute Values: None\\n\\n    :Creator: Harrison, D. and Rubinfeld, D.L.\\n\\nThis is a copy of UCI ML housing dataset.\\nhttps://archive.ics.uci.edu/ml/machine-learning-databases/housing/\\n\\n\\nThis dataset was taken from the StatLib library which is maintained at Carnegie Mellon University.\\n\\nThe Boston house-price data of Harrison, D. and Rubinfeld, D.L. 'Hedonic\\nprices and the demand for clean air', J. Environ. Economics & Management,\\nvol.5, 81-102, 1978.   Used in Belsley, Kuh & Welsch, 'Regression diagnostics\\n...', Wiley, 1980.   N.B. Various transformations are used in the table on\\npages 244-261 of the latter.\\n\\nThe Boston house-price data has been used in many machine learning papers that address regression\\nproblems.   \\n     \\n.. topic:: References\\n\\n   - Belsley, Kuh & Welsch, 'Regression diagnostics: Identifying Influential Data and Sources of Collinearity', Wiley, 1980. 244-261.\\n   - Quinlan,R. (1993). Combining Instance-Based and Model-Based Learning. In Proceedings on the Tenth International Conference of Machine Learning, 236-243, University of Massachusetts, Amherst. Morgan Kaufmann.\\n\",\n",
       " 'filename': 'C:\\\\ProgramData\\\\Anaconda3\\\\lib\\\\site-packages\\\\sklearn\\\\datasets\\\\data\\\\boston_house_prices.csv'}"
      ]
     },
     "execution_count": 24,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "boston"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "bf151e01",
   "metadata": {},
   "source": [
    "### 2. Extract the Feature Matrix"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 17,
   "id": "199c7887",
   "metadata": {},
   "outputs": [],
   "source": [
    "X=pd.DataFrame(boston['data'])"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 18,
   "id": "d9f4d414",
   "metadata": {
    "collapsed": true
   },
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>0</th>\n",
       "      <th>1</th>\n",
       "      <th>2</th>\n",
       "      <th>3</th>\n",
       "      <th>4</th>\n",
       "      <th>5</th>\n",
       "      <th>6</th>\n",
       "      <th>7</th>\n",
       "      <th>8</th>\n",
       "      <th>9</th>\n",
       "      <th>10</th>\n",
       "      <th>11</th>\n",
       "      <th>12</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>0.00632</td>\n",
       "      <td>18.0</td>\n",
       "      <td>2.31</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.538</td>\n",
       "      <td>6.575</td>\n",
       "      <td>65.2</td>\n",
       "      <td>4.0900</td>\n",
       "      <td>1.0</td>\n",
       "      <td>296.0</td>\n",
       "      <td>15.3</td>\n",
       "      <td>396.90</td>\n",
       "      <td>4.98</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>0.02731</td>\n",
       "      <td>0.0</td>\n",
       "      <td>7.07</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.469</td>\n",
       "      <td>6.421</td>\n",
       "      <td>78.9</td>\n",
       "      <td>4.9671</td>\n",
       "      <td>2.0</td>\n",
       "      <td>242.0</td>\n",
       "      <td>17.8</td>\n",
       "      <td>396.90</td>\n",
       "      <td>9.14</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>0.02729</td>\n",
       "      <td>0.0</td>\n",
       "      <td>7.07</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.469</td>\n",
       "      <td>7.185</td>\n",
       "      <td>61.1</td>\n",
       "      <td>4.9671</td>\n",
       "      <td>2.0</td>\n",
       "      <td>242.0</td>\n",
       "      <td>17.8</td>\n",
       "      <td>392.83</td>\n",
       "      <td>4.03</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>0.03237</td>\n",
       "      <td>0.0</td>\n",
       "      <td>2.18</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.458</td>\n",
       "      <td>6.998</td>\n",
       "      <td>45.8</td>\n",
       "      <td>6.0622</td>\n",
       "      <td>3.0</td>\n",
       "      <td>222.0</td>\n",
       "      <td>18.7</td>\n",
       "      <td>394.63</td>\n",
       "      <td>2.94</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>0.06905</td>\n",
       "      <td>0.0</td>\n",
       "      <td>2.18</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.458</td>\n",
       "      <td>7.147</td>\n",
       "      <td>54.2</td>\n",
       "      <td>6.0622</td>\n",
       "      <td>3.0</td>\n",
       "      <td>222.0</td>\n",
       "      <td>18.7</td>\n",
       "      <td>396.90</td>\n",
       "      <td>5.33</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>...</th>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>501</th>\n",
       "      <td>0.06263</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.593</td>\n",
       "      <td>69.1</td>\n",
       "      <td>2.4786</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>391.99</td>\n",
       "      <td>9.67</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>502</th>\n",
       "      <td>0.04527</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.120</td>\n",
       "      <td>76.7</td>\n",
       "      <td>2.2875</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>396.90</td>\n",
       "      <td>9.08</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>503</th>\n",
       "      <td>0.06076</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.976</td>\n",
       "      <td>91.0</td>\n",
       "      <td>2.1675</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>396.90</td>\n",
       "      <td>5.64</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>504</th>\n",
       "      <td>0.10959</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.794</td>\n",
       "      <td>89.3</td>\n",
       "      <td>2.3889</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>393.45</td>\n",
       "      <td>6.48</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>505</th>\n",
       "      <td>0.04741</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.030</td>\n",
       "      <td>80.8</td>\n",
       "      <td>2.5050</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>396.90</td>\n",
       "      <td>7.88</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>506 rows × 13 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "          0     1      2    3      4      5     6       7    8      9     10  \\\n",
       "0    0.00632  18.0   2.31  0.0  0.538  6.575  65.2  4.0900  1.0  296.0  15.3   \n",
       "1    0.02731   0.0   7.07  0.0  0.469  6.421  78.9  4.9671  2.0  242.0  17.8   \n",
       "2    0.02729   0.0   7.07  0.0  0.469  7.185  61.1  4.9671  2.0  242.0  17.8   \n",
       "3    0.03237   0.0   2.18  0.0  0.458  6.998  45.8  6.0622  3.0  222.0  18.7   \n",
       "4    0.06905   0.0   2.18  0.0  0.458  7.147  54.2  6.0622  3.0  222.0  18.7   \n",
       "..       ...   ...    ...  ...    ...    ...   ...     ...  ...    ...   ...   \n",
       "501  0.06263   0.0  11.93  0.0  0.573  6.593  69.1  2.4786  1.0  273.0  21.0   \n",
       "502  0.04527   0.0  11.93  0.0  0.573  6.120  76.7  2.2875  1.0  273.0  21.0   \n",
       "503  0.06076   0.0  11.93  0.0  0.573  6.976  91.0  2.1675  1.0  273.0  21.0   \n",
       "504  0.10959   0.0  11.93  0.0  0.573  6.794  89.3  2.3889  1.0  273.0  21.0   \n",
       "505  0.04741   0.0  11.93  0.0  0.573  6.030  80.8  2.5050  1.0  273.0  21.0   \n",
       "\n",
       "         11    12  \n",
       "0    396.90  4.98  \n",
       "1    396.90  9.14  \n",
       "2    392.83  4.03  \n",
       "3    394.63  2.94  \n",
       "4    396.90  5.33  \n",
       "..      ...   ...  \n",
       "501  391.99  9.67  \n",
       "502  396.90  9.08  \n",
       "503  396.90  5.64  \n",
       "504  393.45  6.48  \n",
       "505  396.90  7.88  \n",
       "\n",
       "[506 rows x 13 columns]"
      ]
     },
     "execution_count": 18,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "X"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 23,
   "id": "4b9a139b",
   "metadata": {
    "collapsed": true
   },
   "outputs": [
    {
     "data": {
      "text/html": [
       "<div>\n",
       "<style scoped>\n",
       "    .dataframe tbody tr th:only-of-type {\n",
       "        vertical-align: middle;\n",
       "    }\n",
       "\n",
       "    .dataframe tbody tr th {\n",
       "        vertical-align: top;\n",
       "    }\n",
       "\n",
       "    .dataframe thead th {\n",
       "        text-align: right;\n",
       "    }\n",
       "</style>\n",
       "<table border=\"1\" class=\"dataframe\">\n",
       "  <thead>\n",
       "    <tr style=\"text-align: right;\">\n",
       "      <th></th>\n",
       "      <th>CRIM</th>\n",
       "      <th>ZN</th>\n",
       "      <th>INDUS</th>\n",
       "      <th>CHAS</th>\n",
       "      <th>NOX</th>\n",
       "      <th>RM</th>\n",
       "      <th>AGE</th>\n",
       "      <th>DIS</th>\n",
       "      <th>RAD</th>\n",
       "      <th>TAX</th>\n",
       "      <th>PTRATIO</th>\n",
       "      <th>B</th>\n",
       "      <th>LSTAT</th>\n",
       "    </tr>\n",
       "  </thead>\n",
       "  <tbody>\n",
       "    <tr>\n",
       "      <th>0</th>\n",
       "      <td>0.00632</td>\n",
       "      <td>18.0</td>\n",
       "      <td>2.31</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.538</td>\n",
       "      <td>6.575</td>\n",
       "      <td>65.2</td>\n",
       "      <td>4.0900</td>\n",
       "      <td>1.0</td>\n",
       "      <td>296.0</td>\n",
       "      <td>15.3</td>\n",
       "      <td>396.90</td>\n",
       "      <td>4.98</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>1</th>\n",
       "      <td>0.02731</td>\n",
       "      <td>0.0</td>\n",
       "      <td>7.07</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.469</td>\n",
       "      <td>6.421</td>\n",
       "      <td>78.9</td>\n",
       "      <td>4.9671</td>\n",
       "      <td>2.0</td>\n",
       "      <td>242.0</td>\n",
       "      <td>17.8</td>\n",
       "      <td>396.90</td>\n",
       "      <td>9.14</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>2</th>\n",
       "      <td>0.02729</td>\n",
       "      <td>0.0</td>\n",
       "      <td>7.07</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.469</td>\n",
       "      <td>7.185</td>\n",
       "      <td>61.1</td>\n",
       "      <td>4.9671</td>\n",
       "      <td>2.0</td>\n",
       "      <td>242.0</td>\n",
       "      <td>17.8</td>\n",
       "      <td>392.83</td>\n",
       "      <td>4.03</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>3</th>\n",
       "      <td>0.03237</td>\n",
       "      <td>0.0</td>\n",
       "      <td>2.18</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.458</td>\n",
       "      <td>6.998</td>\n",
       "      <td>45.8</td>\n",
       "      <td>6.0622</td>\n",
       "      <td>3.0</td>\n",
       "      <td>222.0</td>\n",
       "      <td>18.7</td>\n",
       "      <td>394.63</td>\n",
       "      <td>2.94</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>4</th>\n",
       "      <td>0.06905</td>\n",
       "      <td>0.0</td>\n",
       "      <td>2.18</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.458</td>\n",
       "      <td>7.147</td>\n",
       "      <td>54.2</td>\n",
       "      <td>6.0622</td>\n",
       "      <td>3.0</td>\n",
       "      <td>222.0</td>\n",
       "      <td>18.7</td>\n",
       "      <td>396.90</td>\n",
       "      <td>5.33</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>...</th>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "      <td>...</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>501</th>\n",
       "      <td>0.06263</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.593</td>\n",
       "      <td>69.1</td>\n",
       "      <td>2.4786</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>391.99</td>\n",
       "      <td>9.67</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>502</th>\n",
       "      <td>0.04527</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.120</td>\n",
       "      <td>76.7</td>\n",
       "      <td>2.2875</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>396.90</td>\n",
       "      <td>9.08</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>503</th>\n",
       "      <td>0.06076</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.976</td>\n",
       "      <td>91.0</td>\n",
       "      <td>2.1675</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>396.90</td>\n",
       "      <td>5.64</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>504</th>\n",
       "      <td>0.10959</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.794</td>\n",
       "      <td>89.3</td>\n",
       "      <td>2.3889</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>393.45</td>\n",
       "      <td>6.48</td>\n",
       "    </tr>\n",
       "    <tr>\n",
       "      <th>505</th>\n",
       "      <td>0.04741</td>\n",
       "      <td>0.0</td>\n",
       "      <td>11.93</td>\n",
       "      <td>0.0</td>\n",
       "      <td>0.573</td>\n",
       "      <td>6.030</td>\n",
       "      <td>80.8</td>\n",
       "      <td>2.5050</td>\n",
       "      <td>1.0</td>\n",
       "      <td>273.0</td>\n",
       "      <td>21.0</td>\n",
       "      <td>396.90</td>\n",
       "      <td>7.88</td>\n",
       "    </tr>\n",
       "  </tbody>\n",
       "</table>\n",
       "<p>506 rows × 13 columns</p>\n",
       "</div>"
      ],
      "text/plain": [
       "        CRIM    ZN  INDUS  CHAS    NOX     RM   AGE     DIS  RAD    TAX  \\\n",
       "0    0.00632  18.0   2.31   0.0  0.538  6.575  65.2  4.0900  1.0  296.0   \n",
       "1    0.02731   0.0   7.07   0.0  0.469  6.421  78.9  4.9671  2.0  242.0   \n",
       "2    0.02729   0.0   7.07   0.0  0.469  7.185  61.1  4.9671  2.0  242.0   \n",
       "3    0.03237   0.0   2.18   0.0  0.458  6.998  45.8  6.0622  3.0  222.0   \n",
       "4    0.06905   0.0   2.18   0.0  0.458  7.147  54.2  6.0622  3.0  222.0   \n",
       "..       ...   ...    ...   ...    ...    ...   ...     ...  ...    ...   \n",
       "501  0.06263   0.0  11.93   0.0  0.573  6.593  69.1  2.4786  1.0  273.0   \n",
       "502  0.04527   0.0  11.93   0.0  0.573  6.120  76.7  2.2875  1.0  273.0   \n",
       "503  0.06076   0.0  11.93   0.0  0.573  6.976  91.0  2.1675  1.0  273.0   \n",
       "504  0.10959   0.0  11.93   0.0  0.573  6.794  89.3  2.3889  1.0  273.0   \n",
       "505  0.04741   0.0  11.93   0.0  0.573  6.030  80.8  2.5050  1.0  273.0   \n",
       "\n",
       "     PTRATIO       B  LSTAT  \n",
       "0       15.3  396.90   4.98  \n",
       "1       17.8  396.90   9.14  \n",
       "2       17.8  392.83   4.03  \n",
       "3       18.7  394.63   2.94  \n",
       "4       18.7  396.90   5.33  \n",
       "..       ...     ...    ...  \n",
       "501     21.0  391.99   9.67  \n",
       "502     21.0  396.90   9.08  \n",
       "503     21.0  396.90   5.64  \n",
       "504     21.0  393.45   6.48  \n",
       "505     21.0  396.90   7.88  \n",
       "\n",
       "[506 rows x 13 columns]"
      ]
     },
     "execution_count": 23,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# label the columns with the dataset's feature names\n",
    "X.columns = boston['feature_names']\n",
    "X"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e2a2bf32",
   "metadata": {},
   "source": [
    "### 3. Extract the Response Variable"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 21,
   "id": "3fbed5ec",
   "metadata": {},
   "outputs": [],
   "source": [
    "y=boston.target"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 22,
   "id": "1fcf2500",
   "metadata": {
    "collapsed": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "array([24. , 21.6, 34.7, 33.4, 36.2, 28.7, 22.9, 27.1, 16.5, 18.9, 15. ,\n",
       "       18.9, 21.7, 20.4, 18.2, 19.9, 23.1, 17.5, 20.2, 18.2, 13.6, 19.6,\n",
       "       15.2, 14.5, 15.6, 13.9, 16.6, 14.8, 18.4, 21. , 12.7, 14.5, 13.2,\n",
       "       13.1, 13.5, 18.9, 20. , 21. , 24.7, 30.8, 34.9, 26.6, 25.3, 24.7,\n",
       "       21.2, 19.3, 20. , 16.6, 14.4, 19.4, 19.7, 20.5, 25. , 23.4, 18.9,\n",
       "       35.4, 24.7, 31.6, 23.3, 19.6, 18.7, 16. , 22.2, 25. , 33. , 23.5,\n",
       "       19.4, 22. , 17.4, 20.9, 24.2, 21.7, 22.8, 23.4, 24.1, 21.4, 20. ,\n",
       "       20.8, 21.2, 20.3, 28. , 23.9, 24.8, 22.9, 23.9, 26.6, 22.5, 22.2,\n",
       "       23.6, 28.7, 22.6, 22. , 22.9, 25. , 20.6, 28.4, 21.4, 38.7, 43.8,\n",
       "       33.2, 27.5, 26.5, 18.6, 19.3, 20.1, 19.5, 19.5, 20.4, 19.8, 19.4,\n",
       "       21.7, 22.8, 18.8, 18.7, 18.5, 18.3, 21.2, 19.2, 20.4, 19.3, 22. ,\n",
       "       20.3, 20.5, 17.3, 18.8, 21.4, 15.7, 16.2, 18. , 14.3, 19.2, 19.6,\n",
       "       23. , 18.4, 15.6, 18.1, 17.4, 17.1, 13.3, 17.8, 14. , 14.4, 13.4,\n",
       "       15.6, 11.8, 13.8, 15.6, 14.6, 17.8, 15.4, 21.5, 19.6, 15.3, 19.4,\n",
       "       17. , 15.6, 13.1, 41.3, 24.3, 23.3, 27. , 50. , 50. , 50. , 22.7,\n",
       "       25. , 50. , 23.8, 23.8, 22.3, 17.4, 19.1, 23.1, 23.6, 22.6, 29.4,\n",
       "       23.2, 24.6, 29.9, 37.2, 39.8, 36.2, 37.9, 32.5, 26.4, 29.6, 50. ,\n",
       "       32. , 29.8, 34.9, 37. , 30.5, 36.4, 31.1, 29.1, 50. , 33.3, 30.3,\n",
       "       34.6, 34.9, 32.9, 24.1, 42.3, 48.5, 50. , 22.6, 24.4, 22.5, 24.4,\n",
       "       20. , 21.7, 19.3, 22.4, 28.1, 23.7, 25. , 23.3, 28.7, 21.5, 23. ,\n",
       "       26.7, 21.7, 27.5, 30.1, 44.8, 50. , 37.6, 31.6, 46.7, 31.5, 24.3,\n",
       "       31.7, 41.7, 48.3, 29. , 24. , 25.1, 31.5, 23.7, 23.3, 22. , 20.1,\n",
       "       22.2, 23.7, 17.6, 18.5, 24.3, 20.5, 24.5, 26.2, 24.4, 24.8, 29.6,\n",
       "       42.8, 21.9, 20.9, 44. , 50. , 36. , 30.1, 33.8, 43.1, 48.8, 31. ,\n",
       "       36.5, 22.8, 30.7, 50. , 43.5, 20.7, 21.1, 25.2, 24.4, 35.2, 32.4,\n",
       "       32. , 33.2, 33.1, 29.1, 35.1, 45.4, 35.4, 46. , 50. , 32.2, 22. ,\n",
       "       20.1, 23.2, 22.3, 24.8, 28.5, 37.3, 27.9, 23.9, 21.7, 28.6, 27.1,\n",
       "       20.3, 22.5, 29. , 24.8, 22. , 26.4, 33.1, 36.1, 28.4, 33.4, 28.2,\n",
       "       22.8, 20.3, 16.1, 22.1, 19.4, 21.6, 23.8, 16.2, 17.8, 19.8, 23.1,\n",
       "       21. , 23.8, 23.1, 20.4, 18.5, 25. , 24.6, 23. , 22.2, 19.3, 22.6,\n",
       "       19.8, 17.1, 19.4, 22.2, 20.7, 21.1, 19.5, 18.5, 20.6, 19. , 18.7,\n",
       "       32.7, 16.5, 23.9, 31.2, 17.5, 17.2, 23.1, 24.5, 26.6, 22.9, 24.1,\n",
       "       18.6, 30.1, 18.2, 20.6, 17.8, 21.7, 22.7, 22.6, 25. , 19.9, 20.8,\n",
       "       16.8, 21.9, 27.5, 21.9, 23.1, 50. , 50. , 50. , 50. , 50. , 13.8,\n",
       "       13.8, 15. , 13.9, 13.3, 13.1, 10.2, 10.4, 10.9, 11.3, 12.3,  8.8,\n",
       "        7.2, 10.5,  7.4, 10.2, 11.5, 15.1, 23.2,  9.7, 13.8, 12.7, 13.1,\n",
       "       12.5,  8.5,  5. ,  6.3,  5.6,  7.2, 12.1,  8.3,  8.5,  5. , 11.9,\n",
       "       27.9, 17.2, 27.5, 15. , 17.2, 17.9, 16.3,  7. ,  7.2,  7.5, 10.4,\n",
       "        8.8,  8.4, 16.7, 14.2, 20.8, 13.4, 11.7,  8.3, 10.2, 10.9, 11. ,\n",
       "        9.5, 14.5, 14.1, 16.1, 14.3, 11.7, 13.4,  9.6,  8.7,  8.4, 12.8,\n",
       "       10.5, 17.1, 18.4, 15.4, 10.8, 11.8, 14.9, 12.6, 14.1, 13. , 13.4,\n",
       "       15.2, 16.1, 17.8, 14.9, 14.1, 12.7, 13.5, 14.9, 20. , 16.4, 17.7,\n",
       "       19.5, 20.2, 21.4, 19.9, 19. , 19.1, 19.1, 20.1, 19.9, 19.6, 23.2,\n",
       "       29.8, 13.8, 13.3, 16.7, 12. , 14.6, 21.4, 23. , 23.7, 25. , 21.8,\n",
       "       20.6, 21.2, 19.1, 20.6, 15.2,  7. ,  8.1, 13.6, 20.1, 21.8, 24.5,\n",
       "       23.1, 19.7, 18.3, 21.2, 17.5, 16.8, 22.4, 20.6, 23.9, 22. , 11.9])"
      ]
     },
     "execution_count": 22,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "y"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "9efaed82",
   "metadata": {},
   "source": [
    "### 4. Analysis of the Three Regression Models"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "53d9d60c",
   "metadata": {},
   "source": [
    "#### (1) Linear Regression"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "f335969c",
   "metadata": {},
   "source": [
    "##### Evaluating the model with K-fold cross-validation\n",
    "\n",
    "K-fold cross-validation is a common method for evaluating machine-learning models. The dataset is split into K subsets; in each round one subset serves as the test set and the remaining K-1 subsets as the training set. The process is repeated K times, holding out a different subset each time, and the average of the K test scores is reported as the model's performance.\n",
    "\n",
    "Its main advantage is that it makes efficient use of a limited dataset while reducing the influence of any single random split on the result. When the dataset is small or the class distribution is imbalanced, however, plain K-fold can give unstable estimates; in that case Stratified K-fold Cross-Validation, which preserves the class proportions of the full dataset in every fold, is worth trying."
   ]
  },
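  {
   "cell_type": "markdown",
   "id": "1a2b3c4e",
   "metadata": {},
   "source": [
    "The splitting procedure described above can be sketched by hand with scikit-learn's KFold. This is an illustrative example on a small synthetic dataset (X_demo and y_demo are made up here; they are not the data used in the rest of this notebook):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "2b3c4d5f",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.datasets import make_regression\n",
    "from sklearn.linear_model import LinearRegression\n",
    "from sklearn.model_selection import KFold\n",
    "\n",
    "# Illustrative synthetic data, not the dataset used elsewhere in this notebook\n",
    "X_demo, y_demo = make_regression(n_samples=100, n_features=5, noise=10.0, random_state=0)\n",
    "\n",
    "kf = KFold(n_splits=5)\n",
    "scores = []\n",
    "for train_idx, test_idx in kf.split(X_demo):\n",
    "    model = LinearRegression()\n",
    "    model.fit(X_demo[train_idx], y_demo[train_idx])    # train on the other K-1 folds\n",
    "    scores.append(model.score(X_demo[test_idx], y_demo[test_idx]))  # R^2 on the held-out fold\n",
    "print(len(scores), np.mean(scores))   # 5 per-fold scores, then their mean"
   ]
  },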
  {
   "cell_type": "code",
   "execution_count": 30,
   "id": "b5764c8c",
   "metadata": {},
   "outputs": [],
   "source": [
    "from sklearn.linear_model import LinearRegression\n",
    "from sklearn.model_selection import cross_val_score\n",
    "# cross_val_score is a scikit-learn function for evaluating model performance.\n",
    "# It splits the data into k subsets and runs k rounds of training and validation,\n",
    "# holding out a different subset each round -- i.e. k-fold cross-validation.\n",
    "linregressor = LinearRegression()\n",
    "cvscore = cross_val_score(linregressor, X, y, cv=5)\n",
    "# cv is the number of folds (default 5); cvscore holds one score per fold.\n",
    "# E.g. with 500 samples and 5 folds of 100, each round tests on one fold (100 samples)\n",
    "# and trains on the other 400; the 5 scores are then averaged."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a82e63c1",
   "metadata": {},
   "source": [
    "cross_val_score is a scikit-learn function for model evaluation: it splits the data into k subsets, runs k rounds of training and validation with a different subset held out each time, and returns the per-fold scores (for regressors, R^2 by default)."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 33,
   "id": "d9bfd65e",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "array([ 0.63919994,  0.71386698,  0.58702344,  0.07923081, -0.25294154])"
      ]
     },
     "execution_count": 33,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "cvscore"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 34,
   "id": "d6991d60",
   "metadata": {},
   "outputs": [],
   "source": [
    "mean_score = np.mean(cvscore)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 35,
   "id": "68c22c54",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "0.3532759243958772"
      ]
     },
     "execution_count": 35,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "mean_score"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "84ce38a8",
   "metadata": {},
   "source": [
    "After 5-fold cross-validation, the linear regression model's mean score is 0.3532759243958772, indicating that a plain (unregularized) linear regression model does not fit this data well."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b73c2ebc",
   "metadata": {},
   "source": [
    "##### Although this model does not show signs of overfitting, if overfitting did occur it could be addressed in the following two ways:"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "67f99b6e",
   "metadata": {},
   "source": [
    "#### (2) Ridge Regression"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1709db23",
   "metadata": {},
   "source": [
    "Ridge\n",
    "\n",
    "Ridge regression is a shrinkage method that controls the size of the coefficients in a linear regression model, mainly to address multicollinearity and overfitting. Unlike algorithms such as principal-component regression that transform X, ridge regression adds an L2 regularization term to the loss, penalizing large coefficients to prevent overfitting.\n",
    "\n",
    "The regularization strength is controlled by the λ (alpha in scikit-learn) parameter: as λ grows the penalty weighs more heavily and the coefficients shrink; as λ shrinks the penalty weakens and the coefficients can grow.\n",
    "\n",
    "Both ridge and Lasso regression are powerful techniques for building parsimonious models from large numbers of features, even feature counts big enough to strain computing hardware. Overall, ridge regression is a robust, widely used machine-learning tool that applies to many real-world problems.\n",
    "\n",
    "GridSearchCV\n",
    "\n",
    "GridSearchCV is a cross-validated hyperparameter-search technique: within a specified parameter grid it exhaustively evaluates every combination and keeps the one that scores best on the validation folds. It is the standard way to find a model's optimal hyperparameters.\n",
    "\n",
    "To use it, first define a dict (or list of dicts) of candidate parameter values, then pass it to a GridSearchCV object together with the scoring metric and the cross-validation strategy. GridSearchCV trains and scores the model on every CV split for every parameter combination, then returns the combination with the highest mean score, exposed via best_params_, best_score_ and the refitted best estimator."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 41,
   "id": "db6b2e91",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "GridSearchCV(cv=5, estimator=Ridge(),\n",
       "             param_grid={'alpha': [1e-15, 1e-10, 1e-08, 0.001, 0.01, 1, 5, 10,\n",
       "                                   20, 30, 35, 40, 45, 50, 55, 100, 200, 500,\n",
       "                                   1000]})"
      ]
     },
     "execution_count": 41,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "from sklearn.linear_model import Ridge\n",
    "from sklearn.model_selection import GridSearchCV\n",
    " \n",
    "ridge=Ridge()\n",
    "parameters={'alpha':[1e-15,1e-10,1e-8,1e-3,1e-2,1,5,10,20,30,35,40,45,50,55,100,200,500,1000]}\n",
    "ridgeregressor=GridSearchCV(ridge,parameters,cv=5)\n",
    "ridgeregressor.fit(X,y)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 42,
   "id": "895c9291",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'alpha': 200}\n",
      "0.49798762179623124\n"
     ]
    }
   ],
   "source": [
    "print(ridgeregressor.best_params_)\n",
    "print(ridgeregressor.best_score_)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "58259af4",
   "metadata": {},
   "source": [
    "Judged by this score, the model is clearly better than the previous one. Ridge regression, however, depends on finding a suitable alpha: once a rough best value is identified, it pays to search again over a finer grid around it.\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 44,
   "id": "e52430ae",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "GridSearchCV(cv=5, estimator=Ridge(),\n",
       "             param_grid={'alpha': [160, 170, 180, 190, 200, 210, 230]})"
      ]
     },
     "execution_count": 44,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "#ridge=Ridge()\n",
    "parameters={'alpha':[160,170,180,190,200,210,230]}\n",
    "ridgeregressor=GridSearchCV(ridge,parameters,cv=5)\n",
    "ridgeregressor.fit(X,y)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 45,
   "id": "9cdca7a0",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'alpha': 180}\n",
      "0.4982367681072658\n"
     ]
    }
   ],
   "source": [
    "print(ridgeregressor.best_params_)\n",
    "print(ridgeregressor.best_score_)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "09d4609f",
   "metadata": {},
   "source": [
    "After refining the parameter, the model's score improved slightly, by about 0.0002 (from 0.49799 to 0.49824)."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a44d4a77",
   "metadata": {},
   "source": [
    "#### (3) Lasso Regression\n",
    "\n",
    "\n",
    "Lasso\n",
    "\n",
    "Lasso, short for Least Absolute Shrinkage and Selection Operator, is a data-mining method used mainly in multiple linear regression: a penalty term continually shrinks the coefficients, simplifying the model and guarding against collinearity and overfitting. A further important property of Lasso is variable selection: when a feature's coefficient is driven to exactly 0, that feature is automatically dropped, which avoids some of the problems of stepwise forward/backward variable-selection procedures.\n",
    "\n",
    "Mathematically, Lasso follows the same idea as ridge regression -- a regularization term added to the objective -- except that ridge uses an L2 penalty while Lasso uses L1.\n",
    "\n",
    "Overall, Lasso offers an efficient, flexible way to both avoid overfitting and perform variable selection, so it is widely used in small-sample, high-dimensional settings such as genomics and medical imaging."
   ]
  },
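  {
   "cell_type": "markdown",
   "id": "3c4d5e6a",
   "metadata": {},
   "source": [
    "Lasso's variable-selection behaviour can be sketched on synthetic data: with a strong enough L1 penalty, the coefficients of uninformative features are driven to exactly zero. The data and the alpha value below are illustrative only:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "4d5e6f7b",
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from sklearn.datasets import make_regression\n",
    "from sklearn.linear_model import Lasso\n",
    "\n",
    "# 10 features, of which only 3 actually influence y (synthetic, illustrative)\n",
    "X_demo, y_demo = make_regression(n_samples=200, n_features=10, n_informative=3, noise=5.0, random_state=0)\n",
    "\n",
    "lasso = Lasso(alpha=5.0).fit(X_demo, y_demo)\n",
    "n_zero = int(np.sum(lasso.coef_ == 0))\n",
    "print('coefficients driven exactly to zero:', n_zero)"
   ]
  },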
  {
   "cell_type": "code",
   "execution_count": 49,
   "id": "1eef9349",
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "C:\\ProgramData\\Anaconda3\\lib\\site-packages\\sklearn\\linear_model\\_coordinate_descent.py:530: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 4430.746729651311, tolerance: 3.9191485420792076\n",
      "  model = cd_fast.enet_coordinate_descent(\n",
      "C:\\ProgramData\\Anaconda3\\lib\\site-packages\\sklearn\\linear_model\\_coordinate_descent.py:530: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 4397.459304778431, tolerance: 3.3071316790123455\n",
      "  model = cd_fast.enet_coordinate_descent(\n",
      "C:\\ProgramData\\Anaconda3\\lib\\site-packages\\sklearn\\linear_model\\_coordinate_descent.py:530: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 3796.653037433508, tolerance: 2.813643886419753\n",
      "  model = cd_fast.enet_coordinate_descent(\n",
      "C:\\ProgramData\\Anaconda3\\lib\\site-packages\\sklearn\\linear_model\\_coordinate_descent.py:530: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 2564.292735790545, tolerance: 3.3071762123456794\n",
      "  model = cd_fast.enet_coordinate_descent(\n",
      "C:\\ProgramData\\Anaconda3\\lib\\site-packages\\sklearn\\linear_model\\_coordinate_descent.py:530: ConvergenceWarning: Objective did not converge. You might want to increase the number of iterations. Duality gap: 4294.252997826028, tolerance: 3.4809104444444445\n",
      "  model = cd_fast.enet_coordinate_descent(\n"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "{'alpha': 1}\n",
      "0.431848787926522\n"
     ]
    }
   ],
   "source": [
    "from sklearn.linear_model import Lasso\n",
    "from sklearn.model_selection import GridSearchCV\n",
    "lasso=Lasso()\n",
    "parameters={'alpha':[1e-15,1e-10,1e-8,1e-3,1e-2,1,5,10,20,30,35,40,45,50,55,100,200,300,500]}\n",
    "lassoregressor=GridSearchCV(lasso,parameters,cv=5)\n",
    " \n",
    "lassoregressor.fit(X,y)\n",
    "print(lassoregressor.best_params_)\n",
    "print(lassoregressor.best_score_)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "39a80221",
   "metadata": {},
   "source": [
    "The search above settles on alpha = 1 with a CV score of about 0.43; tuning alpha over a finer grid could yield a better model."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 51,
   "id": "41137fef",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "Lasso(alpha=1, fit_intercept=False, max_iter=10000000000000, positive=True,\n",
       "      tol=1e-14)"
      ]
     },
     "execution_count": 51,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Example of constructing a Lasso with customised settings (tighter tolerance,\n",
    "# far more iterations, non-negative coefficients); it is only instantiated here, not fitted.\n",
    "Lasso(alpha=1, fit_intercept=False, tol=0.00000000000001,max_iter=10000000000000, positive=True)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1b9a4fdb",
   "metadata": {},
   "source": [
    "Besides using cross-validation (CV) to have the algorithm train and evaluate the models for us, we can also train the Ridge and Lasso models with a traditional train/test split:"
   ]
  },
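  {
   "cell_type": "markdown",
   "id": "5e6f7a8c",
   "metadata": {},
   "source": [
    "A self-contained sketch of that split-then-evaluate workflow, using synthetic data and untuned, illustrative alpha values (so the numbers here say nothing about the models above):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "6f7a8b9d",
   "metadata": {},
   "outputs": [],
   "source": [
    "from sklearn.datasets import make_regression\n",
    "from sklearn.linear_model import Lasso, Ridge\n",
    "from sklearn.model_selection import train_test_split\n",
    "\n",
    "# Synthetic data for illustration only\n",
    "X_demo, y_demo = make_regression(n_samples=300, n_features=8, noise=15.0, random_state=0)\n",
    "X_tr, X_te, y_tr, y_te = train_test_split(X_demo, y_demo, test_size=0.3, random_state=0)\n",
    "\n",
    "ridge_demo = Ridge(alpha=1.0).fit(X_tr, y_tr)   # fit on the training split only\n",
    "lasso_demo = Lasso(alpha=1.0).fit(X_tr, y_tr)\n",
    "\n",
    "# .score() returns R^2 on the held-out test split\n",
    "print('ridge R^2:', round(ridge_demo.score(X_te, y_te), 3))\n",
    "print('lasso R^2:', round(lasso_demo.score(X_te, y_te), 3))"
   ]
  },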
  {
   "cell_type": "code",
   "execution_count": 53,
   "id": "2bf6748f",
   "metadata": {},
   "outputs": [],
   "source": [
    "from sklearn.model_selection import train_test_split\n",
    "X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 54,
   "id": "59f131d2",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Note: both regressors were fitted on all of (X, y) above, so predicting on\n",
    "# X_test here is illustrative rather than an unbiased held-out evaluation.\n",
    "prediction_lasso=lassoregressor.predict(X_test)\n",
    "prediction_ridge=ridgeregressor.predict(X_test)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 69,
   "id": "e0928fc8",
   "metadata": {
    "scrolled": true
   },
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<AxesSubplot:ylabel='Count'>"
      ]
     },
     "execution_count": 69,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAX4AAAD4CAYAAADrRI2NAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAP20lEQVR4nO3df6zddX3H8eeLFoYBNsu4kK62qzpiNGSW5cocmAV/pvMfwKiMLK7L2NpkssA0ZsT9Ic4sMYu/kmVh1EGsBhlswEBn1I6hzGHQW1KhrDiMQSht2qvMAFkyU3jvj/PtvLT3todyv+cHn+cjOTnf8znnfL/v+0nv6377OZ/z+aaqkCS144RxFyBJGi2DX5IaY/BLUmMMfklqjMEvSY1ZOe4ChnHGGWfU+vXrx12GJE2VHTt2/LiqZg5vn4rgX79+PXNzc+MuQ5KmSpIfLdbuUI8kNcbgl6TGGPyS1BiDX5IaY/BLUmMMfklqjMEvSY0x+CWpMQa/JDXG4JdehDVr15Fk2W9r1q4b94+ml7CpWLJBmlR79zzOpdfdu+z7vXnL+cu+T+kQz/glqTEGvyQ1xuCXpMYY/JLUGINfkhpj8EtSYwx+SWqMwS9JjTH4JakxBr8kNcbgl6TGGPyS1BiDX5Ia01vwJzk5yXeSfC/JQ0k+2rWfnmR7kke6+1V91SBJOlKfZ/z/C7ylql4PbAA2JnkjcDVwV1WdDdzVPZYkjUhvwV8Dz3QPT+xuBVwEbOvatwEX91WDJOlIvY7xJ1mRZCdwANheVfcBZ1XVPoDu/sw+a5AkPV+vwV9Vz1bVBuAVwHlJzhn2vUk2J5lLMjc/P99bjZLUmpHM6qmqnwLfADYC+5OsBujuDyzxnq1VNVtVszMzM6MoU5Ka0OesnpkkL++2Xwa8DXgYuBPY1L1sE3BHXzVIko7U58XWVwPbkqxg8Afmlqr6cpJvA7ckuRx4DHhPjzVIkg7TW/BX1QPAuYu0/wR4a1/HlSQdnd/claTGGPyS1BiDX5IaY/BLUmMMfklqjMEvSY0x+CWpMQa/JDXG4Jekxhj8ktQYg1+SGmPwS1JjDH5JaozBL0mNMfglqTEGvyQ1xuCXpMYY/JLUGINfkhpj8EtSYwx+SWqMwS9JjTH4JakxvQV/krVJ7k6yO8lDSa7s2q9J8kSSnd3tnX3VIEk60soe930Q+GBV3Z/kNGBHku3dc5+uqk/0eGxJ0hJ6C/6q2gfs67afTrIbWNPX8SRJwxnJGH+S9cC5wH1d0xVJHkhyQ5JVS7xnc5K5JHPz8/OjKFOSmtB78Cc5FbgVuKqqngKuBV4NbGDwP4JPLva+qtpaVbNVNTszM9N3mZLUjF6DP8mJDEL/xqq6DaCq9lfVs1X1HPBZ4Lw+a5AkPV+fs3oCXA/srqpPLWhfveBllwC7+qpBknSkPmf1XAC8D3gwyc6u7cPAZUk2AAU8CmzpsQZJ0mH6nNXzLSCLPPWVvo4pSTo2v7krSY0x+CWpMQa/JDXG4Jekxhj8ktQYg1+SGmPwS1JjDH5JaozBL0mNMfglqTEGvyQ1xuCXpMYY/JLUGINfmkQnrCTJst/WrF037p9ME6DP9fglHa/nDnLpdfcu+25v3nL+su9T08czfklqjMEvSY0x+CWpMQa/JDXG4Jekxhj8ktQYg1+SGtNb8CdZm+TuJLuTPJTkyq799CTbkzzS3a/qqwZJ0pH6POM/CHywql4LvBF4f5LXAVcDd1XV2cBd3WNJ0oj0FvxVta+q7u+2nwZ2A2uAi4Bt3cu2ARf3VYMk6UgjGeNPsh44F7gPOKuq9sHgjwNw5hLv2ZxkLsnc/Pz8KMqUpCb0HvxJTgVuBa6qqqeGfV9Vba2q2aqanZmZ6a9ASWpMr8Gf5EQGoX9jVd3WNe9Psrp7fjVwoM8aJEnP1+esngDXA7ur6lMLnroT2NRtbwLu6KsGSdKR+lyW+QLgfcCDSXZ2bR8GPg7ckuRy
4DHgPT3WIEk6zFDBn+SCqvqPY7UtVFXfArLE028dvkRJ0nIadqjnb4ZskyRNuKOe8Sf5LeB8YCbJBxY89YvAij4LkyT141hDPScBp3avO21B+1PAu/sqSpLUn6MGf1V9E/hmks9V1Y9GVJMkqUfDzur5hSRbgfUL31NVb+mjKElSf4YN/n8E/g74e+DZ/sqRJPVt2OA/WFXX9lqJJGkkhp3O+aUkf5Jkdbee/ulJTu+1MklSL4Y94z+0xMKHFrQV8KrlLUeS1Lehgr+qXtl3IZKk0Rh2yYbfX6y9qj6/vOVIkvo27FDPGxZsn8xgrZ37AYNfkqbMsEM9f7rwcZJfAr7QS0WSpF4d73r8/wOcvZyFSBqBE1aSZNlva9auG/dPphdg2DH+LzGYxQODxdleC9zSV1GSevLcQS697t5l3+3NW85f9n2qP8OO8X9iwfZB4EdVtaeHeiRJPRtqqKdbrO1hBit0rgJ+1mdRkqT+DBX8Sd4LfIfBZRLfC9yXxGWZJWkKDTvU8xfAG6rqAECSGeBfgX/qqzBJUj+GndVzwqHQ7/zkBbxXkjRBhj3j/2qSrwE3dY8vBb7ST0mSpD4d65q7vwacVVUfSvIu4E1AgG8DN46gPknSMjvWcM1ngKcBquq2qvpAVf0Zg7P9z/RbmiSpD8cK/vVV9cDhjVU1x+AyjEtKckOSA0l2LWi7JskTSXZ2t3ceV9WSpON2rOA/+SjPvewY7/0csHGR9k9X1Ybu5ucEkjRixwr+7yb548Mbk1wO7DjaG6vqHuDJF1GbJKkHx5rVcxVwe5Lf4+dBPwucBFxynMe8olvffw74YFX992IvSrIZ2Aywbp0LQEkTrVv8bTn9yivW8sTjjy3rPjVw1OCvqv3A+UneDJzTNf9LVf3bcR7vWuBjDBZ8+xjwSeAPlzj2VmArwOzsbC32GkkToofF31z4rT/Drsd/N3D3iz1Y94cEgCSfBb78YvcpSXphRvrt2ySrFzy8BNi11GslSf0Y9pu7L1iSm4ALgTOS7AE+AlyYZAODoZ5HgS19HV+StLjegr+qLluk+fq+jidJGo4LrUlSYwx+SWqMwS9JjTH4JakxBr8kNcbgl6TGGPxqxpq160iyrDdpGvU2j1+aNHv3PO56MhKe8UtScwx+SWqMwS9JjTH4JakxBr+kydRd1Wu5b2vWekU/Z/VImkw9XNULnIkFnvFLUnMMfklqjMEvSY0x+CWpMQa/JDXG4NfE6WMxNRdUk37O6ZyaOH0spgZO45MO8YxfkhrTW/AnuSHJgSS7FrSdnmR7kke6+1V9HV+StLg+z/g/B2w8rO1q4K6qOhu4q3ssSRqh3oK/qu4Bnjys+SJgW7e9Dbi4r+NLkhY36jH+s6pqH0B3f+ZSL0yyOclckrn5+fmRFShJL3UT++FuVW2tqtmqmp2ZmRl3OZL0kjHq4N+fZDVAd39gxMeXpOaNOvjvBDZ125uAO0Z8fElqXp/TOW8Cvg28JsmeJJcDHwfenuQR4O3dY0nSCPX2zd2qumyJp97a1zElScc2sR/uavK5po40nVyrR8fNNXWk6eQZvyQ1xuCXpMYY/JLUGINfkhpj8EtSYwx+SWqMwS9JjTH4JakxBr8kNcbgl6TGGPyS1BiDX5IaY/BLUmMMfklqjMEvSY0x+CWpMQa/JDXG4Jekxhj8ktQYg1+SGjOWi60neRR4GngWOFhVs+OoQ5JaNJbg77y5qn48xuNLUpMc6pGkxowr+Av4epIdSTYv9oIkm5PMJZmbn58fcXnjs2btOpIs623lSScv+z6TjLurpONzwspefh/WrF037p9saOMa6rmgqvYmORPYnuThqrpn4QuqaiuwFWB2drbGUeQ47N3zOJded++y7vPmLecv+z4P7VeaOs8dbP73YSxn/FW1t7s/ANwOnDeOOiSpRSMP/iSnJDnt0DbwDmDXqOuQpFaNY6jnLOD2box4JfDFqvrqGOqQpCaNPPir6ofA60d9XEnSgNM5
JakxBr8kNcbgl6TGGPyS1BiDX5IaY/BLUmMMfklqjMEvSY0x+CWpMQa/JDXG4Jekxhj8ktQYg1+SGvOSD/4+LmU4bZdZkzQCU3RJx3FdenFk+riUIUzXZdYkjcAUXdLxJX/GL0l6PoNfkhpj8EtSYwx+SWqMwS9JjXnJz+rpTTd1S5KmjcF/vKZo6pYkLeRQjyQ1ZizBn2Rjku8n+UGSq8dRgyS1auTBn2QF8LfA7wCvAy5L8rpR1yFJrRrHGf95wA+q6odV9TPgH4CLxlCHJDUpVTXaAybvBjZW1R91j98H/GZVXXHY6zYDm7uHrwG+v8juzgB+3GO5y2EaaoTpqHMaaoTpqNMal88k1/mrVTVzeOM4ZvUsNgfyiL8+VbUV2HrUHSVzVTW7XIX1YRpqhOmocxpqhOmo0xqXz7TUudA4hnr2AGsXPH4FsHcMdUhSk8YR/N8Fzk7yyiQnAb8L3DmGOiSpSSMf6qmqg0muAL4GrABuqKqHjnN3Rx0KmhDTUCNMR53TUCNMR53WuHympc7/N/IPdyVJ4+U3dyWpMQa/JDVmKoN/WpZ8SPJokgeT7EwyN+56AJLckORAkl0L2k5Psj3JI939qnHW2NW0WJ3XJHmi68+dSd455hrXJrk7ye4kDyW5smufmP48So2T1pcnJ/lOku91dX60a5+kvlyqxonqy2FM3Rh/t+TDfwFvZzA19LvAZVX1n2MtbBFJHgVmq2pivtyR5LeBZ4DPV9U5XdtfA09W1ce7P6SrqurPJ7DOa4BnquoT46ztkCSrgdVVdX+S04AdwMXAHzAh/XmUGt/LZPVlgFOq6pkkJwLfAq4E3sXk9OVSNW5kgvpyGNN4xu+SDy9CVd0DPHlY80XAtm57G4NgGKsl6pwoVbWvqu7vtp8GdgNrmKD+PEqNE6UGnukentjdisnqy6VqnDrTGPxrgMcXPN7DBP5D7hTw9SQ7uiUoJtVZVbUPBkEBnDnmeo7miiQPdENBYx+SOiTJeuBc4D4mtD8PqxEmrC+TrEiyEzgAbK+qievLJWqECevLY5nG4B9qyYcJcUFV/QaDlUjf3w1f6PhdC7wa2ADsAz451mo6SU4FbgWuqqqnxl3PYhapceL6sqqeraoNDL7Nf16Sc8Zc0hGWqHHi+vJYpjH4p2bJh6ra290fAG5nMEw1ifZ3Y8GHxoQPjLmeRVXV/u4X7zngs0xAf3ZjvbcCN1bVbV3zRPXnYjVOYl8eUlU/Bb7BYOx8ovrykIU1TnJfLmUag38qlnxIckr3YRpJTgHeAew6+rvG5k5gU7e9CbhjjLUs6VAAdC5hzP3Zfdh3PbC7qj614KmJ6c+lapzAvpxJ8vJu+2XA24CHmay+XLTGSevLYUzdrB6AbrrUZ/j5kg9/Nd6KjpTkVQzO8mGwNMYXJ6HOJDcBFzJYSnY/8BHgn4FbgHXAY8B7qmqsH6wuUeeFDP47XcCjwJZD47/jkORNwL8DDwLPdc0fZjCGPhH9eZQaL2Oy+vLXGXx4u4LBCektVfWXSX6ZyenLpWr8AhPUl8OYyuCXJB2/aRzqkSS9CAa/JDXG4Jekxhj8ktQYg1+SGmPwS1JjDH5Jasz/ATRwtmAafIE7AAAAAElFTkSuQmCC\n",
      "text/plain": [
       "<Figure size 432x288 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "import seaborn as sns\n",
    "\n",
    "#sns.kdeplot(prediction_lasso)\n",
    "sns.histplot(prediction_lasso)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 68,
   "id": "94f5d092",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "<seaborn.axisgrid.FacetGrid at 0x15ac6edfe20>"
      ]
     },
     "execution_count": 68,
     "metadata": {},
     "output_type": "execute_result"
    },
    {
     "data": {
      "image/png": "iVBORw0KGgoAAAANSUhEUgAAAWAAAAFgCAYAAACFYaNMAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAQhElEQVR4nO3df4xlBXmH8ecLi9UAtksFsl12g7bEaEyLzUhbMI0/qtlSU9SolLSWRtolqTRQja3VpKXxH9P4K2ka6iJEtBS1BSJag1KkEqNBF1xh6WqxBmXZDbuUNECa1C68/WPOxsm6Mzu73HPfOzPPJ7mZe8/Mvefl5M7D3TP33JOqQpI0fcd1DyBJa5UBlqQmBliSmhhgSWpigCWpybruAZZjy5Ytdeutt3aPIUnHKodbuCJeAT/66KPdI0jSxK2IAEvSamSAJamJAZakJgZYkpoYYElqYoAlqYkBlqQmBliSmhhgSWpigCWpiQGWpCYGWJKaGGBJamKAJamJAdaat3HTZpJM7LJx0+bu/yStECviA9mlMe3Z/RAXfvRrE3u8T1967sQeS6ubr4AlqYkBlqQmBliSmhhgSWpigCWpiQGWpCYGWJKaGGBJamKAJamJAZYm7bh1HtqsZfFQZGnSnj7goc1aFl8BS1ITAyxJTQywJDUxwJLUxABLUhMDLElNRgtwkk1J7kiyK8n9SS4fll+Z5OEkO4bL+WPNIEmzbMz3AR8A3llV9yQ5Gbg7yW3D9z5cVR8Ycd2SNPNGC3BV7QX2DtefSLIL2DjW+iRppZnKPuAkZwIvBe4aFl2W5N4k1yZZv8h9tibZnmT7/v37pzGmVohJn8VY6jL6ochJTgJuBK6oqseTXAW8D6jh6weBtx16v6raBmwDmJubq7Hn1MrhWYy1Woz6CjjJCczH9/qqugmgqh6pqqeq6mngauCcMWeQpFk15rsgAlwD7KqqDy1YvmHBj70B2DnWDJI0y8bcBXEe8FbgviQ7hmXvAS5KcjbzuyAeBC4dcQZJmlljvgviq8Dh/sLxhbHWKUkriUfCSVITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1GS0ACfZlOSOJLuS3J/k8mH5KUluS/LA8HX9WDNI0iwb8xXwAeCdVfUi4FeBtyd5MfBu4PaqOgu4fbgtSWvOaAGuqr1Vdc9w/QlgF7ARuAC4bvix64DXjzWDJM2yqewDTnIm8FLgLuD0qtoL85EGTlvkPluTbE+yff/+/dMYU5KmavQAJzkJuBG4oqoeX+79qmpbVc1V1dypp5463oCS1GTUACc5gfn4Xl9VNw2LH0myYfj+BmDfmDNI0qwa810QAa4BdlXVhxZ86xbg4uH6xcBnx5pBkmbZuhEf+zzgrcB9SXYMy94DvB/4TJJLgB8Cbx5xBkmaWaMFuKq+CmSRb796rPVK0krhkXCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1WVaAk5y3nGWHfP/aJPuS7Fyw7MokDyfZMVzOP/qR
JWl1WO4r4L9d5rKFPg5sOczyD1fV2cPlC8tcvyStOuuW+maSXwPOBU5N8o4F33oucPxS962qO5Oc+YwnlKRV6kivgJ8FnMR8qE9ecHkceNMxrvOyJPcOuyjWL/ZDSbYm2Z5k+/79+49xVZIOtXHTZpJM5LJx0+bu/5wVbclXwFX1FeArST5eVT+YwPquAt4H1PD1g8DbFln3NmAbwNzcXE1g3ZKAPbsf4sKPfm0ij/XpS8+dyOOsVUsGeIGfSrINOHPhfarqVUezsqp65OD1JFcDnz+a+0vSarLcAP8T8PfAx4CnjnVlSTZU1d7h5huAnUv9vCStZssN8IGquupoHjjJDcArgOcl2Q38FfCKJGczvwviQeDSo3lMSVpNlhvgzyX5Y+Bm4H8PLqyqxxa7Q1VddJjF1xzdeJK0ei03wBcPX9+1YFkBL5jsOJK0diwrwFX1/LEHkaS1ZlkBTvL7h1teVZ+Y7DiStHYsdxfEyxZcfzbwauAewABL0jFa7i6IP1l4O8lPA58cZSJJWiOO9eMo/wc4a5KDSNJas9x9wJ9j/l0PMP8hPC8CPjPWUJK0Fix3H/AHFlw/APygqnaPMI8krRnL2gUxfCjPd5j/JLT1wI/GHEqS1oLlnhHjLcA3gDcDbwHuSnKsH0cpSWL5uyDeC7ysqvYBJDkV+Ffgn8caTJJWu+W+C+K4g/Ed/NdR3FeSdBjLfQV8a5IvAjcMty8EPJ+bJD0DRzon3C8Ap1fVu5K8EXg5EODrwPVTmE+SVq0j7Ub4CPAEQFXdVFXvqKo/Zf7V70fGHU2SVrcjBfjMqrr30IVVtZ350xNJko7RkQL87CW+95xJDqLVbZJn4pVWiyP9Ee6bSf6oqq5euDDJJcDd442l1cYz8Uo/6UgBvgK4Ocnv8uPgzgHPYv6kmpKkY7RkgIfTyJ+b5JXAS4bF/1JVXx59Mkla5Zb7ecB3AHeMPIskrSkezSZJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAOqxJnkbeU8lLh7esc8Jp7ZnkaeTBU8lLh+MrYElqYoAlqYkBlqQmBliSmhhgSWpigCWpyWgBTnJtkn1Jdi5YdkqS25I8MHxdP9b6JWnWjfkK+OPAlkOWvRu4varOAm4fbkvSmjRagKvqTuCxQxZfAFw3XL8OeP1Y65ekWTftI+FOr6q9AFW1N8lpi/1gkq3AVoDNmzdPaTxpBh23zsO5V6mZPRS5qrYB2wDm5uaqeRypz9MHPCx8lZr2uyAeSbIBYPi6b8rrl6SZMe0A3wJcPFy/GPjslNcvSTNjzLeh3QB8HXhhkt1JLgHeD7wmyQPAa4bbkrQmjbYPuKouWuRbrx5rnZK0kngknCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQzwKrFx02aSTOwiLctx6yb6vNu4aXP3f9FUreseQJOxZ/dDXPjRr03s8T596bkTeyytYk8f8Hn3DPgKWJKaGGBJamKAJamJAZakJgZYkpoYYElq0vI2tCQPAk8ATwEHqmquYw5J6tT5PuBXVtWjjeuXpFbugpCkJl0BLuBLSe5OsvVwP5Bka5LtSbbv379/yuONz0OHJXXtgjivqvYkOQ24Lcl3qurOhT9QVduAbQBzc3PVMeSYPHRYUssr4KraM3zdB9wMnNMxhyR1mnqAk5yY5OSD14HXAjunPYckdevYBXE6cPOw33Id8I9VdWvDHJLUauoBrqrvA7807fVK0qzxbWiS
1MQAS1ITAyxJTQywJDUxwJLUxABLmh1r7CzLnhVZ0uxYY2dZ9hWwJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUhMDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1WdUB3rhp85o6xbWklWVVn5Z+z+6H1tQpriWtLKv6FbAkzTIDLElNDLAkNTHAktTEAEtSEwMsSU0MsCQ1McCS1MQAS1KTVX0k3MQdt44k3VNIWq4J/87+3BmbePihH07s8Qzw0Xj6wMQObfawZmkKJvg7C5P/vXUXhCQ1McCS1MQAS1ITAyxJTQywJDUxwJLUxABLUpOWACfZkuS7Sb6X5N0dM0hSt6kHOMnxwN8Bvwm8GLgoyYunPYckdet4BXwO8L2q+n5V/Qj4FHBBwxyS1CpVNd0VJm8CtlTVHw633wr8SlVddsjPbQW2DjdfCHx3kYd8HvDoSOMeLWdZ3CzNM0uzwGzN4yyLeybzPFpVWw5d2PFZEIf7ZIyf+L9AVW0Dth3xwZLtVTU3icGeKWdZ3CzNM0uzwGzN4yyLG2Oejl0Qu4FNC26fAexpmEOSWnUE+JvAWUmen+RZwO8AtzTMIUmtpr4LoqoOJLkM+CJwPHBtVd3/DB7yiLsppshZFjdL88zSLDBb8zjL4iY+z9T/CCdJmueRcJLUxABLUpMVG+BZO5w5yYNJ7kuyI8n2Ka/72iT7kuxcsOyUJLcleWD4ur55niuTPDxsnx1Jzp/SLJuS3JFkV5L7k1w+LJ/69llilqlvmyTPTvKNJN8eZvnrYXnL82aJeVqeN8O6j0/yrSSfH25PfNusyH3Aw+HM/wG8hvm3tX0TuKiq/r1xpgeBuaqa+hvHk/w68CTwiap6ybDsb4DHqur9w/+g1lfVnzfOcyXwZFV9YBozLJhlA7Chqu5JcjJwN/B64A+Y8vZZYpa3MOVtk/kzVZ5YVU8mOQH4KnA58EYanjdLzLOFhufNMNM7gDnguVX1ujF+p1bqK2APZ16gqu4EHjtk8QXAdcP165j/Re+cp0VV7a2qe4brTwC7gI00bJ8lZpm6mvfkcPOE4VI0PW+WmKdFkjOA3wI+tmDxxLfNSg3wRuChBbd30/REXqCALyW5eziMutvpVbUX5n/xgdOa5wG4LMm9wy6Kqe0SOSjJmcBLgbto3j6HzAIN22b4J/YOYB9wW1W1bpdF5oGe581HgD8Dnl6wbOLbZqUGeFmHM0/ZeVX1y8x/ytvbh3+G68euAn4eOBvYC3xwmitPchJwI3BFVT0+zXUvY5aWbVNVT1XV2cwfjXpOkpdMY71HOc/Ut02S1wH7qurusde1UgM8c4czV9We4es+4Gbmd5N0emTY53hw3+O+zmGq6pHhF+xp4GqmuH2GfYo3AtdX1U3D4pbtc7hZOrfNsP7/Bv6N+f2t7c+bhfM0bZvzgN8e/q7zKeBVSf6BEbbNSg3wTB3OnOTE4Y8qJDkReC2wc+l7je4W4OLh+sXAZxtnOfiEPegNTGn7DH/cuQbYVVUfWvCtqW+fxWbp2DZJTk3yM8P15wC/AXyHpufNYvN0bJuq+ouqOqOqzmS+LV+uqt9jjG1TVSvyApzP/Dsh/hN4b/MsLwC+PVzun/Y8wA3M//Ps/5j/18ElwM8CtwMPDF9PaZ7nk8B9wL3DE3nDlGZ5OfO7p+4FdgyX8zu2zxKzTH3bAL8IfGtY507gL4flLc+bJeZped4smOsVwOfH2jYr8m1okrQarNRdEJK04hlgSWpigCWpiQGWpCYGWJKaGGBJamKAJanJ/wMyaarJtNkq0QAAAABJRU5ErkJggg==\n",
      "text/plain": [
       "<Figure size 360x360 with 1 Axes>"
      ]
     },
     "metadata": {
      "needs_background": "light"
     },
     "output_type": "display_data"
    }
   ],
   "source": [
    "import seaborn as sns\n",
    " \n",
    "#sns.kdeplot(prediction_ridge)\n",
    "sns.displot(prediction_ridge)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "28bfb29e",
   "metadata": {},
   "outputs": [],
   "source": []
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.9.7"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
