<!DOCTYPE html>
<html lang="zh-CN">
  <head>
    
<meta charset="UTF-8"/>
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1"/>


<meta http-equiv="Cache-Control" content="no-transform" />
<meta http-equiv="Cache-Control" content="no-siteapp" />

<meta name="theme-color" content="#f8f5ec" />
<meta name="msapplication-navbutton-color" content="#f8f5ec">
<meta name="apple-mobile-web-app-capable" content="yes">
<meta name="apple-mobile-web-app-status-bar-style" content="default">



  <meta name="description" content="Solving least squares with gradient descent"/>




  <meta name="keywords" content="machine learning, gradient descent, 八一" />



  <meta name="baidu-site-verification" content="HhUstaSjr0" />



  <meta name="google-site-verification" content="UA-102975942-1" />






  <link rel="alternate" href="/atom.xml" title="八一">




  <link rel="shortcut icon" type="image/x-icon" href="/favicon.ico?v=2.6.0" />



<link rel="canonical" href="https://bay1.top/2018/05/11/梯度下降求解最小二乘/"/>


<link rel="stylesheet" type="text/css" href="/css/style.css?v=2.6.0" />
<link rel="stylesheet" type="text/css" href="/css/prettify.css" media="screen" />
<link rel="stylesheet" type="text/css" href="/css/sons-of-obsidian.css" media="screen" />



  <link rel="stylesheet" type="text/css" href="/lib/fancybox/jquery.fancybox.css" />




  
  <script id="baidu_analytics">
    var _hmt = _hmt || [];
    (function() {
      var hm = document.createElement("script");
      hm.src = "https://hm.baidu.com/hm.js?9a885cc9fb6cd7bcef579deb8efe8a70";
      var s = document.getElementsByTagName("script")[0];
      s.parentNode.insertBefore(hm, s);
    })();
  </script>



  <script id="google_analytics">
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
        (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
        m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
        })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

        ga('create', 'UA-102975942-1', 'auto');
        ga('send', 'pageview');
  </script>










    <title> Gradient Descent for Least Squares - 八一 </title>
  </head>

  <body><div id="mobile-navbar" class="mobile-navbar">
  <div class="mobile-header-logo">
    <a href="/." class="logo">八一</a>
  </div>
  <div class="mobile-navbar-icon">
    <span></span>
    <span></span>
    <span></span>
  </div>
</div>

<nav id="mobile-menu" class="mobile-menu slideout-menu">
  <ul class="mobile-menu-list">
    
      <a href="/archives">
        <li class="mobile-menu-item">
          
          
            Articles
          
        </li>
      </a>
    
      <a href="/tags">
        <li class="mobile-menu-item">
          
          
            Tags
          
        </li>
      </a>
    
      <a href="/about">
        <li class="mobile-menu-item">
          
          
            About / Links
          
        </li>
      </a>
    
      <a href="/search">
        <li class="mobile-menu-item">
          
          
            Search
          
        </li>
      </a>
    
  </ul>
</nav>

    <div class="container" id="mobile-panel">
      <header id="header" class="header"><div class="logo-wrapper">
  <a href="/." class="logo">八一</a>
</div>

<nav class="site-navbar">
  
    <ul id="menu" class="menu">
      
        <li class="menu-item">
          <a class="menu-item-link" href="/archives">
            
            
              Articles
            
          </a>
        </li>
      
        <li class="menu-item">
          <a class="menu-item-link" href="/tags">
            
            
              Tags
            
          </a>
        </li>
      
        <li class="menu-item">
          <a class="menu-item-link" href="/about">
            
            
              About / Links
            
          </a>
        </li>
      
        <li class="menu-item">
          <a class="menu-item-link" href="/search">
            
            
              Search
            
          </a>
        </li>
      
    </ul>
  
</nav>

      </header>

      <main id="main" class="main">
        <div class="content-wrapper">
          <div id="content" class="content">
            
  
  <article class="post">
    <header class="post-header">
      <h1 class="post-title">
        
          Gradient Descent for Least Squares
        
      </h1>

      <div class="post-meta">
        <span class="post-time">
          2018-05-11
        </span>
        
        
        
      </div>
    </header>

    
    
  <div class="post-toc" id="post-toc">
    <h2 class="post-toc-title">Table of Contents</h2>
    <div class="post-toc-content">
      <ol class="toc"><li class="toc-item toc-level-2"><a class="toc-link" href="#梯度下降法"><span class="toc-text">Gradient Descent</span></a><ol class="toc-child"><li class="toc-item toc-level-3"><a class="toc-link" href="#求解一元线性"><span class="toc-text">Solving the Univariate Linear Case</span></a></li><li class="toc-item toc-level-3"><a class="toc-link" href="#求解多元"><span class="toc-text">Solving the Multivariate Case</span></a></li></ol></li></ol>
    </div>
  </div>


    <div class="post-content">
      
        <p>Recently my senior Tangjin (烫金) has been studying machine learning, so I found some spare time and hopped on board too~<a id="more"></a><br><a href="https://github.com/codeboytj/machineLearning" target="_blank" rel="noopener">github</a><br>(What follows are my own rough notes on the parts of the articles referenced at the end that I did not fully understand.)</p>
<h2 id="梯度下降法"><a href="#梯度下降法" class="headerlink" title="Gradient Descent"></a>Gradient Descent</h2><h3 id="求解一元线性"><a href="#求解一元线性" class="headerlink" title="Solving the univariate linear case"></a>Solving the Univariate Linear Case</h3><p>Gradient descent is an iterative algorithm: when solving a least-squares problem, it repeatedly steps a fixed distance in the negative gradient direction (the positive gradient direction is, by definition, the direction of fastest increase) until the result meets our requirements.<br>(The step size is what we call the learning rate.)</p>
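<p>A quick way to see the idea: minimize a one-variable function by stepping against its derivative. The sketch below is my own illustration, not from the referenced articles; it minimizes f(w) = (w - 3)^2, whose derivative is 2(w - 3), and with learning rate 0.1 the iterate approaches the minimizer w = 3.</p>

```python
# Minimal illustration of gradient descent in one variable:
# minimize f(w) = (w - 3)**2 by stepping against f'(w) = 2 * (w - 3).
def gradient_descent_1d(w=0.0, alpha=0.1, steps=100):
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)  # derivative: direction of fastest increase
        w -= alpha * grad       # step in the negative gradient direction
    return w

print(gradient_descent_1d())  # approaches the minimizer w = 3
```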
<blockquote>
<p>Using machine learning here means fitting the equation we need so that it matches the target "model equation" as closely as possible.<br>For a univariate linear equation, we plug in the training data and want each data point to deviate from its expected value as little as possible.<br>That is, for y = ax + b, we can build the cost function (the model by which we judge the learning result) from ax + b - y.<br>After a series of transformations, we obtain the partial derivatives for the univariate case below.</p>
</blockquote>
<p><img src="https://s1.ax1x.com/2018/05/04/CNpi40.png" alt="partial-derivative formulas"></p>
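<p>To double-check the partial derivatives shown above, we can compare them against central finite differences of the cost function at an arbitrary point. This check is my own addition, using the same training data as the code below.</p>

```python
# Verify the analytic partial derivatives against finite differences.
x = [1., 2., 3., 4., 5., 6., 7., 8., 9.]
y = [3., 5., 7., 9., 11., 13., 15., 17., 19.]
m = len(x)

def cost(a, b):
    # J(a, b) = sum((a*x_i + b - y_i)^2) / (2m)
    return sum((a * xi + b - yi) ** 2 for xi, yi in zip(x, y)) / (2 * m)

def analytic_grad(a, b):
    # the formulas from the figure above
    grad_a = sum(xi * (a * xi + b - yi) for xi, yi in zip(x, y)) / m
    grad_b = sum(a * xi + b - yi for xi, yi in zip(x, y)) / m
    return grad_a, grad_b

a, b, h = 0.5, -1.0, 1e-6
num_a = (cost(a + h, b) - cost(a - h, b)) / (2 * h)
num_b = (cost(a, b + h) - cost(a, b - h)) / (2 * h)
print(analytic_grad(a, b))  # matches (num_a, num_b) to about six decimals
```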
<p>Then we translate the formulas into code:</p>
<figure class="highlight python"><table><tr><td class="code"><pre># encoding: utf-8
from sklearn import linear_model

# learning rate (step size)
alpha = 0.01
# convergence precision
epsilon = 1e-8
# target function: y = 2x + 1
x = [1., 2., 3., 4., 5., 6., 7., 8., 9.]
y = [3., 5., 7., 9., 11., 13., 15., 17., 19.]


# solve with scikit-learn's ordinary least squares
def solve_by_scikit():
    reg = linear_model.LinearRegression()
    reg.fit([[xi] for xi in x], y)
    print('{0} * x + {1}'.format(reg.coef_[0], reg.intercept_))


# solve univariate linear regression by gradient descent
def solve_by_gradient():
    m = len(x)
    a, b, sse_prev = 0., 0., 0.
    while True:
        grad_a, grad_b = 0., 0.
        for i in range(m):
            # partial derivatives of the cost w.r.t. a and b
            # share the common factor a*x[i] + b - y[i]
            common = a * x[i] + b - y[i]
            grad_a += x[i] * common
            grad_b += common
        grad_a /= m
        grad_b /= m

        # step in the negative gradient direction (fastest decrease);
        # alpha is the step size, i.e. the learning rate
        a -= alpha * grad_a
        b -= alpha * grad_b

        sse = 0
        for j in range(m):
            sse += (a * x[j] + b - y[j]) ** 2 / (2 * m)
        # stop once the cost barely changes between iterations
        if abs(sse_prev - sse) &lt; epsilon:
            break
        sse_prev = sse
    print('{0} * x + {1}'.format(a, b))


def main():
    try:
        print("scikit-learn result:")
        solve_by_scikit()
        print("gradient descent result:")
        solve_by_gradient()
    except BaseException as e:
        print("\n=&gt;error: ", e)


if __name__ == "__main__":
    main()
</pre></td></tr></table></figure>
<p><img src="https://s1.ax1x.com/2018/05/11/C0tOw6.png" alt="result"></p>
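<p>Because the training data lies exactly on y = 2x + 1, the result can also be cross-checked against the closed-form least-squares solution. This cross-check is my own addition; the article itself only uses gradient descent.</p>

```python
# Closed-form least squares for y = a*x + b (from the normal equations).
x = [1., 2., 3., 4., 5., 6., 7., 8., 9.]
y = [3., 5., 7., 9., 11., 13., 15., 17., 19.]
m = len(x)
mean_x = sum(x) / m
mean_y = sum(y) / m
num = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
den = sum((xi - mean_x) ** 2 for xi in x)
a = num / den            # slope
b = mean_y - a * mean_x  # intercept
print('{0} * x + {1}'.format(a, b))  # 2.0 * x + 1.0
```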
<h3 id="求解多元"><a href="#求解多元" class="headerlink" title="Solving the multivariate case"></a>Solving the Multivariate Case</h3><blockquote>
<p>To be studied.</p>
</blockquote>
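<p>As a starting point (my own sketch, assuming NumPy is available): stack the features into a matrix X with an extra column of ones for the bias, so the cost is J(w) = ||X*w - y||^2 / (2m) and its gradient is X.T*(X*w - y) / m. Gradient descent then updates the whole weight vector at once. The data below is hypothetical, chosen so the target is y = 2*x1 + 3*x2 + 1.</p>

```python
# Sketch of multivariate linear regression by gradient descent (NumPy).
import numpy as np

def solve_multivariate(X, y, alpha=0.01, steps=20000):
    m = len(y)
    Xb = np.hstack([X, np.ones((m, 1))])  # append a bias column of ones
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        grad = Xb.T @ (Xb @ w - y) / m    # all partial derivatives at once
        w -= alpha * grad                 # step against the gradient
    return w

# hypothetical training data for the target y = 2*x1 + 3*x2 + 1
X = np.array([[1., 2.], [2., 1.], [3., 3.], [4., 2.], [5., 5.], [6., 1.]])
y = X @ np.array([2., 3.]) + 1.
print(solve_multivariate(X, y))  # close to [2., 3., 1.]
```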
<p>PS: machine learning still demands a solid grounding in mathematics; you could say its problems are all math problems~</p>
<p>References: <a href="https://yq.aliyun.com/articles/232003?spm=a2c4e.11153940.blogcont541976.17.1e0b87a8LtU5Ti" target="_blank" rel="noopener">Gradient Descent: From Giving Up to Getting Started</a><br><a href="https://www.cnblogs.com/pinard/p/5970503.html" target="_blank" rel="noopener">A Summary of Gradient Descent</a></p>

      
    </div>

    
      
      



      
      
    

    
      <footer class="post-footer">
        
          <div class="post-tags">
            
              <a href="/tags/机器学习/">Machine Learning</a>
            
              <a href="/tags/梯度下降法/">Gradient Descent</a>
            
          </div>
        
        
        
  <nav class="post-nav">
    
      <a class="prev" href="/2018/05/11/某吧喇叭啦吐槽/">
        <i class="iconfont icon-left"></i>
        <span class="prev-text nav-default">某'吧喇叭啦'吐槽</span>
        <span class="prev-text nav-mobile">Previous</span>
      </a>
    
    
      <a class="next" href="/2018/05/11/百度AI人脸识别体验/">
        <span class="next-text nav-default">百度AI人脸识别体验</span>
        <span class="next-text nav-mobile">Next</span>
        <i class="iconfont icon-right"></i>
      </a>
    
  </nav>

      </footer>
    

  </article>


          </div>
          
  <div class="comments" id="comments">
      <div id="disqus_thread">
        <noscript>
          Please enable JavaScript to view the
          <a href="//disqus.com/?ref_noscript">comments powered by Disqus.</a>
        </noscript>
      </div> 
    </div>
  </div>


        </div>
      </main>

      <footer id="footer" class="footer">

  <div class="social-links">
    
      
        
          <a href="https://github.com/bay1" class="iconfont icon-github" title="github"></a>
        
      
    
      
        
          <a href="http://weibo.com/3190704711/profile?topnav=1&wvr=6&is_all=1" class="iconfont icon-weibo" title="weibo"></a>
        
      
    
      
    
      
    
      
    
    
    
  </div>


<div class="copyright">
  <span class="copyright-year">
    
    &copy; 
     
      2016 - 
    
    2018
    <span class="author">bay1</span>
  </span>
</div>
      </footer>

      <div class="back-to-top" id="back-to-top">
        <i class="iconfont icon-up"></i>
      </div>
    </div>

    
  
  <script type="text/javascript">
    var disqus_config = function () {
        this.page.url = 'https://bay1.top/2018/05/11/梯度下降求解最小二乘/';
        this.page.identifier = '2018/05/11/梯度下降求解最小二乘/';
        this.page.title = '梯度下降求解最小二乘';
    };
    (function() {
    var d = document, s = d.createElement('script');

    s.src = '//https-blog-flywinky-top-1.disqus.com/embed.js';

    s.setAttribute('data-timestamp', +new Date());
    (d.head || d.body).appendChild(s);
    })();  
  </script>



    
  





  
    <script type="text/javascript" src="/lib/jquery/jquery-3.1.1.min.js"></script>
  

  
    <script type="text/javascript" src="/lib/slideout/slideout.js"></script>
  

  
    <script type="text/javascript" src="/lib/fancybox/jquery.fancybox.pack.js"></script>
  


    <script type="text/javascript" src="/js/src/even.js?v=2.6.0"></script>
<script type="text/javascript" src="/js/src/bootstrap.js?v=2.6.0"></script>
<script src="/js/prettify.js"></script>
<script type="text/javascript">
$(document).ready(function(){
 $('pre').addClass('prettyprint');
   prettyPrint();
 })
</script>
  </body>
</html>
