

<!DOCTYPE html>
<html class="writer-html5" lang="zh" >
<head>
  <meta charset="utf-8">
  
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  
  <title>Settings &mdash; Scrapy 2.3.0 documentation</title>
  

  
  <link rel="stylesheet" href="../_static/css/theme.css" type="text/css" />
  <link rel="stylesheet" href="../_static/pygments.css" type="text/css" />
  <link rel="stylesheet" href="../_static/css/tooltipster.custom.css" type="text/css" />
  <link rel="stylesheet" href="../_static/css/tooltipster.bundle.min.css" type="text/css" />
  <link rel="stylesheet" href="../_static/css/tooltipster-sideTip-shadow.min.css" type="text/css" />
  <link rel="stylesheet" href="../_static/css/tooltipster-sideTip-punk.min.css" type="text/css" />
  <link rel="stylesheet" href="../_static/css/tooltipster-sideTip-noir.min.css" type="text/css" />
  <link rel="stylesheet" href="../_static/css/tooltipster-sideTip-light.min.css" type="text/css" />
  <link rel="stylesheet" href="../_static/css/tooltipster-sideTip-borderless.min.css" type="text/css" />
  <link rel="stylesheet" href="../_static/css/micromodal.css" type="text/css" />
  <link rel="stylesheet" href="../_static/css/sphinx_rtd_theme.css" type="text/css" />

  
  
  
  

  
  <!--[if lt IE 9]>
    <script src="../_static/js/html5shiv.min.js"></script>
  <![endif]-->
  
    
      <script type="text/javascript" id="documentation_options" data-url_root="../" src="../_static/documentation_options.js"></script>
        <script src="../_static/jquery.js"></script>
        <script src="../_static/underscore.js"></script>
        <script src="../_static/doctools.js"></script>
        <script src="../_static/language_data.js"></script>
        <script src="../_static/js/hoverxref.js"></script>
        <script src="../_static/js/tooltipster.bundle.min.js"></script>
        <script src="../_static/js/micromodal.min.js"></script>
    
    <script type="text/javascript" src="../_static/js/theme.js"></script>

    
    <link rel="index" title="Index" href="../genindex.html" />
    <link rel="search" title="Search" href="../search.html" />
    <link rel="next" title="Exceptions" href="exceptions.html" />
    <link rel="prev" title="Link Extractors" href="link-extractors.html" /> 
</head>

<body class="wy-body-for-nav">

   
  <div class="wy-grid-for-nav">
    
    <nav data-toggle="wy-nav-shift" class="wy-nav-side">
      <div class="wy-side-scroll">
        <div class="wy-side-nav-search" >
          

          
            <a href="../index.html" class="icon icon-home" alt="Documentation Home"> Scrapy
          

          
          </a>

          
            
            
              <div class="version">
                2.3
              </div>
            
          

          
<div role="search">
  <form id="rtd-search-form" class="wy-form" action="../search.html" method="get">
    <input type="text" name="q" placeholder="Search docs" />
    <input type="hidden" name="check_keywords" value="yes" />
    <input type="hidden" name="area" value="default" />
  </form>
</div>

          
        </div>

        
        <div class="wy-menu wy-menu-vertical" data-spy="affix" role="navigation" aria-label="main navigation">
          
            
            
              
            
            
              <p class="caption"><span class="caption-text">First steps</span></p>
<ul>
<li class="toctree-l1"><a class="reference internal" href="../intro/overview.html">Scrapy at a glance</a></li>
<li class="toctree-l1"><a class="reference internal" href="../intro/install.html">Installation guide</a></li>
<li class="toctree-l1"><a class="reference internal" href="../intro/tutorial.html">Scrapy Tutorial</a></li>
<li class="toctree-l1"><a class="reference internal" href="../intro/examples.html">Examples</a></li>
</ul>
<p class="caption"><span class="caption-text">Basic concepts</span></p>
<ul class="current">
<li class="toctree-l1"><a class="reference internal" href="commands.html">Command line tool</a></li>
<li class="toctree-l1"><a class="reference internal" href="spiders.html">Spiders</a></li>
<li class="toctree-l1"><a class="reference internal" href="selectors.html">Selectors</a></li>
<li class="toctree-l1"><a class="reference internal" href="items.html">Items</a></li>
<li class="toctree-l1"><a class="reference internal" href="loaders.html">Item Loaders</a></li>
<li class="toctree-l1"><a class="reference internal" href="shell.html">Scrapy shell</a></li>
<li class="toctree-l1"><a class="reference internal" href="item-pipeline.html">Item Pipeline</a></li>
<li class="toctree-l1"><a class="reference internal" href="feed-exports.html">Feed exports</a></li>
<li class="toctree-l1"><a class="reference internal" href="request-response.html">Requests and Responses</a></li>
<li class="toctree-l1"><a class="reference internal" href="link-extractors.html">Link Extractors</a></li>
<li class="toctree-l1 current"><a class="current reference internal" href="#">Settings</a><ul>
<li class="toctree-l2"><a class="reference internal" href="#designating-the-settings">Designating the settings</a></li>
<li class="toctree-l2"><a class="reference internal" href="#populating-the-settings">Populating the settings</a><ul>
<li class="toctree-l3"><a class="reference internal" href="#command-line-options">1. Command line options</a></li>
<li class="toctree-l3"><a class="reference internal" href="#settings-per-spider">2. Settings per-spider</a></li>
<li class="toctree-l3"><a class="reference internal" href="#project-settings-module">3. Project settings module</a></li>
<li class="toctree-l3"><a class="reference internal" href="#default-settings-per-command">4. Default settings per-command</a></li>
<li class="toctree-l3"><a class="reference internal" href="#default-global-settings">5. Default global settings</a></li>
</ul>
</li>
<li class="toctree-l2"><a class="reference internal" href="#import-paths-and-classes">Import paths and classes</a></li>
<li class="toctree-l2"><a class="reference internal" href="#how-to-access-settings">How to access settings</a></li>
<li class="toctree-l2"><a class="reference internal" href="#rationale-for-setting-names">Rationale for setting names</a></li>
<li class="toctree-l2"><a class="reference internal" href="#built-in-settings-reference">Built-in settings reference</a><ul>
<li class="toctree-l3"><a class="reference internal" href="#aws-access-key-id">AWS_ACCESS_KEY_ID</a></li>
<li class="toctree-l3"><a class="reference internal" href="#aws-secret-access-key">AWS_SECRET_ACCESS_KEY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#aws-endpoint-url">AWS_ENDPOINT_URL</a></li>
<li class="toctree-l3"><a class="reference internal" href="#aws-use-ssl">AWS_USE_SSL</a></li>
<li class="toctree-l3"><a class="reference internal" href="#aws-verify">AWS_VERIFY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#aws-region-name">AWS_REGION_NAME</a></li>
<li class="toctree-l3"><a class="reference internal" href="#asyncio-event-loop">ASYNCIO_EVENT_LOOP</a></li>
<li class="toctree-l3"><a class="reference internal" href="#bot-name">BOT_NAME</a></li>
<li class="toctree-l3"><a class="reference internal" href="#concurrent-items">CONCURRENT_ITEMS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#concurrent-requests">CONCURRENT_REQUESTS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#concurrent-requests-per-domain">CONCURRENT_REQUESTS_PER_DOMAIN</a></li>
<li class="toctree-l3"><a class="reference internal" href="#concurrent-requests-per-ip">CONCURRENT_REQUESTS_PER_IP</a></li>
<li class="toctree-l3"><a class="reference internal" href="#default-item-class">DEFAULT_ITEM_CLASS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#default-request-headers">DEFAULT_REQUEST_HEADERS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#depth-limit">DEPTH_LIMIT</a></li>
<li class="toctree-l3"><a class="reference internal" href="#depth-priority">DEPTH_PRIORITY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#depth-stats-verbose">DEPTH_STATS_VERBOSE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#dnscache-enabled">DNSCACHE_ENABLED</a></li>
<li class="toctree-l3"><a class="reference internal" href="#dnscache-size">DNSCACHE_SIZE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#dns-resolver">DNS_RESOLVER</a></li>
<li class="toctree-l3"><a class="reference internal" href="#dns-timeout">DNS_TIMEOUT</a></li>
<li class="toctree-l3"><a class="reference internal" href="#downloader">DOWNLOADER</a></li>
<li class="toctree-l3"><a class="reference internal" href="#downloader-httpclientfactory">DOWNLOADER_HTTPCLIENTFACTORY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#downloader-clientcontextfactory">DOWNLOADER_CLIENTCONTEXTFACTORY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#downloader-client-tls-ciphers">DOWNLOADER_CLIENT_TLS_CIPHERS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#downloader-client-tls-method">DOWNLOADER_CLIENT_TLS_METHOD</a></li>
<li class="toctree-l3"><a class="reference internal" href="#downloader-client-tls-verbose-logging">DOWNLOADER_CLIENT_TLS_VERBOSE_LOGGING</a></li>
<li class="toctree-l3"><a class="reference internal" href="#downloader-middlewares">DOWNLOADER_MIDDLEWARES</a></li>
<li class="toctree-l3"><a class="reference internal" href="#downloader-middlewares-base">DOWNLOADER_MIDDLEWARES_BASE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#downloader-stats">DOWNLOADER_STATS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#download-delay">DOWNLOAD_DELAY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#download-handlers">DOWNLOAD_HANDLERS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#download-handlers-base">DOWNLOAD_HANDLERS_BASE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#download-timeout">DOWNLOAD_TIMEOUT</a></li>
<li class="toctree-l3"><a class="reference internal" href="#download-maxsize">DOWNLOAD_MAXSIZE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#download-warnsize">DOWNLOAD_WARNSIZE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#download-fail-on-dataloss">DOWNLOAD_FAIL_ON_DATALOSS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#dupefilter-class">DUPEFILTER_CLASS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#dupefilter-debug">DUPEFILTER_DEBUG</a></li>
<li class="toctree-l3"><a class="reference internal" href="#editor">EDITOR</a></li>
<li class="toctree-l3"><a class="reference internal" href="#extensions">EXTENSIONS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#extensions-base">EXTENSIONS_BASE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#feed-tempdir">FEED_TEMPDIR</a></li>
<li class="toctree-l3"><a class="reference internal" href="#feed-storage-gcs-acl">FEED_STORAGE_GCS_ACL</a></li>
<li class="toctree-l3"><a class="reference internal" href="#ftp-passive-mode">FTP_PASSIVE_MODE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#ftp-password">FTP_PASSWORD</a></li>
<li class="toctree-l3"><a class="reference internal" href="#ftp-user">FTP_USER</a></li>
<li class="toctree-l3"><a class="reference internal" href="#gcs-project-id">GCS_PROJECT_ID</a></li>
<li class="toctree-l3"><a class="reference internal" href="#item-pipelines">ITEM_PIPELINES</a></li>
<li class="toctree-l3"><a class="reference internal" href="#item-pipelines-base">ITEM_PIPELINES_BASE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#log-enabled">LOG_ENABLED</a></li>
<li class="toctree-l3"><a class="reference internal" href="#log-encoding">LOG_ENCODING</a></li>
<li class="toctree-l3"><a class="reference internal" href="#log-file">LOG_FILE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#log-format">LOG_FORMAT</a></li>
<li class="toctree-l3"><a class="reference internal" href="#log-dateformat">LOG_DATEFORMAT</a></li>
<li class="toctree-l3"><a class="reference internal" href="#log-formatter">LOG_FORMATTER</a></li>
<li class="toctree-l3"><a class="reference internal" href="#log-level">LOG_LEVEL</a></li>
<li class="toctree-l3"><a class="reference internal" href="#log-stdout">LOG_STDOUT</a></li>
<li class="toctree-l3"><a class="reference internal" href="#log-short-names">LOG_SHORT_NAMES</a></li>
<li class="toctree-l3"><a class="reference internal" href="#logstats-interval">LOGSTATS_INTERVAL</a></li>
<li class="toctree-l3"><a class="reference internal" href="#memdebug-enabled">MEMDEBUG_ENABLED</a></li>
<li class="toctree-l3"><a class="reference internal" href="#memdebug-notify">MEMDEBUG_NOTIFY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#memusage-enabled">MEMUSAGE_ENABLED</a></li>
<li class="toctree-l3"><a class="reference internal" href="#memusage-limit-mb">MEMUSAGE_LIMIT_MB</a></li>
<li class="toctree-l3"><a class="reference internal" href="#memusage-check-interval-seconds">MEMUSAGE_CHECK_INTERVAL_SECONDS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#memusage-notify-mail">MEMUSAGE_NOTIFY_MAIL</a></li>
<li class="toctree-l3"><a class="reference internal" href="#memusage-warning-mb">MEMUSAGE_WARNING_MB</a></li>
<li class="toctree-l3"><a class="reference internal" href="#newspider-module">NEWSPIDER_MODULE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#randomize-download-delay">RANDOMIZE_DOWNLOAD_DELAY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#reactor-threadpool-maxsize">REACTOR_THREADPOOL_MAXSIZE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#redirect-priority-adjust">REDIRECT_PRIORITY_ADJUST</a></li>
<li class="toctree-l3"><a class="reference internal" href="#retry-priority-adjust">RETRY_PRIORITY_ADJUST</a></li>
<li class="toctree-l3"><a class="reference internal" href="#robotstxt-obey">ROBOTSTXT_OBEY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#robotstxt-parser">ROBOTSTXT_PARSER</a><ul>
<li class="toctree-l4"><a class="reference internal" href="#robotstxt-user-agent">ROBOTSTXT_USER_AGENT</a></li>
</ul>
</li>
<li class="toctree-l3"><a class="reference internal" href="#scheduler">SCHEDULER</a></li>
<li class="toctree-l3"><a class="reference internal" href="#scheduler-debug">SCHEDULER_DEBUG</a></li>
<li class="toctree-l3"><a class="reference internal" href="#scheduler-disk-queue">SCHEDULER_DISK_QUEUE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#scheduler-memory-queue">SCHEDULER_MEMORY_QUEUE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#scheduler-priority-queue">SCHEDULER_PRIORITY_QUEUE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#scraper-slot-max-active-size">SCRAPER_SLOT_MAX_ACTIVE_SIZE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#spider-contracts">SPIDER_CONTRACTS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#spider-contracts-base">SPIDER_CONTRACTS_BASE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#spider-loader-class">SPIDER_LOADER_CLASS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#spider-loader-warn-only">SPIDER_LOADER_WARN_ONLY</a></li>
<li class="toctree-l3"><a class="reference internal" href="#spider-middlewares">SPIDER_MIDDLEWARES</a></li>
<li class="toctree-l3"><a class="reference internal" href="#spider-middlewares-base">SPIDER_MIDDLEWARES_BASE</a></li>
<li class="toctree-l3"><a class="reference internal" href="#spider-modules">SPIDER_MODULES</a></li>
<li class="toctree-l3"><a class="reference internal" href="#stats-class">STATS_CLASS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#stats-dump">STATS_DUMP</a></li>
<li class="toctree-l3"><a class="reference internal" href="#statsmailer-rcpts">STATSMAILER_RCPTS</a></li>
<li class="toctree-l3"><a class="reference internal" href="#telnetconsole-enabled">TELNETCONSOLE_ENABLED</a></li>
<li class="toctree-l3"><a class="reference internal" href="#templates-dir">TEMPLATES_DIR</a></li>
<li class="toctree-l3"><a class="reference internal" href="#twisted-reactor">TWISTED_REACTOR</a></li>
<li class="toctree-l3"><a class="reference internal" href="#urllength-limit">URLLENGTH_LIMIT</a></li>
<li class="toctree-l3"><a class="reference internal" href="#user-agent">USER_AGENT</a></li>
<li class="toctree-l3"><a class="reference internal" href="#settings-documented-elsewhere">Settings documented elsewhere:</a></li>
</ul>
</li>
</ul>
</li>
<li class="toctree-l1"><a class="reference internal" href="exceptions.html">Exceptions</a></li>
</ul>
<p class="caption"><span class="caption-text">Built-in services</span></p>
<ul>
<li class="toctree-l1"><a class="reference internal" href="logging.html">Logging</a></li>
<li class="toctree-l1"><a class="reference internal" href="stats.html">Stats Collection</a></li>
<li class="toctree-l1"><a class="reference internal" href="email.html">Sending e-mail</a></li>
<li class="toctree-l1"><a class="reference internal" href="telnetconsole.html">Telnet Console</a></li>
<li class="toctree-l1"><a class="reference internal" href="webservice.html">Web Service</a></li>
</ul>
<p class="caption"><span class="caption-text">Solving specific problems</span></p>
<ul>
<li class="toctree-l1"><a class="reference internal" href="../faq.html">Frequently Asked Questions</a></li>
<li class="toctree-l1"><a class="reference internal" href="debug.html">Debugging Spiders</a></li>
<li class="toctree-l1"><a class="reference internal" href="contracts.html">Spiders Contracts</a></li>
<li class="toctree-l1"><a class="reference internal" href="practices.html">Common Practices</a></li>
<li class="toctree-l1"><a class="reference internal" href="broad-crawls.html">Broad Crawls</a></li>
<li class="toctree-l1"><a class="reference internal" href="developer-tools.html">Using your browser's Developer Tools for scraping</a></li>
<li class="toctree-l1"><a class="reference internal" href="dynamic-content.html">Selecting dynamically-loaded content</a></li>
<li class="toctree-l1"><a class="reference internal" href="leaks.html">Debugging memory leaks</a></li>
<li class="toctree-l1"><a class="reference internal" href="media-pipeline.html">Downloading and processing files and images</a></li>
<li class="toctree-l1"><a class="reference internal" href="deploy.html">Deploying Spiders</a></li>
<li class="toctree-l1"><a class="reference internal" href="autothrottle.html">AutoThrottle extension</a></li>
<li class="toctree-l1"><a class="reference internal" href="benchmarking.html">Benchmarking</a></li>
<li class="toctree-l1"><a class="reference internal" href="jobs.html">Jobs: pausing and resuming crawls</a></li>
<li class="toctree-l1"><a class="reference internal" href="coroutines.html">Coroutines</a></li>
<li class="toctree-l1"><a class="reference internal" href="asyncio.html">asyncio</a></li>
</ul>
<p class="caption"><span class="caption-text">Extending Scrapy</span></p>
<ul>
<li class="toctree-l1"><a class="reference internal" href="architecture.html">Architecture overview</a></li>
<li class="toctree-l1"><a class="reference internal" href="downloader-middleware.html">Downloader Middleware</a></li>
<li class="toctree-l1"><a class="reference internal" href="spider-middleware.html">Spider Middleware</a></li>
<li class="toctree-l1"><a class="reference internal" href="extensions.html">Extensions</a></li>
<li class="toctree-l1"><a class="reference internal" href="api.html">Core API</a></li>
<li class="toctree-l1"><a class="reference internal" href="signals.html">Signals</a></li>
<li class="toctree-l1"><a class="reference internal" href="exporters.html">Item Exporters</a></li>
</ul>
<p class="caption"><span class="caption-text">All the rest</span></p>
<ul>
<li class="toctree-l1"><a class="reference internal" href="../news.html">Release notes</a></li>
<li class="toctree-l1"><a class="reference internal" href="../contributing.html">Contributing to Scrapy</a></li>
<li class="toctree-l1"><a class="reference internal" href="../versioning.html">Versioning and API Stability</a></li>
</ul>

            
          
        </div>
        
      </div>
    </nav>

    <section data-toggle="wy-nav-shift" class="wy-nav-content-wrap">

      
      <nav class="wy-nav-top" aria-label="top navigation">
        
          <i data-toggle="wy-nav-top" class="fa fa-bars"></i>
          <a href="../index.html">Scrapy</a>
        
      </nav>


      <div class="wy-nav-content">
        
        <div class="rst-content">
        
          















<div role="navigation" aria-label="breadcrumbs navigation">

  <ul class="wy-breadcrumbs">
    
      <li><a href="../index.html" class="icon icon-home"></a> &raquo;</li>
        
      <li>Settings</li>
    
    
      <li class="wy-breadcrumbs-aside">
        
            
        
      </li>
    
  </ul>

  
  <hr/>
</div>
          <div role="main" class="document" itemscope="itemscope" itemtype="http://schema.org/Article">
           <div itemprop="articleBody">
            
  <div class="section" id="settings">
<span id="topics-settings"></span><h1>Settings<a class="headerlink" href="#settings" title="Permalink to this headline">¶</a></h1>
<p>The Scrapy settings allow you to customize the behaviour of all Scrapy components, including the core, extensions, pipelines and spiders themselves.</p>
<p>The infrastructure of the settings provides a global namespace of key-value mappings that the code can use to pull configuration values from. The settings can be populated through different mechanisms, which are described below.</p>
<p>The settings are also the mechanism for selecting the currently active Scrapy project (in case you have many).</p>
<p>For a list of available built-in settings see: <a class="reference internal" href="#topics-settings-ref"><span class="std std-ref">Built-in settings reference</span></a>.</p>
<div class="section" id="designating-the-settings">
<span id="topics-settings-module-envvar"></span><h2>Designating the settings<a class="headerlink" href="#designating-the-settings" title="Permalink to this headline">¶</a></h2>
<p>When you use Scrapy, you have to tell it which settings you're using. You can do this by using an environment variable, <code class="docutils literal notranslate"><span class="pre">SCRAPY_SETTINGS_MODULE</span></code>.</p>
<p>The value of <code class="docutils literal notranslate"><span class="pre">SCRAPY_SETTINGS_MODULE</span></code> should be in Python path syntax, e.g. <code class="docutils literal notranslate"><span class="pre">myproject.settings</span></code>. Note that the settings module should be on the Python <a class="reference external" href="https://docs.python.org/3/tutorial/modules.html#tut-searchpath" title="(in Python v3.9)"><span class="xref std std-ref">import search path</span></a>.</p>
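<p>For instance, the variable can be set from Python before Scrapy starts (a minimal sketch; <code class="docutils literal notranslate"><span class="pre">myproject.settings</span></code> is a hypothetical module name):</p>

```python
import os

# Designate the settings module before Scrapy is started.
# "myproject.settings" is a hypothetical module name; it must be
# importable, i.e. on the Python import search path.
os.environ["SCRAPY_SETTINGS_MODULE"] = "myproject.settings"
print(os.environ["SCRAPY_SETTINGS_MODULE"])
```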
</div>
<div class="section" id="populating-the-settings">
<span id="populating-settings"></span><h2>Populating the settings<a class="headerlink" href="#populating-the-settings" title="Permalink to this headline">¶</a></h2>
<p>Settings can be populated using different mechanisms, each of which has a different precedence. Here is a list of them in decreasing order of precedence:</p>
<blockquote>
<div><ol class="arabic simple">
<li><p>Command line options (most precedence)</p></li>
<li><p>Settings per-spider</p></li>
<li><p>Project settings module</p></li>
<li><p>Default settings per-command</p></li>
<li><p>Default global settings (less precedence)</p></li>
</ol>
</div></blockquote>
<p>The population of these settings sources is taken care of internally, but manual handling is possible using API calls. See the <a class="reference internal" href="api.html#topics-api-settings"><span class="std std-ref">Settings API</span></a> topic for reference.</p>
<p>These mechanisms are described in more detail below.</p>
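<p>The precedence rules can be sketched in plain Python (a simplified model, not Scrapy's actual implementation; the real <code class="docutils literal notranslate"><span class="pre">Settings</span></code> class exposes the same idea through <code class="docutils literal notranslate"><span class="pre">set(name,</span> <span class="pre">value,</span> <span class="pre">priority)</span></code>):</p>

```python
# Simplified model of settings precedence (not Scrapy's actual code).
# A value is only stored when no higher-priority source set it already.
PRIORITIES = {"default": 0, "command": 10, "project": 20, "spider": 30, "cmdline": 40}

class SimpleSettings:
    def __init__(self):
        self._attrs = {}  # name -> (value, numeric priority)

    def set(self, name, value, priority="project"):
        pri = PRIORITIES[priority]
        if name not in self._attrs or self._attrs[name][1] <= pri:
            self._attrs[name] = (value, pri)

    def get(self, name, default=None):
        return self._attrs[name][0] if name in self._attrs else default

s = SimpleSettings()
s.set("CONCURRENT_REQUESTS", 16, priority="default")
s.set("CONCURRENT_REQUESTS", 2, priority="cmdline")   # command line wins...
s.set("CONCURRENT_REQUESTS", 32, priority="project")  # ...so this is ignored
print(s.get("CONCURRENT_REQUESTS"))  # → 2
```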
<div class="section" id="command-line-options">
<h3>1. Command line options<a class="headerlink" href="#command-line-options" title="Permalink to this headline">¶</a></h3>
<p>Arguments provided by the command line are the ones that take most precedence, overriding any other options. You can explicitly override one (or more) settings using the <code class="docutils literal notranslate"><span class="pre">-s</span></code> (or <code class="docutils literal notranslate"><span class="pre">--set</span></code>) command line option.</p>
<p>Example:</p>
<div class="highlight-sh notranslate"><div class="highlight"><pre><span></span>scrapy crawl myspider -s <span class="nv">LOG_FILE</span><span class="o">=</span>scrapy.log
</pre></div>
</div>
</div>
<div class="section" id="settings-per-spider">
<h3>2. Settings per-spider<a class="headerlink" href="#settings-per-spider" title="Permalink to this headline">¶</a></h3>
<p>Spiders (see the <a class="reference internal" href="spiders.html#topics-spiders"><span class="std std-ref">Spiders</span></a> chapter for reference) can define their own settings that will take precedence and override the project ones. They can do so by setting their <a class="reference internal" href="spiders.html#scrapy.spiders.Spider.custom_settings" title="scrapy.spiders.Spider.custom_settings"><code class="xref py py-attr docutils literal notranslate"><span class="pre">custom_settings</span></code></a> attribute:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">class</span> <span class="nc">MySpider</span><span class="p">(</span><span class="n">scrapy</span><span class="o">.</span><span class="n">Spider</span><span class="p">):</span>
    <span class="n">name</span> <span class="o">=</span> <span class="s1">&#39;myspider&#39;</span>

    <span class="n">custom_settings</span> <span class="o">=</span> <span class="p">{</span>
        <span class="s1">&#39;SOME_SETTING&#39;</span><span class="p">:</span> <span class="s1">&#39;some value&#39;</span><span class="p">,</span>
    <span class="p">}</span>
</pre></div>
</div>
</div>
<div class="section" id="project-settings-module">
<h3>3. Project settings module<a class="headerlink" href="#project-settings-module" title="Permalink to this headline">¶</a></h3>
<p>The project settings module is the standard configuration file for your Scrapy project, and it's where most of your custom settings will be populated. For a standard Scrapy project, this means you'll be adding or changing the settings in the <code class="docutils literal notranslate"><span class="pre">settings.py</span></code> file created for your project.</p>
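<p>A minimal <code class="docutils literal notranslate"><span class="pre">settings.py</span></code> might look like this (a hypothetical fragment; the setting names are real Scrapy settings, the values are only illustrative):</p>

```python
# Hypothetical contents of myproject/settings.py
BOT_NAME = "myproject"

SPIDER_MODULES = ["myproject.spiders"]
NEWSPIDER_MODULE = "myproject.spiders"

# Be polite by default
ROBOTSTXT_OBEY = True
DOWNLOAD_DELAY = 0.5
```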
</div>
<div class="section" id="default-settings-per-command">
<h3>4. Default settings per-command<a class="headerlink" href="#default-settings-per-command" title="Permalink to this headline">¶</a></h3>
<p>Each <a class="reference internal" href="commands.html"><span class="doc">Scrapy tool</span></a> command can have its own default settings, which override the global default settings. Those custom command settings are specified in the <code class="docutils literal notranslate"><span class="pre">default_settings</span></code> attribute of the command class.</p>
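<p>As a sketch, a command class carries its defaults like this (simplified; a real command would subclass <code class="docutils literal notranslate"><span class="pre">scrapy.commands.ScrapyCommand</span></code>, and <code class="docutils literal notranslate"><span class="pre">MyCommand</span></code> is hypothetical):</p>

```python
# Simplified sketch of a command class with its own default settings.
# A real command subclasses scrapy.commands.ScrapyCommand; MyCommand
# here is a hypothetical stand-in.
class MyCommand:
    # These values override the global defaults whenever this command
    # runs, but are still overridden by project settings and -s options.
    default_settings = {"LOG_ENABLED": False}

print(MyCommand.default_settings["LOG_ENABLED"])  # → False
```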
</div>
<div class="section" id="default-global-settings">
<h3>5. Default global settings<a class="headerlink" href="#default-global-settings" title="Permalink to this headline">¶</a></h3>
<p>The global defaults are located in the <code class="docutils literal notranslate"><span class="pre">scrapy.settings.default_settings</span></code> module and documented in the <a class="reference internal" href="#topics-settings-ref"><span class="std std-ref">Built-in settings reference</span></a> section.</p>
</div>
</div>
<div class="section" id="import-paths-and-classes">
<h2>Import paths and classes<a class="headerlink" href="#import-paths-and-classes" title="Permalink to this headline">¶</a></h2>
<div class="versionadded">
<p><span class="versionmodified added">New in version.</span></p>
</div>
<p>When a setting references a callable object to be imported by Scrapy, such as a class or a function, there are two different ways you can specify that object:</p>
<ul class="simple">
<li><p>As a string containing the import path of that object</p></li>
<li><p>As the object itself</p></li>
</ul>
<p>For example:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">from</span> <span class="nn">mybot.pipelines.validate</span> <span class="kn">import</span> <span class="n">ValidateMyItem</span>
<span class="n">ITEM_PIPELINES</span> <span class="o">=</span> <span class="p">{</span>
    <span class="c1"># passing the classname...</span>
    <span class="n">ValidateMyItem</span><span class="p">:</span> <span class="mi">300</span><span class="p">,</span>
    <span class="c1"># ...equals passing the class path</span>
    <span class="s1">&#39;mybot.pipelines.validate.ValidateMyItem&#39;</span><span class="p">:</span> <span class="mi">300</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Passing non-callable objects is not supported.</p>
</div>
</div>
<div class="section" id="how-to-access-settings">
<h2>How to access settings<a class="headerlink" href="#how-to-access-settings" title="Permalink to this headline">¶</a></h2>
<p>In a spider, the settings are available through <code class="docutils literal notranslate"><span class="pre">self.settings</span></code>:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">class</span> <span class="nc">MySpider</span><span class="p">(</span><span class="n">scrapy</span><span class="o">.</span><span class="n">Spider</span><span class="p">):</span>
    <span class="n">name</span> <span class="o">=</span> <span class="s1">&#39;myspider&#39;</span>
    <span class="n">start_urls</span> <span class="o">=</span> <span class="p">[</span><span class="s1">&#39;http://example.com&#39;</span><span class="p">]</span>

    <span class="k">def</span> <span class="nf">parse</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">response</span><span class="p">):</span>
        <span class="nb">print</span><span class="p">(</span><span class="sa">f</span><span class="s2">&quot;Existing settings: </span><span class="si">{</span><span class="bp">self</span><span class="o">.</span><span class="n">settings</span><span class="o">.</span><span class="n">attributes</span><span class="o">.</span><span class="n">keys</span><span class="p">()</span><span class="si">}</span><span class="s2">&quot;</span><span class="p">)</span>
</pre></div>
</div>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>The <code class="docutils literal notranslate"><span class="pre">settings</span></code> attribute is set in the base Spider class after the spider is initialized. If you want to use the settings before the initialization (e.g., in your spider's <code class="docutils literal notranslate"><span class="pre">__init__()</span></code> method), you'll need to override the <a class="reference internal" href="spiders.html#scrapy.spiders.Spider.from_crawler" title="scrapy.spiders.Spider.from_crawler"><code class="xref py py-meth docutils literal notranslate"><span class="pre">from_crawler()</span></code></a> method.</p>
</div>
<p>Settings can be accessed through the <a class="reference internal" href="api.html#scrapy.crawler.Crawler.settings" title="scrapy.crawler.Crawler.settings"><code class="xref py py-attr docutils literal notranslate"><span class="pre">scrapy.crawler.Crawler.settings</span></code></a> attribute of the Crawler that is passed to the <code class="docutils literal notranslate"><span class="pre">from_crawler</span></code> method in extensions, middlewares and item pipelines:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="k">class</span> <span class="nc">MyExtension</span><span class="p">:</span>
    <span class="k">def</span> <span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">log_is_enabled</span><span class="o">=</span><span class="kc">False</span><span class="p">):</span>
        <span class="k">if</span> <span class="n">log_is_enabled</span><span class="p">:</span>
            <span class="nb">print</span><span class="p">(</span><span class="s2">&quot;log is enabled!&quot;</span><span class="p">)</span>

    <span class="nd">@classmethod</span>
    <span class="k">def</span> <span class="nf">from_crawler</span><span class="p">(</span><span class="bp">cls</span><span class="p">,</span> <span class="n">crawler</span><span class="p">):</span>
        <span class="n">settings</span> <span class="o">=</span> <span class="n">crawler</span><span class="o">.</span><span class="n">settings</span>
        <span class="k">return</span> <span class="bp">cls</span><span class="p">(</span><span class="n">settings</span><span class="o">.</span><span class="n">getbool</span><span class="p">(</span><span class="s1">&#39;LOG_ENABLED&#39;</span><span class="p">))</span>
</pre></div>
</div>
<p>The settings object can be used like a dict (e.g., <code class="docutils literal notranslate"><span class="pre">settings['LOG_ENABLED']</span></code>), but it's usually preferred to extract the setting in the format you need, using one of the methods of the <a class="reference internal" href="api.html#scrapy.settings.Settings" title="scrapy.settings.Settings"><code class="xref py py-class docutils literal notranslate"><span class="pre">Settings</span></code></a> API.</p>
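<p>The typed getters matter because settings provided on the command line arrive as strings. The sketch below mirrors the kind of coercion that <code class="docutils literal notranslate"><span class="pre">getbool()</span></code> performs (a simplified stand-in, not Scrapy's exact code):</p>

```python
# Simplified mirror of a typed getter: normalize string/int inputs the
# way command-line-provided boolean settings need to be normalized.
def getbool(value):
    if value in (True, "True", "true", "1", 1):
        return True
    if value in (False, "False", "false", "0", 0):
        return False
    raise ValueError("unsupported value for a boolean setting: %r" % (value,))

print(getbool("0"))     # → False
print(getbool("True"))  # → True
```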
</div>
<div class="section" id="rationale-for-setting-names">
<h2>Rationale for setting names<a class="headerlink" href="#rationale-for-setting-names" title="Permalink to this headline">¶</a></h2>
<p>Setting names are usually prefixed with the component that they configure. For example, proper setting names for a fictional robots.txt extension would be <code class="docutils literal notranslate"><span class="pre">ROBOTSTXT_ENABLED</span></code>, <code class="docutils literal notranslate"><span class="pre">ROBOTSTXT_OBEY</span></code>, <code class="docutils literal notranslate"><span class="pre">ROBOTSTXT_CACHEDIR</span></code>, etc.</p>
</div>
<div class="section" id="built-in-settings-reference">
<span id="topics-settings-ref"></span><h2>Built-in settings reference<a class="headerlink" href="#built-in-settings-reference" title="Permalink to this headline">¶</a></h2>
<p>Here's a list of all available Scrapy settings, in alphabetical order, along with their default values and the scope where they apply.</p>
<p>The scope, where available, shows where the setting is being used, if it's tied to any particular component. In that case the module of that component will be shown, typically an extension, middleware or pipeline. It also means that the component must be enabled in order for the setting to take any effect.</p>
<div class="section" id="aws-access-key-id">
<span id="std-setting-AWS_ACCESS_KEY_ID"></span><span id="std:setting-AWS_ACCESS_KEY_ID"></span><h3>AWS_ACCESS_KEY_ID<a class="headerlink" href="#aws-access-key-id" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>The AWS access key used by code that requires access to <a class="reference external" href="https://aws.amazon.com/">Amazon Web services</a>, such as the <a class="reference internal" href="feed-exports.html#topics-feed-storage-s3"><span class="std std-ref">S3 feed storage backend</span></a>.</p>
</div>
<div class="section" id="aws-secret-access-key">
<span id="std-setting-AWS_SECRET_ACCESS_KEY"></span><span id="std:setting-AWS_SECRET_ACCESS_KEY"></span><h3>AWS_SECRET_ACCESS_KEY<a class="headerlink" href="#aws-secret-access-key" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>The AWS secret key used by code that requires access to <a class="reference external" href="https://aws.amazon.com/">Amazon Web services</a>, such as the <a class="reference internal" href="feed-exports.html#topics-feed-storage-s3"><span class="std std-ref">S3 feed storage backend</span></a>.</p>
</div>
<div class="section" id="aws-endpoint-url">
<span id="std-setting-AWS_ENDPOINT_URL"></span><span id="std:setting-AWS_ENDPOINT_URL"></span><h3>AWS_ENDPOINT_URL<a class="headerlink" href="#aws-endpoint-url" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>Endpoint URL used for S3-like storage, for example Minio or s3.scality.</p>
</div>
<div class="section" id="aws-use-ssl">
<span id="std-setting-AWS_USE_SSL"></span><span id="std:setting-AWS_USE_SSL"></span><h3>AWS_USE_SSL<a class="headerlink" href="#aws-use-ssl" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>Use this option if you want to disable SSL connections for communication with S3 or S3-like storage. By default SSL will be used.</p>
</div>
<div class="section" id="aws-verify">
<span id="std-setting-AWS_VERIFY"></span><span id="std:setting-AWS_VERIFY"></span><h3>AWS_VERIFY<a class="headerlink" href="#aws-verify" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>Verify SSL connections between Scrapy and S3 or S3-like storage. By default SSL verification will occur.</p>
</div>
<div class="section" id="aws-region-name">
<span id="std-setting-AWS_REGION_NAME"></span><span id="std:setting-AWS_REGION_NAME"></span><h3>AWS_REGION_NAME<a class="headerlink" href="#aws-region-name" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>The name of the region associated with the AWS client.</p>
</div>
<div class="section" id="asyncio-event-loop">
<span id="std-setting-ASYNCIO_EVENT_LOOP"></span><span id="std:setting-ASYNCIO_EVENT_LOOP"></span><h3>ASYNCIO_EVENT_LOOP<a class="headerlink" href="#asyncio-event-loop" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>Import path of a given asyncio event loop class.</p>
<p>If the asyncio reactor is enabled (see <a class="reference internal" href="#std-setting-TWISTED_REACTOR"><code class="xref std std-setting docutils literal notranslate"><span class="pre">TWISTED_REACTOR</span></code></a>), this setting can be used to specify the asyncio event loop to be used with it. Set the setting to the import path of the desired asyncio event loop class. If the setting is set to <code class="docutils literal notranslate"><span class="pre">None</span></code> the default asyncio event loop will be used.</p>
<p>If you are installing the asyncio reactor manually with the <a class="reference internal" href="#scrapy.utils.reactor.install_reactor" title="scrapy.utils.reactor.install_reactor"><code class="xref py py-func docutils literal notranslate"><span class="pre">install_reactor()</span></code></a> function, you can use the <code class="docutils literal notranslate"><span class="pre">event_loop_path</span></code> parameter to indicate the import path of the event loop class to be used.</p>
<p>Note that the event loop class must inherit from <a class="reference external" href="https://docs.python.org/3/library/asyncio-eventloop.html#asyncio.AbstractEventLoop" title="(in Python v3.9)"><code class="xref py py-class docutils literal notranslate"><span class="pre">asyncio.AbstractEventLoop</span></code></a>.</p>
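<p>For example, to combine this setting with the asyncio reactor, a <code class="docutils literal notranslate"><span class="pre">settings.py</span></code> fragment might look like the following sketch (<code class="docutils literal notranslate"><span class="pre">uvloop</span></code> is a third-party package that must be installed separately):</p>

```python
# settings.py -- sketch: run on the asyncio reactor with an alternative event loop
TWISTED_REACTOR = "twisted.internet.asyncioreactor.AsyncioSelectorReactor"
ASYNCIO_EVENT_LOOP = "uvloop.Loop"  # must inherit from asyncio.AbstractEventLoop
```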
</div>
<div class="section" id="bot-name">
<span id="std-setting-BOT_NAME"></span><span id="std:setting-BOT_NAME"></span><h3>BOT_NAME<a class="headerlink" href="#bot-name" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapybot'</span></code></p>
<p>The name of the bot implemented by this Scrapy project (also known as the project name). This name will be used for logging too.</p>
<p>It's automatically populated with your project name when you create your project with the <a class="reference internal" href="commands.html#std-command-startproject"><code class="xref std std-command docutils literal notranslate"><span class="pre">startproject</span></code></a> command.</p>
</div>
<div class="section" id="concurrent-items">
<span id="std-setting-CONCURRENT_ITEMS"></span><span id="std:setting-CONCURRENT_ITEMS"></span><h3>CONCURRENT_ITEMS<a class="headerlink" href="#concurrent-items" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">100</span></code></p>
<p>Maximum number of concurrent items (per response) to process in parallel in the <a class="reference internal" href="item-pipeline.html#topics-item-pipeline"><span class="std std-ref">item pipelines</span></a>.</p>
</div>
<div class="section" id="concurrent-requests">
<span id="std-setting-CONCURRENT_REQUESTS"></span><span id="std:setting-CONCURRENT_REQUESTS"></span><h3>CONCURRENT_REQUESTS<a class="headerlink" href="#concurrent-requests" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">16</span></code></p>
<p>The maximum number of concurrent (i.e. simultaneous) requests that will be performed by the Scrapy downloader.</p>
</div>
<div class="section" id="concurrent-requests-per-domain">
<span id="std-setting-CONCURRENT_REQUESTS_PER_DOMAIN"></span><span id="std:setting-CONCURRENT_REQUESTS_PER_DOMAIN"></span><h3>CONCURRENT_REQUESTS_PER_DOMAIN<a class="headerlink" href="#concurrent-requests-per-domain" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">8</span></code></p>
<p>The maximum number of concurrent (i.e. simultaneous) requests that will be performed to any single domain.</p>
<p>See also: the <a class="reference internal" href="autothrottle.html#topics-autothrottle"><span class="std std-ref">AutoThrottle extension</span></a> and its <a class="reference internal" href="autothrottle.html#std-setting-AUTOTHROTTLE_TARGET_CONCURRENCY"><code class="xref std std-setting docutils literal notranslate"><span class="pre">AUTOTHROTTLE_TARGET_CONCURRENCY</span></code></a> option.</p>
</div>
<div class="section" id="concurrent-requests-per-ip">
<span id="std-setting-CONCURRENT_REQUESTS_PER_IP"></span><span id="std:setting-CONCURRENT_REQUESTS_PER_IP"></span><h3>CONCURRENT_REQUESTS_PER_IP<a class="headerlink" href="#concurrent-requests-per-ip" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">0</span></code></p>
<p>The maximum number of concurrent (i.e. simultaneous) requests that will be performed to any single IP. If non-zero, the <a class="reference internal" href="#std-setting-CONCURRENT_REQUESTS_PER_DOMAIN"><code class="xref std std-setting docutils literal notranslate"><span class="pre">CONCURRENT_REQUESTS_PER_DOMAIN</span></code></a> setting is ignored, and this one is used instead. In other words, concurrency limits will be applied per IP, not per domain.</p>
<p>This setting also affects <a class="reference internal" href="#std-setting-DOWNLOAD_DELAY"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_DELAY</span></code></a> and the <a class="reference internal" href="autothrottle.html#topics-autothrottle"><span class="std std-ref">AutoThrottle extension</span></a>: if <a class="reference internal" href="#std-setting-CONCURRENT_REQUESTS_PER_IP"><code class="xref std std-setting docutils literal notranslate"><span class="pre">CONCURRENT_REQUESTS_PER_IP</span></code></a> is non-zero, download delay is enforced per IP, not per domain.</p>
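<p>The three concurrency settings above interact; a <code class="docutils literal notranslate"><span class="pre">settings.py</span></code> sketch with purely illustrative values:</p>

```python
# settings.py -- illustrative values only; tune them for the sites you crawl
CONCURRENT_REQUESTS = 32            # global cap for the downloader
CONCURRENT_REQUESTS_PER_DOMAIN = 8  # per-domain cap (ignored when the next one is non-zero)
CONCURRENT_REQUESTS_PER_IP = 4      # non-zero: caps and delays apply per IP, not per domain
```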
</div>
<div class="section" id="default-item-class">
<span id="std-setting-DEFAULT_ITEM_CLASS"></span><span id="std:setting-DEFAULT_ITEM_CLASS"></span><h3>DEFAULT_ITEM_CLASS<a class="headerlink" href="#default-item-class" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.item.Item'</span></code></p>
<p>The default class that will be used for instantiating items in <a class="reference internal" href="shell.html#topics-shell"><span class="std std-ref">the Scrapy shell</span></a>.</p>
</div>
<div class="section" id="default-request-headers">
<span id="std-setting-DEFAULT_REQUEST_HEADERS"></span><span id="std:setting-DEFAULT_REQUEST_HEADERS"></span><h3>DEFAULT_REQUEST_HEADERS<a class="headerlink" href="#default-request-headers" title="Permalink to this headline">¶</a></h3>
<p>Default:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="p">{</span>
    <span class="s1">&#39;Accept&#39;</span><span class="p">:</span> <span class="s1">&#39;text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8&#39;</span><span class="p">,</span>
    <span class="s1">&#39;Accept-Language&#39;</span><span class="p">:</span> <span class="s1">&#39;en&#39;</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
<p>The default headers used for Scrapy HTTP requests. They're populated in the <a class="reference internal" href="downloader-middleware.html#scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware" title="scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware"><code class="xref py py-class docutils literal notranslate"><span class="pre">DefaultHeadersMiddleware</span></code></a>.</p>
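<p>To override them project-wide, redefine the dict in <code class="docutils literal notranslate"><span class="pre">settings.py</span></code>; a sketch (the extra header below is hypothetical):</p>

```python
# settings.py -- sketch: headers sent with every request unless a request sets its own
DEFAULT_REQUEST_HEADERS = {
    "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
    "Accept-Language": "en-US,en;q=0.9",  # replaces the default 'en'
    "X-Example-Header": "demo",           # hypothetical custom header
}
```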
</div>
<div class="section" id="depth-limit">
<span id="std-setting-DEPTH_LIMIT"></span><span id="std:setting-DEPTH_LIMIT"></span><h3>DEPTH_LIMIT<a class="headerlink" href="#depth-limit" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">0</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.spidermiddlewares.depth.DepthMiddleware</span></code></p>
<p>The maximum depth that will be allowed to crawl for any site. If zero, no limit will be imposed.</p>
</div>
<div class="section" id="depth-priority">
<span id="std-setting-DEPTH_PRIORITY"></span><span id="std:setting-DEPTH_PRIORITY"></span><h3>DEPTH_PRIORITY<a class="headerlink" href="#depth-priority" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">0</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.spidermiddlewares.depth.DepthMiddleware</span></code></p>
<p>An integer that is used to adjust the <code class="xref py py-attr docutils literal notranslate"><span class="pre">priority</span></code> of a <a class="reference internal" href="request-response.html#scrapy.http.Request" title="scrapy.http.Request"><code class="xref py py-class docutils literal notranslate"><span class="pre">Request</span></code></a> based on its depth.</p>
<p>The priority of a request is adjusted as follows:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">request</span><span class="o">.</span><span class="n">priority</span> <span class="o">=</span> <span class="n">request</span><span class="o">.</span><span class="n">priority</span> <span class="o">-</span> <span class="p">(</span> <span class="n">depth</span> <span class="o">*</span> <span class="n">DEPTH_PRIORITY</span> <span class="p">)</span>
</pre></div>
</div>
<p>As depth increases, positive values of <code class="docutils literal notranslate"><span class="pre">DEPTH_PRIORITY</span></code> decrease request priority (BFO), while negative values increase request priority (DFO). See also <a class="reference internal" href="../faq.html#faq-bfo-dfo"><span class="std std-ref">Does Scrapy crawl in breadth-first or depth-first order?</span></a>.</p>
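<p>The adjustment is plain arithmetic and can be checked without Scrapy:</p>

```python
def adjusted_priority(priority, depth, depth_priority):
    # Mirrors: request.priority = request.priority - (depth * DEPTH_PRIORITY)
    return priority - depth * depth_priority

# Positive DEPTH_PRIORITY lowers priority as depth grows (breadth-first tendency)
print(adjusted_priority(0, depth=3, depth_priority=1))   # -3
# Negative DEPTH_PRIORITY raises priority as depth grows (depth-first tendency)
print(adjusted_priority(0, depth=3, depth_priority=-1))  # 3
```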
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>This setting adjusts priority <strong>in the opposite way</strong> compared to other priority settings <a class="reference internal" href="#std-setting-REDIRECT_PRIORITY_ADJUST"><code class="xref std std-setting docutils literal notranslate"><span class="pre">REDIRECT_PRIORITY_ADJUST</span></code></a> and <a class="reference internal" href="#std-setting-RETRY_PRIORITY_ADJUST"><code class="xref std std-setting docutils literal notranslate"><span class="pre">RETRY_PRIORITY_ADJUST</span></code></a>.</p>
</div>
</div>
<div class="section" id="depth-stats-verbose">
<span id="std-setting-DEPTH_STATS_VERBOSE"></span><span id="std:setting-DEPTH_STATS_VERBOSE"></span><h3>DEPTH_STATS_VERBOSE<a class="headerlink" href="#depth-stats-verbose" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.spidermiddlewares.depth.DepthMiddleware</span></code></p>
<p>Whether to collect verbose depth stats. If this is enabled, the number of requests for each depth is collected in the stats.</p>
</div>
<div class="section" id="dnscache-enabled">
<span id="std-setting-DNSCACHE_ENABLED"></span><span id="std:setting-DNSCACHE_ENABLED"></span><h3>DNSCACHE_ENABLED<a class="headerlink" href="#dnscache-enabled" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">True</span></code></p>
<p>Whether to enable the DNS in-memory cache.</p>
</div>
<div class="section" id="dnscache-size">
<span id="std-setting-DNSCACHE_SIZE"></span><span id="std:setting-DNSCACHE_SIZE"></span><h3>DNSCACHE_SIZE<a class="headerlink" href="#dnscache-size" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">10000</span></code></p>
<p>DNS in-memory cache size.</p>
</div>
<div class="section" id="dns-resolver">
<span id="std-setting-DNS_RESOLVER"></span><span id="std:setting-DNS_RESOLVER"></span><h3>DNS_RESOLVER<a class="headerlink" href="#dns-resolver" title="Permalink to this headline">¶</a></h3>
<div class="versionadded">
<p><span class="versionmodified added">New in version 2.0.</span></p>
</div>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.resolver.CachingThreadedResolver'</span></code></p>
<p>The class to be used to resolve DNS names. The default <code class="docutils literal notranslate"><span class="pre">scrapy.resolver.CachingThreadedResolver</span></code> supports specifying a timeout for DNS requests via the <a class="reference internal" href="#std-setting-DNS_TIMEOUT"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DNS_TIMEOUT</span></code></a> setting, but works only with IPv4 addresses. Scrapy provides an alternative resolver, <code class="docutils literal notranslate"><span class="pre">scrapy.resolver.CachingHostnameResolver</span></code>, which supports IPv4/IPv6 addresses but does not take the <a class="reference internal" href="#std-setting-DNS_TIMEOUT"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DNS_TIMEOUT</span></code></a> setting into account.</p>
</div>
<div class="section" id="dns-timeout">
<span id="std-setting-DNS_TIMEOUT"></span><span id="std:setting-DNS_TIMEOUT"></span><h3>DNS_TIMEOUT<a class="headerlink" href="#dns-timeout" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">60</span></code></p>
<p>Timeout for processing of DNS queries, in seconds. Floats are supported.</p>
</div>
<div class="section" id="downloader">
<span id="std-setting-DOWNLOADER"></span><span id="std:setting-DOWNLOADER"></span><h3>DOWNLOADER<a class="headerlink" href="#downloader" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.core.downloader.Downloader'</span></code></p>
<p>The downloader to use for crawling.</p>
</div>
<div class="section" id="downloader-httpclientfactory">
<span id="std-setting-DOWNLOADER_HTTPCLIENTFACTORY"></span><span id="std:setting-DOWNLOADER_HTTPCLIENTFACTORY"></span><h3>DOWNLOADER_HTTPCLIENTFACTORY<a class="headerlink" href="#downloader-httpclientfactory" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.core.downloader.webclient.ScrapyHTTPClientFactory'</span></code></p>
<p>Defines the Twisted <code class="docutils literal notranslate"><span class="pre">protocol.ClientFactory</span></code> class to use for HTTP/1.0 connections (used by <code class="docutils literal notranslate"><span class="pre">HTTP10DownloadHandler</span></code>).</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>HTTP/1.0 is rarely used nowadays, so you can safely ignore this setting, unless you really want to use HTTP/1.0 and override <a class="reference internal" href="#std-setting-DOWNLOAD_HANDLERS"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_HANDLERS</span></code></a> for the <code class="docutils literal notranslate"><span class="pre">http(s)</span></code> scheme accordingly, i.e. to <code class="docutils literal notranslate"><span class="pre">'scrapy.core.downloader.handlers.http.HTTP10DownloadHandler'</span></code>.</p>
</div>
</div>
<div class="section" id="downloader-clientcontextfactory">
<span id="std-setting-DOWNLOADER_CLIENTCONTEXTFACTORY"></span><span id="std:setting-DOWNLOADER_CLIENTCONTEXTFACTORY"></span><h3>DOWNLOADER_CLIENTCONTEXTFACTORY<a class="headerlink" href="#downloader-clientcontextfactory" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.core.downloader.contextfactory.ScrapyClientContextFactory'</span></code></p>
<p>Represents the classpath to the ContextFactory to use.</p>
<p>Here, "ContextFactory" is a Twisted term for SSL/TLS contexts, defining the TLS/SSL protocol version to use, whether to do certificate verification, or even enable client-side authentication (and various other things).</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Scrapy's default context factory <strong>does NOT perform remote server certificate verification</strong>. This is usually fine for web scraping.</p>
<p>If you do need remote server certificate verification enabled, Scrapy also has another context factory class that you can set, <code class="docutils literal notranslate"><span class="pre">'scrapy.core.downloader.contextfactory.BrowserLikeContextFactory'</span></code>, which uses the platform's certificates to validate remote endpoints.</p>
</div>
<p>If you do use a custom ContextFactory, make sure its <code class="docutils literal notranslate"><span class="pre">__init__</span></code> method accepts a <code class="docutils literal notranslate"><span class="pre">method</span></code> parameter (this is the <code class="docutils literal notranslate"><span class="pre">OpenSSL.SSL</span></code> method mapping of <a class="reference internal" href="#std-setting-DOWNLOADER_CLIENT_TLS_METHOD"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOADER_CLIENT_TLS_METHOD</span></code></a>), a <code class="docutils literal notranslate"><span class="pre">tls_verbose_logging</span></code> parameter (<code class="docutils literal notranslate"><span class="pre">bool</span></code>) and a <code class="docutils literal notranslate"><span class="pre">tls_ciphers</span></code> parameter (see <a class="reference internal" href="#std-setting-DOWNLOADER_CLIENT_TLS_CIPHERS"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOADER_CLIENT_TLS_CIPHERS</span></code></a>).</p>
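<p>As a plain-Python sketch of that constructor contract only (the class name is hypothetical; a real factory would subclass Scrapy's ScrapyClientContextFactory and build a Twisted/pyOpenSSL context, which is omitted here):</p>

```python
class MyContextFactory:
    # Sketch: only the __init__ signature described above is shown
    def __init__(self, method=None, tls_verbose_logging=False,
                 tls_ciphers=None, *args, **kwargs):
        self.method = method                            # OpenSSL.SSL method constant
        self.tls_verbose_logging = tls_verbose_logging  # bool
        self.tls_ciphers = tls_ciphers                  # OpenSSL cipher list string

factory = MyContextFactory(tls_verbose_logging=True, tls_ciphers="DEFAULT:!DH")
print(factory.tls_ciphers)  # DEFAULT:!DH
```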
</div>
<div class="section" id="downloader-client-tls-ciphers">
<span id="std-setting-DOWNLOADER_CLIENT_TLS_CIPHERS"></span><span id="std:setting-DOWNLOADER_CLIENT_TLS_CIPHERS"></span><h3>DOWNLOADER_CLIENT_TLS_CIPHERS<a class="headerlink" href="#downloader-client-tls-ciphers" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'DEFAULT'</span></code></p>
<p>Use this setting to customize the TLS/SSL ciphers used by the default HTTP/1.1 downloader.</p>
<p>The setting should contain a string in the <a class="reference external" href="https://www.openssl.org/docs/manmaster/man1/openssl-ciphers.html#CIPHER-LIST-FORMAT">OpenSSL cipher list format</a>; these ciphers will be used as client ciphers. Changing this setting may be necessary to access certain HTTPS websites: for example, you may need to use <code class="docutils literal notranslate"><span class="pre">'DEFAULT:!DH'</span></code> for a website with weak DH parameters, or enable a specific cipher that is not included in <code class="docutils literal notranslate"><span class="pre">DEFAULT</span></code> if a website requires it.</p>
</div>
<div class="section" id="downloader-client-tls-method">
<span id="std-setting-DOWNLOADER_CLIENT_TLS_METHOD"></span><span id="std:setting-DOWNLOADER_CLIENT_TLS_METHOD"></span><h3>DOWNLOADER_CLIENT_TLS_METHOD<a class="headerlink" href="#downloader-client-tls-method" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'TLS'</span></code></p>
<p>Use this setting to customize the TLS/SSL method used by the default HTTP/1.1 downloader.</p>
<p>This setting must be one of these string values:</p>
<ul class="simple">
<li><p><code class="docutils literal notranslate"><span class="pre">'TLS'</span></code>: maps to OpenSSL's <code class="docutils literal notranslate"><span class="pre">TLS_method()</span></code> (a.k.a. <code class="docutils literal notranslate"><span class="pre">SSLv23_method()</span></code>), which allows protocol negotiation, starting from the highest supported by the platform; <strong>default, recommended</strong></p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">'TLSv1.0'</span></code>: this value forces HTTPS connections to use TLS version 1.0; set this if you want the behavior of Scrapy&lt;1.1</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">'TLSv1.1'</span></code>: forces TLS version 1.1</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">'TLSv1.2'</span></code>: forces TLS version 1.2</p></li>
<li><p><code class="docutils literal notranslate"><span class="pre">'SSLv3'</span></code>: forces SSL version 3 (<strong>not recommended</strong>)</p></li>
</ul>
</div>
<div class="section" id="downloader-client-tls-verbose-logging">
<span id="std-setting-DOWNLOADER_CLIENT_TLS_VERBOSE_LOGGING"></span><span id="std:setting-DOWNLOADER_CLIENT_TLS_VERBOSE_LOGGING"></span><h3>DOWNLOADER_CLIENT_TLS_VERBOSE_LOGGING<a class="headerlink" href="#downloader-client-tls-verbose-logging" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>Setting this to <code class="docutils literal notranslate"><span class="pre">True</span></code> will enable DEBUG level messages about TLS connection parameters after establishing HTTPS connections. The kind of information logged depends on the versions of OpenSSL and pyOpenSSL.</p>
<p>This setting is only used for the default <a class="reference internal" href="#std-setting-DOWNLOADER_CLIENTCONTEXTFACTORY"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOADER_CLIENTCONTEXTFACTORY</span></code></a>.</p>
</div>
<div class="section" id="downloader-middlewares">
<span id="std-setting-DOWNLOADER_MIDDLEWARES"></span><span id="std:setting-DOWNLOADER_MIDDLEWARES"></span><h3>DOWNLOADER_MIDDLEWARES<a class="headerlink" href="#downloader-middlewares" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">{}</span></code></p>
<p>A dict containing the downloader middlewares enabled in your project, and their orders. For more info see <a class="reference internal" href="downloader-middleware.html#topics-downloader-middleware-setting"><span class="std std-ref">Activating a downloader middleware</span></a>.</p>
</div>
<div class="section" id="downloader-middlewares-base">
<span id="std-setting-DOWNLOADER_MIDDLEWARES_BASE"></span><span id="std:setting-DOWNLOADER_MIDDLEWARES_BASE"></span><h3>DOWNLOADER_MIDDLEWARES_BASE<a class="headerlink" href="#downloader-middlewares-base" title="Permalink to this headline">¶</a></h3>
<p>Default:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="p">{</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware&#39;</span><span class="p">:</span> <span class="mi">100</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware&#39;</span><span class="p">:</span> <span class="mi">300</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware&#39;</span><span class="p">:</span> <span class="mi">350</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware&#39;</span><span class="p">:</span> <span class="mi">400</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.useragent.UserAgentMiddleware&#39;</span><span class="p">:</span> <span class="mi">500</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.retry.RetryMiddleware&#39;</span><span class="p">:</span> <span class="mi">550</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.ajaxcrawl.AjaxCrawlMiddleware&#39;</span><span class="p">:</span> <span class="mi">560</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware&#39;</span><span class="p">:</span> <span class="mi">580</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware&#39;</span><span class="p">:</span> <span class="mi">590</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.redirect.RedirectMiddleware&#39;</span><span class="p">:</span> <span class="mi">600</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.cookies.CookiesMiddleware&#39;</span><span class="p">:</span> <span class="mi">700</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware&#39;</span><span class="p">:</span> <span class="mi">750</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.stats.DownloaderStats&#39;</span><span class="p">:</span> <span class="mi">850</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.downloadermiddlewares.httpcache.HttpCacheMiddleware&#39;</span><span class="p">:</span> <span class="mi">900</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
<p>A dict containing the downloader middlewares enabled by default in Scrapy. Low orders are closer to the engine, high orders are closer to the downloader. You should never modify this setting in your project; modify <a class="reference internal" href="#std-setting-DOWNLOADER_MIDDLEWARES"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOADER_MIDDLEWARES</span></code></a> instead. For more info see <a class="reference internal" href="downloader-middleware.html#topics-downloader-middleware-setting"><span class="std std-ref">Activating a downloader middleware</span></a>.</p>
</div>
<div class="section" id="downloader-stats">
<span id="std-setting-DOWNLOADER_STATS"></span><span id="std:setting-DOWNLOADER_STATS"></span><h3>DOWNLOADER_STATS<a class="headerlink" href="#downloader-stats" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">True</span></code></p>
<p>Whether to enable downloader stats collection.</p>
</div>
<div class="section" id="download-delay">
<span id="std-setting-DOWNLOAD_DELAY"></span><span id="std:setting-DOWNLOAD_DELAY"></span><h3>DOWNLOAD_DELAY<a class="headerlink" href="#download-delay" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">0</span></code></p>
<p>The amount of time (in secs) that the downloader should wait before downloading consecutive pages from the same website. This can be used to throttle the crawling speed to avoid hitting servers too hard. Decimal numbers are supported. Example:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">DOWNLOAD_DELAY</span> <span class="o">=</span> <span class="mf">0.25</span>    <span class="c1"># 250 ms of delay</span>
</pre></div>
</div>
<p>This setting is also affected by the <a class="reference internal" href="#std-setting-RANDOMIZE_DOWNLOAD_DELAY"><code class="xref std std-setting docutils literal notranslate"><span class="pre">RANDOMIZE_DOWNLOAD_DELAY</span></code></a> setting (which is enabled by default). By default, Scrapy doesn't wait a fixed amount of time between requests, but uses a random interval between 0.5 * <a class="reference internal" href="#std-setting-DOWNLOAD_DELAY"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_DELAY</span></code></a> and 1.5 * <a class="reference internal" href="#std-setting-DOWNLOAD_DELAY"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_DELAY</span></code></a>.</p>
<p>When <a class="reference internal" href="#std-setting-CONCURRENT_REQUESTS_PER_IP"><code class="xref std std-setting docutils literal notranslate"><span class="pre">CONCURRENT_REQUESTS_PER_IP</span></code></a> is non-zero, delays are enforced per IP address instead of per domain.</p>
<p id="spider-download-delay-attribute">You can also change this setting per spider by setting the <code class="docutils literal notranslate"><span class="pre">download_delay</span></code> spider attribute.</p>
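<p>The randomized wait drawn for each request therefore lies in [0.5 * DOWNLOAD_DELAY, 1.5 * DOWNLOAD_DELAY]; a small stand-alone illustration:</p>

```python
import random

DOWNLOAD_DELAY = 0.25  # base delay of 250 ms

def next_delay(base=DOWNLOAD_DELAY):
    # Random interval between 0.5 * base and 1.5 * base, as done by default
    return random.uniform(0.5 * base, 1.5 * base)

delay = next_delay()
assert 0.125 <= delay <= 0.375  # always within [125 ms, 375 ms]
```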
</div>
<div class="section" id="download-handlers">
<span id="std-setting-DOWNLOAD_HANDLERS"></span><span id="std:setting-DOWNLOAD_HANDLERS"></span><h3>DOWNLOAD_HANDLERS<a class="headerlink" href="#download-handlers" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">{}</span></code></p>
<p>A dict containing the request downloader handlers enabled in your project. See <a class="reference internal" href="#std-setting-DOWNLOAD_HANDLERS_BASE"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_HANDLERS_BASE</span></code></a> for the example format.</p>
</div>
<div class="section" id="download-handlers-base">
<span id="std-setting-DOWNLOAD_HANDLERS_BASE"></span><span id="std:setting-DOWNLOAD_HANDLERS_BASE"></span><h3>DOWNLOAD_HANDLERS_BASE<a class="headerlink" href="#download-handlers-base" title="Permalink to this headline">¶</a></h3>
<p>Default:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="p">{</span>
    <span class="s1">&#39;file&#39;</span><span class="p">:</span> <span class="s1">&#39;scrapy.core.downloader.handlers.file.FileDownloadHandler&#39;</span><span class="p">,</span>
    <span class="s1">&#39;http&#39;</span><span class="p">:</span> <span class="s1">&#39;scrapy.core.downloader.handlers.http.HTTPDownloadHandler&#39;</span><span class="p">,</span>
    <span class="s1">&#39;https&#39;</span><span class="p">:</span> <span class="s1">&#39;scrapy.core.downloader.handlers.http.HTTPDownloadHandler&#39;</span><span class="p">,</span>
    <span class="s1">&#39;s3&#39;</span><span class="p">:</span> <span class="s1">&#39;scrapy.core.downloader.handlers.s3.S3DownloadHandler&#39;</span><span class="p">,</span>
    <span class="s1">&#39;ftp&#39;</span><span class="p">:</span> <span class="s1">&#39;scrapy.core.downloader.handlers.ftp.FTPDownloadHandler&#39;</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
<p>A dict containing the request download handlers enabled by default in Scrapy. You should never modify this setting in your project; modify <a class="reference internal" href="#std-setting-DOWNLOAD_HANDLERS"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_HANDLERS</span></code></a> instead.</p>
<p>You can disable any of these download handlers by assigning <code class="docutils literal notranslate"><span class="pre">None</span></code> to their URI scheme in <a class="reference internal" href="#std-setting-DOWNLOAD_HANDLERS"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_HANDLERS</span></code></a>. E.g., to disable the built-in FTP handler (without replacement), place this in your <code class="docutils literal notranslate"><span class="pre">settings.py</span></code>:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">DOWNLOAD_HANDLERS</span> <span class="o">=</span> <span class="p">{</span>
    <span class="s1">&#39;ftp&#39;</span><span class="p">:</span> <span class="kc">None</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
</div>
<div class="section" id="download-timeout">
<span id="std-setting-DOWNLOAD_TIMEOUT"></span><span id="std:setting-DOWNLOAD_TIMEOUT"></span><h3>DOWNLOAD_TIMEOUT<a class="headerlink" href="#download-timeout" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">180</span></code></p>
<p>The amount of time (in secs) that the downloader will wait before timing out.</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>This timeout can be set per spider using the <code class="xref py py-attr docutils literal notranslate"><span class="pre">download_timeout</span></code> spider attribute and per-request using the <a class="reference internal" href="request-response.html#std-reqmeta-download_timeout"><code class="xref std std-reqmeta docutils literal notranslate"><span class="pre">download_timeout</span></code></a> Request.meta key.</p>
</div>
</div>
<div class="section" id="download-maxsize">
<span id="std-setting-DOWNLOAD_MAXSIZE"></span><span id="std:setting-DOWNLOAD_MAXSIZE"></span><h3>DOWNLOAD_MAXSIZE<a class="headerlink" href="#download-maxsize" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">1073741824</span></code> (1024 MB)</p>
<p>The maximum response size (in bytes) that the downloader will download.</p>
<p>If you want to disable it, set it to 0.</p>
<div class="admonition note" id="std-reqmeta-download_maxsize">
<span id="std:reqmeta-download_maxsize"></span><p class="admonition-title">Note</p>
<p>This size can be set per spider using the <code class="xref py py-attr docutils literal notranslate"><span class="pre">download_maxsize</span></code> spider attribute and per-request using the <a class="reference internal" href="#std-reqmeta-download_maxsize"><code class="xref std std-reqmeta docutils literal notranslate"><span class="pre">download_maxsize</span></code></a> Request.meta key.</p>
</div>
</div>
<div class="section" id="download-warnsize">
<span id="std-setting-DOWNLOAD_WARNSIZE"></span><span id="std:setting-DOWNLOAD_WARNSIZE"></span><h3>DOWNLOAD_WARNSIZE<a class="headerlink" href="#download-warnsize" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">33554432</span></code> (32 MB)</p>
<p>The response size (in bytes) at which the downloader will start to warn.</p>
<p>If you want to disable it, set it to 0.</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>This size can be set per spider using the <code class="xref py py-attr docutils literal notranslate"><span class="pre">download_warnsize</span></code> spider attribute and per-request using the <code class="xref std std-reqmeta docutils literal notranslate"><span class="pre">download_warnsize</span></code> Request.meta key.</p>
</div>
</div>
<div class="section" id="download-fail-on-dataloss">
<span id="std-setting-DOWNLOAD_FAIL_ON_DATALOSS"></span><span id="std:setting-DOWNLOAD_FAIL_ON_DATALOSS"></span><h3>DOWNLOAD_FAIL_ON_DATALOSS<a class="headerlink" href="#download-fail-on-dataloss" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">True</span></code></p>
<p>Whether or not to fail on broken responses, that is, a declared <code class="docutils literal notranslate"><span class="pre">Content-Length</span></code> that does not match the content sent by the server, or a chunked response that was not properly finished. If <code class="docutils literal notranslate"><span class="pre">True</span></code>, these responses raise a <code class="docutils literal notranslate"><span class="pre">ResponseFailed([_DataLoss])</span></code> error. If <code class="docutils literal notranslate"><span class="pre">False</span></code>, these responses are passed through and the flag <code class="docutils literal notranslate"><span class="pre">dataloss</span></code> is added to the response, i.e.: <code class="docutils literal notranslate"><span class="pre">'dataloss'</span> <span class="pre">in</span> <span class="pre">response.flags</span></code> is <code class="docutils literal notranslate"><span class="pre">True</span></code>.</p>
<p>Optionally, this can be set per-request by setting the <a class="reference internal" href="request-response.html#std-reqmeta-download_fail_on_dataloss"><code class="xref std std-reqmeta docutils literal notranslate"><span class="pre">download_fail_on_dataloss</span></code></a> Request.meta key to <code class="docutils literal notranslate"><span class="pre">False</span></code>.</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>A broken response, or data loss error, may happen under several circumstances, from server misconfiguration to network errors to data corruption. It is up to the user to decide whether it makes sense to process broken responses, considering they may contain partial or incomplete content. If <a class="reference internal" href="downloader-middleware.html#std-setting-RETRY_ENABLED"><code class="xref std std-setting docutils literal notranslate"><span class="pre">RETRY_ENABLED</span></code></a> is <code class="docutils literal notranslate"><span class="pre">True</span></code> and this setting is set to <code class="docutils literal notranslate"><span class="pre">True</span></code>, the <code class="docutils literal notranslate"><span class="pre">ResponseFailed([_DataLoss])</span></code> failure will be retried as usual.</p>
</div>
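<p>For example, a minimal spider sketch (the spider name and URL are hypothetical; this assumes a Scrapy project) that tolerates data loss for a single request and checks the resulting flag:</p>

```python
import scrapy  # assumes Scrapy is installed; names below are illustrative

class PartialContentSpider(scrapy.Spider):
    name = "partial-content"

    def start_requests(self):
        # Tolerate a truncated body for this request only; instead of
        # failing, the response will carry the 'dataloss' flag.
        yield scrapy.Request(
            "http://example.com/big-file",
            meta={"download_fail_on_dataloss": False},
        )

    def parse(self, response):
        if "dataloss" in response.flags:
            self.logger.warning("partial content received: %s", response.url)
```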
</div>
<div class="section" id="dupefilter-class">
<span id="std-setting-DUPEFILTER_CLASS"></span><span id="std:setting-DUPEFILTER_CLASS"></span><h3>DUPEFILTER_CLASS<a class="headerlink" href="#dupefilter-class" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.dupefilters.RFPDupeFilter'</span></code></p>
<p>The class used to detect and filter duplicate requests.</p>
<p>The default (<code class="docutils literal notranslate"><span class="pre">RFPDupeFilter</span></code>) filters based on the request fingerprint, using the <code class="docutils literal notranslate"><span class="pre">scrapy.utils.request.request_fingerprint</span></code> function. In order to change the way duplicates are checked, you could subclass <code class="docutils literal notranslate"><span class="pre">RFPDupeFilter</span></code> and override its <code class="docutils literal notranslate"><span class="pre">request_fingerprint</span></code> method. This method should accept a Scrapy <a class="reference internal" href="request-response.html#scrapy.http.Request" title="scrapy.http.Request"><code class="xref py py-class docutils literal notranslate"><span class="pre">Request</span></code></a> object and return its fingerprint (a string).</p>
<p>You can disable filtering of duplicate requests by setting <a class="reference internal" href="#std-setting-DUPEFILTER_CLASS"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DUPEFILTER_CLASS</span></code></a> to <code class="docutils literal notranslate"><span class="pre">'scrapy.dupefilters.BaseDupeFilter'</span></code>. Be very careful about this however, because you can get into crawling loops. It is usually a better idea to set the <code class="docutils literal notranslate"><span class="pre">dont_filter</span></code> parameter to <code class="docutils literal notranslate"><span class="pre">True</span></code> on the specific <a class="reference internal" href="request-response.html#scrapy.http.Request" title="scrapy.http.Request"><code class="xref py py-class docutils literal notranslate"><span class="pre">Request</span></code></a> that should not be filtered.</p>
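<p>The fingerprint idea can be illustrated with a few lines of plain Python. This is only a sketch of the concept (hashing method, URL and body), not the actual <code class="docutils literal notranslate"><span class="pre">scrapy.utils.request.request_fingerprint</span></code> implementation, which additionally canonicalizes URLs:</p>

```python
import hashlib

def request_fingerprint(method, url, body=b""):
    # Requests that are byte-identical in method, URL and body
    # hash to the same hex string, so the second one is a duplicate.
    h = hashlib.sha1()
    h.update(method.encode("ascii"))
    h.update(url.encode("utf-8"))
    h.update(body)
    return h.hexdigest()

seen = set()
kept = []
for method, url in [("GET", "http://example.com/a"),
                    ("GET", "http://example.com/a"),  # duplicate, dropped
                    ("GET", "http://example.com/b")]:
    fp = request_fingerprint(method, url)
    if fp in seen:
        continue  # a dupefilter would discard the request here
    seen.add(fp)
    kept.append(url)

print(kept)  # ['http://example.com/a', 'http://example.com/b']
```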
</div>
<div class="section" id="dupefilter-debug">
<span id="std-setting-DUPEFILTER_DEBUG"></span><span id="std:setting-DUPEFILTER_DEBUG"></span><h3>DUPEFILTER_DEBUG<a class="headerlink" href="#dupefilter-debug" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>By default, <code class="docutils literal notranslate"><span class="pre">RFPDupeFilter</span></code> only logs the first duplicate request. Setting <a class="reference internal" href="#std-setting-DUPEFILTER_DEBUG"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DUPEFILTER_DEBUG</span></code></a> to <code class="docutils literal notranslate"><span class="pre">True</span></code> will make it log all duplicate requests.</p>
</div>
<div class="section" id="editor">
<span id="std-setting-EDITOR"></span><span id="std:setting-EDITOR"></span><h3>EDITOR<a class="headerlink" href="#editor" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">vi</span></code> (on Unix systems) or the IDLE editor (on Windows)</p>
<p>The editor to use for editing spiders with the <a class="reference internal" href="commands.html#std-command-edit"><code class="xref std std-command docutils literal notranslate"><span class="pre">edit</span></code></a> command. Additionally, if the <code class="docutils literal notranslate"><span class="pre">EDITOR</span></code> environment variable is set, the <a class="reference internal" href="commands.html#std-command-edit"><code class="xref std std-command docutils literal notranslate"><span class="pre">edit</span></code></a> command will prefer it over the default setting.</p>
</div>
<div class="section" id="extensions">
<span id="std-setting-EXTENSIONS"></span><span id="std:setting-EXTENSIONS"></span><h3>EXTENSIONS<a class="headerlink" href="#extensions" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">{}</span></code></p>
<p>A dict containing the extensions enabled in your project, and their orders.</p>
</div>
<div class="section" id="extensions-base">
<span id="std-setting-EXTENSIONS_BASE"></span><span id="std:setting-EXTENSIONS_BASE"></span><h3>EXTENSIONS_BASE<a class="headerlink" href="#extensions-base" title="Permalink to this headline">¶</a></h3>
<p>Default:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="p">{</span>
    <span class="s1">&#39;scrapy.extensions.corestats.CoreStats&#39;</span><span class="p">:</span> <span class="mi">0</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.extensions.telnet.TelnetConsole&#39;</span><span class="p">:</span> <span class="mi">0</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.extensions.memusage.MemoryUsage&#39;</span><span class="p">:</span> <span class="mi">0</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.extensions.memdebug.MemoryDebugger&#39;</span><span class="p">:</span> <span class="mi">0</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.extensions.closespider.CloseSpider&#39;</span><span class="p">:</span> <span class="mi">0</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.extensions.feedexport.FeedExporter&#39;</span><span class="p">:</span> <span class="mi">0</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.extensions.logstats.LogStats&#39;</span><span class="p">:</span> <span class="mi">0</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.extensions.spiderstate.SpiderState&#39;</span><span class="p">:</span> <span class="mi">0</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.extensions.throttle.AutoThrottle&#39;</span><span class="p">:</span> <span class="mi">0</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
<p>A dict containing the extensions available by default in Scrapy, and their orders. This setting contains all stable built-in extensions. Keep in mind that some of them need to be enabled through a setting.</p>
<p>For more information see the <a class="reference internal" href="extensions.html#topics-extensions"><span class="std std-ref">extensions user guide</span></a> and the <a class="reference internal" href="extensions.html#topics-extensions-ref"><span class="std std-ref">list of available extensions</span></a>.</p>
</div>
<div class="section" id="feed-tempdir">
<span id="std-setting-FEED_TEMPDIR"></span><span id="std:setting-FEED_TEMPDIR"></span><h3>FEED_TEMPDIR<a class="headerlink" href="#feed-tempdir" title="Permalink to this headline">¶</a></h3>
<p>The Feed Temp dir allows you to set a custom folder to save crawler temporary files before uploading with <a class="reference internal" href="feed-exports.html#topics-feed-storage-ftp"><span class="std std-ref">FTP feed storage</span></a> and <a class="reference internal" href="feed-exports.html#topics-feed-storage-s3"><span class="std std-ref">Amazon S3</span></a>.</p>
</div>
<div class="section" id="feed-storage-gcs-acl">
<span id="std-setting-FEED_STORAGE_GCS_ACL"></span><span id="std:setting-FEED_STORAGE_GCS_ACL"></span><h3>FEED_STORAGE_GCS_ACL<a class="headerlink" href="#feed-storage-gcs-acl" title="Permalink to this headline">¶</a></h3>
<p>The Access Control List (ACL) used when storing items to <a class="reference internal" href="feed-exports.html#topics-feed-storage-gcs"><span class="std std-ref">Google Cloud Storage</span></a>. For more information on how to set this value, please refer to the column <em>JSON API</em> in the <a class="reference external" href="https://cloud.google.com/storage/docs/access-control/lists">Google Cloud documentation</a>.</p>
</div>
<div class="section" id="ftp-passive-mode">
<span id="std-setting-FTP_PASSIVE_MODE"></span><span id="std:setting-FTP_PASSIVE_MODE"></span><h3>FTP_PASSIVE_MODE<a class="headerlink" href="#ftp-passive-mode" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">True</span></code></p>
<p>Whether to use passive mode when initiating FTP transfers.</p>
<span class="target" id="std-reqmeta-ftp_password"><span id="std:reqmeta-ftp_password"></span></span></div>
<div class="section" id="ftp-password">
<span id="std-setting-FTP_PASSWORD"></span><span id="std:setting-FTP_PASSWORD"></span><h3>FTP_PASSWORD<a class="headerlink" href="#ftp-password" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">&quot;guest&quot;</span></code></p>
<p>The password to use for FTP connections when there is no <code class="docutils literal notranslate"><span class="pre">&quot;ftp_password&quot;</span></code> in <code class="docutils literal notranslate"><span class="pre">Request</span></code> meta.</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>Paraphrasing <a class="reference external" href="https://tools.ietf.org/html/rfc1635">RFC 1635</a>, although it is common to use either the password &quot;guest&quot; or one's e-mail address for anonymous FTP, some FTP servers explicitly ask for the user's e-mail address and will not allow login with the &quot;guest&quot; password.</p>
</div>
<span class="target" id="std-reqmeta-ftp_user"><span id="std:reqmeta-ftp_user"></span></span></div>
<div class="section" id="ftp-user">
<span id="std-setting-FTP_USER"></span><span id="std:setting-FTP_USER"></span><h3>FTP_USER<a class="headerlink" href="#ftp-user" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">&quot;anonymous&quot;</span></code></p>
<p>The username to use for FTP connections when there is no <code class="docutils literal notranslate"><span class="pre">&quot;ftp_user&quot;</span></code> in <code class="docutils literal notranslate"><span class="pre">Request</span></code> meta.</p>
</div>
<div class="section" id="gcs-project-id">
<span id="std-setting-GCS_PROJECT_ID"></span><span id="std:setting-GCS_PROJECT_ID"></span><h3>GCS_PROJECT_ID<a class="headerlink" href="#gcs-project-id" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>The Project ID that will be used when storing data on <a class="reference external" href="https://cloud.google.com/storage/">Google Cloud Storage</a>.</p>
</div>
<div class="section" id="item-pipelines">
<span id="std-setting-ITEM_PIPELINES"></span><span id="std:setting-ITEM_PIPELINES"></span><h3>ITEM_PIPELINES<a class="headerlink" href="#item-pipelines" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">{}</span></code></p>
<p>A dict containing the item pipelines to use, and their orders. Order values are arbitrary, but it is customary to define them in the 0-1000 range. Lower orders process before higher orders.</p>
<p>Example:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">ITEM_PIPELINES</span> <span class="o">=</span> <span class="p">{</span>
    <span class="s1">&#39;mybot.pipelines.validate.ValidateMyItem&#39;</span><span class="p">:</span> <span class="mi">300</span><span class="p">,</span>
    <span class="s1">&#39;mybot.pipelines.validate.StoreMyItem&#39;</span><span class="p">:</span> <span class="mi">800</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
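<p>The ordering rule can be sketched in plain Python. The pipeline classes below are toy stand-ins for the dotted paths above (real Scrapy pipelines receive the item plus a spider argument and are instantiated by the framework):</p>

```python
# A toy model of ITEM_PIPELINES ordering (hypothetical pipeline classes).
class ValidateMyItem:
    def process_item(self, item):
        item.setdefault("trace", []).append("validate")
        return item

class StoreMyItem:
    def process_item(self, item):
        item.setdefault("trace", []).append("store")
        return item

ITEM_PIPELINES = {ValidateMyItem: 300, StoreMyItem: 800}

def run_pipelines(item):
    # Lower order values run first: 300 (validate) before 800 (store).
    for pipeline_cls in sorted(ITEM_PIPELINES, key=ITEM_PIPELINES.get):
        item = pipeline_cls().process_item(item)
    return item

print(run_pipelines({})["trace"])  # ['validate', 'store']
```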
</div>
<div class="section" id="item-pipelines-base">
<span id="std-setting-ITEM_PIPELINES_BASE"></span><span id="std:setting-ITEM_PIPELINES_BASE"></span><h3>ITEM_PIPELINES_BASE<a class="headerlink" href="#item-pipelines-base" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">{}</span></code></p>
<p>A dict containing the pipelines enabled by default in Scrapy. You should never modify this setting in your project; modify <a class="reference internal" href="#std-setting-ITEM_PIPELINES"><code class="xref std std-setting docutils literal notranslate"><span class="pre">ITEM_PIPELINES</span></code></a> instead.</p>
</div>
<div class="section" id="log-enabled">
<span id="std-setting-LOG_ENABLED"></span><span id="std:setting-LOG_ENABLED"></span><h3>LOG_ENABLED<a class="headerlink" href="#log-enabled" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">True</span></code></p>
<p>Whether to enable logging.</p>
</div>
<div class="section" id="log-encoding">
<span id="std-setting-LOG_ENCODING"></span><span id="std:setting-LOG_ENCODING"></span><h3>LOG_ENCODING<a class="headerlink" href="#log-encoding" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'utf-8'</span></code></p>
<p>The encoding to use for logging.</p>
</div>
<div class="section" id="log-file">
<span id="std-setting-LOG_FILE"></span><span id="std:setting-LOG_FILE"></span><h3>LOG_FILE<a class="headerlink" href="#log-file" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>File name to use for logging output. If <code class="docutils literal notranslate"><span class="pre">None</span></code>, standard error will be used.</p>
</div>
<div class="section" id="log-format">
<span id="std-setting-LOG_FORMAT"></span><span id="std:setting-LOG_FORMAT"></span><h3>LOG_FORMAT<a class="headerlink" href="#log-format" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'%(asctime)s</span> <span class="pre">[%(name)s]</span> <span class="pre">%(levelname)s:</span> <span class="pre">%(message)s'</span></code></p>
<p>String for formatting log messages. Refer to the <a class="reference external" href="https://docs.python.org/3/library/logging.html#logrecord-attributes" title="(in Python v3.9)"><span class="xref std std-ref">Python logging documentation</span></a> for the whole list of available placeholders.</p>
</div>
<div class="section" id="log-dateformat">
<span id="std-setting-LOG_DATEFORMAT"></span><span id="std:setting-LOG_DATEFORMAT"></span><h3>LOG_DATEFORMAT<a class="headerlink" href="#log-dateformat" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'%Y-%m-%d</span> <span class="pre">%H:%M:%S'</span></code></p>
<p>String for formatting date/time, expansion of the <code class="docutils literal notranslate"><span class="pre">%(asctime)s</span></code> placeholder in <a class="reference internal" href="#std-setting-LOG_FORMAT"><code class="xref std std-setting docutils literal notranslate"><span class="pre">LOG_FORMAT</span></code></a>. Refer to the <a class="reference external" href="https://docs.python.org/3/library/datetime.html#strftime-strptime-behavior" title="(in Python v3.9)"><span class="xref std std-ref">Python datetime documentation</span></a> for the whole list of available directives.</p>
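<p>Because these two settings map directly onto the standard library's <code class="docutils literal notranslate"><span class="pre">logging.Formatter</span></code> arguments, their effect can be previewed with pure stdlib code:</p>

```python
import logging

# Scrapy's default LOG_FORMAT / LOG_DATEFORMAT, fed to a stdlib Formatter
LOG_FORMAT = "%(asctime)s [%(name)s] %(levelname)s: %(message)s"
LOG_DATEFORMAT = "%Y-%m-%d %H:%M:%S"

handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter(LOG_FORMAT, LOG_DATEFORMAT))
logger = logging.getLogger("scrapy.core.engine")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Emits a line shaped like: "<date> <time> [scrapy.core.engine] INFO: Spider opened"
logger.info("Spider opened")
```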
</div>
<div class="section" id="log-formatter">
<span id="std-setting-LOG_FORMATTER"></span><span id="std:setting-LOG_FORMATTER"></span><h3>LOG_FORMATTER<a class="headerlink" href="#log-formatter" title="Permalink to this headline">¶</a></h3>
<p>Default: <a class="reference internal" href="logging.html#scrapy.logformatter.LogFormatter" title="scrapy.logformatter.LogFormatter"><code class="xref py py-class docutils literal notranslate"><span class="pre">scrapy.logformatter.LogFormatter</span></code></a></p>
<p>The class to use for <a class="reference internal" href="logging.html#custom-log-formats"><span class="std std-ref">formatting log messages</span></a> for different actions.</p>
</div>
<div class="section" id="log-level">
<span id="std-setting-LOG_LEVEL"></span><span id="std:setting-LOG_LEVEL"></span><h3>LOG_LEVEL<a class="headerlink" href="#log-level" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'DEBUG'</span></code></p>
<p>Minimum level to log. Available levels are: CRITICAL, ERROR, WARNING, INFO, DEBUG. For more info see <a class="reference internal" href="logging.html#topics-logging"><span class="std std-ref">Logging</span></a>.</p>
</div>
<div class="section" id="log-stdout">
<span id="std-setting-LOG_STDOUT"></span><span id="std:setting-LOG_STDOUT"></span><h3>LOG_STDOUT<a class="headerlink" href="#log-stdout" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>If <code class="docutils literal notranslate"><span class="pre">True</span></code>, all standard output (and error) of your process will be redirected to the log. For example, if you <code class="docutils literal notranslate"><span class="pre">print('hello')</span></code> it will appear in the Scrapy log.</p>
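<p>The redirection idea can be sketched with the standard library alone. This is an illustrative stand-in, not Scrapy's actual implementation:</p>

```python
import io
import logging
import sys

class StreamLogger(io.TextIOBase):
    """File-like object that forwards each written line to a logger."""

    def __init__(self, logger, level=logging.INFO):
        self.logger = logger
        self.level = level

    def write(self, text):
        for line in text.rstrip().splitlines():
            self.logger.log(self.level, line.rstrip())
        return len(text)

logging.basicConfig(level=logging.INFO)
sys.stdout = StreamLogger(logging.getLogger("stdout"))
print("hello")               # routed through the "stdout" logger
sys.stdout = sys.__stdout__  # restore the real stdout
```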
</div>
<div class="section" id="log-short-names">
<span id="std-setting-LOG_SHORT_NAMES"></span><span id="std:setting-LOG_SHORT_NAMES"></span><h3>LOG_SHORT_NAMES<a class="headerlink" href="#log-short-names" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>If <code class="docutils literal notranslate"><span class="pre">True</span></code>, the logs will just contain the root path. If it is set to <code class="docutils literal notranslate"><span class="pre">False</span></code>, then it displays the component responsible for the log output.</p>
</div>
<div class="section" id="logstats-interval">
<span id="std-setting-LOGSTATS_INTERVAL"></span><span id="std:setting-LOGSTATS_INTERVAL"></span><h3>LOGSTATS_INTERVAL<a class="headerlink" href="#logstats-interval" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">60.0</span></code></p>
<p>The interval (in seconds) between each logging printout of the stats by <a class="reference internal" href="extensions.html#scrapy.extensions.logstats.LogStats" title="scrapy.extensions.logstats.LogStats"><code class="xref py py-class docutils literal notranslate"><span class="pre">LogStats</span></code></a>.</p>
</div>
<div class="section" id="memdebug-enabled">
<span id="std-setting-MEMDEBUG_ENABLED"></span><span id="std:setting-MEMDEBUG_ENABLED"></span><h3>MEMDEBUG_ENABLED<a class="headerlink" href="#memdebug-enabled" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>Whether to enable memory debugging.</p>
</div>
<div class="section" id="memdebug-notify">
<span id="std-setting-MEMDEBUG_NOTIFY"></span><span id="std:setting-MEMDEBUG_NOTIFY"></span><h3>MEMDEBUG_NOTIFY<a class="headerlink" href="#memdebug-notify" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">[]</span></code></p>
<p>When memory debugging is enabled, a memory report will be sent to the specified addresses if this setting is not empty; otherwise the report will be written to the log.</p>
<p>Example:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">MEMDEBUG_NOTIFY</span> <span class="o">=</span> <span class="p">[</span><span class="s1">&#39;user@example.com&#39;</span><span class="p">]</span>
</pre></div>
</div>
</div>
<div class="section" id="memusage-enabled">
<span id="std-setting-MEMUSAGE_ENABLED"></span><span id="std:setting-MEMUSAGE_ENABLED"></span><h3>MEMUSAGE_ENABLED<a class="headerlink" href="#memusage-enabled" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">True</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.extensions.memusage</span></code></p>
<p>Whether to enable the memory usage extension. This extension keeps track of the peak memory used by the process (it writes it to stats). It can also optionally shut down the Scrapy process when it exceeds a memory limit (see <a class="reference internal" href="#std-setting-MEMUSAGE_LIMIT_MB"><code class="xref std std-setting docutils literal notranslate"><span class="pre">MEMUSAGE_LIMIT_MB</span></code></a>), and notify by email when that has happened (see <a class="reference internal" href="#std-setting-MEMUSAGE_NOTIFY_MAIL"><code class="xref std std-setting docutils literal notranslate"><span class="pre">MEMUSAGE_NOTIFY_MAIL</span></code></a>).</p>
<p>See <a class="reference internal" href="extensions.html#topics-extensions-ref-memusage"><span class="std std-ref">Memory usage extension</span></a>.</p>
</div>
<div class="section" id="memusage-limit-mb">
<span id="std-setting-MEMUSAGE_LIMIT_MB"></span><span id="std:setting-MEMUSAGE_LIMIT_MB"></span><h3>MEMUSAGE_LIMIT_MB<a class="headerlink" href="#memusage-limit-mb" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">0</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.extensions.memusage</span></code></p>
<p>The maximum amount of memory to allow (in megabytes) before shutting down Scrapy (if MEMUSAGE_ENABLED is True). If zero, no check will be performed.</p>
<p>See <a class="reference internal" href="extensions.html#topics-extensions-ref-memusage"><span class="std std-ref">Memory usage extension</span></a>.</p>
</div>
<div class="section" id="memusage-check-interval-seconds">
<span id="std-setting-MEMUSAGE_CHECK_INTERVAL_SECONDS"></span><span id="std:setting-MEMUSAGE_CHECK_INTERVAL_SECONDS"></span><h3>MEMUSAGE_CHECK_INTERVAL_SECONDS<a class="headerlink" href="#memusage-check-interval-seconds" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">60.0</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.extensions.memusage</span></code></p>
<p>The <a class="reference internal" href="extensions.html#topics-extensions-ref-memusage"><span class="std std-ref">Memory usage extension</span></a> checks the current memory usage, versus the limits set by <a class="reference internal" href="#std-setting-MEMUSAGE_LIMIT_MB"><code class="xref std std-setting docutils literal notranslate"><span class="pre">MEMUSAGE_LIMIT_MB</span></code></a> and <a class="reference internal" href="#std-setting-MEMUSAGE_WARNING_MB"><code class="xref std std-setting docutils literal notranslate"><span class="pre">MEMUSAGE_WARNING_MB</span></code></a>, at fixed time intervals.</p>
<p>This sets the length of these intervals, in seconds.</p>
<p>See <a class="reference internal" href="extensions.html#topics-extensions-ref-memusage"><span class="std std-ref">Memory usage extension</span></a>.</p>
</div>
<div class="section" id="memusage-notify-mail">
<span id="std-setting-MEMUSAGE_NOTIFY_MAIL"></span><span id="std:setting-MEMUSAGE_NOTIFY_MAIL"></span><h3>MEMUSAGE_NOTIFY_MAIL<a class="headerlink" href="#memusage-notify-mail" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.extensions.memusage</span></code></p>
<p>A list of emails to notify if the memory limit has been reached.</p>
<p>Example:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">MEMUSAGE_NOTIFY_MAIL</span> <span class="o">=</span> <span class="p">[</span><span class="s1">&#39;user@example.com&#39;</span><span class="p">]</span>
</pre></div>
</div>
<p>See <a class="reference internal" href="extensions.html#topics-extensions-ref-memusage"><span class="std std-ref">Memory usage extension</span></a>.</p>
</div>
<div class="section" id="memusage-warning-mb">
<span id="std-setting-MEMUSAGE_WARNING_MB"></span><span id="std:setting-MEMUSAGE_WARNING_MB"></span><h3>MEMUSAGE_WARNING_MB<a class="headerlink" href="#memusage-warning-mb" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">0</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.extensions.memusage</span></code></p>
<p>The maximum amount of memory to allow (in megabytes) before sending a warning email notifying about it. If zero, no warning will be produced.</p>
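<p>Putting the memory-usage settings together, a settings.py fragment might look like this (the limit and interval values are illustrative, not recommendations):</p>

```python
# settings.py -- memory usage extension, sketched configuration
MEMUSAGE_ENABLED = True                  # default: track peak memory in stats
MEMUSAGE_WARNING_MB = 1536               # warn by e-mail above 1.5 GB
MEMUSAGE_LIMIT_MB = 2048                 # shut the crawl down above 2 GB
MEMUSAGE_CHECK_INTERVAL_SECONDS = 30.0   # check twice as often as the default
MEMUSAGE_NOTIFY_MAIL = ["user@example.com"]
```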
</div>
<div class="section" id="newspider-module">
<span id="std-setting-NEWSPIDER_MODULE"></span><span id="std:setting-NEWSPIDER_MODULE"></span><h3>NEWSPIDER_MODULE<a class="headerlink" href="#newspider-module" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">''</span></code></p>
<p>Module where to create new spiders using the <a class="reference internal" href="commands.html#std-command-genspider"><code class="xref std std-command docutils literal notranslate"><span class="pre">genspider</span></code></a> command.</p>
<p>Example:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">NEWSPIDER_MODULE</span> <span class="o">=</span> <span class="s1">&#39;mybot.spiders_dev&#39;</span>
</pre></div>
</div>
</div>
<div class="section" id="randomize-download-delay">
<span id="std-setting-RANDOMIZE_DOWNLOAD_DELAY"></span><span id="std:setting-RANDOMIZE_DOWNLOAD_DELAY"></span><h3>RANDOMIZE_DOWNLOAD_DELAY<a class="headerlink" href="#randomize-download-delay" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">True</span></code></p>
<p>If enabled, Scrapy will wait a random amount of time (between 0.5 * <a class="reference internal" href="#std-setting-DOWNLOAD_DELAY"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_DELAY</span></code></a> and 1.5 * <a class="reference internal" href="#std-setting-DOWNLOAD_DELAY"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_DELAY</span></code></a>) while fetching requests from the same website.</p>
<p>This randomization decreases the chance of the crawler being detected (and subsequently blocked) by sites which analyze requests looking for statistically significant similarities in the time between their requests.</p>
<p>The randomization policy is the same used by the <a class="reference external" href="https://www.gnu.org/software/wget/manual/wget.html">wget</a> <code class="docutils literal notranslate"><span class="pre">--random-wait</span></code> option.</p>
<p>If <a class="reference internal" href="#std-setting-DOWNLOAD_DELAY"><code class="xref std std-setting docutils literal notranslate"><span class="pre">DOWNLOAD_DELAY</span></code></a> is zero (default), this option has no effect.</p>
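<p>The randomization rule itself is simple enough to sketch in a few lines (illustrative, not Scrapy's actual code):</p>

```python
import random

DOWNLOAD_DELAY = 2.0  # base delay in seconds

def randomized_delay(base=DOWNLOAD_DELAY):
    # Uniform draw from [0.5 * base, 1.5 * base], per the rule above
    return random.uniform(0.5 * base, 1.5 * base)

delays = [randomized_delay() for _ in range(5)]
print(delays)  # five values, each between 1.0 and 3.0 seconds
```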
</div>
<div class="section" id="reactor-threadpool-maxsize">
<span id="std-setting-REACTOR_THREADPOOL_MAXSIZE"></span><span id="std:setting-REACTOR_THREADPOOL_MAXSIZE"></span><h3>REACTOR_THREADPOOL_MAXSIZE<a class="headerlink" href="#reactor-threadpool-maxsize" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">10</span></code></p>
<p>The maximum limit for the Twisted Reactor thread pool size. This is a common multi-purpose thread pool used by various Scrapy components: the threaded DNS resolver, BlockingFeedStorage, S3FilesStore, just to name a few. Increase this value if you're experiencing problems with insufficient blocking IO.</p>
</div>
<div class="section" id="redirect-priority-adjust">
<span id="std-setting-REDIRECT_PRIORITY_ADJUST"></span><span id="std:setting-REDIRECT_PRIORITY_ADJUST"></span><h3>REDIRECT_PRIORITY_ADJUST<a class="headerlink" href="#redirect-priority-adjust" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">+2</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.downloadermiddlewares.redirect.RedirectMiddleware</span></code></p>
<p>Adjust redirect request priority relative to the original request:</p>
<ul class="simple">
<li><p><strong>a positive priority adjust (default) means higher priority.</strong></p></li>
<li><p>a negative priority adjust means lower priority.</p></li>
</ul>
</div>
<div class="section" id="retry-priority-adjust">
<span id="std-setting-RETRY_PRIORITY_ADJUST"></span><span id="std:setting-RETRY_PRIORITY_ADJUST"></span><h3>RETRY_PRIORITY_ADJUST<a class="headerlink" href="#retry-priority-adjust" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">-1</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.downloadermiddlewares.retry.RetryMiddleware</span></code></p>
<p>Adjust retry request priority relative to the original request:</p>
<ul class="simple">
<li><p>a positive priority adjust means higher priority.</p></li>
<li><p><strong>a negative priority adjust (default) means lower priority.</strong></p></li>
</ul>
</div>
<div class="section" id="robotstxt-obey">
<span id="std-setting-ROBOTSTXT_OBEY"></span><span id="std:setting-ROBOTSTXT_OBEY"></span><h3>ROBOTSTXT_OBEY<a class="headerlink" href="#robotstxt-obey" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">scrapy.downloadermiddlewares.robotstxt</span></code></p>
<p>If enabled, Scrapy will respect robots.txt policies. For more information see <a class="reference internal" href="downloader-middleware.html#topics-dlmw-robots"><span class="std std-ref">RobotsTxtMiddleware</span></a>.</p>
<div class="admonition note">
<p class="admonition-title">Note</p>
<p>While the default value is <code class="docutils literal notranslate"><span class="pre">False</span></code> for historical reasons, this option is enabled by default in the settings.py file generated by the <code class="docutils literal notranslate"><span class="pre">scrapy</span> <span class="pre">startproject</span></code> command.</p>
</div>
</div>
<div class="section" id="robotstxt-parser">
<span id="std-setting-ROBOTSTXT_PARSER"></span><span id="std:setting-ROBOTSTXT_PARSER"></span><h3>ROBOTSTXT_PARSER<a class="headerlink" href="#robotstxt-parser" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.robotstxt.ProtegoRobotParser'</span></code></p>
<p>The parser backend to use for parsing <code class="docutils literal notranslate"><span class="pre">robots.txt</span></code> files. For more information see <a class="reference internal" href="downloader-middleware.html#topics-dlmw-robots"><span class="std std-ref">RobotsTxtMiddleware</span></a>.</p>
<div class="section" id="robotstxt-user-agent">
<span id="std-setting-ROBOTSTXT_USER_AGENT"></span><span id="std:setting-ROBOTSTXT_USER_AGENT"></span><h4>ROBOTSTXT_USER_AGENT<a class="headerlink" href="#robotstxt-user-agent" title="Permalink to this headline">¶</a></h4>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>The user agent string to use for matching in the robots.txt file. If <code class="docutils literal notranslate"><span class="pre">None</span></code>, the User-Agent header you are sending with the request or the <a class="reference internal" href="#std-setting-USER_AGENT"><code class="xref std std-setting docutils literal notranslate"><span class="pre">USER_AGENT</span></code></a> setting (in that order) will be used for determining the user agent to use in the robots.txt file.</p>
</div>
</div>
<div class="section" id="scheduler">
<span id="std-setting-SCHEDULER"></span><span id="std:setting-SCHEDULER"></span><h3>SCHEDULER<a class="headerlink" href="#scheduler" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.core.scheduler.Scheduler'</span></code></p>
<p>The scheduler to use for crawling.</p>
</div>
<div class="section" id="scheduler-debug">
<span id="std-setting-SCHEDULER_DEBUG"></span><span id="std:setting-SCHEDULER_DEBUG"></span><h3>SCHEDULER_DEBUG<a class="headerlink" href="#scheduler-debug" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>Setting to <code class="docutils literal notranslate"><span class="pre">True</span></code> will log debug information about the requests scheduler. This currently logs (only once) if the requests cannot be serialized to disk. The stats counter (<code class="docutils literal notranslate"><span class="pre">scheduler/unserializable</span></code>) tracks the number of times this happens.</p>
<p>Example entry in logs:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="mi">1956</span><span class="o">-</span><span class="mi">01</span><span class="o">-</span><span class="mi">31</span> <span class="mi">00</span><span class="p">:</span><span class="mi">00</span><span class="p">:</span><span class="mi">00</span><span class="o">+</span><span class="mi">0800</span> <span class="p">[</span><span class="n">scrapy</span><span class="o">.</span><span class="n">core</span><span class="o">.</span><span class="n">scheduler</span><span class="p">]</span> <span class="n">ERROR</span><span class="p">:</span> <span class="n">Unable</span> <span class="n">to</span> <span class="n">serialize</span> <span class="n">request</span><span class="p">:</span>
<span class="o">&lt;</span><span class="n">GET</span> <span class="n">http</span><span class="p">:</span><span class="o">//</span><span class="n">example</span><span class="o">.</span><span class="n">com</span><span class="o">&gt;</span> <span class="o">-</span> <span class="n">reason</span><span class="p">:</span> <span class="n">cannot</span> <span class="n">serialize</span> <span class="o">&lt;</span><span class="n">Request</span> <span class="n">at</span> <span class="mh">0x9a7c7ec</span><span class="o">&gt;</span>
<span class="p">(</span><span class="nb">type</span> <span class="n">Request</span><span class="p">)</span><span class="o">&gt;</span> <span class="o">-</span> <span class="n">no</span> <span class="n">more</span> <span class="n">unserializable</span> <span class="n">requests</span> <span class="n">will</span> <span class="n">be</span> <span class="n">logged</span>
<span class="p">(</span><span class="n">see</span> <span class="s1">&#39;scheduler/unserializable&#39;</span> <span class="n">stats</span> <span class="n">counter</span><span class="p">)</span>
</pre></div>
</div>
</div>
<div class="section" id="scheduler-disk-queue">
<span id="std-setting-SCHEDULER_DISK_QUEUE"></span><span id="std:setting-SCHEDULER_DISK_QUEUE"></span><h3>SCHEDULER_DISK_QUEUE<a class="headerlink" href="#scheduler-disk-queue" title="Permalink to this headline">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.squeues.PickleLifoDiskQueue'</span></code></p>
<p>Type of disk queue that will be used by the scheduler. Other available types are <code class="docutils literal notranslate"><span class="pre">scrapy.squeues.PickleFifoDiskQueue</span></code>, <code class="docutils literal notranslate"><span class="pre">scrapy.squeues.MarshalFifoDiskQueue</span></code>, <code class="docutils literal notranslate"><span class="pre">scrapy.squeues.MarshalLifoDiskQueue</span></code>.</p>
</div>
<div class="section" id="scheduler-memory-queue">
<span id="std-setting-SCHEDULER_MEMORY_QUEUE"></span><span id="std:setting-SCHEDULER_MEMORY_QUEUE"></span><h3>SCHEDULER_MEMORY_QUEUE<a class="headerlink" href="#scheduler-memory-queue" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.squeues.LifoMemoryQueue'</span></code></p>
<p>Type of in-memory queue used by the scheduler. Other available type is: <code class="docutils literal notranslate"><span class="pre">scrapy.squeues.FifoMemoryQueue</span></code>.</p>
</div>
<div class="section" id="scheduler-priority-queue">
<span id="std-setting-SCHEDULER_PRIORITY_QUEUE"></span><span id="std:setting-SCHEDULER_PRIORITY_QUEUE"></span><h3>SCHEDULER_PRIORITY_QUEUE<a class="headerlink" href="#scheduler-priority-queue" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.pqueues.ScrapyPriorityQueue'</span></code></p>
<p>Type of priority queue used by the scheduler. Another available type is <code class="docutils literal notranslate"><span class="pre">scrapy.pqueues.DownloaderAwarePriorityQueue</span></code>. <code class="docutils literal notranslate"><span class="pre">scrapy.pqueues.DownloaderAwarePriorityQueue</span></code> works better than <code class="docutils literal notranslate"><span class="pre">scrapy.pqueues.ScrapyPriorityQueue</span></code> when you crawl many different domains in parallel. But currently <code class="docutils literal notranslate"><span class="pre">scrapy.pqueues.DownloaderAwarePriorityQueue</span></code> does not work together with <a class="reference internal" href="#std-setting-CONCURRENT_REQUESTS_PER_IP"><code class="xref std std-setting docutils literal notranslate"><span class="pre">CONCURRENT_REQUESTS_PER_IP</span></code></a>.</p>
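<p>As a sketch, a broad crawl over many domains in parallel might enable the downloader-aware queue like this (the concurrency value is only illustrative; leave <code class="docutils literal notranslate"><span class="pre">CONCURRENT_REQUESTS_PER_IP</span></code> unset, since it does not work with this queue):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># settings.py -- sketch for a broad multi-domain crawl
SCHEDULER_PRIORITY_QUEUE = 'scrapy.pqueues.DownloaderAwarePriorityQueue'
CONCURRENT_REQUESTS = 100  # illustrative value
</pre></div>
</div>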
</div>
<div class="section" id="scraper-slot-max-active-size">
<span id="std-setting-SCRAPER_SLOT_MAX_ACTIVE_SIZE"></span><span id="std:setting-SCRAPER_SLOT_MAX_ACTIVE_SIZE"></span><h3>SCRAPER_SLOT_MAX_ACTIVE_SIZE<a class="headerlink" href="#scraper-slot-max-active-size" title="永久链接至标题">¶</a></h3>
<div class="versionadded">
<p><span class="versionmodified added">2.0 新版功能.</span></p>
</div>
<p>Default: <code class="docutils literal notranslate"><span class="pre">5_000_000</span></code></p>
<p>Soft limit (in bytes) for response data under processing.</p>
<p>While the sum of the sizes of all responses being processed is above this value, Scrapy does not process new requests.</p>
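<p>For instance, raising the soft limit on a memory-rich machine could be done like this (the value is only illustrative):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># settings.py -- sketch: allow ~10 MB of in-flight response data
SCRAPER_SLOT_MAX_ACTIVE_SIZE = 10_000_000
</pre></div>
</div>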
</div>
<div class="section" id="spider-contracts">
<span id="std-setting-SPIDER_CONTRACTS"></span><span id="std:setting-SPIDER_CONTRACTS"></span><h3>SPIDER_CONTRACTS<a class="headerlink" href="#spider-contracts" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">{}</span></code></p>
<p>The dict containing the spider contracts enabled in your project, used for testing spiders. For more info see <a class="reference internal" href="contracts.html#topics-contracts"><span class="std std-ref">Spiders Contracts</span></a>.</p>
</div>
<div class="section" id="spider-contracts-base">
<span id="std-setting-SPIDER_CONTRACTS_BASE"></span><span id="std:setting-SPIDER_CONTRACTS_BASE"></span><h3>SPIDER_CONTRACTS_BASE<a class="headerlink" href="#spider-contracts-base" title="永久链接至标题">¶</a></h3>
<p>Default:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="p">{</span>
    <span class="s1">&#39;scrapy.contracts.default.UrlContract&#39;</span> <span class="p">:</span> <span class="mi">1</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.contracts.default.ReturnsContract&#39;</span><span class="p">:</span> <span class="mi">2</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.contracts.default.ScrapesContract&#39;</span><span class="p">:</span> <span class="mi">3</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
<p>A dict containing the Scrapy contracts enabled by default in Scrapy. You should never modify this setting in your project; modify <a class="reference internal" href="#std-setting-SPIDER_CONTRACTS"><code class="xref std std-setting docutils literal notranslate"><span class="pre">SPIDER_CONTRACTS</span></code></a> instead. For more info see <a class="reference internal" href="contracts.html#topics-contracts"><span class="std std-ref">Spiders Contracts</span></a>.</p>
<p>You can disable any of these contracts by assigning <code class="docutils literal notranslate"><span class="pre">None</span></code> to their class path in <a class="reference internal" href="#std-setting-SPIDER_CONTRACTS"><code class="xref std std-setting docutils literal notranslate"><span class="pre">SPIDER_CONTRACTS</span></code></a>. E.g., to disable the built-in <code class="docutils literal notranslate"><span class="pre">ScrapesContract</span></code>, place this in your <code class="docutils literal notranslate"><span class="pre">settings.py</span></code>:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SPIDER_CONTRACTS</span> <span class="o">=</span> <span class="p">{</span>
    <span class="s1">&#39;scrapy.contracts.default.ScrapesContract&#39;</span><span class="p">:</span> <span class="kc">None</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
</div>
<div class="section" id="spider-loader-class">
<span id="std-setting-SPIDER_LOADER_CLASS"></span><span id="std:setting-SPIDER_LOADER_CLASS"></span><h3>SPIDER_LOADER_CLASS<a class="headerlink" href="#spider-loader-class" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.spiderloader.SpiderLoader'</span></code></p>
<p>The class that will be used for loading spiders, which must implement the <a class="reference internal" href="api.html#topics-api-spiderloader"><span class="std std-ref">SpiderLoader API</span></a>.</p>
</div>
<div class="section" id="spider-loader-warn-only">
<span id="std-setting-SPIDER_LOADER_WARN_ONLY"></span><span id="std:setting-SPIDER_LOADER_WARN_ONLY"></span><h3>SPIDER_LOADER_WARN_ONLY<a class="headerlink" href="#spider-loader-warn-only" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">False</span></code></p>
<p>By default, when Scrapy tries to import spider classes from <a class="reference internal" href="#std-setting-SPIDER_MODULES"><code class="xref std std-setting docutils literal notranslate"><span class="pre">SPIDER_MODULES</span></code></a>, it will fail loudly if there is any <code class="docutils literal notranslate"><span class="pre">ImportError</span></code> exception. But you can choose to silence this exception and turn it into a simple warning by setting <code class="docutils literal notranslate"><span class="pre">SPIDER_LOADER_WARN_ONLY</span> <span class="pre">=</span> <span class="pre">True</span></code>.</p>
<div class="admonition note">
<p class="admonition-title">注解</p>
<p>一些 <a class="reference internal" href="commands.html#topics-commands"><span class="std std-ref">scrapy commands</span></a> 使用此设置运行到 <code class="docutils literal notranslate"><span class="pre">True</span></code> 已经（即，它们只会发出警告，不会失败），因为它们实际上不需要加载蜘蛛类来工作： <a class="reference internal" href="commands.html#std-command-runspider"><code class="xref std std-command docutils literal notranslate"><span class="pre">scrapy</span> <span class="pre">runspider</span></code></a> ， <a class="reference internal" href="commands.html#std-command-settings"><code class="xref std std-command docutils literal notranslate"><span class="pre">scrapy</span> <span class="pre">settings</span></code></a> ， <a class="reference internal" href="commands.html#std-command-startproject"><code class="xref std std-command docutils literal notranslate"><span class="pre">scrapy</span> <span class="pre">startproject</span></code></a> ， <a class="reference internal" href="commands.html#std-command-version"><code class="xref std std-command docutils literal notranslate"><span class="pre">scrapy</span> <span class="pre">version</span></code></a> .</p>
</div>
</div>
<div class="section" id="spider-middlewares">
<span id="std-setting-SPIDER_MIDDLEWARES"></span><span id="std:setting-SPIDER_MIDDLEWARES"></span><h3>SPIDER_MIDDLEWARES<a class="headerlink" href="#spider-middlewares" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">{}</span></code></p>
<p>A dict containing the spider middlewares enabled in your project, and their orders. For more info see <a class="reference internal" href="spider-middleware.html#topics-spider-middleware-setting"><span class="std std-ref">Activating a spider middleware</span></a>.</p>
</div>
<div class="section" id="spider-middlewares-base">
<span id="std-setting-SPIDER_MIDDLEWARES_BASE"></span><span id="std:setting-SPIDER_MIDDLEWARES_BASE"></span><h3>SPIDER_MIDDLEWARES_BASE<a class="headerlink" href="#spider-middlewares-base" title="永久链接至标题">¶</a></h3>
<p>Default:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="p">{</span>
    <span class="s1">&#39;scrapy.spidermiddlewares.httperror.HttpErrorMiddleware&#39;</span><span class="p">:</span> <span class="mi">50</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.spidermiddlewares.offsite.OffsiteMiddleware&#39;</span><span class="p">:</span> <span class="mi">500</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.spidermiddlewares.referer.RefererMiddleware&#39;</span><span class="p">:</span> <span class="mi">700</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.spidermiddlewares.urllength.UrlLengthMiddleware&#39;</span><span class="p">:</span> <span class="mi">800</span><span class="p">,</span>
    <span class="s1">&#39;scrapy.spidermiddlewares.depth.DepthMiddleware&#39;</span><span class="p">:</span> <span class="mi">900</span><span class="p">,</span>
<span class="p">}</span>
</pre></div>
</div>
<p>A dict containing the spider middlewares enabled by default in Scrapy, and their orders. Low orders are closer to the engine, high orders are closer to the spider. For more info see <a class="reference internal" href="spider-middleware.html#topics-spider-middleware-setting"><span class="std std-ref">Activating a spider middleware</span></a>.</p>
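<p>As with spider contracts, any of these built-in middlewares can be disabled by assigning <code class="docutils literal notranslate"><span class="pre">None</span></code> to its class path in <a class="reference internal" href="#std-setting-SPIDER_MIDDLEWARES"><code class="xref std std-setting docutils literal notranslate"><span class="pre">SPIDER_MIDDLEWARES</span></code></a>; for example, a sketch disabling the off-site filter:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># settings.py -- sketch: disable a default spider middleware
SPIDER_MIDDLEWARES = {
    'scrapy.spidermiddlewares.offsite.OffsiteMiddleware': None,
}
</pre></div>
</div>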
</div>
<div class="section" id="spider-modules">
<span id="std-setting-SPIDER_MODULES"></span><span id="std:setting-SPIDER_MODULES"></span><h3>SPIDER_MODULES<a class="headerlink" href="#spider-modules" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">[]</span></code></p>
<p>A list of modules where Scrapy will look for spiders.</p>
<p>Example:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="n">SPIDER_MODULES</span> <span class="o">=</span> <span class="p">[</span><span class="s1">&#39;mybot.spiders_prod&#39;</span><span class="p">,</span> <span class="s1">&#39;mybot.spiders_dev&#39;</span><span class="p">]</span>
</pre></div>
</div>
</div>
<div class="section" id="stats-class">
<span id="std-setting-STATS_CLASS"></span><span id="std:setting-STATS_CLASS"></span><h3>STATS_CLASS<a class="headerlink" href="#stats-class" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">'scrapy.statscollectors.MemoryStatsCollector'</span></code></p>
<p>The class to use for collecting stats, which must implement the <a class="reference internal" href="api.html#topics-api-stats"><span class="std std-ref">Stats Collector API</span></a>.</p>
</div>
<div class="section" id="stats-dump">
<span id="std-setting-STATS_DUMP"></span><span id="std:setting-STATS_DUMP"></span><h3>STATS_DUMP<a class="headerlink" href="#stats-dump" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">True</span></code></p>
<p>Dump the <a class="reference internal" href="stats.html#topics-stats"><span class="std std-ref">Scrapy stats</span></a> (to the Scrapy log) once the spider finishes.</p>
<p>For more info see: <a class="reference internal" href="stats.html#topics-stats"><span class="std std-ref">Stats Collection</span></a>.</p>
</div>
<div class="section" id="statsmailer-rcpts">
<span id="std-setting-STATSMAILER_RCPTS"></span><span id="std:setting-STATSMAILER_RCPTS"></span><h3>STATSMAILER_RCPTS<a class="headerlink" href="#statsmailer-rcpts" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">[]</span></code> (empty list)</p>
<p>Send Scrapy stats after spiders finish scraping. See <a class="reference internal" href="extensions.html#scrapy.extensions.statsmailer.StatsMailer" title="scrapy.extensions.statsmailer.StatsMailer"><code class="xref py py-class docutils literal notranslate"><span class="pre">StatsMailer</span></code></a> for more info.</p>
</div>
<div class="section" id="telnetconsole-enabled">
<span id="std-setting-TELNETCONSOLE_ENABLED"></span><span id="std:setting-TELNETCONSOLE_ENABLED"></span><h3>TELNETCONSOLE_ENABLED<a class="headerlink" href="#telnetconsole-enabled" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">True</span></code></p>
<p>A boolean which specifies if the <a class="reference internal" href="telnetconsole.html#topics-telnetconsole"><span class="std std-ref">telnet console</span></a> will be enabled (provided its extension is also enabled).</p>
</div>
<div class="section" id="templates-dir">
<span id="std-setting-TEMPLATES_DIR"></span><span id="std:setting-TEMPLATES_DIR"></span><h3>TEMPLATES_DIR<a class="headerlink" href="#templates-dir" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">templates</span></code> dir inside scrapy module</p>
<p>The directory where to look for templates when creating new projects with the <a class="reference internal" href="commands.html#std-command-startproject"><code class="xref std std-command docutils literal notranslate"><span class="pre">startproject</span></code></a> command and new spiders with the <a class="reference internal" href="commands.html#std-command-genspider"><code class="xref std std-command docutils literal notranslate"><span class="pre">genspider</span></code></a> command.</p>
<p>The project name must not conflict with the name of custom files or directories in the <code class="docutils literal notranslate"><span class="pre">project</span></code> subdirectory.</p>
</div>
<div class="section" id="twisted-reactor">
<span id="std-setting-TWISTED_REACTOR"></span><span id="std:setting-TWISTED_REACTOR"></span><h3>TWISTED_REACTOR<a class="headerlink" href="#twisted-reactor" title="永久链接至标题">¶</a></h3>
<div class="versionadded">
<p><span class="versionmodified added">2.0 新版功能.</span></p>
</div>
<p>Default: <code class="docutils literal notranslate"><span class="pre">None</span></code></p>
<p>The import path of a given <a class="reference external" href="https://twistedmatrix.com/documents/current/api/twisted.internet.reactor.html" title="(在 Twisted v2.0)"><code class="xref py py-mod docutils literal notranslate"><span class="pre">reactor</span></code></a>.</p>
<p>Scrapy will install this reactor if no other reactor is installed yet, such as when the <code class="docutils literal notranslate"><span class="pre">scrapy</span></code> CLI program is invoked or when using the <code class="xref py py-class docutils literal notranslate"><span class="pre">CrawlerProcess</span></code> class.</p>
<p>If you are using the <code class="xref py py-class docutils literal notranslate"><span class="pre">CrawlerRunner</span></code> class, you also need to install the correct reactor manually. You can do that using <a class="reference internal" href="#scrapy.utils.reactor.install_reactor" title="scrapy.utils.reactor.install_reactor"><code class="xref py py-func docutils literal notranslate"><span class="pre">install_reactor()</span></code></a>:</p>
<dl class="py function">
<dt id="scrapy.utils.reactor.install_reactor">
<code class="sig-prename descclassname">scrapy.utils.reactor.</code><code class="sig-name descname">install_reactor</code><span class="sig-paren">(</span><em class="sig-param"><span class="n">reactor_path</span></em>, <em class="sig-param"><span class="n">event_loop_path</span><span class="o">=</span><span class="default_value">None</span></em><span class="sig-paren">)</span><a class="reference internal" href="../_modules/scrapy/utils/reactor.html#install_reactor"><span class="viewcode-link">[源代码]</span></a><a class="headerlink" href="#scrapy.utils.reactor.install_reactor" title="永久链接至目标">¶</a></dt>
<dd><p>Installs the <a class="reference external" href="https://twistedmatrix.com/documents/current/api/twisted.internet.reactor.html" title="(在 Twisted v2.0)"><code class="xref py py-mod docutils literal notranslate"><span class="pre">reactor</span></code></a> with the specified import path. Also installs the asyncio event loop with the specified import path if the asyncio reactor is enabled.</p>
</dd></dl>

<p>If a reactor is already installed, <a class="reference internal" href="#scrapy.utils.reactor.install_reactor" title="scrapy.utils.reactor.install_reactor"><code class="xref py py-func docutils literal notranslate"><span class="pre">install_reactor()</span></code></a> has no effect.</p>
<p><code class="xref py py-meth docutils literal notranslate"><span class="pre">CrawlerRunner.__init__</span></code> raises <a class="reference external" href="https://docs.python.org/3/library/exceptions.html#Exception" title="(在 Python v3.9)"><code class="xref py py-exc docutils literal notranslate"><span class="pre">Exception</span></code></a> if the installed reactor does not match the <a class="reference internal" href="#std-setting-TWISTED_REACTOR"><code class="xref std std-setting docutils literal notranslate"><span class="pre">TWISTED_REACTOR</span></code></a> setting; therefore, having top-level <a class="reference external" href="https://twistedmatrix.com/documents/current/api/twisted.internet.reactor.html" title="(在 Twisted v2.0)"><code class="xref py py-mod docutils literal notranslate"><span class="pre">reactor</span></code></a> imports in project files and imported third-party libraries will make Scrapy raise <a class="reference external" href="https://docs.python.org/3/library/exceptions.html#Exception" title="(在 Python v3.9)"><code class="xref py py-exc docutils literal notranslate"><span class="pre">Exception</span></code></a> when it checks which reactor is installed.</p>
<p>In order to use the reactor installed by Scrapy:</p>
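<p>As a sketch, a <code class="xref py py-class docutils literal notranslate"><span class="pre">CrawlerRunner</span></code>-based script could install the asyncio reactor before the first <code class="docutils literal notranslate"><span class="pre">twisted.internet.reactor</span></code> import:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># sketch: install a non-default reactor manually, before importing it
from scrapy.utils.reactor import install_reactor

install_reactor('twisted.internet.asyncioreactor.AsyncioSelectorReactor')
from twisted.internet import reactor  # now resolves to the asyncio reactor
</pre></div>
</div>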
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">scrapy</span>
<span class="kn">from</span> <span class="nn">twisted.internet</span> <span class="kn">import</span> <span class="n">reactor</span>


<span class="k">class</span> <span class="nc">QuotesSpider</span><span class="p">(</span><span class="n">scrapy</span><span class="o">.</span><span class="n">Spider</span><span class="p">):</span>
    <span class="n">name</span> <span class="o">=</span> <span class="s1">&#39;quotes&#39;</span>

    <span class="k">def</span> <span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
        <span class="bp">self</span><span class="o">.</span><span class="n">timeout</span> <span class="o">=</span> <span class="nb">int</span><span class="p">(</span><span class="n">kwargs</span><span class="o">.</span><span class="n">pop</span><span class="p">(</span><span class="s1">&#39;timeout&#39;</span><span class="p">,</span> <span class="s1">&#39;60&#39;</span><span class="p">))</span>
        <span class="nb">super</span><span class="p">(</span><span class="n">QuotesSpider</span><span class="p">,</span> <span class="bp">self</span><span class="p">)</span><span class="o">.</span><span class="fm">__init__</span><span class="p">(</span><span class="o">*</span><span class="n">args</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">start_requests</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
        <span class="n">reactor</span><span class="o">.</span><span class="n">callLater</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">timeout</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">stop</span><span class="p">)</span>

        <span class="n">urls</span> <span class="o">=</span> <span class="p">[</span><span class="s1">&#39;http://quotes.toscrape.com/page/1&#39;</span><span class="p">]</span>
        <span class="k">for</span> <span class="n">url</span> <span class="ow">in</span> <span class="n">urls</span><span class="p">:</span>
            <span class="k">yield</span> <span class="n">scrapy</span><span class="o">.</span><span class="n">Request</span><span class="p">(</span><span class="n">url</span><span class="o">=</span><span class="n">url</span><span class="p">,</span> <span class="n">callback</span><span class="o">=</span><span class="bp">self</span><span class="o">.</span><span class="n">parse</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">parse</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">response</span><span class="p">):</span>
        <span class="k">for</span> <span class="n">quote</span> <span class="ow">in</span> <span class="n">response</span><span class="o">.</span><span class="n">css</span><span class="p">(</span><span class="s1">&#39;div.quote&#39;</span><span class="p">):</span>
            <span class="k">yield</span> <span class="p">{</span><span class="s1">&#39;text&#39;</span><span class="p">:</span> <span class="n">quote</span><span class="o">.</span><span class="n">css</span><span class="p">(</span><span class="s1">&#39;span.text::text&#39;</span><span class="p">)</span><span class="o">.</span><span class="n">get</span><span class="p">()}</span>

    <span class="k">def</span> <span class="nf">stop</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
        <span class="bp">self</span><span class="o">.</span><span class="n">crawler</span><span class="o">.</span><span class="n">engine</span><span class="o">.</span><span class="n">close_spider</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="s1">&#39;timeout&#39;</span><span class="p">)</span>
</pre></div>
</div>
<p>Which raises <a class="reference external" href="https://docs.python.org/3/library/exceptions.html#Exception" title="(在 Python v3.9)"><code class="xref py py-exc docutils literal notranslate"><span class="pre">Exception</span></code></a>, becomes:</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="kn">import</span> <span class="nn">scrapy</span>


<span class="k">class</span> <span class="nc">QuotesSpider</span><span class="p">(</span><span class="n">scrapy</span><span class="o">.</span><span class="n">Spider</span><span class="p">):</span>
    <span class="n">name</span> <span class="o">=</span> <span class="s1">&#39;quotes&#39;</span>

    <span class="k">def</span> <span class="fm">__init__</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="o">*</span><span class="n">args</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">):</span>
        <span class="bp">self</span><span class="o">.</span><span class="n">timeout</span> <span class="o">=</span> <span class="nb">int</span><span class="p">(</span><span class="n">kwargs</span><span class="o">.</span><span class="n">pop</span><span class="p">(</span><span class="s1">&#39;timeout&#39;</span><span class="p">,</span> <span class="s1">&#39;60&#39;</span><span class="p">))</span>
        <span class="nb">super</span><span class="p">(</span><span class="n">QuotesSpider</span><span class="p">,</span> <span class="bp">self</span><span class="p">)</span><span class="o">.</span><span class="fm">__init__</span><span class="p">(</span><span class="o">*</span><span class="n">args</span><span class="p">,</span> <span class="o">**</span><span class="n">kwargs</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">start_requests</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
        <span class="kn">from</span> <span class="nn">twisted.internet</span> <span class="kn">import</span> <span class="n">reactor</span>
        <span class="n">reactor</span><span class="o">.</span><span class="n">callLater</span><span class="p">(</span><span class="bp">self</span><span class="o">.</span><span class="n">timeout</span><span class="p">,</span> <span class="bp">self</span><span class="o">.</span><span class="n">stop</span><span class="p">)</span>

        <span class="n">urls</span> <span class="o">=</span> <span class="p">[</span><span class="s1">&#39;http://quotes.toscrape.com/page/1&#39;</span><span class="p">]</span>
        <span class="k">for</span> <span class="n">url</span> <span class="ow">in</span> <span class="n">urls</span><span class="p">:</span>
            <span class="k">yield</span> <span class="n">scrapy</span><span class="o">.</span><span class="n">Request</span><span class="p">(</span><span class="n">url</span><span class="o">=</span><span class="n">url</span><span class="p">,</span> <span class="n">callback</span><span class="o">=</span><span class="bp">self</span><span class="o">.</span><span class="n">parse</span><span class="p">)</span>

    <span class="k">def</span> <span class="nf">parse</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="n">response</span><span class="p">):</span>
        <span class="k">for</span> <span class="n">quote</span> <span class="ow">in</span> <span class="n">response</span><span class="o">.</span><span class="n">css</span><span class="p">(</span><span class="s1">&#39;div.quote&#39;</span><span class="p">):</span>
            <span class="k">yield</span> <span class="p">{</span><span class="s1">&#39;text&#39;</span><span class="p">:</span> <span class="n">quote</span><span class="o">.</span><span class="n">css</span><span class="p">(</span><span class="s1">&#39;span.text::text&#39;</span><span class="p">)</span><span class="o">.</span><span class="n">get</span><span class="p">()}</span>

    <span class="k">def</span> <span class="nf">stop</span><span class="p">(</span><span class="bp">self</span><span class="p">):</span>
        <span class="bp">self</span><span class="o">.</span><span class="n">crawler</span><span class="o">.</span><span class="n">engine</span><span class="o">.</span><span class="n">close_spider</span><span class="p">(</span><span class="bp">self</span><span class="p">,</span> <span class="s1">&#39;timeout&#39;</span><span class="p">)</span>
</pre></div>
</div>
<p>The default value of the <a class="reference internal" href="#std-setting-TWISTED_REACTOR"><code class="xref std std-setting docutils literal notranslate"><span class="pre">TWISTED_REACTOR</span></code></a> setting is <code class="docutils literal notranslate"><span class="pre">None</span></code>, which means that Scrapy will not attempt to install any specific reactor, and the default reactor defined by Twisted for the current platform will be used. This is to maintain backward compatibility and avoid possible problems caused by using a non-default reactor.</p>
<p>For additional information, see <a class="reference external" href="https://twistedmatrix.com/documents/current/core/howto/choosing-reactor.html" title="(在 Twisted v20.3)"><span>Choosing a Reactor and GUI Toolkit Integration</span></a>.</p>
</div>
<div class="section" id="urllength-limit">
<span id="std-setting-URLLENGTH_LIMIT"></span><span id="std:setting-URLLENGTH_LIMIT"></span><h3>URLLENGTH_LIMIT<a class="headerlink" href="#urllength-limit" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">2083</span></code></p>
<p>Scope: <code class="docutils literal notranslate"><span class="pre">spidermiddlewares.urllength</span></code></p>
<p>The maximum URL length to allow for crawled URLs. For more information about the default value for this setting see: <a class="reference external" href="https://boutell.com/newfaq/misc/urllength.html">https://boutell.com/newfaq/misc/urllength.html</a></p>
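<p>Sites relying on very long URLs may need a larger limit; a sketch (the value is only illustrative):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># settings.py -- sketch: allow URLs longer than the 2083-character default
URLLENGTH_LIMIT = 5000
</pre></div>
</div>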
</div>
<div class="section" id="user-agent">
<span id="std-setting-USER_AGENT"></span><span id="std:setting-USER_AGENT"></span><h3>USER_AGENT<a class="headerlink" href="#user-agent" title="永久链接至标题">¶</a></h3>
<p>Default: <code class="docutils literal notranslate"><span class="pre">&quot;Scrapy/VERSION</span> <span class="pre">(+https://scrapy.org)&quot;</span></code></p>
<p>The default User-Agent to use when crawling, unless overridden. This user agent is also used by <a class="reference internal" href="downloader-middleware.html#scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware" title="scrapy.downloadermiddlewares.robotstxt.RobotsTxtMiddleware"><code class="xref py py-class docutils literal notranslate"><span class="pre">RobotsTxtMiddleware</span></code></a> if the <a class="reference internal" href="#std-setting-ROBOTSTXT_USER_AGENT"><code class="xref std std-setting docutils literal notranslate"><span class="pre">ROBOTSTXT_USER_AGENT</span></code></a> setting is <code class="docutils literal notranslate"><span class="pre">None</span></code> and there is no overriding User-Agent header specified for the request.</p>
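<p>For example (the user agent string below is a placeholder, not a recommendation):</p>
<div class="highlight-python notranslate"><div class="highlight"><pre><span></span># settings.py -- sketch: identify the crawler with a custom user agent
USER_AGENT = 'examplebot (+https://www.example.com/bot-info)'
</pre></div>
</div>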
</div>
<div class="section" id="settings-documented-elsewhere">
<h3>Settings documented elsewhere:<a class="headerlink" href="#settings-documented-elsewhere" title="永久链接至标题">¶</a></h3>
<p>The following settings are documented elsewhere, please check each specific case to see how to enable and use them.</p>
<ul class="simple">
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-AJAXCRAWL_ENABLED">AJAXCRAWL_ENABLED</a></p></li>
<li><p><a class="reference internal" href="autothrottle.html#std-setting-AUTOTHROTTLE_DEBUG">AUTOTHROTTLE_DEBUG</a></p></li>
<li><p><a class="reference internal" href="autothrottle.html#std-setting-AUTOTHROTTLE_ENABLED">AUTOTHROTTLE_ENABLED</a></p></li>
<li><p><a class="reference internal" href="autothrottle.html#std-setting-AUTOTHROTTLE_MAX_DELAY">AUTOTHROTTLE_MAX_DELAY</a></p></li>
<li><p><a class="reference internal" href="autothrottle.html#std-setting-AUTOTHROTTLE_START_DELAY">AUTOTHROTTLE_START_DELAY</a></p></li>
<li><p><a class="reference internal" href="autothrottle.html#std-setting-AUTOTHROTTLE_TARGET_CONCURRENCY">AUTOTHROTTLE_TARGET_CONCURRENCY</a></p></li>
<li><p><a class="reference internal" href="extensions.html#std-setting-CLOSESPIDER_ERRORCOUNT">CLOSESPIDER_ERRORCOUNT</a></p></li>
<li><p><a class="reference internal" href="extensions.html#std-setting-CLOSESPIDER_ITEMCOUNT">CLOSESPIDER_ITEMCOUNT</a></p></li>
<li><p><a class="reference internal" href="extensions.html#std-setting-CLOSESPIDER_PAGECOUNT">CLOSESPIDER_PAGECOUNT</a></p></li>
<li><p><a class="reference internal" href="extensions.html#std-setting-CLOSESPIDER_TIMEOUT">CLOSESPIDER_TIMEOUT</a></p></li>
<li><p><a class="reference internal" href="commands.html#std-setting-COMMANDS_MODULE">COMMANDS_MODULE</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-COMPRESSION_ENABLED">COMPRESSION_ENABLED</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-COOKIES_DEBUG">COOKIES_DEBUG</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-COOKIES_ENABLED">COOKIES_ENABLED</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEEDS">FEEDS</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_EXPORTERS">FEED_EXPORTERS</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_EXPORTERS_BASE">FEED_EXPORTERS_BASE</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_EXPORT_BATCH_ITEM_COUNT">FEED_EXPORT_BATCH_ITEM_COUNT</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_EXPORT_ENCODING">FEED_EXPORT_ENCODING</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_EXPORT_FIELDS">FEED_EXPORT_FIELDS</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_EXPORT_INDENT">FEED_EXPORT_INDENT</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_STORAGES">FEED_STORAGES</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_STORAGES_BASE">FEED_STORAGES_BASE</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_STORAGE_FTP_ACTIVE">FEED_STORAGE_FTP_ACTIVE</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_STORAGE_S3_ACL">FEED_STORAGE_S3_ACL</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_STORE_EMPTY">FEED_STORE_EMPTY</a></p></li>
<li><p><a class="reference internal" href="feed-exports.html#std-setting-FEED_URI_PARAMS">FEED_URI_PARAMS</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-FILES_EXPIRES">FILES_EXPIRES</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-FILES_RESULT_FIELD">FILES_RESULT_FIELD</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-FILES_STORE">FILES_STORE</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-FILES_STORE_GCS_ACL">FILES_STORE_GCS_ACL</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-FILES_STORE_S3_ACL">FILES_STORE_S3_ACL</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-FILES_URLS_FIELD">FILES_URLS_FIELD</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_ALWAYS_STORE">HTTPCACHE_ALWAYS_STORE</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_DBM_MODULE">HTTPCACHE_DBM_MODULE</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_DIR">HTTPCACHE_DIR</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_ENABLED">HTTPCACHE_ENABLED</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_EXPIRATION_SECS">HTTPCACHE_EXPIRATION_SECS</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_GZIP">HTTPCACHE_GZIP</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_IGNORE_HTTP_CODES">HTTPCACHE_IGNORE_HTTP_CODES</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_IGNORE_MISSING">HTTPCACHE_IGNORE_MISSING</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_IGNORE_RESPONSE_CACHE_CONTROLS">HTTPCACHE_IGNORE_RESPONSE_CACHE_CONTROLS</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_IGNORE_SCHEMES">HTTPCACHE_IGNORE_SCHEMES</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_POLICY">HTTPCACHE_POLICY</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPCACHE_STORAGE">HTTPCACHE_STORAGE</a></p></li>
<li><p><a class="reference internal" href="spider-middleware.html#std-setting-HTTPERROR_ALLOWED_CODES">HTTPERROR_ALLOWED_CODES</a></p></li>
<li><p><a class="reference internal" href="spider-middleware.html#std-setting-HTTPERROR_ALLOW_ALL">HTTPERROR_ALLOW_ALL</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPPROXY_AUTH_ENCODING">HTTPPROXY_AUTH_ENCODING</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-HTTPPROXY_ENABLED">HTTPPROXY_ENABLED</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-IMAGES_EXPIRES">IMAGES_EXPIRES</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-IMAGES_MIN_HEIGHT">IMAGES_MIN_HEIGHT</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-IMAGES_MIN_WIDTH">IMAGES_MIN_WIDTH</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-IMAGES_RESULT_FIELD">IMAGES_RESULT_FIELD</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-IMAGES_STORE">IMAGES_STORE</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-IMAGES_STORE_GCS_ACL">IMAGES_STORE_GCS_ACL</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-IMAGES_STORE_S3_ACL">IMAGES_STORE_S3_ACL</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-IMAGES_THUMBS">IMAGES_THUMBS</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-IMAGES_URLS_FIELD">IMAGES_URLS_FIELD</a></p></li>
<li><p><a class="reference internal" href="email.html#std-setting-MAIL_FROM">MAIL_FROM</a></p></li>
<li><p><a class="reference internal" href="email.html#std-setting-MAIL_HOST">MAIL_HOST</a></p></li>
<li><p><a class="reference internal" href="email.html#std-setting-MAIL_PASS">MAIL_PASS</a></p></li>
<li><p><a class="reference internal" href="email.html#std-setting-MAIL_PORT">MAIL_PORT</a></p></li>
<li><p><a class="reference internal" href="email.html#std-setting-MAIL_SSL">MAIL_SSL</a></p></li>
<li><p><a class="reference internal" href="email.html#std-setting-MAIL_TLS">MAIL_TLS</a></p></li>
<li><p><a class="reference internal" href="email.html#std-setting-MAIL_USER">MAIL_USER</a></p></li>
<li><p><a class="reference internal" href="media-pipeline.html#std-setting-MEDIA_ALLOW_REDIRECTS">MEDIA_ALLOW_REDIRECTS</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-METAREFRESH_ENABLED">METAREFRESH_ENABLED</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-METAREFRESH_IGNORE_TAGS">METAREFRESH_IGNORE_TAGS</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-METAREFRESH_MAXDELAY">METAREFRESH_MAXDELAY</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-REDIRECT_ENABLED">REDIRECT_ENABLED</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-REDIRECT_MAX_TIMES">REDIRECT_MAX_TIMES</a></p></li>
<li><p><a class="reference internal" href="spider-middleware.html#std-setting-REFERER_ENABLED">REFERER_ENABLED</a></p></li>
<li><p><a class="reference internal" href="spider-middleware.html#std-setting-REFERRER_POLICY">REFERRER_POLICY</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-RETRY_ENABLED">RETRY_ENABLED</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-RETRY_HTTP_CODES">RETRY_HTTP_CODES</a></p></li>
<li><p><a class="reference internal" href="downloader-middleware.html#std-setting-RETRY_TIMES">RETRY_TIMES</a></p></li>
<li><p><a class="reference internal" href="telnetconsole.html#std-setting-TELNETCONSOLE_HOST">TELNETCONSOLE_HOST</a></p></li>
<li><p><a class="reference internal" href="telnetconsole.html#std-setting-TELNETCONSOLE_PASSWORD">TELNETCONSOLE_PASSWORD</a></p></li>
<li><p><a class="reference internal" href="telnetconsole.html#std-setting-TELNETCONSOLE_PORT">TELNETCONSOLE_PORT</a></p></li>
<li><p><a class="reference internal" href="telnetconsole.html#std-setting-TELNETCONSOLE_USERNAME">TELNETCONSOLE_USERNAME</a></p></li>
</ul>
</div>
</div>
</div>


           </div>
           
          </div>
          <footer>
  
    <div class="rst-footer-buttons" role="navigation" aria-label="footer navigation">
      
        <a href="exceptions.html" class="btn btn-neutral float-right" title="例外情况" accesskey="n" rel="next">Next <span class="fa fa-arrow-circle-right"></span></a>
      
      
        <a href="link-extractors.html" class="btn btn-neutral float-left" title="链接提取器" accesskey="p" rel="prev"><span class="fa fa-arrow-circle-left"></span> Previous</a>
      
    </div>
  

  <hr/>

  <div role="contentinfo">
    <p>
        
        &copy; Copyright 2008–2020, Scrapy developers
      <span class="lastupdated">
        Last updated on Oct 18, 2020.
      </span>

    </p>
  </div>
    
    
    
    Built with <a href="http://sphinx-doc.org/">Sphinx</a> using a
    
    <a href="https://github.com/rtfd/sphinx_rtd_theme">theme</a>
    
    provided by <a href="https://readthedocs.org">Read the Docs</a>. 

</footer>

        </div>
      </div>

    </section>

  </div>
  

  <script type="text/javascript">
      jQuery(function () {
          SphinxRtdTheme.Navigation.enable(true);
      });
  </script>

  
  
    
  
 
<script type="text/javascript">
!function(){var analytics=window.analytics=window.analytics||[];if(!analytics.initialize)if(analytics.invoked)window.console&&console.error&&console.error("Segment snippet included twice.");else{analytics.invoked=!0;analytics.methods=["trackSubmit","trackClick","trackLink","trackForm","pageview","identify","reset","group","track","ready","alias","page","once","off","on"];analytics.factory=function(t){return function(){var e=Array.prototype.slice.call(arguments);e.unshift(t);analytics.push(e);return analytics}};for(var t=0;t<analytics.methods.length;t++){var e=analytics.methods[t];analytics[e]=analytics.factory(e)}analytics.load=function(t){var e=document.createElement("script");e.type="text/javascript";e.async=!0;e.src=("https:"===document.location.protocol?"https://":"http://")+"cdn.segment.com/analytics.js/v1/"+t+"/analytics.min.js";var n=document.getElementsByTagName("script")[0];n.parentNode.insertBefore(e,n)};analytics.SNIPPET_VERSION="3.1.0";
analytics.load("8UDQfnf3cyFSTsM4YANnW5sXmgZVILbA");
analytics.page();
}}();

analytics.ready(function () {
    ga('require', 'linker');
    ga('linker:autoLink', ['scrapinghub.com', 'crawlera.com']);
});
</script>


</body>
</html>