<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.1//EN"
  "http://www.w3.org/TR/xhtml11/DTD/xhtml11.dtd">

<html xmlns="http://www.w3.org/1999/xhtml">
<head><title>Quick Start</title>
<link href="../style/ebook.css" type="text/css" rel="stylesheet" />
</head>
<body>
<h1>Quick Start</h1>
<p>The installation is quick and straightforward.</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span><span class="c1"># airflow needs a home, ~/airflow is the default,</span>
<span class="c1"># but you can lay foundation somewhere else if you prefer</span>
<span class="c1"># (optional)</span>
<span class="nb">export</span> <span class="nv">AIRFLOW_HOME</span><span class="o">=</span>~/airflow

<span class="c1"># install from pypi using pip</span>
pip install apache-airflow

<span class="c1"># initialize the database</span>
airflow initdb

<span class="c1"># start the web server, default port is 8080</span>
airflow webserver -p <span class="m">8080</span>

<span class="c1"># start the scheduler</span>
airflow scheduler

<span class="c1"># visit localhost:8080 in the browser and enable the example dag on the home page</span>
</pre>
</div>
</div>
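<p>Before going further, you can confirm the install worked and see which version you got (the exact output depends on your environment):</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span># print the installed Airflow version
airflow version
</pre>
</div>
</div>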
<p>Upon running these commands, Airflow will create the <code class="docutils literal notranslate"><span class="pre">$AIRFLOW_HOME</span></code> folder
and generate an &#x201C;airflow.cfg&#x201D; file with defaults that get you going fast. You can
inspect the file either in <code class="docutils literal notranslate"><span class="pre">$AIRFLOW_HOME/airflow.cfg</span></code>, or through the UI in
the <code class="docutils literal notranslate"><span class="pre">Admin-&gt;Configuration</span></code> menu. The PID file for the webserver will be stored
in <code class="docutils literal notranslate"><span class="pre">$AIRFLOW_HOME/airflow-webserver.pid</span></code>, or in <code class="docutils literal notranslate"><span class="pre">/run/airflow/webserver.pid</span></code>
if started by systemd.</p>
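<p>As a quick sanity check, you can confirm the file exists and inspect a setting from the shell (a minimal sketch; the values shown depend on your installation):</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span># confirm the config file was created
ls "$AIRFLOW_HOME/airflow.cfg"

# show which metadata database Airflow is pointed at
grep "sql_alchemy_conn" "$AIRFLOW_HOME/airflow.cfg"
</pre>
</div>
</div>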
<p>Out of the box, Airflow uses an SQLite database, which you should outgrow
fairly quickly since this backend allows no parallelization. SQLite works in
conjunction with the <code class="docutils literal notranslate"><span class="pre">SequentialExecutor</span></code>, which runs task instances one at a
time. While this combination is very limiting, it lets you get up and running
quickly and take a tour of the UI and the command line utilities.</p>
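<p>Both of these defaults live in the <code class="docutils literal notranslate"><span class="pre">[core]</span></code> section of <code class="docutils literal notranslate"><span class="pre">airflow.cfg</span></code>. A sketch of the stock settings (the generated connection string will use your own home path):</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span># excerpt from $AIRFLOW_HOME/airflow.cfg (generated defaults; paths will differ)
[core]
executor = SequentialExecutor
sql_alchemy_conn = sqlite:////home/user/airflow/airflow.db
</pre>
</div>
</div>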
<p>Here are a few commands that will trigger a few task instances. You should
be able to see the status of the jobs change in the <code class="docutils literal notranslate"><span class="pre">example_bash_operator</span></code> DAG as you
run the commands below.</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span><span class="c1"># run your first task instance</span>
airflow run example_bash_operator runme_0 <span class="m">2015</span>-01-01
<span class="c1"># run a backfill over 2 days</span>
airflow backfill example_bash_operator -s <span class="m">2015</span>-01-01 -e <span class="m">2015</span>-01-02
</pre>
</div>
</div>
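<p>If you would rather explore before triggering more work, the CLI can also list what is available and exercise a single task in isolation (a sketch using the classic command names this page assumes; <code class="docutils literal notranslate"><span class="pre">airflow</span> <span class="pre">test</span></code> runs a task without checking dependencies or recording state in the database):</p>
<div class="highlight-bash notranslate"><div class="highlight"><pre><span></span># list the DAGs Airflow knows about
airflow list_dags

# list the tasks inside the example DAG
airflow list_tasks example_bash_operator

# run a single task instance without recording state in the database
airflow test example_bash_operator runme_0 2015-01-01
</pre>
</div>
</div>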
<div class="section" id="what-s-next">
<h2 class="sigil_not_in_toc">What&#x2019;s Next?</h2>
<p>From this point, you can head to the <a class="reference internal" href="tutorial.html"><span class="doc">Tutorial</span></a> section for further examples or the <a class="reference internal" href="howto/index.html"><span class="doc">How-to Guides</span></a> section if you&#x2019;re ready to get your hands dirty.</p>
</div>
</body>
</html>