<!doctype html>
<html lang="en">
<head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
    <meta name="twitter:description" content="NATTEN: Neighborhood Attention Extension. Fast and efficient sliding window attention is only a pip install away!">
    <meta name="description" content="NATTEN: Neighborhood Attention Extension. Fast and efficient sliding window attention is only a pip install away!">
    <meta property="og:type" content="website">
    <meta name="twitter:title" content="NATTEN">
    <meta property="og:description" content="NATTEN: Neighborhood Attention Extension. Fast and efficient sliding window attention is only a pip install away!">
    <title>NATTEN</title>
    <link rel="stylesheet" href="assets/css/bootstrap.min.css">
    <link rel="stylesheet" href="assets/css/style.min.css">
    <link rel="stylesheet" href="assets/css/main.css">
    <link rel="icon" type="image/x-icon" href="favicon.ico">
</head>

<body>
<div class="container mt-4">
    <div class="row align-items-center">
        <div class="col-lg-12 col-md-12 col-sm-12 text-center">
                <div class="logo"><h1>NATTEN</h1></div>
            <p class="author">Neighborhood Attention Extension</p>
            <p class="author">Bringing attention to a neighborhood near you!</p>
            <p class="abstract">
               NATTEN is a PyTorch extension implementing
               <a href="https://arxiv.org/abs/2204.07143">Neighborhood Attention</a> (local attention)
               and <a href="https://arxiv.org/abs/2209.15001">Dilated Neighborhood Attention</a> 
               (sparse global attention, a.k.a. dilated local attention) as PyTorch modules and ops
               for 1D, 2D, and 3D data.
            </p>
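<p>To make the idea concrete, here is a stdlib-only Python sketch of what 1D neighborhood attention computes on scalar tokens. This is toy code for illustration only, not NATTEN's API or kernels (NATTEN operates on batched, multi-head tensors):</p>

```python
import math

def neighborhood_attention_1d(q, k, v, kernel_size=3, dilation=1):
    """Toy 1D neighborhood attention over scalar tokens.

    Each query attends to the kernel_size keys in its (dilated)
    neighborhood rather than to all n keys, and the window is shifted
    near the borders so every query sees exactly kernel_size tokens.
    """
    n = len(q)
    half = kernel_size // 2
    out = []
    for i in range(n):
        # Clamp the window center so the whole window stays in bounds.
        center = min(max(i, half * dilation), n - 1 - half * dilation)
        idxs = [center + (j - half) * dilation for j in range(kernel_size)]
        scores = [q[i] * k[j] for j in idxs]
        # Numerically stable softmax over the window only.
        m = max(scores)
        weights = [math.exp(s - m) for s in scores]
        z = sum(weights)
        out.append(sum(w / z * v[j] for w, j in zip(weights, idxs)))
    return out

# With constant keys the attention weights are uniform, so each
# output is (approximately) the mean of its 3-token neighborhood.
print(neighborhood_attention_1d([1.0] * 5, [0.0] * 5, [1.0, 2.0, 3.0, 4.0, 5.0]))
```

<p>Setting dilation &gt; 1 spreads the same window over a wider span, which is what gives dilated neighborhood attention its sparse-global receptive field at no extra cost per query.</p>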
            <p class="abstract">
              Start using our new <a href="https://arxiv.org/abs/2403.04690">Fused Neighborhood Attention</a> implementation
              today!
            </p>
            <p class="text-center">
                <a href="https://github.com/SHI-Labs/NATTEN">GitHub</a> /
                <a href="https://pypi.org/project/natten/">PyPI</a>
            </p>
            <p class="text-center">
                <a href="https://github.com/SHI-Labs/Neighborhood-Attention-Transformer">Neighborhood Attention Transformers</a>
            </p>
        </div>
    </div>
</div>
<div class="container news mt-4">
    <div class="title row">
        <div class="col-lg-12">
            <p class="h2">Install with pip</p>
            <p>Latest release: <code>0.17.5</code></p>
        </div>
    </div>
    <div class="row">
        <div class="col-lg-12">
          <p>Select your PyTorch version and the matching CUDA build, or the CPU build if you're not using CUDA:</p>
          <p></p>
          <ul class="nav nav-tabs" id="torchTab" role="tablist">
              <li class="nav-item" role="presentation">
                <a class="nav-link disabled" href="#" tabindex="-1" aria-disabled="true">PyTorch:</a>
              </li>
              <li class="nav-item" role="presentation">
                <button class="nav-link active" id="tab26" data-bs-toggle="tab" data-bs-target="#v260" type="button" role="tab" aria-controls="v260" aria-selected="true">2.6.X</button>
              </li>
              <li class="nav-item" role="presentation">
                <button class="nav-link" id="tab25" data-bs-toggle="tab" data-bs-target="#v250" type="button" role="tab" aria-controls="v250" aria-selected="false">2.5.X</button>
              </li>
          </ul>
          <div class="tab-content" id="torchTabContent">
              <div class="tab-pane fade show active" id="v260" role="tabpanel" aria-labelledby="tab26">
                <div class="d-flex align-items-start">
                  <div class="nav flex-column nav-pills me-3" id="v260-tab" role="tablist" aria-orientation="vertical">
                    <button class="nav-link active" id="v260-cu126-tab" data-bs-toggle="pill"
                      data-bs-target="#v260-126" type="button" role="tab" aria-controls="v260-126" aria-selected="true">CUDA 12.6</button>
                    <button class="nav-link" id="v260-cu124-tab" data-bs-toggle="pill" data-bs-target="#v260-124" type="button" role="tab" aria-controls="v260-124" aria-selected="false">CUDA 12.4</button>
                    <button class="nav-link" id="v260-cpu-tab" data-bs-toggle="pill" data-bs-target="#v260-cpu" type="button" role="tab" aria-controls="v260-cpu" aria-selected="false">CPU</button>
                  </div>
                  <div class="tab-content" id="v260-tabContent">
                    <div class="tab-pane fade show active" id="v260-126" role="tabpanel" aria-labelledby="v260-cu126-tab">
                      <p>Run this command:</p>
                      <pre>pip3 install natten==0.17.5+torch260cu126 -f https://shi-labs.com/natten/wheels/</pre>
                    </div>
                    <div class="tab-pane fade" id="v260-124" role="tabpanel" aria-labelledby="v260-cu124-tab">
                      <p>Run this command:</p>
                      <pre>pip3 install natten==0.17.5+torch260cu124 -f https://shi-labs.com/natten/wheels/</pre>
                    </div>
                    <div class="tab-pane fade" id="v260-cpu" role="tabpanel" aria-labelledby="v260-cpu-tab">
                      <p>Run this command:</p>
                      <pre>pip3 install natten==0.17.5+torch260cpu -f https://shi-labs.com/natten/wheels/</pre>
                    </div>
                  </div>
                </div>
              </div>
              <div class="tab-pane fade" id="v250" role="tabpanel" aria-labelledby="tab25">
                <div class="d-flex align-items-start">
                  <div class="nav flex-column nav-pills me-3" id="v250-tab" role="tablist" aria-orientation="vertical">
                    <button class="nav-link active" id="v250-cu124-tab" data-bs-toggle="pill"
                      data-bs-target="#v250-124" type="button" role="tab" aria-controls="v250-124" aria-selected="true">CUDA 12.4</button>
                    <button class="nav-link" id="v250-cu121-tab" data-bs-toggle="pill" data-bs-target="#v250-121" type="button" role="tab" aria-controls="v250-121" aria-selected="false">CUDA 12.1</button>
                    <button class="nav-link" id="v250-cpu-tab" data-bs-toggle="pill" data-bs-target="#v250-cpu" type="button" role="tab" aria-controls="v250-cpu" aria-selected="false">CPU</button>
                  </div>
                  <div class="tab-content" id="v250-tabContent">
                    <div class="tab-pane fade show active" id="v250-124" role="tabpanel" aria-labelledby="v250-cu124-tab">
                      <p>Run this command:</p>
                      <pre>pip3 install natten==0.17.5+torch250cu124 -f https://shi-labs.com/natten/wheels/</pre>
                    </div>
                    <div class="tab-pane fade" id="v250-121" role="tabpanel" aria-labelledby="v250-cu121-tab">
                      <p>Run this command:</p>
                      <pre>pip3 install natten==0.17.5+torch250cu121 -f https://shi-labs.com/natten/wheels/</pre>
                    </div>
                    <div class="tab-pane fade" id="v250-cpu" role="tabpanel" aria-labelledby="v250-cpu-tab">
                      <p>Run this command:</p>
                      <pre>pip3 install natten==0.17.5+torch250cpu -f https://shi-labs.com/natten/wheels/</pre>
                    </div>
                  </div>
                </div>
              </div>
          </div>
          <p>Your build isn't listed? Mac user? Just run: <pre class="tight">pip install natten==0.17.5</pre></p>
          <p>Note that without pre-compiled wheels, installation can take a while, since NATTEN
          kernels will be compiled on your device. In that case, compiler tools are required
          as well.</p>
          <p>If you're using an NVIDIA GPU, you also need CUDA Toolkit &gt; 11.7, CMake &gt; 3.20, and PyTorch &gt; 2.0 installed before attempting to install/build NATTEN.</p>
          <p>NATTEN does not have pre-compiled wheels for Windows, but you can try 
            <a href="https://github.com/SHI-Labs/NATTEN/blob/main/docs/install.md#windows">building from
            source</a>.</p> 
          <p>For more information, please refer to <a href="https://github.com/SHI-Labs/NATTEN/blob/main/docs/install.md">our docs</a>.</p>
          <p></p>
          <h3>CUDA help</h3>
          <p>Don't know your torch/CUDA version?
          Run this: <pre class="tight">python3 -c "import torch; print(torch.__version__)"</pre></p>
          <p><em>Note: the CUDA version above refers to the version of the compiler that compiled your torch build,
              not the actual version of the CUDA toolkit you may have installed locally.</em></p>
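<p>The version string printed above encodes both the torch release and the CUDA build (e.g. <code>2.6.0+cu126</code>). Here is a small stdlib-only sketch that maps such a string to a wheel tag in the format used by the install commands above. The set of tags that actually exist is an assumption; check the wheel index before relying on it:</p>

```python
def natten_wheel_tag(torch_version: str) -> str:
    """Map a torch.__version__ string such as '2.6.0+cu126' to a
    NATTEN wheel tag such as 'torch260cu126'.

    The tag format is inferred from the install commands above;
    check https://shi-labs.com/natten/wheels/ for available tags.
    """
    base, _, build = torch_version.partition("+")
    major, minor = base.split(".")[:2]
    torch_tag = f"torch{major}{minor}0"  # wheels track minor releases (2.6.X -> torch260)
    # A missing '+cuXYZ' suffix usually means a CPU build, but not
    # always -- double-check with torch.version.cuda if unsure.
    backend = build if build else "cpu"
    return torch_tag + backend

print(natten_wheel_tag("2.6.0+cu126"))  # torch260cu126
print(natten_wheel_tag("2.5.1"))        # torch250cpu
```

<p>The result plugs straight into <code>pip3 install natten==0.17.5+&lt;tag&gt; -f https://shi-labs.com/natten/wheels/</code>.</p>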
          <p>Run this to check if you have the CUDA compiler: <pre class="tight">which nvcc</pre></p>
          <p>and if you do, run this to check the version: <pre class="tight">nvcc --version</pre></p>
          <p>If you don't have the CUDA toolkit, and just want to know which torch and NATTEN build is
          best for you, check your driver version with: <pre class="tight">nvidia-smi</pre></p>
          <p>Once you know your driver version,
            <a href="https://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html#id5">match it with the corresponding CUDA toolkit version</a>.</p>
          <p></p>
          <p></p>
        </div>
    </div>
</div>
<div class="container news mt-4">
    <div class="title row">
        <div class="col-lg-12">
            <p class="h2">Quick links</p>
        </div>
        <div class="col-lg-12">
          <p><a href="https://github.com/SHI-Labs/NATTEN/blob/main/docs/install.md">Install / build from source guide</a></p>
          <p><a href="https://github.com/SHI-Labs/NATTEN/tree/main/docs/fna">Fused neighborhood attention quickstart</a></p>
          <p><a href="https://github.com/SHI-Labs/NATTEN/tree/main/docs/">NATTEN docs</a></p>
        </div>
    </div>
</div>
<div class="container news mt-4">
    <div class="title row">
        <div class="col-lg-12">
            <p class="h2">Learn more about neighborhood attention</p>
        </div>
        <div class="col-lg-12">
          <p><a href="https://www.youtube.com/watch?v=Ya4BfioxIHA">Presentation recording: Neighborhood Attention (CVPR 2023)</a></p>
          <p><a href="https://www.youtube.com/watch?v=RCnlwtt7fUw">Presentation recording: Fused Neighborhood Attention (NeurIPS 2024)</a></p>
        </div>
    </div>
</div>
<div class="container citation mt-4">
    <div class="title row">
        <div class="col-lg-12">
            <p class="h2">Citation</p>
        </div>
        <div class="col-lg-12">
            <p>Please consider citing our most recent work on NATTEN:</p>
            <p></p>
            <p>
<pre><code>@inproceedings{hassani2024faster,
  title        = {Faster Neighborhood Attention: Reducing the O(n^2) Cost of Self Attention at the Threadblock Level},
  author       = {Ali Hassani and Wen-Mei Hwu and Humphrey Shi},
  year         = 2024,
  booktitle    = {Advances in Neural Information Processing Systems},
}</code></pre></p>
            <p></p>
            <p>and the original Neighborhood Attention Transformer papers:</p>
            <p></p>
            <p>
<pre><code>@inproceedings{hassani2023neighborhood,
  title        = {Neighborhood Attention Transformer},
  author       = {Ali Hassani and Steven Walton and Jiachen Li and Shen Li and Humphrey Shi},
  year         = 2023,
  booktitle    = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)}
}
@misc{hassani2022dilated,
  title        = {Dilated Neighborhood Attention Transformer},
  author       = {Ali Hassani and Humphrey Shi},
  year         = 2022,
  url          = {https://arxiv.org/abs/2209.15001},
  eprint       = {2209.15001},
  archiveprefix = {arXiv},
  primaryclass = {cs.CV}
}</code></pre></p>
        </div>
    </div>
</div>
<footer class="footer text-center">
</footer>
<div class="copyright py-4 text-center mt-5">
    <div class="container"><small>Created with <a href="https://github.com/vincentdoerig/latex-css">LaTeX.css</a> and <a href="https://getbootstrap.com/">Bootstrap</a></small></div>
</div>
<script src="assets/js/jquery-3.6.1.slim.js"></script>
<script src="assets/js/bootstrap.bundle.min.js"></script>
<script src="assets/js/clipboard.min.js"></script>
<script src="assets/js/copy-pre.js"></script>
</body>

</html>
