---

title: TCN (Temporal Convolutional Network)


keywords: fastai
sidebar: home_sidebar

summary: "This is an unofficial PyTorch implementation by Ignacio Oguiza of  - oguiza@gmail.com based on Temporal Convolutional Network (Bai, 2018)."
description: "This is an unofficial PyTorch implementation by Ignacio Oguiza of  - oguiza@gmail.com based on Temporal Convolutional Network (Bai, 2018)."
nb_path: "nbs/113_models.TCN.ipynb"
---
<!--

#################################################
### THIS FILE WAS AUTOGENERATED! DO NOT EDIT! ###
#################################################
# file to edit: nbs/113_models.TCN.ipynb
# command to build the docs after a change: nbdev_build_docs

-->

<div class="container" id="notebook-container">
        

<div class="cell border-box-sizing text_cell rendered"><div class="inner_cell">
<div class="text_cell_render border-box-sizing rendered_html">
<p><strong>References:</strong></p>
<ul>
<li>Bai, S., Kolter, J. Z., &amp; Koltun, V. (2018). An empirical evaluation of generic convolutional and recurrent networks for sequence modeling. arXiv preprint arXiv:1803.01271.</li>
<li>Official TCN PyTorch implementation: <a href="https://github.com/locuslab/TCN">https://github.com/locuslab/TCN</a></li>
</ul>

</div>
</div>
</div>
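A TCN stacks residual blocks of dilated causal convolutions, doubling the dilation at each level so the receptive field grows exponentially with depth. As a back-of-the-envelope check (this helper is illustrative, not part of the tsai API), with two convolutions of kernel size `ks` per block, the receptive field is `1 + 2*(ks-1)*(2**n_layers - 1)`:

```python
# Receptive field of a TCN whose dilation doubles per level (1, 2, 4, ...),
# with two convolutions of kernel size ks per residual block.
# Illustrative helper, not part of the tsai API.
def tcn_receptive_field(ks: int, n_layers: int) -> int:
    return 1 + 2 * (ks - 1) * sum(2 ** i for i in range(n_layers))

# With the defaults used below (ks=7, 8 levels) the network can look back
# over 3061 time steps, far more than the seq_len=128 in the example.
print(tcn_receptive_field(7, 8))  # → 3061
```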


    {% raw %}
    
<div class="cell border-box-sizing code_cell rendered">

<div class="output_wrapper">
<div class="output">

<div class="output_area">


<div class="output_markdown rendered_html output_subarea ">
<h2 id="TemporalBlock" class="doc_header"><code>class</code> <code>TemporalBlock</code><a href="https://github.com/timeseriesAI/tsai/tree/main/tsai/models/TCN.py#L21" class="source_link" style="float:right">[source]</a></h2><blockquote><p><code>TemporalBlock</code>(<strong><code>ni</code></strong>, <strong><code>nf</code></strong>, <strong><code>ks</code></strong>, <strong><code>stride</code></strong>, <strong><code>dilation</code></strong>, <strong><code>padding</code></strong>, <strong><code>dropout</code></strong>=<em><code>0.0</code></em>) :: <code>Module</code></p>
</blockquote>
<p>Same as <code>nn.Module</code>, but no need for subclasses to call <code>super().__init__</code></p>

</div>

</div>

</div>
</div>

</div>
    {% endraw %}
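Each convolution inside `TemporalBlock` pads by `(ks - 1) * dilation`, which leaves extra trailing time steps; the `Chomp1d` modules visible in the printout below trim them so the convolutions stay causal and the sequence length is preserved. A minimal re-implementation of that chomping step (a sketch in the spirit of the official TCN code, not the tsai source itself):

```python
import torch
import torch.nn as nn

class Chomp1d(nn.Module):
    "Drop the trailing `chomp_size` steps added by causal padding."
    def __init__(self, chomp_size: int):
        super().__init__()
        self.chomp_size = chomp_size

    def forward(self, x):
        # Removing the padded tail keeps the convolution causal and makes
        # the output length match the input length again.
        return x[..., :-self.chomp_size].contiguous()

x = torch.rand(16, 25, 128 + 6)        # output of a ks=7, padding=6 conv
assert Chomp1d(6)(x).shape == (16, 25, 128)
```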

    {% raw %}
    
<div class="cell border-box-sizing code_cell rendered">

<div class="output_wrapper">
<div class="output">

<div class="output_area">


<div class="output_markdown rendered_html output_subarea ">
<h4 id="TemporalConvNet" class="doc_header"><code>TemporalConvNet</code><a href="https://github.com/timeseriesAI/tsai/tree/main/tsai/models/TCN.py#L47" class="source_link" style="float:right">[source]</a></h4><blockquote><p><code>TemporalConvNet</code>(<strong><code>c_in</code></strong>, <strong><code>layers</code></strong>, <strong><code>ks</code></strong>=<em><code>2</code></em>, <strong><code>dropout</code></strong>=<em><code>0.0</code></em>)</p>
</blockquote>

</div>

</div>

</div>
</div>

</div>
    {% endraw %}
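`TemporalConvNet` builds one `TemporalBlock` per entry in `layers`. As the model printout further down confirms, level `i` uses dilation `2**i` with padding `(ks - 1) * 2**i`; this small loop just reproduces that schedule for the defaults (illustrative arithmetic, not the tsai source):

```python
# Per-level hyperparameters implied by the model printout below (defaults:
# ks=7, eight levels of 25 filters): dilation doubles per level and
# padding = (ks - 1) * dilation keeps each convolution causal.
ks, layers = 7, [25] * 8
dilations = [2 ** i for i in range(len(layers))]
paddings = [(ks - 1) * d for d in dilations]
for i, (d, p) in enumerate(zip(dilations, paddings)):
    print(f"level {i}: dilation={d}, padding={p}")
```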

    {% raw %}
    
<div class="cell border-box-sizing code_cell rendered">

<div class="output_wrapper">
<div class="output">

<div class="output_area">


<div class="output_markdown rendered_html output_subarea ">
<h2 id="TCN" class="doc_header"><code>class</code> <code>TCN</code><a href="https://github.com/timeseriesAI/tsai/tree/main/tsai/models/TCN.py#L56" class="source_link" style="float:right">[source]</a></h2><blockquote><p><code>TCN</code>(<strong><code>c_in</code></strong>, <strong><code>c_out</code></strong>, <strong><code>layers</code></strong>=<em><code>[25, 25, 25, 25, 25, 25, 25, 25]</code></em>, <strong><code>ks</code></strong>=<em><code>7</code></em>, <strong><code>conv_dropout</code></strong>=<em><code>0.0</code></em>, <strong><code>fc_dropout</code></strong>=<em><code>0.0</code></em>) :: <code>Module</code></p>
</blockquote>
<p>Same as <code>nn.Module</code>, but no need for subclasses to call <code>super().__init__</code></p>

</div>

</div>

</div>
</div>

</div>
    {% endraw %}


    {% raw %}
    
<div class="cell border-box-sizing code_cell rendered">
<div class="input">

<div class="inner_cell">
    <div class="input_area">
<div class=" highlight hl-ipython3"><pre><span></span><span class="n">bs</span> <span class="o">=</span> <span class="mi">16</span>
<span class="n">nvars</span> <span class="o">=</span> <span class="mi">3</span>
<span class="n">seq_len</span> <span class="o">=</span> <span class="mi">128</span>
<span class="n">c_out</span> <span class="o">=</span> <span class="mi">2</span>
<span class="n">xb</span> <span class="o">=</span> <span class="n">torch</span><span class="o">.</span><span class="n">rand</span><span class="p">(</span><span class="n">bs</span><span class="p">,</span> <span class="n">nvars</span><span class="p">,</span> <span class="n">seq_len</span><span class="p">)</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">TCN</span><span class="p">(</span><span class="n">nvars</span><span class="p">,</span> <span class="n">c_out</span><span class="p">,</span> <span class="n">fc_dropout</span><span class="o">=</span><span class="mf">.5</span><span class="p">)</span>
<span class="n">test_eq</span><span class="p">(</span><span class="n">model</span><span class="p">(</span><span class="n">xb</span><span class="p">)</span><span class="o">.</span><span class="n">shape</span><span class="p">,</span> <span class="p">(</span><span class="n">bs</span><span class="p">,</span> <span class="n">c_out</span><span class="p">))</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">TCN</span><span class="p">(</span><span class="n">nvars</span><span class="p">,</span> <span class="n">c_out</span><span class="p">,</span> <span class="n">conv_dropout</span><span class="o">=</span><span class="mf">.2</span><span class="p">)</span>
<span class="n">test_eq</span><span class="p">(</span><span class="n">model</span><span class="p">(</span><span class="n">xb</span><span class="p">)</span><span class="o">.</span><span class="n">shape</span><span class="p">,</span> <span class="p">(</span><span class="n">bs</span><span class="p">,</span> <span class="n">c_out</span><span class="p">))</span>
<span class="n">model</span> <span class="o">=</span> <span class="n">TCN</span><span class="p">(</span><span class="n">nvars</span><span class="p">,</span> <span class="n">c_out</span><span class="p">)</span>
<span class="n">test_eq</span><span class="p">(</span><span class="n">model</span><span class="p">(</span><span class="n">xb</span><span class="p">)</span><span class="o">.</span><span class="n">shape</span><span class="p">,</span> <span class="p">(</span><span class="n">bs</span><span class="p">,</span> <span class="n">c_out</span><span class="p">))</span>
<span class="n">model</span>
</pre></div>

    </div>
</div>
</div>

<div class="output_wrapper">
<div class="output">

<div class="output_area">



<div class="output_text output_subarea output_execute_result">
<pre>TCN(
  (tcn): Sequential(
    (0): TemporalBlock(
      (conv1): Conv1d(3, 25, kernel_size=(7,), stride=(1,), padding=(6,))
      (chomp1): Chomp1d()
      (relu1): ReLU()
      (dropout1): Dropout(p=0.0, inplace=False)
      (conv2): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(6,))
      (chomp2): Chomp1d()
      (relu2): ReLU()
      (dropout2): Dropout(p=0.0, inplace=False)
      (net): Sequential(
        (0): Conv1d(3, 25, kernel_size=(7,), stride=(1,), padding=(6,))
        (1): Chomp1d()
        (2): ReLU()
        (3): Dropout(p=0.0, inplace=False)
        (4): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(6,))
        (5): Chomp1d()
        (6): ReLU()
        (7): Dropout(p=0.0, inplace=False)
      )
      (downsample): Conv1d(3, 25, kernel_size=(1,), stride=(1,))
      (relu): ReLU()
    )
    (1): TemporalBlock(
      (conv1): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(12,), dilation=(2,))
      (chomp1): Chomp1d()
      (relu1): ReLU()
      (dropout1): Dropout(p=0.0, inplace=False)
      (conv2): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(12,), dilation=(2,))
      (chomp2): Chomp1d()
      (relu2): ReLU()
      (dropout2): Dropout(p=0.0, inplace=False)
      (net): Sequential(
        (0): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(12,), dilation=(2,))
        (1): Chomp1d()
        (2): ReLU()
        (3): Dropout(p=0.0, inplace=False)
        (4): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(12,), dilation=(2,))
        (5): Chomp1d()
        (6): ReLU()
        (7): Dropout(p=0.0, inplace=False)
      )
      (relu): ReLU()
    )
    (2): TemporalBlock(
      (conv1): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(24,), dilation=(4,))
      (chomp1): Chomp1d()
      (relu1): ReLU()
      (dropout1): Dropout(p=0.0, inplace=False)
      (conv2): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(24,), dilation=(4,))
      (chomp2): Chomp1d()
      (relu2): ReLU()
      (dropout2): Dropout(p=0.0, inplace=False)
      (net): Sequential(
        (0): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(24,), dilation=(4,))
        (1): Chomp1d()
        (2): ReLU()
        (3): Dropout(p=0.0, inplace=False)
        (4): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(24,), dilation=(4,))
        (5): Chomp1d()
        (6): ReLU()
        (7): Dropout(p=0.0, inplace=False)
      )
      (relu): ReLU()
    )
    (3): TemporalBlock(
      (conv1): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(48,), dilation=(8,))
      (chomp1): Chomp1d()
      (relu1): ReLU()
      (dropout1): Dropout(p=0.0, inplace=False)
      (conv2): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(48,), dilation=(8,))
      (chomp2): Chomp1d()
      (relu2): ReLU()
      (dropout2): Dropout(p=0.0, inplace=False)
      (net): Sequential(
        (0): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(48,), dilation=(8,))
        (1): Chomp1d()
        (2): ReLU()
        (3): Dropout(p=0.0, inplace=False)
        (4): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(48,), dilation=(8,))
        (5): Chomp1d()
        (6): ReLU()
        (7): Dropout(p=0.0, inplace=False)
      )
      (relu): ReLU()
    )
    (4): TemporalBlock(
      (conv1): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(96,), dilation=(16,))
      (chomp1): Chomp1d()
      (relu1): ReLU()
      (dropout1): Dropout(p=0.0, inplace=False)
      (conv2): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(96,), dilation=(16,))
      (chomp2): Chomp1d()
      (relu2): ReLU()
      (dropout2): Dropout(p=0.0, inplace=False)
      (net): Sequential(
        (0): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(96,), dilation=(16,))
        (1): Chomp1d()
        (2): ReLU()
        (3): Dropout(p=0.0, inplace=False)
        (4): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(96,), dilation=(16,))
        (5): Chomp1d()
        (6): ReLU()
        (7): Dropout(p=0.0, inplace=False)
      )
      (relu): ReLU()
    )
    (5): TemporalBlock(
      (conv1): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(192,), dilation=(32,))
      (chomp1): Chomp1d()
      (relu1): ReLU()
      (dropout1): Dropout(p=0.0, inplace=False)
      (conv2): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(192,), dilation=(32,))
      (chomp2): Chomp1d()
      (relu2): ReLU()
      (dropout2): Dropout(p=0.0, inplace=False)
      (net): Sequential(
        (0): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(192,), dilation=(32,))
        (1): Chomp1d()
        (2): ReLU()
        (3): Dropout(p=0.0, inplace=False)
        (4): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(192,), dilation=(32,))
        (5): Chomp1d()
        (6): ReLU()
        (7): Dropout(p=0.0, inplace=False)
      )
      (relu): ReLU()
    )
    (6): TemporalBlock(
      (conv1): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(384,), dilation=(64,))
      (chomp1): Chomp1d()
      (relu1): ReLU()
      (dropout1): Dropout(p=0.0, inplace=False)
      (conv2): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(384,), dilation=(64,))
      (chomp2): Chomp1d()
      (relu2): ReLU()
      (dropout2): Dropout(p=0.0, inplace=False)
      (net): Sequential(
        (0): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(384,), dilation=(64,))
        (1): Chomp1d()
        (2): ReLU()
        (3): Dropout(p=0.0, inplace=False)
        (4): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(384,), dilation=(64,))
        (5): Chomp1d()
        (6): ReLU()
        (7): Dropout(p=0.0, inplace=False)
      )
      (relu): ReLU()
    )
    (7): TemporalBlock(
      (conv1): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(768,), dilation=(128,))
      (chomp1): Chomp1d()
      (relu1): ReLU()
      (dropout1): Dropout(p=0.0, inplace=False)
      (conv2): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(768,), dilation=(128,))
      (chomp2): Chomp1d()
      (relu2): ReLU()
      (dropout2): Dropout(p=0.0, inplace=False)
      (net): Sequential(
        (0): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(768,), dilation=(128,))
        (1): Chomp1d()
        (2): ReLU()
        (3): Dropout(p=0.0, inplace=False)
        (4): Conv1d(25, 25, kernel_size=(7,), stride=(1,), padding=(768,), dilation=(128,))
        (5): Chomp1d()
        (6): ReLU()
        (7): Dropout(p=0.0, inplace=False)
      )
      (relu): ReLU()
    )
  )
  (gap): GAP1d(
    (gap): AdaptiveAvgPool1d(output_size=1)
    (flatten): Flatten(full=False)
  )
  (linear): Linear(in_features=25, out_features=2, bias=True)
)</pre>
</div>

</div>

</div>
</div>

</div>
    {% endraw %}
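The head shown at the end of the printout (`GAP1d` followed by `Linear`) averages the last block's activations over the time axis before classifying, which is why the output is `(bs, c_out)` regardless of `seq_len`. A minimal stand-in using plain PyTorch, with shapes taken from the example above (assumed here; `GAP1d` itself is a tsai module):

```python
import torch
import torch.nn as nn

# Stand-in for the GAP1d + Linear head in the printout: global average
# pooling over time, then a linear classifier (bs=16, 25 filters,
# seq_len=128, c_out=2 as in the example above).
head = nn.Sequential(
    nn.AdaptiveAvgPool1d(1),   # (bs, 25, seq_len) -> (bs, 25, 1)
    nn.Flatten(),              # (bs, 25, 1) -> (bs, 25)
    nn.Linear(25, 2),          # (bs, 25) -> (bs, c_out)
)
out = head(torch.rand(16, 25, 128))
assert out.shape == (16, 2)
```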

</div>
 

