{
  "nbformat": 4,
  "nbformat_minor": 0,
  "metadata": {
    "colab": {
      "name": "🛠_10. Time series fundamentals and Milestone Project 3: BitPredict 💰📈 Exercise Solutions.ipynb",
      "provenance": [],
      "collapsed_sections": [],
      "include_colab_link": true
    },
    "kernelspec": {
      "name": "python3",
      "display_name": "Python 3"
    },
    "language_info": {
      "name": "python"
    }
  },
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "view-in-github",
        "colab_type": "text"
      },
      "source": [
        "<a href=\"https://colab.research.google.com/github/ashikshafi08/Zero-to-Mastery-TensorFlow-for-Deep-Learning-Exercise-Solutions/blob/main/%F0%9F%9B%A0_10_Time_series_fundamentals_and_Milestone_Project_3_BitPredict_%F0%9F%92%B0%F0%9F%93%88_Exercise_Solutions.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "n9BhPZ1Tx4bm"
      },
      "source": [
        "# 🛠 10. Time series fundamentals and Milestone Project 3: BitPredict 💰📈 Exercise Solutions \n",
        "\n",
        "1. Does scaling the data help for univariate/multivariate data? (e.g. getting all of the values between 0 & 1)\n",
        "    * Try doing this for a univariate model (e.g. model_1) and a multivariate model (e.g. model_6) and see if it affects model training or evaluation results.\n",
        "2. Get the most up to date data on Bitcoin, train a model & see how it goes (our data goes up to May 18 2021).\n",
        "    * You can download the Bitcoin historical data for free from  [coindesk.com/price/bitcoin](https://www.coindesk.com/price/bitcoin)  by clicking “Export Data” -> “CSV”.\n",
        "3. For most of our models we used WINDOW_SIZE=7, but is there a better window size?\n",
        "    * Set up a series of experiments to find whether or not there’s a better window size.\n",
        "    * For example, you might train 10 different models with HORIZON=1 but with window sizes ranging from 2-12.\n",
        "4. Create a windowed dataset just like the ones we used for model_1 using  [tf.keras.preprocessing.timeseries_dataset_from_array()](https://www.tensorflow.org/api_docs/python/tf/keras/preprocessing/timeseries_dataset_from_array)  and retrain model_1 using the recreated dataset.\n",
        "5. For our multivariate modelling experiment, we added the Bitcoin block reward size as an extra feature to make our time series multivariate.\n",
        "    * Are there any other features you think you could add?\n",
        "    * If so, try it out, how do these affect the model?\n",
        "6. Make prediction intervals for future forecasts. To do so, one way would be to train an ensemble model on all of the data, make future forecasts with it and calculate the prediction intervals of the ensemble just like we did for model_8.\n",
        "7. For future predictions, try to make a prediction, retrain a model on the predictions, make a prediction, retrain a model, make a prediction, retrain a model, make a prediction (retrain a model each time a new prediction is made). Plot the results, how do they look compared to the future predictions where a model wasn’t retrained for every forecast (model_9)?\n",
        "8. Throughout this notebook, we’ve only tried algorithms we’ve handcrafted ourselves. But it’s worth seeing how a purpose built forecasting algorithm goes.\n",
        "    * Try out one of the extra algorithms listed in the modelling experiments part such as:\n",
        "\t*  [Facebook’s Kats library](https://github.com/facebookresearch/Kats)  - there are many models in here, remember the machine learning practitioner’s motto: experiment, experiment, experiment.\n",
        "\t*  [LinkedIn’s Greykite library](https://github.com/linkedin/greykite) \n"
      ]
    },
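    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "For exercise 3, one way to search for a better window size is a simple experiment loop. The sketch below is only an outline: it assumes the windowing and splitting helpers (`make_windows_scaled`, `make_train_test_splits`) defined later in this notebook, and trains a fresh dense model per window size.\n",
        "\n",
        "```python\n",
        "# Sketch: train one model per candidate window size and record its test MAE\n",
        "window_size_results = {}\n",
        "for window_size in range(2, 13):\n",
        "    windows, labels = make_windows_scaled(prices, window_size=window_size, horizon=1)\n",
        "    train_w, test_w, train_l, test_l = make_train_test_splits(windows, labels)\n",
        "    model = tf.keras.Sequential([layers.Dense(128, activation='relu'), layers.Dense(1)])\n",
        "    model.compile(loss='mae', optimizer='adam')\n",
        "    model.fit(train_w, train_l, epochs=100, batch_size=128, verbose=0)\n",
        "    window_size_results[window_size] = model.evaluate(test_w, test_l, verbose=0)\n",
        "```"
      ]
    },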
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "VT9614RIyCCl"
      },
      "source": [
        "## Downloading the data and preprocessing it "
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 422
        },
        "id": "aZe0Y63IyQgT",
        "outputId": "ca46e0cc-c761-4f3c-afa3-e6217144aadc"
      },
      "source": [
        "# Download Bitcoin historical data from GitHub \n",
        "!wget https://raw.githubusercontent.com/mrdbourke/tensorflow-deep-learning/main/extras/BTC_USD_2013-10-01_2021-05-18-CoinDesk.csv\n",
        "\n",
        "# Import with pandas \n",
        "import pandas as pd\n",
        "df = pd.read_csv(\"/content/BTC_USD_2013-10-01_2021-05-18-CoinDesk.csv\", \n",
        "                 parse_dates=[\"Date\"], \n",
        "                 index_col=[\"Date\"]) # parse the date column (tell pandas column 1 is a datetime)\n",
        "df.head()"
      ],
      "execution_count": 1,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "--2021-09-14 10:00:47--  https://raw.githubusercontent.com/mrdbourke/tensorflow-deep-learning/main/extras/BTC_USD_2013-10-01_2021-05-18-CoinDesk.csv\n",
            "Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.108.133, 185.199.109.133, 185.199.110.133, ...\n",
            "Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.108.133|:443... connected.\n",
            "HTTP request sent, awaiting response... 200 OK\n",
            "Length: 178509 (174K) [text/plain]\n",
            "Saving to: ‘BTC_USD_2013-10-01_2021-05-18-CoinDesk.csv’\n",
            "\n",
            "\r          BTC_USD_2   0%[                    ]       0  --.-KB/s               \rBTC_USD_2013-10-01_ 100%[===================>] 174.33K  --.-KB/s    in 0.02s   \n",
            "\n",
            "2021-09-14 10:00:47 (7.01 MB/s) - ‘BTC_USD_2013-10-01_2021-05-18-CoinDesk.csv’ saved [178509/178509]\n",
            "\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>Currency</th>\n",
              "      <th>Closing Price (USD)</th>\n",
              "      <th>24h Open (USD)</th>\n",
              "      <th>24h High (USD)</th>\n",
              "      <th>24h Low (USD)</th>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>Date</th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>2013-10-01</th>\n",
              "      <td>BTC</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>124.30466</td>\n",
              "      <td>124.75166</td>\n",
              "      <td>122.56349</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-02</th>\n",
              "      <td>BTC</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>125.75850</td>\n",
              "      <td>123.63383</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-03</th>\n",
              "      <td>BTC</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>125.66566</td>\n",
              "      <td>83.32833</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-04</th>\n",
              "      <td>BTC</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>118.67500</td>\n",
              "      <td>107.05816</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-05</th>\n",
              "      <td>BTC</td>\n",
              "      <td>121.33866</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>121.93633</td>\n",
              "      <td>118.00566</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "           Currency  Closing Price (USD)  ...  24h High (USD)  24h Low (USD)\n",
              "Date                                      ...                               \n",
              "2013-10-01      BTC            123.65499  ...       124.75166      122.56349\n",
              "2013-10-02      BTC            125.45500  ...       125.75850      123.63383\n",
              "2013-10-03      BTC            108.58483  ...       125.66566       83.32833\n",
              "2013-10-04      BTC            118.67466  ...       118.67500      107.05816\n",
              "2013-10-05      BTC            121.33866  ...       121.93633      118.00566\n",
              "\n",
              "[5 rows x 5 columns]"
            ]
          },
          "metadata": {},
          "execution_count": 1
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "1LQW1xGSyYDv",
        "outputId": "4c8fbe15-08e3-414d-cad7-0229cbaf6dc8"
      },
      "source": [
        "# We only want the closing price for each day \n",
        "bitcoin_prices = pd.DataFrame(df[\"Closing Price (USD)\"]).rename(columns={\"Closing Price (USD)\": \"Price\"})\n",
        "bitcoin_prices.head(), bitcoin_prices.shape"
      ],
      "execution_count": 2,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "(                Price\n",
              " Date                 \n",
              " 2013-10-01  123.65499\n",
              " 2013-10-02  125.45500\n",
              " 2013-10-03  108.58483\n",
              " 2013-10-04  118.67466\n",
              " 2013-10-05  121.33866, (2787, 1))"
            ]
          },
          "metadata": {},
          "execution_count": 2
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "gIlIWPTWD6z_"
      },
      "source": [
        "# Get the data into arrays \n",
        "import numpy as np\n",
        "import tensorflow as tf \n",
        "from tensorflow.keras import layers\n",
        "\n",
        "timesteps = bitcoin_prices.index.to_numpy()\n",
        "prices = bitcoin_prices['Price'].to_numpy()\n",
        "\n",
        "# Instantiating the sklearn MinMaxScaler\n",
        "from sklearn.preprocessing import MinMaxScaler\n",
        "\n",
        "scaler = MinMaxScaler() "
      ],
      "execution_count": 3,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "7eynNXXID68q"
      },
      "source": [
        "# Create functions to view NumPy arrays as windows \n",
        "\n",
        "def get_labelled_windows(x, horizon):\n",
        "  \"\"\"Splits windows into (window, label) pairs, e.g. [1, 2, 3, 4] -> ([1, 2, 3], [4]) for horizon=1.\"\"\"\n",
        "  return x[:, :-horizon], x[:, -horizon:]\n",
        "\n",
        "\n",
        "def make_windows_scaled(x, window_size=7, horizon=1):\n",
        "  \"\"\"\n",
        "  Turns a 1D array into a 2D array of sequential windows of window_size, after scaling all values to [0, 1] with MinMaxScaler.\n",
        "  \"\"\"\n",
        "  # Scale the series to the [0, 1] range (the scaler expects 2D input)\n",
        "  scaled_x = np.squeeze(scaler.fit_transform(np.expand_dims(x, axis=1)))\n",
        "\n",
        "  # Create a 2D array of indexes, one row per window of size window_size + horizon\n",
        "  window_step = np.expand_dims(np.arange(window_size+horizon), axis=0)\n",
        "  window_indexes = window_step + np.expand_dims(np.arange(len(scaled_x)-(window_size+horizon-1)), axis=0).T\n",
        "\n",
        "  # Index the scaled series to get the windows, then split off the labels\n",
        "  windowed_array = scaled_x[window_indexes]\n",
        "  windows, labels = get_labelled_windows(windowed_array, horizon=horizon)\n",
        "\n",
        "  return windows, labels\n",
        "\n",
        "\n",
        "# Split windows and labels into train and test sets\n",
        "def make_train_test_splits(windows, labels, test_split=0.2):\n",
        "  split_size = int(len(windows) * (1 - test_split)) # default to 80% train / 20% test\n",
        "  train_windows = windows[:split_size]\n",
        "  train_labels = labels[:split_size]\n",
        "  test_windows = windows[split_size:]\n",
        "  test_labels = labels[split_size:]\n",
        "\n",
        "  return train_windows, test_windows, train_labels, test_labels"
      ],
      "execution_count": 4,
      "outputs": []
    },
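    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As an aside for exercise 4, TensorFlow ships a utility that builds an equivalent windowed dataset directly. A minimal sketch, assuming `WINDOW_SIZE=7` and `HORIZON=1` and following the alignment pattern shown in the `timeseries_dataset_from_array` docs:\n",
        "\n",
        "```python\n",
        "# Sketch: windows of 7 values, each labelled with the value that follows it\n",
        "dataset = tf.keras.preprocessing.timeseries_dataset_from_array(\n",
        "    data=prices[:-7],    # inputs (drop the tail so every window has a label)\n",
        "    targets=prices[7:],  # target for the window starting at i is prices[i+7]\n",
        "    sequence_length=7,   # WINDOW_SIZE\n",
        "    batch_size=128)\n",
        "```"
      ]
    },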
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "7dItohLM1Ozr"
      },
      "source": [
        "### 1. Does scaling the data help for univariate/multivariate data? (e.g. getting all of the values between 0 & 1)\n",
        "* Try doing this for a univariate model (e.g. model_1) and a multivariate model (e.g. model_6) and see if it affects model training or evaluation results."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "9S-J_3Tkzfto",
        "outputId": "d6d7d189-7df8-4b4f-e8ec-37cbc575ccb8"
      },
      "source": [
        "# Model 1 (HORIZON=1, WINDOW_SIZE=7)\n",
        "HORIZON = 1 # predict the next day\n",
        "WINDOW_SIZE = 7 # use the past week of data\n",
        "\n",
        "full_windows, full_labels = make_windows_scaled(prices, window_size=WINDOW_SIZE, horizon=HORIZON)\n",
        "full_windows.shape, full_labels.shape"
      ],
      "execution_count": 5,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "((2780, 7), (2780, 1))"
            ]
          },
          "metadata": {},
          "execution_count": 5
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "TL8e6Iz9NOUX",
        "outputId": "f2c97dff-8cf7-42ca-be72-98fcb93b8139"
      },
      "source": [
        "# Look at a few examples of how the prices are scaled\n",
        "for i in range(3):\n",
        "  print(f'Window: {full_windows[i]} --> Label {full_labels[i]}')"
      ],
      "execution_count": 6,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Window: [0.00023831 0.00026677 0.         0.00015955 0.00020168 0.00019087\n",
            " 0.0002089 ] --> Label [0.00022847]\n",
            "Window: [0.00026677 0.         0.00015955 0.00020168 0.00019087 0.0002089\n",
            " 0.00022847] --> Label [0.00024454]\n",
            "Window: [0.         0.00015955 0.00020168 0.00019087 0.0002089  0.00022847\n",
            " 0.00024454] --> Label [0.00027478]\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "mmHd26QZFZDU",
        "outputId": "6d5db718-1c2e-4c5a-a828-df1de1746e51"
      },
      "source": [
        "# Make the train and test splits \n",
        "train_windows, test_windows, train_labels, test_labels = make_train_test_splits(full_windows, full_labels)\n",
        "len(train_windows), len(test_windows), len(train_labels), len(test_labels)"
      ],
      "execution_count": 7,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "(2224, 556, 2224, 556)"
            ]
          },
          "metadata": {},
          "execution_count": 7
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "A2dT3_hqyejt",
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "outputId": "d40af1bc-75e1-473a-cbb4-508e6a27b134"
      },
      "source": [
        "# Build model_1 \n",
        "tf.random.set_seed(42)\n",
        "\n",
        "# Construct the model \n",
        "model_1 = tf.keras.Sequential([\n",
        "  layers.Dense(128, activation='relu'),\n",
        "  layers.Dense(HORIZON, activation='linear')\n",
        "])\n",
        "\n",
        "# Compile the model \n",
        "model_1.compile(loss='mae',\n",
        "                optimizer=tf.keras.optimizers.Adam(),\n",
        "                metrics=['mae'])\n",
        "\n",
        "# Fit the model \n",
        "model_1.fit(x=train_windows,\n",
        "            y=train_labels,\n",
        "            epochs=100,\n",
        "            batch_size=128,\n",
        "            verbose=0,\n",
        "            validation_data=(test_windows, test_labels))"
      ],
      "execution_count": 8,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<keras.callbacks.History at 0x7f5f2a3b99d0>"
            ]
          },
          "metadata": {},
          "execution_count": 8
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "2MwuA2gI0t2B",
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "outputId": "ea19048b-4e04-4f2b-9415-bad836e1b1cf"
      },
      "source": [
        "# Evaluate the model on test data \n",
        "model_1.evaluate(test_windows , test_labels)"
      ],
      "execution_count": 9,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "18/18 [==============================] - 0s 1ms/step - loss: 0.0093 - mae: 0.0093\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "[0.00931711308658123, 0.00931711308658123]"
            ]
          },
          "metadata": {},
          "execution_count": 9
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "kh7sqPOjNo8V"
      },
      "source": [
        "# Making predictions \n",
        "model_1_preds = tf.squeeze(model_1.predict(test_windows))"
      ],
      "execution_count": 10,
      "outputs": []
    },
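    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Because model_1 was trained on scaled values, its predictions live on the [0, 1] scale. To compare them against real prices (or plot them in USD), they can be mapped back through the fitted scaler. A sketch, assuming `scaler` is the MinMaxScaler fitted inside `make_windows_scaled`:\n",
        "\n",
        "```python\n",
        "# Sketch: undo the MinMax scaling to get predictions back in USD\n",
        "model_1_preds_usd = scaler.inverse_transform(\n",
        "    model_1_preds.numpy().reshape(-1, 1)).squeeze()\n",
        "```"
      ]
    },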
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "pMNXeHUWXcEI"
      },
      "source": [
        "Now let's do the same for the multivariate data, this time for model_6."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 235
        },
        "id": "6HoW1S6TXhvD",
        "outputId": "2fcb88b1-f66f-4c0f-ac7d-3901f6641561"
      },
      "source": [
        "# Block reward values\n",
        "block_reward_1 = 50 # 3 January 2009 \n",
        "block_reward_2 = 25 # 28 November 2012 \n",
        "block_reward_3 = 12.5 # 9 July 2016\n",
        "block_reward_4 = 6.25 # 11 May 2020\n",
        "\n",
        "# Block reward dates (datetime form of the above date stamps)\n",
        "block_reward_2_datetime = np.datetime64(\"2012-11-28\")\n",
        "block_reward_3_datetime = np.datetime64(\"2016-07-09\")\n",
        "block_reward_4_datetime = np.datetime64(\"2020-05-11\")\n",
        "\n",
        "# Get day indexes for when the block reward changed (relative to the start of our data)\n",
        "block_reward_2_days = (block_reward_3_datetime - bitcoin_prices.index[0]).days # number of days block_reward_2 applies to our data\n",
        "block_reward_3_days = (block_reward_4_datetime - bitcoin_prices.index[0]).days # number of days until block_reward_3 ended\n",
        "\n",
        "# Add block_reward column\n",
        "bitcoin_prices_block = bitcoin_prices.copy()\n",
        "bitcoin_prices_block[\"block_reward\"] = None\n",
        "\n",
        "# Set values of block_reward column (it's the last column hence -1 indexing on iloc)\n",
        "bitcoin_prices_block.iloc[:block_reward_2_days, -1] = block_reward_2\n",
        "bitcoin_prices_block.iloc[block_reward_2_days:block_reward_3_days, -1] = block_reward_3\n",
        "bitcoin_prices_block.iloc[block_reward_3_days:, -1] = block_reward_4\n",
        "bitcoin_prices_block.head()"
      ],
      "execution_count": 11,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>Price</th>\n",
              "      <th>block_reward</th>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>Date</th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>2013-10-01</th>\n",
              "      <td>123.65499</td>\n",
              "      <td>25</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-02</th>\n",
              "      <td>125.45500</td>\n",
              "      <td>25</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-03</th>\n",
              "      <td>108.58483</td>\n",
              "      <td>25</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-04</th>\n",
              "      <td>118.67466</td>\n",
              "      <td>25</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-05</th>\n",
              "      <td>121.33866</td>\n",
              "      <td>25</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                Price block_reward\n",
              "Date                              \n",
              "2013-10-01  123.65499           25\n",
              "2013-10-02  125.45500           25\n",
              "2013-10-03  108.58483           25\n",
              "2013-10-04  118.67466           25\n",
              "2013-10-05  121.33866           25"
            ]
          },
          "metadata": {},
          "execution_count": 11
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 390
        },
        "id": "4NQkP9EmX2Xt",
        "outputId": "4a3522fc-6458-4b30-9613-3ca27057b947"
      },
      "source": [
        "# Make a copy of the Bitcoin historical data with block reward feature\n",
        "bitcoin_prices_windowed = bitcoin_prices_block.copy()\n",
        "\n",
        "# Add windowed columns\n",
        "for i in range(WINDOW_SIZE): \n",
        "  bitcoin_prices_windowed[f\"Price+{i+1}\"] = bitcoin_prices_windowed[\"Price\"].shift(periods=i+1)\n",
        "bitcoin_prices_windowed.head(10)"
      ],
      "execution_count": 12,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>Price</th>\n",
              "      <th>block_reward</th>\n",
              "      <th>Price+1</th>\n",
              "      <th>Price+2</th>\n",
              "      <th>Price+3</th>\n",
              "      <th>Price+4</th>\n",
              "      <th>Price+5</th>\n",
              "      <th>Price+6</th>\n",
              "      <th>Price+7</th>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>Date</th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>2013-10-01</th>\n",
              "      <td>123.65499</td>\n",
              "      <td>25</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-02</th>\n",
              "      <td>125.45500</td>\n",
              "      <td>25</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-03</th>\n",
              "      <td>108.58483</td>\n",
              "      <td>25</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-04</th>\n",
              "      <td>118.67466</td>\n",
              "      <td>25</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-05</th>\n",
              "      <td>121.33866</td>\n",
              "      <td>25</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-06</th>\n",
              "      <td>120.65533</td>\n",
              "      <td>25</td>\n",
              "      <td>121.33866</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>NaN</td>\n",
              "      <td>NaN</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-07</th>\n",
              "      <td>121.79500</td>\n",
              "      <td>25</td>\n",
              "      <td>120.65533</td>\n",
              "      <td>121.33866</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>NaN</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-08</th>\n",
              "      <td>123.03300</td>\n",
              "      <td>25</td>\n",
              "      <td>121.79500</td>\n",
              "      <td>120.65533</td>\n",
              "      <td>121.33866</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>123.65499</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-09</th>\n",
              "      <td>124.04900</td>\n",
              "      <td>25</td>\n",
              "      <td>123.03300</td>\n",
              "      <td>121.79500</td>\n",
              "      <td>120.65533</td>\n",
              "      <td>121.33866</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>125.45500</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-10</th>\n",
              "      <td>125.96116</td>\n",
              "      <td>25</td>\n",
              "      <td>124.04900</td>\n",
              "      <td>123.03300</td>\n",
              "      <td>121.79500</td>\n",
              "      <td>120.65533</td>\n",
              "      <td>121.33866</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>108.58483</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "                Price block_reward    Price+1  ...    Price+5    Price+6    Price+7\n",
              "Date                                           ...                                 \n",
              "2013-10-01  123.65499           25        NaN  ...        NaN        NaN        NaN\n",
              "2013-10-02  125.45500           25  123.65499  ...        NaN        NaN        NaN\n",
              "2013-10-03  108.58483           25  125.45500  ...        NaN        NaN        NaN\n",
              "2013-10-04  118.67466           25  108.58483  ...        NaN        NaN        NaN\n",
              "2013-10-05  121.33866           25  118.67466  ...        NaN        NaN        NaN\n",
              "2013-10-06  120.65533           25  121.33866  ...  123.65499        NaN        NaN\n",
              "2013-10-07  121.79500           25  120.65533  ...  125.45500  123.65499        NaN\n",
              "2013-10-08  123.03300           25  121.79500  ...  108.58483  125.45500  123.65499\n",
              "2013-10-09  124.04900           25  123.03300  ...  118.67466  108.58483  125.45500\n",
              "2013-10-10  125.96116           25  124.04900  ...  121.33866  118.67466  108.58483\n",
              "\n",
              "[10 rows x 9 columns]"
            ]
          },
          "metadata": {},
          "execution_count": 12
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 235
        },
        "id": "73jAFAVeYpD-",
        "outputId": "1f5f88c6-f926-4019-a6e9-0910a211bfd9"
      },
      "source": [
        "# Let's create X & y, drop the NaNs and convert to float32 to prevent TensorFlow dtype errors \n",
        "X = bitcoin_prices_windowed.dropna().drop(\"Price\", axis=1).astype(np.float32) \n",
        "y = bitcoin_prices_windowed.dropna()[\"Price\"].astype(np.float32)\n",
        "X.head()"
      ],
      "execution_count": 13,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>block_reward</th>\n",
              "      <th>Price+1</th>\n",
              "      <th>Price+2</th>\n",
              "      <th>Price+3</th>\n",
              "      <th>Price+4</th>\n",
              "      <th>Price+5</th>\n",
              "      <th>Price+6</th>\n",
              "      <th>Price+7</th>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>Date</th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>2013-10-08</th>\n",
              "      <td>25.0</td>\n",
              "      <td>121.794998</td>\n",
              "      <td>120.655327</td>\n",
              "      <td>121.338661</td>\n",
              "      <td>118.674660</td>\n",
              "      <td>108.584831</td>\n",
              "      <td>125.455002</td>\n",
              "      <td>123.654991</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-09</th>\n",
              "      <td>25.0</td>\n",
              "      <td>123.032997</td>\n",
              "      <td>121.794998</td>\n",
              "      <td>120.655327</td>\n",
              "      <td>121.338661</td>\n",
              "      <td>118.674660</td>\n",
              "      <td>108.584831</td>\n",
              "      <td>125.455002</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-10</th>\n",
              "      <td>25.0</td>\n",
              "      <td>124.049004</td>\n",
              "      <td>123.032997</td>\n",
              "      <td>121.794998</td>\n",
              "      <td>120.655327</td>\n",
              "      <td>121.338661</td>\n",
              "      <td>118.674660</td>\n",
              "      <td>108.584831</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-11</th>\n",
              "      <td>25.0</td>\n",
              "      <td>125.961159</td>\n",
              "      <td>124.049004</td>\n",
              "      <td>123.032997</td>\n",
              "      <td>121.794998</td>\n",
              "      <td>120.655327</td>\n",
              "      <td>121.338661</td>\n",
              "      <td>118.674660</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-12</th>\n",
              "      <td>25.0</td>\n",
              "      <td>125.279663</td>\n",
              "      <td>125.961159</td>\n",
              "      <td>124.049004</td>\n",
              "      <td>123.032997</td>\n",
              "      <td>121.794998</td>\n",
              "      <td>120.655327</td>\n",
              "      <td>121.338661</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "            block_reward     Price+1  ...     Price+6     Price+7\n",
              "Date                                  ...                        \n",
              "2013-10-08          25.0  121.794998  ...  125.455002  123.654991\n",
              "2013-10-09          25.0  123.032997  ...  108.584831  125.455002\n",
              "2013-10-10          25.0  124.049004  ...  118.674660  108.584831\n",
              "2013-10-11          25.0  125.961159  ...  121.338661  118.674660\n",
              "2013-10-12          25.0  125.279663  ...  120.655327  121.338661\n",
              "\n",
              "[5 rows x 8 columns]"
            ]
          },
          "metadata": {},
          "execution_count": 13
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "_SdPXx_SauLC"
      },
      "source": [
        "# Scale the X and y data to the [0, 1] range\n",
        "# (note: the second fit_transform call refits the scaler on y, so only y can be inverse-transformed later)\n",
        "X_scaled = scaler.fit_transform(X)\n",
        "y_scaled = scaler.fit_transform(np.expand_dims(y , axis = 1))\n",
        "y_scaled = np.squeeze(y_scaled)"
      ],
      "execution_count": 14,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "0M8UVLaIa1km",
        "outputId": "69199146-ea99-4575-d799-f8faaeb9f1ef"
      },
      "source": [
        "# Make train and test set splits of the scaled data \n",
        "split_size = int(len(X) * 0.8)\n",
        "X_train, y_train = X_scaled[:split_size], y_scaled[:split_size]\n",
        "X_test, y_test = X_scaled[split_size:], y_scaled[split_size:]\n",
        "len(X_train), len(y_train), len(X_test), len(y_test)"
      ],
      "execution_count": 15,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "(2224, 2224, 556, 556)"
            ]
          },
          "metadata": {},
          "execution_count": 15
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "6sXgRZ1Ka-TT",
        "outputId": "69e36603-9540-486d-a975-9ba75a7b1f73"
      },
      "source": [
        "# Build a multivariate time series model (model_6) on the scaled data and fit it\n",
        "tf.random.set_seed(42)\n",
        "\n",
        "model_6 = tf.keras.Sequential([\n",
        "  layers.Dense(128 , activation= 'relu'), \n",
        "  layers.Dense(HORIZON)\n",
        "])\n",
        "\n",
        "model_6.compile(loss = 'mae' , \n",
        "                optimizer = tf.keras.optimizers.Adam())\n",
        "\n",
        "model_6.fit(X_train , y_train , \n",
        "          epochs = 100 ,\n",
        "          verbose = 0 , batch_size = 128, \n",
        "          validation_data = (X_test , y_test))"
      ],
      "execution_count": 16,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<keras.callbacks.History at 0x7f5f2e58d310>"
            ]
          },
          "metadata": {},
          "execution_count": 16
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "rs_ZDtr1b3-N",
        "outputId": "c2278fcd-c0f1-4136-a737-1cbc14016b7a"
      },
      "source": [
        "# Evaluate model_6 (note: this MAE is in the scaled [0, 1] space, so it isn't directly comparable to MAE in price units)\n",
        "model_6.evaluate(X_test , y_test)"
      ],
      "execution_count": 17,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "18/18 [==============================] - 0s 1ms/step - loss: 0.0735\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "0.07352113723754883"
            ]
          },
          "metadata": {},
          "execution_count": 17
        }
      ]
    },
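    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The loss above is MAE in the scaled [0, 1] space, so it can't be compared directly with the unscaled models' MAE in price units. A minimal sketch of converting between the two (assuming `scaler` is a fitted sklearn `MinMaxScaler`, as used above; the toy array here is illustrative):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "from sklearn.preprocessing import MinMaxScaler\n",
        "\n",
        "# Toy stand-in for the real price series, to show the round trip\n",
        "y = np.array([100.0, 150.0, 200.0], dtype=np.float32)\n",
        "\n",
        "scaler = MinMaxScaler()\n",
        "y_scaled = np.squeeze(scaler.fit_transform(np.expand_dims(y, axis=1)))\n",
        "\n",
        "# MinMaxScaler divides by (max - min), so a scaled-space MAE maps back\n",
        "# to price units by multiplying by the data range\n",
        "scaled_mae = 0.1\n",
        "price_mae = scaled_mae * (y.max() - y.min())\n",
        "\n",
        "# inverse_transform recovers the original prices\n",
        "y_back = np.squeeze(scaler.inverse_transform(np.expand_dims(y_scaled, axis=1)))\n",
        "```"
      ]
    },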
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "MqlPT_dXfuQf"
      },
      "source": [
        "### 2. Get the most up to date data on Bitcoin, train a model & see how it goes (our data goes up to May 18 2021).\n",
        "\n",
        "CoinDesk: https://www.coindesk.com/price/bitcoin/\n",
        "\n",
        "> Note: Download the CSV and upload it into the Colab session, then replace `your_data_path` with the path to your downloaded file."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "LUKI8td_b_F_",
        "outputId": "06cfd514-d378-4707-d2ea-cc71c89a6605"
      },
      "source": [
        "# Load in the latest CSV from CoinDesk\n",
        "your_data_path = '/content/BTC_USD_2014-11-02_2021-09-09-CoinDesk.csv' \n",
        "df_updated = pd.read_csv( your_data_path, \n",
        "                 parse_dates = ['Date'] , \n",
        "                 index_col = ['Date'])\n",
        "\n",
        "bitcoin_prices_updated = pd.DataFrame(df_updated[\"Closing Price (USD)\"]).rename(columns={\"Closing Price (USD)\": \"Price\"})\n",
        "bitcoin_prices_updated.head(10) , bitcoin_prices_updated.shape"
      ],
      "execution_count": 18,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "(                Price\n",
              " Date                 \n",
              " 2014-11-02  325.22633\n",
              " 2014-11-03  331.60083\n",
              " 2014-11-04  324.71833\n",
              " 2014-11-05  332.45666\n",
              " 2014-11-06  336.58500\n",
              " 2014-11-07  346.77500\n",
              " 2014-11-08  344.81166\n",
              " 2014-11-09  343.06500\n",
              " 2014-11-10  358.50166\n",
              " 2014-11-11  368.07666, (2503, 1))"
            ]
          },
          "metadata": {},
          "execution_count": 18
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "52OylhbEiIII"
      },
      "source": [
        "prices_updated = bitcoin_prices_updated['Price'].to_numpy()"
      ],
      "execution_count": 19,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "JZYgYStegy7E"
      },
      "source": [
        "def make_windows(x, window_size=7, horizon=1):\n",
        "  \"\"\"\n",
        "  Turns a 1D array into a 2D array of sequential windows of window_size and returns the windows with their horizon-sized labels.\n",
        "  \"\"\"\n",
        "  \n",
        "  window_step = np.expand_dims(np.arange(window_size+horizon), axis=0)\n",
        "  window_indexes = window_step + np.expand_dims(np.arange(len(x)-(window_size+horizon-1)), axis=0).T # create 2D array of windows of size window_size\n",
        "  windowed_array = x[window_indexes]\n",
        "  windows, labels = get_labelled_windows(windowed_array, horizon=horizon)\n",
        "\n",
        "  return windows, labels"
      ],
      "execution_count": 20,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "bBzAx_9nhfGF",
        "outputId": "f4cf6e16-1a09-4760-df3e-f90d53d5355f"
      },
      "source": [
        "full_windows , full_labels = make_windows(prices_updated)\n",
        "len(full_windows), len(full_labels)"
      ],
      "execution_count": 21,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "(2496, 2496)"
            ]
          },
          "metadata": {},
          "execution_count": 21
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "jQ5Iir7jvLeY",
        "outputId": "512495cb-767c-41d6-ca8b-7f5c5cc2d252"
      },
      "source": [
        "# View a few example windows and their labels\n",
        "for i in range(3):\n",
        "  print(f'Window: {full_windows[i]} --> Label {full_labels[i]}')"
      ],
      "execution_count": 22,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Window: [325.22633 331.60083 324.71833 332.45666 336.585   346.775   344.81166] --> Label [343.065]\n",
            "Window: [331.60083 324.71833 332.45666 336.585   346.775   344.81166 343.065  ] --> Label [358.50166]\n",
            "Window: [324.71833 332.45666 336.585   346.775   344.81166 343.065   358.50166] --> Label [368.07666]\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "2HCIuZdxh-GC",
        "outputId": "e6699867-8b03-4ee4-c862-f83f9d5f06e3"
      },
      "source": [
        "# Making train and test splits\n",
        "train_windows , test_windows , train_labels , test_labels = make_train_test_splits(full_windows , full_labels)\n",
        "\n",
        "len(train_windows) , len(test_windows) , len(train_labels) , len(test_labels)"
      ],
      "execution_count": 23,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "(1996, 500, 1996, 500)"
            ]
          },
          "metadata": {},
          "execution_count": 23
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "e1Ve_RGCm5Ez"
      },
      "source": [
        "Now we'll rebuild the same model_1 architecture on the new CoinDesk data."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "4IytoXshkhJE",
        "outputId": "1bc5bd41-2a35-4ce6-e559-5cea7e8a7613"
      },
      "source": [
        "# Building the Model 1 with the updated data\n",
        "tf.random.set_seed(42)\n",
        "\n",
        "# Construct the model \n",
        "model_1 = tf.keras.Sequential([\n",
        "  layers.Dense(128, activation= 'relu') ,\n",
        "  layers.Dense(HORIZON , activation = 'linear')\n",
        "])\n",
        "\n",
        "# Compiling the model \n",
        "model_1.compile(loss = 'mae' , \n",
        "                optimizer = tf.keras.optimizers.Adam() , \n",
        "                metrics = ['mae'])\n",
        "\n",
        "# Fit the model \n",
        "model_1.fit(x = train_windows , \n",
        "            y = train_labels , \n",
        "            epochs = 100 , batch_size = 128 , verbose = 0 , \n",
        "            validation_data = (test_windows , test_labels))"
      ],
      "execution_count": 24,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<keras.callbacks.History at 0x7f5f25beb910>"
            ]
          },
          "metadata": {},
          "execution_count": 24
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "Uq-4_z0rm_Gc",
        "outputId": "8ff2e0d1-15e9-4003-fc6d-a77c9547dcae"
      },
      "source": [
        "# Evaluating the model \n",
        "model_1.evaluate(test_windows , test_labels)"
      ],
      "execution_count": 25,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "16/16 [==============================] - 0s 1ms/step - loss: 884.0138 - mae: 884.0138\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "[884.0137939453125, 884.0137939453125]"
            ]
          },
          "metadata": {},
          "execution_count": 25
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "OIuiLp-PnYlS"
      },
      "source": [
        "### 3. For most of our models we used WINDOW_SIZE=7, but is there a better window size?\n",
        "\n",
        "* Setup a series of experiments to find whether or not there’s a better window size.\n",
        "* For example, you might train 10 different models with HORIZON=1 but with window sizes ranging from 2-12."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "P08KPQgG6op2"
      },
      "source": [
        "# Write an evaluation function comparing predictions to targets\n",
        "def evaluate_preds(y_true , y_pred):\n",
        "\n",
        "  # Cast the values to float32 for consistent dtypes\n",
        "  y_true = tf.cast(y_true , tf.float32)\n",
        "  y_pred = tf.cast(y_pred , tf.float32)\n",
        "\n",
        "  # Calculate the metrics \n",
        "  mae = tf.keras.metrics.mean_absolute_error(y_true , y_pred)\n",
        "  mse = tf.keras.metrics.mean_squared_error(y_true , y_pred)\n",
        "  rmse = tf.sqrt(mse)\n",
        "  mape = tf.keras.metrics.mean_absolute_percentage_error(y_true , y_pred)\n",
        "  \n",
        "  # For horizons > 1, average the per-timestep metrics down to a single scalar each\n",
        "  if mae.ndim > 0:\n",
        "    mae = tf.reduce_mean(mae)\n",
        "    mse = tf.reduce_mean(mse)\n",
        "    rmse = tf.reduce_mean(rmse)\n",
        "    mape = tf.reduce_mean(mape)\n",
        "\n",
        "  return {'mae' : mae.numpy() , \n",
        "          'mse': mse.numpy() , \n",
        "          'rmse': rmse.numpy() , \n",
        "          'mape': mape.numpy() }"
      ],
      "execution_count": 26,
      "outputs": []
    },
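    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As a sanity check on the formulas above, the same four metrics can be computed by hand with NumPy on a toy example (the values here are illustrative, not from the Bitcoin data):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "y_true = np.array([100.0, 200.0, 300.0], dtype=np.float32)\n",
        "y_pred = np.array([110.0, 190.0, 330.0], dtype=np.float32)\n",
        "\n",
        "mae = np.mean(np.abs(y_true - y_pred))                    # (10 + 10 + 30) / 3\n",
        "mse = np.mean((y_true - y_pred) ** 2)                     # (100 + 100 + 900) / 3\n",
        "rmse = np.sqrt(mse)\n",
        "mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100  # in percent\n",
        "```"
      ]
    },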
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "x11lEVED6d9A",
        "outputId": "0a939e55-bdd2-44ea-dfca-6a8337807537"
      },
      "source": [
        "# Iterate over a range of window sizes and train a model for each\n",
        "\n",
        "# 10 different models with window sizes from 2 to 11 inclusive (range(2, 12)), storing the results\n",
        "model_results_list = []\n",
        "\n",
        "from tqdm import tqdm\n",
        "for size in tqdm(range(2,12)):\n",
        "  HORIZON = 1 \n",
        "  WINDOW_SIZE = size\n",
        "\n",
        "  # Making window and labels \n",
        "  full_windows , full_labels = make_windows(prices, window_size= WINDOW_SIZE , horizon= HORIZON)\n",
        "  \n",
        "\n",
        "  # Splitting the data in train and test\n",
        "  train_windows ,  test_windows ,train_labels,  test_labels = make_train_test_splits(full_windows , full_labels)\n",
        "\n",
        "\n",
        "  # Building a simple dense model\n",
        "  input = layers.Input(shape = (WINDOW_SIZE ,) , name = 'Input_layer')\n",
        "  x = layers.Dense(128 , activation= 'relu')(input)\n",
        "  output = layers.Dense(HORIZON , activation= 'linear')(x)\n",
        "\n",
        "  # Packing into a model \n",
        "  model = tf.keras.Model(input , output , name = f'model_windowed_{size}')\n",
        "\n",
        "  # Compiling and fitting the model \n",
        "  model.compile(loss = 'mae' , optimizer = 'adam' , metrics = 'mae')\n",
        "\n",
        "  model.fit(train_windows , train_labels , \n",
        "            epochs = 100 , verbose = 0 , \n",
        "            batch_size = 128 , \n",
        "            validation_data = (test_windows , test_labels))\n",
        "  \n",
        "\n",
        "  # Making predictions \n",
        "  preds_ = model.predict(test_windows)\n",
        "  y_preds = tf.squeeze(preds_)\n",
        "\n",
        "  results = evaluate_preds(tf.squeeze(test_labels) , y_preds)\n",
        "  model_results_list.append(results)"
      ],
      "execution_count": 27,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stderr",
          "text": [
            "100%|██████████| 10/10 [00:54<00:00,  5.42s/it]\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "Qyp-2gqoUipn",
        "outputId": "ff2672cf-0ba2-4af4-e0a4-2ff2d8df06c6"
      },
      "source": [
        "# Results for each of the 10 models, in order of window size (2 through 11)\n",
        "model_results_list"
      ],
      "execution_count": 28,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "[{'mae': 569.1506, 'mape': 2.5277913, 'mse': 1159396.5, 'rmse': 1076.7528},\n",
              " {'mae': 565.61, 'mape': 2.5274613, 'mse': 1139677.6, 'rmse': 1067.5569},\n",
              " {'mae': 623.44543, 'mape': 2.8342037, 'mse': 1266272.0, 'rmse': 1125.2875},\n",
              " {'mae': 565.61096, 'mape': 2.5193882, 'mse': 1157946.9, 'rmse': 1076.0793},\n",
              " {'mae': 575.82715, 'mape': 2.5713227, 'mse': 1181491.9, 'rmse': 1086.9645},\n",
              " {'mae': 624.65094, 'mape': 2.8559413, 'mse': 1279308.5, 'rmse': 1131.0652},\n",
              " {'mae': 673.8483, 'mape': 3.1560009, 'mse': 1401001.9, 'rmse': 1183.6393},\n",
              " {'mae': 661.0718, 'mape': 3.0542161, 'mse': 1343192.1, 'rmse': 1158.9617},\n",
              " {'mae': 583.37537, 'mape': 2.6407204, 'mse': 1204248.0, 'rmse': 1097.3823},\n",
              " {'mae': 701.84753, 'mape': 3.2888033, 'mse': 1443914.4, 'rmse': 1201.6299}]"
            ]
          },
          "metadata": {},
          "execution_count": 28
        }
      ]
    },
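    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The raw list is hard to scan, so it helps to collect the results into a DataFrame indexed by window size and look up the lowest MAE. A sketch using the first three result dicts from above (in the full run, window sizes 3 and 5 come out nearly tied at ~565.6 MAE):\n",
        "\n",
        "```python\n",
        "import pandas as pd\n",
        "\n",
        "# First few entries of model_results_list from the loop above\n",
        "model_results_list = [\n",
        "    {'mae': 569.15, 'mse': 1159396.5, 'rmse': 1076.75, 'mape': 2.53},\n",
        "    {'mae': 565.61, 'mse': 1139677.6, 'rmse': 1067.56, 'mape': 2.53},\n",
        "    {'mae': 623.45, 'mse': 1266272.0, 'rmse': 1125.29, 'mape': 2.83},\n",
        "]\n",
        "\n",
        "# The loop used window sizes 2, 3, 4, ... so index the rows accordingly\n",
        "results_df = pd.DataFrame(model_results_list,\n",
        "                          index=range(2, 2 + len(model_results_list)))\n",
        "best_window = results_df['mae'].idxmin()  # window size with the lowest MAE\n",
        "```"
      ]
    },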
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "wzH981vjL-fT"
      },
      "source": [
        "### 4. Create a windowed dataset just like the ones we used for model_1 using  [tf.keras.preprocessing.timeseries_dataset_from_array()](https://www.tensorflow.org/api_docs/python/tf/keras/preprocessing/timeseries_dataset_from_array)  and retrain model_1 using the recreated dataset."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "YcRI4RcIYnYE"
      },
      "source": [
        "WINDOW_SIZE = 7 \n",
        "HORIZON = 1"
      ],
      "execution_count": 29,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "zakDAvtCY32O"
      },
      "source": [
        "# Make the splits \n",
        "def make_train_test_splits(windows , labels , test_split = 0.2):\n",
        "  split_size = int(len(windows) * (1 - test_split))\n",
        "  train_windows = windows[:split_size]\n",
        "  train_labels = labels[:split_size]\n",
        "  test_windows = windows[split_size:]\n",
        "  test_labels = labels[split_size:]\n",
        "\n",
        "  return train_windows ,  test_windows ,train_labels,  test_labels"
      ],
      "execution_count": 30,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "ujv0D7izXNhn"
      },
      "source": [
        "# Offset targets by WINDOW_SIZE so each window is labelled with the value that comes after it\n",
        "ds = tf.keras.utils.timeseries_dataset_from_array(\n",
        "    data = prices[:-HORIZON] , targets = prices[WINDOW_SIZE:] , sequence_length = WINDOW_SIZE ,\n",
        "    batch_size = 128\n",
        ")"
      ],
      "execution_count": 31,
      "outputs": []
    },
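    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "One thing to double-check with `timeseries_dataset_from_array` is the window/target alignment: for forecasting, `targets` should be offset so a window of `WINDOW_SIZE` values is paired with the value that comes *after* it (the pattern in the Keras docs is `data[:-offset]` with `targets = data[offset:]`). A toy NumPy sketch of the intended alignment:\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "WINDOW_SIZE, HORIZON = 7, 1\n",
        "prices_toy = np.arange(20.0)  # stand-in for the real price series\n",
        "\n",
        "# Window i covers prices_toy[i : i + WINDOW_SIZE]; its label should be\n",
        "# prices_toy[i + WINDOW_SIZE + HORIZON - 1], i.e. HORIZON steps past the window\n",
        "first_window = prices_toy[0:WINDOW_SIZE]\n",
        "first_target = prices_toy[WINDOW_SIZE + HORIZON - 1]  # 7.0, not 0.0\n",
        "```"
      ]
    },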
    {
      "cell_type": "code",
      "metadata": {
        "id": "ha5sXVH-ctoF"
      },
      "source": [
        "# len(ds) counts batches, so this split happens at batch granularity\n",
        "train_size , test_size = int(0.8 * len(ds)) , int(0.2 * len(ds))"
      ],
      "execution_count": 32,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "ZlzXwk84cxp6"
      },
      "source": [
        "train_ds = ds.take(train_size)\n",
        "test_ds = ds.skip(train_size).take(test_size)"
      ],
      "execution_count": 33,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "AcTn-l1Idl3t",
        "outputId": "e1e10e0a-f448-4c7a-bb46-3d1cbe660688"
      },
      "source": [
        "for x , y in train_ds.take(1):\n",
        "  print(x[:2] , y[:2])"
      ],
      "execution_count": 34,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "tf.Tensor(\n",
            "[[123.65499 125.455   108.58483 118.67466 121.33866 120.65533 121.795  ]\n",
            " [125.455   108.58483 118.67466 121.33866 120.65533 121.795   123.033  ]], shape=(2, 7), dtype=float64) tf.Tensor([123.65499 125.455  ], shape=(2,), dtype=float64)\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "xuP5_htZduAu",
        "outputId": "35ef1976-60bb-4958-a04d-64a2eebad47d"
      },
      "source": [
        "for x , y in test_ds.take(1):\n",
        "  print(x[:2] , y[:2])"
      ],
      "execution_count": 35,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "tf.Tensor(\n",
            "[[10302.19071368 10301.65965169 10231.42151196 10168.28770938\n",
            "  10223.5055788  10138.33520522  9984.52051597]\n",
            " [10301.65965169 10231.42151196 10168.28770938 10223.5055788\n",
            "  10138.33520522  9984.52051597 10031.86670899]], shape=(2, 7), dtype=float64) tf.Tensor([10302.19071368 10301.65965169], shape=(2,), dtype=float64)\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "ieBAnXs0eJAy",
        "outputId": "75e09094-7676-49e3-bbb6-3947f345f1ef"
      },
      "source": [
        "# Rebuild the model_1 architecture, this time training on the tf.data windowed dataset\n",
        "tf.random.set_seed(42)\n",
        "\n",
        "# Building a simple dense model\n",
        "input = layers.Input(shape = (WINDOW_SIZE ,) , name = 'Input_layer' , dtype = tf.float32)\n",
        "x = layers.Dense(128 , activation= 'relu')(input)\n",
        "output = layers.Dense(HORIZON , activation= 'linear')(x)\n",
        "\n",
        "# Packing into a model \n",
        "model = tf.keras.Model(input , output)\n",
        "\n",
        "# Compiling the model \n",
        "model.compile(loss = 'mae' , \n",
        "                optimizer = tf.keras.optimizers.Adam() , \n",
        "                metrics = ['mae'])\n",
        "\n",
        "# Fit the model \n",
        "model.fit(train_ds ,\n",
        "          epochs = 100 , verbose = 0 , \n",
        "          validation_data = test_ds)"
      ],
      "execution_count": 36,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<keras.callbacks.History at 0x7f5f28e1cb50>"
            ]
          },
          "metadata": {},
          "execution_count": 36
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "uPBtfK2diAkV",
        "outputId": "473e6927-60ca-42e9-f48b-6b2b362e7870"
      },
      "source": [
        "# Evaluating the model on the test set\n",
        "model.evaluate(test_ds)"
      ],
      "execution_count": 37,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "4/4 [==============================] - 0s 19ms/step - loss: 643.4305 - mae: 643.4305\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "[643.4305419921875, 643.4305419921875]"
            ]
          },
          "metadata": {},
          "execution_count": 37
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "6GONHDDlhwKr"
      },
      "source": [
        "### 5. For our multivariate modelling experiment, we added the Bitcoin block reward size as an extra feature to make our time series multivariate.\n",
        "\n",
        "  * Are there any other features you think you could add?\n",
        "  * If so, try it out, how do these affect the model?"
      ]
    },
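    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "One low-cost option is to derive extra features from the price series itself, e.g. rolling statistics, daily returns, or calendar signals. A hedged sketch (the DataFrame below is a toy stand-in for the notebook's `Price` DataFrame, and the feature names are illustrative):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "import pandas as pd\n",
        "\n",
        "# Toy stand-in for the single-column Price DataFrame used in the notebook\n",
        "bitcoin_prices = pd.DataFrame(\n",
        "    {'Price': np.linspace(100.0, 130.0, 30)},\n",
        "    index=pd.date_range('2013-10-01', periods=30))\n",
        "\n",
        "features = bitcoin_prices.copy()\n",
        "features['rolling_mean_7'] = features['Price'].rolling(window=7).mean()  # weekly trend\n",
        "features['pct_change_1d'] = features['Price'].pct_change()               # daily return\n",
        "features['day_of_week'] = features.index.dayofweek                       # calendar signal\n",
        "\n",
        "# rolling() and pct_change() introduce leading NaNs, so drop them before modelling\n",
        "features = features.dropna()\n",
        "```"
      ]
    },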
    {
      "cell_type": "code",
      "metadata": {
        "id": "VA-51-MMxzNO",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 450
        },
        "outputId": "eba8a6bb-14ac-4a8b-bec4-5bfb8f5f25cf"
      },
      "source": [
        "df"
      ],
      "execution_count": 38,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>Currency</th>\n",
              "      <th>Closing Price (USD)</th>\n",
              "      <th>24h Open (USD)</th>\n",
              "      <th>24h High (USD)</th>\n",
              "      <th>24h Low (USD)</th>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>Date</th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>2013-10-01</th>\n",
              "      <td>BTC</td>\n",
              "      <td>123.654990</td>\n",
              "      <td>124.304660</td>\n",
              "      <td>124.751660</td>\n",
              "      <td>122.563490</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-02</th>\n",
              "      <td>BTC</td>\n",
              "      <td>125.455000</td>\n",
              "      <td>123.654990</td>\n",
              "      <td>125.758500</td>\n",
              "      <td>123.633830</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-03</th>\n",
              "      <td>BTC</td>\n",
              "      <td>108.584830</td>\n",
              "      <td>125.455000</td>\n",
              "      <td>125.665660</td>\n",
              "      <td>83.328330</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-04</th>\n",
              "      <td>BTC</td>\n",
              "      <td>118.674660</td>\n",
              "      <td>108.584830</td>\n",
              "      <td>118.675000</td>\n",
              "      <td>107.058160</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-05</th>\n",
              "      <td>BTC</td>\n",
              "      <td>121.338660</td>\n",
              "      <td>118.674660</td>\n",
              "      <td>121.936330</td>\n",
              "      <td>118.005660</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>...</th>\n",
              "      <td>...</td>\n",
              "      <td>...</td>\n",
              "      <td>...</td>\n",
              "      <td>...</td>\n",
              "      <td>...</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2021-05-14</th>\n",
              "      <td>BTC</td>\n",
              "      <td>49764.132082</td>\n",
              "      <td>49596.778891</td>\n",
              "      <td>51448.798576</td>\n",
              "      <td>46294.720180</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2021-05-15</th>\n",
              "      <td>BTC</td>\n",
              "      <td>50032.693137</td>\n",
              "      <td>49717.354353</td>\n",
              "      <td>51578.312545</td>\n",
              "      <td>48944.346536</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2021-05-16</th>\n",
              "      <td>BTC</td>\n",
              "      <td>47885.625255</td>\n",
              "      <td>49926.035067</td>\n",
              "      <td>50690.802950</td>\n",
              "      <td>47005.102292</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2021-05-17</th>\n",
              "      <td>BTC</td>\n",
              "      <td>45604.615754</td>\n",
              "      <td>46805.537852</td>\n",
              "      <td>49670.414174</td>\n",
              "      <td>43868.638969</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2021-05-18</th>\n",
              "      <td>BTC</td>\n",
              "      <td>43144.471291</td>\n",
              "      <td>46439.336570</td>\n",
              "      <td>46622.853437</td>\n",
              "      <td>42102.346430</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "<p>2787 rows × 5 columns</p>\n",
              "</div>"
            ],
            "text/plain": [
              "           Currency  Closing Price (USD)  ...  24h High (USD)  24h Low (USD)\n",
              "Date                                      ...                               \n",
              "2013-10-01      BTC           123.654990  ...      124.751660     122.563490\n",
              "2013-10-02      BTC           125.455000  ...      125.758500     123.633830\n",
              "2013-10-03      BTC           108.584830  ...      125.665660      83.328330\n",
              "2013-10-04      BTC           118.674660  ...      118.675000     107.058160\n",
              "2013-10-05      BTC           121.338660  ...      121.936330     118.005660\n",
              "...             ...                  ...  ...             ...            ...\n",
              "2021-05-14      BTC         49764.132082  ...    51448.798576   46294.720180\n",
              "2021-05-15      BTC         50032.693137  ...    51578.312545   48944.346536\n",
              "2021-05-16      BTC         47885.625255  ...    50690.802950   47005.102292\n",
              "2021-05-17      BTC         45604.615754  ...    49670.414174   43868.638969\n",
              "2021-05-18      BTC         43144.471291  ...    46622.853437   42102.346430\n",
              "\n",
              "[2787 rows x 5 columns]"
            ]
          },
          "metadata": {},
          "execution_count": 38
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "CMRA5-nVmPnh"
      },
      "source": [
        "import datetime "
      ],
      "execution_count": 39,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 390
        },
        "id": "VtYlinvCm8EQ",
        "outputId": "82e12cd1-7f74-4b6b-955c-2a26206cac60"
      },
      "source": [
        "# Creating a day of week feature \n",
        "df['day_of_week'] = df.index.dayofweek\n",
        "df.head(10)"
      ],
      "execution_count": 40,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>Currency</th>\n",
              "      <th>Closing Price (USD)</th>\n",
              "      <th>24h Open (USD)</th>\n",
              "      <th>24h High (USD)</th>\n",
              "      <th>24h Low (USD)</th>\n",
              "      <th>day_of_week</th>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>Date</th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "      <th></th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>2013-10-01</th>\n",
              "      <td>BTC</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>124.30466</td>\n",
              "      <td>124.75166</td>\n",
              "      <td>122.56349</td>\n",
              "      <td>1</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-02</th>\n",
              "      <td>BTC</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>123.65499</td>\n",
              "      <td>125.75850</td>\n",
              "      <td>123.63383</td>\n",
              "      <td>2</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-03</th>\n",
              "      <td>BTC</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>125.45500</td>\n",
              "      <td>125.66566</td>\n",
              "      <td>83.32833</td>\n",
              "      <td>3</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-04</th>\n",
              "      <td>BTC</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>108.58483</td>\n",
              "      <td>118.67500</td>\n",
              "      <td>107.05816</td>\n",
              "      <td>4</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-05</th>\n",
              "      <td>BTC</td>\n",
              "      <td>121.33866</td>\n",
              "      <td>118.67466</td>\n",
              "      <td>121.93633</td>\n",
              "      <td>118.00566</td>\n",
              "      <td>5</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-06</th>\n",
              "      <td>BTC</td>\n",
              "      <td>120.65533</td>\n",
              "      <td>121.33866</td>\n",
              "      <td>121.85216</td>\n",
              "      <td>120.55450</td>\n",
              "      <td>6</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-07</th>\n",
              "      <td>BTC</td>\n",
              "      <td>121.79500</td>\n",
              "      <td>120.65533</td>\n",
              "      <td>121.99166</td>\n",
              "      <td>120.43199</td>\n",
              "      <td>0</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-08</th>\n",
              "      <td>BTC</td>\n",
              "      <td>123.03300</td>\n",
              "      <td>121.79500</td>\n",
              "      <td>123.64016</td>\n",
              "      <td>121.35066</td>\n",
              "      <td>1</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-09</th>\n",
              "      <td>BTC</td>\n",
              "      <td>124.04900</td>\n",
              "      <td>123.03300</td>\n",
              "      <td>124.78350</td>\n",
              "      <td>122.59266</td>\n",
              "      <td>2</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2013-10-10</th>\n",
              "      <td>BTC</td>\n",
              "      <td>125.96116</td>\n",
              "      <td>124.04900</td>\n",
              "      <td>128.01683</td>\n",
              "      <td>123.81966</td>\n",
              "      <td>3</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "           Currency  Closing Price (USD)  ...  24h Low (USD)  day_of_week\n",
              "Date                                      ...                            \n",
              "2013-10-01      BTC            123.65499  ...      122.56349            1\n",
              "2013-10-02      BTC            125.45500  ...      123.63383            2\n",
              "2013-10-03      BTC            108.58483  ...       83.32833            3\n",
              "2013-10-04      BTC            118.67466  ...      107.05816            4\n",
              "2013-10-05      BTC            121.33866  ...      118.00566            5\n",
              "2013-10-06      BTC            120.65533  ...      120.55450            6\n",
              "2013-10-07      BTC            121.79500  ...      120.43199            0\n",
              "2013-10-08      BTC            123.03300  ...      121.35066            1\n",
              "2013-10-09      BTC            124.04900  ...      122.59266            2\n",
              "2013-10-10      BTC            125.96116  ...      123.81966            3\n",
              "\n",
              "[10 rows x 6 columns]"
            ]
          },
          "metadata": {},
          "execution_count": 40
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "ppVyHOd9pZUC"
      },
      "source": [
        "# Defining the hyperparameters \n",
        "HORIZON = 1 \n",
        "WINDOW_SIZE = 7 \n",
        "\n",
        "bitcoin_prices_windowed['day_of_week'] = bitcoin_prices_windowed.index.dayofweek"
      ],
      "execution_count": 41,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "Z6zN_LcoqCca"
      },
      "source": [
        "# Getting three kinds of data (univariate , multivariate and the day of week)\n",
        "\n",
        "# Univariate data \n",
        "full_windows , full_labels = make_windows_scaled(prices)\n",
        "train_windows , test_windows , train_labels , test_labels = make_train_test_splits(full_windows , full_labels)\n",
        "\n",
        "# Multivariate data \n",
        "X = bitcoin_prices_windowed.dropna().drop('Price' , axis = 1).astype(np.float32)\n",
        "X_scaled = scaler.fit_transform(X)\n",
        "y = bitcoin_prices_windowed.dropna()['Price'].astype(np.float32)\n",
        "\n",
        "# Day of week \n",
        "day_of_week = bitcoin_prices_windowed.dropna()['day_of_week'].to_list()"
      ],
      "execution_count": 42,
      "outputs": []
    },
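    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "For reference, a minimal sketch of what a scaled windowing helper like `make_windows_scaled` might look like (an illustrative assumption using min-max scaling and NumPy indexing, not necessarily the exact implementation defined earlier in the notebook):"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Hypothetical sketch of a windowing helper with min-max scaling (illustration only)\n",
        "def make_windows_scaled_sketch(x, window_size=7, horizon=1):\n",
        "  x = np.asarray(x, dtype=np.float64)\n",
        "  # Min-max scale the whole series into the [0, 1] range\n",
        "  x_scaled = (x - x.min()) / (x.max() - x.min())\n",
        "  # Build overlapping index windows of length window_size + horizon\n",
        "  idx = np.arange(len(x_scaled) - (window_size + horizon) + 1)[:, None] + np.arange(window_size + horizon)\n",
        "  windowed = x_scaled[idx]\n",
        "  # Split each row into (window, label)\n",
        "  return windowed[:, :-horizon], windowed[:, -horizon:]"
      ],
      "execution_count": null,
      "outputs": []
    },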
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "4YCJnEBEzj69",
        "outputId": "adf2e920-6c2d-4bb0-e802-6f92c370b31f"
      },
      "source": [
        "# Checking the shapes \n",
        "print(full_windows.shape , full_labels.shape)\n",
        "print(X.shape , y.shape)\n",
        "print(len(day_of_week))"
      ],
      "execution_count": 43,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "(2780, 7) (2780, 1)\n",
            "(2780, 9) (2780,)\n",
            "2780\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "CvJBEkZR4XVp",
        "outputId": "efef705a-8a90-4f06-eaeb-d9a6764d8491"
      },
      "source": [
        "# Splitting the multivariate data and day_of_week into train and test splits \n",
        "split_size = int(len(X) * 0.8)\n",
        "train_block_rewards , test_block_rewards = X[:split_size] , X[split_size:]\n",
        "train_days , test_days = day_of_week[:split_size] , day_of_week[split_size:]\n",
        " \n",
        "len(train_block_rewards), len(train_days) , len(test_block_rewards) , len(test_days)"
      ],
      "execution_count": 44,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "(2224, 2224, 556, 556)"
            ]
          },
          "metadata": {},
          "execution_count": 44
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "lQd1NpWSzke0",
        "outputId": "38713484-c851-4fd6-ce97-3c70506532ce"
      },
      "source": [
        "# Building a performant dataset for train and test \n",
        "\n",
        "train_data_tribid = tf.data.Dataset.from_tensor_slices((train_windows , \n",
        "                                                        train_block_rewards , \n",
        "                                                        train_days))\n",
        "\n",
        "train_labels_tribid = tf.data.Dataset.from_tensor_slices(train_labels)\n",
        "\n",
        "# The test/val split \n",
        "test_data_tribid = tf.data.Dataset.from_tensor_slices((test_windows , \n",
        "                                                       test_block_rewards , \n",
        "                                                       test_days))\n",
        "\n",
        "test_labels_tribid = tf.data.Dataset.from_tensor_slices(test_labels)\n",
        "\n",
        "# Zipping the data and labels into one complete dataset \n",
        "tribid_train_ds = tf.data.Dataset.zip((train_data_tribid , train_labels_tribid))\n",
        "tribid_test_ds = tf.data.Dataset.zip((test_data_tribid , test_labels_tribid))\n",
        "\n",
        "# Applying prefetch and batching the dataset \n",
        "tribid_train_ds = tribid_train_ds.batch(128).prefetch(tf.data.AUTOTUNE)\n",
        "tribid_test_ds = tribid_test_ds.batch(128).prefetch(tf.data.AUTOTUNE)\n",
        "\n",
        "tribid_train_ds ,tribid_test_ds"
      ],
      "execution_count": 45,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "(<PrefetchDataset shapes: (((None, 7), (None, 9), (None,)), (None, 1)), types: ((tf.float64, tf.float32, tf.int32), tf.float64)>,\n",
              " <PrefetchDataset shapes: (((None, 7), (None, 9), (None,)), (None, 1)), types: ((tf.float64, tf.float32, tf.int32), tf.float64)>)"
            ]
          },
          "metadata": {},
          "execution_count": 45
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "tyhukkUSzmu8",
        "outputId": "6c18723a-dbc1-49a5-d45c-38745a2586d3"
      },
      "source": [
        "# Building a tribid model \n",
        "\n",
        "input_windows = layers.Input(shape = (7,) , dtype=tf.float64 , name='Window Inputs')\n",
        "exp_layer_1 = layers.Lambda(lambda x: tf.expand_dims(x , axis = 1))(input_windows)\n",
        "conv1 = layers.Conv1D(filters= 32 , kernel_size=5 , padding='causal' , activation= 'relu')(exp_layer_1)\n",
        "window_model = tf.keras.Model(input_windows , conv1 , name = 'Windowed model')\n",
        "\n",
        "input_blocks = layers.Input(shape = (9,) , dtype= tf.float32 , name ='Block rewards input')\n",
        "exp_layer_2 = layers.Lambda(lambda x: tf.expand_dims(x , axis = 1))(input_blocks)\n",
        "conv2 = layers.Conv1D(filters = 32 , kernel_size= 5 , activation= 'relu' , padding = 'causal')(exp_layer_2)\n",
        "block_model = tf.keras.Model(input_blocks , conv2 , name = 'Block rewards model')\n",
        "\n",
        "\n",
        "# Use expand_dims so every branch outputs a rank-3 tensor (e.g. (None, 1, 128));\n",
        "# without it this dense branch would output (None, 128) and the concatenate would fail\n",
        "input_days = layers.Input(shape= (1,) , dtype = tf.int32 , name ='Days of week Input')\n",
        "exp_layer_3 = layers.Lambda(lambda x: tf.expand_dims(x , axis = 1))(input_days)\n",
        "dense = layers.Dense(128 , activation= 'relu')(exp_layer_3)\n",
        "days_model = tf.keras.Model(input_days , dense , name = 'Days Model')\n",
        "\n",
        "# Concatenating the inputs \n",
        "concat = layers.Concatenate(name = 'combined_outputs' )([window_model.output , \n",
        "                                                           block_model.output , \n",
        "                                                           days_model.output])\n",
        "\n",
        "# Creating the output layer \n",
        "dropout = layers.Dropout(0.4)(concat)\n",
        "output_layer = layers.Dense(1 , activation = 'linear')(dropout)\n",
        "\n",
        "# Putting everything into a model \n",
        "tribid_model = tf.keras.Model(inputs = [window_model.input , \n",
        "                                        block_model.input , \n",
        "                                        days_model.input] , \n",
        "                              outputs = output_layer)\n",
        "tribid_model.summary()\n",
        "\n"
      ],
      "execution_count": 46,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Model: \"model_1\"\n",
            "__________________________________________________________________________________________________\n",
            "Layer (type)                    Output Shape         Param #     Connected to                     \n",
            "==================================================================================================\n",
            "Window Inputs (InputLayer)      [(None, 7)]          0                                            \n",
            "__________________________________________________________________________________________________\n",
            "Block rewards input (InputLayer [(None, 9)]          0                                            \n",
            "__________________________________________________________________________________________________\n",
            "Days of week Input (InputLayer) [(None, 1)]          0                                            \n",
            "__________________________________________________________________________________________________\n",
            "lambda (Lambda)                 (None, 1, 7)         0           Window Inputs[0][0]              \n",
            "__________________________________________________________________________________________________\n",
            "lambda_1 (Lambda)               (None, 1, 9)         0           Block rewards input[0][0]        \n",
            "__________________________________________________________________________________________________\n",
            "lambda_2 (Lambda)               (None, 1, 1)         0           Days of week Input[0][0]         \n",
            "__________________________________________________________________________________________________\n",
            "conv1d (Conv1D)                 (None, 1, 32)        1152        lambda[0][0]                     \n",
            "__________________________________________________________________________________________________\n",
            "conv1d_1 (Conv1D)               (None, 1, 32)        1472        lambda_1[0][0]                   \n",
            "__________________________________________________________________________________________________\n",
            "dense_28 (Dense)                (None, 1, 128)       256         lambda_2[0][0]                   \n",
            "__________________________________________________________________________________________________\n",
            "combined_outputs (Concatenate)  (None, 1, 192)       0           conv1d[0][0]                     \n",
            "                                                                 conv1d_1[0][0]                   \n",
            "                                                                 dense_28[0][0]                   \n",
            "__________________________________________________________________________________________________\n",
            "dropout (Dropout)               (None, 1, 192)       0           combined_outputs[0][0]           \n",
            "__________________________________________________________________________________________________\n",
            "dense_29 (Dense)                (None, 1, 1)         193         dropout[0][0]                    \n",
            "==================================================================================================\n",
            "Total params: 3,073\n",
            "Trainable params: 3,073\n",
            "Non-trainable params: 0\n",
            "__________________________________________________________________________________________________\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "iDB8vFtu17Rp",
        "outputId": "82092496-c7eb-4ed1-9037-4a179895a4a5"
      },
      "source": [
        "# Compiling and fitting the model \n",
        "tribid_model.compile(loss = 'mae' , \n",
        "                     optimizer = 'adam' , metrics = ['mae'])\n",
        "\n",
        "# Fitting the model \n",
        "tribid_model.fit(tribid_train_ds , \n",
        "                 epochs = 20,  \n",
        "                 validation_data = tribid_test_ds , verbose = 2)"
      ],
      "execution_count": 47,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Epoch 1/20\n",
            "18/18 - 1s - loss: 173.1271 - mae: 173.1271 - val_loss: 124.0175 - val_mae: 124.0175\n",
            "Epoch 2/20\n",
            "18/18 - 0s - loss: 96.4324 - mae: 96.4324 - val_loss: 37.3843 - val_mae: 37.3843\n",
            "Epoch 3/20\n",
            "18/18 - 0s - loss: 58.4491 - mae: 58.4491 - val_loss: 40.1556 - val_mae: 40.1556\n",
            "Epoch 4/20\n",
            "18/18 - 0s - loss: 25.2752 - mae: 25.2752 - val_loss: 25.2996 - val_mae: 25.2996\n",
            "Epoch 5/20\n",
            "18/18 - 0s - loss: 6.4564 - mae: 6.4564 - val_loss: 7.1498 - val_mae: 7.1498\n",
            "Epoch 6/20\n",
            "18/18 - 0s - loss: 3.2152 - mae: 3.2152 - val_loss: 14.4171 - val_mae: 14.4171\n",
            "Epoch 7/20\n",
            "18/18 - 0s - loss: 1.3200 - mae: 1.3200 - val_loss: 5.2407 - val_mae: 5.2407\n",
            "Epoch 8/20\n",
            "18/18 - 0s - loss: 0.3744 - mae: 0.3744 - val_loss: 0.3289 - val_mae: 0.3289\n",
            "Epoch 9/20\n",
            "18/18 - 0s - loss: 0.2766 - mae: 0.2766 - val_loss: 0.3679 - val_mae: 0.3679\n",
            "Epoch 10/20\n",
            "18/18 - 0s - loss: 0.2502 - mae: 0.2502 - val_loss: 0.8294 - val_mae: 0.8294\n",
            "Epoch 11/20\n",
            "18/18 - 0s - loss: 0.4526 - mae: 0.4526 - val_loss: 1.4310 - val_mae: 1.4310\n",
            "Epoch 12/20\n",
            "18/18 - 0s - loss: 0.2831 - mae: 0.2831 - val_loss: 0.7019 - val_mae: 0.7019\n",
            "Epoch 13/20\n",
            "18/18 - 0s - loss: 0.2143 - mae: 0.2143 - val_loss: 0.2403 - val_mae: 0.2403\n",
            "Epoch 14/20\n",
            "18/18 - 0s - loss: 0.1896 - mae: 0.1896 - val_loss: 0.2570 - val_mae: 0.2570\n",
            "Epoch 15/20\n",
            "18/18 - 0s - loss: 0.1637 - mae: 0.1637 - val_loss: 0.2061 - val_mae: 0.2061\n",
            "Epoch 16/20\n",
            "18/18 - 0s - loss: 0.1283 - mae: 0.1283 - val_loss: 0.2460 - val_mae: 0.2460\n",
            "Epoch 17/20\n",
            "18/18 - 0s - loss: 0.1077 - mae: 0.1077 - val_loss: 0.1839 - val_mae: 0.1839\n",
            "Epoch 18/20\n",
            "18/18 - 0s - loss: 0.1213 - mae: 0.1213 - val_loss: 0.2423 - val_mae: 0.2423\n",
            "Epoch 19/20\n",
            "18/18 - 0s - loss: 0.0994 - mae: 0.0994 - val_loss: 0.2242 - val_mae: 0.2242\n",
            "Epoch 20/20\n",
            "18/18 - 0s - loss: 0.1007 - mae: 0.1007 - val_loss: 0.1563 - val_mae: 0.1563\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<keras.callbacks.History at 0x7f5f278da8d0>"
            ]
          },
          "metadata": {},
          "execution_count": 47
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "zzCKgn8598cB",
        "outputId": "fb07da76-d3fa-4e23-b27e-a4200f9fcfdd"
      },
      "source": [
        "# Evaluating the model \n",
        "tribid_model.evaluate(tribid_test_ds)"
      ],
      "execution_count": 48,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "5/5 [==============================] - 0s 3ms/step - loss: 0.1563 - mae: 0.1563\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "[0.1563473492860794, 0.1563473492860794]"
            ]
          },
          "metadata": {},
          "execution_count": 48
        }
      ]
    },
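    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Note that these windows and labels came from `make_windows_scaled`, so the MAE above is in scaled units rather than USD. To compare it against the unscaled models, it can be mapped back through the range of the original series. A rough sketch (assuming [0, 1] min-max scaling over `prices`):"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Rough conversion of a scaled-space MAE back to USD (assumes [0, 1] min-max scaling over `prices`)\n",
        "mae_scaled = 0.1563  # from the evaluation above\n",
        "mae_usd_approx = mae_scaled * (np.max(prices) - np.min(prices))\n",
        "mae_usd_approx"
      ],
      "execution_count": null,
      "outputs": []
    },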
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "uaqbrn-xnnjW"
      },
      "source": [
        "### 6. Make prediction intervals for future forecasts. To do so, one way would be to train an ensemble model on all of the data, make future forecasts with it and calculate the prediction intervals of the ensemble just like we did for model_8."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "EuGAuQZpovQ9"
      },
      "source": [
        "**Things to do**\n",
        "- Train an ensemble model on the whole dataset. \n",
        "- Make one dataset (no test/validation split) which will be used to forecast future Bitcoin prices. \n",
        "- Make a function that takes the number of iterations and the different loss functions to train the models with. \n"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "gS_NBQuqt5Bb",
        "outputId": "f3e09108-7264-4d1a-f860-f54e33df4ee7"
      },
      "source": [
        "# Make one whole dataset (with the updated bitcoin prices 2013 - 2021)\n",
        "\n",
        "X_all = bitcoin_prices_windowed.drop(['Price' , 'block_reward' , 'day_of_week'] , axis = 1).dropna().to_numpy()\n",
        "y_all = bitcoin_prices_windowed.dropna()['Price'].to_numpy()\n",
        "\n",
        "whole_ds = tf.data.Dataset.from_tensor_slices((X_all , y_all))\n",
        "whole_ds = whole_ds.batch(128).prefetch(tf.data.AUTOTUNE)\n",
        "whole_ds"
      ],
      "execution_count": 49,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<PrefetchDataset shapes: ((None, 7), (None,)), types: (tf.float64, tf.float64)>"
            ]
          },
          "metadata": {},
          "execution_count": 49
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "a5zZCOdSvBDX"
      },
      "source": [
        "# Function to build and train an ensemble of models \n",
        "\n",
        "def get_ensemble_models(horizon = HORIZON , \n",
        "                        dataset = whole_ds , \n",
        "                        num_iter = 10 , \n",
        "                        num_epochs = 100 , \n",
        "                        loss_fns = ['mae' , 'mse' , 'mape']):\n",
        "  \"\"\"Returns num_iter models per loss function in loss_fns, each trained on dataset for up to num_epochs epochs.\"\"\"\n",
        "\n",
        "  # Make an empty list to hold the ensemble models \n",
        "  ensemble_models = []\n",
        "\n",
        "  # Create num_iter models per loss function \n",
        "  for i in range(num_iter):\n",
        "    for loss_functions in loss_fns:\n",
        "      print(f'Optimizing model by reducing: {loss_functions} for {num_epochs} epochs, model number: {i}')\n",
        "\n",
        "      model = tf.keras.Sequential([\n",
        "          layers.Dense(128 , kernel_initializer='he_normal' , activation= 'relu'),\n",
        "          layers.Dense(128 , kernel_initializer= 'he_normal', activation= 'relu'),\n",
        "          layers.Dense(horizon)  # use the horizon argument rather than the global\n",
        "      ])\n",
        "\n",
        "      # Compiling the model \n",
        "      model.compile(loss = loss_functions , \n",
        "                    optimizer = 'adam' , metrics = ['mae' , 'mse'])\n",
        "      \n",
        "      # Fit the model \n",
        "      model.fit(dataset , \n",
        "                epochs = num_epochs , \n",
        "                verbose = 0,\n",
        "                callbacks=[tf.keras.callbacks.EarlyStopping(monitor=\"loss\",\n",
        "                                                            patience=200,\n",
        "                                                            restore_best_weights=True),\n",
        "                           tf.keras.callbacks.ReduceLROnPlateau(monitor=\"loss\",\n",
        "                                                                patience=100,\n",
        "                                                                verbose=1)])\n",
        "      \n",
        "      ensemble_models.append(model)\n",
        "\n",
        "  return ensemble_models"
      ],
      "execution_count": 50,
      "outputs": []
    },
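    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Once the ensemble models are trained, 95% prediction intervals can be computed from the spread of their predictions, just like for model_8 earlier. A minimal sketch (assuming `preds` stacks one row of predictions per ensemble member):"
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Sketch: 95% prediction intervals from the spread of ensemble predictions\n",
        "def get_prediction_intervals(preds, multiplier=1.96):\n",
        "  # preds: array-like of shape (num_models, num_samples)\n",
        "  preds = np.asarray(preds)\n",
        "  mean_preds = preds.mean(axis=0)\n",
        "  std_preds = preds.std(axis=0)\n",
        "  # +/- 1.96 standard deviations ~ 95% interval under a normal assumption\n",
        "  return mean_preds - multiplier * std_preds, mean_preds + multiplier * std_preds"
      ],
      "execution_count": null,
      "outputs": []
    },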
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "btGEQNfLvCXL",
        "outputId": "15ecb076-e294-4ed3-c55c-a7fa93214695"
      },
      "source": [
        "# Running the above function \n",
        "ensemble_models = get_ensemble_models(num_iter=5 , num_epochs= 1000 , horizon = 1)"
      ],
      "execution_count": 51,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Optimizing model by reducing: mae for 1000 epochs, model number: 0\n",
            "\n",
            "Epoch 00304: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00438: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mse for 1000 epochs, model number: 0\n",
            "\n",
            "Epoch 00138: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00421: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mape for 1000 epochs, model number: 0\n",
            "\n",
            "Epoch 00272: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00374: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "\n",
            "Epoch 00495: ReduceLROnPlateau reducing learning rate to 1.0000000656873453e-06.\n",
            "Optimizing model by reducing: mae for 1000 epochs, model number: 1\n",
            "\n",
            "Epoch 00156: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00323: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mse for 1000 epochs, model number: 1\n",
            "\n",
            "Epoch 00402: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00677: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mape for 1000 epochs, model number: 1\n",
            "\n",
            "Epoch 00388: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00489: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "\n",
            "Epoch 00590: ReduceLROnPlateau reducing learning rate to 1.0000000656873453e-06.\n",
            "Optimizing model by reducing: mae for 1000 epochs, model number: 2\n",
            "\n",
            "Epoch 00187: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00324: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mse for 1000 epochs, model number: 2\n",
            "\n",
            "Epoch 00218: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00597: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mape for 1000 epochs, model number: 2\n",
            "\n",
            "Epoch 00253: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00360: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "\n",
            "Epoch 00461: ReduceLROnPlateau reducing learning rate to 1.0000000656873453e-06.\n",
            "Optimizing model by reducing: mae for 1000 epochs, model number: 3\n",
            "\n",
            "Epoch 00101: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00256: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mse for 1000 epochs, model number: 3\n",
            "\n",
            "Epoch 00164: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00428: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mape for 1000 epochs, model number: 3\n",
            "\n",
            "Epoch 00218: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00320: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "\n",
            "Epoch 00421: ReduceLROnPlateau reducing learning rate to 1.0000000656873453e-06.\n",
            "\n",
            "Epoch 00525: ReduceLROnPlateau reducing learning rate to 1.0000001111620805e-07.\n",
            "\n",
            "Epoch 00662: ReduceLROnPlateau reducing learning rate to 1.000000082740371e-08.\n",
            "\n",
            "Epoch 00763: ReduceLROnPlateau reducing learning rate to 1.000000082740371e-09.\n",
            "\n",
            "Epoch 00863: ReduceLROnPlateau reducing learning rate to 1.000000082740371e-10.\n",
            "\n",
            "Epoch 00963: ReduceLROnPlateau reducing learning rate to 1.000000082740371e-11.\n",
            "Optimizing model by reducing: mae for 1000 epochs, model number: 4\n",
            "\n",
            "Epoch 00246: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00367: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mse for 1000 epochs, model number: 4\n",
            "\n",
            "Epoch 00249: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00532: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "Optimizing model by reducing: mape for 1000 epochs, model number: 4\n",
            "\n",
            "Epoch 00213: ReduceLROnPlateau reducing learning rate to 0.00010000000474974513.\n",
            "\n",
            "Epoch 00314: ReduceLROnPlateau reducing learning rate to 1.0000000474974514e-05.\n",
            "\n",
            "Epoch 00514: ReduceLROnPlateau reducing learning rate to 1.0000000656873453e-06.\n",
            "\n",
            "Epoch 00616: ReduceLROnPlateau reducing learning rate to 1.0000001111620805e-07.\n",
            "\n",
            "Epoch 00802: ReduceLROnPlateau reducing learning rate to 1.000000082740371e-08.\n",
            "\n",
            "Epoch 00903: ReduceLROnPlateau reducing learning rate to 1.000000082740371e-09.\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "jM4bW2EN__jH"
      },
      "source": [
        "# Make future forecasts of Bitcoin prices (using the whole dataset)\n",
        "def make_future_forecast(values, model_list, into_future, window_size):\n",
        "\n",
        "  future_forecast = []\n",
        "  last_window = values[-window_size:]\n",
        "\n",
        "  for _ in range(into_future):\n",
        "    # Average each ensemble member's prediction so every future step\n",
        "    # yields exactly one forecast value (appending per-model predictions\n",
        "    # would produce into_future * len(model_list) values instead)\n",
        "    preds = [tf.squeeze(model.predict(tf.expand_dims(last_window, axis=0))).numpy()\n",
        "             for model in model_list]\n",
        "    future_pred = np.mean(preds, axis=0)\n",
        "    print(f'Predicting on: \\n {last_window} --> Prediction: {future_pred}\\n')\n",
        "\n",
        "    future_forecast.append(future_pred)\n",
        "\n",
        "    # Update the last window with the newest prediction\n",
        "    last_window = np.append(last_window, future_pred)[-window_size:]\n",
        "  return future_forecast"
      ],
      "execution_count": 52,
      "outputs": []
    },
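    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Sketch: build the future timesteps the forecast corresponds to (pass in the\n",
        "# final date of the dataset, e.g. taken from the price DataFrame's index)\n",
        "def get_future_dates(start_date, into_future, offset=1):\n",
        "  start_date = start_date + np.timedelta64(offset, 'D')  # day after start_date\n",
        "  end_date = start_date + np.timedelta64(into_future, 'D')\n",
        "  return np.arange(start_date, end_date, dtype='datetime64[D]')"
      ],
      "execution_count": null,
      "outputs": []
    },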
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "SG-C8sGguG-o",
        "outputId": "57acbf21-893b-426c-f81a-43f9eb9818ee"
      },
      "source": [
        "# Get the future forecast\n",
        "future_forecast = make_future_forecast(y_all, ensemble_models, into_future=14, window_size=7)"
      ],
      "execution_count": 53,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Predicing on: \n",
            " [56573.5554719  52147.82118698 49764.1320816  50032.69313676\n",
            " 47885.62525472 45604.61575361 43144.47129086] --> Prediction: 56393.9140625\n",
            "\n",
            "Predicing on: \n",
            " [52147.82118698 49764.1320816  50032.69313676 47885.62525472\n",
            " 45604.61575361 43144.47129086 56393.9140625 ] --> Prediction: 54017.7734375\n",
            "\n",
            "Predicing on: \n",
            " [49764.1320816  50032.69313676 47885.62525472 45604.61575361\n",
            " 43144.47129086 56393.9140625  54017.7734375 ] --> Prediction: 47962.16796875\n",
            "\n",
            "WARNING:tensorflow:5 out of the last 22 calls to <function Model.make_predict_function.<locals>.predict_function at 0x7f5f22b9e4d0> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for  more details.\n",
            "Predicing on: \n",
            " [50032.69313676 47885.62525472 45604.61575361 43144.47129086\n",
            " 56393.9140625  54017.7734375  47962.16796875] --> Prediction: 50942.78125\n",
            "\n",
            "WARNING:tensorflow:6 out of the last 23 calls to <function Model.make_predict_function.<locals>.predict_function at 0x7f5f1d2d54d0> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for  more details.\n",
            "Predicing on: \n",
            " [47885.62525472 45604.61575361 43144.47129086 56393.9140625\n",
            " 54017.7734375  47962.16796875 50942.78125   ] --> Prediction: 46727.8046875\n",
            "\n",
            "Predicing on: \n",
            " [45604.61575361 43144.47129086 56393.9140625  54017.7734375\n",
            " 47962.16796875 50942.78125    46727.8046875 ] --> Prediction: 45861.671875\n",
            "\n",
            "Predicing on: \n",
            " [43144.47129086 56393.9140625  54017.7734375  47962.16796875\n",
            " 50942.78125    46727.8046875  45861.671875  ] --> Prediction: 44864.1015625\n",
            "\n",
            "Predicing on: \n",
            " [56393.9140625  54017.7734375  47962.16796875 50942.78125\n",
            " 46727.8046875  45861.671875   44864.1015625 ] --> Prediction: 56452.046875\n",
            "\n",
            "Predicing on: \n",
            " [54017.7734375  47962.16796875 50942.78125    46727.8046875\n",
            " 45861.671875   44864.1015625  56452.046875  ] --> Prediction: 51451.390625\n",
            "\n",
            "Predicing on: \n",
            " [47962.16796875 50942.78125    46727.8046875  45861.671875\n",
            " 44864.1015625  56452.046875   51451.390625  ] --> Prediction: 48863.578125\n",
            "\n",
            "Predicing on: \n",
            " [50942.78125   46727.8046875 45861.671875  44864.1015625 56452.046875\n",
            " 51451.390625  48863.578125 ] --> Prediction: 51207.015625\n",
            "\n",
            "Predicing on: \n",
            " [46727.8046875 45861.671875  44864.1015625 56452.046875  51451.390625\n",
            " 48863.578125  51207.015625 ] --> Prediction: 46731.8203125\n",
            "\n",
            "Predicing on: \n",
            " [45861.671875  44864.1015625 56452.046875  51451.390625  48863.578125\n",
            " 51207.015625  46731.8203125] --> Prediction: 46192.66015625\n",
            "\n",
            "Predicing on: \n",
            " [44864.1015625  56452.046875   51451.390625   48863.578125\n",
            " 51207.015625   46731.8203125  46192.66015625] --> Prediction: 47911.23046875\n",
            "\n",
            "Predicing on: \n",
            " [56452.046875   51451.390625   48863.578125   51207.015625\n",
            " 46731.8203125  46192.66015625 47911.23046875] --> Prediction: 56282.6171875\n",
            "\n",
            "Predicing on: \n",
            " [51451.390625   48863.578125   51207.015625   46731.8203125\n",
            " 46192.66015625 47911.23046875 56282.6171875 ] --> Prediction: 50123.953125\n",
            "\n",
            "Predicing on: \n",
            " [48863.578125   51207.015625   46731.8203125  46192.66015625\n",
            " 47911.23046875 56282.6171875  50123.953125  ] --> Prediction: 48003.26953125\n",
            "\n",
            "Predicing on: \n",
            " [51207.015625   46731.8203125  46192.66015625 47911.23046875\n",
            " 56282.6171875  50123.953125   48003.26953125] --> Prediction: 50785.48046875\n",
            "\n",
            "Predicing on: \n",
            " [46731.8203125  46192.66015625 47911.23046875 56282.6171875\n",
            " 50123.953125   48003.26953125 50785.48046875] --> Prediction: 46381.83984375\n",
            "\n",
            "Predicing on: \n",
            " [46192.66015625 47911.23046875 56282.6171875  50123.953125\n",
            " 48003.26953125 50785.48046875 46381.83984375] --> Prediction: 47145.140625\n",
            "\n",
            "Predicing on: \n",
            " [47911.23046875 56282.6171875  50123.953125   48003.26953125\n",
            " 50785.48046875 46381.83984375 47145.140625  ] --> Prediction: 48285.0234375\n",
            "\n",
            "Predicing on: \n",
            " [56282.6171875  50123.953125   48003.26953125 50785.48046875\n",
            " 46381.83984375 47145.140625   48285.0234375 ] --> Prediction: 55764.58203125\n",
            "\n",
            "Predicing on: \n",
            " [50123.953125   48003.26953125 50785.48046875 46381.83984375\n",
            " 47145.140625   48285.0234375  55764.58203125] --> Prediction: 51085.05078125\n",
            "\n",
            "Predicing on: \n",
            " [48003.26953125 50785.48046875 46381.83984375 47145.140625\n",
            " 48285.0234375  55764.58203125 51085.05078125] --> Prediction: 48172.80859375\n",
            "\n",
            "Predicing on: \n",
            " [50785.48046875 46381.83984375 47145.140625   48285.0234375\n",
            " 55764.58203125 51085.05078125 48172.80859375] --> Prediction: 50463.453125\n",
            "\n",
            "Predicing on: \n",
            " [46381.83984375 47145.140625   48285.0234375  55764.58203125\n",
            " 51085.05078125 48172.80859375 50463.453125  ] --> Prediction: 46155.3828125\n",
            "\n",
            "Predicing on: \n",
            " [47145.140625   48285.0234375  55764.58203125 51085.05078125\n",
            " 48172.80859375 50463.453125   46155.3828125 ] --> Prediction: 48031.484375\n",
            "\n",
            "Predicing on: \n",
            " [48285.0234375  55764.58203125 51085.05078125 48172.80859375\n",
            " 50463.453125   46155.3828125  48031.484375  ] --> Prediction: 49102.78125\n",
            "\n",
            "Predicing on: \n",
            " [55764.58203125 51085.05078125 48172.80859375 50463.453125\n",
            " 46155.3828125  48031.484375   49102.78125   ] --> Prediction: 56283.421875\n",
            "\n",
            "Predicing on: \n",
            " [51085.05078125 48172.80859375 50463.453125   46155.3828125\n",
            " 48031.484375   49102.78125    56283.421875  ] --> Prediction: 50323.9609375\n",
            "\n",
            "Predicing on: \n",
            " [48172.80859375 50463.453125   46155.3828125  48031.484375\n",
            " 49102.78125    56283.421875   50323.9609375 ] --> Prediction: 48732.71875\n",
            "\n",
            "Predicing on: \n",
            " [50463.453125  46155.3828125 48031.484375  49102.78125   56283.421875\n",
            " 50323.9609375 48732.71875  ] --> Prediction: 50227.4765625\n",
            "\n",
            "Predicing on: \n",
            " [46155.3828125 48031.484375  49102.78125   56283.421875  50323.9609375\n",
            " 48732.71875   50227.4765625] --> Prediction: 46310.8046875\n",
            "\n",
            "Predicing on: \n",
            " [48031.484375  49102.78125   56283.421875  50323.9609375 48732.71875\n",
            " 50227.4765625 46310.8046875] --> Prediction: 48371.89453125\n",
            "\n",
            "Predicing on: \n",
            " [49102.78125    56283.421875   50323.9609375  48732.71875\n",
            " 50227.4765625  46310.8046875  48371.89453125] --> Prediction: 50554.3203125\n",
            "\n",
            "Predicing on: \n",
            " [56283.421875   50323.9609375  48732.71875    50227.4765625\n",
            " 46310.8046875  48371.89453125 50554.3203125 ] --> Prediction: 55898.3359375\n",
            "\n",
            "Predicing on: \n",
            " [50323.9609375  48732.71875    50227.4765625  46310.8046875\n",
            " 48371.89453125 50554.3203125  55898.3359375 ] --> Prediction: 49544.0390625\n",
            "\n",
            "Predicing on: \n",
            " [48732.71875    50227.4765625  46310.8046875  48371.89453125\n",
            " 50554.3203125  55898.3359375  49544.0390625 ] --> Prediction: 49103.9296875\n",
            "\n",
            "Predicing on: \n",
            " [50227.4765625  46310.8046875  48371.89453125 50554.3203125\n",
            " 55898.3359375  49544.0390625  49103.9296875 ] --> Prediction: 49796.2421875\n",
            "\n",
            "Predicing on: \n",
            " [46310.8046875  48371.89453125 50554.3203125  55898.3359375\n",
            " 49544.0390625  49103.9296875  49796.2421875 ] --> Prediction: 45768.51171875\n",
            "\n",
            "Predicing on: \n",
            " [48371.89453125 50554.3203125  55898.3359375  49544.0390625\n",
            " 49103.9296875  49796.2421875  45768.51171875] --> Prediction: 48544.9296875\n",
            "\n",
            "Predicing on: \n",
            " [50554.3203125  55898.3359375  49544.0390625  49103.9296875\n",
            " 49796.2421875  45768.51171875 48544.9296875 ] --> Prediction: 50777.8671875\n",
            "\n",
            "Predicing on: \n",
            " [55898.3359375  49544.0390625  49103.9296875  49796.2421875\n",
            " 45768.51171875 48544.9296875  50777.8671875 ] --> Prediction: 54836.2734375\n",
            "\n",
            "Predicing on: \n",
            " [49544.0390625  49103.9296875  49796.2421875  45768.51171875\n",
            " 48544.9296875  50777.8671875  54836.2734375 ] --> Prediction: 49565.2890625\n",
            "\n",
            "Predicing on: \n",
            " [49103.9296875  49796.2421875  45768.51171875 48544.9296875\n",
            " 50777.8671875  54836.2734375  49565.2890625 ] --> Prediction: 49562.37890625\n",
            "\n",
            "Predicing on: \n",
            " [49796.2421875  45768.51171875 48544.9296875  50777.8671875\n",
            " 54836.2734375  49565.2890625  49562.37890625] --> Prediction: 49270.93359375\n",
            "\n",
            "Predicing on: \n",
            " [45768.51171875 48544.9296875  50777.8671875  54836.2734375\n",
            " 49565.2890625  49562.37890625 49270.93359375] --> Prediction: 45760.8046875\n",
            "\n",
            "Predicing on: \n",
            " [48544.9296875  50777.8671875  54836.2734375  49565.2890625\n",
            " 49562.37890625 49270.93359375 45760.8046875 ] --> Prediction: 49161.3828125\n",
            "\n",
            "Predicing on: \n",
            " [50777.8671875  54836.2734375  49565.2890625  49562.37890625\n",
            " 49270.93359375 45760.8046875  49161.3828125 ] --> Prediction: 51403.3359375\n",
            "\n",
            "Predicing on: \n",
            " [54836.2734375  49565.2890625  49562.37890625 49270.93359375\n",
            " 45760.8046875  49161.3828125  51403.3359375 ] --> Prediction: 54375.02734375\n",
            "\n",
            "Predicing on: \n",
            " [49565.2890625  49562.37890625 49270.93359375 45760.8046875\n",
            " 49161.3828125  51403.3359375  54375.02734375] --> Prediction: 48986.8203125\n",
            "\n",
            "Predicing on: \n",
            " [49562.37890625 49270.93359375 45760.8046875  49161.3828125\n",
            " 51403.3359375  54375.02734375 48986.8203125 ] --> Prediction: 50394.9140625\n",
            "\n",
            "Predicing on: \n",
            " [49270.93359375 45760.8046875  49161.3828125  51403.3359375\n",
            " 54375.02734375 48986.8203125  50394.9140625 ] --> Prediction: 48840.24609375\n",
            "\n",
            "Predicing on: \n",
            " [45760.8046875  49161.3828125  51403.3359375  54375.02734375\n",
            " 48986.8203125  50394.9140625  48840.24609375] --> Prediction: 46024.56640625\n",
            "\n",
            "Predicing on: \n",
            " [49161.3828125  51403.3359375  54375.02734375 48986.8203125\n",
            " 50394.9140625  48840.24609375 46024.56640625] --> Prediction: 49481.828125\n",
            "\n",
            "Predicing on: \n",
            " [51403.3359375  54375.02734375 48986.8203125  50394.9140625\n",
            " 48840.24609375 46024.56640625 49481.828125  ] --> Prediction: 51802.73046875\n",
            "\n",
            "Predicing on: \n",
            " [54375.02734375 48986.8203125  50394.9140625  48840.24609375\n",
            " 46024.56640625 49481.828125   51802.73046875] --> Prediction: 53703.80078125\n",
            "\n",
            "Predicing on: \n",
            " [48986.8203125  50394.9140625  48840.24609375 46024.56640625\n",
            " 49481.828125   51802.73046875 53703.80078125] --> Prediction: 48709.10546875\n",
            "\n",
            "Predicing on: \n",
            " [50394.9140625  48840.24609375 46024.56640625 49481.828125\n",
            " 51802.73046875 53703.80078125 48709.10546875] --> Prediction: 50423.18359375\n",
            "\n",
            "Predicing on: \n",
            " [48840.24609375 46024.56640625 49481.828125   51802.73046875\n",
            " 53703.80078125 48709.10546875 50423.18359375] --> Prediction: 48579.90234375\n",
            "\n",
            "Predicing on: \n",
            " [46024.56640625 49481.828125   51802.73046875 53703.80078125\n",
            " 48709.10546875 50423.18359375 48579.90234375] --> Prediction: 46652.44921875\n",
            "\n",
            "Predicing on: \n",
            " [49481.828125   51802.73046875 53703.80078125 48709.10546875\n",
            " 50423.18359375 48579.90234375 46652.44921875] --> Prediction: 50358.8125\n",
            "\n",
            "Predicing on: \n",
            " [51802.73046875 53703.80078125 48709.10546875 50423.18359375\n",
            " 48579.90234375 46652.44921875 50358.8125    ] --> Prediction: 52127.98828125\n",
            "\n",
            "Predicing on: \n",
            " [53703.80078125 48709.10546875 50423.18359375 48579.90234375\n",
            " 46652.44921875 50358.8125     52127.98828125] --> Prediction: 53378.9140625\n",
            "\n",
            "Predicing on: \n",
            " [48709.10546875 50423.18359375 48579.90234375 46652.44921875\n",
            " 50358.8125     52127.98828125 53378.9140625 ] --> Prediction: 49651.31640625\n",
            "\n",
            "Predicing on: \n",
            " [50423.18359375 48579.90234375 46652.44921875 50358.8125\n",
            " 52127.98828125 53378.9140625  49651.31640625] --> Prediction: 50193.2734375\n",
            "\n",
            "Predicing on: \n",
            " [48579.90234375 46652.44921875 50358.8125     52127.98828125\n",
            " 53378.9140625  49651.31640625 50193.2734375 ] --> Prediction: 47909.78125\n",
            "\n",
            "Predicing on: \n",
            " [46652.44921875 50358.8125     52127.98828125 53378.9140625\n",
            " 49651.31640625 50193.2734375  47909.78125   ] --> Prediction: 46968.25\n",
            "\n",
            "Predicing on: \n",
            " [50358.8125     52127.98828125 53378.9140625  49651.31640625\n",
            " 50193.2734375  47909.78125    46968.25      ] --> Prediction: 50580.0703125\n",
            "\n",
            "Predicing on: \n",
            " [52127.98828125 53378.9140625  49651.31640625 50193.2734375\n",
            " 47909.78125    46968.25       50580.0703125 ] --> Prediction: 52521.796875\n",
            "\n",
            "Predicing on: \n",
            " [53378.9140625  49651.31640625 50193.2734375  47909.78125\n",
            " 46968.25       50580.0703125  52521.796875  ] --> Prediction: 53131.28515625\n",
            "\n",
            "Predicing on: \n",
            " [49651.31640625 50193.2734375  47909.78125    46968.25\n",
            " 50580.0703125  52521.796875   53131.28515625] --> Prediction: 49628.73046875\n",
            "\n",
            "Predicing on: \n",
            " [50193.2734375  47909.78125    46968.25       50580.0703125\n",
            " 52521.796875   53131.28515625 49628.73046875] --> Prediction: 50439.19140625\n",
            "\n",
            "Predicing on: \n",
            " [47909.78125    46968.25       50580.0703125  52521.796875\n",
            " 53131.28515625 49628.73046875 50439.19140625] --> Prediction: 47225.40625\n",
            "\n",
            "Predicing on: \n",
            " [46968.25       50580.0703125  52521.796875   53131.28515625\n",
            " 49628.73046875 50439.19140625 47225.40625   ] --> Prediction: 47181.16015625\n",
            "\n",
            "Predicing on: \n",
            " [50580.0703125  52521.796875   53131.28515625 49628.73046875\n",
            " 50439.19140625 47225.40625    47181.16015625] --> Prediction: 50794.50390625\n",
            "\n",
            "Predicing on: \n",
            " [52521.796875   53131.28515625 49628.73046875 50439.19140625\n",
            " 47225.40625    47181.16015625 50794.50390625] --> Prediction: 53307.56640625\n",
            "\n",
            "Predicing on: \n",
            " [53131.28515625 49628.73046875 50439.19140625 47225.40625\n",
            " 47181.16015625 50794.50390625 53307.56640625] --> Prediction: 52264.91015625\n",
            "\n",
            "Predicing on: \n",
            " [49628.73046875 50439.19140625 47225.40625    47181.16015625\n",
            " 50794.50390625 53307.56640625 52264.91015625] --> Prediction: 49707.08203125\n",
            "\n",
            "Predicing on: \n",
            " [50439.19140625 47225.40625    47181.16015625 50794.50390625\n",
            " 53307.56640625 52264.91015625 49707.08203125] --> Prediction: 49579.13671875\n",
            "\n",
            "Predicing on: \n",
            " [47225.40625    47181.16015625 50794.50390625 53307.56640625\n",
            " 52264.91015625 49707.08203125 49579.13671875] --> Prediction: 47048.64453125\n",
            "\n",
            "Predicing on: \n",
            " [47181.16015625 50794.50390625 53307.56640625 52264.91015625\n",
            " 49707.08203125 49579.13671875 47048.64453125] --> Prediction: 47663.24609375\n",
            "\n",
            "Predicing on: \n",
            " [50794.50390625 53307.56640625 52264.91015625 49707.08203125\n",
            " 49579.13671875 47048.64453125 47663.24609375] --> Prediction: 51271.14453125\n",
            "\n",
            "Predicing on: \n",
            " [53307.56640625 52264.91015625 49707.08203125 49579.13671875\n",
            " 47048.64453125 47663.24609375 51271.14453125] --> Prediction: 53245.5625\n",
            "\n",
            "Predicing on: \n",
            " [52264.91015625 49707.08203125 49579.13671875 47048.64453125\n",
            " 47663.24609375 51271.14453125 53245.5625    ] --> Prediction: 52062.46484375\n",
            "\n",
            "Predicing on: \n",
            " [49707.08203125 49579.13671875 47048.64453125 47663.24609375\n",
            " 51271.14453125 53245.5625     52062.46484375] --> Prediction: 50269.35546875\n",
            "\n",
            "Predicing on: \n",
            " [49579.13671875 47048.64453125 47663.24609375 51271.14453125\n",
            " 53245.5625     52062.46484375 50269.35546875] --> Prediction: 49249.23046875\n",
            "\n",
            "Predicing on: \n",
            " [47048.64453125 47663.24609375 51271.14453125 53245.5625\n",
            " 52062.46484375 50269.35546875 49249.23046875] --> Prediction: 46215.859375\n",
            "\n",
            "Predicing on: \n",
            " [47663.24609375 51271.14453125 53245.5625     52062.46484375\n",
            " 50269.35546875 49249.23046875 46215.859375  ] --> Prediction: 47993.95703125\n",
            "\n",
            "Predicing on: \n",
            " [51271.14453125 53245.5625     52062.46484375 50269.35546875\n",
            " 49249.23046875 46215.859375   47993.95703125] --> Prediction: 51396.5625\n",
            "\n",
            "Predicing on: \n",
            " [53245.5625     52062.46484375 50269.35546875 49249.23046875\n",
            " 46215.859375   47993.95703125 51396.5625    ] --> Prediction: 53075.09765625\n",
            "\n",
            "Predicing on: \n",
            " [52062.46484375 50269.35546875 49249.23046875 46215.859375\n",
            " 47993.95703125 51396.5625     53075.09765625] --> Prediction: 51671.6953125\n",
            "\n",
            "Predicing on: \n",
            " [50269.35546875 49249.23046875 46215.859375   47993.95703125\n",
            " 51396.5625     53075.09765625 51671.6953125 ] --> Prediction: 49988.4453125\n",
            "\n",
            "Predicing on: \n",
            " [49249.23046875 46215.859375   47993.95703125 51396.5625\n",
            " 53075.09765625 51671.6953125  49988.4453125 ] --> Prediction: 48931.1953125\n",
            "\n",
            "Predicing on: \n",
            " [46215.859375   47993.95703125 51396.5625     53075.09765625\n",
            " 51671.6953125  49988.4453125  48931.1953125 ] --> Prediction: 45492.1171875\n",
            "\n",
            "Predicing on: \n",
            " [47993.95703125 51396.5625     53075.09765625 51671.6953125\n",
            " 49988.4453125  48931.1953125  45492.1171875 ] --> Prediction: 48078.91015625\n",
            "\n",
            "Predicing on: \n",
            " [51396.5625     53075.09765625 51671.6953125  49988.4453125\n",
            " 48931.1953125  45492.1171875  48078.91015625] --> Prediction: 51667.48046875\n",
            "\n",
            "Predicing on: \n",
            " [53075.09765625 51671.6953125  49988.4453125  48931.1953125\n",
            " 45492.1171875  48078.91015625 51667.48046875] --> Prediction: 54053.19140625\n",
            "\n",
            "Predicing on: \n",
            " [51671.6953125  49988.4453125  48931.1953125  45492.1171875\n",
            " 48078.91015625 51667.48046875 54053.19140625] --> Prediction: 51349.953125\n",
            "\n",
            "Predicing on: \n",
            " [49988.4453125  48931.1953125  45492.1171875  48078.91015625\n",
            " 51667.48046875 54053.19140625 51349.953125  ] --> Prediction: 50194.8203125\n",
            "\n",
            "Predicing on: \n",
            " [48931.1953125  45492.1171875  48078.91015625 51667.48046875\n",
            " 54053.19140625 51349.953125   50194.8203125 ] --> Prediction: 48904.59375\n",
            "\n",
            "Predicing on: \n",
            " [45492.1171875  48078.91015625 51667.48046875 54053.19140625\n",
            " 51349.953125   50194.8203125  48904.59375   ] --> Prediction: 45459.26953125\n",
            "\n",
            "Predicing on: \n",
            " [48078.91015625 51667.48046875 54053.19140625 51349.953125\n",
            " 50194.8203125  48904.59375    45459.26953125] --> Prediction: 48047.63671875\n",
            "\n",
            "Predicing on: \n",
            " [51667.48046875 54053.19140625 51349.953125   50194.8203125\n",
            " 48904.59375    45459.26953125 48047.63671875] --> Prediction: 52612.19140625\n",
            "\n",
            "Predicing on: \n",
            " [54053.19140625 51349.953125   50194.8203125  48904.59375\n",
            " 45459.26953125 48047.63671875 52612.19140625] --> Prediction: 53871.13671875\n",
            "\n",
            "Predicing on: \n",
            " [51349.953125   50194.8203125  48904.59375    45459.26953125\n",
            " 48047.63671875 52612.19140625 53871.13671875] --> Prediction: 50329.83984375\n",
            "\n",
            "Predicing on: \n",
            " [50194.8203125  48904.59375    45459.26953125 48047.63671875\n",
            " 52612.19140625 53871.13671875 50329.83984375] --> Prediction: 49263.125\n",
            "\n",
            "Predicing on: \n",
            " [48904.59375    45459.26953125 48047.63671875 52612.19140625\n",
            " 53871.13671875 50329.83984375 49263.125     ] --> Prediction: 48462.9296875\n",
            "\n",
            "Predicing on: \n",
            " [45459.26953125 48047.63671875 52612.19140625 53871.13671875\n",
            " 50329.83984375 49263.125      48462.9296875 ] --> Prediction: 45159.40625\n",
            "\n",
            "Predicing on: \n",
            " [48047.63671875 52612.19140625 53871.13671875 50329.83984375\n",
            " 49263.125      48462.9296875  45159.40625   ] --> Prediction: 48913.47265625\n",
            "\n",
            "Predicing on: \n",
            " [52612.19140625 53871.13671875 50329.83984375 49263.125\n",
            " 48462.9296875  45159.40625    48913.47265625] --> Prediction: 52578.26171875\n",
            "\n",
            "Predicing on: \n",
            " [53871.13671875 50329.83984375 49263.125      48462.9296875\n",
            " 45159.40625    48913.47265625 52578.26171875] --> Prediction: 53219.37890625\n",
            "\n",
            "Predicing on: \n",
            " [50329.83984375 49263.125      48462.9296875  45159.40625\n",
            " 48913.47265625 52578.26171875 53219.37890625] --> Prediction: 50905.95703125\n",
            "\n",
            "Predicing on: \n",
            " [49263.125      48462.9296875  45159.40625    48913.47265625\n",
            " 52578.26171875 53219.37890625 50905.95703125] --> Prediction: 49239.21875\n",
            "\n",
            "Predicing on: \n",
            " [48462.9296875  45159.40625    48913.47265625 52578.26171875\n",
            " 53219.37890625 50905.95703125 49239.21875   ] --> Prediction: 47584.8125\n",
            "\n",
            "Predicing on: \n",
            " [45159.40625    48913.47265625 52578.26171875 53219.37890625\n",
            " 50905.95703125 49239.21875    47584.8125    ] --> Prediction: 45006.06640625\n",
            "\n",
            "Predicing on: \n",
            " [48913.47265625 52578.26171875 53219.37890625 50905.95703125\n",
            " 49239.21875    47584.8125     45006.06640625] --> Prediction: 49145.5625\n",
            "\n",
            "Predicing on: \n",
            " [52578.26171875 53219.37890625 50905.95703125 49239.21875\n",
            " 47584.8125     45006.06640625 49145.5625    ] --> Prediction: 52564.890625\n",
            "\n",
            "Predicing on: \n",
            " [53219.37890625 50905.95703125 49239.21875    47584.8125\n",
            " 45006.06640625 49145.5625     52564.890625  ] --> Prediction: 53952.734375\n",
            "\n",
            "Predicing on: \n",
            " [50905.95703125 49239.21875    47584.8125     45006.06640625\n",
            " 49145.5625     52564.890625   53952.734375  ] --> Prediction: 50242.4140625\n",
            "\n",
            "Predicing on: \n",
            " [49239.21875    47584.8125     45006.06640625 49145.5625\n",
            " 52564.890625   53952.734375   50242.4140625 ] --> Prediction: 48975.546875\n",
            "\n",
            "Predicing on: \n",
            " [47584.8125     45006.06640625 49145.5625     52564.890625\n",
            " 53952.734375   50242.4140625  48975.546875  ] --> Prediction: 46822.66015625\n",
            "\n",
            "Predicing on: \n",
            " [45006.06640625 49145.5625     52564.890625   53952.734375\n",
            " 50242.4140625  48975.546875   46822.66015625] --> Prediction: 44942.71484375\n",
            "\n",
            "Predicing on: \n",
            " [49145.5625     52564.890625   53952.734375   50242.4140625\n",
            " 48975.546875   46822.66015625 44942.71484375] --> Prediction: 49489.5546875\n",
            "\n",
            "Predicing on: \n",
            " [52564.890625   53952.734375   50242.4140625  48975.546875\n",
            " 46822.66015625 44942.71484375 49489.5546875 ] --> Prediction: 53447.06640625\n",
            "\n",
            "Predicing on: \n",
            " [53952.734375   50242.4140625  48975.546875   46822.66015625\n",
            " 44942.71484375 49489.5546875  53447.06640625] --> Prediction: 52523.6015625\n",
            "\n",
            "Predicing on: \n",
            " [50242.4140625  48975.546875   46822.66015625 44942.71484375\n",
            " 49489.5546875  53447.06640625 52523.6015625 ] --> Prediction: 49754.44921875\n",
            "\n",
            "Predicing on: \n",
            " [48975.546875   46822.66015625 44942.71484375 49489.5546875\n",
            " 53447.06640625 52523.6015625  49754.44921875] --> Prediction: 48846.37109375\n",
            "\n",
            "Predicing on: \n",
            " [46822.66015625 44942.71484375 49489.5546875  53447.06640625\n",
            " 52523.6015625  49754.44921875 48846.37109375] --> Prediction: 46489.92578125\n",
            "\n",
            "Predicing on: \n",
            " [44942.71484375 49489.5546875  53447.06640625 52523.6015625\n",
            " 49754.44921875 48846.37109375 46489.92578125] --> Prediction: 45251.1640625\n",
            "\n",
            "Predicing on: \n",
            " [49489.5546875  53447.06640625 52523.6015625  49754.44921875\n",
            " 48846.37109375 46489.92578125 45251.1640625 ] --> Prediction: 50144.66015625\n",
            "\n",
            "Predicing on: \n",
            " [53447.06640625 52523.6015625  49754.44921875 48846.37109375\n",
            " 46489.92578125 45251.1640625  50144.66015625] --> Prediction: 53215.52734375\n",
            "\n",
            "Predicing on: \n",
            " [52523.6015625  49754.44921875 48846.37109375 46489.92578125\n",
            " 45251.1640625  50144.66015625 53215.52734375] --> Prediction: 51586.09765625\n",
            "\n",
            "Predicing on: \n",
            " [49754.44921875 48846.37109375 46489.92578125 45251.1640625\n",
            " 50144.66015625 53215.52734375 51586.09765625] --> Prediction: 49344.63671875\n",
            "\n",
            "Predicing on: \n",
            " [48846.37109375 46489.92578125 45251.1640625  50144.66015625\n",
            " 53215.52734375 51586.09765625 49344.63671875] --> Prediction: 48984.8984375\n",
            "\n",
            "Predicing on: \n",
            " [46489.92578125 45251.1640625  50144.66015625 53215.52734375\n",
            " 51586.09765625 49344.63671875 48984.8984375 ] --> Prediction: 45793.5078125\n",
            "\n",
            "Predicing on: \n",
            " [45251.1640625  50144.66015625 53215.52734375 51586.09765625\n",
            " 49344.63671875 48984.8984375  45793.5078125 ] --> Prediction: 45716.91796875\n",
            "\n",
            "Predicing on: \n",
            " [50144.66015625 53215.52734375 51586.09765625 49344.63671875\n",
            " 48984.8984375  45793.5078125  45716.91796875] --> Prediction: 50511.25\n",
            "\n",
            "Predicing on: \n",
            " [53215.52734375 51586.09765625 49344.63671875 48984.8984375\n",
            " 45793.5078125  45716.91796875 50511.25      ] --> Prediction: 53188.9375\n",
            "\n",
            "Predicing on: \n",
            " [51586.09765625 49344.63671875 48984.8984375  45793.5078125\n",
            " 45716.91796875 50511.25       53188.9375    ] --> Prediction: 51838.49609375\n",
            "\n",
            "Predicing on: \n",
            " [49344.63671875 48984.8984375  45793.5078125  45716.91796875\n",
            " 50511.25       53188.9375     51838.49609375] --> Prediction: 48733.3515625\n",
            "\n",
            "Predicing on: \n",
            " [48984.8984375  45793.5078125  45716.91796875 50511.25\n",
            " 53188.9375     51838.49609375 48733.3515625 ] --> Prediction: 48638.0\n",
            "\n",
            "Predicing on: \n",
            " [45793.5078125  45716.91796875 50511.25       53188.9375\n",
            " 51838.49609375 48733.3515625  48638.        ] --> Prediction: 45680.10546875\n",
            "\n",
            "Predicing on: \n",
            " [45716.91796875 50511.25       53188.9375     51838.49609375\n",
            " 48733.3515625  48638.         45680.10546875] --> Prediction: 46366.92578125\n",
            "\n",
            "Predicing on: \n",
            " [50511.25       53188.9375     51838.49609375 48733.3515625\n",
            " 48638.         45680.10546875 46366.92578125] --> Prediction: 51320.046875\n",
            "\n",
            "Predicing on: \n",
            " [53188.9375     51838.49609375 48733.3515625  48638.\n",
            " 45680.10546875 46366.92578125 51320.046875  ] --> Prediction: 53418.25390625\n",
            "\n",
            "Predicing on: \n",
            " [51838.49609375 48733.3515625  48638.         45680.10546875\n",
            " 46366.92578125 51320.046875   53418.25390625] --> Prediction: 50952.79296875\n",
            "\n",
            "Predicing on: \n",
            " [48733.3515625  48638.         45680.10546875 46366.92578125\n",
            " 51320.046875   53418.25390625 50952.79296875] --> Prediction: 48775.328125\n",
            "\n",
            "Predicing on: \n",
            " [48638.         45680.10546875 46366.92578125 51320.046875\n",
            " 53418.25390625 50952.79296875 48775.328125  ] --> Prediction: 48139.6328125\n",
            "\n",
            "Predicing on: \n",
            " [45680.10546875 46366.92578125 51320.046875   53418.25390625\n",
            " 50952.79296875 48775.328125   48139.6328125 ] --> Prediction: 45625.76171875\n",
            "\n",
            "Predicing on: \n",
            " [46366.92578125 51320.046875   53418.25390625 50952.79296875\n",
            " 48775.328125   48139.6328125  45625.76171875] --> Prediction: 47295.2578125\n",
            "\n",
            "Predicing on: \n",
            " [51320.046875   53418.25390625 50952.79296875 48775.328125\n",
            " 48139.6328125  45625.76171875 47295.2578125 ] --> Prediction: 52069.08984375\n",
            "\n",
            "Predicing on: \n",
            " [53418.25390625 50952.79296875 48775.328125   48139.6328125\n",
            " 45625.76171875 47295.2578125  52069.08984375] --> Prediction: 52844.3046875\n",
            "\n",
            "Predicing on: \n",
            " [50952.79296875 48775.328125   48139.6328125  45625.76171875\n",
            " 47295.2578125  52069.08984375 52844.3046875 ] --> Prediction: 50847.17578125\n",
            "\n",
            "Predicing on: \n",
            " [48775.328125   48139.6328125  45625.76171875 47295.2578125\n",
            " 52069.08984375 52844.3046875  50847.17578125] --> Prediction: 49179.71484375\n",
            "\n",
            "Predicing on: \n",
            " [48139.6328125  45625.76171875 47295.2578125  52069.08984375\n",
            " 52844.3046875  50847.17578125 49179.71484375] --> Prediction: 48073.609375\n",
            "\n",
            "Predicing on: \n",
            " [45625.76171875 47295.2578125  52069.08984375 52844.3046875\n",
            " 50847.17578125 49179.71484375 48073.609375  ] --> Prediction: 45678.48046875\n",
            "\n",
            "Predicing on: \n",
            " [47295.2578125  52069.08984375 52844.3046875  50847.17578125\n",
            " 49179.71484375 48073.609375   45678.48046875] --> Prediction: 47512.57421875\n",
            "\n",
            "Predicing on: \n",
            " [52069.08984375 52844.3046875  50847.17578125 49179.71484375\n",
            " 48073.609375   45678.48046875 47512.57421875] --> Prediction: 52238.828125\n",
            "\n",
            "Predicing on: \n",
            " [52844.3046875  50847.17578125 49179.71484375 48073.609375\n",
            " 45678.48046875 47512.57421875 52238.828125  ] --> Prediction: 52444.38671875\n",
            "\n",
            "Predicing on: \n",
            " [50847.17578125 49179.71484375 48073.609375   45678.48046875\n",
            " 47512.57421875 52238.828125   52444.38671875] --> Prediction: 51246.140625\n",
            "\n",
            "Predicing on: \n",
            " [49179.71484375 48073.609375   45678.48046875 47512.57421875\n",
            " 52238.828125   52444.38671875 51246.140625  ] --> Prediction: 49045.04296875\n",
            "\n",
            "Predicing on: \n",
            " [48073.609375   45678.48046875 47512.57421875 52238.828125\n",
            " 52444.38671875 51246.140625   49045.04296875] --> Prediction: 48060.79296875\n",
            "\n",
            "Predicing on: \n",
            " [45678.48046875 47512.57421875 52238.828125   52444.38671875\n",
            " 51246.140625   49045.04296875 48060.79296875] --> Prediction: 45472.32421875\n",
            "\n",
            "Predicing on: \n",
            " [47512.57421875 52238.828125   52444.38671875 51246.140625\n",
            " 49045.04296875 48060.79296875 45472.32421875] --> Prediction: 47774.2734375\n",
            "\n",
            "Predicing on: \n",
            " [52238.828125   52444.38671875 51246.140625   49045.04296875\n",
            " 48060.79296875 45472.32421875 47774.2734375 ] --> Prediction: 52372.5234375\n",
            "\n",
            "Predicing on: \n",
            " [52444.38671875 51246.140625   49045.04296875 48060.79296875\n",
            " 45472.32421875 47774.2734375  52372.5234375 ] --> Prediction: 52800.2421875\n",
            "\n",
            "Predicing on: \n",
            " [51246.140625   49045.04296875 48060.79296875 45472.32421875\n",
            " 47774.2734375  52372.5234375  52800.2421875 ] --> Prediction: 50242.47265625\n",
            "\n",
            "Predicing on: \n",
            " [49045.04296875 48060.79296875 45472.32421875 47774.2734375\n",
            " 52372.5234375  52800.2421875  50242.47265625] --> Prediction: 48985.69140625\n",
            "\n",
            "Predicing on: \n",
            " [48060.79296875 45472.32421875 47774.2734375  52372.5234375\n",
            " 52800.2421875  50242.47265625 48985.69140625] --> Prediction: 46709.421875\n",
            "\n",
            "Predicing on: \n",
            " [45472.32421875 47774.2734375  52372.5234375  52800.2421875\n",
            " 50242.47265625 48985.69140625 46709.421875  ] --> Prediction: 45365.37890625\n",
            "\n",
            "Predicing on: \n",
            " [47774.2734375  52372.5234375  52800.2421875  50242.47265625\n",
            " 48985.69140625 46709.421875   45365.37890625] --> Prediction: 48454.53515625\n",
            "\n",
            "Predicing on: \n",
            " [52372.5234375  52800.2421875  50242.47265625 48985.69140625\n",
            " 46709.421875   45365.37890625 48454.53515625] --> Prediction: 53123.41015625\n",
            "\n",
            "Predicing on: \n",
            " [52800.2421875  50242.47265625 48985.69140625 46709.421875\n",
            " 45365.37890625 48454.53515625 53123.41015625] --> Prediction: 52275.11328125\n",
            "\n",
            "Predicing on: \n",
            " [50242.47265625 48985.69140625 46709.421875   45365.37890625\n",
            " 48454.53515625 53123.41015625 52275.11328125] --> Prediction: 50708.56640625\n",
            "\n",
            "Predicing on: \n",
            " [48985.69140625 46709.421875   45365.37890625 48454.53515625\n",
            " 53123.41015625 52275.11328125 50708.56640625] --> Prediction: 48689.56640625\n",
            "\n",
            "Predicing on: \n",
            " [46709.421875   45365.37890625 48454.53515625 53123.41015625\n",
            " 52275.11328125 50708.56640625 48689.56640625] --> Prediction: 46196.12109375\n",
            "\n",
            "Predicing on: \n",
            " [45365.37890625 48454.53515625 53123.41015625 52275.11328125\n",
            " 50708.56640625 48689.56640625 46196.12109375] --> Prediction: 45003.68359375\n",
            "\n",
            "Predicing on: \n",
            " [48454.53515625 53123.41015625 52275.11328125 50708.56640625\n",
            " 48689.56640625 46196.12109375 45003.68359375] --> Prediction: 49021.66015625\n",
            "\n",
            "Predicing on: \n",
            " [53123.41015625 52275.11328125 50708.56640625 48689.56640625\n",
            " 46196.12109375 45003.68359375 49021.66015625] --> Prediction: 53285.6484375\n",
            "\n",
            "Predicing on: \n",
            " [52275.11328125 50708.56640625 48689.56640625 46196.12109375\n",
            " 45003.68359375 49021.66015625 53285.6484375 ] --> Prediction: 51089.66796875\n",
            "\n",
            "Predicing on: \n",
            " [50708.56640625 48689.56640625 46196.12109375 45003.68359375\n",
            " 49021.66015625 53285.6484375  51089.66796875] --> Prediction: 49677.890625\n",
            "\n",
            "Predicing on: \n",
            " [48689.56640625 46196.12109375 45003.68359375 49021.66015625\n",
            " 53285.6484375  51089.66796875 49677.890625  ] --> Prediction: 48275.91015625\n",
            "\n",
            "Predicing on: \n",
            " [46196.12109375 45003.68359375 49021.66015625 53285.6484375\n",
            " 51089.66796875 49677.890625   48275.91015625] --> Prediction: 45380.921875\n",
            "\n",
            "Predicing on: \n",
            " [45003.68359375 49021.66015625 53285.6484375  51089.66796875\n",
            " 49677.890625   48275.91015625 45380.921875  ] --> Prediction: 45107.1640625\n",
            "\n",
            "Predicing on: \n",
            " [49021.66015625 53285.6484375  51089.66796875 49677.890625\n",
            " 48275.91015625 45380.921875   45107.1640625 ] --> Prediction: 49099.34375\n",
            "\n",
            "Predicing on: \n",
            " [53285.6484375  51089.66796875 49677.890625   48275.91015625\n",
            " 45380.921875   45107.1640625  49099.34375   ] --> Prediction: 53578.875\n",
            "\n",
            "Predicing on: \n",
            " [51089.66796875 49677.890625   48275.91015625 45380.921875\n",
            " 45107.1640625  49099.34375    53578.875     ] --> Prediction: 52384.38671875\n",
            "\n",
            "Predicing on: \n",
            " [49677.890625   48275.91015625 45380.921875   45107.1640625\n",
            " 49099.34375    53578.875      52384.38671875] --> Prediction: 49584.265625\n",
            "\n",
            "Predicing on: \n",
            " [48275.91015625 45380.921875   45107.1640625  49099.34375\n",
            " 53578.875      52384.38671875 49584.265625  ] --> Prediction: 47745.578125\n",
            "\n",
            "Predicing on: \n",
            " [45380.921875   45107.1640625  49099.34375    53578.875\n",
            " 52384.38671875 49584.265625   47745.578125  ] --> Prediction: 45102.48046875\n",
            "\n",
            "Predicing on: \n",
            " [45107.1640625  49099.34375    53578.875      52384.38671875\n",
            " 49584.265625   47745.578125   45102.48046875] --> Prediction: 45771.96875\n",
            "\n",
            "Predicing on: \n",
            " [49099.34375    53578.875      52384.38671875 49584.265625\n",
            " 47745.578125   45102.48046875 45771.96875   ] --> Prediction: 49859.85546875\n",
            "\n",
            "Predicing on: \n",
            " [53578.875      52384.38671875 49584.265625   47745.578125\n",
            " 45102.48046875 45771.96875    49859.85546875] --> Prediction: 55240.80859375\n",
            "\n",
            "Predicing on: \n",
            " [52384.38671875 49584.265625   47745.578125   45102.48046875\n",
            " 45771.96875    49859.85546875 55240.80859375] --> Prediction: 51379.4296875\n",
            "\n",
            "Predicing on: \n",
            " [49584.265625   47745.578125   45102.48046875 45771.96875\n",
            " 49859.85546875 55240.80859375 51379.4296875 ] --> Prediction: 49260.76953125\n",
            "\n",
            "Predicing on: \n",
            " [47745.578125   45102.48046875 45771.96875    49859.85546875\n",
            " 55240.80859375 51379.4296875  49260.76953125] --> Prediction: 47109.81640625\n",
            "\n",
            "Predicing on: \n",
            " [45102.48046875 45771.96875    49859.85546875 55240.80859375\n",
            " 51379.4296875  49260.76953125 47109.81640625] --> Prediction: 45137.8046875\n",
            "\n",
            "Predicing on: \n",
            " [45771.96875    49859.85546875 55240.80859375 51379.4296875\n",
            " 49260.76953125 47109.81640625 45137.8046875 ] --> Prediction: 46479.46875\n",
            "\n",
            "Predicing on: \n",
            " [49859.85546875 55240.80859375 51379.4296875  49260.76953125\n",
            " 47109.81640625 45137.8046875  46479.46875   ] --> Prediction: 51108.62109375\n",
            "\n",
            "Predicing on: \n",
            " [55240.80859375 51379.4296875  49260.76953125 47109.81640625\n",
            " 45137.8046875  46479.46875    51108.62109375] --> Prediction: 54624.4375\n",
            "\n",
            "Predicing on: \n",
            " [51379.4296875  49260.76953125 47109.81640625 45137.8046875\n",
            " 46479.46875    51108.62109375 54624.4375    ] --> Prediction: 49828.84375\n",
            "\n",
            "Predicing on: \n",
            " [49260.76953125 47109.81640625 45137.8046875  46479.46875\n",
            " 51108.62109375 54624.4375     49828.84375   ] --> Prediction: 49544.62109375\n",
            "\n",
            "Predicing on: \n",
            " [47109.81640625 45137.8046875  46479.46875    51108.62109375\n",
            " 54624.4375     49828.84375    49544.62109375] --> Prediction: 47059.84375\n",
            "\n",
            "Predicing on: \n",
            " [45137.8046875  46479.46875    51108.62109375 54624.4375\n",
            " 49828.84375    49544.62109375 47059.84375   ] --> Prediction: 44754.6640625\n",
            "\n",
            "Predicing on: \n",
            " [46479.46875    51108.62109375 54624.4375     49828.84375\n",
            " 49544.62109375 47059.84375    44754.6640625 ] --> Prediction: 46963.6953125\n",
            "\n",
            "Predicing on: \n",
            " [51108.62109375 54624.4375     49828.84375    49544.62109375\n",
            " 47059.84375    44754.6640625  46963.6953125 ] --> Prediction: 51266.640625\n",
            "\n",
            "Predicing on: \n",
            " [54624.4375     49828.84375    49544.62109375 47059.84375\n",
            " 44754.6640625  46963.6953125  51266.640625  ] --> Prediction: 53232.18359375\n",
            "\n",
            "Predicing on: \n",
            " [49828.84375    49544.62109375 47059.84375    44754.6640625\n",
            " 46963.6953125  51266.640625   53232.18359375] --> Prediction: 49981.76171875\n",
            "\n",
            "Predicing on: \n",
            " [49544.62109375 47059.84375    44754.6640625  46963.6953125\n",
            " 51266.640625   53232.18359375 49981.76171875] --> Prediction: 49550.04296875\n",
            "\n"
          ]
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "LZvOcaWdyPg3"
      },
      "source": [
        "### 7. For future predictions, try to make a prediction, retrain a model on the predictions, make a prediction, retrain a model, make a prediction, retrain a model, make a prediction (retrain a model each time a new prediction is made). Plot the results, how do they look compared to the future predictions where a model wasn’t retrained for every forecast (model_9)?\n"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "8QPiaQIUwqB7"
      },
      "source": [
        "\n",
        "\n",
        "```\n",
        "- append pred to list -> train new model\n",
        "- make a pred on new list (with new model)\n",
        "- append new pred to list -> train new model\n",
        "- make a pred on new list (with new model)\n",
        "- append new pred to list -> train new model\n",
        "- make a pred on new list (with new model)\n",
        "- #7 is train a new model every single time a pred is made\n",
        "\n",
        "\n",
        "```"
      ]
    },
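    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The predict -> append -> retrain loop above can be sketched on synthetic data (a minimal, hypothetical example: `make_model` and the tiny series are stand-ins for the notebook's actual model and Bitcoin prices):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "series = list(np.arange(20, dtype=float))  # stand-in for the price series\n",
        "preds = []\n",
        "\n",
        "def make_model(y):\n",
        "    # Stand-in for retraining a fresh model on the updated data:\n",
        "    # here the \"model\" is just the mean daily change, so the loop stays runnable\n",
        "    return float(np.mean(np.diff(y)))\n",
        "\n",
        "for _ in range(5):\n",
        "    step = make_model(np.array(series))  # \"retrain\" on the updated series\n",
        "    pred = series[-1] + step             # make a prediction with the new model\n",
        "    preds.append(pred)\n",
        "    series.append(pred)                  # append pred -> retrain next iteration\n",
        "\n",
        "print(preds)  # [20.0, 21.0, 22.0, 23.0, 24.0]\n",
        "```"
      ]
    },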
    {
      "cell_type": "code",
      "metadata": {
        "id": "FD2tfR7Y-BWG"
      },
      "source": [
        "# Lets code things at first without a loop and see how it foes \n",
        "HORIZON = 1 \n",
        "WINDOW_SIZE = 7 \n",
        "\n",
        "# Building a model (You can replace it with any model)\n",
        "def get_model(horizon = HORIZON):\n",
        "    model = tf.keras.Sequential([\n",
        "            layers.Dense(128 , activation = 'relu'), \n",
        "            layers.Dense(128 , activation = 'relu'), \n",
        "            layers.Dense(horizon)\n",
        "        ])\n",
        "    \n",
        "    model.compile(loss = tf.keras.losses.mae , \n",
        "                  optimizer = tf.keras.optimizers.Adam())\n",
        "    \n",
        "\n",
        "    return model"
      ],
      "execution_count": 72,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "kzHOR23b-CkE"
      },
      "source": [
        "#. Making the data and labels for window size of 7 and horizon of 1 \n",
        "full_windows , full_labels = make_windows(prices , window_size= WINDOW_SIZE , horizon= HORIZON)"
      ],
      "execution_count": 153,
      "outputs": []
    },
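    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "`make_windows` comes from earlier in the notebook. Its behaviour, inferred from how it's used here (turn a 1D series into `(window, label)` pairs), can be sketched with plain NumPy; `windows_and_labels` below is a hypothetical stand-in, not the notebook's implementation:\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "def windows_and_labels(x, window_size=7, horizon=1):\n",
        "    # Index every length-(window_size + horizon) slice of x at once\n",
        "    n = len(x) - (window_size + horizon) + 1\n",
        "    idx = np.arange(n)[:, None] + np.arange(window_size + horizon)\n",
        "    windowed = x[idx]\n",
        "    return windowed[:, :-horizon], windowed[:, -horizon:]\n",
        "\n",
        "x = np.arange(10, dtype=float)\n",
        "windows, labels = windows_and_labels(x)\n",
        "print(windows.shape, labels.shape)  # (3, 7) (3, 1)\n",
        "print(windows[0], labels[0])        # [0. 1. 2. 3. 4. 5. 6.] [7.]\n",
        "```"
      ]
    },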
    {
      "cell_type": "code",
      "metadata": {
        "id": "uedqTuLr-Cs1"
      },
      "source": [
        "# Making future forecastts of Bitcoins (using the whole data)\n",
        "def pred_model_run(values , X, model , into_future , window_size  , horizon, epochs ):\n",
        "\n",
        "  '''\n",
        "  This function train a model for every updated predictions. \n",
        "\n",
        "  Arguments:\n",
        "  ----------\n",
        "      - values --> labels / truth values. Bitcoin prices \n",
        "      - X --> Windowed data of the bitcoin prices (default window size is 7)\n",
        "      - model --> compiled model with default horizon 1 \n",
        "      - into_future -->  how many time steps to predict in the future? \n",
        "      - window_size --> default is 7 (using the 7 days prices of bitcoin)\n",
        "      - horizon --> default is 1 (predicting the price of next day)\n",
        "\n",
        "  Returns: \n",
        "  --------\n",
        "      - model --> a model that has been trained on all the previous predictions + the data\n",
        "  '''\n",
        "\n",
        "  last_window = values[-window_size:]\n",
        "  X_all = X\n",
        "  y_all = values\n",
        "  for _ in range(into_future): \n",
        "\n",
        "      # Each time the model is trained for 5 epochs with the updated data\n",
        "      model.fit(x = X_all , y = y_all , epochs = epochs , verbose = 0)\n",
        "\n",
        "      future_pred = model.predict(tf.expand_dims(last_window, axis= 0))\n",
        "      #future_pred = model.predict(last_window)\n",
        "      print(f'Predicing on: \\n {last_window} --> Prediction: {tf.squeeze(future_pred).numpy()}\\n')\n",
        "\n",
        "      future_forecast.append(tf.squeeze(future_pred).numpy())\n",
        "      #values = np.append(values , tf.squeeze(future_pred).numpy())\n",
        "      for i in range(0 , len(X_all)):\n",
        "        x = X_all[i][1:]  # removing the 0th index of the X window ()\n",
        "        y = y_all[1:] # removing the 0th index  of y \n",
        "        X = np.append(x , future_pred) # append the future pred at last to X window\n",
        "        values = np.append(y , future_pred) # appending the future pred to y \n",
        "\n",
        "      # Update the last window \n",
        "      last_window = np.append(last_window , future_pred)[-window_size:]\n",
        "\n",
        "\n",
        "  return model "
      ],
      "execution_count": 164,
      "outputs": []
    },
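    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The rolling `last_window` update at the end of the loop relies on `np.append` flattening its inputs, so a `model.predict`-style `(1, 1)` output can be tacked onto a 1D window. A quick check of that step in isolation (the numbers are made up):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "window_size = 7\n",
        "last_window = np.arange(7, dtype=float)  # [0. 1. 2. 3. 4. 5. 6.]\n",
        "future_pred = np.array([[42.0]])         # shape (1, 1), like model.predict output\n",
        "\n",
        "# np.append flattens both arrays, then we keep the most recent window_size values\n",
        "last_window = np.append(last_window, future_pred)[-window_size:]\n",
        "print(last_window)  # [ 1.  2.  3.  4.  5.  6. 42.]\n",
        "```"
      ]
    },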
    {
      "cell_type": "code",
      "metadata": {
        "id": "v8TqNQNcOfWX",
        "outputId": "cd05699e-2989-4cb1-f4e4-6f5d42e7d9a6",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "source": [
        "full_windows.shape , X_all.shape , full_labels.shape , y_all.shape"
      ],
      "execution_count": 163,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "((2780, 7), (2780, 7), (2780, 1), (2780,))"
            ]
          },
          "metadata": {},
          "execution_count": 163
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "ov3W2gc_A6-1",
        "outputId": "51bd96d6-4e54-44bb-ed08-752d68eff0e4",
        "colab": {
          "base_uri": "https://localhost:8080/"
        }
      },
      "source": [
        "# Using the above function \n",
        "trained_model = pred_model_run(values = tf.squeeze(full_labels) , \n",
        "                               X = full_windows , \n",
        "                               model = get_model(horizon = 1) , \n",
        "                               window_size = WINDOW_SIZE , \n",
        "                               horizon = HORIZON , \n",
        "                               epochs = 10 , \n",
        "                               into_future  =14 )"
      ],
      "execution_count": 167,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Predicing on: \n",
            " [56573.5554719  52147.82118698 49764.1320816  50032.69313676\n",
            " 47885.62525472 45604.61575361 43144.47129086] --> Prediction: 42488.6875\n",
            "\n",
            "Predicing on: \n",
            " [52147.82118698 49764.1320816  50032.69313676 47885.62525472\n",
            " 45604.61575361 43144.47129086 42488.6875    ] --> Prediction: 41630.0234375\n",
            "\n",
            "Predicing on: \n",
            " [49764.1320816  50032.69313676 47885.62525472 45604.61575361\n",
            " 43144.47129086 42488.6875     41630.0234375 ] --> Prediction: 40474.16015625\n",
            "\n",
            "Predicing on: \n",
            " [50032.69313676 47885.62525472 45604.61575361 43144.47129086\n",
            " 42488.6875     41630.0234375  40474.16015625] --> Prediction: 40302.421875\n",
            "\n",
            "Predicing on: \n",
            " [47885.62525472 45604.61575361 43144.47129086 42488.6875\n",
            " 41630.0234375  40474.16015625 40302.421875  ] --> Prediction: 39411.90625\n",
            "\n",
            "Predicing on: \n",
            " [45604.61575361 43144.47129086 42488.6875     41630.0234375\n",
            " 40474.16015625 40302.421875   39411.90625   ] --> Prediction: 38208.4921875\n",
            "\n",
            "Predicing on: \n",
            " [43144.47129086 42488.6875     41630.0234375  40474.16015625\n",
            " 40302.421875   39411.90625    38208.4921875 ] --> Prediction: 37553.8046875\n",
            "\n",
            "Predicting on: \n",
            " [42488.6875     41630.0234375  40474.16015625 40302.421875\n",
            " 39411.90625    38208.4921875  37553.8046875 ] --> Prediction: 37455.140625\n",
            "\n",
            "Predicting on: \n",
            " [41630.0234375  40474.16015625 40302.421875   39411.90625\n",
            " 38208.4921875  37553.8046875  37455.140625  ] --> Prediction: 36594.33203125\n",
            "\n",
            "Predicting on: \n",
            " [40474.16015625 40302.421875   39411.90625    38208.4921875\n",
            " 37553.8046875  37455.140625   36594.33203125] --> Prediction: 36659.265625\n",
            "\n",
            "Predicting on: \n",
            " [40302.421875   39411.90625    38208.4921875  37553.8046875\n",
            " 37455.140625   36594.33203125 36659.265625  ] --> Prediction: 35935.08984375\n",
            "\n",
            "Predicting on: \n",
            " [39411.90625    38208.4921875  37553.8046875  37455.140625\n",
            " 36594.33203125 36659.265625   35935.08984375] --> Prediction: 35489.00390625\n",
            "\n",
            "Predicting on: \n",
            " [38208.4921875  37553.8046875  37455.140625   36594.33203125\n",
            " 36659.265625   35935.08984375 35489.00390625] --> Prediction: 35369.76171875\n",
            "\n",
            "Predicting on: \n",
            " [37553.8046875  37455.140625   36594.33203125 36659.265625\n",
            " 35935.08984375 35489.00390625 35369.76171875] --> Prediction: 35315.53515625\n",
            "\n"
          ]
        }
      ]
    },
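    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The output above shows the autoregressive logic: each new prediction is appended to the window and the oldest value drops off. A minimal sketch of that loop (assuming `model` is any fitted model with a `predict` method and `values` is a 1D array of prices; the function and argument names here are illustrative, not the course's exact code):\n",
        "\n",
        "```python\n",
        "import numpy as np\n",
        "\n",
        "def make_future_forecast(values, model, into_future, window_size):\n",
        "    \"\"\"Autoregressively predict into_future steps past the end of values.\"\"\"\n",
        "    future_forecast = []\n",
        "    last_window = values[-window_size:]  # start from the most recent window\n",
        "    for _ in range(into_future):\n",
        "        pred = model.predict(np.expand_dims(last_window, axis=0))\n",
        "        future_forecast.append(np.squeeze(pred))\n",
        "        # Slide the window: drop the oldest value, append the new prediction\n",
        "        last_window = np.append(last_window, future_forecast[-1])[-window_size:]\n",
        "    return future_forecast\n",
        "```"
      ]
    },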
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "6eo6JOrd-Dme"
      },
      "source": [
        "### 8. Throughout this notebook, we’ve only tried algorithms we’ve handcrafted ourselves. But it’s worth seeing how a purpose-built forecasting algorithm performs.\n",
        "\n",
        "* Try out one of the extra algorithms listed in the modelling experiments part such as:\n",
        "\t*  [Facebook’s Kats library](https://github.com/facebookresearch/Kats)  - there are many models in here; remember the machine learning practitioner’s motto: experiment, experiment, experiment.\n",
        "\t*  [LinkedIn’s Greykite library](https://github.com/linkedin/greykite) \n"
      ]
    },
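    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "One way a Kats experiment could look end-to-end (a sketch, assuming Kats is installed and using the `bitcoin_prices_updated` DataFrame from earlier; `ProphetParams`/`ProphetModel` live in `kats.models.prophet`):\n",
        "\n",
        "```python\n",
        "from kats.consts import TimeSeriesData\n",
        "from kats.models.prophet import ProphetModel, ProphetParams\n",
        "\n",
        "# Wrap the price series in Kats' TimeSeriesData container\n",
        "ts = TimeSeriesData(time=bitcoin_prices_updated.index,\n",
        "                    value=bitcoin_prices_updated.Price)\n",
        "\n",
        "# Fit a Prophet model and forecast 30 days past the end of the data\n",
        "params = ProphetParams(seasonality_mode=\"multiplicative\")\n",
        "prophet_model = ProphetModel(ts, params)\n",
        "prophet_model.fit()\n",
        "forecast = prophet_model.predict(steps=30, freq=\"D\")\n",
        "```"
      ]
    },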
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "Rngm_dIP-Exz",
        "outputId": "8d2f3bce-aad1-40f0-b937-442e532ec9eb"
      },
      "source": [
        "# Install Facebook's Kats library\n",
        "!pip install kats"
      ],
      "execution_count": 54,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "Collecting kats\n",
            "  Downloading kats-0.1.0.tar.gz (6.3 MB)\n",
            "\u001b[K     |████████████████████████████████| 6.3 MB 4.5 MB/s \n",
            "\u001b[?25hRequirement already satisfied: LunarCalendar>=0.0.9 in /usr/local/lib/python3.7/dist-packages (from kats) (0.0.9)\n",
            "Requirement already satisfied: attrs>=21.2.0 in /usr/local/lib/python3.7/dist-packages (from kats) (21.2.0)\n",
            "Collecting ax-platform\n",
            "  Downloading ax_platform-0.2.1-py3-none-any.whl (844 kB)\n",
            "\u001b[K     |████████████████████████████████| 844 kB 62.2 MB/s \n",
            "\u001b[?25hRequirement already satisfied: convertdate>=2.1.2 in /usr/local/lib/python3.7/dist-packages (from kats) (2.3.2)\n",
            "Collecting fbprophet==0.7\n",
            "  Downloading fbprophet-0.7.tar.gz (64 kB)\n",
            "\u001b[K     |████████████████████████████████| 64 kB 2.7 MB/s \n",
            "\u001b[?25hCollecting gpytorch\n",
            "  Downloading gpytorch-1.5.1-py2.py3-none-any.whl (503 kB)\n",
            "\u001b[K     |████████████████████████████████| 503 kB 67.2 MB/s \n",
            "\u001b[?25hRequirement already satisfied: holidays>=0.10.2 in /usr/local/lib/python3.7/dist-packages (from kats) (0.10.5.2)\n",
            "Requirement already satisfied: matplotlib>=2.0.0 in /usr/local/lib/python3.7/dist-packages (from kats) (3.2.2)\n",
            "Requirement already satisfied: numpy>=1.15.4 in /usr/local/lib/python3.7/dist-packages (from kats) (1.19.5)\n",
            "Collecting numba>=0.52.0\n",
            "  Downloading numba-0.54.0-2-cp37-cp37m-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.4 MB)\n",
            "\u001b[K     |████████████████████████████████| 3.4 MB 37.0 MB/s \n",
            "\u001b[?25hRequirement already satisfied: pandas>=1.0.4 in /usr/local/lib/python3.7/dist-packages (from kats) (1.1.5)\n",
            "Collecting plotly>=4.14.3\n",
            "  Downloading plotly-5.3.1-py2.py3-none-any.whl (23.9 MB)\n",
            "\u001b[K     |████████████████████████████████| 23.9 MB 13 kB/s \n",
            "\u001b[?25hCollecting pymannkendall>=1.4.1\n",
            "  Downloading pymannkendall-1.4.2-py3-none-any.whl (12 kB)\n",
            "Requirement already satisfied: pystan==2.19.1.1 in /usr/local/lib/python3.7/dist-packages (from kats) (2.19.1.1)\n",
            "Requirement already satisfied: python-dateutil>=2.8.0 in /usr/local/lib/python3.7/dist-packages (from kats) (2.8.2)\n",
            "Requirement already satisfied: scikit-learn>+0.24.2 in /usr/local/lib/python3.7/dist-packages (from kats) (0.22.2.post1)\n",
            "Requirement already satisfied: setuptools-git>=1.2 in /usr/local/lib/python3.7/dist-packages (from kats) (1.2)\n",
            "Requirement already satisfied: seaborn>=0.11.1 in /usr/local/lib/python3.7/dist-packages (from kats) (0.11.1)\n",
            "Collecting statsmodels>=0.12.2\n",
            "  Downloading statsmodels-0.12.2-cp37-cp37m-manylinux1_x86_64.whl (9.5 MB)\n",
            "\u001b[K     |████████████████████████████████| 9.5 MB 28.5 MB/s \n",
            "\u001b[?25hRequirement already satisfied: torch>=1.8.1 in /usr/local/lib/python3.7/dist-packages (from kats) (1.9.0+cu102)\n",
            "Requirement already satisfied: tqdm>=4.36.1 in /usr/local/lib/python3.7/dist-packages (from kats) (4.62.0)\n",
            "Requirement already satisfied: Cython>=0.22 in /usr/local/lib/python3.7/dist-packages (from fbprophet==0.7->kats) (0.29.24)\n",
            "Requirement already satisfied: cmdstanpy==0.9.5 in /usr/local/lib/python3.7/dist-packages (from fbprophet==0.7->kats) (0.9.5)\n",
            "Requirement already satisfied: pymeeus<=1,>=0.3.13 in /usr/local/lib/python3.7/dist-packages (from convertdate>=2.1.2->kats) (0.5.11)\n",
            "Requirement already satisfied: pytz>=2014.10 in /usr/local/lib/python3.7/dist-packages (from convertdate>=2.1.2->kats) (2018.9)\n",
            "Requirement already satisfied: korean-lunar-calendar in /usr/local/lib/python3.7/dist-packages (from holidays>=0.10.2->kats) (0.2.1)\n",
            "Requirement already satisfied: six in /usr/local/lib/python3.7/dist-packages (from holidays>=0.10.2->kats) (1.15.0)\n",
            "Requirement already satisfied: hijri-converter in /usr/local/lib/python3.7/dist-packages (from holidays>=0.10.2->kats) (2.1.3)\n",
            "Requirement already satisfied: ephem>=3.7.5.3 in /usr/local/lib/python3.7/dist-packages (from LunarCalendar>=0.0.9->kats) (4.0.0.2)\n",
            "Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=2.0.0->kats) (0.10.0)\n",
            "Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=2.0.0->kats) (2.4.7)\n",
            "Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=2.0.0->kats) (1.3.1)\n",
            "Requirement already satisfied: setuptools in /usr/local/lib/python3.7/dist-packages (from numba>=0.52.0->kats) (57.4.0)\n",
            "Collecting llvmlite<0.38,>=0.37.0rc1\n",
            "  Downloading llvmlite-0.37.0-cp37-cp37m-manylinux2014_x86_64.whl (26.3 MB)\n",
            "\u001b[K     |████████████████████████████████| 26.3 MB 86 kB/s \n",
            "\u001b[?25hCollecting tenacity>=6.2.0\n",
            "  Downloading tenacity-8.0.1-py3-none-any.whl (24 kB)\n",
            "Requirement already satisfied: scipy in /usr/local/lib/python3.7/dist-packages (from pymannkendall>=1.4.1->kats) (1.4.1)\n",
            "Requirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.7/dist-packages (from scikit-learn>+0.24.2->kats) (1.0.1)\n",
            "Requirement already satisfied: patsy>=0.5 in /usr/local/lib/python3.7/dist-packages (from statsmodels>=0.12.2->kats) (0.5.1)\n",
            "Requirement already satisfied: typing-extensions in /usr/local/lib/python3.7/dist-packages (from torch>=1.8.1->kats) (3.7.4.3)\n",
            "Collecting botorch==0.5.0\n",
            "  Downloading botorch-0.5.0-py3-none-any.whl (475 kB)\n",
            "\u001b[K     |████████████████████████████████| 475 kB 53.6 MB/s \n",
            "\u001b[?25hRequirement already satisfied: typeguard in /usr/local/lib/python3.7/dist-packages (from ax-platform->kats) (2.7.1)\n",
            "Requirement already satisfied: jinja2 in /usr/local/lib/python3.7/dist-packages (from ax-platform->kats) (2.11.3)\n",
            "Requirement already satisfied: MarkupSafe>=0.23 in /usr/local/lib/python3.7/dist-packages (from jinja2->ax-platform->kats) (2.0.1)\n",
            "Building wheels for collected packages: kats, fbprophet\n",
            "  Building wheel for kats (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
            "  Created wheel for kats: filename=kats-0.1.0-py3-none-any.whl size=286607 sha256=60d71974855300be5e3bceebe42a3b8d70dd9edee6b89ed668294b5708f6d80d\n",
            "  Stored in directory: /root/.cache/pip/wheels/c8/dd/5b/cc7cb7fc37c5b838c65e504437bba4c1828a3fbb473c0c11be\n",
            "  Building wheel for fbprophet (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
            "  Created wheel for fbprophet: filename=fbprophet-0.7-py3-none-any.whl size=6637439 sha256=eb89e9d54b4ff20a8b001d6d80fab499092c9e875dc25e87fa19e5ae5fa70009\n",
            "  Stored in directory: /root/.cache/pip/wheels/82/e8/a8/53f37f0a409bc51f8693e967dcce8f88bfd33632b40a594a28\n",
            "Successfully built kats fbprophet\n",
            "Installing collected packages: tenacity, gpytorch, plotly, llvmlite, botorch, statsmodels, pymannkendall, numba, fbprophet, ax-platform, kats\n",
            "  Attempting uninstall: plotly\n",
            "    Found existing installation: plotly 4.4.1\n",
            "    Uninstalling plotly-4.4.1:\n",
            "      Successfully uninstalled plotly-4.4.1\n",
            "  Attempting uninstall: llvmlite\n",
            "    Found existing installation: llvmlite 0.34.0\n",
            "    Uninstalling llvmlite-0.34.0:\n",
            "      Successfully uninstalled llvmlite-0.34.0\n",
            "  Attempting uninstall: statsmodels\n",
            "    Found existing installation: statsmodels 0.10.2\n",
            "    Uninstalling statsmodels-0.10.2:\n",
            "      Successfully uninstalled statsmodels-0.10.2\n",
            "  Attempting uninstall: numba\n",
            "    Found existing installation: numba 0.51.2\n",
            "    Uninstalling numba-0.51.2:\n",
            "      Successfully uninstalled numba-0.51.2\n",
            "  Attempting uninstall: fbprophet\n",
            "    Found existing installation: fbprophet 0.7.1\n",
            "    Uninstalling fbprophet-0.7.1:\n",
            "      Successfully uninstalled fbprophet-0.7.1\n",
            "Successfully installed ax-platform-0.2.1 botorch-0.5.0 fbprophet-0.7 gpytorch-1.5.1 kats-0.1.0 llvmlite-0.37.0 numba-0.54.0 plotly-5.3.1 pymannkendall-1.4.2 statsmodels-0.12.2 tenacity-8.0.1\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "1R2yY9-oqYd-"
      },
      "source": [
        "# Import the TimeSeriesData class from Kats\n",
        "from kats.consts import TimeSeriesData\n"
      ],
      "execution_count": 55,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "EFyRBrpIrKns"
      },
      "source": [
        "# Utilities for working with Kats\n",
        "from dateutil import parser\n",
        "from datetime import datetime"
      ],
      "execution_count": 56,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "2BbGVjIvrSCT",
        "outputId": "115783db-52e9-4b02-ecaa-4d8cbda897b8"
      },
      "source": [
        "# Create a TimeSeriesData object with Kats\n",
        "ts_data = TimeSeriesData(time=bitcoin_prices_updated.index,\n",
        "                         value=bitcoin_prices_updated.Price)\n",
        "\n",
        "type(ts_data)"
      ],
      "execution_count": 57,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "kats.consts.TimeSeriesData"
            ]
          },
          "metadata": {},
          "execution_count": 57
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 408
        },
        "id": "lC4aG5BZraEa",
        "outputId": "6a7a9e19-315f-4e00-8a97-017d58f07b5c"
      },
      "source": [
        "# Plot the time series data\n",
        "import matplotlib.pyplot as plt\n",
        "\n",
        "ts_data.plot(cols=['Price'])\n",
        "plt.show()\n"
      ],
      "execution_count": 58,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAr4AAAGHCAYAAABWGlGNAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOzdeWBU5b3/8feZJZM9JEAAEyBgUgQEsQahi1WkKUttaNWi1V6i0EsLtljtordVe+mvt9LW2toW2mIpDd5WqraXeBUCCtXWBTCgXpFigwKSGLYQsmcyy/n9Mckkkx0yW5LP6x9mnjnnzDOg8Ml3vud5DNM0TUREREREBjlLpCcgIiIiIhIOCr4iIiIiMiQo+IqIiIjIkKDgKyIiIiJDgoKviIiIiAwJCr4iIiIiMiTYIj2BCzVixAiysrIiPQ0RERERiSJHjx7lzJkzXb42YINvVlYWJSUlkZ6GiIiIiESR3Nzcbl9Tq4OIiIiIDAkKviIiIiIyJCj4ioiIiMiQMGB7fLvicrkoKyujqakp0lMJudjYWDIzM7Hb7ZGeioiIiMiAMKiCb1lZGUlJSWRlZWEYRqSnEzKmaVJZWUlZWRkTJkyI9HREREREBoRB1erQ1NTE8OHDB3XoBTAMg+HDhw+JyraIiIhIsAyq4AsM+tDbaqh8ThEREZFgGXTBN9KsViszZszg0ksv5fOf/zwNDQ1dHvfRj340zDMTERERGdoUfIMsLi6ON954gwMHDhATE8NvfvObgNfdbjcAr7zySiSmJyIiIjJkKfiG0FVXXcXhw4d54YUXuOqqq8jPz2fKlCkAJCYm+o/70Y9+xLRp07jsssu49957AXj33XeZP38+V1xxBVdddRWHDh2KyGcQERERGSwG1aoO7a3+37c5+EFNUK855aJkvveZqX061u12s23bNubPnw/A/v37OXDgQKdVGLZt20ZRURF79uwhPj6es2fPArB8+XJ+85vfkJOTw549e1i5ciW7du0K6ucRERERGUoGbfCNlMbGRmbMmAH4Kr7Lli3jlVde4corr+xy6bHnn3+e22+/nfj4eADS0tKoq6vjlVde4fOf/7z/OKfTGZ4PICIiIjJIDdrg29fKbLC19vh2lJCQ0OdreL1ehg0b1uV1RERERLrzme/8lvfOeXh73cpITyUqqcc3wvLy8ti4caN/9YezZ8+SnJzMhAkTePLJJwHfhhVvvvlmJKcpIiIiUa6hyclb3kzqk8dHeipRS8E3wubPn09+fj65ubnMmDGDhx56CIA//vGPbNiwgcsuu4ypU6dSVFQU4ZmKiIhINKtvbGuLbHa5IziT6DVoWx0ipa6urtPYNddcwzXXXNPtcffee69/NYdWEyZMoLi4OCRzFBERkcHH5W4Lu9V1DYxMTY7gbKKTKr4iIiIig4Cz2eV/XF1bH8GZRC8FXxEREZFBwNmuveFcXdc7xw51Cr4iIiIig4DL7fE/rqlvjOBMotegC76maUZ6CmExVD6niIiI9E37VgcF364NquAbGxtLZWXloA+FpmlSWVlJbGxspKciIiIiUaL9Sg61DU0RnEn0GlSrOmRmZlJWVsbp06cjPZWQi42NJTMzM9LTEBERkSjRvtWhtkE7vnZlUAVfu93e5bbAIiIiIoNd+4pvXZOCb1cGVauDiIiIyFDlbLeOb31jcwRnEr0UfEVEREQGiGaXmyee393la+52rQ71TleXxwx1g6rVQURERGQwK3hwE682jKKu8UWWfubqgNea2wffJlV8u9Kniu+5c+e48cYbueSSS5g8eTKvvvoqZ8+eJS8vj5ycHPLy8qiqqgJ8Kw6sWrWK7Oxspk+fzv79+/3XKSwsJCcnh5ycHAoLC/3j+/btY9q0aWRnZ7Nq1apBvyqDiIiIyIV4/6xvmbI9/3y/02vte3ybXJ5Or0sfg++dd97J/PnzOXToEG+++SaTJ0
9mzZo1zJ07l9LSUubOncuaNWsA2LZtG6WlpZSWlrJ+/XpWrFgBwNmzZ1m9ejV79uxh7969rF692h+WV6xYwaOPPuo/r7i4OEQfV0RERGTgSom1AnCmrvNyZe0rvh6viohd6TX4VldX8/e//51ly5YBEBMTw7BhwygqKqKgoACAgoICtmzZAkBRURFLlizBMAxmz57NuXPnqKioYPv27eTl5ZGWlkZqaip5eXkUFxdTUVFBTU0Ns2fPxjAMlixZ4r+WiIiIiLSJi/EF38bmzhVdt0fBtze9Bt8jR44wcuRIbr/9di6//HK+9KUvUV9fz8mTJxkzZgwAo0eP5uTJkwCUl5czduxY//mZmZmUl5f3ON5+PdrW8a6sX7+e3NxccnNzh8RavSIiIiJdMQyj09hr75T5H3u83nBOZ8DoNfi63W7279/PihUreP3110lISPC3NbQyDKPLP4BgW758OSUlJZSUlDBy5MiQv5+IiIjIQPHu6Tr/Y1V8u9Zr8M3MzCQzM5NZs2YBcOONN7J//35GjRpFRUUFABUVFaSnpwOQkZHB8ePH/eeXlZWRkZHR43hZWVmncREREREJ1Hr/f1flxmqnCXVnAHAr+Hap1+A7evRoxo4dyzvvvAPAzp07mTJlCvn5+f6VGQoLC1m0aBEA+fn5bNq0CdM02b17NykpKYwZM4Z58+axY8cOqqqqqKqqYseOHcybN48xY8aQnJzM7t27MU2TTZs2+a8lIiIiIp119UV7g8dCjMe36oNXwbdLfVrH95e//CW33norzc3NTJw4kY0bN+L1elm8eDEbNmxg/PjxPPHEEwAsXLiQrVu3kp2dTXx8PBs3bgQgLS2N+++/n5kzZwLwwAMPkJaWBsC6deu47bbbaGxsZMGCBSxYsCAUn1VERERk0Jn7zV8zOiWWZmzEGi6cXg8eQ8G3K30KvjNmzKCkpKTT+M6dOzuNGYbB2rVru7zO0qVLWbp0aafx3NxcDhw40JepiIiIiAxZHePsW4ff513bON6tB5LBXncUvF48lrYjT5ypYtbqZ/ju3AyWf/bacE436mjLYhEREZEBonWTr7e8mVz+1bX86bm9Aa/HWADTG3BzW/HutzAS0vjJjtJwTjUqKfiKiIiIDBDtN7etSsyipsEZ8LrDCphe2rf4JsXHAuDBGoYZRjcFXxEREZEBwmsGNjvUNrkCnsfaLZ0qvq1bGXstfepwHdQUfEVEREQGiI7LlDU43QHPM4bFd6r4NjibATANVXwVfEVEREQGiA9qA7cqPtvQHPB8UubwluDrS77rnnqeZ17z9fYapnZzU81bREREZICoTsoKeF7fbAakuZT4WDDP4Wmp+P64xAlcFLb5RTtVfEVEREQGKFeHIm5SvANME68JXm/HCq/W9lXwFREREYlCdQ1NZH1tEz/Y+HS3x7hMMBuq/FsVJ8XHYrT0+DY0BbZBGKaCr4KviIiISBT655EySBjOo/urAV8Q7shtWjC8Hv8exskJcYCv4utyB974ZjHdnc4fahR8RURERKJQ29Jlvl/P1dZ3OsZjGmB6/U0MsTF2DNPktH0ULnfgjXAeiz2Esx0YFHxFREREopC3ZU0yo+V5k7O50zEew4rF9GL3+KrBsQ47ZlI6Rkwcz+89EHhw4kj+deyDUE456in4ioiIiEQht6elYttS+a1v6iL4Jl+EJyaR/7l7PgtHnuPDl0z0v9bk6tzaUNVF1XgoUfAVERERiUJNzsBd2ZqafcE3ufYok8zj/nEjNpFp2eNY941bAbDWfNDl+QB229DexELBV0RERCQK1Tc5A563Btm8S0Zw5cXp3Z53+8xRLed3rhA7m4f2DW4KviIiIiJR6JtPvN7yyNfq0NTsC74xNmuPlVuH3bejRVfB198+MUQp+IqIiIhEIXdyhu+B4YtrrcHXYbdhs3Yf4XoKvs4u+n6HEgVfERERkShmtixD1hpaY+xW7NbeK74fVHW+ka15iAdfW++HiIiIiEjE2B
0ANLYsZ+aw2bBYDMDXtnD9RbUBh8c6fPHu5frOfcAd1/YdahR8RURERKKYYfMF3/YVX4vRFnzTkuIDjnfYu9+oonmIB1+1OoiIiIhEoaTao4Bv+TJoW5HBEWMLCLCtrQ2t4hzdB1/XEG91UPAVERERiULJLfnV3pLWml2+sOuw2yg7U+0/LjYmMPjGxqji2x0FXxEREZEo1LJjsf/XJpdvVYc4RwxzP/wh/3EdK7w9BV8tZyYiIiIiUadj8G1saXWIj41h4ccu9x/XMejGx8Z0e83WqvFQpeArIiIiEoVaA2/LLzS1BN+EWEfAcXGOwKAb6+g++A71VR0UfEVERESikLfl147BNzEuMPgmdKjwdlXxNWpPAOD2eDu9NpQo+IqIiIhEIbO14msaQNuNafG9VHw7Pk9vOMr987N911CPr4iIiIhEm04V35b+3MT42IDjOlZ4OwbfG6/MIiUhDlCrg4KviIiISBTq2OPbWvHtHHw7tD7Etb0+quEo3/7ip4mx+bY49qjVQURERESijdnh12a3L7R2vLmtY8U3xm7DNH3Hjkq0+ccAXAq+IiIiIhJtWnt7obXH14vp9XRatSExPq7zyS29vIbhO9feUvENV6vDkzv38LWf/Sks73U+FHxFREREolDHHt9mjxc8nbccTuiwyoPvJN/ZVosv+LZWfD3e8FR8v/XcGf73ZEpY3ut8KPiKiIiIRBmv10tD8ngAzJaKr8tjgrdd8K07DXSzU1tL8LW0VHwdanUAFHxFREREok779XZblzVzewOD79a7P8mdl1mxWLqIcx0qvq2tDkN9HV9bpCcgIiIiIoGczS7/Y3/F1wt423p0p0zMZMrEzK4v0HJcS+4lxu6rCntal4oYolTxFREREYkyTe2Cb0vuxe0Fw+zjzWktleHWarAjRq0OoOArIiIiEnW6qvi6TbB4+xZ8WwNya8XXbvMF36He6qDgKyIiIhJlXO7Oqzd4TDDoW3BtDb7Wlopv6w1wbrU6iIiIiEg0cbragq+JwfZX36TGPgJLH4OvpXVVh5ak17qqg1fBV0RERESiibM5MPh+uagMIzaxz8HXk3wR0Lbbm70l+LrDtI5vtOpT8M3KymLatGnMmDGD3NxcAM6ePUteXh45OTnk5eVRVVUFgGmarFq1iuzsbKZPn87+/fv91yksLCQnJ4ecnBwKCwv94/v27WPatGlkZ2ezatUqTHNo/zQiIiIiQ1uzy9XluJW+ZSRLzQkAGpy+AB3j7/ENb8byRlnQ7nPF929/+xtvvPEGJSUlAKxZs4a5c+dSWlrK3LlzWbNmDQDbtm2jtLSU0tJS1q9fz4oVKwBfUF69ejV79uxh7969rF692h+WV6xYwaOPPuo/r7i4ONifU0RERGTACNxa2PA/shp9C67T03znty5fZrNZMb0evGEoLrrbzT3abqa74FaHoqIiCgoKACgoKGDLli3+8SVLlmAYBrNnz+bcuXNUVFSwfft28vLySEtLIzU1lby8PIqLi6moqKCmpobZs2djGAZLlizxX0tERERkKAro8TXagq/N6Orozlo3rmi/bq9hsfJ/dYnBmWAP6hub/I/dnj4uvxYmfQq+hmHwqU99iiuuuIL169cDcPLkScaMGQPA6NGjOXnyJADl5eWMHTvWf25mZibl5eU9jmdmZnYa78r69evJzc0lNzeX06dPn+dHFRERERkYmluCr2l6Caz49u18a0tY9nSo8Brxw4Iyv5688a9j/scuV+fVKSKpTzu3vfTSS2RkZHDq1Cny8vK45JJLAl43DAPD6OOfRD8sX76c5cuXA/h7jUVEREQGm9bgi8cTWPHt43f1rRXfcC7icP/6vzL94gzWb98PlnFA9N1M16ffvoyMDADS09P53Oc+x969exk1ahQVFRUAVFRUkJ6e7j/2+PHj/nPLysrIyMjocbysrKzTuIiIiMhQ5WptEfC6aV/xtfcx+Lbu2NZ++bK0uqMYtaeCNcVOHnvPwbeeO0
P7rOsZaD2+9fX11NbW+h/v2LGDSy+9lPz8fP/KDIWFhSxatAiA/Px8Nm3ahGma7N69m5SUFMaMGcO8efPYsWMHVVVVVFVVsWPHDubNm8eYMWNITk5m9+7dmKbJpk2b/NcSERERGYpeeus93wOvB9PR1pdr6eM37P4e33atDlYDTCM0K9n+YOPT/sfv2sb5H0dbj2+vrQ4nT57kc5/7HABut5tbbrmF+fPnM3PmTBYvXsyGDRsYP348TzzxBAALFy5k69atZGdnEx8fz8aNGwFIS0vj/vvvZ+bMmQA88MADpKWlAbBu3Tpuu+02GhsbWbBgAQsWLAjJhxUREREZCP76QZLvgenBsLcF32Mx4/t0vs3aenNbuzELmN7QBN8NJZWQlN5pPHB1isjrNfhOnDiRN998s9P48OHD2blzZ6dxwzBYu3Ztl9daunQpS5cu7TSem5vLgQMH+jJfERERkaHDsAY8NRvO9ek0W2urQ/uKrwXA2vUJ/ZRmaaCyi/Foq/hq5zYRERGRKGXEBi4/9uDCrD6d19XNbTaL0SlIB0t3vccDrsdXRERERKJDzthRfTpu6YLZANwxf4Z/zGYxwBKa4Ntdvo22DSz6tJyZiIiIiEReRnpan46bOTWbo2uyA8bsVgPMEAXfbpZNi7YeX1V8RURERKLU9RfVBjy/aGTfgm9XbBYDrKGpeXYXfD1eBV8RERER6cYjm7f7H99x/TX+x0btyX5d1261YBiWts0xguSlNw7RXUNDtPX4qtVBREREJIrs+Vc54OvljXPE+Mf/Iy+rX9e1Wy3ggvrGJmLsib2f0AcP/2kbv/g/LyR1Pbdo6/FVxVdEREQkijS52toD2gdfm7V//bk233pmNDqb+3Wd9g683/NOcFrOTERERES65bC3Bdz2wddu7V9si7H5zm9odPbrOu2lJcb5H9tqyrHUVAS8roqviIiIiHTLbN10ou4MMfa2rlS7rX8dqjGtFd9mV7+u015aUlvwteNm35qbA153a1UHEREREenO/hO+YFr8zU9isbRFNbutf60OMS2V5Iam4FV8DcPwP7YAqcmJjG8+xkyHr/L7/P53gvZewaCb20RERESiSHPKWAAuycoIGLf1u9XBF3ybnMGr+PpWbfBd12L4KtUvPrwSt9vDvHvXM2vyFUF7r2BQ8BUREREZAPpb8XW0tE0E8+Y23wYVvnlZ24q/2GxWdj60ImjvEyxqdRAREREZAOz9XNXBX/ENYo9vaUWV/7HN6OHAKKHgKyIiIjIA2Ppb8Y1prfgGJ/j+ZddeXmkY5X9uHwCpcgBMUURERGRo8Hq7X/6r360OLec7g7Rz27sfnA54blHFV0RERET6qqc2hP5u/xsbYweCV/FNiY8LeO41g3LZkFLwFREREYkSdQ1NncaM2pNA/3tzW1sdmoNU8W3ocJNcdG1V0TUFXxEREZEoUd/YOfg6TN+6uxajf70EcS0VX6c7OMG340YYpiq+IiIiItJXDU2+Kuq8tLP+sS3fzuea5DPMvfLSfl3b0RJ8m5qDVfENDL5qdRARERGRPmvdVa39jWyXZGXwh+8UBOzidiHiHL7g2xykbYTbB2izsYZbP3pxUK4bStrAQkRERCRKuFpCqb2fu7R1pfXmNmeQKr7tb5J79b55XDQyLSjXDSVVfEVERESihKdlOTNLCNYGi3PEAMGr+NY2tQXf1jaKaKfgKyIiIhIl3C1LlvX3RrauBLvV4V+n6v2Pbf1swwiXgTFLERERkSEglBXf+FgHAM3u4Cw81uAGs+EcK6ZCSlJCUK4Zagq+IiIiIlGidZMKawgqqLFBbnVwmwZ2dz33/Nung3K9cFDwFREREYkSoaz4JsT5Kr7ufu4A18pjWrASnBAdLgq+IiIiIlHCX/ENSY+vr+L7ypng3IjmMSzYGACL97aj4CsiIiISJUJZ8W1dzozY5KBcz4sVmzGwgq/W8RURERGJEp6W7c9C0eNrsVhw1LwPBCdUey1W7MHP5y
Gliq+IiIhIlAhlxRd8wc8MUpHWtNiwD7AkOcCmKyIiIjJ4hXJVBwADEzNY/cMWOzHWgVXyVauDiIiISJTwV3xDcHMbtFR8g9TqgNWOQ8FXRERERC6E21/xDU2gNAwT0+zftd1uDxuefhHD7sAxQHZsa6XgKyIiIhIlWiu+oWp1CEbFd8kPC3mlYRQAsXZrEGYVPgMrpouIiIgMYq3B12YNUfA16HeP77uVjf7HCr4iIiIickE8Ht+SC5aQ3dwG/Y1/7VeFiI8ZWM0DA2u2IiIiIoOYt7XVIUQ3jVmM/rc6tA++sQMs+KriKyIiIhIlQrmBBfiCb383sGi/DHBCbEy/rhVuCr4iIiIiUSLkN7cZYBr9bHVo9zgh1t6/CYVZnz+5x+Ph8ssv57rrrgPgyJEjzJo1i+zsbG666Saam5sBcDqd3HTTTWRnZzNr1iyOHj3qv8aDDz5IdnY2kyZNYvv27f7x4uJiJk2aRHZ2NmvWrAnSRxMRERGJPoXP/p3JK39NbX1jp9e8/opvaFodrAbQz5vb2rc6JMY6+jehMOtz8H3kkUeYPHmy//k999zDXXfdxeHDh0lNTWXDhg0AbNiwgdTUVA4fPsxdd93FPffcA8DBgwfZvHkzb7/9NsXFxaxcuRKPx4PH4+GOO+5g27ZtHDx4kMcff5yDBw8G+WOKiIiIRIfvP/sOjcnj2PJiSafX/rznPSDErQ5dVHy9Xi+vvX3Y32Pck/YV38T4QRh8y8rKePbZZ/nSl74EgGma7Nq1ixtvvBGAgoICtmzZAkBRUREFBQUA3HjjjezcuRPTNCkqKuLmm2/G4XAwYcIEsrOz2bt3L3v37iU7O5uJEycSExPDzTffTFFRUSg+q4iIiEjExeAG4NjJqk6vnYrPAkK3qoPFMLoMvnc+spnPP/YOax57ttdruNsl34TBWPH9+te/zo9//GP/H0JlZSXDhg3DZvPdyZeZmUl5eTkA5eXljB07FgCbzUZKSgqVlZUB4+3P6W68K+vXryc3N5fc3FxOnz59AR9XREREJLK8LTunNThd3R5jDdE6vlYLGLFJncbfLj8HwKGyyh7Pd7s91CZl+Z8nJ8QGdX6h1uvv6jPPPEN6ejpXXHFFOObTo+XLl1NSUkJJSQkjR46M9HREREREzpunpeLa2FPwDVGP77uuYQD8ZdfewDm1dDhYeun/PVVVHfA8OSEueJMLg14XX3v55Zd5+umn2bp1K01NTdTU1HDnnXdy7tw53G43NpuNsrIyMjIyAMjIyOD48eNkZmbidruprq5m+PDh/vFW7c/pblxERERksGpsdnf7Wm8B9EIZcSkAHC4P/Obc7OrgLpw6Gxh8UxLjgzGtsOm14vvggw9SVlbG0aNH2bx5M9deey1//OMfmTNnDk899RQAhYWFLFq0CID8/HwKCwsBeOqpp7j22msxDIP8/Hw2b96M0+nkyJEjlJaWcuWVVzJz5kxKS0s5cuQIzc3NbN68mfz8/BB+ZBEREZHIMVqWRWh0dR98m92ekLz3h7zvAzA8OTCwmi1zcvbyvqeragOeD7SK7wU3kPzoRz/i4YcfJjs7m8rKSpYtWwbAsmXLqKysJDs7m4cffti/PNnUqVNZvHgxU6ZMYf78+axduxar1YrNZuNXv/oV8+bNY/LkySxevJipU6cG59OJiIiIRBlLS321qbn7kOls7r4Noj9umD0JgOq6JpqczWx9+XVOnKmiZRU1nK5egu+5wOA7bIBVfM9rn7lrrrmGa665BoCJEyeyd+/eTsfExsby5JNPdnn+d7/7Xb773e92Gl+4cCELFy48n6mIiIiIDExGS/DtIWSGquLraNli+JdveXn05Q00JY/D8cfdDGtJhE53z8uZnasLXHs4aahUfEVERETkArSs6tDs6T5kOnvo/+2PGJvV/7gpeZzvvZLH4Wmp+PY0J4CmDpXoUC27FioDa7YiIiIiA5y35ca1fzKWX/9lZ5fHtFZmg81u7/q6/lYHd8
+3uTX10Jc8ECj4ioiIiISRSduKDWteDtzEwqg9BcDt130iJO8d20XwNU0vrXXeBlfPwbe1Eh1b8z5mw7lgTy/kQvPjhIiIiIh0yRtQdwwMmobpIbH2aMhaCGK6qviaZlvFt5fW4tbe418u+Rh5s6cHeXahp4qviIiISFi1W6O3Q4HVNAysoVnCF+gp+Pre1GlaO7/ejrOl1SHWYQ/63MJBwVdEREQkjEyj+4qv1xpLfAi/j4+N6SKwtmt1cBs9v3nrcmdxjpggzyw8FHxFREREwigg+Jptwfe1tw9jxA9rPxR0Xd80Z9KQPL7z3LrQ2uoQH+sI9tTCQsFXREREJIy6q/g+8j//AKDaFbpeh9iYzpVaw9YWYnsNvi3LnSXEKfiKiIiISG+6CZdjhycBsPqGK0L21hZLz6HaNHru8W1u2eBCrQ4iIiIi0ruA4NsWRJtalgrLGDEsZG+dMTKt5wN6qfi6WoKvWh1EREREpHeWtqqqEZvIyp/+EWjbHCIhLjZkb93rFsOWnm9uc3lag68qviIiIiLSmw7tBFtP+yq8zpYbxxIiGSp7WT/Y5fFielwDbqviVgNz1iIiIiIDlSUw+JoeFwBOl6+amhgfuopvr3qr+HpN8Payy0UUU/AVERERCacO1dK4+gqgbamwULY69MrqC76f+vav+drP/tTpZbdHwVdERERE+sDt9mB0qPi27prWumJCJJcKMwwL/3j9n/zLMo7/PZlCVU1dwOturwled4Rm138KviIiIiJh0tTsCnhuqTmBp6XntzlK+me/s2mX//HlP3wx4DW3F/B6GagUfEVERETCpNHZHPDcarrxtiwh5vKY4IlMNdVsqOKXC0cDEGvvPh56TDBMtTqIiIiISC+cHSq+dsODafj6al2eyLURGPGpfOYTV2C6mno8zhd8VfEVERERkV40daj42g0T02LlzX8d46h3eORvHDO9eLxmty+7var4ioiIiEgfOF2BFV2bBTAsfOHnz2LEJkJMfFjnY6sp7zT2nm18t8d7TTDoPhhHOwVfERERkTDp2OPra6c1cOG7wc2whXfzimHWwNYLo4vg/eK+g1TX1gPgxsCCWh1EREREpBcdV3Wwt1R8I1FE/cL4Rn5ScE2vxxU8eYRrv/MHABhF0/IAACAASURBVFzYiEGtDiIiIiLSi+aWVgej9gSm24nNYoBhgBG+OdyW7WKi+xgPrriRselpfTrnjHU4AB7DTqxVrQ4iIiIi0ovWiu8t04dx7KHrsVoMX8U3jP7zS59l10MrARg1fFjfTmqZo9fmIL7nXY2jmoKviIiISJi0VnztVl9Pr8Ug7MG3vaSEuD4dZzgS8Hq9YI8n0WHt/YQopeArIiIiEiZNzb7gG2P3hUerxcCwO/BGeSQzG6qormvAsNlJcgzcku/AnbmIiIjIAONy+4Kvw+aLYGU1HkgGT/JFEZvTw/NGkpqU0Gl88dgGnjjuW+VhhFlN2amzAAxLcIR1fsEU3T9eiIiIiAwiW0tKAbDbfBXfZjOwbcBSUxH2OV0/50rm5E7tNL7is5/wP65MyOLoB6cBSE2MDdvcgk3BV0RERCRMXm0YBYDD7qv4Wo3ANXF33PvpsM+pO3GOwDWF1219DYARyZ2rwwOFgq+IiIhImNlbe3yNtqXB4muOkT1uTKSm1EnH4Nvs9s01PTUxEtMJCgVfERERkTCLban4tg9iTuyRmUw34mIdTHQf8z+vd/mq06PTUiI1pX5T8BUREREJM3tL8G2/FYQ7aXRkJtONGLuNXQ+t5Ofz0wGocfl22Rjd17V/o5BWdRAREREJM4vhC5Htg68RwfV8e5KS6FvZoT52JDQ3MGl89LRjnK/o/B0WERERGcRaN7IwzTDuVXyBUpN8wdeIicfaVE1sh97fgUTBV0RERCTM3B4PEFjxjVYfGh+5NYaDTcFXREREJMxSo3RJsC23T+GmcQ08OCfVPxYf6yCxxneTm8eRHKmpBYV6fEVERETCxHQ7MZ
pquO7jvvV6o63iO2PSBGZMmtBp/N8/cTE/e8ON4YjOwN5XqviKiIiIhIvXQ3Zco/9ptAXf7mSmp/Z+0ACg4CsiIiISLhYrVkv7G9qi/+Y2gItGDNwlzNrrNfg2NTVx5ZVXctlllzF16lS+973vAXDkyBFmzZpFdnY2N910E83NzQA4nU5uuukmsrOzmTVrFkePHvVf68EHHyQ7O5tJkyaxfft2/3hxcTGTJk0iOzubNWvWBPkjioiIiEQJw4qtXfBNjx0YNd/WJc0Gul6Dr8PhYNeuXbz55pu88cYbFBcXs3v3bu655x7uuusuDh8+TGpqKhs2bABgw4YNpKamcvjwYe666y7uueceAA4ePMjmzZt5++23KS4uZuXKlXg8HjweD3fccQfbtm3j4MGDPP744xw8eDC0n1pEREQkzLxeL4bVFhB8n/l/t3P35XamW8q47yPRGy7TovRmvPPVa/A1DIPERN+ezC6XC5fLhWEY7Nq1ixtvvBGAgoICtmzZAkBRUREFBQUA3HjjjezcuRPTNCkqKuLmm2/G4XAwYcIEsrOz2bt3L3v37iU7O5uJEycSExPDzTffTFFRUag+r4iIiEhEtK7da7O2Bd+khDhW3fQpnv7hl/nSojmRmlqvhiUNkeAL4PF4mDFjBunp6eTl5XHxxRczbNgwbDbfohCZmZmUl5cDUF5eztixYwGw2WykpKRQWVkZMN7+nO7GRURERAYTZ7MLAJtl4N1iNZA3rWivT7/zVquVN954g7KyMvbu3cuhQ4dCPa8urV+/ntzcXHJzczl9+nRE5iAiIiJyIRqdvvuhbNaBF3wHi/P6nR82bBhz5szh1Vdf5dy5c7jdvpJ9WVkZGRkZAGRkZHD8+HEA3G431dXVDB8+PGC8/TndjXdl+fLllJSUUFJSwsiRI8/vk4qIiIhEkLOLVoeBxlYzsL+V7zX4nj59mnPnzgHQ2NjIc889x+TJk5kzZw5PPfUUAIWFhSxatAiA/Px8CgsLAXjqqae49tprMQyD/Px8Nm/ejNPp5MiRI5SWlnLllVcyc+ZMSktLOXLkCM3NzWzevJn8/PxQfV4RERGRiKhvdAIQM0Arvn/7Wi77fnRrpKfRL73u3FZRUUFBQQEejwev18vixYu57rrrmDJlCjfffDP33Xcfl19+OcuWLQNg2bJl/Nu//RvZ2dmkpaWxefNmAKZOncrixYuZMmUKNpuNtWvXYrVaAfjVr37FvHnz8Hg8LF26lKlTp4bwI4uIiIiE396D7wEwYfTA3AxiQsaoSE+h3wzTNAfGAnId5ObmUlJSEulpiIiIiPTJiof+m21nUnn8lhw+Mv1DkZ7OoNVTRhyYtXYRERGRAebwqVpMl5OZUy6O9FSGrF5bHURERESk/5pcXvA6sdmskZ7KkKWKr4iIiEgYuL2A1x3paQxpCr4iIiIiYeDygmF6Ij2NIU3BV0RERCQMPKaCb6Qp+IqIiIiEgdsEi+mN9DSGNAVfERERkTDwmAYWFHwjScFXREREJAw8WBR8I0zBV0RERCQMvBhYGZD7hg0aCr4iIiIiYeCyxBJnVfCNJG1gISIiIhJip6tqIHEE4+zNkZ7KkKaKr4iIiEiI/eRP2wGYPm5EhGcytCn4ioiIiITY22VnAViy4CMRnsnQplYHERERkRBr9ngxPU1cnDk60lMZ0lTxFREREQkxl8cEjzvS0xjyFHxFREREQszlMcGr4BtpCr4iIiIiIeb2gmF6Ij2NIU/BV0RERCTE3CYYXgXfSFPwFREREQkxVXyjg4KviIiISIh5TAML3khPY8hT8BUREREJMQ8GFlPBN9IUfEVERERCzIUVu6HgG2kKviIiIiIh5jFsxFjMSE9jyFPwFREREQmRuoYmrrprLd7EkcRaIz0b0ZbFIiIiIiGy4ekXOe7IwgDi7EakpzPkqeIrIiIiEiKxMXb/4ySH6o2RpuArIiIiEiJn6xr8j10e3dwWaQq+IiIiIiFytqYt+H55QW
4EZyKgHl8RERGRkHnr+FkwEvjzFz/ErEtzIj2dIU8VXxEREZEQOd3ggdpTCr1RQsFXREREJETcXrRjWxRR8BUREREJEY9pYMET6WlICwVfERERkRDxYKjiG0UUfEVERERCxGMaWA1tVRwtFHxFREREQsRrWLCi4BstFHxFREREQsSLRRXfKKLgKyIiIhIiXsOKTWkrauiPQkRERCRETIuCbzTRH4WIiIhIqFhs2JW2okavfxTHjx9nzpw5TJkyhalTp/LII48AcPbsWfLy8sjJySEvL4+qqioATNNk1apVZGdnM336dPbv3++/VmFhITk5OeTk5FBYWOgf37dvH9OmTSM7O5tVq1ZhmuqFERERkUHAYiPGakR6FtKi1+Brs9n46U9/ysGDB9m9ezdr167l4MGDrFmzhrlz51JaWsrcuXNZs2YNANu2baO0tJTS0lLWr1/PihUrAF9QXr16NXv27GHv3r2sXr3aH5ZXrFjBo48+6j+vuLg4hB9ZREREJEysNuxWlXyjRa9/EmPGjOHDH/4wAElJSUyePJny8nKKioooKCgAoKCggC1btgBQVFTEkiVLMAyD2bNnc+7cOSoqKti+fTt5eXmkpaWRmppKXl4excXFVFRUUFNTw+zZszEMgyVLlvivJSIiIjJQeb1esNpV8Y0i5/UjyNGjR3n99deZNWsWJ0+eZMyYMQCMHj2akydPAlBeXs7YsWP952RmZlJeXt7jeGZmZqdxERERkYGsqdmFYViI0d1tUcPW1wPr6uq44YYb+PnPf05ycnLAa4ZhYBih/2lm/fr1rF+/HoDTp0+H/P1ERERELlRdQxMADps1wjORVn36EcTlcnHDDTdw6623cv311wMwatQoKioqAKioqCA9PR2AjIwMjh8/7j+3rKyMjIyMHsfLyso6jXdl+fLllJSUUFJSwsiRI8/zo4qIiIiE3pM793DoaDk1dQ0AOOwKvtGi1+BrmibLli1j8uTJ3H333f7x/Px8/8oMhYWFLFq0yD++adMmTNNk9+7dpKSkMGbMGObNm8eOHTuoqqqiqqqKHTt2MG/ePMaMGUNycjK7d+/GNE02bdrkv5aIiIjIQHL13ev41nNnWPjjbRw9cQaA5LiYCM9KWvXa6vDyyy/z2GOPMW3aNGbMmAHAD3/4Q+69914WL17Mhg0bGD9+PE888QQACxcuZOvWrWRnZxMfH8/GjRsBSEtL4/7772fmzJkAPPDAA6SlpQGwbt06brvtNhobG1mwYAELFiwIyYcVEQmX+9b/lccOefm/1QtISUqI9HREJAwe2/oPjsWMB8CbPIbv/envEJvFZz56aYRnJq0Mc4Aumpubm0tJSUmkpyEi0qWsr/4BEkeSWneU1391R6SnIyIhdqziNFc/srfTuFlfxZFHbsFi0Q1u4dJTRtSfgohICJiG7wu1qsSsyE5ERMLiV3/5W5fjw7zVCr1RRH8SIiIhkOKtBsBScyLCMxGRUGtyNvNkWVtLk726jNia9wGI7/P6WRIOCr4iIiGQHu/76zXD0RThmYhIqL3fchNbqwSrmwWTUgE459LmFdFEwVdEJARcHt/tE54BeReFiJyPYxWBwdduwA1X+xYEsBn6SyCaKPiKiISAy+v7x87tjfBERCTk3novcMfZEQlWPj7jEr440cmfv6ElWqOJOk9ERELA5fH9qoqvyOC37Y2jmN7RWBrPYiaNZuLIJAB+sPz6CM9MOlLwFREJAXdL4FXFV2Twq6gzsXEGEwsmkD5Ma3dHK7U6iIiEQK3XDoBXFV+RQc9pWonBhdfi25p4ZEpihGck3VHwFREJAZdjGABN+mJNZNBzW2zEWkyMlj3BkuIdEZ6RdEfBV0QkyLxeL9hjAXAlZ/rHG5qc7D/0XqSmJSIh4rU4iLXCj6+fgqPmfebNnh7pKUk3VIoQEQmy7//+aQyL3f+82eUmxm5j1t2/ozY5i3/9v3HE2PXXr8igYY8lwVrL5+fO4vNzZ0V6NtIDVX
xFRILoyZ17+MNhe8BYY5MTgNrkLADOVteGe1oiEiKvvX0YIyaORIc10lORPlDwFREJom89d6bTmNPlDnh++pyCr8hg8fnH3gEgOS4mwjORvlDwFREJkcusvkXtG53NAeNnq+siMR0RCTKvt229QodNkWogUJOZiEgIJNYcY8S4RDgHVz+yl3+/xAP4vgqtqq2P7OREJCjeKzvpfzxlXHoEZyJ9pR9PRESCyPS4AHCZFuzWtr9iN+w95X9cWaPgKzLQVdXUcdNP/gpAXmold970qQjPSPpCwVdEJJha1vE0AXs3X30eOXE2jBMSkVBYcH8hlQlZAMy57GIsFkWqgUB/SiIiIRJja7vL26BtC7d3T5yLxHREJIjOuNpWb7k2d0oEZyLnQ8FXRCREbO1aHTzJF/kf1zS6IjEdEQkii+m7sc30ehg9IjXCs5G+UvAVEQmR9hXf9hpdnjDPRERCxVp3qveDJGoo+IqIBFVrj69BfVNz51cba2h0mZ3GRWRgibP4foDdtPzjEZ6JnA8FXxGRoGr5a9WAM7VNnV91N9LgDlz/U0QGHrdpYK8p4+MzLon0VOQ8KPiKiARTD3d2m24XVtNNdVIW0776mzBOSkSCzWVasKIfYAcaBV8RkSAyLL6+XhODtV+/qcOLYMX39Wh98vhwT01EgqTw2b/TnDKWRKuC70CjndtEREIkKSEucMCw4CImMpMRkaD40WPP8uu3fY9HJypGDTSq+IqIhIlhseKJSYz0NESkH1pDL0DeZVkRm4dcGAVfEZEQMDF8v9YH7tJmxCr4igwG+aNruPPmeZGehpwnBV8RkRCIN3xLmT33zbnMjjvhH7922Bn/47qGzqs+iEh0crs9ZK9c73/+vduvi+Bs5EIp+IqIBJHZVIvpbOCZ+xYD8KHxF/Ffyz6N6Wri27kOfn9vAfPSfFXgd8tORnKqInIe3n6vDHdyBgDTLWUMT0mK8IzkQqgrW0QkmCx2JtlOMX7MSP/QxZmjOfbTG/zPk+MdcBbO1dZHYoYSYccqTjMyNZn4WEekpyLnofX/V6P2JE88fHuEZyMXShVfEZEg8Xq9YLPjsPX8V6sjxldzWLbhJd5+93g4piYRdufPH2f/ofdocjZz9SN7mXvP7yI9pSGptr6RJ57ffUHnVrUE34LL04h1aHWWgUrBV0QkSJqaXRgWKw6btcfjYu2+4OtOzmD5L58Ox9Qkgo6Un6ToRDLX/+GfvLDvIAAf2DMiPKuh6TMPbOTbz1fyypvvnPe55+oagZZvbGTAUvAVEQmSspOVACTG2ns8zmFXl9lQUnHmnP/xrtdLfQ+8ri6Pfevw+9rOOoTKG30/lJYcOgbAoaPlnDpb3adzq1uCb0piXC9HSjTT374iIkHyRun7AEwcPazH42JjbIAv+FiMUM9KIq28XfB94ng8AEZMPG63B1u7bwd+V/Q3fvBqA/E1z3Bw3cqwz3Oweual/fzz6AmK9h3BlZwFwC9fPMKt82uZ/5s3sNZ8wLvr/r3X65yoqgXiSE2KD+2EJaRU8RURCZJ/HvMtWzZ5/Ogej4tztFWEjzuyeP/EmR6Ohl2vHaC2vrH/E5SIOFHZdUXx67/4c8DzH7zaAECDtrMOmt/+dRdffaaCtQdMyhxZ/nFXciYffuAZADzJF/V4jfdPnGHaHWt59sBJzOYmPjVreiinLCGm4CsiEiTvnfRV9i7PGdfjcbExga0Qn/j5HhqanP7nXq+XW77/ew4dLafsZCVL/3KMT37n98GfsITFqXN1XY6fONfQ7Tl/eObvoZrOkPLKofc7jU0yfTeUGnHJAJjO7v8cAO5Z/zS1SVnUJGVhbaoiMT42+BOVsFHwFREJkg+qGjFdTiZmjurxuLgu7gh/fMer/sdbXizhlYZRXP/jIj77g80AnHCMDe5kJWwqa8+/Wv+DrYdCMJOh5cSZKl6sGRkwZjrr2f6jr/Ahb/tAbPZ4nT0n2nquJ6d4gjlFiQAFXxGRIDnV4MXSVI
3F0vNfrdYuGnvfq6j0P/Z4ff8QNxqxnEnI8g021QZtnhJeZ+u63qHvnUonq37+uP9mNktNhf+1ZEvXN79J333hwccDnj/779P5xz3XArDjxyv40iQPk8zjGI4EKqtreemNQ4z/9tPMvnOt/xyv14sndhj26jL+++aL+Z/vLwvrZ5DgU/AVEQkCr9fLOUc6SfT8tSmAy9P5rv3yyravw39SVAKAmdRWObZ4nJ3OkYHh3TOBwTeuxreiQG1SFk+fSOb3//siDU1OPI5kMpqOElN9nBqvnY99fS1bXngNgGaXO6AdRnpntPx8GV9zjLfuv5apF49l3OgR/tfvuz2fqRm+G1EPHD7O/Y/twrBYORGXxae+/WuyV67nqz97HCMmjmuy4vj4jEuI0YosA56Cr4hIELxXdhIjJp5JI3pf4/OGOTO5Lr2aL03yML7ZF4LqmtoqfKfiszqd47UnBG2uEl5nPTEktoRdgH0PB1YNj5w4yz9eP4ThSGD2xSOIs3hwJ2dQHpvFz//XF3wvXfU7pvzn85w4UxXWuQ9ko5J9vbgv/OBWkhK6XoIsa1QaAO8cP0F5Q1sk+pdlHO7kDLae9gXjOZddHOLZSrj0GnyXLl1Keno6l156qX/s7Nmz5OXlkZOTQ15eHlVVvv8RTdNk1apVZGdnM336dPbv3+8/p7CwkJycHHJycigsLPSP79u3j2nTppGdnc2qVaswzZ57bUREotGn1jzb52MtFgu/uvsW7rs9nxcfXonpdlHf7O75JJsWzR+o3PZEhjna/m3ruFXxeyer2f7aPwGYPSWL9hv/HbWPZ8+BUppTfD3esx96hYkrf8eb/zpGNNj12gEuWfnrqJlPe40t/0+lJSd2e8yHxqYD8MPdjTSnjMVSc8L/2gxbOeBrQbl+zswQzlTCqdfge9ttt1FcXBwwtmbNGubOnUtpaSlz585lzZo1AGzbto3S0lJKS0tZv349K1asAHxBefXq1ezZs4e9e/eyevVqf1hesWIFjz76qP+8ju8lIjIQeJPHANDsvoDNB9xOGprbzjMbazodYtjsNDmbL3h+Ehm19Y0YccmMTIzhrhk2Hlnga1+5xGzbqvr9s428dqQSs6mOz12dS40n8Ov0m/77XwHPvcljWPT7A/zHr58K/QfogdvtYelfjtGUPI5Fvz/Aa28fDtl7feTOtWTd2/cfLgGaXF5MtzNgreSOpkzMDHh+2fC2/w+3/GA5h38wn/fWfUlbFA8ivQbfT3ziE6SlpQWMFRUVUVBQAEBBQQFbtmzxjy9ZsgTDMJg9ezbnzp2joqKC7du3k5eXR1paGqmpqeTl5VFcXExFRQU1NTXMnj0bwzBYsmSJ/1oiIgOR23v+31oZsYkci/Gt3Xq6qsa/zFKrjyWcAuBsTdfLYkn0OvCuL+BmpCZw583zWHR1bsvztk0QymwZnHUa2JprsNmsPHTTFZgN5zpdK8cbuDTX48fiQho2e7Pqkc0Bz298dB+Hjpb365qnq2r40WPP4nYHrp5QEZcF+Hrp3y070cWZnTW5veDu+SbBzPS2fDPWeZTH7yvgt4sy+eE1vhaHnkKzDEwX1ON78uRJxozxVTdGjx7NyZMnASgvL2fs2LYldzIzMykvL+9xPDMzs9N4d9avX09ubi65ubmcPn36QqYuIhJSFxJ829v3z/c6jSU6fBXAyurzC77X3/8oWV8t5HRV5wqyhMfbR3z/pk0cnRowbrZbQsuw2alPHk+i4avoL7o6l2O/uJWfz09norutheBjHxrDg3NSOfDAXK4d5tv05K9/fyPUH6Fbu48EhnMjNpEFP+nft7a3/PCP/PptmPOt3/CZ7/yWDU+/EPD6xO9sY+6v9lFdW8+ps9W8/e7xri8EVDtNDHfPS8lZLBb/6g5bf7CUWEcM8z5yGbfM/1i/PodEr37f3GYYBoYRnj03ly9fTklJCSUlJYwcObL3E0REwqDsZNtSZPNnZJ33+VmuY5geX2Xq4FHfkla3Tm
hbCSAh1rfhxdnzCL4NTU72uy6CxBHsP3TkvOckwfHWEV918opJgbuxVdX7Qm5sTVsVd0Rc4D/Jn71mJr9ame9//pGpE/jCvI+SGB/LT1ZcD8C7JzpXhsFXGZ3zjXX87PHgtw8+uXMPt3z/91TGZmA21ZLQ7sY9M6nnXQt7U9noazU47sjiLW8m33/xLMWvdA7331z3F6657498+tH/49RZ3854h9+v8D8ufPbvVCVmkUp9r+953+35bP/RV7q9AU4GlwsKvqNGjaKiwveXc0VFBenpvubwjIwMjh9v++mrrKyMjIyMHsfLyso6jYuIDCQvvenrwbwlq4m7vjD/vM9PibVhWO243R7O1PiWQ7tq2sVMt5Rxe46LtCTf1+IFTx5h/6HOFeH2vF4vs+9cy5T/fN4/9uWish7OkFBpcjZTdMLXtjJzauCqAJdP8P27+R+fngp1vm8w58/ovFXxJVm+7XStNR+QN2uafzw1KQHT46a+uesNFY6frOSIfTyPvOkhZ8VvWfXzx7s87kJ867kzvNIwCsNm55sfTeOlNUuYzHHM+rOYbifNrl5u1OyBq0OLvGF38OU/H/Q/b10d47mq4f6tnZ95+Q2qa+v55Lr9XPnjl3hy5x5+UfwWAPfkf/iC5yKD0wUF3/z8fP/KDIWFhSxatMg/vmnTJkzTZPfu3aSkpDBmzBjmzZvHjh07qKqqoqqqih07djBv3jzGjBlDcnIyu3fvxjRNNm3a5L+WiMhAUVHpqzJlZ4zo5ciuxcX4+gi/+F9/oLreV+kdMSyJp3/4Zb637LOMTR/mP/Zz6/dRWd39Zha73yrlREs/ZHvvnzhzQXOTC/fQn9qqrR1Xcnhg6SL+9rVcCj79CX576+X84OoUvvXFhZ2uYbFY+NvXcnnr4SUBG6NYLJaWmyK7Dr5vv9f2w44rJZOnTyRT9GJJfz9Sp97bGTljSU1OZNuar/DJDBPD5uD/So9R8MM/9LkXt70GOt9EZsSlAPDIglF8fd7kTq9//+U6LvuvF/zPv/XcGSoTskhvOMrNn/rIec9BBrdeV2L+whe+wAsvvMCZM2fIzMxk9erV3HvvvSxevJgNGzYwfvx4nnjiCQAWLlzI1q1byc7OJj4+no0bNwKQlpbG/fffz8yZvuVAHnjgAf8Nc+vWreO2226jsbGRBQsWsGDBglB9VhGRkDh1rg6IY+zI1F6P7UpcjA0aYXfjaDh9BhIhPS3F//rotBTAF3aNmHiuePDvZDqP8tLP7uh0rV9seQno/HXzgXePByzeL6H37okqoPvf8wkZvhUe5n3ksh6v03pcJ24nTZaue8oPl3W+D+bObSdZdHWPb9Wt+9b/lX3vneJbN3wcgEnmcUzT5CPT2r7hGDUsAc7BDRvewLCP5Cu/+B+e+/GKPr/HXY9sxpN8kf/5denVPHPK9/+Bo/o4i67+NAAXjXid+R+5jLcOH2fpL5+hsnV3QyCj6Shl3hTi3bXc/VlVe6WzXoPv4493/fXIzp07O40ZhsHatWu7ONq3HvDSpUs7jefm5nLgwIHepiEiErVOVTcAcYwdPfyCzo+PafdXcaIvKKWntq3skDdrGnRoVyhzZOF2ewLuOvd6vb7w3GKC6xhXXjySP78fz7f++CrX5k7Vskwhcvj9Cub+5HkSPbV8dHwij96zhDfL6yBxBH/6QnZI3tPideHs5l7KY6eqgEQ23phF+ekq7nvR963Eb/+6iy9ff+15vc+TO/fw3+85gLGs3PgPSB7HbddO5wvzPhpwXOaIFDjqxLD7No443dD3pf22vPAa/1ORFDCW+6FMrpudzFeeLuei+LZrLfzY5QBc9qHxPHXvjTz+/F4Wz8mlovIcV13+6fP6bDL0aOc2EZF+qqr3bSV7oRXVeIe901j7gGqxWEipPdrpmH0d+n0/OO1bH91SU8EvF47mbz9dyaKPTQegPnk8c779KM+8tL/TdaT/Njz7MkZCGvXJ43muyvcDUKWRxIj6o3z0sk
kheU+7t5l6r++f8ZfeOETWVwtZvcG3JOgHZ303Qk6dmMniT87GUe27z+bBvY3+bZD7akfJO/7HTcnjAJg7dOQOnQAAIABJREFUc2qn426d9xH/TZoALm/fb3zf3vIe3/iwHaPWt1LUwo9exvyPzuDheSP54703d3nehIxRfKfgM2SPG8NVl3dugxDpSMFXRKSfTtU1YzY3dOrj7KvUpMC7yZO7CLlvrr0De01g1fexHYEB5tCxDwC4YUoyn/nEFQBMyx7nf70iLouvPlNxQXPc8PQLTF25jl2v6Ru6jg6/X8HRU9UBY8cqTkPcMC5K6vWL1Qs2wuGhKX40H79rLd/cuBMSR7Cx1E7RiyWcqHFiej2MGJZEjN3GO7/+CgtH+laA+GnR3vN6nzN1vh/sJtN2k3r7VpxWKUkJPPPlD5PRdBR7dRnO3r9U9qtt2bL7c1dfwTN35/EfV8b53+P6OVdy0ci0nk4X6TMFXxGRfnji+d2UObIwYuJ7P7gbi+fkBjxP6lwABqA56aKA5+VVbUs1vfLmO3zpr76lsT6c07Y+elJCHJ8ZFRjKLsRDxf+kPnk8S/9yjMJn/97v6w0GXq+X8V97jE+u28+rjYF91d/4dRGGxcqHLrqwvu++GJlgx7DFUObICrih8c5tJzliH49hsQbcELfuG7dC3RkqnPZON6l157/+8DSvu3z/3T35wG3YaspZNLr7daGnZY/j5Z/fQZLVjdvS97aaeqdvJYj0tBSmXjz2vNsxRPpKwVdE5AL99q+7+PbzvjV8p9D9Qvq9yR43hvSGo/7nN1w5scvjDKPtr2yzuYH9tUl4vb7exzt/17Z82eeuCQzSv7zrloDnxyrOfwMgZ7u77Xfsf/e8zx+MNu94FSOhrRJpqyn3r79c0uzb5GnVjaELcF/9zGwATLOt/zWmuuf/DudmGriTM/rU8nLqbDWPHvL1kCfUHCMxPpbD65bzyNe/0Ou5sTYgcSQrHvrvXo8FX/A13U5i7KGrkIuAgq+IyAV7cG/brlBb13ylX9eytrRD5njf5+5bul7d5iNx7ZaHMk2M2CReO+gLoWe8cZgeF9elV3d5A1vxV2Yww+bbRexv+/7J939fxPivPcZXftK3YGI323o3K6p73g1rqCjeVxrw/MMjLcR02OI2lCtp5M2eztE1n+bIg74bui41ytj23c8GbCjR0Scv991ot/P1UqasXNfjlscvvdnW2/vmL758XnP74sd9fc373+/92wav18u/LON6PU4kGBR8RUTO04kzVf5KK8C/X9K3r4174mm5Oz89KbbbYx7/3jKoO8PcYZV88RJfwDrRsoaw1+pgrLucX919S5fnXpKVwar/396dx0dVXo8f/9w7M9kXkhCWJEDYESqErSCIyK6IKAUVtKxVVKDi8rNVgaqVikvBDdGiFQsULGDRCsgikgAiIAQBga+sgRAgZCN7JjNzn98fEwYiBEPIZCbJeb9evJKZ3Jk59+QyOfPcc5/nHudV+C9tzeWTw2a0wHDWZoT96vyu0/7xOdbQRkTmJ+Gbc4pjqn6Z7Q6GYZTKTU12sS8VQBXl8u4fh3Nbh5au+6KKkqokDl3XSXrtLlbNepTmMQ3Y9vrYMrftWLKC3FepoRSENGHEgn00+eMi3l++ge/3HWZbSbF7Ji2TPUecPeUTWtlLzR5SHpNG9EflZ2Eqo8q4fJGLDn/8AIC61or1nwtxPeScghBCXIc/vLaQjRecV+2bcs5gD6jLtHHDbvh5L65YVSfg2n2RSXOdRc2cJV8DBs9+tovUrByw+BNgKrv3EqBv19/AwkQIdMavHHY0k5lv9xzhnt5drvqYEymp/PuEsxhvXT+AwV1a8UL8Bf6yKZPf3V5YapnX7Nx8OvwtHlPOGY7Ne6Q8u12t/ZhphpIZuB7rFESDumE0qBvG8U43kZB4iC43eaZPNTQ4EFWYg7/tyuWMWzVuCFxaAlgz+4LZl9mbTmLsdi6jbPrH5pL5dJ0Xa04cel
uF4tCUA/tVPgP1eup9kn1jebazDxv2niA3OBaAb/5WdsEuRGWREV8hhLgO35y7NHODIySKkIIzlfK8FwuEX87wUJbwkEAAikMb8er2QjSLL0G+vz6WMbnbpVPvm592jgBn5hWVuf2GHw4AEJqbxMfPPsSDd/SkpXEKzWzheMp513aHT55xrZ7lCImqFaO+Kti57LAqzGFIj/au+3Vdp0+XdqU+FFS176cPYt87V374uPxit4uUMjBCGrpuX76IREhuEg3qVuwCPU0ZrjMZl0v2jQXgzd3F/GiPBpwzRoSFBFXodYS4HlL4CiFEORmGAWYLym513RfuV/65Sq/FXvJ2HB5UvmIpJPDKlogQ/1+/iv7Z3w9mYHgmH93biOiSKaKOpxdgGAYFRdZS2xbb7JxJd44ivziiq6t3uFsL50piSZddJPfhl6VbHypyAV11deKtB0pNG+cNoiLDy7xQ7In2zmPttb7h7H7+No7MvBMt99KHmKENcvitr7PtoGuj4Ks+R3lcPuL7w4GjnDqXzp/eX37VbdvFyHRlompIq4MQQpTToRMpaGZf+tXJYOMF58hvi3qVM0rlUM4C+moF7dX0imuD+uIYmv+lFd5+3798S7TO/9No1/fKYSPFL5amTy8H3cSayd3YcfA4f11/EgIjqJufAoGxxDa8NFIcFR4Cx62cPJfJ4ZNnyMjO479nnAVSa5XMz1oj9h1Ndi21e/D4ae58bxsB1nQCTAofHc46gqin57Hz3SuXXa4OLn5IaGmcuuooqjd7+sE7+WUr+PsPxjHpqzMoaz7vPjkKwzB4+7N1TLnv6j3j5aFjYJSM+I5YeKhkVhLntH/KbkMzO+ft8885xRMj7qvw6whxPaTwFUKIckr8OQmAVtERbCxpn7zlpkaV8txGyVRlgeVcBCMyLIST74xi/9FT7D+azK1xrSs0g0DdohQyAmPR/JwF/PYDx3hlW76rDzg9MBblsNOq8aXT3z1ubg67DjJnj405e/a47r/F/xyDu7ZjxuYcEg8nu/qGB8/fi+YbSKFvIJfPB3Geuox99VOyC4p5oNdvmPv1Hk47Qlg8oYvXr8J1Nt25Sl50nYrP3+xN+nZpB1+doTHOkXpd18ucXaS8NKVwaBqGYZSaii84J4m9cx9nxbc7OZuezdSRj9/Q6whxPaTwFUKIcvp273GgLnEtYuCA84r3mMjKWaAgwmwjFWjSIOK6Hndzi8Y3dJq9T+tIVly2INwr2/Kv2MaUn05QwKWR6PYtmwAHr9hu6Yt/IC0rhxmbt/CvYz78a/IClCXAVVQ3tZ2kRb0g15K+AAk5kQD8uCkL/GLRgNH/Oc6s81mMGtSjwvvlboeSnL3dkSGe6+OtTH6+Pmx+shv1wgZU2nPqmsKudJpOXYIWeOn/ycwHuqHrOvf3715pryVEeVWv8zNCCOEBbSZ9QNtJ8/g21Vn89erYxrVoQHS9yulNXPPKGF7o7k+PDq0r5fnK67XHhvNQ0yIWj2xe6v4VY9rQweSc93dUhzqlfqbrOqogy3W7Tl4SH93rHPmODAshMj8JgLp6AQ1VOjHWJP477iY2zZ7ER38ewz0NcvhdVC6LRzbnywm/YVB4JlFFSTzd0cKwhrkArE8sPUeut1n3w/8BMLBLGw9HUnkaN6h71TmgK0rHwKb7lip6gTJnEBGiKsiIrxBClOHoqbO8+994ikKcI6oaQO55Avx8aa1Oc1hrTGzDyEp5rYjQYCbeW/XTX5nNJv726HAAbl31PVvznDMVdGnbnKUzYlj7/V5+1+e3Vzxu4dhOfPT1DgZ3acWDd5Tu093y90ew2R1lzmrwy5W//nFZz3FBkZWVL31DdkHxDe2XO30R/wNfpYaiinLp06Wdp8PxWibAsARQOZd/ClE5pPAVQogy9J+XCISUuq99qLMgW/PqRLLzCzw6ZVVlWzx9POczsym0OvcxwM/3qkUvQO/Obendue1Vf+bn60M5W5WvEODniyouYn+2wbHT52
ge06BiT+QGn3yVwMgB3Zn635/RAsJo43vhuhd2qE1CfBR5JW0uvjmn+PeUgeTky6p/wrOk1UEIIcrJNzuZf08bAzhHSiNCKz7Vk7eqFx5Kk0oaxa4wWwG20BjGzv7cs3GAaz7iT1dt5q/f5XHPXz4B3YKy5vPRUyM8HJ1369b0Ui/3uO6N6NK2uXMRFSE8SEZ8hRCiDKogi0B7Dv/vznYM/O1vqFtnQKX2QIqre21Ic57flEWOB7sdzmdm02f6v8kPaYLKz3L1qR7RG6P5wbCGuRWaRaM2iWsRxcqzzp7tXu1beDgaIZxkxFcIIcpi9iM6UGPC3b2JqR8hRW8VGTWoB+acFOzKc92hv31jK/khTQBcRa8qcM5hZ85J4c3JMu/srxl9562u72+NqzkXAYrqTUZ8hRC1zthXPyX+rInEmfeW2a6wdN02NB9/AqWH0yP8sFPkoT9RWTl5ru87mFLYn2Xi4/Hd6NzmdlLSsmgWPUB6e8tB13X0nHOeDkOIUqTwFULUeBf7NHVd54cDR0nIiUQLhM6zNvNUnJmpIwdd8ZhZ/9sDwbGMvO3mqg5XAAFmRa6pcuZIvl4HjjsnNm5cfJIv50wq9bPQ4EBPhFRtHZ07vtqtbCdqNjkahRA1XpvJ82n2wtes+W4P9y36udTP5uzMI6+gyHV76bptNHnyP+QExxKZn8TIgbdUdbgCyLab0HwDePSNRVX+2qfPO+coHtQ+pspfu6aRold4GzkihRA1XnGoc3GFSV85V9uy5Fxaqkzz8eOZucsBWLU1kec3ZblWGvvLfbKylKf4aA4A1mWGE/vc6lIFsGE4pzrLyM6t0HMvWrOF85nZZf588/7jADSMCK3Q8wshvJe0OggharSvNu++4r4j8x5lxF8+Zl+aneLQRsQn5dPzyffJsQHBsQDcEpDK3bfdVbXBCpcvnh9Ov7mXfnfrMsOx2x2YzSb+8PoiNmU7Z1T4131N+ffG3aw/5aBVQCHr33j8ms/72JuLWZsRxgfrF7P574/xyaoE9p84x7dHLpDnEwE+/mh6HZStiB43x7l1H4UQVU8KXyFEjbb7cDJgYXh0Hp+nBDEoPBOAFX99GIDmkz7CGtKIFADnisTsn9G3Ri1MUR1dbeGKKW8t5cNnf8/25ALXuiJjl58AwiEIfrYVsffwSeIT/48/3j+g1Gl2wzBY9s0O1mY4+4bP+MXSYvrakp+GQkgoWPOhuBAFLB7bnjax0e7dSSFElZPCVwhRY+UVFPHfPSkQHMsTI/oyuxwLM0xoZZei10vcFpzG5txI/HNOURjSmLUZYfxw4CjGVbbVc85hhDTgnk9+AmDOzs9ppM7z7L3dWLX9IOvTgkHT0UxmwvOSuGD4YYQ0QM85x6BmvvRsF8vv77yLImsxZpNJZm0QooaSwlcIUWM9PmcpOSWtC2WtRmbBjgOIM6fQNiaCv0wYXnUBimtaOG0cn67azN29fs8d0xeSFhjrvDgxpDFdfM7StWVDcgutvDDmLk6dS2fQ3J1oZudcy5pPAKeJZerXqUAEmhmCc5PoFB3Iv16bDEBGdi7BAf74WC79KZS5moWo2aTwFUJUC8U2O2aTXu6rxO12B5tTzWiBMPk3ZS+E8NEjtzNt0SaWvPQHAvx8KytcUUnGDbkNgE+m3s0Dc1ZRULKoxD3dWzN6cC/Xdm1ioznxxj2u48NudzD+tYUU2RwkX7DSv11DZk6cXOq5a+KS00KIa9OUUsrTQVREly5d2LVrl6fDEEK4mWEYjH9tIQk5kUQVJRH/5qPYHY4yi9S9h09yc4tGTH3nM75KDWVAWAYf/XlMFUct3OXTVZvZdvAk855+UNoRhBBXda0aUUZ8hRBe7eP/xZOQ42xTOOMXS6sZ6wAYGJ7J9DF38uma73jqgYEEB/oz9e2lfHkuBPgJcE5FddvNTT0UuXCHcUNuY9wQT0chhKiupPAVQnidjOxcOv1lFf0aFrMrOc81xdjl1meGs/7tHYCZT175lv
EtbSQczYSgENc2EflJjBr4WNUFLoQQwqvJAhZCCK+zMn4Xmn8I316oS05wLK2MU+yf0ReVn4nKz2RqhytPcS84YuFCUCxa7nma20+x7Zlb2P3eZDkdLoQQwkVGfIUQXuejbw+Cf6zr9trXHkXXdRJn3uu6Cv+pUZf6ebs88QGZQc7tezSAf8+49iIGQgghaicZ8RVCeI0zaZm8vmg1qf7OUd6762fzQnd/15X6EaHBpaae6tCqCbqus+vdx1HFBQC8+8QIj8QuhBDC+8mIrxDCK0yZs4RV550XpKn8LP7zt9GEhQSV67G6rrP52dvZdeiETFElhBCiTFL4CiE8av/RUwybsxZ7yKXlYf96R+NyF70XNWkYWeYiFUIIIQRI4SuEuIoia3G5V7DKKyji9cVr+DklkwNpxVgx0zjATpOIQBY8P9a13cHjp9m0+xAHTp0nyM+H74+eJ8uqkRfSBOUXQSfLGd5/YjhRkeHu2i0hhBC1nBS+QgiX/0tKYcGabfznVAD3xeTz5pT7Aef0YqGBAew7epI3/7OJg+eLsCuNqCCNI/l+EFwPaAglM4kdB45nQ5OpS4jRssgs1igsWXEL6gCgzNFotmxCc5N4/p5OjBwoSwULIYRwLyl8hXCDHT8dYftPx/nD0N4EBfh5JIbTqRk0iKiD2WwiIzuXhMRDRNYJxqTrHDmdSvqFfIodDtKz8zmTmce2rEA0v2AgAIDlpwNZ9qf/4ZN3luKghqAMNJMFaAAlbbSHDQcmlcbNphQGd27B+LtvY8pbS+nYPIrXNyWjBUWSQij4QSNrEvd1b0nrxvXJzi+kX9d20o8rhBCiSsmSxUJUglPn0vl210ESj6aw7Vgmaf6N0HTn/LF6zjkM3YTJsGHBjgMdhYamFCbNIMRs0CTMlxB/H3wtJvwsJmwOAx+zCT8fM2ez8kjJKsRhKHzNOoZSKMVlX6HQblDsAF2DXLtOgaUOmr9z+FUpA0379QlclL2Y+sVnGNCuAcnpuWzOLemXzUunpX8Buq4R7GtmdL84buvYhkMnUohrHVvm0sF5BUWcOHOej1d9x8xH7iU40L9yki2EEEJcw7VqRK8pfNeuXcvUqVNxOBw8/PDDPPfcc9fcvqoL34PHT3P41FlOpWbxn++P4GvW6NGqAX4WMzkFVvKLivHzMdM4sg4tG9Ujsk4wwYH+xNSL8NiIn7crKLKy+9Bx3vrvVk5nFzOsSxPaNmlATn4hF/IKSc/Op7DYzp6T6ZzM1fDRDBRgoOGrGwDYDQ0HGnUsBpGBZuyGIizQFz+LiQKrjXyrnUKbQZHdIN8GFt1ZHOoaaBpogKZpXPxvcPE/g1JgM6DIAValY8eMGQegUEpDaRoAJgyU0igKalAyGuoUnpeETUFucCxabiq+yopFM7AaJkyagQlwoOFQGla/cDTfwGvmStmtzuBMZlCGM0DXPwMMG5rdCpqO2bBS12IjwKLha9ax6Bomk07jiCAMpdA1jcjQQBrVq4OP2UxU3TpER4bRuGFkqanC7HYHe34+QeebmrmmExNCCCG8ndcXvg6Hg1atWrFhwwZiYmLo2rUrS5cupW3btmU+pqoL305T3ndNkH+9lK0IDAearZAgI586vop0q06RHkCQkY9JUziUhgMwlI5Dc44IOnQflMkClpLCubgQk70IDQMdo2TEUGHWDCyasy4ylPOrUqDQMCi5jbNQM+N8jKaBQ4FDaRhoGErD0HQMdJR26R+ayVkhKsN5qruk0NJQWIxi/HUHugZ2BXalodAwodA152FlKGcMBhqq5DUcmhnD5At+wa5R0Wvmz25DL8xAL3ldALvui27YXbko9quL5nP1DxjKVgT2YjSH1blPlFS9l391blnyxflVM+zohg2LsmPWDIoxoaNKxmtL4lDOgrBRoOLW1g3oHdeSxvUjaNG4IQC5+YW/OtJ5OjWDH4+cpFlUPYptNnILijDpOja7g/wiKxGhQXT7TctfzZMQQgghrl0jekWP78
6dO2nRogXNmjUDYOTIkXz55ZfXLHyr2uSB7TiVmkWD8BBui2tFWEggW/ceRtM0woIDqBMUwIW8Ao6fSePYmQyy863YHAbZBcVkGzasSpGtFBcs4eRiwkQOgUY+hVhQSkPHQFcGuqbwxY5JU/hoxfjpEKDrKAUZhoGhOwtWQ2k4tJIRQ3xRmNCUs8y9WJRpJd9rl322KdJMGEoHpbmKRr2kgPbBjq4pTBqYNDDrzn8azoLaoUp/zdPMZBMAaJiUDZNyAGDXTCiluWLQlYGmgUlTWLDjo9kI0AsJM+XRKCKQNjGRtGnSgI2Jh9E0iAgJIDTAn+AAX8xmE306tyUyLOSav5/c/EJS0jJp3KAu6RdyKSiyEhEaTFhwoEeXrC3P6f2Y+hHE1I+ogmiEEEKI2s0rCt+UlBQaNWrkuh0TE8OOHTuu2G7+/PnMnz8fgLS0tCqLD+APQ2+/4r77+3ev0hhqukG3dKjwY4MD/WkT6JwHtnGDq/ecCiGEEKJ2q1aNexMnTmTXrl3s2rWLyEiZqF4IIYQQQpSfVxS+0dHRJCcnu26fPn2a6OjoazxCCCGEEEKI6+MVhW/Xrl05cuQIJ06coLi4mM8++4yhQ4d6OiwhhBBCCFGDeEWPr9lsZu7cuQwaNAiHw8GECRNo166dp8MSQgghhBA1iFcUvgCDBw9m8ODBng5DCCGEEELUUF7R6iCEEEIIIYS7SeErhBBCCCFqBSl8hRBCCCFErSCFrxBCCCGEqBWk8BVCCCGEELWCFL5CCCGEEKJWkMJXCCGEEELUClL4CiGEEEKIWkFTSilPB1ERdevWJTY21tNhlEtaWhqRkZGeDqPGkby6j+TWPSSv7iF5dQ/Jq3tIXt3nYm6TkpJIT0+/6jbVtvCtTrp06cKuXbs8HUaNI3l1H8mte0he3UPy6h6SV/eQvLpPeXIrrQ5CCCGEEKJWkMJXCCGEEELUCqaXXnrpJU8HURt07tzZ0yHUSJJX95Hcuofk1T0kr+4heXUPyav7/FpupcdXCCGEEELUCtLqIIQQQgghagUpfIUQQgghRK0gha8QQlQi6R4TQgjvJYVvJTl37hwgf/Qq24EDBygqKvJ0GDXSd999x7FjxzwdRo1TWFjo6RBqJIfDAch7bGWTvLqPYRieDqHGqYzjVArfG7Rnzx769evHjBkzANA0zcMR1Qz79u3j1ltvZfr06WRkZHg6nBolMTGRgQMH0rdvX7Kzsz0dTo2xfft2hg8fzuTJk1m/fr2roBA35vvvv+eRRx7hrbfeIjc3V95jK8l3333H2LFjmTlzJpmZmZLXSrJz507effddAHRdSqzKsnPnTh555BFef/110tLSbui55LdSQUopnnrqKcaMGcPYsWP56KOPPB1SjTJz5kxGjBjBypUriY6OBmRE4kbZbDYeffRRJk6cyBNPPMGgQYOIj48HZGTiRsXHxzNp0iR+97vf0bp1axYvXkxWVpanw6r2EhISmDJlCn379uXMmTO8+uqrrFu3ztNhVXvHjx9n0qRJ9OnTh5MnTzJjxgxWr17t6bCqvbfffpthw4Yxc+ZMvv76awD5AHyDHA4Hzz//PBMnTqRnz54kJiby8ssvk5qaWuHnlMK3gjRNIy8vj44dOzJmzBgAjh07JgXEDTIMg+PHjxMUFMSTTz4JwIYNG7hw4YKckrtBVquV3r17s2XLFoYMGcLw4cM5dOgQdrtdRiZu0P79++natSsPPfQQo0ePxmazERQU5Omwqr3ExER69uzJqFGjmDFjBqmpqXz22Weu1jJRMT/88AM33XQT48aNY/bs2cTFxbFq1SqSk5M9HVq11qxZM1atWsUHH3zArFmzADCZTPI36wbFxMSwbNkyxo0bx9tvv8327dtvqKVMFrC4Dtu3b6egoICIiAgAbr/9dl588UUuXLjACy+8wLZt21izZg2NGzcmKirKw9FWH5fnVdM0TCYT06ZNo3nz5vzpT39i8+bNbNmyhcOHD9OrVy85JX
cdLs+txWKhffv2WCwWAHbv3s3Zs2cZOnQohmFIXq/DL98LfH19+fOf/4zVauXhhx/Gz8+Pbdu2UVxcTLt27TwcbfXxy7xmZmaye/duunfvTmRkJBs3biQvL4/CwkK6dOni4Wirj6+++ooNGzZgGAYxMTEEBATw6aefMnDgQOrXr4+/vz+HDx8mOTmZ7t27ezrcauOXeW3RogX169enefPmrFy5koyMDLp164bD4ZDBhetwMa+aprnyGhMTg9VqJSwsjDVr1tClS5cK11nymyiHCxcucNdddzFgwACWLVtGfn4+ACEhIUyePJkVK1Ywa9Ysli5dSsOGDfn8889vuAelNrhWXsePH8+MGTOYMGEC69at4+GHH2b79u1s377dw1FXD1fLraZpKKVcZyV69+7NypUrycrKkjflcvplXvPy8gCIi4tj7dq1JCUlMW/ePOLj4+nZsydr167l0KFDHo7a+5WV11atWhESEsLYsWMZPnw4ycnJdOzY0fVzGUm7trNnz3L33XfzxhtvkJWVxfjx41m3bh3NmjXjlltuYdmyZQC0bt2atm3bkpmZKRcTl0NZeTWZTOi6jp+fH8888wz//Oc/SU9Px2w2ezrkauGXeR0zZgzr168nPDwccA4w5ObmcuLEiRsaXJQR33LIyMiguLiYYcOGcfbsWQBatmwJQLdu3Rg5ciRt2rTBbDYTFBTEkiVLGD16ND4+Pp4M2+tdK6+hoaG8/fbb9O7dm7i4OCIjI9m6dSvdu3enYcOGngy7Wigrt5qmoWkahmEQGhrKgQMHCAgIoE2bNh6OuHr4ZV41TXMds9HR0Xz88cfce++9REVFERYWxurVqxkyZAihoaEejty7lXW8RkRE0L9/f5o0aUKTJk2YOXMm2dnZLF++nAcffFDOUvyK+Ph4AgIC+Oc//0nv3r0xm82sWLGC+++/n8LCQhISEoiKiqJRo0ZkZGSwZMkSJkyY4Omwvd4v82qxWPjss88YOXKk65hs0qQJe/fu5eDBg/Tp04fTQECHAAAI0ElEQVSdO3e6rlcRV3etvF60ZcsWUlJSGDduHHl5eZw8edJ1hqi8ZJinDAsXLiQhIYGcnByio6OZOHEi999/P35+fuzYsYMzZ864tg0LC3N9v3v3bmJiYjCZTJ4I2+v9Wl5TUlIAaN++PW+++SZz584lPT2dxYsX89NPP133AV6blPeYVUqh6zpWqxUAPz8/1/3iSuXNq9VqpUePHrz//vsAbNy4kYyMDFd+RWnXyuvOnTtdefXx8aFPnz6uP367d+/mjjvu8GToXm3hwoXEx8djtVrp168fo0ePdv0sIiKi1KBNx44deeqpp8jLy+PAgQM0btyYgoICT4Xu1a6V1/DwcG666Sbg0oXCJpOJ6dOn8/rrrxMaGkpiYqK8x15FefNqs9kA59mhRo0asWDBArp27cqPP/543a8pI76XUUpx7tw5hg4dyr59+zh9+jQrV66kd+/ehISEYLFYsFgs7N69m+LiYtq3bw84/+Bt3bqVESNGkJqayiuvvEL9+vU9vDfe43ryarVa6dChAwCdOnUiLy+PlStXsnXrVubNm0eLFi08vDfepSLHrKZpOBwOfH19+fzzzykoKOD222+X0bPLVOSYNZvNhIeHs2HDBt577z0OHjzI3LlziY2N9fTueI2KvscCbN26lWHDhpGens4zzzxDnTp1PLgn3uWXeU1JSWHFihX079+f+vXrY7PZMJlMbNy4keTkZIYMGUJQUBDdu3fnwIEDLFq0iC1btjB79mw5o3aZiuT1YkvZsWPHGDNmDE2bNmXZsmXcdddd8h5boiJ5vTiYOGvWLD788EPCwsKYPXs2ffv2rVAAQillt9uVUkr9/PPP6qGHHnLdN2XKFDVs2LBS286ZM0dNmzZNXbhwQRUUFCillPruu+/UypUrqzboaqCiec3JyXHdX1xcXHUBVyMVzW1+fr7rfqvVWnUBVxMVyWtWVpbrvaCgoEAdO3asao
OuBip6vObl5SmllEpJSVGrV6+u2qCrgfLk9eI2Q4YMURs2bFBKKZWamqqUUspms5V6vxVOFc1rRkaGUsqZ32+//baqw/Z6Fc1rWlqaUkqpJUuWqOXLl99QDLW+49rhcDBjxgwcDgeDBw8mJyfH9cnCZDLxzjvvEBUVRUJCAr179wbgkUceYfr06fTv359Tp06xZ88eevTo4cnd8Do3ktcBAwZw8uRJ9uzZQ1RUlGsWAuFUmbmVPvRLbjSvp06dIjExkejoaJo1a+bJXfEqlXG8Xmwhk9lyLrnevBYXFxMZGUmrVq2YNm0aq1atIj4+nrCwMIKDgz28N96jMvK6adMm6tWrR7169Ty8N96jMvK6efNmRo0adcOx1Ooe34SEBDp37kxWVhYtWrRgxowZWCwWNm3axM6dOwHnyisvvfQSl3eErF69mnnz5hEXF8f+/fvlzfgXbjSvHTp0kLyWQXLrHpX1XiAXr5RWWcdrTEyMh/bAO11PXl988UUAioqK+PTTT+nXrx+5ubl88803pa5PEZWX14uzEAinysprpV0kfEPjxdXc5s2b1cKFC123H3/8cTVv3jy1YMEC1alTJ6WUUg6HQ509e1bdd9996sSJE0oppb744guVkJDgiZCrBcmr+0hu3UPy6h6SV/e43rwmJyerHTt2qNGjR6s9e/Z4KmyvJ3l1D2/La60ufPPz81VRUZGrn2Tx4sXqueeeU0op1aFDB/Xuu+8qpZT64Ycf1MiRIz0WZ3UjeXUfya17SF7dQ/LqHteT1wceeMBjcVY3klf38La81upWh4CAAHx9fV19Jhs2bCAyMhKABQsWcOjQIYYMGcKoUaPo1KkTIFM+lYfk1X0kt+4heXUPyat7XE9eO3fuDEhey0Py6h7eltdaf3EbOJuuNU0jNTWVoUOHAhAcHMyrr77KTz/9RNOmTV29ezIdSflJXt1Hcuseklf3kLy6h+TVPSSv7uEtea3VI74X6bqOzWajbt267Nu3jyFDhvDKK6+g6zq33nqrXLBSQZJX95Hcuofk1T0kr+4heXUPyat7eE1e3d5MUU18//33StM01bNnT/Xxxx97OpwaQ/LqPpJb95C8uofk1T0kr+4heXUPb8irrNx2mbp16/Lhhx/StWtXT4dSo0he3Udy6x6SV/eQvLqH5NU9JK/u4em8akpJZ7YQQgghhKj5pMdXCCGEEELUClL4CiGEEEKIWkEKXyGEEEIIUStI4SuEEEIIIWoFKXyFEMILmUwm4uLiaNeuHR06dGD27NkYhnHNxyQlJbFkyZIqilAIIaofKXyFEMIL+fv78+OPP3LgwAE2bNjA119/zcsvv3zNx0jhK4QQ1yaFrxBCeLl69eoxf/585s6di1KKpKQkevXqRadOnejUqRPbtm0D4LnnnmPLli3ExcXx1ltv4XA4ePbZZ+natSvt27fnH//4h4f3RAghPEvm8RVCCC8UFBREXl5eqfvq1KnDzz//THBwMLqu4+fnx5EjRxg1ahS7du0iPj6ev//976xatQqA+fPnc/78eaZPn47VaqVnz54sX76cpk2bemKXhBDC48yeDkAIIcT1sdlsTJkyhR9//BGTycThw4evut369evZt28fK1asACA7O5sjR45I4SuEqLWk8BVCiGrg+PHjmEwm6tWrx8svv0z9+vXZu3cvhmHg5+d31ccopXjvvfcYNGhQFUcrhBDeSXp8hRDCy6WlpfHYY48xZcoUNE0jOzubhg0bous6ixYtwuFwABAcHExubq7rcYMGDeKDDz7AZrMBcPjwYfLz8z2yD0II4Q1kxFcIIbxQYWEhcXFx2Gw2zGYzo0eP5umnnwZg0qRJDB8+nIULF3LHHXcQGBgIQPv27TGZTHTo0IFx48YxdepUkpKS6NSpE0opIiMj+eKLLzy5W0II4VFycZsQQgghhKgVpNVBCCGEEELUClL4CiGEEEKIWkEKXyGEEEIIUStI4SuEEEIIIWoFKXyFEEIIIUStII
WvEEIIIYSoFaTwFUIIIYQQtYIUvkIIIYQQolb4/+wDzP++zNOmAAAAAElFTkSuQmCC\n",
            "text/plain": [
              "<Figure size 720x432 with 1 Axes>"
            ]
          },
          "metadata": {}
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "WB57AkOir5rY"
      },
      "source": [
        "Let's do some forecasting on our data with a Prophet model."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "DuNutrLdSUCI",
        "outputId": "023761ab-126b-49ca-88b7-ef6fe1bd6f7b"
      },
      "source": [
        "# Import the Prophet model wrapper from Kats\n",
        "from kats.models.prophet import ProphetModel, ProphetParams\n",
        "\n",
        "# Create a model parameter instance\n",
        "params = ProphetParams(seasonality_mode='multiplicative')\n",
        "\n",
        "# Create a Prophet model instance (just like instantiating an estimator in sklearn)\n",
        "model = ProphetModel(ts_data, params)\n",
        "\n",
        "# Fit the model\n",
        "model.fit()\n",
        "\n",
        "# Make predictions\n",
        "forecast = model.predict(steps=1, include_history=True, freq='1W')"
      ],
      "execution_count": 59,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stderr",
          "text": [
            "INFO:fbprophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n"
          ]
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 359
        },
        "id": "jrVnWaygShim",
        "outputId": "0ea8b9f8-6e15-44d0-864d-4cc218cb9afa"
      },
      "source": [
        "# Inspecting the forecast (predicting one step ahead, horizon = 1)\n",
        "forecast.head(10)"
      ],
      "execution_count": 60,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>time</th>\n",
              "      <th>fcst</th>\n",
              "      <th>fcst_lower</th>\n",
              "      <th>fcst_upper</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>2014-11-02</td>\n",
              "      <td>100.477979</td>\n",
              "      <td>-1767.510883</td>\n",
              "      <td>2247.254977</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>2014-11-03</td>\n",
              "      <td>100.366334</td>\n",
              "      <td>-1872.590654</td>\n",
              "      <td>2010.129827</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2</th>\n",
              "      <td>2014-11-04</td>\n",
              "      <td>100.533908</td>\n",
              "      <td>-2013.697578</td>\n",
              "      <td>2204.157991</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>3</th>\n",
              "      <td>2014-11-05</td>\n",
              "      <td>100.078650</td>\n",
              "      <td>-1946.285033</td>\n",
              "      <td>2158.315272</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>4</th>\n",
              "      <td>2014-11-06</td>\n",
              "      <td>101.371062</td>\n",
              "      <td>-2064.808847</td>\n",
              "      <td>2150.947470</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>5</th>\n",
              "      <td>2014-11-07</td>\n",
              "      <td>100.766892</td>\n",
              "      <td>-1768.361453</td>\n",
              "      <td>2270.882938</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>6</th>\n",
              "      <td>2014-11-08</td>\n",
              "      <td>101.996420</td>\n",
              "      <td>-1719.764635</td>\n",
              "      <td>2041.536832</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>7</th>\n",
              "      <td>2014-11-09</td>\n",
              "      <td>102.773991</td>\n",
              "      <td>-2051.092356</td>\n",
              "      <td>2161.300161</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>8</th>\n",
              "      <td>2014-11-10</td>\n",
              "      <td>103.027023</td>\n",
              "      <td>-1976.533111</td>\n",
              "      <td>2137.969236</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>9</th>\n",
              "      <td>2014-11-11</td>\n",
              "      <td>103.665574</td>\n",
              "      <td>-1860.269906</td>\n",
              "      <td>2233.857738</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "        time        fcst   fcst_lower   fcst_upper\n",
              "0 2014-11-02  100.477979 -1767.510883  2247.254977\n",
              "1 2014-11-03  100.366334 -1872.590654  2010.129827\n",
              "2 2014-11-04  100.533908 -2013.697578  2204.157991\n",
              "3 2014-11-05  100.078650 -1946.285033  2158.315272\n",
              "4 2014-11-06  101.371062 -2064.808847  2150.947470\n",
              "5 2014-11-07  100.766892 -1768.361453  2270.882938\n",
              "6 2014-11-08  101.996420 -1719.764635  2041.536832\n",
              "7 2014-11-09  102.773991 -2051.092356  2161.300161\n",
              "8 2014-11-10  103.027023 -1976.533111  2137.969236\n",
              "9 2014-11-11  103.665574 -1860.269906  2233.857738"
            ]
          },
          "metadata": {},
          "execution_count": 60
        }
      ]
    },
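    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "We can also visualise the forecast. The sketch below assumes the fitted Kats model exposes a `plot()` convenience method (it does in the Kats versions used here); if yours doesn't, plotting `forecast['time']` against `forecast['fcst']` with matplotlib gives the same picture."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {},
      "source": [
        "# Visualise the historical fit and the one-step forecast produced above\n",
        "model.plot()"
      ],
      "execution_count": null,
      "outputs": []
    },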
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "Vjry8BspjFrj"
      },
      "source": [
        "The above shows a simple use of the Kats library: a Prophet model with multiplicative seasonality.\n",
        "\n",
        "Feel free to play with the other parameters as well.\n",
        "\n",
        "Next we'll create an ensemble of models using Kats. We can achieve this with the `KatsEnsemble` class, whose API is similar in spirit to scikit-learn's."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "4tP71FtbZZS3"
      },
      "source": [
        "# Import the ensemble utilities and the individual model modules\n",
        "from kats.models.ensemble.ensemble import EnsembleParams, BaseModelParams\n",
        "from kats.models.ensemble.kats_ensemble import KatsEnsemble\n",
        "from kats.models import (\n",
        "    arima,\n",
        "    holtwinters,\n",
        "    linear_model,\n",
        "    prophet,\n",
        "    quadratic_model,\n",
        "    sarima,\n",
        "    theta\n",
        ")"
      ],
      "execution_count": 61,
      "outputs": []
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "XL_oYp-7ZxUo"
      },
      "source": [
        "In the next step we define the parameters for each individual forecasting model using the `EnsembleParams` class."
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "GZLW0KdWZ4ji"
      },
      "source": [
        "# Define the parameters for each model in the ensemble\n",
        "model_params = EnsembleParams(\n",
        "    [\n",
        "        BaseModelParams('arima', arima.ARIMAParams(p=1, d=1, q=1)),\n",
        "        BaseModelParams('sarima',\n",
        "                        sarima.SARIMAParams(\n",
        "                            p=2, d=2, q=1, trend='ct',\n",
        "                            seasonal_order=(1, 0, 1, 12),\n",
        "                            enforce_invertibility=False,\n",
        "                            enforce_stationarity=False),\n",
        "        ),\n",
        "        BaseModelParams('prophet', prophet.ProphetParams()),\n",
        "        BaseModelParams('linear', linear_model.LinearModelParams()),\n",
        "        BaseModelParams('quadratic', quadratic_model.QuadraticModelParams()),\n",
        "        BaseModelParams('theta', theta.ThetaParams()),\n",
        "    ]\n",
        ")"
      ],
      "execution_count": 62,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "SyE-sW09bWqJ"
      },
      "source": [
        "# Creating the KatsEnsemble configuration dictionary\n",
        "KatEnsembleParams = {\n",
        "    'models': model_params,\n",
        "    'aggregation': 'median',\n",
        "    'seasonality_length': 7,\n",
        "    'decomposition_method': 'multiplicative'\n",
        "}"
      ],
      "execution_count": 63,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "IlX7gB4DcbY1"
      },
      "source": [
        "# Creating a Kats TimeSeriesData object from the Bitcoin prices\n",
        "bitcoin_ts = TimeSeriesData(value=bitcoin_prices.Price,\n",
        "                            time=bitcoin_prices.index,\n",
        "                            sort_by_time=True)"
      ],
      "execution_count": 64,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "RQdLi-tGcL9t",
        "outputId": "1c24c8b5-ac01-41b2-c0f3-363d19043ddf"
      },
      "source": [
        "# Instantiating the KatsEnsemble model\n",
        "ensemble_model = KatsEnsemble(\n",
        "    data=bitcoin_ts,\n",
        "    params=KatEnsembleParams\n",
        ")\n",
        "\n",
        "# Fitting the model\n",
        "ensemble_model.fit()"
      ],
      "execution_count": 65,
      "outputs": [
        {
          "output_type": "stream",
          "name": "stderr",
          "text": [
            "/usr/local/lib/python3.7/dist-packages/statsmodels/tsa/stattools.py:671: FutureWarning:\n",
            "\n",
            "fft=True will become the default after the release of the 0.12 release of statsmodels. To suppress this warning, explicitly set fft=False.\n",
            "\n",
            "/usr/local/lib/python3.7/dist-packages/statsmodels/tsa/arima_model.py:472: FutureWarning:\n",
            "\n",
            "\n",
            "statsmodels.tsa.arima_model.ARMA and statsmodels.tsa.arima_model.ARIMA have\n",
            "been deprecated in favor of statsmodels.tsa.arima.model.ARIMA (note the .\n",
            "between arima and model) and\n",
            "statsmodels.tsa.SARIMAX. These will be removed after the 0.12 release.\n",
            "\n",
            "statsmodels.tsa.arima.model.ARIMA makes use of the statespace framework and\n",
            "is both well tested and maintained.\n",
            "\n",
            "To silence this warning and continue using ARMA and ARIMA until they are\n",
            "removed, use:\n",
            "\n",
            "import warnings\n",
            "warnings.filterwarnings('ignore', 'statsmodels.tsa.arima_model.ARMA',\n",
            "                        FutureWarning)\n",
            "warnings.filterwarnings('ignore', 'statsmodels.tsa.arima_model.ARIMA',\n",
            "                        FutureWarning)\n",
            "\n",
            "\n",
            "/usr/local/lib/python3.7/dist-packages/statsmodels/tsa/base/tsa_model.py:527: ValueWarning:\n",
            "\n",
            "No frequency information was provided, so inferred frequency D will be used.\n",
            "\n",
            "/usr/local/lib/python3.7/dist-packages/statsmodels/tsa/base/tsa_model.py:527: ValueWarning:\n",
            "\n",
            "No frequency information was provided, so inferred frequency D will be used.\n",
            "\n",
            "/usr/local/lib/python3.7/dist-packages/statsmodels/tsa/arima_model.py:472: FutureWarning:\n",
            "\n",
            "\n",
            "statsmodels.tsa.arima_model.ARMA and statsmodels.tsa.arima_model.ARIMA have\n",
            "been deprecated in favor of statsmodels.tsa.arima.model.ARIMA (note the .\n",
            "between arima and model) and\n",
            "statsmodels.tsa.SARIMAX. These will be removed after the 0.12 release.\n",
            "\n",
            "statsmodels.tsa.arima.model.ARIMA makes use of the statespace framework and\n",
            "is both well tested and maintained.\n",
            "\n",
            "To silence this warning and continue using ARMA and ARIMA until they are\n",
            "removed, use:\n",
            "\n",
            "import warnings\n",
            "warnings.filterwarnings('ignore', 'statsmodels.tsa.arima_model.ARMA',\n",
            "                        FutureWarning)\n",
            "warnings.filterwarnings('ignore', 'statsmodels.tsa.arima_model.ARIMA',\n",
            "                        FutureWarning)\n",
            "\n",
            "\n",
            "/usr/local/lib/python3.7/dist-packages/statsmodels/tsa/arima_model.py:472: FutureWarning:\n",
            "\n",
            "\n",
            "statsmodels.tsa.arima_model.ARMA and statsmodels.tsa.arima_model.ARIMA have\n",
            "been deprecated in favor of statsmodels.tsa.arima.model.ARIMA (note the .\n",
            "between arima and model) and\n",
            "statsmodels.tsa.SARIMAX. These will be removed after the 0.12 release.\n",
            "\n",
            "statsmodels.tsa.arima.model.ARIMA makes use of the statespace framework and\n",
            "is both well tested and maintained.\n",
            "\n",
            "To silence this warning and continue using ARMA and ARIMA until they are\n",
            "removed, use:\n",
            "\n",
            "import warnings\n",
            "warnings.filterwarnings('ignore', 'statsmodels.tsa.arima_model.ARMA',\n",
            "                        FutureWarning)\n",
            "warnings.filterwarnings('ignore', 'statsmodels.tsa.arima_model.ARIMA',\n",
            "                        FutureWarning)\n",
            "\n",
            "\n",
            "INFO:fbprophet:Disabling daily seasonality. Run prophet with daily_seasonality=True to override this.\n",
            "/usr/local/lib/python3.7/dist-packages/statsmodels/tsa/holtwinters/model.py:429: FutureWarning:\n",
            "\n",
            "After 0.13 initialization must be handled at model creation\n",
            "\n",
            "/usr/local/lib/python3.7/dist-packages/statsmodels/tsa/holtwinters/model.py:922: ConvergenceWarning:\n",
            "\n",
            "Optimization failed to converge. Check mle_retvals.\n",
            "\n",
            "/usr/local/lib/python3.7/dist-packages/kats/models/theta.py:121: FutureWarning:\n",
            "\n",
            "`rcond` parameter will change to the default of machine precision times ``max(M, N)`` where M and N are the input matrix dimensions.\n",
            "To use the future default and silence this warning we advise to pass `rcond=None`, to keep using the old, explicitly pass `rcond=-1`.\n",
            "\n",
            "\n",
            "\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<kats.models.ensemble.kats_ensemble.KatsEnsemble at 0x7f5e96f72450>"
            ]
          },
          "metadata": {},
          "execution_count": 65
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "id": "a2-IfczBc-pX"
      },
      "source": [
        "# Make predictions for the next 30 days\n",
        "forecast = ensemble_model.predict(steps=30)"
      ],
      "execution_count": 66,
      "outputs": []
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 979
        },
        "id": "E0uPDVYKdACp",
        "outputId": "125e50e8-fcd5-46b6-e1ff-2a4d2703029d"
      },
      "source": [
        "# Aggregate the individual model forecasts into combined 30-day predictions\n",
        "ensemble_model.aggregate()"
      ],
      "execution_count": 67,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/html": [
              "<div>\n",
              "<style scoped>\n",
              "    .dataframe tbody tr th:only-of-type {\n",
              "        vertical-align: middle;\n",
              "    }\n",
              "\n",
              "    .dataframe tbody tr th {\n",
              "        vertical-align: top;\n",
              "    }\n",
              "\n",
              "    .dataframe thead th {\n",
              "        text-align: right;\n",
              "    }\n",
              "</style>\n",
              "<table border=\"1\" class=\"dataframe\">\n",
              "  <thead>\n",
              "    <tr style=\"text-align: right;\">\n",
              "      <th></th>\n",
              "      <th>time</th>\n",
              "      <th>fcst</th>\n",
              "      <th>fcst_lower</th>\n",
              "      <th>fcst_upper</th>\n",
              "    </tr>\n",
              "  </thead>\n",
              "  <tbody>\n",
              "    <tr>\n",
              "      <th>0</th>\n",
              "      <td>2021-05-19</td>\n",
              "      <td>42702.328615</td>\n",
              "      <td>39032.386907</td>\n",
              "      <td>44921.608513</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>1</th>\n",
              "      <td>2021-05-20</td>\n",
              "      <td>42093.070700</td>\n",
              "      <td>38310.788632</td>\n",
              "      <td>44888.988334</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>2</th>\n",
              "      <td>2021-05-21</td>\n",
              "      <td>41453.679349</td>\n",
              "      <td>37395.374307</td>\n",
              "      <td>44078.196763</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>3</th>\n",
              "      <td>2021-05-22</td>\n",
              "      <td>42389.483521</td>\n",
              "      <td>37872.878370</td>\n",
              "      <td>45665.275739</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>4</th>\n",
              "      <td>2021-05-23</td>\n",
              "      <td>42430.877678</td>\n",
              "      <td>37962.923862</td>\n",
              "      <td>45973.603624</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>5</th>\n",
              "      <td>2021-05-24</td>\n",
              "      <td>41860.876656</td>\n",
              "      <td>37602.692938</td>\n",
              "      <td>45932.014000</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>6</th>\n",
              "      <td>2021-05-25</td>\n",
              "      <td>40876.735471</td>\n",
              "      <td>36456.911133</td>\n",
              "      <td>45016.331023</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>7</th>\n",
              "      <td>2021-05-26</td>\n",
              "      <td>41493.450702</td>\n",
              "      <td>36620.127055</td>\n",
              "      <td>46303.986131</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>8</th>\n",
              "      <td>2021-05-27</td>\n",
              "      <td>40974.434773</td>\n",
              "      <td>36010.993246</td>\n",
              "      <td>46215.963302</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>9</th>\n",
              "      <td>2021-05-28</td>\n",
              "      <td>40517.685762</td>\n",
              "      <td>35541.445586</td>\n",
              "      <td>45268.930748</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>10</th>\n",
              "      <td>2021-05-29</td>\n",
              "      <td>41310.823799</td>\n",
              "      <td>36146.832595</td>\n",
              "      <td>46942.661605</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>11</th>\n",
              "      <td>2021-05-30</td>\n",
              "      <td>41449.517163</td>\n",
              "      <td>35960.903837</td>\n",
              "      <td>47199.677049</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>12</th>\n",
              "      <td>2021-05-31</td>\n",
              "      <td>41278.951031</td>\n",
              "      <td>35290.926485</td>\n",
              "      <td>47151.482375</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>13</th>\n",
              "      <td>2021-06-01</td>\n",
              "      <td>40747.868299</td>\n",
              "      <td>34595.620869</td>\n",
              "      <td>46177.676612</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>14</th>\n",
              "      <td>2021-06-02</td>\n",
              "      <td>41375.657183</td>\n",
              "      <td>35106.352317</td>\n",
              "      <td>47465.047440</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>15</th>\n",
              "      <td>2021-06-03</td>\n",
              "      <td>41235.496211</td>\n",
              "      <td>34882.764880</td>\n",
              "      <td>47085.579379</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>16</th>\n",
              "      <td>2021-06-04</td>\n",
              "      <td>40843.805996</td>\n",
              "      <td>34460.086705</td>\n",
              "      <td>46245.740211</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>17</th>\n",
              "      <td>2021-06-05</td>\n",
              "      <td>41696.714289</td>\n",
              "      <td>35425.403281</td>\n",
              "      <td>47838.084470</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>18</th>\n",
              "      <td>2021-06-06</td>\n",
              "      <td>41886.155807</td>\n",
              "      <td>35500.021592</td>\n",
              "      <td>47887.671868</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>19</th>\n",
              "      <td>2021-06-07</td>\n",
              "      <td>41758.259614</td>\n",
              "      <td>35186.121799</td>\n",
              "      <td>47788.589750</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>20</th>\n",
              "      <td>2021-06-08</td>\n",
              "      <td>41253.592588</td>\n",
              "      <td>35252.447081</td>\n",
              "      <td>46866.733830</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>21</th>\n",
              "      <td>2021-06-09</td>\n",
              "      <td>41928.762161</td>\n",
              "      <td>35585.506553</td>\n",
              "      <td>48178.393089</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>22</th>\n",
              "      <td>2021-06-10</td>\n",
              "      <td>41813.850847</td>\n",
              "      <td>36187.033878</td>\n",
              "      <td>48211.347985</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>23</th>\n",
              "      <td>2021-06-11</td>\n",
              "      <td>41227.411220</td>\n",
              "      <td>35091.185874</td>\n",
              "      <td>47292.244745</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>24</th>\n",
              "      <td>2021-06-12</td>\n",
              "      <td>42317.681061</td>\n",
              "      <td>35364.917980</td>\n",
              "      <td>48537.083517</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>25</th>\n",
              "      <td>2021-06-13</td>\n",
              "      <td>42518.913745</td>\n",
              "      <td>36171.177209</td>\n",
              "      <td>48679.131392</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>26</th>\n",
              "      <td>2021-06-14</td>\n",
              "      <td>42392.329326</td>\n",
              "      <td>36271.636806</td>\n",
              "      <td>48717.445740</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>27</th>\n",
              "      <td>2021-06-15</td>\n",
              "      <td>41612.358831</td>\n",
              "      <td>35376.209350</td>\n",
              "      <td>47706.604601</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>28</th>\n",
              "      <td>2021-06-16</td>\n",
              "      <td>42556.673592</td>\n",
              "      <td>36088.857685</td>\n",
              "      <td>49013.276425</td>\n",
              "    </tr>\n",
              "    <tr>\n",
              "      <th>29</th>\n",
              "      <td>2021-06-17</td>\n",
              "      <td>42428.993650</td>\n",
              "      <td>36001.895304</td>\n",
              "      <td>48955.540297</td>\n",
              "    </tr>\n",
              "  </tbody>\n",
              "</table>\n",
              "</div>"
            ],
            "text/plain": [
              "         time          fcst    fcst_lower    fcst_upper\n",
              "0  2021-05-19  42702.328615  39032.386907  44921.608513\n",
              "1  2021-05-20  42093.070700  38310.788632  44888.988334\n",
              "2  2021-05-21  41453.679349  37395.374307  44078.196763\n",
              "3  2021-05-22  42389.483521  37872.878370  45665.275739\n",
              "4  2021-05-23  42430.877678  37962.923862  45973.603624\n",
              "5  2021-05-24  41860.876656  37602.692938  45932.014000\n",
              "6  2021-05-25  40876.735471  36456.911133  45016.331023\n",
              "7  2021-05-26  41493.450702  36620.127055  46303.986131\n",
              "8  2021-05-27  40974.434773  36010.993246  46215.963302\n",
              "9  2021-05-28  40517.685762  35541.445586  45268.930748\n",
              "10 2021-05-29  41310.823799  36146.832595  46942.661605\n",
              "11 2021-05-30  41449.517163  35960.903837  47199.677049\n",
              "12 2021-05-31  41278.951031  35290.926485  47151.482375\n",
              "13 2021-06-01  40747.868299  34595.620869  46177.676612\n",
              "14 2021-06-02  41375.657183  35106.352317  47465.047440\n",
              "15 2021-06-03  41235.496211  34882.764880  47085.579379\n",
              "16 2021-06-04  40843.805996  34460.086705  46245.740211\n",
              "17 2021-06-05  41696.714289  35425.403281  47838.084470\n",
              "18 2021-06-06  41886.155807  35500.021592  47887.671868\n",
              "19 2021-06-07  41758.259614  35186.121799  47788.589750\n",
              "20 2021-06-08  41253.592588  35252.447081  46866.733830\n",
              "21 2021-06-09  41928.762161  35585.506553  48178.393089\n",
              "22 2021-06-10  41813.850847  36187.033878  48211.347985\n",
              "23 2021-06-11  41227.411220  35091.185874  47292.244745\n",
              "24 2021-06-12  42317.681061  35364.917980  48537.083517\n",
              "25 2021-06-13  42518.913745  36171.177209  48679.131392\n",
              "26 2021-06-14  42392.329326  36271.636806  48717.445740\n",
              "27 2021-06-15  41612.358831  35376.209350  47706.604601\n",
              "28 2021-06-16  42556.673592  36088.857685  49013.276425\n",
              "29 2021-06-17  42428.993650  36001.895304  48955.540297"
            ]
          },
          "metadata": {},
          "execution_count": 67
        }
      ]
    },
    {
      "cell_type": "code",
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 441
        },
        "id": "7cmueWTgdzE_",
        "outputId": "6d65ec26-3ba0-48d2-b279-403c8f3a61d7"
      },
      "source": [
        "# Plot the historical data along with the ensemble forecast\n",
        "ensemble_model.plot()"
      ],
      "execution_count": 68,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAsgAAAGoCAYAAABbtxOxAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOzdeXxU9b3/8fdkJhtbSEAgJMg2EYGCqEHAFotyU5QquFBFLcSC4sVesWqtWpcW64IW61bUG0QNWkXrQpQl4FZcrhAB5ar8oFEWyQIECIQlyWRmzu+P3DnOJJNlkpnMJPN6Ph59eOY755z5zhes73zzOd+vxTAMQwAAAAAkSTHh7gAAAAAQSQjIAAAAgBcCMgAAAOCFgAwAAAB4ISADAAAAXmzh7kBb69mzpwYMGBDubrRYTU2NYmNjw92NdoPxCgzjFTjGLDCMV2AYr8AwXoFhvKRdu3bpwIED9dqjLiAPGDBAGzduDHc3WqykpER9+/YNdzfaDcYrMIxX4BizwDBegWG8AsN4BYbxkjIzM/22U2IBAAAAeCEgAwAAAF4IyAAAAIAXAjIAAADghYAMAAAAeCEgAwAAAF4IyAAAAIAXAjIAAADghYAMAAAAeCEgAwAAAF4IyAAAAIAXAjIAAADghYAMAAAAeCEgAwAAAF4IyAAAAAgKwzDkcrnC3Y1WIyADAAAgKC699FLZbDadOHEi3F1pFQIyAAAAgmL58uWSpOPHj4e5J61DQAYAAEBQORyOcHehVQjIAAAACKrq6upwd6FVCMgAAAAIKgIyAAAA4IWADAAAAHipqqoKdxdahYAMAACAoGIGGQAAAPBCQAYAAAC8EJABAAAALwRkAAAAwAsBGQAAAPBCQAYAAEDUMwzDPCYgAwAAIGps2bJFffv21f79+33aXS6XeUxABgAAQNR47LHHVFpaqhUrVvi0E5Cb6fDhw5o2bZpOPfVUDR06VJ9//rkOHTqkrKwsZWRkKCsrS+Xl5ZJqp+XnzZsnu92ukSNHavPmzeZ9cnNzlZGRoYyMDOXm5prtmzZt0ogRI2S32zVv3jyfqX0AAAAEX1JSkiTpyJEjPu3eAdnpdLZpn4ItpAH5pptu0vnnn69t27Zpy5YtGjp0qBYsWKCJEyeqsLBQEydO1IIFCyRJq1evVmFhoQoLC5WTk6O5c+dKkg4dOqT58+drw4YNKigo0Pz5881QPXfuXC1evNi8Lj8/P5RfBwAAIOp1795dUuMB2e12t2mfgi1kAfnIkSP6+OOPNXv2bElSXFycunfvrry8PGVnZ0uSsrOztXz5cklSXl6eZs6cKYvForFjx+rw4cMqLS3VmjVrlJWVpZSUFCUnJysrK0v5+fkqLS1VRUWFxo4dK4vFopkzZ5r3AgAAQGjExsZKqj9L7B2QvY/bI1uobrxz506ddNJJ+s1vfqMtW7bozDPP1BNPPKF9+/YpNTVVktSnTx/t27dPklRcXKx+/fqZ16enp6u4uLjR9vT09Hrt/uTk5CgnJ0eStHfvXpWUlAT9+7aVsrKycHehXWG8AsN4BY4xCwzjFRjGKzCMV2BaOl4VFRWSpKNHj/pkqu3bt5vHhw8fbtd5K2QB2el0avPmzXrqqac0ZswY3XTTTWY5hYfFYpHFYglVF0xz5szRnDlzJEmZmZnq27dvyD8zlNp7/9sa4xUYxitwjFlgGK/AMF6BYbwC05Lx6tq1qySpW7duPtcXFBSYx507d27XfxYhK7FIT09Xenq6xowZI0maNm2aNm/erN69e6u0tFSSVFpaql69ekmS0tLStGfPHvP6oqIipaWlNdpeVFRUrx0AAABt7/jx4+Zxey+xCFlA7tOnj/r162dOt3/wwQcaNmyYpkyZYq5EkZubq6lTp0qSpkyZoqVLl8owDK1fv15JSUlKTU3VpEmTtHbtWpWXl6u8vFxr167VpEmTlJqaqm7dum
n9+vUyDENLly417wUAAIDQ8KwaVrcK4NixY+Zxe39IL2QlFpL01FNP6eqrr5bD4dCgQYP0wgsvyO126/LLL9eSJUvUv39/vf7665KkyZMna9WqVbLb7erUqZNeeOEFSVJKSoruuecejR49WpJ07733KiUlRZL09NNP65prrlFlZaUuuOACXXDBBaH8OgAAAKjD6XTqm2++8QnI7X0GOaQBedSoUdq4cWO99g8++KBem8Vi0aJFi/zeZ9asWZo1a1a99szMTH3zzTet7ygAAABa5K9//av++Mc/mq+7dOniE5CPHTumqqoq9ezZMxzdaxF20gMAAEDAjh49KofD4ROOExMTFRsb61NiMWLECJ100knh6GKLhXQGGQAAAB2Lpwb58ccf17Zt29SlSxezvCI2NlYxMTE+M8i7du0KRzdbhRlkAAAAtEh+fr4SEhLM14ZhyGq1tvuH9AjIAAAAaDbPDLJHTEyMz3t1Z5A9ampqQt63YCEgAwAAoMW8l3vr1q2brFar34DsvU5ypCMgAwAAoNm8Z5BPOeUUn4Dcu3dvnxKLgwcPmu85nc6262QrEZABAADQbPfdd5/Pa+/ZYpvNZpZY7N6922dpNwIyAAAAOjyXy+VTW2yxWMwSi7p7VRCQAQAA0CE4nU4dPnzY73tut1sOh0PDhw8322JiYuR2u+tdQ0AGAABAhzB79mwlJyebtcddunQx33O5XHI4HOrVq5ck3xlk79UtJAIyAAAAOoilS5dKkvngnfdDeU6nU06nU7GxseZ7nof06gZklnkDAABAh+J5GM/hcJhtVVVVkmofzpNqA3JMTIzWrVtXb7MQ7+siHQEZAAAATfKUSHiXSlRWVkqSuZtefHy8vv76ax04cEA//PCDz/U33XRTG/W09Wzh7gAAAAAin8vlkmEYPsu6eQLy+PHjZbfbdeONN6pfv36S6s8Yr1u3ru0620oEZAAAADTJ5XL5hOOePXvqwIEDkqT9+/fr4Ycf9jm/PdUc10WJBQAAAJrkdDrNgPzggw/q17/+tfneoUOHzOOcnBxJUnV1ddt2MIgIyAAAAGiSy+Uy649tNpusVqvf85KSkiS1r4fy6iIgAwAAoEmeJd2kxgOyZ0WLujPI5513Xmg7GEQEZAAAADSpsRlkzyYinvek+jPIZ5xxRhv0MjgIyAAAAGhSYwH5yiuvNI89AXnXrl31rm8vCMgAAABoUkMlFlarVRMmTDDP8+yq98EHH/hcT0AGAABAh1J3Btmz5bT31tOe9xq6vr0gIAMAAKBJdQPy4sWLJfnurOd5ry6LxUJABgAAQMdSt8SiuLjY73n+AnJsbKzcbndI+xdMBGQAAAA0yeVymbvj2Ww2xcXF+T3PX0COi4tjBhkAAAAdi9PpNANybGys3n77bb/neR7S80ZABgAAQIfgXRZhGIYZkOPi4jRgwAC/1zRUYtGeArL/xwwBAAAQ9bwDstvtNkNubGys35liqWMEZGaQAQAA4Jd3qPWeQQ40IFNiAQAAgA6h7gyyd4lFQ+sd1w3OpaWlio+PJyADAACg/fMOtW63Ww6HQ1JgM8h9+vSR1WolIAMAAKD9a+ghveYG5N///veSarejZh1kAAAAtHt1Z5ADDcjjxo2TJMXExDCDDAAAgPav7gyyp8QiLi5OCQkJfq/xDsgxMbVRkxILAAAAdAgtmUH2bicgAwAAoENpaBWLhsKxxAwyAAAAOrDXX3/dPK67k56HJwR7dISAzE56AAAA8GvlypXmcd1l3iRp586dSkxM9LnGOzATkAEAANChpKSkmMf+SiwGDBjQ6PXeAdkTrtsDSiwAAADg18SJE83juusgN0d7nUEmIAMAAMAv73IJfyUWzb0+JiaGjUIAAADQ/nnP+npmkG02mywWS7OuZwYZAAAAHYq/dZCbO3ssEZABAADQwXiXRWzfvl0LFy70WcatKZ6ZZgIyAAAAOgTvUHvnnXdKko4ePdrs65lB9mPAgAEaMWKERo0apc
zMTEnSoUOHlJWVpYyMDGVlZam8vFxSbV3LvHnzZLfbNXLkSG3evNm8T25urjIyMpSRkaHc3FyzfdOmTRoxYoTsdrvmzZsnwzBC+XUAAACiSmtDLQG5AR999JG++uorbdy4UZK0YMECTZw4UYWFhZo4caIWLFggSVq9erUKCwtVWFionJwczZ07V1JtoJ4/f742bNiggoICzZ8/3wzVc+fO1eLFi83r8vPzQ/11AAAAokZLQ+2IESMkEZCbLS8vT9nZ2ZKk7OxsLV++3GyfOXOmLBaLxo4dq8OHD6u0tFRr1qxRVlaWUlJSlJycrKysLOXn56u0tFQVFRUaO3asLBaLZs6cad4LAAAArdfSUGu1WiX9GJBPnDih77//Pmj9CrWQ7qRnsVj0i1/8QhaLRddff73mzJmjffv2KTU1VZLUp08f7du3T5JUXFysfv36mdemp6eruLi40fb09PR67f7k5OQoJydHkrR3716VlJQE/bu2lbKysnB3oV1hvALDeAWOMQsM4xUYxiswjFdgmjNent/a19VUlvIE6wMHDqikpMScxNywYYNProtUIQ3In376qdLS0rR//35lZWXp1FNP9XnfYrE0ex291pgzZ47mzJkjScrMzFTfvn1D/pmh1N7739YYr8AwXoFjzALDeAWG8QoM4xWYpsarc+fOLbouISFBktSjRw+fc3v37t0u/oxCWmKRlpYmSerVq5cuueQSFRQUqHfv3iotLZUklZaWqlevXua5e/bsMa8tKipSWlpao+1FRUX12gEAABAc/kosmrPMm6e0wrNM3J///OcG7xeJQhaQjx8/bi4Dcvz4ca1du1Y/+clPNGXKFHMlitzcXE2dOlWSNGXKFC1dulSGYWj9+vVKSkpSamqqJk2apLVr16q8vFzl5eVau3atJk2apNTUVHXr1k3r16+XYRhaunSpeS8AAAC0nr9A6739dEM8Ncie60855RRJUk1NTRB7FzohK7HYt2+fLrnkEkmS0+nUVVddpfPPP1+jR4/W5ZdfriVLlqh///56/fXXJUmTJ0/WqlWrZLfb1alTJ73wwguSpJSUFN1zzz0aPXq0JOnee+9VSkqKJOnpp5/WNddco8rKSl1wwQW64IILQvV1AAAAos6jjz5ar83hcDR5Xd0ZZM+ss9PpDGLvQidkAXnQoEHasmVLvfYePXrogw8+qNdusVi0aNEiv/eaNWuWZs2aVa89MzNT33zzTes7CwAAgHpOnDjRous8Adkzg+zZnrqhgOx2G4qJCf1zac3FTnoAAAAIKk+JRXNnkLftOqxqR+TUJxOQAQAAUE9DOxSfd955TV5bdwbZE5D91SBXOZxyutxyuyNnR2QCMgAAAOpp6IE6730oGnLFFVdIkjIyMiQ1XmLx3Q8VcrrcLe1mSBCQAQAAUE9D5RBdu3Zt8tprr71WlZWVGjBggKTGSyxqnC65XJEzeywRkAEAAOBHQzPI999/f5PXWiwWc7MQqeESi+OVNXIbkruBco5wISADAACgnm3btkmSrr/+el188cVme/fu3QO+V0MzyBXHa1TtcEVU/bEU4q2mAQAA0D5dfvnlkqSqqiolJSVJanjr6ab4q0F21Lh06EiVXG5DEbTCmyRmkAEAAODHgQMHJNWGYs8M8MyZM1t0L38zyA6nWyeqnHK5DEXYBDIBGQAAAPVVVlZKqt05z7OuseefgfJXg1yy/7hcbkNuw6242MiKpJHVGwAAAEQETxh2OBxmwPWsbxyouiUWLpdbLrdh/tNiiawaCwIyAAAA6vHUHaekpLQ6INctsSjaf1xV1U65DSk+tmWz0qFEQAYAAEA9Z599tiTpgQceMANua0ssPAH56AmHHE63XC5DMRE2eywRkAEAAFCHYRh69913JUmdOnXS1q1bJUnffvtti+5XtwbZ7ZacTrfchqGUpPgg9Di4CMgAAADwUXdDj/z8fJ9/BqpuDbJhGHK5DRmGIZs18uJo5PUIAAAAYdXQLnotfZ
jOu8TC5XLLkOR0GYqwDfRMBGQAAAD4aCgge2aCA+VdYlF+1KGaGrdcbrc6J0bmnnWR2SsAAACETUMB2RN0A+U9g3z4aLWcTrd6dU9UYkJkRlFmkAEAAOCjbkAeMWKEJCkxMbFF9/PMPDtqanSsskZOtyGbLXJjaOT2DAAAAGFRNyAvW7ZMUu220y3hWT/Z4ahRjdOQy2UoAld3MxGQAQAA4KNuQO7WrZukH1ehCJTFYpHNZpPDUSO3u3Z5twjOxwRkAAAA+KobkD01xC6Xq8X3jI2NlcvplNtlyOV2t6p/oUZABgAAgA9PQH7llVckST169FDnzp21cOHCFt/TZrOpxumU023IIousEbj+sUdkPjoIAACAsPEEZE/NcWxsrI4dO9aqe9psNpVXnJBhGOqV0rKH/dpK5EZ3AAAAhIWn1rily7r546lBNgwpIc4atPuGAgEZAAAAPtz/VyPsWX0iGGJjY+V0OiN29zxvBGQAAAD4MP4vxQYzINtsNrmcThmK/IRMQAYAAIAPzwyyJYiLFdtsNjmdTrndBGQAAAC0M6GYQY6NjZXT5Yz4+mOJgAwAAIA6QjWDbLhdSuoSF7R7hgoBGQAAAD5CV4Pc8o1G2hIBGQAAAD5CsYpFbQ1yTdMnRgACMgAAAHyEosTCU4PcHrCTHgAAAHyEosRi2rRpOlRBQAYAAEA7FIoZ5J9N+rX+Z8u+oN0vlCixAAAAgI9QzCBv3nZAa9fvMe8tSVXVTj217Bt98e3+oH1OMBCQAQAA4CMUM8hdEmPlqHFr6cpCs23lZz9o685yvfXhTrlc7qB9VmsRkAEAAOAjFDPIXTrFSqqdSfY4UF5lHjtdkbPDHgEZAAAAPkIxg9w5sf6jb6UHTpjH8RG0wx4P6QEAAMBHSDYKsfre66aFn8lREzllFd6YQQYAAICPkGwUYv1xNtrlckdsOJYIyAAAAKgjFCUWVq+wfdPC//F5b/LPTg7a5wQDARkAAAA+QlFikTnsJPPY5fZ9IG/86alB+5xgICADAADARyhmkOPjrPrpaX3qtfdP7RK0zwgWHtIDAACAj1DMIEv1Z44n//RkXXROfx0+5gjq57QWM8gAAADwEYoZZEkaP8p3Bjnj5KSg3j9YCMgAAABR6tNPP1VeXl699lDNIJ+c2tU8vvGKn+jUAd2Dev9gCXlAdrlcOv3003XhhRdKknbu3KkxY8bIbrfriiuukMNRO6VeXV2tK664Qna7XWPGjNGuXbvMezz00EOy2+0aMmSI1qxZY7bn5+dryJAhstvtWrBgQai/CgAAQIcyfvx4XXzxxfXaf/jhB0nBn0GWpP8Yk6axI3pp2KDkoN87WEIekJ944gkNHTrUfH377bfr5ptv1nfffafk5GQtWbJEkrRkyRIlJyfru+++080336zbb79dkrR161YtW7ZM3377rfLz83XDDTfI5XLJ5XLpt7/9rVavXq2tW7fq1Vdf1datW0P9dQAAADocz4Slxx133CEp+DPIknTR+P7KvnBI0O8bTCENyEVFRVq5cqWuvfZaSbXT9R9++KGmTZsmScrOztby5cslSXl5ecrOzpYkTZs2TR988IEMw1BeXp6mT5+u+Ph4DRw4UHa7XQUFBSooKJDdbtegQYMUFxen6dOn+/0VAQAAABpXVVXltz0UM8jtQUhXsfjd736nRx55REePHpUkHTx4UN27d5fNVvux6enpKi4uliQVFxerX79+tZ2y2ZSUlKSDBw+quLhYY8eONe/pfY3nfE/7hg0b/PYjJydHOTk5kqS9e/eqpKQkyN+07ZSVlYW7C+0K4xUYxitwjFlgGK/AMF6BYbwC4z1eP/zwg1JSUvyek5iYGLTPLD94VG63W7E23znao5VO7d1brfg4a9A+qzVCFpBXrFihXr166cwzz9S//vWvUH1Ms8yZM0dz5syRJGVmZqpv375h7U9rtff+tzXGKzCMV+AYs8AwXoFhvALDeLVMcnKy37FLTU0N6pgeri
qX2+1WXKxvEI455lCfPilKTIiMFYhD1ovPPvtM77zzjlatWqWqqipVVFTopptu0uHDh+V0OmWz2VRUVKS0tDRJUlpamvbs2aP09HQ5nU4dOXJEPXr0MNs9vK9pqB0AAADNV7cG2SNaSyxCVoP80EMPqaioSLt27dKyZct03nnn6R//+IfOPfdcvfHGG5Kk3NxcTZ06VZI0ZcoU5ebmSpLeeOMNnXfeebJYLJoyZYqWLVum6upq7dy5U4WFhTrrrLM0evRoFRYWaufOnXI4HFq2bJmmTJkSqq8DAADQYTUUkEPxkF570Obz2A8//LCmT5+uu+++W6effrpmz54tSZo9e7ZmzJghu92ulJQULVu2TJI0fPhwXX755Ro2bJhsNpsWLVokq7V2Wv7vf/+7Jk2aJJfLpVmzZmn48OFt/XUAAADaverqar/t0TqD3CYBecKECZowYYIkadCgQSooKKh3TkJCgv75z3/6vf6uu+7SXXfdVa998uTJmjx5clD7CgAAEG0amkF2uVxt3JPIEJ3z5gAAADA1FJCdTmdQP8caI7mNoN4yJAjIAAAAUc57B2NvQQ/I1hhzG+tIRkAGAACIcldffbUOHDhQr/2kk04K6ufYrDFyt4Mp5MhYbA4AAABtqu5M7qFDh9SzZ09J0qhRo5Samqrk5OSgfqbValE7mEBmBhkAACAaud3uBl+7XC4lJCQE/TNtVovc7SAhE5ABAACiUN0VKrxnlGtqasxldYPJIosOH3VEfB0yARkAACAK1Q3I3jPIR48eVbdu3YL+mU6XW50SbHJFeB0yARkAACAK1Q3INTU1kmo3DSkuLg5JiYVhSHGxVrlcBGQAAABEmLpLuHnWQn7ttdckSa+++mrQPzPWFqNYm4USCwAAAESeug/peQKyZyWLRx99NOif2btHojol2CJ+JQsCMgAAQBRqaAbZ0z5y5Migf6bFYlGsNSbiV7IgIAMAAEShujXIlZWVkn6sRY6NjQ3J58bEWEJy32AiIAMAAEShugH5wgsvVFlZWcgDssswdKIquFtYBxsBGQAAIArVDchS7W56noAcFxcXks+tcrhU46TEAgAAABGmbg2yVPvgnqcWOVQzyGkndVZC3I+bkByvrFG1o35YDydbuDsAAACAtld3FQupdg3kUJdYxMXGyOV263hljQxDOnLcoYR4a0Q9uMcMMgAAQBTyN4PscDhC/5CexaLEBJtOVDnlchuyWWOUEGtVrC1yYmnk9AQAAABtxl8NclsEZJs1RrG2GFkstRuG2GwW2Wwxiou1Nn1xGyEgAwAARCF/AbktSixiYixmQD5R5VSs1SprTGRF0sjqDQAAANpEUyUWoVrFQpJ6dk+U0+mW1Vq79bQ1whJphHUHAAAAbcHfQ3reAdlqDV3JQ0K8VYYMdUm0yWaN0YC+3UL2WS3BKhYAAABRyN8M8sGDB3XfffdJqt0WOlSsFots1hjFx1nVOTFW8XGRU38sMYMMAAAQlfzVID/55JNt8tmWmNpZ5JgYi/qndm2TzwwEARkAACAK+QvInTt3brPPj4+1KoKWPvZBiQUAAEAU8heQQ1l37C0hzqZYW0xErX3sjYAMAAAQhfzVILcle7+ksH5+YyIztgMAACCkPKtYrFy5Ups2bZIkGW1Y8xATY1FMTOgeBGwNZpABAACikGcGuV+/furWrXaZNX9lF9GIGWQAAIAo5AnDNptNMf+3k111dXU4uxQxCMgAAABRyBOQrVar+XDe5s2bw9mliEFABgAAiEKeEgvvGWTUYjQAAACi0KeffiqJgOwPowEAABCF3nzzTUm1JRZ1A/L3338fji5FDAIyAABAFPM3gzxw4MAw9SYyEJABAACimPdDeh4WS2SuT9xWCMgAAABRjBrk+hgNAACAKOavBjnasZMeAABAFLPZiIN1NfnjwlNPPaXy8vK26AsAAADamL8a5GjXZEDet2+fRo8ercsvv1z5+fkyDKMt+gUAAIA2QA1yfU2Oxv
3336/CwkLNnj1bL774ojIyMvTHP/4x6tfHAwAA6AhiYmIIyHU0azQsFov69OmjPn36yGazqby8XNOmTdMf/vCHUPcPAAAAITJu3DhJLOtWV5NV2U888YSWLl2qnj176tprr9Vf//pXxcbGyu12KyMjQ4888khb9BMAAABB4na7JUmTJk2SRECuq8mAfOjQIb311lvq37+/T3tMTIxWrFgRso4BAAAgNFwulyRWsGhIk6Myf/78Bt8bOnRoUDsDAACA0HM6nZLE6hUNCFlFdlVVlc466yyddtppGj58uP70pz9Jknbu3KkxY8bIbrfriiuukMPhkCRVV1friiuukN1u15gxY7Rr1y7zXg899JDsdruGDBmiNWvWmO35+fkaMmSI7Ha7FixYEKqvAgAA0KH4m0Hu2bNnuLoTcUIWkOPj4/Xhhx9qy5Yt+uqrr5Sfn6/169fr9ttv180336zvvvtOycnJWrJkiSRpyZIlSk5O1nfffaebb75Zt99+uyRp69atWrZsmb799lvl5+frhhtukMvlksvl0m9/+1utXr1aW7du1auvvqqtW7eG6usAAAB0GJ6A7D2DXFZWpj179mjHjh3h6lbECFlAtlgs6tKliySppqZGNTU1slgs+vDDDzVt2jRJUnZ2tpYvXy5JysvLU3Z2tiRp2rRp+uCDD2QYhvLy8jR9+nTFx8dr4MCBstvtKigoUEFBgex2uwYNGqS4uDhNnz5deXl5ofo6AAAAHYanxKJuDXJ6eroGDhwYji5FlJAueudyuTRq1Cj16tVLWVlZGjx4sLp3727+YaSnp6u4uFiSVFxcrH79+kmq/cNKSkrSwYMHfdq9r2moHQAAAI3zN4OMH4X00UWr1aqvvvpKhw8f1iWXXKJt27aF8uMalJOTo5ycHEnS3r17VVJSEpZ+BENZWVm4u9CuMF6BYbwCx5gFhvEKDOMVGMar+fbu3StJOnbsWLvORaHSJmt7dO/eXeeee64+//xzHT58WE6nUzabTUVFRUpLS5MkpaWlac+ePUpPT5fT6dSRI0fUo0cPs93D+5qG2uuaM2eO5syZI0nKzMxU3759Q/VV20R7739bY7wCw3gFjjELDOMVGMYrMIxX83hmkHv06MGY+RGyEouysjIdPn9V370AACAASURBVHxYklRZWan33ntPQ4cO1bnnnqs33nhDkpSbm6upU6dKkqZMmaLc3FxJ0htvvKHzzjtPFotFU6ZM0bJly1RdXa2dO3eqsLBQZ511lkaPHq3CwkLt3LlTDodDy5Yt05QpU0L1dQAAADqMhmqQUStko1JaWqrs7Gy5XC653W5dfvnluvDCCzVs2DBNnz5dd999t04//XTNnj1bkjR79mzNmDFDdrtdKSkpWrZsmSRp+PDhuvzyyzVs2DDZbDYtWrTIrJf5+9//rkmTJsnlcmnWrFkaPnx4qL4OAABAh+GZxPQsqABfIQvII0eO1JdfflmvfdCgQSooKKjXnpCQoH/+859+73XXXXfprrvuqtc+efJkTZ48ufWdBQAAiCLff/+9pNpchvpCuooFAAAAIo8nIA8ePDjMPYlMBGQAAIAos3fvXnXu3FndunULd1ciEgEZAAAgyjgcDsXHx4e7GxGLgAwAABBlampqFBsbG+5uRCwCMgAAQJRxOBwE5EYQkAEAAKJMTU0NayA3goAMAAAQZRwOh+Li4sLdjYhFQAYAAIgyDoeDGeRGEJABAACiDA/pNY6ADAAAEGV4SK9xBGQAAIAowwxy4wjIAAAAUaasrExJSUnh7kbEIiADAABEEcMwtGvXLp188snh7krEIiADAABEkW3btqmyspKA3AgCMgAAQBT57LPPJEmDBw8Oc08iFwEZAAAgilRXV0uShg0bFuaeRC4CMgAAQBSpqamRJDYKaQQBGQAAIIp4AjJbTTeMgAwAABBFHA6HJGaQG0NABgAAiCKeGWQ2CmkYARkAACCKOBwO2Ww2WSyWcHclYhGQAQAAogjbTD
eNgAwAABBFampqeECvCQRkAACAKOJwOJhBbgIBGQAAIIpUV1crPj4+3N2IaARkAACAKHLixAl17tw53N2IaARkAACAKHL8+HF16tQp3N2IaARkAACAKHHllVfq3XffJSA3gYAMAAAQJZYtWyZJlFg0gYAMAAAQZZhBbhwBGQAAIMrk5eWFuwsRjYAMAAAQBVwul3k8adKkMPYk8hGQAQAAosCBAwckSYMHD9Y777wT5t5ENgIyAABAFPj6668lSTk5OWw13QQCMgAAQBSorq6WJHXt2jXMPYl8BGQAAIAoUFNTI0nMHjcDARkAACAKOBwOSVJsbGyYexL5CMgAAABRwBOQmUFuGgEZAAAgChCQm4+ADAAAEAU8NciUWDSNgAwAABAFmEFuPgIyAABAFCAgNx8BGQAAIApQYtF8BGQAAIAowDJvzUdABgAAiAIOh0MxMTGyWq3h7krEIyADAABEgZqaGuqPmylkAXnPnj0699xzNWzYMA0fPlxPPPGEJOnQoUPKyspSRkaGsrKyVF5eLkkyDEPz5s2T3W7XyJEjtXnzZvNeubm5ysjIUEZGhnJzc832TZs2acSIEbLb7Zo3b54MwwjV1wEAAGjXHA4HAbmZQhaQbTabHn30UW3dulXr16/XokWLtHXrVi1YsEATJ05UYWGhJk6cqAULFkiSVq9ercLCQhUWFionJ0dz586VVBuo58+frw0bNqigoEDz5883Q/XcuXO1ePFi87r8/PxQfR0AAIB2zeFwUH/cTCELyKmpqTrjjDMkSV27dtXQoUNVXFysvLw8ZWdnS5Kys7O1fPlySVJeXp5mzpwpi8WisWPH6vDhwyotLdWaNWuUlZWllJQUJScnKysrS/n5+SotLVVFRYXGjh0ri8WimTNnmvcCAACAL2aQm8/WFh+ya9cuffnllxozZoz27dun1NRUSVKfPn20b98+SVJxcbH69etnXpOenq7i4uJG29PT0+u1+5OTk6OcnBxJ0t69e1VSUhL079hWysrKwt2FdoXxCgzjFTjGLDCMV2AYr8AwXo07cuSIrFarmYMYr4aFPCAfO3ZMl112mR5//HF169bN5z2LxSKLxRLqLmjOnDmaM2eOJCkzM1N9+/YN+WeGUnvvf1tjvALDeAWOMQsM4xUYxiswjFfDbDabEhISfMaI8fIvpKtY1NTU6LLLLtPVV1+tSy+9VJLUu3dvlZaWSpJKS0vVq1cvSVJaWpr27NljXltUVKS0tLRG24uKiuq1AwAAoNa2bds0adIklZSUUGIRgJAFZMMwNHv2bA0dOlS33HKL2T5lyhRzJYrc3FxNnTrVbF+6dKkMw9D69euVlJSk1NRUTZo0SWvXrlV5ebnKy8u1du1aTZo0SampqerWrZvWr18vwzC0dOlS814AgMhQWVnJCkNAGA0dOlRr167VI488oqKiIp100knh7lK7ELKA/Nlnn+mll17Shx9+qFGjRmnUqFFatWqV7rjjDr333nvKyMjQ+++/rzvuuEOSNHnyZA0aNEh2u13XXXednn76aUlSSkqK7rnnHo0ePVqjR4/Wvffeq5SUFEnS008/rWuvvVZ2u12DBw/WBRdcEKqvAwAI0IkTJ9SpUyfddttt4e4KEJXefvtt8/iJJ57Qhg0blJmZGcYetR8WI8p+tM/MzNTGjRvD3Y0WKykpoV4oAIxXYBivwDFmDTtw4IA5W+X5Tw3jFRjGKzCM14/279+v3r1712t/9dVXNX36dEmMl9RwLmQnPQBASDidznB3AYha+/fv99s+dOjQNu5J+0RABgCEhMPhCHcXgKg1YsQI83j06NHmcVJSUji60+4QkAEAIUFABiLDueeeax537do1jD1pPwjIAICQqKmpCXcXgKjkcrl8XickJJjHBOTmISADAEKCGWQgPCoqKnxed+7cWY899pj69+/POsjNREAGAIQEARkIjx9++MHndf/+/fW73/1Ou3btCk+H2iECMgAgJCixAMKjoKDA57
Vn12I0HwEZABAS3jPIbrc7jD0Bost3333nU0pB3XHgCMgAgJAoLi42j5lNBtrOsWPH1K1bN/M1ATlwBGQAQEh41zuyaQjQdo4dO6bOnTubr72P0TwEZABASJw4ccI83rZtWxh7AkSXY8eOqUuXLnryySclST169Ahzj9ofAjIAICT+8Y9/mMfff/+9JOnJJ5/U73//+3B1CYgKnoB84403yjAMJSYmhrtL7Q4BGQAQErt37zaP4+PjJUkPP/ywHn300XB1CYgKx48fp6yilQjIAICgs1gsPq/r7uwFIDT+/e9/67PPPpPNZgt3V9o1AjIAIOR4SA9oG9OmTZP0Y1kTWoaADAAImccff1wSM8hAW6mqqpIkHTx4MMw9ad+YfwcAhMyAAQMkSVdddZW++eYbs90wjHplGABar2fPniosLNQdd9wR7q60a8wgAwBCxns3rwcffNA8ZuMQIPj+/e9/a+PGjZo6dapuv/32cHenXSMgAwCCyjAM89izekVdBw4caKvuAFHBMAwNGTJENTU1lDQFAQEZABBUbrfbPG4oIJeUlLRVd4Co4L0xD+setx4BGQAQVN4rVjQUkA8fPtxW3QGiQnV1tXm8ePHiMPakYyAgAwCCyjsge9cge/Oe7QLQep7VKyQpKSkpjD3pGAjIAICg8q5/bGgGmYAMBJd3QEbrEZABAEHVnBILAjIQXJ6A/Nhjj4W5Jx0DARkAEFTNKbEoKytrq+4AUcFTgzxw4MAw96RjICADAIKqOTPId9xxh1avXt1WXQI6PM8MckP/ziEwBGQAQFA1pwZZkj7++OO26A4QFZYvXy5JOumkk8Lck46BgAwACCrvGWSr1aohQ4b4Pa979+5t1SWgQ3v77bf1yCOPSJL69esX5t50DARkAEBQeW8jbbVaG5xFtlgsbdUloEO79NJLzWNmkIODgAwACCqHw2EeNxaQjxw50lZdAqLCsmXL+MEzSAjIAICg8t7RKyYmRk888YTf8wjIQOtt3brVPL7iiivC2JOOhYAMAAgq7xlki8WicePG6fPPP5cknX322frwww8lEZCB1iorK9Pw4cMlSTNnzgxzbzoWW7g7AADoWDwB+d133zXbzjrrLN1999267LLLNGrUKJ122mkEZKCVSkpKzOPbbrstjD3peJhBBgAElafEIjk52WyLiYnRX/7yF/Xq1UuSlJCQ4FOKgfZn//79Mgwj3N2Iap61jxctWqSf/OQnYe5Nx0JABgAElWcGubE1kGNjY7V27Vq99tprbdUtBMEnn3yi77//Xps2bVLv3r316quvhrtLHcbRo0dVXl4e0DWVlZWSpKFDh4aiS1GNgAwACCrPzHBD20xLks1WW+E3ffr0NukTguOcc86R3W4368jXr18f5h51HAMHDlRKSkpA13hmkBMSEkLRpahGQAYABNWJEyckNf4f7djY2LbqDkJgx44dkghmwXTw4MGAryEghw4BGQAQVPv375cks97YHwJy++O9hfizzz4ryXfXRKn2twcXXnih/vd//7dN+9beHTt2zDyurKzUyy+/rNmzZzd5nafEIjExMWR9i1YEZABAUO3bt09xcXFKSkpq8BwCcvvjHeI8HnvsMZ/Xl19+uVauXKnTTjutrbrVbrndbl177bWyWCzq2rWr2d6pUyfNmDFDzz//fKMPQRYXF2v37t2S2D0vFAjIAICg2rdvn/r06dPojl7e71177bUNnvfWW29p9erVQe0fWqaiosJvu9vtNo/feeedtupOu1dUVKQlS5Y0eo6/H0qk2pnj9PR03XnnnerVq5d69OgRii5GNQIyACCo9u7dq969ezd6jne95ZIlS7R27Vrz9YoVK3TeeefJMAxddtllmjx5csj6iuY7evRoQO15eXmh7E67V1paWq/td7/7nc/rAwcO+L22oKDAPO7cuXNwOwZJBGQAQJB5ZpAb88knn/i8/uc//2keX3bZZfroo4+0Zs2akPQPLdPQDHJDG758+eWXoe
xOu2YYhsaOHevTdvvtt2vhwoU+bWVlZX6vX758uXl87733Br+DICADAIKrtLS0yRnkuqxWq3nseeDoX//6l9nGhhTh19BM8T/+8Q9t27ZNkvSLX/xC3bp1k1T/AT786IcffvB57XK5tGDBAlmtVlVUVOjtt9+WVDuDXFBQIIvF4lPvvWfPHtntdhmGoWuuuaYtux41CMgAgKDZsWOH9u/fr0GDBgV0nWdzEenHGcmHH37YbGsonKHtfPfddz6vU1NTJUl//OMfzY0q9u/fr5/+9KdKSUnRF198oQcffJAfbvzwXhHE4XAoJubHONa1a1eNGDFCUm1AvvnmmyVJt9xyiy655BK98MILWrFihYYNG9a2nY4yBGQAQNB4lngbOXJko+d9+eWXevTRR3XBBRdIko4fPy6p4ZnilqwRi+AqLCxUp06dzNevvPKKz/tut1vbt2/X0KFD1b17d61du1Z33XWXSkpKtGLFCiUlJZnr9kY7z2Y6zz//vN8VXXr27CmptsTCe8yWL1+uWbNmqbq6Wv369WubzkapkAXkWbNmqVevXj57gx86dEhZWVnKyMhQVlaWuaWiYRiaN2+e7Ha7Ro4cqc2bN5vX5ObmKiMjQxkZGcrNzTXbN23apBEjRshut2vevHn8hAoAEeDCCy+U1PS6rKNGjdItt9yiVatW6fTTTzc3F2koCDf0ND/aTllZmc/a1nV3Sty5c6cqKys1dOhQnwfHDh06pBtuuEEVFRVKTEysNxPdlhwOR0T8sOUJvQ3tnNetWzfFxsZq3bp1PpnI2/nnnx+y/iGEAfmaa65Rfn6+T9uCBQs0ceJEFRYWauLEiVqwYIEkafXq1SosLFRhYaFycnI0d+5cSbX/Us2fP18bNmxQQUGB5s+fb4bquXPnavHixeZ1dT8LAND2POEjkJ29OnXqZAbkkpISv+d4ZpgRPmVlZTrppJM0d+5c/e1vf9O4ceN83v/0008lSWPHjtXXX39tto8cOdInWGdkZOiyyy7zKTNoK/Hx8erZs2e9h0SDIT8/X++//36zzm1qBzyLxaKamhq9++67kqQhQ4aY723evFk7duwwfxhFaIQsIJ9zzjn1fjLKy8tTdna2JCk7O9t8CjMvL08zZ86UxWLR2LFjdfjwYZWWlmrNmjXKyspSSkqKkpOTlZWVpfz8fJWWlqqiokJjx46VxWLRzJkzfZ7oBACEV3x8fLPP7dSpk7kjWN2A3LdvX0kE5EjgCchPP/20br755nrrXG/YsEGSNHjwYGVkZPi8t2nTJp/Xb731ln7961+HtsN1bN261Tw+55xzgv536oILLlBWVlazzg10i+hVq1Zp6NChWrp0qU4//XQNHDiwxf1E87RpDfK+ffvMov4+ffpo3759kmp3g/GupUlPT1dxcXGj7enp6fXaAQDtz6effqrPP/9cTqez3q+/L774YkmBBeTdu3ebv21E8HgCckOeeeYZJSUlKTExUVu2bNG6det83j///PM1YcIEvfTSS5KkZcuW+WwyEmrXX3+9z+uZM2e2eBb7ySef9Fma0Nu6dev0/fffN3q9JyA39oOkZ9b9v//7vzVo0CBt3bpVM2bMaFF/EThbuD7YYrE0ustSMOXk5CgnJ0dS7QL2Df0Krz1oaE1E+Md4BYbxChxj9iPvZ0FKS0v9/n+tv/EaMGCA/t//+38qLCyst/yVp5a1qKioWf/fffz4cZ1yyim69NJL9dRTTwX6FSJOpPz9crvd2rdvnzp37tzon0PPnj3N9+12u4qLizVlyhRt2rRJSUlJ5s5xM2bM0EsvvaQvv/zSnDgLhsbGa+fOnT6v33rrLX300UcBrwZRU1Ojm266SZKUnJys2bNn+2zwMWHCBA0ZMkQrVqzQF198oXPOOade3tmyZYskKSYmpsHx3Lhxo6qqqpoc89aIlL9fkahNA3Lv3r1VWlqq1NRUlZaWmj8dpaWlac+ePeZ5RUVFSktLU1pams86mEVFRZowYYLS0t
JUVFRU7/yGzJkzR3PmzJEkZWZmmr+ya6/ae//bGuMVGMYrcIxZLe8AMn78eCUlJfk9r+54zZs3T3PnzlVycrK53NXo0aP1xRdf6NRTT5VU+0BYc8bZs8PYW2+9pTfffLNF3yPSRMLfr/3798vhcOjUU0/12x+LxSLDMJSWllbvfc9ucOPGjTPfu/TSS/XSSy8pJiam3vllZWX61a9+pWeeecZcPi4Qde+3Y8cOzZ8/X8XFxWYw9zhx4kTA4+u9A155ebkWLlzos0ybJG3fvl3z5s3T6tWrtXLlSk2ePFk7duwwfyt+5513Sqr9e95Wk4UNiYS/X5GoTUsspkyZYq5EkZubq6lTp5rtS5culWEYWr9+vZKSkpSamqpJkyZp7dq1Ki8vV3l5udauXatJkyYpNTVV3bp10/r162UYhpYuXWreCwAQHrt27ZIkffTRRw2GY388dZhVVVWqqKiQzWbTK6+8ovvuu898EOm6665rdPWD6upqDRw4UGPGjDHbvCde0DqLFy+WJPXv39+n/bTTTlNCQoLuu+8+SfJbg/urX/1Kknw2tPAsF+epPfe2du1arVu3TsOGDdMll1zS6nKZOXPmaOnSpZJqn3/au3evFi1aJKl26bpA+Zt1feSRR+q1rV69WpL073//WwcPHtTgwYM1efJkrV+/XlLtpirhDsdohBEi06dPN/r06WPYbDYjLS3NeO6554wDBw4Y5513nmG3242JEycaBw8eNAzDMNxut3HDDTcYgwYNMn7yk58YX3zxhXmfJUuWGIMHDzYGDx5sPP/882b7F198YQwfPtwYNGiQ8dvf/tZwu93N6teZZ54Z3C/axoqLi8PdhXaF8QoM4xU4xuxHy5cvNyQZmzdvbvAcf+P16quvGpKMhx56yPiv//ovIzk52XyvqqrKkGT+ryGvvfaaz3mSjF69erXuC0WASPn75RnTb7/91qfd6XQa1dXVxpEjR4xVq1b5/W+x0+k0jh075tP28ccfG5KM9957r975zz33nM+f4913393sfvobr6ysLPNeX375pdmemJho3HbbbcaGDRuMkpKSZn/G+++/X+/vmud/xcXFDb7n79xwi4Q+hFtDuTBkJRavvvqq3/YPPvigXpvFYjF/mqtr1qxZmjVrVr32zMxMffPNN63rJAAgaDw74Hm2Gm4uz4NKd955p8aPH+9zfd21dn/2s59p2bJlPg9qS76/9vbwbFqC1jG8asvr7pBotVpltVoVFxdnbvpSl9Vq9VkXWWp8Brnug5r333+/pk+fruHDhwfU74KCAg0aNEi7du3SoEGDNGTIEJ/l0rp3767PP/9cf/3rXzV58mStXLmyWfd98sknzeOzzjrLLOv55S9/qb59+2r37t06ceKEqqqqdOedd/pdhvayyy6jtCHCsZMeACAoKioqJAUekL2Xuvrkk0/MNZEl1fsV9GeffeZ3BzHvh6QuuugiSbW/7jfYRCpgTqdTf/nLX/T555/r8OHD2rFjh6TaraUDWd+6MZ6NZLz/rD08K5Z47yD3xhtvBHT/Rx99VGPGjNGcOXP0/fff69e//rVWrVrls4FN9+7dzbWbV61a1az7rlixQu+88475Ojk5We+9957POSeffLJOPfVUjRo1SosXL9YNN9yghx56SKtWrZJhGHI4HAF/H7Q9AjIAIChaGpDr7rpXt8azqXpm75D18ssva/ny5Ro/frzee+89n+CM5vn4449177336uyzz9bEiRP15ZdfSpK5aUUweGaQPWH4448/1vbt2yXV7prYuXNnxcfHm3sn/PnPfw5om+rf//73kqS3335bbrfbfNjTm/dazc1dt9vz2/Err7xSUu0PcOeee65uvfVWsw7bW3p6uhYtWqQ77rjDnGH3t7U0Ig8BGQAQFBUVFYqPjw9okxBJ6tGjh8/ruhtI+Ntq13v92kOHDkmqXdLz6quvVkxMjLlywpNPPmnuzhqIaJ559swYS7Vjv3v3bkmqt/lHa/Tu3V
tS7T4INTU1+vnPf67JkydLqg3NnpKMF1980QzJ06ZNa/b9665+4a/vr7zyinmcnJzcrPtWVFRo+PDhevLJJzVgwAD9+c9/ltVq1cKFC3XGGWc0u3+IfARkAEBQvPvuuy16Kr/urqv/8R//4fN60KBBGjt2rE+b91KfntUqvO8zZcoU8/jZZ58NKPDu379fMTExiouL81sj21FVVlZq3bp15o54knTSSSdp7969SkhIUNeuXYP2WYmJierZs6fuvfdes858x44dmjNnjj788EN16dLFPNfzjJKnHKI5jh496rNqht1ur3dO586dtXv3bl1//fU6evRos+5bWVmp7t27q2fPntq5c6fPqinoWAjIAIBW27Rpk7Zu3RrQr8E96j6s1L1793rneJbG8vAs/VVcXKyzzz5bknx+jb5gwQK/5zeHZzm5mpoa/eUvf2n2de3dWWedpQkTJui5554z28rKyvTaa69p6NChQV+SzDPL723x4sUqLCz0KZvp3LmzrrvuOjmdziYfvHS73Ro3bpyKiopks9lUWlqqdevW+f07JdXWC/fp00fHjx+X0+lsss8nTpyoVxKEjomADABolT179igzM1OS9MADDwR8vcVi8VmtqO5ssT9r166VJD3//PNmm/cKBZL0n//5n+ZxcXFxs/vjWY1DUpNbBncUhmH4rAw1fPhwzZs3T1Ltn6+/ByNb64UXXvB5PXPmTPN47969Pu/95je/0fHjx/XZZ581es+NGzeaP0xlZ2erT58+Oueccxq9xlPjfsMNNzTZ58rKSrN+Gh0bARkA0CoPPfSQeXz77be36B41NTWSpIcfftisT/V26623SpJsttrVST0rCWzbtk2StHTpUvM9j0WLFplbG//www/KycnRxRdf7FNj64/3bHO0bMVbdzb31FNP1bfffmu+/vnPfx70z7zmmmtkGIZmzpyprl276tlnn/UpjfHm2aBkw4YNys7O9rv6heRbevOzn/2sWf345S9/Kal2Q4/GHDt2TF999VW9pQfRMRGQAQCt4qlNHTNmjKxWa4vu4XA4JMmn9tTbvffeK0l67LHHdO6555o1xRUVFTr99NM1Y8aMetfExMRo0qRJkmpnIK+//nrl5eX5zFTW5Xa7zXVup06dqo0bN/otBehoPKtJePztb3/TmWeeab6+4oorQvbZubm5qqioUGJioh599FG/5/Tq1UsWi0UPP/ywli5dqvPPP1/XXXed3G633G63eZ5nPexAZv5POeUUnXrqqerVq1ej53lKeX760582+95ovwjIAIAWufXWW2WxWMwtppcvX97iezUVkLt16ybDMPRf//Vf6tKli4qKirR+/XqfFQ/8SUtLU0pKis9MsGfZMn9yc3PNh9TuvvtuHT16VHa73Zzh9lixYoUuvPDCZtWttgfeD6mdf/75Ovnkk/XAAw9ox44dWr16tdLS0tqkH54HLUeMGOHTbrPZfB60/OSTT/Tcc8/pnnvukdVq1YoVKzR+/Hg9+OCD6t69e70NTZoSGxtb78/yxIkTyszM1Jo1a/Tdd9/p66+/liTdeOONLflqaGcIyACAgBUXF+tvf/ubJOn111+X1Wr1WxrRXJ6A3FjY9ejUqZN++OEHjRs3Th999FGT10ydOtU8nj9/vk6cONFgsPXs9vrwww8rMzNTkyZN0pEjR8yl5KTasouLLrpIK1euNH84aO+8Q9+wYcMk1YbSgQMH6vzzz2+zfqSkpGjFihX66KOPmnX+gw8+KKl2cxjPKhdN1Rz7Y7PZ6v0QdN1112nTpk06//zzzWXiFixY0OLfkqB9ISADAAL21Vdf+bz2/Aq8paqrqyU1LyAfO3bM53VTS7h5SiaeeuopcxOThpb1Ki8v15lnnqk//OEPkqSrr75a0o+boEjy+VV8R6lRXrdunSTpkksuqbcCSFv75S9/WW9tbOnH5f+ys7O1cOFCv2U1UvNrj735m0H2V2vM9tDRg4AMAAhYSUmJJJk1vq0NDp
6A3JytjD3rHnvUnfmrq0uXLmZ5hqde+quvvtKKFSvMh70OHjyodevW6ciRIz4793mOPStb1F3GznvZMafTvXsOkAAAH01JREFU6fP+7t27zU02It1VV10lqXZjjkjd6e2NN97QK6+8ohdeeEG33nqrli5dar7ndrv1zTffqEuXLrr00ksDvrdnBtnlcunmm2/W66+/rhdffLHeeaNGjWrNV0A7QkAGAATM8zDUyJEjJdVfYi1QnhKL5qwQ8M477/isT/z44483+3M8vyo/77zzdNFFF+m5557TwoUL1bNnT02YMEEFBQU+Adkzk1lWVqY333zTXPrMM8vqHZCnTp2qxMREWSwW9ejRQwMGDDCXv/M4evRoRO7Sl5iYqL59+wa8TXhbSkpK0pVXXun3NxUWi0XDhw/X9u3bNXjw4IDvHRsbq5qaGm3ZskWPP/6434cSly1bVq82Gh2XrelTAADwVVJSop49e5rr4zZn5rcxnhnk5mxT3b9/f9199926++67A/6c8ePH+7zev3+/z9rNNTU1Pg+kDRw4UJLMbZA9brzxRt1xxx1mDXJxcbFWrVplvu+pWT5w4IBmzZqliooK9e3bV0899ZQuuugic5m6SHHs2LEGH5CMZH/4wx+0cuXKVt/HZrOpsrKyXslMVVWVXC6XDh8+THlFlGEGGQAQsG3btik1NdXcoay1WzKfccYZkmq3Ng6lurOP/jY2OeWUU8zj1NTUeu/ffffd6tSpk7p3764HH3xQgwcPVnp6uiSpX79+evzxxzV06FDz/BdeeEFvvvmmnnrqKUm1W3L/6U9/Csr3CZYDBw40uNtcJHv44Yd9NjhpqdjYWBUXF9d7IDE+Pl6dOnUiHEchAjIAoNlmzJihJ554QuvWrVNCQoK57W5rA/KiRYtUUFCgk08+ORjdbNT27dvrhaqtW7ea5RTeD3/VDdQPPvig5s+fL+nHut2xY8dqxowZeuCBB7R7927ddNNNeuaZZ/TMM89o//79crvduueee3Tvvfdq+/btkupvnR1u3377rU+ojzY2m61evbjnoUBEJ0osAADNcvToUb388st6+eWXJdWuNuBZb9ZTi9xSCQkJGj16dKv72ByeGeJnn33W3I56yJAh2rRpk44ePVpvJvVf//qXXnzxRd14443mTLdUu2nJAw884Hfm9ec//7nP7nP33XefeTxx4sQGV9EIhxUrVmjv3r0aPnx4uLsSNv4eTPSuRUf0ISADAJp02223aeHChT5tP/vZz3TGGWeooKBAp59+eph61nLXX3+9LrzwQhUVFSkmJsbczriuumHXIy4urkXbDnft2lX79u0L+LpQOH78uC666CJJ0sUXXxzm3oSP5zchUu0PMJMnT25wGTlEB0osAABNeuutt+q1jRs3TpI0evRo2Wztc74lLS1NY8aMadPP7Nq1q7755hvdc889bfq53oqKijR27Fjl5eVJknr37m2u8BGNvFcbefrpp3XLLbeEvB4ekY2ADABo0mmnnSapNhTPnz9fe/bsUadOncLcq/bJ8yDY/fff36af63a79eKLL6q0tFRXXXWVNmzYoJkzZ0qS3n///TbtS6TxPGQpib/XkESJBQCgGY4fP64xY8bof/7nf8Ldlf/f3p0HV13d/x9/Zl9vFrJASCABghASwhKQtTLoSGURK8uUxdEiSKtOR+sIWulYamtBxG0KOmVTihamWiQom4AKCAqYgBQExBCELISE7OtNbs73D37cX1LAciW5N8vrMXOHe+9ne593Pgnve+75nE+rN2PGDNasWcPFixeddszy8nKGDh3KiRMnGr1vs9mIjY213166vUpJSbE/bzjNn7Rf6kEWEZFrbt98veWtcZ7clqpr1672u/M5w+OPP35NcfyrX/0KgLfeegt39/ZdDly92BSunblE2qf2/RshIiKkpaVhsVjs41H/W1FREQcOHKCurs7JkbVdwcHBTi2Qjx07BsCECRNYtmwZubm5vP322xhjGD
t2rNPiaMkmT57M73//e1eHIS2EhliIiLRzTz31FHBlFoN//vOfTJ8+vdHyjz/+GIAxY8Y4Pba2Kjg4mLKyMmw2Gx4eHs1+vN69e3P06FE2b96sHtIb+OCDD1wdgrQg6kEWEWmHhg4dytNPP82HH37I3r177e9fvfnFVS+//DIPPvgggYGBPPPMM84Os82yWq0ArFy50inHKy0tJSUlRcWxyE1SgSwi0s4YYzh48CCvvPIKkyZNumb5+++/D8A777zD/PnzAbjvvvuc0tPZXuTm5gLw6KOPMnbsWEpLS5v1eCUlJQQFBTXrMUTaEhXIIiLtzO7duxu9fuqpp/jlL39pf71jxw5++OEHZs2aZX/vaqEsTaPhnfW2b99un0WhtraWcePGMXz4cIqLi8nOzuatt96yF9Q3snjx4kYXmgF89NFHPPnkk8yYMYP9+/cTHh7e9A0RaaM0BllEpJ1JS0tr9Pp3v/sdkZGRLF++nIEDB7J69WpWr15tX15fX6+v5ptYly5dCAkJobi4GIDvv/8eq9XKyZMn2bZtGwChoaH29U+cOMGyZcuuu69du3bZLy5buHAha9eu5dy5c9es17AoF5Efpx5kEZF2prKyEjc3NzIzM1mxYgUxMTF4e3sTFhaGzWZrtO6+fftUHDeTq8XxVenp6Vy6dOm66y5fvpz4+HjWrFlj/xlVVFSwc+dOXnrpJft6f/rTn+zF8cCBA9mxYwcvvvgi58+fp3fv3s3TEJE2SD3IIiLtSH5+Pi+88AKdOnUiLi6ORx55pNHy/x4LO3LkSGeG166kp6czcOBAlixZwvz58xk2bBizZ89utM6UKVMoKipi9+7dZGRkMHv2bGbPns3jjz/Orl27OH36NAATJ05k8uTJnD59Gj8/PxYsWGD/YKPZR0QcpwJZRKQdWbp0KQAWi+W6yxMSEjh06BDHjx8nLi7OiZG1PwMGDMAYA/z/Md6rV69m+PDhpKamUlJSgp+fH35+fgwcOJC4uDjy8vI4efIky5cvB65MF/fSSy8xefJkjTEWaUIqkEVE2pFNmzYB2Me5/reNGzfy8ccfk5iY6Myw2r1jx46RnJwMQGJiIuHh4YSHh5OTk0NoaCiZmZn2dY8fP05aWhrffvstU6dOZdCgQa4KW6TNUoEsItJKVVZW8txzz7FmzRqOHDlCjx49briuMYazZ8/y3Xff8de//vWG60ZHR/PrX/+6uUKWG+jbty9Wq5V3332X0aNH/+i6SUlJJCUlOSkykfZJF+mJiLRS8+bN44033qCsrIz4+Hjc3NzYsGFDo3WMMaSkpGCxWIiPjwe4ZjowaRm8vLyYNWuWhraItAAqkEVEWhFjDM8//zxHjx7lyJEj1yyfPn06bm5uuLm5MX78eE6fPk16ejoVFRX2de666y5nhiwi0uqoQBYRaUUuXLjAn//8ZwYMGMCXX37JE088Qb9+/QCumY5t69atjb6K/+Mf/4gxRhdziYj8DxqDLCLSivz3TT5efPFF8vPz2bBhA/Pnz+fSpUssWbKE/v3789BDD9nnzM3OzqZz586uCFlEpNVRgSwi0oq888479ucxMTEEBAQQEBDAs88+C0CnTp149dVXgSu3MF6/fj1hYWEqjkVEHKAhFiIirUBhYSFLly5l8+bNvPDCCxhjuHDhwo9us3z5coKCgliwYIGTohQRaRvUgywi0sLl5uY26gF+4oknbmq70NBQioqKcHdXX4iIiCP0V1NEpAX75JNPGs1ZvGrVKoKCgm56exXHIiKOUw+yiIgTGGM4f/48y5YtIyAggMGDBxMSEsKIESPs69hsNr7//nvc3d359NNP2b9/P+vWrSM2Npa1a9cyatQoF7ZARKT9UIHcwpw4cYKRI0cSHh7Onj17dGGNSAv2ww8/MGvWLNasWXPNzR0uXLjAjh07SE5OJjMzk3nz5l13zPDChQu58847eeWVV9i1a1ej+YoB+vTpw0cffaSbe4iIOJEK5BZm8eLFFBcXU1
xcTHR0NCNHjmTLli32r1TXrVvHihUrGDt2LCkpKfj6+lJZWUlycjLh4eF069bNxS0QaftWrVrFokWLOHv2LADdunVj06ZNfPXVVxQUFJCXl8dHH33UaBsvLy8mT57MM888w8KFC6moqGDPnj0sXLiQhQsXAjBjxgyGDRuGMYbBgweTnJyMv7+/s5snItLuqUBuIS5evMhjjz3Ghx9+yO23386hQ4cA+OKLL0hKSmLmzJls376do0ePAleuTr+esWPHsmjRIhISEvD29qa4uJgLFy4QGxtLYGDgLY1HvHz5Mjk5OURERNCxY0f7TQmqq6spKioiLCwMYww+Pj4/+RgixhjefPNNVq5cydChQ7n33nux2WxERETYPxAaY7BYLERGRhIVFfWTjlNbW0tpaSlw5Zub/fv3k52dTXBwMBaLhVOnTmGMwd3dnczMTGw2GxUVFRQWFvLDDz8A8LOf/Yx9+/YB8Itf/AIAb29vgoKCmDJlCn379sXX15eEhATuuOMOgoODAdiyZQtwZXzx22+/TWZmJqtXryYxMfGWciciIk3DzRhjXB2EMw0aNIivv/7a1WFcIyoqiosXLwKQk5PDI488wsGDB6moqKCqqsq+3qRJk5gzZw6RkZFkZGQQEBBAbm4uQUFBrFq1ip07d97wGF5eXsTGxtKrVy/Gjx/PpEmTCA0N5eLFi6xYsYKhQ4cyatQoAgICMMZQV1dHRUUFH3zwAf/617/Ys2cPdXV1APj5+dGhQweCgoI4c+aM/X0AT09PQkJC6N69O7W1tdhsNiwWC0OGDCEuLo6IiAi8vLyIiorC09MTb29vYmNjCQ0NvalcXT1l//uuYdeTk5OjYSo3UF9fT3l5OYGBgVitVry9vbl48WKz5au+vp6amhqqq6txc3PDZrNhjMHLywubzcb69ev54osvSEtL48yZMze9Xz8/P3r16oWvry++vr5UVVUREBBAQUEBWVlZVFVV4efnh81mo76+HpvNhs1mo7a2lvr6+hvuNzg4GH9/f9zd3YmKiiIwMJDAwECCg4Pp06cPc+fOJTw8nIMHD7J3714KCgqYO3duowvq5Fr6nXSM8uUY5csxyteN68JWXyBv376dJ554ApvNxpw5c+yT5d9ISyuQT506xYIFC9i4cSMAJ0+epHfv3tTW1lJXV8fnn3/OuHHjeOyxx3j11Ve5fPnyDU9mq9XKv//9b1577TUOHz4MQPfu3enevTsjRoygpqaG48ePs3PnTmpqaq67D09Pz0bF7lU+Pj7ccccdjBkzhsrKSvbv34/FYqGuro7AwED69OlDXV2dvQgqLCwkIyMDm82Gv78/33//Pd99990N8+Dh4UFUVBRxcXHExMQQHBxMVVUVOTk5BAYGEhcXR/fu3bFaraSmppKens748eOJjIyksLCQ6upqOnfuTHR0NJ07dyYwMJCIiAiMMXTu3JmYmBh8fX0BKC4uJjg4+LoFdn19Pfn5+RQWFlJTU0NsbCwhISE3VYw7wmazUVpayqlTpwgKCsLHxwdvb+9r/vXw8LjpYxtjKC8v59KlS+Tl5bF7925efPFFBg0axLhx4/Dz86O0tJRLly5x7tw59u/fT0lJiX17d3d3vL297ccLDQ2lY8eOdOjQgeDgYMrLyykpKaG0tJTCwkJKSkrw8vIiODgYPz8/rFYrcOWDy9UCuLq62v64uvzHhISEkJKSwoABA5g/fz5/+MMf6NixI/Hx8YSHh1NdXU1gYCAAZWVlfP311+zYsYPOnTtjtVqprq629zKHhobStWtXfHx8sFqteHh44OHhgbu7Ox4eHnh5eREREUFdXR0xMTGMHj3aPpSpqqrqpn/u+g/GMcqXY5QvxyhfjlG+2miBbLPZuO2229i5cycxMTEMHjyY9evX06dPnxtu4+wC+auvviInJ4fCwkKKioooKioiOzsbd3d38vLy2LZtGwD33HMPf//73+nateuP7s+Rk7mkpMT+lW5DVquV999/n7S0NC
5evEh6ejq//e1v6datG/v27bMXSe7u7vj7+9OzZ08mTJiAh4eH4wn4f+rr6/nwww/x8PCgV69eFBQUUF5ejs1mw2q1cvjwYdLT0ykpKeHbb7+lrKyMsLAwunTpQm1tLWfPnm3Ukw4QGBiIh4cHYWFh+Pj4kJWVRVlZ2XWP7+XlRY8ePfD09OT48eNERUVhsVioqamhsrKS+vp6e49qbW3tdbf39fXFx8cHf39/QkJC7D2gVx/19fWNXhtj8Pb2xmazUVdXh9Vqpaam5ppvBX5Mx44dCQwMxM3NjaqqKurq6vDy8sLb2xtjDLW1tfZHZWUl1dXV/3OfXl5eREZG0qNHDwICAhgyZAhubm4UFBRQW1uLxWKhtraWAwcOUFpaSlBQEBUVFQQGBhIUFERwcDAhISH2D0jFxcVYrVZ8fX3t7YYrBffVXl0/Pz/7cx8fH+rr6/H09MTd3Z2amhpsNhujRo1iyJAhN5WXlkT/wThG+XKM8uUY5csxyteN68JWPQb50KFDxMfH26/unjZtGqmpqT9aIDvbb37zG7755hv7aw8PD0JCQvD09KRz585Mnz6dKVOmcP/99zd5L+X1imO4MkZy5syZzJw585pl48aNa9IYrnJ3d2fy5Mk3XD5p0iT786u9oA2v2jfGcOnSJXx9fQkKCrphrioqKjh79iw2m438/Hy+++47PD09OXz4MJ9//jmxsbGMGDGCqqoqampq8PHxoaysjPDwcLy9vbFYLERHRxMWFoaHhwdnzpyhtraWmpoa+6O0tJTy8nLc3d3tvaUNn199DVc+jHh6etqHknh5eREQEIDFYsFisdClSxeMMdTU1GC1Whs9ysrKyMjIoKCggA4dOuDv74+npye1tbVYrVbc3NzsxbKXlxd+fn5ERkY2eiQnJ5OXl0d1dbV9SMyPfdDRH0sREZFWXiBnZ2fTpUsX++uYmBgOHjx4zXorVqxgxYoVwJWL4XJycpwW45IlS3BzcyMkJITg4GACAgKuW9zl5ube1P7y8/ObOsQWydfX97o/p4qKimumwfpvYWFhAPYiMSIignvvvfcnxTFy5MiftF1LcenSJdzc3PDz86Oqqup/9ly3l/OrKSlnjlG+HKN8OUb5cozydWOtukC+WXPnzmXu3LnAla50Z/aQNcex1MPnGOXLMcqX45QzxyhfjlG+HKN8OUb5ur5WfQ/S6OjoRhPvZ2VlER0d7cKIRERERKS1a9UF8uDBgzlz5gyZmZlYrVY2bNjAxIkTXR2WiIiIiLRirXqIhaenJ8uWLePnP/85NpuNhx9+WBPti4iIiMgtadUFMlyZdaG5Zl4QERERkfanVQ+xEBERERFpaiqQRUREREQaUIEsIiIiItKACmQRERERkQZUIIuIiIiINKACWURERESkARXIIiIiIiINqEAWEREREWlABbKIiIiISAMqkEVEREREGlCBLCIiIiLSgJsxxrg6CGcKDw8nLi7O1WH8ZPn5+URERLg6jFZD+XKM8uU45cwxypdjlC/HKF+OUb7g3LlzFBQUXPN+uyuQW7tBgwbx9ddfuzqMVkP5cozy5TjlzDHKl2OUL8coX45Rvm5MQyxERERERBpQgSwiIiIi0oDHwoULF7o6CHFMSkqKq0NoVZQvxyhfjlPOHKN8OUb5cozy5Rjl6/o0BllEREREpAENsRARERERaUAFsoiIiIhIAyqQXezChQuMHj2aPn36kJiYyBtvvAFAYWEhd999Nz179uTuu++mqKgIgFOnTjFs2DB8fHxYunTpNfuz2WwMGDCACRMmOLUdztKU+YqLi6Nv377079+fQYMGOb0tztCU+SouLmbKlCn07t2bhIQEvvzyS6e3xxmaKmenT5+mf//+9kdQUBCvv/66S9rUnJryHHvttddITEwkKSmJ6dOnU11d7fT2NLemzNcbb7xBUlISiYmJbfLcAsfz9d5775GcnEzfvn0ZPnw433zzjX1f27dvp1evXsTHx7N48WKXtKe5NW
W+Hn74YSIjI0lKSnJJW1zOiEvl5OSYtLQ0Y4wxpaWlpmfPnubEiRNm3rx5ZtGiRcYYYxYtWmTmz59vjDEmLy/PHDp0yDz33HPm5ZdfvmZ/r7zyipk+fboZP3688xrhRE2Zr9jYWJOfn+/cBjhZU+brwQcfNCtXrjTGGFNTU2OKioqc2BLnaerfSWOMqaurMx07djTnzp1zTiOcqKnylZWVZeLi4kxlZaUxxpipU6eat99+27mNcYKmytd//vMfk5iYaCoqKkxtba256667zJkzZ5zfoGbmaL72799vCgsLjTHGbN261dx+++3GmCu/g927dzcZGRmmpqbGJCcnmxMnTrigRc2rqfJljDF79uwxaWlpJjEx0cmtaBnUg+xiUVFRDBw4EACLxUJCQgLZ2dmkpqby0EMPAfDQQw+xadMmACIjIxk8eDBeXl7X7CsrK4stW7YwZ84c5zXAyZoyX+1BU+WrpKSEvXv3Mnv2bAC8vb0JCQlxYkucpznOsd27d9OjRw9iY2ObvwFO1pT5qquro6qqirq6OiorK+ncubPzGuIkTZWvkydPMmTIEPz9/fH09GTUqFFs3LjRuY1xAkfzNXz4cEJDQwEYOnQoWVlZABw6dIj4+Hi6d++Ot7c306ZNIzU11QUtal5NlS+AO+64gw4dOji5BS2HCuQW5Ny5cxw5coQhQ4aQl5dHVFQUAJ06dSIvL+9/bv/kk0+yZMkS3N3bx4/1VvPl5ubGmDFjSElJYcWKFc0drsvdSr4yMzOJiIhg1qxZDBgwgDlz5lBRUeGMsF3qVs+xqzZs2MD06dObK8wW41byFR0dzdNPP03Xrl2JiooiODiYMWPGOCNsl7mVfCUlJbFv3z4uX75MZWUlW7du5cKFC84I22Uczdfq1asZO3YsANnZ2XTp0sW+LCYmhuzsbOcE7iK3ki9RgdxilJeXM3nyZF5//XWCgoIaLXNzc8PNze1Ht//444+JjIxsN/MZ3mq+AL744gvS09PZtm0by5cvZ+/evc0Vrsvdar7q6upIT0/n0Ucf5ciRIwQEBLTZMXxXNcU5BmC1Wtm8eTNTp05tjjBbjFvNV1FREampqWRmZpKTk0NFRQXvvvtuc4bsUrear4SEBJ555hnGjBnDPffcQ//+/fHw8GjOkF3K0Xx99tlnrF69mpdeesmZYbYYytetU4HcAtTW1jJ58mRmzpzJpEmTAOjYsSO5ubkA5ObmEhkZ+aP72L9/P5s3byYuLo5p06bx6aef8sADDzR77K7QFPmCKz1WcOUrzPvvv59Dhw41X9Au1BT5iomJISYmhiFDhgAwZcoU0tPTmzdwF2qqcwxg27ZtDBw4kI4dOzZbvK7WFPnatWsX3bp1IyIiAi8vLyZNmsSBAweaPXZXaKrza/bs2aSlpbF3715CQ0O57bbbmjVuV3E0X8eOHWPOnDmkpqYSFhYGXPl737CHPSsry/5/QFvTFPkSFcguZ4xh9uzZJCQk8NRTT9nfnzhxImvXrgVg7dq13HfffT+6n0WLFpGVlcW5c+fYsGEDd955Z5vsfWmqfFVUVFBWVmZ//sknn7TJK3WbKl+dOnWiS5cunD59GrgyprZPnz7NF7gLNVXOrlq/fn2bHl7RVPnq2rUrX331FZWVlRhj2L17NwkJCc0auys05fl16dIlAM6fP8/GjRuZMWNG8wTtQo7m6/z580yaNIl169Y1+sAwePBgzpw5Q2ZmJlarlQ0bNjBx4kTnNsYJmipfgmaxcLV9+/YZwPTt29f069fP9OvXz2zZssUUFBSYO++808THx5u77rrLXL582RhjTG5uromOjjYWi8UEBweb6OhoU1JS0mifn332WZudxaKp8pWRkWGSk5NNcnKy6dOnj/nLX/7i4pY1j6Y8v44cOWJSUlJM3759zX333We/8rmtacqclZeXmw4dOpji4mJXNqlZNWW+nn/+edOrVy+TmJhoHnjgAVNdXe3KpjWLpszXyJEjTUJCgklOTj
a7du1yZbOajaP5mj17tgkJCbGvm5KSYt/Xli1bTM+ePU337t31N/8m8jVt2jTTqVMn4+npaaKjo82qVatc1SyX0K2mRUREREQa0BALEREREZEGVCCLiIiIiDSgAllEREREpAEVyCIiIiIiDahAFhERERFpQAWyiEgbUVxczJtvvglATk4OU6ZMcXFEIiKtk6Z5ExFpI86dO8eECRM4fvy4q0MREWnVPF0dgIiINI1nn32WjIwM+vfvT8+ePTl58iTHjx/nnXfeYdOmTVRUVHDmzBmefvpprFYr69atw8fHh61bt9KhQwcyMjJ4/PHHyc/Px9/fn5UrV9K7d29XN0tExOk0xEJEpI1YvHgxPXr04OjRo7z88suNlh0/fpyNGzdy+PBhFixYgL+/P0eOHGHYsGH84x//AGDu3Ln87W9/Iy0tjaVLl/LYY4+5ohkiIi6nHmQRkXZg9OjRWCwWLBYLwcHB3HvvvQD07duXY8eOUV5ezoEDB5g6dap9m5qaGleFKyLiUiqQRUTaAR8fH/tzd3d3+2t3d3fq6uqor68nJCSEo0ePuipEEZEWQ0MsRETaCIvFQllZ2U/aNigoiG7duvH+++8DYIzhm2++acrwRERaDRXIIiJtRFhYGCNGjCApKYl58+Y5vP17773H6tWr6devH4mJiaSmpjZDlCIiLZ+meRMRERERaUA9yCIiIiIiDahAFhERERFpQAWyiIiIiEgDKpBFRERERBpQgSwiIiIi0oAKZBERERGRBlQgi4iIiIg08H8HLrl9PSikVwAAAABJRU5ErkJggg==\n",
            "text/plain": [
              "<Figure size 720x432 with 1 Axes>"
            ]
          },
          "metadata": {}
        }
      ]
    }
  ]
}