{"nbformat":4,"nbformat_minor":0,"metadata":{"colab":{"provenance":[{"file_id":"1mwgseXvu6zgDFJ45iPPX67-7dZf2NsIX","timestamp":1686257442863}],"authorship_tag":"ABX9TyPw0bLSXvSXfhvyJjKWMfk0"},"kernelspec":{"name":"python3","display_name":"Python 3"},"language_info":{"name":"python"}},"cells":[{"cell_type":"markdown","source":["Copyright 2021 Google LLC\n","\n","Licensed under the Apache License, Version 2.0 (the \"License\"); you may not use this file except in compliance with the License. You may obtain a copy of the License at\n","\n","https://www.apache.org/licenses/LICENSE-2.0\n","\n","Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.\n","\n","# Overview\n","\n","This colab is supplemental code for the following publication:\n","\n","Title: Recurrent Convolutional Deep Neural Networks for Modeling Time-resolved Wildfire Spread Behavior\n","Authors:  John Burge, Matthew R. Bonanni, R. Lily Hu, Matthias Ihme\n","Journal: Fire Technology, 2023\n","\n","This colab will load one of those models, load one file that contains data points for that model, and perform a single inference on one row of that data.\n","\n","The file containing the model (a .h5 file) and the file containing the data must be uploaded to the colab server's local file system; when specifying the location of those files, use their location on the colab server.\n","\n","This colab performs just a single inference, but it should be quite straightforward to modify it to perform more predictions.  Note that the paper details an 'autoregressive' process in which the result of a prediction made at time step t is used in the input for time step t+1.  
This colab only performs individual predictions; it does not replicate the (more complex) logic needed to perform predictions across the entire autoregressive process.  See the paper for the full details needed to replicate that process (this colab should provide an adequate skeleton for how to perform the underlying predictions)."],"metadata":{"id":"qglQG9VVUYRz"}},{"cell_type":"code","execution_count":null,"metadata":{"colab":{"base_uri":"https://localhost:8080/"},"id":"GEJyTnq3Xjj6","executionInfo":{"status":"ok","timestamp":1685999185481,"user_tz":420,"elapsed":16592,"user":{"displayName":"John Burge","userId":"13461865008511798871"}},"outputId":"0bb448d5-9f38-4c6a-eb70-511276f8b5b9"},"outputs":[{"output_type":"stream","name":"stdout","text":["Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/\n","Requirement already satisfied: ml-collections in /usr/local/lib/python3.10/dist-packages (0.1.1)\n","Requirement already satisfied: absl-py in /usr/local/lib/python3.10/dist-packages (from ml-collections) (1.4.0)\n","Requirement already satisfied: PyYAML in /usr/local/lib/python3.10/dist-packages (from ml-collections) (6.0)\n","Requirement already satisfied: six in /usr/local/lib/python3.10/dist-packages (from ml-collections) (1.16.0)\n","Requirement already satisfied: contextlib2 in /usr/local/lib/python3.10/dist-packages (from ml-collections) (0.6.0.post1)\n"]}],"source":["#@title Imports.\n","\n","!pip install ml-collections  # Install ml_collections for ConfigDict.\n","\n","import os\n","from typing import Dict, Iterable, Tuple\n","from ml_collections import config_dict\n","import numpy as np\n","import pandas as pd\n","import matplotlib.pyplot as plt\n","import tensorflow as tf"]},{"cell_type":"code","source":["#@title Functions supporting loading the data.\n","\n","# Specifies the name of features added to tf.train.Example(s).\n","FEATURE_INPUT = 'input'\n","FEATURE_LABEL = 'label'\n","\n","\n","def 
get_tf_example_schema() -> Dict[str, tf.io.FixedLenFeature]:\n","  \"\"\"Returns the example schema needed to load TFRecordDatasets.\"\"\"\n","  return {\n","      FEATURE_INPUT: tf.io.FixedLenFeature([], dtype=tf.string),\n","      FEATURE_LABEL: tf.io.FixedLenFeature([], dtype=tf.string),\n","  }\n","\n","\n","def get_config():\n","  \"\"\"Provides the config for this dataset.\"\"\"\n","  cfg = config_dict.ConfigDict()\n","\n","  # The base directory where all input files are located.  This must be\n","  # specified for the dataset to do anything meaningful.\n","  cfg.input_base = 'must specify'\n","\n","  # To process more than one file, this is appended to the end of input base.\n","  cfg.glob = '*'\n","\n","  # How many data points in a single batch.\n","  cfg.batch_size = 1\n","\n","  return cfg\n","\n","\n","def _parse_example(data):\n","  result = tf.io.parse_single_example(\n","      data, get_tf_example_schema())\n","  feature_input = tf.io.parse_tensor(result[FEATURE_INPUT], tf.float64)\n","  feature_label = tf.io.parse_tensor(result[FEATURE_LABEL], tf.float64)\n","  return feature_input, feature_label\n","\n","\n","def create_dataset(cfg: config_dict.ConfigDict) -> tf.data.Dataset:\n","  \"\"\"Returns the dataset specified in the config.\n","\n","  Each element in the dataset is a tuple (A, B) where A is the input data and\n","  B is the label data.  A and B have the same shape.  If the data is temporal,\n","  the output shape is (b, t, h, w, c), otherwise it's (b, h, w, c).\n","  b = batch, t = time step, h,w = spatial dims, c=channels.\n","  \"\"\"\n","  def _data_to_dict(\n","      input_data: tf.Tensor,\n","      label_data: tf.Tensor) -> Tuple[Dict[str, tf.Tensor], tf.Tensor]:\n","    return ({'input_image': input_data}, label_data)\n","\n","  print('DATASET PROGRESS: Inside create.  
Loading file: ', cfg.input_base)\n","  ds = tf.data.Dataset.list_files(cfg.input_base)\n","  ds = ds.interleave(\n","      lambda x: tf.data.TFRecordDataset(x, compression_type='GZIP'),\n","      cycle_length=tf.data.experimental.AUTOTUNE,\n","      num_parallel_calls=tf.data.experimental.AUTOTUNE,\n","      deterministic=True)\n","  ds = ds.map(\n","      map_func=_parse_example,\n","      num_parallel_calls=tf.data.experimental.AUTOTUNE)\n","  ds = ds.map(\n","      map_func=_data_to_dict, num_parallel_calls=tf.data.experimental.AUTOTUNE)\n","  ds = ds.batch(cfg.batch_size, drop_remainder=True)\n","  ds = ds.prefetch(tf.data.experimental.AUTOTUNE)\n","  return ds\n"],"metadata":{"id":"SrdXfQPac-hP","cellView":"form"},"execution_count":null,"outputs":[]},{"cell_type":"code","source":["#@title Functions supporting loading the model.\n","\n","\n","def _identity_activation(data: tf.Tensor) -> tf.Tensor:\n","  \"\"\"A dummy activation that is the identity.\"\"\"\n","  return data\n","\n","\n","class RemoveLastChannel(tf.keras.layers.Layer):\n","  \"\"\"Removes the last channel of the input tensor.\"\"\"\n","\n","  def call(self, network):\n","    return network[..., :-1]\n","\n","\n","class LastChannelOneHot(tf.keras.layers.Layer):\n","  \"\"\"Builds a one-hot encoding for the last channel of the input tensor.\n","\n","  The last channel of the input tensor is assumed to be a channel with\n","  categorical values that range between 0 and some max value (though possibly\n","  encoded as floats).\n","  \"\"\"\n","\n","  def __init__(self, num_values: int = 10, **kwargs):\n","    super(LastChannelOneHot, self).__init__(**kwargs)\n","    self.num_values = num_values\n","\n","  def get_config(self):\n","    config = super(LastChannelOneHot, self).get_config()\n","    config.update({'num_values': self.num_values})\n","    return config\n","\n","  def call(self, network):\n","    # Strip off just the last channel and cast to an integer value.\n","    network = network[..., 
-1]\n","    network = tf.cast(network, dtype=tf.int32)\n","\n","    #  Transform that slice into a one-hot encoding.\n","    network = tf.one_hot(\n","        network,\n","        off_value=0.0,\n","        on_value=1.0,\n","        depth=self.num_values,\n","        dtype=tf.float32)\n","\n","    return network\n","\n","\n","class ReshapeWithBatch(tf.keras.layers.Layer):\n","  \"\"\"A Reshape layer that allows reshaping to affect the batch dimension.\n","\n","  The stock tf.keras.layers.Reshape layer will not allow dimensions to be\n","  merged with the batch dimension.  E.g., assume the input to the layer had a\n","  shape (None, 3, 4) and the desired output shape was (None, 4).  I.e., the\n","  2nd dimension is being merged with the batch dimension (which is unknown in\n","  size at graph-creation time).  There's no way to accomplish this with a\n","  Layer like:\n","\n","    tf.keras.layers.Reshape(target_shape=(-1, 4))\n","\n","  That command would not change the size at all, as it would assume the -1\n","  size was fine staying at '3'.  We want the -1 to indicate that the None and\n","  2nd dimension are merged into a new, larger, batch dimension.  
So, instead, use\n","  this layer like:\n","\n","    ReshapeWithBatch(target_shape=(-1, 4))\n","  \"\"\"\n","\n","  def __init__(self, target_shape: Iterable[int], **kwargs):\n","    super(ReshapeWithBatch, self).__init__(**kwargs)\n","    self.target_shape = target_shape\n","\n","  def get_config(self):\n","    config = super(ReshapeWithBatch, self).get_config()\n","    config.update({'target_shape': self.target_shape})\n","    return config\n","\n","  def call(self, inputs):\n","    return tf.reshape(inputs, shape=self.target_shape)\n"],"metadata":{"id":"THY5qu4llONP","cellView":"form"},"execution_count":null,"outputs":[]},{"cell_type":"code","source":["#@title Performing the inference.\n","\n","#\n","# Parameters\n","#\n","\n","#@markdown **WARNING**: Be sure that if you specify a temporal model, you provide a temporal data file (likewise, a static model needs a static input file).\n","\n","#@markdown Set this to the location of the input data file on the colab server (you must copy the file to the colab server first).\n","input_file = '/tmp/wildfire/real_nosun/nosun_test_static.tfr-00000-of-00153' #@param {type:'string'}\n","\n","#@markdown Which row in the input file to get the data point from.  
This will cause an error if there are not sufficient rows in the input data file.\n","input_row = 5 #@param {type: 'integer'}\n","\n","# See if the input file exists.\n","print('Trying to list out the input_file:')\n","!ls $input_file\n","\n","#@markdown Set the name of the file on the local colab server that contains the model to load.\n","input_model = '/tmp/wildfire/real_wn/epd.h5' #@param {type: 'string'}\n","\n","# See if the input model exists.\n","print('Trying to list out the input_model:')\n","!ls $input_model\n","\n","#\n","# Execute the inference.\n","#\n","\n","# Set up the configuration for the dataset that will create data points.\n","cfg = get_config()\n","cfg.input_base = input_file\n","print('Using the following configuration for setting up the dataset: ', cfg)\n","\n","# Get the dataset.\n","dataset = create_dataset(cfg=cfg)\n","\n","# Need to load the model.\n","print('Attempting to load the model from: ', input_model)\n","model = tf.keras.models.load_model(\n","      input_model,\n","      custom_objects={'LastChannelOneHot': LastChannelOneHot,\n","                      'RemoveLastChannel': RemoveLastChannel,\n","                      'ReshapeWithBatch': ReshapeWithBatch,\n","                      '_identity_activation': _identity_activation})\n","\n","# Iterate through the points in the dataset.\n","count = 0\n","for dp in dataset:\n","  count += 1\n","  if count == input_row:\n","    print('Found row: ', input_row)\n","    dp_input = dp[0]['input_image']\n","    dp_label = dp[1]\n","    print('The input shape was: ', dp_input.shape)\n","    print('The label shape was: ', dp_label.shape)\n","\n","    print('Performing an inference...')\n","    pred = tf.cast(model(dp_input), tf.float64)\n","    print('... done.  
Prediction had the shape: ', pred.shape)\n","\n","    print('The prediction shape was: ', pred.shape)\n","    print('The label shape was: ', dp_label.shape)\n","\n","    if len(pred.shape) == 5:\n","      # Temporal data, the final prediction is the last time step.\n","      pred = pred[0,-1,:,:,0]\n","      dp_label = dp_label[0,-1,:,:,0]\n","    elif len(pred.shape) == 4:\n","      # Static data, the prediction is used directly.\n","      pred = pred[0,:,:,0]\n","      dp_label = dp_label[0,:,:,0]\n","    else:\n","      raise ValueError('The prediction has the wrong shape.')\n","\n","    fig, axes = plt.subplots(1, 3)\n","    axes[0].imshow(dp_label)\n","    axes[0].set_title('Label')\n","    axes[1].imshow(pred)\n","    axes[1].set_title('Prediction')\n","    error = dp_label - pred\n","    axes[2].imshow(error)\n","    axes[2].set_title('Error')\n","\n","    # Only perform a single inference.\n","    break"],"metadata":{"colab":{"base_uri":"https://localhost:8080/","height":558},"id":"C1eFy5ePhyzL","executionInfo":{"status":"ok","timestamp":1685999362657,"user_tz":420,"elapsed":4596,"user":{"displayName":"John Burge","userId":"13461865008511798871"}},"outputId":"fb745b95-d0e2-4c53-f83c-fb10639cfd8a","cellView":"form"},"execution_count":null,"outputs":[{"output_type":"stream","name":"stdout","text":["Trying to list out the input_file:\n","/tmp/wildfire/real_nosun/nosun_test_static.tfr-00000-of-00153\n","Trying to list out the input_model:\n","/tmp/wildfire/real_wn/epd.h5\n","Using the folloing configuration for setting up the dataset:  batch_size: 1\n","glob: '*'\n","input_base: /tmp/wildfire/real_nosun/nosun_test_static.tfr-00000-of-00153\n","\n","DATASET PROGRESS: Inside create.  
Loading file:  /tmp/wildfire/real_nosun/nosun_test_static.tfr-00000-of-00153\n","Attempting to load the model from:  /tmp/wildfire/real_wn/epd.h5\n"]},{"output_type":"stream","name":"stderr","text":["WARNING:tensorflow:No training configuration found in the save file, so the model was *not* compiled. Compile it manually.\n"]},{"output_type":"stream","name":"stdout","text":["Found row:  5\n","The input shape was:  (1, 126, 126, 17)\n","The label shape was:  (1, 126, 126, 1)\n","Performing an inference...\n","... done.  Prediction had the shape:  (1, 126, 126, 1)\n","The prediction shape was:  (1, 126, 126, 1)\n","The label shape was:  (1, 126, 126, 1)\n"]},{"output_type":"display_data","data":{"text/plain":["<Figure size 640x480 with 3 Axes>"],"image/png":"iVBORw0KGgoAAAANSUhEUgAAAigAAADTCAYAAACrx+h2AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAAAgBUlEQVR4nO3deXSU5cH38d9MdghJWLJiSABBlipqUtJoUJTUCMori1UiPQWqgAqoBB5f8RyIVh7Tx7pEJQX6aoktWIVWpFqIsop4QpDNKlsDJ2xCErZsSNa53j94GB0SIECSuSd8P+fMOcx133PPdSW/DL/M3DOxGWOMAAAALMTu7gkAAACcj4ICAAAsh4ICAAAsh4ICAAAsh4ICAAAsh4ICAAAsh4ICAAAsh4ICAAAsh4ICAAAsh4LSyuzfv182m02vvvpqkx1z3bp1stlsWrduXZMdE9e22NhYjR071nm9OTJms9n0wgsvNNnxALQsCopFZGdny2azafPmze6eCq4B5/J27uLv76+ePXtq8uTJKioqcvf0Gm358uWUEDSJ838mzr9s3LjR3VO85ni7ewIA3Od3v/udunbtqsrKSm3YsEFz587V8uXL9d1336lNmzYtNo877rhDZ86cka+v72Xdbvny5crKymqwpJw5c0be3jzE4fKc+5k43/XXX++G2Vzb+OkFrmGDBw9WfHy8JOmxxx5Tx44d9frrr2vZsmVKTU2tt//p06fVtm3bJp+H3W6Xv79/kx6zqY+Ha8NPfyYao7a2Vg6Ho8FyfbU/L8YYVVZWKiAg4IqP4cl4icdDVFdXa9asWYqLi1NwcLDatm2rAQMGaO3atRe8zRtvvKGYmBgFBATozjvv1HfffVdvn927d+vBBx9Uhw4d5O/vr/j4eP3zn/9szqXAwu6++25JUkFBgcaOHavAwEDt27dPQ4YMUbt27TR69GhJksPhUGZmpvr27St/f3+Fh4dr4sSJOnXqlMvxjDGaPXu2rrvuOrVp00Z33XWXduzYUe9+L3QOSl5enoYMGaL27durbdu2uummm/Tmm29KksaOHausrCxJcnkq/pyGzkHZtm2bBg8erKCgIAUGBmrQoEH1nro/91T/V199pbS0NIWGhqpt27YaPny4jh07dvlfVLQaPz3HLzMzU927d5efn5927typF154QTabTTt37tQjjzyi9u3bKykpSdLZEvPSSy8594+NjdXzzz+vqqoql+PHxs
bq/vvv12effab4+HgFBARo/vz57liqJfAMiocoKyvTO++8o9TUVI0fP17l5eV69913lZKSok2bNunmm2922f8vf/mLysvLNWnSJFVWVurNN9/U3XffrW+//Vbh4eGSpB07duj2229X586d9dxzz6lt27ZavHixhg0bpn/84x8aPny4G1YKd9q3b58kqWPHjpLOPrCmpKQoKSlJr776qvNln4kTJyo7O1vjxo3TU089pYKCAs2ZM0fbtm3TV199JR8fH0nSrFmzNHv2bA0ZMkRDhgzR1q1bdc8996i6uvqSc1m5cqXuv/9+RUZG6umnn1ZERIR27dqlTz/9VE8//bQmTpyoI0eOaOXKlfrrX/96yePt2LFDAwYMUFBQkJ599ln5+Pho/vz5GjhwoL744gslJCS47D9lyhS1b99e6enp2r9/vzIzMzV58mR9+OGHl/U1hWcpLS3V8ePHXcZsNpvzZ0KSFixYoMrKSk2YMEF+fn7q0KGDc9uvfvUr9ejRQy+//LKMMZLOPjv53nvv6cEHH9S0adOUl5enjIwM7dq1S0uXLnW5rz179ig1NVUTJ07U+PHjdcMNNzTjai3OwBIWLFhgJJmvv/66we21tbWmqqrKZezUqVMmPDzc/Pa3v3WOFRQUGEkmICDAHD582Dmel5dnJJmpU6c6xwYNGmRuvPFGU1lZ6RxzOBzmtttuMz169HCOrV271kgya9euvdplwiLO5W3VqlXm2LFj5tChQ+aDDz4wHTt2dGZnzJgxRpJ57rnnXG775ZdfGklm0aJFLuM5OTku48XFxcbX19fcd999xuFwOPd7/vnnjSQzZswY59j5GautrTVdu3Y1MTEx5tSpUy7389NjTZo0yVzoYUySSU9Pd14fNmyY8fX1Nfv27XOOHTlyxLRr187ccccd9b42ycnJLvc1depU4+XlZUpKShq8P3i2c9/3hi5+fn7GmB8fX4OCgkxxcbHL7dPT040kk5qa6jK+fft2I8k89thjLuPTp083ksyaNWucYzExMUaSycnJaaZVehZe4vEQXl5eztc4HQ6HTp48qdraWsXHx2vr1q319h82bJg6d+7svN6/f38lJCRo+fLlkqSTJ09qzZo1euihh1ReXq7jx4/r+PHjOnHihFJSUpSfn6/vv/++ZRYHt0lOTlZoaKiio6M1atQoBQYGaunSpS7ZeeKJJ1xus2TJEgUHB+uXv/ylMzfHjx9XXFycAgMDnS87rlq1StXV1ZoyZYrLSy/PPPPMJee1bds2FRQU6JlnnlFISIjLtp8eq7Hq6ur0+eefa9iwYerWrZtzPDIyUo888og2bNigsrIyl9tMmDDB5b4GDBiguro6HThw4LLvH54jKytLK1eudLmsWLHCZZ+RI0cqNDS0wds//vjjLtfPPeampaW5jE+bNk2S9K9//ctlvGvXrkpJSbmqNbQWvMTjQd577z299tpr2r17t2pqapzjDZ1x3qNHj3pjPXv21OLFiyVJe/fulTFGM2fO1MyZMxu8v+LiYpf/qND6ZGVlqWfPnvL29lZ4eLhuuOEG2e0//t7i7e2t6667zuU2+fn5Ki0tVVhYWIPHLC4uliTnf+TnZzE0NFTt27e/6LzOvdT0s5/97PIWdAHHjh3TDz/80ODT5b1795bD4dChQ4fUt29f53iXLl1c9js35/PPs0Hr0r9//0ueJNvQY+6Fth04cEB2u73eu4AiIiIUEhJSr/Be7NjXGgqKh1i4cKHGjh2rYcOG6b/+678UFhYmLy8vZWRkOB/ML4fD4ZAkTZ8+/YJtnbfVtX6XejD28/NzKSzS2eyEhYVp0aJFDd7mQr9ZehovL68Gx83/nleAa9fF3lVzoW2NfebvWn3HTkMoKB7i73//u7p166aPPvrIJejp6ekN7p+fn19v7D//+Y9iY2Mlyfk0t4+Pj5KTk5t+wmi1unfvrlWrVun222+/6INpTEyMpLNZ/OnLKseOHbvksxDdu3eXJH333XcXzW
djH/RDQ0PVpk0b7dmzp9623bt3y263Kzo6ulHHAi5HTEyMHA6H8vPz1bt3b+d4UVGRSkpKnD8nqI9zUDzEud/mfvrbW15ennJzcxvc/+OPP3Y5h2TTpk3Ky8vT4MGDJUlhYWEaOHCg5s+fr6NHj9a7PW+nxIU89NBDqqur00svvVRvW21trUpKSiSdPb/Fx8dHb7/9tktuMzMzL3kft956q7p27arMzEzn8c756bHOfcbE+fucz8vLS/fcc4+WLVum/fv3O8eLior0/vvvKykpSUFBQZecF3C5hgwZIql+7l9//XVJ0n333dfSU/IYPINiMX/+85+Vk5NTb3zgwIH66KOPNHz4cN13330qKCjQvHnz1KdPH1VUVNTb//rrr1dSUpKeeOIJVVVVKTMzUx07dtSzzz7r3CcrK0tJSUm68cYbNX78eHXr1k1FRUXKzc3V4cOH9c033zTrWuGZ7rzzTk2cOFEZGRnavn277rnnHvn4+Cg/P19LlizRm2++qQcffFChoaGaPn26MjIydP/992vIkCHatm2bVqxYoU6dOl30Pux2u+bOnauhQ4fq5ptv1rhx4xQZGandu3drx44d+uyzzyRJcXFxkqSnnnpKKSkp8vLy0qhRoxo85uzZs7Vy5UolJSXpySeflLe3t+bPn6+qqiq98sorTftFgsdasWKFdu/eXW/8tttuq/dyZ2P069dPY8aM0Z/+9CeVlJTozjvv1KZNm/Tee+9p2LBhuuuuu5pi2q0SBcVi5s6d2+D4wYMHVVFRofnz5+uzzz5Tnz59tHDhQi1ZsqTBP7D2m9/8Rna7XZmZmSouLlb//v01Z84cRUZGOvfp06ePNm/erBdffFHZ2dk6ceKEwsLCdMstt2jWrFnNtUS0AvPmzVNcXJzmz5+v559/Xt7e3oqNjdWvf/1r3X777c79Zs+eLX9/f82bN09r165VQkKCPv/880b91piSkqK1a9fqxRdf1GuvvSaHw6Hu3btr/Pjxzn1GjBihKVOm6IMPPtDChQtljLlgQenbt6++/PJLzZgxQxkZGXI4HEpISNDChQvrfQYKrl0XeuxbsGCBBg4ceEXHfOedd9StWzdlZ2dr6dKlioiI0IwZMy74Ej3OshnO+AIAABbDOSgAAMByKCgAAMByKCgAAMBy3FpQsrKyFBsbK39/fyUkJGjTpk3unA7QaGQXnorswlO4raB8+OGHSktLU3p6urZu3ap+/fopJSXF+THZgFWRXXgqsgtP4rZ38SQkJOjnP/+55syZI+nsx2dHR0drypQpeu655y56W4fDoSNHjqhdu3ZX9IfDAOnsB36Vl5crKirqsj7fgOzC3cguPNXlZNctn4NSXV2tLVu2aMaMGc4xu92u5OTkBj8ZtaqqSlVVVc7r33//vfr06dMic0Xrd+jQoXp/EO9CyC6shOzCUzUmu24pKMePH1ddXZ3Cw8NdxsPDwxv8BL+MjAy9+OKL9caTNETe8mm2eaJ1q1WNNmi52rVr1+jbNFl2bUPlbSO7uDK1pkYbzCduye63m8PVLpD3V+DKlFc4dGN8UaOy6xGfJDtjxgylpaU5r5eVlSk6Olre8uFBHlfuf1/cbM6nqy+YXRvZxVUy7sluu0C7gtpRUHB1GpNdtxSUTp06ycvLS0VFRS7jRUVFioiIqLe/n5+f/Pz8Wmp6wAWRXXgqsgtP45Ya7Ovrq7i4OK1evdo55nA4tHr1aiUmJrpjSkCjkF14KrILT+O2l3jS0tI0ZswYxcfHq3///srMzNTp06c1btw4d00JaBSyC09FduFJ3FZQHn74YR07dkyzZs1SYWGhbr75ZuXk5NQ7gQuwGrILT0V24Uk88q8Zl5WVKTg4WAP1ACca4orVmhqt0zKVlpYqKCioRe7TmV37CLKLK1ZrarTO8ZFbsrt/dyQnyeKKlZU7FNvraKOyS8oAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAI
DlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlUFAAAIDlNHlBeeGFF2Sz2VwuvXr1cm6vrKzUpEmT1LFjRwUGBmrkyJEqKipq6mkAl43swlORXbRGzfIMSt++fXX06FHnZcOGDc5tU6dO1SeffKIlS5boiy++0JEjRzRixIjmmAZw2cguPBXZRWvj3SwH9fZWREREvfHS0lK9++67ev/993X33XdLkhYsWKDevXtr48aN+sUvftEc0wEajezCU5FdtDbN8gxKfn6+oqKi1K1bN40ePVoHDx6UJG3ZskU1NTVKTk527turVy916dJFubm5FzxeVVWVysrKXC5AcyC78FRkF61NkxeUhIQEZWdnKycnR3PnzlVBQYEGDBig8vJyFRYWytfXVyEhIS63CQ8PV2Fh4QWPmZGRoeDgYOclOjq6qacNkF14LLKL1qjJX+IZPHiw89833XSTEhISFBMTo8WLFysgIOCKjjljxgylpaU5r5eVlfHDgiZHduGpyC5ao2Z/m3FISIh69uypvXv3KiIiQtXV1SopKXHZp6ioqMHXTs/x8/NTUFCQywVobmQXnorsojVo9oJSUVGhffv2KTIyUnFxcfLx8dHq1aud2/fs2aODBw8qMTGxuacCXBayC09FdtEaNPlLPNOnT9fQoUMVExOjI0eOKD09XV5eXkpNTVVwcLAeffRRpaWlqUOHDgoKCtKUKVOUmJjImeRwO7ILT0V20Ro1eUE5fPiwUlNTdeLECYWGhiopKUkbN25UaGioJOmNN96Q3W7XyJEjVVVVpZSUFP3xj39s6mkAl43swlORXbRGNmOMcfckLldZWZmCg4M1UA/I2+bj7unAQ9WaGq3TMpWWlrbY6+vO7NpHkF1csVpTo3WOj9yS3f27IxXUjr+SgitTVu5QbK+jjcouKQMAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJbj7e4JQLL5+ckrKkLmVIkcFadlamvdPSWgUbw6hMjWLlCm4geZ8nI5qmvcPSWgUb6pDlCo/Qc5ZFOwvU7t7F7unhLOwzMo7mazqe4XfVT5/4yOPtJXXl2uc/eMgEax+/royCM36PAbbXTwtz1keneXzW5z97SASzpZV6fH//ykJu8dpV9/M06LSm+Rwxh3TwvnoaC4k80m7+s6a/n778j/UaOadlJJfIS82rd398yAi/IKClTR2Fu05f/O0XVTf5BvmdGJm4Nk69vD3VMDLmpzVaAeeW66fMql17ov0f/0/YdyT3bTEwfvd/fUcB4Kiht5R0XqyJxAJc2YLEfRMUXP+UZtjlSq5qZYd08NuCibv79KfubQnU89IUfRMYX/5d+qDbDp8C8p17C2W33L9cZ/Z+nPz2QqxrtOCf5l2rW+m3Yt7uXuqeE8nIPiRnXFxxX23x11eJBNPzx1q36IdCh8oxTy+R7VuXtywEXUnSrRDX8q1YH/00GnI29Wac86tT0sRX71g7unBlyU3WbT9T6V+s3eX2
lkxFb9ce8dqglxyP/GMndPDeehoLiRqamWNn2ryDa3yuFrV9V+bwXvOKW6U6fcPTXgokxNrbRzryLC+sl42eR/wlvtDpyW955DlGtYno/NrsFhO+Rlc2hw9C4pWrqlzQF3TwvnoaBYgNe6rfKS5CPx4A6PYRxGPmu2yx7gL98zlZJxqM7Gq8bwDL8J3q29NV4aHnhAPvJSDY++lsOjCYCr4jhTefYflBN4mOt9zpYSyok18YgCAAAsh4ICAAAsh4ICAAAsh4ICAAAsh4ICAAAs57ILyvr16zV06FBFRUXJZrPp448/dtlujNGsWbMUGRmpgIAAJScnKz8/32WfkydPavTo0QoKClJISIgeffRRVVRUXNVCgEs5ZY5pu/lK682nWmX+rmM66rKd7MKqTpliba/7UuvrlmlV3Yc6Zo64bCe7aI0uu6CcPn1a/fr1U1ZWVoPbX3nlFb311luaN2+e8vLy1LZtW6WkpKiystK5z+jRo7Vjxw6tXLlSn376qdavX68JEyZc+SqARqhTrQIVrF66pcHtZBdWVac6BdpC1Mse1+B2sovWyGbMlf8JR5vNpqVLl2rYsGGSzrb4qKgoTZs2TdOnT5cklZaWKjw8XNnZ2Ro1apR27dqlPn366Ouvv1Z8fLwkKScnR0OGDNHhw4cVFRV1yfstKytTcHCwBuoBedt8rnT6uIatMn9XX/XXDm1SaWmp2rVr17LZtY8gu7giq+o+VF/bL7TDbHRLdvfvjlRQO84OwJUpK3cottdRlZaWKigo6KL7NmnKCgoKVFhYqOTkZOdYcHCwEhISlJubK0nKzc1VSEiI84dEkpKTk2W325WXl9fgcauqqlRWVuZyAZoS2YWnIrtorZq0oBQWFkqSwsPDXcbDw8Od2woLCxUWFuay3dvbWx06dHDuc76MjAwFBwc7L9HR0U05bYDswmORXbRWHvE83YwZM1RaWuq8HDp0yN1TAhqF7MJTkV24W5MWlIiICElSUVGRy3hRUZFzW0REhIqLi12219bW6uTJk859zufn56egoCCXC9CUyC48FdlFa9WkBaVr166KiIjQ6tWrnWNlZWXKy8tTYmKiJCkxMVElJSXasmWLc581a9bI4XAoISGhKacDNBrZhaciu2itvC/3BhUVFdq7d6/zekFBgbZv364OHTqoS5cueuaZZzR79mz16NFDXbt21cyZMxUVFeV8p0/v3r117733avz48Zo3b55qamo0efJkjRo1qlFnkgNXqtbU6ox+/NyHSv0gSTp06JD69u1LdmFZtabmvOyelkR20bpddkHZvHmz7rrrLuf1tLQ0SdKYMWOUnZ2tZ599VqdPn9aECRNUUlKipKQk5eTkyN/f33mbRYsWafLkyRo0aJDsdrtGjhypt956qwmWA1xYmU5qq9Y7r+/Td5Kkl19+WYsWLSK7sKwyndJWx1rn9X3mW0lkF63bVX0OirvwOShoCrWmRuu0rFHvx28qfA4KmkKtqdE6x0duyS6fg4Kr4bbPQQEAAGgKFBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA5FBQAAGA53u6ewJUwxkiSalUjGTdPBh6rVjWSfsxTS3Bm19S02H2i9TmXH3dkt7zC0WL3idbnXH4ak12PLCjl5eWSpA1a7uaZoDUoLy9XcHBwi92XJG0wn1CucdXckd0b44ta5P7QujUmuzbTkhW8iTgcDu3Zs0d9+vTRoUOHFBQU5O4pNZuysjJFR0e36nW6a43GGJWXlysqKkp2e8u82kl2Wx93rJPsNi+y23wuJ7se+QyK3W5X586dJUlBQUGtOkDnXAvrdMcaW+q3z3PIbuvV0usku82PdTaPxmaXk2QBAIDlUFAAAIDleGxB8fPzU3p6uv
z8/Nw9lWZ1LazzWljjT10r62Wdrc+1slbWaQ0eeZIsAABo3Tz2GRQAANB6UVAAAIDlUFAAAIDlUFAAAIDlUFAAAIDleGRBycrKUmxsrPz9/ZWQkKBNmza5e0pX5YUXXpDNZnO59OrVy7m9srJSkyZNUseOHRUYGKiRI0eqqMj6fw9j/fr1Gjp0qKKiomSz2fTxxx+7bDfGaNasWYqMjFRAQICSk5OVn5/vss/Jkyc1evRoBQUFKSQkRI8++qgqKipacBVNi+ySXU9FdsluS/O4gvLhhx8qLS1N6enp2rp1q/r166eUlBQVFxe7e2pXpW/fvjp69KjzsmHDBue2qVOn6pNPPtGSJUv0xRdf6MiRIxoxYoQbZ9s4p0+fVr9+/ZSVldXg9ldeeUVvvfWW5s2bp7y8PLVt21YpKSmqrKx07jN69Gjt2LFDK1eu1Keffqr169drwoQJLbWEJkV2yS7ZtRaya/HsGg/Tv39/M2nSJOf1uro6ExUVZTIyMtw4q6uTnp5u+vXr1+C2kpIS4+PjY5YsWeIc27Vrl5FkcnNzW2iGV0+SWbp0qfO6w+EwERER5g9/+INzrKSkxPj5+Zm//e1vxhhjdu7caSSZr7/+2rnPihUrjM1mM99//32Lzb2pkF2yS3atg+yeZeXsetQzKNXV1dqyZYuSk5OdY3a7XcnJycrNzXXjzK5efn6+oqKi1K1bN40ePVoHDx6UJG3ZskU1NTUua+7Vq5e6dOni0WsuKChQYWGhy7qCg4OVkJDgXFdubq5CQkIUHx/v3Cc5OVl2u115eXktPuerQXbPIrtk10rIrrWz61EF5fjx46qrq1N4eLjLeHh4uAoLC900q6uXkJCg7Oxs5eTkaO7cuSooKNCAAQNUXl6uwsJC+fr6KiQkxOU2nr7mc3O/2PeysLBQYWFhLtu9vb3VoUMHj1s72f2Rp6+Z7J7l6d9Hsvsjq2bXu0XvDQ0aPHiw89833XSTEhISFBMTo8WLFysgIMCNMwMujuzCU5Fd6/OoZ1A6deokLy+vemdSFxUVKSIiwk2zanohISHq2bOn9u7dq4iICFVXV6ukpMRlH09f87m5X+x7GRERUe8kvNraWp08edLj1k52f+Tpaya7Z3n69/F8ZNd62fWoguLr66u4uDitXr3aOeZwOLR69WolJia6cWZNq6KiQvv27VNkZKTi4uLk4+PjsuY9e/bo4MGDHr3mrl27KiIiwmVdZWVlysvLc64rMTFRJSUl2rJli3OfNWvWyOFwKCEhocXnfDXI7llkl+xaFdm1YHZb9JTcJvDBBx8YPz8/k52dbXbu3GkmTJhgQkJCTGFhobundsWmTZtm1q1bZwoKCsxXX31lkpOTTadOnUxxcbExxpjHH3/cdOnSxaxZs8Zs3rzZJCYmmsTERDfP+tLKy8vNtm3bzLZt24wk8/rrr5tt27aZAwcOGGOM+f3vf29CQkLMsmXLzL///W/zwAMPmK5du5ozZ844j3HvvfeaW265xeTl5ZkNGzaYHj16mNTUVHct6aqQXbJLdq2D7Fo/ux5XUIwx5u233zZdunQxvr6+pn///mbjxo3untJVefjhh01kZKTx9fU1nTt3Ng8//LDZu3evc/uZM2fMk08+adq3b2/atGljhg8fbo4ePerGGTfO2rVrjaR6lzFjxhhjzr7lbebMmSY8PNz4+fmZQYMGmT179rgc48SJEyY1NdUEBgaaoKAgM27cOFNeXu6G1TQNskt2PRXZJbstzWaMMS37nA0AAMDFedQ5KAAA4NpAQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJZDQQEAAJbz/wERMlHxDE2ZdgAAAABJRU5ErkJggg==\n"},"metadata":{}}]}]}